From 96508d40ac7347ba9c471a3b1043d2c48dea1c96 Mon Sep 17 00:00:00 2001
From: Yinan He <43169235+yinanhe@users.noreply.github.com>
Date: Mon, 27 Nov 2023 23:47:19 +0800
Subject: [PATCH 001/248] Update README.md

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index f634223f..0a382a0b 100755
--- a/README.md
+++ b/README.md
@@ -68,6 +68,7 @@ The codebase is maintained by [Ziqi Huang](https://ziqihuangg.github.io/), [Yina
 This project is built using the following open-sourced repositories:
 - [AMT](https://github.com/MCG-NKU/AMT/)
 - [UMT](https://github.com/OpenGVLab/unmasked_teacher)
+- [RAM](https://github.com/xinyu1205/recognize-anything)
 - [CLIP](https://github.com/openai/CLIP)
 - [RAFT](https://github.com/princeton-vl/RAFT)
 - [GRiT](https://github.com/JialianW/GRiT)

From 6c9935b25ddb89d09f6ff3e2cd0fc85321166302 Mon Sep 17 00:00:00 2001
From: Yinan He <43169235+yinanhe@users.noreply.github.com>
Date: Tue, 28 Nov 2023 00:04:33 +0800
Subject: [PATCH 002/248] Update README.md

---
 README.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/README.md b/README.md
index 0a382a0b..f7919097 100755
--- a/README.md
+++ b/README.md
@@ -1,5 +1,9 @@
 # VBench
 
+
+[![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/)
+[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard)
+
 ## Installation
 
 1. Clone Repo

From 9c9745cc22569dd32e22ac6664e3865ad63636ed Mon Sep 17 00:00:00 2001
From: Yinan He <43169235+yinanhe@users.noreply.github.com>
Date: Tue, 28 Nov 2023 00:14:06 +0800
Subject: [PATCH 003/248] Update README.md

---
 README.md | 14 +++++++-------
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index f7919097..4ee8ff42 100755
--- a/README.md
+++ b/README.md
@@ -1,10 +1,10 @@
-# VBench
+# :bar_chart: VBench
 
 
 [![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/)
 [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard)
 
-## Installation
+## :hammer: Installation
 
 1. Clone Repo
 
@@ -19,14 +19,14 @@
 conda activate vbench
 ```
 
-## Pre-Trained Models
+## :gem: Pre-Trained Models
 
 [Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrain` folder.
 
-## Prompt Suite
+## :bookmark_tabs: Prompt Suite
 
 We provide prompt lists at `prompts/`; see [instructions](https://github.com/Vchitect/VBench/tree/main/prompts) for details.
 
-## Evaluation Method Suite
+## :surfer: Evaluation Method Suite
 
 To perform evaluation, run this:
 ```
@@ -51,7 +51,7 @@ List of dimensions supported:
 ```
 ['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']
 ```
 
-## Citation
+## :black_nib: Citation
 
 If you find our repo useful for your research, please consider citing our paper:
 
@@ -65,7 +65,7 @@ List of dimensions supported:
 ```
 
-## Acknowledgement
+## :hearts: Acknowledgement
 
 The codebase is maintained by [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), and [Fan Zhang](https://github.com/zhangfan-p).

From 2ae858fcd7e1060755e70bed96f7edfbe11ddf37 Mon Sep 17 00:00:00 2001
From: Huang Ziqi
Date: Tue, 28 Nov 2023 00:23:11 +0800
Subject: [PATCH 004/248] Update README.md

---
 README.md                | 47 ++++++++++++++++++++++++---------------
 asset/fig_teaser_new.jpg | Bin 0 -> 3102385 bytes
 2 files changed, 29 insertions(+), 18 deletions(-)
 create mode 100644 asset/fig_teaser_new.jpg

diff --git a/README.md b/README.md
index 4ee8ff42..1fc0d272 100755
--- a/README.md
+++ b/README.md
@@ -1,9 +1,29 @@
-# :bar_chart: VBench
+# :bar_chart: VBench: Comprehensive Benchmark Suite for Video Generative Models
 
 
 [![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/)
 [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard)
 
+
+This repository contains the implementation of the following paper:
+> **VBench: Comprehensive Benchmark Suite for Video Generative Models**<br>
+> [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+<br>
+
+[[Paper]()] |
+[[Project Page](https://vchitect.github.io/VBench-project/)] |
+[[Video](https://www.youtube.com/watch?v=7IhCC8Qqn8Y)]
+
+[[Huggingface Demo](https://vbench-t2v-leaderboard.hf.space/)]
+
+## Overview
+![overall_structure](./asset/fig_teaser_new.jpg)
+We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions and facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses a carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation on the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench thus provides valuable insights into the strengths and weaknesses of current video generation models along each dimension.
+
+## Updates
+- [11/2023] Evaluation code released for this list of dimensions: `['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']`
+- [11/2023] Prompt Suites released. (See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts))
+
 ## :hammer: Installation
 
 1. Clone Repo
@@ -24,7 +44,7 @@
 
 ## :bookmark_tabs: Prompt Suite
 
-We provide prompt lists at `prompts/`; see [instructions](https://github.com/Vchitect/VBench/tree/main/prompts) for details.
+We provide prompt lists at `prompts/`; see [instructions](https://github.com/Vchitect/VBench/tree/master/prompts) for details.
 
 ## :surfer: Evaluation Method Suite
@@ -46,10 +66,9 @@ my_VBench.evaluate(
 )
 ```
 
-List of dimensions supported:
-```
-['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']
-```
+The complete list of dimensions to be supported:
+```
+['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']
+```
 
 ## :black_nib: Citation
@@ -67,15 +86,7 @@ List of dimensions supported:
 
 ## :hearts: Acknowledgement
 
-The codebase is maintained by [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), and [Fan Zhang](https://github.com/zhangfan-p).
-
-This project is built using the following open-sourced repositories:
-- [AMT](https://github.com/MCG-NKU/AMT/)
-- [UMT](https://github.com/OpenGVLab/unmasked_teacher)
-- [RAM](https://github.com/xinyu1205/recognize-anything)
-- [CLIP](https://github.com/openai/CLIP)
-- [RAFT](https://github.com/princeton-vl/RAFT)
-- [GRiT](https://github.com/JialianW/GRiT)
-- [MUSIQ](https://github.com/chaofengc/IQA-PyTorch/)
-- [ViCLIP](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid)
-- [LAION Aesthetic Predictor](https://github.com/LAION-AI/aesthetic-predictor)
+This codebase is maintained by [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), and [Nattapol Chanpaisit](https://nattapolchan.github.io/me).
+
+This project wouldn't be possible without the following open-sourced repositories:
+[AMT](https://github.com/MCG-NKU/AMT/), [UMT](https://github.com/OpenGVLab/unmasked_teacher), [RAM](https://github.com/xinyu1205/recognize-anything), [CLIP](https://github.com/openai/CLIP), [RAFT](https://github.com/princeton-vl/RAFT), [GRiT](https://github.com/JialianW/GRiT), [MUSIQ](https://github.com/chaofengc/IQA-PyTorch/), [ViCLIP](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid), [LAION Aesthetic Predictor](https://github.com/LAION-AI/aesthetic-predictor)

diff --git a/asset/fig_teaser_new.jpg b/asset/fig_teaser_new.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..a013a3673ac69a2dc11249ac8217186fb640835c
GIT binary patch
literal 3102385
[base85-encoded JPEG payload for asset/fig_teaser_new.jpg (3,102,385 bytes) omitted]
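The Evaluation Method Suite snippet survives in the diffs above only as fragments (`my_VBench.evaluate(` ... `)`), so here is a filled-in sketch of the documented call. It assumes the `VBench` class exported by this repo, a `VBench(device, full_info_json, save_dir)` constructor, and the `evaluate(videos_path=..., name=..., dimension_list=[...])` keywords from the README snippet; every concrete value below (paths, model tag, device choice) is a placeholder, not something these patches specify.

```
# Hedged sketch of the README's evaluation entry point; all paths and the
# model tag are placeholders to substitute with your own.
import torch
from vbench import VBench

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

my_VBench = VBench(
    device,
    "VBench_full_info.json",  # <path/to/VBench_full_info.json>
    "evaluation_results",     # <path/to/save/dir>
)

my_VBench.evaluate(
    videos_path="sampled_videos",  # <video_path>: folder of generated videos
    name="my_t2v_model",           # <name>: tag used to label this run's results
    dimension_list=["human_action", "scene"],  # any of the released dimensions
)
```

If this matches the shipped API, per-dimension scores should be written under the save directory passed to the constructor.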
zSk4Sdoa+*Mz``S{GV0Msxtz54H|ptYn`^SFAD>fnk+BOB_d3mk-Tv%vB@OmY?k)~5Gf~o9nx-`$)Q3p0m@X~VLx%UZ{Z;@xmX|}+#u^TNAy)GA} zmZ_SkTskJi;gZ=msJ@k?=kE(VxjS7%Sm#~!LIx!r=~vbgc6tfs#d44u&)QEB4CD`@ zr0e+I4+Sk6hI!b_p5b4d$A4%lOUs5StG={y?O~C5-7(fVm(on_+A*ftt}+IUzy-iy z|7{A{06_ixG=MGZ^A8;vxH7e|^c0Gv4ZL}frGPM_cK|Bo>4 z-?*cs=G9KVAy@dPOG$~(g|a^(5*%j~HeEqI7$Xaqaa7*VXjn^PM$8Uxm8%5Hceakr ziZJ|BK&d6dTP+(4sH@%v=#(c>hz_F$=SO{W{(-W$06~m>Dh?A6!k7YSzS=x;J)rCZZ0rX~Kj2j#?x$BUPN4eK zr+NXfN#RT2amk*FO91=_1?pylUGhe*14Z=!d=A*d|2k;_9aK==aTm09NJBEPV54J< zK9P?A6aTrgwgUJ(w?_DT<9=%Hvd&EH?rVT33PsR>8go9U02z}MCC+yr_X^YaM$4ng zPk?k7*2#fyuR*UeQ|*qTY8bzw&|AR0K>ejFozXHC*ex*j+=zLglm5g>5llF3~41>d)j989nB>>u5@y)%fKl$KIh9r`3WLrr(t+bFrV z_S+OGQG3BQ6+C3Rqf890ANvB<)_|!e1S?EGNsW zhwdt`#L%5c;HWzXq4I9)^{cY$jbmpQy3tB|Q>`eZr>Fm%RARtw2-}5k>FHxsG*omE z)~JrkOWK@MYBDMU^?=3x`s0t8?fs5F7uXsj!6xW=B%+>-N>31Ki)NmJ7l&xDsQEM$) z+h~Gv>a-Tjs~0|A^iqcu69L=8%Hzn{<(!OXQ|F3$shn>_C$>$s8Lxm|Z3y6KPk49Z zZgS++X@smpdctAfS&)THvzMyBHCDbh$Q}Lo1}QVkN6sc$_7f$7F?E!EDnLuPp{GJ0 z!DXI15^J0bPcj#g_*hDznPX<&Z2~XD@i*XGt^-fOD+&NQ{v*XYakPN1Mrocmlq>|S z_>63MHgo7M(3g;K%uYIK1`)0Sk69){y>*{Oxi)PULKkDRNAv4yYUk6w;Z${T>9OtZ zkD~Xf#N>AjrzmT~C*5j1bNZZb(ww{xM>x*~Ei3q{D-cy1LlMmi-IV5QV^WOT~qQJ#PcT%i2} z4uEd!If01w*Lt;o<*{RYZ%7I$1lXN&JYAkhgoP<H598NZNMYsn=suXVHNIy5^B1mMa&WIC8F; znMnyWIv$t**gIko!^`1wLL&5qlAO>I3j?fh z4%*fiAz()REmD>y-3}J>Tp#@Yfv>mRC|xy^fZA3aOjF){zm`P&T)_-6p<>?jU$)IOGIZ}%Z9hLT z&{(gM1YYlOGpSow1o?R;>=y_Czt^JNVM?HWRf#)ByTx078twLVO+ZhO}|9M$L39ugSJHkUV$ zFRG&-#|wBjIlab5)g*AaPpl|!1#9Xq_mK*}h$=qw{c@aMYU^o&=R`i)POn4w+@cEZ zL#vrI{pl=r>cxl)J!Z`0i+8A%BEV+qCm;p+^0&3k_?kyBg9%5$mQ9N-w@1Ih&zLjB zz2q*$UErs4e;9ZE2-l0f*;NZ^&{Noi5hBN%1+P{YB+oWCP_Y;z+#~VQTuyJylxuS3 z>oNHm+<839>nr0W4$&@VeCCn8!{UpkbGh%IVmzIOxl(||Jndij&%xb(mwj)zk~6V- zlHeZE>k=p@_q?@#_Ff;E`OW406cxZSz!0|`Rdt*n!rm<4@?qc9LIqB4_M_BpMPfT@Ai@uS@ve>IKvVB%jkBti*r(ej~Q(HeB4UwN4D*Wm^7j%ZC{+ z)Vu88aL$-{vPZRT?joh&?LJijr5xHYK4>C0<8(l_-?})y4fvVU-MD2SIUM8LVu_R- zYb&?J`X!RTxLl{;s?XqvMfY*7q6U%R!gKGJgH@{;zEyr1zKU75?A+~oIL0v`Kmn4u z+ndN{8?ld+-Wi`bMeqJpL^t@fNQ@5NE}yg|dW#c6-m{P*3HEG)0*tBf65Z|MMAZOL zGQv_BO3w4a?!f|s>aQ2Ocy>LB^1A6mBx!!Ijb+ne|xhNy{Zg-*0XTdW*guAaX6 zJeh_NhX8OJ0w^|CU=J!%&oh z6ODCm)EK$9he8oLX1A+4S-_@i%^mwxy{akbU8QcoRV?Jekm0Yo5k=ntZ1a%ud-qck z$aR3OdKtYNUEblpfk00AR496bLCXPBb|VB06{7K5?jq9LL`?MJqX9#MXX>)z7%kowJFbLJAn(Tb{d-nNgh+DXca zO{dLFj;>5TZ-YLaNS@Bki<&oJfxwo-+^qYTqSZR)d6`y~ppNW<3S$R53k%jz*s~Xq z4Ze@mq3#?#@!n2EL-)jqUX}K8-lx-=yCXAWbc=LHO!i)WfA9GjYvI3+39i+uHg2f! 
zIfq@7#%q(~B}Ko>^8)SEuj5medIor#ONE0@oUNf|id5W+RFvk~r*h^@!pK4W0*TXc z%Ysf9;)vs|t8LQh;jqmSYQgGptMrE#LMcTn-e&q6aWi4^D1sUBE}`ff%a?xl$O$7m zrN`8nS(OtP$#YYtVd#~dsyT@*Ys!_KV=nL$_w#vo2LU+>a8&l_Lr!4oneuW{!x zR^k2JHIgi)T-b5JsH>$<1LTV3LVX6iyp9=?b$e$jFRm3ZD@=ZdL}WPe=0R2WhPEBw zKP{;Uk0*lQK%S3~wyE%o(+n4;w1ZehWY5~Mydqr}iO7UQW>lr7v&Q+T?TB{1zN8Bl zX-9k%;qMZd3j(v;Cctl2S6boAd3TefzuR9ApczXST$Pn?;Geu6q8%}L)XU9BWV&lZ z)KC?p-ah=Ex2C0LXx+%Dv9pE$>dKN{3|dopDt2(K7e!JWBP+Yu(T*ycIuGH_PKj+~&vqA!N^Xm^K zZh_MlwSvA8C6n4i{GK`60_&>w*{{vNUTkCzX8!tWML56Q8ON4Q(A0y=OIAHfnC$sj zm3^v|M#*i(>9+AfJXW&1+~%|K3hKL0B=?ZxEPk1QaH+{VC~6Iu3hU~5<`d#Eag#GO zYtn!?RQ*`K>Ff%fAw>>H-&&lj%Fyfx<%v}hV3$sj*a6_TcNc$I-~W03y8#A={xg>U z-eYNG96T|6=lcYOaUEwQ;uqbrm2y71@>&46&i=ifMv$NhL%bxHQ^zose-^}6`&5n28v_L9exDXVg zkdh(u55+WJ-eQ$Kq+42uJ75{VC8lt{>5T-#6}GSf-6Yk|3sXswVL-Mz(%K!o$_3OF z4CJKg-Cg|T8=>Y3@yZ}HeyekqfX{2ujgg)cfsTRxv&L3wHf*mmE)yG654F>rl=mgY zHYXOR>x37{p`>*0*nUIV>p!@zTj(4-VAThdeMEi*tigZ8$$qB-|JOVFpBrTT=e)oB z*mYEfFYu=xaef%xaxF;v^0C@5tTo%snc1wtKU~*5Et)HfM^)od~*MhJ!oVy<;-}fjdUETzxm2J z;@rJHnJ@Jhrp_)Kt$j|;Ml%$g&~kmda3wMG``_X(?RWk;iJ0c@0x9P>iPEUBc9t?p z459lrotn~3l%}##3?TWS6k#{@RiPx@8@GgB1NxGp{-1){=is-e%O7n zu}L(CC`&(VUtUHxcIMEebUjxVKClL6e45~pxmZY>yKJg!@@$c3h5NKz*Y!u6*oIJL zp0H4x&DKNXGE&_@nUV2zSJkpnI4>q`awd9LEB|7doalEDtNdj`bnb#ax$!RfagkkK z9s2gAQ}4D~Jd(=p)eJ?NOY^)BwaSMr6P|LiU7t=m$~+ZHLdV7{E03rHh&W97_u6}7 z^cwm*3xbV+{%CzJ`3=VQ+SiN%}Vf@F-RZ&xdJV zj)Yz9M<$J!;l87FhEKM|XDqb&At!XSE4g6E;=$8#Xh%d8_<3b4YEEd5!oADL_z7#ExuoKBcWvY$nJy4xs>b>+*>@dCLx2=R_Xr@ zKu@wA|4|SMeJgmNh@$cov`y!`Po<;^7wzo?71)u&`@l<_G?bvoN(=0QFvC98J2U#V zqzND&RU7pgX0bL)QDN}|^x65%_NkIOiW`?G96&Da_8N}tcNMqk3B#f(N1>2ij!lI4 z0^;kz4G8iX`|SaRP3_bU>Sa;{-6Q#{bZqi2wCy~3uz)aX-;K^O ziiDjBJU@;nMR109M`d-w5hIw0xOS%$mmdU20CJks znm&xXk2JrjoK?x#-*n~TvuDqq+~22iL$rdK#dmkc5{}sz%i(E@{GIPLsa|iYSLi=& zx_WQQnR-i#I#h70sJfIH{aQ?ro_;>Ca1iQu@&49D$W@<)DPk-sqlTj+wz~RwLizA$ zD2y~Z)#k*TxErtjx|>M@{swgK@4hAfF>|Sd$Vi_b;EH$1SlBsXV`1$#9ns4C5c!@; zn(sx6_X+&X?bu$D0?z>SMbkW+g}^fzBNaIXz{N4y9^63Y%fNm-krhWZTQn(H7TEze zlBRyT#`L1HF{EE!lzHxm=I;YG@dPSPDEWe!Rf`vFf6gbW3F;~ zuh4tTdn7YId*B8 zbP+iwCBL-=29~UikOaP?sjd+8$mzv)#;rD%?nG%9t_6d6_z4a8&Dgb6>hlQ&sbMSm z$TftU%$#X@Cqa*RGtp*GB;cx+)(^W+xB^7z?eSV4I+?RWT#7cmlm33?@CoSA1=S39 z>pfNeT1i`J2sOcoPTN1{SmhdUwes8t;Yd#Un)+C!jLXctH0L^tK3;WV=c)>z$$Rnw zj&@-2?|XmA#hyZ98L5E5wRgDZuEk^uc;&VyK!KIBi(dfJ{r@Ft4U$E4oy#-y<}Z%C zpfYGQSWj3=K|P^d72Ht4nU!mdFM@!a!GpD#L~#4oasA8O>7nvETnqM)Vr&Ebd3RO4 zHOL#`@|)RrO^XYkvC};Hj97Z(l|h&AZt5(&*xe|mGm^_)i9EHs5Rc>AO`6F|7m0^{ zYSvI>-upDcgakp%?Tq~u>QI#yi^A1gTdc#*WyhG-K z=Z7A>Qfq2dn8#6x5T)ggV{;aFXTKYEeBcH(DGw_c%({*a7o}~*sIi=cMLhs+ozRJ86LC_Ff4w zp6yl!3DUbraSy>TVWv9lEGp%NL1BofuDY|(5APF8lklmidD5eCA&V-m3#Dr-C1*n> z`iq}(hJGUEUR=qPK|tOrIx&+Cl=7_=9{QGgrh1DdcIKVjLcS;rGxpY98+ve9PuRN} zy;2*ryhm-U0e7V|jhJ$ZW$SJTG4OH5mcjV$ny zz+?PX z%iIfoZ18fR%@4H;dfDd@7ZQD_9^VxA9?T-3;dtJ*L239m%f>Ff7$HVSln6hUVf^rlE4nw`U=zkl3>L?Tyv{UcvE@e1YJaWMNOi;P&Uqo zj>hEtiUws;$is-aC9p~ZOK+inQZ)}I9Ivc6B3w7Zb%vbfT-HCHhmQ<@D)~0c>1pk~ zDDe08FGEajRi?{uqy7+yI^WvH~Ew8#Z9>VkF?Zl2~BaFI!H z1QtDEe)IP_Ykvr^fgh#x_T=%sDF=Hhtf%Bhipv#2F6$!H1y@z@b|E-AUP9nlT~TMa zL`G^(aRE*6?0I*9XFgB)!8$3vN#cyTb0WGULwi65GwovYUeAw3Mp)2ky#p2Lez`DK z)h2p?|MBDOkdwK&@6RYqo$*AWa2Uo)Ozlyh0OAGx`vbFYu+QZm>inpM+O|%i+ts}N zY;2sooS5U^tI2MBxi_hJwYAxviaJzz8J?DK1i>L$jXVZ1^*+=s61U9qgu~DAT8{6~ zCH#CDkG;(p>23>ud(*Tc&eK)dQ*4k@n%C86S)#2uhkkTGLm1lwG#-8ymBQB9aUtvR z6C%Xp!~0ZoGuXaZk}H0egysU$+h2l5DIwNmplWmywh51umE9N*Xaq)#HNuRLoDx^hn7EUJA0vtAW*+9VmgTJIk0ec_D0G z(#DBJu)Q}=1Ve!3>l;;hJ&V#=E|oq8A%Gm<7h_C}SdGD_;c1HRt11z?r^ z74&B;8ut{7rWsqUe%;~Bjuxrmx=e3p{*bi2Hpyt{#Y|%jT79eJ8e8f-G?f_eJ``qA 
zl~5$Pd<1dHy__VvR@1sl=WhL&&lxNH+*rLrMrkrsS{4!5vtrU6}xA%twW%9 zz_(3^>3F_+TFKsCXt)p>6x@&ydXziqz5>^$JJ+U8lf_Kg9;`z`2j8RfkS5svAG!l| z__|sN+&qgwQ)2an=}UGID{th6%mAvYg;47>_ZGv)f`Y=Fg4d0Wye~(F&q+<*Vm^E3 zk@l@Is$*2la^BOPt;r=Z3!N`fxxldGT>WjMD8P?-J#abJ?QE?tZ}iy7nx1k+)5O$% z`WR`@RcnvEus+z^h0a-XPA&9Ofo9{WC`X7i}5;j#KtOpXuKlkN#h9e`xeRx~DHH z3)(rvWFgB)%9K?C468RX2p`O9Nq%{6u$Xe!ReU4MdS0ewCD>tBzwjY>6YS!TBXrj) zZ7zx@$x%=Z9pZ6v=A*5JPp=W@5bFtRo8npe@d2YIfZ;j+_w~k17xtz@g3l2nhq>t^ z#fdEx>X9%j>qejy`(4`A90hhY(N}j_uhvn~XL(ct9B|v70dQl*>10RbFlS|IiIJ&s zWzUo5<=hJh-QE>6UVbakP}@Ohp0TgmgvsSYrdRz>C6r5!%F;ZRZW19`ERdb`Kye;Ut>7_bIo68aR2X&+nJzFnTO#nsnKV96+aO|Hk9mrcM89g z+QT}BeQ2G&KMu_gigml&dQzoB{zYejpq%EAVSnZH3OE4zi8Gz-O~fp~JdM7Vi&uPKMfts+deKh^Zrds3Qe`xfQn3&3GB?v#Q^LzqL&*gQVO6}BDgiygz8$5S z_qOixyQlYE6HLP=?mpy;s%y@kDjd7v8U$NG`fn}s)yQ!A2Z*uzm304lBz&$DZ%m@^JC0 zzTWu#y;3DmyW-U|8z|mZ(CECVLvDu5aucFm_vmK{Fc!FW_?R?He&o`+Lv)BmW8pz0 z_qz2A&|&D-Z@X0g%-`3+As;wO@sMxJHqbBVmQp{48F5bN+ws0bL*b{XR&3Tk;A{ma z>4Skz>ofkeud^msPY=;b4c7LG_J=nMh3M8Qon-7i&nNsM+ld)coXkr39OQI6n36!JQK) zSzlZ@tX%8W8}MCv$dM%aT}5^v#ZC)z;PRi; zAkf{I-6voRTo8{1-=B6y^a*B^)iQL zC41325Ost~4`0OwJ#$_0c}=x+)Kv`DVj-&?T$ioCvEwDgLITCNHp?H27kXR(%-c%m z#lPo0-mBD$L_5<01fAPy?FtssSmVHv(`&?D+a1;7%CaC0Nt7%Ii;oA1K-n< zhe%Lv?jjU`i8b)SipV{-y36DY-NP5|Q?;uaaXNjV1Ck4nyMTl~yB(RSpuKmUON5ZO zg^bzxL3jWJ%B{e|Kco=<&~6=32S{3|`BJKWAm(Ss`w{qfp}oyp zWW{gr^{byKOy0&|LdVw0-%WI|8YNUG>|QZu0Yd$hC<fni7^1(ptv(gE%25$~cKPu4GMVnjkP*Zlrw{4lXn4~(zFwp+mvZDAn;y_8w z)zUyW2dUAc1e$LnRGsQ^<8jxYVSTR#xv2`$`3Y;a1+L`pBw-2TKqm6ZW{2S9n?ogDORs)9*NWj2dG(2(Y#iu{t705*^* zb@(TvIk?sxUzr5_0ce@gn&EI2Q())~^aUMOjv(IwoTk|5OWeu)OrNM~33L&~D-D>$ z?KO+jQelCXW+7)Y&JAI03^2_rRb#yR%QtmN?=i`a%9N~AKF4gf&3mgWK{*d1_06(M zu?ielo+EDQDwf$s)uwk4o=BN68LqSf(%O+cdiF8>o~G9ZnJ zpFxqudy(cyKmD)f1-Wr5?{KId^IGI^m?!$<;YuI8#ZufvmXxx?xZG_{ecpyMjq+{a z9}FszUAbhzdB@c=jzf2*o$Qe#IwbAhkye=!ZOerSGJV*UEFYU&JtS%GfoBz_;(m=T zAy2N{+hSdiJ3iK<=#wd+q^u%5|G`P^<~6_yzVgvpGuZaSt$qLEZc*D=F%o3|&QL<% zz_^+(LrG5N^ZeHu#7hIi!4ZQRH!Fd*BaNIqw^{XcpPYdJHCeqA5k8V@rzQ?O}| z1oDi6))34=(boTaSCY79cx-5YqQy~ODKs-AkQTFvC^tz+)wS_Uj= z?2;6YLLJ~@#+1535qkr5{N1--p{ImCMgSRb53^fP(h*qyCafaTK^{!Qbze>=%9f27 z>nd$ylJ?N;Jv5HQOuSBD`L?66AoD%crdNaGbG)`t6(@bTXxLJ7^*|+>>d=d?O!+W6 z)7S|Ir|syN5-_;^X|UxFKXFSz!2L7!`1}vD34c`Z1=b7#Q6%V(4&YvSU%w`^+<^eZ zF!A&3bo6BBM?1(;dAs|2pjv1^&*kzKH*0*s&c^+zvE zll2BCF;RAGrt2Cyv+ZIw*}_kSfQnZHkKK&mEF87Dn#wsyg0IoxmiPd@<`;wtMMSYR z?u0z+q5>Cx`Oyfbs3D%-9DKxL)YJ{1Ui zy1NoYn7k98wPkc{dpIg6nj>xR+EM73nolGRxazU?oHlgUVn+@tpM5S_{>pnlpQFU< zeZxj1z1-sGjGPe_&AIP>a4VSt8Urt6dJl~sC>@`bg*ry=8T&djhn&t+`thxDMp**O z^?Y^!cc+rhn>UwWkf_WNDl~%iDagiX`0XN=9A@JK!y2kg89#~dRBt}nn)GsET65)x@6~;(xS`L z+j(Oq9w-Gs=#q9&3VQ~WGbHb7Q1kTB=D9m=v$Lj~*aAyz(pS8z>Fd-qjm$0>#KKM$ zV;Y)$IA~NXDQyZ&FMoo-J>LA1c;9bT{OZ~5vQRF?BPOq?j^6Tl?ND{O!V@J}S38;U zSU{#=YFc<{4x@g}Fh}RW)4TZS7l;lXG9n91-^uUr?XzWoWo?1yh!d74+jhBR75Awg zvE#@b-Ft0*;9Be)PQu+|wwf^Si#gbK9k$!zEVKdMFnmV`16^v zX=>jfWh#m9JpiQ$=2pwUl!s!zC{r0XPQJWj)bs3IN45HE)xd!F?Hh+NlSKgn@w^?2-jGUniNgGwJZO8Nn9yxF~w;{H?UIiN9QS( zrUMZLW4*rr9w;@pd+(Cpy-W-WRi~ovdCqCDWX7oX%m&;NKe ztpAMhzyBDQm?B*;hFRT7OYe}E4$#kmd``rmLD3M7XHMRIqjSg;wHBLa<>p{&AElG6 zc~|-fl6zU3RktS|#yD3hNPVj}Ypcw*`h>%I@&r<2$p%q)Gfd6s02KW8cX63*N|jQk zE@~?QEy|C2Am5Z5@33d)r}DEXy|p*!_o2agq9T~I`wLDod|nD3J>y=;PTLdf!7xO3 z%H<)@0dskG4;BG+5bw&8tB&Du3jID6`x>>8uI)IqGxumxP4DOx))eZVc-<0qf0ZkV z$^UZ0^LH7tKkc3T=X?Gi8>3iAiVVu?1*htf4gKc4!IF@>(x-s~OpWbeS%#>$t;mrTT_QG6H)|JvX`0Oyu?SRI-nLdLgnbwK!NK(sO2HLP8+F-?D!CUo zvxrpChO`2#=cJbZftvl2;;Y58_-~?_=Vbg}7ZA-3QVm=q6PM@k_17RdfPA_KM^6h= zC-v@ULibvpHbD~44O_d6DZRH0@=0F0z8$D6{aRy?*Zrye`D@FD$9%J`E6?{V$tr#t 
z63_mW3gxUR&*{A^V`R+#W$ZX3~!h7gW3?Xz2j!ZDf$+i>oLbCF{U1fnK zlD$IN;CflEva(<^7$cm`?i|XIu*UVRA_zmh%-bgPt}YPUSfVrS2{_kTd3C@|`4y8J#e+4UEUEs07O24+hp5xMDLV>C(tuN86<{HFwa~f zijpUPMsO3l4OE7P+I>d$>DZCa9!0pq<`R|wY>j910T5GUj_zGLNe}@73f+CG?`Rkw zMY4YH^+7o1yYPo?i(pi5x$QXpsp7!zGA*eIg0$bHQ?=isAG})OUdyu5h*x&(RKp1= z^7CZ3CZeXC!60|O?_sBGX$*PMq?VRgzS9P<)v9=S0+`*za9% z{AeDB;Ern=3+nuhjbCrPRhUz}NtORQS^NKui2;q6MEdC{quD|wq(Er=bHIK4Re2u5 zfzpjz*D?X-hcjsJhW_q{%4l*1aw@fIpUPwoy!oVHLk_87A4Lkhg7gEipebS%x68o( z;PnUPB0zkrDnjOpsQp#JM1?nUO~4aeKbwTdYHc3_=Gc_!>aUIDuKxtf^3Cytlklb_ zcgG)#MQ<5$@A0fJxBEW;oi+?1oCo;qSsvSAL=&}bKQ*`Nz$sDhHe2bDjHQrbfHHPi zf3IBf2mniZ>1wp^Sbe8Q<_U z_$OE#CGqC_vI;!x$=*-xx^?i9ch$7zm)^~RDPV%tCi zqZF0PfDX6GXWXXu)}F|52iKdN(Mk~>!Ul2-MI^?Cx-ovuo9Ch(nXQ0$5gttW)gcrCOVZ~;Z$R3okf(uJfNg*}1MB4{f#WJXiysL*F)Y1_ zJ=>><1T>ebFHNP%b-!kXmcF$a$D!bPm(#`x(>A`cxW?CsMlrm;@=>)~kv)ko|VmX+a&a1QxH1Mc4h z(ai2jyhXhaSR$Ffs+PNK>-o?t&G|rPgMKvG30BF7Io8$~>7Hr)I`)$Of^QdWQ-t1u z`E)HhKPAZ^JA*=F?)^dJguspbH*<-oomQn%`d+)HO+CE(rMMXsk zh)5BLN-t6bl$zK85fQ188Wri%JA_2((jp)r2t=B6B2APQ=}ka7(h0pMln@|^XZe2f z&OY;gGiPS@oPEvS=X`%)ypZBrS?l*a>#oBey9V>)a@tyxuIMC$3RN+|s43`~#kKns z`3yCfztP@ zsDXSb7n{tvAKP_us`|zG?rYMRXqAvmkrubspPe52JlVi|kj(q-Z*aCt-3{Vm2#t4X z(TFI}8qX>4Eb7Fv6>1}c7`O6}ZbQ}K4oI;~f4QK2E#_Kmc=gCyE;a;UvtKaIdZJ1x zH#jO7(2CPZ9H?*l$DwJ!4gziaDM$P#?tf05Ku`P8k2)$6x$4hr^-KXFJR zWmw-cvXD%s_Gou>n%4C?jq%x>cfE1*x{w&)Zl8ZRo$;r;rUrwDw@98wTnoN3@UwVV zB+=c_dG@>551F_I3Genf62pl)2LQ>fAFLb1(O7aJU6`+=gemkM37A7X*sM-(b*bil zQn`ppZgP zsDQP?f_YvRh2$;+Nm_|&?s^5B#WvL+GNMzGx1r1w3n2R~)W*XMIn{k9ko(xUJS6zp z*>b9jPDuSwk%Y07D^dg3=9b4%&_1%$A-2$o9?h9SNEpiDQka&HzC>+S>KzzVLq|hr zv^w*9T)b>Rh6xj4V7T~KO0e%{una$T*T=sFY@SJJJGc@)Q=>wZ`Nk{i5qfLb!O;pC zC7R==9CSHB~rLODoQsh89z5!UoPycO2E$o7u^zJYU4tR8qytt|t z$iGoi?02SwJFPJPN(=Iwf={z+`mg`=or4rEpp*npuq=x83Q+#3xw|WBP}Y)Vci2nqLoEbdMx?v z&_0b?o&(-%sPILtv{=|v%~51V>4SOjBHU;5Ee!O5qDf$Qo%L8&%eQaA`mn?&T>yB; z&K|&A%(W15_jTtB*x45fojCKvJ_1cUUIpt@$@E>5+WOU6*z8<;am_zn)fvJ<#!SPxMbS(vgf68s z)KtI+zDpgOiB`GEH(9+xu6cMNoAZl7M8w)4T8&lrZ*fzv08Zea6vhAEj6)=$4xhj; zmQfRDzBK15`wSmzYos0hAVMK-n$z?e$ZIc52|Js-2Lj~HEd~2oTRqSB1BVIW`_(g> z>sM#M9FyucVsNyHWJ&)cuGyc@lrzB~tbTyx{Z1aYYF)O^ z)r&nHPVfTm(VFc?|2*sdoO%D5*Dpm*c<4)te9KjPc-iN8-xe#wH>hvt6R2W6j(|`B z8f4h5BWEiiJ*i|{vR``D*BcSXX=EPHDL2nfQq5P)v|4`!OIsLY%hZR+?aRm zt@(i2d7_#JJgeQ%<(&VR96iWGW>m0u!)Q+(Td2S`k`L^Y-T4#AM$*l?ej5IZxxEj< zA1_2%1~-Mx{!TDesXu=}9Q}`{YP@-KgdgCmx3wdUchWABE%7{tCw>ZlmTH-39G41W z*=YR}%QX3@2icJ&8!<08|P6Y&hu*ou4ek~w42m1Xc5<`$PPs5;ly z)#!$r{BW^@p5s9wyl-v#4I>R=2o<6V?d`j#g|SZLi-au71LFCp+&JE->@~lPYh1UR z)?aHoj3}tQ+`W=1)!Nt4n{cJ9{F$PDwt{f0rw8g7@<=6m8X`cv(`lPg;1JBW|NN^c z@)-wpfFa;!y8>OLRUL(}8B9@`xvJuhV|uLm^u?Eb3)2gH)zbU#4od_pimqODKYaY+ z#k0p(*Ly9ZWsexJsL|26(;0-*?C}ypO;&6sDaoPtcHj$?o=*)=((hk4`5paJ_hRjD zv;4oPWEE?n2_YN^IgQFjW+4M|w(PE>w{ME(WZql6d}O^|=u@lQ_ILParhExat(jco zsly1@j$d?py|!L&3XXF?wqOSw+&oB_Rd zp<*7y6v@e&Z5eeny=2Pfa9&YS=ljC8YA`b^CHHBUN0jIiyVRZR!6(lO;*M(& z+(<%wqcQa&!QSc`1bAr1Rymb-;!>EG1I}lR$7oEvwfdZWKJaLDv956S-v2m4Eh~N9 zjobN=vEFp2>0Xq!9&?cW&cggW?i(E}+B>nSZ)NMEiLP$1_QT+|4gcEX&}LqKe?vW6 zn`Q@Qs&a!8#_1M$7;JQ3oyV@S7l8jCxds0#B4OYk{!TdjkI((%oc#ai8qga!v9D3T zo=jI_>F1U+o;^{p76Y1APbbN3C<>N26&Xc@)futBX056u^S0D52b?C1mxB0`$#?%%HJ1JjcIMIB zd+bbnQ71HHJ3Bl57`>{hoMUqIs0?{nj^UbHZbe#w@78}%Vt&!-t)Sv94fDelKaIe2 z_Ak`j!Q1Y7T_^2__i9gUkT<4&z{FIAIkAem4ijlqMLW@6I+=&`=WB%v-~YBl{-?(4 zzczCJ@wtEGZvD4j13^{^;dId{g>Cge;(^F5=UeBqtdb1h@qV2_vr1%TwCumE(y3p+ z1Rfr$zQ5nhGwTt1U9{sC@O5l!5Kbc3O0R#yTyS-y+nDz*FD;{b$*?Yx=<)>uM=J3+ z01;N|$^Y8X$@TK9MXR1EJb5WH0Q5)|rHrY7*0++SiS|0ZFVaQQSCZ=bnIJmH3f-6? 
zB@jIx&tRb<)SZeg(0U%hA1O?^R6QuWq}=7z@M?4YbYyiyCRLDSYT?$GBkj+1%)W3@ z;)boBhUIjz63i#@ewty$>(6bhg5!IEBeYaFb8?Q^=L9Yj^s^B^4(EcfF7rynBpyDL za*#W3dbaWn9rv{Rxud1RYaC32l(}KR!SPAGd5V0CCX9QzYecezNHd;xgo=9+=f3&Z zd^3h9mOR+O!lcrB7>jzFL-IPsyr51tKw548z+d1VQ9U2Thph6YLK(_{xgYM<&s40J+Q`Rix(cS9=B!ijtZ!qh;>onfMu3lCDE-sxM zHjXv)2{+f@oi2LM|JzrjzmjO-+o(wBmS&CiFFJEp)RNrNJb2R`)eP?9gpmc7kQ%+J zY*Z`6rN5`pC~MeW=ua`4MqN6I(1A*H6d1VOM^>_~s<`dqH(|!C{hyet+d(Eluwlsudwft2NO*LE*{@Hi;x7f9Ar@fEt+n{)Zvk?XG-jpNped&n}g|J!` zll`SS^|pnIdC_9Y=MV+{5Y+@h;iWrbBK3fxoG+biFK2C&`1!EOE$5Z8nL--n@^!C= zuG^w#%3XOk=-M2oxwqTNSIT!B%nxr47Vg$w)0S9i(NDg<^IU;*m*EQ_Y{83%Bs>j? z7s2|i)%8%SQ^*hW^RzS^2B_&7UwclH6Ixp!T>Uj#IpFcBrc#IQ`#03@Y!sfxxR~*v zBnGoZ-JFxljAhm>_y>OglD>BUn)nCpdU8HN{)*4XCowIIGORD@=3@PiBcCB`$nE(E z)gi(Do#iwT)9Ba7mycmnf%dARH!iWhxlnh%IZS1pGC7gPf(ah&YI6GQH+MnDQ=k}P z*!o4Qm4NN&pawQ_N2k0c4)S@XX3kI0Fh0}P@(D#6h=yv(qvTa{vOl&FYoojV{E{Y% zP-ik^Y4(c>`jHR!@I_T*ZCLv0Pwp#X%}c%0Y7Zq=-AE=4v@0b<>rmjoaL)GYAte2D z$A?{Az{TvLz%XvOi`YDrH0^{oy!XyhF!8Gn(zX-_@0VaR#P;cy{16rEh>l&zIa z-{bI2*_jm@1nv&=r=R}VXWWvX8}7uN2S61qP{X8W#B`4&Y!D{eZ+-m-IQJLbV!`yH4-m=(s#x2 z`e*~A1$EymXt)Cl}gr1lop4&NN# z7JX)?iI3yn*t)*VQKHh`USOCUOiROSGA22sa0gy2<)-R~=cxKV6e*Cka2t}HiiTg1 z&5dP*pEFfd@9#OKKruFWhGI3}C(M%as{PvBF=3xtp2mNUAhSPxuyd_No^PO73K}%r z><;T$P!>tVKbck3xG~6&nZ)VkM=KWc44}tOW`}zNS??O$FRzuR6o}0Ie(u$POCURF zGS$Nru+L1v!ZnQ#8eC0_H2=jjI&QR1z9UOgO0WSB%1p|9Y&||FB#np)S*G;vl6^a} z&JP2Co7ng3zh~}$19uFQ!%wl>n4kQyNMs0eRE~HoMaNweja&gDTUdD;iSl&G%l6x$ zBn+|q0EqoZhdg;h4ZNkp`vYnC<;{thx)020Ban)n6cM0y#0EcjVV$jD2hbmvv_?1k zi6SPnFf7O+q;z$)%S;huSi78IZaellSl6Q9ty06iGH2n|-dx7{59Z zKe`!xSDA%U&j65I1fQb&WKGfoVs?{aapts19;1n=n&bK_W&YfoG3tjMK@08<<^-73X(od#wk+&X^GW=1&B)*t(e}`J)Duu8jvX_soqpzNF*y z;ie9#ZnX9TOccNS;C}PrL-IOBeHKnuyUI-sz4`sQr(Ysn+Z2J9*_YYhepw!N*(ds{ zj<2e{5MTsgut4*Kll1qib`({dNL9oK!@#O?awE#x-VW4B!d|mIxhudWC9@oe)-zKN zK-u4|s*Ub!^?l)4a{uO`K^i`0vHEWFd)_Q*=vp$_Go0FiJv#52voJ@z3fssc{IHM;_T)pHML#+=3R> zIN=*seB)f^$hA0tE5{az?H?+2%m*~49Z%#Pk=vO`s#@`buP6sqMZ~tWkxuH-P zLL;55r_h`+S25yZP_TPd1J(S)jNOn zg9=Evmv34|UBu;?NW-T}I#HQJ+Lkn@cJ6nDFGM8$kgX%gYh)bYbp4h+JuG+EzUdcT z;$x^k#Q!vw5x{Q5;4V?|2< z*&I)HdHEr($TxPo*S~Z3#s0z1S|fxa-*5#E4Q8ttf*^5n`Mj@fE{uaR)##Bg?7OhL z>u&mPf;qY4lL{+VNpsNmtVa5N_`Y$m0F$^MabDY+1Dzuk3}3>u3dsPZIQ`?yo=ei-l+>a%R@#U9(9q-4`9r94QgMj z!swS8>bl6P#8x$OO3}yh+p-gmY?DLtmHfGiH4mI0D&@`jMJ7{lMgH`r0|>byBm2<@ zX$BLK1>z24Vj~9u@_b+TVQtpqECJ0rcJXqjc5F@HvX$!T!Qb<5n{QV^t!)ZZ>0zNw zTo+QYCVBTN#B3aOv>%mSat8h28p1Aey>W?)tW+Ns^7xH?-s}+~7Z9CplTALVU&UYS zy1xW28|8BlXMlOuj|L7Oi~GjoLcCsbs6{ax$gvf7deVmn3K~pWzA^o@4GM+qvjKG; zY&Cjbvmo%QVsmH0Ev3ezX|Ioo>q@D}b3*3Qb{7G6U86EfPd~7%Bl1j=`)ftRFyCpFo#* z!Q|6`vlP8*fJ&tQ*A zDoj)@3CbF)KK(UkYGN5i8vTe7iubS<)nc9IDOIa~J3YtE4B}&Z82(jN&WgVWuv3Ht zkpZai4xq4_R>A$EVOp1^KT(3(J|2NIF4Jdo=_6Fzgnn$bjaxhp|RLs0EUJD_529 zJf)hBbCP2Y!}cHj+L>oU^Gul)MX;6Yjz!p_jM}nz>&ny%_vUVMnsGdr;tM!A$3@Y_ z{`f`rQ?vPoZ)?D24b(>{*n_^b|eHdb#vn#aWNh3C@R{m+57sAYU%K zjd`kHsSCmeLtk(MEULGCnsP)MbXo(hSN~Y}nT(?TidYeR{5_>xyd^W3Qx@R>433?j z#LyzjdE$J}ON!4^cT2IHsj4w^5vWM`otKM~*SzS5>7o~KhW141haBMh)G}-n#h|`H zE_XP5NF!&f0w=S4O);uP=XcjJ@}b|We=qhbS4uxRJ4>5<>L5 zE~gojQx!=Jye~r62voIcQ|f8#Pz}2=HMdT9#c}Fq-TYRTDNA)h;?c1?T=rP-Xl60RDOcGGneb?7&2;1I|E(U9vH_9uum#N8?|d? zGNLVvK^pJ@>c@T$6{W|e5xRZF7Bv+qjvCK^=F*X9(Ou+SG9w-qPUEo?^P4%$xMX$0 z-sqH#Ab4F8n|X$16`u8r?o6GSw;SH`oulmM>w5&@8`opOlXtZ_8OU$aBB9J6HZqQt zV8|y)o%3JuKlH^8TP%GtiWFTgQ-J`gng|uRdjFsWZfF#UyfvKHCqv?uEAx z9h8MLv@|~nXZ63d*25>O@rw=->~y$NCySk^fzE_8H?(pZi|Qo|FV)mc?m?{OTH3q% zBTCi%KTveZ!86tGHr#8G#JkoI(>U2rm504wEfo%6OAPh!ND_{9&8lSJil&Nd`yI`5 ztfi0JTa!5W-yHe&M5CTB&3^ktE#nOA^&BCziHFtrT-Z~Spm0yC(Ym{>&)>hEJ))t! 
zbBSwDz986{{9Shbq=1gt$HeKw@r7@*IS*}bq#w+io6K%TLQaxBqb`*AEKbdeSv!TM zEPkAM!|YeW`xE=FPMp>ctL)SMfKYfdW_%_qIYe4K6&(v2hm*p_{Tx=1kOU*^_{G#w z4Ss>7cd_5K`D1qdU%1V~J1X}{J5xV)rF|OE)j;;~GL(}x08tvASf|M6 zeK1OSHIFw5P?>xB3Yg4~)bp^ZyH{a)LypOZu5d%_?((PNhp~TP@LFcE-_rgT672N?Kw9#-dNA0dsjtgnEZAR99B;^T<;=ucKvw!+`V*t375n-lYD83$3RXde!WPl% zz^Wj?3NVGA^x+2(-@_?DX*OpKR`W^15sdzjr2SZI=Mf4-%{c)xef3}rNTn`9w!vmm zfA=YTo!BqB3!#~m-W?xs<0ar+Z|P0^3kZkYLwZNPn`pR&P$SIIY14IQo&(hdW&zby zy;SwCk;S!+Xq#tWg4Z{gHO}W742BrQhn&2$dklt+S(J0;c!w8ppYSq+?G7}uhDJS> zzemx|sJ%1G@bdl4RJ5e*y;p~4M!u9=OrARRt;aQpO{h+0vZfA~C*}PGmd!OVF)QGX z(7Nw<&BgBnwh$=Hki3YpBqwaw!(yLJ!QhH?+DGx1V;K&Yfj7MBI3mdyD4p_}p<1!b6gtT(VHK)zHv^*7kiZRH+YB%*Wn)T;?ME^VxgK7*Ed8A78z)BoQ zdp}y0tq1uZbslj2z|JeEHh-z@-8=QWgefK%`G{gkoNq-_-28J-Y1)#ahe-CAx3@zt zzT9=&2Iun7VFxBGpg(U^KLo`{oCJEehVk>riW$Sx=kiQg1UA*$5u2zsLRm+YnzV@& zaF$7#i=ck^&z{yqh%A9?bj3Bv<1AZqd1R|N(L!2buLY)}p{s4YcL1m$6`_(5p*bT^ z629&yc0uA=eM5+~ajKl9J8cKE;{Oq@(-CO`ykm7w)=sw&c`NN_iWnXdkMBv@{?Jf5 zQvZ}IuG}vzaPqT#D-FVpxJ=}QF202yD2P>yH;RrV>syv*B~Gqc8zVlsnz8Ih;IE=| zi4VW};m?}uU7gqoQCA7^72Yx6p!1|Nym}tCo2uv%q<^I_h7k36qtlapIN5BCD;gtX zGCo?>em=~(Z3U=&ZNc5#MA=E626FWKJp>|GxTq6VF zxxv$9VWFvnA$FTjm_CWGhJ9K_% zQ$`i7+CGjUTsT^j5I)~LZG*qOwpy>@Q;_(u@WUp%#ZK!?>pJW7=EZVFt6WrxoUK%b z$a|@_)^b)7Fp@=?h2AE0SxjUvX$J?3yOA2OdjxDEND?te6{E6 z0F5Eu2%(gCY*A6wH0>P}>ds{SB0Hi2xxI5cJjWq-#Qd>(zVW#F)s&LghL%q(q+?R+ zt1P$|>dupY9^Rd>+Zy-Q`*!05kAiF5WO`%INTGUu>q=w7dhp>C=DDd#SnCi;HomCZ zPZAF08*0MO+*j^XvBnRt^JIG~8o7f`7W#fHPBP*bx=vbOa@iaAdUTVg`xYC)`gM&7 zUYld(Y#urBQ3A~^#V2a(gG$VaKQ<{T8MyRoOzaKwg!^xuBuf2oTjrWXh#p&1Ds0nX z1i1JQWFkd3FnM;0s-y5_nC{Q2y)D-k=LtB2{`ozbfrf?t zkhno`nL8{H!Abu45us6PCaEi8jlbyr{mtSjd7kGHQ1aF;+@;H}3(59we(+O#>SzY14xiciK4aMyg%I{GiP%3@8B%M=6w`qJ10B)c+N=yPIdB!Hz>APz93b%HjA ztr#S*uk+SE5nK}!cXPgX7d@^wf7n_%oJ_97?=|)um#HtdXYtQd9BWMa*psf`-A^Nl zZ3Ef@Fc@-~JUI7qGF40)@yfyK!4qs|``#EIR!MtL)jLmNZg|B=J-E+S^P=vg4uE|f z2(5U>Z&y^v?O`0qaip3fqmn0=Oy0UD>=H4Dmf&A>?B;Ss$X;M}%LR1ueoS_Epj`Nr zfqXDC1qcwar!s)Z8)r|GTN3w0>Neruw^pp!vTLGgi+F5Pb2-H59DaKgBI_S6lY`&)O=dm63IVt||H_Nm2+WTp4ETq4h&v@P8) z{iJY=rR~y=+}`jv6gmoT9Z^4dJyk8=`E#XKFgtV&`cx>LCV}zWL~1o7!J3Q=uj?y? 
zuMQh5F5uV}{JwAxYHjKFVvGTlegE~^A2|6nBfgLu$-lGAz5_AF+_4%<$00Q!QXEmP zH59@=p=#4CFeIqS(^z_sbM6;iUCe2k+(g0?%kU<>##SHM$O3~$f$x_P!$b=bns}i$ zw1~6>|joq+{tVeBggFa1Ijo} zuDuuDzmL2V)Tpa(t7o`VjjZj7EC|n2aMQqeQdp@OSdaCGWA!wp=l1Vo`6_B3v3`#$ zz0c4I-~x2loKLoQ`J>^liJJW9vVN3k@Li67I~|~K46Fn9@Gc;wf9^*JWH-(XXl}BY z8#U}njMv+XeUM<1uC1rBe4|O+GN#}n0$mT=X}l@XSwiM@9@){4c1^bh9$OCBkbkwEJ~t zNR&(a>#jwW%Ndl1iB0CxL|BCNq55k%lc-x|Zv0ZmYTHRGd}lkGv3o~ zzCTY`sd*WpYmE%)kISI(;{P}mN}Ll`)^E1PLdA*v7nP+d+rmP{PXMuD^s2bD?S#{ z(uLo#2CE!@vSdKZ`vlPPUi#*bmiO=jv6-p$%d^l^J;xKnvK`zsblDR0-oJ(Gzuju` zh1^;?xdL$mZtPakM*Q-j^4;@^^GgSaD4XHb8{SIbG9-9-2Xo$rR7DDPmUw|>R(6Cw zQn&hzsuE4)2zzU<@g`+nMPxPDRj2u)wfov9>w{b^sO#H#j+wb%s((?wsvJgUJhrkIp6Z zuk#eG4MT;4nQk0eSzLfVQaoa=2T*G+0S$LRYO=NkYNBmq(Rl8;xxJ_GkGL`nS^{A^ zEmtuxe{F?JEW(yE+6dqB%xyEW*f%yStzthO?m%l*pC>x+95T?xVXfX#fx!<-J@_jE zwGgA)+bY}{nFg_D+MjIzQ=BK{Ulw#0YqvnI6Q*9tCJZ4oJwN}547&9__7jbxzX1Hh zgP+NMdkPO*XPgCbE021{^Vw9%oI57CVh+_~`(Sn}(U~eXjF3Krk*BoWBbtU14!+ZGDx%A} zQRgp0&tjqwC91Bo@=iz0jEgv^Lnz;YD({XG^7EcSpcn2J9j7u2@%%~Ih)jpvBXQO) zLGPopeJ-~iX3c(KTd5#p&X?AUYSWjyssU6Z;2F9Fu#MP~IuNaI_<{o{6vFqte=J4Z z`1O94_ARIeV$*1yMIqrZthI7SXit# z&-33LeHNP~-=zQm^bX0wc8scg`V-a1#v+vv$ZA6BTH0BHUhcpw{W|O}1M>yWlP~Ak zTlwGI-f3~GT<$q%&j=uMOe@LT#k>6uR$s-an*xlp#Ac*Ce9FwuYnUs3C&c9aS6(C2 zU29jP*9UR%OtMeo+md)|I5U>RxHHoU7rW-e`B_3ApULy{$GG=)%N`8nh-u;q9Jjtq z8?5Qc^KC_oT0!kYh4CMG82?pJwSU(oHj1IhPwp3hD zFfQ9Lx$@&AI(@sm#Ne`oO&O`UV{2HaEbtNVYd?Qgja6WtH{-T^48DCmy_)+p89f7Ut>puz-KnUAa;=S=r!L~x4fCgznr>GmI(l9ylWJd+ z`Y=ns2;~I?$<|HwK%YE0Wa%Aq+}U!U#{ezl;fEnbKR%cZr&t`B=@u*&bYIxFZLa7V zJdj-W#i+t=1U*R}tP36HRiYM@J1As-A_ydsQaWQu1(DfbAo(du=;EH1x(7sfGdc_+ zBaysOCNqYkR5u~sxh&~OUF0o_+cfu5MOM6ZPGdNhSeoPG<%=VO#dz5e{>B*AAAj4; z0)Nu2DT~eiqT}!uoHq5g)=&OUAII$Q0{~_|Kce32^3Nl*lG6H=Z}A)7h@RCF-EY&c zdJtiBYTB{;f%SN7-D)2WvV=|TCYvyzBS@dhaY;4+$V=(2xvrQkv{9=B{`@3_F7S0OR0-&%tMWaqZd$C8F z*spgZgGs2h!A<81T=}Oxn&p1Vm#OKVUQE)Cjw(tIe2(ql^`Wx2>o9;=GrE7xXw=Hz zs7VU-d^4j?IervfY`?XE)A~gRpxSJ<0GHRO$s@t(=9jTi`6Y7qkx?>Whw+F@-#zAu z?N-)~Qhyl{G7jji-706vBfDEBelTQf9k2%@v+Q2}zeBHo{kyp~Q4_xxTF3slWw^u1 zL+s6i3o)MzpU6+HcQ^HL*7NT$Fz{x(xSY+dsf01{g|k_Es<7!lkK19}KQi0HZcbio zPDa1R0>0zyHbj;X)l{s>20CGCJutV2{X8Ab8Es+yu#zx?d+#R+?8pOrekREEhR;u= z{q#D7A?_t_c2|7a{$6 zKcSqx{pS-(O(GuC48(#WBuXQvF4qnezp}Z?_oVQfOYKol1FjU^@~6}7F6$%a0^ z)k>E77g0bOpy;Je@VC>H5b~>8MjQ{hT8Ah%sc)unm<8SHX4|*WuD{u>n3A?j8fR&g z{%&7sLUx(z+JOUv%(3k$=nI;ulH#?B$X>X)Jx#FZ@2myOEEWFB7Us zZ+_G``7^!3Gd|5+iF@mv+v^9!EO&$0u!5kOYgP$9DVIPJzwL1w?~~IYwHtL+RjuV&aC8QwgFM< zv}1Fima1d|c+?t7$dlvd20_Lkzvya>ceYIV?Po=F()wbxd&{`*j@x!_06ce{R)zmkXwF2|i4eVLbgGd7+ zKUpVajnvsQ3dK{$i}-!}#x5(IVk^I>{-aa|`Fs#cw$YA+?npyv^Z>p{%qJ#`kscFB zN5sUHp#(OXdXKUhd=cfdHuV=GZq94CQZatLCDWJY&^&+^kG^>^wcSddgF9|Czt`-Y zX*K9prCN#wvR$&R+apKv^=E*z@#IXE3`RV~LyxUU7I>QzMkd->f>W+QES_;hDt=nY z{A~XN^;g!cpv9*6jG94kk%Vt~rqD$54RUfr4Zes|_CLZE8iGeSKnDz=w4=B-+#lpx z+L*(kDgs`aD(K+VecDx5)BAK~Nz)VBUD&|XDU=OWsU`7~w_CH}@<+S=H*Y#Qq1LHA zsuxRj&`&gu5ty7oipFF7w2kFhc)unPMyAV$Bmc zZTp#|X(*y0;OSuyiuF!xq_LOXpj{rg)IYt}FK^c5`N+p&?T8%V7abaBSLCIC?Xfr0 z*rDiH3@Is~wh=^vz1608AF#m1SE^cyiYFnNMdn)m9zv5qco+AfS!$60THg!vE!B=eO1;f$#x9o(k7$1AG1DRLVw7a}&?8icO@nyHz9W9wm&8Rp7C$ zo$I+>POczcZ*OeGN2!MIY)hEy>?=Gq56(pPW7ynV_58e?^3`e$@yAWkdl_5Z{`=Up zrDBBIoq+rz-AH&l(((?Qu95~OgZBhA4E-m3!xNg94H9vWYrFz4Rk z%QKH!^8u$2!65_VW~Ml`y3U^m5E|~)>hyLf(irD+5Ugso%vJXuyPPoEf_APn^s?a< zVV=8&nNz?0hnz)}V&&}Oc+Jq};QS^Kc*4d949owCqM!dI-;*b$w8>z95qu~(Q)k#7 zmeyKDQ&IJmLmWi+R<(uNiyUJaU@`Acw<{=kEWXHVhbRT9@z6qx5JPT(^T;73l^)n} z^VOL5vO22X>fZkq^*k(|+}LC_Gp14EG;z795E_P_;cdj_QGG$|s{keb84zGryfE!n z;LE?fxvPRxBp!6TF1f)9NPx|iEvXHdPjvtI_{TZ;$2IWx*MNz^;R9=@gOL^&>O6P= 
z^xa<}@8NIMDKg&jQh;KBDD}IB>=hc_(YV*=QS=yXl1c8)obPkaf|*pGa{JoSSO+V4 zmnA=`11}OqY>Xv-K#{ql@ltdU*Z${P$$DdwOQu<64L_}bsy@GLeA|pEoDI^cA{Z`Q z#PrV(?0-JE|3_c*D-#n)7dAcI5BYAj(*no)Hyodoz1naz<#&mUwQu&IdivWfDreVN z@7Yqdzf#3SEA({+gBummZI0E&?4>E#wWV1t-jCxr_x(aRnP z7modAeE(se`A<(wb4wy4elZZj)sPAQYAY#V);oHWzPoWC;HJUeGnp|tHc>Uk9+VgT)uw;ujPng{3% z{A$|+pAv9_&Fdu|5MLeW&PZ}Yh+@3D&Hc5FmO1YlS)WxsTd%1tB}L(kD)hfg+5bit z{^vhx81UeAwrR|f$@a{KQhUV|{?}J3MfqNb5^8c@PNv^snPQ8#BYgb&aV@%r_r&Kb z$nHoJYn%?8#}=?VzjARdS%vU^DGU2_aVFLJl5e~icuC7&a|XlajcQRewufA-jXvn@ zzHxcpBoC$s#J(@xl#4&(Ac{Ly>^{3?YRG0@!v*&}?ES5kAMgPz6@RHP^T~-qN_sPV z1CgAV?d&9_V;oK=S+%~%CwyK1QtSsyGC>019phrk9dYBRDr;VgZ^1|$f8p)9y z#QfX%9z9Sjex)%jRd6_cry)-5Xl$qBkLV9x6-JqXS zusCT^KSx-j#SUmw(RuLlKYA1YfBoyS_+-l;9rF-p?R2bvAgw| z)N*~=5O33vEIWFV!kp_ zk5vmcx3vB`Ju6icy=tKbOwiv&J%6j_{PQ*Il9XuUrQP%8j~HecT2Mcsc%IN!6_fg@ zvj`=(JX^0d$u)ESF{iV&jo^uPnP>7%j+`eJjs-K+`l}M`X{Q#MqP#5e@bIeDPM;Uc z%ASfQzV#1JUy-0Iv$1eg{`ZlKef#tFa1V5)5zO>Z;lRwJg;4q2Ks`nuesl@5+d?!y7uQLO@L+( zuQ`asuy!%2S5cz!0m4pJ<K!%jR`tku)l|6w=uRGpaB-;FAcT zI`kyA(VyX-$4{wBqimk0GTLta!G+ct^O&0)(>m~W`I-j>2C252f>Tp@d+yVjY;3<_ z??rJj)+TelUW8CqPqBBvviRD1i_z5AwS2`g-fymU;Pd26HG*TP8=lA4<#=}f+@7_? zo(_Yj#W!#t?rNPN<>4?=&>o$;BJs2&EXLmEqI3ghu!nd9IYM`Y#4Xfd;3k^VGx4z! zm9b61LC$!p=$QRng0ca46`a|7sVmdw;!f(>((GGJdyD{^_1ImH+AP0&6|M+?&$lhV znw=34PzPO<>>2*LoK#TO&2sG6jwchMs20xE%y|afDhmfF$=5@o^g>C>+jHz%s^!i1o@tVX5 zC-7}kejh`$3LP*$>1WYn^t78a030sf10NiPH5lwUtpkF{OQ^G`VLvqWh}euM4Ncgi zsmj(`0{FGeoSV}uE^!W+e*Sq2m{Q1J`<)d2(g20JjQ>R!0fp^j{6WFMQfWGW9I1wQ z>@dZO&_%5XxI#rvVuuwef#mm>67kS*Q+LJe4M>1B+W|IMllrlYwnFJ)RVMSTIPU;p z<0PVYM}4hEhG;x~U~Gz7x`NXb+cqJYa+32{cFMlcG~wBD6e`|{5}8PbE%fBK& zJ;x3&lAsgR3j7@k)Vi`~*zZ~01~v?5(dqh}XwbWKJQmd2Ewq37&=dEbaw~q@4oBLg z#iFqti5vg)#{3VBFddeITtqBr#GNUs>lhi3%Y#W9^I6!a@ER^YJMdBo0eX~Lp-~qA zHg9J(!zQ|v974Pn7;>Q(vu`td*h)UXyW-6OeGda9K=I?l2SJ-|ZqChfK+DKRYaaiW z7BarR<_yvHS&6USQdK7h z8NSY?%?)z`{nj7Bpwo}^ms@&G_z~Y1<~Dvu=zQh}LVX+4P7$3VvbPX7vc}=t0ln#4 z0|3=_xj_=I>Q@(dD1x{(ps!IlK58rKTcD7vLsmPDn%$k7-CpBmzrU*ww_n|Jbk}Vd zz%;ThoEDpaNKn{N!PmJg=c>L9V02>dieB0Z@Oi?$=x434!#3m5pSU3Fq95;jt_E&q zIO-u&p!O(HJmI&a$2RGJO88+h(-W*tyWI7*Kxe)wg803Eo|T32Aw|87H|uGTA$PwlQH+hx2aDe6_z1Pe;lRM9TRa6l_Pc1j zR|AR(@u)xctoHRv(l=__h7@CKTK>0LSJpud-X>TR0GzMkh)NnsA&F<}NvfPeQBkA2o8V@`^irXYLM&3RwHIZfiSeziqLM=M$VRMyCp6f%`1PpIi@A-vJ=c z)FFzdpiYt5Yylmx!)d?henZ+F2bgsV1NGP)z}W;}aPc?9Zb82g^TjfW>OnPkrfW6a z(X2-AFMf`)WJyrp=Xk^P=+g?wcTub_+mP)_4uktaQKZ0yR3O;Fw0})v*W}PaQXAb} zRTF>t^$~jaSSbUb_dfRK++c@>SP8#gMr|UEb8nV`%8325TMta=!Er#B%ut8F=m5oU zt5^qZ>Kp3ET_{0-X01MG2bfq6(Omz^H!A$D?PFTO5YX2)=|t=N%@T$;~hs1hJy=Mt^F#iab&+k5e|uG1GLK% zS$Uji#2sFXj;xEWf~g{#4cV2tb&qfz>ghqJHo3Q;bbRL zBY3&)g45%TUvw!3q%<(9Z8VWu3+TDay1P)XBoh}9@ZW}zS6fHmpYOSb``^SijZ%w0 zD;43L4B+!{wD+g-M%38Uq3sSQ``A7*YXBH%^8WwaNz`r`%`s`4qR#!KOosb7P-Cw` zhwa&nhL#^64qJHNkaqH`jcibdwtJ}k1aGkZE(@}Vw#jgZ^oHDakUc&Hs*!Y7jVfcq zCCkvNZVfT36C{bgl{^ek@bJ1RW4%t0MO1UkmT!w>&in24v0$Q2JlWvrtqT(oD zWT1=kJrvuS$H*&~0%SK7Fr;$LxR=-7T_xGZjPbVJvmhU}Zv_myTQo;)r;C6$9F^3& ztN0lwc8F%oD4KcCEz~rS`D}s>p-EK^bXJladLKJwa)$(AYSEi&NXyLucR`%t zO!CEdK&z<>zizI(rGg>QJO;qL@UC&|M%2%d_jkyyLlCp!&aDy>CsJnYj@i&$xsEEr zXA)Qm>5!htATk=^PS0^b|GbC3ubIAF_}Kx_fWW|2fxPMHtH0>z&iEH#*dnuUGiBQf z7q5?e(R-VkQS6h-wg(QyJr|QxjRQh+qZbQ3?=ge2Vo&r?31a<4P%Fuxp|%V5O7An2 z+FhqH*u+n;W_*o$r%1KhL-PpAny*xcn92@RCKubMWkND0KSiC=QgPgAhX!Y)w8?A; zfMRy|fG8=-L$Y-dLV6fi>T#X@LVp0MQuK6I8xx3o-I6BI-pc*v79>rhg+m-cjlVEZ zEoZ$offVFOsmm7Vx{AI59O24CzLC0<6t%dS;mGN)UtDF2)E#Q^q?6j6LPOWwXd5$= zIQ?74p1Yn!3c^-F_1esuczE!^iEL-u^W`IzZ_=`hMw;U`ZU>TwXVZk|S 
z+fsXtYmi2{{W19e#ol{BHPx-_!YC?=C?X38j)Tjy&HOOp?4BWNcT>@efB^8vClsDoN@0xC;Lpt6HryzytDJsrG=xNRRoJu0$e01C2F6X{!Q^3nJBj=W3KPwKC)YA9z{We zeJU#0UmF`}k^$7>YVxxXyYJ1whTT!I*dS1>`-V)ZrnYvnt1;S%rZ*GI)DBCH3;crU z(Dzh|u`rv~f>81Hgr7O@AAZ@8c_6}8MmARxaWhcGs*K|aHuCZKG2DW8BGMo(>^_Kg z)@E%r4Uc!dXvn&No;!^{Y6ps8FMbvy?~l=hm`1n(FDdFP2ykYGy(pd(DEiBGfJ7UI zJV!NX&XR35k8b;;!#J=SEb`bNCJHu96=s#ZtB`G#Clsj0*?%|ymT>+IgC~>9M`MHh z4x&Es4FbdT1{9N<6EHf^Q49~S@F_=6t}-Jad_Wui3{atSS+s<$cEcJVZO5MA{S`D& zMjfCK{?O~(--<5Y2S^OcOENBuh|%vk|7D)wgL~=My5O0jBI23*aPxQ$Ps7C;qBP_7 zt!@N%t_Uq|Vko6I$URHAh?u{Wz>sGutDgWxSGGJtRXdEz|EDmixD@yIW_>P zjybbRTz;}rZP56j8h#A=dG8P;822G{LgRt7W06EzK^oDOD!SIRDCx}$uo$*v!Y_D` z`JQmyf#p4EY4!V++7c~|^hVm>EWYiYwAsWR#pPC69XWrF5_LMI90N_2RvXl*0^{&F zcA$#PRY$4hi}}_5?851H7vl%{`nWRoEbw6@oqXI^oa*(|Ep&PBCJdj79vLaETuQ%F z(XXfO=}_bFhJAq=EaDL%Gw*Q|OQnD2FF#mmXtA4JvY*|nAuICw@y!EwdTQ@zK+(Oa z`l-^D_hs{HzvDWVxhVIk_TOr;LkC>4ZbQit20WGwIn(>RnUHLk2=;Hl_)fuqP|(jy z>Z@t|8_mfXCu(1fG5OlrutrAW0x&3*rV57#K1JN78GIevRm`5GGepj<$j+^yWPeR! z<@Yg05;1Jv=2H0T-;BC*!wT;H~n!} zusP~&`5%treuppO6<*dtq^Va_#2MV^O{3WnFW@J3w7hx|wM4OA7y;kj*3bv`&%Y_q zSuVz{V2Iggy{*m4bY*2<#YEhEqsk!)cJbp9BpDc=Tc8#x=bZ~~v1sRDbBF6SY~vZx z{zMMZJ18?R1T{0u3*L92r^?RzZ2t#ayF@;8dH;Ch2;qfq2z^w5s8bSsx4UC{l(nqF|t%p%38kY+1Y!~$7zSk-+1fza82sz zr#u3=+<>Ifpz5TGJ~%6$eHK~&voPXg#)!&dP!wC{*avFJh{J&C#GBC81CLA@KExta zI;OYWm6|sD+$1L+}9OlmXdX3%Jp5ZR)41Su0t=?ME71Cl>Oqnfv0#;zq6sF zNn4@rq0Kgl-BG!;G(S=VReiSdp+&#H}b*Eb%oGc%DnJ(wsb{*YYJ**X2 z)LNl>koOtOBe$ZUJ1curSbA6CmR?G3@m%Bhg%6v1@Cj5#pz(fzWn0-B9ylq$q+6Nu zet`(3yw5LE2RWBYo{j3t2P|N!rV{3{6oF9CwH1yuK(xRL@XW{xMRs*AdMiJ_&WZ!RocC`5^!P1A z*XVK9hk{UlZKi2fofYqCVIxIYdzt+`)LVbIk~F?YKRlkuq%dBvr_;MM_vxqo!h~;c z;MXV*sAks3OmjU^BZChp><=yVeU%zGLF-6-LiHyrR-Fqy<Db8sGSlZS^u`_KEW?vF0yl z0M!+@EQz}mNXonyA~kR}UC#KK=0oMtN%F17)V>)>)DZE-{bY@A^4X@-=Ano!d?T%O zrC=)kuM|oTVXtg|`DK#eL}2hzFPorssc`jOw)5hSN1px;STV*+PsBG=D}Mh8^650G zaHW2)mIG(67v}z;JuMMgfhIwD%lPI1ivTgrG;8Rr>UDcAVGuD=5fd>q<9VMa-4LTP z8aR6d{WGR^HmvMUng+VGs}il%$g(@VMX{vj*@Th$henoX`5u0enlDd9#daV?dHJ0r zR=Sa(S7Pj(N*pGZx{PxZtT#V4?7}*_QYXkZ(cZro81MQns?fA7r77A>=L%2OIkl-n zo`hsg>a9$cCF?|C&tAYP#PxU&v8@HX zK4jaLVLwG7x(3IVQpw!au0l$6-_ki4St2zj#pWh$V46y~MEJnv+N~H{g)pVP7a8Z& zEQ>5IE?s-0wHN=esJ(B15cadNi;_5EN@f(PY{vAy82*M?3MMWV_PXE&ehL(Z3_lPT z3tPL|_zPXi=fo!A%ItV+5=A@p1!Qi#jhzjvn5WWWVSsA zA1@BX5G1#Q#xzk?$FTcw!HdiA{v{COxn+ABx6s;mvI`J^RZQ+hb^cvggaXDH8wfpK zOg{L{wU{PPi%uJ>R?HrG~rv;_Qm4#2*fB1xvL7!v&5B`O2^t1l^SsfPUc>$*wjfG{Iq`Z6jO*<1ekuD4I}S)GC;tX*b&&3=6yawyaz zsSGc`yg(lFc^gw3r_%d_6Qi<2LRl5&<{xGKLjv-QD2wL}PBT&2NJ%M6bAEc|N}2Y3 zU>S!a7hwallMm?!mW$pn^`Of#(}gCtXW6PLFMA!;vJ0KPkJpI)P?Jq6Gk_c?+0#Rn!f{ zGFt#$)(g}Wn3#A1^vZU?oRN6!Zw@B1%eg}LK{*EADgU&hG{!&V$c1a(0%}Uz{Tn=9yqY>7b@gNsqmnJmEZ#vQNKUe>OkI^>nkTgIWb`BbU` z^#lQU+j)XPlAX{qrt@W#sw_46vhcbOx4)xGZu8wP&3yd2xFqcNrQ?bjUfjw;`s*3@V6rLcmn4lmK^{w zp#J5`)j+aMll$P}s)<;(ikY<61`~P|u!D39U z-+F0`#@Q1=7DFBxz6nyC${TC!q#e(v-4||}5&sHwQ%VloxQ%bsdpxm7o+h0-Q|8SgR z1SP$~wAG=!`!;w3+>Ch2uvsZ%BWZJ9IZoGli!4F51{L5gHIH3tDq{=ChkJM@%|hVk z`r4JiedR@$_>NNo;Cl=lLHVrq4eAY|7_nA#xS19XvVkaMJalXi-i7xTI0IVcX2qqq$LG%xaJkp{G#a z;JT3mjkDu3KB@=C*wMPk7$dqbebLYT6jyW4E?vV&AfLMDNU{5S#nS$voTh zO>ItX5m~Ugj;_NFOlp!hhZ9e<=LqWbTFyModGwiHERyN*9q#4+t~FVQ`4+C z5%o6_rL}*{O?$0Z8zCRF8s6!dgMg8TLH1?)_x<6(Q-I8nV|fIpB*B|30XG4-XP^H9N` zm-}ATT3H?p4(j%ok9jJ3|6s*pJB)}$okO>m0%3hNU>%1e_J}K0cf?4VFOGF0bs6a?tH%Xi0Lr(U4<`ImaI`s%-)u#|Z635N*7@soAXwKS- zw5M0VaehahB$e(yf~bS#LmI4tbG$gm%X~!mMOnkv(M#?1HF)dCT`S?6b5rh;&%Ir4 zglu{rUTJ-GlG z$B}Ip>8tY25E)N9)A**3gTDn65k!3@xrRBei}AX~yN)2T=CGV7nZ-tle9{J~-hITW za0ru5t{ zgbH1}b9LVKyN0jCU~9zXalcG#)g8Cg>F61(ljO&*yFT{uiKsZHZzr37sL}hZ_AAy* 
z7|j{#Z`j6mCtvZ>se1-W-(i?!yfrRljVfDvp6c|VSo}uML+N3Zsr!)sd+Q`@DXaAS zJTj$+8iV`>Y(^U9DLL&;C*fXHOS(;%{5LsAf|%ldY=%$Drlj^N@*Op%Cf)R0N_i?B z*1^PlZ6e#V7m(s(ATVTlZB_r>XZ%0my~qoeks!mebYjFpA8-}ni1!Wk^y5B1e3~b7 z_?4?>`=fvzQBz`SsX2;O^rFG+yV&!sg_F|+z?E5aW@hH$0&4F$&y3T>SXV>z^wrdl z81Dwo(!Zn6kJv|s)l={X*kV-Q2E^kB&Zf%Nw_faucpH?wyfr0*2idOIH(m@Tj0n)u zN$YIxYs3S(%|Ag_hqhrG&S`cZT*vxI+@+yYyieB;c=wFMLO?kbDm7Rz_NM%%ZI<7? zRT2FsZrvoF_k{z8(B(-5nSr0761QaPO%>b0BdFa*Ap8}X%U(0;e%zU5SrMP$%MB}U z_>wI|#tAOM1`$Bq#XryCxlWj~^$Pk6xY_Qe{@+MW1puZ8b1PkH#D|*KSezlKCT`pcTSY|^HxvRZ~oPLk~>B-VG*)n?Saxl>G zN2B=HJq4j9BRXjpMAeTN3H|%V>K}~R1?|HUwi1uNe~Ex|?11B>$*w__11G!W;R_+8uSr zFMmO?|1auffQY9?0`(d6zVi2rS6M&dd?&vKhplBEtUJY{MxQ8GsD%{F`Y-R-BvuH` z>OKDDK49O@2y+k9?B<@z#w5chB2>Lj-weZLU%fj*L3*~+*Fu;}Juo{oWaO!H8??F( zenVgTAxg3uxNZfr9++iLva9|ROY*Mw(sMu5k6cH|ykT1vOJVPsnk8-?BD-F1Yszll=KYPxw8cF!P&eDqrW{(c znXXX7#Pqc(Ocd34Te%c=m1|o_5UP`Je{Y39g6zURHh+r@a@o`bX#t!8X(i8k9Dn#> zL4By#BF}-7O$E{6{JZ{ev|Li*>+lG3N-P)iNRt!_>05AD#_udOOn7*4IJ;wY4W#rp z^tT;An9AQHy8jjL%19BnB;MUrT_dvkFP@83A$9$iVmH_7L3dQ!M#wvW?esis+z4Y(T9NQBBIH9GyeP6 z4)M*L9aYs=+!2gBLsLE=lGsTlAV|Ey_A1)RCA^$FB;2dp_!#7(Jl@9f@as7@LP%dP zA5AHOE@YqS4)yElf<4Q(HzAmJ$`w>77Y&Y8N7j2TeYnZPSZ}jMM0o8}rfL(j{%{b= z&~x=TQ(VV)LL6V@0%rE1{cfH7`J8vS5@3x?M85uesgnP}-*W{r(Ntg98!s-oILk!i z^bFX5{U`6at(}(dJu@~`Klvg*X0_X;<1e}E%zMPA7EJK3Ej#%~E((jx zc8OQGwFyjjWDq5snYCgZq8x;a|f9Vn3$jy6Pd1|NZsO&nDlF^-3PW?)e=3QBm`|^&< z0qg9?mc;9*_8GS7|lZ7Wrsp zr7RKIB(bRkf~ab2Nvi!pe3jVczVaP7HQXHkr>U^F5|->0oApTH@NxE|wR6{%=pEy$ zN9ISBNWItzZ#mEhVyqrwt9)iM5QzQnh3PPDaP61@euKEls<_H%@fW(++UJ)}hgo`G zNMWC@U7ao31+ z9xHv%p2HF?hUU=N-U^+Ht1imXgwG&knQjJAST@hj|6mBl`=|Y%nF{=KD5O^?&;dcX?YTTB5A>5633)GTYbugSi7akge?+41%C)2iBen zk!02ihMG)EmI0y+6}(M=hlI}}_;CZlCaH!hQc+EenAlZ8%BRHn6|b>;+Lh6u_}4V} z=jXqj#!|f&)n8gYFyncuU=j3ga&Q@CtdwkSINipTzQv8|f8DT3HEQVGfUuHeG*~~f zuG3T$PC~aHp_j#x2lSAa|8M|>E9pNRSS0NtzIBP+gJX{QK!?yn0_bV1Uod^l0wBDN z9>AVotA@D{ZMHF}Zl<(bvJ(q3eH_!W?1v!T*!hR!Jo*`e?NEjOu(|z-)sT3$s$|cg_7)2Q_iyhjL9z1$y;kBTIG;>yItiy)oz~1gS;KCw=z2qjp3? z4+B>g+@Jr4!~OcuFkb{zCh13Ou_aAqHbKad8WklRQSd^$+qeA10ML9Cv%NRh1~!?WOqLN!~A& z(?Ym?EnGEL>+xoe6+XkJ_nnr^d)GT=9*Ps&AXUT&{-?t`cF(slDmUeJljWVBFH-CW z%C#A=5l3P6E^h!(U#f&q3nzb#p6gP9J46&ZVw5z=)njs@)buM`RXE;|3ohYRhF>W~ z=9vz7huAKkTF5>RUoA1#G|i z9Js60ya_DzW~JDo$&|xS;)(OuJ!xI2W#EiC7p{K6spCRmb~pjzMTgy{csUdLOVvFV zLFhT_HQX2C^Snhjx}izu7Ah3M);xQuC@)nS!UsxjEG;iB1Sl%~MEX~bOnAfVBEMO5 z5)a*iGbzOjnvCx$kgd$A5SA7Tt0E6*75JJ1ifwm_uI4zTZn7vUcxHegxy|cx&;xAG z+Ps>o6%A#IuYt$TPXk7v)R4gt2p-wa(0u8A-22b}5{a<2n?VkmP9T`45|4`==}aBU zunTIQhPMgeG##G&;dtr!Vr937^4$jp$AbcdKCYo17uXs~i!v{rVs6erFld1ks{e%I zw|Pg%lDzc52@di_-+)gn52>W}swc9)=92K~j>DB75Bt5`FiM6ju(PNcJx}%TI*#}s zL1+@;+;)i<`{bE0A_JXoHVjEh!j!L!M0;X?S-zLe$iCkd`gtT`B(`3@OkC=2m&*Zj ziu`4r!ibdxr@|{q`YR3nJaG(uB7Q0R;;3;ZS=x=)y#J z^#LwoK*Fc%<-KQ*><_Jf$UkOjLA@t70ypUAHN(|XV``rjNX(5xw|xwIVxcF^{RE%_1W)EDFDu2zgz@BT^1V4;ER%xaMy%N3m1v5yPK1z? 
z>PHZZTz=j)3{lL6g8-JYgpcZ>BzPeiV!Rs)N3e8TYi24gUzV1MK~LG6vOxi;Q;(u- zO%i8qE64zm=525Je@7FQ$(VPk>r87L;SYzZ(*AGEX$xd-0kvu3^18~+3~kn8I$>r8 zZ%RsnRpc;}>4`_(SpA>6L4l>yi@1SZ%7|y5xrTPE1||`6f?YtriA;d~eD;DG?J+jo zIRm2!@v-bSOz??J9am80T<)P`5s?6#Ftc5vL3#Y$_2mjz=0<9|4Uu#Bdi=rLD5 z+$OF?h;@UNau9lO(YGfXz47fTvjR6-E{|bKwE-**TMvr7hORKjW`dE!>!>)b;s*{A zto@aSnCP4et-L=R50ejo)NNm%(mH%8-Hw6|INTEEopvt~-XsoZeXRV5n-M#oNr*$B zUNiEcT?+5M*McQQi~O;jm63f*jh-BV5ea@Pt`5VYxE-`5i*lDN0 zb9D?ESJMiNXrEHHZmmbfXi3)X7YYiN7qn(eRNpLdj30c%vHL0DCFxdkpVtfdso8MZ z8vrvsSF5mZ=`pqOZ!;JCv&vzd%kG+o@3!=uB1{q5osTOwh*`wyRqMGdHg_Oh<_iH0`e249Vw+6p#8vMbt#xFkm@9U2aDutY>7dW)rNbJqQ-MZ=mq<3Qm zT~I~Vk9Z-60RCRnWSSg$61l%`7*;5ZPNEuln{Tjh2vI%oDJoDTDN>g`3f4*N+Ho9t zdW{KuCS0D%B!!-Z(tP-#aF7l`Zfc zd}NHStHSJ~9;TtJYd63&(1wT!FpQo!S-kMJfKSPijq$emH3^s6=`|3xkL()nQK$%P zo1aja$X2JYk?9^_jOsC8|GHz^EG3Zog}sYZJujp(YF?1FM{5Ay#v>jW*^ezv>yCgf zd-f)OH?h6y{X6};*2%*BcQerh^;^5Q;Z#%BwF*LG%PiyO44h~0c)!j-)2|g#}fClbVgx!YeIPBLVa?gO;1cCpTMBj>_^Q+>< zJPlEfKOBjE-jhQs;Zs0_nU5?WQW{v0p6PjZm06aD+vgW;>zLfa7G$>+36dMKn3~r4 z!9SWF^FJ|$Yz#CE15r4XgML~FL-RUm)R$st02Xor*MdMy&sK zsr^*$Mayit8~nTynb}Qmy>vK1(&h^&<5pe0yT1W_pj?W~f7edo(4g#XLB5CeI|=(h zwSe5!bJuW7fcBMqnV48YYB9TN{!N9U{j;}7r&#+v%Iz^zH&L}}7rPL#n=JLq!?hO4 zZRX}d-5rIwF!)3{z=g}g``Wpcln@5$&;-4hJ0n5U1MV`^hRVdH&rFi|GKFzL%%G&4 z1rwK<58ofodxU8SXKIT{-`hmo41l~9(7`8*OZ4N$x9k54R8@={`dK)NXl&mzRNstz5sy0SeV{81mh_E1$F#t#T6wY}gNJ&C_v@;K67*zpungH02q z&Rsw`1XIHkO#>+ry>O!080~DD)&4z;IxR2KE|W4TwjC;}M@$IelNiIEgU9VLZT34h z#8ciaVq@AxELr!0J|r~=`5SR??BU@2vNJ&)t!!0NFD{H2oRD;-DyX;97mm3ZJp7>s zHt|o-1=g*X^}eaQlNyo7sv_a#{xufSh*nF^Ib9ysZf_~`Yh-rM3-%#u$R+mPiTBxcwdV>J zs2SpZ>o8U+H;d(os;|59T{A!>d-ytfseptM@0w|zVqE;gA&g2(kGf{8*>@2(*}c?I_a=%R4UO-BiG1c6c&k`af_YUI>jW& zf~B`5K7&a`KQ)~xpe9KEnD>0FV!ehht73F(SYm zo8Sd3ZD6wTbR8&OhP?}g{scyZY@H-9`G*67o_bHkuuIgzU?$@p;UH87h_VA~(VNW* zEKyO90ba%?FS%h|;~0*p+kuUC2G17(;Q_caeKrBVTbvx$;nr3A6rzzI6_mXm`AW`> zG5IA-EIHG;{wl4c(q41-jr$?1>eR}{LH~V*5q|JHtj{Q(nzRA+$yI^D%u%y5cE=>| zd3_?BdcsRCBGDC8;gHAr92~#KxlPy>zw<)+mu}?>Br?thzt$WelE|sMkAVkiu3e01 z01EEV9sOei52;;MM_U{`Uu4Vor0Du3Xv&el&!1ShPQ;Hg#vbXJ7vA~Y_p8rn4rbJm zL!h_U@PP&!YmH3Zq>2WL4@pIgn;PgS$7hk-^WOYM=&!JnQ@0u(i5%v-zE?D}CX*g` zw&5oKjbkC6+DQ5aRAw8V1%0b7pj3W%1TLB{?i}$$UGzfhORib7cSFWj-Uk>^FL6u* zy)}+fcHEv<$7R^>)hn1Dgz@DOK9{pnb6)OM1CQMi9X{Y|=)K_>=is_78$sgr%l0}z z6un$?sTgJvwC^FhMLT}x15mA(67$_z#?7{v_<*h{lBe5-n_;XsRHw`)oZJ2y zdU(gft=*E1w%#(FMvzta z;^Mwbsl>_BlVw_@ACX8kohAx(n`p%e+1_aFlEyPh`OXPadVMa}>>*l3c9|_je04%R zA1#=Hj&Ke~w;WoVy0Z3wxCAaqvFAbmshzD290#>eSAHM~F11a<1=NowO0M8^<$e|hs0r}!2IOMy7$(hfzwPs@a@$6WZ#MEG-_kS#jXHHfJb zqnH}A%p`0eGZ~mK)s_pxJ*deVrW8Zf>jLSAHu5gJuM|nM^SYG`1#M7OD*;Gvkmw_9 z6S3c|_ssjpMjS?*?K^&@+u&NxmwXiGKkKpXh zst>eDuxW0P$&a_I6i9Qg6RfxurN%F0soLG1cKbNop1+lr%Wyw%QX~nhwhp0q5*U}5 zBB2Ops_#VTE6LAy*l#(*f-MbNTX~K2&n1^xQjRm~@#2#x8npmp4bf<(!?)ocWh+usg_qz&kt$tah=cD9^}*5vsPnnX>?RzujR@H3Je$Gm)O@e z$g-#7BJthMcJC$!38zgh5^O^p$=NrCEtrnS`)$aQw;os6V0!O|^0GMkez~xY6Niuk z_ln$6&uMZ))YU0)i{4c;kW4&he&a=)KUBnL2j*)3!}BX9uP;DY`yo1;9<)e`Efr%i zp|J1f_)yc0g+Uns!xub-znEc_Ws3RX{Xik-GO~GCJ0MdRVxsi1yp29G0*j?kwd)O& zCbta_`QZD1%v68>`;KbZ-FI9=>`NE?9*o}xxcoIR7&qf4MFloLDlL{fh*q|npfeOH zsxa~eGKjvuUBc6kM6aoUP4`qBV8F7ZMQKR#B+D5(2ZolG!Z@`3P2bZJg>5WLieDYk zlXiW5#mMNmL}dF8{l2EL0;$>u7gf1=Ic**7%X*JgG;u{0a0#QAlXt>Kj71lFnjF(7 zU!AKlSfrlKs^K#;@QsHWQ5(oIixcortIBu%w#HABa)U^_+@~Aa8-A8soSX&jN}#XM z+5KE_UY(25vfE1iaX_hmS{{JrJqXnb{&9gIHGYT{cC@H-`P^0vhj))EpJM%1gP5<> z=SQE0?d4RZjkA^oiVf1Fn>4P)qIpq*RNTbfuM~Ukv8P$LcWpUNKMw83PCR$ki`)?D zn-c^Hq&JDXNGC#;hCu51S+LKGoTy8^o8eF+Rg4_0!aN;>io7=Lqn+{_s(S|P<+2ga 
zPO(!Gx2X?Js&-a*tXpLGt1AOno#XwXEv^ed(4+#MSvCFR8%`c%3Is^ADVlP6aQ>>R zavvJGfBx8gU+jx-)t<`BR089Q3EckG1TOG3`%bO9d;VNQ(+QD}Z_xXBR}S)$#!0Bw zqzbQimD)!#8Tpb6r7!~?4Iu%G775hpLx$0hfgF7&q6e8jTXUxTa(3OMnULs5p_=Go zBZ!D)Omy#rtw^!2D4cV}6+{cX?eG z$^Ka!GRHen6d5>K2_=brK_u##eagBhs~;^CvYX3%_#rjW zhvh_9CU_RxvCI)Kg)SaK<*e?Ag|w8^))i2GUOTQsX=cM)U+UK&685bg?Hl%PlL>jH zZ+0W(Opf@?04i{~CC2KI6mGE1ckm+LN7J{+*ZzST01&I(_>o0)FiNv+7Apb{#r42> z&LmY}#_`od*xyx<0k#oGKVWPSRG>jInX{lC<=}c8FafxD zla$B;Q*`KHJ}6iy@C|99Ds>dtE#|z)9-4r)!R{LlSr#;LVy<3+w!$TWIoQ4xR1;aT zdfobXw|yij;ySjsoL&mqJ1h)4YExv9C0GcOhhEL!N^>`8bB2M=edFp}|9qIzi zFz@mIK#_7sGnG(9zZ_n!>O^dFR3$y+$;9A~_^vbt<|!;(;N2C%Oee3tRU!l?o`_fL zovhL~IsS$<3_94}Tpf1d@n0F&YVKVU)#vby>94WpBmvx`+>iZ*g$ z(p6CRZ2HoIosMY^p_GL%_}zMr`)BD0yxi>Xajo)`=0ySvpM-*%?(RAsPsm~Z=) zsru0d9}j->JIMA=D)w^fO2&RsN%Z8Y1%oJvT=cfL-+dk2+>UmeN?9^2@EFuL)6AAR zd+0XVca!@in$NuME2;}AaM3&6)pbv%&>xNi-QvIqdML=_58v@(sJj=p#_tPatDb4; zk!3OB4QdSoYffXznM#a@bFSA;lC9n;MLJ&?{1L0TpjUVS^VI^W?q&*nOr}jHs27ap z?(R;pZjzsFh2}#0n6E;ZK54QS&N)j0W~|5|%~wdlSU2mpnXYQZunKPivUbB}R>^|= z3^kyKj2Y!-eMOz3!deRBRvA`x=TzdW&E9~J6j`bIQ%fwGf&fNpkpEZZ<+Y`pZ|u76 zG0w))M|?w_%5KIKE<3SmyyEX-VTkWtuX6ILC~C8mSx&I#jqOZNU1AbkfhC`WUk4k< zU)is9Y-lG9#-nnNUFx+6^0v~*MP!5$y^37I#QTd$I+*IiF6kj7HyK#}SAZD^+};Yh z_)K0AqmjNFghSsGPttcVPMuZ}&IZY7CQ46=+SePsRgp;u z#YXdVIjzuQ@^NcnJ_ zs+x_}0NjDjpvB?ET9^$H>l?R~n-6~QJ#&x1`LKUb68|V)-7jOZadL$w; z4WIXbP<`Nphc>{iCy(OKWp2z;Y?P_$sf;_@KZ(O%A>U-HcYt2M%oGor0vVY`6Wanu zk85`SKS4U`n%g>C6U&G0*H{Ib6-ubuzDYbNNw|{*S$=?>zu|;f6gvV=flp!%5NEMG zcekY%hIr$lNvwzfl*BW!LBlaRl$&)J2hbt|C(w2E-5AylfRXMXA?EziDsCXXFMgSQ z#M=ta1e>|+IOg8pW-A212?s9OZU)(W{hg3>K0xORRp`uTZI3SF}cV4YI zMx3UH5n+R~R4rUg3J+KvR-}o{)gVYGSy0rW>Pc8)QP}&>ujN}lEYz2e z;=j3FI}y&wb*x~u<$(u4KIl(flrbXBu&u4sqUHe@u+hnp7e*~g$aHxusU zhcv+7%4dJMms@w(@0MQwnO{5c7bmcX9{WYs5D!Q7t|kGpJ5}Kq<)yMg+-!Oh)Qp5) zN+ycRNfJVFlD#kX>Ka@wt|jZ ztY~Ks%94j$p!F7_WC7u|`7L?gFUkIC@bT>l`Tm8#G%)V$qi(>N50x)il0KLw1o=7g zeW%v2Gmpnl{;epM#=!#~{sLXT^+q+_MmEizW?JcYv>lslP_oGW5nI=-t4}wqWaG8P zj)Rn9So9;5-;)ePHq$jfpk-n#_~hHynRg@j*xk7jVd$ez?3!fhH43#1k0Xod3-DD0 zR47Z?YO-H;9y4Hy^j4g>7^et5*(+~(ry}exP1;Z+D8H+vz2)h-An>E*P@|y|FH^nus zR2k=&APw4qV@osW{VO5=FJ*{-`fnz%jXq&+TMS`hW$^g^2bTyF*J@N2cunn22=DT2 zCVybh<2rPhBCIkMU?%sW8%A#4_PIkzJYE!d_ckP^m80!^!}o|AfvmyhmicuH2!W6>@QR?jyEU1&LfSIm_slHW}Sc7W+C%M3DOq`5rSjHx1mmS2m!2ay(){m3YiSzSR zNN0KFO0TJGti07Rx8J;9x5OI0r#mmsI}i2a9y2;Fy;``2snrv-GE#QToAZ#_56s^3 zWXYSAa3dbx-1J4K_J3-`{!IgyW#)N5R3lgM^F`cReP#CIhUom|r>_l9Of7}+3Q~QS zUWs04!=q2o`q4WR#|)Tw?zNDQwyDL@v$3H~CnbYd?x>qTHTnE|wu3k)^Xlgjo-Vrn zju4Zoyq8HK(j5PR_~QTg&p1!@%|hx|F>!~9Rbco5-}@>MvHQ9>Xh-wUjlten-;`D}(Op-b(2 zdNHG9LPmCN4o~>2d>#!m{5Kz*<56QCzkpqbG#@ey?B*c@pr;rmbyDGwi5nJCWaE6` zs(Or5`p%hK%7(XD(>ze8E{K9beZR-ybfvwY-2<*B3Poiac_=hAvQFhGDt+EF^~693 zxR{;j-juwg;#DEk4DJ3xi*$p4C`u>(};K-F0yk39xPB(iU&02RgVL9gk;mf2%8 zRg@i~eiq+AgXFzbU|?9~0G9`x(K=`&Q2XQqhhb`vn=51V^O?PiP0(HFCUH_9%d?0b z_H471CAgn?#zUcz?b?PP1>r~tG9HXskqQZ1{eK2rlaq)X5@z2JI|}4k5or1mBeK1B zD#HTZc(ZzJvl$HGjgU%toPXONj!~&K<_AU^0HrmS#t_yY1FtF0`y}HPFZ~nHIE6H) zdTmQslnbiYWOrR4=K1Dw+R|WZ()Z`DW`61XWOLtR zE74`mUCGyoqp@U#>Mk3>wyED~Z3Wol=RnJ!1{d)F(W=cVk>ei)dQ&EfR9A%C#Qo~( zv)H28WZ22M9DKd`ZZhmmfo5m9t|et<-`Z5w`P<3VWKrQGgWb8=C(yZKXS- zKE-f+ce2Fg-#EwAgSgk78hh)X1mAkZmJ_OcK-DB&u7&YAjN|bEcVIPh zRjDN|GFJE8wbU538zWnYVG{bT^D?wW`$tjdM!)fBZzn*R^d%+noPnqz)jy14%wx&FTbpR^~6Hphk>X~u0^vW8IF1J0yhUWM&Ms;GJ0d~ zjSgwLz8Ff5vV_cU3RqQKBqTf=JoismJ5mPK4l9W9gK{jKbMw^iRB-Wrw$^WU^N8A$ zqVstd4s_*Sp}h~2IWJ1CM_+=TL;-OL<`c_8642f`ocm(ufK=>DL`Pc8D?<6R=swNe z`p*_Eggyl|XrSix%)Lp;4Trq6srE#I$S)0-&@DyMZ(~R}wVVgC*Axx6 
zI}sfT`1^B4|LORrNN-NX``v=KyQa%vZagP8ExMR?wZ~)o%Rc9cM0iG;VZ$~dP_mpj zOcF28rv2gA@zBWowO+tfTI-fICM>f}_Uv1*IKK_lgReR9Z(7`cbG!UcXn3@a4uh@f zmcv0Uk{>;%G{4>L?ihM1#tq9ZKMm?ab{);I(K%r4U~w04B;9sqB(RtCRw^)?f46$~ z_FdsT{pN5;cVYWx=CC)|d9bP~#R>TLIX*+ghFaNH@A#UlpWkd9b+T9=3Ykq0P2~-@ z_3C)`Ik9nK4rfqE6(^goiXj4WVqoj-a%j3R_nbW5>JLYl;4y`yERL;+g)y3(A3M6n z0QRe90=FB;&MK^5yl;ZD!55R7BaS^Sj+I~TNd0Kc9rOh>k9!09XT#Nyy0-YA*xJyN z4-PZAxk0QMTN4e=0lmBM31|T*+Se!yFsiY}|x`IPwZI(ARmEa5fm7*h23}FuYKj=aDI%2TG~?%7M6`His2x_F3-im9$IllzFMQzLq4Wm9f$&X7M28= z(19O7*U#3P??WFj*DhVZ{#a;gt+&&%9z8B`fx%H(u+fG>cPo2yaDXh7xnSfd=b&we}fN26Ok=WV+C>X3-PM^AcMo+iAH&9%{%4 z*ikRA1@B9%6}5i&ZfhvNUe;--DG(&rMmb%(<5(h8!UP^*s)ZMYQs6&lY%1Fve0^sK zYdclD50Cky?)ELAjy{bus@FyXpw~0r|FENg>P&nL>Rg2f!G^ zr<6sFPp)f?k&f(Q3o!QcoHwohNIIi6aW5nJjerIRu>4%?CMZ9{yd;C#OjH!Z9xHg0 znD~6SL~OwqqKT`JCBgU=;dbxB#T2v@>Oj?GKg1wXZ#wPwU+TE=i6zlFc307X&Yo>f zMv=|=xxuL)@B@0EdG$SZ43m)`DwWBo1X270)6XzcSyga==G_=zSM%fJCx; z6EE7CUAse){OoVC!OEHA?3IvwVzBgkzqcWx+J#XRP;xjm&}{(c;pwi9j%R*YSFV?e zL$y7}11l-JFPVlq5tcJryV@SRsL7e#E6<$lj{fANTTDZ9x(j;r>B2cX2-IKzb*X1d@>Mzwq1NK6`)rKjn^l$A8cG#~F@{m;i5C?^0oXKV7P6WNETu@0cAK>{)e-uWQWtdHP56P zr*JQgPcYzS3@`eZV+ir|H-@53Jx#dfDLW=yxV#;@VYZ$d2(~-uYO2Trl@r3tQZIh2 zSU2iG4N?$~CUJIC7PgXHsA*6RV731tm2?0^7BLxe(o2XLU&OHOZ~#b0IYV@5rI{X9 z8xsz=(wHZB{*ipJ!(1n3Q`jlb!I7OYAbORQIO9$pL~^FVsj*LfE9_$3xL2Th7=UK| zfI@S^m}Oi(%m0KJ6&oUQ=0Visy#iJN(;@(ZlJ&{anIw8L_PI09>kndRo^Bfx3G&hh zCJBrYKw(OFReXFxcPGd*K7RuCydwjggHDOWQC$qp0X%$-`FQGkF!HhQi8b%cc;AfY zQ1hm{c|W>lwQ&251=sPr1$$5vm`~p4@~RPl3^ZH!Fpd;3!4Uy}!_=R^r$g88ldCsb zqUNUD#hSgLO42C`NE0(ppJg>!N48eMV$%eYoE4bnetDho2s$R5QG}!d=qpJLPX5#K zc%x?bxly4@a@p2^Q$#0z-2u}1A66jNX}RkdIT5&mD0VQWoCQ+Id#0eUWxDfmfg;$_ ze$SYDb%=9_oAm6^bM=(*XHy@&R+dv2p))9|K0iyc)jvXtnxs?AI(}ng8jrx2rw>ZXAIcITW>2Juj{j^SCO- zdNzXT#@iBENpzlnBt_F(LG0MM-?)vgccd5DdgZc{&S}6Bp<7@k=1_YVoSJc>Sly>X z|?;5+g2AdD^30z`61^fz+2+r0HylE$zi zFm*{}{04qvtLXi=@|4vOXC1Mok^jC$^#0{l6jv$11EmGnHn<2PlcYaTUqEPLXn}L` zu5&g6=&))OB$86<9FWw$f=73*_j+U!)Uk4wJsK6Dw5YJI<&qbXTudiFlRFHQW(S71 zWo4b_&Td8J^!oa2=A|QZWJw(vUnv~F$L=1FD$J9+N*OpJFP{yQ!dLqlX90DbEQ07H zx_=+&y@E(qp}aeFJy7HOwxd7&U%LAT`tLLje{7R96J8(bQV)jsD(gz|cxFGmGrrG0 z8QyDfwK`fpf~9xbK81VQK1z$T;*#7p>NsHtCU(i&)xSNm^0S&PRZ4-tMHr0 zeV8muM5qLOVqZC^1qP#%XlScHlrVku-o9ubk&=?fk!oA$3egLa>@S#Bn>DX>4Xsdb zz55ZK1A+J#-wwK4dQws^b}orq+ZcGbj32vt!3Tt;=bSVO|tHP zl0rEQWsnL&huo$Vg<;KkO7*yv1Nn*6$g0R37qv?2;tzeLTH(=zGG9`}y+hict1l+< zvYqBU;S0~$??3bb>uzHhJmEEa2QfS{HNadS{9D^x1>H`{D+KhZbo+f-KGB1wf^fn&RYO1M!3~7+?Hj_CsAE_VH_AAx(rQr5W6?tt7 zx&|8lPKRRPczM=!U`%kve(`_XSyue{jtL*w_OW$o*!9W8voR=FWiyO3K8E;!%d@qq zy5XcX)^sOy^=(;>;wPam{bQF^$$D&-oVG>d8Vbth^fE=!DGgfEmocjBe33izku4xNP?h_B@gJRcOXbS zT9=0}7gjOY=s#QFgz9bnEBV%MjW^(753u*bF%#?ueQ96vnEficM>C&_X1{pk)1&6FiNM_Vu z9L%mo8Ut+izvJw}6;R$pG=Ru1!O!No8y(u@oa{MW^$Z4a)(c$iZ)76;OUSSc8wG5r zB(U_wa)b=Hwz@#~QC=>LEqZ%^u_z`8Qrh=VpQfC!Tj@tfinw~3Ks4Eqj+mC^QF=(Yv53zQkx)m^2IGP{tpKcZg%>_5iySB+qkSuFP3FiW$;x`JE(x z<|4_S;-bAXr{gqosJUXV?_XMV1~FAuAqwqXk540hFXs8s<^cC*JtEAb40~j>tR!j# zL2q)~g8w2U$w3E6`=*6O`KUg`siGPu@b4`0wB#n_!41wMn*wp6xNn4w^3=NEu3GD_ ztGF1;6W@bin%llLKAddzADMkvr$r-HBG%$}vbZMSZ`$;@E`Ee4k1-lzEUAqQElz8C zYW!xQ^uvG{th4}$-S}VAy{XnKW=et+{JvM_?|X*``@xAD9`A2nukvy1((WfFIwb%NwMX0)lFOsI){*f? 
z;`Zf5e}Qj&*^r{Pa&G4OdS-0M$b63cxa;sF98)$VcZ`)bf(bDfAKtjgJn#PM)40xk zI(r2>Wad)jyqyQ;(_K>Vr^@D|&n}P0bSPx|^9F{du>$QQ_s&UlEEDxPw@23Cp5hvJ zoPC86{g`*~B&p~aGXzEg7s9m7OZbC07Q zT&Kyr=bkIjdDq8#y*>;r;D|yy*w0m)<5sOw{KSfT+QDEel%P;$#!39O8kogxfe0wY(%%fbwall$Xtb$YOg_&Fitcd4b zQyw@cL6Z|9G@LNk{S(Gtjh-@q5Z-_YIb;L{;_h%V-?9+P@P4tXQ}^wN*W+;-zRLEp zX)gY2X@Qx?N`}IqPKex0$xFW&UJ|j_nAZhZJQ7BO(B^`W``&8tj?Gq{_*ea&V~Oxk zpXW(1M*Vc4kOBnxqxP7NQPU?wfA)oXf)m4d}>yko8a`h~SqQy%Yb`u6GqpB@`PVTJ=5V z26PQZ2=tHS!eFeo-Y=oHjjsIllo7|}=3fZfaoR2rm+*u5n=(cVDFOovzLenQb6yUM zCMbq^fhEWL?hmy0C01B3EcBvs;sgsPCChT14?XXb-0X7A|A92I9twf07SjfvX{n0a zj4HPWH|YNoIy8iOrNxsF;lzR}JHqU-5iu4L2iv)=kJ(3A@Q=N+uR}PFC{M52qbU%EGUKMQ#6( zaPntdRSe&eC8WY2WtP>V()cUEID!?kv7oeLv}j;61tGg|KU2J%7VQkr!-JYKQr-O?BxK_YdO$8 z@pqdM<0IK&au~`oobtjTEvpfNT%Q`A!R)qQXNz?Fx7#cHW8EOxg=q zxz+R@YeJKI{WM#vGy-rQ_X0}TdA$c$F`rl#uXQ>r%W@-lgh_AdG1VYQzBv^{y`Hg0 zX2{*bHV4yK0aY$+aec|3PLtQ{Wx^T#MI2D9!^U$pF?ocVvLiyh1~b}+33vD07D532 z{88cre(T}%xSONp?dE)hK1KaWe^|-#^{@5~-aNROg(CD+R|7@^>;c)MPaC;X8S%p~f zgLj-7mJ;lIck$u4t+6nt`^TnPI?My=URkcmkK`$fW{k*y5X;Z^9xYeQw)vU5MQvj3 zDdaiP6-k?E+PW_(nIfLlY(tF8~8Uc3X37+ZN0}G{&d%9eU3)2c z|E{zZ>io&U0E{MUf@w!}20j#dv& zkb~Yn<-yXsUw2kjCggUIn~359G9UtL+P0k)bQah}Tq=k~@pCG`PP>yRQ}(I6&B4jm zu;0WgI^8&MyT87W6Eiu7p)7CQdF4oP*-P#3guytiC}`hm^yF(X2}*5~xRJ>>9f&aL zqvd{-^vDviZIakNi(%qerkO4d-y>S}d6un4qVUkHoaHkpcOtH%HvC-cyzz2Lw*B{W zPwplP*bDe+C(7YCg|QCAlW#B!s1ZP=D&3gup06y?P!8T2JX+}54pTX1lljROuUvK5 zms&CpSBj{{i1OTq)a$B`D|E!(?lhOaMVgNKt?I z31|f)IU_p+K%ek;0H?|ktl)~sFG)_>@RavkB@*AYT_$Fp^%I+u(^JDF7F%YdIK?=Z z6O(KdI^OjNhh98pX)d^Xa#T)Ci&cOz!jT8I_f{0Yr*?(xL;N@%K%Mc{6&+ABB!4^08sj7oo0#^qa=A9le}_yN#f6)l$^01q^3gg_3?-4rGNUge zQQ;R@KS*^*Dx|k_d)MARbi>qsV`uhC_usVQlb>t!w{sSfn1tVDe9*TWpaHKr8|`4Q$Z%@C~})>PY6yA1S0+`fy40bIEe53*60@tM%>X5ycxPdf4<1B!APOIfit)n;X*B% zu`dz0p=%}iN)^z;!NkR7C`)6f{8?m9)Lmpo8>5s=a-Kn})$k1lm{jF_evC4~>)Oav zXdDTjJim`j0wDjQh*CA_)@MQhZa45+P<+MEviwpsGejEDiD4gzf+z#KI#Cd zh}Az$1S-)QT@BDPKBAtp7TjfpSlVwNcsYf{^P(;B@OHqN`4lxFc<#s#A=VK>R>|ou zTd36KuYCFTm4uNlI_Ax9N~a+>z%r#Xg&!fzv(W>!KJ6+PK5P>%sB0p6<^?o7G;J@4 z2LOMPsh*2n z*L|x&so-sq16ws$EO!h%DzxJJ27lnT`8jm`-a92_a|~R?O?5Ke-ZyD%mRRvKvb7Xl zL&}9!AvP%07Ke1qbtN4+8cpy~mj{vPBmNl$Yof2IsgK4B?;j;)NVjKfS8DjdoTedw z<&e=mM(uCDeDt~r-=}Mf^}+g6$W3Z652TI%_-ySNHmr`2OAg~*h`4lXmp86vLF`4% z$L)UK9u^3MB)Bo`Mk6`&?HkwTMqgb6HPS{}9K0CoHv9(quqURGgce|YhKF;aT&yuZ ze&`NYM&AXBRmV+@zW(~w#tGJ&=cw<#D=K6+Gz0q>vDwU6&GHO;d_H5tl4^I8QRGEd zUefVL_MffhuGd}poSg>NVYgG2)BJols-ku3_@;@l`L;Vf<-7A&UwP_H{yIR_=1a&i z7G)jYAU8!@I8ziFf$~PwK>ch}YEdWtRZ3?&C?HdZEypANTcy_}weqU-bL#G`ud4m> zkaJPf!X@*IM*kgGc~WlZoNCFQ3Ax}x-loe$ZK`ETHoA2Tap(;Q4)pIt4j%(qKBZx` zLf0E+bt(G&@#wgNV0zdD{poD4_?y%HkMC=7#B{647>l{QR7AAXRgvhe1xxl%Og{63 zw0IS)lxnHCL$o7V!|zUyQ!bUnE!f>#B<~IKoHQ@^|G8rha7n!l92 zy@Fy%LIw=>8lcF~O@(@OuMU%Un#!Ti=7d)6lMh>R9Y2QgxO4*kZ5~6+Yj|_9%*0cx z(y*lvEbEDPU`so{0yKMUhjKLE%t;Iz%82tpo>-I==z@;&X8scTS`vItf1$FZmp?wm z7Mn2Nf8D{-@<|2w<0lh;0*^I+jfVZt=l?_>`!ABS|1)0yzaK9dFBt7`+t&F^rgtdH zT$&rk@^52VDqz}NB3dUktM03>l1=Q831B@>{CQE^^P9sxg`9UaqAPZ;n_kD14>x_u zXs5fbbSxQ(G5)#ZN&JprP@ zw)AV0s>O(3cj6SwAw=O}Y8(1l!7m|AoOTO$!b>pF@FX|DK*uBO73Lves(^_d+ zhVQK-v9V+JTRu7q70lr6lLMU&cD8>~@iF2{47$vGM;e0O23O!vD(ug})K&Cr0xmSlU}s@u;(+MLDDGPYX%4|$FQMIH8=)>G5`TjtE5BSy8`KgLnii0AvB0g_GmSxf17m7;n;tw{jhMCv zs*I3gL}D=iEHsB7yMUewUex~yE_?Mq!cU8`&!5|~ddOx4&!>gCEsYPq^DQ2(Eqd!3 zT)r|V*4EnF)imge`_4l{mcs3uizI)TnIABiZplam_>QmLiblu^Rg;euTx6sAh!VfnJzcE+XVr z_&6kHE8`TjN?=PoyuyAmJAH+KO|0z^#0ZpF`(Pmhn@WuaeO`)ZUiv0tP<)8%b?TQ8 zReEK?5a#RG?@i8{ua?%?zuI?Tx@wKpxpngGJC?E=3xZYcn`*N%}Q#cPi)>|#i==P?OTHkudduP>;~fhI$g26@OxD|;t) z`1Ezk(t!NrD=PEBFCpk`zH8n=nyLNNxzzTyoN``nuZr-> 
[... base85-encoded GIT binary patch data omitted (binary JPEG payload for the new asset/fig_teaser_new.jpg teaser image; not human-readable) ...]
zA72#0uI64nS*w$W{~D}xtQ$(&RdH&!M$_aSXf(%i`rgTXcx%-D4qPV=7FfP)SVl(Q?k)#3Z_VoCMjEMMDwKZtWuGRuMI9?@42Tg?g{ZJ{H*o0659q zKv=ETdY}3?TrJh&_Snv90h#5--5JyAaTUKG%z%fh7~I%!^}}YefJBp+xMbhHrwc(* z9&U|^iy-s}ap3^;Cm?!#6innF{p|E^S4%|!lVkco9vhKZVW*iC4Qy6V5Q)KKzm>7dMF6n1n0B`v5ByW0E$|bbkt119`5;anWVx z3B++}i3q2A0t*U&a`}s~cN)heA{B&Rq;TO_X_mEXW%yrOj9s5HtBsZT`R>|bJpUPZ z$o~>zW+g^LZ04hb(3`uzDIP4LFx-2uT<7N9qg83O#ARL_Zj-s}0Zm4V1pt;+gCyTp z1xCPtW*Q~1VsAa5q}TQDQmc>S@9>?3s8i*~AL|*6e|=z_-My&vzQlAw7Mc6$8a${D zrPfy#Nz*DXb-x7A3YMs0Wyya5oe`nvsk7D5=Tv?f7ICQp7&Pj$8r0R64dgEmN-P?Q z6~v?`s@{5hpUYaPh~hbB!LSV!A^V{PcCJQQK>6!PjaVf5VN{BJ6cD5dD>EKKt3Bhi zbEb*xH<4#tKXqz&r!J9lmOE$Kjd5x`~>Xm+S1Muq&clZAGWhiVCdc_9KIjMfQmEymlO(Ox3;)#E>3S_4!Y9 zTQnTUv&Q{t!gXhCb>gRj8&f#3+Tp&oZg?o8tqgbko7sp5>Vk;+Wd^jtrTR4NhYj1g zV`bg>mZO6DsNgtOflVnt`qPj6l%(NdwnvW0^wGAgRfFOuV~y8h?uGN($)C!m&>!0l zVu1K*5uT=24ju+HO_yjlI`Bf)VkXyhfoS(7)n?R!1-GGwQYj;x$NG2iYX^#&fOq?t z&_~zGy}Q|ZbCh331*$-vd)A|f^1i&o&m?G$Q|6l z!5dkadVVvF-0`~^3nd?^UbDJMmb|(AgUn0Qrr8#D%$=I)>8bYeAr>1uLSnrT5YWJy zu!Sx|yxa5@z}XvC6on|+X47pyQ|$Gczi!#<^%i@F8Pb^7_T}Z6oL6i2S5zUwa+|9o z%@e=eDAf^e=1p@oD{IW5!A)R5a+?+tZFk6B1C-Im=}8jJD8@CNifM>W1ruV#>(njM z;~0H)(~^g|x_~F}CZK(0KDz}(fqz;o{7--GQyI9pTgG#9?;iS~L*U5p9j1hEN%QC& zinJSpIReth?~YEva2y60$mNXvYh&y0&#l%2WTg*|olmBB(lc&*%-J=2U1i`Frhr{N z7}Ss*;_7qT^1x_uT`QBh>}W1#6^I$Sl>dc_@gMR1gU-SXqvtIC@;NWQg4(7-$n^CD z#2oQA1@mNU`05QAiDd;zeDF}Zp)vt~y}_1#^%soc<#JwlhW31td4a&s z6;wtUa#U~5k35XF7nAe}wGnSxxnxMYi!$s*TW(&zUgG(=`CQLBU%Cc6bmlb&;2R66 zNYM}ZFiPy%2xEr9CmkhXl#c!4#hCTlIV~4}` zq7>#%HH9{s;4~xCIL+9_Hk1-kDRb?8m)Jnr=6BaSMjO-sx6uM+FVSIkTX@I^%!b6} zqSShul+zEKD&L<>Cxw>3{mb5@O(8wT`V*Fa7^8y|luR{PVyDME>pc%QV{am#+3)cE zQ!KFe0pddKEVDanEPSOV)VkxN17%wsywSv@!N!R#BbV;mTQ(MsH@RsPpr??P5+><|wfZD* zG7fLmrNwEqwGR(>YTFS6&YrdCPN35HV=~N??4dN^tC2SKLbmym&z+Sq;YSCw-{nK( z0!H50?*z{UH#xv&k)c=mwudrG<3E&DwX(lJzqeWe@O>AM81;o(+ZiN?K)mJ| z_2!rdvT_c;IN|qSDN={+N{QAPMj_9Q9655&aXqy&Hg4EL>mS>}Z|Rt~;FM-6 z!|J)sI&R>k$r0Veyph3;h83XcB!dYUNtwC>V>5{VCjKN(3ZE}7p6n?SpWzC;?)+bG z{>%b_;r_vPAe2H*6G#7T$nCy=JqJBqt2LYj*}chXuN%z1e4KC- zm6y>i%a3M=`n6?mi@Nl7iqS7sjZN*o_^XspW@uRVEG2JgP8-pbwdQF2vuXZa??XPUR(iIMeXo14COqs;11Aw zp2)K3Iv&V93=hq*%#w9Bu(#hg`y~hvU{=k~E+ZqD9|W0yD1 zI*Htmy|WN;C&-K$q^FaoOY_+FyK%~V_)zHR^|AY_dW0~t?MHj&2Px3 zy@^B&D&F!6LoOfJjwvi&4;7eT?}Ibc?r@CmPXYvz21$=jstrg44)dn=4i0obP6@}v zGmsdj|09uO16OIeCd>bh#Vnak%0G57wzoghb|Cf(U(y;}-B*L4F&7==9jBDj=y+q5 z=d5{aR_A;7GZ@SHTx63#($3m(5qe<5&E&T0rq5rXFUm)qr!Ke{t4IeT;)|)a=3Q&=N|s zB9H={>RUOBnXaHp+|4-oDCRx^!Dp7_pW*RNKav*m8ouzFrRV!CiWGnG`npx?4`3{A zFhKILJ#T$2=DFS}aP>P%95%S#yQD`R45mYevf*i)(4#6-{YGxuAs;kcyMownE*yI8 z#hNqv1X3W%l-(_g*zC~Gw$tC{m%O$(=+Hj$K4P#*vdvEfGn|Eho--RUvLt@Re@qdb{*%zJ~t?f+30)a~WS@aMRl% z9SFPh4i2XRqpl8Kx)k;H@hiXwz%=Hd zR$07~3WNCX<|^1a^jyEN4~guQ58+KB4Wh<7NjSnW_BydCDY7S`R>*E(1@o|fLh0PX z*zLtApa2n#ex%CSJM*k|Vq3!Lpw}F|><~sNDP9VSg;;M&*fsc-!)Y3@nl&GcPT=Z>)?@^LSV?I`991iQ8)go%Jm5nM`-b@f~XwblsxT*%fS8S zkwU*X%}(Yv^3}qYB|S`6LcDxf;etH05to(mi|z~V-9LpEkuKnQjK$};q^{rGwD)lH0234`qeHnQ)Rf`d0LJ=m=P*a4I;9@m;d*>jr*Do6nCL_3#N9j$L_s z3U?BJZP8tk>%i)Goqh_m>akdB8^1ANQvezo}_5wK#)co(br?34`e;4bWH8|HU zb>z9`sAjCTR$C2S&oKbPsoX(NKYG;OR2J2Ax+vE2a+3aFkwtax&C8GE>n)=*?-=$= z@1Qq6q?M+1uT zz#_S)TsVO_;Z!%HK%=#SK2CKfXjAoKhM!25*q;D>#lCQ*B4^38jNes+ zMnjE?USGAwVN9XCJ0c)KxW(NjBW^fS&P=$=>dGN9W)EP)cp{We!*k%j4uPUbW)^4{ zxC3nqbt_co4CC3{>)I3y#|Q#!b`@~QB`Wqhez0mO&-Kbh1U*skfHiy!J0QR%#yZVa ze-HxnMWt`Ky+%(U50e)GPLVNO{p7`7r0}O}mp)7&-X&c@0Nqg8U(xj3G`WkZV+tl- z+G6y3d;{YG1U1xvT?-BryoY36!5Gt2If}nTolX08Oeg=5AMg|EeznAT#hka!n|~X$ z{Q3uG*|=p3U6@TWORe*m>uKb3nRU;raKLygcY#aI-ZpHPJoLj*RVod7P_^gH+V#g@ 
zp1)!ZUk+^1E6~(W-}(roexLqy$2fs87`krCJQ~`z7;w$A72n$8dFgyVV>0?HRgx`C zQC$sEl9VTPS|=93d0gX4E2|X=m5zSkxP-n&{hf5X{{IS>8Csp}0bV)47 zDVx6h7jW7U^BT!6!9i5PQ3JgPXN1ton{nNKIxzhNE3Gs7dFcCa!xmqI+^oPRs7K!R zLM-skU6%D{H6s6shR!Rg^ z=3f_G2$ zxqANrX4i40bymQQxo|>oV`o=B)%;|-g=_p5Z0Mh zFE3zfVHI6G!Q>4`#0C3?yw&S$>C@kqtU*tKKR6Y$41DDOnWg=|l!X5gt%|?i&tGn} zf9-*P-vee@{z9YKnvUKY{0KJajHa_&a1CBGN8JlUPw|yR{rMbuM7sYPRXP&A_H4?Qe=P=|d{mcQp1@f;1DjIFjV|IkB6C%agjg^p zs-R_|GoQVY$=J%~BgNZ9@zQ0>UTr4;a7KOHwCBo^w|mIHZ_zmQ~2Us`n63X`5sxsdBN&;a zoTy}1{6JtrCjP7zqr>`}2h@8VnecuzSQKb&puGty+l}S-(tk%96S}xJBNpEVhrn|N)L@F%} z{46xOFcOTl)Dej>DY>ExarZm_q`JN|eTCb%y0FrVE%FPMgy=684xD_C#USPfzsPZ! z8}TdF^Nx~#gVx=o&7V7#gO@h+p}LLrdN_~hTFd#GVz0SwV?X8i*pSVTv>vmVLKXxC zB+0WK6|DESudB!?WZ4E;G-$pN3L4YUytwm$BUo6hb5A57SH)*}$)bVcIXH49wRSCm zH+gjV#3~A4PWhOad>;RpzcFXKoIE5DNooBjbnpioHJ^mP98W%h*o-%4?@gO@@g#`C z7j&Y`Rpk>@g)mpg{-k~s=p}9U^Rdka)ASvqCf#N()sb|V+ z+-9!$mVc=-q+DO`RnwUT)uE?)8(yz(iLJ469|_jI6yw+Ts9Lvpv6EK^q)vrte`te{ z1cqiz%na~Dw~cIKX#0%9dJ3We9C15B_csOV6M9d|Xn#h&hjI&0z@S!OgSH?!ep6U> zyWuq$uaf)R=Z`J^Kla``uE}m&7Y(8!HY^}jqJSbqupoj8L{yrE2m(=BL=;4%iAW0x zML@cMfE0;>2+|_G6MB(eLy_KlLJ5Hs&&0L9Rlar3+UxB5yLa!k@At<{NJ#P~@0{iH@$95vFvEQyqai;x|&>QR4)*AZ}Wgg;w)gC&JK(3l6K z+3n2+`S-DxkEBNn+5lqTuxN(P8om|fTY3TE&4hHL@4XexeVvvsx0oL~{>C`}hoV9m zakO)#$@W5o%>KvWI+F9XCn5vYL<>&l+vJ=s(5MXf`sDkYBp?cpxBy#4QFN#hfPO{x(wa8#FSLfrm~S5>37HrIRW!evIAuIM341=mSI5( znHSm82dZA;iFISD!H;~d2|>87mO)-ZoRF{f2QVh>A5ISq8Uy!zHAodi7{53N8|IXB zsw1~=@wP#+Y#N6=7L@`2stL&q!I}r(EvtM^Gc}Kc8a^rWgsifez8fp9CX@ zLGW{#aS3m{2Uadva|G$W#bVg+F@pU)RTA|PmperFKm*c{0Uo}}ld`f{43z$O#~*=y zfm`d%_&psX3~Z(8#WU3BR7-&P99w2L z3RgWD`1Jng;BQZ!$R6`}tGLERYeBhtB;f}OSFPm{*IBlTXM2L1TqY{ygPhLiVFl# zzoJl9VN`4O(BY>MgV1iFmDIIevAjq|1W%p(n^_KH50syw0?9Idn1lP7;o(lE+4B?P zcz<>wqe2LmCnPTF&st_jz?*()ne|hoc&K_w zI0Hj0O`*T2nPq{Q`|BwLIiY_kpl7XM!sy#brqw5$ zH#%pcdh}1si=XcqW)F2c2EAMP@wpA^1NH!OJNgFIdKvL`W^(? 
zE|%##m?{mThyI#%TX>1OQ~uKi!^GhaT(pCuz^bBl7n`&rzCI1^8>kPlJpB{WHeNrg ziUd!|m7P^!s`_Xms*gJZkEcXC0OFuoT{aDL>OAGr-B1QFs;VsAQSdTF?`aaw5G2zow zVV5M@YCxi9`!5W}DY0_Me-KVJ6UTe6v7QKg(T(H{=`FGT?EZTFl-4FOtWEvA*}#4< zps8?ntQp5H^i79VXVQf`cHfQ+B6nU2Vw?;0$5D3en<)xKDG`U0uhh(mJlB5bzDpJ~ zHsxcSXGAN=!2>0b)zVp9^AEKk~Rvxq-TuC`h)X%@~0JG(F5UA zMBt9S`mmK~Wji-!CWqf?zWftnivW^#cTnEiG+VMD0yZ+`dxAL`=7c>ZNI00%Tx%J0 zBQ5^F~;&Amcss zt=2p*kNy_5v(*me0@Lhg8;?&<29Ec!JE*9zO$6;Jj+(bAzB%g$mf{I~lkc((j(2>N z2})65?T?9Fgn<6ww7EzNmQzes^Lz+*&}PWbAFS<|eO&QfH+p~G({ zT_dOx9##H?a7q@!XTc_rtx1qEU&T^R!^t=1#+Y%!KOr{~?2HtoQrO8j6EL}Rf?Hnr zFIpvX9J+8E7+pM&8d&QFhYBR1V9Oko;IT1NAD|!l8M=w!vKa$*91I5+V81S z{OQ)DnN@DwrkXS+4>^vT2VG!J5WtsM9K_C?&e`lW{IX|Q{x)~n*8cM+894EriGf_4 zsXSxc9p?PB?}5OomSTa=CX0^wR7)SWs%3=mn(wo<+C)oA@vB`fH$G>3kbU$q(o$ie zkDGPw;NBZvCTmXx&`~K4$*Uoo{!2n$mwKV99n){N1tq+AsbP57zLOMP`bkGZLegsS zQ%0uGm;8LYK-tzOAAXEz?H>2gw$G|Y zSH_=t>WBXHdG>|ueusJg?=M1%Z_m$r7rN7Ihj;}Jq7;BA&Ps&TbMh1{x1tccU9V)w ze|69Oa*%Dwxyq)gs|`V(LtSMymL86%h8|fh@C~(yTO_<2bA2>=ZtXl5oh4CSCgge| zf4%TpI#^dP*RTEF9sGBGTyk%OUY=?Rx^cJ>mOceB!i&?3mLU4iOAB)3Px#A>dP8Z_ zTK3X$jIpkyINzRs6r%^seBnsBy2nAZ?<$k9smq(!N}ULPb*1^otsaCdtIojTRc6|GLx_?or(tc#^@bK0q9$wR##c=WZVI|Fnro1p$oQIfY0nmE76_*z2V@bbhwxzQiUb#9Z8YReIL4EEU4GK-0-;B8T1+0 zQ|_S947SMUYCwJMu{(iu1cH54ii`V(_e#{(eKz%qO$zT7A6n{ zP-%p=_TA*OY>@q~nr((TdoPGd-iJX?1U8zFnzk4svE2w3zzedb*@cEC5O3V z@y-Ldhtzf6`i2Jn^{k(eD*|klLEgI1N(4u`cf1FzV93ULm|ZlqbL=Clr(tF1cr4Qw z)|0q{GoI@cIcKgpb>L>;!Xof<9=t{R(KV>h5roehVh3xYxp`-CkuT5-F(p&e7&sOJOfFi9UwU!aiPt43(zXk#}wiAC1wF z&-r8&bWugP(T;L;_tYnmz?u?xB{nm{^@88$qV9A{yO5X-&4ZjGv}APc3_-8}|3$(= z;<-g!R;H)K!pPlVAc}KCqilD)E8EAk%pXzKQD2dX*Mr_rYr>wL;qWS&99T#Uegrne z&+v=jI&%GO2#23a4EPke{ms~3z_VXHMvQ$%*oEPllWa#(pVbwuJsmxev3^n_QF?k+ zOFiZZ85XTk)%j&f{f^TWK^RcWULsjxBf(0aonn?++H5%Hrw84nDw1lIY$UwBQau6# zPG5VZ;AK#S;hJ96(WubWO*&I>R9Z@3D=a|=G5HhH7|8P|&&SB^h-|Bo_SM~NA6XOB zl(<^0FM`>GaK7Gl)^TUJFJyVZd^oY9<$%8YVB$!>?S+`EYlc5E))a1?M0{P+W%5vs zhoL*0fOOe(IK_6@O>%FDtCO3E=X=(9^P;$n`%Y#nr&2bLhBVq7^3Jk5(a^UzWDQJd zM$NB-fCD|6V;*g#-}_Sk*}>~OjHc#PD?ZLTo0$A19)Y!ea}CS>-~KWYTlL z$Bd|z^UCmnFlCZLa_wMgQ(S=!kGMAGf>Y!xTb~NT5A}vTL_R&Ii(q=jFG?%bCw|W~ z@pE{G@+`);5xl zUOa!Z)!!I|Au`5w3P;h0NQg?;2wj(gk}gPjaek$$6tMwn$JS~paLAlE9arL{6~Cil z$78e?j9aI8;AW=9h9|n0oHZ-}KllZ4z~Pl@s$CWTlk<<2Ubv!*zPn$?U+uXhAm{;A zYrFTRvLe;-WVUpf372)3wytCo^c@jo1QaCAR^Fp3+Z&}^N%fq*F3V|Uo1-P7Ww^{d zzZ~c60&-Tm-^0&G0*;=@f9tgGbFzz?2=*1>#womH>~*D^i}Rblbo+pqOOM{6UJ{zJ zuBf%YT-XC(iPiT<`kI@z+bDnFLT=DP>*Zq2m3?8oH~WM~fl}t+2*8XFY&inN8)2Yy z@3QQlOZUj8RoLF!L#R6`bh$#`UXv41HN#;_qM;MAY^-hfA7Y((SQm6*(T86rK6ue2 z82ccaSl$x*I?aWsjCnRQlu4-OZ}_xtw(ov*?2{1dF~N$nRWwtmr%M=l*FcIGEg zLL7&l{>^Lo^F0gMv8+?!5L{)y0yLHnDQ0{oC5ed7a~CrPAN-K)6$-Y8aU>?qa@=*~B1Kd!^;#j@ zV_l3#=8h^igkMz0whIdeXN+FG!s@lKB51xe3k&6TOMIYfKIycf?)awsR+L{yG6)^d zFE+0r3PPHh>=NlhvJMlPfr+gt4?WhByJ*KpvD>KEYpmbMArcgmcJLtWs(j(S!Paxs zgDAK;Ars+-|7`YBUU}t~L>v2qG1SNB@X5(i%Th&cG^nI<+f>*SWSymh=jL9=uh2PK zcEjdV@I5QK%2pW;A%oiUkMkT_#=M-K-?6`6)Csa(dvAtMV+eEnP-U(q z3>WpuFnq@jtNl+~^==cEdm|d7nM{Q#I!m=(Tb=o>F*89Osi3oOQHAP=MN}ZcjRXZu zc#|Qh1EHa~%a~c#fn#edxHonFCnSSoACUg!5`0Y@_zB@q;BrtINs9>nq-RwWY$N?6 z-)x*RK!-8E4?B&o7zQi3>^yurK}9Q_hQm$L@A%d@XT1t~vboO8275&b>(t*mUJ!qh z`8eDMm05$sqwb_R9GR7ubgXxjILS6Y5hN0S>f~8$QFhSn{jj<3Fx+8J&-=?w`m5*X z+e;6jS5H{g<6^o|nW!qW#Uoex2Ne3Cqc$ReS1ETrL}yjHO9@mxl5-_$PwEg&F1ks5 z;#!O2F_xbFuj2e)-(GQ5ldmtUP1De)KFXQx{0Zq}$p9_Z(yoe&+~j;mDEHRp7Jq0_4kR)EB_Qu#1^oWYU-wsr zj(hu0$T+VV5I!{q(UcOJhPu`x+0Spc8o)LU%VRH+D{gtL2PX6PBpvcn(4Z8Vqq4<= zd6JCxMB^9koZKD-m6rnwrQLC*6WdM4MFrOIwFcKI^8n4arMO`FYT+w26+L52X3bic zclr0Y;@i)V8m~;347h-BCOh4d8bn6vkU!w6?!373WPUUwyhqMPS!i#^>F67(9+MkV 
zwMM9TuCQn7hism7M1^cYaG9rncKwAR#Cip3QK&rv!i)3R>669hODgqb+m_&SpdLt^ zw7H4)aSmCNE3O<>1m-)tMGnmD)!*EXKkongOD6p>M}N%GA8X)GYk=$Yq4y^PpUm;^ zgt|2;DH)`@K^0SW-fgH877EXL=~&v9pnI$HeJ3=D2cBkzoLUoZ5EZO#s~*q?M$gUj zw?aB_4}~%N3bB-+`HnHjPQxD$no9=_#!nE)KZq;qsJ5vE2_r^4iLJJq+T0{KYege@ zp%~$Wn9J)g@)o5h9eb*mgx8Fj?9oYR!hx4x24yFr<%Uoo8-&Tt|&(>5v7QDsIcio)Q>&9y(OfafuoqSHb zf(A=RC|wBzAENmyE(5yqu^@rrLw8A4`9#&NUXs+?@n%#ug0_5VM9V3*WG8~Nz4VP4 z>V~aOH<3HypswzXjuxryffqEJ=WAKC_N$40xO+uqz~ zl_7}hVgiR`HQut_Cva2{e#d*s7=>WR-&Kw?<2>rVgOq6nDX1lZ~N^qz}dHe@?OH- zAnhwU;G1|ipY@YS_T8Y${i^F+Bo}A=f>`QN<1v{x0vD$j_+VOsvA1ug!k0>Fip7id zZ{ROF7hTLyW_7u6AfEho?dOChqLsf9?o%Vo#T>3mFgxLrJ zhgvr#U*CX1b(SVE!&yP4F{yI>6m~$}+DS^$FdzEB=-LpKV!qZ+078J~%J;~=)BEF^JE7_A z=0SWVm0M%d36RLLK<5*RHG$bo?HKv0R?#5Rio;%HKY;m+oSVQ%p%2JX3r3`VLNs=% zOqx?nO5`{yS`e(HZ(E+@f&w=Kkj}>WN6u_6I09McY81|Hll>ptOx6K&aWHmEs0+;J zqy|>S8PWLP zKQ&s5KD9NLXTfi(Y*cv12qlbX-UnioX+)+UzRQ=M`1@zptWhkp!==Dyho#PK*(uO1 zg^^1bKR1Hf9I*p@hFa7ou76;PfUX3l3coQ$1jyA#l`w0>;g7Sh_4E<}j(e?TcHpCm zTCVt#k%Vh#~DmZIS@ z9CV;pFoMtS2X_6(X?WW0uF*Crq@T*j0?5&_4|`%hW9noU97%8c^KKngo;CF~FDo+U53vsSQAfAfUGf5G6Z>zrO}%2nd8vjZ#}G@PGi*BfbJM z$T?2ZjziVuI8O`^%>H&~iRtq!a=t#ck&Xl>x?Ju6zQd}gacaaeej%j=q{1cUR!$Ob z1IF6npO7O2Ey1X&p-qI9!F{;@Rwp^)1yFTbyH$JopH}{~*<7j|+D47GA+GM>pR&YD z6!OB^YDTDhBm^g>4z)Mu!}9cbgQxdo6ggwj$7r7mtjc8Z$);2o&#q8kIAaf7EvWcJlcPfCRbMe#>M8h0yEHaE(g1A+bL<;Vy-5A`y_&YtD*|VE z*)jz97~SchHS7*jVo-K&I~tj&;=*x-*GuFLZZx2_F>%^)`UNcf3%g5b)E9R6Yr0}w zX()AqCjShzXWLw#+|pMn#Ad8>X)DUaU^(eZ)^ix@H>t zaAIFvJukl>8DmlsP7qMog}#OxsjWJC1!?B*WY^$lKGE{Mds*O!Ff%encGLKiG)-XF z!(EryN}PF~6_bg!J(b#1fkl2G;8JY0hH?!xPz^>YKOspn9x zAU8B0&2*e?m0>EzV0GzLcp%FYt1dMLu9^b%dz7Hz@lyMWpk0J75l;#0jW-GHB`1p` zhHsSGmxH&$6GF}~mT!X3TT5Oyqo3Luz-VPwFvDB`XvQ|hCCYT5Z1u6h4Q(g;0IEtWe-dT=j`|3`G0%1#vpJ7_N2^EpX2&tipuWoY(*YU$_asx2cC_>wKH*!toStWWom*Yxl{eObY!)1Pq@ z`aOA1rJ|24>_mrxS%2%ug38A2LJO6$QEfgsIL0q83gtbJ+<9qbk})2+R(TG36}Bu| z#&^w}d6`7de ziZ2U@6B%xViLkdvzOyV8oE>+$PYlr4*EeWf2Q~cdbuXIdAH3fTag=U4{epICGW?Ru zJ4^(%h*Ckt##?z*P$Kx~WAS72-5XmLeW>kSo4eO{?Jh`{y0NZR+lD>+j?FSud?CC1 zSqB&nb+fp*`9&8lU|y#FPE+}}4To!F@RWBQF`XzN1)7VMy287Gn}@3nXBsB%D-*aS z2p1mk%%L;IvK7E%9xO)sA?#uS?^B>D&8?41YpR6XPBM%-kWqq(3{MD zfDt{`_2W`)U-m8VPqB{xQGSf?pB-zASh_+uGHo>G}IvwUWxp>sKl!v|dGh8?ood{2F5`LoaR$Nistw?EFqALrqZ zHSlk-2DCrQ+gBXgh!QdX36YB=uC=VC$L7OX^;xa#%m?nvf&~6v?9Cj#XtVh3YyW1v=Iu{(G*i!W0EHW5*-#F&oUn~*Txl^z*xz^YgNyw5-TJ+!kMzFw zop`Sj^o}0C_*On;DS5+`0=R`CH*u7Mc_456IMwE$`u<6tH;HVv%aZ`~;Uv5J!d7u* z&@8Kppd;ujq{X}>vz2NWtnv1aTl*S6W(T@vSv>lF-WkG{V}3I6V7+$ba}0rZO_|m0=zgO%ZADXUfwboW~NLrqMs^wvCAPYz;b)&LKrp zgz}-Ya*xuOYi_8COJ5S_m0n`I9m=_99wC@S&eD{7-@wQq1#oS$ThI=f`32=rbIs&m zJxIo0(k*0Mp-5BBpI@Kl3ruvVOj;2N;26%Z1me?QQ$=X|jKDD()A1F|9c1qSNhQ2z zwqJl+mV4XrWg=r^|3U2V4dg7-gTy+*=KVk}e}8FwG;r(2eM>R?Zx`SH zS3r0b2NVQ}W4MFQp7#fI$J7%~>5m;t?OG0Ssihq%3rM9VPX|R^an@q%9lVBJJ*a2YH5u@2 zc12|xl)4+%&;PCE*nzo{7X+zhgq5#`Jjysl z*CI@y5_4vUBl4Z{Pb_Zb+vYB}c!Ep;VPhxt7PH6@JADp0HGpx)^kbLzks=tc03Or& z2;Wk;)ttgK*pDf^1waKFuS->6m&K=t)+Vd~{MlW&dx{QBk5D?`6dQt=bW_6oBi`k? 
z^V~zAytIXvLty04F;jXN;Rrxjm9dQ71e3YnJ(Hg?v|APoz@r25e=8gb@W)FVr^DfE za{$JUn_GKC5Ce|w-#=BN&t2w}f5!FK52@I)Yy<$ES>QMJIv|<(2H0sE;Z!}VjJdx@sQ+_q4NiD4&{&|hgeqR6o@3jj zf)1?EW`g5fTiKfYUbd~ND_-69n-$I?JUC1NwGvlL0` z%Q`&Ve5s3xox6+>lV1quU~5E*SvypfdTwMj>{b1%Ed9LvDCA?fuSI(Cq4j1kTI|7> z>XHXv4Rm&ur)Eo$@<6hKvI|hg+c*q8_1UUzaGrx_0TK?*E;F>`Hsx`P2}TS=7Q?Z- zkTc%%53!@Yvz5%aWzZQ)j%x#lO2B0}W2^JKxjS^;GOe%whJ^Qk&Gr8w68;;gq0yP$ zzt0$v-=3p-XtsTU@6w~UPYie-y2F-L#t0RcFo&sW78U|c@v}ygnZc#Y$uSqyoiiWX z{y5+?&OzL2SNoCl90Q|jW1aa}9X|%U9JcR?xh*wqEHYLAXk$_OZ3HdbM`mKm3`G5R zkd3je+s*K!?ihU_r%ej2>mtY@zLW=96)D6kS=XJZnBBm$<~ij|&TZb<~wxwdCFT>(D` zn=^7WwlQ%+@B?_*WgCp-t~w~jC$^J#P0OPva2NiHK(5^Ui>|$3x?G3h*Wi!z0{T>Z z7qC+?lpyeiRF9vKC-?%`N*+b5`v}J&YJ{R_bc3p;3ZqCBphwfQpy#~tZbp?e5As)L z`{S?7Hn^f)6-*du&zW5$r%5!nHen)_5X}P?fD=6x4>-4d7aJCab`&G_pV#Dh@)JV1 zSMy6k{wgvez;I(?3|YG6GR2tA4bAtz%){-FlR7pV+`>ya_d^xH(lw=qj9_W566%fI zkK;Y49~CNJ7BYWA=GWHtIlnU)4tn8N3=+6ppFSe_$(*Jm>pr#V5zgR(7c3GVS2*TR z-BT4$ybY{S+nJksyY--Nv_Rn(n=99*1dS6%O(<5CKl2Fu({7^!6^F_I@vyIXRXj5m zq)6LL3axQ`m%&iz1LunUN%od-gncIS&W{1kIEcDN1AZIrNggUK?Y?Jb6(QPN22FV$X z`4LOW>M8QHj5_4=8?p5~6{j{?^aD6IKBOv`nuGXV6?l;#+rV}D0K)dXUcYjC!|{r^ zZH-kF$cv}j#9`C$?U+W8t$$Q6UF}Wb3Vj%fU*R`qy#MxCv_qU_4WC)ePCG@HC4CKC zGDMP(EFQ^9`I!84Uk9^>Pi=^OmtHFDrrMG8Y6*u;kgd;eU&dIk=kW#( z*bw)*>j&M~9!SNvp%R#_}JGXq8F(1AT87-ORG0FvQ z$QW=#c4*z@(pG^$WFcBPL$~(v%ylj$c=_7_B{rp(ACCVI1S8m5`yLNH?MO5I35g9B z#)#}L(O-H!xstu-`oMLN$f?4LPp7k*kI}*s#E-TBxz2CkrPX z8D{3Ee@`CLlbDF1q_t~3p;zA?d)b!87n7x_;Fu^O3=`%ua^Jse{i7ZJ1Wp|k)uxth zZP0&=KhDAb-`0SbcbdZ6)i173oJ~E?ln%sI=1&YWI}@V_=#jf zawh9G5W(&wgcT*-$!W};N>vA+hK@mIyd(IHTaP(GjhN;33MvcypQ~eBzBF&bzaRRw z!IF`~Z1XDcVA>rAIn!+#Nyq8WO@)D4E_P0!4MTR~Gg3GJHnyhAK|hIq0e4vdBCW&A z<_@cn-gEDXvwS?ZLG+k&VC^#A0e^;^lNgqy@+#h^gN-phw(+PWH6$ll)?U`8wb-g> zbUMy=(??Jh>|V!yZ)5&={P#BZkMr?gbUw81wd_beV`D2+&1rmk@mg|_CSqH0iZ*qI zgj0@OSfs14)_p*Gh76l_ycD0?BpPzF7#8L1wlwr1)cJJzV?>R+ftK_)#z*GCXNPvh z6Wh7rD}B6i(DjaEsO}Qfov!Q>6VO1)ug)s1|MOeb%;Nwc6cfU8hd+9X6^x0WjiOiY zZjGquxBx8W$c5jo`#=X_ZEK7U($FO&Wl4s#zP6G}n!pd`sy~cqx z)}aSDw;ppX0n~ynfyv|IDs9MpPYN>B&oai_tT0hIJ?$=Ljh; z0o`Nz4NdR2*t+LcwLED*+(LR9L~FfC*gSye!&{Kv9&)hk<^Obu6$FPl4$<2_f1Q<0 zlnB<&ee+I3NzwIpHG&{j_#d{P`!9VBvH_mUMi0>fKtuq6>2bp44(c?s2fD0wk6CdM zpiXxmO$Q&pssrq*FV)66u zm8~8gcZs&-E4M88M`e7nD~fLLg|%uh%fo!3)vD5!NY}0pK`bd(cDUrwI1QhZMSZgA zK*uZFOCL)V+r1TOlO>6kBP*$UFE7dw>m?akuX z9s%I(DaHLaXv$Z7PfD~i%*;x1{Qg5XKDDEPTpG#`eC8p%7g(097M#ZtT8sBt4bQ3B zIO6N}x(X_#ZkICcuw7v7v=q$Qh%}5R`;BrA9hP%Wa(Mgrf%~y6<=ODIki&u1K9VYr zYKjYs^_A4k^vO0{ml8l|UXa*hFq=o|d-B@EJi0)W=dGQY$irl|&KRAOqSNd1RwIb{ z7S>rqSmZg$JljmExgF_R2iJmQpb0LX`iXAEXRLrQ!mx}ZIjuQEdI=3~&HFJcApR!nJ|;{WE~-%i2|r zW&z9lLHCxm_y=w;;TmjZh|&jsORXn%&XdUWKKhHB+q@;UC)~@n^Qum zAGoVfM{{x`PDLoUR_gVik4X(wU3ntdSO#(8y6a3VWwt}J<+G3Ep*ZB&U!KT+cR|;4 z+!V$a%g6KH&bY0|I^J!?)M{$LF~4wqU1M@|ZfF329Z2>H{f<=09D{4bTN#M+lgnCO zC!kf!XSJa|(zMVKh7c4~Sz3BnFjLfXy?W0vdhQ-$YHtM4#HdFQob;Q11|Ou1N2+PJ zLS&_Kb+|UCB;h~sUAtFFcQaqepXNLr1Fntqj$iyTm^(FDibSYkWq#cB29T(IP6b+% zm-=LMWG;RPKuA|QTTE_KOh@g&3Ev_pzr$45G9@V4rBU(N={G4Wh6Gi>6zu>^(UXOx zQQ=rwy z%zcccQgN!`EPMe5bOE->lsm9Iy0Q>?)!OC!eU9t;Dg68~VBk_Ykum7=RNM%`*9p>{ z%`JBNu|!i!@MM3T^Vy?{v#GMet?q2`ME%E6g_p-fx>~{h@)s?IUyHXAefLJ_NTQ!G zwy^IG6}~<)SAdU(Kid-jtdJrlPpP|62F%+NqJ~!{7#oPz3gaSzTR<` z4olosvifp_yEF(7K4GmzA4Z0a8b&bu$cL-?bOur^k_N;%D01Cm8|g4gHvtAJ4-lZ# z`w{&?v}VAo>NMTT7rSOk6g_K+>kn~hHpa$_Uw+-7Yrz71@93R_X~9S!*y#7$&RFtW zD(F|MC8cDFN^`w(G+hv*DQ34ngYQKjpQwMXRi0$<7HW$X zIJe)!O4(BV#m;k~q%bj&5ViUXVPo1!lZ!2UX(}?Wz%-edfP7HZW?^aMXaGy}PTsUd z4VN3fbyY4GB{4bcN|*Rm2pE}R3rmc#*pt->jy za5BOn2HVjI-DE{hd1AaURkchJLovaKi$j~NMD@dCDqCaAb#j_Y-&$3_R@TN*8e%Wn 
zhAc|x0mT-9d<+|WJ+#l&c+C4vNi~ica$VnH$V~tYr&v}nZQ+ildCaCMC?EYSb|&>F zB&|4XlZFZ!%KF+*POO+>Mt_IPKF0U(t(}GMk-yDRQSld{HyA)O`%P7v2%b5ssC?qZ zOz67%WsbJFrQIZGpLr#e6hc4N)pub?Aw-i2w-On>PQB4`e^l7vg7KSx;kFH7aByFT zPfA>~1a<{JMe@C>!wyf)&&_bV3!fHJ^>=@Wb5Z8C{;u%wQeFKR=h%^fw_pTMXd$Od z-DpP~9<^(T`k)oc-`PfuS|{wo)D3a2eaM`H zKWQD2M{l5eZRX!D@={L2)_6guyx=5l4Oa*-*7s84WKX3vzZs6pQ;-KLKIX#)7d{V) zb}p_(a`WD(P8IF({P7bK>br@o6a*@g?jy@+iFiG;9}kN)g3fu%QaD9jY0D@;$z{_91U~8K0lAe>*6&_Vn=s z2wf{mlKV{Ce{6>Omi-WkWVV|HQn{5%1bf8hyg4%M22 zy#mejk3}vV9@beS0N33`d2Exoig+-I<6DKX{i<_pyxj7$^)Dzfpb*y_iE}yn1?n^q zMqpjWO&MrEG52^CI!kqRe^mSfQycWa6!Lyz7}n!gfqb-l?xY{Q65(<$!4z~*q~FB(jR z`m0JpWIG>nd_56VAiY-1U-_*uLp==nwA!HGzJ182^8KR$9$Sf)5;_|>u?%})u{k=< zKy4pYvtn>Uu7FoXBz+IeIDcP^tv@mqIm+}&9-P*#4$t~hKVk6TcBt~8|LcP|joB-{ zjF0JqmqwmFu+X*ivy>_gdfA2Fh@8kL>Yj;y3H&6q*nbFk{D)VkEmM@N>^;1|=i*!? zBh=lz_sJWPEJ5GoOQ)S#=qVrZZ?MnzDU)uEl!!=4eA4r&Yx%ys&6iy;nZ1@)Q!>c+ z{m2!5#6Ho-OV}Ifw#G1;xaHFswgtx#W0#3sPlwYh%se#4ltG&Ty?2QnrF>*bztIK3 zea@BZFuNz#`-!~U50>k&y)9eYN#KJm1OaDHvqWO_&T&8KI6ucu(+M?xrx*?-#FWv7 ziYG^G8z%LLJ+jX83&5u>M&}t8#ww#CwLYk;tttEfsT6Y7^tQjCAAi!0{J(mf0q6t= zrU#F-Cw8yVva>Stcpafbt`J_4&{l|AS`^iuQrd-AAdW2T(@#r`kghy6-TwHPD5EXL z<&JZ)OlT_RJdeqI>{2}}rl(OQ^UI{Of;UYKeTFEl)vhz^d@{^kMWRvfxQV>NLp@8E z`z(g*C?C7cJklBMlJ3&O)K@f(&?a1c$f4S=_fI&4DkMJJ8@skvY+Igve)DdX(fIs8 zl+&Ea5>2jG}iuLEnm2av8OhxNQe_e{3dCs zPfDY>qx8_T$@)mXk&kY@>bnBfIL~gL-9Tid&*yC?%lbOJTxe{k^?rC3^Y+EOb?Gri zD*EV`XNV{9W_f(s=Pq_w^BO7FYPz{SQut(RSWB83#d(&VykBqov5x$sf2<|<{f=Y{{cily4SCAxpO z$$ih=a$%5yBz`H?QE=WDtz~y8ypf>>rjnF30}R)%&b|LvQ~0mGrnp9n| zqa`}`BAxp6@R5?0ZExzd*>Xpv`bw@51ko-jhUT#siPE>$&sq2RS@x7Dc7CA0my9@}wK|Zh8*GeKGr}K>`F)z(9C@9{=aONOVCkf?qp2V836;TBR_9ZtaYm(|%G@EVDQ7Fps81 z&COXF{`f>so-TLzG?;1X*k22;M~m%1W%fS$mA1gF(M#W9hca$>yv&m2?enW0RfE_| z_YtR%3#g<)ee{j&jtP?^-LKBQkAO=)wwEk(ih%%~{M8>PLav{!sqs{K+M74n@ToSa z3Mfj4Pkoyi%Bg*wZpUJkG`;@nm46v~rAUGz^Q5AO+mM;@G;G#FM)OI@!anj`$h@{573xA6M|M|UsP}QEV^TlkVw;GV=ns!Dz3Z0!z z{Z_EuWmno$O*?VrsM&T}{kzm?1!ppD;lZcr%Z-Iduhsa}+UFA@`BHC;P42l1S6nh; z5+RiB4{?vEMr1tz>~g-Wx6IBl*K5$<vK^0Jvat_>NKDyBi^a5eeH zb^b85`9$PhhldxxwY0jLQ<{Z_1@`SNv^(`BafkE3oW)`;Efu|Z_`+B5ueVGvApm}) z+N`?$v%N{*K9Ft|M^_-tGljc*Hf%oR>p|-^{XhS&A@|zGqoU~n{dX>{EYgLlpoZWqO}pu*v~%-t;TYr zRZ1Q0{K9?9E*fR}2k$k`RpIZ!Iu@*!jTX22u0y>E5VRF}XO;PZH`yhinz^gVE}vH= z_36hDwy$OKK9hvFWbN{a9=paa*)wyKKOu+5JD=%ZZ1LQ)z+RRnVLukL)J3`D}}eV>(_Z z`z%>cpX!;td-k+Sr=)ogQ-FRCw6SX7m3+x7cqJO`Na=uN*2a3fE8gpIcOg08>n~FX zy;GVc9ZZR*J|=n=z2WQ{>Z~F`{VKRbhOH-u6NB74?@%kEw-iu&p-FsUv zR&z5RuOs++FQo}%VqlP~Ug)H2-D&UcDdRWAz&N zuAEuF1yQAocVSphm>SP+DbTq7wvGNP_hP(ZT3G@t^`aj+b6Ti*aFXwwtStOdK0w4Q zbTz#%ymR(d|2|cTQD7t$Kd?({u^LADR^HBgk@3ClbY?T>nBxk|?zA43x0hp9DMM~h zfpi6-TQXksTLL!bN63-JA;)t>1621dhgPO59LTBCo)eRB6j^mwzB&Pk& zaMKH`J=B}c{BsT0b4Q&6YD7YdpCxH!9Bm$E8DooTqh6w2PYhbqCe`{pAN|&Ka&N2L ziQ-6K_Xig@wJOgc(k~lx3|$zB$6k@+;qgF5Uob~sw|Gp zdG5_7!29+f)tBj-R_3rHLNK=>p2J*{9CX4-2>dVh-aD+RcFh+CQL$2`m#82p zp$LLVjg2lMB7`0V0qIJwiGuW|ARsl;q!Xp1K5x!DNJ2b|chBsZ{qDJD z_Iz{BcbzlW{ISB7MG0AX*8SY&SAL+XaXM@824F}*r3C`GmcJctLVJK@4Pg&Edn$Mmv@ITQm}fw%qSI6$)p4h1{p{P~rIe zHb+9XuZ+EEagcS8EmQeRnSdI)r!TJs(PM;^=cZN({%YukfvPA0o%`Z7Mim;uYWJCT zAFE4m!YYOG^W0x-p6Tx@Hb?@^WbFLxF~bj)RPcy>4xoGh5}xU}2mi>*JNy~&dqP-L zC5&N-OH~M)Cp|tv-}2H@!-a!|O{Q&!`h!l`W0!zkf93f+RWwXI@mr8t#JtU#sqBM% zJQFU%KR_W#gC{mGG_5=V+Yd&6`_(?;v0PUc0)|Z->94w)+{o**vo$Iu@Xa?7(eNp| zP<0|~q_MHzGef>jQMarn=ZB>DrVyLj#XW=dA=bX&Z_2z>7-|49=HQ!0GJe~+ki<9f z`X-=-KGH<&Vkcpjl>J+{l?6F(M2q{H+RJ%uB8&i?#0t>E+)0s*K)%CG-|h6W5b@meZ$9G85&*Lezl+T%{jxj*fWdER$(E&r&7a3psicR>Cq*2^r& zCtXch(d(Ce;M$L>`tfkVDUtMo8(*ET* 
z@qlS@-A}EgZGP<(ym$+_V0pnZZ#pgRnr{5$Zs?u;tG(Mjz6w)H0}t1JOr9xK5=aF0 zrOSRZfAelR_9X}!#C;NK8X4Yr8Sj#q^{m|NnP0}#yN7xlU7Cc6&i-Q6u|AjcN1oI5 zIBPJA#r9t-)Yov|{TN9NAvsh0sa6DJTihv<-lKt?8{(b~MjahqK|g1nvYP*RD}8%U zl7S9$H0atWc6C^hS~u<frhtnv<@AQ)DqdM>6tqfkMlFBG&#NDQf<+uit2L@Ch;>pMUTgxF9<`;1!3zW8M{0 z&vc@jdb@z3X8oPEQwPXDTRmQduI)9$;p{#DMxz?QXq2rnxBUQ)-upmM8&Clt&;PDM z9s?frH~0%DpD@}Ta`Hg{5Z$vCALi$PioR9vzChI`g^~GyKar65xA=_GPdMJ| zfSlUFp|#nV%9}sEB*RuOu^KTkdbV2AC9fg?Zg| zIzm|sza_;whG|@FjZaBq5T4KpNe10!jVn*3Hljf6r*$(`ML5$=4fcEbuQM2zV`Z^h z8$0_snIbzrqgQ*YHRMv%+|w5~Ktc5;AAJ^E#=RpeLwg#tSOV+5$6%OZHlj}<^E~DD zEdb#Zfc+QZSd0#%-Vu@y^OQf9h8nX~_WgkHh-PF+r_;Cx8Y*^7bDZ!&i7%K;XWYSD zy!zDh$(v4%qpW?A!~#PDjjG#4siMH!wSIi~Zc(oRZ&h3jtD3853m`LXzLBK2e@cgk z5;bHNq?B3icjn4UlLtruZC?yxYKOe02~1x4S<;ldS8%7@c{IFNBLTO8s`~~?ck2@0 zs^9*qB^+&%5+UAwUrTsPOGWp&)n|yU=y5Q6zPRQR;C~o*1H7s?COR{pDMa{6`au+5 zrU~-g)?oHN-$K9o*=@0JeHWFqlL3B!l~b|+X}>CSm3nHN72{V?*Tg}Q$A4||LY$?t z=ld989d(Yl*1cigtzqtt+vz=>8A-QpVXSRFfVM6^sgnv_^E(TS-#@P&_OL*wX@BOl z9UT38<+nd<1bcFU2m9w!e{~8x1P1_4f!grzxGfWL#Oer; zn5kh`+TjMYKb<)wxdBpd0Q!n!#1m-6-aPhaRvQW^pZ}!%W==xSR2+)41;_*2WUS9@ zY&P0&75~7KgM$4Va0bWzBE<<1v-1W%{M~hT;Lgc|hyFblqD0ow_oATVwRHts(s8CPXyYH?eRcDg_)0Rp9m0TP*IC32ppCs$gb4;p$$I>Fnv zgT#Kq&g0Wl?Dx_H?pZzC-v73nC%;}g>vdCtDsLOJGN{?Q#fQl7I&fih9hrtl>cifT z$F0W^Zm)<$c6Qw>pXwjrvA_%lg>lm=PB#I=^#f1TWKa_B#=bqiMibW zHjpSU`ZQ1;FFKPQAdMHOPIk+o7?GmVeyv8BpHXMdbqv}0q31_*xQud;)-n+svWhPU zzyJk6;&+*@jBge}O@MJCMlb$FrR;0!3=KdMT_@Nb1OfESos55y^#bm_KV-fBWR?bh z4g~;KMu^!cEau=Y0KHI?IK-hG`ls0e>+N6pr~NsF{_}Gn6=K^0m1YH< z7y=ZmgfBl=EzyC?kr!b}zv-$lA&dQuR~j2C57I1aB5j-kvNa#wUfreVh@%XDkFlIN zJhl#TO8N@|$d^>#S*bA93&Cx0K4fPmBuR5ZU`C>TRMY)~)IClEE-uzizSN9io%HBrtmF#z$-}XnAe_Fc5AEct@ zs=+d|^v45yTOXDJVL~{-2Fd~ZV`-QUA>fm z2x)G|4PDD`!kt0CZahcw)=W{Z8y_{lccS&Aq}ID&pz@}1><8naO>ThGV^rSKy|j|- z7w$eU-bl=xuZ3);pWk}|D{rE^3hrxa^%V>~&i4%{HFlV9f)|*_aZL#PThb{h4in6m zz1CQZPYRAE&&!!fCr|xsk_&_xqQuudPiw77Wt*&*>1QX&^FTGYZG>viPY*o`lQq4V ztB?&8jcR_l4rHi2>~w&md-luTkCVUrdGwT~t@pHOTOeSF)fc&?O}NCFkus|-p>!3< z&D02bP?Evy``t8^!BM?xr3t5LT~2>5riKS(LSOp6#km}jj6UG#r-T3)4qlQYCRNsN zGi3AO)-cKGIRWkQmRYQ|??7GstfwK#a|XskDsJX|HAKBq*zRxSU_CATt_ z_pGo7&k)t#F>9rBVHW*syYlzX6j)B-Cg=PXmp=GI^Yr+*2!e^qPOvt3`~cfIPV1Kn z!q~;?EI_=@t*uoXU)50me&%DH?>o>#4k2NGI|3?G*^_{og|=NDAG8=Y1#_-CIw^;1 z?mR|ci`yG3%8L$~cnCal!&ewtR6+ROJ*oiKh#KYfsio;{x+@W0kDMm>)hj zLf??u-1IaX+m+DvuKQBw6>sv?>c<8~INn!ySR341Y|^vhyX&aXa2sHPMF#;X;MvX7 zt5{Vof=KeLlHTsPO*izENUue8O?` ziQ*k^qB<$-cv9rAe6%)fYDk z0H4}&+8viwvH%H$r>I`P1BQbLW~xrTX=6OV-dvIaKtZJNc~5 znu7ocEz<8G1ciwkTbrAc4?(Xkpmz+gYbt*4K_QtR8E2jVyX=!giZdOD>DIs>R?1t_ z5)IwX=feGDZj3ikSY8-C`@q}=5+prbY@*2wQ?Gc&k9)RuMElii{EGQaH?3)i=AC!k zIZbRMIWLZnQFRAq_t%NzxOIUC$jMH4v={qELkJv>jW9)urz&d??LX}^stzf0$mAUn z-#y@TcMi0^*mW*yEP2SAF^e(ihO&_^N4%{TvI25a^~w^OBS4z-^W!eS{qP|X3z21; z8!S|BzC48yxWzEEN?>eORUnAAPWef(`dBSZOk%dm-+ocBR(YsKy99Z!9CYG1T90-P z-uLqiDe)*L7rW$J6br*x_oxZq2;&VVK(rOVBb2&ZVD+fdKeP0Vc6`O;9vLukj{S7F zNoCh=at7SQ+r28)#YR^7$l{ORRT5PxWrqVRyXkg7FCa?leTnSMdxr6{d01)vRdtMD7gG@GLPtz9CR7SWm94(^1P>9B%J4l{oz+G_POdeQ5)4(+Z{|j6DAIFCN-%HF;VV~Ka^#|s)^#zEFh>nKa^=VCKV;;_^JQ3Y` zLX%sStGupM5O6{5oFU)8HG<}sihGR9H=AVo{5`Ju)o(g_O9RWX+0lUYC z7k?D0hsA0DA&_zz-fNnfCM8x&^i?ETFtXr>{#Zo2 zN-r~hacvAQ-(uQ>3=dG6Ijtk0-EvEuJ?l-qiy~{?w;q4s4_~SIdhA>^5D9bbe)s80Nm4Ig{YrQ9dK)Tt#ouzi}5l~G3?L^>z zp8w@~K<77s9){RY5rBzOedxOc|NQr-@@Alv1sI-IKyXLTV1~AjQdW04DAuru?|ax4 zFtMXSiWB{dj|MbNChnC>)odLA`6a5OH{fsp@n1Q>9W+@gIBqsB%&NL7(RtTGAyvl& zq{M`0=ono>z52m)LZBQxcDbzkxMCS^rDMO>JFjcSTK6So#*m4t6bRKq9{2b+-D~(w z98RWn!5RxcK$is;Z!!l%PG2Bd%zT#+O?2_pA(}`o+*|7mVw3B=WFQ((FPwm(^FPM1 
zGq%TXeeIN3$lV;l>c++zr@5Ae%4MCD!^$qee(B%ZsQ%$|fR;53@eKn^;mJG6zvE4C^YKy1;`SwcluFTBNk`E*95-NVD^okM5(u%?n+(eCYc21a%Zd*Ki}Sfr_B zF44=+WLRR)3LNEqyG)6#q&Wfiyh-KTuf{hdM!e8Nyd{vOpNxw(6dD@lktj_J-skHW z@wU07!|{!djZRclG_K?ss;ZlM3p4wv_pFlej4g?d)d&WK}8bO09wJjw?x^&2{fVv<)3t03sxEhNMk21uLW$^4swi+c(!8Ty$rn^VsKeX#5~!J`+dhHTj;jundGLcx zMj$3-QRkVj^4qx&ZEDlFx9tL4oZMf%JjZ-fY1?9#V}roVGrJ8Kl0MWo9fLs^y?!`J z!cIvNCV@4vFZ2&-XanN$iNEQD7u#XPQ-4TBS(Ltjv7zVFNyW70Lpk}?8cD$UA)j_> zKV<|UWzxgc;9ck~!zVyoQHq-fFPf6UL_mu+zL@#IcM@>{HkfV*h=@&UkYMuxUW}yv zU5Iu;b|lS6b?BcL>c90B<^8-msloy>6~fcoukJ{2BW?uV-twn>#L0EXfNP2s#PtBz z4@eu;$I(cL7ASf+WQRrXQXsE~u#y)8193z9YrCt*R=gn0^|FAfjs;(88!`EQfGK!={D8Gl>)6_4eMVG^+3k`FuN2| za9sP7OkOXorSbdgfd_u0+e-_NvQ+8ara^!q@+zdD&Hm5Fm#(NDrn{mgidWh=%Om&FM?p;BI&@@`_kcx*60)RAX|1#yzqcZw<4gUVhhXo$b4c|ovF&|}xWrA2Cwyxav zrakSwLC&GYx!v7_>~Dp#bm?n0UVNYXIE3F0ojPF4ZVV11C9@|Dxo6pCF0cJCH)7&Q z%ni7N4Jy`sa%+%qNd9T{^{XK^GuL&h~M;hnIj`cau1 zdq(tOwPmr912HBBsfCQTRh9)0BOSV_jReNERf3M%Q09+b9mTxHHMJKmONJOL!}l`& z8{CX_{MZ2y9qp)t<>?{Fj5HxK*Dp2aQdRjUgpkkpkG0%Ft2(+PH;>NG6Lw=bh;_*D z8&o5nnf#}jF}WL99JVx0G1gFMk)dx?P#!*Lo`VYH)HA*Mc~?)~vw0*qa<{%XJ{dd9!`pZE8}udtrQbv{rs zVHLBwT55@;tQbK?(7B2<;2*Ki9i!az*+C4MPSc!IcV$o!={~(7{58$K%BX0|qsf$1 zViE#euu6V{`lUB>Z6Cp*%=?>eYNT>*-y8!Qv@vod+`a056KIjwUpw%;cUGXAdGn(|q1~t=8nwQrY*n zE#szP`$s_J72G#yCaJ(0Ru5Wd*e}SJh$_SjA%T~<_Ws}6cmHob2VOukfm2@=eqjI>-)kQdt z75T_VF0rwv%8^z~U_#t%O)CW#nt2_KZ}_qJim#_C5xr1av1lErG382pg&KFj0}R&_ zPe}PQx5xbexg2AA@yTi5mj_G6nZ2~gAHF_%jCxxo_JR=pJeJcn<@qZgg;nmDljgZR6J{SLjT5!cSR;%m+ zK&HR&?>R`{2^BPLTohtW2to^iRd|wm2n~Xes7!I_F|8pHWwP`)NO+mJcS|gA#EAnJ zrCYgmS#L~xIhVi14lJjQI$^diq75AZeG10nAQBLn z$|_0@gT&tnQovtB7cuLVC33E(6d3J;))*!(-JU69gCiISQg|Y+7@<_sv0lhXkml(ti^W; zM@0GXja>fW*0kgGedEZ;&Zn)U3BG~WPJcmqJQvaC3)XU?1HvjXj=T3zxaQM~YMp2| z-Uq(M2VBMeQa=wkV`U>aFTcBZ9&ksZW4y_jry99?<>3P(jY94Cpu^#QmE2!b&RJEU z74u!500ZXv=$$)QS94ff_)l0E{7R&p_+}oAj`86==2w5wN8R}+eN+Vs2Y~3vn593o zD+P%4N$4A^hlXB8K=VVs?QHB>ax^78>8}#0eZVFER}!h&|3Uw^42=0nkKb%VF!j## zvbHu}QGJ$Ye)9Y~<|NHtE6Bwg3dg^iUXe|$VAm?XMsS~XZP{cGkia4}u@|MB=F`lo z4CW`#Z{$}n&l-GF%{jVX1^?xPY#C>auC(7_W*fV-o7lAf7IU6x^Q80BT2l~s8#=qV z);^13(0WlqG69+OB}=^HD|>8-uD_}J7}A(C8G@2*W9oDouD570_i;Q$uW+n{ttp8|B5eR z=us7;O82XZ`B++FenGlD0Y*99uN{WH}a4!RmZr zKbC0_b-x=3Gq?Y8yA(bS@-NJ(u8#SbYIG|6WyLi?I>2sh?eY;Zvc@!OZM`($LCkKC zsm6c@hmrBG2})N4-Fe?9wWIl)(sLvY1@H=p5%H9=Iwn!CK~T{xEx0hK%eYtWE$Gi& z+gPA#-Iurw`;as;Li8d47P6mi9eEuq`=B5)2e6Q(V|aPY`#}rLL}2J@@`%PvXhXJx zAm0e8@~ja=Y+Em7IuS8`CphkbU+|>6iQHbzx+gx<$diLEI81|~uEV5+pPpy>eizJT9!Uo1E6C`kJR z@~_4qlvg_HT<_sS%V*_#@;11)2V*Yv$&x*E0lAZdN&XUgy}MW`hQN0 z|7qC1jyRZr99)V-o33tI6Z2vQ=WJ`ca~Wg%3U@)a(RSNM4`=WS(!dOU_>UR9Acmq! 
zASAcocDE2aX4o}dxDu_ms0TRHtgbh1Ogkbkg0=z=Hc`cQ@Q~BMI{11Yh-ZJ`Mj4c# z10L)CI$!!deK>!060b)ZPvi(iv(gv~R=|Xb*}YZ?jTyUWNm;&(fiayFr?L~X#wX`Q zKQ2&8RD1#L_#P-SK%3+WbY&xGyveQPS?3qY)wK`KByG)w!Su(HbC^_4EVk({n!32p zUfwR1hj3gjTbyNVMYB`ouuuEyb)2l763%pWU+Z9I=JOWJDOkD*NOrv$yx+aWYd;2T zM4B2XALtyW9l_WxP?m`5xl|Hx!sYhldC-*LGh6+&TxgjIZ4AjaV^QO0o9s@*Cd$aq zMSLbhzf~H2E|sg0i?J_F*{#}KPar7Hc$qGfEl`x(JQirea`*ulXsc50P@fQ3d^Q5q zU)Qxk9-_6&(tYd*V)DgoRH`IrN1~TzGz`^z-BH zHD|2ooddyl+;HAvoC|}sZ}&MC$anFi(6e6kqj1md zl%s6S`sT(}6)Mr6mxUQhRo^QPu10DU0L|zBaq;_e#`izZ|NCo}SPU9;4>=o920l^(I3nF9to%T%kH}z{Mmk!#iuU>rnE|7k<@imH*@LKSGg95{m z2W1e}U~=S+w9egLA@St;{j7u2C9l1@$_BMfv0l z2phU~+a|GYM0L-Vn8*RjcXofL*ACo{bco*r|1z>JN}jvUB{U)rOj>o#im<`g6XJ+` z`-X$0H+i3@-|4FbQuWe-J?-5g!|>+eZ~y;wa3mM6rpkw{pQY6W9!EMcDSw|g#;U&j z%%eKx7)B~4h~6jJ%p{LUKGO45kT$)A+>M`8xUm#995*u*2{bG1q65_C0!?-8+LGfu zl@hA7+FlgWxCgTfc0dLRJ%PQap6i;o``dRVrLGQ>5y)S@wQC)+fkS1z0}+U4Qo;D1 zd35-jntLT&XV7~C9U7|^y`fak=}oSn0QuqxSS?@li`wl+e(!(qGZ&yReie4UzWGxc z{!f%9%x`-Ozx}N7P$`uA`~i4_pn7h`9O7=IZRzER*V$J7W|x3&;IJl^MeYYrGj}Z4 zyM?vDue>;@aS6#z<5L@QP<48RdEanfG4@3>Dt{GFD7aD@gc4Wrv~B$L6(xkrvXr}+ z&jQp-J=gzQq5Ufr@V_}%92jW#(Nwp<;aXl*)tBUWEpD-kvAGFN91`tqoS(Ej0U_fv zVir@e!-o3tTFP{c2Cx(uU!x3+sx~}7uv2F$#WdD%IXt%V3v(mxmu~0E<;+Bf!zqF%AOWtu-P)e@UKLi5K;F>nDTMYkNG6 zU0H11+9lF;FfT522z^P*YK%0-_-vbLKH5^@8y1Kxx)dcg>=%9&IBU1ARs?^|+Idk<2 zR!p}cyZWSn{5Rk6l(`$wxc?5to0UzBl~4kjevm;?n}P&Pba$4{lxfAz={8VhTuZ}o^&s%RrwLaF@o`ugb`8}e*2eV+^p5$)QARad?buz%0 zWH-D2aAeSca&|Jzq1mVL4x1lc+dE+*eFPC@g=+gk_Mb+t(-)G|S8%PFQW=}Fq8Hep zU0wE>uLVoFHyUm(_&ui9u=?1nO|L!{o@(sE%>x~Sw)GY}`G6{D_W@^iPUy3;3p(Na zTIY&*c1>oNiSXeVJ*ZH$L#t9qL@_xwdDtr3&SVEh<)0i5rCQ!~1D;_&3+bo$_Ig zO&67n1RpNGZFG4QcurcN36)U=8lS+pcVq4i>`*kd_V1LG*9xvT0Ns#AwSZzZ&xJ;i zO|8%P^Toggnie3~6uIx%leYd%_hK>tIb3DsHTRw2ZA+wk2yYx7ZWj#&Gd5erMHF^#dV zFB89z%T@)p!&V3D`V<@Uxaa`)nK*iTP;wpyr3NLp;l@{Ib9v@Ldh|E1}^ zXY47I>po#Y7XL=34`#AQjo23{DhaYTVSNWJ%BE0uvH*|<+l6CZ5XCcw{IQp8$At?c= zgocy#ScR8H(GkIhA|OHO!Ae_FJsd&WGWISUVcFkJZWs`d+82><50X=rmkL^Oa@sUE8p- zm)^Yr$Tk1`y?KVV|kYa{GvLc(s5!e`8g@Kr#xpO^vfVmZFaOvSU{WVE6Hhxi*f9zhY)4 zIEQv8Mx7)cynLF{BB^&PVJNe;)a{E@J0uK#;R~mtM!1A%t#wHB_s^r`%Yk|5Q=}Rq zYS>dL4m57hb*9V4&W^7M*XnaghL{h&>je55^sP%G~- zOjyY-{fl#voIEb{NoG#~#gde`d(Rn;GkoHh=koT3?#cYb-P_}0|LiOLHcIL9igBiv zTp4ucX`(G5VUKeJm}(^NuIM@~yV2Gi_dz-&B(aDa5ozphLcK|K` z;~S0VH-#GD(%61T1uSNlnTZ74pFiytU25;isB4XvuZt+0L-?P+vdcIMZ6$cqJ!yDN zxFOedxwxu751q3!_-*GCPwxIZF^3c~6zKoVb{7r|dcj|l^5;OU2*z(C))Sa9*V3& zzs&a1k&x1_Wys-(peMue$ZqIjDF@kwx%EsV>%rn7~Y zutNuV@7~u)U}TuFUiSDgp^NXF$CJKL{$Tz0KCoBmpyBx$DKiuj*5*-B6hT>8Iw;C+ z^dy}oB-1!WGRd6v&7SM3SG!y3%-e2k)2%g4-{5yUG5tn064h35ue_S6{?qPoE*faW zEG>=1Xcl4~qrT%}kXwfJ0D=v6xNk-tSpEH}(=DjIPV~%y{(>E%HLmd2$-f{p(d-GE zs3GX8Ya4TYxJKG~Vu>d49K3y;d=I{4T`^+DOS^Gj&&Uac%>@6YYx1j}D3t#uw(3~E_KMM1o6(h^vLFYSy2??{VVFFCZ zS-wJ{+YoPndGRbxFwJJ_)lPe$H@=3LGCAy2szKND@j!nwzESu#br8@%pVmh z?fbg)w(6#dO7cEgD49JbU(3JOU_*+<2i0nMu8Lw8zI$1g6#i{H9eiC$?BJJb6T2l( zF?<+w#$#`5O>HBeHgxqUj2it7mk0O2IdGBl27j(VWh^GOy0orU#$u-4$2;V~OOr_^ zuY(#zT=crs$mllw%Ax_{$Y(MgvI-ZhGpox<6)liv-**N5XkcuGRl-_uZ1<`$4+6#P zIym>0KOojWNN`h)p$oelxCnf&M7lH6v&LoIcHO&csoz&^;bTI%;pOVhi7(Fwzqz%f zrihk@AE&-R=&DUsL!zz~F%Tdy?P>Hqk$zFimo?l=%4z2n4ffE9WAO=^HZz%cl@MJaJ!AOD z%{Vid{1kf&b`CzO7&3sF$+!aJ9N@Hsw8lY_!)oc*T?GQYOd<*WfN+xub($=gtxt-ZLE-kPN(@7`)sV|nmkUqlT&!i z#Q5n|p`a7T%lyt)Ky=?KNrx%rryGz=GzKwl1WTaO=uDHkmDza1M z)=vu?LP|F7PVH>eQ_(_gt&1{HVk$C#ge8WOJg1j4zq^mL#G5WW<8y;xuyHy{S<&iSMj4>}LJD5J7*&8gc)Vq0+ma z`kSz{cSe9j5c5`Nzk}>P2*?j~Bl1#5qJ}3ADvY2HmTnpG9CbS(TLqx!dkAhzmKylE zHiMKW??7w&=JPq1zrReN3Li1(i^(Zq*ifW`Am3INNXcQvs7I7C0D5MyjIVt^9HxA` 
z4#(kBd>3yS_F@*qqBn19_A1@&2WBH%%1iVE_5N{P55xTwpslQq?0{cGYLvgv5W4FX zB+R(rn;c=ITE76GjXj=v%1yKcD2aESE0ljG@s#@24^0yBSu@RHa%niL#}6OOR3IGd z?(=!l5t7gTdX%G#C(!gZ(*R<$L-FY@imjf2j*(-OfjMIL&P?e{Y2$i|lS`+rwa`RH za<(g-FK*G4-+uY|Q%haLDLa)G-tv^hFCbeu82xZHfA(ko_g+?`oZ)JRQR-a+F|~6< zRcPqRMBnK`FPfMcBY|U;#sjF^3oQ+(r}hV%)mU{nt!Hirwnt$v~Jd{dwFgjwkLr?n8b*ij|7c zXxX8GoZ82Qp8O%%xBYx0n>cuA=QRbIc;t4M6y~9`7Vs>1ZzDk8e~q2lE%AQnIwTul zYW|ek%>rNUrF?&>=XJa>Cyt~V%9j4ZSaV#yGKsjhOF@2{?c6^3$b~~NzYdso?dz|H zyvS50jD+F0TL(yC=I#rHC(mic$=LH{pTDsfq_svbH->G3RG}hhQ?oHrEuR{#2hU&Y zWTfuaDl=@b-U4kx`|%&OKa-XoB)5Lufi2c_&tt4q66G(|6*I0Q3W2>7Y)!d*5HSRGA{q}*v8C#KEDu=Td_DrY)p>{w}zM)M1Ph;8E!QeuTgCz4-LZkSk_ z1PBkPO;0S=Z*?&1^L0)jR5gimdzR<%ji(tyLIPxcE!G_y43Y36OiH4)!=tq5sa_9Y z58M*^c?$eP^#Y(Phio;wgtSa=7;1X&{G?S^L}~847=V~*W4@OOo<@a(Dn(=K9fc;0 ze~b&hcRR{iW;lu%)KceZ40fP+qe^CIXFF^;)49YUDeGIC`@ZQnN?I}`1W}*&03+5$ zQPSz=yZPZ&?H%xWog66+Dj31a2_yciCwngzbU1DG6+8~(QuJS_0m zZK_cmQG&L1zkSy!tsytz@}7OoOIx3dhBUxlzJyUw0Wb+)#sualyR z*qu-f&%I2gLxp!KOLvS!X82x~oIuyvlQLtRex+Mvm%&^5VQ+Zl+Ylh1z+ij9^9?l& z8JX^~h)8KMgcw<8W^Wx@*!+SigTOT9;+%HM)p^jVkT%$%Y9#pe_8Y|`K84-XYnTU7 z89Z0=`k&D@N%E%@+7*;^?re;U#ikT4lY3tCeoTCxY?Q1ic*R=5$Y7GU0hXaHDeLwdyGMH!)ZSA1Ps^lH=y_Ml*E8yT!3rgkK!L4P!AjmBFEhhjkHCnuv37#=O zrd*zrr=`|xA=j9JGuuaAdxbIdj8ACj?FY1w&`7<)Q~z!bTAZhC5dWHEV(fyp7P!I> zfO7sjsU?hws(w(gJLT9jsC<(BkofF^Jid&f2qi`b%1kwLY_$`!5JeU$8+q=RrrOdg z6O=U90>>gm(0bnkuC^I!f0QwgDjx-pGQ$1`AYg8Q##K^#8rElI_`v%1=66|bCpbnV8&dAHlKCGeo~ zdKCg!(@${P+D{Nx!c|l`Jn3fn+qIZbNFv;m*7v0>zc2tg@?^-Q+W7tSE9Sf57e4`| z;Iq%J9rcC+bDr_nFE8CX%Fm*4`z*XQ$;Vo|k&VO7r0~0FwhYjOp5_&lS zcPRP+I>Ti=m3F~u!P&X`;=OUi%m-ttGtg!8hU9fdf?MZYFYosKli#(Tg!ECY?LeX6 zj?1WlE6;yeoN&`%=0k))+E;wpIt*Pl?ora3&{Z*2L;a&Zv}R}dH>zXLxI0vS3MDZN z6+$+!qh4G@)4X)W*bGJU@yW|R3o*e$g@C3CFVzV331S1{TVZj;S|0uK?z=T0`eRni z7EqWY^jv>bF8c=|v=mNmtA`ta+hb95JfMQ2m`ci@MXjs4Uk|~L5<0&Ze4)W_%hN(zmlWmv zO+;=*bc^{bD+j{VRsi_3`>gbXT-5czC$p{sK6je?Bb0TEZ#_1)(VvjHx@(M;M9^Du zl)y(o%uY7XJih8o>43KS^{t{0=!Df}@CS|b;vuYVNgBo|8 zL4`{_8QkDq)Jk$YCpYPEanh&R!?tPfF=H8cTRn2;g(}~IdrTG=*OWYYvXiIb>8>}# zsrl$rx3dyHxTdp46l0QZmidZ|Z+S;~6ZfukGg$rQ0mn@$cnpj`(|R1gyYe96Yn0=L zmuhz6eV;rUXfWY;aW1D> zm*bVp>#+JTSDfIAs%s1Sf=SBg*LagyE#-LK!2ILSos~P;i;bMsCC6P6DT}{&g0qLL zY9MwyF|Hcebxr^c)EG(|`=WI;>f+*{*nQ?IlSXv;pu{EgP?=Cw^(${J@e)=J4L1xR zkcAGdht7f+SG0h<(kSVJnSJzD@~dLIJsqEfbF-%R ztDt!{9qC=+xoQQJ3o67A{we2F@z}1{u&)`au!ny`Ir+s98Hqtuc_n=8kwi|Lu@-(y z0Xf!AzOg%a$L%dRj{?-ywi|OOcLOYj1mcLeu?Oj|gHAkPe(c3b9k17pJYbiY?yfm2 zXgu&RaG{v$0TYfBseWSWYGT$$W(OejmDqLxK|>%cgc-l~)Ol}sUnTQ=nCc=#o4h@f zjOjfYa5qu5kbb~3Y}xtRag+Np0fko?0#BS>I`IoFwW4X8^z%_B<1^qc0T*6KeDw_o zP>fk+_ZDB`xeHV^FYDyQBK^i_aNG9*KS~74R=W>AW!D!^*F=|5-SL~cBUK{I0cq+n?3q0?d!LzE>-;#g&iu%m zEW*l4d7k^Z%TE&=lBvy#f~a(lhm17t4Ued(_t~*y2_d-gS{*?n#O;anhO(7Z17U~N*v!vgUQ=C%1Q87BMtX$>b!Hz-Qjotu`avB9 zgLN|u=ne!6qN}?QF%Ebjol1;vwZkQeDHKip|IEP(#Z3Pzv()>4ky)xp5MDI=2xsqb zM!%AsDiky7VV2z===u{rLfQ+Uw)9Q-P#n=QkgXKd^s~*sa_#(5g?aQqLiM<~vo3tdr z8F8B?9riF;nb9KpU8*}#`#|jJoZ6@D=r|^cu8I;>lCOcavT1*Zll6g*;L|rXDDd{mpha z`~fftK7k$kqkV38>f=c*vMSc^Sx3Zp!4Fm`=DZAHAQ+@P=N0S($m7fD zy@UabhKT?$l{G=P3tJVYaC}3HWcvz)1ZTyTnwfF}T{-|4osB9#Zr7U&p9R&HkLy;U zv#wjbdRn1>{r%SGn3S?C8M(E9dAM|k^_9b0N~UbHsAtD!Nvm7A|A%8n zBzfep1>h=R_<$j5b*thznbJFHr}lU0B)p&_&;D*`XYgce zw|nYLa>q*zOtRQtFdrQoUDZ6Of!66J<^f+3eIeDDkcNu2eS zi`t$Y8J`VY4In{${gnWQiLd_#!cgWFkXfrTo1}^#)#J04q8-#5Dh?!+JflcE_Ysyv zhR9PN-0x?LuCLGhP$$!PIc#O&oaYgBGw2#jm_TO`sMUHC?%$@?%W^)ht@DSMN@IAJ z()@S!ZQ*^IBiB*airr5NCY4z!FGp_p3fT7FipZkIzq$|zKnk!Ni`<|Xt4~K#%f6Lf zfv1Jwnp4h8_r`Mj#?r0L+R!HgDnKvbr^zuT+FvQ~Y*;gUsE*C2kGIo4!r{z)s`1d#}>ioi7gMX=zu2dvU%w0!xRRVD~ z3y8C#g@iQBAKArc{b(S+2 
zbl}bYtgL96Cy{j8eW3*gm zqP$5AS(DIjdbMc_irXXxCL0H*M|WiUF9-*i&suOE zJbP#naeB^Fm7e-um-1D=@%AksCq)x9<^H02hys=y`b+Z@6R27|;BzrKV`N?WmYCR% zr~s7Jlnu?ly3$+Ujm_narMG?TQGZ*Bja>urEhgQK#?b%NbspSdLAh%&^Tu-N&5iP0 zq54}LBWXO|yL9oPH0Qz$RJq*1Qo`%lNu|XF`+>vZzuESsX1knbEj6@$_FB4epMvV3 zfr^;weH91ln1W4yfdEvd6tECfP)42!2?0m`?1ygq`zf0C+gvJ|F%RQDeZ4+tQ3%D@~$ z1DE8aXfO0;{^yrv(>(bHZ?0TcZsLnsz+C#DPZqe{ZR8C@ej0mp7;Nlxd)aWn0~2R8 zVB1q8gGUUxKqLs#fsi;JRtAA_q(?|exS~?2#n`_$9e@Z7Lf&v+?)l;u7(_-WrUmpB z&XPOVK|#RTcAl7K{#Lp&@cf?9b-yj{Kc!oDZ*`82YBdJlE-FbM2r7DTdFdLI@BlM- zmV^`bK(+JP1BR34i}$&nCf)`1oGepQ*x&3Le-#%RQW~XLZ1HQsnetc_-&WE0lr#&HXvHb7p5bxBXuc zY;IZeTOP6s-S>!J3s3!*D2ued`5(6hVdn_cM;{5}(XRraY}4w)3=e}wd+Hfa-w3wK z4md+x4srB-eLz976hD%DW+s3y^U(F3`i^-zw>yo8=SGD89vVu<=AYg{kSxf`Rk4|6 zKS#u=))wRj8HWKAxBDid2q=oW3+E~ot;VXX8;$}2W9`*ntcSg=Z;ORJuvX<^q?gs6 zhp6VcTX1gT%A?AG!)G^)3*DGo0Iq=r|2|R)-t3C>`)&4KJGDsh1>gwvU%{< z74&aFWtaSt(Ambe%$~hz^=7wa(yCPZ{1LrmqRKV^-xS`e@eFdeh&=i~CL@$j;e1;~rBHg-H**a`XZ-O=d+L$H^V3w_V;M+64AdOf?@)aFc zI3ThDUa)CD=Eabdr7mf3j0^<$!0vUKp1O3#ab-W)%Af1XXXd#Y*4A-$`+ z1KQqC6u6lio+O+5$e<*uWo>wMM;^p`HGRM@!b=Qp8T2(hBJ(|DOZ)bG6yTjedPKGyT49zcKN|V{*ymcu2I5+J*nA7-3o(#nyypLh@Fq>R(uR9M zy~4Uf&oZ7e0G2`+nfu2=P!*;psEf-Jc@^t|7{VZh9h@ZP$^H|;dS*C%Yf0y+eME5P zdlR5$KDd2qytxFLRPQ(qEzaJsy*iRDDfWdKYWh-7#6Y(K0LRjPrag7x%@N+*p!oDZ zNSb#yZq=Wi3CJyQ7{wJ9d^T3q0K~gf+SzXgA4GWdGuafUW+~lp(YwD^konbaf_DFn z(U+z4i{3}tQ${*AsTcinBUL0|@^~21wHWI5=^If(J!zR{py)*v>bMlbOg28tXwX_T zP0_|r27)x+=F?8pq$1Wck#&s6*GbIKRoXKY;J)loDu z+Xp}NVw~D+9;g>f%7oaQ8|bnN2X%j^B&67po;$`pzG;MCZo9VnHS%3TDP>c{?hyS^ z5M7Q8lQ)V^mEG+7i|Tjh_kE}1KWFB6P{-UkvU-EU>u7h;N(!B5QD zGVZ_c|MFjki>xOn>>%@|Hm2T%aumuNIezytQ?CK_nQhZE3^j8`H+x-J1Tfp>F}|i> zt&G0s$@VI)9qR)IBS(9GphYN|wXZ-q6Ra$`;2I67FbsbOVYVh3HkD~PNv-Vdh$TxW zOuaacr4CAehSy;eVSFf$Xf~V@fQ-X*oN?lu4L;zhtIN|f1tJFJpt_j+kUMqfsqTbt zFoO7e$HDj{=;YEFW)sj^sx-ut7dtT=D#@uRN{lWgDv{nZ;91xRAC=2UZk%$W1-J zC8O|>aI3H~fG@R90MSP9RD!MRsL^a8J~mjT`%XU!gW$J2r$p~|Exi~VFHlHRXEaP_ zyTHB**kIW;%`py?VDrSM2(eJcT^GjOF2)|lm+e}Oa3`YrLfEH!9;-Nbl1x_6g z_Us-!dUEZtRz^8>kbBxC3BQu_Z?bC=R#T^R`1q5)#r1Up$nZGmc&o1&W6`hCd{Wkd zo_E8X!7N;p<@X%VvL))xCbseb^Tv>#2=i0VN{l0SQqJ81n$?(J9`2NJZWR`nMBQO_ z@tu!JR+Q6y!##9@Ai5&6Y4RCwPhRfE^qHZ8MS}NkUmyVoo*HjG9ulpNlB$(zF;F6T z?=PH|%_|^tHckt*$E1o8hGC|IE}4zx-|hJ$g71;F-2cVE#Rly_6em~$A+kC=Q!mL< zpZ26>rZrA&6)8`1oQP4uFAgB%7YwiFDHs*L*Vlpe2IQ^>8k3b7$lR`CE|&E^U}ML; zg|GJ(ba3u`9#lO&9&FlOw@=yQu#AqU5*d-r1c8VO@yqLyP&iM$o}!$8xtHCfiIzy` zPpag)XW8M#J=PLl1T{K83ejo1~m{@!US|6{S{`gz$#0i~rs-YomVWXZ- z|4&BvW|TAZ`+c&WNjQ@1Yjr&(izbAX(>0>lVN#ePHW#~S(tmRzf<%R4 zqHyaRD?QDw9y+4_i>iFIu-Hd!>|MaYkQWDZ$a#6;N}PcVXeS;L^X1294~DD$j4jDJ zv#c{`H%~J8)AWiS{SRhWzWM>xTa$pKQLxruR3$zTK$~orm`Kr>K;i*0y~JROi1O3Z zKp?l>i#@{q+w|+ddCc19i|t^)`JqCdth&H~ygS9sswK{w7Qs%aMdmKJos%wRm&}(o zNUjgyyQjnN{cdS7-d1M;5d)!z3H4vyDd!q(VWyXP|3^jZiLU)Kq~;4i%v_(iy|5cD z+3Vmcyl#1~r#MSY)$1^+gD!skSS!a*1q3Ns!ra)`FyvAMFC&48M(?f?r>2*Klio`I zfFA(xMV`;9cRonfeAGWHO#3~6QDWQ#2aM~WIPY4ys@j^GA+DNBd$;opNrZ~`fmY|S zl6mOtVqBVq;FOTgG4|xU`)^@^V&qi4Vz091nBw%6p_N&C`Z&$z!7m3wt+$cc766_w z6m>@oruJcM31H@|P%BE?3WARPJZD;J0Ih4~~IcX0XBahm|i$*L@^yg^{>)zx&*jYjqt0nM8LUB;~8pafwTjxtBKRM22g{PvQbn4JMxo~aAv8#l|@+fq+5g1tf$8jmyf)~O9VFU>Pg37i$k1i`dg(~ z%%Lms5!vj*6DK2K<%)rze3>ST*e~&+v{!&>DI?V)_?g+#c=JoGb_o4vcgi#^i6|2e3Ya{@C?z zW^FBT4U>oMl=K(bQ;kQto~+pou!?epj!5On@be8|+p}yqt78m7HBhFIhm+~el@}<7 z_?Mu(m%{s*ZMGr=@lRsEjw7V+9bVDl^DEIU3E(*c5)7&mf2>8Xl5jdJ%tlLKN)6?M$dwXGwT{Lves&B=T6&x zyYDTn7M_Z_>*5ywm?Qvv%N2)9qjD`?#j z5|=9G>|k53C#>k67^nVWi)uq>i;C~)`We(>7^a*x1NCoPnEm}*lCNaui@$%IFO`QO zbuzh>m`~8NB&L)!E>BCbJJ(B8RQC6>BmC-bzBoqC%b`nfD%bOqM*a7OV|EX05s9m< 
zF`Yv9clvz^bK5WS`bEBi8MnSpF*ViK+3d;IX&jqPd+#XHp6ov%VhQTT2|2X{L=mfp?x!t{GmSj3HOF>=(DC-YRj_OQ>@1(y^dCiVrM_*$J_cHA1a< zO6<;>b}fEzA(?y~ff+~2;pv}urK8UeVo9=qCj29UW)h$YPwLt&S|OEbdPx3Z8a4O1 za9%NT65WLgLotyx@NmMzY~k%VQsUFjErS1xDp)FU ztU<^3t9PDv(4X?swY*IJZ>Q&Whm+R|=aExKi^eMa1I4dIq;QVAw^rYL>w1(nNz1a2 z28Zl9E)ssW8kibv9}hy*tzGU6<*dYR?!!m1^qHSCRP%ah2qH}LXR_Li>kvgUjM#`n zL>whcadU@m+#R0vj^Dl9@WXm)3+CR&+Nf zOa{axWV8~qy4FdSMz$KB8~`|!-8Kl>yAFGqGJw6>@FHI;%gXv!?u&gWMe$ZgVc;AoCS9-KPdsBd(ftulakitCN(QFt| z=GWQQo4%O?=+R}qM9;{fL)@(S@sNw>s43&ke(O#su@>1zlir-7xGNZ zN$2k$zJPqf?_5d~;L;6oaO~csK$MZ`?W1nF_q5)*C;Mz8+`O31igE2$H;Nf@$nq}>%?kK`d>@AuF=F-@yc6sCS?I6b6L094T&8Dzrt}#_Q2t4J~ z03_?}r`&NNPbHI{rkwpuF}enHRqHZAPSB%fr(|Y4(6xHmv!l2CUJopRYbOv399#>l z10g0DO!=@#?ugfwud`E=>^iSJLmmN~*@N@NX20etZ$>|mDXrN_o zw<+9TK}>wwAxdkdf!ftcbOFzB!85Sdi`-Xx@ssZgpCCmtom~#=^&p#gB@wJdp+e6C zpU0$7pr&p!%%l~jX-m61(7>DZtC$$MEwY8%_%({OH{mSjZaNReVZ-Mtx$BGl>Ks1s zIP>%^>krS*08RW?9;L_}s~R3m7!NMW_J!~hWc`?-Jp~@V+_v~;$aa5(^r?+o7(X5& z3ZGn1*1f92^FV2p_E$^~)(`pf!g#3DT%}ekmomAYDF6VWR??rl31}cckX?tQ%1(t) z^4}fTCG;iwFnbI=-sUYm20SnFA%*so8!v97BU0k`2H|Lel0xmsTvHMJy}-85{19-g zV6SY@KcZN5?xR)lzaB&}_Kn|o38dNoi0M2h#xRy|nrSOY0;XP=j@=dL z-7Ak-Iv*TH2QxOU)a`Ppmm5XC7}D-s0fu0Y&DgA-=k{xg9^I|0m^@(r=|$fV^xz&; zQX^%XT|cW)CtCG1ubEf9*!;9|*s6JqaDJf&Wi5>(mZi=}7^ zV~I4B_Z744BNTS|I>C%``Jn%UBw_xMufaS&D=E%F;plpqmz{UFvpm;wJsc>f&xO}d zmjE8AB81CV(G;ILeL-LMQnYo{$Iq}qjN-N%2R<*Kqdh~YgWyP3&8TWn$DKi1>PF|n z4yU~247A73D&+D7J->R!f}z?bA+DiJmy1ck~pd zj29~>+-x_6-x{>zUw!^Mkj$88)fz150 znR~2rA3oK`{&Rr?aqZE8+W+LcXY62%O1p__2GF%I^-SFqUXpcuYv-ja{Yl))Zbj#g zFE#XFEh!2xx}U_#-1a~_A#-9-J8iRK@NzFmpeK=YySwZ86(d6$KC@NRT0D5+_XpoG zANUPSjgTAn*p0gA#>GcC-ola?(RfkYa-J$5!oJp$NaV9 zTii<(!sG6{j8#SrTDaq_RswJP@t{oYD+<@}xl0x$<8Rvo!f{}E*W(~0eDu^4Cr-PD zxq?4T426LiASWZQHztj;J8LP2mOi4<{IjXq4e)sOFhQC(U^?ygO1C~Nk zrS#wTJ>e&WePnt;pw}Ty_&#ee;q)BPr8$SZ@SM|4YsG6hbvD&YG@8ul>`8q+PX2mk zruc;iX{y|zSl`Wd6=5Mx-Hd;Qqt|c)qIi9IZHU=W}00F(p(pAk6$Fe->L&+qW6RV1p~Uokfo8@@*C(T5w+UKp)2}Or#~B&Y0I)?l_m4h>3`F@c62^c{KfTocv1^Vu1Vy3f9_#;y-#{U`!~-Z368bo?>OtuXJCPvyjDE$ZbF>?ZF3($Ye~s1Lf5!nEl>6tZ!e0{BV$0b` z@QJACr(fT&4Ag!1@wtPe-TwU}lU~pZ+x%SkF9h2mT{=M3yu5EkkIPDPe0p zt~ylFH*_RIE$@IkK=SwI!%AYI{pV~PXJwI{HwIa9H}gL6#zc70eiRCn z<;myHlFf#H8xsg`kk9Zf*Qy;*(r^TLb$V6WbLpM8gCx|cyMXMCr>s~nP3FPh7cgBO zx)EkbVy&N5nq?r+W|zmAa!ho;;!b;u2%r!_(N0A1qX)xp(RPi5-{oiN^shi=Q|S>P zt~67{sx+L5A<1MV(A1|ADY;SJSY%J>fL~~cv@u)YI?$BkM{dc4gvyeeSG@Y4n>=Hh zbK8RMI_2iBk16O!vA<4o`C9cgk2g|qM2JV6ic0Q$iCMlbO60Bom;AksW{vBnp&$M^ zvP9rxX_x8G-GT-;KUf@=}l#wML z|2XpVioPV?H1dpt7)!eShJBkng?;ZDuNn8!qG-ivY#Ip>O%%NaV0Gw~CFIcehrloA z*n7PTOE&8Qi>u(TtM(n@xqH^H0h;cGJ-z@piQBm4m!dg-?gQm=Jz=gb74$fg=jdxW z|LC;cvia@jeOI;{-UrWwz`Mp%VX^?g%`l|y{t!&|%KqI}YwI3$eMN?lE~@9Ra1W^h zbp!^X1FeHgF}dksncZB#vMve!A-vyz1e774cS8VEkSRdOU0wH#9^DU~cA%(ZfFuqL zMV(z0{@eBSAKBCXu(gaxLO-X8Q6HZmV+#9y%9EvmWF_@s$+RATJdkp7$s|#4 z#a5fuq}^&dE!hJo2jiOw*T-m;&4HW6$t3PCs)_Qp@-lD)Srz&^)k4%RAha_F(o03! 
zCyozNuAP0uun_eZYZDEfopzL%_(^_+sMDqMoyf=eyknS;8C9X&Op_eyjF-2<7a|3D zw4ZCl$TgXmPzik)0M<=bM&zHr4KDs02O6P!xX2YY(bXq}o@_QlwIT_v1z;yp6J!?* zgTyJ^zWqk{%2SCOLoC#3anq%JE{rfVL_K!Ij1E#h#{y?2YAuu`e}(geWY$*GY#qCO zE>|;ras`Og8>-yjl*T&~v87t!Lt7=O3sySL<*j=3vE9gHGv9tPZ9_qgj=;{ApNU(X z;mic_Uu1YT6SwB(t$>l9M0v$1kHp-Dhw=eXpljOG=H=G#NLvRF#0e)6ZAfB7aFf~3 z&E!9;kkKD^GT!iBaiwEW=d*bGQ<*KI^U1jt465Y23|HrD>~Lsn4d%^7K#VHv`TwS> z`?>w9h35%Z-^Jz zHbt%NzdGiJn}K%=&3e=Bo<*XKVL*^!)&+-hto1Mfj_WUHP^2T_P+$4tN;!w%k;C#?4cX6aTRk&s^+dYc&Gr87J-P z7xo#>yp8M}4x@Zlnj6VGoz>|67NGsmV1jl%Ri0bq&bHv>E8F#u|TJPARnM=&_+$$i( zZ^4tlenJNGadxxQPfSsy%&Fk4Avt=qu2SnyhaH-cg_BQA2kj@gtR~8n&hie9&$c zH@E$>|A(#T!LRFP*_Q>#6W1 zQ?$Pwah-gZh@9_yvA}hHV^JqVPk&@h53$`9l9pAP0tV0Hq`3q~~O$S9>&G#N3 z4QeA~jIHCEjvV-YDkl-4W(V8C#-lqIbK)JOx3-7mc;9fYPO3_i3>jM?VKTokb1ikx z9tRXh(IO-g#w;c#)(1X0tIR4onq@rta2-xdm2ZA?H14N(Ay`z2g(O5aAUulIFKSBT zDt&t+Zf(UGyVaEI9r!(W46{wN!wWQF>E725-KcL( zCZdY9avWv)M1#+{`}V-XS!!ReS#e$CMCE9odDRmG=Y>hg$~j-?8SdA&9ZRZ`9+8n#X$LMyU`p*by| zJJYfRo9%Ww<9DFmk-`Jo2h6l;=WZzl>uFHI1z`qw=kfP2uR`L>g$J?I1|8=SF*Co{ zz6CyuI^~QRZ8qBix9L-zHoah5E>%WzXA9RE@Lh=x0vLD47l<~7R zE~k>5Q)kU-624%EXf;jkW5sX&L0A2%TMZ|U^sLsrTqAs94X;0v_*wbd-a}FE9$jZa z-YRJp>!VI@O5ks-6LEi({`5LkL4gFIv`e)n3=`wZ{l)vwl}8uQeEp5j_|K}<==b2a z=yNuK<&TM!;I zXBREgjmy%hmaxIne=bCI4R{c-_ei3&zA2!SJ`R6!Mmw~SQv|gdq39jIKDn(A{ z5b(>1g$6w&J9Hu7oxIG^(**G?dwB5p6_lEA?`hF4lp8YY5MaLl7u6(PQ(_%92Gz0u z$jopcGnQFRE+A>($J5P9wupn1Ka3-+U(6C=L2!ZDJ?rYr+T}{`|lOw0s!*Y z_c%WR=~CNNcI(*E^}+1!zkM++lSBT^Cg;qJYGA=(L5^{2+=As+^%nBRgNyKf4lh&g z%r%cItS5~G!y{WlVi32DVJj`=_F$;N%f~Zn*n=T{QWL^Yazkx=!=5$Tk|}LP+EFtA z6%>`NV|t4>h|x*+ES}Kc3<{mQmbz29&2Mx9sF3YUoUp8pCi!|&h)yTaHdLwpPj`UGT?1ACI6yA zJbqo9k8m)IzRxKevI5@erAFMl`h2awG&obGGxDkJrZY(z_O|YMArA1p-m|v)0s3BB zvZ#HAgx{N!P>4E(C#RCTleNef90AnBoLX|G?0{3$ocgkm@PT?OwiFV!-P65*zV^p9 zEk{v%Yz7{J>f?jcK{SYZ^Vn9w!E94Bhlakygx$d<<^F{&IfLu=i4RBNb64 z@649_TNtN1JF!fdy8x4MDlzn68XD-dExPpeMvA|%1HZaF+hQX;)2LYO*-Vz)Gow-9 z7&QNi_-pvTD(dEm)mD8${G98_%_{+q=up-vCQoOsq@4psNpqR2=OYDkZv%@X3wZFq z<_z}#!v8V;!cHOEZnC4>ItM3_v6)uw%woQC4T;nh-NeWAWk zg5Jb-Zqz9{t`vxKfxNW%NJ;_DQBB^3OWnofxYCD+Vu^MhOCQNtl3=&>OaF zJ$Dl4U+AoD`+;S#Gu0U?!fZC91#^G=eOe4ZCS%*LKJN%AminZoGU^qzSQ9PT?0Iso zdy8=m7?Q&|eHEL$-#~G{6y#n#4Xtql0LgzUKDw+9JZES!L7n%N)yLlq*=mr?q0cT` z(YYTiQrz1W*uB0cp+LL2iJui&*qAsSisBpPFWX%?Lr|}~$z2|bo;r^DWu`GCJ#AV# zR+NsZb}8*=@507YLvD~EXUQt<^cWtZGfOW!d1!r7Y~IK5O0O&_On{Vy7v8ZH?(UZR z#QHujSIyGP&|&Y)N45=j$HY>bY!}>mdN}q&D#cl5#cIg=gpMvra=n`EO!$yv9vxf> zqJ}9AS14Owz}(eYdFa9W#{B2wsHnUqxkdSQ|4P;!cP&nnzPygG@~4qL{W}o_D3ac7qenfTL?tc+(4LWnMS#RXeq^N9gU^-W@nM`EdSR6Q`+Kq2k3SIs#M-Swm*CkaA>mN}x*5_=0%E zOJ-YZbkYmDm^)Di--1UcJTSp&xyxrHrQdn>-yl%wjG=p8fy`T2K3Ek8qtI~KNt)_B z<77&Qtw^=Lp6zE}csKruK3hN-cZNvGAca6P5qZSXKS|g6>$Pbsq%LNcQ>iqw6Xj)- z3t+X-KLydP?rC=bV1j2$uFzVXDcQL8pklA!zNypd5*Uf$^v%R|zbO=Hm1RHZ^1Xt;Tu5(olKQ;0Uu6laJ_HMc1)AA-R;MgoByJcY(bpxp2Ei1 zLeBko^Nmpzlc+#=FX$SF#efTp zG5@jct~R%Q)neROk=rxGxQyJX1oa1q#W(<(LR~}All2FrnMpixIOpY6>fV~OL4gUE z8KT*yDBN&C>?x-VcuF;(l9dvO=wAt_)%Q^_4P^WUnnF-?8^_Q5rx9z?`@aIOk_>?% zomm~wEVObxyM*yWPy1R?G-?odbk|9rD`gsb^*LRD&C{0s&HWR{CcgTmb1=QoJW9~~6n6pDtKJskmL|2YBQnUu1drW8w z8JoR5(dz(}C4Tf0DQj@yN$0%YzzF z^nIhY&MT3LO7co7sTL$)ffxC*GP}=Bt!kHz+yK}6Fi?7!2>4`o8iW4iwSDQs%$tJx{ z3aFX34wkr(W@zgW^s%nt_+Fjfr66CcI&5-2s@*1R5t2Ac zTD{;9u908osI0q;&Cl&-BJgYqU1t}x~2pXPtiohxNU3aqU zXR~x7H4gC!&SzVR4J!_t#NF2Syf;a2apyIm+k;vQzL&5@?|{OZ@^@jaOgkwmvYqI}!`K%;Qo5nDlvd%M$e|^E)=sB@ z-a(>K_WSVHkVFN{K^3__NeVT|(L69Ew3*y^I66q}X?5b(!3Y!P3)`1o0r1UO`(iz{ z(nK@7^q=GkiRk_}Ldc43F5L%IC5Xu>#vDV|<7L(pgPTZC1+T!Of_)!-VesAw_n*BU 
zF`4C3;y2X`K@u9?+|-1e!g)rPDc?jLd0&P}I{`W={o_T~mkQjSGxxLCM;^-V+|M(+fZ5goK;w(^6X|Y&(L!of4t~>FBN3_{<6qDmg-L|)_vw=RRDIB z(tp|6#;*LeHV2yoMbPTA?N)YZ23}s%va?LsLv5ul_(_9l0Gpzk_<*(XbD1>67d02( zgS5=R(Sa6_zR4YPyl>q%ha=mx6m%1bQ`+^7zG3YWdi=hB6yZ zuKpUP;!Em8X8;mAO~-ukhktC^Klk3_OW4q}A09n{J6xhZzX|5^^Jz7lEN_`*(LNMs zEr8bLoSWKAF-K-(2JV3y{-O#Bwmm5z`lyXdRS$YWFS``kc%O*9{{h-U;&@AM@m4wm zwyu7mUXkr4M~4NnVrjOi7_HHv))uDa$N};JF_k3g`$-)GEcA0~r&US!1iJpH44$Lx z^0`9P+!<#TAzxPY~C#F-n8U;F#hh1h^Uv5S@SU=X~jz+R3?P72F>e9GVnX ztw9qM_>iaZ&*w{qRAx)ab|jnLBAVMNc{d3&-6t^>(=y9ra-VWMtFknL2QhYXeQGrE zRX1BX72o{%+Ba7hRbMa1QD0w3%6E0Of3qX-a!-4yT=9N4Uz*O{!VhT|1uuMO&_NHn zen6eC*uC=RX!ah#r0frLI4_p>v;EtVJu2hBs6-JBbFBkD0!NfP3+!z67mYIN{rP}% zD9*-W|J!4}Lfx(tV*n9Qe%yw>ndKw=T&iBSnZ-8#hE^W(l6t3lHO*M#UsO70pQXM; zarv}K{<^avnr_~7)SK8pPQ4C!H3*4Z;pTm9g4$X7QO|!`PYD||>T&G!@?4l>f_Hf% z?ERDkO0>ny5@HUoX9nDiI*+x~G(ne;+nWs?!QrUWh%(f}nPZ++9)14rn$)3TC?kK-d!LbMP z{#*?Ko{%@VRc5==1P49llZ=F2=hUOR-QzXC$r9!SxqNj^4>&zRdM+4doa6hoEA@y) zxGcYec&ZS{rbDv{`FXV8?&ZolLi5g3s19ZF;xj@BkTIS3i|SgR`bk6Oj-n3hrN zwCs<%jGNYQkr(Nt$T_IV-XH~(pQ9zatCc=9zO>uXe6~RC3fG+UohGTpAEJs_2@gnMrVrnoh%%a$8tIJawe>8cn%#i{HQwfpFp=kLD=%txXI=Pbof`y z+dbU&!HLSky(Ov58RRh|Rp7iK^;Sv%G(e7dY=|%DR_Ms1Cj~#esoXBY9ny%2>Q_w$ zTh&vGXSOg_b9>5BGKTj`-qeF4%hjbDqUqNLyBb)ZtAd_5DV%8&+wneH#Q3aUA6MpP z^t2}YmzxCU#nag9m;(RvezuuW!6$)U&hGY}qN#tJ>G6tMJQA(K`K{*_ulY(Jkngu& z_f#-=VqIocGPe?=yy$tyO7kM)oC-O4=pLE$YUe$_^VH4DdU#J|#~3mqoTR-`IOjngLto z2Uq|3NzRDHCSBV(+ ztY)<_&RKzB3E&0WN`hn?V}TR>%d(FN6;(7#p-PmJ8H@Tf1uB}9rw~=xWuiPFej;ET zZ#s6Y_2bRX4gIV_YrUuw-I@2d#I)pYGZ<(cV0bV$MOr+;-rsM`W$Zh2c1RW*O;`E( zd?cS+WaKn>j_*hmT{Es+rhY4kn2HU57+8+dJvD2^mfbB?;G4ZS&QsohSJ=Go*MV3& z75HHbW`6kHFGa1=u*XBinH3e)Hl^*%fbm9?$p#rBO_1d<)qfH95q?ZuC&K#E)8maJ zj}INXCE;E0YmPG0n?BvL$KM_StyNeE5K@q4y`bwDCIIB&ymck7EAi{il#e&>pQmQs zD9KHD>lyV~OU=#wv>w6*&;cdzTXk1a!8}PW(y!|D5+B_@xg6fRznCdQxmhBwTt3>| zV3TCpsTQIm-(j^@VY55b;TL)->P;>GYuFt~4y4@dQw~A4HT*4_v+;Mty$X>|{tPN9 zExhqhL_H*_@))uec)eAJAhLycw?8cuGo4^vy`~X}jgWIR6#25;UK9T(0b9eMr zqVZNT(enr>p^*J_h;haKRo1{x2m^cqK{&y=p04_l7tlR0Wq|imDNYIiLY_ly@?TWv zA>w_+x=7(+I15ano|w2W@$Q)@YZUMAh9pD~tMWM|7rQ<5Y7{kO(6`S<;d-`ZVa3am z{cz=hWS6{B?Xp9 z_XZz-s#oPkOlpU}Al9POD*{3K15)+Q99q>@R*H|9+S%Wkfcnz1(sY*gEltQLX~YlF z?fM1G?{I^npkKSkj)A5e!nqfBh09x|5tIEmmE0Hn*;TP;sQX6wO3b~DjCP?fi-8Yx z2`k;8nrVoe33xgn_DyNZxvlDV?_jckpmx7qROH$BzqQOi7lBwC9*0X@ot9nLxJAjq zLX-aD$Pux;7HY%90iqQ)A<}AB{;fGrUcP#%jP6>A+I2r=Qfbf!Ks55)iO(cuZ3!!uK>X(J)4vxR{eukCAKY z^BGo`=duk3mSB5tAHj@|anD7feMC~e<85hr-;!PG6Y|11PSG>=fu2ZS{wGE;x&Nx$ zZQ**kPf;g~jd?Fb9>1`AshF4;KRT;A1@poab-HDcv@Yt5-8(B%H^u{V%HmAP@EIN~ zD{EWmy-23@$I?B1&+%q|n*?ijUtYav?&s%mLcsodw!{Yz2(Y|`k|i4xLf+5}t!5(> zZ+AYM|70EH@zkhij@ov&+qszHFpY~qfGb_O95YvS9DGMQb9+y>@$NrK21Jw~(O;z7 zOdozcNPetYuH!M(MFE(^-{y-QG5~OLcDcxV>!{n3Cr@i>2mu|fCRpK8y$esvuvTkP zVyUv+NYuGG#(<52oqAUfJl8J(h-?AES1=YT}hwV{1i+v7@e=^kqOiuH# z{sS@u`u%A(xSx&-kg<7mBYOr!UkAN=FLr}yLW{WK%lNR^fesYjSp+OlZGaVk3;GMT zdI@-e=iwgWE27{cz@C*c6-o$U^y$*$8_?_Mm@Zi8hSM%J%nc9qKT z+mqtI;YEhXOJJs&U{jViH)|aymT^n8Qiz%q@|pM+51I^7$Nb3AF^~rTAjeyhsA@<6 zx=eU~Zoky+W&d8KnSM-N1!SU(0^?if9fCDOj-cyU=BVS6_WKUQe?TitWfuj^E9;X}p`#uakWf)*~pZX=m{0e!6Um0G~GAk5^; z0l^$oXhAvb1+ha82101AmOl{fk0(TkdzrV=4${WZ!W#|)55l5HBfK6x0O*JxfSDUM zS^%Kx3FU8OC=B^)>4^KeZ3y|kk6-wJI?iRXBGc@`4Pam{6T6MeL-(H$WJbQOdw0h^ zQy;t`OEFn6LeTwz_QM;&8iE46|zPPX}5%#W>OZ=ZiRB&FIbFRS8PMLO5X- z#K7=+O2jvFdtc4SaGPr^?>K8A)ch<%3L9;P{)Hy%34Dw=%DW%s1kKRRM0K_9GG5B2 zQ$2l*E{l_Fyw{H9gtMhYnV@itBQ*VO-RptaSGa5BLei!78Ela4kO;dXZNAZt8L}50 zH87X_UfHF_PQ)bBWD&N(Kc}o_L^{6(57Qq~Ip=hVIrB4qEpZLMK)Hg8pY8~H&O2z3 zSu2RhbV;O1doA+fYz@}DkX;pdFRCvOI(t?*FXZ8zM=GP0=TmApUNhA&zgFUo(F~}k 
[GIT binary patch data (base85-encoded JPEG) omitted]

;DV)T3=2)rpuMV<2I9qO>dE1IpExxYu1X|`5i^+e(`vrmu?c--u|w1wV1k52|FsY z=auunsuREM`5JaYsnsRSH4rE;wU$Gj%mw&bcdnL*kjC-ZVC=k^0ctzhg-Xvm{;WXl z3vF4aKyW2b&r(Tdiw++QuA3KzY&W{j{+2TorhZdr&6yrahaNgPL%u&RHF+trr3?4q z8oBldB&*gFc|?Rxc;IdYPIi8TOqU}!A~Qv|9^>oOSL+j~lWtUV<0#Qq$dMOq9#H~4 zdcPmavB!@t3vV_&F852LUXCJniKo#k#2vtEys zSZt>j)e1s}S5vkQZm?264Qo*`0k7uJy-`u47qJn3EWjGrmXo;_2w(} zk}6=*)quH7?6tIS+i_ZN2!~U|vBRHicWGj9c-GYMNF?bT))52b4p<_dt7jVk!H=QT zvANGI>2;uT;Qe+$Ms0-5C-pAD{kVvs(krVR97o%;Wmpt8{NW(O@KwmMX$GVW(*nm0 zzW(|L^eX2M2mo2v%VmBZHy_{qNaoW(c!MW8=kKgpWud;wAqnJY8nEQOfYNz<9XWXx zNjiWWz_`W(=V^V}h{%5Ur*TjX~qhxOP>6fmI9?cFjUsP3bnU}7Vdv;;XY;93Hx z9}TQqytb=b97nZC2bW0EoG~qG4VotX0qFpZ)XE%ZJ@y82Yk9z`_5(%+O9Np`?a|*7 zVatuwT!c9I$nhm)YYG{2oi_Lf!~vz$ngZX46Edj=t=@1X1w`x5Swn6zHl3V`w8us<(BM+x)EW`U0hD7paEB6d{1E+-Y@va$1y9({_fi1gLDaDqigGOn zsH#+oT)qtY1FEpYHlfK`k=yl$v^&%+Xctv311tLngv+7AtiGYU` zbAWmH4MIA{`Uk|W1vJ_L*Zl#-VN3Il(Ah({*hbPi{=mejnBsg$lD+&4z2>LL3CS672|E&&@2w*Z&Bwy{nU1p zD4xibwzf<9un$tXCQQUIMPAl7alR^#kk&RE>Dp*e&d0k?Z0tc^AcRO6@KZhvErOC2 zcZA7C(T8jjCylc1i_lqr;%o~nz^j;#dLl2ZkD6}Rd5auS7e0!l8cTMyADtLu0nNF= z`;hPC)gUbsoL_O%CaRrMx-&ZViLQ^_^NbreU47JIfQnSMB-RDikcqDTS+frRuPXDS zdiT{03Id1YqTelU8XjCq`n8N@$3w3-4kkIU+Z!bnA_{&UZ8&3v)+@7Pj58JRm@dMp zmzpeZ3$FbFOdc?9CDcrHNi!cIJrdHh;19@d%`~ZIqPPZr%O@kw$tG6c`x&M)OcJzu z%EyM|cv(rRdEU^-&%xLm1}oV{p|UJpu|j1{krLbBvAY8!3-g*aQZ3gcnzIu4WW&TB z^#>cKzl~KclD^Wz2@fuac3WPAm2Dx@vnwT)$_sd5Wxe6Km=Ww`a!+?Xn~A0T8QUo1 z3Zn$OnXegXmz18coYp`*gfTD>WV?Sja^~$i3}XUoH?cymdQTiU)b|`RieV-(<6gge zScB9r+C6(PVJG4({)Nk0{-ad8=+vkZpPPHfkG_^$0hEEIvaSA_nciYwW3z+&SpJcp zSPuf{Cv+epqdtyWxFz+9rbddCc%V;jQ{tZc_0G%k=;DTLsPLh}o?qYrIg-qubNy+p zh@`&pN@P-vg=M?o6;w%{xcB%NEwc#hrr@CG*0}_w zNI}e|%Vi#Jl7z|1S^HN-i@9Sy@A1Bpj3I|we5gH!uv)Xw-^rGT6i>T{GEMR>j>UR-2pv z|Bn69n)HFB9q4x{+FZN9Ttd6lMf^we8?&O*x(syzd&l16ZTwgP)pgs6wPY-bY}g+5hv%adB^B0I}5A`{Yc?TsH$F$#>(vKVRHY1N>;^tF2O1r zI*wyZU>-%9r0}&(dTFp$A{(77|0K>gRO_;y<{CFkCH*x0-C(9OB(Rc}Hzcs~3t$SxIR03}BF!fI7AqakEolrO@zC9sn z8}Q>n6w3U^Nvyx5JUSe3KPwd?ipc{rlD+ZV>9`}We!}$Ps~2cv;nc6+0!&F}^L#

V;9j)XOzZd0dT)?oMG4R6xKUl-1(naxzU5Ph(WpzDX1 z!MOmd0(dxgVIIi^0C&24uy=N+i$c+^sa%z&M`fNmUs`YO`C%6YF+}o*UKtsQK|Llz612>b@mR1u$}MH zFPW-F2A+Nn4vBuxBg&(;anH1_@4I9R?Q&&#d6^{0G3P$Yx92pRPR$J-^fJIo{AfwF zzt6rsWxBp-#%p6WFqFCY-WfdQcox2f?&;<@KKuhZ3u(fTne4EDY>whcKrJkdhGP&n ze6ly1FwChhQhZ~!h#x+d%Y1l(q32QJk7YHbZiG}4ean3NlX(W0U^U)mSx)Z>N9j*~ z;PnmaW)FsTf%FVkL-h%!A2}9|J6AY5fsl4|-LVt)x47}tm7Dpo@_~@LY5FWAz!BwH9}Y@D!3QX_a_3QuZhv|e8{yoTOZ&AhVaGu>$ZbUHvP>5UP&;QV^5dE zX&KGlOx$#+L9x4e$(#x{MK&nhdjX}W_<{THB9gD8icK^+M<{B3ycP#)20DU zb!j;7cJ#a1vpZLu==A(!6relu$H}s0w+@>UKJ*@IhXC%8KHVH-X%(5T5!tKrm$JC} z3(}>;2XmzDROHrOCRB(a12m;^lEXPUowP?z!n%hiU&2nCv{=7-RcGWS_<(-pOFKle z-@t6L*IU)Ayts7mQ6t-VPJjo?2wIHyIr-za>?d_PR(?euQ?|OY`mKrr0I84oY+s!) z{FAq@c)-&_FLEu}!y?Y@owYp+z;hd2S8C`T+iOM${3s`YTP0o$ALN1Jo>sBiJEcc% z48XvxpT4Y~Xj?cy6whPB(dRb5mX3I;sWg|5QM+V>jptBAp#`B;iCxM<8{@@|T6MU_ zNZ{a4MDk;7!{Lu91tk{0fhEW(Ozn+%3n}K*^aO9aNLf=m;ph_wHXPw_Y&t(F56HYK zL7w49BVJQAm%`6$xrP}F_Q!M6j8?!L$d|}4H2bTUarOke>VAVIxJjFk%CT>85vN{y z)lqSOsM}nvO|+x<%~+nnz2V7W`OkGStB$ZRK9mAuBz)8#9bpKVcFfWNaVv|eQ6P@> z!hNB-;#bb*9~L!vw1tG33s|%e?eyoB$ne8xI<9OKU9tW)fThvG{`|daS4k@@S3#I6)?s;&~xETwyN06B-oE zgnI9-eM8rX$##kRIx zPa0OC-W2Y!i`5-aaI(C|l3=5@8A5s%^9>RoqLXV!u8(U}Fyq<$Sb5SXRo0D$2g(M! z%|k=c%ucvIXHPYuCW6>$72{;4J0WfNeJ#}pq|rwNl4G0cbrvaw?yCgAwGHOsBEG+{ zBgA<0J6*vr%1H2BzKd-)mZsy9GVd+T8D&hYCL5LLvE)x~R*Ml_PTJ>A_`;P^K z{KX>YCq+WPb^lHhPFSix1EP*BC_iIpRX7&BmyX-xmvr;7-Nx3b1L411bdzqVmac#3 z)3TD9`j1ZEzxYR_WoTxNQ9&J;Z^)PEH?#PvS}ixvLX}C9ht`;LDkN$7{jUB+=W$Sq z+U;GBT7P}5y3Ain zDloabV)6aj(m4M{^F)t%F@({XA~IiaMtv|a({l3eY3~k~E7EH2hKws4hcb_;-)N^D zMxVD0wsAUMbNbRHT=58~Pbn^rxY_E%@~Yb}mkLb3Xm- z=6@x8t>sMtKYKjOb+@>HPncHmfmVa21-yG!&j8e7@@IcmPUew-a&)lxr+mHqEGJ}v zB0}4!Kxbp}Ec0K%PSsz*jxsq|ybyswG$|S*@%g|5a9O=?qU+a-38h3ce|Ii#^OM?i zQ=?OED7Nnf^aS^u-hAV_ioFsFN>s8@Rm1493OMm+x8$z;V2!qryGevFRXcZM8$H?f zyJxa}MlgMbK2m#sSyom|f63;V65FUi37iS$&0@=Ryoa+$?%2B4 zd?|})crnX^TNFyDJYUtoLq9rbDGA_BX}Fe+Oj1ee`hDaM1p0xB69N`N$%g%?P#cgq#U%xq2>%1#7yS6!9E!kiC zVqyHvcQSRY45W*V+jZvV(Itf=CO!>iQvJ{pu&vQeAlo6#)C52HtN7PL8;l?|u530e8FJ=w*|1D$r_*B>>Bqm# zJDA<$%@&0p+!1=&$Etf{Y;>a9X(Y|xnxp5vyxe%Dx>>{T{_`yA`c?dFu@ZX_j>_t% z^{=Y>+zeL;N_7clW`#Rd+jydB-b97kQSvmZ9X!cS)0wIG12UJy?thk8BGW=Voje+h zVZAOz>%1?xB2Oofv!hFv?l5BX%FX5_lJE0k{ZiQcvGf*N zCbX%iboiE|!O|J2ZRxD1QqS$PoFgMmP<YY1Lb{72!~Peae|LgX+Jc(i<0D(zTQqL_Um4nDs&)@*_3p+18UtsB@N}qFTWQR~{3inq}pql0IiI>GAM?aDr?l3R}Uz0e5&ucZz zhM_r2M{kXMzF_&N=IfBt-K?f$e#HZDE!=hri&7xX5@5LGI($l-k|o<{v5+QxocX28 z(w71@-+K&xrT2QKsVj>Z45_`-w7fMCZ7we5Ygz3L#iU0Z30m4F7&iNNh(1DJLW z_5GHV5a|T&qjf09ic+D_DNq$xf|#WLC(zAJP|>#Hd~1T0P{iw_9T7gu$?P3}^)?e@ zi-Xiyw8D4H6y<55fXaj6GGFFfj}s4nvf_dj<7f}hA#(2>U$s{kCt>H|0;F}3$1IMG zBFShvdM@yl?fB=P@;Q5dU+;3)rVArKUuZ!ysdG;@f0NR?;scH}tjHz}wU2Hx{xBIjwLG3>)CST41e~ zpag=lYwAK~PL?>5!9Em)1u1R@&eY`upIT%=RYj7Y+u{3Agc)KOTat~&590`iJYgcw2p8)-Ug_CeUT<3-*Yy0=G5`7x(f3f2Cqxrm?5{P`LD12 zE;BK`0dv~d3i@|*WZx~9J2*J|^j&ACQvVJsI+cMLO>@6RX$c}07ruFnAH_K(3OyVY zsvqXi`j~5RDc;*{D0}l=7&7$6w)S#0kT;8lFutz4zjg~1wO`<0s$!~C`}U%Gw6_(s zxD_E0Nk~aA2!n?v-3OLh+A1=zUG;1w+yn?+c9^3tFW+0cZ@P>>#I=TaW3fmD zz}7K9iG;~y-AdyFpC9WvquuxYt#x#}s;&F`_3sjd4Pqoo#}@A~&*$Uuu+snYE#kP0 z)1wcv!tN?L$&0QJfAaG)M|)%P=62XtUzVkF8cyHJv+On8s;_oj>|a^-6~+^Yk+2Bd zMc59s@w2}s0ojB(=jdlqt9QrE=7OqT?wR-|)x0O58=;=h059HVVhJgb^$ktNvLokq zIoY{}ceWw7M$croM68|eFtR(S?|N3V;1#XwUE(&wcE>k?q`SkRo0p@F?I`9+`(jZH z_egYA!tQoJrv^t>roB=%5o7&zEpORkU9R1Q5h(Sd{qzc|C6y}Zp{J|wC)A={@#1Aa zybt|0KqS*gqKWJsCZB5^JNmdHmOqpU`}|GM%=dQ5mIF}*k{JQs!fP9e5GEzG@XBkS z11j(Kf}&3uoOj9};|5jop{f*)c22wP_Bak_=|)9c5i$PLYgtK{g6U2Fo7vC{68w* z^Z&!o)#w^Mj*L@-AiG@>v=|!4%Hm3bg>;D9;`n+2b;AS5p##Xla5qev_jCSGia$d0 
zK*UEwKMQye*b4g3egM{90wUZGvi1>@qEKc+uJ%neT1WcvFwkz<-_WsMsTRxlq1y0Ys7XG_ z7v=BY>FXrDVDja&iLllQz~Ghfhbo}5g!OPiej*!>KO3&!9a!i7Si$#n`%Fl}Cl;ol zP4F80VZVWxX!X@6(ATzWrhxUY{RmkT^(xuRIQN2BrbTX~3J3x>=qO|q%NP5v0^a|x zZbw4tX@4p9ynPCheaWMriG2E-3yxTD{1z%BCg>UEtL z!2Q$%3?BY#%;C&uBPl|-9lq5PA7l&BKF~(-tmZHA=1d`G4y*JtH;dvUADcfmKU!#p z|2%9$GOrUhZjATl>M*%Ps#M)C7Gep1W^^ugvIKswUqh@>DlU`095^(ii@-7ddtKK5 zG5-Elvw>51x1P%f0La$a5D;a4*caHx09!yE4xJx47?jqc?!QHve%$Ox@1i^dM$1Ok z^M7?J%oLPfYsp9Ec2st^`Y=)~5r&&atdblc63wZG@Tt-_2))RESBFmmt@_peZuNh* zJ7PW~9g^D{oy?Jbj&}f>Y2RN8nD8xAH@vlhUBP@9?YO@L&b z1=ZN>9aXAuP+tx{;&42?ie;gkSMnLW4Y#;Egj_VjOg-Fb-0YzUb^+aEOXcogD);{{ z{M3#v2pu6ARn2FrZTct+SL(!TF$*D^B#rszKOo%;sIKSP{<5|u5T4bHNCNcYUwvQV z5Yn`^M6!K=IxhEG*o|?-H(|YA8}9D=p8?EjAaC#w!uC4u) z_tOnKQol=IJ z%$<8i?u3yM41M5|_GTl(O0)(o=S+fDzgM@iRZ{XjTE5WwS-uZ`%z_zM-3HJz*Hb<@ zn@2dY0114v&MVf^^>vwL7q}`NH$`Ydz3Kq-3X}c&M}88=^m=c%a)G?hC-d*ru~jT= z&kDHL-vavg4!5Lm?>qq?=);P6wNX=vtf9wK(sS{w*=@q8Cu$OV7qHI|Vn=X%VSv#J zzYNi0MRc^|6{3tpg$uQuf0!S?jv7@yS(_F5Nw9-vcg#S%J+c_yQoR$dO`V?3uwZ=| z)Z}(>ftGYQjj%^_vTP8H6%3lk%}b!`!Oy^$kF$(1OrWGbs&yW?^0cS(BO{>EHfLsA>PbH?pai#-1ow-LD z$D0Fj9ZnO|3HYS~!~VaORQZo#hOuQu{++L@qvYXUavGDbGnhJ-SWG?T080q)tnrt5 zn=X;k4fv5*BUEte-o!|#(s0&c$!?CpkaSs~=7v!{Qg(S4mTG+9 z%UuHQ8o|7+z3Dx_!){;C`*Yqx^L3P$T(S1iCzOVHF_EKZv=3Y2m4`#J`T$&$bn&|2 zwpnO#T+5y!Yno6-KuCc=Xc$QJG#1i)M>j*?Pbu%1W^HMBeekn_S8`PESF#cg%d%$4 z-nU6n%V+|;&7-fJ7$TPlZnnq!CzvJeVk01qGAHfR=J5xgM@qz2$BtD+Y4k@(ihC_< z3$hH_5(p14l{K>~77P=kjJs{`E+Nkc*Vkt|g=c^NOobGHy?o^fcu7r zXkuJ8Ur*ZB4mk^zRiKjNVP_)M#^0T+w*hDpTiUpDC$1Lb{RG@8C+RbhsfUbi!Pm7l)WX(AGWo4ssL_c~4ir>3H zu8qEk$j+ZMGq*I?a8ma9kxWk3~U1>f>EH#1xD z9#0CFoUUzBZaB_VUHs%*sVclNR58>0tw+v5nUTNO-@kyoipnVsRFnUFwew;0wwajF zTYl>+3V=O~UB5=)x^W-ng1*x&x(U z-q?$(nn0nbprVcMlkxsJPq7=(@ER0glCEbi=~nwwQtz6r`^lgV;r)~K4A)z3mD$Kt zQV%b6j&Pl240Q5d2QUYrb=L6$+#^AmkrxC+9qY#|due*TA4j59;Oh*-*4)D0!YrB^ zi0Dx+{9co*54bTE7G_1~`AMkTiedoi7D7nts{N2C#T0kqBV$Aa>!9t*;fl#}?AK*%T4r zczEHE4)kdlP`EP^GC$>DOUoDlc>eDs_KL9gurHK)-*^JJZ%tkm1s6q;SdoPshOYMZ zSK^RAAclb!KtG%SM*+^NWug&aKp=NDF6WR}nC&*h^W75?6{E|H$Z1U4Mm;GHK}NTr z*=HbunoA7F#lQ5B8XTRgoJ{!})y^M|Bz!%XMFrf1LTP13+tSXS-;PVjT;dtI&(_*Y zVM}@B7`#MdKS>T(o?_P-RkP1k!GQg!OixE!cY^0)o@HkU+ZDH+%zpzC012GBRZosl zIFlEBbcQj0whnKPTy9 zBTa6Y0UwJXDpEgc>7L>^tTdOjYrHEuS5;&W<1!h}r*hehNh|`Wa6yA{H73-W(EZ&9 z+y7fj?!@8NH>XO-w#rGh6At92%Gr0xO!)x!#Lr*MMFm=I)O%Q)EsmrM(c#lD%o6+g z^VN$VqgH&<+L7Y=Nl*zEhE**^3@ZXXjX0q_uCnq^mz z>ay?OGH>?eA)f)T3!*8hwmhJ04j^~Q5o3kFK?dPz44w#<)u{2=# z4K71R0xUXJ0c+e-FuSANP8{nI&L!*;)jV50469nO2AwD{24I;;@Uz~|Y?DS773aW# zgd=52!E!ap9LS0K`r0AZntT+rPC#qRkPs55dL8ux;-aZf@H!N{K{#6|DRRT~?t=JU zF8!|t{M{G8O>{)#R(8m9>Kv;#d|Ys&;iVt?H2cR}h2J21O<-Mde?YUZ9I=?fPWEM-qc?HWEwiBj)4SaB}8j|=Dd)e{ecpgJZfN&tk zWOo2VJq5(=(cf;Y?cjR{NLFN!!RWsFkN>@pwHNZQa@PY{=%MF&Z}LB+tvSTE+@#j| z8?-4f87hD%fGxRfoxedyXt_r%a2c5pZ+`~~u0v{i`5Kg4& z+Cy*Y0CRGN3otU%Ue?t%li2rdTkwbf_JFI`FHmCucLJXh>IQg8E#M~HjRquy)>Tbt z2dLfjdgxWabq3YDYrCOp4LB*UV`ZqA@SW$Lss>RV`IT%N z4&VI)qD=$fD$1Ay{sR)RZH64|L-si_BMy|0|N71`5oo6O-W_NR<3O0PwR+Y3VeX!{T zab^9)6l;6{nk9ce&mg%Q(?uBqguu`_4+2xWsdu>WRC)BHibr+EXTr?VTPvUV|1a#l zXH-++wl0jKVnMNh0wPfXQ4kOT>4}JR5RqOZA_5{^q=hV%CN%;Aib9lLga}BHlF+*# zHS`jCPbeXzd6#>ieeT}p?sL9x+%m>DzPo<}M%GGM>&^SVbIxbZXFjJzyF`~ahLaj6 zjuCZ{0wm!`8SZC-_T}Y4eqy56CN>X7t9Twc^6A6?k;3bdVwPK@fUa2ZO@8;SBt>wn zz}{tjn=T7edG@tHdPVa&?Xbzj$oN^g8I@84&W9{)w-1_Q;*_*#R+M`!2pjI+LT@d$ z{aSCz-_TIl1(Jac8v!;FKnVST6@&UpJ6c9rG#;B7pPlepuX0`yS&+Vz2C}h)>)l^K z%ZgDF`3xC{r9%Y;j@0An@D-nLLzcqk!m)PnpKsd8n0V%8K*&R?C%BFLmWk<;Vbj`9f5xz@h@i zSIS9lN1bW>=t~7p`8)r7K!jF?Vfm^uuC2*6=st=TLf<*^ 
z5o}`$^F?$4GxBm~T5O}{XE`&>3M5H@l=Re86(&uCuCoQZDY}XpA_9%OjV?=jfSm&(4I(yy zlL*-;D@)y`*}hFKf^ht7+--ZXDb8yJ_H77c+2ToN`UDi6Z_#1zb?!K4YfmM^_@xZ7 zlsMq1W3ALZMwCxHcR^>&DZ3KLgLV+h*8*=I2x|7f7!UCeI9ms%a3AnFFb_c!w!tpv zatKM1t>#ymih^~>tK*YY#aZ%s;#XkY4j%u`)SG?t`}4%?@R+G6E^DU=jt;jU%tV;m zz5|avk8|y*61Qd?Vxatbe78*1X}S+s&8>?5D0}ru4N5JQqwYIz=hxaO{X$;$MM|7> zym>l!HBp$LY0q*xMp+%~K^&tNc$Q7-FAAW06MhZGMCf*D}@QUM3!laSN;{QMc%KND9=aNjR$Va#&E8l64wHXT^=dTe>&O zl<3}prF|*n0m8OnPaOZssDyekvZ|jk&r*;hYM(wzAa2hOJ>0y=Ds^h-z$Fl($4RX$ zON?|I$88bBvfiJUbRtVe*95h%b^=_Zd!`+w#JwgTg|!#I7!5&0k(fgqUrT}Psla_S zC4>x|-?tII>>e&1c&vnb9+p8>x*jP*>klm@L!|HMD(q*-@ zdxdGtCp%lUlBjy+^cA!Pxt;nf|7fm(^~6_tUqp#UcvE}#cueFD@J9yKU*jhPhU}f- z4uezR0Z4&y@3czmyPzWNk&D!4k2H5!dvTJYWHtgS;Wx`se3Oj_7{uF8cm^U8TUOwx zflVcq19`dPISj9tqU~npb*o_Y_BY`1aaT=*P9fzvdVG9D=oi_MPZ(k#u>=pq&nK;Z zTgP$P87nC#)VgK21Mj)0mNsP?iobM99NIjZg!lrS^YphF94!)W1o)w(tK0n>ux&5A zU|v@pSYji`;;AJwgizzA1Q9%z^1#;fi@SL??9o?9p&{()p3028Z_0HEDqNc~0{+o{ zXv}@I*x`<121h?87~oh#@=?UQ)r=u7?~ZH;u5VSG4-GuCj``O8z`4&c4X2u|cSl5W z7}72G>F#oWfvikWW_Nq4UBfAS!dvDpngm!>Q55ryY%fPA( zuD-ih`)V4NPG`B6_vXXlFP@(6l5HFJ{^wc_|KFTQ`CqQk%Z#TbA33v9A-E#$c^EbIrNB=5s4{vSUYStYH=O+TgeSzJO@_-Te_))BZ)W``d5zumArw`Ht)7ip@J< z%{nL)#jNg*h!WP|WNfK*+h>Z$>~_G1dZc3n2uJ!Q=F<%6dbsu>vBcz>sQXE_J2y*E zy)_m}l98z-odGaI(S1Eork}x|GoO6kx3i?RC~fD-#eI}ydddJ)5^;Y}irjsA;;L&J z53~z*_0G{LH}4Co58}J}T{A1=X}wkQ^r7VKCHLo(Ed~;5z^43MvUn`avJ?^y>Q;Y> zR&Df0=qg}F>?D4(Aacd1rW^oG4I%KYJ=$@sg30~x1CB&^1z6B)(rS((E~y|4-PYk> zqDthqH5R>h*X)VpNl|;-oN-x960gMkT$6t~V=vLU|axmY;Eyum3qHGKRWk_+i9c9?>JBD;02}3)OwBNv z=P)Tbg{_a@Yp&@bpXi@6V5ASNV_|fgs@C|%n-^hQUEBg7LCfyYkB&bf{TcDC1>bZ7 z9_CzUENBA{n4){E@7Ml9H5|Zd{v7-`Uh<`aPW~t0|34-V?j8NPt0Hr??1x*`8FX5*_yJDiqU_1? z@v?V}Q@Hv}-g|lD*XLp6pSb$$4?Y*7sG|2Bllz{3tALR~oU=ZIpC3P(^}!QZ0}DPv zB%JuC+dKV5pZPj8hGiwZ0M>%hbUjTaQ9@9kMm3DQ0oK@`(HUu}*n`{ck66?QMwt0K zmS;1^Gr=A#PBV2_31w-;ncpm~AUH;n4si5QX{GUvbM>JPP&&7NxpgxtL0r1AFYHzI zQ-5FJK8(C1`{x5k8}{CMLR-YtyaD1X%Ir3AJsl)b!~-Ph=_$yC)KtEH9Pxx3^GecR zXq&W_)w!YBoI)HEY`X<4TJ(cZCwTU93b-q8wftt;zP=1l4#L2iQjNd9zG1kdsKvlK zi7twHBh0jq203sn+293Vn$e;W9%aQ;7hNpk`hJQc(f9wr^uK3RdT!T%UBQ!zC(9{@Y zU|s|DmLi~~I(tbI{9V>D{1NMaPJ@|`zdNl@F#-O5$WR3j{%t(pv7h|$Q2%6VA@87P+{h?cxOXq29_$uG7XH)l{U4Xe zU!U)z+S25igH)qr@1&Q^YgWfZBaO^lytU5nK1d_xySJ>%ar4{mSyI0Fx#8?1bqBX+ zINOdhLS_;92NQYhJjVF)uR-oR@AYD=DQ={00=)hQmy&2c*do}~`8;aUqRbCgB|Izq zb7W3qexly|k+gRr=36iuNzviuK5e1T71tIF=mu>s zIV$9A$Br^|C;~JSUwuNsPh4o}d)T*XjjQ!{e^yRRznoRQRLAbWYMWYH8qnPcVW9v7 z(-1d=FhTA>*{fK@I~5q$=Or!c=ieJmI8*5}@}?AZ~hvv6dL_^E)b zKrWgm(8gN|f>|}?%!p*%<&~X2GKlcT$vA^uQv%!=Trx;CeB{&uJv@n$Nxgv3V{UD# zz$?%t=jd;N!vOv}nyR2pR|FII!;wr)STJm5M4xd%$PX(Ew3eYLpd_FGl8dQS12aNC zWe!0$;nk{OP3bCbJ5gjW<&QL1pd{@f(1s*&FuaO?v-Fzf&p)TF>~K*bhk@4YBcS5Z z^lz4o^rlYS#8V)g>9Fk$PlSD6SX0UK4SG;R2_TZ!zHU34I7)~UgdSIU%RN`;`+A$` z(XrJv!LjTOC^EtAGd9Tt#OJ`}ylrPXWpjMi?3syp0phM-iu4b|k#*{5Wf^+GZdLB% zIfdslhZVhS<}V;$BOBpoy&bDnVD%}FX1lY;RfU>UizTn{fQ`YUrFyJop$hwdu+Gcu z?b_53+3p9*&X~Sl7kCu1qvPv&LN28`4R&$}Ga=V^Ejy}1v^NK{URIrKs!v@+#F)NG z5Fy5GP#^`c@93bX>)}`6@6=?_sbvMG|{>JSTS&zRk$Y_$!EH)kFm@I)lx^a zZ^m2dMsdiq2lFyth;G4Ii{cW`ol;rjxq@2*f3CeVV}wdAqhMM#1!zDQ!oT6^u~pqP zb}Mjed@s6y9>p`Y&xHZ2=fK%8i}`}H#q0_Apvm0enDpE*?S_?O={AL*feFEn; z8+8>B9?f(!XKI*8G1WZwkPhu2Kc^a;OUKR-ps%I|g&y_08tIzmyuR`HJqi%B5Rd)5vrBnXT_ZbC>Zhj;S7U|%kA5FiLEbep7dJu5> z!F5z!D`o)T_0`ZJErjfF9A0q|FW7^IhPS?iew{3N ztv5>Ro^bAJ<8>c6`)w;163+aH6rP4O!rnlIz&3~hI8NW>rG;c4H<6o-EpDd-zZ%56 zGV<8ypN(@4!{8?=k07sc6MxLtHGnb>AvVGe0yVjyX?XzJqDtB)*~kwUcs(+;ZyHz# zrk4}^q^3KqKE6wK*#kFM5o8l+Id>rEeNXJxSB_!40;hc!u_%Q{_nR@GM^1y{-U}*nP{CyU9-7w*daEevF~yh`mL}f=Px_VUoG1TyXe2)6lMVb_?V}RKbx!J 
zR8m&%&oyT^NPxa&$yl>(R`J5o(ab$%80!{W{mAj|2!vh+|Vkz5zA!{HOk zhdiCB2Y!8bk*H6F6#aVMbe4)taDA(Ogyp)2_&#Ags~~j?9*HgpAu>Go#s*iMN4M;a8~Z;QIf!N!?Fz5* zfYVwE#7e$_d(hHdnQinM_=^fY;-DR~1Qvhf&vy?=F}0t7wM!4fz^&+w^AcB@f*gFMAz}qS$Fz>QLJ>hIoN=R~sXv+=U|LMqWq(xTvI6t5*Y@s`d;z=je{Vqr42 zHgI_L<8_H6YAvp2qw5;21h+LI<+H1O;%0)x*NhO$fg-(Nj0EwB$q@jkOad=P0H+f!^_a6-g(xD zeI$#DwrH`(Y!{AN-}Pl0Q<7}lg5EC;w##2(9+9rOp1D`%*4Y2fL)Q8M@P!7a2?1?b zU&Ez<-z=FW_~~4Zi7? z%1H(1_tLYSyZ0 zt7&9Y%B~YJ$*S^~(t}m(`9U+O);zOOtE(eIwX^vW?Nj@GA1IszOgFYS*bth;?qN*Y zcl@@p8f)Q;pX@<-#oxLGN?70epLCwz8X4H+7fiIbNs^H->)Uj;gw0!5`6;FhX||ku zWpqhYhKh`01~`94HKkq+YuNI=CuXtq+%n8uu=osYc({N4KugMxR<`W>9JcUu;1tYO z?evs@COIf!Mf{mMpZJY$E1))28jG2wl2`P&GSKyIvw_X6x`!i>VzdF+? z>9L+)mEAoRPI7S3)YY2yY@WSy)RPzUdt4yoS{7K9Kl)>nQeA#3Y|#1Mid_h15H+O1 z)H(r%UbY=5#+5Qmn?J*m5e&?}UB;35D!?t#`RN{x8<^FTAl+S`#_VP;b{}Pc+vThu zv6~SD##Fc6@5u$9d_Gxz?+J&EU6(~B>_XNyz~SKWKh;h~C`4~6 zDQp5C<-N+8co_T_GkV!F68emksk94l#y=bazrLR9t>#=CF-N4 zJI#{BQhx}`Z}fY>^Wh~LYmhI#wAWfq@t-d6gEYxV_($F z5KOn8I0}bED7j`@^MG6qrPmzyLFUVq?i)N&f!NbeT>_UG zeFJb3RFo^YKxNa&g#N~vyg~-!!MKCd64W^laLc%fCx{`(k)YGIdAnK&^tTIP+gTR$ zh31XDv!J7c0V8a!*mZXdX>N&{zLoSxn>cQ_0Hie?lVAvI?;)|D8AY(!niXKDz5{$| zFBXpZB`3>Nlhdc}LKs5)w;Y&Th6BteWv|k$DUsHF)i7JIVY)&}=dB`#ti4Z{nOPv2 znwuI|$D?VGpPct)n}A7tb%u(1WLz#}Q6?`6Xqrq6$WFb~MO z)d{;9F3Xuj7%dnUtRndk?v$DWgf%I19^>J0wPz}!?_Q~0aOl2Tmm1lM^Pe_7JwCCp zMVqc{&?PzQ;2Tc?3McMQtD1js37G`o&WMK}DEXqIeY zCZ*XN{t!l|jkDi?X>my|#jn`35c|l_C|<DpfnmhqW+#=dZG+3VcX(wR za~`Z|)?s2L7*`y?{Mp9oZH7TDvLE;!jAOUuKTLEEA~?4;bjVwG*dIeByj0?`q^PtS zwDCnT_2B=7$p|9<1n1UD4mh_m(aZ?0&CEbZ=-%1$M-qDSwV^wjxmr6ugA(DtLxugMS;~6cBlviObAwBgmQU#Zt^baET#IjXRP(wIA0#eozWM3-mSul>M#wa z;^GZo)YL+hblWee^ziyg$llU3EW&|FQ-Me04rDkoC4Da?z4rdNk0QOuvFJ?RC7e7y z1v#BNd*LP#m_3K-P%^4-IG!?HByF52Zo)7s-x*zp28AulPdnVR6lGpzBJy`i$O1$m zY!I>Cji(*GgcV9@eI$Q&u(RhfDnHC_C3XoO=S}M%UTJ#@=(EtzyW{FM z7hP6Se%;_QyZrg~B4n5&YdS9-i^+gFTi*d&!lP1!%uhVPuYUWyZCn-wBA$V#0p2|< z7SphrmdJ7K^XJk}MjzK_F__G?O(pum8IkuN##dLd8fkQSO8yKG;$OMUJlW67qA6u5 zFX1A$ov8HEcSe9+@stI?0~L3M@l+dgM0kJHd}cGztEfqMS#9m?H=|Ds6qAOh7?<{-Z3@AiA~+os^jWCS_^gr`C9Vv zZ!-hoguyy8N!5p{RKYIfC)AO#|o?H}-9lzjE$8iB23OWhTIXGKs zVwXRu`Ye&1ezW*ogH{=rbGVD{LK;U7hD9P|b17~H=htMMom?^m(1}K(vFlQA1D*G1 zF-uD5H#zgq@m04nm==17L7dcTarpInM~(Pb^alVNjjaZ_C9>J(Fety5O<1*<>kL?> z%dn0Y3Cv01)~6v>aWjneJ;0{Nty;G5DL;RIpZ&xK53%nZOMDhKy&$b`B&g!rMu|0;ThbT#M#B) zEF5csoeI)dmSdj)?F3?yAT`UFx7cRK^ylUXEEFeoPoOD4S2bcp z0eBCBa#X)5F!%L8<~a0Sz^z&mz~Z_-=dR&q<|TIVd%cMb8f(#DrmQ7w@}IuxpMB8( z509N)%aT;{OlMj&gFp$ykRg)EX&CkFR<{~a&^Kn&?Lp+;V^9)nVD8}K|=kV((p zWI4L(JmV#AQ~hCW+Bo+!n3#OI_@8EBG0&^kp6G7~aC90++}U@4e|z7SJ!b}&kp5Hc z1tpN9aFyj>kAIE9Kbr#>VG6i^)N^o^7eobZ)iO(0xMqOw=l6g^)ek1h+F={HCyk6K z=3LOTNJX#)a+?1yi7eCg^~6-04-SxB+bE>AC@})(|43h4Itn<qIcB87LmAw@u7<>sdKbugFC;r^Zf3~in z7yn*S@c;Eq#_wE9*)Y+kNrU;Rl|HQIzvhcyJ3IT2O7Qb0O^a?zSP-ZDdQMYUa;J1-| z{#MCA(@*uecX>}wT3QKLoO%-uAzu9n=7r&@Gx5gR$W>-=>ul=LNO(I~oEz3~qrqjm zMr2~_(ba;BM!#9S?O2crKFxYI{2pZ#Vx2wAip0chN*bobWsm6N4r5$q01BJWGpj=y)fFYKlt9< z9z6+d+&?p%zc@mT%0dYp&c+*V@ryQ#3@yhfia|Fhvd|X{&|wF;2cXwJP|gg2e@LJ@ z&izJ}%jcf~iti^}yu zJ30oYgc*uJY~b`CpwQ0OsdA17H9}!lx@P= z?BqrdYL%Xyr6=tT(K~vSWCINOh+CgU}2GxSmb((3<_3F`8Ta37*;DWNd`+nBN&jIr_chVEhK8Mez zGPx zb47kk$&|&v2smix8?*C$gX3xP>sy1RrK$G=k6wDlOz^tb|D9peenCh3N6dQfw2#Jh z-3#3Qp?tjEB(;3Fud_+9@tCWaaMZs3r07tXMX&z23*Yzjr{Ow3D0Ma;h;ACzJ_Jv=C3TuhnRT4xx+NRma0sM`l|5ePM~blC6MR>)|i9pGxNb0lYtwQQ3DG;Q`^ zX?6L#_+JNNEz#{YZsa&3Z4uxq<*9cGGS)1d*;Y)l1DB(koy{?L>Z8lkCORdsFPqD; z1;N3Ua+p>|by{`$6efC&?oFiW^(+0{E%PWS7!sa8bNBOE*hbV1D$O(>KiQj;&2@FP zyH?W7$skw6a9r^qr}1$^V^9Eu$6MKESy55*&_bdKNVscbC_f!7`&uvQGJI|kBhm|% 
zMTp{MgnFenPS)e^)p2UZ-Tm6~*)-NzOZl50#yB_IL~2lceke-J;@-d`0)621e_$>x z=I!ChpgD}w2caqWi%IIKLd0LF9_bCgS^g?yHbM&{%hFQCsYAe;KgAzA&4^|$M1sU? znhjRx%HMb%R^TZ78_&Z9^jiM#Jj_ys&R5IlVG1&4ezROk9{rl)>-wvdYp3izFcX>; zAf+uaX_HO(8IZI(;Xz8ey?YLz>WKNe1OTf957Nhm2{79U!XM`OTl90JFUvS7Dx2bO z7Ow&(4^o$*2f8oY0(*;pZ-wbajC9?w8BpS>o9f-tx7zclS{>}!n!BC#jgqySfv8MK zsjj7cz5zz&Z&&82bQ~8Lwi@m$TF>vP_`ErdXb~CvDb4F!45#h~t@B326^AXUr zl*->@`x_ZZ90XYZLIzsgUE?9SX%V}_Fo@%iOjgVWK0Dgq_bel7-{=HMlsC}{z?4Rm z!&WF(6`IduY@lzq!(^@R&klVz|21R1=6awl4O};^d--58SO&NK+?dJ(PX&r0bW}2q z5d=aI8fx$_39FCM>nj83KW))BHn-t#P3KT++>)e^jc`jiLns1gn7`Rae#=eiz|bP~ z!2`tvjwuYhCj|-l&4SgZOKgQt574xxXHr;op=BW4v$!93b_KM!!qD@*sDca#Rm>R$ zisQACmF`lmQgOdVF`vL*Fz`L0GXQ=Lw|wH>g-e;uqjZ`sp_pD-N*if~laS)j&_o{fob!%f(2(LYeT|Jb6+Kek9) zV^kVkuXPkhQ^E!01o#KheJT)zUPOMF#?x_N@&F*u3}D&7Z#4(u?m!O4!TgzWmbU)_?4xO~_X6ykk^)nN z;zBrnmPP4?%E9o1pu=S(6`X}gzDW=m|S}xD87G~-1Ila)U>nTI+Px3lGOAV@j@cz zhS7G&UyWo7aTgT8CHy#G!`JKTeZt-1>QEY*j9$kO)B=U4x`(A2D`RyqypvaA*$fXH z(39s630uc6A7hAaP|_j}4|1Qfi$26<#KN-8@;p}*|K)9OOod&^Qm?kG4iRc#&`5Z^e6`&z zzr^8NpAb)UOOEv0pT|3|dAMEf;LPK>plNR}aPc)L3>uoMuEJmXqp1XTN^}Ui51cd^&7YyiET&@n7mR;NT8KL=u*$9>!wGF;V)cnxUB6k6i zU90$VHkYU6j=RV9abuD1%gv+#9>TX$VuGKDFptEFIx4Z+fOgfF!`L*bA<2|iaACa= z*~0=|UI3UdxDY~>a$yD?roCHEUJ1;^(6N^G=~kaOZk*(SX1H{-Br`1Xc3&jo$t7kJ zjU}UFpm3|JtY|tr*TtO!riX;Dq#Fd}b+_&ocS$zik24nP=I;h@0Cs_39e9o=u+-LI%4pM0~`p-Euz0aeo3ncxcwFM~BWLnbtE8?LeSeK^xtskL1d z1TuBRkUlU14&KjaA8UlW-sOPt=9Bk$9zE-UWOpE&9)|+ z*_KVW>UTBQ&iZn&v_9Y!_Z!e~?T^BD@9eG1%UJlVcQ5M3Xiepm((^ZAjtD9CTNCGd z3)Y%1NAw=l-!-|xwq)sDkGO%qM1nL-axDM)=60)#3-SDP^A|x#P%hUve)2(_haTh= z;q#1%LE;jKaEYpZB65tFHug|}Y45f7uqEEka3}7K*FLQtOGTR0SonM6ZP_&l=``%T zFP71yxXo5!A)ZOltZDYJl75ECtkLvpzoe+yjd z(U_ce$Ibf@f!;7lLiMA9n4Fp7K2XFf2rOhF+-_;q=S1oKb?6O=8uXZC zHb$@*u>EuZ)QJcJ140vJ8RrcK{|ueGfZYG*B)(aPf`j!lk|FE)bl4Y~C};~7 z!?wPezNb6zfJ6s?x*1oaP~A6kvAe#T6TnB|+KU=NpWY^Ft>Z7|BOK{4>m4Xh1HGWO zd!SAG_6(d1H%0bhu%8(5z-+b^FmwxG@NglZmULf#bOi}Qi*{2-z6D_vkOh57O;x$^ z?6=aU{z!Jsn_aUauh#YVe_K_uofb2h9d_~`kDjYl)O8%7?%C(`9H04{RedHe|C%vIV=KYDhKW!!k7C#a8pz%RsMAoT^9q>Ch_iI z_UquZ69s;=T%Uu*?yvmTJs~H>J(1Ei0@{V!#MggoJA}Q$z;3Us=}+A#?^~O#Oz?JA zA3lBV*}=DGLJsU=${Y)>8O_BdJ#sjo59#Rj6>jzf@_%_BqFS^AWD7>{Z2CbF0PBk8D^zTIjzU90x7T9>IE(WLqS{C$5bp`S&Nf%hHQVK#Qr%Kdk==wfi2A3zhbCNj#=h*()v-o1y zqA@64#P1Y@!ik14<>4tC0_|&pQPxAVah_VrZCR#0wiW8Z-nqgObpTu{!tdQ={ntP( z403>}&@Zd;#cae$GBrx2;vG=?5iD$uF^zIYb0RNM;@v^naG#qnvq4o(2j*}~DWE-- zgYn@=v|U1+)S<3>>YtK#Z4!)+Zi5q-1Q@&~NyuR@<0BNpGT z_SmFn(m5%)`BNjluFE$9#`b~PzVrywv?X*grZIm^X0=SgfA_R*R|7Gc8(G?5D&3$; z$iZ$oPS2CWxAeHkg((_vBwws zStZb@4lMK&p7Fg%V4S%Xvp;_k>1zPbDT9@?)Qj>#R$<@enAfsDY*E}ssiwB}MBH>2 z8dXejxc5Lo%Ay&}jRxL~x00GUdTcrf@Y(x#y*SW(@9aK7mYaI6<`m1ukx*z|hK;q?!S|fBpN{Sy! 
zxSslYf$48q4N$o)7T0hNxX4{I(P{GSfHtKgR30g+>9QVI2lNJp{16A(k~V*=P=<82 zM0P^B99llQIk}&QhPQlVX#iyIO7!D+t20E!drUZcj>fFK{Q|N#m6cTqc)XsR6MMOukp*)>o^t!1|uTcE#Z zK**&j(l4O0!|JGGW9S7jEB%+a!!tB2UohkG2i!+I#^CjFLvCfUn8xR1u)DJn!PtZF)vbtgoB3!{QrOro7i6 zV?EE3mR!K08Fv1XmWfAiNp2Q4D^LP%>;K5I!1QuLdcTUtW@L{Qze2|5FcZG0>Ew#f z6pIuj#4$#FTY5AoMk|~;yZ1RhXZIMsxbeTEI~=ilE&NZYbruY@e1I+yN*T5ntfKmR z6{+oVF!*cDLUn^+gG0+ax+SUOk46QyVypgdCj1}%i>`q8Cp(3IjT9fJq``RAuD4}1 zJiBO?=@?N{7k?8`qrJ+_CXlLv5mxGQoTHe}h)yQnswp4Hq^5A|Ue7l&`cehGorWEw z#M4guZZ&M_1rnLlN*dYh2_eI_)hdgsk7Qf&z6lqbcH&`&u_7bZfe`?k^D(G9r`f$> z*y`J0Nd1qSSkaE6cZ@9lYIwpbKF>ehO5NHfe(n=bZJp9!iH4VpfSR=Kg$qYaObq{0 ze0wp1L8n+3Eb|LeT&97OPg}g#)3Yq}rR>gU#0fol27LFUJ8y1GbF4;^jy3Z(m2O0@ z3KGe$*dzKPQRCyfL2;4C%R(Q=rhn_*{)mXXH#GhF0Aqg+ zrV(M+?e1HyaV#|M6JdO05Z{ZphP#2nQVvHSM`^N~fna7dAGPPYBm_kL2ZAOZ!H~Lo2-rp+-DUzD;6#Q+fl6JM$nDr&g)IJ-wC_c;JraRp1(my`gca97@ z4pdK6l1BrpUmltY^Mu^+q52GKJ;y1`c3;_Wo+~$u#^=q zDgi=;*eCi}oB_O)7)Fn9NAIjdsM=7=n;-_9%M{*2_Y43^{8tYm^z!rBPx-kYhVAs^ zQz4izoD*dDzAtqixPVa9oWf7eSC$BBa7{C*MYURiUYq3)RNCAY;72Ig$CDoA88$)9 z`n9~C>%jrxK&p42F}YeTslVXf&Fp-XDj1J-lbwGRf*4&i^GtbpRI8Y4(@bi6;^%fV zrU=XKMN=*>Vm`q>HIsqznAS6JlXW>2DIae42~qKYJK(iO4Kk70hAROj8`>u}nz2%E zR0@~f-&`tdwR3SblX4Rf==98g%h7&FD(*I;A;uSTEVx6Y)Ik28nA(+;Q4?&+*(A&<1Q(5(JXc9%51fH~8LFn+sqh3TNWuumK?26xUn+70ks{f%J;Y z6?B|b7$aim>Q|(k<*4SU<^69}uh@1!-1&)WzX=>=2vdHuh?lq!-I{z-A=MS%aNobv z?s|>1l5}6QnRDlVVTT%DoZ9a4#WVq7$QwV1Y%$XMJQm`+y@d}i6S_u2 z0++7RBHS9_L~;W4vlOuMQMDONXaW|oZYhZP_lI&&}pAL5~UDm@11OPU-5LtsKaTAAhr8st3 zd0Qzp#IQ#AA%v>a^yI0rQR|<6Pm{ES@)b2oKfJ|s<#_UNxW0$h6Dw8=9wqVze_4CB z9gSHiYwS;dLq=@rCv`5to*cg;CN_K?ZG6A5>-_O-INpyohB~!y|JvhNf6D^S`-XvCS1|4*VK4`*|gK+$*2+WZrZymW_Vgpjse=PlcVX(oG3TN2ofb%#D; zjb?j-wgi6T7t`MnQw}(yhV7av&xK>E&52vO@g~1Yd3axxmp<4U(VQ*$9Pas(-_s~Z z4swl^1^xazDg<^$W@WDLIp)|?!!gNbXvbv``ownb&)I!+5ZRO9q0Yt=8Y7Zo{G@Qb z*SU>cw8b5TEs(GDjKt0RF}SD>#CYu(1ALVa~!QyRb>uj!6ttQ{=&7~G{T3!K)*4sYV?!t+bLM0n(lVCfM2t0I@ z5d@pfa-Jj~1{haEvGI%w71*JR7oQKffc`ad4)YEyw=Xa`28oh=A$99SdZv{ccenXz z&CC|YW|Z?bD>;%9kc-fz=;!---PM;eROR}acbZd6@H%(aJ(lYqU$(IdJ|^JK%<$d4 zF=lW^`mw}-{a{_bTzeb`-|HiXa9 z5xCDAd=wDmJB5()nbzkV4a{m@eK9@bvHLh07t+7UG39xLZL@?Ryy{HSuTP5CnbtII zq?*ZVT@<}gk4djzivz~QQmdcy!$Yx`Y=whD*?0`@xSQxLCNCA@EzRlGl~SF_X?e@N zruyaTh;TLkF~~lowAEOo8U0*dm@rR(>0NNf4IQL=!L<>Rj4--+zEwjG;$EfR**Z-z zC)X93u^?I|{<#Or-HKeh0#AE_uuHVPR+*6B(jz^(t-VTev`<*pNxb1u&qYn49auKd znc_ILw68?0EKAsa+k8sRkn$ciXJbjTqf1Z#;zk4iY5hZsL?=%r{3%6^c)^^Uu$xk+ zyHwV-n)M6(7FIn-BKX?L(~~L#Stp$)u11UL6xA2>O5U19-!YG0o+(ARoq3cP(BLg8 zXI45OjH?oRfxFwSzfW&kvL{4iZa00AV{}6!893pq0XMoLs+j-riJPkDn`I7so6KG` z9EUrDS}E{mYOa0s*1Ij zq!aP2|K{?0O+AGK>0%!+6e9PjEo@0n)3F&VdC>tmV5Ip>?9l~@saxeP0xt1}$-N^I zFSdFgz&75D;JBUFyGOkUnV2_pEj!|j8v17`ktSW9dCzD_gJ2N5;B3b4&jN`)grx?IY+n;SE6CUI)hD6x}}aAIkZ= zaig+Kox*gGES(5YubS4n3)2(cPg>xp7RphROtpU8+GOkAgBsbXa(W`i0bs~YAIy3O zV{T&fC}8qbD5wrXKHmq@qrZsmM$IsK%l{;uX5Dd3s63cE@*>b0LeQrjvH=#8FbmD# z@_GZ=yyP3ttV=>ug;w@Rm}Ue!^N{c$EFOxEgem|x3Td^IoJxVjiN$eWDGT(yF;CL9 z$pc}G3bd!>u4wEKzM2w1RwbdMpy!-`lhqQrmg5u42Gx_p#S(F&ti?H6kz7w6N`)Ri z z5JW{3q=^)PsDKCvC`eICR0IS>1ccC|6cLanAU#o(Di8qysgVv+A{}X=_aZfP2uN=U zC4{thAKx=`c4p4+?Ah7bYj@9{`6E~0As3K4_xFCjeKNdBg#=%7)ouDsZi>_CTme4?LY}>UH2E=ZM}3X`Un-LEI|M$ zATg`0V8VGgm46Ip7r09H2ddJ2ERxI!rbRmvH=C?DQ#xU?L-in6#mYbc8Uj}=!Si_R ztIkAC12#mmBSOXgf*?%rji6wyI+2U>nXgv|o-aqH8gHY1{eFM>x0`z4CEP9{kN6Wu z?k?v`B&;M|WSsP3?hh7-wK)FtzUfi~Vuw&X7>+ChwS(yEX4gK0F04B;?H7GM-e5f1 z-UI660B*r-NI&PR+V+NDrRgm`J9>|jMOlICG0BJV0fmaKyJvqKI~;9tAIT=QvWFl6 zz2jZ53ydk?Ee_q$aMdYFM*hdx>j%`54vm?ikhL`;{jY-odd46|-eYsZD$y=S%`TVn zY{Z&vuJ^79^b`+Ah{mgI@U~RT$HU9{+h*ue`{rG4;uEql9uS2svKt;Ijhaq1-~GDc 
zX>Tg15xMZ462YyU?$g1v4tAMmn=}oOA6AwhP@OE;kdVoPi_a;{&40KHV3tnx;6~k) zApomw_uS;@oip2-1; zZ};|x8h`BpJ%oU}n0cSc(|fhfp84p5Bth_5G$CACdJKe|uevU@G+yy&^Hc&DCm^xB zK|Mz;r?L8=4e{})HW2f=H(znoY|-1J2UGG%*kq20mk&<`$=IxP(4XXxkyt9lg)Vwi zgfNW~iLDCbq$la=wiwQ~7f+L#?Kqt68Gb%>pKHtu@+bo6Ytp;c#XOzH357a-GyD4% zlIr0FI}Ovv0t#E$0`;5#cbwN~f2kGmK9W$vo}o`nJb`u!cya0w^RcqdS;q8oeR9Y0 zmDr0CSG9+3YG_!nH7x|)Va$Q6`n}BZ&e||euKaN9ZEUEitV^8~dfo#FS{}_~2ctbv zVYG9@6q~Ht3J9+7(YxQRH+f>O9cgqOZ75zv%is&83z#)7fw;U}ARB3zdG8~^qjSbj zI5Ust55R0_?3)099GkIZ0ceZG1DlLLi^eV5zDp968>H|87x6tL2qBCbUvK)&Cz{-j zX1zTr7kKVzOrDdIowQ2U&)&;ncInEo&+?TtnNQP$`MNWmZq5Woo1gob%J1vU8QQg7 z&5If@du(O+^=VXq^79e1LXFMtk|@xfh`g{Y1*U3Z8tUJc8G}wVd3<8TuwuPF3cAU} z)FZh(T20UTJ)F0c2(iP0-@`9boW!qv!1|Ae4=5im$>UfJS9D9CsL+c}RW?dfPT}Fr zPR@Kw?cDN-;OVQ*nPan*s?Zm9)@x~Mk?b*;X_echKcC&wZyj+Xln}HrL(tqQc5#zX z-2tsV%9c+s4D+vf!^8||VTAb_z7J3x!2g|3B zspatBwHx5Wb$g&N3=SxO8qp@vdzM}x(gBdz4n1gmhN@dY9bb3>Zw7S>YB>C%45p{x z%xOA|LEtR~puM{O0J6i{#(K$i5w}5{I2=$% z=|KCa(r>Uk!cWz6dz^E9HoW~!lv?P?ev@*R9s1#rR7gHgQLh@Ku>0axcI>!G%qW#w z&Y(rwJD%>XuHLv9B8EFhlzVy%1~EO3}o zaG}VrNIp^D)(a}~!Wz8bUKxOfmcc6-o0Jfdd|V~P*oN>$U7nl47QyCU%@aPBVjqEJ zI6K~)XRw7m+kl^giM+z}=Q&zk9L3fcMxkxvtKqdqkN(`qHhsL>O)QK}wd3-?tmj-z zDIa-A<=fo5Q(n}Vb9XPk!UgDmH0wj56cH+-@v_gtN8jU4L$6u*I?wV(;#+RgrWkT{R!jxtW&3k@+5VTsP z8<%^pRFrv(nJ#uEFQ+p(0 z)A!l;cK=bN%)t~i1T9n->JA%Z%OoTqe;ZMYRWHngqWzvF^DFzTHN`9hpkEuC2!mpx zCjO$U(1oQtRNV!?uNQUB>u8zR`Sm(MrGMyh72*C&OOdf#fKzv2d zLsy?ZBh2IXxyhT9b)6SkgbrY!yVSb-kK4ijlNikWqc{BTyH$;VSq>%H=Fv#Jw8}y#V?_K$_i_1}n1uReV3`^IuuSOwJW$ba7gr8qVhbFY zjZ{*^3N6>FvkWWsRC|0Sb0&JPK>}VESKwn(-Jv#+HYDyT*DP6O0A5!x2^tGMEXEZc z#Q;7)<@n=91R1XB_a>Dci86ZT_G{K?-Y=w?|7b!={WhWyg4r^1eT>0^Z?&vZDTT4gT9bfMTO& z&D%HYaUw}}bd((-uINmi$(E3WER#sk8p{XWyeF>CumO72n1c+g7Di0#Am+aU7OYrmm=xd1;=+wm0 z!eM8_@*l^F_3y`@2sMht`w9)))%%YB%nJK+D*h8Iteog3i)U?F!aOSj5{7kXwHz@H z_sf1!Ay~7d*ajF}52Kw6e>HCnsxg)W|9IW`rvF(s0g98Fb;-5a^#X`w!$w(+gj7L7 zVZ)lBlawd$^)xi?i@Lrc))pY_9Ww!dut=X~Aj(hvkXp}z8Z#WC=KLrErI%37oxqH! z9fpTw{Y?zZ0AA_4_K%oxp!@5b)I6OEWL?`ws{7!knOq<7fvdv3s=;S7c{m%yaINF4 z;1_{mU=5S*|I?{M|1Y1dm`c=tZ*rFs_AL+@cW9NnUW+g(L@#E?eW*%dC5wcS67$bL zDC^v<@bD7RG=Lz5{@~BEPyG9i_5a2njkSHA1K!pC9gZ0VKy5cM>bPo}HZ&N#C1Ll6 zB?z?PIkVaqEzRy+D9o8z_|Sav(InSUfOynzViMLUN{78*N}e&ZCmY!W(8oQ2I;{Z! zNsF=pGgCVn_b~`Hs;>?Ug9AJVz;BSBh7n{1HVjib{`V8<$d60+D$-j#QVls(@dP>-Rob`kU@rj%(t=LpoWfr24m|D3=zP>Q}$=lz5$Vjd_OarU&_?2b1WJJVpZ& z7eOH6bi4bp;MMV+-Hd^N0ufgUA#}pcpA?DDg&nK0slE<{u$QI^3@snWEFS($o&;7s zh-)3nL<7xIH1YAdp?p8eBEwO_pym_El=QxNufPF61`Bv-y-E?W5K{f?;2fmB;iuw5 zz0tT|*Eh5l&R3GsaB)~z3J%@sl-u%<6g566Ek*Yio#|zM@uP{PE1IK+*M7>>GZZVK zgI#Hw#cjt2Zk~VqwtOY)>jAhz-@&w#lfTldqHP=Y`zLO=XGV%=M$$2U<9~Lb4|RhH z&K~|FGMf|mX`w=kQyKjgm*Q~3B|J3A2tRp*$K)L&OWi=^Pr$~b#1H__QRk8NzvzyD zzQc3``E8>xHEJlxr}lxW!D~8|GG)qNbhn_+n1(pOg-CPtxO}0em8KyFgOap?;AoIt z0Z!2WcFXqeK}Y+aip1~5KU3? 
z!WO@g89x#Vfub*=W<5o-#3&>|CKSMY3Y%x9XyGuX&I)E98!A3hlh+w{^6t+F_JPCo zgl3*2qX%S;p15Q~Y5|dLf-39al2t_OD2(EH2*ur{uI2_@BpwIi_FY3M56YlEf#KrZ zr_Ep5#M*T8KJb-5x(}!Yu44gQE9?Ybz20N3c00R6zs*^^^CsbzDnINh8g#t~=i9EW zzwEfSHlPS*jYuiv26kT+T!`RdfVU5>u$wR3y!cjUQ?AV&fXqCWXr=%JLLy`9rbu<@ z$7Jf_Z`SL+Z?O%_FJi&7#*E`=`L*Ir`$N78E;Q!{#wNvAO5-Vs3BAAybBv zZg(`#yX6hK)7q3l?N!lBa?2fDPJ=pCfx6lg7A(@f`RZ_UFZ*g*=TU9s6l2&8`#0mp z7`BSbu@CAa=AH{CP319=Fcb0JKBCQcS>N406QF`!i+qcZ9oX^^E>oEERV~=$)@#6{(g7Au>$Gp}b7{2G;90W~Q z4!p2H4w49|tgR{g8L~I5b^&*{`%*zOL@oX@DM%KY1Fqn1VN$)ahTrr%d2f?s{rOXq z0-M(@TOc7-&JFAgK=hk`^4%7V9p8`-d1aE76;xD|caqKbq+n*Fl1y{09LTKLyH0 zVYtf(SnxR&@bc7$=1~u!(Vh@Heo~Q9c8Sl#97F4~l(ajF^SPV9VOhjY!bhS%mPO_F z2x<|7n!c*!f9ImQu(vNnyzm%#&$n|m4*>xBkc+BCUSPHvJI<@;Ji9`;RAK@HO@61Z z=D2x61Qlcp%r!H&{CO2vUi`2+;98!gr3KZo())Cfn_xFaD0(jv!|7QjEF_<>YDq)1_cmfUSngP%O6JBp(&tC zjVCv(PBtLh1o_|Z)GNteJic#VF^KD)w?Bg7Y*H`cJn9+HRY8Y<&(*G?kZMrq=tdp5 zoCf{vTipr{pZbfgRlt>OsHGZMF=3tp5!RGBpI&ETzx-VB=;VhZbO_u596WVc;p`8a zQW@fXuvKLbB)rFd4Z2YCqcX)GyC1#&fyXv-8=JQ^}jU3d@&7FHB zR9O{e9l~`cEbHv|EIPWobax@8pRx9rMiau3S2~l@_)@?6DD_C~_V!Y`3s<32aWpNJ zPHNyZd`+V<0+j}^$#`aIoR(!`SCtB1S}U;-yD5*jwt%GbJy;_k6gctGz`TWnGT2x9 zZ@7uSsnJ58&532!Xd$`S4-@g69AM^%;~Oc|6)?DqY99&zZ3$ThG`NJeBLHaE0!2|m z8fYzkmdXI8Gkd^Uc)C(T*6dV;)1iK>^D-Ul5sTPsOX+Y>#Mm~vJgKt-dY&uUt#06R&uG3V;JoM+Ul8F?jAlYhk{1pNIw<b)(pKd#QXU@w|zn|*BX`Xnr|Gwt@x6eFo?#-Zpbud5e z!sf+rZ|<<-;_7IdpQ;Zd{DVNCuU!Y%(4wzOm<62;B7HCBAC*U6XtfdDw)5$&-Q0lv zaD)n~a;&>2iQ@S={cg~&icTLn&KC!);u#602^&nF^Pvf@uPwL~s_%Ovd=`1>NEbz^ zQ^b;p;YM#W$6h_P6ozIVaha9)-xLBlmeQ+`nCKTZT8-IPX>Z%I(h{~F*z$|j6HYF+ zWDQXkTV&;R99nC1lU^STAD|x0ZVcot2OVr58r25wRPqUo_E_zOF65^TrQi1 zqU!M|`n#U<^dmu{z~(^-`hsT{DtEN_%80_H&IGwodzO_1`diV*9TJ2}Z|6(3OR1jy zfqJatE~(#0u)O7Z_=iY|mNLI?7!z>hVfdU4i?i{ z?QrW#JJA}ptG{8FBY=d7D1kY6$$tV!X#V8LSQ*cFU`;*O& z84bNfb^oN$KZLXp?C#M9cu{l3Ik1y_=`Xr`yghjw+~Y@l)3^czle^R6&qqqUTkllD zPrOF7Uj47mZeY_T&@s2sF?{_I*$ilm)qu~RrKq8%dnj5pK~AQ>t7;5lTiONmp7ywD zE;z0DeA)^C5IqNj=aInBW_zAmbIAH0z{(g0$O64?JNC-ihEC5z>+`B1=;8FV>d74yfZ_&Jx5|K?J|5Y<4;wVirSN14S2ILy+pDtT7(! zGG;UM2_Y6XsvCYccT}2$_69oynep83y^?N&Im}WVM&eycd(zV7*LvVjJ~G5WIg^)n zzRbiPnUd|#uHxH=o~;0hv;YN2v-(Kj1@yMy9CyX17Mik*#ro%W37AI;y(gkiC~kBy zK9K1hIcWoWy*76jXCG#*<*Dm3fO|xLHZB*Bd^WJG+QtBl8a2Yoe*d!6E8=OBjI#o9 zDj2ZF0G0m?tjDoY0p}8uy~(#6pJe4O+2%MO6cZpmxRX zGmrE8!`1T>2WRQG#*UZO9n8UnBCjN1*Lb8(w2TCd*)9{M_bE^IJN}%_cKEZK+k)3d z`CfflQSv;#jG#_*@*drBHM9`Wb#-?4y=yl0aKB&dY*>Wp*Yu|ITBv+g0U#K~@SN)T z#QuTMyi|}V-E3E5ik~LZ_&5A*eUGOjCLy_e-CeOy>U8X?<1KUPJYT=Cf9YVo!nd^v-rKQ-toXj{=8DT&&jbxa0{` zU8{jx9*;s9n5hr1j|wX=6YtS2$DeO-UAW)bKF9(h`MfypI9!HY+&8n(adOKM2~PUw z9X&>4xG(qTMcG0gH+EL3qv4Z?-z)(aIf*|Gp-JR%N6eTbdI#N&ECg^ke=6NzVE>b! zFjKCM2Ic`noT1FdcPIvGu8=qJ{`az=2Re?O0->o{;NmyHZSDhD!*3Eeu$}*AiY6ZZ z1E%OW5S;KzpYN7=&vkC4PhPW<_$`At3Ile_HCO{s7wt(E0ynhb7y`BL+TxT6gR_i~ zRG`eQWQxW3?Ue0vMs-7#ijfM&1{DTkJgrxnH>XENA=^d)>-g3MJ5<}N^XP9|Rd|+K z=hfxtb?(eNTdhQMobq$jkUbBtanKh9&DB(abi)>-FXHRG=2rc%WMpy`M&Y72yXBPj zufcPSQNQBt?CzXo@49>-M?kc?PW@;;=?7^Gv$Fnbgs#44f*?8ZbW#tHM}vOEUDF{i zS5`+u2fo?t1W3ZFQ@i-~i`^YoYek_LQ?I?603&UVdI7{}Ie#HbRBpVH*HAI-F*WoT zkh`+Bl?o?IHQ2wgDZBZ9BnJW%c z7jSdayFY{I<&zjgvORMzi$Z!V=i^Qb%00`GUz(<|QcmS_z&$p z5=1w2y`fALuxNOrkPlqTq%Wp&=8ZN`2`4;7$~-2qe0LhWcjG~Rx5_C%H+BGY#zmPc zPit|Wy~TQ5?s|(RXW*v4M@Z}Wf!`-IpVILQwA54T&WVpf?0z2bt0${51;;fOn@WR; zjHAy0GS)g#?3f<^*k%zL5lU5;27JXJ3fK_@n?vaE1_)rI8{u8Vt^is4ml$9mHB-f! 
z)nx)|2OJ$vOybgHOfzw** zr_5$u1_)r=q5 z|2?w_fSrVyQ#^@TMLw(T)l;_Ktb*aWeYpEB#d~+c;$>?1Wp8!!uH9sO6CfwB-^qFF)r=ta%Lj`mgP=o~G;QPG7kWVg+hMM3)|!8T{0#f3pV1 zU+1M|y@dz$-7JZap4XJRWfiAl4Wx#TkAG$U=@xe=cfiX%eGJP>`5G7l z{Ffw#exMim{SVvr8qo1K_91SDkj8kvBzr;f9zhOdrjtK8@+(FC*Q+6{V6{XU@ z(GVKccNx=HYp62cfv(VO)8poq+;%~i-Y~zc{!;rVXj|uKuyObs2@+gaUo!lteYRU9 zyj%ECc;q_jM=RE07$K>cjgavY5%s;98L@U7bk--aU6zV$t+ng;wyc8!>%!=#EUOb|w*+nlV#vjRvW)&D3IZZ{apr_n9< z!K3)`9o5(A&rAiwFOCo|$}L*^Aei)?IH4us3oubjWfm>Z(&KLaH;*?jXj@J!h)2c- zzC%6!1!JZ>!3$zWDBw3S)ml zmUgX{59iKrBYo_}TE7`$(m0}e0_bAfA}(4w2f|G%Mn(@!jCS|~@9$y28giRc(pyYLUC%ep%QWA~P<`PitZs!?59U$5P4) zKCR4&w<><@9GF_Obk>POQ05drqO=ElGMI1uzNVsob`*Ly@iHx*sG<(jjl@ADQ-2mb zOXYUTMRUO$%GV0qbv=uPlgQ9T_!N4IAq_d9dQuJV4oyc2z#T!H z6sz!hmE3!tP?poo&aE!V3#iKjOLXsi&zz?Jop70bZ?2?~6ai{Qy=1|< zF-f7_-dWlx*obAXW6;$-nxU-YoW?4#MlAhJ*8773;t|fmJOxG6z1{Gkddd(^#iz~Z zz*vW%u`knEmg6^SD{{oW*NR=^$B{7yUlh~pg?fdPNNIQ*!W6CwVK1MEE0ZnS(2eJg z(-@jkrwtc6`|`ZyG1`q)l!M!Yr!<@RAn$~$i1CXa++iY(KWoxTY?7w#OPn@l4(p9{ zqf@rwVc*g zZC#}*)TOr$(rG(MTce9N9_Hp26(4~OkKSUIj+)JTT`Sd=bZ;5*wk4k9TmZ;-WOLPt zeqZ}G*6B-Fg^yb~dgm<_!Ygm0>p~`ETI;HgC*QuOEqH80qbneBA903~!RmM8MMb97 zk19|{CTj)lax>06*#oPK(rtPil&5&z!3c^WIK>0p%kLC(f&IkWcD__B1Z?m|OE}I# zj?3}VrRgN0khwW6+ms#gE5Q79=iMEO#9{3Pv8Q=_&xRwdixMBk?8l|%iHbQ+l19iZ z41hDF%kDe`7~&5-#S8$Lx&jdHNBv#!1|SIvc#r{z54(B#jd@lWn?tQt1B2OyLRr7R zveeBX-mo)8O7iUn_kVNTUx%K9&D7Q_{6)vTb+Be~YvgwR)z1vWTEYnyARpZLr1Pmyz_UAxs?FIcM7NU=eif4WULaSO2-Ir!=q54^) z|Dxk5QI+!<0CQBn8*z4)93P$&pxO93ZjsY|(7%-OVW4~NF-k1vI;_8iorA_dBOFMJ zO=0488)C~StV*CX<2KFMw>o8KNqP&S+n`0(58NCM-Z!g?vAucok{N+0&Qu2B8*k`^ zzNcTB*oVsNLyeM`O#+;mUX(sv`LIs2L?*3b@tptV-T%a4wsmmew((vWpIAwGgVSrS zW5u&tTc#mEx0K#m!rzdZ(a7X*v*O4~W+dB)_v#$9qI%)St}vM%8hin=%%#lUo4U(4 zqEm%TL#wUXv>O!{8PV+5McYx+upXUDX{=b4>|Eb3d4#m#E%w!@U#;DeYlD#>tlV*p zmT>GFT@fx4_&lzPRXltBg^<(ETrml~_X~gPXE1<2_%54^bNwU-KS9jW#lQV?%hS9_3)8!32QF~icC43uvc|*urv%GCyoZ6#mrxYa5g!XsYs@eWt zWM&ILaGK-sh>pAPHrDXDEBKHv2!Vtw_gePLtzU@vS}JEc2VpHOzKHw};^a*jV)BY^dXo(!DVE%7-I#yBlonSpJQ)^&Q zD5lq4k+={?(cN^{0FeAzg7J9p;$S3A%IQ-vv2hvN#auwS~oB;n>IAhaG1~G59Kjw0TEqHrf-spn$MtGND}+6 zcUzQE-1_5re$KnSPJ#Undu4knGt%sfVx0-bi6yH$(Z#?5unlGSmekw4xE^VLalpld zdh$v)3IV%Gn|3ZRnel;RoIyp6v2*NQ=3XYS&j*VEoYTa$$7noyA63YHol#78M4) zSjoyx)LxrgFFsYTCEppTq{+h)>{k5Fr<*~gW~Kgg1hGEr{g1E89VK>83_Or>BaULw4x*_Ygqm>EnbH!f;WC9m_L=y zhabbK9{Bd!N5_$PvDYI|-<1{@83Rk!NojuC$UFl|nZ3kp!Y4_p#AQp$I^_EHjzWFx z!=AzqJPg9oChwGzDbAS@$1skC8F{_4pIdwMl5&rWM}~xc^?BS`H1p6hFD0EKE5pX; z_;H#aw^Tty!hlcl!S=dqpiAQ`_PWIB8v!?`wCLmBYN%Stz($AvqI(4ox?8U$hl;Eq zj&ILC35C<3VJ;w@Y+3iE#~WIY&CdiLN9;zgRDAp9R}#MR(L#? 
zXn{DZSL_rAsu(J7otHQ9OCLUGxam5zzi1ys6CrkL)Z?CHP;;DL*uF^6=Z`^s9hP+# zv@THO>1Gh5y6@Y6py&oJ3|?4Qy)j~BwW{?qVwsD!$ul8P9PQTn;jrXQ#I$(V8uq4s z(}Nu*xsq+AIYRXtcb=c4-#1k*fnU_KlV4IUH|yT z4(Ga6JyM{Nxn9+eh|xN?=CZ^BaF61U!|2m%sDAhKvv%bv@#1k2+gVVFLh5$jh$b z(;MX)V0vRze>t`^*LOopcuz}o!ndF6a7MC8kvXUK3+I=))ZsBomCg=2Y4s&<`|AhG z>7RfbTpYoXnt^DJ(%99)e2d+0^(!ojEi#dr?mIm%ci}S+`@7B0kLaQ`SbIGoL-xGt z%x?uvEFQb^!XZ|#u&vhi8jjh5`P89d^sB)jzV@DQ+G`5{SuDES9%iVu$I%RQ^UgHV9@B!d)_%#!KyO(mQcg^|= zbgF-^Vt6F0v1IHog%`niJ{~RWg^3hzQr25X8kbvi%x;-7(>>8{bE=O z$;wsFTaU_h@KZ?p9d4mKTtQZl-9CoU)1777dGm(u_0FL zd6|}E-3+$VzH+m@#aO3H;!uL%O#KF9_fcEe3B^J=H`ntV{TymJ)rDn2hlw&hIi~Di zf=cf4uSQTRGaDMkv5vfrf{L#nCTkTqIL%(G&6`>7-Dkq)uAWR!7(LpZvhNh+BL?H0 zp5j3tOQLp?f0?(%z4mFXaDO)eIXl4^`B`>4+NtSPWh=aM+0}k&)vcDUdN@p*9<$FUV-JCZq=Lzrfu5X*cIKX)2hmUHl+QaIdUS#-j+!j#UCYYyk1vx%)U}_+G3-dh0&?c*^<#@f|ov7Rl+er<;;u;%+$&xsKMy^jE`ob(y z{QgLONqUq@vhqZCdv1E(CZZKNt%t1+e~qoiHv)R08>tdIkmRxHG2wfr_Yd4|jks@G zEvhP0cy0!M^TB9OCJLONpCn1h?bV{^CJ#J)GV6`9>CkOeEmhFN(r#m|a-nwnC~q z;gza4Bb9Pea7$v{!$O_U#If=g=)@=2;i*(bA8jKc9@jwDnPQOPC$r%jUSUG9*HH@* zo?4LC5oP&q0{%A{Prj8qzD0H= znEk<^WAVbsLWI+7P}yPPv81Z~l)HvR!k1U`5Ogn%DIVxP92DR({slG%=)wNY`S|A`S}&jln&!w0zZBQq&uyn|NObA z!p1{pt%>jvU+{sDX?*D-eU!@qe8iZlYctNgZ&w3!qr(6Bwg2CEE&{U?6V-6h2zL}Q z8h_)pWQdjK`XC*hjA{4+9g|A_>q`pPQE34H2`WGsfBHwV%$$XZav#T?l8imW% zS?9sRV8O}*qsK4P`@Z)7?ct;T2X_B|`-7{KxY-G0jNaH@1Sc`pP@qUyJd@5gL4#b{ zksVn|TAgDe3voI5to0%_%#S}&Q{&-%&H^NG5|tIjxcdTFja?N$3^`JM4d z%M{h$nfVTWxya)6@u|l6yEBt>8fER`{DoPOLE_X`uYkG9zfYwUk{X~#OvCO5!vtxw z7;o*p>O@ypLFMk>9d5EsPw%``rr&tPxB@*7n=&)nDHX^I4^WoZ6vix3l= z=Xbxh+I{u2m4c;?KFPMJyXqw*FiAMYwLH%ZllIYEtU^j+LUVjdic43go}B2v=*NFT zO{|kSo5U|DG3uBlJT{~cP!E7UvCV&M52KiXB03iQ5eBM`9XMGx?j`S88U6U!QHUV` z5yW(^hK`MZk|67xkt=O{6CUA|atUGn;G=$!5Q0S})O$c3isA zI&#J1V5I?#MYQjpp%{-W*W!SDajZ(2tkqtBHFFzl2mq^<;MNpE%fDD6l zy74XbySZz#<#V{7(|tgk0+@ej4^%hxZ3Zw?Qh@<=t1>X)xz6C!s)KIxGl2W;&>we6 z9BHFj>ILg4=9!{LsN&F7am=l-;U;32d|Hx`($0+1_hv{rkzF>Cx zsQ`1bVZdYD#p;FAxW@1jbH}@JmaU@kuW4>)Q>r=Ezwit`m?WTryaq226WjI!v)asx z$MdeeVQCwb0x-D@xFWv&?@M>$W0f~*2JCEsrcjAc7!X}_JE}a96m8ts!?7(epwB`W z2K08g;oZ781U2}-TZ*Oq_Cq@(Flb9m>yv9dLs1Rkpy4)f7Ol{R)QdLa^7OWeX<| zri*`nzm54rlT#GT`0sxDeyi_t=Ck|RH{w8DfWOry#@)&lM|hlbU=<=yIbvP6iZR!; z`ISi>1-6d4KsCvEExlD*g!^;66Lw)W}bKpV=6`sPy87{k1#X;VQ=PJ&j#1r>$TQ zP$wEkb`QGyOLCkgvwO!xG@?Qmn)s+c@?Ta}|FleGzwuq^ZV0H@M~(<%nd7Kl3|u*f z*nA)^w~(`1eT?d&Kz@wKNg8`q4sTzpPl`UFcIi1xw}( zvRZKYiXgmTyaV(pieZVG{O*^dMb2b>(|-p(8+cb4tgAEtp`40ZBJCSzoC2u;ASIZ6 z8by@|#ezd>eJgO(Di;wtJufLfU-ymK#32a7v@#}BU)}%9dWS@Uco)%;7(SL#pg-GK z1jJ@JXA&L}T%_!TjZ$kNB}usEYyK!Z)Ho@lcw|gR<2qi-hxPtN$5relUshfoM3{Cz z=b6Z!ZRgKXqK0st*fEXZ%Jps`9ks>RLLr z6YOe7rGD?1hVlH&+I54-s~G|@zZXjliw%q%m8yZ{k}G52 zA02&MNr2zm6ZDH~iN#aufetrY)nynBcR$R$Rnc@D$&Axl>xoSE)-zn%<%V^UxG2N3 zL?xp@S?{Y;Ua}8aGaJ?NNxO1O+{m)Lb_wf;pci!vd5LZs@Lj-X{ENPorA6|`v~5?l zjhprMi0q#hUm75rS>W5-U^{q3;z_319*oNGngQ_cRHYw06!7I)5D)>1E}0Lz>h>3z zNIIf=U2EcED*Gaq;kN~=jnFZ z&)sT3FKCe-#xmApRn#xrw2s9)CZlKdaNyOfqy4v8bY9_x2Lc2e253INiw;c>O zd{pS}ooAl+R`yX_;E4_os9JymHHjjzuo>osVC$vU6{v^YTe~)MFS1cLDvRUezP%V`IN6gNW^`&7K zCHnr=#*2;An0kM%<50cP1M;T_^Eb+rV;bj;??rvs;-NYns77G85{HXMl1G<&YCdp& zcyik=Ywb|E94p;IW(`gkaJ~izWchm1ocqXcL)~t2K~lw|lT@@+a4JtCBy!`gw}{^| z6Y6bSv^UEt)SuIH%PlJ`(4>zKIRhBe+dRpvxT)%aM{V`ufDWomcVKqtFKOZKYZDCg zqCY?4RKuP@g6CnKG|6ZUzL#}2ZXYs|>^>s%^{)ZU^KTa*Y3AY?3f`a^(hX6rrF?+- zAXN_#oq__+<-t1VL}JzXs>38DmuNyK-#muX%|Z83o;drh5!_}=xyUxfz^U5(vzp~n zkAccn3o(GWvxCaW&(oC1VDtjjn>31b8e5bV*e&NR=Z_hwpLr*q-vnsYZi$WRTfN_J zHc`EfTS-~mWu5nNg&2${;=j&2w4UGS2{d{qG-G^tA?LzalIMmthry?E%~j(JvIyl~ zsGfT(Dk$AY7OaiFi5<8F|Rtw{hBnlu0f%say)7Sh*axCUIw@6LYo`$u(r 
zz#9F5;pYw2oWeF~w4GjraW*w|Vvc^Kr$z!vCWb?EE@!tO$^;UX5+eaZ^7Zd~y(~!N08(**PP5>bviwz{p6` zFw!YGg3*Nn+ls5ZPNewoCFMZ%KgR>$lH$>L36~gB53;_!%e?+;9p@gyhfR$Ro>=}$ zF1=Pxv0q?iBGTG#C>uVn)?vSPhhFT4?cu~`2=C5a!3B(+x}Ve+3t#6vo0X-A@8A86 zMEmDR)YT{-*#(7G6oX@U@%r0X3#d9$+&oFSl~-l#-dWa086`e~+4anLo@TSNT}8$+ z#l9ZnadG=+k5`xAO{2sNg0qe0&*)yW3PjK8rvU|--B&D1wBv<8Hwk95|3YNs7oZ-< zUB${}`0#sK$QhXdyQ@W&f%-)*zp_TU-rL)mihPajZ>YPCP#sAt($0UT(j~kUdA{<< zjgI*|kVp)k`;ez@c4ocbBbnGO2Jv^EJmrxth3iOL0<+8$ZHyVWF!(KNw66(UPwTA7 zX^{5XT1k?++)LjIU@sd2e{E$OIJt80;w-wwH*CvmDr-xl zk>i{;<`76d8yHqw8Xpof$kM0?FEWyysP^=eUczx--1I>4vTs!HU6~UaOWmjRgcp|H zAI@?OhPSj7`MyM`N%l)%MYpeO_Z?#v=9?ryA&Ie0q~fU!v(}6#UhdaVg#x#eewUv= z$M61`Q53AMuWodfEl==~gpl%E)skyjg{HGW8Dh?F--qNy+vhi4bwuLgh<ZwO%4jV(ODwi^vPKF5|uWL_wDEzRR)3bB7Ao2VLdv zG#xz%-0YD;+aV&VR@pl3JpO>4ZUG`?-!bdXT+7e8Uun&aAE@TBbkvX>2ehpUDMhf? zldvB61u0AQ-U&0Y4coL7P>>L$$j5_lZ{$kMHM~`w$Z+&y`IxYfK;_OGk4V>54+&lu z)_>Rd)J@Sc+uUG)eYLec@(oZK^RaetO9I^ zv8T9R6$-658K-G^j5;^&{<*V8BI01zXnhycr7){_P>9@x;gtptkv5JFGa;eC)d%$j zHr*%r<5HvJCjtbCOt}HKYU3;#>u4k6!fJe7oVIRRUT4lfS^8DF-NcYeJ*DohDV6!% zUtK`YC6C=kRBdAzSy!D_2XzCo{tEnwQe_=1RjJ*KB!%>X-b z9DOd}_LP0P)Zw|#eV4qaMyBstzJC}P6OaYUe%be%7a*rE-N@0lz?r2J?AnO+mLeL3 z#(TzSpvA(q22yozyao7HfdnT5KedrOz0Ea*5_kGR>GknSADsTId@F-rN9dgwlPMb~k^ysQiSdlk zV8y*=Q;=wMpMDq7<5dj4eP^1L~M4BjqbRsHU-~oit zqaq+gn)H?^3J3xb5D^q2y%UgLL$A_%@4Y3|K#Je;nVEOb%(wTkXTClA{q}pj&yRbB zIdANIo%Y!sisBz1U=UDf2n8s->Wp1@rX#yolh&vm|`#t`` z`AKO`dWMlbePOoxkzcvu))3a7y9ajo?Y9d)@fVoq&Oso;b?$eB-=tC*SR8Q6e zG%`)w-c8un{+dOmki-kfY0&Uyzh_JYU@P6xftzG_9i{>2fQ!a5Z{IXSnlR~N zsKpepoe0&r>cH4cyTaVO7+cAOA5It8##97Gu3xJ@tb-LwHHtB7M8TaI8ArZeIHYoI zL>8Wr=_W41_bR04gHJhMSLQkydV-+8+BkP`q_=ex0V5?`Fc+NyVUR?c{oul^>Rp>&-g6_xzLi5Bl}f-don5#f>~q}c8@9B`7y|aIDZb|u1|@sSk2z2{Aydaf|m6{HR`-gU?9ZOAsuUbE$-cE z@o>1Oo&9yQ?UeY!-ec#eP;Pj0(ex;Do%^uApC+`r`n3_$dD@G2DHk8p@*IvA9Y3up zOez7#AU*Y9nl`Gl1#QAU$C)dSBd?$ns3W4`Q3QbByVIc^vNSPQrkj8 z%8pkq`}Mq(+$8yO^-}j=7A^M5Uqd#3{QYn6)xY{$XxX_*W`5%^*LjmNnpcaRX223w zA${d`leqCC@5wLTZ7AZ3h#|NS_s#(?*fKh6L7Nx|0NyW{>wRC<^R5JbAO!M zKX`3*b>#2lx4F1nZ32k?ldT2@x&Y!0qS1$)A_3f!Ey-O-fG_QBrL>n(Q#TAX!<#I( z3XT(UH)M}x9E|mu6^-l%4{l*BBnC%U0S+tn`8xQV1ceKCoBtrIT-~9}Q?&curJ``i z_6P5U1!$djT6QMfW7jUO`bSKR$2p)O?b9DEYe=OY25K8XxOH*VTtvd28Uq~!^34yj z7HdYW+`Q?Hc1kw&Ew9-OzimP{d?8@7g!qdmY6((aQBekDOeoL*o6)L2X$g28{3S zyy(oiMYF;FL#}oXFYnY%!;CO>f}s6lV4=MZ(lly zC0tatW;=Q++gW}|5)(^q+Z5q}oH)N&UlUBG#Hf;AYC90T;c zLYd%o&UivB@Xh6?U(c?)C^WW+c=M@GE;!EWPEe@$ULg%AT%L>?1eaMKM+RGTl@)nm zmnksLn#v_&{2Ry}#Na(#>wL)6Kp*glo$)Zra8JkMSd2iE9X-prly;!%1|*esdw)5s z|9$?!|Mbs>x;E}gay+8!WkBw*Y83CWPUnoGj-N9wb_#X%IJQ>6r&LRNm058#J%0n4 z`UVkQ0X7Pz(VpD4!AQ|+gyi~p!2}Q!wAceHJRIp&s&4At!|Q zTPUPY~hj zQqwPR!$E`Xqem{2;#3XlA3}79$w2ndb9j0mAS%Lynms=sKS+Vh#TeBjhD^K$tFsu9Vra#5MyN9loL%89n{kDmY zdpH=m5qU6Ha|Q!uC8>B4Fmt!}9S?NN6Xs{85~5nolw4fcpDo)R1>`Gy*Ye{9VC**J zj;cLmeQW(E=pH6@y+0Ah3k)9MH|PVYn}7HxQ0<})1}e*umpu`6Dc7brPDeB6d+}RY z?q9jIgL%KcfgIPQs7^9c>fQidRn=jf(6C^_==@W}9YhnN`;d8p#DOb{VR-h$?zams z-CKBtP~Uz79k6Z~%!3PCND(L?#2*G)QmhPPY%nG<^(bKY1J1X|5wKS#3~U#5zme@X zy3K1=ba6-gD-%m(hV$N&9keAZ(vA3jUWaxyK?YZnJga2}{=RmLt&L0f_1SQR-d9hy zWH24w7Z5*?mn1~>G&w|h)(*xGnp|G2=(apH_$=un-oE;_>5mX6?rXuplWQy)$gB{u3nm<50~uX@gu9mF>p1VGwy(>f3bza8 zyE9t{^>E|YjBbCG@`|>tt(M4LaWmgJJYD4?!>s{e?X1e)`~PCF|MDKJ4AR~V5F>)k zZg~>v0Wl+cEkqb}S#>i9GM9~;_a0S)2nv6>guDG~%w>^%$zQMw!30Z6IpB7rUKn2B zLewBSr*|HZfJ~yDhIy$3Mbe_L#P|J3Y|TNNnj1 zYWEBCJ7LxmCNGfY2{i6P1xPv}Ix-q9`P_boHC0Xol`U1%h2U+u1JtHOd2wsi%Lg`t zVtd`8V|2zB$GxpfV_9b8wV0VJ2Lyln2{K@{X#^DHn>y(qMK*V+$yS$;3(Ur4S)50r zt+e?74EYZhjaKqBfr#h`s?8IvEKP5D=j1 
zNLDdhzyGJ>hBuIGb{YPUg1`UD>x%ct-}mL#b%MOD(Jy{{fZ5Rr@XUy}aC}ep_8)P~ z3L*7i#vt#(3?hnxp5n{(hcEpS>^mfL2$PH2uL7RR`P1reXv;JqZ0*2>ux9uWQDO1=LbRn*w`$9O?Y-Vx%rv_4b*( z!@NyZ3M8%_pyp(g&XTf#5C5}}a~4i$3y%$o0i@)>N0;G;blCG0?i)ZiRDung5;JV?!ovOFW^|ZuZ6o^D7tupxg>bX0Kz6}g#RaIlq9yPvnSmYj zI#m_HxV!Nu2+%qVwFTsge*^kLz(T-u*o6#F2&I7?N%_c+7#>c1Gnx>{Xbsp|cdpm} z(_NOooSy|fQ5)IDfEVinHV}>*hCtG`6V#{dy=%jRFJ zOJWapban(e+z2~n!%W7JuJu9|Za)Ep)|ikhPqFUs-r9WyPDa4$2v=?Yd#h0nz#q2X zuLF>NH5(-27aJ#Q7)4+@Sl7l*1q7|7A4BsEdq%braey20=*3^$mcM>vU8F$3tA=4~ zp%7k+L<<`{2yxI#R0p(NhNMxi;{kSCa@Xp7syM*JiRE3P2B3@oF|oxzI{W|lzyCS$ z;Qy)jLNHf5tyqEzuBnC;1KRr%3ssQ#S>s~sK~nh~a;pN}IvB<8+PkN~qH9sNllZ@O z1Xkk1^@{C4#3=&S0ynr(;ht_~7kS^A-abH)j3I?f&EOPjkRvYisj`sC7+s^yAXfdpl|iLP@_9!cXQv zmxni5@nIxQq}Xec6ojPl0)0s@ac=1u#Fc4C0`f*m;bh$2_o@=ltg49d?2RYy@-mw@ z>IkFwzQdMMae_}n-oX@iaK4Vvp_R@QVpc8-A=-74w5ZH_oX1S7{gzeMuJvdwnsZvh zzgWWlk^!C}^)xEUq4VgD+oIYuztc?H2;x@+a|ETSx& zWot+3G30j7N%&sJEUF&OKoakxbf;f0AF$Rv_|jsger3Jbd-JaM_2Wn#$)%;ENqG_b zs!C*R4ab=^2=5OHGo{O0?CmEmmu7nI%$&30jVgee;5Y~s-*Sio?QuRM%P6xRrQPPx zt2$b%ESyZ&xM}zfSa|LKe0{`!kiPnByZ$fvJ2Y!ex7O+jpvwD3QyJu2h-W^qz)c5V zi+P0eoxflG;nQ=Ctzvh@5E&GODq^L0XQ-4Y?`zb{XGLkI^t^M8?0fK0s`n*$R~MG) z3LRgx;IU>i%D9sjv+H=cLLnyD`(}NdJV2-}C=qn=lG=;`#bWhC_W{?V4_Ti{hy}v$ zymTP#28dkXC6DpSK3F`fGh}xO;v<92pCth*bO0AYAAg$?V6FqmC|Laj%>ldR-@O9g zp9_J^3C$swe;^LPBTf`+1n>=i|M3$4?iYgp@_mdO(#YG8D?dTAk4O&zS_xGDC#cpK zNr;2|{R?EzBorZ}41Kf#? z=wS4o&>E6JF9s<7tNz_{7=}d`5JG@5{{$&o9_CXTf_883fGMmxjortrZV(bRknjmk z{=4Ux;gIV9W}Y4V6NJ7+96>&5IEp=D1Zug`=WNJy6==%Ij-Q~$-oJlgc)f`Q*=PXx z`Q4w%8<0P4)*m*J@OOs zG(rfb#`~t>QkD*BAJqo_`bF6X0swZ;wpr~iq)^1G->a9b_1=4MNHflR2;cAU6W+T3 zl%o;8pfb5l>~8LAY~%ZJlIc-;+FKj*q)RgpdIFUur5)G#xm2Yt^8T)-FymZAH{UVC zoBWfNbb$QA`f$8)MKs^IPlHGbE6rtmPcLGn?r-8oGyqJ3f$$u>${tZ`n zYS>Rh_p;{|>~_IGbNtsDD#S@GT_TrN`R64>zf# z+nT)DLhG~{?#N_5DYx>8lh9T63NrdLU+~0if-oDNdYK69ZQnwcvy<_Tq);jJBcKA8 z+9f$7hkH355*!RjaPa(NSG&BMiGZ!2Vk)_W5E!&CYW*VTdt^XApP?)u;`d78Oa7OF z-d>AKKSAe&ahhE_klAZmQ&{Wr@t+_+FC#L1Kwf5^k>0arWD+QSDH)hIbJ*dBKI_(6 zV*omm&}lgP+!Acs=!y)vp7oDU#0-ZUqMKKIzGPlxv*d`K^v`-d0d!BCRlG({Q#|i! 
z2Q87U8(2nV88zF0-=alSao(|7qNCz*riIhjF=6R|aO#=d_x6l>5q_|?`$x1KM?*^} z;uBxd0e&Ml4!`>k+v>cY81qsMA8G@IIN?KZCWa+oEDshzA+uOkf@3qg=b-$pOr=bd zW$Ia{anrmZu1E8QAo#c3s5=k;?V45*o2FT7xII}zEv^rXZVFH--o0jwA3xL$bWUgt z)w35h`l4y*eU24R$pr*{6pNzg`doAGk;GK3!dV>C%t{oqguj-OOI)qd_QL@Vyg?AJ z8xiB+t&c#QjWQHtW>^Bo@`Nk&l_$qRG{JJd+JuXt%O~v4@j^hL+{n-^Ve~9i)osG( zq>2ueK*-_o&~3dz_Ga5P|36*pSiXoR4o%@x52#~5&wyn3qI&2+eysS?#_h3XCEQ*g zAb-BuZ(0=^4KJs?7}7xeqGpTumUVb)xK(z0lN$d9_PN7(&h(42Oucc$i!242k1XiM zsBN>sTTbC$z5+)R2Sh_JEc7CnT~TDUPo#t*I`BEOmD#Lnosg6InxyQ5@ds2kbvSJG z__%L>4>BuhofLU&YIb6Rl-%*c`-R!@KnusEa;D~{Ekv}gg&Xd0F5SQ8+DP$q0Omo| zgA;LMy5Hx3GE*;cPe2A+-C`7@oR`uMRW$(=FNAk72Sb2JL2?6fe_4zYis{uLt&@wf z3(s-WWG`S@BnzVfRQ}hP^Cr8JW04C%1o)~40L;5U7t9eAh#%OU`y|OdfC150p8X4w zkD|sGP679Julu)uyTkth@PWCG_W+vT1WeIJU-i1tmy9}IZL6V85m-chpu6pAw!qx1#KT!#B~$L0i7E|}679}eNeUSl`Tcre z1PG;gkGll*#FmV;cf%f~R*^%!Y{KpB;_3ohNed1yxW9qmi`|TrF%P|U-Ih5=#wLas zVQQO_h&ZE<=&Ji?6%|fTuNrGt0ZJ;N>&-Vh(SE%K8(W*0O`e*<4_b~d-}5) zfnITDchN6}?6;N0Nhp7;Jtcu2=uCm4j6>^K53!rr{yO9#6VTSnj&QZg7mB|;KZkwSJqQ~Wqo{4B zll>-@s#E|J~Vldi4wJ2cthX~Tr(ch}&jD+j!KF6FGK-zY{ z@pi}=dsxx`(9#|yakEVX1h9g*8YW*VtdH+i>uV3~5hU%eF3EMsr82S<+J z&PUu*;=tq>y{}{C_yn@Wp{lgzL84hQyhwab0LQu44 zh)_sN3yP3()+DKj>$r#ckn5NkKPZjVHSXr*Q|+uC(2CDlL%u`jxVh9av>w4*a% z0V^8?^HAPXZsqQUd77hANK`!^JJ9oPUzHNE2&N>n;eFY7LO5(C#Pnb5nIr?$V zMySB#&hH0L)y>qOAl@w?Py86DTT4#CgbPpq(!mT-jP$X9b=WRDScRod{&m7b3K?pBU0m$9OEWq`6ha`}_UDuVAg@GP-J-Qmp zLoCquzKR4QZhlS~*XL^5ChB&vO%qk}7g0z*moD@H0za5C~O}Ul=)M5T^{{5ZX00uQ4eG$B$;3B9{tkh1UaEOxAN zR^Z)@i$iNm%gXMxr;pEP=%f+il);8P&|X96kV$k3^w_P@!>~W&`C{;!vW zBRjzt?P1EIv39@MZa~qA2_Ca)3DhXBM?E(*ES3H7Dcmy7)bBmr_pjpFw>mzk{&we> zSywhh?z@1ls=J@2ve#h)I=F_FkeGa@U^aNejZ5pPCc^X$I)psY3T z7>uALc-bCcSl1#@1P@|)~_{#BbLBtv}B6b@i2TpyAcQSv-eS<#p*jU;3 z)~wFBrcQ#hrj692SWwfCa%v<(M=V(IL4j3{DQTs%aKiQ`*uNF=%%Rd; z`?BfiE6H5hf=@q&;*C12U5j*;{{x*Z1E1_lO*ZzWgJ-1yio30KEPm>9<+O-vBq{&< zXDPycmLHJxu)B;47E!hi*ZL+l*LO<~hGB$EdGV=|C{()=%-NK5A4fg@ZqrvQ?cwq_ z%@H<=()uj0NouRv43Y$0sB$9O>|KgK`lmQ*zje#=)vvnT8DiUS)97}5M3h3dG2>du zE?|5$8;6ftAzneg*MG zK%{=2ajbvNB0IMe@^;Iz*R|;7m3K-k-^HowqEn$q@76AXTn`~o#6L2#WoCmz(L%qD z@ezB)K-nJ6U0oP3se#t3*5xzto&?_7G-=1&2#c+?2gVQWdCnFz=SFw+(O;y=0NF?` zqt77iu&e-#q-mSvt5z05)WSwQB^rlI^Of328(OkGxV60f>=}bgGoxH%ljg@|&(E1J z(Mbc)cIdzOnr@dU(0esNUa{z~Qm58?eiprfh}DH9Oo7S8Zg0z!@z9{haa0WoA5k2bGIjbl zdlQPq`z;OAp@@@&*vmTSU~~A>%SB(kpmcKau~%Mg{g%(w`AE3lR6Nl_d^90hnRN-t zR3qvsHtQET;Tg9Q)&YEufOmkhO-(hyvh+OzFIB?^OF8*}l>3F;9XJrwVX2&NRpAi|JQ(1%@ z#h3$oB9kpc@2HMP_V=&Ir11~sKSAEs?|CPz%eD#j5UQik>c z4=t|JXC9cd?m7knt!=yCdXrY|-Q1j$)ksu63O``6BtVx9WCm9?T~jn7^?c6gII2{&n~9~(7R>FX3G4=P_47$E+xD7JW! 
zT&Tz$2Hkk0Il$iHK3%kllJ1i7^4TrLn$4GZX2ct{xIb8HmMdI878bZSEP=kTda+_F z=w+$9R0dzjqm_0e!}Zn83&(&YL%id`@Sue|>>R=FOd{AVZ^@R87#+$3)@oTdTf}T8;%WIseUfco zHtS4=*@PABi(_|kxNmU%07iFTobZC!Da42(-7sHBN7I4c$|Z(-N8gSPM&+O7pBTpL z(YU}&jHgWOE*8Ly9>(0+&G1|Zdy|~~k}kiM=Qw1Hl~**jNJzz0ypu)O&`8&~qPZaL zIlK8#an6KjYV=2IV=I@PEDQ{j&ly#_gq2eS=ztc2gWLT11ovdseyGq5K4qUw&h6a6{*LFld)EK#+GAS z8Xf$NRQvGWsfWos=R%f_cQ#!7yxbuUE0}hjvxq_1#iG+;V%t}SHv2DYmIZEBFI$d?XDXc5ve#jzeg6Bw4-8at zDD|*FCFdDgR%_^R&B=4QX6Wy$1((^xrYw7PU7C&G>2ASy#e=R#r=%iO z<71zEj1X6+o0mSda9&lq?VzGhT+5tYMLiqv9fd;Zu4gR83411ZO&~o_YIIx ze?<*aSN4R2Kx}p1DqeDbz5hB(yV^1z=<%_j^bUaAk9V(;ixA*o|Gf8EY}hJ8(DY&V z7u3*qx0zCF?W&6##ET(EyY=Xz!+HqI4^dOyDAo9IR<~2<0f@UKq??Uo1npsJxU5oJ2?spH&v8Qm!rVGmINejlFl1Pp|m zo|_|a<%jC3k-VAQXJDXHU7<|~26Cai6!PjAVO%Xn!&;iCTd-tNbu50U-Qd1plG{n_2*5UvIYGB~M4(|0av~*BQP4&c9hZ*D1Gf=m5)IU-cXl z+vLk9y&~^GwlZZPuP`U^Vw;Y*Y~^Y0$YGO(hSvw4q0Y;bOC!!(tXcee!tM{UlpVS= zo?|+vFHlBnZr4m-3fS|}61rXM_1>&qp=CmekeP=9uE3^^&p~gJui-~c)&|P9uG-*P zWz=LeFX+Xd)n%?a&d^WSE)_&J)*mST1SKVwl$TbOhHv#5B+!18RSUh$W%P`d*Z<6l zveqwW_{!Bo|q12@ZUv#Gc6pxjASP$X5cK>Y?Lx%KS7$Ca~y<7u`>$b5j52fQ!46Tw3!f>Y2$QOkFP%cHL-Lh?%q zvigGasa}zauS~8d{qwb)e<=<53byoF=H{D?8LD2mm!hVS8l&8|8u8HUr!EFtOOwXlbKPSb~-jg~(_)|f66GBqG9 z=cH9SuU<XsVn5{z?({fKw#*K6lTm?X37dC#M6o3QLSezfz7r`~QkbOf#S&NuRhX z4D2H+)pJ$1p&V|eKu$(01pqz5e9AcvQ@oA>Tf2aFuXF=ZeMmlleEl&=|4*Hy|I+go zD&Aboi?>ls^x+Ad8oTpJ)gk1a)Gfu@v;7*!Jc@clmHg^5MW_=CF3z~prjV8RS@esz1Q<2@(59~yJR z$c|2I&KtYldi28p;WhmY>Wm}SvM)En&MPNA? zOh|@mXy4<{+-c=*dLuOuhtI;}S9v1^`k{{|Eb3TNO>Bp*VvAm`Xd14gFnF#4gWfw>T zBdwX2U9v;E^W~#p$j(-dEC@1WiJ9D}C0#OO3pGHCXrEOP>i#6)JTDf^eFwikZtj_F z7>Cz%F!HeXhiFCIIPd2iH4!(vwuMf&W3OH?agVkaT^v|`;e0Ppo%2{IgKxOsf{`4l z$nW|3!m@>@yn+L1&Sj|^pxr!7ZUw8jDGoL!3sy@?C(~-JX6`hx9!owUvU=W=zPE2TLb5v;QQ0F z9*usybEvnQB+XRRwTGYc?Atn&WM7xlZZRn7PcZc?FgAr82OVhRQz%@7Sa8sS%YoSW z<{t6oMP`+=5sC_)&wz}m!4tT2`y^fckaKheIMY>E*}El&^-B(%-Qsnc8C-U7z}?+2 z#KQQQoWUb+G~$(;!0S%aii*q?l+BIyq; z6nu}!B}rLbA&OA(cJjWKNUmh+;b!4+8ca$y>ceqa+068%B9V;2FH81h6Y5UUb!h4MM=mM*^&!arYVUf1dqUZci>25By zv*iN?9iqN1`1RTf#sVHr@#RMzCp4*3S`T)Q~;C`+asiSgIamW^yw(L1jh!BV)sbR)YZ|{TYNIZ}=W-B0o zHmtZQSC~vysc->^ZPoVtv64Wt3nir}IN;z*Gs}@%!J7P{BJMU&S@$w?@jgR=;EZLU z{LY=>-sxh2#&UV48Ru`S`k#v(&tD8MX5f}{Yl)lZqHR^8;CUlppIGbOV;r41*(weW z&+A&e$Fh?X$dK|r zKIq%{P*LU^TXVU%M|Bc;@!j2IM)M(g$*l2p?{gEMphG$AGM3UjKS5U+?=dKGGKG-c z2wZpx(j%ON?B)cVAsNnVYH{@>bLU+)@iz&Vjlh->V_4oTU&%RP1y$nJZOw{6qWkof z8MhdZwEElVdMrS$vYHO6*nUN#bruNbfoo1ao!>k;N$^ZM!%7wY!;8-i=&{_&czUtZ z;Z0q+i)rFsF~W^9HoYU^=}4(s1n|}9V?kIvvZRz0#?1jht}@6yE4QbV)4ztV4lE0M zvPu`20yKzjZt$?LrXN?p$E(}oYToGL&Vb*J+ zudl*sA$<2#Z8={_ciiTT>Q?#+qKUtU;eD~2F}PrlhLP@(3nbUqgGCUWW7obsFF(1p zc&ZSaEMjy0&FC9+y2g=@86g7T9b9<0yAJFz6=H)X9!#htK{}fJlVx$0mT7CUDbqIn zqE?kL#1F}8(~cZiEhC&tYGP z8cr>pyFF+A-Q1*vWqSNtTNF9-Dl9rEuQr9@Rf6zb2U$$7FLg&!PCaJ<(jZ~a;k;4H zp8P;qL+)mPt@29uWT&t}zgws5G_>!A-d6kOsR+%>6M5;0G^F{|=g=jAjcR=0d_6bs z`#Sc;I$7=l9yJec(Y&v-YJ0|v?G0&a4l>=VlimY%nb`@&>{)g|V#YA-e-*I)32NEx z-lWBU^cBI)8Lw>Q&T%~w9~V6F_EtE%JIvu7BX#w?2z{B1?C&;`tnM#-4G6n4-2rwl zFh*;#9XA#VUWGL%oXztL)2T7qr&C6_ksc6^@XT|3eX3rFF^9NOdaLK1oTfUqXXH{8 zE5j8)81Aa7$^nNZEH-xQh686CU*s(t56j3OI($(EulQ!5QBY<JiC$>TSk z39JK+w*$FfjxwK>U;GYH$XX0c^h>p9$h3{QU}A2X@3|NpB71YiAn=UjVS4aS(9=q> zf~pz@{Hdf>FC$EZ2BaX8hmPzHPW2VYU8z6^x%pJn8^0R27-{X*y1bB`S-0S%n#jr2 zIv1^>7dV%x@!JGhfgF#JLDv81hc!%HwbEg-EeNz`JGsWx5(iqToss5zd1mH=;uj1M z1)L;&KdhzP=&Le#YKlP1s4e#47EM04+QWiQ{c6|O3@r{Mxsa;bYuFlHOzYul7 zI$&>A+CPoeK*F&>$Z-l&DZUGRaeWfBiKhBg^kH{@XPRc@qU2Gs8QZ&;onL<)&_!rk zUpK}^N6;!lKu@t>1YyvJnkj$3_`xz?V2O&D0!@J~wQRh`(-4`PJS z=-n>tf%kh)&ZGRsFO@OVPMRnn8B$yYb;d9(=mZ3FNCD`8RwWPQDgAvd{Bi=)zs 
zX)V5;((Lx8&lKy5?m$IWMM*p*VAj7fsk%bx=1))>?QT2rlp%&etKA|!NT5^r5Mrvx zWCXJXyed)4%yXqownMWY_szat(W&r1aSToB1F=x+zLEy%4VOemqa1|V>wcRsGaWN+ zb91}ea8~HgYKA{_KquO+rD1*`$L`NpZ-=^>VyFmDmpgF#Q5$h;-BOW>3z4#oxrcXW z3(A%ko87W9gbQT{AF+k9__{M-2p50X3uJluTN@hnUYoJ zG_zwziXz*aCP(%~IwN(4vWE3%#i%{gTTGHxxmo*D2W~J(y32?#=;y8H#An(~O9S*tUQWRe^<#t?z`Z^jF<* zIO|c;&6hT(bVUSH#1vrWtZn3VefL%eBVT#!ZUCa1(#77Km~nM=FHul z2T%P5U4gIA9oZbz#&i%gfQ78!#^Lr8)bh-6RoQ0iX3i!6RSd77Hw<#^E?IElxdY?_ zHYJGfy`G#}wk?A)(0n70NbmCjVjsENwTYoP1{%2G6x@tE)u%K3c#ok|Np zS^j|#_LwQh`9cxLtwxQW7X${Z7S=M%8pzSj7E@x~c-6VTWXRqxTi|=Cg39-y5t+sb zk=ii*;F0#8Cfx9&N{7O=@2@Fq_s>wC4Wr)>x*;@pv^jvhtaIlZIuzvCdY_JU&RK`* zj*glLdRcwgG|*l11Mv6+8cBz>wW;c9l}x)9be{&w@o45Gxe+y%*#Qg=2Jjfd)-L>gaB5QSr?SZepiwky#%74ap0Uaymo0LNd&+9@)f^%cgw`U%R= zilGFXtkiCGg)(gHX#h$|ynPS=b%Sfh8^S^_Il(>=B`%iK_i%3R#7f#Im zHbDE$D>$1bP3gyY5r&vMqvt10^VR2Rvi$F#Oz(Zo)yycln~Z4U<`B^U=oU9TPoHUP z8Rk8Lelew&y4JfiRNvNCgmCY*n8f$bDxm5i=SQBGu)mG-w=f}pt9o}nbE@st33=b* zImwpmMH^i*%IVfD(6Q;Wfa7I;sRePm&)3ugGhxz^snRE*PA8tn^%11rHUkA_{aM}F z>Ep{=*zI~`*o*u^yHySmk+BNZT5rDN_`Jd=&PkWp-|9CGHAAq=r#bG-*8m+5+=ktI zD+61kSL2T60fp558L=Rfk0x8I#bJ6>pVP|6zhB$vZB9NBlqsUB8*+TLGQY@-xqLDp zX(qn=CSedwcvK_W0>zq>K}uQH`bC$~p2v;m_d-PX}`HR6QAS;MsNt5F=eHw;hDHLEB-dQ%SDQk1<(QsRVa;vEk>@s9N?(@<6R znzv`b@L=~M<#14(d}O)&=~EbOB_3$^pr{68A%CXQfz8$>f09+A=V8OA1OeQ}lznQ? zil^s<-^zr$l|1;zv-9vQOb2#UXYM{hr-ipZ?roDRmWk;?AVkOO(?+w{AqFHp;0GUF<5 zU07hBmsHT#9l|URO8Uf#iQO=&`rQH^^6fqnwp7AVis$_FxGpsu=)tDVIQevMEGecO zfII|14C;AurHL)>d$(3F*eYrw``DQ)SVC}?+{4=2#>P(!bPg?^(mDd|eLwGGyi3V( z6gtz|$K33yXKxyB-MEZB`K*<6%bcmMbIVng<__Zia0*G3@FkBVikFw&irnNc{7~wi znjgbt6U|F($SSqmP*eOuR3~xwQZBB?*)IpaPm9jUN#$|B(W8FCy!~>y(EUuaNbiL` zOdR~O0Ig|eXw=)>L2$*`#nN=oD3$NSd%?cz!i31@=s~k*&)j(vxlpXRFcC_tY~yiE zy~p?X_vgfZOPIQ;?G&0RwOFBo$;=*2SA3r4BeB@YQvW9op+Wz%57yeh$M)^wT` z{_L~;U0U{t8e7x1VJ?%X1#z8|QSQ*E4(%HS^NbvxmbE5w>j^?Nn!*NhQ28oUXh0Z^ zNJ|f+%j-^GYCPjX4c?-VuSD@gDLelswM!SWI$Y<6X#~0bKu_f)eOK`SUm_St(D<7p z8}(!OB2IzOLQ~?NYQy>a3hE2HW};rv1@^04s2e4s)w~6Txt=H(p1-#me>&AUQ(N0( zLZdHYZ$g_UKgVuWUIu{1SqK#~V+2d}@__|CDlewnSvF}0RCXx$tJRg-(ayMHlH8D^mh(40rK!1=xv_Fa**jn6 zrBGc_fxMEOmF;?;P4LnEqBbnDSWsOm?P;wSUh|c^B}3uNg09d@({Ir-qP;ON47$*R z6ahJl9F_qTyOK;uZ!p8%qm>jjbRc9M5ZF7`6?J=zxb|!K1b78C=Kv(QRJ?;UkY;jg zq4>l5*MCrMhmr*F&jI^k)lGQ??*(LhRFL$LOY7^5BQx`JxikBHfT%@p{&3s=zRiZa zGu~pxdo>1pEG~ox@Uvfod>25$#uSqArSs+8&5W$e6eR43ZpUOF6aEvlVMgI*HN& zp>gfJX=Jhb(k`+@YQjuMs8wxmCze&U$Irz?uR%j&rSB8F9#nN<)3~szvVSY7CO>{N zkLNMIjf2JYNl*6K^Qsw`9-Vivf}t>HTg61&(t)OCkzR+@%c*DLOeA5!hARs19WJlk zUGSyJww3)9kZHDZ+*CMbF(wmkl*0Dx$;eO8{hn}DR!Kn(Mm;;dk5Y+D z?$X?)0Xdx(zE;gR8`EI#F&7h@gxh-m?xR?&v(wDynRlEQrPtJFq2+s*dEEHrcw9bi z#4dk?uG0iytuSwExQo>G&nfIzCi?eJwoag*_hg;eKShjl48dBV-_~3ye%7A@zF)DB zGZcB}_PFcIi}w^?ft^js8tziZA2$%c4G&@q0soUWgTPK1uSbZM!O;w6;?0q2jm5$8 zr}dDFF^4gw23s6W6akXtFybO1PKROLDE6}NxX)WQ^*MnPuQrs}tP~hC>J;6~@E;R- z6U0CAA@+Gw-hEf8i+;uP=7Z9TQaZE?{JrIUUX?yBZEWnQv^%lufFro=d9T47#J^05 zvgrr57~uaArI4W+dyN&HZ8v5ekgk)xF7?ar%k;8}_D{;#7uegErz>`LrcSwFZmw2v zpDxA&;9UJlgq?4Ap``^KW`b%G_1nndf9@-j-m09#E4&oZg67%kB=L|3=QE$WY`>D`fj#>BT9esk||3b8xL(bF};L&ap5#+cj;f^we%`A>7~PY0M(Icf35)NsdR`(7N{SbKMzkrxh6apFkaP~Hwj z7EZrIFxSTQnP1CJ(iHe8pe)Gua3UjG{BGAxwhOfD{&yjYhKN1>C7bPg7B*W4F_|N9 z6Sopf>W|Wk9uYbDtyMsW<4mwFG<)tRXk@rKY7KFD(3jP7u?uyz+d;kSN?F$ecl(+S ziJ_(k^Yr6UQtXBvT0&n@lU%Sb+yMzRG&J1)E=<ki#N2cD-1_l3+D*ML`In!9xpJYpA6SN*<&*!g`) z8ES3g{_65GAAGC6?mWjR5&Uos&0uCX$i_DPXe@6iN_OdCY8B6+N8NI-iqQdU;P>w` z`_U201JG6ETZkq0#Oke_nj2MR!wzQ_S-w1{>AAAYIP7^0O`b$_LZ}^HbbiLIHPq4b zc`ygFBoIS`_>jdBo96V?cfzCnzo;%~`I!^NNzW^Bj_>SA%owvvJ(9+!oE>E@mNFm= 
[GIT binary patch literal data omitted: base85-encoded JPEG payload for the newly added image file, not human-readable]
zhz^?bC+0r)eHra2k)CIz<>-8-t4rbi05ZRYmBe=No;038S{Ma0lVG%qO%rbFxO-W5 zV$VzE@hPimFD9ISx+wG-`GzPPQTuRFFve@a|AGH4t=<7qL|AIg85`WUu@jTByR4M$1L)vNV6!j%=c=h%*XImv4Z)6EhLBWygyS6Q zxbqAYH`jliPro+Er}!hHQnXKASLY^F-diY%W>0Vo+M8t}IkY#C4GrpIU8ZJzro$XP z4;{oPZ`Mp=Y@V+B!HVS<^9Ro~(@dbgrbptK<=0q#D(o+YQ?AYajwX=@mR-E#1w>pA zdg-`djyRT;8jq>OzL0SnmE|D4R5}v=UK8(scR{&teV>50PlYWPbd9agA-J*w4pJPp zcg$rsy7y*-!R%$Au=>j%v@2xTKk+5(yn>Esr3R|Iz6{s!S!G9Nd&8TmL4j|)-#|Ic zWwO3$ByOvDIofr}Z5i!gSZN6eS&m4;oO2dx&|`eY)K+CD-R`XOUWWsC8#Xwy^^~MM z^!@v2J}K&2uOpmW>N4e{XGPNVpDm$k=Y+fDrhSo?T1tf$lthgCehZ02}Q#Jjdh6hjJZwzHA17iaP~IkE=tb z;~X@pyc=|#xc1p?;E`sfjsdX~kLLL25(DXg{C8f*tLTYn`rXz(n(riPRdKfriTaEZ zLFUunbwHp&H;Os-b$fmABlTU1HzBO@A(_`CQ%j-G(s-SZ=abuehgL}rf(Pf zBj_FaCLr9L^QG(_Kmb+0{%s$m_=~u z6}L`;UyHSE$i&BM9lfVg)j8quCgdWDCaP0}2&TCMr`KtRf7;&>(xlq2kxTZs0bF@% zg)Vs{x`7>MK$G;#M*2>+8bV?V0Q2jW?ag;U`iTU}5Nm0GXf(y%%p327PUE^sGdF-A z6;#K&gczq7lGbWv=Zw?L^ol7h;%Ob~1=UZwQFCT2dv()^UTJ}4A03{UaC$sY zH(c;p`-@LpRphPLpnh0i{O*_X3&VvJ8PEX9NcJ(}tXkI$N(0L@RnLTwg@=h;k$e;x z^&#t>%)s&JYv`(TItDFVZq)NT9PG#pi^)c(iW99vy~HMxi)DXtJuDk zBLKItj@i{|0hi$dZ8WpIcSbxrMg2oon6_W_af4qm9YakE=PgLlv}jo}7pq7b1$Wo^ z3sOh#Kq3#`_oFGl-($LdkZN~gBmD}0tl-r-!{aSkA5dg4o%HP0=*{>A5FT`Fc9rYK z)4{8q#%nwcQ&GY57ldtQ zyUj56kNaryk+qcoMrCZpLo`~y7VQt&UK8RIDuhmY{CE?N?bW9X{eM>-^Gk3jimk}R zU<6kb((histfumkik(Y%%kbpY37NAn9Qxx4#21QnG%3B*b8q-2bw0Mf1oH{y-1ry6 z*^{xTbqO^2qWN48bb4jgZ%UR9G{fWepz6+hr~_b^jkR+Jn9R$-&2MRP)CkW6TAE#{ zae`ih=}`U2BWn88lunhH6eT8Ax;JkdHO{M9T1ptLSXd*JSX#2( z?`e3PruABL&jE>=<{^IZuJJTz?8{9N38fyCsUethiU zuYp@S0vrrmdh5Uo@Jn@u#gXC`Ljk0KZCfAjDHw*Mo0%`q@2(gU$8{=WfP0_kfX8j`vb2gQar1^|R7?G{p_epk^asc=<`fA{+-MRx zR$N&+v}HG<*v%6)agL*?URmMV{7ns*T6EevCjjc2+9LpD)D2VwwVjq7g_M~E>HFCL zwMr}E5Uz(Q__B_k z(##RTBiMog;*irn8oi1Pou3&2F!YIp6*OFc}wL9kRZvNR$<&M7W_Tg{7qjSy3(b>M*0agu*WZbd=G8s9v0%no*-*Un3 zJ;8MQk~S%%jUj5^7>SM633zAFbt&Kj^}d6{_kFGK)( z@_@(>L+1vEnfGRMZ5zk`D^1LJVq zUVRfzUVw(5tKikI78a1KKOe=G{wR9K>ojy46BlG}T~^(R&NoTE_E>wgKY9uK zaHB5(b$kySnq(oJyN~lNE3Qj^K%)wUjAex{eq7lyY1<1g% zw`*1Ci#h|~|B|!pv@wa3WGYK4*++c_vcB}*e@XfztWoX2p1(hUO?ghTYd%Cu5joKF zUA($0!w$N}XthU0ZHDr%q@{N0PynMTjHGT1Ku!Z0-6J5QLupYrc9ZTu$jKm=kkxoh znu1n@2VDtJ!&=5U1K#=BP#LOP#&;8l(|1vj`Zq-HtabCx2E0`PV7L7-Alx{reYfoS zP_B|nO+_V)m&n{$4$&Y)rO$uuXRNwq1MFK@maXmF_FmR!iq?64GFHAU$y>szm}U6< zn{d(yILynl;+)b$GpqD)KQEwZX~c?y51h_3Llq;k5Gq=tK^rH^?8;E@qAG>u?;t%IIfJ8aHab6_*{Q*UH)5V^M%up%p~f5_T9o1@Dl2l zW;;)a0bZH@j8>VE5nz(aw`?7I$!1SfQ3m-QrKUh^UtnmDR@c=~lodGBQ3=YS|tV?8m@aF%z~WzFB^AIYE^s`Za|>?724;-8>t`)@(> z|FLtdP7#8e&D5e6?gJ1x3WFQZo#GkhJC{87cv@w{#}ZAQqKVW#Y;4qm1@`jw+3 z6TW)LBnP+XVv0Vnzm>hGNGjB1W)6^EDj~Bfp2flYX|8YGic>}2*dJJ@M$=gxM^VMl z_Xdd(ibm`mW6_VwxYG6$rYo7~44Wuc+9^yIc)WLM5@Fk~Ax^pHNI&5!0kwp!d-#6g z(O$JEfxx3yFWFTkPK2Y9XrL67Qy&jD6e-a4eNsg~^0i4B0P~+y`h^T)4;|0jy;0vu z`&q^tXGz$Ki7RJw%5H!0YaVrwf^e}u=V9`z#;O#u5kUBndQ5&p8auS3Q-NS1S!21V zj~^N{Q^5LciZdK+Tar(2?hAIPK&P3H3&GYEv$6_;^G!{!W!2TPaw#v~&k3((V3=cs zt0B9|H&W*Ng3oN=u712xa2?03ci|L|ojcdD_6UYA;BA3*p!UB>l_JcLmPTRs0VF>q z_G^F}eS*HdYh%4-VVJ@y_0UNvmr$z@=ZtE*8-J-vo1;_w5Q`;fWW;SPg$L&^nBQB($0u1gIH@2XM>*}aghk6&HS1bBGL>U&?BG6K74iqNQ-bH6c8Zw^T zfKGH~!8J0$fu`ypCOfGO(|&7TY!xaA(nG=#51Xn@19QABntA2cC$gvyENWVAHy-P_ zQROJ4@FZC(F7U9ZpYCaFnxBtRl&;LbP7+yO_GA%q%XI|isWQbnFS~sn$DyxB56L5v zqE5yIL#oUB(y;bDSN*ob%MS>)vC?Gn$ih<&G*P5g61Qh zdT(?tUQ62yGxD!e(HKi@fK}pf_KzL!hCi4Fh_hCoWVy%fs=lq#vTCSCWpo&N$50)> z0-k)`rk&X*?ZSV~azmGry)pDgrCIWY4dQYw@&ajb)ju&5+nBeG6Do zQG=16VaupZ0H_kT29tLy>9?@=qz5{%?{*<8=+nrtW|~4%_NZFl z3J<~r{sK9+NE2;x9C1Y@D-tyjH{9sEvWZL4)5^E$TEq1y@D_|O+7D|W${b=u-y=_v(N^;-#OGw$?!`am8XhDoPyj+2{N6)Z=U}Ed 
zKwAvaKUvOtM}vjW@vs5<22gT`T*(-&)&I6O6w&J8b=(j-?!#lW>U@CsL&(8yXof!c z^aJu$BoXYu9jt7#SXwySl~xLDTxLgl%V2>sdW5`|bw4v*)ONA}H5M}pNGRLEtW+Qc zw3f#$1tZXB{$k*Qgl<5Fgwz{Plya2kW}^eT>Y<^;O3B{tr;6 zAN2P;9YAkjz>vHii^O6`^Sf(5_@GXs26O?`msKpAI+F)cwCzT!P*{X1wa4SqhHy1! zW%O%ZlS5H;*apXcMz`U9fs3yOIyuc3xV8?>J5*MlOaJ<|=w|p*;A%H;%4+FJgdVYZ z9u<6;bx_0J0$k-Q1@5no_8``ez=!WK4=-mKMuMGg#5r~mUrN-=o+Av~PT34xt%I-n zw(=0+VbB+B*Fn5NZ+HRP@^nSv84$%@WI=~|8U_eHN%KVMHa+qBth~CqZ&fE}hJ83c z`UztdKn`~1xZc&~`QhIU1nnDzWn2Nfvs>_j=xVwo)t3XHbd#uo7dk{|)H$Vmi0GHwag&XPr`s}g)8|!x(%uR~c zCqxDMEoj&pT^($b$4BM+N@4Qk8kmN0OS4hx_IciiEfq+yk;%hEX3jPaQ@+rtNdHer zN3k6Pewb3!B*yS!=mqT$QE-TJMqtfSYbfU_t^lw(0XvpibyNTsVhgl?9 zl4NLfi@SczivYlG3JL%RE#^fZ7MR1+Ku&@VP19|6^0R5~Oe2p7nN=%k{0BtRs?Az; z9Ke&RTpDmsl#+v-MfOxDhTDX~iFtb8*J5oJE8qzW9-D@!k=#SU_p8h&>wgn)LA8o= z6wYjKXT$I&4u#*C?j`=kGRcBX{I208?=$n}5Y9o<;5Vua-^)oK!Zm-(7a`bzjO&GRBU_fIX? z!CGd^WZSUko_QgKM?6~2DG}9qS$Og}OJj6z6eqA$o@8cr5`ryS+RQI2UF(^T%Dj|w z)ndA4b$=BgyVrK6*5ic~GR3wri@OkC;E_FexO}W;I|OxRzb|-3{wFaiU_4=@yo*y= zZ_af;23o=qTErN79>%0)6W;u=!0^%YytcbNUiYd!gv7FVe?a{q{0LKmS8L=6#XeT6 z&?gM3;~f?sytaxZybo;RHNpz*79=UoK|_|c+NJl zq{S95QV%e^lcv~fW~o+2j~~|T0c*dxJ(>BazW~X%N*CI-vC`bXh`r+} zl~YP5`tE&{-pMGKh0i|sdJSKP$y-fzzuxIy*5f0W$cdo~9$`i+p&>xf{%PFgzN=O~ zU62RyQfT+L&C7KL|K|>?SY9eKk*#%>U9pd(OXc@JU|SALa~O-2Zdpj;oPIADbv7Ug)eh&fWFEWMCYN=CNGfF3bsHGSfONy6vgf! z4Zw!k7XW?K;fH_l8c}ox0H&1<5$H}j3;>d4uj|_UERo`{h>xviPfdj%3GE9QtXE6_|evZL{5-@QGK^iBVSa z+}*)-W=D;^WPpD_x=;CEMX)g z1weM@o4t-qAzg^L})Ep>Jfqj;d>9{6rR5LOnwtT@;ZAPK`i!!xq8s=wjZB^L3Qx5Z009 zA@Mj7+jRD`Z{N%c21G`UGYjV(9R@11ERV5%4(CykYBIG-4ci-0+lRXyn5DYz`~KpN zTo@o%$!7tetGb0y#*VtI(ug_UuNe zQE~j;+HI^Tfe3sJt^+|C@ z4{U+NflCd)xO}ssUM3h!OwG(h%RZ}Xk~Xu)R()+dN%bLB+Eq*ED54vek? z-VgYIF~v`=@f*n;hP=K>O~@!(nBi_-Cb`A;p%N4_p6>iSAR91&WF~11N$CjY#)e|B;DVuK)6)2RP==b*UJr@-r*G?hz zvCb&Db)T0XrlNbG-1Jebkh%e5nZU%_wV#aw2z@3n#c5X;dXi^1eq^Kv-;$VEqS3M| z=OY&Lp{8gl!7dhwPM6Xcyliq2m#Iih@sR8*^ph!==`90Ocnk3GFQ+IsNu4p(zZ3oO(^oIY|E|?6 zAaq<=%x>umHsuw71uANl6;~(r#qwVWe<~l(YU%nwQz{J@xWSr76kOD4frocGF-Mlk z`JDKm(9CsP=u-aTi6VhqNuk$;&89=T> zD;E65=KlWlZ^83<+rfrGUB&Q=Z*rFA9kFeTKdR6C(&-PU;Yf-^*PA5l9L2@&UOe~x zNFG@{;}L$Pz5ANaSi87j<>7HqL3Y34NPdgy=h1eY~|F?Tz;9^Of;-LFAWbeyFrl#xz>lgD$DXk3SP=-MNM7R7<;f&-}7(XBphu2<%5vmDH6PDu|S znand-g+_uA&p`J6-N*3)eh9-(AE)kmQ;mtLLMutUI2+bz4V+DvAcf6=L%5zn&Zg=X z6qf0KwP0gc9eJe8ty?60s_odFFP)N15vP+`*VO%iSSkA+a3&P=0q93_?6Jv~`txxu zEDKfpEZ;{(Q?6=F6+Zpi8yP5_fQZ#(h`?2DK+90=!d}`{{o7?@t#wOAr`yyXb55w7 z92E>eR2`2f1bx5M>Mi#^D5nc#qL=McW##NqtVh3!r*ojW8*?wW7LGz*|DBy}-S_9P zi5po?`A+VFeqH>TsVuN1E3cH7Qd4!K+8}E5xtBm0t7$r&7Q&24!$bPA98%> za$C-JtM~{Z@Q?zH!p$zkNCbDk*1j!$x$%L@anAm55>kHd-c`5b%SX>;zF{S*G1b!| zD+LJ-``3EA9@mHNK{4L0&Gbt}E($G4F6Ft;W_Nxqi^nUhGcg5jsbH9~v{(ouTo+Ko z2ECXp5kgCB{bn9=O?m`M<2lNjhfjDddIiNiDJ5RP5iYeSBF_O+vUZrxx0kT)QCug{ zi$W~r7BV7BHQ>JK3wFCQ`p99%BU-=u!4Oj1_IacT<(Y`G7>{)i4S z?KHcCA<}$qi)~3&(O?)|&EjSckZVjBlVQCxqQt%k1Uy#)v*7Z4`$0`ug^t*e0$-Fn zr6^faG~=!Gb89Y#vCiG)$LQ7``VDGlBIkUr^!?DK7|PJ1D~r>N8sU5PJiEDBdg+MG zwT^==yHCe7k1W4H> z?#p5?id&_W0L!H(&dm)DDp-~UVS)bTEP$MO+0`q*ivNq@#0!s+%#iLUS-eiNb-tNL z!d7~B?*|uvJ5@t@yS>36#FHig(1AxwmuE|&Y3)tJ(XGY_G^L2fJ0Q@QJ zZQlLNyfEHbaaX)((&gH_dr{xP)!%ZIFIOZ#6`-bLW?qY2Xmlh=CC4Bz)1Tl;^n5_< zvLyFewNmxRw$%_c{VxV~@2|IsA)!tSQ1g+|sFfF-}EOEZ` z%%qq+=k&s>4nyT$c{KsD#O%Q`sq?hNP2wy09k=>I2YXv4i4U`70+yU^=#Ox`4WK>bnD>K)zcEq zv1FQpiQ7|dk?dOFeAl;wks$AVt!Ga)tk>mAt{8JRm+N#-)Dmmq zwRMwsu+!hp=q@44*5?soEu+Df+Kv{fB_E7ON>fT-$Bj%Dn_4?+7>Z`|^4*WjNW~I3 zDf<77A@CuVA2|f(tM!_vpxH=)kF|$f8O;8zcKCMfM7$E?M}+`|exfxC+-<&{)%JCS 
z_+7q)ub(hm<$J|e4>omg@o+yC);-YGZ2IHh$iX=t3K`a5Qj857qP0d%Kvr*0L@F;zr580SV*=t1R!J>CeGF_e)$C|CE9~>ii}Tj>tU}7&c6N@Ab_0{$o1nzgq z;3%nV*6uX?%&qcskCy6I4qZ3KcGmyaj2N%}Unx!ZIWGNLw|h+R-Zi7o~It1ofgZX381*9H|@<+bL=d3;j!SPx27E=Yghk+$MwZ_vP{(#XFySj$oG z+#JQ-@|(~6lbejvWw`cCg8mO3Pp{9-+{Sm4JoR(e!2jO2n7X=_6n_wzYwj#y**3G& z_G~4{?S;1AtEy6W7A$JA<$KghYAVFPMCn16h84+m ztcZ^h@H^2dUu#}aEDkP{zGi#hZo|kBw+_rVbzusO`kdhM$@hh7)ARD5kvU00 zww}go%l7t=hAO5qq)Ph?Kdya&j(wZ-(6idhynx%lqpW^oiu?kt5iEdgEAFSAE~iD& zk6S#yeV^W*`na*9B6iE}6!dU(ckhZ$0YZ@Sgrwgz!%oU|&LZg>1>P%-)2~RuI}PQpfddE(iz2QG2R^aH^MmX7^*w`x%}@M|P=E93$(wajfn z_AFd4XI<$Izr?4AdlLgk}nG?8F6~-I?we0qZi zZ!S3cZ^pc{;tfLR`_FBkb+VQkGQ9dG?{ndON55zB8kziL_ez-X|mX)t{lVT}bu$LQ~i za=odXqrPg$E7VCZAFFRbc@(QBDHb;zX8;)N^+2xyVEWtt>!kgqO) zoop!Y?u z_dh()tJ&_Z0i#mprA(Wfu8+*1V~#Q(PFh%6PqjKaavP{`6}1Z->)vW-{lP32*T8e= zf!zcxI(%ZoaIX>Dr(;!8Li^8T?qjm(KysYQPBQy(d%@q3NYyN~;IfHdJk!W1Di|x% ztN#|cE7yLX5i{!U%JVuxgWoU`?sPRwGH=(#>?$x-2;v259{bXHS-Y!hFERKHB}KGZ z021#Ls1cu1%k^Z*HA>NRWnG#GKS&zBI80X~)h~mq6ZKfy57UA#x{k0Tix4kJ?P&-h zLh?j?`o*pvmU-gAX;Aecy8sLJIx6JFQ@NyNmr%CwM8iz$+xg;~1jj!13oOhOGrE(LxO^bpP zA68i1@@ilgON7@RC(pCy92~2^MEgmypvdxu2EH4`Q>iP|hEvcN_=RZW z6F+7^WA(sA0grB<-vYDmcQ*&{PkdkN*jZnu7|^=_ z;yiG{krYUvWqiF~9d*67BuSGqCc{`jPq3U@?O2GZ=y%P|R>U^D?axZ21V(+LbH?-a zxc*n2!U^j>m#9q-)wqU6n;My#$J9zpztI?ot}P4TlDGm4J!F3w!cF@?S8z@W*IW>^ zKyi!*sCTZUUY>mEGcLsOJ37rajf`8vHf;ybfJo5!RfNKP=`p?DwL)=GJnw^dU2==c z{7q43ftE*HSh8DUpKR{C=3x^_HN@35v7s7g!>=ORTJfeP@n&tEUJ!d)9ds$h+mGG^ z2x9N(!QmAB&Zio+f;t-vOA|R@v^Q1*1bigS-lBR2{Y!CAo8OX}V z@)kx?Yv#EL%_hkbdFk7JH}TyEQLXPSckbQO6)lc$XILDkt5X^JkvE*~UoQa~Oq5E_s}%KWwfFn6D+5oi=}-T!{JY*JkaE&0CL7ZyO*Va4BS0 zY9uEu$H9Z z$4|BInKjV!1fO;^mq3M^0vIu4C9d4X5i(#W)*s)hwN)#~vR`(UUxac$uq?b`HGaem ze?e~qw4kQ@(G>t z`hnrfJn(*@U~c&D3P3{TS(g$ho4H9G$o3l_BWE9s0|+$$sUC2a(T%L#DZ*3)eY_-e zUiL$wJL*}GlY7P$_=Uc4f^{ibHa-^efp zUx*hovrSMa8@!_y|0ybnGX~DmFr}cQIA3K0_*5+|P}F2leI5I|FAzUTzaMDCQwMgByP>}#HWXiNwyDzr*M6~*@SO(TSjVuN6$ zPnCRKXb+Mp$uaxxp(`rX$9kHj0o=|vPrL6(@>wA4hUM8m?tJQ#5PlFz@2)|GSYR}K zODs1Sg|x)O(=jMT`m>`C2e$|}zAU05Zs7CpNS5^%d>cko@ax1X=`I}in%H4(_pc?w zfvT)DyUoCPu07}tK=&NnpRt;q6~~fq+Htkz3_?8P)xN%D*673o^My)6TP>hLD6To& zzP-cXcDkQ*)7uiAv-$aAtSV~4XFsY)e^hBj_H77&Lux(JP>#-{a?g{jcCMtFwD=t7 ztg>)s_|>e!9Kzf`YJJI|zweT*+wjz?nb{l9I$XgtbtJrl7sB-_OWKHBL}-hlAfFZA z$+RX!ws0|amRIXNLO)sKXCf*e<{OJXS&%!5S!^o`%fO`%@!%#>M>A8y0s@)>0wPjXyHJ zHb1g9#G}?;On4)DM-M)bkHDYm6^VV}$O{|Ar07<1P*U(_?XjsV&p2*r2eO7MG%#?` z^b|W`t9{YfC%Y%UT2Zg#tQWi+q4vY&0e>-U^$6rJ_sc&yw1J~ldn134{7kC*_q$-l z*y4|^*X2*Es_rwBMS@-{V`h{c!{nztuHLJ^J3zTnqCZseJ+xzypt$;LKY`|j?LOT8 zPo00UazMBHKRXF4)3UHj!QRdEI%az0U8+}0^wH&?)1t~IPV7}%3?&L&=+UAJlvjc zvogl+G~DTU>sxZ3X$m)w4pqe8ZknK{EEtGWYkK4t)ILYM5U)J}6P{Tyy_oVD)0X!> z{|bT_!rHQw;I2+Gvd5aImgv}qWt+6nHrG&HQ)Kzd&m#PV>vtXFs~;L;u+8l#bsbQUx?lk* z!;`hzB>7O(>()TJJv{Fk1IH5XGnJjhKHnGMt3F`%%R|_KDm$@lbt{2wP#Gur^vP~S ztdSUj;mGe`kThu{6HS!@vs+%-sr?|4C*~5v7g-@*uOnE&& z(*WJziGI2mN#{Pn^}R`bJ2H(o;+@#b2!3If<0nlS$ReTVW8hx|(3=WGYj&H>y*&Jl zt%jnVEo<2O&At0TgrINIB)osX3tHC*IzR{SL1Kj|gqu*O%U7xig0#nstR z=^knO-jE`@vvK+0o3CCqK%s9N!q>Lq<^DDeSol5OIlStI4k0S^+w3L%uhN3g#P4e| zE{2(e=rG5O)^g8zZQ|yk0RzzY9%?}&2+g93nU3^}<+mVitVL?t^8B5h;OXPxWNEF@ zQVge)CwV69bwz@bb2WdQn(%AwN2zIxfwMUa^|CEn3Ry|LX9-0w<8RP5G%TnINpM*1 z`qKZ#-g`zh`R)6@C@MBYr70~cO+`RKq$esOO+{*?Mnyn~NRu8CMT#Irx_}Y^0R=H4 zOeEowa?va@3VHfU zx$Kx+G$U&odITy>MY)y2`bo6LZ~6%+ov#WRyS;N zttQ8YM{Uo3Y~Xkw`Dxh=N>IGQMkV1JLlwOKx-t2og-fw>n_nffsG zV%){9{)t%Fg{5Wv!Him!*$j#s?BuX{fkU!pA!!!}I}h2P$a+x*m5 zUmUej$gk!O0&X9gx4jhkkL{U%z2|@yg-?mKuz*On7Vg4B7$Uz5u?H8RrtwG*m5~uq zp;ixfNjLOHxy>cNJwsX$WaId`$(g`Z4mReuMTw6^Itt$_t%{3B@SUyA>*hJ>xn^#k 
zU0_9U#ybdT=pImYW?l>?oLJhP+Rg#b5j$yOcLs!vnfC`s>d47NMi`ig8fXO5pLMDK zh*PeDuOgObrrxXhlQFfs$@rz*gM2mf_G0>uX@{Vz<%sz{ssrLKs|_p61^hBz9oz5P zE5wfvbH;K48V+$BQu#7HmGP?C+C}#Zt^;+K7&>c)S>;_cLN$A>;{9+6HX}{0i$Ts9 z`?O8iNFMtzn_z{4-gtg-o|BK>PnD>i0%EaT?Zd1v>D8;w+)tNWpOY8O ztuv2laQ>{lD#ulC_5hNGdxJiQu7#fJUz6x%K^)sZ6=r)4&S|LslKcssoL2|ryJyF+ zdq64VJxWH{mKN(e-7Qz(>*?8Nk(J_%L@wJa^DN%&Ps`guOry=a5FR(^`%$-$?@BCh8+y6=k9yh&}0~(7N8JQ>kH(;*UA|E%EaoXH%yeTKgjn+{!En#{TtmPpb8rG1Whvn%;Lb zG;RFn-B&y3%I}vqFDs_Y_64Dh;hXAT9(emTj0+{#P#*QYaC*#{7a(#WEzZRxZd3GK z6{VU(ftEc5py~~L=5ZyEV_Q=GdUrh~l%7vCmxhakJ&;YHvbMpoQh<;oItxF#JgC3C z{DLNAQQ&8`*tdIit#sa=c2_rEy-3_|6(xxHMEIc}yq7f3yIwcazmeF2IIvSy+E?R#=_4&eQ>W@O^kn*n`)rR9{E*Eo8Ozb?uq?W#hQsD zY7l2)`ysy8J3Th`F=h|;HlB`gu$+7{b;+B}QavdLw+s>z%S9iDgd4B>eC zwe6pvk*!si_hjF?IB%w_Xd&Fl; zOb;N3TlkfG{h(D ze}=CnT%FXXNnODgVKxi>N(qd&Uoj(u==Drr=W1u#C6R-mM`RoePJ=_%w$>?R3zi;7 z0Bl+DPm1Oy9O!V-Gr<$GZr?=2yoDH}CmGj)Dfl-GYRruZAUk)HBFTI$3o!2k8&imI z>S@MN{cjive;6O?1?!Q~h?V7#rIweye6`vwIUPY>k4L|QPn}^(Z5^1~z5+s&wUA-H zG@{?v!n=0JBM#P+1(q7FV$o9aATa=URr9gE{^x9zN5b+$8inO%(tDPj6rR>TdQwsz zu}9?BhdQlp{EiGoP_l00M*n?>!d~V?r+#BEW$DTLWKT!M&ECz4KrC6r>CNTxUL z$mTs*>~3^GddMnXzI%4$l-JU^g5!st{pj#|_!z%LB@!bP>j{S#2APfqmtI|xkvS*! zq*1-XPJ7h%c#De1Y2l@5*`b~7TNK5h3dEx++?$>o&tGJ3dT#H2 zvDol;2e`UH#8eVh_@`ax+`yAx@uoWu0`97Gt1&voTjCybq@R`1zsI(C9rni6sA{0| z%cYpn_J&necZ5fe({K+Ef0W(+QKP?Pfy8g;Q!)RZO^t}z`d<;&yM%uMR-<+}F_p5n zy}_9QYHOW9&H;p;yUMwfEp#v<2Uc;e7Pfs11H>36!8U}ys*wL9BKwhu3b9|ID&#hN;Kx6nyDito~mni>*1SM*KcZxTo*UyZk~?0$5sX5VU3PMd$!h2Er=ZKu{9$Qg(Kl*w#s`+ zbM~vvTeq(3XDxf|<7)T?<&ms^Zf4GGnMIq<6wutA)}dH=cQgNd2p<+SG&E7o37;TA zpCQ54Cq&=9@Z$mkV=PSW4s6?vqJRrSAEw@0AF3_-80RS$xg`F6U0bv_h;#Lz`qEN` zOPb{CGy`d#WcmPuy`89oLn_fA(FDjjZ}e4am)I#u<5K79(H~4sA6_&T-39q+ABw_E zA)|z;hZlRHpTBOGTO>Yxlb~==g6G0QlW+@YIiu{yq zOOj8GcaN$P?VLW*zVVBll-oNsjxPyTn!1!5SZlD|I;?PNeF=>eNBV9JZn)C!diQ>n zvEm(LgaM($)l$9)N{>!}dy$8>jFX)2fv0V6X+QUDDeQ(&MrwDU=czDKJ6{V-g=BF^ zEP?~>eFX?Oi(gh>A~)XbFXB3eG!;41{bU{YVrc4D5K}DlwJb5Z2{*1EG;#6;Bn*G- zC*^tIA?dK2EJWb|;~n--DmDi*KaFdf7~*E# zn9J!?%CUGYAcAN=<%rls!RJyh+)| zapCKO5lb#Q@?hkLzT7BqYR64Czk2 zblt?uPxnbpULbwVrC(cHke+D%?BasIaI+v=wuqKN;u^N7DG}kO4w2rPWnAD(a~lqn z6jHH9Z$>W*{bb(29m0)tM#0#wLL!olIWGabTQ-||%6xqP__Y#=3vf|4?xwkmKEm3t zeXaCk3|qam0xf8wm~s_%MTT1>^Nz(|Y?L0W3PI9sEBKy!r*d+5&%`(^_&L1TomeYP zu#bU9IHcW`Vy7?HRwoJ3vaInIev zpqpoiEZdme%@)LA<6J1<(7T&vB#RbBr_+$obF{^X8-LK8gKZ{$b0p2^%Z;o9(Mhpj~6q zg!%}9V%4D4SsfH;K6!Qhjz(_%cI$n8$kUQj(bu=UJM;@Zrl}>-Oy`Rp>6MH6DKmux zPPCTRvB%6*(fAcdN$aN*^}zkxcYV7t%Bu7_P}qI!>&5CFJmPe zrlDI(Um7`X=V`A=Vp3S&(JxNwh<%9F0l6$U^z(n#-T&tr?*f7YGyT;Y!6Fb@95b?(hA+Q2`K@C(XUrn7zu1m@<1~L~lUF157iai6If-5z$DCuF0i?(RjWTBkWq#N{C89f%ix zNj?ury4W*Y)VX*P^!cW$UrqFY7xrOy59M8Kq_8`6$wo2d9?`;1W#&)5 z_%)fV>-`s-DLP|!Kza7)pWpbuU;p!`g!QkUq33G!RvS1x;39Fm0 z0BULDmtM1c9TZz>Ra^2yvFb_GL{JV7@ILzoA&n@-^Tc+Li=z~GmRCkLBJQG3!&l~N zn^C}b^|r&%7P8C~=k^M$Rjqa~jx8^P!G=x!TmLu+_+P!2nuQF{v2^wk{!YY(_w;Q* zR$M{yKEQdGSzHWGut$FhB%oa?DDs{Va8&Aj{ z;$4&|3zMhGv+Lwv^SBCsy9g^(to|qudrHx)2=sg<4kA`f(ZfUq$nXYHI04xW;R-3 ze46X{5f4}`Z&4WwZ|2;eMj~m{*D;;b38$T**BBP~&X1&aO?N=r&VEYRv>u-M&Tt26 zBQR&mfeiOlywfPCc&1(X>RmsW7;Y6c%Ksuluc_MDLu}GiyfaKq-WFTzxb0%Y^0mB8 zXb(aQc6xZc7+JORN$DHRXF48Ej93?+LPd>8MTNmQrD0Q3^~rahcq;z3U!O>>#ZD#G z2%^O(wyzp@#3w*!$ZJHzRpSbqWxWPy;}T<{(7E~nNFjt0l~~L21lH@*-@alTVF3or zcApsl;pvzIrQ>k;bRG}BRHkNkx*U>g$m)k2$e@?($YIKnl3RPfz*fA-q39(9p?eSm z=cPGl%H}m8ETVToHLDm>oy z?{{G0M+^2Hmg&*g?tSB$x9i+_| z&u&#Ls(gC*5F?jz&P>2e1WxbJgM6JOJV#RCWyOoIW+ENFFoGP!gA_Relj^co_Ab(;EUlI!3zXRdkA{`G8~B-9XLZ8`Z(hHg;)4ru_;UYJ1yp-%^ptv;AV7vO{2dlLb zeP{c@o7lzXfQZ!T@BT|!WUYDxJL={He{j(W@irUxk*2H*0g>4@{*&^}maoPu`O+Fb 
zK3%elAi}v{f6Z`797Ut4DciRxci;F_Nac(yMu$i9RE@N5a?IZJZ#Jjp*5SD*TCcy? zM}=-S7t$z-Vl#@!uaarE1c`n-9qcRVf;#FDLK7+>x@cBq^r#5zu?Q{ik2Tu7#8mVZ?fnZox1IgxF`t{$&#pB2#{M34+SZ&#Z#e9DY44Uw#mvhwn5#k+! z7c-sZoy)2LOl^hSt!=^?{6%0Y?@-{(z*W%CfrNKbxX4qB=pf=6WpNjsEld21jmG6b z*xKM?1xsxw5cn!!{}%4u|96x(;SH7-LigKL(ht*sasG{rlRLpY*+*p}eqgv8(1QO; z###S6$=ex_y#Ii47BE9vEZuPIHW%>KZblMxkQ>NtLvToDfJy;yGJa>^y3IF#%1)w& zHa>W;m|y?}_cuV4emL|d^9KjRF!|75Y}o?sC^B-T?)vDuvh$g&l`o;JR3Ws9B!c0x9q&SeA&2 zm~6G!Fozx(dP9lN7#8VPr8U+5&*tWTe*NA#6`9D4rT{qUud0~&QOu-nC}G=oz0fnF zA2g7ic|EU*_AqfXLEiyd-OVPont!ofg_EL2axP{Dbt4{(AhdR#ivnG*Dp6Wr8Iu>% zk!Lm(1iYsJ!WEc9Crt_zmt2a$oxe%Ax3s~(4eoSI=@-mEWeMvoFUfwQ%9=3iU$O^- zCy)v45a_@%F5n(Y@DtYJ2!hE|MD0WTL?2Kl>xXoH$2x#!1k-<|ZS(%2ZQuPD+BS`g z;wJf)gb6d*{djwZaKKG#9=Sy#T!uL~ac*MX3Pd!F3C zfNs?8Hb2hh=!eB>1vHA*pKImjd2!~e&~dGs7yHw`I;s@)OgNSpj$1alb>CSLja_Zi z$uSx8HX>$6RyZ9hCF7%|H}|yJ-8T_E#wNz^_UZURb>z(1LwaLjrMpFrg$pA)S0zH^+eXSPMB^_wI78iO&m6T!(6%^t>Bi;q6-bt$5q zg{~`8LvP`kS`s+*zDoTM4Xz1$6MAORdRrqDn9_?eS;Kyr?pmF~gzedTeCG{5yKkE8 z8|WF|%Z#ru$^XJV`0w~C6fOv#{ z@A=a(^M0HS=ert}UN5Ms+|=+Ecch3~E`E)XNODW%BI^fN9Qyn{f|#OE^g6d>JjRr@ z8R=ugC_8#gD)-a71Nv-+dk;2E3+&kN@;)hq^5vP`yXWrZ{FOFDw`}XA@ZSxg+H{j( zZ|`2iyu-yWEdi~~8f}J6y5kd6BHw}I{>1qAp!8Uc5QW#+g$;&c%U$YXs|-t=dzsIu zQXq_|`ATk#k099GOD$C;SPp!Fzh<@jelt(3QJ3I#SqBrP6)#1ML(O_2RQ5c-djIdo z;F$-?MJdlnDZw(|5F4;I5b@5dkp!{S?L=k+%Y+(>$Y&{1p1+;Ma*iX1mr3W(-OM30 zUJ4Y>|Fqh^>!HM!%y7Xak}gP9`?5k7p&~{xK@LS3B}Dgej2)JCNd?AqF9W8)3)ic{ zs9+{zrhm)>YGKC`=rbQD++Qj=!#VI({vphK;P)2%dQ??ZH3i#%&1Ka5dtNo}Ywis8kJ=SJ1%P;g zp-T0gX(POXY4j3tJG;>Axx4-}It0^{j|18FY0Rz+o*hSk*{sp{6hO6)5e35Jf!1m4 z+L2jwS_5HS7MsK9`~f~2-3DN+{$hJ7OA{Rz1tcFAu$?0>n2*Lnksfav#O zao!l#0*nG{hFGDSAnS@tSg>1n6#xK;0K+`tq6uzw?otp;fui00)(twWRlI$pYY8J< ze4Ze>euR{_bm z7OODF>wn`O%PF6H(EngiPU+}pAU47Y1$qJ>fv^6FD~aG(=_##B{P~nlxrXyPxqsCV zGM>M?H%;>5@F<*RJ{Uv4xxgtoQ8j`L4E_Z#TsC6^YTHU2(4 z^c1~Ny5w~B-Gzd~((K<>x_OVTQ;Ckm{NPqWRI z+Q7ov@(9t6JW)fJdK)<8`mKEYm%1mEPqwNdr*;XEtkNEsY?|+%rF*UN!+K5a?}mE) z^}PH|%gdWa8;oZicix^B7-U!`dgIFGH}zkQBN)=yqqJWm+uvE`g3e1}Oglh)3)%f_ zE*Xh}<#v#ioMcx%M@O;U7PFfEi>=BXk^gBf+B%y{Z^zBrzLgquH)nIM3)lP?TQzvj z@h=%*>&gyquyk1Q;90u`j(=`=UlGNA#g#P1=OlZFY*j(Bx(9Inan*F6t|bfuuL zFTINGweM_yk{|+yh6FjZJlM6&wRa~vG9k=Z%=CNUFb?WsnEbqG7dY5)%it*%YnWrz z7=h>QSMHpg>=YwuS5fFMwqGz~e^mrOdaz5mr$Y8JJy&`dn|tXAMilXMplzbaE_fGP ztG%V2DE)a{Qwpm+eE>YQQE~&_cJJJ3+eJ6feBaXE8^*3VS1RmcQ{q2)I2ja9cXLiR z)C)EJaCSbgY9n2s)Lb>ki#1Qa7Mkn**5Ivnm$dcvI%+xWN`lj;{HkrXWUBV~w$^H& zvf*q(Q)@atjJ&p7P0=tPkx1K*k2o=>C>KE8)-L%#za}-Nrc6KQ?bwyz%!5=Eczf)Q z=X^1Rc}RRPD$RT?BOadGRq{RL^;^=n{lbe@G$hS|ie(%4L6w?B)FA}l&lU7U1ihVN zUfj6c_QK3%dW#6!-aKPPOS5rUT+XrDTH zT`e-ba`)=ldul&SwW@5c6uEMHNa-${pHDAR*8e|3Z)UG1I9T1gt4b94Dx`(9ZgzIY z+P#xUKhkC(*VhTU1jjGQ`awax6yB|JNR}`r3D<5;U8WJ9S8@$fk6n0}Vg=lIF@G(( z7{G;XeJ>)@b~Q1;jP@AfcEqamM>~ z?eK31Wdmru!nUe`53piRUBn-ekI3h?wqHZZo9OS+{N7J5gjqt5j!F;I5RSBjoPG9K zzv0RcR(hpeh3GaCPq(k(6p?f5$GM*Xi`Y@S`N(<=feegaxIq)8SdL>QWWM?7u%5Ud zt()otV7NM8X4#{7(Ox)7+FMO?8;VKWNfpKblR@_aj38PplcKT1hiVc>^0?~te0n2TmlGPP^ z6~>`<9iGcf!W<}l1L>~d-v(AJYK<2&pbDFl_UEU}*4))2@`p;3PG8u2mwMx(E=E&f zaWbAPGnToFmHMc2XtG%F=TapxYU-JXSYu8a`70sRhj6OOs}G&x-@xrvM0f6g6#1p zt`6An087^v%M1WbC|w2c)~z)}A?EE11V2j4@8jV|?#_>#gfnz2GMWX;uV{?jK-s6A znF{BgETXXWRB-oFQ)i1l@=N(NHv3EoPgu0Q|KJRvvCJ5Y0|&kSCb0 zpMii%7jA{{(VEnnq!LJ-eJqe4Hw{txhbrJ6YLXA=g^2|VcAMofk3E2y&iPYROlNt_ zd0{Cu81semoOLMKo~4@|1VClGz*2k!G1*&)2@Tt27D6x5Cli`LJ~>QkQ-cP=q6*j) zral)E%o0xM@33_~j%YCLKHD5Fa1VWBzS*T~!B6 zkcq|kh^2lW_{m@+*QH8QGc(e{EAh#*+cx1r$5DV;ox(ANVCPvf{nJR$IbnaEcscbx z5Kr&Zyr08+n{D#2`FY&$&bil=DboGMIs`A*T~j*fa;~)X)}E?2zp_~ga`IhoEnwWi 
zP%-K^Ot1Oh-hD*1oT?kXv03rAcM%^usknhHSo#+Lp_L`GLYA2zW=My0WQVhUW7oA% z!0)mI3*G%@X+4Wz0@wpIQT*Th7|g;g51J7|v<}h`mA%+G@F%W)@wcZM2eSb|ub{`o zzxXJ75Fjwk|BH&{?p%7Hb8a5H@o>gTB839c_+j} z-EF2=^O4h~I$*Lszf^CY^cx^C{Ru#L$EqN;k0nTTo7AnZ?tP*WZxzVnk*J?9F0bbrm~~9JjcnKh4Hi?orKtna5^~ZRWpQ*g@i+RL z)YdWlY34iB6;Uf>>XlTv6DKZxa#Iq26JzOr8T-+{-MT|dry*zp?mp!`bu!211AIsa zP54S%peFZhUMy@5 zacH2!j$C|BYkpMROkXzhc_PByE~q0f$9i~VaBTdpLQ(oku%zdc2)BDDbgHIPILa?xv|ww)woZ^k>5#D*0r zRyJa4^wQM~e#sr#Mnn~3fec$-qHs5G^45>xwk1-j5`I-CExVs#DD0gej(+?i)rtH^ z3*#tB-oaI{@}B01_%v1SxNjhPpwYatdUL&RJMY>fxiVu9-L-Nkmm=wdskD;LdgD)~ zAM1wZnTPJLKe#v`yPz&4v;?ME3An!j&V)^kfvI+MEeaHvVq8fC`o0OwdFnP&Zm z9gNN+u8|#gaZS7O+veaWa9m8EQOsFH7B3A;m&P>1Ls%87uy?qg!<%j(J<|Dcs&i+C z#k&gvbn9jS7X&4f>omqWe-PBOe&gOOfv$@YF;^n@Uu+0qFPss_;kEXVH8~nuo7wsr zoLW|TAl;HFzd5pvP^RkHsmnaaY^v-E;60g%$I#p(epbLqo4<(<#eE0%<`vg1v<}!9 zguiX?8iFuOG#sTtxxNX|#G8Ht%bI94hTbV^1SxGH7l8o;VadM{gv~%=DF+aQGbW>G z!dJQI2f+uM@52r0GLU2~Mr1MvGf*0{={T^QDI-F)L%hM%Wde8bIym3vqdN4jP=Ct< zC@_e~W+XRp%>qmpYKuG>R}a~ml3fS)i3UB(67%QkM}fkX=U0|&7vvtgd1Bq@J8W4H z#tIK3XiIJUU9aRLsCqse0w?qnlieHES(*&46sS!kVFIsBE#Gbk1^taaA!BCGdzof7 zqBH@j;h$Lt$9B#wSa9@Ww2XyLMHn0!-E4peMPr>Vzm`x1$o@hRaaJgS_v<1o04SzVCwJqbXX#fBz;?2^j4AINb|VYG23H5ye$)}_+bMUUZC}?P z`6x`LwcRPQ-xpCoriu0rt`vQWv4cDFwhypX_Vc`YCC1*gcmdf9wfwJ2)&Hm7*XT~te;(IT zxUZj*l%*X4e@IBG_dm&YpXUWHQ3Ud~v;W!D61~rvK8Ibfrda*bELTHynAiU;1()z2 z6|^yb_5LFj_vX!j#!s^^)#Obc?q*IM3wAgNM6)C8d#2sRqRuYXVLOr#pK7j!X?5s~ zje?JT^B7_>wQ?)e2(A+>Mzv$zGjo5XdoRgry4mCP6utPQB^zd(PzFvcYo0V>q#5R+gjI zqpdz(XA^wJK1UDEBn{4lLO*YS8ZG))^oD&FL!63%)j6LwpZ_d3SWjPS<{(!EK4E9h zDH&{aUPo>{@9rM1=@%3{p8!fl52iFvnSg&lJAoWE)Kg_s&D!_Ys66U<#`l%dxagMp zsL|}nY}cK~s&K-<3`6rh-FNj0diez+7y5d?Mhw5-4{&bIbyWTPVlED(d!BL8_!z+F zLAZefxQq_OshK!n-U5-eb3e)x|D=QZ$S_94Q(r>!2jxVv7>WyoLz~SM;79*|r%I3b zcPLYo0YV$yO5mxuLQW(}_Y+7xau3EGx`)<+)M_Hv>(AKTcP}{+-EdEvghxcU#7oUQ z2s3^>BPo1xEH&7~@N(TvLr*x&0QPU7mP9LbIs?hf!JqKXB}Nj6m}6F>vId-ywQ;-Djrv!HlETAaD^Eu{2mCmVK6MsU2MIBd4z;TO_{ zGLy+n{;u2%T@%|w?l*?8UR3P%2p-t?N@R@}>y)0qKD;2IZM4<}^|Bx>V(@m;gX5nk zoz-@5fgl_hjSac%^TPh9kh1Z|@w}ytXA`^;xmMXG+OxyDvG!rUgG~f)76wc3pKc|NHf&KK zh$4^X4QZWNynkia$1Q3Wxe-Qq09EpKL-3$v2d>gRCR~r+oIRtGQ1a+3`{XJS z!Jpj7efpue{x^^h;fYG<_|#`i|MVwDbZ`2P4ov7G)t~O{h(AUSP9$1c)Z{O=e;%nv z3fFlB4;J44a%(i}wA;w_=6&|v<@!`_2vZwD17lgMx(s4Ep8EtbRm=2qjy_ke7kaaw zrE9A0ZU5_6CNHcS6TPb-%iuZuIPk`#iPo}((Iwc~XM~Q5je1B4td42U4o0k08gGum z`h$>3uRvk0nNDcuL>XvW3)Vi1j4RCuMkroDa~58hdtWN0%+g!GR&j%*?pN>oZFnwJ zcD5I{dd%8kv>ba~{`ave7nMB^3Jnm*dmkS0o{|#(y`CvCHnadaikI?kCCZd}OdPa2 z*!D@fz9Iv~fAHANrX*q2;~GaloQOqu&qGQ&L&S=2kg!LrMn>7#1uxhno!U90GZVl2 zgo)e`2%|lt2gtRxpnG_A*H=^Y=bTtx>PJP8)&Vow(yEXw00hBby;bBv4hdS7_@yQ_ zp&NaU9{1MAIg7Z3OJ-c58+s3Sl0HLDGVHRgj#sPyG!w(x2OpemK)vIwnS2#y%Y?Pamj@%DEDM*k?vj_dEV)#y0D&~`D}?>`GT$m#1#D-Qdofzd?74-p!O_ySik`CIi?!oJ5f{l_NtN6HaBdFOsmU!`Sz?+O=e)jF;_ zxJuzYud`P8X&d+Q{P#OMLFChM#;$R3c?Cnb+eNfz4uOsRpVo9uVaA!vO0@7Bb`9c+ z$t-Kh=P+UC_d6`ZvqF@_ze1RrI5DD^?R~LE_Gq=l4J>B(dGRLAEZ#tVIWV$KWuPt& z-Z`mux!GmIL{uvYH|vBeUD(O;?C;DsvixqkDOpxlT3MQ~z79_%dD_S5oGK{KZ?M07h6w6uxC_^IiZ{iv=CARbiSUi> zO&wEiqRxg%rHD6?OZ~ZWiK4B7;{(p{#i1-Ie#1iAqG~}U6-nW*gB(Fw*^|I|q{KZE z^Rupw=3X;_mSoQ4ow&aq*B3Qz42OZwBa+- z=7PJ9gI`g0b8ZSzUU&1{)IC$JF+zT4t-5F<`dU9)40+9f8FlNupsFSN$+0_T)gz-f zKRI!f8uh~aU`2|JMwfk!r6~_XEY|s^i(^)-^UR?{b8k=5;Z7V zH|*Mtx;r@+ZOF8bq#b`Jpq;eu9=!CRiTlDY-+=;F50(vwmtRov03(ghGm85+5m#@; zKGAd)D%TTLYBtPt%iia9qrnZvNxF3RkgIvHBWRd7B9_`eQ$|yErq26YALcgmAcO4B zUu^0a_Vs>H9O5Ub#y0E@;8cRDA<=G9pSlT8*CKY+mRy~tjx2<|^Bh(BWfWO&d-RIY zBDO%kQC-3-0CRAx{l}CCzBLFsS=Vl1&CiU(D^inJl_1xhBn~J!tW7^K;P`qciHotH z6W#Tuo=nf0Gc>lutTdqNE=(!s@k%i|dtgJ2q?w%tSV$wHUpi*Q@JFei6jth-vhWEZ 
z?Ho5_xnE7q(?YIc)Pijac`GO+vXJY9#koe;Tr(XZx{G2_!OQRCbW0h3o@0GG1J$^o zWasKNb_?&@HSS(A+rP?wi8_CDJ+tS%tg7*T6NA*3Phw*cz=bLrsB2dxK|r|rbKXe0 zA&EGTZiq?SFP`G+@KJlQ^z5E_&uJGV1r#t|u!&-;k~-KP zNBrS@a=~D6oQY2qsU6GKK+vURqW1jU$HkldUPyb7 zbk8oVsK%8%-IPp@3}7Mf;iSWxun-sz!-VG#P%m z8L&0g(n~21%uN=`Owu#?%N!RH?|n`aKVL4tiLJ$!^d*&?pqAIfT$()EY@_tVX5vK4 z9t3?QH>V675+a`;4taYSC){}v+Z1lUD2ACq#P`DIwosDIl2GR-12J)VwA425w|w!X zgnj;r6v%cjHI%fZ^MY__M2m>jaTCzCZ1;r!zH>9MGw0Kzq1-7SF(0eUcz{PU3vCwL z5=-4|ky>^`z2->l+4fd(l*%hX=Pk0HZCp`mDZ(n4Sza(nIwco#PUw59b~&zuA`RwM+`LLkYFLCL^;*~_MeRtuNlhci#Un%YV`9St$l_$@*`Xk{krM)uISJnu zDjY~=YPKXMvOcCt+TuY)>T;c+cT|TP-Rrn~H~?8~@tF#%_~z4dIN!Dey5 zq0j3Fyc=84xUv_^F5cfgrKwUU@Vi9v8B4FBpJjI?&ZUYLqJAZe_C0L1-7V+Tvf7Vd z?>BCmmX%wm=9!jfT!Nl^k(FcQ`E=KBJBXPcI~9bFdgPS$@f@f4Ossq7oLa8Igzb0-dG<4P}Y-oT4PmTS=Arj%c@ndPj?vSI33kr?Z_nC5k`PDgA4U z;&@Rma(@fiAnvIFHTG`tNI_R_Nhaf)+CAqtIXxeWp(JWK{qYToJaIdB5~+WrIMqY{ z!kslwvBCVNm=9+{-E=HHPIdU_5{J*@j6{hH2@9Ry)wK!IV~gWuGI`rtP+394iqggM zKcVBW^36ZBa2x6)kfYbCF;gQPb%Z`o)scSm;%M_~1WRP`Y1mRP4>cU1yj%V<;W3r#$&A7oupyt*Pt<*PpY)ZKw;$IK4?PeIyOt z*gCeeO(UZmKC0FXY!s{LT{Sq$gtxIwh~rAk5|+4k@abPrQUma*H>)VJg?g`TU;v4&mjeAoERp7xfA z)lJQY$9C))`4IDYV9;EDWrKL(p~iQr`EI{6iLXafg|(lNu?wHs;-w6AhS2UoGxDsX zDs~(SC-Mr>IHR7LIi)bEgp)m;Mq5WrnRvCcJTfm}M<&MfqTX4y@*LJL)xR3mg4duH z(C?tMsrg41;5}Kf>3De?`*@wpZfbL4h1h%iX(Oq~x#SB^SP)w!fv`LQ==xCcSfy?4cQ_$eMH%np7DzBVYD`mLz9<( zmAzl&Suyk~FZV026fDLyt>nG$|9aOTWd}ab+eL-d8G38gC(DKr11<4UKF;3xSBAwd zYl{tQcgTm^3&E8&{kFdT3uklI?e^WPUAqtxb|h!OirkQ$ZbnrHCx??=377c%lkm16 z#fLYK*eExi-r}VZHF-%-)UE`yik^R%){>dqS6OMHlK+0UXxPJQ*&j*H-X>6XY9lTV z)cR(AhMw)d^iFH9ZiC>P3~OFK+oOkHRUHinvXpzmcI_Isx5@jW*}r}g|Oq{yypWapJPf(p{% zR0;!@1N?Fp42N?!GPaXfT9LjN6U0-ZRY=B+U z*NlJvOzE>ya$KIw>I!qsIX?V~Pp0o;pLq5|+;^D$Uu?qxkN`YiB{^>{B*iprachn5 zN$K&QiIGYz@mGF{?mT=Q-Ah4;c3Ee+#}Eq+b8)_nx++p!ZsB{?o#<+EzB4tYKz(j% zr=gQ-H7@$2KcgG-^K4IEMA>b-jimLNR=&ZP9#yAc2mp&}4BeVIu+)#1Aoq=!Ihu0e zPaE$wes^P^P{2X4;{YYj`LN6U;m;3H$k87!(&&YG_TQsktoco}nYhY-)JICLLDI%g zFe{7BpNSl4gr3$b<$3h|lI?nE<2Shjr`5@mzK1tGnUzq%H(rtOP>jwrC_nZZ!~33mPz`APv@p5w zC7WQ%wCGFqloB*8uQaoi8f6_p`BN~}77M@yJE6cSvGv+=uuV7P@)YWsoB{dDhvQl% zN~g#zQsq}#zdW9A*RtJ>$fH}lTp8M+^h+KI*JNTMjnI%(##GC4=oZn8}o}~j+dVP%dT0?SYxOg$2Q|@T9>vXR3i4Zjz(g7DR1pNEO z_nTo-gK0N;(ga*gb;K%9v@4e?N|vjO5+&;-sb&-oT}rnZrxsnQjk7eiWxU`PMF`qo zy=PrXU20MGEu`3_q)wl05r6o{A3r+ht)Ra6;Je#teI3Vdm4}W*Ra_ZJjVI^vPR?3N zzN{yGopR+_Gvq7y25)_^`RdJF*X9C$MU~=6n{;+C2%Ps>p#ECtljNY)xBCk(SZmFf&DO6_1q?X@{Pv*R)7x zeW5%5H^T0EFF0JWL~h(#cVqNk+d8Q1dwD9`*(r;~HcNa;$7f3rNBIyO`W!l%>U4tX z>cqGY^_!D<$Cj-NPLrzTPf}MTc?HWb#M19RKvcaGwFik4QK_kS;h%%m=zLNI;Spv5sPtQ6V_1x$gYq^E{Y(@nn0SY|5QEHKchz&&+@BIsciNJM+)nyY5~0UyHa{uL*p7-&>w%KYQd{^frvHSQJ7{;Pm)^^NHcC+Q}=V^`eOSDh%*65IUqM%(DsNMl!8fO3l=zaWll&?bf6pg|xyZaI@CF5hc%KtS)?*HKN z1QwhcNz}s!ztaI!7mOALClP#rZ0)C{Z|+o z)QwsDBp5@}p^T{76}!-uyACr;wn_t~qw*d4f3tKee>C58g)ia9P}0)7pUEERZuNcu zXcc<`>Z>t)dxtOU(sKBjO+{RGQr=Cfh^k&Fc2J$~b-;nMfjY1&z6btbu8*xQI#nD8 z4h0*~r-NniuVfo1(hgx)ZNHo^v9bFZ!bu9zB+lPSBh;Z!I*+>lT^qmG7p1i-Q9#iFLi5*%zf3z=3G~R^ z7LZ!1G2}i2{Gpns`V27H3~z^G>s+9pC#6syH)HpjO{gX>*=OvXfmaT|U0V_Xgr)zY z`T8I0hXX__Im#Ak1>l5*m^VNqDiS%HBE*%?Iq*7uY5^xAwsXf|qp<0(lLskLHa4R= z@DGgp2nL+MHJY%p20Fb{z7^#poQmF;J_C?$2e&%C2d}vsJm1Q;o=xq8KFdQX(8a1+ zVrDv3Jpd3-;xT?(wWMzp0g!~_DKxH-?XmFf&5iTu*=GF7OQ{9j-{TZ_p@=n{Fj@UAq#@VJ+$JO5?E?BDzmD*%L5nx&N3qFF zXGYU+mavyMY0!6G_fPn*&%H-IXf)AxplKps6t=|9c*8(jLqfo<>mka+g+($;A~7~Q z)EZkx9zb8gbOY7Q(y5N&pK;id_-18C@8P(3;8GJyGi86XpmvR+ja;L$%~L$ct4u!f z@vm=QZd`TWk0ZB5JBQEuEt$HaK4e=bT`IovF|$olEK0|1se}HlVq-6tu3!|Ba8Xryy97%l3^$gh1UJ3*G*uPoX`O9?pyZ7+3 
zbq?lzko54ruEycF5EYO~jcH=hQ}=!bD9Z>J!n$Q5*H^Ow&t4T(Z=j@!OJhOYV<$RmHVOx-SwMM$L^ z;MfLe>0$IALwNX97&UILUp+FqI$Xv(5|aMRRgZqSD&CaRNj8Cv%R~95Vx6=2npvTb z-dQtaKNPqv+LGNCQ7y|#H8H8`3(LdbO*I)yvyX?jtic^67z~!ZShs$wNg+CR&CzTH z&`W@aiB)^N6WaabKl|@ZnEomPWvL9v$23DiT8W^$sKggjr4sck!mSzRFHXmt2s(eS zhk0G8jHzs8mxCI+q5Gky>PqP1-VJrU++7%hqvm9CAtI|tBJ49Hbr1@ldS^llivK0o z<0ItM4zP65iE31nFE6knH8LxwXs|Gx=)`Xp$ZF{nRZQHQpZ{}?#6ag@%(COM)dlk5 zLao6a<(wIx{Vf$dm0LbSo&Rgc`d^77#<6h`Im>sCD>eJ(Zf_<7*p|)F#Yi;pN}(5WzXJrXq@N`NBg|I3 z(7M}#soAJK(`Vv*3xhUgoTXFn;zK1!ULR&@#%xCOiq};PY-h56i?et6d7FL3QW&?Y zHzU|GhSY-#GCcfenU1kKT#%-#Dl`zcKFn{7+4wO03{FVaS3CqZ(1nh5HfJ_L_+S(& zS$Gw@u6`P=ItVkyTG!-3S{8t&VwPO1Pe~)_;pNfdV8*9&`hcqIIQ$L`guIB^XwTAN z?$o1K3pmh5ADKC_o9khWN0?n5iW1hRZ(VOM|1&ztq|tXd2>_SAB_BIvdELdD%bH8K>$5#$znXPY}ru+MH$iDfEBAX z1*}+pV8tf$RMAnLFjh@^`(=Q)b^}U(S)KBre@*TyWK5@@xas6LbI{3a4& z(=3XL`C)u#&;01U9T@mZ3Xo3rdH#C>0ni6fi7mvWIRf~b7m5WU_HXAfzzbKXnBD5& zHp&k?)7N3Q`!DoyDU&F3qCLi(_!OW6YdnSdq8bf7h-ZL5_rWpf=1pi{%~4!A)Y_qR zrRGu@lTlIxtSKoXV7HJT4ra_K$>19`7wrI!U)}1zbm)qZ3B@Bxrd-r#TnnAf#?Il1 zR#CQz?^^&8XINie?^4T?r}v_iV`wXw z6+_iep0Tr0ghwfP;+IyebRemIeEO!Z|BZ9$zucC{WDd8+OE4j|GzmXYH;~*x!tVOr zAK8L#o^xs7r^Z^HCyx;vDDi}c6)!S}itFu4B%4W^-Q22W)7;pZL*F{RKb3M1rafOT zfqV{ZJ0T67Xj?vg%*s5$cHBs{*indN}-{KIc2ejN{M zuEOI7?6!6|KS|j#xAVKEKJ&oG;<>>NT#b|jl!!KS$4M1BjYT3KGBQ0aFCHz%1n0_2 zs`(ot+L;)xibI%d!7hZ2xFy&jtI3IFlNi`k$DZJg6SmQs;-5w$wSc0?pjnyyQ@_j( zr)`?IR2oN@ftXa?L+5Ap0j?~qszC@um6P^nRSj`%GdG^gKk!j}j&ETaDxBGBauV9` zZp=2y>~YB{(;n_0cUnaHG#{Qgp$gdNCDgoSd@Sj_bDxr9mu{{|Q^?~Zb$wOiZtP33 z8MuDJ@90imVZwMU5mn*0AR+MNEAm($nyY#bls*_Z6LGM=2_igzRQ6+L==&2SPK`c( zBC}^0pxWf35A(Q`p4;8v(Ali5KLHA(PGZmmhSk=cLD9|pRXc<&Oyf=<=AlpExX1lI zFLMT$Vabr(k8r`4?pF(xc8m?9c0^LjXKKidRMasGM6?2VA?VwdJlL~o_6Wp*sMZ+9 zZlYEvZ0Z(yi+yV^<8Tf8nK!WRBap6Kcrhp*xcH8gq`UWkjn& zEyE8;rpSmqWzrQD?I2%b7ZG)kF<40E#u>n;+SuTqeX9FCyCXpSav_bzsEbv0hgj~< zAh}%RRDFdUnY|@OXgMUxu2!Yry}nb73c;%C4UKN1&m!Rzzv-K^7=r=h_5JH?X<0m- zmA&6zX3&p>(T9g1m+0!ly)%PIbw4?SgQu^_YMwdtE1BPL--KNqb-r(axc8R5VfHIo zUV0Ya2{6*PL5+Gqu)|ci{&>ksUuUt%o!smlTrljg3Y>fzG1tGQn;2)#<$Du+Qh4`7A;^xXdT+%FoMl$E z=Y)Nc`BdhSSz$O7SH%#en9jWGK8=L-B6Tt|&2xV0efEECb>sQS>s6urTYI>6Tqp)D z(Fo-!O{-i3jovt4=2aUNz&zuIaWq+x4ysaqN|5GqV4mrh-F1S9hjRMOw}@jZ&&I67 z=P{$a)3%eBuOp`FNX~;cufqQH{5a|5A zeQi)WRl*lCx`;fL0$0KY@8~$eHOhT;#x%M3LR=qVdIEa_>&}XRxAh3Rf%XkxX#npHOSd|HJBqC z+I=WXg@f+jgY+Z9IKKrqQ0FGXuee``e8z1*+%Dm=q(=q<)zgTM3kdAva(u*zxrj0D z!)I+Z$}MO_`E|&7j8xPBC4}N@I4ZWB57R}h;d!PSrv!14bgvvlviN1T@o%?tW$G^t zXo_fB8>vBd8r4|KWLum#)}xJ&qzrzX=ITnur2)qY{;i1!m~3Hnb>KpCRMJl|wqRb1 zNE@}4?}V{McAx#_M{`zr6{RTuuttNrrxuA(Vg1{hi|LrBjM&u2OHa!fkw=ocadAl= zRAazKby3Uw&i=ZA(*!nds`!0Nv+k=2C+K#<6;j(Cdi0?UQJ`c!4o}q+MyR9wBAIdw z*gRUz77M6f8$hj~qDg*5QuBL1M}VTfZ8o&y!Q|aogQ4{Zn$tHi zGdhEbPGpa|KEw)PH;+Jja(wJt+!S@`)Lyy~~E*)ZUG4Vloo-ihV{Y-xRUf3x5|qPMTXic2UZKpy(H zei7A)!lP*E0m=q9V0UE#G0A$Axb|^cr@A$tCO;OLUke3T)K#`DO zz#gLRFZK}7k%z>YL{&#AXoAU-F3N$~whsuevp=2!+#<@mnaZ%V*u_~`?LJnq1-&t~ zyUYM;mN;8V!>r36BM9MS5ETujUe4Lt#H-=qz~%8-g`t(t3g0}go_1$uod9+^lHYvwk7L^-Fj^H?M|lc^iNh@-{b*JQcO5*} z^BRc&)v1yGR+r1d(bET0X@~Fonbol3MsNZYxE>kvc8BZO@+TT3i90;1_0FJ$eZ}`< zs!b>jL`%IHK0rm?n^5hn8ZVH0%Tc)SJgxBjdM*&o&T?BF_6>5&S~Kq6g6#Rr-t0a> z>ZJ9K^#ai_gDVyKtUKIH@6DTjrF)z1QXn$ljh!&;?RewhYbC@zFx}I zwFKG{1fCDeUg^VLO=MvAK}AMexF*~vZ^Pt4tzy5UF-F60P1+4VI*kn(3*Tc-EAD`S z-~*{kd57kT@}6a38OH}7Wmg-u8g%L}oskRBK% z3xjb( zJA1AWeMC!b&*Ja{Xz@0ij>7?!s&m$-pc_%ew0z$%ssLUT!iyN%t3@gGdwsTV8R*hbAF0X9qUy3Y5PGArHY@pk1Im z99`!3o5czscwRMGYC_qf`iUTc4Zk&DZ{i?(Y)nfVu9b5#c7SOKbVypp+p4g7OuL-G zS}FjD#N9m9jp5Z-4O`d)2!P8uaQ)9%nm1jQe1EU%4ioUX*cbK!G;5Dg#{8S;HOuKx 
zc`T34`1_SgtM2Da4b@`F&t6G!JxD~h_sdb+(eP(32w z9oCRtJPtDKX)D;QpE+-(3L*@XpuFG_<}sv26~#R7=a$B>nJROzB1o@8VLMudT7 z^_%5KX5NA?QxIyjfm`)#H>n=T7cDLev73#pzkqpZ4QxBibEttDJ`yFmp$=FM&Qr%Q zH^e0${${D(OQlL3VD2D~=OM6-C{eGN#?xs-sjY2{Wczq$;AUd#zpupeV!N~l1~G<` z(^urC6fRm!>ZmhaNXl3tGh*~Ho5Z^#1_{~6FR z%9d8I`cva<5ZL}l}P2@YxJ^|^IquPw5i zje)(OA7)@nCT>_1lkpMBv?GNOyS&e!$>NXuR<|IF7PHcm-49vn2(`^6+HLyP8#$`u zo9VnIt-Z>-Xk`)rjLB#MfW;d4=B58`*>2`FM}v`Il2b655o5eoI~5Jw4*GYWetEY%t{R zyZZLB^1Ab{7lc%Z?AsL$55EsOiQ@aEY${7X*og-c&_LeDC4=T6wu4D$M69?P z7Bx(81}lZebUiJ~C-R*}DDO^G=JKo>{IK!p;L%Nj;U8y3=0#P%*DT&8{LY*8vQ`^EC9-p*05Z=L^LPU|`NGlpl0@at_SbZH*)TUEo2NV6@MnQa7a zt2$Gx=b6o;-mVG}o6Whugx_j4`Q6|esJuIGX)|3{d|4MwuCZ@Z(<4*T2p5 zL7%WCfXw?HmbwFY@ASYIUQi`=Q-EEn0Dusz02t~xat(%^2en{xpx@lN_O3t)uqYZo z32^@008Y16LAyVoNaYnG>MUUXq)Ngw0#@;WPomvFOO?BcdVo|Jc9l5{+tl5jqH*2< zYByWM1B090YXBEiYz)|XzZXNP7ycaZ9yLrVf%vJ zgDCsqZGCmW0!kDST`WkC+TaFs&^OWBNUwc8SPtkPhW;wn{Uf{nC$Vlm{_p)nU|4)1 z3xFZxEuh=~>^V{$9>bVGc1Hq@cZGC%QJV9^Yx;z^-Wt*4&Nth}R$eYits>K(2ggnO zPK$!}a^}MrR+JIoW(jBp*!MdQqD!uhtqC}eKT{X7mFOK7?Y}9JhI(W%W??$6MEsnp=70sfWCR0y8Wb3*T7??%$Vc220I)Y?ZC_vR+c z$FEpT`V+X;9Ef?45vQFTPFqQ0da8MDdNR}a79XX6;l&*)qj=QTF6v=C45 z8H4+H7Er}u(nxR0f(=1fX`w!mzk-Jn;A#^~R1|ow*`aenWwK2PXBc^4zo==dc(@V6 zYc3!0=KZ28wVW={b^3D60PT>`Lq>Me>ZHoD{4mKoFu7I@qG^oHuVOi}r%hk1>xh`McdK~Sp$#FZo zS6NnfGJf421^^16+3Ini*(wW*attgB#ERl;Ahk58yWDQgd!cMw8RgCCzuA~U%ZiB^ zkM^;y>L@z6@@UxZ$2MK}YYTfY5;qn1j?UC9tYcO9uIc0)a5L#)NVW|BqbDq?EI`^h zDp8LkV8WEXgr%A>cmoX2 zTO#LP`($ycG6H3ENzas})@~U^>b^-|t}OXcP!ZR*dof^!-8T9Lo9U?sN6u{?k+-VR zV~dDAx+R?eHSH>V@pu{16a?fFGHBXE&+W^W()nh{*SWlEZAeL}{uDWbuSbvoJom#ykyq+}9U)F1l#6?6}3a zVzk!TIQu_DfPomJ7ZYEDc2lmE^{>rVd;lxndtb?>_Tw|h(b_{|yZnS1atXL?lZGj4 zE3|+dK5k%@x84>@m8FSW+yZ#;(oWIp4%P#9_O~?VxX$`&3E#STbEgj1^@4h4*Lqm~ zxa>?`+$YC{)zMiZE$y!2V6%`2}RK388=Q{*7QVySA0|f2K$b0 zC)XoPTV+8kA{Ir6ixab?i-f#!fuBr%NE0&oJa zxy{@gYe^Ic>=?-F7kqziKO=&j=aR4`JFAynvmy(qd-BU^%2{WgM_1e#9gMAuoesqB z%W%JhRUk8B=uvdYMGaFtxuT_s_qG`?WmrqYYrTVMq}jO-RCkjkA}h@abs?q7xlp6Y z^xnC^fFp})INg4T9P&}0D7Pskf&Agv0ohQJD$v1f^=T1JW6+4+OWu!X+0t$2l#5xU zB6pzX@K(AT8AsTAO^jC6(KoN0{V5Qas-v)>Q8$VuT>x*v9#c}tf-uc@Mr8i&-g`%$ zH+G(N>qY;9u*_F>X04uDiZ zoh+4{|Pp(IR!x?bhquHUp&qE0=0eMr1G^vc^=f!jj%Lc|MrNamUmUY5#r{0t#E z9uS!`3X-cl2<*+m5;N(arGC{k<(RTKI~A2bA&Etlx5ZGhrgtTNt!$q*vbmY1HUa7` zuM*eh7rv6+r}oGEL*{QV^8<7E(^ZNRVJIQ^sAUz(uD)(Jh5H(&77H5xUyKQtE++J4)Q15gTO6U%Sz1CcVvM0S3H>~`XT4x*rn zSRxfSOx?Lwa}22kduw&g(rWo4RxIszp57{?k3JF< zHPP&KW9PVYMjC#~y?f(6BO9kc2_+lWqmMg6dh;#3;2GFk?rq`-$K! 
zNhTP1AJL&Z15gTqU_$!?sQV>6tg5I4wxTOkGsrAi0YZUN9Bfn2XR$BK-y2|fg0D7#PDS% zQ6}r5f^)8_WrxD3x$gu!*|pXIAF~>zHh!fX{^MyWKwcOS2^7dzb~k~4PF$dYpgrrP z6($U@W5CngR#XUjuLN?IwW8&%K}yd4GHq=J}B?(R8mR zrv0h~8xxR*0csR6KcHPYP%zsI!>@eBZpGCM56QLr!L_P;^MM2vRwu?|LyOKbW&kq0 zzklB%&O^eaK^+R%o}gr6Ok*|hm4HbR%)C!(`?xjjF5a?CfLDt(kdgT-tEHfYk8#Kx zPns;V!7X75m+Ng4WXCRf1TJ&kRt}>I#WC6Z{q>wn&C8$>y5)qtVaS zRTdw65p$$d4g{{cv!@Kt=vH?&rM7xFfE(oOuPip1>Sc;XnQsvFgnlL(u9U4}CvF%- z?K|V1#h)Bi_~He*AxPNtaE{Q?EHWveuFQZ05jOQN)3+KEzMMurvGvYBbc2?)_mOt# zVvP``g{Gx9lVgtIqzKy<_sz3ORG{M=hAX}?Syi{TWTw|nvCr;{N84ZTn49=Fms6ZiKT#^T1OgzH#Ax35{IA>C-_;0ZyK z?{E|pp?%b0vrg_5lh^gEQsB}Dc9GP?on2K_pnu-lrDprBUGD${(o$P@vU_WI2OyYL zbc_9MC6Im}n0T00z-TB1t+YfxFvAJup!92yMZQIAIc1Ee_FcUtc%}N~+jI8I>K=ms zk3W%7&erqJW{dVDhPXlW*BeJ22QmHapivJ3|0sL;fPJ9^;)#&*`=qClZ@;41%yx8DjKQ>bE3GhZ%?(z2iS zlX=X}>X*J4Dt>Cc0QWQ1H%?@56Oxs^>@z_hO)u~gpBrEBjBab1!2E!ITQl7$NTQ(4 z$&yru1h(+uJ-xo04}TmQAb#7C@VD|qTAFoFy@I?u6#LZG4b^rQdTdm_2Uwg)GSnFxN_hAgGUcdE%T3Qo)scquM+6xyTgCoDi_`4LC1^&c5V2((64%@7)j9CND7AeqYiL& z;U1pTMdhIW!4v5pJsU-~NSrgU;F?pE;D-Y$>Ss6D*2;tKAIUcgeA|Aoy_ww+`+{~I zG-@)vTCm~gK)6Z6!Cm#3%4fKS@KTxi_obYS!@V!WVhrU7_Z6ga^7M=82@FH1wmfx_Um_~74_wCiy_e|Ty` z8ua2st=H9-!pzyk`ZAzF+gLO7q_hAa6!2{qm@c%qdD%rU%bV-0S8cmU?eLOdJvuNE zM6tB)c1DfmjDItKS1KgtH3CpyIPA%)TGK$o&5@zl$*2wHoo-5WIT+jpsF?8`mU$J$ad55e8l(u<<3AYR>Ryx22BHuTSP~ z;lh~50d3hFpWjj3&sOW`qgJOrq((*O=E|r6X0_Jf z-IjMSQA7N(QqNOe+m4Na?yahvaK)A^N~hbZPbRM6+7np?9*aE{D`D1!RV0n_AEQ}2;HsSPmJ~BEoazW&j&@sEmuIOuB;2jA9z6r)j z*XfJi(Ej=aZ^U4H2v=GB=vB)U+kPVBJ3Y(wlUe}x^0ILgp7_x1qN3@pb%xJNwESrR zS6`I{{`yyTr(v5l59XP)NUgGV`69+Cn!~+&4^1OkW@BJ63edBVs}5xElZb?+fxQp8 zoxfT9?&YG*=1cG)5Jj?>%i&6#)#M$C$$V}5Z}A6zA#gY;=1fk-FpCwXL$R9={^F$ld4q9shq49x{+orz z-z6Aw#pslTb`1*`b$W+Fwy3&8Zt2;kt1^$njP^kqsCL+<5dfaylrgj5?RvoC`xIq& zfq;J}N!N3E5uYrFNDcj%Kl}d4+S4yLlFngXFdmWY!{uPY6rZuigkwQ#`3Kk@KP7t6 zeiDl=)4BBDtL0a{t@8Vv;io5xaWyi&AiuU=982F(0_e-tThn@>Xh(k<_9$o(x;MG@35h6bNYQSd1JRlLz@uKjZ;TcMI{RTi| z{sloZSchkgkXYa&JdofO2nby{@S8<&(NB>E6pdORlmqw}fb~!?v0XJNrUO#)*CvZT7n83)yMWG!)fr27p5mnxV=PI4r6W6>C zawQ;o+%EVJ_PA;qTiCom5Wd2?toA?|_j5R)D3h>U!Ec!&4V($Um+17Ggb}`HN7>Rw z=VRR~d%z(ZXJn@c(qC4N1aH#_(#_pmNbT>Dqu!$ZJHKjqMr9AogEkss;tZEst%AdO z_n-C$KF~x=*hYjEzE@*sDq|Mf*T2zyzANRd1y49e}bjVOU}!*uyxzYV|N+g!VQDd-W?XualqhSP<0 zjw-R2$MhLdpK7#IOThKSFBoH_5ZM=?L0B!d@4M4ps_L;9i zCRc@9xftiaQ=j#FX;^>L0&D<1hiNqKJZ9#a8{zTwpRK~v19GaVaDgK7ln52~rQP&y9m(M8 zFZ>}VXzZq3h2+FkFWdl>6Y;X4-b$pOek0v}brh+?BS7nn)B(Lzlo%gQd*7G&Ifi1o z^~E==L0cw6;OkwgrzLj(%pZXC^8UBKNgMymsa5~ppFNb`pYYlE^@_?V!B6=bQ(LW; z`1@$HUG2>cmZs0`SwOi)2u536pqgY3r@XaVnx4O24fv;kzY+d5B?I*L;D7hY%2xVP z^WdwO^xg#z%JkrJJ^5;)aj@rPwjh)G1nj8dovrSjd_`Hks2K=YX+ua7kOBSA=`!^{ zt9#Y|!9Ulirn5yPCnRyW&x8Ino#F{kYS--(%h5w9Rm{HuHK`$ zC?o2XX|AWRWLPmaj0=b)CroeCoaF&RliTpq_MfOAK47l+;AP<(lsiXXxaR`!9vbFP zd}bsyaIl(}pcmL3iS$l;exfqZOiy8n2% zz?FBn9m92Aq@zi55I)&;EwO>?a^9|0?reSELW1V6?CdaZ79;)F>?{W!B!|&MsK9*u zM?W8cy#POcQD-AqU^H9@ySlRYwL8@v+>9pAn^Ap!U```mOabD$K^0lg+xba{G=w&} zsX(=**5WW7JoK|*g)=|4mVv0vQAO1y)QAAe&{Q1hgGc$O5iQeCaM)+{(C}Z79WTL0OcC7aO;1x)x9|=<9!$} zS)+uM!4jt4*~N8Nmvtr`jQ28V66*g`%w&Ls<d~`P3@BRm)GKluHoP9AMH| zrUSg;|De3!@85rT<875L4xwyTLKb}MV%xuSMq%P??-%anBP!RU>I{yE23&*rAf$Hz z3*OSm;TGmup!o&M#N&ISl9BVLu3z_k4>Vfk-=9!NuKU${qio}cFJG1ioPZLB=L*K& zbif}n`EpS|#!%zqg0KF7$$p;R!*-5;RebL6>L3tf9w_pr@_i zfI|@eQYH=)95`}4cHEZ!1E22W66$7LGCb~QLpIs124F>jHa{-S2o?F1@q^Kt&U5SH z8q+ylx9O=~M{*L?e$LbY3!Ju^AzuH)!S}4V{E>aH9bfz>;*6USSSKMhzaAh3JzV(L z4x<0<#%X)I?u@h=0lEamGm(GT(WT$uhOC92WR-sF+0OF^?p;(r_r}PAdrWb@=3sJ_ z)Oo@<#6Fl>dXZkHr|G0W9MIhLhLNV)odtEkVRk{MOC&G$Hn@{HuQ{?0(i=B@?D;n*_)LuTK0K@?Bn35qZ!4o*$g+(y_GZqhd%$EfNq+(?pMT-B 
zHLnpNwS=gcWJ<*o+K+m1#7~Ed^{Z;t}KV5wHCoUH>O=0Y)HhqEEnXr7t|=;N$#$GO!QUixfN!wqnm|&9eQN9 zyEa;V2F7nmThh!{e)1@&t>xJ;U_rX=XgL^0|9WpAo-|mxd z2Cemv&H`oYd;UMUTn_y2_j$ux{Nog%-z@1C80s_VSJ%NY${8kf9lC;Lc7S7KnXHlF z(tx_M7*JPgt^XZ&AOK(HXa!pn#X;#irQcoXaq57n`Wk=;qtxe^J$pDn=HDqu(|a~|=5EOgNHAe2``yb~aw>W=N>aP4ad+?~(vB-P7RQd{|1VBmOU>kH2 z*}jaU$J8FrTqGXujlAG2<8(GS_%{n27rJ}8-g2dR>{0(n@F$nqTrml>WIroNcPuy) z*M~i^LUjrA1aoma@Oa1hgkP4sOL|$}CG_%XPh=xEEU1LKM7s{8l3b+Wz760(%;!4_ zn=16%OY!7>c1R=xOVWGsZy)P_<@cc2sWUYYN<$-43g54CBS$a&xfv42df5Sd*M75W zD0bq=vlALdTD3w%MTt_zXOuD`7g66`%ak0j8{F*vFaL-DHX2y&j3?>OT#kTLvHef=Md zfey_}+1*DCjbeaM#C-~KuCshKAFXz-2yR;5evfW%@MT!%;BVfFA*<8(B zGfLR*Qf#0W|G=ijolEP&V%8@#PYFH!6==0Yb3p&3B>|bf40>@1Obpv}lNayWQTd{) zxQ*tGtH&}w|~%m|-d zpPh3XeBA~o$oGNFB{|?f;U3D>wh=J}LOz{A16?Q*&kSbt%Sl2^0{Dt!lV$hq;o9qO zWZ6Zpq)s1m`w}yT>ck6I8LK?CpGFHTUtdl@<=%g&#l{5TMf4~m`bunqX20vgC-g?>EYRlTJwLUZCaJf8`;M9X*GBkkPr*JsZWaU*T{uwq1b!AA}YfXD-PTK3J zl|!O({Ha%pVu#1sRbMx`?!gLh4Mz)|N`#HasYjFS=;4YCct;|SQ>k}My>4lDM*?_T zV`KLcX>4H?a7kdUJXMf8#9e!>IYVBNhQjlqrEU2 zOJ>s~06-m_`$`Zb87V+*=G!*{n*lm={-(_?vZ)Y2n)aBgRwSqRPdxT zpcrt8i2EebkZvSQ{85VT(Zcrq@Wlt91y+j|5IXvf=nQL9YxMy}1445mO^AUFYtjKTiQr)(c}AiZCG} zLeL#PK29G18er2$@zm9O1RH{d3dNb&-o^$aB(n?JA%O zeo5IQLK~qWT}v~<`f;t1>R)TDOGQ52LLpyDJ*ExEnVcn~+uXVZ=^2Ao*BJSLvE$q{ zX<3c1V!SkWDFNwoO6&}pUmdcu^|ovV#*K(0uA$Bktb1K9`Z~;0mFjnc7xOJw8ysgO zE;-VEB|2^u*kA$<)?g_5G%ZRyVfFh`MI$Ev`5lfYpNizbE60MPFzX5?-^Ye(T$8h9 z-Cn6{I7=-u>PRjT$Q#796VfLU)@#mrc7$Nf2wLj9!N)$J%%A6mO<-a8> zVJdjO4I(fz(z=;13aTd#pgB)G+9sZ37SRHs z^s6_YZoF5$B0o^P?BfZ~av0dYa66#Gm0c`>p2A3i$WH-$+hd~_F|MtIJf++3lfOu| zC)Isos6kyt0uztQ@0P8}6NgB84d5KOqDu80OQ>})`}`xfPhb3a!HG@&xYq0zLBrLQ zW0rb_!1kKY`{q&(6&0KPqTlbcc2UAEZ2iuedsEQ?@PT5N*H9s^iS&NJz%A!h#TeXw z`_p@7thVH8n7)fGVuyccD1vqdsZl*jZBEHp%Jj*6m7qqs!UlvxD}$csUnvb_Cjhh! zbGi8KWo7(Tg*&wDV`(8Ut>AxFF4PsrJw>4!;-Q;W_e;>2ntNd<(Hp5T`@`NDs5a=~+=AT*r5QSAf5YosH-ma@xR1{$ zTgH;}(US@h{ih^Z!imqABFcj%G7t);08sw6!;7_C6;XmJ7&PJ|8Jno&tnO%A&f_Fs{CL1od7zg{AB`{oaY%6&x3C!&|a_62Hs7$xV&h|*l+ zXjEMcb3Yct?1Tvp$H0j8zzfJ~j4z&t*)p4ooXA6!E)2#EG-FH72H50q5jiW+U!yMI znXlXEtzoUNXl}XV46q}#IpX_o7J$h49BBE6c@C^KmcM^}39}qzC!d5Wu~4scs#M++ zWiDT$2R%+@Ip9AUSHbo0@8A7Tp9Uw8J=jQ$p7k-Hs8Q48S<+0I+kKvmpZcV^sJETF zjQ9FXh+Wnk*gWmt@6c{#2>Aer1^!dX|Nob4hCK*rNZ};7gdWy4xb9i)nt5>fT5Qbo zTAqvprO3J{&y}~p1sn>(Gb43jKByqKT=X`{ppt=o=iz8|- zp+Z^aCiRyCP5XKgvkZDgMTf*bgMF)0Tp_!BQd-LqGkRd}7rqJ<#<2y*97DwQmRc^| za4oy9!gHBnjDLO!Cy>FVhJUFGg5PO8y=0aq$B@Ez(B1m!2Z&#d)!#iJG3av@{G~B2 zGh?po13()(`&~;)eBRHzoR$;8&aOIo=b~>zMX=iwV8M zbQp=NtNSgsNII-a$Kym2@#fUsK7`GgCzjgKijxcN-R?xu!UP6;&$ceO!N;gHG}&A- zEjF1oQf`qwYaJk0drQQ0zUZ~;;yUmkVDH8{(%B7beXX$C#I{t&h4M#~R7}=;2X#Qd z5ji(p@*L?wFuyTyH0Wt%0RP0LYYVnvDe*s*?;R3l+3tdPkwA5T-m=>~pxIJt;lcMw zNUxV0@7<_ouFf*V8EmhHH_($2mQ!qkfnCfn$S8K1^?S=6PUe$o=Mu_r;#~ub=ha;7BPqR)o^Z9L0ec_Ud&es!b6?X_9OQ%Om))FPBhE&ZABzl^QDYA{k|X!9 z=VE^agohGO1H&5J6!J(DO-+^2&Gjan7kt%4&&q5cQxnO9Nx)HpWQn)w3e0o;Ql5M4 zAbjY1-qm|`)G$V3dWBe+f%>tGOwn`0!J$91Iwwy|MeXjp2wOX--!AqZ9tZf&6!sZO z^P=xH19m4JcKa*AC=YWJB(3C+#nG&A95m4Ic!SDCWGTP7i-j%(G z%4VAI4ut+R&7c8YE_ke0oHBYp%m; zW1ZUp18w`)EANe7uQbc{BrpKG5JzVi$C8I9L{GeX<%yky*2TmnEEv^=JuHts(E4fy z$A2!+pep$1+i@8iKJ~zB?$>WzGpc2OtvEaFXQ<9!bB62m!-sEMPS|}MXfpQ7UI(Dm zye{4rlp-BkNqIM|Ra|`){6fJ!Z?Lf7kncCU%Q3a%xwpDcH6!NnvUmbJISQza7vR5d zpe8C^Vs^c6EZkg*Ok{v}AB1fdjNAveKG?M%oJE?Luxa(`8<^unTWStd4(<`AUVrQJ zvXDx81)i|Tet*~HoYB#!_uAR7?-3PS&JKU?Rrit0x}<(vS#;qVB5OG3O8-PsaR{v(HRcIJf}RNZCTwOQPZ5s@rBeCnj!JJ!K5*~nq1t<` zgdc#?Cx{m@mh4!xQdqBcUh`8qnM2RF6^zYNs<;f16rU_%6E)Q`wBP8`0M zu(`t_L%o?h%C*+e309iPia1|VLyz8Mj1(?Jezf>(BCg)2I=YPw#-9nXKJ2PCP;@Ydc8(UezP2M=vzLmyCoOFRv{QYdfUB?+x@=1 
zq5*q*50M4i%K@$_HPQAQl*_ZJt#gB1i90`Y_Q&T5Rs6Jh(!L#OY8n%%VmbqstLe-4 zoVb2je_cw-`uVMh2anIWa$ajbe4uW{AC^_DZ{lC8@y6_3bw|-SH6P;kp-;>4Jy3uh zN2tNXnY~koz3`{U%_#VKH@|G^DCok>uakLVFLc1^uAIjkxxYk(dp7wwsC&$~_U5o3 z6?tp==~2D0QI<2FcXO%^TQt=O<6Yr4d*Nd{Dj!h3wJsIkKKgpAKu_w|l0U|PU7qS^ zWABn-YQD6aQ^@(e@6vGZNw!0q{4c^<4#n38AgdpYZ-jtt(ggLz;#z?u(z4|Zkkqq0 zgDXkJ&Il{`fy_&E+QG47Vg|>uKs}M)BGqO+V0`pjj2LGkKmMrv7zvqcu34V>Jijs!Y2E~x6*^X?=?ZDm zC@2*4oG^9^Zt6ZT%eISL23%|*S0q_1?+Y@d2Jt^bVpw?v5pSb*q_GSufDF>DXlXJ} zqwn(2w~v+yy2wdU?#_=uwDR=y6n<+&X*MJ2&<5Av*-JZr+=}<{{az>RQFdUJekO{T z(|83b^jdw3GPSkd>Tp_8))u?eciC^)<7?EKQE>?TO`FfIjI&y2PKTaR3jBUVNi0on zCPW#wM376x@-HC##EGslGk&My4tr5#ivQ8#xgxtvJk19PM_#z5gCunbDbtGDquJ~$ z__U!=ITPT-%!hUh@^(sZ2wF6dsi`oj_Ta))&5Pdsq3h*CAuHp2pz%RN+lx}M;LDwd zVuEmY_oX9EjBAV&Z@=FUW%G_a*_{%@oCC39fTbY?ouSz$q|ZZcSn+k`y2A-`FPivm zZyHN5+Fd%nh%#A2Uayc=RKd7P&XIIfMd_VoX9sxRxtiL}Sa>4*An8FX;A@~boSV7n zMvw#WH53SQBnei)-Nn|nuQs!~lf3n05lAb0m#jLmQ~5&iFiRgTT+Z ztT|equMetD6t*R-J`6KasYBY<{2HKge{v4GiCCh=~N%~Cnd+9CgQJP`OzDyfpF5uCv{O~ zsOs1vy95y_mH?wXcUG<(=2iB3z%_AoelO42bd}cC=b}@aZkX*IMXNhM04vjwgZ=~n ztwrPRIo6iKI9w;}*>-FAxw{fkbU+Xp5de?!;ojRf1_VK&QX?=W(F-GJ^0z5K>_*EJ zFW^Tj8SW=Gw#h`Zy7YBir!=NiZeZD4&fr`o9qaeCvs7VtAL{ZUEXY&lD1T9i(mD|p zM|LxC??oi*IXTAo5+0(P&)s@`SY&8Mg1!K|Uqt6#DVfVtWhgsZTTd!M7 z&2FF&$q-J;gfG=OS>23F{!n8u=nMVDlKi8OdT2txBKHdmq}AwY7|Z*QfRiWW8Ql$t z1Hi~S6Y;)@itlrlw(A|17VktIcY1>xNl$8~CBz6jinA8$bgEY^s|bW7)gCQu8C-A7 zWgZncdmn=z@Ndy~j%j7InZ7%59$uRv?yx2>>pQkC8;Ry#c<@eOS`X`ZnN=llA)}i9 zaO%D&A$Fr!>Oz^)g)*CK@^^E5x!$aS=CuYER=tKlPQQr2TokWp(Qw=4|79zUXokM= z1h+hF&vaZ;A2T%+>fyV7)3bofgSf<`>*$6~&7h1A6T`i1W?gQU{Q~m`MYq%Q^>tnW z&vy8SIQL>|BkYQ|ynuC7Oy*|5BU-hvf+Sf`>_hSn#z4mDHzC-_Hc07e8~>G3ks$-k zO5^Bp2o#D2o%hEm;?>gXz8PyM=C4egKvZKL?D((HC`(!%`AE=Xo17L9R1=-IYlrOS zof>ugWJFGB3VrS`^hcJz(`FKGsobYsgZJOgUfent*$Nm{(ll1N3XfXU^IzRCP*fyj zue$l8%pmj|rL41jRiQHF^Iutbj!DKZh>;AJbbLXU)sdk4o1qj13_UGzNv0oO+Pr(c z-$U4+cOY~U`IH*rX*OOsZfLrMj8*TQWK%9;Y~l(EnR{e@f=E`p5tu7S8pWfB3NDEA za~+vJpX0m~-p5N*N1K>+NfG*nNM2Ep*r4s_X|h|y)?M)Ghc3K&18wv&oYy$!yzEr; zlCo6pJ|vET$qYUbn7o%-rZ%kQ9>XXK(MeY7z*H730EHokg#hn$jM$MQ8X1w5sy8@y zMyvagr!O>~&0ddX>HM)-M=lv~DCK6rx@$)Aj0{e^=@@NMom7iWJ*+-aTZH)nt{(zM znS-g%-gqjR!Uw>B5A!_knNE_SyGd(B_}5*9+JKrD^HHiB~$A{#ZpA-Jov`OR~~)0pxss8fx@yR#Pum($*lUI7UnJ3vT|n%T6% z27zU7gM2kl%2maX?rK!ZUU#ifXWi|a?(8X%%wcuOs8Gq#D! 
zv@vs&bB6LAV>xwMV)b<52OBDVH4zb*W?D zC8)2Hd?6vl9;#m^Y`c@M`g%LPxD|NZGWnK=LefT0ZrOF7KywC>ad-L3pzao0a!e;F z(fU1-S2x!KpgAI=u-{TBFwURYg1qOHtW*g-28pm{)H{Z6%~!zWu#%S-SKll?QqAwA zkz2GmJPPT0eIVMEtbw5@te;fcwdj~5MVW0%FjyY;2M3CsWwgkd2035yTv30t4?bi0 zaX2$2eH4I&I4lmU(qcu&ALQGG-Z<(!hflk+peIVc_OQdH@rEyWR8S`qpCC6Os$hjp z6(*FvBz!GV(Xq)^axFQzoaN1ljmqEP&YZ;B;n__5%-<|LIktGul>_Wh@>q`rZM2+U zS>zS*G%KE4Q!L0C0TDjz(R2s%$i$$AZ_h2MYIfINr&+fAh_DwN8F4jkq_|~l{+XD2 zzcNN*<;%jAizCiS>-xH_cf325KE+R5CILlX3kx%OD0C_;V;kPsT&HO^u-H{vGtLm< zT(DbX(FQG@YeZ7Hd$p=z=i2Z3yeSwQ9UU(gpp8 zDdg$RCkhSC>}_>?%GQ1*xMRLOM7)!{$}EPK&0l!Fd80Bal5;Aphmqv5L!U4Q2~#O7 zmLc1>&MMtHJl$IK&a_*|)ZMkw$}d z#F(ZcegY+m(qqno&{MOIs9u)b?V(3ba&fn(f*%vM5{{hf9v6H>DF`o5{si6Mj6sM9_ue|h@pLWip3p(i8G#c^&ombi-aUH$gvb`i^t#Er&RVWeI2@ds5JGO^+nm1@3-$^O~7_Y}CD zevw}9&_Y7T@v6ExdZ+$svFjb}jR*TC#x0!hmpySPML-61r}a6fmV$~HGaD@RCAv@L z!}09t=A3bM0*(i3mg&uh#td%V5!Mj~uS`$hoiy`){=(5UF9OuHLlDD7`(Mq-S>J^{)O)bqk|o1f9DK^kohyhGkXtSb{jjBf)n zu2kiibP=;albw$Ljkom|U-?giAud!t{cF=^a-`Is+!P3ML-z?CBn}T|d6tU3_z?Gm z5Av_imGtJ#&${Q%4a=F+9txEE6w3Id9!WBuAZUSg9~^d{(O{B537lb9e29; z!lew4hu_Aig9$>i1W%u-UqLfT%CA!$+y_2@x+ty+2`$2zEEGx_DXETJiqou=W_{o5 z3*o4KSs=UJ_?$|rgZ_sknfEn}Td=r6w)Ny_l4Mf)`gb`8o{#t9!y5FaOVX+?R8smq zYNTI%NOI5|g832s-Nh#px8)NqDcIU^<{{;EJK%GRpWyrDdTK3to7Qg2e% zK+eMCoPj~}gA9k9C(pz#(1Coo+JInUQP>CInqIraNw4)GpUl6BEkIw&p@xtH+{})X@i)v15Z^ z$Sx7AU|1*WU02SqcaI!^!8d4kuM^k2vx}4f@_PH#*joNToXmMY&V&4%=>u)QM~K_p zPD+5Y9C26^JMRPZ)70lWy_m9boo{?gRsV6Y~UrR8Y~_{ z({LV}zA)39wp6v?h?yQ|6xkQ zhg;{l+lK~=vTLJOM_%7|^s6)8Jl%lb9$U-aip|>4hMybew;N2aXs8EVNTRB4bKQTl zCg1L(d8(7&kJeaFZvTSMW)HL#$4?dKX$$x)szWZZ54nRh_WAC>}LcN6=u#xlF#lgoF`B_S< zjbV>Z@?^&%9paVzPQX49j~aiA52prXW=A?cdg2 z88mxdM|H|236>;xo(o%$2I*D^JY;f&N^zvcfiI9!{_x!oTPb>wmaCKX?pKgiu=g|^ zAy>akm;Z;VxpmS}h6-ZN=SCBs?GZ@)BxXb!EuEYcQv8J@!V5z5U1 zlIk~fVY+k?1q8$~!e5+s8u3W1A!<95$pGu>Gwl5S$ZhTC0>U`Q_X}TqpiB#3%Hu z3Hup2;>q39W_n=VV4E6sC<3EhL=Qeg33@>XQ$vrL=`g(MuYj}Df?d2jtHx=el@$Sm z>C3;}mJMWg9A`+U6e=;cRm2-MN1`F~hGNX65)XT|Zhfu+~`)7POrO{>Wk)uyV$S@!FSg=KHR=Zq)7%Md2~TCgnLNx?^Iy_T9N(l z`Q64PYbXW4U=a`35ntyi$|m66AaX4)y=r0VfB!uX%Z8mwOME^o=LVk|^jgwQP)b6-7 z*o^2`xCYmQ4%JM~<6PKyw{H>l?<3}Jf;WvOUfCOSv`_&evG%~Yoa{sc;c3nv!;;@W z|1CuOFYXIbM@!=U&7%?w`%zF>*-sGplao;SirJ+ho0uy9pCFJX%O;FV#{_ewmAwrv zJc@h7=xJ;uKbU%In*GWI{pfQt**|liXi1IxV~OST5I~)|#hB))eP6-5$^F zbXQBCtW-_n;{tCd>*juswmMWji>N1FBy!=YRjD~O{2VfUf0n?-!%TlSeG37wfX-Th*T%Osa1F5B-UuEA;G-zH&! z=0V1o+KxQyd+!IQIC{%?M6Y*Vlx3dhI1pzztwAv^Ns0&PDqZ&Tl(@sU`4tbF_PJ;j zTZOXHSOyjtWEa@|pY00nK5uDsOsuQ4d$S;SxPtt6EvS0nnSq1rpBOPz~4Y76`hNTAQsu5(4Are19EqA>E zxMr$YD(qZ?Xjliq?+$QlcTZy>%P@!ixj)`v`X4(B=ub65nfbqVXV7pCpuc+-352PC zxJAex8`tnQE%Gq^9WGN|LsijUlR8(GPrH6qeE5SE8~Ohj66N| zzbz3?rmqoh*SjeWmW7iJzH&C;c*Q%G@mjVddyEv+Yo%sLSN9#L{75J$jZ|2vgxx~_ z5!HdoVYM{^!?ihi8z~l~sp`Yc{p&|!S zxWmD%DS+}5i^%l82OCgrSDgSbYaI|;=_M#FY@=eVcv8-ss@sw!xP!XZ@5y`lsjth_ zwR9acD^Z9hF~YfyRD%8iA9*PYB35!6QcI?{H z6XH=V$d{t-sz~Nfmgq>mG2o*FuZu{%wFit}iY*f=e=%;}#v2oQBts*NX9Jge%qpJlX_~0Qv}$_FA6XU;E>tcpD>6E z1A6_AfJ1UO{u;6mHDy4MjU)$%xSvp`0@Wo5ViDkyWJivvPJATDBLW*fB2IOL;4)*I z*;_#FmlfIqQ$pdn6RTfCsYUL>7$p{g+{n|zbALpJ{N11Tk(4+CtknW%;uH+Slw96? 
[... base85-encoded GIT binary patch data for asset/fig_teaser_new.jpg omitted ...]
zuo;+_IH?&95E*{j0~KSN%03y~S!TZn&@z+-{WhWXhuF;gd&KEiQYAD1slhN03NoP+ zi`p30c?muL!G;ony|lFXhR|I+JZ?bW4`@(6+ykxoJwh$_?vxptK&E6{^uCj!*>hDH z2I4%K^4)cC0f;v7hY|o!%voJ(_?-CC9tiZFq){(I-_Xwv;L6utnNY(WsQd=d*Wg^h z?zp9u0yp>-+< zd*%0I)>-T^u!u4E{adOVs=7j`Jh|bssg@^nwVG+RfebmSu+7-ApRy=rfmpbm78E|) zxVz8nA3@FhU9b*!;OQta^m=z5VVL-Y*OUn z7)$bOdHiQv^WSa0(|La7QRnKWUTY>viD;Ezy<`Un-FiFpfw^P_TmrWvI;WX-EYI6M zoik{+Oe#Ji@*2rd&dXrfal#IL6`qsVyIFCQ#>kOtJRmTD3G5AFiAw%+{P%agnn#4p z$v7G%I+7Cg5Pj^iRrUnc9lgo46{8fQpy9B3{+oLuoX0v_JG_Fv^eQnd-~JFWH?VfW z>`DvW6imkb1|<~0eW9$-7?+$9JeaN@oP&|09E5}Cc@$j|#InPcG zZ#L-ftZDX(x0>ii-tQ5pxS)pIRYGbYRdW2EkYaT%Cp38!;gtzBc($xAg>kEUx3Whs zJV~aRBP+VwPcm^LV(LQV>aq57&3raSFPo99Ep=R>QvTm=2Yq98IafBuxihRmru%GzJw)Wfu4W*vXJ4S zUaxy+$=SIF% z{{Abbk7*D(e!-QYl$iG*Up+c7|6Xm?=jXSso~bmWgjF)2*@+Qju-@^FtW#%igw*oL z)tp?ahZ6);x`iD)q{r^$2O9Z*+kak>9>nuPyFa6o{Z%3%>OJT#ivvr(<|AJOFcSA2 z#VOk}*PL&X;zb+5WXTGY7EP@d0FSyz{co z)O1-?xum%bce>5nQ_4q((L2d?sDrXLPo>rq(T!hlSXHXmAZzhs2771fQC!4F z6+nMlE$KPbgx+IhMK$09T&1~bN`gFh3BQ>>Vh>aV0MM%QcdIWU1V+e5H}*gSHl*3?p(?qlWgm(5 zVsTPG(=r9Oc;$$hv>R$T1vX(Ij;_M`L?HNTq|JGasB%WY9pyjye9|Z(Mo!qgl$XQ1 zXr(lqI$DJ^p@~@NtMtS*ENkIUsPq;LKXf@g$Ezw$00b92YpRhayKrJFLPNuJ(6ZNh z!JDj=>6zQtz^ZNceLIGvq_W$?XBaUi%rW&$E#bIbcb&j+f}@l z`&2&5&aoyc8VO34be}x?38?Bj`|F{md5yuCF_rUe4ILiSt}$dpF!s_3vSA_KsTWhb ze>Qp|3a2)m%gYQ)EI81cU9nOg5viH=sVVgVm+^g`TAl*im^<^Ye}ep5t$4S^T`ep- z`PSY*`K7Eg20{1GOSAJLL#gcRxwrdTEWX4S?4BiQBU}5rhaVyOYmR5pU9J@teoxac zFoS7P8|Dm@UVI2ic^1NUzRHJDIX!3os|t^zs`XJyHxZ04x=D;m@<=-38=)x@9%)ei z5~k3y%|9eAW{>b+ex-}W1_3BaID<}G^VRHjI z^)YTIb!u;L=3Ul#H=(3aU=UQVG^>!0G$}ZK{NOQ=U~e~<^rFj_{b8v0eGGTj<#l#} zo?ibO*G>uS|M=C7-WGeAkZqk5bTdtJTe$l{=fOuWx3XS~m>}6_czU%%8ZJDC3qEC6 z$ccvu;6)#-B^e%LgTDP}<0k4LqgOP|vR{%|;%p-7_%|{-H%p++<_+s!KxT2~?;w!w^R2dL&_s)Ig*o+jHt~ue+0}eO) zw3g8>TYco&6QO^L+&kvSVST|`4;^&hFqQ%Z2F9JI;AthKHoETUC`&%D%WI0z7OO>o zp2A@xA^B9}{36&yd*6rqQ?SzaNS0lpo_m+VRpbVM$#e6!ljpyDd^uT}w8+$#pvcDI zH1L6Nspwl1MJRJX6>IB5f#~};X+Ldy_dV*;E_+XTyT{+yA)IFT8s#pMOZQ%llj$Q- zRzk41qD5i6@9Jf+p+(vK7Wq|<&koOm(v1WwQ-y?-e6&eTipmf29Rvk{+0Nqgdk)k8 zNtPQ(p7=V%N zgZudJ`Y*srve*B!BF6w_CC1ek+*H9)PHKR!4@xG!MFGORGa`U6FSK#C4qyO`cLDA; zcniPZ|9tu%*(3i4{kGO#$3TG_c^=C8GSou8EfUK1!@D4mC$9DJK{~j#g z1L1YP0rhfg;#7cX_9Y*5#ey4Pzg=5|jDv~rH8vovLf=5$1=Jn=lc^k}I_&0tARbYf zx(Av}-vjwMNAH2o?18?#MZxc`5y2=;7-$bv149&1U+6-&d$433Fs^?TsEK&bil%Id z0N`b1vO;CxB;66#5$OMyq_QM=g0i{=b1lh4VW1OXDwi8?eCO9!-QdI$IaOa zk`^%;*0IHX(1WAx{1(K;DylbSB=ZUjzj5wcMn(4qLxRTJr?$t#?1WlZKAJL$u+nEs zUBVP#jcS{TpEa&Og+ifZ$Yk^u7N7(^S%C6Z$xu!mxb8;z@bsd|1&JLQd?3Kf3D3LS z{Bd%=nd!Qr#!OyG5Kq2&>z#+-kNJPye(E&=bsc{)Y+FAp+mDs!$IA1=Gx-1Cb5maN zm~7cd{Mj=uOG6IW(;dkq8b8p`^_=zuVsgIF?T)NPgj;kSY%pyPWFxY{(Q1a09B(f+ z!|calR|If1wbpMRjjANyAs?tvf1R*9pZb=`31YZrV`t;M|ItuWp2GQ~K>?@k^SnA< z#kwK2j5+QTdGSoln+;fRTa!+rI6@{A*xi1y_@C?yS#e`4@Cl}K8!GsU&+bL^mN|#0 zyEBslBRK$2bsy{YSz8pd`k$TsQ~o(n{Kb1PdC#zBj4^|{M= zugj7h9}c>Oi?8SUes`yIM2~OFvWTgjZfp9ACLrPCbcDP@``s zDxX__Nn=jR(=13~`We_kxcEXlVJ|wFw^KuU)7b1w=d@63+GrrqA5sh-)1BvQHHGG8Xm|rsl}*Sgu_L&b0c{P9Z(Z`Pfb<#k(4?TwNjKdK3(Yirv&_q9!^nw)%o^YU2emL3UII15iq%phjD0E`^>1e_vwx!d^ zTt2OjMX!wwTOd-sfS;w54pU3NYYo1a(YtO%QgY`l{vti`D!$#y7OHs)oNeDMSSP%I zEs4IME=dSp+0=`CNVXvAIHDo7g>Mqpa)FXnh5bcVRDO54(uP5@`iHA>-qTmfMJ6RP zURgz6q$rCl!G1ab{*+_PfRzI{%Ks#MU~rTC5Wpw&)F z&7EJ)qyNTdo>L?sEaPD0A=*U-TTQFJV`U8)=}o7SE1qtVB(r z!Zae=fJi|F^e5!ZTg1d)M9W+~p5hpI@PQx5Et+z-V-NHg7=Q4TA@p&$epYmFpn6qd z#G#3eLpz?3>Xm$+D5L0|ApXf^HtqNL5O_4;&jku;)=8H! 
z-Y7G)S0VNQ*^bF~vw2V?pR}Z1_Hy>fZ2HB<#+hrho>7gU$3fEl&ChsvXG$XY76;7b z;gTN<8xjM)W_I}IM|a><0VwLg`ma%o-9?>_F^%%Q=1=ToWyuT7pYi=~FFy#s9ButI z@B_f4Ch55;Rjx6`eWXUK@hMstTaa(ctz#N)#BUlzWAdTkez#X_q3RyUc8YE2)~p$3 zL<%sQ(3hGEV}l8}*AKo>9t(Q0zcU;eZP1Mx=p3OlV$h?2}{PucNuG+79GLIcEHhkSz&B{1VI#@TRTxjV4UXI2WNo@_&+L%7Z!z9`SLlsEV&Al&QjezW@oO@50UuMuDI z4Jnb6&t3DTL&IcexS$Wl6>-qMx*e3 zM@3&;r?piOq~#g+pUd3b)iJs!Wb3SSX-W+Bs)~t7f7N!SrooqEp0|yli_JFeg(Lm1 z)5c)wnE6HQWjtrl^>f8bMN$<9&^Wbq(>qMWPg}}Afz6s4hX2#C_2007 zz;pf0*yaDDa{;hf67e|788MnlH&a@^SRwMIhH_%2;t^A$-a%`iG*TpT))BC_2Xl7u;#sp*wQdAy0IJMCuytG5XjG>2L6R0ht5ZKz+G&DKx#U4nNOowrU zP&v20GZ-Qi;_qb`$6vZLF3LtDDm_^6v#m|E43-W%9i{m$^!^0&{4 ze*)f>KXU%@U-7rCgV71=%I-P{ zK!t8We}bJkk{XNw*qJS=U!1D*Z@QupW=a05GQVD)|4Q-r|8cPalMmJ^Gq6;{TNpZ* zvTd^ma=!rNiA*r<{-jL8aLG47eEjwvDAONc1zaoiLoFBJvR&>c1jg$MOnqOs4(>DD zB3*e%NuF}{y&{EPxq}*)_&l{$hYdy4(L*&xaVVA!%4{qXYnt+|$8b zCAk{kd4KDtZmfkcfp)I- z4Gf&sTxietV5RJ%`cgsvX*jSy2m#^5|EBXHb?qi%Fk_#cv_Hot8WDN*;ms|<5lu2@y$`?#iS}A(Fect zNTjn^!5?^DRFB})V-PpYBkl>i*t4mL(TAn?=f9^$#9AE8IFGr{3cdAG&u*~0qLyYX zRgeauW!)KMN!D@8lC6N~Qo+Y@`+W(covj5hQLMgZ5x3`JOzxky5j=5ka%%Bc=GV~X@rARdWVZ>gQ7R-J`wUWrG6~N+|kf- zKF+xYa3opr63sp1d^=738tiX{>idFDkJmk+iCD^#vE$PE)Lm;?b!0RL{$N3mS1mzV z)ZMgkfkvk+>ZT16jF1?|FJ)5j#&b&LBl>e)hMYLfonVDmHdq+ zik7_{*R{OD2}<(NklEg3Nf`Mjx)6BqtroWlf)!(_l@%ZCIXUN|znscj;uxhk7!EvU zv6c;h+u6Cog<@@MCic;AJo|b-L83WUU2)9jh$X{V>R0-@P;5^*EQ83rZ$*UT)=p{J zsFqdLoc_M{1MlbA97y|R`I&Yo{rHMuhcnmpC+y#N3(}0ETFn)~J76ZV5(PnK&(=9p zRG_r(?V`K@m6G~#a45Q=M2mgq99np}zwoW?jCsE4sk!MB)%2otT6NvOp6ZU=1+6rT zvFNk?EePz?%&Yo)lpL7M3XJZz4C8sMOOSqb`SkhM5$-4#RsYte9v)+&5H``MQZ=Yn z>d^Yp{7pS|JyASr*15@CWmdUf#(!nuQ{z{DYY@-L4b$EnZ!hjw>;v1Ys@P8OF>DCD$EYnhxtBt8`WHY zF2o@vX(3^DUE|6rgNB#-&%r~O$09`=sqFhPqZ8>d!qL%jyO%@G7QCM@VHM63l|IZ% zrYp*}wc$3=;5ahMBufxKqXWc;$4!3^3B_1#e|pT3tLOVuPaa(c0c1Hw6HMi82#N16 zSC;m_E~?Z#;Vh-FB={CU3GR?r6F42`Z*~Ur`h{=^;AJK`g#~`4@(98X$-Km70D%mz z<}$N8xMCW{cI%UfRPe`KLH8|-acECYBOH;ugf;i;c2KCw?2l~JbZ0SEeu>uYf{l=c ziRJ0!r@sY;%F^+4ym9K1x<}AMVB>xEfs~omz(HD&{>m$kWc%{UY114$1Xy$%~$#mtHCTt@d{bD zUR!?9d78JHfLUQNV~$-Gy<<8GeSb*1W*}1XK(DdeFF8h&2O3zSFNhHeOXOQDZ4;c| zwi8V@7$4cSd(K}kre2G@+);9?{d%FPf}Z9B;3K^p%z8yt5ap^-VE5uOX`OwqAX0W; z>a&mftTp`;y4xqORa|O6y($Ox`<|bF6VRM>C$ zpiIx!UdtXK@q&r;9t~8yw-}JV$gMovHFV2j^T2ZIdde?GJs_>BN>x94yO9nn2-;}2 zz9IL@6DtX|t>1#cs7&GYBq%kA;lL|%j(UZj( zq??WCwVreVVjVsix@RHN3#Lpy8aW1SJX-aDULv|TMYl?YE;e{X0BW21k$lXATFg{^88}XdDl;AaQ^dvX@nUwkTYMob8Sp6=c>%xl3GMOk zf25=pv$txdm>TYSfjC%)BojKphSyV3_fDt6J@^zp`k>#`>dt{L z5&Lx=b(+TBSN8L4Fu)$re&=t@-3J-Tc6K&@norYE()fa2t_R~@i$hPZ4=xSqz=`MD z0yYDpBzW(j&1(Qa1^USKKNHfSzb&7q*s!9jYsOJq5%2X`JLF^TT#3XdCE*A+yD~%L z*J+W$`5)_kFPN0S0xXeV$i2UdasNB2kywPz=pdA(h~8e;@Ox|LUb%KAZu5U6d}u8oto7Sa`J)sm!hSrHMLRE;<9~@WwTrd@7Vk+iyLZSau+US@Z|q6*QIaf2Y!y<}i_@ z=5w8>Gt1h%SLXlP^S9G1zl)#{v%~@Zp`EkqQ8THsk za|rs0i4o8^|`29ch4(WIADnm!?*tKywx4JRVQauDVtpM zMeduN@nWQv3Xqn4u4<8)?5L2gDAxooMK{*@yMO&OX4Xaa(&IP=2|D&7NsF~lEhrzq za(}_MhOuI5c;{8&@&Y~7w@{~Ei(;^1^Hbt0svp%x5g)fu4Y?_U8SS5~kF1J)WUh9V zs-U6wn8H-?C#Z91mvO#8487-=JeA~`lR8Ja!VV-$@1MX}=M{%viSs;P)Nsm*$)bzl zWuBem1K`BU2j^Xhnv3*%cCLZ6gl2TpVuFcGDdbizx_VDylvXWP=pli(8IzuNDrL_Q zd1yqJ3;RWt4i$PlbVK|u5VY677NpYfz{;2VXK(P-Pu4w^OrXzS4nV2;cO8S)hY3zS z%Gyb+GrYK}4~Gwb!5__`>lw*U`git`=b^HYrMPR(6!z+xmBrh>0W$1U5$1G7vpsT| z6jLTrM_|DVJek zib8(QuOo?&iyWDD(OrfqV8sg(8yK-j?=-uPj7tq*w}|j`JTNgtdxu zzF-L-W+TAvO$V*usW=QZIC=2H+;H*w`~l-Cc= zVr-`)Bq62#EmPF;Y;h~eIN*^^oy}Ls5GVXgW4&c&sjoQGF{q==Xeb{px%UV*L{$c_ z-71o-55IW~8fz)a+3=htP)i?_IA_Gc*IVi}Gc)Fm0am#9Z)%m%ArjrcUq?853^Kv3 z86G{FJK`!sRh-F%4EelbJ2C@GFGnu~II61P*1=>F8G49ZD~Ep~X+U0!zC9pc&sp(P 
zr}oOcfWN*)h0rnW*7s0Pb@k(C}ZRaQJI(fdjG_|B*S6ODzl5 z#*|ZIU;t}32do(nA@QSwbMuZthTq_AM9dC&c=bRT4;?$W3-QMetE@)FcrK_y`hXTsydjZ4qIHLid`ZGo(JyhRi65W>kMkPcaLWGB8rj+ zMGfd5gYt=x1-L(WKiFbv`6i`^OiO;I{Y5#P4tPZ6{FI`4J|GpHn9oSEp#`8nbk8$k zc>aiX&la$Q%I1^5`mn}ZxHz#+-o1CXsoIH8?9mx&;_<;j`C7Q4Nb^4^w5{{aUi)2%1uC2MrBGr_Ss!0-*zudN*!{doYh&12zOQxI*y4j0WGxmcs z05k&jW#3P$#aVg@vw&4U5K049{eM+3`pr?tVpN;QOdA#j+);{&XPi554B&i>b|`_Va=vQs-`#z zC@90Oori^nbcUv_RL)~i(xEj#$k{EyXF$s!-)*oM;$14!$mC>msUCre>-{14rUvMu z5Sj^BGMZGd-9M9vJygC%*6LNf{_9zv$@^g|_=Q4DMUAP=&2nb7H8BoNsiu2l@8HeYU;73 zNIPTeulJL+#J?rmO(ZLkll|-6xQunV9{V}kTn`)=UPNvH&cRh~?DRf8_(=KLk>$Q< zZPkD>(-d^-*Q*~@LlRe`#}m9g!;_FzLNYs$fBox2?Nc3=L9?O3A232vr0C_vy-73L zvOvc})H$G;Rw*D1D31H~EqK$(iPtjf`an%hNn<|Haw-kPO1U%kXq_mC7%-YfUBJAE z^foH{@RS&z|CvSO-Hjpx>mT%RjPSKv&d)T|v}>gD zdq+l#hlz+oSuUsKloV)tas7c6)kvwGMVOT^W;IyoGqh{a}oRTNX_=5Mz8s6CxCB_mP+b`UQ9jvUuJR4568#icVR_FQn zt$pJt@$?C(?KTsPHgbN`eNO7N>gg_-91ovt7Ng~y#_5rc2I?(hV1`fSvtegk&EQ7D zLRKZy+20!Nxf2mSW>!l@ezhu1Bae&Y?B@jgoZ8#!yqDc)zOhPPa{XGNyhbz(2V)t} z3fi-(MZb7+hL-G~4Jed)L|!4b%{_>`e|@l+c=BWt9UAbW7?+2ryuUrB8OIdb&LlZr zD11#6Z=*k@Hv;HOcZ1Mz0hS~VALbUA$Y8BWV6puZGy4N6vtQAlyhI<3hncl#pYdr! zFM%0-Drpnv2pkdm$r1e(k(2_IbLzsNgYJ9bBBMiaG^@J~rlt?QQLd!?^ul0j-Yq?b zJi*tT#{TbU&_9%wU~N=(q7i;`sipc%sjsI@Dqf)3O+chg`u$91-lp5y5l2+qS3Yk40l9*Q_u?Qg>-0lDf#ZMKcf6)i7z=QO?7%*eWL z8W9I;NPA+3$sl@2dfRG-2#BYTql{J7;T0x#L!}Q;Xc|msPE=XX~MFOR9M7hJ00(4U4+6 zOsB?ow-t8mjeq1)tfx!V5TeWIAZV#h6!qp6`yrA?;7Q-7a2G(P2=`v{-TC)?bnV4% za#ko&05?nvue~X&X*KZLIN}Uq&6RfeAa@pX2NmHfl_;|4fDuQ11_1MM zGwJa1T~bf$s^|tl55kn_Ajm?NvQ;yp{j&BOXw zP$+$)J~T;&h&CtIpaX8>rOF3G!d*%_a?M{{36Xo!Ze`6=&u}8Q!;+sSH82I|JYtDU zOR`7d<5uOy2B!&e*5#qp69~@Mxyxu~IRdvxgW9D6y-kf zrgmO1?OtdRpCBa!#%6CArKk`H&zf{EiTL1t5waohrYVreXi2Ww`F)xg@!qV&IwNya zVeEPWA*7%+r$$+bgYoYy>&e(|QZ>z;QbsUuJYhYpr-JbSY% z+eB((8RC)(;}zXT?PKt=HGqcWSYp`L3$K2~`+EMS5!Q6ZgMtxho$o!UD`zkXKP2cv zyKaY+cGfLz-Ae1qsr($jySNIx z_(R=s*HcDqGqn%e9e(G98#tadk?l8~8ZxaJT`pG=sS>}!93k&%)3Ix`MnSG?;0?!m z)(=#4YZF17Y0IKERWwXSJ7;RWXT8@YogKb*<&(t*h0f^Z3PDUsT0mak2MX z1Hl&wY0!kLd-=(AF@qQduN^OSwsWFI=F|)KmkBsWfHs)F6m>L7AZgZ$?Q1T_6}maDM&&37o{Xsy;CbiU%E&+r zPI%?16iHXEjCj;_HNpDD^0ptaEiEDGx zR7m^wDCvf(X3ycv;$gJvjTx2FAUUk*|!>QH#nDAoR0E3`TfxkHRo6Qe3XJ;9ks@7CCvI06Wmm5 z>w|*qosxdjP}e6$D%T~8SNEgs;g`yvW`^HX zjn6Nvq|rH85NwHDKS=`cxpPXhuNh2FD_W~^uU9(viosRTbZ-jG$Ksp5*vk*;5oIa2i6l0kL+uE}YvYcPt_lYvE*~?sX@&??|$a4{?d+1*9zu{#pLD+tv zX_1kNvRqp2so8;C+lMGeY?9|Qv2F*ocqEoFD*6kQQ_(g$SCy-L8uZ-#vdn_a(bgPz z1p6~hnqXKo+p*!U_ioOpDrWIP1h(Xv;l%w#DHrJ$t$-E6+P)4R(})T6R+jlKlhTn_ zxb&7!G@8>;BUR`$>qGvJ{)A+TP^jX=_WV}HcM;PHy2!?Z@1{p-d<^X$ zL{NdSOiWfj)%EEXdcbJc6UsSPNp1ingMi@Z#i~mpx%Lp2vxAw1A98v+ACXYUAT^m6 z!$3R=1iFk|0p8 zft0LVt;!2G5-&6Oe=Ny=yUqaTB9Z%>QVS&PgH5Wmu6nZi`_-|M(e5qSc^|}ZoIzsJ z;b~RaXCi6;5`LpOq7up=4);_8f#wXQP{iK!hw1|?@}#J_v2Efxp6cMCOV=$u z&wo|Z?UouE*tXDE%Fx1Qo|W~eV-`FD#d&4C#$u16GQOArN~6-*ohqvl z`K@?YD2V)=$UX-Rx)b3-d?!=g@ARpte3$d)FLeb|#jLrQoN2q~bj4F(!pR$*IqNQa zVvvQHQrKs3+e8OvtWo+u_%Ixqh_wKw8K6ex0fDzNxK_y4Sz^=8^>ZGV#8ix$qKJfB z;F~snFdA*alD?|%g&3^-uAXvMUWTKfG$s?Hr^rb?(CG) zf1_{%Y4$Zxw7!OdfSomKSWsceuqU=A@s5h^H6hl30L&Q4A!Vx*EwACN+fT5g($+C> zHOqKg??Cu$(WUi@ur_w+xc~+CIrs(pCH>yQ(NhF7e$7vY>P!V^pb7V?-Y5!lE~`Bx zw$yJ4_PA6f#4E4(Yn8x0T8~KdY z?GX=;E(Wg6K%xM|xg7oWs62u_v3E7uDgUZ$<=L;Y*LZ7&9=16a%AwWxgNAdNu~U~9 zlHv_QGp>Ez*E4h)t8v0G&AC0>73ldBFNy(Zzl-nXzn22}Ytk;>{;x$N!<#VM;!W^( z(AYaJ@aE8bh|Y2*f@-MzU)n_MkMI0n@YnW4oyyNS>ebT9!Y}Z-%-Z_8rwsyJk3)xm z_LRR3YbO?{xTDWL60^L2axDD*fBn>HTc<0nQXsD;1K~^AUGI)T^PPYc@=P}T9;5>N zuGWNbawa4LhdRB|XJ4Iq;1$Imk^3_G7bmh)-xc|c?3=j%l#+RcG<9l#k(9xGD2a9# 
z&tui@+EXO!b0>t6TB&r|e2NltN$@(~SD1rNH}D!XhZV204wRa;f*Aq?W(#z9mMiaw zbuH{lTr&IGxE%fb+aBj_bX`GLOWA5gR6hN0AXR20{7v62Jw_-3pau%;C_qJ28P$#TWA08k8RzMeK z_p(J5FtGGb18u;7?o*tVKGB3Y4NTehKL0Qcp|yq(!UDjNHS0g5C-b7GxEYCr%41Nm z9Xa?PTQ_A_{~sqixdY$tubP|-JktYNEt|UrnNDh@c0f6Y2T>JQdr$<6sT)8zRG;L8 zN{zu#yP(3uvF4=pZ3j#sV1tZ51qa%ZxW;`kEY6DpVYl*-v;FrmzmgBd>jx-^qDo?A zlhG$2abU>l##G=p;ydsfpI-l$lm3g57hR$}(^1f`O(gX{ zgkNZ>qkIxcPy%%Y%x_wGxetazKD)h~YAuK_vT87LZ7qR(%z2s)mgZ|Pg@m>4eH}i9 zGy@L$@t+AY|62@=V)Yi7=r?@A2iM$bd99LP?RR@bzWn|uYe{}UjLaa?s@Bxk*S-_Y z71&`DwxTzN$q?5F{c}S_1d;uv$HAA*JwSD7yl_*otnAAV@FGs;20R|7`(ESwIsknY zdtJ{32v9R~Rq5+ad+TL8nk>vpviCB+?Dm$Q2-LdDV%W;@H#DATl*&7YVq0BL##ud| z;IZKuo(7kOZpgc5Opg*ZEsQUh)c)v9l-4ZEjE*83{bb2~>_+!A#3JOYu9ivi)BM>@ zXV(|vWV&n6Bb+q$MwkUvGLa-GIw1L z=hC}iVZ^N2;rF#dUsdz%-F}4lDDI8bL^Vo}{jkx>u^i4Z(Jyy^O81_KBbRk zh0J~1EeJkyAeDzej8N_o3rL1UXrpU;8Y|}8;Dl<`{q8S`yN4R+H`FWCf%@x&jgr!s z<{QJE)75hFM;_<|E8QDu4%<^@!mr?By5Wj0qF-;t+|GY#ePv*{a`;l4STnY5QS!EB z>Z^jLufB@8AxpbF$wvY|F{N@%&fG1D)vs@+1Qc&P~NWbYUL<&`5$u{HFl8?kD{~Fc@_N%nim?uq&Vc?KH-#&+!E;Ze-W(E5Q0BaxyGWhSe*e zm847ds3UCV2?gV#Zg1>%| zVTV?+BWa?0%44T0-YNJ1eaKh>LCx+|M*K!ju%f(mx^c^uklqs_K_5ZI$v^CMlG4gW z>_r*?f;iXdzjxdJ6XUEBg#b9@ZyJ0)=KMu9hwak+FO*d0T~Z2~%SUf&>B*1!>ce!z z6@!ZrF|tbtibejN5R_xyc#1>zRH&jsqtlJ_!r2)}&)_AW#pPi0N>QGNo|FEQI1*%Eyn=Un94EL4@4-mHDVYB0@WQRfoHw8b^ z$>6Vi?@VeqIgQxV(%^fXb^ReV1jb2zj9+THaH4%n2UD4RxY1`agHZJft9)`-9_VHU zeG5KQFTMspzqh>fy~e@cKP2+nSo-44pV7bf(c_R1pp5+%W4Pbtp@xNh}m+;*iE@w1~&c4<-{5l5+H(pq`HY|INzNQ-` z7l%9Jpv}#qlMMZI5aINl005*ToQVx*y~aSHey=iZjFCgaFv4&ZP|-Mn7Jd4th7 z1uvs98w08TVtDE6(2}R(10nqPJJ&PQNB))z@Ezk-*%o!esUjM9PkjR2) zF5a0OI&%yH>XUYvP5X$^Q36D_T_u2egnn}P@h=P{G{BB=4jm=kSrw&_NC1N;TACud zK~IdO(w`u~ZK@JUN1s1Z00idIawG6DDu4B?KLcYoS{EWSn@G+;I5a35kC+bcLs~F= zYx@u?-Sb?U#aa(mjo2o8=84sJ!F%7I^ct$G!i=o;7Wut5Nl$ z&Sg+sDGqssVCI~pj7i43Ior=2a!!jH2JNUB3<51?fQ-{-(hOiJ|L=_#{>BGEn--u) zn8aYwXDo<_yL|IBtc5nzXJ#6ogO+ca)V)QwX|Ao+Y3~QBQ4P};7sfQKEEBxl6NDy` zk8<`)36`sTH@@K`7)g=Wm;hYHWK(_~UdmwxuXUJjrv~G9u10a6DroZ`lsWW(qB{>C zk2Cc=jeiZq#ApnVzZs}Xi_N#C31s56nd{Fy7>`}QRCNZm#rL_N}`|P^jOUQTWL@2T$?b7cT%Hfg7IO{h@^tU_RK@bVkwhpLi3H1Rlo22Hq*+{Q9m zMh&jS9|q#Vv${n8xl>V8t!m|h3hqKwf;4Kf185c?gxc5GLQwAhLubZanL};=1j@4- zO`IQk`w*=99^JzzX`6{4X#X!w!F+ZEFK4nwxZVU`Z+xD=@Ds}%t3G@V*f;)#CJ!fG?No+FggJaI+Y>OF}Qrnav+Kzi#Qun9>vBH2stHxGCKc{w3_S+j`#5BVIYSjzIUq~>!6%X4)SzUF zRc3|R3uqzx9M!7!JMZWrvjmng{2~QFR$j7Clr!RK+(`6D_d#x9Z!h z=K{Ei&n5$;t%>T$1q132U8*_}{s~ihb%LCni81PF&u6A^)%k zcrybaunRXfvr@lc0S7HPvyQKgFllUO&8-O(WXTTi1=Jk(fV&8YvUM4qXVze>16xc;z zhF{Sk(*sG!4_Q>sN*p8Q1n~BM<^e>^YWSKHds+Pg^#vv6=`V6KunQcQqWo98l>XVp z>My%Mh9)HJ0z1f9l59@&VI+`hgkxpNKteJ70gA{9IR>$-lck*-!AGYwfD3#2hYAWW z`eD&WN(A1DqYeO+ASibLGN~g`1-RvK2wntE6;Gsc*5GI?9SGEpi2VK>j}F~)U-!Q43OoCc5`$1 z;kh!t>oHmhcDX&Lr!y+j)HLY!#u)HQ`bHbz4~p_RKK+ejIl+4p$^C{EF_XWRPN?!r zk@B!7S5L!ed{9(1b5m;l5@6mUp{3C82gqX(?s)*XPh8w;N9F<#+{WB$KA{q%)>5+OE(EV*#u~>hC@NO?0k(~ZoxQ;L1m9hg>*H;pBfj(a zN5Yu(I^Q)B*Ny0gO;Nu3aY`i>puR!~Cwf|CbsftHf(ozrUBoxu81{T!-e+1>;UXSj z$JjDv1O2t}i!o`|i#e{mWuoH&g)BC|&(EhdVqW6N+i7wVIZ<`7k<+nI_KALF2k!SC z1&T@ z;CvTiJfngna$kgxD#srnLw1$eantYCJw&~YZ%v-*dh%=H!pE-`Y6rJ(v<@Vb#fVvR z{;04i$Pn4u{i6(bh^;~O7rO2HC_G1F&{h{rwBcfcXOm3nvkwy?1>1;2!Rfp``2u4z z78k~M!lH!ectvyFTps&kuhjD7!IKaAc^c7N#OwX9h& z4U;0r)MzfX+tQj*UQ2JXPDkApIECyAG}_N6Zsl9ZERhAbpPg_kF~y#D&D}luLJ(f7 zkqqfU#uR_i_cc^+e9p#jQWJWPx;~DpdRRQRU6qd^*rh z#GK+j>#Xp>@TzdxJZWu@mBRbB;~ViAtmA^#eT7j`iJwG7`I4Slzqbv*iRlIk%YiGf z9>lC|!YLesl#51xMTvpY%|Ho~7SRB42KjAvIt10TtE9dUBbordlJl%qVyM%Bc;f4? 
zo0P<$eV?%QlOw;HpDKOVv8Ut=mBTLtDB$B;(C~$oYVmOAn37yN9^D9$D-27)2D};| zqbA5V@gSCGvZq6^)D2!v_CuT?8{m=Tk=cv7~lztQnhv(QJT{fu`~>e=ssru(J?4C0H4!$a17G4LDvu}sjB*zwOK&D z@Rcu*D3u_gGst_;q!XrmmMDX(!LT@Fhy{0MCvY%5I7qze2a>D`(@n5^Lo39A?u4>a zM^T(I6s=7~C11wS*Y_xswHKSEpL=xws%(YdiaE z=ozD;_Z6ddSma3rQCY6t&gyWJub$2=?{<5K)nQ@ZwnI+JvfrWM72{Mv zoRO(HjCN-Wh=q3v_=fp-p3^v%@5O|z{KR-XloT$7ig|F--Kd%jT0rTrdXIeE`x-e?#gR~>_1 z>wtZj_d=Eu#jFQVGOoSfX;ZEg<5RyC z=!mum9a=Ei6*#SOv22WLVFM?q?DGtlEur*@rU5dVPR7RO-&@_XNO=-W81lIn2iNM^ zhi~yZW1q#1LBd|Y>PM&F_Hh!ho$$xKnb}ujw5K!?tqIe(%I2^p82i9Gdm3-0$Y0yV zao&wFDsLw;o+~6z5u@>?4JcL*xn4!{m!;3a*Kjv0+1B9*;hPT+Xhr;kEUszJetnTR zX#99F#cOh`-@~>M^<&IRqrmumlfpsZ?ei7@08r zc&)U0yxfW3DKg6C=et7b2HFa`?WTXQf9HZ)^qO=d{iJHGic!&oc;QEvTw+SqcO7r& z_Aq0mLZj{)#E0CYeRpYT(ulQ{%Cst4C}&ZWIpH+ zz?UKK-Jv>0JCk!oPj!cCZHn#vim;RBJ-+h3f(l9B<{5v2K)fIjgXp0Smf=!X^6XcF z#DT~9gny-fK-qBK5nJG=4n{41rD==S8dK^BoHt|0$RN=q#OLJvX!(~bC#t$T0YiQ; zT{DF-fUyLq_bZDMxpfF%lb9;m@5R`12Hlcw@v6Qg8J(s-WBu?pXEW^7@GATIg)Kd>IjA!;wb%}$^=U(mS!!~v&aF6ZdDZ^+FY9Ay(_F{IZkY#;MsVRWnF^QH0+J*AK0R`0L`4pdf}uEJ7a zotSb+7}kl8wF#Kdjl;$G8b;kc*cadhGP%g zP5FfFa&)-5P|DP9W-1(gR;luN#=a%=!}JA-`97+EkEw3N$Dy(8xKw!Q9UhnrX(!KA z-9N|Nq^r}tiBJi&eQUyv^bv%f=Lf$b%P->(0Vz1wezxzg0Z9QyfA@qYh& z&(i%GObF~uJu!B!2;AUx-qP{W^Y0JO1mxP>C%J8uNLLtqd4 zLX)EP?IE*L^6KMFK4KU$Lx!})wz&P%x}4xTC8`7H$2?Pph+{^x(dSsARx(Y* zi$M_hr4uE2>lxk*8)>vds0sbl>&qi0yqy@xO!(Drt-Uc?|Nk6anO`Tl~oG zjcH8SIpKj$LhpWzo1L(IO~qD?kPg)}d%6vIf{MmI+#8Ra*WAIi)o6`vgl&gwzdoqFnad5#>6+^*-lHjBdA zR#mQVK$GK1ZG>_4P;;W(oS3rfj|<|9p5WcS(erIU;UPb|p&CV7iujaT(wn=vQX_#c z_dKw#yTPLs@}xKiT!vxZUE04!RI;|q#5C7A(OlhI(>RmN>vkoWq3tS=KVbeQ!e~uP zd){FZZVMedr$rS0GCuz7ZZa~g{OYUn3bzM#95Y?w^5bbIBG}|X5to$UxcwmZVDdTK z7DFS*{7&Q8$tbBdt`O%ud7+8nIhzf+Sq0Q3Kyd8A!6j$SbEt1kbXz#y4JfbC2joEY-KaB=2ZRtQGadW~IEmtPE6! zm{86W|BV3PbU+~P+!lz~V~qF8ij|Fb!EXxiz4VN_tHewHZC{22{2_)cHBFza70R=K zBq@4z**B%eh+XwziCs0IkvTjhxh&&XraAcsE_2)|PIOib9eUt<82}e7pOO6=1I*3X z9K+gQZGi#M85n^N%+w zdLRBu`eokicDIMKo0eA+T0;H*_w3@ncYo4Gv5?Wei`rW27E&sKyp7B4eu; zVJ-Yl2gGfuK-kw&;<($J~|n-sV%T&Q_!vM6Te;Zs~}f5kvJ>_~8wz z4AAbee3}{|Ge3Bg!-#J&Kd(fO$2RCBh?DaVz7OqB9IiVl*QCxYZg>y41@0Xos8I!n z&3{%(Xzw7*{C{A+@A6zd21zJG^LBJv;bWp$=&5zf-gSO0^`@t4=h+^4wVn?=70XI0 zXl>6**azPv@1pj968B{DD4EIYPnv&3zfSn*3t~QiS&6GxxlQfisCNcscpX~&7V`?s zlqM$*ViRLjs!|y4F8%(eRWJ@+gM%|jtzY93sgu!jdsZb#o!^Hd-CJ9iJXiGtVq7(? 
zkQ2x^5H2xg86Ww*kV;cABb|@;{7&tl%-(hM9h(?QebN&W?Llt7z8-JvC{20=s~z2 zEO)E^O29q5dIS2xBr(gJ?zOn}jc-2@LbX+*ziA>n??;sKfaGsC!(I|G`<=u#^s2Kf zZtYiq!b`@@jhPVPl#^c%uc_S#bYX#){(tO!2UL^YmNr%tP%IP$A$$U&6cGjK#6}k* zAfU8}2uSY`TB0BzMIs=es6?7{BGPNl5RRHLsg(&e&6)Q;0^HZ&3WW}OggrZjnTsIA~cBVL51Vq}>J1%eIOZ7s5@ zElILvEqnR%?ubYz=f!hsr!_OI=^Z*rIqQX&R=I8QIkQ@HFZv5^z_aiS}e(CMV^$=X@`&a6vNchF+GBh` zB>8%?i_tk$Ge$T6B{C}KJDIhP8!}utHo{t&({#5tIC}Vm+|FL^)e{{DeNG87L)mR{ zMbPGP(I4ici1xeD4Njv#2I5_Q|F2PrdLhdR==4S(TqvORDi%LDPn zq%H3H{(Q~fLPa&@1jA{i| z>1xxA!RKT3ZI`xjwwhwCyRsGQCSI~#7YNf2`gPVfOK%R+Y^Y{nD$ZR00fyfR^LTPJ zHFNj8^OQS}LzKa*_~#+JQ5}!v&dl#S1%>peBC)R@BTV-T!ww(4-?|}vN zAXgEW7^m)41Vz*wAm3@I!yNXPO8wq^topU=l~?jUtH-njPfpG`us>wJ4*!6Hplm0@ zj2Oy;{<3&wLo;Uep8QWm*X{hv)1RDbURV0)y!+-v_8}%qfh7Ai=x`z7VgiX3f9Iue zTF}afOx0EH0!UR%?X!Ky>BAfcnKOhu_p`H+sZ`#htv{LMqvteNKZa zQpD1g>!iy1Qd?I@*tvnT@alufjMq`hhfx#k*`HOmk^yJzIcEALI`kn%&Fq}nDO2fm z>&v-cV6%~#qUrbcFK8-lO^dYp+YqN)pzKaf36Yy7A0)oNLZ7w1aELLJ^JbEhll#jH(#l zpzHUg(Uf`5pG+zIRXo-6+qPi(P?v@pQpD1aZ4q20X%PjGsf3x%_YjiB61oo*Cc(v_ ze^pZWAEu?2lE{R;Pkh)M6)@{OBi6b+San7!1zTC{<1<7=+E_DBD`W$bmd5tpzoA{o z&Q5aRSHpQoHN*jO>Rw$8DA_*bH%$#l46gpVC~k$~fYN=Zat=HB3>c)744*n2kb;G7&*L#gV{ zT#5V5JJ9P-u+LNIrQ%(bpG=1|FsqXX87|>H5!*w=MUdyjzXORTFU08XGc(NnQ1`;+ zW2*t>s{Xc+m5eEOM&BYGbzN|w_XIts;u{Fq7F((-2b7*NG=g^f0p(W?|O<~{K9 z>QqaAQB<{UHUL0W!Fw%@PHtC|)h7L@*HuayDDtVJH_QB`&c(N9OlMyhdJ4u>8sNqz^RZ zA5`u_q*RG9cuc0ybT*Lve_9Hs*(LnTHdx_59Rk~ETYmhC;Mk6qSUq}0f_Mwn(Sac! z`E|Z5DE^bmA@<$334qt0B?^SLXX(?arHwuD`I8V7O6Li5r5^TT;`)#%{WPSxi zB!=o<3G&#|e)o6S!L`=(z*S^d)QK7%Lmz75y`eOxWap98sqD?cbI$2$KbZzDfQH<~ zX!nIgvAJz}P7NDYM{Twy{>x9MyP}22K^${J`nTx-2@^`RdNJAs+*ZI9RsRYQuAZ-# zS@1kaF=1aKWaC!Ch{?#6l1vbv2j-PqpHLF&`s?B60`G_#y@Ge6kINdN(zvE~#|&iY zMM!@3E#s@%llPvYRxH&^LtVCss{x^^{uUxDSyOe4oCZozfH7E07xH9?tYGoK_Q*PJ zB}vhqrEk6P<{tcQjOr9lHinHl-+?Su35x|dbD++@)QjDoSm6#roy69!Hc-UpZM}K#kh&Cs)n+|z7%0R`pi!C2n7kT^sTap zWmo9^^Vrr&GWZj8lN;>bBLFZDw$t0W?uWY4AMtfnJi&&juN0?y6W>pgxo|Y6B&g#DUR*Hf zJ++`C5YAH?6+?enR#Z6Io55>yYYlIYrX7Yc&L)7Zn*h#b4Bq=*07iL{3Op7Bc&=3* zF+4!PIDTMf1lwc&&=&pYTe&yO0>HV;&SOJy_!18og)<*Y6o!(+VW1s1BiIB@V*zJC z|7*KUY)IEiaq2b^ToIWATx253F;A@YIXH7|G3eGF;AjoM{N*=Qe@UPSSNh3|_-e}as9jg|&sz-9~JZ2swP zC}LB6RA<%Ey1buEL3;p%ZnJ^vd~|~KCsP|hhq(W#M=h)RA4NINuT0fKR|NsY2{d!+ z8m()Cu7H#|DO{!SC?51C6>3Ti6SQP@()kdu%u?{an*m5^v|kaS270zLI7f3PFpg00 z%ISMka4N8s;_PkWdqKv;=uf7Xee5fRxj?%n*s*00j2YyAJu39Z6Kt0t)xmzeO40$k z^6T*@Mlwbif(!`T2s_nTg#4$TEU1d~gp8Ur@m0xMLsxR+PScD1O-36^1Q{dR4C$J} zqXckzOscfe{Be4*lGSL#_CCe|@K8B#LrX9(LGiNdeWBB`zpzN)62VQ)FCO6BEkQX- z@_Th}_K}c`h<2Js6$yY_CbMy-yf{f7}#%0IISOA658lAqVgI#brL+5rAWpw zKyz2xDyY*bW#9#E2H5S{%7mnE8qL|#3u{EYdW*o^(7_41d-e^=p(|X3VNoFmbOyO0 zwCF!}xK_r15&Q|cFGuw~J&{-ACzBlTVB2=Py%jiKa>G5+PPk;1NuH4I#L@w_2$nC+ zN=5gZyE_{FvlU`Z(|rbneXjoD7Z}yA?}7Uw8@+I{k&=fN#4VYZ=TFkbrYh)M^l(Qn zgi)h6xIhy>4j;gTbOokolOK7OPEc@(*m*%U23V$tTVRg*N3rfo^j`3Kp7bZ}wzN7_ zp=!q1`(zr13T!X^WU@dQtloEwOSMH)J5+RO!S@V56sL}GY*F#;!tH^V)ZQWSjfvZ3v@fQ2C`Ds(RjBMri1%7!6j@*~V zWp^wg6rS)rW3C9GYz{-jZU_NSUyL!0UGSCi{hSgm%JR=x@Bq0{yw3V(gDoYa8SfPn z5`o&7fX3?xCw;X|&}&}f_(mP5JM~biDF92oL%RoMuVPiAmI;(PP`hqY9g zfdH+n3t~F>dP%NOwCHsldTt1eOxSQa$M&SqgO#y*>aY8g1t}NE zsOfRB7T*I&dA8!(${ml?f=#cS-&r$TK_2sGBTwe~OAYzYtUo^}DRi3iU0}cq>C0m)^N5%6(Y(t#J{uvo+Ut%^C!_TLDk^N_M_y0b}$VgFoS$t4u zuhU3DoE`6jb2GG@Wuu3M_jc2E-)-g;58MfT!0+gf(m1ZV-z8FDeyCboLqj*l<$%pM zf97^}mAO(kU3In#ovGcGLRrV!-y1$2%KynEjyg}bOT8jL)*{qrAMZK(ee3*GDxZ1o z`IPi1P@p2B1#8!IuCrBB*h^I_yTc__WTa^Nvu;O*NViAM*B3#Dm^iCkR#sgGTAKV7 z-s*mD8^e8tUNK573IVxK>Oi7cigXHE<N5=FcM^b)rlZLLnUtxFZ?PL@%B$vO}1ySoS-6^G8x-gYbdD|wuvpb 
z$4ipW`#F+X?V#0p#_Y6U`TR%`(jzOADI3B8Bv?n)$)1E9rfD%MM%mj&AS7(F9XS{WDGVKjB!+Js>^WnL7yKNG)KV?Ozdjj@Em68wwduq+h_c ztqre)`#&Nsnpdte&pQiX2Y6_#jT9~U=Z`Czkd6@jn6}NtU%Fg4Ic1lBV8GqCp}!qQ zcJSvU%ncSTlq~X$N*Am^#Pzpgpcb*^0a483G8+TPYX6%=ozO-Ii#gejdZ_ORQA_Q% zK6cD?Qwd@qf`!*3?O6 zUCex;GjW3*?bzfIn}Tayn~WyL5vE?4eUZE?tS~e-AbZ)P96p;U=YR#9VW&P0IUOTjr?bX40B|o z3Rq7*M-liY1D%CNZ3*{L9*3vQO!h8rj-Pg2^$Oo*Z~V!$etBtfYX{1Rv_m(H%mO}) z%!*>8ul}xgqE(}j6#ppUVS~!>nJ5D8QRg~hdc74dyq%hnzexLlep~;@jAtojp=$&^ z0>|&r#Np44n=y{uZ!>Kp=*hOvVysH{e?fakR zi9!_ikm>c6e94Q>yiboVhp`@*{&Md75Iyen94_JVgk)Y|o9lyn`RO87GfA=VK={=F zx_^m`D+h_yiY{1(b3na-6GLd{6Rs7;S|<|{uG;NsV721S=Up&5j@elzP1Jm(=P`am z)w`42w(EP%DBdA0+x~LU3(-eyoUw1?x?ECf;MfJ{BbOnWR`>c7a8MuGHeA^GLsvwX zYtrbkl`lOIEEGhd{voagJke#17VlYYv&+9N%TRUQg|HfJ+S`l5?Ab_cLN9W0H!h3q zwqeIX=$0W_O2XAc+3a04=Wds{j+**LL|DvfMvsbDh87&GhMxHx6UQdwS!4=AH>EN5 zKbbUbpojalJb#;*37*9G%2TEGeIc9eaSVY>5BnooRRC z(6s8#Qiq>tWW31JJ?C(KRm^xX*j~gHHDW-lbA)Db)Dd#anu4J7eW}GqLb@eLYD#EZwnLq90Ba?0%pk#?mkxMcM|lDLvDcAW&6?*|&*(VLyg? zvc@|~2H}&VxV-zgx0cRlhet2TAL0A*qW!|AH{0}ZPD9RS5?gK8kvNOYZx1qaDAkVZ z<@mX{d!&;wBlDXF{szNg1yXMtGmW@NZ3Pj@vad;6J5mUO_~~RI19II*|6C9ZwL-tUE%6~7ncF0Nam*9 zHcCTnPp`<8;xjhr?ERA874A6)`;IaBFXm}AQe}#mP=clq8pqzjo9uy-&(29Z-ngu| zw|)%WWlPK9-ew^MROc5E;QPiki`NL@+ru!XqWx(fePj5JDsxbb`as~`sGbUf$T-%y zh?~#5au#ADMxXRs&9bChg#`>By~g+8+M4O95mI|u*Qk+KL_A$WZiEq59dN`}b+BFD zq8fJ&DXx3R>hYr}z)(Jh5D0ffX=7Ld-ci|%HKEPC%fWbwqKFV@GDmkNBp<~>P2J8f z*LHrYB4)#oF21I$J$?&H*_uHWKyu9RD+OS$Ij&aS64cfhvl#4DDp^-}CU)!^ zv_4z?U?I+Fl6Ukw;-(t$07z=2FO@xbKjMBeo!iC3($^UAUGQ*DDx4~eYOO)&FZzL_ z77mRdK*;fIHhhbOTD+oE!}*g5k&v;3?!t}AGPLtkK^35Q6y>~8Tih0W9P4K<%nX`oqNylL3cKGkK-< z%H7n`VE323T6G*YW6}EZ=QA!igv4#C1#qrFPk`7Rc-NCGcg@HNIvYhB=*`g^)}{Nm zy=u>+CUujBv0VA4M=fr0>KHbB(~`{gmd;c2m~EG_hBtM}`ni-}G^lrTVC`M^*8E!kx+4a^Dsr z+vJjA^~M=5Wxl?wM7*inzW>Iesfd^K8($TWwvfzz>U$ji4uK1;S4!{CbAFOIv*wqB z)i!4&g_bwyXKh8OJK=`1Bv%GLKA7~s)JwbeRb?oB=>iWFB}$Mzii;41)Z@;d{3V~e zl(8~S&^D0_O_U7YL8UJxK7D@?HTyEx;S`3<5r%HYiS~hJhBe2sBZNa$#(GQBe=>E> zkObkzUO&(aOSlgJEeqB^?K!9+kY}VwAl5FLJ2I6Mix>3svK(SEEL*2N7%8I4;(QY(Y@-KXCi+8vjkV;Of539(@+!A7;>o+@V;Z@e0=E6i^QZeUxD=`-X>;%S6xZVXL^_wAv%&)FvV~Oy~8Tu0O zr=h+le0`KLlODcGx;L~cFlynRo3cN$Y=!4rx>$t#ecVjjTDsL8#gJu9{u}TV=;yWi zyjynO7|tyOn_(F1i0x>z(vaK!FIcMR!CK!F9ng@PzNifaglWccg{ZL}S92$=b{hTo z>@0_{{>l9niu9Io6xT()F1&N(#WZTc6H1V;4cA)BH2kS26+*Q{Oib3}nDaLc{NafB zTZLiQ=i3cW-2T4#y^_5$oaTZJA;R{_S%KQWw(_}Pz(3=9-{0vB*sC^E%y zBE=zykQ28I;mM~t$r8UauG3@5>;x)`mnKMTF6Bsi+-}2H+>%*V!hi2A-zhm|r z?SId>k9xi}9oT6qRj`Q;^UJ*e)Vw z{jTz!@@Vz>Qs_&6_rE@#_d)iMwfbt4*m|lUk4Fi;j+Q}4sW}Do2h#USEA$}l)ZZTM zcUu}FrV;4lQK5BfAj7`bSF&4iQ zC$B=GtD@8mXoQSOcqSG&8xGPa;~|*+|742P zFh+vXA}dM(;sOL+c|I@ibjT8?7tm--V^=Bk;o7b!Wr+Ul9B?b|d@1N{;J0GHy5Ch1 zPCux%>%M&yY-96K-%mKxFO*_IX5O)D_bg`TWHE45kFoQe;EOTs?92Ope1@)zV>)P0 z{~qdH#=oWBxwv4djn&Y5$NIFl;?a-rbVddN0ESX+Zu}`pMi)Fa|G}j3OH}CVvSg=08P+(#)u-g0Qa^M8;r&g!=-Pvt_N#q!54~O{%x$u1PVvyPu}M=$S-)~2N0=hQ_n79vW`QVjZ zFy1iK;UxBEN=764^itB*L0!R$0imOMNPxDqP*LYrPJ#;mKd<}_oB@(IJC0B<^SX-W4Z>o!#&OTNYDw@1gr zR_|p{KhE7lHBoqj} zkV8?FtLxHYcoa(tEP2Bp?`ypRO^ABu%~wVfi>S*NXe4ABHarrV$yy(-{;`%$6?Rk9^#!M!{in6(MNKS?#NL=1tAU1z0 zxKGpUgQ%Cw9r1p`ww(t0OzPY{{)P_9&|Fk8eg)OweWRt(*s(Lgt?i7eCu{92>}GMT zoJuho?t>wEnY>fIQ+1J|b+K!0n6`=*WPUr;pm=?87}FgWl{_{1VuJ6=u(l0>k86|D z!J+IK-Qe}iV{g67is7^-dN#`D2o#^)G>zSNYw@F*=B()`7ZB;kta6CVtz$BTRvg&y zPftS%?vxYkVG%t~j7>W|-cjvCaiKyjCFfhCZhNo^jU-o}irU)gTG8EhCf`L$Pz|r{ z8j`zq7=2t_EnZ?IHVHP^5z+y1PL`_vCAo2LJ+A@OqP8gv6IrpRE~ zQgnfw?(0$IT(`;pTd5ZphEmCfM_K;0xx)vMky130xjNvX ztknT{E+wP>YZkGP^abEo4iSH4+xGtd`~kX5mVR(SyyNDKyK8o5Emiz3cH8lIuI^NSRpc=gXjECm;7#*Cv)1dylmFrf(&gu~~^*i-S&^ 
z8<``8sV&tBIr>+p&1xVZ(lejl#!khb0$`qjXp=(81bpRn@O>v?6m{l6uIPML`GC$U z4Y;D*Z~w3bB$vIFNcNe*Tkgkv%*yFPirR_XCI?{0vv^Yq<3gRTeR}n-U>8n(Oy1ps zy^vKo6KG_AlMPlH`1bTBHiC)ip^-Y_n;rc4vQBoV+w6_4E|~AUeNUuAE+Wq6n;6KF z-1}9J{Pz>!Ib!v<3i$Hh2QAowstbec-?4 zV(Z6)m1%b7YAmT)B71R{Q1&SFIaCosq#^d3vyX-M#@DO64j&&ry)Lqe{e-gx$wzpQ z(uJ&rEN`ZM>k>&09f1eDKqeO{YHwWKS)C2)Kt4EOT0*$;-bjZf;6YP(o>g1quvuD+ zc9@E-kno!wp)IhUvBCvOPGdOga_x^bxu#XQT?jXu^l|Ol2gyUgfH1<6iUR5ChYaI7 z@%UKNG=1$8T*pI{V#Dd(WOd>YjBJ6|pefo2-Z`O$k{Yu2|4t~*(`#!!WvA5nWAg~N zR!NU^j+tn$6vx$qT)Uj^13$#{!o{9%{T(??yq zZwbI*>?^0K%9CV414o!f)TvyHpbn1+pVlsFHHwFxl5g2m!&N|X3iC`g#~UtQ<*pTC z>HToW(9$?O2KxTub>3%&d}v+xZ5k_?l#k$kP}nDlb>i1FU zj47Qv8xMLvmY0{rPd_Sn@m#R~I9%k|*I8KGf=j^l{1qXz<8XdG4onqnuxO*nqgxstqs5WKDdijI0#YTSvvq zM=y+}t(`Yyv~tnZ;6hnaI#1h^<;;}uumANzz5f}7Uou1sO9oD#D;D@8oTl${POz** zq}U_D16g{k>IrksaVUM`p62L_{iir6_Lo17wJ;euT*yq4ak!N&HDGT420y(fo359% z^=>xe+44lDp@GGn&T4)FBoou2sx>2*@cTGt4g3%JcUJk zE~SKDc=aE8lK;rd{6F}cOx%5o5cO)k_&S&rUq<(4&gQ+g!>jgpijs+ido>=vpiT4B zvqcxa?ti#0`o2U8Cq>KH%(ul)-t0|(Ba+q;F8V+qMsOoa+3%h~=Rm3Qp;j3WaQnZn zcmL1GJ2R(q@W1XpK4dgi@WJcK0-BcdIv-cJPvN^tG;T9*(O;wCBu(CzzX2>}GX1;% z6|8Aghau|TL_gy}Yz;Z8Gp?0ktCk6CqlI-z5~QKJaYZ$vb+mgPTs_aEqkH65@7J^U zFLIO{?4RP&#psEp9s+AZ-nWLJsEH=L7(%?W2Z+Jxn>)jgjj_{S7h~HusPg{4kPX-t zj$|`9$mC^*vP_*wj|&O7j)h-( zZ>D(8)#A}Ekk|)F`&wmzl<@QWe)jzlkn`*R^)lzHKBz58u4+xMpUX1b37bA{pK#4X z*n~m3Yk#nY^1M8&X-8vsDZKZ|`IPXKf2J4x?|v-yrf=?yo1kpDzcwHaB(~+5{n;k= z2Mct60Q<9eJkWf9{@)1>@z)>{|Ezud?Nk-OHG4>$r+a`P)GbYTw&(7Hf4`3m4M(l@ z10DlT9-a@l4wu!U~7G41m8uD%B?k zP~&@RKBoy>5^WIJ;v9a~+{JES5PDOg|MhfRR}+ia0%~L*UJSu*rm*_*s5GB$>$TJf zAGE~rP5oAw0~+7DOTd}bKT0-E%sTRR0}OcDaY@BQ6d_=~0eVtmnkzx8YLxGXV1-Sn zE`768J@e~YG$p|TWt08jdY0{mL{nayc(g_+NU{k8t%{NCYI5lxkWzr1e8ms)jHaz3 z{jTLFlWJTSSD!pJEh)O3##z3g36XDnD_e{%O{<>8CH`cR_I~JUij6UWe0;5|iuCdW zsIGq=X7C?87IO%)7rL%4)3uWa-Tb6LtH2IGMo#o)S-#%1J zReHIShvt%gBg#0;c6(%>Uw1d(S$3KTt&ExLso&TTo0tsll>I}x&+cCkH4W+3;@VQJ zN-=)DMvEND?VS8Ap(5VyTPE8)Q;Le`4aR>h3C3kTcC2SkUaE(Rn4DW-mvK zdsJxi(%?k;g$r6&hp6o(*bNoTkw)___8)m`!jD0IwQR;Z-XzT2XRK8y^1<6EZ7F%7 z*#1~uz{>BAUY}oX`2jFSLgvm?7LYqwD9f1x8j5`kE!j}q4-V+2fAP9e2=Xj>r!Ico z7@^-NJEBwdt(s6$hM2agB}P zxP^lEzu(g$vm5oG)-%(l_==6LT>IORP71bcCufe7bljrf(~!odKq=gQME}@Q&u1RM z+8CngFO^+U^zQOEdnoI?=$5G3{&0DLy_eP4+dPcMZRC~NkUR)ettu<}xvHKSyz7-z z89nxUs2c?^Jo}3zUQ5;yM9|ItyEzZjh+5TkOP7_|1t>hbJ9_k&$b)v=FTJ@wcs|bc zKJ~G}71GT-ac!G<5gwz$OG=3FhqyS=#B$9RFUPR*+9nBnXAt%Fv0Fx)7jUnrl9oH}phYAJP|cmmGA&Kr&aaau?b`_at)H z!$aY$66tVe+vB!tX9@Np=QPLFq0(Qt9IJ_k;C8#v?Ra{&hnb>A6_L9^aki!(tD9WW zFW}JR$m`O2GKxp-H1BIX^Ucat+8zQX0(ydGf4qdG9_EZ)={m^$HvZ(+OPA1ftP<8z zfSBCCQBa=eY%h~nY{zh1qD$}Q9d)odHi7xl&=QA&yMSdj^}+`8^5hR6l949Krr{lI z-?(YEebO1j=k<*TU+Q?W}KR)c9Dwyt=+~usWq{JD$93 z=_Fx63LX_W)sb=m!=Of=#+@0$@6HD!aM*UfxJSzGVx7Gbo&972AkNWSh&uuqKMcfg zNjj_VVjeSkCl4S`(qzfMd3)jmjlKQ<$s;j5|ys$Z;@}dG@&cZ!C#p==^`FFY0AW!V)&U9@PWF%<|s++LUdF;C-ir72@+netvseB+zvg^!D zE`A9E>nPTvk&%iyS)vVP=WzBl$7t?ESo4vyVlVk;B|NQ6#}<3bHV76w;ZaIo>3nKZ zL(Bq5&U@QKMpuoFj}NyIVaLZ})SApRRu9eaY?ci@CGK{do!J#tw{_02uE4ghU25C# z!N1cqmrSNUo9-H}#IrZ*+|aQ!?YX58+2WCKHAbc?=45}FenA+6q!+L(SP%b#se>JY zw`7|vX zy<)N*MaJjdAexiwxAd+D&2W^+ZC}bhPBxnVOq_g?%AV5#vGX%Y^z$mlv3}n+_X%|V z(6dv0oH9!?nT9_@9$UO@h6p4H^|PFG%xJ6~yNwE+oquGLQ^K?o`xM266hr{hIQavC z3doA&;0;&Aq>$a6uaC@62z3!NX`PxgiPQP{`Ph7+%IdePBCl6{gdVN7^cS?A`-sI& z)i5XPn>pC7=A|f4vA=&f7emO|>(!WfG z<|*5l3@WK+w&yJG^exHuO?M;_)Zz_mYulxSrr3@m&KOz+aH%FJonwjOy0dy*i}3>A zG-M#!ma??!>s;r&E6xw5`JX4ij4AyvybN*-HFJX`LJyLJZfXy0kR07zZCp9W(*RIM zp~_8=)5BBV(C|zkyFgUjLL#~u7Qs{V;bLqS8F#O|@B^FMtsS?x=u;1KXp@%&1KG{D zh&0z)&q7l(HdTdiQQisqTJo~hgDRKJ`NL{EN`oL)+K1jRhN@2-CU%8;IG;E~>q|Eu 
z9sj}1TNTE2+h)>9Y;SmHo?Pd}6T=Jl9`{v1QA(_&E43I7>0}gMLXUUTMN=l@3*!cAR9^KCxuZRT_fte6Yf!eLJ>4E;w3Kxs{U~WaHX33h~kS7v`(`%7x2gOj1sHJS@DD zDOkYTqHAW;=p4>CNH$6<$s(+%3OJ@UI6PPGnExDg$<1flwLWUbXnHVg9#<24b3C@iNX$v2I+<=3Cel)n>dvINT)l3 z@cN#my{Q)c(w{+@+Tm}v`pEXpHJ6C?T_Vr7a=rJZNKVB*xyqj{j#qq`TW_%Rs%;GJJm@;KPo_hEiWcr;Z3a7_r*%hmz%j>pi>-?* zXE5$w9%@~|`2gzc?8l!>Lv$?Tr7`2V^-m^!5o*S+(Z=g2Qbj8-r}FnAq!NKPQ?InUNO`}A6~^HGz$PjLk5RyPf^!0h>+U+w&_+89%&#Xu0A|j1`vcuyhz1k)|G8-!Sp~su=ZX@2fqs-n?P#+mz zA^SYU`fCEtIqIa!->WL~5o8|cw)?#OP(2fCLTYJ*?eQ1x=At$ZeO@f-ZYh}B}=Wxlw7~Kj}0RH=qcl%kRE#(_H_T2nZ9vVz|H$r ziIgQ+wfMU97ukU~G4?!2jybuuf+W|-`S68zDEGJ+>RZ^0C%qB(?YvOJyZp&uh-DO# z5}Gd~BJSR_x7xdV)$m=Z__3fQ*o_}PUM(W~K61{0T2>_wU8FzvTZSg{s)@Ej#Ef&Z zH9MVFFowtuc+n~H&77`e#H&`x5E&!GSnCIQHhBM8^K^~cXmuebHH|vDZ8+)>U@45A zGVg2T;ZF%RIUo{fu*$9*z2L*#J+v`k2Ffa05hj_(cyTPt+`hNd0-Hx{UR;G%aXfAK zK4UaDW`%z#-ds9#X+ZgddG&`9Upq`96`Qp>>$918bNHT77-c*Q*^JvK!`fUKai`*o zpGr`SSQ91;Df40! zk=j@~v^^ih+R=WP(p+|P*QmXp*r(Fi^4(HuC}U(jVQK~SPH{SEYO%)mUh$>|7%~6e z879O0msnRlO(LYth0$=dbfB~nHQ1gpiTcoSh2>4yM%PRr4&-(%yTU7A5h`a!5d2HC z`ldue(K#*w>5}9^ju(o-`vonNnJ`sXb`Kl=a9%5_{91hE@|?DI<|%2n`Z@XIp*M2V zjs?`^%C!eym?J}Tv`oL;A^8DTe5|Bc2R~1Sf{__)9R6Z*AE;nxP!>pSRxrZ&IV2gI zHVHOd?3lN|e7tO5_a(X3=F9f8xDk^nY&|OEC)56mt5uT8b(Y8kuN=7J`Mp!uQdL`~DVQUk_e6A8zxLsb)saxkYBaMw}cpjpoyeAvc&- z4pgORiLV!`J>NQy@pS)j02$Db#5XT@TNUCMN$=qC&uZ%&8R-OW@W-@!u7rkmd_vXW zwz|NtsV6Qqo8hwD_xnyamBjum$Hu&2_@3^}T=iPsuAlFj2NxaBNJ1}HuaHfrmHra^ zWoT_J-XgTFxuheEL~ETX(EWjM^Y&I5qX*$;a+cxS``>{AagNJNlB4bnjSPC@3TpGT z*t&ZILzu)KEVCyEVX0Tvi@ouBBhTsy#?n$$>lKh_Qf`wGW@Kj0dW@Cg~t;6N!8dYJS>RU*1aPHcH6MWZ>lbeIv3TUll5qx2Z)ASyB*B!Wl!HrG*w2=-D-Jn!xR)0!ZaPe zkuA3AV9xcyS1;92HZm@8^({d^>%6Ac#cz;mpzZ;zlb0|Uo|q847W06`+E1P(;?3cy z+~}G2HcuNw{Z5+O6V`?Wr!pd(ANSvhyNGs`5}&`5(5fcFczX#k!~m)qr*mak>7uKO z^ow&Qif^Ly?wh4X!wN6#H~M0WDN1NjJ|`5?*UTN2c}eLf(?vXv>#ZrKafl$}f4=46 zvQ|DUVg{D#=>N_L_YGpie>PQ*MJ~~sQMC{#%9}=(Gh!?SNqwHHUvmuGPR0)#D0~yxgAZO&q@{UigdmU zg$tyY)7Z40BROY$ONw)shrkGKiYx1PG6|CCz)FWQ_Rl1i;-efRZLQzTO%DS(2Swp! zYsr-#G7>JE4h&VXgedK>$H87|olQpFCu_0aK?MOnbE3^lB=aKwNwDTffCXyXQxD(J zP+z+rNV@74bx2^ES#IeNNKwJqE%7(^I75G5qS0y-RuSdqmRojFUKMsNKh(pZmd=4H0_8EQ6K!j1K6f+<%0`}2vbDa5 z(wX*+RNz48Dpaq>gzzNVCkjWFCZ@@+=VKplq(ulaKYE&(!Mt{$s~UNK5mkg~Q9dga z_{Z0v(la|&O6U`dB`2omGrHkC{_0MH6&Vq0k#`PRJ!|^>&^Ksi|JQ9LqqzYkWrD&h zgl`$O&ll>eGrBs|J70Qc{s4CukIGI8RO2U;EpCMUIvLbIxmnM1fM5oyafIbHx?XZF zNDDe5lol{Jf>c$S-yeAtb+#y?ELX-}Y%#&#tFJ{+GCOsz`gHimj?SjBcx!p!gQQ33 zlF!Q&*MXyL}IvG@2KDoVOfW4OTyF>VcrK4!ixN&YeWPz zKYyOLXHS&nI|^z8xSBJgsK;oWRplEiqxuE6=i5RSwJiHCH?N?{Tmj4L=mv%u3Hp5X z&I;~Q9S)oOJ^VMTx6VtYd+KdF+m{DkB zulB`8_^$9SpPVpt!8diY4J_4W-B*71(7$)e|Nr-YdMY+l^&7~MYTzbN^X50rL9AnU zuR3GnCzHZ@A#Mt*U%H3%K^aUR&-sFM%bx|s_wi_S2Ya!Yd=xoN(O+I~VX)Y^mJH#? 
zu{)Cn?m$wD3vhp|v26=75^6&XWvBE|$e0&=#=bW>VWUp_&1_cKY+No%f77TPQxG)% z4G4zcn4{Pv`P7X4SeNpGE*1v2%0oR^C{bmiH{0CGMCB7(C%jV)NsVx_UkgL4Xkj2dX z3&(W*uYz~~qDTI>Z38ca?3X@GlObXn6m;5NzjfT-nUP)GaQcPHRLv2jAEOI;09zx7 z2Xg>?*MDE+tNi*Q7njnREY{m6H$tE zA|hS75PDRkiS*tQ5e2Cc5KvmANhkCUk={h4_ufk&p@cw+_w$^Yxie?xpL6b=xp(e7 z_dE|Fe0)2}&Ms^3wf4K-clg`AZzz@1n)zL+QM5^rSm0o*?gusn0ox7pCm z#=SwjIwT1c`bxasZ|J*I$Ttu60u7NDk6exh)wimZw>!S%O|JXGXnHOdzyY2N%Ep35 z-Dk%v;%gLT1AEsRv4 zF|(J)!^1Vg^*XCAhddaci=48qOVU+|W2xyp4#7hS*X-tY=u-3z%9T>FZEg&L48sat zD~&($Dh)pvO|;oLhDEpLz2^INLir3;zQJlY#7pkIf|4-+Ezr~XM=Z>rpAG68%LeBq zmUV+XTnqkG0`Lmg;LYl)wF6|?e~gL11Z9#m3)uc2?|Sa2Q`1Z0Dg5fRpG zhK0}L&oC;i^hn8sN^T2c0z@qa#JyWU&h^Eef~XGxs}6CwE!Xo^VkBbl|qGm?dOE|gKB>XxQlrtqjK zlZASxxk&c*389&!oJmFQ!RCJMQstul?QKz$!QAfe_PY8id&v(xnm8b%@UmIV#awTY zp{6Lb(Zz}{?5u$f$Hi}!f$L`Ij=sQJL<^W+^V*k~m0o(VcZtK}$^c1J@(J(EOEqd- zQZ#9Lf>ml|UR6&Bw#xej>B7^>kV*6;!l$GpHa+(#*rDFkZHdObMa?wg;;SVrWEyR!4bUL2`xnw5Uc zPM1^F%T}_`*82n?1!5u3OT1RJKl*wVvdl-_c*}CQqc>9AQj{(9s;S|+634yPyFr}f z4HLe?fs)m3&AMU$)%f3fKm70ni)>hyP#Cxu)EyvWupwY|>QF4(P{%7=>gZ++R|rea zW`+REvtWfh2BV<8hb~+8@!p=5a_W-%`>Da^jH@aU7?&W;BDaSBgY01dTiA_tO{#8; z?Ae-;7xEu~EcTB0>e_A6$(o@e(7X&s_JX6C1qqNnRIxN@bbEOHPIc<(J(~#LPkWD<-bZVBlLX9WVJgwixo%^IgWLxc$FUJ+p zHBE1lLQ;U@X$WgiZzswgpmC7XO6CMj8br!Wtaca-Rudflzgfjui~VnbSicqP#k%Sw%mzK zHzK0fhYK}|)k42)Wavkrx3JJ-_Q<{L~To64J~ zHy)Le1M4k;@O&sD?A3Mfem5&Xo$<52wlIKrim(fWrmYFFn#jRkV<7+er=j{$#e%Zx ztndhATKLC^47wQSiRNAM2PygOM4+v~k(}q$9G&Xr5An`#G~0sCf1Hqx$t8ZH(K9~n zO@E4m?YQ8KU2vP?#Irnp)Gg$a&q49tSB##JFMz@|Z@=+Z>;!c*r;5>{-j7x)sxIz* zDu|+RP`&x-np&h{W7KI@5%Kags%u?>VR)x;d9hf0AU3fx?X`W&8`19Z?+T=|ov2A^ zQhrpuYN~d4Fq@dq4wK-_WMyI5R=_3w=mKDdbx%xM9<}@iGQHRBphMJaToAY-HnhWJ zpiFhrhKfr1gxNb-+Cx-%HtV5U>ZwKRpSn<8N*}z;e73zuQ_;L(7+xwCDgVr==H{MN zqC?vFldC#u1|MVGhNz^YF*9=HcXRzD@qhw?VynLSj8XGmxc#Rm=ia_FO0T^;(ZTV8 zKJ}9HXxAcvDA4kpl4Vkn?VcLd)wn$ZoDF1WsPMKv$;Mb-qx-P|Xr8q0o%}<)^Y5qD z(OHp^aDX1ijZzDod8KoshNVdb|N38`Ht^vFd=8q7$ms&VCBq*9;}B-;cjB*f&|kZM z1{5kQljU8eSck<|olX3Kqm|s23DzI~I2SY)M#>_D&Rv7fkA~X(6n|h&)8{0~Vm!~g z?6NZ+Vi~;yEgD{dGs&~9qZwqyZ^gM(vJ*skQkpp0?nrYI*)!m!d+2geMt=>FIrLkR# z3=-tO5f|8bJxYNl4b_hh1hLKIrenPATU`ra*DJJG=FV3S1sni;40MeR6zt*+*!lOn z^2#_y!xV@USaIyz!p^pa+d3Q2r(*lf|C@dKzdIKG=yexT{8TD&XF_spA;=B!^F9t! zhvx<2xcwpk3{G%GCe)U#VI?$|H^IYFu;nJ803Slb`Q?D78Z}`;*ulpUFssVNW_4yDFD!ddK)RIhLF?&>ekdh%|jE7@Ai3l!<=9(rgnT$Cg z_YEB&Z}k6jm{o8aHO}6AWpe!Gj6|(w`p#9)*270KSzKe4{RjF`0n&YfSE0Wej>F_u zeAab6&W*W-dxf4k( zt;rO|;$Kv{U&-rEOI0K|*F+b#uxFwnx&&B`=2Zs&vyu(|8@-Pv6{N@;Smj!y!Lz^# zhKfbm++Yd|of`WwBmp3%qFM2L3KPu-_IgW*r++yxNq{`bn@?trBP41&&o}SNo_ONuB97DOMawsGOxOATDVEik{Mj;beKM?pix+lC_k|))c)&vQK)#P}&q!FT@Quqdk7OcpK{1JjzQRWQX&L zKhzGKWqwYaVL*eoTTPFw(N=3&2P}uoLzYt{li0b#He1T!TMvB-Cb^LGaae_5jJpp; z&tEf#{T3Hdf{p2FY~0%x&M8`5;3CQ5NbpTr(OlF>w+ld(Y+#L5iDcMWV|zn(umTZ< zU<*KD!Rp5JI%KfJJeV%RKAmLP`(eLR|lMk_}O_ia#GGOs=fmB zVGH~nY^Psk$iHPn-e&&npg7mS0Q7eeb+JO7{jyI$x(WaWp(RhJBXW*HhWz`Qbu;WQ z9=@^xe5C`@@8x7YNaf+p(Uo4}e)~9qO5-R!K_Qf#;esJ zNGLohtPZp_jH}hIn@<3lCRtHjM>pFTDOo__kABFQXi5*79l2wB9z-0CM-S*jEXf}! 
zqUf1HA)rdW+n}{}G$becfRASA3k=dh~KmHd`2ekh)>_{n6G&4AOmxws8w-}F z0Mr)q0E=2b)Tdu%*bYyZH@M}`{)_5@A|3gXCa^H#0GtZrAn`H^OZbi`0@lK5ro$}8 z07K-{<3jn4ZwMm27I>J@rakfvK`wfkj4ZiR03P-(Is>#-f5MTf{XS=ye7<%u@B+)d zJZ-=TaFMk`Zzqx0GNH?fX}nvei-0tl*`O(TQ3OyG>xIPtljFmdaNYw_FMO#^RY_l)6|u_KlzfMWQXEE>2d4C#FwkaI@bU8t3*_0Y!(%h##+$sV ztytVUU(@lbfn|6O({Yv(PzVZig!*hLyxm~^*{fL%V2b4CnBolx1JZUn5(C?yjp5@a zC6M0h0Ck$BE^<6fJSy?F%DR^mprs3F!p@2vFas(M^Xxe>ERuOc36jF#U__;qfo3IV%*P z&Xi3z?zHGcrqY2=E>{6GU#O8pe)6B%A(jhoOc;Z&oMQoamh0dYDJ@Vbx*e$dUE2_H zLle-3qnkXy8&e;Dr0bd|3Xtr;d*|N5C#OIp16f!%oVAbCBGf@{X!(^-{J^;JPO9kQ$e=|9FSqHfuNx6yE11y{~ z9Wb0~K&3Qo8N>U))Nr~8$ZTyW5IIRkayElBR8rNZI6_nd;b)vE_n_HNbPptHhUQ=|5MzA{(VIEz)Rak}^E>x(L>7|v zPM+-|Wzr13D+l`w*y%YCFh;+CfrZihz~MT4(iB0t2h3CSf<%C&ItVKTXm$0%;`+7! zT+1;~iB~J<@d%KY9-s}>Dt8etgj)@8(1Vvl^pY&xufI2d^`5{#WlcfT zKwY5E`n4pD?gT|$J)fr4qFpjjkTzJ}&je_@5mN2@WY1;p#A{kSYxPn6MKuHVrcfhV zN6i-^^@+mj(LWdL(4MI^a@uoXlL6Z9Rdbfzj?byUT@9i(B>s_9nDBIVzb~?NrI^Wh z?B)ePtz9No6JQ%iw3)2Xb-VuE5D7H4A3^1SBp_iUU7zWj8vV4^I_dO>O*T@;Fyb~` zdLTV)YX1tbc@KQ|2jmBwf(u8oy|6jY2jRfP3S0Sj^=U@#LWRe=?DR_@oUooxM_gJw zHV6;_%jlJd6L-Im#vpNWFZ#Y_Z4X+6C@gYZGVn3?#Gl%q% zb~R4`?E{eKZp#Dv|No2w|3qk&vt{1ek~@(LNMuz;YuN9t;ztKe_Q$7RpAG);m||U# zfn!;aKLgXb^s#(Mig9B{R_AYUJ5CmE$bMi_h>eDpL(33t=rd~bLc9tR=ylI{=l1oI z>Syhu(1?S&3_^A*j5FcK&z;o%A0m*S?m4-aUN}5Nan+DHCoN2;J60Y{#uq~7EItz- z`J0sppcXV@UM~e!kpw6F1&J> z#(py??Qpu40^{a+QkMDD2DCHU&&ffUoCnjl-g^B>p|dsOH4flDEl9Z^xo9YC?Z?A} zut%*k+oG73c%5lDpRXd@E3O!2vPQPhJ-?`s#`ACXIOK!>GeU^X9GrGTPb~7s2V{?JPE_i9Cl<=(U|_<3$}#d#%hmNPs_Q)Q=s;%tK(P4K z=mm4!Aa9XiK5i$f0{Wh9Z*wB0iDyjyA6^&Ve{f_B~V*E4sC1ibF=Z3!k$B3Q9~ht^X8m{|DL? zfq0cmnQyXX*^YVV=`iOPP$A5f$L{AH-B*&n$1OpuS@&D$mZ8%9Be81RX15l$5i@{p z#4HDi2PZIcUPG|yPVUlCX?spL)lS4{ZBR6TO7RNv_X7yxUxFvtPLtvtmD}$v^0>uF zp~g_J!R2Q{al*=%OX6knp?8=?KkDt2zsR}ZDr)9oJNd-d{hok{Twm_mm&?AoM%NVn zhEOaG-68;^T>C|Jxt#Z>c+4hsZOy$8?1I}0s5W#IlDb|AI|!+6j2VvAtf}#+^744z zDRn9k1Fef&MbkHeY~h#PrQ9(VKV)*RNXZ7>a*pbLAZu8j=UPTP^rWA(?Xf6{ss!86 zO{hapTePKlryDu!=lj70n8&u3%f3m12kPw?RhkOD=B?|+h~Hu}qkSglE_8lLeH+Iy z8vOJMOO!(A@u;P`>KN26JQAE&s0&4=tp3#&U4x%uMs>67)5 z*^#(l<59UZ^UT0Zff~P&tV#9%f}!72Fv}w8Z7v)Dt`jOrh31+vtYd7Gt9Q} z+4@X{ZBB}~Q199T&&8-$Z!TSAsuQnwqxJ7)i3KP%JmGS6ZmTfrW43c47@)+Dls=PfPAZ+74~?tp%fLVA6o91=CKO z6TSCKK2qJ2YSD?;l0>C+@eT*kZTxqPlbh#7Hi z51I-%SJ`i*usf}$M#tNOj-TiXiWvZPgA-r{XfK7{y6;?)tdJ(tBVfsJ5`t_r+Dy6J zy7r`ns0y&*JSHsw0r2A-(mr2$ZQn_6!_{_*Xum&~UG%L&FUHd$SgvAV;Zrgz!867A z_Gh1al2od`)6aBPN}Q<JiHK}q5u)8b%{UiS3600}D_hiv1?b-eETLJV-Cq0p~OWk5Z4 zB8-$wu95`?^f|iLO{a6FYPF*+_vcxtJ0=_-#3&MYJ+VL$L@pnzGDivr{E1?7HSbt_KJ+@Vgboi% zSv-mBTy|DDgFer%AmGK{bgd2kzy*CqQ7VmAE1N64))PJNafG}UXN@>7amRFu>`E@L zx0&Chdwa;q;5UXe9CmbQ_3WPOzQTVh@`jEp@iMK4dt#|EHYj+op^U`^c9NtE%{fiuPw8-PN^iD)ch zJ~bAb^bIdhKX{M31R=8FeAn@g8J&nC2FQ5kqppB?EJ&~cYUT4Efq2eUI9<%9@b2F3 zWBl%8)TX$w@H;2UBltx?%8+JWIrTP)=WBu2r8MjMgw@)lYhIl`RH>dtjGj!>o6_5% z^F7SNvI;q->y<9#^&%^y3=o$0IwXqZN0`kdUBy;Ss%OpAz*uh!v0%T;@5fk1v%Rok z;htAu<%Wq9W;yn!NR0(E- z90{D9m9WPCdO$7rLDkR-YuoLcPjcILsvf~wW?{ons{O;=72FHg`!2*J`_CaO(e%e7 z+kt?0!iW>^HeQz(D%;B<=1ckMb^P?%NKS1jgfZztGl#G$cbv=ZHRUOrP%i;e-$fS# zzfC451_hK^d>WYVM8mGniib!_O?b7AQ1~sjq`T9gI^aQ&&7b<^6%*Tu$sllSw zk&18Ibvg!JVh*57aQQ}V>`cY{MbE<)i<=epaw^&%I>a=Gja<)iy*YdRjU@0>AWeQu zu7q8gu;*k!EPRc-G_9esJKrz?9+{AFFmy7E9CIU&NX;vQZ*Lw$f~pf)i_S>qw|VXDTa+}J{RJS~Ip9tG%dPxBzYbs}0w)1oP2T#Oe!r+P^+z%9V7!zr z_^z`B<1t{#))^0o&!9bxK!Ce*B!TP)J$}*$gfR|%;Gq;%NsrNCiJ{qmWf+{V~SLGw;-jLp6hB}R7Jkc1EgJ{ zkYWMPh*Gd0Z1DD88N(5e0BiI6lbWBe8_k89Pk;8hVjM+9&GB?~5)H?p@ehAe*!<;& z{5{g_4<%GI?tZ>r6DjWz|0I0bz5VPRy6XUt-J*m@4Kz#a0-ShO1%Qw!`PpDB0bv93 
zLf_#lKu4mKjpEZw=o8Iu?Kw?5l8w2+X)0*EJDYV}hPgX#e(q^NKBB>T=MWkCQCA7B z2)g>u`{i%>+fR6l_*=K41$i(4C)4x5az~eK{?)(cDEz;u(7*=gBxxx?S&hWIc?|4l z1ouS8{&ZyeA7nq5rCCHajmh6!hfYjL--cZ;R$S(!s*Z45zIU+wc3+nBlFCX}Z$(>N zYh{8Zn*=X8yn{{wc?MBMqV8>u>HVTvK42AfL!jb2P;WFm(sGo;}7p|b4qdX7_(|AoN~X3R+gvr_bkC}(R=%~I>tBNh;*i- zO?akqJ0TU2VD&oIQCfR@Ry(f2O<;6&@t#N2|R}keFDLt z;+PLyO3Z@Ev4pqh|-{=({%O8vlfcLv1JhS z5rF}3hKp~w?FGB@EX~MS-Reb+%NnDVzI=fzxYmB(B5cG$L*Zz=oQ2Ao`DFe*vh;-uv>O0YSfbFx0@L;jcG3&tYHkF{6wEgW6Z zE_>2gDC>AXASp#t`j-wCj33^lyj{N&Su0*FtMJ8imAq0@!@J++e*)?Num#?s3=$uj zDJGK>9+x84KMcUP4sb#N%Ji@e-e}WBn|(CaaC8eyBYVzT`2yRePazwXB|#QRX6QFU z@=bFMCp%b(@<@A5@KmVEeo=a(wIil%B&tb?@kzu#0%`YVIqe#)>P<2&S3;KHhuVq+ z(5i(gRM~t|$FZPF1?mrQQ+8KmPJN z{r&pi5v}$0efO~!5piP(Fx23{6wz-N)IcKSZbcA-`wlI-%e+> z4F@eBWHWi1OS;KiYluvi4#+fgzQC;!xU&Ig==I(S7k9#FYR2FCe)Z_lJ}&{W0DmC( zv);4-Zf)1jV#>@cWzeC6Zk_$|fI4dXbp?unsgk@}NZ4oDI~ zQI*m^^me!CX+=v_n$tXJ6pOM=KcDk#9{(ncukg~BKQ`egFWy*tG4zzOQ>S%cra#$( zH^ohCcGlDB`$|QXlQ75I~l=FS}PGKGx&H2&dFKXan1jLy* zryRQ(5V}V6&cfB{PHNUIK{veK0K}M3}c;l+|PvaZ{OrbF4sM zc-h7gbSaCy>}UXBM@N)5!zpZ$%n|KDHbtvVK6P!vrUx-)bpPm8U%h4KMi?ESFq1PH z+3fG0LF2BbINh7LPtFpJk?VzBjJv`&C7C`WUJ-2_vTP+qGgIb%(}Ckf{OZ2bSJDHl zbqQjt_xZ@GRC|{}ZNNp3%jT125lsTsn@wlXsfFhk(2*puFSdv6S86(_(H6t&iNY9j zT&z~yquRlz&!{~XbNy>cUb@!CVXB)H7;z8(VYF}oVM z7{>6jPVZdNdHU4nH%r|!qu!5(Zv+@C`Rs(iZh70ZZlJz)x89?2WZ5()#EA9>l3ucj zX)Ug0p5hFC^MF;BR#8+uz@kOD+`i$VdmxqMohAjz^@ayCZRUD9h+Y0IAOn0cZm0;tuhewLdGvGvPVzgjTrSCORUhvYh3e#RH@^<~~W zos0_UFu@P|5C6pkUN6PAhk6u3)jzboOd8L8nnzP3PRJ)J zl587shwm#cvW?o{{UNthzlI*tP!CoFa1o#T+YUK6En&6dyn8Hjj1eMo8K+`{ehBVC zZu>Y^j6Y1Nut^C=;dT5D#dMY2IZpf$6vtd8s9wVl0E(?o3}+-8|A_hcgQ+t0pl9In z*`_acW8_r=9WGppfSiBtxNZIBn%Z>7_`rkS3YQ)J!Qon} zODqOcUiSqL&o>``=e&WXAPXl;YFZra@D3JB)^xz7f1}!Z2lKb@^9NR zIvGOToRCDv%+(<=_LAj_C1vovC(ODS=zBCXh29>0(O)6pRJ<2xTvu~h(e3uDeBPh? zsM(D@x={G}dT|`DGUfXy=JWv9XT~ckKWJ!zO!e}+pIL+@?txP)Je6kw?&l>S-GjJ^ z*8>D5g0rfykoPOWGfj_TlJY|zrgX4y;EqRofRAFf2Vam?;fc?>(s6pS6F+}xOPM7Q z*<7Dy{9#)$$lQUS>-*beZn!#-er;7?)C7MfqJ3fQn(^7EPxh~e^`pAA$HVan4TDYz zE%Nu#KDS$4Ovlr|$QTt%uEd<~-A~nd<#qWBFHNNxNJnsac7kE?RvON--Hn{+$;mm$ZCh}IhPM0C zg)N!#lz@cu8``f&N1nQ>^5q&1x6}k(#ztZk$$L4M2R@(F5G9to>LCj z_uirHD|$glMK91-#m!5s6FhgP6(VN^5BNGIG?U`OQ?Ndunw){i;6Ib6Ym&oV5S^6d zh)Ipk^nXssPirKLw$&zE7LC-#JOUPHj#5h8#snM_fNSouRIAXi?*~Dp2X0pp+wUT? 
z{9BYPWqJ7hb)hch_dG1IUF6ltN_3SlPg^)?qE zrr0;7czeInv%jcb*V}+oV7~ok1KrL$TNIl|Ze|aO(CA|(3Ct>ZLey!lV)w+_=&BD$ z_YBRF{h&hIzg>s>y?2Z3Wp-2|M)%WFi7pt0jt*54q2_26k&q&r(YZ$JQL%Su` zp~3qQSpqC}yewVnz#9j9Y;3xP z@1weZQHd6@w$Y4qtX*T;jSRtj#+q#LSI1L& za?z;BGV>pO#lQ7ga1lMBh);kAyCKhb-&O&?PoeG;3-b4TxhQad-;Txq8z)uM^;M>W zgI1i$WRHi8Lll#ZN!ssQfvYPhUMo_CVY^HH{}?^SfM$*e4u{aYu#U3-nP zxz(gxFqK^@8?neLprg)SW^Y1B!7;Q#9Dua`Se=@7wg!ye|~bx!+&TTl~_ zg@hL3+UEmfj|MUN72{deL4J#Q?pO6TgLdt{x$97HKh#uKTuLRb&FDz=`AcQ9ZmrcF zXv+?#E%-`3ztzGHos_+Ou_Bk{_h5thMQ3RdH{DrNQ8Qy=~xbW9?{|_P;}lBgOx3n}SsKpi zJ9glPd!X047TOj)bJciT-73<)m(wU{pcdORBk9IWEc>16{8RRF`ak8o62kG6ArCp> zIrD~@I4|*!Jq0Ewv7P;ugB;_mdY7bKO!TsQc{rDT7((Qy)6of@fSl9p&SpaIK@E_2 zwlz$o2Uy}ahs7^y&-1>A>ej|@JR0Rbk>%-0HED{A=U%A0$3kbrY5z&Dbx+2K3v+I|(?-{|Bt1AhIZ^4ZO=)dz?*qEr>+b0vSTI^6#;N{;t32ifXxk zM*>ySf7;oub2_@Wc}J?vXGAr8Ac8cNdsdgEH-_$B!D@j}ao zt`kBrEAd+xTa-xBqC|5K9KB#KN?F_i(4PtgT`+n7o8dd&K)TtMzQnGg>Fvh4i6r9Z zJLNJNV|*wE_=Mox1`b$Z&cNcU0GNl}6B_&ng1of77H|zBIhIp;T7f*#+<@=M0)2|r zVd|=qH-sdHXr?>90H&Org5#KOQu=3Ul8UN(4k~IW(f#DO&|g$YJ>sW;=g2*-Jqa^F z_kJer7ZvmhuxYU_%~vsuyW^qJ^GR!MIZ;J_P~f|&?1%^4aQF7= zF*ypgguGlV+P$|Lnvpe;3o_jVlIM7L!OC5E17F%w<&h}&0A%V~k3rCp z0)L6L4bB@wvsPa}I|+U7=)a+dA6HCzUeFsaR^!}zbXm;Z8=Ns_1TEIvhEHye>gbw>Rhqu-Q(f_+Hsh%%KTNo1Jkf5I=w;N;9P_<{iq6&kPmFM4mF?l@N+l5y_IoY&3WUD51?%jUDGMq`HlNE zT-HN)GHGG@$CptyhxpG{FWLD|1gmMyG);A$lb;hFK+|>a&fNMj*!Hlz#;-;a6$4GL z`h4IWM&J;{I|=}s%D!Cr8W zAo7#$1K7hIgzsuF32KM;3ry4MSIu@8?ZD^1lGP5Zf#0M%Mcw902blLsZ3Ude0Xn3S zYeMPE?9{dBnMWO4S{+}Um&%RT&RIhZO^D&$flC^NVFLnjkZ;_Ab06r;u6;q8(JAOA z*&NN7KP)mxE@H;FjVJJJ(~Kql7SX@0LZX? zJ6m!ow_sAS&|kkHR_%?!rTD8^nj>^vz0pCpZH6CMZaj&Aj7bY`p|Qe`0&j#j7n_%Q z#H=r+d^%3qf?%oWUhX5_%Y(7AE#ULh-eql~NB85$3=CLf>`Y~zNBWgnMtyq`L&Xg< z_g=*h5lR-$AA#o?y#2+qgXK+B7`+(c06OBB#|5I41=IoRo3prg!uwT$@|V^6+I10- z_(XKacOqN=w>iCt46j2pfo3i4a0`P~)X^&=NNDSLlZUJsj@k_y-3gJz3j___^2ELQ zEK;o!p-aQGO4Ou^c%xG!o7jpTZ169fpPi&xj1TdQwb;Pf-=B;837q^?guqjIR6qcy zFZ3()6zo31ZxIs0J+wd88=v*UK={??DzV$+6WwF%0K~YG=Hc1StVF9RjSoCw9#d(< zxl#c3ZnmM(i<7gFhLzZ;0oDM=sw^PF{kaxE)b@uKtT?5lv5l(^A(6+IeMO>M1TQIG zI$Pw3wuZm#dwU;FPvM-lkJmw9b}lLL4AZtKO6ID2KQi%QhA2E*$- z8lF(9OKJs@>G3{#XLY)`-wNbK#6NeYqRzdA2C$Ph&^IP#_Lz$y8 zR$F=yN$x!6aPE!YKPkByk-J1N(3=|O$%L%T#sEmjl~nzB4FA;y&L#>A@_QXlFkB1WgleqMmO;Mt3egLZZrI%VcE)CpmzVRa;j2+a z9CTszZ6h7Q+rE)EE>W#Z!``UaruxH$4_`i-rIny9rsE8JQ8*s&Rvj3=TV^iZq&>@FDEWu`tX=iH_ z7$~Q-=O#}B0SiSsO1Pi}_2TT9&9~aZf>99N&g##d+K!7YN#Tl=F5bg@@0y@kk*&@`CcrL-4rwj}ylkbT7#)1QlF(Gn8fj%!vPe3{zXlle+SR zllETEqLZDwr!X!}O=}7#MP)$@FYdQ$l_$&m9wn)>;(G$`9-k4O(uM4off*aG`%44L zoS@5U^JwPrB-2dx=%#7L{!|3Rz7_bB?B%mJt9{9HgkfxwDxqmE33bULPYKQ~#Lr7cMT$ z6{LVH`GhCuhTy?=<0*1rx%V?HBFz5*&5q+vS_LmT$-4?MVhOeMR~?ybfz$bGpfX;? zY7aWDK9rOhxFv`P7(*=~7)tcpZ3fervAaz$P`=ID_(gGzvzf<0wmy^#mwL8&2Zr5OWjLqBRa?5~Wgr3F{EWn~HS3=4W z2s#$ETt^N5q8iia7-dTcIY5OZ?CwxHJ?hF%?cPX^7C_()A4?4P!k$8g(IrvE?tM|7 zSD((k9P#q(n0*CHuLMYe-^A|W z9}~FXa{)9@M%;ZW&QB8A?L)I-w&%~q*v3tsXGnj2Zq72>>;To7M+u~`!@H!hDDs;E z`2Iu4AfY2^q4<-(O!M%^JJUy|kbX}|Nt=sWnX8zaDKS%|403_|6#{r(CEPt+`*e#? 
z;n!^2L0J*ifw%6iXhC_?Q=Z(T-EoH<7+?#|mGJHt)n+r4%nqd2GiUE>xyNbausXXIcIZ>S1+HLN7mIZx$FX}t_ zauQyiKz;k-6>peP@|#oPe(pwouLt7#hMuEAviezyY2HXkzfekMh<1H-m9oF1RhaSv zt-B%*R?ul1lmHk&5!a=F1OEeBeo6k~s`qyTvqvrlSz;d@zE4)7?2&4%V{3dXf2rB*Zgu)TLGGe!^6XRiUo=;aSVR^XqZqj zI5O>7x@Qv}JGVYs*kdrx<2PSpGuYX?Y1iCN~9E(u|9w)4Eh&&*U?$SaB% zs2G$ohUDDg6`9=pFqS5@J6~EQAcywjh&&!$dQuGucjLj>k2<5()H(oT9_D$J8Ngk3 zp*E!M>})?2XS42^O_cDzIGkZy`U;Zn(0)h4Udv#rd!LBp@jlnIpW zVP0Z0TGzc-y~B90&?DfERZUq#C~xo+DM*ykg#x+v*GJ!ddhQ(_V&629H)S!i_wibl z35FowVb1cjq{P2ubl^M1bLUmj{$$ra<{@)cZAEl?JpT!Ziej)K%?AaRn|Y(t$SUw_ z2Qh6HJ5K4rbb8eLn=yi4GN`6Z9a@lHD%6w{i3BZlmZXzRuw`IY#G5LS$ubl1!;JZo zG04M$T=|fGWGR%C{6E-x@3^MgZCw;Z1yMjiK%_(kq(neOL~29@1VjXc(4zv;ab9{+Is z4)uOzBxIe-72tedpn71bmpcZr&QCS#;?Lz3J)SDpZTRTm;h$p~n0b|6@zclBk@3yG zCL%?GKuZ$$lfHGrl}PFa~OL{!W$D8VU$cNv2*) znBbs7MGos6qt%-pOqT+d*iLa+xn9!KZM-PW>(Y-!Tg?49X%o`a&i(k%=EujfXrEFwrxU7zk{g3tLvSo836w zLFKN;MTb7KE1SA~F@g1mqhF;l*O@ATMO_OizoE^k%Y=?*O3Pc~BaRaW7hXC7`mqG|CV14Yx%u3BWExemM5OBIEW zq~9WB&aR7+xhGF)CN{p1j(WrIurs=E zkKZDLw<0cF$yQ7^j{?MP#tyu1@f&n9o=4t`vqV~>Cgpb&AN9CCsFg7g8I64tYj@?) zZ6M!q4+_sYJPBekW$6~VG7_z=vvk?(N=3Ss!yBn?_H%$hTS4w|)>S4|m)M0-=-G(ecby-atru z_o)R1xa7hmgjzhm61y=Q!tz&>(9}}m1b58KF9EeLJj@QpnWcujnBXXQiv>Nkjo|&o zYh+QWlK*n`7^?u)yzu(#=z@0-h_lchueV4o$*!*>IUBN@9mBxNtv%!$lVCM$RTH-@ zxB1*d0+DIFF38}>F*g}@IQQ3H9vo(aiOvvJ6X~R=LP)A~)3i)R8{-xi<`^O0f_onA4<79c%dz?=Lg!N7Zn9D>z z+)h2YvZ64bm^p|w(tR`cFjddD_99TI+}B_HBNXw!S8Mk#s5E^injcOV&31}xGyDD6 z>etg#CZ8Is>GD4TNMJ)I0g%5T&Cu2;z8LtPZLD#eHIQMI0X4pQXfqewc+1a37;h7V zHFE!STIa>JJkO}PZ%H$mo=uAl-50U^T91l$mr33b$?x0v4`31wqGWIMJZ0RfsuI`* zuAGT_lMnvgodhW3W>;0F4|fluNV(UiA(!V=nBWq*+W&LW?T?%q^#lz^-ck^1^I#S+uL{IcJFv}$7BY;H0s;EZV?a2Z(*~Gb_IhL5UKyezy_W1*E`m=Jnl}U%R++(# zb(b)d2ZXtnaVJXZ$l{W2SvXsJ=fR=@^`er)s_l(D)ZLCR^@|8?+?_<6x-)1h^iFg} z1Y=?tc;Vlx}wi`OWnJH29OCvrEn&6g&OVt7v1sBK1X-ftY%6*6{aKxrFG{ zFZslDq(qXxc>I*5B4|r=8NvD(OxI@%EX?Eo4Hx83NARaJ{3o#i&9BCRUpf_OMCdH< z48n|tEX%oBqf^v&bw}|^CwxEF*T2p4XnoX(er>=Jjww8EOY}0PC*J#I$AY#dL((0h z@I|#t-IN&nu_$TMpu#sQs0EE4EH@D-0msVP2p6ozt{!Y#Xf&s@B5E>$Z~^4g6D6Cd z8YXmg^EQpU;O@DuUFYMrA7@C<7vKh6tt_tnm6(1*JM+|mhYvQ@pJ$}8vwT{_GX zs7*&Fsh9$yGUBoaT?7O$u=y3ZzRfZx`#Eaz36LHSS=GC&L13oM zkBQd_81%Mu`yV0DWT14XSNu+wFL(U)huKD~?^X6VUWL(+)hUH!#s*$Onv1J7SCZ|w zd&>71ljU}?4f?3D=w*{JdvO{jWIXs#W{myEEX#XrJ38Sk7AJZF_2Up_{L;Ceq}k(?{??q$BS>5I4K zFTPfro49fFR(|R3>HRANt9)mNxkEynv}#$q^aP=1bj6G24C4;V8#qIBBmddx7JR;l z1|Rj+sl(#QXZ6Z;tFg68HIEj;RDf=c3nvUx+IyF%$`>^t(~s*7%##TWtHl+(CzSW>Pp^HEt4PZ{qHzzQ8W0Dts9!Kl z+Yh}-yV9A?Td;U;0XoP*o8K@N@~f|%lX(p@rL6J(CKoV?o_eR;;MotMcPYy1rTEII zvIE0nJjY&5UN0MFG`D}W#Op4gt1%iJmUj1X{gB5J#d;`Mj;w7s3V9BTkQXbfl)qD8 zf7+%;zY%qQL=jTp1=nXrF@(0Cq6r$!HjIdd3s1{SZcO+-yBxT5&u$;1bz`Fst-0<8 zB{?9RyfmrP9a@~w>z5w8r3UM-UXzA>u)ExFXFq$a_m#bWycTh2U)E0p1tWYktr+t22JGfHrcM9X}XNoP9RMXQ_sC^Tf%GBh^pHoF#Qxup>b8f7RnO=5j6O8#T z#BVG3!Xk|!-`CcX6O~8G$2njul5s7~1GfF=H=jqTJ1eSjeET&Qwl7b9gNp{3i5%Qc zdsU8?gJU&2F3D|>-FIh#A&#iyeQcn!_N>hdY8&s-(_53H=ko6F*^Ry}h^`VrA$KLm zEin>x-5iMp7V1u)ydTJs^T*u^UA0F(X+o8tj}pi4I^GhPt~W$Wp`B(zubJ76zlhGhU!vvgep!<-CZH^;x=^%z zy|>`d(sc*js(fi!Q7cbrKi^e+D9qR(`_;UJ&oN&^AyI{>xtSErT^81F8DS!4|Hxl- zuyIg+#yji6iqOTQtrOAbOgb^$okm4THW!KxPct*{(hZ^o9A?lqkPxUcE&jbx7;RqX z8DTHsz_}cTg4Oor<(8p^7*da_VQHF#Q)U9QrwHUjzRXg3;~#zMzj=;!4#CmDM)1Wl zhkJ7Lq@l+nSa@A;h;6>D%~K45g&{c`IPDRq$?8`krD7MyZy4^3-7GRy^nvbzEZFg= z*^y|x&g~?f`W?^j%Luow+=aeZEQL;*g=KR2{d!PSWDsYiC#P$NyB%0n(DD3rUfw7* z2GoG_Snz&gLuGXt&L(u;x1s@{lSbz!k44z4N$uE#5o)oXEqtz+qGDI)FB2aM{Px49 zl7pmaf8Ijfs;Exv!bplA(Tu3QQUVw0>Zzxls=7~2*m$vlDjZxLR815;X%Iipx^CYF zx1sic%8#`%Rlf0q-MBe&(~uAuD=PuJMUxssgs!M&56M=1n>I}wco??oZ}3j_WiRv9 
[GIT binary patch data (base85-encoded image payload) omitted]
zSpCoI|CDc^0zc*?^JPtH#Kig4@f)yn^E4FxUhE1^kW4$x^*O$2ClO%Ft&F($*Blfu zZR_rV_j)g-i2HtP=P+u4U8t?jxXOZnV>mfGDdt3pxUNe(Ov6z3zlXL;SuBW+x!cg%wca3#=;2V+qx0d3-DaZsHxKDd~sGF|9MNT z4br!~61Y@#Bjg}TB!5d3%Zq*jo=e!Dlw-$L?SR%@loxu5&;2w=z0mMIyf%=0V!mvf zbZ$M4>rpqV=XAe-9XeZV^Qq9m{kaB$5cAo3) zM&}>fOssFx>#qq%jNjHLy~u~e!!3ppg?03}x4UfV`RrugU(dWkthki1$H^);Rx8JM zv=?tD?0+@FcZRILjJG$p28N|%G3F>CSAh<%swby&WVWw6;4_V7w^JD9%NSRkJCX@4 z^uy(f`Gi7R&_x$witl6Ts~U;UeMZ-Jw__6n7iwV}IRtc2?(_M@Y6H+6->~dtV3mKC zKK{$(_&p!%qo3(&G>9k(k@Wz;u9QmG`JOjD1qPDYD#o`d=4>tzWfN1IvNQlM{a`8)ZF4R zR^Ua<(ntXb-1d$LFc1qQt7dk|TYnGQb*pp1l0U2NCzWY-P`JHhg|IvM- zPLNoKSe!1UI{Hpl2++V|_ru`Ucu%8FiH$G0D$HLV)fT$R1wTED?Doyy`h^DkatNRS ztNaZba2piIT^{6L9#Ug5w~RPEV_-DPkrZDN*)nz@=T4~2-CL`3xB;!T%nk6F2PRNO z_n(v>;Jv9DLAYxm!UtO9{&^q>YJ7d38Sh0p2l%LYPVx|6Yyc39CoZ5a^E^QkD8A_s z=I_bSAkK$1s<)L2gQvzXS^&5sqhx_S z^%~jF&+n=aPB1h{x$P&#$X126rF#BAX?rslOlhFw>4;^9}$;0u+~m!+!r`@9vlp!h8qqbsokSV zd))Z`n_sWgZ*XW`q!HCWDIglIPvGOdFUkr1LF&y z;_r6J6=p1hUZW^RzmwU4ULM^?oqAgpP;wiyx!L4MvzNHw(E1qF?#b7m9;pSTvE9kB zEQK~igVj!NPGe9&)~{+tJaUzKXYbIRXgvZhcMIT(q*QAtk5rm!bY#ukwum?!T?(V? zCx!%)pv@n?lTAv71KD1{1|{DnA#;0kAXtd@Ct1Rz`QyW8AmbeGY9=wCM6PhIlf9O| z%QMBmH~?hQpF>hav##VN-Mw=lX0U1BM*J`cwn6wAzI#}A)yUETlQc6g@>aK;&;)8urd)m){@{iBiPH3e48)A^BX12lT(<(`2(4620JkAG*QVnZNS5E^ zVzFRMU1gzj1np{^!{v@Nx5&8VVF~-wW5JtG-d>MXIUjU!m)8j_4DDruSxfC#CTr~H zpVzD$%mB68i8jn8T!baSrRutXfk|P#O&S8z9Gy!+U&xn}Ay^c-xWmn|g-w&pH?I-y zz4s%)#%lQRRs=}Ds{*Rk64f9T|74q+a~v~)PGDZlj?`0|wocU9nb4Sj0LlY=OZ~)} z4f!kV=OZgYl6ZGAkl7+DNj~ipv8JR_@P&=fg@l<|nHq^)(FI~lFu+EQfC%j?@*INc zNS2WOGSDF!ybZo>g?jG{n&_PiD0f)u&1+9rYk5*r9t9YLsh4;bhi9!pta8+_IF7*w zSo@?M;fCCvLVCH7CA;>0x=xv^`rjB1*d8`(y&Zshti925sVhvIt_&a9Kl{P)Rh0hn z3Z>NuB})e2hPeEHWp-^xgdnL)z6Fw~hf43ta34+3a6N0osQIwL`bzWFDM4ygL-s5- zMorr^8!yFhrFdn619G0>b(10GWZC;g50PBK0eyDQ;E`Clr)H)(!7|eGLZRPe+~%j1 znj|hO3Ym-8>P@A}#rEP5cWba0iY?O&?`^YG>Ip@uv6d&(1vPBmmij=|g1(P7NBVBn zP|Z8gOmM}C?=*`Q2YgXH#(KgeIyzXUk2~fF$|Ii<>s+RPrxeIB-UXyM70v6R3c7;uO%PXnebtiB;c>UD` zeh4$qE0o2{hi@sG@pTf*L%~^h%0JR%r4;l!;X3j1I0wwqhxYVYgBcq>q1da{_doWT zWs}7mK^DrJW?w^))TPDQrnf(N+{e4JzF-O}&~Ea%Is!M!F@`a3%yr$tQTpdB8H{lpnUA_i+3>^}Rwwy;mS-FyCPw5OCmBAxAKN`26%Csb0w8iUn{^#I zP$hFUI&UeeXaGhn0DG>|9)+{GFrVl0o4&K=#pg;(I#+H|b)u3B9|JBlxI^ zw@F;e{Apd>D)coFy)V@A0Qz}`bFfDIz>YrVxl5f@!^;nQ2dD{I2K|yz=Qc4l`f19B zuf|)B#$t~KWo=~@t|gQJ7`Z(#q5f*R7?1KMjesg}6D1#BZs#H$Lew*hU!4!Tv!JxG zH?MKb#%bDWgduK3Xz2le2K^*+fUGjP867@epPh4SCVt*qWLa(a$xd=*SgK2pNdAzc z;b6|ZnO<-dVG7S)x(+(;!w-W}mZn#gK}(fHIf~jAd9OWs4%PJOn=LBs4G5h$c(L)M zd|>p+u_cRg*5wzlxyf9EjRLOXO2+6*y0s#Fr79}y#viYV9#~>Ptth~U@#$VQx}iD;sP~=D~da`36si{kN4~ly3xLA@L2@ZOzG4)}g&hFA0zHaY6>A!@DY^ zUV-9odStAiCP)eVKJg=58%x#Tnn0hKCEFC)!SzVzSfua=KjZDj^^fBememDckP&Fs zJ&hpd-N+{T^Dt5VbB~?w1p(ms5TvW-;MUDc1|j8oi~T0o4fN+eOeKy1Q?%``F4!xFJ2(VWD{WZjG`~$A z$MZ6Pt@+t-=2K6c=TzG|U^o2-ZU4VOHEH~*+0A4Ebb-0Xy1*w%9Bk8PJ8HlXn&;ht~<(~Bx)^*al) zeDfoRJxhxN{${?k$mIM51mH>By+}20 z-ukBj){L9S6jkItKvJ!tXXLKh;Jm;WX4R-kTz+i}Tc{BzIjJ3VewQnJ39op^jR``* z7SVN@2bJ%AJp~6qffE;7dMSDHow_frLj)z8k?NuuDmX>74E zE;U!fL0~~6eR4{ABK4Y9mOESVK+VPD>SXOG#j}&zEqU`_z6L26ME5LRd-aWJY+5pB zRx>!2?+1ea;?M1u(D2xs^n+7v9R4b1pXcj_a6pAlTxyfZq)ifMNdW9rw8Z13M~C;1 ztx%=kPkXx*nYOqq!-s*?J6t3@7)pI*FwGs$QHZ~0R> z66qRQW%+jd44;XmZyl>U472nmK_k#kA@59A{T-gS>q7~<=Y9mU45bD>+yho=;s2Yk zzoggZ&q9U<`Se=Ya+k{WR(EqVFIbwfD%dMeRWkEiQG1^AriG-ycie{k|FSp!yk~x% z|5Moh@n8BYcdBt&$?qVLBTrK$H&^7N1bxD-D&D;fX;7DG7-22UV=yc+A|+i@omUx< zgfRfN_$yoK7?rApistaW3)S@yImHqVo}m61Ea(PFcSCnhN+$n8t|f=drOdTwu-V-H z2DEEyP`u*c4|2q+%-k<6p?VOw>jERa{v?K?-lJ(tKxaj}ZbuN4(29IJRHSoIm*U1c za^TWr?kb>88ZW#?Z0wEtL2HA;Ip-`Fy?p>pyI#)Drxy=C>ljh8HX7~yj8g+(gj4bz 
zp!exY@!*Y7%s5b8y%)hV%AmzV>NL#&^1iz5NNypi_cm!YO^>bqSZJW{!@oxQX_}%A z6Ru7iT>~GqpFVK>hYLKU7qCXyng|cRQb@F;73BduZ`u_O7}D2&clEA+ zk!yc>gmi;T)pPej0fqY44@+5A%s7Xs4doHXbD?UGCTJ-sJHAFNCDnVlqQa4cy?6IW z%!}!yAlyW32VZ)tFuGJ6ZUA^U)=&3+vBbo!l#aQu{H!TsV-WsZ$LZ}*2zr!y zYfz2Ug7d~q-z*pjZkPokmFyx#fx0A3qYlupB|4Aw9`8xHq)316mWxpWV5D#EW^enX=)PjJ#7Zr^-H?H;nXvw>WMfU)l39S z&LfBkb#bV+=dUN}G(Cz}pY`l6tN@L8{HPK4+y2s@*Z!Hlc?CqN9`~gWK6IGZ*p$Z1 zp6?cn!kmE^4RKyIG#M)m3&g8+y>dHu8*tr}!sAf{Ph9T&S*G++i^8xfCCd;O%@roO z=eR<#1160!V%vs0rljt3=~+jzBnnG4%2g9(niyIT=OL-j@m{I06RGI>2k$y|D(yF#z`i zRm%S2Iq*E~2#}p@{VQS#^z*Slbmh-3`8i8|E}Wl2;aBqYr%?DQ6n+YYpF-iMQ21Y? zPa00RH9G_u>bx)l)7UVX%C|q;&Lm>5SP*%6FnSy_n-&NqfC>nc3DifTXd$2F&Jy$B z3Xl0+>uwc4&>EW)Yp!}-z%Ufc7)J8t8L6AxDi4tXFUWGs!W!LzJYYQJD2lLY6Aj+R zKB6b2X&EjY`Wd|c8ecn-d$d=$`na%UjljRMk`jEllLcTN|4`w!|AqHu`K71w#OE5I zh+=mSFT$pBTVzC%o>tn(o&`t8nD#ED}K zPw+GhQxNx3?ZTFsWjzaR}dz_I^AAOGM*A2uv<}<51Y2H(};oaER2m+N^ zentf(gjR_y(jQ#gNSzhK@hN!m@n^zVK}NB?YuRjE3en;edB@0EUr@WNn9NvRiQd?UkjoKj+5dwz`UKX7FgDbzSqp8@v!5W(wq(N(O}aL&zjr} zK#%Y-y#vGFQ6euQht&#hU#lmHvv5s-=dNjBpq0afEuXf70U*gviJM>MD@+m=1OD1g z9#y{5g8q24^;`kpM2Jv3L+k@yzgr4`?fwKiQ#Bwb_m^$~1PNG0OnzOWdDtE=^za5M z7`8?=DXRa{keEiWkj=}zskc38bUf__r2~xCNl?ipN}qac5z@K0r*C27xGUp{nt5!R zxTW4*N)?(!UZh?OtFl2pvNvldob(9uK8femM_Bvxvy+#%xZ0eTAmS9`fUv%toK5=n z7sH@_U4XdF?RDoT0`|`!VE>=*rE7qz0U4ma`TLw5mT{T)I5)N$^>wUb=W0~=*!*qJ z;zBZ4Hgz)Yga_4EB<$|t{IKB9(DOI_@^N^>m&&D1_k11;aOFw8PPyHCg{>jG)sxVM zs|13ZX+}|kf+Z0ztBhUg&71_&;Yk*ah;mw{?ZL^e*{I<=b8rT%#(ZFKosfISL5q(- zrO1?VA@#kp!bb9E$0=^_Du9A+nZ4y2O@vejcBZmf=E|&9Wb6iw;>ue|64);oG&ZT) zEB`_C=V|rroMn8l`Ve~uS%G{oU%!e#_|vZd3@iM{P3V`+=r@j@5&k}61l=vuW*}rJ z2z4buF0z9JzwCKtY3Ah?-IpO9^$+IxSDi3gnr3I~S3Ua-k|X$bKVEdY&@MCXPXhe6vLmBf##R7TCpj@Tl~VTEU~V$cARzsBhQW zl5U|IolQKkV#?Vkqw%B)87Ri$t=?a;W0!8AE%z#0Qm!>&990iK4!@0sGG@Ig^~D8BpI$f@E;WTeUz<8q_==Z$CuSG&5?o#xUI-L6ARnlgh!H-0Qu*-3{=V3u z)7Jq}<;CcxB=2>%W+~!qu#7?$Z1mlZP=>Yolz%7t=aD!+8MbWq*`|G|lW$BdC%Ml0 z-oAB|uzQ7$)`RZpGUpiD(&6O+;*yNk{z3mu`k>PvqGgPtUdx_xM@@n$SH#)b2JD>I z8MUs^Du-&yf8|X!pRf5g*V@=dI62oXq#nyGU2&PZs_-=EHQk18t6uu}>ci9yKvL#~ zHT>w0%7{xil*(3fjL8}UwL*zs;Imqv($9p4xLyo6klCVjKWr}Wp_R#;qN2r9sIk}R zBBUi^;zSdaHtlW0+@1aF;%xnVx%S3Kg?std1GWvNE;QqS%mq&%b+9r;?rx1K55)MF zqL^V`0qiz&;f~>a7+7a#x4c8i8cl6iYN~ScYml~XpvAiB;8yBfd9WOp&3>4JmHB`n zRM47kSl@F&Yz*KPwX5nfEt|4z6_n0<%@(fi7qz!MXQp4sa7PWJe4#}>5g0aEG4MqX z!?c}}=UElyTK!+{@Sg4V0@tJc@H!*cV&wM5Oq<=?b%737i&uivSK_=-Ca{sO14e6ZPPRd26HzE* z3RgOOh$$Ewg9(Vj@-&JX*>KHd0lfg7 zFD4flK~MPhX!ezGOD;Sw>l5}`nm?8B^dvlS`y{zCy}pBLtvfeFIFTR5qW`-7W^!Bd zC);cp*5Kt`JFR7fI-^Nr`?j9vp_UWXJak+hB{`DHq?pvGt6>(Q0T8kmQWvfFb`TqQ zY@iA*2=f~A0%IPhp`83Et25w?YoC*AxDRl?1^gC$NDDpaOw^m2uyO65 zmRG*Y^>96E12%`6pxQSaES=;TCdhsnQ)6krrRGk{&=5_#+3e$|r&sBdU>f0-RVQpm z5$z-*r@&os$$CXW`8uEIVe81QtVZKOK&MVw^n|+drR!RXjvtuYP~OB;InHJ*UrV3O zNLT3FEf21gSSBrR*yr3njl0bWD&v4OyYHLw+b+0cc^?BPePT#02>Jr9F_q+0i6Goe*cxNUmTq4dWlRJ^^74+0BMcCc=l@rx?zj!{3<+l9Mhna_rz2tkS zW7vpI-L2}zzD`jRXLs8e zkS%Dz8VLTGdsjz4&YM5uHA*l1{xS(qw~dI969hautVJsMX_|-89)ax?H6(tR6=qgD z!M!3sVid|k8kA$-N>LtxIry0A+446g)>fc`>NtERCeW`##;v8`Rj!+9GLQ?x5y^C0 zktdzhXpq|SQf(x|y&u$?Xnp&d8<%6v+3FG=N+H!Kv&waS|d4=!#6NbAPj6C}h?a*UBErzHN zbCqwU2DT?ou^56hvQgzSC-HnZjRwb@wNDdk>6#Fc$t&$_8n@P=5P!|SV`}=U#g;lZ zuqGEIRB+yN`4@Ao6WC&lv)ina?_ECH;`?+@d5D87N9ve*O0n(uWnic&er;n_d}GEf zH_n_GK$pT500x&y@bw8Jp-{i(vnMPNY`ArmEyX~O?3vz+31>`of~Ky=I6Q`iAlc<4 z*TJ+kcle`J#4-{&c#f{cCkS*Vx$gGfgnQw}bKqLoV}eh_M--LKgmVS89qVuLe1P=4 zdOga8H5tk_((`Z+RT2+*q5yF{^YVG&=M&l7b0Xv^GLrnNHDYol~?<)xKl`-68Y zxgLtfRF`9P!Ip^eSp#fAAvzKcVP*b^B`v*#!)5pcH;%|XarmDL?olwu^OLIr`<~`hfJMknj_xocz?@Dn2-!#=Ko|8k2|NO 
zAnDAvM;lLv7xKna6Zop0%vz&+q*XxQ$*A@7@n;DFJRA!T0YXO9cQPzp3W^4l-9~s( zFMBI{p=ikdf`FZ=HmM^9`st!iBk0`1qn71|3#0@9QfY!DdPmPHO zcSl#=Byn)+-*N#n?M9H{8uel|Z-%7nwRUklxdB^^AoFRHY7G&*Qc*j^{mBHcS|ejN z8Usa5NB~(kE|Hg)XmteB+LL{tO3weBBSpq{>j~&?3ZrCx?|TwEsk@G1%R8*LEZxj9 zM~%Yke)Qy(r*fIMP#^P3*4f0V9UdPyYDuOn%+mDxJg3{im)Kt0lhrH$Y{_X4|B@E* zFaCdwpPDJ{>?C@i+?VXpvy(c-ft?W^K#{wS0*DKv0Xw^R`OUmCz%0>gW&Fo~;W13^ z+E#Di{ku)fwMp+RB_lQ=)QvSqg8NK_YbRWTgU`-Y`3u^m89aG&N=cr{K;!ZY$;(3} ze}jSI0gOyUF#1B>MR$)HU94ymeI5JY333B_=qV*|J+008wy|ZK>yqs)C&@fd(rtwU zwOLPIBSwfNK3Z8{fD}(*I4i0Kzg;Phj0ycF3=df|KXdvy)P7Z}E*7;~Hqqnnbs;r# z=5qA;5NmoX3&oYT71g4oCRXiF%s2m>i_KSGIGAXUBBtT2ERRUvs>}fy|)<{ zi;B?jURtARs^VjY=nH^W0|of@^!5(|8<8xlhMOzo#JA_)Fi78Sm%B?BL*^pwsQ%gO zDy^pS@qbp&{2y`ul8o7c##w^g*5+&o+#%PPIOC(RJfCeIHd_X{o;YB$CoCC@b_fqA( zpIGkKv9X@^ikh>?(0ZiAW{78A?s)@lJUqKm6;W$3-0BOuEAV;m~Ijuj3DEU$*yQ`R#lwm!YF z_SAoKcNrbe^DW)u{c|$Wom-m&)tEASTU8N|NoQ2}*2 z{BE4N$KjiN@(7`PHjer|6T?omh;K6*teoscH#Gkbd+#09)VHn+$BK%I2muu#N-qk6 zAfP~giU^2+)X*cKQ~^Pnlte{ELJ3U}6e7J7=}3!oX(GM1gx&)rgf#AQpR>=m_qo4) zzCG?Z_q*qQR zV)AqBUJR4*6;L9mz4wjn{SBPUCglC<_qI>k)q13B0uhuCYYR_$WU)Y@V@nnZW)9<^w#Reb!;QlLqFS$pUvVO`7Jf^4fF3B<^6e~$v=u9 z|0{nFo}w#KLa_lih6sb6K2oe_DsFUYL;yt4s>wGlo-uEN#wE0#`M@T2HHx5UgX^as_Y$^F zIyLtd+OYKYgF3Cl014J&oZ4de%7H;zTe z!9_@F5i|B1VZ0DFJem3}j_8f574)WFW>)5|5wrz8HO&{(sEHV7J_6Ze)vF)IH7VZ* z=+z$A9dclnu@jpsHb=ivze0NBD}tW^SNO=n4hKAkILjcXk&N(bl$Bnf&lNwCEL&T1 z3q7K-b3&q9Hv(Sv5eOMEjBpXUhzznD_)xbGqFqoaIz6+R?#ByWz-)d7)Y(;RnKVE- z+X3L@%cxnlUX)kv>q}{SI0`P|K|Fgf6ich8^CZh;%pK;z7TqX1yf&hnG-1 zHrre)MHpP$lAuG%LZ-)owoCS1|exzNZA(tGsC^^t5WScQKOZC=3;!;Fn~)0vuk zz0?M_$K5WeHR?1U6M**A2HnVoR+M4(=rK%c+KR+ie_?+wFy6N5!moW$dwts$8hKm0*NPfLd;|LQ{y(&9BNIHEuJUx#}iw zK~x%0qL~7VY$N7O9msbKnO&E0B#(LpJCB@heLvO!=c4oCk~g!i3cCe|&!c|5ztLk<2Ryk zxJ%^F%thu8(|8d^&d^hbb zW9DWrITc|$Zx@Rz zdelk$DQ-NDQ;mnSmN}7T+u~w@cB*7)WkJehcUdXsV>)mo;)j@la3&QzzoSp>Y+X(D z1IU|Pxb4+_(C)7+Hl7imikNNie5oEOfu%TPsj+3$ji@S0JpvuffS+Fy$!%B3g{EMy zq+$jiOWAS^#y5{jcts5@2UG3xMM8GumpoYpOpy19qhbuFJ7iIE)8I~s%GxYE^Zg@a*;o>enzzH8N-oTc_+9HAJCEGRw2 zTp{cQnXywQfj?C6T8(Or%g!~L8IQlO?x^)1~~t4I<{ok)cPJ zzIjIc)aYs2g;+zq?vvxq{5V=VWKMq|`aj1H)~hK}-<^Oh|MBIxXs&jSW83a}Dm*Wn zS!P+*#p^ezPPqnckC%Z+$T{TSAPT>)b&cOViHyuPZ}vO4^h~#hhb|tN?vIWl-xQbl zs3O&)uKs*SRS$CB@z9r8pwL5|(n~H{huTZFnyR@kv{bnhObtex9y=EiOpHg_Vi{cm%rF)A5Pe*ebDjd3OW_X*nu|0FgZez zdn7m7BJwTYByS^N-zM}dr061NkllI6DOEafcD%t)F$2w+RLG3_znXGG80+sraQ`|; zJTQpc9trt3gN*lB{9biQYnW}$8G=6_c?%Ki@36}T+`Y4!dr%7%=_>^2?;!yT?UJ$k z&aDke$yNRr`#RiX@t&9&?bmt@upn5VY(kf2l8%+VE%!zWBfcf^bl$vd%>a7Wk zE5dC(P^}l3VH1*lM1`DtW$E%-KXBX!F*nyGjU9Uo@8x95PgK8Q2*QauYHObYM+ne+ zoO_X`aSqvo4(97sm=v(kKL_FepUgrao*?VB9D8$n;^hg-viF$-f-s%anns~^ zMt#q`p~r!|0rmNh6B0R_KCc5ziH9sg8xgF%Spy`oSJo$4E8m_wGQ63cCUB+cK(aRC z4gE4e8}gk&6kR;xU#(*i0>I?xvzGsS`f%UbJD@qp_z&H%jupjZxtTJKP;;MGKyc2ES7 zVsr~w&jtBa=;q*Hy8@{Trr$4F-_gdQ?#APPjI03^ZoP`xfII40Sz6$0m;<$JI&lVf zpmxlxSB}sUw(kt#!COx@`+c4U3u`3^eHf6L#~Hb1qFFyEz4)}(+DJZ-*(Z}Z&`2MX9yyI8TazhbC^ z>hx>4#;NN9@PQkC2m%xn&yZMKmQYKy;U0UeR31b-MyVSjU!+)GP^caty-276#~mJ{ zybuwg<8v(mB<6H7Z@C#cj}tc;^(D%kxflY(S1vv3YceiYwpUtS%Ol6d`{x_*Av zub>~;MW-SXl+71uc4_72lY%|HSLFBl7HqnoFRW%X$!9Xo;($9eXvp4*S|Aq7Y@9l?di4$UXpwbp<#~rsfz>gK z89`!nQ|AR;aKHsh#&qo(9(u~kyb&^)Ah7)fVrnyvDe&rT>A^G#dJNPX1vhiONaED9 zR2PMfPJJcgUf2`4jW-u<$=X`Fj;8)E%C{t2=-_dL98o*amUiwzPk_Bf|6}be%gf|L zw)Wd*_|7yxj-6W`OJ+67x`V%6BM|NDf1KBAAMolDxchR#X~=!F>DZCfgp`qLe;wt}($a6Nnxa z;cA=tVd21wy9HdlA8Q`#PDmqd{4DMvr)}v@M&HWas>LB`C+Pe^10ZsEIJ3ZQ6mymm|EPz65AFE5)! 
zSC4}!KEw_DUL8I`Sb-P)QrZ8gYMVdSbv!RZI#OaDIs%)%%ak2TA1E70ZXfNj`Sz;Q{%QI(g~J3C z165v*8U()+I*|(^M0}82_Eeh@4b0R9;<(F`oO-5awv4l_dXJnXV(zMEsHxtS)PqXDWvEa*uSKoWh5%85js{}?)$vD z%F&+G8@OyLe-b;ff?_=1^l8ML@D=ZJ@w|Fd;*Vpta|=90ZSz&(0ve?~`PRi{f249A zj|3)GzjE2oL;1d;A%N?GqrMuL8uh20e0PpNNevWmn`S^~|4KX>obB7v(hE^(8FIpW zY>Vm!9fC*7dzYAu*CsW5h`#>%SnD=+$#{8G;s@WP*U5-$x6gz3IB2i{_NrfU&Knn& z9_m8J*2$~vNiX?4julpcZ(a}L286ilRgMQd>__lY5*sA^Lu`tZm*tZch{1;npEKoX z;nQ%QC4};?K8xN)L7mQOI8QO|j`Lpkn<*_XkX2tdY4yN8p`ao-tD{6T62$+C7db*XJ2E`2mpApQZF3 z9b}!N8@a)1MYKQIA|0^D&$=T(?KYqFKp6WqmbvfS4=MVe+Vp-{4 z)g-hG)*SJ2%huTXIh3Dk&MOV~h7m+ogYifEYD_a6tlvpWU4Q-T1sOY1HC@1?L4MX- za|hKp5TKdj;EXe_L-ju@fBdZGG(B+($%*&{IaPX#D{-jlX21{#J;6x19z^LEEq^*2 zxA3h*Jgn%@is_J9-6wkuDh|Hv48l2 zx7@-J*L~3~`Q0}^X}6UeTi7nJy;k1|`|$?f&pReN8R-4I6BJlB)B5+Ps^Z~O9AjCso& zy*4F_}oYdy05Dw4(k7XVZPc*TbX`S+?*w?dT{mBCu*`Kn1d? z55$ z@0lqV0MzwelU8aC4S5V78D!^wy73#nQvFlKC(%Ol=DI7rN|)2SbDZM!%)#NK(h_e^ zx2I3ry?Lp7&bL}?(=OPj&`CSC^v)b-;g<3%e%v{PN`TK`-6iLWK`!}{!ng{)$Jc&v zpMahmz>cyIex?v1Fo$Wwh@-)6!sq3sSVrrNfBbQ%T$3PD>)o5-Vmfc*&fRmOX%2HF z;oBt-O>92YZsV0aj$O{gEBwAFM%ydufx(8sH}~#C)B*TQ$u)F03vJEO^1udAfFM(#MrS;r^d%38P1YVHchJnWY$bEYN143gHdQbI2>O;Y_I2!DuOc7K5RXFxU46I-3*ktIsSq=ClhnD+_7Z6=9wCok zCZj=Hs%Mm*DSCgG9I+ibM12T6i{J?hGepbOS)sD1RPTzSun!7Cg8f&ch9WxiZ}Zmk z<6=g8k0l&IBqHnOp-x>sGa}7bug|c(->B(6L}_0%e?Ko+5RoF^ak7V_As<_=sQ%#0 z!iR-z$x_)YUy_!hfK7hMz1u8@&omysP+lmAd={&x_UJFT4U8a9=d30V5Lp^&gA5FU z!-*}+nd5u(N@Ccla({;`%bwAO={Z5q%d_sU*<-~zZb0pvDcy`zCCxx!3eeCqboK0S z(asb8dTzUVuDrIlSF^$DU-MpbD0xlHc}L%qT-UVG(GO3MMZGQr8=Lrck7z4-yBFi+*w}^vohApXzBp)e0VA=)d zSacFiEfM{*2O^fA(x2xkyDWP6r=Lcr&O>shalwvx?ERZqmL*f8h5lfPq8$al8scxw zs?PDh9YOr&2gB+kyp-(R)T8^Lw|$*Mt3y2@^w`AR3q;(JR-9!xZXxj&;v=(c^~2%} ztjhOA3;m*{%^M;|4F&9V;T=N6%OLg%XR&+JUW1Q7*dgTkWypvX+mM{n^r*>rJ6qM=CZsss6Z>s?)Xxx;O79ktjc<5e7siN}-6*H) zYvtRoa`|xB_jwQF$Qc0(a9e;(icH9(#25)M?s9GZ#urRK?t>PK;Ht^}vkis_W7>|w z*Qi4cPzaC!#)}q)O0f=LYnNKLujQ%sPE)EG0J7C9_-yi=i?qb7bMFJ=09haH+Hr% zo_}Bg8loU!^9!pLK4c_H>$E#LPvZSP=qix0Eru=eA{?6^mS3$~;#;@c2YCalEMR8* zKPSun{~w0S|LS}MFteRS~RIE;`SO%U5;?aK>f&Te}G-_hI zyD5mgI-B)P4f0MC`$wUCJ^NapP~<)c5L5@jp1`>i+DWtiAB~(G0NXVE|EPT8KI*=A z55%-Gg@dxi>_Zk?9nOiDQ-y;mKP$PgJGA>2JE|Rn4C_^bG(6_+qwlW?3 zU)@;$@$YH-XbgFVfz>^pi5}=TzS{cU)A?y|m9ao^0I=0gJ98mtWR72Rx)fM>Rv1&s5+^=M|Y6c+1X_F(Z%wpf6DY;X z;qK%!<%=O6ld7ak$nX~hV5n&%BTV;>+r_L+(-=|=gm(4Lq+&e9g)`6svt+@`Sy!Y& zAnV-l4B<%kX@hl_6MHKsOG7PI-u3A5b} zX#*tc2-#_4T9y6^ZXDx z>#h1-pv*(@B?60kfHtDj69E4tnk!JxICo`dkwSt2#Czp0{2sf%@O!Xfo`2={V8#9w z@*aXU`|)^v#t!gMPkgNt=z(O6GB2U$C0c+1a#)Ac#$W)uCj&sdU(uKKavj5a*Es<1 zgPZQ$=#We8Ta|i*_Awv3K`UuNgo)o)Vo66K*{zP>I|41(WBIzROp&PoD*x*c-?QGi z_q?Y&P^ThmgA+FTVXs+3-_nXJsnR)Wq%Fo+f_hIkrH|SRj7LjeQ}cBg@G@F@0bru# z;Y>NvLDJi$7gG49fj}pI@)h!^M04wVujVE1sR*{6ocQJo0Ew|<5Rr|UQ6Eg@Yff%f zdvTUhVafZ%n3@XoefsFzVaj{gg#_P}gPXjsjZwBjZOswua#eP&VJNI@zz*Y6tbzCq zUf&Tkpf;(kCM(C8%tYy_SuaRZ{xLNx#oXDPd%RMgB+>kah{a}AX@ zkY#Z%VtFiL$3TjH1b}rBgUXFjN#WpjV6F}Nf8c;kd&mGADl0HJ{g)|5^eQ*E|BOI2 z!p|7jxNsYEW=Wrl;J*Ht@gOC8mK#| z4r81~elS9EE;{amqgwY(Lg3{d<&cNHiilUZ1VG(V37gwtaA$6K~ z+TMxoc!PeseUQ1Agg>_ez2qTLD^={FeTAm1-pTgmAV%sDd#yspGYncKg}LI&G{u4i zW1>AA#rec`)WrGh^E>biE@egqFutEVM!eo9X7<|Lo7B?;C0szhU_m5VmEdzs%Kl zy7>mTBKgEEuZUTu#UI0;(pHyY%4TP;XrsBC+3U*r>8-?(Rz2;Bd&^h2dDZ0I)5jgY zK4(i!>q(QiZt(r!MDDUJB|M5G+ZS+Io?5~hD5MCWk-5DT`^K%4=^(V80ECg4 zx7=M>Tdagaks4=|&+NPnrk}#Kq^hv>pw--WoK*5Ik$aRE*6uT99QL#Lz>2 z0i(!1?#O~yeaq=au1O9H(aRTek^&HN1Kb;}9Ox95y@6r+b0E`d?ohDf0sSaf-(#59 zQCa)hpDzXyag~iaW)Dwcp^L8~qqu_0T zt^>axzi!!toPPA1z=X;`1-@Nn-w2Tf<|)q$nQ}B8i&<{+iFZflCiN{g8C`)#sWf)( zq9mF*ECvX)ifBk47t_l{$)He#UgOHQjGHko(i0iJJ}inA*FEpdy8FhBWWYF2H(K 
z`>&T!QQKL>&enA^QR+A7FC!SSJQFdiCpFzl?e%@n22fL}5a?b379A7fM(9n<_69t<3$G|gO3_nB} zVKz?DHrhY&ej6Th*X&DG@&~E#)p&+l4Vp=0Wy&RqL`dpV^9+GAq+;t}rIUpGdyL0` zXcCadGIpzq66w(;3FlRWdX+DUV|!Rg;4VFbwI}0+YQu|cC@+9_E2RK1VMHhf&>7(& z=)&SYXi9&-GC{Le>P_sK7%jt^r~cGf-p1s*wC_+K{co$xU)ySEdJT7WkgZH zle~yjVQLQZT^5|jNnoRqoq7~NP0p)9T_o{j_f;P)Lg=gAHACR}Ejsfwo7E)|*iJ4d z18tp&LUCA^tNGvCFcX~_r&X-JTnrlCwjgm1@;hfIiISlhH*Sip-Xvc5L1>rtMortes-Y__;!L@ezo` zY4u7$$hFie`y+yct6{dnNj=ieR-zbT{87ZQ z!d@H?&JGs=LmsE=Ct4L;mw!ug1iKPM_>Z3kyPolu_#O~AN4o4xzAqnPXlnClaa!i9 z;$gnqhrPf=(|7DWNma_|FAal60*F*S-S24*g{N(^VxMp_Qe#|HtUbb0lgeWpQ)jG2 zGwRchCN(qwZ1H1sX$sNyXc94|g{!=y9J(ep`+f0b!Td#_=SQ3KIMh+XG@q$JsS=BK zmK%;lR$p;gMxIgg4GbjnPJV_*tGMRw4g>j%qJ*4OjUs2`v(<+UE8j=!fDW|EF;2j? zvG6+NDbhD3%}rXP8GUNS>(tFbN9dGqYz0#}ac`A+^Nv+kP8=?Tw*ZJH75rZ57(Z6j zzfI2U`GY&HpCpPLu)L2`S73UjaEFxrsIugOtQ~QimpF>b5E(aCx8x3&^Q5hCVvGd# z*x_N&a#-^k3~%3@SId9TWCwR*<7-98%@^McJXZapA3v=82~r~{Vh1Lijf;QD`ma6L zM@shTyw(c&)TbwofOZqR@2v} zeb67bnRcZ%#8C)o@Ueg1!pWQon^%_{G5QOlmY2+4tU8U81)0?$0(`^@~%t^3^FV{Bc>(R?5T`ZFbu)meJ( zWtiVL);;_4bO{zQt><^6C3=kM)pz$nrYhS@7h+W@%IPI9hLX^>{hzL>mV6=_j1(2u-q!QkAhdAX=EG&9rNZcB=X z^nM~wu$E&0CR{m?c?-$EAUe-p?&fJV`W(2yZJ@Hc@6=?Zp9ye+m+ylZZ*6)o>Glm5Af?xzCy zl|nXn0p|_hZB|{m&n$#C8ZeH!qzewIw z^YwEYO-FjH>#fxm0Ze1x7<2Ucgf#UnwA1K{J`f~zKS`~LQZg^Vni?^)p_$_Cls+SP zO_cnA?v9)m2fII)-v{t~9>Kq7cZ~qOqfY5xAtXRioscvjDu0qoZMqc-So)h1ulN*o zf#7jT7)dsZ^u_ zO~?i}fMlrBi?B%yaCaFU=%ye0H(o73iDf5t2IjW`tU{puBH&@ptL|R;POU?d^|V&b z0N#pvN{^M(qcc}L}>^RNtD9SZTN5x9hII^WN|Ju z(<=P(vFZeF}aW z2qemY<@X{Kc}e_DvdqBL`kp&biig8%d&C;v>|KWevAMS{hqH%QmB1oUT#@URw_F@Z z`=(1j`HE`98KHSvK)-d^u2A?xPTTHPA?W7O)~%fR7~W_J<>n^LijW<(NbZ!2NU&Fq z%32y*&Y``;^5nCFWBK=SF!-W2-%s5y&m|Y+)V%VhYfYNwBuy%vXJk3mg$My!H%y$GaS*Uoo`G zaXl?Mb(diz)q8Km^*E~V<6+umKLiR-1DF< zO*rD=-(J$y67QbzX#RKfl;)mV@7{0Vb3tD3QRT8&b@qhXYeIJBdd9fuR|~U;&3+dR zd``%pxN<9|pt*rDViI_+zatxto%j|T$ct;qd8`+uHe4zj4;z4G%uekR(WIPst3mFJ zaZE_}c76cL0u|6*>5=9hpj9ZPl}5?M%!@xET2!;rdMWJ=*Z6k^b_8OX}44z-M;iMQFQu8_e@_(i~l1m{B1*clS< zQZW^nXTN>D6+$Lc;tB&;xT`?gr4!oh(6|V!oBP|jVho$WW}py3pEB7&DVT-aIrRXb zeJ2+^fF1$(>vHk;UA!Q6J2Prq8Hh}_r7udi3DrU4QW#xrDgYHI5ThaRJf7Cx5 zJQHltk2VK38xYtgx^1A%L)hp*syke&K(c;SsoN{H=G=rOUkMN!zq%0CCJ;C;viv*o z!~hac+WbKoX2Eo+buZbgEjmpxcL?YNLAI6!o^fbv+$rC=DjaK;1C4()tFL- zbMs;jkmus7=ldncIJZ-+`!n6rO~W-l3UP&H8_SW3E>NjoWnXvBS?b=EOm$1cQU+q< z#GnIJ6~BV4Sk(R{52o@mwTACjMqAwIwZ!)y4k_=mj_X;_&jX0*;*M!ZTdQ9!$!M zYwUe!&29ST4N;RLE}(I)r?>_cR}mMN&cbzf^|y9+p9L}ILUhP{pHL9JVbaW|oD-d0 z(V5_AK$#6(P;TP-s`=cDzwzXK(}^EOMi-+sr$IMk;9>!H7+(?RDR(vPXNsAk{b!;= zZRf@gD%(H0&9_*nys~ZE3gK4ar>n&RL2ldnj0pdP25KTu`mDMYAk#xS~Pj^l6gYn7oD5-yjiZ(8+S%Y zM@FmXoaF#QL(d+ou&vLxjOUaV?aKOa(u@NX=O!vkn5;AMjISM4 zSylJjc}*L|i(ZHMjDqi^A7ndhyoR{W{0TjWt>f|?ogCc)oVR1^SbkN14oI|2tJ6>A z3)gqiUO!gC5S_5L`E%TNycfwTc?ZIcW)OMLGON=adNt~eZVM)Z4&}%$6T3$bHN!W9 z=@8o2WatwE3HbeX8eTkeQV<%~kK(=OOn;A)x^H}y&r)E}E zBcJ?Ho7;5fn<$B$H|U>ZC#bDt={OChEIw02hHRvaO2ExXBw3 zrNZrWuHsP*TqGIH4S3UG-Jg}$HM4(xvUfFQ5hrReAb_64H2Sx8@ykDbIp6$pr4?kP ztg2VqcU7>a_JcL*a-6*a$Ul*eXQb?doN`=735_Z|syt>Ewre-Gze-jDF?gB9rO774 z-`zjc5K_!u#3_nL)`M@Aakp09e{RxaJ@uV{X~A7r5oY!RkLigg*}E!DFBq7Y%y8CGIdWCQ!@9hq(W`P4wTWn_BP+(1*iIme_$7g9pk(??^$U1dJ+ zCChy?34^5H{Pr`yE2{Y%Q@}bi{e9VA4pu=}&6eb=it7tR5}q1HE7Lp+{XSy_?L8mp z)!wfE;<$|b19Q|9!{%6byk|-+Oug>sr59@`(lVWwVnEz&t~)Yt3AIICX7-FK49Zsq zMrB+7@{opq-3OhllUSTO<5}bx-FTQrFFF%>UAp;Mf97CT(Y@m6Vr5qp0P73^buyHZ zR9_aYDu*ZUmc3q$i^e;Dkor{xroVst+3DxqSd=O#z&9=+C?$9R7dli1?Q)#{kIma}(bVX8{ zgVL6BkIET3>rfs43Rjdx`rtJ!`M94wXTSJ;#^WBL#F`t`ZbXIicbD4cq04pP52XfR@J==NIk9D)BJ*-gY+<};(!Met~hL(BXGmK6g z%>K>`$|w}mtnJpnre}N`pgP6=6>$T%`F1p1kPq95$#i~r9Dm31wxD%UJ-%vG^eB2^ 
zg8*;THlM+wGAjD{P4IPbrk9TSAU0mKIJd32xydfo4cO3xF4~Xcp5YvjL8JB&HHv}x zSEL!aExmB5-TUk2+4U=lbgjW1o_)}1IIp~3IR3*QPc<@h*ov@sKJP`%+K6sdV+yB= zaV<3Xg{2~j9e!^1lU8ZCiOlrT9~b#Z;|F8cpry+-> zxw)9*tjYKD9T(j#2KGS_N@7_#w0n6^!*0o<#D*okVqtG6L4?+~o({?$bXp(wSG6kesfBie8}?{AE=yIB%{se8$%y>xk+ouRF*x zxzN0?etX3wJJcAH^k^;VHf3!$ooKVqABrUJ*8{>xS)%`h%E5tUC=1=$ej3WR#3#@{ z5#8HvE%^2x;nkwV8ukKnG%ntF)))FPQB0m#btM4z9@`~z;G8Uq`xXpab~h(;y(`5^ z&se?Ow}c!-$-<8$$!QyQ9T@)t*4yIf%kU+wjZ@#eGtZoCt#xut<|0bwH?EpJX=>Nt ziI!29+ps#y_!yv0aMZu`%~kg6-du{33l*Q#%c}EW@3UmjRu1(3P><(TttNT*;iVqT z4ZG3m%JbL(bnsM>cg)O`4-Sn$74M3G5RRwH>-;xgrR!}82yRA2Yjl&5(N-yR<4 zKxdie<@i2&HS<%}XhV#7>s&%wxZX#9JPHs^6k!Xl%g20Yvi2&0(^Rt@pB;mGzy9O6 z;**EBh8)$vEhEc9RjHga61p{s-R+zm~f3u(k@5;btUpuDo@guYjJr=JW7v z)C|FCTW8`mO0*#2+CS@#V&^-UC*o;$kPV%bR_=ztgA1L%R3IDgT!O`JHu{prWn6YP0=2kL5T z`ZrtwM9ThKRaMknqUmw_Xmz24kI9;X(YghWNBR?FfHbe!+24m3;w|$#q>Qn!x8V(PEr3#gRs_VXu!( zzrJ~a$+JQ|18|AWm;)sgLL-`{L|NYM9ITOdn_rC4m{R_k1UcoZxlA76x%DwZ`hAn~D5!zmHi3Wa3gzq`^wN_l}z(hfIJzd6$g=%5lG*+Vi;I z|0$UM4ly!y(hK?q5VXgkm2Lsc+08$@J7+}tqNaWJEnRklDObXI=%g`k9(5ZV27QVd zUTN3|mBHV9IsPnBKCDy?xmv4H_%P37aQE_UySL~|VWlqHqKZ%aLQ6Q*x4hH>b%`uw z5k@xiqEm@u6|S+?TtDK*W!KVAj&c%?oO~BGpF$jz|9Dog{Do+~4`PKr!j7Wx$=D;| zxYtqva*tOJSB$PoW!*jtA&kj{r1SN(ENsq~RSCCh|7IssEOY3Zj9jIwfrqbnP93|x z0xN#iWHZbT>V8nOcCOXJRbV}lnJY05AF(G}ZR)O808u8B9eS#$U(V%rvR`u< zn_(c(&mU=R?5!~sg3+c4=su&^q!CSkDXRZZFWG-~Y5$+R4y#0|-3JK*&M5hCKW-=S z{?5(q?$D_%O}g_w=#jS8K1ee?^CeRvI~6tFDM35Mh=$G%)p72FhRZVcK}#|);v9U~ zZ8d5iBwV|h#cVDBn$BxRr>5ws1Ov;K|KBg$-%I!R>+q#(;(iyt$6e=q%jPy`dE{YT zsK;@2qm`X}=Z574*P|6z^2v5UUjhq}^4}}+uP=wC%@?ody&+f5R73Jk1(6ibK zt92Ze-&`+{ABQ2UKJSCBImZ^572kXIVeA$T)Li}W+b3F-3xJG%M85Sac(MEPa&y^4 zD|2w5ogcIGEQAXT(fgOqC$KXl$`CobFkal!uB z>At76$CnZrH?oZ1q!2H?Rr@Wbny>Zyw1w{f`g;=dOX3**mI6Zy(*Yq^d0BQxz~#Mr z_Ca39>D>TQUg<=8kb5HqvXXQ65e4%Fj#31(o{_gB|LQvX6fcHtYrCm2+pl9UXP+G@aCGDtqRS2`qjA*1|Ayi$) zM5ZWux;@+lFe2P_<>}PFPjmcNR{lG*bqimRHC!j;n|$&S(qX)ks(IO+dSO~(1m_^3 zTST`a9z5_i>Pw{m#_Jo<965pzd}2=$IZ<4;!hCCu)4hXEVAJf?kinuv$^FNwA3XM<0yCxsR}OyZ;o5DO*L4-^OTiKM21owLNkRTrcv~!iAAs%H2gA3- zgY$9MXG`1&xJY29@3Q>vi2QFS;@`iw1bmWX6TaImRFR-H8HiC;1ebN2Mq4L(9|ZcN zi(;&R+RV@~a=Im&Ih87Y#=}X%`?)vOF3mgnNkM*z$Zqrv>%CV-87Py-4w2nODv0Zz7vmMOj976r1EF zkDePHKl7z%*;}jhOMqPpRKQBZ|btxL}47@ zvLK9sD3Nx1Yvvy=44_dod^o-}S!DC)C;aU~$H6qia@4ZIr!H^9ghSaj9H8ki!X>`> zI&C5>SdQ(OZ@l4Vd9EJ7Kn-!pkdFRgB`khF`oJZEz$J*Dm9S~k$BR!)dm0AxV}0&Z zJ#Vf|JN2SIXpCGx7<)2O*-@-*cW7B+%>FX<9p)=E=zp;H-eFC(-MTM|ilTrbML~#C z1PMi{O3Q1Z3lRh&HPWT`4vB(*)CdSDRXRusy+foUMWjZWK&Vm^Y9PfueZTeXwZ669 zwf1+Nz0bbR+2{Phl{OPHGv|Ds@!aEgkD>HlOnEq64<&V$|F{|iRjf93=oZ7{zcQ%SA@h)1Ay$EA_DZ>-3+ex(2L~>#z(%6 zHA&3}s>kWPpS80xH?r+#_inFdMNo>5fHC~vF?`K7bBApV$OTBasQkT;CPotqxs@gw zB43_ovX0%(WDPercKzJtTS{60ZM5TNgVV1Ccgp4!+7wJ^xYYDt-(^$s*0+)~`2)O@ zx|_Oyf-+;6pWEUVUZv@NYR*QRG{w|Cx9d&$l)Y7&KLDIb{x&zTJ{q4*B?wFXY6=xn>xSQ5vNfgdic4`bqMI2S@v= zxuN1JA1dzVGR}EU%FsXyS&t;i4?2p0`ga1~OEo5sXYFUJTZ}vvqb*0N`TOaPj!51e z1kfyH9Gd!%YnA-&YjHj0k8*#@uU7%Rk;e1p%d7gu8_u3rJ8LqW`A0h}`BY8H_nv4X zA_Fc^q8tUvs|m2qrTG3NW~H3MJDaB2o1%;x=`Dt_DBQhXRI?<7vo-z1uqCy>vK&uM zJKG=4q2hRvwlTH}!G~Rxe~6InsBA!8Ak%Zg)idG<@jST#(9U#TOCa$}RMu$u-J4nDNixn0 zStOa8BeYKO%@|~5J;o3+4y5rJ;Tv1kgQ2lE!%7N7MOZc~>UfC{0PulLA!1%Z*RqUU zKDL(zy+68D#YXa%elD%1U7@3F&> z3K?=LD5NE6_E<^5XIdqq@S#|Te$J*~o2KG`QkCa(+rshJR+aX<3WGQxujW1vrwNc7 zm$%8bKdhX#y(8SB*QWgCy67k6C$~%)?=NE%*AE2_fT1Lgb%E)44|FUA#!t3zwnZJX zqdt!oMimb5%iY|pj$iegR(hSUTx_9fV6<;@e;!TmlhkGSc8CnlGq$VE)i}~8e7JIb zZ6GE>{0#J3#A<*MJ|72pC1Qagc!!nl^%)qM5AgX$x~}G8h0bl9`?50J(oT8h$#9eD zqEDk%UF>D;hT1BBPjlKTSIHKqq1n@KQ(>bS)h(I124p${Lrb=haBb+MoQJuLnqXZt 
zKDh9PZh7|~3}&f5A>r*dWxbh7cV$=uowtqjMl=pqd)qC)rC-8btH$CXtgBi7+;D&% zA=FJiv(rJF8kfhEq%#QVq`tdeId&=zsP-SOiDvRxz{M9#*FTL z1-PN|*5uj=_`8+Vm1}+hL(i;yYdASIrsl14xk@8D%<|QZ6IkB^GtlaLU={RVh$(-2 zjXMQzLsicRg{&XQQtZ)RpuvzGUGkk@H1Kllx(VVcE}t|bhNl)EJH!*+_JPY7BdYN6 zKa79>emwmD$<2(kGkvvGd6m7=xQabhatY4>}$O37_8!(SqXa&zbW?TL8$G)4rp0H_zI< zb{Kv|1b{)_HxjBVtQrc1NE6I4?Tu^Uy-P84ze7eMbn;aoSZw2s1w<&G$ zp3)HkS<74v(OoG@Z095EhT_$J-^MFz^0REPvjen*IzTvxzsQO>#bP<6L?fzsRq~y- z*|c%8xUC?-M7X{9l9LsAy7No!v~876nX%V`@|q*dGf_ED(fP+&zj=x}?O5QB_AXOT zZco7~0D^VA)Aa!8C*a0cvR)GS8ppRjNLmgrfq*-W(VWo$_LP;<5Iy=#R5n7hExIj| z5jUs(N)7lNrJ(s&%BQcgSU$P_Is0T1PgI1c+__iELmIKlO6=#)Rq00Z6&_le?cMwU zJY>_~As{w8f>*oAc2;AaN~GF&Cen-91zo}Xeo{ow7{_v1vlu_wNJ@~vSOr@(P(Rd6rC(S;Wk10%b>2N~Wk+K*Qy3f<3=MjCK6(ZI9>y-6yrm;cdsE9E7aJqJMIjbj|Co%J5mCkQ8|KIMS z^@6hIwak9S2y0D#MQwLx9m}VJzk!r3mXyM(--m0nEBj0^FwQXQINILTzGZ{p=gMhWE{CQQw}8V&rr?FRzK+kc3OJ668j8L+rDnd zCd{1D3zdbUOlrdzCuJN=-ZkcDRWELgjO9)M#gs4|ufyPns?IAB-%wq$5|jl#0APEZ zPoFXpsA%=OoUoJ|Pt)lRZ0y@&YCKGp`{-*yC8;*+0fDNaHbx zVDFs)e(LA$!N2;Ow88M7%ZxDui4M9m0&dZt)#UecJVwM(s_w+0%cMRNpX0^0#gsL&s z(L+8^_27+VtpxvgvS|%@ZMv51Z~F4^4vR=E&)ndxNCUKLt^uFXLvNUXjD$R9 z`wrx5(YW*F2?iYn@%3&t+gMxL-oR%e0C_LvjoUO|;gNa9a&Uie*$pYQELfbY4=v0J zdQ=IOYCnqa8NRdC5q~_Yt8MUS^7Ze-ZhoQdfniIAd#;^JNoVdN^uo~H-X^z`#rfZZ zBgkUJR{SwE6m&@nNyl#MxM{3V6?M1$?P$2k13Qj8+6xk$o z(fQ(w*fPtn3gVWqgTP7bucKeM z^-P9%YZ>=YBPwpWf#;nkEDoX+g@c$vsrTx?R-fD~Z+N@w&4Jo?f*R<0K)PhfpA$)m zE#S)`3AgH0kLmSB7y82m3s%1r-_yJoE|_OIRQKO!i-Iy^PnMasB$upK_Z{zp*&&M& zs_=wex#*d~ya!vZ&-bkg(Qxlsrng{$YrYf@j09+3#l@@Qk(0 z;v?3zp3~Ykq!OaM4VdqpfJELVgS!TK{cauDAb=6c*J48?FN!0Obw|FtzE|Q5aF^vM z_+3myT*SSQ-hg0|wa*ImQbtI2yVtNo6G0Mpfj0QS$Ca-mP^&|+-+o8AC5uuT;13jW zP3mlH79ZC<;O2yusRXxv(E2&G-n0SPqCfI#!GbftD8ao8cZt2JV<13x@t64w4_Uxr zJM6_6jO$U*R~%dWN!x+xJedhl^Nrq*2Veco8$anjiF{Y1B7c2xSY;YhZqt7Hn>@cO zfnB$Ds$jjN)9#sDqN>VuuU+HITc2_~(Y1D*a$rrxuMCP^tU;3#Y=YVMtFOGOA zIYU!)mlf}ms!=E-GkUD9>&ufB`r99IvV2)`3GkSljG{Mg?G=xR2kzVh_Ws?SrwMlz zM~?-U8Ff1RLNh_fMIT_R8;}xX6ns3JRO<#R9NZnS8iJ;U3CW%!=r=Vx(FOumsv@-o zE-S?0yIBBmJqb?8%BW6q^_Tj-4)02C6g~MUfArSinx-`rCt|lCxAg(2%n7HQqVLS+ zi7yfuwL4$V7uP4@lrS+h77noHKV|4O#{d>v`|+2Y{9aKrQV8IX%gPWey(_`^I>Hes zul-MC!T-yKKlMOJqpjXvx7igvItheOUh^7tf1B7Bbry9!{vGuVu&BSoA0Q4I0=V#1 z;1rKn5d`ddAAmsYUxI_He-D-3xkbK@i|6^E1iS!o$(hjvzIT0?Q-01X#15F^PB~jp z09*`OB1$}@8W+%Eu*|L<1{5x8_KQZ(-XTNvCAIqNkbcg1N0ny={)0Wu^Rr>(Qvif` zgOFI^-vyb2Q2dtuhMT_mIT*QLIs6EOKcQV4DK#=Y$CM>Tgxyo(9yaOjjqW0d*3CFg z;pqX4c)Mh6`w-1fzJ^*d@X%QaZ0o5ytgQg%=DJvFsir<(eEa}Es|=w2|F7wMWI4Pz ztgRG;Tl_Oz4Qqi@gU_l6PVw#`3CT04!*0;~!S#sM`E<_LyVqkapU$ZA?x9Z=X4a)o zko$Z&r1y31Q@}pBjHd$kJrvd=cKXDwtmzYt-qs2#AoR)IL>SxxS0c1~cKg6vA!iOb zYXXsnJqCAvP)|`&Q{pLmepT!Ks*;GE^*`#g|CxUKkI%cbD1>>XoT4lwfr4qNtV>dx z=_0{{-<uNguxDgDY-k$ zSiQiYEO(aiksZR;%8U%O0P9Bt(-vNLho2>450=LSY)b-8eC)Od-4I1j^~ZyLT1Gj)&a+?ZpS}sYu^0$OP}1mK%ipjts7VhxP1=cgCHhM4}9M zy{0Z$?%!gbXJ2E1F~OCBd*+$^O3`>lBzxLd(@ha}+zY$p()?1p%WIPZqqVh3!lnj! 
zZaZ?^OEGH2$AV~I-gXCImrwovfmum|_;&5OF)F9eQ@uw1{Knzp+_&kuKTZiFFApHt ziO7ee0h*=HeEe144EPreT)QN=wR2lL;}3~Ae@6Sm!9RvY$dF9iNrudR4#z5KAm3I` ze+vE$HbA(Lkjb4t`Cd-{bCs{MMnm^E#IDtq5!(w$h;nEkdL z*_%?*^c0w3#JZxjCS_*NgtJIrI<}(wAb{iZ0)_b2i*VchJQ&?`Pj0H>*v{Nl0MaTB z0ncF@85@Q3aH10R{EI;zhq`e?%?D8AU7Q55qY(bqkP+pH<8p_7b5M4Y0>3E%f-Hp1 zSxg~%wi&EhJyg}Gz}@L(^sN2bC9$kWnYxGvE&gx7hL^(JSHSqJH_?gozl{_aAXc>-oVIFEtvN&5@mi~thXjGmN~e~K)$tGNYLkrL;(U^CNejh(Rdim5r?!X^yb2dY z^i80;lcirbPFga(kDpFne63x?em=;5{6|Qp9XIcoNL8Amw3sJP7sys9w^@~r^-fSs zS7phqRqLs#L0_?n%_NVoRM<4~1Yi|mc-m3xVxKNmlQ`Kg>^>#0Bhc^0bUnBWJOwbH z+Y{=uDyp8&98;5aLNbL{1E&|RT?B7aKzlw#Oo0$HZt7v6$)R26xB-h_Q zw&vLveeTz=QA{PID-AmTi{_0aU=Dt4Zo<91-k&yOkaKlzfew_bL&-*5_z@t7Ev?@5 zbmUJg^cIL`VKj*KdwHMs2s>~I>gsqseT-ZWOE8;ktcsIQGvhs0`{r3BcJZh5aaStU zs=0Sdxl3Fz0S%AKI1RBcVmfKd)&PlMJ~)yG9@o)@h3J<$Z#tfkXqkqcE&*Dq%Y-GQ=(gNB-2=IyAdAI$6_OqX zn2hp^v4@N5G2WF!w$(Lj;AEn!6TqMBDyC^ux4IF{pTUTV`f}b&Cg(#%e9@_!=NT9xu9`6rckXmR*G{_qRHM0%j%rlJ(vcP$n<$v3|;C zF%t%Al42G4DV2)1?1!4G634lX3Lh7{(FOyh9nh)r4;K{dJ0-5}ixl+J9Er;Ys+C#` zisie+8uTCC#YsbYo5g^huK`e8TGk}*nGK+Wg9>n#b5yfLrLOSyeSWU?qg!!L@{=L~ zjoyQQuf>U$StKf`f0c5O=E|X?qn>~(5qS^zn})5z$yyF#32f{;C2vZiLltGQ6ENlS%g)y?)LlzYAerS*2Y?~T!EwQWe9vyUpw+A~W4v)w zY(DIjq&I9W5opJ7 z*h1IjFSpw76jx=*%o)3fUJbg!QwtSkJ)2#I?txFi+M}}v?F&L^H@z#|B;s4wwbHC- z-s`s)bgdHYBIwk?A9vgSu1@|Jw~Uc(id-!O=WvInd-?ev6xLtsojR#@D0K{-6Z;TKL=ah)?p`pKhmvz82TdQa9cGXju z%?BcA9vVtMZF``3y^n{Sf0(Owq%3#BH1QIh5-mPo+qh(UTYU;pCRP7(MEVb-6l-jW zy|f=&nslzuXO)v%yw|TG_@2;u4)|-~_hXw!b9S;9X@h*b{9%jG&C4XUHE!Wit0Bpo zq4tjq9GB#{o-uT|ChnDFOllmwt0$bE!G_3-xw_l)mweTDn?nC8>f|}}>m08*15v)C zoOg*9)8%XS6$xOwLAlFsNOw5Xvg3gQV|b0Zm_`EFSaZC8{Oy$`YZX1Y^B2m>0k@Br5Y0 zaSk=uLXR+zk^?To@*Ojw6C1;-`MC(S@$CqCwPIplM>0AxK*_~vU(;^QzOOK)I9_X6 z#kn&aS61Xv{Lsua3GY51Q%gv}g@sbD`1oA%#c}p*9xk|qiZx62LmFA`r!~LBCd0Ne z$jsn5u>?mP(s(DJ)_bqf%G`w+Gv2ll5#rV zh~BoOTwdFI^d+CD+7~OdeTeoOW$GeGKEK2?WcHejp>mPaK~jx))8%%dgt%Mg=A}{j zZ+kxnF+Eflof}ebgoGg!2x4v9*Ir~?Yy!C-Xj0Aqzrd@(!|b=>D|IKrPQi+>yTBlI z>LVo!GtWo>PRP#?W)p|u0Mo8@4N=6i>bvnPQ8qvage1^*SHu}ZOlESMdVdu#nB@Fh zD32EBOccbBKO}V^kC*b1KYPb9@6@TKKrNQPA|{_XJPtzJX6!Im0lXw2<*q&`7~7Pf zKG4tD9|yYc>V<=*r2r6Z>8~c_e=djdA1CtvN50Q@rE~j;j|v1AWj6t~FJ~B*iA@gx zhL||$0-SWx4t}i3FG)u*0Wi5`v%i4;Feo8vJJJ$$B()s^IS{c$9zRZCr4Qc&!dnBD z|McpQqCJ_PiA#()A(8PX4QLlrwTEQSj$OHm5lT6f)7i8Cb>V4ai1er>C5w<1 zwly^!&$~H-_q!n_aI`Z07v9e$;DNHN2%#t-$@VuO7n6dVjBH}6m|kXfd9Bs#AZN3ee}5+DP%_5%48sfHoJR|8{9fll zehvL2JJ>64dz129?2t$MIG$o3R~pu-9#a*OdKr# z(*yILLPr0%Nd4dWen8zbo%Ka<;i)MZ`-eJxsN=;~NT7BpC)^x?#WwDv8Eq$i(P%r% z*8!JuY0>X;oZX(s`3&tzbWCzohocSmW_{Mv-0bCb;rTBk#7ye)WejR4^#JdHCYh#M!9*jQA#;H9$w$Z&Y z64nJ(97&ZrZ=Wng*?ce^XayzYe-Uk~FI2JsT_XA-4hroQEAxq?DxIcR;LXJbsFDsO zSr1l`^n6))pY?0T6(2RfbuV7(4|Bjz{z+m1egq>ga8{BjZ$HUbh2BXRWNSAZbIWpkbV!@JYb@V zzDMwq<+%d1b5W!~V38A=6$r=Ds5Mbr5d+6fA6r&^~;0Pnx? 
zyQ3RL2fYKSkDOGgpBXY{CyPKrv;X$;Nq_>v6=Lw!2o9#BWRF=gJwP=^WUnP4V=4W= zKc2!0c%Sjr)T@ApdMF<`_ZYdL36w`D$M;b~qvZg7+XBeDA)l6zQCns}MY261@E~N~ zF#rG>GP9vTsPP(j8A>!lUQi(qa3HCt1c1z(&}IVuhZJNNs!*S*a(El)Q|Bkex&p%2wy*==W6@HFXgs*G)*9SU(+j)!Wy=3qJX5KYG z6xjLnZYogX@83R9%r!a!O}9^W*Z-7+r6Akpi6%fVPX!c(!txIf0@Z7QV-;47a&Jf4 zZy&KriXBss6gK-gA=1->bOPpefZlvFH?}>wx+Pnf-C?V&&h3*0uK-siobCBBY@2-I z?nvJaj+n38D)~>B+)eDIUuPxJ4ac?MSS`dtrgshPU4WUkpGQgIBit4 zvG7!r@QDday2P!(+dI6p(>J)Q*AbHV=A`HV{vsmuNq20KVV~2ps14B^Fv`>6_8}{g zLV8AApk9VYO;)+6wG7@rpVuDT^_;ICxxIdO(B(8DJK!W?kSq*&{R2d}FzlNYNmJ9n zx70G^NU9atA~Nln!J}M3=b)m!_1S5R&wYD$fKjE-+mJbyLJ<5O?SYFh+&3p0T}&<*a^zvx54i8s86 z4;c4S?~-YtS?G@qy$ZBsdGK}sd&9L+9#D~PPgrx|Ty-jb&a`~-=4Lvr({yyJ1R7Q% z?Cg0iAo9GgsO5T%r08s1kiGtHTe{##*6#M6f8Ps2+VE6QB+9trRRjgwGD>9+;P|${ z?)LKs>j<=uLr_(|E&FXjwnr45@k#&W&wyQL^Bf-D$^J8Ul`352U#P09<~_A-@mWto z=!`(wsd#uiP&vq?OINFg^Te&ECQiR-q}Y;hfuQ4BEwMz;xTO7M1faOqWB`3G&|ydx zkE5lSUJ-Va$;UDW<*)gVQTNzS>Isx0p3B}wcQaifY8c@K!mF9i5hFl6zi2eybzi?$ zHXxtT6Yi~mFLy6e6W-}{d!1V_L0`=>?NQle{fecaTfJ#W zk1+K5t4L~*<~ z=4<1Wc-KBDS^xMB|2Yzv@uVhh+_aHjS-E@Yr4%U>2S;JfkU0w!qtQND!#}oY%W-ey zmnBB*eT7J>Ta0J74gH+Msmx^ak=&7DXOF?lQ_nw$WLWlRk7c>m2Zn2J!qbo)#$ES? z#&(W`pd~+klu9FCZ6b@_r7VS54NM)|%tf;TaVspSN-XRb4g3D~1s^+!Gbty6RLzCq zeV5zf#I3^bUg*AhuM50~K0p%K+ftCh>1?%*Dr!MEu#(a1#AnQU83F@8N*`L*Ko$J< z&~-RQnP>8+F9dRkW)<3t6eqVdK-LSFp6{UT4Lz**QjuSOs8s(R$>MD0L8Md9kUwQ% zxY};yk?@~und(_;loYAaS@?46wF9871y88*1Nw^Cx4JQBdGo+AEd-`^fB!EZN2^6{ zf(yQf>l=JDVC{a=8gqYt0I7h7U(pchf^~di44_=DuEtmgrrRb8?Pb%+869sSA-apC z^?lakYLbaP=E63DaeDG+(xlbbyhE?9F_D~5wTE;^J!jQu`=-zR{3;hg&V{yMwBf^K z{tmZ++o;PzTInSQ>($j0-iqvZT7?y$*17J-vgj2g!0zmbug&fsLa0%8)HmQoPzb2P z`E`>7(iJxdy3f{FVgeQZa_Xg;twZV3yHmW(icG?vUm{i)T&Z4oY@|#5qdQXN?2ZLF ziJPQZx2?gdLGiWsNm{pm`eW+IsbzTVoBd~+T}innKOWr^yTGiPP0-`16m=9W&XYcT z+r*64ja4r@4i1L5lmPuPL(>Yx!csapG^-bFdf8?jUU8`tDy_Ge9Ge3_ZAUADh%&gF z?~YX>_X?y0xm5K%J7QOw+%Cw1Jb8D}bz#vGAxitHoKYa7b*D9g8fA4SN)c9km(pe-K+*)t5t33TeM^^@aea(KY9XJ2kTS?8gt!Kq-tlZ9cbDX`(=^O3;BiGpG7o{N3&JK4CaEK`Ij=3UG*x)Znb9@XwLT?F zR2k}Wg7<)S{;*XnWcKV{t7vpyo2UpjU@~FOK$LMP(H(q2xo+|GWOk_>wv!O{#^-UK zdZd;oe8NNluCX_S;T`RBGkl*g;IC3ojAsfa8@ZmV4}R1VcVldPQYV{mKBB#@wyrwi zV?%gmf?rXgKIn-rdRh)W37^wT`&pLK#TG48Gu>6u#~qmS$dY%~3ydDE0)@h_G|5-| znELLMqEW9?D9e)Mj7|~{=xL`Uo`Q8@*voY}5@k*Z-rsM8d*b3rLqCD4dLGNkxdt*6 zFPk|9)_tZ;UXr{qnRD#M*q$2qb~{IE$aU{!4PV#BFK=TQmDOcj&dNuDI7ma1zGwzM74uR{AM;S|}O6Mf@u3>-9u1nE>aHy0QT8W6X6`5)9Ra^<) zKlf=ZM3Za754r&aRQwWv4p8|E4&1jmHg!c;dRl< zyIqz0`P^`?<3`^pbmrkZ@y-jp?YrB7$ULN+By4+@%m?U_m{k!#yO||96VP1{RFWxo zGqDqwf~Id0w~aGdmZ_aIxBlK@_Mqva<+R&f1^dHN#)XOHu@&WKHqZpq%^9bJ8kO#4 z-v{ljYFdX#?9(rsTTYJrmnYTnJxr~5ML$41I5`=W=KzMuIz zY#ZS0EqQjgO>#`d>hxR4r1xoow~S|8u_tSg&M9EYA$fkkx5~8)?rx!fe7tA!#w&3> zYB$1tp{`qz+I1kiIe^^Sb#zXgs|dws7%U$deVrNX2a=YyL@vCM5pe{UojsxIu-A9J zV%C&(nkOptwDz|iqTGFjaCus|>w9Y!mny3_9U6(3U=BM4LkVhv_f0B0#{dN+NbN4* zZ2)S%&R-^|1#do7Zr(v+4t%Al{;u$P)04#ec!1kZFDKfl1x>0ZiJLw?l_c`2Pkx|> z3x7Mij1@u_MZQKlIZ6`yRN3iWbbqv+@w0Gsu$j7t?p!DK$;QWho?E(GTC#q~w|SVM z&3HvyV{%k2Bj`eLlyuplS_gjlec@hbVE!4O;IOa! 
zLPCtJJf$OJh6h7GEA2f{Vv=9-L)BqT@TUW-fIWtW5Y)0ni9i7T_li1dk(a7_0;V90 zJ>k|-;wZm%uialX#;^-?haaGJL_3TN#2R!VKZHt0G@tD|Cq+I(eHz6k<6x;1*kQ%K zdKzJc>++T7)}2?I)B}n2c@x0_gZ^PI5v4pklYnp;`<-k^Ov@tk;q_)C_upFIiDj*B zNR{er53acF&v?#a;{M&corqfj%EZD2=Lo;Pb#Bp74g@s8>2*od`3VcuUC`4+g)is= zj1&6GU?GvDjyD!X^kq$+n{jWgYq-Z3LyUe(H%*0Hz%qr%3#c%>3YE=q*3YZmof{5{NQZjj5Ta{i!_yLp~L-9&avT>J?+j%lkWY{~QBxyk|n6|5VpPTP{+)lo?GRzrn1HF?8)0F(DK9!%|Qt#8%c)|Ui$ddlVskM z+j2$OgF-tU-QA9m5Ef?R_}(>lWc5`Y_+BpNmo4JD*z|ki=tp>htb-%-9e9 z;E$VHnsP`lt$~2F%~e!$_S5w`_CbQ!04V64Hr$2U#+FzMVwRe4x}j#uCH#Kk-JJ*4 zQOr4ee6F2HC090=3Nov)1xkje8EsGAR|KC9Js7i4Kd8q86EBwJHZf7#1_pP%xcOYQu6YGlc<3-$CWT>MzV2DbAyck#DSPlAEH{jJ$W3q zh2R;2t0Be@5^Ip$WF^*|TPMXQ>UE!dtTa-7Hs>t>lsewQg#4h3$El$9;HPrCC(aZ~ zcPDX1C`VYj(YaS^(UpQHTWuuQ=5D;#H7<8~8t72sy>5|MdbO)EtJ=GE!QxfP6w`TP zlJ5PRA97}93a_5I!uR&ldcqluLjV(=X`ZQ_^OAI0w$T^)^9>p#eC5DS3_Kv%0YN&6 zvTH63KoG$>8VCNEtG2nXrruazei?W!(uMU%6M4l--|3zgqaO1*dNH0nm=(q#ScLrq z#KS{G3uNw^pYE*9n8?k_BvQQcQ0?Mpj%@fhM0}}_08TXOP*!Os>5a#ib1L6;aPkWY zo{O}>J!Z*7+JhpX`VAA~3wQEkqOYCs^J**Vw{jFkC;M&I3mDO*gHC&AZtF_xU~YT5 zuy!XdJE(KWBX1_L7Phgc3C~-;vc;7k>r`^jop-mTC6*G4S)3z_?6v%&8NRi?ymqW| z02E$}560IHlYY^t=A#>Nz%mn1I@YIw6?3BRRcBHsgdo;Wxl75#EQa{BQ=93i(duMr z^mN9g5mgly1zG=wTpdNX+mRPqnU2`?f6;)bQrSw8x@up6BkCa1bGm-jMYU%` z6X==e_THWNsE74vsb?R-wkIdlU&^2*c?67r!_ccX#sE~)#lCvMl-PdUUuFKKG zbhc+Pj_BORfWaO=Qv3mc4aa>zBfkUk$6gl4uJZ&Bsm;&3OU?+zuIYFhGa<@G*C<$i zu7?w0b95hEcY#7Db$pn)uE;Pd5XZzY`^>nX-Mu8IT<$d8RUUCIaewRpF~4&-Q$6ol zfpk9WHeAVj@`uUSEl{8C5(k;2fO-#c5n2_dfT{*_46-i^c1Vc^Kt%6&yGOj#Mdp>Lr};PCwI_MTq-YW48XNaJ zDpfCM4a_cdgp}sb@w4vJg$EeS4FD%cN@ud&lnK#m>RI@72kV^RgW{c=ichN}@A3vI z-`|FpacqZG!C78-`!SX{at(x##m;9wxvyeu>Dsz-Jti!f0u66c^EGc*&%0VT7Ln9x zUDHsVHrcQdK&9n9Ape9uYjnimkaBx4CEPZ{4_Vo zN_nXl=5emR)b)HikA0O8!K#h6h{NrPq|e&q`^5HaTWneQ9_4N|WwLBKPcDE}pDw86 zhSK%@Q`mJ_mx;X+9@U1rY|oKarDJ6Dy*K=F);nJ-Ze6vYJE30M!x#lz#zl)Hi^Au( zh7RJ>8YZz`CS0@A{`U}%*M|~p)xY?9K~DdkzpfBY{yk;z?{1kiiGcjY4{ZnSleiGo z*v0(Tj3Xh$6+m+SzP?|Ixj=YI*-9cJ^#i5V09_SGM8T7|0U7##4fpqd*5??OM}BRo z4I}OFP%Y{6HcBN+{fjRh-Bw{gWD#4uqcfz02$eQhk-#5r}_$w;c*oJ4L zMw+#KnxaoB&#qI}RLf1q`t$UQmar)gZ6)=Jq>imiK}OgAs)Wmb|NsB)bolG%g>L1A z&RmFZ=n8JUNVon6^N?}Ec9ch>LrI-|5@==~DM2~y=AAA!R$5WUTKkYP6c5yyeqWqx zC|IpoId(JsyvMF6-!tsk8`}L+x9!eJr5gR4_MxEZYX}=qo_+69qJ0>!fi35sB=36i zv;1!pPqsXG4N>z=+>DhJl4EzAme!_N)|I#2+QcynPEkr`bH9tMlFP zR!THKRM4GqMy+mGThGiG$kv(v$szt16@vaxd!4&NHN&-+!YI@Wv=TwCZCSyy%c#{_ z5#F?Vo&#onrW(X@$k%*0vwh_7Cgvj3E^C;QqBl?%UbzK~RWW*rD8S6?FtM*&6h|Nm z!0)sE>_njQOx~tAro{kog8ZV{|JmM^j5#!i;4f39dWKZ0l3d+*(ZCQr1<0J~F2E8= z1W?tr0nn!gcvgNMD0>1(&k>~$A`rtS?Xg_0guHUf_rJl(w*jA$!UYQ|?JpW^sZNCH zpBvmEc-8r-XRk&+k%^GPu6w8q=_1uXXQ6CG7nYP8*C+e-o8s{fNTGt`5c2O{DahZw zQnVt{t=X>i_q%n(Z)^Ui&58fDz9K{sVkb6{em>GaBOJo$@Y?&*R+4skh)=9v6+v*d zy{LfHz6_G~MmWRvwBa9+)zFOvN(ld8$Lxv8g~x9{NO9gxqAGmfPnQDi^k=ltP0OR+ zrD0SOtc`RED4Q%WO+{1%` zA#I?OZtb{h(mxmo#f7#wpHV)YRAFx-mx@)=n5>N6R`!lt1`8k=X`X+-7!?XEd~)2* z@@BGriE8btaufk_Umi0<6y_l~E%htB+Sm^}5pa&6xT;b>;7ZTE(KhY%Y(?Oez;g;H zv)yer0M>2pGLkE5kiIEYo(yyo5iE5FSeic#|FjF3Dfb_?3Q$u?z5ytBXw&fICw*Yv<_Pcl^rP!+BKq0MnqoOjN`M7p#Yr`9}x zc~-cb21-fL5kJgh$ej#E-=ecpHWeRIqqeDo0d z9rsU(*?rTSpC9ateLrRbjEz>T1af|3zZ$@#tNaQ2|9KdbeTQi^60h8ok<{|znH}U^J_ZNgfJx^Q&jn|qn@PpT@^s6aJ+|8X? 
ze`;PI4d#M98HsVcX$jxje@f&A*Nl*D&*K!-#e{+42lwC|o}d!nukK35LcMaA1#kKf zG>#; zmj>T-KFvYPSZoGrJRR|{zgn$ex&{Voc+T*R+>Mdcp84?RWXm;TKFRLI+Z8Hn3e{|@`v>7-in|77}YlJ$-}0p%#@Jo`rHU- zPGac!XFg&KL6Q}hD6`^!MC8xf)06z?58I$L0k^Af`#eo6D457_m~cpBE;_0IQ38Eo#bGrVPppwT69W*THLdRB4vMF4Db zC5o)X^g4jQn3%r6DOsde>VkaJ6_S9QdqksHO=NX-m73|Og^5}wTibXk7`$JM-9o09Evy zop5DUi8~115DjUC(=|p_;QX_6_BN@W=yrhOTd!5*$D;T9 z(m4*S-<7(tl&^x<(33zi{yu>T<>Udr`T=O`kz#h`DAa!82_o`{Ak?DzcN(0`f_E0G{q}ta4ALjP zVL*`i7C0{fTI<0dYp$B&U9xrL)f0QL^Emh$fF&wcHqp^xk)C~LlYF#mN~u7w*y>20 z@qf9%`zCx5!K%)%@#vG1(;(XJ>ellb&D}bseF4^^X58SEk#ag9sJ}L+`poiPjRcuy zwLAwZ&RP+Ib-ZKR4V|x-e-_MMf`r&}Fud<$WkGzIB|pIHzgNYKJ3hZ|xt#z*bzm}@8j*43v z;Mj2UyUryZEvM_hZ3e({zu;KJp4XnWroDG(B|YI0f5``ij8|lf!j~QcJj*M99y~Cz z3PumL&;{s@IkDZ7zf-{XSfyViXBBFiK)Q_225@rZWaC)i#!#wY0HOcuoK2 z^bC9RMv?dzUEt(rkz9mUku3z2C~yFdH`0waNj`*{6K`bl(qdC#-(B_dK|YMi+)9fztejT^Oa4RWFLC zKU72*T`ImKaP}1=nXqVnuG_Mn%575yJU|J5q+ zZR*LuTXy#L-i>uJ<98B21Ths|dqOvfEME!D8>GaP9DtcnhjvF^5rlrT|Kxs63 zWBZW|Y`q#&N5iM04JtNR0>U!`?nVa4E=evIH;%;RK2$n74SJda*(J$;5omL zJ1B6c$m=7pkw^NAe^#J@&99JINei$fx)8+^KQHgzqE3ymLh;exCY8VME816Ns1D{L zrq(z@664L>KiNGMsm@J>N)Fezsa&%&m_R+J;lmPINsj^yn~6(={LTy((Y~|QDOl+O zig{(^B_m~(Pi>zVWow{*bs~=j*#JdDhcTrU!93m+h@-H#a$gG@9Ftn_5q69(Pn>TP zg<73jOmDx5sQ_SLMh~&qa7KAK{7Ab-G^Bmp)iB%QjI#Cz{^yVgnD8cbgo6ygQb8I4 z%)ofsi0eWVQed^S`93$9czVi?jI^6|>aWfnJ#p^YuzUy~QnHfyRuzOv&man6H@>G| zAXH~;o|rc>NmhyEW1Fn#!U)SIm+I)y(HJY|L-vNQ#3!**&cogZ*bw#Rx1%u^}g6GYSAq1a!7pruPC0oJpLZ9g!y3iAgl>j@pj-pk%GiJFRH5sb z#cLyvB*fkTQbC{AFPf2r5~N@P1;Dpjm44AI_P6i?i+vHfUo=|!)#NPwvhN>Gn~p#h z`E7xgKo$V67NT7=X+wTp0$DFc9$zeChSb>!X4UuosndimFaP-Z!>F-d=j?T5Up_DT zHxc6f{XR5|Y6?Pj*9T;YsQ9LC7xzWO8J#B^tm0VdOR|mN6@e6 zk_o64PQ2BagsGA)A^6V$Zi1I&GX$1;9nV+(M_jMFaD!zpqTc`|JX9fXxj%VXKtNy* zSNY=Oe<-ACDnigWh#-RSx~GX53nGw*b>Jq$rck3A__j#Zjb+~Ho8Wl}pw_kmIF(H_ zGfo3e{~~!ioIY?O@ob66?5cJA02#+J{2cLV2p+~hi#O;Qw>FWPzv1(0J`XEh&b?Oi zn2xe%|M44guA+sxE)53o9~Z=`i2l{-|Bb!(j%w;_*F`~8EC`|^QUd~lARU6KRR`F4B7k>5x!EnrHg$z0du=Z|{5d8F$=q#@=__ zKQfS&#d57V=bCSOp6AU2VrE$#Cjg$gtzf6rjFOjyVIcra;R#O4JLIp+7h+;=;&8bXZuB_$_CUV(Qh0E7kvp5Q`zgh+oS$7_U-?Jz zZ;@1da+HD$n2E-kytn>v|J3JivrX%JhMGf{RDW9G2TNB9C5&Qe_+P<$yf#6?=!7U? 
zAFtAEDDHn}$Ogw6GUugXcf&|@qjY&GettQbQ3cnkZ+Pb@Y|`Xy^B6^58{PqHikZM3 z`5vc%wujQP2zFGI%T#EW~N({_}kW;C`&=e64} zIyGKY{(d893KP==4V{H>5vtU1TMuWYIaWkZb2l35AkR=usiM>wNF{yCLB@<8{>Avo zWO3*U`!y%tGrk3JOsK2GaDiu=aBed*vnRa*J%hzk{qaAxT2kKc^Jl0PX$iike)SZa zs5*GqpW|JE5wwrinmsz^_=<4F(iM5;iR8fsC9cxvf(Sf(+{*d~{kX#Vv)+PKDLT9~ zIi?R5M=wq(^M=i{2KxIwA3>bN-~2Eyq;;$}&;0(&LZ*9|_R3tA>|fcwO$VkodT7c<&12Bf1Y8h{WlF=Z2iktLRs{+5>>8^iBEqzSTR|ymLDg zpNck%#$p&@nq7k$fgX?nMV5K!F{GVK2)4pD1Uh$vU|kqcz$={mGC?fS$Ci)bb;gY} z+cb%!hYjOxC+ll6!Q1mEqt%QT3zNd0_V&=n8&rwK<=G7aG4Cn8KHvT7QGk5 z-L6ZuwIA9vLAEkUiBXl&&0^6IRstei37pxXc#jiOW`vol$K@zUhZ&-SW zze4c7=9y*f-D)!Nz5{0HGJjxg?Gt7 zdI!DW8{z+|8|nVeaEArwa?nP}LY#!1A&9k%NUpj)<&xC4ssADjeXY!UjmPhO3dc0o z0g%#u(-4GZ8c`8CCIrj*#p|qMo`{D19z4yHHT`eC`)U#4J z09-;IEyzzKdsZJ z!p_iaX(BFAhoCrpMGyUI3{GWZ5L*6XQS?#48hF2oBV;*oFP$B1M0J5 zq`P>?0<01F`nCmW-ZoNdn=$+5Ui8|WfsY=tSLm)p2Ve{7!mq)>Rt{{ZWtTkaDih0- z={)jYs7BpsfD5?_s?lwZN3Tcf?xVen468wYHGgt#Jbl3>V{OYY!^h~*hcn4?UpmIb z{bD!LfC>RdF$&R!;Ej74*|usa!s%RJT3^LUY{wm0Qi@)a6 zXl$I7b~uVbrr-NPx8aH#yoCkczyCz{0VY(h&*j>e`YQSCjOm9FdP;>W_v!Ye+`*pE z*az=U{@zOvrr)H+Rc#Pq(N;0@+@Y5*eaif$c=tTgegCeF((aACNy9DV#xlvwpNsP8 zMGW*iF=kA<{bIoMTWKK1R`VlB;V%Svi02UeNxH0L!97 zrM795f6yM>vMjIosCxJXi*hITUHugfdw-t9;TPG@G!Yc2t1nY*s*J)#>b zvS=3<{&7;yX6;@q%f5Im$_3sF;Q}N53?uB|as$d(cQMyaOPrT~P~d~sd&chD952po z=m#FI>t(fg5Sg;D??nBmX9P&k91kyO3O2KSwnW4kkBv=aG5uu#E)#AMu-=+uRzA@yrl} z3}fEoSjjAH2SGSWv;9IkhYH^MeJlCF0jnTnD(Pw;1F6w&@mV)gWzet*RgNZKqMLAY6kLfS7pwUqx*YPKc@BG*wQ> zaGazs0XbV7}1 z8xI*8*Aep2ZLXh2@W(e~}~DW>UXT3~CfpPI@IsYp&95U8)?^8ovnvi8*bS=xQ{ za_xc_nWc8#D?ImuIsuY>f0iO2TLqvcsln0R%&)2-$zY$lZ^JoxiN`|os&2LBWeR1_XO(Ty}l@KcJRHt^H(%EjczIHG#!IZheeWP4s; z{K>e`d68GYY2JW#u`&L@d;piFt88lhz^@*AI1*R2k6y$?lsvd;Nn*jFH}y%k3Bz~k zLzM(pWG4|FZHRoH=Y?jJ6v#dM|f};d36S2k}H_j^L)umvA2r-VqjX z_jWj9-b~-Z42aIl-qz&JXrbI6@2cTLJ*f?~V3JC|^$S`iEV83}0Lp7YIRwdh&7r&? zjNC-S9MkzpiI$VOr=R7T)pF8aJFxj`;a4sdp!Qtbrfo(C_D z9@O!P&5lO8q;7wO(EFul6$4pumwf47*m@8ubs{aQDGG-FOx;9ELvgz<6MWQA4i_?u z5fUHKJ&c{r25mcf!$5OsdOf^z-B*)$7bYGDKPuJfdlbU?{Cto^&W`M*qbnYevmn$c zp#6_opCsJ|Bu)!Cj{eSlpN|la?UwQVPlRq*Kwxj{C`~$Q>ydwk zoPtv^NFWQ8<+G$}$>wuPOHpmsuo8Z$PeJi&tJ~uJQW)uhGd*5VdMG5i1op8Z233<; zlObO?6;Q5IUGw8Xyg1|$o~UF066-YRT8bE19BT%?4e{Q5BP#*N`*r$YP|{c+1SFROI@` z!Uj&`8gfzxnTx!UxM3|af5drbI&=VATAVk5w0Ak)6t$VU1|#j;19%GHp3;xZR&DUv zAE3rn?tNe*i58Plk0$DnBt^vDuxvJEeov~=pBLOjs^_1D4f);wkG#|-&{ciP9FJWs zVrri+Tk|VWm`Jgvl1j-vcj1}3Z?ocd%e|aID(o(}#=lpC`7e5mepzMH)aM{3+~8mq z@0fS*{h>WSasm1qujF`PTgNh{l>wcB4shUfL7kl#m{*5MU@ouJiw%6I~C-~Tq?1~-eHGht3l(5aZBt|V$8g0SXA zOLsaElsiYzpL^~tIqv}-6>ppWC!h4+<>T*6f{(fMlcfu2sh=F*hXb;1VH9-l48a2X z5b01$=ZWh-n<-m9$Qz3~Fy@ar`hx!9K!Jv=!RcAXu)7o(-7SN{My#pQ1vt{qUjj6( z@Q6HhRd8B>8AUmXGHST2ZbT3bGDhxFvw*%0v4kYyg^X zY9%Xzym9a^m4Sb3u5|2wm5co^`JITE&WO|#Hiaomt*X+;wRwqGAs?6@i#w0e+2pb; znxNZ|uZed>pkdF_LmfSRcK5`K#MaF{3aGJVxdBC)t7#%@y(V?LC={r}s*(1arX3XV~hqR&TuD+q0n)rLWS4TE()9md$ zy`e&(c%dJn&dx8@XbkXof< z@7`AS3}i*+#QNv`-RIVBAT|{*S@Y*I+|(4S2R?UX_d*}!3#&;;NW?a#8Z<_=^1Mro zGb>(hjtTwB#)=Ra$mSy;Mwneli{jEp4BbCkCX*}&Gt@qfSlzh|amXfN=ejfOhz}@w zwFEpazDvm~Y|aVUE0qMN(6U!|p~r3Kq%rdjwln+P&s6mdL(Q!h>3r(Za zzi9x4iIr-l;B3A;SdJLRo=1gAO-YDvb9N*5T&eoWVw*7hJ=nJ!SAWyIJo%euEN!G` z(4b48QS9htxOWYz+iwqB(c4m>^y}ks_|}99@QvtMElJBodKdoYeH5^l>kin>9!q3ny=jI&lebFKcrD$N)?) 
zpjB7%XM=i~`7ZFC;lB_MCnsTEZlY5IDK;r7QxoysZTu*A$33xwbNJqbfO)1+86E$s zGc|7u6I%D0bzg1iwIU)M7m);Q2pe`qeF%%Y0K=ix35qz<0;~xeo&mwz;scXeD}9=l$)085s- zJVG!*iL<4nTJ6&eihL|@1CEDk>9!%(&dgjeWrlNKFiV9yC0sdxtWQx z`8uRc4}ZlUR_xa*cH6K!-gGUlfhP>$ph}1nE%q|4uoG>*v|($*Y1|>N12X}_u@wGw z<~(5ien(t|o!O#}Im@jWZ`?svT=~#7imb2Y2*AU8_d_Vt$pOv~s7)6hg1LgyEg%%@nUFpT~UT zTHW?-sptEOH;aeJ8NJlg%dS7Jah87+v^juYe17%Yn|Y(>ec2het>etjug%T{)3x*L z_h{y=$-PPNW6aHcqG9Xxd?GuQ)o$@2*U-Blrk^95F)ffW^jv6*n)P6%#@uaGBnG9Eb_Kwj-nT%`;42o@q zN%YF6@VjRUb#W#H(r^b_l&$at2P-4uPCXRYjwn-kGx2t=0 z=>nFVgk%RD08O@mGsR9R_@fTM-`fa1IQ0eaHeR4xT%rMi)%Z^b!Do(G$`!|>%FFXro_lM${fQMUjGeIa?7;0-OIyn>~K-J&Cy_X0rB>zxLde?st>@uBqP4Blocm-pR&( zSVCv7*~9OaigYx0XbhK}1F)o%?M?F;Rp&{?+3K)Cvxb;u_!sZ=1m<`n|4$D4-g_{3`zk zVKFEB@NC96=Gldz_lM{Nq%%_#P(_{3BN%_2Qgg|*SA9d#+GLfvK+9v$ruSBT_Qm95 zmChVQiudytsb0oaUnd7GDHA;s7|v!Pwv|BM*hJ1=&0W5u^=N2USUUV@a|xe7#^RzPc{ekOw8@+5wF6QR9oEAFv}N@fhUOub5|>_<0Y_JJ=Z#3YzlY5l+S$q(srIM@R*PpB5}ie z?)~E{CReQX(mo&$*5Ks0p2l~&1+tjPKfVG~66y?%N-f?FH=R=oeKAaYo)(Asxb#)2 zPa%TK(n2naSGVe;g}X!+<|JB`(2}oF=x>C7yV2$rV7|w6sAHoK2?`YHS>KyJ-+BG1 z`tWk(8hh6O_d6w~K)t{d5W~b~ z3%}bEy_g<9(JDEr;tcQ$>PI`67)r6!krW7IGOpcFGDVejhYl)M_<+*EE0*05M0w*| z1@xJPs`5~3d`x<`F0n=95cwyJ-%hiwK>QeCa>QdBK>t!xt+&wAwD*bZl;)x0;cbn& zN@T5+1aEKlb+Fz^fcWfy<*MUz0flFM_g&O8my4aqJ$6bq-g(C!acQ{ny3th=W3%pV zUs_J5X6hC6FBBVPV}R)t*{lfM-*B z(buv~ccI<;kT4V*g3n*)+pYquNAL1V^xG>Joyj zT|Q5#u!io{#&QYNi~g#!&E^+A`*fe9^O!6>%idNBiOySH==-q0et@T-hYN!iQ%5)U zN1YRlb5YiUSm7Yhb?~$#d9s)=x%Ngk3QcOwsE3m;E{KhFwV9w_VQvz99oZsG){q__eH@h@qDOPIgLt8Vc@aif!4 zi61JzdZ`RsePge%S5J42SvL6JAY-m_gNQZGK#W@73#Q7pX$iXx(yz@}$-RYtZp4Ol z2l>P!UUOg!(RY8-Wb0th7460xhb}%bB_l~rFqeowC280Y%8A+lwl{SRI-m1Igz|&cGN@K+$-ycpY*;hXU)^xo8bFVM|s002boIkt0R_!;7H#wRPzJ ztBF8xb5>N_h0TBt(n(~;qY>=JdC-kf`<%ZF+#S(?+R>HXCLM=T-`$4;@bPb&t7b3) zQqTWDgSrnJ{O8^Nh#3y-LKQ1HE)2RM3KG^?ztHUwM>F)$MR02_U3>E0A1KR4A-=nTS(8ijXn3C+~n!5S^$++G)=dQSo{@YA71}>Wm3)pgdIi{u$ zYT^Rx>GxLXdJ`{OzJ}PiFRuk*eiCeF3I4RR=ja=7uhJ!k?<|V+ed2pj48oS!QnDjH zcmX7!fUJRqzwrq9E<=WO{3B=QQ?l^#3B)E_0O|zwv*Rv?DdwQ}qCvOyXWp0Cz4r)q zV)jJ$%#bT{W)%{*O^)FJ*fioDLSWW^9P!f)AV4_Zp^IyyAll%?o=4aJp;KLxg5GZ$ zzg#M02&Ea1WHkQ>GTKXR75`n5tN$nOlcjQQlsxskF9702**Q&3jC^7koV_cZ>vFWr zA+txVOp-9G$RVc`v-o5W=VwI_?$7)UNcR64A#k;$#8wAykn-&Pf=fjwG(?W6=moi4 zT}Q($XSz+(=#Y4joM&zOUMiSy@s|s=YQfw-pYTC%P$I)Nj+uIeKo;$hct?(F*Gtk^UI zI?Ve={Z$M%k${Z>%n;@CbJ{@xGc3 zV$!nT)1qG-*_0>Ab>V(dVeAB@W7I2k$m!;6=(EJ$Dd=8D%%oyWC-QYwTOx(gfjuC) z*wXEjgR8($ZoiaH%hfk3Dp#zZdh+)-2(M18~%UZ|8G7Ga69rz%4bjH@c>jK_I$<@-M`HE z4HfEpvi9hPRon8WwGZ8COb9g*tr%_=%PuMV$DcE|FA^{HdDina`#!-`etgmvVEOXn zeClO}D8;ZXIp#rbCg}orDE5@n9YWjeLBX=!4h5~A-cUWTWVzTWci9|c_I_ny!j&(A zf*{!ART|+ebxNrqHYxGiY@9U396#7$KIAu_Zg!mtN5{8W4`glEZ21}a>tv+E9)KP2 zhg=XSRWP2N&;49A!=UiYnr7+-QRyQ)?NsX2cSt(ojN#s+mH2EGFhIx8KfFtyFHJbT zfM!yggR(njuqEE+k~r|D+1GQw6l3v?ZyXcDi)y*p)ty}~t1gsX zfmULw?VchS+2!SdKQ99<4?_kG8iX(5i0j|*RWSgpd1(@}n{TeZ`orzKbE%M0#4A0& zGZYufCkzj;t{8%5p_!@L(wf{iYHwvj)5$chE%~=`Z+H%zd`IS601yYLL;7i{3NjPG)DU7)Xk}oY9UL$QSr)k({)<*EY5%`;UysjeW+mKpoBfrPop&;4UMJ4I|6r{Z zF4upb!_SKt{ic>8(t1lX%u`?o{t15w8+=LC|MmS>M83Y;s7{bAmM3CZU4Cj zgJCRTdZc5BjJuQ0f@R)8{XDb3MgTt2=3N}T;Z~$>aU19%P}1pCF@KZO%ANZ!{~VrI zvAwlF*(2TwXqyOctTr0Hm7RXvzPY~K8wsgM4n$mT5XkybGyUzfwx#3M_$cUdZB$s` z9#2))0>3{!Uge;W@OI%ypsX*?<=o0<;&~Ii54-yg?H+t*kf7J`^ej>6J*9BWU< zd48tejqH`vhD#13u9NNXZ7n@_pDSp%^Oc2!2X!(O*O`;l1W@$b;(`No5jze-=&ZcQ zx6DcnRDKK=T*uhU{}IA&{}+>vzKer2&T#pwV@$o34e<)1ao_px0=Zm7(` zH_)MS$q@pYW$RTTTEA(0&uW(e^%?6IS@?~r!Js<-a)FUGYB&z7_oh$xruNSY3nR8IZ1W>_Xw7R%~UYWZr* z@n&l9*$7Wx!8fDe%{cWRfolHk*!9oGdWm=VK3-ZAqT8oC;3K^0{WMOvs$*QGc~c|p 
zfXYtcM>jym7UnOf<0j@WDe(+nJ8!~&GbAEusdUtEa{O?!I7^uZ1UHBswj8#(IsX?6 z-%8XgJ|u+}+^bm{q|n>wop?XXxF6{?uX(KP>aNzC3wMsv7#(*!njuw?<^tx;J;DwC zx5)(&kCv$ab5(|fZ|i$3v&6&AvkOyjx5}aIVi^YiWE}=X<*fRBPBH1qJc5VG zFqg#u?*;iZ8dc8W20vYq}=4%FtF)wAI z#eSh6lC{A`rBcU4UwC?_d(r*MpRYe`{7q9D6SmEw5`N=TFu$!;X@>lPlx)kfTv^D; z0t0#7OT=?0aYM{Bug3U5it+;A4Wztm2L9r8r6y zW2S${^9XO^nN@+ly@E~iR=+zL(~sFD0t<*y78UvVwNk3&aLQ){J>;DEw0G}=&=+1K zeg@N5?!DHLY#k@4Za?IE=SglrudKIVSO&`?Nf!5}=d?c*8_dKWweDRvJ?Cw{71$VL!HIae+GV!*@k7&FWt~>>Iu`{w zpW@-g!!mC0u_iydlhM`;Ih;XEAJM-`I8ursKZ3WTMfML5j0yX!krktL*9$b*(}jom zwZa<8PI1Po=A&VSQjG?r8>EPczuYC{`$w)%|9!B>Vn7@cbIL1^J(iHod)(@L6PNJN zC0ar`0Q}yLY~+@yauyy{aXirGd!KzOMG$ib9ZP%M$^%{ptzd0B3+Ez;78< z!qLzSXl#}lf8T9RjK1q!Pi3>t#1yLv+O(Sgj2bsdo!xovp;tz?MfPKEL^&$ zr)8-**~`6RKwU$=x=WHf2QV2w`3n>53&{`g(r-R6NaW@=XUU{nPKWPrJ|Y<7y37d4 zC7)0^6Dc1<79?XE)LXl0`J_p3oYx#z+qoTK-M zIRD}6r^u5pZz4S0cdC%*%t6m z$^(|)NOjZ|TekHB8Pk@Exuo9bq_#s{?yJ!KLBm~IcrFU3*Bgn&eZkuO!s)-Bwn?7l zQ2wIDs5RXiUh>{As_mj*8$8vs!r=9-Oa`0~`U{Z94@Ti?Kyd+i#!e>Ihp6;c?BGYv zsZL62b5Y@~t+H;*Fp{D!GvzX(6>@xJR%~N?4tes)-NkhoH>ABo^%a*YbDFKU+ZQ%M zSAsD4SrN-NufUR|pX(*FlplENUQQ;dc{@xnd=eEiKP4BuGj3M(lR6i^AIA6w%iu^d zBP4<}*nD0ffc}J<3&wO0DBQOdO&g^t^N9?J^XOY}lFR6(MK4&+LjxA zk-Et(qr3K*mL*x)SnxofBu}>Jrz-k$;MxGW+d&EKfOQOEI&a=Ws(oMYS|;M$EAX<) z_n?r9z|O`)xM87W7tkMa>)C09~y(TT31DO{KnaJS`$I04DZd zdG&!RXkn;ex&OlaJQ2FY)?gi1SLD*Cmf{6Mi zrNEi#uulFda!0zIn&>+Dh}`^#tI9M6C-C`M&Wknc?e~A~U6Mz>MF1}3F8J%?ZchGM zgrJNX#cFa1l(lkfv1ls~{gk-6RpnS()G-bUqI%Yf8gqzV345wqhtsj7}M zoUc4QE0!#d`xlq@}uZ5 z<^mxO4`}c1Z+g7Q%jrpX~*5I?8j(ysHX~~_LbJV}Uabwzi+Cjzx zzIL(0)5Eo8G08xTuh!i-rA$q+qi|4Lk0%RcM$uW+6Nn)s2jDC^?thr;4dd68M){BH zPd}Ght@pau>|{B$AOytyEHE#!B55hoh#m+|m+3W>U7!+I)hqy`^)7VIkMt07mRq~u z%3JMR%NFzCotRePmg6dHJoEsCli8@ok-f1kLgR0;^CcT{SHFBQb`jK{L|Deg@$hu; z(%bRS7$?GZ=$9loYAoSc9~ta+t~KDC*&43p($RSeDKqRC$r zc~8n~h2r#J<1@{^6;a!!|;zzZ=DS(`9kEv9w~mU*Rc>0ULZb;4}Id_vjB zRv6|YcLV56oK$(pW?;a1{%$*8kgzX#04!favd$1FTHHo4tXt=P_)dJCYIRY(!%syy zgGCU%by)HA#i_y*h$I8i%U}udK(7K0DjLPSFUI5#!R0miG2|%S8nn(>ZJ5jKAC_Te zUg_`Qpje`p5m--fW;t?)k)Q$CxSm+b$xiGU5j&^*XGo3*jFj9L*fG>RWjTy&c%!M2 z_%+$n1$pU|Ror4Y!!X8#fsQxvxPwv1v4=PQ>OhCv722tcBP+!=TkwttDOi+x%{Lb& zZ&9lrOr*miUtwC&llEJ-h|pMOm~h}CDpDxxb>QD`#rrP75O#$C97o$r=y?RU!#QiD zU4wO4{L|XNSo6~ZJPIz+Dq60c_i0r2(@6^cJ~*#Uu$eq2B+MS{3?nC}$K_8Zb>e9x zzEdQlf{I^NpZR*y75cV@!U%@%IpXIb_9Q~Vh#;?9Ju&A3>AL*kW%Q_O$!KEvH?CI~ zWRlumm3Ba1Sl9rD^L-Fzhd+ZWAcjFz$Z>`RyJ0c}-Ah?t@jY5Nr~=ZU<^)hWuox}? 
[GIT binary patch: base85-encoded binary image data omitted]
zlQNL8g}#~}?%su*F^PVTiA=cggeK)kz=A%YFGf=Mr8f$G?21!P{616A>y)iyAb;5^ z_F>uqQR^8wt9cJ!D2>5{KGZyjukH633q#V#oGtVb|1f?43)i0R5AVhMTyX!G$A4butE-oOabWK8wUtTaBasL0)oqlGFy$b~v8h%d92;l3MuX)h7>P{XrT?k6J3Z-P7ru@Ir%d zkp<2hW8;9K(-X1D1RZObwYnE@@moN;iWt`!gQDBB;!&54Icx4n-i2PnwlsNUgnFrE zxmD^?0R(_AW7RH41yg^2!2XJ;_J=T7%tVsioC!X2#v$uXN-}3^sc{h%B3F}ZAgYf! z6jRa0L~C-@=h9iW z1cGnc%}eN$_6qMD>x;Yt<(f4@t|MVPYy(QJ2ED7;@Feu|lC}w4_(sj*DDKjsY1!AD zr9*hw02b9e$%EO$VC}mS;ho2F=wN6 zx(bUGoIiPZ&|*-L)K(AN7R;hkdmTgPZzUvbDA8)DsKo|U6Z9s1A{nhP2RcH3q6?wF zGI=&scoV96TVVIf8g6apOCYwQhGE_@MxcQ&A{^@AA}POAR-BW8<=d9(#eg*!G8?@T?*B|wFqNI6faC69L3B5T4o!gcwVG#2+*Z#|R-oxTe9aYw3YQG2TpU4-? zkgnV_@u*W26teNwQZ%G#KOemb?9N||$457eW$cRoi zI<6*#Z(mu3y*20HhFWv7)VaVLEoYP-q%j$h*)6cl>$gp8ud+#fKH~lOZVtJS;{LAk#K6c7)C%6Lq7~eG= zLvW0iU#Q|WpI!B=c$$fd+e6%sFLOiY8fz9sD9z(RkX$R(P$ zp_-rucIznnJ!WTmiQ!KBBtJK+x++PaUqo0KeJ{Rcq+d{3NcStt>X$31@vo}ri;c|Z@4{a0>4O2wHETs z*4Z!F#Zi5Bhn!Q7&3ZUKJe)D#O^IEc);xQ zKUzHxoDpL?jnrcRIX>mr!uUMeO`F7p2YaJo4N0FZ=eplPX(gO7is}Z6&9QNl9}2I% zA!erZ#kUTVQ)De594x!d@~o143p;BJ`_Da)HQ2M=vMx5{6hP|erfs6~T86+E0oQ(t z)k_5Jjqv5&&6PUpSAuA>1BSq<%9sWkyG!Sp?L=^E4X_(tvcpo`ZWT+-hSx&R+H7ix z;3}O>LN0TReme8gCO%wYX`7jpcFJo@b!hl@iY1HQN!~}jluxz!+CqvZW_G<%E zUoS_XFWP9kC4pifz28>#S@8>6XTn;liGlYAs;OKDCrGa08}7RPESlK^m8S$b!5g!O zDeK+URp4U}Z1cAXBMjf$oIOz*;zhmAeZPWnpP~y%RumTHIfN+~TUn_S06>0|YxNT7 z?W0ByAWZ5oh9NK3iU?jUO8J4WiFn7DMRxw^jWYl%gL@~g)754ad4 zmCw>*SZLRc-861BYGlZkoGojZ(JF*Y4t?Dh&3Y*`!^zb(v8i2I9=pG4+2LEEGm+*P zZ1Q0qvnOO2$g_M|L5Eo>$!?S|38H>pEN|na#O>aDpMT?{E7f>r1cM~z^-AC&MY9yo zAb_b4QEVFR35Hrk2=={;%z2jBevX~!%ozo#v|%9`HyxD=&+Z6BJ>j4|C&^OD@*&vU zB1<+6FgAOkWMFd@1~2+v8=QEGwhnZ}P(LoEl;_4?R2Z94PG}Xa%(QQR;Nei#>t{+r2U$3B^J%EwdDL#~=0TV*>Xf7g#NMqWJaX zYmHrcqiC%#UbN4wMYYqs%p81X(^FQCl9v*38Lsw|uP^ zBE1(wF>~tn-=8SUNi7`>N8;4*NSrqYbiWFnTN08NkUoWs5%~czVUS1;dq=Dy{pNJ2 zfRBvVM;WggiiY)AKTX|emYcVn>spe8_$q~{g8=x%F^C@6J`QbNyi8hm=q@J>~o68DfolnU`Y2~`H} zVyoI7k{%!NgV(eE9NJeoaaA(b{~g7C@FBBbBr*9BhEgN6v$@KDe;qTmZ~JB!?mrZ6 zWc||E*@5_kD8|#aEX_mW4b^58E+gTg=n{=5qviObnzgeUHfehOSQ~$ye3F{D>(&N5 zZUonWs;L;*$`j&QxiBp3AW>N9Z2CU>rPO@<;A1Z2YFQJ<3z!#R_5FHwllCv60zc{z zeufh?V1oc>IqRrpM^YjKySC`<48#U@KJNWyg2MmoAoTC|`#Hs;ai4ZYiWItQ`FYP7 z^d5N+sf@mJLUg~(RtLVXW(c!8A+jULX`RQu+sI4_#LkTZA%b6~XZWSpr>mUrT`l%K z?(70+7hi#MT}u|FL?VRuF>nWo6QZd{06D9P!Et)OZoCqWs%Rcbn%jz^j8HWXe=|_K zko;E@LBFr!&#U-bz8*4>xjoMfU9ynf02I@jwMgaFvcBcL&eg z4@P`~+b_ct+%;00H<$x8BZ1z>4``R3ex6Wmzoj z=Rg(iv@~i4{?Y%>AyS3;0(_?#);SheJYI}SR>fI!iaPIgnGjKul8h#j+#U7h^&p1o z5df%kQT0}%lA4VWou{6bf0tG`ORK!(8MpMC9NB_z^b^8Tk3|XH!jqzv2))5uY_Z?^ zX88}jSl*Dd2p#Rrz|WgaQuGbW_9P!mekpiX&KWVBD_l>w(kF2<{lVuIt^Bb=`xJ2* zdZqPG7h71?((=fom_M!ov>QN5?zh6a@duK@9Npk?4oG%1o4`^JHAKm?wS{uy91}3n z9-Q9*Jm)Vr!u@Bx?v~Vy_&x=2n{N;UHSNY;YUp*wRM~XvEgE;o*Y8Vr#&595y7Pnw z!?wekJWXhbajRYc?XHgs?kW1kf!J`}`FTuUApe?)ILFtl%K(Sy;@AZ&%Rnl1Z7AxN zm(q+1HI_#wsiEdhpUtk4`pB%F-U#dDi=(vHdSnGHLimc|YjYI-lc%wM832v3+fG9T z{9W#y`$4KDfh!QnfycSNo1)nobT@HIYR~~r1b8|IRR|B^^vo4d7?9wQ_{i%NH1w3$ z^Icx_42|dY%zb2x7mxxwZH~t6Fi5ar9uU``ydno!kU*!r89uHs5S7h!2I@nczM`t}AFe^&=^w(SzrR z7M+2knwq)j%Y*H1VH_%iBz(0W8)5ULTyq#Xx+ArB0IG+N%wtOLwu}4)6XzC zpI;Cje%*peD<$nTC6vEm9f?d~ z0lhC~1O0C_1Mz)Yw!g`?`Xzho|Hx}%-RBg)ZkoYsJ_5aK%AQAdzlW`u6=wgCYIA8`e~rgudL z#SES=#-0!nnlD0tZ1j9C-}gC!BC6qp==c(lV>puJSxrI{{nMYAYBsV{pr%hF_%j-Cmg7PF@ZK2|5VV3CdP7#M3-c?7emw5nxE#UvsN=m8V^InckS2I#?J7*`5PQu5OuW}oQa=@I!J zY`F(U2QOb7F_Z_3tgRg#X77LLO;X3csCc6ok#5)^0D5Ex^!IZgs)_Tp$b1>eemu6N zg;$oS7fxl{arnV{J2zYZ)4o6T?|xm1yU7|L94P4KrF7F*-m}BG>iwz5dJJ@3&*(Vc zw9UoVrW9pgGAo-LUiGIlbv$@V=mC+euMa)KgvRDKr&`Eyv2!sK1)g`>2n1+EJ`p$` zU^He!GiD7$zweg$=#FfXs(hbezDL348STYPReAblm3knSl^VTtE^;{fM*PA}A#3ZD 
zwZidz0p`umH^?t5@cjrS?Xorh)nkk&!oQD1oLxhVOmX2-pcrKL5uO_#U2HDc?bqhX zJ-u2QG0vCQGzF@89S5 zeSXC=P(=xzh1}iya6+`qW2k4CJ)InK4ZmnRU$VHwcK{M7BEckh1F)*AzsL$qEaRS6 zjft4y^DEKW<-3e+k?-4MGDy+0j2khEm0V?UWxMk{>S{1Rh|mNb1x+hfG$u;*vW&kv zHT6(|^Ao*AiLyb!qCxkIlt6Cszu5WPD)dr#^Wo!nzPsIfNIfJYM%s4R29d0G$dz*! zbT0i?C%Hqv`CPhP>&nA^3SIZv)O4yJ3OwcE-)DCIf0EDM%v)g;48NnPs=z)?Bo=N) zJOBxFE8TyCh0I(=k6Wx|j%R3Fk&QC3W!^p;3^JM??w-7cF9Izh>JynsjteI{lD=`Z z5C-zdzbBA;Jcecss*q}VJ~{@dC&~a6hy_rT_ZHUWX`KPUj->Wm?STerRSo(wEiHUO zG3>>OH>(&3b7ukWdLkYlFKHR8I6}2@*#fOtF$|>!4FR~%OJO%{rvR#rml;V%$1$Mj zb*-+LmZYE~PaEGRI%PTB-uFAWG_6d18I>Fz70#p`w!O(fTyrmw@!<=suqRQyr<-!H z^J!%2^_x$iFyssL)J8L55s$25WYmm{qlUN~z$%VMXzVj@;@eoY7hc#UzVf zL+|OSK-ekF>g$;hL!JOkgw0`~eJHBZ>n_Q_Dao}S%YHxkHpAP?Gm*>&X+D8m+3diQ z{zd7~|JB!c7F7ZPZV!}y6)~L8znI5%BiE4RtCdviRol%XBE>tK+5O2^eRz6c!fvZj z|EWtQw`{uhq1{1x(kp`BiTx|MOhKE&^Wv558|=XbwlMi7VUd)>iy%zy%2zEy~qn%znwznJ}dexB_s6)I=lGBnW80A2}|G*QL4bJDNl$qYk72pxG(j~OxOj}zJo zo^nneBEg06shd0k`}Kj}`h>&{Y#$d6S?M{K;O=%)?I}FvPih?s@&4jE2of9w7Vv?gRXS<_`Ac?UJ-I`Lx zNTuJJyUStB`{?d5yYYhBF2i0M(h{GIlX3ilQ;(4G%3 z6o#Q$OykRR;=3cS%JcLO6tl&q$lbP@YR=Xu0@-W}M;$R%HNCqcyKdh!{Rbc`OLcrf z`$@DoFndw+*NSgRCAX(FwiBjYw0FKW*Ug2eQF>Az-ZI=d7Fvs`5N?`aWzuTYeIGsK z+yQu@ET(K8ogncNI#XNI^^)W5CY z*VJTSkEkVb7D5j*u zlpaq=>+3vz5O&7CT!pw_!iK)Zj-ThkZFAY8g`nZip_()4LhiRHbOCFoSET+|1u=JM zde+L3kY985mvN5V^e+q?(b6Sqa_S0c6D7~WJOZ(tE#R+bQukMQ2_W;H+#IW?28L%` zBPnL(2lDDx$_iRkDQH`kQ6)=_dOS_uFTZuDm$2Igv)ux-yM($V#eub)b@9g+%p8(eJ>qDp`f#~+-t+>QD%{VXZ1pUwpJBRKSp3tJ z%}|EArth0T%~FE=p2Y%+_<;4vpThUIN$P^^Y(;L=xlO zceLFMEGAoQ&XaAJ%p-Dz;9~*}`v(_U{bxGjMjq6qzfXK%c=E(f%2MxGtj(Fk1%NL0Hm`;qLI_(gotF+-QY^D@qyUn?D;abijuIG zZ%r=0_DcUW;4Ra|bOO20$$N1*P^6m%`rr>9Z0awXLikyy{ zQ4h1%?ufJwXM97D(E2-Nv;%9I*AV}!+Yyk_NLN?O=jK9t>)E8$0LET^iTs@}Z)4&s z;t1#mI6cZt;-cBvuT`3=u2-b@n*`?w?XP!?b+{blgE z%}*t*)<&~DH=YlwHOQY39rQH=q6b%kjOi}uQ-5l-F@C_G!>hX=&O2sk@LlDvIIr~>m6+ztfF*R#j+oqimX@|4Ya@fvlNsvkS4!n0jQTF*f zeXHRkdK%*Urdc^SGNoIqhE<5o98}>1p+NWRR$=G>H=e@MIifsz7HQYQ6ddM@l=Tm`t3mfuf z=Wdq_#Ba5}tctDXsVn7i&KNfj-=>VG*oPw5lD>USIY2mYj_<^7dQmxlot^DWWRXUE zi3cs|KkXF+dhi^i0(@$2~EH~pvn zhUFwbfUbtXrJxLs09+W{I0ej)Fu@Pn^QNQxF0^~9dCkRt7W!_YLTGmcx0Z3SQUrq{ zmk0+Acu~h+SmA%1k{S|&PP3a57#VRMda?jr>C&;g;p6-qx0gXH!k(x z@9Pk%jW2|$U3sT{;j1}b;aBc z6RL@56Li0l99T_$F8NeF7sI%r6tkQXxFzv_z3n?u%HPKj`&WJs@q?@L&a`V^A*WXR zm=K4{pOTR9p{lpl=j(@AO?>Zr)VFNgz!NQ|o7g@Bril8_r-!16A^_UA;HqdtkoUnOnI=l-MPnuoCN05+wVhgwXH05)@UR2E$!^$CGvrA!G9B zFcum$^}m+|{m)o9lG&s&a(zy}Yb;$mpoUhG!{Nq~eaOb^+s@z%4fs}@IKq)x%Pum+ za?g-it4`yPV0F0eiqVFwoe(vYRU!Nd4F1QlCgB9tbl82!y8oNmZHZf{(txyG&mGTXD)s% z-JUD`DpL1xtWc^tx1H<#e%(8OAW-{LEyF)6ar&qGJbnM8`tcqq)USw` z!&M$gKeKYPY&b#$maKs;ttiy25;OxO&v+E<1CGSAi?Dh(&qqb*Nx$cx%;&Bgc~RAV zTz$yJAhncrfJ{0eGM)0Jm=>p=g^eLS#qL1&~iL50B*En(qC5JpT%46#&XxC7eGYdM^r)&gSC$ z++{pke8KY&;t%fgaeS6TGJo7m6Q zo)`VnpdLE$*8$+iKw__J;lhT-b)t!Hg&h=5n18Y`cM0+q;77V+X-NoBp~@oGSIN#m{`;cvc>M~ z*Yxq~>Vf=^+y_h;PHip=+L;a-)4PDwlR~^4=lB9y#WCwSgd2m4WpCnxoTUsZj*Qsr zF{=*%oH&V$*EgQKxsW*zmt{|)?^$e~=$G4PmD;0bIzhQ{w!M2a5Pm@mT4C zm-%q@5YR|h!3ar;gyAz}r3BVY2XBFQ6UESsxHTXzeT3>fX68euF8fsbPgCHcK^Ne| z4e;d*BrEbc7RYdL_&0#|&{Ib@SMA`<{$FtIjqpo`!FWeN9PK*J{cPR;d`Ii)NZ216EL)`CdMpNwJjNY-w-Q=eQ zICu5r=AiMk7||WL_Hdz89Jk_kGunPP@g`lQkC%e+tTn3!OH<_QcVwp?5k$rYbsR1_ z3PKPJf}5q1R{&#d^G3uMoiz{5#G(w;M@#Puj*XL9*WYR!GWH;&!7sKLfC^8yGtdj; zZ53+`+v}<-HzuvBOwrbB>qN+`wOy-Q&RX#nw=-IdZaOHneOyP--lTdP<0T^L5idw$0Y$zPZ^s!K;Bom%SCzFRmC>U@C9;~ zav0=HN+34dA6Xv(Y>4Febj%5nzg|<#xgne|=xE|$pLS|n?d7N&{NAO;LfATcTgkL- z-SOl=GwsyW3;c!glyaFu+xl{J8`qDtk1ixChAVcy1wu*jlD_F^lig9%pj}DX@u=w0 zqfr^QJS`caV<;3cMOt;!agHJ&mh~g~mAB{LRFtu$G}zRQaelR<@eG6J3++{j(zS8+ 
zExnFcaFhWK7F5;^82PX2YO;3JSEBBQ^B`+23OuzW%wl-OcEdveUD0!7@gBaMmA&)0 zhEwW<$nB7Ar4Tvu5b5iBz$byk)~!^hi}h8tBFpsLd+&W zm)sLP5EV4@y$XM2u<{E4Pp(l0r&~iZ=$st`vH{28=DS}qd47KFX1E-4E6^CE_w8A> zs|XKopw6sf$>-gKW9qip2ZbfVDV1%ZrEpn{Xs9OFs@((nt9YK}(Wb#-7O=L;?wc_# zX89_h3W91cPxr%%y?~XOoioKBXM{3XaE`>Rsi1hVbHt6QnB^%nUu|H|AWk52g}oK> z?G?P^5ewO|bm{g%o(pfcM%(?pXFaw)%lp`6QiIo>RyOLDH*|C z;kaCG+70|VM`vU6){DFP&AlS)4QEnRE$y3AJgzfxrKIJj{0kyVYf%5B5>9{9UI|Ul zEG4FtNvL9ReywP?^d!n|Z3nTxiFVRh7Q3|(%+`6VSP5jl3YDxMjLO~P7JhwbD|3@x zL^6w~Z2M<@@r&1Cp)^U4b-h$e8iP(b^iif_W?xCo;X?!&N?6oebRWG8ojf3Y)gF9l z5e3bjL5}#>1*PC5Dx(QkZ5c1>Xj7(dgm=>2Hyj^ETbO@pXR^CJo){z{~!}^1;=B*FjaLL!w<{u-tKaw zbyZGUAMR0!C$&#Xo+d>04=}7bsma*9{A)S@e$xAgFr+Zh!)p7yKcSKW2t5|&3ABLz z^sMwYLBjjVU2ubGjK)|B9-Q@A10iz+*FDx@r5^UP!jlD-!2Pzpa$0++z z99RK}(5$nBsBIx{jbpvGB-RyG0E7NS7LfCwBjJw^bx}I4n)#7vtfi9Cx?#Y?gInuQI2<$W`ICE=E^mopm6Ji=oC<|@C22gv_l}9?V>$a!iS{6 zyMsXoE8u#>x-}cY>)8nrN`XKFZ`Ah3lb~=w0HlOsof8LG5ZEFaSU=p@9JDh>hCfn4 zryer_dFUj1@Kzt(5A3Zg=E9$kWKYesr*7M;b7d>ySfb-!t*)N z4TIK1X4td{q9@z+Q|`1jJ<#Hrf}Qo!E<)c+ak8bkAEQe}%Na6wr)jn=EL zeJ<+!qyr7bF@`*H{%|ZE5Q_&eVb9@kH28`ca@6+EwBCPdxAfoD&F@P8m#nGZ)y?nf z=67}T_f$7Zq}5FWDd)9g&6!eCqK$V+VuAXx|8ifnX1;#?%49t&|9U-;?(k_xlb@|C z#R*Xad_Pr(4rsVaf#qDtcvExAO9^B3&bTk>Nr`diL2d0Pk2h?cwpKS;_x1r2p{-Yw zv{*Y0jTpj>iXISVkNZ_Y#BiH3pb4rJZ0#=bj4vZYyG53mLV`X|v4doW2e%H{dKye3 zS1LE_0be#)@)r<7hfMqLv2FG<)qp~ra}D`5(IcgP{;HdzFIfg*ey?Y1 zgfN^fDuGg*VIkkjg(7eD@v!Z`AnCv9&xT(VnsrU*3_wW1U-yteRcaaFA|eSwF*2;pEbko! zH2&;g<@P5_0N?KbbO2y1n+umyM;Zz*y|jQ^uR(WCi0px)94O9=FgAo*BFqs8DD02{ zjVHa!gU&9MjcQ8*d=tC{;GIH1e|F!F6om1mR?&}tZNO^l&%cui2>sa*1^?`>mbU{X zgT=2dzRdWuAtHXhuSu_5f&UBU&Vips*!$Boy5A=T*r~tonEyKMzaORF zWz3)S!#_^=#zhluU^r3(ZV6eI={vnhlwreMG|Zj>>S<@;Z#f>^{_N~&(!7lHj|IY$ zdGYcTxU&djLlznP+3g%5x9y?NX;po<6Wp67Z?~+P)}IiC0(EndbShA6wB^+oqmOI1 zTG$(LWF%#^&sRb%$9OPeql>fOW_cidk*~01tbPb@;M^D>bq9A?+3dCfPQ?}5%=5xaP4TRVX!W*8Iq%pNpSU*e~Lu2GGQhFFL@R`x^$sceU#q zIz^ZJgs%%7NF>(oK;wWgNev&if7ZI!+GVXhU#n9~)#R$mLs^FnLjQ}SXZD7UaJc6e zL#51Bjq1U{2D)$^`l3qKw1_q#^%@>f>_E;&u_yfFZkhgto0gTp^c@<@~wIS>(QI<2;Smk8>Nx(W{Dgg zBv9c$NYPV2}N zyjgw>@)zs~d93PADf^91a^$ri>-=lOr4&Lt{7T8w%LaAkX}+y=5TGY9@og|OWTiU9 zFZ82Tw61lrOEe~!*Zyh*}*6SaVNf_NGrRubQF4M+7=9!`oJ)a5asr(x_{{w7+YS&eAOReU_K z>-o80aDB0$Or@FzQ>-lYrIs-nT6$FDmxA>f_-^@?(&EYzL1+9?A#%QU_#1&29|>c! z99!z?JRDGE! 
z=~rdwb#kS{j8pKTRebIlSVujKQXfr_RxH=E*5fv7lp}>vcj@Zqnzk<=o7D3EvSL;)_YKc6_%b}LO4aeMS86$@|GPdTzQZoShN#WJs2&*IdjeeeZ8?=D{ zxL;X=fS4nYgG+gV{N{nD{{w0fzRnKbeG+xp4W$5jj8uWc_86dORX|eACB8nsh0Y)- z_`gA|Q=0()ce!$P8b)(M)OH6QgZHsQ?LLHfL(^vf7#Qf0mw|1_5rCr5{rLBSQh=gR zB}6UF2Kq-$7Vzy;knWOQxefU;Y*Q9U`<4pfq3+=%N~{x%0Bd=_rN*l5J}_)RlMxgt z-h$U=P`et#2uMDT_=G6DVFq!8wZ+ryiGb&%2Vo|&V1GVYSb47u?U5+v^{=GI$?ncU zFO0c$)O#TPHqq2-rEb2gE}{$oKiRANi&)6)!d}$31W)^n$!4RnlCaq@3QZ>tZBR1p zaTGKLNLTXt!e>VxP_QtH<@ftKyZcUP*IJ&kIR{`SJbSyHiymu%%U)jU`R z0f5|Nx8%S0&9rkTM9i>j$c7{$!&R59QuQjJReB*yV;>3NFOLazc)TE`I{$Fv9|Jw3;NTYs%hso`@Yh63q*`C;^VGT`3NW)GP^kl z4|#a7rhYRh4$TPiqQ{DV_TuU_U>MrbsUf%7yk9$)rndAx(A(X(ZJn&RSH}Ay!iCd*dye$=o$RuX~hp=Xr6-hnbm!Kmv$%{@48Kr0hfPu=DKes5l*?0(Wr*`VoMSadV}&GrC_c#OFclbl`X?h7@Q2?)KNo z`(U5vD3J63%-L2A4IunuBx^X?OY4Qhcq@c?Kihy;p-!u?*;Obl&V$4$y^~G3t{IMs$>Cbvd4{gICUS4{W8Oq(FI!I=mRh|XLmW!_fCk)GugbU z0kmZnc!oHR0H-5h!E=U=Dmv^cWlb=Dn!{vVT7@P&2`zb^KndXBmoV@xUxMHi_I{qH zK?Z&s-dtu;!M5!b8OS3F-!7Q(ah=*J#=)O8_!cqa=vE zbWkz?56gS*C6k+>_#I50w4_DX1iPvn^8&A_vI7_q&Lc78Cq$hg{nL`tC5=$*0T%Ms z%M&Z$ItwM8R1m1US?J^bl$O_lO_INgac=<=T~lqUbp+Hz-sk4oqFo){r@KOT>R#yV zy^tD1y9Tmqq4j{syH`qjajg?(6@Cc=JrkUfbz{VZ=ZNs!sC_R&HV_rq%U*|%Zq*jK zL7MZ&%~cmWbpzO6vRw;UO80W8$iDNfR&J;wWp=Pz%r4f&7*=P$aoFemcCH8Jd}mP7 z7;iqOp}cHS-6u(Cf))Tix9c8Fl2JKfX_b3zw`yzvn>c-A}Bj$EH4Hf?|ZU1i!6a=ZxsEsSVnHMW{iJ}d6dR_f{Yx%5TE7E5h# zsD(a*S~kx)p zeaK$_y)$K#S4=aD7)`@^4ZlpD?&}^;e7?sGaLcO-zsd#sth5Y1$U4et*FG}nzME3b zQgsSd=Os4;CbTEp5-UqB%qD%I!YlD*s}v|QDV8q{K%6E?2EcD;+4jc4>Gvnk43Q~Z zvrQE~A);0sQ7N#bQ#lOf4_srai=AG6Hzfxe$}WB5ytiV-w7fzPl4>>x)DDDqhS~&9 z*j`x69_m7!qSzK9G1F~#E{s7xgh%n2-frT~-+a*viF+~-u{HVxr9A?F{5m&JW2D3^Xq^@=0kv(!5Z6b|vshApZ|RbVG)LMw>Jx_{TF+ z;2rtDYpkVohTf0t_%gm!C`W30!AS*bJbnzuM;_t()61GzGPAvyVX zAr|{bd@spP#y&Cv-dL##2uBYq0ALzs0v@U{L8gK&M1^tcfVq|Eq6H)YlR;}x(c!gn z{;!vll6TyYv+}5Pxh0EepdALxY+`}V|;X;e%g$h` z&12#Is|ROL(l`=~cs-OV2iE_|z-Z>o%aWM$5>U;yR-pBsgXB&VOnz`eTzx<=Xr$y8 z_oAX==B4EYAS94vbjPx>%MBLIIcd0Fd9Wvzk8P8d`vNvwM!u6`ferFMy52`%yZ=JX zk8Xc$$J^wc^TAD^8%yjl-2==OzgbVxK?$%oEj}BF1Uz9P>-E=^`t(W$<{jF;HZ$=> zP8m;O5qe#~aP3}ZO5jeH&u=7Oa3R4{`zv}sAk(rmX;l!*QD!F=6`)8d`Rxu5rbB6HuZk?rv#!!{GNuc28e6H9Edr7^8L-wRKEJ!$E(5u(~hrf*1M%VvFe)Xxp$2h7>-uB<96`fbp&v?!EVxWp@Wf=|h!sp0|nKS+LopcVA`FOvL{?aUpU)BsU9Xmd-S-h)G(TAn3u zVu>zSCB*UY9Yj}?=hB8ZXWX&8+f4QCWa#lCz+=nPfV&2Yo9(&+sH!o(CGNnZFW?44 ze}Le@bdMGYLjDrJ!+W=qduv5WTW`SFWl&!=tP0s2wlUa?LS0U34a4(XkgK;;bd301 z`q)`0yIW3!Nrp!A{a-71`p3Sfqq|v};btpaHC*EE6(@c+)*RaV)b#au;wx7Pb!!e6 zL1cX5Mvxd%1pXn4b&q6P?NcvkRMeIw~fUwsS?^pz&k)^J#kOK{k6bJ1m>DX)@C(q%8tESx;XdMTiUH>KT0uOa^3Gj*4%gck=!Djw5o(?t@FNcFWD zJrKmF{)_64C$NBq<|DK=&CLfIRY5BK*7Fq=WzWKzF2}eU_izV?TUKFh(UnbL>OJ`S z#wnJs=&=2uEdFb+RkAi)3PV-$+dvth3j_)`=^4LwO02~8nH95^EB6HyR_Trd2<^wF zoUwa%81&cHiah}vrXi?BHaf0of$uHg=qqPro-I*%cfhIgYQ`_B@=OA1EiL|wH=RB3 z(=6RoYHKc|C6h+Hc+wj(l8FSaFUqwX%v)8$*oSpwykT^F=`x-{hIo}kWKfpIED*T* zfAr@0$1bbP-`0}}ayJV*Ru{|rmc`gGzu-Om!hzbOsfas5QPvp6ncVS2B~)N>sn;~c z*N5khOv-UQ!n)_a7wcwl_H1!weo8!_8SQC)MLXU^=d20aNGP{%&#RQbt%i(b9{k_= zACB%G8w2*cKM=ndk%upT6!N7Ta-KcL@@Boqc+h8Mz*w5g3rAYkQ7v!Fvf40zELR`=D3gZEdf+ zfVBls)TAV)lHtrekeyX>=!NC(;UBFY2Qy46X#@-}O+nK>;E zX42*?VFV=oZV}+MV7}_DYi-12N+WJJjT)qW$_j-w$#=4@9)rXDjO8fY}=d|QDe)3}*e%6s>f^4O8 z4I5RUFS^Sf?w}y4fQ+5JU=W^Ct+Qc5?`zBOzaG`IF7-cPm?l|-k2X1?KX#|@6`TNR5CXy(9`MO^ASifD)yM)JX3{dhbnolim_a2&DM7-#K49 z=e~RIdH0-q$9UuZMF@NBZ1(EaH$5asba9`4I5Wzpt3O&bIVHI1%#+f9z8V-u>?`fKRPz+@%kH-3F`;TwH&(zDKQy6Unlb;cG= z0HLe?$RS0DFcS7y#>d~dWbobDSXXIX%r^r!^6-Y2hIUR`;@4u}>HeQKWnw!bSN%po z8X77MpxL)EYk*cmCy|}?JHaT~Q^AnZ2kB(C{$i~?D>$Cl2(8YC-UFuk>{V_yKs^@o 
z{-m0{|6$h+_T<2H$I8&Ah00cR&0Bfeu0o!Ouv}mZU>1W>tR=AN#Si44fo!%)L#5~^ zD@1$FBE8HcpFYv2>Qm>px#@A6vL@0EP3v9*og$inu#cQNF+iR9z)^JBBAHRNFQS|U zvHu{f6eR(cEjV1JAYE>7!W+=8HeSbH6FK4Ce)Xcrs>{<9{oT zrh$%o+%q%j1*u|bSJrRYtIPewDG_4i<}uZ;5X=A8z0;JkLtUwm%eWDlqe?IBe!vj; z3e#!wJgxcJsuAU_iL=Vrz8L@0vAO^`k?}a>Z={8zJeLjtK*nn5UDJxz6>fB}q@3-0 z$d_7{7a)^2^)MsnYRc9&S_Q4MyitA3{inRQR!S=)Dw93kQG5NZU|1&4C;++QEXWXz zh?6$_EtB7 r{4#RDI-df9J~u|0(6rwf!!LK?qLFibLg&(TVV(|lgSRGRm%I^=4YAbJ$VvZ7EnD2W!>!c!$Hg#GBREa2-9PrC?H`=E}z*GLpFrVTuq*aE%?B_>oM=e{stk-1Ol$$o!oT@(yHs5sM=aFR;-* z`ZJFA93+M-DSp8mxMQ|6nU39sY%g4(ZHqn*SGe%V-)vtR{oMm`upL1sv!Nd@<98Oo zGy=4~xd^rIoQ)J)2KO4z_}liiZ&l%nH)6fmIDBm!1ahw`U1Vsg$6htuIbb$EfS;*yCd^a zg_Qsh9toHEpGrF#4R_C%dFU zAE^7qvDM)?$Z(L}1$L;`nh4}(OP+l2!SJ2R4?3o5RFd8R`u zy7pyln%pr*5_o?j)iqm~9tSnr5w?psfCbSN0%ZEBfipEGSCFhuF}H2rILd4vdh~4L z8o(FHcKda7z8b>bDwX8&pfdP-^s*K^%}V>ra(^Vw<_H5l5l4nEVK+P}5@tSW>dwy_ zINW;zl}A1hE$U6P=do4YqFVaMtjMQN?DH`7m7>MN0a9*X5IvczUYMsu#mg)w4ZJ>j z$i4TV*!;va7yq*39lgkAvNGUIGeyZZd{rME-%X3gjNffaEj<@slMao$zG7dD&5$sG zB3VVk2W~z2^wIEK^3COTG%KCWy?~;%FSDG}0;||%Abw3|{6giVs!N!tcur*Vv}j!Y zDDd9Hi`sSX>82*N2%2$yYyx~>Hyy|E$OGb74)D(SThFR0AltozcBey3$G2q%xP!B? z?GfJ##TeJVP|=?=(GHeT056sqbSr$E0LX;)AS_PtW45oWV=pCgKqs?o;t%yZ8=}3G zkIh1ZOI*XOMKBGe5f>#dsT>%;LQ7T#ccBgg#MEXaPp|k{XQ1#lpakX;xzbQB)fYQAGjQjQFc@?EU0IL3WuG0SRN!;kIRv$EY8QP=kVYRYGxWHRGeyZna2McwCkOOT9lPiJ$0k^!=re>O5+8XnB6nC#UwvfVt3ex9~_C(*Pi1y=vc{L z%7BIi2+%zGTj=F2bU7V8?|SC}9q4{LZd~-?jnCX22@ctb= zwcDK|CQ--5dn)z##5 z>VNgcYMbyCS0&JlB!HYm@ZiC_8}QT;o8G-dm5uk);mp(~OW2vUt^Qt2ju720>yA~Z z=jSqqrEmFJN}M=((>PQ<^s7@n`I5P^`IS8z+6j1J!ed?mU;s{JYLzE}Hy5?r3vR;D zI}E^}RprgAzdq z6?EeeN)EUFLndaUJKfpf52ntIjHI9W6QI=Lu8hbPfLJ_fNR9l%CW_sKCUVmx^pGkq z^8_t~VSUXP4d`BcA#Ug*TE`Mspp%X;NYUezUDQ+yG02>FoM`yCX~NOl&>`a? zPi4ZhulnjQPym4lh!>9Ph};r(e|e3qGfAFH;dUU~}6;+@Uq~hxor|7KP{J zT`SXRCqUBS-%Ar*_a8mqH8jkR%Z5%ACDullkqsA~R9qYUqW)AcWG{2YUU06Q?&+)~ zh{IE>oOw6X4u?eC*LCwwRB0-xdoh#w%+R%cn!sZ?l6R{5#545BWcF`RWd=Tx-RTvD zQ^`t>gn_2k7(kA$52cZI(+z%;-8al{Vc_-XWm6n7{M19McNzt_)E5jZ2Ezq)`MHZJcJy|oui&Gr zuAMuY=SdxqUkqz{gR*b>QWR|jlQ_LkHcu_ReKL{@zc%+%-lFaq!^GoE`f(YWsO=r8 zA?^|>!<3ktIzp+@q5n12z<;!|K>5=LAq0KcD|}HY4jNRPqpf5=M{yA>EJ=J+R{U6H zqp1xJ=p=T#n17-8BEC0^Bwp?M=m9cX$L0${FbL~che*LwHigczB|(Ili8HQ zNzi!hJ%_&NR5~UR`?zYUvMTP;8>!9i+iglm22&X1$U~I7=sF8AkMS(3JmsdN)T)NG z{$7oNyeatUxcLH4jaoY#dtj$2caTSc(?{^<(N3-a6JcuCUoD!c)pD|__v1yp zN~1qL-LHRs_u}aDZ9S(i6A66Fh)_gH$OaEl!x7SqIPZxJwdGGn`oAeES)za1s4g$x zh{ItQlaOKN{$X5Z+zW|ec7lVkZQs7-@E6(p`(|2RTf{2G4>Y?M-fiN66$u}Q_tu`I z;h>JV5b5+34svvEy4(EI3*G|si6cK%y%(FICvYDfK7)T^U*QSxp3pI|0@DCnLP=(r z*!D4N$INy4aK$2xrrCECu#o@ATF(#6*uaea2D?M%>{2Kz#Z==dRxL!wwK+`fUydm}jy`AHH~7Yc;hk#hw`c0YkO_5m zauWsEJL=T;BzckM_H{6fOxhYe!+H0U>k+1jNtUT~a6%UD+a(ZU+MEYTz}}YQLuPd} zlmz(Pn1Mq(Q2do-&kSwb=npZQ9tk`xAJilQsrY)l4&HWaJZBcXp>U;F7BAC)l88JT znQmQKq!ZC0SKRUXgV3+HqAz0%9lGQ>M6pPqO#4qBZHtQsoD|Sw_#cqMvRC4J(`=do zS-DQR#Fa|ps%$-X;@hI48+V_5z3ts_-V~I3U+CDXj_5(o2y`Y7TD+K+!`OATL$&{D zUz>$dQOv`(pW6Jf9D;*iI}gvr?W5OY0jpMpK9m$W7Dh%AF}h`ho-jH2?OfajXM$d3 zV(l-nm?|Y{=-rOWDz~_6s+qvQm|Lu`uJzBPhv?2U=`%2MIXvJR?Nf;_l|92;dmegW z+)X(yw}YuBjd|ZiLZ=)Km@hj-HIPe*@$VpC*x);W6DjcV@5pb<=5%o(=0@>moM946 z+HMoRucVizJE)?&IV5XZW$k(TKn%|@-5n+Qx#iy0G4-_^*>BZF7Wc}{f}*tQ%Q>&z zcstQgbKU8;Jvo<4aBY--N6PuZkH7dVwi;IL9KPf3liXm3|P5hYNA1c

    z>o1I#_KKYvXO7j>FxsA^+Y0NhJe`^pJV!FdY^JKW$gY+Mx)SFwt


    zB9fO^Ja3cVy&p10Z?ThgBei2jhM2m|}v{QpDSByP)>zRSnARC44_r?1XdR zHwm1?-vEYI{WXSU*m1LeDd0{`;;3*`>J(rJcrw54@mvWU2vD4eIRtbQ`V1tXgkQSD zuqf&NLhSb)?&)^5*caBl+sa0>8kz1m^9~TF6z(3+joH=fH!y! z%09fzuzniA7r#$w`u>M5m&{)~y$^4mJjB&XHoD|-i(F|bsU^m*`c?>=HJKShH@p(U#9F_@2j3HvJcxMy*(Ik9+8{ux%=k!*Y+Y=z1x32OscV?MxOq{yjMKcops>wIB1o7{)KspZq0a z-;AnH&%=_|fHJe1I!`UH61QtQUcNZ@M!gW17(_MyE6du2M5zSKBF!1u>$UF~4m-?N; z&ivO_UE^$EkI>$}~d zvk%<@2ZO1L=Qtjh^b@R;49H*~ffj&3In6?HH@|r$;Lbj$n6J9Ua^EL#_dRiWBv&2K6`&UiFTcQ4tMepMtY!bEBi&cNvtRihVFe(LSnieDlcbLCFzbP2 zrZYS?RzDpAZ+h@*=d=_COWY@NE0RG3&bmaM#vZ`2k!dJlu4EueCqY1|dHL6zR}TXW zAg3#Dnd9b%iOn33w$`OE?gX&k>#^xKYKNbl~2IJ4yOO=ZiI{y$%=dN63*xjF{{vc-1~X;k+JC=XSn3tX)ZlaOtcj3q0}$ zre!(=dz>ahkxA6V`3q=14h~;?;8^@{O(gWJ+|MNKX5 zYgzy1@PflKDJV_)f@#`VaQv!&)C|jb{ls`lSz{M-+~ZrOOLytA^Z8B%S@3#NXXc$q zH!xiX3p-2EaMvBKQif+=ammS8r#5BFlkxYm%^1$v``JMaWXfE7;+q^fU>R9?M4&2K z8-n}qYNP)jqNpCV04?f(^8FhPrn+d)*T8D|FnE>Om*X|^?R;~JI&vxm-4;w5&aiA< z>@D%$*2v+djsfRY-Jfw(^OwY%5u->|1gIuyW6n-elaN-HJ~Fm?F6yRx#zX_V3*GE_ zVaAjZM(P@72|`Q(Iu!2-OVsxZv|BYmSG#LszNs|205-eSycNM&tAQVkm_~UWY-}p3 zDbGh5n6C8pF5YNw?_<5ma6+JjdckrBPz}IZM7#o5|bFfU6C@s+bj_N-QjW1=aC5apX&oe&$ z58$fk{=3h}SO^ox`H0>WesMe}=a>b<5xvnxPmF$^B@9}a7K(X2*3+EVE5K zl`p+>z6?z#I;3)Lt4^6?rxr^Vg5T+X?3rw9wTeeom1>>{UM+;D(8!8AgWJz;gS)6K z_Hhn+cCM>%jFxX?F5V1R8+NA#U$b^>;1_62XX%WG?I#*D!9Net|6^JcwFPLhcfsD> z6DbQ_YFTD1Az0dm=DA;F#PbkYp$UMloOUeCThl!sL-jC zwm$-li8|}bl_w=j_+K<=Lg3veVp~GEtbGvBw4+mhO<-;EjXOf?XwH1;h?U^>e}oE( z8TlYXzEw-XL+e06;M3w7NpXry1M{}ik6s_+AG~uqeexd2m*`8(|Ee)mbRe^%-hP19 zd%6P+1~CHs#TG-d^jp{3P4nY^ZO{AL{l`g`8XV8T>502pVwlu)!)vw$7CPZ76>4p( zz(@v6MgQ3cqqj02NO{xV0AOcM)hd#T$dXntxgHEf<>|Wgv^0@ryKZCPzyR~=@=4$%n9Yp{o1q4S9fKt7S;jqAkLDAmN^9QDw zu^$2*ah4#{E~sp1tAviV+ZsWmU8NY*oP+Q(fttDiZGK4O8mIgEXpR?9TMB;MtOPN+ zoWVdl0Shgseyx=sFJPW+fqUzRdoyf*9FAA7t@aKeM=s-5G=8GaC)Xs(jBd>dr!&na z1e5#fvE4WAq`#4+;16mfq@P{QIku%S-~@t-wYoUG%^?Pmg9!JAwxO?gQaOlI!DQ~$ z$adNl0^hX^a+@=go>Bg5Mu)Xxs^ny~ELoN!hMyB2z9df<5c~F;$7mbU5j7E5%2}K!g<7q=5&l@_EVgf zQkh-Uuys3okBs^}}pMO?`5UE@h!?>}UScDa- zR{*qbU|<*n4Z82=F5sNrbG9U%98#A$ZRP$Dtgl38z|C@AG=C}uJc?=ICG&Pk6aCdp z0&_QWzoo<}aXt1@7DO4t_C=~Sae%1%DY)UsRm)4eT~Ipzyos~re8(gNK=eOa-2z^n zl5FhSZn5qUTHf8JxTN}YI3f%z05Dj!Viu_ZT7aJlKOeVwc$4Ri96P9bV>wIiKJj>M zLpV{oihSb&DOdf&wngWbCAiZ z8N1zgx_ru?MHZ)$n;!d`7wNfx=c?2;819tfX`$)M7RXhFHJ{x6Qj05$EOTK)KvVU0 zP7dme+mRXD-J5@(APn;@8_cu%C`O9;kfyf9xcSHFx{IEfZ9p0#XC25@-H&0kDZDPst6x&|6DJEBOulvUi97|#u z0hEqcdpX6DfC=p^to)d>-woZC(%no2hJbQWAWh}Ub#d_IK( zMI=w#;sI79bG=5<+;7*Ndq4|mh)jG!>Tn!EY3ZlZc9pK3KakI)O15Oi4M*aU=y_k+ z<~~1N0R}C@kx~5h#PGxNhW0FX_YXBs%|8GdZs>e_;e5 z-0WJuMLM#C)Ji?w^!lt5{EYWR&F{{6WD(t^BHDo-l?&hVR-J!lxf*l!DRIo3SHFd8 zhYMM9siO>M;1#<*Tkb1qokt7)W}_6-$Ji1?Vj}vHJZ6lfSER5dCG5Y*2Fq+5?)Ol< z7Z`rfKc^8eFOtLdGmQfkJ5D}+Z7$L7w(SjsC?)@pnpSrpzQuP=iIYrnN5@1HJfDWY zW~CSY^6K2ox%ux8G`bMFmYYB5KJbv}=NQ&2vYRq(eS&9mB)y!TH|zB8pNPJwjutC8 z7SsFU^s@Xo(50*Objjf$Cv{tfe=vKWKrmM^|MpH$s`EMBd^GI$Ws*MO77Se|`hM@G zh(}y;tQn0g!6|UiVc-(&viHK zPw~oRTi9la=Gt%?p09EmKr`>2fy>NXrN2?Z>r_}2cts#O_GHUU<1kE%c}4*-+kgnY;CZeAHY_ZpI;NWPP+(-TE2NA?8pw0 zJvQ;TM~vYz)f0}*Ri_Il{+Vf(tVc6!QZ zbFST-XZ7PbSZe_lDK)rr7EO-Wlm@^l2~u_U0nAp(nFGkyxAcMGP6uWG>BR5@`H2R{ z4j1(6gC|=4{<*Xk`|Bm2zR}+=9JsWx#84JPJ2(!I^;@z-zX0v4rVFtK*U#M8EHj}) zZl#^7c3vnCS)S?9Qyw$g5L)t;TxKCTf{Nel|C$%) zu1Lx%2T-D*HXKgMECL%{3CC_Rv@p0S^E5NZr`MaqLZ~(aZ-f9fqU0del3-Ee3S$-g zRHT#8x;rWOL43c#;$u63?UnIkeSTg4lsTA}J(^P|NASlp+Jp}o z$pN!0>pA4_1kT2VIxe5xCY7G_q&lGY15&L2tfkO=0mZ#grM5L}S9b_-e*{h2jZH0lN@(lJ(lJ!@x?Q>dg|( zrpS1Ce4m@)hVdKA{FYS2Ez*A7*iu6en=J`uhkBO(t+$n-P2o1B@X{vB@(UKh;SC@) zX@UfTL8RuHMg@a`SpMb3Gg{nQ~p_j5!%4 
zBMC$=kG&Yt#=Q77s0?2=>q1AX9|oi)xx>%04x1fsm4e^IWe&U6^f$rI2yJndslWGQ zZU=qJC@@jA3xeZ1bknx2LB5*!YUXq`3LG3%uL28tcHfJay45 zqZHY;?Nn&VEsc1oJaB4awVEY0iY9u$3hCJ8kf}pa6OpYr4CS~1!3#klmST;X>(jsY zG8frXhgv#bT9(-&x}n?~BSHANTD8=ss(^z#F4fSY>1JI+&IWmw{nS9@*LM|&RGS0P zL>H~d=pcT|r;B6v|6=bwM=Ecsn1WKBJ{!svj8KVF0qoSvNBPY<+@hm;7TjI*WG+u4y3=O6FZa zaaaUKv~XFpXNg>q{qQE=p7iU;FreflMoaO(a*HkT);VL2-g&m!lj@FcpbM8oO44I& zb!pHBOf7@{Lkababk(iU#+Wzc|H>gAS7yP1-Y7EIA4<@zyN_L`QONyKYE9a`ZGmC6 z@M7ENFmFfzy>gz-TN-T}%ER9T(x&Ar|BR!olJ(|YXxmLiiL`IGFYk8Ey(nrj=H)bb zQK-P&<9SIRsQj|l&hbP%BO6ju6rpp~*+XP8o|K(%$#K=`m*@85WR`=dyYxQ%&MVK( z$tg0!YP^v)H0*y=7UVvI9$>Y7LfpHK{`2UBDp-G;MI>Y*}w`5}$c==zd&g zY6z3sWeFz*v?5EgzlLPtlHLk@^bS9i;shNIC;^^I!x=xmERqEI*|%p`2GTKlWL@*N zm@YLn3b-XNVL_k{)D7!6;E&D{%Y>6u1Go_YO3jgQ2%_&p*}skXp$vZ9cj!du@tuDh zeCS2pue!ZKJ0~c-UuaBMMq=8ri(&n1Fle(K%)BG<`7YptohK?1KXGmd%z6Nf(V%&D zpl>bO7l+>ZgqwYEUAg9nYkza-;mo5ueD63J2<%sDMV7|iQ9k*99+tYHxA}4p-O_Lw z4~lS0(#DN^QO%bhPGB#^-pM1LFJ@&>Y8>kqVYH8U3uMGbgHbEgV#kf&eRXz zgiHv8lO#0yJp)g6N>4L(abGZ@Zs}fa51yr?Rn9KooN2Z(od0Ebgqrg+i5z(`z;(4K zDk?GA@iHpsE^kpe9gAGL?%QzrGkQU1;7$0=L%0q|ihyqBzcM8+{L zRy@J2h5#YX_CqCs7GstTc70Eiq$^Cu*1P%9%WqF&oOZUJqZfDGiWaMNb~;{Rm+t0W zsd_!a&rt46@7%9@MsNRQ9a~Gpr^Uga2bfBW23{S0V2qT4LQ+{Sv(sN2MtAiDT>V^( zyIF!JiJa%$(Nt>W>jWAu7&@wbHWVmT^iSZ)To!BoL;L>cq5EHzxskDyJUA^R6po-? zt9(=ZW#Po?*;C7A2{aFLroSWncjVTb_ZO-`Jib6nw&3}7vh0FLWa938?9hTnrulX( zlN~3K`90wA%>DR=n28UcO&fjP3gk#UA(tJ{8H%=~o8(8h;l(JvpUxw9t_f9MQC6UR z_x9=R&JN!}(tOM~W&!hdXE8ibaQ*TRgx@a?wM)qF=ij4L*f&(+T8+l~%S%4W3D5E- zKP6wH(h%r$_~2^aLQRg=C(JH%m#TN^U!~&*ECqLKJ$?dX3hq(9OqTeGxa)-*oKSAxufBVAR zMmoFEPObAPg|(HmYo}JzorSxLU(w^fPW*aL$!{M$;Iv$4J2!m_n?er=Ven#5&dTQm z!lJO#p03{CwveaZYZUWf&u_2Gi4l2C6xB;nviI@&;l#l#k5$~2)!^tr-KCXp7{88(#EO-M`VQ&tlKggqGBU%HVR0HduMb6E?SFN4EueL+R z5!hARPz}pNDPkc$Yq1R2#;{J90NT0#iWGk8tB%=hH^Z#f* zple<>40=}%U5g3FSkKZqu_$5R96Z?%b$uWuG4YJAF|L?)_~U5E2H+vV+O(BEuCH_( z+&zLPlPd+L9_F4P>x?w`SL(eD8^>hlpmlC$NmN*`+yGAqQ&JI_l31DBXGA09)Z0we&i&f4Yj^ne@N=Afl@JX_|oGks}3;u^#oDeGAu7K9I7Hq#>a z=x&kJ8}+b)1ERWlJ}*w@N5<)>0E$rSkT6&oQ*_-^YYhG*ipeo5xW-dx8B4l|Qy=ak z208DaY6e}XJ2R)k-NGe(yUhBm_gT$8^e>(=n!ppMCTcehztut5il9hAzl1&S{p><_ z%aa7hd8SaJ&zRyZI{_aWZFhp>ia>3HyuV0ubfY2WBK*WxvIcI7$}?)Pj7;VHz$AHI zecDqP^GqQ!*rJW?7wO!K00qaf-TK1nYINT52|V}QuP6l9Oh-z^Cv(R})D0q=*r{UQ zDK&3dl2~81y3F$17=hDKT2cV#LU42^LgcAY=-s(sLznV;$u6~&-d+wtWL?j13x=rHBv+lTr`s{4ee$znbg|t6wJvit~WBs4}Sfa z>2BZ5zMka_6D{4vrFSdOJXi|E9sdkYwLeMN6Gmv!ZJ7&Cp1aL!MfeQ?HQZ^j!{7No z;O_3r*oAYQn(GV4l!mnGC|Bb;>&AL5%E!&<8-8|RW7i5_u3=xnE<4HFWn|1~0-4Ct z@#%Z+_(!3#20Mi(*ho?4F%gQi0N0&oB0`64U3Y^n<6JLYS2A@eA`I3TeoVHzf4;%# zaUTv?`25G7kLV5fqaviCYoKt5brju(I;vTIX@qm>z8A+AsWsk*H$l4Xb!LW?cm&v_ z?83Fup%0!nr=AN2UNgk5S~8gDFf)}QwRKEWt<+!+$BE&Gc^G@Q$e*m37JVM%ivXX$ z2I?5nJ|?p4J!^}8B#@(WUarQ=Xi3vD?<=)kr`iS&c}HW3A8+X-YVt1vTHE!C1b2jY%Yi>*8Y0y|E?Nw zzO3w;HBk{uAJ`mnXbpq;1~gN7abJQ46=ieo=a|vkTB+2&-N6=syah)N0RQYc!ws#e zcS|E}k8@gq6|T7K^PfD>pVuxb0&;#|C%lH~Fc43=o#s+qB;LH7{&Ls^$s)_0gy#t- z7vN)FlU3)GUbviUt<0WS5lCH3m-d#x)FdFu5!^hEw*oy#1A9)7ofQWdZI@q0u( zPw;XgJEV>Kz<|j2hPlo6IMIk}X5TYu#a~V|AZw@F5FxGy(}Al~lHktsnl=r%{HnK} zy&BBo;u(BV7fC?Qrx#fRUU8gDnXs<{Vx~k%qu3^6efifwD2au(7-k;Z6$Y~1=ufG8 z!UpJZaQy``u&t%i5eydl-m>2Cb$b}6^}?_Ey`S-h!iw75V*iW4g6+OAP-#r4;vG?W zbd){U_P(!`E4xcdRShcT5wapgX`~jH>|#1gH(1>b<9?j_3$XJym=lTyc<}|^dkeP* zrD?|Fw%fT>d5zCNd{aL)tQ`otKyobEaM{6gMJ!BAgcrjuqH{bax)u0}w*gs2;bUX6 z2lkRPAl3oM^XmoC%+b)QTt7IJqp!SlIi+E>ccG*-|6mMw3j2(J%ltQG9yB*019u^Rmu3#xK=80Qk5_4w6{RY;X^PWN+!44W5ND7GaXRR=<5W%ojq9> zfSb>(uCJi}92mj}I9+I`rAWi-Ie|KHAR8^SC_~Cs5jb0?)jnJHt3`r4>HGIVwPZM% zl8wDsqFng#R%Jk3^%440oBNTe#L{n?ABPThlp6xO&%#OIR-pOY=sOqLo_Qs^S-ra* 
z;mg`}D3uofk|PCG-&8b?ePJi!`?Rm!imm10r1;DIi)E;Rx`)+XzHnXp939SwW*WO( z;EeA$WQ2>P_`cSf$7o=A0eQ$M0LkDRz1`|vV>I={|MY7$;VPAi?ycpq*<>G}rGOXU z5HI3kZ7gbLn)0e;UMb{b1$D~H6`*3a5yA|!hRaeuDzeXwUmaFdnZ><#@_656Yf;(% zZHX;H*n_WFCzoav!G68SXxn!Ba{_*Ueq`&oYOZQ)w!pVn%Z-K5B|Es!6Q*GXR+>2A z;JE}$4FP!|pW=C}1Dvve$vM^NtDj1ww6G-sZqH(<{Mg`AsEI8ydu z4Nrjj{j1gfpYh$lum1mky+1KZ5V428TxUhfm6Vwx?&wX*r&Ld;98--!y~QK=0c)EXJT? zL;pI8`>$N#|Mq9uim~ld0%!O#7vjB7>>EF?X+W_*2?ia$KqlzMw;Fd-i*LIX(-KC- z9s#qev%AD`;C^d&~^ zQPyX1a#W)ySqW59BJuO6bQ!_(U3&MdvHyK7mcXM62QudyN-fKBDEr`Ehr(;EEbC0I z6-mW~`>c8EncLSJ)HUS zn?{L=vg@c;k&QZbjv-t?hPZz)pw1MHBOogv-LT|Y2H0aIp}+pW@6i62@zzi=;S zIyUTQ%CN`DO(rb$1e3&8pH_O^w%3K_bp@)Z_M@8i^m-xTjT6*_@|PPH&pQ3!HBt@f z-xEgBqj|=}7+k@H1@Nn*h5P#*Ko6psJJYnaNRLlVhVmQWWEe9V->BY?Ar4!?@7p1E z<*774P3+8_EZXu8U(<-RdR1GYiX7P|w<(6hABIF>JH4oPr;bM-co=4tC(YQcYq~~! zFu1HSyK~bmtw*`?@yZ`5wvT`%{R>R||LAW+$*IW}YK^K+-pLP{akX5~v6S$Rc(cL9 zwdFn%5O13saVn|>Qw#+GXnYowL4dxl=z1z`TZZ@9fP%i3~I=An+m{-Eb!?ME% z?3=ehDrY?U6w@nfWlT8B8Zre%x_O|V>z5}f$De_zC<-cA(# zzMrueA`hwb;Qi}Rlyy143=E&7`a#yU0NEh69P%D=kn9}6((Zo~5~Vp3^S6@*K15lU z^6{7L#3AmJ9o658>fpYU0c+({mHG3tf;Vm|SfNcozheIU?8jUypOKL;wPvat~D^wFU>XMMC4InnoM@KNk~l0kbH z`gN#O2jmyDE%hd)SWz9ZRkBeP@EB4}VSt0ZkX~M3O~FVX7XQ^E15vZQ9v@MyE$gk= zUp1@kJ=UhvtFXt%g5*XY^zAylMNj{=RlANIaCmx5p$4LpTi>blPTxl-fjkXlWJ*_h z(ug2mCl_WyAL%=QbkwA*6xN6kTZ6|1TzTyrqK(Et%{69Byt8BPty5G~ur$ahKgy;0 zy0ESO{mNjHDj`NqgdbC*k_q}C;nv>p9w6Ap7qv*L=Q^>Ms_)qVT}076Dem5uhVKJ& zy_WcTKtDl27$+FxCis$%gnO^i%i8V@9n!%*H5Sj>LTr7e!5JjX^NMmFy}YN6=S#~p z7sYGzlm|zRtvPQ=ya7ncJLqhu%QY6$<(jACmNWg%HIkiEim#TA+?Eaz@q+lVTMlyA zX!!l6u^GxR9b_U`{`G7dV1xj?a0~U?x2goUImu4Gg@UV}p=g1ET$?Mw9arl{zZgI6 zk-(CB6J7;x_=^me19l@edcYdDvWP6`$DZTsf?d^Gw(rc z)$~(DXkATszkOdJ$eC5h)S9TQWqdm^`94YLct5QNXcG>_jlfXLqq|s5>D3SXXQ>(^4kO8HZ`9E)ddbR zM}pPkfVA5yAT{x?Yq9_7J*>lLI~n(!%s|Odw3w?kQH&A_2+my(qL&y$zZ7+9=x6B< z)Ezxoghc_3+%dt(Nz1c1Y&%7qVN{vt^SXbq$kO=qm!Art9jx39rImu&6u)E1P)QV<-x@9NW4{dmD{6R8iBo2j#kx&g zXa+H@zUj`^@)SQ1SJty~B&|0o)wRJyLBHhCpEi%M)bnR>&Lr7p>J7qr{@r&6$!am# zWsIs7)hs}hnN7i3_D$w2VH;`0&-{Fm2PcfwXaFxwJ{)`!h;jagNnsLy5w7R7APFiE zr)eMks)+7Vd`LxQ86z$CvH?Arn4Ind`9(*~bl#1i()Hl%sjQIK6R@@@aRy?{E7!^? zHqfm>bU2n@Zt3{txwl*S=J*~%zI6M&SU^*vl`jOB(IoHF8uNC7HmE}T1&E;qZ6TI2 zJEdkD%4vJXY9`-s+wXST^ZEO(il<=Zd<=jtgvP+5g_%Qm@IKxOR_ z1WbS*I)}TaVJj!_UKRNeiPM2hTDC0OhZIs-;A%rBf#`~VoWATfBiixef~xgVv49?F zc%#uT7^2*nKxE)s#;sD%lSsnk+sxXwa{+OBM0Q^6I}4ZlDU8>N@hZ@! 
z)vP2P$;K7xt(N9Jlf6Y{GcQr3(MH^&4nT0mULYr2)z)y%(r#EQSEMWcx)7Cs}pW9uJ8=D5wuGsurwoTk4V88*>$Qihc#9VWc_v zt-MD^zu)UZpqW~2N;9#4#Kcu>$ssFT^^URnJx&w8Hy2f5Wx^k+^SS6QK(l9dJ86L_ z+TS$|m$VQy2_v+qRLOc*?XX!A8ztMD&7X3n+3T8ddOg+oQP@K)&>jAx`~3b4@-|=) zkmEbs0@Stu5CvR^UKhuF(jkCRm8jz>RxoFg`i6VeP zJQev(v!5?deY1JI1q_0JtKn0aQ|$nRfv^4wWBg@QP zD(Vji9TVjtQH5M&&9=|>H}Q}^ND%Gr@_!)A16_lV*-lY)Xj^9wma_nLS3$31yjjs( zE7*4=PECS9B-Bz@Ldd@gaJv|HK!P9=hCjgjP!utm@X-%~9G5I9=S(5n)8M>C0A%PPX@`G&bu~#ZWVYnj_NT?!@v3%zR}?aD$_d2ey2IwBB-WX5HHvDS zIwpIo3l(rqC7#=?1h&yNvN1y(;hxky{muMpCpVfuw%su18KaXeH-n@v5N&sqB(d)+a`F?}|n$ z$8%mH1;aRu*FxDS(ti|}$Yz3aGCsBGmGnr@d^>Sv!H3Pc>hi)CfSK{ z53p1->Q75vf;!&W1mLh%unl^;@!masQ~P@B^*pm`lAO2V)s8rg+8FN2Q&@gYhjY{q zz*5-y?QdYt9st5$rh?#XxWLh9gy1H#1JMY5;`54T=9PqS4s4HLR5aUNCC$duK}@}U z1aMU_rV?r2L)-QhKyEt45^G|C58`Rk7*0)KFo!gsLYL@mzgs~poQU3AK;bR~| zDr0I6{!OFb@E!s-X$w5_Hz~6dXeEZ z<@APY=_x+e{Tqo$>vs_@eDbsJa>t zo9&g?^5Gbh%fRgTSNnKqZEeN3iWDNzu&T0BE$)fn)7-bgqCa$KXe2e)VI5)HNc_&` zVFY0l?Yn!2XYHHpn6&StK=l}BZf)m zB_jpYR46k!A^`rRX7KzoPHXB?hmxUe8hWD`BZr(zMwT65n`U1Lv@2h!+Eu)dKa)OW zW6a_EoLQsnU42*C!$8Hxb6ulr2iC%Qf6^NIH_=N{hr`@Rr(fX$)C?cty*8w!7dZ?8 z%woG(B_DCP_L6lNxT+14?l=$Rb4c=&}x(i~ zZYKpiviS!0rIPf{b&kyhXXY2e=pgPF!;eXLeD%4>vrm$=w zOPQAL7S(i*jC#lVLe}F^9d?y1wT2OU4~#4~6_o+8p?YOs;En|S_W{)DkPXUj{H|jZhfz-f8PmC-3nk*p{4OWMFaf_ zDj$uRY~zzw<|>Vs_YN|>dSZS@hd2U&Zj`3A`~IUnMihod9_4QzRV|%oQyU`EDKPS> zr9E?4d0tl~-~mBr`Tj>O=f4Ob|Fhr{y2ifMg4aCQ7OOi|)Kf!x6yx;?=yYH^mmTnt zge;5Mi3hqI)B|cpPJFERGhuJ{bH-m<-RP(fK5_JcB)P0pf<`So59bVJF@&|%!GM}f zT05_~67d98@h>{#`C#^zl!KUe``k`j&dc69!5^nR}?AiBrqvGrWM;A&(Nu5G+M@CjKc|sDn3}KnV%gQHNy9^>wb(p!X-g-|&c3Zj(I%ifBKn zg9B|4cc1uz+EWr4QD2?duN*~>KtUf(H$|!E;Lrwa?0i8&ANgRMi5h8(9yrP zO$Z2%!T1SZA-Sgvzls#JjZCi#zEaW9?)*W z(mp^13Fu3QrN_u7=_g)N^8v8&Q^xSXs@89+o6D2ZsGfluZnyDMnvVj@TFbhDM8I|; zWaSsky5D?Y;gK~7`5I`2t@iY$=wT(l}(voO@2-HqZfd+qI*Ub`vz@&j~ z(z+YhlVMHTumQf`O{`3KV*r~17gKyK#MfU&tvmd!YtrffcFLdQ{MvO zh_tb=Q-aMY2eW+c%u(toOl`BXhmk!-8q6pSib|r&LO4wLuoT<$=gLPdWoDf#A`rdn z*q_5d+5i0odgIZQ*%qz!*MVrRkhu7$fva!6sOO0=2k|CE1%}{{<{!c>2|j-F9k+$@ zT$@*#fd)_uH`N9Lzc4|R)36ao~$1NW^C3aix=vVbIe zW%$5|bfAIC(2l2t(})&$Q}$L5eL+S4XfBW>iXSp|LFzO+X+9KRN1Yyg`~wWINPt2F z05yngU>QiaSm)h`>J$Bp!}{Qqd&~zii*uLx4Q*0iV~jD@eXsn(UhPlDyhVsr-X=Dn zFNP1ysJ12^X71RzBR0EK7V~pg6R*I1_&gXCYsCugKoFYk-~tyj+;nQH-s=P>yrx5i z{e)?8hFiwEPfEr7ePlEnP5#)mo{ON4(Fw?jI>0y?5tkhJxN0J?{@()Aw{`S9 z+~HUQ>bwgI$3eF@}hfh2my93!b;0gaq;0L{WACLomAA+WM=}^u=_hFV{(U{AuRGMb*S%&-AEz)Mg=oex2W3J#yMUKvSiLt(S6V*6DO$D%w)vJnO+e~rI7KPkK)yQPF5>gpMGfx z2o^ES;=-t}xK^pBCXu_^9>DEHzN)xD5x9$|MA~UcmtD)6IqNOhW-6>4@m(^0qk9vU zEVK8S3Gm62ZOidStS^1^%K@DuW298=?OqRQaMx!S*A!6UEN&Vo{z9%X7bEACssO2v zL9;^~LG99q{)6VRIS#R)zR_(VjL;ey$dSB-!~wOhK-Q+`H_a0di_s~6i8eG}RV(mG z%4oFdQ*I8rrA(QE(EA{l?Av(tUP;&DBP{kVQ_pc z<-==mK(dOSKxv}+bp>2kW&zuYpt%&vy%biC9+HPxj_t z$3h5ZKqy5uj2fSXgjA3yxO~;5d1UaiP7U#Ps`}5#EQG*f5~-4#th1{=j6-1|4-jH- z0CJA&TZEA=SO5!Vi>0<@AL)TfPY&D_pHR19gwlGH?kgigS|wO3z8;hzEGd>;T=5xTq6Uv}aEeJcC~HmbW|KBF#ud$Wf8+}rkG_y#Wb z@DZheusHt?ZjFm+vjkN}_9z}oYSJ;HMLAMr_?Ca3fCU3uTI&;0(de8>ry85?myUj1 z+Vr_E&^$qs-<03*j4oogAvj?@g4IGdirHbL5KwgNoF&hlzgD57yLpG&FhjuVMj^7z9}09%aad0K1&0<ZYQS?p&^ziiG9LkKe>uKjcHH3xyQlZ<62zY4+JlSNZ!kSGWb)sAf^)~hy* z*ZaVwPNnC&p-K+1{DjQt6*6 zr6fvb=E&A+Uqgyj?oYamwgEvAS%snx*OK5;U8gWZAsw@12O z2kft4BmjyI=vWkBnU386h|6kPK*4XPKOg!l{9=ERSAnMJN~%`G8_lyLx3u_eo_n1L z6Hxl*Jdc?SqiQKa4(TcZjiI>Fc7o5Ftgr>AArXaU#dWWKnV|!z{J7MUQmH#wtk)sB zDP|JXia1?bg&U3D&K%v3aL=P(YqY$}6?vu`UYg5$icNqt)(B$LgJD+v&a&Rh*t+Gu zVxfSRK2mR=_ec=wPW@B{@dFUz=LCEF9V5kSMQt+k7VQJ(Eib-rl5(1!DkMxPZN39U z10>;cgbWv@fhXBv#SKp+wl7`>R;%p(NYSrH!VGK8m`ii10gf5fm9goGLiEvZ3c!qa 
zPro>KfSzz0hhk@h3GZ;J4A8b{SBdAvS!L`DM>_)V^b#hA8=hqzf#W-cFeu;J*9GoJ z;5| z{sn0}#hH?*`{qOX-KX;u8Li_aIiEZR%6rcJ5jU^UR-eW1 zUOX|96aZ}*Ce6~@m139oJ-paHCpOedE!`q`sg>syM^a7_)Jd)zTuMa^A$a{>ZG913 zM4s6;NyRd_Cvx6Kp!OuAe5fL-p^g;~nLlc+z5S_)*wc`Gaz*3rq~!I*)sRMa1)E|9 z#EFsU_yVVt%NcG3YgPJBEGNGEW`~5!-(0)PGun&?y)mITOE3o?8U!|Q6FrZm>tf7K zwI<|E$S5rnD65+}?XYohV}o+(_hChT7xpmty;JhibSx9|_%EfcHwysuel8ld7xz8U z9bceYL()nQH`RxWW=0OXprH<*t>PZY*#cc2kF4jyA`Z}e@awqM2*rD^WVi_&KMc<> z_ILEZyD7)_RF}X*j|cSdD-Mc^(Aaw9R{o_P-psfYl*6hR;SKlm#FaV_-5%MgeWVlr za%Jgp?@?KVo}+Ssg@W1rvbXM@^&dQ>htb3WloWt-ld5<5|0&b|Z??2qa2{Y;$eKqI zsbR;Xn_dVF^1s=^y^uqimcUU9iO=H+FAe})4 ztu5~N0Rk^*E?n*NQ5T&&tK&ShEjqC`>6gK-68zNBNdRrUvi%p(NcDs`>BUU2V^N2! z@yB%3h|S|6N`8{tHq&CsZyLSJkd*6agByg|w9ZI^-pVX}w6dVDF41WOMAS{7$V2xS zaLbkOd$?MEr$lui;>$wn$UG^=3)7YXXRnJd{{!*<+t+g4Q;&2^ZO$HAm=`<@xAVwY z98slKEZ@&^f7T$7AKjdqz^k~QZOknFC)e;Gqb__%Y-z#i$UlAOTJ~a`7n}0r?r$1Tx`Qt8L%%={XO64F z5d~sPGDdp$uF&P*1ZeWthk!c)TgfecSl#3Ie0$A>{t|m~hImn~cFQ9}e~E5#ez4S?saBw=rHjzKrPsr<)KVXL z>(iC@`n9JuzmLGPn9`HuHFZAFF6qTSN=Jz>OeOOmSO&Ajt%Wm(X@J&Vqm&=g&fkb_bv=LM{D3#a9JFd2Z;7$YGTXq+hp$HmsO}`k zH$K|VE!I?22bi``=ul|_WT!6QXn~`7t9=0C3h`8zZNXF=C)Zt5{kbku_+WI4`!D*U zP+>MY2cj87d=1I+ROeOQmN`nUVYOhmaU*~B31<2W6ZxD`h$ph9k;3TgI}|aV{JrA) z20+>Y5Xk?Sa;VBipL7|IEKGoIsAMiB7COS7If2Es?R8?PraO?W*MP~OB;Ka;+&}Mj z6b(sqj?1=EQ|RQvGRc<8KXjJ-I<+Li+sD?XT|BGV%od3(y+e8V3?hQ-FJ zdfoDM_eybVQa^yqLl=LR?~}UvsCYnd9crtXSUYXfZZhF8zupzOdhR@Mo;61+ThP=i z+PU)uDrY$KURk`KKLU~o9Cb~e>R+-#HYsBu>haaW;;7szaw&Ew244&efav>Lg~{>Q z?sU1VpAw()REa1Yn>@jLk&vag-&tIfvJ)FMa8qyNG1x2KYGf17zvv|Ok&heF=5Y)} zJeFB}JP~!RRJJtbiX0Wj_50SOI%++*53YmP(fTR2lyHwFqV_17eV^^k+Ec6ghQppq zn`Ol))sjjbNklEkM&gvtw_!g} zDWJZRtCqcnMhcw8;urzz2z)XJ<3u(?pSkgWs43XG!?A)v>6|UD&UcD6Go97pv!Lz5 zR<|KWsXfZvL;Fm0nr?aKo$kNf%{iq$Gyk&mmSx}&V+@xu^GE4}S)?9MIR-s__WIsy zY!LWI$&jkjfLZae92=e%+ZV0tkoaPJVp?9l{9QBgI$E3X3vViM(ou3s^&KbY+0&o= zR|ha`%KU^`L%eZyfRETv_MIHRlbjc5xWA#*&%9ZI|5!mAPh7=BlKt|0(+PcoD-R~i zm#pjE>D!{UZ-*acT&k2a1DxBBum6b^`;Xh~&;>IemIv9puEV}zE+B-*2xVG9h1&ulQIDZK32fwp%oxsTZZ?1fQ$+bf72QBw8WJd} zt4^G^uYTnAAt>Wj5V*J9>QX{~{f#08P;Hk{sQN>5x;gS0fWyq{uzxX87?D7Q(lJm` zuhZ*&-g_EsN%3bfnd_!%UN!?Y@)YO8s_^!Eu^TLr_S7dt1Y7GAb$lo1Lf)To5JrHm zcx}(5R~g(_`Vnny2TOjp=JYO6|AjfKy}shrtJcj9ir}ZiAD=6c<=LA+)dZGcFi$BH zK8K%uf#WWe1ewbB&BPcHgxg8_;rl!^&KNENkp@njZa?i(+Tczs6>!Fu?V1g$7ewn4#TS#M)0r+wWP`QGJ)uuMs@ zqUO;CC3x{qxo2)~Jyuataz1VxRPrj+hJz!4X0I8z$-~$|xJfKl`qpGpyuIWqex)I8@IXQP;pvna zqrgST^&Y>9>s{hKyn1l~><)Q1MH&mM!_OodlA@t&9tK5}jll6)DpBU_8KE=g&YI6# z7gASVZ_3AcenjQc5{*LeXTR1ztQtl(m*7Miix#8vE8|H&rq3ydsWb>ygAjs{M=^My zfXM3al=*_B#*CAd(JmH&=#H>AIb)lnt%*6mS{K)YDNfCs#Ft)nuzbMaJh)Z4Ro!@Y zO1F*yJey`p4l9@7pnliMEGghWur2dTu9b(;>rsh&_Cmpo8xP~u#VP!v0|G1$MF zg%)qj0ZUQ_)PIhYEHL88$hnJv^x-c)l&asini|V^U5W$Wzb_n=KA z?)B`$@NFK_#w>l#N^#OA=PtRy%Y48O&Z#{!`#IWhkZK3|KI%Qx*h}7S6>f_S=ox)6GZA1dHNgu%gRQ-IXq@&s=Jo${{ z@h7L8=v0~`);!oIVqSPZoY@^+DRNEz^k+8UGQAy}O9+OR{PgtY0u%xCQ^C%@28nTw zAFp0Hvb#b@?WUe1-x_S7Ctt&%TNosqRp&R)6)kyjuuNIQcNeKC4|`WPVE{*v%yKk0 zN*2XT$TM*`-Sn$bZy`X+r(!?Ima_7ufYcajY>&b*Ir6=3X?no_Efe{t=o-x0I>XZs zx0;{UX-*XY=(b>!%9Mc(T2kqK`Ac39BcaN*uS|<(hhYSP6<4LrXy-2)Fo_{3HGBU+jnvvyel|4x?D_+j=lFRc{*@iq1^jgnbDrrd3f*J)#ZATM#)Hh*e8`7wlU1rVmMT9l1oUcjrz$WScroS?7r z&l=jeyY(QBImXHI$DNo>`vvGWKTdd_o<$4K+D^2K6@AiE@>$#4W7MTwVa2V90bvRU zpy$_hj^3Rf>4?4tZs$&o8!~~mk36og*MJ|mY;VvkAIPJjjF-);I}|ArJkYBrlljm$PBpz)tQ?;<6E}C{&uTQU@l5% z(R9R%_WYTmQcgkZDSUJ8#t#i6o;X_R6GvDwOPThN1jbqGLV@H1c|>fU&dXO#?_#Yu z_bE&7V1q^QGo-T!tia(@ob626!bX7=X%QYlh1wdNX1yVKiWK-14fCP5c^?{C>ww(=8r)JgPL>RAu*Yc0c4i#JgelE5d@PhOYg2?G78{+@ox%Q8NQOz-P 
zxu9KNWgzX176Y6^O#yUYIE5B?mS2&bbA(USx^fIfFdCx__9~$VA2CtSwE;3jFm~kB zPt2DC1mC~w%WE*kOv(UAyVw2^-s1cJ{9?ez02u!|o7$HIVQi3;Xrr(Ux`inQ?EP3% z*6?x|Jz5)EUln0JiP+=3b>vLcMDr82arVu?r8KZRBQKCmZksuktZ;|u4wz1>@7-7Q z!~TeRt){lz6(1_2=gq2)j|bXyv?GIOWC+Z7CPRYDVmGX8L;mHmp!rAFr_YalE|#3~ zd~xEve{jN6j*E(=l~=&~F^2#N;&mKckHBp0IEUdjtBNA?I$ajIngSCicHL?;5$3N- zkB-$i$3a&Z0sHhp-$5K{3}M2|Wq?XyG273)@j(J$ zuWzweJ|K@$TacxewIoflR{4;O^Bn;kQ`ioRi|KX}bFTqor59XOcNkAVg}xu+^O@Cf~u^2)PKiUXC;EJu55XvD@)xm5G~X zE#T%^ObjXtbxskA6JD^Z!gaS_{7^SwclAzXLiEOTwAJ?rC4JQ1j<&AE`@MQ#h00 z;-wP#@733JXJAhpNBaD7&DEXVE_Qc<->9Cd{L7C{KNLcQ=pnw zQ0V^8tHeiL6igT4y*TMnDxdWXSJa9@!+-YH5tEc2~XO&8?$Pee&&s6rVFz zBwz2_AAcbCPxwx|wn}DoK1|Vh^?$JUo>5V4+q!5WDvBT?ASj`fELozG%0w0r5s+9) zkeno^DgjAF5)cp+P%;vVoT11Wk(_hRRm6e{-sw8;tew`rZ=crA*|)v>){mlE8*@@~ z&N0X6qmSOd@9VP+WPY_8ZcErl?W%NnTie0g_KlOa4o*!ud?xSF;)Sp8K7bCk!YN${ zs_;fhxP|Sb#l8j$es5J?^#$m;jzq()zHWn-D`KTf$FC!o-*^awweM;>87ha$o&u;L z;or6*s8D41?4F*q51<5e9vBT7Z*-1Ap{=X!R44pK5Fn`NuLTsvt@?%9ULMkrvv@#| zg*Wo4!A*U*J*}$aiQ&k81bzvaiaMW(os#C|`8n{lMrfMLy9!a@VSYzLk@mp^1ONw< zMO1y)qB~h&YY^6rEm@&%=TXsC#AV(SKj4NVVLh7Ol-Ndz@LDlbV>C2;IeKHP4OLxMZ=bj@ z8bM2wz{*4!rscGpIPhzzuF3(J^{cQNc%wARo|o!fKUbrA_znN|`P)!QwG2)}=W%wM zQX02w-__=t3Hs&`wyP;WGGe#mvdqt?Wf>^ASUQLDi9DWPjFz~rBX1C>O?K1Yj~DBL zEcgjB-<@<@T-{nd>nZV0%>7XPGTSv`$1rsPKfPn;Ua)B#Q?weJ%QH1yrR)#nPl)9f z$QC4k0&6rARVaImiEIW?DYT%zSJ~;=?Nt}~6m{0C>dPLr&Z2ei%64JgspivszO33z z#C;JL%zPEdS3$t*MQBTatr&GoU$AO< z;C7{@4Yd$S2;6~W042+14S<>TkZNrb@Y_!a zpYhLvn17* zb!{8l@Z}}#X6zX0!S3l(vQ&#}D6>J| z#3Sj?k-u;nzUf8p4i?0ZaDlPwxn<`DcFGfcG=kQ3R40@W`Yh*w4cZ#_NdwSRyk&6zZUkFZzhz zkcn-BA)(OG<&>?%1)BL?#KeQwlb6$ty42x!z5R*Av&f+?q6>zG|GCI@HD09x`{X{1 z&Pj^?hmn=knDsBfxJ74W0&nIw64jsI|F`smkuHX8k<5?C)Y88(3b7bcUgn!??s582 zVal?lP$8c()v|R?cbNYT^nBlF>1> z9|2rmg+GzpTMm;QVJo3xZLK6E*E_qz+PEV@*(1}hDmfOc87Zd3UfX)%o|GJm<}*3k zl8(ZLKqaPmtq1HeVdcg>?G=|Z+iVZQt>g6};e7|Q)fS*eo%Xw)zd?X9CiAMIN=LrLnnv7b!eTkO6f{UvNS zc;rB`TZVF1=vVbx4T760(%aPYjVAmA)zN);1;1_DM+`-lW+OkhVB~q=mo$u^XtfOw zAF=wZh&2d_EFBhZ-I}W4#q$%yyvP_v&>*UJT95E;**LGAD^Ia!&g1Q8y^oOfy-#O_ zKyMmV@kGWsPN?ize4{XFW=d!mT zmPzSReYql6l}-~k8r@aHTSs4nZPN5EeX>0f`62)CQrK@~zG%4p=I_FdRPm7z(al z#X2=Re|Ml-sA!H1osq?=%WTb%4avnH1&0~1wK8>c#n!xjj2PpsDwLNTh6F=trEM|l zn*Q3DG9Yo5RTkM5WTVlN{ch(bpH*7)8>P97jfe z^={F@m^4vTs6UHtSFpw_B_Bl#H$w%2X^lL5jU|0Dv7*fD0dYTy8y^qVE zUl}hg64uk>H8uQ)V!%aT^_4vG*N+@>EV(KXVx&2Ib0pAy8JP8A?lC$P564;V{83fl zEN=>Ii4TBAj+@~Kx#1)0f|J2t8c}h(8C|vLdF8t28zHreNZ#Ss`r*&^=I`mgjtc7k znD`Z1W9&d%bk{r($=C+)DcQ&e0@U1PG$CJlFG106ExY953W|x8){lo-OWq1{Lb&3rP1 z2j@;%>QFW_7z8*-l0Axb+-g9a3O*r}5hY$cj+$1<+5nWsclcdPe7xhrVE1(vG|;n> z-2l)>oa0S>I8Z_PvtbSK!f96WE9+K1Eg3j6;XpnfcLu9{U84+1H#6Nv)l+_inT(bR z6|KaF`QldTmYcZ}#4!HAzAYCQY{1|Dp8$e`30-B(Nzu|PY?Ad9okKtpLv5+h!P zF9#XvoXW}hY(;KAImd4?P_zFN|<;L+>R#{wXKP#p@yQew+RwlPM8##e_q3p7*9H8aisavp~!mpP1mE-PCfyyuwTBD*1E?m%sNI91vCA^i2+z6^uAB>2n|cG54-Hfg?NLShFlLEBZB zf&pr}_9L@RlJI@KLSrAe3#$N}!b@jv0Ns7ByXb08j4?m`^!TZmIlUcVh-9nG$E?sA zt&gCaD?SzI(ef#C zcac{Ot@PIl5~nZl(^9wJoW2Qmb9A6(qUmHJKVfSHm!-5*(M*jZKNot=A&?0 z=MtuB4Deo--k(_3%hyTD%}Rq9%L)>E0A1+D-1Y?-d9Nfv?H4WTGC5D4gf^W7Z#>|+d-jTB zV+Sc%5bsI&NM!Wp!Af~7@KYD;xP)4Vn$b&$kX9vcmRZCIX3V|j`PCPX7oL{|iUwYpyTTA~-I(O= zl$w-IBf9T`Bj=5>8!kn+pZcqHsZI-os!V2lD%xD!=c82Y(g>=>qiEXKQW1pOlly?5 zLAnDUcRKo9j%F9aOJVTo7Ee~U+_LzO#GL?vUgQUTcIvLHPO%Wd_0#fzg(NVK?V&C2 z)5!A9?Uitpm>DD(x%#@MkIefiHULf7pWBA*J_##dqrxm5HvkK(=*}p9K#j zfq9dVqTR^et+E4!O-|GTJZSa;UNmOoSRBSOVEa5zC^hfp_NuASmu~RUa~fz6l*U`f zelaCj_9n*Vv+B9}Tn|1B8v(-zj&T*1ux)d2Ok8#8yRtdMS#PV};u+qSE@i zJ#&?PaWcbXbQTQ(HFG=VOGdbG=FO^|x}2aIF#9(13$}EOa^QS$VI)49S%=sh4?b13 
zHKN0L{RDOF&!^zXAjAl(lic~)c4!ES{9rR2+?e=nVr;6%+}-Fxrsi#}hYi)y7qvih zi@pZOE#Q!PQEb|`<~;1=Q=Ct4snL@c5qD$RI&IE{KX+k|_BdWQdTrP+RxK^3DgGgC zYgRnoZCKx0A0vVMq#1Yd)qwYeCzD-@k4whvq?NY%@|2!(jC?cJNSq=I&5H5_3KT2< zfbn1@yIb{`zUe*Js(Q;*SJpu4{X`P&30^RuZbUXDZs2b%rqH{SX%ffKuDPv%lL);1p_<}6i*XSd6O+sC z4;_ic6c%jmf_&`^2PS-X5;-9j3Tp}DvrPJSP3Rk_b9*-Nu<}u})ws45=1!t>l}*J; zP5H*i3_;DiTRM=9YHOktxhI=T2-p#Qg<;emJQU=fGFrEJ&-YRG<<+lIv@BIdiU(wc zVuwL3RAZa-kZ=Lhq0;PS^zc@e6C-o|BbremAs!U^j30_6zrfC>fUTw^b}soEMYMdY z;fpuqg-g^MylW^5*a)1-TjY!$rGr(WJV*-_z-1?6wxA6FEbbN5^Uh6;WXi1E(`JiS zWgoP)J7p&T2Fo=J1x%5r1Q0Ow1=^e_1?zn(Hyp4TK4(Vb?xmidlhs3 z!;@t*0&j!28^(G;uqB7C))LTuejHwPvm;`f>Pg<6fcuVZuB$WkuTdb!^OkqpC0dZL zYhWSl$d&JF<_vt2_LP*+!l!xIuNE%WV>y^EvsNIp9a59*U0DU!QkN6Vz|rDC53^mFXjjE#xRAfIbF zsWYmtJV<(+91cF=K?bBfNGLzF)rxLWZ$>OT-KL@ll)O1Nj5NBCWprAb#}AF z$&?$&TX*!sg{#J z!INMWI!nmL;0Y)qnc?|$)+2E|Jv-44rI(BDqXXj%|Ca6|@IOX(Ss;<4eO@RodVVr* zJmu7aY_-+P%>8x5q>Hu4Pmn6GY5o!5mE`^1y$*|F8*cFC1H$r{%jjRx0_3XGQHg3w zDKsL803khT2}Gqz5pMgO0mt2md#ct~{OzVbc>LOrVRiZMg@^Uy>Whazyt#{Z7^%b(W(k`sT1 z|NpP;0~y|ioKXF&Ll^LS0N^|%EBO-?FNgz3HpRmLv!7EmM;|u`?$^$C-y|Bqw~@qd z!)MTH;m9@*f)`Z)pwMv+NEHI1*dM*_hkw^#7Jv;!7NG#^>G5CVl{{4>5yU>PN$zuOx^vSzwqo+zH95l8@gIhKI015p z_ZNsm`m`!Q5xG3A6r&_}4PYhSe)~tr%l~Rym8j`}#{J7v7QnB^{!S%{VMRnB48ADI ziExG`-n;KZ+uIiJBJh-1PQpcCUhZvzd4lX#nC!cuZn$S17aX@XER7K6S=6?E-RYq2 zYzYuxkV<_EfC@kN$l=12rs6l;`P#z8IQw1GYN0J#N>98C51iZ`Ly0Cb^v%KB z3bS){!{Mu@1S95f%mQycCTj`amh?-ty$`>qK2o&-gw9Jhe}eAT=c*U|_@x04;P<3s z4gu!TJH#f@BWsf=*4M9al*x9|yitl{Y;TdMi@t*eW3s>3!8~ojK>&}S$Rfuh-0f(L zh`ijEYpoaZ6BJkZi_SytTPAh9DFq_~cV+iS>8j9c?Q$J|$a&z*;jt1!Fb$i1rYfft zB|rb7d}cYFpH38R3jAlmB4L>9B(aXEn#Tvb60>n1{FSau8UmD?Fq3i?-dGZ+j|Xq0 z8dEtn;CpT4@+=+TIi);44k9N*?Jb2zzNkBfLxt`xcxBX^d&_!R;~^9*%&~=b?Bp`^ zF()HEH)P&1oxoll!r*GL?`JrP+o+XUf>lVZ3IyGhwxNVnjy0f!1VLl&t&;CeZN_TJ z4}(a?o{iGZu+~FdG~lE}F~hQ6rD{d`wKP0$#_DtJQDH#kzb>_Q`B3J;=@ewn5&tc> zmoH_L`O}6{9!HA8R!jeH3jaI##sI5^M{qbmGe&RF1bEPz$U@FqB4Rn&O9OvLzNmX~ zIO#(&dl~dlLxkV}X)9UAaCF1Jg=lU@!!w*pbvT}k)YcD&r@hOpzVU41G;G$>IE|1J zf1>Tx@!d_aH8SU3=IEV;a97H!jRhR+FMS)CiDm7QC7+IydWTGyqk6D_Udbu^opYyv zK*q%?1Bx8!0Io8aD&CCng=<8BqkJ$uq`9?Xr*IXbH$q5vh0v8$ zoP8M~r0UJ<~C5naA=D7N(yZ7E>Mq?HSOYybee>eS4WS4loEE#A%hZjo9i8u9Wk znG10dbdm)rfEgYiY((Z@5!aI#EiCl?AoaY*H1Y%UgSJ;0O2SC{3V8Q7kF?NFpr#cl z_xEuXA^YE*-di8I$kS3ifJ#N?^osO1w{_^P)ara|PhX3V7r4JS|9Z(h!?|oPa$9iG zWIpLm-iU&F=|l(209I9?_jDy%zSvajwZ*X@f-cEi#oa6^cPb>ZA9gv-6Isky6TX?ZWRh8yrMls73MEq~eFdHbP7Z1iP^@KJ~!{@bSBgnU`HMZAhrR?m+LQ0zgk(`a*> zRclk)6Joq5^P5?{5|wO52Q$rvc8$d84rHJGc>AK_u+7}y&Pwc5tpF5#}Znd=()-Z$_UHom+1X|PQ&Kg~3|6jgI_qaPUH2^{$!15!vrG4{u zd&o|7g1eBqN1{z3ODsh%HgU5IajNsfw9MI=Xr2-gIKT+3Y1JO{f8fLdZue8N)hs-1 z_uMR=UvNC_+2{dQdfKERI^6gSS&!{|+S(^SL2jL#nj0o`_!^yK>TJs+y|RGs@i`AG z#N(r+MIJBSW`A>AN1M6vtUt7~Z#xRdjZHf;#zwhWl*g#g;YA@9Q-da1VcoqS%M1_oaJ{K=mm}lm6&m0x}{_O6yziC6{jOAON zw{Pw_?d9Y%vJ-lMpsXqW1^`$y#EGpF*@s^4l|{02xp;&yyt0z-7VUGnLnR6y)r6*s zJ!@*~Qz@z#6kWSIF)$z+$dE?z?Cv6o*7M7try$aS{1M8;bH7Wv|ED6pDBN}y%XJij zdlW)s_rB-u3=I})Ig7xm7;2d?XnpkEdjgt>-n-C*VlX8Lq$a+EUzNRsHb>Nh-J@|$ zEMfzz{bfnwkCg2Iy zLpNJ=j@}&yI$Uh?;BI`MQk#jQlD!9j$hhl8+*v#h)%`~u-S_3Mo4x%|I+UvMo#KrU z!r=1S{KtbZ=X}}@N6789g51{n^O&r+E)xCTj$dXi=+_j5pp#0rTr8oiIc!r7*0H%) zep}FWoVP_ZhIiQp4?gMI+hyEU{30JMFYEjKDx3tWjh7(2`w6m&3>m$W89T?j%{pK} z`Th6YZ@S7pdJxuIqO`Uay+(J_9($1Yd&S`1aR+Cuu+TGZOe8ej%U$< zYpYa*B%4mz9O2}iW@qse=Dv^3U+;S$r@;QRB(hiW=YOCIfjEKdtU~zE#U0STz>eas zO!JDu>idfms3y=togbpNvnU-fZ?^#ZcmEt*Viv9Qi8mW=se1lMvBPh*`I?`^R;$2H zGgWp+-;mCMKlCF6t%({>F~s(xyPseK=BznU7_e{HE=L}3wROIvL*~o3Z+2-|tFxDK z9IA{-vg;oAtz;sekw;%>WX|cAW8Y&J`WmftK}+j8A0RC(8KZv%)&g&2vdM~Bh;N~k 
z(>WHh7{S~*u(TuP#y_@&{!oQE!yCnc3e0YEf{mNKUTq`cCJT{Fs+7~MrT#W?^Y}P5C60tx-T(BoxT@lzJ7Y%p zd-F+7RLJ$Y#1Rkvf-CKOPc-@i$%{W2J09XkfPwfJ0H8+FdGljK^Sw>c+MhU% zeZ5s2jw`6n#1ngJlPaFX@I*$_zmE$iuJy$^Jlv@iHia4W%lV|?m|z_&qJ&IoHxrrG zuo2jq!2;?-L)D>l`x{JUWi^iVmY~VK)owB;SIS(|lmV)(l=NwfloG+FX~{zhr>r1= z!+O&%Q{Oq}EXzB)MwYSzx(C*__GY$pN$J)US6JI`a5sJ9ZpvnUOL``&!0A^^AMj_n zNl$zM8l_Pr>bog9BjBqX)oV;M_ezEFhko)^tOs z6&mC!n2r6^M6E>G`Y=qjv@3{hs0?APO8vGWhPK5zC*E5aQ8W!lfn>U?$pBCij zC>KuZP0A1S@`d+n>9D7)_bC&RaAU~$0DL@m&NlL$SZd^gs1%Wb5OrxGn{LDEFfAh?g zG&vL}yyrSC{N$UgqR~S=30#`2ZsVouxz4B)FFCrS$Sp_QP4~g0kC^GO<7*XP&3(4FteOoPln{Kyk* z2cBH#ZIr>BETz*+yY^kv&wy|F9_Jq#*GYp3dl$GNN9;$D}21YWqW=$ zNu&g76&0Vsvp{Trl7IS?q=Tay)jBX}H zoUuG|GL%@pTIN%SMiMWE_^45dLc}4-#i3l3Yka}pWe7MYWvwvcAd$iN1ST@@&nStiX%8M zlPvM)u!1hO3xZeEu(%d&tXYa?__@!*R11MUc28E{n^|6v&N+{I{FeMob^{ZLhY3gd zc(ERghoIq2yB93$&!Slpu0^Fu-D=qB>G5zgiE}g7{TQvpS;NDf*Y%q_hOIdI zPKd0Pz&&^8ft1vGLc^U!oieN_WaTM4%PSk$n^wPqK2EMYeP9MJ&CdJ?ZaHqD3Q88m z>MnHek^;WI(kHo*($z`++D-q+dFNNL&ab+vqx0^aMJ)2-t^y$F*Mjfhr71z9^4O>h zk9gN_P2JDRS$Q{wr@eC=9z65lo`0XLymSUaDk}{bJYg6F&z-odjh4yA`p%Da3#(+^ z#(64C_7>lJGke(5x#9T3D*chd>Nz_2^MNwetEYw{dL@ZXd!!+^D#KG_snXcqDlG5> zJO#W>pw15OD3R8i6v(aAJTnxT4+^k%sV&s0jF5?Y9|IJ;vxC|w{REwPss7S}w1f8` z6} z`ueLyD`$CC$k#-1<|L-`np<+u4O{x-p=Cg&lsbJabz1zBA5r_6-R4Dvyw;Ud(Kd>^mdVSQZB=+vqr2NA`_L;YlKY-EpAGKF_u?vK3 z={xYb-3gHeouESwUOtwaL#Bq!=e|}ZWd>k~Q+9c>WrT>)B`}`W?T8t})jD~AT|~pW zw}~7T6^dfEcZ4|V14)8c{UM$fgBLV&zZjx~x4rynAjzJ*kF*BF49T9RD30=F4I=Yj zq>mimDHU#O{#JdLBHfc3`jXfUE|O@qpLo>bO3$A;WdS48rAE4Z3hvP;z3I1ZSqnSW zW)k?Jp)PR0L2}Cfn2~rfc7BpTdh`%oJK^VHu2VfEn5H@O_?YSj4G|sbaKu_qC~#|% zrTyW3Y2JscF`2&Xnx1@!HLz6jC|#uBp+2vsbeAdmjXGE2TJHH@CiO4HBL08j7$9$B z$I(uWKS3-oMk|0zSZjaLi&9lq=_!@u$oxHod&qYd@g-L_o;T_K8$(aYFi1eTVoopu zWv(7PI!h((Xy?}SB+e~3{eh2_dZ#q|k<4|c4>mbVIw7M+4{-O1eUQs=OE9VGTuQLG zSGiqu*XfmW!(mG%Jpmmu?i4{hi+*>{!i7^xXSZxIPDiFAX0og$w$P>lS$?r@#j%MF zhI#E*oqn2x_PbnC*Bg3muNwqLMrWrYzW%<@yULPa;Q(ZZg>Mi~HODVtYob52v^Q7r z&NO-X<7Xb$2;H%Orx)4DPoYP)grIscV*l~TXrj?)A`AM{`+-f*cTeJ$r3eX^Z_HQE zadjqNA3>!-$D5CyVpVW=ECgT8sa&fBb_QDY!#jn?@h?{G=0FkBdu;dV-|9Pgx0&%$ zvh|>hQDg1Rv&DS@8AD=9#>05G{I)a=lpJA-8a-=P&PRH||4q+*VWG2c&wL|v%Kdf4 zEL2P+mYXX;5YK@#^nT!X$e@rjEtlP}^iqj#={Bd%5O19Nw?tym=Drt^oD6r(2sV&N z85vFlUr58O2HmnSL1SJ@Z#+w{s$YJkt@Q8&04cLDn`82 z2nYHnDB^%|np7hIQ7Ta=P2o80_LhOAr|RsR%e-r|3}^+?pZ-s4lQz)LyR zBfDSS^V32h5;2ve(jRfz3D?_Jv`%uZ-@`iLjqKiH9%-itzvQbyot!}qqA_e2MWfMh0Oi+Iyp78Bnzdu@Y- z46_q29+__$&9LHU)_CMB!_4p|B0cx=1J=r}sU93J@#1lofmKR5xY?aQP>;7JVRHOt zNwPc_T8~(qi%}jt0oIZqJ~rN?It2;h>HOpU(iJF?3%b3B+bVgjQ5%4J5W?6SQ7$j%?hxib6!yQ7|p8T-K|5F8n?=tgw=$=v1O9v zhdmDLi(DD7%Q{;+cp4MUeHb&gLcS00xluG_Lab@X0b_(30#=)1I6kzwUD}Q)kWEWTgqgo&9U?^4kab9X`?i zTmnvQD>?^LxpbioG8r`uBo*piN7q_m`o#W{(w`uPvN_Mk{7-$()tVC1N7s_CYdq^Q zNEr`>BcM4;tHNcBZstM#a&!ccclICk?w@V{s5bEjeL|9!wjpeJqb?$!>a_dzLXzVq z<9Tow%2H$GvF?>+gELgwV%QjH@ywM<2^dYk-KV?}?eFk8joZj}yiRI}CHy)fZ-)k~2~B(~^ez1~m~~-$~-5wI|YkP(AY^ z>D)Fl$2?k;2$a2{&mh16wZ2@1=n`kFS{sm%qTlS?qPmOUz|6MT(|JE0s-IopP3`|G zt6FXL@*PV@mGo&*H_J-B|LytHk0~%c1r+>IBu!YDs9wn|kqb6>n*7bIpU4t8xOJjk z=NKA7Mi7CXyq4+fC9Wf*vy#1Xl2=yb<8)t?)Ii!+$YBG_s}U%SagPc9IOxfF71D8(roCRARhR5BCGqB z`p6`k%VXsRof_5v$m*AqAOE>L{Hxzc4Xgxuyq*t#zYwiHi7I)v%^9`2haU)~u)%XK zce0ZVncO=<3KR^^A4iJ)1O+DZ7Goj<6Yuz#)$;VzEV@J~3s(byYnB%3A*Rvu))bEm zhuH#LbGlCS*S51@nPYC}o(W_%`}j}|sspjKB6wge=|Z>QHEX?t>Vs+@ujQF0e4laU}?3JS*a6R-@{1{8a~Q zf5U#Vd%^laG4mxwrK0051EIcbs3=_WBy>Td70&FfHIz0~Yn~%E8x=)Ce%9e3V?oF0 zTeb5_-AoiQzRI3$k z>DtccEun`}A0v;g4^nGQ=IonI(2I>Zw%%W?CPHXtsQt@|-i5MdKlllXyvF8l)CcXs zTYpwyjEq&b!$els(L?J2`QEMKEvYPkax=u8iZ~Ie1pfYmdhdTjTlOyqf#Rh^$H{Dc 
ziFCjl`bvI`K&yy^R#K%Q_mN)6Pkr=tG(+7-KulW4&+}K#i3Hl0vYiBu#XWH!_AxE7 z$8B_x60A=t3iJT64H}_HIYe`aO8Blc&g16 zQ9Vv|x(SW*?_)8dOB6@7lDdqW0-VHLBho2KrbPx_^RPJRk(C6@kOe zs=Y$kvP_kIAD%ae=Hoy{nww8lzY66&8<$b~xDkJm4*E&dvcX|r@M(TJ91W;QkuN7C zbiU>Y*|8uMn_Cb`j2^hb_9M#=G5c(|^>DhMfX@iZdT>JijYM?rj*5}=Mwi<5)TV~Y z@qdfKb-4KPepwBwR8MG`JqTONL1z~d1C?G&A)sj)E~LoF$k>5 zQ93q~#ot;6p~UdgM7o~p-Zs?KJsH2A$h?h~@(xbeUq|3I37NI_Tp=zVogXUABD33{ zjM|-#7wzHw-f;po8UjBX5fo%tQel%Yh_Gfs@UHc*{*71XLZlpE)E(CDe{_dOjn))n z5a89fwV#=!+5ThVJp+Tm;$C(RuUt{9VM-yiWF?3YKR+qd^_&oWr=!(}&Bz0dQ@?gY zuJome4L9$P&E;=hdp3v0c?ijXhI0QNcB*{XbK=;HsH)1-az8Tpy}78%BYl_~1Tjp; zSlNK?euVlS5{M&)&2ZaVw@3t=3NCdDzqu3nan#OGZLcbjRa;y7l1I#n)ooBiE5Fz3 zzhaO{M0K9&gN0?wa4tAeY!h!Eg=O69$NbEnZ-$S{1i--5Q3wJv;Y00Hy%WB~hmRdw zD%)`nD;?3blxtFo1bZ?zQK55cmPMaaQ(mdd@UE&q#y0^~dD%GAd~Cg=s;)HNVi!=D z5j*jKU#zIQ1Aej(t$~7pbbs3X^uyhk_M9V{pGW805wC=-IYrvLc-Mqa zR(A#Wzq;BV_MMhWz>)-#&Mqz1?H2PUl2m_u_}LxgWeZVrz^6OnLi~a&{n3Zccfp<< zYu=W~-xr97X6S9|X)SCgj_lDU(}TC!{4Pr|IZjjnyOH2XJWc$OxUCLOpGdc7zqu3+ z(4jk~H7*nv+%-L{TUtLpD0Pkblxk7 zK)30PB=bnLg)tk zQYr20rBC|oCG4j$Hf&-C8MFpeVE}N@&lA>uRhiR1aHKKFtPZ6GgoorBz!}bo5?c@) zr^U02Hudytdr?0@5RJMqe8G(er#i#WHET%Y(I{U${hXKj7Lfm4y8-jHek+|XJHq<{ z4H*x4t^?fK+dndzC!`DQfc%dT-ee>bBZ!b@jq~gsq)*&eO;q7Z_!K?B{`m2ZI>-1W znjrv%`wmhB-Wo=$=!CYwE}_HTSi7z|>zuwTHtM6f98UjPm-Q84Nz2|_XM|R>=AU+w zCI{kPn)~Yq>~GHGt5F{;C9rleizt^Ny1tJ;l%wD-YQGVI$o>K%`$G5TZyjLXRjOcMy#a5HV$8K; zpqSOSqBw$@Q?-8fyEvzw*_k*Mose|dEhS_cFfbUghBz5-8Z7?`qzNeEYI%CDrMp-q zIBnK>ebm}O+2|Q+E55zVi{lS;zMsT#uY}JhoQ>)rFD~qO>3-9H=!Oq1$y^1rxNmFJ z>l?Gx?z~%IM(yjF@6NdfA}&+i%iCO2h@)+Y+TcMbw{tJg2rsXc*+p8R)OIE-NuJcV zAeOgUUH=g;{JRa|nee0d%nipAz1VaKOTu`XF4Zsw-+Kwu;BJ985QBuQd!m*g=5X9rGeWz zvVK}GdD{gc0K#+J>Qp-s9spz3woB!gWQa$<7`~4`=ff;VPr4#T2n8e>MQs=7?E+G~ zOZC_BQaVr&R$U1__Wl(RJhpjR$_JSc zN{zvey8W$ZPZIC*buIC8|CE)UPu*0JPkSPJRQ%eDs}S-*5Q%RCZRtNsyZ>8W`8Usq zq^yl2i5o`w)^yvk6QWKJB>AEg!-Trl8gg$lU6H3W07B4$L~(c{5Am$`<2YOBd@BH< zqjJ$zGLGv$s=N1b58j_{W_m>3fq+y&Kg~!2rbUA|NW5P80sGX_NU;3+olp9$WgjDt z(rT!ADhi{5>fldAwQk>eL6mo)8nZqsA7oCe!oJGAr|G|x=M>q{vO_g=W3V6o5;}o1 zN_bgun4@We-{W2jmr)!%J1gs;=c zrn#n}I{fa!QtHJ{ZPAa27aMUBc*aV9-MT?`OhMJ<>vm=NF^?D{FnZsD6{P(Z9hlY0Bt)^xCcTXr&Aj{5q2H;I0DdxmHJ8yH>5-u6=MtDY<% zCQ&U(G(XYN)qh~|2=M%V(;NJ4qx{z}E~p4>tlIntxiz_{?;vdKH#6tuL4QXXE+=-- z>!IFmD}pnbLq;6TNr+and@Y2Rm+t^+gGdrR4cA1NX|E9<(aQRejd(F7XKM)WlUDQ- zWOzAAGV*k3Fb$0N)>*ED`2pyMyN8exZ?lnuw!8Y%S6`D3FI2vb#5 zZ~z+w7TSF-poX6u2>$cOU(y?ZAtG%k2%Joa@1HywqO-#rpU!y#c!r)~Wy1MS#~S<5 zQ^4B7;P;=&XJ~9nDLp=piuxyLR1tFU*yS|qADherQg#m^4M3(QJ6YBZ&#`$b(v}TG z#O6*M{?73IoWmL7gFOfosZZM#FJXr#*V#fB)o zm`65JJbuMe?L+4FIj1h}0{@m|8jOVhF0fqSH8ZBxb=;@eszTEE>T@e8A|0a)2)4?3 z=5%+ZvWBU-OShT;pz+z?y%3#^_$J6W@BH~H^ltMEJBF@oB+VyxY;w9wZ+St^@`vp3 zi}IHO4#G-EMnW#cJOyAGkmETr-Afv+9j%TfHqH62D!&fPZDsfM^)9^G9$EXKuMFnp zixCuPF^Pk!sw7n~j_Yl%!W)jPmbB&-CZ+U7WvKzRwga}d|N1T>i&Z*;EnG#!0d$=m zM-H;3>zA9Sx~5h$-!GVMRHbjJNVrxa=v=3+dechWNiS}FrL8jsw?^J}7xBh5EC!Zw zS1eZ!xN153ANdjgy=^sp5~jb%0L)rHKQwm*+vln2wO zm3pizDY>a_qAw19Fig+;^*THL>s|SeHin!Kil)K`Cj+_3LfaH#h#uYg>%q5u7${zFgTUz`ws{_nr9Z;@_p)~~Gj zx3@a&?a#zVE4<7`q_Ic4gESQ{G2av0qiO?B$j>T@V6Ep5aveQ|W1tkGPm|xtjZYGg z)l#Mh`;c&ODf22K5l-PQpi|agrDS(rTmS3S<<08{8_fmhwd-|}Rh{mZyjm?tIU;=d zI9ZJseDvx#PGDyTxnovWpRgmqWxS`vcFoOXsDwpH%n0!&SL*D2-H&r5o$R9%877^K zM*eexfkyn;aGxV6)x=Lw+SfDIIl_f<#Qm2`fbpx69b2i}oLAWD^UAy6vK#zhD=*@P z)D`7E?hWa|5i!U#_zs|(ikg~u9iL@UF0`JgTI%W=w^@ES^=Wq67}e-Vy^i>T`?EU9 z8{H^5qIp_i!#!&m>cv?~7%3Jqk46xm-zYDvgjglXv)P}Tfp5vyR@C1`0=6E@B+m)t zdWPK0i^v@jrB5~F=zM`l>_2*5|Bkj7vK;7<(1q=5i}y#j1BR12ns3kfNeU-_^5ndHm6yyGtTka9#Wos9G{KzcgK$=fp 
[GIT binary patch data for asset/fig_teaser_new.jpg (3102385 bytes) omitted]
zrY4Ctph#A<0~ocOqOPmQWqT22_i{tuRjSS|N5Yx>9ua-t;JwfftI8r}o-+NcbUL(p zl6q{-^MhqskHQS)DDsoRvxZrC?f(W{Ts@>(>&H(>AaeX1X{R_x^EDKxUX*s`o$W>I zSxuJX<#~p&UL4POxV^x8)S2f6($Tx1 zZLfo@K_~c^*P80srOVk?eSN+RHah~3v}l|a*O2z}$Lqd0 zsC1?;&cL`kV<;@UQ=IQ7S2)r}mnaA&#JSC_*VL8E^MzDpoikk$z~# zd4BWEr{#P_xku>rU3tHvcXqiC?UjyJQ|LCWV@B2_LTn7u(*p|%vQPi1pLRhRhQ^c& zZ}j7OIc*OSjBc{|rZgs{NdKV(q3=efHx3hfPO8b!G0+A(w!k0CeDY8Vd>-9+%6Ucg zOr0P4eh5EGPI9e=)1a0&r#mhZKqv;KAAY)gQT3EAuv}*#N+6q(MW*ZVsB_foGv#DY zOwFK^mBS$KH|JAP-tO@CT0c4HJ9wN{3lWk=LaE*v>k{lmUmi~zPAT8#b-%0vSZ*Di z#AFzeC&TfVGWEr;75+lhA(~M-v2!=2q`LoTu-*y%{+Gh~b*WRw&;`im;=DkWoJP9A z#N$p5ZC}WB4p!|~ z16;&?++Ke$Jf3}eHE8Dr5)OI*Mq`T2qCP4+yL2e{O7&_v!@iYB3<<8nuED$5J;(jz zot%S6-B&T0{?0xglWP9?i-f29Y-_suVMjMfcI#+ald*i%^h{9X%uL+T@Rwb{NOzl6DzZg}`fWhwA?Z;e#~iIDSAkuT;l6SVcIt&HNgI&V>d$C}8It_`A_4 z_jAtF!2sA0-$vDJ(5x1yJ1fp|H+vWUl5SD;8%s0tYq~``1DB(}05GM(Jx+Tkv?3V! zLgm1@W~>v+Wea|j-e)y^9Q;m@T**own#|}bD{WacS5vs3e_4{)6VJTus($0uMiHEu zY)2R#jjGk7q|26it63a+7q283Sf*%U>>rfA2ki z>7A}GU^|Kp1z8W7Uac5heO4+nG11N*^7(-08ybMgz(MdE)J6h`Bao7s{!|gtA}_#S zHWZimCN7Z_hxT(E62H*j%F)K=9L)ob#}d&{j0@H_PEne)^n)Z_OT40`4T&=@G8ojb zP?a$9T_1;xJl))XY5#R}Z*2bqjqMwn+-u}!Pp!h&I{M1J9m4NrENtuV4)f5zV-=#k z%4Po^osF<@y2!=9OH#E3+f})Cvh1?Bw@zOIu97SO6a8zK&7#qU6wzTa>zp72I#`q$ zCTOj0$P@ih0&4%BgtYn;$sX%eTO zeF6@JEKJ&*kLr@jT#Wm?SUkoE>pK%iuy{P=M$cu=_w3+DyyDI!X*M(iQ>b%%s6T&# zYI(lvmw6UPKDIW5(NVq*an#=CY?MJS+ad?qK2B@mEJtAdYXe$RBvt$;e#ff5AN;!U z-c_0}BXuhCdD|~d(u|x30+}w<` zD(NwUQNs|Gn`r-BMJe0(j(Nn90F`CP_zSYB?1n{@ePl(2go67p#@dwTnbpL#4}5YD z${VNQ9X^8HS4bvG((@bZRDkc#Wy7|uU(95xG~QlJxD0PQ=Kn$|0SIS~>&7GC5_qsx z*DpEY2=_{};e*r6a!;_@2hJN_`bfN3vyt%~dtYylFS{p4vki!l=6I1j<*doTGom+i z0QvN^(m0hv%iC#Z3kzpC%)_)1C7|4u0@CKwL9K}HH>3ycPDTL1#-?BS8L!q_Wx?rX&<-QM zP4TQXS-ag zWDld1mDB6?w`bRw7q@IuS!T`nD0%Kc+YlX(T6X0Wn+qR&4;zL!S&q6mt+i-e-x_Ok zcTH;*b|1T%;K6hsN>>)B3VVEz)0b)A%K7X|h1F8|yeY!?W9{K!oD?p@VsPCgy{5xc ztu<`RJFDN+_GWQpQv_h;dGdwN^ZsA=RYjzgHk@^I9&kv^Tpx8XxGKE-1mU3q9;xGL zhOx_O;6~NC*Oulj92*ml(Aw`ir%t z!nxbolldu0K|g`-uGkC)C1o9RW$LSaEq)X!E_h|_j-octo2i98vVBPXNl~IE?!CS1 zF{HAttcIg7%TSTJ`pXF2EgS?XTde>bz%MuDNv1G%Zf?U;XD$BNp@_Jw0A4{b zk-2ibBw?qFXL81GAXC_&pQ7c54z@%YPL0lCTEPLZOH`&cF(vfsfL2yMSU@{~ZS$2` z%Xd=b8aik-`j`I|XxN9)Qs4BSZMuF_?NLCivd;Nuc3>>S@VND{So1K8MwP>^XyapXv-1!#m7lsK3v8}Z$%Dtp_7dK}xw{927RQca=t4!@b|9te> zF&K}HAm5y)cPb}B4a}^z%MorXjkY-iPz!Fo{&EN^? 
zOUL#p7DE`frePX#P18ZLGH<|wbJ1rte}m(Qr;DUml3 zd82>tL>O49qu_XKsP}W9$!}1v!6#JC?C8Uh52i_o>?1b$TZs?p&Wowurk9bNfViR!KnnXH_d>AURmo`YXRy0s2<<%$`u(SD10>I};Pz z15Z}Qww7Sea0z!^)oYhUjLUVY1M&~smYQHyt{g2`Ro(2P!3Vu<6rfp3nPB%iQn=aY zW)5U`SjdWJCrRWOT2Ac(mnQ(Ia|Gtf0zM%!GTe5@c9f}=tFY>QSejRdY+~a331h%6 zho2mpbVqQ>>n4Yp1b=g1P!P-z-kgLSL)gO3KYDuQK!C_VJ^6TkUlFDVqtkr$r}cUv z-%SrV2{?zl?OBN5ga1nPM#~T;mo%g(G^k!M&g_Gk`Fv(E|{lm%YruiuR{p`=7NVaslNVX5qURl3K0!82;J?!A`6qHa|7nkb4x<1Q4$Dp7xj6<#`%?9EFH2+kdryA3 zDK?*_dBf}`dP&9W#CxfoUBBfEZ@&?)=I*p2^fmHj?u{9apl*tL_*Ge3dgqw5TV^-j z{MFKKr+PWTLm@mrMr)w1{Q>1ORmLk^JY}#c;YzE-C~ z?V9YKL1j4wJ0iRbPCq^!cM4c0uYS4MaBh#CEFej!&yS5d8K_aaHFC5em%qN3iEAF5i>7ldR~k_aG+#tgHd|KqcH-ZRzyn3PB^(c_sCq?Y-sP7 zs`8M~{jWw78nTnBBEQb(>GY2VvT0o#W~XF{>Z)g_?ra+W!9>>%{F@x|S54Fd}9}SwA z$h33Xi1uu~$)j~qZ$MX;IgKgJv1~?o%JfuauU1|bng?Bes*S=~oq#ITsB^y?{fRkq ztE`e(Gh4O&{krDMmYAm?1)tg2m5X#}>^TBiXa2PZR6mlFLnBdQ61r}0pE~%_@_Bxn zZhj%M861H)A!my>@>L-wlB%{;*yX*n!u#zZE+^iZ?$0^rnz~c~(N{)UMbU&fCy8A&tr` zTM&SIWsM4hT-mGI$Iu&3W}!}H(a8U761QbOW2sFe(qJB2$vH3m%U=N^k{X<~{?aO) zdvk0y{h|BK?$aJ#?ksxE$3Ned@GIR-xZU#ENjb^L$!`9O#M2H8(ryJJKB6e6jGeAUkDwRzXc z5_SHKutZo{5t@=qFr$JdV>LtQEg_F~ufoqSK>P9K+^>!NV-eYF&Q_PpWj*CChu>Mz zGc_Fb38Z6dQ_X}YdC{(X?QOXLo<%fiG1$8BPuaVSwe?L{gKIxIdB$?SyVyRBq^Sba z05i_p zQ6SC}2E&V%RVFrOPKMn85~RG~HrgV+QlY5KvM%0tq-jEUVJb9Pl|`3Ky`~1UxYYAjE6C; zzKKm+7ZqDny<{RO%(i#lrIQE<#H+818TOZ9F0Cf8OO(G(@POKvsC{{LkNV;dKCCzA zXg|LFQm;@;Uapz2y3jg#R4JC@onC6!;R;+(2d3@!l#&Eww24c@a#xXswvXa>K3T0X zu4p3)Xo}7JpSwdux|zQ}?(2!9GZ+He4=D=+oD^2ib6Z|n|qSW z`ndq+QuO=(q_cOT-%G322cSZb6|Lt`^wY_LPa8^=ZtMBi45gneVpHR9xuiHdnon#8 z6nrqgltT8OJL(t@~PJCQmTb-TGr}6XGUQy^0Fk#=DY7y)I9{U?^lceW7|&tghyh|+M)B9t$xXj zN!Od`hasv?X*st(WyI8s*)rR2ZjZa-Ug3d+l%CJz-8cUJ)sFd7Ay>T#D=K>OiWY-s z^8ENHpBiO+}j{Rg;hnjmO8k(Hu} zC3TV`y<2@$15}UY$A5zy5Ya{FvBb+qaBF=A;`?nNV;~Mri1;btrs-Rg#79GGHOwz& zXU$}F7A`*^{5YijAr~l38~qC|0Uy8?L*CbOPn!@iIMafO(cno@b-Tp)trmA_u#Z8g z6`Lg(k+n-QR(3Ndh-Pg?&Csjb9Y@hqqxpFDGmiob^TZ+HW!mGzZV zm|BkY>Z@?wL~(hAO;2TKryJc&NGn}cE{4?E(Van-^c zJn7dv=fR&JNj)3W9E7T>hF0u%W2L!C_7!l zSq0Y{f3Q8a_^iMKoK+?`ilGy7|4E5MU(aq z$?O71sN(TbT6R-c$jXkL{j=P{k=ink;xeFec;cTV;{SBqjD_@6>i{u+*uG=UHX!tj zN;1Wa)Q!|TQ;M=lk{j6hTp6oQCMqdOoLW8@1T8$hgb|a$((EZ%Q(~^~P-(pUe}k3_ z5(v?t@k`FkV(M^+#O;H?8A{j30XjwVUc zZo=v-B(BhF98+BA@kO**G}iN$%4uZ`7?Svsrbjetkq65=g`a({ z-@s_w?_WICUC8$GRc*rhF`Tc<`OK|NU9(h^p6po+t^K2l(o_Rj{FcY#~}1MFcR5>n_21(zTgKVwBI&>}C?Ubs5t0xDvd9Ac?ge!ojxL6dVsi z5}lEgPLgH#4N{mtymtIcYYE}KONNx7H|A-NF`-5x?T4FsfQjnek)PnwgbiR(q~_x) zLoY(=GL10QU^cXMTHyW7pA6upwshrQ1h>iP#X@U=QiN7 z(O`!uqEhQ_c~PCi*&~f#K`x(Xt@L1JDq=s2E8YmqN@4FT2evBOl>z7*So!0;wB<(T zsKudfc+Y^igBWb!b@}v_MdAAtm8)A0%Mn}Z=7~z_xm#>%JQHR1JL&6ze~SS8*YV#~ z+W+>K|G)aU8clnC{jon|q)YjTml#!vmRcpb%i>ng`{Y3WWxX2#9Rf(2cXxL?UAFb8 z&N!p0U3o8~;rx8$>^vTww%*8xkg(;Yl9-N?uuQiKU*682Izk#!q7>+eMX$<>2ebQ% zgm`t94WE)<0obv0MD+aFZL&aTEwEag8YVFF{lrT~I5?f?-P*s(J%*-~h4%DP@PKbz zbM#Ttbl$ND}0Y5`bKH#LJk)g;`o_!>o~kdgU=n-@IZ~hi;f`?*3{mi-1fsc6|fPD*-!eJd6?Ejw-VbnQoz8 zOgR?s;Vp?5_}1V_&s0jaV<>lPXVKi`0n~z0cg0yIr~Jf-(Y7wHi)!!WE4{gR-pN9y z9x;6;<3UPmd?i=@V4-Ig3H~X6SvcH{;Wd%YxJ?88a?v=x~|6f zo_KH}txl^MdrCprE8pG*erYH>lDdI*>}x@6W~4sb2mIdYNOWY)k*&0Zc2@lv&f-&S zjHZr7@Avc~KDQ5bwx_O|T#XoHQ1{)1!72FYzF`(rC8^rkfzIbp{ ztj(%(t|l^0@Z&-eqVAT&?e$KnrY~@;S9hBN2hkJvp_iGMwD`JgT3)oBvv{Sr4b$?| zOOI=PdD;`e_s;-4DeOtrG~16(^;OS!CBhrg!kL`jrRvBy*T$D813rd3S6~FG!hWs< zY@5HPP_2&3jqjB&M@BALI^cTQ8hLBx12p2cs=G3E8b3%Sp$kVFm0)L+fuD)*vyFN)5!@$Ybeoe?f$n2a+a(b$!btr% z9!X|8R4)G!zT8H=9Zwa%R$lk%B*=NJYhxh@dp5f-e5W{Zggt06SKe3fY1h?r<>KLe zN00+EqEfD(4lyo!s?!lWCd1r6js2|0sHTgi`y@l_SYW2AaYp&#;p+21VQBXY8*)4K!VQLSs)Uy811 
zk46V8K2jt(|7eA{`({)@ymm@`+*xXwvJ2}A$(nqf1=*0?5hri^S-Z^K9}@cwf&hOj zR&^%mIcp(-SKfal)#FWn#EvX4M=3UW`&PhhY5-}FbpK&7QS8YZk?J<~V@YkP(vKwQ zTX+vYZGfAL>{v57AM37gH}e?UH9{O;YsRVKXg5_wV&_z9xP45Mjza1qDOnkU1g#dJ z9MRj4{*u3XTiBWV^&wE*(r@`knHal~(#DXv<7Yd{8YSr_7C))<(lyx}7BIrc>J0F( zYq2E+EWTT@mwX()D+USXk}~a06PY|N_>~Bz^%OZ-gNWfxYDR0Z)&Lgj76vNc}N|3z9G?MXB)Q-RZQVZ|7 zN68yIf^E)=U$F`ILknF~crjNb?zRN{xVJE0uFJTDW%EP07s;3BY4Rm3l-!eQJgUoI z<{upq`Ze9~(lz4nupoRb;&p)~ACU6%zxziqxvyt~7N7;^M1C+3(lAF~W4#R7?e`Zr zF2e6ec(9w?YfDvWl>#fmde9qS9Ezys=k3#KK>4J^MpEYLG zHku3RR?@qi;5UDn{sg2Wq;G&<9SVXHs$ME$_~d6hYP# z^`DB=1;icznc9h+&`q&)!I4M{{+ye<9zcaj6Yl@L@doBORm2uHp}_6hyA`7{YfaRI zLw|}0AXOke)pz9SqJi0Lbon2hbhE*2>+e69CaWUf^4!4aQtAD)M#XXy4!>>MmNeiyq2OYn{yluU3=$E-JZ3yIJ|LPu_pkG z5H@#te!a^M=YD`?=Tp&{wSuIl29s%&lDXltj*<6R=JgvoUwb^P_C;lZwv5a^vk;bI zN!nf>&tZGr*Q0@Mu=r{cYy#X|$(j7e<#S*(=jB}#vX-V0msanCK-P09qADg3*u{cm z)Pz#A{%owTbvs1@M#w$5ydint|Gpa6Q+|mb0Se@BjP7YaJv4kcg)5j-)%(>`?9mox z9wy;6d63B(zW(Z9%M;jZ$LH~d=0V9TN`=hu*gsSQqjCA3@!p=BDzAC&$ic<&h&JJg z{HI9+nmj6`=d6Y*4DTkRyhv^JR|u+YT-L^MjaT)jq+Gmi)rRkCCmq0Kp7n@~)XA{< z7^40#PMgRj1*8KE`|2_utwZ3v|^57i}Nhn5EDXtMM!&a^VJV_h+KZ-I?A%9z)l=2 zT-6m)J>lFPZK5ZNU6hRCtD>CEz;}p774BUds-qa=gRU|KQk40o&{REWlOwNRzho>`d^v}>Yw1KiC zZ?EkYxhnP6>fuSvK9m~z{Id(u$-EUlqs!nn)OggsW?ODXVVTaO7-?-7Uksw!8$G<% zIpp@i;W2BtV^pRV+qx;U^W~tv9=9RnQcVjtXIj#iv_M3N?3+so^BD}9|m=^JpE(MZnU zPkuQ3VrU~Ro=<gcKcvh#h;n-&9^&k{Mxbd+7$|Ix?KRJKIBT^yD z9mi!;O>3G}I{r8&I|rTDU0gpR5x=tId|%phq)GSE*=rF&8p=MZcNVxsiFEi6?apfA z+vg|f7M3@!bO;uSt-bnS>y-Y=7uu;HMhPh3H57t_i7{%hnD~hpT3vs+mB2Xg!LLZ5 zoSb&S_no`d|Q{s*fvSWyXwauaIv{jwKk%y;%@y zadY+eDJcpUi3ip-qkfVn&D1o+TZ`5W4Uzh;jM~f7jp(QG&gAtiX}$oeKKyb&EE9VP z@nTYXLSs9`2Uj8Ma(DldsRqMVPtY6M{%Fh$8tJ}0-aYRbT9h)f-FSE?OV>Q!eo*#8 zHY5lEgZt+XWJYGnCw9v)MXeBI_aUHa-S7?pY|e)OJ*`VGJx;NBXW@hQd?yK?ud z!GG``>}@6v0Ccc1hll`lF@zc#l18jUp7=OEya-|MObdBS;%k6a^tJ-VqeZAWgBOHH zbde<{Ocy$6isR03taTXA)CcuJoKUByyE);!_~ET2^xIyiyPl8hiRO$vD`Fe=J3ZlZ zxd{XD6_Qc6>~l$293%=%=V#C9fZJvs>%NQeptHPk$hNnW1<)Bkw)!N=sHM)o=<>Pb zB5+h!IL*E-BWc>OBsH765fyEba;5S z);dR`CwbvA+_w=}#+TwqmVRD?4~)MWP3i*B6hz`9@@GpiYNTrM)AN=7ytWofN*d-) z0m2FKzF)ag2TzT2zDB)GCWg+PN0QeL$&%Pe%nEc6v%P?w2V_evM>b$1xG9Q88lYB> zt0y}EU7Yn_0qV&uATjy-fs%C(hQj8m;{;{65u(|jI7tomm!(oQsrUcp2zXtvaj5IQ zP&m2lHzUv`(QQnxZ zk-hgx=qRPBAD}j==KQ1Z<}YI`nLOS}?o~kh|LgUm+3;WaYCy1$X4_a(U2vrcJa!q- zs0I?;NAW*HY6=-EZTpd1@!-NY*{stNCClN5r3KWlpV+U|d_<^W0X?exj z0{g>tPe<9`Sne$lg#QaURbZ#|AP|s#YMg@{uhLTAM`50?|J~3Nd?|&RO$1ZpnA&>) zWUvmQ;x!DMidwMHUrzG>ihF+nlBgdG;sacQmA#gi=%g$lzPdn{T;MSakWN*MxtO^X z1z=@YDk?@K`V9>qeKc7(lqNf4^y#b+FC|?!8h^9)SM%v0KqqkA;oPYHm zW616n$f0m6tQ;~cK0$s$SW3nZHXK(*?8}MT5|}t&&323V6{|pITu#V`XxULA^YzyX zzbvJ+MHOqs)47j~?#Ghv!s9-@NDpb=)$q)x)*>Z{%cMtMfGKH`VHauweyt{=t}a}n zFTYP+%(Yl-Fz{E@8Bx{6-YA64b_BOVbXzVO=rV4?Qc+xzeDOb7T7n9+p~9YQy9F~W2tM_R$2oTAe1Bz zT8Y8onh3?r>rAE%wpL z?$h-8_?}{K9K~n6V4t3j{SMCWa#Wp);x*DGHJ<2J?RE;$E_i zLBxh{hbMzY`Fo0e_fOYa?u*?lM)&GRZDHFox6S#7F{K~iq214+7V{_&5sj;_1uUE5 z!oFjk8$R5ELL)oRt#v&9sKygCbm@&EOFPv4yt2%9y*uhsb>)mgnEHt3Rl)R@!!85Gm6tZ7@HteNOmx#$k;yW%tk(pJ3{A)y(|PFAR=_7LJD zoim@PpR~JNFvHe)Oh%o^MBU2Bs76p@GvQ~U+lWTk5zh>On0yyP{RU;M!a|Aj)K!2g#6(6G;2?8aeMhAS zfa+(AX)0iuV+En`rC@Ha@4No}*oxnEK&baFq*eIm+?>{7_DZn6<%_O2s@I~vjSl!g zHk}BQD&LV>rru7MZ%&|HotR4kwiZ1_4qs~ejlwin5hV=mruuCHgmfjl*lX_UTdC@R zqXGZbIH`>{s($RpJvY3ecM;{%f`choSs0;sX|}%DbKm!9V)-MHG>3?)=fgJV5V3l( zIx%A`!H)}^A8~M0s9x1$Yv`omqv4(V!M_K2k2*(o#pi2R4*EG{`-f<>SO<(eikH@V z%3*)q{G(cj6c107viG7-qD8aJrDK$E>1wwlEUxTci| zmkUHV2ZJ8>rUDBJW=E;k;y1`l@HZ$`5Y7tSc4|XjSmzcd8qUEk*j%Fjs;XUP^DUw} zYnhkR(S_M<>jAn8L3{<7prr<90;u*ngeq@l9dxurPalwfUel(rYGxqD$k5YT^)ils 
zD4+GI;A+1&N?Zq~Y6|Z}fC~jPt7DrvMs5l{G&8FCc;@M_p}NHisB;shuzKvjRhVOJ zYvd;A?CH&~bpEkuFK-0Pa|T_tlbF|y)>ofA{`7r6U7Tx^cXQnV8J8cDtC=ZP4qwSu z_Sa(AuqVtI=tBnzU%nJ8Mqi(jj~e0GM>7!v@PlnA`hBlGE%w!=wmBQ}*n77aX(p+z z>dV=*IX$Y*CPZ$YZi2XcJa?_7P?vhi+N}LE+A>b!4HTk99tw)Qx7+D(oNi>6L=Bym9ItE0UycY?+kkW0_gr{-wPZFH$Zl7c)3y=D-g#w z?Fj>zmt>mXpc{OlkPQxNfYGjx;UgFmndM;%zE$8>o|9Z~skKotKOj3l5of7F z*^8tpuMKxOc}d-LAfK>T`S7Jm$8%P)K?E*Hc9>5_>{*1}-N*p1%Z6Ff5`FVfbwh-= z@I}yxfClC==%Qhi#WSX7DTte6$L!2JPbGkI7xV<6*7e`oWd2XOhl&*ug`3RymRn@x zv%ZZrc&Dm3@3}}aG>mF0+deK^?M}Loz4$GYc{p*t#P0-AQNYpZ;v)u<=+E?>%O{=y zxwAh}9>~!b5Qo%x!0j7a(r*^iOTg|@5kw|HK^L4iK-D|s2FPybNqApsa0j3}OlkiO z;!LNq9nKz7=mGc^OF~km&AOE#n`8v#lolYMg;)aXWs>u^kAQ*#AJu%{2-a0U--f}V;0%bqDlp~u|HIyUhBdWqZNn%kb`(WA zQ9uz85s)S&qMI%vB1CBsks1N%JwZT_B2f_#6rv&`HA?RhX(A<3LlX$SlK=^U6u#+s z%Gqb{ed_l<*Ym#Db$(b_WUZ_@=NNO2a*unkm2I(&KS$MRFoeO8;{m0EiZKwQm+E&+ zr=uWSY?@$jpD`PeJx(Y|I-37?^I-hD0b}jnqW?WWmEW&mnZ4mp6A~SUa}ZT2VJL88J(8sK5W%D-1W98g$C56Gf7f zVM)fbcp`yR3Z2I=_JQnjbUO2P6Y9^hh5A}EuJm^Vxu~gei!H2)P-_ln&$Q`J5{X_YV0VMY(}(=)b4wA0Cl0 z^Jj0`*HM7>(1=ktSA!0A1z$C$>AtAPrD%{-^9| z+27E=*~kK=|2@E;8~7i{nhT_^f+O;^;=sXFx6EGx@~ZR>L9Jdpkn^IYk`5=| zK}eZpNysRUOj(}(0^O?%q6K5;|3$!j3LDmd1ZaIPmqXXSqa;`L*MMz>rX^h+E&#UQ z+o}a3i^zzp;vpX%0HH$ibivQO2lwMKPEJQ_sg9*8=$OTBZ*@hAkKxZSv^(KRVAxl6 z%?VG7^VeQL;xU5I(KLeD)OLqiiukHPuqDuiH@X>&TaRo6UL^Mw4!xSznV8ty8$6mt|1TVQ+<^Se#u$n=ER7R_k z@xj+;I-bxtQr{)ujjhula?iceGEZ7c2Gp|?&a)W>)y3{&@?RDOpH{5dc^?l|*UZ0i zl}mvqxma>ZBa#BuLYlJLM@q|mN=T?CC3fXFAr~I@@B7NSQ4=>H%Q7%*ccp5J;iXOi z`kv0I0pc;;jHs}8kLo19)%^@T-A8=RyL98nJG6eP)=95h`6i7z{1=4_WL6Y<2%_Vc zAD@EyCjke|c`}Do7=YqZD@pm`)ZII8eMglSAWw5a!t9?bSj8qnUHn;ix-1P8NmL~Q zyEJ&j0!zVW%}+oE^WCJV4rDnL-l=GfffaFMCRWN#25CV}H(rcEBW1Mx*B7RF;z>GR z_RKp}vNnC(=B)|Vh9nUg@zIem6T+1eH~K6)3MBb97HWs9b*6yJyjE2WdUj0Uj)O4O zllgy9^#9E6@Q+mB{n2M=@zfBbMaY2sXnuUR(Dm;C-JJ9>Y*RY86n-p74>ABL@%>_@ zgnc6ID!W#ejH9Lo4ymwK+Xmzq=+v=D)>ncLPM(q$eL7p?mwz?lCsmJf9Es_LZ)feJ zrH@&)NR)xR%e{2uv*9*+rB~DayLQrO;tnd-72ANOopy*RhQ>h!80>U0KaJ51`zh5n zr&2aA2jM*94ju33J^Q@ z`WNH1YdaCJWq;&>Yw7$_Zy5nB@#(UZ_IE=g*T5=%{QS?%fK)3>Ht5Xvn)5-*y(~cl z-8RN7JXii{yo(3s{E;}oToA&wOX)1^jl%46Gv(;RIahB?7n+?-)!8E<-9d&hjv&T` zD{T>DGX2mocLba}NF&AFPB8Nf)#+^io9oE+rxM=|_bOT8B<*o#Qe-%_kA!|y!^zMj zp%(KyJno9G!BSSjS_t3n6Hh*-e$%W^5X5S-5A1u?X z%x)>UcL-?SjNE^1qWsg&@&Vfs-|}5tup;NIzQou6Fw+7@of_=J5b8rbvx@<6=u>5( z=5^DX!fDn;Pe0G0_Of{A*;%J*xH?_&xh3ah8GEio6BT6oUydqfdwP$yh=CTV!ge@T z)LpEf>+^3b65~=j#FubIAN^QivQ9e;RRCY0`y!|&MCCEI`{bda9CYv(OH7l(*V2W{ zB(dpZrStJ=G6HB1=3qa+Q!c6*d&4lFzDz?sa<*e8jZpd^zXEyK#O^ zjGa%ESWrtn5S9*@8NP2bZh@AnU_MlWr2CgKrPl1DXCn{UisWWPKeylV=AarqJU`OM zN%Y!=D^_T;UVQXm_oasT_RVw|h>gnd z_QG{GjFK7Z^bu8myM8XEIeSm%;STz_S(AMVU5Hz32X3}}m{W2oIc;OrYUDY? 
zHg|J$Tk~;a39W05_dXVCA%Hd}kPDYn# zybww9vyr=m&(lFyCVrfe@5%wowyF`cE#c&Hk2zIpz}(!4`mwL?QetoK%%`TK&W8^l zwK2VMn~j5Q?}eLu!azud}{?P&{0wpdS5aW}X;1$xU=*M`C#D2gg)=L?t zdYq}TW;Z*>qC8#YEQNfk=wi`zD4a<8x9hK|ij*c;cW!$q&j4Y)gs`fN02Ow>tY~w^=ic)SnceTgy~9*ZU|D zP}V+U=ymV#aTh7sV-Lv)=$MBq1sPV~ndcRsd>U!qxzAW{#;@hoCylj%1Kgc={u(q3 zS&FJvtWt{_b8%k2y|H-b-Zb+Qyj<$bgGD$96O)V{@3?icVFpg2p2Jk4$Aso7Azs|0 z<0k`R*Cb@T%ul~9H{?7;!{>sQOn#CU%m!J>{c!6ZW-Q*HhCpWio(2IM@M1&p-o;d(5_nN z+;7C}A0^D&Ms-{0$nrmyWmkrln<3L+LzIn762}ckHRJbvq3$O?tD`cIi7<7oEJq-5 ztqv!~5$^6pdXFfy{HF_`upydf9%C^byZ>(dCSif|(fF!YU=bx+6C|oq81HD24peDJ z?Ok&3)|vtap2{gpo_<>*_D80gR2PzG&d>&1fBClBQ#8X5mTWd1>6$`-6?=;zDHjA$ zhu%DfdxJj>+1#6`z)#kB0S@ z-ZhR(%@LWoQSh{R_{20h-_?)gvD2I2RNLmMB)Y5csbkKuEMG^I2i@J=-4AaOxnvX@ zxOdu~nr)ccb3i1npAk65!K);`!kr9__C|yRH5XzsJazI;p1q-(=urM_bjqwb`@uz( zqjP^;qa(@4FF2UfS8ZZq_` zr5qIf+(yRm+SnZ=7NPu_H$=oESZD zRXnLSrFtsHlOdron1fWkz3{|=;NO%H+@s}X(r_cPPE@5Mwp3@#Hx6}oi|w)S-h=^Z z!FeiT-Jwg~{ps?t9j`YTFV;@gcvciwd@6s|PUR_0J0?l~njysNy+0FR<`J9f*z;sU z?Q#T$>my74yY%w3>A=}tGx=;n954A`pBMHt6$jk#j4ak#7mTl!DZqbrmRjp-!hEz& z^0-??bH`S{yq!?g&C?oN`a10#pH7b;ozReWj<%B=Sj*#KF|WZ}l_=JJR{9R&rH^pU zUCj`D`t8kV%IR?lJkddH=BjFX(}yhkP=u4Y0Ep8Z3C-R6QDd~*wEM5_yWB$z723lw zXoRp-A=xb=Qx<>kvmT##OBxr_@10Oj4`~P=iSE-+T|1#^B#tUwMsWgLQb56W+8a zad)IGob-DSR^J|A8UF8s6Z}uV)Af*>qu%Eee0WP9H)Z8tv6r|WDL?Y2nE$G4{nOWv z-AAf5qNOIpl}=4cwlB6EmLCAFjK~y_aMN~j*us|*eKt(*iKlw_*85k ztJBCgTHywzD}dx)6fHy&@B)bI2jjnm@I&yp#dI4chHl{3%=V>~>dA>xY(EC2krh_2 zCIt+*W;%Vy94<~#(N;GO102rsti9W8sL%nfv2k9yPqLj7VcXj+wgGRlbnsQa;a9<& z!r4X1OjFU^NBNPGpAQ9teNuW9vBkCs-$6s&Xh-&Enj2!^a@=n7R)#^9Iv>$nJ--5v zA0j|ndu0SzDIXbeQiJogGVr#lG^6FOO{)q^6UOTkc<8wRF*6X8m`$Dp1Nn2G5O@Nn zh=-BLd3cMBpbK65#d|aJ_m>+7NRcYvSTREO)h4sODXET{6vQgjNoo~OrIPPGakAzksU3J&nP{=GvABC> zOn5ZOld>ImS!zXwI^KP^lGfy6-!$>5GjH5@X~p0K)T9h)%BUo2>{Tetc7@xhAdkUg zH0Y1A%ZPTaB|433##$&V%*#IsORzKuNxg^fXWZ|&bKsFeLXy#ofGC&;h<564ysyMW zD6Kf5xPqiU)cX{dhx{h!L zhmC1J9y?9A7f_NWYRjB;d3U>sMf2j`EQP0{NZ0mR^;++)W@lQagoAy!<~YF1XksU^ zdu7k2`D;r#Nq}s1Na@Bqcj7 zNCavEz|YQ(b8a#qo6=Gw)DGx`tpx)@_cF>G!;R_?)*K>0*qx~y#5V)kgPX|qQ)1Pb z`?8)l!Ep&R4$@1Oer6b;&&fp1Uh$@Hz}r|VDB!Gp0U^>QWh0s=MP~xHI0)VtWdMx! zlQ`D%Awq*UHS_n^=p1aQuP^tpjt2F*vnVFrYq+o)cJ!EAecrxWr6USe{IPk{x){I9 z`(4hz^nAjbedt~2w&PC%V2jJ4HNt)}#NiheMqx!S&i7s1*3To?61Sr3u@u~w!ce0b zj!)BU$eh~bZx+FyyqX28PLw3Y-5d~S8y#KrLS+ z`;+!nL0fEI<9BUm_$ni~$=0HzMz<7OiECdfjVF7TI<8k%_bwY)=xn6D*F)FY%<6k< zNHG?WskE?$nq#C_+vgkj8{WqsaX?sIeG<{M?)x-;L8{egQIvGfU;@M2?@DP|1dCoW zPQlg-8JTuda{3jd?<^ta%j2w_CYV>)8H+$ zq-oL0iUgSt%mKfG);VF`9v5%K?fZ9&lTVpyguhYNXi70ULbIr2$#=M_sg4EmQiL>S zY{s774LqlA8hPz=DCcp)C^n=jc+4$=C00z(f*PruQeD|#Djxy^{UcnK)H~)ch(J$@HDi2${M{( zr#!*FqoJYExX>bD)@o z9pC-o&dJ@26?>V>G$7?-fwd6o&MvU+rAm=5;R0E}32YW7=`bGvcabTsPm>Wu{Q$B* zhI+biA43+O3F@c&qngixqhvuI!ivYkpL=hX|L)S7#v#y1FcC7AH0ACtPVZXNgm)C< z7=aC}Ie>s+c_20wOSjn2ez+g-Nx0B>exDH?1!{KHHMq9ek}xvejA0N03PCkZKiFcM zPfKSk&adX$s;$D`Vo29GX$%sS>Di4+twwsoW;A%fH^m$PHTkGn54C&Xt(Q|W&^zH50O93qHnE&j)}+U4FI#(cm|i{PwLHd|~P zA}at9Z#EBznxb*s<8J$HpA&`U6gH|S@>*P$dzV)<(k1a}BAJEvOF-S;DCD-DnEQ!m zX1im$<3phQS z^Sd1QHo=H?00|#k@}%6$>WEp-u7M{mGin@gb$)QCN7I2i@D=sSyo-#rkf(xq|EI0re?R~KsK>Ky0xIkcUBHG9pGdyS?|6ON^7PkF7>4&_q#0o* zX%U_koJ9PQ5&BADk)f}XtsOVL#U{KDo?5)Av!}C!ucZ?8hB$S-+F<8VQ>M(7>RxXt zE77a5E#85Wm$^8{d{r+p+JaS`qzdu6)a4M&Ue0>xAt3{^SP)`!x{{|6U?grD4!OEK zqdb0JVbey?9USUh;lFJA8J`6a;_rJWsyIgJoZDz-b#)i~c37OUd$OO;-u130MO}Rt zhx3BeO3=K#UrIaN&V}iZ(RuhftMSJrP)>S$_P??n|9AbHD;?M%%ulK=9bgz_56oUm zLwn0z;_UM0W1MF9iQd4}LyDVf59}zqKesa1nR%n2)R&QV@hPS@=c~8dU`yShM1G!! 
zGH07?J>7+HQ-|$8fOf?}x%7c(Mq3rLs_H6m-5hAw+kyLH=qi>3%-tLL@@C8u*4dHm z6~Xz*@3FRmoT4m`uPq?<*G<2f@D<-{4yr)Rm>Db;gup|5%-Ith#RXa}=r>VE0e58( zJ1u@(YUDBNXi~j7la(R3F2gR}(_JW`bXe_jpApG8lpkq!h83!4`2DF`LJz>k-?S*E(f~mm&E%4{-s<70A-r%y=g}XgX7JL?U*lf zM`r777fQJr(CdQOwOoW=UQDP)NfomUDLr)8!f8|M!(z#vWSrs)bZ2=WZjf_4wLK{H z`TzmC+TMNyO{ZbuGR8Lp^?c7y>eN^vyCK47SEpzNc6i2W=m3g5dSf~Xf4xw62DH_7r}Bfp}`xG5=jjeXG$yD7Uq(ATfZ1Y2_8Cm-7_%<6sfY; z$XbrKknD5inRGo{@O_{2F}lpu8s&X-#5>aCfsN~;jH&bVTu!p``{$uE%zWUG7X2wn zE=9^254{KYh-J6fj1hGk-x_h=h*k8M(SkquNFx$TV3@w2tGJ<h}0;vEd$h;lH6+B50b83;_3kuX~R)L6MI#UsZC*X$ZWn8HavMsy{e< zq7&4>K>>Z(73fDy{nY{1a}*zFFa1Nf-6|{@@)o*3$*m5xSLS{E9q7t-nA=TZ?oRvf zYBK3Fu!98#nYRW9RZsa8Ugm!zs;n&LZb9jb{nLQ~n-E`22J{Axlcbm}0-8WR5^3i0#LZMb@LD2sM(VhP7bN~B7JLW5v(&cr= zi2&4(-6Na(_*L%mD9_Z0GxUM2YCQJCS`FpH8byicDI?DZS%)q85pF@3;q4=vM^egV zK4W5^fZB>(&XLSP2T<$5-pl>BLMfZu6|i>|B%f ztoQA5Rw5YsSo`-?#d1yQ63+N5?~7i~CgJe6oXs$^54XbHt+x1@%7=7 zU69PNLi^u{J)u z6IUGE+dT&}g6-hRsM1(^l_?S7wsZ@q#`;hQYN420=&7;!^V`hIAU_;=mH0!^Xr2Eq z%eChd$Kq^0mJvj|G+?xT_6PKp3!JUPa=6#3O~rUqnq!fadN$~gPKC1y^8k*L_Xz3+ zJy;4pt-l0aKWmGcih-w`H(tCe_<^YtMqQx!_0f$YDY~FeWd*YFwHQ3SffQ{m)S#&W8`0w<_KjrqA4$?kx`9h1pm3*m}$~ z%HWCaHOmk^wpW4to!BUxYZjFu_k9tE%LQSCkz=cz%#`<6zm4l|j&UBNU7u#D7w1TL zGFx5@ZYUt?g6%xPDY<^Qg*6_*(#ECnOF`Tw;Du12;Tz+T#@*7m@!`3fpO{T{EbZbq z*flc##p-kNK{^O@naOk3=bMffT@4oB0_aQ$$pJ?>i{JhK%OA znp3VIDu4y-$?vbas59!x^m47Bopk=7i)6y%;t{UPV{IPuW})o^?EFKzcXzihK9e8W zUc=0@mCAY26AB6ki2 zwKuQst(aixL$E$m_h*57a)3d61#}1iNkA7h1!6U<-lN{ZP_i?yBvvS+{&4c%(10Vc z_92Qh*brde(UwvJXUQ8YEiT{Hm=1jh zwYtd3eC-r?bN!$h@}V)D(~pDNj&9O8-@!Oub4U?BK-l9p{Sw4;T1FTa8ts)eFC%+v zh}p66eTHBpHvj+*#1wJzGEi5DkW)a<;=}d_?}ut#)IOc9JMU^L_(3em{S~MF8ss4B z$1rz=H@ZQCzp|^aHt^Q7S>!wA@CR$no4v^1lOFQRl2YG-j*u&fZuO^h)nv!^`MZuh zaFIP{d1OZPxs;+hOmw)~JWmdoPw)RIf&4FATL{Xi-VI4T&yMI(<0|XCt5K-L3N4=W zp0_?==?DU-0ou^cs@7T7^pwwSqDYM)NUjNQ~c z=Y0Q{%D(X>%hOFM7?W~Tc%0^@YjoKSci)&(#)h{rt?Lc&@EQf0(pYkk#5e+4*uINq zf2D{=)?vFYoZVxSv|H^QNA#*bD}^P-I7`xLz#cRpq6BBNP>%hMBTH>NUVMJW$8oxD zFjV=rwic2f2?7`epoQ{I=R)h;>(0YR00D7F@#(G;?2ts&C(s_yy>$s^s`3q69o{5C zFBn?Qu^IBVfXINT-mvG+;7Ym#tG5DETiF-HH?AjEL$Yci_e^%B;^lt1@FHq*9I`Cl z_5_@!#H;nKc_k<(ARrY{9#y=n5$fFuJZBdX$E48 z?f3`WJUE$Vq~T$>?I>q#6qKD`TIyzP9qD;CDSr>Ukj93YisjCGFOm(!LIT(;1rxT{xCw>65*ePMeC&JFoIm~c zy7DN*+#Cj}G+9Dr_{plP90mY`pTx`;|EWJg7WCkjNmjdg3JPyA&kYQ%O3pmYgncpEDZ|t(0((xd#T!=ftU@j0 zjFAix9WKgRLT94C{gA4{zG$fI!Q?RTU8Lyl*HuYBxzasT{{@l#b1vZj4@3{nZ=Fd+ zFH%OOTyYKscqYBHGO?xwi&i^_a{=M1@L3&f-}zaiE+5Qsu*D^;392&m(M4iGcy1`w z@9YXZYZD8;J_wMQ{!6ajwmOC`xe-koujOxgtk55R%-k@SAP2qw>4eYcrTfZ4?HmD& z`^>N)g&c+vEo^*bps>;Xh<~0{n?okYMkvRLsnbTJ#x%>*!w(3()jX`d2r~Reue6#1 z{es`g$Bm!H4a$Cb@&+&i+wI|kXeuosCG^4f)CkR@_OJx~P{b*8*sd7oCxs8t zingnU62b+#xA46 zy#@TE|Ht<-g|~ z*!nvrYz|q{VJ$&dV>ommp+O?wHBQaYZG*Lik`m`UsffOsTSS%Vy&IB7XC7Kq2U;QT z^q>K_>g|(%O2htHjr{lL|G6FPe{;UPtd9m)ES9mTWZ)O2Oe5eI<6C;~4fbAyF6Tq@ zZYW(!Y|5fVl{l6cKWpitVh3ejl2)7x`rD6OnA~WR&(N7-x6bgdD2uyKiM62PYS&CrkF5tTIai^THjeclYJ9;~_a=CKyPy?yV5LZe7;6R9k&jTzX(}Wt)#c=}J;9rI&#e<( z#&$Q{8+#LyMZc;Y%GF(xx8eP1)wo zXCFKx&1Z08ak}|i!M3nDjeppAV~fqF#?IdUPOFEJSlYWzW4J@)m#%bSbN2!@$>Izpf7OSe zi^?t=Exn`H@sgU``NW<)G&N4NS7T$Fh==R`f35ChJG0r?}?H`V*q<_r>dZ^RR2YB2&qwgl~XVxU#Yr^D2oe$8XO z-%wRQ`nK5ia=xJNe@a&+`Hn`0vi3V7GHC%rMju*3tev?|uP*U=_)p(qUpMv>TIsi) zO*!`9tsh*!^gBb<_F$@wbER#>j@{83VdYCkVk`c60gORq{M;RkWT#N&CdN3lCEWn_ zbE~Oh?*M=Qu3X|!Y^XY71qB?kU|+@Z4Cmg%JkbLF!k2V;H=<_YxeDV@K{f77KkVm} zE+f}xKXmLQ>la^TiN@ZkI9Kd{DmnG0wIVb%Md}S18tE1!ldYH#947MNUWj`bWqbMD zrq`0cjn+{UwhWG=&#!UJLl%t3&ab-WS%oy@IxB0u#p!(=!kII!FrTTJdyO11bZ!5v zD{{4uK88c&P2PH=V?7FMw;+PgEWeF*C^CPU$*(|@CQ}x=NvLKbe 
zcJL1}`KCJ`84ri3M~=otp3NC&3T8ihf4}N594GuUCsxRiv2h7i)aFpw5T}dK_+4}? zx+qS@w@y^mwtF&isAiozOG~m7kO7So{woKKwGmWL>TB|KCt-oA;)6HZ9AvI|5{7a6i&YzI%H%(Z0aR7ynM}tL zhv3@M%z*r+6zbvn{yp=C=f`_r^f;!yyRO*N7L-Na1y@`$qT1QOjfYFV?h7P* zvsT`p6rV1LQO0zZWR%ORv4G;Y0`30;S`$g#bSL;7%4^P0Pqvt|Of60*ca4hHi zMMx&FF=UV(#OWPq3P7(9z%zpIbd8aX0fDo6trN(EwT@-Wf}tMn4DR^*2mSM+F0inoA3iqk!nWd+G>Y{aCHJ0CcaktN z*Jt}%#+1a*#$)QLL~NXl8c(hO{Z9Ga{?$-z`2{UxOVD8&!DcK^P6|=}-H=|^EqSh4 zQe6*Q+jA#Vq$ZrEL_WJ-7cdz|kJ@42E^yA$3t~RjKz!b_`SNK!6y$P9we|PELKF0Adq+e7LBb?H)TO&%fB! z&tsy0PQy`oytMbd^IY8(R%FdFzek8PfnpVo=IE27hZ&;1)%rD>%;%J@9Gdg=8H;a@ zgjXo902rZXr6=9#e85LvO(08JfAKwk&wRL%awcfLtJ>mi#r?A6>q&BQIeLb-lD%$p zHhi22?z%=G80Yany&g5*4k{PHDt3pw{Y%keKLQBGB7=1Hl#+KPA+qWzzVz(MvoqrRgo7{X7f7P**kC=~W<~SukbR%HGWlH5ZFzxI zwkN83L^MxmI9X^if4A_8_B3?;Gggp*y90lma&@Z9_G9mJuVVxeMbX*&pZ2p9=!dm@ zE}zdW^_5NJ+Rp&>P)QmZH@`SwzVPf%FEX)c9a(SnkY!5Y72pp6Wns}di<_kPIrcF< z83oBEtWAlq4Z_+HEjPij;5)IyyVUs@!xd_-womotNSeIInG(g!<3+|QEw62t0?B6fIp{Dnb*~hbyc}B$ zeWW4V6J(n+YB?9rr-=RWm~Ee+ZRYL=H+(-&r%XrW{ zzc~9o)jF{j<6{R6-F04HPwJ!oQquxzp*q9Lab#Q22?xInu|R)UHt0}An%;Mo|K}J9k`P8v=*II@l*j2WL@_Ms{9u* z0giDV#?gqnZ?tk^o9_O7j}M4$zQ}igHa8&sU~4dPJk^=zNyeLYQcwcLGf8?;#3&d0 zn+eaf_xn!rzB=5v8w>r!49D3IUI|XuEU#{2P;uoTeS*81us|e}!jImng&hz-)v@-lAv1@JC79OpfSQ?|*|Lw>^U<38=6R8BM~W@$vE2^I5orOe;NR-z$&{dS7*_ z27My#nM3jm_3OY8tO?b51t#>*4ieG~`5nG;Q%z@!%?d;;It~-&Qh$GKHtiS+*9{V1 zTWpU0h!ZGG`U)EJems zJ05q9CyXoU6cpL4_35l~)A``>m{R*^E*g?f=Go|Hz?z9Y!WxqM#>u*1hD!|+%uauG zixQp$ad{Zf?b;DtK%-2eay!=(L5FujKNJeN>Tt4&6I;=BQ2 z?!hkiato7^r%lS-`+u%bh_F8J@ zvYw~U_)HWpbdwv<(@volnc0`mVn=^)TdbFSw;qxMd!Fhq(J18<+_NC*chhM^MmdJ1 zkMHo@u-F4lnnL3*Y(`Qq0O%0~Aa4o#MwsFO8v!+Mm~ps?YDXO9%R1;UK8*XI+``gQ z#%-?4Fi&DQ0G6?;ivmxkE5RdhMOa3X5ig1`3IxIlUT#W5(DeZ%%!?e!L~=5ZCD5Ru zkQLcH7ALbpyq63oi2|T(o#WeR)+>$uYnXENzF;tAP&^+yWx$51;o4)%#OafB9m@Vt5Ujn9I$>2147m@LBy$jzu4XQ=qk;4D}i zIHQL-#r}GtBr}{V)XpO6ERv0;uXj*4ZW6kqB3Wlfs*~djE`sg0-(?+kRxL;yD zV7t|LP^70a9|7e-H_{!f*$`&N|TAm8D~}2ORmx$Tm!A664t-C#z0& zi)~hovY`Xd`TeEKL{3JO4NKx@|9Mn3^cQZXfCW&7D)RN|e&r6W+Z^%!Ok%alBP zVKE$ZYXo~_T^CVi!+(Wo)V54J&A~wWL0IK;(Dq^6g`71QfSGTxLDE3}J3j!uT8eGY zX6^&n*^R?CxW#4WS)Ioq9$1Z^+&GRUVERxj-6|ZF8^of&i(tPxGyR#7TWk|C04!l+ zfn{!&Tv4+HB)3o|4$?hC#H?fD)op)^-NSjI97i@z41-Z(kAdeT;`ZgVkS?>dD7YrT z=$)P(4N}~kM%}=KW4_z4ox$`YnsB5C4BO6=W$`VadiA>qTniaVboHjR zzI}C)==g|G-S4T0|z z&O^>_lLJmAD?_crPwO)V$)B-JYSwLOG72~K1G1fD=>7mQsJ8Cwv0P{!gb2RJz?;UDuw4~c5RP49Grm7UnZB6l@|>1%uQ84f}?Fn<$wG{`>zx8 z_k^xOzh!-0L2=_AT<&4%R6b=~zW`7N6rf>Qhb2{;|?T%vs#zs>0{%Q>k>}On^Y060iDKBQpQf z_xX=B)X zA+}jP0Wt80Z#>daYPk;Uyc`qVMTuS6h|1_W5K@ zBKCH;+3L!Vf0J4>sDpp>OwbFm8GFr1zHGBghj`{4YNmo(4WTN~PQE`)_s&?#)iLq( zX~EFRlV;Ab(bGB;+}w0qCz2Eu*}28$3yD~3jq6f+g<6r4%`dz{f_^!`9iDRa;ahWM zmh$3a0&e+rITK+QhPe}*4Lx|ijl;4R)6!8i|B^cU=!E}Ov4baM$0Ii3y7QozDp;I_x(%F< z7?y$o$Jtiv9{%%YzSNp{(81;-YNX+3vzg*G%my=z#gd`*vc3f;r*@=kIiL`;&{aAI zlL05E)8T-E0n-Ov9{XA8_$~N_6~DrSA?e8IL=u7jIqOJ4+SHs*3M^rNW!j}%ae_~q zq55n1Uw!fUUp*{i!H)5!)uRNp56L&%N9PDHdg-B>5xW<4E;;{wmsZDuELDJ4`cqd+ zl*z_6119zzNE$CdeuC)HzkVV!wvt2k@4x2X7;48tX8*%~V5I+k#GhCF|CPD0Y^)x| z?DEULIZf6Gv&m`UTF|L1j`lHazyDtA&MyDYd2YX!rC&*BdSA;&Pto>F8Xw7HoaqpD z{tafyyb-gxn;Q*XAd)o3H5|gFccIGe8Zmb^1T6nGHR_~tvEZMBh8-Rn(Ipt{m_$-R z7R8{lm}PS3(=-!Zv(u1;)h!ESwlw2e;fru>V2b?}ZIj8tO@arsZrG#xu%q?;E1<9W z8_+Rl2EyFK51ti)JklJ0FekSjHD6uH*}I=rOj-Y{Z?##-GNv2J<5^rmk{w7+nmuaY z49s)b0`IkMx1{JcJo8BBYqo5-P);{O^)GZDGO%gHV1eiwiihExz~>|WqT_8L?$6ss z$FKDiQ-HDO(S97%MmJ=LkR8AdbI@_q=;6uYID>teY4Bp*j?lf_0{ZE9S4X(lbpGng zdWrSqHoik%!PCKa#xO0-gGQFDfNg%sG&gFC?JgvajLPY-dC6>P0glL<3xxTFM$Kc~xr4SUsS(fbrWWB^DIUhiHKFgI+Woiz_xgih2nm6J}xjo&EV6C{%} 
zP#l*{^(|@Fbj+0f8XopE;a=PUacvj#y6ZZt^IFK|=21-8Dk|^$d2!%w_ku`}LbhCx z0u3TFLTZYv!PypZtm#j0KBw~s-azsg!^=E^PSH5Uhe_?>8c#0bzV3bhvvWL~-g=c% zI{LLH<5BvY$%1eXB>dtixT4Uh{yx=k2Q5oX!SBAVOH% z`o1bG`&Vk7mqnaP$o=Jg^4~f^;6RQIJJKzA8sHlzSpcT^Q3Gh+yJahfzmVDXUV*xR zrKiDf=6tU;JA9~>uKIRruer+e^bpOjU&WuciE9S~$TYrXhQ<91Pm&oQ$%E_kTdOavh;`=i@d+&{^F-=QkO* zZP1|+z6-M%HHdB_!^?YCmmZRMs3=k|)&c97Vpv_w!blqJz`2zLKRfZQMEFi7$EO`I zz_0jc8U8tKiS_zUDSzV}=uHC_gp&4SZcPP4IqgBAi8DqvBJqql=9Ra-GrrKYJZB@N zTeu|pIv1v`%^tRtmPt|?rRL|BrXB384VrvYqRSiKz4S-|UxTQxx$=^s`l~IHrZ^zHAp#HlBpO%h`Xp z+bp0VAsy<*I?C`PaW=HvPT?l>d*86?oL%2>W-XxSaC(^ZL-0Uqh59savy7V1Vl4W(b(yhm-DJi!)he2B(AqTTBOK_N85fsGkRMz zt^HhP4X5~9Znt##=cUQn5bZ{Ho&WY6J(LBSpyHmqaXqhP zT}M%O9!}uQF@g<&{BTl9lAv94#h8WzDc6k7qYQy3M+Yd#?jI*F_;2B5h~np z-CcY|8)$j0fy=rd{PQ}fI)M56Vb+J-G0a4qNKSw4=xlq+^Tf=Z0V}b&=vl;QY5k0b zEJLAFL)4;ANLsJMpMb~jdilc-|FEz;5qC4=8oALF;4oKY-qZ>xjVUoI#I*@q(ZT$ z6$xJ6%95W~cB>rKz66Z4@(eZg$^vIDKE&Ug@IksYH~AYbt`cCOxaFYMvyd6(PTrEesMx8>d&9&o zx1eX9QMd`VR?AS{z=xrEX@ldxb-L6qstBnqW+KPo}cI745{00L39u`=Ak!O z3Nna{l|8g~wr-iJG1i`)x z7_>r_^aVvi;5^)sbfTrT>Z7{ zm~GfEdZ9W7ann7ZF__(ymrtYH&9HCuYyd$y`0?Kp?6eua(B~0$G70Iohz?Gh7hap* z5g#{YTB+6z4B z)tsww#g?|}7u*xlZl@FSS~8~aQ3lisn#qt)p8+D18>pD;x3JWF+BHx?10ON#%--ja z*%-L;ksgT}w`(^k@sD2`wl8O7l*BVTXrAdhmcK5>9vSgq{%{1#NcwyzT z^a9Ni^j6cKqF$cggqdW4f^+(W@Nrp)&!U11# zV2Q%p1G&u@B6oAv!JZeLUsgdTomQ>v?0WWp%sc->sWjx}J=f)?P}iTfRkw5CX(~OUjsv0L z{k(vjFk<|q*ilU^?^K2HPPHGSKQoVihjuLwTn6|`BU{;M=fepY!qD>p9%%VgIIgY4 z(K@w;6vJp92}qNUZ%MVs?_I*q^Dg1*c~AfBo4KB1(RfiCAJNc{xV>(_GNm|1&1TfY z_cM9iwX%tU zwTdqvg zg6##tHgi(*2}DZY$CmNRPhC%Ef(*xhE~MhmM2{+d*&rDPt&XzifM%obb2@{<5;#9G z*kXn7g-k$$`S>Z-D_z%Hw&GCgbyoD6P_>5Ft)b5;)h0f)kkhN?=E%FP&`U8th6dO71%27Fs-b;ba21WCp$}vCSZuT1`WV zeo@Tryko$0-g-fcwW0)1(xS$})>~0GGnkn`p^Tt;$x_wu{l^^866r)Qa!9$D#2GH* z(5icOA)`XiO9=evhcwW2GBBshLtT}LRwzh+mt-*YbNvQGev8F{+IIcP2gVCFsm)DM zy>S&9=oHwz|5C+}lgUgjcJb`~(be|=j5tqlY+FVvv)Uq}YunANA8Qb37fSmct zd5oy0tD!t2k3g6srP?WE6Y2QTRbTT+)-@BBlOtEy&<+=#FSv`QFBVwlCyr<^0G7v2re`bb;DD_0L4gjzMnBu>}(l~x> zPsEdpyEP1=_1nSj#KPkcw5W`oa8gDc^191K>8*GVE#A_E#~PNLD1%8U+Gk%y!Dt7# zD1##HAjILb%+2i0x21t9)i6q&bX)P9td%ugDb?d+(QApD(bk0c@|59nqkhA=#bf}sFUiDuBXxz>|p~#oz&r9O=Y*$4E9gq(~Q{0esB$|Y0HyK)RIKV zb($}P*X&VS{fMXmU%*jkB3weSx>|6ywp@pyj}}{^)bv(+s^yB>JZJ1tZ&z#6ZcX6w z5H+T_84i~uiH5%s*VAj&YCVgr;|rYs^r;>}MLd_0;dE6SISw+g##cvOp?ix5wGmX$ zz*EDkE4HsuEY7BE2|E*8Dq;IWqI4!5_USBgBA{g+B42W)ob#~&GkkZa4)Cp$6<)&U zjmS?87-fAXP=CQ?@PsazLeS-w1qIXVC=rTe5Hm4?`kuBC$+rM})q&52?A(~lq0jjG z`dQLtnwc3FD*-gE`Z4#5gKI^KA%uU3$_)1^DcQ*c+n-Y)6dL%1K!7Bqs2_8fI)h+S$1e2E{5NOfn17L&KM3UNUAV9)%* zL5lt3ENkpXY7FD6ms!Rfp_odSYMQT9V4f@`Mv^kN0ZQ*%EifmfqAa#BQ>~2Hqk>5K zhFFum?K(zDuQdQhVZ!B2*VJ5UW|^P+m2zT_!fx*sR)anvTp@Xfh)B3ua$N0M>E8&W ze+~2cYxkN*Z}vt`j>TH{X>)vLSStLcgyL=O5{>(7go7A?=^oxn@pfS==E()Yp|!8~ zx{sz}TH)JQ=U$?91TrsNKGbeUz9K@&v<*su1?d7*($0$b7N_`1i%M2DhR5t&pH~5Y z>nu;{CIBuFy-WwQ@EmezdBzqk(bbVnZaeX6M1y(fgxKH}I}LjNCR44`9|zvFCRBG` z1!{R;&^S|P6t&CivaW?)txw|Na}V~^FZ7&o!wsa5zaM=su7Z3#i0Q`dmbR0tc1{~S z2A>svDwi@P8p*$!krk4J&}k5s7@33zzzpmisj~{!qLw0&J$8gvOGVE4ZA%X(g-|btkFx1kar6Vb zdi_VmzGQ@WW2D-jLxlS;!7Oo`Wg3Znjvfpd*qH@%a$9Ql85odjE*YSK>-cV!u;=GE z*Hs;BNccUsAYx^%oeWgw#t)s$;xVasC-c6Hzr#oyKL=u5`fdX6*t7o3Q1gsy!TR`CNCarH9as(-vgg5kh1@i6<5H?hL#`YqR&GJENrm!1#~XeL%8(K zM~0v(PfZh9aiknRTaHXj>7s)XnevF0tw#jh0B}8?X(DW(7|7!fom8(NCvVyEBu&3} z(}u`bP2|O4CLF3eoWlCC2rPM8W&^Y9dXMx4%;i1hP#7+0Wwh?y|Bf57w=1{{#&8Ff^qnDP*ng2c{v%1B%4ywW1`r z`gV^zc_sE!;dTnG{DmBRSq-{Mg>OY$(jfBAa_nBmv=cv%o@?_zG`vrbn_Rqa`~hc{ z(J7Ckl_m|h-io7?J+Ac?FU#R`es5A!vP!JW?W4dZ87R^~X~VMTd`c zWJr`XY*gm&2Y~L%aGEYd8n7{Q4=HCA#qNMAOzxaMJFOL$ 
zm>DW@^UV-=#1#)et|ct>E-jkuXrG;R>N_@GQ95-{6}1y9&Mcw@dD1J`EDj)ct#`qTh~X__-84ayN$$L|(v*qhI#mm2jh{<0f+7w?T@p8h>0DAiqC* z?+DBdl?BW7tYbKZZeep*IF7`zXI5XZEzgaHIeq(PKdAC`x<>Ciaq4bva$UMZOjpQcS? z`KS5KQx`2C?IMU$eoo;L@9Z&qTg`pe0t;mKq69A(+{hQ_^bhwpiClGvyAQUXRe$Nn zNaL*3u-<2bU?WH@j~@hIVDnF zmeD1dYpEJVHh$v_dFDB_eM`Yf{SUOH5EGk(ozEZh>F&7Kr~h(pXX?Cj{O16x7gmOw z){&fKUZOJc$wSSK0-d7=E?M79fS=}BIS}za+vlZt!pd%H1MO)MUuR{k8a7WEdCME^_r#~3k`Zl{$7wOi zLv{3|@FUAIN4=>ef)83@$pO7>shy!^T6m@7S)Fc%-MA~Um#|$;`n>B)A(1T~E27ui zYYZn`zGU!=d{2!Jnv?sxQrq5U;P(?{Z5mlJvO1IsFgbbl)>^CM3<<4q=J_9B%lP&o ztg$j3{|;_DMMW$AfaU}_m+{H76Enh)&L){<}BxcgT8wyzT-jFtn(i&&fwyiUNIU zE>r*PYRI!LKWyAX%2KTm<)IUXyM{)}PEyoOI%UKV7q=CI3SMVmpDkYG#_hdxQ+0HB z;R$tJ*3#&h)()#Wj6G)P+|wbFbo`UHYG_rG6$dKNjwVm%TO8W*y->g(^$NChm%X@}vWM#vARVcVvSMX|9X@gSMldtx+EM%1 z%{QTVkb_78pBLo2O43~%4wMN=of-C|8-EhuYkBWxnt%C28PB&tfl0N<4vUb3l3;eknQXnhr{BjQn(34#LMp~xx4`3u9w1syW7-!7u_b$~nk*=c1M z)P}d3Z03)%(C!HaW_0X#Gpf6L?*4$Jq{IA0ST>YPD$GYddGx)D7Q>ANNKfqVHQE8R@#fh|R3U)XjgHOkazu34t-8`cAyA>r{v#CRKV-ZZ86y0C^;kFVo$**=+rWqOSJCYR>nTZK)NfI~{29FYf6co@V^Ub_ z)tfpNp;A);S7d(jd`v%d`{ALEi@{G2_z2>tg{kR1EQ;%LxOc!I{ac)W*-C)pAIX{l z@v5zogPf@5&zZqze5*e$bJTpo%i%S0pfin}6!rBdL2ERO&v&s8Z?Jp}d1Pfz^5 zDE<(7h5hROofET#zgBj9s>pc;>`g1f;8|WtKEF7EQFjB+fJ!5;0COI&3zm1$--m&A zhToRm7Kb)gM83#IUzbkwup>my+-#|z6YRR{Sf5qzie-yCXJIR)@BX>87497$Km zD})hQRBN09_+yKlO<)xMn`iU?=KnoWd(anEB*?rqmLi_ZqwN%a2OH60?@Q_vHjHGA z`=5=E#B8)tOZpsvn6OYPZqRuMPpAUL!va91?d_Fm!+yN=QLVfYmrLzW9`pMtQ^o(5 zzLWcqZ-H;%Tqk*(nlM_V6`%Z8H+4EUcz5;toT&b7#;%^%@wl5#-Dv%hX4yKnUySwPo*bBSyu^kFTkq_0d&tVvYSo4G#lz_XRdX7 zayYLn*wj;rA2}q~dbfKwV~`}m!S?F$#?Z4eBx(UkBx0Puow;vpD6=%pD7RthrOr9=d~M24p>Qb5;vcl zA__S-?a?^0;CW6y%wMbk!-=vSa6}gIg))eqvGrB+Z!fiFxsoS`UZBsDXzKKc{{v%}jK~Gw0wENk@ z$@4mQ(+HqIz0Xs8D;Lv-vK$MXt>V^;tN$P*O^|J-p`Hf&JV zc*Z51u$cM^JBcdpq_H;0_>-4u4KkJ1!ris!a`dp+Hw~M2h~A<<4kEHfW>GDeT_o#0 zT@T0D;un09apUfD#mTW=54*M4c3owK&0+EU%|9xaLwX@j`8lghErqSbfh-f|L9{8po0o^%w^S$L(D- zU+Uh10JS4dMd@0xGCAa;?4R=^QuGphR=KZxYL6FxEiy)3n7vndLHB|3%c#Q=SlWIb z<~-@HHS*|(@6T{$+Y^MEoCvY33owW98=28Fue5K*?k_SIE%qg1G2w}fh7=f3aGg(F;9aujgbFQFIDlbe*uJXC#Iq zW_hd4*_@)q|F^uP9z>rkTU!r6SWw0g{XBBi!dGZFaK__NqtW9BV}sy>$av8DZV{?R zt&0rq0P-UPNIr?vsHZ%Pp(P|+d+jNFZg&cxAt@uHN3o1#7!$Fq3~K2|tg78(l@rFI z?IG$>kZ;kM#5XFlZvD+6I&23trpravT{&MI!J4KY!%y`Nu~h*S7KC9SC568vErpT0 z?}@PpAh0Tg+X*$bM9*Mcq3Vn9(G_4^C@uQzN0jwWR6OBtYL#2^f=BwBTIJg?`h*xnZhglx)4X81=cS`JU+V&QbP>bOv=j?m04u zsxYed!H;V&V^+%IYU8M0MaXqm5B@qyMNO@3riffl9uRRst1m@%KV~{-(bN`CtY)Mg zH&(Tja|bNKsvj9#S&%xnu0U(zj~uVDO&j1C@2#-QsJ%=V8o*udgys(pJNroa(Oq&N zl8qZ+GQPB5!MKA;xz>GPP#zKD$|Xo%wzQaz*|JOeu9I7b#-R_`e3HPHEIk?il=W%0 z){mi0DqMh0xHh5~Sgv{sW6@K@t9UU)u~!E28D=EU<1~0i;e`ayopgkcX3iUsWz*+P z@~#!#;!m|dEt=?Oeg}7>zLJQq$M0 zS&b}g_s;gC#Ps%*zNS>-*5y&6mhPrjOYg&H9 zmZfbsw^e!Z>VCT%b?PI<^~PR|wdIhdBQnB~?^%nR`~AAH*P5pmnj1vpbQI|D?f2o` z$SyZCgQ?OcX7uEk_t7xhIo4O#bs-7~-)l{>dvW4nyNh+Jw4R$_`exdEDYCz3&4O@s zxe5CMqgLvEg7{KF;s|{JM2G8;f*P3iXAN=>MVhX@K{jA~aTg2@-qc%;P{z^p(9R$h zy#3QJjwkLA*%12TKI%$8qQJ%69Sd^Z*=9kr^hnqk>VZ7<^@0a{={>EF{d_OTvTdFK zjU-Y*&E3iRp;b9Y7>!pE#Mf^X9P+!0U6w83{2c8L48UKX)nbaNodd?o{|3+pB?Maf zcxu9oAr$WSRW-5f1LBWo;Uh}qBGw+hK=!uZoy!FGfwY7Y zrfzq@mt#c;2TCh<@T)MnNyHJrKJ4>);{w{gWQm>R zI*dBd&)X&|pyed>o)u%PB766}Mj22C5TnZ^~sG_!73(c9iZY}&SPp$=CjmT8AKYB`Q6j6Z`gJogk-SwEKQ;^I6v zKUP*2Y4wyxW~0@6D%hWMY)MPYSZSxeq7mFk(w5L>`-72f57+u0PC=s+4^`4c(lxz^ zm#X6K7c4GKs3T9;N}gRcFY1AC*3rB{p-0^1^J$R`!?$;%wP~&giZ2-z?Cd^TKZQ{K8#8G5D8x%MHj1w4t?LYH{}A3dXbP~D@^7r> z)PWPySa8G;E3KVerg}9b#`7A(e*`~&^KOi^2-Y=^u=j%3goHXjE}bMOIg8!!E6fJXAyVAG zjIHW$9ho}fUvJVq#%(Jw!%}PA#5j&6LM@gHF@opm-5qP{ttp_p2Nb$LIg_dCBlK*) 
zHazw)E*BdHBN@={&fKm%8hlXDaMMZ^VnZRT`aUxEyvrEImW1Rz!Ip&*e7a8$gXY#=zEv1l!4ffG{_yuN zKdcNNd?PMiC^CLDqmvz1Dx8^l_a{gZU-};jR(PI>`fL=F#*P%!J>FQa%1e(qJbJ}w z!RL~AoTGGsxr+jCt-1iagLE8&TVJ$qgB$P%pd_d$| z9olZelX-A?^g9b@ja6D!Ft24&;edzcimW0m={aE)hh4I`e{n?dvD(<%dwU33eiTa? zX$)1%1NH=DtZq$w+5o0B4ha%$pyxyU6A>0TOaSR-Ec990kXvHQZt9D$t#7Aa8Ihlq zaqMDx(h3*~Of%XA&--Yd1YM8BS{)0$W;3UA*M{YlLuwIgHoN@JDB-8-6Jn0I9QXBo zW(rtkrE+^RGv(AdpYE8ED&h)ze|A8Ca-STPvW(ZjWMaBdxi;K={Y)L2Y|4)Ek2ZwX ztnQ}?5en)1oSKF#}^xZ%D+4W zdCL=TaPiO}1M~iV(|#>a2Pt)xQ-Y$Jk~57Dc7=aEwkllCbg|ggJIDB7f;Rk=a+xk1 zU%8bP3pqaO8ZTPnt!3m=*Mu)fYhL1bAGg&Yjsw#Of)lNkF6N7+>neJD50jeYdy|Z= zTl_ZAe;1JO_dp7$?NGG_629qFOLB=lw7iE@xq=7yw&72?i!59uVKNSXcZfEZK=1za z=C|T#N@S=eyp5@Lp?up&GNt`g3tRo_?CcAg4SS<4lD5)N;!oW70jVNW1UYS7;?`Il&kW;cu11xN3i zU)$1O^zNZ=hBN0=sc8aDJdo&J&b?F!uz(M8x|t7Jl6!@1zD9X%p18=VWNUSVcp>rY zv?Z1yg+7WM_N)%&{enEbwM!;?zBgfC^78QJM(x>gag!$t2UTY&x1uG+9xi2})qnx4 z1ufx=)U*UcCT?a7Y12LITHk;SGkIlWss0mB`FF3`J{d^iP8q_-vg2J}U-Y@X3J+Xv z#u!eGv8!-Jhe|~Q=b&Ac(zm}Lnzm)=9A9Py_J}Gg^G(q4%NF&Y_(o#jm6{eCi+g9n2t zsC2cN_(~AReu~EYXdyQZhgOD@cKeG3qYsWxm%eak^xQEL?j6bhZlfVyyKJK+mWnTz zS~;ru{Tn@T0nGLBKWj>`j{d34HFxt_W3tLhaoG8ee0hOwH-77rIW|;36k>+2{l&3g zm#4giD{yX1%E4vlrmE5JuRI^k$ggQ_DS>}N2~l)bp=Uy3z4!XOVr!#J7(yAI+m*fD zzY@L%)yBs}nD0znyU^qsggP*{B+W#oiwjovoJM`k<@wPIR8_SDR;rHoI(B_J55?MR zRvr}8bv`E80bU-c$@LKXQ5tgi#SucKttg}}T%13V31bLLWWVmFmeJB&vC4bTO#(>3 zG2qRgFk5#C>QxH$pbib%EIq{aX!V@K71{+yz+9GCA(fTpO|TVNixc?ycaVL|kU^9ND+2V}q&6+NkIwh*Wm;C6YlyAbZ2Vd&&BzU_EUnqi zE;R?M^oOV;vmw4P%=d)_vj6UKx+=Ca1J&y}TvxBs>^$EvepT-wr_jk=ikFxK&ilwyXQ!#xcsK)o(ZdV?M2M2#z)|4#|#0(aZ zWdmhEp@DXz&$00SxO1Sv)&A5hBzIFL{GeilRw8!5WIW7h;9#K8$1#K1r6h}23zrP( zVN#c*t&#pLjaBj}BQFQ^y}G3{2*WZr3teX|*06UnLNzd=M36)~e3_kLUWr(9v4u6H zlR!5sl3&XKN8Q!(x6oXt?u$_-?j`(Z%R06HZQq7_330b@ z*7+2%IX6CcLYep3nzC~a&}oK-2IdsJ7V+Z19{9@wP5R(>i0YFf7OG+`SKB+!z*4i5 z;LT}jBG_e&2)jOd1y<;>b{6H*-JCmeeCx=__w%xj?X;b_5nfrfhvJe!)Q5M}_jVhf?V)YiA-520P@bd@y?!dVj{JHXO2 z9n%dzN(y%j$p_$>WFh&@P(F`Y?6Ybn31)PT_lJf`dlp{>ZY5+mA4KORK zC?|y9t&G!s$$eO^?eqig?L6wDOCN9V`;y+O@mKIvkh|i@=Na#Z&DeR7SzRLQMl_}d zX$RCX{zFpnh5i9Fbf%^aD{I99ZvNb z70)oT?Y|X1=h;9;k{C3m?A=8JO@*P733+UfVd@Z>85eB){t$t7BQFb zld_H0Nw)Iw0tim>Tim39EKxQ%xV=u15bV#a^S%!HULx0VgR)j!<67BVSQu}2N{|de zFc8ol(g0yGL?q?q0<;JaL$n|=U4rF;Um;Wk=A;0iqG|}{2mA<;`E3ohR*Pb4TwwEY zjf%1M08->oF$6A8mtZ|80+$ERtp4JN^dsg0L#ocS0V#_(h_MDVGPlkzj=h}ZkY60% zTHq`O_{|r1{&;M*G23%3hh706N4d>GOFv*ou)qI^9Mg~X1)8ugVFFi~RrbJH%()(l3sl*_ZJVc=&`hRe#RIb+kpGft_D#7VPIT0cR)}M$}bGMzdV< z0oKa=B^SH-7K=wFvkV*l=igmw{MFgkbl(W%?;p~KnAI0jhK|9;)O7Ww< zr`gWN5u>m=!)a25F1f^=+3cOggEH9|NZ!0a>yGGAySm6}QU+jv-Gn||BM7R=;1P!g zF{e`2gO+_HSdRD=c)8^lSR>W~CR8c6FHXFa`3S~}N`%4Kr@7YOEx`Xv5@-G_iJjA+ zYdQmK1N+Qde;SCs3TWK8A&j!jv#@#u{pchp_XZFhpwx$<0e#$c67uE!udmnw-jP48 z0hpeR?<~OBirSF9Y=|BEuJH0rA?^Jy z4&S=cq7lFIY=jYt1g}+scX=fkx@OIRr?tyI6J*&7eIw3|{+vM(g%l!Kb}tmR(+3(M zs$+^+kjiyg`hev;VEz;XRdFs;Qqn@l6DB-|iZJRt0YubW9#n`GflWBZeq9zQJFGT& zwo%f2L+joABVcv&DqnJvv#nclpG6Z-Qjft|@wwc_8{?j-=kE!<6ENZvPr1J##lMT; zpT+U-y-R-MjeeAJuDx80^<0XXUU2BvzA&yMw9;;vZP(%_;+l*tYOFJ#6LiCgqAjCM z(;r{p?|H!tzEi9$rD5iD_v=H_npXN{GJ(_qbFV`ZVS^D6J0PYbm>NiH=ISS(O<_%a zgjl=eEoC+Ab3KgPA>R`tf07&@EH#GK4I3nLKx!1x!q0=0#`gyL z2ibA0cEVEE&|%fHpqMeID_bU-&DTdFCIfuVV=x1C-QUg{u-ov=Ikgnsb^X;Q;Eo=~ zYUQGmif9GK5!6gtB>-wHw5mllu-s9Kt-1_Na3AG=x>x{Lz*TR@C@lxM+A;@D%4O4h zOdFgpnK3A83!CdRmtB5=4Aj3t?!@&$z7kdr8!FGdn6^mv8M}ixWPx4VMFhkw@k}Xf z5aFOp{v<8m=on@5vMub@ZK)^mpGc0XW9;S5Bhs%GWta$(Ig?dpTvpP8I^zvnk1%-_ zb>i(<+{Ejkx`v7-9qFf_EXM}yVe_}OqGq}ZXl zYWAHZn2?&REEp{I2`+Y%rH%%&3mDN0jJmnw%v9kGkQ@b}mr)q`d 
zS|Y=V*>V5;cP{?Nuq()M{9IN#!itQk|9Nr!N~u+M|e94SMGM>it`nb1=sE3)bETowhRCxv#&iIvh}^ZRcDM z%DU<@t}c`?j9=dhY>owhgQ%O$&c0r2LUV=Ysw*SwjpQoM{ z;N}&|8iRdOcu6)VEk0G-OS|KG$g6GjGT+lSwfDmYU&P{riPKs_xU!K&N}9qr=<3-LI>EliiYqIgja8 z=B%05oBR^DB_E6Gk~Sf#EGr$puYF!(nc0>XcSPa_7yrwxTurk4UbA&5ZiRKGE{S#N z`X%ZEZ}(-*lCRqB@yF&8u*8!kl2U_B4}8uWF$(a=jukhPo=(h4J1_)1Gw+{h@T|gW zxO%W8LY>nNkBWWuQi4K{B%_P+y!9*FpXy(95ZL$5s9yc(*Yow2CqIa<7)D0FOH#fW z``M@kC)LH~XYQlLlFd9&JEoRZInon2_Hk^Lub>ezm zl%=(-b`5Y7Vazi$$ki^D7S@T@#n*08={^=~6MU+8xQ==@z1mPkKeVmw}=&RNZ%hkvn2Fnbg#1R^{e$e z-)77^2qTZbwH8sGO~D!G7Pfpid3xSNQAl}d;B&A*(Rn-PgmG2J1kBS%a2bpoyW#cp zvM!W(wN}*psRjqfdj_(GZ}CaF?k5Xf*K@f0#^?Hq>n|#>P7kUx^Or26`eaMWV6>61;&%~4J77H z4+k^vg#wP|h~)+$u=OqUz`8QFqK}0Z@mFZUDQ&WBMT_7w$<`DHSO~ROGs~;m`PBFu z+(MeF6F?GokEDbiCr3qOPAoU>_I+h0mznO@jEV6=QO#-IBTB{=%1%X+hcM(HtoFFC zQiH1lyNJ3MqpF4lR_Kx4Unl9r86hRZ z43L@NO3gTnjBw9kHUNq$hlV>9%RQCz%yr;n*byy)63@CvxZN_S6);+a8*H7wCjGCltAEbTeU)GM_a?$k+F;EatN3>-h5kVQJF|s z#lBSLr})wes2X!@sbgZV*-+zcMeF-2NO4W6pLz@dtBpo)z#?)aapdk&I+@wqbIVd^ z+&p^%pQZ-rUP5x@>@){&-#ayWrlTtsqBkxH>2ORIXj}~sDw90zYRx%}s_t79nsxIg zmztLnzTN3Bx#iP64=O!xaj4w$V_y2+NazXe1Bx~w$M5-&(wPc0qmO9CHtY7Y&m+T+ zzV%MhS+A(#nX!Hnv_5Ak(cyE?=TO>%kN5zB6Jm!~Y@Mah;TVn1Zx>|z*|ToL``&J^ zoK7z~N4|AjM{1XX(ur>l=LbyK4hk)tUeP+GuPetcTV%9RJvlOfUrb(ge_s4L`Pe6D z@fkY0NKe0r>-4J?ns|0(81}N5It4COzG+G17WvB8ZXJd6JeT!&QIF|nOgLJC#`Doj zv-Lz-(bJjx`kjN@_9Sll(cOADMl~uXSb$?*Kg`XfXi)mc>U3N0j9=I!$8CC-LG4wI zR8emN5iZkS6|v)u@JoXy#4Jb#I%E#jBK8+$jS;>Rk|Av6@%C&qr(#c1Ke}ViZ+1PL z_PUShi&xC*52R$zmAJT*_MHTAxodwvX*jJS8A~xjQLnpuba}~b0R^2(9Zo0E{a7&v zU2FU}tYIL`w2o}0tPtjgurws!cK8&GYZ)IeH8F#}wi*R?;D&pcMj(tC-VZArte$rP ztVE_8z8g!E;3qIekqv#e8zlf=Q5HcG>Ur3%q5VJwwQb~k|E(DNL-s!BN|8KQEgpKs zu;$Rnb=3-Mo^^$hJDL^tT5ENn%j5-78vME#+dHT}P5*+~ER)~HrNHROS4rQy#Vb{K zSK9R?zq|#d$D*$W*pbcpJ$%)Uesv=iY8Oy@%p*Atx^P~5gs;KT#0@WcQMa$QcLOS@ z0;9$b7WqjFwq`|~Mg;(rcuqon>q-k1>=A!tV7LF`@Oo%WLo`W>$5-V4L%)Kth3Xu! 
z^T02T_LfzASsi*eLrIAX=>L=$#+*QEwqtD63+#Yn3ohB|BC9L?;A&>`KFplW!3C$c zWmA~Qj)8|vr|2?4R=Vg-rSDAF$EvQe7@yzjU=}dp2z-EL15l{GQEw%de5G2 zKYt$I4gH+0%kq+uUJ!T{mHVV5zaKf;6q4mv!#LwrJQb673~Z3W#cT%DaB{1!FV^-wof?JUO)jPn?63JMncxJ=7`n z2OmA;LH_LLP(Z|4wovh-8JYLc=u6APw<|`=_N}VQfHfdD^Y7$0x!=fDPaBGc)Zh2< z=?V}LxAJc*%Q5NJSr?#8N#Nzr1ijY&W)~M%An0eI9OM)ud%59_*(0E8jb8t^lkQ-IAqt-&#+!3GT4@Ia_`)8l-bpplLu<69o$z*LF=F& zH>myAMC^VwMQ>~v3DtYRb9g>7>jPH9w4MVHo$Wq0kSafC`+TItx0psDz=SqA+dLKBjUWZqc>FRfIE#HBfvW*x^9VpWPD!ekYBzqHv1tP4$N5 zSwDfJP1z^kis7c2GM&hZHEKupuagIdzMMpuKXhI>I45GP0P66zKS9OfatL2iaq~mG z9>PaoHCR~*DH#OFUtbpt#!^>$3EHz0`DTqqO32Hd)$N~MH)=Sn?92JH7|OueI){B-b6Tj#vx5j*@>n%1Rh}KJ;$JDhG7k-58l=y+3AU{unp;{d0=+xi=n( zReqkz{EO|cp!+#2cb3!h?6MksthY6L)@}vHa-=-a>8&VBzzX2E#Dm219G3w2jo+er ze|-P-dpX}{kVzOkiuH(QtU#V?N8L>OIInPu>|hvdJstkH495Zw>~yTCp`VI3Vc)Yrf) zq5ow9^`9U||4K!%->==Y$~iYpj(7FPoZ~I38~dRkcf3PbPg{TOq|rj_YfQR{hcLnM zp@owU`EaCDw!NqRkcs8T*GJ=>zs~sj)44l@xD=SP$P(r;*2i++r3coo@mHfuRaz>Y z)kaUARNZ|XcGgA9W0+iGzL{`>e6R*PuSh8uxteF}FfU3Fn_PaRhCT6}6}SN4Hqp#2 zR0I*Xk9nLdd&O@{y|NF={QcXX%)Bp?G}0^d*C9QwOFKxFJIL6%gXvllu~kB%&+T85 z1h4a*1MwRRE94$%KBoJJB_}I~&E=Inc&N)2RpYt9VeiTvqFy%tVbk@>`5eYBx}q)_ z@!aF<%~6fcIm(X2qsB#XN=AXXH{DS|xAU(tj=RL%eCDfTDceRzx2q**!#G23^(u~0 z*cFI|Y4{yPsKfb}nlV$*_{Y)8H^^caooQk`8Xk4KEg5yC5%>QCgNnk1!|nM7z-y*W4U68I^w-s(-{uA9g?Y%t_vswRpcsj4dQ zpTky1r4y{9gv}&mEFiCT>xHjMhMY8YN$-2;-Kf^P8A%Q?6CczfGJEU%Y#%0_cy7bR zpElRnvz?%#Cd;msvBBMVz*glt_BB3eT> zkL9b_IX|7d;ytKkY~Ybr>9jh#3_qWNs-X`oJ}(%!P;2JoT-GilTmqI@(eA)o3>zaX zH_f~@Zhr?(o2>cHEoo59WI@EfK`0~ch!6KGU%o2*KtzgJTXVO>aBD`P(B+c8`Kf z=Ev)I)@zPi)yQaS-M=WZ!b$aFB~lRn(OjRL+;q}tmeXsW#kFFckI#x;$k_p}efAh} z%>TnsI%W^8U}==Jt@Qc`V!xSXS*kVX+XEx+tSQt!_~Luyvh0ETQC8vBewBi1R~C@A=IiJbBm&B-#7hYklio@B1$0gD4s+ zH72bDuf|zW^6BcE`R*5w2>)n}00fxZai4^%ST2M8wo+cPC#%kM&+V>)rW z9;w3PSkhBhXZ1_7;Bscy!<@NhPqf`NtHnbzaYL~YY~%`I%A%0pmwkTHNLF}Cdwu0V zf!CKHvEXvGYF`2woo#>3y(NPAw$d9iW->1_Pk9=s*IoKoBd@;ewY&&aS~5icS?5#y zo%1YZDjzq*QMA)YyrZS1H{I|`kW}4vi~g2a0?ZUR=itSzysTdq#(224Puis+Dc8G@ zkFWcT7`f_hUh|PybQL~_-rJI39oyM>?$%(kH#Oe4v$Kmg+iY1m*nFZF8g4RYw*t(o znFhuK!T?qBHy#w}%xtbVBHCdAhl*nRn^wvXxpdj8F?G-el;mv zs8?G4PFYBHi~NVkHY7JejlI0;=~s`}`X8g-(NfT@Ljb;lyh;gP%e(@|7gm-wq&K=m z-+s^2fc=`%!n@1Vv=sET7g0tedeS7Awh3x?AuQT1wq17aJnFN0WfH0>k@Fg8r$<%x z{4CIgXOw2Tj&mFM7UvdKI*?o{Uh3~+hnJf?!|3UMSFU z&cXnKp@FI$CEeZee!FAgvqYG`f^#wV@j9E&G64(c;Iq;yB4$%u~%`i4%N%L!$Eg z_uGFC?o(`)BtT82oVbPscaHaG*4iwwA8o(=41!^a#_O;e2q7-x83|ePo9y5tDro8hc^BIqH zwQn==cEALKw*14&HIyy_pk};7?Wr}<{U&>14hBTRBPa^~qtyjiM{L-B053M_6OJ=8 zMbgAPuL;}yQ$yfsJAFbgJ*>UJU9n%xRayN6HK<8UE#!-)v@1P(#9uxWQ=dKs>i-mhI$|`P3Gc zE`JY+lMIN`qQvFV-Y-Qe1->o?&tgC1XWspFpadDNMiKlwNdUTikK_hD-c`ZpVCM}+ z1$Rs2_42i-fW+@*a%R?&s@8-3jtw5S@YBv*>6vooe5(zQ>dOtcnX~b6|{r@ zxjK0k2-AE~fL=Dxsx8@qZLG9f!($C`wfmQiaWYuC^PAYMn+)G@pJRwB0k#;1ENiqSr!tUvpVy>SrIENf;-nVt{3T#g(~w?UsKBhigq03(OrCM9&`eXeXBxK zpVQAJfwYTZK|l1-peTvZJm^cztYc|8+gu#lrp4BxwWCDs8wMm#e|Iw12vz8YNuVE&{ zP2k?y4TI2%pGD5oZzg~1r>YdJ=tH=3(L}`G71EHmW9H~#pIGQ5{z9SZkv5l{xqYc( zg8doZo=og2vf_B{QX3A-`+gkz22Kmr)*6l$D=I6r{K)EM)iHM|d9yi>c4;`&WyITUU{x`q#PUF3NB zj3XuDskG%noG_Qm%)uJFm{6eRY7D$gf&fKJKEkq;8g-g59OtYZL-;&`G7>C_HJpMw zRdyV>E1l!W^H|kd%$26=Bc;}<&kuabI_!bT!EVL)iG5y zQv+O=j-j0Y3}o>C`@gx|%^R^401M_x}}8vEL|#Vpm2uSN~@G=$@^ ziRm9d!vTZ{@RRx9xqkjny%%C^_Z_EbN;incWA^Dbp?~^?laSHg<8KSjid!K@XK92i zL?g^<`h&+mJXG7IqrJ;sN^rZg zm|lO#E*3Hh(HkWBXpNeP_%?eZ%9=d!1pyvA2 zRq%}_BNAW&Q?hxqU9ln^Q0zXVEkc(C+DV>0U_4J0;0F-$($jfjylX(F{O|d0M1a+ZkJ)D9TYiD|ClV1C$X7Nh#5z&pv&}LSoDKVie z(*g(&d)ivuI;&d9*ZK-959UV_$Ta& zze!d$;5iO&lF1k2%`2KR5e-}vz;&rjxH5ma+Q9$55ub-0X#-VR_*80!%3}lrDfxZ` 
z5Ptf(bizVqTlw>W!*w=GMRV)>XF;@ClrVhwT8 zgz|=Zi!aw)5A$zp-YJ9W;_N~^!ULiWjVmeu+7a~xUDtf#eerWY^vp_-a@a?gxZxeL z9~iCo^f+eRq!zv!!P~{l020|6A_Kfw5_CQ=XX`f^ zHj%S#{z$Gl_4EXc(A+9jln-ESo?gQDKQ?dY8mL#VJD#tloZQz+^}ZrD{t6*kgYU&A z?>)PKimD2Ao!1@cD{YcghnuRRglXYJbp{xZZN!0ZUjt3= zt!7q1NM|+gVO!Ow{E##dR$JrPN31LP7ch3}t0|ax%m9!gQCge5U5L4!t#-nnaS+Rr z{}`JH3&cXO<*NV}UvC`kqG;>OlKuwMS(%=QE43Te;m9pFIGXa(>gQtX+BKq@sBgKv z9sXzRfI(HUCT$w`leC0Na0=};A8#{kS*dH5PfVY=B5ACqm%6_?EQ{lVf)Wdb8x21z z%2QE@z=1^p^%SjQY+2XM=JyQtei78;d;T6XVfeVuAj@;0crZPQGC&1y?0E-wsJ2LNaZ4H^Vh(~Xe&vHNKA zIY*>Xi6@KB{Uf6SpgIvat7u?*tgox^VX`^aZp|BzGaqLHxo{Bpw zfb9SKO`!n8v!E7R;^#va@M;J#lH}&}=OS-wo(Lads?Kz_deIZ)yt7p&l#Bl7;Xl%L z`pwI)6Af2hhC>=tvib=e=|gy}g?jMD)q>zdQxD1$!(7AIprK>1172|9#66=SKm(u^ z?ZHQyLy;~VYzArju+&p$D#IMgCDctIhkyJ}b!7f=ZTxeI;otN73ICC~9LxzEae1@A z97IS1P(jwyaikhCT?bg1&cYvtVMFHkHV-pH^7_eHqhc1uAsbRb0Gr|*mOVDaRe^sgt>TmE5qR3sSj5ew)ppoNmLBDxy@D( z^%6EYm&kr!PHg=zSjO#8X8aKS#BQ}r zmAYJvecVqj%V?wN->suR49|KAEsA>$ zR0e3M9b*$Z=S&=M!R15p`qGUuAc0|sS@#(mK^eU187K&w^_@OxyEj7WLQ0SXy>&oJ zcY)Z+ub*TIj*CSMldOQzHOu5Fwjg@5&+gQN~E`qnRXI8 zi-4(4oZc!P6N*&0q|Bg+FY~Q2joUVmmF#b20z2C@(_c9m);0`J#-Chuj=r8aSqg;4 zsQ=rR+%fo9=}`a_`|D6jdB6Ij;ih1adeHanN{m#f4pHJ$&}+b-8;R@=g9QVwupqh5 zFgm|Kf$e1JPZp(Wwm)bchwI%&`=z}5O}50qNAz+bUVxUB$+hy4c-)0^0oJmT$rf}p z2b&X9(9srgr}WZak`&(RD3@3+J``M`v*1F(?FWziMVYp{Z@RwNr4&btx=VR4gYrUM zTP;AotzO-ZM&FTF{u&^F*AAHSM1CH0`MU9%U59(Ky`4P9%4n$Xd!1#=n}<=g%Lk)c zuKYuCRS7+vsh^T1E+y(T2C0 zu=sXjOkYm+R}!P{bsDns^+k6OHvF$*E&?Ao|C~8xC;%oxSRBH$@rIA<#$2qIR9sl% z-+Y&nl-B2%TmU>uv+?K7c<_nn_{fe-;r%LM>T6}uwMbVZ# zt(oVs-tDtI;*g8$_wBvS4&YWlXdeSX%yxiQ}oY&-)94~PH zVHUl(?$?N%bNXsz=UUF^XQLFmRv2CBSBrK}gq7Ftyv))z`LmluOa1#hhN1~WuOmH{ zGHatBuF2BZ&)HT~b3mVr3eKT7&=Qx#-fhrsKv!s^Kvmy^7oc}#66SL&CFlXbLMk-%^Eu*EltrVa&p%z}=6nH1&l)LAh4@fT>$N>=)E6JTJ zS^fKxWv-v$vs|dfZR1CN{{39%hA*$<4LKZ}o8)kE9b%?L)DvXrOh2=zV*mwLnL6TUEV++OlUyE8RiXwLzsLz* zej?uDU1*Bjt!L?3Gs7y3-(OvSE~-P#C0Yk5EGdT5zcPpzbPc@juli(_ya1!}X8P+& zs0?5aY2LF?NLeS(x}O0o@^+#W7&_sI?mo^}A`HH?2Lvu`8;1FP(;brZ8g+NByndhX zMAo+SS@IE+s7&k!c;&?VCUa_pQ@$zpHs_C_#$dc*%bX%HMqY(_UolBA_b%$ql>@+v zn*Z>0YHxfGEIOqhuj47h$kj>>4x&AM8dq>Y)t-;s9s`2}HyX*J2L#e;?>dkfWj5gm70x$>d=d|fF>xyrg% zhyKuXofPTU{at>0{FCjAAFJ!TpXN1ya>4X4dl>DSYp;t#;>p)D4ej@^HT{!8j862N z=y)}(+Ws^bryA37ar!eu{HBGBtH1kejWkuuKy6(G=vdxE5g8lF@5db|W`dP_mPBS= z;gIYNN_I<>Op}hs1oGxjp_zn{fdP%p!N9kOtU&p<8~m7PSE`K($l-8tMby*DHpgCI z@8^F1U&9Q`(%W2?H~Qwc(pI2$N*`5%*m147U9lT#NA{NUR}ap^{=%=!DPE!d<-Ev; zcl{D_6Z0gvU<3(Aq!;4EMq&U|1)yB&0tI&bRO~j}^k>-Bx*_5rBxdaFB|r`4Du-=N zPtO)OZ@PYVCFtYf7Drb}+w2mgKjqffj{j_4UJtmLPEfH%mI1EVEgH*s9UUF@&LDjs z8ZqIBg9G*zB}g%eRv8o~5Qlru82uFOk!x*iTvfaN3DiyGU;dgH5RsXWoWx(Y zSvGD{=StZ=C=*aV)?OFX$v5}1A3(;#mmxrQ#p>1rDwL)Wu?ZarkqYH6tY~_-IUbb+ z1B0--{y$+K#(OUHezO1?(T0kY7;yv^o-VGZ;%|59Z%}s59S&__I5AiJ%qHggN06Mv zw9-#gx$oJp{26P((5<;rSf72Hk8JGncNK_pXk`I7TFI8<7a6Fq%yR4S)KJAY2KuRf zfwlKzl+32Ycj!Ya;~`yR;8uGz;K3fZ$DOWW(%8ZJKh(Ba7VMWDjKc z-VTbt@hUH|w*ZX_d6qiw$g#1Bi)xDx(9x;6>8x+~AiC$LgpX3jgZLeu;r-hg`mGk3 z*&Mn=WBs20CM!ut&JZVu$ltYlKk+_#17vz7T&!tw-?&-@wmLU#EBOUxV7evEF>qDk z!CLU!??wa+MCEP&iNeY538w`Y!{AQ$3*i16lWs%l!YqH2#h8SBuuNRki73@0G#1ggkvzT%R06K#v7KfULZT5F zlBv~HF`{a_+JGbUZGR;IwxitN$7yv*ArJ*?f1H@(7803_QhA}6(w>Q-UmK5oa*7kma-AN-i-$)TLX`|H3t|%Aee(xo;^x5mXoS};fBd#LRdFzrYxv?@K|JwL=90wkN8M5t##(G%mC4&INg)?WS{aK*s{j0YvzJ)vx5-zjjLVw|9u32BJ1UWJn&EA7I`Q> z1@nr;?sLTq_p`?7D{r{tgy4DU=-kUuZ#u4l?-7<7fn1Y{;D3q5!c`NWcavlVVWoka zXG$i4wF{7rOCky4f+{V?F|#uOTOj@3p&LP*@5WM$ytc7QF{p9A;&@xXFz81bsWE4M z?b=##?zD=$t<02d;Lh{lBr~g+cgXLEE*(7Y^4w6Q1yA|V={DU`H_^TR4Q%W1OvfPT(AE%0C9ozXxvoFTPetC@U!5_pCzcx9B6<4-qvz?H?$$$u~6*?a5H)@7e~M;pMpNa~=eG}Ui$ 
zuRu2*uBXi0cF+?(6)z~*jkE6PZlo@F>P2sjq&1c7{Ko{4~K&kfZN3oC+ zN>&Lpj<_>?aH*clxd<%hQQ}So15S z&XL{_m1TguXFQq@hdCh_eR5j-5bqHIa?lG$rBI&YIc2e;+mG|O`LA4*xu*_lTG$`b z#3wZWCcA=n=SY58tk+d+VW_!&Ft%PzBGNntCZDP{VeXk%0~ZdZBY(c#e75TRuYC7Q zOSi0Q_ty2-vJ*5MnSduh0VV%mAanY5`SINqB`($Wh6LJ`J88aU+@dq;BlQ~tS`7HK zhzPTyV~#&H^FP0r$o_h76;g~8F(*xOdFno)UDO*^e(=}D7g%~m7oZ3TS;Npg^U(;a zS^LGWFc0;G*_n6Yhf{Bd%BP@*X~+QLryvlltr@SBYckWlyutd7G~CeBO8fepied=Y zrCh&YbHi?74%#sA6?rAJPww6hruP06z#XxNAG^kHR@Gp(uEIERHCFN%xsP`?%CbO- zwdfs^-MX*-h0r!Wp$WSs_%?Kx^8&P-N#J4M0?*M8bzhDLqA7x4YdFn{v|{PjfquYCM}6psI55c#_T z*Fw;}-(VW%HFXAcVxvh_q>_5;OL0V88f zK^OnJ(*OQPh-y$j>}shZPk>QH)mZj^Rr760RK4=CH60^e*X##-7H^u3O9-&UK;Z&* zOhiA3nRmXwV93f*;j;eM!fr1=;a1_vV+cODU>aZ0b^P>|iC|yYyE{MG`rj)sUp%pV zDD$*6e6yB<;aK(UnE>GmmcY**#1e7(fXrT)J7F=MqsMjZr2 ziS@IR9NTRJlsoFeO9Y>&3k`PhS2#iWjtTL{Vp;E6TK&~$QT(+ewRvxN zCI4#|*z?i{lIu1&2HHw+KcgYYjDJaRp|CLX@!C)LI^bLyTW(DINiplY!qtFhd8BbL zXI1ZeI5v^3>ic8w8{>O-kA{+&X|os)5$&MqM5Qt7*vI&+f@oIT7N+sH%Qbw&I;1_r z#+9=L!>kXB7W{n^we1t0^y@4#u%!wULZV2hRSH_6w{5OaxD%9tl*g(zVpluU!h!=- zKh#e7-?3tIjvo6dE6XWbl1mpyYk97x>>=_Eu^_+-pp>!`EXS(nugnS_eYUI#=X>z5 z$0}Tz?lb$vo}mEO&H!La3~K39+SoS_s3n(u-`(o!bd&YbANVbE>8_F-N{{rvD6!cW z|FsRyfAH9aVmY?ihptX9U3dn%@|Fk6Wi0ECZgN-)71l->$)Vj zyEO?+z>rc;_<5h4Cpho0Su4~i4}E8ovvR9}0ognKk4cW#UO&0C332E@3&7WxJbsNg z=oFU34A?KfI<`Wjym8G0DB7$9(+(v4U?sKry*ehJJ5w{`m5mbywFcAWhcY*;LUW$z z6*r$jl$U9vknE!X6sXh5$v&LeUtWPsdii*X+j{3Y<*-uAsbT7P_Xp2ytxndREVP2H zj_lbpz+DQ?d|}D5oBZ4A+eufW!_y;(W}fWz^)c@R4XN3@Vnyewj|F^;d4DoQ<^R$Q zfv5#r#h#@9CffnjxV4Npj==gecG$rDZ?f0F$%u7`7yw&4xo!YF2Rrzh8KY0|A^axe z+U^7LemX7U-wJ^B&~4X7=hIiv~yoU4(OX|YcQmA^+F;v!g){16jX_-M7+abrP^p| z*GoD3izihMhnRVxU%1=-vy*i2SN%OjrAN4pa&*7GA)_Qa_k$|Xi^u8)aMtp|d#4@1wpgR1UY>OLp%Bx8@Myw%dU`vLy#zA(y05iYqB~W?K zavJ^LQ6M=Y!U7%~@jcTPT`y_d+3ygtoSu-wMftB}dvbxoCmjU0tbkp)<-c`N{?>Eg zhcIiM6FN`W*<~qGcnlJO_z3=I2Udgt`;+)Q7|_hI2goQFZvm%ffaC37f{Bk*49Z^T zz2Z61>G@6mLWoM7lLuDFx@Y6cd9J4lEUiZfMl9`iIN&R=+bPJ`b~jrsZ>=G{>56zo zst|RjMr8g^-GTE-C-q!uDH7jKjE_04>plM`%N%hYH@_`{jcj4)wSpDH>>(ZMP$z|; z9~)}AgYm5KGfb3INDX2U?A72CpxxJmw^@L8)r^+>%K9mk<*3k+6Bc6R#-+ShqloJ{ z%UQwS`<4NH70pGnx;}%dg#}ZT+{AwWf;+NJ;ITR+iv!DIgi$W$55CX;%A^gyMHwrL zOZ{*tXGs<)1FRZ;Vd(98y(8l8_5|*R3YX3gkFS^4tQq>>;?RPq4v&yMxc6EOW3HHrL9wlUB6ZzB z(FbNeb^`EdCbC##_6;Qp4D?3$wywq!t{1oL0X_PKSyl4s-b@zMfv`~a7+^)4mr3&; zaYXwLu`9j0EhS?5;>^*w^3-sNgE8zVP^b%T40k{L1Z#clf#7q+!|tbmz0WkfFvQ); z)q|3=A^hD>G&wqW|Boatp@QIpb560VX@tUT_|LmuOyhPlM5Qs*LO$!#kB}ct7o6;+ zyy{W@AwVDDDYFhq!2-rdxbBUyPf8PHXx|9SPWR7wdw8Y`U~e47-6lf*(w67Z+tasfT zR&`7CtDFd84e-JjV1q_=EAXDBFl)6Uc{ejy&h&TlY-*-dOd|7)pOwPq_!i6=GG#Hc zXbbPaS!82G1`B?@2R~!KGoNVKbMU z4h-U#bLAm0R3j24;`zEzg?XSmTN1BOiWy&$peO_0W%GiR2(W z#Obu{-k8J>wK3U_lR7J70_v^!RwMK=Zqf^}9WUOf9XJ89h8yuX!xmQI*ebn>d4o6V z!-p9&3!)YA?=Hz-3*^tn30OR-b(0gh3d~O>gHj=#H3u^r#y}~Ms@knqz7Ug5pjl0` z9hG)>ASGEOs=C0P+p0-12QNz;YQ%QJ9&Hu%>4z#U1pkO2j1uzzO~Pmnu^0_%H#M7d z7d*Vw`sj*^lX#^JZ@#dj3PCVq@Q8Q|g`8*lx0QC21CrHR&gGO%MLFp-0epn@F} zm%cgdh;D%EG`~~z<&-lvxIb#Qd|S8mt(ntekmE{K$XyM|24?+^Hu@;*!{qKcaXl4V z=YOx{v)S~UOI&Q`+=on*Vai9V@R|PM%T9&tu2GdD2j}ZiVe|9>S8>lCaNfbgiaSfz zGocS{K8PkI-8bgo*7tmIN!o_}n%_0(3S|1MqteI!2s&Q6y|QOLURJUtN-2QgOWWN3 z82$FB%%605Xwl;bAW`{nh)RcjsFrKYN$i?buRf=i|5EXUuZBNQ*P?$)@kstSDrH=s zqN@7WaRy-g-u>mDe*?xnPz&Y=JU!8+%PCJ30tyE`6AdE&W*>fkaWgQCL`CP-46YJi z0h!t3;KTxma{Kf+WCoU|2#J{3u}AF+_ri+~^i;T(ffV`W{D1i8-j(7{CEH8&pBA|& z;8=n~7A0>EWAXLeghpu?2Q>LKFtJs;czc2| zj#bIoGAz}gb4IoAi+k$Khv(i9u{uO9*iI+721Kq%zlJ2Qy>vWWFeSl~LNbLsK9^cK zBYY|BETZ~!#W?v+U*=UOojLgD%>HGPS{E9pU1+c=$dvWRtU9oK@vq3*f8~)?R{w0h zc6b(X>j|@8;RgRc?NQ;V%WpDk&{TIDyt+2{i`62SrT@suk6Ep6q=$0Kkezzy8bcSY 
zagMFTsiE0!r%uQzc>5gIr2|e`-B#WYJLi9S#%}A_H3O?Dx5!QYn*F0d^VI-8)@8GU z@q1T{)`;K*kW#$mHlNDj^aSbhfmeL7eZx_bP1&o7%f2tawsUkg;Xdbn>&qJcc}$&4 zw{60BVh=RMhVi{6W_T9}F;(%_PBqT1-F_wJk}bGNlK*audgIdcW3Scx?kJiIL3tC) z^wAlyh$m}6REYn3T4td8K@;>RFe*CmYFUmm3cnRi$ z)vp=jSe4O@%>nfow*JuYrq&>Ta$R}yY+~Yd6~j@=9#IEAWP>#wI#B!su7-6DD!QfB zhHK-W^_=mf0!I5K?}>Sef>3V;#_^Xil2%n31(P+DmP3|`@H@z}gz+lT9Gp{1Nd60pcT zETSu$pC3XqnK#;9w;glSheiG-BR(u^lVNJ+GUpQ+Zr2IMO%fS?9xZ%%V}9yJ?URHr zHf9vxN2lZU*?E|nfPsFSj7(E7-(z4kkZZX39t~aX<_L+4x~mcensZM~pj}QonH6Yy zj0E|QH>%cusORpI(PyV7<5yE7ex;1C)s`}wi|c7sOY6z;m8Sf1h$v%QFq`@wq6p8n zc;x)pnRoA~l-TG4p5sBLgE;ptT?pIrh}pLI{*IYd;yB|Z z-XEQu<6dGk2|fDJec@*@i;wbMvP|{44g;6Spld$xc101?Fd|Z<{CPaQ|3&WoA(k>Z zy_oM;h!Uo3+k7+_`FOL+-fsw!d5+4b35Y%1A8*HUMU z^Q1X*;m*iK(?b#Y1PxYA)$1pLJ<{EqX7E`Cayt;uRF-ftfjqhPrINM0c4&iD%S0Z*|Ya6}G{D=p;QjkBh zL|vGs@>+fN9fdVHenO;_a07P~?BnR-IGnq*Y|ZwBMz&_^>^zui6W}UjA%BWr$&%F; zPGx=2_~EJ7+wbnJh}WDqAS_eU4u%8V898Ph?oA1zPbl-DGOx=Rw>|fTWM*MpI8PHT#VVMhhJMNvoXI2JI#zo)T9JX1k>oBx|667-Opa$VqlQ&U3J+ z;?b?gNdemie4I%qn>p*_r`K%wC$MfS4k;ABMPYW&O~8vhuMz`jB-Yn5~Kk& z={u>%BCUjI@Qh(*fB}v(^{YdWO$^v$4NuJ+T|8dDdx4;VOWTZFZ?MQpw@VOLf5(M! z_gI5cw%sL2fNAl(#o)k~_{?6Fz$%kBblY-s^Z#(`u0;ybp{Y5c%5=VZVeIOAyGmDd=-{MuwgaxL#AWdw+-szby*1CuzLQ1ONvRu{}}* zQD6dQz55>6$Mk-kpPLrD2gjd82OaC6SPtJuFM_&zA>? zM3^J@RX+dNR}{)_c7KBR&)T@k(+IudBLASAUUUIajkFHBTkvCzv2#g^-x@NSzIDWG zTyAv==C;eT#k(^RP!Yms7Gba!^c@6aJ-#mfL}YqejXU8ptOTRbK-JoapAB>ph=bXv znux_7Mh73D3y2l~FG3g{Mnou|pe_c|sRWFIx}!B!e-RVu;m6a2dl}u`vDZ{P)bgfd zAE$tS>el%)0&l}GxaLiu{rVM=oQt1DcSqhgHqHUxA#L(~%sY$qhhjG(Buus8ZJM?7bR534UuL z#Do~v!U3p~jNQiuK4QR76g}UDQdU9QnZv6ovJQ zF7kmtH##&|VV$`@1$p!3u(zxs;Ld#KGc1sYtprXy!q9;tu4ov|RL?E=(^`7Y58MV; zzsb%~z(*RJ$=m%9;g6j1Za4Ii-?hCGr|KR&q~mZxsf67CcV02)yS*}ZH}4?>_kAkY zD8r4~HYce^XSW+BKk-#vVcp^5c51z8+qnKZ?*dK(RkxDq|7@$1QyJ7oOP^P10imP5 z)2i{@TmI}?&S41zgy#icgqWI6ZCv@PEbcI{=yh#7G3w&Wx3MNfdVeFK03I)MZ_$cH z@Q1fCNhi(dW(RZ4`XpxF6{J0Ro8XHPu4~b()Qm)#sqhI+iX3H45F+L>U#jB z=LC27@*l9=^Y>@+`~ZH?ouBgF=ycQG+tp)2R&?bc@c~TiaX6xeeiPubPA zN%(5U=Xl`!AwJ*A$nH@}-co@YKN&Awx!5knO@wYL<|yr6IOb|IDk=uXNDx7IS2XCH zPk~wWh^JFr?$^%wZ3?$y_^QQl2|*s~X20v63t{@ic}<<|Hi}yuPUj%2ztB*Xn)8t% zlrmG5e=0y2m(qEvzg=lDD6uCW5#nvNY)P5-s-lG7WEAq6)NCABdkoiC43T}s;839~ z;HHLR3+vyDioy}0%j=5#KB0o@Q-PVoITX|(ah$LPTa;yhU^ZU3L()-r`(~t0+6`u} z6RG$inN7&~0ByX~qCf42sUO(AcAk&dxVqW->y;@e;*=CNS*#}^Cr1&+(Xds!P|8>q0kl{yV|GA zsUKu8Q?ZBs7fF9+R183-@UbfdrW24LKJ3V*X?Vwg|El+y>J9TbOOqsyK>2Z!C%}j3 zOc#VKnzuxt+qCd~TrOlCik7M}SJxp`iqyl~E>2qq{$mW9xUzNivpmB%6<@V&eyp16 zxV#8&?$J7bR`kOY>&(lmrH|^oDGBh%-(;dW>1c2|R&9wXEa3U+jJ?EQq&=>FidD7e zyPpSRHF{fzL?_p3&uzJttg9(0QfDV0fj%q*Y$I>WLbGaPE_+{ezNjScZUHOB_w9GG z=C$6L3?~o!zR;<9U%CW5w#7pNb*B9$gBsR$;36Ea z2F<5g=d{~@lW}JMCOf)@(|>=}T{yyV0Q&QU9u^&xZ`{dhsz|}jG@JVmCevgXTC!?$2Guw}m9Sh_?a>B(Fe9$%zo(d>?@ z*w5u$sF6JE3hfE(hQwz>;PZpxj^{2-Y&oZW*RtodxlmI??{!M#|3n%b*H0g;@Q9*e zU!^#uk)7D!(gB9hk2pEEbzK(n61|~D{raMRAapSwAG|n752(UAZn!2beM#1Zi*bsi z0ha`{7bHyRCYJs=Ng#0t68BG!^q|qE zuKzAIY6cD_=>UoE6RNr#G>GS#%)|FQ8e*3Xu+b%AL|T87bwgQ#q&Miz(>+GAr!KQK zb;%I^%$tYllh1ja_cX}A%hWvDHwH<{0_uHfnxJ_c6>+rZTHP@x?+}5Z} zEt&NbfR-Z(EnnaEaV%B7NOpIAo^-+DLLise#U~mE#m_WtD)&CQwu)oo7h#F}JS(2S zhdQ*vt#*b4nX(BacK6-_wpJgz@xL%1KVE)O6-^`i=r`HepieLh1$r^a=%U~B<;^v7laU0C;!~!sSOyKR`Dt2~u&9@ea{J%; z+jer2d6jQ7)W2EsI4+LwXw8|C`oiVSZdq#M<8pF@L|e$#?pLmFA|g@@6HHZdzS-Kg zzn@yo%uwolsfD3saquH%#@qXIUv<6pFzfke0M(W$^KX&jD4uX;ij)yDkdj6HVb8K^~*I*!< zWi6A*qtgfBnHuS@Ux>9Fl@>0nm3~1kJt}7erxI6J!P^W4#+#XDq|4RXf;sB(3BwQH zDapk!kY^sQ`UN)C)D%;Uu$cRmA0rHgbJE(o?bdn;3== z459WpMchm`^oHKA*EL1J4p?#0nIEk}kM*n7@)EThrjTK=e^T+U!>oTu#gufo8yg$# zZA`T_ZcUcQmaS%SxGl&7<;Ro%7X5&kfsDMCyq|NEuB(H 
[base85-encoded binary patch data for the added image asset omitted]
zYGjK#nse7b&W&}n_2rIWlV|LvKc_v=&yWw1&z4q7scN5kO>cwS)xxHXUQ8QF76oC! zF&$jgBn~Vi)2rtG#lnBcglJSYrAxiN)I3s?H`EZAXiqO~7 zTU9|kL7HjeXNVzW#4`1vl*MPZ<)IROU+km|i=D*%!Uzg!O?3Di%yQWIC8K;_|B&Rg zb%~Vo+(K6jubp$1Mc&jOUtv_#@-RU}1YBMBM0oPx>`_$i#WriMx|Kel2*+$|puBM!oW`*$(UJ`9WJ_+&qtf0iD zvWK@h+A>qOn#4jc0Gznh3)CVD)_jou*~@!BOb=h2p_)9Nr{}$y``}q-SWDU@j!-L~ zVy#36?8Jx;Zi*j7@FRr0ElsK)G5CGyDY_kB8o9ZUtxhUS>e)KzS{q=?V~c|)d~!~^ z0!ON@a=_~t!(jKsUy(q`FPgB3}ugmoq)5pq7iOqyRpUe+D@hsPMhZQ8Ke-DO z18|0=x&S9K+>Y<@0)wt1Ekh-F&y|;3iK)P+>-IH#e2#~Xc1`oBrfo;%Ew11Nek$^5 zF@z5q+=#-mjpa+BU8bC`PbTXVSf-F~cwA{+y+65H8^6`bul)VW ztw;1BYG(R5r;g%OIQRwF(h(k+#sXAI!8feIjnA9LFXAPeMLd3B=g9iT=+X7KS|*(3 zHF5wO!H3TJ7o7}E2cYMguvJ>Jg?0QaObu6=0GU+4=!95$#aVp--%mP#y!NpV${7_p zK%Syya37}o&JONl77F;SerG_9*1dEeSdw#cB|u#SK) z$CW-f*7kYQkB;{zVH4B$O1BSw8+3bIwaRTfUpB`pMRiXukz=w7Fc3{kmptvTOfGDT z+ESNy_|4;+AI(pfJboGev|D^r(g3DoVkN9YboSVzlKBg}2Y0<6>Wm?83>g;9B4p4hYK<=(YRUvvn`;L7y+#^r&Wt-U2% zZt|MUODbnp8^t@WjA-k4H1dGl*Vfe~Mj|QJezsSeH4R&LUg)aMJY0Y!xR@&8Y^tLJ zDP!C(Vpa6-rJXpv)-rKCJYN~ae%xyY)rkTUi^=W;>Y=E4R}W<8-vo0MTix|$VPbNg zl$Z~QV~e`tFRcBFt(d@K-DTEiQZMhuIRViDx7FiU-(nzbgbFd@Td@aPCdW%Tg-g9A zx2Yn33I<3Yp+1#a(bOd{j5It*pe=r`A%@-KIuabHClI&e_tq)%VCWE?BWiE|b9a`% z$t&j7UQlHkkGDRL=SepsI|H$}rk4F74|GO)V6e)xgrkRS5)=rgM7T8FrahOXpO0L2 zp;-2aU1}c2*rQ5BJ99TVl6;eA{(8^f?I=FH0SUZT2xT5?WeL|_5Nu^xg)_HT}P zx0H@wHCVxG$RwB=WW?JUx01KNb()@acV8PukncYh7Lh=CFgA&qt`LQGV z8HfG(%BUd;7sQDD_jKB`CNjnG^Iw{EcCwXU?(IaiOmBj^mqQ&qY9`GCgA%<(`*e=@ zgr}{NPq(u^WDPLw@zvq8dv^DR@I0_P5t&HL_ zpX=J>Pk7S7|B&IKuT^m|ad(G3g)G9{T8q&8A@0u#F9*ir_z55J`)7Z&PNF7<2D=$5 ztmBRe8sHkX600BO@1HbzQm68iSE-#j{^7f<%T0o-VludmIAjnrX5AuJq=)?c68BGO z#!L-gim-1*-(sA0EwCuHFEdV}4x~-3P_1=G^c83+!RP zr)`5OO;66Qr^V7}V~+I>2_7&;WmTK%?NrS3&7gLdMfIl8!=cDBR~-tc+?V1E^~qMR zzxwb_x;`T7q|uJyOkY?HbZ#}a-dlc%sc8Q=nK?p7W9*_J`C80*hztcGvYtCQGvY~C zAEy>ExFZuC)n4s&5VO2&RnR34DA-}nyeQnZN$XAY#pAu@3=Yyo9Qks-6Y(LnE_jboQjtif$`!#dz zGBcvPicpf9Y9d-nKaA6Ul00a8AK1~H`Oe*3zgE(ttWpGjdatxpPIc@@chr9N~exAmxm2s zu0X$C04sU97DvkZiD$`8MoK0?Zer-M{Qh`3tPxze6QRekuM0E<`Aoe6(_b;yzjFH; zS@X-9vhwP&)2FQj@Qu|L*U7-aGL8CGx?=*@2Ozyny#YYr@PlDU6ePGi1K{p0?n3HX zpTI9nRhQ|jp5WIO({TxLFufZCM8)dS3KB935(k=>$h~0yXjM33{uhXeX2P%cdhfov z%6m=3wNJ$}4o&;}3CPej%FJaqg0|#l2(wyT`1=o)HCNSK0x0{D1v;%Ru$?J6G<}7Z=on-`#Qtzal(toDdO6n7jf~Hl-bP&yFIud#=Did@fBsY1D_Qy4`M;!8`S+~VztOB% z?hH!&IN`VzPsdcKZ0$8db}uVCc6w_!gQs3egX#9~5t_gl#jFxoxwSa!#2ogkcj1&z_;MwGMx zB?#v*9sfoztI%2&ZSwkU1(`f}sPd?Ittq50UY5*jEXice{&N0mLii|!q}NCUJ&1u#z(fWJ#9!QTZ!s>D@wF(vcM)4=ge9MQ0M z)qe9`nm3&Y(>u0-`qbRJ0T;N4(qBSmsuF4jYL9Ur{dRQ}5sa~pB13%w0u|Jqrde!;XCWsBUf@6q!`W-GSGeS7kX{41qld7nTKi6mcFwt6IpKg z!F65ulspiU0s^A*7BY+q?zylp#fwt==ZzE+vQJKE?AfCn4gW|iR8$I-JjlA*%&Znp z74c=)tS|c`uMnW~hSqrFy?Al`#Sj381IXw9$)m{aNrlmptTk=Fa;pMLi)$}^7P7Kb za7@;vXP(7W-9D!WX=C$1afdfDmTz-fuu;2xG<5@n*IK|t$VwM|Fx_(Nla=xAApo_} zKI6xA?nrL>I+VYtg}M(T+9o3XaCBOs-1fNW$&kQQec#pYqOz{6&g(F6Re|I`Sj z%~O(FAOG3^K_w#ru_~bcV0pSLdiyT18Wa2sF`6C-dbBd)((39=Wv+BzG24P=gE_OY zeI=yLQna(!>IJuo5Z%@IgOuxJtj1l6xjX(6&-|~nso6)LM6)D^h^=21kVD?eo2OGZkrkRN;!eHk6C(I?1jHsok<)4owTRmHopS+->}WmBaQAIhGbuehUG zSJ<6jKpRr};)R-?^1rBt{zt^$fBF6uM~)WnoBUp(TNlgw30SjQCj1mE9y+kCAfZ|Z zUPSbe&2q^$nA-W|fp(y2lUS0hw`BNKYyKe zjZ~?-UwDr-g@D8vBIYA|Qv`j40|$6?l_ekMJb5!`+3{Y*rCk~9lJv@HmMJcc%SZZ2 zj;_=Acl(~9oKLMr@4Lr@Pr;p_IV1oue33g&e_VZF#2s8!Hvd$fDJL$P{;`*z3h5r? 
zS2{J)nk*bfDL3@<(KL?bny~|3K1)%&v%`&(Tz6=(mA#p}3U40fT_vjygRxyBPq4#e zl&LP+%@zw#;hP3LyEXI+GO!A(_NZmq?Tz%T6L`hx76|_aXN_8>^r-A;7MwJj!f6!@ z7cQ$z>4)&XJ~KcUN^-O?1vyXXOn-K|rUD1uL4q5Pi%hIOV*W+=q5WTk1~S{egqfu>F-SU{Q%+672w;2a%^rbA0$t53#24^NHA z&Vn`PBpuuVVxZDrB!Ij|4)P()0vrUpf3Rjtqoi`6bk{T=!a3~v$6cQD4otBPyMAi4 zBj+RiIo(8nUTM}%0U6w1FgD3m1|T6jB?JB=pL5c2QIyy9_;%s91;e#-s?6j{4yku3 z=gifZ2CT8ig*~0mXXIFx!xS^U`sce$Qk_?Mrbn9;NIB~3oa6a5J=}AQX%VKGzr-i} zX@&guv?fzQ^a4Atf~y7>T4Z;3?PAkrvfikhJ6I7nOk($ z$@W)Qh%w)f-DSsELECN`53t1%fsYCv>fnjzHSt7{df`Y9M_SBwuu|8eCtrvG_I zDIogwpZA^rV?I-xbASQJ31>TbSP;jl;#ev>yV*;z?eo|MSnUyfnu6l=Xs1a`gY9`~drh@s8rF?^_oi>xD$o@o|P^nn(0`MkVY_f=1Y1f}r- zhHi;vtL=ZN0Rgx|^xH8k3eSSI?>v^5A<2K%ER3ne3UoqGJpxzjiz#c2RKcGx4{UEX&cjv4MkR=~rBYGRIB z-7SmtA4WgSmUSZe%}$P-=Gy1FSb%p(F|%<@Lx-T=KIvEUn?6hIFA5%gE|gZytf@vm z!UVSKfK88yQyueEXDuSLhX3@mGNZyf5lwj-g39Xd3JK%y|G~Id{vW6Df7^QwIx(ua zb?oF|MB&a@9I9SU#xR?=E6em_l@0*mEvAL@{n@{6R9-h;8GUcxXjy$BrX1yxCEB2f zO0N70{EBVX|8j5qSM}}x|Nfp6aPKfz#rx}=u9?`!*Sbm91mZ%!YKrgXBizxj{hed3 z&Udi}voMp{8bO{zQyXu&fp6Dlw~intIPgs>?!iiM6#hi8WY=*e?C_)K*Y-w-ldOp+ z3koJ&sM9pD7UOXP+zAjnTvp(4a9Oeht%9-sfh(?6$o~17JyWAZ=^sSawr%h!lcN;S zTL13H8432F+kqhm8iIkeC6fiZcvOK&^>}i!4OTrZ?DqFP&0qE6wMNw;VUA?fCY5y1 z?;N9=s$5ULyc z`cVYf6LD;QV1uarz{+i_RgxDRordKiF%4WE%U#!GenNEB2TfZ+7{Mv$ck6%3a;|Amu^URgV~v+Fm`48KVKh9o*44oeK! zRh^PxD0a;T523#VZDMtJ=3H#R+_CSEo3BtVFpY35cGR#h_ zMAlEm2@6-)CfW5OP=yBh@MB0;Aq3od@h7E-l2WocHb@N6J)g_rL=0|pT1pz?n)KmX zOvxirx4Q`dEZ~6pm z5T_vy|3)M6zw)XBI=Jq^^KV5lNs|};Zo+JJPlzQ z`SgW%SuUfPXgPNzr4LBd-^Z~K<2q8z-)FRtzE&H%8L@@zbMS4YZq+&OVj#aifztj} zAEW9s9GJqW|9u>1J> zjpUC&$F|?(H{jB74Cw-?8djO2vWlIu65be?Hi(cISWxd62NMMT@Scu!cKrJPe!Km8 zyNRo%++=DOc2tkJIT5_E9<-kJ`HLn#Q-1k$G8Jh6Q$BV6ITzGWi6CMa*xAkuzapId zkEtv=&ZEe2Yzv?y5^ses;=0oL>88AMGsSWuy#EDky>Xo)wiGf2D;u}H$-(i0o1Xjz zpS(A{;kukc0^l+e;(4qp<~7XI6+kk^(2!i~9Br5;!^pkpKT89d6|O;ve+hV+>2FHG zR#>%f^e)ZB=(oL&D9IVT1+B|?rejBxII%WkS4iHnok;NHO)=OiQgS8t2c8<89leN| z{s0$1kS37-S+i7lvxV2q*t`6-(PH1uq^3FHS|Y!tGg)fZ*wh%ElWh43kUiSs9_Wry zT2DUb7vr^udNU@4UtF5Vd3w6%FAnEhQ8o=u_E%N(v(R`l4e9=jQesVs@k%a{qdsH? zEUs@mmKEP8noe8IfwM(@GMaC7d+E@awfiz-Q*2nBB8}A%zm_^5ZlPWLnc0&1uDB!% z%Z2Gp3OgpylO)Q=;ugGJ1W`|GV7%NDk_+Ank3I#hfjJHp_6W2--t{9=_ah;=rFT7) zXpseB5>@r*v3;sIMVt8{jM3^hkrpF?fFFsBuGTT_zCqzcW4(><6Z(K}HvbSP-W2+e zH|w8yFAX5A0wUReq6Jk!r^5y{u_pll0(f2kmr|*{+(y?tiq2|{wP35V1%n}#xiGzH0rytfKD85truOm;1fN(Q zrpk*AuoObRK%DpQR_qoE!~`YrkVS0;umW7|?JdX1^e>(?*Lx~{cS(PvTLQFxpj$4C zi@|^KuH9q;Fr4&e0d;GMT|DKa*&T`sk)K{O>(o8`20=z0A|ym3I!Dw_(h_L)Z+JEj zqs!JM;c?W&*{}0?R<$28j73=wUrMBD#RZVb8hZHXus5)ITu2SL=@lX^RZj+O0B#Xf z^Ftq7v}iZz-aO`4>t%Tr-oVz=#87oG4)Mb@UV$avwj_d|j!|bAK!kljkjj-p2ffmM zV+WES-9(;!CYCXe-OqE!{>5&3e!CfriN})o7-3Wq>CDif>w&olZQITEht9s~T7j`A z1q`M?Uw1*jnD3Nby0;x+IJ6s92cKQGELVX;ERI>r68$$nWrnXK7Cm@NvhOjF_pomO ztaJT3@A#Ka3gi}fQ`MwPjTe|6>TTHS-H;)GiAkG_D)7Y$?>OudC3A6Mw|SfduyDbF zmCM@-P#EcC2G$Mtp#tNLrqmj}L|!C*Kh2rvWXdy*Sde4{#l5{By->!S?S*Jvj#|Er zXW0>JDXS?Fr7md!3Vsc<*pkvv6%r5s2P0P%@8 z(=2#MuG}m~o?8nAV7q9fT}Yibf88bSktH0V0bK0M__+0B_()+l`ocdOqa!okDDZp! zyP>F3ZTQsOQzi2D``_gHDc)0&DzTok3zr*c?Vb+yoR@jHD#uxt0=-Sz3jZU2zWw>rA&3+Gd91$r#S~F%)9q<^Afui5rI7!Vi+!^8!Ok+TOq+Px~iZt>A$w4tyE~P ze|DMF`CWIve7q~zYj9w{M%?gqEe6ge=Ydy<{@$;jUi16P0s~fGek3h~C>H=7x5e*R7H1p(v_9>6SN)E<`|ZrT^k^e>;Y1<%)B1Ylm3sH`IANZMoeh?*U0MNlfp2u9=5o+)-XQ-r5D z7DHQ_G~$Dpk<7bHp%y4g0~rK<|GsDgMc^Qqkk9D6)$~yOQ!CPNA#ZWEAhV%ayH7T? 
zp$A)+G4X2M;)KaKApIWT;X$YY)K@x|sh}gwyq2YQRadMko(ufCN9P2ao2rYn*=7Na z{+odl_cy^dpm}xxzDbW~__}l#<4ZlD;wiDX8*unIk~~Nzipr+UJ~hl7av0&D3QfZI zAs-$>DRKAF-9fGt%X1WO?CmDYleQY9trc28yO(#4dZZ@A>OUG@Ca&8Qc2jC(vM5QbpTj45bgm+ z_E_Gv@%Si++ts_;Fklrm>sU*B-#nuD8)-`oi8>?=NeP&cWd55>Tl*D6Smd|8XN@BC z2?x<9f)$5S^#|*vrY0Ul!7sh)@;@#w!u&T2J^C2*EPVm8?-7LC(X^6Kip_H}jigld_v zyQxV|zNXzoYVS!ItYy(VxGT_ z>sl9*3@T^GDM7`29MQFlC+93de-j||8}b3JAg>!fMCA~2$qYM-zl+Nv!U-WufHW9_ zWA}fy4byML(?z@U-|-)3A4A@Q3gLT!h8_D3M3jb)WBquQem_?7n=kM5!N{=v7orAA zdx~zS{E{+UM?6C|ThE>8!wu>j&Qh{u$yl)$T!P1101`N$WP4e-JK(HIod)wQ+0+70 zNIfUI^Fg!|iFsic-<5$j>DngEl~hVairFVYM08%3{V%1ZN1Ynm6zR3gI zrAp%8q*qSvWA-j=MK!?JWWUtY=T$D!#{VW*=fg@`zlcI^Uw;E%Q)lTMEK)am)G{MC zwl?tc_Rrgq6;O~UjtaEqnh4;3?o6(-6o`u6yA}>P&sW1y<>Hs|Z$Nj2RbU2-wfI4+ zYAjsNo>{P)XA8r3BIwBss@=j{tRitz*ml5FbPu=tYVbkM;_&6+>Htz)8BLL*wqmPJ zR)tRA3cqRpnh9$TU;gk&+gJ2`oQcY(6z|OR9sTx50f-eW003COt&n&{vo_q30en{l zW4qkyRR*w30JaFH1;#VSqx8!0`<_`|}=x%CA_HeCc4A zA*yfo9cMRv$`hoG7pKZ~9QoGwEqFSPS_2yl{?k-+B=JDU8^D(j&tf%Ow^h@!p5i+0a{#Svxi@w8uyqiczdbF@ zHxlzV!AN!#(q!2Gh7WcE0yeXE(syI4XZ|K=1}>gCR5t8mgG4jH4zi^lX;l+l0$u`` z2|&Ah!dm6edAaeFI|;wfp3l7nU(sVhIUsZ*098MiHbzG!8s{}1ymG`Sxpki*W;9pA zu!ej$C$O$$A4=D`4KRy#B{$_V0~q+hwltI0CHZ`Nn^0+(1*}!QnzueJF=y)rXubhR z8*{So)O_lyDzty-5joWeE0$)z*> za;uqF4Zj3i=ldMCBj?+X8}2^Bd|94H0Q>CaJnl8t!4_ZG?l?FCnS zLsqE;xZ2w7bzL@ts>Tq5zpDZ$Ii*dM9<2BIQ8v)dd07abPv+q$P^+P~b9oWfzXH6H zVwRr4jg6a4-@#Asf|_JD>5#FA>T5?l|9On+#lm>eU8LG{xx(-j+d`H z>8pE_0V)qndXLgs?07xBlj8HN?5&N5fZ((`c zSxV&9C3gBMop=`C2B|1XxbMOA<@1t0c1I8++zQHZb%Rx~?~i|mv#cH_027&3UbTEJ zklftIOk#KG2b%}u`_iOHk)59DSb&USR{OhDuW2nG9tcsdD3}QuFOfiB=AA3I;R_E8 z$>%_8G^)K&L0&?uNc84xG}cfnuMJq+$x!LmRi0<_Oj1P;02KYm2c2fJJ2FczQ6^;U7q9d{xHg`TY9cLfvsM~ z*ztETYQTb6X`$U>B&zau+i@H|wto7ML$ZBt*`;OAjR~D1#J980QzG_7={uTy!9v1- z#kRRI{F|VwJdt^2YBX^HIZ)j`iA8kd;LkFfqJ6+w>?$8|PO#MX_0n-o+-tZG09D{U z5C4#z_YnWg)wxtNI8|m-uMM~6p?(+gE#dXYJx0q*D!vgNA>5mg*(?n3unQZAT@Ryy z`{<$9E_l;3+w7N|>T)21-zY9QlLwDAwS{k{hiYPlbbv`Cn357_uEdd1*%E60k!4Ka z-CdMr2St|x(8!fT|B~Cnkb67Rzt6F9B@IAM%y@b}&P3MS1fLO2K8xVvdeiJ0b~k(4>|!9@kYjvL9-d79{0&kM{|&f7T`r7d9UgMsA!CF%=Ps}; zM8uk23f(hUyAhIBebx=PJ>+%i7pgHFsC6u7Zq;kF?+EJb@&8C?usU7)p6d|f3aiA; zo_Ka2H!BE`1o0w-Fs@Xx<%y21puW0P4GY4+=bxX%*+9r9Fgn_AU@OiIMG5>aV-UQ($nMXTo> z$5_oyMw&!A*}$j;O{bBsJ5uaswKQt>0-UW69%?=MxU2U(bL?)r0>`}s@x8Ocgy|DC zfyYb{S#*pJT$u?Qc4z)(E0Xe^?VzUdsT{7L;pfLh*|CP)I~@`7uU8y(&f3tgGW%N< zNY0BkS8sP!-fD2q5%BpbtZ$mamsGXnZ#sJ5I$fz`VC*0;RjJu=NxtZt5lX7WA>MjD zWnTKw2@yYF^WHb;izn&F#UB2$n`H-IE5=gcjNJzGfFO3VA)| z#-6;S&dbSbXSRBCeZ(=Q31UylqjeM0SnA@yGtX3iuKURhSsL(aPrjQAik+Ls4I&-4 z#!D>pA->78JQKGG&DJ&876)qef4(WuAbn+3>{PM%;BK5aNMP82o`QS~VpY?d*BD3} zt*gM)Q=&=`)+l!SoE!;CH4_f4ceH)H&=Mf$Hu6>UhvkE~8GrpB3d#o4Oz;WlEJ|s3QHJ!0w`H=66FY@__ zg7VhbB0`w2;#u&QFAHQDMG@_Bn+HdH!gtq|f;&G4)-F|{YSTC6iS3 zQl;MaQ1z8Wr8dWdLaHS{WqdOzJc#gFFSS;obS5*bHDr(vN8$Avty(~cYR7tKd7=Ja zy-8v8@BChoa^)TmF6QQmMa{kW80aTPVE#w4$I*tR+NAaMLo&v2V;3)>@@~o@uOpUu zMxB}}F9)>w;8pqoLzx2l06{dGs}1448J*q>U$#6ltDgOHusAMMN>pU{9>l3=*77a* z{>+wp4!5@Khf%4dPw7u3yR z8Zpm_0^fWz3RVD@_*{|?-=r;lJ+JtJJ&K+N%o93fGU(sRkPb^C=;;d_e_s~gSH>s@ zXV~5X140NZkqN;KlntunkBNKpikRuW~cf)Rq__rs3o1J zEF}dR-@9A$^e%z{vp|T^F*J3zEVRsf-%lL+Q(kP{d-{bh)m(VD?Q??2cE?WoCpzI$ zyHDD9BY~E-X;+xDa8yUcqoZ!&&Ni0nNp4$IFTjl9JLoQPQ{~i8dLnCfmhlTdm5FSx;qvU;g1ajllTeawx@tb6Awoi6@A4{iHX0n@;;%(5kP{|$o ziD~L-zfG!?NCduSJ!pL6b*oj_T@!2U_de-#Ihf=S#Z)B0p}mIXRlKF3rnf)USxV25 zy*)tCdT7O1%o=t^<55MW|Knv(i|O?9KlA5UN!6Ebz2X-LkHps6C-fif`B#YN1JmR@ zHp^{=oO>Bth@4(`l7ER8^3{5F0%=S+LzGI<9X~_8L!jL7~)^gcb=4bA2$^#|&^#s>r;` zpK^Kcv0pk*!`&rrg| z&5OSO#6HezifY_9h9DZ?M-(TVs(e;yorE7BLd}4bYl*7H*yEq!@&de9#@@-P`&`H| 
z6St-xL7C&nbyeL)n&uzBeN7%a=$JUgs;<(JxbeCH-cL)7$ zpOt6JEP5yW5`JH?Vq5zJaqBzQ#RQz=VF*BFqB#UySpJ+N+|c=iudBPW0PoTkBHR&c z7odjR9yVo#G(pn~CFnUan^~^`5A44+@gjaTKeZI?N9@w{6s`BsqyZwN=FQQi;LZ{K z#ZN*U_kN`ucilx~t)`>;$Ilg_Q@YNk5m)I9-<4SFZX za(h6~HzV%WPPdWuQ0I2-pI`=9-FWikm@9ku=G(Z)sE&1494Gyqfv&39%O7<;@+^p) zFS9KLRAGFCjys0t873o;E|$#~G@h)du?CA(lv&LLTy+ITtyx0t6545Zt?Q zf;JK$c!IkI3l`kng1fsnH1z-0f9~05@9a9~-u>>adhgy_KUL5*snxxDjXB4dV~+8S zZ%iiT;R{5aeJp%PdO^hfR=O+6QPskGEo7bt>&;e^CvXl475rit+$X!Dqs3SJ(|b-? zm-3ipRT~R;pG@Ty5y-09ub5rM)BB8GJd848SG-mrbrge`K=iHzAQ=LQzU44Na zPSf7KNI%K4mnXUL5-D`Pb#d@qkuvU<+E_`-JY-v{9iH*L1PK6>q<;&4`}U#6%X?&C zcJmys#i|pH%Oll9_1Z9?zO3&N)mOi6va;}!(#^3pJrLg($hD{Ra=6Q>J6C>;&7<-g z`o7*dj5fQt%C_Pdb zD#`JQS__V5y>{Q`e-GvKSL6Z}T^-jQbve9?!zgx$%+{Ts4-X~{l@=D?%Npm199(Se zUu$whk$)dM+#mQ*o8&)#EKF8Pfz*{L66lYd`{=$#RH^yYT)}wdXutQ!{>TZ=ShKut zf;}BI?0uzAm+&IX=1sBid8a2dXRSJF7%BdO&G8=lF#S{=ca?(kss8IME=c^_ut@Kl z;C#a62lM9WRMz9(w8b(xdpy045Z6*3hjgqG4>8vgoSEU9-uHN24W#>rN%UR9lw(GM zjkd7+6-8vU!m2uR1%r+^N)@QoV$j{9W?r=%Q~KCLv4Lgk%=%V>obF%a5D6u(lOru5 zp{TsK3Kraf2@J8CX^VH(xI3#2)VdfON{8+{Z)EMBco-+{4U4H(hb8h8NOrM({z71R zRa+ZbBfX3(A~@Eauj`Xk90#V0UV5c#8=qPTl#xMp=X{-xv{f1_7I<`Ry>YElh$5zA z-&f1hxk|h%#RAx?8d`!`rNU}ntDyJn1!$S|VX6t#FHAb@X{#6<);#$7^hq~es_zJ^ zt)+DZpZ7db&w;^f&7tQmvkUBE3&t)ge!JG&y9g9Z9CUc=2K*|98@_P^cs2$sK7T_s z+&gpYq<9EfVTiquS&ZVr8_^8aEt2Efq)cZ&fKXaaB~pV)a5^&IiU{vpZ+x?gS>#{* z@Qsv0+4D}6YaCB`6e`oc20qH$9kBS`i6l^1IAzZqIiMy*c@wSWX(xMy3%kzEq1TFE z?`aUoa;$0UZYs3|x%L|;rR`eqStX0bYX1n16vkKs+*wU0J^eQ4l=4TjsQqQ*yu1ya zkQK~he4IPx>F9cGS1rL5+7=jQri1l808eab9>sX7MP-(@`NWCd>zFYHtjJNrCA0w zk(p<-5Pn~eD}fqM!q}y*-G)!Y^7UmFW37gaZ6CK^v`V1jQzk!02`ygO7_zcRc!(M2 zvX7xCb`?wuIjS5>g)%zAPC1pfmlinNSmz5#@+dVGR*>tphJ@EYF(N#tS{^aU>_k=N zx~rZk)_h;w9AZTe;(BJ@;(3j=CRh_mUB>3_L1kp#dWH+t%dC$S)tIo)iMg~; z%Xr%`nTw_>M2q1dH)U({P9x-=#AnXpt0%5QdBO+pJ=Mb{-N|uD+(e7!){um=F1OEU zuA{qLW5xxCBs1|%FQ-~f-}VOVY2~>xa)mHB$B6ePC?zp0-%N338}B%1+8}>#*0$O=((ftwjWbwM4$a3*r3oKXrGwrB z6cc(Gt$wFd;zPx*C$xAf0moK^ z16LM(V)_|#m=(|>8!?<dI|TYuJ*;XmVGPKrYl8I;!f?LUt>^7~9rzHP~o;qZu|`eCv{O04DKPW6oJ)zB~Q~ z%kdC|8i%_TJ=+90=nszft{l!K0cf{KIe^}{w*etha|<~zK-L1_kr!6i`kM>sKK=94 zzfchWhvx*$!jEa__o|CF`@J4NtIs&VRd?hKLk@vq`+Jia%&FW<`~*78NHMm6O_x&m z>&OEV=kT^Qk*#V8`zjXMgQbs3_pR1rb!K0F(_wn?u;)mBB58v^Z$eCjE64mU&urlJ z2d-xzKI_<~ABk$8+MoMvc&Eau^n`+Z9I)=yN_WIQT+7QbnXjmBbWeLhVEPB>xr*_d zb!X&Lg6dvoEcz0cWQ0M@7^R+$mPN5ob$auHc_SK^gom4~ccxEL%)ZyHIV?o;d1{KdxbE3* zL4-k%=Et*;ir8n@Jgs0}Xr`ZTWTZ@(Qd^Rp(#Y{M%1&#p z>-4TZVN&F~If0c^4%_W-aR{ZF7zcz^F;(U2YRNfqw)4x^w*q8&PENYrmVyns_cnC> zdmK{zZi__Tk$X}?v+_s}))Hux2<>jBe1d9!nGw+Rh%ifJYvx$hs?5xU>I+`~WRFP= z&dc^BXCPFKAL(4oo zA-~LivP2L%dQMZLIGTXCeJD2?{VCUqW1d_4|=xwGK-!8ao{4 zeOVc>lta@#RDNHNZ{5P8t6mw4Y)<3s9EgZ5!v1h6sH6f;#BQzW2_F)-YSRs$bH!a} zqiXb0cY10(+bCd%n=COtE#{PwXwm6R)a_0U`))CCy%4J9af#MIemgAW)-gPRv_ZQY z@IeDI?85sJA$JMmur5W;mU>JeJJ-VjDS_q#Fs1$M6co31?G-1Gpc94f8RSo1rb2G9 z008wMVAndY0nms3%ErG%pF}*zE&He9$f@3lTb=_|((*BT&G@#nl=R5rH6Yf-d^4@6 zn-`$uYek(EE_|$VM`TGY#P+Ewae*7JyBG6@sFkLW4wiTPd(?dBDXpHx)i@1cSE77b9 z*wn3^&3<0dHBkzQcgxv0y=%`agruS@fI_&irHVHRI1)wKRva2KZtpXJ)>~xg0b|Xy z)6FD^vR8FHy?MGn{L;GUFue#Xo}>?~#S7JnZMLYvmNSU0xq`~%`(qO=;mCE(O^zFh zJR1kW;?RHv*|JGEg1Mxqa|Mo1Mp%GUbph{(byNQVavtQ_TZG73pDCjrQ8{H`BV74B z^&lvJ>+{MMPRR1NT8!;vH1nJ_AO~zR9ph)vB%=ol%1E4e%TkstHuAUg`+xoV(ud`q zCI1kM#P>0ado$hIZ=%u)SXn^X0zn12zE94}q!HrGi6^hx!54RUXJyZrn*emrqm7vp zNIO>EjkCef=Y$BkU6|xbVDus7nZD`C%Ng@y*P@uTnyQT3VCxZqYwtOlnPI_RcNB!b z^`>VR+6_VZT6g7SiJHK~+!SgyKf;=qhKe*6?t;b*-c)o9CfWro7HcDP^cz2L?wiFy z7G7F!+j&fh^lf;*TFyYot9+{o7^t_QOuvum6S>SiQ>%MoGAh0v_p87KETH@0YP0zU 
zVTt5N*td>KyT~3~$mq@R&3$8r=uWA1bTJm%W(*ETNRT7K+ak;+I6E9f<8^t&tMorg zQj((zP;5XU6UNDu|CGj!Vq|K5)T@$Wll217d0Be~JfqUJ_W1HpCsIJsgYe^ZOGALw zmbh?9zR{XE84*i$#qj}=)o#~mmvrzcanQtaFe%NX_CC))Lt38QJ}a5u)(+Fxu8*v| zN1x7bFcz{Oe_}P>O^q#Ul9rP&N)r;_g_pp79gRg9hD^DXKr9p>C_1LJF)DjGv?I@H zHqG8*p#ufmdej!%w$iTV=Q&az+?}-HdgBQp5e}vZQ)(UAV~7P*-zeGj&w65RCJRq(*8=axpB}R!h1SUufIiQ z;NawU8 zj&h^)1*itINbiwBkEr#m}%e&B*`RJf~KS7%C$POp3{DUl|**GAqzLgSWZj~HJ zZksi)^zkcrLmOk+toi9(Mf0NRy0>e4U>({7xw+w^#ICcEZE{L_q?OfbOUl@et|#g>w$ zA%M6H{5MPbw^%;my>v`xPmaq{Ph!c(?M-9@**w=#E4d$DEZc*TyLrW zYB|^2)Cu0ygI+OsiHSU(n$FovA8WSvQLaFhxN!^R1f$nfTziyNmCPTY-E6)*svREB ziGk){xkTOGBN)zVKTgJ?!$9(X<8$~gUkeVFt&=nuFcO4GH+^YA>G8W6cqsC{bk4~G z)geKPDcPBJO79$B!C7xUglGXG{p4{gb3$wVX)P}Mh12p?D-LrzarFEkwj%w@qye@E z^vSCos}sb*69qzg2>#I=yc< z*AItZ@>T-AIuRzY#r>zrVHO#I8Q2V75d1w$q0FSg_%BGQpY&C1w-p&T z+x-R)0i@%PMu3W#VGK(8zQ)g%hV3em!laSr< z-?|OJ^65uv=kPAbQ?Yx;D6nXk_Fe90+4+h=ma|mP>{ooxZZQ?TW7Z7jrdKZGHgyop zvJKOlx_%sWRkad<#5lA!ga9YKiow#7Rbrgg^R$6PH%)APxm+iPsO|zYBRPrnNqHm( z^MjG#YMnS_;x&pnMCW-v3e$MoKHT8@bjMM6c^bF+4@GVn<`jx2qutDT$sLJt+e-7N zgyh$oVeC8Q2rK8TUL_%!MAhQl>+Hj}FqV=C;LlgIfBU^bUwO7ac_O-c3z^ksEcS#( z<4P=eYP0A@WzrC(zwsG~Yi=fmwlCg0j+}Vj5_tsfd{!RakIo!tQY=D|)i3Y` zXxQqXzNarCD&B5*n8RK@O7+XTxkOQQ1d+Aey`Oc0zPj+kIuTn;{HyG}sMAfpVPWy* za5i1oclm;SWx*SVV;Imz&E?n;QoVh-r81RjctKHK!b8>Z0;hr-*_g3v?5BT0>CjHz zNf7w@p8C5FAYstGkw-pIpCt2(6OwROxarN61pINLT24vnpM{6|w2S|Fb+NwYNztm9 z!6OOsJ!%1#%K6zY`s&CZ;sYHFA8zuQonVzXmeQZ|1RorpGdZa!DVsp;pz|3|?Yw19 zaJfTj9Px7zrbO9q-D6LkaR+={rTel*5--X^iqv$bE({2)u97e79j}suZc7BtMEXVR zdzs5G(S3y}HT0mpi7P~pc(Z?ScNd(HCJ$xLFTKqSJGT$~1B6?R7uGB>)`*`cOY>~# zO=WtFBq=xhTKyy3(S>r+%^y}15p-M3nS6n|$+7_RR|phHuVHnESw&KWrA4*5 zA~t_QnPE)Z)41M({LE&v5=$lBG4rB-@7GM{{~t>7H)Z*sd&Me~*OuCm7g_ywh?U#L z6DA%c5ypOQ0#5<97Pa39h$=ceTn$9HBghFfU zrMuyzjNm0EOuVHAa+_Kf*|OJEi;8`vS4WOtz?5zHDSx#TK1KmSu)ta=_U61u1=3&x z^)d9Swk5v((f46vKb#EQr%&frs^IvR=BV_q0K%&irmF6KD`q+OIaN@Teg2Q>|5g@X z>eVSj4sc}vv5;3pBhw*x`k+ZKEGN8J!`8{3XYbSKaxUVaV^^2*4))6J^Tp z<@*{9l~og*vty_ShaaHiBJZAq^JxO~)r)WMBa}<3#JfYj)dH|5q9XBTiEw;%AR~T3 zG!rzyKH~OGrF*pcnr!H^=?QxV_ad~n1*ep>x_UFm$_q1D$l%tK4yN5>Kl8Zt`r4)8 zSiA$+#;XotUfEDl!t^sU`7j_~5Yo|fTpaK$?t_BQl6@46}z(&b6qCAus`ges^Xn;BB%810RP{VB8ksK>aoSP-A-yD@>WN2Js3YJXi=Rarxlhz+0qE zSg2vgnLU-UZmx4rp2t>3?TiZuC^m_`+Fc4RkD; zyLG$6wLW;P#Cd}rt0#z_dG{R}jC?V(F+$%ytiw{jIA?e6IqOIulr$TNVtI6Z`&W`{ z|Ce2ZNu{h>KKdOQsi~G^Tf&K-oggFpsE|()s^|RyQIPMgcycdIK{9J5w#Id<@+ziV z@nC7>1Md2b`2p35Ga^M5hK_)G?+D?c;^?Tkm|u;HkMWOJ*m>Mj^ezFY9~Tw(Gm9qm z&*pkzDPnJ2*^^FX0B)=+uz$t_E)?S)*m?q8^6UF@ec9&CQiEN*>Tl)F18(k$P z!VkOeTxD1C6njsz6qYku*_S>gT9@ouT`p49dh_GOA8L7alkI0AQfeiK9Bw4@ET_n& zBFN2>(6ndgSY%ftY}HjpK&(^5#M1fZ79#22ww2J$%GXEp7QyBfnxRVVRA6U+Z*v3N^fCVXwh@ODW}MT=47jL@*DHC*f%ls zn4d;6n=XUZooO8ji4Fq%ZJkt-o_gdMWG(K&u9U|8d}gbcgZrulqzKv+E$SfC|E|F2i@% zu;Z-X26OG#(T);bibN%M0i{56=lK@uwgsW`+u&g3wnqeP%oH|-YMq8{Q#t(sf*Rbi z{5H5eMww8;E8?WAVITHd&T)Y#YzsGv@V9{%@P%+?e)VQn1I+SEH&{T$-cllvm4*B^ z5^sbgVbSlK{P&)XdJ#Mj5Oc|AQDVPO<%aT`hFk%1ZiD|eu*ch!>_L(VXw+Wf0+@yu zdl|@6z^9k_eN-EAM)Q}D{u0vv?kkB+#MxH=!N4>ze<%5cS*Sm}fG+oavZc^Epa=XG z1sKrWbVA(tHO*Bv6~t4R-}jG41(-y}hECNaPShYOxuV|;h0DfFwsmK{#i$^w!1;ey zi<!s^!U{>g8+NgueLt8$H{EAez;kGT@U7K57tFu@&@W zTe*LD5I*V|ytueHwkWD2n?P##_S2_t>gv*_Q+dWqLp z=94QtF zx7387m+1Q=Z&nU}VH-v?5n{q+*-Bq|AhqM+H=iZk4VsobL03n(ob+NvLMa!*VFZC_ zSF-gTnrcC$cn^kV^j@?+cwg!F(=gx~@F*B4!}falXA#&hJ5)SNrllMB9E9JihcLh$ zXbRZcF+NIWm$54oSS!%RD&(cX)mRsmCu$)j*6nL#!I+&tYJbGG3=1L?3|52n)VXcl zA%ZSfMJUGFmf3NP|4&#*bNk=y^}VN@mROFl3G-|LwXl6{X{B&GcuJ!9got za%D$J`7xx6j_OO5iX%=vd;6B4EdWX`yqIQuogtaMT!G-}( 
z-1`|qE#%X=F+iIFM(n|YoVXbKTeg|N2uj#MOMwPT=8E4gteYGa1>4ww(W3}YN_LoY z#-J_+d(;Miz@7l~A6Y2Fi|0!{s7bued2t4>=2AZw5%!L83N~v zZ8|W55H?`&uILtiXu8X$$OSoQdbuPwrZ3viUyI+4?MUn9y37`Fs_1Ed%W4p|bk3m< zTSC9JmfiZoYO4?abMf$s2}(5@%nx&AHp%>>fCXT`iLnT2{`aRQncRi3BjNvpx4;il z!M7+l3ZrE};%^J|>y1O0X0T2_VJx5e=Oyi6Rp1QAB}sTOmIG^WFmvY`)K6(O~4`#a2eFZ>|C-#<=30oGb07NF(wjTv%HSs#+ ze_C$;q}W;i)z>n-@XY8#t^JRzN7g^iCSK9RpKDl8$`k&Ni^Sre%+G%)&KDp3ywjGt z(_Qun*oyqE2EdGBbcRBJ*JOTsJ8I@1r{v<#DM?h(2h`?~=4$QTW0H)kpMaTw{Qe1- zLw}!>|3Wa3YH%)&{%3qW6kNf2Rxtfy5-MSPya`r654%kZESx{zGISA{u zPmt*ydupIp4}5b1_IGpi?|L7U`NN|TTs8+VBnGqm0a~o{g*_{J4)OKYS|!V=J-&;c z*zmm{!)X>~CdCXYmtjEjvG`q0|IKoO|G%&VdKsb3RPI-;H&nx)ZY94iXj@)-iOpz9 zlMH+eUuxhXS{qz@Oe6h^fqn`7Ci(0nRVQI(p>P`BZ_u|naZHqV+?nQ%`AMCO*%#C2 z8Y$|HUw6EsOojM?$E_NJ_v;|ldxSoMjMCK=3j}+>ES8GrFUEdS;f>F>D*--UNb4%j zaNK)qUVjR|$_y3DQQOvWpo&yIF+adN3GcEvwpwJY3!XW}7Wz^R$3iH_5|#rqkbGMd zAtn{Yl+w0b)^`PvAn7GS%WR(`lb}k=*Sq$=?(9#dyX=EoaPhmts(WbaRbs5k&PtW@ z4&ylZcYoq}BDhy!2PLbv9!z7+6fKkq?t}G8mzgWqdV+DR zGDDo4L0n&zdE?h`x2vn!ob}nLJb=h;vYjMWE47!tVpr=+ta5$X1ozD2H5lM^vE2%u zaCRLS%Z3Z7ifpcq3eUt%u2On_ZbcCb9&5HM`)53XvguB?AZn}!_zO|c@NrZOffxjOGKo04FW)R3^lyA}L@TJRLcr&Lr3!-=B2#iR$ zl1o$^iyV!&EoJh6r&)8bS8HQnZ91fuuR-MR-V~pa$9x_S-_{`Jl>W2h%<}V>&9CnY z0561!d09W$SrI3z`6NAuOlWFrQ{kPR#HjE&_-Dvb#-*en^8q_j9ig@y1$WBY9&ay1 zjG4?~#@1Zaa0n-C*)M<`pVDST7&^8eCw4&Tr^WP27ZHgawUt)n=~=W19vdR&PB$Pd z5H1M#L2Ke5ew?zH-pxhV!WHg_W3N5gC{whsH+wOkI2&{uDR9;|mo7)S4p67wXy_4Y zHhO#mx=6L`S0i}<&QU6tm11iJMtFep4CqHam|eGSYyTL0gnu=ciN5P2V5%Jlj@Lu~+gX1?d8aven{wOvbIda{S^r9vW1`%Z9TlSXz zpZ78uqRRH>0OV-}Bm(v1R;dmVb0UOlJ6&yKH7ZvSVMzTx9$Be$#NZI$M*NmXjzg~N z{?Yhz_QLl!QDPR%|hm$w&ZET*^QBn|}RswZ;0a-ZBK!{pP17RA{8Qx?dfe zh?_>v->yrZ1fyWtF`F7I$K|t{V#(Va!pWqQsU##8bWmHV!LW4J-P6qL;p={k+Bijf zdRENWq4YuEwO^V7~c*q}RXR_3>}lmgr%d zYgK^H7x}MwiVJ@oBZg{$y;D!~da`|0S{T04KbpY=9fq7R-_=X-SmbSxMCZ`?=8IVI7M)? z?O!4q{_jfkpCue7`iaSsfQ)FCX?8`HQHQLx`&B?HdBUH49G2sfQS}AT5y+GK-(cdy zmkHs58^cNT&(a*Z4Li!>&V${$#bZ8qGaESX! zRj4Af3c1DK`Q;9vlG)H))A$Br(jPvp!CQ$lXsSl=10;@~RSgo)5aCc-Cz~HGpQZKk z`UwPvS{0n-Cy}b@lHVLa{AJwTUj_T2n9lpsqYRt0axZkf z#5Nzi)Odz@#KT;!v_WEGS24%hm8PUaZW zW=qUsE7e7fP;00-^G6p628F$#yLuYX=lN0k`JDiKOhU>8nv_k3d-fpU$x?~1# z^&+jlIGyY@yD?9KRV+WPo^z_U=pi(+b^bDWc+~5BQnajaOfB z()d+p9L&AdU8UF>QynUll&trfcpI75lpoJ8N^(s{J!C)=1eut)5go^`KhxNwq)mEP zhm~Gmrgm)C;+3t_6h=_qm-0NORl+~-bJpXI3T>R17KVR-60n`RnfG*GgUGT?DYyKz zq0B$xU6QCJUc-mQt>TK>85{ki!3iaF3It=zl8z>v!5AmA)%*5(_8ONmqH#1%T{upO z_@Rv4WUa%J8n!D5?&Ly@($Fghb=s`*(K0T>2I~;Y*k4-_^bzqGHsE?X~VfkyvY7dO_(*wn~jls<0Y1k*m@E#uekQzh>U%$ zT`xG9q>`fAzM_HF1mm3WE}i@6DAJW2#)@}&@hxf}bqnkD&oTKv7ZqpHd4N$XSIK}|Pr@Wb!y`(b<>T`1$^2Z5C zi44!tkB`=@-3DK={AxV55sH>6=VcZR?es{;i} z(Z^hUICE$OGLmgo0lL>{-{Hy+Am+$bxijc&A#4$vlKI_+d=C$(QmlMsB$Iu2+*B=0 zv}r6!zpeZ2B<~5>QggmMW_n=25dc5BC)c)L^aVfiI;u*vN3)Xvyo|X&cd5&K9%~yDk&wV#cd^%Vf^s@ zlVEj*b@PCC-m>ACZJkyM>(Mg@N>knz&I@Ytp9;U&9jXeSKI~Dph4J`^97cwFrVpr&H5%KVa4yN-yr;JgvoVa9R$THpB0KE zd00g@x)`rKNQA>tbb1g)pYlAXU#{!@L8oG-UN<2cVRY_&;iLBww2R}x5r;OZo3W8! 
z4%I|QN25)8-*_dJ9Io|{Z59W*_%E(>v+&7SWtS!MCr#}KHwjaK$P4p_wQX_91YWqVv-S=Iclqs+9u7uW?FAI%~*WM*eAAnM& z`K(3X;MT@Q3&&ToXN~4chki4qctPR5cTxDf18s5lDam85)n5nag|brp;fTfi)fpS> z*nY!AKc*5qH4-nw-8csNr{}#i#q)nD3&q;e8N>^1G@R+_1*ek8tJj-xbbjPd=hZaQ zVIJdLpR`-6k?K!}vHe2$t%M(NonN2qzbt$nwKm(}!~Y82KAWn^hc_R%P`8|#KK~9S zHqCGnh;oj}Hbl#aHa>)VQYvh_3x!x*UiHZiSV@YU4-4*+gc>_oFxIZKFh=qEvJY_+ zO`SxLEp*Aez_jQV4OjJxl8=_DHtTSh4Rn$yLxp{d$=De79M{naa6^;zWrYag$wNa4 z4E9E63K+f@wj`CkK9LEr`~C&L#(Uu5o0uV1ao;m9Iois}7dWXEdpbXURp@+)u6VRf z8ZA1_y|LWX4l9XRqpDi`@kIcsgXAZ5wXT$&JczoT>^3A6T;@D@?SNfBEA7L+cuSjs zkl}Ee_y$QKKND?G;d5<$h;O&-!=v!V7N^}$xha9OUphOwApG=l$X$r1=x|?h)k2wdVCyx7rgiE_6 zk7G*j-2=*9q%S|sz?`L24XwG+T=~Ke0ta9%0k`0V>=&D?DDTSLG2(&#NQi?2ufe z4JF;8FDMlHw8cu=cQe7mZH52TBKFmp$NhrVvtVJjuC|^<|E}i zJ)Q0vO=C&+MZ!^zImdD2<=IS77s$xHFr2Y6`%I3pQIi3iSooGzVmlQNsd1LD^9n%HR>9YSRVWSNQTa$V!#}Fw7 z+vD~1+S=1Q$c-)Ex!j}RZ~P=Sa49jFY*FJvj*;NTH&;BQVx*Vtcc9{4D+gDbSg!rf z*c)N|b=zsN+}6&!VSBk|!m_*mxH1#heJA3+PN0N;xM*6m(x+} zj=qBvZ`j#gEeL<#+N%T3IrWwP<@+bNPDk(giEElQqS0x8(wy>GO=VX>-iMx0TJ~t( zujb$)A9*o6Cf$0}Wvz5VENgD|R_Ggz-~6NAQ98c1y6bc$kVJi6pIl4U&J8rHU_rMr z0{czzSLZFBYi%2Yh3E%#3ih&BfowU(P~ zCn1_}G5CqzLqgeoH)6C;-yY{lde2&QjlvThD98G(QK3{FOY9M`SV};jRsI!)l(hSz z+i&YW%=A*-D}AaJA>(-I`>_#hGoCylJ37Wx^J6rRkJ{*^3|PD0e{p6dc2$TWzaMwf zK$+*ERJ~&b>%C9srIqcC`VX?EGa*fUzZa27pgZ2htO6X#e{lu>hL8Cr1bQ;lHR zX|^gF@XNNQ%jRcc5-i+m+oz1#%|OIbvoHPl-y7lGYx9;mzAWH^Q-AgKNs!ie8alMz z*qakE4P;0q5?O6|(adi-?Fnu_K>Lu@-w-@&uZEm7HXw43SEWMMVS4$nUtqIUvv4zB zgRw&03Ya&C*g9{UdQ#&rN(-lEXT8+KHUhu&pyygg9-kXnq)Uaz`AzTXd5`u#c*4E3 zy_h~ZamoMsIh0%{TWzEiroERgV^NazfrkXG(iwuQt2-ZRwk#`2DsDZEn7Ne`Fd*VDUb_I54j*(w17TL-a7DV?NI&>q$n?nAWFv_ zXR`}zcN6g;2J&MLKkzr7@l~hZmqj6kAn00nG{HDyA3LTtIxzK$=kgg_*tV((BTKGc zDmg7yZj|GV=P3)Hyd7`&HiX;AS00?2{e_j00*)A$&XN^ zQ03(=vKz?P8w$7<{qh?$+mVlxvMY(!8Oy6z!`ye^rugEgNFyHiQ=e)49LG>ogMnQH z!ZJ)s*JQj}8*(0#n_F!(WaJ(bCA+&dPJMCuROi7FeZs*f%5)r0n??47KR|NN_1dBP zLFfUg+)5X5+L8RB^)*Ng9DAE*^4)G?5#+mKSjSshCz}0sg%7Z2%;rXrQ`F&kk>>C9 z#0Zqe?i|{X(|HK^Dp7n|@~`)4(cdt5bcOakqb9di^BnjqfHbF8ndPw3unl(F*E9HUI) zLVH>@v0bQ~bC}u)>tdHz&r{5E>KksBLtksO!;!w_8gjCJB9`TOrme~xC|O?m&6fEO zg{HR)CL4a;TUwrTWZ5JnSY&M{8fBDsh4iET++)=)4``b|N`EZGTW42!Vp&8!$HvVf zhc51DEj`^HyR`Hvj`ULzwU>rOy0)WJ8_3gh@_^0jOSOmXL6qhOUV5Fx=j1vn-Ef?= zxt+b?SZNq}3vsC6M%ux`j}2qc+0om%?O5G~dENHdQ2!aR)#i}n#$?>Y<+K+I#Alm9 zV$FPf4~JmS)89%wYp9H6|HcCrGuaa7*9#69f9Wow-*xSqi9DW|1!-%Ax)o}$TG#H6l8#mlHN(j)1y zvA520xBJCJi%ZJ&O`BG7mU7nBgdZg6g8=huPi+}tvyLI-YUw>>?XX;f5(0!icq%VfD*e4^?K@T&pI2U z?%9%};%|}eV5L(MwF%9fEG|ADNA?>W#Cw!^JNSXz-5ppNlJ-Nx#TDR@RJna$hg9wW zXpj{qU{?ApHyi_5*P0pAQme?a8bp;&(-3sj$BFswiXFVCdjQqaTC}~+a8cA|1(;{F z68^+^eKf#di&e%c4?lVcK-;^un++RlM?bTf z_y4^A!&0f|wmI(`a7gX(cvhF}YEqRT5PGz!u`^jvakM4%*#b}jT4c_f}P0rdI z#nG0>Jx@c8abEPD%zmFb3EpDEvH4}dt6KP#@cISyWL!?Cm_J{~IxdMAe%M5FMJ!A6 zD7Z(4(&2Ri;EJ^QSW7N3C6!io9MlJsFk^w#!QnS%DVe*X=WCm-Eb5p)cO}ou@J22k z=W|O;v|Hb+^|vlRF)IP0rtEr8PbvMCJkS?oU)8q`5_-{#YvC%sauRl+<{^?BVok6y zml3U8c5IFp#U#$rd%^MGe#xhdg9pT%u>^Q_UEj5LN*B?(>E=H`1&6L*zashU#upWz z#eMiSO2MiA1lc-%|Jxyb-qb>wvvefh!Y{t(QP#O5OqSnA^R!uv*3Bz=h&taV*_V(H zDM$B6l3tccjhTNRi)RePa;AFFZKEbN-+oe)b{9{s-5W!1;|+dk=(sN8mngYku#sTmU)Iriz3ZgCSpY$P1;!vx0?pFr%H@0!)1m)yFH`A)a2oAH8o`*iX3V_U@P1l17iFl91t3e`#2Wxk4D%V zb!y{SO;+1C12w}NI`!_yN_;PRFAE%~hKvo$1LVyuu7~@F$I8z)YK5BJZmU~45gJ=z zrRl5teLbLPUy}0GA2{Gw0I7#jRR_d9+Z?Tcmr^wMh->4%#}G7r>Qyd9iNM0kfd{&E zSRf6Jk|%+s=|vS;J*3O2XM?)qV`uxyh7XNQLC?h5Jba52NTiwL&#Lh#C!Q?av*pYe zHo6$!nIefa7jN{94P@!v3u_g6j5)*ST{Hx(wveeopb5g;*DQN1bpdASgQ8Ce&jxQn zvR<2h)?OeF9(J&I#xZ=inpY&Tgwf)J_h-5$a91MzF7_#`({X8j{16W%7H~7YGYPeX 
zcAp#}vOJgO$MS#i=QyJ%FNmzcxV>5haKsIWl$^UJG@`>UaRhp`SyT9+~FB%We9f`^p1PngAmaxgK}VHkmM zJN-9{vpD%yeftv#8dIA(Z~OLPEP7Uo z>{nS7A?gtgECjpA$eF1Dkq0uk!1w)cf5HMtaN22N=Q-?AV|Z$|0tD>qZ`lJF3@4He zegMCLFO?dyc{CF9sLqQvDSNAQA;@X8uM!do$v-r z?bc+IFWumfhwUkE^}*pT*ga#bad%SQqwdqMKhgzb>OaJAb96ZLXLEjoJ}TWZQ{u?C zL-AO7fS7Hem7^gOYongC%H4w>cgHd=?B{)@kg!f^h9@P-R!=@-H`PHwk>{H8c{4{j z=?9|;4*_;Ab2MH@0GKN|lIt$zASR{5+d9(0s(ec1&d<-}^@N}=!c!Wurv93~o;QKj ze*P+H)*UV@Mk`BijT3Ssbe$x8vsyX)p8$=)9{V#bs|h#r^#zJl#&K25<VcBhcVaQYieacUT7k=*WpBE}?C5wmE5WRv*e^=-U5iGhliXJnn3U_=p4MI8 z6>+4KZigbDmz-_y@~58NZ6QO=VjoW;*aM%}J%a2WKKi^+QJoVY2KeFbJDs%m-NiQL zvQM2NuT)HKeqvpI{nnRN3e1SL0AI6t-Sl>VsTM0p?Vq&wJ^~2PT{C-Ges3@JH>-8? zUm-KP<7%gCzPZMj#R2oatX26Y9cX6%LT}@K>dl{TZa^WLmaB1RLOOx+U2ol0Y0{fu z|EHLA#UYPcjk`opZ%dPW9MHeEl@z>|da@(f&KDK;b}Cz6OFLSE0fq4!43-Bt~8;hJN0VjC!o!T4{aY>Q%hljsTOtwsmnD zfBi9n;|Gmz+u4E`)*jvY3W<(gFMjeNdy%Kwh`ccMag=rPDZgiH)#IXQ-jb@E-Omfc zA>!_Nz6J8rBZ9Dw>VmAM?1F#5HCX|E2!sDPr1B^jm89xfwy@JTd5MX9V8#uTviW}=Rf&D!;)`Zh*yt7dWi za3zL`PBb{#4$xEVg&EaYpJiyWQT?d;Lo1OjDV+S2TWLQ;v{|6w=EOalkARe z=Dz*KJ4je$c*8W*J@H-oT~>40#adky&wWy-;oumSf2Ki6C`K1i>)2`$C)Rsa2*xB& zrg?BAq}$8+sHGqQLBrq>MWSu6w-_OA(4>DoW3CrL{bfZs%*Uqg5ZrW1sPz}mXwO)$ zbdFPSgKKiK`s>Ym2>XE4anyz=qE)Zy36Q>?MHRiN6R(*``WQLSGL$3=lu4R3TY@d5 z9=E?M9?e_m7x#@Cr*_EbA{yL(H!pDW75t)!3`yKYR}wo~V$k$4Nh{v{Iv**Fh6c4z z+q94ezMmV=SGwL*hh$P72zYv{GXG$%S4W@G|0MRDB3E;L(2P|sT9)sX zT2^E%fl#D7#jRWGMfVA5i=Xpre7m<@LqMzbMqO&3OYc5=1GSpiz7FeG>1iH{Pp*cp zD^*xl$HmXBsYfymqxMYtASovw!zDan$J8R4gN7`9v*_(+wOz@}OuPg9sEeV~UM-g%`C7|J*$Mbv| z(+8Ih4I~(F6ml?jG~cgg7T&*EP)?M@G3&P4Ua_a-$yY4MR2=rI+np3| zk?uY&N(cRJ_m#%Do39n$sj7J(n+O)u^t!rbl2t_EqL$UvuT=NhoedzBJR|uRDE)uB zEg81fpYWB+Guk&eTzXGM^~Jtk$I%q9$^uGQ^Jiy}l{AZL;!7oeSK4)URochMKc>%(G|B;rmxIqp*FW<9TWWD~TObO+;gC)O}f zVgu)t~Wnhpmj% zWrH`voB4QJPNzCHb-3(#9$^5k40N7v!#O(rnc5dOiD&#N(Qj`u@NvH%T1l_6OYI+z zhOSmXI{65mi-&&=M41%*Fh_^ZTSeoB@X}T?nx2yxXSMWYwSXDBek2bjNb79ktn0!D zc3&GBU&eNlyhO1zH$2yn|D{1ep!4np9)1Ac4wo_~9Pdo_|45Uuh5x4U`7y8m{dVDU zb_rF38GY24ym>NzhT~Cdv|pN0T{QMdj^}lUcYbiq6rXuf$5@N>N!&ogHYPT}^5S-* z09HBMgwmw{2g25MGG15faxAMEf*5TFno2OC?lgcSO7%-=`)XR`z}COjx(zcO{Qhu! 
zx_O}qX&c8Acj?{v5j`^(*%@{!{O z^`329I#2JWJ*Br7r03?rTk8%OK5}mKOLe)#@++WVBodcXeCdKLUq<*n@pI+@x)_kY7q%Ti9v3L{o8Zw=#4pgXb+AEAKArvMxf`TR#5 z_PW{=h>PC;hXT8Q1$h5&Y$?|>s)LR&;8M_0Ge8{f|I&L7cZE~lWKl!gS09qSvV(FL z6t{9i4~1I0S_K&tkefTBgUOs(tskT3iuC=!DTD@3#Q2TaVE4w zBwMx4V0tWi;NzHKI-Rk|2t5M`R+NImVIs1nZwDS1caNvt6!z0;BY2=_|M&&v17ytTI=u5x7oiaGYpfdEdmDGM z+zzkKAhM}#xEqdFS5>9@CA;uQANJY2UuxI|dLKtWhZRCr z8)*AjPVw^d`~mv+s>B4G&!tiLscYI#?w??~ zv3C%BPP!8=FPk_=n>LaZ<+6j!&MlS8rEspnE4?KDyv%03skV;CFZ7l9#7LY&=>A4Q2P% zcT`d8K9~LljPGgZ!hh)P{I9GF|971)?S{#?et{Ll#82#jEybBC*p(V9Z+TQ4KZm&7 zjFLc`Aq1FHvgLw$(GLqFSn+uo7EK> zeMn6v?CF2t&a`^Q6q`PilYjfUu-U^G7LP^Th=-&Nh2Hs@*ZlDH?H+JU!{qMy0eYtmzet90N1E@yr>S# z&(}VBd0Yg$_5zfYqczb2AaO}+G!Y0TjEV&cN~j=L6g5{$Cc?<#HGr}IV6@Q4Q%&VJ zy|!K=q(-?C@9A@0rzs=9jr5v>nkE)Y^JfykRN(5OJcS&7azRpe;m@e>-Ec*=#Fx<{ zL;{4M5i8`4NULVy?&gGzHzVkwublkief7L$&Dz%76Y&XS7j<$hx|txd8~h z0HArXdxF=*zPyK}_AI!6_EC|WXaV7Eg-u7wt90lGNSJZWuo1NF@Gl-_SRVR8Jq*w{ zhTC=SjX_aGQc=Napu~9HJ+Ye*91|LKw*Y($_=EHXumiR=@H&brM>^{sMMW?tD<5e~0@2i7p=EM>FaS3>T%v#dQ zB3&x01mmWFWxG3OrzBla)jAGodjPO{c@h8bGh(!%HPj_1RTW-vze2dzn2YMHF6(IvmW$F|x!l6X9{}GZ=O2F(Gz$G<=-*mcS2_d?Do7Xx z`Dh0&zvw1zyiTdFNdq}dNCZqQ;fI-db=|E>LDOQYrH@LpK-bQ$fP(6D*Y03NtAO+i zAUtuN^FW-OO$k|D8L8ixQsVn}L21HOO4(YH1peab*`PliF!W)rBCmE^&0tZ*<7Yir zMs?>=+N9UizP2V2`63hh*moKSn{#5ZCkYM5=|Z^ETbIeWFU2Rki)azdG{KZWb4~0U zcMfWmF(&WG8@JMe?&770f2_9a3&8)wN4-WzPC0L2mJyp_TZ+?8ZMTdAXrlc{Jty{) zC;WzG*Zse@VrFMo>7drosFr7#T!i|g9TNuoW@+4RPLPJHL*t|k-l6U52Wf3^F6ViV zjgE6k*56PN=!;y_^7Yk%RZ zR7jTlXBDgC5q&N0>txdMuo-(B^k8(`79M>37w=v>PX4Rn11W!2-`9{vM6|Xa zpEXeZu#c`=yO~}-?#Q@jvikOd`tk1jPfDy&#>>f+Y(Ha^Sksl@y*(HLRGK$NL=(%* zajARY+|ew4p@^|u=i$~2EiF3~O5x0l=wT8gze7?Y@$l6hpx30I;?I?FIm$7sfBaB? zE8HxI9LMabg2)kvgh{^k%Uxi6x$>H_>6nh%I-Y7FhvXZf^z`XH%poe)+jCSK>jb1| z46gXlZAgV^-`6NhVN$}*tFT-2khB{xrTSCN1RCy*Kqn zP!whV%Ogj|m?&Q#%!s(>ATUO)2X?QnWqfTJCCv)}^@oF5PJen|IY{%mqqd*JKO?{; zh&)E%?XRc3gD%GZ#ly;}{*hq(aq~-BI41o?D{z5e?6VDYIA#vK_3zP>icDx*d_XOZ8KR&d^U zzHXpKf0IDkI^mDQ$eOp4^GFLTphq8zK|r^h8nM~HKgzuiUMyCeuF*xRqmAP)(%(&a zT=G|^xS}1bZA3>NaG(Cg!#yISVoGMaz~KWmL|wByO45#`S=CZ6a1$T~GVJgLG#-kT z6T*$R(vJdr8JKNAvEIZ%k`b{n`5G9nHd`nr*%JwgQdcFy{>57?X~n8q4-Hi8D4-)S z3*MNH3)~e1$NmFq1BteVNAUr5X%#rib{IaO3xULCLJudLaqKxvw@{OA$gLdc;Rvw% zwNJsyAB?qiYT=)P-midUf+||0vHBa%B$%-6?uOEyQsJ6EO7u%&pCZTq1f~ZIUuH@( zVr85gfrotaJpZ5XeP)ej9TUF6@APL6Hkov)o=rTuEB8Be{rMdW@?ULzOw+mS$AtoO z*WN${SvK=O*5OaaqKNi%H7o?}kt}t9ehjrYEc9fx`B50=P{{eJ$ECN%UteiMon{bM zXWR-codRxS1)s@S=oWm{TIGZ zJf$0{FU!m93Rhq2Ns)2Kb%|qJ_uVCc%XnLZho))yoUQ)i8M?3lc{hM7A!xD+I_O0$ zKpNMd^;Nt^AFT2WSphmW(T8mdTMqJe8-pY@AS!a#Pkp`Oj(7OwaOa10 zxWOl-oUP{2=o7zddePREM)jK{p-(F1#vtz5rN4OPQH#Fr2np!%ZV}KxVE~M_`?G!% zDJ{@{vHJGw&3+R2io~iV=_a|=8p4PLp<;d^K#x~!PZw77q>6G&Dw~4j6GiTXM^HOG zyclum&5K`~ocpPycWv;WhOz^|@qL_Yu*4__y^%#5ASu%7D*qR6qLqCAiU8#iCUi{a zX|0LOD>-3HFC>1&Ei2nc*^uy@27mXP{nkEcu5PV48~SAUlScC4xbIC_iQYy&ct=-z;vg0hh|3AScEn>E{8TYy>&GCDcpmc>FXR>MX zYMrxb*@^0quF?Mou4e!Il=jcD1|Q2z8=w0>i(Z#R(0?sdX$5=(JJ_O=Guf(%uP8?N zW3=Bb8+eU-XNj7qO=;jj5+mK~Z~0H;{XY;M0&pwvmJ=U`fQk8_lMi;>9hIs6;%SNt zSCfEY)iKh?%b=IaI2B!ZP}}P2&+ac>em@d<%gW$odCe_9k9c_*bG|FlG&WWY3i9^@ zOO0eQT#oV}^_CpvS;oQgwV3DEjqOOu_|*^VrD9WiBA6ZV%NN^W3E@+mmnr2osZ+Xw zT>OsnbdV~JgUK?iHwv`igdHTiZ(B)LXA&6vWFPH$f32^x@DH;Dj7_K28!NnCO5DCc zniM7}Xvvh3XFL8SGw4aE9H@}>M5&+kyAo?Me4Xr`r$?JVLj^J;^m&~6#Mw^Q$FAVa zJ$=}FE3xOj-<3F$ce%Iu_3||?nR4m4?g0+p=1a-Op5e6jlxIM55LKg4cP0UceP-xq zVbIVlpNXY&L|QePX4&@Huxg=9MBfeLouB`bP{XgEoF& zQEc9cwf_t`pN6IxgCl6npy7P-zCRJim^f&=q9li#zs}UzirpLleFk$^l@_$)EK=j{4QX)ZSa zeS{8Cw`~L|B=fjlPWO-=aIn^aKiI%W4#{vt5!W1c1EtWyz2s+7aw{p^vFF$PDQ9r? 
zNvJbyo==C!(VuyN{rK{jVFp;pQzIg0Xbis3(+*U>Ke#4}yzCi8H{2v2qtK8qSHzFU zUgp|GUkYW;$~j-l`#x;gn(b+Lvw8$s6gl}et31HZXCo*Qm5c4I)i6;7_40+BMD-&4 zq*{{Rf&QGE`r2TCiTs-F3lKm6op~DI2SyO=74&0Ev%+QUo{(;0Msx-d^)aSc4N^J< zykAyD6TX1-1KAZ2MA-sSN3+*MKv$@UEgq40mV3cpV#&7+@!EIC?)i^?H^hxVk($ve z?c&L`;W?{zSLv}N$`*>a;ew;Mx0+3f3hb*+2Z0F=sN|d0By`+O27tV7Y#_klh~A5H zZg-qBBoczu8iuTGf5m=S2SoxgSd(S0y}?qfR7_T~&L6>n46COqwsReS!svgBFb!l& zYAX=G7tsn<8IRw8h2;^vqJ{qeJ90Os@Xc!>+-53imaA!qkG2_zJLu*k3(1M ztbwFlsV_gWwsGSav`gpwgfE#bE690O`BcG!3F+TDLX8ns|D3Q)fq{k=zr6?9EuK}8gea2nl z>tdr$WlR0Z6hYWSCT>^iXEY9kPEFrDLFUHLb~T{=l8^7DIQo}6>~cHLi0p>+edISM z^;fdWy*oVqq_{C?HFZyG47fHi7U_fbLWOSq%-!*rK|fT<8x&ZS>hkp2FV2BQWCzw| zJCV{Vj@%+qabK+=!%{x(FnNOUgRa2>KGy;9qQL?AqLE#lEgDkM$+^(8tTR|WlpA2C zt{JqZX#pI-0pQk7D(q1MkP~@eDpY_Z6D+3;XJ4Bk8*S~kI^`Xk{mR~G5epnxX&T;+g&pZ&7;kkL+h05w-~6fW z59pk%^KL$!e}{4C2y9NA8wEV+c2yi-KQ0^XI^Bxo-EB=w>Wvut>ygwMfHHC_gYnNS+r6Mj&O)013ugJ&CFuG!_$MlhJ<@c`NV>e@UmSO74u|az4t;H$beB)hn9pvT|OE~rW#Rmc&6e%

      cib;cL@we=Jd)qd{?2gxMz)gjXo=P(KI%{MUr8A<^VJO1&l_W2U z$WkBYpSOeipL_ptooIdD!rRUsx522ZYH_pN-IaU)=vQ&W&G#0z@JMyyy(EVp{0ev; z9Izj0*)5~3$Gqx8qQcX9JY!W+{g2-3&`J$|cuyBQ?8>yj z{WCw-!l)P`etCG+Z&5g*jBldt6DSNmd$;p)d68bDL> zDVbp1ngC27d+cr5r#&vK$sIEp2eo#O21qwzd;E$gniMiul>D)2@K=6=DR&QV zGH5)}2)S*5hrCwC>V=5zBww%W9D8s+wlkr!)$@vnA%uYSdgghpmLWlW(Cf6CCC|^x zta`ujs5C`;)x5_!35L>iRTB;KMIpcaQ8PoZ;5!bGy9LM}zeKI+lv}qwM z>AA;qGoq+hgkq8?$SaIU$Dl|Uoqb@U%^y60?bK9Vf+E|z!A{g-ldM?gieZf6pQy=? zj}R3YGp`bS1ty(aY?%J5X-2^}zfV_a0-3*&DPMhEUk&4!#frJUsJkRFdO7A3tG@sb zrIt(NoOof80`9ogSFC}*CFRu5Q*-ZWQaD~rLhH*Y-OkDp^Ccxs!I*!fCuuMS$UXF@ zJEj)zMSK6^nKbitHQf5z@p8;i;im~r1fJ=3)N}#9WNQc7gD#%Q!8Refmcea}=uFD$ zWM*aEWR|`IPmxCY9Eeg$eJSLUzKxS*@T!f^+Q6kys_|48>=O)Cew9r$ zu>%PT-?G~jmLerRs8m*tBJpuC$yk$OXDL>3bcLOJ(bDe8S$Y1&Q@FJ`P~{XcBb%6O8Pd6_;rQf<7?Tz`;@w@hxmmw@D#-4s8P68aC4Nh8JA`En>qF24zYbHPiLQU=g&*01dd@q}@rISGqs2VZEtg>XifKi@ z`4ar38W6Gtlv)490!I=>8^u)nU#N}mGh4Kq}@X3@fi#0LOLM5BJ zuTI^mcTIGE!CP<=o?05S*jM>qIw=zJc!DIp`ApV8!!GE+A(@T8>#i%hqe8bu8H>+s zL3E=rLVWjES4N%lb2zq4WLU&*HyUfTdx+}{e0geliSJH%Cg^#0ztdnQ!*dK?IY{7%`55;aN$fz1v)D)sPB82vjseL8YAH1f0%+vnu-U!Xwqbbx% z)7RHG8BI!-j3F0^r`by4g~q!9BD(aJC}P9Ecp(Wh6ACAp1hwI&(psaG6PRKk58+Cu z7~?p@L2_HMGl8ZVe_l^o@HU^^H(EV7M@?$XO3GR07%N-Y?|ulaJW?NJUjeYmi`9jO8Nb_P50-n> z2E$Y9B=LUvaTyAvI`~SIVVpb%n>dwQlq7XiluS`ejvqZags+6x0pu+L0lM`U&*9SW z=i$aT20j$Y1w8H?cDLWz-LoWwA~s-mUtjsDs5Aail5TXoMcF#6qt;h8-6W?WWZ+Rp zNZW4ItO9cKm3^~|d9CUJZI9?d%^aq;jV}5qud+mr<=!oPnh{ki#61+*qcA!HBXg&1 zd|2)$&SI$|ITpqNrEVX>s557$HBs+Qp|*B@3X+5{mU}-agZE_cKH~ZJ$?470JIm5_(dy;@(c%MCg^Qp7tN`2b_VJQcBiSI5#ic5p&XVzj6q1^-$7wIf1j-ceJ z^OBO{-pPnwc~pz+i)`z;b6im_Lr7Zx`ASFqHs;ekEH_)d%q;2FA5)8GJFY76eCAs= zXMrDn`Q^RScf!vf4!SGF6%)OlU0{1bZurE7&a%Puf&EOA!unt{i}JIlRN-XNa~n~q zy4k+eNWndS#`=~Ru&`B>Qy&eU;DhrVoi5)D^Y!1MM;2cl6ctz)xie5wb#pb`dgJ(n zd{NiB3ru22mzsvjLx;l-Ctg-Ayxk~kM*AK>lZrT_ZcfphMFAG)&|o{NNl8*OOU%+;!#^#HPmmZ5g4fTfdL+r|1zCTrHZax zDcd}@_ijrXYU-u=)Z6J(M(B!(rW?}v| zv5G5Nr!S5KkEK?@n~m{~@8$Y2*^>QTn~A5B+=n_e+xyAWlV`SUGc47$cLwlpXU1fg z`mrIyWhP6UoX$yADHJGHUv3i8-u<0s!bATWRc#PY;M-jG`LInQf?@(cK$U6Blct?% zlaMe^fV*@;BYTo3wrA#^0HNLC-MV$9Y9&fbMJJL}u3x+Rsk$!#xP z`E{{;{w6^HB9Bs-{7+OXulxunNh#i9p0we4m`1ctcGQD1g}(c}-e(P$#oaBBXM{&1 zJ+u}iBkt7&2Fl};)HBtb`SQXe{9$LJE{6r|jp^ArD;Su@# znC4Qdk1l` zf$zf1)tPr1@aW~&vG4@rNKY27Uv0L(pXpZR|o|8%T-Bzh;9ua^Zm|kf9 z{FMt)TIOv+W^L`TfS;(C!fYjrllG?xrc#ZOTfb_PNn=BSe4`7*1Z#%z;Y~u={)!YR z{!MsGLHll#pw@V3>2Px+FHNmD(MIw!>fFg(XJUoN_~Bo#W~5W^={W*1==+z~C=(P} zu#>wwv4MhJNNc{Q9Avwy{b7KIVFsRQHH4|8WbWdL8_@#Mj3SH=oN#%5SN)`-iFe* zRXLL4=pjO<^BG1W3BE)f?wh5Jwj&w14W<-{-kV^2U~lLb zq{Oyb&Dlrqy-})_BbxDI!KOngr~dx;A1ry?m2|wn=nShrTD<+e4a#*h@CeUU^&jYw zTSy|SK0Acy?|~ki4YPE_?o%t+M6$AQNvvr%$bkrVleL45nJhegTN&QZM)(T*F_k(x zl9zj#t>Cfb3-9{VPkd-ay{UT}pg{4RV9gZJjXNWw#b_cBR|~iqzLjn~9gV=)Vf8B7 zll0s?wvigTHvt*X@uS>}3}%C!4$DF`Wc7RQk1PfF%;wv8%Z>cn5Ous`6wewXL3gFq zElr@x!`4KFkoQEB4m6Q->rIX&`tlXo6kBGia0QJ#s(c`BIxDfJDt4|U-!+;Pwy_UU z1>I>L3H?-gd-C#=r6L0f!+V8J@ZI73MSFo^Ckn%bs^GvWi|J3~o<~WU#><%6tvm7y zRm={L#%62Mo^H!kB~Ql@yNQ1n?477-VqB;EJ(V{-9%CFkXKEgK4*5bAJBM+4`S4l7 zFE+KiJ|4f?Y%ZnkJ4xUVKD}bbKQz2*N&49Ecv%HBc#{6nJ63yZqAWMC?2VXHbK`8; zUSj#^T0gnzqc&Krw^6RK*vP+%yl?+8o)~GjH&aZlZiSNN$2Ms+ zOB9kjZu&X<%FDd1B|@ioazAQhuXZ@@(b>e74u}Q49cLo3!NV&7rciOlTZ&UG5l|?d)lajF$zDDn-58@avDa z_MFoC8(ZCO$er(u0(}l*X%ED`(eRum%Y2MK+5N?Nf&7K+n0KF23#G&WIjUfU6cN99 z-rj9tD(DlloVW~=XEm8jHf~BKaGokA?S}7a5_Ct!vZ%8|WgG(0-P!%f*VNQIaugnc z2*2**W7!qp3o`iyC?gD}@BBM{a_bz_b4awH9^y}%@Ji%`{@fG7qeDG#y8X$%PQ0Dt z^ZM;OZfEc*W09@~C`^**TZBkWk(Z1kv;W=;@Yj6^tFHvc-N%n-b&*FQabR6d`!vJ* z#w68|0v1xzyfq5EGi;th1Io?HU~~TrZKx#)7@H(slS@d!!jci;8D)q 
zX+u7Fn%OiA-*ZyGOBHrdTf8*$#y4jE%mbQ zIwl}}aJe|RKol4pvv#E`l%;>tozQYv$2T2AnCqic6Cd7QhPl(E2G%l?6lYN!A9$t1 z(<5%x5>$=chu!WLxb4Caa&|5}g3v&CzqR6KRY|}r!u}}Z(`e8rDk*UN^LyAEE6#B7 zB~OmC;-t|B0RH_z;t6{SiQG>JrG6JjK1y))TnOnaK07O`Qxa6l9kgX;G6xTy(HtA$ zl5&+V!dVi`@ZeSzL8qIK0hO3VzgW69f1QqClw2h22DQ&GnzH)}k+Z-ozv+l}CA znKSNOM(5@A{x(bJtG8|5FUThqF5F(s{la^{?^rL1m1ko2^NkgoA@ZzjwNscZ=LCJj>yi#I6P_sPsuZQDOP`toBM}Zef*0>Iua7|$Wu#Xd5fTmH_7L9 zT3sjp`Uj$=+Dg(NkkfUGyl*xBNv4+C8@+b5N7z1*1GVtyZ%3XThqjP;I>XDH-d&ne zFQziRqAbqFC~s(@;+7@7q8FH(Vlo5YXWqGDY_xr;%fH0A`mtUO8+Be{*tizB{+mq} zDLXs9!ifx_b4s9G9*F1XuA|vzsyVIAc0~7Eray}3kdDu#zoq-G(y)}dA$XJWPWp{! z8N)96h5ylVhUGg5kdh-X&*ZB$@JiK^RT zkHMTN;asG{-k9sSbK>w7ukk5>&FUAWoB9|Tp) zcpb-72j|$Ewf7#aI+1B5mc%1>tABr2tg45*n*4G<*Jw%SaISu$e5`VKEbO%$$=%!{!0 z%AeQgm-e;ZV(^f3HSk7*oJp!>$%ap%85a2#Hf17emkM^ssA0dT>ZaAd3 zHdYc|y{E~yS(=G#thfX{FBO6Voe25O&pCt{DRegpSN`P`(R4mTZ(;Z6gn)=#)3bJ=|5r6ZQw_}=eZjCFkwQ$hv@udAL_Dw5{&D74qE~H< ziOne*Q`OTA`n?S{(%&L7I>QE3xV&UHHDH199geqmddR)kcGPlmdi^PT^#33%j$zI`2y7>uTw?Cbu zH0m&kcl@DVs`VY9xYFm^Dq7ERy}b=yPv4n%wHkd+%Fm+z`O-8k^JYy#HxjFX)UsuI z(lBZE)g~n1Mg3mp8So&h=&VQIbB$XF-1gBewL?F1U000HJT;+M@QG1SMl33L9+_#S z&F{w493|JrvHDY2q_OlFx{XqXKeJ7jNkp-gCkU`^Zj2cGMT$BgEhYb4q;7Dr)OLV@-O5*3mDMz?k%|zb!amic>eCxoOymy?Z_T21Ydp@6jIA)(0_;;PGq{y>Al)>&# z3mx)63>&?^ao!7$O-@lA%W6b@dKy5bWRAaCSUxJH`sgq}>eJGhaD{Yag<;J?+2<2o ze1_Q~&;5?d9q-U!E^OU{HsAUk0<$#EP5q2#SY?IVG4E>g-bbwKyR%0)QY5+e_uLg7 zHp5hDlp}LX5-0nAV!$uN_JjN2dH#%^CW_f{dJZX`o*WFMgWdk0u{Lq^loHd*klmZ@T@N(jn)Oc(pQB^TIq{IR6;0L{#V19@c zjL`MHc`LJ4kGKG@Gx!XMz?XSKl6(DbBZ^>vHP^^YF-%@wK}=Zf)&VT&NNCq>w@YJ7 zR*_)Th_2Pe6lHrsEs&?N;r!!~S3~Ylxo?nfkoKeCFAifH+_uH8VE%Bzbj5(HQf&-oSYgLookH8zRGJt z-c{_#XNb}ja$A&85@hJj&-EiiEZ^qDvY;`FON`W^=NE8_B*pJ1tA{TtSQ#D?=6*1h zT-1GIi_o0ELCnsWkdL`VtGBDrd_c#CjuHu`rn*9SRLOEp-FHmC%sR=Ou_xoFAahgZ zc47_d7hWs692mu3=_;uCxZ3%K8RyDu>!o#YD4EjwoF? 
z2kPx34T00O760+E-?O(Yx}AWV2Ee6`vtkJO?rNpIODq6Ro7A7)HwamcZJ{;E5+lgh zdY=ZlH|q9bUrE_}LMPyVB)zE{fc=b7u|#oquDmVVWh?4`dPlW8#kLLZpJPdqYxvDm z_+84^28wKVE2t=-(Jf7l32vv04MOy+e^@YTimhC;-ZjymX#E8+ELRDd_%@b#cb%Rm z+V?*_8LN5-PV<-dHM#kLwO&h;-^0<{WKX_rd`h(b<;H%sW5Zn7`uFy9)=a^rT4b|F z{?feSxLS_DcQ9AB@z$+>LG=ImR?*hBS^PW`UuuU#QqyPRG0tt+^&g@1(tanW-V;+@aJ5NboN&jQ#vx<8%L8<&ZV`M-ieb} zebE4Gk|~p*e<>c>J9R!Rp&WR`n{(@(#agflBLcSIv7Qmq3AIz_oLiLE=E&$M;&(PP zl#a-175#@NPk@K9bTE-(M{m-;COhYVZ$bnxbvPpW!Z7CpQY+71njsd{UQN@-IKO-D zX8zMEMYc%M82xd*R6NBIKQ1J5RE+)QXyZ+TKZ}W7`N-9uJhmVFHw!mQr&!^u=xZU| zj0qDaHlIDEcxBL~aLLF^BN1lPBauqv{^EONi%+Ztj1OMr&pyl47BFn* z?5ZgClT($$vy%Rr(|>XJ)VU}!9~^o{J8Q4k32>%}R^0=tN*NP!z^yl5Qf z%sO@{lXcfrs(8gF{~rN}aqK+S|0Y&aeN9UKfqdcuG49!d>Sb8kHf^ZXpeV0sEM#xma|V;>b+KW$C z`^k4TsJFP6Rn3`6qKiBV+aNW+4^TphvO&R3(y-Qpcju!^lv5gNYZknLS$xYCA;9A{ zfT@;${c$+%vXfOduDmAnG_ab4DBC4(5r9EU-a(1RvH~P0jgUOiC*|7LZ86j{P2esN zf&Wx@_G8nHEZtf6F3f6g3kXT|?W(8siJ_aNR=LGy!$)pCo$ur12w0f#YA9sxFP_jS z*AXk}U{vzj5tiHU9uJA8DCi*ha&TkxFc<0tU(&}|YbKc1Hh#o^Bl>Ex04_2Hb!gVt zH=jOU23kBeFoLnWBx(M?-6f zKY4J{6luwBT9c1I4vs8Ck$X{2s^l(pcb01SC@^Z{wA;2#mOdAKT6C$tC<}VSyjF-P z5+(eef0BbB6yzz~4Tj}f_S=2(ENkBwMP>|CS$=U10ml~FJdn13M>aTYtwdJpUe!j3 z1kF_Dg_ui_?Q0@5tp%Sx^rF;*3&RCAfwB7f|ZxgP#6?7at6Q*XZ}3ZjB^ zlnw#{Hb9z&4iS(pO({x?QltrpfV7Z+NN*wnQbUzaM0yRq_ue5?=?PK-BysltoB8g& zbLO6zbJjWcn>p(q)?#IC$j;tu_Wr%^?|Gl1#yDn_AQ^|dh8O7f@n}yK;O-uIC;qFn zbgs27zK4{L95gu%y@$P;p%#BVHTC^;zMH!J!=7q76FtQ(h{_CXFkQ5V(TL^gIKXB$e!GWI0iCmC5<8nj zpMoZGST~6#F<7J{=5#kbVI5rYmRE{QDh>rZ{)8Pi*D~NA>Z#YI0xt2yvsPyvDS!%1 zc{D(^+=(jM`z^Hp{M`_PB+hsj4h*NZV6vGjl0FXmJ?xS*^Z0l=)CS5_}K`k@iBJoMRWW`Bau0DTh=8 zGrp`<&M&rG^(h_+ESVhtz{^JD6RCWDxsLb$>e_=06Cj12=fQ|v6AgHwS%<9?eP7?h zV%Pj*p>|S8cp{=oZSRWb4W;y3(Q}?y6`Qvl36Vs5e>GUCriwZl0FU1 zU)Z;86PRDC#-r|Mi{BJx31p}4Vj7L(N1)y2XWP7{qrjC1UWcUcW5;etzf zI-+JjjMUQZOUqUPI*|WrAojOe2-5)ysB~`*!j_nw`4XRw%{?8NOBi=y$_{iPJY-$C zFJE;@UVZ1qRJ+nw?`>J$t+B65LWVQThK;~tHI-LNM`EB3rYYW|$Sf7MKgp65?EBe& z**9JAgrgZ;me?{xp?JA&Jq3r*kG%BTqKrQAplX3ayD8w5;J7U>s5a^@_-L#`>)xXa zb>glBwpbrIqDLaQhJ{+`$mmBA+=(c%nd~nbJCat1^=Ow}!4-m6TLE6Xk&FB1Tge$7 zLC|W9s{dXg3U*&(9u?pP^#H1S@4r};ODPkb^nHig#AEY{T~F3+9q|RG;@#q-ofo_A zIKIlNPfqGtdZB?MhL1`71O^z5wEyN;`IlPfrq6^u5Bb;C+2Rq?;dY&r2DUU!P0jv( zo`M$ENAWq`rJ}8ca?}k--lhnuq*kC6PIofn&n}$!i=(%d3c%a{Gt;#H0Kb!GNJXod zMlp|#YCc3rgS?w`|lkvf|O7)>Ze-fPayYI3|0nPz<8G<|$uKax2B0QK+Z z|I?xV-(j1SXUo`qq>e~RoCe1~W6H}e-P7cLi|A$_qeL1B4#3`cm}d>100i=3I>i4l zQ6hSw9IKZu)NtrV;Z^Rh4B^BO$Fpfd5u%@v1$AFcT!ZOnAaw;3weG`C zc1HLjs**dk2r@1Y7lqKp^W~bvjX0+- zO{hgN)8S&x{GR`1ATbt56q47EhXDklhTNU-`dOUhu#DeMl}GK7~k+3x$tK z>8{JYX$)>atR}nQv{w}Wkmzi~?=w!l0jR;V_knH=_QsHS&4>Hzk-iG78`^Ttc5zz) z#^qqYRj3JC3-w=2jl$!i|CTzoeYJU3>7 zROurgG1h0VTNlq#?%=PjKG15mxc3t)k@Q=>+)}-#weQBeU$E@V-FLpmw^U@lu1eMP z*mVHiami1-EOoFv%=HkI$SYxhB6=WO`I;B zwIq?>VcFW`Ie*ve!L@9>t9o@Yy&nb)7lJavyh`iX4(M6lvzkE^KrgW?5f2h-g~ z#PRsKki;#6sPlyUtnfqH&;LRh*8lc8`Ty_p6v!fXUAC*XnAatA1Blp?8hEr(yasZq zBf_)*xEr@or`cf zXEpfQ8}MxBQl~e+w|Q=SME#eSVLklE{O4080WFZf=^RZLLPs_Dh(ttQFJ;Quti2j}H+L)QaID}=foNEy%CHgZzPz>@bAynM zvW~e4b*`zN5{W+&p$H+Y<0HcE8cNwzDxVHHZLV2(TS_42QDj~mZQQhSB38=>6fYk{ zC%sD1XGwSi_FkwfFI)e1sO>%pd>M_roN51T+8n{@+L@`(pvq*ZN<#9EWb66VkfrOh z{Nw4PEqy&;?og`@w09YFJXl9t^KWo>9>8BfvDBvusyTPuS56$}{u1zFVtoRaG^hKz zL_4oq)8^hMV26L5sBtZ3)&eZs43pQ@X2t;jPl@8C(IyxnI`_8%`4&({<@Wcd{*dr8 z6DyD4dsn#A(7K$;1OgWpE~^X?g>{U&;6#gncO7%&5&+$q30Dm}Y84WvP()o44#2t= zfUU&&C@@(=D17^;Al*M|ThLfZYO+|DAOZNPC^5yG$S6Ba6sL@O4H4x8GV8=k8+&{n zu7l@p5O1R@5Y3Hl8N}i0XMUNefKYroAYJ^ed_n+r#3EawY`i39<0o}5e$Ge%e#(HV z5nG4;g}Qr&h}S3bPvxxvy@a=#-al8t{}(?eJp(*AuBhXRd^7Qa!^CS|))jRJ?WSvC 
znW02%@m?BhMi)I28!3E4dmBP;;ApNquL2m<9o(b1jojRdto_~DjcXA!sQ z+00od;GyzV#9#P*-DcSHEe~75+W|lUNsGeK0E$VvFgWJU5THQYISLPtIfh^PL!uB1 znFSEeG(;+({=D3byR2~1xMBpT!-~K}<)T=4i#Xi2hDgj?VFNEVQ~^c7Nd*?BuWii= zK06A$P(uAKUuFd_r+~{NAv?6?k{73i;3L)q07(c%VRR55|BB&&fa!`wLc&(d6SSps zo8B1JMtQ16vc7GL{jugQueb(_(SPz&h$F(H$FKEeko7_TP&0B`7maL&L~z}jY9PgW zwrv0(5~VVkK5AiR5*#g(;H9MeNF_~&oAs{q?(zda@(WvH4~!ZEY10-zSO&L+-9g3a zKWRuFK#Dk923k3_Yx9RVDv(KxMU%53xe3`va3)v}ys{G6#Z71OU_Y_ce5jzAn^&5B zjS*F8+cMqN|2~JTLMD}yL1!H9hy<$Zr`kjnoN|_S?u?m?X2jcSn81@)1O)Li&gwZK z*FEaCoW6&{spK_Zx+Fio{=ItvPHhiml}gLX$Vg&Rm@pxoF=2~Wa1$#uvgqpFbxd8G zVs%|DgPPC>=A~KBLxoG!g*Tl2*HYCrxF3p=1RRL~YXZ2to*f6F>AY}8*zh8+pZptv zu}`(^!FkeM3lGJ-;y7vMa<=?yT96YWLGa5>Yxm^m=i1| z#iRzM^ruNXd>;L5P`yKm;LX*xuWkU+?6)@`VcoYh`cjf19~E>6$$v;*A;_RY!DAip z>)9b4TmoL|zTUBS#+=xO@IRx*fG`6|6kCrWC;P>lCG>&6lh}zajs54_*05zlRO(S54&9!)C$yX z(*HI_l5e=|O<*qCEv{n8gW-2S&LN*rwCd5~i;Wd5@k z*`An40)Uk#jC|sZgCw{%>Gy!bK8~6Kr`H1bVC=l-w0-3FwT`=T9Ow(7e*Y%PB<>8` zQjIsoY5)Ss8S?c23qy+(Y+5zm7z-?E<$qc|d~2L!yfKYj(V9Tmsx8N40P~E1g7>Cc zDGQfOeYJwC6SOWsMi#Yp`(okymB?WB{W1*7I8cB5`|sVi8-lf-Y+BZ+)$b-R5VCY~6|bz)v{4^}a4KkmiAIAt zp^=KI=C^Vt`Np6uJ24JylcxInCC=*myjigQl?FEcwY7``hzdn%UVKEJwdaozP!oxK{coZ`fb75*aD|JZL>W~GL1mQ1(Ng+@&xx!8PdCTNxH z^+r=fBy3NEjpPd6A&Qu2pWk(&t$8IuZq0>_?a{qxad1*hjHEG)3aiqdob$$BM4~H+ zmg0;g412Tdv-33!$x1)#<6jqrOSIp69l0raLdWU2WsTn&*a4 zKBu~YaFXLR<@O)){krS5pVvAmEKYjNqSrRynzsmYpmOo(McusN~Ft zcA=+2iPyG2z}P>Yym(3RUgNOqmESfU3H1IrQl!{cJt;fEfNff|%~PlH;^oRvV4ZUr z8l9J0cgwOZv}K#3c!(5AKXN4NE2xnb+2hds#gc@udp=ouKlTF@yaKGKZH%>Q705xv z+xR{mhUm%If_`rlV8$$}Ml;4A?3ri2I2ENNA*=Qwv}0{QJu1;vmC@`>?fiw}0egL5 z>M{ZB7LS{PAEorETQ9I@K44wtnot;^-Nm;N(&Qe{{;Z2etkX$oH1pZ<)7SFWnL|!Y z#`zk*y%dyAOV+tXb{SFPmxe(sHz-`gN@N#0U1BR;G|&|i*n9Q3OI{k8lRJ8JLO2&V z;Cw|14fO;I6PfTLlX4tRGCLCe#dqWEJjevzk?uUj^8S;T{4e=Hsqu8Ux96U%IA;)RLv3yVrZ|v4#93z4D`a_nGLzb*rCM

      h<~q@k=_E5%$I>eWk}$0@ac01@aBXOs ziEI-`7Ci=P*R}PlSG|t&J#5vzgG?c=EI*6M`9-3a?OAu5$eel0OIsqyc2-;e{~AjXkLy^zX##d)dyPe{~Kp z!*|x>U!cZ7+$g!#7 zm3c9RvDjMDO{N&9h&)ku~XCz$s?2^5TmNP%-b z;pD{Y=8jza!P-GjN^?10Nddu`8VYxzY-1jcT2Fno49pL!lZ|c-2kn?c+G2tx6>g5R zhU)o!D{Wg3NN*JQs(*J-2C=(4GFRDmH|^fqn_KBVr$}L4if!WM=)DJgA8M1edjiV7 z+FqtM-nl#N#72>P+r?6_a}z^TRj|V6>3b={-Imq*in_djnXutd&jIT%E?(dJ4O-&r zxBWkn9G|63VM`r)*1rsr*Id>dDR}2C8@hRnoQBgwZw`An$!OhM4pq5xt^ILiV0!%7 z-Qu>JRb_rI^sOp#`8BMpMm(;NTOEbN-y3EQ)*y^cs!jkTMeoP+_lCg8dXXPaxic$V z#Qw2Ix>Mt3W06in^@{s|bFy4pmRT8CyK;LCaiB-6V_i`ocRP8GMg;9D+_gJbXut|K z=<(Ie(utRPKY*y`xVJM zX-=$BDM%$WE_3%VcxxKRrm9Dn^D;kp0P2eRRJ`%Z+5A` zmGm|$#4!+94pUo6Wmk)VUEg9Up!R%LnX}88A(R`UXM(z3ax}Mh=TNZVW#Z;Dw@y*V zF-eUW9!tv4b? zP90k-m;5Emr?($ci7cvL1hjgB&UG<^g7@XFE$jPQQpm#yfg~`P^vRR^j_Xz$Pc&PJZ zjO;;!1_M7LUI~at8usRiISo?&FrXEtZ@X6A1)mAGt_-}8ka+LR`zv2cx>$q@Hgu4U zlbKwk664Qb`(jACuCW6toLvmGFNxiZ=b@h#;K;SzZekm&GzERMRt`*MB(J(}(hBBn zVuTGP7oQrs4*gYe=?vR-^Qqmo{A~-y;B!`1GyW&Lq|!hPcrk2n ziKf&Ndviawa1O{ajbnSpxap8ydH$Lyx1@r1o_Cjwv=l9@S?Y4Q z;4yowDEZNDFtK=f%;98nTUbuiL>MLAqNOyq$$8^m#mp*Pf})My zC;yQthpDOi$47ca!=(OTO2@|v4iin(ioTasRu&}aWDP#+HUf_tBCza*T|#xjuBBTil9r}cpORt~1;_GA{0LO0iu9GiOe%UL&1AVo31iw;>!ExU^JTLN z$6BAc)6VHyw4d#m(Rka0_LeOj-Zk$6-LPV}CHGT>*({#`-;kxKB*Hc#yqS5206RWX`ziCz^;{(ZpU#XM(GTaj+??7{l=_W|WGCE_Mv86q{5 zKJLN?u-tk3DdEnm-cO|Ij{DwCly+0h@3TyK#G}R7jWckbtqCUDd;LA^_db`3`dcym z3LoJY$~YuvOmEff+C+pYF!`lCBX*D7I>e8*R9!KPm&aPzth)H0*N)mSm?R{9jEBd) zW7~CuWS5|SaT~CIY8ErH>J6M)ta5&|pm2T(d7aa>H&7|StZP*=2=(TGWoe0uV$xg7 z=FEEU7!*=GqmWb#u^*+DFFtHn%TyJB$HOMJd?b=`%p>M}y{7%uD99l`csp!ymJ6HX z813c6Q|+_f&L-uE_)Oi2bfPj=eZwv*9A~wq^mhlinw{?-ry%Lr=qa(n7`Vf(BUnGY z*Ypitg?3zEv@R#*xiH%w5^q3~D((7SQ__~Nh>-a5=hwbV*D{S>;rhs;X&u8c5}CJA zQIF=RUcSsp*mz|1g)$YRSY*L7m4z8Zg=^Q{iRu)32SkI5M{5vmITlu9x-a<$cZPU|@i)HP zGj|x>)-Y@`q9h}Ye5c~cA|uUi@cD7=hi9sybeglt9aL*v3h{7(*ihX&U#%sex)|9u z_NvsxI*ama_A{j%o42ATodqK?)y)&-xUOP{h_^Hqlif^J-yg;gF`XL77{;gVEVf-& z9n4iXBttgx8yS68w~b=#oo6Pw1ZN*zdxFzlm25t3m|#Dra(U=RcKZA$oyi%w9w8N! z=GHM;#8f1}HvIa7RjTjQo1dd&O>p=;Yp*)^j;YvBezyU{#vW{#^1q(tOJ6 zLlnfJ$vV1_y^=~Qka(y6&F)WYH@2SPx=uVsKz`i)m0^~af$wwdNTm-TuMa|KYLx?r z**{`mCzE7cHghKncYO?#!`(P>PvE9^C^vijLD^e8{}zQ{wBR68fx^NJ*QzbRA;(7! 
[GIT binary patch literal data omitted]
zxbVA4M$O&wLft0t7~&e<6)#G;_<-aRm|;Xu$GaYhUY4o|6Z6?HAOhqA>(SGn+x2UI<23>Ho0z-f>NC zUD_~;iWMv%Qlp~MLFph6K>+~)X+~*LP^ySXZ;6QXE=@p)(tD(X0-<-LgVK8oEtHUu za=z_6bDr;<^UV9bGryVl`OQ0@e@Ke3lY8%buf5i~u63=+S-2n!8_!j97WHY{6*F^p zsORdI__zcj*|<}X2zbBZ@Sup+CkfIjdXFa~yAXp1j~)Vzv}K=`rcsc$G5pBWz=(OO zuKX|JgFiL)1J38K*Z-AsfVk)Rq?&|mf}HFAuvQZ*Oxp6qP+~EUu(d;oULj;PX&yijH)K&HLAR@3 z3?$(nl-!xTbFVLxP zOEhnewu%$BXG4z?%$ixhd6!#%AMarGXuhxt4VoHWl|$H52S(`d@J6S>R3nu@>;(Zl&z^ZsnmE1X$j>I{d|x!1Jt+^mZ4mA9ml87l#SN3r8I;=?j;iopu^l)w=6o?sf&EPD z3`VC-)O98sixUL+Bav*v(jS}1qQ9rI3gCo6;di682+@kBzL}f+l??m6y-y-gv#WWC zz1!PnF7-PfBVI{e{wm7;CHWev7}bUs`DCIu?+8@L;LPMxluzfM*iCIf#%8nw*6ikF zC7-Z&dhAPM$wpg<9(2bIqSrlfEemaAoxESDWtYh%;O`&H`X@Pre-9Y`t#DTAE@Cl&f$L+8rsj!U^!nm z+XJQS!mnfD$7{F<2<33S&H0tXE4cdppV=j@F-$(9N*n%h9&yDiOGz(Lw-jE%TS^&bnXDA+2?{erK*y;5cOG~sJ^>P)+J#ZkW;G6Vz ztK#IAnLRSs{vL()1!xWflEaYAaO~vHsY+lOgsDuV+e#1|)8cP5x3Hij7v%XN-))HD zO&bK2-HrVtV?aU`fKZ@}7&44mVgrCC0#4?~`vpN~_`9jz07e59vCz-M+mPVNKMu=? ztcRkJJravlGKK(KJH?1>grO+3Xz~dv4Z!{vpv;&*j>rR$9i$0RYPIdA5Rsc8V+Gy` zKtuQG-~?te4YdK@i4gwdklKL1PX(-jTcaq_7U%*DFA)Q8oiRb~&|@}*2?r>CnD=Bb z;2%d#{p(bv{$ZwQNvDuE2w=oN4Xg5}i9_`Cs*si3K=FYLz;!TFba+;(8P4NR!-oHP z-YB0^!U;j!;L1Rj!y8%4@M}H(zRJJW}+&3OVgPsC*D%UcZ-EPLSp7%*5Sg89b_1KB1 zi!pp^ITI9VmxCml`BmZ6f9i~2lnnuDLKV{b*K^!KBEydo1l{M0@8t~2tcH#}{Z zh$6XqUM76SK^v1KMdB zcib{1XfyoIUjwsk2)n$~3D@6fzMe-)K&Oo~oDd>jBW>_?yv*z)d+DPqUHuohkF4->L9u+!L*?ZmPRD(6IagZ*I^6_RWfO?O zcnL}p^OROYcR9iDI#m?Mqn@OxOiVsAbo2h?PWP2cD(|tm*A*msjNNo+?#fQ|LiIw; zd@lmsNs#SeUgNf0#Q@$Z*p6N-3;*tD(-ql(84;f5Zi2F6F$`p;JhpL*2GWRYr)I$N zxqvj+q}TLWbG%#wXCfR-rs2M*!u&o@SxTAb?PX|*n+Rnuqc}C2Pcrz@?BgLCdoZq>?TA`<6Fqv z1Np#@>wPiUVN3;dxZkvQ+ve zpB%m}Kb2B*2t_@vN&=m%1+eFkm8Pix}X`( z1!3a84@m~ICAxzMBIEq2$1O;*9ebcA{Pb-qB?-@-QpE*g>XF5rT2&vm@n7fu=#06D*NW!*1!v zyngmf_O!y(8yO#XJiG{j7GBLHJtbt~?FXM)dUDV92|4O=aRdvos^2@m-nL+PV`b~I z`*DgC`dc`G$ECqb4O^GM+@f{#d^wJ}VYDJ$w24}1o2>LK!sXhr`s=H-(&+aytNE}7 znMjizCnB)6qJwRPh;L~AAE+WnScK;xLSY8^B|%{t8MyyqfI3nFs|eJ*BWb^IGobqA zyW9iGsdUxu6ogRk9*DJc{+@3BX&%wB7@IavjIZ-Dm?d#35aeEJpX{qiN52DvA}a~9 zWM2xEntz7;5kv=i{>XvgRD8%zc!|6h56ZhIHy0!kJcXxxq(+G33XxCjH(u zf!H9QYat2G92j_H7E;r%B&Z!Tiz(}`YFO6py=8>hUe;(!OL<*UDK>9Bt|1b1a9V3b zVj&)=T<3_`t7>Ug--_4}aPOTDH?Y~wnCp4q)Be~gYZ<_n3r4SG?(?6_^uK7v*dEQp zC3f&!r+l=Ve_v9eRo=&wL3CO5uN>6!%Mz6iR&u}h;+gQ`lhGb}U3r7$Le|~7CawGB z6Jqp=v_1)HJgMn#ec3J9hT-_IvGx_-F{KYDxmu-$}lo1MfB@})!|?S$d8*?zu{7DqhN8tP2sA?L}2rmyim; z8$e*Vwrl^BInR3sv#j%r)vN68K7|sNb9W=rn7B{YT9i?%@HxNqw(s^p3_J^P6NCV`YFz{o#UjoP7%m1^I}k4Cu@sx#8I1C@FSTRa zqGi!;?>8FYiSbX!0vMim2i%GOAly9g0(dq$FX~^^CGezVPMF;Te!>4YOh*PBwl{CP zzG`pEVfHQ$uu=Cd>8Ks>j8dWv5FDdTbP1NdYDei{^8<8+bifaOhGq*uk=SeC33B(6 z{>E28iGf+P^!oUFQUJDrs^z8jZtB*FI5kIEqxKq~Akz^?VC&thEzl8pT2~3pieg~g zJk_ui{uP2Mb>F6AS5|<6SQqZxa|fyQIpVE<0fs`*A)b#EUS8>JYvbc{eY1Wy zdqW&lm5dDUyehQjJxWWLy{vUD5@FsXPUlhluz93N?xXEX#L}W1zdr^C*x-&H-9=;H zn&J1JzOpxhwoGz%ObH)?8;``0bOmfLJifc2q}v19Rr4TAg?GCYjdfOB&bD{H%)3=5 zy}O4F=yn+Pjr-DVA=Ex@;jnkML8lB8zIHy0Rf4nnE6(76D&W$$%2(%ZB~H6EcERi1 zqv;E-l$f5W>iYIw;Digh4|A08RTF0(L;uN=IN|-??2yICmm>Kqe%z{Z_lgCI^q$Q* z)p7#a8v8mEzu20L?!?4Tb8^iKNy)rCIr(fW!Sg?tu%xp5DjuMfDLQ3ueT8RLmF}4~ zIWwljv70CAWGj_B-yG8&4pQUEePD1oiW!%dChM4V!duTu56{Gp0##KMrVCZBb`<}_pMs7 zL7`|E&-2%JS9sNC6Q*mpu#=l%uF(5~1Fwfx(kwEC=ynV4wPUVwOC|IgeQSv#YY?_G zU9XwKODP$&3F>zpjjPqVb_;;;SuIPI>@eo*o&3TnWNey%?O-{$B zzYRiou<&@~N%@N=uDA7kRgVlQ=8SrqGCOOzQ_gO#S6iVUHn}~Wj@5r4Jw+Z1A|sKR=fjZKM%tUA}2GdS0IF1ufWivNZwrI%og@uol9NYAmrsmG1UWQs;biO=?tNUatM2HyYP$WprQO$8@|_)*6>85D>aL)GU0>%9f$4)y%?CVeGYB7a{@C zE#uZ-#b9NB{kwT!+#zI{&Gu~FzCglcVBYGJ!vh*DQY=#yM=-jc!kG%6md@|%JTW-k 
zKx3B8ovy_^BYkQ0>-#p2#F8i1)1K~SCD`%OOb!*_@2m}h#Kug!XpDw45goiEUp!(8 zIw%tqFuocX!0rWeW{zU}c#?a6p2;>y6l_e|WC1>c{j zv`TMXx#`Du?2!VAkSSi(r-pcCecKckPmd@?e`YW@C$`o_l&ua*n};G4f(9h)t154Y zIL+j=SwRxRPT8<3w7NvFH?Zt#lXDD$=r4vKi=Z*;_Yt8Cz97m_P+4YPeMuxZ4{ z4Q>N2x9{DaduS{xCPIq?PWiGhj`@;F9L|jseAE)~eJTqO z8p3`*^T4_OH&R2s##!N;4+{f+qhaq2@A7>%e0%!Jwybtq3k92{vM}Q(hE$#`lnD5u z)zCJpf1Gu*oEB2gzs^c5$6NI)qo(2%BIAp@Nhvk&(202)*W7OyfXR6jjQ~u@I;E;x zBa+@FPXk*;JNVcoWSR%byLVmgcCDzQfruq_VJQG~GYBA2PPqf_A4y&l|S|k!gT$^v24xeV^_1(QT(=$jE|K+PzL2 zv?I9#_+cYb;FS9_uOTg9Mo)ewHS-d%ok}!FQ1r5?b+g}P${5Cd^U7Ngdt}(aQ<8GG zNYrET9(W8bApq3;1o%^>O80$6YCO(XrFUxn#I9!c*4&+3wq zt#b_of)nn}Ep@MCrR=rNws;;H2Z;1z*Xh67Gb8^-4s{Y+t;yBF?ebNbXLH1ztxwL> znBn0-M4v$%9##v57+bLIcvOI>9)9LM67T-Sdh2ey91D30HafYf)g`$$S;FY@up5tv zbA$Ffw<7WBp10wts7;L|$u2_tOgusB^ z0QuX%ADAbXaqhE*;40kiQ{SVF(cHNA&BdsPAlyN}gHC#e$-30O0w&+fDEuddJK`Vd zc>iQ^kAJ72iaZ^qV0v^W>G&)OuZh{~d<(>=2e;(B5=oC6zy__~gG`34Ava=y;a(lY z#URiP>jW7j=$UXc6kh!VULvD6Y1y|55XR()=E(#fc<>;JZECrucbbEK7M*DJ9Q3eJ zvuT*+A(c#`mz?J99&vDh^*-wU59|Hc_5bzyzg&Omu$=PJ!{t3#SKkCTxRGlaQOoe) zx3O0P+r;;YIM$6-C<9sDnf|1^?GMO?n@0E$=3?%KlmzFzsvCL$w=J1>A2tLn5>Nh( zh6O8000qxaKss}>?LN3QVz!*Tib)Uy`%C^3OAj^xq=c`Rrg$H&=RzEH(cWdYMBJSb4}cEBjOOyaXim zY`A{{H7K`6)z)~1v#5d8+A`D5-_u^m*>UC=5v<3J#@h*}Adv$g15xYQwr@j{aP=WM zOxwm3mo1mXKP#~_;@v*Cd2|ZFwBfWFz8X3M;W8MO73TLWf0;h;cxU{?b5Dai*_`0Vr*nQNcy(^iHyY}~BpuLddjJ%B9w^n|G}DPpwc zuY0ogA+A~dMq^X{!vbw6bg{C9o=)DVVh;UjB{MQI0x2&3v#kXu^I(CPhV$T>uwWv6 z81yoOVrCZC8yVP&w5Gil(=@EV^K$d?tkabq=!d;)C_SSKA1XhB+-QDFvF|cQcc-&Sgr{h( zfHO!h%p34yNv~QuNG`$$S&YzNN~1)6;yn7|y&^e}#c&tX2kR zX@Zau3YS#FF+w&*3?Y<*Aobn6U0=gIa7=PvE~!n< zbJTSbpl7;9^TYK}U&$XNyEi|pmAA#P7F7Eo+-!*H#H4*G_Py1_Ik{FQnvaeCi)a(7FfPYa0Gtd;;7?#AI`yZt~f zBuzO#bB##ilbpc44czSq@p0ObnxZ(o3?()LXaSwrIYV7b5T^PBb;jYZQDUwF%>bLU zOw#8_icN;Bi(wAj;WwJOkV%GyZcuvu+OrdC7yjeUF0Pn?yHrYfUT;CL2F44i7U{sD7$B^OZ-FY4YT(ARpIbvd?#=E%)60a1hwa%ZvL@^{o)X+N){YXrNqG22{o{NtMcDyh7Q1QL%!?cpF$jJM;`Bw0^ia+ z9sb=8Zx{Za%#aGs$-Ed$;se|aelPbh7`n4-F54^63n%{^N#KF5mSJ2P9nAbd&GvT6EcLkmrji35e#$jSYj3(lYu<6tA4+E8n(nnN z<2*jNXx1b9`4V)m&7Gp=GewQE2rsg9e-6@A$uI9<+73T&m`YL{bRIQSI=!Qq16Aua z2VbP~IddB#&)>J4T!91)M7Xweshr04E&z#vzjaSvcSoj2qDV78K+@ZfLg>x{HCYT( z*RqJqqz-_^?wq)a9@?HZeva%ojhwQf0-cFq8`SnXbfXjqZS{%a&Z*8DFzf9z=yvzGXmleprjB=$U)mEksod}^B6@LJ^`hu2mn>oKLcwkH%Kho zbdJm9-AFv#GA%d?=?h;CC-7a%ZLpCsdB>JDn*ITCnMG^PX1ADnCe(rIg~5SU`s?V@ zFO%%iU|8WEVZ(HAPrWYQ(xNh=$&5Y^7r<{P$D6Vh2(gODqbv$ruyB;>=Ee*t$tiUq z^M}HlHOyuMOSx>XRWtVUpYZp;d?&)~6?sLGb}I^q-hM#-?V^Br)1vZW=%AK{8n>sx z$PV|0t5mG(PgLtFRtK6rGk0a>vF$j?evni0!WGAzi^HKG8ji$BYL-h@p*JQq(OzCS$ISw@^c>)u7WqBwQx9u9LL~#o2d(`uYhJ zY2A%~x5^a;BtMdAkhQ~PtA={_UeMgn8aicKZJ}}vD}^C>VFu=On|FeN76T+25A>^0 zKDb$uN|V7VwMN&V38-QyTy=^?ZfZb&j0|&nO?-6;Ct}@>c2h{@YLx4oXNDN=`%n*S zancWlKL|k9h7<0mziY{pCk5s?EOmF=KndJ&s~Fqxn~GImTgbZuLPwArqgS0IP@gfY zxzBtc-e#|Ec6fPlFYO(Ijrr8Mf5HlIQ4dX$HSWqXd69LSiHv+Y3Ac%g;7(=V6?wqnolK<-p< z_{CLp1$jC$Q)zA4CY}Fu%br-S z5>adC(v?hJH(GjBwCaX)!3* zh7(UcC6r?pTxC4vHyM#DWXTNSzHi{TDbn|zLLEVm_dKWT8xp2os~-1cOQ%Qkdp$%d zWe?XIi0rD^<)W4!sfWJWEwY;ne#CioeR#t4%qH9?-1s=)c#P)3$r`t)G@Dnd)TgV( zsbw!wly^13@RODR;S~C@Ukt(O7WkxQs^1|igT`!HVV}KtJJ#U~R_t#ukAxeEFF==| z6E9pVH=RR;N&8-3kz#o_(XZhL(F!D31Dd{F%OP#VVtcC?{!}O?gL}N>EtqjKU;W5P z=9n+@(nlDFrc{Gm7JMBT57#+yV9a$?+gRH-G;R6Pw6 znP+#Fd;^&Dk4yVU(x;QmNAD55Q&!IRjf^$b)a})YFxqc>#X7fg4-BshGBt0nU)BikU16C6n@ltpB`l&3cpubRT3k3XZAVldLH4% zJM8?IEW2rk35+p_9o*E`7oY3Lyb0pT{*!nitOZqO^nUY!FT-Bx!O6W&A!xprKm0uP z$Cg<`4O=j2SXYv=eE~95wfY0N4V|kUztM!A+gBCk^CRT2d+9Lu~Ux($1h#PdA(NiC3s{0ZLqTuMO zZ4TR`%l0ji8Yyv1_kC-gKEKLX(Igf!C>44_^FD)tIZdX{=N)#!EMvQY+S!Wt+Um&z 
zPy60kM)IUYpX}|MExs~R$}-irCgynI)1|_aIjlx{U8-i@T^Lp)^;46O!%$}Y&aFE4 zpv0g6r# zblILLG4i%>=SvWsSQW!cW@=x0^wWks_6b6Z6)O9-)iYElEHpmQn_8^Y4wvwsg{)Q1 zs5(JIMjj-y?b$D{*!Rb{`FduWl#;ZpV@FzA?sAv6oqi;8*6R)s1qyW&L_7)0?$(V* z*~aDSf`WIZW|zZGH3t$`sWO{$6xnLEE(~w<@Kr;3_PAB*rM|`G`SDwmQ9dw@5!ER1 zRA_3Wbk=Bg8eK&S#w?OJSPvHV^nU_Q7@A`0CIElsI93Pg+&)LuE_J+UUMUR0L`}iZ zs3H9|K${xDN2X8TJ?l5wf+m4*6a#@O{meMZV=3O3#&NgdC{n^r&=AP}7$^d=*^R~U zL#I$Uixy5E@E?xARCwKE*g+4!B6?u;K#*3r7%wc zCPvf6RGJ8f5Qp8AXm*5H@8uf~y~=V{2e@BLXL{C>tHRxFP$u#*!b}&Iy3FnwfW`w| zz91k0xA~uOh2LLefYW35quf(A5`^X=W_R!_fcm-eIKz7f9dxuA-n=vq*vpzx;xFhqYnI$8oO)oWw?I~ z=?RHlV*$}!!>L7_7u=${=#=lGGpALky~3N+ES%^(DF(YQx&kh>W`}q&osKcuIm7RD z_y<-8vrf969(rXBWeJQlyWY1PrHAqDACebR58qt~IoA`$d!lGl=^5*i0MAi|jt_ma zvSArfFEr<8k}Bxw3`zyv(4U0%ifGsfxVyEO@ilGP!jF{M;~67KLF}r0;e^vnp`P_2 zVz0zQS;VZ)R@7lN?%!HWzkRBazS7r+!7Sm!lQ0#=+vn3cpSHzFKkx1ahF@Kmr|kSK z3dvz#9e&)hXq{F*qvji>uMx8^v2al`KMwlCSnS7p+gc8cug&t6 zXxeJGL(b@NDP8%8owBCXD+sfM$6hfiXA$%oDia@CoF(udhX=GZ)H=6P8n6m2)^tbS z+%RzP7~;-~)>hoj3UAa=WXGqoq(uJE(}y)GCwvuDoNa#Dk}i5l@12TxATq1OvJIkh z%I#Y^dgXfAk?W!(HNqmZ_rzJYoNih@PK3z?gb2|gtZrS%;{a-jy~Ss5wCRSo9-ox8 zhik>nelkJHYRc@48T)ZYr=~N)jC9Ako-+|5NsntbM%bE8P?`!M#H+`sr6fDJKc=*p zh$;~dPV8N6p=?`#{vz9+`qT{w4iSPcg3Lm-1|d@57Tn|>`gw~V_TnI2x$_JNwp>Xq zHUUQ3LK^%*sFwnXU1=UcVPYBZnw$0e*c+ z=&Gfu91L1pjbSBb+vh!z2qVmG*;nqZysx*FLzK*$=Dw+gE&e2~=7NT0sl~$Y^3wg9 z+Bk4s7KC7Yc2z`lKVyNzUZfVa3#J?P$@a8Bvc*#Rw@j;NEN*Eu4E9DsnYh~x2~QKB zIh!KCg2>c>+&LsK=9<`eynaSc-wuD>%gd%~eV3FCdEUp&T_HYYTG5xB_}JdduTn(v?n&AeCUhKQ5W_hxjbY3oT4%=?};i; zBh$ejyB929aaq^*djPO9hrmBm)?a!cLrp7oUPWs?d);PXlt=BN=3>xB$)lPE0iBkH zkJOl^ShGdHMrCy~_U zXO(Yn=nc#u41+%|w|53Z!(NhM4<&GQ4Kh8-#RO2)DDOEBl3|4?&qK@XGyyNJAwCX3y zL)dW4mhf8HcLMV(Xf@Ijikq?N?m~{|xQn6>kX_wTp<*Br{lIGqoMZ>Gu9m%ip{gIa zv!*w)KmKsxzPiUSN=2~$X-U(8rBjFQv!yI}T~-5Of~hq^&& zIc%bkzkt>v752gNT+b2A>?jYXTrYJGDl{FVXakuE%N5)RT;C*Yna8L|Q@KnENiub^ zl9MB!8%I;$DUH$PxDStgIUB~=T5r8^d*@z<-OJulQGw%;-l#n#PhrwMYn>M7p_e!K zNyQOfp^#1UoO0D`SWe#&M%`Wx@+V=4EiVe8&?L2JX-4iXm zhe9XS8Ul(kb?%Tvzdx{XVBUPAbVq2-RQUm)!-`F3`XHL|2~+WMLZI-*Rj$s^&t5k3 zPa@>g`M5KhzkIx;)~%fe3V+P&$+_(l5y|+N*>(|;B&sovMzZ3 z^HL8RbnGQ44hSWKBD?eKBih0-HPbM&JA#lzl~7Y5gV!u$+mb8 z9T-NL?!(#ypvP4+E6dqL48F|9F|B^E(Sy>1&+87o2zx4{pP6tJ$?IZKa4Vk=s15d6l7isJ!IAp0hjN$_a<$u^ ztWLKl^wEyAd;zIO(yU>n1kHDq2Y;l_elMr&a6|fP=F!z{+@jVVRJ52D_r3(z7q7*^KX=9BU~&w{U3WdlI|qY!G7kLi zNBn0W@5&??+%tWBVKGFXdt*0u^#Dtb&~>eJsdt?=AI{b0v4w0WEWL2;o_#1uvzUC? 
zSyv=7_#!rWSMA>85Bkf>s!iH2U)mI06L|XphAYk*_sWhe(xh{zwgq%Eku5Y1^_;$@ z_;#nY$%DmigvkCd`7PL??*2T#U;6KlFx$=bo^e9A@_o_ZR_M;-f@{@2Q%RX4RSp^U zN#zefLhbVL_~%mVKu_ezbEP=nJFc_>43PDp-H~YoZGAP`|i^onNEdszW`P=u2=3l<&#-PG=a zoKJ9^Vo2Bre6X+tN)5WTo4}LqpY+ixJSFgIxihjC zF8dGo0*57N9Oew2iYb>zzVUoCJXaOsK%GJl-FJY$gKVnZO%JY<7v9z@zsY8qKZF-iHZe)o<<74R9*ht}p6Zg`~R^)IZ+G$V*pc(F-M>G zmGVe^2FeO+*QY^hCCmq?Z`H#$@2_dCLJ#EUAE+A;PrPfta-KJxYRT`(SABOseh7hu zp61`RqaI7jm|+ND{mM`?^86@rocZKrm(A8Yj0JQmal|fiGp^CS@Ped!{~fs@HrKn9 zr{e48^aJsW)3BQG-k-77%Uy=w2LHgV;lrWHzWvln;ce?(gDngjveg8-KZDsn2g?W5 zCv0=C%n-EUzzf1&ajgTO-Rs{i-GB4P133RRA9^hGgc0(UI>@8eyjA&Gbl{)6fgD-q zm^j~40IhUi2GMx47AR9V1Y406<)50~CjZzb?2?9<_HmodcF^e--}x2IcyopU#gCw^ z0@)uuB+fJ&swj02yB(R$}C|~%FC`jNz4`oK%rd@MFAA^q13;8-S6MKiQB?tfKxvvs!Zb2r-QNeC+l`qsH#!a|;F)jS%$^qu`=P!$w5T$y8EaL&>9OPjpxno04J_yK| z|K+m(@@@Y(dklHH)MtCiDA3&eFBkrouS*F9fC@_FZ`t@EUeITN)UEZuTd6E+b2G9# z9yujU_W89;zn1COZu-~F|7$n>%|QP$djEHxO?m{Z3tsqb&6Px{@q;ZuBIzmM6X31M z*=UTs99er)GC-aczElGl%HJqhPcP@x(}*2;#~-!BW-}#2D!5Rum#V0iXJqK5|Anii z^IE#NMGyZDEnd0g$5gdB34$$U5T;TZ{Xr54-MvpL#=8={r-YAr%2B(Aoxa+zwG}MH zr$m0PJ)LuG-#4ES7OM8%;<)xJ+x!?1IA0f0OCObQ{NEi-x!W6mxcTtZUq|jQ&*)En zhn~JV#Z+tk6NvcF{6i>^l{fdumy>0N#b~~Am3)b~ z16__v!hxdV5dL6vF4P7>@8#2M;1YP*NF~?sxdT?PvW}Zd(+WLK)Wn!}+>jq-IG!{- zJ3#z!N7}r18AB(l1k?;Mjn#B|&Oc5Vop95#6Pj|nr&Y8*PJm%K5IhI<{B++)AA(^4 zSMaAad$1={s%x$@H=ql3baRBB84(;T`4F;+E4NTy$&*v* znR^Ob6mCL#*IXvp3nYXVtf?!S6JO{fEQ@>1wfD4a8|LL3XD}*l6@C|jb+ZAZj}Bxw z{e$YHY&f*1OM3~1O*NM`&2u@6r@yuoS8CEv=ns<$GW~Ry5Y;vxQfh);>S&6?Xobl!e2&+C35qG1g3WcLBA`~M2Q|I?pWBF_^# z8j|Vy;E!;EGb4uR;|#)FNGHZuT|3JIQx5R6uEY4xIQdk8*ok8$;i{!`0^2+Z)7A9- z?}wND_J@1sJ_v`jU=`&zTvC3c@!6_u3vBLB$H{5+$nbBm-6!PY_;n*XTxPs(3l6Bf z!f{i!S@bFVqSSMffDzx*l8u})nj{;0<;n6j8cyV)!JeDqbm+d+CqM~F!GK?xmcw;Z zPP)fY4|jnUXuZGD9NP#(E}0ErHihh{LTKtG4Lc&WW|8`YQVE%dn(?4ppad;X2aq-6 zOb23|4oI9Mvfg7BwSNKh#5|Gs8%=QPufO><2fyxvUwh$~J@{o4f1L}z&cXcau14k-$9;2^iDEX$z49S%QUz)oCgbl$YZLEi!4?e zJzzrI^B(nym|s7*D10>FbIpiOfN#M*&a$a5H;#?Q2=DIGk;f&1OyCAh9{e86V8D& zvhO=Ck?B>IlpLMvU0J<`h969hIUz|)X8SM~pj)FmpMQ0cn2hIyy?Fx0U;))9jDJ@R z`(HdiFzudeHz>?DWCh*f(tD6mWxMBq*-Hvvu)ex%V)SG5jJ2qap3@C>PG!@sN$WxTlmLf}CI?Eq=mEa9-|p z1SP;$noo-c)@HtV2gL|7wsIWPZEcUNn|lxyq$;92n5u(Go+R+V$S4;dD0WieH=4dS z{{6G)MSpyp4Js76sk6pa_uxfDONO<9L~PWZP_F0YdS8;pRatntFG^i}qEab-+g|TG z)gREi8b7C0LO5MF>ftA=xUyZbEiyq?2Aq+QLMbJiw_ za2ppjJ=p@MLx&+)tWhIhR1?%K9N6k{-KP)FzLP#!^!a3FWb4I{H>Ykook;%lSAF1r zZ-xR5fHhm*8<_-6;SF`y$-VYQP{vjVKSmt^CAOD6k@(ZxpeZ`G^BbHbMujKbESj8E z*4v~UZV z?)ZdhrC|_Zu0L5Re0#5h&Uw2xo8+r3>Jzo+p<7rhzQ1+{;W^BZ#T%RA|GmnygzhvcO4cC8y;K{RA{(jM#4hLu#((7X4+Zk3)ssL8t% z4^b20yG9Ot(O_6$ud3srV=?r>W>nd~)6e2avNU_&*;8VYrM5W!R3-Bgi1(NdZp5cG zrII)|9-m;~H$Dg{Rii&*dGgq@!KD%vALUbO507hIch_Z#WCF^Cmy~!Iqu#rIj$zq7sgnj!sF0>08lums;ei?APX9|U4O;~LLWyPfPJFE_8! 
ztlZ_uolUc2=|D<+l3njnn{V`xrok`66BmEXt2r6p*vzD3!~+*H)){Md z%ouxi=#K&+V6l2Y^7H7AHZHmP=~#=kc6a$u==Ze*E7KOs3%NZt+Jx@g3Ws!jPpDRu zA1roI|0J7t>y?#;P-j?)4~Or$81E8ZBZqt4Qt4xzFBO^?`6A8KX1o)4MOA&Yl8z6J z8SuWe;^k>u@#l9uE|n5hea7oTiXfx*Jz&BAsJNuH2zX_BHI^4zB zwU2kXAbs6hnpF?Mk5<`Vq;fFm0Bwc?PYXOP_kra)yl0J>^Tv8vLFg&rJqd#aV1X-`eZVSC^MR zs4!KWWhXtj@ZlaW`CcF6I=qS7)QewY@kDHYMg&A7c&yvV1f`xiDRMi0Dzy#Byf5%OrC9c=DrJ;{RdxKsqVc{ zx6PRlvGJ4_n<6R|vmdm}rQzQScdB1>=m$IH`s{L*_myKUO-;e(S@xJ_c=fE$%MS~*E)BeKsN7$iMzUH?2$UIy z-uOP}P+>dw(&qDbn{D;lKiXPiwFH#tmXNV=*`EKtW?&ghCRuOb(BPEPLxp7aw<>|>bI zaqpumX2ndyWF*xLs(n;GN57pq>!A}mH*Fv8u58)=iHK3-*10iQn03SbwU?FmKuCBx zzxoB58ch{z8}izGOFzp(&bg4h8L0+hAkVVmPP&|xSMD(b{5eLQFxQVEx(mgHk^`2~ zAbC0(guFTH7B0Qz*r7-636RyS7lMcOvGG=pDuy2J&C=lO!ZS2tiwi>sM9L0R!B z8(UV3(X+-zqOWL(pH&QYL^_+EL@jc%N4lvxI)kEIb@> z8MmHWV9g-&P5B{eTFyGx^H!Dm`4YDak>1HSVc(GX6$5*4`=tiK)IrGSLeH0*&x1zM zXB6MKIj#|glU;N1A2&_nZ7(lx6}Nw!%2ghlrL2%+zv7&;ub4|&aBbxss;Yd?oK&PC zQ>7h}{GolqSD+@yooihV3G(?@=Nm#L)>1-lG%wA{S_DnaQBq`|I0<2c`x>Z~AJNex z%eO9t8|c@VR;VnS$UWD8RAVLTqdE2DUj8BzDfAz3y z=~6SfC2wP3wS)~{lJqQOx(*9BcU3nj9~AJ@>Xdsaa{zwok$lUdx5@UDpsBhCiU-=R zQeNu8E}S$_Z#tO#PTowmwpsg?A?m)6^BWmf=Tjow0w3|&4)EUS!sDLB;kw&DlrpY0 zXk?z}4!VacD+#%h?#z3M2VO;&PZy@ebSTQZ7Hd>`<7EbA_HKis$`go*&E+j@7@z8k z8xx<=FAmUIp#x@Qap|N=jtXnZvDTW($E0#ctdVkY_C=jc!4ErY1|uBx>qe{wNou6|HCHuNRl4e>&GCWV7RI{+}xT}?^ z&$Q%3VX4>4_-su=3BDI!6Yyb~3HxwgDy61?;ZA&mQ0skBxbI}CSh8Kq#s$grMu2m9 zjy6Nn5ldl*5vLJ-eQp?zD)WiBCgeu6Hidsdst|3)_>~%hf5kA<(Z4Gz0pthX$bbIn zf%9X9zZdFKnbV`U*agnRLR8nHM=3UQBfFDtYWQn5?Skir#5Nb;Me5A02QEq*?dE0% z;m_)0;&e~mlg@6J>m`-#QzIGQT>BF4wj>Goz%m^}Rq(Xs2VF0>{ArLUXuB8TF~h&n zTp;W0zkz$D&m2oWjz-qU-6vpk)a~X>I=2rVn0-u*84sLg{eRed@35x+E?*Qy#0Dru zI#ECo0wU6DY%~!2A z?}O6uC%n>I3#B2;?J5ZtG*P6RKkLVO%#te@+Lmwl8mqLmn2#r}u`KfD2Jgyny9M)% zJnp5{D%YuB&haRN&s?Y3-_Rbsx9G^(rM_G|`wHjswHlf+#aDY9qOi-IvbXr=Ax?Uu z{(ML`e(HBl?2q)GpLyJN9xu4x)Y$W#lO7&h_-)PzX74=D@nWRHeU|fGfn9f(+v*&h zW#0R@vs9!037O$Z-{hrtx8^o7%2nL&?KiM>>CLD)b!C4BCYzW1pCltPe(Gqd^*-@v zK%<{0SiirU#(?lzb?bF$YOwboxd*VxFB% zm(syGah#22oKCnYL!)dPa=}L)J@Q4PIRF)-Io|BnBW(6=B>tC20coY7L3(+<#8fZ5 zWN4hVNO{KVbgcApzFI3v+6rM|zp^_2#b^A1%?M`Ab{ef4p&NeVXvgF0yOOJdC^(N_b}0xA`^LOvb9!oyL_Z z`}rCn$zdgVidWmuT>QM>ZA+zxI)VtVOV4Z;0n>mfY}=~D20^@hdR=g2l>gB+<9{`5K%Vx@u}wWrLzaE_jA>fv$GOm&A)@;wRMtFzESp@mg&0B zKnHs&i0?*vgC+Q)6p!b zX@g{y79d>lyY*9HOS52V+);iR9{Y|gKYOBH-1q$Tb zfO!nZt6_E{ihtymnlhiYWcIJ<{0JNAU8a^PVUPU0OA;g(nmZCP(F_6a5AP^&kaPnW zTwzEi_lv(tZH@sTfX<-B)H3Sr?zInSs0drJIeA#VrKI^wIOW*hv=41oZ04Vh0AVb) zJViGbxZfPHFRbZ$TeM-DJTq|63X}NLk`mT$Bzo})RHMGGe06y~hYwY>gTFy1$~I=9 zx!O}R+rm~JFw2yR!Tk=y-kS|jMW~*ElBsTF0>8fc6E0vFTv~QUCFY_{|03u_BLDJw z@)?J!eT|7VNfe2HQusssQI2x_Vgh( z=Ly$gIicd+XXk3|Cd4&B|8zfmeug%2>QFvz9nu)-z|(w*Ixe(s1c~3Jom*Z_WFXQS zL*}lS2#DWI)jAcEplpxB%!%NA4V$JK)$xkMm4dcuTAC1Kg2D`q2Hhk`$s>Zbb~+#{px$tzbfw3-ZQg! 
zGzMzwwD%Bk9m|wGTcQ3QM5{y=3%bUOFfdjRtnc}zng{SzwEf<@Oshdg92t5~r-;@4 zPEDkMH3mP@_rN?cFRNm4jx(YQdlN7MVWSdVv1htyUyKJJzaO)()KBS@^S1rO z=yiY5{o`91b&~(29}5n`>39e$2Lhz19q>CGWKl?#O#c^eb-@O81jdnjCh6NKmy*5L zp5)O#CduDhQjp25N zwdAdD&l;yV3_lE`7Knj4XPythRxw+)C%2}(*-a`($Pn%hyKv^v*@_FFU zFJeD1?ADqF(=z0+rAE?Ub!1-X{m=^4x?UEY59r65)b@t)@x!;*y+A$ny6CJ-d}dGD zoGPpF`~NgX**~(^*h|}lbH{(K-*|Y8+YdOQne6bu$5IyM`6(e(1u&_uSg!8vADVJZ z4<-Xc6!K_@n`8W{u%hckIRj&a%qrYDN&o8pmy6SL*wWF~9~#RQZ_`#hN0n={=j64bZx6f+#@V0w7M_!0IQg{{0?s(f&DI zl_u`pJvv`dGcv??rzODa1oSY{LSsl{qNl$?PHC`~7QbLlST3{8HZl9nybKj>k4d~> zf3$K4qa?XQLaJ}37up%)&8M~OmoT5-yxaBZP_|FM5~^!qUAk5!ISR363Q}-tVmf=# z46Is~q^Dt(Vt;k=ghnXVe(iu~rOY^3ZZ7iW#|J-XXlUQkaFaNw1^MAJgJ6{_p?_Z?k__qcFHQfO95s*1;!0Zc{Us-#0 zhys))M^!>d$$)0gJO(<gKA0I%#jcFJA0p1G8^h0-> zvVUlx4G?v3YXJyP#1!&uy{Nxg9X_0_(wFkIq#sBJzo2KV1S?yJ zUFTj7m8rY~@?f(v?Vpi)b)wg}OcsrW zk`{a%fpotFh>qp&*ujdj!>DsljLLZ^}jyeH_GdbmkjQ>i!~XW+UV>Ho&(`>}FTS3Vgcxhyf3l|{0x;a{R==Gs+2 z1M()jeB6Y@M0*Al157^N%1wR$Ob>N}6U#3o zsQa|a=%<)ToODa2rVi*Cgf!@7-?nop;rSt(3EX%!=YXfT|J@$SER7g8f3I0h-Bqrv z@HZhTCYp(pD5zK^=YcuM_he#D7Te22Ruj%#KPOe_#dELw(@Mi)6BZK?5;Ng^2DdDx zd1=t>chc4Eel3=>>iR|r@r#$$uLbqtI$h}L=OhRY*u((A3epj^Oz~&chsRLABJ5T+ z+g@)2Q+PLlMKu6q7L9bW4O40!+`7Ea{BE6zDM~VEhIZAC>hXJbyz=pG^N4XQwW#*< zL8-l3mf^x*a*6-z2zmdV6K$ds;Bt^v9j{p(aMgQTO@ZWaQbj*_2@puDfhQlFC>Gip zk1wMFGNvOMk!PYcsheSj)L;ychql6s`q1`%%J77Xl~lBbGM|jdOfI_5`ypHRW5aZH zWG$gY#4|R@^#efbKeFeDCagmKFpJr?J%65X+e?D;?u-$aw_9cmW8&De+1J2FD zT9S?f4ikKCr{|e<*n7}A#0nc`<}M-6OP^Cy34wagXmWlDNYboaH=s!)(%sUh~ zTVFv$o*2!Hg_iDw%6#9Icy~DAaq-qstO{FQ=axzf4l8BMF|Iy>!pI3Np|Q_IxAB6X4!eVgfb(^{am+mV)wIOn z_sUZl`w*^26gn&5XxfL`!3PCoaL{i9YSBhP~{KM;aL2ar+!1AyIVLiN(6^xBxm)sEA)HHnv1wAs%#YHkmja%Hj&5urh@+F%ZuAbK%guC z|0Np4-&INHS4%7JIgtjynE7&m{k|E%`Rx*q+)Qag+(BSeBx}^fR4HcR< z7T8*CfYHg>>}FHf8glT6ygO+8%i-|oCsxU3SWveUDFDkS1?sstDEF5BWrO-Rho~Rd zJXJ2FcMS`^-%pez<vU>i%_Tlu!kj z`P;PZ6wly8C(dNArwbMZEi{r?!NG)tp}Utmea^Tt_Nk_o!g#n0~ABv zJ!`=N9u@}hz!9#V;%php?PEWgjZapud#MVlUUNTH29$8c-DX$Xw=9p7GceM~*)Lo> zVZapAd;X{;xqCbR(%m)uMB!9RT)`vJt(%Pkmur4iVa+d4M2bxJs|4pAbo@L2D%(~6 z8xbEfzSGk35)I9p?~YiXaJWazr*-kyz(XP(P*p?7GE7S!-4Ju1mnMdc_H2ceS4%34|cJ2@k7$JINn8H zPHRbUvcA;3SESX3EDniMx#}HRFGOv(5URpY7c)LxbJcmEQPrEogHCW)Lafi;P1}}Z zs}Ow?^|-QfFiEsJy-EtUnA3>(-39qlcZIi$-sgcHnshQay0_VG+V`un z-T2RLjQr1?&&aQ)OIu~cY=bl~K29et`0M8*RSPXi85-IusZ%e*h!+<2R7C#=dMT5S-;@=KsQ>#bLmy_ zV!FQXP}%4IojwEok{6lHZI-eVSTDEbBcX_`nt0>Xia(_*t<)n-XF;+D(Y#Jm)!KfE}P9C5_RRjhh zR;Um_D$^rp{jzo*htZd%Ax@bI(+8~NlSeJ>caT11Y8QT@hcIJivsS200v`H+=yh3r zC_#HU?{0(-?udr->8|T#x{pwIRdmcBniou?zR35Y7%s$}cL56&eHlMLA7~2YQxfg% zo^GJUeV%cQbjs>XY#T=R$GWCj0t?$P5VcIM0`c{q`k#OAHX9zYl|>p-5&|8)tc9eg zQrV7L5dqY(oiE5{i9a-+<8GvCKwZhlWCCzAN$T2D-0fWumn|Tiqa7(JnR^HY zqF8|R8K;v)77Dnn=c)k6!5 zl`A1~i#WP@IOn*n(^BpFdm2sAv-Hxt&^Az{t?gnYKgm(4T#)3-w7oJ=uO%^`tv$x~ zL6k}2D8s(`O-%ZJ?rt68g+U}-TLjemLZ*CGm`#8-)+9UfbVe<@;R3>hFj5uQ<RX79_%zm@BkD#eQkJK3qGwFH1-MkLDPO-@$LHh3g;s2NNqLwMVQl+!ixkjsLT|K zs+BCHI}zJ_lynUZt3-3A~A-RH#76%2~R>J8gl43va-b5=pKu5 zJK5c<0w&w<8u;^~r3N3<-$RYANJWh1E>?y~PhCF+rHN~^Pb;)Aa?NnD5H21@kBTxr zQ6f|npRR;lwq;j9yr1uxN?wb9o+}u5u9t1o>4$A!2X9t|@N&E7kGKw@K3FLEF(KYp ztt#hfN?BPHR7&M3jpqUz-e;OG6>U^9(l|7NU3S`1&b*y4U#Q%CdT>1WxkB*mv2Onk zxFL0T(6oDjL1L?DWKsE(W$6m1Cpsa2<%QsI|A8Tb6}Xl4a{rplQm)xDE!!IFBYn^H zE@zb9TaXT)D`%M75)0|+wm**@rgKOnxkX$Vf^E7a3@k(BV;xpy!txJuS&OGRf{p@2 zuK!s3t+AIRkh`yv5J7N%Q6s|o87T$3LlqhDZ|0E|SyIzsy1!v8TKh12rfbP;3d|~y zH}N1a-^_+{EUd~xY5HP{d|E-Xj2kup1qY=O5-WPH&;Bv zm(acnDUFKn?4Q)C!b9&O5Aqk+30AkB>|Z_Yse+XkVwDQN+<}f-qI{R)mOJm49fNmP 
z>Om&>pu+yp*tJ!c*azI;l6Su65zhw_5HRx9y`)e0+FAlc{|cW#>6ggf;+@eY8RI1cD%BecWmtQgcrPZ|pai_l^!PIB?A&nZe(DtdljBoVJnYSoQ zi&D2#NjnkS$(1x_&ldRZ0+q5%Q}65 zrH$@Ay~nI+s^YYP@?SlSAI{;3Cbv-UU4~G8DhhUOd75)_=e)Fo)w{C5&9& zN4A_tT^%=vPr%za1;ok{ym(w1Xva%S!ANF1H9t(5p!j*d7cLQrWN=UV6(ju`Yv(Qj z%Y&ksmT~Dx3s;>Q&5cUS^tOtO>M*ARbuc`QOPl&t%4#s9HbTZIC8q0Cx8F*Til{A@ z6?*JogSZrLli&{0$%6G+OaLXQm!Rio7W0nKfzp8g0fN6jyG9G#U#My$K`Rw&Jk?Z9j4*YKZmb?x&+*z5yq~(I52h zTtE>DH_+oR?R{DTAGRUiA|w4}UGiI%x~{dypniM*fP%FkJqtF<8=W2^aAS8l)z>Jr zlIn>bG28%RT^f)dRd|^%bEK90ZcYIVpAZ-dyjL0N=sBS2LWsJ`%}8hjF`Z zw3f;+hEA4ZH>L^Of2OcM{`}(!#qQkVl2L`D-dP7aSTJAYV-9 zYZd3x!x-Algwy3Ou)2(L9X7QFv*)2#MT6v`L;~}0c_a(P^CFTiEXlg}vjLcN{8U$k zGyU1R13gISQhZ!roE)2}-=}X6Y@QcK`3@_q8&nKBB>M$5aMB7$?;#^quA4tooG@)S z7JtBAr0w5^Y>CJJw9i|e{k3N3%ZrIn0gtY>A0A#Z!E(~vS+p6cr8guF&LbrvOG7Uo z-jcNN_yK1ZziHp&;h-0v5HXvcjS!EVPc>Yg+E-bjk8V`kJeiqeYFYEw$3ln9rDkVe z+~c{e4R~<;X55xVJ|+_=DQWg7&gSb&_v0W(jp)9qta@ZPC>zn;SoztgDX*PayB(MV zOXQ!Wk=f76K4ZBwYXH@QKRdVKhJ7z8##-6FP$=tL(fKvVq|tHomIn`*z;sxb-b?vb zYZMd%8jVI=bzV8>MojZCG9cKfLk8jLs5FUao_B6s$eGJ+n1OnmunpDz2;(CP=hot7 zyL<%vhth@rf%5ySfe&5-S_$1Y(t)NHY)^mhUKMp#S>>;Pj?SyUv}Wo>zBV3=@$oE{ zE*pd1iqa5CT_F_lG!tJLjvi~ZS}6DVgTJ@liWKm;MawA>6ST1rw&D(-;9sUUGyZjx z<{o{IxvLMAi-??0YSC2E^g6tF?~(2u3vPi$L0{wExBeiN?C;un&x%$vA~3%p)jEE- zFHjc!z~!oyYQ4Cj6U&zLa-$_otW=GD)l_9cZh*wF$oHHRy$NFaV>2Craw=+o7>MEaK-%`8!VFnuhppwRPCu0d4JFCq>*vIkdK z?fLmgU8I0pNJuDVy0%>Wig|~#e}`*k27$`fs^!J{+no3Nw|Af9-)u{3F*tCDgzlFV zbrwaQ>QKJdYLaC9Vap^9GR>@rODSCJtKU|AEjIH7RF#`}RWwKOOG16-Wzwe)a@%%` zz8TslSqsX0<~)vq&xm6#nVD6spH0T?(RTcZJN=*598@Qb0sgq+$rOQ&tqZN8!`+F z!Xr?{WJdj?2Hak|>>G(w#QC32qMowJ>=&P*8kHqHpr`C|x6gL)^F$o7jljKKX}@cH zqHBI7qj-Tdey}Bj%I=!#Z(8+IF*~={`-jG&&gl{6%B;jj82-}@nYc3o%k+hwSUOmM_Nw~)t$bD&5+!=CGr*)F?i zX$o7DH%^H^KCPYH#jC28sbTw>t%~YCyV_j|OEI@MxR^)dR^fsjly()fyXS14-d0-+ z;~UH|X*<4N!18+22_4HOAh1{Pp>FSPQ{|SHQFO6rOIYxPA)m<#Ek>H~xq`#bn#74y zKMjC7hx-?Up0dMUUFaC{c>C4IDd=h3`nNc`U-t7ZV_!56lpj9@+uhHfmbh_$BYEk0 zdBB5Xu=HX_>d{)aNqwEMDAX}wl&mzsu(dX1^(0h70J24E&`dl(PjLAVwW+PD6SL*3 zoiR41dLe=)6Jma0FpJiaeLNm^aGUwUhZ|@7Z}`A<*HjsfZ0R3wJSIHpgq$lQ-Z*)n zxlDE>PO!-x{#Rn!J5?ma(#`#-*r$&=jq!(Gd12Xi3JRWv)HCskz zyXrPHf_bKg-4#9_Bqq^v2p$iAy3MjA>LcA+Q1Fz8V0vY8S6S8ez=Jnpaoght#}ZJ~ zm<5vh9D3}}DlS`Hm<)3ozl*RCejIDBmzmFQh2a_>*p4~Rj5ha>ZX2JxeRi+gz5MIr zl%ztR;TJE%D^M(&)psBY-*0#-YCSiM{oXj3Y=>QOm6Hzq$Pk%~kh$IiUPT4x+ThbN zy2%&W*R}oEN4mLVePK2JV9cHrMO}NU;`eR?`L6wi{=Th-4mUf`C!g;r7)J)ZaTYmQ z6I%*qKs9|in^GLaP5}pM94Gwf!}a*?7f#T5MTAoFuLw-fotjDmbh%4VRx0H5Vqe&^h^HlMW}i~HZl)NT1L)7+I~TTkQp7}1F7 zH9aC~eSWEj|GZ^kiSglx@-4J6H&+mrFFd>8x!!8mKV&|5|={{{p28a<*oFH#)i3AZE4m8fXDp{ zniD?iYLE?=l6XbJE5)hZKA*o>>7+>K@M>5&foGHE_;d@ws?}R$$=Cw**8*ic+)bUwuobA znR;wf6_XuPocQu->Lo^K*+W&0mS z9Efly6(AprQ$E4ipH&1x77iDvRj6g*K4PCC(PYgKfKJ-{!vyj#<_+vrA+lY^iO56^ z1Y?S5%>Z@q)EPVofTcyIA?bXn5!(&^PJNvK6f!CcFhE)KZQ_gX^jUk7aSfZX zOz*pv6XIOpler?-unjI^mWY*@(%?{fnI99#@Vc51DEA|xK6A~i?U9r#0*o*Eyuct< z`o+Nc!zUR7?q4cjaSP}zffL4Oq7|k`@5LN%o0?=N)OJ|X+&+JXAp`T~T;mT|`PJkI zF(*HcM42;}1DMWMzsAM$_Q!usTiIor|4X{8Nmk6EV8TcA>`pBc=V8%pOb#>x^9vPN z>t*A1J1NVg*|l#I7G2qRupmu5AcP#=A)tetZaWn&mDyVg8*2EsT|>^WZS=RcnWegA z^y5FU73RH8PD^mRQ8rZ@`qEne@g0zGn-5#uGwbVr19QP&A=(nDNEDos1Rx3*Rnd_f zTbR)(h$Ob;L<)2)VGroJ)80;RL9NMZX(zg%4q+tY4@SQ@%2AYl@{eKvxz!H_^+}y? 
zF_rlVA@iL6h4Se+{BI#inDMBE9%9bCURpB1UGJ&4K?hJWiJyWi zP%?f%*f-Yr8*S@9>GhOyUh8l5{*XQNsx0+_7EKMf^l`RAP(?>6woS=m6zo6sj7dhk3I2Eg8%iEgyf zDRV8xw1dYk!m?KpQMQ7^-pJAkb5~5GJAoj6^YIW(aXtk*H35`jq5iHgWz#ycYM z5>)V1KxD#@5VTvsiIR+w+t_D0 z)JXSw=V9+eTFH|Hgk7E6Acv+iVVV8{2al|FjzWgx0qoE#zsCPB-Y#%q#BiESoEp9` z=lz>kV=CoK-y0Xrn(hls_{AL1gdLY-q#$-;#@Y|(%Ua`PU{A^_-l@N+@?ZEvlL_AP zhje9z{{Gfo%91oyG2g#pU1ol(#-ook4m2ND_ZI(6l+C|^9QZ%S-TKPdJ9iYmU3+nX zL9%6}WveXSrya7CCAQm+35!EL3>wEgzm~?GD^OM8BB-D1{-Qr2_&!Jv{xRXW3$Jbz zXF@cX%O5jy6GKO9t|NYTGf_L>(ZswYY;2;Km{aE;8tLOSs=Cp%E-5Vsknva$>idU+ z@}ET%V0HZWc>^KrsDmo()COXlO!ftbijjG(gocIRAi}UBj52=K@1NIpIVO zgDndixQSY=hpVGkhnqU-YaK#Bp=j*7Yl#5;1EcuGD~)GhJufSvOv<(upjISY zvOLXFnqY^(>A~ZHg;YO5vgyP?2WOMW_J<}BCTMrY;_SCYlvEn04jtf98&? z&`q_Wtk$+-tN81XYJ`vvN{{FduNK=Pp!jA5MW!D?F~l$ecowWr733oEIu~-j>4^eL z+K2Zt{=+cj|3gO@-2sij?b<=ul%Ouc9a>}8$=4`-6js>AGeCn{{|`-U!aVxqf(3GQ z89hW62|FH!QD%k+lGtrkpHyC8GF6!_^Y{7jZ`-i53rw=n-186pVwx7%=K|}d7Q7t#fyX3JWIhd~=LTeYOqifxq|L-r_ zoagMOH#zU?jf(h2B{#<6pHA(|b^Dj^34BARgQwi{YAG@!j$6z&8&UR+z2CB4`O5oO zr~(wndguM7e@tggACR-(k^mMNk)E#J<#5Y-08}K93q ztCcGSDzKk3oxxJxyqbDZWjB28D!vqO|89_+qJ18SfB#xk=U=Rk>4B*+nXMdooyxwh z)i{WDl;6KCfTVl}@1MpfV`dA(SBS7?uV;_AF!7LC;{xb<+0ns!U`}0JUbV5J25^f4 zZb=)du=oe^OmB9Cj7N=FJ7~(q-k6F_@QOq z({<;Mn?fc49m;)Ia$Nz{hLNq8L3BaRBjg)z^oH*5{IpMT;mdz@75w$ytY+Jop|6}E zQ#Gv6-^fX5%kSjM6QQ38_5A({C98(*rsG%kr=*MMb3W7TbQCh{6QM7IEdwC*$CAgi z8TCw*bfn|V2MpJIkWNMI314N`-Wty8MNRw0(%w=JWK!|%Qi3mRV%!s*V^>T{lq~qP zUdIt|fB9Sf>t3}q{=1$VqXPhwgZpp!>hI3ge{W9zU)m4WHN%a331erkXNbE{`^cyJ?T}bIN>(9-4SCAA5S`r zs7Apkc`E!}mfuUxpACPx#*d9^y5%{p)*KD+f!UxCLezSFP zEib=wqt;}t^2ezvl;@k`ZS6M$ha*J2`FTlAEe((|`8Fl7{+vJmuA`FpTzir9t)FV& z0;maIl}P5A$l~=TInl&mMnqNb7jbn!LSnB;=xEEELp(KG<;o4Z?o zlYD0A4JU#eF5Uf=4^vZF>O^6zL5{M_-6uNXuVT%HuJ0Z=aY7EWmO_4uO*)i4Irf3q z@&!MHJL1-!-aeP3|6WL5`X;)9EU`Phx2BdE2GbylKLOsq-k&d?4w%#zjbR=xmX*$e z@-?qez|{cHIk14*jeE|ybzm=gn1}wJm5fANR;dg`PNDWE$pmpgeM4TDobP ze4Z$f$LvG}dHc?#{*)XQi0l{Bp&28HxlMTIeCPnT$E0k&4ZNC?WL1Y zCmvqN5E6$0vQZymC5Kp|=F32;a)9-ENdg)GxsRh3u+1|ph=D{%srV+zFV|$IxU8m% zJv!7v`9bfM5Y)1>Wvt-6PQT#x>&5W$pNJl038mr~9X!pmP-AOcge0tXJHt7VR!kot z%=J8WX_xVTXoB}qjCQd(pDq1tE!k@ByMb;!ZOCn95?P1|N@b?AGd&ehePUWLh`mq`5>2Rvt$>tk-PWM1Eyfr zhEbJWGNe6y>v$?@foijiBa1FIghU;MG_1!pN)jEjh`uxPR;Q8UZmi3l(BS047n-q`fp z5_*wdGOfDz@Yc@TNg4TDCVn^-@pnI|Jv!n3Q{qm$Qkm&P%jRol$9pQ>gJs+daA?_Z zY1P1b=b>QQ`ON1Xg8H{!c?bj+WL@?PtAQuX;I*NV$92;|g^1sfE8%9|=()w%>?}9X zv~r+k@%)j@z?ZKpd}d!QSd%Gw=uSHWm$-11>VI@b(fxm$MZi{2Tc1b-r;r8eR(N`k z0hR5T7yboQZm@z?;P!5T79?}^Taykea>dE^aMrk_A9St+~E0+28!BT_06JqPWls&8!%Fa`)$alfM% z8p!4l?bN<=>q$w3Wj5kha@nFk*LVv{J7J>Z=Cm#g=>kz}9)j)KAG$3SuG{ys;kp&% z78u8Y&Q`hQJz<6~PXe!%Yq>7D1;3P&=dDj5$^P)JfDu3+t%q)xw$JW8jsP>OSV(lB zgXaCa$C@r3?2hxA?3)pTS`?0gq*L;3t;|BdofT&5uzoo@dji9T~X2UW`XH%=N@!WSk;RO)A(fd zlE&PEnE-M_-6>_^{2y4JiAR<^N9bE6!$KcN9fb&mC-#R4Z(nrmm>b=xtxgs$sbj`^ zY8sCQa{K%x^vICLD_zU%KNN% zsB^_d1dDzH>$>Wj-3~bsI9!yk z8{CM<)OxmKVBAFZ{DpUkYu)}XR`s3FN81?fNBpG=ue!UAT3w5(FI#+07;JuNIH=5H z0bu4#vD3Q0e?2N;`Xtv5F)g8v*msjT=cQ`PqeeWtF4Oct?e=wCa6$0?&x|iFS@UX3 z?iJ|p4yIHV_JTDeajf_U5nR@0%5rGWwQW~r?5mi!5>FJdm@*|Ech2v>%J1q3s|u19 zJ9BQoJ;l)L>mrhq*y?tS2^Ew2R8bQ+dC>bJd3OR6)&w4c3SWrzEAt-#hu<-@lQWDN zgNmn87E!J{TWYHMKdafxWsWSf+R<1YHM<_B!T?^n^Qt=HZZWF)-!*p+b8jSHv=UeP zc7H87PH9Vd18i0vHs3gx@Y{z+Ze5-&SfO-H%^-AbutpxlOx)4#e=~h3WO69?pbBe2 z0k@Okgpy`m+rTODOp|6u8u=b)?=`rJ|6{_3-k<}&a6d1F0@3X!Zp^F_FBN@ZUC9YO z6}jG_aJvz}pvet*%D_x7q1fi~9E&u&qukb_x^g5(OFD zH%-{yX9ZDt!Qab=UJ#q}O-!DdQ<7gFohI#_6TR!ZFi@-5YQc{Cj^DMTa>v4jW7QlY zSnpW0RYB>!bM*1n>2CQoS3DS-#w?>JgW8XAAGA32os7q(bF=#ZL1=t|3JbyU%k$)q 
zL$Ca_Tf??QYDP~D9dnVI%4+WuXbp+sX!8xO%p^aLr|hC;OUBka)Pq;J0sU}2%`fit ze<9vqzy7PUf>fR6N|dV=$}+F$q)Vt0qK@6R|43kJb-1VByfF2BQS^FQ(oB{9ab;wO%KPLCn-d3)tCyUK6qZlsl@||$2B7zjPk_2 z4-y%LK~PPCR?EmXpIr8YY)4JE@r^LZ33X3t^7G$L2CEcPK1vd!g@fdr}h4T9UEn z4$x*LKm#)5#s^OeHKK)10Q~@lM@5)yI3&i8QX4=EUNW#9-^^mOo85Q1OdP-H<~yD} zJ+i>(X1NS9J^E?>&^Xmg&hB}}*okT$*#h5QS{*ySE1P(3>@Zkve?gdBa8FYLE-A2@Dqj>k;`M+R9HOl=mdSa6drg00gBx@S0dM>1Z2zrH zc*bp1^x&-MCq{ zy=PQYal0pqqN0c(0*bVtG$nMTN)eSVAT6Osn)Kcw5D^jSO{zeY-Xp#D-lc}#d#@ot z67PQBnfIKzbI;7VA7pr9`5FB?T7jaFqUWZ(o_h?~vp z@LkaN-K62)>H#_)+cZ4lDtWWD1~Un z1b)%iQPpNVPw6WmIYl9*t+-!L!pV*TWSlnca9>{;b^Rt?WV*BP(9KYMUs*+UocjYJ z!O_I~iu6Kr1sz@_y+k^8IwPs79J%*|!B3sL{HCitiR$fa>-g;!kPFaZ9M-bHX*{I3mczzw<= zfj|))+TLHNX}_3B6+PsA!a3|W+{5IH2>t;obA^aVfqYGlWl!p}ML)LOH2WYRO?oO; z@p1=wA7Qc85+YFE7$j$pLh^8i;x5yz*x4DbAoa5 z9$;|ab55tgK`{)Ca>u*`jL|0A(E0%`j}pQo&qeOzYds&dGAG!A)~F~84c{uKope-_ z+_8M`1PvKewZ2S!Cq;K;h)MF~Elz%RitP)GzCGxY;-}rLxI(|9GY~MS7__Px{YT6D zFHxtVGr-6BI=}TRWR=G43AvoOD~i3g!6sWUzVGsKy)q*(kK-U^oU{1r9;S#jtRvW@ zODxyU4Lq=`9L2w7&{FHNz>ezjEP2M;BSCjcD2|zI5q7VHZfM&dfzsbw zYm`wHaa*r=sQuyA;!P**wk{t!_JJk=#rs@#Uw8EQjb`P3n88E3z zLASp4YG2U_z8D|=+v@iM1L3Glw@j{iHB8IqaY7DlQ3j2jjM0WVOc`d>qw-C03bH*8GG{3J!|}od875a|axbP+NV&<3_^jWkWKJ{P zz*-j0L+k@p4pV1i&*4G{^LPfO$O7@0JIzsiB4$DX)w)M~gmS@p@KrZHcs=(G51v|t zkGSq!Nrho?2V~hbQ>Q!<^<`-pG_Usw)1C9mKY|}w=P^GZQQD}tE9}YGN_S;b6eWqM z=4gN)6l2>o$862QX7^jG`uyjcl$UwHUzwx^?BNrzIPrnL-=08+IfQWIE-H(rWp$J} zFyeM`A9=8E!?~g^FfXorT)8rlXDzZB9U~rsv!M*h3(nn8pT!gNwB*~EQfW2XRx_Z9 zU3e~N8(befk}_ufY$N{iju8E1;a&zW>ju%a80!nK%wKjxa;4=432$~5rd-2{Hb!aIkay~N(jFfM#*}(F8_3+P-gr_;wjs2w^7%SJ=<0xc$v5Hw z|F)lY(z0*X$Iif+#;;T zX|4tK*%UZv3Wn7QFDq`O+0?*7rsE>QT#F>?m9~5PV<@BMGiVn3;KP;u1R4QF<~|X* zE!9eLL5M__VVU!aVQtdPP-r{!vD7cW-h)Owk3MV*u}_ZQg=g=c=WgD*Wvy?MMSNpL z=*yG1wdn%=iCW3oeA*{nV-P{$dF!jGv1N(cc#7(W2=y_6g-7miXSR_${h280EBtN8 zeEp{5kc%ugsrkf`dk1>Dtk^hOejr&V`-=6;Xxb5vbH20n3KhHZZNr~TH3gr({ct?c zV(_x2m)r`LPShu1hSVqgDIY=+uI)T3+Y*nDwbwr{<9&l4A5v>=AS_C_lgIEcq~UEg z5L12U?t0<~m8L5U6_jJuUL@VHsCMFk<|rL6saH;YNsHK+2HiH-Et-GttOOXXb~eD;p>m7@ zA4Iuw7|>tg+M7XMeyK}(K)NkR|CGS zFpZJl)348Nd`xzkZZe2t;~q-!4czIVCgs!`fv3;BLTwB!pEi1QAn6A1cn1xK!Sy$= zKR|lBSj9&3rXEps2qK15AQw$yvw%00E0R@G_f(hkJk!Y)*a9qe`s*M3Bm+znv+O6P zRUp@I+Qc78NmNt(a@dkBHvNWuJf!d|Vv$q(_0A_COPNDx?X8y;C~F}a;b}oP)L%ey z>?vqiA*)Q@8h68ox|GJgWP}k9yiK2;ouY}m-A$8YWipaxrLG~RVmHAaff(P)S8O`yK7yua=yQtQ8q<4hfYMV!uLlO+weFvD-e-m&Ppf&+Z+k zU{K)1OB}df71eg>V`RkK^_a0?XVXr6R(Iy9f7mk&it8}2v~?#@9O8e3mB7tDEz!*F-zqgrSe$yL4~~Kk3v@lhjkaV(izj2&pa#X{}0C^ZuhN>x9v6)fUPG{!Z%FQ5-yON$-@^!=q@zsLG3=Q)8n?50oY$b>-euTn>iPn#;+6AV5^A8!&%E1kGQY z@MFbBNRwG=wA}0KJU*!YsMFdh;H++d7tQ~|kR0Y=)~g6E`mU1Az5N#nKl{k-PQs$9$J(pmiy=^u|2I6!CJ%Y5mB|mPFG8r8W$sfGtM4 zm7iZ4h@6FK5C+;2EEL5f1T%PxtS=kKZq|{P)Lv00a(@sUb)5gds6gc zOI_!?>#+5>AHRg^b*&3wn$hVMJTFwNN2RTQR|kO>g3%Wt{mcOYEi^w=>@`e)HskBR zQux0yQ%Q1~w}C`Cr7rYHmkFdoL^aBfe8%S(07O46aGZw zy3`vH<)dsF4|Kk{`Z*4aj`Lk z(d)g9CCYw1@%?zVvV14#5rD@@-@(aMRDYk7%uLhFrF?sbEyCC4*kO5LmwS1EM3Dpc z)SWKvcW4z^W&hNm%el-JY&d8lJ(+exjYlThP^&~U_67%;@GcS^ARLS<}P%IY+t}~pn}}4 zfwAw(by+MMI2CW|qay_65ZB91KMHl2kLM_c2Ai`qm1SyWUsA9~n+V#033>)B z4u!#`S=d&e3EK}&3@Pn|>og?A2FOQ4=McXOWpQ2}U1}_UiR8H9+Ft1klN&?tOd7ZG z0RW~HvVHEzavg-Htq>yIq9(8+NE9z|m0kMp*lNso0AHVN9?@U6{2XQLMhy~B(E z)~L5*Gt|g7gh7xgvOAp=iSCmeFCAgj1eZ}E^k{YD1v(U6gmZ}P$EGer&YBVl6{~Tl zzo(eYs(=v6mHB^|2}~S&fEt7DBXCq7R|wA@uM8O~Hl5Pp_|)p9(gEml@dx^e#M5Dm z18WsrK`cE^&>BQ}MGOoIzIfb(2VKEro0=bh6pASbZ>%qm8@2F-v4~IXE14poOYQ2SK1i)qm_R6EMIcRV(8c)=8*N(&p}&O1x#Mk|?s*M6i^srTl(X_T|9YKcXx2A3H*361@;;8lpd zZ0754-q(;@dX#UX(+K6*4|C@}ZbdrA#9=Nvy|T~3zdj`*f|T>^v2eVIYMJwKE7%PB 
zS!~rm^;Dr{({E4Th$M@jnmj#|5Eo#c$>{l_JcY^1itOKXF8@DX`S`!<8F0#sWB_74 zjcXK22d9x2w^~l6`_V++P~VKdME^zN(NGwKjRE=|7T{6g)l>yY@&yu6|Al(^?|jhz z<~#AK!w+39{46(`o8~k2dVx}tOX|!2 zt^xQzyKeAt7v>u)&h+RZqQUMjQRXmo+s#jhFbq1@!O0TNstB>Nuq;n~!e1gy90kxj z2Cr-&T8)}ABp^VP&POHy!Sf&cE`zrzrrtD5QxI5iG7SLVCbCEmBUIwG_v<8$#*6Zg z@H^W=1q}tF6{7XV!z0A5|Dlb>;Y5bbQqkQjL5L;m6=hRA6(8 z)0b{Gh7_=CG2`trV*!d@$WUs)D90A<94Zm&ytj66H6wVuf8M2sKR86SqJjty{uI5C z1k03C>bIAaRn*(DW)j`)UNujujZuNp9Sv|LQ>Sdz)Xeu{rqPY~KjvQVN>m|ng?_}t zFmu9m+O5N(0Gw~;yywSGMj;^^;TX>kG7KMe0N7oY=z4S5O5Wq0Wb8;|PUBR%(kvc-padv(`Nv@}bsZ)wI)oDh6?1vNH1<*J!QwXh+@6PD`{CX! zTu{Pcyvg_SjOGtq6Q3-_0%!RX+n&baLtiABtwX0Qmel+5Gt{{|I)%ZF=n6d!5PJR+ zU65dB2(^=lO+j=aVfru8@mq6%R~(oFEE^16XYh*5Y7YfR@|LcL%ou7o=wBe!<)3A6 zQA{vi6fVsANdg`QS2D*-|7jIPgIU;(Jrrq1(7663qO_hDt&BOdWdm-KV%q(z@ubPb z-WW9^7ylund0|A_aGzso$1-mpy_QR&y49N^gj=(JiH>@umxauwdo+QyqLF`zSV7yBEVWk)7oad| z=q|vR=HFEeWzo0&#&d%F3L{MsYO5e)r#2!g!c3n&=_i>d*Hh;jX#j48ZA=Uj10I*p*#| z-Jj_yz$rC2P|PS?(-(c+H#ZeBrF(9=dLYU?nrFJ(H9f_9w8;E+ zCFD=!G%wNX;}Uj#T19l4HVr_2?O|H=gfvvd;1Jz7{`4k2xcL=1j&CA89(KoY4(q)c zP~w5lp9Hynzd(Nf+$7b=4ZnM|(bCsF+FrRMg>>%t&{)N`w`yOf$V?3<+xs9G5ayqJ z#l*CND-(>#d7YJmD;>twO2=9xh4<|IF&NeJkPio97!wr#ucr8a{s}8al!Dq|H#u7H zyd55TDnY*>f05qd^X}e$Fjq)qtnm zG*5z;2uUWpk8G@((_Cg;S>nVG?W)=oaE3WnkO{h?xLwMNKX0R7L@U?mf-LF3I<`rK zIgO#$+^kv+RMdo93Jpa?w@1|sp~!ySh-*vvKEnIJ_4@KR|JzkV(vayk;=fI^bPSBX zEY&Bby#x9FT+W}HGO{|UD8fC^f7hU){j5#u*C!#;LW3I3F?cZ=3@8mu{)c(tf0Qr& zv)|)o@jrs4NnS{Q{HJOIe7xPHj`(g>7LWA2b#sh4O*nnr+m8Z+?u5+Ho!%1Z?G3_JH(cax{Qj1v**f=W;MRSLQ< z?uv<+x?nh5dmPb*OSITKwPqw|~>>Fv*OMD>Oh4#Y9i=r(@JZS$&DSbQxE z^N1MB5uRbXPkb$UAB?fjwbp|q6i!z}pe*j~)tyCnPP~;qe0T#ox+Gd#?pnm;(DIik zx@{vCf1U z=r>onf%>jP+Z=)vc#E%IF)**b28O)YbsTFRi&RE}hYf?iNrA94tsN0f>4v)LZqXU+ zysfgmpk_44Xihvjx!!QNdwC-4*^Pi`AZ%Z9vn+!mP8l;4&x};9;r%(L?PH*yvu$!bv|~gp>kMgrpLf(E9bIUn`ciP0Q3TU~PVd5W1r-(h zzN3-=W!$l87zOrXAxV_|^DWU6?Ptefn%6GMElyFNEK$O!dy8ur?#-!kr*Wv}=|~$U z9;@sXDXpH|oi?hkgwSaV*)8MzF zWu1lspi>1{M2nDajwiC*6}oy4)eigiQ0HM9+gLjB@45tswK|#+J)9G^#3B;hp1>V> zvH1Xozo%DY(Q?+B3!fK&sp~i8-`uiC!mdZ417^^{Db^?ihXk*h4Yu*;O~)Lek2KaAec8U4O)JHV!}g}_vrv&s z13BP(&}$pdk`tcwGn&NY`EZWQx-0ifrJ&34%=+?Jyune`uz#+j&Ch+^^r-uCGTm8u zwEr3H_P_r-bQRRWd-2?e6DZcQHV6^ccsQzC8aM}ydTqdeS$PQ0Fn);&gdGSqwKfU1 z4@up={HMsU>VGCO)F(VV?5tXmzCVeDwNzc+nLaH)SmX0xF>8$?uYRT z@6A3gabNtx6l+;rSo-tFOmF#XE-II3@3*I{q_s^fA!vSkMLZSN9d%5p+(yQoBK!v; zm$tt%Z?h-Mx1R#;0NHTte}C5h<-65pEx@H`H0qNX{wX4I88$rAa-aE8j2vk}JTqDB zSQLqO(txa(;?$GOnNq=LsLYCkDUlV6AlH{e_hDOnLKR+Yo#<~R2SQ-OfVaP@;Qg=!1k#eLI1JCF!uIDGE*#FH0h z@afNuA*#gGPvXh~q9>E_MEEa=)u`vukM2wlXt#>KYKJtMiQ^{oXByr(;AV({x+ zxGkY+$Fd12#aiP&L-ge>LOtvbyOs8szP-PiNaUqE-WrgrcrN2Hj7qJzie*RLg(w}x zSgd5xZ2cvoAy9c1ERh6L1l6n1l`4_NSE;YIIo%hX{c?PJv)4;=S4PShR{Y`UdkJ-h z-k0wg+?s}JDFsUJW5dRifl99SUF*MOtcFLJlbKDS=Yu@MF}N5X09gJM z?SL@ENMNpDSFnwou^=b_t<+3Y92=PUe=~R>1)o9bcLe4_yZSJG} z)jz6)t~_YWf#`p}?9f)iNb???4k#-jcc-&5D_u-qvde{+?0h8}=fs=h>$uCiu+S{S z*{8mh+z?j(L6JRAbAe})?kU&h-}&#sV&fR-+fl;Jh5A9@I!FWNOUVd z^^Jc&v;V-(!5(Y+s`dyP?25cQSu5{SFxO&1)e{wR{^0gl2ftY8**rX~fusetIOE)gS-xc=iSR z9Tu{u8f$O%?G+fzZI+Y>U*KNAYCw^1O}mU9e~Hrdn$os$qpgk4JJ($c`sk%B9u}qR z=S;E8Et56I-I39NGyw648XrpiN0%YYg?qVtPi}JF>e76CymmuC_HHei+r@WEomEqS z7taw?^rG$_v1;isC6(IOdDbeO=G#wi3!5^S(xN&M)15Q#%CTq5I;Z6d!{KG!C5yZ% zn4|I_;u;R_eN``1Uo~&L$Bz|jV?%Of_R$86mF{L8QYP693CYZNf$4+9fCBq6_~bvI zSO5$RV5&i~EwZ}+Oafe91GLHmAneUi&Zv8M;QUr2EMW(Re~Grz` z*bk|dJGf`wya3D{a-{kx9q=6}dj6p^*1-wCl?BWd;MvkeAJMwpjd0M7jEWYA<~1AA zaWMm5ME^*N1L93A%H*)H(LTr_Ew4sxu?v*tx#u1EnQbn6fRb=W5qF!WEA{oM!&EBm zql46ctTANu`NviM^Ger{IO^VWyM(qm3D8-LwR1FKu~HrI=A*+QQKJsn=XQXglg5*& 
zO?ulF#ny#)i~a}7i0V(ie0Lo`rUs@uDm%Zzu#{ebc88YOS?)PHtVxIOHX~MkLdVBX z7Iw2NFrcZK=HDyy?Im#lH91%Uc7s^0jB>wx5;{UXH^`Cp9C_0+3oKYtP zS}PCfxg&9Q-|!+dum%L?do^vIgj-U|gWtYT+CNkqNcE*w*GT~`{XIzsf*xUf5QwDCN@$mQ1jD|O zP^;_LucN*`uPD=PW!IuT=(X@zZiux?>0(6{yKtpV==S(y1*3ma1GiebogzH~iMVbsqngDUqD>xA$fq4wdYdIzC_f+Loyyl< zr>B=ev544nIow%!m{x|>hem3|3*8gSObmEm#faG(Z|#8oFj@AmkXk$1Jkm|%o_We zViyQ^t`=nBzYaYxsfQvRDr|0`+cxiQFq_xng`^DE8hw=S8l(R_d{6Q5X=Bo;(10LR zWkU0o{idhW8EDMuy|W;Z5qE@bPVqnny zyX^sa;r=RTq_fS5#Xt7qtSmWv0`2@1Z zoFAq?sgmiJtjOxvInvugdF|szQ6h3h*3;{ax3MdLX!tQ6DRuwxX*l^sz1uQvY5I_Y zxsR*Uw0>J+LfY-f{to~_8yIzwl3>>GzZJ%%EfLg_UmI^H1`)af-`NhI5lyi~X4|jv zK@E?nX@UKwoq3@@qE@gOCcmCapLI~N9iE z5`qsmJVzyEc;y=zburfcj+nmxP}KC=({Qa1H)TE%F^~?XhqNE(h;VrE0XLsj&RmwK zi!_5Z?Gq^upzVBi5VR3r=#c(a`dJl~KSN`Rez}nffCWM>fRUf-xN5*5x+8hHax};4 z(K+*_m+<}t9=yVfWuN%YNVNm`r)s6<^<09=$Fv{^ z>oN?;h4%=4jWMLb|LnDL{a+#3M1!z5wBEpk;`8CeSDc$Jf>w0IMf>HZ5?ry&@dnYI zCgq?fD;wWYdUzLX+{)aeCB!E{o%+q_sc$VJXm`EtyTwgh7Mm#yllnXVWeoV3gmC9# zNyVnqy9%&@wx*!x{ZWYRy!7PGWWmy@iEqW?1K7LKAdfzlEMF4xHKcBPC$D{DDB}1c z&D|mdqxC}{nGLC|oDlv!rQ7+t#Xgk5GJy130O?QiPuKi4fjyv`aH9XXPoe%F{r=j8 z+m(qYtQ~f(k;hXL(~Y$vXa8OMbwNK+N-Ke-4-!kfZy=kZn20bUM2|?a=3v_rzqdj8?TSA zy3Jo_7C;}Bvh-&ux+(CJWWkcdBuUpLFr;4yW#Ynz_^})K9gnQr3|#-%n^^-Bej6&? z;zT}^h%Px1JP`&TO9}Pj;}CWT+$c}rA+xUS4pMGu z81LM@dgW>V1_$=JobHt_f)5*^uOw^~f5Hv_xILz~q+s;|a$ljG>Vn=@OC2QOM2;4< zq&$plkZxDl(@id^e)yo_H#hsOTTb)k%J;QU%x8r~n(xa-xiiOjGH0mYa7Ra9K(}eu zvy}FbcW=K$vf`8ubHdNeV=yi+_`b~eGj|H8Iw8GrGe_EWn-7%l)$g&c?lGrOMyCY8 z^x)R`pAz6@FN;w-Kw{HRVtl_a_#Eidr0bnl>^MDQBsmvn*`|@>&>Y*{N-;oP_>Bn>g`Z=kC>9mdmo8Z zN1L-E?yT1;-(6uS8G5|=N+@${j&3+E^3TqQ*^SmFJ_ADBE&sxD5kIY>itnh$g#+a0 zSMQ6O#pK>s4B1p^SS9b;+^6**KJLnb`*9|3eAhjzR+vw&IqUjTFE4KR<|#jEs+UyU zSz1E=%)FkQdYc&W1+>}kVRd!q70cZ*mL#87-mBpa2o+q%g6GJfaBI(`k_-<)2|`@SLwemGxA>B(Kox z3~F5hKi;=BP#}n7U91O?_px7<)8Cc-?y^NHExJz8L(z7-D{NJb(&Fo9_q6vva#GwK z^2S64_|b|~m0F+VO^4GzS!vuw_pBxACE}FAB3)>1zt8qn%y{+3N;oQ-kk6`#B^{v^ z!(_^oGKWr*8fIz5PBdk0F4b6e-q|xfsg13(jj)m^pUn{4GICm3cwn$uOMRa9{s}c@ zEa4B$);&%xvnCms82sZn8RpiQ)5F`Mpgqz<&(C#Pa_Tj6{@Jt9hX)Z4g*H2Uc!-O>fz040mw!#X=ofH*b37rd+Uhb`(^Bz>iBR|dH z`D)(lejvE(goKbCXxik@5AfAIIf}+BYZ%aI;PAX41j%EMg0oF#f^l-k2^_+cV zMy<-@RmRs9*8AY&x@Cum^!n&sSFgiw{l)L&=05YO6?s>6EKKVg9>vUAxjVJdWMr^! zw*7iCMjWLAm(MlqW=tC5QbfT>K8M8ho9mn+iQl0gdN@~U3QIFfch-e0`Y{UQA9?`7 z%V9{}_L((Kwae|%{!f5YDfk}9w8RIfZ-i&bGi?xJmGOO37c2X0p^w^K-vvh+T}Ol@ zsT4+XCfthTAoZS;nYLMRhH3y0j3lyNc`^K@(x~eX`IG&?@B&fK;1+(elQ1{bcL^5+ z<3uXaXWDlAetO~?WYC0U?e)~?)jtq<<)z4JKo0V@Cj4fK12vrGp`*^Vu;`FAf(YCc zSh<$KmM-z!=K9+Zg05*inW_S_H+txDz9L&UFqqapk{Q<+*EXrlz1LUbY-+*kashG8 zWiQt7M!K}DNZHR;HC$0JlE0|0)QpE28%by4%zR13xVS$k*Vs49$GmRAM*wQ? 
z&wq&kBu+K?#l6>q3%wErD=z!S+u+s;Z_zJ+2e=y*JE%KI`FzLrP5>BqTY6ZJ^A~ ze&rbB|Mo$CjB*^wNGZ!K^+u7hgSi`ET~T*~AoM_i$8{nyShyfIc>%`HS13Ty;4&fn zh*E)0-M`Cwp;pVX1tLb%rf#igc442MWWRsxmfhA$q@r@2^TAXhWIaSLl?myD@q$Vo zf`?E0R#i$l^o}-j?mZzoU9V$g+T~mt%CdO2?dD<~Sy4Xq?$SK2_2k=ozqpyr>EsS( zh^Wwp5oJxpMw?@lvlcTT>XEfFQe+En?G|N#_Ci0qF~Br8pA|G;!rqRZ(`*nS!hI-$ z+_Ef=GbQ;W8sm=1Lzkb{@ga~!KA@npY!9gB*3Oom*;K@@HPf*-yBb$pDeu3_-Z#fQa|>q;x+Awkw=kdaJ#M6(V0}-oA=}J z-FRC+w~gBQ;sVO|3qvDK_m_DD3xZ?An1cQ+md;q4?OgVk-TL6QQGs|KcT`wJPB3|{#zC=Dv7v)jw_oUG{ws8v%P05dN=Cq=jyvc~)2|TLn zu=r*0ILoqU+FASz#UkGBtR8#(5$o-DEZ+fQte0~Ob>?1F)>ecH$hR5G+-d|Lx#2UD z8OQ5Ics>2zz4)}|=O$rsjLMa;KNOcZ=W)-lXUT0I@DvF2@A;+QbAmRMsXRr*qK@j9qfYTY)N8Q}#hD)nZY_{+N8w&p zrt-XKbFpy)#m8EhSmdz;q(HxWp7g{LIE2fd>!*+0FmnIO5;hTu8LU8AQd*w#RqMa% zr>%6UPMa&}bZdCMW+5ym2fp?UDnaJn;WUzA;Ud=eT8wAyDyjrx@BHimpKU;Lt4&eR zz^-VKG8lWP9wowPgASvK(BIZgs&7fHd751*P-@2sT^RZGRXa_C`h%Hv^9&oQ{u-mr z^QXG>GyPKLrWcy|w`SMNmq%BkoL}Lp)N78kgsQ0mQ%N1&@+FNrR34;P$WI3V?{e zv|VhF6Xy}Am3~qeQJ$r`ero1BdZ?t!TQBAwBmC!~WX)yr*bydG;^^A-1k(IF#1iWz ze%EJZoEGt~I-8YkORkd`DWdeES}>54X4{hixRK=XGDTirMYfl8pafX(9S@II4kPJR z$obx_!Dj&lADE3O>?=AgN9mJD{VTk>vR#aRIas9%-4dq0QGz)yzN4Lzm50`Y%=R?a z9MANH?K7OBJh9K6?d>NiyXSC$@L0rhAHB`io?!x?m{EfhOb31q>_S#`bo^`5V*4k0 z#IfQ|p}|K_ul9pCZ$=25^Ie!1)duV(tOekr|CqCHeD>D>!6xlH>UnKd{srC8Q zd}qDS+p*Jyu zg*q#i6A$*IE+F<{0$(Cr=Ii^Kr*Bs`ysb8(co%k(jHkgGFV(3$dh>N%=>7TKp1T3Z zC8(06`bQH^!Q634OB<5_D5==uCdn=uBl=)Oy7m(<)3`vcY1OamiS5j@^~oBvJco{o z9e#LHrDwQ_r5L|+y=SX#>9O9M>>Z5OnZwEg3{tazv`GSRoJ~ldmgbhSFddZ)cUjKw z<7gL;Q_=^}{y%v(EW5Yz|5^)^ND zk)2+pH$6HNp35_=Dl7%zJkhI(d3h%JiXfX$YTvVf36{sdfS8+J9FaN46_yDTH>I&B z&?0mM?oD9k1hd=T^+=poKHw{2iumd%_W@N?H-519tp5pr_kiiZjeIW}CP#(hA_}^V zH+GI?ht}JwPkl2QoTF4&9h#Rb;;N$BCj`p4Z5`ft>mI7uy$QX5a=xS! z)f8K`Q->><1nV#m>xH{xwr`l!SY<^q$PO5MwR-0nQ)BUsB8F#=} zH>WZ-8}glMMm?)KYNY~lxPgs!N@0hd%!{We%2I5&vFExhpxUf%bHf!3_%UyLcOvuq zsCdqzXL0Q*@VT1emBpquM&*v_;ZB4n7F^e!dXbyc-S^~*X?p;qM>X?ZE4 z85OCB+OV4I?_c2<#*tykOQShe6+;W;f`gtNJ|bV>A~s6{Ox)sJs08G@W0&bhD}08z z@SY8)ONlu!k8QL$mFVKtavo>eybu?vITDfYPd*5C%FhqhO$=z^7(X#Ie@uKd9oT8`8I$ z=Yn()PJS;rxbpsf4f7(DBj0-F$F*9#4Ttu)NeqOtQGR91AUnfP+FitLz%%4~M*d67 zVTdI8$X%ayiaL&n@i9Gv%CuHC(-Wr)^tHvh#A0*)GGK7<#OUOg=R0fkaN`~Amc7YH zz1^0@4WsnT+BAFds&z0WRg1~N-LoZuSl*p zrSj7DYLB0ue1VVk zT-mBVxE5G%BqtS7+ujx{Qb6`~n06Y<3-P6F&2mkoEOB>Xo|0t02llvY85M*u_oDXSCmB5$~zQ&XqeD)87a$3 z5!{dz9~(?~q4Gou;uiw6I=_#YQt#o+KAMsH8Le70yz$E5`A*H@L7Y?M2qJPj*pn10 z48eGgYfM*GiL-6x%bB`%CmF@>}u8(KkFam4~!G=a-ObJ&oKmf&7{rcOQz+ z=88hip#Xby1?FT`P&(k*f(r7xH#L;I<9}3hYahXR)pYHtmeKfS4ypP60P+b^TNNTN zQrOD^;qq!hrNEV4?JFKuYW6+EliSSJ>-tdBC;cEIM)o$oAH!*WkdSN1q9tGXb&g6} z&Krv}0zC4%e;vU}HoXm#pObFPSF$J5-hx(Wmj&DJLAIwt$q-7bXY@78H$frT+TRvs zv^ii#@ol9MM{g)Jdt2s{0s5OhOeCqNxKCxQ-IMh%Q3{05q{CP|W18EctH_=Wu!5G% z{+)?G*f3>eb+OkyG&VO)cBPYR zPng8yChvGoizoMs-1a^zk`(V*k2Z)*5E@Nr)d}gno0qDn%g7!~W#Qukm5EUtkR`LB zRw~%!bSE;Mg&s`F8T=m8QeTF~bpr6%*4*xg&ym*mXcQ|8Uj&g?3t^+LjI9B0N{s)} z)91aqMJd3Vvul;%00_op3aiXqU38TWZLgGn0iE|3B2FodA~fljnIk0NMRM}kp;n5(1g{e~M>fDJaGnqaCILj? 
z;oQQk*A$-V&r5M1yFI{>Lec-G^|V?HARvL`%Z6R?74y{b z-aulL`lYvm0Y}F8REqGMYJq8~z|=O|>EG%TC#k0fu;*toxB>U^E^M@u(_Qa8P&5D( zqT^Bu{vcRTH~tcNy122)Ye2gcFmLU@_xb?7)~^6kVW@pQJ3I^HS^P!p6kFVTPG*?#-UdK`M2>mI<|H0mSM>X}o z`<@^wf&vzL3kpgH={*q<5D*a%A~h-?y-SBgK|ra}1(6^fDUmKUbfot#CG<{04Wu~x zd*=SmI_J*0Q`XFyHFvG~gPK6H_x|k9)85bP`Q{huFW2!#ba~DzHlQ2$u|G(}TY1_A zxJnZLex?6i=kZ@?7XBZ6j27)1Vf zBFXbL`LeU-62#RmD*{Wuh?2vnEPf$+W>qH%f4Ej!E5xy1$VKO`&Ny0~3HQR=p3q}U zqPu4H`Ko`lhyFZ?OJA>%IYNa&Z1uv2tRC8|xjL<%(x}45Ei zt%4GW3a_L~@xrd!o8TTRb`h`|P%TD@2N~u9H-Z>n8X?6&zE-&0D=G-n?%$Okn+F}- zV6Ayq=6>jja5s$AG9-q-+MYXGED1FEMD#D;8JuiBg%nDal)a zBj44bzt5i0fE$r(r{|2~GIrjL@ySMu1#vh(p&F964P^Q(eyANuw zC-&cDWB2BKT_O}Sw)#L3E4c79DZz%1drpAS>)f#LA8OtI6w`zL|M&SnxK6l{R6Q=i z#}r4Ur-0w|*%}0V9*TCe&Gj*58%SlT-X~p|toRJ*K+fR>OI^gPV?bSAc$75jOJdrJkzG$6s?!7dsQR<@D;4TuKu5iA>lkRS4wxp7-sVJtum_MnvS4xBPRUW zUy6$xXdHbJPP|`Cj zFMV~U_uu~x|A%(U%V>n;Ip!}#BQLTq?KJqo#f;!4_#2UqSkJWcYoU&E8-CLv!|XE` zTqm5mNYc~?YvrpJ@pTBVkHlyfyph~?^}^0RBtETYjKzl3gYbre64+1t(1FV|%S|{lfSVr!I?4d4YL@TW0yYJ;F~|fgn$v{U)_>^> z{=2UXYIwb6EiiYW#*p&Ms7L-mBW%rE6Z;2GAgt5xPEiYXxiNMlb}y7YD0-S#uKQgC zslf(h;M4G4ZXkgFu)a7E&3E9JQg>g`xl2KzhC)Mc;?4)8`Q&1goOQ!k!P*V80MzhS z(>kKz9GH)7YLIolg*H62Iw{0z5^sapjuV4blKqF-6FFbLUPRuf|D(-GxfyXbf;L3E zpf4gEe8nH*=8FBkKKdnl_!+V~zfEy7h*|v^WI^W43YO*Tmre>0%+^Qy3n!g->2;V8 z2U!$b2_jFjs{K)4^wkw^ta!!9i1a<6xE8PX;8IZ8DkvZJ(E5hxqP4k7MoNF^!M0*VT&~!~N ze1VP#!BBj+`B|k3R^<2u@sj{T&Ec(YwBa56OK3I*LAOp8^_<7QycRG9G2!Xyn8V$EV{*(RaH}a?AcpOgeKA+#w-nLpvaU~FDmjg)jxiw6O2EG&2ov@3 zskZbLOb!0`;$&&f7ns$6V9sCQomkt@=N|zwY#GjBEruskKO$}%l9Y(SVeGjk9-~m( z{zT~JMW9D?^+0bX(CtyXd⪼)zkh?y-iCs+^rLFzoQAWILK-{^n)z{-Xxpp`ibu~nC*#E{0*84Zdld}+Sg8zw-UC9AM|tFK zJt1jf(fgIh`8Yb114*L~1JD~-y$#~`6BsAxzSly2U=Epu4OHSQ*RD5By@Y=6{AS$g zPGq}(pn}MWQks`4QJ0Eh!6FnlhD$pr|4VWG!0Gw8;)_t9IUs=A)Z)#m)``fg&7}ha zpn)^X2m{*qD=l1w!n+J20$DZ6N*LsI-x{MxHS7RVHXi|*3%e*bwf9s}V6Th9M}|qL z^M>cwa*G$s%yV(!>OpN7*p%VYy`n3>a8FS--!HS>;Bl-?Jil)c7u(YB&jv`a5UMMS zzN-!Uu$|$W1Hw~;DLzo|Q!1CBhtAEf2L_J&TGXGqv;VZHIb;7%Y%;eU;6gQ=P zE!SITa+8}r=A3DI;^qI+fe875T>5H-7Nr$ zPQcxxy05#qEohbxtKG(P`li!4KsutMA_TG;+dlu}d*>J<_eAEi`fC4T;h0OHARCC z2Xz3M5fng0B_i5fiFQD!>klJeKm*Ofs8~Bx#$O7355)PU@IL@I%JDT~eHoI82s^V`CQs1q#E1Y85i{if&{NP@Mf^^A)> zZc|ZYU;m@A%-W*>yCd+ZT>JAdx~KGmYP@6)#Ih#w5Rf5ZnSgHPRY|>BpJl`BJEdPj z-&2MbSqDV@*5qDldXR_H2uXN~GX3rmP8xz6*TT4Za}?TQHtk0*wm6E5>TKH51X6V1jznjcEzJlvG_icF?nxU+#8V1O814u~NmhStJV3 z&$IUjF))8MCI~Je?*N@v!I0{Ey}^`djJo zk&jULbWx;a*Jvwd>$L;%zAiehgjOtPjJF=VK}vww){E0@BqMJI!rp+|6@$-`-FqO* z19M$HvwB+AoC)L4Ll^f3C%_80g-|jl-XFo}Ta7iDY6+$0$k24H=X6kbqw;Py-i=ab za&if=D~0pkUP89(R92?Ogk;svOm#B+o~7qQa!K)C3n4XUS0O~K{apAXJnhFjv5g1VsD+dXuDb)E*iEQ zc>$#ypm~&RC8->)Y-UcAH(pZune)uUr1d)Cjf@5m*7nL)TAhw($riV~O zKs{J2&-GKnnipsV{*5S2sy&J}SmF|2@=lh{{~Sv2%2=9A!6Pom>Aw{A zgx#x_q;#TMk*RrGfUUH5{H)%s;*WhUS+u{~c-G~YtJUFR6Q3u%-sgV1274+!%Ae*O zIC*d_1W#<(o!gI^?0elH(9*@B$?-m=VkJHyW^;N$*9AG>5OW@=W;Bu{RFt+^lv)Xd z5+n2iTjVMEBM}USiJNb9&<=TCrp$7=PenydVyQdEV?>uWh%>i4ngIb}mX7@{2sx zB5#MfIzf%~J5wzc;e%H#6uieF@%C~DQthii;UrS$=Tm)F=S9IzfPqIzO#?U%VOM0`sXoY-HMpT!q+zJ|UU! 
zSZJzO=+N<0gXeso=*udey}WQ>m1pF=9;_au0~kjjui9Tru#G+$@b~kF>m!uj0lu7z zV^^vMZFcs@(!}IVeqsNEeAm<_kU;x*tn}^txie(CG<3QNO#0Nshc_Lt&}%()^UauN z`ZN3}`qDI2>C~6$o*Ja`|n4w z`RX80*Qo!B{VCjbosQ|W3rKsvL^4l57{QyhPo!lR@b}h+?$VrGGJSgtqznqdB9l*_6Rx+PYisHnz0u;;;0?OJ3Qc^RfN|O?d;zUQM)4#v!%Sl(QnG`a*~hJoy)9}Q;M$Tl?glUS;2@uvIlgVyqTbrB7<+C;^@OFA|`p$h!zZeSBs z4)_|ppIaC|{C(axR%>kMXq!VP!R`rRMk7g|&q+w^KGS5p|4HFeflYZUWY;V}xMUx# zTZPU%IJI~VjBUe*6CFu;fhIVlb~W~RF0?5j*Q`(M+Ovws2iloGQoOrpSr;#H9(Z*GRa*%WXd%V^aBO|sqR?R@agUs>zJH+dS2uqQwNCVu( z&%UoGo?X#N?|%FI5P288KRo@yhiUxl`Y+{+Oiyf#E}k^)54(^bz51EtsAL#D7`uTQ zQ3Wn1x#qxaUI8ITSeR~eT|J2+SFJWlk;2LMHkc1A@s=l}?e3ZhtM{BsA9H`YEdbAW z6GVwhZ}m-zH%9VF2ts58P>ne;GpXESLVKVDVr-;c5wseb=Dcza|4ub)VWy-hVj!rg zsTO8z7<@(fy5p8!IHH`TUHs$x^`E$JVY&G)W*k&ASogSLa<9vD`)GO9DqL9s$u>0_Z zxK3qF4BNwZyY_E$%et@Ld3kF?(*0QGul6}*oZy(ZeyKhF4e_TCwZaM*g$qiW3wiG@=%;%{fnu& zz9EM}*7NiqV3*{;3BnQhkmEz4$c0>{eN)AtuI1hT!oJ1_`YkrXFbk1Y;+GveWt&D; zuI%0r<+%K8tLHKGwc;zksWa3Ok8P9u+rA%mL)+*m$N<#=R=Y?@Pk0#{Z2+@e3T?FZY9E2pIqr(_zkx63HeHHB)gH-GUK z`CzbX#c@f<_gmb8MFm8T=wggN1L|MQ(1nKmmyf|IDa{Hfk~P2Zs-W?kF7iV4V51Rz@= z@5sFP1u^Zbose5wqx}Bu=MwEl+)dP>am`q}<)MzwbL_Mi}#kDZ7@TK6t z`I2n{z*_)rD=Xsie?!%Q)>r|AjAdMLF+%D0-H^Qyefje8*OtxZ_iCukVh% zHIx21a@A^g@ zb#5`fbu&sz1cGdykg?}Rr!*Ol_0?T`ecCK}a4tp6;h1k==ci;u{k1C=KNS;VB9zLS zx%$<$<352Oz{mDR-B zgl-ZnN-tBy7-;E}mtFLCs7r#nt}Hi+C^tg4{1lQz^I~~Xg%BGQM>IpnC|>(Ox4Ld= zO(cVPjYpd#SE^Y{E4*4)WfkC+H$S^2E6M8OY;E$|3z$9c5qET_sRs1HXrm`#%dgTw zUP}w;7lU#wf>=y_*IN}`Kb5%A{jur?43o$jEcp?UmD4|p`*><{xk~VtR;yX|<}%;- z5Kw}k0{T_|ON<_U1N&mtLg)g6vQBMPEW?a0?{OUO4Q+V2W)<3W->yV(BAsh-2L7l= za4$CH7(K1o%tFW+-WQjyQDIcI-JIO8qoj?xEvOMQ=fA>pEsX*6HvBMYp9-IajPV~+CF=Ly4O_sB^=bnY zNqnzJ4qiC=CZwlf9V};%EEB8Ff8_)uB{$gt7u2gQwiF*((jyDy!PxLE@V$CCIK=?q z(HZsY8}7WCW=K6%EL%o%0yAgnvjC(H0%s(E4_!VcG~FBP8aZ(#i~Twmf}`}#pno_I z_Pn!~^xD-4O@iZ^B(xsT9WmO~zW<3TKcWgoqys4FJI8IqI$M?Kq@cEw_&uMHrIhyX zqaBJ!@+Bal(_skASo;)0e)A2uOc!#1*nCkS_gEYDyJ>}c#A5Ivh#7t__AE^XM5ICB z@?#^Bx5A12?BPKA41q%_W~Gd604Z3)XHs!}Ml1k04_NXA2;M->jr^BFMw}?d5J5$L z6R=MpI`(g5Z#_rE1^re5o|*(aHF7DIbnSHE^ax0PwXWcE|5B*Q?2||`Yl-^DLXdx6 zerVeY40eK?e#RE)EZSxTS?SH83lZe1! 
zgm&X}$YG8PgfmNfb&x?eF>+6J*LLo|du2o`2A_pkvjw6!bY8TnS2?fmYbliESXIW>d<$ zltN5IF(IZ54gpBGjDLf$wnU$UKv6&>Kp?Pwy)wrH2adYsAC!UW>i+x%;1NdFEc;wQa^aju#2q1bwQy!ecth;$ZWH$_ZdB19 z(WHikTNKu|eh+wpji6LRKb4o6@Lh#lX~|VVvW3MKy+M9%9XV;X_I%zd(_i8pPiju$ z@ajIe5X`pS`J+f~R9>IB$Z5Jr%#WVoTSKnKiy2gu8B(wa|I&oQ4b$k(xZ;^^n*fK7 zyJ0p08*OEVzfs9p%NLLWUBjB{`n7!*r1-@8_YPWAwfO>r-sIZb`D^&tWn_3wFrj5; zeBjI3wQi0zK?xH6D`4udcV)I=EC_qpEut0nO$r8beY1LEZs%5l zuAlCL2zTwt>?2|g0oXc0=7IhvDRuUu{3+Mj>|edPTa7Y)`1JB)c@9S~7tV(&8pWw% zr|6OVsm2RFIkMo@&=rlTtIS+y=}Vc12kNW)$F}T;&LBJfQgE2zA-c6XJi`r#f8_t% z2~&1?nHk;1M}h21=obILMS@!U1AiNu<9m&IJ4l)%woBm&S@t7x7euRv5_W5V_R3{En*C>l zi@dpVUi~iFb-(PW5^tJ|GO&k3R^U|nu!g2PcwTHsC-aC{3C2QE&8=bZqVUBol`F|y zD1c=FUESZkOHk^^+CHn9kk-m+#7^Yjj^R(>wrQ9WUHPp^mvk}vrS*j?zq?QI&~_rd zt;wreb0L+4=`;R?ao*Vbza55RD;tLoR{c@U3uhu3GysaFF@+p$x)dmA{|d1b&D~~n z+o{x5bU8ht@=As}z31#Ie;R3Mr0H59|FG+Bl?89BsdLHP*W%B8$-lX{{EUq|RjEJw zPc-qk=W>>(jBlxSk442-6<5}5J=IO>X{cS=?HNYlq{v!b5x~BG3y@)EaifuG z4ld^XDx#xf=f&#+8Ta(v!rwxeh|bvg&?P1L zV>+>SNm9V4{L1E zf>|!YFansZiTsoyGWzu~FxmH*?qLV6j>d(vli~hgUosAb1RZ^y*I3UX-9R=XDP{xC#_Rg=1!sZ0#w%8C#)kTpek~AEct| z>BOYX#OR=tTFD3n>i!k6J4UV|^q7J=Bv2C=%rzi6Z7@HO3RCU?L=x;cz;?AtZxt|qLc(B2pAVK87RYm=jeSbOH zyZOw=)I+o)n|t;8m4%HS1bx+^uPZxtepvHXc;cDvb1B3REgST-{%< zBQ{KNy;YBG&R5Osj;jcb^B$wKS)>mi-#hDNd7h~B`@H;YU}sqQD~Y$;Bhe0$4c4W| zw}Bnn9v{-Jt2DPYcX$CrPPh@UoL9TZ31-kx_&fj-R>s?sTL%A9WT!Pwo&b(uyxI?- z3g)p4n7ERnRpEo3DbuS9O1L68%2;g8@WX%l4(UE(b2bi|EMZ{Hi)xKZ5Tr=WrX8ruK-R3-Q* z=)j`k)n5vjBXAGIqrW!X3-+53+TQ8?OQECr$DPCR>eIQ&l|i|Yz@X)CmsiD(VAo;O z+8?eDoA)%TJaD?P<12F>LJ)utrFC@(>^FdxHMOedWN*7O_5Ao=Tl0shChOhzq4EQY zKMYTT^;V*TN(fi&QfkmlGNLZhuk7#T&JH{>SqFn zLT{4#>ydGl(0!ZMKo)EwdRoq^?OggDmP{<$T z?C}P8pKA9Q9#*JUrhBpR1whxwhz?n4UO&sDKxX;pckvm$7<;q1g0B!7psf*M=Q>^g z953SzFkPRVBg4cNdSiU325LU9d|aD-=Rw`GNa25-Uh3c&UYqy^ALw;u$*p`Xl3YxX zTEofPk9-YPe-l(epaz)3W)qMdY6v-T7CH_AMBQ<6z)Frr9r&M_oiO}Uy)8ye(IZHL z)u*e3k={qYpT8MEFR96YQZq(_h^5AZJ=T5g*TWS)D6 z=JotU$5UqbY3~VymKQKy*SYThj@kg?Js=eYreY8=7BUVd$AdNL!oJ!6cso7|evyUt zB#IMM0-3Q!bzec(O!{!*{qBKTTMq-OH)}3I_5hkjX-y5@#E<_(wAf$z0F<)xySfja z@d8-&#CUY{7r5=t{gFhd@dz=2IG7VoJ1|ciK}>$fUs8w9`!1hY;pLVc*C zf3R1%VrHwM(!+BDJ)B79ay*b)>T2Wi#PCP$HQX+VlS{r>eM&1^(or9O^MRy$`gM(` ztTz&(kQEs5>ux3u--NU}vnG>{Ek@1AaLtrd>v6ZwOsUhFO zl9CfV75((?R?={kTg#3-7?7mw`iV-!=|l?**A;`QUgAE}tk3__ z!#`WN4B5~gL1URASX9`e>rL#mG4)oGz4p`HIWtt@_M2%&98*w^b#nOP?prbo4i(wJ z_^x>E!Vu6Ip+ug@``m< zMb`Nxk2bEdh;y@(+gMOA(MB#xtidd6#*raE^3pqt_6ok&u9dE>zl0OY{JI=Sqe@Gh zDxA4%GKGh#o#H8amE#Fb-@L$jSA)}W+?|pY^^-&Rg$`~7TuO9a1cJp8>R+!K2by_f ze=(qL*TAiI|DA9mY6v;&h}nNnj915obU;`e9!rZe@I=hhq5S-K z?g`sXVa*7AQ%cV~x`r-jZXw$MQjreen7-eAC1l-CQ8bf{dhtbz;;js9W(|s6p~Z1i*Y8z-G<%{#kM~Z9XQp5hN9>#l;Rrp zf&ARTH+X%*y1Y}+zqX7?jlg;StBO)?1;woqr?Jy1Z8Q84a5-OBH5^l-X&6$W&YmaL zM`KOc#M+|Am*;VX%r11Rj_4w7D(!|HO=X>uXei=E%*|g9pMv`zxP^gZk1>lWu0umAL_$Ue<^6tR{P;iG!(${7$cOcA!Yfh` z_Mgvx{CZWW_{j?`bY{1(U^UW)tuoO#jJGsH4mvXtF5$vYG_efPNA;2WmrT?!gV*m= zJaXo`gx`XSI%(G)g1{OTc_u z18qMH2==@PDAe?AoJ6J_5_bF)VxWd}eEAhy`dNt=9#dGSksjfh2wc7%ZI3H)avhGI z1zRaX<)!KE#mvnI%+GSz_`2pV?ovm|N?CB~b@)EXiztZ**h$_W9W9b-wi%a@T+KhU zT^(m4%(^>bZCfs=$R3DNsV-iMP<3>Ata87kMuFrgWdbZ%!zFO~h`AlR1yOQUeLOA4 z!*(VhsH7`zCjh!Og!>WUVB}W*EXW19k+Kwk3eBx_5C&X?i7t%j|Av-05#A*pJ^{K) z=HCWi8_ADf>gj96D*uvnO5;luzD{x1$9dqgd)go#@M!J^Gy~02oTG@NJVlZoQ&j%P zUYn0Oq-oWvjpet2L_vAzv*gWL9K@aI2jnaM6GV|57HDPfiNJ7qbPh;~oHENvbViPU zTY8xp5bqA-AhMF80%g9NsFnskk8)(Mi~#N2j-7VyUUy7_G`cVF+vWD{O9v|kJ!XETeU6ry-yxO)L|p>xk}vbXt-NVFMJEXXwlJj^Mh|wA zu8uUgF8S=#U~86`kqTv_x3MRssuGqjI?L$r%l)6+Vsefb<+5oB$}G{p6vm&S9zKf% z{K;b*y!pWvw#0~|RSW%#_&lo)!G_l~!vgL%=n6jUHXZq^V9WyrW7eL!gs7szYx$hP 
[GIT binary patch literal data for asset/fig_teaser_new.jpg omitted]
zwJxoOi}&pFC^YPcQ~T#pD!9*s@vJimrLt2#rHzmXUKLU$qSRcx=I999rEPoqV7GAm?P`a ztx7VSU9REuqTC_&pSMShn6%tuS-5_sovoqX>r~OYi)!s+A+|D$hAWCjmM@u3+tqs_2VsA;D%d}dGF!R(;G#}hO z`G zVQQt^7Li&ko@XSjB2BLeD~gFU|MA!vF8Isb(J6Y`dB;CUBH^PS*&9=Fz&%~4FWjDz z_ycArj&zcI*L`|*9P%X8)PHEUG|H7uOol9*GDpIkI1AiH!=MZ#A0vq@joXzqQJNp` zG)=Sk23}i{nqunbjXh{o_cQ0N^NUMX*Bh2~jMSOL=d1v4|Hxp0dZ-#bW!Si^a?)#1 zmp$U9UE!Fqmi1K?r|40BH~+R#lMj|WtSv3G`Cg6e`=6*suIgWyUd^k!+v)9xx)Ncl z7oEhLP2D#iGhvF; zaIRI0PrwgUjyjBPxLMvfQ9R;qwg%!{f5ry?Qv~c@0x6MDAiUEqmN`&l(=_nCdTg0< zC!`j_tH|K-C&Lf8xedCs`K@57-i$5_+*p@-tM^U;+X@R9JA)`b@CB#pI=!bCnV4rB zk+E-EG?`z7yy^4l!93koAR)D=51FMXOIfkpl;t>fU9r6E4P53vNl(CiWY@{1Nh8Ut#2SN^hqV5O@6>@l${)M~9RLy#s1YtwtCtn6{k-T7 zF;iB^Tlv=i`sELTF>14Gc=@alqJNmfuPhD5%8S&lpnk0*_qGeGe{HN4VUOL3m%UbH z*JoDHAd%sC*|ME6X_gPn7m~9{tEvio_f7t}HP1PD?|grg>XPxAMU5NR^2{*$6SW|> z?BUTIfiWTNja1l3SD=Si%y}H{iaoK41{%h!C4$f9pIqR4rB!>Xuhcb=qKfDXA!6$X zc{#uL=)^c!dq{F=NhmFP8cj$)NIfT9AMGtai$TT+V`DHs51wDgja<*9jF$SgdsY^& zlX$~_*lolYy6k%lgUl6f9D?cm{twA`a4K6C`>Ox$fBn1d@FZlhM)DpkB*%l9WAEBs-5dii(2YMq zT>?6PNfmYqk! zN2E`$%{=FBL?C4Tu`sqTq2hnbmAheAtzF6vLm&9)twxf(bCC8VNc~8ZyzQF! zXWZmgzwBzlnmtkB+k8yelE10h09Yim|EKdG4bB%%gP}d|JnSp<3%)2I=;uD5QlH+* z9}!KL<#1aQ#FkH2;MNuiXoIoVOy#!`RD(VR#N9@~%#&V7z^nyNw|;kXK&;?F7iYACEjQz;xNKUnwOnVvGKisx&e~2L+`G6>{p9E5n z=huGBEk~I%PH=z1eGCuY!S>mS{lYYQm2FyYfpD9%T_UTy)qe9Wlx>GfORwkXL>%(S8TUF9!jwC&g1L2|4+4ppRFgUKyYj zAlsMce{7&SD#Uiok;B^48Ivm;?!vIKlp6pNb$b9MyB0Ck1V-TaTR+gMroQMLc06;1Gt#{7wr^Y&)`OMpwXWU4`p}qMe4n?!1a= zTFqzVu;z^MPI;fnvJ!gmvi|6v5$Q`cS8Yv9>8`55kEFjon-4zgm{t?wH0(R`WtlzE zUxIm;{Uo9T$=e8xB{$wDwAxAzr9Z;av%mXV(F(sMMeI|;$-!ovJXF%V!(uuAJ3@FHz1S+zBumUWA z2af82-pk^qBbql=cYV-p^w|a*tONLr+h3o7P!o&62%{ooy@DUW4t!yz-yZ?H8G#iM zCb#!sK9IbAy5T;rfV!_I^#-~LMrMFePyY1{9H9%&iCa1hBS?(?$q@OAmIE>jbRnh3 z-yE}rD1*_O)@4#8$$7R}VqP!6mHQeG8GpXx*iWpdSh_+C z#^408V!#3CnvnP1**6L_&d4H*|E_2U?FoKeEL@Cc^71r%Q>y8UF}{Zxxf6@aN4!K# z_}m!vQ|poL$_hJamP6S+;PqyYItB6@;qzJHRt@52{hYW zE+|)0>9XG=G`K&(Gsv3Hj}wOyo=yg^!L?hEyK2Em+3Yt}anAfIUv|c%c7|xsl)2-y zCicWjL$FPl$U;gO>jzAO^K8ax&8YmQuOD8s?TOJ}VEYXDxjx8q)w`ErLxf-DKIxn* z?$XfGE&g(A@B=N3t~O06;(JN5YJzVHbM!bAL)XXfLtwv)zBS`t86f?34F;t1EvEP# zV}*OY<`d(YC(o(0ebx>5#S37I!lOP3tG}kd72??!llPX~cPScqsaX&68tx5Lf(^gT zgN>;&zC>gdBum>in{NW*kxShGMkQ$%mvR@|2dG^B{xCp?s>j8}Z6j8Er}5P5+UG5u z;7U(_kY$}8PA2bbo`1nKj0|FKBBV{Lv{~zhgyP;6iSwhw2C>(tvs#4?n2AnC3~crL z#x)BCWxq=9G4j*K&TMN#4!wS!)s(W(y=scMnfb#jR#E3+bzvt6}P%iaDM*!tP~$xZ(x|@^{`m|Df>XS?z?HM31 z{?1J)z81lK`gR4jtjV|x-=qM<<_J)f_EVkpv??I>j!5Lq#hU{bhA@3^K7j!&G zrfMI_k~iy+W8of5v1&_Lv~OB<_RD)_|E3aQ!}qTb+Mb=JUP3Rp|3<1E8E?44);zNx zePdwBVTC8B9`=HvKSja>JkFGBhfw4n+(Y#9;d?X@8Rs4qAk$JNCR{s{`H$+$-raH- zPTYCJP@6)UucM5!EuHu_2=OuF#vi<>lbZXO%4hM-OAP!=hHAY%RU!9QLAX(RjEx*kc|rN< zhR>7gPTi#eSO8(l4J8iFhKXyyjw-GM7)_+I`1$K#G?6VBL%<3{ZetuuscvS8x$v+8 zC3nO@MQyo+ISunwb&Vo$Ja^WZHYp{?$7)!8@enCmQRBJ5aXm`S{-Hmc3@L^(X)|v8 zoZm4~7v<-o-e{Imy&QB=Fyv;IzjV|sud3DG%qn9kmClT?**UZPEzBQq>1EjVtOq!BhY7-jN7&tq#+;au9oE}!M1Hj#pzpy!J&TvYG5Pwa^AU&* zjW4W&z@gV&J1GP+<^V7`35@1*Fn`1dcYt>V%7Nyj7EE6#r`f;Y8*UnKs;qivf0nl( z>cHWO`>R(5%Gf1*m|%Kl=lIcl61*0Fba=RlFTkK*>*_2y#Mh-)f79>X-EUf;r3u7t z1#Qoyzw**sD`d9TScoL8h*){^#NzmNGdJ6VOsqDSkG^=)y&tg+-dP78Rihd!ss~M; z*bVI*U$*TA6?jG+j`M2tC#cSEan>>C)z2K8d#iPr-_hDWNwt2#_^1n(UsTSV868u!P_7)5WpCG!}4e22UCfliOC zP+C}wCHMn7c4 zSZsjd3f(Zw-G2YeXmsnBNc7I*6}z4F?2!PiSy5tWQ(iFSIqdBR)nnX1GQF4H$Z{ zQIfj%5_~@Spu@2l{EOIr#&Y*{+}Put4SX{ zY~A`YIfJF$jwq3E2tu@vtLWb8%Iqi{7uPvMiMv=hXwC(Cmv;glYy;DB!W{?Rr9x8@Z`SYK;e z0&HD^yoaWU50TkIQVx(BSgv7}4$==rzAuspJ4xD4Q%RzTc?1v9#^;(ugU7Z$YD6zP z`byH=Ng*=wZx)CwXX{5uXnTu`uQtsf-{H(X6%-=DM4u0S1=d9i;t}8H}0v 
zz67J}F9qaP$~KLkei=9fogm|wK@Kd_-9%#c>ei8ALx}>nSx;(Mri{gb&s2V529_mcD>o5T^kIeHYd*+}?hBg;{t>y|r1wyU zEPtA0A8Da6_pfXfMgil#5h{>9cdyq#TDjTK3+1 zBS(gny(acd=`lTIqDq1H1(|<}5XR!8Mj0RRDIdJzo+T0ab3M%QjhMcWTbzXG=VLx% zqB7eB6c6TSxoTC%oeSlWKL)9x2KUtN=;6-XbH5xf8*{m({+yJPcx>W4?CJUer(`iI z%#&#G5U~p3GCcCr5S!GVcrWB$_s)&`4SwJ7xG64$j*ZbSbzS?>*e4oGJ($DBS$?SE zOUkmv!F~{)YkCAc<^y#bku&iE|Y6wM`r30$V1J`MtOz9JptjW4mI?dsUTGPkegb zvKd`|p-5~`j>1XZM&{|aQ`xvw6&ZAM8|VhbJrGxze|-zA!lGAu)@$rNC?lh{UK?zs5=177Z|*rZ=9#2sb;0C){2~6@mOkvx&)j6b0L}h_3EP@vioJ9 znwG_97u@P5-)e4{CPf81-3@>YBZfFwH#opx`fBh<%8+xAkxkY$7N{ceRs<%O2F;%W z%Eg#HQ{6l3gsnWw-B=1!S8zTkO?(!KtbvsJY=Vq>KPXN&EI`>Hf?$@?h3i>5(OcLB zv_Cj;m={n2bZ7cG9bp8U=9zrqz=WIVTyI5KCjJ4LBZ2{)y6{rdByZ5M)BR+Ff=fz? z(!QqEWIOqlM-C!NGV|SoJr+Uq8zS*4Qj^9ZjN<_8Ev|?Oy5BTG_sKyPy zz9^boj}Eu!U3Q5U0Vx;S2m|eHnTjM@mP&89nPh6A-I41(>o5cJ5<_;u@9;wSLVzzg zw!){NOfDLfbtIQSorRK0tB(V9)A|=6EpPi%dBP`+cxm2@%cD}veTHj#L9C)))^#); zQ%9ugf$7%Op}<~V#+9XR>qhpN#@6{Ogs{X7T~92Q$MuZ`cUJOIyxpwZB?_*$1U>Znv#{18|TWU(hOWA+QDL-x!_QXJ5`k>5zYsFTb@lUw`t{%{m-tMw) z{w*p~XA${?Vl3nB$OMg92tLj-xvkRm0Bq)t0=a$TO1Ndy z0Q8n2fe$3Xm`w!gUu@U?7;uKUA39?(VTGMjK73l{1=_o`;Hp-fq&R3fsjk3^AH_)3 zd@?dg9kU3b`F;ki37HzqeG%Ie73BC=^E*GjerkhI?37Y*o&D9{ndvAswJ>cCYI>l+ROi+EJbkcp244q5-Y&zhIetd6&*aD)?2V~O@kmjl$5V4TOYKh+?0qL2*mQ}C1 z_1<}oQ3sB3!lz-T6Z3T?NVpC`>izKt##+nQ>y@KVO2s1G4KoI~Bt+{S@r@{_7G@q- zvc_Rcmu`j#XPD&k)h4Wl_fh`q*))VT>JoJ}F#+ zmS>jT>fdL>GSs0FKFqnDkr{rd&iAW_0UE3LzJFJs{WA>xAFuz58G+m14GUTsW~`$S z_^$X^3+tSSDd=Rs!Rh`pS?B-6*Zxh(BP((^`ma-rUemU=pA%@TJ!UZm+w!moIk_xj zgjZhpymiHTAl%9&^gL@*46qPivf_RZNKWgHq`qV>-a~_m#$aME-cmFTj-J*`se;yI zzm}iqdy58%TcmHJALiwN;Pdt6bI=$p8K2$-U4yo4d^g;Az3P|}Mx7r14^Dw#n~1^r zmIp!9fsGQ6!&ak)E@KbvO}0(KSeeg3I_}~0B1W1Nv=Kx*A_ALX1Kz$|t>qW$+#g4ftf_!XL?{ZmuD#63dD(pa2OZ4QNt* zhY0jSCM5qGkY=PYCc&@3CNmAiqPY_E$!uhu9L$1W`o34s4TFV)SX80V%rS%mn96|y zLpMKhFJJ40pC3(E^VSv4#(HWFL?=HE=MQ~OTdOj@Cf8KXdynS11yebj7uu*T02)3y z-6>10#pY3_p<~gIF;A^pCWIr}dK)cs_v06-L)j|m$U~WuXIFT~e4xYw6_bY3+KF{m z7OyvRC)V_V5_+X171AsoHszT8<@Kj?S?|UY&*Yx{LuAyOKnqZ7C$Ze8^(enG#?H?P zrHIe}RNsY^s8jHIA+$E4vRqYz`{$*$Nos0XFUEP7iTbzlW`_K`iF5A&85!(t_ zTRngwb*EsI^K52XacS7ujQBG*=Jqu;-hgvH^GXio2nzE${Pvg5i5Hahm!jtcq2ZVx z@=)iz?xQ)lqKVRS!&T=ckY`4|2C?kwU9;V^6_ z8K~RWU?1?slbD@1zztzruggu1WVcKcVS*Gc1NuKqEKpK^)6Y{NQMipec+B!UC`}fQ z9he##ppPNQunwlO{9kO|84Colyk#)X3`^r@21q_D`WJ0|4oJnPd$tiD@?@wcTbO8? 
z~3V+nv&rjkw zQSQIEqNhi~(`>#kxqSsgT$`F1-shy2ZM6{23?EW9GWmp!{gAiyLdT~vW^eSRq?RE)LH)K8|y zDC4mBrN(AZ*KZwfGn+6gxiepjtJLs)lbq$hV^X_foG~zW;8P}Ikw5>5#-nv- zk3FL?r|Q{Reo@=&$-+QFu`Kr8C#FkFSFtu8Hrl=|wl{QPFB7>Hks9GtdiG&q01aj` zozH-&N=XamH4i?AM_Hh}slJ|^?QWi`smYlVT;^!SP-w1`R1-D?+hf7Lyl)yva0hyu z)$6oO&6x{iQG^Yqy<|tZ*HKD1k-QA*ZUWcJw8H+Gk{npbNnvQ$<)Cf+CkBR$wI*Xe zWH^l!j-C5)7h~x-n+*BDE7K@1{Qn<`(~O5(xcICjm0q(T*q&KsB2L0Qg&OW^G}@ zKcaN|!HJcHSt!C^#MgPib`l9h%m*6!F!lTv;w}6e>LmE>5>f*k!0*NlPZ;>u3#N}@ z>-Pwa(9Xk924^yMLz@9^>CUyye>vCw`HcJb*X)NIVH>hfz#y=B(0BxJnu^R8;lEtk zjM899D(KOm59*=&n72XaiKN#HeaHfKFEURs+kkLV_`5n3{|S)JdH!NM(ew?rxCs_w zZoqLwA!(i-sbJj8m{W8-ghk9ktWH`GOR z(1<1ry9|qmrwij*1QIYpw-s&L?4c-l?h3E|V@(MxP8`)1AE!d@%7{5_Wv4?OUwLw5 zyvMEXp`M?lh_6*aEqYX$+Hxj(rB+E!A*Q{o`%#~xCO2d@1FEQU(L9Qy4Ygf`Wi@`}{CA!~cO zB=%-f#W&^a0jxWf0T*z&e!;i~*xre6NZ^e#w;=jBlCYj2jWJ+CkA8fBleuBM!PJty|=4?34U*D;CYt_^R@uMd46zsJSM%k*b z4bmJ(4%B)qZ5v zq`?vL=t>QvY&~(_5&xp2phNc@Ys1pZ3S*nmcYkA8MSDLU{RJ$Wx89$+>v23bRIA2f z{{9W#5GQV-;1%+Xn)9$*siW|PPiSW!1%m`DvVCm&)vt`@_d9k!WL!L`XU`Dyky~8{ z-i-;qSq*?8ep1KJ+@VAx{r|+3XQ&piE%t0mtx5qNi3WuEyp*Mnkb?Ka3ULgd1=#lQ z>0L?GCj{2bfzDtdVqjNV2xE4e|LFL9)biLG0gZ&Wd`ZBM=_klN_ZDSK{rv)MFSZtX zZ6gcE1G))?Vt=yEf6F}a!;m`sgS=W|(sM9NwD+CfIgyNr`wq+*cIa9nov#dgl4?KF zI4)fIGw)gY^9~nGpy{j-Nh?5C$5&v^W;;hEGk3=}o0h-u_S3!~M6pN!!Y{8HlM zwa9OyPL}UzyulprxVG=q@8laxa{L=~y|$UmNC>6PEHj>BV7JU)9$2A)Rw3;eR$ zcnDd%+dYjrFe1OWJpypulJ~8QAp5HAh_A>^e3dAoZ^?EPzor2WGf(441Vp9g>e3l< zGijs=fcw~E{M-0bWXP%jMx#>nWKaB{=)M6Cf96$%Hc0Pxyc&PQMGcwpZ%HdfjHot& zTc^{1mE?u?md9siI8v~ftkGsCS13F>5Z{x`3d+QPthA$FwW_afuZ?HjJjGG2 z*UPkmUPGuXqs0eM`ks_3cZyZ3dJ|*YDYhXyDKNhJ?G@jH;2^l5Zw?MZqw{t;Xft!E` zJ8nCB`)K#CVBe$YAjWOdeR4_gTz<(N3f*qril@1&3a%Zcq@^4*On^y-nM?XeT@#MZ zH#0jaT{9VY=@^pCI@>M^<(^5Xkb^H_e=1qK7Nx|FIqHG;?@X@U3ax{NXQrmv@{w;2 z=0oL*>M`xm>mJPRYYZ(&d(U#(^Ju;sgBln29dkdZbx?H|X`3;L3AdFbnH!!5sE`q2 z*ZbYHBS_U;TM-%JBdU~L2rvC)*2PZ`iv!^J=uL}somHVsJ#_Pg3Or%~BT{)dDe}!;&!`xjTfzLv&rpKl*E(Qyd@doj0nCk{qlyqJ+hcUC zCjQ#I;WpcgAo5XLNaoUTa?$gF;iBGz*b;bLxUk=mEz$b0-H7_gv$gJ11Ip_F>x8vX zNFOiVwe0gkyxB62z&ILazm!Y}l4Es+im#fyu%@S>UanZdt{^T>Eb$M2?M)bts~;UT zaQ}Uy@jdHI<}U>Md;21?Qk4^;J!3TB7C*Z|Ska%fz_;0A%VA}kh^z>3pRW7eV)=XR zLZT|HUfNCqQ(4yi6F$gGc3s)|YUn*Ju&tHR`n~vR$^DJDS+L`!EM7_+x~z^nsa77n zHjX3hXdixk!c2050{H)876jAVg?@Pli1};i`~R`Z zd8a2t^cYsz-AlzNfp(N zW)2^p*hiR2bb?u7nNUd-55gbC4fdE*8q)=~g^HtS9tTJ>iO>m!HQ;J&38`9^pc~TL z9r~wbtt~^PPlfzO-Amj$9PVtRbYtO6#Qq1=$c)jsN>Re~>eJLKn#d{tNXcg<`t%Vm zt$p9P_E7E{;PQ~0ZNx*l4l%>CN8IZgMa=JKpR3*H7leo=+gdI>w7Fh2gxp+aVU*Y1 zWglkh=&XytTEf8FCTJBb;9nv5O|INvC=%a529?A8gSw+}lNgE?gv8Vc2srX{^m^<_ zdHabMNAC-H~(dRQMi&j@DK&g4&C{cT>>kNiV9*v zbdycMK9w{FN3$ROBw2fZHvNw|SoG$)WW)F(^+uQd!K-_^i@zene zh|47i1q(H4MCIcd{^ax={DH*Nzf)Y4k4?zKCZ4h$T*03#FE6hy^{*I8Ci-&O>Ypr7 z+?Lot)^(KjAG$bYI^kV1>gS^^9P6F4QX&40oDu1X6@Jp0`6RMfrLBH1E=FQYG;K%A zxQESc>kcB>ELS7eBdNlnzbE6Q4p0d14|qV=@QIouPTV0ib|g&~9d>ic$O@6T!BK4R z(JyuxAvN$DQ4sRs(z0J(LCLXtm*U+OOFhWTtwY_t@RGXCk8ZuY8sn^Z&CxYQA14o} z4OmL_Q@fLxcK z5he9kdww#dAm=|BFtUk;?ERHk)JuXYFH_;NMj zu6*HnZQqe+FJ!``M_-Qlb8Oo(^#s^QWj6XmQtvMq{P+lTyivp|dhG;MwNp?fpR6Kf zT4KHD?EQjoJ5r&HX8}g&H!*G~UsG8GYd;G!hl+pv_EOnTI-!NG6t=;XnwF{`v(Qyl zbT}n&&OQCKc8)tP7H$EJs5~X@8@(JaTb=SJYp(8(e{l5)=^L&Ss;((%USS}Iii!y% zt^vjucMFepmAzZ9MD{60tQ#KIk8y!OUX!VJ6P}z$ZO$lIco8 zC@tQX0g+ehcPX9^b(K^SWCt7Jw_82gHqyA+F^*HXA7K7+9dXtNH;s$%oJ9txC+HQs zDDF9PS8GNSOFd{wE_)pBE9msGRXoPx{Y;EsjIrgeW^?4{s!>@NdR6s0)o!eoG?QHP zsWN)iR+&0WL>xaPTj)xCXI`wgHlkem7u%Ctid)tJMn2J!Z|v2jVvHntN#-4%Pp_9P z8?4YiYV&VGKzh;hq?G$GpvxuUI{G!RaUVw(*>Po^Tj{&cq|Gq| z8+-rJ440V%b8}264gK2es4GFWX4(Ql@F?U#SItFv%QFYy(HB+^*0bAtByWF}zMC}r 
zMmaI8LYtIA5GPE{j*G8PS{XCAVgWb^5{FksC1i!*=dQ84h)B{GRZ=9cQh&NStJAm&2=ormahpXDo1l*aM z(w18At@?+t%>Sx>|NqJN_H_HC*3qOQ3JoLK8~BLC-;?edO6M0AFu|dAOeL93H6p(5 zC}t`?#qu$-Q`GD9I9ZS5mjwG-^`+-pde30@mg2vrFVY$#hV8Rm61)-)7vD&`-BLpl z8z^|^pnCpCG1(u#q3(9k^o}ns{2hE?;sn87hN+s~&k|XyDx6KMjk1rx^qhe$bI_&c zKvFu;(o?II)9vgjbWP=Yc4}Zwjz%CvER3nJ|NY$ty~J3;9F{Xz;L7lgu&R+7T%iKC zl25##>?6|B{k|{%^#s!`eKd)}@k}Jx$A~vw{P3+p-zQBA@9({{A$IrR_p^RMxhgHW z$CuXW+`FZUw#Qx@)0LR@t1)Rbcv-WAhwhKM+Fwnk?`Dqq2g;BQ%5e}XsP|6X9~bFc zd&;kVPWY9saXIuI>vXk=K;1EyElF1?7KSvRYN(8y)?3ly-EB@m_x{Dk{rqSC(i^92 zv8Bi1?QfY!bl`xsq`G7YUkymVcsQ5p?r7{)^67CmV}GCGZXzlBS(m(!E(^<8@zR4D zZELrsRD1P+zyS^R{v42wtT+NxIY@Vq>=1MfR&uJ^u=Wlo} zq@cd~g@n30D+Rir(fNtACc^11C-;Y0dfl77G;3M>Em`mZHjmc*Z$>@;?ce~ulYxY% zklm9gEzr&>z$`qh*2IJR_xDSS$s8b;G==Ex7EBkwrrT)?AlDDDsQHhK|L^|>T%pS_ z9%HBAQIR1*Xr?SbbnHDRjnCW=LPiKr~8 z5-H0qFfJeI^h0h7@OcZ$4E9m|w3~}iF3j+}^&%N84?|pW&+}D@@_RY?(#r1Wu6fQA zxXGvcR|e~_VcV{O6Z|UrX`y#WURfc^fr}2Z`7CbToKVsk!hOwCbX}AHIjgB0@g_Gn z?-4swbc-2tyP2w&R0MME(hiw{m2Dg4+XCM!LuULMPZL;yMyOlt4IRs}f%SIZHr@YX zQ+k5HAgmA_+sH@emg~vya?$;c4Edkhdl`?&k{w>$E4r1WbeeXMe3%W!R*Q(c#bllj%cCJ!{m1NIYLc^uJPKl2#+M8m=#Ju9~>f0Wg* ze`n9M*OZRZuMr-Y3T=YCDbHxciy(X|ZXdX4y=(TC2Y0&f&_>U4tG(7>Y){x2wV6EK zL=ae?c6RG;H!OPCPdUajdrH&Q-Pc#KX}agrmn)|$ZxL>i=Ia3n{H^lVnXbaP%9IKP zwLV6I_mqQgcRy4T7K}axIvGn2-MGw&@AxP}k)}Yl&w;g@onN{s;m?d26=RRSv;TU0 z#F$R3E+xZWY#H+|j|-By(v$K~3f_`^1c806Ji7O2Y>mSHpYhF*Ap%jWq#hbZvn|W) zoNuOgl?^-ofAYfbY!?z#S&ni1U%WBfwIE?Pc%Y5jXHTn^%q)=SYR*e%MAh}ynNsa@7l#v$rr){rdFTqaeMHsj$|ajl6ZT6bC7ntMKg@?lv+kV&$v!S zW?37i4;teg_rD3;=vS7J_G}=hTV4=3B<{504{@gQ(jTjZ7Sb_?7-klDIp?}&*Uak` zvr&pQ7CqU)(5AAdo=1G~1cn;fc&a&^2v%}KG5?UjV8TJk+n6e$W6RWU__>ZFn|Ztq zx8xJFq^hxT6sKGqzVm57r)hA97)~_}^fjnblcz!+D8$@(^`U80WN53=o5fExBr7+$ z=hzC8`nDJBu;;I~9@O5b9&9Go5&XxAZ7TfkhJxuA-+pI@S`v%a!w5$;6@~QiKB>0~ zq?PmYQk9@-PYiK2f;7G9Thc#f>L#n!;R>}nu!gC>*i09jE#sbVl`Ns}P19~3&Qkb9$P#zZ|*p8=E~EzCnKIqeMMCvQZR+sRntkV09EP^)pzp7 zd@rmp!ccYP^@Cu$FrC=r*;9l0m(@+#gd-9eEBM}vP@xaR^D)_p_3hSD@+>C6;&8KXpW|D6-+4e{bOta)Cm+y5s8hjM8 zjLDrHOEYtw2>m$I+8xa%s6Qy0RHJYEW(J-5r%PYIC;do>igR z+4)1@pVa)l&H5Cya4kEf7$_$%qHrj=osL9_kmYl8Em5~fV;>WPvn=I`ww`_QJ$1f7 zspE3%S<~+~pz@>15a>rzt!hWBtE-Tq>?`S%y&WAsY&C@? 
z^zt@&Nwi5Kc2XXS7#EyXtqAa*sybGknc3-d-(YtBLu&Y}NPF=`269tdgsjIO{~$Gi zc)sgyNI0B-p)Ywte5`5a#B#6am%FX&1KThaj00f^YB8X&3hUYmsKl`UE#^?j^9`D^T+C?%_fc#V59l?$olyRWnoXL zlO>K#-8$7H%VSGSuF?D{$C9h>@^+pw;+dcqH&YB)-gPDz4RYHyEOIyx1V?MKeQxUL z*V(wPFUmS9HwD%CFOKG16r+d-^}^%f+wwXb!n_AtExze(=*?40{0fj2jPBuyA#s+` zdL53!{EO{t&p;crWPlNA%LAI|GFZ^yTYwB15+N_2YIFEtbb2+fWeg(*i-_H>S{-)` zg4gDv-J@zIv-d&_cKs>iK>Dj`L|o^}2vxOmHd!%q&uQwv_rcELEw9-l3HFyC9boVA zHKlM4OHG8OdyrvrcGah%xsvgKs0U^Wa$&}4m}^AozYl%v5~QzF_{o@Hl(NUP4D9$4 zCDFQMIiqmz(xcGMW6V_a@0q=DzZouHd>`u8SM?d6*R?kE3QUw#_ah}k{sSN@I};#D z2kq{EnS{O_M5!mQ2dWys;#nv+BiIFC+uia%grRO zSBtZ9_rh%%B|Rw`K8a35Q=ASh7;Kd+oF*^Euy?)C=2nk+kK}n<^G;z>OV!;Q@0GsO zjvc+7l78pzEjh=gPepob^R>`$KjS9oz7e0mz8E=!W2Xs4CLPKzyEtsHJ8x-jAA?@A zlo;G3d{e+7V1sudF*wiY>z2Wh@3Fnd+^R3IB*1)YvB?2&w$!h`PL`(H5ig>ELmQ!B zOmCz7g5|kssrR!tBik^0A!31h@I6={&@aZ-V|gr_ou^-npX{ZY$>LRWLi--+$RQm;Y4@)G>K6I!zx8F^@kkG=Yqk7Pf`7B$PhHVl>deNfa_wfgwu~-}RJDS$; zL8M(#q`x|a+DJ;6YJP!n!0{|ri~MrV4KB(m{q$3--azuINcw&2d@=9l`)tGtr~nI8 zp8HXPKg5?ab8N&e1{pz}Wc6InKRiw?#dqPN5QRHotn*h{(83pv)k0iCD_JA6AHTGH zn5k*m_vrMCJu_=!f-7y)4Ikz%M4}9yFVq=WU2N71;M)jDB4jBwXmWAC`*QSAUj>26 z(PgQd>HlS^*mC@XRIG1Cr$~c5Eab*}4QS0vL^Xk{(i6zxNG>0yMFi@vVU9Umc7OQZ zO~Pm`GJ3uD*%kc!T-D~JL-PbTgSYRa)w$m1LIX#=?{^mRHE9keaXfm#rt#oK?ZQF& zI(MnOZ?T{7Mqs+^q;34L@m|(% z=s`_?=fi_YlG8xU%OpdJl~W&~TPZ@fZlcBGlKHswaJu3E4@A1twyOlkvq=6>U{MFflY)|64W<8nPj6;li2B6`ysfdUcHjw)Hh4d9VzgwbY(T1NGZJ5V$_?1~kwciLFj1=b zJ?a(h@UKXU+Q1Hfe4|)NS(FoJDxdvu#0@r2si?7K(#K5>3-U+%&#T==<6ZrCBD?eI zOvBm*tBZklw3P<5qiNWX|5iJiPZyvj)sOR2i#$AuM!FM*f%AsR2WQKoA4atOuAB1C z=RCye7TcxRU%g2MrX$cP29BfB)Q;ng%HkSc_EE^mhvi}V646xCduFnjJvB3AL}w)? zC|CPT>e<>NdNsCeM_tuC=SXZocVieWW71)z6V^oFK#EEPSqWNG)L@TGBEM{kNB65n zmwdl}??Lb0Juu~7?tdmestcjP~7#(Chz_1 zw!kYQTt3Xyh^4t>?em%i)Xa<&qhZ>R*4AHaFHpsBLHQr1ES?62KS_#+s}B#wHdvlm zeVP_rQEcM&bUJ_Ym`K@*u#7+a-N>+w&8Xbh90ip%uN=p(ACup6s-GdCHPy&^Q{CZH z7L{s5!^5jJB5$4rns)D`J`_Q};#?6$9)6506FnQ(bn{hKAV)TQ=3S*j({1dJ#woB6 z;Luv2=jK_ItlLW4&Ft-2XSksB>4Kn+g}szhbrtlf{2dg zYtE%6-EI29Ve6QuIPReZ+&)zid5698Hl@Wy!5*zc(p~eSFYUyV8?csZn!a>h#%U@B zDUldeb>NH7i`ByKo@JljV-5`(d;yl@@8B~7P2G_|*xZEUiF7bA&R?WFNX;)W8~v`^ zED>Kby8g24y9OH@8|UQI$5_R#&G&QEHq_hjyTk2c)O)VgZR^go1thldcHNY%U{b}C z3G`~R>iQCqPnXWB|);lAv_iwk$X)u;HMi08UhzAc#mAvHbRTmJ74i!=CL`M_H z1bSQ3zVJ<`cx`Lm5{@yC^ZpeNbH(oRXexk#-iI(-cku05OenJyZYoNEi+ z%7__((jyv&SSmy^F_vYvQFj|&~@X}ek=zjZIkO{h3}=iOfamk2p(_SAZm=5a5I zt?bXWtyZ4Oe6{%czDEMOqDu|%1;olK-OT4edu=)J@bFf4EGfS!gXHbWL^`G35^q0y z=EC6@-bX7Pye~ps7FhCNg2!j8NDhl1y(#EOS2tmh?oAv)PUWM;=6!ET=V}IU`FLo< zU2GJ-ro|drwK0d5O}@!b09|mbBs#hHYrm+(o>rrKvye`$WDOo-384JKl6*|5nBFH(`gj zHm~ZpVhTWS4EqXXI1Mn4MZfvrTv1?4MOrCPwrWt8i0{b)6Z70xas`j2#wbT5LmF@e zYR#{xszk#i8#hr`uH{R7GrDS;+)|?-;H7MDm(<1gJ^BVj(e;SqD30mF*Q>yoxX0gg zCx!JiT&*cStz4l)FSPp@yN#v7UWUo-DyfGXTU#$(@}J=L8u6-9)oz#1HspO}?yKeO zs&&C(Dmju=K-7D!3SG$+uNRDL{CUX&-Ff;IY_)*xo7Uh<>-EX-cwS;uG&!%*!qK$`!K4NImOpueQk2=4CzrJ@W-USB82o*#F{i0%thBHze(?ygXNo@3(6WFiZbyCyHYcDkZ4`b+Ww zc;wHxN1p0tch&caUcvwF4uTdJ$7#itgtFvAiz~4Q%hUAB&v9IKh&(XkYIgbFcHKaS zPi)O#Nw`I|c~X4j*K(wjHJRB<8khrET!|GtE zP_IlKh5;i2THBPo%iX@IVkdzV-{Yg)7AKCs5}PiY7Efe8Btx4p!J<5o6EJRa22{b*+#D50cC97eW@5$8UIUFq$Goc%3Je|GaE6%IX8(*^XRUd4B(6pWkK(XikeDmwc8!X2cTh z_ItNcG#M%S51mRznNrNZsi}uC*uU70v!Jac|3KFOzRhOXn9X(vr977f^aZQ@F-RWq7asF(>b- z99L^*Ux@qp`JWPD5ul|wGJnJ6l;ezP{JDF9T%13?&xkf0M|5cg?9#Q-1HY(NMnxp! zM(BkA?Lk?y#4v^Tr0WOv-b|f)bf@me?2E~u-p9lec6d|iH0jRVN7JENEwyEex?awgE^>LpazA;^FBG>rM<)*9P><2j2=F0T? 
z{q5b9<{EAFUi<-E2G#pcS()8SVXv41hXp=@H|i1@Qj;uKVue47q-cbR{_KjVN$&VC znlbnSnNr=imHYp&_uTHNQ(^+Q4lE+5F#QqGz&c>Hb8n4 z=^(uXh&1VhCPk3md+#No22y++MW0A9sh`52q5`(*4}&VwSGlZ@=BW9 z;pE3p3py{F@RmG*%mBm>_WX-d=RNBtc2f(=O3_j_@L^Y}Vp|W@7hSQTyYbQRBhV+Y z>f4hHXh6*UeAGmrn&;i`f}i4U$tZ1UpAUSV4D&$sppN1DUR7TFo!=qA16Sz9a+WdAw9X@PgA+-C)omezp_53L{0sFF4>kYT#|ey4xhI zPUOX8pr2T|=_G^qGe;d=nR-4Ser`>6`SFu;AWFXu zLG!+VNBXPm$%KiDCz2uoX@@>;T#%7F{~(Pnm$OkJP|AU=S@rYL1e>hoNgBu7Q~u~{ zpJ4H1Fo9B3*XE@!=9}=?yu{;Ql0PM;AJvIQDm z*^T?iuD|>E#V-GgrNv+O-u|Pn-NZb`P##w9w4kQ<*?Tj{)1*xu`0DLiAfeG;#7qLV z_h#h-Ywb>N=qJnV`@6<9^j_=&P(Q8}#2doU6Yk4f=FlMUl!2+w)+B&fUx|BWlso=) zC@GY(@R0fneP`Wu8+1%`gr1f%IR`+?`SxICvAAJ!>k0au))1g48ma&gaQ;Mp#LnKF z%*G8UUU07I<5+iyy$)TA_F|<-_igAAR>zicrxnlC`W?V7K6U4sRR$+u4TMpf0rF;z zBb107#qXqRMEbi^vRTJ|(4bGYEx|hq(NeerqKeIg+KJ79DlTdSs3XTjYmbZiA@haK z+L`KQAe^ulm;WY~rpwJpFFoLEcak;*_h1H(R08HXKuh&g`=Q zO(1>IhJei~>q5-+-wSmcnGnLMsyT(!W$4~jNEhZUG2TQVX`JEt;MtFCD*_70M1Zg` zfbf#^1+1aqK>3R=?Jz|5t$W`r%BcfDXRGcx!`jF41IhmS;S?7zg(Ga^ySpUm%7C1e zY;RK0bijqgwz6Dn^i<(0zT2Qds|ei@XZ8YOoJMg_%m+9W+b=dg{duTqUlHGqGe|d- z-YzIUcf3)ecjI-4pX=Mjv|=&>z%eos?H=NDG6aINuNlWFUSABThy*!6DJE}&;VD}! z#IzbNke@X)f2;Y~g+~$7@Q=5fR{e(-8WpOs;cY})4&1(F)LS4oksSJ#?|pwqFbmDM zVdk?f#*O44NP8r^#a_A6^K$ie$oZD{3cQy>8`2wBpLLeM^s`5-4;#^EA6wV)kKEq8 zbPB_3WxJ0Znz{1ZrU5azg{Kxy%2gXD>NNqtg&QYXc$Y_5+(Hh!BgVn6qjWuH7EwY2 z>0^EpwY^6Ji=jx0^Y)riB`gH?wRs1jzGm{4JMzJ&iXE(NS}0kBc->(#lCMwxhQD31 z(BV@vMso|pf*7ZH1(jkl9Dk8(?2XsTROQIoDaGhB0#sR7#wxj`H$Z1s}7Smu!w0f5w%_~@ug23VBW08q-_`*sk=s3_Y|2q&Dc>A zdDChO07}6*R0!)HH$I=Lx(Wyz8cpp+k=L;ucAmX$mg5TaoJY8qgDXKK6ML$|_V5YZ zpa2=CjT~c!jwrcMy=(xu$U?)u0ob1)s>jFx<1EA<(0+OZRK_}E34IKe{99t*ZDCL(KzahDqm8Uu}VcQ5PC7J@0f(pD3xauPjSMzF$ z;=P-b0zv5#pXFNS+LhnUZB&sF@glxJw+}5r{{AfFj^EHS@GF}Kfy`s1tp&LJCTjzV z#aXQFvc1IZ0p$69;oKPQ#MKC3;R^or2VAew7QS!1V#oW0ew(+>@@6F`A1SX@0axnP zm$9<+NTsgrklHw!vB{J}pk_{Duw+gsPCI_>>^SN14pu!!^S#Gm?-(0)3cP*NQ}IJ) z=_S`V7a%Yq906KoAhV$}?CUnyyF(Ev4sTvk$q!C(piYbL9K_vPT5o4Msd5OmK#l9H z@W^j{-I_0#uMg>E36p3eZ$?uyL)8?JW%@>=Ii zZ#iq$Li@qaChHj%2Xl>Bl}TmDMgZ>8R0K5Z2B3(XzanF%HlXT4eGR;~y7=4f4fqpt z6J%7MB>Iw3mysr3OFdMB=#<6@LyT~mH{_QLqFHJBFnz06FY4Dk(Ab4dqbwtuWH%6H zz*1Eq`TpmING2028tUBbsHG|ZX)UCDutH->DeDW4aZBsWBw9d8G4qKgxtet*jA&#P z^;Jh>H+~is``Rf1&^kEm>IEpz3b{)6qA85$UEEz8s=3YMCzT`SB|cwV9j*OlO0^c6 zp(mE=QcVF{`LR`DS7;-t6A1O;ww5>l0IPUHjO7PSV=e~ZUB@<1eXYpt*iN8O``53v z$}AB-RDkE*y-pyRw}9;oRt>)I{+c8MCZ;*)Y?jh(_`y2DHtd#%2Ra-+Y!>@bEq~DT zVv{l30bJQFxK`PTG7ob;@?X}_Frek7d4=FFNCAF>OmA5*oYKkqW+RN!i%I-v?7hzPoXA4@^)x9N1SIEgYlaaS^6(22F17`_8@?iB1fBel7FedBJ8@$Gd>7yDP_5VS=-h z0YPN`bNljiS;Uka-M3FWX7U$y_+cHWQD?PZ z*n%RVmQW%AVDhIltl!y{tt_D=+nBm@D_79&JbR?d?f&$d%DyI$6&dZT^>BfV<43&? z!fihU)^hPgpZmwo90)b-NN)RV zS3@{^GRbrcr*G5x6v9)UmP@G}r6eJ{)`r@N%MCKIXa`I15>R-6EPzh$-^AZAfaNQN zhMH^$#~%e~dP*M#FxmH;_ZZ>K4aX^QAktWpVyGv4ZQpqv;JX0M>lT0Gyp9;Z2okHi zz;6^Z)Vi_>Dcb#vEkSJA(!-bkedb=K1{=w=<>Vp1((3!65Ng-SMCPG-pJC~D-(;+>n?qhdhKvXCW^H|`MI^mi;# zR6H-H{ENw}ktScD2AJQz-Eea~rv1Aeq}tw}B{w813)tqnPq`ey@wBXG_N%Vz7zHLl zuUDI7^aWl9H|O%OXc}4e|8YtihI^0?Oyby!sr9*LuVOf-chBCHEm~Gv;&63 zS}@ctZ48H~r5vAs&PFwWNl}|ZcyKwC__WEmWgK9Qj+rJ$%MmC z1A{XjXoY@=9RkL?;0#f1DvkUBv9>ouNA+c$W?daCU<((C_c!Ujt*ArEJ2g_&@WGf! 
zlSaTE+fVysri9Z*;!{jRG(g84uN$t_Yy`gNVy7Un>r&OQG%|NrayROn<#l!;Ya=Jc z*QbC3%h*G(zwt+<@_UzYktnm`f%D2X+vwn(ePCo6I0A^7Kbd#Cus4Y93Igf$RwszU zRk&0*6VD}LT1I@9`QG!c(cO3GypRlN&}{kgg!H@2@O50{a`OnLwg+Wi)eWS=9H7U4bZ#USQ`@X@Vc>8yh>a64y6T=@)WBG7H<5^lce>1F~Fv7G0*!cjpP$eDg>~wl>ZebA8*0R3n zoT(nB5xPu@eJKBEUR&$+uDOysf*?KqzMDVP2>%pudgWtOmj?_w>iby{Q>BpGPEK+_ zyfA%sK4tpeAp%oQK#Jb&(M6$W2jv-B1H zLDBN%A!@5T|jy?8}C{~NN_^9G0)^<6fIz*TBeedzfssoxt$*Mi@ z>&>;#3yh)s5t8cUCJ0EY|74ZDVu|2qY!A`NtCvZ!Gc_OW6^t+WV!c%PkPIR0Pum#6 z{_b!@jpao>z9nHQh}G2i&xx}j0X7la_Qrt?_gPE8|K;2_+e$TFy)|6wr!$c%sqOO9JsBu2DYV9EbQ{E(a9FQqiGY- zTX!RBHabr}6;R0ld^p_g>ed_55$SujAH~OHhA!83u&f`T637QuHl5z07>OD+yoT*w zq?`92feX|R47zK|isep|`qKMGP31X4buMX4<6u;{%Q|uOT*cINB#_u)OidAK`ZtYD zm}>28U-K!A3#V5d%mz3l>#7jGndr(aU}1jpTYUai#b0iO;fiH${2CiH`|;9RDamf< z6lfNm?ZEZi`txgaKi$S`@rd5eq9phxAdy-bD7j<*IWTS7CBnw4DY9pQEn{Y|7QCaW zvS8%E#1obX-O;WrE6{J{t1yE#4UVyiFpzpzt zaieSEk3e?{3}vW}9euh8z&(|&Oo1NwQ!HygZxZ^W6Ch2h)q9tzXi)~VsqfJGONeN~ z6T9{_=*R^$VH}$6@YoP@v-?}=4(HU?Jc?7UTrU0V!YnEsT2sDWwnq1b#Bc|^eVyC` zc!yyB){oaGgNd_lxh8hitJRX z!66x5zoR_4Mv+Pgk8Ta|K0*mY7VQTB_nlg)jjj#(-3cp|moBWVR_paU8WBTq zAO#WLQoWx9s(vMPGy#6-jbCkob0bMfD0l{EkA&ta)#PM`#oHW9c-2`K+o zAwAwk4uhUR_bofjnS)ouaJ6Dh(Z&Fj3#^SQ1f~!DMIl)kr-I` z*1nPT-qfr8wESP;A^bfywl%@!u*U~8gf9Rn>2NW3lG9Ns5oFZ|Z%m~J&>pyd85K%Ly+SKjbp1oMN%yy|7HPrIdnHyYI19p>)dkZ#RiJR&0W7fOdM|H zE3oApUXy$PBsk#;_kWHzwi)gN{VeTYc?PLQ5c9VttEK7bEje`=HgG4mox7M_#^6CK z4}b>4lx&pCGs^0`yB-Xa^YLP2=Gr+o4aw0;!2TK{(aVEKg;G4wmr_kCC#X<+J%w&w6p zix(2hPCowgLPGn20N!h)%SYdt#Uuky=g}k3){;`9Umwud@4Go%gJw9PU@4@be`iwm z;6G?R{GWdRguk0oK7q~ODbb21LX7W1|Dv2H@1ng*F}5v`!x{1jubbS+5!4-BZw6%@(mUMj#p=IK$WYeK<9`c_CG85=TuQI2C+s+cd{~Gz+QylKpZI z9PY44JwVjJ5^EN@qtlHVR1dMo)^{!~oRZIlj?P1GZl@HzD~Fg~h-*{MdvpIeubgvF z{?_rRc|=VW{+5Z9@J8t}U<^l)zddYo=gJ0TP=Ba@u-4S#jf_Z#iNA`HHV2J64F)Jp zIffOev|ItQ!b^WRC`xtkD+ef@$U1sv{iwqAG{A_DmVB3pNGA$veViN=2v?5v_IxbC z3|()%*&pyKzWnOxX%Gb4?ft~Zd5|+mmD^9^bK@C6o95iuNiuyG=g{@MVZfv5K?Gp# z_$;uqk``Po-*Y-$y=l`OlvzU%yvSlzzgB_=dyy~Y5L36TFUzTv22m!DFw$XiCsexo zM&m*9>~yb@a=alq0wuwIMNB9C`DkR?Hw9|Nfj2kk?>_KKe9^HRAhW-|xuojD-8PEa~b-@8|nl0W?{$PAY|j zBmORlqd#ae%Dt1G$od6vxV>%M$+i;SHvklgm=cKX*v*(kpRgKb?3wAlmmnL zy$9NN=r-qFa$IqDw{RefIKDzaIM$SBnlI?|^R%9a&(nq(joiUifT99p+Z@_l?$vX~ zv=3OSM|?})EV1<{rj_z8Tr6%0uUHGJ3!0%!8YUVi zR_0!5xLV6K-Uixx^q_A;)Tu)6k0LkPBSn&J zBl%{PH+8Gp5@@!+BKflB^>JQUxO#6ke?ctNqh3Nmd((JT0B)5DmIdZT#+rb`xNFeRWZr%LW(B=+1VIN`Kps#Y>f< zBNt3$yjxGQOd)=2%S`VE{oHNjbUH78e_nr28umpEV~JXqY24;3=F2@D(6ip>2XLT< zNG`lP)gitS>KX&7kMqRxuGV}8tg`8b`+B=!;gL@_FWu=e!J^V^LE@O}q(Jo%P>Z34 zO3zD!x0aYeHFI$)=luk-WVf5ocWk#ow1Gjt{^&YOoWNqp1ykt#=MX;tX)=DiC9(jb z(?;IV0LW?AhTxp<$1TTpblBUdhuacU>N<*Wpkne$BJSG0#-rz4j!*o%kU&aY4DP7n zX~BC;(0Wyk4c8ffnnZxNfg8yFKxteb3ZS|`8&K#4Yik9RS+ysxTj(?@etrGblJ^LmGn2d$aP7g|lr zr51X_(@H(VQ?2ms;W`CA>%3(GO2IFdJdc#5E1o7Y4WbLaCg|HON}hVG=<}Rz1|I)J zh6|`vm!_8>cYq@;5v-@?oz%Qm3+J1*^FT8=h_NBq2E#TP*!-<0Rf7ADU#X23k?a%n z<|e8*5zeh zXIOna3l+5Js+;f(>-=OOr(mGJ5nm^Pzclkovq|HI>`4%<;BqOX`Fmc}!*9qiS!)tc z@o6M}W7g|C_@iG|GUnh>K%zlv2?s(kA>u^Jebyd$s_~q?Z5~#w9EL7CB@e(J+&=Bd z!-fIrndA+97O}o(nsEw?uDe;_Lt5{;Gut8^3o9ua=8ScrA8gdGxUb%{vpbwpuKT5t z&y!VK$>?CFAC1DRjP%6H35W%<*;t81i{|J2m}?7&qbVmGj^P#PNR%{scs9VI+cfU< z$HUha?}s+nYos7;;2B|agF7{2w9Ww&SSg-Mx)ZJXxAhXu6th3S(kS7?`nDD8ObW|( zbKP5wYv`D9nM?thYZ{6Kw%s9t$6>F;H%~pkDSog%@ zjT0v>T{cyAlT1499meI8>$nT9Ktd)rK&oo%kb9NhswNS2u1k$dje zi}J-os`Qm=hMQRiea|);q@9S|(bl1xe-7DjvDbyfhe_E2v+W8n#=de-6} zbU2nqcFdcmazn71-TRYp;WQA__YeH_67L8EYe*crwQEwoL%yeAfhc`mUGYf2hB(ve6hZqK8Dd7 zeqs{P16Wsh_)LVo96mucukMD8GLjjHw()KnAsRbr;d|oQO91%;ly%hNGQ^RO>iSli 
z>L+qbhL43Rqy%vErkN)1D4hFkhfuV!UR_IN+rAHEt{r?2meB`7jiM6(`0Vxjf44F1s#`4H*dG;^>Dd zJ41~)vNTD7=vw=f%2yhy_2Ie6-LIhb@ipk9z!bMHv8k4WN}2r&jvaqWYQ_3!aTP|1 zmDs}{@7H29-@xO?N+zB>E40Fm-q{aUmv6rh*5`_{_yQvd{MXrc%S@`*oF8qfYJ~|7 z$%CIDHkx%i0k`S<(P}-95r_zG1Rto|^=&s}H4kKSwWRG7QKc3o>}}k)%vUy0ll!eP zi4y*R1N0a%EA!S%SEA?4iqwwe;I|h}Ugb}*Y$yaQ7kP$=cALybjp>%Y%Lz?u7tKAo0;WS9Kl>A4|&y{+zOW!pHIPi|gW)n}x-yz-bA>?25 zV({rrwkBl*=nHb23SE?LLNae`T4Vfbo7k3#hVMAK*11t6wLF5lxegd+ur^SW0fCw*C;-gmOQJkY z7@&e@k=(GX%iqoXYUf$LQ zuQh3?n~U&?GHPyoGuS)$?NDL9#~IH{9=8GT(I(Af`ZyoRe8`vg>)gPG^KZxxWGB>LD~9iH*Pc-%mbH~^Uo4JM9w3(;Qg9t z$lHg7r#-!9K7gE?nxLy6v8PK5N^@hgvBqvLp`NNNNuYFjAIhyv*r07}#@A^p-CV;4 z%0Tc_c++cELhalF zUm%S(%`zu);wJ)hYi{jn2yU4eT_p0ceE9$$qpGCv@+b5_YsqczEU*=4iRbZyY$pZ{ zR0G#Eu0T~-yB;DH@QM*P_2wOYYLhPCPzK-FmC`1HnZ|H3*b3_VNP-ABpPh(~+}%pG zaBVFV3`>64<~nlJ^9s`4{v?!ZE1Pay&|m6ao>iP~*s-PAqDTRSUwB`($CcJ5mQ$egRZ#J4`52YOsWwpVJGIrlq_s`a5VyyM)+bN# zR7)u&IlG9$|!LP@nxC%zrVC zgTVG)W4ki)nFSp|m06g|;0LbiTW~*Q{uX}TZ#+wWWI?ro0G}hqQBLw@vsxwt^J~4; zhaw$+V_)icPNOIqI_Ay2>w^J%$cr$x=u3rww{51tO^9#E<8li+?hRIxK&m1*iLx0S zFoNGXllMl=8>Hl~Rx~Yv=zJ;m76t92FWPqb>`_`Vu%7#ah6Xsc5(Y`V9A}C&L`S8( z=}c)rjoO7Hp5s=kquI)o%K)_}s)*@3nNbZ`R{5&P5IkM2tD1__w<0c<2Jp%knAk1tXB7^@W!+&l z0Dh_~;%A9w{|fn9Zi!c^RTOX$E6AsWBFUjlZ;!|82Kq5CCc6e91}$~5)=(L3=j?9kZRph=91Q359e0*?Wfhs;B5Ncj)aB`G`%^$IgOIf)>{~R$>d=e% zB>=*Z`uyA5br|ht))ZcBqRE=#d0#<(_Y!%jDg4l6Z)YG-QEOa#qUWDrJ{}CLd~{DO-Nj!fa8x9O~7S1*bi} zs|1nr_^-I7SrlrXdV&CdPY(d!7hvocMoyo72sp`qsR?YY2(KN>|5El+7tT2P)tFuD zG`u2wUIQSh7j;%vCsr&A9-c_Z)|~;cc;q%Q#yXvpWG1-x9F>h|JU(`eHCjE7*-(sJ zu0d*gu_z&)Mrnrt)xpY4PcA^PU&}|W5x>|QGj{p6T4pZ53}N|q5G?ns<4o&RDYKfp zQO{tG%;qU|Wqh}&^R(;Ot0icOSv_d^GFsBe7qa_IK7Qm7RcF>=ZEij9ox6q{O`&yJ zngD5@o$?tiifCW*Z0~YR^dcn$E(9(aet-G&QlS93g|Ieu8;FLTlRya>z=(YX6E94z z5T}yBYtF9yK;Askj~!gxcl3J`)DY-b2@WR>9s*=%kRy-cmkf9RgcL)$!8X6*O^e82 zJn${I>s_EZ1P@xWkOCne3;j;y!|$oj)eh>Sa8CoJ2BOH9#H&}^bbyL=ox;!aDeaFg z3OUQ}lwf4y2sV*<_XP!Fma%wg_Y3aqK>)VoJ>Xde*e@HpM3GJH?S$3^47Mre$|Jy#vp!i!3`kxYn= z&c98 z`3CfW=XN8?MWnYid{&_*^Xm7~d?MZBTBH}xgTLjbQGy)40`6RB#vv#PFN+qiB7N~$ z2xOXKA0o5wp2A-)1IMqU8b_tTiw?00xKRo0x?itD-*N)s$lx*ClPM{Aby*EN`E&G= z4ZyyloNk3%KVA;l`bYqdJVj}B#3KG^_%qM}ja*{UEa;|6go3azWRIIf3mTxzHIZDH z*bg!K1o%L3$6lvhm(s{!j{dlRj?z2u*GJ%g<*)ysCojA?_7=LXO}+$B{&M?bW6-6# zfI=2dz{ZQp@2*W<-#_@SN$M5`4kTj9uByS%PMvDjxVNQt|969gE&%C>zgu~g=;Ur<3XViSQJTqn9nyA6F32s9jmC8n zCyd#itfZkNR30_7=<8FI$H$h0-YphTYvkj+vg%^V0@^ie2&HsU+cbLRr6+Gcu%k5` zkaor?V&=Nqy7Bo)?(j8}t{2l*fa0LFqW$kT@BhxvA-$EXNGn1^^^pG!J33XGdv}#Z z`HFqND~qr8yLuk7v-QF$V48sR#P&#lKxmQ?y{21Q=BnkLtc3Fy32Cwl+q+cCu`5MhhdxkO2i|<^eW1>%k=(o@PvZ&_c_io4yK&*6- zw-yzxLVlwe(*Cv|?(d0z`xwU0d2V5Wg83fXYpXt|a5{S5_LIvEiRr5v2NP3n*NDDw zR7>x)8Q_h^GgmVayC%=S3Q_3g(h`|zW?Y}-`*3q;4tc`$={B}|Zv*rWn01?Lo{F6J z1dcF|4Ktpfsp8n+Z@3=kD5wP1>WC5N~aMp)I9(`$38OBA$pbD2A)=OG! 
z^eG+7kKx6vGm4jO25SOW>yowv_xdofAYw@oA|!A%6GVOqUee%)SMDCUO7fC4+DOZ3 zKK_)td>eWm=yz`(Qp@AlSbYT*UTKR9GD_ui?Fj+utQ+R3|1+5c>y#p2==VO+p^i5| z<@yZ$wiUiYSwmX0G@R5~LP3PHY$|uNrN{;(r!NHv-g*OowD|u!Pw-#&_6wZ#&xiq` zsT7GgS?$L3k0LPl79q+UaE986!mo%pEjs!xT}JPn!amC>7ig)$T?0-dSyeyZ(=<|r zS_$6K*vN4#6=Os<)WKKL`g4eh4UHXd-dT2{h4GT=9oJ46XZ!!mF-f+qcO?tbESg zBlpr5forVb9CV6(&%Ak8g?4$sqk}sJvfz`cwx*$^jKCIvD~iG9AZoqe&xLlUlk=%k zoq*|^a=&xT!vuqk*hGDU5A(J^mZ?|yu$GGcX+ zvu!MF797u3<{7#> zfh@DHO_#i_E3T5d+}EiOUlt-5tzHg)(7hQ=Q^n+|mlEv${^1wuUBujmBBa^=`E*^M z9~`l?BceXY>EN1)e``TZ^&x%P*gi&l#P6izmKl?zL|$BnR^w;i{8QU@}U?4wOV_MLj>NDG;H`0eBK z0lkWflgqpY28`&ZSd)z_K0BZQK#PA1{KiP&OUJ5MxK?iWS;?VCht`|uhxB+P`c0=~ z_)lHmqOD!(Z_cNeDP^oPp80;vD)PI&U&i+utM34!&M@`*|4&Dje^yxp0H_tl1#jbM zElW#9jcPW96oXs*Kva1QJZ@{Rh=5a}4iRCKkZWa8+~@*qj!nd!XG9^^T~&CpeXOmu zWFXM>J%r!j7)&8~pyE-jt_|BoHSR*3MXHax45Y~inlF$UKjOp@08#e1-p1v?Q_k+u zRE0U81r=;e3O86o1TEX;lfGemx(w(sXmrO^>Sox^^>+j8do#D{|E$mB_O-`4p4S)m zSIvmZv#1s$X2RW0peqwjIzqjn7YuX2;~^fE3QUcW#UjDeK$x+Zlm5_{T{P!_l zAArbg26S>fga`&yk7yie8tsqD*EAXe>IU8^ zqZZJ`P)s8Jpou^~(W086?gF&84@3ZyaO3+JWgS`{NPG~_iksIf1GX|1V&9S!3%X9b z9^IBsh0cQeKLDW6_)sW5MG=B-&vyc9?SX_ODoff|yFE4l8nYHmmD>BR1SYUvir-j- z2JU+Ox|Ls_%C7+XHCKMEV80gQU;E0hEpk8C`IWf*%E5l6lfN?Kzs`*FZ2;|_k*@cX z_@0{3A2bSFG7Yn32=O|9nos}(k>00bDvV7NfFVC#HduMp9SDo4}@c_rDAP&ub*fS+$+>5Wy%g<-9 zP^Y{OboK7=TJ5-HXD{SFP-ve!`T2jTmhpFcu>V2r-Ql8m^z2vv8++> zo<;5YWPW4#8b8^$0q4Nkp0k^zu_g|V#GjGEVve(tC_b$?!+lL`}RWqm(W5B56TFr%j%VZ%t-i zvKcsg<>OpV2VAI)WP@r6dWx->;PbgbY0ZNRClQE7thtf z8!2kFvd8#rvJP zGvhjg?oL25kML9Ndxi{72sRUHln^-pF}PG;E+dU_d`n40pFs>D@{i;5=}G%c3FV(4 z54y9HV#HlvO54(-t(S`D00dwNI>ULVGrLLDY)p ztXv+LWCc9H)NMu^f9ilh#eKkf)1@TE>h-qx}IXWo2^m`|>S?tlRL)52Homfm& zefQ(?;2D@xv@rky(*!7=fT(&49MD%Y{XxTkYS$0&O-Jl~9sT&yBGvnFX1$Yl^UkeSFk4Oj!pyU|NhwN?NQjB}KXZxv zXaB8CE-d-Y8g$7oe(xA&>yHi76fy5(%P0si@RV1`e}_2m?-2_A_I22N+wv0W+|a20 z^(naTN9$&BkY&nMq4h1MN3Y+uSx^*z(CnyuL^X^Z$)=uhE`@U<2*)nlF*I>sw3aAh zH=k8f@Tott&C}Y^D1bA>-~(C6*Dzku&Gr%nmBSwwNlJqs5)rc>*%T{(*qNUPS z0r^>N3Xgq_w$vCb(<$i@B_GC@?q*-iR=eqvUia)`C#~mgaOSy9(ehkUvY5y&adllz zm5Y;L2MEk)`g@K!^UgfsjinM?CAWg!y=|6!aZJ4H+?XH$jJ)y-JX{fR^O#~xYoHzdWy@v>lbs2I)Z}GvecIt zEZ#1#+!X(VM$6Ekdaz*6wCYUJ$FZ`Ylby=X0~8_{ji{#M?(9eK4uj!4A(rHx>>bGf zFIJ!*Sm7C`vRqV;er0r=Se7&#jk&$NwdF^NzVChG8$a!g-t8YWdc4kOOF^aAXK^4N zE@_Kl;}1b939Ng*_>Ju;eBqqeEZmzd)W(>R!d7Yle$1S6@_S7Qd$~Dqqy&fP<_QrDeA&p22!ynjZM}0OT6{Z3}=O zY4H!`1b(iHv|osboB^WAB-j(Fk{+XP85NsS*TDVWM<^`j^{~gGKv&9EmxDXx;paf@ zPO-BU`C(4{?TvZL&5N9>hfq@F#RqS$ojFb0i<6y-2PMR+$7&#&E~I{`=o<~R6KXYL zy(3?w(b_N^UnJ&-gieAvEV$TwU09a~r2uw-rNtxJ2L9Dz|M&d12jzpipPE&20fL*A zJWa;ok2A&-An78ckuN`J*j)ODwX$_}^b2bMvYExqU+@Byi+J3!;6&`rYoEIv0+~m- zF~xk5b&Sfv8NoMgxQm#MJd4|Ji0|V_6ei%itefb)jUqt65djY*7S%ZqSH=N!hO1X& zu!#!@+w!)Qi53Ro&iF$A6%&HS_A8#=0+gCIbMm@d0A>)<_ zh?{gm^Lh&@(F947)X?$F!Fxw}d>g!9lY?=xyQO7NFV{ITA4;~}A#aGOcb)kx(4W3r z{S?^5j$w0KEZpued%bLvc(-d7xkmy%j47|Ke>+S6JX8O=0w|9^*B5?W|I^08&Gxl) ze3Cx&kup1VcNd-VKiGTkxTdx}Z#asIih_VtfvA8;6Gf0tEC&z}qVygS5a}QwJy8@; zS_A|Hgh-JpM0zJ6O}g|>=)EMA5Yl+|nLFp+x%WQverE3TzBBXO*FVl@pYTac*4k^Y z^7}ULSl`yyS9YWj5;ULt9Yh|PJv3N8Xhmo=x#E-`LB1WA|EPS`%VkX}1V}xT-YlIq z)T&#wrR@|_y4v8UBMDQdM3oufq1A@m2*qFOt3@?w zmH$=^S-0A`T=UqYK+Iry<+Je0qy8QZ8~5La?f`+O|I}2wKW@^m>wgb*_Wv*a&USI- z(Lk)Eh|rxGx8xVYH2Oo(XKupjZHd!o*53O8uYPLzZoHU*v!>L|0PCju+kcfjANk=O zMpJ%r2y!PSF6E`-IVk?kw57cQ-?|F`Pt7}Fa|RI{%vbUuD_`*Wg%gOkZv=ynJJ`&iTc(-?pUNeM4S6}n^{rEZ`Rx%N9!Z1ULPcW&)2dk zx0fV8h6fgd#O_A!wIXBJdItIeOPH63?27u%@2b%BpGMu*Kds=V3G+HRD&Va(IROyt zQ~$lb1RoClsfrVX?^MVaFMw1mAhuFbD;5qGjk7-#H&~vWcwl1&r>EbQo*Ud7728ss zOqiQXx0|R|IEZ_ZMf2H0ZaZPg(mFX_tS6>^jDm5OdbYFkdNo=Fez!Vk`e&XL|I4`T 
z1u+d(lHSN>5;)|`4=o+i*kIu>YkM{#+14cY4bMW6p~wxV4CEMgbgrI20}=1dYzVSC zbxkk?c-(Ki5cY-Ty7KZ#XjRb=%hBAIARZ9?ZP6W8!ohGSO&C`m6kBI1Du?ELab*!IX1g+v>c$|63TJj4 zX{=55UxeiR(W1T<9Y3i;H*@wJFFaiSis7FIPT!b9rmmxZV1IR=h!W&$-gC2^*VoF$ zq#%sdZeEg`(T^FKqBxF+I5h>W1>V1$NsWV_R>NDhxMZ&ty5Kegbxgk~T@!TtI8EYV z0rr#OZWY|r4}lDRn%*1v=|+!(?q0rl`fcuY5Zz1A@^QG#EaV+)$I>3p*;=$AMx3DT zwN~)Xs}2Q?wze-Kl*0MKFcdn#n&mXD6?Uvf;T##BtX%6BXCA4jD__^S{OV#yjtNV3S_{Rjhv3Ez|_s zrG78>sc`_uk2F-88Ww(;8Vx_;9Z1LH%W9`XNNn-pw?q+)YHacMU`+l`smK~b9I1P) zsg9{a`Qp!w1|L>uiHqT$PgM@A^?8k5)pR}|b(n-R!^dBk5Wctod*e{{>nl2DY>`TL zN4}L$Y11cPh`aH6vhkck5Dyab>dX=Pi^FuY*DlShKd_qAA$VQ(x6Re%S@bN0o@eej1URQU<#=B(1RY+T>pZ&UP&KeODReuiD0i;?LG zU{a19m7r-ed`FpLsh3kQ_Dg{Z9|g)jK9$o;XG}PA-2u17UJ-KPO_wkbl>QKT`Gpox z*>R2|0~y{S*Y$EO=aVkYC7X=fGuNa~HEgFVX?*6CREE6q zm~(r02i@m#0^W)`;VrTt zk?YOvw5ya31=h{()=Fs{J@OFbykW1oUrMw~zjHR)Mx{F~U!Vw4QrNPZ4VjWaImfoX zojr31db_3}_rznlpa-xcJQ}xgAYSb$70g5*80xtfb3pw;mfs@y5CwhC1-qQEYyWI@ zqj^Dwx*G(YOFVczi&=-HPK;>fQiLj^+ez1Zs3^}%(OkEe9G{nWN37I*6XFtWD}c=O zZ1ffsiO-fFg%RfCm{Ng^o14`m~CRTsFsEY&jL3(U+e z@dJ~Cm;J{1!uwy$c^SEk`0kv@HC*P{7{tvEKNylizVmfdHX3|aTzBy?*R1)^N}#@% zekN;_z7`H2-fcZ&`?K?#6#l)zqDev5vtev`6z(I1Kh;1FbsJqqJ4tIb>51H3b7Au) zt!lBVRq97Gc)ui@H`%_QFEbG2j}IO9^tB@s$CwgXunyLI6gHKrPhAwRtSSj&} z3up^5;`rq<8t16pZVn{S6LyuHuH`bF};F| z0LT+k`QGN-*JS%@@u=xB@Omvy;_gcorl8L?mP4B5(K8=Os2ick&mcd8QecMKz}?e% zmsWvuP6=(KZws%-zWz-QnJtZzX*vXr>&~7i%%3o!$iyizsRgPUJ-wb$e)F>|Bkg%n z=4Pt?9o~`Q`)qME%%*Q)}%F^3P{tBBqqEU-r&ZU$fQm z#|GL#5|*KI^(xyUUKXLPgIE@fC58cO49)hJtWA1=Q4x#ty7sjEePpNrq~>mbQ3Zk5 zVSBGP{Y6BBfzBQJY#@4-LPz%ShUgS}(=M$PE${aVTMDKnh55YY1(BDVa7QMTj0_hR z_Ik!8oFz`L@67w$SgD8!rCw+(Prf%kv*Ur4kP!8q+Q-xmXC*3{tysbYaFZ?wqH~Z5 zHK8A2g?TwNbmNX^T9=>5=d)7f@u%0_+Ahm6s|I{HO?ir5UWDv9?6U2o+_aUgx%~aB zxbeND6iR;Hc-**eae46dl2t#b-T4hCn(Zuf<0fvr5y1ezf(cQ)TGX1=VRyE@RW#v@ z(kW;09t>&S7qhD*C!mU)09PzSPD?Hm|%|QLOuu z)!R=`z81jxcDwGG3^DL!4F(dct*$vCbmYV#CZ=zQg939)1tbSB4SuO*yoyy^j zwE+uwnb{MuxW)$g_y}oFyC>i76ba6JN%&zjNB_zB$q}Vss67SMq^-~z;71k`b5de; z@DeJ5X0CNpp=oW;LoGVir1xM!QEhhE;;ee=HNoKVhBC(N>t~joFW~JoV~B~}Lx`sm zuYBjl&(qGAms$00m}tJxT~#c5L_DVgcO{|---%c>Oj9{IPc3y!?m&D)eD>1q6>^*Y zywJcVh{PD1KI1ZeZ%#GD1DS7b>K!=Z+G{qRp@k$^91nlJ*-#e?JONFLv)--aYHV9C z2r(F}1N_#G3~j7lrhR2)ITB3>q(*-x+Rb#Jd5Jf5qqo$Ht;s6+KTkZ1Gd%e<`r?s+ zo1i-nZ!ZZY9zbV?gFoZrk^V|EnmOE8r5(tAcIG!Kt)g!i7%JL2yKTYTDSFhe6goWr zbV}*Wnx844x8sh!cdjJLtL27>LQyA1_+hTw`>)!{7JCQenGLUpxVqXb?-=#R@m1zc zhS%e`V3O>NmAox#{D4udw`8W9u6Q2whL(Z0hB!%7Ny>L`ZPD8}!?hR9^=Rl?7j)TCBtY{@~Uc8ps+^d%c@`?l##O2;$z?$93hjW)uymLWj2W()(#n6@D3N7 zsQa3a)#TH~gQqzfT|&pY>T2gicT z5xFFY!#ByQv_6^;@y47e)7fW_w>lDj5fb4~1pwhSJRL2l zOJWBIaSTI${!xJl6`2tpXwZ+&9ImR|$ojG?i}&t4X0IXrq_S$+M`l9}m)?jtrHHUi zCqlclJqzvwu`y!ir!uT!{TW7!Ep>HjN$L>4z4PA9= zJlQ88HgfPzAnt|05~kEA+^tP2?7W8gM3DZYX!U6jy*sRTLg5~(CY?sgEyL{inyr8U zUlNKrB3Vv9CZl#LL}dni(i_=ovR5ALFKWF{3hMo?nhfGK{)o%19eIBo9!Zg-22pNW zIk0Utb*Y+b6Q@h_lRi4dQ)8|uwvr-H+Iw5z;_?Pl@M{`FuxNG^MI}e1BU>9J#<_jb zV68|wWg~7A6hDV<-D31V-#2Ue7E@9>z98=KB)@!ths(y1xA1lSoR~N7DR2;8=&etr zO$4o{?ON=5;*^CK`3AXUur6%PRZg7yd~KO1LMi^BSf`QTK?G-Jn<7q($kKu9?#WRW z#2ge10f(*QYuDS(OtF@iUz7EF8a?@egcN^CX%XGAvY?{j3J!)YZch2F8Q--H7&M>V zmFWFqEC2~hTvt3!cJzPvQ(_j%?V~Kv|E}p}OWgkHx@^2iDu3^XrA#%G&=fwrlB}Qi z6ZEzA4D^1rM3Y;$D~o?*N=j&}3*Mb@xvqEP%CItG%NU<($g;`2)$T`tHkb z=$4H>8P-cSi?P(OrbgaxGcfpF2@aL@|^0 zQL=>85cIdH$hGy3T|cav-%3}qb;%&aC1gBzdlT7XvbXhe>U1ysvcCy{oxY~r6_EEN zcRyvm)na*u)^-njjiV z!2%v>tSyHiE*B(0>~6>XSKrO-Szj-x`o`YAZtagaxzaNG`8BNaW&>|0i|K=#Jh_iT z9t3kr|M-8{)KNMbX@XB>9hFQ(baD)wo>$6TGvBh_b6T+}pQr6`MLo=sZEwF}SX)v# z13yDWDxV=*jH>2Y3cW>Nm(lX$Ki(5isHzVFYF;G49BQ!T#Mt;B+Cieb#P5D5Yt&WrOJC(1+8i7c 
zeiEfoB@YZ4#lW)&m0R&Ih6&%W#u~-r6PMd~-aapgl>Ex%dglka6FZ(a(_E17)eg+E zJK|rM$7<_;yZ&ZVb*ek9UzH15#N@kQa|2J$W_Z46!oRW--9ELF?}#-$SPPI}X+>EN z%t6O{4a$}&)U?n-I|ugw&$AJqoTQr@zq+M`7b|=cd9|uuk6vc)Gqi2V@T@yZ446f~ zgI@}1RAw%g@p?|_Ze0lFB)5s~>hsmAPanNczZ+H${)T-e6=qbRd=Umk!+1D?=^gx3 zfT&8RrhFP@m~jhaw%h=EsvBR?sEb^2gGXqo@O5-N6VIeBNAq zP#wdw!YTXq;j!e+89OW1<31XDPpy)pUN<=Md4oO;??73+-+f&4?#Z6t$e!wmpN&r@ zNmOf@ZtNTJ#08zU7-r4e8AW_tgN_bQ2YrH@DWCmj84kyv$=Z3|Qi))Ox1o+w?$D_w zp6Xjrx>A&QKLH-voDK=)CR!O$%j<5yrliKKec=f%s5pUg$@h>kJFcRD`*h2qKmCu8 zy&U59?Hur;s==GR-_CZYoN*Vbd^eI#Jpye>^Ji`L;NN>|;A|eNhGjS zu={fd9f2v5c}L?mG9#upatK-c6yM(QuXfH)2RC1iMYF6W7z?1!nRONF}yoA$YM$bpicVTzQ8GqCQ;=6A^$qxpa`} z5I1){HYvi_z8!X^pufn^I5R&$W($*$@EAw*-OW`qQcg2zQ*-4CHz>lJd4r*(w-88cM0CA0nY{)ejj^(2vYAWuqks7KXIebym9p94}vuH z)Yzmc)}+zT?qVFn8)JUYc}dy!sP#Kta2?u&#c=?F4zrrmG$M?9M`Bst%aXOSPhuam zShePysB`%amuS{&?KmEhC%ls{m*ga!U%H9j=bccA>(|pad89EuUaTw#AX+3bVobj> zllSquE0p`airme8Qv|2sYo8=do6bbP*S@Ww7bH~4M*4~>1O_78o8PARqDIcR$dsER zF)Tq;(Ix1JL_K*^@gAOiRhge5bF2{DG!pk*seZd9;0L<8PJq~g1quK!_j))d=D5xC zeuPf)S#39=aKfk2bFi)y0lbC0PgVV6k#@zhh_ta9R8}s`XJ3vFrfylA|12xO)0~-z zz`Hb#(Z8VydLtVxIHMN&r8?R!zO8n)EDq6`B-9xyc`qgI>1tTE@caC0_#phuY{r7lr;P4*f^3|2FS- zQGW$<0x_RMw4!0G&wOsPI>OcZ`=dGO?%2sCk^S&M3qk6dC!sgrpA@{V^q=6 zA2;}VIOws^nuV5oiAfL{pa4rXJLum#1kG#$Qt(qUuXdF@%J!>E-eo{`2d<4mgicg* zAH)#!hYvv|b$mT(KSpEJsyrX9oIeEB=n^Y^NzNFa&Tq8#niT>9*V_LRCB+KV)+!?I z{%2bB>%IT^QShtJ{=Uy@>FbN#y_^xFZok{`_T=T_JZJ_*g-g`uk~EmiZh=?}&wzCV zuBh}k-!|D}!1-ZCP9SA(t6aQj%wSg)^2PhC&%tRTHfz;$5p2C&bG5p3+-e2jc+CaZ zhStP1bR^p`?LX>S3Eiuj@5XM$XB{L?D+8dvXsg1Xhdbs9W;*ks(*O*}9H^j6t4m z(X|b;_y0~}5c%CK$Y+gog!=C34(2J8-g>^3`Z)xLsc2mZ-7SXBpmpQv@i$ki0KXvC z-aqro{?(u6y1!OSkk>AZdeShGF?K!W3Lwy-g#mUEr|}Ja6>g^KnZe)rKD7I+JotLq z;bK~-aP&VjF!HNyzZeMr6=Q%+P)!Fg{Y#d4qw1MWjXea#`VRU)-azIhQJa7@lTpFC z%y;*p-|^4o8-F~pa^S!C{G|gSti*o_VQnY+_fu!pDuZa;oW6ZJDJ8Kj9{(O>`s?q% zs0sg6W8lRUFK40E+9lfEip0&5BZnXs{AQO$odzagTNaN!(!P4Oj^s`o1xGcjJ*(bK zg&IC0^$R)e{FwR*T{9>toy@)s8S7c-@+sV3^TiBIt}sO{O97dh2h)G)CBsv|DUmM- zmjx8Tku_R8+jdp3*d_6zQ-ajpoYNZyHuQ^bDQZ0Z>gCo~HgYC=VJYPpoDS3LLTe*$ zgn0#jb+Z#EB(@RUSBoY1P&0NV#yGuSm1td_Q|CTX#&}IYO_BjEV|PT}s$HqeNm2TK z!PAn<(rG|ST@0{spu3&+Cv6km;qbqZ9ge1?(vLYhi<8?fTqQ*{dn*N;vS7^_r^Y|3 zEN?k_zdqn*>k3Te{Qxn|r}E7Un`yekL9nwrenk~SAYP^ z+6Z#`emCY4#`#nVEP|Pz@p50jK0Ii_Wl;-WGd#b)F`239f`{H8yhU5wQI9yX=bahv zN}-cjf_3sSX^AZe4M|d&YG?ImxztB^LBKVBNNCCO4@uJo;N>^HtjLekz2}Lv87j-q z(ViTcK;%L0oV!EX{|Xs91kFnKyi#UU&s=f$kw?B0B%i)6Yx6S4ZUj;9ny`VnT=y0i z5WP~z(kr!Kp66iYbdK)j(ckireHeL+OV?dv#d|ZnSIal2ORGLH5x#inAg(qe_+9e_ z6WzpYVR}o#kk~PcM8Rcv)JM%#B71tTAPJ1pd!&LeLnQ^lf52NY=bZhWaEanFy766g zxQUjgLb<0euRi0Xp8o*eG@J{7htTR}`U-O9lw{;0jT|#cl2Uo3V5;a2guQ6lpe8Z9Te!c+U(MX96rf{ z7(W*_c`D50nFY8qacXLSkMI7+mYudphB8J45uBVDv~zHjq6*-AL9iwc2Rr*Mj!IH+zPkU#7r|>L3BrE5+j~C zR%Lo2%HYXtTlLlCgcGLFn&M=ogp6&(hhPkkC?TCt4aOsal9Cf3$LFOY~7)9by4(f@hbtXN$4We zv7;voPGf*bVjzD|S*j1KinmBOkpqY*w!1yfmK=h7p!zPVIw#j1W+mPtzGa1M4Vi8Q zCl;@`e~K-xz2fj3QB{PTy0~w73H@aT&583&=k^JmM|pmQtzy^~UFRyEW|+qE8aih! 
zvav0L&%rr}b2ISlD#Hk=)*>ItCm+t?9?*Nwo8|0tO-M=}_o2cYgPV2pwe-^kuhu4d zJvhkyut47-_>_&x(@M*S2Vy?i(;}@7K?ayRut4I*A?P}S09eb81Rzc!I;-tx^HO$!nC_k&sf{hxAiLX>wn%(Pn1Q^2NAS_wJ`9ud_oi=SC~TjKfHS=>Dk~QQmSCy+e<%xdx36H6t$ao38Q22?_ zVH+{6)=a%Ojg7oC{7{jkKXT5VeQoIeDRMjG31UxYtwOleM;-G@>IK&hTAcFsdn*qYY-%4h_VVg=!2#HxJMWd!tt_C;Lkb zE>s!7lY5r7bo-@wIUZK@c{|Esac$~QGBiEu-`b#(EucmBhHIc(45P7 zbM18IiOeUJ$bQG1;CkAMV1SDMB|MnGHY~Ll^N?tj8=-S4S`4kxtgOUh`4%{d*YNZ0 zk!+c~tK*;a1&i+M?f|yDC-JF1jogW6@JmJUP)w;) z(LKozo{)P3e4isv&bt}2k-o@BebP(YOxWp?TK|qo-2&!`AL#T$&~P$_7G^^8b|CTL z(3DUTgz5v_y5&-I1I2^J&M^cd+wD;V@p=dFtc9%&!0tlO6e#b16N8X+-C)ea;W)eh%fw#{vc@X*#orbZ2oxC^46p?sdCdy~13YL(q zk(EV~UsNRfe7SjG*j>6Gvd;Px+?pbYX}!LZ$BtP0+Kwd)y0prUT5gAH8M(9%1BDJ% zU3Dny$+|;OT-K`b#WN`&bp2M?lzbHtiN9cmw`mH#U*OE^^h)lO6W8czB=r4Jw}EKC z;Vq9|0lXqhct)uHO959^CjKZGP~dF5OzXkyilQitwLr;ak1{jU%KrFyXsN@K86jbT zcoFT6f(@5c!oifeDJ$pqBz5sEq6dWS8F~y&w1$oD(}LU&L9d1}6gcv4USoi8ptqy3 zh@L8x2jYN#2{GuXwxb3>Q2a}%5&q_Q$vd;iW6(>_E?vitqxUrzg)ww*4?!lRtvPJS z93Iku;aGWo^@-F$!8!23ioN14prd-P2e1(ZaUCF#;nU^RKq!3m4s3Y@Q9ndF-u zX)2QCdHMW=!&iLoD}Ux$_R33^+@7P)eUEROLcYL7QA@iwe8xJjKXTK4GiE-1@ zXT0_(D~-?#0HukTMR=IDwvQ8b8`A&|-P_$-51Emz(ZyG5CN5qQuX_jN5eG`j{C4y= zc%EoFlN;=YAr|vZJQL7LdKZwvtB#+Mki9+~1VMxLW^Ted+nSgdg^j}!IjTu0u9!aE zrv9+41A91Xc!s_e3Z9^-6VnU4L)9O|g(;XCM|L>Su_5WoX5t|mH_*WI#|~5fLNqlV zm~kpHF*bXU^|STDb;uWlqUV9ot5JVJUFOi}!K=;wo7j0|`fPib>iJ>Q-}T)j0b;sp zhtXik$TcUAwb2dB8uOS89Z9jAA7eB3&&VnA?0sFmn)6V-EN-qE@)Tk%!3P+v+T!3H z+LbW@`@s_{+t%ApZgZb#&5FCrqEEO z#48vhycw=^2zmpXO(H&7)rzyY`m!O=^?0XTGz&c4m+1u?Y01sy>|{+``Bi~n8*2~C z_w9|xn3q)$SKQ**eI(9NpGNm5=vhRp){)WocFl^?njaqsj&;(WKP%Xp^?Gef48YTw z<2>r;9#na}EjM^o=WPTifG-GB95RXY(+HvwguO;-X2ohp{CIt3)iE1*j}v*Aew46I z`i5SWAI;~q!fNB?A$3paL>}y?^OmcJgIN2DG zwzOYxVN%ACbQS#(;2Li6QbQUOg&I55Z(Pd}uB`;^%-`8=$r_y(Kzg7tdAJ@zI37z8 zlL#9l6rM@3iOs-RmRwx$Pjbv$AiMx(Zk!0s4%b7c*rOOHUsh9|bO>Vgg%mKVRoEyW1AhJ^-)AT{OFgToHREL3xP>HsH0d}UGn#x~mw4-}hll=MXK{7s zH-m%7W?_Es?(?zqHqX0+0(S~qJhKlX18rzD(vu&BxUz-2&iH7E!_}WAl5dXTD0F1VVFnXyT2}hxIweV4=VyZ8h3q>n%&* z1$w%j?++2pKh)9xNT>bJ{>(i$x8q-G$_;hI9)AD2x$c^qE2%E@nJ!-nop!V=E7xvJ z{LNTqzB@0FHn$mV%qenpUubs^9qu1MHmd5ng2HXL8J~|LjF&=WYTT@?Mk-0!Xg1;? 
z{`Es*Y`)T$@Cc@m^-p2Xd|*MKi<2LMZ0jzBb)kP(z;KJv6w;I8Tb+%f%* zURZ`s$2XkWx2*Fu(TZLsE(W6mi0>|QP!iKBn8+q^2x3#CT%kz;(yD*^DU$$gf70rcW8z&SP>m;kXYs9o>?1SbNciB$St zB_)IBVb7WDt}Q(RP)bY$WEql;nAXEk^uf3!Vn6LQDTH=h^AOZ*C;=o^rBlK-?`#N8 ztU_jF@L5#)JtoR&U_A~^r?Dcq5kC*jn#%q~LiDER-~7tIZN&54M|<*+A9eG5edgVN`z!yZ z5d{{Z%2{d$922j9_4U6Fm|uPUYh3@DLw?O6f4xlony>$6)%m|;ZPWwmUg!LM>}NF* z%&U*SZI-(;Q4=dq3oEj5qz3{!24gSfKwSBTI9{f)FKv~>E`KcisY1pDJeDPMsHfVQ zZ;(*l2#J6*6XQ8ci`^~rL^l0|-%j0LwQ`QWN@(D%N4NVH+xmEAnKvRVZ;np@wZ{2~ z&??DKfhXsgVS*{HR~R@>tTf*Z;O_Mw!scLuyF_$-tJ4W}p*X}V2>bA?K_J05PsE|> z@grCIqt=RI?MkHUyiA`j8rB%@5Uk<{zYoRn6_um~=!+D7(o6paATv$R{<~n4fA4$Q zzNGgMoOFhqUs;<`Pc=|QZ@s)P$95AbX%~5|d#QR=t6aOv<^Ct2jFVY&xi*tD4V@V+ zP@|NV0fpB9aqm(eE*>W?Y~>`<>)9IReB~f;ID5ek$qn5z*QgKj3clFFivO0%Ntg|51y#0fst2bduSc7%)JAkcywGS#0&8D zHAA@TMF_I>*aJ5(9fVhOA5CCM>Y^rj9Q!>qT4v>7q8edBIgg%%jht|i zA^4O+59V?#(2MQZq1p#MM8r+xDLmK#IHpbW0X}hmQNY=zHr!OEj}0ur`bE~uF+1V2P_a{ zT1}oJhR2Mxkfb?*JRo;YbcY>t2Q&RiJ|qesZ{9yL<6I*5(XYht7Lm$1+-s%3a(^4Q zg;m>-2B;5`z72mc9}a*13o{0U7Q{zjV!m?Sp7k)GsI?;iMx85Mu59vn=?K^@6iht# zgQmZ-z$WML1$!@^>q5V6640!@ZN-Oap(3{d^aJ$I`kzQJ(7P9_dq{Iav_O06!$HJ6 zhtDC%41l_SYT0Sd-t_~A4ggGz$6(MOxA-5qve~z9&p~U-F(pnh5Wb=nxShd=oVI_Bi+pOS5+IDLCKwTu@epnK`5P)}c9GSoDFBZmAicjsMW-1t zeo-P-Npwk}!}zQ7QemZev1zsHT<`m7WT#=$o9^osp`}+JW`DVAnjr3kAX@oW*uM|` zG-pkKq`I7*1PIv-%lq}49JgL<+YbBNBl6)MG;E?{)LIJDAf8;gQ83+81HFLQ&>)Qy zEb)$f*N8$hm3>+}E3QC%thPe>7$uZxNGW1hx|=qHIg3`lP9M3Qxl*fCOHiP)7sqE)Fxv!;R1&gE&%& zg?9eeA1<5k!=>O)Uhz1k+|i@sFJ@;uK!jChT(x0I!`42^sMERywAa6P3pd5v!;;kp z!L@$#zdgX;S95AsN!#FLS)$X#7o^ksVuS|G-+^FCXuy~7ohQm5zKG85H#x^$&sJRj z;giIdViK#oE{e~rfgLM4(5)Y8Qn!;c_EmA&FRN2CWHZqI`nXD>lXQ~+6oz90aaluv zEyVpS^!RBClo}0eJEm)d_khzzpYR&tdmXq1d%)dy2tqpmbo+!gS2Vc>15kqm5Xx9) z%zUbMI_-NrhNQF>UeGfuL-`NaBME5ia02d(I zXI6p?TsKaxZI@y6Fptp(LMebb{Wmo2XCy`$?@s$IR>|rhrROYwz)7h3Vm7=EL08DL zi2bjK-6C#^0eTEIwnfw02Y$sCfYPm4OB3@gV1bBj5kNWs1Jh#Y-~ejujZCAA=yyRL z4M-*ZDCHGdHC3o!TwV+slVh&c{}9O3m(vSFw+rVfA(OPn($}{_P-U1V1jAPYz8irO ztfN;>956+_pr?91)Gws6TviIk#)#BqwY##4hKq`tGZy9%;aMTbqMY7* z^b&8~+RdqY{Bd6`;ksuBTl=V=Okou843?J&Uu#spTC6$FP%6?s%@VFkmkRM5mNVda z4GY4s!TIdL4E|oZ-VzUEddRH7!H=&C2HvTXX?>zcX)HdfV7R3k<>}^^F=}DnI^Uzn z+;-VxZ1Z!`&a$@Xc06J295Re#iM-Gq=7QnFJgaC4zyo}@n>dGCXwB;>C-&dOyjUOB zKTsyUP?{2I{H~{Ok+-cZ{)h7eWzKNd9ZV`EohH=00D?3jWW*uJQO;YFW3=U;K8Gvz=v&uF=gtpMcaaS^=T~68$bL;y4A(LIC%!AC=lg%sYF2PTzUgHpTF7?T>rTBJL?O#@gZ{-$@3HiaC7@IHO@- z;{mJsPU_fxTcJ6dvrUC|N{qMCG|GVa^|29RH*>a*A6)}+agsNx@tMgtOJ$x|$p<+% ze@^)LYE!WnmG2LRr4zlR#bVA3Z_~6=uDr+(1#a3ta-I1FYpaD|VGn6TTA-Eco}fnS zfzccG!HQhMF5yru8|Nq6FO>}rnaisx-IVRThfjEDsK)zv0VdScA4$olj?v!Ou<6~9 znkUnI!6h8u=SR&LZkEk;AjGD5lo|V9x57*)7A|90LXeJrjc%aq#SvpSAjn9RP=GlD zc?JzJM4hb@UX#~*|KVG0!j>3}XfpLe3aJ6O3b)6^F zZg?)%&8^8MdZwP#{*-9$NSgzgNx>D=RLHVDz|FWILA#+wk#5~MbqKm=NV`9Zz=s~x zg;MvakJ>TIEGXI$@U}99tjT-`f(Y&xuP_lEkYlX0U^FGF2&f9NbrI2oFk;V~6FY** z8`)F@&6+%0)Xc+9pgq|)cxI(4;uq-Z7mAk`)lig((~VA6l$ILmx+?>FCD|$3W<2m> z&l08zZ$oUmTAU|M0dw2=y`AL0)z=TW{e>eVdJUNAvS@+_s4_MR*>G-Zv*beq_CWnRb+~CLsIEX@mW}n{W3B7_Mx4^$LX}n zPXM~>HDNfPLcjPh!c3I~YAh(YU4=+gMuslSaO#;~L!SGS>i2zGEk^_56#O#fZc*)k zKW}PJ_L$wOiJe}CTSU5`eufPrap}oIZ6#X1z}Kis_fyVyyLN?)!9haR3PN&0TU}T> zU8=;%gnWB&-Lp#)Dzc-UqVN_*F8ZxufOgDqmtlH~b%faSJrNqZ+fPI0j;pw)%Z)Ey z6=bUVl(Wx9*)?3vTE8)e-9XZV#z%lLwI5WtP zFP*O2tbUKEHANk(Ij79$FOJ&}%79*5WUz^w_aXzi%C#TB=3krDzOfxlI!i6iJe!uo z_;j@`{it1|JyNPzc+tu0VCO&qs@j42MEo3@Xb3y)e`of7#`e~+6H`BzP$g#YO-aav z5TaKLer|Xfv9U0Rfjs=F%wwYdp-eY>@~WbXe=A6NHMLub({L#Wo;Sm&zUPQuW3kvX zL`3(((To23EKh1P`J-1JB6<*-@`tGCAG+>hUqiO_C^vv@M&WS6Uz1_h=cs+< zM4N+~ju1TmDsl)=9b#gbs{Wzx0VImbU}`~)OwrmsyoYd8EL zeq4ozKAR1fC~0&TpF!0j5!*8N)CdqrV7( 
zo;dheb0^!(EW13W3o4=?&Xh#=DQC7WC9O(y&i2vsAKl3(ndao}z!!R~I}>x)0Q#tb_hvHTgA+ zqIlpvfyD*RxyC*8fo8rVV)~uho8=nRnyN2=u3$?x$z^u;yS&oSt|6L(MpI0PPBQ^_ zO7gv$*vp3Bj$4Q~Po7$zp>qPB-zU}L{3d;hcP#mGQm0f9w%1;0GKAcVJo#f&A##xT zS^z0pQdH>mT=NDXo6kU*kkkm~_<}Dp01uMAj@KAUaIUQrP8Zl<$-SvMjqu}l4md?< z?Xh%UlFyZl4BuN@(766QY!I9c+|&T!*}rsy0r*|PrkJ8k9?YbyZDT_y`qr*T#9PR# z^BL;}QU&Y&csxBJ3x|5`*;aB^2pVHnQR2?CEs%HS>hw&kQC^(rPuvObE?=+j0CSy# zn3vV198ot$=;sT_7}+_;7k*OKTIi+P>~Z+*;+r>~$_Yh7bsUymMx8eo$LH9SjQp~BQUjq^*gmF2)q)$mICz#{^DX|*ycL%T?3yKigUY!hBXhIBVNSWoCio}5di z4`jNC^FcQq)nw^>4i!iwI}oIYJ&)u%09!N$W2sXwvemz9d_5P-TYduI)p{8AJib{u zn42n{uH<${Ph^f5s{D1RQYXH@zNpa65s*gmNz8YR>Kdx$N*ma?g`NKzr#5@ykqh!4?zf$ z;SF=>J9_pd>eh-`slT=}W=wFK614#g%se3wUB*CDa;WCg;h z0p(`0%1Yd5>-f4$DruHqg4U0YsnG=lN><)UhQ0Xmb(#2cR{iZ!@g&14+7FX>g5k!c zX}?c%omFv6l#}IgU}{QT`Z=_=MF>(;ou35c!ig>*AOC?v9`^nT*!&7%F%8I5seCB; zqq>zsA47UtE?f|H8Sr)cc9DA4P9sa;L;PLa3z zCMEzjn#iH)C%{I|B}VI*-kjzxdj+H`o&2NE|B>q^oW~o0Q;9ZnXCW=}bZ}Z5&AaV8 zl)+5^O6y?yF&%juJ&q2-viS=ErgUz^kncP51?-EXKUhsQ)Fz_TX!4E0yb7;~T;?)C ziZ-(-I{V(W6qbR-zAdXcfAN=%1;e_ro_hwz{0x^qdDWjUek=0cztk!%fO{Q4m-;&i zMd7nXw(Waiv=Rv1Uu_1(M8v2k-Nh@T7{6G$DtkYD*pbrYWx)IKab!!K9A0f*2)D0W zA;ZTpE5~*Jy`lXvEsvfRv_EX1EF`8iYDxuq;=5 zZi`~qA&y(G3}0S|xVza3*%|A;heUr6KsQ}KU^;KTr@8qhT^0)@7*NGNk5cy zUtx7p&O4|7_M9(EP>7ZBFse+Z-qa}k{hIFNH}G>&(U+~``)^Yf|3rSD{@x;75;c&1uv=yT_RC-oh@M^plOXugkqiSMD_7|(s zz}b)!evzVIV~XNjMYfvfgY-npEjP~UDd8R$kiS$ zYab@>52ECfpLVbceEu4>H@qH;uNt3;56#5&yc06=S3(zs#CEffEvT~G^*v^}UAeJc zcu^QVa?B%&{r-r&?6GstPb!_1rpyix!00jdJ$w}LvhtrUvB0U0sbTv|$-04&VISAG z`*cO$Y1?Z8C~jqZ{t%atReopwpJ*4t~NLFz1zfOV$!)Uz2GzEr%fT zQd=uc38wAsM;IeY=iR56?VfU z3_sHj`fbCK8SGVHqlj*_iD+XYmawgT2=Y@hsW2yy&Jx{o?#x12 z%u&p1?43n->L4Oxr|2)mDRr8^TJCg}528O7V4k;1vUrtgwkAI~@WSBMGwVbEdTHP(++OZ|QwOsT(iKoG8P&9~U{9&Q?AVZ+CN~4`NmZW#^4#e z7sh2vY@Lg*D|xC#uQ_=l`NA2NHxWD6mAG||wA!4O4}`w#RYBZeNO0oej!N6%jaAuR^jcb+!G6tOIhbG3zeI9S)P2{mgEdg;3~~Em`7ReWhVQNl=SK-VBA4F%$~Azj zI>|5BHJXK1G2ThU?M6L4xvK=Bo!%u=~YLQ7YVxk{VF#Q-SsR3w}$=m=v}pZ>!*;NM%bMK^ zqcJ)36nGolA9dDKb9a)D3--``&w2Eu2z*WWnES|k#GM*`vL^*HkQOSzX~$6|-3C_p z&b~SUHj9X6%52<2$IYEddV9U{f06f{VNJDbx+p4&NKuhaRGLbYCP+y>R0u^tK|rKL zML?Q#2rW?&=|n(45J7q;C{1eU9qGM?-V;g)X`c1%J$q*F+1H$z^Ua*O_L=j8D`X`r zd9&90KJQcR`+hR-w~0D7diE=AMwf)mI*KK%T74?uyRZE6xme;3&88NZ+ z^}q^B;e26NU|a4vPA5Q2Xr5v7elp)kp|U1k=3&LqL`QSf8X>RuRd-EpoaXU3y4?qt zPef{RF-$$97Gi9U()U zM~S`JK1MQr6N&{(whY3p?mVoE$Wg%5uGSeTU7!0t;gs?P^5qvD{f1J0zCDjE+Q~{o ze5*er8u~ORI9;q;kM3B9Fouz)Oi>HNqFT;bw%oFcOg3@cY;s2Q(oJOZRV39ZU25@y za6{}Qc_?OYb(lFv@8I@6Jo(8XvG;V*dthFh&U@lGI!BNmCOtY0x0$D!m3-z1uPBTD zu5DpBvSW`CudYtk^-XH6%^e^bWB*DZgX%Phc?EfQ;c<)5K~b)~OR-Z}dqlm1EB%?9 z>OL1xU)RhmZZ0Zx+chZ*iGVG9Crx2-voUh>Nme&31}C&eE7tuo$xcAA&lW{%@WhaK z%kW#i=&cKQs5-G4&rUJ?YO3dG$yaczUE+8_#8RA4Xzo*K(>F2o6ox?KRmzojxpzgI z`U`oqq%@1#LJ}T@<1c(PAGFzo*0ThG>F`0DYxA`-uzp7m9^bKTetkUZW~a>unKP&O z9$UVATL1bcvuJCLH~Cb?_uAPeNBErr%}a?Ngp5_<;2)Db>00&jxsST$vbU5LBm>Hk zV`m*Nm3-x5dOKnyRdatu?Ab^93YW$AMqG<@nj+hJ%(TfA2Dz6(gg#8)68-W8)fHjn z_{+1=;5<=$_@o%t0Sz)`d%gVSc1M?qf6_K(Yh5y8oJ`~{JHR|-e4~DNO(W?w(ne<2 zO)4~r-pF9Ij8r$VL@o;#oqDa+lpJ}6ROx<%mPrd_s%DXVnf2Mii^8_Q0{~uvpFo;e zPOMwp9O3hvP`}4^;eGP_dw~#tZT=stl`6;3^&8~LLS53r!`csBE*=71qR9o@+ENgG z`vLF`zvuXh7V0w(2>TT3>a25l={tvt3ZsoOZ^7iv=5Gh}blz5olKI(_$|)=&CfP=w zwdI-)@iW_iddJb!h8(GTRI;(0k=jZ9Tmy4lfCTV!)xQ62Z_HFU#aSlK#E~v+`Opf0 z^{BHG?JX8?+6xsSG)DUiW^3Of?rPSF5~{g)?AndDtH8_vP{vu6mlfVZL^XQIm@x*j zU{$BR_w84X(p;5W6?`gE8ux8BL|S*=l0;h3XR|_bevZhG=vtLj8_q6iG%;~W%L_2Q zc)@>+A(U9HLw)0Sj}$p=+2R@XnEr6?Ei@r0Yvfv#Bp9{VYQ8=jv*i3B6!CpScD>ib86{1{SqNw zF1~brl++X-h&O}nix8%0G;@eEw 
zRmGPN*$)AC730JiJ)#5&d2*qq>-DSs=^rlZeHvW>>v!h}4+k)3u}CH!nExxg2L2SAhV$#y9*@bG>8oO(Xhk z9t#*&o!qU4TO%%E@ZJ)Qmg%6rD~AP66AN=YO)xsj^??Hr!>Y5b@q=|2)#bJy9fGcR zuUmP!c2mCIey-8Rr{iHELap&TXT{StFMgc-=#z7yYll4>xY9s0YV(mXFz+~MVEE30 zZ1G*J{QFUz5;wt1Mx91NuD5ocH<#2gOY5G}rRzXDz5o7Gq-FWyH}CK%+m`990ZhCY z@{#L8{@nHcSjFqNAt~72^XR|T<~sUWt2c*60ZjAv$P4=rp7rwGX_Gay{AXSzzs8a~ zfDH*g5pmQYpwOy@sBMPZj&yyKwH57r5UpM9VhfUglNj+*%TyR6bC~sa=nzuSzu{mV zX=2SyfGfzhUbU|N%v^oj_rtrzoWwh1&7E*8Mi3Ve^lj|Mj}Yvdf^f9)0I6h^7|)|s z@HDrR{OKc5pIh~dB4v_KB0H4v9MnK0;ho1i0DPXMeBKH9MK|*ez#6vOV=@0hcWN5Z zhb)6NVS+JC2a@!}CnGfNT1`M1%2!$Q;==+N!3-+lxo-EdiaNIq%2EQZ$P#@0aZdlM zl+t&VRVJ>}vy1+CLuhw@twH0#%$w6=UA(w<#f+$%R?qe? zbEm5z)jDrzW`Dz9pL~Sa_2VMur&G*vI@uu(d;+zGY_q1KR)yPiM{oE%D`xKapnqHj z^!IV`4YM5XY^lWW(;@HTD=LFGlc&d?e!q3P;ZJMr#c{}a1V2Rsk8GrR`>D-s4BEOx zL<^k5>+rkk-&4Lxg+5SS#@wT@kv)i#58v`2`EBJD)n@3t3xE4jE>hWgFO=n3P+Kl~ z*+LLEDAk2Wnsyp|bn@t=t{$0s)=Flrl53cK@03AoLi!Kty7$B_38(P)yd0Dv*w0E~ z&k+V5Odr}AQqL^Yqs+odjQf&70QNFF$UXnZ0ZXt?m%Zw3GGCrtjUZ~Lqq_UqW-3dq z?|oz$W|JytVT)J2qnK}E@rrr{lIc7nLj3_gZ_64rPn0Yh@0~uwF^=se>)ql+oPOy& z%k|vyz(d!L1d9POymLlI2SzM;OCS6WVg7&@wLWPNVS&pu&Dwl_e@+BCkZL{2UEp%} z;?ojdW;%Mol;+XBwkt+a3F^lLQJMZMcwa&yF$iyWUG{Ur1!!N=%I2*M>~GA_FISE- z+Oj(<8z!R?k47-MxY2-bx6KvHT%v2W;;Lpw>UQ$IM)zz2l9!#EwWP92Q3s@xz5r>i zV_}x~8~EmGMM=lf*m?x@i$~0xrKt}WQ-1q?;<^7ZG+pM2T&C<3uBXwM9%%1wMtUveKSkwqLq*A9X95EmAEfI%k+V6xU^wlFT$C#Kyb-_wWffO_MWjOMWbuinrP?~W$}`# z1tj>6ej|VQgN2Y|1$pZa>#d#v#k2s}hgmj12^tf={M&qE2Gmx`+XN`&;X&lNP^vHb zJ;WZ#=nb^1SkQJql6Wxw6RNWoADrT;-?W-mmmVnP_bEtcONfYGOX7-9YWn;QU*-w4 zV4~>HFyUKKSe2INWnb=%?J%VXVu%a8zN2hvh!CU1Br~CsKD>bNmwvKYK~ZWS;B5j^ zLv|gHdW`^hpYJN>vpx&<*6rLs)rmA~q`QOo7Ar_vu1&^pH}|>cX(d(YwhcDgJ=9lK zyJ&RsU#h$RiF@&%xsU%S=OUaD!?U{vK#{!I9Yo@dB|%_GpfWGYMgm^Qb zO$i5Z;-e&f3l|C~VefYWOkYsPfBz*kF7y*6iY}nS4c1wSiq{y|o(s4LHKz%}lsp^y zzKK}6GjOhyxh}X2IS+ktxHt5?!bxpxAojYh7OSyMbiYLl-e3-fLa+#h&q#*Owhx&M zaEUMPT`YZiN&mDF!wLRRXuixhZT-yyhQ)jOVdI4l4AT`|ml8{WjC@shw`o&Aw#N>+IHC!)w3yl&|+L?lN{C&@+Tk6KhZX z03BzUwg3@;?z}ts3vRH(5I>B-WFzt%U~MhD|3i#R_IaP+)>Ky&Atd$mXq9o0T{?H> z+KWFLd;E1A-j&z=6u8xgRtfG*Ws5NVdAv#aAMci|ga{cc*~zL)lN z^~@$`LE~~Vz z+!#U<&rdCYTqxQ(x8Aj3s{n+mjt>pyt?s%b1V%BmZrHBLK8&;i1Xvln#;=Y4^^uhy zEDKG;2%AwXCQwoxkq00xNZ0WV|1VXJEu_eIeuu5&<*6C{=?ruk)R7;9(xQE1qw#x-C6Gc5|P;fi^huK zRC!;~^Tg+X$An=3&4a0IHl$7M_tf~tQFIT+1WnUwdR2)XxVGkrul?>Y*2!Lgx_*#ug=N_jqov-o zoGvK>-3ans-BqGUoAwo4GMf2oUwDoE$5?1?MB@*BVvF^9+vhkEL2|rvYNT%qC};M) z{)65o-GAqK{{)3u^AmI_Y}rNw;&F)wRl6WYUi&y2;2P~;u&6cv0eiQC{@n-;JpaqQ zb+ZrvRlg>exyFvd!;$ID$O&3(`)JFGC^eOWvIp>?HxDv0Rcu9jq5q5#{R>v~|C`50 zk{;%gIt@ckSIVXbd!h^=I?ay=qn;Jt&TKAgE~s$dV#0iC&s{MtI}nYXSPH$Cmh(sB zlmE*zCg0cJCTpU@Kj(fN0TfN?!JCy$yj-4C6>Of!M>s=-`I-yX~<61{V^x{V=qU^hrdBh`nEu#T%0g-DCwPB9^RA#c7t_lo_w3wanGPa{PEBBiTR+PwV*eLte*>GFZJ?EdA*w35`&Oyei6 zdi_&XA4OOq#eGmf`094~@4x+TQEmT|UN4I4!C+NgRXBfF80sQa0N1`k`Q~3; z8~USSA~g$W+evZ8Z$;k9+ei$^+1C{Dg7pVJB7$;}mK1B8M27BAfZ1SJ3gKMiDL|E4 z6Rkrnk;QDHCk$F$j=f7ZH4Qx6D}3&X+)1-v5eGf+7cu@eY)K{*wcM>~`2|H%j*`!(=qs1Yze`VM87Ey3SBlYZ$3 zpo59JOUaKz>*33?KR;uih~^N=wD=FnjG#|s;vCzwsj-W7_r>#zds`V z?@bhcxfQP;;J3V=%`jvgDmrMC_W6Ra{uslb0li=RDe+ zmTxceT~hdXBZowH67iTem8?XYB)-WeULCkU3HO}!tLFy~y3>E~_iVkupCCtyEwm#j zorI#TRq*b2kukfoTua)$w zt{H-{b!VH+dLDF53sTpr{VI@4dn)(SP;wTp+6n=;iGuTg`bWJU<3u3aA5Rg+uz2`i z5}b9vy0=y9w~Fwfr*5*7xcBo!pRj+qo!jP-D;;W^1?X^Cl;Nuz07dRViu%yz6`eS* z2txI%o65Nzf-F}4o#5ETGO`4JjolqNff8e|g|o{mG16y(AwGxz4VS37?n3u9xzf?L%4oo@3#L-r+0 zIjXsO>1n7^%o~#n#F(b?O0rSDJu?KOY)|QV^mL|O=kp_D9{FZi(8w>k4T?_Im3{dS z0Y^>+xd=(Wn?XCoydFl2U*MSs$O+omk7DM1Zl{w zt~Y_4*YckX%Y{r=IIXL_{GJ3fwv2b$eopFvEJOQ-jC-OH!w@Ee?Y11qzzN!SMqpzs 
[base85-encoded git binary patch data for asset/fig_teaser_new.jpg omitted — no human-readable content]
zcB;h|HdHh7k9*=wO3Y=CU%0>^#;cyhyIy_2v0XTa6nbyvmTBI&Zw`CIpQs-$@?k^0 zo3Qg#?8q1-aN@mtVaYHDU2?Gzedrp_G^g~+m3XTde8PhxHTL!bsz>d=rJdTBqw@S% zuHlC3gQner-^q45@psJIT6O+i8|Oyb5-5?2a;X1e`(%Qi-L{{!qbmO~SGeasI^a+m zK^=93ae#VnoDgII?Ufxm;3(yoViTpSOewmu$*&}$YY_lh>>``+0YVw)Hu@?FAp)sw zVY!iXg!0d3pDEb@XkfBezI+*Paxjnw$McZIUNaorOHq8CnS_T=%@K#i=&M%_qq&i} zCWsnLp!ek_=Lci2g-*NLTCRm`3EDn^daq}tMI7rrdsi*wVoo|6;eFb)ns!hLOJA9)2wFNam#3X;w^Ko7~Z)teNRFh@1aMuoXy4m4PwjCg1IM`i` zXdnCZDP7Q5_&Y8nM%@dPj>J>ywrw`6N|GnW-10gkr%HNGMcrWbT)O=p^lZO_G!QP^ zZ?4qApq;+@X+u=B*>khL5upeJI9YjP@N^IxJ#3Utns9tE>cmII8kiry z(dj7Z_--^SEKA43`s-B{;}=LAeiwOys$Nqe7i9HD`%HrUQOSvPMO?G`9XT7p9$yv3 z>$`T)L*a&l7?vql2+z^kVZnmoAmgAiE)>0x$%o!De z7Q$DeJL3yqkF7xF1bI3fl;NqSuT77O^a~vHoYxSG()b~=s__FSPi-S76UWPf0wX9+ zIeoEFjV|p9P-(ay=0)N&PL9O;^aDIA2C4iv`b$!$zU8G$x^BP^xwSxy$iulVY4)*g zwmy%iM8X?5g}?k}@Qg;N^F$**<5j3uqbsj?p${o4Z=!DQQgsnLGJXP-wV3e!Q`yVn zlMrrH1Fcz)Um03Z<8l~2zhao?23=`50dVQ(`MyEor*k>Hy>WFk0sagHDIYaoeN{hU zsl`l+yTVHid)xCsy5AAu#8L@jif42XEJvW`&4fXdq#!KEvv84BNZOJrd~4__@j{)P zqnc}#gD6fKrh&UlE~^=VQfq5T!(tu9z4rUh4yf8RmelA}eUjW8`Mm9pe1pe*#IPU- zGY)o=s%Kc;%fq#Z+bKPz5VdQ+*j~i)tiRM%-KDXMvygafUrx&IyYg&1zV58{ z6@!%J+c=4MZLGN#f5d!Sy;Ot4p)Es7*eCL>9|!ooavlG?wz?gD*;<)C3R5XU>Qf1u zp40~uD%u+f^t~5W&Ny^de@MRjcI(1B@#vq=_b%JFEAx%Q_xr*+BQRs8LbmLs+sC3- zZ!zVRN;P0f!S}^)^J?iOSt-aqQWKMTIrk+aSLrw9Gm*0M-CJcRYSUi8E18(W3j*ZMtLs7P{D++yI z-#B*#_+RCjv3{p+@;Y?H)&0J+U~8I_Di-3s(tM;FO8$aRC@GC}R8cAwkd3M>gel>I5Uj6Ljq9f%C)5pHkVaAoE-l~AJ^T>A;WHCFa zQxt#xQ`F4yx){yqZ1XFS^zX*dxS$ zBgNxoL27aLjG^f}N4X37M~@rUW(`bM^gvs$dZUy$Pz(z0scY)Qk>a?o z{Z)EVRIu>Q{IbTvJD2;_wCkM1IIb#4pAS3C(A=YN%&E0k??W!v;Tnz-)Ep_;_ZAb& z*h>*G>UKQsAZ6%iGT(jvJ%lNH{R)8u41+z-T7R-%5)v5tU0@>BuRp#R^=`PgQxw`j5(ioeBMR^g-}TSU z*WD*=3JQyptG;hn#26IrOBW%ttst$bKAUosleNK=JCzPrf~cu9mD>ew$vJIXzuX_8|`Q zll{uVOR5#}^CU$$b&F z_3|acqsJwjEZ%0)i>g<$`~!o|0KN1P)-M}uK0uwur{ySGzt5bl@WaPsm}cZB<#3LN zUJ6t_$;RW2kKV7|-xVfRIDyD+lam}nCXWJcSLQ6Y;Jo`=24Q46Z zb56YrF6PfGERGvcC&r9Wg~x)!@j~XD%9lR=xU5k7^Gm-Iv|tXxFBl*(djJ*S3EwoQ zT8*WjY{=dfzcFxsxe{K{R(vi{xPJ@L!Q0#0`&8ZD=Eya!tLH`Rcnl6c-|>}ae$3Nb z$E095c(?D6#JThtTmZVLH=Nw)M%vJNkz-%KFYkqWt>du=FEzbHBK9Yqc*^#pt>djO zQu_vEpYEjpXJyS2D+Lz~tTL;8gcCcC2SV4!GGm^q!v@yoIShF9WC4RL9vh7`ZTg>Xc2#CqGrM>|G}W zIdQSfDKQ6qI>zH?JU*xE!J_z%9+3F+f5d5__ zkJ+;(lOLL?ZP&(4g_8C4s7ZOEtn}l9RL2e=@*+n6_>i#LAp8?MF1(EKWv5}SZ0s}d z)nxX8z6jIPL?9xO1)^Av`)u{)Pt=Cc4)}-G!oFpA4Zu{bju_!@2LN#B}?H|NQW}KQfprvm5GIg^)p(!o zrP1SlFp2B^MS+p1M`9I!3D)p8D8Y*^wwEz3qvnqW*`p+IZz}?pqz`=c)5TQ#CQAur z^Ld+qkbnzjzgJvG&6h{jXET$2iSe*2G0)_t$9H(d@c3!C*dwRT>^>Bb&^flU%*&o~ z+)Cw9yr=F_AI2o>YW!;Qo|2{StZ5HFyChsQey!Gi1=8;_+SBu)gK?W8TmCBa@W*cx z2ZE2;eqY>|c*fh`l>V_|BnbYwAYknEJ2%{MvomFWy!;L^w~E87?<`vp9cmCgIy@Vn z{)qH@))kCyPl)Q3X-7_47V(Jw&{^RH$!3P~|E`M04s{bhS=%12V%gj;X z*XYdd_AqSleBtb!&zyudlQ79@>o6PZ-SlZ5bv*;O)k`YRL1*QkhFJggGyczeEtfjE zFhUOZykXF5So(wO!fE(=59=nN`i0~z7#fVuNJN#pj!ud4RKlQO1QGkV&hdGV#=(@( z8@pi1X7IDZ{UH)m(hhYCZ62!RnW~)b7!Ajvw(iHBcm4hQcR%q&t0&Y6OhvqQsvT~Q ze1zt-Y(lZ0DO{LE_RrLgI8siIgjkyqKh}Si`NgJf{rx-b)Dxzvo0Ty<@1c--B}0+5 zM}ATLbH@23%qIzll+PtWt^MvdfDUeYis3PK^3Ke(P-ctxV@z`bh!X!Yp&5UJnOb#j zQ$>$dUz#0)dmi*mNi$$lj5&D>#NZ@SOrz&Wt*&R;w=LT77t^=jzrv6kvC`1y`*iE7 zur(=KtAIf-9cRciUTM07YWR%3gAbnG;1OwWa*`Qr%@BDN`Z<{sR1KI1)RC_)qoR1W z)V~5Q7xa+l6{Y>bLv9}qJND08HavvK0oLUXT;M_odA(!K9Akb3q&){@}$f=ahW_LfFyHrnoOElo@xSG5dk-s-6Wd z7p~kek%Nf4Jks4Y3dtft7>@)<1H0D*T;B08goY}^H92NCGgwk5wH_zA$@2|2?X=&fTa32+<%CTOc zSK`VqaMI;Y3~bfxTkZS_f$3VU*bi3y>E0bx0;_QCS+Fq3LR%LzO6>%VcGO=cS=Odk;#_gQ0)#CcNi{f9d6u90nB0?1Px} zZ#L!c$Vz9In|;nM+n<-8j8zR4m0Z4_^CC)Plubq3k3z0+HRX2)@@FAWneJPM 
z17@q4cXV!B{`~&g*7E%75>F*l<=*u4d;i)l0s{1}2LEE)UkqozTE@1&MVGiDY3BEH z!h=YJ$FXrSHBI^;vzUC%5%O`Sda<#3viM_HW_vY{CdL<#IrxNx=Jc0k9oeahmAV*J>%k5OR)7N0YaMwV({qb4M zc+l#4?TOEE8z|--WE~f?FelYtp(fB2ifm6>#ouM6reWV4sPOpZmDH zM8}6njH&^@7@23g76|?x8S*TvL&Pt22)DURrLs8VULwD`Z78TLOnt%0dU^}CLWX0L zUuQ2rGKShCl173(AI-=K+Bm+mD(}C$>uQ6`A!M-9&q8@cM z7*3g}Isz7JX3~4>OSRutj5gI($iLV|QPC+EyyHT5+L;9Jx~VpM8h&G|t^C;f$%Ww< zLxb~SJHE1~94$sH^qs)R3Z8gtX~lH(-6O`MS6rT>1dgtGusrEzq=3Rv>5HfQr8!lu z@vKTUKwo~E?f5ZH;8OkEq)=X_TXz$S51_xFhX-d(WRhY|wAVZDA8hr%m7xa|_V|tO z`z!g2H@7^?@vOL$*jo*?EMt#I+PkLim|MQ>r0Rh@TqQEhMKSM!}^T4y2Z(C9!ZlY^K94YaazC%~e zpjRT1N=BuDIW4L7MvBc)RO6$!5c6OWlyK;LV<8R_5@^`5FxO2y8DwaynJ|w^(7iio zcEiJgXQU|WO+$X?b9qdN#if86K>@7M6WJ!z z@_-EE*{2Cu(-!6vQBQoWcXM+=wTKVjX4^p%IkPWH4M!@p)t5@@mui&Q9c9zDj?U8~Y!aMD;|>%ZTq3lsTk zlq5cm9^huFe?c!aN-+;=(jVf_GE>o$-odP@G%&1d68CmOYD+3!WmpjK|F>nnWu=8O z-qDM)$#O7RGJqJe0y9|=CQM;m%gRnJ>PRi1iXSQgvWdCK53)1_J9=Mc4g(W0?lgYm zw`?&zefn=vB1jC{R+l~kff`n@oDoeW8&XRbU`_|2G;mh~va3BPg?i-51`HVmzyAFf z+t7nG#_#zFYs5TMYJcLuRX<&H%)Ojgjw@G)toNBL|MNrWYd0RDbR~cd&N-y3YE)HO z_Nh3Twp91jG+jejFf>bAXY|~as7e7);gh1&65-_VS$Lm`Z-UzaL(M42rJ9a}zOo_c zt9kKp%{gD>xG9d88oFj)A>y^{TMVx`LVis>h|d(Ob}^)u`o#lX(6+^VzDx%9`V-vF zEIr-%t$k;+g@R09MGfzg))EJ5!%@vdHa5j5`}$k!T8i#Tez_wzKF)nAsc)&>eXIRZ zG|M~r`7`@P9pGiy*OPWwc30iy*6Wr(eOm6?VHyWk~JQH7wk>x-`gwO!^u*BxyQyr7a0E!QSA^$w@W1 ze-Lce46ywAGdtsFqrj}XAhlgUXO$!CbP~`g4yFW;T~NII%oA68_{ry1M~80b!(uBb ztZ8q|*u4DQnjh+eHW%Vhb=9m$s3Ay*g<^sL=z5xlF8*t z>OHs4`_-c!)R&7q{_*gfxGG5OsbT|@3ej2BG3q&k?PHIDIz;+xTN>JiaMUlrtejlk zG=5UU_xrWkGhxp-=s&RZNkpI9c66#;(PNXIh$4=W&+kRqNe_NNL&gcWB?^QpOGB53 z%Ic-X(+|$qo?@LQ=#EyoWM4P1prAq9Z-C3VC+0wtp#joF7hwPRd*!+P5&=)DD&v#E}O>D(U!j`@BEUD`AKFKxM zeJp70s2|tym;*V(&$l2XC+B0Z*CXAN>kTSu1f#Lmli{4_dt#=6av)W#)_S`t=SSB^ z=13ch4ea<0;`84er>6w&d{qDF0rwL&4!MBikHVvwu>>Xi_1Zehg<*l7)CZBGALi8y z!vj$^r0GN#$cTPiH0XsM#ah)$Gk=oafsY~+yPyWB7JUZ(RH;gM5r){zK|^%MtV%5o z2_LY(7vk4j7LMsbvTu_}hIGh2O)94sa?NI@f6*Z{u4(mm&=t1&w@3IN?)MKT&nrfK zy!v=fyi)hM&&(=+0qcw3UOBsc=!!ga=0?FdzUXv#R@t=Iv8@#l0$6m^{a)JNKRke8 zSd6YsG94$>U=LCGt{zU(xy#$)G;^g6a-i#_DLnu7hS`>^2O~R3i41FZK32l zkk=ljrY4j<;rBC(ot%|0S*wxgGq@_a)k^=ztxfkb$}SHJVRH|rR~FdRw{KU7-(gfk_(4m*Zs|6)^4id)+z#F zOTDmZ_HZW_BA^l1Sl98<@`1B*eQQtkN^TGeWEx6gZZAEuVD|2S59juU!TRzhcU`iNE`R$;kiJruIjvYwzi>Or~8C~;nv;}{h0xOx z2xd;y`@QI$+vajJPJ|m+p5t9na2epanNPQiHk?L-(wW)h^jZ&v)+Q%mkOL0^Pv39f zV?RC8Z>dLjs^u*fOV~d*&fb3U9y%eOrb!TNIk)=IPQ3-I9UL>p3(m^Z#o>_^o^;0JLsDUBmA^%CwnSDuj-l~Gc%a&Hau5(Syj_cD^`8H zek!RK>n0n%F5%?tST7P@W2B}Jgv|X97~wi`;b`_eypSoZGIdu8g-g5!e%}22x(39W zJrMns;!j94iA_MKkF;gX$FA6WJgFB6y;c$v{7`&DIrmCtXy8P;DQ+QktuIXG<3``d zy?UJT!qt8WWT(wHS6*`jvQ9T!2)zu@b-bs3wP;J2iBG2%H9^827lfgZTHFUUUo$$6 z+G`%9!0mUDEKXEGq}i-WqxUK!SI&5&i={Jr39>EHihVX7c|M@= z$#ST6d_uo_1^9PL1faG4Kc-UeELoZ=#oDMa7yXa>JRZ1r^2@#uxl6FGT%<{2Jv_ua zxDLx}X76#~hYkOw{%_u@Y|74EE^d|!#H-@VxB-{CXR}Byv7$1Es0D7KZ6v8_TAjhE zowU`CHbp!v@J5kEg-|#Lo%EyZ~K&*EmwMW>Zt( z2C&@>P55DkK=jBbMST2CKF7%^S8SUIlXihdP2x4Upp$*xbI zH%A2C>*5LL*qQUpbGYdU*&h@B#KxmN=~7<4G{3EFs8!sXF~4(k7f zMf&Gw{I9Jeb;9=1tv6PQfl&Q=la3t=7Kcmdu&@X<0e2gQBBULmb9eA+X)nbzbLiD| z%-4q>asG9NGl`i$X#&nuq1ON>Rulw0|LUN9edsQB0{Ce19=cwKlug(2DSQ#kp(>wy z$!0)p^3>=|B=P~G5n8&09r!(-iuV)0z5A!PD85etG;Y%|!Yio0B4Iy#pA7{UR)C(E zu^`03M=D|n35W$926wPwRkt$7_^gwo$QRENd93uMXYAiN^mMlDDR_c0dh~r+LLTbW z9LV5vqi?^3Y}GennY$2W?^J!xcu{Ru(l;alij|~t_3xRHJe|WPenq#wT$TPVhwJfT z-Qy{{b@f!h_sbwAyS%vUx!c`hf8{D}ve>e20gU`|VHBD5W;}l4G*hf1{E(w7i+Qe; zPut4pKqsiKo!Lc?n4s`XUt9gY6jhY-db<3^T5uYIl^84C0$<5Q;Z0h4;n*^X(y`23E-0=@LDDwF} zFtlQ8aR~&vuMm2k0Xkn4?^Jx}`LMr&e=|%fKJml^`%?wt-9DambDnTt<@4&XcC&ym 
zd~Hn?8zHDj)t|rydmMnix=hl#|FlfkTFYj=?)iQ0**+(`wH#)9?Agp$V@bTXCU@U6 z-kArvdu=q70ye7DGXFf||Ina7--)Z4s^A$~`*J4L2-yqU? z8>)(W`xEsY%4PMi18Mik-}^%Q8~93=2GedTZZVRa)6*DU|QyY5_7XY(Kyd0;`@kUK5 zK9>-8BlvKFT!!q!jKILyIIBHLAHT--mVOrwJ==Z$^jR=h=(%CVSzOvJ%HZF9`rEae zIaTE)V30abmz&sW}Y8iVnvU%sAB`&eYqCl$ImSOA!gL zp6*@OokPNG74uoa*#ip#W-U%0Hm948hCVgyofg_!{f}{_e@&AA5ng!= z`tR}M+Z`sPf3czO4{cA#+J#==`@5&^57_yCmmSVH5zqE?^scvFmuY7zZv&~+6E(6} zp_2FL{#z~-ZVhch$u$<3pBA@T5YJ)R@T}P7gdSGg83suyXqFyI+%TR$n^w{MCqSiK z=kQ077t9+FQ2HX7o4eqJZx7asZJ9#f{bE}{BqgX(z0l@&5`{}{KD%7oi@mL4Kc43E z5InT2{|L|NKl>VZ*5FIWabAk3_hVTJ(_h@n-$Is&t0}O$DD6-O}waw_ISu`aF3@;9LItavQ0bAS_7kd_G%Csi}>4ml$bYv|vIFO}N59B;HgqRr# zWx?t!z?0mJ2k)_78k>S{uA`|vzSXKav>qVog2np(ViWqs_FxPhO+iolDWh$*3I>4o zp$1SDbgiXpjJUU_gJLxRKkE?9Z}on$nE>L@+8P;%I`q9mPBM_&*xS$nExn~w08ye7 zfcO6h{xn)j1Aq_Olp#Kya}J$HUD@8IO07aSCHc1DGCp;*0cx~G6=`?Qy#(T#z-5uF zXE4iX=0ucV%l!^j&zSQHKN?MoI$s{u3+>+xJ0}u!Gycrk*&E^qf!w>Qbf`ZjoqIEa zvaca$z_3W3jrRg0d1mLkkz$WmgkY^ADDvVjw#A;c60p>H?^d>H003`c5o^&x8-f9J zbxqP~DQ;W4Pv9=@+1|6C>aA>ZNaYvm}W(F<25hSgsmhQAX z(v4*SP!7?A$xr#ZwfXx3XLO%VU>@n&u0(!Fmg6T|;NkO7AhwYfPq?zMW}(QEfU9o< zG!*(+BTiK0nu)L#CTtiL^|aGpc~aC!3d~ z?YcG+@qrfQC034m2e)PliQs!>fACHF%P&p9H`-&-%-v{3S|dS!^e6r@nKiTmlq{qPMq&6Vodx6vgE&jCatLO&+r_FF*95MnR=0HS{6@_X44Bs*a zsL2&@)O{%n(3`%y$nrgI>J;}>K?m3AK}U!hmVapds>_COC%&#ge600Nyx5C%`(5Vw zo35Z7y6_>r*{P?{JRe1RGexO=kiNUJhentu&hUSf=*>UpYH|vM)BVb$zk`>&AuEsB ztwx`kvw-=cZENKC+4%z!dP@{kskBv*fpL&m(09eUs%SVgC(uurE(f6|+aH*7#L`as z4k0yJEjf4c^Wh&3Ls~FYL}jfwvmy>4{qGZhu~q8O=KY*NV!Jpq1G1#M9Bn|QB6Luz z{S>(;cihh)Htyuj!*85HukaH{%4xhT?j~UUB7p+1LzYFuV1EAKV!MZ+**$*2=6nM~u zyQf;PihI+eVy%kL?nOn_l!HrbR};w`2zTyDgcoWp*vP@~BMsVRv`>#j+-`9k93I5aGMqcBzY&yKV${egoe6WbZBy4n*mGI4;wP){--s%)ShHAkL#HW+%|I8=`4G4V8Lb1u!~1Uumz0D*4B`Q^wEkfG z%*d6TV1GGqYQ-g^G~@OdF9D>Sys5qOqo zxwGC1I`n20oiG3sGww1^h_6G&IEl64+^li}E8LsGz_lY^qF3*@>d-*WPP>%5BoAdy z|FyDw?003^8<^6&e_L6`4Dhkkv)~I2e^iz&@aLGRgh}r*U{{V2)*r>JR#~y8*3>I& zM^>O`JhgxOUaY%Y@=mad6TnYx0NyzmV`N0QCBBCKY9lBI_#Vuc66ke1(%f!T+}Dws z>;)S1c-=@{a0=LoTVlauvP@T1y|6WS2_tU3qL85CjhgX0%Ml|*S9r9mnK!_Z5 zi@9I;Z*|BEY3PrMbG@0pfV9%CYMvvY+Ztz$o(n_MICD3c>^fuWQ!+!~n~ZKD{;%zX zKducOiUQ{8WqQLv;G#ptz!1KL!+EMcGIFscmSI0FUj$uzT5!p`Nhc1ux}Ds6BXBdO zJGq0zz$k$Ou`0ackB_GA9<{<(q50=8O&?dgG?0Db@|SJwY|5w)))0qv4VPRn?UA_r znrBF5^7RpKs|aboVErZJhl+#YHe*AAT{pW(Pk5bQBK6dbV2jUm%5k_cu6(1s1#O=~ z2F-7fNqm8ws7`|0oSfPg=mz^X$M@o28?+IvBgJ=~m!Y7aDO?)#!M#2>7$lwa)x~b8S!LqWbdJdN24qRIfUH$Lai46wdJtFVKkj&R{MURw>{^Tw2 zLGzilyM2r3TIgm#SXlcl7u4OeIs!-o$b(p)-+J@^s{8o_^B3Juap2j2?&l#y2Xu$> zL0NNMhIy8Hg1JKJU_eJ^gNNUt6UQSxqKu!YN(IcO^8pyhK|*V&`7;Iyd-pn{<+4 zfJdip0A2L9DH_;ANH#x>bh_CGV8n;i{Zb-AEYtxyMH)gTj2urizVEDwi zzR@(oyIgUSC<@&^Ca`LAWr9K}SaZCAm4(4oe@5CJ!-Ry7c{ zj^{=g!F&qJYNEGI%0t%UuORCl5HeiKHi75Be<^KHF0+DO^~@EJCHA|S-lOd}9k8wp zZU1lVy?0pC>y|bQqM}$LDj*0%Hz)!EiXehORHTWB)CdSs5ouBssfI*BinNFb*dX*G zLPV;xNN>_hqzR#S2!xQ%`>^MnJ!hXY?|k!K-}k*U*Ub4NS4jDV=l9gL)_t#g5r*#B zES_qT&0KL%H%}P?<483gQN`7JLhX9YNZd`20mA0v4NI5C;O>(Yg9pj)u7o#1mIc^c zJt%YBQCx0w_F9+jQi1AiTXplTp|p-_{;xO?{~g-IH$B2f9$9n^syy+P%l)X}gyCGL z+}o5jkdAA*w3u8`_`dD#+i9(cV8-b)=y<0q8RFyWZ4+%u`p{6a>4z5chprM|Ki<>x z5jqcDJYL1pG{X%khui;YG_F$ol8?R`jCJ12G6xFm`J(h=<)$7^j7p$ zl1R4YZ3!vg)R+C^F5wx3VvT^P+jXcKZ+sQ%D5>xD0PLkR|J0$^5nVbT7@Z}P1~a|~ zwE1YuB<)e|m(1;5bG}BExMW$&wg~sr(tvvY$`d*Qm2)pnLxsH|@u^S(4LlJ6U_%zG ztsPCjFY&yyfFf11-Be0?WC#YRzrFO5o z+|SraT#IUZ~l?h8|Le0i7mz{sMgVDItrW40;W4;x_v5nsmVwA7?Y z^o+vN(yt|YWs(WMydx)GpFcI{cUd?9p@<>+^7+l7=9%its~2ac4n@B1VPz%b70ORA zt}(S&e(L#MdqQxAb2IDIBXp@Yl#G>~xd&xXMfq=WS*`Nn<;ha3c1Q=Cn83q^L(g@Hauze)O-Kk72x4JG|Wk z!U|&E^}K(&j6bMHHo!lHCP6#?{nswuUNzRoNG{p@ 
z05VW_zQKMN6SQ6Sr-*KYrbyFwkydTuBDXO~tL!eZRY*>SVU%#u+xWtci90D0U#6pf zx+A{fcc~ws+JR)LKw<}3CswbC&tjGI#-~_*{I$W&XRO+pl%CfHRgVb5T}@e9ol{vd zadRLT{Aq#^O zIX%vMJI`us0}}ED0DS?x0`IHJNx~5EwKJO>W0=0A(rUQRT(u1qr*F^^`swO~rkgIMXPmyd7H8z1IBgf_unHe>!yjg5~OPj%j=^{ztdS z^8`_3jMA2iyB7Xe7kBHKwM3P6c6C}oS4lb_ti>hIE?s$+ST>k)PT1?7E(pQ=fDB$4 z_%%2l*kAne-Z^uNM#iP1AG3Bus(j>ZMWrBxK}}E1Q8a_o0ZZ2tLES+OjI3Ht1K9|_ zj8#Xr?Ml4?>mF_v(43(@SRCaP5-&FK9(X_Ln1KJCA@d)7UrlNBM}$C~!CrYc{RJng z5;s|qP5=a0ZuCT!H-e2PvOVa-h#>DQ_J$r4maYCloGn~&1=vt682i8NfPen|uUTdP zKla6e9Zyhv1q|zaUNLD7d1y_>oThusENWN&>Y`o6-5)u5s4&7hEbZ}>>97NyVe*NqILzqQ1(IpPGYvso1Q>}g{&sx91NCj+kGJ=73^+UAee`CLJPW@8MUtLP7 zG_1`o#Q(C!zGa)%!tb!6EDOw4psy1m;qf66sUBT1o%6p0Di{x<4%oy6 zY}l>L((e^dT#XS@-Z|-ZnY%>I=|}m|rxT74`{~INYuGPq?b-M1JPRq(wbiK)+xPZ9 zgkHzTc9u#uV@=ie*}=2C8?H0--{M0a=%Fi?f^ic*%=s3l*Qv@@5lO%6=gIs3B{I70 zKlPc2I*!y>*{DH4@FA*~7H)otJYrK7>i{`^^C0)-*Sg>N)?5M@HdGYDOi%ymr4QG? zM)|XnBm{PFj=MKZCVijteEq0HXd(xba^u>en0K9edyJgkOKU7NOLLi!;6VlY$30jG z430eii&TC(iH*lnZ(m$*m6u(1$0sn6FryaB62c)#nf(hqAsMK-( z-mEcoB1&Op4O@-Ln8CeOzz+-(t=3Y#vd|eW$N1mxum$A)Qm+e47|+JYerj=zx=aDX zHK_2S-?ii6Z}0DgtGz?5VjpmHY71P;s1mPE7j>gXbsB0OKeji4BR7E89)~S=0H%)o zibwZsle=ubmtaj=h&eALMdwG>I*+c%{L6e+9tzq_bbS(kZRk)$^Or{MZ>%wFBmD51 z6fZK@aoWVq!=m)%dIjIq)_Jlr-XNcY3trDmSS+<{-h1fKO!hIWkriP=I>A&W?|r#d zXZwWTCrQrj_aE<*alG`m%K+XwQh0Bg$k#+J zD}qr;eB=n^EWC$nk zb{fY2J&v#3fEERirFz32<|(=uWx58oDl-dPpXFwU@fF|NDq0rh3eYE<4vc&&r<8RN z%A%W#?x>4>ORM+&=v1XPwk?A@*>3N5k=6* zrOCTVvt}yD?aazMug5N_4L+6xYzXQ0+Nn3dDObDEd_gFNud9P~G#P(32LuBRLRY@| zs8?NJ&kw!=88XInLNe8h(o$)ZN6oM4Y=oFUOq0Z7(99e z<*~XCg?_)uQGT61#JzG4RfgOSE`<>>bP5avkqOyACd{(Qc?DD5*|_UuD0IX$m@k z*g5`C_tOnd(L9Z{Vz%1X%G0YzY-NW=*Wd<^8|++w&KO={8Y@k4C+&`5Lg7WByZ z=f`NIucQLUfYjx^4asMj`yhuil~m*dhWsV_0~=?OLjlPH?p)BSwZ6B=^K09)(Sk?$ zd+GQ07mV|huQ%wF2a$aui_`LSsZ>knF`RLg!3ff&?2GR zdG$B;=3`EtoqJ4egY|D`z<-YHV;}wLy|aq_$UV!7bGkiz@BWa0Oha09(*k$bgdm?| zZ;G;Jro>Zxlc>7WyY3AAx{FN-alnF{Z}=|U9-)Oiii6PINnv#iZF>bL^URbVD`Q$FOD_98gsn;uE?O!1<0l?>py$~@ zy+|v^jff6|nmi#;Pt2gAL5PnrOWTU^ajs4;O2NE{;^<4eKflSbx~&I2Ki~>pdQ@`HSixRG z-zjW?-B;*A|@3?{F_MvY|A zqzUTag`_W(+9d)!td>b@OacW_(iALe_8O_ie_9 z%#A}HUTI*8t1J4?c?>(HK?*8;1-N-JjCpLL;)aK}z`}a6UvRaT%8#&@ipp)>^TIFy z!Z(BwDhUh1(*7oHkNj+LG*DukHYGN2PiUq99JiIPNYz`1KsgsB97@usC zwKfBm&^nEVKq3!E|DZag?w@fH0AXG{8TWNGnB9AVMWCfjd&i*0)bS~5MqnoQU|#-l z=2LN$U?}PKAesR1Zux5rqe>Z8GJ3+h02r&g@Z^VutK~NA36|lK+XkbdDPa$o91|J8 zMBvYFs5%423u6}$2JOXlNihdkE;mFdaBXrJ?m@7G3hBCtG&i}GPrkTD>e5LnaRv;r zU*{U<-)z|2Eh7b)toC3G1CD5xq8U}@Y?Y)`0wpQEV9hGDYj}|{{J)f>Oq@XcwIn62 zn*h5};2!^5Exk&B?Js@pEBRc!fB}pQu%>6yM)*WYC& zMl6tsc~5h#yeccBhCdjNC#%0thOG(lRrS&*`u@~#`5fmnuq#1@^KVHxqhD4~#hy31MLtfT zE4~CW+oXlBzRozwICo8IP1g%lK?ekc?W~*9^RE4cFS`m0+PB$brZSk)6YuHE6xZi* z8hx^5x{pKDGv93ql+1eTqKh@AX`_xe_DL*kVqVvT&qmM z#oTZG@_I3}abu1Np=o%@7oRWDK(`7bE&?`v}Rxc4@*%nO`Yj5m#`%1-AZQ^q?xOs4)yH=1Ho-cz-3- zyB1OBv;EAEwVG|MF3ESlnJl8uxgC+1sNXe}dw5d9KG{~h(5}Y>-i{JsWy9>ShZcpr zB2OQCz&_eV5r-O(W5Q8|*9j-13!~RRZE~bT=A_?p3x|>sB9^u?t{gMwknu*gR{31zpiEZl z6*+px3dyig>uGPq_tJd-33Q8z8B-_Qw1s+Qy1nzYywCG@Fi1Hp-h*%T$otzdCq#cx z99rH>J{KG%ggALZ`%B;SeZPC|8*JD`d$Oydvc&bM1KYX4!XNVfP$7QU1slco7=8nv z$Zn?Csh`Ix$o*Wq2DabNB(U!OdzIh&FLXbUCUh7bPLZo!KKOIoUg_)gluj3zFFqx7 z&6kd#WOcHuNWaPG!zX$cl3!ur;RW|mJlQltD<6a3+;ZUVYaxj*$gon?Q?lmtt?P-+DY3p}RH9tU?7sm9!7y90zcV?##IL0=d z`K2k`i0+TlfOHTVA4MF!5r1PASH>15e{HfKvjQjRIOOKPE}OqDpZ{LZAahA+fk=cN zDJV?{jM!s>%sVUVJmE?iJ5mMvb=&uqKJwx?Ixyip6@3nJz+piK6fX?hh+Ujr!;~7_ zsWK=;74^7l@YNb@pZThNOse@i-yL6n$aZ+g6Zot_ z1ndDWm@Tm;`{af5w)L-GmK=8^SeXVKPmf#B4C+Sm(Y2Vp%L(g)6Wf!hX#iFZ%~R~l^@b_Yx*t5R{e#j=4I;v zyW`@YEL0 
zTzBs&a5`;6iU@-;rtm5=I8ppQq6EcxW|Je;Lqo*S17=Zx4#V$VAIA5y#q)=T`5jb) zVVZGwS<)SM685}}b#mapHJEaIZ^n!!O3Ew3-Req6v#v}0mEzxCL1@cG>d!2UHC4bXX6){ zev1c~uBdqKS!-KsfYt?OuzZg;$OjBVQ^@e@B3L0MPZU;AuC$$9sb2$pl%&%!ef?}g zf-n+o?~Pi5BBX9@owr5s!&G-cXdV@}rg&@4AZvY*Z%rS!2w!O%7kwfWSj>9z9KwDs z2G?|E=pc`m)U)o6N)*I`qjw>*`kTa?dnpx48$ze`l_*h5Z{TkRnQl}A4|aBKYzuZd z7_-TtVQA@wyh(L+$4J)+JurAs_R?VDhjhX%sTOb8x7bg0>laIXjg*mPa!-_R=%veo!l4$=xBQA7fA@e2p?;!Q zcdxXwyuhz@7*!s`m7yQPXxpiF?6M({x+96$4JQW&kE}|jZF0DxXRp(Q2dT@GH1ICMH&tvNG4@C}MwV$}z z{M~=o@A!0_s@;q655^w}rK35kE;9_afCF*Ba)gPQcDmiSZg9{^pp;$cMu5{_;1s%w z&AgN)e)@8L7v3)|Rha}diR@s!$kPNl#fpPndLz4UH+E^LIwBU5B9U374bSL6_nN)# zWyg>eh|YdlhoJhqNr>IzqkXZnlP(dS>SBDsS=U??`O`jJY-v@Wlb+u_QS|M3K*Qwj`=)*^g8-@h%dw&No=TFpM~9xY zTC)7*6!iS$>N@ol3>{?@vyQ)&etV^I$nG_l-Gv#Qc1IQ@bnF+BcJT8D-^DwRXz)?> zQ;&wQee$k%Gs>0`O!Df+8@baz5Jjhr+E?UEA`d$p8R}n2jAnm( z%ENrZ{?UV@P)T& z@O1eHykfDgCD9L~tnkuw*wZZqTo!}y9*YTW6x9`6HF2~GwWq^ zwdQyqx0H77r8>W9hNolcRn&hIQp5a*H@?u?O!!eOZu|jF8Wu z!cRO{IOeUn+0J}FY?P(MZw?xiPIC{z=ah>>`P}+ZM@7F3m`;^G9*h|*Djt65eFW)$ zF2@K{Ky~hZ%A`ZvkMGbmJ0g||SOIuv7^g%aN;2c+!z5mF?*xeI;t@na&5O63b6O1& zKv!IpFssb(x)-%SRz4xrk+ML?MFYFc+fM6qR!gGEm&TFqpx7*#Xu1Dk(1zseW!1PMK66fvqJTv`cS9xRtQ!5T#hE<&VRM} zQmn-hAf8i_@br`P8TCZ?SdLFQq)zc1QasN%t;xaM>1n@=r`3Y%2iy65hQ>yfaWFsB zest4sD=^lxf_B!4NcPD>;#RW9Jc|z+q*C?F%7QM(yuK(oAg`wJVvtT)FsR1# zPZJ5%VHfUqe>OJkyMj?N{C0DVe|a(>>fm&u^`r1uRf5%+YuF~oUe%+Zrqj!E0~;Gp ztnbcu7GRG02%N@;;A-=J$zMye(>`6J5<9nyW0l4%uF=HivfP>tr)o`#*N2#}KJ}x> zt1D!PM=lj%(mjfajeV+|5)Lt1c#gIh%ck#MDXulx<~@30CAPvgTYhi3+jZP;k?-u+ zN+6G=Yt^(bXP$zjTF204|H$F2{ysP|xuPc|^{2FJ$Y>an(jEBa# z#93PQ?A-f`O5pK55BwyTivWLfq^LW8mLi?s$$(nqU`B ziVA&YxZrrA-h|L1$ky0g>Hh3#_W^qS`*k(d?Fx@U6DyJAD}V3wyc2gg?R@x|JYaZr z1oRlsBtj-}wJIj>`SDry?sh-Aa~~8xT>l|^bCW}AV+3UMS_la&LPxXHAD*cDQ4~C~ zcvF`L!j*$^% zJ$PFp;-QlE{^Wo+H0X>E_gee(ZR%TVCB+o!i^4m~mz`C;c!gCT9`VywT=FFF4qG<3 z^EO{g3LdxKXB{`%0&>dw^wrv1Ifw8|#Rw3kQEAw7@uE;)WZXq1&8TY^n!g+aWynZs zPWsN$H(w7@)mNdWYB1TD#oiWb_BD3Lmvgn}6pL;>#gQQMR(wfMY;Ks@MS)m#^^?E1 zwf-|+L$M=8&ciKEzm9&9MTRV*e1Q9~;?^Bx` zTVZIZIrw1b$q05{wqw*lO_sI3hsc{NNBH4(x00>tTl^`Qs{1L zj;eo_e=+a7DHl+{XLHu9yo*Zp*-4XxAT=|3n`bryLWSN(#(2 zAK;nlb*g?}9*=o7Z>d2j#O`x6bQ0PzQ6%t6I5qFAnP?(CU`RM%2;P$*&3dJ3n|8?s zDU(%|>%9GKJgNmC_)pHMK8qwRw-uJYhajegkj=LGH_1s+=NnS4(l6#Q`-j&e8c#Twj!Ptp_?k7Gs--+gwco|wpd_vWq$J;*WRJ}env7diz z=;4En`^ZeD&>Q)WS5tl@D?d1FoFBE~c0%XaXyp1jwhB9dFkh)_#Rr%BT%1#pzRN$k zy1_sET;g+7(c^y0qmCB9XFHN&Pg=MXlgvx##0{MB!brNIz zG#bdVcPrSzN&Kl9jYeF%#AY7cT_VJmt#nmFp0TsI;JeW?5*N#hEe&?#J}dcpDoT&(qZyG`ULbYL*|Pk-+52rpop zFwvio=Sj{PFBPU_4n~RmI7U(|K!s%h6(SN1sJ#dM}}xJ59)o zSK7_G7Yp^(4Ha`H929O;2#RyQ!DK2vt|XqD9N;0jhKz_JcdQUo^D0VIf<6VXRTtgb zGK$}hc*2q{<~15ybAnBp5>{GN`fWw2W*&RG;q9{AIP8o|#NpE_HYpdTRb|gcuVk0$ z)D6WxPzGg`)LM~CHVKcFcq}TX^8%V_afAez3H559G8n7Sz2VNtes>)D##Ym(&MLp#LqwBVq26biPJSuaK9NXhGpXs`t@jS>E22~!g`h~dA48VwMoR= z0y}9&Ygq$q5iaRR#2lQB@=G!5a#{S1&Fs(>Vn0kY<9je~YH_~VWOQA3huwn~XcW#J%#*1}T z*)HKKWO&yh7?=}=&oN)mVN~ya=K^hyYV+6OGwW499I)5VLVw|6*V}pL*(b}D*vEV9 zdR!o+b}^kxWIHVm*3~rh z1!4`ib?HA~yG3Z9c;_a3N;tZh>s#_Ip(ghW7k#(Y8M`#=WUmm$T_aFJzqC&00xpi; zsIHCMLMOl;(|wvEDtJh9yTAwG`dF)%m{ zYZJ&m?8nP+Ct_**WGtN=u*snrI{4XO?|!^o-uOQ=E^X;TTrlp;mfhCvu;q(uCBKkrm~G z0Tm(i+$t)3r$gwI%x~I$k)QYNFiJMCnPh3-tn9@tDXLxb{8m=#`Py7~GXCcTkUua^ z#$N(k*77q^0c>?_k4<}D0x|UqC+uNr8@Z%_NxGrx_tV2V0P4_vR;DeNv}s>G4xSW9&%YMy*m$-@vkA_t@9rwcBEii^V-! 
zhPtlzE^UDNbljt@=Ku&A_z)?X8teVAXG$CRgOAOJ^X6(vKksrRU13M`#&hfQe{7Eg9ObV%SSwq zdi9vIoo`Jb7-kcp>tR1ZvM%PD%_}mO;bZojp!eVAc(AYlvS@1~7!>$d*PrnJ)z5z_#W3*d^v$=PHf3v5`A%!i zJ+zbK{aDT2){?*lIUM?b(INd$3IL?%OG$cNG}Y2o<2cMoGFary5z(lBnU?ZEO?LYg z*la5PB9t+a$&{|1cds2NbNY3>$0N3#et%Wx+rsAtd*;p^cL_x01nI$h67Vv#m%(=y zY=?G#SNO$a*#ovO26EG!(k<6yhO&6j6=FM}ZJC5G$o-w@!g4?7i@7reKYvYD$nUIW zdYypson$)8d2d=bm zE_S~h)hvdO@hN3T2eYmx?(ws(CwX0Myqg>k2~5cv{qA7BTft0Dbaig0L4U)SZzGR0 zzvPL?j6+vA*@uY^_9|Bb*xRJ)4%6Uw-Q{^*Ic9}0-;HY@MTT+{4U*$u-M+*zb5>9D za4p{yLVzr{xXdwkHhJlNlV%V4mXB+z5)XOcepG5_fl=mtH`C`cKtXT1j@WBqlqU1u zzAq4d!_g^;=B(*L#cH7;-ob{Q;_YBx^l>UsN zKP~}$KWD^y4_dJZ`e0m&dQFh}Zh9h!MG+yH>h-LT=m|3b`pg4ZY^DI0ls%2@Z(wUW z8?f;P?67NeA-p58eM8C`n?-ALWv{`uf>;Bp9RPseT5DWHJO;6P6Obry-smWt0M3C* z1EqX-8qg`hG$E2BQ$ux=1DVY}iHeB@iY_&sJD#w407<>&GPr`VMd0}XHyFjygbSc_ zz{T1sP;mPeVyvbYWv7q!w+^0_0ryx5jorFO=owTOmb91%r7WS}c&x_w0RjnxtOkh7 z-Fg@hCJjWX)nWgoEWzbQ{IJbmsuBd}qLKjg(GY`Yxf@JK^_u1Z97q~XpTahpeq#Fv zH8RsJ1OFiSnU%G@%!JUrfJ=W#G28t8h>9Y0lOw|iP3AK1r3yY_i=%7+?~utxed1e( zE_Ac=YY}<|tZalVya%0cKyT;ulYvok{2<^i@!d4`_0|0L4Ra=5!Os{FU>5AApIh@W zuC#S}^Y3cV#U*QycZ;DH3YQ8;(FHf#{ZCgE0y30I;rR42D-@SBW2?tX7R{(51KQe2 z)GI#v@s-M9XB2eCP)@4|&b)u3eGS(}XYo;;<=aPPb`TN^UZM{lHJFoLqzcNp!VG$7 zl_!C{hSn$sgGTNitNxqg1;R&N%aMIIYJU3gn3a(|k*l&--3FS`%t*v@S_VfU20YoI}#<@Y86V53a!}xJ`vT!tHVP%d{fyRNHYh}}T z|6`|rxN*$s0+mqrCFryF?zfn*NMexsh`Yl0%vk|;lXW+M-~tuURBfhWx_!p7g=IY^ zGVnXCeXL)x1HRPn0`0#$+wBJcR)5sp@a#pYX1|}H%ZJ>zCc&0YZnpChZLN{|K_>F_qiC2WCM zENKYvCilI$gdTYeRGkD6p`^pKyh6)m8Z>3{Y?08OMnJsoucCY?k-<-Ae?pIF^NpYI zvgv%t5?DEjWyOrP@;4l{xTq5bp$uy%u~qdw-t=fj@Ey2o!oi(*c#cJnII&A>PjVj3 zRCi6cX{)$P;jt|Sfpi1v;^k@N#juY?o=&EtBI={ZBSMH0?D_X+R5}*g_K@m`U`d00_K7)VFrKFZoLcwDG z)efi@=$U?}MIlhcHVnKo(DM5`vrdyG4Xh@5$`EmRBW9+ZYbyY*vAq97m7VC>4on&g z$BwXIoy4^wb_shIUsaFm8=)&Ykw4*eM-fDmb?t^num;E_aM4ox+rK@W)$eUF_(c8D zvuO>h*JxKN^H`}3^qnuzv!8(m4iCJkm%71^#<%~K?F;)Cw(kYs>Di)yw%`mX-7^@h zZ@!#cY*FP7g_?OM2q5*{!zZ3%@{(avStI}mNMC{Qu#)WT%lB-zNhNe1r*DtKQ!J>- zl^v|)ASAEtVkSVmlhbfv^Q+eK=tW#7)TgF}g`EOx3>TwTL)IYrl@2gr9m&=a4{_8+ zJ?ymA!YpFG`iSr<~ zvBoqAr(96phz*;@6Wx28+}Pc`qAiFr_HN=6u%Dbq1@Ub_zw0Z1MnN0gYM`#)-ajHA z0fToR7aU{$^S5oFM9Tiu_bjQEbO5sN0^*bft%q6jdfuI+juNOc)AD2`!T-$A&zGh% zaey~Ze7E*d&!Q(~|MIE|XgS_M{gm_+WWboMR=~v`ZeNY8#NyuK9WJwj6R~uMN6gpd z`%eMlT-GYHb8Z5=d=u=Hv@4+N>M_ zynEaJdP?9w;s(5G$LmI*<&Qo~FGqL?%J7yQ{<^JShw__hC_7NqYHSaHDUHoZrcL;PIk(xNrWq7HIAJ^{<&ulESx|?4vrokJhx0 zEep+v@`WE9sAJ{4Z9t4PgQMAkooIMa{MPu*%x4_jph_MOb0_O1@-)(rq3Qi-Sg9&S zranckI9l*lJ5f3f#p;>LFMp#Q5E%iEaBK6QTaGMxv**u5K`cqFc%Z=Y*-ndN;?R{t z?X&MmrSzRTeYXZA1mW9x7o4hOk2^WNlo?sRx>Y#?-hJQHRZt^ua-8@nYJe&_@7!AV zPE{wVq4x%4jV0aXR7>82%bTNg9}1Kytc++_$34v5I~ONo?J}*k(<`uS$7K&!k>P^a zudF7MET*Hcbd!F@6Hf(izSi>C;)9mg&XCR;Q%(4WsDvAlwkwV!%h=31}VwKqU!=cK#g4WplniZaIfou5;CJe)E=FWwdO zdEge3GW_;ALJGLpbG6 z*CXRKp>x*oO%8Q|arRwSiX{#68BNS(0t@nz-^EpMbCHI?w(}at(z#|w-i$CGSKdg* ztjmEh#%{Ih47)Rj+0Y03b2Gpx7F0%RXK|93n_bz;2nj=NdRz?9lMMvc=3BIVAKPXg zN_D~mLP!Vn1`~Kn<8RP^Y9%f@GliwsyvZ?L%|!oz6*!^Ku49;rBa!GCNCwInO=^by zxs~vy9z%B8BzldAf=xjO6!HJhJq226w2G&#;Gt`fMWK-&el~x;Q_P=R7_;he7@g&V zU{VnnS}-NTbxJ+&&-V#z_rGZ%O(hie&!PC|Q2eo=|EH{qqA*GY>D9ybucDc9K@u z+0Vmpm)E-IhkZA!(a#r0^L2WWUpujv0aQRE%>i5EALy>YDgJ-;o-hSZgzcrKL&a(> ztSS=|?+B)t*#f^b?;CoX^mnE_ z7hjz}7E-Rk9ymF{a$27X-T1<^48f2N#t~V54dSam)#5R=bZLs1#fm`yQ!_f(zR0U> zq~Y|gcB;A(JDE8(&6b9U6|J0Q@SqUbElGv{-_kX6&`r4oE|bIaUYO8qTu6w>u=FLBSR-)Wzv`l zC;8ADl1mrk52IvB_Yrf2**)~z&9G?G0&t*30@62pOidRR7Y$*(|0X4^bOqv_v{pIo zfer#iu<+@6-~5Tv>_=GtLxH7S~od%pTvlA6qi4?jnKBwn2hV#xj*sAHoP|K zFuM!A4<-GU%u-pH^=*MrSIUoz13TdQhl}^=lEabLLfKvoZn;ludPT+dNi;NRFfsOn|VN?jJS&=0Q{wI*=LJiD1tRUJL3 
zIBv>1GV=4*T#@l?X?(etXWS_pP~c3}1yo=&es!6wNRwt5v)lUcdt8VlOhC$AVuhB0 zBvJfZ{>O>H?W;C!`@3|Hg&%63CbPLnTFQYsqoh2;PZTEyA0Jp1?V-1F<)xXr-qdAi z9aI=TQ_MWXV5>79v8UphOVVrUcuo8$c7(Jsj@eLHo_>VT9>Xw0@}QJucHE&b5WIW3 zRY?@j?nkU%QAE(`V0w49AR2cW&aG@ zZbo}CY!pG&tAq?Y_H`~-tn_3f-wMA)O^nYRScWa}FkNvoHHO`biq7iudiZdCC!a$# zQlb&Up{8$Gmw&@!aU$Mk%iq~&`;cOO(X!DVr;mozlg$9%Z$3!0zVc9&>O6KIoKxa6 zRG$2l)xM^NlBynf^tp7Fg|^yng!e}6po+2imaDP*sgAdpWd+5DhWENt!Z5!YA-P?H zpkgx(!cg;uviZ5|Mj-a?+EeB=7s^Vv+akkUf1Po)J0`*8!9Vg)(`wi5c3Jn+y1XwZ zT;AYZLYVM1^OtlEeINGusU@SNII#Q=F;nR5okP+=U*o-I>G%$28LL#)i z^@DvoA?j`rX8AD3B{9}p|0p&M7wD8lY6x{zbsgVlQ%9g3awHfHIZIsR8=Sbj5Edv} z{mqeu3H)jsH%WGjzgM{Ki?7R{a^DLt%8`l(;xfU@xTjN^X?l5Fw_Q{0O= z0J5k=Gq?vpfzWPHsTRAcQw{+&2V~BH%Jw!}qI8yQJtSZNK?>OVi09P1;QM@AIy8-L+b~;@C^+VfmH51fN;Q=Fa{RZu zKR&#@lTbJTU8%w!hRrCaphsFT?6dIi<`m7?$)4F#lBqaOX4$KAw)?K#5mhVS#Z`6F z;`J%y-Bc%`^ygWFPAVsQV&Q#};-U6$Q&RT7;6;=`In+-n)!~9-MNgIwl)gFL7gu!w zQE+AFt^HNKF{DAyi-1Y}=YNN|$8Lr0L>jLkF84igJvcJ(E8@ze>~9P)o#qn#h_WA7 zCq|#NUn4OM+g6b^+cJW41Wg0XiYUH{MZ+j?VgZS?hIy$644%%OT zV6L+wt#9W!naaH`29F!-P$ANzLM&%isf-aifbLhtm(3qoXjZrpT^Mm&%a+IjYI8HmyfhGR$baq<*9fj8Z-I< zD%EB=4OLhyL=PiKu;wzy96QHYvpkE2je9rox4oE-t;mzI_<}4+>}4(NX%jF_a@_%b z@ejrY*UAW+XAf~?0gVZUn^w;c*rqkP!oxiBpf~g6rlcob_aSAe(c>Z+)s_4`>-SL@ zXNS{KEWJ;QTtn~Kv979WiH<5SREm<)Erv!ca~m^a5Bu2Mp5|HD!m?` zsJ8f*;%F~r1J0Ju2BT<*{@78kf?mXp0wuSmjq#GRa_=L;SE0uUa816cVo#4mP?g}{ zj9;5t&YM&g#CF|yodwnS$=;b`sBy_#xKnpKZhHzuP)fUGtU$qOEq(6Hu-ybV2!EA& z!^SPy^4;ZOwfJyk9M`^uai;dFVm&-gd|d3C`u(TWJ)iOclWXrN^Sua8+##xs6^lEJ}!;FGGC8|@2uEB?*Wd(ff2O?vN^RO`CO>aQLb zN%1)ftH1WDWOTnTHg2&LY#se&Y#m(CCdY6o&?w1r8j@a+A2=JhYc;3Ku5H{=*)*oQ zQiJumQq+NtebG>R)I`T)pGb=C&%J_)@A?$CN2ZAc4N+tk5!UcPR96OcJ5w7&cL%>z z;xMJqlc}%%{-5%lE$TbMEhb-)DQCbDr~i&U5~l=)=tTTytIT_4R(e z?&8k2Bi>Di6fUXjuAQb1zSn zOt6SWPx>6ZoR@d>Y2}!iBVZ$(zj4SM6}KTg{_qAqptvz?{ut*s6%m_*3h)2al7)#G z4DRb27`|6-X|*5W_-tO(@vBVJ!n5SB&$XmDkh8>6NH2&m$tR*5(&_PC(!=|--ph}u zn7B(=d@RE4XOI&_2|TQz5>d@mTV*)E%MDup?B0pKQ=*FXXN>`IP=o;9l`E0h(l=YP zGFiVEUkq=kR;aqPd4XB>;ZHJ|yFz^#78P(1y;u|9=3sYqAEjt|kF8`ySr4e*JQqq6 zpE_OQJ1!HGbCBfnT!(DO_c6{;OFcGT(%K5ach9_sK!;^1=@>gNR;DKaxX$ z#9-2zh}{bn_a%HAu~(`xcpl49KwcY&epIlxs%Evrv`P6ZT$7J0hs$Oe_XHv~;Kn)H zIJp!h6_Pk7BeVF~<@$$xO=_K-{2ofu6U#lT2BH09yOLtVDwGR6E~1Q&g($IaHtS)# zhh$U;0igfJT{fKGYR9bQVkzgBJ5l{lz~YNd+@?h#ifjFh4N>{B1m9dxJ0^y&$eQao zV+G04q;WfY2!w9mnZ7JF&G}B_h*=Lg`BE@-?sEEbk{<3c$y+IUFZoD%zSAy6jJsl9 zcn`XMj}xy&i(R<@YrI+AT+HmX%iMG7oV!~CGQ(S0UwO27p?Cz(b5&^B*8;_F^Lgjn za;3zZ)YoQeZs0PU$O!?h(BqAUu=IA!Osb8FA3YIpk|9y4Ym( zOVMX!9Xwm=jp}JI40g82DB`J{O_$!UB*#M=_J+m=Mz8^{gR1sTmn{SCHZf3C3kiY5q`d4zNF`}3yg86QYN0GXrGzj znHWAxrU-gVX!22z#GR~+AJ;A$e~#qfk_bi+p>!INefAGkb){Z(N1+y4o=}sk>@|I-!oZrLGAK!(xtbSCTRsZ7<8m{8B1XC zHx2;Wc`SX&Puf9?aJbNUGLC42Q4fM$k?xix449-UTk{`<7J#rOOpLfO*&no4q!cit{Sh_?|TC;(^6E>G` z8KGcb+go)yKa!StBEM?#ykgs6%+nv7iDx=Skl~S|lX{+$vq}A-owrknW{^gA)&u$; z3iL81;k9#`QkS#e-8Q@L*?HMAJ3MPHztOli5yP=t{FR%kFEcu6(LO!ZUc4|Pd~w=m zNlW};VC)OtbLtFa?IfM%O^`?4qWyF;FjOkW5A;c|r@S6-o86k)3t;>PH;mU|d4Ap>Fk9}D20!z;hTvmINjWOe$drNa&Q$3w; zX|#}z!D&2vA|}zIs-KXaGFne}IczxI@UEN#b67NKjcbSDp!?4to0$>oG%%_o_#Yk& z{qGJv{`dX=nVs&xZ7pJgwGf=XIswP5Q$6&PVc93+h2~&;eyqrdtyLWwp|IyDjL7aU zCZ<|1U4&9m?uI;SnZsnFYwnv%PsDS59*EzLnUN+!j@UbOx*J7+G@|?;)gw7SQUpz{ zd>xjPpdUdE6|g0ylOW-iQ+z*vz+aE$CDV(>!p^*yPFlP_>Hg%wyq2zgR^MuC$A8waaNwaPwaW%;5d zuGp(K(>-DEke9|o(mkW6@2)VZmooZbZAIeK8#d^VAp0KUnpH=cv71a}KgQU&GyPW0 z8h#vu5Bfa%xS(x8CVrHrB1fwNSe_#oC_*}$`7Wdyvx=p>W*n#Nm(LA^No715BurYwtV_WD@y|16nA;Ug#M(@x$GZ 
z+7kO>Tjh}cUV18TF7Ar?Uh-0*3$A-!O;^qK`p?d>&D8TGOG3-&pHolT(-i)hCgNS1Qz8NvoKe8(I-smJ&X@LJ8R_z=*CUSbH3p&*~W^=E<3l7>q$O3<%dJCR)p0FW_jzSq@+i)w87SjBm!G_s? z3pU)61>Gp`hTl%*durFrQrJQr>$USa_g|eC+>)!yX6*#yQJ?Bv!6CSFYzxX_-1n>L zBN{&Ymc-yrJM=BtfA)nZ`!gsRevR-Y?KyooW|aq2oTkUu{=^|lSc^W+Fu^UFY&o%XLuIwgzod;FchG7|K$)M<;G?Ao6t4O!y*=& zxL?cdgN}#fX)qBe_29k&ce=;8CceAUy`_~}t85)%!GBOp@=FXn?}#u(ZN9QO)8Osb z>Gf#8ZuYiWOE+QjWV}ZHco-5X+^ugucrnP<+xiIJ5L821TFKNqC_knCLiqI?(rws% z^p`~%<7p}fbXtsa`{Jp4xOZ61^=pos+o7zO9OR#EXk}#|`(V4O)bhug$9Jy)-I=EZ zM^E!%z7g0@KY#Oj|ez`b6TseUsMFpF%qE9d**)bO$1?OmiekTF-S*_S5lCwAJGq2MOs4^lnE~$>{v) z4*@q8I@zl$by))vBQ$luVIBYA_ub7X_MM9`SOUk$b&2~JTuYF+mc}z$CDyq))?1-sxCcsuAhQ^ z#YsbZHT4QR4jkKE5TSML{=|v(Gg)HvrqcmfZi(~-Y-~=4^c-M#(fzy zM+*q7-N^d2dBP6n)dQa6DQ-)LpQ;2Yev|rYbu?kBj%(bU1bHYVV|P)F=s-ertghPV zy1lH#8Fp-zf#cWL+_e2K(8v-w7NG2&Z~>^8A7~sy$plP!%d=wO6HY(C756PL^kiQ7 z7WBCv6e6~;&%l3gVV^<9$|LHQpa}~D^AHR^Ln$o;PBt=Qf3eQR*{m^=pcY-wEsS>+4;cj-gLQ&`dVf#$Pa8yvo~3g z(dt%cNGE;k&6C2%hmSmEcaoQuo?0ygtIJM}!q?sDdL*qPD^S|9ry~s83|^>JX1(TO zo##9ShAXxKa_J54!!5P~JW<^fpbitW=w=f)Q>R9l0u~Q_hiWwh=Um%aLU(y^@S=xu z#DVo=^{>I0$Pc0^@gvC!w_h?o0W)eF+Ls&C>i_ge1`SVw`|Cm^x!v+HU(j za|0SVMLnW=EewMtgQe~N=aH1pl(6?{Xz|hdM%U>bzdpZ*Q0fMbv!#e%YZpKO=wX%D<|;xg~G^Cz05GCm3or^H+g(;=7o`AoDQgeA0Az$; z%cN%7Ubi1|($*fcCjBKFYrS!IOgRjcXV<;@N@Drjr< z-md)9f%?nA`ak7sWLBTstk$dS7uw#7hu&)S9I2iu>fHY;olx!y0vmhK|Gtk?U`Z3t z#x$FptQLS7DY@L;Vjdp(!$B3t4v3ut?CFU0mtBkFxU=J^(BWG?)5M6OM==yHWw1qd zA?LW5f`qE?7>6=A2(}E>U4kmd%|q zx{o%bKv}4LE$GU43JENEpm0lJF+;YAWMn%*=eJRd*i1{ZH|e=a?kRZq&bUZv5w{iyo0=`#TSAu&eSE(=!9FOyEKUmz=Fr41}5f_Uj_mx7D!&W@~& zG|13{5Saf(q0xm@|2tJ8y1UhUh+k9cR8jG);M<$`ECdEtsgB*N^X5z1K3P%99jg>- zdVXsyFjZ8m|Mt2+FmV4wjs8FBd&SmAdV!qM+jpzJf!Hu#(_fElVv>f|PQg3(ekY9w-~1iAO~`?m=3FlwaGjyPH#-$D(G3dmA9#p; z1<1%Ne&g^t$97<)g0IpYS;0>?A9pS7 z8yNSTNLZGr?nq8rw_)}W`&G}?S{%}=*0G_QYdO%0VVX@=MqU$1M}hlV^DO&|_5CmZ z+R~P7cJ*pUS~ljiYi?2IM@wIgxzq$u8}v~XC{1OYwiJ>Wmy9{&+QD+CTaJGL;!S)% zDH0qKZJb9MEZD|Pkt!_65NiEKI-_J43n;Nm5`D5AxNrT&kzex4>>BroHaE{NkBl6| zW;U%6ch1rL_Kyp5pTx;a#n27A6<9l`&-$lN(WAijG#&XpJp50<@^AZ>UgO5)`vzVB zMs-GTThkEzYMFa9E-TAaI(2Q;?mLdft@AvLunT(tWz|G(jF@J>)SSf4WMt1~8Imgv z@Z~2;9;rBb#FU>USwY?^Upy6~majIRPuNiSk?uOZZa}wyp_5$p;26^?P!^vOG44rv z;$wQ^;vLqM!(JWMC-fJ`O>wu2nd>?>Y&eaXz`l|F9=?EiI80d_doK=%Ra;RTH?!C6 zgj&UpR-H-4?qT}4okrDxRAT{`D9iMqe+P~p+ZwiUqccQ#;r$qh5$>e-!Z+)W1OMMi zYZS^Dd-`w_^+s~!lq+0K5_F7?$uiOq+K$T-b2zG2@;+=GZomFgbAlFoVVPwi#8A|* z$x9I)#DJY_)ecu2UPsXr39g4V8TTM_JN0qBF=Kt0K@~2>y-Jo5ayCxL*3Z!_Y!DRd zQN3Br=7tljFYI%&IzoqMT^v}Cgv>}*4*sCuE6>J;876EEO3j@CPoco;q4@y$PRcw* zvzq+IBEZhcqWDOoj`q0qxa-($X!9fJOHwShC%_*f=P-i5add(9a5aD<%HbYgErd(s zMz1jM0bt;B)aE7xB=;ch;W<1u&Ds-;8?M383t0N!35-!6wgCumKmW~B95y$$o}vro zC^;J)EgHsVm%-QLu!#VZKkUOcVgtB*?}w%48u$UGDkbdo!yH1TyRDIAzkG%9IHrqy`q*_uCG6Y6xc{WT;<@keI5>0dAC4A#}~ z@U+{nJ(*%jI!=1>u5iNu(zE$;HboYM55luZ?IPUc0o~}f{$mm1bQKeYtwCeSi7zZq)}#gBSew%1;SOvE2ubc^#t{CK4y2 zWlP)2DYBWrQaoB*%_J768*CM5jAgc^>fX6^oFiQp^?jI(Sf+1Msy8EEtmYA$B3S#u zcnz)wWLi}eXi=;61cnnqm_`-SV}9c}E`wkd0>Rsg4N*_l5)@ubP{;lqrcYxN)vNV_ z3_-b-TG&ZO#`3ud%Ox>$ACrV_OUZ^>lL{y*g?;DJg*aT6tj5NV1o0yKjO&$uTD%zz zC0~~FMkXzDq&hHc3|@YQvoduw-QKRu%p$u1UKkeG>muPHfFca%B3|P%&~|ZdD{ZRk z>7^68AlD50a*qS-T$|%xae21sEYBj33^R|>?1@Mqp7vELufhog%rdXk4tBQ>XG31~ zSChCDzt}!0>Ssypz237ebl9Hz8JC{P@sVvEA5ryglC0O2ux6X*A*99LMcO4y^^6d% zUw(5CaEq`Pj83P=6kVw-6YVebPGU*l<_n#E5|S&N?lc6KEW96dA+a` z`xx8Hz67G!l(Bj@_b#1v?7_DtulVi`XuUrnK45u!GtT)@%BGWV+@?ztNX`^hltC*VC@u!M~&Tq6hM(xu|4{I$JsKq&T+qxsm6&Id`aZDI3i)nmS3C9F9Fm`D=5##{BTw%a8LpZ)ISD3Otzk&b$F z|0J(l-}7fqmFR2@p|#%Y2(MqajnEPo?-$t6R$o|!zpv;>JL?{}07^^LhW}`O2DwQG 
zJ4I5v4CtAGUiT`oYmWKx?=LA=B4`DuQI;{aSBgCOyDX21hx)(d#Q*pDf5pJ3{$-(V zx>EU~%KJjc6;ob3?aOPH&U2xKp~LMUd!#0Qtda<$;n(av$wJdpJ%KlM&J1}Q@)6E2 z_Y8r6b^QOCtZ_r{y0Xo!Erwm&z@ZdG^tiQ1U}X%uCm;7->xBYdRU?Q z;dVbrcNcJcAs0*tmLVo;y2(YWRVJ|AooU^kuTMfhILzR9s49U*G0{?91_Ob+kJ5G( zB=A{U?5=CrVEcA)Y02YxRgPggs|4aXduz$;KuOB%Nxf_KnKZn z9*|AP5uXtutS_VqU%CUUpU65LT3lAry_0#+9RQe9krd0hvLfkqP?-#p8iW5XLz-th zXj?|IcGx(M1tDuLoF(lQdWpWDiu>?jONkA8`|sgj^NWz4diHhrK3sP!oN=zZauT|H z4$!1(f5z?7oRI|8DKt8~27LwaL~vD*)_IFS)lnAZXYh~kU$qoyHRo_`i3n5!`*#?x zmF{&=F%K#SPNTcHo9(N}2r>5trg$5mlKEKsr?c+i_QH*5)}H$w9MT=qhs*u(ql;(84;p5_dZof~`(chP_0YLHZ+vI) zL13Of2b2Jws816PERAN;&BkgIHUr|#s(<6y9(qCp#Mt95*iY=P5?M3hU;J(lxjFK? z6L{*QvCZ4~rYRWDf&Hk6PD+uwfl8h9cNqRue5cZ|{w%InMH%OJvPNF~Xa3`L=t=e? zWF*tJ<54F}#w~eDKU916KJ!d-5d4}DRcgU@lon<};qo;*7urSXjS*W^c!|e<4RBf` zhGS1wv@o?Pj~r?brBRv_74moGCs(G5s`fwb=yQAjQsQ~v8SnMp;{LV5fX!6+d2x8Xysn)kWHL9ac^Hcs4wz4 z6=6XvjBFP>ihP4@$E|3r4bv9k)OZzI0U6Qfz`jvZ$Yv$5KULq$19f6VD_irrF#YG6 z=A`1kK|opkh_oyRNFx|Be`cT*TA?jEDO8fv1@&Jm#o2og0KSfh?heIO??r=z;^nf# zg>OX;2phKNw&aE|Cy40{1~a;Y_~=s7P<02hfcTBg#eCFSB4FJ|?N)i(7y2zJJMX3;Wd}^_;bs;UKIKD-$>(Di=a zoDI>pEToQb%%rqH^t99_v$M0Kid;iKO7LI%rgx#^^y$D?+kSDj++ABc$eS*xIAqog z5e^J7Qv&`RnfQ$G3I0beN%O_Oqnq4lxtD#uemdrk^{4 z<*p7WS1RqgXFkyHtnB|?mQia3Uy0~-x`FYq}e&RzFqZF(de2qfG`-MAg^1jvOkd_rtY&XNqPf{=tfX>x?zG)zs&S}@ zE}ynDHoR3sXuZ1O-9IRN{UI6s-=F_oVetR&>%+542ljziuS3b?s?mCUYge3?$k`J& zjvtP`A_#H1`|(Uv^&98KmD;453y6=fgd`>xG{9X=uRS%Uec^j1KGyaP2H`}HC^Qeb z&}%dinO^35GUX2wtbb{iznPPT_1(ag#6pNRtM-wH)_7JIbs?FpegA(_wf%4JMTp#74e0K)tJ_VI>a1HTIj94gpNCUM zSkgww#)YeO1f=R!(HBWue`@75UBV72~ zyV&}_<1oK7X_2sF-o)*2_+nIxdy$1Z82GM}UJWAY^%{Ii;q@%#$|E+U;dE7C`H8bq z?0ZX+-sFHf{3I&Mu?sqsfb7JkticMsPS0LIHvo7`2?rx)4&R%rUuWyC5bY)X7J4s+ z&}ut(lkPdPm)46$pQ!~s>3ZYu)k$Uz-&|mh2a?`vx3ax^o>iB^?}A%G3#@TVsH>+Y^RocL10^fHxdgw1qz7BHpNZNy|}zKD$97I%Lx!E&VL0Rt0l4ff59?Q zgcmvBW5r_YAArUJk)PAQ-FOpa|ejo?~i)`W!lCuDR_wiRR}1?;ujV@FyZg zgTEtEln0Z*e?Ty;h5fOqjQ)bcQ>-?bpic|`GnXPbvBWQGPSoWa7u%lVQw0F-;CPm; z?A42{&a3kd|3DP=_dZ5klK*V31v?y+!oEfn-d*E+dRCJqk7MoVXdboHbjG59M`ALt zssA^*6hUQxYudY_nUBCr@78PYR8p7la>Jc=V?pUNf1BoBYX8GVn7Rk*nd8J ztF8a&_uOSm=;DtBT)~C46Gf`+KIiCnJEA6gvBh}A&QG1GR711gA*LCkMBd*dqYckFym`#T=_zbQw|P0YM}%|4Pv zy5Ri=Sy&;~J>Uo!`c?nbLSyCxvn&ASDzLVMYYu;_(3_CF2t~E2pA?&xEYia1eRkhEPpm>#}6#m(A9T_cldc#HxK6k@)x1Z_S%*BcQ%ipl(vp zuwvAWrnrVpVI%LLglBMBQhw*D0=5UpHoja#@cdxPX&zv$vpb{~PY@J@S0cS^UWv4-f#GYYhlaKEBTPHsw@}_?zAFt(T!+NqdT(l$<9Q zDLeg8UjOVh^J0h#LUnBJ7SDE9!=ll0l&t98l=H}-pPM_oju+fvmAO4ryS+-SSw?;R zSS#9^`tzLgGZ7=vH;|*AGq&rn!;K}`-uIe^0`XKFyn#u(@XLdz04?jV0tKIDOZ1H` z!;h@OWmz!;4WVlzge?@Ae2Kyqip&u-pNKm^T_gipPx0OiUenAi2VyhQtpxbiZWW=$ z4||a&EZRu(OAyvUX=)Wo0S)Rt;s?UOXz^{_%tO+x6e@Nn`v}T_K{y8beS)y9G<=94 z^VdEZ-Me`MPY-@l-8P0`aUg3t*QOVb!pAgX2#l9F*7lOC#F`7aEC$wZ1$drgpoD~P z-r~zBd9sdZYvzumBd^193;~MmjP2KSr_ySeKudF4KrY+YWM-Z=piCa0!Pm{B9Xi0_ z7-*1QEg!S;MtGrZldR{y91Xgv3X8r+I>*_oizE|U7@)vEzU(=SZ23kx6J(#)mLXYf zm}{TF=6K0Cpt!;`4lAIWKUqG4HN!Peco-izqn}o{9!i)%TYtyS^3i%4OX%<9|f#IK`F%E+8YU+EI*?n~ozqo4fkT4(TW$k;M0kHv05{HrugZOg>0=nma?~#RfFklQIiyBwp>* z3A`uKE9%tq4nvE??|JxrObxR4JK#vOU3MrWg66Kk>Itd#C$no$A?{t-Zu)pdEs3I< zgq!;4A4n^Q_F$AW`d_if>3gb&MY5ZPrDqH7+bL*2ed9Y^{0Wy+{*yg3-Vs5GU@pVQ zNY=uddvcL%&8|sfVQ~X#hkoXkTYn3Sl+GXvzn!KK-Y=dvS~(`wkmtzcPZ)yhw|IQK z=Pq6|FKT1_`4xwVMD#13&HhpNtKd=?6yoN&smGKF<0^oR`r`_tF}fZPi=eNuiI+2Q~gI!H~%eh@TgfX6`hZ3Z??c~Q0PSU{Ogc z0|;{YG~>`G066wXzxqqurL8a0ZePuk4+bD_#f{gf#6#Xn!s6_3IQxz53Y&Ckoa7tU zOa%d_QqrHmZq3cfus@)#=OQ*t{|J6thk?HKvxHS<@5W!Lf`9uzRuz1krU^z7W8KNS z`A=Qma)jkbv>RTD^GvDHpWDL!_kG@0O6eTo@F67{7rJzXJxZF+HI37D(saP 
z0>n96{-BCX(cq5mKGY9~SENx{`l4;xKy^uTSy$WePw`0wfr&EMHv)Kx=Oy3m^b=GE350l9+2A|1>1 zGv=0e-iFcqK*e{SG4sJTjW5YVcbOP$UHX#s($D0!r3lH-C+Rm2uebFX~TH?(Sg%m2*n zcu)aeXqNDx;}*qV!zom(k$IKDR;=$HxSgD=uH&N0^T9FL4}400B9q>VR{r<#pSK3m zujt=KsFd#4*$BROyT*)|_0K&J&_<|Q#`X3EoflNYPd0V?^#kYiECoBm>2VUzH2`P9 zr!A01>5gaWf^`h>^&qkcIUx{bnu)h35SQ}q@f~=ELzCVEr#oWaxmt{^_N=JNKG_-A zun_^k^<&L0EP|@L>i+Q>yX8l{@$;s3R<&g*!JA2t&D3XFGpRb*sO~WgHVVCs9#o=6 zM!UXPS?W6f?8m?(X4?rl(UQ$q<+v>FCFzL^nP>a=3r&D=bOxq6E)2QJOS1tUUUcov zzW~IAc#)oJCn#fuGn?z)zc``RQvr`d8>1evdr&JP+kvY|zX1OI`v2Ag@{|>oZr(r6_d09jroH%sxkE#Uc~4YQ$c8?di+%Gq z4sZB;Dv5F8Cd-gmTBPU8U@5aIH)EC<+k`$?O_}b7k@L)ITH%YOoAYS{ZBfIGVT7FhFQ^VCb*{w;D&8v<7Y^nc! z`lE0VE8SN^A`W0>TX}tRtF%pDy=6A@^SymNAZ-Er;+=-krpT_ppqo;3HH*8c%^7p{ z7fUja@9|t>3Vzru_(>&JL_y5EiWHIof1x|WPKDhwb@8jyVIoodr>H^3qxWg8f&xB_tRr^J)pa~GFKK0mPd z=4N%LFpO?Eb;GWMWz1VmCGK(yRF1@W>UF1mC9ODgO+WOyOHk_`-+ps zAoS;TYH;05v!7S_%MzOq`9Kq{Rw~E|6+y~@S7YFn`x45Qci;2#+aprcc-^f@&v<pu-q#${Ukml7zAr`QW=LoqhmHn0t@69`7pjZ$+fMN? zba#-;(uMgmd7F@l**9sK%Tk>z#=Miaa&wHHSk;@1WyVor2K*u7V4s`ESXz}Q?xZiA zzc))@-?P0HwtVzi!)}4tGF#+wM^$dS@F;LS0Zb8gRpXa-y4u z*sgUee5Y%tRg6wv($19$UPbnkS7F@({`HhoH&gmM6(*Ydv`3WiWEf&^b*v5YP}q9K z3z&73fum$TUUGY&;Cq#QH>yoG-W2zbS68bc$G!3`a4re)VHR;!6d?W>ZuBx%*y_2k zWDZe-W7`^#M`P_Q>!AGjr5E^c+0@*wu+T2vxBlw56StmA>nEHpmSuJgzR0rd6G?h?a;r01ARt3@1z-o+R?+Ei8oGd`&8|_OLv8%He^!F{Rs41&y zSHVtXkgeV+Itb0dDa*0_+Rw@_3{EB;(<~G{NmFDpZ5U}wg(>Fq?w_LVhQ6tW?T6E1 zL;QX+wGoUs<%d%8yaLsV`Qjg9V#?q(p1snL<^q z?VVnnmkRd9=Eug$gXuQOL7VU|z^;%U!|t=z=obMN$?q57O55HJr-rI9f3dPj6S>X6 zjBFw%I%i8bSh;DcQX|E}Wd-=R6Zv7+GvCzc2=y3zv>O*l5qk1o?~1>FwCyA!04|6E z;d|hc>Z3llwzeDJUmxA~GFIj)N9N+aAT8CC)d+VcaJ+E2(rJUAMiKXZ*ciUhrCFy% zHK#SMst7&%nZRu}JN$EEjW)etO{ps{chVwfR=(I{XNfuWMK|wb&UankE@s8q7` zNEq~&SKeua>Ao(Sa%$q-1ag`P!o2%Y4;&=IeqkkYLpysudFvqqS~0voQoLbo6P9!} z`yjiUv>yxYtBkb)l$Ep35^szbKFq92Y1MP7w-9|$KAy7^qaKeP8sZFl_Fuf|W zqT#R8yh`=H`4CjcvfT?wNa7GWT|U*u;L}Jd;W8+~^;`t*`acv{{4Z4P|1-J*2C=(_ zy|b1V&*+`}{P+U<1w+=Mf&yfO!rq==&afV~`S4Ji_+`d(iH)&MBjOu>!rIb8>#z2e z=}mfr@g||EBmX+I(9%|A>HTOwSEs6aqUmht9n8j3+>M8*lza}XOC4Q!EDXe4?+JaH zzuX57wav<>DxcXk=Ch=H_Ta07{Cpx>q5_eg26c;$W=~&G?`Yg8`>l1E>VuZ90Kte7&s*eE7H*H8SROXx)`L7e%??h4g<+2LJ?G+8ySpaTL9f2qd$jWb zRC@H$x*C4GE$`;4lrZ;^k2W`GHF1GYx}JP_eqo2ez@eZ#2$#(U*UhMW5nw3lKrxBl zjWZ9pxGEhP91!KN<8edByr$~h3H+4*CIhqAtv;q=QS?>HsiU490;tQ~+s0V?-Bi+& zndPh}N2+P0stlqod*y6oftU*@u!-dTxmNNI`QiSHf3|I7@$Y!k{w*zTsd=ddFt;Fq zEXOSk;%4t=FA0r;>i!GnN7P=RLgoTs<9gkB$i5MhCq!cds_4s>DKxG1Xoi%hG0;=0mIuNdQl8(3fg^j)(J9F!`cDN)G|K!BE$!^a+Ds%E)63vB6|%>Zq>{GjcD4|RuGdEPu%`C3AUQ;G@*888EMA#2BuhtqH`6?^ z>9^akZUkDC(VwV`8!}6ycmDt-*wSIYK4Mvb%aXmJX+4$X#cvD;LyVjN;n}h_eYf1GWxyJzKG#?uJly<18*sd5p=`YVcZ80N` zKSTe(l_a#WO8GL*&ZWprv^r#r70?Qg>T!O(W~ou15Hr}mtXYw zEA4U->(Uga;Kw9L`&AuZ5aTC*`0SP$x99m4UMl|cxi<9Hc=k(6A?k~TVRAY`jGnF7 zs4FuQaX2+&2`bZ$kGSCdP+=9G%eBly&%5F>*Pj9l(hE3^y|IZKfxdzMBJ4fie>kz> zMfXN=!H)ayPjK>RK1>w4v9=3LPsN^BZeh#L`NrY?{Y&g6)vG ziS>HMFsdPRoqX!0#K$+^Pp@dpjcl8wzp!f z(%#f#zD!4tqTI2S!<6Zo#_~ak)LUZ&jS;tN_w0pNj=!uhaTEq&J^UXm2QWoO{97xl)OB zdjIXTfv9B5=4nNP8>h}D%vHZC;YAA^G9dV%FU6{-sJr^f=nu-}*T5Gpp4_b}ki`!O zwrJX>+NK8&^wkOGzPoTY;L1tHwqL|r_&eMov;jGJ>RP+&u=NWr=Pvzy{JFFid>(!L zD8Gm2YkXD~qOX6{>)Sq@O|?hdt;<6a-i?<%-aYZ#r&foL!#qZXj}F+y8jn0A8{`+% zLhsx`owFIvCYIIjJYMRMeWF_I?lxiAl;IV)IO+m@e4R+?_6ahD@|Hur)1cyLtHykB zb^-r(Ks0k#fHgNmjJ*AZPMopOKAR$yU5vwTjh!%@Y1* zy?8yNiQu(F>FN3D&Nx{o4O(=uOmc+zdMYcEO!PhPATpyAEzu&ix1={>wcv+?Fk3To0FGqw8V83^&X)d68Y4AQ% zZ0#gGinO%hq08K~xJZnk;=mqtVOnfz$vWrDt#TjRcNW8WtpU-INaUrrZtBSTQNx*1 zBTpV@wJ(KPHg@!ASzhfrZy;r(OxgEb^C0KNb?L_);#`u_8)6ull9Z5$pDThTs0g+$ 
zy13fjeOwipz!X>|joP`H)<`J#TQtE7o0i)z5@m=_vYaOhbtNgGZDOJD}6>9 zRkEJM3|qUdSGScYbLuo@?OVc#owhkYF>FCC0ae6SMBsn4@b3C@XgdZ&J*ua>%_!lmbe@S&=yxMi^fvMBPfrL`if!}SAikv9~fPerV8%TxnEjw)RegMRbyVXkGw{@IM-L=?Eq&WeK|E z>-jGZ+qqwPa)!Hv z*&cYeI9s`Gn8=h{zbD_FN$_~DA^7#cvAd6=IfTg7?a+JJrbCF3Sew3nqOyVn4 z$`8R=n>(3-oRfHa^dFuDI@kCS?;hCkxw%~j9tS>Hp1K|juok^UQpz}~Px?kQWS^0( zEuc{7{NY1#P&)BYoz_?x`Q?EXv8`t(MO?x5j(U zb`@^cx|Fr|@v27g4jn7$>(AvP#SVI$D;29Esg5V>QWM$@HlU{ELP7~*1tRA`e;#YS%Avtr4^j*8u*7zkcSN4M z`>7D?PKbQmu2<2RJ_=IbjF+Wok_Og3kTlX_R$9Oz<(xwGA=v5+ zVU`QX2i0$GD3uJ;*S>0}(4`mG>&$gdKTzP1M}OxbO@65EIYfU$5+CQ$z0j+*roECo zgP}n>5WGRNzyfr(P`^ZZ7-hDmc4XxSrCJ=x6+^V}A+pg8M<+Tg3f023Bweu|+;H$7 z6?@!tAC$#mSggsbwd2$d3+_YjID2?`f=IZ8!y4i%NrhX`jXwdoPp8q8keuvr(u$%@ z_;DGZR?^@zb;cn*%DIt?kMGt7(%MeFlJCOmS79Mwk=Dgm zN5CR&zxdGCEt>n2>hn{Eh9Wybht>TMoofvDYU=-D@4Mrg?AB~!MJym9CVNuL(7f;(h(joVnjAbI#1%Gk4CN z>mU9SV)7CF*ORXYIAlB)Z)Qd$-5yXkGMX>1qh2?PXOx;Dak z2<{QVj-&&}v`i#SbHH1%jGOqXh@n61{a`diHcd-7Al$eg*qiyh1K)@Ca(GQ2qDu>z z;Z8h1*JX&FEm3pUdiVC!>OthqaT?Bs;4=W^$ydQDtCL@s>lw4y9XX*ngj9@X>Uc-lLyZWcbbJT)GlKmmlC5N z!S4gnSQXFeBhXc*W3UAK?hAK@xGq**S^WW$pWlg{xwG8sjoPILj7Se6>`$x+u_T7% z1c;Nu-sXH?);QUS&H8D>a|ylKhOoQ-Xs$uP|%bUQX-1TTOt6>`C3e zlOnd)e?(MQ0j}9ajgpBV#lYwJa^M&stIQ5F#B>06*AH<=!W)&_maOE6tEG`9etX=q z-+l~;i@ySFQF<}Kr{jnK6587%UQOy@&cY!8&49_hX8XAvO&;S6XN~|+*XfQ^gS@WN z6)d9f4wATfzc(i|zrdHN%`16E=lp}3v2{g#QP@sv5Kp`$a5x1K%A4HRsv znf9TUiLs7k&Mnr@Jh;g^s%TD{;;#orI z19I7K%Ns^Z-q_5QMs|kQL5r8tb?aFn3Qs8VQ192)5hHzlFDiM`jmslcKjmMHpP`Xu z$BCh`rbk2d+xenhrTE!=+&rp-z;ZzH^r#j4_UD|#7`=Ou+hQv~WA{&}q6&e(+L)A7 zc+2<>>eBtP2M2~K_g0Yckoqm}h#k+z%Izm4oePK>rfku{H$>N*77Ty_5g2mst|x$r zWj`GSjNuUgsReL~Ydcvp)rh-gbAW&_CvS#*nDR3F`>mHp_P6)0qZO=ob5ExZa)&OD zlF_ttrP0>$G~2-BRxcmE0XF+@uFFr$xFj$t1#gs2MjpA_kwBOhy8VOZiR)@gHB&ae zu6JV`m^u-gg13=_Z;#I6{m>k9j5el*URuB}*>dx?oC#eY)>KeFFEsw}3y_xJe3F@z zVJ-&pQuEALD)tU>6M$g-UjSt-@<+dGOHItOti&kdU+{n9x@UmKMGy(i^)>+Kw!6j4o=9kR}aB4Rf(nz1uW!( zhCPo~v*A$T_aGtiY>x}60|h4gC_L}gWhkv^9M%eO7k;(X$Z`WM&rC+Wb_}Y$Wb|Mf zkzlH?qp9w~$@N+?Kv2~Tz>mtxDsX2>o_Yj`8Bir-2GD28=fLB5M3cR{6um9vMmV?^ zvV~IFbsW#nn{caV&aA)nAf5D|@w0 zG2!0+iQ{P5X#7mVL+=||$B7XwYh0C}C;wsS` zz;t;XTL2c~KVdJMcDU;|3N1#1;og_kgD#7uq;V;dVO~~!PlR&1QU=|P#Mr35mAi1$ zLfxky!Haeo8+g4q0KVLpnm~7T9dX>z@i&H@_rSXN!{F?vXk@P8A+spI=U7BAZ_s`e zPj&_UW47z}yM``YyVbVz#%=F(Mf9!I{&HFc&8iY! z=4X3!kNNFKz+3%>Hr04z!ir~}k#=+II+K7Y`qW#|8uA>LKbFf@FOVe_^zNOH&@oj0 zi`tI*-|IW&t;4nO>{rNsvShx z@V$$FDv2QttsR1wj62;@-@-}A%z8;Cc4h>5&Ed!eWsTwyOK62gz3i%FR)1lb2Uzk2 zW^;aYYLGU7C1i+@0kDz3feB|De&PqK-=S{cY$NyiFZ_{7J4J^sU(IVhq%4X~FXG*Wc06W*L!hocLennrq>@;RW*Yg%Z!;{yVryh4r-k-!c?>j)F zG#-x4pX)smCFo1*V}|W>En5Y$I+yo>73cFh%8z9&&RIqxcWmY$SVY6W`eqT1C2Hr; z!mQWg+#SAy9+LaJ;Wr*oS_Q8S1I<_?j@+oZCN;^Vebssv&ORwv!YJNf{;P)NtBP=- z+X*@Qhn(rqo;Q@pj<3k_6dVjbVZ4laDbsHqPm*fcsj-KgW@2gZj>MZb8c+v1e<(sz zP2@~uS7El;U7;qK^`($DG;f!X<&1^b>^?k6#{XP^?vAipphC;L~{7)2ep_zKa_ zQ{-n^41B~0Z5rIy*zg6K6p$C-n5t&7Uj(iQ{?J~O7?AD`A>BeU0}clZ408{#)N53K z^cwzeJf0Y?2K@lF!mvvu7oa!eek&|<@UMUm>!g65H19O>sXu}bcY$F;7nSmUo_YT) z|2(Y;Pfw6SW>VL=%D~Ynp6swhpm)LV-A>V z)oPoLRN}->?a8Q$24&>|^S`Mf_>V$VuFHV&ay!kFBE6N&Peb^#96x_KA<5ol22W%) zm#IU;cRM8j1xg#|I0?23DdEZ|ZwG&5vZB-R0v$*5qMybI*DEz0Eq&OF8D_2Sx6$Ge zOLb8+b?jVx@b&{o;y3N!w})#m51<9EVg~8&eR^;&A#*fNUGg=3+cBX#q2qaDSIqDN$!MfgefW$jHTRAO`2q1eDTC+=s2X9K z)SjunmTG6d9CUYzBG52wk!CDP^|MDHv@86OBh-@dv3^AvZ-=@vSE%w{Ds-1$J5rv~ zFU>ct@Dj`w7KQs7u$iK)qD`)9~(HK&$O&0u~7` z2>x{AJI+GoGyihoLiDP}lMlKYr2TeSdIVl6Kv%n|-)9BfS+KJGfV_bg1a9mO`^cY! 
zJ^w-`{Oh&ZH}4_-$h*Y<{#qUizz@( z8h7pFQRMNfQ{>H7w*u#nH&LGDGQM5lz3sw?#q65io8$IqtKyD4paJLD&P9(r)Cp1& znHgS}gYTZTJ1c)B;#tX!#g=2f1itPq({rmA@78aoR-)=z*akc~XQc%iCFBoXM->J{ z?c*oS_?ewHLb{8*?oVfG=a0nSH=E-rIQpDehNi$Lso_d|?;x%$Zi*$PKbo@7VsD5=yD{BE~x5B+YQZeh^q2 zp_{K7F>@qwB)CStUa4NZ%lx=7rkCBUo~xa8fA1(@Apsf@Gw9p9Uikj>DSb1RGj~3$ zaPx+IZmYSdBzHlgiZ*bY`qB%V6r;nuz%dnX*uKVr#%``v)#`TFIf6w|sd393%Uln_ z!9*Z)C2R$1V=c;_Sz>O|RGOt-P%d$+YY|`y*Ju3iiV(oZyhn+p~f9JF=U zdV2>yKpxct*(6O^VZ0bt)Zd$JS;)<`A&5?W{qn?Dvm+nMlQfNr#sOYQ(M?t*IIFMB187)GOyqi%h$&$&o;rcH0-PLc% zj-Q?j)uvmFc+Tn$Epl_JcL%>O{BTErY9u8nsPxcOLwn%6IB#^WB>L%EHrcG2|(bZAm{&3wu0%FKVS+e?)238h2Me8Hg;*ZgH4p z@tnV?bn>D*4Xj*bdQy}W>wV2_Nk7b_Dqv2ds?=sjI`-4m2!!5qXP4H^dnfm#FrQ+d z_o3Qv+9EUrniL}jiY;0o`Q{NW9ykqx9H)$WoP}h|@S$uNO*+@WCWqGr?-&ftG|WH# zGTterIAX?HgN>6dB_#Op`O}_?_|k6B6RP^?jkPSk+Vaa~EK4$?7c;w~Uj89pB0 zqLQE2mLKACMp~Bc&;z$K9kf+n^~~5+uH?0+h}&35B;P*Nl$_E1sEiQk9!%M?w8xlb z@_5vWK7V*hE|PvHLzz89o;}N6{?_!n61y8BG!>{J*e@KqEP(g{GLz=g z4A~-sVHbq-#P3Eb#8bh`SjU&8Kfvp%T?XOi@IV8z6!zXPhZM?uo7bO1-U*c~7U7Dv zGj}Iq?N%~&oxks=bayUwVsG3J;p}*-hrKZK_7puhjp8G!&wKd1};yZK)b{W&AxAQXztm(FV8(Vw} zg%v^OKF@D1PYdPe)#_3BoS6=Y4%iK5-%*-#UEj#Y%aE5EWX>0c66*WR6@#TJi{siN z7NTzth>9_ZK16=8KG3SPZkz&DUvX<)D6Q_JZ`;&TF3p#b|01pL3+RLBX<09RFQX84 zOx&3F(0@7F{w47c8G0GI{2Tyg&xk^B0=QG_C&`r&Yi}rw zISK9tUEB151o=syfKIYp!b@VOYA>Z={C6LimLoRDRijcV2BdrgkOvMXQ6}-U{(YW< zam0}N51nIrsfcDo0OXvfEVdpNsXj1c$;FstS6BWn&?x}zcn3i|Z5ysWRBHgekhe$IHpyA!cFH3;a#ZM>p z?^X#bzgr*GO}=ZtB*_+JYDBtxOfUaTW~Pe4E6&Bv`iLCW0bm!?%zcf&j@?AeRxz&C zAMNE&=uh&G(4gh0AJuH1nFqQmuFN*ClV;KWP+==ZPc5ra3f>aUTocaSIHif^gYSg| z7K+uJlg98NdS}riYctSKW?vSU9x!?hB!`KFUA@MY zF)Yl$HIRV1UO@_yy17K6Xtb{Yj_r%9DlJq+Yci#NGnTF{eJUaB5yTwoL%Ofcg-c5wDqe+y5^*5i{q zR;o4{of;>Jw{S}h3_5Ohs-dcm_81}W)d?>r_9sm4{2E1GXzbL!4z|QEnpwLkG|1!8 zLqY#jHZ=YOt|-W~K1DfZu;n8ouR}V}Nerq1y*eIeGD*vkqVd?H`)K*KTVPl6t^r&6 zAKvi@1bPCJ0NonUG-+J|M@|*#z4b~VLd%D^f_#e$PF||RA1Vik_zr*{J-Q@)W)Bs^ zvtQC!eF2LJz7yH&srf9jxU+3@X3jPyq=~_B9Bdl_uy+C0!|sYDW=r&KF8g3M(*k?6 zg5Qkv*J5_NgSQOiWz^?Rp@7)U6cK0g&a%DudvP7EBqeczAdqxV}&8J>=MUWk_0Ny`y{!GvHKlSfg(uChF96i9K1dhA{tdRp|Tay z++M?V%&Ch?@ixo?S`em{arSP;l!km)N2Z*v00?yrUx78PpQ0;V5Iw2CM&G7THJkf@ z`PBol$*7IkcP-Mg`zkzrIf^3J=R}+xv@>p2zo^n^LNXJDagp^gI~KFnrvq+0`)-tK zGHC>Nd}u1RWLZx&4Lt!XgBIVFh|B&W5mD)?*MDA0>znh_W!<+FI8?lpHi4JOk2_J- z6?guKY14`P;qVh=uj^h7_46^9K)}ojm%I8}$H7f=oV$+F!UEla$kIAY;Hx@vw&-@e zkD<^RqMiwy&4A2I3UZFIPC+)ZuzuO66BitlW1R?Y9EqJisC&0%bdIxOavU8XG5K5( zvnV5sg}jwg!kshvB#WK+9t~w}Wjm>GmbSfRBHY{J+esWOd+I_p*b;oflgDNzhv&do zm}4izFW#KcrDXf&*f2=QeW=;D5k5vnn{DbvbID5`U3;W7lGBwWG5fgbB1`KU9-x%oumqhwo?gRz z%ewdd%WOyGiE$u6N%&h1JBwGwnAey62Q_tJW!`@2P}uA_JE2o$Cf{Sh2|pDg*#>~> zpD%prb_(&xUbxEx+J6~(+Vo3bqxQZF^e`!lGB88VXm{jr7kP%vE(NG1FmfbRsWt(s zZ+wWMs-+n;aH6kYTQ1t7Q)X@)IC$jTbHp}B1ypve5 zYR-3bbkZ0a>r_@EC`*fOR%v2I1H7s6{q?TY^CQ)1mdH@+=ll-)-P%!OU@y z3%Yx?90V6sPC6>S9{cK2?BT>z*x0tT`?$P`=7OFH5h$l1)KV>;)7 zP!5y<6peG(|3qX(48?*M>TZ?+M?NUE<9;q`@lf+D$ep8zFV<5dJjA?gzJ%i!P zMRZ6*)CNSXlk@xelFQ?cI8=-g(>&{2@4@F5urGzI-G{X^CvMcd@w=#KuQG414-g2Q zf8vU=_8XlXQ5kCpIWcniCW+f>sTHpk&zM#eTb5kB98&x=UcK=3^=zQfgvycvDw%(T zyL?s#nt6E6`)SvnWch|^tro?f$WU8blLhnR7j22SgfIasRN>rSVFP!{Ke0mtVk6{^XVVr)eAPua018)4`SIE(V3KSw8(i zpRnckmR;s0z8M>RU|0^P!Z+wK2SiYH-o*}z^#Eq~5|(ABW;l@Qjw zV)*UUF`Xw=Z?rHep;!_xqm5)MP%L3b-0rmakZ;xM{PX;uYyaa&1S=J0a;`NRSg#mN zl$O3NV~)C3!0vJqzK5X0r3F`?$A7BJ;oJY3ld)nQ%6$5H5PT((T4PIqxk~~%aT)q4 z#!Tn7^^{(u3z|sVTVw8Gzcn%V3c|aL>Os?D{AD8TLJ>xaI8jFJ?<1G%0tn!LczoN zC)1tRIU4W?*NO6nitw!z12YJK=DBrjXo%_Hbpe7fFY+2btlvI10x8`j_X7zl2`PXF z!CI9RgBp2fZP1{3bF~;(0Ho%QdzBJ{+_(IzFSyaxFFYqO4Deq;3auZ;5^fxa&y^jZ 
zRL|9kwIL=|cshT8BD%N_$t>V==!9)9U!XEu(!`5O8Tq=Qytw!>nSJ>|JG@2~mFsy5 z8&*#~r&ASvL0*R`YTo1`Aaf_OWQReOGC{MrGjHApaR-9LKxYjIfmwcnc=J@!iYq)( z#+o@!H@i&lr(Rw4XI$N%_uEyXW?}W$z$~LF4MOLP^+r@0Z!_qbRDhZ0q|eMd)vlOi zC^*ex!`J$qGqX}{UO$ zk6uU*Pe0^@ebN5`n(#)yZINzt(b02x(Ufd-BR9%^QD~RVbYn)S?awHw%5>^|g>;|4j_*V+yRvGIrKp&|q?ls&->dME4 ztd8t~823kcM%(d*=lrc+EqcS}Uk_JqSLwFr|AqtljO&i6<|Q(h4sCXSAhMs0DN`1` zdw7NeveQdhGYgbVudur&})~f0iBpMV~Ue%37(P zD4R0=;wW3r+Aw8wt$+NY$Hmta-*Z&BEl&Lu-eNNAFPCA%!D`U-%NYJ6Civ*k{e{P8 zE~T$CP%_A@Ax$Kp;wheN_;;D(yzD9-x?EBGXFlq<#}b>dCo#(x=(csm1kNsX#Hg0+ zBD|TDj1@a?5aKPcOEala)7O+fLq>~uB92?O)SSp|V2*@*l!~u$e-1FT&)63(*J`I` zhs^4sxI>*{DxLxrNZHT7>+U>evfVdI?z6tu(vWAwu5?{gQST}%Nx@TL4*gE%I(A~b zM1r$2I{r&T_zS;7Q}Bc{Q!L!=ysOv>%(2%*#xAt4Nr5|Bq(CX*O^4Of`SDj?)EzNi z6h;-Xh6XDb+K1|QIn+Cm-{^Li>frBOqjh&>xNy(TEWOz(DQbC!8kuJg3~nFZwpZj? z$>eEmcasY`8<4!vF*HFl71+J1>(8N^3piC?4%7eYW}P>AU(s&H^P;V3C2 zYOc6?25^H}cwPkrgwKhUwi*j1h+c84((om^U@aO<=q&9D8*MaHjXp06^8(sdY8Tz% zuiZ8s`YLVF+C1G-YS@BW6K?b9qXt>W35yca!26@v3)x-(+E%LDC(PFEv0*D2K_u_2 z3K}W)E##sW5#3fit9*w0wqGS>{IfH(d-+XS5m^ZUnygyE!HEdM?egA-o5fPC+-3k? zim+a=wqLwU)fM9%O;#$?5Up3WM77`;c}8dOr{)eHP!;DbRdmL}r`WPkeLq0vYx_?| zhrVA{k;tihIS&A>k+u+al}M^N^Ysi!Jo4*O%uG^Nw9NC|ojT+NqTw7pu$VhuCGR-h z)uP$uc#fWC#bCTy@FLS1)De;E7LQ6zJlhKvoz}ln0=x@LlU16!nsThbp?ItY1npZO zUfk;Z7}*MsntO&1u~CGGl9j^o=VrM4iS!vBHyQoX#f4ic0bot4b=MRMW-o?<0*o^q z{Fma+a62cNc~*JeUsa8^{V6N6a{FIRUa)>*JeQBzB_&n>A;%Vum8D1*;kV@6m{sy*-m@sYlA*?X&1x_jCmTF3;LP>*h&o2Yy48&XKkc6B^!EExZzrSX zw>S`l3Q4Nz(XLX7L|7Z*%C-h(EyYF!3-$yng=fa7&n~|zr;nu$ds#Z}z4Kk%TxNdf zNN?{lbV57rs)8bSEBLl}nJ8BUF!$M?0oU>W2K?ba^c@)z+B&XjI<0T;bFY*2lNc<- zj#p8&4f5s;o!n@ZNh`}XKHh9~ZG?ia&I8cL07`bK8?}z|qpGflEb4cwrnpCdoo_Ny z_iu$(e|#POqvtPxSs)Y_&ewm$1ycB}&M0UK%T`PF$D#CdT^s4awtdOm!W;nmT75b( ztEc@oOIN;~MM8H#TIXrE4G$%MJAeT7PP&3a%Rp!OKCXwHN;prX&bDnj0Vs&VWSgTm8 zF%xCsCZbyE3IU4uJ`&sTv@<(;-h3>VulCD{RTZb}BoI8ui|w1+Se?wRdt{a&K2$5i z2vepRSO+3VyoYLb*EQEWpX?)k(&cou#g#he8(V6nWS*nLxZNEF)FHuO;Z}lxt zg`VTNk+VF*y%Owg&jYv5)Xe_(yxa19Vn2OOG%M=>+N)S&uB#JUUZ^8n)9`a``~Sdm6OrOdATd-&Epbf3J@ z7DprM@Kw8hCcMLoOTlv&S$I$vNx+CI@0HlCMCYLko9pB;nsMI6z}YJMXgd%}B)}bD zkJE@P2ctZ-=$RB58hDQ$jCd*rkZwr<=qbbT3cx!b!?E;sO5pG)y=1S0q_pQ z8B!j4UU&v|bWjR#koHyIMJ1zObbpAa$NOfyD3PHr_4vA;7{$qVP043H+0>0;1$Gy+ zt8ucinKy;<#o%UzRd-lNv`caiu%-7b&ir58)5q3fAox5A??f(d2h>)~P=i*)4EzZ| z)D2#b+#7(=bOe9_8vfU#{ht8;BUq_1LlB^I z^LT_^Ooa6AoiU+j&)j4ZJka)$ZuFLQL)J$l-xsY-QLWJ&y1o*+I@as2`+N`ZYL6I6 z(I^T$25ZtjYa3tg7psMV70Bhajq|pMZzr=Cd#}MM_it1a-ijrPo$U&8C6w*06OMp` z$i%|O#k7;T_Ykh|(s~j+RCzn2Gs$}2QGfEDeQ%4P*9U0^jb}pCruus%3iagYbT=zf zJ(JWV0sItq!ygZ*|NNu+Sv6BbnzPo?wk=kC>vsb_E3H~SR-8Qb8a6gyG{ElaomyR| zS4Q{qnEddMTT)&&Sf%~f)fj#oqV8OUd2!#|^#?W_S3~E6WnHAQy_>)M0J(-OEFLzV zzi`E%GP|5VXI~NTDV5p&<>o&12z+LKk^x&e$vb%c_LOzPLzVjTU$WZ3c7kzJYDIf= z*E17+AAwGQekNvisO+K!nx0vF%&0_;mkWf-bWZ|{ijYGQU)OolVl6M*31 z4aHLVN$lv2>0>mB`et?LV zJ!Ue>`#dFaCUs1$&z2rmJrUndxuN<><}J*9X%L{ekRZp%aJx}8&%*1g;5Dvz%4~zm z2kS!YUp;#O{2Ayok2f8$*Q(r@{4vwyMR_uxoyXHVUw~xFYtD}W5=i@+fP?MP5`O{H z&c?bvw9~#MnQ{gvNxlU3H;Ak^mNUl0Uy11~KDqsJbGVqoDc|^Tmnso0zjoM9`dSTG z#kxd%djkH;DvU?99G|dT!0=YJi@rlJ2c3`7~^N*@9qSZp?xyA<=`D2tiow8el;W#F=PJvSxCe ztsarSETH?IF~h5?(J)w7Vf`S1H#%=MI5XcR=*9(;o?5Q<>)QxYKbfKmXns_KDL#ps zcGW8wF_JE-TMwZ+lexEU!rGHGI6cKezj%9FG@ZQi10)6g+|d^}|H^gy>)OBO`=5Ue0E`cBfDrhuf$T%BJIVc)$*N}i#r-BmoQ3Dg zm8n3emwko9&RKVgK81>#lXqBB^N0dKS=T~#zdhmyNcpQXq^26FJy|}wm89ObzL>%D z^Jug+|9cLF60hnV8`|yMGmxOy4cMy6?sKGhF_bF62| z4w*~2iz9hf9UIJ}w6q_f!aqqn6ly9w`q#AEe-eWJ>+}DNje7r||0d=aX>TXPp70su z7g@K5zjbFx>v0v)tFWPOUmqenzdH!s*)^%F5L)LdP7fQgeflA%_1fND8u~l>Jx_H& 
z6nSYtLb^@q{qOf5-FXcCJ{dO3sb4vCCgv;ulJsSZ3~66ASnw3~5i-yw2Vbv_*GL#g z=DXSFOyCp=m_F1dhQ0E3?7XWZHIL!UvP9^DqKnndEadk)(CMW*fiaIcih2z$H4Hz+ zr$T+K)TaZ;3kBw+@`dJaR7}Tu&XO|mmUADxIqqt1=Hs;oWr%u-K#c=JVKz{1_%M{cB`xc0Ok@!YR)u|wV zr?yofC8Z#vJbd6Ku_towrXBmr^?8K+xcBK?OspH1)(A(mnZN&G|EdI{9a&e!QmxJG zzpu;W&Qf$X7WY&4Z~iX#cW{kN6k=sunP(Dn6M|` z?vuiwP6HjzTn%IH$6(1YmA&m)7{?(?_m0^2!q-V1B}zZlG55GW|Nd_Mdj5}VKch3) z?&^P3^reY>xwIrsMVTppff-;Ib7LKhzLjQk4_`8?v7TQbRbEp6@xIwYj7D5u*=~S* zKM4br-d;GPf{TDKy_%ugMzn#RR^JGPedy~btrLIMBQSszncZoq(7U&$m^ZYZfG)QE z-p$xjwpV|myUI%u)+l1hZ;%z`4Z&*26vG&-nW>w&d3NCq$o#GPIJ3%8+0tHf+k3vm zmV(VtTgaOyJPjhdOYuF$4wslJ5v=9W$I9Qz85=@B#*wtQ3_PBzIfm5y;&qg908I9}zhAMGMT z$+g^NsU14OFaVs*_kJISV+g4U88MsDe%0*&$)m3Tte43x4k&M<62D3QfzMAmrT3*U|pQb(c5&O=&xxC9;RG1zWBni(xTy@fZOa3cGu!U+Hn zn5YZJHkU7v&I37;gBhS%Ad}D(0GVZ5MDO6>(>%m?gd6^L9e7Fd)dL{nO9-7J6Cr@C zxf1dAW*HCxa-xB%Jej*1mx#cBc#R?wVfeE@K&<4}gs~kNDj-2Aje#til>nJKfMLH; z^9Sg>M=rdl6{uKvg8{WlROO2#@^u!V&I!3fK8^+&I$pdYcE3Z)jr6cIWlokZQQgloxhAim&R2yKov)UKrG;F^2r< zp3U`cGm%lX&g%9b4JGtrO0co{b=U{y(x#) zr`t!gJ`w*e3$la$-zEfTRrs~7e;KS_hs=M|@u{1Y=G}~}ypZLFyd9i%Dwh%ne}r*> zl5;e{F(3;6HoC~p+K%uFGJlRcfryTptER-$V%KW=nqQkfu|v((8phOdy0Dk$9>2vL zkS?>guA>R#3v3>0&=`frgzdOWX7mNpC1wMGiX;7BjaKwb3UV!3nEO)(i`rY(U(PUC zK7W!a4eHkNIM%uW<8rImFX`i5@PM6KQA@X(HZrEYY%yU976)skjaCI_u{N0+=M9V* z@54NLmQVp{8gmxIh0w*KsSP!p)ICFH4_|R>X1o5JZaH8x)&A&!}g=;e}I;brhs+BTq2MGAk^c?`MI(yX| z(h05tM6TdAn(N*D20S}FfBGKy@4o_p_hUfpC~d*#!DBie!n;EOL_qw&FAfhlJ&+;5 zp}N2f{3COPj{JI0b zme8+F{L2dcGNZo^vR^0hFIVDkICCy3PYJr78ik)l`i3XdTtbhdLxP4`q& zjf7vY{#OZ!_|ab`?H^O;4k3z#lFPGCHBG1%G69m_B>TUCxA+g?mj6B1 z$+vjRHuZ`{CPnRNB_lj0JpmksCZX-mI^D*v@5*r(}_Ch09@Wc?}hDEg-6SO!4y3 zOLyJCS6S^YPOHtRA0UG=VIzzDwC5F~b()J2lP&!KL~pL!49HmgUHxYs9C)9E*YdoE z(-L4==~?fWddd*ehKLaOcFde*4D&K}?-@CQ-~fYo_oFGHbKtrQruBNg6oWYwV}<=w z8;_3bQjnardI`4OR;#r`^XW)=#@)jt)58JO6k@o2b{L<6HLn{b+X)+Bn1!%fMt+Wpfu6(?^axfN5 zbFO6bhO~bqTP;_jFs3Ltsr!Pc(rK17Uc}TW&>Z%1PwsE!#q*22xS0)l##*;a7p@U< z$Sg%!3OB;^Ox2s;8I7q~NRAotx3Y^}k2$$L_AsobkwgBolBo%5mXFd!Uuqe2MH3Oe zI}Ry_r@ZD#Qrx)albr@MUk2=0^*dOYadjX}d-lG%c-V~dt{h;F5T=GMGk2kp^j9cd zUXB?7t+ zcc~~$x0bCl6N>@OZ*$AxKOn;aTH)VFpNrO5u48AdtZX+xhAB+@e|T?O91GWrd99zn z>df_0ZfysPhtD0tf1~gHe@6K4pC7UD3sv*~_B~QR{`q|Ws^Klxc{=-zA7bcNv?~AaEY((sV&p;g*7k`JMIGISm4K!nItJ@&WGyF~+FJuR}hy zG`mhKUGLQwCUpBgg;OG~h9oUO!_F<)L}UMcE|0$6=7|Bvjq_%*KoU+*jPi(~gX zWr8DkFSa^x*I#VEU6Y~QIzm~&XdsOk-IQvj*i4!y#Mf0s=U7^OvFD}PU-i;((zD*! 
zhyD?0&O4rkqFv`W0s-ZvwNj`owG2|}Di{U%*nP4503A}^460)g@KZVH#8lR00`v{j5qJ1BINF`G*dY4gHDQKdaZ~wPu8Uc$LF<&-9?%QqLx+ddpRn~^r4Yn|H zjXb~a{LV!sjC5?OoiL9@vZ1JP$|e|*5qz@O0o~B>7iyTkKQ0I zI%QBAhRt@OhGVWm9c&%(-hwx9;yijbg%2j8H;BgqtweFZ(;<*uf+`}75d=`cif?&; zd8C60MJRj2XWn=o^}T5f@H@?L79ZD*BN+yKI{PzdHTy4THUByn6~9hMa!kt%OIK=? zFg;Z-%grw*mAIv{rUd7QCO^<8&mVR}OcN1`L78NSw8ExoiQt2;j@>h1OO4? zx}ip=l(Qt~$)mNrd!KY2K-AL(P>HeZt%MncvFK`jw{Rk2L4AvN`tRgAcSQx^u{YN! zC9YLoAB~;|&JBl=84`CUz$Wtw(vAjIdhf?6K%;zG+x|jE;_+RlYsRIh(`kgZGH@Cp z;t;AAmT&*Ix^g>Kj>8-Hq!>@TfeL>+-sfVq_tz>*yFq!}HJ91#SKkKP#kAnJ zXD2(^jg|=lYdWW)a~MZk>`zc176pVEoL@w)G_2`Bi^JC@a{lt%MfKfnKjCxbMp8MN zuDhxb#5Bt?P^Q1rYzV~RUDf@^KBJJgmjqH=BNEKUL8kDfFtCKRkFCXs8pTBvS?N@a zR#=;~+vS$WP#?Ob#%@3T%=FV>L*(ww7lymZ#ZVqZ<-d+9xK!dzLtuIfv-M zfbdaE*cRYa*~xYp3Dngt!0D7+Nm%H+n&@Qf$a8W2{72~@&*Fy<_2t(7*%zPHnsTZ! z<`%~bHDQB=|4Lz5PmlT zSq0_VLQ-G3RI(n3Mv^ODgto=LgJbi=&Df)j@eY?eOkmi$6taR{rMntww0P^t3rlAA z1+vKPap1GwucKkj^1Tj{tmr(+mGDdu$)u8N!tf#2hJd6w^RO0sZn-FQk@-??LZuRD zp3hxJ(M&{ziDP3w1$V*=iQ(*FtBw$O7kaoJl##e?vpZpGlL5QBdz_yi)w^1Tm{VB5 z{sQY3Y@zm0w+|YPSvLRyaGN&~K?Iv)Fy?^z{bV2U>n(+0E`!b^u8F?Jlm>I`%Z4Tk zL?S)~D~xVIrsEA?h)ebY_;(I%ewCbl_*JP4S&ndWy1I6u7>7N<^p97_Ih&^~g zp0l?+N55=G%N*vvP;leR80^vvOT zTwFO1wM`%fKr*T>!Gudoy8~URyzrn18=Z)!yB=kBkb7f*`Wv*9#KiA)af6rp<(5CS zeggyQ>`j@@bSneGOC=?j-qEM(k(W2e2+3aE<0DL{ckZK&QyiA|MWYO5XScKCMZ_@r zKxD^Z?IhUP$YP(kbJggv=sGZ%JeK+bdy+w@VyZcPx18=vnMmH_y8G9iVEe0$0#A0k zZjt0U)PtUPg;XeoqJ$nG;AAMK{pMlEL=+4^1v;Thekz_98VvMqP1NZnXZh`=P+M8o z1{{7boe(Gi(6N_52y%VE@K)gz6#{%!2l7%~!xZeR<%)pAUb{@T?B1UtZ+OSL2+O~u zlZ2b+h%HY6)lXj}YP}(r9{5s;Zd5~+U?zOGecXK z>It}cJi=MItkhQ^3f%As18*y6esPInO6mpM*;7B^2lg+J_1Z?5HVpt{-fugZC9(i% zf^!^L*Jdrw`N&$lUr(5AJAy@=MWuLKdCzZRT^<^vp6_!FhNdc}ftf~1B++9!E5X6! zSIUdmqp1AU8o~j#BtzqC0Fbk$cdG`!f9S1|R90*5@>=4i$d&NCEyL`JAI<)reCVy? 
z$}--BJ-^U&v3+5(ae&>SXbv4 zo$D9&qDFM9sp809+kLHB^wnh#mJcYk0>0#Qb=5c2qfUcadk=WtaX!s+f%v@7m>a-}q}-gufg`T1Pz~Xo1R8wnFOQuM(X?=tb>*fZ*qL@@0@>ne zi-t_9u~-Lfa-CzP{`oMe#c*0yksh`prH*eXUoin;@JWDUVWU%N*Xw|!FABG= zOwk;70WZl;mYKG5zDQ;|V)quBt_kNQ-X=cA^8&x!#jNnJ?Ya3ov*TJX@M)8OKn z$!DfjZfC-xj-L{~&*R^{sKJMpmE1q8*mpif{~cGT6l7!(mUho2V@(RhjhhT_Z`Vtg zma}dj?_9Ofx)uuMYf#I-pZV(TYk%eR;fFe2Qx|@bvv(jSDCWc{vracpJuGLu?K;I$ zC>Qp`V^XbB*@tv%a~)WRNav$%@s4?s6p*q-TCIeez&rbwx=1p1fr+!yJ=&h~=a{eW zTw->&>k8{p#eaK9_At!5=*#sO&3a#%ab(?LKDVX=HuUJFkkPKNXURU@R{3VIP~qVh zU|>fRF=5-Bfvu$!_c6lR#4< z_caKR*ejmPMXAFS!RZpXw`DQw{2EKUAlGiLLfhTr^DejfFZ4(o!&V#uY$MpS0~l+k zby&1jwr}*xE1|KHm#?4j%qc{`A6QbDCStXyGRt9<^n5i^d3UqRjTbnFl8cZQrd``t zIQ&#qjv3H4J>UKV6ql2qLA7v`_cTzXxQuB|A-?jMs3>+n^8N6SE4e>rv*L<3YwT$D zFkd$>ez#E+bz&NNEp(P$2c1WWdLpQX;aK%bUTN8~G%9LdrX~Y^ijJ;&{MhB_$3FAd zBpq&sW?S8`5H)$hR-SCf{FX@kb@5qK7h;Qtjr4 zP+k3^TZ`MTqojm0#LQ$PrR)Ulm^z8XgiK^Vf*#erH10Dr0^V^AS1Q=}Ti+IX(lNz2 zD$)gqvMMMivDdyob$~e;rs_GV9_X%7?1~ZQ6&*t}GK}cz5fs6Oc&TlC%ufe6wk#j^ z150HjTxStY>x#<6B$7P+owhD#Y(8)=QptYTPDvm>t(RlPurw(?m#vy%efYhRwtFlW zr~r~%4j?0Nk$X`2|JM2FX^@7m%r2BE%<-tA8!fpV<6puM+qY2w?;&|bLz7E7VEPCmJBXC zOl(NL^*-nn9JYszO-rB5Yu+CKB>VRvBrNX_F6_U5rfrH<{HS=@dB$}?W?@7>%+lO3 zSs1OeCrjhf5s{BEXo>%m^WJ}Hy8wNc>yZ|thf^K}b;O%{ln#aQGUq_x<;>iK6mI-< zHoDa`q)785GC6+LXCS}XW9+`XNyb-N8wJ)>w@wB^$(VlJ7hm3P> zKw(Bqvjt#u{-%XMIPEu{{+fYs!z+w#4}Q#PWMH`BXa>g7cid&C@>*@Tpr{5c^?ty5 zTAGY~%=s<+muHn?)cdu~WwpIOMQ2NXt$uBbL_6}e z_kQN^8Qr}3?_z9^EwRRIa6d6ytHunpu0^9m{d=;?Ki#wVBy9JtM&Py%Z*;`xxlw&Q zL+Zoxsd)%-KdQaypZ*zITU|ykvrhO=!QJ^@<_B%;H5(vOs6+y|`ZhN~jz@=>;lwVq z3V_A8NirebynLXsWvkO>~5$5$eSe>96x@SEjviHZldEW#M)_9^j z1GH7BAX&RS;LuRg?wSuWsCfYEw)AFnS({?4pRrQ4v0H7h1DJxk`T!!o?n8v*DobvX zmP`1KxZ-l;^kN@)@HqCpMMzJe9wCB%yFB8akr6yJy^n zv3}`Q1j(duaKkqy2SNW)z;c@(8~gP5bCHuIhnb*)qFcc}(k<1uKU^*9ZMH270$NpF zi}VBz{z3kf&-yjs$)Hk8WS&uS7lev1^X(46_9~ZKQ>Tm*nOkKrxB$|ZL3D9jWA$u zFKtr;n|Fd8CHQpde%sA-Ga8;7UFNO7VHPcq4LIYQp;UWcbtN z{UwL_yti}_cg@X`dp^$16AQ>MK*W06kpMO$A-sA%{Op z1#|d~DOV>!_}Q-RMe?g`EMP*-*~#2Q(gBs)p%ZA4{23$A>U!WbgJIab(;Vf^kQ;UF5O@ z(sKzxTJL4Ky)IOHv$i5<4^2*%^nW%qt^i@AvBUwm0lkach_an?VM)QQf}fVlcc#gL zj#x5=d0587oS5rCRq%YCu;(6cd32E;Q*pSJnx2!UBEdw;ptISVOIVU5xsVlyEiA~u zWYzz@2~&jLi0M(I<9?9h68o$jU_Z=HXmz{uDlr8A9Y?|-eEMsKB#2RZ&qXDT3d-LG zX7|#oUIR3uXWltAC zPOWr!jA_pf5Cw3X&=d2EQQShS|n(SecK(ZSR+HsR4&s96Z3WEMl`Li$egS z|59H#4Z^vx3N~GV_9v7x7DC^{iNDKYYK&7BDK}870#=PtgG`2lhz4xpizH0zUw_EPkDX}F+fv_0euZU5DG89 z?cfp{^WjQ6W_($~dSc_$bl0J^6`4`D=1tkE!9zP-Z%1?uUe@0ALR)lm+E3)u^voKE zM1Ytc-314t9;^almlIBeko6Iu!g-)jw1S_{#Sm%eAW!{XB^kp@-^PCEby|;Z+sFkgifV$=c=x}hNBR~kuKtyTg zD*Xekijlobkn27cm`q3kY(zlM-n$gS;UNI1UH}725Yzll!JSY85_kzrjKTXULQo4x z&%xRExo3l+;w6Z~vq+?g{5v`Hf1ex*-40X*%wfhZnL=qNR|&YY%r!Der{}o(k+Zz~ z^g=##jCrbJeA(wm;oJ%;t6>)#05i{{dWVJ_hqhQ>IDvPq|6<*Exo6{O{bNk3qIt*Z zWrM4sZ^1Qw)JmFdT+d2HTEwwWE2KK_UeCUd1jes~S#0Pkv5;8Ch$nJye!i0j(N7R* zrnFm}Fl4NmNX*DM1$V)VqjKrK_=eB?9Mgiw#X~o-iCbkrNbuov0Qv#|i2K5v7kN!ns$Vzl(vV!>+{q z<`$Tqau2mMiE69t$r?oKv?=v#-O28$O0yQ&J8|nuzU~hXwOa??^>n>=Xgn!f`$yRT z*(k%$Q7*0~yj)yoBa@W74U18%ljcwmWhZixA+F_?6ZZh3fA*vF<-#Umv!Fn3YQ?n8 z1sr66aWLqo>Xtx{Of7(%&*rnYGXMty*ztrA_nvNnPJxBsOxn<1QyzewrwKD4O>-bV zlj|O#Ki7=)`~mBkVd#$-V8Q$U6gkDTyn=e0%E=U~0;jV92KD@j#A1Y14dhj)0?#JHvZ#*tk^L3g@=ErM}D_aZDUFP6W0PyG3#2Jwero5-QE0DYf;6|zw?N+-Z?!R3;5P40wtOgLbNw$RUYzSItx;|+|7`5eoib9ru6b) zVg@10z5e*V_bi1!%w$}Zi1a$(gWM+f*1-!nAG|)e5t$-se|f%3&MyFkjK-UWDRX~$ zZaTv4sgp?4cML;q>?DzWXptTWmBwZS%JHtf=~M={z&BtkRQ5P4wQT`4w-e0kUdNcm zY^1P|sWGgpjAMba4yg6!f^WpHp&PqL$nz$&s1Yc9m5B6tvW|{2YO~UPDJy%-M=$dH zs|B-X7PKj$JAb6-`c}?$jv4FK&eya*HV2hG`oIq}|Keq;eH7~zX&D#=lm=1Hw_6MJ 
zPbNv|y4DDPK}mmmJ_K955_{66gK5kC6^N6%Fm48 zeq!)ClnI&|fgW8*)+6UkfEg;PUq*E@7CI}7WKr~Vz_i6QAD!R?K;CXkzS5fGB{#i~9NaFw_N2VLzTCcB zT9sBX;PwbWqkmMvOY{XXNv1z{1-vKhQ9KjpthM~(u`LSN799eQlK7@lo`D-B2+sH= z1b_4!%Jlc&tndCPFAIIB{MC`&ceiZ2^LD53w85jjU`v~0EL14%-B}rJ0ytKFOe>!Z zXDN3v)3iT}TEnH%3*1=a!}ha{R+SEvoq)faf)`^eoxxVsd`r z`%u<1;J*tM?3j)<36w*uMI@qVhM-%z-y`TcN+0ZewXJIzP*YG(WA4&P=mu(cu%@uj z6+b5dB=g1x#xLl$T%V9fK|799NnoI<&~uN>irrPE`(9%=56sX7ik4rhJ&io|;V|!y zUvst(@0-i(e;AmpuKKnRkpOg3Pf)p88frUe`yIpesadS&=Yj%qH|B8n9D-`Zk_^Bw z3YUWa^3-DyMfp+Je-stIwsQIKdR<${eB&zFYGV+(jp*ZBD{4p2`JocKTSt}@J1?Zl zhD4jckB_^`hz(`EL=EFH1-m_Mq_D*yOdV&STCQHHi5kM|7-kRP+YVWA@5BM_oiGk% zakCFyk}j+7+cPAq8;xq~>3V^o8T{pW(+q>bjLlG)Qa_}2M}H0o=xV%RArsa(JFxZb zoK(6K?fU1(d%<*(pu5~EA?PT${}0&uZj6m3+qd`y8IXls=pyQCZVFK7&_u9LwPHnQ zqg!W{MT|N~eURg|=Ny6=pOBnbqGBdfn$gKRAE3!t@lU2O`urKmtz#cAXvhEFfB#duD{Jh@ zBU8Jr-h|_TD!9SYtij1^n7geno&zaOVt0Np=`vr3Wepl3rB{FzYQ>bdU_m04{{l0o zXk9|25Pu+3di+gTcK65_8~3r?<(S0MT5wR_-n+!xP@gTgN5mklVUnCe?`Q<^=0maw*-6R%-F;j0d z-W)$Sz5*P)ugU=?fnG5NVNK|QyP0~JiUPFFo_jMn7N0u43ajPQoS@r_eHx|HU14j) zAmH#T&DsLD8gv_c$7HWUZA9-=U|G!i3UiDlM~q?620Svi0 zO*Q>bLe{qwBOiXah4F%2FAG^^k+h2Mv_wo<6w z^5CLHyWE2j`GdKy!A8}R%k#G6PUNPsTmlJVcw3Vg}rOXx3^hN!2%L1xbwF97Qn z06n_;6=2+@e}KZi%MGM*xS(X_Ii?L*HuWy+yFyU~8N8jhu}6>$$zsw->uL}H-e4u$ z4>Sk*Nk-?eVd#*+QgEpv1+;MB!!hj4#Kk;?sKRS|@byHDW6GKOBy*AKIyLZIZ}DFI z*L@leE@lZ}+XYn%sQ?^Z5l`?Hz1nd`WlwFy-k+!x?0T!sg#goZN_6@MtJD5?ay&ja zntBd7L^DmD@x4cBa5qinyoH>E;PqzjHqo|f%f*sx5jj};)GnZ6M zj9=^v1&s=dTm^D{uV0K0u)s+ke^#eLAD@g6F+ zUb@9H3wjTRA{HPeP}{jaW}Sn{-6|{U+fo45_hbM>sV{fG=Kz2B-qJb1NT8xo1F@4R z&So$KbJgoEsO<1$C4ScWW7d9bQ~Q*|i`awy-@XR2W1mRxnxgOEe3m4C_=d5bVnw&% zIS5bVrhlOmWBP7^vMIarH%)7VQ+XDBv3)1AFzX)tt>SK;hp0efji1=(k-@WP^V_Ij z3tyZYsJ^iO_`UTDTO!j7+VA8afa45+4NU8q%jPjUFDLOUIXczC5T01I0|)zU`|3>K zd%kdeAnwh(q4Si({C5V9h_r{;c7wn_MlmW0Lk`AtZ-_9lWAr1OZ^r;xal~Q@MEje0 zNG9K$f{;^i1mpkH?<+V@l=3T9 zBSpEB_Y4d9osw&to9zO&xuKVtF6-+epT}1y4-;n5vF1CU%zi10tmQeYGFq8U09E`w zXp zHANXKBy{CAXq%7s&2H^@zgO`+{d$j%54qCamOH3}L2uToR&&btR%Db!9A`(6Cd1gS zjAltDS!|BF*7=YOcVt8c*fMgJnG}&ZH&CNJ0e%Iosq@sO5oQ3Y8+CYR$cj}la=v|u zi76Wvy#-Y`&#`|JpjiC7$vy{O$y)rYd?;t>Mh0*d!rXx3y6z3hE!`)v16;fTINdFe zpX{8=g&V+KRU~@kkvGL!Xl6GX6VgXGp^#HLyci^`%R5e*Yzj zRL?!M9TJv5Oi%ZR($vs>o+~{i0GIOfFfr2O2F?1f7(9w*G>1|i{uAW4-Jr5!7yf=^ z*BB}zl(U^6&gi7M*HDg>izVz{QW|Xj5No6Db)w)@-r{S^i%p?(&yP%n_9KjRoUi}M zas!0!?p92(IUsaXo(cR|SuAWM0<~>WYqomL*my<*vY&F_7`bJE08%Xtw}tJ{VtX{Y^4yfu-3-~fIRSgk>LU?dnIClEn7EZ&CrGFjQ zyxso(rU$IA6clm>@TQ`xIPP@NhHTqM7aWD0Q zj&pHU2fA+o>BF#wQZ9HV;=pP2ywYNasp<-Bk8%(r#WNhh609a{cb3ZXF{xIyje_>y z*AH{KniOMJFSK*>0#5`&J#^LiRMv#Vx|H+-0Mq@)49@7{4pZel3TDZvMeNr>`opaFrj6=|0sug{OTO@8QZHH--*9Y|%% zGu)DAZ_cTrUn<6dVGJwjR}XGSIR4&xFegp;#?B*RANpIh_ZI))&ThTkW=bq_=|3E% z__nsaeko+jN&U?VxQB%gLwVTA9bjCfeNWN5rA^kYlI_<(9BzCBt%|I0Ks&o^>F%D- zlC`liJ)&E)AMZrHmq&CF?Wc1(KbbAKmn*6vgZc>bf-D`U4VJ`*`{`qF;NsU?9Q74` z-<)jLX}EVUXe=%Cvprkae>Q zPx11&?Hd7^OwtbmCg-}nl$mp*ogd#VIc`qzYT#62%$e7F+|GpTc$%yE^oiq^CI3+e zYVH!L8WISwN%HYGFT(4zD_NIWTl>zBcv@snIQ|^U$lLEVb@KIO_B22ZjO9X$`q8&& zL%4vX^cT0RbD2>mf>1OOhF~pvT{VZ8?p&PKO7?eGT$X-w*6CKsFO7{XXXgE&)cZaO zhQ%M&QI;M!hE(uJ7TT7^*Wy=L$ZW(XmP_GSvlVK3AR`WC@2}`9jfl&3Y96R4mp{rB znmhw7#jT7qD z$2621sV^s=+e;STW;Cn!lQ8RMkcTs9&-R$EoXFBU$T@HsD4JsBcBETd56phGPGAZ; zCWekGbBp&~(kqzJb?R77fGi_*6tSslMBm1ktn>MfUU*}T_@t6O`PW$RiAU1$fSLI` z#-eC@5{-Ga;=0wRs8ttF*t6@ql2xgW+=1PFQ?oYw8#Q1n1QfLg47i2%Et{9{z6&7W z(>FeJ_JV!j3cPh-jQqV@1~r$GwG|qR-fK()PKnqdQF8e-A>0+^X4u;>CFYgw>xAE! zvpYxPu+TXP-Sk#Y%;LHHddC;^$OgJPR)zj1BK{ z8<%Gj|Gzj-{oJG*3%yaHBq04ckbknBZO=ft;0qX2mqU=04k~)%w|TkR5_%*$Y|U`qa3>@GS{H$? 
zn4tY(T>0RGTMu1x^N;L#6B`gYRMiv#b2G|+P1svgZ`2L$4-5P8(;8~HgUenx{s5;SvdgDjV(A9W6@^)C#G+Gj1YlUWEfV%r@P1EbUWqLUsK!u zbe$b(Eef9#SX#MFtB%i`eSLoZWN4Td>fUr>=&*cSKP}~b%uGj5ooAb`#yW}c zT*OP8f-5+Ar&A||E5#j<8O20rsh)Y&`lI3dQQS?JjXNDOBGvF@9lTaf5N0MF;7FJ8 zzBF_5&L!LCzs!Oz7XyxV@%%)p1652VF90I*#X=nZ!)QT><%wL@s32Uq;QSL&kOCuxo8(mcq5PiNk-)_V58y8HZd#Q!FtBqZyN7Ico$| za+5v<+TE_%0QO@a3a)VO29jsJLY-BAg&h}d__<&AkyGc2ZU)#adHC5a><4YcT+MSn zV|u#_T?;EN-Gp-KAyR<5GfGUEWe+kwv$j$K3>dkiNKwPl9*=Nk%O8Tf_jrXrK6{ey z!inD^Z@x>auMxE52ts-*3UP-FeoG@G3eZ%1Fqn=YZu7{FoMQ%O=3LcWF8}SelX>Rq z5s1>jl&mJ^gTCeV2d=xTR3Am6%4C`_4?Y;CMUMaa$=T~|sPL@_t&;R2+GK-D>wXsC zGA31y+vzCCY@8CxFNzF6J_)_Zc9~h^g{v_If}vjeX--9zM~VYPnc)q=B#!&jYr`!0 zP!1({hil?hnK6sRH+Rf<%xz_m) zneF&p{YN!}Ed#pltePpUO+AO~kwI9%p=B7W>Ml%VY2~JN@UTpF&@!C<8#+mG_?%nzc(b>z!I4 zUu;3FPNpP~IR>9Q3}gWRgofbtldoC&E%+bef>ujc2`*=ZjR9Os=qEs$T}RdmFe|0^?6*p)h6@4Q9dF--~QooluZrObnozI3y(+vrUDSRnSE~_ zrUKM+P7bFhTpili6`hzA2yAo(XbGhiGBtssP<=&kXyNCsc_JZ3Zqt^!{u;m0wF0-F z>9EdM*WDJjMNi;ehRcM9i3OI_8@ZQ#1N;;C`pmx({r=qf%k!Y04Q^iq)GAd&_Ub2~Ck3Q~ zQaSlrO&Ft?2@iA{SAaK#9_!W&dhmvfORGce%^RS{e#nqz9|HZv7^nFL^3Q-II%tNn zGK6V{s&U|?#f?R|Ng~ievk!U1{Giozif|L?@FA}QCin^#z%#Y&A`hR0zVE!M z_WQgIZ*|$!h`ve8nvoiZFLvFd_&I9raYi5yzta6PtOC@*F3e!UF|m*={q-{wY*WGt zQ1R5i6}5O-97(+Z#}IBOe*freHvca1r*5WtgWMKMF_ZPGoik9p%AbySmMsWt$^weQ z;3g=+r6jS$Ar1{W*XLH7*zEHH5=w8~$9-RLU-TKKd6R(o$0UK46PSfGWJ6Z#tq~2lmyw7XQs)zsc4`-x{|@OU5srLjB2`qz%1R9x4QM36Br(74r%HENRf+ zbBwtx=ka*ONBSD}uJ3Wp+5^ltaACCavs>=ldNi9paW0`6fMDlru!F7I)p`pnx3A+X zJ_;5%4klWTm5rO+X0Ja&bw&~$r;vk(iK|Dt3tMe&l22#1*&F}V`Z_G%f9n!lDv9Gv zh;MW#txDP!-mp_Nd0tTg@EB^uo6ssbn_*ko7xV)hXXG`orZ@v-iT?JZz?Ig z^fw6y6Q>`W*{e1FAa>rEf#(ibxA|yU8eg#N~2OA?E%D~>W^)+IDC-`A6uen9`wKc|gJB(~WQ zNc9D$VnkDyEDx+kp;$6Naxt#q`(K_$M&3u+RQI-Mk~y)?d2IR;uw_tOpj|^P;+F_# z6TgnHs$owCdZoH|kiu7!x}-w>SrI$}z8lbU5Q;Vda|B;AepF9~I%#btA)D)6@a{j} z1yHq962L|Oc_8>2@IiN5R}h5jYZ!)s2W|npasfE51&UkBzIWH5VQJu)iT(%>evuIT z5YjegW{M=@O{v1`RjDvRs>fdTFxt2#@dKH2sTWeMP~yiTvB zj;RH!Q_Ngc3z}Ek>OU7}3ofnzI^yiGpmmU=1=^Vz#5bPbSlJ?hf^7F&+58!e({kTm z`NULyV= zYsK|lOlw|Mxh7p2_On=*kQ0@Gw0 z;8gR~MG!elJuy|b7d!^BI{&-Jbw>^lksGHm6fB}S4$%l!a}-Erf~@!Z z@cRu6Msf22ur2s)zwNMI7b?m^YJ8Tt>FMuDU5 zqa+mGszUp*X>A7{1dg$5{{qMD|8C&;|D8>m%J*9XnXmuSxbEmMF~|n?Qy9c_ejw`t ziIwCA2XT!O`p)11A7UL4i$j||713ADeXs-i!S>Tt3lRdC&3*VaWiY&Y%55T)9QCu< zj$_cK%=!rgdi~7!TaR>=a=j7^R@Gw-x6etfUwHzRQbxDkjHXtib1 zi`t;9mxpJ+blSHmrv7iO{w(HywECwB4_vqW)9T-XY)I4;Zt2hd!WIqz?v(z9S(vpI zNtUJ!StzmM4Fi1%wM_k|4Yy=UVO&|2+gL}f_XDLkG<`yhOapS*f!wXB0}WpPYN;kEFucK+kXcTmR5L(^Z(LhCKp9kC~~h-KmtutyGk5-03a9@f>mP4RDzlRiE#0!A?Cj0V%szx0xs`X z4A8|cqOy-AU3xyHKbYON)vk|w6a}Q5I-vpT= z)U>RWSTy9aEOSRHCq!BBMZwj|Rf)@}auV}y;qA87Ikz=|)~{>36gJi}SJ8?~N+*}} z@-5mzdjTugLV#PL|Nr^pKk~NbY;9T&s*eY#%|?bg_3Vl95EXGFItY>1btn^|jsJPW z<<_lHKwB;&u{c;Z%PUIZgOEmRmHx1WAv3H917rXvLYtziL?VGo=T(j0ChW|PQ#qgP z3%jj9HS&T-@&vQnh+{J(ik~A`MNSC zWHoy%<|gFWJOWgGOQq8xUKjmVBNzu?$e@n6dO{QZC}2O6Z7v@L#Cnf``RRnjB3y`P zA=f)B7_pZ2CnzhT=m|to$+QTCnt(-_l$X3?HfCPK4i1`LL zyL%gRqqJQKAxcFv;430{*D`mG7QqiCYTNAMdh7|X z1^*Hu3O=*{za{10IuA;1dh{<>Q-tuvL*?dxk|3Th#JlfI7XtTu|XBcsK&HZVh&+&G`~7o}W-%R#&0_N_LC>SF&rk zXEhs?5C~Zn@sxp%Lz2kmQ6$&bf-8Sl_jYUFWH#iE);YZ=~q> zxbn|M`Jl9;7sTt}IV2}eDN6a-UG@IKJeKOznh-!S6vY1@sy40Km zk?QhOS;@*P{XjlB|4LlYxi?Jtcg#1TOS96EzDk0Htp$(UtrMvGQd4h1=_TRhFL&?_ zgw2TKmHz3YB_%yD+9m~+~0>lTES?F7gwfCK2O|{#$ zAS#L?pVCCZs3?du6%naX=_*8!q9Eo|M9`p!2!^^81(ae`6a<6-f^-7XYe11GMLMBK zF9AYHAV5f2-Z$Ug_n!0fo@bx?J@@R#ANj?~3VHLccg{KH7-Jr{9%jXF5Pw;|63$hX z<(MPm&C9Fb40$bzg27Re9E|)K4E9r1if^o#_0jZ8d!mmbn7hCOr9D>+Ee#^svsP?P 
zFU1&%^vsW}Z=R*$P|epPGj<@>q=#^Xp?aLNN^Mg`v`cn?|quz;Va|R67K2!XYeUGCz?IT8mlEC8W$_8o9b|lsa83=9po=ol6rIEYZt;*C$YIxt*NFC2g0Q(dL?S< zxj)FYT2D(gEK>wp{pMKiEb_Tq!*g<5rB;7xu#E*4cjn$ME84V7Ov*mu;V*L7WYA{w zu4u&;NU8)(5{=yWOPjk&mcUCb~UeRmmKRToMm1-Upi1WJ}&_h|Ly zK6b!&0MPJnCgbM%s(@$kUsw72$>%2IZQ#W;{fmSro>4Qh=q%`wI+*jOGhYI(QUKG8 z0mcUs7Z?qERIaLx`;#Xx4oZWNj8C^ph>3n;5D)7Yn_;K?{ZS`rdkW*x3dsfcW- zkd}NBAnB*W^j04rJ?7{-Cw&jt?tCJD6Q@?{puS^=JE^;(?@-2_s!Q#~fS7H$)0UsE zKT;n=cdKFHzVL8L_le{u(RYqLfAkI5AZ(gv~9_327?ZBe!X>*lfTgy`;nlrW6 zS=(V_1f)G4nVUVZx_DlMglBq{;6Su;>p(7a4viRdwCw><~I zd?j8hg&A$jeO$c>6FdK5RM?1m$v47$5|d%&V2~dY)0lT*-?s}WsRLtGC^z=HfFCr+ zruTHV}PVF=9p0T(3mZ9;2^xRM~VAaLV z(kHNcy7B>6>0l{<-1XmPP47BEUvY@aZbLGKek^x{y zoyze!lwNlFzr%RrU%1Bx;fv=w)T^|{OuFyvC+1h^OShVbDOt*T+8S-Rv}ZVU@f@y~ zG<@)fUkjg07JmM+Ap(As3`RnwH|M;2fAQzq?X%_!Tv7 ztO~LlE4L3n?=UD2KRHQAOV6^VfGGj;y`=m_tD!u~Y>VnoTK3Cdd==?e@mwLt%NGM` z@da=29wwDx-2Q1C{ItHhAvuexj*JQGHO{9?6KSZSUOGIhl>*HJPJ8ICDL+BBn4dJ$ zb4*#4v_~e|3T|ybc!bKI@U(n<3ewIc~$p6@{IuRW&ov( zGEwe2k;LcfdF@B93-mi4pukw$DJSt9t@e=5-{RGkTUz+RWdlsZp*o)hE~f|(yTrA? z=GTC?;)tWdeq6ai1ltwE*dI!!7D4au!0Va9f@)7zZ&FpwVFnTBCR?Go)GIi6GnsZ! zZzU<0*gYdMeOGbetb+f2!jd4EKQQtRJQqWJj_if3@9^>{G@CEfUBnGUI%i$Byylc{ z()O~Zh`KB&JO3E%VsVaYjf}zdkn(_U#Sa%Ab|C(v86>-#uM9S3TycpyHhBJ_shYh( zyQYAYv*K&_8BB6=?4MS}dHJ7qyUvY0@-cc0y%fJ2s_;K5FxhNN8$oZyT!wqvW*1gH zm*&Qzk*n>ogW#)qzej9nF(qFTFQb9Sc2l z-_89%)L+2^1a5w2))=x!er)o?sEnkoDWQzDX{zQNRhD?VG*4bltcGgj<(_t8|Vj7U+a;Su53A-H3do$ijEzNPIyR zN6vc+Sx)WhlPm9>3j}>Acy8e^1_KgA@%WgM`+NXVgoU5#&HH54?H#L;#7t&Eyv!^!@cv3rR&n|Ym@!A5RnYq z8F3Ov6lr&b8%gS3gotV7kV0SHJaRa85Q%0k(*t7U+4S@HZ|bi3oy2!7GfT4pu5IWf z0KC^~R3FI=XdYGGh+3T}M@`)A!XY`LS!0FhnPp)iw^fx~K{)7c(K)GeYeBQX&PIj! z0nvLOA-m#B+x6f_j3_j8qhy)T@Er*4@P%UN`?@M_t8Sn#lC+o5gL)Dm(aG6P%laI! zz5DIO3+>1IF_<8uZONBf1^GkPJL`})I@;|BwRos?udwg#6eAO8s69$~5s2dfzI6~Y ztFBMJ{?3jT!>)M-)BaS+tJj00CU5d9$~QbS5;)1{+*iFFvAueq>Hu9YT`#)e7a!e| zrOUn##I^Z!`O(RkTqfd+tzA8SW0;$3hkk*}+q)_<(UX7P7s%jX#zc9PPb#@47B`#@ zMcTGqbKYRC;j&0Iy7Ge*EAkw_Lvo%;utD(Tr_Ay2^2O9Ss(gl=B$0RZUNgQG^=xi0ZH3Yt)i-Nf#^bZO62hs&Pvm7(# zOCRtLAy>eiK+}pm2h35kXZjXu&SBFOLiRp^oP_v4NZ9g9oz9zZWPf4 z4v4tNa9R()^b#vnJ`z{k`w`RV$hT}9`wnLZGlY&erlk_T z%xS7`F02Skd#^)3wf&&U=l2bb@81HxHJ%<&J5=}p4;dAPaj8?t6&v-1KQXnIrUlJ) z7bD|tA*3zPt{}x+|BH{Cbc4OZx`AG}3f3$!BgX_9PC)EQWv|*Pp%)w)AjVE%Pk*3n zlvQ=qBYn2Aoe~3-Z$t#HsgDvewudZv9%4izaxk^vW_`@WA^Os2gbea0TOa5YqoTH@@#;jerYU@ee~PAA*Lw&U`i+W}6zrjaRlJXlBTr8XP}n_h;m)vJ*7fh`s?g z;tQ-7X6GcI%U06YVetKvt&gJWe;VBJ`?>sT_3o1K$D>0!?qf%-Cd}$pFWwfrkrHRz zecZwj8a8zc0z3zDv=sY9mb3h%446*CIYyXn9M$-O6Y{gBRSs^WrxpxM zdwiEB>@sOP>JoRK^X;@Wn;c{xP=_v1uik<+{3u%)y-H$;K4#fwJgNWuUh{@K`AgZ^ zDbBiNViFbIyL-i*#{$QkAaNM4fibr;+nMO`P%(wW_=HR=PEK(;mrfrvTqTkmT$Q!IS2Pq&Xk^(t!oJS+FZb28t%+T zo~=7Uv(IhBb{}M106s|S*X)l`!f+~+Z)lad|7q655){7Y4=`Kl^MV*t?W{}p>ZJ}! z@uP;b^#?t*HU8Kv;C1Yj`Wo^U_}xaC9V)_uWr;N+*K|AbIC@QjRCJM;t(mmdbe2Iw z>dr=+0KGn_R^9Ibll?IKL65&*sa)jRY;J4^ygBCHJ-6z8nLf`LpN2BZ5Di4zLpgh{ z*_I)JC+?CQxFB`Jiqv1}?yEtFc9w>I82|OIGTN)7D+@a?kq}v<7`Ill%-UzDU*84B zOMxKV8hIJ7ho-VFqC63E3XAs?0$^WN#*loiN28x`E9{MBS z3n^}+#nokq*WYE+Iay7fXr{bmZDUSL3N>~hY|rD=k{H3^Yc_y_oyFJ_+IKQDY)(zX zNF-?uoe9;dC2IPMl4cX0yQgjB49Fu`!rnj9z!TZeOxtFT8};8cyDh=H^bwvtsQ|?X zG+NkYf6koI|DvDIe3b;(;$;fa5-w!hm2&?{{D$R%pSo3}@p;*q4#nU8c&}p4_nkrL z_c@fEHR@#fb3j<3-GVp7gmRqM96`nr`;M{M#$zqGdA0cR0S0wF3k50sL*~vQ<2<_*tat8Q7&JK$N%hAlgFPsc$PmN^;L{SF~K? 
zRdS+yff;5mVofD&IVZ}=BqS(?S!5Vc4H!?g>N>lCk9&mb2xI9^awOfjO=MzJ2XV)>cp3zc|tp;+@iXe|lm6*Q^ zqV|N$W1g~pG;OY9jJR_OLdfT}s#CcqHa(ys2l@ui3Elm($-MK=V;b2dN zEA+`K31nHOgX-dIFghq7OnkjYZz1X0*CmM2QdhE4tBT&cqt*!;DF_7%2Pv6^O3&gWEJT1EN9FL3uHTb(wnAWOk$TuI2rZ`ki;v7iUcw;; zJLV#dN@gJ4+?R8R=!)yJQ7s$0g=2RYgPI}f4Cc!O zO+ErtZRn9XBjjq19HJR>0=Vy_fjje_1zSaAtN%_e#LiS@;-s50rpMt0mY~M^rdGf2Q?HqelNeZ}aO9touJ1)U3VmA=gtu3L(mz*XnUG zTQ0=2uGl7m-F9xbo-*x>&!qwU4*-=5aT*DbVDQt7k8P9Wtb->a^_U2n1@z)cZge7o z?FmYW@mMU_#u0uVG7oSfc(y0NoyRkoJ3^;esj3slR+i@d*?>hU%^U*;mu@1^urg4s z9&G=`_l2Z)5&$W?zYHMe6 zfW8#}LQk2wG<*kLH+ZDzQ(<{tXKb@Xy{2q<{k|9H^P)`6XUnEe)t@9#SVX`H0G9WQ zrhcZs_^ts!z1>9-Clm7Y*5IKb5gT(GSMGKob*j_l1MF&naa{kFG1~||L9D1T9T8}l zUtZlAl!CwgtfBgHu<1$uXu(68gdYG?3JeTrHETCh(2IgbC#yj|82B?$ZN?QmHHa8yj=hf5(^z4bh$=OJBloC ztX4~|2*^UMS+vZh{$y`N^%tzXhAIgC^6j7dS@pNy;7cUu0Aly>t9x1p>Jyx|Yfa^? z^}CVN@_qO&klY^@!KhF=+XMQ5&~zN{TmRHwNl9?Agiubr27&!53M*p*R0^3Pxq5lA z{#7YaBT2!kN)_LvaSEcEHPnQ+%;i1t>F40gZd;ZtVqB5&1xVf;(|ef2`ZJ5&mv-21 zx&G0>X?=E1>q}isw3LxhY(*^rkQH@qY8pFX^E+qus?Bro z7@t?PL-ar^vC#`m)BV7%=0ZLoP&=zn`CpG^n@q0UpY@9~^p6yrm0}y4!3D+zPq^$U z5je2f{sM0nn9F8jH*}*G=8hq_8&9Or!DA95($kVpoAA$K^Z)cl7V5r%mWIr$P~Qar z&QR3BD;y@6W2(nySnFj6C#UcnuTF9sq9A0}fi5yU^wmL)1n=!;R8*V;)<~~++r+21 zX-R}T`k9=(zL}EhF?!VM=Ycm73Kp+ldm+z)@U=G9!HZuy3!~hU{s2*IcZ<{#!|bdM zRCQ-BT*PT4FKcGqit6ir>-%BwE?bv#?6h1*aPQ$auA&L|u!OKV2)MiyELZDvb=?J# zNLQ5)JcqJ&jC&{3JIqWb>MZWxCXK|4vxhKjA-LI#x#Xv<+roW7EJv``hdIa)X5HE9 zm&Y1t=Y?@=(;g31;Se=jOtS*d;pEHAePlwPBI`sU_Z`uhs^hJ+q_JJ1Cz^5?hKcpR0h?b`-^Qryv$CEm5ZHFIjODvGYZ({ z?ET9wM~LH325J|eM_+CcqwXs7fx|FPzP|1{ij!9=!^B>z?R8pQ7u;ub>bfpi}72}x`Ou9Ww+h_7ZB6dc*AxCZ*KNp zL_^_Jj`(He7?sV8qr1QO$SxP6Yk<<^)Sy^vTxzu*aP=B<5EoDW+GTIX9cS2VHSh zpn&NCT|VhWKoR9lb?+MD00Q9rvv_4u`gizdae6wooZwBYOg$?T-r2@@!!EuDdhbF| zs@=JBmw?66=P<^F!(YWCv*arj43%F#m<9%mVmw&o_v_z(4b)#r-|^fEjKOc3B~1ga zkFqSK)FEwYXO7+B5laiE!_(0Oih{c(V0+-dWVxA=xqyp7tL6&V| zo5Q`PEYhu=jXd7#N4E_#bYLmyGO?%WH1||?S*32y>D4&^+ExHNwpNd;U51`tv@=Ug z-!<|J<_}x;U#W$w|EcN_XB@POM&3|mnxQdzoG1OUId~OHU%^>=(q0T(jiM5oD z%m?Ja4r7=KJyUO!o{t{K3(xZvNLgbWkmMipLuE4cC;YZw3GSSS@2xcs;c zcrwYtVna-QbsWnX2#XN3EEX3*G2mYHe_PkSdCv12J)=XkeF8B(G--?GwebzMXl=> zSuQajG8Ew4o(#+MS$Nn-&U1*Ag&i06qKS{fDjt2mUcP<6mgy^8bGCh2Lu+_InL} zKL@|}!0$cq|Lq{$W`HDP=a_$F*Rw=mT7vy(7Y6=3+kzDQ)zPt ze`;Dx4$(eV{{gi|BtgTUKe;HfizMQCB6}l^7Urd=oZ1XRO4?kH0kgcH(=!W8O95Rfh5>R{u;6 z{3kn=KPGdk{6yKhKB|`l>*=BXE|vYJRwvir<*(%%eueiTq1IvdBbNjAbVUEd4<3`1 z92+VA!dpjn1P~~?ri#xoil0HVMNT@piN-MDd=Wb35Hi)(!dOiv)MC2)0eU9qIx6`P zd?qI|bL$~K-;3!f|Iw1Q@89P4Z}WQ({N4k<_rUKx@OuyZ O-UI(edqCh<@Bac(b_Q4g literal 0 HcmV?d00001 From 2b6a92851505d92f355d05672851cbac85b93fc5 Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 00:23:27 +0800 Subject: [PATCH 005/248] Add License --- LICENSE | 201 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 201 insertions(+) diff --git a/LICENSE b/LICENSE index e69de29b..f49a4e16 100755 --- a/LICENSE +++ b/LICENSE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. 
For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. 
This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. 
\ No newline at end of file From 57014ef152cf1cf014acaa0a5eb0a65780bf285b Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 00:26:54 +0800 Subject: [PATCH 006/248] Update README.md --- README.md | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/README.md b/README.md index 1fc0d272..3a1a40a0 100755 --- a/README.md +++ b/README.md @@ -1,4 +1,4 @@ -# :bar_chart: VBench: Comprehensive Benchmark Suite for Video Generative Models +# :bar_chart: VBench [![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/) @@ -7,17 +7,16 @@ This repository contains the implementation of the following paper: > **VBench: Comprehensive Benchmark Suite for Video Generative Models**
      -> [Ziqi Huang](https://ziqihuangg.github.io/), Yinan He(https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      +> [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      [[Paper]()] | [[Project Page](https://vchitect.github.io/VBench-project/)] | [[Video](https://www.youtube.com/watch?v=7IhCC8Qqn8Y)] | - [[Huggingface Demo](https://vbench-t2v-leaderboard.hf.space/)] | ## Overview -![overall_structure](./assets/fig_teaser_new.jpg) +![overall_structure](./asset/fig_teaser_new.jpg) We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench provides valuable insights. ## Updates @@ -89,4 +88,4 @@ The complete list of dimensions to support: This codebase is maintained by [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), and [Nattapol Chanpaisit](https://nattapolchan.github.io/me). This project wouldn't be possible without the following open-sourced repositories: -[AMT](https://github.com/MCG-NKU/AMT/), [UMT](https://github.com/OpenGVLab/unmasked_teacher), [RAM](https://github.com/xinyu1205/recognize-anything), [CLIP](https://github.com/openai/CLIP), [RAFT](https://github.com/princeton-vl/RAFT), [GRiT](https://github.com/JialianW/GRiT), [MUSIQ](https://github.com/chaofengc/IQA-PyTorch/), [ViCLIP](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid), [LAION Aesthetic Predictor](https://github.com/LAION-AI/aesthetic-predictor) +[AMT](https://github.com/MCG-NKU/AMT/), [UMT](https://github.com/OpenGVLab/unmasked_teacher), [RAM](https://github.com/xinyu1205/recognize-anything), [CLIP](https://github.com/openai/CLIP), [RAFT](https://github.com/princeton-vl/RAFT), [GRiT](https://github.com/JialianW/GRiT), [IQA-PyTorch](https://github.com/chaofengc/IQA-PyTorch/), [ViCLIP](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid), and [LAION Aesthetic Predictor](https://github.com/LAION-AI/aesthetic-predictor). 
From d3b7e51facce0aa77c996d909655a6a0d22394af Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 03:14:04 +0800 Subject: [PATCH 007/248] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index 3a1a40a0..e00dbac2 100755 --- a/README.md +++ b/README.md @@ -1,6 +1,7 @@ # :bar_chart: VBench +[![Paper](https://img.shields.io/badge/arXiv-Paper-b31b1b.svg)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) [![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) From d5c7aaf7663d8525f7d5c205a0d26c14fca7919e Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 03:20:17 +0800 Subject: [PATCH 008/248] Update README.md --- README.md | 6 +----- 1 file changed, 1 insertion(+), 5 deletions(-) diff --git a/README.md b/README.md index e00dbac2..c65ee1d9 100755 --- a/README.md +++ b/README.md @@ -4,6 +4,7 @@ [![Paper](https://img.shields.io/badge/arXiv-Paper-b31b1b.svg)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) [![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) +[![Paper](https://img.shields.io/badge/YouTube-Video-c4302b.svg)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) This repository contains the implementation of the following paper: @@ -11,11 +12,6 @@ This repository contains the implementation of the following paper: > [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      -[[Paper]()] | -[[Project Page](https://vchitect.github.io/VBench-project/)] | -[[Video](https://www.youtube.com/watch?v=7IhCC8Qqn8Y)] | -[[Huggingface Demo](https://vbench-t2v-leaderboard.hf.space/)] | - ## Overview ![overall_structure](./asset/fig_teaser_new.jpg) We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench provides valuable insights. From 5e912e888695c5c1387b4769a0af18519e22e7e7 Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 04:13:23 +0800 Subject: [PATCH 009/248] Update README.md --- README.md | 12 +++++++++--- evaluation_results/README.md | 3 ++- pretrained/README.md | 5 +++-- prompts/README.md | 2 +- 4 files changed, 15 insertions(+), 7 deletions(-) diff --git a/README.md b/README.md index c65ee1d9..59ae9df4 100755 --- a/README.md +++ b/README.md @@ -14,7 +14,7 @@ This repository contains the implementation of the following paper: ## Overview ![overall_structure](./asset/fig_teaser_new.jpg) -We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench provides valuable insights. +We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench can provide valuable insights from multiple perspectives. 
## Updates - [11/2023] Evaluation code for released for this list of dimensions: `['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` @@ -62,8 +62,14 @@ my_VBench.evaluate( ) ``` -The complete list of dimensions to support: -```['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']``` +- The complete list of dimensions: + ``` + ['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency'] + ``` +- Dimensions currently open-sourced (the remaining will be added soon): + ``` + ['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency'] + ``` ## :black_nib: Citation diff --git a/evaluation_results/README.md b/evaluation_results/README.md index 20015343..b85475e8 100755 --- a/evaluation_results/README.md +++ b/evaluation_results/README.md @@ -1 +1,2 @@ -# Evaluation Results \ No newline at end of file +# :bar_chart: Evaluation Results +Evaluation results will be saved here. \ No newline at end of file diff --git a/pretrained/README.md b/pretrained/README.md index 758f3162..a1e20351 100755 --- a/pretrained/README.md +++ b/pretrained/README.md @@ -1,2 +1,3 @@ -# Pretrained -Pretrained Checkpoints \ No newline at end of file +# :gem: Pre-Trained Models +[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model (see each folder). + diff --git a/prompts/README.md b/prompts/README.md index 164f53b5..2c0714a7 100755 --- a/prompts/README.md +++ b/prompts/README.md @@ -1,4 +1,4 @@ -# Prompt Suite +# :bookmark_tabs: Prompt Suite We design compact yet representative prompts in terms of both the evaluation dimensions and the content categories. 
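Patch 009 above separates the complete dimension list from the subset whose evaluation code is already open-sourced, and its hunk context shows the `my_VBench.evaluate( )` call that consumes such a list. A minimal sketch of a run over the released dimensions follows; the `VBench` constructor arguments and the `videos_path`, `name`, and `dimension_list` keywords are illustrative assumptions rather than a pinned API.

```python
# Minimal sketch of a VBench evaluation run over the open-sourced dimensions.
# Constructor signature and keyword names are assumed here for illustration.
import torch
from vbench import VBench

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Subset listed as currently open-sourced in the patch above.
released_dimensions = [
    'subject_consistency', 'background_consistency', 'aesthetic_quality',
    'object_class', 'multiple_objects', 'human_action', 'color',
    'spatial_relationship', 'scene', 'temporal_style', 'appearance_style',
    'overall_consistency',
]

my_VBench = VBench(device, "VBench_full_info.json", "evaluation_results/")
my_VBench.evaluate(
    videos_path="sampled_videos/",  # directory of generated videos to score
    name="my_t2v_model",            # run label used when saving results
    dimension_list=released_dimensions,
)
```

Under these assumptions, the per-dimension scores would be written to the chosen output directory, consistent with the note patch 009 adds to `evaluation_results/README.md` that evaluation results are saved there.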
From 9fa56eb0ad70d4de52b191c9c43c6ed15c3c166f Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Tue, 28 Nov 2023 10:19:14 +0800 Subject: [PATCH 010/248] Update README.md --- README.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index 59ae9df4..f95957db 100755 --- a/README.md +++ b/README.md @@ -1,10 +1,10 @@ # :bar_chart: VBench -[![Paper](https://img.shields.io/badge/arXiv-Paper-b31b1b.svg)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) -[![Project Page](https://img.shields.io/badge/VBench-Website-green)](https://vchitect.github.io/VBench-project/) +[![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) +[![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) -[![Paper](https://img.shields.io/badge/YouTube-Video-c4302b.svg)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) +[![Paper](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) This repository contains the implementation of the following paper: From 1a9008f6f9da95edbd45206788f7a081b0158428 Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Tue, 28 Nov 2023 10:21:05 +0800 Subject: [PATCH 011/248] Update README.md --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index f95957db..f3d24ed3 100755 --- a/README.md +++ b/README.md @@ -12,11 +12,11 @@ This repository contains the implementation of the following paper: > [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      -## Overview +## :mega: Overview ![overall_structure](./asset/fig_teaser_new.jpg) We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench can provide valuable insights from multiple perspectives. -## Updates +## :fire: Updates - [11/2023] Evaluation code for released for this list of dimensions: `['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` - [11/2023] Prompt Suites released. (See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts)) From 9092c6a88a03a147b57327a3ea25216cad389499 Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Tue, 28 Nov 2023 11:43:59 +0800 Subject: [PATCH 012/248] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index f3d24ed3..8a93d90d 100755 --- a/README.md +++ b/README.md @@ -3,7 +3,7 @@ [![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) [![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/) -[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) +[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) [![Paper](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) From 9dcc1088bc62af9b9cbfc01a2db5d26795851990 Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Tue, 28 Nov 2023 11:47:42 +0800 Subject: [PATCH 013/248] Update vbench_env.yml --- vbench_env.yml | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/vbench_env.yml b/vbench_env.yml index e2f629e1..2b350b9c 100644 --- a/vbench_env.yml +++ b/vbench_env.yml @@ -10,9 +10,6 @@ dependencies: - debugpy=1.6.7=py38h6a678d5_0 - decorator=5.1.1=pyhd3eb1b0_0 - importlib_metadata=6.0.0=hd3eb1b0_0 - - ipykernel=6.25.0=py38h2f386ee_0 - - jupyter_client=8.1.0=py38h06a4308_0 - - jupyter_core=5.3.0=py38h06a4308_0 - ld_impl_linux-64=2.38=h1181459_1 - libffi=3.4.4=h6a678d5_0 - libgcc-ng=11.2.0=h1234567_1 @@ -193,9 +190,7 @@ dependencies: - pathspec==0.11.2 - pathy==0.10.2 - peft==0.5.0 - - petrel-oss-sdk==v2.2.2-4-ge70974b-big-stream-lazyloading - pillow==9.5.0 - - pip==22.0 - pkgutil-resolve-name==1.3.10 - plotly==5.16.1 - portalocker==2.7.0 @@ -241,7 +236,6 @@ dependencies: - s3transfer==0.6.2 
- sacremoses==0.0.53 - safetensors==0.3.2 - - salesforce-lavis==1.0.2 - scenedetect==0.6.2 - scikit-image==0.21.0 - scikit-learn==1.3.2 @@ -254,7 +248,6 @@ dependencies: - setuptools==59.1.1 - shapely==2.0.1 - shortuuid==1.0.11 - - simplet5==0.1.4 - smart-open==6.3.0 - smmap==5.0.0 - sniffio==1.3.0 @@ -265,7 +258,6 @@ dependencies: - srsly==2.4.7 - stack-data==0.6.2 - starlette==0.27.0 - - streamlit==1.25.0 - submitit==1.4.6 - svgwrite==1.4.3 - tabulate==0.9.0 @@ -303,7 +295,6 @@ dependencies: - uvicorn==0.23.2 - validators==0.21.2 - virtualenv==20.24.3 - - wandb==0.16.0 - wasabi==1.1.2 - watchdog==3.0.0 - wavedrom==2.0.3.post3 @@ -312,7 +303,6 @@ dependencies: - webencodings==0.5.1 - websockets==11.0.3 - werkzeug==2.3.7 - - wordcloud==1.9.2 - yacs==0.1.8 - yapf==0.40.1 - yarl==1.9.2 From 2eed79fed1bd668a2ea6fe85e157c677a83b989c Mon Sep 17 00:00:00 2001 From: Huang Ziqi Date: Tue, 28 Nov 2023 16:11:44 +0800 Subject: [PATCH 014/248] Update README.md --- README.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 8a93d90d..1f472290 100755 --- a/README.md +++ b/README.md @@ -4,7 +4,8 @@ [![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) [![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) -[![Paper](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) +[![Video](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) +[![Visitor](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FVchitect%2FVBench&count_bg=%23FFA500&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com) This repository contains the implementation of the following paper: @@ -77,7 +78,7 @@ my_VBench.evaluate( If you find our repo useful for your research, please consider citing our paper: ```bibtex - @article{huang2023bench, + @article{huang2023vbench, title={{VBench}: Comprehensive Benchmark Suite for Video Generative Models}, author={Huang, Ziqi and He, Yinan and Yu, Jiashuo and Zhang, Fan and Si, Chenyang and Jiang, Yuming and Zhang, Yuanhan and Wu, Tianxing and Jin, Qingyang and Chanpaisit, Nattapol and Wang, Yaohui and Chen, Xinyuan and Wang, Limin and Lin, Dahua and Qiao, Yu and Liu, Ziwei} journal={arXiv preprint} From 0b890b5d790ab5afe591021b228b140df9b199a8 Mon Sep 17 00:00:00 2001 From: Zhang Fan Date: Tue, 28 Nov 2023 06:04:20 -0600 Subject: [PATCH 015/248] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 1f472290..b79ec210 100755 --- a/README.md +++ b/README.md @@ -10,7 +10,7 @@ This repository contains the implementation of the following paper: > **VBench: Comprehensive Benchmark Suite for Video Generative Models**
      -> [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), Qingyang Jin, [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      +> [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), [Qingyang Jin](https://github.com/Vchitect/VBench), [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
      ## :mega: Overview From e45adf6d9ba40285c5453651a7f4b3cc7cbde348 Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Thu, 30 Nov 2023 00:28:33 +0800 Subject: [PATCH 016/248] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index b79ec210..0ec1a754 100755 --- a/README.md +++ b/README.md @@ -3,7 +3,7 @@ [![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf) [![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/) -[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/VBench/T2V-Leaderboard) +[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/Vchitect/VBench_Leaderboard) [![Video](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) [![Visitor](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FVchitect%2FVBench&count_bg=%23FFA500&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com) From 5efbef71695763a1ec393f80a1496224613defc6 Mon Sep 17 00:00:00 2001 From: yinanhe Date: Thu, 30 Nov 2023 16:42:44 +0800 Subject: [PATCH 017/248] [fix] dimension return --- vbench/color.py | 3 ++- vbench/multiple_objects.py | 3 ++- vbench/object_class.py | 3 ++- vbench/scene.py | 3 ++- vbench/spatial_relationship.py | 3 ++- 5 files changed, 10 insertions(+), 5 deletions(-) diff --git a/vbench/color.py b/vbench/color.py index 90f73cbf..9ba24c5e 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -72,4 +72,5 @@ def compute_color(json_dir, device, submodules_dict): dense_caption_model.initialize_model(**submodules_dict) logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='color', lang='en') - all_results, video_results = color(dense_caption_model, prompt_dict_ls, device) \ No newline at end of file + all_results, video_results = color(dense_caption_model, prompt_dict_ls, device) + return all_results, video_results \ No newline at end of file diff --git a/vbench/multiple_objects.py b/vbench/multiple_objects.py index e2c1bcdd..6385969e 100755 --- a/vbench/multiple_objects.py +++ b/vbench/multiple_objects.py @@ -58,4 +58,5 @@ def compute_multiple_objects(json_dir, device, submodules_dict): dense_caption_model.initialize_model_det(**submodules_dict) logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='multiple_objects', lang='en') - all_results, video_results = multiple_objects(dense_caption_model, prompt_dict_ls, device) \ No newline at end of file + all_results, video_results = multiple_objects(dense_caption_model, prompt_dict_ls, device) + return all_results, video_results \ No newline at end of file diff --git a/vbench/object_class.py b/vbench/object_class.py index 6a50f112..50cf7d05 100755 --- a/vbench/object_class.py +++ b/vbench/object_class.py @@ -51,4 +51,5 @@ def compute_object_class(json_dir, device, submodules_dict): dense_caption_model.initialize_model_det(**submodules_dict) logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, 
dimension='object_class', lang='en') - all_results, video_results = object_class(dense_caption_model, prompt_dict_ls, device) \ No newline at end of file + all_results, video_results = object_class(dense_caption_model, prompt_dict_ls, device) + return all_results, video_results \ No newline at end of file diff --git a/vbench/scene.py b/vbench/scene.py index f3895367..640cfbb1 100755 --- a/vbench/scene.py +++ b/vbench/scene.py @@ -52,4 +52,5 @@ def compute_scene(json_dir, device, submodules_dict): model = model.to(device) logger.info("Initialize caption model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='scene', lang='en') - all_results, video_results = scene(model, prompt_dict_ls, device) \ No newline at end of file + all_results, video_results = scene(model, prompt_dict_ls, device) + return all_results, video_results \ No newline at end of file diff --git a/vbench/spatial_relationship.py b/vbench/spatial_relationship.py index f6819eb9..40796aef 100755 --- a/vbench/spatial_relationship.py +++ b/vbench/spatial_relationship.py @@ -126,4 +126,5 @@ def compute_spatial_relationship(json_dir, device, submodules_dict): dense_caption_model.initialize_model_det(**submodules_dict) logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='spatial_relationship', lang='en') - all_results, video_results = spatial_relationship(dense_caption_model, prompt_dict_ls, device) \ No newline at end of file + all_results, video_results = spatial_relationship(dense_caption_model, prompt_dict_ls, device) + return all_results, video_results \ No newline at end of file From d67ebacfb68f345409c044fa8cf11ca38c0e487a Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Thu, 30 Nov 2023 17:10:03 +0800 Subject: [PATCH 018/248] [update] temporal flickering, motion smoothness and dynamic degree dimension --- pretrained/amt_model/AMT-S.yaml | 63 + pretrained/amt_model/download.sh | 1 + pretrained/raft_model/download.sh | 3 + vbench/__init__.py | 7 +- vbench/dynamic_degree.py | 147 + vbench/motion_smoothness.py | 180 + vbench/temporal_flickering.py | 66 + vbench/third_party/RAFT/LICENSE | 29 + vbench/third_party/RAFT/RAFT.png | Bin 0 -> 204077 bytes vbench/third_party/RAFT/README.md | 80 + .../RAFT/alt_cuda_corr/correlation.cpp | 54 + .../RAFT/alt_cuda_corr/correlation_kernel.cu | 324 + .../third_party/RAFT/alt_cuda_corr/setup.py | 15 + vbench/third_party/RAFT/chairs_split.txt | 22872 ++++++++++++++++ vbench/third_party/RAFT/core/__init__.py | 0 vbench/third_party/RAFT/core/corr.py | 91 + vbench/third_party/RAFT/core/datasets.py | 235 + vbench/third_party/RAFT/core/extractor.py | 267 + vbench/third_party/RAFT/core/raft.py | 147 + vbench/third_party/RAFT/core/update.py | 139 + .../RAFT/core/utils_core/__init__.py | 0 .../RAFT/core/utils_core/augmentor.py | 246 + .../RAFT/core/utils_core/flow_viz.py | 132 + .../RAFT/core/utils_core/frame_utils.py | 137 + .../third_party/RAFT/core/utils_core/utils.py | 82 + vbench/third_party/RAFT/download_models.sh | 3 + vbench/third_party/RAFT/evaluate.py | 197 + vbench/third_party/amt/LICENSE | 176 + vbench/third_party/amt/README.md | 167 + vbench/third_party/amt/benchmarks/__init__.py | 0 vbench/third_party/amt/benchmarks/adobe240.py | 56 + vbench/third_party/amt/benchmarks/gopro.py | 55 + vbench/third_party/amt/benchmarks/snu_film.py | 70 + .../amt/benchmarks/speed_parameters.py | 38 + vbench/third_party/amt/benchmarks/ucf101.py | 59 + vbench/third_party/amt/benchmarks/vimeo90k.py | 65 + 
.../amt/benchmarks/vimeo90k_tta.py | 67 + vbench/third_party/amt/benchmarks/xiph.py | 104 + vbench/third_party/amt/cfgs/AMT-G.yaml | 62 + vbench/third_party/amt/cfgs/AMT-L.yaml | 62 + vbench/third_party/amt/cfgs/AMT-S.yaml | 63 + vbench/third_party/amt/cfgs/AMT-S_gopro.yaml | 56 + vbench/third_party/amt/cfgs/IFRNet.yaml | 67 + vbench/third_party/amt/datasets/__init__.py | 0 .../amt/datasets/adobe_datasets.py | 75 + .../amt/datasets/gopro_datasets.py | 188 + .../amt/datasets/vimeo_datasets.py | 176 + vbench/third_party/amt/docs/develop.md | 239 + vbench/third_party/amt/docs/method.md | 126 + vbench/third_party/amt/environment.yaml | 19 + .../amt/flow_generation/__init__.py | 0 .../amt/flow_generation/gen_flow.py | 72 + .../amt/flow_generation/liteflownet/README.md | 45 + .../flow_generation/liteflownet/__init__.py | 0 .../liteflownet/correlation/README.md | 1 + .../liteflownet/correlation/correlation.py | 396 + .../amt/flow_generation/liteflownet/run.py | 385 + vbench/third_party/amt/losses/__init__.py | 0 vbench/third_party/amt/losses/loss.py | 196 + vbench/third_party/amt/metrics/__init__.py | 0 vbench/third_party/amt/metrics/psnr_ssim.py | 140 + vbench/third_party/amt/networks/AMT-G.py | 172 + vbench/third_party/amt/networks/AMT-L.py | 155 + vbench/third_party/amt/networks/AMT-S.py | 154 + vbench/third_party/amt/networks/IFRNet.py | 169 + vbench/third_party/amt/networks/__init__.py | 0 .../amt/networks/blocks/__init__.py | 0 .../amt/networks/blocks/feat_enc.py | 343 + .../third_party/amt/networks/blocks/ifrnet.py | 111 + .../amt/networks/blocks/multi_flow.py | 69 + .../third_party/amt/networks/blocks/raft.py | 207 + .../amt/scripts/benchmark_arbitrary.sh | 5 + .../amt/scripts/benchmark_fixed.sh | 7 + vbench/third_party/amt/scripts/train.sh | 6 + vbench/third_party/amt/train.py | 68 + vbench/third_party/amt/trainers/__init__.py | 0 .../third_party/amt/trainers/base_trainer.py | 243 + vbench/third_party/amt/trainers/logger.py | 62 + vbench/third_party/amt/utils/__init__.py | 0 vbench/third_party/amt/utils/build_utils.py | 15 + vbench/third_party/amt/utils/dist_utils.py | 48 + vbench/third_party/amt/utils/flow_utils.py | 122 + vbench/third_party/amt/utils/utils.py | 297 + vbench/utils.py | 27 + 84 files changed, 31021 insertions(+), 1 deletion(-) create mode 100755 pretrained/amt_model/AMT-S.yaml create mode 100755 pretrained/amt_model/download.sh create mode 100755 pretrained/raft_model/download.sh create mode 100644 vbench/dynamic_degree.py create mode 100644 vbench/motion_smoothness.py create mode 100644 vbench/temporal_flickering.py create mode 100644 vbench/third_party/RAFT/LICENSE create mode 100644 vbench/third_party/RAFT/RAFT.png create mode 100644 vbench/third_party/RAFT/README.md create mode 100644 vbench/third_party/RAFT/alt_cuda_corr/correlation.cpp create mode 100644 vbench/third_party/RAFT/alt_cuda_corr/correlation_kernel.cu create mode 100644 vbench/third_party/RAFT/alt_cuda_corr/setup.py create mode 100644 vbench/third_party/RAFT/chairs_split.txt create mode 100644 vbench/third_party/RAFT/core/__init__.py create mode 100644 vbench/third_party/RAFT/core/corr.py create mode 100644 vbench/third_party/RAFT/core/datasets.py create mode 100644 vbench/third_party/RAFT/core/extractor.py create mode 100644 vbench/third_party/RAFT/core/raft.py create mode 100644 vbench/third_party/RAFT/core/update.py create mode 100644 vbench/third_party/RAFT/core/utils_core/__init__.py create mode 100644 vbench/third_party/RAFT/core/utils_core/augmentor.py create mode 100644 
vbench/third_party/RAFT/core/utils_core/flow_viz.py create mode 100644 vbench/third_party/RAFT/core/utils_core/frame_utils.py create mode 100644 vbench/third_party/RAFT/core/utils_core/utils.py create mode 100755 vbench/third_party/RAFT/download_models.sh create mode 100644 vbench/third_party/RAFT/evaluate.py create mode 100644 vbench/third_party/amt/LICENSE create mode 100755 vbench/third_party/amt/README.md create mode 100755 vbench/third_party/amt/benchmarks/__init__.py create mode 100755 vbench/third_party/amt/benchmarks/adobe240.py create mode 100755 vbench/third_party/amt/benchmarks/gopro.py create mode 100755 vbench/third_party/amt/benchmarks/snu_film.py create mode 100755 vbench/third_party/amt/benchmarks/speed_parameters.py create mode 100755 vbench/third_party/amt/benchmarks/ucf101.py create mode 100755 vbench/third_party/amt/benchmarks/vimeo90k.py create mode 100755 vbench/third_party/amt/benchmarks/vimeo90k_tta.py create mode 100755 vbench/third_party/amt/benchmarks/xiph.py create mode 100755 vbench/third_party/amt/cfgs/AMT-G.yaml create mode 100755 vbench/third_party/amt/cfgs/AMT-L.yaml create mode 100755 vbench/third_party/amt/cfgs/AMT-S.yaml create mode 100755 vbench/third_party/amt/cfgs/AMT-S_gopro.yaml create mode 100755 vbench/third_party/amt/cfgs/IFRNet.yaml create mode 100755 vbench/third_party/amt/datasets/__init__.py create mode 100755 vbench/third_party/amt/datasets/adobe_datasets.py create mode 100755 vbench/third_party/amt/datasets/gopro_datasets.py create mode 100755 vbench/third_party/amt/datasets/vimeo_datasets.py create mode 100755 vbench/third_party/amt/docs/develop.md create mode 100755 vbench/third_party/amt/docs/method.md create mode 100755 vbench/third_party/amt/environment.yaml create mode 100755 vbench/third_party/amt/flow_generation/__init__.py create mode 100755 vbench/third_party/amt/flow_generation/gen_flow.py create mode 100755 vbench/third_party/amt/flow_generation/liteflownet/README.md create mode 100755 vbench/third_party/amt/flow_generation/liteflownet/__init__.py create mode 100755 vbench/third_party/amt/flow_generation/liteflownet/correlation/README.md create mode 100755 vbench/third_party/amt/flow_generation/liteflownet/correlation/correlation.py create mode 100755 vbench/third_party/amt/flow_generation/liteflownet/run.py create mode 100755 vbench/third_party/amt/losses/__init__.py create mode 100755 vbench/third_party/amt/losses/loss.py create mode 100755 vbench/third_party/amt/metrics/__init__.py create mode 100755 vbench/third_party/amt/metrics/psnr_ssim.py create mode 100755 vbench/third_party/amt/networks/AMT-G.py create mode 100755 vbench/third_party/amt/networks/AMT-L.py create mode 100755 vbench/third_party/amt/networks/AMT-S.py create mode 100755 vbench/third_party/amt/networks/IFRNet.py create mode 100755 vbench/third_party/amt/networks/__init__.py create mode 100755 vbench/third_party/amt/networks/blocks/__init__.py create mode 100755 vbench/third_party/amt/networks/blocks/feat_enc.py create mode 100755 vbench/third_party/amt/networks/blocks/ifrnet.py create mode 100755 vbench/third_party/amt/networks/blocks/multi_flow.py create mode 100755 vbench/third_party/amt/networks/blocks/raft.py create mode 100755 vbench/third_party/amt/scripts/benchmark_arbitrary.sh create mode 100755 vbench/third_party/amt/scripts/benchmark_fixed.sh create mode 100755 vbench/third_party/amt/scripts/train.sh create mode 100755 vbench/third_party/amt/train.py create mode 100755 vbench/third_party/amt/trainers/__init__.py create mode 100755 
vbench/third_party/amt/trainers/base_trainer.py create mode 100755 vbench/third_party/amt/trainers/logger.py create mode 100755 vbench/third_party/amt/utils/__init__.py create mode 100755 vbench/third_party/amt/utils/build_utils.py create mode 100755 vbench/third_party/amt/utils/dist_utils.py create mode 100755 vbench/third_party/amt/utils/flow_utils.py create mode 100755 vbench/third_party/amt/utils/utils.py diff --git a/pretrained/amt_model/AMT-S.yaml b/pretrained/amt_model/AMT-S.yaml new file mode 100755 index 00000000..f0673557 --- /dev/null +++ b/pretrained/amt_model/AMT-S.yaml @@ -0,0 +1,63 @@ +exp_name: floloss1e-2_300epoch_bs24_lr2e-4 +seed: 2023 +epochs: 300 +distributed: true +lr: 2e-4 +lr_min: 2e-5 +weight_decay: 0.0 +resume_state: null +save_dir: work_dir +eval_interval: 1 + +network: + name: networks.AMT-S.Model + params: + corr_radius: 3 + corr_lvls: 4 + num_flows: 3 + +data: + train: + name: datasets.vimeo_datasets.Vimeo90K_Train_Dataset + params: + dataset_dir: data/vimeo_triplet + val: + name: datasets.vimeo_datasets.Vimeo90K_Test_Dataset + params: + dataset_dir: data/vimeo_triplet + train_loader: + batch_size: 24 + num_workers: 12 + val_loader: + batch_size: 24 + num_workers: 3 + +logger: + use_wandb: false + resume_id: null + +losses: + - { + name: losses.loss.CharbonnierLoss, + nickname: l_rec, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.TernaryLoss, + nickname: l_ter, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.MultipleFlowLoss, + nickname: l_flo, + params: { + loss_weight: 0.002, + keys: [flow0_pred, flow1_pred, flow] + } + } diff --git a/pretrained/amt_model/download.sh b/pretrained/amt_model/download.sh new file mode 100755 index 00000000..004890f8 --- /dev/null +++ b/pretrained/amt_model/download.sh @@ -0,0 +1 @@ +wget https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth diff --git a/pretrained/raft_model/download.sh b/pretrained/raft_model/download.sh new file mode 100755 index 00000000..dfd8d473 --- /dev/null +++ b/pretrained/raft_model/download.sh @@ -0,0 +1,3 @@ +#!/bin/bash +wget https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip +unzip models.zip diff --git a/vbench/__init__.py b/vbench/__init__.py index e1ffc820..053e371e 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -11,6 +11,11 @@ from .scene import compute_scene from .temporal_style import compute_temporal_style from .overall_consistency import compute_overall_consistency +from .temporal_flickering import compute_temporal_flickering +from .motion_smoothness import compute_motion_smoothness +from .dynamic_degree import compute_dynamic_degree +from .human_action import compute_human_action +from .appearance_style import compute_appearance_style class VBench(object): def __init__(self, device, full_info_dir, output_path): @@ -21,7 +26,7 @@ def __init__(self, device, full_info_dir, output_path): os.makedirs(self.output_path, exist_ok=False) def build_full_dimension_list(self, ): - return ["subject_consistency", "background_consistency", "aesthetic_quality", "imaging_quality", "object_class", "multiple_objects", "color", "spatial_relationship", "scene", "temporal_style", 'overall_consistency', "human_action"] + return ["subject_consistency", "background_consistency", "aesthetic_quality", "imaging_quality", "object_class", "multiple_objects", "color", "spatial_relationship", "scene", "temporal_style", 'overall_consistency', "human_action", "temporal_flickering", "motion_smoothness", 
"dynamic_degree", "appearance_style"] def build_full_info_json(self, videos_path, name, dimension_list): cur_full_info_list = load_json(self.full_info_dir) diff --git a/vbench/dynamic_degree.py b/vbench/dynamic_degree.py new file mode 100644 index 00000000..6393322b --- /dev/null +++ b/vbench/dynamic_degree.py @@ -0,0 +1,147 @@ +import argparse +import os +import cv2 +import glob +import numpy as np +import torch +from tqdm import tqdm + +from .utils import load_dimension_info + +from .third_party.RAFT.core.raft import RAFT +from .third_party.RAFT.core.utils_core.utils import InputPadder + + + + +class DynamicDegree: + def __init__(self, args, device): + self.args = args + self.device = device + self.load_model() + + + def load_model(self): + self.model = torch.nn.DataParallel(RAFT(self.args)) + self.model.load_state_dict(torch.load(self.args.model)) + + self.model = self.model.module + self.model.to(self.device) + self.model.eval() + + + + def get_score(self, img, flo): + img = img[0].permute(1,2,0).cpu().numpy() + flo = flo[0].permute(1,2,0).cpu().numpy() + + u = flo[:,:,0] + v = flo[:,:,1] + rad = np.sqrt(np.square(u) + np.square(v)) + + h, w = rad.shape + rad_flat = rad.flatten() + cut_index = int(h*w*0.05) + + max_rad = np.mean(abs(np.sort(-rad_flat))[:cut_index]) + + return max_rad.item() + + + def set_params(self, frame, count): + scale = min(list(frame.shape)[-2:]) + self.params = {"thres":6.0*(scale/256.0), "count_num":round(4*(count/16.0))} + + + def infer(self, video_path): + with torch.no_grad(): + if video_path.endswith('.mp4'): + frames = self.get_frames(video_path) + elif os.path.isdir(video_path): + frames = self.get_frames_from_img_folder(video_path) + else: + raise NotImplementedError + self.set_params(frame=frames[0], count=len(frames)) + static_score = [] + for image1, image2 in zip(frames[:-1], frames[1:]): + padder = InputPadder(image1.shape) + image1, image2 = padder.pad(image1, image2) + _, flow_up = self.model(image1, image2, iters=20, test_mode=True) + max_rad = self.get_score(image1, flow_up) + static_score.append(max_rad) + whether_move = self.check_move(static_score) + return whether_move + + + def check_move(self, score_list): + thres = self.params["thres"] + count_num = self.params["count_num"] + count = 0 + for score in score_list: + if score > thres: + count += 1 + if count >= count_num: + return True + return False + + + def get_frames(self, video_path): + frame_list = [] + video = cv2.VideoCapture(video_path) + while video.isOpened(): + success, frame = video.read() + if success: + frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # convert to rgb + frame = torch.from_numpy(frame.astype(np.uint8)).permute(2, 0, 1).float() + frame = frame[None].to(self.device) + frame_list.append(frame) + else: + break + video.release() + assert frame_list != [] + return frame_list + + + def get_frames_from_img_folder(self, img_folder): + exts = ['jpg', 'png', 'jpeg', 'bmp', 'tif', + 'tiff', 'JPG', 'PNG', 'JPEG', 'BMP', + 'TIF', 'TIFF'] + frame_list = [] + imgs = sorted([p for p in glob.glob(os.path.join(img_folder, "*")) if os.path.splitext(p)[1][1:] in exts]) + # imgs = sorted(glob.glob(os.path.join(img_folder, "*.png"))) + for img in imgs: + frame = cv2.imread(img, cv2.IMREAD_COLOR) + frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) + frame = torch.from_numpy(frame.astype(np.uint8)).permute(2, 0, 1).float() + frame = frame[None].to(self.device) + frame_list.append(frame) + assert frame_list != [] + return frame_list + + + +def dynamic_degree(dynamic, video_list): + sim = [] 
+    video_results = []
+    for video_path in tqdm(video_list):
+        score_per_video = dynamic.infer(video_path)
+        video_results.append({'video_path': video_path, 'video_results': score_per_video})
+        sim.append(score_per_video)
+    avg_score = np.mean(sim)
+    return avg_score, video_results
+
+
+def compute_dynamic_degree(json_dir, device, submodules_list):
+    model_path = submodules_list["model"]
+    # set_args
+    parser = argparse.ArgumentParser()
+    parser.add_argument('--model', type=str, default=model_path, help="restore checkpoint")
+    parser.add_argument('--small', action='store_true', help='use small model')
+    parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision')
+    parser.add_argument('--alternate_corr', action='store_true', help='use efficient correlation implementation')
+    args = parser.parse_args(args=[])  # take the defaults; do not consume the host script's CLI arguments
+
+    dynamic = DynamicDegree(args, device)
+    video_list, _ = load_dimension_info(json_dir, dimension='dynamic_degree', lang='en')
+    all_results, video_results = dynamic_degree(dynamic, video_list)
+    return all_results, video_results
\ No newline at end of file
diff --git a/vbench/motion_smoothness.py b/vbench/motion_smoothness.py
new file mode 100644
index 00000000..364fa071
--- /dev/null
+++ b/vbench/motion_smoothness.py
@@ -0,0 +1,180 @@
+import os
+import cv2
+import glob
+import torch
+import numpy as np
+from tqdm import tqdm
+from omegaconf import OmegaConf
+
+from .utils import load_dimension_info
+
+from .third_party.amt.utils.utils import (
+    img2tensor, tensor2img,
+    check_dim_and_resize
+    )
+from .third_party.amt.utils.build_utils import build_from_cfg
+from .third_party.amt.utils.utils import InputPadder
+
+
+class FrameProcess:
+    def __init__(self):
+        pass
+
+
+    def get_frames(self, video_path):
+        frame_list = []
+        video = cv2.VideoCapture(video_path)
+        while video.isOpened():
+            success, frame = video.read()
+            if success:
+                frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # convert to rgb
+                frame_list.append(frame)
+            else:
+                break
+        video.release()
+        assert frame_list != []
+        return frame_list
+
+
+    def get_frames_from_img_folder(self, img_folder):
+        exts = ['jpg', 'png', 'jpeg', 'bmp', 'tif',
+                'tiff', 'JPG', 'PNG', 'JPEG', 'BMP',
+                'TIF', 'TIFF']
+        frame_list = []
+        imgs = sorted([p for p in glob.glob(os.path.join(img_folder, "*")) if os.path.splitext(p)[1][1:] in exts])
+        # imgs = sorted(glob.glob(os.path.join(img_folder, "*.png")))
+        for img in imgs:
+            frame = cv2.imread(img, cv2.IMREAD_COLOR)
+            frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
+            frame_list.append(frame)
+        assert frame_list != []
+        return frame_list
+
+
+    def extract_frame(self, frame_list, start_from=0):
+        extract = []
+        for i in range(start_from, len(frame_list), 2):
+            extract.append(frame_list[i])
+        return extract
+
+
+class MotionSmoothness:
+    def __init__(self, config, ckpt, device):
+        self.device = device
+        self.config = config
+        self.ckpt = ckpt
+        self.niters = 1
+        self.initialization()
+        self.load_model()
+
+
+    def load_model(self):
+        cfg_path = self.config
+        ckpt_path = self.ckpt
+        network_cfg = OmegaConf.load(cfg_path).network
+        network_name = network_cfg.name
+        print(f'Loading [{network_name}] from [{ckpt_path}]...')
+        self.model = build_from_cfg(network_cfg)
+        ckpt = torch.load(ckpt_path)
+        self.model.load_state_dict(ckpt['state_dict'])
+        self.model = self.model.to(self.device)
+        self.model.eval()
+
+
+    def initialization(self):
+        if self.device == 'cuda':
+            self.anchor_resolution = 1024 * 512
+            self.anchor_memory = 1500 * 1024**2
+            self.anchor_memory_bias = 2500 * 1024**2
+            self.vram_avail = torch.cuda.get_device_properties(self.device).total_memory
+            print("VRAM available: {:.1f} MB".format(self.vram_avail / 1024 ** 2))
+        else:
+            # Do not resize in cpu mode
+            self.anchor_resolution = 8192*8192
+            self.anchor_memory = 1
+            self.anchor_memory_bias = 0
+            self.vram_avail = 1
+
+        self.embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(self.device)
+        self.fp = FrameProcess()
+
+
+    def motion_score(self, video_path):
+        iters = int(self.niters)
+        # get inputs
+        if video_path.endswith('.mp4'):
+            frames = self.fp.get_frames(video_path)
+        elif os.path.isdir(video_path):
+            frames = self.fp.get_frames_from_img_folder(video_path)
+        else:
+            raise NotImplementedError
+        frame_list = self.fp.extract_frame(frames, start_from=0)
+        # print(f'Loading [images] from [{video_path}], the number of images = [{len(frame_list)}]')
+        inputs = [img2tensor(frame).to(self.device) for frame in frame_list]
+        assert len(inputs) > 1, f"The number of inputs should be more than one (current {len(inputs)})"
+        inputs = check_dim_and_resize(inputs)
+        h, w = inputs[0].shape[-2:]
+        scale = self.anchor_resolution / (h * w) * np.sqrt((self.vram_avail - self.anchor_memory_bias) / self.anchor_memory)
+        scale = 1 if scale > 1 else scale
+        scale = 1 / np.floor(1 / np.sqrt(scale) * 16) * 16
+        if scale < 1:
+            print(f"Due to the limited VRAM, the video will be scaled by {scale:.2f}")
+        padding = int(16 / scale)
+        padder = InputPadder(inputs[0].shape, padding)
+        inputs = padder.pad(*inputs)
+
+        # ----------------------- Interpolator -----------------------
+        # print(f'Start frame interpolation:')
+        for i in range(iters):
+            # print(f'Iter {i+1}. input_frames={len(inputs)} output_frames={2*len(inputs)-1}')
+            outputs = [inputs[0]]
+            for in_0, in_1 in zip(inputs[:-1], inputs[1:]):
+                in_0 = in_0.to(self.device)
+                in_1 = in_1.to(self.device)
+                with torch.no_grad():
+                    imgt_pred = self.model(in_0, in_1, self.embt, scale_factor=scale, eval=True)['imgt_pred']
+                outputs += [imgt_pred.cpu(), in_1.cpu()]
+            inputs = outputs
+
+        # ----------------------- cal_vfi_score -----------------------
+        outputs = padder.unpad(*outputs)
+        outputs = [tensor2img(out) for out in outputs]
+        vfi_score = self.vfi_score(frames, outputs)
+        norm = (255.0 - vfi_score)/255.0
+        return norm
+
+
+    def vfi_score(self, ori_frames, interpolate_frames):
+        ori = self.fp.extract_frame(ori_frames, start_from=1)
+        interpolate = self.fp.extract_frame(interpolate_frames, start_from=1)
+        scores = []
+        for i in range(len(interpolate)):
+            scores.append(self.get_diff(ori[i], interpolate[i]))
+        return np.mean(np.array(scores))
+
+
+    def get_diff(self, img1, img2):
+        img = cv2.absdiff(img1, img2)
+        return np.mean(img)
+
+
+
+def motion_smoothness(motion, video_list):
+    sim = []
+    video_results = []
+    for video_path in tqdm(video_list):
+        score_per_video = motion.motion_score(video_path)
+        video_results.append({'video_path': video_path, 'video_results': score_per_video})
+        sim.append(score_per_video)
+    avg_score = np.mean(sim)
+    return avg_score, video_results
+
+
+
+def compute_motion_smoothness(json_dir, device, submodules_list):
+    config = submodules_list["config"]  # pretrained/amt_model/AMT-S.yaml
+    ckpt = submodules_list["ckpt"]  # pretrained/amt_model/amt-s.pth
+    motion = MotionSmoothness(config, ckpt, device)
+    video_list, _ = load_dimension_info(json_dir, dimension='motion_smoothness', lang='en')
+    all_results, video_results = motion_smoothness(motion, video_list)
+    return all_results, video_results
diff --git a/vbench/temporal_flickering.py
b/vbench/temporal_flickering.py new file mode 100644 index 00000000..c704366a --- /dev/null +++ b/vbench/temporal_flickering.py @@ -0,0 +1,66 @@ +import numpy as np +from tqdm import tqdm +import cv2 +from .utils import load_dimension_info + + +def get_frames(video_path): + frames = [] + video = cv2.VideoCapture(video_path) + while video.isOpened(): + success, frame = video.read() + if success: + frames.append(frame) + else: + break + video.release() + assert frames != [] + return frames + + +def mae_seq(frames): + ssds = [] + for i in range(len(frames)-1): + ssds.append(calculate_mae(frames[i], frames[i+1])) + return np.array(ssds) + + +def calculate_mae(img1, img2): + """Computing the mean absolute error (MAE) between two images.""" + if img1.shape != img2.shape: + print("Images don't have the same shape.") + return + return np.mean(cv2.absdiff(np.array(img1, dtype=np.float32), np.array(img2, dtype=np.float32))) + + +def cal_score(video_path): + """please ensure the video is static""" + frames = get_frames(video_path) + score_seq = mae_seq(frames) + return (255.0 - np.mean(score_seq).item())/255.0 + + +def temporal_flickering(video_list): + sim = [] + video_results = [] + for video_path in tqdm(video_list): + score_per_video = cal_score(video_path) + video_results.append({'video_path': video_path, 'video_results': score_per_video}) + sim.append(score_per_video) + avg_score = np.mean(sim) + return avg_score, video_results + + +def compute_temporal_flickering(json_dir, device, submodules_list): + video_list, _ = load_dimension_info(json_dir, dimension='temporal_flickering', lang='en') + all_results, video_results = temporal_flickering(video_list) + return all_results, video_results + + + + + + + + + diff --git a/vbench/third_party/RAFT/LICENSE b/vbench/third_party/RAFT/LICENSE new file mode 100644 index 00000000..ed13d840 --- /dev/null +++ b/vbench/third_party/RAFT/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, princeton-vl +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +* Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +* Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +* Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
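The `temporal_flickering` dimension added above is pure arithmetic: the mean absolute error (MAE) between consecutive frames, averaged over the video and mapped into [0, 1] so that 1.0 means a perfectly static video and lower values mean stronger flicker. A minimal sketch of that mapping, using synthetic frames instead of a decoded video (`np.abs` stands in for `cv2.absdiff` here; no video files or checkpoints are needed):

```python
import numpy as np

def flicker_score(frames):
    # Mean MAE over consecutive frame pairs, then inverted and
    # normalized by the 8-bit range, mirroring cal_score() above.
    maes = [np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32)))
            for a, b in zip(frames[:-1], frames[1:])]
    return (255.0 - float(np.mean(maes))) / 255.0

# Two 4x4 grayscale "frames" whose pixels differ by a constant 10:
f0 = np.full((4, 4), 100, dtype=np.uint8)
f1 = np.full((4, 4), 110, dtype=np.uint8)
print(flicker_score([f0, f1]))  # (255 - 10) / 255 ~= 0.9608
```

As the docstring in `cal_score` warns, the score is only meaningful on videos that are supposed to be static, since genuine motion also raises frame-to-frame MAE.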
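The `dynamic_degree` module works differently: each frame pair is reduced to the mean of its top-5% RAFT flow magnitudes (`get_score`), and the whole video is labeled dynamic only if enough pairs clear a resolution-scaled threshold (`set_params` / `check_move`). The decision rule is easy to check in isolation; the sketch below uses made-up per-pair scores, while the constants 6.0, 256, 4, and 16 come straight from `set_params`:

```python
def is_dynamic(pair_scores, short_side, n_frames):
    # Threshold grows with resolution; required count grows with clip length,
    # matching set_params() and check_move() above.
    thres = 6.0 * (short_side / 256.0)
    count_num = round(4 * (n_frames / 16.0))
    moving_pairs = sum(score > thres for score in pair_scores)
    return moving_pairs >= count_num

# A 256-px, 16-frame clip: threshold 6.0, needs 4 "moving" pairs out of 15.
scores = [0.5] * 11 + [7.0] * 4  # hypothetical per-pair flow scores
print(is_dynamic(scores, short_side=256, n_frames=16))  # True
```

Note that `infer` therefore returns a boolean per video, so the dimension's average score is the fraction of videos judged dynamic.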
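With patch 017's return fix and patch 018's new modules, every `compute_<dimension>` entry point yields an `(all_results, video_results)` pair. A hypothetical driver for the three new temporal dimensions might look like the sketch below. The JSON path is a placeholder, the RAFT checkpoint filename inside the downloaded `models.zip` is an assumption rather than part of the patch, and `device` is given as a string to match the `self.device == 'cuda'` check in `MotionSmoothness.initialization()`:

```python
import torch
from vbench import (
    compute_temporal_flickering,
    compute_motion_smoothness,
    compute_dynamic_degree,
)

device = "cuda" if torch.cuda.is_available() else "cpu"
json_dir = "evaluation_results/full_info.json"  # placeholder: per-dimension full-info JSON

# No pretrained weights needed; the submodules dict is unused here.
flicker_avg, flicker_per_video = compute_temporal_flickering(json_dir, device, {})

# AMT-S config + checkpoint, as shipped under pretrained/amt_model/.
smooth_avg, _ = compute_motion_smoothness(
    json_dir, device,
    {"config": "pretrained/amt_model/AMT-S.yaml",
     "ckpt": "pretrained/amt_model/amt-s.pth"},
)

# RAFT checkpoint fetched by pretrained/raft_model/download.sh; the exact
# filename inside models.zip is an assumption.
dynamic_avg, _ = compute_dynamic_degree(
    json_dir, device, {"model": "pretrained/raft_model/models/raft-things.pth"},
)

print(f"flicker={flicker_avg:.4f} smoothness={smooth_avg:.4f} dynamic={dynamic_avg:.4f}")
```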
diff --git a/vbench/third_party/RAFT/RAFT.png b/vbench/third_party/RAFT/RAFT.png
new file mode 100644
index 0000000000000000000000000000000000000000..a387fe2c8b2d02602941a5a74993992cd6490a4c
GIT binary patch
literal 204077
[base85-encoded binary payload of RAFT.png omitted]
zAC_Cz_JE76;AVtDB;N=A!91pJtUO_=^HEduRi?Knp~J?%p*kO_l$cSf$;q8uT^p|R zL>ib*O#D&x2I=6%#ev2Nl+9RROfE}qBANFqyd;W5rwV5h(K|orGXcPZIdo-@a{Fk^ zQlwhCe1au?kRqXVoHEr{o`$1lXR}Y6Q*b=1qGKd2O(^x4BiL@DQt>Iq9Kt!?xg&A= zYAv!q2WYN4MVk_V6Qg_*{kIDhQ<30pwJG=*W=K=j_eEdhb&XTWL5Q2?$ZuhJX-EJQ zI?e*7KEW-OKiHj_>aymC;3UTc&4`j5^|78o6)uIcp!u5tv22r9lsR7MfBMk=*9+kE z^;a-NO-}=kLB~?TN0eFlpC!zO-J{%J=J=E)g_Te}K(4GT^#4YSAWmii zV$Q`DC@}IHSc}SVdN^nD8+3!$2+EIn|)PuC=LQ6YO`?p>b&iwHU zu%#C;V{x59WYBp$WD5<=^=NQ?H!t+cGYZkrx`HJIStr|yibZvJ{}N9C?xs1uG#z@z zoFxLJ=?8=;D~)|oU4eM_CM^pJ>)rIl=gZn`jWMJNo8wQi#DLCn@hnz(PXY*D^qT#@+N>!z#Sqrj3N=*rpaD2!5 z`K6d&lH@etPxre%&wsfUZ1UBL0^>u6%PAacm$QE&-(#vFbs>KD#H<9S_Op}VsY%9a z%cepN4YVACXE2p4F|okv-l17kf6brVN~So*Hc_;S*2NB zvI*-Ws9WWpzEI}{&wG8JM__V6itA>0ElGNWtZT#B&~Bxa-w?j+u^%ns9J0mgPb#62 ziZGBhu|(R$(inK6+1?P&m{^~y*O7kPLGlS0_?6}Onp)fbIq+Rm%iwJ~ zZ4;TKmYl^IG8V$TSCTfkAqFg5^OO51*~s5N{bczn!y%((4$cl<5Pzeop6?X)$;lk= z0u%IV;UBg^REa?9e8prB*pQ{vT+*o9NDh?;hTMrYGm}$+fY4aD%L#@t?IR>yb(CR@ zjHI{t(S1Ha#*&{`$`BIY91ri#TW(+~@9lmaE3*u%lSVd{5{w~BoK`2LR$(n|@@}DN zkxPP2vDTS)t-{1-a(_n*Gg$QEvzc+xl`~5bC~kbM@8aMVm)2o}Tz4$P?su(vlgTTb z$l7`sYS0ke&Rt2_11W=Q$l-k?F4s+s%cx!&KxzOPYkX2;I|V>|&ma}X4CZ{gf&hb> zJeq*Oj)2?~7izO~f&hFkYajJ|I3-Y;_yPU(-7M!M>2M0{nTRT)|C~<7!nd!Kb-fY{ ztc7CJ$RSc$e-vU~;LFKmyisR8)Y|9goYS6-%E$ja{N+SRmEdOpg|iwvg2@uElr;d9 z5P2$-xX(@<+1xS=!ch9qtLk6SJ9V_go2h&JlUS79{M#CBv667mHZ4M;7&kEETWuJ~ z)wbm5c>RSO*=-hgdSe)ClxTX6FL^I=P8XXAAv#9UM2oNAD|VuK!?ujn{SZx9N$Ya4 zw$N0z{^Il*Td%WcZ{;f0a;D6LP@l(ePkl_%XRO98ql0Z!xqQA}gBqB%N}=xjRFBOv zI8>V)?DIx^#W)`H#dMjGbo7+rBPB-@`5=@bR#p~)f!bjFGY$zNFJ}CObQ1}d7t}IK zfe-5Hae+c`_9G6{of!9jpIqm?q0`juY77I;l9Qjnv}5jWKIIcf$PcnFdL|K6VFsVgek0>I}Lf6@O+; zui^6 z^i>O?{)V!YzHfI+vP*^vmNk6*!5N@efsQklkzhHIeq70 z{_MYpLC>>?71i+V$>df!KrbW5Y(M9d`dReGB$xwo8BIr)CmSMVUtE?*Bbm(J08W!d z*}*hd^O1@whOiRnIY2e*^V?I4$7pL#xhUT@f4_^?yGLr07uV^i ze4&{BsVx`Y`n*{97k5XZk835)_{#Ec7Dd!3wwEP0jf$PI8dX&q~eOP8xm zEPKPXUtfHQD=SC76MWLtCFl8m%Np#2qy>4XdX2%jO5)nZhiw+38A@ACj_4_uCx~-t zqyCah4dMpfI8d*`Hw>h$y_#A{4P{I|WafoFeVgmS9}Z_@(7%fNw;W3i6{$wxac1LW zjupldn+BizC;&vA+iO2Thoe2w8?}M}=68FYQLcRh=Nu1}jev*F7f|bF>m>%VC)n&H zQ6jYSt}(xiWXKwFjWlW!gD`U3Dqp`t#)=;aqYtZ@dwu+am)<@8?<@5FT}w_D_Wpeu zj9jrqD>ABKhnG&=tvI#3RdW-9pgKgrKv452MS0RCbyHo%EIIkgep4-K)p}cBX}}aR zJno-GVkRU3#JUxjWsG}?JakI_Pl}K)x0*NNdvNkVg~MIWr97p~nlB1D!ld1~-q+Xc zE=INGa?PPg&^Eu?sa{T98c9wjvG1dW+8p9#owRLH4AnkVovZOTIfw6Bgf{i7jXu-E zcjk<%iSm6ElFnlzb1Ri!6w%eiB@FPs4s!HBUj7cXvYgQGZw+llN;y70{?nW1NC`W0 zdjm;&!1G+iOiyo(Bg*lQh761m*ROHeZmVzW0ZZ|>1ftF2WyhSydBm6b9FI#n zUbo7eclNQpQ#uxr*Vq7k*gjZ0s_83+o(|xz`BiSsoE*cLQDHfIwQ7xxa9!y+TKF<0 z_A`Q(@V$vU9#>2zbSXWZpGODA7`e@aavM`f2&yqAvNgJ=3KsIU4 zm(eR9$-p#lmb)tr^g?DbIag;dLt4s6)S|oMOCI8{8Hs#OAMy^%M{F~VWE~KRH8M1= zBa`p`8oJfeg1@c8c^8yYel~bi=vd}=HF_WsAIQ_uMtZpx?1E{_ih>7G1Gw88Zky%3 zic)MJTp5V-62yUF4w}47VPoXS9v|`|_4vxRf&aWmHBp`aPs9MFDSl47pIwBGC>$b^ zo*f2}w|7-HiZ{z_trdi1i9`Y9jL>cD(Unp3#%_&J-IW&;Axr`j0_XqaZgqP$7ow|?Tqy1a2 z`%hR6i%AA*B6?|OM_rEPd*u_?B>XE5DqB_PnvHj3kOY=8p0xtqWkija4B)N@{lfU)A~{zhqvV<|z= zS8>HJ;`m@5FVs*wQ|RI0`H)gx-il=bCIbMqw78K@f0Kh}zhMUU&Z(%ss{Yz|uqzXm z?j=?OZ zajL5(V)HXTK60c4BBVc&B9i&zi4v&8dIMJJ9U>o2(aWLR#JzAqjrV=!)*2V(!KS@e zTbWBx!{&RK8|vFZHT8F$WN5IP5|*uGS69~sCOTWH0&TN^N{vUy z_8uQuBLOJ0fL~lR)93#pE$$zWe0Q_|V-6Dw*j@O3e&VV;-2~MRdx~b3UVYJ97s{#L z*UHwoHR?RN2VOKja^4VlK9=aGPhfO8%+7i}-hHv~jT}SnyzIAOPX^~#ztxg9Juw!C zCf~D$;n^A}pQ_ALq5&H1M0aMwMuA1Kne7YJ{a?UfrK=f(Ku6->t|Xz8vm&aONWwBO z$rhCp?K)T3l@ob}8WsP9nES;v5}8AYUdSzUiPP~dr($kn#={m_Sr_-6B8Rvq7X+4f znsm6;L;|Au`s@5Q;WkKlF=YId=T@d>#MpS;V99vBNx;Us;WCIR4tar<7#yL%mqdu+ z-zz4z(^4d_QR&%D5U3KQ-F80FMY=&ccbLY;sc>RL9|R+;IbB5gA_bTn0`mVN;Sb-u 
z|DiJZs@e%CYRPR|Z#$rDP%I!QMt`lzJx5i%X9E@-Gr-vZJbe=~nqwa zoli#VbHl>nv~f^xy?Kq8RN2=`gHsU*)-qX>T57DdkZl!f%M%A_k0SD~Gy+Do#(aBM z1&zd4$0HRLw8e3SnxS8dz!F+qEmx{*)=r--%pWg^p9gY3k&)^1JiRfrIHaTsGtjhQ zEB$&&!@gr#KuKLSu?LRJo|kcjLLqPQ?BA0|$ujcuI-D5}Dt)uDfPTX=*(u>z? z!C@gzrT9=6BB}hj1qyh|42xY%!gq4yiC8(Q@YBj=Ut!d`d3uP!O}|Ho3BQ%3Hdmwr z%(`zJ%xFKJ{d}5B%xHe?)fl^!|8XCwUSS;_L5m!DirGus)$>4S!Nv0dyAsRw<=0Ik z5|>viG7t#Fz^ca2E;&6G>Mx1SsHhy&{pPmUK}FZebU+p7;$TZO{Xb1idG#c5HelS= zC;$EA;B%#5`00O!j&MGT7WI9GXJe=G2|47vET39hu>{GWmz>CQCY--gcXBni!OM+G zayzqk*7F{yjGFJD$JEJx4PQhyHx*CsD2aeAlno0|nx+m3UehG6;#D3A?ntQd{9^yY zf4T@Djr8;AKzFi|zp|8Jlr~=OCx@-Bp($)ZF;lR~kaoBVO%1b&>Th+6p@48yGdA~Z zK0k&RH7Ll~KCh4;VrTk(E6J3XG=CJ9X-u%je^Y{gc%(B$s)Lp9X%d1jY!^C+NkTdd zowe+2vp_KR8WOrnLgqay8ZRz%M2LL+ftz zv#1OLbVo29Ti(y@nGQ3m?f?3my*7%i&s}cVA|NNg>Q!+AtGl>Deb)K!479#qxY~w3 zA%flQ7yDI{_0W?6iB<%BYy45R$)i4eBQ)0($|S3U7Y>bs&I(0#&5~HA+{W4qql(<= zGp_{WicS4HTu`QV%N34%eYN;2%nh5qzA1`87^y)rpZen4U^Sf%m$Os}_#zi(Mda{_ z660mnpWW!t*X5T+**oO}TV=zRkRy3(>pfE0$cjmo7`23OWU~+bRcZ9U+UDE62tXJS z*3=44y0S$xsn;H{_bT)m9my^4u8qwSl5kTM_=)7?-}m+BgLoQh3kH?N)4W&l4Wz+8o6@5Fn$m0)~F;^auIANER&D#bxdy=ES zmaGH)5@-T<-LjRi_hM0{{|3F68{Dil;UtRE#Nczuy2I}Ujr^T9H&?D}Pk*U0+nyVr zqF@pstz5qqqjA)FNef>E!4@rtTF9*W(MXNnJ0u^ueE3NZ4r2gYft+yGgo`MP-ZZ;3 z52~nW3+8C`sflP-dW3n={YnF_pPB19M25lZ=Hi%~=DSB5Gx^m1?w zODifb7Iv+c?(bbl8n3D*&J;}*TydC~2#Td>GtkWv273hUKI^cUoo*0n0f-{S7(`(7 zL*VQ#Ks>IOD0Jnc%vg09x<5}{eFReJodtnkzwPMRcoFr}>p3G0u3xqopJOU~lX1WNoW4y!iG2Jx3)6!T>`d(mmox6U%dI1*= z8LT>O77@RAw4F&r;)T~}wR}%fsDBqT|Fej?S}9i2YB_?>oM8go<-PQ;>pio&kcx42 zQcP4-)TCGE2j^a;{WN}a_twrT1$AchYU%G!DqfHz0~~&Fh!K!-4C4-iV6S5uE6;4U ztkR{^g1TsDRhc0ir#l|wo0qF&I&Ur3q;2VWQlfY=Ttq1|aoa;}xoiyefGgW@y_fPr z)Ja49P{D=)YxUO3tp7)sPev*F=@`G*EdHb&b@!eG(p_-3$arI zSY@2qu-d5st!Rep-5{j*`~_mO%DF+>o;FhiYx-5L&KyNZra@tsbU*j^1}Ne6h{ww_8FkcXe1ypzxD2T z?j@cnBk4;bBRale>3C{63kzx$X97dbH8YnBgDDcs0y<}7f8z=zjequ@o&@D5{T%v6 z!>|Jw3+EiTk}i~?xystMDD*2ch{HtEIQX}ZaB{bKS{@RJNWsYFa2d-;^0=c^PZ@DhOHS^>*N2?+fTs+I_^h zuSq9!%bfK-3lr&ZVM8Q?ub}UB*7BU_<0gPXu!8uyw&_*^QO$U~xUq_T2)A+=Cp{`6 zU$t2&`L>~h;m`n&Ier^QbUnhM4!w2zT=E7+o^gbQ2H`OM!K0b2T-Jz9{j;N^{k&wa zDA$*$^hjY+*mmfM2o<8?gSYW=?Ru&Gn%MXL+udBbc1I>h7_^t*TsWG5$fiD8g?2`L zS{V077c#d}$f?7N3`CC@O+UuU7g7*{wg9doheZ+0ZT6A7@Oku}N}GTX*RHu$kWJUQ zoe`C}J}+UU`VISCWW?J}#C84fl+4%`Rmm;Yux$5xuV+{pYoxS6M%SemdMHjEEt;djBKMaPy}1=BI7|f@C(|E?Hqran*Kz764z*$&X5HgGY&q#Wxi+2 z5-zZxzX{-ZO)>BcdlpDW2h27L3bIlzyrU%}$4bO+a&M~f3@*dEYN z>!VR(a5WUw-jd0`f1tK_AkYg(BE#xvFfkibm znm3%uM{^9|tUGx9au1x!?o1cC%}eB(;TEo6?xTIB__(+kNi4U=1aNAF6}75^ET*z_ z*w)jw)I{MQ{=9GX0QTqsK8GxAQE#C5=+@Ky{DI#SZzklb? z{{Uag$@bRyJ2v6**CT8qGZ+?r$wNeeAvp>$q&pB>TImoRq zRVpNg#guE}vIo_TA(Z3x{=k{SAJB-nrO}mWIA^9JW1*=l5y4Pri|#;eGjV>rxBgn@ zBxaC=H!AI0YFg6feexhpZL|8~AfV1h0^pP#p)2mv%%&b&|P#{>>QA#7DEG@bLcP{&0@AT4`47VcY`< zOGoz@tI02FP;;JE%Jh3$jI4q+otyO@v@qiH(bThlNFbcqSN}$0FZU3Sf&%Sjd$v1g zpP=)|KF`~EOi1PQJqCNROR+m7C|qypMA?T)z211m!w~obZHO|15?NP~mVY*CO1?)^S0VeVov8evWd^IE@UC1DVTS7^Ft+DLCyjM{(Y^el7dR zHK_4H^!i}u*NJOR=ecsHs5jn+yIK8PqIS(rJ*m3L?U3!K#*)1pIBd17GtTGT-p8LR zfBbu?OP%0%aKXETB4Ih?RSQCK>6#jP_iz)yn+uMm*F(EM)h2Sv=$}mB+&1EUWYqjX1O>ff+QSoOhTU+h@>GQY9W zc%R3VM0b_2#cPHm@z!>~%EO%s!k5f?_3Ha5q5!D4Nqv9&VA@L4>3^DtZrfCM_5%PrSS? 
z&rV-2q&N}5{SthFArJYbo5VDIBqG8kC}@0m^!G2J`9^8tucYt62I5A!g zox%g;UFu$k`h))9a0>cCsGT7S0x3KmRV+TUaw`3)#lY0!*oJ!oMFo$Q%O2G}KHs8F<~QoS z4P1+`hgY|wKaP?Vft_+-ARz=UNgKqd=XudAQs*VfNVVD|GrUAb@wzmC?V}>Pp#d^* z6QyhNi*9%d6Uu>4)Jc(8wua&ybEBI#QG}aFST}^Rzp*JTWd|mXDY{ssd!NJcDWMmsccnYIy4?^k7^w6qx~k#UgPEFQ?wRP-Z}nAGSdzRC<=o$L}~-d zQ~1LR=m!IKUs3WzMTciiwF=+H{)T#LkSJLHfO&WpBdJ8?L+y$s9WCko4x zLaImbAshYj7oMj4({!%pGlWVCcU#PSqi#lvhiNOGwLsna;{(Ln4kzN^>X{HHk_O2A znI9r$e8GlvDab`CFt`y`IcdadG|I7}%Nuy%DA=~vc1_t@1;9K^G};pDPrDm za;Tpb5DeE3Tg006C)RyRybu;?tn@pH0?3+*xxH(^u9~UMmiJYP0-+${{eYHCBsk18 z@Yms?^?$0OGJQ*G@j=gAPyek`xxE1hhX<1WA#JwmbUe+v-t!UB6do2e_UaD=H0>L& zqh4Up zxdeDw+&~9BVqh!gf0UwOE-8d8r963Mras#xz`@o!Kaj*5K5KT6zxeC@#P#fZ^~6^I zSl+LLFOZx^)ji_=5{Pu6eI}Df_y0dMePvi1Ot3Xj9EuZKN|E3YptwVk;9ev+#hoHW zix!F(cPqi21b27$;w|nJclh$&d%r))uk4eZ-PxU;IdhI_5a*jYPFO6OT+azPDuNj38>9o?O4T!kq^rLIjOQn5Tw1oA?R(z_g4+Lw3 zGaM_@I6g@leo87WDHWYKZ62uR3~r!n6&QLGAS$wUe3^{9Nu@i zmfLVgS1Ee=L-RsJEpB@;5i%bJnyBuD%b0W0CBJ?)Gg9;by(Jn!XEO&4w*rbQ`vrlq z-C2)X2~?Ws-G3OumrzzE>9KWsXMjSH=k*_Y+Ncqw3?+idzEh{s@tVlGzRb{mNvtNO zP@;?|)q_D!^Ef`B%hM76%WD%F<-HRgBAiy21gd~-jvO;f!Pg-HAgkP@nWBke%vek| z4IUMu>=4L_dO|_TwGjfHe{4_Nzg%puGNJ12O8M$0YS&5()~F}--dTq9;NRN6Ss(Ya zXc#djf+6w#@qcj3!+PTra%Pk>8TdJ5xk5+1BqFs7JDOqS#F476 zKS|qTn%i8_KMRLAhUfr_w4kwnBBmj+5V5;0Bb-5$h~j+eZgCbaH<_dQE&TA9ZMT^= zx%_h@X&aDlJP5U%gr5&UOe%CpKA;<(spjv2of0VT`bMMa%6PLQ5cKqlZQALj${x_0#;h z=NP=sTOOg*m12Ul69I0W2Q3#Pt7KC3s z%-Q-}ige6P^u$8=9RFy=A^e`tk;crXuj|@%DGtp799vKTmXst=JQg)hIEVcv z&_N>icb>7~3OA7W;lRLDyJ?UcvmQs}74@U{a*N6?$MVzZeRqp+N4=BgwnnE+p?9d+ z65-P7ktLUZJTA@6kXIjJSF>1Y0bs8S3KLg+;MFw zY!J#%x!IhLcece{(%mpo+3T`5`Jkuz5~T=Y{vV5Yi5E!&q)Zp{L!$c9=20~E>svGF z{@SAJ%~LdVvxDiAkkQfMkiALzkS3M}1;4UVcp2Q)wWh}Pj_|B6F=0JNIZk&hAVaw_ zHr&*uP{eA|e!ph0QIfp+SH+Off z&1|23hpKRp>a!u;riyinv04MWB;r6}=P!3|Y_mm<`}74K3k=Q;(f@}w1QuX0p;3JM zHsmuJFWnTV9txAQ4nqzu23bKRap><|&yRLYvi}4bDr}|SkEvYO|3(4Nwp!3xcPkaM z*73g41+K57>9&*MC_e6tWf0*IeFcAlh~!8+ej~DhW5P9GF+Ug1JOc=IRfutk#?VZ{ zrA7FqO}qH=V$s428uyKyoxf0+44q1tN?0xm!XA0kR9v2&RDbtZZ4E8>{OAzw*nIUZ zGo&Bb_kGOw3f9QWQ?n`Bu>OUZN|8udgQA9(YJ?mpUlhZup8GX1{u}n9pl{Bx@xzHd zO>cY@_VG9&)h<)>;5kSU6pB~E6K`c?ItBi3&s;t|#K|D?op2kRx2#;{Xmm-kP>C-g z3S>Y#IZP?2q~(~l;uL8h$+3)5qk!I^NwP5PkAk{i5w~?0cmv9UL%qqfTyrcDB&EVj zWLiLSz^GGPFa5*$dNna%^2X!k4P%)u31CV~Ou}f&uTjdp#1dl31_~YoN(aTDnW}%< zBIW_Cw}qW%Q}rk|M0bPY4{@sp>3ZXDAid`b@esVWyWo_awixpPk1;JW_isp32z?go zHh8u0&YGdAJTrrxr2Q z5)qmq6sQ@gX`C_hZ;k}FdL{VqEF&5is*Zm3_B2J)8YkOZ* zzqeJ;d-1&S(c;4k1XF+nbWZ*nBI^NjFe5SHE5d9*MUS_E)GzO2(fA`@c$O5TV?%@| z7;p5v0{Vmg78sCP*|b|0-~BuS*$fGD8{?p&A$=>Hay_=LYIaL}ojk3F5+rnqxAXhkxXwL`{B2 z=$}9*vHT4+GCiFfXG)t6H21w15w{F5Bzu6Vy|yKveRf@4)|thxW$N~)!!V8T`B{8%3+92ZRc9U$EFAjnrehFv7C8vdAf#+1u}Op za7`~9-Pvybv;eo`U&NOvJu5C<>CfRBgJT6sY$WDG`vdbRutp{?*1k;vb_@<1`W3fV z7$a0Kw;AI7z4ojw3N!DgQ%P&(?B#x;VHH4rSOD&VsmehfJcWVUOg$(o(kFMoIED@d zW|lMFxs?mRCne2Vbi-_MmCmEu_V>K5T$ZGZFKtI~Snpc3*I zU_CrO*I6$a_HMbJEt}Itt{h|?Z@_=w96Q&*p3@}cq!qMy(j3Q%dOMUzyD0ho=drtI z)K$hkWI9)xB*Fh>prN+Mv3Wp<9C7IQ-(HCq%E?%2#KJB2yKF!h@}U1Q8KN)MKn`@i z-hA?M$@!dQ6s$ZXf=CpR{go8w*p(Bj5pXC7Gmv*+qu=VC44NkTP;sx5*b6=A#p=QA zwXTLzGsqGu(q38SZQem)sR#%Pz`5afX~DC25`>Ont7lx}6ZgZ@&A(wPErlBC;IP{d zSaq1;)|2N$Z46`xFXLUp@zbY=5(u4Jv*4JHFw)P_}-6Fk)E_5%eoZ_g?*h;4jaq;%`fIQ1O zY#lrwtF*q1sxa~ZqID)A2T~`o9!0_@CM|B!7VllkbHm-D>EDtrnLLn0l+P(@YFdw- z#pBOY(q_51|J(4eNccXFINLk|`qx*jyFC2w|Ata^Je&z@NDEI#T{FzxgHl0lxOo@;qnZlszC&I)&2m8wJ&Pc=99DB14jHetPCPEKw%| zRb8V{Uu#91b*PFn5=si@Br*Mr{d0Hmp{8#jXmfrxhDfS17EIBWxJQPA1Kt{Ra7+rs z5628gpxQR_H?WE2KSJ!1O0-u%H;-&w?5=f4BcgG-sD{9Jnc!C7Um9^4M1|h6pm&ys 
zH;awFqq&x`rAR!dX@EYNe_g=8U4w9>q+GSY!I<4YX4$16c?te8Ns7vXbSOfMrMNxw z>+NyFy5B_N9E+mH%-`YJqib$!kPvJul|@Rs?CXP}duxb>(!8k@5ySk-DU*)%D&@gOE1A0{@q@mMyL48;bP(7ApVWhebR;ooYKBYmrbC(^$qi3O32rA%6hj z8^4R+GZM>?V!e{5lbd)Q@5L<{@i?5hxP^-@0cZgNDkyPi1c?tM20|HhkHA_Lm38MY`9@nkz}a4Owz1<9*aEmG~e zik84I>ovJ3O_{t>4I5Q$KtiH}2))-FUCGyFBb8> z$Xxk}=Uh!bvQYK-ug4?j>GlF&T=e_ju$vE5 zS?-jj{_QWDn~w=6du`Vs$IEjoJ-d|`6~pa^o9KQo0LLcIeL5@WqDFWh;1^RzF{?(t zvBJ6@36zN^PI{+Z?6E$G4W5b?p;;@-P%<#VBes^4Q6(ekmVQw{jbo}6tfdlncZCm#^5mA_f?frJ(sAL3+izs~-1j56y_D*&G6vm!rjS7y?-{BEyT{4t+lO zR_-+Cq^ZS>N%Gm6Yf6+A<=P6Y>=yCsmLbPS2XTL3C3u zKQm#-f;e6>?+;kkPjh&EH!$`QC)n+xBFEEHL5|V=zT&eHP=v1NTq&^hJ$rW{$b{{>a30Is22UjHmTO44GNvD~CXZgA3MZF#YSl ztp)J*sGq-gXOHTJ>w%_`sA1-_QlzZYl+^)mYi2?&=C?J-aWaHy(x2@=5tzeJBg~~~ zppqK*$q$3dLxezJzSb6*DWYF3zTW0D4KryGSuN>AL_bqJjWlK4^Q7@n5had#?t$aG zf76=&woO_JXURL8ZSQ5x7n}OeKbKPCEJ!dWSCx%bQ3&M|jv#m&GUO8^)2FU)_|Qaz z+v6ocT;X9_$Mq*JNza<1$-GC^&k{ng@)Ouk;jPcM>O#Kx;E;JLþ(>IIG7b4p~ zBtaMir1yGXZ>jzzUA2f`?4PGnWfL)K*t6B6@fn|^&bTR zfN)$b+VT&a^#q%IpU@WO+2%>X3>t@gj8(=FSBL(0cV3SloKwu)Kmdl^1;bjKlrmog9=AaY=~=;B&8P#Oc?#;#bWd4h@g ztM$T9n_&Ow=UPDz^Uj97SoV7jM8NEDFEnJ;@S6$C2PG_z9GnC^w z4Shv;bD5zF+-`@GQ=)l0$Jl^g;@7?K8q_8_KK*H5(Wco7U1^T7()psh4hba^mln0; zLv3tSAk(&EflP<8?P(tJ)ytus`cLwi#9+mpdIm*(~tWZxSOKmc-Yv08k+C zEfS%nge{HPAXyw8P00|42{cGjV~m8DDY?My?iu%Q?5|HeM`0Nyu4~xkoormpdQ+B7bF|WQ{CTj!Q-MCqi?=OJF=tQr z7stttVf1=G8oPgwnYLDx+gBvg;EU^6+;x#6t{c)XXy$}+pVf9$Ma;2>z9krEf>tf$ zjpS`WH|BD=Gtbs}>;CD1A@*}~?S=Q=6TgQNWMeqI?&G{rb=_4!yJgeZOz*WzMnlEM z4*V~^g)<-iD_N)FDw}iCE`g%?o(G7!uyX7aY~WS7-FPh9*PlE;zxVg}M2_P&V9v*A z8s2sRD}NPM8@jKaYwmu5J{VWelqE)a1+xx@KlAzKJl&rz`cp+-&wcIji$A@Ze*LX} z^QPagZaRJUPN^CGb@gE>RJ@H2IQmnomsfVZ1H&Nf;SCR84yONL({rsq2eWJT{%Plu z6;J0*g_M1mLpBy0n``= zfQVkj-#=Q!vjq=s#Hqy%f`|YbN%!hHDkEWIeOx5aXhLF||*=1^_P4Q%DP4MC@?PZw0>`(RtrdGaZUEG!EZ-|k1XPn z`%&IJnXFppnKk7!^tg&kXf#Ik4&VC7v9>e24yhf%7Ps_>$(s7*Ric&p4SRfuhIfPv zDE&8Z)!vpU?jujK+b&!0#^?k$PW5Gz0M9G_SFF+*gMf%fuxG(Ov2>!af+-;}4lwc& z8!mS%E9l_$r$N3sR@pfU^h5h2*%x$YwCV655ESN?a(4#v+)<^>!Jd2gH(JZr4H8~} zXpV@TquuIsnBEO!_~t#{KW4z6hY*4A+z*}t1bNHJ5dPI*c8NhQ&pc#+_~5S$uhYUY z{Nyv@mMw*qJGKgWeCdV9hyk#q9~&)Hf3=pqwSSeHu+egJO>Fq|q(kY|V=;~rustCx z7?UFKmkdugHmANw&NtW6Bicivdoo7(FoDEPZmQaEF z%+cMP%SHoUL!f%10Cz;(H4+MGrb+&o7gbaIjA|ePt?pzwr|qKN{!G{xEKUu!{WlV= zD#Ct4jvztNP3#2ncXLBK20u#mMO(d#%;W3koO_nDvwM0RK35X`k#?``Qv8mJR>#`G z^=JkmDjC0Jb?&*^Vtm1uMUJYrb#}m&re6gc)V{BEH*=sFnJ2=Dqx;%f_pS_Vi|@eh zo8xP{BK7c&)EkUKxO{b;_Sj}b2X#>WC@isoeZ#H88ei!y+j%~%>hCA>t+(Zx<6>TQ za;5$}!958p!?)f})t+L|I?&Jo8GArI0?k&!u%z;5H#-ZLxvK<7lAgSNZx2ezIhz{< zD8MH`hJPp~AXl6e%g$0x@~a`h>7ud98JV_%q5VqIqle2GCJ*j1fygT`tHQK_lPK^% zR&fy^je7t?crog(7;&%UIikS3)H_V>46F8nMdp$f1Q~oKR4?+Db%M4hh2Ii4Sj3dT zj%*W7{+Mi-#qnuX_&CS#02)8DDes9u*G1D=Yu$Yh)Ugq?DrP;!YPKV@?RHF77_Vzs zGmh*u!?ORhM7wg#Yv5*WVTg-Bw~vSK#I_|+A@1{2t}^SsU&KlAe1~^ek5~PFs~jkF zP-n_Al!04T`9cX$JTCdP98lrrJNjQ2GWrDma4(=}2&zzr1Y6Pp_CoYan5=574NE^p z4l<_~14Th|n;N~q8UW{-4A^XFY&_-O1tF_MAUFjH;yZ)QZkQ9&DmsYrk-oiCQ%;9N zi@^b4t{*(ymA#?Gt?OG2>N5bdiamBEy>x=s@14LwqOVz`Ci|7_dAA)^?d?}U|B4C` zWjhCpLgGkr9XX^}_bWm7rl1uB4~kp7+?aSeYkq}-Lv_UadjlaDoofi?qV;OfrSJ7& zj+r=bvX1*krrK(!YODmq58&ROxZ>$ZLv~81nv6@z<&gi{eO*z-Q#j9VR0HAZ#>+^H zLH&Cc(6mCUR&-RZp}pUxTFU|o#6jLlJ9!xz5P^r)fr`Rc!;XwuR{T@TaW?8-1QKpT zVi(vJx67|po#Y*7wojb2*)@nw zixG{O`F<^+>YUH|JTr15{u@{(Sdp+`C6OF8>f?<=_<2BqaJSp*TgvAYQ+(s`n0#4> zG`iXo$sX4Ug|1Ffd*02-x4%(PKpOmba#%dWufxXzkth>MV3=upgFu2Fso*5Keu1&{ z7%)_l941hZ-66`6BTW<5r=_j=>6L;>7(Jh+*Zt3U2&&FU5;ZiKbdZ#Sxy{xyD4`Qa z#^7bY-$VtA-jvQjbI3EyCFW53Sf7G4uAoJA?mPwfH5EbZC3-F$g*B;{p;eb=$46V) 
zWv9DXs;uQBld%ai{H9e(8Hp0c6EP_vX$(C|e%Fk}g>p+}2)23`2bqM}qkw`#6ci9GK7hHyxZI<1e(pQq`|;uVcdJ%2sk-2#{E z(zC+ezKw(J*C;DXkuEBrq^FyW9T2IzN%|S^T--iU0esU|Hi!A+SFF*w`%3o0x~gU| zeq3K>%DQT_ADKE$a0b+6gC~rtPHcT2uthv6t*l@+ReD(_psHx5_yNK=8WaHhw7E!O z3)fWS%qN@=zPTkv-Eq$GPV(5}$L zXolVBaQ64uek@#psH6#zblexV=jlj>JUhP<)9qMT5^8GkjMt~MUFS(2csjVBR!qB| zyE+^=9oc&D*tv6VUzU(ayQEC>x;;aiBBa#ukMEbjUpdbi{V&@XSQ|jcLyTeD8))a+ zWdGR3%0B&V5<-$cnAHtvnwkWkI_T1tNb+f&&jWkoRZ9t8(MSqQhh!!@{tEe!z_pI8 zGQ02)5vfm|42YIPJb_T8SadM)uztP*Nd%zrD7BV}|5_JgSoVHH|dsaplMP9yyZZGv8ZMH-5WEre#oGy6Y<2&Vj-sZ1G%u5^TQW@2W6L;xu;nQpTg6cNXDk^ z+-p{rOLJpG?xEB0-t#wJlBJ9Z8BvF$Z$4P>77ITJuzwTK)m!w_*dvA(`Pz?{%JP^h|sX}v-_8ckx`{>(GiR|N9cB`dm{ z!$khF+)~q~H?l0MSsqcp;`G)Ik{oY$tQRoQ&{woEL<8%YV|Eo7DC}GP(d?5$x*OMw zXx!7}{yWHSblR5yq$|hiNKd_L)SgTT&AA z0f#*Djos&XTRs&P0m!en*8?7r{Xntg$T}`eq(PKQ0!kuAoE{VsG*Dxpg=y;k^Yio3 z^ATlQYLYku%i5LI?vCX}kt*mf-E2KvqWvf(6Oya^R81~xF)91i?FqdF#%FUF7P;+8 zEtn~`y)*25M_Lo3G)c)|!PP4ozUsEteopnW`LZLsTHLbreaqM7@5?jUlW1CdXnow0 z*H<40d`iX}0k=KUIxN)#Wu zBH8_!JRM-$+zWX+q;q<}F9R+m)As4+Ft5imJIV=Ji&v%lBY>#~I7NfKZkEQo%u7fw z&q!VqRtaqc?}49EGt+yaVFexHp>9^;*o(&!2*t!29_`3(s9!jK62=5eG(xI99K%Ku z2UQZp1SyW)207=-XaeAY2;i3o8E;_#SPi2De6gCIgzlPZ%E4<9{V~$zmW2w(jc#JU zEH|u*O(DCE*nz2qg&)6cwDB=2W|QsYaAb(FM}9CkuH=Gl4(sPfT2o-_oN^8IR_fyN zk!BA|c*HCtucgAbs6Ne~WsR+Jp5-#=-XNCJv$@$pB6#I+Y2L5!+GJ6TJLVWh=#&7- zfL6jRo_20-W?G|_&N`#h`;A?;r{uT1Ijt$YT(Gfd>HhUO{?dEupdLp*vNIvqx; zL*b6f)f%_;q9+;v(FaOaGtE<80$r{`0WvJUm?Xl?gWKoK&woiYgBfEXM&4Mm7 zH&0kOB#JayWDuX2!OTHEP@rr;-(e4mDo2h^XN6i9#GkvBz;I@a6nsQmOh6lc|IZ(fi=sxZ-8j(QaKqFeFgSI^WhI zKR$c%C&ECVzW5 z;2g!oY!7ubTSj?p>?-5p8Mk|L%A%C6hKegoDBO>#27j98yXz9WKV)F(cHgD>nTe~y zypko;=h4EM?6@rS)=Fn73!U@lxl%(wqc%b{LiXq~8yB4}Gq6c=lNcEx6q1#?&`Wxf zHi&yb8=VIE`PTHaxJyW;%3rcDG)tP-C9o(*Lj|~8uAds}(In&FB)S@Db?uvHs*ebO_v^<}bD+xHM8RVhbHsdSW(C;7FFe&5f9FR!`3XV!y zg19yzn#P{p4oLW zY`sF|(8{s)Wuf^#vl9CkJ;x?iQJg!&8J=DpokBm!H(cAI} zbE!GS7=Jah*%(RjLxilQ>)Z|CewyZM zdF($YD?uZvA~^eK8UWd={ERo3VjBguoQvMWTgoq%mS?^^1Qmu+B%qyEVEf2T;g-WG1Um@fbZn6KtZ5DnQn<_c)bmNBdm|<)|VV4&a$KlKJ3qnrHg zCHSPog5GPU7WFIkQQm~9fO|0udCFCxdMAJ}9=i`SrXtS!FKI47{4+3>p8U(=F@>@e z;4{F&E{-4(HGIoPH+G;}u80C}Ao4PK`NDOT%yjXn?67{_fce$=Og4Ylcs5dpZL?dv z*b8m%x$b>X-9=iGMNLSZTgHVaGLdQh`K|q`5Qp@rz10hIzxLZ1lkW(|3H zQ*#=6tPpTZ5O@QmzLaem*ZboGzPob@lMf;EZo{ z&bKQen!fw@eaD>Ow}y}btBiRsPK}a!TM;wE{yHQxo|%jYTFrdI6$)!4h$tjkijh?f zWo)ZDN*)!-sgewTHyvTOFm#1IT>Og=cCZICp6$ z=alUHUDL+T!c3lpjShqZ`xBw{(OLWTyL>_f0F>wlRFcg~9U}d0$`j4Bgk6xnpJ;>I z)Qb7+?lvhl7S#&nald!kO2(|D^k8U!l%f`uhLF%mTD!XiC1HprEi5d2u0AHWh?f}0 zF$!zV-j!$D(1yTF-{=a+sC1qbMytL>0c%$GuT`6x}@y^Ab zo*4L&k0&+@Qn*H9Ysg-{>SNk08(5h7r$NzAz&&pUzA@-`GPeS2P4`FQT8-|mIi5<# znNSeCc6Z$$+*WL}Kvu6;D?U^;XeMIUC?4#iqx@E2N=(096s)E^TV{8%-OG%iLEa>+o0H3rRFROMM4 z93iW~3PDT9;)%bM4vJO4XW4D~LtOISK-)-p+oaIZuh@aM-H={|IXq%9^ZA=#p-_C~ zDWbYtp)|4O!}!VgvYlWA& z7mvZ^-a6`f*w`L(c3dudT*VMK|3Th}acoL){?-AxGsrKUyy0n<9$6Pv0f{jB6Av#! 
zl$`%S=W18MU0%!$4E_xJRk}4pGjqd=NCD#LCEm;<6b%7ysq{3w6uv#r6rEDT_Ij{L zA9zE+^tgmaQ{9s0IG@dtA)^tH06<}TyRA$5JJ=?ADXD$_d@d}k9oQZpPi`zPSkfhAyRTuimd& zU9KjiaroqfS=8Y9wm4P3(dDA;NTaa0zkJOY7hXTSx{y-iF?}5kn*B`0TczePMJP zc?i{d8f?rU6-x9;-sF?^17NBQ4p#0@YwXBESYi^lCn2w{v;}iXIsFJkp%w{*oSjwv z9;Y*CiG~P=SI5)&+IhHdFuxeQA@(cepInX%NRmxy*eS?OdM1?#(IggsX`j%AyT!5e zK3eElrHc}ez7*dTN z0Wrc9iGF^2sJMR$Na6qp7s}07vsQ0sGFoEOC{U(jgq|A4J zahX#UzY=$$;?U5<2a<|0@hi?S=B(W1V4e?9mPncZ%RF7i|Ij7&JB-o(_rfcnFN1` zIwxWr3i8QiOG_mlR%_{cC$g3QrdSV2HIan$yz8$dgZhVwS+;LbhLDfgkT89l=Qn5| zNl@_6hx_6@)_|!E$?NuNUP_?=)|W@IZ9i*JmHqSF*9@mh4&O9pV?%z%WDdppTG-?( zOO3hZF*fjW&Ona%?cLpzY%KS(WWZA8SKR3BJ3l*-3@aCKB;};MtDLK|f2ACyNjdJjYbSc-BJn%> z?113u`+%)ZS;z z1&Ub_wiyDBP&SuFLF0!LCUpL^W{L>YOt6WG)6-nl*QE=~BhQrlkPQJ|+qZjhU22_T z!2Zo$$#*n8Co-3G>609<_V+`v;pU@r>yl=R*CM~_cb7{8+q^%)>|5rM(bmUO%j{bT z^3d|p&L3RMsFNxF1$~iM!F%Bel(?Wb2&9uqTPhfu$JdAZnIUX6WZK?c12?fsNk zWCMaj9H4r0uH5w*o7itp51+!Y9kQ)zR{3?_lJavOA!pcumWQMzK^+sR{7XCJBlWw1 zQU7H6GY5=X36#V#pjY)33CxHWzV(~U)Cp@GHFRx+N37cqs~3JcbK0^bCL@hpdd1IpO34+X1V9~@~~@ZW)1QS`A1pnn9>&LCuV1b`8-3;Q)kcDyT^{t4@aT>=NCI( ze27rWm-#zXGQ-o^Xl$6vfq}+mgqxM6HZ8`2arbQH0m4^;@i?uxT#x6ZDP^`erpmAh zQC&yToed{hM+$n-|9#_^E@-R__$m1KAG+ zinoh$wPYcnAhyRqP(J@VUZhXcief8KYB>y;#2f_+twoPx8xz z?RTnwD}@WD4;Qx&uHx=^7AXAfgB%X)LO=RlI#2C4_iw7v_+rpU$bH#1kw$tO+qrD= zo@}dZ#=g&`-1$=|yyzOTyS~VtdmNh{$6ezNEnl)t=P+ME1(Vxzyw$m?9EvzpS`?2H z>PJ(@@&UgwOc$Z(gNPRbw~X0hD<#e*b$g17cx#(P%uUqne}2xz(p<;*BE{kVuM2%a zm2Ij1Yqn43asq%K@4mIJ)9>pYwjgR$)PA3B;L|^tJ)+q4#IKXtkYmw{wm7@-Pq$g= z^5XAn@7AbNuZM|@Axy-XilcVnK0||B@9-X#DR0L?=CO#JquUxe4=sBg@Ebqc;$ryU zvY)H({`1pl@c`UL^~FXjnVBADu)?%JSF?deWzSVPdA%E_zE#Kn5|?pb*5L3yn5U;7 z&QI3&9~Mvs=MCGhhE6|%k=!xwA%~78OEorP&6n4g2`TNZsYyvh+G1nMu#tm$V@rjg zdiOs5{X~y1lxbiipL43wf3sy|><*3(^LIzjuHs5t*|m+%YI$f;fDd{myKL1|>bX)G zq}+?w8D&xJH&^4TmK+Vi!qd&wvqv$Dy1Mj;9+3VR#D~#WEqPn7B_)QW)+A`dJNIE> zp_rT-cp&%R-x*2!>+N5SkTOStam-uM{~OygSKq{#2Ou^+~4h4p?{-Z#R9y4N8Tpi;eyq-yC_Mdc3%Jt~$Sxv2V+p=R`C!KKS$q zi2paHz%Af6;`P^C-~7bC5G7A4~YPhd)|)RG!G+fTg%(8D#qFJzoq#n z+0fa33&C^vz3lWd#``@;l1ph#1~?+Kx{z*xB>6+Y`SRv&oxgK{x5{%9qZo6-S1k;% zq!zf;2%b)kXTJ2vkdb4t2?DD6S2tI*SG!~1B_lpTpQCk1f0v8$^3XuwE_PPVZVU#AO(6SMBcCu)E(1V z$l=;8V`b?f*R_6Fm;hV`EE@(r8NEmYM<#B#>22&{G7jY%HP?!ML|epgL^*r@-{$Z~ zoOIi>j;s4tzjYQGV&B&Lb;VQ`aE$PTZ%?)`p~SRLJ0!PK6OGWIcX@vkB$yk%rQ<}{ z6|w810KX3bzY+P2$y4#)Oi~sZMqD$^5BTD@e|8;=&hO{mE~Y$PY#K-5CPrNEiEkSI zKH@B>&EXz1Xp>boqnQ8SAq0K{elwa%n0p;nRYox<>KYq%_#GwOf@mM$4W7C&3(PZ& zR+*Z$IxqZHBJ2~NweFp(jG~+k{`b!#mRA$!hzu4ReAY;lH2QE5@qTyWP%E*5(iQz} z8sX!-2LeB2Q%2VAxEeXf*i#VG=X%3EQJQ#v!)N)QXCQvRTa0!+GyskNPQ{TQAHGVr z-a+-fMd14>*G%l)L`o%uuc4KzhrMhd4R-2^zuibKOTKQT3UVYH3$?&`|;nW;5?2zx61X6Q7ejqf12 zgXHA?&j4AAVsd{|hdSWa>fpRXjd|T}jj#XJ|5=_|&!hw4Z8Y-G#o{U*nF)^VvP14T z->x?(O6$JKg=*jK(*9t#o{u0X?-{?1cPKVIc6r#5|cy@V#$u*tAa%P5WCJ}UG z9S;x0e2uV-DU^pf6waHm-q)%dGQI12Sh}kx(!USC3bYzIF>TfTuRr`Nc&%;|tn8)y)vR;7w$Y%nn2r23>aAV5WlMj5|3@=3U0t)Zb>oi4SwGXp zr={>~n4Hr^=iK6cUy1&wgoUM!rl+x-%GIn^U!SuRw{Pywd}9190&vfV>*JEp1xHmj z71j0Qjjryk?yW6N5z3tGtgJRs!KN=h651*-su}+E4-?JcYkdRrr*8fRjMgLx{36@B zX(Cu-Uc-U}ZLduVM0qqTjE6r|>k}j+qhj7jZ{Ojr>1wBfU%dG$PV9JL+m9yO%0ExC zRQ|=Q=Jblg#@H|?sq)Whsqg~={~0+qH+R$V>FJH=BK~Rnm3^D*v+fXc?EJbw48F$h z*+pbU6+))h4p#PI^(CNR@NxGPRon9tp{pmNk3d~s%&lc+MAAFbIj#sB9>Mgav{ik9 z6xq%#T>x*RrMYa5 z{ZEw?$&kFvj>YZ~{CpsjF|q*%zv^a8PLr^|Ac< zx53KHkz8>3+FJO}9V1zT`iq~E0a8wD3MhzZqV2`B{qB0N{KnYquk_{V)&702qOo70 z_xDr3r99%*gr4UoS+6NUEiK)bN+CE^j{5R|_zP%vDoZO#3z`!;MO(D?!T5#R+MAQL ztW=+O+(X>z+a%Z3Qgj+wGn%O+2=*?Uq2r|*2v7wc$$RdZReTJ+tb%>%9-}Rrd;c@l z_1wj!|GRrj$;8Z*$m9xgK6P;v9(DakwlO2wfr?!H043K8=?iNNR^e{GPk5v8Wc#US 
zv$6+=#A?wLuT!xYgAHwcleR7RikKNgU%M(@Q?w+Ek-*$&+rp^L_R$T8BTNzwtdV@_ z^;RL#l+7Sx;3`S&V0~AX_D=Z4ILb^!vlZrx73_1dIUtac#xf{NB|M? zR%?EZjRJT7{vLx1lB%uGPHxuMNm0@$k>w}@Xc~|*`c4_EL?6-FEWCrv0@xdhPS&zp zMZeme&ChQ1IGKko?t9YNI-@_iY99(inRRyy|B4{W9c3*Hd{$Sz6`yBQ*1`iVKggt$ z`oWCe5JZEOPhB4*J1{&Rwh)r3c<-k>7V;Dm6B7;ol$G6|c)tu-fsN%`aE<*Jm94I? z@x?_pvmO^XrkJe^Z~ul`{Ugse{3)95yZws1=zf>#*^x=rXSTclvquM-WAm<~f{AK%k@IXVIZrCbR~vmC0FkBzy=SuUri0VTgg*p z)ATn)EC=*_v&+jhW(DmXA>c@K~rNsUr2P7Vk(9Y8nYAOk}H~4x3wX95FG_% zFc1m_YiMGVN(9Qm@{=;LpV~Xxs7IZB?rCQD&mYG4;m;?>Ej7@_Fx>BZ=hg44Y1rL- z$Sv=TJaf}u92l4zrx=oKQEKx-JsJ@SG%%?Ui9Hv^I zToHkMFt+zmj)kaOq2LhQXbX?^1q^9WX76cIv4*7Vul8nBEHA!uJ4y$Hi0a{EUSv)a zeW+lJYEF|*n<~dGQg_jVn`QruNCO)|i!&|zAki6rRuE(*?%!C(Sw=WS0N@&Gw((g) z_^pGC^8WGFeQ%Gc(EIJmLY>4!qZPKiHq*ZW_L^Vz<|ox;G2&=0ldu3hH_fkhGQaTv za(E4l*Jr(1a7TWPbc!GC9D`E`CPt%Pn{aMJM2NmiMx5R1#Tn}z`c~A{ZOgH(`hHPp z*kkimiJbX=T)lN5-SNLaZl;+yINCUxqkFo$#yFazyK9)vqr0bL!yHUAadgL~P3JH< zHtcuyx!-%g_xJtl{Qo}B*Rvk+EZtT>ONe)G%XrQyIHO7wB5AIX80bJgkMnBWeenf0 zWZewnf9Fobe811t!1p+L>L5Uh&ebIBhD?_|x9$Koo6CGBg7c|&Qphh&aL%=XVbPnK z-_&$xuE*$LDFae!eJM%(4(n+G)Q|PeJ!iS;XFTCJwGKTY_dBbpw|z9>D%A2(V}Nh% zqJFGfl?v1zQ)?V6nXH`M#I4@M;}OY{Z!PS>o%aO34KESgApv1#jk`_n-#di$-%=Be zf9$;9e_NEq^p}-sdLEnN;PRnW8tltJ1Yk0f+OTb8>pem!hHAM-y8pl&KP?UckssGyFV$c+6AARK<(-hK@DyGPH#OtU#IMSxF^FwML{l@;1NR+Y!oMx2?slL!e+5f+gsspZc_dmcL=sEY<`vgG^&t$n*M^#fT*p ze(ShsXo#vP?4{}J%?q@;yBw{2uCKpsIg0i3zCIoq8Hu2rc(Uu3^O zVkQCNA$oG%MRw`FeGT+^HC4mBzr7*qr;9knC${bs-k47>Ts?Qp+fj&FCzCz*uggPT zc?{kAoDk$D{*n8*IBswLAQVzmRh=RO&VOJTmx?ceJI{#>ZBICdEln;>=5Kd#*0app z4AHdLFlP0^g>ow^S5`1E4{|BWcD^|9Viwx|@t?77Z14%fV54VT!6~qe@~dU#+KAi7 zmX-9kSkt$#A1Z86(XN)7KBzpny%*PSCI$1*5ZiNL2@p3Zokud13jlRnJ1H34`VUl^ zqv2Oc@lSKcT9T>6spw7I&sa-J2*+tgNYczScH^Y{$7!|UWCE!!ST}IDHwBitcPkgf<@e<;!=dbR zWm$f421U|^%2h-L(CrCX3cZj*^=P0SCQwDNkz(M~Mt5j`0% z|2ba?1s;7;oSAoGQrNSi>_ut$n&rN-xDCR$!(VN+?2C6k+}V}DjEPf55Z+$gM8#2< zUpf}>J;C7MAXNTS;YCdoO6Ds=@wRHPm)E?AIdEn9q)bFJbM=!>j6%OO!MP-&x3hZD zbITpZ(CRiEF2lm7rql1G+B+E-HsOA5a&;&jQ)b1bUfCqJ?>p@HBS5|KB8WgnXwRko zB>-xfrdRUI&%Gi+!*>N0vkoo|exs;oUuEFr)+&~eU?Jir>|ll5x|UlaeKJ;2!)b&w zhCm1a&i9nB z4&)^W36z#-D4C+L+6Q@wKj*tQ?k=z<4$i-N?pvJfFXq0NX3vNQ8dGU+9D)3V`*-`h z(gWubAt#5QzhTvj`~g3^i%=0YmoD3g!;Yl*C$%JNc^lZ&pJaE55O_CezXo;o*9C*ejIBR3EiLR9B+iFs?DtHn&oiyv~8^DPUBh$?!dw=6Ti_NL1!UOwt*7iVRh|0=lrwMAS+ z(9_g$NK-e>g-eLcQ|%g~r8io3xlNK0=(v%k6Bt#n7i%2UQO!^q!^$6DtiWo*StBcI z`(ePSj`kStuQ|#L9sisaml)+;GPW_hPd}??L93%jD3IY}Ady1Z$s=2!=bY+`SU7QK zAA7`eI%{Ss<_L@WmUt)u%-8DU2g+ zeWGRFD+IB_SHh(5G2>v>>o0}9j4RQl!=LHz4+otx{EAr?6sY6Wgi%w^6($i5s6PfR z$XtN}S>zscsrnPC{^D5{D*n>g;3deye?Mpe_@))FKo(xWdF&AM4kFY}j{J}+1LQFN zR`e94+646LL1Q5fw_U<(zG+bh;$BzWnD9bXx!R?n?v&EcBZQ=|7fNt@BdAHtc8qGn z+*$7vhr>XyvXA+RSPT@EOlEghB!(CO${dT9fY33nvNC&4 z04YG;GCl$?Ey3#ag`rjyPdm6r!sk!l94CD?CN4oQSq*}_mE<7A-^s&q*SBfTl+{hl z9oC*49rIFIXl6jD8m%MDz3s1JrdMtyK9>gro6jGK)?L!|Cm7R`l0uf^2ej$IfLp(9 z`O$K^FM$}hX#iobO`L%3jDgS5@sCOVBNAl!9$M1u=ONd5I5-rNS*DMeqKBW7rlXRh zik=?o%`ziIbC4Bh_6+kexp7d9U!Q`r!2_2ON@CULzpBUZZ8;@o%e#_nb%d1CxfAxN zFX}#AS=m9Tw{K1TzS?HZ87_~ps%=pJ!Z$VJ`4U}kf(B=V2tdU#|9wBi>9mkCx71?P zxfE%>5#sBco4IEF;VS)9!1SmDT;JT#2L zSXg1_t}c7np3L7%45PYMZAL{_mDm$Yk1YB}syu^Wzm6C81!!hig2$4xJ;pE6y6Ubs zP}j{$L+8es4C@-a=K4jbUA|eFk(8EGyh}n*p{A|XO{z`8;|~>Ie5I>-r*Giav)OW? 
zQrQu{?^f0&3DkYX%aHB;$-&k6Cst-;D6%p+1I|R1#PL0e72(0CC;TvEeki|YQ+^AAUdo4A_%smcq zZtmUZl%|gH2C@ZkUX}NE-ezfBD{)m~Se*U4WGG{owcq+clTjeVC(vuJikFQVe z<7Xg^!UOL_Oa=l#nVETKKd@ekRQi3ei_sCJL}}jkNzhal6+#)7AWjpg%7u*Jc55@) zk{ED5nZ*eIpmoI0_4NAq;lo?-6BS>=Ow`e@>RONoy-3t64+6H~@9$J>9@R_RQMl4^ zF9wR1eo#Lr($M4tZ;8HhKT6B+3RtSNrcePxQOzsu+#n`)gMMn*qRkB960QVRASUVT zeX{}yuR6Sr1*$Y1f@&VSMC0{$WFPCHZ5|?c(B%iydv7QF-x0`4U3B*QbbkZ+z7?GQ zeLo=gB*dnw8;ero)dNtGVT?*kotX5JG4H1b51%wx zD8dKmf^HfkJu`tfWjsW}r@LRV7`8Q-p7Tt(99-D?hH7saic1WTRl7*jh!l%5;wj^c zCK`Covr(eAS7?D9p-iye5&to}qr>dHCN~WGlc=z0>S0bW*l6wTog#==SeRELxHA!t zZfP%si*w|9GY=6zW8`Y-A>-Va7Z~VHS#u~O=1~!Ki+TEz>$6mG&KWi-dxylIxTx3Q zojXXli~&d>y%>6KG8`$qLB*mo$}3FPVW5TO>^hbAlMpxFc&DV&Ld2IiE6%tZ1Md8g z6!g2|QYI)R3Xb=3eBOLu{S_)DY|Co7miGm8xz;9Ok75;-5Z6jLea(c}S9hL~PXaV5 z%iuR0`%B%E0qGk)%1dZlFA{(8&4!lX79AMF-D>{as6I?!;1twa{5kf(rq(H~7IYaP zI~mKFjvL|9B&jx2l>M@6R{b=%tF@L{_pPJ+Z?_ZbB0UNyFOVzSlJLs&7%3@0OWkuv zrNZ)E(QZp{PtENsb=jK5o{NcIpi_iE-qwlJIV0#4_$W~nH}IckEbr)l@Tn#>txC)| zYXH=IU0r=$h0^W0BI>(c{pW%kl zi@D5b3S2$~cnn|oF<+{gX^`vIJ0GTs+hJ8K8xPpaUK;N+TeewcIMZMBZCVqBp>yD7 znv;{|iV%+BpODYSb~y?jyqH$U^N(nk=x29CHT2x0*L|bq`^Hu4=fsiXm%kf{xd^nr zQcMD}%Cik7)$7o)R+NeU{-1JuWo#rYyo`-S`Ixq|rqzd@P9Cp_K5w@9Y@Hwn;zO2h z>@TXQx^-kRFMbnUaGG!~d^x4*5@d|B*+zrc<`{{`L4vrxmGt4XR&!FB7qA%qe*Y9) zH7|7(6Y?N|^wo(elpOuS@_tWcY7hI@C<>IEJ+wtotIVO@)~ zY;2R6I(;!M@%Ko-`5F7zv}Q7Rnji@&p?ikAzfkYwl0L8mDHTc?w59a(UEJ{qgK!bF zn)i4@a}Ehd!uO))?#L~q0Tir<9XakEEl6@+&JSNscD?hD6eJ(dHm#`y4XGz5G(U83 z!)UYdz}A<_2-Z2uIDgx>nB@YlO=6n@R#1=IH!R8S37Q$h| zBzX;`UCNwd@7R(~u>J7V(MXF!*(5I0Ia}3_6`$e#Ov^7ck1+z=v{6{wCwbt%RM#M) zGJ5O&XA5U3{M1I}1v6tGT@bh4Kw{uoVUz)w%fQ_v1L!INhs3^a;cfiLg7EvJsmur( zT`Qu}kyG(SWKQPLL0C0yTNS}?oe%3w1MMEF%+I}lD?N((^$fEIKa6X>-(5rs6;{|> z{fhDEfMvAfzGM#Xzfn<+17EnzP&crB$M|=x6O)15?(YOP(|&S z!N}9BcyOi}NcOol?CXhSX7Sqz^nUJYo(60ya$Zy`egzBiz1R+~$hdMwf|wF z|K)$0DFl*|M6CwLl-LZ%gc?@bQ;*1Av?1jj?h)A)A6>YPJ(2F$f7&Wo1%%cll{Hh^ zJU{hur4N|Hna~C*5wR=QCi{bIcpMD6F7tIdX`$i!R2~=byS!fF46+ZM2pSp~F2p?2 za|pw2sX!ULTmM{oz(Ic3B#3qw*pkzkRpNDhzS~Y3#j66>b zFTXy2;$Xvef)l_^N!~CeK;X@qs6dFk4=;eU-eK)lP@>o(zps0ZGVtO`{r=vLLAav+ z6h+E4Nyrlyk>gNt7~g3m3WW*|;@@9VKdgUqJ0KdHB<7BGUi#jFkN${{ zE{o4fVm`_wTzx}!1BYX$Pn!R_f&BkpH$XQC5l>|fDHAc3tAj7^b*7#2i=Zk<1KBaF zPgks0OIbG4P(8Wk+bVopV7s>Tl;nFxV$G$+wh4TmfIWv!d=(K4`dsw=3wRq!)@EXx zJ@`h%Z8_DQXqk^@5u3o$Aq-MK80n`&*WMp zH8lQ-Wd_s-^60$YJ;+%1y5~l)NEY^rQe8*c9!Kes)S+;Qq!K~G_b&nybm-Z5;(i5OrUxKFE&X7|-|#dn%?QVf28@=e3bUpQ$5KbEp`ySR)C%UgwIl=3#Z zU__J4T6_z0yJ03>El&fNMv=awKbt$b#gDvedPPV6NyL~u4Q^FDpuVdVE{V!a*@%@} zt<)$_TRn(lGM8HsUjWj`8KvMZ3dH>G+(s00Iw29o*2P{uDDFCS^b7)7i1dqk?K{Hi z%29xt?N8C22jGk&uka6?HQ1hnxNu?FwJRZwjk=gFaim@bH|I-c6|ymFRl%*r;nePW zBQp@b-JPj_<(&{|!*lwz&wByet2DRQ^~A813Xaren>&VN&Ez z*BFw>HM{xm(e99cu9rOPB0;&$V~f;0?TOJ!{En7V<7QpO3l&s6X=a{C?unaeSZ~^ck*sc({VewrY=cka3l9K}UPw4TX{cg!8@P{s zqikWH!e!S?fDlVS@wwnU6;|0o-cUq6){KI!Nt)_ zjE$|rqddyEkS>>)TTqbKq1vBhRqhg2LR3{*8LXq`zDAocm>-KH$b=X`{ob6q_M>g; zS*&E@MUa<{-FyCFzEy>;IFl0PN6*0QJ5yjV3a)OMn|0)+JA+=3VedouyXQCRi99*8 zn^fVjpJt3DYlr2Tm_mv3)RDJFNU%Aq*-b_%hJ~p z;Uz9@=A2U%mkH-*9Mn#vOkufgYdkWFdH9ZjXJLv_)AZf&P$iIUe;>jKYIb*Va0epo zcTVf6E-0JR^CGDW3l6!IVirA2hZtV_Jg&NS{3gep6=eyB#nghPeS8q`Wu$#ZW8+XG zXd2W$`MedQ)0NnjvH+T4AG>$vW47zl#lV1Sh-kuKF%Jslm;5_4@_P%&T4wm4zln+Y&B3lt>Nj0OpHiMc{pKP6 z(}rpX1iydyGIpp@1ME;|c4=5C=0?CL?q(0py)s1Jbms+ZGcU@;?$Q0#5ehK;2O0^v zJ@d?0A!V05$;~ZkO5{w^j>(eNrn2JG&58<5K?~|w*|terNkD8<6SRq~L~oHY0vE{S z0i%x0vhuZ<=HukYF>{b=x>g=RsrH6LFCUIXn<>ngJ1asQ(9~xY%;#7dad~p~N0iUy zl>Us6;%w};A26}*TACbku`drg#Xs02eoah-_c`z-kd>68Q(#Lc0=m0c`$|q0%=!j! 
zvFPvJlhGpF85@=@@{$DLj%*to$-E94WDJnqwz2f+>9AuPANVFRKx1TOfz8YIQSNvH zmHGAIP#9@$71TQ!B?Iol8^T*U3Qm|x1;zA}_{);7&P6-hjatznkLz|t_m)Ip&gMCwgI zSe&w8@q94wZ{kvL0FI!qPWNYqf=?C-QU0NXvFK)Oqz(w5F(e6Q@8DoJN8|qSk#94_ z-}Gl2Nk1zzD9oLe-M}7GgHTW2h)4pd*2Bd8#b*LYpEGIm61*RC0>z<~G3l&p;`Js> zE^Wn*q)Z`tyOJPuj`Xn^=~j;5q_rFdL*>~qS7O}yj8i=3ZB^tqJatcQc61*bZcMjR zS+)DV@qo$ciyV9YD~tXsPxAk_JON7K1zbt0zG@@s7Bfp~{}5_lU44vfpt{*PZ zV<;<|?h?_eL;3i_i92}Kot5(9SRJo={`U^WoJczKaza1r?$G)Ef-EmAYrD* zBG!m~u*F#sJzme)t3||(zK~2?i*?A&T(E73LoI|;^U`?cR)Lar_$1) z#>PhG;nUQljhVHF@HDMh6e1J=@q`7{=mH_GEmb+$ho7pEA3nO|F47}`ssqM2bSfB- zbc`VHk}kEF!)PBm1A={AE#SVnKV{_%=a1##nUdLz?9u$?m6@c*U0hI0LGP0Eq)|mhCBgcfpXBKM5Lu&tcKOzkgX8GY{3SKpkTMwJcB&Jt-Ge`-9_NC#!zqjs zi$zs_VZW?URz4&^Oq3^&cVxuaNfGm5&$hOxLHneXGOpUmS*PQt!iOXFc3WpYt{cci zUwgfj2G#!h)CKjeP=;C+GU>NfY4@NZ+gD&n5Xq0PknU#`%GH)5J;`H^$sx1b*b z<0bkqh+|}FdGWe9V!NI!QQ_U5(k4@<)bx218`mg<)aV>!=gLstfj75Z#yORW_KJ>W zxG3AIb*JlvVwpP*6t^S3`|O8db{^3?>^vN=AunZW0JKEXWKGvw3&WlHO?5=a<0K93 z*rm|{`D4oE6Ix7Y9+bc$Vz)FcDi*u@@2sQ~Se6pmwv)Fbj~|LT;Ql8dmk-{Q!pWyr zH7M7Sj0MUeNI^Si=#cEQ_NGCSPQT6*H#77%wi35?zP#!~xJ6#~D0ng@ZOD4Cq9j@uorrq6KN-7w8 zaF2e-ODoD;PM~g59$h9iaLLeC(akd49EB{1-agVvUhQ+_8Bjy zT1Hbcb&8UA!h$O$Jv;W))$9}$OifMu{ADcH;qg=ag%TC*{oo28AK^}mA{D+&MnZW# ztbik5@uf0Yb?BY+cB9y z;DUSYa|J?Z%^%?Dfx7R4-gTm=6{##_QwM4$vGUyCFF!7M4i2dSz1fb(4EtsKEGS5S zV4=4YXC$2@POsa$)Kx;&%93mCbv55<@*Hakh_U)p#gj1gTNZW5T{Q$Xifl|VLww6U+UK4CVEj~o(WBr^j{;w@&5i?xKt@N~cK0Bc z7^xdF5s!MO+ZiVmi2BtM3LOzivt`s2Ytw#I*@r0s-5BI%+~L=LMf`sP>2S?Zg>aG2yjG(PV;-G4*!bh9t;C~w6H%6|H9A9ApQGkVM` z2fm06-i6WO+^1svUM8 zL-#>nqMg2Bs8IVp=;kT!TP4w45OZgAXXWTqqLv39Ja`sJs!E=@cz$2(bto>K~^n$a{a&cA`ND`!HBaujX46 zykrDXr zbIm4QxN1-6__2I65xSQk8%wKq*P!{2U+Y&qcpySVY99%~fS6(sz7jNP%opBHZZuvn z2O)%Pd^d$Imkgz4t#=_8b~gY{_GbExTEm>4>7K{u)Yq82Q>z~a{`j^8UdanRIZd5U zmi)S@ju9pJM-&+)`2Er*J=+E>WMpv#?2Wd3Ib8q@;?Z8y#QrMBt>8GJIpZD0SxnDK~{l@>>air|$fAo2?ne@v4ONH5+TFZjPtg*8%q8il2~! z7em3(@%70vWMK_>E!vecG0;8z)~r$H0kHm6B#%mOk;XpDk)2V~m|`u(f}n>=2+1hc^IiS$yv5bI<31qX@Un@? 
zDoje@Wn?gqyKU?;@8W9?+>UF zXOtCiXEz?hPw*f1$swxUZboxIh*`PBr2J}cru zX?fR%0+t9|>Ox-_5wSkrkFo2I8*#q67I+0)uud(vt)WuGUQ>8K35@ksOMS8+Oja%U z&=$_B%;^}eXOZNjmTrYx{?x1yAV=4u)irHpO>aKdCcA5AR3>`K z!`NdVFhoCJUH6d!>alVPuaOC55)EUCLf?s6s}^lju=T`cwAItro)zW5o^sKBPSXvu z`c6nJ`{~&mYtZ@IMkhxuzOvY;($+Vs(9uaQ(tPLc&lAq|HbJrz0(kuExJBMPKdw+b z)wo04VZFGxf%%yALVkDx<8%p#qF3a7BTk7r1K+mlfAw)fe6BnT#+E?6bMLFF71paS z$WH6+ue}q$%#L>BF&V!9*=Fa^OCBK_A>GQTffZ%*MG#1zSakz;CR5pF^Ur(EkhL=T z{W7p;=qLO~0P*=9_P_bRlJ9>AIu@wKv-4G!zcHoEp zfo)dHn$;gT;9Sty2qk}}Q$5S=vpPcEFV)Et;DT+@cIGZdJ&%ZF%&o>v9E7&yhsRO& z%kItIqvOYQ>k9z6SCxHC3pyic63G?SZxNf5UTkLc7jy3Ee zewEVOa>MLX!F-&8o=T&n;emt(+Ch4N^d*OKpy{K!PLhZlJ+Md9+Gg)HQBqftGOLAe zOLR?q|FSs{+=@Gbg@%yCIg=}0ZfW{n|qqjQeA3*nY`r57G*6Jw!D`WQe@k!Zv~|9QO=*a^f> zB}HeT%#GSqc*RsxUOulhl+5AXi%7GPkK)_8nnKQ69yCaVztkh)4aG95o6c+|HdLh; z?r%#E3bN z$uFQb`Qan>wu;Eqz4vvHer(k4cXGrW1o$R}?{k;EPhcJX#eQjnH|OWilUXYG-W!R_ zV%v>xzW6X=e`OP_XlQD>GLh*MUC93M`WqQe7j=Xg4#)mbN2Q_|qgLLtLVDn!3eq~X z&1$Gpg8$hCyqrkw3!U0y0wFHLU#yB=HiJMGl&mnFvsZ{dzL^sASTx<<5`6vm{W-M) zER~^E812A0A`~xJ*}ZB}^`={Y6w~wdnUrxtJ1f`aVmiys->;1Hj2N*~lanvqoY`-y z!wPqhf{8qP@Bz}WRv9JIfj|&=gSRqYt~>Q#=@bbx|5&ZqxQ+W4l+3RVLoztc^5A3k zVTr~lCKS^gXoe|NYiD;R)V2;|XHus*rUh9Ssbrs~m`y|<1At@0L}YxzM5;#?cQSMp zE@qjdG5SL(`qiw9Odz$@{x_I6CKMfpHqSn5z^<}TorFh#u3*n+(r9_WDz~(%9h(rn zEVgh}+)16Zc&{I+(_-)stm^b14XN8INQ=ir8}o5riD9D?k+HMx5;?KscpMbvcL;q4 zlkR4;fwGL%1s4(W{6Zpprz&Va9+AT+pH6~8DAS=Arh4#=e@Lp%TN+S>NY}YY9%e6@ z0?%_|v-2uMNG(}0(TO}iBLe_ehF-^t}v*rYFze zF2yA?eQwVa<$;Qh{CueD`iit;P}%6}{F3}{ab5ZUZ2Mz}Kuj~2Aljk+l5<;egg6GS zck1*u8$Mtng-|I(D}=TS>V<9`w+~kR+A5>YQH(($qqaCC?{1i5ViaoBu~HAUXWEp@ zn-z4e495Fw*C^(uC!}-zZHNcPFz!dDv@LGH{SjMSM(PTAs*HBO;9}k{Nkvk7tNqmr z$R%zvjhyk(!-IMs+^eDxK=py+P`vB`e2^5EUtt4OOZY%HY_Gt6Rh{@8P*wW+3!$3k zmxg#D557Aj=@d4(8wshUrza#*do4qM8C}~AMy&A(%$^+D7xhN3K7-@2m7u%WcpsjY z7(bU44(D>z;BQTO+gLbOKBL-Zay*m!T*%lb0I|d2xzEFjl77g=%8f_$qS0R*yLNaR z8fI|&i56xNEA*HxF-B?>jDe&&U3^*Nhc66;Ox)XTXSzOu2)t;cuYA*m08qAN>q7XF zeOQ9?GZm8UMad#JPux%1>4K{Cfno`B^YaX*wYPH01Fmdt)LNT43Y0n*)U$#Z2NnCA z5ww~uoCz9S;7~dPcnsu|R?9jDC0$nFx=eb&n$N(&$4|cmxVSWv@$k90hpfaY8p1`I z#APcU{WC}osRPG(f3R0X6{3Q-KkWfs_9hPOl4uyf2vugbs6c!*H8pgO<;toCHV@K2 zI1ZC{wj*Z!HvH|8)3%;tJ4Db~bl@x`BiUw+tR*)JLVi#sw?hQ4R>$kAD$l^40Qa-F z2^0UN-wNoZx>UVPSu(*u$-{5Ye&PC}D_)FjD@v(sa)W}eWS!i+>&7;=VwyOskI4>j zqnF29GeZS{<#=DkrnW27Zfhf-|E8e74T0kZbD#Ae`lnG zJfm9d3xdsT{%ru;{sXTdFG@K-J5t>u?X6&0>k|pq|5WvU1+**Xgn{SJ6bF6`D02Tv zl8L1Gfv2^28Mngm&hdIe>9!qm``?bQD`V~ejoIvQW7meDC#^X(_s6Ptj^y7P{C8hP zc`vg|TAd~mq>*aO&5DE)R2a+Qv&TW*%#BCvh2SNq6j>E}}9|D{S{R*hg zlpV~-CDfjq4SV!$w(@R{Pz9Hj9dbRt4DC8dnhlH7*_Hv%GH;@NMQYW7Y^O zxvRhP84yH^K|RyS$j1k1S!xCA_`MX$e){o19hlsGSNo++gt}+*MZ23NG$65I=G||I zH(!7e5&#u+o8MMB%4l-?i}G*XatFCReeL!~KIVQ=_!!ByS4V0ZQh+xaW?BEjbkbN_ zRHuSf%`8OTW(s1Owji4ZTTnKxAhqe^2@I!I{CVQi!~QHa^<$ZVl4AWdtZ?u5g3mLe z%9@%sQsHCJEbTp2E+Zqc66< z;ZE};U+E)ARa-2kqU^MdudyHZzTMD&1&essEPT(SW6KM!k&T&TBDuYD=4&n;Ogr~p z)cqYjLIH0Iy9(!s1?QU=51TgGNB~PnC7hdFO-dMTYQJddnj!7TcRZrfhRxR5g5M8I zq_b>1{bxP67stI`e{TOTVvmrFd*_R}gf~c){2Ho+L|zlB&Y&XqI`0bmJ-GeRo~Yvj z#0KBB`*khcA-?dLB|~;rsAEl}(AR5N9y66lscGl9&iD#H$l(^5NJqIZquK1Sek+{- zmsNAhJ;&mWjESyK0+53e1LM{HK?a#jpL`8w-I%YiT%NA?W z53dxlvAb_KKmRgc{STb~_xhypO+&Cz;vmw&;^=E%fD>{}*z9xb7xga(Y#=Fqvm*aV z1hvI?j!bnn;`v><_#X{JnGC4OESqW2dZCNcVqKvJmiL)$8H3KT`uhBbEKcBEzotpy`tskL8FzGt% z;RDoY9!f%^j67o5{hmOLI8T{gh-fs(DCz!Gdy45J)!Xw49l(Mt4L&)ql-z)3tEbIv+=QGxgf~}|&VX!DO@dSJX zs|?wU+(MjVAJR0r95~24nQ6>K;98j|X6-u+S^|O(htnXr8Dw6uoaJBt(pJ zlk}N$f;+|P+QP9!)S{6LNC#ht>z`98F=lf$rt2+b%<@8k@vZZzZnyJu+D+-?Dm>4c z>u)Ojp=a<)lH(mE?^P+hcqhT+{q5aEWe{&ebsVDVT_(rwr*r;xx~#pu`}@7Uqwf=s 
z%uo4{ZN>7+w%I-Cl44`k4?ZI=Gc)a3wX*L9$>)*$as~lZtYKP&CJhZK_?(iaQ#x#1 z<(_gNAU5?QS9@f^7H>Uul|Eoh-m~?-2Ug;tICI|$~$`adAI$cm`!FghRV7pa5eG_ zP#Y`Msme>s6+dULFZqf7Dm5d^C zuP-VpswRr*sd8NX_H{F{iYr$&p&bqYN7DFNmRN?BsP7Lvis?H80RlA^+AP`lCallY z3=sL!EBGAnctnG0UYbo2rH5&BGid2yQObo?JJx(^Vg_|tKH9pHx~dHmT}FtDq045DYsamThzNIb84_aYMqUr=A=jwqEtq9 z4UFEUfR2WOt$9%sB2}%StF}jHn2Pn^J0zp-Z7eMv5C>p!-!siBS13_j6qF+eX=xqA zkp@~YG3C)3QK^ZZ(q^ zH^jj6BL;7Rr$Ck={Qf>VUbJp++BY3OYus){68( zVM4z-j3uhbziZ=l(sY;1pqBs^Sf(~F^460#DWY6oP^c>hndbC>e zf&-*BO=4`*lB z=V%7|f2E6+!xhHo@jdnxQOHIV7EOw?sc+O4u(m2@T={mEIFTz+NBJP^DCL#gcPsQV z49C+KYNBK$IrQPO0CllF4Us?MXR-OU$GO`6L%6lo4!<_2jqe~CEoq3j!aK;EV8-X` z3uWY$MR@x3MYv*=`v#?;o8LLjqn^^gwMd>;?BGB6R2o@*oSw-5t}}l;QO8f@H@-QP z@9;~6EQ|?))I?>~$yq*p{hj39zkS-<8a|-+qm5671(n@EiHTeus)On9r8T<2%i=7B z9!;hEnZ6=sw6+zaML7u>D)U*qY;QRVnVLc&`+`=Vh&tF$@{3$4p>GzD1Q^s-s0@Dn zBdf6uAR6vGrQ#s^qiwi4I?&lSjNO+I*T#|@&*Bv{AQGZ_ibPlPD-dGr7jzJt-4KUb zP&jURKaa{)!lj5%Wdr()Xv?evs6E#Wk)oxdk4I@I4@R0Y*QmGV;q6Vs|FQIh3RHxuf(QtPdbl}hR0{5?d-W20Vyx_3_@ z5r?8(mf}(e$$!}DTHl3Y!N~=YS~+Sf58IZ|pYiI@uOp)duOruXFCqLJ^_TH{jn&Y{ zS!o?44&2pP-jey6xoRXhm5b6YKH|x+(VjIhIRQk8BLlEG>4kj=FdZpR6?*Q9B$v7D5*?~Or=7RKW)q3 zAaYBryk&3N!+uf|(|mJpn>LX;$c5>oF{#74IotZn3Cg-~&r>x{;I%j}KRA!QdO7~~ zn{v3uirLXM{M&F^%WTIo@|4HcuCH%w1hDhP%kioD`FGC$hM?xg|2qVsGwLN}hyA(y z6NgoklQIV(f!Ctqdi8_l6+G2#nHedXmXDg3O4otLhDaxq_D`8L8xlBh`*giDQvR57 z>2$k@LUbZUER3({HBlqjus52i1E!F-XC~+^s~sIyX)PvxwX);HuK{WbxUgqH_R+$R zboL0WSXp5ZoZ2cDA245WgwVcKc>`nK#_A8`*3A;J5{ zBl5z>^=mil1V@PVTSJn}`%m(a`^Aci%*=5E&DsY{h|Q0WnYX0v-D^`F^?O8ZHUSqQ zFHFtNXgv|G{-D%KsH`kkc8u&7OD(5FjuHMoh2~0Wyy@tjL6+!j!cg0r35YsTOjR3v z_B=UKZiC)`@7t3-_=KFX5JD?uaM<*@UG{kCq(~Y*4pRk352ew+^g{xq$e3&=M~WBn zyTDfPS=9Vilp^Dch7{rfYi9OxjeZ^wdF?acAXq*bY2u1zXq`n>m2PhLvWz=_avgh5 zLgO{8BiqEDLZ8qNH)_fIYZTyC_1)}iXMi&J;OD256|jSJ1!>{yANqhO)(Dt~0#x$O zfvB13P~nbg-%)G^SYZ#w)V@oIB3-$aM@)C4bKR zO$qdECR7>x;QB6%)^8FhnWfil9EM#g+tJXpOe;%w6oL#&6+LC0vExaj^YNXcQDV1{ z8FQK!iT`WRsge0_GQD&PRL>G^+81ZR?9HYsMIG)MKhbg0EiG>iw z?U~?~;qmrrqf3&hgaCOJhGa=PajjmA4iUnzIPG2~uba@ZFZc#~xCbDu;|SopC}4@l z+28shA+D}9s@E64eEnmhXOJ+?8BQeFkA9-L^*tff^6n^;JLzS&XM1qXz7s@EAc^994 zf^-Q=3JCK2{`Yg-&+)uwuef&Y`o?*lAAFk+2Tc?6rgkH<{?SF0Id4Is67EzD);7)s zjXSx9?(pwoy4YOWLH*Po$*9FFIgW%siQh41`q47yQf5f0ZkWHM{>|7WU zg}Qpbx-+qEVk~^}ty?VkQtj_1jY&!qwOzRG);6Kbk8H)UFz#`>340r=PPq#+xwEUO z1yj^+wPTdSpj-KEHlOsjte~6+FqMhC+%#SYoaPT;HIaN45SASRecn%%Z(!U)krMJoZGt{h zHhs2lYKJ~D^Rs#{3=sbwV#Sh?xPBGg^nU!6Ir_L*9DOsxyLpJ}^%2+8l<$Tc29d{b zX8729N>WL)D1*{K`E}ORDm5p5*VNDbvZ8@TGg0Zt@i@h@cM`4lX*q7T=UCKfJG?YV zPVPcWl25bS$e=D zBNY1oTRHS?xSwURz3sQZ4@{XQ;$a+#Q}(F!kDnhkl?jcohVQ*vpa}egQ|S=VBBkQ| zFh9*wD=b*76rZZbNgYy2N!mn|9oHOzkKg31AO7vrzutMsLE$I9LmZZb&rAA8Un2MM6Y7n(FTC;Gw8Z7bKQmHS{V?N(LyKeAAhG znV!(5y=M#64E5;V zv3!8(`dc8HQJvIo+pKCdN}&yVR(;&iK5wLD{i&*IA3E~6SQC1)<-|_OXDc9JLwATr z*DvcN!r9^Uw6&kRi05VesC}aG-_`6s#BhJn$RZ7-*s>0ypbeYb0;eNWKwDJ{*X+b7 zMS545jkI663SJ^Xj25XZl@&?+EnFY2wY+A-0V$2!MD}Alz-yWyLLpHjZuJ4FUL9(Z z#|x3dJ-p21Tjx(*6zP+}DBe!PlyH+e;oI;Gj}zN$5i4h@z(%~XveiUExmQmaQyxp0 zs|Ptt;eGi1IQX*q99?#Ksp#z^6w=b0Flu7F(8FEmR2i~aZNFCyFV*lYER`u>YpF?M^gd5;`n&EPF7}Mfnts2ps|`s{sW`-StXA~(lh^H2HkBXyBSyq@pTTXE zXmjo&45`2oD!oE4fc%{w^v}K_4D>%W^3?Q*9lw3UDY_)FMFqBr>kD-`P5oe=H2T8l zVY&67(JL4t>~**Ng!}bS`Z}+$q=Y$x!bE7XvQ*jX4)C(~sR)j(KX-N;96o*)JypbA zBlKWU#HzhB>ci&ZC~t?HaN-9rj3j3msMYZvBCc9ZV$5i(6PNyN9lHIa7JdvuefJ_y zYAyM!>ZYZ$QByL|7<&ioc(RrTtFi()7{$c+P& zg)0^m+S5Hs&`<2jtE{O|_b;nxY^;C^$Y}etJS06TRx-H0@`>5>_^?gAG3dW{6B>rH zEfEs9u)z|1ewdWIVO>ZlOSZWaY(R_F#a2>+BFw7j#ff;@z3#g>I}?jNEcSOJ2I296 zGQOHuMW__xC~0I)+8LFP=$)Q^GF{Ugq+UA((K3D%v^{&8HY2I9ypox3!KaE3W2zNX 
zS*;isyuFIW){XY@rm#u}m^Ww9NV%u7DK30*oINsd>W=sG|Ej- zIrMMh8kbPhfvaJ@vk-gO_S?zp$mI_-J`!mDH&>kp{46!zTG3kx>4nzo%_9E=y4+d~ zKMkgxZ>;hJHSOSzXBK&uKW`VV#oce(U-6$#K#_}cjgWNO&^k=iN59hg^F}T$hx$;l z&VDN=uyD`OW~s|x=vn5Du4H#+qvyLeq<1`bWdon-1ly3(`276V$Vg+OlH?o$Wrjix zhZrO;Wr2#O=J2P}FIQJr&&vp7^j*UqN=ZtJdHMh`iW=BCIBxso@90gk))W=IF}>K% zcjRKI^r1Frak56hMjmBTVV~px0ah@FX}78@=1*+!kbCrMJ4g=YBCp(E?_VMQz4pR3 zH|!xlNmZ)m51QJYIwkBSFwHPljg6Ad+&|F@xC!{w^r709L9a^N%UGrqHbr2li{lEX*Ab2rSy^*JUHWcT-+DyPQR#b~UJPfNboZaC2%KJf zdf7cLD79)-d5q_T@0P(b<6iuD}+_&6%Om^x$9L zWMuP$8FaCU4KY0X{_>F{rERH&crj|rWcw!;Hkh)@gN@7L&D?yVDwD_QrM+4 z1AV5`rT6FZz)sF$t67Obl$9T8hO#R`Ry#rCpgNj%1`6ht0)t|K*Z)GbH8r{85b;7~ z-R~z&P~`7F3U8t}M!jgIPmeN|qPh?}JhwB~*Y+L2OEO1Q`uSmHc z=xm?I3g=}^UU0Ae$Y&Un`C8vd_}-%KJ<15R@n*PTMVjIiEkBhB-h@6hg>q8*y?xOO zb+yWdthf5gEN@r^?c<6FDrEuUS%BwhpHU1bmTF?6wziQo_wQh0;Ba*^;1Ga0&ugsIX>YvCQO!f3z zO9Zgjp&sowSII>Rm2JGpcP&Pt22$4C*ox$S6cfDG$NR!d-4k`#n-bdj-P z>{skSBRL;LptOh7PiDajz%WlECx7$wOD~)NqjWd7t1b?VRoCVD8#CS5-FsKd)|q~# z-?m>UW*v&HcXxEZ-wb$NCSF}7(B$Q(EI*_ZznLubW$p80?G3p!>ZnfL`0n1*z3#5N z6xZ>VK|$4z_xYWZtE%!m4@b^0BW-_n!&{4v@PVYU-=B?XzK1jGINjaTkXdThM8YOx zC1QqKv;Mj~7YO05P2wl0zIc6RG!r4ZsTTXytd6|3P%3nHGCrncFbClJX5)xBpUT4Q zV=LW{m11sCkf+-&C*E=j+bIDgd^f&_WXc>2iw_fK2A=0VydnQLF)}f1!n$H zOeDv5xj_Wc?%23}<1z+3aUUPTG7v~DD@F1FSz26aXKPW;K%QLoAVN#Og10_muc==J z3b7PS-lPx=>rIHT^Xy!`n9xomW;JJJ{xs~lx#3YYVr}Q_db3%^1M z>IBu$W0m&I`mW0=@6jGIFRq{SjwTTj{q$Zr^rr=@S+W18|ttQS-Qf*{gWsx5@{7xVOGi;OlJ0_k;1!Hx_tq-!&sYip%@(4|K z-49&+9|b%LuyQ-;4oES+eObLVu~IqgJ8}3Rep9;R`pZe>D+aRXaCePN0B)J3E~u%W zu796q|LU{-rX)9-Tfxens*Q6Df!O6zFch9BJN8@=!X;S!$co8yC*6oh^||o8%c%Z= z0C|VOhHnVWC!*a)>y*iOq0M!NXZcyYOzOcq^^>&PVSlE0jnN%H5 zqaZrygJ!AFMdJNnmD$SryLqg3O-K zwR{Q9%>0{JTt9TvoqwiVxYN7Z*YkZ1smBdgF)y@AA^RXks|y{qSWibS{zv$&-xSiXk+{vjsk zmGgD>OV(zVmLF}Y)?Wf9C?^kF?3dj?LAdF|SY1tvo+6_cI(wj$X{`kG6X04~A~2DH!Wwnll5^AsE%3F8_<) z)n?7(T;?utXzE{n+ZSOEh1iJ|VHrP5Ahd?Gc3Ix(Flkc>J1R8y2Os-6`F8#G^fKWKfjb6Jw1i!6{b2z9s|9*GVh)U&X4IOGy=&tAjmj6Quwd1?)LXV z!VY&5HzZ+%4w13BA;Q}u6}q{BwkB((Ub!k>y`ZxS9)->;n=sx%bJep2-1v;Eo@TH6 zn=KXVICe>{;OJ2JClq{M+qiI$$OuP**ZMyfazL{5)_&vi<w?|*D&ins_k~Y5Z=CEVMUUYG|Omj|fxfRYmtg&w#+xS}9W(S=}@h zuWLFRo{Se>*BkNZ7_MRsO6hRRht(QOh3LMLMl4!6XMf*pzi%d|p_*?=)E6}- z;#qFOsN3vUi>qZxnOJ5}*C5N>sDWV2K6^d(`}O?~TpnZ8sLz+a0bqzJ6)>fI8RlIb zCN9GH3JQN)C`b1Os=X>dPwV{eJ0pSleoLefcJwWEW!~K0pi8x5h?7O-o)(mH^!fH< z160;=qpsbsPCQ}q=D}kE)T0@y4Yiqgs7MwLAKQ7fu-f;llJp8rb7XwuoEJbR+(mz_D}G!Xm5M;mBl2pm zA6rsBCa)+EG5W|2x#cxOwQOs2-k5oCMQe6$tMMNlsh=zd?zHHDz{MWQFhGdv^5AUS zK{e^c!I=by?&Y2>+xHG1A79MUukS&!HD9hZi-$}z`~lSyk3rFy-m#p~ZrH)^cRg41 z=*{)yp7xR1t20h*9xZ@r)kAGJ54WdUwa*_qZuRr~)zzM;5rGUn8(0BOqJvpLxY^h) zpa=qv+q|x~Tr{*V3YbZNe*fTapQH+e#AOSyZ~hob%;PL6g^?Rtg=V{p+k;2Hmvl9uyUEInii@7I|r80*%^-+!y8`Bm%=lHd=l;3~(X%mL1bnAGNCtYru`qVP?ylwx5hX+=1>E}-V zZ}N1%u_0x_TkB0T^l;KgPv^GhS9@ngQFkAd_bp%{hi07u1aba183WMsML|#DJR+P5 zZl9!z+Hs~q;_+i(?nV^rI^e;xMMKhN8f5ZM(0gD4+Z*h0x>y(BHZ&o-_p+>@K#;x* zukH@?v7@7(NV0X4NA#p2rR0aYt(T0?Aq*nY^0lkT5kSYTVjk?}@Ttm=+E93o>PDi1$+ zzYlD;HE_<4r-O7n6XXLNnV6!H?#nW3J#T)Q?wMZojkK<4@lc4xC(X#QIZAOB_Hn3E z2hi!OH!LFDHcQ3Ui)oaBYFde&_gqFACIvt)ouwu~eiX=BSDh9PlJUG_|Gj+WLfaa= z7nw>K{#Gerl8P+*Wn^x1Pp#N0Q*jl;Vob3phr)BQy2BnzTTN9emWR#Uy9PzP!=MGH zQ(Vgz?o{$5mK3Ymmq8v9}&*r9HSB)7!c3F${0 z7AEUYv;uYD6cog5PWFH8$IF{R;poIT0Xi7U`K+ZFTS0GNuW5gQhdxa0PHf=RA}Kcu z^uePuKta}H4$YuKr?(HX`(+}gJWorRvJa%~GIfvN1d)&!Jn@IBJ`&f}=WSUR;OiEV zG?ft?pHAnG>OV*B=6vA*Yd(yy68MTFktHT6YG|e{P>w1JJzV;6J@?~ydVb}z`d8$h zysz}&PmgqD!VPibo35F8dkzfx_LXFGLz-GGW1ab7!d=0BLr*+;W{RlCuYXs%lm8}G z8Kuigq49v}8Zh`8=yxmeyA&o`oWdD}0W`78-Pv3**WN5MifUBy)qfTf6tcp0|zJ?1-GK5#0&JzKR2 zB%l7fF%@p#@2aXmPlvUD 
zQv(mj?&@NK@!=xWjc~&_O%qgxQ1SQYy`~+Ro7Vw%QDQraQ~vyBEq8p*US8%#!hCk& z0#z>6ti*_=;xCTdxg#*s2}5u==o2^x_bTys1o5nl4nJo7wH!MwEr=3!by;!>h&Xzb z#ChZCt^Z1?L_gYV@rX>GHgT^h%Y2&_$885)|4_S=^;+NhdwY{3&fm;H1AtB|;li}c znmauHQ$o0$X=>Vje9(%t1x4#C9_oE7DPfT$v)=KE?L6sJIGRpSgIE`pkcpfg(B-A% z0hTN2uL$z&UEFVPn~KL!z?YeCKOnQ-7$k$~0U?TrrM5Gzyg!Qs*`40;<3xlBtfu_^ z2v)}I)(TIo*9IBd&eOvR-fz{8fEMVVPGS@^a1kfQGI+#fl@LdTe?}A{e}?Ox)ehNQ zz2u4Ej6&}0o7`2c<}J?G${j+SZN;(yj6ln0@NdW&_QX(zNnJ-HyMXO=I#QIOAzxWz zX_q3%v3kE#pfRiSm}8Ub0ebUD!6~_xqMQ(a8c*m6yaGlTE%7HI0UvwELM4mN=XpjE z#jtZ-ZBJZenQNB$vkIEa z{wX%hlx{fED7|dYjqdh*^^wTg9pin*#29GU^oM}tj^?KU_C|JCVzhnfNO2tOqu{2! zre(}DX`?{n#_bu}s!M;U_&Jmx&7F^$YEd`QE+fr%j4x_r{@7D|l%@9-zm#~F6;leJ z?qTe(Ugf~6G#gCbo>r$oH?@kAl!@?+SnLRA=4)iulBeiTcL_!P@9h;(t(>$&mShzi z$yKPps#P~x``13JZXLslRkglIS;)O1)kN9y~`4@M{BTXp#Tkqu0BB@vTTMDUqT;O2n zokYdegNAb%=1~e^D~CMxI6?+EK=^S=O&*WxV!5lzLcNStU`Kc(rs(Wh!K4Rh!P_%Z zj~@xCu>OuaSeon}`B3||5QvzJ78;}pqNdaC`jTVy%d)%pSLo?QdWp$^>(+f1mYv5Q z8hjh%?TJGUT9B=^4~(FNWi+Q$x-N9_7YDHEVM|qy_C{!q`eOiPOu>)umv$eBus_pN zfw;D^#)>44E;fAwG^DLY2IX>lrOU$!$Zo3z5(8H&pQhzx0r#r>)Rm#K5h4jrgV~cg zXRE~DkAN=dj0_)>Zy%{t0x37?f2y|ZQ7gg{gDxHV@h%Tl?E@Mqm~87qT*Jk@QZHm* zpUTFQ#FrM^feet4O%wzZnoR^n!#3R$Hi(sl^b?4L&p4YX>DS)9Z;=$eC1nLa+_acL zLReK`$i~=qZcgv4kyf&pq+YR_y)^{&AD8GTU$_}qF~MJ-USKB$X%eXQrTLjdB~c@g zFvV)-O@H*gp#4O80{C~gN0)91M5sVWeTql#JdA@9Mu&0yh>Q?>q=F_-nm!e9QJlM974<41nv7JMvc|Gh=r(&geqZP#)b`YlLPS;fM}`cu(g zE9|@g?Nj=@WBQ-6@&6^|N$fC`v6HSP#7gCSp*w<|1iy#piC}uB_c25=-HpqAX0QGJQo%TBnqdN8*SwWKof|}nNNmf&}vo+c{oRKAM7@ z`Vxb2Ffy~L7CF-miN)T{0Ftg_i~jz1HdZ=6{eNZ=k8u8VOGKyA z7A=Qgo!nmDtl}a|MjCqC-&VoLzu3nSI!EK!s8A`)>4q<<^D0+X6y(B60-U^`J$vR< z(QfOV)qt2oJ?#n3>1glR+}s%P3Hl;6y|2!KRdXBk_TpQMeL;QH&h+7rC34+%BFBb{ zIY5R@+tiScP_p_?lguAxDs3RoM}*xHtm%mfRodrn=~fE7DIJ;|Db_JWAq+Uixlq|K z#aO%6@n*2(d8fF&ktK&7?Dk&IhJ?HKBR1hl%;q|dF`|2sViG)#nBCslL4FtY;}oS~{>_ZtR&N3}6@ z(E40iAOkZVR`)G26526P+(OBqTCZgbc%$D>1fy@9Is^IMVq$b3AZw-X7kG;lBO_c5227V$^VHcPU$mRho?p}8*)b>ipT`1QlIC!>b%UBTqo zOUYMpWr}WZ+iJ<^{fqOJ^A@vt1}kF2)-uSXn&CH68L!}`n5tGg5*At2ET_(fcdsH^ zDYc^wZK(pH@x2@mPvO`_U;(&fP{NzBtwU^%ND@%Wwxw31KF33mxCtMrX={^Olstq# zD>t9_X*{U=*8k+)b7~q(vc_emCk8)AX-uZ1nw=-=Jf2O~-LbeC6tIN{hTDnc{G`a>)UGwn4i_NBW$f+w*saPL-qsyXchYuGL%`A+ z#tyk7Kw)vUsc#APA*psOUa*RXYT2ZNM93QfqDi!mX-1PT_t`y!A7^9M4qZNB%t&Jp zTi-zXc@-gnk++Mc7IZ0MjU(2H$w3a!QYn2nIvB+edM*tGAn~*=j<~ zJG`^-AyRU1_i`C|Th@`CIC?k!N#6+fuhhbr(JFJMhu_?T2|#7V&9T((2}n4|rP7HX zx)Y8hfy_NMPzE|)`tHRD+rri#7DeQv`Ld2}{l7fPlcR6op%^h?f>?>{L+s>u8O=f3 z#C5P&$v1O}r@u0a#peSY^$P2pRct?%i>*JG`Tjc7vGo)+=J0Nwe`#{osYxo|K`m!u zl?KjBO*=`l8b8(hL&Y#HbBv9Hfq%5PSCt?hQnrJ8;^x&mJM`*)$k|MIQTPP0RdcA* z&&7L1EV{QlN5UFGW))mZ0p%wXCD2S@2!ng*6QqK-Qe> z^LM1r`|9GpB;cb>o?(fI-TnJ>S=!aBRQ?3Jv$AItE8>I7x-tGWX)*zm29#0Jx)SS# zU?BmVw9`^X&1kV@&*`40f`A1)6WKh-_TSO{MP}*ao{Rv03OUYjzg73e2xBWGMG~^P zcWo!;4Yh`MD~fYDi2cfIpxlI>>C>M7-M5l{WXKPpG#pK!*GmLJNmIf{9pA9RTC$$n zQP>X+L(H)j0~r-p;u2B&kJe4FBB}4U+?mAH`=>ieLH_u}YdEy_B$}%!uPtd~LLnGm z_Gg2?*+M+?C!vGB2ZP&YB&Mc(EPguwe%BJkv*%`si_{BOi;#prHLv#C7KIJaMCTV-2i<-mG0 zL+*8df3hzJ*s-uI6)*emSO`WbZBNL-tJ#Oxj1SR9a*0svh?r zdmZ^hMBM%?-+wNFp;ak$K%^ZtGm-+qQ}RxGJHtj_sIWT<;0Go9WJj)WMZ6^LIIk;Z=2U zr79Gu-AcP??s&pVe|_0|;F*Hb7-qA);F}ATtlt#Ph<6Y$2WA|Mcr$ErNcHEX1?d&B zGKWB)GE`_6r{WGL!M@_Hco@A+kAgQwX_YCtD(h|EW#9Fjr8nU4s`axO;M9KgC(&oKvE zvy70~pprCaC`c6b_L5pHTB@yYT(B42UhGQ=U>>lUMkBl|dibM}fp27UYgwVK1@x*s zG0fkvKkBHp#Vj*RV`$Z##IM$T+ErZgYZk771yqItM)J+RUe>=(;C2oF^(%=yj7)`f zFTOLxoGK{!3}`abg_YasWg7q`NGdn6)o=%N zSX}>WZ{%Z74+|fU*EeEDynS%T!V-A@9o}ZUYWk z+~m|0ZKa#cft$;5VanD9VI+?e_x)s89BiObm{ejMN_iZxAD0DiWJV#Y92_(}!B7`k zTqbXQg^eePX~pv(n@6G(z?aTn7sbm=n|3nw=ER}qi5rM 
z_GV90tj5&>Q0E7nt|w}_1I2aGOTSwm**dj&g+|g$^7@F05d7Dzb%nQS2ckB#w7i_? zFarjph(t2G`K@Hf8igO=uWO3M6_7}kUO_GE;o7}lf~rV% zfoX7K!W75*%F0r>qM!?l;F>WN45qcjDI#I6Y}W1x8(Ffnk-k*U)~#W0_~A(v<19tR zFBg=g1$}n!6ffI5V>^k8?k@>PL5Hmn*`5x=5^&}8O2>pEWlQRIa1jP3h9J+R6siY^ z=`FapjT3J2BA({Jmh^<<%{=PYx#>$!<2#W(mF2@4-ISk-z9kF4lja+;5t~0qFs;}# zQw96k{2Tf(#3$uSWimz9jNckJe%zm~8oQhky(3iIFhn0qmgU`Sy`e5rZu*m-C1Xua z#yWW`wvWP2hZDX%wEv*bV)-bc>B(|9c^o&6{)_lhOpVv!&V_9c>GI7!&xg`I><%P- zgQ#6^d{igUkjRLqMSze!bquHZ_n_m?q57F z$Uqueu?va=PgET2%RisqZF#B`s%p#@l8pj0$F?@8c5S_eCT*)BZ& zyg;G-+LFKyNHE*Gs$xKkLbiJXyZ_K5JN^NZjHIs+D9d{VZz~_7r$~C^$kqqfdL7XJGGNNov>5G!zHDKO;wGM_ zM2~?xqZ-`0!OZ6NvAZ{Q#H-s|0x^e3Gt=S$vg9r~lad ztJ#5I(CyN%%r;~S=Jo;MNI0F0$snI<95e#@xTpwSwiz`#zwKeBnydm15Qg`!Y^-A{ z%cU5-hf{kd2wPtd3Qf<&O{BRz5Bq!PC7}=DGb$|1HJ&IQhhM@RNNQ(IjpUFb7#WHz zIa#&5zv=ljv$aNXH9#ra+}Ann!H2bV$1jgi8bNPZyhL&IG1@w9wnaH>d4pl%Qv?pm2DxyInIDeq`d3r#=4zIwz2>)b-$6Dn*PBf^e`)U^Xd$4L3VaQ zs00K%Y~Vki%{B}Drs0Y4)u%JZ@z;o))XKy1pp@;MFKSK z%4_e11O-H@n+eldN3<(&_^*gQMpfPduQ5d<7xFj#T_!{e`0W!l;GZwug?ZL2hE z1k#>%o4DajLfxlO+UxbwFYq664f;(ff_91*~RBZ{9;G=~E9710sh4QZW z>@F@sh|b3Pt(qoi57!5L@$6wh`jDfVCmyPy5tpjP?lR2Sf)-!+60I)XhsY*~&KjGf zbw1XxNYgqxZk+C$Uu9C~MgSv$_gB3Xp#j?HLzs@TnSlzaHAmZBfaq5I@5iVC*Z{8k zUU~tvPv0RaAu%B;^ZOhiCwsjt)$L|KPH2r+HX`sBlJ@!UcteC^JAl?c608K)tvx6Fm^ zsa^3Tn)2YBQX04~VR2){UTa{XFiY1AU4>cgN;Zt4v=I^l-LuOe^8M!)tN3J|c?wl3 zbGboW$trA{k>`^PI~(FZpA?TV6sfVSlOeA4(>)()+_}^M$H63@ESO0$(3`%b4-q~q zpI)g`f3`!c=$G{Uyo6BVO9Ch86K%5DMnQOPB{}bvCdvmDkTYszXoxU-inAT|z&KIv zhcmn-v=aWqU8C}Rx|8jLb$+w&c;i*PdQ^$H=sVdqtLUbZQB_wx`2bz&U6+^W zDlszouM3Qplrs@(S$UiSBOJsOA$X0Q+$LK>nlv&2hVJT|8hJwqmAK6r{S(~I&euDz z4Y_j8FZg4=hG|60DkHf)Q^k+#H?ro!@* zmTSv^an)?$&uviRF7GEXWz5Zm`NE-e6mlw2FF)LzX4Af?-RO;W)4qtg1?fS!>DdyO zwwOR^Yf7h4&a@$BS`g~BZ>lX1jH(WnLkF(zKF8z6I=#*4gn3cf?9xD#HTY?m`AIrP z622PQp#lOP*t-e>01_!!sC{27-TDx?L;+vWr(pAmRFsZ@}|42t(K~0aQkB2A@8M2rM)d`4(%1lAG|2( z%`-f7vv%T`M-^-1E9OUvrA?Hf&4^Rc)(;%gq(mCmg5 z2E~#gP(Z?EijFMkz|MHTu|mP(*LuK>X+7rWU-7c6ymD;C$ydvsy^N>fMbibyci2 zEKh)X8xM}ZF6^0?!=X~zt5}SE=0v4PYR$FyS&l{Y$qRaodC`Gco_|KDIao*vTsGp7)lX^8dU3}AYpN_ZY+T}XWEbHi9Md# zct7*@#wT_C6}8Av+XK7z_)>i>SlYASz~g-xzudyo5>>AclV()IZincyf`PABE&C|f zSR-~Iqc?79EG(ePoC5@+h6E58&In6Aw3!cKOSCMr+GlGaSNhCby|Tx4v8Q_%)TR3M z!?lh9$~&2R{>y#{YBIFCD|k9+xMn7k`-7BEA%j37u%&Snfu&a>JQuj(1(AE0GfPzd zv{wpLGa1&B8XS!>Fu-(o8fK5?mh#|0i*a|RokNMo-O22@HX^pm8X2k+5GG&sUkhh4 zMQxswv>3qIsqO9k{bBfOgB}|;=4G$cp!z zTWE0~!tFbZFeusUxWuxjanY-ksVuUIsMeFDq*6F-ymgE#yqzx2-kERfJf?;&Aqq!N zeAEp1k2|lguRE{ZlScSS3_gN$fE2NhDrd`APgTRFht$LuiZdZBt;LlOHIG7u@h4e3 zL-{?>fzZ_)Qp@_g=(3VNmqZ@*o;3WAKjZbi?4z(HdMi86-l{$4WESqFB-4s^@f;}J zKnSL9QaMaWU?4g*o3Y;A$a-w)FYHk;AZs25nf`(B^V#r(6!J5Ffw$8!R-Tzf5}ITifVIlK8i#y7@l z19$U*F|l;AT{d)<>+@)_aBQC1u}vws;pe3~_=&A%mqwX&^ZhE}L>KA`}^09DKgE|qqsCp2of-kD8E(vn^6qpQrB?XF`SbZ7w4)<@!?r=A`KGBe?gfZoi>@C zdE%oMP2JUX_pYTL=MpVPzceigtYB!+b4S0L8Z{$2s?MmLp_BpTNG$|>;x$jR<;s_M zC+yEI3%G0y#ux4WRCL^_Z7LN^>wxaE{R!UV<=xr{?-(UR+Z)+xp4$wkJfR?~q$^w= zuBNNURGkbFi2#HiBIhz5ACLc}0iRn=;Fb?C++hW7z*gpfY1m=HLwFtmNCKafbP1i^ z(+qm+kSrSW=s+wPbbkR~Q*$#tDDfe2Cq9;7WF$fvPG}QugpY^9YfR&ZtdTSnhZ2La zEh1EZWYocSJ8#j?t4>M0hJ^dH5y#+H=^RwtUr<2gZrrxj&XpqO#fC0*Xx#osMm}!; zm-||StFIGOHf@XUg%!;qzZ!gfrez;ScXnSdF}>eycytu|Cc@W5NfUsnniAiO-v*X*kA9so4=e(rkgCg! 
[GIT binary patch literal data omitted: base85-encoded binary file contents, not human-readable]
zRI^P4MYi(Ft0ZsQufIpR<7%_}DeMg`vtn?n*~H03!wUf%u7$K$z!h|%8oO|-%gj`n zxVFaGXSqgx}i+McAK1nm&9> z;@WuLEr#ZEq$E$b`s)v~v@kavJMDBhFq4u+*^jv*fl{k9>Q89xMunu7_+X%@u~|LB z1)u=vnF&03NWe<4MmfN3)b*+5JD@A-Vp`O3z-5)bPd|?sxp}bnB>vU2upEccRVKXk zK3@0FgCC~0O;p*l(=b6ozlBy2pf&H2-FPN78@m8KVl4T@Q0a=&(pU`RZV>~l z-|mFBU_1O8Kl!ymrpFKF;p+}gz|8$}#x#xpMSTrc;sw9lqH1xt<^V4R*-{4bwOW%H zz@lZmoC11`dcfQmj6+RI&(^?18W`+Jr)RjI_PdKEBE2Vla4g?vPR_VKSmNGd-k)~+ z;~B}4}|FT2zM|56--}Zq(XL9`i=HBUlCGKGsjB zqP?jmH}I0w&>!B?%&8!37W`aTzu7`ZzNT^J&fr$>zo0R zFb!YTiBbbwIBxE?3uJu)V{FGqJk-4#*GUnBV}u3dria8(NdxYIBAu8gJx320U4@n1 z)aL!xrpM68(XY40?g%$48vHj)79Q9pY8X>LrO>-IL7Puqre+fqKG!}ryk)GqT(>Nu zVX%HRUR}foNDjLR;NkJHM$NS9gvakng=KHHl5+7IqRYo+@A7(->82M;aS-y-91|Pr zo?D#0llym4fOqBL!nBv`pSmM6r6=RcD$*zI`64P${fw-}a^;+tr{sm_4*2HR-Me`y z5>0c`I;W}0=mcapY^`AzMB1omc-4;F?W8fZBQHI?Tmax-xj0N5Rqe)a{2z)kziU=m z#}nu7`pCNjckAmCqN1$cXrX$aqa?TbCj1Ud z@H)X^pN|-8zR;MSTIIP#7m;U8QkN>7jnyGhk7S*+RF7v4ZM{J9D)^Ii1o&=pdo85@ z)S>X9PAUYsD$47UGdKO7pVbq0?n?dL;s99VYg`?7@A}8#?(vl(HzHj?VQ_og_OukZ z=%%C2J=M6SzGva(K9BI3MV<|t=SLu7@B8D9_TH`44hHjPt{3kjNfh)oa5@fAh>Gjyv-R{_V{Un9V-dwIJ~f>7hr^vM?nw?i33)7U>Zc z1(WPJ?5cuv%7^%o#I2?lM%e<1ii(w*x&z9N z2fd6iMpWrIP}D#ApdymX0SiTB!fIs828)#L&f3rj?%%&h6|KdyJd^gWX!v~O32<91 z`7vAEa|b2<0bLFr8mTY&R4sS&ErlrXlcVot2p5~26WUy}iDXwPO$(me7#V74Ah4^0 z>SL|fD2xk=888t*l71CCw3HVbD4_$L05?tzR)Ekj&=~w2gFRBWhdTnOIey{W`*UuPo_vGqj5Kdp)YD9E|tKOSBFy=Hf=Mol-pw{fuiX$*+l& z02&Yt^#8lFk+8IiWYCA=j+hPYq7dj+ai=LLAZ%Z3==bl-Y#s|DZ05!HpJ+(F^$SU4 za7RCqe-lZii7>Wl$$OO&W!TXmtd^{4x-7>5W(-TL-yDLGr` zq%&uG%K#_=5F&0UeNZp~IncPjneUJ4e>2~oF#OvXHBn}rMj;!##;x^K-aiXis!{4H z8&-#;vE10iIR4B62Xww?FB}$sp_5kKUY#950i*4b64~ktI2@vZ{C@2>2ug18cYnJW zi4H=iwkcv8MQ!ceTamOde`u*5{oV1&;`@5gA%-~VoE7Gj(8{oX+Wkq6%Xe2qlJC;1 zd>dzSk|lhoXbAJ-3VBtAaH;nlb)H)bVrp3zSS$KgLVgff9(1WuK&!>Yt&B}ko5W$+ zvd~B3+Xu9Yw(QO^w|jdT|AIqbFmDpa#*+nQ30|GFSg@j>r6TkGCigGvyUTB!+EhkX zonH?*=bw(YhpWIfvxX3ykteT<6?HON|B{D?b9{Toba1!z|{l zk6>|vq63UP&Byu(u3gTcXaRwSLwlI`T}Qs_r*@aW+y_linLRCBj?H+cTd#CFA~Nsn zJ}q2I3mM4p5z{FiCEEKI6^HxrODi{S^}V0{^&r7c$d12OTbEZ*mL-C!D$9jOPk_gt zE{ICi6DUN08zkV*49}kx8H5*N`d-9WCwuKQaN^?QfrD*eHkiuysTM~IjQN9orWmS% z;K7Q@7+Ymt4s!!zdJEUl$0tQ$9LnN%%uFl{{c)8ZzfWAmkU=#!@u{!%u9@89Me}rv zP2be!*Fiz>+XeN=4V|+wXl&cJD>dy2no!SHs#R4B^tUvV;w*+#L?-?l(Xhd7WR^2__sOz=ka#Jj`0QmIe>TW=&Cj-YC zFOkwpT~o^gpux9x@RACaz(0Cc?S_9G`J5GtQk9@Qqae;a zo>mHb#RuAS;xca=KV^3!e?PxjiC*K~dR8C$gn^l$qq=i~Sendm_e&weyr5$487ZgM zp$BtGxnT`9y0zD&#SPCcyA544SE?@rj{d!YXLyG=R`z2`J#trk@GhvlH2Fj)-`vJ5 zSz=bQiLqwPS9l8x5}SQYu=mzHE~iAC*L=91!pMwBF*DHkJQu3KZTqsVRvLJM-IY`^ zRJACZsg;K9lQ?t`6$^HP9tqBW7gT?)$B;SxM7cl=0>Oe!+cRyy+ujw?j-}^C1F@$!p)y&_T&;ym%RF;;yP{&f5!^|GXnvSy{UJY6h+uF^L9X#59 zZl7^c$j?k1E8vw92Egf@ADhc>W$PsX9EvIuhwZ%$|3WtLMC!Q#z;sgS}4 z8g6o-R%Z=nSw5V0xQ8o~B1Mq1o?xikloCX^g7z*h)bBUm9=f0r!;60?u~{VnhddYE z{&WCY?>byN@V6w1xgMwsnTS^%NUGWLQ>1Ea?gi3_!=hc#t#{I*3sQ=a>WptH@UAuf zzMOY~6%Q+#3lSSMjJWiyB*VcPin{LUGJ#g=hp|drh3xQ(FvXo zLPYQkO#1!Tf`q!cQy!^# zsjbuOphqh4d4m5^ZF!B$I#n*;S3WUMvpVM;TMGpWinG(%EShk9R41fzYPKJJORD8j zMJUS`Asmo_02m-M-WQVuR}>V=B42)?r>~u(M-_gJ_GDWxrXG~HqvGlJG1N&~OBS@ArF3Gadh3~h>31sl7)Ji0Z>-XN@ubV2 z9IG*Q?&y5_fiOkE#XFQ%+S&l?Zi(k}v!_Ao<`UIZ3}3b7M5q{9OyQlxt!&w!faJ9^ zRGr`qi5*U1yNBi&daMp>UbH}`8IOwz;)htm#?t7mZ(i%f@;)WolH59ewQif7V$`ju zXPEQq%5huxi7x1Z7PFZ*{_2Ci& zPb$a=fA-b`3Wy3~Fq=7R>s|(o^wnxwL+3cJt@b(p!WN`9pMfTwxWH+Uh`HD(mUBA@ zYKG`7$;1`M72iR`d$hkcxn02+=lZN5M-7%PKb7@XsZ*3^c|-iYG)YI#sBx7_G}^ac!0>c!L@OiVqFhi>+j!beb1YsVvx(2_aDymhd56*?a*kIM z?Y>e!Gqc+NK>*RG03i6pwHyGMj7eL$1;^-7<_bxFxy+_fPwo@07%tW=uEDx;cD5p+ z!?`qaqV&wmN})Z_RD#iX{dyEs|As-g8qE=1Hg4n?`RpQgq>8<9LsAQZmui^I28v+i 
zd*}*)VDH3-f}o!?h>@K-<&AwDT&Jy2Y)uXojEj*PnTQ^#66tj&MZn0D?pU#*V${3!nOy@Mj5XMTTV*FWe*#r0+r zSc|XMHaTiAPb3%xif+8wGWB3@P*Lc^6vK5bZD zAg;XGOFxC`1k8jlhPniRUT%@+DmWQ3MH`%I3}J-bzg+9gA5$7>KRwLmfOs&z^x~;RBm_Jk|xuDv3?T0Ak=XrhT6BfZzD(s9D^w?;BGBhZ8)NYLX&`>T3kh zd@28E%+N@6K07x{o3HDqg8urM@mOrGd2~PBg)5cn*W-fk$6^#B?X+yQpE3#o#o5dR zgLKM%-pDqXV34UoP#(R>SV&2ehh!d$957&8mvwCyn<^Vp+p|=&y}o6IMLL))5%o0k z11>%inw~4s?Ag^EEt&OrTDokZJh-{Y>0dw1btMr5f> z>Iid@buuI&>$}G^u<3>F8*SWR5&Qm-?_x*DfTEn8kMi`;n}JOUPoE#37bIiwekD{c zL45l;E|I;Ze;Dq*u84>eoaVTLGC?yV-j@6nOQZ|&mANLKrcaH#?|uR`#$>WsvOO-X z)Ov)9#iTHADDs_$9mt1R)F-Z@j1sB(83n@}3reX=wM-XmA81A362X+Ep+<{X^-|c7 zUZp@7Xr}ww>RV2}O5vB(uOs-)po&12N!qmoHF4!ME_k@s0}q6gc9vGig%ikP4NhHu zgf^Yjr@{S$v&)UXoc?$3kVfFVM47==rD44o!7f_aBLJGn8~ib#>n%X*U3(bmw3p>{ zkYuS}5$U#W_VwSndUGW3lZ|OYM*^?C?rEDQ+VH|3o)RE)@!Ri#eMiZF^N%Lna=yFT z?D^ji+j?6o5_Ll_;{%5K`aF~29YpovDBGXx%6EeT4ED?E!mN%To1<-}T79Y)S67@^! zR|=MJM?;=y3Ya_E37v^Mhd5%;rqum4>5RjVk8{@4d?uBY6(e*EVX$+H_HZY|EX@?q z==cqkwnMegzwczj*cwdmq@RAzEBaQs^@RpA#xe;pz65|l!xuAZH6!zCw+&kAypL9` zj*ts>asIez=u?Z~&zU(f?Q`1^!pI2Yz87=m+;yI*g%8If^um5H=$0iXUvFjfKX6@* z{A2goj5)6B-dS7h;Ko71_iJbK$)!a;^QTU+aMbu*_gI}{Cd%C0Lv`g&?96S@PFaFz z%5QVVZj!|)j706D5qsIOOH7M`h7>U}Eser-o%Eq9MQN?a=$?;hyQ6N;*WN1rBBvdw zQY=Wdh8lc*_+w%2cG7WAqLsB!L?2Xu6yk^`#BWCO;aFL)2AhO!%!rv*5OAAC-^5-N zP-+d2z{sHhJRuO9!XCw`E;xos#9e(&DtI5S4rqj}QB*SKrlFW}-`E`lsNV&`%2}pZ z9~uOH*to&!m3!_=rn;@-PnTOWhs1lf>%KB+zEPq-q}7N7Mkd|bTAS8*4k}?}w4ivO z6}0`-S7Bc+I?_Qfndhngj;E)}oJZPde+<%A8Jy5BU?-Swbovk9zM!`nY@c)$OG(QaG-75jp=0GNK@{N7ov8zH2Q85xKg|08ywi8<3BYRqZVN7`2h*Gm$R6G3a>Zr}0!yJT*);OuKOkh9gF-Kx<-Nb-1b>jZiA~Jwms37mPR}F2<~-~Y z-3u%q%WhA5IIP@;KAVaiKCnkN)o7c2k%Yzn50d2ZD#s(0YM-pHHI< zAedN=Fo3x_$RO`rYW*`#F;`b;LGz3M15zcZ|Ojj_i& z{XtdqrVQb7s(66mY8OPrMmH{LW&*E-1l9;UF8}4lIM{|xP1Jh1fq61VqVqq5g{?rJ zB9mYY0G;0>95(<^NP9O$iv~Xe4i0Ty15fX>3cq`2ri3?;y1=$ z|17<{g}>Id|9~kxZYVWH!StDVMepeS8n60yGm$L^E#%ct19rcbh41z*1>>Bd8{u** zzdkO~jBm$8(kNdt|K8gAn8mC5+|K77vqcpzH#mksEt7x=ZnrhrPoAqIg&gmPhlZ5Z zHt(+LiZ(vE+1SiEw^4QjZ}X_~u%XqhXBj>p_Bjc(qRUHIZz*4ze%Pl?gZ+N4H|K2Z zB^^Z-!ahgOn+59+JJXN3fBgJ8KjiT~L?IUvAy2VNMtj|YJGoGrhGC9=J=eTFaV~La zIN9}7#os;H_LM9z^Oo|1sIuohb9E@_+SsWb?^tGcHVd^UapzjIHAsNJ$vqx*a3EaG zW{`WiadeZ``?#p-LE?C}frDS=sd-!Ygb z-}ltzcuTlZ>VM{N&0rFekUbg)d|N@Kiif)F#AaUa)CRv_xg*@kgYua+;2}G2 zJiirwJgBeo6b-@hf}2acFa3l#boUgubGW=rS-+W4d#cCaX!2;jJXEdLu$z_=q3IB0 zuuka$1f&>>bPg*;rw7$qK{w@E<$^^21j2cQ)1>t|C)0Cam zch~#hu7JA==v6tw)o6v^5Cgd<%c+D)6Ut=Rj2G&zMmVH!!1j02hPMa&KOcp#05fP<_Hmi#P8iLS~{1^AHOSqdzivNr_lWjow)~h zN>i;B%Ui!hLBy=h&o{*F&>bH;U-aY`fwwc(Y>!T<8ZY@AFj&a1r`xCYuzBc_eDr&dUWH!^Z<$Iea&nrhb7yh&w0T+~gM0vXh-M#7E zxQ>K^mW>~mzj<(Uf1u=Ea)QQ?03e?ESzs)*7u7+0bVznm;nBN8+gG0P&e^AeKgch; zFd>3oY5=ziYipf#GK1>KY6$aeLvB}bvw8%B*hR*qZ8LTF*SrDB!RCaT66ts+zJ%L2|9gV4E zMm*Q?R7;J1b&sG2G6=FR#6$W{9p{U*3ltk!Tfw&;#VzuIM<+#qeK7SA|9{ybo!Upf z^#8azBh<^`zwI}jSlaVV>>t>E|$`b5$b*B)&^bGQN zTzVF7vRxaoy7IJR_wHj8!Y?HT>V4Y0XTLxL+%x-DpO~Kav=k2F!-^i<%M#HW_H6#- zx;%j_T|VCS1T`Qt4jmjgo+>}Sr&K8f;-@x#*I4<^8pb!-Q2>r^=^@cfQn|^2k!ou~ z#KS+RVU%G^0=gQRoxJTLmyJkLHZzO{McTJ37H^m->~62mI#8KnAdtRlaxT=mX^oWkQ@r9HnY zso1FZOB0N%XSv?p)lrtsYY9Jd`wWsxI89?@=p01x;jjDk2~)_edw)Vz>GkpOR^cg^ zs`Nt{|6Igj&NE#+d%&`(y1$CvbThfNawpj_cd77Q8zKk@*M|Ve%OeQ3FwW(f%Hp%V zBk2|yQQfj^r6Ri zBj>3;kB1FnJ4=5ZIJG*^mufiriMZP?xR;O9vI~MhjpzYTHT!z$=g$5=UTQYdMumXJ zeeiM#p_b|8v6906;s#iPCY-Hg6L^oX7Nz5E17FK|)>+SG8Q^0gf==$)0V=@z4wCI! 
z{-wCTUvrEu`+9-mG-LEOi%Va{H>;$7|Jz)ELhSN!ffOxqO6ltVIlwYtl@V)xS@=A| z@2;e4>L(oY|8;^v21``q*!SB1oq~5@nZWw${QJZ5>&NCf(u(x+mUBMa=xyR(x8omA z^A*sG)Rn0ZcZPKRAKnT0_cs_y8H$+Aa5FtH2Mj$FE)(YtMwI?>1piI~h*3abd4%z} zbPRR`-B;3ySN`Y6@zg+mG5U4Kv`MPbtM!QwT`FCQUlacwA#k*OkH1M>^4Ksmt2{j5 zv@7`r=I(e_j?IHGMNCYXtK=2e|2`WSIwkb+r~miz<|lYnS1-T)`{Lu^%ccr>Y)vff)_uWLR!xX2 zJ^lB?S3mvr8La^4v}SvQ^!eYX2026>2O*y@M_Xl@U#$ZYmdo@c`|0ie=dGN_>jQuc zoR+*hjQ^cPq(KCAAvF=o#=>=_ih<7bzZ1YLc6{Jl z6Tpg!m?kdY`S)ociW{ev>0|^-BZI%RB3=>;SGR5Y?*8{Hf^0R9&DQkD2Uq~XRL9B(u7|+7SH_evf!Q^PwWkmn6Sos?L`@!Yf$IJb~T_+Exs{1_jvsKvtJhz82=p@z_91sKbUKxUe zPt7v#-_7~=(CQw$VNg{`|2Ix{Q{hL1nyr=6@P6^!h zZ-5ECAb$Q2fcsZGq2PQIAO2k7n4 zD2;~x&zB>DjGSv_RNPuz?l8%|e_wOL^yS0J`iEcn-7*#wT?haD=6}cZ&xu-L3HrlB X7Nga}msNC+4E)nZ>uVIFY=ZwEKa7xu literal 0 HcmV?d00001 diff --git a/vbench/third_party/RAFT/README.md b/vbench/third_party/RAFT/README.md new file mode 100644 index 00000000..650275ed --- /dev/null +++ b/vbench/third_party/RAFT/README.md @@ -0,0 +1,80 @@ +# RAFT +This repository contains the source code for our paper: + +[RAFT: Recurrent All Pairs Field Transforms for Optical Flow](https://arxiv.org/pdf/2003.12039.pdf)
+ECCV 2020<br/>
+Zachary Teed and Jia Deng<br/>
+
+
+
+## Requirements
+The code has been tested with PyTorch 1.6 and CUDA 10.1.
+```Shell
+conda create --name raft
+conda activate raft
+conda install pytorch=1.6.0 torchvision=0.7.0 cudatoolkit=10.1 matplotlib tensorboard scipy opencv -c pytorch
+```
+
+## Demos
+Pretrained models can be downloaded by running
+```Shell
+./download_models.sh
+```
+or downloaded from [google drive](https://drive.google.com/drive/folders/1sWDsfuZ3Up38EUQt7-JDTT1HcGHuJgvT?usp=sharing).
+
+You can demo a trained model on a sequence of frames
+```Shell
+python demo.py --model=models/raft-things.pth --path=demo-frames
+```
+
+## Required Data
+To evaluate/train RAFT, you will need to download the required datasets.
+* [FlyingChairs](https://lmb.informatik.uni-freiburg.de/resources/datasets/FlyingChairs.en.html#flyingchairs)
+* [FlyingThings3D](https://lmb.informatik.uni-freiburg.de/resources/datasets/SceneFlowDatasets.en.html)
+* [Sintel](http://sintel.is.tue.mpg.de/)
+* [KITTI](http://www.cvlibs.net/datasets/kitti/eval_scene_flow.php?benchmark=flow)
+* [HD1K](http://hci-benchmark.iwr.uni-heidelberg.de/) (optional)
+
+
+By default `datasets.py` will search for the datasets in these locations. You can create symbolic links to wherever the datasets were downloaded in the `datasets` folder.
+
+```Shell
+├── datasets
+    ├── Sintel
+        ├── test
+        ├── training
+    ├── KITTI
+        ├── testing
+        ├── training
+        ├── devkit
+    ├── FlyingChairs_release
+        ├── data
+    ├── FlyingThings3D
+        ├── frames_cleanpass
+        ├── frames_finalpass
+        ├── optical_flow
+```
+
+## Evaluation
+You can evaluate a trained model using `evaluate.py`
+```Shell
+python evaluate.py --model=models/raft-things.pth --dataset=sintel --mixed_precision
+```
+
+## Training
+We used the following training schedule in our paper (2 GPUs). Training logs will be written to the `runs` directory, which can be visualized using tensorboard.
+```Shell
+./train_standard.sh
+```
+
+If you have an RTX GPU, training can be accelerated using mixed precision. You can expect similar results in this setting (1 GPU).
+```Shell
+./train_mixed.sh
+```
+
+## (Optional) Efficient Implementation
+You can optionally use our alternate (efficient) implementation by compiling the provided cuda extension
+```Shell
+cd alt_cuda_corr && python setup.py install && cd ..
+```
+and running `demo.py` and `evaluate.py` with the `--alternate_corr` flag. Note, this implementation is somewhat slower than all-pairs, but uses significantly less GPU memory during the forward pass.
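The optional `alt_cuda_corr` extension referenced above is built from the sources in the next three files. As a rough orientation, here is a minimal, hypothetical sketch of calling the compiled module directly; the argument order follows the `forward` binding in `correlation.cpp` below, and the tensor layouts ((B, H, W, C) channels-last feature maps, (B, N, H, W, 2) sampling coordinates, float32, contiguous, on the GPU) are read off the kernel's accessors and CHECK_INPUT macros. All sizes, the identity coordinate grid, and the (x, y) channel order are illustrative assumptions, not part of the patch.

```python
# Hypothetical usage sketch for the compiled alt_cuda_corr module
# (assumes `cd alt_cuda_corr && python setup.py install` has been run).
import torch
import alt_cuda_corr  # module name from setup.py / PYBIND11_MODULE

B, H, W, C, N, radius = 1, 46, 62, 256, 1, 4      # illustrative sizes only
fmap1 = torch.randn(B, H, W, C, device="cuda")    # channels-last, float32
fmap2 = torch.randn(B, H, W, C, device="cuda")

# One sampling location per pixel: the identity grid, so each pixel is
# correlated with a (2*radius+1)^2 window around itself in fmap2.
# (x, y) order in the last dimension is assumed here.
ys = torch.arange(H, dtype=torch.float32, device="cuda").view(H, 1).expand(H, W)
xs = torch.arange(W, dtype=torch.float32, device="cuda").view(1, W).expand(H, W)
coords = torch.stack([xs, ys], dim=-1).view(1, 1, H, W, 2)
coords = coords.expand(B, N, H, W, 2).contiguous()

corr, = alt_cuda_corr.forward(fmap1, fmap2, coords, radius)
print(corr.shape)  # expected: (B, N, (2*radius+1)**2, H, W)
```

This is the call pattern that RAFT's `--alternate_corr` path relies on; the memory saving comes from never materializing the full all-pairs correlation volume.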
diff --git a/vbench/third_party/RAFT/alt_cuda_corr/correlation.cpp b/vbench/third_party/RAFT/alt_cuda_corr/correlation.cpp
new file mode 100644
index 00000000..b01584d1
--- /dev/null
+++ b/vbench/third_party/RAFT/alt_cuda_corr/correlation.cpp
@@ -0,0 +1,54 @@
+#include <torch/extension.h>
+#include <vector>
+
+// CUDA forward declarations
+std::vector<torch::Tensor> corr_cuda_forward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  int radius);
+
+std::vector<torch::Tensor> corr_cuda_backward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  torch::Tensor corr_grad,
+  int radius);
+
+// C++ interface
+#define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
+#define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be contiguous")
+#define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
+
+std::vector<torch::Tensor> corr_forward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  int radius) {
+  CHECK_INPUT(fmap1);
+  CHECK_INPUT(fmap2);
+  CHECK_INPUT(coords);
+
+  return corr_cuda_forward(fmap1, fmap2, coords, radius);
+}
+
+
+std::vector<torch::Tensor> corr_backward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  torch::Tensor corr_grad,
+  int radius) {
+  CHECK_INPUT(fmap1);
+  CHECK_INPUT(fmap2);
+  CHECK_INPUT(coords);
+  CHECK_INPUT(corr_grad);
+
+  return corr_cuda_backward(fmap1, fmap2, coords, corr_grad, radius);
+}
+
+
+PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
+  m.def("forward", &corr_forward, "CORR forward");
+  m.def("backward", &corr_backward, "CORR backward");
+}
\ No newline at end of file
diff --git a/vbench/third_party/RAFT/alt_cuda_corr/correlation_kernel.cu b/vbench/third_party/RAFT/alt_cuda_corr/correlation_kernel.cu
new file mode 100644
index 00000000..145e5804
--- /dev/null
+++ b/vbench/third_party/RAFT/alt_cuda_corr/correlation_kernel.cu
@@ -0,0 +1,324 @@
+#include <torch/extension.h>
+#include <cuda.h>
+#include <cuda_runtime.h>
+#include <vector>
+
+
+#define BLOCK_H 4
+#define BLOCK_W 8
+#define BLOCK_HW BLOCK_H * BLOCK_W
+#define CHANNEL_STRIDE 32
+
+
+__forceinline__ __device__
+bool within_bounds(int h, int w, int H, int W) {
+  return h >= 0 && h < H && w >= 0 && w < W;
+}
+
+template <typename scalar_t>
+__global__ void corr_forward_kernel(
+  const torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap1,
+  const torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap2,
+  const torch::PackedTensorAccessor32<scalar_t,5,torch::RestrictPtrTraits> coords,
+  torch::PackedTensorAccessor32<scalar_t,5,torch::RestrictPtrTraits> corr,
+  int r)
+{
+  const int b = blockIdx.x;
+  const int h0 = blockIdx.y * blockDim.x;
+  const int w0 = blockIdx.z * blockDim.y;
+  const int tid = threadIdx.x * blockDim.y + threadIdx.y;
+
+  const int H1 = fmap1.size(1);
+  const int W1 = fmap1.size(2);
+  const int H2 = fmap2.size(1);
+  const int W2 = fmap2.size(2);
+  const int N = coords.size(1);
+  const int C = fmap1.size(3);
+
+  __shared__ scalar_t f1[CHANNEL_STRIDE][BLOCK_HW+1];
+  __shared__ scalar_t f2[CHANNEL_STRIDE][BLOCK_HW+1];
+  __shared__ scalar_t x2s[BLOCK_HW];
+  __shared__ scalar_t y2s[BLOCK_HW];
+
+  for (int c=0; c<C; c+=CHANNEL_STRIDE) {
+    /* [lost in extraction: staging of the fmap1 tile and the sampling
+       coordinates x2s/y2s in shared memory, plus the enclosing loops over
+       n, iy, ix and k1 that define h1, w1, rd, dx and dy used below] */
+      int h2 = static_cast<int>(floor(y2s[k1]))-r+iy;
+      int w2 = static_cast<int>(floor(x2s[k1]))-r+ix;
+      int c2 = tid % CHANNEL_STRIDE;
+
+      auto fptr = fmap2[b][h2][w2];
+      if (within_bounds(h2, w2, H2, W2))
+        f2[c2][k1] = fptr[c+c2];
+      else
+        f2[c2][k1] = 0.0;
+    }
+
+    __syncthreads();
+
+    scalar_t s = 0.0;
+    for (int k=0; k<CHANNEL_STRIDE; k++)
+    /* [lost in extraction: accumulation of the dot product s and the
+       bilinear weights nw/ne/sw/se plus the corr_ptr/ix_* output offsets] */
+    if (iy > 0 && ix > 0 && within_bounds(h1, w1, H1, W1))
+      *(corr_ptr + ix_nw) += nw;
+
+    if (iy > 0 && ix < rd && within_bounds(h1, w1, H1, W1))
+      *(corr_ptr + ix_ne) += ne;
+
+    if (iy < rd && ix > 0 && within_bounds(h1, w1, H1, W1))
+      *(corr_ptr + ix_sw) += sw;
+
+    if (iy < rd && ix < rd && within_bounds(h1, w1, H1, W1))
+      *(corr_ptr + ix_se) += se;
+        }
+      }
+    }
+  }
+}
+
+
+template <typename scalar_t>
+__global__ void corr_backward_kernel(
+  const torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap1,
+  const torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap2,
+  const torch::PackedTensorAccessor32<scalar_t,5,torch::RestrictPtrTraits> coords,
+  const torch::PackedTensorAccessor32<scalar_t,5,torch::RestrictPtrTraits> corr_grad,
+  torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap1_grad,
+  torch::PackedTensorAccessor32<scalar_t,4,torch::RestrictPtrTraits> fmap2_grad,
+  torch::PackedTensorAccessor32<scalar_t,5,torch::RestrictPtrTraits> coords_grad,
+  int r)
+{
+
+  const int b = blockIdx.x;
+  const int h0 = blockIdx.y * blockDim.x;
+  const int w0 = blockIdx.z * blockDim.y;
+  const int tid = threadIdx.x * blockDim.y + threadIdx.y;
+
+  const int H1 = fmap1.size(1);
+  const int W1 = fmap1.size(2);
+  const int H2 = fmap2.size(1);
+  const int W2 = fmap2.size(2);
+  const int N = coords.size(1);
+  const int C = fmap1.size(3);
+
+  __shared__ scalar_t f1[CHANNEL_STRIDE][BLOCK_HW+1];
+  __shared__ scalar_t f2[CHANNEL_STRIDE][BLOCK_HW+1];
+
+  __shared__ scalar_t f1_grad[CHANNEL_STRIDE][BLOCK_HW+1];
+  __shared__ scalar_t f2_grad[CHANNEL_STRIDE][BLOCK_HW+1];
+
+  __shared__ scalar_t x2s[BLOCK_HW];
+  __shared__ scalar_t y2s[BLOCK_HW];
+
+  for (int c=0; c<C; c+=CHANNEL_STRIDE) {
+    /* [lost in extraction: staging of the fmap1 tile, its gradient
+       accumulator and the sampling coordinates x2s/y2s in shared memory,
+       plus the enclosing loops over n, iy, ix and k1] */
+      int h2 = static_cast<int>(floor(y2s[k1]))-r+iy;
+      int w2 = static_cast<int>(floor(x2s[k1]))-r+ix;
+      int c2 = tid % CHANNEL_STRIDE;
+
+      auto fptr = fmap2[b][h2][w2];
+      if (within_bounds(h2, w2, H2, W2))
+        f2[c2][k1] = fptr[c+c2];
+      else
+        f2[c2][k1] = 0.0;
+
+      f2_grad[c2][k1] = 0.0;
+    }
+
+    __syncthreads();
+
+    const scalar_t* grad_ptr = &corr_grad[b][n][0][h1][w1];
+    scalar_t g = 0.0;
+
+    int ix_nw = H1*W1*((iy-1) + rd*(ix-1));
+    int ix_ne = H1*W1*((iy-1) + rd*ix);
+    int ix_sw = H1*W1*(iy + rd*(ix-1));
+    int ix_se = H1*W1*(iy + rd*ix);
+
+    if (iy > 0 && ix > 0 && within_bounds(h1, w1, H1, W1))
+      g += *(grad_ptr + ix_nw) * dy * dx;
+
+    if (iy > 0 && ix < rd && within_bounds(h1, w1, H1, W1))
+      g += *(grad_ptr + ix_ne) * dy * (1-dx);
+
+    if (iy < rd && ix > 0 && within_bounds(h1, w1, H1, W1))
+      g += *(grad_ptr + ix_sw) * (1-dy) * dx;
+
+    if (iy < rd && ix < rd && within_bounds(h1, w1, H1, W1))
+      g += *(grad_ptr + ix_se) * (1-dy) * (1-dx);
+
+    for (int k=0; k<CHANNEL_STRIDE; k++)
+    /* [lost in extraction: accumulation of g into the shared
+       f1_grad/f2_grad buffers and the header of the fmap2_grad
+       write-back loop over k1] */
+      int h2 = static_cast<int>(floor(y2s[k1]))-r+iy;
+      int w2 = static_cast<int>(floor(x2s[k1]))-r+ix;
+      int c2 = tid % CHANNEL_STRIDE;
+
+      scalar_t* fptr = &fmap2_grad[b][h2][w2][0];
+      if (within_bounds(h2, w2, H2, W2))
+        atomicAdd(fptr+c+c2, f2_grad[c2][k1]);
+    }
+   }
+  }
+ }
+  __syncthreads();
+
+
+  for (int k=0; k<BLOCK_HW; k+=BLOCK_HW/CHANNEL_STRIDE)
+  /* [lost in extraction: write-back of the f1_grad tile into fmap1_grad
+     and the closing braces of corr_backward_kernel] */
+
+std::vector<torch::Tensor> corr_cuda_forward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  int radius)
+{
+  const auto B = coords.size(0);
+  const auto N = coords.size(1);
+  const auto H = coords.size(2);
+  const auto W = coords.size(3);
+
+  const auto rd = 2 * radius + 1;
+  auto opts = fmap1.options();
+  auto corr = torch::zeros({B, N, rd*rd, H, W}, opts);
+
+  const dim3 blocks(B, (H+BLOCK_H-1)/BLOCK_H, (W+BLOCK_W-1)/BLOCK_W);
+  const dim3 threads(BLOCK_H, BLOCK_W);
+
+  corr_forward_kernel<float><<<blocks, threads>>>(
+    fmap1.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    fmap2.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    coords.packed_accessor32<float,5,torch::RestrictPtrTraits>(),
+    corr.packed_accessor32<float,5,torch::RestrictPtrTraits>(),
+    radius);
+
+  return {corr};
+}
+
+std::vector<torch::Tensor> corr_cuda_backward(
+  torch::Tensor fmap1,
+  torch::Tensor fmap2,
+  torch::Tensor coords,
+  torch::Tensor corr_grad,
+  int radius)
+{
+  const auto B = coords.size(0);
+  const auto N = coords.size(1);
+
+  const auto H1 = fmap1.size(1);
+  const auto W1 = fmap1.size(2);
+  const auto H2 = fmap2.size(1);
+  const auto W2 = fmap2.size(2);
+  const auto C = fmap1.size(3);
+
+  auto opts = fmap1.options();
+  auto fmap1_grad = torch::zeros({B, H1, W1, C}, opts);
+  auto fmap2_grad = torch::zeros({B, H2, W2, C}, opts);
+  auto coords_grad = torch::zeros({B, N, H1, W1, 2}, opts);
+
+  const dim3 blocks(B, (H1+BLOCK_H-1)/BLOCK_H, (W1+BLOCK_W-1)/BLOCK_W);
+  const dim3 threads(BLOCK_H, BLOCK_W);
+
+
+  corr_backward_kernel<float><<<blocks, threads>>>(
+    fmap1.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    fmap2.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    coords.packed_accessor32<float,5,torch::RestrictPtrTraits>(),
+    corr_grad.packed_accessor32<float,5,torch::RestrictPtrTraits>(),
+    fmap1_grad.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    fmap2_grad.packed_accessor32<float,4,torch::RestrictPtrTraits>(),
+    coords_grad.packed_accessor32<float,5,torch::RestrictPtrTraits>(),
+    radius);
+
+  return {fmap1_grad, fmap2_grad, coords_grad};
+}
\ No newline at end of file
diff --git a/vbench/third_party/RAFT/alt_cuda_corr/setup.py b/vbench/third_party/RAFT/alt_cuda_corr/setup.py
new file mode 100644
index 00000000..c0207ff2
--- /dev/null
+++ b/vbench/third_party/RAFT/alt_cuda_corr/setup.py
@@ -0,0 +1,15 @@
+from setuptools import setup
+from torch.utils.cpp_extension import BuildExtension, CUDAExtension
+
+
+setup(
+    name='correlation',
+    ext_modules=[
+        CUDAExtension('alt_cuda_corr',
+            sources=['correlation.cpp', 'correlation_kernel.cu'],
+            extra_compile_args={'cxx': [], 'nvcc': ['-O3']}),
+    ],
+    cmdclass={
+        'build_ext': BuildExtension
+    })
+
diff --git a/vbench/third_party/RAFT/chairs_split.txt b/vbench/third_party/RAFT/chairs_split.txt
new file mode 100644
index 00000000..6ae8f0b7
--- /dev/null
+++ b/vbench/third_party/RAFT/chairs_split.txt
@@ -0,0 +1,22872 @@
[22,872 single-digit split flags, one per FlyingChairs image pair, each either 1 or 2; bulk omitted]
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +2 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
+1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +2 +2 +1 +1 +2 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 +1 
diff --git a/vbench/third_party/RAFT/core/__init__.py b/vbench/third_party/RAFT/core/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/vbench/third_party/RAFT/core/corr.py b/vbench/third_party/RAFT/core/corr.py
new file mode 100644
index 00000000..55e70156
--- /dev/null
+++ b/vbench/third_party/RAFT/core/corr.py
@@ -0,0 +1,91 @@
+import torch
+import torch.nn.functional as F
+from utils_core.utils import bilinear_sampler, coords_grid
+
+try:
+    import alt_cuda_corr
+except ImportError:
+    # alt_cuda_corr is not compiled
+    pass
+
+
+class CorrBlock:
+    def __init__(self, fmap1, fmap2, num_levels=4, radius=4):
+        self.num_levels = num_levels
+        self.radius = radius
+        self.corr_pyramid = []
+
+        # all pairs correlation
+        corr = CorrBlock.corr(fmap1, fmap2)
+
+        batch, h1, w1, dim, h2, w2 = corr.shape
+        corr = corr.reshape(batch*h1*w1, dim, h2, w2)
+
+        self.corr_pyramid.append(corr)
+        for i in range(self.num_levels-1):
+            corr = F.avg_pool2d(corr, 2, stride=2)
+            self.corr_pyramid.append(corr)
+
+    def __call__(self, coords):
+        r = self.radius
+        coords = coords.permute(0, 2, 3, 1)
+        batch, h1, w1, _ = coords.shape
+
+        out_pyramid = []
+        for i in range(self.num_levels):
+            corr = self.corr_pyramid[i]
+            dx = torch.linspace(-r, r, 2*r+1, device=coords.device)
+            dy = torch.linspace(-r, r, 2*r+1, device=coords.device)
+            delta = torch.stack(torch.meshgrid(dy, dx), axis=-1)
+
+            centroid_lvl = coords.reshape(batch*h1*w1, 1, 1, 2) / 2**i
+            delta_lvl = delta.view(1, 2*r+1, 2*r+1, 2)
+            coords_lvl = centroid_lvl + delta_lvl
+
+            corr = bilinear_sampler(corr, coords_lvl)
+            corr = corr.view(batch, h1, w1, -1)
+            out_pyramid.append(corr)
+
+        out = torch.cat(out_pyramid, dim=-1)
+        return out.permute(0, 3, 1, 2).contiguous().float()
+
+    @staticmethod
+    def corr(fmap1, fmap2):
+        batch, dim, ht, wd = fmap1.shape
+        fmap1 = fmap1.view(batch, dim, ht*wd)
+        fmap2 = fmap2.view(batch, dim, ht*wd)
+
+        corr = torch.matmul(fmap1.transpose(1,2), fmap2)
+        corr = corr.view(batch, ht, wd, 1, ht, wd)
+        return corr / torch.sqrt(torch.tensor(dim).float())
+
+
+class AlternateCorrBlock:
+    def __init__(self, fmap1, fmap2, num_levels=4, radius=4):
+        self.num_levels = num_levels
+        self.radius = radius
+
+        self.pyramid = [(fmap1, fmap2)]
+        for i in range(self.num_levels):
+            fmap1 = F.avg_pool2d(fmap1, 2, stride=2)
+            fmap2 = F.avg_pool2d(fmap2, 2, stride=2)
+            self.pyramid.append((fmap1, fmap2))
+
+    def __call__(self, coords):
+        coords = coords.permute(0, 2, 3, 1)
+        B, H, W, _ = coords.shape
+        dim = self.pyramid[0][0].shape[1]
+
+        corr_list = []
+        for i in range(self.num_levels):
+            r = self.radius
+            fmap1_i = self.pyramid[0][0].permute(0, 2, 3, 1).contiguous()
+            fmap2_i = self.pyramid[i][1].permute(0, 2, 3, 1).contiguous()
+
+            coords_i = (coords / 2**i).reshape(B, 1, H, W, 2).contiguous()
+            corr, = alt_cuda_corr.forward(fmap1_i, fmap2_i, coords_i, r)
+            corr_list.append(corr.squeeze(1))
+
+        corr = torch.stack(corr_list, dim=1)
+        corr = corr.reshape(B, -1, H, W)
+        return corr / torch.sqrt(torch.tensor(dim).float())
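For orientation, here is a minimal usage sketch for the `CorrBlock` added above (not part of the patch): the pyramid is built once from a pair of 1/8-resolution feature maps, then indexed with per-pixel coordinates. The `sys.path` entry mirrors what `raft.py` below does so that `corr.py` can resolve `utils_core`; the shapes are illustrative.

```python
# Sketch only, assuming the repo root is the working directory.
import sys
sys.path.append('vbench/third_party/RAFT/core')

import torch
from corr import CorrBlock

B, D, H, W = 1, 256, 48, 64                       # 1/8-resolution features
fmap1 = torch.randn(B, D, H, W)
fmap2 = torch.randn(B, D, H, W)
corr_fn = CorrBlock(fmap1, fmap2, num_levels=4, radius=4)

# (x, y) coordinate grid, matching what coords_grid in utils_core produces.
ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W))
coords = torch.stack([xs, ys], dim=0).float()[None]   # (B, 2, H, W)

out = corr_fn(coords)
print(out.shape)  # (1, 324, 48, 64): num_levels * (2*radius+1)**2 channels
```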
diff --git a/vbench/third_party/RAFT/core/datasets.py b/vbench/third_party/RAFT/core/datasets.py
new file mode 100644
index 00000000..cf849799
--- /dev/null
+++ b/vbench/third_party/RAFT/core/datasets.py
@@ -0,0 +1,235 @@
+# Data loading based on https://github.com/NVIDIA/flownet2-pytorch
+
+import numpy as np
+import torch
+import torch.utils.data as data
+import torch.nn.functional as F
+
+import os
+import math
+import random
+from glob import glob
+import os.path as osp
+
+from utils_core import frame_utils
+from utils_core.augmentor import FlowAugmentor, SparseFlowAugmentor
+
+
+class FlowDataset(data.Dataset):
+    def __init__(self, aug_params=None, sparse=False):
+        self.augmentor = None
+        self.sparse = sparse
+        if aug_params is not None:
+            if sparse:
+                self.augmentor = SparseFlowAugmentor(**aug_params)
+            else:
+                self.augmentor = FlowAugmentor(**aug_params)
+
+        self.is_test = False
+        self.init_seed = False
+        self.flow_list = []
+        self.image_list = []
+        self.extra_info = []
+
+    def __getitem__(self, index):
+
+        if self.is_test:
+            img1 = frame_utils.read_gen(self.image_list[index][0])
+            img2 = frame_utils.read_gen(self.image_list[index][1])
+            img1 = np.array(img1).astype(np.uint8)[..., :3]
+            img2 = np.array(img2).astype(np.uint8)[..., :3]
+            img1 = torch.from_numpy(img1).permute(2, 0, 1).float()
+            img2 = torch.from_numpy(img2).permute(2, 0, 1).float()
+            return img1, img2, self.extra_info[index]
+
+        if not self.init_seed:
+            worker_info = torch.utils.data.get_worker_info()
+            if worker_info is not None:
+                torch.manual_seed(worker_info.id)
+                np.random.seed(worker_info.id)
+                random.seed(worker_info.id)
+                self.init_seed = True
+
+        index = index % len(self.image_list)
+        valid = None
+        if self.sparse:
+            flow, valid = frame_utils.readFlowKITTI(self.flow_list[index])
+        else:
+            flow = frame_utils.read_gen(self.flow_list[index])
+
+        img1 = frame_utils.read_gen(self.image_list[index][0])
+        img2 = frame_utils.read_gen(self.image_list[index][1])
+
+        flow = np.array(flow).astype(np.float32)
+        img1 = np.array(img1).astype(np.uint8)
+        img2 = np.array(img2).astype(np.uint8)
+
+        # grayscale images
+        if len(img1.shape) == 2:
+            img1 = np.tile(img1[...,None], (1, 1, 3))
+            img2 = np.tile(img2[...,None], (1, 1, 3))
+        else:
+            img1 = img1[..., :3]
+            img2 = img2[..., :3]
+
+        if self.augmentor is not None:
+            if self.sparse:
+                img1, img2, flow, valid = self.augmentor(img1, img2, flow, valid)
+            else:
+                img1, img2, flow = self.augmentor(img1, img2, flow)
+
+        img1 = torch.from_numpy(img1).permute(2, 0, 1).float()
+        img2 = torch.from_numpy(img2).permute(2, 0, 1).float()
+        flow = torch.from_numpy(flow).permute(2, 0, 1).float()
+
+        if valid is not None:
+            valid = torch.from_numpy(valid)
+        else:
+            valid = (flow[0].abs() < 1000) & (flow[1].abs() < 1000)
+
+        return img1, img2, flow, valid.float()
+
+
+    def __rmul__(self, v):
+        self.flow_list = v * self.flow_list
+        self.image_list = v * self.image_list
+        return self
+
+    def __len__(self):
+        return len(self.image_list)
+
+
+class MpiSintel(FlowDataset):
+    def __init__(self, aug_params=None, split='training', root='datasets/Sintel', dstype='clean'):
+        super(MpiSintel, self).__init__(aug_params)
+        flow_root = osp.join(root, split, 'flow')
+        image_root = osp.join(root, split, dstype)
+
+        if split == 'test':
+            self.is_test = True
+
+        for scene in os.listdir(image_root):
+            image_list = sorted(glob(osp.join(image_root, scene, '*.png')))
+            for i in range(len(image_list)-1):
+                self.image_list += [ [image_list[i], image_list[i+1]] ]
+                self.extra_info += [ (scene, i) ] # scene and frame_id
+
+            if split != 'test':
+                self.flow_list += sorted(glob(osp.join(flow_root, scene, '*.flo')))
+
+
+class FlyingChairs(FlowDataset):
+    def __init__(self, aug_params=None, split='train', root='datasets/FlyingChairs_release/data'):
+        super(FlyingChairs, self).__init__(aug_params)
+
+        images = sorted(glob(osp.join(root, '*.ppm')))
+        flows = sorted(glob(osp.join(root, '*.flo')))
+        assert (len(images)//2 == len(flows))
+
+        split_list = np.loadtxt('chairs_split.txt', dtype=np.int32)
+        for i in range(len(flows)):
+            xid = split_list[i]
+            if (split=='training' and xid==1) or (split=='validation' and xid==2):
+                self.flow_list += [ flows[i] ]
+                self.image_list += [ [images[2*i], images[2*i+1]] ]
+
+
+class FlyingThings3D(FlowDataset):
+    def __init__(self, aug_params=None, root='datasets/FlyingThings3D', dstype='frames_cleanpass'):
+        super(FlyingThings3D, self).__init__(aug_params)
+
+        for cam in ['left']:
+            for direction in ['into_future', 'into_past']:
+                image_dirs = sorted(glob(osp.join(root, dstype, 'TRAIN/*/*')))
+                image_dirs = sorted([osp.join(f, cam) for f in image_dirs])
+
+                flow_dirs = sorted(glob(osp.join(root, 'optical_flow/TRAIN/*/*')))
+                flow_dirs = sorted([osp.join(f, direction, cam) for f in flow_dirs])
+
+                for idir, fdir in zip(image_dirs, flow_dirs):
+                    images = sorted(glob(osp.join(idir, '*.png')) )
+                    flows = sorted(glob(osp.join(fdir, '*.pfm')) )
+                    for i in range(len(flows)-1):
+                        if direction == 'into_future':
+                            self.image_list += [ [images[i], images[i+1]] ]
+                            self.flow_list += [ flows[i] ]
+                        elif direction == 'into_past':
+                            self.image_list += [ [images[i+1], images[i]] ]
+                            self.flow_list += [ flows[i+1] ]
+
+
+class KITTI(FlowDataset):
+    def __init__(self, aug_params=None, split='training', root='datasets/KITTI'):
+        super(KITTI, self).__init__(aug_params, sparse=True)
+        if split == 'testing':
+            self.is_test = True
+
+        root = osp.join(root, split)
+        images1 = sorted(glob(osp.join(root, 'image_2/*_10.png')))
+        images2 = sorted(glob(osp.join(root, 'image_2/*_11.png')))
+
+        for img1, img2 in zip(images1, images2):
+            frame_id = img1.split('/')[-1]
+            self.extra_info += [ [frame_id] ]
+            self.image_list += [ [img1, img2] ]
+
+        if split == 'training':
+            self.flow_list = sorted(glob(osp.join(root, 'flow_occ/*_10.png')))
+
+
+class HD1K(FlowDataset):
+    def __init__(self, aug_params=None, root='datasets/HD1k'):
+        super(HD1K, self).__init__(aug_params, sparse=True)
+
+        seq_ix = 0
+        while 1:
+            flows = sorted(glob(os.path.join(root, 'hd1k_flow_gt', 'flow_occ/%06d_*.png' % seq_ix)))
+            images = sorted(glob(os.path.join(root, 'hd1k_input', 'image_2/%06d_*.png' % seq_ix)))
+
+            if len(flows) == 0:
+                break
+
+            for i in range(len(flows)-1):
+                self.flow_list += [flows[i]]
+                self.image_list += [ [images[i], images[i+1]] ]
+
+            seq_ix += 1
+
+
+def fetch_dataloader(args, TRAIN_DS='C+T+K+S+H'):
+    """ Create the data loader for the corresponding training set """
+
+    if args.stage == 'chairs':
+        aug_params = {'crop_size': args.image_size, 'min_scale': -0.1, 'max_scale': 1.0, 'do_flip': True}
+        train_dataset = FlyingChairs(aug_params, split='training')
+
+    elif args.stage == 'things':
+        aug_params = {'crop_size': args.image_size, 'min_scale': -0.4, 'max_scale': 0.8, 'do_flip': True}
+        clean_dataset = FlyingThings3D(aug_params, dstype='frames_cleanpass')
+        final_dataset = FlyingThings3D(aug_params, dstype='frames_finalpass')
+        train_dataset = clean_dataset + final_dataset
+
+    elif args.stage == 'sintel':
+        aug_params = {'crop_size': args.image_size, 'min_scale': -0.2, 'max_scale': 0.6, 'do_flip': True}
+        things = FlyingThings3D(aug_params, dstype='frames_cleanpass')
+        sintel_clean = MpiSintel(aug_params, split='training', dstype='clean')
+        sintel_final = MpiSintel(aug_params, split='training', dstype='final')
+
+        if TRAIN_DS == 'C+T+K+S+H':
+            kitti = KITTI({'crop_size': args.image_size, 'min_scale': -0.3, 'max_scale': 0.5, 'do_flip': True})
+            hd1k = HD1K({'crop_size': args.image_size, 'min_scale': -0.5, 'max_scale': 0.2, 'do_flip': True})
+            train_dataset = 100*sintel_clean + 100*sintel_final + 200*kitti + 5*hd1k + things
+
+        elif TRAIN_DS == 'C+T+K/S':
+            train_dataset = 100*sintel_clean + 100*sintel_final + things
+
+    elif args.stage == 'kitti':
+        aug_params = {'crop_size': args.image_size, 'min_scale': -0.2, 'max_scale': 0.4, 'do_flip': False}
+        train_dataset = KITTI(aug_params, split='training')
+
+    train_loader = data.DataLoader(train_dataset, batch_size=args.batch_size,
+        pin_memory=False, shuffle=True, num_workers=4, drop_last=True)
+
+    print('Training with %d image pairs' % len(train_dataset))
+    return train_loader
+
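A hedged usage sketch for `fetch_dataloader` (not part of the patch): the namespace stands in for the upstream training argparser, and the corresponding dataset root (here `datasets/FlyingChairs_release/data` plus a `chairs_split.txt`) and the `utils_core.frame_utils` module from this vendored package must exist for it to actually run.

```python
# Sketch only, with vbench/third_party/RAFT/core on sys.path.
import sys
sys.path.append('vbench/third_party/RAFT/core')

from argparse import Namespace
from datasets import fetch_dataloader

args = Namespace(stage='chairs', image_size=[368, 496], batch_size=6)
train_loader = fetch_dataloader(args)

for img1, img2, flow, valid in train_loader:
    # img1/img2: (B, 3, H, W) floats in [0, 255]; flow: (B, 2, H, W); valid: (B, H, W)
    break
```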
diff --git a/vbench/third_party/RAFT/core/extractor.py b/vbench/third_party/RAFT/core/extractor.py
new file mode 100644
index 00000000..9a9c759d
--- /dev/null
+++ b/vbench/third_party/RAFT/core/extractor.py
@@ -0,0 +1,267 @@
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+
+
+class ResidualBlock(nn.Module):
+    def __init__(self, in_planes, planes, norm_fn='group', stride=1):
+        super(ResidualBlock, self).__init__()
+
+        self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=3, padding=1, stride=stride)
+        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, padding=1)
+        self.relu = nn.ReLU(inplace=True)
+
+        num_groups = planes // 8
+
+        if norm_fn == 'group':
+            self.norm1 = nn.GroupNorm(num_groups=num_groups, num_channels=planes)
+            self.norm2 = nn.GroupNorm(num_groups=num_groups, num_channels=planes)
+            if not stride == 1:
+                self.norm3 = nn.GroupNorm(num_groups=num_groups, num_channels=planes)
+
+        elif norm_fn == 'batch':
+            self.norm1 = nn.BatchNorm2d(planes)
+            self.norm2 = nn.BatchNorm2d(planes)
+            if not stride == 1:
+                self.norm3 = nn.BatchNorm2d(planes)
+
+        elif norm_fn == 'instance':
+            self.norm1 = nn.InstanceNorm2d(planes)
+            self.norm2 = nn.InstanceNorm2d(planes)
+            if not stride == 1:
+                self.norm3 = nn.InstanceNorm2d(planes)
+
+        elif norm_fn == 'none':
+            self.norm1 = nn.Sequential()
+            self.norm2 = nn.Sequential()
+            if not stride == 1:
+                self.norm3 = nn.Sequential()
+
+        if stride == 1:
+            self.downsample = None
+
+        else:
+            self.downsample = nn.Sequential(
+                nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride), self.norm3)
+
+
+    def forward(self, x):
+        y = x
+        y = self.relu(self.norm1(self.conv1(y)))
+        y = self.relu(self.norm2(self.conv2(y)))
+
+        if self.downsample is not None:
+            x = self.downsample(x)
+
+        return self.relu(x+y)
+
+
+
+class BottleneckBlock(nn.Module):
+    def __init__(self, in_planes, planes, norm_fn='group', stride=1):
+        super(BottleneckBlock, self).__init__()
+
+        self.conv1 = nn.Conv2d(in_planes, planes//4, kernel_size=1, padding=0)
+        self.conv2 = nn.Conv2d(planes//4, planes//4, kernel_size=3, padding=1, stride=stride)
+        self.conv3 = nn.Conv2d(planes//4, planes, kernel_size=1, padding=0)
+        self.relu = nn.ReLU(inplace=True)
+
+        num_groups = planes // 8
+
+        if norm_fn == 'group':
+            self.norm1 = nn.GroupNorm(num_groups=num_groups, num_channels=planes//4)
+            self.norm2 = nn.GroupNorm(num_groups=num_groups, num_channels=planes//4)
+            self.norm3 = nn.GroupNorm(num_groups=num_groups, num_channels=planes)
+            if not stride == 1:
+                self.norm4 = nn.GroupNorm(num_groups=num_groups, num_channels=planes)
+
+        elif norm_fn == 'batch':
+            self.norm1 = nn.BatchNorm2d(planes//4)
+            self.norm2 = nn.BatchNorm2d(planes//4)
+            self.norm3 = nn.BatchNorm2d(planes)
+            if not stride == 1:
+                self.norm4 = nn.BatchNorm2d(planes)
+
+        elif norm_fn == 'instance':
+            self.norm1 = nn.InstanceNorm2d(planes//4)
+            self.norm2 = nn.InstanceNorm2d(planes//4)
+            self.norm3 = nn.InstanceNorm2d(planes)
+            if not stride == 1:
+                self.norm4 = nn.InstanceNorm2d(planes)
+
+        elif norm_fn == 'none':
+            self.norm1 = nn.Sequential()
+            self.norm2 = nn.Sequential()
+            self.norm3 = nn.Sequential()
+            if not stride == 1:
+                self.norm4 = nn.Sequential()
+
+        if stride == 1:
+            self.downsample = None
+
+        else:
+            self.downsample = nn.Sequential(
+                nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride), self.norm4)
+
+
+    def forward(self, x):
+        y = x
+        y = self.relu(self.norm1(self.conv1(y)))
+        y = self.relu(self.norm2(self.conv2(y)))
+        y = self.relu(self.norm3(self.conv3(y)))
+
+        if self.downsample is not None:
+            x = self.downsample(x)
+
+        return self.relu(x+y)
+
+class BasicEncoder(nn.Module):
+    def __init__(self, output_dim=128, norm_fn='batch', dropout=0.0):
+        super(BasicEncoder, self).__init__()
+        self.norm_fn = norm_fn
+
+        if self.norm_fn == 'group':
+            self.norm1 = nn.GroupNorm(num_groups=8, num_channels=64)
+
+        elif self.norm_fn == 'batch':
+            self.norm1 = nn.BatchNorm2d(64)
+
+        elif self.norm_fn == 'instance':
+            self.norm1 = nn.InstanceNorm2d(64)
+
+        elif self.norm_fn == 'none':
+            self.norm1 = nn.Sequential()
+
+        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3)
+        self.relu1 = nn.ReLU(inplace=True)
+
+        self.in_planes = 64
+        self.layer1 = self._make_layer(64, stride=1)
+        self.layer2 = self._make_layer(96, stride=2)
+        self.layer3 = self._make_layer(128, stride=2)
+
+        # output convolution
+        self.conv2 = nn.Conv2d(128, output_dim, kernel_size=1)
+
+        self.dropout = None
+        if dropout > 0:
+            self.dropout = nn.Dropout2d(p=dropout)
+
+        for m in self.modules():
+            if isinstance(m, nn.Conv2d):
+                nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
+            elif isinstance(m, (nn.BatchNorm2d, nn.InstanceNorm2d, nn.GroupNorm)):
+                if m.weight is not None:
+                    nn.init.constant_(m.weight, 1)
+                if m.bias is not None:
+                    nn.init.constant_(m.bias, 0)
+
+    def _make_layer(self, dim, stride=1):
+        layer1 = ResidualBlock(self.in_planes, dim, self.norm_fn, stride=stride)
+        layer2 = ResidualBlock(dim, dim, self.norm_fn, stride=1)
+        layers = (layer1, layer2)
+
+        self.in_planes = dim
+        return nn.Sequential(*layers)
+
+
+    def forward(self, x):
+
+        # if input is list, combine batch dimension
+        is_list = isinstance(x, tuple) or isinstance(x, list)
+        if is_list:
+            batch_dim = x[0].shape[0]
+            x = torch.cat(x, dim=0)
+
+        x = self.conv1(x)
+        x = self.norm1(x)
+        x = self.relu1(x)
+
+        x = self.layer1(x)
+        x = self.layer2(x)
+        x = self.layer3(x)
+
+        x = self.conv2(x)
+
+        if self.training and self.dropout is not None:
+            x = self.dropout(x)
+
+        if is_list:
+            x = torch.split(x, [batch_dim, batch_dim], dim=0)
+
+        return x
+
+
+class SmallEncoder(nn.Module):
+    def __init__(self, output_dim=128, norm_fn='batch', dropout=0.0):
+        super(SmallEncoder, self).__init__()
+        self.norm_fn = norm_fn
+
+        if self.norm_fn == 'group':
+            self.norm1 = nn.GroupNorm(num_groups=8, num_channels=32)
+
+        elif self.norm_fn == 'batch':
+            self.norm1 = nn.BatchNorm2d(32)
+
+        elif self.norm_fn == 'instance':
+            self.norm1 = nn.InstanceNorm2d(32)
+
+        elif self.norm_fn == 'none':
+            self.norm1 = nn.Sequential()
+
+        self.conv1 = nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3)
+        self.relu1 = nn.ReLU(inplace=True)
+
+        self.in_planes = 32
+        self.layer1 = self._make_layer(32, stride=1)
+        self.layer2 = self._make_layer(64, stride=2)
+        self.layer3 = self._make_layer(96, stride=2)
+
+        self.dropout = None
+        if dropout > 0:
+            self.dropout = nn.Dropout2d(p=dropout)
+
+        self.conv2 = nn.Conv2d(96, output_dim, kernel_size=1)
+
+        for m in self.modules():
+            if isinstance(m, nn.Conv2d):
+                nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
+            elif isinstance(m, (nn.BatchNorm2d, nn.InstanceNorm2d, nn.GroupNorm)):
+                if m.weight is not None:
+                    nn.init.constant_(m.weight, 1)
+                if m.bias is not None:
+                    nn.init.constant_(m.bias, 0)
+
+    def _make_layer(self, dim, stride=1):
+        layer1 = BottleneckBlock(self.in_planes, dim, self.norm_fn, stride=stride)
+        layer2 = BottleneckBlock(dim, dim, self.norm_fn, stride=1)
+        layers = (layer1, layer2)
+
+        self.in_planes = dim
+        return nn.Sequential(*layers)
+
+
+    def forward(self, x):
+
+        # if input is list, combine batch dimension
+        is_list = isinstance(x, tuple) or isinstance(x, list)
+        if is_list:
+            batch_dim = x[0].shape[0]
+            x = torch.cat(x, dim=0)
+
+        x = self.conv1(x)
+        x = self.norm1(x)
+        x = self.relu1(x)
+
+        x = self.layer1(x)
+        x = self.layer2(x)
+        x = self.layer3(x)
+        x = self.conv2(x)
+
+        if self.training and self.dropout is not None:
+            x = self.dropout(x)
+
+        if is_list:
+            x = torch.split(x, [batch_dim, batch_dim], dim=0)
+
+        return x
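Both encoders reduce spatial resolution by a factor of 8 (a stride-2 stem followed by two stride-2 stages). A quick shape check, sketched under the assumption that `vbench/third_party/RAFT/core` is on `sys.path`:

```python
# Sketch only: BasicEncoder maps full-resolution frames to 1/8-scale features.
import torch
from extractor import BasicEncoder

fnet = BasicEncoder(output_dim=256, norm_fn='instance')
img = torch.randn(1, 3, 384, 512)
print(fnet(img).shape)   # torch.Size([1, 256, 48, 64])

# Passing a [img1, img2] list concatenates along the batch dimension and
# splits again, so both frames share a single forward pass:
f1, f2 = fnet([img, img.clone()])
```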
diff --git a/vbench/third_party/RAFT/core/raft.py b/vbench/third_party/RAFT/core/raft.py
new file mode 100644
index 00000000..ba5d48f9
--- /dev/null
+++ b/vbench/third_party/RAFT/core/raft.py
@@ -0,0 +1,147 @@
+import numpy as np
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+
+import sys
+sys.path.append('vbench/third_party/RAFT/core')
+
+from .update import BasicUpdateBlock, SmallUpdateBlock
+from .extractor import BasicEncoder, SmallEncoder
+from .corr import CorrBlock, AlternateCorrBlock
+from .utils_core.utils import bilinear_sampler, coords_grid, upflow8
+
+try:
+    autocast = torch.cuda.amp.autocast
+except AttributeError:
+    # dummy autocast for PyTorch < 1.6
+    class autocast:
+        def __init__(self, enabled):
+            pass
+        def __enter__(self):
+            pass
+        def __exit__(self, *args):
+            pass
+
+
+class RAFT(nn.Module):
+    def __init__(self, args):
+        super(RAFT, self).__init__()
+        self.args = args
+
+        if args.small:
+            self.hidden_dim = hdim = 96
+            self.context_dim = cdim = 64
+            args.corr_levels = 4
+            args.corr_radius = 3
+
+        else:
+            self.hidden_dim = hdim = 128
+            self.context_dim = cdim = 128
+            args.corr_levels = 4
+            args.corr_radius = 4
+
+        if 'dropout' not in self.args:
+            self.args.dropout = 0
+
+        if 'alternate_corr' not in self.args:
+            self.args.alternate_corr = False
+
+        # feature network, context network, and update block
+        if args.small:
+            self.fnet = SmallEncoder(output_dim=128, norm_fn='instance', dropout=args.dropout)
+            self.cnet = SmallEncoder(output_dim=hdim+cdim, norm_fn='none', dropout=args.dropout)
+            self.update_block = SmallUpdateBlock(self.args, hidden_dim=hdim)
+
+        else:
+            self.fnet = BasicEncoder(output_dim=256, norm_fn='instance', dropout=args.dropout)
+            self.cnet = BasicEncoder(output_dim=hdim+cdim, norm_fn='batch', dropout=args.dropout)
+            self.update_block = BasicUpdateBlock(self.args, hidden_dim=hdim)
+
+    def freeze_bn(self):
+        for m in self.modules():
+            if isinstance(m, nn.BatchNorm2d):
+                m.eval()
+
+    def initialize_flow(self, img):
+        """ Flow is represented as difference between two coordinate grids flow = coords1 - coords0"""
+        N, C, H, W = img.shape
+        coords0 = coords_grid(N, H//8, W//8, device=img.device)
+        coords1 = coords_grid(N, H//8, W//8, device=img.device)
+
+        # optical flow computed as difference: flow = coords1 - coords0
+        return coords0, coords1
+
+    def upsample_flow(self, flow, mask):
+        """ Upsample flow field [H/8, W/8, 2] -> [H, W, 2] using convex combination """
+        N, _, H, W = flow.shape
+        mask = mask.view(N, 1, 9, 8, 8, H, W)
+        mask = torch.softmax(mask, dim=2)
+
+        up_flow = F.unfold(8 * flow, [3,3], padding=1)
+        up_flow = up_flow.view(N, 2, 9, 1, 1, H, W)
+
+        up_flow = torch.sum(mask * up_flow, dim=2)
+        up_flow = up_flow.permute(0, 1, 4, 2, 5, 3)
+        return up_flow.reshape(N, 2, 8*H, 8*W)
+
+
+    def forward(self, image1, image2, iters=12, flow_init=None, upsample=True, test_mode=False):
+        """ Estimate optical flow between pair of frames """
+
+        image1 = 2 * (image1 / 255.0) - 1.0
+        image2 = 2 * (image2 / 255.0) - 1.0
+
+        image1 = image1.contiguous()
+        image2 = image2.contiguous()
+
+        hdim = self.hidden_dim
+        cdim = self.context_dim
+
+        # run the feature network
+        with autocast(enabled=self.args.mixed_precision):
+            fmap1, fmap2 = self.fnet([image1, image2])
+
+        fmap1 = fmap1.float()
+        fmap2 = fmap2.float()
+        if self.args.alternate_corr:
+            corr_fn = AlternateCorrBlock(fmap1, fmap2, radius=self.args.corr_radius)
+        else:
+            corr_fn = CorrBlock(fmap1, fmap2, radius=self.args.corr_radius)
+
+        # run the context network
+        with autocast(enabled=self.args.mixed_precision):
+            cnet = self.cnet(image1)
+            net, inp = torch.split(cnet, [hdim, cdim], dim=1)
+            net = torch.tanh(net)
+            inp = torch.relu(inp)
+
+        coords0, coords1 = self.initialize_flow(image1)
+
+        if flow_init is not None:
+            coords1 = coords1 + flow_init
+
+        flow_predictions = []
+        for itr in range(iters):
+            coords1 = coords1.detach()
+            corr = corr_fn(coords1) # index correlation volume
+
+            flow = coords1 - coords0
+            with autocast(enabled=self.args.mixed_precision):
+                net, up_mask, delta_flow = self.update_block(net, inp, corr, flow)
+
+            # F(t+1) = F(t) + \Delta(t)
+            coords1 = coords1 + delta_flow
+
+            # upsample predictions
+            if up_mask is None:
+                flow_up = upflow8(coords1 - coords0)
+            else:
+                flow_up = self.upsample_flow(coords1 - coords0, up_mask)
+
+            flow_predictions.append(flow_up)
+
+        if test_mode:
+            return coords1 - coords0, flow_up
+
+        return flow_predictions
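An end-to-end inference sketch for the RAFT module above. The checkpoint line is a placeholder (RAFT weights are distributed separately and are not part of this patch), and the `mixed_precision`/`alternate_corr` flags simply mirror the attributes the forward pass reads; the package path assumes the vendored layout is importable.

```python
# Sketch only: estimating flow between two frames with the vendored RAFT.
# H and W should be multiples of 8; inputs are RGB in [0, 255].
import torch
from argparse import Namespace
from vbench.third_party.RAFT.core.raft import RAFT

args = Namespace(small=False, mixed_precision=False, alternate_corr=False)
model = RAFT(args).eval()
# state = torch.load('raft-things.pth', map_location='cpu')   # hypothetical checkpoint
# model.load_state_dict({k.replace('module.', ''): v for k, v in state.items()})

frame1 = torch.randint(0, 256, (1, 3, 384, 512)).float()
frame2 = torch.randint(0, 256, (1, 3, 384, 512)).float()

with torch.no_grad():
    flow_low, flow_up = model(frame1, frame2, iters=20, test_mode=True)
print(flow_low.shape, flow_up.shape)  # (1, 2, 48, 64) and (1, 2, 384, 512)
```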
BasicMotionEncoder(nn.Module):
+    def __init__(self, args):
+        super(BasicMotionEncoder, self).__init__()
+        cor_planes = args.corr_levels * (2*args.corr_radius + 1)**2
+        self.convc1 = nn.Conv2d(cor_planes, 256, 1, padding=0)
+        self.convc2 = nn.Conv2d(256, 192, 3, padding=1)
+        self.convf1 = nn.Conv2d(2, 128, 7, padding=3)
+        self.convf2 = nn.Conv2d(128, 64, 3, padding=1)
+        self.conv = nn.Conv2d(64+192, 128-2, 3, padding=1)
+
+    def forward(self, flow, corr):
+        cor = F.relu(self.convc1(corr))
+        cor = F.relu(self.convc2(cor))
+        flo = F.relu(self.convf1(flow))
+        flo = F.relu(self.convf2(flo))
+
+        cor_flo = torch.cat([cor, flo], dim=1)
+        out = F.relu(self.conv(cor_flo))
+        return torch.cat([out, flow], dim=1)
+
+class SmallUpdateBlock(nn.Module):
+    def __init__(self, args, hidden_dim=96):
+        super(SmallUpdateBlock, self).__init__()
+        self.encoder = SmallMotionEncoder(args)
+        self.gru = ConvGRU(hidden_dim=hidden_dim, input_dim=82+64)
+        self.flow_head = FlowHead(hidden_dim, hidden_dim=128)
+
+    def forward(self, net, inp, corr, flow):
+        motion_features = self.encoder(flow, corr)
+        inp = torch.cat([inp, motion_features], dim=1)
+        net = self.gru(net, inp)
+        delta_flow = self.flow_head(net)
+
+        return net, None, delta_flow
+
+class BasicUpdateBlock(nn.Module):
+    def __init__(self, args, hidden_dim=128, input_dim=128):
+        super(BasicUpdateBlock, self).__init__()
+        self.args = args
+        self.encoder = BasicMotionEncoder(args)
+        self.gru = SepConvGRU(hidden_dim=hidden_dim, input_dim=128+hidden_dim)
+        self.flow_head = FlowHead(hidden_dim, hidden_dim=256)
+
+        self.mask = nn.Sequential(
+            nn.Conv2d(128, 256, 3, padding=1),
+            nn.ReLU(inplace=True),
+            nn.Conv2d(256, 64*9, 1, padding=0))
+
+    def forward(self, net, inp, corr, flow, upsample=True):
+        motion_features = self.encoder(flow, corr)
+        inp = torch.cat([inp, motion_features], dim=1)
+
+        net = self.gru(net, inp)
+        delta_flow = self.flow_head(net)
+
+        # scale mask to balance gradients
+        mask = .25 * self.mask(net)
+        return net, mask, delta_flow
+
+
+
diff --git a/vbench/third_party/RAFT/core/utils_core/__init__.py b/vbench/third_party/RAFT/core/utils_core/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/vbench/third_party/RAFT/core/utils_core/augmentor.py b/vbench/third_party/RAFT/core/utils_core/augmentor.py
new file mode 100644
index 00000000..e81c4f2b
--- /dev/null
+++ b/vbench/third_party/RAFT/core/utils_core/augmentor.py
@@ -0,0 +1,246 @@
+import numpy as np
+import random
+import math
+from PIL import Image
+
+import cv2
+cv2.setNumThreads(0)
+cv2.ocl.setUseOpenCL(False)
+
+import torch
+from torchvision.transforms import ColorJitter
+import torch.nn.functional as F
+
+
+class FlowAugmentor:
+    def __init__(self, crop_size, min_scale=-0.2, max_scale=0.5, do_flip=True):
+
+        # spatial augmentation params
+        self.crop_size = crop_size
+        self.min_scale = min_scale
+        self.max_scale = max_scale
+        self.spatial_aug_prob = 0.8
+        self.stretch_prob = 0.8
+        self.max_stretch = 0.2
+
+        # flip augmentation params
+        self.do_flip = do_flip
+        self.h_flip_prob = 0.5
+        self.v_flip_prob = 0.1
+
+        # photometric augmentation params
+        self.photo_aug = ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4, hue=0.5/3.14)
+        self.asymmetric_color_aug_prob = 0.2
+        self.eraser_aug_prob = 0.5
+
+    def color_transform(self, img1, img2):
+        """ Photometric augmentation """
+
+        # asymmetric
+        if np.random.rand() < self.asymmetric_color_aug_prob:
+            img1 = np.array(self.photo_aug(Image.fromarray(img1)), dtype=np.uint8)
+            img2 = 
np.array(self.photo_aug(Image.fromarray(img2)), dtype=np.uint8) + + # symmetric + else: + image_stack = np.concatenate([img1, img2], axis=0) + image_stack = np.array(self.photo_aug(Image.fromarray(image_stack)), dtype=np.uint8) + img1, img2 = np.split(image_stack, 2, axis=0) + + return img1, img2 + + def eraser_transform(self, img1, img2, bounds=[50, 100]): + """ Occlusion augmentation """ + + ht, wd = img1.shape[:2] + if np.random.rand() < self.eraser_aug_prob: + mean_color = np.mean(img2.reshape(-1, 3), axis=0) + for _ in range(np.random.randint(1, 3)): + x0 = np.random.randint(0, wd) + y0 = np.random.randint(0, ht) + dx = np.random.randint(bounds[0], bounds[1]) + dy = np.random.randint(bounds[0], bounds[1]) + img2[y0:y0+dy, x0:x0+dx, :] = mean_color + + return img1, img2 + + def spatial_transform(self, img1, img2, flow): + # randomly sample scale + ht, wd = img1.shape[:2] + min_scale = np.maximum( + (self.crop_size[0] + 8) / float(ht), + (self.crop_size[1] + 8) / float(wd)) + + scale = 2 ** np.random.uniform(self.min_scale, self.max_scale) + scale_x = scale + scale_y = scale + if np.random.rand() < self.stretch_prob: + scale_x *= 2 ** np.random.uniform(-self.max_stretch, self.max_stretch) + scale_y *= 2 ** np.random.uniform(-self.max_stretch, self.max_stretch) + + scale_x = np.clip(scale_x, min_scale, None) + scale_y = np.clip(scale_y, min_scale, None) + + if np.random.rand() < self.spatial_aug_prob: + # rescale the images + img1 = cv2.resize(img1, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR) + img2 = cv2.resize(img2, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR) + flow = cv2.resize(flow, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR) + flow = flow * [scale_x, scale_y] + + if self.do_flip: + if np.random.rand() < self.h_flip_prob: # h-flip + img1 = img1[:, ::-1] + img2 = img2[:, ::-1] + flow = flow[:, ::-1] * [-1.0, 1.0] + + if np.random.rand() < self.v_flip_prob: # v-flip + img1 = img1[::-1, :] + img2 = img2[::-1, :] + flow = flow[::-1, :] * [1.0, -1.0] + + y0 = np.random.randint(0, img1.shape[0] - self.crop_size[0]) + x0 = np.random.randint(0, img1.shape[1] - self.crop_size[1]) + + img1 = img1[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + img2 = img2[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + flow = flow[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + + return img1, img2, flow + + def __call__(self, img1, img2, flow): + img1, img2 = self.color_transform(img1, img2) + img1, img2 = self.eraser_transform(img1, img2) + img1, img2, flow = self.spatial_transform(img1, img2, flow) + + img1 = np.ascontiguousarray(img1) + img2 = np.ascontiguousarray(img2) + flow = np.ascontiguousarray(flow) + + return img1, img2, flow + +class SparseFlowAugmentor: + def __init__(self, crop_size, min_scale=-0.2, max_scale=0.5, do_flip=False): + # spatial augmentation params + self.crop_size = crop_size + self.min_scale = min_scale + self.max_scale = max_scale + self.spatial_aug_prob = 0.8 + self.stretch_prob = 0.8 + self.max_stretch = 0.2 + + # flip augmentation params + self.do_flip = do_flip + self.h_flip_prob = 0.5 + self.v_flip_prob = 0.1 + + # photometric augmentation params + self.photo_aug = ColorJitter(brightness=0.3, contrast=0.3, saturation=0.3, hue=0.3/3.14) + self.asymmetric_color_aug_prob = 0.2 + self.eraser_aug_prob = 0.5 + + def color_transform(self, img1, img2): + image_stack = np.concatenate([img1, img2], axis=0) + image_stack = np.array(self.photo_aug(Image.fromarray(image_stack)), dtype=np.uint8) + img1, img2 = 
np.split(image_stack, 2, axis=0) + return img1, img2 + + def eraser_transform(self, img1, img2): + ht, wd = img1.shape[:2] + if np.random.rand() < self.eraser_aug_prob: + mean_color = np.mean(img2.reshape(-1, 3), axis=0) + for _ in range(np.random.randint(1, 3)): + x0 = np.random.randint(0, wd) + y0 = np.random.randint(0, ht) + dx = np.random.randint(50, 100) + dy = np.random.randint(50, 100) + img2[y0:y0+dy, x0:x0+dx, :] = mean_color + + return img1, img2 + + def resize_sparse_flow_map(self, flow, valid, fx=1.0, fy=1.0): + ht, wd = flow.shape[:2] + coords = np.meshgrid(np.arange(wd), np.arange(ht)) + coords = np.stack(coords, axis=-1) + + coords = coords.reshape(-1, 2).astype(np.float32) + flow = flow.reshape(-1, 2).astype(np.float32) + valid = valid.reshape(-1).astype(np.float32) + + coords0 = coords[valid>=1] + flow0 = flow[valid>=1] + + ht1 = int(round(ht * fy)) + wd1 = int(round(wd * fx)) + + coords1 = coords0 * [fx, fy] + flow1 = flow0 * [fx, fy] + + xx = np.round(coords1[:,0]).astype(np.int32) + yy = np.round(coords1[:,1]).astype(np.int32) + + v = (xx > 0) & (xx < wd1) & (yy > 0) & (yy < ht1) + xx = xx[v] + yy = yy[v] + flow1 = flow1[v] + + flow_img = np.zeros([ht1, wd1, 2], dtype=np.float32) + valid_img = np.zeros([ht1, wd1], dtype=np.int32) + + flow_img[yy, xx] = flow1 + valid_img[yy, xx] = 1 + + return flow_img, valid_img + + def spatial_transform(self, img1, img2, flow, valid): + # randomly sample scale + + ht, wd = img1.shape[:2] + min_scale = np.maximum( + (self.crop_size[0] + 1) / float(ht), + (self.crop_size[1] + 1) / float(wd)) + + scale = 2 ** np.random.uniform(self.min_scale, self.max_scale) + scale_x = np.clip(scale, min_scale, None) + scale_y = np.clip(scale, min_scale, None) + + if np.random.rand() < self.spatial_aug_prob: + # rescale the images + img1 = cv2.resize(img1, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR) + img2 = cv2.resize(img2, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR) + flow, valid = self.resize_sparse_flow_map(flow, valid, fx=scale_x, fy=scale_y) + + if self.do_flip: + if np.random.rand() < 0.5: # h-flip + img1 = img1[:, ::-1] + img2 = img2[:, ::-1] + flow = flow[:, ::-1] * [-1.0, 1.0] + valid = valid[:, ::-1] + + margin_y = 20 + margin_x = 50 + + y0 = np.random.randint(0, img1.shape[0] - self.crop_size[0] + margin_y) + x0 = np.random.randint(-margin_x, img1.shape[1] - self.crop_size[1] + margin_x) + + y0 = np.clip(y0, 0, img1.shape[0] - self.crop_size[0]) + x0 = np.clip(x0, 0, img1.shape[1] - self.crop_size[1]) + + img1 = img1[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + img2 = img2[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + flow = flow[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + valid = valid[y0:y0+self.crop_size[0], x0:x0+self.crop_size[1]] + return img1, img2, flow, valid + + + def __call__(self, img1, img2, flow, valid): + img1, img2 = self.color_transform(img1, img2) + img1, img2 = self.eraser_transform(img1, img2) + img1, img2, flow, valid = self.spatial_transform(img1, img2, flow, valid) + + img1 = np.ascontiguousarray(img1) + img2 = np.ascontiguousarray(img2) + flow = np.ascontiguousarray(flow) + valid = np.ascontiguousarray(valid) + + return img1, img2, flow, valid diff --git a/vbench/third_party/RAFT/core/utils_core/flow_viz.py b/vbench/third_party/RAFT/core/utils_core/flow_viz.py new file mode 100644 index 00000000..dcee65e8 --- /dev/null +++ b/vbench/third_party/RAFT/core/utils_core/flow_viz.py @@ -0,0 +1,132 @@ +# Flow visualization code used from 
https://github.com/tomrunia/OpticalFlow_Visualization
+
+
+# MIT License
+#
+# Copyright (c) 2018 Tom Runia
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to conditions.
+#
+# Author: Tom Runia
+# Date Created: 2018-08-03
+
+import numpy as np
+
+def make_colorwheel():
+    """
+    Generates a color wheel for optical flow visualization as presented in:
+        Baker et al. "A Database and Evaluation Methodology for Optical Flow" (ICCV, 2007)
+        URL: http://vision.middlebury.edu/flow/flowEval-iccv07.pdf
+
+    Code follows the original C++ source code of Daniel Scharstein.
+    Code follows the Matlab source code of Deqing Sun.
+
+    Returns:
+        np.ndarray: Color wheel
+    """
+
+    RY = 15
+    YG = 6
+    GC = 4
+    CB = 11
+    BM = 13
+    MR = 6
+
+    ncols = RY + YG + GC + CB + BM + MR
+    colorwheel = np.zeros((ncols, 3))
+    col = 0
+
+    # RY
+    colorwheel[0:RY, 0] = 255
+    colorwheel[0:RY, 1] = np.floor(255*np.arange(0,RY)/RY)
+    col = col+RY
+    # YG
+    colorwheel[col:col+YG, 0] = 255 - np.floor(255*np.arange(0,YG)/YG)
+    colorwheel[col:col+YG, 1] = 255
+    col = col+YG
+    # GC
+    colorwheel[col:col+GC, 1] = 255
+    colorwheel[col:col+GC, 2] = np.floor(255*np.arange(0,GC)/GC)
+    col = col+GC
+    # CB
+    colorwheel[col:col+CB, 1] = 255 - np.floor(255*np.arange(CB)/CB)
+    colorwheel[col:col+CB, 2] = 255
+    col = col+CB
+    # BM
+    colorwheel[col:col+BM, 2] = 255
+    colorwheel[col:col+BM, 0] = np.floor(255*np.arange(0,BM)/BM)
+    col = col+BM
+    # MR
+    colorwheel[col:col+MR, 2] = 255 - np.floor(255*np.arange(MR)/MR)
+    colorwheel[col:col+MR, 0] = 255
+    return colorwheel
+
+
+def flow_uv_to_colors(u, v, convert_to_bgr=False):
+    """
+    Applies the flow color wheel to (possibly clipped) flow components u and v.
+
+    According to the C++ source code of Daniel Scharstein
+    According to the Matlab source code of Deqing Sun
+
+    Args:
+        u (np.ndarray): Input horizontal flow of shape [H,W]
+        v (np.ndarray): Input vertical flow of shape [H,W]
+        convert_to_bgr (bool, optional): Convert output image to BGR. Defaults to False.
+
+    Returns:
+        np.ndarray: Flow visualization image of shape [H,W,3]
+    """
+    flow_image = np.zeros((u.shape[0], u.shape[1], 3), np.uint8)
+    colorwheel = make_colorwheel()  # shape [55x3]
+    ncols = colorwheel.shape[0]
+    rad = np.sqrt(np.square(u) + np.square(v))
+    a = np.arctan2(-v, -u)/np.pi
+    fk = (a+1) / 2*(ncols-1)
+    k0 = np.floor(fk).astype(np.int32)
+    k1 = k0 + 1
+    k1[k1 == ncols] = 0
+    f = fk - k0
+    for i in range(colorwheel.shape[1]):
+        tmp = colorwheel[:,i]
+        col0 = tmp[k0] / 255.0
+        col1 = tmp[k1] / 255.0
+        col = (1-f)*col0 + f*col1
+        idx = (rad <= 1)
+        col[idx]  = 1 - rad[idx] * (1-col[idx])
+        col[~idx] = col[~idx] * 0.75   # out of range
+        # Note the 2-i => BGR instead of RGB
+        ch_idx = 2-i if convert_to_bgr else i
+        flow_image[:,:,ch_idx] = np.floor(255 * col)
+    return flow_image
+
+
+def flow_to_image(flow_uv, clip_flow=None, convert_to_bgr=False):
+    """
+    Expects a two dimensional flow image of shape [H,W,2].
+
+    Args:
+        flow_uv (np.ndarray): Flow UV image of shape [H,W,2]
+        clip_flow (float, optional): Clip maximum of flow values. Defaults to None.
+        convert_to_bgr (bool, optional): Convert output image to BGR. Defaults to False.
+ + Returns: + np.ndarray: Flow visualization image of shape [H,W,3] + """ + assert flow_uv.ndim == 3, 'input flow must have three dimensions' + assert flow_uv.shape[2] == 2, 'input flow must have shape [H,W,2]' + if clip_flow is not None: + flow_uv = np.clip(flow_uv, 0, clip_flow) + u = flow_uv[:,:,0] + v = flow_uv[:,:,1] + rad = np.sqrt(np.square(u) + np.square(v)) + rad_max = np.max(rad) + epsilon = 1e-5 + u = u / (rad_max + epsilon) + v = v / (rad_max + epsilon) + return flow_uv_to_colors(u, v, convert_to_bgr) \ No newline at end of file diff --git a/vbench/third_party/RAFT/core/utils_core/frame_utils.py b/vbench/third_party/RAFT/core/utils_core/frame_utils.py new file mode 100644 index 00000000..6c491135 --- /dev/null +++ b/vbench/third_party/RAFT/core/utils_core/frame_utils.py @@ -0,0 +1,137 @@ +import numpy as np +from PIL import Image +from os.path import * +import re + +import cv2 +cv2.setNumThreads(0) +cv2.ocl.setUseOpenCL(False) + +TAG_CHAR = np.array([202021.25], np.float32) + +def readFlow(fn): + """ Read .flo file in Middlebury format""" + # Code adapted from: + # http://stackoverflow.com/questions/28013200/reading-middlebury-flow-files-with-python-bytes-array-numpy + + # WARNING: this will work on little-endian architectures (eg Intel x86) only! + # print 'fn = %s'%(fn) + with open(fn, 'rb') as f: + magic = np.fromfile(f, np.float32, count=1) + if 202021.25 != magic: + print('Magic number incorrect. Invalid .flo file') + return None + else: + w = np.fromfile(f, np.int32, count=1) + h = np.fromfile(f, np.int32, count=1) + # print 'Reading %d x %d flo file\n' % (w, h) + data = np.fromfile(f, np.float32, count=2*int(w)*int(h)) + # Reshape data into 3D array (columns, rows, bands) + # The reshape here is for visualization, the original code is (w,h,2) + return np.resize(data, (int(h), int(w), 2)) + +def readPFM(file): + file = open(file, 'rb') + + color = None + width = None + height = None + scale = None + endian = None + + header = file.readline().rstrip() + if header == b'PF': + color = True + elif header == b'Pf': + color = False + else: + raise Exception('Not a PFM file.') + + dim_match = re.match(rb'^(\d+)\s(\d+)\s$', file.readline()) + if dim_match: + width, height = map(int, dim_match.groups()) + else: + raise Exception('Malformed PFM header.') + + scale = float(file.readline().rstrip()) + if scale < 0: # little-endian + endian = '<' + scale = -scale + else: + endian = '>' # big-endian + + data = np.fromfile(file, endian + 'f') + shape = (height, width, 3) if color else (height, width) + + data = np.reshape(data, shape) + data = np.flipud(data) + return data + +def writeFlow(filename,uv,v=None): + """ Write optical flow to file. + + If v is None, uv is assumed to contain both u and v channels, + stacked in depth. + Original code by Deqing Sun, adapted from Daniel Scharstein. 
+ """ + nBands = 2 + + if v is None: + assert(uv.ndim == 3) + assert(uv.shape[2] == 2) + u = uv[:,:,0] + v = uv[:,:,1] + else: + u = uv + + assert(u.shape == v.shape) + height,width = u.shape + f = open(filename,'wb') + # write the header + f.write(TAG_CHAR) + np.array(width).astype(np.int32).tofile(f) + np.array(height).astype(np.int32).tofile(f) + # arrange into matrix form + tmp = np.zeros((height, width*nBands)) + tmp[:,np.arange(width)*2] = u + tmp[:,np.arange(width)*2 + 1] = v + tmp.astype(np.float32).tofile(f) + f.close() + + +def readFlowKITTI(filename): + flow = cv2.imread(filename, cv2.IMREAD_ANYDEPTH|cv2.IMREAD_COLOR) + flow = flow[:,:,::-1].astype(np.float32) + flow, valid = flow[:, :, :2], flow[:, :, 2] + flow = (flow - 2**15) / 64.0 + return flow, valid + +def readDispKITTI(filename): + disp = cv2.imread(filename, cv2.IMREAD_ANYDEPTH) / 256.0 + valid = disp > 0.0 + flow = np.stack([-disp, np.zeros_like(disp)], -1) + return flow, valid + + +def writeFlowKITTI(filename, uv): + uv = 64.0 * uv + 2**15 + valid = np.ones([uv.shape[0], uv.shape[1], 1]) + uv = np.concatenate([uv, valid], axis=-1).astype(np.uint16) + cv2.imwrite(filename, uv[..., ::-1]) + + +def read_gen(file_name, pil=False): + ext = splitext(file_name)[-1] + if ext == '.png' or ext == '.jpeg' or ext == '.ppm' or ext == '.jpg': + return Image.open(file_name) + elif ext == '.bin' or ext == '.raw': + return np.load(file_name) + elif ext == '.flo': + return readFlow(file_name).astype(np.float32) + elif ext == '.pfm': + flow = readPFM(file_name).astype(np.float32) + if len(flow.shape) == 2: + return flow + else: + return flow[:, :, :-1] + return [] \ No newline at end of file diff --git a/vbench/third_party/RAFT/core/utils_core/utils.py b/vbench/third_party/RAFT/core/utils_core/utils.py new file mode 100644 index 00000000..741ccfe4 --- /dev/null +++ b/vbench/third_party/RAFT/core/utils_core/utils.py @@ -0,0 +1,82 @@ +import torch +import torch.nn.functional as F +import numpy as np +from scipy import interpolate + + +class InputPadder: + """ Pads images such that dimensions are divisible by 8 """ + def __init__(self, dims, mode='sintel'): + self.ht, self.wd = dims[-2:] + pad_ht = (((self.ht // 8) + 1) * 8 - self.ht) % 8 + pad_wd = (((self.wd // 8) + 1) * 8 - self.wd) % 8 + if mode == 'sintel': + self._pad = [pad_wd//2, pad_wd - pad_wd//2, pad_ht//2, pad_ht - pad_ht//2] + else: + self._pad = [pad_wd//2, pad_wd - pad_wd//2, 0, pad_ht] + + def pad(self, *inputs): + return [F.pad(x, self._pad, mode='replicate') for x in inputs] + + def unpad(self,x): + ht, wd = x.shape[-2:] + c = [self._pad[2], ht-self._pad[3], self._pad[0], wd-self._pad[1]] + return x[..., c[0]:c[1], c[2]:c[3]] + +def forward_interpolate(flow): + flow = flow.detach().cpu().numpy() + dx, dy = flow[0], flow[1] + + ht, wd = dx.shape + x0, y0 = np.meshgrid(np.arange(wd), np.arange(ht)) + + x1 = x0 + dx + y1 = y0 + dy + + x1 = x1.reshape(-1) + y1 = y1.reshape(-1) + dx = dx.reshape(-1) + dy = dy.reshape(-1) + + valid = (x1 > 0) & (x1 < wd) & (y1 > 0) & (y1 < ht) + x1 = x1[valid] + y1 = y1[valid] + dx = dx[valid] + dy = dy[valid] + + flow_x = interpolate.griddata( + (x1, y1), dx, (x0, y0), method='nearest', fill_value=0) + + flow_y = interpolate.griddata( + (x1, y1), dy, (x0, y0), method='nearest', fill_value=0) + + flow = np.stack([flow_x, flow_y], axis=0) + return torch.from_numpy(flow).float() + + +def bilinear_sampler(img, coords, mode='bilinear', mask=False): + """ Wrapper for grid_sample, uses pixel coordinates """ + H, W = img.shape[-2:] + xgrid, ygrid = 
coords.split([1,1], dim=-1)
+    xgrid = 2*xgrid/(W-1) - 1
+    ygrid = 2*ygrid/(H-1) - 1
+
+    grid = torch.cat([xgrid, ygrid], dim=-1)
+    img = F.grid_sample(img, grid, align_corners=True)
+
+    if mask:
+        mask = (xgrid > -1) & (ygrid > -1) & (xgrid < 1) & (ygrid < 1)
+        return img, mask.float()
+
+    return img
+
+
+def coords_grid(batch, ht, wd, device):
+    coords = torch.meshgrid(torch.arange(ht, device=device), torch.arange(wd, device=device))
+    coords = torch.stack(coords[::-1], dim=0).float()
+    return coords[None].repeat(batch, 1, 1, 1)
+
+
+def upflow8(flow, mode='bilinear'):
+    new_size = (8 * flow.shape[2], 8 * flow.shape[3])
+    return 8 * F.interpolate(flow, size=new_size, mode=mode, align_corners=True)
diff --git a/vbench/third_party/RAFT/download_models.sh b/vbench/third_party/RAFT/download_models.sh
new file mode 100755
index 00000000..dfd8d473
--- /dev/null
+++ b/vbench/third_party/RAFT/download_models.sh
@@ -0,0 +1,3 @@
+#!/bin/bash
+wget https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip
+unzip models.zip
diff --git a/vbench/third_party/RAFT/evaluate.py b/vbench/third_party/RAFT/evaluate.py
new file mode 100644
index 00000000..431a0f58
--- /dev/null
+++ b/vbench/third_party/RAFT/evaluate.py
@@ -0,0 +1,197 @@
+import sys
+sys.path.append('core')
+
+from PIL import Image
+import argparse
+import os
+import time
+import numpy as np
+import torch
+import torch.nn.functional as F
+import matplotlib.pyplot as plt
+
+import datasets
+from utils import flow_viz
+from utils import frame_utils
+
+from raft import RAFT
+from utils.utils import InputPadder, forward_interpolate
+
+
+@torch.no_grad()
+def create_sintel_submission(model, iters=32, warm_start=False, output_path='sintel_submission'):
+    """ Create submission for the Sintel leaderboard """
+    model.eval()
+    for dstype in ['clean', 'final']:
+        test_dataset = datasets.MpiSintel(split='test', aug_params=None, dstype=dstype)
+
+        flow_prev, sequence_prev = None, None
+        for test_id in range(len(test_dataset)):
+            image1, image2, (sequence, frame) = test_dataset[test_id]
+            if sequence != sequence_prev:
+                flow_prev = None
+
+            padder = InputPadder(image1.shape)
+            image1, image2 = padder.pad(image1[None].cuda(), image2[None].cuda())
+
+            flow_low, flow_pr = model(image1, image2, iters=iters, flow_init=flow_prev, test_mode=True)
+            flow = padder.unpad(flow_pr[0]).permute(1, 2, 0).cpu().numpy()
+
+            if warm_start:
+                flow_prev = forward_interpolate(flow_low[0])[None].cuda()
+
+            output_dir = os.path.join(output_path, dstype, sequence)
+            output_file = os.path.join(output_dir, 'frame%04d.flo' % (frame+1))
+
+            if not os.path.exists(output_dir):
+                os.makedirs(output_dir)
+
+            frame_utils.writeFlow(output_file, flow)
+            sequence_prev = sequence
+
+
+@torch.no_grad()
+def create_kitti_submission(model, iters=24, output_path='kitti_submission'):
+    """ Create submission for the KITTI leaderboard """
+    model.eval()
+    test_dataset = datasets.KITTI(split='testing', aug_params=None)
+
+    if not os.path.exists(output_path):
+        os.makedirs(output_path)
+
+    for test_id in range(len(test_dataset)):
+        image1, image2, (frame_id, ) = test_dataset[test_id]
+        padder = InputPadder(image1.shape, mode='kitti')
+        image1, image2 = padder.pad(image1[None].cuda(), image2[None].cuda())
+
+        _, flow_pr = model(image1, image2, iters=iters, test_mode=True)
+        flow = padder.unpad(flow_pr[0]).permute(1, 2, 0).cpu().numpy()
+
+        output_filename = os.path.join(output_path, frame_id)
+        frame_utils.writeFlowKITTI(output_filename, flow)
+
+
+@torch.no_grad()
+def validate_chairs(model, 
iters=24):
+    """ Perform evaluation on the FlyingChairs (test) split """
+    model.eval()
+    epe_list = []
+
+    val_dataset = datasets.FlyingChairs(split='validation')
+    for val_id in range(len(val_dataset)):
+        image1, image2, flow_gt, _ = val_dataset[val_id]
+        image1 = image1[None].cuda()
+        image2 = image2[None].cuda()
+
+        _, flow_pr = model(image1, image2, iters=iters, test_mode=True)
+        epe = torch.sum((flow_pr[0].cpu() - flow_gt)**2, dim=0).sqrt()
+        epe_list.append(epe.view(-1).numpy())
+
+    epe = np.mean(np.concatenate(epe_list))
+    print("Validation Chairs EPE: %f" % epe)
+    return {'chairs': epe}
+
+
+@torch.no_grad()
+def validate_sintel(model, iters=32):
+    """ Perform validation using the Sintel (train) split """
+    model.eval()
+    results = {}
+    for dstype in ['clean', 'final']:
+        val_dataset = datasets.MpiSintel(split='training', dstype=dstype)
+        epe_list = []
+
+        for val_id in range(len(val_dataset)):
+            image1, image2, flow_gt, _ = val_dataset[val_id]
+            image1 = image1[None].cuda()
+            image2 = image2[None].cuda()
+
+            padder = InputPadder(image1.shape)
+            image1, image2 = padder.pad(image1, image2)
+
+            flow_low, flow_pr = model(image1, image2, iters=iters, test_mode=True)
+            flow = padder.unpad(flow_pr[0]).cpu()
+
+            epe = torch.sum((flow - flow_gt)**2, dim=0).sqrt()
+            epe_list.append(epe.view(-1).numpy())
+
+        epe_all = np.concatenate(epe_list)
+        epe = np.mean(epe_all)
+        px1 = np.mean(epe_all<1)
+        px3 = np.mean(epe_all<3)
+        px5 = np.mean(epe_all<5)
+
+        print("Validation (%s) EPE: %f, 1px: %f, 3px: %f, 5px: %f" % (dstype, epe, px1, px3, px5))
+        results[dstype] = np.mean(epe_list)
+
+    return results
+
+
+@torch.no_grad()
+def validate_kitti(model, iters=24):
+    """ Perform validation using the KITTI-2015 (train) split """
+    model.eval()
+    val_dataset = datasets.KITTI(split='training')
+
+    out_list, epe_list = [], []
+    for val_id in range(len(val_dataset)):
+        image1, image2, flow_gt, valid_gt = val_dataset[val_id]
+        image1 = image1[None].cuda()
+        image2 = image2[None].cuda()
+
+        padder = InputPadder(image1.shape, mode='kitti')
+        image1, image2 = padder.pad(image1, image2)
+
+        flow_low, flow_pr = model(image1, image2, iters=iters, test_mode=True)
+        flow = padder.unpad(flow_pr[0]).cpu()
+
+        epe = torch.sum((flow - flow_gt)**2, dim=0).sqrt()
+        mag = torch.sum(flow_gt**2, dim=0).sqrt()
+
+        epe = epe.view(-1)
+        mag = mag.view(-1)
+        val = valid_gt.view(-1) >= 0.5
+
+        out = ((epe > 3.0) & ((epe/mag) > 0.05)).float()
+        epe_list.append(epe[val].mean().item())
+        out_list.append(out[val].cpu().numpy())
+
+    epe_list = np.array(epe_list)
+    out_list = np.concatenate(out_list)
+
+    epe = np.mean(epe_list)
+    f1 = 100 * np.mean(out_list)
+
+    print("Validation KITTI: %f, %f" % (epe, f1))
+    return {'kitti-epe': epe, 'kitti-f1': f1}
+
+
+if __name__ == '__main__':
+    parser = argparse.ArgumentParser()
+    parser.add_argument('--model', help="restore checkpoint")
+    parser.add_argument('--dataset', help="dataset for evaluation")
+    parser.add_argument('--small', action='store_true', help='use small model')
+    parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision')
+    parser.add_argument('--alternate_corr', action='store_true', help='use efficient correlation implementation')
+    args = parser.parse_args()
+
+    model = torch.nn.DataParallel(RAFT(args))
+    model.load_state_dict(torch.load(args.model))
+
+    model.cuda()
+    model.eval()
+
+    # create_sintel_submission(model.module, warm_start=True)
+    # create_kitti_submission(model.module)
+
+    with torch.no_grad():
+        if args.dataset == 'chairs':
+            
validate_chairs(model.module) + + elif args.dataset == 'sintel': + validate_sintel(model.module) + + elif args.dataset == 'kitti': + validate_kitti(model.module) + + diff --git a/vbench/third_party/amt/LICENSE b/vbench/third_party/amt/LICENSE new file mode 100644 index 00000000..c9cecbde --- /dev/null +++ b/vbench/third_party/amt/LICENSE @@ -0,0 +1,176 @@ +## creative commons + +# Attribution-NonCommercial 4.0 International + +Creative Commons Corporation (“Creative Commons”) is not a law firm and does not provide legal services or legal advice. Distribution of Creative Commons public licenses does not create a lawyer-client or other relationship. Creative Commons makes its licenses and related information available on an “as-is” basis. Creative Commons gives no warranties regarding its licenses, any material licensed under their terms and conditions, or any related information. Creative Commons disclaims all liability for damages resulting from their use to the fullest extent possible. + +### Using Creative Commons Public Licenses + +Creative Commons public licenses provide a standard set of terms and conditions that creators and other rights holders may use to share original works of authorship and other material subject to copyright and certain other rights specified in the public license below. The following considerations are for informational purposes only, are not exhaustive, and do not form part of our licenses. + +* __Considerations for licensors:__ Our public licenses are intended for use by those authorized to give the public permission to use material in ways otherwise restricted by copyright and certain other rights. Our licenses are irrevocable. Licensors should read and understand the terms and conditions of the license they choose before applying it. Licensors should also secure all rights necessary before applying our licenses so that the public can reuse the material as expected. Licensors should clearly mark any material not subject to the license. This includes other CC-licensed material, or material used under an exception or limitation to copyright. [More considerations for licensors](http://wiki.creativecommons.org/Considerations_for_licensors_and_licensees#Considerations_for_licensors). + +* __Considerations for the public:__ By using one of our public licenses, a licensor grants the public permission to use the licensed material under specified terms and conditions. If the licensor’s permission is not necessary for any reason–for example, because of any applicable exception or limitation to copyright–then that use is not regulated by the license. Our licenses grant only permissions under copyright and certain other rights that a licensor has authority to grant. Use of the licensed material may still be restricted for other reasons, including because others have copyright or other rights in the material. A licensor may make special requests, such as asking that all changes be marked or described. Although not required by our licenses, you are encouraged to respect those requests where reasonable. [More considerations for the public](http://wiki.creativecommons.org/Considerations_for_licensors_and_licensees#Considerations_for_licensees). + +## Creative Commons Attribution-NonCommercial 4.0 International Public License + +By exercising the Licensed Rights (defined below), You accept and agree to be bound by the terms and conditions of this Creative Commons Attribution-NonCommercial 4.0 International Public License ("Public License"). 
To the extent this Public License may be interpreted as a contract, You are granted the Licensed Rights in consideration of Your acceptance of these terms and conditions, and the Licensor grants You such rights in consideration of benefits the Licensor receives from making the Licensed Material available under these terms and conditions. + +### Section 1 – Definitions. + +a. __Adapted Material__ means material subject to Copyright and Similar Rights that is derived from or based upon the Licensed Material and in which the Licensed Material is translated, altered, arranged, transformed, or otherwise modified in a manner requiring permission under the Copyright and Similar Rights held by the Licensor. For purposes of this Public License, where the Licensed Material is a musical work, performance, or sound recording, Adapted Material is always produced where the Licensed Material is synched in timed relation with a moving image. + +b. __Adapter's License__ means the license You apply to Your Copyright and Similar Rights in Your contributions to Adapted Material in accordance with the terms and conditions of this Public License. + +c. __Copyright and Similar Rights__ means copyright and/or similar rights closely related to copyright including, without limitation, performance, broadcast, sound recording, and Sui Generis Database Rights, without regard to how the rights are labeled or categorized. For purposes of this Public License, the rights specified in Section 2(b)(1)-(2) are not Copyright and Similar Rights. + +d. __Effective Technological Measures__ means those measures that, in the absence of proper authority, may not be circumvented under laws fulfilling obligations under Article 11 of the WIPO Copyright Treaty adopted on December 20, 1996, and/or similar international agreements. + +e. __Exceptions and Limitations__ means fair use, fair dealing, and/or any other exception or limitation to Copyright and Similar Rights that applies to Your use of the Licensed Material. + +f. __Licensed Material__ means the artistic or literary work, database, or other material to which the Licensor applied this Public License. + +g. __Licensed Rights__ means the rights granted to You subject to the terms and conditions of this Public License, which are limited to all Copyright and Similar Rights that apply to Your use of the Licensed Material and that the Licensor has authority to license. + +h. __Licensor__ means the individual(s) or entity(ies) granting rights under this Public License. + +i. __NonCommercial__ means not primarily intended for or directed towards commercial advantage or monetary compensation. For purposes of this Public License, the exchange of the Licensed Material for other material subject to Copyright and Similar Rights by digital file-sharing or similar means is NonCommercial provided there is no payment of monetary compensation in connection with the exchange. + +j. __Share__ means to provide material to the public by any means or process that requires permission under the Licensed Rights, such as reproduction, public display, public performance, distribution, dissemination, communication, or importation, and to make material available to the public including in ways that members of the public may access the material from a place and at a time individually chosen by them. + +k. 
__Sui Generis Database Rights__ means rights other than copyright resulting from Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, as amended and/or succeeded, as well as other essentially equivalent rights anywhere in the world. + +l. __You__ means the individual or entity exercising the Licensed Rights under this Public License. Your has a corresponding meaning. + +### Section 2 – Scope. + +a. ___License grant.___ + + 1. Subject to the terms and conditions of this Public License, the Licensor hereby grants You a worldwide, royalty-free, non-sublicensable, non-exclusive, irrevocable license to exercise the Licensed Rights in the Licensed Material to: + + A. reproduce and Share the Licensed Material, in whole or in part, for NonCommercial purposes only; and + + B. produce, reproduce, and Share Adapted Material for NonCommercial purposes only. + + 2. __Exceptions and Limitations.__ For the avoidance of doubt, where Exceptions and Limitations apply to Your use, this Public License does not apply, and You do not need to comply with its terms and conditions. + + 3. __Term.__ The term of this Public License is specified in Section 6(a). + + 4. __Media and formats; technical modifications allowed.__ The Licensor authorizes You to exercise the Licensed Rights in all media and formats whether now known or hereafter created, and to make technical modifications necessary to do so. The Licensor waives and/or agrees not to assert any right or authority to forbid You from making technical modifications necessary to exercise the Licensed Rights, including technical modifications necessary to circumvent Effective Technological Measures. For purposes of this Public License, simply making modifications authorized by this Section 2(a)(4) never produces Adapted Material. + + 5. __Downstream recipients.__ + + A. __Offer from the Licensor – Licensed Material.__ Every recipient of the Licensed Material automatically receives an offer from the Licensor to exercise the Licensed Rights under the terms and conditions of this Public License. + + B. __No downstream restrictions.__ You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material. + + 6. __No endorsement.__ Nothing in this Public License constitutes or may be construed as permission to assert or imply that You are, or that Your use of the Licensed Material is, connected with, or sponsored, endorsed, or granted official status by, the Licensor or others designated to receive attribution as provided in Section 3(a)(1)(A)(i). + +b. ___Other rights.___ + + 1. Moral rights, such as the right of integrity, are not licensed under this Public License, nor are publicity, privacy, and/or other similar personality rights; however, to the extent possible, the Licensor waives and/or agrees not to assert any such rights held by the Licensor to the limited extent necessary to allow You to exercise the Licensed Rights, but not otherwise. + + 2. Patent and trademark rights are not licensed under this Public License. + + 3. To the extent possible, the Licensor waives any right to collect royalties from You for the exercise of the Licensed Rights, whether directly or through a collecting society under any voluntary or waivable statutory or compulsory licensing scheme. 
In all other cases the Licensor expressly reserves any right to collect such royalties, including when the Licensed Material is used other than for NonCommercial purposes. + +### Section 3 – License Conditions. + +Your exercise of the Licensed Rights is expressly made subject to the following conditions. + +a. ___Attribution.___ + + 1. If You Share the Licensed Material (including in modified form), You must: + + A. retain the following if it is supplied by the Licensor with the Licensed Material: + + i. identification of the creator(s) of the Licensed Material and any others designated to receive attribution, in any reasonable manner requested by the Licensor (including by pseudonym if designated); + + ii. a copyright notice; + + iii. a notice that refers to this Public License; + + iv. a notice that refers to the disclaimer of warranties; + + v. a URI or hyperlink to the Licensed Material to the extent reasonably practicable; + + B. indicate if You modified the Licensed Material and retain an indication of any previous modifications; and + + C. indicate the Licensed Material is licensed under this Public License, and include the text of, or the URI or hyperlink to, this Public License. + + 2. You may satisfy the conditions in Section 3(a)(1) in any reasonable manner based on the medium, means, and context in which You Share the Licensed Material. For example, it may be reasonable to satisfy the conditions by providing a URI or hyperlink to a resource that includes the required information. + + 3. If requested by the Licensor, You must remove any of the information required by Section 3(a)(1)(A) to the extent reasonably practicable. + + 4. If You Share Adapted Material You produce, the Adapter's License You apply must not prevent recipients of the Adapted Material from complying with this Public License. + +### Section 4 – Sui Generis Database Rights. + +Where the Licensed Rights include Sui Generis Database Rights that apply to Your use of the Licensed Material: + +a. for the avoidance of doubt, Section 2(a)(1) grants You the right to extract, reuse, reproduce, and Share all or a substantial portion of the contents of the database for NonCommercial purposes only; + +b. if You include all or a substantial portion of the database contents in a database in which You have Sui Generis Database Rights, then the database in which You have Sui Generis Database Rights (but not its individual contents) is Adapted Material; and + +c. You must comply with the conditions in Section 3(a) if You Share all or a substantial portion of the contents of the database. + +For the avoidance of doubt, this Section 4 supplements and does not replace Your obligations under this Public License where the Licensed Rights include other Copyright and Similar Rights. + +### Section 5 – Disclaimer of Warranties and Limitation of Liability. + +a. __Unless otherwise separately undertaken by the Licensor, to the extent possible, the Licensor offers the Licensed Material as-is and as-available, and makes no representations or warranties of any kind concerning the Licensed Material, whether express, implied, statutory, or other. This includes, without limitation, warranties of title, merchantability, fitness for a particular purpose, non-infringement, absence of latent or other defects, accuracy, or the presence or absence of errors, whether or not known or discoverable. Where disclaimers of warranties are not allowed in full or in part, this disclaimer may not apply to You.__ + +b. 
__To the extent possible, in no event will the Licensor be liable to You on any legal theory (including, without limitation, negligence) or otherwise for any direct, special, indirect, incidental, consequential, punitive, exemplary, or other losses, costs, expenses, or damages arising out of this Public License or use of the Licensed Material, even if the Licensor has been advised of the possibility of such losses, costs, expenses, or damages. Where a limitation of liability is not allowed in full or in part, this limitation may not apply to You.__ + +c. The disclaimer of warranties and limitation of liability provided above shall be interpreted in a manner that, to the extent possible, most closely approximates an absolute disclaimer and waiver of all liability. + +### Section 6 – Term and Termination. + +a. This Public License applies for the term of the Copyright and Similar Rights licensed here. However, if You fail to comply with this Public License, then Your rights under this Public License terminate automatically. + +b. Where Your right to use the Licensed Material has terminated under Section 6(a), it reinstates: + + 1. automatically as of the date the violation is cured, provided it is cured within 30 days of Your discovery of the violation; or + + 2. upon express reinstatement by the Licensor. + + For the avoidance of doubt, this Section 6(b) does not affect any right the Licensor may have to seek remedies for Your violations of this Public License. + +c. For the avoidance of doubt, the Licensor may also offer the Licensed Material under separate terms or conditions or stop distributing the Licensed Material at any time; however, doing so will not terminate this Public License. + +d. Sections 1, 5, 6, 7, and 8 survive termination of this Public License. + +### Section 7 – Other Terms and Conditions. + +a. The Licensor shall not be bound by any additional or different terms or conditions communicated by You unless expressly agreed. + +b. Any arrangements, understandings, or agreements regarding the Licensed Material not stated herein are separate from and independent of the terms and conditions of this Public License. + +### Section 8 – Interpretation. + +a. For the avoidance of doubt, this Public License does not, and shall not be interpreted to, reduce, limit, restrict, or impose conditions on any use of the Licensed Material that could lawfully be made without permission under this Public License. + +b. To the extent possible, if any provision of this Public License is deemed unenforceable, it shall be automatically reformed to the minimum extent necessary to make it enforceable. If the provision cannot be reformed, it shall be severed from this Public License without affecting the enforceability of the remaining terms and conditions. + +c. No term or condition of this Public License will be waived and no failure to comply consented to unless expressly agreed to by the Licensor. + +d. Nothing in this Public License constitutes or may be interpreted as a limitation upon, or waiver of, any privileges and immunities that apply to the Licensor or You, including from the legal processes of any jurisdiction or authority. + +> Creative Commons is not a party to its public licenses. 
Notwithstanding, Creative Commons may elect to apply one of its public licenses to material it publishes and in those instances will be considered the “Licensor.” Except for the limited purpose of indicating that material is shared under a Creative Commons public license or as otherwise permitted by the Creative Commons policies published at [creativecommons.org/policies](http://creativecommons.org/policies), Creative Commons does not authorize the use of the trademark “Creative Commons” or any other trademark or logo of Creative Commons without its prior written consent including, without limitation, in connection with any unauthorized modifications to any of its public licenses or any other arrangements, understandings, or agreements concerning use of licensed material. For the avoidance of doubt, this paragraph does not form part of the public licenses. +> +> Creative Commons may be contacted at creativecommons.org + + +### Commercial licensing opportunities +For commercial uses of the Model & Software, please send email to cmm[AT]nankai.edu.cn + +Citation: + +@inproceedings{licvpr23amt, + title = {AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation}, + author = {Li, Zhen and Zhu, Zuo-Liang and Han, Ling-Hao and Hou, Qibin and Guo, Chun-Le and Cheng, Ming-Ming}, + booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)}, + year = {2023} +} + +Copyright (c) 2023 MCG-NKU \ No newline at end of file diff --git a/vbench/third_party/amt/README.md b/vbench/third_party/amt/README.md new file mode 100755 index 00000000..2f322431 --- /dev/null +++ b/vbench/third_party/amt/README.md @@ -0,0 +1,167 @@ +# AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation + + +This repository contains the official implementation of the following paper: +> **AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation**
      +> [Zhen Li](https://paper99.github.io/)\*, [Zuo-Liang Zhu](https://nk-cs-zzl.github.io/)\*, [Ling-Hao Han](https://scholar.google.com/citations?user=0ooNdgUAAAAJ&hl=en), [Qibin Hou](https://scholar.google.com/citations?hl=en&user=fF8OFV8AAAAJ&view_op=list_works), [Chun-Le Guo](https://scholar.google.com/citations?hl=en&user=RZLYwR0AAAAJ), [Ming-Ming Cheng](https://mmcheng.net/cmm)
      +> (\* denotes equal contribution)
      +> Nankai University
      +> In CVPR 2023
+
+[[Paper](https://arxiv.org/abs/2304.09790)]
+[[Project Page](https://nk-cs-zzl.github.io/projects/amt/index.html)]
+[[Web demos](#web-demos)]
+[Video]
+
+AMT is a **lightweight, fast, and accurate** algorithm for Frame Interpolation.
+It aims to provide practical solutions for **video generation** from **a few given frames (at least two frames)**.
+
+![Demo gif](assets/amt_demo.gif)
+* More examples can be found in our [project page](https://nk-cs-zzl.github.io/projects/amt/index.html).
+
+## Web demos
+Integrated into [Hugging Face Spaces 🤗](https://huggingface.co/spaces) using [Gradio](https://github.com/gradio-app/gradio). Try out the Web Demo: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/NKU-AMT/AMT)
+
+Try AMT to interpolate between two or more images at [![PyTTI-Tools:FILM](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1IeVO5BmLouhRh6fL2z_y18kgubotoaBq?usp=sharing)
+
+
+## Change Log
+- **Apr 20, 2023**: Our code is publicly available.
+
+
+## Method Overview
+![pipeline](https://user-images.githubusercontent.com/21050959/229420451-65951bd0-732c-4f09-9121-f291a3862d6e.png)
+
+For technical details, please refer to the [method.md](docs/method.md) file, or read the full report on [arXiv](https://arxiv.org/abs/2304.09790).
+
+## Dependencies and Installation
+1. Clone Repo
+
+   ```bash
+   git clone https://github.com/MCG-NKU/AMT.git
+   ```
+
+2. Create Conda Environment and Install Dependencies
+
+   ```bash
+   conda env create -f environment.yaml
+   conda activate amt
+   ```
+3. Download pretrained models for demos from [Pretrained Models](#pretrained-models) and place them in the `pretrained` folder
+
+## Quick Demo
+
+**Note that the selected pretrained model (`[CKPT_PATH]`) needs to match the config file (`[CFG]`).**
+
+ > Creating a video demo: increasing $n$ will slow down the motion in the video. (With $m$ input frames, `[N_ITER]` $=n$ corresponds to $2^n\times (m-1)+1$ output frames; e.g., two input frames with `-n 6` yield $2^6\times 1+1=65$ output frames.)
+
+
+ ```bash
+ python demos/demo_2x.py -c [CFG] -p [CKPT] -n [N_ITER] -i [INPUT] -o [OUT_PATH] -r [FRAME_RATE]
+ # e.g. [INPUT]
+ # -i could be a video / a regular expression / a folder containing multiple images
+ # -i demo.mp4 (video)/img_*.png (regular expression)/img0.png img1.png (images)/demo_input (folder)

 # e.g. a simple usage
 python demos/demo_2x.py -c cfgs/AMT-S.yaml -p pretrained/amt-s.pth -n 6 -i assets/quick_demo/img0.png assets/quick_demo/img1.png

 ```

 Note: Please enable `--save_images` to save the output images (saving slows down when there are many output images).

 Input types supported: `a video` / `a regular expression` / `multiple images` / `a folder containing input frames`.

 Results are in the `[OUT_PATH]` (default is `results/2x`) folder.

+## Pretrained Models
+

+| Model | :link: Download Links | Config file | Trained on | Arbitrary/Fixed |
+| :---: | :-------------------: | :---------: | :--------: | :-------------: |
+| AMT-S | [Google Drive] [Baidu Cloud] [Hugging Face] | [cfgs/AMT-S] | Vimeo90k | Fixed |
+| AMT-L | [Google Drive] [Baidu Cloud] [Hugging Face] | [cfgs/AMT-L] | Vimeo90k | Fixed |
+| AMT-G | [Google Drive] [Baidu Cloud] [Hugging Face] | [cfgs/AMT-G] | Vimeo90k | Fixed |
+| AMT-S | [Google Drive] [Baidu Cloud] [Hugging Face] | [cfgs/AMT-S_gopro] | GoPro | Arbitrary |
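+
+For reference, here is a minimal sketch of how a downloaded checkpoint is paired with its matching config, mirroring the pattern used by the scripts under `benchmarks/` (the `cfgs/AMT-S.yaml` / `pretrained/amt-s.pth` pair below is only an example; substitute the files you downloaded):
+
+```python
+import sys
+sys.path.append('.')  # run from the repo root, as the benchmark scripts do
+
+import torch
+from omegaconf import OmegaConf
+from utils.build_utils import build_from_cfg
+
+# Build the network from the config that matches the checkpoint (see table above).
+network_cfg = OmegaConf.load('cfgs/AMT-S.yaml').network
+model = build_from_cfg(network_cfg)
+
+# Restore the pretrained weights placed in the `pretrained` folder.
+ckpt = torch.load('pretrained/amt-s.pth')
+model.load_state_dict(ckpt['state_dict'])
+
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+model = model.to(device)
+model.eval()
+```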
+
+## Training and Evaluation
+
+Please refer to [develop.md](docs/develop.md) to learn how to benchmark AMT and how to train a new AMT model from scratch.
+
+
+## Citation
+ If you find our repo useful for your research, please consider citing our paper:
+
+ ```bibtex
+ @inproceedings{licvpr23amt,
+    title={AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation},
+    author={Li, Zhen and Zhu, Zuo-Liang and Han, Ling-Hao and Hou, Qibin and Guo, Chun-Le and Cheng, Ming-Ming},
+    booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
+    year={2023}
+ }
+ ```
+
+
+## License
+This code is licensed under the [Creative Commons Attribution-NonCommercial 4.0 International](https://creativecommons.org/licenses/by-nc/4.0/) for non-commercial use only.
+Please note that any commercial use of this code requires formal permission prior to use.
+
+## Contact
+
+For technical questions, please contact `zhenli1031[AT]gmail.com` and `nkuzhuzl[AT]gmail.com`.
+
+For commercial licensing, please contact `cmm[AT]nankai.edu.cn`.
+
+## Acknowledgement
+
+We thank Jia-Wen Xiao, Zheng-Peng Duan, Rui-Qi Wu, and Xin Jin for proofreading.
+We thank [Zhewei Huang](https://github.com/hzwer) for his suggestions.
+
+Here are some great resources we benefit from:
+
+- [IFRNet](https://github.com/ltkong218/IFRNet) and [RIFE](https://github.com/megvii-research/ECCV2022-RIFE) for data processing, benchmarking, and loss designs.
+- [RAFT](https://github.com/princeton-vl/RAFT), [M2M-VFI](https://github.com/feinanshan/M2M_VFI), and [GMFlow](https://github.com/haofeixu/gmflow) for inspiration.
+- [FILM](https://github.com/google-research/frame-interpolation) for Web demo reference.
+
+
+**If you develop/use AMT in your projects, please let us know. We will list your projects in this repository.**
+
+We also thank all of our contributors.
+
      + + + diff --git a/vbench/third_party/amt/benchmarks/__init__.py b/vbench/third_party/amt/benchmarks/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/benchmarks/adobe240.py b/vbench/third_party/amt/benchmarks/adobe240.py new file mode 100755 index 00000000..2faf0989 --- /dev/null +++ b/vbench/third_party/amt/benchmarks/adobe240.py @@ -0,0 +1,56 @@ +import sys +import tqdm +import torch +import argparse +import numpy as np +from omegaconf import OmegaConf + +sys.path.append('.') +from utils.build_utils import build_from_cfg +from datasets.adobe_datasets import Adobe240_Dataset +from metrics.psnr_ssim import calculate_psnr, calculate_ssim + +parser = argparse.ArgumentParser( + prog = 'AMT', + description = 'Adobe240 evaluation', + ) +parser.add_argument('-c', '--config', default='cfgs/AMT-S_gopro.yaml') +parser.add_argument('-p', '--ckpt', default='pretrained/gopro_amt-s.pth',) +parser.add_argument('-r', '--root', default='data/Adobe240/test_frames',) +args = parser.parse_args() + +device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') +cfg_path = args.config +ckpt_path = args.ckpt +root = args.root + +network_cfg = OmegaConf.load(cfg_path).network +network_name = network_cfg.name +model = build_from_cfg(network_cfg) +ckpt = torch.load(ckpt_path) +model.load_state_dict(ckpt['state_dict']) +model = model.to(device) +model.eval() + +dataset = Adobe240_Dataset(dataset_dir=root, augment=False) + +psnr_list = [] +ssim_list = [] +pbar = tqdm.tqdm(dataset, total=len(dataset)) +for data in pbar: + input_dict = {} + for k, v in data.items(): + input_dict[k] = v.to(device).unsqueeze(0) + with torch.no_grad(): + imgt_pred = model(**input_dict)['imgt_pred'] + psnr = calculate_psnr(imgt_pred, input_dict['imgt']) + ssim = calculate_ssim(imgt_pred, input_dict['imgt']) + psnr_list.append(psnr) + ssim_list.append(ssim) + avg_psnr = np.mean(psnr_list) + avg_ssim = np.mean(ssim_list) + desc_str = f'[{network_name}/Adobe240] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}' + pbar.set_description_str(desc_str) + + + diff --git a/vbench/third_party/amt/benchmarks/gopro.py b/vbench/third_party/amt/benchmarks/gopro.py new file mode 100755 index 00000000..5d049a58 --- /dev/null +++ b/vbench/third_party/amt/benchmarks/gopro.py @@ -0,0 +1,55 @@ +import sys +import tqdm +import torch +import argparse +import numpy as np +from omegaconf import OmegaConf + +sys.path.append('.') +from utils.build_utils import build_from_cfg +from datasets.gopro_datasets import GoPro_Test_Dataset +from metrics.psnr_ssim import calculate_psnr, calculate_ssim + +parser = argparse.ArgumentParser( + prog = 'AMT', + description = 'GOPRO evaluation', + ) +parser.add_argument('-c', '--config', default='cfgs/AMT-S_gopro.yaml') +parser.add_argument('-p', '--ckpt', default='pretrained/gopro_amt-s.pth',) +parser.add_argument('-r', '--root', default='data/GOPRO',) +args = parser.parse_args() + +device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') +cfg_path = args.config +ckpt_path = args.ckpt +root = args.root + +network_cfg = OmegaConf.load(cfg_path).network +network_name = network_cfg.name +model = build_from_cfg(network_cfg) +ckpt = torch.load(ckpt_path) +model.load_state_dict(ckpt['state_dict']) +model = model.to(device) +model.eval() + +dataset = GoPro_Test_Dataset(dataset_dir=root) + +psnr_list = [] +ssim_list = [] +pbar = tqdm.tqdm(dataset, total=len(dataset)) +for data in pbar: + input_dict = {} + for k, v in data.items(): + input_dict[k] = 
v.to(device).unsqueeze(0)
+    with torch.no_grad():
+        imgt_pred = model(**input_dict)['imgt_pred']
+        psnr = calculate_psnr(imgt_pred, input_dict['imgt'])
+        ssim = calculate_ssim(imgt_pred, input_dict['imgt'])
+    psnr_list.append(psnr)
+    ssim_list.append(ssim)
+    avg_psnr = np.mean(psnr_list)
+    avg_ssim = np.mean(ssim_list)
+    desc_str = f'[{network_name}/GOPRO] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}'
+    pbar.set_description_str(desc_str)
+
+
diff --git a/vbench/third_party/amt/benchmarks/snu_film.py b/vbench/third_party/amt/benchmarks/snu_film.py
new file mode 100755
index 00000000..6ab7d1a9
--- /dev/null
+++ b/vbench/third_party/amt/benchmarks/snu_film.py
@@ -0,0 +1,70 @@
+import os
+import sys
+import tqdm
+import torch
+import argparse
+import numpy as np
+import os.path as osp
+from omegaconf import OmegaConf
+
+sys.path.append('.')
+from utils.build_utils import build_from_cfg
+from metrics.psnr_ssim import calculate_psnr, calculate_ssim
+from utils.utils import InputPadder, read, img2tensor
+
+
+def parse_path(path):
+    path_list = path.split('/')
+    new_path = osp.join(*path_list[-3:])
+    return new_path
+
+parser = argparse.ArgumentParser(
+                prog = 'AMT',
+                description = 'SNU-FILM evaluation',
+                )
+parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml')
+parser.add_argument('-p', '--ckpt', default='pretrained/amt-s.pth')
+parser.add_argument('-r', '--root', default='data/SNU_FILM')
+args = parser.parse_args()
+
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+cfg_path = args.config
+ckpt_path = args.ckpt
+root = args.root
+
+network_cfg = OmegaConf.load(cfg_path).network
+network_name = network_cfg.name
+model = build_from_cfg(network_cfg)
+ckpt = torch.load(ckpt_path)
+model.load_state_dict(ckpt['state_dict'])
+model = model.to(device)
+model.eval()
+
+divisor = 20; scale_factor = 0.8
+splits = ['easy', 'medium', 'hard', 'extreme']
+for split in splits:
+    with open(os.path.join(root, f'test-{split}.txt'), "r") as fr:
+        file_list = [l.strip().split(' ') for l in fr.readlines()]
+    pbar = tqdm.tqdm(file_list, total=len(file_list))
+
+    psnr_list = []; ssim_list = []
+    for name in pbar:
+        img0 = img2tensor(read(osp.join(root, parse_path(name[0])))).to(device)
+        imgt = img2tensor(read(osp.join(root, parse_path(name[1])))).to(device)
+        img1 = img2tensor(read(osp.join(root, parse_path(name[2])))).to(device)
+        padder = InputPadder(img0.shape, divisor)
+        img0, img1 = padder.pad(img0, img1)
+
+        embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(device)
+        imgt_pred = model(img0, img1, embt, scale_factor=scale_factor, eval=True)['imgt_pred']
+        imgt_pred = padder.unpad(imgt_pred)
+
+        psnr = calculate_psnr(imgt_pred, imgt).detach().cpu().numpy()
+        ssim = calculate_ssim(imgt_pred, imgt).detach().cpu().numpy()
+
+        psnr_list.append(psnr)
+        ssim_list.append(ssim)
+        avg_psnr = np.mean(psnr_list)
+        avg_ssim = np.mean(ssim_list)
+        desc_str = f'[{network_name}/SNU-FILM] [{split}] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}'
+        pbar.set_description_str(desc_str)
diff --git a/vbench/third_party/amt/benchmarks/speed_parameters.py b/vbench/third_party/amt/benchmarks/speed_parameters.py
new file mode 100755
index 00000000..b5b23309
--- /dev/null
+++ b/vbench/third_party/amt/benchmarks/speed_parameters.py
@@ -0,0 +1,38 @@
+import sys
+import time
+import torch
+import argparse
+from omegaconf import OmegaConf
+
+sys.path.append('.')
+from utils.build_utils import build_from_cfg
+
+parser = argparse.ArgumentParser(
+                prog = 'AMT',
+                description = 'Speed&parameter benchmark',
+                )
+parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml')
+args = parser.parse_args()
+
+cfg_path = args.config
+network_cfg = OmegaConf.load(cfg_path).network
+model = build_from_cfg(network_cfg)
+model = model.cuda()
+model.eval()
+
+img0 = torch.randn(1, 3, 256, 448).cuda()
+img1 = torch.randn(1, 3, 256, 448).cuda()
+embt = torch.tensor(1/2).float().view(1, 1, 1, 1).cuda()
+
+with torch.no_grad():
+    for i in range(100):
+        out = model(img0, img1, embt, eval=True)
+    torch.cuda.synchronize()
+    time_stamp = time.time()
+    for i in range(1000):
+        out = model(img0, img1, embt, eval=True)
+    torch.cuda.synchronize()
+    print('Time: {:.5f}s'.format((time.time() - time_stamp) / 1000))
+
+total = sum([param.nelement() for param in model.parameters()])
+print('Parameters: {:.2f}M'.format(total / 1e6))
diff --git a/vbench/third_party/amt/benchmarks/ucf101.py b/vbench/third_party/amt/benchmarks/ucf101.py
new file mode 100755
index 00000000..7d29b0e7
--- /dev/null
+++ b/vbench/third_party/amt/benchmarks/ucf101.py
@@ -0,0 +1,59 @@
+import os
+import sys
+import tqdm
+import torch
+import argparse
+import numpy as np
+import os.path as osp
+from omegaconf import OmegaConf
+
+sys.path.append('.')
+from utils.utils import read, img2tensor
+from utils.build_utils import build_from_cfg
+from metrics.psnr_ssim import calculate_psnr, calculate_ssim
+
+parser = argparse.ArgumentParser(
+        prog = 'AMT',
+        description = 'UCF101 evaluation',
+        )
+parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml')
+parser.add_argument('-p', '--ckpt', default='pretrained/amt-s.pth')
+parser.add_argument('-r', '--root', default='data/ucf101_interp_ours')
+args = parser.parse_args()
+
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+cfg_path = args.config
+ckpt_path = args.ckpt
+root = args.root
+
+network_cfg = OmegaConf.load(cfg_path).network
+network_name = network_cfg.name
+model = build_from_cfg(network_cfg)
+ckpt = torch.load(ckpt_path)
+model.load_state_dict(ckpt['state_dict'])
+model = model.to(device)
+model.eval()
+
+dirs = sorted(os.listdir(root))
+psnr_list = []
+ssim_list = []
+pbar = tqdm.tqdm(dirs, total=len(dirs))
+for d in pbar:
+    dir_path = osp.join(root, d)
+    I0 = img2tensor(read(osp.join(dir_path, 'frame_00.png'))).to(device)
+    I1 = img2tensor(read(osp.join(dir_path, 'frame_01_gt.png'))).to(device)
+    I2 = img2tensor(read(osp.join(dir_path, 'frame_02.png'))).to(device)
+    embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(device)
+
+    I1_pred = model(I0, I2, embt, eval=True)['imgt_pred']
+
+    psnr = calculate_psnr(I1_pred, I1).detach().cpu().numpy()
+    ssim = calculate_ssim(I1_pred, I1).detach().cpu().numpy()
+
+    psnr_list.append(psnr)
+    ssim_list.append(ssim)
+
+    avg_psnr = np.mean(psnr_list)
+    avg_ssim = np.mean(ssim_list)
+    desc_str = f'[{network_name}/UCF101] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}'
+    pbar.set_description_str(desc_str)
\ No newline at end of file
diff --git a/vbench/third_party/amt/benchmarks/vimeo90k.py b/vbench/third_party/amt/benchmarks/vimeo90k.py
new file mode 100755
index 00000000..c598e8c8
--- /dev/null
+++ b/vbench/third_party/amt/benchmarks/vimeo90k.py
@@ -0,0 +1,65 @@
+import sys
+import tqdm
+import torch
+import argparse
+import numpy as np
+import os.path as osp
+from omegaconf import OmegaConf
+
+sys.path.append('.')
+from utils.utils import read, img2tensor
+from utils.build_utils import build_from_cfg
+from metrics.psnr_ssim import calculate_psnr, calculate_ssim
+
+parser = argparse.ArgumentParser(
+        prog = 'AMT',
+        description = 
'Vimeo90K evaluation',
+        )
+parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml')
+parser.add_argument('-p', '--ckpt', default='pretrained/amt-s.pth',)
+parser.add_argument('-r', '--root', default='data/vimeo_triplet',)
+args = parser.parse_args()
+
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+cfg_path = args.config
+ckpt_path = args.ckpt
+root = args.root
+
+network_cfg = OmegaConf.load(cfg_path).network
+network_name = network_cfg.name
+model = build_from_cfg(network_cfg)
+ckpt = torch.load(ckpt_path)
+model.load_state_dict(ckpt['state_dict'])
+model = model.to(device)
+model.eval()
+
+with open(osp.join(root, 'tri_testlist.txt'), 'r') as fr:
+    file_list = fr.readlines()
+
+psnr_list = []
+ssim_list = []
+
+pbar = tqdm.tqdm(file_list, total=len(file_list))
+for name in pbar:
+    name = str(name).strip()
+    if(len(name) <= 1):
+        continue
+    dir_path = osp.join(root, 'sequences', name)
+    I0 = img2tensor(read(osp.join(dir_path, 'im1.png'))).to(device)
+    I1 = img2tensor(read(osp.join(dir_path, 'im2.png'))).to(device)
+    I2 = img2tensor(read(osp.join(dir_path, 'im3.png'))).to(device)
+    embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(device)
+
+    I1_pred = model(I0, I2, embt,
+                    scale_factor=1.0, eval=True)['imgt_pred']
+
+    psnr = calculate_psnr(I1_pred, I1).detach().cpu().numpy()
+    ssim = calculate_ssim(I1_pred, I1).detach().cpu().numpy()
+
+    psnr_list.append(psnr)
+    ssim_list.append(ssim)
+    avg_psnr = np.mean(psnr_list)
+    avg_ssim = np.mean(ssim_list)
+    desc_str = f'[{network_name}/Vimeo90K] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}'
+    pbar.set_description_str(desc_str)
+
diff --git a/vbench/third_party/amt/benchmarks/vimeo90k_tta.py b/vbench/third_party/amt/benchmarks/vimeo90k_tta.py
new file mode 100755
index 00000000..ebadad1f
--- /dev/null
+++ b/vbench/third_party/amt/benchmarks/vimeo90k_tta.py
@@ -0,0 +1,67 @@
+import sys
+import tqdm
+import torch
+import argparse
+import numpy as np
+import os.path as osp
+from omegaconf import OmegaConf
+
+sys.path.append('.')
+from utils.utils import read, img2tensor
+from utils.build_utils import build_from_cfg
+from metrics.psnr_ssim import calculate_psnr, calculate_ssim
+
+parser = argparse.ArgumentParser(
+        prog = 'AMT',
+        description = 'Vimeo90K evaluation (with Test-Time Augmentation)',
+        )
+parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml')
+parser.add_argument('-p', '--ckpt', default='pretrained/amt-s.pth',)
+parser.add_argument('-r', '--root', default='data/vimeo_triplet',)
+args = parser.parse_args()
+
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+cfg_path = args.config
+ckpt_path = args.ckpt
+root = args.root
+
+network_cfg = OmegaConf.load(cfg_path).network
+network_name = network_cfg.name
+model = build_from_cfg(network_cfg)
+ckpt = torch.load(ckpt_path)
+model.load_state_dict(ckpt['state_dict'])
+model = model.to(device)
+model.eval()
+
+with open(osp.join(root, 'tri_testlist.txt'), 'r') as fr:
+    file_list = fr.readlines()
+
+psnr_list = []
+ssim_list = []
+
+pbar = tqdm.tqdm(file_list, total=len(file_list))
+for name in pbar:
+    name = str(name).strip()
+    if(len(name) <= 1):
+        continue
+    dir_path = osp.join(root, 'sequences', name)
+    I0 = img2tensor(read(osp.join(dir_path, 'im1.png'))).to(device)
+    I1 = img2tensor(read(osp.join(dir_path, 'im2.png'))).to(device)
+    I2 = img2tensor(read(osp.join(dir_path, 'im3.png'))).to(device)
+    embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(device)
+
+    I1_pred1 = model(I0, I2, embt,
+                     scale_factor=1.0, 
eval=True)['imgt_pred'] + I1_pred2 = model(torch.flip(I0, [2]), torch.flip(I2, [2]), embt, + scale_factor=1.0, eval=True)['imgt_pred'] + I1_pred = I1_pred1 / 2 + torch.flip(I1_pred2, [2]) / 2 + psnr = calculate_psnr(I1_pred, I1).detach().cpu().numpy() + ssim = calculate_ssim(I1_pred, I1).detach().cpu().numpy() + + psnr_list.append(psnr) + ssim_list.append(ssim) + avg_psnr = np.mean(psnr_list) + avg_ssim = np.mean(ssim_list) + desc_str = f'[{network_name}/Vimeo90K] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}' + pbar.set_description_str(desc_str) + diff --git a/vbench/third_party/amt/benchmarks/xiph.py b/vbench/third_party/amt/benchmarks/xiph.py new file mode 100755 index 00000000..a8bd7327 --- /dev/null +++ b/vbench/third_party/amt/benchmarks/xiph.py @@ -0,0 +1,104 @@ +import os +import sys +import cv2 +import tqdm +import glob +import torch +import argparse +import numpy as np +import os.path as osp +from omegaconf import OmegaConf + +sys.path.append('.') +from utils.utils import InputPadder, read, img2tensor +from utils.build_utils import build_from_cfg +from metrics.psnr_ssim import calculate_psnr, calculate_ssim + +parser = argparse.ArgumentParser( + prog = 'AMT', + description = 'Xiph evaluation', + ) +parser.add_argument('-c', '--config', default='cfgs/AMT-S.yaml') +parser.add_argument('-p', '--ckpt', default='pretrained/amt-s.pth') +parser.add_argument('-r', '--root', default='data/xiph') +args = parser.parse_args() + +device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') +cfg_path = args.config +ckpt_path = args.ckpt +root = args.root + +network_cfg = OmegaConf.load(cfg_path).network +network_name = network_cfg.name +model = build_from_cfg(network_cfg) +ckpt = torch.load(ckpt_path) +model.load_state_dict(ckpt['state_dict'], False) +model = model.to(device) +model.eval() + +############################################# Prepare Dataset ############################################# +download_links = [ + 'https://media.xiph.org/video/derf/ElFuente/Netflix_BoxingPractice_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_Crosswalk_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/Chimera/Netflix_DrivingPOV_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_FoodMarket_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_FoodMarket2_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_RitualDance_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_SquareAndTimelapse_4096x2160_60fps_10bit_420.y4m', + 'https://media.xiph.org/video/derf/ElFuente/Netflix_Tango_4096x2160_60fps_10bit_420.y4m', +] +file_list = ['BoxingPractice', 'Crosswalk', 'DrivingPOV', 'FoodMarket', 'FoodMarket2', 'RitualDance', + 'SquareAndTimelapse', 'Tango'] + +for file_name, link in zip(file_list, download_links): + data_dir = osp.join(root, file_name) + if osp.exists(data_dir) is False: + os.makedirs(data_dir) + if len(glob.glob(f'{data_dir}/*.png')) < 100: + os.system(f'ffmpeg -i {link} -pix_fmt rgb24 -vframes 100 {data_dir}/%03d.png') +############################################### Prepare End ############################################### + + +divisor = 32; scale_factor = 0.5 +for category in ['resized-2k', 'cropped-4k']: + psnr_list = [] + ssim_list = [] + pbar = tqdm.tqdm(file_list, total=len(file_list)) + for flie_name in pbar: + dir_name = osp.join(root, flie_name) + for intFrame in range(2, 99, 2): + img0 = 
read(f'{dir_name}/{intFrame - 1:03d}.png')
+            img1 = read(f'{dir_name}/{intFrame + 1:03d}.png')
+            imgt = read(f'{dir_name}/{intFrame:03d}.png')
+
+            if category == 'resized-2k':
+                img0 = cv2.resize(src=img0, dsize=(2048, 1080), fx=0.0, fy=0.0, interpolation=cv2.INTER_AREA)
+                img1 = cv2.resize(src=img1, dsize=(2048, 1080), fx=0.0, fy=0.0, interpolation=cv2.INTER_AREA)
+                imgt = cv2.resize(src=imgt, dsize=(2048, 1080), fx=0.0, fy=0.0, interpolation=cv2.INTER_AREA)
+
+            elif category == 'cropped-4k':
+                img0 = img0[540:-540, 1024:-1024, :]
+                img1 = img1[540:-540, 1024:-1024, :]
+                imgt = imgt[540:-540, 1024:-1024, :]
+            img0 = img2tensor(img0).to(device)
+            imgt = img2tensor(imgt).to(device)
+            img1 = img2tensor(img1).to(device)
+            embt = torch.tensor(1/2).float().view(1, 1, 1, 1).to(device)
+
+            padder = InputPadder(img0.shape, divisor)
+            img0, img1 = padder.pad(img0, img1)
+
+            with torch.no_grad():
+                imgt_pred = model(img0, img1, embt, scale_factor=scale_factor, eval=True)['imgt_pred']
+                imgt_pred = padder.unpad(imgt_pred)
+
+            psnr = calculate_psnr(imgt_pred, imgt)
+            ssim = calculate_ssim(imgt_pred, imgt)
+
+            psnr_list.append(psnr)
+            ssim_list.append(ssim)
+            avg_psnr = np.mean(psnr_list)
+            avg_ssim = np.mean(ssim_list)
+            desc_str = f'[{network_name}/Xiph] [{category}/{flie_name}] psnr: {avg_psnr:.02f}, ssim: {avg_ssim:.04f}'
+
+            pbar.set_description_str(desc_str)
\ No newline at end of file
diff --git a/vbench/third_party/amt/cfgs/AMT-G.yaml b/vbench/third_party/amt/cfgs/AMT-G.yaml
new file mode 100755
index 00000000..7b3bb39b
--- /dev/null
+++ b/vbench/third_party/amt/cfgs/AMT-G.yaml
@@ -0,0 +1,62 @@
+exp_name: floloss1e-2_300epoch_bs24_lr1p5e-4
+seed: 2023
+epochs: 300
+distributed: true
+lr: 1.5e-4
+lr_min: 2e-5
+weight_decay: 0.0
+resume_state: null
+save_dir: work_dir
+eval_interval: 1
+
+network:
+  name: networks.AMT-G.Model
+  params:
+    corr_radius: 3
+    corr_lvls: 4
+    num_flows: 5
+data:
+  train:
+    name: datasets.vimeo_datasets.Vimeo90K_Train_Dataset
+    params:
+      dataset_dir: data/vimeo_triplet
+  val:
+    name: datasets.vimeo_datasets.Vimeo90K_Test_Dataset
+    params:
+      dataset_dir: data/vimeo_triplet
+  train_loader:
+    batch_size: 24
+    num_workers: 12
+  val_loader:
+    batch_size: 24
+    num_workers: 3
+
+logger:
+  use_wandb: true
+  resume_id: null
+
+losses:
+  - {
+      name: losses.loss.CharbonnierLoss,
+      nickname: l_rec,
+      params: {
+        loss_weight: 1.0,
+        keys: [imgt_pred, imgt]
+      }
+    }
+  - {
+      name: losses.loss.TernaryLoss,
+      nickname: l_ter,
+      params: {
+        loss_weight: 1.0,
+        keys: [imgt_pred, imgt]
+      }
+    }
+  - {
+      name: losses.loss.MultipleFlowLoss,
+      nickname: l_flo,
+      params: {
+        loss_weight: 0.005,
+        keys: [flow0_pred, flow1_pred, flow]
+      }
+    }
diff --git a/vbench/third_party/amt/cfgs/AMT-L.yaml b/vbench/third_party/amt/cfgs/AMT-L.yaml
new file mode 100755
index 00000000..0cd60ce8
--- /dev/null
+++ b/vbench/third_party/amt/cfgs/AMT-L.yaml
@@ -0,0 +1,62 @@
+exp_name: floloss1e-2_300epoch_bs24_lr2e-4
+seed: 2023
+epochs: 300
+distributed: true
+lr: 2e-4
+lr_min: 2e-5
+weight_decay: 0.0
+resume_state: null
+save_dir: work_dir
+eval_interval: 1
+
+network:
+  name: networks.AMT-L.Model
+  params:
+    corr_radius: 3
+    corr_lvls: 4
+    num_flows: 5
+data:
+  train:
+    name: datasets.vimeo_datasets.Vimeo90K_Train_Dataset
+    params:
+      dataset_dir: data/vimeo_triplet
+  val:
+    name: datasets.vimeo_datasets.Vimeo90K_Test_Dataset
+    params:
+      dataset_dir: data/vimeo_triplet
+  train_loader:
+    batch_size: 24
+    num_workers: 12
+  val_loader:
+    batch_size: 24
+    num_workers: 3
+
+logger:
+  use_wandb: true
+  resume_id: null
+
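+# Loss terms mirror the AMT-S/AMT-G configs: Charbonnier + ternary reconstruction plus multi-flow supervision.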
+losses: + - { + name: losses.loss.CharbonnierLoss, + nickname: l_rec, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.TernaryLoss, + nickname: l_ter, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.MultipleFlowLoss, + nickname: l_flo, + params: { + loss_weight: 0.002, + keys: [flow0_pred, flow1_pred, flow] + } + } diff --git a/vbench/third_party/amt/cfgs/AMT-S.yaml b/vbench/third_party/amt/cfgs/AMT-S.yaml new file mode 100755 index 00000000..f0673557 --- /dev/null +++ b/vbench/third_party/amt/cfgs/AMT-S.yaml @@ -0,0 +1,63 @@ +exp_name: floloss1e-2_300epoch_bs24_lr2e-4 +seed: 2023 +epochs: 300 +distributed: true +lr: 2e-4 +lr_min: 2e-5 +weight_decay: 0.0 +resume_state: null +save_dir: work_dir +eval_interval: 1 + +network: + name: networks.AMT-S.Model + params: + corr_radius: 3 + corr_lvls: 4 + num_flows: 3 + +data: + train: + name: datasets.vimeo_datasets.Vimeo90K_Train_Dataset + params: + dataset_dir: data/vimeo_triplet + val: + name: datasets.vimeo_datasets.Vimeo90K_Test_Dataset + params: + dataset_dir: data/vimeo_triplet + train_loader: + batch_size: 24 + num_workers: 12 + val_loader: + batch_size: 24 + num_workers: 3 + +logger: + use_wandb: false + resume_id: null + +losses: + - { + name: losses.loss.CharbonnierLoss, + nickname: l_rec, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.TernaryLoss, + nickname: l_ter, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.MultipleFlowLoss, + nickname: l_flo, + params: { + loss_weight: 0.002, + keys: [flow0_pred, flow1_pred, flow] + } + } diff --git a/vbench/third_party/amt/cfgs/AMT-S_gopro.yaml b/vbench/third_party/amt/cfgs/AMT-S_gopro.yaml new file mode 100755 index 00000000..bb50cfb0 --- /dev/null +++ b/vbench/third_party/amt/cfgs/AMT-S_gopro.yaml @@ -0,0 +1,56 @@ +exp_name: wofloloss_400epoch_bs24_lr2e-4 +seed: 2023 +epochs: 400 +distributed: true +lr: 2e-4 +lr_min: 2e-5 +weight_decay: 0.0 +resume_state: null +save_dir: work_dir +eval_interval: 1 + +network: + name: networks.AMT-S.Model + params: + corr_radius: 3 + corr_lvls: 4 + num_flows: 3 + +data: + train: + name: datasets.gopro_datasets.GoPro_Train_Dataset + params: + dataset_dir: data/GOPRO + val: + name: datasets.gopro_datasets.GoPro_Test_Dataset + params: + dataset_dir: data/GOPRO + train_loader: + batch_size: 24 + num_workers: 12 + val_loader: + batch_size: 24 + num_workers: 3 + +logger: + use_wandb: false + resume_id: null + +losses: + - { + name: losses.loss.CharbonnierLoss, + nickname: l_rec, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.TernaryLoss, + nickname: l_ter, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + diff --git a/vbench/third_party/amt/cfgs/IFRNet.yaml b/vbench/third_party/amt/cfgs/IFRNet.yaml new file mode 100755 index 00000000..1ce67ca4 --- /dev/null +++ b/vbench/third_party/amt/cfgs/IFRNet.yaml @@ -0,0 +1,67 @@ +exp_name: floloss1e-2_geoloss1e-2_300epoch_bs24_lr1e-4 +seed: 2023 +epochs: 300 +distributed: true +lr: 1e-4 +lr_min: 1e-5 +weight_decay: 1e-6 +resume_state: null +save_dir: work_dir +eval_interval: 1 + +network: + name: networks.IFRNet.Model + +data: + train: + name: datasets.datasets.Vimeo90K_Train_Dataset + params: + dataset_dir: data/vimeo_triplet + val: + name: datasets.datasets.Vimeo90K_Test_Dataset + params: + dataset_dir: data/vimeo_triplet + train_loader: + batch_size: 24 + num_workers: 12 + val_loader: + 
batch_size: 24 + num_workers: 3 + +logger: + use_wandb: true + resume_id: null + +losses: + - { + name: losses.loss.CharbonnierLoss, + nickname: l_rec, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.TernaryLoss, + nickname: l_ter, + params: { + loss_weight: 1.0, + keys: [imgt_pred, imgt] + } + } + - { + name: losses.loss.IFRFlowLoss, + nickname: l_flo, + params: { + loss_weight: 0.01, + keys: [flow0_pred, flow1_pred, flow] + } + } + - { + name: losses.loss.GeometryLoss, + nickname: l_geo, + params: { + loss_weight: 0.01, + keys: [ft_pred, ft_gt] + } + } diff --git a/vbench/third_party/amt/datasets/__init__.py b/vbench/third_party/amt/datasets/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/datasets/adobe_datasets.py b/vbench/third_party/amt/datasets/adobe_datasets.py new file mode 100755 index 00000000..8ffa857a --- /dev/null +++ b/vbench/third_party/amt/datasets/adobe_datasets.py @@ -0,0 +1,75 @@ +''' + This code is partially borrowed from IFRNet (https://github.com/ltkong218/IFRNet). +''' +import os +import sys +import torch +import numpy as np +from torch.utils.data import Dataset +sys.path.append('.') +from utils.utils import read, img2tensor +from datasets.gopro_datasets import ( + random_resize_woflow, random_crop_woflow, center_crop_woflow, + random_reverse_channel_woflow, random_vertical_flip_woflow, + random_horizontal_flip_woflow, random_rotate_woflow, + random_reverse_time_woflow +) + + +class Adobe240_Dataset(Dataset): + def __init__(self, dataset_dir='data/adobe240/test_frames', interFrames=7, augment=True): + super().__init__() + self.augment = augment + self.interFrames = interFrames + self.setLength = interFrames + 2 + self.dataset_dir = os.path.join(dataset_dir) + video_list = os.listdir(self.dataset_dir)[9::10] + self.frames_list = [] + self.file_list = [] + for video in video_list: + frames = sorted(os.listdir(os.path.join(self.dataset_dir, video))) + n_sets = (len(frames) - self.setLength) // (interFrames + 1) + 1 + videoInputs = [frames[(interFrames + 1) * i: (interFrames + 1) * i + self.setLength] for i in range(n_sets)] + videoInputs = [[os.path.join(video, f) for f in group] for group in videoInputs] + self.file_list.extend(videoInputs) + + def __getitem__(self, idx): + clip_idx = idx // self.interFrames + embt_idx = idx % self.interFrames + imgpaths = [os.path.join(self.dataset_dir, fp) for fp in self.file_list[clip_idx]] + pick_idxs = list(range(0, self.setLength, self.interFrames + 1)) + imgt_beg = self.setLength // 2 - self.interFrames // 2 + imgt_end = self.setLength // 2 + self.interFrames // 2 + self.interFrames % 2 + imgt_idx = list(range(imgt_beg, imgt_end)) + input_paths = [imgpaths[idx] for idx in pick_idxs] + imgt_paths = [imgpaths[idx] for idx in imgt_idx] + + img0 = np.array(read(input_paths[0])) + imgt = np.array(read(imgt_paths[embt_idx])) + img1 = np.array(read(input_paths[1])) + embt = torch.from_numpy(np.array((embt_idx + 1) / (self.interFrames + 1) + ).reshape(1, 1, 1).astype(np.float32)) + + if self.augment == True: + img0, imgt, img1 = random_resize_woflow(img0, imgt, img1, p=0.1) + img0, imgt, img1 = random_crop_woflow(img0, imgt, img1, crop_size=(224, 224)) + img0, imgt, img1 = random_reverse_channel_woflow(img0, imgt, img1, p=0.5) + img0, imgt, img1 = random_vertical_flip_woflow(img0, imgt, img1, p=0.3) + img0, imgt, img1 = random_horizontal_flip_woflow(img0, imgt, img1, p=0.5) + img0, imgt, img1 = random_rotate_woflow(img0, imgt, img1, p=0.05) + img0, 
imgt, img1, embt = random_reverse_time_woflow(img0, imgt, img1, + embt=embt, p=0.5) + else: + img0, imgt, img1 = center_crop_woflow(img0, imgt, img1, crop_size=(512, 512)) + + img0 = img2tensor(img0).squeeze(0) + imgt = img2tensor(imgt).squeeze(0) + img1 = img2tensor(img1).squeeze(0) + + return {'img0': img0.float(), + 'imgt': imgt.float(), + 'img1': img1.float(), + 'embt': embt} + + def __len__(self): + return len(self.file_list) * self.interFrames diff --git a/vbench/third_party/amt/datasets/gopro_datasets.py b/vbench/third_party/amt/datasets/gopro_datasets.py new file mode 100755 index 00000000..4fa5540a --- /dev/null +++ b/vbench/third_party/amt/datasets/gopro_datasets.py @@ -0,0 +1,188 @@ +''' + This code is partially borrowed from IFRNet (https://github.com/ltkong218/IFRNet). + In the consideration of the difficulty in flow supervision generation, we abort + flow loss in the 8x case. +''' +import os +import cv2 +import torch +import random +import numpy as np +from torch.utils.data import Dataset +from utils.utils import read, img2tensor + +def random_resize_woflow(img0, imgt, img1, p=0.1): + if random.uniform(0, 1) < p: + img0 = cv2.resize(img0, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + imgt = cv2.resize(imgt, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + img1 = cv2.resize(img1, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + return img0, imgt, img1 + +def random_crop_woflow(img0, imgt, img1, crop_size=(224, 224)): + h, w = crop_size[0], crop_size[1] + ih, iw, _ = img0.shape + x = np.random.randint(0, ih-h+1) + y = np.random.randint(0, iw-w+1) + img0 = img0[x: x + h, y : y + w, :] + imgt = imgt[x: x + h, y : y + w, :] + img1 = img1[x: x + h, y : y + w, :] + return img0, imgt, img1 + +def center_crop_woflow(img0, imgt, img1, crop_size=(512, 512)): + h, w = crop_size[0], crop_size[1] + ih, iw, _ = img0.shape + img0 = img0[ih // 2 - h // 2: ih // 2 + h // 2, iw // 2 - w // 2: iw // 2 + w // 2, :] + imgt = imgt[ih // 2 - h // 2: ih // 2 + h // 2, iw // 2 - w // 2: iw // 2 + w // 2, :] + img1 = img1[ih // 2 - h // 2: ih // 2 + h // 2, iw // 2 - w // 2: iw // 2 + w // 2, :] + return img0, imgt, img1 + +def random_reverse_channel_woflow(img0, imgt, img1, p=0.5): + if random.uniform(0, 1) < p: + img0 = img0[:, :, ::-1] + imgt = imgt[:, :, ::-1] + img1 = img1[:, :, ::-1] + return img0, imgt, img1 + +def random_vertical_flip_woflow(img0, imgt, img1, p=0.3): + if random.uniform(0, 1) < p: + img0 = img0[::-1] + imgt = imgt[::-1] + img1 = img1[::-1] + return img0, imgt, img1 + +def random_horizontal_flip_woflow(img0, imgt, img1, p=0.5): + if random.uniform(0, 1) < p: + img0 = img0[:, ::-1] + imgt = imgt[:, ::-1] + img1 = img1[:, ::-1] + return img0, imgt, img1 + +def random_rotate_woflow(img0, imgt, img1, p=0.05): + if random.uniform(0, 1) < p: + img0 = img0.transpose((1, 0, 2)) + imgt = imgt.transpose((1, 0, 2)) + img1 = img1.transpose((1, 0, 2)) + return img0, imgt, img1 + +def random_reverse_time_woflow(img0, imgt, img1, embt, p=0.5): + if random.uniform(0, 1) < p: + tmp = img1 + img1 = img0 + img0 = tmp + embt = 1 - embt + return img0, imgt, img1, embt + +class GoPro_Train_Dataset(Dataset): + def __init__(self, dataset_dir='data/GOPRO', interFrames=7, augment=True): + self.dataset_dir = dataset_dir + '/train' + self.interFrames = interFrames + self.augment = augment + self.setLength = interFrames + 2 + video_list = [ + 'GOPR0372_07_00', 'GOPR0374_11_01', 'GOPR0378_13_00', 'GOPR0384_11_01', + 'GOPR0384_11_04', 'GOPR0477_11_00', 
'GOPR0868_11_02', 'GOPR0884_11_00', + 'GOPR0372_07_01', 'GOPR0374_11_02', 'GOPR0379_11_00', 'GOPR0384_11_02', + 'GOPR0385_11_00', 'GOPR0857_11_00', 'GOPR0871_11_01', 'GOPR0374_11_00', + 'GOPR0374_11_03', 'GOPR0380_11_00', 'GOPR0384_11_03', 'GOPR0386_11_00', + 'GOPR0868_11_01', 'GOPR0881_11_00'] + self.frames_list = [] + self.file_list = [] + for video in video_list: + frames = sorted(os.listdir(os.path.join(self.dataset_dir, video))) + n_sets = (len(frames) - self.setLength) // (interFrames+1) + 1 + videoInputs = [frames[(interFrames + 1) * i: (interFrames + 1) * i + self.setLength + ] for i in range(n_sets)] + videoInputs = [[os.path.join(video, f) for f in group] for group in videoInputs] + self.file_list.extend(videoInputs) + + def __len__(self): + return len(self.file_list) * self.interFrames + + def __getitem__(self, idx): + clip_idx = idx // self.interFrames + embt_idx = idx % self.interFrames + imgpaths = [os.path.join(self.dataset_dir, fp) for fp in self.file_list[clip_idx]] + pick_idxs = list(range(0, self.setLength, self.interFrames + 1)) + imgt_beg = self.setLength // 2 - self.interFrames // 2 + imgt_end = self.setLength // 2 + self.interFrames // 2 + self.interFrames % 2 + imgt_idx = list(range(imgt_beg, imgt_end)) + input_paths = [imgpaths[idx] for idx in pick_idxs] + imgt_paths = [imgpaths[idx] for idx in imgt_idx] + + embt = torch.from_numpy(np.array((embt_idx + 1) / (self.interFrames+1) + ).reshape(1, 1, 1).astype(np.float32)) + img0 = np.array(read(input_paths[0])) + imgt = np.array(read(imgt_paths[embt_idx])) + img1 = np.array(read(input_paths[1])) + + if self.augment == True: + img0, imgt, img1 = random_resize_woflow(img0, imgt, img1, p=0.1) + img0, imgt, img1 = random_crop_woflow(img0, imgt, img1, crop_size=(224, 224)) + img0, imgt, img1 = random_reverse_channel_woflow(img0, imgt, img1, p=0.5) + img0, imgt, img1 = random_vertical_flip_woflow(img0, imgt, img1, p=0.3) + img0, imgt, img1 = random_horizontal_flip_woflow(img0, imgt, img1, p=0.5) + img0, imgt, img1 = random_rotate_woflow(img0, imgt, img1, p=0.05) + img0, imgt, img1, embt = random_reverse_time_woflow(img0, imgt, img1, + embt=embt, p=0.5) + else: + img0, imgt, img1 = center_crop_woflow(img0, imgt, img1, crop_size=(512, 512)) + + img0 = img2tensor(img0.copy()).squeeze(0) + imgt = img2tensor(imgt.copy()).squeeze(0) + img1 = img2tensor(img1.copy()).squeeze(0) + + return {'img0': img0.float(), + 'imgt': imgt.float(), + 'img1': img1.float(), + 'embt': embt} + +class GoPro_Test_Dataset(Dataset): + def __init__(self, dataset_dir='data/GOPRO', interFrames=7): + self.dataset_dir = dataset_dir + '/test' + self.interFrames = interFrames + self.setLength = interFrames + 2 + video_list = [ + 'GOPR0384_11_00', 'GOPR0385_11_01', 'GOPR0410_11_00', + 'GOPR0862_11_00', 'GOPR0869_11_00', 'GOPR0881_11_01', + 'GOPR0384_11_05', 'GOPR0396_11_00', 'GOPR0854_11_00', + 'GOPR0868_11_00', 'GOPR0871_11_00'] + self.frames_list = [] + self.file_list = [] + for video in video_list: + frames = sorted(os.listdir(os.path.join(self.dataset_dir, video))) + n_sets = (len(frames) - self.setLength)//(interFrames+1) + 1 + videoInputs = [frames[(interFrames + 1) * i:(interFrames + 1) * i + self.setLength + ] for i in range(n_sets)] + videoInputs = [[os.path.join(video, f) for f in group] for group in videoInputs] + self.file_list.extend(videoInputs) + + def __len__(self): + return len(self.file_list) * self.interFrames + + def __getitem__(self, idx): + clip_idx = idx // self.interFrames + embt_idx = idx % self.interFrames + imgpaths = 
[os.path.join(self.dataset_dir, fp) for fp in self.file_list[clip_idx]] + pick_idxs = list(range(0, self.setLength, self.interFrames + 1)) + imgt_beg = self.setLength // 2 - self.interFrames // 2 + imgt_end = self.setLength // 2 + self.interFrames // 2 + self.interFrames % 2 + imgt_idx = list(range(imgt_beg, imgt_end)) + input_paths = [imgpaths[idx] for idx in pick_idxs] + imgt_paths = [imgpaths[idx] for idx in imgt_idx] + + img0 = np.array(read(input_paths[0])) + imgt = np.array(read(imgt_paths[embt_idx])) + img1 = np.array(read(input_paths[1])) + + img0, imgt, img1 = center_crop_woflow(img0, imgt, img1, crop_size=(512, 512)) + + img0 = img2tensor(img0).squeeze(0) + imgt = img2tensor(imgt).squeeze(0) + img1 = img2tensor(img1).squeeze(0) + + embt = torch.from_numpy(np.array((embt_idx + 1) / (self.interFrames + 1) + ).reshape(1, 1, 1).astype(np.float32)) + return {'img0': img0.float(), + 'imgt': imgt.float(), + 'img1': img1.float(), + 'embt': embt} \ No newline at end of file diff --git a/vbench/third_party/amt/datasets/vimeo_datasets.py b/vbench/third_party/amt/datasets/vimeo_datasets.py new file mode 100755 index 00000000..03da0f53 --- /dev/null +++ b/vbench/third_party/amt/datasets/vimeo_datasets.py @@ -0,0 +1,176 @@ +''' + This code is partially borrowed from IFRNet (https://github.com/ltkong218/IFRNet). +''' +import os +import cv2 +import torch +import random +import numpy as np +from torch.utils.data import Dataset +from utils.utils import read + + +def random_resize(img0, imgt, img1, flow, p=0.1): + if random.uniform(0, 1) < p: + img0 = cv2.resize(img0, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + imgt = cv2.resize(imgt, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + img1 = cv2.resize(img1, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) + flow = cv2.resize(flow, dsize=None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR) * 2.0 + return img0, imgt, img1, flow + +def random_crop(img0, imgt, img1, flow, crop_size=(224, 224)): + h, w = crop_size[0], crop_size[1] + ih, iw, _ = img0.shape + x = np.random.randint(0, ih-h+1) + y = np.random.randint(0, iw-w+1) + img0 = img0[x:x+h, y:y+w, :] + imgt = imgt[x:x+h, y:y+w, :] + img1 = img1[x:x+h, y:y+w, :] + flow = flow[x:x+h, y:y+w, :] + return img0, imgt, img1, flow + +def random_reverse_channel(img0, imgt, img1, flow, p=0.5): + if random.uniform(0, 1) < p: + img0 = img0[:, :, ::-1] + imgt = imgt[:, :, ::-1] + img1 = img1[:, :, ::-1] + return img0, imgt, img1, flow + +def random_vertical_flip(img0, imgt, img1, flow, p=0.3): + if random.uniform(0, 1) < p: + img0 = img0[::-1] + imgt = imgt[::-1] + img1 = img1[::-1] + flow = flow[::-1] + flow = np.concatenate((flow[:, :, 0:1], -flow[:, :, 1:2], flow[:, :, 2:3], -flow[:, :, 3:4]), 2) + return img0, imgt, img1, flow + +def random_horizontal_flip(img0, imgt, img1, flow, p=0.5): + if random.uniform(0, 1) < p: + img0 = img0[:, ::-1] + imgt = imgt[:, ::-1] + img1 = img1[:, ::-1] + flow = flow[:, ::-1] + flow = np.concatenate((-flow[:, :, 0:1], flow[:, :, 1:2], -flow[:, :, 2:3], flow[:, :, 3:4]), 2) + return img0, imgt, img1, flow + +def random_rotate(img0, imgt, img1, flow, p=0.05): + if random.uniform(0, 1) < p: + img0 = img0.transpose((1, 0, 2)) + imgt = imgt.transpose((1, 0, 2)) + img1 = img1.transpose((1, 0, 2)) + flow = flow.transpose((1, 0, 2)) + flow = np.concatenate((flow[:, :, 1:2], flow[:, :, 0:1], flow[:, :, 3:4], flow[:, :, 2:3]), 2) + return img0, imgt, img1, flow + +def random_reverse_time(img0, imgt, img1, flow, p=0.5): + if random.uniform(0, 1) 
< p: + tmp = img1 + img1 = img0 + img0 = tmp + flow = np.concatenate((flow[:, :, 2:4], flow[:, :, 0:2]), 2) + return img0, imgt, img1, flow + + +class Vimeo90K_Train_Dataset(Dataset): + def __init__(self, + dataset_dir='data/vimeo_triplet', + flow_dir=None, + augment=True, + crop_size=(224, 224)): + self.dataset_dir = dataset_dir + self.augment = augment + self.crop_size = crop_size + self.img0_list = [] + self.imgt_list = [] + self.img1_list = [] + self.flow_t0_list = [] + self.flow_t1_list = [] + if flow_dir is None: + flow_dir = 'flow' + with open(os.path.join(dataset_dir, 'tri_trainlist.txt'), 'r') as f: + for i in f: + name = str(i).strip() + if(len(name) <= 1): + continue + self.img0_list.append(os.path.join(dataset_dir, 'sequences', name, 'im1.png')) + self.imgt_list.append(os.path.join(dataset_dir, 'sequences', name, 'im2.png')) + self.img1_list.append(os.path.join(dataset_dir, 'sequences', name, 'im3.png')) + self.flow_t0_list.append(os.path.join(dataset_dir, flow_dir, name, 'flow_t0.flo')) + self.flow_t1_list.append(os.path.join(dataset_dir, flow_dir, name, 'flow_t1.flo')) + + def __len__(self): + return len(self.imgt_list) + + def __getitem__(self, idx): + img0 = read(self.img0_list[idx]) + imgt = read(self.imgt_list[idx]) + img1 = read(self.img1_list[idx]) + flow_t0 = read(self.flow_t0_list[idx]) + flow_t1 = read(self.flow_t1_list[idx]) + flow = np.concatenate((flow_t0, flow_t1), 2).astype(np.float64) + + if self.augment == True: + img0, imgt, img1, flow = random_resize(img0, imgt, img1, flow, p=0.1) + img0, imgt, img1, flow = random_crop(img0, imgt, img1, flow, crop_size=self.crop_size) + img0, imgt, img1, flow = random_reverse_channel(img0, imgt, img1, flow, p=0.5) + img0, imgt, img1, flow = random_vertical_flip(img0, imgt, img1, flow, p=0.3) + img0, imgt, img1, flow = random_horizontal_flip(img0, imgt, img1, flow, p=0.5) + img0, imgt, img1, flow = random_rotate(img0, imgt, img1, flow, p=0.05) + img0, imgt, img1, flow = random_reverse_time(img0, imgt, img1, flow, p=0.5) + + + img0 = torch.from_numpy(img0.transpose((2, 0, 1)).astype(np.float32) / 255.0) + imgt = torch.from_numpy(imgt.transpose((2, 0, 1)).astype(np.float32) / 255.0) + img1 = torch.from_numpy(img1.transpose((2, 0, 1)).astype(np.float32) / 255.0) + flow = torch.from_numpy(flow.transpose((2, 0, 1)).astype(np.float32)) + embt = torch.from_numpy(np.array(1/2).reshape(1, 1, 1).astype(np.float32)) + + return {'img0': img0.float(), 'imgt': imgt.float(), 'img1': img1.float(), 'flow': flow.float(), 'embt': embt} + + +class Vimeo90K_Test_Dataset(Dataset): + def __init__(self, dataset_dir='data/vimeo_triplet'): + self.dataset_dir = dataset_dir + self.img0_list = [] + self.imgt_list = [] + self.img1_list = [] + self.flow_t0_list = [] + self.flow_t1_list = [] + with open(os.path.join(dataset_dir, 'tri_testlist.txt'), 'r') as f: + for i in f: + name = str(i).strip() + if(len(name) <= 1): + continue + self.img0_list.append(os.path.join(dataset_dir, 'sequences', name, 'im1.png')) + self.imgt_list.append(os.path.join(dataset_dir, 'sequences', name, 'im2.png')) + self.img1_list.append(os.path.join(dataset_dir, 'sequences', name, 'im3.png')) + self.flow_t0_list.append(os.path.join(dataset_dir, 'flow', name, 'flow_t0.flo')) + self.flow_t1_list.append(os.path.join(dataset_dir, 'flow', name, 'flow_t1.flo')) + + def __len__(self): + return len(self.imgt_list) + + def __getitem__(self, idx): + img0 = read(self.img0_list[idx]) + imgt = read(self.imgt_list[idx]) + img1 = read(self.img1_list[idx]) + flow_t0 = read(self.flow_t0_list[idx]) 
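+        # both .flo targets are precomputed by flow_generation/gen_flow.py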
+        flow_t1 = read(self.flow_t1_list[idx])
+        flow = np.concatenate((flow_t0, flow_t1), 2)
+
+        img0 = torch.from_numpy(img0.transpose((2, 0, 1)).astype(np.float32) / 255.0)
+        imgt = torch.from_numpy(imgt.transpose((2, 0, 1)).astype(np.float32) / 255.0)
+        img1 = torch.from_numpy(img1.transpose((2, 0, 1)).astype(np.float32) / 255.0)
+        flow = torch.from_numpy(flow.transpose((2, 0, 1)).astype(np.float32))
+        embt = torch.from_numpy(np.array(1/2).reshape(1, 1, 1).astype(np.float32))
+
+        return {'img0': img0.float(),
+                'imgt': imgt.float(),
+                'img1': img1.float(),
+                'flow': flow.float(),
+                'embt': embt}
+
+
+
diff --git a/vbench/third_party/amt/docs/develop.md b/vbench/third_party/amt/docs/develop.md
new file mode 100755
index 00000000..e927e976
--- /dev/null
+++ b/vbench/third_party/amt/docs/develop.md
@@ -0,0 +1,239 @@
+# Development for evaluation and training
+
+- [Datasets](#datasets)
+- [Pretrained Models](#pretrained-models)
+- [Evaluation](#evaluation)
+- [Training](#training)
+
+## Datasets

+First, please prepare standard datasets for evaluation and training.
+
+We list most of the prevailing datasets in video frame interpolation below, though some of them are not used in this project. We hope this collection can help your research.
+
+| Dataset | Source | Train/Eval | Arbitrary/Fixed |
+| :-----: | :----: | :--------: | :-------------: |
+| Vimeo90k | ToFlow (IJCV 2019) | Both | Fixed |
+| ATD-12K | AnimeInterp (CVPR 2021) | Both | Fixed |
+| SNU-FILM | CAIN (AAAI 2020) | Eval | Fixed |
+| UCF101 | Google Drive | Eval | Fixed |
+| HD | MEMC-Net (TPAMI 2018) / Google Drive | Eval | Fixed |
+| Xiph-2k/-4k | SoftSplat (CVPR 2020) | Eval | Fixed |
+| Middlebury | Middlebury | Eval | Fixed |
+| GoPro | GoPro | Both | Arbitrary |
+| Adobe240fps | DBN (CVPR 2017) | Both | Arbitrary |
+| X4K1000FPS | XVFI (ICCV 2021) | Both | Arbitrary |
+
+## Pretrained Models
+

+| Model | Download Links | Config file | Trained on | Arbitrary/Fixed |
+| :---: | :------------: | :---------: | :--------: | :-------------: |
+| AMT-S | Google Drive / Baidu Cloud | [cfgs/AMT-S](../cfgs/AMT-S.yaml) | Vimeo90k | Fixed |
+| AMT-L | Google Drive / Baidu Cloud | [cfgs/AMT-L](../cfgs/AMT-L.yaml) | Vimeo90k | Fixed |
+| AMT-G | Google Drive / Baidu Cloud | [cfgs/AMT-G](../cfgs/AMT-G.yaml) | Vimeo90k | Fixed |
+| AMT-S | Google Drive / Baidu Cloud | [cfgs/AMT-S_gopro](../cfgs/AMT-S_gopro.yaml) | GoPro | Arbitrary |
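+
+The benchmark scripts load these checkpoints via their `state_dict` key. If you want to sanity-check a downloaded weight file outside the provided scripts, a minimal sketch (run from the AMT root so `utils` resolves, as the scripts do via `sys.path.append('.')`; paths are placeholders):
+
+```python
+import torch
+from omegaconf import OmegaConf
+
+from utils.build_utils import build_from_cfg
+
+# Build the network from its training config, then load the checkpoint.
+network_cfg = OmegaConf.load('cfgs/AMT-S.yaml').network
+model = build_from_cfg(network_cfg)
+ckpt = torch.load('pretrained/amt-s.pth', map_location='cpu')
+model.load_state_dict(ckpt['state_dict'])
+model.eval()
+```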
      + +## Evaluation +Before evaluation, you should: + +1. Check the dataroot is organized as follows: + +```shell +./data +├── Adobe240 +│ ├── original_high_fps_videos +│ └── test_frames # using ffmpeg to extract 240 fps frames from `original_high_fps_videos` +├── GOPRO +│ ├── test +│ └── train +├── SNU_FILM +│ ├── GOPRO_test +│ ├── test-easy.txt +│ ├── test-extreme.txt +│ ├── test-hard.txt +│ ├── test-medium.txt +│ └── YouTube_test +├── ucf101_interp_ours +│ ├── 1 +│ ├── 1001 +│ └── ... +└── vimeo_triplet + ├── readme.txt + ├── sequences + ├── tri_testlist.txt + └── tri_trainlist.txt +``` + +2. Download the provided [pretrained models](#pretrained-models). + +Then, you can perform evaluation as follows: + ++ Run all benchmarks for fixed-time models. + + ```shell + sh ./scripts/benchmark_fixed.sh [CFG] [CKPT_PATH] + ## e.g. + sh ./scripts/benchmark_fixed.sh cfgs/AMT-S.yaml pretrained/amt-s.pth + ``` + ++ Run all benchmarks for arbitrary-time models. + + ```shell + sh ./scripts/benchmark_arbitrary.sh [CFG] [CKPT_PATH] + ## e.g. + sh ./scripts/benchmark_arbitrary.sh cfgs/AMT-S.yaml pretrained/gopro_amt-s.pth + ``` + ++ Run a single benchmark for fixed-time models. *You can custom data paths in this case*. + + ```shell + python [BENCHMARK] -c [CFG] -p [CKPT_PATH] -r [DATAROOT] + ## e.g. + python benchmarks/vimeo90k.py -c cfgs/AMT-S.yaml -p pretrained/amt-s.pth -r data/vimeo_triplet + ``` + ++ Run the inference speed & model size comparisons using: + + ```shell + python speed_parameters.py -c [CFG] + ## e.g. + python speed_parameters.py -c cfgs/AMT-S.yaml + ``` + + +## Training + +Before training, please first prepare the optical flows (which are used for supervision). + +We need to install `cupy` first before flow generation: + +```shell +conda activate amt # satisfying `requirement.txt` +conda install -c conda-forge cupy +``` + + +After installing `cupy`, we can generate optical flows by the following command: + +```shell +python flow_generation/gen_flow.py -r [DATA_ROOT] +## e.g. +python flow_generation/gen_flow.py -r data/vimeo_triplet +``` + +After obtaining the optical flow of the training data, +run the following commands for training (DDP mode): + +```shell + sh ./scripts/train.sh [NUM_GPU] [CFG] [MASTER_PORT] + ## e.g. + sh ./scripts/train.sh 2 cfgs/AMT-S.yaml 14514 +``` + +Our training configuration files are provided in [`cfgs`](../cfgs). Please carefully check the `dataset_dir` is suitable for you. + + +Note: + +- If you intend to turn off DDP training, you can switch the key `distributed` from `true` +to `false` in the config file. + +- If you do not use wandb, you can switch the key `logger.use_wandb` from `true` +to `false` in the config file. \ No newline at end of file diff --git a/vbench/third_party/amt/docs/method.md b/vbench/third_party/amt/docs/method.md new file mode 100755 index 00000000..1343649b --- /dev/null +++ b/vbench/third_party/amt/docs/method.md @@ -0,0 +1,126 @@ +# Illustration of AMT + +

      + +

      + +### :rocket: Highlights: + ++ [**Good tradeoff**](#good-tradeoff) between performance and efficiency. + ++ [**All-pairs correlation**](#all-pairs-correlation) for modeling large motions during interpolation. + ++ A [**plug-and-play operator**](#multi-field-refinement) to improve the diversity of predicted task-oriented flows, further **boosting the interpolation performance**. + + +## Good Tradeoff + +

      + +

+
+We examine the proposed AMT on several public benchmarks with different model scales, showing strong performance and high efficiency in contrast to the SOTA methods (see Figure). Our small model outperforms [IFRNet-B](https://arxiv.org/abs/2205.14620), a SOTA lightweight model, by **\+0.17dB PSNR** on Vimeo90K with **only 60% of its FLOPs and parameters**. For the large-scale setting, our AMT exceeds the previous SOTA (i.e., [IFRNet-L](https://arxiv.org/abs/2205.14620)) by **+0.15 dB PSNR** on Vimeo90K with **75% of its FLOPs and 65% of its parameters**. We also provide a huge model for comparison
+with the SOTA transformer-based method [VFIFormer](https://arxiv.org/abs/2205.07230). Our convolution-based AMT shows **comparable performance** but needs **nearly 23× less computational cost** compared to VFIFormer.
+
+Considering its effectiveness, we hope our AMT could bring a new perspective for architecture design in efficient frame interpolation.
+
+## All-pairs correlation
+
+We build all-pairs correlation to effectively model large motions during interpolation.
+
+Here is an example of the update operation at a single scale in AMT:
+
+```python
+    # Construct bidirectional correlation volumes
+    fmap0, fmap1 = self.feat_encoder([img0_, img1_]) # [B, C, H//8, W//8]
+    corr_fn = BidirCorrBlock(fmap0, fmap1, radius=self.radius, num_levels=self.corr_levels)
+
+    # Correlation scaled lookup (bilateral -> bidirectional)
+    t1_scale = 1. / embt
+    t0_scale = 1. / (1. - embt)
+    coord = coords_grid(b, h // 8, w // 8, img0.device)
+    corr0, corr1 = corr_fn(coord + flow1 * t1_scale, coord + flow0 * t0_scale)
+    corr = torch.cat([corr0, corr1], dim=1)
+    flow = torch.cat([flow0, flow1], dim=1)
+
+    # Update both intermediate feature and bilateral flows
+    delta_feat, delta_flow = self.update(feat, flow, corr)
+    delta_flow0, delta_flow1 = torch.chunk(delta_flow, 2, 1)
+    flow0 = flow0 + delta_flow0
+    flow1 = flow1 + delta_flow1
+    feat = feat + delta_feat
+
+```
+
+Note: we extend the above operations to each pyramid scale (except for the last one), which guarantees the consistency of flows on the coarse scale.
+
+### ⏫ performance gain
+|                         | Vimeo 90k | Hard  | Extreme |
+|-------------------------|-----------|-------|---------|
+| Baseline                | 35.60     | 30.39 | 25.06   |
+| + All-pairs correlation | 35.97 (**+0.37**) | 30.60 (**+0.21**) | 25.30 (**+0.24**) |
+
+More ablations can be found in the [paper](https://arxiv.org/abs/2304.09790).
+
+## Multi-field Refinement
+
+For most frame interpolation methods that are based on backward warping, the common formulation for
+interpolating the final intermediate frame $I_{t}$ is:
+
+$I_{t} = M \odot \mathcal{W}(I_{0}, F_{t\rightarrow 0}) + (1 - M) \odot \mathcal{W}(I_{1}, F_{t\rightarrow 1}) + R$
+
+The above formulation only utilizes **one set of** bilateral optical flows $F_{t\rightarrow 0}$ and $F_{t\rightarrow 1}$, occlusion masks $M$, and residuals $R$.
+
+Multi-field refinement aims to improve this common formulation of backward warping.
+Specifically, we first predict **multiple** bilateral optical flows (accompanied by the corresponding masks and residuals) by simply enlarging the output channels of the last decoder.
+Then, we use the aforementioned equation to generate each interpolated candidate frame. Finally, we obtain the final interpolated frame by combining the candidate frames using stacked convolutional layers.
+
+Please refer to [this code snippet](../networks/blocks/multi_flow.py#L46) for the details of the first step.
+Please refer to [this code snippet](../networks/blocks/multi_flow.py#L10) for the details of the last two steps. + +### 🌟 easy to use +The proposed multi-field refinement can be **easily migrated to any frame interpolation model** to improve the performance. + +Code examples are shown below: + +```python + +# (At the __init__ stage) Initialize a decoder that predicts multiple flow fields (accompanied by the corresponding masks and residuals) +self.decoder1 = MultiFlowDecoder(channels[0], skip_channels, num_flows) +... + +# (At the forward stage) Predict multiple flow fields (accompanied by the corresponding masks and residuals) +up_flow0_1, up_flow1_1, mask, img_res = self.decoder1(ft_1_, f0_1, f1_1, up_flow0_2, up_flow1_2) +# Merge multiple predictions +imgt_pred = multi_flow_combine(self.comb_block, img0, img1, up_flow0_1, up_flow1_1, # self.comb_block stacks two convolutional layers + mask, img_res, mean_) + +``` + +### ⏫ performance gain + +| # Number of flow pairs | Vimeo 90k | Hard | Extreme | +|------------------------|---------------|---------------|---------------| +| Baseline (1 pair) | 35.84 | 30.52 | 25.25 | +| 3 pairs | 35.97 (**+0.13**) | 30.60 (**+0.08**) | 25.30 (**+0.05**) | +| 5 pairs | 36.00 (**+0.16**) | 30.63 (**+0.11**) | 25.33 (**+0.08**) | + +## Comparison with SOTA methods +

      + +

+*(comparison figures)*
+
+
+## Discussions
+
+We encountered challenges regarding the novelty of this work during the rebuttal process.
+
+We clarify the main points again here:
+
+1. We consider the estimation of task-oriented flows from **the perspective of architecture formulation rather than loss function designs** in previous works. The detailed analysis can be found in Sec. 1 of the main paper. We introduce all-pairs correlation to strengthen the ability to model motion, which guarantees **the consistency of flows on the coarse scale**. We employ multi-field refinement to **ensure diversity for the flow regions that need to be task-specific at the finest scale**. The two designs also enable our AMT to capture large motions and successfully handle occlusion regions with high efficiency. As a consequence, they both bring noticeable performance improvements, as shown in the ablations.
+2. The frame interpolation task is closely related to **motion modeling**. We strongly believe that a [RAFT-style](https://arxiv.org/abs/2003.12039) approach to motion modeling would be beneficial for the frame interpolation task. However, such a style **has not been well studied** in the recent frame interpolation literature. Experimental results show that **all-pairs correlation is very important for the performance gain**. We also involve many novel and task-specific designs
+beyond the original RAFT. Among these task-related design choices, our volume design, scaled lookup strategy, content update, and cross-scale update scheme bring good performance gains on challenging cases (i.e., Hard and Extreme). Besides, if we discard all design choices (but retain multi-field refinement) and follow the original RAFT to retrain a new model, **the PSNR values will dramatically decrease** (-0.20dB on Vimeo, -0.33dB on Hard, and -0.39dB on Extreme).
+3. [M2M-VFI](https://arxiv.org/abs/2204.03513) is the most relevant to our multi-field refinement. It also generates multiple flows through the decoder and prepares warped candidates in the image domain. However, there are **five key differences** between our multi-field refinement and M2M-VFI. **First**, our method generates the candidate frames by backward warping rather than the forward warping used in M2M-VFI. The proposed multi-field refinement aims to improve the common formulation of backward warping (see Eqn. (4) in the main paper). **Second**, while M2M-VFI predicts multiple flows to overcome the hole issue and the artifacts in overlapped regions caused by forward warping, we aim to alleviate the ambiguity issue in occluded areas and at motion boundaries by enhancing the diversity of flows. **Third**, M2M-VFI needs to estimate bidirectional flows first through an off-the-shelf optical flow estimator and then predict multiple bilateral flows through a motion refinement network. On the contrary, we directly estimate multiple bilateral flows in a one-stage network. In this network, we first estimate one pair of bilateral flows at the coarse scale and then derive multiple groups of fine-grained bilateral flows from the coarse flow pair. **Fourth**, M2M-VFI jointly estimates two reliability maps together with all pairs of bilateral flows, which can be further used to fuse the overlapping pixels caused by forward warping. As shown in Eqn. (5) of the main paper, we estimate not only an occlusion mask but also a residual content to cooperate with each pair of bilateral flows. The residual content is used to compensate for the unreliable details after warping. This design has been investigated in Tab. 
2e of the main paper. **Fifth**, we stack two convolutional layers to adaptively merge candidate frames, while M2M-VFI normalizes the sum of all candidate frames through a pre-computed weighting map + +More discussions and details can be found in the [appendix](https://arxiv.org/abs/2304.09790) of our paper. diff --git a/vbench/third_party/amt/environment.yaml b/vbench/third_party/amt/environment.yaml new file mode 100755 index 00000000..cd402d0b --- /dev/null +++ b/vbench/third_party/amt/environment.yaml @@ -0,0 +1,19 @@ +name: amt +channels: + - pytorch + - conda-forge + - defaults +dependencies: + - python=3.8.5 + - pip=20.3 + - cudatoolkit=11.3 + - pytorch=1.11.0 + - torchvision=0.12.0 + - numpy=1.21.5 + - pip: + - opencv-python==4.1.2.30 + - imageio==2.19.3 + - omegaconf==2.3.0 + - Pillow==9.4.0 + - tqdm==4.64.1 + - wandb==0.12.21 \ No newline at end of file diff --git a/vbench/third_party/amt/flow_generation/__init__.py b/vbench/third_party/amt/flow_generation/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/flow_generation/gen_flow.py b/vbench/third_party/amt/flow_generation/gen_flow.py new file mode 100755 index 00000000..a9d393b3 --- /dev/null +++ b/vbench/third_party/amt/flow_generation/gen_flow.py @@ -0,0 +1,72 @@ +import os +import sys +import torch +import argparse +import numpy as np +import os.path as osp +import torch.nn.functional as F + +sys.path.append('.') +from utils.utils import read, write +from flow_generation.liteflownet.run import estimate + +parser = argparse.ArgumentParser( + prog = 'AMT', + description = 'Flow generation', + ) +parser.add_argument('-r', '--root', default='data/vimeo_triplet') +args = parser.parse_args() + +vimeo90k_dir = args.root +vimeo90k_sequences_dir = osp.join(vimeo90k_dir, 'sequences') +vimeo90k_flow_dir = osp.join(vimeo90k_dir, 'flow') + +def pred_flow(img1, img2): + img1 = torch.from_numpy(img1).float().permute(2, 0, 1) / 255.0 + img2 = torch.from_numpy(img2).float().permute(2, 0, 1) / 255.0 + + flow = estimate(img1, img2) + + flow = flow.permute(1, 2, 0).cpu().numpy() + return flow + +print('Built Flow Path') +if not osp.exists(vimeo90k_flow_dir): + os.makedirs(vimeo90k_flow_dir) + +for sequences_path in sorted(os.listdir(vimeo90k_sequences_dir)): + vimeo90k_sequences_path_dir = osp.join(vimeo90k_sequences_dir, sequences_path) + vimeo90k_flow_path_dir = osp.join(vimeo90k_flow_dir, sequences_path) + if not osp.exists(vimeo90k_flow_path_dir): + os.mkdir(vimeo90k_flow_path_dir) + + for sequences_id in sorted(os.listdir(vimeo90k_sequences_path_dir)): + vimeo90k_flow_id_dir = osp.join(vimeo90k_flow_path_dir, sequences_id) + if not osp.exists(vimeo90k_flow_id_dir): + os.mkdir(vimeo90k_flow_id_dir) + +for sequences_path in sorted(os.listdir(vimeo90k_sequences_dir)): + vimeo90k_sequences_path_dir = os.path.join(vimeo90k_sequences_dir, sequences_path) + vimeo90k_flow_path_dir = os.path.join(vimeo90k_flow_dir, sequences_path) + + for sequences_id in sorted(os.listdir(vimeo90k_sequences_path_dir)): + vimeo90k_sequences_id_dir = os.path.join(vimeo90k_sequences_path_dir, sequences_id) + vimeo90k_flow_id_dir = os.path.join(vimeo90k_flow_path_dir, sequences_id) + + img0_path = vimeo90k_sequences_id_dir + '/im1.png' + imgt_path = vimeo90k_sequences_id_dir + '/im2.png' + img1_path = vimeo90k_sequences_id_dir + '/im3.png' + flow_t0_path = vimeo90k_flow_id_dir + '/flow_t0.flo' + flow_t1_path = vimeo90k_flow_id_dir + '/flow_t1.flo' + + img0 = read(img0_path) + imgt = read(imgt_path) + img1 = read(img1_path) + + 
flow_t0 = pred_flow(imgt, img0)
+        flow_t1 = pred_flow(imgt, img1)
+
+        write(flow_t0_path, flow_t0)
+        write(flow_t1_path, flow_t1)
+
+    print('Written Sequences {}'.format(sequences_path))
\ No newline at end of file
diff --git a/vbench/third_party/amt/flow_generation/liteflownet/README.md b/vbench/third_party/amt/flow_generation/liteflownet/README.md
new file mode 100755
index 00000000..9511ad98
--- /dev/null
+++ b/vbench/third_party/amt/flow_generation/liteflownet/README.md
@@ -0,0 +1,45 @@
+# pytorch-liteflownet
+This is a personal reimplementation of LiteFlowNet [1] using PyTorch. Should you be making use of this work, please cite the paper accordingly. Also, make sure to adhere to the licensing terms of the authors. Should you be making use of this particular implementation, please acknowledge it appropriately [2].
+
+[Paper](https://arxiv.org/abs/1805.07036)
+
+For the original Caffe version of this work, please see: https://github.com/twhui/LiteFlowNet
+
      +Other optical flow implementations from me: [pytorch-pwc](https://github.com/sniklaus/pytorch-pwc), [pytorch-unflow](https://github.com/sniklaus/pytorch-unflow), [pytorch-spynet](https://github.com/sniklaus/pytorch-spynet) + +## setup +The correlation layer is implemented in CUDA using CuPy, which is why CuPy is a required dependency. It can be installed using `pip install cupy` or alternatively using one of the provided [binary packages](https://docs.cupy.dev/en/stable/install.html#installing-cupy) as outlined in the CuPy repository. If you would like to use Docker, you can take a look at [this](https://github.com/sniklaus/pytorch-liteflownet/pull/43) pull request to get started. + +## usage +To run it on your own pair of images, use the following command. You can choose between three models, please make sure to see their paper / the code for more details. + +``` +python run.py --model default --one ./images/one.png --two ./images/two.png --out ./out.flo +``` + +I am afraid that I cannot guarantee that this reimplementation is correct. However, it produced results pretty much identical to the implementation of the original authors in the examples that I tried. There are some numerical deviations that stem from differences in the `DownsampleLayer` of Caffe and the `torch.nn.functional.interpolate` function of PyTorch. Please feel free to contribute to this repository by submitting issues and pull requests. + +## comparison +

+*Comparison figure*
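+
+## usage from python
+In this repository, `flow_generation/gen_flow.py` calls the `estimate` helper from `run.py` directly instead of going through the CLI. A minimal sketch of the same pattern (image paths are placeholders; it assumes running from the AMT root, as `gen_flow.py` does, and RGB inputs in [0, 1] with shape [3, H, W]):
+
+```python
+import torch
+
+from utils.utils import read, write
+from flow_generation.liteflownet.run import estimate
+
+# read() returns an HWC uint8 image; convert to a normalized CHW float tensor.
+one = torch.from_numpy(read('./images/one.png')).float().permute(2, 0, 1) / 255.0
+two = torch.from_numpy(read('./images/two.png')).float().permute(2, 0, 1) / 255.0
+
+flow = estimate(one, two)  # [2, H, W] flow from `one` to `two`
+write('./out.flo', flow.permute(1, 2, 0).cpu().numpy())
+```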

      + +## license +As stated in the licensing terms of the authors of the paper, their material is provided for research purposes only. Please make sure to further consult their licensing terms. + +## references +``` +[1] @inproceedings{Hui_CVPR_2018, + author = {Tak-Wai Hui and Xiaoou Tang and Chen Change Loy}, + title = {{LiteFlowNet}: A Lightweight Convolutional Neural Network for Optical Flow Estimation}, + booktitle = {IEEE Conference on Computer Vision and Pattern Recognition}, + year = {2018} + } +``` + +``` +[2] @misc{pytorch-liteflownet, + author = {Simon Niklaus}, + title = {A Reimplementation of {LiteFlowNet} Using {PyTorch}}, + year = {2019}, + howpublished = {\url{https://github.com/sniklaus/pytorch-liteflownet}} + } +``` \ No newline at end of file diff --git a/vbench/third_party/amt/flow_generation/liteflownet/__init__.py b/vbench/third_party/amt/flow_generation/liteflownet/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/flow_generation/liteflownet/correlation/README.md b/vbench/third_party/amt/flow_generation/liteflownet/correlation/README.md new file mode 100755 index 00000000..e80f923b --- /dev/null +++ b/vbench/third_party/amt/flow_generation/liteflownet/correlation/README.md @@ -0,0 +1 @@ +This is an adaptation of the FlowNet2 implementation in order to compute cost volumes. Should you be making use of this work, please make sure to adhere to the licensing terms of the original authors. Should you be making use or modify this particular implementation, please acknowledge it appropriately. \ No newline at end of file diff --git a/vbench/third_party/amt/flow_generation/liteflownet/correlation/correlation.py b/vbench/third_party/amt/flow_generation/liteflownet/correlation/correlation.py new file mode 100755 index 00000000..212af710 --- /dev/null +++ b/vbench/third_party/amt/flow_generation/liteflownet/correlation/correlation.py @@ -0,0 +1,396 @@ +#!/usr/bin/env python + +import cupy +import math +import re +import torch + +kernel_Correlation_rearrange = ''' + extern "C" __global__ void kernel_Correlation_rearrange( + const int n, + const float* input, + float* output + ) { + int intIndex = (blockIdx.x * blockDim.x) + threadIdx.x; + if (intIndex >= n) { + return; + } + int intSample = blockIdx.z; + int intChannel = blockIdx.y; + float fltValue = input[(((intSample * SIZE_1(input)) + intChannel) * SIZE_2(input) * SIZE_3(input)) + intIndex]; + __syncthreads(); + int intPaddedY = (intIndex / SIZE_3(input)) + 3*{{intStride}}; + int intPaddedX = (intIndex % SIZE_3(input)) + 3*{{intStride}}; + int intRearrange = ((SIZE_3(input) + 6*{{intStride}}) * intPaddedY) + intPaddedX; + output[(((intSample * SIZE_1(output) * SIZE_2(output)) + intRearrange) * SIZE_1(input)) + intChannel] = fltValue; + } +''' + +kernel_Correlation_updateOutput = ''' + extern "C" __global__ void kernel_Correlation_updateOutput( + const int n, + const float* rbot0, + const float* rbot1, + float* top + ) { + extern __shared__ char patch_data_char[]; + + float *patch_data = (float *)patch_data_char; + + // First (upper left) position of kernel upper-left corner in current center position of neighborhood in image 1 + int x1 = (blockIdx.x + 3) * {{intStride}}; + int y1 = (blockIdx.y + 3) * {{intStride}}; + int item = blockIdx.z; + int ch_off = threadIdx.x; + + // Load 3D patch into shared shared memory + for (int j = 0; j < 1; j++) { // HEIGHT + for (int i = 0; i < 1; i++) { // WIDTH + int ji_off = (j + i) * SIZE_3(rbot0); + for (int ch = ch_off; ch < SIZE_3(rbot0); 
ch += 32) { // CHANNELS + int idx1 = ((item * SIZE_1(rbot0) + y1+j) * SIZE_2(rbot0) + x1+i) * SIZE_3(rbot0) + ch; + int idxPatchData = ji_off + ch; + patch_data[idxPatchData] = rbot0[idx1]; + } + } + } + + __syncthreads(); + + __shared__ float sum[32]; + + // Compute correlation + for (int top_channel = 0; top_channel < SIZE_1(top); top_channel++) { + sum[ch_off] = 0; + + int s2o = (top_channel % 7 - 3) * {{intStride}}; + int s2p = (top_channel / 7 - 3) * {{intStride}}; + + for (int j = 0; j < 1; j++) { // HEIGHT + for (int i = 0; i < 1; i++) { // WIDTH + int ji_off = (j + i) * SIZE_3(rbot0); + for (int ch = ch_off; ch < SIZE_3(rbot0); ch += 32) { // CHANNELS + int x2 = x1 + s2o; + int y2 = y1 + s2p; + + int idxPatchData = ji_off + ch; + int idx2 = ((item * SIZE_1(rbot0) + y2+j) * SIZE_2(rbot0) + x2+i) * SIZE_3(rbot0) + ch; + + sum[ch_off] += patch_data[idxPatchData] * rbot1[idx2]; + } + } + } + + __syncthreads(); + + if (ch_off == 0) { + float total_sum = 0; + for (int idx = 0; idx < 32; idx++) { + total_sum += sum[idx]; + } + const int sumelems = SIZE_3(rbot0); + const int index = ((top_channel*SIZE_2(top) + blockIdx.y)*SIZE_3(top))+blockIdx.x; + top[index + item*SIZE_1(top)*SIZE_2(top)*SIZE_3(top)] = total_sum / (float)sumelems; + } + } + } +''' + +kernel_Correlation_updateGradOne = ''' + #define ROUND_OFF 50000 + extern "C" __global__ void kernel_Correlation_updateGradOne( + const int n, + const int intSample, + const float* rbot0, + const float* rbot1, + const float* gradOutput, + float* gradOne, + float* gradTwo + ) { for (int intIndex = (blockIdx.x * blockDim.x) + threadIdx.x; intIndex < n; intIndex += blockDim.x * gridDim.x) { + int n = intIndex % SIZE_1(gradOne); // channels + int l = (intIndex / SIZE_1(gradOne)) % SIZE_3(gradOne) + 3*{{intStride}}; // w-pos + int m = (intIndex / SIZE_1(gradOne) / SIZE_3(gradOne)) % SIZE_2(gradOne) + 3*{{intStride}}; // h-pos + + // round_off is a trick to enable integer division with ceil, even for negative numbers + // We use a large offset, for the inner part not to become negative. 
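+      // Worked example of the trick, assuming {{intStride}} = 2 and ROUND_OFF = 50000: for
+      // l - 3*{{intStride}} = -2 we want ceil(-2 / 2) = -1, but C integer division truncates
+      // toward zero, so the plain formula (-2 - 1) / 2 + 1 evaluates to -1 + 1 = 0. With the
+      // offset, (-2 + 2*50000 - 1) / 2 + 1 - 50000 = 49998 + 1 - 50000 = -1: keeping the
+      // numerator positive makes truncation coincide with floor, so the ceil identity holds.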
+ const int round_off = ROUND_OFF; + const int round_off_s1 = {{intStride}} * round_off; + + // We add round_off before_s1 the int division and subtract round_off after it, to ensure the formula matches ceil behavior: + int xmin = (l - 3*{{intStride}} + round_off_s1 - 1) / {{intStride}} + 1 - round_off; // ceil (l - 3*{{intStride}}) / {{intStride}} + int ymin = (m - 3*{{intStride}} + round_off_s1 - 1) / {{intStride}} + 1 - round_off; // ceil (l - 3*{{intStride}}) / {{intStride}} + + // Same here: + int xmax = (l - 3*{{intStride}} + round_off_s1) / {{intStride}} - round_off; // floor (l - 3*{{intStride}}) / {{intStride}} + int ymax = (m - 3*{{intStride}} + round_off_s1) / {{intStride}} - round_off; // floor (m - 3*{{intStride}}) / {{intStride}} + + float sum = 0; + if (xmax>=0 && ymax>=0 && (xmin<=SIZE_3(gradOutput)-1) && (ymin<=SIZE_2(gradOutput)-1)) { + xmin = max(0,xmin); + xmax = min(SIZE_3(gradOutput)-1,xmax); + + ymin = max(0,ymin); + ymax = min(SIZE_2(gradOutput)-1,ymax); + + for (int p = -3; p <= 3; p++) { + for (int o = -3; o <= 3; o++) { + // Get rbot1 data: + int s2o = {{intStride}} * o; + int s2p = {{intStride}} * p; + int idxbot1 = ((intSample * SIZE_1(rbot0) + (m+s2p)) * SIZE_2(rbot0) + (l+s2o)) * SIZE_3(rbot0) + n; + float bot1tmp = rbot1[idxbot1]; // rbot1[l+s2o,m+s2p,n] + + // Index offset for gradOutput in following loops: + int op = (p+3) * 7 + (o+3); // index[o,p] + int idxopoffset = (intSample * SIZE_1(gradOutput) + op); + + for (int y = ymin; y <= ymax; y++) { + for (int x = xmin; x <= xmax; x++) { + int idxgradOutput = (idxopoffset * SIZE_2(gradOutput) + y) * SIZE_3(gradOutput) + x; // gradOutput[x,y,o,p] + sum += gradOutput[idxgradOutput] * bot1tmp; + } + } + } + } + } + const int sumelems = SIZE_1(gradOne); + const int bot0index = ((n * SIZE_2(gradOne)) + (m-3*{{intStride}})) * SIZE_3(gradOne) + (l-3*{{intStride}}); + gradOne[bot0index + intSample*SIZE_1(gradOne)*SIZE_2(gradOne)*SIZE_3(gradOne)] = sum / (float)sumelems; + } } +''' + +kernel_Correlation_updateGradTwo = ''' + #define ROUND_OFF 50000 + extern "C" __global__ void kernel_Correlation_updateGradTwo( + const int n, + const int intSample, + const float* rbot0, + const float* rbot1, + const float* gradOutput, + float* gradOne, + float* gradTwo + ) { for (int intIndex = (blockIdx.x * blockDim.x) + threadIdx.x; intIndex < n; intIndex += blockDim.x * gridDim.x) { + int n = intIndex % SIZE_1(gradTwo); // channels + int l = (intIndex / SIZE_1(gradTwo)) % SIZE_3(gradTwo) + 3*{{intStride}}; // w-pos + int m = (intIndex / SIZE_1(gradTwo) / SIZE_3(gradTwo)) % SIZE_2(gradTwo) + 3*{{intStride}}; // h-pos + + // round_off is a trick to enable integer division with ceil, even for negative numbers + // We use a large offset, for the inner part not to become negative. 
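+      // This kernel mirrors kernel_Correlation_updateGradOne with the roles of the two inputs
+      // swapped: the xmin/xmax and ymin/ymax ranges below are additionally shifted by the
+      // displacement (s2o, s2p), and rbot0 is sampled at (l - s2o, m - s2p) rather than
+      // rbot1 at (l + s2o, m + s2p).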
+ const int round_off = ROUND_OFF; + const int round_off_s1 = {{intStride}} * round_off; + + float sum = 0; + for (int p = -3; p <= 3; p++) { + for (int o = -3; o <= 3; o++) { + int s2o = {{intStride}} * o; + int s2p = {{intStride}} * p; + + //Get X,Y ranges and clamp + // We add round_off before_s1 the int division and subtract round_off after it, to ensure the formula matches ceil behavior: + int xmin = (l - 3*{{intStride}} - s2o + round_off_s1 - 1) / {{intStride}} + 1 - round_off; // ceil (l - 3*{{intStride}} - s2o) / {{intStride}} + int ymin = (m - 3*{{intStride}} - s2p + round_off_s1 - 1) / {{intStride}} + 1 - round_off; // ceil (l - 3*{{intStride}} - s2o) / {{intStride}} + + // Same here: + int xmax = (l - 3*{{intStride}} - s2o + round_off_s1) / {{intStride}} - round_off; // floor (l - 3*{{intStride}} - s2o) / {{intStride}} + int ymax = (m - 3*{{intStride}} - s2p + round_off_s1) / {{intStride}} - round_off; // floor (m - 3*{{intStride}} - s2p) / {{intStride}} + + if (xmax>=0 && ymax>=0 && (xmin<=SIZE_3(gradOutput)-1) && (ymin<=SIZE_2(gradOutput)-1)) { + xmin = max(0,xmin); + xmax = min(SIZE_3(gradOutput)-1,xmax); + + ymin = max(0,ymin); + ymax = min(SIZE_2(gradOutput)-1,ymax); + + // Get rbot0 data: + int idxbot0 = ((intSample * SIZE_1(rbot0) + (m-s2p)) * SIZE_2(rbot0) + (l-s2o)) * SIZE_3(rbot0) + n; + float bot0tmp = rbot0[idxbot0]; // rbot1[l+s2o,m+s2p,n] + + // Index offset for gradOutput in following loops: + int op = (p+3) * 7 + (o+3); // index[o,p] + int idxopoffset = (intSample * SIZE_1(gradOutput) + op); + + for (int y = ymin; y <= ymax; y++) { + for (int x = xmin; x <= xmax; x++) { + int idxgradOutput = (idxopoffset * SIZE_2(gradOutput) + y) * SIZE_3(gradOutput) + x; // gradOutput[x,y,o,p] + sum += gradOutput[idxgradOutput] * bot0tmp; + } + } + } + } + } + const int sumelems = SIZE_1(gradTwo); + const int bot1index = ((n * SIZE_2(gradTwo)) + (m-3*{{intStride}})) * SIZE_3(gradTwo) + (l-3*{{intStride}}); + gradTwo[bot1index + intSample*SIZE_1(gradTwo)*SIZE_2(gradTwo)*SIZE_3(gradTwo)] = sum / (float)sumelems; + } } +''' + +def cupy_kernel(strFunction, objVariables): + strKernel = globals()[strFunction].replace('{{intStride}}', str(objVariables['intStride'])) + + while True: + objMatch = re.search('(SIZE_)([0-4])(\()([^\)]*)(\))', strKernel) + + if objMatch is None: + break + # end + + intArg = int(objMatch.group(2)) + + strTensor = objMatch.group(4) + intSizes = objVariables[strTensor].size() + + strKernel = strKernel.replace(objMatch.group(), str(intSizes[intArg] if torch.is_tensor(intSizes[intArg]) == False else intSizes[intArg].item())) + # end + + while True: + objMatch = re.search('(VALUE_)([0-4])(\()([^\)]+)(\))', strKernel) + + if objMatch is None: + break + # end + + intArgs = int(objMatch.group(2)) + strArgs = objMatch.group(4).split(',') + + strTensor = strArgs[0] + intStrides = objVariables[strTensor].stride() + strIndex = [ '((' + strArgs[intArg + 1].replace('{', '(').replace('}', ')').strip() + ')*' + str(intStrides[intArg] if torch.is_tensor(intStrides[intArg]) == False else intStrides[intArg].item()) + ')' for intArg in range(intArgs) ] + + strKernel = strKernel.replace(objMatch.group(0), strTensor + '[' + str.join('+', strIndex) + ']') + # end + + return strKernel +# end + +@cupy.memoize(for_each_device=True) +def cupy_launch(strFunction, strKernel): + return cupy.cuda.compile_with_cache(strKernel).get_function(strFunction) +# end + +class _FunctionCorrelation(torch.autograd.Function): + @staticmethod + def forward(self, one, two, intStride): + rbot0 = 
one.new_zeros([ one.shape[0], one.shape[2] + (6 * intStride), one.shape[3] + (6 * intStride), one.shape[1] ]) + rbot1 = one.new_zeros([ one.shape[0], one.shape[2] + (6 * intStride), one.shape[3] + (6 * intStride), one.shape[1] ]) + + self.intStride = intStride + + one = one.contiguous(); assert(one.is_cuda == True) + two = two.contiguous(); assert(two.is_cuda == True) + + output = one.new_zeros([ one.shape[0], 49, int(math.ceil(one.shape[2] / intStride)), int(math.ceil(one.shape[3] / intStride)) ]) + + if one.is_cuda == True: + n = one.shape[2] * one.shape[3] + cupy_launch('kernel_Correlation_rearrange', cupy_kernel('kernel_Correlation_rearrange', { + 'intStride': self.intStride, + 'input': one, + 'output': rbot0 + }))( + grid=tuple([ int((n + 16 - 1) / 16), one.shape[1], one.shape[0] ]), + block=tuple([ 16, 1, 1 ]), + args=[ cupy.int32(n), one.data_ptr(), rbot0.data_ptr() ] + ) + + n = two.shape[2] * two.shape[3] + cupy_launch('kernel_Correlation_rearrange', cupy_kernel('kernel_Correlation_rearrange', { + 'intStride': self.intStride, + 'input': two, + 'output': rbot1 + }))( + grid=tuple([ int((n + 16 - 1) / 16), two.shape[1], two.shape[0] ]), + block=tuple([ 16, 1, 1 ]), + args=[ cupy.int32(n), two.data_ptr(), rbot1.data_ptr() ] + ) + + n = output.shape[1] * output.shape[2] * output.shape[3] + cupy_launch('kernel_Correlation_updateOutput', cupy_kernel('kernel_Correlation_updateOutput', { + 'intStride': self.intStride, + 'rbot0': rbot0, + 'rbot1': rbot1, + 'top': output + }))( + grid=tuple([ output.shape[3], output.shape[2], output.shape[0] ]), + block=tuple([ 32, 1, 1 ]), + shared_mem=one.shape[1] * 4, + args=[ cupy.int32(n), rbot0.data_ptr(), rbot1.data_ptr(), output.data_ptr() ] + ) + + elif one.is_cuda == False: + raise NotImplementedError() + + # end + + self.save_for_backward(one, two, rbot0, rbot1) + + return output + # end + + @staticmethod + def backward(self, gradOutput): + one, two, rbot0, rbot1 = self.saved_tensors + + gradOutput = gradOutput.contiguous(); assert(gradOutput.is_cuda == True) + + gradOne = one.new_zeros([ one.shape[0], one.shape[1], one.shape[2], one.shape[3] ]) if self.needs_input_grad[0] == True else None + gradTwo = one.new_zeros([ one.shape[0], one.shape[1], one.shape[2], one.shape[3] ]) if self.needs_input_grad[1] == True else None + + if one.is_cuda == True: + if gradOne is not None: + for intSample in range(one.shape[0]): + n = one.shape[1] * one.shape[2] * one.shape[3] + cupy_launch('kernel_Correlation_updateGradOne', cupy_kernel('kernel_Correlation_updateGradOne', { + 'intStride': self.intStride, + 'rbot0': rbot0, + 'rbot1': rbot1, + 'gradOutput': gradOutput, + 'gradOne': gradOne, + 'gradTwo': None + }))( + grid=tuple([ int((n + 512 - 1) / 512), 1, 1 ]), + block=tuple([ 512, 1, 1 ]), + args=[ cupy.int32(n), intSample, rbot0.data_ptr(), rbot1.data_ptr(), gradOutput.data_ptr(), gradOne.data_ptr(), None ] + ) + # end + # end + + if gradTwo is not None: + for intSample in range(one.shape[0]): + n = one.shape[1] * one.shape[2] * one.shape[3] + cupy_launch('kernel_Correlation_updateGradTwo', cupy_kernel('kernel_Correlation_updateGradTwo', { + 'intStride': self.intStride, + 'rbot0': rbot0, + 'rbot1': rbot1, + 'gradOutput': gradOutput, + 'gradOne': None, + 'gradTwo': gradTwo + }))( + grid=tuple([ int((n + 512 - 1) / 512), 1, 1 ]), + block=tuple([ 512, 1, 1 ]), + args=[ cupy.int32(n), intSample, rbot0.data_ptr(), rbot1.data_ptr(), gradOutput.data_ptr(), None, gradTwo.data_ptr() ] + ) + # end + # end + + elif one.is_cuda == False: + raise NotImplementedError() + + 
# end + + return gradOne, gradTwo, None + # end +# end + +def FunctionCorrelation(tenOne, tenTwo, intStride): + return _FunctionCorrelation.apply(tenOne, tenTwo, intStride) +# end + +class ModuleCorrelation(torch.nn.Module): + def __init__(self): + super().__init__() + # end + + def forward(self, tenOne, tenTwo, intStride): + return _FunctionCorrelation.apply(tenOne, tenTwo, intStride) + # end +# end \ No newline at end of file diff --git a/vbench/third_party/amt/flow_generation/liteflownet/run.py b/vbench/third_party/amt/flow_generation/liteflownet/run.py new file mode 100755 index 00000000..1957621f --- /dev/null +++ b/vbench/third_party/amt/flow_generation/liteflownet/run.py @@ -0,0 +1,385 @@ +#!/usr/bin/env python + +import getopt +import math +import numpy +import PIL +import PIL.Image +import sys +import torch + +try: + from .correlation import correlation # the custom cost volume layer +except: + sys.path.insert(0, './correlation'); import correlation # you should consider upgrading python +# end + +########################################################## + +assert(int(str('').join(torch.__version__.split('.')[0:2])) >= 13) # requires at least pytorch version 1.3.0 + +torch.set_grad_enabled(False) # make sure to not compute gradients for computational performance + +torch.backends.cudnn.enabled = True # make sure to use cudnn for computational performance + +########################################################## + +arguments_strModel = 'default' # 'default', or 'kitti', or 'sintel' +arguments_strOne = './images/one.png' +arguments_strTwo = './images/two.png' +arguments_strOut = './out.flo' + +for strOption, strArgument in getopt.getopt(sys.argv[1:], '', [ strParameter[2:] + '=' for strParameter in sys.argv[1::2] ])[0]: + if strOption == '--model' and strArgument != '': arguments_strModel = strArgument # which model to use + if strOption == '--one' and strArgument != '': arguments_strOne = strArgument # path to the first frame + if strOption == '--two' and strArgument != '': arguments_strTwo = strArgument # path to the second frame + if strOption == '--out' and strArgument != '': arguments_strOut = strArgument # path to where the output should be stored +# end + +########################################################## + +backwarp_tenGrid = {} + +def backwarp(tenInput, tenFlow): + if str(tenFlow.shape) not in backwarp_tenGrid: + tenHor = torch.linspace(-1.0 + (1.0 / tenFlow.shape[3]), 1.0 - (1.0 / tenFlow.shape[3]), tenFlow.shape[3]).view(1, 1, 1, -1).repeat(1, 1, tenFlow.shape[2], 1) + tenVer = torch.linspace(-1.0 + (1.0 / tenFlow.shape[2]), 1.0 - (1.0 / tenFlow.shape[2]), tenFlow.shape[2]).view(1, 1, -1, 1).repeat(1, 1, 1, tenFlow.shape[3]) + + backwarp_tenGrid[str(tenFlow.shape)] = torch.cat([ tenHor, tenVer ], 1).cuda() + # end + + tenFlow = torch.cat([ tenFlow[:, 0:1, :, :] / ((tenInput.shape[3] - 1.0) / 2.0), tenFlow[:, 1:2, :, :] / ((tenInput.shape[2] - 1.0) / 2.0) ], 1) + + return torch.nn.functional.grid_sample(input=tenInput, grid=(backwarp_tenGrid[str(tenFlow.shape)] + tenFlow).permute(0, 2, 3, 1), mode='bilinear', padding_mode='zeros', align_corners=False) +# end + +########################################################## + +class Network(torch.nn.Module): + def __init__(self): + super().__init__() + + class Features(torch.nn.Module): + def __init__(self): + super().__init__() + + self.netOne = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=3, out_channels=32, kernel_size=7, stride=1, padding=3), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + 
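+            # netOne keeps the input resolution; netTwo through netSix each halve it,
+            # giving a six-level feature pyramid at 1/1, 1/2, 1/4, 1/8, 1/16, and 1/32
+            # scale with 32, 32, 64, 96, 128, and 192 channels respectively.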
+ self.netTwo = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=32, kernel_size=3, stride=2, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=32, out_channels=32, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=32, out_channels=32, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + self.netThr = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=2, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + self.netFou = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=64, out_channels=96, kernel_size=3, stride=2, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=96, out_channels=96, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + self.netFiv = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=96, out_channels=128, kernel_size=3, stride=2, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + self.netSix = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=128, out_channels=192, kernel_size=3, stride=2, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + # end + + def forward(self, tenInput): + tenOne = self.netOne(tenInput) + tenTwo = self.netTwo(tenOne) + tenThr = self.netThr(tenTwo) + tenFou = self.netFou(tenThr) + tenFiv = self.netFiv(tenFou) + tenSix = self.netSix(tenFiv) + + return [ tenOne, tenTwo, tenThr, tenFou, tenFiv, tenSix ] + # end + # end + + class Matching(torch.nn.Module): + def __init__(self, intLevel): + super().__init__() + + self.fltBackwarp = [ 0.0, 0.0, 10.0, 5.0, 2.5, 1.25, 0.625 ][intLevel] + + if intLevel != 2: + self.netFeat = torch.nn.Sequential() + + elif intLevel == 2: + self.netFeat = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=64, kernel_size=1, stride=1, padding=0), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + # end + + if intLevel == 6: + self.netUpflow = None + + elif intLevel != 6: + self.netUpflow = torch.nn.ConvTranspose2d(in_channels=2, out_channels=2, kernel_size=4, stride=2, padding=1, bias=False, groups=2) + + # end + + if intLevel >= 4: + self.netUpcorr = None + + elif intLevel < 4: + self.netUpcorr = torch.nn.ConvTranspose2d(in_channels=49, out_channels=49, kernel_size=4, stride=2, padding=1, bias=False, groups=49) + + # end + + self.netMain = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=49, out_channels=128, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=128, out_channels=64, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=64, out_channels=32, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=32, out_channels=2, kernel_size=[ 0, 0, 7, 5, 5, 3, 3 ][intLevel], stride=1, padding=[ 0, 0, 3, 2, 2, 1, 1 ][intLevel]) + ) + # end + + def forward(self, tenOne, tenTwo, tenFeaturesOne, tenFeaturesTwo, tenFlow): + tenFeaturesOne = self.netFeat(tenFeaturesOne) + tenFeaturesTwo = self.netFeat(tenFeaturesTwo) + + if tenFlow is not None: + tenFlow = 
self.netUpflow(tenFlow) + # end + + if tenFlow is not None: + tenFeaturesTwo = backwarp(tenInput=tenFeaturesTwo, tenFlow=tenFlow * self.fltBackwarp) + # end + + if self.netUpcorr is None: + tenCorrelation = torch.nn.functional.leaky_relu(input=correlation.FunctionCorrelation(tenOne=tenFeaturesOne, tenTwo=tenFeaturesTwo, intStride=1), negative_slope=0.1, inplace=False) + + elif self.netUpcorr is not None: + tenCorrelation = self.netUpcorr(torch.nn.functional.leaky_relu(input=correlation.FunctionCorrelation(tenOne=tenFeaturesOne, tenTwo=tenFeaturesTwo, intStride=2), negative_slope=0.1, inplace=False)) + + # end + + return (tenFlow if tenFlow is not None else 0.0) + self.netMain(tenCorrelation) + # end + # end + + class Subpixel(torch.nn.Module): + def __init__(self, intLevel): + super().__init__() + + self.fltBackward = [ 0.0, 0.0, 10.0, 5.0, 2.5, 1.25, 0.625 ][intLevel] + + if intLevel != 2: + self.netFeat = torch.nn.Sequential() + + elif intLevel == 2: + self.netFeat = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=64, kernel_size=1, stride=1, padding=0), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + # end + + self.netMain = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=[ 0, 0, 130, 130, 194, 258, 386 ][intLevel], out_channels=128, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=128, out_channels=64, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=64, out_channels=32, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=32, out_channels=2, kernel_size=[ 0, 0, 7, 5, 5, 3, 3 ][intLevel], stride=1, padding=[ 0, 0, 3, 2, 2, 1, 1 ][intLevel]) + ) + # end + + def forward(self, tenOne, tenTwo, tenFeaturesOne, tenFeaturesTwo, tenFlow): + tenFeaturesOne = self.netFeat(tenFeaturesOne) + tenFeaturesTwo = self.netFeat(tenFeaturesTwo) + + if tenFlow is not None: + tenFeaturesTwo = backwarp(tenInput=tenFeaturesTwo, tenFlow=tenFlow * self.fltBackward) + # end + + return (tenFlow if tenFlow is not None else 0.0) + self.netMain(torch.cat([ tenFeaturesOne, tenFeaturesTwo, tenFlow ], 1)) + # end + # end + + class Regularization(torch.nn.Module): + def __init__(self, intLevel): + super().__init__() + + self.fltBackward = [ 0.0, 0.0, 10.0, 5.0, 2.5, 1.25, 0.625 ][intLevel] + + self.intUnfold = [ 0, 0, 7, 5, 5, 3, 3 ][intLevel] + + if intLevel >= 5: + self.netFeat = torch.nn.Sequential() + + elif intLevel < 5: + self.netFeat = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=[ 0, 0, 32, 64, 96, 128, 192 ][intLevel], out_channels=128, kernel_size=1, stride=1, padding=0), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + # end + + self.netMain = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=[ 0, 0, 131, 131, 131, 131, 195 ][intLevel], out_channels=128, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=128, out_channels=128, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=128, out_channels=64, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=64, out_channels=32, 
kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1), + torch.nn.Conv2d(in_channels=32, out_channels=32, kernel_size=3, stride=1, padding=1), + torch.nn.LeakyReLU(inplace=False, negative_slope=0.1) + ) + + if intLevel >= 5: + self.netDist = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], kernel_size=[ 0, 0, 7, 5, 5, 3, 3 ][intLevel], stride=1, padding=[ 0, 0, 3, 2, 2, 1, 1 ][intLevel]) + ) + + elif intLevel < 5: + self.netDist = torch.nn.Sequential( + torch.nn.Conv2d(in_channels=32, out_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], kernel_size=([ 0, 0, 7, 5, 5, 3, 3 ][intLevel], 1), stride=1, padding=([ 0, 0, 3, 2, 2, 1, 1 ][intLevel], 0)), + torch.nn.Conv2d(in_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], out_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], kernel_size=(1, [ 0, 0, 7, 5, 5, 3, 3 ][intLevel]), stride=1, padding=(0, [ 0, 0, 3, 2, 2, 1, 1 ][intLevel])) + ) + + # end + + self.netScaleX = torch.nn.Conv2d(in_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], out_channels=1, kernel_size=1, stride=1, padding=0) + self.netScaleY = torch.nn.Conv2d(in_channels=[ 0, 0, 49, 25, 25, 9, 9 ][intLevel], out_channels=1, kernel_size=1, stride=1, padding=0) + # eny + + def forward(self, tenOne, tenTwo, tenFeaturesOne, tenFeaturesTwo, tenFlow): + tenDifference = ((tenOne - backwarp(tenInput=tenTwo, tenFlow=tenFlow * self.fltBackward)) ** 2).sum(1, True).sqrt().detach() + + tenDist = self.netDist(self.netMain(torch.cat([ tenDifference, tenFlow - tenFlow.view(tenFlow.shape[0], 2, -1).mean(2, True).view(tenFlow.shape[0], 2, 1, 1), self.netFeat(tenFeaturesOne) ], 1))) + tenDist = (tenDist ** 2).neg() + tenDist = (tenDist - tenDist.max(1, True)[0]).exp() + + tenDivisor = tenDist.sum(1, True).reciprocal() + + tenScaleX = self.netScaleX(tenDist * torch.nn.functional.unfold(input=tenFlow[:, 0:1, :, :], kernel_size=self.intUnfold, stride=1, padding=int((self.intUnfold - 1) / 2)).view_as(tenDist)) * tenDivisor + tenScaleY = self.netScaleY(tenDist * torch.nn.functional.unfold(input=tenFlow[:, 1:2, :, :], kernel_size=self.intUnfold, stride=1, padding=int((self.intUnfold - 1) / 2)).view_as(tenDist)) * tenDivisor + + return torch.cat([ tenScaleX, tenScaleY ], 1) + # end + # end + + self.netFeatures = Features() + self.netMatching = torch.nn.ModuleList([ Matching(intLevel) for intLevel in [ 2, 3, 4, 5, 6 ] ]) + self.netSubpixel = torch.nn.ModuleList([ Subpixel(intLevel) for intLevel in [ 2, 3, 4, 5, 6 ] ]) + self.netRegularization = torch.nn.ModuleList([ Regularization(intLevel) for intLevel in [ 2, 3, 4, 5, 6 ] ]) + + self.load_state_dict({ strKey.replace('module', 'net'): tenWeight for strKey, tenWeight in torch.hub.load_state_dict_from_url(url='http://content.sniklaus.com/github/pytorch-liteflownet/network-' + arguments_strModel + '.pytorch').items() }) + # self.load_state_dict(torch.load('./liteflownet/network-default.pth')) + # end + + def forward(self, tenOne, tenTwo): + tenOne[:, 0, :, :] = tenOne[:, 0, :, :] - 0.411618 + tenOne[:, 1, :, :] = tenOne[:, 1, :, :] - 0.434631 + tenOne[:, 2, :, :] = tenOne[:, 2, :, :] - 0.454253 + + tenTwo[:, 0, :, :] = tenTwo[:, 0, :, :] - 0.410782 + tenTwo[:, 1, :, :] = tenTwo[:, 1, :, :] - 0.433645 + tenTwo[:, 2, :, :] = tenTwo[:, 2, :, :] - 0.452793 + + tenFeaturesOne = self.netFeatures(tenOne) + tenFeaturesTwo = self.netFeatures(tenTwo) + + tenOne = [ tenOne ] + tenTwo = [ tenTwo ] + + for intLevel in [ 1, 2, 3, 4, 5 ]: + 
tenOne.append(torch.nn.functional.interpolate(input=tenOne[-1], size=(tenFeaturesOne[intLevel].shape[2], tenFeaturesOne[intLevel].shape[3]), mode='bilinear', align_corners=False)) + tenTwo.append(torch.nn.functional.interpolate(input=tenTwo[-1], size=(tenFeaturesTwo[intLevel].shape[2], tenFeaturesTwo[intLevel].shape[3]), mode='bilinear', align_corners=False)) + # end + + tenFlow = None + + for intLevel in [ -1, -2, -3, -4, -5 ]: + tenFlow = self.netMatching[intLevel](tenOne[intLevel], tenTwo[intLevel], tenFeaturesOne[intLevel], tenFeaturesTwo[intLevel], tenFlow) + tenFlow = self.netSubpixel[intLevel](tenOne[intLevel], tenTwo[intLevel], tenFeaturesOne[intLevel], tenFeaturesTwo[intLevel], tenFlow) + tenFlow = self.netRegularization[intLevel](tenOne[intLevel], tenTwo[intLevel], tenFeaturesOne[intLevel], tenFeaturesTwo[intLevel], tenFlow) + # end + + return tenFlow * 20.0 + # end +# end + +netNetwork = None + +########################################################## + +def estimate(tenOne, tenTwo): + global netNetwork + + if netNetwork is None: + netNetwork = Network().cuda().eval() + # end + + assert(tenOne.shape[1] == tenTwo.shape[1]) + assert(tenOne.shape[2] == tenTwo.shape[2]) + + intWidth = tenOne.shape[2] + intHeight = tenOne.shape[1] + + # assert(intWidth == 1024) # remember that there is no guarantee for correctness, comment this line out if you acknowledge this and want to continue + # assert(intHeight == 436) # remember that there is no guarantee for correctness, comment this line out if you acknowledge this and want to continue + + tenPreprocessedOne = tenOne.cuda().view(1, 3, intHeight, intWidth) + tenPreprocessedTwo = tenTwo.cuda().view(1, 3, intHeight, intWidth) + + intPreprocessedWidth = int(math.floor(math.ceil(intWidth / 32.0) * 32.0)) + intPreprocessedHeight = int(math.floor(math.ceil(intHeight / 32.0) * 32.0)) + + tenPreprocessedOne = torch.nn.functional.interpolate(input=tenPreprocessedOne, size=(intPreprocessedHeight, intPreprocessedWidth), mode='bilinear', align_corners=False) + tenPreprocessedTwo = torch.nn.functional.interpolate(input=tenPreprocessedTwo, size=(intPreprocessedHeight, intPreprocessedWidth), mode='bilinear', align_corners=False) + + tenFlow = torch.nn.functional.interpolate(input=netNetwork(tenPreprocessedOne, tenPreprocessedTwo), size=(intHeight, intWidth), mode='bilinear', align_corners=False) + + tenFlow[:, 0, :, :] *= float(intWidth) / float(intPreprocessedWidth) + tenFlow[:, 1, :, :] *= float(intHeight) / float(intPreprocessedHeight) + + return tenFlow[0, :, :, :].cpu() +# end + +########################################################## + +if __name__ == '__main__': + tenOne = torch.FloatTensor(numpy.ascontiguousarray(numpy.array(PIL.Image.open(arguments_strOne))[:, :, ::-1].transpose(2, 0, 1).astype(numpy.float32) * (1.0 / 255.0))) + tenTwo = torch.FloatTensor(numpy.ascontiguousarray(numpy.array(PIL.Image.open(arguments_strTwo))[:, :, ::-1].transpose(2, 0, 1).astype(numpy.float32) * (1.0 / 255.0))) + + tenOutput = estimate(tenOne, tenTwo) + + objOutput = open(arguments_strOut, 'wb') + + numpy.array([ 80, 73, 69, 72 ], numpy.uint8).tofile(objOutput) + numpy.array([ tenOutput.shape[2], tenOutput.shape[1] ], numpy.int32).tofile(objOutput) + numpy.array(tenOutput.numpy().transpose(1, 2, 0), numpy.float32).tofile(objOutput) + + objOutput.close() +# end \ No newline at end of file diff --git a/vbench/third_party/amt/losses/__init__.py b/vbench/third_party/amt/losses/__init__.py new file mode 100755 index 00000000..e69de29b diff --git 
a/vbench/third_party/amt/losses/loss.py b/vbench/third_party/amt/losses/loss.py
new file mode 100755
index 00000000..8d6ff33d
--- /dev/null
+++ b/vbench/third_party/amt/losses/loss.py
@@ -0,0 +1,196 @@
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import numpy as np
+
+
+class Loss(nn.Module):
+    def __init__(self, loss_weight, keys, mapping=None) -> None:
+        '''
+        mapping: map the kwargs keys into desired ones.
+        '''
+        super().__init__()
+        self.loss_weight = loss_weight
+        self.keys = keys
+        self.mapping = mapping
+        if isinstance(mapping, dict):
+            # iterate over key/value pairs explicitly; iterating a dict directly yields only keys
+            self.mapping = {k: v for k, v in mapping.items() if v in keys}
+
+
+    def forward(self, **kwargs):
+        params = {k: v for k, v in kwargs.items() if k in self.keys}
+        if self.mapping is not None:
+            for k, v in kwargs.items():
+                if self.mapping.get(k) is not None:
+                    params[self.mapping[k]] = v
+
+        return self._forward(**params) * self.loss_weight
+
+    def _forward(self, **kwargs):
+        pass
+
+
+class CharbonnierLoss(Loss):
+    def __init__(self, loss_weight, keys) -> None:
+        super().__init__(loss_weight, keys)
+
+    def _forward(self, imgt_pred, imgt):
+        diff = imgt_pred - imgt
+        loss = ((diff ** 2 + 1e-6) ** 0.5).mean()
+        return loss
+
+
+class AdaCharbonnierLoss(Loss):
+    def __init__(self, loss_weight, keys) -> None:
+        super().__init__(loss_weight, keys)
+
+    def _forward(self, imgt_pred, imgt, weight):
+        alpha = weight / 2
+        epsilon = 10 ** (-(10 * weight - 1) / 3)
+
+        diff = imgt_pred - imgt
+        loss = ((diff ** 2 + epsilon ** 2) ** alpha).mean()
+        return loss
+
+
+class TernaryLoss(Loss):
+    def __init__(self, loss_weight, keys, patch_size=7):
+        super().__init__(loss_weight, keys)
+        self.patch_size = patch_size
+        out_channels = patch_size * patch_size
+        self.w = np.eye(out_channels).reshape((patch_size, patch_size, 1, out_channels))
+        self.w = np.transpose(self.w, (3, 2, 0, 1))
+        self.w = torch.tensor(self.w, dtype=torch.float32)
+
+    def transform(self, tensor):
+        self.w = self.w.to(tensor.device)
+        tensor_ = tensor.mean(dim=1, keepdim=True)
+        patches = F.conv2d(tensor_, self.w, padding=self.patch_size//2, bias=None)
+        loc_diff = patches - tensor_
+        loc_diff_norm = loc_diff / torch.sqrt(0.81 + loc_diff ** 2)
+        return loc_diff_norm
+
+    def valid_mask(self, tensor):
+        padding = self.patch_size//2
+        b, c, h, w = tensor.size()
+        inner = torch.ones(b, 1, h - 2 * padding, w - 2 * padding).type_as(tensor)
+        mask = F.pad(inner, [padding] * 4)
+        return mask
+
+    def _forward(self, imgt_pred, imgt):
+        loc_diff_x = self.transform(imgt_pred)
+        loc_diff_y = self.transform(imgt)
+        diff = loc_diff_x - loc_diff_y.detach()
+        dist = (diff ** 2 / (0.1 + diff ** 2)).mean(dim=1, keepdim=True)
+        mask = self.valid_mask(imgt_pred)
+        loss = (dist * mask).mean()
+        return loss
+
+
+class GeometryLoss(Loss):
+    def __init__(self, loss_weight, keys, patch_size=3):
+        super().__init__(loss_weight, keys)
+        self.patch_size = patch_size
+        out_channels = patch_size * patch_size
+        self.w = np.eye(out_channels).reshape((patch_size, patch_size, 1, out_channels))
+        self.w = np.transpose(self.w, (3, 2, 0, 1))
+        self.w = torch.tensor(self.w).float()
+
+    def transform(self, tensor):
+        b, c, h, w = tensor.size()
+        self.w = self.w.to(tensor.device)
+        tensor_ = tensor.reshape(b*c, 1, h, w)
+        patches = F.conv2d(tensor_, self.w, padding=self.patch_size // 2, bias=None)
+        loc_diff = patches - tensor_
+        loc_diff_ = loc_diff.reshape(b, c*(self.patch_size ** 2), h, w)
+        loc_diff_norm = loc_diff_ / torch.sqrt(0.81 + loc_diff_ ** 2)
+        return loc_diff_norm
+
+    def valid_mask(self, tensor):
+        
padding = self.patch_size // 2 + b, c, h, w = tensor.size() + inner = torch.ones(b, 1, h - 2 * padding, w - 2 * padding).type_as(tensor) + mask = F.pad(inner, [padding] * 4) + return mask + + def _forward(self, ft_pred, ft_gt): + loss = 0. + for pred, gt in zip(ft_pred, ft_gt): + loc_diff_x = self.transform(pred) + loc_diff_y = self.transform(gt) + diff = loc_diff_x - loc_diff_y + dist = (diff ** 2 / (0.1 + diff ** 2)).mean(dim=1, keepdim=True) + mask = self.valid_mask(pred) + loss = loss + (dist * mask).mean() + return loss + + +class IFRFlowLoss(Loss): + def __init__(self, loss_weight, keys, beta=0.3) -> None: + super().__init__(loss_weight, keys) + self.beta = beta + self.ada_cb_loss = AdaCharbonnierLoss(1.0, ['imgt_pred', 'imgt', 'weight']) + + def _forward(self, flow0_pred, flow1_pred, flow): + + robust_weight0 = self.get_robust_weight(flow0_pred[0], flow[:, 0:2]) + robust_weight1 = self.get_robust_weight(flow1_pred[0], flow[:, 2:4]) + loss = 0 + for lvl in range(1, len(flow0_pred)): + scale_factor = 2**lvl + loss = loss + self.ada_cb_loss(**{ + 'imgt_pred': self.resize(flow0_pred[lvl], scale_factor), + 'imgt': flow[:, 0:2], + 'weight': robust_weight0 + }) + loss = loss + self.ada_cb_loss(**{ + 'imgt_pred': self.resize(flow1_pred[lvl], scale_factor), + 'imgt': flow[:, 2:4], + 'weight': robust_weight1 + }) + return loss + + def resize(self, x, scale_factor): + return scale_factor * F.interpolate(x, scale_factor=scale_factor, mode="bilinear", align_corners=False) + + def get_robust_weight(self, flow_pred, flow_gt): + epe = ((flow_pred.detach() - flow_gt) ** 2).sum(dim=1, keepdim=True) ** 0.5 + robust_weight = torch.exp(-self.beta * epe) + return robust_weight + + +class MultipleFlowLoss(Loss): + def __init__(self, loss_weight, keys, beta=0.3) -> None: + super().__init__(loss_weight, keys) + self.beta = beta + self.ada_cb_loss = AdaCharbonnierLoss(1.0, ['imgt_pred', 'imgt', 'weight']) + + def _forward(self, flow0_pred, flow1_pred, flow): + + robust_weight0 = self.get_mutli_flow_robust_weight(flow0_pred[0], flow[:, 0:2]) + robust_weight1 = self.get_mutli_flow_robust_weight(flow1_pred[0], flow[:, 2:4]) + loss = 0 + for lvl in range(1, len(flow0_pred)): + scale_factor = 2**lvl + loss = loss + self.ada_cb_loss(**{ + 'imgt_pred': self.resize(flow0_pred[lvl], scale_factor), + 'imgt': flow[:, 0:2], + 'weight': robust_weight0 + }) + loss = loss + self.ada_cb_loss(**{ + 'imgt_pred': self.resize(flow1_pred[lvl], scale_factor), + 'imgt': flow[:, 2:4], + 'weight': robust_weight1 + }) + return loss + + def resize(self, x, scale_factor): + return scale_factor * F.interpolate(x, scale_factor=scale_factor, mode="bilinear", align_corners=False) + + def get_mutli_flow_robust_weight(self, flow_pred, flow_gt): + b, num_flows, c, h, w = flow_pred.shape + flow_pred = flow_pred.view(b, num_flows, c, h, w) + flow_gt = flow_gt.repeat(1, num_flows, 1, 1).view(b, num_flows, c, h, w) + epe = ((flow_pred.detach() - flow_gt) ** 2).sum(dim=2, keepdim=True).max(1)[0] ** 0.5 + robust_weight = torch.exp(-self.beta * epe) + return robust_weight \ No newline at end of file diff --git a/vbench/third_party/amt/metrics/__init__.py b/vbench/third_party/amt/metrics/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/metrics/psnr_ssim.py b/vbench/third_party/amt/metrics/psnr_ssim.py new file mode 100755 index 00000000..cb934772 --- /dev/null +++ b/vbench/third_party/amt/metrics/psnr_ssim.py @@ -0,0 +1,140 @@ +import torch +import torch.nn.functional as F +from math import exp + +device = 
torch.device('cuda' if torch.cuda.is_available() else 'cpu') + +def gaussian(window_size, sigma): + gauss = torch.Tensor([exp(-(x - window_size//2)**2/float(2*sigma**2)) for x in range(window_size)]) + return gauss/gauss.sum() + + +def create_window(window_size, channel=1): + _1D_window = gaussian(window_size, 1.5).unsqueeze(1) + _2D_window = _1D_window.mm(_1D_window.t()).float().unsqueeze(0).unsqueeze(0).to(device) + window = _2D_window.expand(channel, 1, window_size, window_size).contiguous() + return window + + +def create_window_3d(window_size, channel=1): + _1D_window = gaussian(window_size, 1.5).unsqueeze(1) + _2D_window = _1D_window.mm(_1D_window.t()) + _3D_window = _2D_window.unsqueeze(2) @ (_1D_window.t()) + window = _3D_window.expand(1, channel, window_size, window_size, window_size).contiguous().to(device) + return window + + +def ssim(img1, img2, window_size=11, window=None, size_average=True, full=False, val_range=None): + if val_range is None: + if torch.max(img1) > 128: + max_val = 255 + else: + max_val = 1 + + if torch.min(img1) < -0.5: + min_val = -1 + else: + min_val = 0 + L = max_val - min_val + else: + L = val_range + + padd = 0 + (_, channel, height, width) = img1.size() + if window is None: + real_size = min(window_size, height, width) + window = create_window(real_size, channel=channel).to(img1.device) + + mu1 = F.conv2d(F.pad(img1, (5, 5, 5, 5), mode='replicate'), window, padding=padd, groups=channel) + mu2 = F.conv2d(F.pad(img2, (5, 5, 5, 5), mode='replicate'), window, padding=padd, groups=channel) + + mu1_sq = mu1.pow(2) + mu2_sq = mu2.pow(2) + mu1_mu2 = mu1 * mu2 + + sigma1_sq = F.conv2d(F.pad(img1 * img1, (5, 5, 5, 5), 'replicate'), window, padding=padd, groups=channel) - mu1_sq + sigma2_sq = F.conv2d(F.pad(img2 * img2, (5, 5, 5, 5), 'replicate'), window, padding=padd, groups=channel) - mu2_sq + sigma12 = F.conv2d(F.pad(img1 * img2, (5, 5, 5, 5), 'replicate'), window, padding=padd, groups=channel) - mu1_mu2 + + C1 = (0.01 * L) ** 2 + C2 = (0.03 * L) ** 2 + + v1 = 2.0 * sigma12 + C2 + v2 = sigma1_sq + sigma2_sq + C2 + cs = torch.mean(v1 / v2) + + ssim_map = ((2 * mu1_mu2 + C1) * v1) / ((mu1_sq + mu2_sq + C1) * v2) + + if size_average: + ret = ssim_map.mean() + else: + ret = ssim_map.mean(1).mean(1).mean(1) + + if full: + return ret, cs + return ret + + +def calculate_ssim(img1, img2, window_size=11, window=None, size_average=True, full=False, val_range=None): + if val_range is None: + if torch.max(img1) > 128: + max_val = 255 + else: + max_val = 1 + + if torch.min(img1) < -0.5: + min_val = -1 + else: + min_val = 0 + L = max_val - min_val + else: + L = val_range + + padd = 0 + (_, _, height, width) = img1.size() + if window is None: + real_size = min(window_size, height, width) + window = create_window_3d(real_size, channel=1).to(img1.device) + + img1 = img1.unsqueeze(1) + img2 = img2.unsqueeze(1) + + mu1 = F.conv3d(F.pad(img1, (5, 5, 5, 5, 5, 5), mode='replicate'), window, padding=padd, groups=1) + mu2 = F.conv3d(F.pad(img2, (5, 5, 5, 5, 5, 5), mode='replicate'), window, padding=padd, groups=1) + + mu1_sq = mu1.pow(2) + mu2_sq = mu2.pow(2) + mu1_mu2 = mu1 * mu2 + + sigma1_sq = F.conv3d(F.pad(img1 * img1, (5, 5, 5, 5, 5, 5), 'replicate'), window, padding=padd, groups=1) - mu1_sq + sigma2_sq = F.conv3d(F.pad(img2 * img2, (5, 5, 5, 5, 5, 5), 'replicate'), window, padding=padd, groups=1) - mu2_sq + sigma12 = F.conv3d(F.pad(img1 * img2, (5, 5, 5, 5, 5, 5), 'replicate'), window, padding=padd, groups=1) - mu1_mu2 + + C1 = (0.01 * L) ** 2 + C2 = (0.03 * L) ** 2 + + v1 = 
2.0 * sigma12 + C2 + v2 = sigma1_sq + sigma2_sq + C2 + cs = torch.mean(v1 / v2) + + ssim_map = ((2 * mu1_mu2 + C1) * v1) / ((mu1_sq + mu2_sq + C1) * v2) + + if size_average: + ret = ssim_map.mean() + else: + ret = ssim_map.mean(1).mean(1).mean(1) + + if full: + return ret, cs + return ret.detach().cpu().numpy() + + + +def calculate_psnr(img1, img2): + psnr = -10 * torch.log10(((img1 - img2) * (img1 - img2)).mean()) + return psnr.detach().cpu().numpy() + + +def calculate_ie(img1, img2): + ie = torch.abs(torch.round(img1 * 255.0) - torch.round(img2 * 255.0)).mean() + return ie.detach().cpu().numpy() diff --git a/vbench/third_party/amt/networks/AMT-G.py b/vbench/third_party/amt/networks/AMT-G.py new file mode 100755 index 00000000..a24cb1a3 --- /dev/null +++ b/vbench/third_party/amt/networks/AMT-G.py @@ -0,0 +1,172 @@ +import torch +import torch.nn as nn +import torch.nn.functional as F +from networks.blocks.raft import ( + coords_grid, + BasicUpdateBlock, BidirCorrBlock +) +from networks.blocks.feat_enc import ( + LargeEncoder +) +from networks.blocks.ifrnet import ( + resize, + Encoder, + InitDecoder, + IntermediateDecoder +) +from networks.blocks.multi_flow import ( + multi_flow_combine, + MultiFlowDecoder +) + + +class Model(nn.Module): + def __init__(self, + corr_radius=3, + corr_lvls=4, + num_flows=5, + channels=[84, 96, 112, 128], + skip_channels=84): + super(Model, self).__init__() + self.radius = corr_radius + self.corr_levels = corr_lvls + self.num_flows = num_flows + + self.feat_encoder = LargeEncoder(output_dim=128, norm_fn='instance', dropout=0.) + self.encoder = Encoder(channels, large=True) + self.decoder4 = InitDecoder(channels[3], channels[2], skip_channels) + self.decoder3 = IntermediateDecoder(channels[2], channels[1], skip_channels) + self.decoder2 = IntermediateDecoder(channels[1], channels[0], skip_channels) + self.decoder1 = MultiFlowDecoder(channels[0], skip_channels, num_flows) + + self.update4 = self._get_updateblock(112, None) + self.update3_low = self._get_updateblock(96, 2.0) + self.update2_low = self._get_updateblock(84, 4.0) + + self.update3_high = self._get_updateblock(96, None) + self.update2_high = self._get_updateblock(84, None) + + self.comb_block = nn.Sequential( + nn.Conv2d(3*self.num_flows, 6*self.num_flows, 7, 1, 3), + nn.PReLU(6*self.num_flows), + nn.Conv2d(6*self.num_flows, 3, 7, 1, 3), + ) + + def _get_updateblock(self, cdim, scale_factor=None): + return BasicUpdateBlock(cdim=cdim, hidden_dim=192, flow_dim=64, + corr_dim=256, corr_dim2=192, fc_dim=188, + scale_factor=scale_factor, corr_levels=self.corr_levels, + radius=self.radius) + + def _corr_scale_lookup(self, corr_fn, coord, flow0, flow1, embt, downsample=1): + # convert t -> 0 to 0 -> 1 | convert t -> 1 to 1 -> 0 + # based on linear assumption + t1_scale = 1. / embt + t0_scale = 1. / (1. 
- embt) + if downsample != 1: + inv = 1 / downsample + flow0 = inv * resize(flow0, scale_factor=inv) + flow1 = inv * resize(flow1, scale_factor=inv) + + corr0, corr1 = corr_fn(coord + flow1 * t1_scale, coord + flow0 * t0_scale) + corr = torch.cat([corr0, corr1], dim=1) + flow = torch.cat([flow0, flow1], dim=1) + return corr, flow + + def forward(self, img0, img1, embt, scale_factor=1.0, eval=False, **kwargs): + mean_ = torch.cat([img0, img1], 2).mean(1, keepdim=True).mean(2, keepdim=True).mean(3, keepdim=True) + img0 = img0 - mean_ + img1 = img1 - mean_ + img0_ = resize(img0, scale_factor) if scale_factor != 1.0 else img0 + img1_ = resize(img1, scale_factor) if scale_factor != 1.0 else img1 + b, _, h, w = img0_.shape + coord = coords_grid(b, h // 8, w // 8, img0.device) + + fmap0, fmap1 = self.feat_encoder([img0_, img1_]) # [1, 128, H//8, W//8] + corr_fn = BidirCorrBlock(fmap0, fmap1, radius=self.radius, num_levels=self.corr_levels) + + # f0_1: [1, c0, H//2, W//2] | f0_2: [1, c1, H//4, W//4] + # f0_3: [1, c2, H//8, W//8] | f0_4: [1, c3, H//16, W//16] + f0_1, f0_2, f0_3, f0_4 = self.encoder(img0_) + f1_1, f1_2, f1_3, f1_4 = self.encoder(img1_) + + ######################################### the 4th decoder ######################################### + up_flow0_4, up_flow1_4, ft_3_ = self.decoder4(f0_4, f1_4, embt) + corr_4, flow_4 = self._corr_scale_lookup(corr_fn, coord, + up_flow0_4, up_flow1_4, + embt, downsample=1) + + # residue update with lookup corr + delta_ft_3_, delta_flow_4 = self.update4(ft_3_, flow_4, corr_4) + delta_flow0_4, delta_flow1_4 = torch.chunk(delta_flow_4, 2, 1) + up_flow0_4 = up_flow0_4 + delta_flow0_4 + up_flow1_4 = up_flow1_4 + delta_flow1_4 + ft_3_ = ft_3_ + delta_ft_3_ + + ######################################### the 3rd decoder ######################################### + up_flow0_3, up_flow1_3, ft_2_ = self.decoder3(ft_3_, f0_3, f1_3, up_flow0_4, up_flow1_4) + corr_3, flow_3 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_3, up_flow1_3, + embt, downsample=2) + + # residue update with lookup corr + delta_ft_2_, delta_flow_3 = self.update3_low(ft_2_, flow_3, corr_3) + delta_flow0_3, delta_flow1_3 = torch.chunk(delta_flow_3, 2, 1) + up_flow0_3 = up_flow0_3 + delta_flow0_3 + up_flow1_3 = up_flow1_3 + delta_flow1_3 + ft_2_ = ft_2_ + delta_ft_2_ + + # residue update with lookup corr (hr) + corr_3 = resize(corr_3, scale_factor=2.0) + up_flow_3 = torch.cat([up_flow0_3, up_flow1_3], dim=1) + delta_ft_2_, delta_up_flow_3 = self.update3_high(ft_2_, up_flow_3, corr_3) + ft_2_ += delta_ft_2_ + up_flow0_3 += delta_up_flow_3[:, 0:2] + up_flow1_3 += delta_up_flow_3[:, 2:4] + + ######################################### the 2nd decoder ######################################### + up_flow0_2, up_flow1_2, ft_1_ = self.decoder2(ft_2_, f0_2, f1_2, up_flow0_3, up_flow1_3) + corr_2, flow_2 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_2, up_flow1_2, + embt, downsample=4) + + # residue update with lookup corr + delta_ft_1_, delta_flow_2 = self.update2_low(ft_1_, flow_2, corr_2) + delta_flow0_2, delta_flow1_2 = torch.chunk(delta_flow_2, 2, 1) + up_flow0_2 = up_flow0_2 + delta_flow0_2 + up_flow1_2 = up_flow1_2 + delta_flow1_2 + ft_1_ = ft_1_ + delta_ft_1_ + + # residue update with lookup corr (hr) + corr_2 = resize(corr_2, scale_factor=4.0) + up_flow_2 = torch.cat([up_flow0_2, up_flow1_2], dim=1) + delta_ft_1_, delta_up_flow_2 = self.update2_high(ft_1_, up_flow_2, corr_2) + ft_1_ += delta_ft_1_ + up_flow0_2 += delta_up_flow_2[:, 0:2] + up_flow1_2 += delta_up_flow_2[:, 2:4] + + 
######################################### the 1st decoder ######################################### + up_flow0_1, up_flow1_1, mask, img_res = self.decoder1(ft_1_, f0_1, f1_1, up_flow0_2, up_flow1_2) + + if scale_factor != 1.0: + up_flow0_1 = resize(up_flow0_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + up_flow1_1 = resize(up_flow1_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + mask = resize(mask, scale_factor=(1.0/scale_factor)) + img_res = resize(img_res, scale_factor=(1.0/scale_factor)) + + # Merge multiple predictions + imgt_pred = multi_flow_combine(self.comb_block, img0, img1, up_flow0_1, up_flow1_1, + mask, img_res, mean_) + imgt_pred = torch.clamp(imgt_pred, 0, 1) + + if eval: + return { 'imgt_pred': imgt_pred, } + else: + up_flow0_1 = up_flow0_1.reshape(b, self.num_flows, 2, h, w) + up_flow1_1 = up_flow1_1.reshape(b, self.num_flows, 2, h, w) + return { + 'imgt_pred': imgt_pred, + 'flow0_pred': [up_flow0_1, up_flow0_2, up_flow0_3, up_flow0_4], + 'flow1_pred': [up_flow1_1, up_flow1_2, up_flow1_3, up_flow1_4], + 'ft_pred': [ft_1_, ft_2_, ft_3_], + } diff --git a/vbench/third_party/amt/networks/AMT-L.py b/vbench/third_party/amt/networks/AMT-L.py new file mode 100755 index 00000000..f992688b --- /dev/null +++ b/vbench/third_party/amt/networks/AMT-L.py @@ -0,0 +1,155 @@ +import torch +import torch.nn as nn +from networks.blocks.raft import ( + coords_grid, + BasicUpdateBlock, BidirCorrBlock +) +from networks.blocks.feat_enc import ( + BasicEncoder +) +from networks.blocks.ifrnet import ( + resize, + Encoder, + InitDecoder, + IntermediateDecoder +) +from networks.blocks.multi_flow import ( + multi_flow_combine, + MultiFlowDecoder +) + + +class Model(nn.Module): + def __init__(self, + corr_radius=3, + corr_lvls=4, + num_flows=5, + channels=[48, 64, 72, 128], + skip_channels=48 + ): + super(Model, self).__init__() + self.radius = corr_radius + self.corr_levels = corr_lvls + self.num_flows = num_flows + + self.feat_encoder = BasicEncoder(output_dim=128, norm_fn='instance', dropout=0.) + self.encoder = Encoder([48, 64, 72, 128], large=True) + + self.decoder4 = InitDecoder(channels[3], channels[2], skip_channels) + self.decoder3 = IntermediateDecoder(channels[2], channels[1], skip_channels) + self.decoder2 = IntermediateDecoder(channels[1], channels[0], skip_channels) + self.decoder1 = MultiFlowDecoder(channels[0], skip_channels, num_flows) + + self.update4 = self._get_updateblock(72, None) + self.update3 = self._get_updateblock(64, 2.0) + self.update2 = self._get_updateblock(48, 4.0) + + self.comb_block = nn.Sequential( + nn.Conv2d(3*self.num_flows, 6*self.num_flows, 7, 1, 3), + nn.PReLU(6*self.num_flows), + nn.Conv2d(6*self.num_flows, 3, 7, 1, 3), + ) + + def _get_updateblock(self, cdim, scale_factor=None): + return BasicUpdateBlock(cdim=cdim, hidden_dim=128, flow_dim=48, + corr_dim=256, corr_dim2=160, fc_dim=124, + scale_factor=scale_factor, corr_levels=self.corr_levels, + radius=self.radius) + + def _corr_scale_lookup(self, corr_fn, coord, flow0, flow1, embt, downsample=1): + # convert t -> 0 to 0 -> 1 | convert t -> 1 to 1 -> 0 + # based on linear assumption + t1_scale = 1. / embt + t0_scale = 1. / (1. 
- embt) + if downsample != 1: + inv = 1 / downsample + flow0 = inv * resize(flow0, scale_factor=inv) + flow1 = inv * resize(flow1, scale_factor=inv) + + corr0, corr1 = corr_fn(coord + flow1 * t1_scale, coord + flow0 * t0_scale) + corr = torch.cat([corr0, corr1], dim=1) + flow = torch.cat([flow0, flow1], dim=1) + return corr, flow + + def forward(self, img0, img1, embt, scale_factor=1.0, eval=False, **kwargs): + mean_ = torch.cat([img0, img1], 2).mean(1, keepdim=True).mean(2, keepdim=True).mean(3, keepdim=True) + img0 = img0 - mean_ + img1 = img1 - mean_ + img0_ = resize(img0, scale_factor) if scale_factor != 1.0 else img0 + img1_ = resize(img1, scale_factor) if scale_factor != 1.0 else img1 + b, _, h, w = img0_.shape + coord = coords_grid(b, h // 8, w // 8, img0.device) + + fmap0, fmap1 = self.feat_encoder([img0_, img1_]) # [1, 128, H//8, W//8] + corr_fn = BidirCorrBlock(fmap0, fmap1, radius=self.radius, num_levels=self.corr_levels) + + # f0_1: [1, c0, H//2, W//2] | f0_2: [1, c1, H//4, W//4] + # f0_3: [1, c2, H//8, W//8] | f0_4: [1, c3, H//16, W//16] + f0_1, f0_2, f0_3, f0_4 = self.encoder(img0_) + f1_1, f1_2, f1_3, f1_4 = self.encoder(img1_) + + ######################################### the 4th decoder ######################################### + up_flow0_4, up_flow1_4, ft_3_ = self.decoder4(f0_4, f1_4, embt) + corr_4, flow_4 = self._corr_scale_lookup(corr_fn, coord, + up_flow0_4, up_flow1_4, + embt, downsample=1) + + # residue update with lookup corr + delta_ft_3_, delta_flow_4 = self.update4(ft_3_, flow_4, corr_4) + delta_flow0_4, delta_flow1_4 = torch.chunk(delta_flow_4, 2, 1) + up_flow0_4 = up_flow0_4 + delta_flow0_4 + up_flow1_4 = up_flow1_4 + delta_flow1_4 + ft_3_ = ft_3_ + delta_ft_3_ + + ######################################### the 3rd decoder ######################################### + up_flow0_3, up_flow1_3, ft_2_ = self.decoder3(ft_3_, f0_3, f1_3, up_flow0_4, up_flow1_4) + corr_3, flow_3 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_3, up_flow1_3, + embt, downsample=2) + + # residue update with lookup corr + delta_ft_2_, delta_flow_3 = self.update3(ft_2_, flow_3, corr_3) + delta_flow0_3, delta_flow1_3 = torch.chunk(delta_flow_3, 2, 1) + up_flow0_3 = up_flow0_3 + delta_flow0_3 + up_flow1_3 = up_flow1_3 + delta_flow1_3 + ft_2_ = ft_2_ + delta_ft_2_ + + ######################################### the 2nd decoder ######################################### + up_flow0_2, up_flow1_2, ft_1_ = self.decoder2(ft_2_, f0_2, f1_2, up_flow0_3, up_flow1_3) + corr_2, flow_2 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_2, up_flow1_2, + embt, downsample=4) + + # residue update with lookup corr + delta_ft_1_, delta_flow_2 = self.update2(ft_1_, flow_2, corr_2) + delta_flow0_2, delta_flow1_2 = torch.chunk(delta_flow_2, 2, 1) + up_flow0_2 = up_flow0_2 + delta_flow0_2 + up_flow1_2 = up_flow1_2 + delta_flow1_2 + ft_1_ = ft_1_ + delta_ft_1_ + + ######################################### the 1st decoder ######################################### + up_flow0_1, up_flow1_1, mask, img_res = self.decoder1(ft_1_, f0_1, f1_1, up_flow0_2, up_flow1_2) + + if scale_factor != 1.0: + up_flow0_1 = resize(up_flow0_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + up_flow1_1 = resize(up_flow1_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + mask = resize(mask, scale_factor=(1.0/scale_factor)) + img_res = resize(img_res, scale_factor=(1.0/scale_factor)) + + # Merge multiple predictions + imgt_pred = multi_flow_combine(self.comb_block, img0, img1, up_flow0_1, up_flow1_1, + mask, img_res, 
mean_) + imgt_pred = torch.clamp(imgt_pred, 0, 1) + + if eval: + return { 'imgt_pred': imgt_pred, } + else: + up_flow0_1 = up_flow0_1.reshape(b, self.num_flows, 2, h, w) + up_flow1_1 = up_flow1_1.reshape(b, self.num_flows, 2, h, w) + return { + 'imgt_pred': imgt_pred, + 'flow0_pred': [up_flow0_1, up_flow0_2, up_flow0_3, up_flow0_4], + 'flow1_pred': [up_flow1_1, up_flow1_2, up_flow1_3, up_flow1_4], + 'ft_pred': [ft_1_, ft_2_, ft_3_], + } + \ No newline at end of file diff --git a/vbench/third_party/amt/networks/AMT-S.py b/vbench/third_party/amt/networks/AMT-S.py new file mode 100755 index 00000000..a7155bb6 --- /dev/null +++ b/vbench/third_party/amt/networks/AMT-S.py @@ -0,0 +1,154 @@ +import torch +import torch.nn as nn +from networks.blocks.raft import ( + coords_grid, + SmallUpdateBlock, BidirCorrBlock +) +from networks.blocks.feat_enc import ( + SmallEncoder +) +from networks.blocks.ifrnet import ( + resize, + Encoder, + InitDecoder, + IntermediateDecoder +) +from networks.blocks.multi_flow import ( + multi_flow_combine, + MultiFlowDecoder +) + + +class Model(nn.Module): + def __init__(self, + corr_radius=3, + corr_lvls=4, + num_flows=3, + channels=[20, 32, 44, 56], + skip_channels=20): + super(Model, self).__init__() + self.radius = corr_radius + self.corr_levels = corr_lvls + self.num_flows = num_flows + self.channels = channels + self.skip_channels = skip_channels + + self.feat_encoder = SmallEncoder(output_dim=84, norm_fn='instance', dropout=0.) + self.encoder = Encoder(channels) + + self.decoder4 = InitDecoder(channels[3], channels[2], skip_channels) + self.decoder3 = IntermediateDecoder(channels[2], channels[1], skip_channels) + self.decoder2 = IntermediateDecoder(channels[1], channels[0], skip_channels) + self.decoder1 = MultiFlowDecoder(channels[0], skip_channels, num_flows) + + self.update4 = self._get_updateblock(44) + self.update3 = self._get_updateblock(32, 2) + self.update2 = self._get_updateblock(20, 4) + + self.comb_block = nn.Sequential( + nn.Conv2d(3*num_flows, 6*num_flows, 3, 1, 1), + nn.PReLU(6*num_flows), + nn.Conv2d(6*num_flows, 3, 3, 1, 1), + ) + + def _get_updateblock(self, cdim, scale_factor=None): + return SmallUpdateBlock(cdim=cdim, hidden_dim=76, flow_dim=20, corr_dim=64, + fc_dim=68, scale_factor=scale_factor, + corr_levels=self.corr_levels, radius=self.radius) + + def _corr_scale_lookup(self, corr_fn, coord, flow0, flow1, embt, downsample=1): + # convert t -> 0 to 0 -> 1 | convert t -> 1 to 1 -> 0 + # based on linear assumption + t1_scale = 1. / embt + t0_scale = 1. / (1. 
- embt) + if downsample != 1: + inv = 1 / downsample + flow0 = inv * resize(flow0, scale_factor=inv) + flow1 = inv * resize(flow1, scale_factor=inv) + + corr0, corr1 = corr_fn(coord + flow1 * t1_scale, coord + flow0 * t0_scale) + corr = torch.cat([corr0, corr1], dim=1) + flow = torch.cat([flow0, flow1], dim=1) + return corr, flow + + def forward(self, img0, img1, embt, scale_factor=1.0, eval=False, **kwargs): + mean_ = torch.cat([img0, img1], 2).mean(1, keepdim=True).mean(2, keepdim=True).mean(3, keepdim=True) + img0 = img0 - mean_ + img1 = img1 - mean_ + img0_ = resize(img0, scale_factor) if scale_factor != 1.0 else img0 + img1_ = resize(img1, scale_factor) if scale_factor != 1.0 else img1 + b, _, h, w = img0_.shape + coord = coords_grid(b, h // 8, w // 8, img0.device) + + fmap0, fmap1 = self.feat_encoder([img0_, img1_]) # [1, 128, H//8, W//8] + corr_fn = BidirCorrBlock(fmap0, fmap1, radius=self.radius, num_levels=self.corr_levels) + + # f0_1: [1, c0, H//2, W//2] | f0_2: [1, c1, H//4, W//4] + # f0_3: [1, c2, H//8, W//8] | f0_4: [1, c3, H//16, W//16] + f0_1, f0_2, f0_3, f0_4 = self.encoder(img0_) + f1_1, f1_2, f1_3, f1_4 = self.encoder(img1_) + + ######################################### the 4th decoder ######################################### + up_flow0_4, up_flow1_4, ft_3_ = self.decoder4(f0_4, f1_4, embt) + corr_4, flow_4 = self._corr_scale_lookup(corr_fn, coord, + up_flow0_4, up_flow1_4, + embt, downsample=1) + + # residue update with lookup corr + delta_ft_3_, delta_flow_4 = self.update4(ft_3_, flow_4, corr_4) + delta_flow0_4, delta_flow1_4 = torch.chunk(delta_flow_4, 2, 1) + up_flow0_4 = up_flow0_4 + delta_flow0_4 + up_flow1_4 = up_flow1_4 + delta_flow1_4 + ft_3_ = ft_3_ + delta_ft_3_ + + ######################################### the 3rd decoder ######################################### + up_flow0_3, up_flow1_3, ft_2_ = self.decoder3(ft_3_, f0_3, f1_3, up_flow0_4, up_flow1_4) + corr_3, flow_3 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_3, up_flow1_3, + embt, downsample=2) + + # residue update with lookup corr + delta_ft_2_, delta_flow_3 = self.update3(ft_2_, flow_3, corr_3) + delta_flow0_3, delta_flow1_3 = torch.chunk(delta_flow_3, 2, 1) + up_flow0_3 = up_flow0_3 + delta_flow0_3 + up_flow1_3 = up_flow1_3 + delta_flow1_3 + ft_2_ = ft_2_ + delta_ft_2_ + + ######################################### the 2nd decoder ######################################### + up_flow0_2, up_flow1_2, ft_1_ = self.decoder2(ft_2_, f0_2, f1_2, up_flow0_3, up_flow1_3) + corr_2, flow_2 = self._corr_scale_lookup(corr_fn, + coord, up_flow0_2, up_flow1_2, + embt, downsample=4) + + # residue update with lookup corr + delta_ft_1_, delta_flow_2 = self.update2(ft_1_, flow_2, corr_2) + delta_flow0_2, delta_flow1_2 = torch.chunk(delta_flow_2, 2, 1) + up_flow0_2 = up_flow0_2 + delta_flow0_2 + up_flow1_2 = up_flow1_2 + delta_flow1_2 + ft_1_ = ft_1_ + delta_ft_1_ + + ######################################### the 1st decoder ######################################### + up_flow0_1, up_flow1_1, mask, img_res = self.decoder1(ft_1_, f0_1, f1_1, up_flow0_2, up_flow1_2) + + if scale_factor != 1.0: + up_flow0_1 = resize(up_flow0_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + up_flow1_1 = resize(up_flow1_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + mask = resize(mask, scale_factor=(1.0/scale_factor)) + img_res = resize(img_res, scale_factor=(1.0/scale_factor)) + + # Merge multiple predictions + imgt_pred = multi_flow_combine(self.comb_block, img0, img1, up_flow0_1, up_flow1_1, + mask, img_res, 
mean_) + imgt_pred = torch.clamp(imgt_pred, 0, 1) + + if eval: + return { 'imgt_pred': imgt_pred, } + else: + up_flow0_1 = up_flow0_1.reshape(b, self.num_flows, 2, h, w) + up_flow1_1 = up_flow1_1.reshape(b, self.num_flows, 2, h, w) + return { + 'imgt_pred': imgt_pred, + 'flow0_pred': [up_flow0_1, up_flow0_2, up_flow0_3, up_flow0_4], + 'flow1_pred': [up_flow1_1, up_flow1_2, up_flow1_3, up_flow1_4], + 'ft_pred': [ft_1_, ft_2_, ft_3_], + } diff --git a/vbench/third_party/amt/networks/IFRNet.py b/vbench/third_party/amt/networks/IFRNet.py new file mode 100755 index 00000000..a5eb61cb --- /dev/null +++ b/vbench/third_party/amt/networks/IFRNet.py @@ -0,0 +1,169 @@ +import torch +import torch.nn as nn +from utils.flow_utils import warp +from networks.blocks.ifrnet import ( + convrelu, resize, + ResBlock, +) + + +class Encoder(nn.Module): + def __init__(self): + super(Encoder, self).__init__() + self.pyramid1 = nn.Sequential( + convrelu(3, 32, 3, 2, 1), + convrelu(32, 32, 3, 1, 1) + ) + self.pyramid2 = nn.Sequential( + convrelu(32, 48, 3, 2, 1), + convrelu(48, 48, 3, 1, 1) + ) + self.pyramid3 = nn.Sequential( + convrelu(48, 72, 3, 2, 1), + convrelu(72, 72, 3, 1, 1) + ) + self.pyramid4 = nn.Sequential( + convrelu(72, 96, 3, 2, 1), + convrelu(96, 96, 3, 1, 1) + ) + + def forward(self, img): + f1 = self.pyramid1(img) + f2 = self.pyramid2(f1) + f3 = self.pyramid3(f2) + f4 = self.pyramid4(f3) + return f1, f2, f3, f4 + + +class Decoder4(nn.Module): + def __init__(self): + super(Decoder4, self).__init__() + self.convblock = nn.Sequential( + convrelu(192+1, 192), + ResBlock(192, 32), + nn.ConvTranspose2d(192, 76, 4, 2, 1, bias=True) + ) + + def forward(self, f0, f1, embt): + b, c, h, w = f0.shape + embt = embt.repeat(1, 1, h, w) + f_in = torch.cat([f0, f1, embt], 1) + f_out = self.convblock(f_in) + return f_out + + +class Decoder3(nn.Module): + def __init__(self): + super(Decoder3, self).__init__() + self.convblock = nn.Sequential( + convrelu(220, 216), + ResBlock(216, 32), + nn.ConvTranspose2d(216, 52, 4, 2, 1, bias=True) + ) + + def forward(self, ft_, f0, f1, up_flow0, up_flow1): + f0_warp = warp(f0, up_flow0) + f1_warp = warp(f1, up_flow1) + f_in = torch.cat([ft_, f0_warp, f1_warp, up_flow0, up_flow1], 1) + f_out = self.convblock(f_in) + return f_out + + +class Decoder2(nn.Module): + def __init__(self): + super(Decoder2, self).__init__() + self.convblock = nn.Sequential( + convrelu(148, 144), + ResBlock(144, 32), + nn.ConvTranspose2d(144, 36, 4, 2, 1, bias=True) + ) + + def forward(self, ft_, f0, f1, up_flow0, up_flow1): + f0_warp = warp(f0, up_flow0) + f1_warp = warp(f1, up_flow1) + f_in = torch.cat([ft_, f0_warp, f1_warp, up_flow0, up_flow1], 1) + f_out = self.convblock(f_in) + return f_out + + +class Decoder1(nn.Module): + def __init__(self): + super(Decoder1, self).__init__() + self.convblock = nn.Sequential( + convrelu(100, 96), + ResBlock(96, 32), + nn.ConvTranspose2d(96, 8, 4, 2, 1, bias=True) + ) + + def forward(self, ft_, f0, f1, up_flow0, up_flow1): + f0_warp = warp(f0, up_flow0) + f1_warp = warp(f1, up_flow1) + f_in = torch.cat([ft_, f0_warp, f1_warp, up_flow0, up_flow1], 1) + f_out = self.convblock(f_in) + return f_out + + +class Model(nn.Module): + def __init__(self): + super(Model, self).__init__() + self.encoder = Encoder() + self.decoder4 = Decoder4() + self.decoder3 = Decoder3() + self.decoder2 = Decoder2() + self.decoder1 = Decoder1() + + def forward(self, img0, img1, embt, scale_factor=1.0, eval=False, **kwargs): + mean_ = torch.cat([img0, img1], 2).mean(1, keepdim=True).mean(2, 
keepdim=True).mean(3, keepdim=True) + img0 = img0 - mean_ + img1 = img1 - mean_ + + img0_ = resize(img0, scale_factor) if scale_factor != 1.0 else img0 + img1_ = resize(img1, scale_factor) if scale_factor != 1.0 else img1 + + f0_1, f0_2, f0_3, f0_4 = self.encoder(img0_) + f1_1, f1_2, f1_3, f1_4 = self.encoder(img1_) + + out4 = self.decoder4(f0_4, f1_4, embt) + up_flow0_4 = out4[:, 0:2] + up_flow1_4 = out4[:, 2:4] + ft_3_ = out4[:, 4:] + + out3 = self.decoder3(ft_3_, f0_3, f1_3, up_flow0_4, up_flow1_4) + up_flow0_3 = out3[:, 0:2] + 2.0 * resize(up_flow0_4, scale_factor=2.0) + up_flow1_3 = out3[:, 2:4] + 2.0 * resize(up_flow1_4, scale_factor=2.0) + ft_2_ = out3[:, 4:] + + out2 = self.decoder2(ft_2_, f0_2, f1_2, up_flow0_3, up_flow1_3) + up_flow0_2 = out2[:, 0:2] + 2.0 * resize(up_flow0_3, scale_factor=2.0) + up_flow1_2 = out2[:, 2:4] + 2.0 * resize(up_flow1_3, scale_factor=2.0) + ft_1_ = out2[:, 4:] + + out1 = self.decoder1(ft_1_, f0_1, f1_1, up_flow0_2, up_flow1_2) + up_flow0_1 = out1[:, 0:2] + 2.0 * resize(up_flow0_2, scale_factor=2.0) + up_flow1_1 = out1[:, 2:4] + 2.0 * resize(up_flow1_2, scale_factor=2.0) + up_mask_1 = torch.sigmoid(out1[:, 4:5]) + up_res_1 = out1[:, 5:] + + if scale_factor != 1.0: + up_flow0_1 = resize(up_flow0_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + up_flow1_1 = resize(up_flow1_1, scale_factor=(1.0/scale_factor)) * (1.0/scale_factor) + up_mask_1 = resize(up_mask_1, scale_factor=(1.0/scale_factor)) + up_res_1 = resize(up_res_1, scale_factor=(1.0/scale_factor)) + + img0_warp = warp(img0, up_flow0_1) + img1_warp = warp(img1, up_flow1_1) + imgt_merge = up_mask_1 * img0_warp + (1 - up_mask_1) * img1_warp + mean_ + imgt_pred = imgt_merge + up_res_1 + imgt_pred = torch.clamp(imgt_pred, 0, 1) + + if eval: + return { 'imgt_pred': imgt_pred, } + else: + return { + 'imgt_pred': imgt_pred, + 'flow0_pred': [up_flow0_1, up_flow0_2, up_flow0_3, up_flow0_4], + 'flow1_pred': [up_flow1_1, up_flow1_2, up_flow1_3, up_flow1_4], + 'ft_pred': [ft_1_, ft_2_, ft_3_], + 'img0_warp': img0_warp, + 'img1_warp': img1_warp + } diff --git a/vbench/third_party/amt/networks/__init__.py b/vbench/third_party/amt/networks/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/networks/blocks/__init__.py b/vbench/third_party/amt/networks/blocks/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/networks/blocks/feat_enc.py b/vbench/third_party/amt/networks/blocks/feat_enc.py new file mode 100755 index 00000000..3805bd31 --- /dev/null +++ b/vbench/third_party/amt/networks/blocks/feat_enc.py @@ -0,0 +1,343 @@ +import torch +import torch.nn as nn + + +class BottleneckBlock(nn.Module): + def __init__(self, in_planes, planes, norm_fn='group', stride=1): + super(BottleneckBlock, self).__init__() + + self.conv1 = nn.Conv2d(in_planes, planes//4, kernel_size=1, padding=0) + self.conv2 = nn.Conv2d(planes//4, planes//4, kernel_size=3, padding=1, stride=stride) + self.conv3 = nn.Conv2d(planes//4, planes, kernel_size=1, padding=0) + self.relu = nn.ReLU(inplace=True) + + num_groups = planes // 8 + + if norm_fn == 'group': + self.norm1 = nn.GroupNorm(num_groups=num_groups, num_channels=planes//4) + self.norm2 = nn.GroupNorm(num_groups=num_groups, num_channels=planes//4) + self.norm3 = nn.GroupNorm(num_groups=num_groups, num_channels=planes) + if not stride == 1: + self.norm4 = nn.GroupNorm(num_groups=num_groups, num_channels=planes) + + elif norm_fn == 'batch': + self.norm1 = nn.BatchNorm2d(planes//4) + self.norm2 = 
nn.BatchNorm2d(planes//4) + self.norm3 = nn.BatchNorm2d(planes) + if not stride == 1: + self.norm4 = nn.BatchNorm2d(planes) + + elif norm_fn == 'instance': + self.norm1 = nn.InstanceNorm2d(planes//4) + self.norm2 = nn.InstanceNorm2d(planes//4) + self.norm3 = nn.InstanceNorm2d(planes) + if not stride == 1: + self.norm4 = nn.InstanceNorm2d(planes) + + elif norm_fn == 'none': + self.norm1 = nn.Sequential() + self.norm2 = nn.Sequential() + self.norm3 = nn.Sequential() + if not stride == 1: + self.norm4 = nn.Sequential() + + if stride == 1: + self.downsample = None + + else: + self.downsample = nn.Sequential( + nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride), self.norm4) + + + def forward(self, x): + y = x + y = self.relu(self.norm1(self.conv1(y))) + y = self.relu(self.norm2(self.conv2(y))) + y = self.relu(self.norm3(self.conv3(y))) + + if self.downsample is not None: + x = self.downsample(x) + + return self.relu(x+y) + + +class ResidualBlock(nn.Module): + def __init__(self, in_planes, planes, norm_fn='group', stride=1): + super(ResidualBlock, self).__init__() + + self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=3, padding=1, stride=stride) + self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, padding=1) + self.relu = nn.ReLU(inplace=True) + + num_groups = planes // 8 + + if norm_fn == 'group': + self.norm1 = nn.GroupNorm(num_groups=num_groups, num_channels=planes) + self.norm2 = nn.GroupNorm(num_groups=num_groups, num_channels=planes) + if not stride == 1: + self.norm3 = nn.GroupNorm(num_groups=num_groups, num_channels=planes) + + elif norm_fn == 'batch': + self.norm1 = nn.BatchNorm2d(planes) + self.norm2 = nn.BatchNorm2d(planes) + if not stride == 1: + self.norm3 = nn.BatchNorm2d(planes) + + elif norm_fn == 'instance': + self.norm1 = nn.InstanceNorm2d(planes) + self.norm2 = nn.InstanceNorm2d(planes) + if not stride == 1: + self.norm3 = nn.InstanceNorm2d(planes) + + elif norm_fn == 'none': + self.norm1 = nn.Sequential() + self.norm2 = nn.Sequential() + if not stride == 1: + self.norm3 = nn.Sequential() + + if stride == 1: + self.downsample = None + + else: + self.downsample = nn.Sequential( + nn.Conv2d(in_planes, planes, kernel_size=1, stride=stride), self.norm3) + + + def forward(self, x): + y = x + y = self.relu(self.norm1(self.conv1(y))) + y = self.relu(self.norm2(self.conv2(y))) + + if self.downsample is not None: + x = self.downsample(x) + + return self.relu(x+y) + + +class SmallEncoder(nn.Module): + def __init__(self, output_dim=128, norm_fn='batch', dropout=0.0): + super(SmallEncoder, self).__init__() + self.norm_fn = norm_fn + + if self.norm_fn == 'group': + self.norm1 = nn.GroupNorm(num_groups=8, num_channels=32) + + elif self.norm_fn == 'batch': + self.norm1 = nn.BatchNorm2d(32) + + elif self.norm_fn == 'instance': + self.norm1 = nn.InstanceNorm2d(32) + + elif self.norm_fn == 'none': + self.norm1 = nn.Sequential() + + self.conv1 = nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3) + self.relu1 = nn.ReLU(inplace=True) + + self.in_planes = 32 + self.layer1 = self._make_layer(32, stride=1) + self.layer2 = self._make_layer(64, stride=2) + self.layer3 = self._make_layer(96, stride=2) + + self.dropout = None + if dropout > 0: + self.dropout = nn.Dropout2d(p=dropout) + + self.conv2 = nn.Conv2d(96, output_dim, kernel_size=1) + + for m in self.modules(): + if isinstance(m, nn.Conv2d): + nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu') + elif isinstance(m, (nn.BatchNorm2d, nn.InstanceNorm2d, nn.GroupNorm)): + if m.weight is not None: + 
nn.init.constant_(m.weight, 1) + if m.bias is not None: + nn.init.constant_(m.bias, 0) + + def _make_layer(self, dim, stride=1): + layer1 = BottleneckBlock(self.in_planes, dim, self.norm_fn, stride=stride) + layer2 = BottleneckBlock(dim, dim, self.norm_fn, stride=1) + layers = (layer1, layer2) + + self.in_planes = dim + return nn.Sequential(*layers) + + + def forward(self, x): + + # if input is list, combine batch dimension + is_list = isinstance(x, tuple) or isinstance(x, list) + if is_list: + batch_dim = x[0].shape[0] + x = torch.cat(x, dim=0) + + x = self.conv1(x) + x = self.norm1(x) + x = self.relu1(x) + + x = self.layer1(x) + x = self.layer2(x) + x = self.layer3(x) + x = self.conv2(x) + + if self.training and self.dropout is not None: + x = self.dropout(x) + + if is_list: + x = torch.split(x, [batch_dim, batch_dim], dim=0) + + return x + +class BasicEncoder(nn.Module): + def __init__(self, output_dim=128, norm_fn='batch', dropout=0.0): + super(BasicEncoder, self).__init__() + self.norm_fn = norm_fn + + if self.norm_fn == 'group': + self.norm1 = nn.GroupNorm(num_groups=8, num_channels=64) + + elif self.norm_fn == 'batch': + self.norm1 = nn.BatchNorm2d(64) + + elif self.norm_fn == 'instance': + self.norm1 = nn.InstanceNorm2d(64) + + elif self.norm_fn == 'none': + self.norm1 = nn.Sequential() + + self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3) + self.relu1 = nn.ReLU(inplace=True) + + self.in_planes = 64 + self.layer1 = self._make_layer(64, stride=1) + self.layer2 = self._make_layer(72, stride=2) + self.layer3 = self._make_layer(128, stride=2) + + # output convolution + self.conv2 = nn.Conv2d(128, output_dim, kernel_size=1) + + self.dropout = None + if dropout > 0: + self.dropout = nn.Dropout2d(p=dropout) + + for m in self.modules(): + if isinstance(m, nn.Conv2d): + nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu') + elif isinstance(m, (nn.BatchNorm2d, nn.InstanceNorm2d, nn.GroupNorm)): + if m.weight is not None: + nn.init.constant_(m.weight, 1) + if m.bias is not None: + nn.init.constant_(m.bias, 0) + + def _make_layer(self, dim, stride=1): + layer1 = ResidualBlock(self.in_planes, dim, self.norm_fn, stride=stride) + layer2 = ResidualBlock(dim, dim, self.norm_fn, stride=1) + layers = (layer1, layer2) + + self.in_planes = dim + return nn.Sequential(*layers) + + + def forward(self, x): + + # if input is list, combine batch dimension + is_list = isinstance(x, tuple) or isinstance(x, list) + if is_list: + batch_dim = x[0].shape[0] + x = torch.cat(x, dim=0) + + x = self.conv1(x) + x = self.norm1(x) + x = self.relu1(x) + + x = self.layer1(x) + x = self.layer2(x) + x = self.layer3(x) + + x = self.conv2(x) + + if self.training and self.dropout is not None: + x = self.dropout(x) + + if is_list: + x = torch.split(x, [batch_dim, batch_dim], dim=0) + + return x + +class LargeEncoder(nn.Module): + def __init__(self, output_dim=128, norm_fn='batch', dropout=0.0): + super(LargeEncoder, self).__init__() + self.norm_fn = norm_fn + + if self.norm_fn == 'group': + self.norm1 = nn.GroupNorm(num_groups=8, num_channels=64) + + elif self.norm_fn == 'batch': + self.norm1 = nn.BatchNorm2d(64) + + elif self.norm_fn == 'instance': + self.norm1 = nn.InstanceNorm2d(64) + + elif self.norm_fn == 'none': + self.norm1 = nn.Sequential() + + self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3) + self.relu1 = nn.ReLU(inplace=True) + + self.in_planes = 64 + self.layer1 = self._make_layer(64, stride=1) + self.layer2 = self._make_layer(112, stride=2) + self.layer3 = 
self._make_layer(160, stride=2) + self.layer3_2 = self._make_layer(160, stride=1) + + # output convolution + self.conv2 = nn.Conv2d(self.in_planes, output_dim, kernel_size=1) + + self.dropout = None + if dropout > 0: + self.dropout = nn.Dropout2d(p=dropout) + + for m in self.modules(): + if isinstance(m, nn.Conv2d): + nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu') + elif isinstance(m, (nn.BatchNorm2d, nn.InstanceNorm2d, nn.GroupNorm)): + if m.weight is not None: + nn.init.constant_(m.weight, 1) + if m.bias is not None: + nn.init.constant_(m.bias, 0) + + def _make_layer(self, dim, stride=1): + layer1 = ResidualBlock(self.in_planes, dim, self.norm_fn, stride=stride) + layer2 = ResidualBlock(dim, dim, self.norm_fn, stride=1) + layers = (layer1, layer2) + + self.in_planes = dim + return nn.Sequential(*layers) + + + def forward(self, x): + + # if input is list, combine batch dimension + is_list = isinstance(x, tuple) or isinstance(x, list) + if is_list: + batch_dim = x[0].shape[0] + x = torch.cat(x, dim=0) + + x = self.conv1(x) + x = self.norm1(x) + x = self.relu1(x) + + x = self.layer1(x) + x = self.layer2(x) + x = self.layer3(x) + x = self.layer3_2(x) + + x = self.conv2(x) + + if self.training and self.dropout is not None: + x = self.dropout(x) + + if is_list: + x = torch.split(x, [batch_dim, batch_dim], dim=0) + + return x diff --git a/vbench/third_party/amt/networks/blocks/ifrnet.py b/vbench/third_party/amt/networks/blocks/ifrnet.py new file mode 100755 index 00000000..586ae610 --- /dev/null +++ b/vbench/third_party/amt/networks/blocks/ifrnet.py @@ -0,0 +1,111 @@ +import torch +import torch.nn as nn +import torch.nn.functional as F +from utils.flow_utils import warp + + +def resize(x, scale_factor): + return F.interpolate(x, scale_factor=scale_factor, mode="bilinear", align_corners=False) + +def convrelu(in_channels, out_channels, kernel_size=3, stride=1, padding=1, dilation=1, groups=1, bias=True): + return nn.Sequential( + nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding, dilation, groups, bias=bias), + nn.PReLU(out_channels) + ) + +class ResBlock(nn.Module): + def __init__(self, in_channels, side_channels, bias=True): + super(ResBlock, self).__init__() + self.side_channels = side_channels + self.conv1 = nn.Sequential( + nn.Conv2d(in_channels, in_channels, kernel_size=3, stride=1, padding=1, bias=bias), + nn.PReLU(in_channels) + ) + self.conv2 = nn.Sequential( + nn.Conv2d(side_channels, side_channels, kernel_size=3, stride=1, padding=1, bias=bias), + nn.PReLU(side_channels) + ) + self.conv3 = nn.Sequential( + nn.Conv2d(in_channels, in_channels, kernel_size=3, stride=1, padding=1, bias=bias), + nn.PReLU(in_channels) + ) + self.conv4 = nn.Sequential( + nn.Conv2d(side_channels, side_channels, kernel_size=3, stride=1, padding=1, bias=bias), + nn.PReLU(side_channels) + ) + self.conv5 = nn.Conv2d(in_channels, in_channels, kernel_size=3, stride=1, padding=1, bias=bias) + self.prelu = nn.PReLU(in_channels) + + def forward(self, x): + out = self.conv1(x) + + res_feat = out[:, :-self.side_channels, ...] + side_feat = out[:, -self.side_channels:, :, :] + side_feat = self.conv2(side_feat) + out = self.conv3(torch.cat([res_feat, side_feat], 1)) + + res_feat = out[:, :-self.side_channels, ...] 
+        side_feat = out[:, -self.side_channels:, :, :]
+        side_feat = self.conv4(side_feat)
+        out = self.conv5(torch.cat([res_feat, side_feat], 1))
+
+        out = self.prelu(x + out)
+        return out
+
+class Encoder(nn.Module):
+    def __init__(self, channels, large=False):
+        super(Encoder, self).__init__()
+        self.channels = channels
+        prev_ch = 3
+        for idx, ch in enumerate(channels, 1):
+            k = 7 if large and idx == 1 else 3
+            p = 3 if k == 7 else 1
+            self.register_module(f'pyramid{idx}',
+                                 nn.Sequential(
+                                     convrelu(prev_ch, ch, k, 2, p),
+                                     convrelu(ch, ch, 3, 1, 1)
+                                 ))
+            prev_ch = ch
+
+    def forward(self, in_x):
+        fs = []
+        for idx in range(len(self.channels)):
+            out_x = getattr(self, f'pyramid{idx+1}')(in_x)
+            fs.append(out_x)
+            in_x = out_x
+        return fs
+
+class InitDecoder(nn.Module):
+    def __init__(self, in_ch, out_ch, skip_ch) -> None:
+        super().__init__()
+        self.convblock = nn.Sequential(
+            convrelu(in_ch*2+1, in_ch*2),
+            ResBlock(in_ch*2, skip_ch),
+            nn.ConvTranspose2d(in_ch*2, out_ch+4, 4, 2, 1, bias=True)
+        )
+    def forward(self, f0, f1, embt):
+        h, w = f0.shape[2:]
+        embt = embt.repeat(1, 1, h, w)
+        out = self.convblock(torch.cat([f0, f1, embt], 1))
+        flow0, flow1 = torch.chunk(out[:, :4, ...], 2, 1)
+        ft_ = out[:, 4:, ...]
+        return flow0, flow1, ft_
+
+class IntermediateDecoder(nn.Module):
+    def __init__(self, in_ch, out_ch, skip_ch) -> None:
+        super().__init__()
+        self.convblock = nn.Sequential(
+            convrelu(in_ch*3+4, in_ch*3),
+            ResBlock(in_ch*3, skip_ch),
+            nn.ConvTranspose2d(in_ch*3, out_ch+4, 4, 2, 1, bias=True)
+        )
+    def forward(self, ft_, f0, f1, flow0_in, flow1_in):
+        f0_warp = warp(f0, flow0_in)
+        f1_warp = warp(f1, flow1_in)
+        f_in = torch.cat([ft_, f0_warp, f1_warp, flow0_in, flow1_in], 1)
+        out = self.convblock(f_in)
+        flow0, flow1 = torch.chunk(out[:, :4, ...], 2, 1)
+        ft_ = out[:, 4:, ...]
+        flow0 = flow0 + 2.0 * resize(flow0_in, scale_factor=2.0)
+        flow1 = flow1 + 2.0 * resize(flow1_in, scale_factor=2.0)
+        return flow0, flow1, ft_
\ No newline at end of file
diff --git a/vbench/third_party/amt/networks/blocks/multi_flow.py b/vbench/third_party/amt/networks/blocks/multi_flow.py
new file mode 100755
index 00000000..4563c326
--- /dev/null
+++ b/vbench/third_party/amt/networks/blocks/multi_flow.py
@@ -0,0 +1,69 @@
+import torch
+import torch.nn as nn
+from utils.flow_utils import warp
+from networks.blocks.ifrnet import (
+    convrelu, resize,
+    ResBlock,
+)
+
+
+def multi_flow_combine(comb_block, img0, img1, flow0, flow1,
+                       mask=None, img_res=None, mean=None):
+    '''
+    A parallel implementation of multiple flow field warping.
+    comb_block: An nn.Sequential object.
+    img shape: [b, c, h, w]
+    flow shape: [b, 2*num_flows, h, w]
+    mask (opt):
+        If 'mask' is None, the function conducts a simple average.
+    img_res (opt):
+        If 'img_res' is None, the function adds zero instead.
+    mean (opt):
+        If 'mean' is None, the function adds zero instead.
+ ''' + b, c, h, w = flow0.shape + num_flows = c // 2 + flow0 = flow0.reshape(b, num_flows, 2, h, w).reshape(-1, 2, h, w) + flow1 = flow1.reshape(b, num_flows, 2, h, w).reshape(-1, 2, h, w) + + mask = mask.reshape(b, num_flows, 1, h, w + ).reshape(-1, 1, h, w) if mask is not None else None + img_res = img_res.reshape(b, num_flows, 3, h, w + ).reshape(-1, 3, h, w) if img_res is not None else 0 + img0 = torch.stack([img0] * num_flows, 1).reshape(-1, 3, h, w) + img1 = torch.stack([img1] * num_flows, 1).reshape(-1, 3, h, w) + mean = torch.stack([mean] * num_flows, 1).reshape(-1, 1, 1, 1 + ) if mean is not None else 0 + + img0_warp = warp(img0, flow0) + img1_warp = warp(img1, flow1) + img_warps = mask * img0_warp + (1 - mask) * img1_warp + mean + img_res + img_warps = img_warps.reshape(b, num_flows, 3, h, w) + imgt_pred = img_warps.mean(1) + comb_block(img_warps.view(b, -1, h, w)) + return imgt_pred + + +class MultiFlowDecoder(nn.Module): + def __init__(self, in_ch, skip_ch, num_flows=3): + super(MultiFlowDecoder, self).__init__() + self.num_flows = num_flows + self.convblock = nn.Sequential( + convrelu(in_ch*3+4, in_ch*3), + ResBlock(in_ch*3, skip_ch), + nn.ConvTranspose2d(in_ch*3, 8*num_flows, 4, 2, 1, bias=True) + ) + + def forward(self, ft_, f0, f1, flow0, flow1): + n = self.num_flows + f0_warp = warp(f0, flow0) + f1_warp = warp(f1, flow1) + out = self.convblock(torch.cat([ft_, f0_warp, f1_warp, flow0, flow1], 1)) + delta_flow0, delta_flow1, mask, img_res = torch.split(out, [2*n, 2*n, n, 3*n], 1) + mask = torch.sigmoid(mask) + + flow0 = delta_flow0 + 2.0 * resize(flow0, scale_factor=2.0 + ).repeat(1, self.num_flows, 1, 1) + flow1 = delta_flow1 + 2.0 * resize(flow1, scale_factor=2.0 + ).repeat(1, self.num_flows, 1, 1) + + return flow0, flow1, mask, img_res \ No newline at end of file diff --git a/vbench/third_party/amt/networks/blocks/raft.py b/vbench/third_party/amt/networks/blocks/raft.py new file mode 100755 index 00000000..9fb85ad6 --- /dev/null +++ b/vbench/third_party/amt/networks/blocks/raft.py @@ -0,0 +1,207 @@ +import torch +import torch.nn as nn +import torch.nn.functional as F + + +def resize(x, scale_factor): + return F.interpolate(x, scale_factor=scale_factor, mode="bilinear", align_corners=False) + + +def bilinear_sampler(img, coords, mask=False): + """ Wrapper for grid_sample, uses pixel coordinates """ + H, W = img.shape[-2:] + xgrid, ygrid = coords.split([1,1], dim=-1) + xgrid = 2*xgrid/(W-1) - 1 + ygrid = 2*ygrid/(H-1) - 1 + + grid = torch.cat([xgrid, ygrid], dim=-1) + img = F.grid_sample(img, grid, align_corners=True) + + if mask: + mask = (xgrid > -1) & (ygrid > -1) & (xgrid < 1) & (ygrid < 1) + return img, mask.float() + + return img + + +def coords_grid(batch, ht, wd, device): + coords = torch.meshgrid(torch.arange(ht, device=device), + torch.arange(wd, device=device), + indexing='ij') + coords = torch.stack(coords[::-1], dim=0).float() + return coords[None].repeat(batch, 1, 1, 1) + + +class SmallUpdateBlock(nn.Module): + def __init__(self, cdim, hidden_dim, flow_dim, corr_dim, fc_dim, + corr_levels=4, radius=3, scale_factor=None): + super(SmallUpdateBlock, self).__init__() + cor_planes = corr_levels * (2 * radius + 1) **2 + self.scale_factor = scale_factor + + self.convc1 = nn.Conv2d(2 * cor_planes, corr_dim, 1, padding=0) + self.convf1 = nn.Conv2d(4, flow_dim*2, 7, padding=3) + self.convf2 = nn.Conv2d(flow_dim*2, flow_dim, 3, padding=1) + self.conv = nn.Conv2d(corr_dim+flow_dim, fc_dim, 3, padding=1) + + self.gru = nn.Sequential( + nn.Conv2d(fc_dim+4+cdim, hidden_dim, 
3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + ) + + self.feat_head = nn.Sequential( + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, cdim, 3, padding=1), + ) + + self.flow_head = nn.Sequential( + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, 4, 3, padding=1), + ) + + self.lrelu = nn.LeakyReLU(negative_slope=0.1, inplace=True) + + def forward(self, net, flow, corr): + net = resize(net, 1 / self.scale_factor + ) if self.scale_factor is not None else net + cor = self.lrelu(self.convc1(corr)) + flo = self.lrelu(self.convf1(flow)) + flo = self.lrelu(self.convf2(flo)) + cor_flo = torch.cat([cor, flo], dim=1) + inp = self.lrelu(self.conv(cor_flo)) + inp = torch.cat([inp, flow, net], dim=1) + + out = self.gru(inp) + delta_net = self.feat_head(out) + delta_flow = self.flow_head(out) + + if self.scale_factor is not None: + delta_net = resize(delta_net, scale_factor=self.scale_factor) + delta_flow = self.scale_factor * resize(delta_flow, scale_factor=self.scale_factor) + + return delta_net, delta_flow + + +class BasicUpdateBlock(nn.Module): + def __init__(self, cdim, hidden_dim, flow_dim, corr_dim, corr_dim2, + fc_dim, corr_levels=4, radius=3, scale_factor=None, out_num=1): + super(BasicUpdateBlock, self).__init__() + cor_planes = corr_levels * (2 * radius + 1) **2 + + self.scale_factor = scale_factor + self.convc1 = nn.Conv2d(2 * cor_planes, corr_dim, 1, padding=0) + self.convc2 = nn.Conv2d(corr_dim, corr_dim2, 3, padding=1) + self.convf1 = nn.Conv2d(4, flow_dim*2, 7, padding=3) + self.convf2 = nn.Conv2d(flow_dim*2, flow_dim, 3, padding=1) + self.conv = nn.Conv2d(flow_dim+corr_dim2, fc_dim, 3, padding=1) + + self.gru = nn.Sequential( + nn.Conv2d(fc_dim+4+cdim, hidden_dim, 3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + ) + + self.feat_head = nn.Sequential( + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, cdim, 3, padding=1), + ) + + self.flow_head = nn.Sequential( + nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1), + nn.LeakyReLU(negative_slope=0.1, inplace=True), + nn.Conv2d(hidden_dim, 4*out_num, 3, padding=1), + ) + + self.lrelu = nn.LeakyReLU(negative_slope=0.1, inplace=True) + + def forward(self, net, flow, corr): + net = resize(net, 1 / self.scale_factor + ) if self.scale_factor is not None else net + cor = self.lrelu(self.convc1(corr)) + cor = self.lrelu(self.convc2(cor)) + flo = self.lrelu(self.convf1(flow)) + flo = self.lrelu(self.convf2(flo)) + cor_flo = torch.cat([cor, flo], dim=1) + inp = self.lrelu(self.conv(cor_flo)) + inp = torch.cat([inp, flow, net], dim=1) + + out = self.gru(inp) + delta_net = self.feat_head(out) + delta_flow = self.flow_head(out) + + if self.scale_factor is not None: + delta_net = resize(delta_net, scale_factor=self.scale_factor) + delta_flow = self.scale_factor * resize(delta_flow, scale_factor=self.scale_factor) + return delta_net, delta_flow + + +class BidirCorrBlock: + def __init__(self, fmap1, fmap2, num_levels=4, radius=4): + self.num_levels = num_levels + self.radius = radius + self.corr_pyramid = [] + self.corr_pyramid_T = [] + + corr = BidirCorrBlock.corr(fmap1, fmap2) + batch, h1, w1, dim, h2, w2 = corr.shape + corr_T = corr.clone().permute(0, 4, 5, 3, 1, 2) + + corr = 
corr.reshape(batch*h1*w1, dim, h2, w2) + corr_T = corr_T.reshape(batch*h2*w2, dim, h1, w1) + + self.corr_pyramid.append(corr) + self.corr_pyramid_T.append(corr_T) + + for _ in range(self.num_levels-1): + corr = F.avg_pool2d(corr, 2, stride=2) + corr_T = F.avg_pool2d(corr_T, 2, stride=2) + self.corr_pyramid.append(corr) + self.corr_pyramid_T.append(corr_T) + + def __call__(self, coords0, coords1): + r = self.radius + coords0 = coords0.permute(0, 2, 3, 1) + coords1 = coords1.permute(0, 2, 3, 1) + assert coords0.shape == coords1.shape, f"coords0 shape: [{coords0.shape}] is not equal to [{coords1.shape}]" + batch, h1, w1, _ = coords0.shape + + out_pyramid = [] + out_pyramid_T = [] + for i in range(self.num_levels): + corr = self.corr_pyramid[i] + corr_T = self.corr_pyramid_T[i] + + dx = torch.linspace(-r, r, 2*r+1, device=coords0.device) + dy = torch.linspace(-r, r, 2*r+1, device=coords0.device) + delta = torch.stack(torch.meshgrid(dy, dx, indexing='ij'), axis=-1) + delta_lvl = delta.view(1, 2*r+1, 2*r+1, 2) + + centroid_lvl_0 = coords0.reshape(batch*h1*w1, 1, 1, 2) / 2**i + centroid_lvl_1 = coords1.reshape(batch*h1*w1, 1, 1, 2) / 2**i + coords_lvl_0 = centroid_lvl_0 + delta_lvl + coords_lvl_1 = centroid_lvl_1 + delta_lvl + + corr = bilinear_sampler(corr, coords_lvl_0) + corr_T = bilinear_sampler(corr_T, coords_lvl_1) + corr = corr.view(batch, h1, w1, -1) + corr_T = corr_T.view(batch, h1, w1, -1) + out_pyramid.append(corr) + out_pyramid_T.append(corr_T) + + out = torch.cat(out_pyramid, dim=-1) + out_T = torch.cat(out_pyramid_T, dim=-1) + return out.permute(0, 3, 1, 2).contiguous().float(), out_T.permute(0, 3, 1, 2).contiguous().float() + + @staticmethod + def corr(fmap1, fmap2): + batch, dim, ht, wd = fmap1.shape + fmap1 = fmap1.view(batch, dim, ht*wd) + fmap2 = fmap2.view(batch, dim, ht*wd) + + corr = torch.matmul(fmap1.transpose(1,2), fmap2) + corr = corr.view(batch, ht, wd, 1, ht, wd) + return corr / torch.sqrt(torch.tensor(dim).float()) \ No newline at end of file diff --git a/vbench/third_party/amt/scripts/benchmark_arbitrary.sh b/vbench/third_party/amt/scripts/benchmark_arbitrary.sh new file mode 100755 index 00000000..108daea1 --- /dev/null +++ b/vbench/third_party/amt/scripts/benchmark_arbitrary.sh @@ -0,0 +1,5 @@ +CFG=$1 +CKPT=$2 + +python benchmarks/gopro.py -c $CFG -p $CKPT +python benchmarks/adobe240.py -c $CFG -p $CKPT \ No newline at end of file diff --git a/vbench/third_party/amt/scripts/benchmark_fixed.sh b/vbench/third_party/amt/scripts/benchmark_fixed.sh new file mode 100755 index 00000000..55d06b04 --- /dev/null +++ b/vbench/third_party/amt/scripts/benchmark_fixed.sh @@ -0,0 +1,7 @@ +CFG=$1 +CKPT=$2 + +python benchmarks/vimeo90k.py -c $CFG -p $CKPT +python benchmarks/ucf101.py -c $CFG -p $CKPT +python benchmarks/snu_film.py -c $CFG -p $CKPT +python benchmarks/xiph.py -c $CFG -p $CKPT \ No newline at end of file diff --git a/vbench/third_party/amt/scripts/train.sh b/vbench/third_party/amt/scripts/train.sh new file mode 100755 index 00000000..92afb646 --- /dev/null +++ b/vbench/third_party/amt/scripts/train.sh @@ -0,0 +1,6 @@ +NUM_GPU=$1 +CFG=$2 +PORT=$3 +python -m torch.distributed.launch \ +--nproc_per_node $NUM_GPU \ +--master_port $PORT train.py -c $CFG \ No newline at end of file diff --git a/vbench/third_party/amt/train.py b/vbench/third_party/amt/train.py new file mode 100755 index 00000000..f0591e90 --- /dev/null +++ b/vbench/third_party/amt/train.py @@ -0,0 +1,68 @@ +import os +import argparse +from shutil import copyfile +import torch.distributed as dist +import 
torch
+import importlib
+import datetime
+from utils.dist_utils import (
+    get_world_size,
+)
+from omegaconf import OmegaConf
+from utils.utils import seed_all
+parser = argparse.ArgumentParser(description='VFI')
+parser.add_argument('-c', '--config', type=str)
+parser.add_argument('-p', '--port', default='23455', type=str)
+parser.add_argument('--local_rank', default='0')
+
+args = parser.parse_args()
+
+
+def main_worker(rank, config):
+    if 'local_rank' not in config:
+        config['local_rank'] = config['global_rank'] = rank
+    if torch.cuda.is_available():
+        print(f'Rank {rank} is available')
+        config['device'] = f"cuda:{rank}"
+        if config['distributed']:
+            dist.init_process_group(backend='nccl',
+                                    timeout=datetime.timedelta(seconds=5400))
+    else:
+        config['device'] = 'cpu'
+
+    cfg_name = os.path.basename(args.config).split('.')[0]
+    config['exp_name'] = cfg_name + '_' + config['exp_name']
+    config['save_dir'] = os.path.join(config['save_dir'], config['exp_name'])
+
+    if (not config['distributed']) or rank == 0:
+        os.makedirs(config['save_dir'], exist_ok=True)
+        os.makedirs(f'{config["save_dir"]}/ckpts', exist_ok=True)
+        config_path = os.path.join(config['save_dir'],
+                                   args.config.split('/')[-1])
+        if not os.path.isfile(config_path):
+            copyfile(args.config, config_path)
+        print('[**] create folder {}'.format(config['save_dir']))
+
+    trainer_name = config.get('trainer_type', 'base_trainer')
+    print(f'using GPU {rank} for training')
+    if rank == 0:
+        print(trainer_name)
+    trainer_pack = importlib.import_module('trainers.' + trainer_name)
+    trainer = trainer_pack.Trainer(config)
+
+    trainer.train()
+
+
+if __name__ == "__main__":
+    torch.backends.cudnn.benchmark = True
+    cfg = OmegaConf.load(args.config)
+    seed_all(cfg.seed)
+    rank = int(args.local_rank)
+    torch.cuda.set_device(torch.device(f'cuda:{rank}'))
+    # setting distributed configurations
+    cfg['world_size'] = get_world_size()
+    cfg['local_rank'] = rank
+    if rank == 0:
+        print('world_size: ', cfg['world_size'])
+    main_worker(rank, cfg)
+
diff --git a/vbench/third_party/amt/trainers/__init__.py b/vbench/third_party/amt/trainers/__init__.py
new file mode 100755
index 00000000..e69de29b
diff --git a/vbench/third_party/amt/trainers/base_trainer.py b/vbench/third_party/amt/trainers/base_trainer.py
new file mode 100755
index 00000000..ec747a92
--- /dev/null
+++ b/vbench/third_party/amt/trainers/base_trainer.py
@@ -0,0 +1,243 @@
+import time
+import wandb
+import logging
+import numpy as np
+import os.path as osp
+from collections import OrderedDict
+
+import torch
+from torch.optim import AdamW
+from torch.utils.data import DataLoader
+from torch.utils.data.distributed import DistributedSampler
+from torch.nn.parallel import DistributedDataParallel as DDP
+
+from .logger import CustomLogger
+from utils.utils import AverageMeterGroups
+from metrics.psnr_ssim import calculate_psnr
+from utils.build_utils import build_from_cfg
+
+
+class Trainer:
+    def __init__(self, config):
+        super().__init__()
+        self.config = config
+        self.rank = self.config['local_rank']
+        init_log = self._init_logger()
+        self._init_dataset()
+        self._init_loss()
+        self.model_name = config['exp_name']
+        self.model = build_from_cfg(config.network).to(self.config.device)
+
+        if config['distributed']:
+            self.model = DDP(self.model,
+                             device_ids=[self.rank],
+                             output_device=self.rank,
+                             broadcast_buffers=True,
+                             find_unused_parameters=False)
+
+        init_log += str(self.model)
+        self.optimizer = AdamW(self.model.parameters(),
+                               lr=config.lr, weight_decay=config.weight_decay)
+        if
self.rank == 0: + print(init_log) + self.logger(init_log) + self.resume_training() + + def resume_training(self): + ckpt_path = self.config.get('resume_state') + if ckpt_path is not None: + ckpt = torch.load(self.config['resume_state']) + if self.config['distributed']: + self.model.module.load_state_dict(ckpt['state_dict']) + else: + self.model.load_state_dict(ckpt['state_dict']) + self.optimizer.load_state_dict(ckpt['optim']) + self.resume_epoch = ckpt.get('epoch') + self.logger( + f'load model from {ckpt_path} and training resumes from epoch {self.resume_epoch}') + else: + self.resume_epoch = 0 + + def _init_logger(self): + init_log = '' + console_cfg = dict( + level=logging.INFO, + format="%(asctime)s %(filename)s[line:%(lineno)d]" + "%(levelname)s %(message)s", + datefmt="%a, %d %b %Y %H:%M:%S", + filename=f"{self.config['save_dir']}/log", + filemode='w') + tb_cfg = dict(log_dir=osp.join(self.config['save_dir'], 'tb_logger')) + wandb_cfg = None + use_wandb = self.config['logger'].get('use_wandb', False) + if use_wandb: + resume_id = self.config['logger'].get('resume_id', None) + if resume_id: + wandb_id = resume_id + resume = 'allow' + init_log += f'Resume wandb logger with id={wandb_id}.' + else: + wandb_id = wandb.util.generate_id() + resume = 'never' + + wandb_cfg = dict(id=wandb_id, + resume=resume, + name=osp.basename(self.config['save_dir']), + config=self.config, + project="YOUR PROJECT", + entity="YOUR ENTITY", + sync_tensorboard=True) + init_log += f'Use wandb logger with id={wandb_id}; project=[YOUR PROJECT].' + self.logger = CustomLogger(console_cfg, tb_cfg, wandb_cfg, self.rank) + return init_log + + def _init_dataset(self): + dataset_train = build_from_cfg(self.config.data.train) + dataset_val = build_from_cfg(self.config.data.val) + + self.sampler = DistributedSampler( + dataset_train, num_replicas=self.config['world_size'], rank=self.config['local_rank']) + self.config.data.train_loader.batch_size //= self.config['world_size'] + self.loader_train = DataLoader(dataset_train, + **self.config.data.train_loader, + pin_memory=True, drop_last=True, sampler=self.sampler) + + self.loader_val = DataLoader(dataset_val, **self.config.data.val_loader, + pin_memory=True, shuffle=False, drop_last=False) + + def _init_loss(self): + self.loss_dict = dict() + for loss_cfg in self.config.losses: + loss = build_from_cfg(loss_cfg) + self.loss_dict[loss_cfg['nickname']] = loss + + def set_lr(self, optimizer, lr): + for param_group in optimizer.param_groups: + param_group['lr'] = lr + + def get_lr(self, iters): + ratio = 0.5 * (1.0 + np.cos(iters / + (self.config['epochs'] * self.loader_train.__len__()) * np.pi)) + lr = (self.config['lr'] - self.config['lr_min'] + ) * ratio + self.config['lr_min'] + return lr + + def train(self): + local_rank = self.config['local_rank'] + best_psnr = 0.0 + loss_group = AverageMeterGroups() + time_group = AverageMeterGroups() + iters_per_epoch = self.loader_train.__len__() + iters = self.resume_epoch * iters_per_epoch + total_iters = self.config['epochs'] * iters_per_epoch + + start_t = time.time() + total_t = 0 + for epoch in range(self.resume_epoch, self.config['epochs']): + self.sampler.set_epoch(epoch) + for data in self.loader_train: + for k, v in data.items(): + data[k] = v.to(self.config['device']) + data_t = time.time() - start_t + + lr = self.get_lr(iters) + self.set_lr(self.optimizer, lr) + + self.optimizer.zero_grad() + results = self.model(**data) + total_loss = torch.tensor(0., device=self.config['device']) + for name, loss in 
self.loss_dict.items(): + l = loss(**results, **data) + loss_group.update({name: l.cpu().data}) + total_loss += l + total_loss.backward() + self.optimizer.step() + + iters += 1 + + iter_t = time.time() - start_t + total_t += iter_t + time_group.update({'data_t': data_t, 'iter_t': iter_t}) + + if (iters+1) % 100 == 0 and local_rank == 0: + tpi = total_t / (iters - self.resume_epoch * iters_per_epoch) + eta = total_iters * tpi + remainder = (total_iters - iters) * tpi + eta = self.eta_format(eta) + + remainder = self.eta_format(remainder) + log_str = f"[{self.model_name}]epoch:{epoch +1}/{self.config['epochs']} " + log_str += f"iter:{iters + 1}/{self.config['epochs'] * iters_per_epoch} " + log_str += f"time:{time_group.avg('iter_t'):.3f}({time_group.avg('data_t'):.3f}) " + log_str += f"lr:{lr:.3e} eta:{remainder}({eta})\n" + for name in self.loss_dict.keys(): + avg_l = loss_group.avg(name) + log_str += f"{name}:{avg_l:.3e} " + self.logger(tb_msg=[f'loss/{name}', avg_l, iters]) + log_str += f'best:{best_psnr:.2f}dB\n\n' + self.logger(log_str) + loss_group.reset() + time_group.reset() + start_t = time.time() + + if (epoch+1) % self.config['eval_interval'] == 0 and local_rank == 0: + psnr, eval_t = self.evaluate(epoch) + total_t += eval_t + self.logger(tb_msg=['eval/psnr', psnr, epoch]) + if psnr > best_psnr: + best_psnr = psnr + self.save('psnr_best.pth', epoch) + if self.logger.enable_wandb: + wandb.run.summary["best_psnr"] = best_psnr + if (epoch+1) % 50 == 0: + self.save(f'epoch_{epoch+1}.pth', epoch) + self.save('latest.pth', epoch) + + self.logger.close() + + def evaluate(self, epoch): + psnr_list = [] + time_stamp = time.time() + for i, data in enumerate(self.loader_val): + for k, v in data.items(): + data[k] = v.to(self.config['device']) + + with torch.no_grad(): + results = self.model(**data, eval=True) + imgt_pred = results['imgt_pred'] + for j in range(data['img0'].shape[0]): + psnr = calculate_psnr(imgt_pred[j].detach().unsqueeze( + 0), data['imgt'][j].unsqueeze(0)).cpu().data + psnr_list.append(psnr) + + eval_time = time.time() - time_stamp + + self.logger('eval epoch:{}/{} time:{:.2f} psnr:{:.3f}'.format( + epoch+1, self.config["epochs"], eval_time, np.array(psnr_list).mean())) + return np.array(psnr_list).mean(), eval_time + + def save(self, name, epoch): + save_path = '{}/{}/{}'.format(self.config['save_dir'], 'ckpts', name) + ckpt = OrderedDict(epoch=epoch) + if self.config['distributed']: + ckpt['state_dict'] = self.model.module.state_dict() + else: + ckpt['state_dict'] = self.model.state_dict() + ckpt['optim'] = self.optimizer.state_dict() + torch.save(ckpt, save_path) + + def eta_format(self, eta): + time_str = '' + if eta >= 3600: + hours = int(eta // 3600) + eta -= hours * 3600 + time_str = f'{hours}' + + if eta >= 60: + mins = int(eta // 60) + eta -= mins * 60 + time_str = f'{time_str}:{mins:02}' + + eta = int(eta) + time_str = f'{time_str}:{eta:02}' + return time_str diff --git a/vbench/third_party/amt/trainers/logger.py b/vbench/third_party/amt/trainers/logger.py new file mode 100755 index 00000000..2683f3bb --- /dev/null +++ b/vbench/third_party/amt/trainers/logger.py @@ -0,0 +1,62 @@ +import time +import wandb +import shutil +import logging +import os.path as osp +from torch.utils.tensorboard import SummaryWriter + + +def mv_archived_logger(name): + timestamp = time.strftime("%Y-%m-%d_%H:%M:%S_", time.localtime()) + basename = 'archived_' + timestamp + osp.basename(name) + archived_name = osp.join(osp.dirname(name), basename) + shutil.move(name, archived_name) + + 
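+# NOTE (illustrative usage sketch, not part of the original AMT code): the
+# trainer drives this logger through a single callable interface. Assuming
+# cfg dicts shaped like the ones assembled in
+# trainers/base_trainer.py::_init_logger, usage looks like:
+#
+#     logger = CustomLogger(console_cfg, tb_cfg, wandb_cfg=None, rank=rank)
+#     logger('training started')                   # console + file, rank 0 only
+#     logger(tb_msg=['loss/total', 0.123, step])   # TensorBoard scalar
+#     logger.close()                               # finalize wandb run, if enabled
+#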
+class CustomLogger: + def __init__(self, common_cfg, tb_cfg=None, wandb_cfg=None, rank=0): + global global_logger + self.rank = rank + + if self.rank == 0: + self.logger = logging.getLogger('VFI') + self.logger.setLevel(logging.INFO) + format_str = logging.Formatter(common_cfg['format']) + + console_handler = logging.StreamHandler() + console_handler.setFormatter(format_str) + + if osp.exists(common_cfg['filename']): + mv_archived_logger(common_cfg['filename']) + + file_handler = logging.FileHandler(common_cfg['filename'], + common_cfg['filemode']) + file_handler.setFormatter(format_str) + + self.logger.addHandler(console_handler) + self.logger.addHandler(file_handler) + self.tb_logger = None + + self.enable_wandb = False + + if wandb_cfg is not None: + self.enable_wandb = True + wandb.init(**wandb_cfg) + + if tb_cfg is not None: + self.tb_logger = SummaryWriter(**tb_cfg) + + global_logger = self + + def __call__(self, msg=None, level=logging.INFO, tb_msg=None): + if self.rank != 0: + return + if msg is not None: + self.logger.log(level, msg) + + if self.tb_logger is not None and tb_msg is not None: + self.tb_logger.add_scalar(*tb_msg) + + def close(self): + if self.rank == 0 and self.enable_wandb: + wandb.finish() diff --git a/vbench/third_party/amt/utils/__init__.py b/vbench/third_party/amt/utils/__init__.py new file mode 100755 index 00000000..e69de29b diff --git a/vbench/third_party/amt/utils/build_utils.py b/vbench/third_party/amt/utils/build_utils.py new file mode 100755 index 00000000..c9cc0d3b --- /dev/null +++ b/vbench/third_party/amt/utils/build_utils.py @@ -0,0 +1,15 @@ +import importlib + +import sys +sys.path.append("vbench/third_party/amt") + + +def base_build_fn(module, cls, params): + return getattr(importlib.import_module( + module, package=None), cls)(**params) + + +def build_from_cfg(config): + module, cls = config['name'].rsplit(".", 1) + params = config.get('params', {}) + return base_build_fn(module, cls, params) diff --git a/vbench/third_party/amt/utils/dist_utils.py b/vbench/third_party/amt/utils/dist_utils.py new file mode 100755 index 00000000..6337f999 --- /dev/null +++ b/vbench/third_party/amt/utils/dist_utils.py @@ -0,0 +1,48 @@ +import os +import torch + + +def get_world_size(): + """Find OMPI world size without calling mpi functions + :rtype: int + """ + if os.environ.get('PMI_SIZE') is not None: + return int(os.environ.get('PMI_SIZE') or 1) + elif os.environ.get('OMPI_COMM_WORLD_SIZE') is not None: + return int(os.environ.get('OMPI_COMM_WORLD_SIZE') or 1) + else: + return torch.cuda.device_count() + + +def get_global_rank(): + """Find OMPI world rank without calling mpi functions + :rtype: int + """ + if os.environ.get('PMI_RANK') is not None: + return int(os.environ.get('PMI_RANK') or 0) + elif os.environ.get('OMPI_COMM_WORLD_RANK') is not None: + return int(os.environ.get('OMPI_COMM_WORLD_RANK') or 0) + else: + return 0 + + +def get_local_rank(): + """Find OMPI local rank without calling mpi functions + :rtype: int + """ + if os.environ.get('MPI_LOCALRANKID') is not None: + return int(os.environ.get('MPI_LOCALRANKID') or 0) + elif os.environ.get('OMPI_COMM_WORLD_LOCAL_RANK') is not None: + return int(os.environ.get('OMPI_COMM_WORLD_LOCAL_RANK') or 0) + else: + return 0 + + +def get_master_ip(): + if os.environ.get('AZ_BATCH_MASTER_NODE') is not None: + return os.environ.get('AZ_BATCH_MASTER_NODE').split(':')[0] + elif os.environ.get('AZ_BATCHAI_MPI_MASTER_NODE') is not None: + return os.environ.get('AZ_BATCHAI_MPI_MASTER_NODE') + else: + return 
"127.0.0.1" + diff --git a/vbench/third_party/amt/utils/flow_utils.py b/vbench/third_party/amt/utils/flow_utils.py new file mode 100755 index 00000000..84fca204 --- /dev/null +++ b/vbench/third_party/amt/utils/flow_utils.py @@ -0,0 +1,122 @@ +import numpy as np +import torch +from PIL import ImageFile +import torch.nn.functional as F +ImageFile.LOAD_TRUNCATED_IMAGES = True + + +def warp(img, flow): + B, _, H, W = flow.shape + xx = torch.linspace(-1.0, 1.0, W).view(1, 1, 1, W).expand(B, -1, H, -1) + yy = torch.linspace(-1.0, 1.0, H).view(1, 1, H, 1).expand(B, -1, -1, W) + grid = torch.cat([xx, yy], 1).to(img) + flow_ = torch.cat([flow[:, 0:1, :, :] / ((W - 1.0) / 2.0), flow[:, 1:2, :, :] / ((H - 1.0) / 2.0)], 1) + grid_ = (grid + flow_).permute(0, 2, 3, 1) + output = F.grid_sample(input=img, grid=grid_, mode='bilinear', padding_mode='border', align_corners=True) + return output + + +def make_colorwheel(): + """ + Generates a color wheel for optical flow visualization as presented in: + Baker et al. "A Database and Evaluation Methodology for Optical Flow" (ICCV, 2007) + URL: http://vision.middlebury.edu/flow/flowEval-iccv07.pdf + Code follows the original C++ source code of Daniel Scharstein. + Code follows the the Matlab source code of Deqing Sun. + Returns: + np.ndarray: Color wheel + """ + + RY = 15 + YG = 6 + GC = 4 + CB = 11 + BM = 13 + MR = 6 + + ncols = RY + YG + GC + CB + BM + MR + colorwheel = np.zeros((ncols, 3)) + col = 0 + + # RY + colorwheel[0:RY, 0] = 255 + colorwheel[0:RY, 1] = np.floor(255*np.arange(0,RY)/RY) + col = col+RY + # YG + colorwheel[col:col+YG, 0] = 255 - np.floor(255*np.arange(0,YG)/YG) + colorwheel[col:col+YG, 1] = 255 + col = col+YG + # GC + colorwheel[col:col+GC, 1] = 255 + colorwheel[col:col+GC, 2] = np.floor(255*np.arange(0,GC)/GC) + col = col+GC + # CB + colorwheel[col:col+CB, 1] = 255 - np.floor(255*np.arange(CB)/CB) + colorwheel[col:col+CB, 2] = 255 + col = col+CB + # BM + colorwheel[col:col+BM, 2] = 255 + colorwheel[col:col+BM, 0] = np.floor(255*np.arange(0,BM)/BM) + col = col+BM + # MR + colorwheel[col:col+MR, 2] = 255 - np.floor(255*np.arange(MR)/MR) + colorwheel[col:col+MR, 0] = 255 + return colorwheel + +def flow_uv_to_colors(u, v, convert_to_bgr=False): + """ + Applies the flow color wheel to (possibly clipped) flow components u and v. + According to the C++ source code of Daniel Scharstein + According to the Matlab source code of Deqing Sun + Args: + u (np.ndarray): Input horizontal flow of shape [H,W] + v (np.ndarray): Input vertical flow of shape [H,W] + convert_to_bgr (bool, optional): Convert output image to BGR. Defaults to False. + Returns: + np.ndarray: Flow visualization image of shape [H,W,3] + """ + flow_image = np.zeros((u.shape[0], u.shape[1], 3), np.uint8) + colorwheel = make_colorwheel() # shape [55x3] + ncols = colorwheel.shape[0] + rad = np.sqrt(np.square(u) + np.square(v)) + a = np.arctan2(-v, -u)/np.pi + fk = (a+1) / 2*(ncols-1) + k0 = np.floor(fk).astype(np.int32) + k1 = k0 + 1 + k1[k1 == ncols] = 0 + f = fk - k0 + for i in range(colorwheel.shape[1]): + tmp = colorwheel[:,i] + col0 = tmp[k0] / 255.0 + col1 = tmp[k1] / 255.0 + col = (1-f)*col0 + f*col1 + idx = (rad <= 1) + col[idx] = 1 - rad[idx] * (1-col[idx]) + col[~idx] = col[~idx] * 0.75 # out of range + # Note the 2-i => BGR instead of RGB + ch_idx = 2-i if convert_to_bgr else i + flow_image[:,:,ch_idx] = np.floor(255 * col) + return flow_image + +def flow_to_image(flow_uv, clip_flow=None, convert_to_bgr=False): + """ + Expects a two dimensional flow image of shape. 
+ Args: + flow_uv (np.ndarray): Flow UV image of shape [H,W,2] + clip_flow (float, optional): Clip maximum of flow values. Defaults to None. + convert_to_bgr (bool, optional): Convert output image to BGR. Defaults to False. + Returns: + np.ndarray: Flow visualization image of shape [H,W,3] + """ + assert flow_uv.ndim == 3, 'input flow must have three dimensions' + assert flow_uv.shape[2] == 2, 'input flow must have shape [H,W,2]' + if clip_flow is not None: + flow_uv = np.clip(flow_uv, 0, clip_flow) + u = flow_uv[:,:,0] + v = flow_uv[:,:,1] + rad = np.sqrt(np.square(u) + np.square(v)) + rad_max = np.max(rad) + epsilon = 1e-5 + u = u / (rad_max + epsilon) + v = v / (rad_max + epsilon) + return flow_uv_to_colors(u, v, convert_to_bgr) \ No newline at end of file diff --git a/vbench/third_party/amt/utils/utils.py b/vbench/third_party/amt/utils/utils.py new file mode 100755 index 00000000..0473226d --- /dev/null +++ b/vbench/third_party/amt/utils/utils.py @@ -0,0 +1,297 @@ +import re +import sys +import torch +import random +import numpy as np +from PIL import ImageFile +import torch.nn.functional as F +from imageio import imread, imwrite +ImageFile.LOAD_TRUNCATED_IMAGES = True + + +class AverageMeter(): + def __init__(self): + self.reset() + + def reset(self): + self.val = 0. + self.avg = 0. + self.sum = 0. + self.count = 0 + + def update(self, val, n=1): + self.val = val + self.sum += val * n + self.count += n + self.avg = self.sum / self.count + + +class AverageMeterGroups: + def __init__(self) -> None: + self.meter_dict = dict() + + def update(self, dict, n=1): + for name, val in dict.items(): + if self.meter_dict.get(name) is None: + self.meter_dict[name] = AverageMeter() + self.meter_dict[name].update(val, n) + + def reset(self, name=None): + if name is None: + for v in self.meter_dict.values(): + v.reset() + else: + meter = self.meter_dict.get(name) + if meter is not None: + meter.reset() + + def avg(self, name): + meter = self.meter_dict.get(name) + if meter is not None: + return meter.avg + + +class InputPadder: + """ Pads images such that dimensions are divisible by divisor """ + def __init__(self, dims, divisor=16): + self.ht, self.wd = dims[-2:] + pad_ht = (((self.ht // divisor) + 1) * divisor - self.ht) % divisor + pad_wd = (((self.wd // divisor) + 1) * divisor - self.wd) % divisor + self._pad = [pad_wd//2, pad_wd - pad_wd//2, pad_ht//2, pad_ht - pad_ht//2] + + def pad(self, *inputs): + if len(inputs) == 1: + return F.pad(inputs[0], self._pad, mode='replicate') + else: + return [F.pad(x, self._pad, mode='replicate') for x in inputs] + + def unpad(self, *inputs): + if len(inputs) == 1: + return self._unpad(inputs[0]) + else: + return [self._unpad(x) for x in inputs] + + def _unpad(self, x): + ht, wd = x.shape[-2:] + c = [self._pad[2], ht-self._pad[3], self._pad[0], wd-self._pad[1]] + return x[..., c[0]:c[1], c[2]:c[3]] + + +def img2tensor(img): + if img.shape[-1] > 3: + img = img[:,:,:3] + return torch.tensor(img).permute(2, 0, 1).unsqueeze(0) / 255.0 + + +def tensor2img(img_t): + return (img_t * 255.).detach( + ).squeeze(0).permute(1, 2, 0).cpu().numpy( + ).clip(0, 255).astype(np.uint8) + +def seed_all(seed): + random.seed(seed) + np.random.seed(seed) + torch.manual_seed(seed) + torch.cuda.manual_seed_all(seed) + + +def read(file): + if file.endswith('.float3'): return readFloat(file) + elif file.endswith('.flo'): return readFlow(file) + elif file.endswith('.ppm'): return readImage(file) + elif file.endswith('.pgm'): return readImage(file) + elif file.endswith('.png'): return 
readImage(file)
+    elif file.endswith('.jpg'): return readImage(file)
+    elif file.endswith('.pfm'): return readPFM(file)[0]
+    else: raise Exception('don\'t know how to read %s' % file)
+
+
+def write(file, data):
+    if file.endswith('.float3'): return writeFloat(file, data)
+    elif file.endswith('.flo'): return writeFlow(file, data)
+    elif file.endswith('.ppm'): return writeImage(file, data)
+    elif file.endswith('.pgm'): return writeImage(file, data)
+    elif file.endswith('.png'): return writeImage(file, data)
+    elif file.endswith('.jpg'): return writeImage(file, data)
+    elif file.endswith('.pfm'): return writePFM(file, data)
+    else: raise Exception('don\'t know how to write %s' % file)
+
+
+def readPFM(file):
+    file = open(file, 'rb')
+
+    color = None
+    width = None
+    height = None
+    scale = None
+    endian = None
+
+    header = file.readline().rstrip()
+    if header.decode("ascii") == 'PF':
+        color = True
+    elif header.decode("ascii") == 'Pf':
+        color = False
+    else:
+        raise Exception('Not a PFM file.')
+
+    dim_match = re.match(r'^(\d+)\s(\d+)\s$', file.readline().decode("ascii"))
+    if dim_match:
+        width, height = list(map(int, dim_match.groups()))
+    else:
+        raise Exception('Malformed PFM header.')
+
+    scale = float(file.readline().decode("ascii").rstrip())
+    if scale < 0:
+        endian = '<'
+        scale = -scale
+    else:
+        endian = '>'
+
+    data = np.fromfile(file, endian + 'f')
+    shape = (height, width, 3) if color else (height, width)
+
+    data = np.reshape(data, shape)
+    data = np.flipud(data)
+    return data, scale
+
+
+def writePFM(file, image, scale=1):
+    file = open(file, 'wb')
+
+    color = None
+
+    if image.dtype.name != 'float32':
+        raise Exception('Image dtype must be float32.')
+
+    image = np.flipud(image)
+
+    if len(image.shape) == 3 and image.shape[2] == 3:
+        color = True
+    elif len(image.shape) == 2 or len(image.shape) == 3 and image.shape[2] == 1:
+        color = False
+    else:
+        raise Exception('Image must have H x W x 3, H x W x 1 or H x W dimensions.')
+
+    file.write(('PF\n' if color else 'Pf\n').encode())
+    file.write('%d %d\n'.encode() % (image.shape[1], image.shape[0]))
+
+    endian = image.dtype.byteorder
+
+    if endian == '<' or endian == '=' and sys.byteorder == 'little':
+        scale = -scale
+
+    file.write('%f\n'.encode() % scale)
+
+    image.tofile(file)
+
+
+def readFlow(name):
+    if name.endswith('.pfm') or name.endswith('.PFM'):
+        return readPFM(name)[0][:,:,0:2]
+
+    f = open(name, 'rb')
+
+    header = f.read(4)
+    if header.decode("utf-8") != 'PIEH':
+        raise Exception('Flow file header does not contain PIEH')
+
+    width = np.fromfile(f, np.int32, 1).squeeze()
+    height = np.fromfile(f, np.int32, 1).squeeze()
+
+    flow = np.fromfile(f, np.float32, width * height * 2).reshape((height, width, 2))
+
+    return flow.astype(np.float32)
+
+
+def readImage(name):
+    if name.endswith('.pfm') or name.endswith('.PFM'):
+        data = readPFM(name)[0]
+        if len(data.shape)==3:
+            return data[:,:,0:3]
+        else:
+            return data
+    return imread(name)
+
+
+def writeImage(name, data):
+    if name.endswith('.pfm') or name.endswith('.PFM'):
+        return writePFM(name, data, 1)
+    return imwrite(name, data)
+
+
+def writeFlow(name, flow):
+    f = open(name, 'wb')
+    f.write('PIEH'.encode('utf-8'))
+    np.array([flow.shape[1], flow.shape[0]], dtype=np.int32).tofile(f)
+    flow = flow.astype(np.float32)
+    flow.tofile(f)
+
+
+def readFloat(name):
+    f = open(name, 'rb')
+
+    if(f.readline().decode("utf-8")) != 'float\n':
+        raise Exception('float file %s did not contain keyword' % name)
+
+    dim = int(f.readline())
+
+    dims = []
+    count = 1
+    for i in range(0,
dim): + d = int(f.readline()) + dims.append(d) + count *= d + + dims = list(reversed(dims)) + + data = np.fromfile(f, np.float32, count).reshape(dims) + if dim > 2: + data = np.transpose(data, (2, 1, 0)) + data = np.transpose(data, (1, 0, 2)) + + return data + + +def writeFloat(name, data): + f = open(name, 'wb') + + dim=len(data.shape) + if dim>3: + raise Exception('bad float file dimension: %d' % dim) + + f.write(('float\n').encode('ascii')) + f.write(('%d\n' % dim).encode('ascii')) + + if dim == 1: + f.write(('%d\n' % data.shape[0]).encode('ascii')) + else: + f.write(('%d\n' % data.shape[1]).encode('ascii')) + f.write(('%d\n' % data.shape[0]).encode('ascii')) + for i in range(2, dim): + f.write(('%d\n' % data.shape[i]).encode('ascii')) + + data = data.astype(np.float32) + if dim==2: + data.tofile(f) + + else: + np.transpose(data, (2, 0, 1)).tofile(f) + + +def check_dim_and_resize(tensor_list): + shape_list = [] + for t in tensor_list: + shape_list.append(t.shape[2:]) + + if len(set(shape_list)) > 1: + desired_shape = shape_list[0] + print(f'Inconsistent size of input video frames. All frames will be resized to {desired_shape}') + + resize_tensor_list = [] + for t in tensor_list: + resize_tensor_list.append(torch.nn.functional.interpolate(t, size=tuple(desired_shape), mode='bilinear')) + + tensor_list = resize_tensor_list + + return tensor_list + diff --git a/vbench/utils.py b/vbench/utils.py index a3886e06..6f2bcf9c 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -192,6 +192,33 @@ def init_submodules(dimension_list, local=False): if not os.path.isfile(umt_path): os.system(f'wget -q --show-progress https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -O {umt_path}') submodules_dict[dimension] = [umt_path,] + elif dimension == 'temporal_flickering': + submodules_dict[dimension] = [] + elif dimension == 'motion_smoothness': + submodules_dict[dimension] = { + 'config':'pretrained/amt_model/AMT-S.yaml', + 'ckpt':'pretrained/amt_model/amt-s.pth' + } + if local: + details = submodules_dict[dimension] + # Check if the file exists, if not, download it with wget + if not os.path.isfile(details['ckpt']): + print(f"File {details['ckpt']} does not exist. Downloading...") + wget_command = ['wget', '-P', os.path.dirname(details['ckpt']), + 'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth'] + subprocess.run(wget_command) + + elif dimension == 'dynamic_degree': + submodules_dict[dimension] = { + 'model':'pretrained/raft_model/models/raft-things.pth' + } + if local: + details = submodules_dict[dimension] + if not os.path.isfile(details['model']): + print(f"File {details['model']} does not exist. 
From c125c9e98826efab7e9a19b76602abe2e4f72d77 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Thu, 30 Nov 2023 19:36:50 +0800
Subject: [PATCH 019/248] [add] evaluate.py

---
 README.md   |  5 ++++-
 evaluate.py | 58 +++++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 62 insertions(+), 1 deletion(-)
 create mode 100644 evaluate.py

diff --git a/README.md b/README.md
index 0ec1a754..5e9957af 100755
--- a/README.md
+++ b/README.md
@@ -46,7 +46,7 @@ We provide prompt lists are at `prompts/`, see [instructions](https://github.com/Vchitect/VBench/tree/main/prompts) for details.
 
 ## :surfer: Evaluation Method Suite
 
 To perform evaluation, run this:
-```
+
+```
+python evaluate.py --videos_path $VIDEOS_PATH --dimension $DIMENSION
 ```
 
 - The complete list of dimensions:
diff --git a/evaluate.py b/evaluate.py
new file mode 100644
index 00000000..ddda3950
--- /dev/null
+++ b/evaluate.py
@@ -0,0 +1,58 @@
+import torch
+from vbench import VBench
+
+import argparse
+
+def parse_args():
+
+    parser = argparse.ArgumentParser(description='VBench')
+    parser.add_argument(
+        "--output_path",
+        type=str,
+        default='./evaluation_results/',
+        help="output path to save the evaluation results",
+    )
+    parser.add_argument(
+        "--full_json_dir",
+        type=str,
+        default='./VBench_full_info.json',
+        help="path to save the json file that contains the prompt and dimension information",
+    )
+    parser.add_argument(
+        "--videos_path",
+        type=str,
+        required=True,
+        help="folder that contains the sampled videos",
+    )
+    parser.add_argument(
+        "--dimension",
+        type=str,
+        required=True,
+        help="folder that contains the sampled videos",
+    )
+    parser.add_argument(
+        "--load_ckpt_from_local",
+        type=bool,
+        required=False,
+        help="whether load checkpoints from local default paths (assuming you have downloaded the checkpoints locally)",
+    )
+    args = parser.parse_args()
+    return args
+
+
+def main():
+    args = parse_args()
+
+    device = torch.device("cuda")
+    my_VBench = VBench(device, args.full_json_dir, args.output_path)
+    my_VBench.evaluate(
+        videos_path = args.videos_path,
+        name = 'test',
+        dimension_list = [args.dimension],
+        local=args.load_ckpt_from_local,
+    )
+    print('done')
+
+
+if __name__ == "__main__":
+    main()
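One caveat in the new CLI, left untouched here because later patches in this series diff against it: `argparse` with `type=bool` converts any non-empty string to `True`, so `--load_ckpt_from_local False` still enables the option. A `store_true` action avoids the surprise; a minimal sketch, not part of the patch:

```
# Sketch only: a flag-style alternative to type=bool.
import argparse

parser = argparse.ArgumentParser(description='VBench')
parser.add_argument(
    "--load_ckpt_from_local",
    action="store_true",  # present -> True, absent -> False
    help="whether to load checkpoints from local default paths",
)
print(parser.parse_args([]).load_ckpt_from_local)                          # False
print(parser.parse_args(["--load_ckpt_from_local"]).load_ckpt_from_local)  # True
```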
From 556b58f01acc9e486fae89e102879b59d1949db0 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Thu, 30 Nov 2023 19:38:44 +0800
Subject: [PATCH 020/248] [fix] human_action error

---
 vbench/__init__.py     | 17 ++++++++++-------
 vbench/human_action.py |  1 +
 2 files changed, 11 insertions(+), 7 deletions(-)

diff --git a/vbench/__init__.py b/vbench/__init__.py
index 053e371e..a324ea08 100755
--- a/vbench/__init__.py
+++ b/vbench/__init__.py
@@ -1,22 +1,25 @@
 import os
+
 from .utils import init_submodules, save_json, load_json
-from .aesthetic_quality import compute_aesthetic_quality
-from .background_consistency import compute_background_consistency
+
 from .subject_consistency import compute_subject_consistency
+from .background_consistency import compute_background_consistency
+from .temporal_flickering import compute_temporal_flickering
+from .motion_smoothness import compute_motion_smoothness
+from .dynamic_degree import compute_dynamic_degree
+from .aesthetic_quality import compute_aesthetic_quality
 from .imaging_quality import compute_imaging_quality
 from .object_class import compute_object_class
 from .multiple_objects import compute_multiple_objects
+from .human_action import compute_human_action
 from .color import compute_color
 from .spatial_relationship import compute_spatial_relationship
 from .scene import compute_scene
-from .temporal_style import compute_temporal_style
 from .overall_consistency import compute_overall_consistency
-from .temporal_flickering import compute_temporal_flickering
-from .motion_smoothness import compute_motion_smoothness
-from .dynamic_degree import compute_dynamic_degree
-from .human_action import compute_human_action
+from .temporal_style import compute_temporal_style
 from .appearance_style import compute_appearance_style
 
+
 class VBench(object):
     def __init__(self, device, full_info_dir, output_path):
         self.device = device # cuda or cpu
diff --git a/vbench/human_action.py b/vbench/human_action.py
index 0e089e0f..5c9cbd34 100755
--- a/vbench/human_action.py
+++ b/vbench/human_action.py
@@ -86,6 +86,7 @@ def human_action(umt_path, video_list, device):
                 break
         if flag is False:
             # print(f"{cnt}: {video_path} false, gt: {video_label_ls}, top-5: {cat_ls}, logits: {results}", flush=True)
+            pass
         video_results.append({'video_path': video_path, 'video_results': flag})
     # print(f"cor num: {cor_num}, total: {cnt}")
     acc = cor_num / cnt

From ed4bb21da6ee7f997edb290334b1dc2270077252 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Thu, 30 Nov 2023 21:18:49 +0800
Subject: [PATCH 021/248] [fix] motion_smoothness loading error

---
 evaluate.py     |  2 +-
 vbench/utils.py | 24 ++++++++++++++++--------
 2 files changed, 17 insertions(+), 9 deletions(-)

diff --git a/evaluate.py b/evaluate.py
index ddda3950..ae0074b9 100644
--- a/evaluate.py
+++ b/evaluate.py
@@ -47,7 +47,7 @@ def main():
     my_VBench = VBench(device, args.full_json_dir, args.output_path)
     my_VBench.evaluate(
         videos_path = args.videos_path,
-        name = 'test',
+        name = args.dimension,
         dimension_list = [args.dimension],
         local=args.load_ckpt_from_local,
     )
diff --git a/vbench/utils.py b/vbench/utils.py
index 6f2bcf9c..4d615b0b 100755
--- a/vbench/utils.py
+++ b/vbench/utils.py
@@ -199,14 +199,22 @@ def init_submodules(dimension_list, local=False):
                 'config':'pretrained/amt_model/AMT-S.yaml',
                 'ckpt':'pretrained/amt_model/amt-s.pth'
             }
-            if local:
-                details = submodules_dict[dimension]
-                # Check if the file exists, if not, download it with wget
-                if not os.path.isfile(details['ckpt']):
-                    print(f"File {details['ckpt']} does not exist. Downloading...")
-                    wget_command = ['wget', '-P', os.path.dirname(details['ckpt']),
-                                    'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth']
-                    subprocess.run(wget_command)
+            # if local:
+            #     details = submodules_dict[dimension]
+            #     # Check if the file exists, if not, download it with wget
+            #     if not os.path.isfile(details['ckpt']):
+            #         print(f"File {details['ckpt']} does not exist. Downloading...")
+            #         wget_command = ['wget', '-P', os.path.dirname(details['ckpt']),
+            #                         'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth']
+            #         subprocess.run(wget_command)
+
+            details = submodules_dict[dimension]
+            # Check if the file exists, if not, download it with wget
+            if not os.path.isfile(details['ckpt']):
+                print(f"File {details['ckpt']} does not exist. Downloading...")
+                wget_command = ['wget', '-P', os.path.dirname(details['ckpt']),
+                                'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth']
+                subprocess.run(wget_command)
 
         elif dimension == 'dynamic_degree':
             submodules_dict[dimension] = {
                 'model':'pretrained/raft_model/models/raft-things.pth'
             }
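The substantive change in the hunk above is that the AMT checkpoint is now fetched whenever it is missing, independent of `--load_ckpt_from_local`. Factored out, the pattern looks like this; `ensure_checkpoint` is a hypothetical helper sketched for illustration, not part of the patch:

```
# Sketch only: download a checkpoint into its target directory if absent.
import os
import subprocess

def ensure_checkpoint(path, url):
    """Fetch `url` into the directory of `path` when the file is missing."""
    if not os.path.isfile(path):
        print(f"File {path} does not exist. Downloading...")
        subprocess.run(['wget', '-P', os.path.dirname(path), url])

ensure_checkpoint('pretrained/amt_model/amt-s.pth',
                  'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth')
```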
From d90b666c4f85b2065f07854950df1ea79fb2d0f8 Mon Sep 17 00:00:00 2001
From: justinyuu <869113692@qq.com>
Date: Fri, 1 Dec 2023 14:55:59 +0800
Subject: [PATCH 022/248] [update] background consistency

---
 evaluate.py                      |  2 +-
 vbench/background_consistency.py | 21 ++++++++++++++-------
 vbench/utils.py                  |  3 ++-
 3 files changed, 17 insertions(+), 9 deletions(-)

diff --git a/evaluate.py b/evaluate.py
index ae0074b9..29b0a8ab 100644
--- a/evaluate.py
+++ b/evaluate.py
@@ -28,7 +28,7 @@ def parse_args():
         "--dimension",
         type=str,
         required=True,
-        help="folder that contains the sampled videos",
+        help="evaluation dimensions",
     )
     parser.add_argument(
         "--load_ckpt_from_local",
diff --git a/vbench/background_consistency.py b/vbench/background_consistency.py
index 69a90493..6940ffeb 100755
--- a/vbench/background_consistency.py
+++ b/vbench/background_consistency.py
@@ -9,15 +9,23 @@ from .utils import load_video, load_dimension_info, clip_transform
 
 
-def background_consistency(clip_model, video_list, device):
+def background_consistency(clip_model, preprocess, video_list, device, read_frame=False):
     sim = 0.0
     cnt = 0
     video_results = []
     image_transform = clip_transform(224)
     for video_path in video_list:
         video_sim = 0.0
-        images = load_video(video_path)
-        images = image_transform(images)
+        if read_frame:
+            video_path = video_path[:-4].replace('videos', 'frames').replace(' ', '_')
+            tmp_paths = [os.path.join(video_path, f) for f in sorted(os.listdir(video_path))]
+            images = []
+            for tmp_path in tmp_paths:
+                images.append(preprocess(Image.open(tmp_path)))
+            images = torch.stack(images)
+        else:
+            images = load_video(video_path)
+            images = image_transform(images)
         images = images.to(device)
         image_features = clip_model.encode_image(images)
         image_features = F.normalize(image_features, dim=-1, p=2)
@@ -32,7 +40,7 @@
             video_sim += cur_sim
             cnt += 1
             former_image_feature = image_feature
-        sim_per_image = video_sim / len(image_features)
+        sim_per_image = video_sim / (len(image_features) - 1)
         sim += video_sim
         video_results.append({'video_path': video_path, 'video_results': sim_per_image})
     sim_per_video = sim / (len(video_list) - 1)
 
 
 def compute_background_consistency(json_dir, device, submodules_list):
-    vit_path = submodules_list[0]
+    vit_path, read_frame = submodules_list[0], submodules_list[1]
     if not os.path.isfile(vit_path):
         os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_path}')
     clip_model, preprocess = clip.load(vit_path, device=device)
     video_list, _ = load_dimension_info(json_dir, dimension='background_consistency', lang='en')
-    all_results, video_results = background_consistency(clip_model, video_list, device)
+    all_results, video_results = background_consistency(clip_model, preprocess, video_list, device, read_frame)
     return all_results, video_results
-
diff --git a/vbench/utils.py b/vbench/utils.py
index 4d615b0b..194139a5 100755
--- a/vbench/utils.py
+++ b/vbench/utils.py
@@ -180,13 +180,14 @@ def init_submodules(dimension_list, local=False):
         logger.info("\x1b[32m[Local Mode]\x1b[0m Working in local mode, please make sure that the pre-trained model has been fully downloaded.")
     for dimension in dimension_list:
         if dimension == 'background_consistency':
+            read_frame = False
             if local:
                 vit_b_path = 'pretrained/clip_model/ViT-B-32.pt'
                 if not os.path.isfile(vit_b_path):
                     os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_b_path}')
             else:
                 vit_b_path = 'ViT-B/32'
-            submodules_dict[dimension] = [vit_b_path,]
+            submodules_dict[dimension] = [vit_b_path, read_frame]
         elif dimension == 'human_action':
             umt_path = "pretrained/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth"
             if not os.path.isfile(umt_path):

From 3c8541d08add7f56b7dd1ae11110782aa72285ff Mon Sep 17 00:00:00 2001
From: yinanhe
Date: Fri, 1 Dec 2023 15:04:23 +0800
Subject: [PATCH 023/248] [fix] color

---
 VBench_full_info.json | 9133 ++++++++++++++++++++++++++++++++++++++++-
 vbench/color.py       |    4 +-
 vbench/scene.py       |    1 -
 3 files changed, 9135 insertions(+), 3 deletions(-)

diff --git a/VBench_full_info.json b/VBench_full_info.json
index c107742b..a3a4f096 100755
--- a/VBench_full_info.json
+++ b/VBench_full_info.json
@@ -1 +1,9132 @@
-[{"prompt_en": "In a still frame, a stop sign", "dimension": ["temporal_flickering"]}, {"prompt_en": "a toilet, frozen in time", "dimension": ["temporal_flickering"]}, {"prompt_en": "a laptop, frozen in time", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of alley", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of bar", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of barn", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of bathroom", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of bedroom", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of cliff", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, courtyard", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, gas station", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of house", "dimension": ["temporal_flickering"]}, {"prompt_en": "indoor gymnasium, frozen in time", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of indoor library", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of kitchen", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of palace", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, parking lot", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, phone booth", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of restaurant", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of tower", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a bowl", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of an apple", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a bench", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a bed", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a chair", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a cup", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a dining table", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, a pear", 
"dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a bunch of grapes", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a bowl on the kitchen counter", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a beautiful, handcrafted ceramic bowl", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of an antique bowl", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of an exquisite mahogany dining table", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a wooden bench in the park", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a beautiful wrought-iron bench surrounded by blooming flowers", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, a park bench with a view of the lake", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a vintage rocking chair was placed on the porch", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the jail cell was small and dimly lit, with cold, steel bars", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the phone booth was tucked away in a quiet alley", "dimension": ["temporal_flickering"]}, {"prompt_en": "a dilapidated phone booth stood as a relic of a bygone era on the sidewalk, frozen in time", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the old red barn stood weathered and iconic against the backdrop of the countryside", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a picturesque barn was painted a warm shade of red and nestled in a picturesque meadow", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, within the desolate desert, an oasis unfolded, characterized by the stoic presence of palm trees and a motionless, glassy pool of water", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, the Parthenon's majestic Doric columns stand in serene solitude atop the Acropolis, framed by the tranquil Athenian landscape", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, the Temple of Hephaestus, with its timeless Doric grace, stands stoically against the backdrop of a quiet Athens", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the Stonehenge presented itself as an enigmatic puzzle, each colossal stone meticulously placed against the backdrop of tranquility", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, in the vast desert, an oasis nestled among dunes, featuring tall palm trees and an air of serenity", "dimension": ["temporal_flickering"]}, {"prompt_en": "static view on a desert scene with an oasis, palm trees, and a clear, calm pool of water", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of an ornate Victorian streetlamp standing on a cobblestone street corner, illuminating the empty night", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a tranquil lakeside cabin nestled among tall pines, its reflection mirrored perfectly in the calm water", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, a vintage gas lantern, adorned with intricate details, gracing a 
historic cobblestone square", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, a tranquil Japanese tea ceremony room, with tatami mats, a delicate tea set, and a bonsai tree in the corner", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the Parthenon stands resolute in its classical elegance, a timeless symbol of Athens' cultural legacy", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the heart of Plaka, the neoclassical architecture of the old city harmonizes with the ancient ruins", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the desolate beauty of the American Southwest, Chaco Canyon's ancient ruins whispered tales of an enigmatic civilization that once thrived amidst the arid landscapes", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of at the edge of the Arabian Desert, the ancient city of Petra beckoned with its enigmatic rock-carved fa\u00e7ades", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, amidst the cobblestone streets, an Art Nouveau lamppost stood tall", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the quaint village square, a traditional wrought-iron streetlamp featured delicate filigree patterns and amber-hued glass panels", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of the lampposts were adorned with Art Deco motifs, their geometric shapes and frosted glass creating a sense of vintage glamour", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, in the picturesque square, a Gothic-style lamppost adorned with intricate stone carvings added a touch of medieval charm to the setting", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, in the heart of the old city, a row of ornate lantern-style streetlamps bathed the narrow alleyway in a warm, welcoming light", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the heart of the Utah desert, a massive sandstone arch spanned the horizon", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the Arizona desert, a massive stone bridge arched across a rugged canyon", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the corner of the minimalist tea room, a bonsai tree added a touch of nature's beauty to the otherwise simple and elegant space", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, amidst the hushed ambiance of the traditional tea room, a meticulously arranged tea set awaited, with porcelain cups, a bamboo whisk", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, nestled in the Zen garden, a rustic teahouse featured tatami seating and a traditional charcoal brazier", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a country estate's library featured elegant wooden shelves", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of beneath the shade of a solitary oak tree, an old wooden park bench sat patiently", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of beside a tranquil pond, a weeping willow tree draped its branches gracefully over the water's surface, creating a serene tableau of reflection and calm", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the Zen garden, a perfectly raked gravel path led to a serene rock 
garden", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, a tranquil pond was fringed by weeping cherry trees, their blossoms drifting lazily onto the glassy surface", "dimension": ["temporal_flickering"]}, {"prompt_en": "In a still frame, within the historic library's reading room, rows of antique leather chairs and mahogany tables offered a serene haven for literary contemplation", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of a peaceful orchid garden showcased a variety of delicate blooms", "dimension": ["temporal_flickering"]}, {"prompt_en": "A tranquil tableau of in the serene courtyard, a centuries-old stone well stood as a symbol of a bygone era, its mossy stones bearing witness to the passage of time", "dimension": ["temporal_flickering"]}, {"prompt_en": "a bird and a cat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bird and cat"}}}, {"prompt_en": "a cat and a dog", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cat and dog"}}}, {"prompt_en": "a dog and a horse", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "dog and horse"}}}, {"prompt_en": "a horse and a sheep", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "horse and sheep"}}}, {"prompt_en": "a sheep and a cow", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sheep and cow"}}}, {"prompt_en": "a cow and an elephant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cow and elephant"}}}, {"prompt_en": "an elephant and a bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "elephant and bear"}}}, {"prompt_en": "a bear and a zebra", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bear and zebra"}}}, {"prompt_en": "a zebra and a giraffe", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "zebra and giraffe"}}}, {"prompt_en": "a giraffe and a bird", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "giraffe and bird"}}}, {"prompt_en": "a chair and a couch", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "chair and couch"}}}, {"prompt_en": "a couch and a potted plant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "couch and potted plant"}}}, {"prompt_en": "a potted plant and a tv", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "potted plant and tv"}}}, {"prompt_en": "a tv and a laptop", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tv and laptop"}}}, {"prompt_en": "a laptop and a remote", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "laptop and remote"}}}, {"prompt_en": "a remote and a keyboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "remote and keyboard"}}}, {"prompt_en": "a keyboard and a cell phone", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "keyboard and cell phone"}}}, {"prompt_en": "a cell phone and a book", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cell phone and book"}}}, {"prompt_en": "a book and a clock", "dimension": ["multiple_objects"], "auxiliary_info": 
{"multiple_objects": {"object": "book and clock"}}}, {"prompt_en": "a clock and a backpack", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "clock and backpack"}}}, {"prompt_en": "a backpack and an umbrella", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "backpack and umbrella"}}}, {"prompt_en": "an umbrella and a handbag", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "umbrella and handbag"}}}, {"prompt_en": "a handbag and a tie", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "handbag and tie"}}}, {"prompt_en": "a tie and a suitcase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tie and suitcase"}}}, {"prompt_en": "a suitcase and a vase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "suitcase and vase"}}}, {"prompt_en": "a vase and scissors", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "vase and scissors"}}}, {"prompt_en": "scissors and a teddy bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "scissors and teddy bear"}}}, {"prompt_en": "a teddy bear and a frisbee", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "teddy bear and frisbee"}}}, {"prompt_en": "a frisbee and skis", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "frisbee and skis"}}}, {"prompt_en": "skis and a snowboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "skis and snowboard"}}}, {"prompt_en": "a snowboard and a sports ball", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "snowboard and sports ball"}}}, {"prompt_en": "a sports ball and a kite", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sports ball and kite"}}}, {"prompt_en": "a kite and a baseball bat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "kite and baseball bat"}}}, {"prompt_en": "a baseball bat and a baseball glove", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "baseball bat and baseball glove"}}}, {"prompt_en": "a baseball glove and a skateboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "baseball glove and skateboard"}}}, {"prompt_en": "a skateboard and a surfboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "skateboard and surfboard"}}}, {"prompt_en": "a surfboard and a tennis racket", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "surfboard and tennis racket"}}}, {"prompt_en": "a tennis racket and a bottle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tennis racket and bottle"}}}, {"prompt_en": "a bottle and a chair", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bottle and chair"}}}, {"prompt_en": "an airplane and a train", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "airplane and train"}}}, {"prompt_en": "a train and a boat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "train and boat"}}}, {"prompt_en": "a boat and an airplane", "dimension": ["multiple_objects"], "auxiliary_info": 
{"multiple_objects": {"object": "boat and airplane"}}}, {"prompt_en": "a bicycle and a car", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bicycle and car"}}}, {"prompt_en": "a car and a motorcycle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "car and motorcycle"}}}, {"prompt_en": "a motorcycle and a bus", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "motorcycle and bus"}}}, {"prompt_en": "a bus and a traffic light", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bus and traffic light"}}}, {"prompt_en": "a traffic light and a fire hydrant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "traffic light and fire hydrant"}}}, {"prompt_en": "a fire hydrant and a stop sign", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "fire hydrant and stop sign"}}}, {"prompt_en": "a stop sign and a parking meter", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "stop sign and parking meter"}}}, {"prompt_en": "a parking meter and a truck", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "parking meter and truck"}}}, {"prompt_en": "a truck and a bicycle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "truck and bicycle"}}}, {"prompt_en": "a toilet and a hair drier", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toilet and hair drier"}}}, {"prompt_en": "a hair drier and a toothbrush", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "hair drier and toothbrush"}}}, {"prompt_en": "a toothbrush and a sink", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toothbrush and sink"}}}, {"prompt_en": "a sink and a toilet", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sink and toilet"}}}, {"prompt_en": "a wine glass and a chair", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "wine glass and chair"}}}, {"prompt_en": "a cup and a couch", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cup and couch"}}}, {"prompt_en": "a fork and a potted plant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "fork and potted plant"}}}, {"prompt_en": "a knife and a tv", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "knife and tv"}}}, {"prompt_en": "a spoon and a laptop", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "spoon and laptop"}}}, {"prompt_en": "a bowl and a remote", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bowl and remote"}}}, {"prompt_en": "a banana and a keyboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "banana and keyboard"}}}, {"prompt_en": "an apple and a cell phone", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "apple and cell phone"}}}, {"prompt_en": "a sandwich and a book", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sandwich and book"}}}, {"prompt_en": "an orange and a clock", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "orange 
and clock"}}}, {"prompt_en": "broccoli and a backpack", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "broccoli and backpack"}}}, {"prompt_en": "a carrot and an umbrella", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "carrot and umbrella"}}}, {"prompt_en": "a hot dog and a handbag", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "hot dog and handbag"}}}, {"prompt_en": "a pizza and a tie", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "pizza and tie"}}}, {"prompt_en": "a donut and a suitcase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "donut and suitcase"}}}, {"prompt_en": "a cake and a vase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cake and vase"}}}, {"prompt_en": "an oven and scissors", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "oven and scissors"}}}, {"prompt_en": "a toaster and a teddy bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toaster and teddy bear"}}}, {"prompt_en": "a microwave and a frisbee", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "microwave and frisbee"}}}, {"prompt_en": "a refrigerator and skis", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "refrigerator and skis"}}}, {"prompt_en": "a bicycle and an airplane", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bicycle and airplane"}}}, {"prompt_en": "a car and a train", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "car and train"}}}, {"prompt_en": "a motorcycle and a boat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "motorcycle and boat"}}}, {"prompt_en": "a person and a toilet", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and toilet"}}}, {"prompt_en": "a person and a hair drier", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and hair drier"}}}, {"prompt_en": "a person and a toothbrush", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and toothbrush"}}}, {"prompt_en": "a person and a sink", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and sink"}}}, {"prompt_en": "A person is riding a bike", "dimension": ["human_action"]}, {"prompt_en": "A person is marching", "dimension": ["human_action"]}, {"prompt_en": "A person is roller skating", "dimension": ["human_action"]}, {"prompt_en": "A person is tasting beer", "dimension": ["human_action"]}, {"prompt_en": "A person is clapping", "dimension": ["human_action"]}, {"prompt_en": "A person is drawing", "dimension": ["human_action"]}, {"prompt_en": "A person is petting animal (not cat)", "dimension": ["human_action"]}, {"prompt_en": "A person is eating watermelon", "dimension": ["human_action"]}, {"prompt_en": "A person is playing harp", "dimension": ["human_action"]}, {"prompt_en": "A person is wrestling", "dimension": ["human_action"]}, {"prompt_en": "A person is riding scooter", "dimension": ["human_action"]}, {"prompt_en": "A person is sweeping floor", "dimension": ["human_action"]}, {"prompt_en": "A person is skateboarding", "dimension": ["human_action"]}, {"prompt_en": 
"A person is dunking basketball", "dimension": ["human_action"]}, {"prompt_en": "A person is playing flute", "dimension": ["human_action"]}, {"prompt_en": "A person is stretching leg", "dimension": ["human_action"]}, {"prompt_en": "A person is tying tie", "dimension": ["human_action"]}, {"prompt_en": "A person is skydiving", "dimension": ["human_action"]}, {"prompt_en": "A person is shooting goal (soccer)", "dimension": ["human_action"]}, {"prompt_en": "A person is playing piano", "dimension": ["human_action"]}, {"prompt_en": "A person is finger snapping", "dimension": ["human_action"]}, {"prompt_en": "A person is canoeing or kayaking", "dimension": ["human_action"]}, {"prompt_en": "A person is laughing", "dimension": ["human_action"]}, {"prompt_en": "A person is digging", "dimension": ["human_action"]}, {"prompt_en": "A person is clay pottery making", "dimension": ["human_action"]}, {"prompt_en": "A person is shooting basketball", "dimension": ["human_action"]}, {"prompt_en": "A person is bending back", "dimension": ["human_action"]}, {"prompt_en": "A person is shaking hands", "dimension": ["human_action"]}, {"prompt_en": "A person is bandaging", "dimension": ["human_action"]}, {"prompt_en": "A person is push up", "dimension": ["human_action"]}, {"prompt_en": "A person is catching or throwing frisbee", "dimension": ["human_action"]}, {"prompt_en": "A person is playing trumpet", "dimension": ["human_action"]}, {"prompt_en": "A person is flying kite", "dimension": ["human_action"]}, {"prompt_en": "A person is filling eyebrows", "dimension": ["human_action"]}, {"prompt_en": "A person is shuffling cards", "dimension": ["human_action"]}, {"prompt_en": "A person is folding clothes", "dimension": ["human_action"]}, {"prompt_en": "A person is smoking", "dimension": ["human_action"]}, {"prompt_en": "A person is tai chi", "dimension": ["human_action"]}, {"prompt_en": "A person is squat", "dimension": ["human_action"]}, {"prompt_en": "A person is playing controller", "dimension": ["human_action"]}, {"prompt_en": "A person is throwing axe", "dimension": ["human_action"]}, {"prompt_en": "A person is giving or receiving award", "dimension": ["human_action"]}, {"prompt_en": "A person is air drumming", "dimension": ["human_action"]}, {"prompt_en": "A person is taking a shower", "dimension": ["human_action"]}, {"prompt_en": "A person is planting trees", "dimension": ["human_action"]}, {"prompt_en": "A person is sharpening knives", "dimension": ["human_action"]}, {"prompt_en": "A person is robot dancing", "dimension": ["human_action"]}, {"prompt_en": "A person is rock climbing", "dimension": ["human_action"]}, {"prompt_en": "A person is hula hooping", "dimension": ["human_action"]}, {"prompt_en": "A person is writing", "dimension": ["human_action"]}, {"prompt_en": "A person is bungee jumping", "dimension": ["human_action"]}, {"prompt_en": "A person is pushing cart", "dimension": ["human_action"]}, {"prompt_en": "A person is cleaning windows", "dimension": ["human_action"]}, {"prompt_en": "A person is cutting watermelon", "dimension": ["human_action"]}, {"prompt_en": "A person is cheerleading", "dimension": ["human_action"]}, {"prompt_en": "A person is washing hands", "dimension": ["human_action"]}, {"prompt_en": "A person is ironing", "dimension": ["human_action"]}, {"prompt_en": "A person is cutting nails", "dimension": ["human_action"]}, {"prompt_en": "A person is hugging", "dimension": ["human_action"]}, {"prompt_en": "A person is trimming or shaving beard", "dimension": ["human_action"]}, {"prompt_en": 
"A person is jogging", "dimension": ["human_action"]}, {"prompt_en": "A person is making bed", "dimension": ["human_action"]}, {"prompt_en": "A person is washing dishes", "dimension": ["human_action"]}, {"prompt_en": "A person is grooming dog", "dimension": ["human_action"]}, {"prompt_en": "A person is doing laundry", "dimension": ["human_action"]}, {"prompt_en": "A person is knitting", "dimension": ["human_action"]}, {"prompt_en": "A person is reading book", "dimension": ["human_action"]}, {"prompt_en": "A person is baby waking up", "dimension": ["human_action"]}, {"prompt_en": "A person is massaging legs", "dimension": ["human_action"]}, {"prompt_en": "A person is brushing teeth", "dimension": ["human_action"]}, {"prompt_en": "A person is crawling baby", "dimension": ["human_action"]}, {"prompt_en": "A person is motorcycling", "dimension": ["human_action"]}, {"prompt_en": "A person is driving car", "dimension": ["human_action"]}, {"prompt_en": "A person is sticking tongue out", "dimension": ["human_action"]}, {"prompt_en": "A person is shaking head", "dimension": ["human_action"]}, {"prompt_en": "A person is sword fighting", "dimension": ["human_action"]}, {"prompt_en": "A person is doing aerobics", "dimension": ["human_action"]}, {"prompt_en": "A person is strumming guitar", "dimension": ["human_action"]}, {"prompt_en": "A person is riding or walking with horse", "dimension": ["human_action"]}, {"prompt_en": "A person is archery", "dimension": ["human_action"]}, {"prompt_en": "A person is catching or throwing baseball", "dimension": ["human_action"]}, {"prompt_en": "A person is playing chess", "dimension": ["human_action"]}, {"prompt_en": "A person is rock scissors paper", "dimension": ["human_action"]}, {"prompt_en": "A person is using computer", "dimension": ["human_action"]}, {"prompt_en": "A person is arranging flowers", "dimension": ["human_action"]}, {"prompt_en": "A person is bending metal", "dimension": ["human_action"]}, {"prompt_en": "A person is ice skating", "dimension": ["human_action"]}, {"prompt_en": "A person is climbing a rope", "dimension": ["human_action"]}, {"prompt_en": "A person is crying", "dimension": ["human_action"]}, {"prompt_en": "A person is dancing ballet", "dimension": ["human_action"]}, {"prompt_en": "A person is getting a haircut", "dimension": ["human_action"]}, {"prompt_en": "A person is running on treadmill", "dimension": ["human_action"]}, {"prompt_en": "A person is kissing", "dimension": ["human_action"]}, {"prompt_en": "A person is counting money", "dimension": ["human_action"]}, {"prompt_en": "A person is barbequing", "dimension": ["human_action"]}, {"prompt_en": "A person is peeling apples", "dimension": ["human_action"]}, {"prompt_en": "A person is milking cow", "dimension": ["human_action"]}, {"prompt_en": "A person is shining shoes", "dimension": ["human_action"]}, {"prompt_en": "A person is making snowman", "dimension": ["human_action"]}, {"prompt_en": "A person is sailing", "dimension": ["human_action"]}, {"prompt_en": "a person swimming in ocean", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person giving a presentation to a room full of colleagues", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person washing the dishes", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person eating a burger", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person 
walking in the snowstorm", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person drinking coffee in a cafe", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person playing guitar", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bicycle leaning against a tree", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bicycle gliding through a snowy field", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bicycle slowing down to stop", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bicycle accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a car stuck in traffic during rush hour", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a car turning a corner", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a car slowing down to stop", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a car accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a motorcycle cruising along a coastal highway", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a motorcycle turning a corner", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a motorcycle slowing down to stop", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a motorcycle gliding through a snowy field", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a motorcycle accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an airplane soaring through a clear blue sky", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an airplane taking off", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an airplane landing smoothly on a runway", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an airplane accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bus turning a corner", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bus stuck in traffic during rush hour", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bus accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a train speeding down the tracks", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a train crossing over a tall bridge", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a train accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a truck turning a corner", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a truck anchored in a tranquil bay", "dimension": ["subject_consistency", 
"dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a truck stuck in traffic during rush hour", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a truck slowing down to stop", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a truck accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a boat sailing smoothly on a calm lake", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a boat slowing down to stop", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a boat accelerating to gain speed", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bird soaring gracefully in the sky", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bird building a nest from twigs and leaves", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bird flying over a snowy forest", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cat grooming itself meticulously with its tongue", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cat playing in park", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cat drinking water", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cat running happily", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a dog enjoying a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a dog playing in park", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a dog drinking water", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a dog running happily", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a horse bending down to drink water from a river", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a horse galloping across an open field", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a horse taking a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a horse running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a sheep bending down to drink water from a river", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a sheep taking a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a sheep running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cow bending down to drink water from a river", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cow chewing cud while resting in a tranquil barn", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a cow running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", 
"motion_smoothness"]}, {"prompt_en": "an elephant spraying itself with water using its trunk to cool down", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an elephant taking a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "an elephant running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bear catching a salmon in its powerful jaws", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bear sniffing the air for scents of food", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bear climbing a tree", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a bear hunting for prey", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a zebra bending down to drink water from a river", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a zebra running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a zebra taking a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a giraffe bending down to drink water from a river", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a giraffe taking a peaceful walk", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a giraffe running to join a herd of its kind", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]}, {"prompt_en": "a person", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "person"}}}, {"prompt_en": "a bicycle", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bicycle"}}}, {"prompt_en": "a car", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "car"}}}, {"prompt_en": "a motorcycle", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "motorcycle"}}}, {"prompt_en": "an airplane", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "airplane"}}}, {"prompt_en": "a bus", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bus"}}}, {"prompt_en": "a train", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "train"}}}, {"prompt_en": "a truck", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "truck"}}}, {"prompt_en": "a boat", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "boat"}}}, {"prompt_en": "a traffic light", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "traffic light"}}}, {"prompt_en": "a fire hydrant", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "fire hydrant"}}}, {"prompt_en": "a stop sign", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "stop sign"}}}, {"prompt_en": "a parking meter", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "parking meter"}}}, {"prompt_en": "a bench", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bench"}}}, {"prompt_en": "a bird", "dimension": ["object_class"], "auxiliary_info": {"object_class": 
{"object": "bird"}}}, {"prompt_en": "a cat", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "cat"}}}, {"prompt_en": "a dog", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "dog"}}}, {"prompt_en": "a horse", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "horse"}}}, {"prompt_en": "a sheep", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "sheep"}}}, {"prompt_en": "a cow", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "cow"}}}, {"prompt_en": "an elephant", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "elephant"}}}, {"prompt_en": "a bear", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bear"}}}, {"prompt_en": "a zebra", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "zebra"}}}, {"prompt_en": "a giraffe", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "giraffe"}}}, {"prompt_en": "a backpack", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "backpack"}}}, {"prompt_en": "an umbrella", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "umbrella"}}}, {"prompt_en": "a handbag", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "handbag"}}}, {"prompt_en": "a tie", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "tie"}}}, {"prompt_en": "a suitcase", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "suitcase"}}}, {"prompt_en": "a frisbee", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "frisbee"}}}, {"prompt_en": "skis", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "skis"}}}, {"prompt_en": "a snowboard", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "snowboard"}}}, {"prompt_en": "a sports ball", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "sports ball"}}}, {"prompt_en": "a kite", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "kite"}}}, {"prompt_en": "a baseball bat", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "baseball bat"}}}, {"prompt_en": "a baseball glove", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "baseball glove"}}}, {"prompt_en": "a skateboard", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "skateboard"}}}, {"prompt_en": "a surfboard", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "surfboard"}}}, {"prompt_en": "a tennis racket", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "tennis racket"}}}, {"prompt_en": "a bottle", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bottle"}}}, {"prompt_en": "a wine glass", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "wine glass"}}}, {"prompt_en": "a cup", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "cup"}}}, {"prompt_en": "a fork", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "fork"}}}, {"prompt_en": "a knife", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "knife"}}}, {"prompt_en": "a spoon", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "spoon"}}}, 
{"prompt_en": "a bowl", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bowl"}}}, {"prompt_en": "a banana", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "banana"}}}, {"prompt_en": "an apple", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "apple"}}}, {"prompt_en": "a sandwich", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "sandwich"}}}, {"prompt_en": "an orange", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "orange"}}}, {"prompt_en": "broccoli", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "broccoli"}}}, {"prompt_en": "a carrot", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "carrot"}}}, {"prompt_en": "a hot dog", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "hot dog"}}}, {"prompt_en": "a pizza", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "pizza"}}}, {"prompt_en": "a donut", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "donut"}}}, {"prompt_en": "a cake", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "cake"}}}, {"prompt_en": "a chair", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "chair"}}}, {"prompt_en": "a couch", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "couch"}}}, {"prompt_en": "a potted plant", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "potted plant"}}}, {"prompt_en": "a bed", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "bed"}}}, {"prompt_en": "a dining table", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "dining table"}}}, {"prompt_en": "a toilet", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "toilet"}}}, {"prompt_en": "a tv", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "tv"}}}, {"prompt_en": "a laptop", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "laptop"}}}, {"prompt_en": "a remote", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "remote"}}}, {"prompt_en": "a keyboard", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "keyboard"}}}, {"prompt_en": "a cell phone", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "cell phone"}}}, {"prompt_en": "a microwave", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "microwave"}}}, {"prompt_en": "an oven", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "oven"}}}, {"prompt_en": "a toaster", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "toaster"}}}, {"prompt_en": "a sink", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "sink"}}}, {"prompt_en": "a refrigerator", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "refrigerator"}}}, {"prompt_en": "a book", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "book"}}}, {"prompt_en": "a clock", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "clock"}}}, {"prompt_en": "a vase", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "vase"}}}, {"prompt_en": "scissors", "dimension": ["object_class"], 
"auxiliary_info": {"object_class": {"object": "scissors"}}}, {"prompt_en": "a teddy bear", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "teddy bear"}}}, {"prompt_en": "a hair drier", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "hair drier"}}}, {"prompt_en": "a toothbrush", "dimension": ["object_class"], "auxiliary_info": {"object_class": {"object": "toothbrush"}}}, {"prompt_en": "a red bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white bicycle", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white car", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white bird", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a black cat", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white cat", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "an orange cat", "dimension": ["color"], "auxiliary_info": {"color": {"color": 
"orange"}}}, {"prompt_en": "a yellow cat", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "a red umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white umbrella", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white suitcase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white bowl", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple chair", 
"dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white chair", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white clock", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "a red vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "red"}}}, {"prompt_en": "a green vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "green"}}}, {"prompt_en": "a blue vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "blue"}}}, {"prompt_en": "a yellow vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "yellow"}}}, {"prompt_en": "an orange vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "orange"}}}, {"prompt_en": "a purple vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "purple"}}}, {"prompt_en": "a pink vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "pink"}}}, {"prompt_en": "a black vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "black"}}}, {"prompt_en": "a white vase", "dimension": ["color"], "auxiliary_info": {"color": {"color": "white"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, 
{"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "The bund Shanghai, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "The bund Shanghai, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "The bund Shanghai by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "The bund Shanghai, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "The bund Shanghai, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "The bund Shanghai, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "The bund Shanghai, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "The bund Shanghai, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "The bund Shanghai, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "a shark is swimming in the ocean, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "a shark is swimming in the ocean, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "a shark is swimming in the ocean by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "a shark is swimming in the ocean, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "a shark is swimming in the ocean, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "a shark is swimming in the ocean, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "a shark is swimming in the ocean, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "a shark is swimming in the ocean, watercolor 
painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "a shark is swimming in the ocean, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, watercolor painting", 
"dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "A cute happy Corgi playing in park, sunset, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "Gwen Stacy reading a book, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "Gwen Stacy reading a book, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "Gwen Stacy reading a book by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "Gwen Stacy reading a book, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "Gwen Stacy reading a book, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "Gwen Stacy reading a book, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "Gwen Stacy reading a book, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "Gwen Stacy reading a book, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "Gwen Stacy reading a book, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": 
{"appearance_style": "animated style"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "An astronaut flying in space, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "An astronaut flying in space, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "An astronaut flying in space by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "An astronaut flying in space, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "An astronaut flying in space, pixel art", "dimension": ["appearance_style"], 
"auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "An astronaut flying in space, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "An astronaut flying in space, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "An astronaut flying in space, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "An astronaut flying in space, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, Van Gogh style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "Van Gogh style"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, oil painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "oil painting"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks by Hokusai, in the style of Ukiyo", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "by Hokusai, in the style of Ukiyo"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, black and white", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "black and white"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pixel art", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "pixel art"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in cyberpunk style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "in cyberpunk style"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, animated style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "animated style"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, watercolor painting", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "watercolor painting"}}}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, surrealism style", "dimension": ["appearance_style"], "auxiliary_info": {"appearance_style": {"appearance_style": "surrealism style"}}}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "The bund Shanghai, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "a shark is swimming in the ocean, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, zoom in", "dimension": 
["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "Gwen Stacy reading a book, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan left", "dimension": ["temporal_style"]}, 
{"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "An astronaut flying in space, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. 
snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in super slow motion", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom in", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom out", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan left", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan right", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt up", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt down", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, with an intense shaking effect", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, featuring a steady and smooth perspective", "dimension": ["temporal_style"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, racking focus", "dimension": ["temporal_style"]}, {"prompt_en": "Close up of grapes on a rotating table.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Turtle swimming in ocean.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A storm trooper vacuuming the beach.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A panda standing on a surfboard in the ocean in sunset.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An astronaut feeding ducks on a sunny afternoon, reflection from the water.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Two pandas discussing an academic paper.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Sunset time lapse at the beach with moving clouds and colors in the sky.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A fat rabbit wearing a purple robe walking through a fantasy landscape.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A koala bear playing piano in the forest.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An astronaut flying in space.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Fireworks.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An animated painting of fluffy white clouds moving in sky.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Flying through fantasy landscapes.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A bigfoot walking in the snowstorm.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A squirrel eating a burger.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cat wearing sunglasses and working as a lifeguard at a pool.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Splash of turquoise water in extreme slow motion, alpha channel included.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "an ice cream is melting on the table.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "a drone flying over a snowy forest.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "a shark is swimming in the ocean.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Aerial panoramic video from a drone of a fantasy land.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "a teddy bear is swimming in the ocean.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "time lapse of sunrise on mars.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "golden fish swimming in the ocean.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An artist brush painting on a canvas close up.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A drone view of celebration with Christmas tree and fireworks, starry sky - background.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "happy dog wearing a yellow turtleneck, studio, portrait, facing camera, dark background", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Origami dancers in white paper, 3D render, on white background, studio shot, dancing modern dance.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Campfire at night in a snowy forest with starry sky in the background.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "a fantasy landscape", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A 3D model of a 1800s victorian house.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "this is how I do makeup in the morning.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A raccoon that looks like a turtle, digital art.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Robot dancing in Times Square.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Busy freeway at night.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Balloon full of water exploding in extreme slow motion.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An astronaut is riding a horse in the space in a photorealistic style.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Macro slo-mo. 
Slow motion cropped closeup of roasted coffee beans falling into an empty bowl.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Sewing machine, old sewing machine working.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Motion colour drop in water, ink swirling in water, colourful ink in water, abstraction fancy dream cloud of ink.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Few big purple plums rotating on the turntable. water drops appear on the skin during rotation. isolated on the white background. close-up. macro.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Vampire makeup face of beautiful girl, red contact lenses.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Ashtray full of butts on table, smoke flowing on black background, close-up", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Pacific coast, carmel by the sea ocean and waves.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A teddy bear is playing drum kit in NYC Times Square.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A corgi is playing drum kit.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An Iron man is playing the electronic guitar, high electronic guitar.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A raccoon is playing the electronic guitar.", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A corgi's head depicted as an explosion of a nebula", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A fantasy landscape", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A future where humans have achieved teleportation technology", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A jellyfish floating through the ocean, with bioluminescent tentacles", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A Mars rover moving on Mars", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A panda drinking coffee in a cafe in Paris", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A space shuttle launching into orbit, with flames and smoke billowing out from the engines", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A steam train moving on a mountainside", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A super cool giant robot in Cyberpunk Beijing", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Cinematic shot of Van Gogh's 
selfie, Van Gogh style", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Gwen Stacy reading a book", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Iron Man flying in the sky", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "The bund Shanghai, oil painting", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Yoda playing guitar on the stage", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A car moving slowly on an empty street, rainy evening", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cat eating food out of a bowl", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cat wearing sunglasses at a pool", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A confused panda in calculus class", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cute fluffy panda eating Chinese food in a restaurant", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cute happy Corgi playing in park, sunset", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A cute raccoon playing guitar in a boat on the ocean", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A lightning striking atop of eiffel tower, dark clouds in the sky", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A modern art museum, with colorful paintings", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A panda cooking in the kitchen", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A panda playing on a swing set", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A polar bear is playing guitar", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A raccoon dressed in suit playing the trumpet, stage background", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A shark swimming in clear Caribbean ocean", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A super robot protecting city", 
"dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "A teddy bear washing the dishes", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An epic tornado attacking above a glowing city at night, the tornado is made of smoke", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "An oil painting of a couple in formal evening wear going home get caught in a heavy downpour with umbrellas", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Clown fish swimming through the coral reef", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Hyper-realistic spaceship landing on Mars", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "The bund Shanghai, vibrant color", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Vincent van Gogh is painting in the room", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "Yellow flowers swing in the wind", "dimension": ["overall_consistency", "aesthetic_quality", "imaging_quality"]}, {"prompt_en": "alley", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "alley"}}}}, {"prompt_en": "amusement park", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "amusement park"}}}}, {"prompt_en": "aquarium", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "aquarium"}}}}, {"prompt_en": "arch", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "arch"}}}}, {"prompt_en": "art gallery", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "art gallery"}}}}, {"prompt_en": "bathroom", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "bathroom"}}}}, {"prompt_en": "bakery shop", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "bakery shop"}}}}, {"prompt_en": "ballroom", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "ballroom"}}}}, {"prompt_en": "bar", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "bar"}}}}, {"prompt_en": "barn", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "barn"}}}}, {"prompt_en": "basement", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "basement"}}}}, {"prompt_en": "beach", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "beach"}}}}, {"prompt_en": "bedroom", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "bedroom"}}}}, {"prompt_en": "bridge", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "bridge"}}}}, {"prompt_en": "botanical garden", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "botanical garden"}}}}, {"prompt_en": "cafeteria", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "cafeteria"}}}}, {"prompt_en": "campsite", "dimension": ["scene", 
"background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "campsite"}}}}, {"prompt_en": "campus", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "campus"}}}}, {"prompt_en": "carrousel", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "carrousel"}}}}, {"prompt_en": "castle", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "castle"}}}}, {"prompt_en": "cemetery", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "cemetery"}}}}, {"prompt_en": "classroom", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "classroom"}}}}, {"prompt_en": "cliff", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "cliff"}}}}, {"prompt_en": "crosswalk", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "crosswalk"}}}}, {"prompt_en": "construction site", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "construction site"}}}}, {"prompt_en": "corridor", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "corridor"}}}}, {"prompt_en": "courtyard", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "courtyard"}}}}, {"prompt_en": "desert", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "desert"}}}}, {"prompt_en": "downtown", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "downtown"}}}}, {"prompt_en": "driveway", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "driveway"}}}}, {"prompt_en": "farm", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "farm"}}}}, {"prompt_en": "food court", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "food court"}}}}, {"prompt_en": "football field", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "football field"}}}}, {"prompt_en": "forest road", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "forest road"}}}}, {"prompt_en": "fountain", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "fountain"}}}}, {"prompt_en": "gas station", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "gas station"}}}}, {"prompt_en": "glacier", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "glacier"}}}}, {"prompt_en": "golf course", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "golf course"}}}}, {"prompt_en": "indoor gymnasium", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "indoor gymnasium"}}}}, {"prompt_en": "harbor", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "harbor"}}}}, {"prompt_en": "highway", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "highway"}}}}, {"prompt_en": "hospital", "dimension": ["scene", 
"background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "hospital"}}}}, {"prompt_en": "house", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "house"}}}}, {"prompt_en": "iceberg", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "iceberg"}}}}, {"prompt_en": "industrial area", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "industrial area"}}}}, {"prompt_en": "jail cell", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "jail cell"}}}}, {"prompt_en": "junkyard", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "junkyard"}}}}, {"prompt_en": "kitchen", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "kitchen"}}}}, {"prompt_en": "indoor library", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "indoor library"}}}}, {"prompt_en": "lighthouse", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "lighthouse"}}}}, {"prompt_en": "laboratory", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "laboratory"}}}}, {"prompt_en": "mansion", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "mansion"}}}}, {"prompt_en": "marsh", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "marsh"}}}}, {"prompt_en": "mountain", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "mountain"}}}}, {"prompt_en": "indoor movie theater", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "indoor movie theater"}}}}, {"prompt_en": "indoor museum", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "indoor museum"}}}}, {"prompt_en": "music studio", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "music studio"}}}}, {"prompt_en": "nursery", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "nursery"}}}}, {"prompt_en": "ocean", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "ocean"}}}}, {"prompt_en": "office", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "office"}}}}, {"prompt_en": "palace", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "palace"}}}}, {"prompt_en": "parking lot", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "parking lot"}}}}, {"prompt_en": "pharmacy", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "pharmacy"}}}}, {"prompt_en": "phone booth", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "phone booth"}}}}, {"prompt_en": "raceway", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "raceway"}}}}, {"prompt_en": "restaurant", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "restaurant"}}}}, {"prompt_en": "river", "dimension": ["scene", 
"background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "river"}}}}, {"prompt_en": "science museum", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "science museum"}}}}, {"prompt_en": "shower", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "shower"}}}}, {"prompt_en": "ski slope", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "ski slope"}}}}, {"prompt_en": "sky", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "sky"}}}}, {"prompt_en": "skyscraper", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "skyscraper"}}}}, {"prompt_en": "baseball stadium", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "baseball stadium"}}}}, {"prompt_en": "staircase", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "staircase"}}}}, {"prompt_en": "street", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "street"}}}}, {"prompt_en": "supermarket", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "supermarket"}}}}, {"prompt_en": "indoor swimming pool", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "indoor swimming pool"}}}}, {"prompt_en": "tower", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "tower"}}}}, {"prompt_en": "outdoor track", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "outdoor track"}}}}, {"prompt_en": "train railway", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "train railway"}}}}, {"prompt_en": "train station platform", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "train station platform"}}}}, {"prompt_en": "underwater coral reef", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "underwater coral reef"}}}}, {"prompt_en": "valley", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "valley"}}}}, {"prompt_en": "volcano", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "volcano"}}}}, {"prompt_en": "waterfall", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "waterfall"}}}}, {"prompt_en": "windmill", "dimension": ["scene", "background_consistency"], "auxiliary_info": {"scene": {"scene": {"scene": "windmill"}}}}, {"prompt_en": "a bicycle on the left of a car, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bicycle", "object_b": "car", "relationship": "on the left of"}}}}, {"prompt_en": "a car on the right of a motorcycle, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "car", "object_b": "motorcycle", "relationship": "on the right of"}}}}, {"prompt_en": "a motorcycle on the left of a bus, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": 
"motorcycle", "object_b": "bus", "relationship": "on the left of"}}}}, {"prompt_en": "a bus on the right of a traffic light, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bus", "object_b": "traffic light", "relationship": "on the right of"}}}}, {"prompt_en": "a traffic light on the left of a fire hydrant, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "traffic light", "object_b": "fire hydrant", "relationship": "on the left of"}}}}, {"prompt_en": "a fire hydrant on the right of a stop sign, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "fire hydrant", "object_b": "stop sign", "relationship": "on the right of"}}}}, {"prompt_en": "a stop sign on the left of a parking meter, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "stop sign", "object_b": "parking meter", "relationship": "on the left of"}}}}, {"prompt_en": "a parking meter on the right of a bench, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "parking meter", "object_b": "bench", "relationship": "on the right of"}}}}, {"prompt_en": "a bench on the left of a truck, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bench", "object_b": "truck", "relationship": "on the left of"}}}}, {"prompt_en": "a truck on the right of a bicycle, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "truck", "object_b": "bicycle", "relationship": "on the right of"}}}}, {"prompt_en": "a bird on the left of a cat, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bird", "object_b": "cat", "relationship": "on the left of"}}}}, {"prompt_en": "a cat on the right of a dog, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "cat", "object_b": "dog", "relationship": "on the right of"}}}}, {"prompt_en": "a dog on the left of a horse, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "dog", "object_b": "horse", "relationship": "on the left of"}}}}, {"prompt_en": "a horse on the right of a sheep, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "horse", "object_b": "sheep", "relationship": "on the right of"}}}}, {"prompt_en": "a sheep on the left of a cow, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "sheep", "object_b": "cow", "relationship": "on the left of"}}}}, {"prompt_en": "a cow on the right of an elephant, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "cow", "object_b": "elephant", "relationship": "on the right of"}}}}, {"prompt_en": "an elephant on the left of a bear, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": 
{"spatial_relationship": {"object_a": "elephant", "object_b": "bear", "relationship": "on the left of"}}}}, {"prompt_en": "a bear on the right of a zebra, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bear", "object_b": "zebra", "relationship": "on the right of"}}}}, {"prompt_en": "a zebra on the left of a giraffe, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "zebra", "object_b": "giraffe", "relationship": "on the left of"}}}}, {"prompt_en": "a giraffe on the right of a bird, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "giraffe", "object_b": "bird", "relationship": "on the right of"}}}}, {"prompt_en": "a bottle on the left of a wine glass, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bottle", "object_b": "wine glass", "relationship": "on the left of"}}}}, {"prompt_en": "a wine glass on the right of a cup, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "wine glass", "object_b": "cup", "relationship": "on the right of"}}}}, {"prompt_en": "a cup on the left of a fork, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "cup", "object_b": "fork", "relationship": "on the left of"}}}}, {"prompt_en": "a fork on the right of a knife, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "fork", "object_b": "knife", "relationship": "on the right of"}}}}, {"prompt_en": "a knife on the left of a spoon, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "knife", "object_b": "spoon", "relationship": "on the left of"}}}}, {"prompt_en": "a spoon on the right of a bowl, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "spoon", "object_b": "bowl", "relationship": "on the right of"}}}}, {"prompt_en": "a bowl on the left of a bottle, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bowl", "object_b": "bottle", "relationship": "on the left of"}}}}, {"prompt_en": "a potted plant on the left of a remote, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "potted plant", "object_b": "remote", "relationship": "on the left of"}}}}, {"prompt_en": "a remote on the right of a clock, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "remote", "object_b": "clock", "relationship": "on the right of"}}}}, {"prompt_en": "a clock on the left of a vase, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "clock", "object_b": "vase", "relationship": "on the left of"}}}}, {"prompt_en": "a vase on the right of scissors, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": 
"vase", "object_b": "scissors", "relationship": "on the right of"}}}}, {"prompt_en": "scissors on the left of a teddy bear, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "scissors", "object_b": "teddy bear", "relationship": "on the left of"}}}}, {"prompt_en": "a teddy bear on the right of a potted plant, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "teddy bear", "object_b": "potted plant", "relationship": "on the right of"}}}}, {"prompt_en": "a frisbee on the left of a sports ball, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "frisbee", "object_b": "sports ball", "relationship": "on the left of"}}}}, {"prompt_en": "a sports ball on the right of a baseball bat, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "sports ball", "object_b": "baseball bat", "relationship": "on the right of"}}}}, {"prompt_en": "a baseball bat on the left of a baseball glove, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "baseball bat", "object_b": "baseball glove", "relationship": "on the left of"}}}}, {"prompt_en": "a baseball glove on the right of a tennis racket, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "baseball glove", "object_b": "tennis racket", "relationship": "on the right of"}}}}, {"prompt_en": "a tennis racket on the left of a frisbee, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "tennis racket", "object_b": "frisbee", "relationship": "on the left of"}}}}, {"prompt_en": "a toilet on the left of a hair drier, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "toilet", "object_b": "hair drier", "relationship": "on the left of"}}}}, {"prompt_en": "a hair drier on the right of a toothbrush, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "hair drier", "object_b": "toothbrush", "relationship": "on the right of"}}}}, {"prompt_en": "a toothbrush on the left of a sink, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "toothbrush", "object_b": "sink", "relationship": "on the left of"}}}}, {"prompt_en": "a sink on the right of a toilet, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "sink", "object_b": "toilet", "relationship": "on the right of"}}}}, {"prompt_en": "a chair on the left of a couch, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "chair", "object_b": "couch", "relationship": "on the left of"}}}}, {"prompt_en": "a couch on the right of a bed, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "couch", "object_b": "bed", "relationship": "on the right of"}}}}, {"prompt_en": "a bed on the left of a tv, 
front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "bed", "object_b": "tv", "relationship": "on the left of"}}}}, {"prompt_en": "a tv on the right of a dining table, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "tv", "object_b": "dining table", "relationship": "on the right of"}}}}, {"prompt_en": "a dining table on the left of a chair, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "dining table", "object_b": "chair", "relationship": "on the left of"}}}}, {"prompt_en": "an airplane on the left of a train, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "airplane", "object_b": "train", "relationship": "on the left of"}}}}, {"prompt_en": "a train on the right of a boat, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "train", "object_b": "boat", "relationship": "on the right of"}}}}, {"prompt_en": "a boat on the left of an airplane, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "boat", "object_b": "airplane", "relationship": "on the left of"}}}}, {"prompt_en": "an oven on the top of a toaster, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "oven", "object_b": "toaster", "relationship": "on the top of"}}}}, {"prompt_en": "an oven on the bottom of a toaster, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "oven", "object_b": "toaster", "relationship": "on the bottom of"}}}}, {"prompt_en": "a toaster on the top of a microwave, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "toaster", "object_b": "microwave", "relationship": "on the top of"}}}}, {"prompt_en": "a toaster on the bottom of a microwave, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "toaster", "object_b": "microwave", "relationship": "on the bottom of"}}}}, {"prompt_en": "a microwave on the top of an oven, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "microwave", "object_b": "oven", "relationship": "on the top of"}}}}, {"prompt_en": "a microwave on the bottom of an oven, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "microwave", "object_b": "oven", "relationship": "on the bottom of"}}}}, {"prompt_en": "a banana on the top of an apple, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "banana", "object_b": "apple", "relationship": "on the top of"}}}}, {"prompt_en": "a banana on the bottom of an apple, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "banana", "object_b": "apple", "relationship": "on the bottom of"}}}}, {"prompt_en": "an apple on the top of a 
sandwich, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "apple", "object_b": "sandwich", "relationship": "on the top of"}}}}, {"prompt_en": "an apple on the bottom of a sandwich, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "apple", "object_b": "sandwich", "relationship": "on the bottom of"}}}}, {"prompt_en": "a sandwich on the top of an orange, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "sandwich", "object_b": "orange", "relationship": "on the top of"}}}}, {"prompt_en": "a sandwich on the bottom of an orange, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "sandwich", "object_b": "orange", "relationship": "on the bottom of"}}}}, {"prompt_en": "an orange on the top of a carrot, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "orange", "object_b": "carrot", "relationship": "on the top of"}}}}, {"prompt_en": "an orange on the bottom of a carrot, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "orange", "object_b": "carrot", "relationship": "on the bottom of"}}}}, {"prompt_en": "a carrot on the top of a hot dog, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "carrot", "object_b": "hot dog", "relationship": "on the top of"}}}}, {"prompt_en": "a carrot on the bottom of a hot dog, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "carrot", "object_b": "hot dog", "relationship": "on the bottom of"}}}}, {"prompt_en": "a hot dog on the top of a pizza, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "hot dog", "object_b": "pizza", "relationship": "on the top of"}}}}, {"prompt_en": "a hot dog on the bottom of a pizza, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "hot dog", "object_b": "pizza", "relationship": "on the bottom of"}}}}, {"prompt_en": "a pizza on the top of a donut, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "pizza", "object_b": "donut", "relationship": "on the top of"}}}}, {"prompt_en": "a pizza on the bottom of a donut, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "pizza", "object_b": "donut", "relationship": "on the bottom of"}}}}, {"prompt_en": "a donut on the top of broccoli, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "donut", "object_b": "broccoli", "relationship": "on the top of"}}}}, {"prompt_en": "a donut on the bottom of broccoli, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "donut", "object_b": "broccoli", "relationship": "on the bottom of"}}}}, {"prompt_en": "broccoli on the top of 
a banana, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "broccoli", "object_b": "banana", "relationship": "on the top of"}}}}, {"prompt_en": "broccoli on the bottom of a banana, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "broccoli", "object_b": "banana", "relationship": "on the bottom of"}}}}, {"prompt_en": "skis on the top of a snowboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "skis", "object_b": "snowboard", "relationship": "on the top of"}}}}, {"prompt_en": "skis on the bottom of a snowboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "skis", "object_b": "snowboard", "relationship": "on the bottom of"}}}}, {"prompt_en": "a snowboard on the top of a kite, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "snowboard", "object_b": "kite", "relationship": "on the top of"}}}}, {"prompt_en": "a snowboard on the bottom of a kite, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "snowboard", "object_b": "kite", "relationship": "on the bottom of"}}}}, {"prompt_en": "a kite on the top of a skateboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "kite", "object_b": "skateboard", "relationship": "on the top of"}}}}, {"prompt_en": "a kite on the bottom of a skateboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "kite", "object_b": "skateboard", "relationship": "on the bottom of"}}}}, {"prompt_en": "a skateboard on the top of a surfboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "skateboard", "object_b": "surfboard", "relationship": "on the top of"}}}}, {"prompt_en": "a skateboard on the bottom of a surfboard, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "skateboard", "object_b": "surfboard", "relationship": "on the bottom of"}}}}, {"prompt_en": "a surfboard on the top of skis, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "surfboard", "object_b": "skis", "relationship": "on the top of"}}}}, {"prompt_en": "a surfboard on the bottom of skis, front view", "dimension": ["spatial_relationship"], "auxiliary_info": {"spatial_relationship": {"spatial_relationship": {"object_a": "surfboard", "object_b": "skis", "relationship": "on the bottom of"}}}}]
\ No newline at end of file
+[
+    {"prompt_en": "In a still frame, a stop sign", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "a toilet, frozen in time", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "a laptop, frozen in time", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of alley", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of bar", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of barn", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of bathroom", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of bedroom", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of cliff", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, courtyard", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, gas station", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of house", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "indoor gymnasium, frozen in time", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of indoor library", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of kitchen", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of palace", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, parking lot", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, phone booth", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of restaurant", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of tower", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a bowl", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of an apple", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a bench", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a bed", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a chair", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a cup", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a dining table", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, a pear", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a bunch of grapes", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a bowl on the kitchen counter", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a beautiful, handcrafted ceramic bowl", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of an antique bowl", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of an exquisite mahogany dining table", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a wooden bench in the park", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a beautiful wrought-iron bench surrounded by blooming flowers", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, a park bench with a view of the lake", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a vintage rocking chair was placed on the porch", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the jail cell was small and dimly lit, with cold, steel bars", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the phone booth was tucked away in a quiet alley", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "a dilapidated phone booth stood as a relic of a bygone era on the sidewalk, frozen in time", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the old red barn stood weathered and iconic against the backdrop of the countryside", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a picturesque barn was painted a warm shade of red and nestled in a picturesque meadow", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, within the desolate desert, an oasis unfolded, characterized by the stoic presence of palm trees and a motionless, glassy pool of water", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, the Parthenon's majestic Doric columns stand in serene solitude atop the Acropolis, framed by the tranquil Athenian landscape", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, the Temple of Hephaestus, with its timeless Doric grace, stands stoically against the backdrop of a quiet Athens", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the Stonehenge presented itself as an enigmatic puzzle, each colossal stone meticulously placed against the backdrop of tranquility", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, in the vast desert, an oasis nestled among dunes, featuring tall palm trees and an air of serenity", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "static view on a desert scene with an oasis, palm trees, and a clear, calm pool of water", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of an ornate Victorian streetlamp standing on a cobblestone street corner, illuminating the empty night", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a tranquil lakeside cabin nestled among tall pines, its reflection mirrored perfectly in the calm water", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, a vintage gas lantern, adorned with intricate details, gracing a historic cobblestone square", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, a tranquil Japanese tea ceremony room, with tatami mats, a delicate tea set, and a bonsai tree in the corner", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the Parthenon stands resolute in its classical elegance, a timeless symbol of Athens' cultural legacy", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the heart of Plaka, the neoclassical architecture of the old city harmonizes with the ancient ruins", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the desolate beauty of the American Southwest, Chaco Canyon's ancient ruins whispered tales of an enigmatic civilization that once thrived amidst the arid landscapes", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of at the edge of the Arabian Desert, the ancient city of Petra beckoned with its enigmatic rock-carved fa\u00e7ades", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, amidst the cobblestone streets, an Art Nouveau lamppost stood tall", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the quaint village square, a traditional wrought-iron streetlamp featured delicate filigree patterns and amber-hued glass panels", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of the lampposts were adorned with Art Deco motifs, their geometric shapes and frosted glass creating a sense of vintage glamour", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, in the picturesque square, a Gothic-style lamppost adorned with intricate stone carvings added a touch of medieval charm to the setting", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, in the heart of the old city, a row of ornate lantern-style streetlamps bathed the narrow alleyway in a warm, welcoming light", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the heart of the Utah desert, a massive sandstone arch spanned the horizon", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the Arizona desert, a massive stone bridge arched across a rugged canyon", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the corner of the minimalist tea room, a bonsai tree added a touch of nature's beauty to the otherwise simple and elegant space", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, amidst the hushed ambiance of the traditional tea room, a meticulously arranged tea set awaited, with porcelain cups, a bamboo whisk", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, nestled in the Zen garden, a rustic teahouse featured tatami seating and a traditional charcoal brazier", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a country estate's library featured elegant wooden shelves", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of beneath the shade of a solitary oak tree, an old wooden park bench sat patiently", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of beside a tranquil pond, a weeping willow tree draped its branches gracefully over the water's surface, creating a serene tableau of reflection and calm", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the Zen garden, a perfectly raked gravel path led to a serene rock garden", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, a tranquil pond was fringed by weeping cherry trees, their blossoms drifting lazily onto the glassy surface", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "In a still frame, within the historic library's reading room, rows of antique leather chairs and mahogany tables offered a serene haven for literary contemplation", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of a peaceful orchid garden showcased a variety of delicate blooms", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "A tranquil tableau of in the serene courtyard, a centuries-old stone well stood as a symbol of a bygone era, its mossy stones bearing witness to the passage of time", "dimension": ["temporal_flickering"]},
+    {"prompt_en": "a bird and a cat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bird and cat"}}},
+    {"prompt_en": "a cat and a dog", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cat and dog"}}},
+    {"prompt_en": "a dog and a horse", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "dog and horse"}}},
+    {"prompt_en": "a horse and a sheep", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "horse and sheep"}}},
+    {"prompt_en": "a sheep and a cow", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sheep and cow"}}},
+    {"prompt_en": "a cow and an elephant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cow and elephant"}}},
+    {"prompt_en": "an elephant and a bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "elephant and bear"}}},
+    {"prompt_en": "a bear and a zebra", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bear and zebra"}}},
+    {"prompt_en": "a zebra and a giraffe", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "zebra and giraffe"}}},
+    {"prompt_en": "a giraffe and a bird", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "giraffe and bird"}}},
+    {"prompt_en": "a chair and a couch", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "chair and couch"}}},
+    {"prompt_en": "a couch and a potted plant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "couch and potted plant"}}},
+    {"prompt_en": "a potted plant and a tv", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "potted plant and tv"}}},
+    {"prompt_en": "a tv and a laptop", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tv and laptop"}}},
+    {"prompt_en": "a laptop and a remote", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "laptop and remote"}}},
+    {"prompt_en": "a remote and a keyboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "remote and keyboard"}}},
+    {"prompt_en": "a keyboard and a cell phone", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "keyboard and cell phone"}}},
+    {"prompt_en": "a cell phone and a book", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cell phone and book"}}},
+    {"prompt_en": "a book and a clock", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "book and clock"}}},
+    {"prompt_en": "a clock and a backpack", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "clock and backpack"}}},
+    {"prompt_en": "a backpack and an umbrella", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "backpack and umbrella"}}},
+    {"prompt_en": "an umbrella and a handbag", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "umbrella and handbag"}}},
+    {"prompt_en": "a handbag and a tie", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "handbag and tie"}}},
+    {"prompt_en": "a tie and a suitcase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tie and suitcase"}}},
+    {"prompt_en": "a suitcase and a vase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "suitcase and vase"}}},
+    {"prompt_en": "a vase and scissors", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "vase and scissors"}}},
+    {"prompt_en": "scissors and a teddy bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "scissors and teddy bear"}}},
+    {"prompt_en": "a teddy bear and a frisbee", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "teddy bear and frisbee"}}},
+    {"prompt_en": "a frisbee and skis", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "frisbee and skis"}}},
+    {"prompt_en": "skis and a snowboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "skis and snowboard"}}},
+    {"prompt_en": "a snowboard and a sports ball", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "snowboard and sports ball"}}},
+    {"prompt_en": "a sports ball and a kite", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sports ball and kite"}}},
+    {"prompt_en": "a kite and a baseball bat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "kite and baseball bat"}}},
+    {"prompt_en": "a baseball bat and a baseball glove", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "baseball bat and baseball glove"}}},
+    {"prompt_en": "a baseball glove and a skateboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "baseball glove and skateboard"}}},
+    {"prompt_en": "a skateboard and a surfboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "skateboard and surfboard"}}},
+    {"prompt_en": "a surfboard and a tennis racket", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "surfboard and tennis racket"}}},
+    {"prompt_en": "a tennis racket and a bottle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "tennis racket and bottle"}}},
+    {"prompt_en": "a bottle and a chair", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bottle and chair"}}},
+    {"prompt_en": "an airplane and a train", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "airplane and train"}}},
+    {"prompt_en": "a train and a boat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "train and boat"}}},
+    {"prompt_en": "a boat and an airplane", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "boat and airplane"}}},
+    {"prompt_en": "a bicycle and a car", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bicycle and car"}}},
+    {"prompt_en": "a car and a motorcycle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "car and motorcycle"}}},
+    {"prompt_en": "a motorcycle and a bus", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "motorcycle and bus"}}},
+    {"prompt_en": "a bus and a traffic light", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bus and traffic light"}}},
+    {"prompt_en": "a traffic light and a fire hydrant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "traffic light and fire hydrant"}}},
+    {"prompt_en": "a fire hydrant and a stop sign", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "fire hydrant and stop sign"}}},
+    {"prompt_en": "a stop sign and a parking meter", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "stop sign and parking meter"}}},
+    {"prompt_en": "a parking meter and a truck", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "parking meter and truck"}}},
+    {"prompt_en": "a truck and a bicycle", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "truck and bicycle"}}},
+    {"prompt_en": "a toilet and a hair drier", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toilet and hair drier"}}},
+    {"prompt_en": "a hair drier and a toothbrush", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "hair drier and toothbrush"}}},
+    {"prompt_en": "a toothbrush and a sink", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toothbrush and sink"}}},
+    {"prompt_en": "a sink and a toilet", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sink and toilet"}}},
+    {"prompt_en": "a wine glass and a chair", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "wine glass and chair"}}},
+    {"prompt_en": "a cup and a couch", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cup and couch"}}},
+    {"prompt_en": "a fork and a potted plant", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "fork and potted plant"}}},
+    {"prompt_en": "a knife and a tv", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "knife and tv"}}},
+    {"prompt_en": "a spoon and a laptop", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "spoon and laptop"}}},
+    {"prompt_en": "a bowl and a remote", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bowl and remote"}}},
+    {"prompt_en": "a banana and a keyboard", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "banana and keyboard"}}},
+    {"prompt_en": "an apple and a cell phone", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "apple and cell phone"}}},
+    {"prompt_en": "a sandwich and a book", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "sandwich and book"}}},
+    {"prompt_en": "an orange and a clock", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "orange and clock"}}},
+    {"prompt_en": "broccoli and a backpack", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "broccoli and backpack"}}},
+    {"prompt_en": "a carrot and an umbrella", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "carrot and umbrella"}}},
+    {"prompt_en": "a hot dog and a handbag", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "hot dog and handbag"}}},
+    {"prompt_en": "a pizza and a tie", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "pizza and tie"}}},
+    {"prompt_en": "a donut and a suitcase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "donut and suitcase"}}},
+    {"prompt_en": "a cake and a vase", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "cake and vase"}}},
+    {"prompt_en": "an oven and scissors", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "oven and scissors"}}},
+    {"prompt_en": "a toaster and a teddy bear", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "toaster and teddy bear"}}},
+    {"prompt_en": "a microwave and a frisbee", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "microwave and frisbee"}}},
+    {"prompt_en": "a refrigerator and skis", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "refrigerator and skis"}}},
+    {"prompt_en": "a bicycle and an airplane", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "bicycle and airplane"}}},
+    {"prompt_en": "a car and a train", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "car and train"}}},
+    {"prompt_en": "a motorcycle and a boat", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "motorcycle and boat"}}},
+    {"prompt_en": "a person and a toilet", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and toilet"}}},
+    {"prompt_en": "a person and a hair drier", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and hair drier"}}},
+    {"prompt_en": "a person and a toothbrush", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and toothbrush"}}},
+    {"prompt_en": "a person and a sink", "dimension": ["multiple_objects"], "auxiliary_info": {"multiple_objects": {"object": "person and sink"}}},
+    {"prompt_en": "A person is riding a bike", "dimension": ["human_action"]},
+    {"prompt_en": "A person is marching", "dimension": ["human_action"]},
+    {"prompt_en": "A person is roller skating", "dimension": ["human_action"]},
+    {"prompt_en": "A person is tasting beer", "dimension": ["human_action"]},
+    {"prompt_en": "A person is clapping", "dimension": ["human_action"]},
+    {"prompt_en": "A person is drawing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is petting animal (not cat)", "dimension": ["human_action"]},
+    {"prompt_en": "A person is eating watermelon", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing harp", "dimension": ["human_action"]},
+    {"prompt_en": "A person is wrestling", "dimension": ["human_action"]},
+    {"prompt_en": "A person is riding scooter", "dimension": ["human_action"]},
+    {"prompt_en": "A person is sweeping floor", "dimension": ["human_action"]},
+    {"prompt_en": "A person is skateboarding", "dimension": ["human_action"]},
+    {"prompt_en": "A person is dunking basketball", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing flute", "dimension": ["human_action"]},
+    {"prompt_en": "A person is stretching leg", "dimension": ["human_action"]},
+    {"prompt_en": "A person is tying tie", "dimension": ["human_action"]},
+    {"prompt_en": "A person is skydiving", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shooting goal (soccer)", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing piano", "dimension": ["human_action"]},
+    {"prompt_en": "A person is finger snapping", "dimension": ["human_action"]},
+    {"prompt_en": "A person is canoeing or kayaking", "dimension": ["human_action"]},
+    {"prompt_en": "A person is laughing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is digging", "dimension": ["human_action"]},
+    {"prompt_en": "A person is clay pottery making", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shooting basketball", "dimension": ["human_action"]},
+    {"prompt_en": "A person is bending back", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shaking hands", "dimension": ["human_action"]},
+    {"prompt_en": "A person is bandaging", "dimension": ["human_action"]},
+    {"prompt_en": "A person is push up", "dimension": ["human_action"]},
+    {"prompt_en": "A person is catching or throwing frisbee", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing trumpet", "dimension": ["human_action"]},
+    {"prompt_en": "A person is flying kite", "dimension": ["human_action"]},
+    {"prompt_en": "A person is filling eyebrows", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shuffling cards", "dimension": ["human_action"]},
+    {"prompt_en": "A person is folding clothes", "dimension": ["human_action"]},
+    {"prompt_en": "A person is smoking", "dimension": ["human_action"]},
+    {"prompt_en": "A person is tai chi", "dimension": ["human_action"]},
+    {"prompt_en": "A person is squat", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing controller", "dimension": ["human_action"]},
+    {"prompt_en": "A person is throwing axe", "dimension": ["human_action"]},
+    {"prompt_en": "A person is giving or receiving award", "dimension": ["human_action"]},
+    {"prompt_en": "A person is air drumming", "dimension": ["human_action"]},
+    {"prompt_en": "A person is taking a shower", "dimension": ["human_action"]},
+    {"prompt_en": "A person is planting trees", "dimension": ["human_action"]},
+    {"prompt_en": "A person is sharpening knives", "dimension": ["human_action"]},
+    {"prompt_en": "A person is robot dancing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is rock climbing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is hula hooping", "dimension": ["human_action"]},
+    {"prompt_en": "A person is writing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is bungee jumping", "dimension": ["human_action"]},
+    {"prompt_en": "A person is pushing cart", "dimension": ["human_action"]},
+    {"prompt_en": "A person is cleaning windows", "dimension": ["human_action"]},
+    {"prompt_en": "A person is cutting watermelon", "dimension": ["human_action"]},
+    {"prompt_en": "A person is cheerleading", "dimension": ["human_action"]},
+    {"prompt_en": "A person is washing hands", "dimension": ["human_action"]},
+    {"prompt_en": "A person is ironing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is cutting nails", "dimension": ["human_action"]},
+    {"prompt_en": "A person is hugging", "dimension": ["human_action"]},
+    {"prompt_en": "A person is trimming or shaving beard", "dimension": ["human_action"]},
+    {"prompt_en": "A person is jogging", "dimension": ["human_action"]},
+    {"prompt_en": "A person is making bed", "dimension": ["human_action"]},
+    {"prompt_en": "A person is washing dishes", "dimension": ["human_action"]},
+    {"prompt_en": "A person is grooming dog", "dimension": ["human_action"]},
+    {"prompt_en": "A person is doing laundry", "dimension": ["human_action"]},
+    {"prompt_en": "A person is knitting", "dimension": ["human_action"]},
+    {"prompt_en": "A person is reading book", "dimension": ["human_action"]},
+    {"prompt_en": "A person is baby waking up", "dimension": ["human_action"]},
+    {"prompt_en": "A person is massaging legs", "dimension": ["human_action"]},
+    {"prompt_en": "A person is brushing teeth", "dimension": ["human_action"]},
+    {"prompt_en": "A person is crawling baby", "dimension": ["human_action"]},
+    {"prompt_en": "A person is motorcycling", "dimension": ["human_action"]},
+    {"prompt_en": "A person is driving car", "dimension": ["human_action"]},
+    {"prompt_en": "A person is sticking tongue out", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shaking head", "dimension": ["human_action"]},
+    {"prompt_en": "A person is sword fighting", "dimension": ["human_action"]},
+    {"prompt_en": "A person is doing aerobics", "dimension": ["human_action"]},
+    {"prompt_en": "A person is strumming guitar", "dimension": ["human_action"]},
+    {"prompt_en": "A person is riding or walking with horse", "dimension": ["human_action"]},
+    {"prompt_en": "A person is archery", "dimension": ["human_action"]},
+    {"prompt_en": "A person is catching or throwing baseball", "dimension": ["human_action"]},
+    {"prompt_en": "A person is playing chess", "dimension": ["human_action"]},
+    {"prompt_en": "A person is rock scissors paper", "dimension": ["human_action"]},
+    {"prompt_en": "A person is using computer", "dimension": ["human_action"]},
+    {"prompt_en": "A person is arranging flowers", "dimension": ["human_action"]},
+    {"prompt_en": "A person is bending metal", "dimension": ["human_action"]},
+    {"prompt_en": "A person is ice skating", "dimension": ["human_action"]},
+    {"prompt_en": "A person is climbing a rope", "dimension": ["human_action"]},
+    {"prompt_en": "A person is crying", "dimension": ["human_action"]},
+    {"prompt_en": "A person is dancing ballet", "dimension": ["human_action"]},
+    {"prompt_en": "A person is getting a haircut", "dimension": ["human_action"]},
+    {"prompt_en": "A person is running on treadmill", "dimension": ["human_action"]},
+    {"prompt_en": "A person is kissing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is counting money", "dimension": ["human_action"]},
+    {"prompt_en": "A person is barbequing", "dimension": ["human_action"]},
+    {"prompt_en": "A person is peeling apples", "dimension": ["human_action"]},
+    {"prompt_en": "A person is milking cow", "dimension": ["human_action"]},
+    {"prompt_en": "A person is shining shoes", "dimension": ["human_action"]},
+    {"prompt_en": "A person is making snowman", "dimension": ["human_action"]},
+    {"prompt_en": "A person is sailing", "dimension": ["human_action"]},
+    {"prompt_en": "a person swimming in ocean", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]},
+    {"prompt_en": "a person giving a presentation to a room full of colleagues", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]},
+    {"prompt_en": "a person washing the dishes", "dimension": ["subject_consistency", "dynamic_degree", "motion_smoothness"]},
+    {"prompt_en": "a person
eating a burger", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a person walking in the snowstorm", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a person drinking coffee in a cafe", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a person playing guitar", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bicycle leaning against a tree", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bicycle gliding through a snowy field", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bicycle slowing down to stop", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bicycle accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a car stuck in traffic during rush hour", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a car turning a corner", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a car slowing down to stop", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a car accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a motorcycle cruising along a coastal highway", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a motorcycle turning a corner", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a motorcycle slowing down to stop", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a motorcycle gliding through a snowy field", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a motorcycle accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an airplane soaring through a clear blue sky", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an airplane taking off", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an airplane landing smoothly on a runway", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an airplane accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bus turning a corner", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bus stuck in traffic during rush hour", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bus accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { 
+ "prompt_en": "a train speeding down the tracks", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a train crossing over a tall bridge", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a train accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a truck turning a corner", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a truck anchored in a tranquil bay", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a truck stuck in traffic during rush hour", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a truck slowing down to stop", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a truck accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a boat sailing smoothly on a calm lake", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a boat slowing down to stop", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a boat accelerating to gain speed", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bird soaring gracefully in the sky", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bird building a nest from twigs and leaves", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bird flying over a snowy forest", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cat grooming itself meticulously with its tongue", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cat playing in park", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cat drinking water", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cat running happily", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a dog enjoying a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a dog playing in park", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a dog drinking water", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a dog running happily", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a horse bending down to drink water from a river", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a horse galloping across an open field", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + 
"prompt_en": "a horse taking a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a horse running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a sheep bending down to drink water from a river", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a sheep taking a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a sheep running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cow bending down to drink water from a river", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cow chewing cud while resting in a tranquil barn", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a cow running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an elephant spraying itself with water using its trunk to cool down", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an elephant taking a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "an elephant running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bear catching a salmon in its powerful jaws", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bear sniffing the air for scents of food", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bear climbing a tree", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a bear hunting for prey", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a zebra bending down to drink water from a river", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a zebra running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a zebra taking a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a giraffe bending down to drink water from a river", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a giraffe taking a peaceful walk", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a giraffe running to join a herd of its kind", + "dimension": [ + "subject_consistency", + "dynamic_degree", + "motion_smoothness" + ] + }, + { + "prompt_en": "a person", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "person" + } + } + }, + { + "prompt_en": "a bicycle", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": 
"bicycle" + } + } + }, + { + "prompt_en": "a car", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "car" + } + } + }, + { + "prompt_en": "a motorcycle", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "motorcycle" + } + } + }, + { + "prompt_en": "an airplane", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "airplane" + } + } + }, + { + "prompt_en": "a bus", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bus" + } + } + }, + { + "prompt_en": "a train", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "train" + } + } + }, + { + "prompt_en": "a truck", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "truck" + } + } + }, + { + "prompt_en": "a boat", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "boat" + } + } + }, + { + "prompt_en": "a traffic light", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "traffic light" + } + } + }, + { + "prompt_en": "a fire hydrant", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "fire hydrant" + } + } + }, + { + "prompt_en": "a stop sign", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "stop sign" + } + } + }, + { + "prompt_en": "a parking meter", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "parking meter" + } + } + }, + { + "prompt_en": "a bench", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bench" + } + } + }, + { + "prompt_en": "a bird", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bird" + } + } + }, + { + "prompt_en": "a cat", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "cat" + } + } + }, + { + "prompt_en": "a dog", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "dog" + } + } + }, + { + "prompt_en": "a horse", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "horse" + } + } + }, + { + "prompt_en": "a sheep", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "sheep" + } + } + }, + { + "prompt_en": "a cow", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "cow" + } + } + }, + { + "prompt_en": "an elephant", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "elephant" + } + } + }, + { + "prompt_en": "a bear", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bear" + } + } + }, + { + "prompt_en": "a zebra", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "zebra" + } + } + }, + { + "prompt_en": "a giraffe", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "giraffe" + } + } + }, + { + "prompt_en": "a backpack", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "backpack" + } + } + }, + { + "prompt_en": "an umbrella", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "umbrella" + } + } + }, + { + 
"prompt_en": "a handbag", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "handbag" + } + } + }, + { + "prompt_en": "a tie", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "tie" + } + } + }, + { + "prompt_en": "a suitcase", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "suitcase" + } + } + }, + { + "prompt_en": "a frisbee", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "frisbee" + } + } + }, + { + "prompt_en": "skis", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "skis" + } + } + }, + { + "prompt_en": "a snowboard", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "snowboard" + } + } + }, + { + "prompt_en": "a sports ball", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "sports ball" + } + } + }, + { + "prompt_en": "a kite", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "kite" + } + } + }, + { + "prompt_en": "a baseball bat", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "baseball bat" + } + } + }, + { + "prompt_en": "a baseball glove", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "baseball glove" + } + } + }, + { + "prompt_en": "a skateboard", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "skateboard" + } + } + }, + { + "prompt_en": "a surfboard", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "surfboard" + } + } + }, + { + "prompt_en": "a tennis racket", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "tennis racket" + } + } + }, + { + "prompt_en": "a bottle", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bottle" + } + } + }, + { + "prompt_en": "a wine glass", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "wine glass" + } + } + }, + { + "prompt_en": "a cup", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "cup" + } + } + }, + { + "prompt_en": "a fork", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "fork" + } + } + }, + { + "prompt_en": "a knife", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "knife" + } + } + }, + { + "prompt_en": "a spoon", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "spoon" + } + } + }, + { + "prompt_en": "a bowl", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bowl" + } + } + }, + { + "prompt_en": "a banana", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "banana" + } + } + }, + { + "prompt_en": "an apple", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "apple" + } + } + }, + { + "prompt_en": "a sandwich", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "sandwich" + } + } + }, + { + "prompt_en": "an orange", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "orange" + } + } + 
}, + { + "prompt_en": "broccoli", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "broccoli" + } + } + }, + { + "prompt_en": "a carrot", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "carrot" + } + } + }, + { + "prompt_en": "a hot dog", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "hot dog" + } + } + }, + { + "prompt_en": "a pizza", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "pizza" + } + } + }, + { + "prompt_en": "a donut", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "donut" + } + } + }, + { + "prompt_en": "a cake", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "cake" + } + } + }, + { + "prompt_en": "a chair", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "chair" + } + } + }, + { + "prompt_en": "a couch", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "couch" + } + } + }, + { + "prompt_en": "a potted plant", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "potted plant" + } + } + }, + { + "prompt_en": "a bed", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "bed" + } + } + }, + { + "prompt_en": "a dining table", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "dining table" + } + } + }, + { + "prompt_en": "a toilet", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "toilet" + } + } + }, + { + "prompt_en": "a tv", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "tv" + } + } + }, + { + "prompt_en": "a laptop", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "laptop" + } + } + }, + { + "prompt_en": "a remote", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "remote" + } + } + }, + { + "prompt_en": "a keyboard", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "keyboard" + } + } + }, + { + "prompt_en": "a cell phone", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "cell phone" + } + } + }, + { + "prompt_en": "a microwave", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "microwave" + } + } + }, + { + "prompt_en": "an oven", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "oven" + } + } + }, + { + "prompt_en": "a toaster", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "toaster" + } + } + }, + { + "prompt_en": "a sink", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "sink" + } + } + }, + { + "prompt_en": "a refrigerator", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "refrigerator" + } + } + }, + { + "prompt_en": "a book", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "book" + } + } + }, + { + "prompt_en": "a clock", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "clock" + } + } + }, + { + "prompt_en": "a 
vase", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "vase" + } + } + }, + { + "prompt_en": "scissors", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "scissors" + } + } + }, + { + "prompt_en": "a teddy bear", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "teddy bear" + } + } + }, + { + "prompt_en": "a hair drier", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "hair drier" + } + } + }, + { + "prompt_en": "a toothbrush", + "dimension": [ + "object_class" + ], + "auxiliary_info": { + "object_class": { + "object": "toothbrush" + } + } + }, + { + "prompt_en": "a red bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white bicycle", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white car", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, 
+ { + "prompt_en": "a yellow bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white bird", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a black cat", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white cat", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "an orange cat", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a yellow cat", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "a red umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white umbrella", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + 
{ + "prompt_en": "a black suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white suitcase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white bowl", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white chair", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink clock", + "dimension": 
[ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white clock", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "a red vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "red" + } + } + }, + { + "prompt_en": "a green vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "green" + } + } + }, + { + "prompt_en": "a blue vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "blue" + } + } + }, + { + "prompt_en": "a yellow vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "yellow" + } + } + }, + { + "prompt_en": "an orange vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "orange" + } + } + }, + { + "prompt_en": "a purple vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "purple" + } + } + }, + { + "prompt_en": "a pink vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "pink" + } + } + }, + { + "prompt_en": "a black vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "black" + } + } + }, + { + "prompt_en": "a white vase", + "dimension": [ + "color" + ], + "auxiliary_info": { + "color": { + "color": "white" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in 
spring, waves lapping on sand, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "The bund Shanghai, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "The bund Shanghai, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "The bund Shanghai by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "The bund Shanghai, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "The bund Shanghai, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "The bund Shanghai, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "The bund Shanghai, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "The bund Shanghai, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "The bund Shanghai, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "a 
shark is swimming in the ocean, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "a shark is swimming in the ocean, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + 
"appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "Gwen Stacy reading a book, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + 
"auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + 
"appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "An astronaut flying in space, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "An astronaut flying in space, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "An astronaut flying in space by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "An astronaut flying in space, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "An astronaut flying in space, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "An astronaut flying in space, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "An astronaut flying in space, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "An astronaut flying in space, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "An astronaut flying in space, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, Van Gogh style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "Van Gogh style" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, oil painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "oil painting" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks by Hokusai, in the style of Ukiyo", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "by Hokusai, in the style of Ukiyo" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, black and white", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "black and white" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pixel art", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "pixel art" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in cyberpunk style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "in cyberpunk style" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, animated style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "animated style" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, watercolor painting", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "watercolor painting" + } + } + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, surrealism style", + "dimension": [ + "appearance_style" + ], + "auxiliary_info": { + "appearance_style": { + "appearance_style": "surrealism style" + } + } + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "The bund Shanghai, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, with an intense shaking effect", 
+ "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, tilt down", + "dimension": [ + 
"temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, featuring a 
steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "An astronaut flying in space, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in super slow motion", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom in", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom out", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan left", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan right", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt up", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt down", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, with an intense shaking effect", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, featuring a steady and smooth perspective", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, racking focus", + "dimension": [ + "temporal_style" + ] + }, + { + "prompt_en": "Close up of grapes on a rotating table.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Turtle swimming in ocean.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A storm trooper vacuuming the beach.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A panda standing on a surfboard in the ocean in sunset.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An astronaut feeding ducks on a sunny afternoon, reflection from the water.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Two pandas discussing an academic paper.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Sunset time lapse at the beach with moving clouds and colors in the sky.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A fat rabbit wearing a purple robe walking through a fantasy landscape.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A koala bear playing piano in the forest.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An astronaut flying in space.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Fireworks.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An animated painting of fluffy white clouds moving in sky.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Flying through fantasy landscapes.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A bigfoot walking in the snowstorm.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A squirrel eating a burger.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cat wearing sunglasses and working as a lifeguard at a pool.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Splash of turquoise water in extreme slow motion, alpha channel included.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "an ice cream is melting on the table.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "a drone flying over a snowy forest.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "a shark is swimming in the ocean.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Aerial panoramic video from a drone of a fantasy land.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "a teddy bear is swimming in the ocean.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "time lapse of sunrise on Mars.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "golden fish swimming in the ocean.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An artist brush painting on a canvas close up.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A drone view of celebration with Christmas tree and fireworks, starry sky - background.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "happy dog wearing a yellow turtleneck, studio, portrait, facing camera, dark background", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Origami dancers in white paper, 3D render, on white background, studio shot, dancing modern dance.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Campfire at night in a snowy forest with starry sky in the background.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "a fantasy landscape", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A 3D model of an 1800s Victorian house.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "this is how I do makeup in the morning.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A raccoon that looks like a turtle, digital art.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Robot dancing in Times Square.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Busy freeway at night.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Balloon full of water exploding in extreme slow motion.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An astronaut is riding a horse in the space in a photorealistic style.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + },
+ { + "prompt_en": "Macro slo-mo. Slow motion cropped closeup of roasted coffee beans falling into an empty bowl.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Sewing machine, old sewing machine working.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Motion colour drop in water, ink swirling in water, colourful ink in water, abstraction fancy dream cloud of ink.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Few big purple plums rotating on the turntable. water drops appear on the skin during rotation. isolated on the white background. close-up. macro.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Vampire makeup face of beautiful girl, red contact lenses.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Ashtray full of butts on table, smoke flowing on black background, close-up", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Pacific coast, Carmel by the Sea ocean and waves.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A teddy bear is playing drum kit in NYC Times Square.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A corgi is playing drum kit.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An Iron Man is playing the electronic guitar, high electronic guitar.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A raccoon is playing the electronic guitar.", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A corgi's head depicted as an explosion of a nebula", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A fantasy landscape", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A future where humans have achieved teleportation technology", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A jellyfish floating through the ocean, with bioluminescent tentacles", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A Mars rover moving on Mars", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A panda drinking coffee in a cafe in Paris", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A space shuttle launching into orbit, with flames and smoke billowing out from the engines", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + },
+ { + "prompt_en": "A steam train moving on a mountainside", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A super cool giant robot in Cyberpunk Beijing", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Cinematic shot of Van Gogh's selfie, Van Gogh style", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Gwen Stacy reading a book", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Iron Man flying in the sky", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "The bund Shanghai, oil painting", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Yoda playing guitar on the stage", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A boat sailing leisurely along the Seine River with the Eiffel Tower in background", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A car moving slowly on an empty street, rainy evening", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cat eating food out of a bowl", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cat wearing sunglasses at a pool", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A confused panda in calculus class", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cute fluffy panda eating Chinese food in a restaurant", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cute happy Corgi playing in park, sunset", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A cute raccoon playing guitar in a boat on the ocean", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Lightning striking atop the Eiffel Tower, dark clouds in the sky", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A modern art museum, with colorful paintings", + "dimension": [ + "overall_consistency", + "aesthetic_quality", +
"imaging_quality" + ] + }, + { + "prompt_en": "A panda cooking in the kitchen", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A panda playing on a swing set", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A polar bear is playing guitar", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A raccoon dressed in suit playing the trumpet, stage background", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A shark swimming in clear Caribbean ocean", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A super robot protecting city", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "A teddy bear washing the dishes", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An epic tornado attacking above a glowing city at night, the tornado is made of smoke", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "An oil painting of a couple in formal evening wear going home get caught in a heavy downpour with umbrellas", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Clown fish swimming through the coral reef", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Hyper-realistic spaceship landing on Mars", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "The bund Shanghai, vibrant color", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Vincent van Gogh is painting in the room", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "Yellow flowers swing in the wind", + "dimension": [ + "overall_consistency", + "aesthetic_quality", + "imaging_quality" + ] + }, + { + "prompt_en": "alley", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "alley" + } + } + } + }, + { + "prompt_en": "amusement park", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "amusement park" + } + } + } + }, + { + "prompt_en": "aquarium", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "aquarium" + } + } + } + }, + { + "prompt_en": "arch", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "arch" + } + } + } + }, + { + "prompt_en": "art gallery", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "art gallery" + } + } + } + }, + { + "prompt_en": "bathroom", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + 
"scene": { + "scene": "bathroom" + } + } + } + }, + { + "prompt_en": "bakery shop", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "bakery shop" + } + } + } + }, + { + "prompt_en": "ballroom", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "ballroom" + } + } + } + }, + { + "prompt_en": "bar", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "bar" + } + } + } + }, + { + "prompt_en": "barn", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "barn" + } + } + } + }, + { + "prompt_en": "basement", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "basement" + } + } + } + }, + { + "prompt_en": "beach", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "beach" + } + } + } + }, + { + "prompt_en": "bedroom", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "bedroom" + } + } + } + }, + { + "prompt_en": "bridge", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "bridge" + } + } + } + }, + { + "prompt_en": "botanical garden", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "botanical garden" + } + } + } + }, + { + "prompt_en": "cafeteria", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "cafeteria" + } + } + } + }, + { + "prompt_en": "campsite", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "campsite" + } + } + } + }, + { + "prompt_en": "campus", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "campus" + } + } + } + }, + { + "prompt_en": "carrousel", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "carrousel" + } + } + } + }, + { + "prompt_en": "castle", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "castle" + } + } + } + }, + { + "prompt_en": "cemetery", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "cemetery" + } + } + } + }, + { + "prompt_en": "classroom", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "classroom" + } + } + } + }, + { + "prompt_en": "cliff", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "cliff" + } + } + } + }, + { + "prompt_en": "crosswalk", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "crosswalk" + } + } + } + }, + { + "prompt_en": "construction site", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "construction site" + } + } + } + }, + { + "prompt_en": "corridor", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + 
"scene": { + "scene": { + "scene": "corridor" + } + } + } + }, + { + "prompt_en": "courtyard", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "courtyard" + } + } + } + }, + { + "prompt_en": "desert", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "desert" + } + } + } + }, + { + "prompt_en": "downtown", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "downtown" + } + } + } + }, + { + "prompt_en": "driveway", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "driveway" + } + } + } + }, + { + "prompt_en": "farm", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "farm" + } + } + } + }, + { + "prompt_en": "food court", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "food court" + } + } + } + }, + { + "prompt_en": "football field", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "football field" + } + } + } + }, + { + "prompt_en": "forest road", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "forest road" + } + } + } + }, + { + "prompt_en": "fountain", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "fountain" + } + } + } + }, + { + "prompt_en": "gas station", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "gas station" + } + } + } + }, + { + "prompt_en": "glacier", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "glacier" + } + } + } + }, + { + "prompt_en": "golf course", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "golf course" + } + } + } + }, + { + "prompt_en": "indoor gymnasium", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "indoor gymnasium" + } + } + } + }, + { + "prompt_en": "harbor", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "harbor" + } + } + } + }, + { + "prompt_en": "highway", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "highway" + } + } + } + }, + { + "prompt_en": "hospital", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "hospital" + } + } + } + }, + { + "prompt_en": "house", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "house" + } + } + } + }, + { + "prompt_en": "iceberg", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "iceberg" + } + } + } + }, + { + "prompt_en": "industrial area", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "industrial area" + } + } + } + }, + { + "prompt_en": "jail cell", + "dimension": [ + "scene", + 
"background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "jail cell" + } + } + } + }, + { + "prompt_en": "junkyard", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "junkyard" + } + } + } + }, + { + "prompt_en": "kitchen", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "kitchen" + } + } + } + }, + { + "prompt_en": "indoor library", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "indoor library" + } + } + } + }, + { + "prompt_en": "lighthouse", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "lighthouse" + } + } + } + }, + { + "prompt_en": "laboratory", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "laboratory" + } + } + } + }, + { + "prompt_en": "mansion", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "mansion" + } + } + } + }, + { + "prompt_en": "marsh", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "marsh" + } + } + } + }, + { + "prompt_en": "mountain", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "mountain" + } + } + } + }, + { + "prompt_en": "indoor movie theater", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "indoor movie theater" + } + } + } + }, + { + "prompt_en": "indoor museum", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "indoor museum" + } + } + } + }, + { + "prompt_en": "music studio", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "music studio" + } + } + } + }, + { + "prompt_en": "nursery", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "nursery" + } + } + } + }, + { + "prompt_en": "ocean", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "ocean" + } + } + } + }, + { + "prompt_en": "office", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "office" + } + } + } + }, + { + "prompt_en": "palace", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "palace" + } + } + } + }, + { + "prompt_en": "parking lot", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "parking lot" + } + } + } + }, + { + "prompt_en": "pharmacy", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "pharmacy" + } + } + } + }, + { + "prompt_en": "phone booth", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "phone booth" + } + } + } + }, + { + "prompt_en": "raceway", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "raceway" + } + } + } + }, + { + 
"prompt_en": "restaurant", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "restaurant" + } + } + } + }, + { + "prompt_en": "river", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "river" + } + } + } + }, + { + "prompt_en": "science museum", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "science museum" + } + } + } + }, + { + "prompt_en": "shower", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "shower" + } + } + } + }, + { + "prompt_en": "ski slope", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "ski slope" + } + } + } + }, + { + "prompt_en": "sky", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "sky" + } + } + } + }, + { + "prompt_en": "skyscraper", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "skyscraper" + } + } + } + }, + { + "prompt_en": "baseball stadium", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "baseball stadium" + } + } + } + }, + { + "prompt_en": "staircase", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "staircase" + } + } + } + }, + { + "prompt_en": "street", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "street" + } + } + } + }, + { + "prompt_en": "supermarket", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "supermarket" + } + } + } + }, + { + "prompt_en": "indoor swimming pool", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "indoor swimming pool" + } + } + } + }, + { + "prompt_en": "tower", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "tower" + } + } + } + }, + { + "prompt_en": "outdoor track", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "outdoor track" + } + } + } + }, + { + "prompt_en": "train railway", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "train railway" + } + } + } + }, + { + "prompt_en": "train station platform", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "train station platform" + } + } + } + }, + { + "prompt_en": "underwater coral reef", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "underwater coral reef" + } + } + } + }, + { + "prompt_en": "valley", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "valley" + } + } + } + }, + { + "prompt_en": "volcano", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "volcano" + } + } + } + }, + { + "prompt_en": "waterfall", + "dimension": [ + "scene", + 
"background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "waterfall" + } + } + } + }, + { + "prompt_en": "windmill", + "dimension": [ + "scene", + "background_consistency" + ], + "auxiliary_info": { + "scene": { + "scene": { + "scene": "windmill" + } + } + } + }, + { + "prompt_en": "a bicycle on the left of a car, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bicycle", + "object_b": "car", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a car on the right of a motorcycle, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "car", + "object_b": "motorcycle", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a motorcycle on the left of a bus, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "motorcycle", + "object_b": "bus", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a bus on the right of a traffic light, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bus", + "object_b": "traffic light", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a traffic light on the left of a fire hydrant, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "traffic light", + "object_b": "fire hydrant", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a fire hydrant on the right of a stop sign, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "fire hydrant", + "object_b": "stop sign", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a stop sign on the left of a parking meter, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "stop sign", + "object_b": "parking meter", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a parking meter on the right of a bench, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "parking meter", + "object_b": "bench", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a bench on the left of a truck, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bench", + "object_b": "truck", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a truck on the right of a bicycle, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "truck", + "object_b": "bicycle", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a bird on the left of a cat, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bird", + "object_b": "cat", + "relationship": 
"on the left of" + } + } + } + }, + { + "prompt_en": "a cat on the right of a dog, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "cat", + "object_b": "dog", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a dog on the left of a horse, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "dog", + "object_b": "horse", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a horse on the right of a sheep, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "horse", + "object_b": "sheep", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a sheep on the left of a cow, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "sheep", + "object_b": "cow", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a cow on the right of an elephant, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "cow", + "object_b": "elephant", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "an elephant on the left of a bear, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "elephant", + "object_b": "bear", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a bear on the right of a zebra, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bear", + "object_b": "zebra", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a zebra on the left of a giraffe, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "zebra", + "object_b": "giraffe", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a giraffe on the right of a bird, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "giraffe", + "object_b": "bird", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a bottle on the left of a wine glass, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bottle", + "object_b": "wine glass", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a wine glass on the right of a cup, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "wine glass", + "object_b": "cup", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a cup on the left of a fork, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "cup", + "object_b": "fork", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a fork on the right of a 
knife, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "fork", + "object_b": "knife", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a knife on the left of a spoon, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "knife", + "object_b": "spoon", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a spoon on the right of a bowl, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "spoon", + "object_b": "bowl", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a bowl on the left of a bottle, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bowl", + "object_b": "bottle", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a potted plant on the left of a remote, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "potted plant", + "object_b": "remote", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a remote on the right of a clock, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "remote", + "object_b": "clock", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a clock on the left of a vase, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "clock", + "object_b": "vase", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a vase on the right of scissors, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "vase", + "object_b": "scissors", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "scissors on the left of a teddy bear, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "scissors", + "object_b": "teddy bear", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a teddy bear on the right of a potted plant, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "teddy bear", + "object_b": "potted plant", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a frisbee on the left of a sports ball, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "frisbee", + "object_b": "sports ball", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a sports ball on the right of a baseball bat, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "sports ball", + "object_b": "baseball bat", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a baseball bat on 
the left of a baseball glove, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "baseball bat", + "object_b": "baseball glove", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a baseball glove on the right of a tennis racket, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "baseball glove", + "object_b": "tennis racket", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a tennis racket on the left of a frisbee, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "tennis racket", + "object_b": "frisbee", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a toilet on the left of a hair drier, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "toilet", + "object_b": "hair drier", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a hair drier on the right of a toothbrush, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "hair drier", + "object_b": "toothbrush", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a toothbrush on the left of a sink, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "toothbrush", + "object_b": "sink", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a sink on the right of a toilet, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "sink", + "object_b": "toilet", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a chair on the left of a couch, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "chair", + "object_b": "couch", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a couch on the right of a bed, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "couch", + "object_b": "bed", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a bed on the left of a tv, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "bed", + "object_b": "tv", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a tv on the right of a dining table, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "tv", + "object_b": "dining table", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a dining table on the left of a chair, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "dining table", + "object_b": "chair", + "relationship": "on the left of" + } + } + 
} + }, + { + "prompt_en": "an airplane on the left of a train, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "airplane", + "object_b": "train", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "a train on the right of a boat, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "train", + "object_b": "boat", + "relationship": "on the right of" + } + } + } + }, + { + "prompt_en": "a boat on the left of an airplane, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "boat", + "object_b": "airplane", + "relationship": "on the left of" + } + } + } + }, + { + "prompt_en": "an oven on the top of a toaster, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "oven", + "object_b": "toaster", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "an oven on the bottom of a toaster, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "oven", + "object_b": "toaster", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a toaster on the top of a microwave, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "toaster", + "object_b": "microwave", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a toaster on the bottom of a microwave, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "toaster", + "object_b": "microwave", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a microwave on the top of an oven, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "microwave", + "object_b": "oven", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a microwave on the bottom of an oven, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "microwave", + "object_b": "oven", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a banana on the top of an apple, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "banana", + "object_b": "apple", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a banana on the bottom of an apple, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "banana", + "object_b": "apple", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "an apple on the top of a sandwich, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "apple", + "object_b": "sandwich", + "relationship": "on the top of" + } + } + } + }, + { + 
"prompt_en": "an apple on the bottom of a sandwich, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "apple", + "object_b": "sandwich", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a sandwich on the top of an orange, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "sandwich", + "object_b": "orange", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a sandwich on the bottom of an orange, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "sandwich", + "object_b": "orange", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "an orange on the top of a carrot, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "orange", + "object_b": "carrot", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "an orange on the bottom of a carrot, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "orange", + "object_b": "carrot", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a carrot on the top of a hot dog, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "carrot", + "object_b": "hot dog", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a carrot on the bottom of a hot dog, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "carrot", + "object_b": "hot dog", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a hot dog on the top of a pizza, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "hot dog", + "object_b": "pizza", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a hot dog on the bottom of a pizza, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "hot dog", + "object_b": "pizza", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a pizza on the top of a donut, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "pizza", + "object_b": "donut", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a pizza on the bottom of a donut, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "pizza", + "object_b": "donut", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a donut on the top of broccoli, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "donut", + "object_b": "broccoli", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a donut on 
the bottom of broccoli, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "donut", + "object_b": "broccoli", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "broccoli on the top of a banana, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "broccoli", + "object_b": "banana", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "broccoli on the bottom of a banana, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "broccoli", + "object_b": "banana", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "skis on the top of a snowboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "skis", + "object_b": "snowboard", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "skis on the bottom of a snowboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "skis", + "object_b": "snowboard", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a snowboard on the top of a kite, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "snowboard", + "object_b": "kite", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a snowboard on the bottom of a kite, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "snowboard", + "object_b": "kite", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a kite on the top of a skateboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "kite", + "object_b": "skateboard", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a kite on the bottom of a skateboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "kite", + "object_b": "skateboard", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a skateboard on the top of a surfboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "skateboard", + "object_b": "surfboard", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a skateboard on the bottom of a surfboard, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "skateboard", + "object_b": "surfboard", + "relationship": "on the bottom of" + } + } + } + }, + { + "prompt_en": "a surfboard on the top of skis, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "surfboard", + "object_b": "skis", + "relationship": "on the top of" + } + } + } + }, + { + "prompt_en": "a 
surfboard on the bottom of skis, front view", + "dimension": [ + "spatial_relationship" + ], + "auxiliary_info": { + "spatial_relationship": { + "spatial_relationship": { + "object_a": "surfboard", + "object_b": "skis", + "relationship": "on the bottom of" + } + } + } + } +] \ No newline at end of file diff --git a/vbench/color.py b/vbench/color.py index 9ba24c5e..21b45f99 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -33,7 +33,9 @@ def check_generate(color_key, object_key, predictions): object_flag, color_flag = False, False for pred in frame_pred: if object_key == pred[1]: - object_flag =True + for color_query in ["white","red","pink","blue","silver","purple","orange","green","gray","yellow","black","grey"]: + if color_query in pred[0]: + object_flag =True if color_key in pred[0]: color_flag = True if color_flag: diff --git a/vbench/scene.py b/vbench/scene.py index 640cfbb1..ba2e76a4 100755 --- a/vbench/scene.py +++ b/vbench/scene.py @@ -41,7 +41,6 @@ def scene(model, video_dict, device): success_frame_count += cur_success_frame_count frame_count += len(cur_video_pred) video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate}) - print(video_results) success_rate = success_frame_count / frame_count return success_rate, video_results From 60984726644adc4b91162ece13313c5931e8b4ef Mon Sep 17 00:00:00 2001 From: yinanhe Date: Fri, 1 Dec 2023 15:41:49 +0800 Subject: [PATCH 024/248] [fix] scene --- vbench/scene.py | 11 +++++++---- vbench/utils.py | 16 ++++++++++------ 2 files changed, 17 insertions(+), 10 deletions(-) diff --git a/vbench/scene.py b/vbench/scene.py index ba2e76a4..44905a8c 100755 --- a/vbench/scene.py +++ b/vbench/scene.py @@ -4,7 +4,7 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info, dino_transform +from .utils import load_video, load_dimension_info, tag2text_transform from .third_party.tag2Text.tag2text import tag2text_caption import logging @@ -27,14 +27,17 @@ def check_generate(key_info, predictions): def scene(model, video_dict, device): success_frame_count, frame_count = 0,0 video_results = [] - transform = dino_transform(384) + transform = tag2text_transform(384) for info in tqdm(video_dict): if 'auxiliary_info' not in info: raise "Auxiliary info is not in json, please check your json." 
scene_info = info['auxiliary_info']['scene'] for video_path in info['video_list']: - video_tensor = load_video(video_path) - video_tensor = transform(video_tensor).to(device) + video_array = load_video(video_path, return_tensor=False) + video_tensor_list = [] + for i in video_array: + video_tensor_list.append(transform(i).to(device).unsqueeze(0)) + video_tensor = torch.cat(video_tensor_list) cur_video_pred = get_caption(model, video_tensor) cur_success_frame_count = check_generate(scene_info, cur_video_pred) cur_success_frame_rate = cur_success_frame_count/len(cur_video_pred) diff --git a/vbench/utils.py b/vbench/utils.py index 194139a5..9471c981 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -34,6 +34,11 @@ def dino_transform(n_px): Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)) ]) +def tag2text_transform(n_px): + normalize = Normalize(mean=[0.485, 0.456, 0.406], + std=[0.229, 0.224, 0.225]) + return Compose([ToPILImage(),Resize((n_px, n_px)),ToTensor(),normalize]) + def get_frame_indices(num_frames, vlen, sample='rand', fix_start=None, input_fps=1, max_num_frames=-1): if sample in ["rand", "middle"]: # uniform sampling acc_samples = min(num_frames, vlen) @@ -74,7 +79,7 @@ def get_frame_indices(num_frames, vlen, sample='rand', fix_start=None, input_fps raise ValueError return frame_indices -def load_video(video_path, data_transform=None, num_frames=None): +def load_video(video_path, data_transform=None, num_frames=None, return_tensor=True): """ Load a video from a given path and apply optional data transformations. @@ -121,9 +126,7 @@ def load_video(video_path, data_transform=None, num_frames=None): return buffer elif video_path.endswith('.mp4'): video_reader = VideoReader(video_path, num_threads=1) - vlen = len(video_reader) - frame_indices = np.linspace(0, vlen-1, vlen) - frames = video_reader.get_batch(frame_indices) # (T, H, W, C), torch.uint8 + frames = video_reader.get_batch(range(len(video_reader))) # (T, H, W, C), torch.uint8 buffer = frames.asnumpy().astype(np.uint8) if data_transform: buffer = data_transform(buffer) @@ -131,13 +134,14 @@ def load_video(video_path, data_transform=None, num_frames=None): else: raise NotImplementedError frames = buffer - frames = torch.Tensor(buffer) + if return_tensor: + frames = torch.Tensor(buffer) + frames = frames.permute(0, 3, 1, 2) # (T, C, H, W), torch.uint8 if num_frames: frame_indices = get_frame_indices( num_frames, len(frames), sample="middle" ) frames = frames[frame_indices] - frames = frames.permute(0, 3, 1, 2) # (T, C, H, W), torch.uint8 return frames def load_dimension_info(json_dir, dimension, lang): From f0245249879e2f746dd2c7360f8979e193867137 Mon Sep 17 00:00:00 2001 From: yinanhe Date: Fri, 1 Dec 2023 15:52:43 +0800 Subject: [PATCH 025/248] [fix] subject consistency --- vbench/utils.py | 1 + 1 file changed, 1 insertion(+) diff --git a/vbench/utils.py b/vbench/utils.py index 9471c981..5d7ba105 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -31,6 +31,7 @@ def clip_transform(n_px): def dino_transform(n_px): return Compose([ Resize(size=n_px), + transforms.Lambda(lambda x: x.float().div(255.0)), Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)) ]) From 8f3ed6f40f06fb48ec397d8f359f76fa383937e1 Mon Sep 17 00:00:00 2001 From: yinanhe Date: Fri, 1 Dec 2023 16:03:27 +0800 Subject: [PATCH 026/248] [fix] consistency --- vbench/overall_consistency.py | 1 - vbench/temporal_style.py | 1 - 2 files changed, 2 deletions(-) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index 
46b514e5..9be760e5 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -51,7 +51,6 @@ def overall_consistency(clip_model, video_dict, tokenizer, device): score_per_video = float(logit_per_text[0][0].cpu()) sim.append(score_per_video) video_results.append({'video_path': video_path, 'video_results': score_per_video}) - print(video_results) avg_score = np.mean(sim) return avg_score, video_results diff --git a/vbench/temporal_style.py b/vbench/temporal_style.py index 4b9a4774..630996af 100755 --- a/vbench/temporal_style.py +++ b/vbench/temporal_style.py @@ -51,7 +51,6 @@ def temporal_style(clip_model, video_dict, tokenizer, device): score_per_video = float(logit_per_text[0][0].cpu()) sim.append(score_per_video) video_results.append({'video_path': video_path, 'video_results': score_per_video}) - print(video_results) avg_score = np.mean(sim) return avg_score, video_results From eed61b4e864f455545f149d6828e184e7f3df1f3 Mon Sep 17 00:00:00 2001 From: yinanhe Date: Mon, 4 Dec 2023 19:29:11 +0800 Subject: [PATCH 027/248] [fix] overall consistency --- vbench/overall_consistency.py | 4 ++-- vbench/utils.py | 23 +++++++++++++++++++++++ 2 files changed, 25 insertions(+), 2 deletions(-) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index 9be760e5..be3b7ffd 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -8,7 +8,7 @@ from tqdm import tqdm from .third_party.ViCLIP.viclip import ViCLIP from .third_party.ViCLIP.simple_tokenizer import SimpleTokenizer -from .utils import load_video, load_dimension_info, clip_transform +from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -42,7 +42,7 @@ def overall_consistency(clip_model, video_dict, tokenizer, device): for video_path in video_list: cur_video = [] with torch.no_grad(): - images = load_video(video_path, num_frames=8) + images = read_frames_decord_by_fps(video_path, num_frames=8) images = image_transform(images) images = images.to(device) clip_feat = get_vid_features(clip_model,images.unsqueeze(0)) diff --git a/vbench/utils.py b/vbench/utils.py index 5d7ba105..4b9504f2 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -145,6 +145,29 @@ def load_video(video_path, data_transform=None, num_frames=None, return_tensor=T frames = frames[frame_indices] return frames +def read_frames_decord_by_fps( + video_path, sample_fps=2, sample='rand', fix_start=None, + max_num_frames=-1, trimmed30=False, num_frames=8 + ): + import decord + decord.bridge.set_bridge("torch") + video_reader = VideoReader(video_path, num_threads=1) + vlen = len(video_reader) + fps = video_reader.get_avg_fps() + duration = vlen / float(fps) + + if trimmed30 and duration > 30: + duration = 30 + vlen = int(30 * float(fps)) + + frame_indices = get_frame_indices( + num_frames, vlen, sample=sample, fix_start=fix_start, + input_fps=fps, max_num_frames=max_num_frames + ) + frames = video_reader.get_batch(frame_indices) # (T, H, W, C), torch.uint8 + frames = frames.permute(0, 3, 1, 2) # (T, C, H, W), torch.uint8 + return frames + def load_dimension_info(json_dir, dimension, lang): """ Load video list and prompt information based on a specified dimension and language from a JSON file. 
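A note on the sampling pattern introduced above: once decord's bridge is set to "torch", `VideoReader.get_batch` returns a uint8 torch tensor directly, and `sample="middle"` picks roughly the midpoint of `num_frames` equal intervals. The following is a minimal standalone sketch of that pattern, not part of the patch series; the `sample_middle_frames` name and the `video.mp4` path are placeholders for illustration.

    import decord
    import numpy as np

    decord.bridge.set_bridge("torch")  # get_batch then returns torch tensors directly

    def sample_middle_frames(video_path, num_frames=8):
        # Illustrative stand-in for read_frames_decord_by_fps(..., sample="middle").
        reader = decord.VideoReader(video_path, num_threads=1)
        # Split the clip into num_frames equal intervals and take each midpoint.
        edges = np.linspace(0, len(reader), num_frames + 1).astype(int)
        indices = [(lo + hi) // 2 for lo, hi in zip(edges[:-1], edges[1:])]
        frames = reader.get_batch(indices)  # (T, H, W, C), torch.uint8
        return frames.permute(0, 3, 1, 2)   # (T, C, H, W), matching the helper

    print(sample_middle_frames("video.mp4").shape)  # "video.mp4" is a placeholder path

Sampling a fixed eight frames keeps the ViCLIP input shape constant regardless of clip length, which is what lets this patch switch `overall_consistency` from `load_video` to the new helper without touching the model code.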
From 9ae22f55631b4c233c35def7c641acc3b4b2e844 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Mon, 4 Dec 2023 20:04:52 +0800 Subject: [PATCH 028/248] [update] load checkpoint options --- .gitignore | 2 ++ evaluate.py | 3 +++ vbench/__init__.py | 2 ++ vbench/utils.py | 25 ++++++++++++------------- 4 files changed, 19 insertions(+), 13 deletions(-) diff --git a/.gitignore b/.gitignore index 6f07a4d4..51794472 100755 --- a/.gitignore +++ b/.gitignore @@ -165,3 +165,5 @@ cython_debug/ # development logs private_dev/* +evaluation_results/*.json +vbench/third_party/ViCLIP/bpe_simple_vocab_16e6.txt.gz diff --git a/evaluate.py b/evaluate.py index 29b0a8ab..3448d820 100644 --- a/evaluate.py +++ b/evaluate.py @@ -42,9 +42,12 @@ def parse_args(): def main(): args = parse_args() + print(f'args: {args}') device = torch.device("cuda") my_VBench = VBench(device, args.full_json_dir, args.output_path) + + print(f'start evaluation') my_VBench.evaluate( videos_path = args.videos_path, name = args.dimension, diff --git a/vbench/__init__.py b/vbench/__init__.py index a324ea08..37dcec56 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -41,6 +41,7 @@ def build_full_info_json(self, videos_path, name, dimension_list): prompt_dict['video_list'] = [os.path.join(videos_path, prompt+'_0033-'+str(i)+postfix) for i in range(5)] cur_full_info_path = os.path.join(self.output_path, name+'_full_info.json') save_json(cur_full_info_list, cur_full_info_path) + print(f'Evaluation meta data saved to {cur_full_info_path}') return cur_full_info_path def evaluate(self, videos_path, name, dimension_list=None, local=False): @@ -59,3 +60,4 @@ def evaluate(self, videos_path, name, dimension_list=None, local=False): results_dict[dimension] = results output_name = os.path.join(self.output_path, name+'_eval_results.json') save_json(results_dict, output_name) + print(f'Evaluation results saved to {output_name}') diff --git a/vbench/utils.py b/vbench/utils.py index 5d7ba105..53693e25 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -226,13 +226,13 @@ def init_submodules(dimension_list, local=False): submodules_dict[dimension] = { 'model':'pretrained/raft_model/models/raft-things.pth' } - if local: - details = submodules_dict[dimension] - if not os.path.isfile(details['model']): - print(f"File {details['model']} does not exist. Downloading...") - os.system(f'wget -P pretrained/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip') - os.system(f'unzip -d pretrained/raft_model/ pretrained/raft_model/models.zip') - os.system(f'rm -f pretrained/raft_model/models.zip') + details = submodules_dict[dimension] + if not os.path.isfile(details['model']): + raise NotImplementedError + print(f"File {details['model']} does not exist. 
Downloading...") + os.system(f'wget -P pretrained/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip') + os.system(f'unzip -d pretrained/raft_model/ pretrained/raft_model/models.zip') + os.system(f'rm -f pretrained/raft_model/models.zip') # Assign the DINO model path for subject consistency dimension elif dimension == 'subject_consistency': if local: @@ -264,22 +264,21 @@ def init_submodules(dimension_list, local=False): if local: vit_l_path = 'pretrained/clip_model/ViT-L-14.pt' if not os.path.isfile(vit_l_path): - os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -O {vit_l_path}') + os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -O {vit_l_path}') else: vit_l_path = 'ViT-L/14' submodules_dict[dimension] = [vit_l_path, aes_path] elif dimension == 'imaging_quality': musiq_spaq_path = 'pretrained/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' - if local: - if not os.path.isfile(musiq_spaq_path): - os.system(f'wget https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -O {musiq_spaq_path}') + if not os.path.isfile(musiq_spaq_path): + os.system(f'wget -q https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -O {musiq_spaq_path}') submodules_dict[dimension] = {'model_path': musiq_spaq_path} elif dimension in ["object_class", "multiple_objects", "color", "spatial_relationship" ]: submodules_dict[dimension] = { "model_weight": "pretrained/grit_model/grit_b_densecap_objectdet.pth" } if not os.path.exists(submodules_dict[dimension]['model_weight']): - os.system(f'wget https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -O {submodules_dict[dimension]["model_weight"]}') + os.system(f'wget -q https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -O {submodules_dict[dimension]["model_weight"]}') elif dimension == 'scene': submodules_dict[dimension] = { "pretrained": "pretrained/caption_model/tag2text_swin_14m.pth", @@ -300,7 +299,7 @@ def init_submodules(dimension_list, local=False): "pretrain": "pretrained/viclip_model/ViClip-InternVid-10M-FLT.pth", } if not os.path.exists(submodules_dict[dimension]['pretrain']): - os.system(f'wget -q --show-progress https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -O {submodules_dict[dimension]["pretrain"]}') + os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -O {submodules_dict[dimension]["pretrain"]}') return submodules_dict From b51b7ed2757593db08021356e21a28251af0ecea Mon Sep 17 00:00:00 2001 From: yinanhe Date: Mon, 4 Dec 2023 21:13:07 +0800 Subject: [PATCH 029/248] [fix] fix overall consis --- vbench/overall_consistency.py | 4 ++-- vbench/subject_consistency.py | 20 +++++++++++++++----- vbench/utils.py | 7 +++++++ 3 files changed, 24 insertions(+), 7 deletions(-) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index be3b7ffd..b229bf6b 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -31,7 +31,7 @@ def get_predict_label(clip_feature, text_feats_tensor, top=5): top_probs, top_labels = label_probs.cpu().topk(top, dim=-1) return top_probs, top_labels -def overall_consistency(clip_model, video_dict, tokenizer, 
device): +def overall_consistency(clip_model, video_dict, tokenizer, device, sample="middle"): sim = [] video_results = [] image_transform = clip_transform(224) @@ -42,7 +42,7 @@ def overall_consistency(clip_model, video_dict, tokenizer, device): for video_path in video_list: cur_video = [] with torch.no_grad(): - images = read_frames_decord_by_fps(video_path, num_frames=8) + images = read_frames_decord_by_fps(video_path, num_frames=8, sample=sample) images = image_transform(images) images = images.to(device) clip_feat = get_vid_features(clip_model,images.unsqueeze(0)) diff --git a/vbench/subject_consistency.py b/vbench/subject_consistency.py index 3f658a65..215a7f58 100755 --- a/vbench/subject_consistency.py +++ b/vbench/subject_consistency.py @@ -11,20 +11,30 @@ import torch.nn.functional as F import torchvision.transforms as transforms -from .utils import load_video, load_dimension_info, dino_transform +from .utils import load_video, load_dimension_info, dino_transform, dino_transform_Image import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') logger = logging.getLogger(__name__) -def subject_consistency(model, video_list, device): +def subject_consistency(model, video_list, device, read_frame=True): sim = 0.0 cnt = 0 video_results = [] - image_transform = dino_transform(224) + if read_frame: + image_transform = dino_transform_Image(224) + else: + image_transform = dino_transform(224) for video_path in tqdm(video_list): video_sim = 0.0 - images = load_video(video_path) - images = image_transform(images) + if read_frame: + video_path = video_path[:-4].replace('videos', 'frames') + tmp_paths = [os.path.join(video_path, f) for f in sorted(os.listdir(video_path))] + images = [] + for tmp_path in tmp_paths: + images.append(image_transform(Image.open(tmp_path))) + else: + images = load_video(video_path) + images = image_transform(images) for i in range(len(images)): with torch.no_grad(): image = images[i].unsqueeze(0) diff --git a/vbench/utils.py b/vbench/utils.py index 4b9504f2..8848b1f7 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -35,6 +35,13 @@ def dino_transform(n_px): Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)) ]) +def dino_transform_Image(n_px): + return Compose([ + Resize(size=n_px), + ToTensor(), + Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)) + ]) + def tag2text_transform(n_px): normalize = Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) From 6282768d97374d9094d7a25f627f9fef6334914d Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Tue, 5 Dec 2023 00:34:52 +0800 Subject: [PATCH 030/248] fix bug --- vbench/dynamic_degree.py | 12 ++++-------- 1 file changed, 4 insertions(+), 8 deletions(-) diff --git a/vbench/dynamic_degree.py b/vbench/dynamic_degree.py index 6393322b..d6ce3ae8 100644 --- a/vbench/dynamic_degree.py +++ b/vbench/dynamic_degree.py @@ -5,6 +5,7 @@ import numpy as np import torch from tqdm import tqdm +from easydict import EasyDict as edict from .utils import load_dimension_info @@ -131,17 +132,12 @@ def dynamic_degree(dynamic, video_list): return avg_score, video_results + def compute_dynamic_degree(json_dir, device, submodules_list): model_path = submodules_list["model"] # set_args - parser = argparse.ArgumentParser() - parser.add_argument('--model', type=str, default=model_path, help="restore checkpoint") - parser.add_argument('--small', action='store_true', help='use small model') - parser.add_argument('--mixed_precision', action='store_true', help='use 
mixed precision') - parser.add_argument('--alternate_corr', action='store_true', help='use efficent correlation implementation') - args = parser.parse_args() - - dynamic = DynamicDegree(args, device) + args_new = edict({"model":model_path, "small":False, "mixed_precision":False, "alternate_corr":False}) + dynamic = DynamicDegree(args_new, device) video_list, _ = load_dimension_info(json_dir, dimension='dynamic_degree', lang='en') all_results, video_results = dynamic_degree(dynamic, video_list) return all_results, video_results \ No newline at end of file From bceeed735f0f20cd3704af2a0519d94a20ba913b Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Wed, 6 Dec 2023 17:14:21 +0800 Subject: [PATCH 031/248] [update] read frame method --- evaluate.py | 9 ++++++++- vbench/__init__.py | 4 ++-- vbench/background_consistency.py | 2 +- vbench/subject_consistency.py | 3 ++- vbench/utils.py | 6 +++--- 5 files changed, 16 insertions(+), 8 deletions(-) diff --git a/evaluate.py b/evaluate.py index 3448d820..0fca38d1 100644 --- a/evaluate.py +++ b/evaluate.py @@ -34,7 +34,13 @@ def parse_args(): "--load_ckpt_from_local", type=bool, required=False, - help="whether load checkpoints from local default paths (assuming you have downloaded the checkpoints locally)", + help="whether load checkpoints from local default paths (assuming you have downloaded the checkpoints locally", + ) + parser.add_argument( + "--read_frame", + type=bool, + required=False, + help="whether directly read frames, or directly read videos", ) args = parser.parse_args() return args @@ -53,6 +59,7 @@ def main(): name = args.dimension, dimension_list = [args.dimension], local=args.load_ckpt_from_local, + read_frame=args.read_frame, ) print('done') diff --git a/vbench/__init__.py b/vbench/__init__.py index 37dcec56..6f27c269 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -44,11 +44,11 @@ def build_full_info_json(self, videos_path, name, dimension_list): print(f'Evaluation meta data saved to {cur_full_info_path}') return cur_full_info_path - def evaluate(self, videos_path, name, dimension_list=None, local=False): + def evaluate(self, videos_path, name, dimension_list=None, local=False, read_frame=False): results_dict = {} if dimension_list is None: dimension_list = self.build_full_dimension_list() - submodules_dict = init_submodules(dimension_list, local=local) + submodules_dict = init_submodules(dimension_list, local=local, read_frame=read_frame) cur_full_info_path = self.build_full_info_json(videos_path, name, dimension_list) for dimension in dimension_list: try: diff --git a/vbench/background_consistency.py b/vbench/background_consistency.py index 6940ffeb..94cfbbec 100755 --- a/vbench/background_consistency.py +++ b/vbench/background_consistency.py @@ -9,7 +9,7 @@ from .utils import load_video, load_dimension_info, clip_transform -def background_consistency(clip_model, preprocess, video_list, device, read_frame=False): +def background_consistency(clip_model, preprocess, video_list, device, read_frame): sim = 0.0 cnt = 0 video_results = [] diff --git a/vbench/subject_consistency.py b/vbench/subject_consistency.py index 215a7f58..a8ea2bb3 100755 --- a/vbench/subject_consistency.py +++ b/vbench/subject_consistency.py @@ -27,7 +27,8 @@ def subject_consistency(model, video_list, device, read_frame=True): for video_path in tqdm(video_list): video_sim = 0.0 if read_frame: - video_path = video_path[:-4].replace('videos', 'frames') + # video_path = video_path[:-4].replace('videos', 'frames') # TODO: original + video_path = 
video_path[:-4].replace('videos', 'frames').replace("yujiashuo/sample/sample_subject_consistency_20230927_027_0095000_128_512x512",'heyinan/aigc-eval/sample/origin_sample_subject_consistency_20230927_027_0095000_128_512x512') # TODO: DO NOT PUSH tmp_paths = [os.path.join(video_path, f) for f in sorted(os.listdir(video_path))] images = [] for tmp_path in tmp_paths: diff --git a/vbench/utils.py b/vbench/utils.py index 956e0392..87b3f1d3 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -209,13 +209,13 @@ def load_dimension_info(json_dir, dimension, lang): prompt_dict_ls += [{'prompt': prompt, 'video_list': cur_video_list}] return video_list, prompt_dict_ls -def init_submodules(dimension_list, local=False): +def init_submodules(dimension_list, local=False, read_frame=False): submodules_dict = {} if local: logger.info("\x1b[32m[Local Mode]\x1b[0m Working in local mode, please make sure that the pre-trained model has been fully downloaded.") for dimension in dimension_list: if dimension == 'background_consistency': - read_frame = False + # read_frame = False if local: vit_b_path = 'pretrained/clip_model/ViT-B-32.pt' if not os.path.isfile(vit_b_path): @@ -226,7 +226,7 @@ def init_submodules(dimension_list, local=False): elif dimension == 'human_action': umt_path = "pretrained/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth" if not os.path.isfile(umt_path): - os.system(f'wget -q --show-progress https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -O {umt_path}') + os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -O {umt_path}') submodules_dict[dimension] = [umt_path,] elif dimension == 'temporal_flickering': submodules_dict[dimension] = [] From f4ca516065c73fccf3e15abd1d9c3bf51fc5f1a8 Mon Sep 17 00:00:00 2001 From: yinanhe Date: Thu, 7 Dec 2023 15:22:37 +0800 Subject: [PATCH 032/248] [fix] some dimensions --- vbench/appearance_style.py | 16 +++++++++------- vbench/color.py | 23 +++++++++++------------ vbench/spatial_relationship.py | 2 +- vbench/temporal_style.py | 7 ++++--- vbench/utils.py | 8 ++++++++ 5 files changed, 33 insertions(+), 23 deletions(-) diff --git a/vbench/appearance_style.py b/vbench/appearance_style.py index e6ae21c5..e187e3a0 100755 --- a/vbench/appearance_style.py +++ b/vbench/appearance_style.py @@ -5,7 +5,8 @@ import torch import clip -from .utils import load_video, load_dimension_info, clip_transform +from PIL import Image +from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps, clip_transform_Image def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -28,11 +29,11 @@ def get_predict_label(clip_feature, text_feats_tensor, top=5): top_probs, top_labels = label_probs.cpu().topk(top, dim=-1) return top_probs, top_labels -def appearance_style(clip_model, video_dict, device): +def appearance_style(clip_model, video_dict, device, sample="rand"): sim = 0.0 cnt = 0 video_results = [] - image_transform = clip_transform(224) + image_transform = clip_transform_Image(224) for info in tqdm(video_dict): if 'auxiliary_info' not in info: raise "Auxiliary info is not in json, please check your json." 
@@ -42,17 +43,18 @@ def appearance_style(clip_model, video_dict, device): for video_path in video_list: cur_video = [] with torch.no_grad(): - images = load_video(video_path) - images = image_transform(images) - images = images.to(device) + video_arrays = load_video(video_path, return_tensor=False) + images = [Image.fromarray(i) for i in video_arrays] for image in images: + image = image_transform(image) + image = image.to(device) logits_per_image, logits_per_text = clip_model(image.unsqueeze(0), text) cur_sim = float(logits_per_text[0][0].cpu()) cur_video.append(cur_sim) sim += cur_sim cnt +=1 video_sim = np.mean(cur_video) - video_results.append({'video_path': video_path, 'video_results': video_sim}) + video_results.append({'video_path': video_path, 'video_results': video_sim, 'frame_results':cur_video}) sim_per_frame = sim / cnt return sim_per_frame, video_results diff --git a/vbench/color.py b/vbench/color.py index 21b45f99..dff022d2 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -4,7 +4,7 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info +from .utils import load_video, load_dimension_info, read_frames_decord_by_fps from .third_party.grit_model import DenseCaptioning import logging @@ -13,7 +13,7 @@ def get_dect_from_grit(model, image_arrays): pred = [] - if type(image_arrays) is not list: + if type(image_arrays) is not list and type(image_arrays) is not np.ndarray: image_arrays = image_arrays.numpy() with torch.no_grad(): for frame in image_arrays: @@ -23,7 +23,7 @@ def get_dect_from_grit(model, image_arrays): cur_pred.append(['','']) else: for idx, cap_det in enumerate(ret[0]): - cur_pred.append([cap_det[0], cap_det[2][idx]]) + cur_pred.append([cap_det[0], cap_det[2][0]]) pred.append(cur_pred) return pred @@ -45,7 +45,7 @@ def check_generate(color_key, object_key, predictions): return cur_object, cur_object_color def color(model, video_dict, device): - success_frame_count, frame_count = 0,0 + success_frame_count_all, video_count = 0, 0 video_results = [] for info in tqdm(video_dict): if 'auxiliary_info' not in info: @@ -55,17 +55,16 @@ def color(model, video_dict, device): object_info = info['prompt'] object_info = object_info.replace('a ','').replace('an ','').replace(color_info,'').strip() for video_path in info['video_list']: - video_tensor = load_video(video_path) - cur_video_pred = get_dect_from_grit(model, video_tensor.permute(0,2,3,1)) + video_arrays = load_video(video_path, return_tensor=False) + cur_video_pred = get_dect_from_grit(model ,video_arrays) cur_object, cur_object_color = check_generate(color_info, object_info, cur_video_pred) if cur_object>0: cur_success_frame_rate = cur_object_color/cur_object - else: - cur_success_frame_rate = 1. 
- success_frame_count += cur_object_color - frame_count += cur_object - video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate}) - success_rate = success_frame_count / frame_count + success_frame_count_all += cur_success_frame_rate + video_count += 1 + video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate}) + print({'video_path': video_path, 'video_results': cur_success_frame_rate}) + success_rate = success_frame_count_all / video_count return success_rate, video_results diff --git a/vbench/spatial_relationship.py b/vbench/spatial_relationship.py index 40796aef..898b59a1 100755 --- a/vbench/spatial_relationship.py +++ b/vbench/spatial_relationship.py @@ -116,7 +116,7 @@ def spatial_relationship(model, video_dict, device): cur_video_frame_score = check_generate(object_info, cur_video_pred) cur_success_frame_rate = np.mean(cur_video_frame_score) frame_score_overall.extend(cur_video_frame_score) - video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate}) + video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate, 'frame_results':cur_video_frame_score}) success_rate = np.mean(frame_score_overall) return success_rate, video_results diff --git a/vbench/temporal_style.py b/vbench/temporal_style.py index 630996af..c27f89b3 100755 --- a/vbench/temporal_style.py +++ b/vbench/temporal_style.py @@ -8,7 +8,7 @@ from tqdm import tqdm from .third_party.ViCLIP.viclip import ViCLIP from .third_party.ViCLIP.simple_tokenizer import SimpleTokenizer -from .utils import load_video, load_dimension_info, clip_transform +from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -31,7 +31,7 @@ def get_predict_label(clip_feature, text_feats_tensor, top=5): top_probs, top_labels = label_probs.cpu().topk(top, dim=-1) return top_probs, top_labels -def temporal_style(clip_model, video_dict, tokenizer, device): +def temporal_style(clip_model, video_dict, tokenizer, device, sample="middle"): sim = [] video_results = [] image_transform = clip_transform(224) @@ -42,7 +42,8 @@ def temporal_style(clip_model, video_dict, tokenizer, device): for video_path in video_list: cur_video = [] with torch.no_grad(): - images = load_video(video_path, num_frames=8) + # images = load_video(video_path, num_frames=8) + images = read_frames_decord_by_fps(video_path, num_frames=8, sample=sample) images = image_transform(images) images = images.to(device) clip_feat = get_vid_features(clip_model,images.unsqueeze(0)) diff --git a/vbench/utils.py b/vbench/utils.py index 87b3f1d3..63f03aaa 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -28,6 +28,14 @@ def clip_transform(n_px): Normalize((0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)), ]) +def clip_transform_Image(n_px): + return Compose([ + Resize(n_px, interpolation=BICUBIC), + CenterCrop(n_px), + ToTensor(), + Normalize((0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)), + ]) + def dino_transform(n_px): return Compose([ Resize(size=n_px), From 3d25cc5f12c63e0bc31b2fef2a57cbe4a0e0e00a Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Fri, 8 Dec 2023 15:10:11 +0800 Subject: [PATCH 033/248] [fix] human_action --- vbench/human_action.py | 3 ++- vbench/third_party/umt/models/modeling_finetune.py | 2 ++ 2 files changed, 4 insertions(+), 1 deletion(-) diff --git 
a/vbench/human_action.py b/vbench/human_action.py index 5c9cbd34..3b4d7867 100755 --- a/vbench/human_action.py +++ b/vbench/human_action.py @@ -16,6 +16,7 @@ from .third_party.umt.datasets.volume_transforms import ClipToTensor from timm.models import create_model from .third_party.umt.models import vit_large_patch16_224 +from tqdm import tqdm def build_dict(): path = 'pretrained/umt_model/kinetics_400_categroies.txt' @@ -61,7 +62,7 @@ def human_action(umt_path, video_list, device): cnt= 0 cor_num = 0 video_results = [] - for video_path in video_list: + for video_path in tqdm(video_list): video_label_ls = video_path.split('/')[-1].lower().split('-')[0].split("person is ")[-1].split('_')[0] cnt += 1 images = load_video(video_path, data_transform) diff --git a/vbench/third_party/umt/models/modeling_finetune.py b/vbench/third_party/umt/models/modeling_finetune.py index f47030d4..0c714073 100755 --- a/vbench/third_party/umt/models/modeling_finetune.py +++ b/vbench/third_party/umt/models/modeling_finetune.py @@ -346,6 +346,8 @@ def vit_base_patch16_384(pretrained=False, **kwargs): @register_model def vit_large_patch16_224(pretrained=False, **kwargs): + kwargs.pop('pretrained_cfg', None) # added by Ziqi to accommodate timm=0.9.12 + kwargs.pop('pretrained_cfg_overlay', None) # added by Ziqi to accommodate timm=0.9.12 model = VisionTransformer( patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) From df651fb0b91cc0467d4d43e6dba6731d478392fd Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Fri, 8 Dec 2023 15:10:49 +0800 Subject: [PATCH 034/248] [fix] multiple dimensions --- vbench/aesthetic_quality.py | 3 ++- vbench/appearance_style.py | 1 + vbench/background_consistency.py | 3 ++- vbench/color.py | 1 - 4 files changed, 5 insertions(+), 3 deletions(-) diff --git a/vbench/aesthetic_quality.py b/vbench/aesthetic_quality.py index df570161..c2d151a3 100755 --- a/vbench/aesthetic_quality.py +++ b/vbench/aesthetic_quality.py @@ -8,6 +8,7 @@ import torch.nn.functional as F from urllib.request import urlretrieve from .utils import load_video, load_dimension_info, clip_transform +from tqdm import tqdm def get_aesthetic_model(cache_folder): @@ -39,7 +40,7 @@ def laion_aesthetic(aesthetic_model, clip_model, video_list, device): aesthetic_avg = 0.0 num = 0 video_results = [] - for video_path in video_list: + for video_path in tqdm(video_list): images = load_video(video_path) image_transform = clip_transform(224) images = image_transform(images) diff --git a/vbench/appearance_style.py b/vbench/appearance_style.py index e187e3a0..944a024f 100755 --- a/vbench/appearance_style.py +++ b/vbench/appearance_style.py @@ -50,6 +50,7 @@ def appearance_style(clip_model, video_dict, device, sample="rand"): image = image.to(device) logits_per_image, logits_per_text = clip_model(image.unsqueeze(0), text) cur_sim = float(logits_per_text[0][0].cpu()) + cur_sim = cur_sim / 100 cur_video.append(cur_sim) sim += cur_sim cnt +=1 diff --git a/vbench/background_consistency.py b/vbench/background_consistency.py index 94cfbbec..d33cfc81 100755 --- a/vbench/background_consistency.py +++ b/vbench/background_consistency.py @@ -7,6 +7,7 @@ import torch.nn as nn import torch.nn.functional as F from .utils import load_video, load_dimension_info, clip_transform +from tqdm import tqdm def background_consistency(clip_model, preprocess, video_list, device, read_frame): @@ -14,7 +15,7 @@ def background_consistency(clip_model, preprocess, video_list, device, 
read_fram cnt = 0 video_results = [] image_transform = clip_transform(224) - for video_path in video_list: + for video_path in tqdm(video_list): video_sim = 0.0 if read_frame: video_path = video_path[:-4].replace('videos', 'frames').replace(' ', '_') diff --git a/vbench/color.py b/vbench/color.py index dff022d2..78bfeb2d 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -63,7 +63,6 @@ def color(model, video_dict, device): success_frame_count_all += cur_success_frame_rate video_count += 1 video_results.append({'video_path': video_path, 'video_results': cur_success_frame_rate}) - print({'video_path': video_path, 'video_results': cur_success_frame_rate}) success_rate = success_frame_count_all / video_count return success_rate, video_results From 7c2fc5bb85a65ddbc21f90d9e41a4f11bcec7b20 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Fri, 8 Dec 2023 15:11:50 +0800 Subject: [PATCH 035/248] [add] mapping "dimension name -> video folder name" --- dimension_to_folder.json | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) create mode 100644 dimension_to_folder.json diff --git a/dimension_to_folder.json b/dimension_to_folder.json new file mode 100644 index 00000000..126198af --- /dev/null +++ b/dimension_to_folder.json @@ -0,0 +1,18 @@ +{ + "subject_consistency": "subject_consistency", + "background_consistency": "scene", + "aesthetic_quality": "overall_consistency", + "imaging_quality": "overall_consistency", + "object_class": "object_class", + "multiple_objects": "multiple_objects", + "color": "color", + "spatial_relationship": "spatial_relationship", + "scene": "scene", + "temporal_style": "temporal_style", + "overall_consistency": "overall_consistency", + "human_action": "human_action", + "temporal_flickering": "temporal_flickering", + "motion_smoothness": "subject_consistency", + "dynamic_degree": "subject_consistency", + "appearance_style": "appearance_style" +} \ No newline at end of file From 4b19e10aa3ecabd6aca0729347fd4adb7866e454 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Fri, 8 Dec 2023 15:13:29 +0800 Subject: [PATCH 036/248] [fix] video paths & wget commands --- vbench/__init__.py | 6 +++--- vbench/utils.py | 4 ++-- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/vbench/__init__.py b/vbench/__init__.py index 6f27c269..5f3f8f8c 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -31,14 +31,14 @@ def __init__(self, device, full_info_dir, output_path): def build_full_dimension_list(self, ): return ["subject_consistency", "background_consistency", "aesthetic_quality", "imaging_quality", "object_class", "multiple_objects", "color", "spatial_relationship", "scene", "temporal_style", 'overall_consistency', "human_action", "temporal_flickering", "motion_smoothness", "dynamic_degree", "appearance_style"] - def build_full_info_json(self, videos_path, name, dimension_list): + def build_full_info_json(self, videos_path, name, dimension_list, special_str=''): cur_full_info_list = load_json(self.full_info_dir) video_names = os.listdir(videos_path) postfix = '.'+ video_names[0].split('.')[-1] for prompt_dict in cur_full_info_list: prompt = prompt_dict['prompt_en'] - if prompt + '_0033-0'+postfix in video_names: - prompt_dict['video_list'] = [os.path.join(videos_path, prompt+'_0033-'+str(i)+postfix) for i in range(5)] + if f'{prompt}{special_str}-0{postfix}' in video_names: + prompt_dict['video_list'] = [os.path.join(videos_path, f'{prompt}{special_str}-{str(i)}{postfix}') for i in range(5)] cur_full_info_path = os.path.join(self.output_path, 
name+'_full_info.json')
         save_json(cur_full_info_list, cur_full_info_path)
         print(f'Evaluation meta data saved to {cur_full_info_path}')
         return cur_full_info_path

diff --git a/vbench/utils.py b/vbench/utils.py
index 63f03aaa..6d6c6a15 100755
--- a/vbench/utils.py
+++ b/vbench/utils.py
@@ -227,7 +227,7 @@ def init_submodules(dimension_list, local=False, read_frame=False):
         if local:
             vit_b_path = 'pretrained/clip_model/ViT-B-32.pt'
             if not os.path.isfile(vit_b_path):
-                os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_b_path}')
+                os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_b_path}')
         else:
             vit_b_path = 'ViT-B/32'
         submodules_dict[dimension] = [vit_b_path, read_frame]
@@ -329,7 +329,7 @@ def init_submodules(dimension_list, local=False, read_frame=False):
         if local:
             submodules_dict[dimension] = {"name":'pretrained/clip_model/ViT-B-32.pt'}
             if not os.path.isfile(submodules_dict[dimension]["name"]):
-                os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {submodules_dict[dimension]["name"]}')
+                os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {submodules_dict[dimension]["name"]}')
         else:
             submodules_dict[dimension] = {"name": 'ViT-B/32'}
     elif dimension in ["temporal_style", "overall_consistency"]:

From 9c88357bc16b17386b4bb98696c6560e4f19bbe4 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Fri, 8 Dec 2023 15:48:51 +0800
Subject: [PATCH 037/248] [add] evaluate.sh

---
 evaluate.sh | 30 ++++++++++++++++++++++++++++++
 1 file changed, 30 insertions(+)
 create mode 100644 evaluate.sh

diff --git a/evaluate.sh b/evaluate.sh
new file mode 100644
index 00000000..61681d70
--- /dev/null
+++ b/evaluate.sh
@@ -0,0 +1,30 @@
+#!/bin/bash
+
+# Define the model list
+models=("lavie" "modelscope" "videocrafter" "cogvideo")
+
+# Define the dimension list
+dimensions=("subject_consistency" "background_consistency" "aesthetic_quality" "imaging_quality" "object_class" "multiple_objects" "color" "spatial_relationship" "scene" "temporal_style" "overall_consistency" "human_action" "temporal_flickering" "motion_smoothness" "dynamic_degree" "appearance_style")
+
+# Corresponding folder names
+folders=("subject_consistency" "scene" "overall_consistency" "overall_consistency" "object_class" "multiple_objects" "color" "spatial_relationship" "scene" "temporal_style" "overall_consistency" "human_action" "temporal_flickering" "subject_consistency" "subject_consistency" "appearance_style")
+
+# Base path for videos
+base_path='./vbench_videos/' # TODO: change to local path
+
+# Loop over each model
+for model in "${models[@]}"; do
+  # Loop over each dimension
+  for i in "${!dimensions[@]}"; do
+    # Get the dimension and corresponding folder
+    dimension=${dimensions[i]}
+    folder=${folders[i]}
+
+    # Construct the video path
+    videos_path="${base_path}${model}/${folder}"
+    echo "$dimension $videos_path"
+
+    # Run the evaluation script
+    srun -p Video-aigc-general --gres=gpu:1 --cpus-per-task=16 --job-name=eval-vbench python evaluate.py --videos_path $videos_path --dimension $dimension
+  done
+done
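The evaluate.sh added above launches one evaluate.py job per (model, dimension) pair, reusing the dimension-to-folder mapping from dimension_to_folder.json. On a single machine without Slurm, roughly the same sweep can be driven from Python. The sketch below is illustrative only and not part of the patch series: the model list and base path are the script's own example values, the folder mapping is abridged, and the `srun` launcher is replaced by a plain subprocess call.

    import subprocess

    models = ["lavie", "modelscope", "videocrafter", "cogvideo"]
    # Dimension -> folder of sampled videos; abridged from dimension_to_folder.json.
    dim_to_folder = {
        "subject_consistency": "subject_consistency",
        "motion_smoothness": "subject_consistency",
        "dynamic_degree": "subject_consistency",
        "temporal_flickering": "temporal_flickering",
    }
    base_path = "./vbench_videos/"  # placeholder path, as in the shell script

    for model in models:
        for dimension, folder in dim_to_folder.items():
            videos_path = f"{base_path}{model}/{folder}"
            print(dimension, videos_path)
            # One process per pair; keep sweeping even if a single run fails,
            # just as the shell loop does.
            subprocess.run(["python", "evaluate.py",
                            "--videos_path", videos_path,
                            "--dimension", dimension])

Keeping each (model, dimension) run in its own process mirrors the shell script and avoids holding every dimension's checkpoints in memory at once.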
From c7aee20a82c4d337d6dc433433c0db0cbb8286cc Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Fri, 8 Dec 2023 16:04:09 +0800
Subject: [PATCH 038/248] [update] evaluate multiple models and dimensions

---
 README.md   | 14 ++++++++++----
 evaluate.sh |  2 +-
 2 files changed, 11 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 5e9957af..17b84047 100755
--- a/README.md
+++ b/README.md
@@ -45,7 +45,7 @@ We provide prompt lists are at `prompts/`, see [instructions](https://github.com

 ## :surfer: Evaluation Method Suite

-To perform evaluation, run this:
+To perform evaluation on one dimension, run this:
 ```
 python evaluate.py --videos_path $VIDEOS_PATH --dimension $DIMENSION
 ```

From aa5f0876183a1311ffbb32c9d7601e119263206b Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Tue, 12 Dec 2023 01:28:00 +0800
Subject: [PATCH 047/248] [update] arXiv paper link

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 83a7ee74..1b7d87d8 100755
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
 # :bar_chart: VBench

-[![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://vchitect.github.io/VBench-project/assets/vbench/VBench_paper.pdf)
+[![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://arxiv.org/abs/2311.17982)
 [![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/)
 [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/Vchitect/VBench_Leaderboard)
 [![Video](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y)
@@ -76,7 +76,7 @@ bash evaluate.sh
 @article{huang2023vbench,
     title={{VBench}: Comprehensive Benchmark Suite for Video Generative Models},
     author={Huang, Ziqi and He, Yinan and Yu, Jiashuo and Zhang, Fan and Si, Chenyang and Jiang, Yuming and Zhang, Yuanhan and Wu, Tianxing and Jin, Qingyang and Chanpaisit, Nattapol and Wang, Yaohui and Chen, Xinyuan and Wang, Limin and Lin, Dahua and Qiao, Yu and Liu, Ziwei},
-    journal={arXiv preprint}
+    journal={arXiv preprint arXiv:2311.17982},
     year={2023}
 }
 ```

From 48daf59153990ddf718ce80019996b95484968e0 Mon Sep 17 00:00:00 2001
From: Jiashuo Yu <35335269+JustinYuu@users.noreply.github.com>
Date: Tue, 12 Dec 2023 11:39:12 +0800
Subject: [PATCH 048/248] [update] video sampling instructions

---
 prompts/README.md | 12 +++++-------
 1 file changed, 5 insertions(+), 7 deletions(-)

diff --git a/prompts/README.md b/prompts/README.md
index fdfa06b6..e3dcfb48 100755
--- a/prompts/README.md
+++ b/prompts/README.md
@@ -26,10 +26,9 @@
 for dimension in dimension_list:
     for index in range(5):
         # perform sampling
-        video_, video_path = sample_per_video(dimension, prompt, idx)
-
-        # save sampled video to f'{base_path}/{model_name}/{dimension_name}/{prompt}-{index}.mp4'
-        torchvision.io.write_video(video_path, video_, fps=8)
+        video_ = sample_per_video(dimension, prompt, index)
+        cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4'
+        torchvision.io.write_video(cur_save_path, video_, fps=8)
 ```

 ### Further Explanations
 To sample videos for VBench evaluation:
 - Sample videos from all the `txt` files in `prompts/prompts_per_dimension`.
 - For each prompt, sample 5 videos.
 - At the beginning of sampling from each `txt` file, set the random seed.
-- Save the videos in the form of `$base_path/$model_name/$dimension_name/$prompt-$index.mp4`, `$index` takes value of `0, 1, 2, 3, 4`. For example:
-    ```
-    vbench_videos/lavie/overall_consistency
+- Name the videos in the form of `$prompt-$index.mp4`, `$index` takes value of `0, 1, 2, 3, 4`. For example:
+    ```
     ├── A 3D model of a 1800s victorian house.-0.mp4
     ├── A 3D model of a 1800s victorian house.-1.mp4
     ├── A 3D model of a 1800s victorian house.-2.mp4
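To make the snippet above concrete, here is a self-contained sketch of the whole sampling pass, with per-file seeding and the required `$prompt-$index.mp4` naming. `sample_per_video` is a stub for the model-specific sampler the README leaves abstract, and the uint8 `(T, H, W, C)` tensor shape is an assumption about what that sampler returns:

```python
import os
import torch
import torchvision

def sample_per_video(dimension, prompt, index):
    # Stub for the model-specific sampler; assumed to return a
    # (T, H, W, C) uint8 tensor. Replace with your model's call.
    return torch.randint(0, 256, (16, 256, 256, 3), dtype=torch.uint8)

save_path = './sampled_videos'                 # stands in for args.save_path
prompt_dir = 'prompts/prompts_per_dimension'
os.makedirs(save_path, exist_ok=True)

for txt_file in sorted(os.listdir(prompt_dir)):
    # Reset the random seed at the start of every txt file, as instructed.
    torch.manual_seed(42)
    dimension = txt_file[:-4]                  # strip the .txt suffix
    with open(os.path.join(prompt_dir, txt_file)) as f:
        prompts = [line.strip() for line in f if line.strip()]
    for prompt in prompts:
        for index in range(5):                 # 5 samples per prompt
            video_ = sample_per_video(dimension, prompt, index)
            # Required naming convention: $prompt-$index.mp4
            cur_save_path = os.path.join(save_path, f'{prompt}-{index}.mp4')
            torchvision.io.write_video(cur_save_path, video_, fps=8)
```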
From cd950ee89d1a2466512a91a0fee77c28dcb0d677 Mon Sep 17 00:00:00 2001
From: justinyuu <869113692@qq.com>
Date: Tue, 12 Dec 2023 11:56:31 +0800
Subject: [PATCH 049/248] [update] all dimension prompt list

---
 .../prompts_per_dimension/all_dimension.txt   | 936 ++++++++++++++++++
 1 file changed, 936 insertions(+)
 create mode 100644 prompts/prompts_per_dimension/all_dimension.txt

diff --git a/prompts/prompts_per_dimension/all_dimension.txt b/prompts/prompts_per_dimension/all_dimension.txt
new file mode 100644
index 00000000..3887556a
--- /dev/null
+++ b/prompts/prompts_per_dimension/all_dimension.txt
@@ -0,0 +1,936 @@
+a bird and a cat
+a cat and a dog
+a dog and a horse
+a horse and a sheep
+a sheep and a cow
+a cow and an elephant
+an elephant and a bear
+a bear and a zebra
+a zebra and a giraffe
+a giraffe and a bird
+a chair and a couch
+a couch and a potted plant
+a potted plant and a tv
+a tv and a laptop
+a laptop and a remote
+a remote and a keyboard
+a keyboard and a cell phone
+a cell phone and a book
+a book and a clock
+a clock and a backpack
+a backpack and an umbrella
+an umbrella and a handbag
+a handbag and a tie
+a tie and a suitcase
+a suitcase and a vase
+a vase and scissors
+scissors and a teddy bear
+a teddy bear and a frisbee
+a frisbee and skis
+skis and a snowboard
+a snowboard and a sports ball
+a sports ball and a kite
+a kite and a baseball bat
+a baseball bat and a baseball glove
+a baseball glove and a skateboard
+a skateboard and a surfboard
+a surfboard and a tennis racket
+a tennis racket and a bottle
+a bottle and a chair
+an airplane and a train
+a train and a boat
+a boat and an airplane
+a bicycle and a car
+a car and a motorcycle
+a motorcycle and a bus
+a bus and a traffic light
+a traffic light and a fire hydrant
+a fire hydrant and a stop sign
+a stop sign and a parking meter
+a parking meter and a truck
+a truck and a bicycle
+a toilet and a hair drier
+a hair drier and a toothbrush
+a toothbrush and a sink
+a sink and a toilet
+a wine glass and a chair
+a cup and a couch
+a fork and a potted plant
+a knife and a tv
+a spoon and a laptop
+a bowl and a remote
+a banana and a keyboard
+an apple and a cell phone
+a sandwich and a book
+an orange and a clock
+broccoli and a backpack
+a carrot and an umbrella
+a hot dog and a handbag
+a pizza and a tie
+a donut and a suitcase
+a cake and a vase
+an oven and scissors
+a toaster and a teddy bear
+a microwave and a frisbee
+a refrigerator and skis
+a bicycle and an airplane
+a car and a train
+a motorcycle and a boat
+a person and a toilet
+a person and a hair drier
+a person and a toothbrush
+a person and a sinka person swimming in ocean
+a person giving a presentation to a room full of colleagues
+a person washing the dishes
+a person eating a burger
+a person walking in the snowstorm
+a person drinking coffee in a cafe
+a person playing guitar
+a bicycle leaning against a tree
+a bicycle gliding through a snowy field
+a bicycle slowing down to stop
+a bicycle accelerating to gain speed
+a car stuck in traffic during rush hour
+a car turning a corner
+a car slowing down to stop
+a car accelerating to gain speed
+a motorcycle cruising along a coastal highway
+a motorcycle turning a corner
+a 
motorcycle slowing down to stop +a motorcycle gliding through a snowy field +a motorcycle accelerating to gain speed +an airplane soaring through a clear blue sky +an airplane taking off +an airplane landing smoothly on a runway +an airplane accelerating to gain speed +a bus turning a corner +a bus stuck in traffic during rush hour +a bus accelerating to gain speed +a train speeding down the tracks +a train crossing over a tall bridge +a train accelerating to gain speed +a truck turning a corner +a truck anchored in a tranquil bay +a truck stuck in traffic during rush hour +a truck slowing down to stop +a truck accelerating to gain speed +a boat sailing smoothly on a calm lake +a boat slowing down to stop +a boat accelerating to gain speed +a bird soaring gracefully in the sky +a bird building a nest from twigs and leaves +a bird flying over a snowy forest +a cat grooming itself meticulously with its tongue +a cat playing in park +a cat drinking water +a cat running happily +a dog enjoying a peaceful walk +a dog playing in park +a dog drinking water +a dog running happily +a horse bending down to drink water from a river +a horse galloping across an open field +a horse taking a peaceful walk +a horse running to join a herd of its kind +a sheep bending down to drink water from a river +a sheep taking a peaceful walk +a sheep running to join a herd of its kind +a cow bending down to drink water from a river +a cow chewing cud while resting in a tranquil barn +a cow running to join a herd of its kind +an elephant spraying itself with water using its trunk to cool down +an elephant taking a peaceful walk +an elephant running to join a herd of its kind +a bear catching a salmon in its powerful jaws +a bear sniffing the air for scents of food +a bear climbing a tree +a bear hunting for prey +a zebra bending down to drink water from a river +a zebra running to join a herd of its kind +a zebra taking a peaceful walk +a giraffe bending down to drink water from a river +a giraffe taking a peaceful walk +a giraffe running to join a herd of its kinda red bicycle +a green bicycle +a blue bicycle +a yellow bicycle +an orange bicycle +a purple bicycle +a pink bicycle +a black bicycle +a white bicycle +a red car +a green car +a blue car +a yellow car +an orange car +a purple car +a pink car +a black car +a white car +a red bird +a green bird +a blue bird +a yellow bird +an orange bird +a purple bird +a pink bird +a black bird +a white bird +a black cat +a white cat +an orange cat +a yellow cat +a red umbrella +a green umbrella +a blue umbrella +a yellow umbrella +an orange umbrella +a purple umbrella +a pink umbrella +a black umbrella +a white umbrella +a red suitcase +a green suitcase +a blue suitcase +a yellow suitcase +an orange suitcase +a purple suitcase +a pink suitcase +a black suitcase +a white suitcase +a red bowl +a green bowl +a blue bowl +a yellow bowl +an orange bowl +a purple bowl +a pink bowl +a black bowl +a white bowl +a red chair +a green chair +a blue chair +a yellow chair +an orange chair +a purple chair +a pink chair +a black chair +a white chair +a red clock +a green clock +a blue clock +a yellow clock +an orange clock +a purple clock +a pink clock +a black clock +a white clock +a red vase +a green vase +a blue vase +a yellow vase +an orange vase +a purple vase +a pink vase +a black vase +a white vasealley +amusement park +aquarium +arch +art gallery +bathroom +bakery shop +ballroom +bar +barn +basement +beach +bedroom +bridge +botanical garden +cafeteria +campsite +campus +carrousel 
+castle +cemetery +classroom +cliff +crosswalk +construction site +corridor +courtyard +desert +downtown +driveway +farm +food court +football field +forest road +fountain +gas station +glacier +golf course +indoor gymnasium +harbor +highway +hospital +house +iceberg +industrial area +jail cell +junkyard +kitchen +indoor library +lighthouse +laboratory +mansion +marsh +mountain +indoor movie theater +indoor museum +music studio +nursery +ocean +office +palace +parking lot +pharmacy +phone booth +raceway +restaurant +river +science museum +shower +ski slope +sky +skyscraper +baseball stadium +staircase +street +supermarket +indoor swimming pool +tower +outdoor track +train railway +train station platform +underwater coral reef +valley +volcano +waterfall +windmillClose up of grapes on a rotating table. +Turtle swimming in ocean. +A storm trooper vacuuming the beach. +A panda standing on a surfboard in the ocean in sunset. +An astronaut feeding ducks on a sunny afternoon, reflection from the water. +Two pandas discussing an academic paper. +Sunset time lapse at the beach with moving clouds and colors in the sky. +A fat rabbit wearing a purple robe walking through a fantasy landscape. +A koala bear playing piano in the forest. +An astronaut flying in space. +Fireworks. +An animated painting of fluffy white clouds moving in sky. +Flying through fantasy landscapes. +A bigfoot walking in the snowstorm. +A squirrel eating a burger. +A cat wearing sunglasses and working as a lifeguard at a pool. +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks. +Splash of turquoise water in extreme slow motion, alpha channel included. +an ice cream is melting on the table. +a drone flying over a snowy forest. +a shark is swimming in the ocean. +Aerial panoramic video from a drone of a fantasy land. +a teddy bear is swimming in the ocean. +time lapse of sunrise on mars. +golden fish swimming in the ocean. +An artist brush painting on a canvas close up. +A drone view of celebration with Christmas tree and fireworks, starry sky - background. +happy dog wearing a yellow turtleneck, studio, portrait, facing camera, dark background +Origami dancers in white paper, 3D render, on white background, studio shot, dancing modern dance. +Campfire at night in a snowy forest with starry sky in the background. +a fantasy landscape +A 3D model of a 1800s victorian house. +this is how I do makeup in the morning. +A raccoon that looks like a turtle, digital art. +Robot dancing in Times Square. +Busy freeway at night. +Balloon full of water exploding in extreme slow motion. +An astronaut is riding a horse in the space in a photorealistic style. +Macro slo-mo. Slow motion cropped closeup of roasted coffee beans falling into an empty bowl. +Sewing machine, old sewing machine working. +Motion colour drop in water, ink swirling in water, colourful ink in water, abstraction fancy dream cloud of ink. +Few big purple plums rotating on the turntable. water drops appear on the skin during rotation. isolated on the white background. close-up. macro. +Vampire makeup face of beautiful girl, red contact lenses. +Ashtray full of butts on table, smoke flowing on black background, close-up +Pacific coast, carmel by the sea ocean and waves. +A teddy bear is playing drum kit in NYC Times Square. +A corgi is playing drum kit. +An Iron man is playing the electronic guitar, high electronic guitar. 
+A raccoon is playing the electronic guitar. +A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh +A corgi's head depicted as an explosion of a nebula +A fantasy landscape +A future where humans have achieved teleportation technology +A jellyfish floating through the ocean, with bioluminescent tentacles +A Mars rover moving on Mars +A panda drinking coffee in a cafe in Paris +A space shuttle launching into orbit, with flames and smoke billowing out from the engines +A steam train moving on a mountainside +A super cool giant robot in Cyberpunk Beijing +A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground +Cinematic shot of Van Gogh's selfie, Van Gogh style +Gwen Stacy reading a book +Iron Man flying in the sky +The bund Shanghai, oil painting +Yoda playing guitar on the stage +A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo +A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh +A boat sailing leisurely along the Seine River with the Eiffel Tower in background +A car moving slowly on an empty street, rainy evening +A cat eating food out of a bowl +A cat wearing sunglasses at a pool +A confused panda in calculus class +A cute fluffy panda eating Chinese food in a restaurant +A cute happy Corgi playing in park, sunset +A cute raccoon playing guitar in a boat on the ocean +A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background +A lightning striking atop of eiffel tower, dark clouds in the sky +A modern art museum, with colorful paintings +A panda cooking in the kitchen +A panda playing on a swing set +A polar bear is playing guitar +A raccoon dressed in suit playing the trumpet, stage background +A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy +A shark swimming in clear Caribbean ocean +A super robot protecting city +A teddy bear washing the dishes +An epic tornado attacking above a glowing city at night, the tornado is made of smoke +An oil painting of a couple in formal evening wear going home get caught in a heavy downpour with umbrellas +Clown fish swimming through the coral reef +Hyper-realistic spaceship landing on Mars +The bund Shanghai, vibrant color +Vincent van Gogh is painting in the room +Yellow flowers swing in the windA beautiful coastal beach in spring, waves lapping on sand, Van Gogh style +A beautiful coastal beach in spring, waves lapping on sand, oil painting +A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo +A beautiful coastal beach in spring, waves lapping on sand, black and white +A beautiful coastal beach in spring, waves lapping on sand, pixel art +A beautiful coastal beach in spring, waves lapping on sand, in cyberpunk style +A beautiful coastal beach in spring, waves lapping on sand, animated style +A beautiful coastal beach in spring, waves lapping on sand, watercolor painting +A beautiful coastal beach in spring, waves lapping on sand, surrealism style +The bund Shanghai, Van Gogh style +The bund Shanghai, oil painting +The bund Shanghai by Hokusai, in the style of Ukiyo +The bund Shanghai, black and white +The bund Shanghai, pixel art +The bund Shanghai, in cyberpunk style +The bund Shanghai, animated style +The bund Shanghai, watercolor painting +The bund Shanghai, surrealism style +a shark is swimming in the ocean, Van Gogh style +a shark is swimming in the ocean, oil 
painting +a shark is swimming in the ocean by Hokusai, in the style of Ukiyo +a shark is swimming in the ocean, black and white +a shark is swimming in the ocean, pixel art +a shark is swimming in the ocean, in cyberpunk style +a shark is swimming in the ocean, animated style +a shark is swimming in the ocean, watercolor painting +a shark is swimming in the ocean, surrealism style +A panda drinking coffee in a cafe in Paris, Van Gogh style +A panda drinking coffee in a cafe in Paris, oil painting +A panda drinking coffee in a cafe in Paris by Hokusai, in the style of Ukiyo +A panda drinking coffee in a cafe in Paris, black and white +A panda drinking coffee in a cafe in Paris, pixel art +A panda drinking coffee in a cafe in Paris, in cyberpunk style +A panda drinking coffee in a cafe in Paris, animated style +A panda drinking coffee in a cafe in Paris, watercolor painting +A panda drinking coffee in a cafe in Paris, surrealism style +A cute happy Corgi playing in park, sunset, Van Gogh style +A cute happy Corgi playing in park, sunset, oil painting +A cute happy Corgi playing in park, sunset by Hokusai, in the style of Ukiyo +A cute happy Corgi playing in park, sunset, black and white +A cute happy Corgi playing in park, sunset, pixel art +A cute happy Corgi playing in park, sunset, in cyberpunk style +A cute happy Corgi playing in park, sunset, animated style +A cute happy Corgi playing in park, sunset, watercolor painting +A cute happy Corgi playing in park, sunset, surrealism style +Gwen Stacy reading a book, Van Gogh style +Gwen Stacy reading a book, oil painting +Gwen Stacy reading a book by Hokusai, in the style of Ukiyo +Gwen Stacy reading a book, black and white +Gwen Stacy reading a book, pixel art +Gwen Stacy reading a book, in cyberpunk style +Gwen Stacy reading a book, animated style +Gwen Stacy reading a book, watercolor painting +Gwen Stacy reading a book, surrealism style +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, Van Gogh style +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, oil painting +A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Hokusai, in the style of Ukiyo +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, black and white +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pixel art +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in cyberpunk style +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, animated style +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, watercolor painting +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, surrealism style +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, Van Gogh style +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, oil painting +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas by Hokusai, in the style of Ukiyo +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, black and white +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pixel art +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in cyberpunk style +A couple in formal evening wear going home get caught in a 
heavy downpour with umbrellas, animated style +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, watercolor painting +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, surrealism style +An astronaut flying in space, Van Gogh style +An astronaut flying in space, oil painting +An astronaut flying in space by Hokusai, in the style of Ukiyo +An astronaut flying in space, black and white +An astronaut flying in space, pixel art +An astronaut flying in space, in cyberpunk style +An astronaut flying in space, animated style +An astronaut flying in space, watercolor painting +An astronaut flying in space, surrealism style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, Van Gogh style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, oil painting +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks by Hokusai, in the style of Ukiyo +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, black and white +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pixel art +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in cyberpunk style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, animated style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, watercolor painting +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, surrealism styleA person is riding a bike +A person is marching +A person is roller skating +A person is tasting beer +A person is clapping +A person is drawing +A person is petting animal (not cat) +A person is eating watermelon +A person is playing harp +A person is wrestling +A person is riding scooter +A person is sweeping floor +A person is skateboarding +A person is dunking basketball +A person is playing flute +A person is stretching leg +A person is tying tie +A person is skydiving +A person is shooting goal (soccer) +A person is playing piano +A person is finger snapping +A person is canoeing or kayaking +A person is laughing +A person is digging +A person is clay pottery making +A person is shooting basketball +A person is bending back +A person is shaking hands +A person is bandaging +A person is push up +A person is catching or throwing frisbee +A person is playing trumpet +A person is flying kite +A person is filling eyebrows +A person is shuffling cards +A person is folding clothes +A person is smoking +A person is tai chi +A person is squat +A person is playing controller +A person is throwing axe +A person is giving or receiving award +A person is air drumming +A person is taking a shower +A person is planting trees +A person is sharpening knives +A person is robot dancing +A person is rock climbing +A person is hula hooping +A person is writing +A person is bungee jumping +A person is pushing cart +A person is cleaning windows +A person is cutting watermelon +A person is cheerleading +A person is washing hands +A person is ironing +A person is cutting nails +A person is hugging +A person is trimming or shaving beard +A person is jogging +A person is making bed +A person is washing dishes +A person is grooming dog +A person is doing laundry +A person is knitting +A person is reading book +A person is baby waking up +A person is massaging legs +A person is brushing teeth +A person is crawling baby +A person is motorcycling +A person is driving car +A person is sticking tongue out +A person is shaking head +A person is sword fighting +A person is doing aerobics +A person is strumming guitar +A person is riding or walking with horse +A person is archery +A person is catching or throwing baseball +A person is playing chess +A person is rock scissors paper +A person is using computer +A person is arranging flowers +A person is bending metal +A person is ice skating +A person is climbing a rope +A person is crying +A person is dancing ballet +A person is getting a haircut +A person is running on treadmill +A person is kissing +A person is counting money +A person is barbequing +A person is peeling apples +A person is milking cow +A person is shining shoes +A person is making snowman +A person is sailingIn a still frame, a stop sign +a toilet, frozen in time +a laptop, frozen in time +A tranquil tableau of alley +A tranquil tableau of bar +A tranquil tableau of barn +A tranquil tableau of bathroom +A tranquil tableau of bedroom +A tranquil tableau of cliff +In a still frame, courtyard +In a still frame, gas station +A tranquil tableau of house +indoor gymnasium, frozen in time +A tranquil tableau of indoor library +A tranquil tableau of kitchen +A tranquil tableau of palace +In a still frame, parking lot +In a still frame, phone booth +A tranquil tableau of restaurant +A tranquil tableau of tower +A tranquil tableau of a bowl +A tranquil tableau of an apple +A tranquil tableau of a bench +A tranquil tableau of a 
bed +A tranquil tableau of a chair +A tranquil tableau of a cup +A tranquil tableau of a dining table +In a still frame, a pear +A tranquil tableau of a bunch of grapes +A tranquil tableau of a bowl on the kitchen counter +A tranquil tableau of a beautiful, handcrafted ceramic bowl +A tranquil tableau of an antique bowl +A tranquil tableau of an exquisite mahogany dining table +A tranquil tableau of a wooden bench in the park +A tranquil tableau of a beautiful wrought-iron bench surrounded by blooming flowers +In a still frame, a park bench with a view of the lake +A tranquil tableau of a vintage rocking chair was placed on the porch +A tranquil tableau of the jail cell was small and dimly lit, with cold, steel bars +A tranquil tableau of the phone booth was tucked away in a quiet alley +a dilapidated phone booth stood as a relic of a bygone era on the sidewalk, frozen in time +A tranquil tableau of the old red barn stood weathered and iconic against the backdrop of the countryside +A tranquil tableau of a picturesque barn was painted a warm shade of red and nestled in a picturesque meadow +In a still frame, within the desolate desert, an oasis unfolded, characterized by the stoic presence of palm trees and a motionless, glassy pool of water +In a still frame, the Parthenon's majestic Doric columns stand in serene solitude atop the Acropolis, framed by the tranquil Athenian landscape +In a still frame, the Temple of Hephaestus, with its timeless Doric grace, stands stoically against the backdrop of a quiet Athens +In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels +A tranquil tableau of the Stonehenge presented itself as an enigmatic puzzle, each colossal stone meticulously placed against the backdrop of tranquility +In a still frame, in the vast desert, an oasis nestled among dunes, featuring tall palm trees and an air of serenity +static view on a desert scene with an oasis, palm trees, and a clear, calm pool of water +A tranquil tableau of an ornate Victorian streetlamp standing on a cobblestone street corner, illuminating the empty night +A tranquil tableau of a tranquil lakeside cabin nestled among tall pines, its reflection mirrored perfectly in the calm water +In a still frame, a vintage gas lantern, adorned with intricate details, gracing a historic cobblestone square +In a still frame, a tranquil Japanese tea ceremony room, with tatami mats, a delicate tea set, and a bonsai tree in the corner +A tranquil tableau of the Parthenon stands resolute in its classical elegance, a timeless symbol of Athens' cultural legacy +A tranquil tableau of in the heart of Plaka, the neoclassical architecture of the old city harmonizes with the ancient ruins +A tranquil tableau of in the desolate beauty of the American Southwest, Chaco Canyon's ancient ruins whispered tales of an enigmatic civilization that once thrived amidst the arid landscapes +A tranquil tableau of at the edge of the Arabian Desert, the ancient city of Petra beckoned with its enigmatic rock-carved façades +In a still frame, amidst the cobblestone streets, an Art Nouveau lamppost stood tall +A tranquil tableau of in the quaint village square, a traditional wrought-iron streetlamp featured delicate filigree patterns and amber-hued glass panels +A tranquil tableau of the lampposts were adorned with Art Deco motifs, their geometric shapes and frosted glass creating a sense of vintage glamour +In a still frame, in the picturesque square, a Gothic-style lamppost 
adorned with intricate stone carvings added a touch of medieval charm to the setting +In a still frame, in the heart of the old city, a row of ornate lantern-style streetlamps bathed the narrow alleyway in a warm, welcoming light +A tranquil tableau of in the heart of the Utah desert, a massive sandstone arch spanned the horizon +A tranquil tableau of in the Arizona desert, a massive stone bridge arched across a rugged canyon +A tranquil tableau of in the corner of the minimalist tea room, a bonsai tree added a touch of nature's beauty to the otherwise simple and elegant space +In a still frame, amidst the hushed ambiance of the traditional tea room, a meticulously arranged tea set awaited, with porcelain cups, a bamboo whisk +In a still frame, nestled in the Zen garden, a rustic teahouse featured tatami seating and a traditional charcoal brazier +A tranquil tableau of a country estate's library featured elegant wooden shelves +A tranquil tableau of beneath the shade of a solitary oak tree, an old wooden park bench sat patiently +A tranquil tableau of beside a tranquil pond, a weeping willow tree draped its branches gracefully over the water's surface, creating a serene tableau of reflection and calm +A tranquil tableau of in the Zen garden, a perfectly raked gravel path led to a serene rock garden +In a still frame, a tranquil pond was fringed by weeping cherry trees, their blossoms drifting lazily onto the glassy surface +In a still frame, within the historic library's reading room, rows of antique leather chairs and mahogany tables offered a serene haven for literary contemplation +A tranquil tableau of a peaceful orchid garden showcased a variety of delicate blooms +A tranquil tableau of in the serene courtyard, a centuries-old stone well stood as a symbol of a bygone era, its mossy stones bearing witness to the passage of timeA beautiful coastal beach in spring, waves lapping on sand, in super slow motion +A beautiful coastal beach in spring, waves lapping on sand, zoom in +A beautiful coastal beach in spring, waves lapping on sand, zoom out +A beautiful coastal beach in spring, waves lapping on sand, pan left +A beautiful coastal beach in spring, waves lapping on sand, pan right +A beautiful coastal beach in spring, waves lapping on sand, tilt up +A beautiful coastal beach in spring, waves lapping on sand, tilt down +A beautiful coastal beach in spring, waves lapping on sand, with an intense shaking effect +A beautiful coastal beach in spring, waves lapping on sand, featuring a steady and smooth perspective +A beautiful coastal beach in spring, waves lapping on sand, racking focus +The bund Shanghai, in super slow motion +The bund Shanghai, zoom in +The bund Shanghai, zoom out +The bund Shanghai, pan left +The bund Shanghai, pan right +The bund Shanghai, tilt up +The bund Shanghai, tilt down +The bund Shanghai, with an intense shaking effect +The bund Shanghai, featuring a steady and smooth perspective +The bund Shanghai, racking focus +a shark is swimming in the ocean, in super slow motion +a shark is swimming in the ocean, zoom in +a shark is swimming in the ocean, zoom out +a shark is swimming in the ocean, pan left +a shark is swimming in the ocean, pan right +a shark is swimming in the ocean, tilt up +a shark is swimming in the ocean, tilt down +a shark is swimming in the ocean, with an intense shaking effect +a shark is swimming in the ocean, featuring a steady and smooth perspective +a shark is swimming in the ocean, racking focus +A panda drinking coffee in a cafe in Paris, 
in super slow motion +A panda drinking coffee in a cafe in Paris, zoom in +A panda drinking coffee in a cafe in Paris, zoom out +A panda drinking coffee in a cafe in Paris, pan left +A panda drinking coffee in a cafe in Paris, pan right +A panda drinking coffee in a cafe in Paris, tilt up +A panda drinking coffee in a cafe in Paris, tilt down +A panda drinking coffee in a cafe in Paris, with an intense shaking effect +A panda drinking coffee in a cafe in Paris, featuring a steady and smooth perspective +A panda drinking coffee in a cafe in Paris, racking focus +A cute happy Corgi playing in park, sunset, in super slow motion +A cute happy Corgi playing in park, sunset, zoom in +A cute happy Corgi playing in park, sunset, zoom out +A cute happy Corgi playing in park, sunset, pan left +A cute happy Corgi playing in park, sunset, pan right +A cute happy Corgi playing in park, sunset, tilt up +A cute happy Corgi playing in park, sunset, tilt down +A cute happy Corgi playing in park, sunset, with an intense shaking effect +A cute happy Corgi playing in park, sunset, featuring a steady and smooth perspective +A cute happy Corgi playing in park, sunset, racking focus +Gwen Stacy reading a book, in super slow motion +Gwen Stacy reading a book, zoom in +Gwen Stacy reading a book, zoom out +Gwen Stacy reading a book, pan left +Gwen Stacy reading a book, pan right +Gwen Stacy reading a book, tilt up +Gwen Stacy reading a book, tilt down +Gwen Stacy reading a book, with an intense shaking effect +Gwen Stacy reading a book, featuring a steady and smooth perspective +Gwen Stacy reading a book, racking focus +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, in super slow motion +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom in +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, zoom out +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan left +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, pan right +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt up +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, tilt down +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, with an intense shaking effect +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, featuring a steady and smooth perspective +A boat sailing leisurely along the Seine River with the Eiffel Tower in background, racking focus +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in super slow motion +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom in +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, zoom out +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan left +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pan right +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt up +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, tilt down +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, with an intense shaking effect +A couple in formal evening wear going home get caught in a heavy downpour with 
umbrellas, featuring a steady and smooth perspective +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, racking focus +An astronaut flying in space, in super slow motion +An astronaut flying in space, zoom in +An astronaut flying in space, zoom out +An astronaut flying in space, pan left +An astronaut flying in space, pan right +An astronaut flying in space, tilt up +An astronaut flying in space, tilt down +An astronaut flying in space, with an intense shaking effect +An astronaut flying in space, featuring a steady and smooth perspective +An astronaut flying in space, racking focus +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in super slow motion +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom in +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, zoom out +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan left +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pan right +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt up +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt down +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, with an intense shaking effect +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, featuring a steady and smooth perspective +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, racking focusa bicycle on the left of a car, front view +a car on the right of a motorcycle, front view +a motorcycle on the left of a bus, front view +a bus on the right of a traffic light, front view +a traffic light on the left of a fire hydrant, front view +a fire hydrant on the right of a stop sign, front view +a stop sign on the left of a parking meter, front view +a parking meter on the right of a bench, front view +a bench on the left of a truck, front view +a truck on the right of a bicycle, front view +a bird on the left of a cat, front view +a cat on the right of a dog, front view +a dog on the left of a horse, front view +a horse on the right of a sheep, front view +a sheep on the left of a cow, front view +a cow on the right of an elephant, front view +an elephant on the left of a bear, front view +a bear on the right of a zebra, front view +a zebra on the left of a giraffe, front view +a giraffe on the right of a bird, front view +a bottle on the left of a wine glass, front view +a wine glass on the right of a cup, front view +a cup on the left of a fork, front view +a fork on the right of a knife, front view +a knife on the left of a spoon, front view +a spoon on the right of a bowl, front view +a bowl on the left of a bottle, front view +a potted plant on the left of a remote, front view +a remote on the right of a clock, front view +a clock on the left of a vase, front view +a vase on the right of scissors, front view +scissors on the left of a teddy bear, front view +a teddy bear on the right of a potted plant, front view +a frisbee on the left of a sports ball, front view +a sports ball on the right of a baseball bat, front view +a baseball bat on the left of a baseball glove, front view +a baseball glove on the right of a tennis racket, front view +a tennis racket on the left of a frisbee, front view +a toilet on the left of a hair drier, front view +a hair drier on the right of a toothbrush, front view +a toothbrush on the left of a sink, front view +a sink on the right of a toilet, front view +a chair on the left of a couch, front view +a couch on the right of a bed, front view +a bed on the left of a tv, front view +a tv on the right of a dining table, front view +a dining table on the left of a chair, front view +an airplane on the left of a train, front view +a train on the right of a boat, front view +a boat on the left of an airplane, front view +an oven on the top of a toaster, front view +an oven on the bottom of a toaster, front view +a toaster on the top of a microwave, front view +a toaster on the bottom of a microwave, front view +a microwave on the top of an oven, front view +a microwave on the bottom of an oven, front view +a banana on the top of an apple, front view +a banana on the bottom of an apple, front view +an apple on the top of a sandwich, front view +an apple on the bottom of a sandwich, front view +a sandwich on the top of an orange, front view +a sandwich on the bottom of an orange, front view +an orange on the top of a carrot, front view +an orange on the bottom of a carrot, front view +a carrot on the top of a hot dog, front view +a carrot on the bottom of a hot dog, front view +a hot dog on the top of a pizza, front view +a hot dog on the bottom of a pizza, front view +a pizza on the top of a donut, front view +a pizza on the bottom of a donut, front view +a donut on the top of broccoli, front view +a donut on the bottom of broccoli, front view +broccoli on the top of a 
banana, front view +broccoli on the bottom of a banana, front view +skis on the top of a snowboard, front view +skis on the bottom of a snowboard, front view +a snowboard on the top of a kite, front view +a snowboard on the bottom of a kite, front view +a kite on the top of a skateboard, front view +a kite on the bottom of a skateboard, front view +a skateboard on the top of a surfboard, front view +a skateboard on the bottom of a surfboard, front view +a surfboard on the top of skis, front view +a surfboard on the bottom of skis, front viewa person +a bicycle +a car +a motorcycle +an airplane +a bus +a train +a truck +a boat +a traffic light +a fire hydrant +a stop sign +a parking meter +a bench +a bird +a cat +a dog +a horse +a sheep +a cow +an elephant +a bear +a zebra +a giraffe +a backpack +an umbrella +a handbag +a tie +a suitcase +a frisbee +skis +a snowboard +a sports ball +a kite +a baseball bat +a baseball glove +a skateboard +a surfboard +a tennis racket +a bottle +a wine glass +a cup +a fork +a knife +a spoon +a bowl +a banana +an apple +a sandwich +an orange +broccoli +a carrot +a hot dog +a pizza +a donut +a cake +a chair +a couch +a potted plant +a bed +a dining table +a toilet +a tv +a laptop +a remote +a keyboard +a cell phone +a microwave +an oven +a toaster +a sink +a refrigerator +a book +a clock +a vase +scissors +a teddy bear +a hair drier +a toothbrush \ No newline at end of file From 93445514c79bbf682cb877a2e8bd6a3930a21593 Mon Sep 17 00:00:00 2001 From: justinyuu <869113692@qq.com> Date: Tue, 12 Dec 2023 12:00:29 +0800 Subject: [PATCH 050/248] [update] all dimension prompt list --- .../prompts_per_dimension/all_dimension.txt | 916 +++++++++--------- 1 file changed, 463 insertions(+), 453 deletions(-) diff --git a/prompts/prompts_per_dimension/all_dimension.txt b/prompts/prompts_per_dimension/all_dimension.txt index 3887556a..f26fbf80 100644 --- a/prompts/prompts_per_dimension/all_dimension.txt +++ b/prompts/prompts_per_dimension/all_dimension.txt @@ -1,3 +1,78 @@ +In a still frame, a stop sign +a toilet, frozen in time +a laptop, frozen in time +A tranquil tableau of alley +A tranquil tableau of bar +A tranquil tableau of barn +A tranquil tableau of bathroom +A tranquil tableau of bedroom +A tranquil tableau of cliff +In a still frame, courtyard +In a still frame, gas station +A tranquil tableau of house +indoor gymnasium, frozen in time +A tranquil tableau of indoor library +A tranquil tableau of kitchen +A tranquil tableau of palace +In a still frame, parking lot +In a still frame, phone booth +A tranquil tableau of restaurant +A tranquil tableau of tower +A tranquil tableau of a bowl +A tranquil tableau of an apple +A tranquil tableau of a bench +A tranquil tableau of a bed +A tranquil tableau of a chair +A tranquil tableau of a cup +A tranquil tableau of a dining table +In a still frame, a pear +A tranquil tableau of a bunch of grapes +A tranquil tableau of a bowl on the kitchen counter +A tranquil tableau of a beautiful, handcrafted ceramic bowl +A tranquil tableau of an antique bowl +A tranquil tableau of an exquisite mahogany dining table +A tranquil tableau of a wooden bench in the park +A tranquil tableau of a beautiful wrought-iron bench surrounded by blooming flowers +In a still frame, a park bench with a view of the lake +A tranquil tableau of a vintage rocking chair was placed on the porch +A tranquil tableau of the jail cell was small and dimly lit, with cold, steel bars +A tranquil tableau of the phone booth was tucked away in a quiet alley +a 
dilapidated phone booth stood as a relic of a bygone era on the sidewalk, frozen in time +A tranquil tableau of the old red barn stood weathered and iconic against the backdrop of the countryside +A tranquil tableau of a picturesque barn was painted a warm shade of red and nestled in a picturesque meadow +In a still frame, within the desolate desert, an oasis unfolded, characterized by the stoic presence of palm trees and a motionless, glassy pool of water +In a still frame, the Parthenon's majestic Doric columns stand in serene solitude atop the Acropolis, framed by the tranquil Athenian landscape +In a still frame, the Temple of Hephaestus, with its timeless Doric grace, stands stoically against the backdrop of a quiet Athens +In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels +A tranquil tableau of the Stonehenge presented itself as an enigmatic puzzle, each colossal stone meticulously placed against the backdrop of tranquility +In a still frame, in the vast desert, an oasis nestled among dunes, featuring tall palm trees and an air of serenity +static view on a desert scene with an oasis, palm trees, and a clear, calm pool of water +A tranquil tableau of an ornate Victorian streetlamp standing on a cobblestone street corner, illuminating the empty night +A tranquil tableau of a tranquil lakeside cabin nestled among tall pines, its reflection mirrored perfectly in the calm water +In a still frame, a vintage gas lantern, adorned with intricate details, gracing a historic cobblestone square +In a still frame, a tranquil Japanese tea ceremony room, with tatami mats, a delicate tea set, and a bonsai tree in the corner +A tranquil tableau of the Parthenon stands resolute in its classical elegance, a timeless symbol of Athens' cultural legacy +A tranquil tableau of in the heart of Plaka, the neoclassical architecture of the old city harmonizes with the ancient ruins +A tranquil tableau of in the desolate beauty of the American Southwest, Chaco Canyon's ancient ruins whispered tales of an enigmatic civilization that once thrived amidst the arid landscapes +A tranquil tableau of at the edge of the Arabian Desert, the ancient city of Petra beckoned with its enigmatic rock-carved façades +In a still frame, amidst the cobblestone streets, an Art Nouveau lamppost stood tall +A tranquil tableau of in the quaint village square, a traditional wrought-iron streetlamp featured delicate filigree patterns and amber-hued glass panels +A tranquil tableau of the lampposts were adorned with Art Deco motifs, their geometric shapes and frosted glass creating a sense of vintage glamour +In a still frame, in the picturesque square, a Gothic-style lamppost adorned with intricate stone carvings added a touch of medieval charm to the setting +In a still frame, in the heart of the old city, a row of ornate lantern-style streetlamps bathed the narrow alleyway in a warm, welcoming light +A tranquil tableau of in the heart of the Utah desert, a massive sandstone arch spanned the horizon +A tranquil tableau of in the Arizona desert, a massive stone bridge arched across a rugged canyon +A tranquil tableau of in the corner of the minimalist tea room, a bonsai tree added a touch of nature's beauty to the otherwise simple and elegant space +In a still frame, amidst the hushed ambiance of the traditional tea room, a meticulously arranged tea set awaited, with porcelain cups, a bamboo whisk +In a still frame, nestled in the Zen garden, a rustic teahouse 
featured tatami seating and a traditional charcoal brazier +A tranquil tableau of a country estate's library featured elegant wooden shelves +A tranquil tableau of beneath the shade of a solitary oak tree, an old wooden park bench sat patiently +A tranquil tableau of beside a tranquil pond, a weeping willow tree draped its branches gracefully over the water's surface, creating a serene tableau of reflection and calm +A tranquil tableau of in the Zen garden, a perfectly raked gravel path led to a serene rock garden +In a still frame, a tranquil pond was fringed by weeping cherry trees, their blossoms drifting lazily onto the glassy surface +In a still frame, within the historic library's reading room, rows of antique leather chairs and mahogany tables offered a serene haven for literary contemplation +A tranquil tableau of a peaceful orchid garden showcased a variety of delicate blooms +A tranquil tableau of in the serene courtyard, a centuries-old stone well stood as a symbol of a bygone era, its mossy stones bearing witness to the passage of time a bird and a cat a cat and a dog a dog and a horse @@ -79,7 +154,108 @@ a motorcycle and a boat a person and a toilet a person and a hair drier a person and a toothbrush -a person and a sinka person swimming in ocean +a person and a sink +A person is riding a bike +A person is marching +A person is roller skating +A person is tasting beer +A person is clapping +A person is drawing +A person is petting animal (not cat) +A person is eating watermelon +A person is playing harp +A person is wrestling +A person is riding scooter +A person is sweeping floor +A person is skateboarding +A person is dunking basketball +A person is playing flute +A person is stretching leg +A person is tying tie +A person is skydiving +A person is shooting goal (soccer) +A person is playing piano +A person is finger snapping +A person is canoeing or kayaking +A person is laughing +A person is digging +A person is clay pottery making +A person is shooting basketball +A person is bending back +A person is shaking hands +A person is bandaging +A person is push up +A person is catching or throwing frisbee +A person is playing trumpet +A person is flying kite +A person is filling eyebrows +A person is shuffling cards +A person is folding clothes +A person is smoking +A person is tai chi +A person is squat +A person is playing controller +A person is throwing axe +A person is giving or receiving award +A person is air drumming +A person is taking a shower +A person is planting trees +A person is sharpening knives +A person is robot dancing +A person is rock climbing +A person is hula hooping +A person is writing +A person is bungee jumping +A person is pushing cart +A person is cleaning windows +A person is cutting watermelon +A person is cheerleading +A person is washing hands +A person is ironing +A person is cutting nails +A person is hugging +A person is trimming or shaving beard +A person is jogging +A person is making bed +A person is washing dishes +A person is grooming dog +A person is doing laundry +A person is knitting +A person is reading book +A person is baby waking up +A person is massaging legs +A person is brushing teeth +A person is crawling baby +A person is motorcycling +A person is driving car +A person is sticking tongue out +A person is shaking head +A person is sword fighting +A person is doing aerobics +A person is strumming guitar +A person is riding or walking with horse +A person is archery +A person is catching or throwing baseball +A person is 
playing chess +A person is rock scissors paper +A person is using computer +A person is arranging flowers +A person is bending metal +A person is ice skating +A person is climbing a rope +A person is crying +A person is dancing ballet +A person is getting a haircut +A person is running on treadmill +A person is kissing +A person is counting money +A person is barbequing +A person is peeling apples +A person is milking cow +A person is shining shoes +A person is making snowman +A person is sailing +a person swimming in ocean a person giving a presentation to a room full of colleagues a person washing the dishes a person eating a burger @@ -150,7 +326,87 @@ a zebra running to join a herd of its kind a zebra taking a peaceful walk a giraffe bending down to drink water from a river a giraffe taking a peaceful walk -a giraffe running to join a herd of its kinda red bicycle +a giraffe running to join a herd of its kind +a person +a bicycle +a car +a motorcycle +an airplane +a bus +a train +a truck +a boat +a traffic light +a fire hydrant +a stop sign +a parking meter +a bench +a bird +a cat +a dog +a horse +a sheep +a cow +an elephant +a bear +a zebra +a giraffe +a backpack +an umbrella +a handbag +a tie +a suitcase +a frisbee +skis +a snowboard +a sports ball +a kite +a baseball bat +a baseball glove +a skateboard +a surfboard +a tennis racket +a bottle +a wine glass +a cup +a fork +a knife +a spoon +a bowl +a banana +an apple +a sandwich +an orange +broccoli +a carrot +a hot dog +a pizza +a donut +a cake +a chair +a couch +a potted plant +a bed +a dining table +a toilet +a tv +a laptop +a remote +a keyboard +a cell phone +a microwave +an oven +a toaster +a sink +a refrigerator +a book +a clock +a vase +scissors +a teddy bear +a hair drier +a toothbrush +a red bicycle a green bicycle a blue bicycle a yellow bicycle @@ -234,184 +490,8 @@ an orange vase a purple vase a pink vase a black vase -a white vasealley -amusement park -aquarium -arch -art gallery -bathroom -bakery shop -ballroom -bar -barn -basement -beach -bedroom -bridge -botanical garden -cafeteria -campsite -campus -carrousel -castle -cemetery -classroom -cliff -crosswalk -construction site -corridor -courtyard -desert -downtown -driveway -farm -food court -football field -forest road -fountain -gas station -glacier -golf course -indoor gymnasium -harbor -highway -hospital -house -iceberg -industrial area -jail cell -junkyard -kitchen -indoor library -lighthouse -laboratory -mansion -marsh -mountain -indoor movie theater -indoor museum -music studio -nursery -ocean -office -palace -parking lot -pharmacy -phone booth -raceway -restaurant -river -science museum -shower -ski slope -sky -skyscraper -baseball stadium -staircase -street -supermarket -indoor swimming pool -tower -outdoor track -train railway -train station platform -underwater coral reef -valley -volcano -waterfall -windmillClose up of grapes on a rotating table. -Turtle swimming in ocean. -A storm trooper vacuuming the beach. -A panda standing on a surfboard in the ocean in sunset. -An astronaut feeding ducks on a sunny afternoon, reflection from the water. -Two pandas discussing an academic paper. -Sunset time lapse at the beach with moving clouds and colors in the sky. -A fat rabbit wearing a purple robe walking through a fantasy landscape. -A koala bear playing piano in the forest. -An astronaut flying in space. -Fireworks. -An animated painting of fluffy white clouds moving in sky. -Flying through fantasy landscapes. -A bigfoot walking in the snowstorm. 
-A squirrel eating a burger. -A cat wearing sunglasses and working as a lifeguard at a pool. -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks. -Splash of turquoise water in extreme slow motion, alpha channel included. -an ice cream is melting on the table. -a drone flying over a snowy forest. -a shark is swimming in the ocean. -Aerial panoramic video from a drone of a fantasy land. -a teddy bear is swimming in the ocean. -time lapse of sunrise on mars. -golden fish swimming in the ocean. -An artist brush painting on a canvas close up. -A drone view of celebration with Christmas tree and fireworks, starry sky - background. -happy dog wearing a yellow turtleneck, studio, portrait, facing camera, dark background -Origami dancers in white paper, 3D render, on white background, studio shot, dancing modern dance. -Campfire at night in a snowy forest with starry sky in the background. -a fantasy landscape -A 3D model of a 1800s victorian house. -this is how I do makeup in the morning. -A raccoon that looks like a turtle, digital art. -Robot dancing in Times Square. -Busy freeway at night. -Balloon full of water exploding in extreme slow motion. -An astronaut is riding a horse in the space in a photorealistic style. -Macro slo-mo. Slow motion cropped closeup of roasted coffee beans falling into an empty bowl. -Sewing machine, old sewing machine working. -Motion colour drop in water, ink swirling in water, colourful ink in water, abstraction fancy dream cloud of ink. -Few big purple plums rotating on the turntable. water drops appear on the skin during rotation. isolated on the white background. close-up. macro. -Vampire makeup face of beautiful girl, red contact lenses. -Ashtray full of butts on table, smoke flowing on black background, close-up -Pacific coast, carmel by the sea ocean and waves. -A teddy bear is playing drum kit in NYC Times Square. -A corgi is playing drum kit. -An Iron man is playing the electronic guitar, high electronic guitar. -A raccoon is playing the electronic guitar. 
-A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh -A corgi's head depicted as an explosion of a nebula -A fantasy landscape -A future where humans have achieved teleportation technology -A jellyfish floating through the ocean, with bioluminescent tentacles -A Mars rover moving on Mars -A panda drinking coffee in a cafe in Paris -A space shuttle launching into orbit, with flames and smoke billowing out from the engines -A steam train moving on a mountainside -A super cool giant robot in Cyberpunk Beijing -A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground -Cinematic shot of Van Gogh's selfie, Van Gogh style -Gwen Stacy reading a book -Iron Man flying in the sky -The bund Shanghai, oil painting -Yoda playing guitar on the stage -A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo -A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh -A boat sailing leisurely along the Seine River with the Eiffel Tower in background -A car moving slowly on an empty street, rainy evening -A cat eating food out of a bowl -A cat wearing sunglasses at a pool -A confused panda in calculus class -A cute fluffy panda eating Chinese food in a restaurant -A cute happy Corgi playing in park, sunset -A cute raccoon playing guitar in a boat on the ocean -A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background -A lightning striking atop of eiffel tower, dark clouds in the sky -A modern art museum, with colorful paintings -A panda cooking in the kitchen -A panda playing on a swing set -A polar bear is playing guitar -A raccoon dressed in suit playing the trumpet, stage background -A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy -A shark swimming in clear Caribbean ocean -A super robot protecting city -A teddy bear washing the dishes -An epic tornado attacking above a glowing city at night, the tornado is made of smoke -An oil painting of a couple in formal evening wear going home get caught in a heavy downpour with umbrellas -Clown fish swimming through the coral reef -Hyper-realistic spaceship landing on Mars -The bund Shanghai, vibrant color -Vincent van Gogh is painting in the room -Yellow flowers swing in the windA beautiful coastal beach in spring, waves lapping on sand, Van Gogh style +a white vase +A beautiful coastal beach in spring, waves lapping on sand, Van Gogh style A beautiful coastal beach in spring, waves lapping on sand, oil painting A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo A beautiful coastal beach in spring, waves lapping on sand, black and white @@ -481,199 +561,27 @@ A couple in formal evening wear going home get caught in a heavy downpour with u A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, pixel art A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, in cyberpunk style A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, animated style -A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, watercolor painting -A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, surrealism style -An astronaut flying in space, Van Gogh style -An astronaut flying in space, oil painting -An astronaut flying in space by Hokusai, in the 
style of Ukiyo -An astronaut flying in space, black and white -An astronaut flying in space, pixel art -An astronaut flying in space, in cyberpunk style -An astronaut flying in space, animated style -An astronaut flying in space, watercolor painting -An astronaut flying in space, surrealism style -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, Van Gogh style -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, oil painting -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks by Hokusai, in the style of Ukiyo -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, black and white -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, pixel art -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in cyberpunk style -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, animated style -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, watercolor painting -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, surrealism styleA person is riding a bike -A person is marching -A person is roller skating -A person is tasting beer -A person is clapping -A person is drawing -A person is petting animal (not cat) -A person is eating watermelon -A person is playing harp -A person is wrestling -A person is riding scooter -A person is sweeping floor -A person is skateboarding -A person is dunking basketball -A person is playing flute -A person is stretching leg -A person is tying tie -A person is skydiving -A person is shooting goal (soccer) -A person is playing piano -A person is finger snapping -A person is canoeing or kayaking -A person is laughing -A person is digging -A person is clay pottery making -A person is shooting basketball -A person is bending back -A person is shaking hands -A person is bandaging -A person is push up -A person is catching or throwing frisbee -A person is playing trumpet -A person is flying kite -A person is filling eyebrows -A person is shuffling cards -A person is folding clothes -A person is smoking -A person is tai chi -A person is squat -A person is playing controller -A person is throwing axe -A person is giving or receiving award -A person is air drumming -A person is taking a shower -A person is planting trees -A person is sharpening knives -A person is robot dancing -A person is rock climbing -A person is hula hooping -A person is writing -A person is bungee jumping -A person is pushing cart -A person is cleaning windows -A person is cutting watermelon -A person is cheerleading -A person is washing hands -A person is ironing -A person is cutting nails -A person is hugging -A person is trimming or shaving beard -A person is jogging -A person is making bed -A person is washing dishes -A person is grooming dog -A person is doing laundry -A person is knitting -A person is reading book -A person is baby waking up -A person is massaging legs -A person is brushing teeth -A person is crawling baby -A person is motorcycling -A person is driving car -A person is sticking tongue out -A person is shaking head -A person is sword fighting -A person is doing aerobics -A person is strumming guitar -A person is riding or walking with horse -A person is archery -A person is catching or throwing baseball -A person is playing chess -A person is rock scissors paper -A person is using computer -A person is arranging flowers -A person is bending metal -A person is ice skating -A person is climbing a rope -A person is crying -A person is dancing ballet -A person is getting a haircut -A person is running on treadmill -A person is kissing -A person is counting money -A person is barbequing -A person is peeling apples -A person is milking cow -A person is shining shoes -A person is making snowman -A person is sailingIn a still frame, a stop sign -a toilet, frozen in time -a laptop, frozen in time -A tranquil tableau of alley -A tranquil tableau of bar -A tranquil tableau of barn -A tranquil tableau of bathroom -A tranquil tableau of bedroom -A tranquil tableau of cliff -In a still frame, courtyard -In a still frame, gas station -A tranquil tableau of house -indoor gymnasium, frozen in time -A tranquil tableau of indoor library -A tranquil tableau of kitchen -A tranquil tableau of palace -In a still frame, parking lot -In a still frame, phone booth -A tranquil tableau of restaurant -A tranquil tableau of tower -A tranquil tableau of a bowl -A tranquil tableau of an apple -A tranquil tableau of a bench -A tranquil tableau of a 
bed -A tranquil tableau of a chair -A tranquil tableau of a cup -A tranquil tableau of a dining table -In a still frame, a pear -A tranquil tableau of a bunch of grapes -A tranquil tableau of a bowl on the kitchen counter -A tranquil tableau of a beautiful, handcrafted ceramic bowl -A tranquil tableau of an antique bowl -A tranquil tableau of an exquisite mahogany dining table -A tranquil tableau of a wooden bench in the park -A tranquil tableau of a beautiful wrought-iron bench surrounded by blooming flowers -In a still frame, a park bench with a view of the lake -A tranquil tableau of a vintage rocking chair was placed on the porch -A tranquil tableau of the jail cell was small and dimly lit, with cold, steel bars -A tranquil tableau of the phone booth was tucked away in a quiet alley -a dilapidated phone booth stood as a relic of a bygone era on the sidewalk, frozen in time -A tranquil tableau of the old red barn stood weathered and iconic against the backdrop of the countryside -A tranquil tableau of a picturesque barn was painted a warm shade of red and nestled in a picturesque meadow -In a still frame, within the desolate desert, an oasis unfolded, characterized by the stoic presence of palm trees and a motionless, glassy pool of water -In a still frame, the Parthenon's majestic Doric columns stand in serene solitude atop the Acropolis, framed by the tranquil Athenian landscape -In a still frame, the Temple of Hephaestus, with its timeless Doric grace, stands stoically against the backdrop of a quiet Athens -In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels -A tranquil tableau of the Stonehenge presented itself as an enigmatic puzzle, each colossal stone meticulously placed against the backdrop of tranquility -In a still frame, in the vast desert, an oasis nestled among dunes, featuring tall palm trees and an air of serenity -static view on a desert scene with an oasis, palm trees, and a clear, calm pool of water -A tranquil tableau of an ornate Victorian streetlamp standing on a cobblestone street corner, illuminating the empty night -A tranquil tableau of a tranquil lakeside cabin nestled among tall pines, its reflection mirrored perfectly in the calm water -In a still frame, a vintage gas lantern, adorned with intricate details, gracing a historic cobblestone square -In a still frame, a tranquil Japanese tea ceremony room, with tatami mats, a delicate tea set, and a bonsai tree in the corner -A tranquil tableau of the Parthenon stands resolute in its classical elegance, a timeless symbol of Athens' cultural legacy -A tranquil tableau of in the heart of Plaka, the neoclassical architecture of the old city harmonizes with the ancient ruins -A tranquil tableau of in the desolate beauty of the American Southwest, Chaco Canyon's ancient ruins whispered tales of an enigmatic civilization that once thrived amidst the arid landscapes -A tranquil tableau of at the edge of the Arabian Desert, the ancient city of Petra beckoned with its enigmatic rock-carved façades -In a still frame, amidst the cobblestone streets, an Art Nouveau lamppost stood tall -A tranquil tableau of in the quaint village square, a traditional wrought-iron streetlamp featured delicate filigree patterns and amber-hued glass panels -A tranquil tableau of the lampposts were adorned with Art Deco motifs, their geometric shapes and frosted glass creating a sense of vintage glamour -In a still frame, in the picturesque square, a Gothic-style lamppost 
adorned with intricate stone carvings added a touch of medieval charm to the setting -In a still frame, in the heart of the old city, a row of ornate lantern-style streetlamps bathed the narrow alleyway in a warm, welcoming light -A tranquil tableau of in the heart of the Utah desert, a massive sandstone arch spanned the horizon -A tranquil tableau of in the Arizona desert, a massive stone bridge arched across a rugged canyon -A tranquil tableau of in the corner of the minimalist tea room, a bonsai tree added a touch of nature's beauty to the otherwise simple and elegant space -In a still frame, amidst the hushed ambiance of the traditional tea room, a meticulously arranged tea set awaited, with porcelain cups, a bamboo whisk -In a still frame, nestled in the Zen garden, a rustic teahouse featured tatami seating and a traditional charcoal brazier -A tranquil tableau of a country estate's library featured elegant wooden shelves -A tranquil tableau of beneath the shade of a solitary oak tree, an old wooden park bench sat patiently -A tranquil tableau of beside a tranquil pond, a weeping willow tree draped its branches gracefully over the water's surface, creating a serene tableau of reflection and calm -A tranquil tableau of in the Zen garden, a perfectly raked gravel path led to a serene rock garden -In a still frame, a tranquil pond was fringed by weeping cherry trees, their blossoms drifting lazily onto the glassy surface -In a still frame, within the historic library's reading room, rows of antique leather chairs and mahogany tables offered a serene haven for literary contemplation -A tranquil tableau of a peaceful orchid garden showcased a variety of delicate blooms -A tranquil tableau of in the serene courtyard, a centuries-old stone well stood as a symbol of a bygone era, its mossy stones bearing witness to the passage of timeA beautiful coastal beach in spring, waves lapping on sand, in super slow motion +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, watercolor painting +A couple in formal evening wear going home get caught in a heavy downpour with umbrellas, surrealism style +An astronaut flying in space, Van Gogh style +An astronaut flying in space, oil painting +An astronaut flying in space by Hokusai, in the style of Ukiyo +An astronaut flying in space, black and white +An astronaut flying in space, pixel art +An astronaut flying in space, in cyberpunk style +An astronaut flying in space, animated style +An astronaut flying in space, watercolor painting +An astronaut flying in space, surrealism style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, Van Gogh style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, oil painting +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks by Hokusai, in the style of Ukiyo +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, black and white +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. 
the canyons twist and bend through the high elevated mountain peaks, pixel art +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, in cyberpunk style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, animated style +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, watercolor painting +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, surrealism style +A beautiful coastal beach in spring, waves lapping on sand, in super slow motion A beautiful coastal beach in spring, waves lapping on sand, zoom in A beautiful coastal beach in spring, waves lapping on sand, zoom out A beautiful coastal beach in spring, waves lapping on sand, pan left @@ -772,7 +680,187 @@ Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and s Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, tilt down Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, with an intense shaking effect Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, featuring a steady and smooth perspective -Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, racking focusa bicycle on the left of a car, front view +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, racking focus +Close up of grapes on a rotating table. +Turtle swimming in ocean. +A storm trooper vacuuming the beach. +A panda standing on a surfboard in the ocean in sunset. +An astronaut feeding ducks on a sunny afternoon, reflection from the water. +Two pandas discussing an academic paper. +Sunset time lapse at the beach with moving clouds and colors in the sky. +A fat rabbit wearing a purple robe walking through a fantasy landscape. +A koala bear playing piano in the forest. +An astronaut flying in space. +Fireworks. +An animated painting of fluffy white clouds moving in sky. +Flying through fantasy landscapes. +A bigfoot walking in the snowstorm. +A squirrel eating a burger. +A cat wearing sunglasses and working as a lifeguard at a pool. +Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks. +Splash of turquoise water in extreme slow motion, alpha channel included. +an ice cream is melting on the table. +a drone flying over a snowy forest. +a shark is swimming in the ocean. +Aerial panoramic video from a drone of a fantasy land. +a teddy bear is swimming in the ocean. +time lapse of sunrise on mars. +golden fish swimming in the ocean. +An artist brush painting on a canvas close up. 
+A drone view of celebration with Christmas tree and fireworks, starry sky - background. +happy dog wearing a yellow turtleneck, studio, portrait, facing camera, dark background +Origami dancers in white paper, 3D render, on white background, studio shot, dancing modern dance. +Campfire at night in a snowy forest with starry sky in the background. +a fantasy landscape +A 3D model of a 1800s victorian house. +this is how I do makeup in the morning. +A raccoon that looks like a turtle, digital art. +Robot dancing in Times Square. +Busy freeway at night. +Balloon full of water exploding in extreme slow motion. +An astronaut is riding a horse in the space in a photorealistic style. +Macro slo-mo. Slow motion cropped closeup of roasted coffee beans falling into an empty bowl. +Sewing machine, old sewing machine working. +Motion colour drop in water, ink swirling in water, colourful ink in water, abstraction fancy dream cloud of ink. +Few big purple plums rotating on the turntable. water drops appear on the skin during rotation. isolated on the white background. close-up. macro. +Vampire makeup face of beautiful girl, red contact lenses. +Ashtray full of butts on table, smoke flowing on black background, close-up +Pacific coast, carmel by the sea ocean and waves. +A teddy bear is playing drum kit in NYC Times Square. +A corgi is playing drum kit. +An Iron man is playing the electronic guitar, high electronic guitar. +A raccoon is playing the electronic guitar. +A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh +A corgi's head depicted as an explosion of a nebula +A fantasy landscape +A future where humans have achieved teleportation technology +A jellyfish floating through the ocean, with bioluminescent tentacles +A Mars rover moving on Mars +A panda drinking coffee in a cafe in Paris +A space shuttle launching into orbit, with flames and smoke billowing out from the engines +A steam train moving on a mountainside +A super cool giant robot in Cyberpunk Beijing +A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground +Cinematic shot of Van Gogh's selfie, Van Gogh style +Gwen Stacy reading a book +Iron Man flying in the sky +The bund Shanghai, oil painting +Yoda playing guitar on the stage +A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo +A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh +A boat sailing leisurely along the Seine River with the Eiffel Tower in background +A car moving slowly on an empty street, rainy evening +A cat eating food out of a bowl +A cat wearing sunglasses at a pool +A confused panda in calculus class +A cute fluffy panda eating Chinese food in a restaurant +A cute happy Corgi playing in park, sunset +A cute raccoon playing guitar in a boat on the ocean +A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background +A lightning striking atop of eiffel tower, dark clouds in the sky +A modern art museum, with colorful paintings +A panda cooking in the kitchen +A panda playing on a swing set +A polar bear is playing guitar +A raccoon dressed in suit playing the trumpet, stage background +A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy +A shark swimming in clear Caribbean ocean +A super robot protecting city +A teddy bear washing the dishes +An epic tornado attacking above a glowing city at night, the tornado is made of smoke +An oil 
painting of a couple in formal evening wear going home get caught in a heavy downpour with umbrellas +Clown fish swimming through the coral reef +Hyper-realistic spaceship landing on Mars +The bund Shanghai, vibrant color +Vincent van Gogh is painting in the room +Yellow flowers swing in the wind +alley +amusement park +aquarium +arch +art gallery +bathroom +bakery shop +ballroom +bar +barn +basement +beach +bedroom +bridge +botanical garden +cafeteria +campsite +campus +carrousel +castle +cemetery +classroom +cliff +crosswalk +construction site +corridor +courtyard +desert +downtown +driveway +farm +food court +football field +forest road +fountain +gas station +glacier +golf course +indoor gymnasium +harbor +highway +hospital +house +iceberg +industrial area +jail cell +junkyard +kitchen +indoor library +lighthouse +laboratory +mansion +marsh +mountain +indoor movie theater +indoor museum +music studio +nursery +ocean +office +palace +parking lot +pharmacy +phone booth +raceway +restaurant +river +science museum +shower +ski slope +sky +skyscraper +baseball stadium +staircase +street +supermarket +indoor swimming pool +tower +outdoor track +train railway +train station platform +underwater coral reef +valley +volcano +waterfall +windmill +a bicycle on the left of a car, front view a car on the right of a motorcycle, front view a motorcycle on the left of a bus, front view a bus on the right of a traffic light, front view @@ -855,82 +943,4 @@ a kite on the bottom of a skateboard, front view a skateboard on the top of a surfboard, front view a skateboard on the bottom of a surfboard, front view a surfboard on the top of skis, front view -a surfboard on the bottom of skis, front viewa person -a bicycle -a car -a motorcycle -an airplane -a bus -a train -a truck -a boat -a traffic light -a fire hydrant -a stop sign -a parking meter -a bench -a bird -a cat -a dog -a horse -a sheep -a cow -an elephant -a bear -a zebra -a giraffe -a backpack -an umbrella -a handbag -a tie -a suitcase -a frisbee -skis -a snowboard -a sports ball -a kite -a baseball bat -a baseball glove -a skateboard -a surfboard -a tennis racket -a bottle -a wine glass -a cup -a fork -a knife -a spoon -a bowl -a banana -an apple -a sandwich -an orange -broccoli -a carrot -a hot dog -a pizza -a donut -a cake -a chair -a couch -a potted plant -a bed -a dining table -a toilet -a tv -a laptop -a remote -a keyboard -a cell phone -a microwave -an oven -a toaster -a sink -a refrigerator -a book -a clock -a vase -scissors -a teddy bear -a hair drier -a toothbrush \ No newline at end of file +a surfboard on the bottom of skis, front view From 53da61889110be6e01c2756f2fb513de49d008b3 Mon Sep 17 00:00:00 2001 From: Zhang Fan Date: Mon, 11 Dec 2023 23:04:50 -0600 Subject: [PATCH 051/248] Update README.md [fix] bibtex & pretrained --- README.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index 1b7d87d8..8e909ad5 100755 --- a/README.md +++ b/README.md @@ -37,7 +37,7 @@ We propose **VBench**, a comprehensive benchmark suite for video generative mode ``` ## :gem: Pre-Trained Models -[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrain` folder. +[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder. 
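As a quick sanity check for the download step above, the minimal sketch below walks the `pretrained` folder, reads each model's `model_path.txt`, and reports which checkpoints are still missing. It is illustrative only and not part of the repository; in particular, the one-expected-path-per-line format of `model_path.txt` is an assumption here, so adapt the parsing to whatever guidance each file actually contains.

```python
# Illustrative sketch (not repo code): report missing pre-trained weights.
# Assumes each model folder under `pretrained/` ships a `model_path.txt`
# listing one expected checkpoint path per line -- an assumption; check
# each file's own guidance for the real download instructions.
import os

root = "pretrained"
if os.path.isdir(root):
    for model_dir in sorted(os.listdir(root)):
        guide = os.path.join(root, model_dir, "model_path.txt")
        if not os.path.isfile(guide):
            continue  # skip entries without a guidance file
        with open(guide) as f:
            expected = [line.strip() for line in f if line.strip()]
        missing = [p for p in expected if not os.path.exists(p)]
        if missing:
            print(f"{model_dir}: {len(missing)} checkpoint(s) still to download")
        else:
            print(f"{model_dir}: all weights present")
```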
## :bookmark_tabs: Prompt Suite @@ -75,8 +75,8 @@ bash evaluate.sh ```bibtex @article{huang2023vbench, title={{VBench}: Comprehensive Benchmark Suite for Video Generative Models}, - author={Huang, Ziqi and He, Yinan and Yu, Jiashuo and Zhang, Fan and Si, Chenyang and Jiang, Yuming and Zhang, Yuanhan and Wu, Tianxing and Jin, Qingyang and Chanpaisit, Nattapol and Wang, Yaohui and Chen, Xinyuan and Wang, Limin and Lin, Dahua and Qiao, Yu and Liu, Ziwei} - journal={arXiv preprint arXiv:2311.17982} + author={Huang, Ziqi and He, Yinan and Yu, Jiashuo and Zhang, Fan and Si, Chenyang and Jiang, Yuming and Zhang, Yuanhan and Wu, Tianxing and Jin, Qingyang and Chanpaisit, Nattapol and Wang, Yaohui and Chen, Xinyuan and Wang, Limin and Lin, Dahua and Qiao, Yu and Liu, Ziwei}, + journal={arXiv preprint arXiv:2311.17982}, year={2023} } ``` From be133b1849e450dc41b49db677cf248c2eacc5cf Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Wed, 13 Dec 2023 16:47:09 +0800 Subject: [PATCH 052/248] [update] subject_consistency --- vbench/subject_consistency.py | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/vbench/subject_consistency.py b/vbench/subject_consistency.py index a8ea2bb3..f946b1bb 100755 --- a/vbench/subject_consistency.py +++ b/vbench/subject_consistency.py @@ -27,8 +27,7 @@ def subject_consistency(model, video_list, device, read_frame=True): for video_path in tqdm(video_list): video_sim = 0.0 if read_frame: - # video_path = video_path[:-4].replace('videos', 'frames') # TODO: original - video_path = video_path[:-4].replace('videos', 'frames').replace("yujiashuo/sample/sample_subject_consistency_20230927_027_0095000_128_512x512",'heyinan/aigc-eval/sample/origin_sample_subject_consistency_20230927_027_0095000_128_512x512') # TODO: DO NOT PUSH + video_path = video_path[:-4].replace('videos', 'frames').replace(' ', '_') tmp_paths = [os.path.join(video_path, f) for f in sorted(os.listdir(video_path))] images = [] for tmp_path in tmp_paths: From ab901487da8c8ea81977d19b4bfbef34850caa57 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Wed, 13 Dec 2023 16:50:46 +0800 Subject: [PATCH 053/248] [delete] gitmodules not used --- .gitmodules | 9 --------- 1 file changed, 9 deletions(-) delete mode 100755 .gitmodules diff --git a/.gitmodules b/.gitmodules deleted file mode 100755 index 7cd948c3..00000000 --- a/.gitmodules +++ /dev/null @@ -1,9 +0,0 @@ -# [submodule "deps/gmflow"] -# path = deps/gmflow -# url = https://github.com/haofeixu/gmflow.git -# [submodule "deps/ControlNet"] -# path = deps/ControlNet -# url = https://github.com/lllyasviel/ControlNet.git -# [submodule "deps/ebsynth"] -# path = deps/ebsynth -# url = https://github.com/SingleZombie/ebsynth.git \ No newline at end of file From 3f66475498d0cdd40f3fd19ed82964b597282e97 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 14 Dec 2023 16:55:39 +0800 Subject: [PATCH 054/248] [add] all_categories.txt --- prompts/all_categories.txt | 800 +++++++++++++++++++++++++++++++++++++ 1 file changed, 800 insertions(+) create mode 100644 prompts/all_categories.txt diff --git a/prompts/all_categories.txt b/prompts/all_categories.txt new file mode 100644 index 00000000..33be2a87 --- /dev/null +++ b/prompts/all_categories.txt @@ -0,0 +1,800 @@ +a black dog wearing halloween costume +spider making a web +bat eating fruits while hanging +a snake crawling on a wooden flooring +a close up video of a dragonfly +macro shot of ladybug on green leaf plant +chameleon eating ant +a bee feeding on nectars +bird nests on a tree captured with moving camera +a 
squirrel eating nuts +close up video of snail +top view of a hermit crab crawling on a wooden surface +cat licking another cat +red dragonfly perched on green leaf +close up view of a brown caterpillar crawling on green leaf +ants eating dead spider +an eagle on a tree branch +a frog eating an ant +white rabbit near the fence +a gorilla eating a carrot +close up of wolf +a meerkat looking around +a hyena in a zoo +lemur eating grass leaves +an owl being trained by a man +a lizard on a bamboo +brown chicken hunting for its food +video of parrots perched on bird stand +underwater footage of an octopus in a coral reef +a cute pomeranian dog playing with a soccer ball +white fox on rock +close up footage of a horse figurine +giraffe feeding on a tree in a savannah +curious cat sitting and looking around +hummingbird hawk moth flying near pink flowers +close up of a scorpion on a rock +close up on fish in net +koala eating leaves from a branch +a pod of dolphins swirling in the sea catching forage fish +low angle view of a hawk perched on a tree branch +a lion standing on wild grass +deer grazing in the field +elephant herd in a savanna +close up on lobster under water +hedgehog crossing road in forest +a sheep eating yellow flowers from behind a wire fence +twin sisters and a turtle +a pig wallowing in mud +flock of goose eating on the lake water +cow in a field irritated with flies +a close up shot of a fly +cheetah lying on the grass +close up of a lemur +close up shot of a kangaroo itching in the sand +a tortoise covered with algae +turkey in cage +a great blue heron bird in the lakeside +crab with shell in aquarium +a seagull walking on shore +an american crocodile +a tiger walking inside a cage +alligator in the nature +a raccoon climbing a tree +wild rabbit in a green meadow +group of ring tailed lemurs +a clouded leopard on a tree branch +duck grooming its feathers +an african penguin walking on a beach +a video of a peacock +close up shot of a wild bear +baby rhino plays with mom +porcupine climbs tree branches +close up of a natterjack toad on a rock +a sleeping orangutan +mother whale swimming with babies +a bear wearing red jersey +pink jellyfish swimming underwater in a blue sea +beautiful clown fish swimming +animation of disposable objects shaped as a whale +paper cut out of a pair of hands a whale and a heart +vertical video of camel roaming in the field during daytime +a still video of mosquito biting human +a curious sloth hanging from a tree branch +a plastic flamingo bird stumbles from the wind +a wolf in its natural habitat +a monkey sitting in the stone and scratching his head +bat hanging upside down +a red panda eating leaves +snake on ground +a harbour seal swimming near the shore +shark swimming in the sea +otter on branch while eating +goat standing over a rock +a troop of monkey on top of a mountain +a zebra eating grass on the field +a colorful butterfly perching on a bud +a snail crawling on a leaf +zookeeper showering a baby elephant +a beetle emerging from the sand +a nine banded armadillo searching for food +an apartment building with balcony +asian garden and medieval castle +illuminated tower in berlin +a wooden house overseeing the lake +a crowd of people in a plaza in front of a government building +a church interior +jewish friends posing with hanukkah menorah in a cabin house +a destroyed building after a missile attack in ukraine +abandoned building in the woods +drone video of an abandoned school building in pripyat ukraine +elegant university building 
+architecture and designs of buildings in central london +a pancake tower with chocolate syrup and strawberries on top +an ancient white building +friends hanging out at a coffee house +house front door with christmas decorations +city night dark building +a bird house hanging on a tree branch +sacred sculpture in a temple +high angle shot of a clock tower +modern wooden house interior +the interior of an abandoned building +opera house overlooking sea +a concrete structure near the green trees +dome like building in scotland +low angle shot of a building +tower on hill +a miniature house +eiffel tower from the seine river +low angle footage of an apartment building +island with pier and antique building +asian historic architecture +drone footage of a beautiful mansion +mosque in the middle east +building a tent and hammock in the forest camping site +top view of a high rise building +house covered in snow +skyscraper at night +house in village +a casino with people outside the building +silhouette of a building +a woman climbing a tree house +drone view of house near lake during golden hour +an under construction concrete house +a watch tower by the sea +exterior view of arabic style building +video of a hotel building +red paper lantern decorations hanging outside a building +house on seashore +aerial footage of the palace of culture and science building in warsaw poland +aerial video of stuttgart tv tower in germany +aerial view of the highway and building in a city +drone shot of a skyscraper san francisco california usa +waterfall and house +view of the sky through a building +drone footage of a house on top of the mountain +abandoned house in the nature +clouds hovering over a mansion +light house on the ocean +buddhist temple at sunrise +people walking by a graveyard near a mosque at sunset +view of lifeguard tower on the beach +scenic view of a house in the mountains +the landscape in front of a government building +aerial footage of a building and its surrounding landscape in winter +time lapse of a cloudy sky behind a transmission tower +blue ocean near the brown castle +fog over temple +house in countryside top view +building under construction +turkish flag waving on old tower +the georgian building +close up shot of a steel structure +the atrium and interior design of a multi floor building +city view reflected on a glass building +aerial view of a luxurious house with pool +an unpaved road leading to the house +drone footage of a lookout tower in mountain landscape +wind turbines on hill behind building +time lapse footage of the sun light in front of a small house porch +a building built with lots of stairways +overcast over house on seashore +the view of the sydney opera house from the other side of the harbor +candle on a jar and a house figurine on a surface +video of a farm and house +a dilapidated building made of bricks +a view of a unique building from a moving vehicle +aerial footage of a tall building in cambodia +push in shot of a huge house +a beach house built over a seawall protected from the sea waves +exotic house surrounded by trees +drone video of a house surrounded by tropical vegetation +drone footage of a building beside a pond +observation tower on hill in forest +a tree house in the woods +a video of vessel structure during daytime +fire in front of illuminated building at night +a footage of a wooden house on a wheat field +tilt shot of a solar panel below a light tower +water tower on the desert +freshly baked finger looking cookies +video of fake 
blood in wine glass +halloween food art +a person slicing a vegetable +a serving of pumpkin dish in a plate +close up view of green leafy vegetable +a birthday cake in the plate +video of a slice papaya fruit +a muffin with a burning candle and a love sign by a ceramic mug +a jack o lantern designed cookie +baked bread with chocolate +a broccoli soup on wooden table +a freshly brewed coffee on a pink mug +grabbing sourdough neapolitan style pizza slices +person cooking mushrooms in frying pan +rice grains placed on a reusable cloth bag +slices of kiwi fruit +grilling a steak on a pan grill +close up of bread popping out of a toaster +man eating noodle +preparing a cocktail drink +close up pasta with bacon on plate +milk and cinnamon rolls +boy getting a dumpling using chopsticks +a mother preparing food with her kids +man using his phone while eating +fresh salmon salad on a plate +cutting cucumbers into long thin slices as ingredient for sushi roll +a steaming cup of tea by the window +a glass filled with beer +a kid eating popcorn while watching tv +close up shot of fried fish on the plate +a man eating a donut +person making a vegetarian dish +spreading cheese on bagel +close up view of a man drinking red wine +a couple having breakfast in a restaurant +a student eating her sandwich +girl peeling a banana +red rice in a small bowl +pancake with blueberry on the top +green apple fruit on white wooden table +a man eating a taco by the bar +making of a burrito +squeezing lemon into salad +a chef cutting sushi rolls +video of a delicious dessert +deep frying a crab on a wok in high fire +close up video of a orange juice +video of a cooked chicken breast +woman holding a pineapple +a woman eating a bar of chocolate +decorating christmas cookie +squeezing a slice of fruit +tuna sashimi on a plate +a strawberry fruit mixed in an alcoholic drink +preparing hot dogs in a grill +a woman cutting a tomato +an orange fruit cut in half +a coconut fruit with drinking straw +woman holding a dragon fruit +a woman pouring hot beverage on a cup +waffles with whipped cream and fruit +focus shot of an insect at the bottom of a fruit +preparing a healthy broccoli dish +man eating snack at picnic +close up video of a grilled shrimp skewer +a woman mixing a smoothie drinks +close up video of woman having a bite of jelly +businessman drinking whiskey at the bar counter of a hotel lounge +cutting an onion with a knife over a wooden chopping board +fresh lemonade in bottles +grilling a meat on a charcoal grill +people enjoying asian cuisine +close up footage of a hot dish on a clay pot +pork ribs dish +waffle with strawberry and syrup for breakfast +tofu dish with rose garnish +uncooked pork meat +egg yolk being dumped over gourmet dish +tasty brunch dish close up +little boy pretending to eat the watermelon +slicing roasted beef +close up of a chef adding teriyaki sauce to a dish +flat lay mexican dish +a person placing an octopus dish on a marble surface +close up of tea leaves brewing in a glass kettle +adding fresh herbs to soup dish +a scoop of roasted coffee beans +fresh dim sum set up on a bamboo steam tray for cooking +a girl putting ketchup on food at the kitchen +cooking on electric stove +a woman with a slice of a pie +grapes and wine on a wooden board +man taking picture of his food +hamburger and fries on restaurant table +close up video of japanese food +a cracker sandwich with cheese filling for snack +barista preparing matcha tea +close up of onion rings being deep fried +people carving a pumpkin 
+people sitting on a sofa +a man with a muertos face painting +man walking in the dark +men in front of their computer editing photos +men loading christmas tree on tow truck +woman washing the dishes +woman adding honey to the cinnamon rolls +two women kissing and smiling +three women looking at watercolor paintings +a family wearing paper bag masks +a family posing for the camera +a boy covering a rose flower with a dome glass +boy sitting on grass petting a dog +a girl in her tennis sportswear +a girl coloring the cardboard +silhouette of the couple during sunset +couple dancing with body paint +a child playing with water +a woman with her child sitting on a couch in the living room +a group of friend place doing hand gestures of agreement +friends having a group selfie +friends talking while on the basketball court +group of people protesting +a group of campers with a cute dog +a group of photographers taking pictures at the north western gardens in llandudno north wales +a group of students laughing and talking +a group of martial artist warming up +a person playing golf +a person walking on a wet wooden bridge +person doing a leg exercise +ice hockey athlete on rink +a young athlete training in swimming +chess player dusting a chessboard +baseball player holding his bat +a bearded man putting a vinyl record on a vinyl player +an orchestra finishes a performance +people applauding the performance of the kids +band performance at the recording studio +father and his children playing jenga game +people playing a board game +man playing a video game +a man video recording the movie in theater +man and a woman eating while watching a movie +movie crew talking together +a director explaining the movie scene +man and woman listening to music on car +man playing music +couple dancing slow dance with sun glare +a ballerina practicing in the dance studio +father and son holding hands +father and daughter talking together +a mother and her kids engaged in a video call +mother and daughter reading a book together +a mother teaching her daughter playing a violin +kid in a halloween costume +a happy kid playing the ukulele +a chef slicing a cucumber +chef wearing his gloves properly +brother and sister using hammock +girl applying sunblock to her brother +a girl pushing the chair while her sister is on the chair +colleagues talking in office building +fighter practice kicking +a woman fighter in her cosplay costume +an engineer holding blueprints while talking with her colleague +a young woman looking at vr controllers with her friend +workmates teasing a colleague in the work +a male police officer talking on the radio +teacher holding a marker while talking +teacher writing on her notebook +a young student attending her online classes +a student showing his classmates his wand +a male vendor selling fruits +a shirtless male climber +a sound engineer listening to music +female talking to a psychiatrist in a therapy session +young female activist posing with flag +a man in a hoodie and woman with a red bandana talking to each other and smiling +a medium close up of women wearing kimonos +a male interviewer listening to a person talking +a social worker having a conversation with the foster parents +a farm worker harvesting onions +worker packing street food +worker and client at barber shop +elderly man lifting kettlebell +mom assisting son in riding a bicycle +dad watching her daughter eat +young guy with vr headset +pregnant woman exercising with trainer +a fortune teller talking to a client 
+wizard doing a ritual on a woman +a footage of an actor on a movie scene +a man holding a best actor trophy +a singer of a music band +a young singer performing on stage +young dancer practicing at home +seller showing room to a couple +cab driver talking to passenger +a policeman talking to the car driver +kids celebrating halloween at home +little boy helping mother in kitchen +video of a indoor green plant +a girl arranges a christmas garland hanging by the kitchen cabinet +candle burning in dark room +couple having fun and goofing around the bedroom +girls jumping up and down in the bedroom +woman and man in pajamas working from home +a muslim family sitting and talking in the living room +family enjoying snack time while sitting in the living room +woman holding an animal puppet and a little girl playing together at the living room +kids playing in the indoor tent +young people celebrating new year at the office +a woman writing on the sticky note in the office +a woman exercising at home over a yoga mat +girls preparing easter decorations at home +dog on floor in room +turning on a fluorescent light inside a room +colleagues talking to each other near the office windows +a woman recording herself while exercising at home +music room +different kind of tools kept in a utility room +sofa beds and other furniture +a girl finding her brother reading a book in the bedroom +an elegant ceramic plant pot and hanging plant on indoor +furniture inside a bedroom +interior design of the bar section +living room with party decoration +firewood burning in dark room +a young woman playing the ukulele at home +woman painting at home +a woman in a locker room +video of a bathroom interior +the interior design of a jewish synagogue +a woman in protective suit disinfecting the kitchen +modern minimalist home interior +modern interior design of a coffee shop +person arranging minimalist furniture +aerial shot of interior of the warehouse +a room of a manufacturing facility +interior of catholic +interior design of a restaurant +a female model in a changing room looking herself in mirror +men walking in the office hallway +people sitting in a conference room +the interior design of a shopping mall +chandeliers in room +lucerne railway station interior +a female fencer posing in a foggy room +a toolbox and a paint roller beside a huge package in a room +bedroom in hotel +a woman lying in the operating room +a chef holding and checking kitchen utensils +a couple singing in the shower room together +a woman cleaning mess in the living room +an empty meeting room with natural light +person dancing in a dark room +close up on blood in hospital room +a couple resting on their home floor +a young female staff at courier office +a man entering the gym locker room +a bored man sitting by the tv at home +woman dancing in indoor garden +rubble in the interior of an abandoned house +indoor farm in a greenhouse +man doing handstand in indoor garden +an abandoned indoor swimming pool +home decorations on top of a cabinet +graffiti art on the interior walls of an abandoned mansion +indoor wall climbing activity +sunlight inside a room +teenage girl roller skating at indoor rink +home deco with lighted +baby in the shower room +men enjoying office christmas party +a bedroom with a brick wall +actors prepping in the dressing room +kids playing at an indoor playground +a person sanitizing an office space using smoke machine +mother and daughter choosing clothes at home +a woman sitting by the indoor fire pit +man 
standing on the corner of the room while looking around +person assembling furniture +a family stacking cardboard boxes in a room +family having fun in the dining room +person disinfecting a room +a woman washing strawberries in the kitchen sink +modern office waiting room +close up view of a person slicing with a kitchen knife +boiling coffee on a stove in the kitchen +modern equipment used in a home studio +interior of a recording studio +people working in a call center office +band performing at a home concert +a group of people watching a concert in a room +people packing their furniture +young employees in office holding a certificate +a criminal inside a dark room handcuffed in a table +couple browsing and looking for furniture in the store +workspace at home +video of a indoor green plant +close up view of a plant +close up shot of a burning plant +plucking leaves from plant +a plant on gold pot with glass lid +a branch of a tree and a plant +a leafless tree +close up shot of fern leaf +close up video of strawberry plant +plant with blooming flowers +close up video of flower petals +watering yellow plant +beautiful flower decoration +cannabis flower in a jar +a footage of the tree leaves +a red leaf plant +close up view of a white christmas tree +snow pouring on a tree +close up shot of white flowers on the tree +leaves in the trees daytime +a dead tree lying on a grass field +tree branches in a flowing river +purple flowers with leaves +a coconut tree by the house +close up on flower in winter +bamboo leaves backlit by the sun +close up video of a wet flower +a man putting a flower in a box +dropping flower petals on a wooden bowl +a close up shot of gypsophila flower +variety of succulent plants on a garden +variety of trees and plants in a botanical garden +forest of deciduous trees +a stack of dried leaves burning in a forest +tall forest trees on a misty morning +close up view of dewdrops on a leaf +close up view of white petaled flower +removing a pineapple leaf +a dragonfly perched on a leaf +butterfly pollinating flower +person visiting and checking a corn plant +woman picking beans from a plant +woman plucking mint leaves +single tree in the middle of farmland +a plant on a soil +drone footage of a tree on farm field +a tractor harvesting lavender flower +people putting christmas ornaments on a christmas tree +jack o lantern hanging on a tree +tree with halloween decoration +flower field near the waterfall +truck carrying the tree logs +raindrops falling on leaves +shot of a palm tree swaying with the wind +squirrels on a tree branch +person holding a flower +a fallen tree trunk +tree with golden leaves +cherry tree +wind blows through leaves of the tree in autumn +a leaf on a glass +the long trunks of tall trees in the forest +trees in the forest during sunny day +close up video of tree bark +reflection of tree branches +trunks of many trees in the forest +tree leaves providing shades from the sun +leaves swaying in the wind +low angle shot of baobab tree +bare trees in forest +a plant surrounded by fallen leaves +a couple preparing food and pruning a plant +a man cutting a tree bark +oranges on a tree branch +plant connected on the stones +video of a sawmill machine cutting tree log +women drying flower petals +macro view of an agave plant +a video of a person tying a plant on a string +green moss in forest nature +coconut tree near sea under blue sky +the canopy of a coconut tree +a man leaning on a tree at the beach +a full grown plant on a pot +candle wax dripping on 
flower petals +close up of leaves in autumn +a woman opening a book with a flower inside +a man holding leaves looking at the camera +a shadow of a swaying plant +a tree and concrete structure under a blue and cloudy sky +trimming excess leaves on a potted plant +the changing color of the tree leaves during autumn season +a gooseberry tree swayed by the wind +forest trees and a medieval castle at sunset +woman cut down tree +an old oak tree in a park across the street from a hotel +wild flowers growing in a forest ground +a mossy fountain and green plants in a botanical garden +mansion with beautiful garden +ants on a dragon fruit flower +scenery of desert landscape +landscape agriculture farm tractor +burning slash piles in the forest +graveyard at sunset +view of a jack o lantern with pumpkins in a smoky garden +sun view through a spider web +view of the sea from an abandoned building +close up view of a full moon +close up view of lighted candles +close up view of swaying white flowers and leaves +scenery of a relaxing beach +selective focus video of grass during sunny day +aerial view of brown dry landscape +fireworks display in the sky at night +a bonfire near river +mountain view +waterfalls in between mountain +a picturesque view of nature +exotic view of a riverfront city +tall trees in the forest under the clear sky +snow on branches in forest +stream in the nature +an airplane flying above the sea of clouds +scenic video of sunset +view of houses with bush fence under a blue and cloudy sky +scenic view from wooden pathway +scenic view of a tropical beach +drone footage of waves crashing on beach shore +a scenic view of the golden hour at norway +time lapse video of foggy mountain forest +brown mountain during fall season +video of ocean during daytime +boat sailing in the ocean +top view of yachts +beautiful scenery of flowing waterfalls and river +wild ducks paddling on the lake surface +a relaxing scenery of beach view under cloudy sky +natural rock formations on beach under cloudy sky +a palm tree against blue sky +video of sailboat on a lake during sunset +aerial view of snow piles +time lapse of a sunset sky in the countryside +aerial footage of a statue +time lapse video of a farm during sunset +clouds formation in the sky at sunset +aerial shot of a village +drone shot of a beautiful sunrise at the mountains +time lapse video of foggy morning during sunrise +sun shining between tree leaves at sunrise +video of lake during dawn +vehicles traveling on roadway under cloudy sky +view of golden domed church +a monument under the blue sky +firecrackers in the sky +view of fruit signage in the farm +a dark clouds over shadowing the full moon +view of the amazon river +a big river swamp in a dense forest +a blooming cherry blossom tree under a blue sky with white clouds +a river waterfall cascading down the plunge basin +flooded landscape with palm trees +a blurry waterfall background +waterfall in the mountains +aerial footage of a city at night +pond by small waterfall in forest +aerial view of farmlands at the bay of lake +rice terraces in the countryside +a highway built across an agricultural area in the countryside +gloomy morning in the countryside +drone shot of an abandoned coliseum on a snowy mountain top +boat sailing in the middle of ocean +drone shot of the grass field +natural landscape of mountain and sea with islets developed into a community +aerial view of zaporizhia in ukraine +aerial footage of a herd +an aerial footage of a red sky +grass and plants growing in 
the remains of an abandoned house +view from hill on city +aerial view on orthodox church +aerial view of bay in croatia +a footage of a frozen river +overlooking view of a city at daylight +view outside the cemetery +clear sky with moon over meadow +clouds over railway +aerial footage of moving vehicles on the road at night +aerial view of town and park +top view of skyscrapers +top view of the empire state building in manhattan +top view of the central park in new york city +sheep running in a grass field +clear sky over factory +smoke and fire in birds eye view +view of a pathway with snow melting on its side +ferry under bridge on river near city in malaysia +mountain slopes covered in green vegetation +panoramic view of a town surrounded by snow covered mountains +aerial view of a palace +top view of vehicles driving on the intersection +a graveyard by a church in a mountain landscape +a modern railway station in malaysia use for public transportation +drone footage of amsterdam metro station +train arriving at a station +red vehicle driving on field +close up view of flashing emergency vehicle lighting +vehicle with fertilizer on field +a highway built across an agricultural area in the countryside +drone footage of motorcycles driving on country road between agricultural fields +a road in the woods under fog +footage of a car driving through a wheat field +vehicle stops for an ambulance passing through city traffic +emergency vehicle parked outside the casino +zombies attacking a woman and a boy inside a car +woman seating inside the car while chewing +video of passengers riding a double decker bus during night +traffic in london street at night +elderly couple checking engine of automobile +a green vintage automobile with an open hood parked in a parking area +close up of a prototype automobile with exposed engine on the back seat of the car +aerial view of road in forest +train departing from station +aerial view of a train passing by a bridge +video of a train tracks +video footage of a subway +video of blinking traffic lights +couple walking out on the subway +time lapse of a subway tunnel +monitor board inside the subway +metro train at night +zoom in video of a tram passing by city +young man using laptop in the tram +man reading a book at bus stop +close up shot of a moving taxi +night travel in london street on a public bus +red bus in a rainy city +flow of traffic in the city +close up shot of a yellow taxi turning left +two women calling for a taxi +drone view of an illuminated bridge across a river +policeman in police car talking on radio +airplane taking off at night +view through window in airplane +an airplane in the sky +helicopter landing on the street +a pilot getting out of a helicopter +a helicopter flying under blue sky +boat sailing in the middle of the ocean +girl playing with a toy boat +silhouette of a boat on sea during golden hour +a boat travelling around the lake +road on mountain ridge +ship sailing on danube river +slow motion video of a ship water trail in the sea +drone footage of a wreck ship on shore +a white yacht traveling on a river and passing under the bridge +female teenagers drinking champagne in the yacht +video of yacht sailing in the ocean +red combine harvester on road on field +a woman sitting on a bicycle while using a mobile phone +a woman sitting on a motorcycle looking around +three teenagers fixing a bicycle +a woman in a halloween costume posing on a motorcycle +a parked motorcycle on a foggy roadside +cable car near sea shore +a 
truck travelling in the road +footage of the road without any traffic +a road sign +love padlocks on a bridge +camera moving at highway construction site +vehicles driving on highway +a motorbike on highway at timelapse mode +point of view of a car driving through a tunnel +time lapse of heavy traffic on an avenue +ferry boat on city canal +black vintage car in museum +a zigzag road across a forest +people crossing the road +video of a kayak boat in a river +a person paddling a wooden boat in a lake +a car charging in the parking area +cars parked on the road +footage of the street with people and vehicle passing by in the rain +traffic on busy city street +a woman getting out of the car to walk with their dog +yacht sailing through the ocean +people in queue to military ship +man wearing motorcycle helmet looking at the camera +empty seats in the bus +empty boat on the water +cargo train traveling on the mountainside +cruise ship in harbor +counting down at traffic lights +pressing the car ignition +fire truck driving on the road +a footage of a broken bicycle +drone footage of an ambulance on the road +slow motion footage of a racing car +ship sailing on sea against sunset +big cargo ship passing on the shore +back view of man and woman walking on unpaved road \ No newline at end of file From dbf993516b1378e75b8c4cc0799aeedb9ad4570b Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 14 Dec 2023 18:01:15 +0800 Subject: [PATCH 055/248] [update] sampling instruction and combined prompts --- prompts/README.md | 87 +++++++++++++------ .../{all_categories.txt => all_category.txt} | 0 .../all_dimension.txt | 0 3 files changed, 62 insertions(+), 25 deletions(-) rename prompts/{all_categories.txt => all_category.txt} (100%) rename prompts/{prompts_per_dimension => }/all_dimension.txt (100%) diff --git a/prompts/README.md b/prompts/README.md index e3dcfb48..19ac0250 100755 --- a/prompts/README.md +++ b/prompts/README.md @@ -5,37 +5,56 @@ We design compact yet representative prompts in terms of both the evaluation dim ## Prompts per Dimension `prompts/prompts_per_dimension`: For each VBench evaluation dimension, we carefully designed a set of around 100 prompts as the test cases. +We provide a combined list `prompts/all_dimension.txt`, which combines all the prompts under `prompts/prompts_per_dimension`. +## Prompts per Category +`prompts/prompts_per_category`: 100 prompts for each of the 8 content categories: `Animal`, `Architecture`, `Food`, `Human`, `Lifestyle`, `Plant`, `Scenery`, `Vehicles`. +We provide a combined list `prompts/all_category.txt`, which combines all the prompts under `prompts/prompts_per_category`. -### Pseudo-Code for Video Sampling: -``` -for dimension in dimension_list: +## Metadata +`prompts/metadata`: metadata for some prompt lists, such as the `color` and `object_class` labels for prompts that need to be semantically parsed. 
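A minimal sketch of consuming such metadata, assuming a hypothetical file name and JSON layout (the actual files under `prompts/metadata/` define the authoritative schema):
```
import json

# hypothetical layout: one entry per prompt, e.g. {"a red car": {"color": "red"}}
with open('./prompts/metadata/color.json', 'r') as f:
    prompt_labels = json.load(f)

print(prompt_labels['a red car']['color'])
```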
-    # set random seed
-    if args.seed:
-        torch.manual_seed(args.seed)
-
-    # read prompt list
-    with open(f'./prompts/prompts_per_dimension/{dimension}.txt', 'r') as f:
-        prompt_list = f.readlines()
-        prompt_list = [prompt.strip() for prompt in prompt_list]
-
-    for prompt in prompt_list:
-        # sample 5 videos for each prompt
-        for index in range(5):
+# How to Sample Videos for Evaluation
-            # perform sampling
-            video_ = sample_per_video(dimension, prompt, idx)
-            cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4'
-            torchvision.io.write_video(cur_save_path, video_, fps=8)
-```
+We specify how to sample from `Prompts per Dimension` for VBench evaluation; sampling for `Prompts per Category` can be carried out similarly.
+
+
+## Evaluate Some Dimensions
+
+### Pseudo-Code for Sampling
+- If you only want to evaluate certain dimensions, below is the pseudo-code for sampling.
+    ```
+    dimension_list = ['object_class', 'overall_consistency']
+
+    for dimension in dimension_list:
+
+        # set random seed
+        if args.seed:
+            torch.manual_seed(args.seed)
+
+        # read prompt list
+        with open(f'./prompts/prompts_per_dimension/{dimension}.txt', 'r') as f:
+            prompt_list = f.readlines()
+            prompt_list = [prompt.strip() for prompt in prompt_list]
+
+        for prompt in prompt_list:
+
+            # sample 5 videos for each prompt
+            for index in range(5):
+
+                # perform sampling
+                video_ = sample_per_video(dimension, prompt, index)
+                cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4'
+                torchvision.io.write_video(cur_save_path, video_, fps=8)
+    ```
 ### Further Explanations
+
 To sample videos for VBench evaluation:
 - Sample videos from all the `txt` files in `prompts/prompts_per_dimension`.
 - For each prompt, sample 5 videos.
 - **Random Seed**: At the beginning of sampling from each `txt` file, set the random seed. For some models, the random seed is independently and randomly drawn for each video sample, and this is also acceptable, but it is best to record the random seed of every video being sampled (a minimal sketch of seed recording follows this list). We need to ensure: (1) The random seeds are random, and not cherry-picked. (2) The sampling process is reproducible, so that the evaluation results are reproducible.
 - Name the videos in the form of `$prompt-$index.mp4`, where `$index` takes the values `0, 1, 2, 3, 4`. For example:
 ```
 ├── A 3D model of a 1800s victorian house.-0.mp4
 ......
 ├── A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo-4.mp4
 ......
 ```
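Below is a minimal sketch of such per-video seed recording, reusing the placeholders from the pseudo-code above (`sample_per_video`, `args`, `prompt_list`); the `sampling_seeds.json` file name is illustrative, not something VBench requires:
```
import json
import torch

seed_record = {}
for prompt in prompt_list:
    for index in range(5):
        # torch.seed() re-seeds the default RNG with a fresh random value and
        # returns it, so each seed stays random yet ends up recorded on disk
        seed_record[f'{prompt}-{index}.mp4'] = torch.seed()
        video_ = sample_per_video(dimension, prompt, index)
        torchvision.io.write_video(f'{args.save_path}/{prompt}-{index}.mp4', video_, fps=8)

with open(f'{args.save_path}/sampling_seeds.json', 'w') as f:
    json.dump(seed_record, f)
```

## Evaluate All Dimensions

- If you want to evaluate all the dimensions, below is the pseudo-code for sampling.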
+    ```
+    # set random seed
+    if args.seed:
+        torch.manual_seed(args.seed)
+
+    # read prompt list
+    with open(f'./prompts/all_dimension.txt', 'r') as f:
+        prompt_list = f.readlines()
+        prompt_list = [prompt.strip() for prompt in prompt_list]
+
+    for prompt in prompt_list:
+
+        # sample 5 videos for each prompt
+        for index in range(5):
+
+            # perform sampling (the combined list covers all dimensions, so no
+            # dimension argument is needed here)
+            video_ = sample_per_video(prompt, index)
+            cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4'
+            torchvision.io.write_video(cur_save_path, video_, fps=8)
+    ```
-

diff --git a/prompts/all_categories.txt b/prompts/all_category.txt
similarity index 100%
rename from prompts/all_categories.txt
rename to prompts/all_category.txt
diff --git a/prompts/prompts_per_dimension/all_dimension.txt b/prompts/all_dimension.txt
similarity index 100%
rename from prompts/prompts_per_dimension/all_dimension.txt
rename to prompts/all_dimension.txt

From 21f250aa19f0061936fef57d9defbc2dfebe1d61 Mon Sep 17 00:00:00 2001
From: zhangfan-p
Date: Fri, 15 Dec 2023 17:32:49 +0800
Subject: [PATCH 056/248] [fix] subject consistency read_frame

---
 vbench/subject_consistency.py |  5 +++--
 vbench/utils.py               | 17 +++++------------
 2 files changed, 8 insertions(+), 14 deletions(-)

diff --git a/vbench/subject_consistency.py b/vbench/subject_consistency.py
index f946b1bb..8ab03c5a 100755
--- a/vbench/subject_consistency.py
+++ b/vbench/subject_consistency.py
@@ -16,7 +16,7 @@
 logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
 logger = logging.getLogger(__name__)
 
-def subject_consistency(model, video_list, device, read_frame=True):
+def subject_consistency(model, video_list, device, read_frame):
     sim = 0.0
     cnt = 0
     video_results = []
@@ -59,7 +59,8 @@
 def compute_subject_consistency(json_dir, device, submodules_list):
     dino_model = torch.hub.load(**submodules_list).to(device)
+    read_frame = submodules_list['read_frame']
     logger.info("Initialize DINO success")
     video_list, _ = load_dimension_info(json_dir, dimension='subject_consistency', lang='en')
-    all_results, video_results = subject_consistency(dino_model, video_list, device)
+    all_results, video_results = subject_consistency(dino_model, video_list, device, read_frame)
     return all_results, video_results
diff --git a/vbench/utils.py b/vbench/utils.py
index 8d2bcec7..6f023382 100755
--- a/vbench/utils.py
+++ b/vbench/utils.py
@@ -246,15 +246,6 @@ def init_submodules(dimension_list, local=False, read_frame=False):
                 'config':'pretrained/amt_model/AMT-S.yaml',
                 'ckpt':'pretrained/amt_model/amt-s.pth'
             }
-            # if local:
-            #     details = submodules_dict[dimension]
-            #     # Check if the file exists, if not, download it with wget
-            #     if not os.path.isfile(details['ckpt']):
-            #         print(f"File {details['ckpt']} does not exist. Downloading...")
-            #         wget_command = ['wget', '-P', os.path.dirname(details['ckpt']),
-            #                         'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth']
-            #         subprocess.run(wget_command)
-
             details = submodules_dict[dimension]
             # Check if the file exists, if not, download it with wget
             if not os.path.isfile(details['ckpt']):
@@ -269,7 +260,7 @@
             }
             details = submodules_dict[dimension]
             if not os.path.isfile(details['model']):
-                raise NotImplementedError
+                # raise NotImplementedError
                 print(f"File {details['model']} does not exist. Downloading...")
                 os.system(f'wget -P pretrained/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip')
                 os.system(f'unzip -d pretrained/raft_model/ pretrained/raft_model/models.zip')
@@ -281,7 +272,8 @@
                 'repo_or_dir':'pretrained/dino_model/facebookresearch_dino_main/',
                 'path':'pretrained/dino_model/dino_vitbase16_pretrain.pth',
                 'model': 'dino_vitb16',
-                'source': 'local'
+                'source': 'local',
+                'read_frame': read_frame
             }
             details = submodules_dict[dimension]
             # Check if the file exists, if not, download it with wget
@@ -298,7 +290,8 @@
             submodules_dict[dimension] = {
                 'repo_or_dir':'facebookresearch/dino:main',
                 'source':'github',
-                'model': 'dino_vitb16'
+                'model': 'dino_vitb16',
+                'read_frame': read_frame
             }
         elif dimension == 'aesthetic_quality':
             aes_path = "pretrained/aesthetic_model/emb_reader"

From 0ba657c2634aab8a5923cf8407bf271dc88f6614 Mon Sep 17 00:00:00 2001
From: zhangfan-p
Date: Sun, 17 Dec 2023 19:26:55 +0800
Subject: [PATCH 057/248] [fix] temporal flickering-skip empty file

---
 vbench/temporal_flickering.py | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/vbench/temporal_flickering.py b/vbench/temporal_flickering.py
index c704366a..406b362e 100644
--- a/vbench/temporal_flickering.py
+++ b/vbench/temporal_flickering.py
@@ -44,7 +44,10 @@ def temporal_flickering(video_list):
     sim = []
     video_results = []
     for video_path in tqdm(video_list):
-        score_per_video = cal_score(video_path)
+        try:
+            score_per_video = cal_score(video_path)
+        except AssertionError:
+            continue
         video_results.append({'video_path': video_path, 'video_results': score_per_video})
         sim.append(score_per_video)
     avg_score = np.mean(sim)

From e26257df488a71155d5d3718ff0c1524d8c3af1a Mon Sep 17 00:00:00 2001
From: zhangfan-p
Date: Sun, 17 Dec 2023 19:30:03 +0800
Subject: [PATCH 058/248] [fix] add num_frames parameter

---
 vbench/utils.py | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/vbench/utils.py b/vbench/utils.py
index 6f023382..e4df979b 100755
--- a/vbench/utils.py
+++ b/vbench/utils.py
@@ -146,7 +146,13 @@ def load_video(video_path, data_transform=None, num_frames=None, return_tensor=T
     else:
         video_reader = VideoReader(video_path, num_threads=1)
         frames = video_reader.get_batch(range(len(video_reader))) # (T, H, W, C), torch.uint8
+        buffer = frames.asnumpy().astype(np.uint8)
+        if num_frames:
+            frame_indices = get_frame_indices(
+                num_frames, len(buffer), sample="middle"
+            )
+            buffer = buffer[frame_indices]
     if data_transform:
         buffer = data_transform(buffer)
     return buffer
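An editorial aside on the `sample="middle"` strategy used above: it picks one frame from the middle of each of `num_frames` equal-length bins across the video. A minimal sketch of that idea (a hypothetical stand-in; the actual `get_frame_indices` helper in `vbench/utils.py` may handle edge cases differently):
```
import numpy as np

def middle_frame_indices(num_frames, vlen):
    # split [0, vlen) into num_frames equal bins and take each bin's midpoint
    bins = np.linspace(0, vlen, num_frames + 1)
    return [int((bins[i] + bins[i + 1]) / 2) for i in range(num_frames)]

print(middle_frame_indices(4, 32))  # [4, 12, 20, 28]
```

From fe655df3e13a2dcf3283a476fde28714e562ea4b Mon Sep 17 00:00:00 2001
From: zhangfan-p
Date: Sun, 17 Dec 2023 19:31:51 +0800
Subject: [PATCH 059/248] [fix] human action set num_frames=16

---
 vbench/human_action.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/vbench/human_action.py b/vbench/human_action.py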
index 3b4d7867..55ae8c9b 100755
--- a/vbench/human_action.py
+++ b/vbench/human_action.py
@@ -65,7 +65,7 @@ def human_action(umt_path, video_list, device):
     for video_path in tqdm(video_list):
         video_label_ls = video_path.split('/')[-1].lower().split('-')[0].split("person is ")[-1].split('_')[0]
         cnt += 1
-        images = load_video(video_path, data_transform)
+        images = load_video(video_path, data_transform, num_frames=16)
         images = images.unsqueeze(0)
         images = images.to(device)
         with torch.no_grad():

From fd3613aa1444123ba74e2c9643cacc8fe3cc2078 Mon Sep 17 00:00:00 2001
From: zhangfan-p
Date: Mon, 25 Dec 2023 20:42:03 +0800
Subject: [PATCH 060/248] [add] static-filter

---
 static_filter.py | 134 +++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 134 insertions(+)
 create mode 100644 static_filter.py

diff --git a/static_filter.py b/static_filter.py
new file mode 100644
index 00000000..3d47c7b2
--- /dev/null
+++ b/static_filter.py
@@ -0,0 +1,134 @@
+import argparse
+import os
+import cv2
+import glob
+import numpy as np
+import torch
+from tqdm import tqdm
+import json
+
+from vbench.third_party.RAFT.core.raft import RAFT
+from vbench.third_party.RAFT.core.utils_core.utils import InputPadder
+
+
+DEVICE = 'cuda'
+
+
+class StaticFilter:
+    def __init__(self, args, device):
+        self.args = args
+        self.device = device
+        self.load_model()
+
+
+    def load_model(self):
+        self.model = torch.nn.DataParallel(RAFT(self.args))
+        self.model.load_state_dict(torch.load(self.args.model))
+
+        self.model = self.model.module
+        self.model.to(self.device)
+        self.model.eval()
+
+
+    def get_score(self, img, flo):
+        img = img[0].permute(1,2,0).cpu().numpy()
+        flo = flo[0].permute(1,2,0).cpu().numpy()
+
+        u = flo[:,:,0]
+        v = flo[:,:,1]
+        rad = np.sqrt(np.square(u) + np.square(v))
+
+        h, w = rad.shape
+        rad_flat = rad.flatten()
+        cut_index = int(h*w*0.02)
+
+        max_rad = np.mean(abs(np.sort(-rad_flat))[:cut_index])
+
+        return max_rad
+
+
+    def check_static(self, score_list):
+        thres = self.params["thres"]
+        count_num = self.params["count_num"]
+        count = 0
+        for score in score_list[:-2]:
+            if score > thres:
+                count += 1
+            if count > count_num:
+                return False
+        for score in score_list[-2:]:
+            if score > thres*count_num*2:
+                return False
+        return True
+
+
+    def set_params(self, frame, count):
+        scale = min(list(frame.shape)[-2:])
+        self.params = {"thres":3.0*(scale/256.0), "count_num":round(2*(count/16.0))}
+
+
+    def infer(self, path):
+        with torch.no_grad():
+            frames = self.get_frames(path)
+            self.set_params(frame=frames[0], count=len(frames))
+            static_score = []
+            for image1, image2 in zip(frames[:-1]+[frames[0],frames[-1]], frames[1:]+[frames[-1],frames[0]]):
+                padder = InputPadder(image1.shape)
+                image1, image2 = padder.pad(image1, image2)
+                _, flow_up = self.model(image1, image2, iters=20, test_mode=True)
+                max_rad = self.get_score(image1, flow_up)
+                static_score.append(max_rad)
+            whether_static = self.check_static(static_score)
+            return whether_static
+
+
+    def get_frames(self, video_path):
+        frame_list = []
+        video = cv2.VideoCapture(video_path)
+        while video.isOpened():
+            success, frame = video.read()
+            if success:
+                frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # convert to rgb
+                frame = torch.from_numpy(frame.astype(np.uint8)).permute(2, 0, 1).float()
+                frame = frame[None].to(DEVICE)
+                frame_list.append(frame)
+            else:
+                break
+        video.release()
+        assert frame_list != []
+        return frame_list
+
+
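# Editorial aside (not part of the patch): a minimal usage sketch of StaticFilter,
# assuming an `args` namespace carrying the CLI flags defined under
# `if __name__ == '__main__'` below; the video path is illustrative.
#
#     args = parser.parse_args(['--videos_path', './sampled_videos'])
#     is_static = StaticFilter(args, device=DEVICE).infer('./sampled_videos/a cat-0.mp4')
#     print(is_static)  # True when the top optical-flow magnitudes stay below the threshold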
+def filter_static(args):
+    static_filter = StaticFilter(args, device=DEVICE)
+    prompt_dict = {}
+    with open(args.prompt_file, "r") as f:
+        lines = [line.strip() for line in f.readlines()]
+        for line in lines:
+            prompt_dict[line] = {"static_count":0, "static_path":[]}
+
+    paths = sorted(glob.glob(os.path.join(args.videos_path, "*.mp4")))
+    for path in tqdm(paths):
+        name = '-'.join(path.split('/')[-1].split('-')[:-1])
+        if name in lines:
+            if prompt_dict[name]["static_count"] < 5:
+                if static_filter.infer(path):
+                    prompt_dict[name]["static_count"] += 1
+                    prompt_dict[name]["static_path"].append(path)
+    os.makedirs(args.result_path, exist_ok=True)
+    json.dump(prompt_dict, open(os.path.join(args.result_path, args.store_name), "w"))
+
+
+if __name__ == '__main__':
+    parser = argparse.ArgumentParser()
+    parser.add_argument('--model', type=str, default="./pretrained/raft_model/models/raft-things.pth", help="restore checkpoint")
+    parser.add_argument('--videos_path', default="", required=True, help="video path for filtering")
+    parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path')
+    parser.add_argument('--store_name', type=str, default="filtered_static_video.json", help='result file name')
+    parser.add_argument('--prompt_file', type=str, default="./prompts/prompts_per_dimension/temporal_flickering.txt", help='static_prompt')
+    parser.add_argument('--small', action='store_true', help='use small model')
+    parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision')
+    parser.add_argument('--alternate_corr', action='store_true', help='use efficient correlation implementation')
+    args = parser.parse_args()
+
+    filter_static(args)

From 84051f7e5b2c49389e7f7e48421f337301795fa6 Mon Sep 17 00:00:00 2001
From: Zhang Fan
Date: Mon, 25 Dec 2023 06:50:44 -0600
Subject: [PATCH 061/248] Update README.md

---
 README.md | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 8e909ad5..1f655088 100755
--- a/README.md
+++ b/README.md
@@ -65,7 +65,10 @@
 bash evaluate.sh
 vbench_videos/{model}/{dimension}/{prompt}-{index}.mp4/gif
 ```
-
+To filter static videos in the temporal flickering dimension, run this:
+```
+python static_filter.py --videos_path $VIDEOS_PATH
+```
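An editorial aside: a fuller invocation that spells out the defaults defined in `static_filter.py` above (the videos path is illustrative; every flag shown exists in the script's argparse):
```
python static_filter.py \
    --model ./pretrained/raft_model/models/raft-things.pth \
    --videos_path ./sampled_videos \
    --result_path ./filter_results \
    --store_name filtered_static_video.json \
    --prompt_file ./prompts/prompts_per_dimension/temporal_flickering.txt
```

 ## :black_nib: Citation

From d50d102e5435e600dc0580772b3504df719295f3 Mon Sep 17 00:00:00 2001
From: NattapolChan <66078943+NattapolChan@users.noreply.github.com>
Date: Wed, 27 Dec 2023 20:43:06 +0700
Subject: [PATCH 062/248] [refactor]: for building package

---
 MANIFEST.in | 2 +
 bin/evaluate | 70 +
 .../grit_model.py => grit_model_deprecated.py | 0
 .../configs/Base.yaml | 0
 .../configs/GRiT_B_DenseCap.yaml | 0
 .../configs/GRiT_B_DenseCap_ObjectDet.yaml | 0
 .../configs/GRiT_B_ObjectDet.yaml | 0
 .../configs/GRiT_H_ObjectDet.yaml | 0
 .../configs/GRiT_L_ObjectDet.yaml | 0
 .../grit/__init__.py | 0
 .../grit/config.py | 0
 .../grit/custom_solver.py | 0
 .../grit/data/custom_build_augmentation.py | 0
 .../grit/data/custom_dataset_dataloader.py | 0
 .../grit/data/custom_dataset_mapper.py | 0
 .../grit/data/datasets/grit_coco.py | 0
 .../grit/data/datasets/object365.py | 0
 .../grit/data/datasets/vg.py | 0
 .../transforms/custom_augmentation_impl.py | 0
 .../grit/data/transforms/custom_transform.py | 0
 .../grit/evaluation/eval.py | 0
 .../grit/modeling/backbone/utils.py | 0
 .../grit/modeling/backbone/vit.py | 0
 .../grit/modeling/meta_arch/grit.py | 0
 .../grit/modeling/roi_heads/grit_fast_rcnn.py | 0
 .../grit/modeling/roi_heads/grit_roi_heads.py | 0
 .../grit/modeling/soft_nms.py | 0
 .../grit/modeling/text/file_utils.py | 0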
.../grit/modeling/text/load_text_token.py | 0 .../grit/modeling/text/modeling_bert.py | 0 .../grit/modeling/text/text_decoder.py | 0 .../grit/predictor.py | 0 .../image_dense_captions.py | 2 +- .../CenterNet2/.circleci/config.yml | 0 .../third_party/CenterNet2/.clang-format | 0 .../third_party/CenterNet2/.flake8 | 0 .../CenterNet2/.github/CODE_OF_CONDUCT.md | 0 .../CenterNet2/.github/CONTRIBUTING.md | 0 .../.github/Detectron2-Logo-Horz.svg | 0 .../CenterNet2/.github/ISSUE_TEMPLATE.md | 0 .../CenterNet2/.github/ISSUE_TEMPLATE/bugs.md | 0 .../.github/ISSUE_TEMPLATE/config.yml | 0 .../.github/ISSUE_TEMPLATE/documentation.md | 0 .../.github/ISSUE_TEMPLATE/feature-request.md | 0 .../unexpected-problems-bugs.md | 0 .../.github/pull_request_template.md | 0 .../.github/workflows/check-template.yml | 0 .../.github/workflows/levenshtein.js | 0 .../.github/workflows/needs-reply.yml | 0 .../.github/workflows/remove-needs-reply.yml | 0 .../CenterNet2/.github/workflows/workflow.yml | 0 .../third_party/CenterNet2/.gitignore | 0 .../third_party/CenterNet2/GETTING_STARTED.md | 0 .../third_party/CenterNet2/INSTALL.md | 0 .../third_party/CenterNet2/LICENSE | 0 .../third_party/CenterNet2/MODEL_ZOO.md | 0 .../third_party/CenterNet2/README.md | 0 .../third_party/CenterNet2/README_D2.md | 0 .../CenterNet2/configs/Base-RCNN-C4.yaml | 0 .../configs/Base-RCNN-DilatedC5.yaml | 0 .../CenterNet2/configs/Base-RCNN-FPN.yaml | 0 .../CenterNet2/configs/Base-RetinaNet.yaml | 0 .../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml | 0 .../faster_rcnn_R_101_C4_3x.yaml | 0 .../faster_rcnn_R_101_DC5_3x.yaml | 0 .../faster_rcnn_R_101_FPN_3x.yaml | 0 .../faster_rcnn_R_50_C4_1x.yaml | 0 .../faster_rcnn_R_50_C4_3x.yaml | 0 .../faster_rcnn_R_50_DC5_1x.yaml | 0 .../faster_rcnn_R_50_DC5_3x.yaml | 0 .../faster_rcnn_R_50_FPN_1x.yaml | 0 .../faster_rcnn_R_50_FPN_3x.yaml | 0 .../faster_rcnn_X_101_32x8d_FPN_3x.yaml | 0 .../COCO-Detection/fcos_R_50_FPN_1x.py | 0 .../retinanet_R_101_FPN_3x.yaml | 0 .../COCO-Detection/retinanet_R_50_FPN_1x.py | 0 .../COCO-Detection/retinanet_R_50_FPN_1x.yaml | 0 .../COCO-Detection/retinanet_R_50_FPN_3x.yaml | 0 .../COCO-Detection/rpn_R_50_C4_1x.yaml | 0 .../COCO-Detection/rpn_R_50_FPN_1x.yaml | 0 .../mask_rcnn_R_101_C4_3x.yaml | 0 .../mask_rcnn_R_101_DC5_3x.yaml | 0 .../mask_rcnn_R_101_FPN_3x.yaml | 0 .../mask_rcnn_R_50_C4_1x.py | 0 .../mask_rcnn_R_50_C4_1x.yaml | 0 .../mask_rcnn_R_50_C4_3x.yaml | 0 .../mask_rcnn_R_50_DC5_1x.yaml | 0 .../mask_rcnn_R_50_DC5_3x.yaml | 0 .../mask_rcnn_R_50_FPN_1x.py | 0 .../mask_rcnn_R_50_FPN_1x.yaml | 0 .../mask_rcnn_R_50_FPN_1x_giou.yaml | 0 .../mask_rcnn_R_50_FPN_3x.yaml | 0 .../mask_rcnn_X_101_32x8d_FPN_3x.yaml | 0 .../mask_rcnn_regnetx_4gf_dds_fpn_1x.py | 0 .../mask_rcnn_regnety_4gf_dds_fpn_1x.py | 0 .../Base-Keypoint-RCNN-FPN.yaml | 0 .../keypoint_rcnn_R_101_FPN_3x.yaml | 0 .../keypoint_rcnn_R_50_FPN_1x.py | 0 .../keypoint_rcnn_R_50_FPN_1x.yaml | 0 .../keypoint_rcnn_R_50_FPN_3x.yaml | 0 .../keypoint_rcnn_X_101_32x8d_FPN_3x.yaml | 0 .../Base-Panoptic-FPN.yaml | 0 .../panoptic_fpn_R_101_3x.yaml | 0 .../panoptic_fpn_R_50_1x.py | 0 .../panoptic_fpn_R_50_1x.yaml | 0 .../panoptic_fpn_R_50_3x.yaml | 0 .../Cityscapes/mask_rcnn_R_50_FPN.yaml | 0 .../configs/Detectron1-Comparisons/README.md | 0 .../faster_rcnn_R_50_FPN_noaug_1x.yaml | 0 .../keypoint_rcnn_R_50_FPN_1x.yaml | 0 .../mask_rcnn_R_50_FPN_noaug_1x.yaml | 0 .../mask_rcnn_R_101_FPN_1x.yaml | 0 .../mask_rcnn_R_50_FPN_1x.yaml | 0 .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 0 .../mask_rcnn_R_101_FPN_1x.yaml | 0 
.../mask_rcnn_R_50_FPN_1x.yaml | 0 .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 0 .../Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml | 0 .../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml | 0 ...sk_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml | 0 .../mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml | 0 .../mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml | 0 .../mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml | 0 .../Misc/mask_rcnn_R_50_FPN_3x_gn.yaml | 0 .../Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml | 0 .../Misc/mmdet_mask_rcnn_R_50_FPN_1x.py | 0 ...anoptic_fpn_R_101_dconv_cascade_gn_3x.yaml | 0 .../scratch_mask_rcnn_R_50_FPN_3x_gn.yaml | 0 .../scratch_mask_rcnn_R_50_FPN_9x_gn.yaml | 0 .../scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml | 0 .../configs/Misc/semantic_R_50_FPN_1x.yaml | 0 .../configs/Misc/torchvision_imagenet_R_50.py | 0 .../faster_rcnn_R_50_C4.yaml | 0 .../faster_rcnn_R_50_FPN.yaml | 0 .../CenterNet2/configs/common/README.md | 0 .../configs/common/coco_schedule.py | 0 .../CenterNet2/configs/common/data/coco.py | 0 .../configs/common/data/coco_keypoint.py | 0 .../common/data/coco_panoptic_separated.py | 0 .../configs/common/models/cascade_rcnn.py | 0 .../CenterNet2/configs/common/models/fcos.py | 0 .../common/models/keypoint_rcnn_fpn.py | 0 .../configs/common/models/mask_rcnn_c4.py | 0 .../configs/common/models/mask_rcnn_fpn.py | 0 .../configs/common/models/panoptic_fpn.py | 0 .../configs/common/models/retinanet.py | 0 .../CenterNet2/configs/common/optim.py | 0 .../CenterNet2/configs/common/train.py | 0 .../mask_rcnn_R_101_FPN_100ep_LSJ.py | 0 .../mask_rcnn_R_101_FPN_200ep_LSJ.py | 0 .../mask_rcnn_R_101_FPN_400ep_LSJ.py | 0 .../mask_rcnn_R_50_FPN_100ep_LSJ.py | 0 .../mask_rcnn_R_50_FPN_200ep_LSJ.py | 0 .../mask_rcnn_R_50_FPN_400ep_LSJ.py | 0 .../mask_rcnn_R_50_FPN_50ep_LSJ.py | 0 ...mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py | 0 ...mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py | 0 ...mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py | 0 ...mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py | 0 ...mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py | 0 ...mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py | 0 .../configs/quick_schedules/README.md | 0 ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 0 ...scade_mask_rcnn_R_50_FPN_instant_test.yaml | 0 ...fast_rcnn_R_50_FPN_inference_acc_test.yaml | 0 .../fast_rcnn_R_50_FPN_instant_test.yaml | 0 ...oint_rcnn_R_50_FPN_inference_acc_test.yaml | 0 .../keypoint_rcnn_R_50_FPN_instant_test.yaml | 0 ...R_50_FPN_normalized_training_acc_test.yaml | 0 ...point_rcnn_R_50_FPN_training_acc_test.yaml | 0 .../mask_rcnn_R_50_C4_GCV_instant_test.yaml | 0 .../mask_rcnn_R_50_C4_inference_acc_test.yaml | 0 .../mask_rcnn_R_50_C4_instant_test.yaml | 0 .../mask_rcnn_R_50_C4_training_acc_test.yaml | 0 ...mask_rcnn_R_50_DC5_inference_acc_test.yaml | 0 ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 0 .../mask_rcnn_R_50_FPN_instant_test.yaml | 0 ...R_50_FPN_pred_boxes_training_acc_test.yaml | 0 .../mask_rcnn_R_50_FPN_training_acc_test.yaml | 0 .../panoptic_fpn_R_50_inference_acc_test.yaml | 0 .../panoptic_fpn_R_50_instant_test.yaml | 0 .../panoptic_fpn_R_50_training_acc_test.yaml | 0 ...retinanet_R_50_FPN_inference_acc_test.yaml | 0 .../retinanet_R_50_FPN_instant_test.yaml | 0 .../rpn_R_50_FPN_inference_acc_test.yaml | 0 .../rpn_R_50_FPN_instant_test.yaml | 0 .../semantic_R_50_FPN_inference_acc_test.yaml | 0 .../semantic_R_50_FPN_instant_test.yaml | 0 .../semantic_R_50_FPN_training_acc_test.yaml | 0 .../third_party/CenterNet2/datasets/README.md | 0 .../datasets/lvis/lvis_v1_train_cat_info.json | 0 .../datasets/prepare_ade20k_sem_seg.py | 0 
.../datasets/prepare_cocofied_lvis.py | 0 .../CenterNet2/datasets/prepare_for_tests.sh | 0 .../datasets/prepare_panoptic_fpn.py | 0 .../third_party/CenterNet2/demo/README.md | 0 .../third_party/CenterNet2/demo/demo.py | 0 .../third_party/CenterNet2/demo/predictor.py | 0 .../CenterNet2/detectron2/__init__.py | 0 .../detectron2/checkpoint/__init__.py | 0 .../detectron2/checkpoint/c2_model_loading.py | 0 .../detectron2/checkpoint/catalog.py | 0 .../checkpoint/detection_checkpoint.py | 0 .../CenterNet2/detectron2/config/__init__.py | 0 .../CenterNet2/detectron2/config/compat.py | 0 .../CenterNet2/detectron2/config/config.py | 0 .../CenterNet2/detectron2/config/defaults.py | 0 .../detectron2/config/instantiate.py | 0 .../CenterNet2/detectron2/config/lazy.py | 0 .../CenterNet2/detectron2/data/__init__.py | 0 .../CenterNet2/detectron2/data/benchmark.py | 0 .../CenterNet2/detectron2/data/build.py | 0 .../CenterNet2/detectron2/data/catalog.py | 0 .../CenterNet2/detectron2/data/common.py | 0 .../detectron2/data/dataset_mapper.py | 0 .../detectron2/data/datasets/README.md | 0 .../detectron2/data/datasets/__init__.py | 0 .../detectron2/data/datasets/builtin.py | 0 .../detectron2/data/datasets/builtin_meta.py | 0 .../detectron2/data/datasets/cityscapes.py | 0 .../data/datasets/cityscapes_panoptic.py | 0 .../detectron2/data/datasets/coco.py | 0 .../detectron2/data/datasets/coco_panoptic.py | 0 .../detectron2/data/datasets/lvis.py | 0 .../data/datasets/lvis_v0_5_categories.py | 0 .../data/datasets/lvis_v1_categories.py | 0 .../detectron2/data/datasets/pascal_voc.py | 0 .../detectron2/data/datasets/register_coco.py | 0 .../detectron2/data/detection_utils.py | 0 .../detectron2/data/samplers/__init__.py | 0 .../data/samplers/distributed_sampler.py | 0 .../data/samplers/grouped_batch_sampler.py | 0 .../detectron2/data/transforms/__init__.py | 0 .../data/transforms/augmentation.py | 0 .../data/transforms/augmentation_impl.py | 0 .../detectron2/data/transforms/transform.py | 0 .../CenterNet2/detectron2/engine/__init__.py | 0 .../CenterNet2/detectron2/engine/defaults.py | 0 .../CenterNet2/detectron2/engine/hooks.py | 0 .../CenterNet2/detectron2/engine/launch.py | 0 .../detectron2/engine/train_loop.py | 0 .../detectron2/evaluation/__init__.py | 0 .../evaluation/cityscapes_evaluation.py | 0 .../detectron2/evaluation/coco_evaluation.py | 0 .../detectron2/evaluation/evaluator.py | 0 .../detectron2/evaluation/fast_eval_api.py | 0 .../detectron2/evaluation/lvis_evaluation.py | 0 .../evaluation/panoptic_evaluation.py | 0 .../evaluation/pascal_voc_evaluation.py | 0 .../evaluation/rotated_coco_evaluation.py | 0 .../evaluation/sem_seg_evaluation.py | 0 .../detectron2/evaluation/testing.py | 0 .../CenterNet2/detectron2/export/README.md | 0 .../CenterNet2/detectron2/export/__init__.py | 0 .../CenterNet2/detectron2/export/api.py | 0 .../CenterNet2/detectron2/export/c10.py | 0 .../detectron2/export/caffe2_export.py | 0 .../detectron2/export/caffe2_inference.py | 0 .../detectron2/export/caffe2_modeling.py | 0 .../detectron2/export/caffe2_patch.py | 0 .../CenterNet2/detectron2/export/flatten.py | 0 .../CenterNet2/detectron2/export/shared.py | 0 .../detectron2/export/torchscript.py | 0 .../detectron2/export/torchscript_patch.py | 0 .../CenterNet2/detectron2/layers/__init__.py | 0 .../CenterNet2/detectron2/layers/aspp.py | 0 .../detectron2/layers/batch_norm.py | 0 .../CenterNet2/detectron2/layers/blocks.py | 0 .../detectron2/layers/csrc/README.md | 0 .../csrc/ROIAlignRotated/ROIAlignRotated.h | 0 
.../ROIAlignRotated/ROIAlignRotated_cpu.cpp | 0 .../ROIAlignRotated/ROIAlignRotated_cuda.cu | 0 .../csrc/box_iou_rotated/box_iou_rotated.h | 0 .../box_iou_rotated/box_iou_rotated_cpu.cpp | 0 .../box_iou_rotated/box_iou_rotated_cuda.cu | 0 .../box_iou_rotated/box_iou_rotated_utils.h | 0 .../layers/csrc/cocoeval/cocoeval.cpp | 0 .../layers/csrc/cocoeval/cocoeval.h | 0 .../detectron2/layers/csrc/cuda_version.cu | 0 .../layers/csrc/deformable/deform_conv.h | 0 .../csrc/deformable/deform_conv_cuda.cu | 0 .../deformable/deform_conv_cuda_kernel.cu | 0 .../layers/csrc/nms_rotated/nms_rotated.h | 0 .../csrc/nms_rotated/nms_rotated_cpu.cpp | 0 .../csrc/nms_rotated/nms_rotated_cuda.cu | 0 .../detectron2/layers/csrc/vision.cpp | 0 .../detectron2/layers/deform_conv.py | 0 .../CenterNet2/detectron2/layers/losses.py | 0 .../CenterNet2/detectron2/layers/mask_ops.py | 0 .../CenterNet2/detectron2/layers/nms.py | 0 .../CenterNet2/detectron2/layers/roi_align.py | 0 .../detectron2/layers/roi_align_rotated.py | 0 .../detectron2/layers/rotated_boxes.py | 0 .../detectron2/layers/shape_spec.py | 0 .../CenterNet2/detectron2/layers/wrappers.py | 0 .../detectron2/model_zoo/__init__.py | 0 .../detectron2/model_zoo/model_zoo.py | 0 .../detectron2/modeling/__init__.py | 0 .../detectron2/modeling/anchor_generator.py | 0 .../detectron2/modeling/backbone/__init__.py | 0 .../detectron2/modeling/backbone/backbone.py | 0 .../detectron2/modeling/backbone/build.py | 0 .../detectron2/modeling/backbone/fpn.py | 0 .../detectron2/modeling/backbone/regnet.py | 0 .../detectron2/modeling/backbone/resnet.py | 0 .../detectron2/modeling/box_regression.py | 0 .../CenterNet2/detectron2/modeling/matcher.py | 0 .../detectron2/modeling/meta_arch/__init__.py | 0 .../detectron2/modeling/meta_arch/build.py | 0 .../modeling/meta_arch/dense_detector.py | 0 .../detectron2/modeling/meta_arch/fcos.py | 0 .../modeling/meta_arch/panoptic_fpn.py | 0 .../detectron2/modeling/meta_arch/rcnn.py | 0 .../modeling/meta_arch/retinanet.py | 0 .../modeling/meta_arch/semantic_seg.py | 0 .../detectron2/modeling/mmdet_wrapper.py | 0 .../CenterNet2/detectron2/modeling/poolers.py | 0 .../detectron2/modeling/postprocessing.py | 0 .../modeling/proposal_generator/__init__.py | 0 .../modeling/proposal_generator/build.py | 0 .../proposal_generator/proposal_utils.py | 0 .../modeling/proposal_generator/rpn.py | 0 .../modeling/proposal_generator/rrpn.py | 0 .../detectron2/modeling/roi_heads/__init__.py | 0 .../detectron2/modeling/roi_heads/box_head.py | 0 .../modeling/roi_heads/cascade_rcnn.py | 0 .../modeling/roi_heads/fast_rcnn.py | 0 .../modeling/roi_heads/keypoint_head.py | 0 .../modeling/roi_heads/mask_head.py | 0 .../modeling/roi_heads/roi_heads.py | 0 .../modeling/roi_heads/rotated_fast_rcnn.py | 0 .../detectron2/modeling/sampling.py | 0 .../modeling/test_time_augmentation.py | 0 .../CenterNet2/detectron2/projects/README.md | 0 .../detectron2/projects/__init__.py | 0 .../CenterNet2/detectron2/solver/__init__.py | 0 .../CenterNet2/detectron2/solver/build.py | 0 .../detectron2/solver/lr_scheduler.py | 0 .../detectron2/structures/__init__.py | 0 .../CenterNet2/detectron2/structures/boxes.py | 0 .../detectron2/structures/image_list.py | 0 .../detectron2/structures/instances.py | 0 .../detectron2/structures/keypoints.py | 0 .../CenterNet2/detectron2/structures/masks.py | 0 .../detectron2/structures/rotated_boxes.py | 0 .../CenterNet2/detectron2/utils/README.md | 0 .../CenterNet2/detectron2/utils/__init__.py | 0 .../CenterNet2/detectron2/utils/analysis.py | 0 
.../detectron2/utils/collect_env.py | 0 .../CenterNet2/detectron2/utils/colormap.py | 0 .../CenterNet2/detectron2/utils/comm.py | 0 .../CenterNet2/detectron2/utils/env.py | 0 .../CenterNet2/detectron2/utils/events.py | 0 .../CenterNet2/detectron2/utils/file_io.py | 0 .../CenterNet2/detectron2/utils/logger.py | 0 .../CenterNet2/detectron2/utils/memory.py | 0 .../CenterNet2/detectron2/utils/registry.py | 0 .../CenterNet2/detectron2/utils/serialize.py | 0 .../CenterNet2/detectron2/utils/testing.py | 0 .../detectron2/utils/video_visualizer.py | 0 .../CenterNet2/detectron2/utils/visualizer.py | 0 .../third_party/CenterNet2/dev/README.md | 0 .../third_party/CenterNet2/dev/linter.sh | 0 .../CenterNet2/dev/packaging/README.md | 0 .../dev/packaging/build_all_wheels.sh | 0 .../CenterNet2/dev/packaging/build_wheel.sh | 0 .../dev/packaging/gen_install_table.py | 0 .../dev/packaging/gen_wheel_index.sh | 0 .../CenterNet2/dev/packaging/pkg_helpers.bash | 0 .../CenterNet2/dev/parse_results.sh | 0 .../CenterNet2/dev/run_inference_tests.sh | 0 .../CenterNet2/dev/run_instant_tests.sh | 0 .../third_party/CenterNet2/docker/Dockerfile | 0 .../third_party/CenterNet2/docker/README.md | 0 .../CenterNet2/docker/deploy.Dockerfile | 0 .../CenterNet2/docker/docker-compose.yml | 0 .../third_party/CenterNet2/docs/.gitignore | 0 .../third_party/CenterNet2/docs/Makefile | 0 .../third_party/CenterNet2/docs/README.md | 0 .../CenterNet2/docs/_static/css/custom.css | 0 .../third_party/CenterNet2/docs/conf.py | 0 .../third_party/CenterNet2/docs/index.rst | 0 .../CenterNet2/docs/modules/checkpoint.rst | 0 .../CenterNet2/docs/modules/config.rst | 0 .../CenterNet2/docs/modules/data.rst | 0 .../docs/modules/data_transforms.rst | 0 .../CenterNet2/docs/modules/engine.rst | 0 .../CenterNet2/docs/modules/evaluation.rst | 0 .../CenterNet2/docs/modules/export.rst | 0 .../CenterNet2/docs/modules/fvcore.rst | 0 .../CenterNet2/docs/modules/index.rst | 0 .../CenterNet2/docs/modules/layers.rst | 0 .../CenterNet2/docs/modules/model_zoo.rst | 0 .../CenterNet2/docs/modules/modeling.rst | 0 .../CenterNet2/docs/modules/solver.rst | 0 .../CenterNet2/docs/modules/structures.rst | 0 .../CenterNet2/docs/modules/utils.rst | 0 .../CenterNet2/docs/notes/benchmarks.md | 0 .../CenterNet2/docs/notes/changelog.md | 0 .../CenterNet2/docs/notes/compatibility.md | 0 .../CenterNet2/docs/notes/contributing.md | 0 .../CenterNet2/docs/notes/index.rst | 0 .../CenterNet2/docs/requirements.txt | 0 .../CenterNet2/docs/tutorials/README.md | 0 .../CenterNet2/docs/tutorials/augmentation.md | 0 .../docs/tutorials/builtin_datasets.md | 0 .../CenterNet2/docs/tutorials/configs.md | 0 .../CenterNet2/docs/tutorials/data_loading.md | 0 .../CenterNet2/docs/tutorials/datasets.md | 0 .../CenterNet2/docs/tutorials/deployment.md | 0 .../CenterNet2/docs/tutorials/evaluation.md | 0 .../CenterNet2/docs/tutorials/extend.md | 0 .../docs/tutorials/getting_started.md | 0 .../CenterNet2/docs/tutorials/index.rst | 0 .../CenterNet2/docs/tutorials/install.md | 0 .../CenterNet2/docs/tutorials/lazyconfigs.md | 0 .../CenterNet2/docs/tutorials/models.md | 0 .../CenterNet2/docs/tutorials/training.md | 0 .../CenterNet2/docs/tutorials/write-models.md | 0 .../CenterNet2/projects/CenterNet2/.gitignore | 0 .../projects/CenterNet2/centernet/__init__.py | 0 .../projects/CenterNet2/centernet/config.py | 0 .../data/custom_build_augmentation.py | 0 .../data/custom_dataset_dataloader.py | 0 .../centernet/data/datasets/coco.py | 0 .../centernet/data/datasets/nuimages.py | 0 
.../centernet/data/datasets/objects365.py | 0 .../transforms/custom_augmentation_impl.py | 0 .../data/transforms/custom_transform.py | 0 .../centernet/modeling/backbone/bifpn.py | 0 .../centernet/modeling/backbone/bifpn_fcos.py | 0 .../centernet/modeling/backbone/dla.py | 0 .../centernet/modeling/backbone/dlafpn.py | 0 .../centernet/modeling/backbone/fpn_p5.py | 2 +- .../centernet/modeling/backbone/res2net.py | 0 .../CenterNet2/centernet/modeling/debug.py | 0 .../modeling/dense_heads/centernet.py | 0 .../modeling/dense_heads/centernet_head.py | 0 .../centernet/modeling/dense_heads/utils.py | 0 .../centernet/modeling/layers/deform_conv.py | 0 .../modeling/layers/heatmap_focal_loss.py | 0 .../centernet/modeling/layers/iou_loss.py | 0 .../centernet/modeling/layers/ml_nms.py | 0 .../modeling/meta_arch/centernet_detector.py | 0 .../modeling/roi_heads/custom_fast_rcnn.py | 0 .../modeling/roi_heads/custom_roi_heads.py | 0 .../centernet/modeling/roi_heads/fed_loss.py | 0 .../CenterNet2/centernet2_docs/MODEL_ZOO.md | 0 .../configs/Base-CenterNet-FPN.yaml | 0 .../CenterNet2/configs/Base-CenterNet2.yaml | 0 .../CenterNet2/configs/Base_S4_DLA.yaml | 0 .../configs/CenterNet-FPN_R50_1x.yaml | 0 .../configs/CenterNet-S4_DLA_8x.yaml | 0 .../configs/CenterNet2-F_R50_1x.yaml | 0 .../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml | 0 .../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml | 0 .../CenterNet2_DLA-BiFPN-P5_640_16x.yaml | 0 .../CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml | 0 ...enterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml | 0 .../CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml | 0 ...erNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml | 0 .../configs/CenterNet2_R2-101-DCN_896_4x.yaml | 0 .../CenterNet2/configs/CenterNet2_R50_1x.yaml | 0 .../configs/CenterNet2_X101-DCN_2x.yaml | 0 .../configs/LVIS_CenterNet2_R50_1x.yaml | 0 .../configs/LVIS_CenterNet2_R50_Fed_1x.yaml | 0 .../configs/O365_CenterNet2_R50_1x.yaml | 0 .../nuImages_CenterNet2_DLA_640_8x.yaml | 0 .../CenterNet2/projects/CenterNet2/demo.py | 0 .../projects/CenterNet2/predictor.py | 0 .../projects/CenterNet2/train_net.py | 0 .../third_party/CenterNet2/setup.cfg | 0 .../third_party/CenterNet2/setup.py | 0 .../third_party/CenterNet2/tests/README.md | 0 .../third_party/CenterNet2/tests/__init__.py | 0 .../CenterNet2/tests/config/dir1/dir1_a.py | 0 .../CenterNet2/tests/config/dir1/dir1_b.py | 0 .../CenterNet2/tests/config/root_cfg.py | 0 .../tests/config/test_instantiate_config.py | 0 .../tests/config/test_lazy_config.py | 0 .../tests/config/test_yacs_config.py | 0 .../CenterNet2/tests/data/__init__.py | 0 .../CenterNet2/tests/data/test_coco.py | 0 .../tests/data/test_coco_evaluation.py | 0 .../CenterNet2/tests/data/test_dataset.py | 0 .../tests/data/test_detection_utils.py | 0 .../tests/data/test_rotation_transform.py | 0 .../CenterNet2/tests/data/test_sampler.py | 0 .../CenterNet2/tests/data/test_transforms.py | 0 .../CenterNet2/tests/layers/__init__.py | 0 .../CenterNet2/tests/layers/test_blocks.py | 0 .../tests/layers/test_deformable.py | 0 .../CenterNet2/tests/layers/test_losses.py | 0 .../CenterNet2/tests/layers/test_mask_ops.py | 0 .../CenterNet2/tests/layers/test_nms.py | 0 .../tests/layers/test_nms_rotated.py | 0 .../CenterNet2/tests/layers/test_roi_align.py | 0 .../tests/layers/test_roi_align_rotated.py | 0 .../CenterNet2/tests/modeling/__init__.py | 0 .../tests/modeling/test_anchor_generator.py | 0 .../tests/modeling/test_backbone.py | 0 .../tests/modeling/test_box2box_transform.py | 0 .../tests/modeling/test_fast_rcnn.py | 0 
.../CenterNet2/tests/modeling/test_matcher.py | 0 .../CenterNet2/tests/modeling/test_mmdet.py | 0 .../tests/modeling/test_model_e2e.py | 0 .../tests/modeling/test_roi_heads.py | 0 .../tests/modeling/test_roi_pooler.py | 0 .../CenterNet2/tests/modeling/test_rpn.py | 0 .../CenterNet2/tests/structures/__init__.py | 0 .../CenterNet2/tests/structures/test_boxes.py | 0 .../tests/structures/test_imagelist.py | 0 .../tests/structures/test_instances.py | 0 .../tests/structures/test_keypoints.py | 0 .../CenterNet2/tests/structures/test_masks.py | 0 .../tests/structures/test_rotated_boxes.py | 0 .../CenterNet2/tests/test_checkpoint.py | 0 .../CenterNet2/tests/test_engine.py | 0 .../CenterNet2/tests/test_events.py | 0 .../CenterNet2/tests/test_export_caffe2.py | 0 .../tests/test_export_torchscript.py | 0 .../CenterNet2/tests/test_model_analysis.py | 0 .../CenterNet2/tests/test_model_zoo.py | 0 .../CenterNet2/tests/test_packaging.py | 0 .../CenterNet2/tests/test_registry.py | 0 .../CenterNet2/tests/test_scheduler.py | 0 .../CenterNet2/tests/test_solver.py | 0 .../CenterNet2/tests/test_visualizer.py | 0 .../third_party/CenterNet2/tools/README.md | 0 .../third_party/CenterNet2/tools/__init__.py | 0 .../CenterNet2/tools/analyze_model.py | 0 .../third_party/CenterNet2/tools/benchmark.py | 0 .../tools/convert-torchvision-to-d2.py | 0 .../CenterNet2/tools/deploy/CMakeLists.txt | 0 .../CenterNet2/tools/deploy/README.md | 0 .../CenterNet2/tools/deploy/export_model.py | 0 .../tools/deploy/torchscript_mask_rcnn.cpp | 0 .../CenterNet2/tools/lazyconfig_train_net.py | 0 .../CenterNet2/tools/lightning_train_net.py | 0 .../CenterNet2/tools/plain_train_net.py | 0 .../third_party/CenterNet2/tools/train_net.py | 0 .../CenterNet2/tools/visualize_data.py | 0 .../tools/visualize_json_results.py | 0 requirements.txt | 17 + .../tests/data/__init__.py => setup.cfg | 0 setup.py | 35 + vbench/aesthetic_quality.py | 2 +- vbench/appearance_style.py | 4 +- vbench/background_consistency.py | 2 +- vbench/cli/__init__.py | 2 + vbench/cli/evaluate.py | 67 + vbench/color.py | 6 +- vbench/dynamic_degree.py | 11 +- vbench/human_action.py | 8 +- vbench/imaging_quality.py | 2 +- vbench/motion_smoothness.py | 8 +- vbench/multiple_objects.py | 6 +- vbench/object_class.py | 6 +- vbench/overall_consistency.py | 8 +- vbench/scene.py | 6 +- vbench/spatial_relationship.py | 6 +- vbench/subject_consistency.py | 2 +- vbench/temporal_flickering.py | 2 +- vbench/temporal_style.py | 8 +- .../tests/layers => RAFT}/__init__.py | 0 .../tests/modeling => }/__init__.py | 0 .../tests/structures => amt}/__init__.py | 0 .../CenterNet2/tools => grit_src}/__init__.py | 0 .../centernet_config.py} | 0 vbench/third_party/grit_src/grit/config.py | 2 +- .../grit_src/grit/data/__init__.py | 0 .../grit_src/grit/data/datasets/__init__.py | 0 .../grit_src/grit/modeling/__init__.py | 0 .../grit/modeling/backbone/__init__.py | 0 .../backbone/centernet_last_layer.py} | 2 +- .../grit_src/grit/modeling/backbone/vit.py | 3 +- .../grit/modeling/meta_arch/__init__.py | 0 .../grit_src/grit/modeling/meta_arch/grit.py | 2 +- .../grit/modeling/roi_heads/__init__.py | 0 .../grit/modeling/roi_heads/grit_roi_heads.py | 3 +- .../grit_src/grit/modeling/text/__init__.py | 0 .../grit_src/image_dense_captions.py | 50 +- .../CenterNet2/docs/notes/contributing.md | 1 - .../docs/tutorials/builtin_datasets.md | 1 - .../docs/tutorials/getting_started.md | 1 - .../CenterNet2/docs/tutorials/install.md | 1 - vbench/third_party/tag2Text/__init__.py | 2 + .../CenterNet2/.circleci/config.yml | 256 
---- .../third_party/CenterNet2/.clang-format | 85 -- .../grit_src/third_party/CenterNet2/.flake8 | 15 - .../CenterNet2/.github/CODE_OF_CONDUCT.md | 5 - .../CenterNet2/.github/CONTRIBUTING.md | 68 - .../.github/Detectron2-Logo-Horz.svg | 1 - .../CenterNet2/.github/ISSUE_TEMPLATE.md | 5 - .../CenterNet2/.github/ISSUE_TEMPLATE/bugs.md | 38 - .../.github/ISSUE_TEMPLATE/config.yml | 17 - .../.github/ISSUE_TEMPLATE/documentation.md | 14 - .../.github/ISSUE_TEMPLATE/feature-request.md | 31 - .../unexpected-problems-bugs.md | 44 - .../.github/pull_request_template.md | 10 - .../.github/workflows/check-template.yml | 86 -- .../.github/workflows/levenshtein.js | 44 - .../.github/workflows/needs-reply.yml | 98 -- .../.github/workflows/remove-needs-reply.yml | 25 - .../CenterNet2/.github/workflows/workflow.yml | 81 -- .../third_party/CenterNet2/.gitignore | 57 - .../third_party/CenterNet2/GETTING_STARTED.md | 79 - .../third_party/CenterNet2/INSTALL.md | 261 ---- .../grit_src/third_party/CenterNet2/LICENSE | 202 --- .../third_party/CenterNet2/MODEL_ZOO.md | 1052 -------------- .../grit_src/third_party/CenterNet2/README.md | 85 -- .../third_party/CenterNet2/README_D2.md | 62 - .../CenterNet2/configs/Base-RCNN-C4.yaml | 18 - .../configs/Base-RCNN-DilatedC5.yaml | 31 - .../CenterNet2/configs/Base-RCNN-FPN.yaml | 42 - .../CenterNet2/configs/Base-RetinaNet.yaml | 25 - .../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml | 17 - .../faster_rcnn_R_101_C4_3x.yaml | 9 - .../faster_rcnn_R_101_DC5_3x.yaml | 9 - .../faster_rcnn_R_101_FPN_3x.yaml | 9 - .../faster_rcnn_R_50_C4_1x.yaml | 6 - .../faster_rcnn_R_50_C4_3x.yaml | 9 - .../faster_rcnn_R_50_DC5_1x.yaml | 6 - .../faster_rcnn_R_50_DC5_3x.yaml | 9 - .../faster_rcnn_R_50_FPN_1x.yaml | 6 - .../faster_rcnn_R_50_FPN_3x.yaml | 9 - .../faster_rcnn_X_101_32x8d_FPN_3x.yaml | 13 - .../COCO-Detection/fcos_R_50_FPN_1x.py | 11 - .../retinanet_R_101_FPN_3x.yaml | 8 - .../COCO-Detection/retinanet_R_50_FPN_1x.py | 11 - .../COCO-Detection/retinanet_R_50_FPN_1x.yaml | 5 - .../COCO-Detection/retinanet_R_50_FPN_3x.yaml | 8 - .../COCO-Detection/rpn_R_50_C4_1x.yaml | 10 - .../COCO-Detection/rpn_R_50_FPN_1x.yaml | 9 - .../mask_rcnn_R_101_C4_3x.yaml | 9 - .../mask_rcnn_R_101_DC5_3x.yaml | 9 - .../mask_rcnn_R_101_FPN_3x.yaml | 9 - .../mask_rcnn_R_50_C4_1x.py | 8 - .../mask_rcnn_R_50_C4_1x.yaml | 6 - .../mask_rcnn_R_50_C4_3x.yaml | 9 - .../mask_rcnn_R_50_DC5_1x.yaml | 6 - .../mask_rcnn_R_50_DC5_3x.yaml | 9 - .../mask_rcnn_R_50_FPN_1x.py | 8 - .../mask_rcnn_R_50_FPN_1x.yaml | 6 - .../mask_rcnn_R_50_FPN_1x_giou.yaml | 12 - .../mask_rcnn_R_50_FPN_3x.yaml | 9 - .../mask_rcnn_X_101_32x8d_FPN_3x.yaml | 13 - .../mask_rcnn_regnetx_4gf_dds_fpn_1x.py | 34 - .../mask_rcnn_regnety_4gf_dds_fpn_1x.py | 35 - .../Base-Keypoint-RCNN-FPN.yaml | 15 - .../keypoint_rcnn_R_101_FPN_3x.yaml | 8 - .../keypoint_rcnn_R_50_FPN_1x.py | 8 - .../keypoint_rcnn_R_50_FPN_1x.yaml | 5 - .../keypoint_rcnn_R_50_FPN_3x.yaml | 8 - .../keypoint_rcnn_X_101_32x8d_FPN_3x.yaml | 12 - .../Base-Panoptic-FPN.yaml | 11 - .../panoptic_fpn_R_101_3x.yaml | 8 - .../panoptic_fpn_R_50_1x.py | 8 - .../panoptic_fpn_R_50_1x.yaml | 5 - .../panoptic_fpn_R_50_3x.yaml | 8 - .../Cityscapes/mask_rcnn_R_50_FPN.yaml | 27 - .../configs/Detectron1-Comparisons/README.md | 84 -- .../faster_rcnn_R_50_FPN_noaug_1x.yaml | 17 - .../keypoint_rcnn_R_50_FPN_1x.yaml | 27 - .../mask_rcnn_R_50_FPN_noaug_1x.yaml | 20 - .../mask_rcnn_R_101_FPN_1x.yaml | 19 - .../mask_rcnn_R_50_FPN_1x.yaml | 19 - .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 23 - .../mask_rcnn_R_101_FPN_1x.yaml | 
22 - .../mask_rcnn_R_50_FPN_1x.yaml | 22 - .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 26 - .../Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml | 12 - .../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml | 15 - ...sk_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml | 36 - .../mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml | 10 - .../mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml | 8 - .../mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml | 11 - .../Misc/mask_rcnn_R_50_FPN_3x_gn.yaml | 21 - .../Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml | 24 - .../Misc/mmdet_mask_rcnn_R_50_FPN_1x.py | 151 -- ...anoptic_fpn_R_101_dconv_cascade_gn_3x.yaml | 26 - .../scratch_mask_rcnn_R_50_FPN_3x_gn.yaml | 13 - .../scratch_mask_rcnn_R_50_FPN_9x_gn.yaml | 19 - .../scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml | 19 - .../configs/Misc/semantic_R_50_FPN_1x.yaml | 11 - .../configs/Misc/torchvision_imagenet_R_50.py | 150 -- .../faster_rcnn_R_50_C4.yaml | 18 - .../faster_rcnn_R_50_FPN.yaml | 18 - .../CenterNet2/configs/common/README.md | 6 - .../configs/common/coco_schedule.py | 47 - .../CenterNet2/configs/common/data/coco.py | 48 - .../configs/common/data/coco_keypoint.py | 13 - .../common/data/coco_panoptic_separated.py | 26 - .../configs/common/models/cascade_rcnn.py | 36 - .../CenterNet2/configs/common/models/fcos.py | 23 - .../common/models/keypoint_rcnn_fpn.py | 33 - .../configs/common/models/mask_rcnn_c4.py | 88 -- .../configs/common/models/mask_rcnn_fpn.py | 93 -- .../configs/common/models/panoptic_fpn.py | 20 - .../configs/common/models/retinanet.py | 53 - .../CenterNet2/configs/common/optim.py | 15 - .../CenterNet2/configs/common/train.py | 18 - .../mask_rcnn_R_101_FPN_100ep_LSJ.py | 9 - .../mask_rcnn_R_101_FPN_200ep_LSJ.py | 14 - .../mask_rcnn_R_101_FPN_400ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_100ep_LSJ.py | 72 - .../mask_rcnn_R_50_FPN_200ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_400ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_50ep_LSJ.py | 14 - ...mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py | 29 - ...mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py | 14 - ...mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py | 14 - ...mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py | 30 - ...mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py | 14 - ...mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py | 14 - .../configs/quick_schedules/README.md | 8 - ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - ...scade_mask_rcnn_R_50_FPN_instant_test.yaml | 11 - ...fast_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - .../fast_rcnn_R_50_FPN_instant_test.yaml | 15 - ...oint_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - .../keypoint_rcnn_R_50_FPN_instant_test.yaml | 16 - ...R_50_FPN_normalized_training_acc_test.yaml | 30 - ...point_rcnn_R_50_FPN_training_acc_test.yaml | 28 - .../mask_rcnn_R_50_C4_GCV_instant_test.yaml | 18 - .../mask_rcnn_R_50_C4_inference_acc_test.yaml | 7 - .../mask_rcnn_R_50_C4_instant_test.yaml | 14 - .../mask_rcnn_R_50_C4_training_acc_test.yaml | 22 - ...mask_rcnn_R_50_DC5_inference_acc_test.yaml | 7 - ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 10 - .../mask_rcnn_R_50_FPN_instant_test.yaml | 14 - ...R_50_FPN_pred_boxes_training_acc_test.yaml | 6 - .../mask_rcnn_R_50_FPN_training_acc_test.yaml | 21 - .../panoptic_fpn_R_50_inference_acc_test.yaml | 7 - .../panoptic_fpn_R_50_instant_test.yaml | 19 - .../panoptic_fpn_R_50_training_acc_test.yaml | 20 - ...retinanet_R_50_FPN_inference_acc_test.yaml | 7 - .../retinanet_R_50_FPN_instant_test.yaml | 13 - .../rpn_R_50_FPN_inference_acc_test.yaml | 7 - .../rpn_R_50_FPN_instant_test.yaml | 13 - .../semantic_R_50_FPN_inference_acc_test.yaml | 10 - 
.../semantic_R_50_FPN_instant_test.yaml | 18 - .../semantic_R_50_FPN_training_acc_test.yaml | 20 - .../third_party/CenterNet2/datasets/README.md | 140 -- .../datasets/lvis/lvis_v1_train_cat_info.json | 1 - .../datasets/prepare_ade20k_sem_seg.py | 26 - .../datasets/prepare_cocofied_lvis.py | 176 --- .../CenterNet2/datasets/prepare_for_tests.sh | 31 - .../datasets/prepare_panoptic_fpn.py | 116 -- .../third_party/CenterNet2/demo/README.md | 8 - .../third_party/CenterNet2/demo/demo.py | 188 --- .../third_party/CenterNet2/demo/predictor.py | 220 --- .../CenterNet2/detectron2/__init__.py | 10 - .../detectron2/checkpoint/__init__.py | 10 - .../detectron2/checkpoint/c2_model_loading.py | 407 ------ .../detectron2/checkpoint/catalog.py | 115 -- .../checkpoint/detection_checkpoint.py | 120 -- .../CenterNet2/detectron2/config/__init__.py | 24 - .../CenterNet2/detectron2/config/compat.py | 229 --- .../CenterNet2/detectron2/config/config.py | 265 ---- .../CenterNet2/detectron2/config/defaults.py | 635 -------- .../detectron2/config/instantiate.py | 82 -- .../CenterNet2/detectron2/config/lazy.py | 399 ----- .../CenterNet2/detectron2/data/__init__.py | 19 - .../CenterNet2/detectron2/data/benchmark.py | 225 --- .../CenterNet2/detectron2/data/build.py | 542 ------- .../CenterNet2/detectron2/data/catalog.py | 236 --- .../CenterNet2/detectron2/data/common.py | 241 --- .../detectron2/data/dataset_mapper.py | 191 --- .../detectron2/data/datasets/README.md | 9 - .../detectron2/data/datasets/__init__.py | 9 - .../detectron2/data/datasets/builtin.py | 259 ---- .../detectron2/data/datasets/builtin_meta.py | 350 ----- .../detectron2/data/datasets/cityscapes.py | 329 ----- .../data/datasets/cityscapes_panoptic.py | 187 --- .../detectron2/data/datasets/coco.py | 539 ------- .../detectron2/data/datasets/coco_panoptic.py | 228 --- .../detectron2/data/datasets/lvis.py | 240 --- .../data/datasets/lvis_v0_5_categories.py | 13 - .../data/datasets/lvis_v1_categories.py | 16 - .../detectron2/data/datasets/pascal_voc.py | 82 -- .../detectron2/data/datasets/register_coco.py | 3 - .../detectron2/data/detection_utils.py | 623 -------- .../detectron2/data/samplers/__init__.py | 17 - .../data/samplers/distributed_sampler.py | 278 ---- .../data/samplers/grouped_batch_sampler.py | 47 - .../detectron2/data/transforms/__init__.py | 14 - .../data/transforms/augmentation.py | 377 ----- .../data/transforms/augmentation_impl.py | 614 -------- .../detectron2/data/transforms/transform.py | 351 ----- .../CenterNet2/detectron2/engine/__init__.py | 12 - .../CenterNet2/detectron2/engine/defaults.py | 715 --------- .../CenterNet2/detectron2/engine/hooks.py | 686 --------- .../CenterNet2/detectron2/engine/launch.py | 126 -- .../detectron2/engine/train_loop.py | 417 ------ .../detectron2/evaluation/__init__.py | 12 - .../evaluation/cityscapes_evaluation.py | 194 --- .../detectron2/evaluation/coco_evaluation.py | 710 --------- .../detectron2/evaluation/evaluator.py | 224 --- .../detectron2/evaluation/fast_eval_api.py | 121 -- .../detectron2/evaluation/lvis_evaluation.py | 380 ----- .../evaluation/panoptic_evaluation.py | 199 --- .../evaluation/pascal_voc_evaluation.py | 300 ---- .../evaluation/rotated_coco_evaluation.py | 207 --- .../evaluation/sem_seg_evaluation.py | 184 --- .../detectron2/evaluation/testing.py | 85 -- .../CenterNet2/detectron2/export/README.md | 13 - .../CenterNet2/detectron2/export/__init__.py | 15 - .../CenterNet2/detectron2/export/api.py | 235 --- .../CenterNet2/detectron2/export/c10.py | 534 ------- 
.../detectron2/export/caffe2_export.py | 207 --- .../detectron2/export/caffe2_inference.py | 161 --- .../detectron2/export/caffe2_modeling.py | 419 ------ .../detectron2/export/caffe2_patch.py | 152 -- .../CenterNet2/detectron2/export/flatten.py | 330 ----- .../CenterNet2/detectron2/export/shared.py | 1034 ------------- .../detectron2/export/torchscript.py | 132 -- .../detectron2/export/torchscript_patch.py | 406 ------ .../CenterNet2/detectron2/layers/__init__.py | 24 - .../CenterNet2/detectron2/layers/aspp.py | 144 -- .../detectron2/layers/batch_norm.py | 276 ---- .../CenterNet2/detectron2/layers/blocks.py | 111 -- .../detectron2/layers/csrc/README.md | 7 - .../csrc/ROIAlignRotated/ROIAlignRotated.h | 115 -- .../ROIAlignRotated/ROIAlignRotated_cpu.cpp | 522 ------- .../ROIAlignRotated/ROIAlignRotated_cuda.cu | 443 ------ .../csrc/box_iou_rotated/box_iou_rotated.h | 35 - .../box_iou_rotated/box_iou_rotated_cpu.cpp | 39 - .../box_iou_rotated/box_iou_rotated_cuda.cu | 130 -- .../box_iou_rotated/box_iou_rotated_utils.h | 370 ----- .../layers/csrc/cocoeval/cocoeval.cpp | 507 ------- .../layers/csrc/cocoeval/cocoeval.h | 88 -- .../detectron2/layers/csrc/cuda_version.cu | 26 - .../layers/csrc/deformable/deform_conv.h | 377 ----- .../csrc/deformable/deform_conv_cuda.cu | 1223 ---------------- .../deformable/deform_conv_cuda_kernel.cu | 1288 ----------------- .../layers/csrc/nms_rotated/nms_rotated.h | 39 - .../csrc/nms_rotated/nms_rotated_cpu.cpp | 75 - .../csrc/nms_rotated/nms_rotated_cuda.cu | 145 -- .../detectron2/layers/csrc/vision.cpp | 117 -- .../detectron2/layers/deform_conv.py | 501 ------- .../CenterNet2/detectron2/layers/losses.py | 133 -- .../CenterNet2/detectron2/layers/mask_ops.py | 275 ---- .../CenterNet2/detectron2/layers/nms.py | 139 -- .../CenterNet2/detectron2/layers/roi_align.py | 74 - .../detectron2/layers/roi_align_rotated.py | 91 -- .../detectron2/layers/rotated_boxes.py | 21 - .../detectron2/layers/shape_spec.py | 20 - .../CenterNet2/detectron2/layers/wrappers.py | 132 -- .../detectron2/model_zoo/__init__.py | 10 - .../detectron2/model_zoo/model_zoo.py | 213 --- .../detectron2/modeling/__init__.py | 59 - .../detectron2/modeling/anchor_generator.py | 382 ----- .../detectron2/modeling/backbone/__init__.py | 17 - .../detectron2/modeling/backbone/backbone.py | 53 - .../detectron2/modeling/backbone/build.py | 33 - .../detectron2/modeling/backbone/fpn.py | 255 ---- .../detectron2/modeling/backbone/regnet.py | 452 ------ .../detectron2/modeling/backbone/resnet.py | 694 --------- .../detectron2/modeling/box_regression.py | 369 ----- .../CenterNet2/detectron2/modeling/matcher.py | 127 -- .../detectron2/modeling/meta_arch/__init__.py | 16 - .../detectron2/modeling/meta_arch/build.py | 25 - .../modeling/meta_arch/dense_detector.py | 282 ---- .../detectron2/modeling/meta_arch/fcos.py | 303 ---- .../modeling/meta_arch/panoptic_fpn.py | 266 ---- .../detectron2/modeling/meta_arch/rcnn.py | 327 ----- .../modeling/meta_arch/retinanet.py | 439 ------ .../modeling/meta_arch/semantic_seg.py | 260 ---- .../detectron2/modeling/mmdet_wrapper.py | 274 ---- .../CenterNet2/detectron2/modeling/poolers.py | 245 ---- .../detectron2/modeling/postprocessing.py | 101 -- .../modeling/proposal_generator/__init__.py | 5 - .../modeling/proposal_generator/build.py | 24 - .../proposal_generator/proposal_utils.py | 196 --- .../modeling/proposal_generator/rpn.py | 533 ------- .../modeling/proposal_generator/rrpn.py | 203 --- .../detectron2/modeling/roi_heads/__init__.py | 29 - 
.../detectron2/modeling/roi_heads/box_head.py | 118 -- .../modeling/roi_heads/cascade_rcnn.py | 299 ---- .../modeling/roi_heads/fast_rcnn.py | 462 ------ .../modeling/roi_heads/keypoint_head.py | 272 ---- .../modeling/roi_heads/mask_head.py | 292 ---- .../modeling/roi_heads/roi_heads.py | 877 ----------- .../modeling/roi_heads/rotated_fast_rcnn.py | 270 ---- .../detectron2/modeling/sampling.py | 54 - .../modeling/test_time_augmentation.py | 307 ---- .../CenterNet2/detectron2/projects/README.md | 2 - .../detectron2/projects/__init__.py | 31 - .../CenterNet2/detectron2/solver/__init__.py | 5 - .../CenterNet2/detectron2/solver/build.py | 285 ---- .../detectron2/solver/lr_scheduler.py | 238 --- .../detectron2/structures/__init__.py | 17 - .../CenterNet2/detectron2/structures/boxes.py | 423 ------ .../detectron2/structures/image_list.py | 110 -- .../detectron2/structures/instances.py | 192 --- .../detectron2/structures/keypoints.py | 239 --- .../CenterNet2/detectron2/structures/masks.py | 532 ------- .../detectron2/structures/rotated_boxes.py | 503 ------- .../CenterNet2/detectron2/utils/README.md | 5 - .../CenterNet2/detectron2/utils/__init__.py | 1 - .../CenterNet2/detectron2/utils/analysis.py | 188 --- .../detectron2/utils/collect_env.py | 242 ---- .../CenterNet2/detectron2/utils/colormap.py | 140 -- .../CenterNet2/detectron2/utils/comm.py | 199 --- .../CenterNet2/detectron2/utils/env.py | 170 --- .../CenterNet2/detectron2/utils/events.py | 486 ------- .../CenterNet2/detectron2/utils/file_io.py | 37 - .../CenterNet2/detectron2/utils/logger.py | 237 --- .../CenterNet2/detectron2/utils/memory.py | 84 -- .../CenterNet2/detectron2/utils/registry.py | 60 - .../CenterNet2/detectron2/utils/serialize.py | 32 - .../CenterNet2/detectron2/utils/testing.py | 137 -- .../detectron2/utils/video_visualizer.py | 252 ---- .../CenterNet2/detectron2/utils/visualizer.py | 1267 ---------------- .../third_party/CenterNet2/dev/README.md | 7 - .../third_party/CenterNet2/dev/linter.sh | 42 - .../CenterNet2/dev/packaging/README.md | 17 - .../dev/packaging/build_all_wheels.sh | 65 - .../CenterNet2/dev/packaging/build_wheel.sh | 31 - .../dev/packaging/gen_install_table.py | 63 - .../dev/packaging/gen_wheel_index.sh | 46 - .../CenterNet2/dev/packaging/pkg_helpers.bash | 76 - .../CenterNet2/dev/parse_results.sh | 45 - .../CenterNet2/dev/run_inference_tests.sh | 44 - .../CenterNet2/dev/run_instant_tests.sh | 27 - .../third_party/CenterNet2/docker/Dockerfile | 47 - .../third_party/CenterNet2/docker/README.md | 45 - .../CenterNet2/docker/deploy.Dockerfile | 32 - .../CenterNet2/docker/docker-compose.yml | 26 - .../third_party/CenterNet2/docs/.gitignore | 1 - .../third_party/CenterNet2/docs/Makefile | 19 - .../third_party/CenterNet2/docs/README.md | 15 - .../CenterNet2/docs/_static/css/custom.css | 30 - .../third_party/CenterNet2/docs/conf.py | 382 ----- .../third_party/CenterNet2/docs/index.rst | 14 - .../CenterNet2/docs/modules/checkpoint.rst | 7 - .../CenterNet2/docs/modules/config.rst | 18 - .../CenterNet2/docs/modules/data.rst | 37 - .../docs/modules/data_transforms.rst | 10 - .../CenterNet2/docs/modules/engine.rst | 26 - .../CenterNet2/docs/modules/evaluation.rst | 7 - .../CenterNet2/docs/modules/export.rst | 9 - .../CenterNet2/docs/modules/fvcore.rst | 49 - .../CenterNet2/docs/modules/index.rst | 19 - .../CenterNet2/docs/modules/layers.rst | 7 - .../CenterNet2/docs/modules/model_zoo.rst | 7 - .../CenterNet2/docs/modules/modeling.rst | 58 - .../CenterNet2/docs/modules/solver.rst | 7 - 
.../CenterNet2/docs/modules/structures.rst | 7 - .../CenterNet2/docs/modules/utils.rst | 80 - .../CenterNet2/docs/notes/benchmarks.md | 196 --- .../CenterNet2/docs/notes/changelog.md | 48 - .../CenterNet2/docs/notes/compatibility.md | 84 -- .../CenterNet2/docs/notes/index.rst | 10 - .../CenterNet2/docs/requirements.txt | 21 - .../CenterNet2/docs/tutorials/README.md | 4 - .../CenterNet2/docs/tutorials/augmentation.md | 186 --- .../CenterNet2/docs/tutorials/configs.md | 62 - .../CenterNet2/docs/tutorials/data_loading.md | 95 -- .../CenterNet2/docs/tutorials/datasets.md | 290 ---- .../CenterNet2/docs/tutorials/deployment.md | 137 -- .../CenterNet2/docs/tutorials/evaluation.md | 68 - .../CenterNet2/docs/tutorials/extend.md | 141 -- .../CenterNet2/docs/tutorials/index.rst | 20 - .../CenterNet2/docs/tutorials/lazyconfigs.md | 170 --- .../CenterNet2/docs/tutorials/models.md | 180 --- .../CenterNet2/docs/tutorials/training.md | 67 - .../CenterNet2/docs/tutorials/write-models.md | 90 -- .../CenterNet2/projects/CenterNet2/.gitignore | 10 - .../projects/CenterNet2/centernet/__init__.py | 14 - .../data/custom_build_augmentation.py | 59 - .../data/custom_dataset_dataloader.py | 229 --- .../centernet/data/datasets/coco.py | 49 - .../centernet/data/datasets/nuimages.py | 37 - .../centernet/data/datasets/objects365.py | 394 ----- .../transforms/custom_augmentation_impl.py | 63 - .../data/transforms/custom_transform.py | 94 -- .../centernet/modeling/backbone/bifpn.py | 425 ------ .../centernet/modeling/backbone/bifpn_fcos.py | 469 ------ .../centernet/modeling/backbone/dla.py | 479 ------ .../centernet/modeling/backbone/dlafpn.py | 493 ------- .../centernet/modeling/backbone/res2net.py | 802 ---------- .../CenterNet2/centernet/modeling/debug.py | 283 ---- .../modeling/dense_heads/centernet.py | 864 ----------- .../modeling/dense_heads/centernet_head.py | 162 --- .../centernet/modeling/dense_heads/utils.py | 38 - .../centernet/modeling/layers/deform_conv.py | 116 -- .../modeling/layers/heatmap_focal_loss.py | 92 -- .../centernet/modeling/layers/iou_loss.py | 121 -- .../centernet/modeling/layers/ml_nms.py | 31 - .../modeling/meta_arch/centernet_detector.py | 69 - .../modeling/roi_heads/custom_fast_rcnn.py | 124 -- .../modeling/roi_heads/custom_roi_heads.py | 185 --- .../centernet/modeling/roi_heads/fed_loss.py | 31 - .../CenterNet2/centernet2_docs/MODEL_ZOO.md | 73 - .../configs/Base-CenterNet-FPN.yaml | 28 - .../CenterNet2/configs/Base-CenterNet2.yaml | 56 - .../CenterNet2/configs/Base_S4_DLA.yaml | 40 - .../configs/CenterNet-FPN_R50_1x.yaml | 4 - .../configs/CenterNet-S4_DLA_8x.yaml | 5 - .../configs/CenterNet2-F_R50_1x.yaml | 4 - .../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml | 36 - .../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml | 36 - .../CenterNet2_DLA-BiFPN-P5_640_16x.yaml | 29 - .../CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml | 30 - ...enterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml | 30 - .../CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml | 32 - ...erNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml | 36 - .../configs/CenterNet2_R2-101-DCN_896_4x.yaml | 29 - .../CenterNet2/configs/CenterNet2_R50_1x.yaml | 1 - .../configs/CenterNet2_X101-DCN_2x.yaml | 22 - .../configs/LVIS_CenterNet2_R50_1x.yaml | 17 - .../configs/LVIS_CenterNet2_R50_Fed_1x.yaml | 19 - .../configs/O365_CenterNet2_R50_1x.yaml | 13 - .../nuImages_CenterNet2_DLA_640_8x.yaml | 42 - .../CenterNet2/projects/CenterNet2/demo.py | 185 --- .../projects/CenterNet2/predictor.py | 243 ---- .../projects/CenterNet2/train_net.py | 228 --- 
.../grit_src/third_party/CenterNet2/setup.cfg | 26 -
.../grit_src/third_party/CenterNet2/setup.py | 206 ---
.../third_party/CenterNet2/tests/README.md | 9 -
.../third_party/CenterNet2/tests/__init__.py | 1 -
.../CenterNet2/tests/config/dir1/dir1_a.py | 3 -
.../CenterNet2/tests/config/dir1/dir1_b.py | 11 -
.../CenterNet2/tests/config/root_cfg.py | 14 -
.../tests/config/test_instantiate_config.py | 100 --
.../tests/config/test_lazy_config.py | 79 -
.../tests/config/test_yacs_config.py | 270 ----
.../CenterNet2/tests/data/test_coco.py | 139 --
.../tests/data/test_coco_evaluation.py | 138 --
.../CenterNet2/tests/data/test_dataset.py | 134 --
.../tests/data/test_detection_utils.py | 176 ---
.../tests/data/test_rotation_transform.py | 71 -
.../CenterNet2/tests/data/test_sampler.py | 111 --
.../CenterNet2/tests/data/test_transforms.py | 268 ----
.../CenterNet2/tests/layers/test_blocks.py | 51 -
.../tests/layers/test_deformable.py | 175 ---
.../CenterNet2/tests/layers/test_losses.py | 82 --
.../CenterNet2/tests/layers/test_mask_ops.py | 202 ---
.../CenterNet2/tests/layers/test_nms.py | 33 -
.../tests/layers/test_nms_rotated.py | 172 ---
.../CenterNet2/tests/layers/test_roi_align.py | 210 ---
.../tests/layers/test_roi_align_rotated.py | 176 ---
.../tests/modeling/test_anchor_generator.py | 120 --
.../tests/modeling/test_backbone.py | 34 -
.../tests/modeling/test_box2box_transform.py | 94 --
.../tests/modeling/test_fast_rcnn.py | 171 ---
.../CenterNet2/tests/modeling/test_matcher.py | 42 -
.../CenterNet2/tests/modeling/test_mmdet.py | 186 ---
.../tests/modeling/test_model_e2e.py | 223 ---
.../tests/modeling/test_roi_heads.py | 323 -----
.../tests/modeling/test_roi_pooler.py | 165 ---
.../CenterNet2/tests/modeling/test_rpn.py | 262 ----
.../CenterNet2/tests/structures/test_boxes.py | 223 ---
.../tests/structures/test_imagelist.py | 75 -
.../tests/structures/test_instances.py | 219 ---
.../tests/structures/test_keypoints.py | 19 -
.../CenterNet2/tests/structures/test_masks.py | 53 -
.../tests/structures/test_rotated_boxes.py | 437 ------
.../CenterNet2/tests/test_checkpoint.py | 49 -
.../CenterNet2/tests/test_engine.py | 186 ---
.../CenterNet2/tests/test_events.py | 64 -
.../CenterNet2/tests/test_export_caffe2.py | 52 -
.../tests/test_export_torchscript.py | 296 ----
.../CenterNet2/tests/test_model_analysis.py | 80 -
.../CenterNet2/tests/test_model_zoo.py | 50 -
.../CenterNet2/tests/test_packaging.py | 24 -
.../CenterNet2/tests/test_registry.py | 45 -
.../CenterNet2/tests/test_scheduler.py | 68 -
.../CenterNet2/tests/test_solver.py | 66 -
.../CenterNet2/tests/test_visualizer.py | 278 ----
.../third_party/CenterNet2/tools/README.md | 49 -
.../CenterNet2/tools/analyze_model.py | 159 --
.../third_party/CenterNet2/tools/benchmark.py | 197 ---
.../tools/convert-torchvision-to-d2.py | 56 -
.../CenterNet2/tools/deploy/CMakeLists.txt | 15 -
.../CenterNet2/tools/deploy/README.md | 66 -
.../CenterNet2/tools/deploy/export_model.py | 235 ---
.../tools/deploy/torchscript_mask_rcnn.cpp | 187 ---
.../CenterNet2/tools/lazyconfig_train_net.py | 131 --
.../CenterNet2/tools/lightning_train_net.py | 239 ---
.../CenterNet2/tools/plain_train_net.py | 223 ---
.../third_party/CenterNet2/tools/train_net.py | 170 ---
.../CenterNet2/tools/visualize_data.py | 94 --
.../tools/visualize_json_results.py | 90 --
vbench/third_party/umt/__init__.py | 0
vbench/third_party/umt/models/__init__.py | 4 +-
.../umt/models/modeling_finetune.py | 76 +-
version.txt | 1 +
1090 files changed, 318 insertions(+), 64558 deletions(-)
create mode 100644 MANIFEST.in
create mode 100644 bin/evaluate
rename vbench/third_party/tag2Text/grit_model.py => grit_model_deprecated.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/Base.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/GRiT_B_DenseCap.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/GRiT_B_DenseCap_ObjectDet.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/GRiT_B_ObjectDet.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/GRiT_H_ObjectDet.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/configs/GRiT_L_ObjectDet.yaml (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/__init__.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/config.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/custom_solver.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/custom_build_augmentation.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/custom_dataset_dataloader.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/custom_dataset_mapper.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/datasets/grit_coco.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/datasets/object365.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/datasets/vg.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/transforms/custom_augmentation_impl.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/data/transforms/custom_transform.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/evaluation/eval.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/backbone/utils.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/backbone/vit.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/meta_arch/grit.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/roi_heads/grit_fast_rcnn.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/roi_heads/grit_roi_heads.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/soft_nms.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/text/file_utils.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/text/load_text_token.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/text/modeling_bert.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/modeling/text/text_decoder.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/grit/predictor.py (100%)
rename {vbench/third_party/tag2Text/grit_src => grit_src_deprecated}/image_dense_captions.py (99%)
rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.circleci/config.yml (100%)
rename {vbench/third_party/grit_src =>
grit_src_deprecated}/third_party/CenterNet2/.clang-format (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.flake8 (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/CONTRIBUTING.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/pull_request_template.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/workflows/check-template.yml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/workflows/levenshtein.js (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/workflows/needs-reply.yml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.github/workflows/workflow.yml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/.gitignore (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/GETTING_STARTED.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/INSTALL.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/LICENSE (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/MODEL_ZOO.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/README_D2.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Base-RCNN-C4.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Base-RetinaNet.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py 
(100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/coco_schedule.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/data/coco.py (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/configs/common/data/coco_keypoint.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/cascade_rcnn.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/fcos.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/panoptic_fpn.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/models/retinanet.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/optim.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/common/train.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/README.md (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/prepare_for_tests.sh (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/demo/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/demo/demo.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/demo/predictor.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/checkpoint/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/checkpoint/catalog.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/compat.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/config.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/defaults.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/instantiate.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/config/lazy.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/benchmark.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/build.py (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/detectron2/data/catalog.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/common.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/dataset_mapper.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/builtin.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/coco.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/lvis.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/datasets/register_coco.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/detection_utils.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/samplers/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/transforms/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/transforms/augmentation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/data/transforms/transform.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/engine/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/engine/defaults.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/engine/hooks.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/engine/launch.py (100%) rename 
{vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/engine/train_loop.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/evaluator.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/evaluation/testing.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/api.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/c10.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/caffe2_export.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/caffe2_inference.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/caffe2_modeling.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/caffe2_patch.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/flatten.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/shared.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/torchscript.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/export/torchscript_patch.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/__init__.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/aspp.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/batch_norm.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/blocks.py (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/README.md (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/deform_conv.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/losses.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/mask_ops.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/nms.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/roi_align.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/rotated_boxes.py (100%) rename {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/shape_spec.py (100%) rename {vbench/third_party/grit_src => 
grit_src_deprecated}/third_party/CenterNet2/detectron2/layers/wrappers.py (100%)
 [further renames, all {vbench/third_party/grit_src => grit_src_deprecated}/third_party/CenterNet2/... (100%): the rest of detectron2/ (layers, model_zoo, modeling, projects, solver, structures, utils), dev/, docker/, docs/, projects/CenterNet2/, setup.cfg, setup.py, tests/, and tools/; the only exceptions are projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py at (99%) and four docs pages — docs/notes/contributing.md, docs/tutorials/builtin_datasets.md, docs/tutorials/getting_started.md, docs/tutorials/install.md — renamed from vbench/third_party/tag2Text/grit_src instead]
 create mode 100644 requirements.txt
 rename vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/__init__.py => setup.cfg (100%) mode change 100755 => 100644
 create mode 100644 setup.py
 create mode 100644 vbench/cli/__init__.py
 create mode 100644 vbench/cli/evaluate.py
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/tests/layers => RAFT}/__init__.py (100%) mode change 100755 => 100644
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/tests/modeling => }/__init__.py (100%) mode change 100755 => 100644
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/tests/structures => amt}/__init__.py (100%) mode change 100755 => 100644
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/tools => grit_src}/__init__.py (100%) mode change 100755 => 100644
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/config.py => grit_src/centernet_config.py} (100%) mode change 100755 => 100644
 create mode 100644 vbench/third_party/grit_src/grit/data/__init__.py
 create mode 100644 vbench/third_party/grit_src/grit/data/datasets/__init__.py
 create mode 100644 vbench/third_party/grit_src/grit/modeling/__init__.py
 create mode 100644 vbench/third_party/grit_src/grit/modeling/backbone/__init__.py
 rename vbench/third_party/{tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py => grit_src/grit/modeling/backbone/centernet_last_layer.py} (99%) mode change 100755 => 100644
 create mode 100644 vbench/third_party/grit_src/grit/modeling/meta_arch/__init__.py
 create mode 100644 vbench/third_party/grit_src/grit/modeling/roi_heads/__init__.py
 create mode 100644 vbench/third_party/grit_src/grit/modeling/text/__init__.py
 delete mode 120000 vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/contributing.md
 delete mode 120000 vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
 delete mode 120000 vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md
 delete mode 120000 vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/install.md
 create mode 100644 vbench/third_party/tag2Text/__init__.py
 [delete mode 100755, the entire duplicated vendored tree vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/: .circleci/, .clang-format, .flake8, .github/ (issue templates and workflows), .gitignore, GETTING_STARTED.md, INSTALL.md, LICENSE, MODEL_ZOO.md, README.md, README_D2.md, configs/ (Base-*, COCO-Detection, COCO-InstanceSegmentation, COCO-Keypoints, COCO-PanopticSegmentation, Cityscapes, Detectron1-Comparisons, LVISv0.5/LVISv1-InstanceSegmentation, Misc, PascalVOC-Detection, common, new_baselines, quick_schedules), datasets/, demo/, and detectron2/ (checkpoint, config, data, engine, evaluation, export, layers incl. csrc, model_zoo, modeling, projects, solver, structures, utils)]
 delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/linter.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/parse_results.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/Dockerfile delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/docker-compose.yml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/.gitignore delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/Makefile delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/_static/css/custom.css delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/conf.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/index.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/config.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/engine.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/export.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/index.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/layers.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/modeling.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/solver.rst delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/structures.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/utils.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/changelog.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/compatibility.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/index.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/requirements.txt delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/configs.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/extend.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/index.rst delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/models.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/training.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.cfg delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/__init__.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/root_cfg.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_dataset.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_sampler.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_transforms.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_losses.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_instances.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_masks.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_checkpoint.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_engine.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_events.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py delete mode 100755 
vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_analysis.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_zoo.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_packaging.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_registry.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_scheduler.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_solver.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_visualizer.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/analyze_model.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/benchmark.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/README.md delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/export_model.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lightning_train_net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/plain_train_net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/train_net.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_data.py delete mode 100755 vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_json_results.py create mode 100644 vbench/third_party/umt/__init__.py create mode 100644 version.txt
diff --git a/MANIFEST.in b/MANIFEST.in
new file mode 100644
index 00000000..e258121c
--- /dev/null
+++ b/MANIFEST.in
@@ -0,0 +1,2 @@
+include version.txt
+include requirements.txt
diff --git a/bin/evaluate b/bin/evaluate
new file mode 100644
index 00000000..ca7ebfc7
--- /dev/null
+++ b/bin/evaluate
@@ -0,0 +1,70 @@
+#!/usr/bin/env python3
+
+import torch
+import vbench
+from vbench import VBench
+
+
+import argparse
+
+def parse_args():
+
+    parser = argparse.ArgumentParser(description='VBench')
+    parser.add_argument(
+        "--output_path",
+        type=str,
+        default='./evaluation_results/',
+        help="output path to save the evaluation results",
+    )
+    parser.add_argument(
+        "--full_json_dir",
+        type=str,
+        default='./VBench_full_info.json',
+        help="path to the json file that contains the prompt and dimension information",
+    )
+    parser.add_argument(
+        "--videos_path",
+        type=str,
+        required=True,
+        help="folder that contains the sampled videos",
+    )
+    parser.add_argument(
+        "--dimension",
+        type=str,
+        required=True,
+        help="evaluation dimension",
+    )
+    parser.add_argument(
+        "--load_ckpt_from_local",
+        action="store_true",
+        required=False,
+        help="whether to load checkpoints from local default paths (assuming the checkpoints have been downloaded locally)",
+    )
+    parser.add_argument(
+        "--read_frame",
+        action="store_true",
+        required=False,
+        help="whether to read frames directly instead of reading videos",
+    )
+    args = parser.parse_args()
+    return args
+
+def main():
+    args = parse_args()
+    print(f'args: {args}')
+
+    device = torch.device("cuda")
+    my_VBench = VBench(device, args.full_json_dir, args.output_path)
+
+    print('start evaluation')
+    my_VBench.evaluate(
+        videos_path=args.videos_path,
+        name=args.dimension,
+        dimension_list=[args.dimension],
+        local=args.load_ckpt_from_local,
+        read_frame=args.read_frame,
+    )
+    print('done')
+
+if __name__ == "__main__":
+    main()
diff --git a/vbench/third_party/tag2Text/grit_model.py b/grit_model_deprecated.py similarity index 100% rename from vbench/third_party/tag2Text/grit_model.py rename to grit_model_deprecated.py diff --git a/vbench/third_party/tag2Text/grit_src/configs/Base.yaml b/grit_src_deprecated/configs/Base.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/Base.yaml rename to grit_src_deprecated/configs/Base.yaml diff --git a/vbench/third_party/tag2Text/grit_src/configs/GRiT_B_DenseCap.yaml b/grit_src_deprecated/configs/GRiT_B_DenseCap.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/GRiT_B_DenseCap.yaml rename to grit_src_deprecated/configs/GRiT_B_DenseCap.yaml diff --git a/vbench/third_party/tag2Text/grit_src/configs/GRiT_B_DenseCap_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/GRiT_B_DenseCap_ObjectDet.yaml rename to grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml diff --git a/vbench/third_party/tag2Text/grit_src/configs/GRiT_B_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/GRiT_B_ObjectDet.yaml rename to grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml diff --git a/vbench/third_party/tag2Text/grit_src/configs/GRiT_H_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/GRiT_H_ObjectDet.yaml rename to grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml diff --git a/vbench/third_party/tag2Text/grit_src/configs/GRiT_L_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml similarity index 100% rename from vbench/third_party/tag2Text/grit_src/configs/GRiT_L_ObjectDet.yaml rename to grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml diff --git a/vbench/third_party/tag2Text/grit_src/grit/__init__.py b/grit_src_deprecated/grit/__init__.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/__init__.py rename to grit_src_deprecated/grit/__init__.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/config.py b/grit_src_deprecated/grit/config.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/config.py rename to grit_src_deprecated/grit/config.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/custom_solver.py b/grit_src_deprecated/grit/custom_solver.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/custom_solver.py rename to grit_src_deprecated/grit/custom_solver.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/custom_build_augmentation.py b/grit_src_deprecated/grit/data/custom_build_augmentation.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/custom_build_augmentation.py rename to grit_src_deprecated/grit/data/custom_build_augmentation.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/custom_dataset_dataloader.py b/grit_src_deprecated/grit/data/custom_dataset_dataloader.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/custom_dataset_dataloader.py rename to grit_src_deprecated/grit/data/custom_dataset_dataloader.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/custom_dataset_mapper.py b/grit_src_deprecated/grit/data/custom_dataset_mapper.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/custom_dataset_mapper.py rename to grit_src_deprecated/grit/data/custom_dataset_mapper.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/datasets/grit_coco.py b/grit_src_deprecated/grit/data/datasets/grit_coco.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/datasets/grit_coco.py rename to grit_src_deprecated/grit/data/datasets/grit_coco.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/datasets/object365.py b/grit_src_deprecated/grit/data/datasets/object365.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/datasets/object365.py rename to grit_src_deprecated/grit/data/datasets/object365.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/datasets/vg.py b/grit_src_deprecated/grit/data/datasets/vg.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/datasets/vg.py rename to grit_src_deprecated/grit/data/datasets/vg.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/transforms/custom_augmentation_impl.py b/grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/transforms/custom_augmentation_impl.py rename to grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/data/transforms/custom_transform.py b/grit_src_deprecated/grit/data/transforms/custom_transform.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/data/transforms/custom_transform.py rename to grit_src_deprecated/grit/data/transforms/custom_transform.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/evaluation/eval.py b/grit_src_deprecated/grit/evaluation/eval.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/evaluation/eval.py rename to grit_src_deprecated/grit/evaluation/eval.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/backbone/utils.py b/grit_src_deprecated/grit/modeling/backbone/utils.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/backbone/utils.py rename to grit_src_deprecated/grit/modeling/backbone/utils.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/backbone/vit.py b/grit_src_deprecated/grit/modeling/backbone/vit.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/backbone/vit.py rename to grit_src_deprecated/grit/modeling/backbone/vit.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/meta_arch/grit.py b/grit_src_deprecated/grit/modeling/meta_arch/grit.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/meta_arch/grit.py rename to grit_src_deprecated/grit/modeling/meta_arch/grit.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/roi_heads/grit_fast_rcnn.py
b/grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/roi_heads/grit_fast_rcnn.py rename to grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/roi_heads/grit_roi_heads.py b/grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/roi_heads/grit_roi_heads.py rename to grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/soft_nms.py b/grit_src_deprecated/grit/modeling/soft_nms.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/soft_nms.py rename to grit_src_deprecated/grit/modeling/soft_nms.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/text/file_utils.py b/grit_src_deprecated/grit/modeling/text/file_utils.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/text/file_utils.py rename to grit_src_deprecated/grit/modeling/text/file_utils.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/text/load_text_token.py b/grit_src_deprecated/grit/modeling/text/load_text_token.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/text/load_text_token.py rename to grit_src_deprecated/grit/modeling/text/load_text_token.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/text/modeling_bert.py b/grit_src_deprecated/grit/modeling/text/modeling_bert.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/text/modeling_bert.py rename to grit_src_deprecated/grit/modeling/text/modeling_bert.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/modeling/text/text_decoder.py b/grit_src_deprecated/grit/modeling/text/text_decoder.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/modeling/text/text_decoder.py rename to grit_src_deprecated/grit/modeling/text/text_decoder.py diff --git a/vbench/third_party/tag2Text/grit_src/grit/predictor.py b/grit_src_deprecated/grit/predictor.py similarity index 100% rename from vbench/third_party/tag2Text/grit_src/grit/predictor.py rename to grit_src_deprecated/grit/predictor.py diff --git a/vbench/third_party/tag2Text/grit_src/image_dense_captions.py b/grit_src_deprecated/image_dense_captions.py similarity index 99% rename from vbench/third_party/tag2Text/grit_src/image_dense_captions.py rename to grit_src_deprecated/image_dense_captions.py
index cfa472f3..fa130af5 100755
--- a/vbench/third_party/tag2Text/grit_src/image_dense_captions.py
+++ b/grit_src_deprecated/image_dense_captions.py
@@ -84,4 +84,4 @@ def init_demo(device):
 if __name__=="__main__":
     import os
     os.environ['CUDA_VISIBLE_DEVICES']='7'
-    print(image_caption_api("images/dancing_example_4.mp4_20230417_135359.263.jpg",'cuda'))
\ No newline at end of file
+    print(image_caption_api("images/dancing_example_4.mp4_20230417_135359.263.jpg",'cuda'))
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.circleci/config.yml b/grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.circleci/config.yml rename to grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.clang-format
b/grit_src_deprecated/third_party/CenterNet2/.clang-format similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.clang-format rename to grit_src_deprecated/third_party/CenterNet2/.clang-format diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.flake8 b/grit_src_deprecated/third_party/CenterNet2/.flake8 similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.flake8 rename to grit_src_deprecated/third_party/CenterNet2/.flake8 diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md b/grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md rename to grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/CONTRIBUTING.md b/grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/CONTRIBUTING.md rename to grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg b/grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg rename to grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md diff --git 
a/vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md rename to grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/pull_request_template.md b/grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/pull_request_template.md rename to grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/check-template.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/check-template.yml rename to grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/levenshtein.js b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/levenshtein.js rename to grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/needs-reply.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/needs-reply.yml rename to grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml rename to grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/workflow.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.github/workflows/workflow.yml rename to grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/.gitignore b/grit_src_deprecated/third_party/CenterNet2/.gitignore similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/.gitignore rename to grit_src_deprecated/third_party/CenterNet2/.gitignore diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/GETTING_STARTED.md b/grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/GETTING_STARTED.md rename to grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/INSTALL.md b/grit_src_deprecated/third_party/CenterNet2/INSTALL.md similarity index 100% rename from 
vbench/third_party/grit_src/third_party/CenterNet2/INSTALL.md rename to grit_src_deprecated/third_party/CenterNet2/INSTALL.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/LICENSE b/grit_src_deprecated/third_party/CenterNet2/LICENSE similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/LICENSE rename to grit_src_deprecated/third_party/CenterNet2/LICENSE diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/MODEL_ZOO.md b/grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/MODEL_ZOO.md rename to grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/README.md b/grit_src_deprecated/third_party/CenterNet2/README.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/README.md rename to grit_src_deprecated/third_party/CenterNet2/README.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/README_D2.md b/grit_src_deprecated/third_party/CenterNet2/README_D2.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/README_D2.md rename to grit_src_deprecated/third_party/CenterNet2/README_D2.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-C4.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-C4.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RetinaNet.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Base-RetinaNet.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml rename to 
grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml similarity index 100% rename from 
vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md
rename to grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/common/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/README.md
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/coco_schedule.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/coco_schedule.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco_keypoint.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco_keypoint.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/cascade_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/cascade_rcnn.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/fcos.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/fcos.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/panoptic_fpn.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/retinanet.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/models/retinanet.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/optim.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/optim.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/common/train.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/train.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/common/train.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/common/train.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py
rename to grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/README.md
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml
rename to grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/README.md b/grit_src_deprecated/third_party/CenterNet2/datasets/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/README.md
rename to grit_src_deprecated/third_party/CenterNet2/datasets/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json b/grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json
rename to grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py
rename to grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py
rename to grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_for_tests.sh b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_for_tests.sh
rename to grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py
rename to grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/demo/README.md b/grit_src_deprecated/third_party/CenterNet2/demo/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/demo/README.md
rename to grit_src_deprecated/third_party/CenterNet2/demo/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/demo/demo.py b/grit_src_deprecated/third_party/CenterNet2/demo/demo.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/demo/demo.py
rename to grit_src_deprecated/third_party/CenterNet2/demo/demo.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/demo/predictor.py b/grit_src_deprecated/third_party/CenterNet2/demo/predictor.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/demo/predictor.py
rename to grit_src_deprecated/third_party/CenterNet2/demo/predictor.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/catalog.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/catalog.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/compat.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/compat.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/config.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/config.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/defaults.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/defaults.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/instantiate.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/instantiate.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/lazy.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/config/lazy.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/benchmark.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/benchmark.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/build.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/catalog.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/catalog.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/common.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/common.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/dataset_mapper.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/dataset_mapper.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/README.md
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/register_coco.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/datasets/register_coco.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/detection_utils.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/detection_utils.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/transform.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/data/transforms/transform.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/defaults.py
b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/defaults.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/hooks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/hooks.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/launch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/launch.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/train_loop.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/engine/train_loop.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/evaluator.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/evaluator.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py 
b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/testing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/evaluation/testing.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/README.md rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/api.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/api.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py diff --git 
a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/c10.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/c10.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_export.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_export.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_inference.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_inference.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_modeling.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_modeling.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_patch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/caffe2_patch.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/flatten.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/flatten.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/shared.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/shared.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/torchscript.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/torchscript.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/torchscript_patch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/export/torchscript_patch.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/__init__.py 
b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/aspp.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/aspp.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/batch_norm.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/batch_norm.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/blocks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/blocks.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/README.md rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h similarity index 100% rename from 
vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu 
b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/deform_conv.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/deform_conv.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/losses.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/losses.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/mask_ops.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py similarity index 100% rename from 
vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/mask_ops.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/nms.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/nms.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/roi_align.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/roi_align.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/rotated_boxes.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/shape_spec.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/shape_spec.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/model_zoo/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/model_zoo/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py 
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/anchor_generator.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/anchor_generator.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/build.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/box_regression.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/box_regression.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/matcher.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/matcher.py rename to 
grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py 
b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/poolers.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/poolers.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/postprocessing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/postprocessing.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py similarity index 100% rename from 
vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/sampling.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/sampling.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py 
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/projects/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/projects/README.md rename to grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/projects/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/projects/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/build.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/lr_scheduler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/solver/lr_scheduler.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/boxes.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/image_list.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/image_list.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/instances.py 
b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/instances.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/keypoints.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/keypoints.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/masks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/masks.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/structures/rotated_boxes.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/README.md rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/__init__.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/analysis.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/analysis.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/collect_env.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/collect_env.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/colormap.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/colormap.py rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/comm.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/comm.py rename to 
grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/env.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/env.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/events.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/events.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/file_io.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/file_io.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/logger.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/logger.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/memory.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/memory.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/registry.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/registry.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/serialize.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/serialize.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/testing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/testing.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/video_visualizer.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/video_visualizer.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/visualizer.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/detectron2/utils/visualizer.py
rename to grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/README.md b/grit_src_deprecated/third_party/CenterNet2/dev/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/README.md
rename to grit_src_deprecated/third_party/CenterNet2/dev/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/linter.sh b/grit_src_deprecated/third_party/CenterNet2/dev/linter.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/linter.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/linter.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/README.md b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/README.md
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_wheel.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_wheel.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
rename to grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/parse_results.sh b/grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/parse_results.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh b/grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh b/grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh
rename to grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docker/Dockerfile b/grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docker/Dockerfile
rename to grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docker/README.md b/grit_src_deprecated/third_party/CenterNet2/docker/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docker/README.md
rename to grit_src_deprecated/third_party/CenterNet2/docker/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile b/grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile
rename to grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docker/docker-compose.yml b/grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docker/docker-compose.yml
rename to grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/.gitignore b/grit_src_deprecated/third_party/CenterNet2/docs/.gitignore
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/.gitignore
rename to grit_src_deprecated/third_party/CenterNet2/docs/.gitignore
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/Makefile b/grit_src_deprecated/third_party/CenterNet2/docs/Makefile
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/Makefile
rename to grit_src_deprecated/third_party/CenterNet2/docs/Makefile
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/README.md b/grit_src_deprecated/third_party/CenterNet2/docs/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/README.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/_static/css/custom.css b/grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/_static/css/custom.css
rename to grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/conf.py b/grit_src_deprecated/third_party/CenterNet2/docs/conf.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/conf.py
rename to grit_src_deprecated/third_party/CenterNet2/docs/conf.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/index.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/index.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/index.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/config.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/config.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/data.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/data.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/engine.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/engine.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/export.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/export.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/index.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/layers.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/layers.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/modeling.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/modeling.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/solver.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/solver.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/structures.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/structures.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/utils.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/modules/utils.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/changelog.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/changelog.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/compatibility.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/compatibility.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/contributing.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md
similarity index 100%
rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/contributing.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/index.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/requirements.txt b/grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/requirements.txt
rename to grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/README.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/README.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
similarity index 100%
rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/configs.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/configs.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/extend.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/extend.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md
similarity index 100%
rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/index.rst
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/install.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md
similarity index 100%
rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/install.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/models.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/models.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/training.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/training.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md
rename to grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/config.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/config.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
similarity index 99%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
index e991f9c7..cc4e7a49 100755
--- a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
+++ b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
@@ -75,4 +75,4 @@ def build_p35_resnet_fpn_backbone(cfg, input_shape: ShapeSpec):
         top_block=None,
         fuse_type=cfg.MODEL.FPN.FUSE_TYPE,
     )
-    return backbone
\ No newline at end of file
+    return backbone
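For orientation: the hunk above is the only content change in this batch of renames; it simply adds the missing trailing newline after `return backbone`. The function it touches builds a P3-P5 FPN on a ResNet bottom-up. A minimal sketch of such a builder, reconstructed from the hunk header and standard detectron2 conventions (the imports, `build_resnet_backbone`, and the `cfg.MODEL.FPN.*` keys are detectron2's usual names, not verbatim repository code):

```python
from detectron2.config import CfgNode
from detectron2.layers import ShapeSpec
from detectron2.modeling import BACKBONE_REGISTRY
from detectron2.modeling.backbone import FPN, build_resnet_backbone


@BACKBONE_REGISTRY.register()
def build_p35_resnet_fpn_backbone(cfg: CfgNode, input_shape: ShapeSpec):
    # Bottom-up pathway: a plain ResNet producing the res3-res5 feature maps.
    bottom_up = build_resnet_backbone(cfg, input_shape)
    # Top-down FPN over those stages; top_block=None adds no extra P6/P7
    # levels, so the backbone emits P3-P5 only (hence "p35" in the name).
    backbone = FPN(
        bottom_up=bottom_up,
        in_features=cfg.MODEL.FPN.IN_FEATURES,
        out_channels=cfg.MODEL.FPN.OUT_CHANNELS,
        norm=cfg.MODEL.FPN.NORM,
        top_block=None,
        fuse_type=cfg.MODEL.FPN.FUSE_TYPE,
    )
    return backbone
```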
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py
rename to grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/setup.cfg b/grit_src_deprecated/third_party/CenterNet2/setup.cfg
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/setup.cfg
rename to grit_src_deprecated/third_party/CenterNet2/setup.cfg
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/setup.py b/grit_src_deprecated/third_party/CenterNet2/setup.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/setup.py
rename to grit_src_deprecated/third_party/CenterNet2/setup.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/README.md b/grit_src_deprecated/third_party/CenterNet2/tests/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/README.md
rename to grit_src_deprecated/third_party/CenterNet2/tests/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/root_cfg.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/root_cfg.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_coco.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_coco.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_dataset.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_dataset.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_sampler.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_sampler.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_transforms.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/data/test_transforms.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_losses.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_losses.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_nms.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_nms.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms_rotated.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms_rotated.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_instances.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_instances.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_masks.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_masks.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_checkpoint.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_checkpoint.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_engine.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_engine.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_events.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_events.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_events.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_events.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_model_analysis.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_model_analysis.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_model_zoo.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_model_zoo.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_packaging.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_packaging.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_registry.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_registry.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_scheduler.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_scheduler.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_solver.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_solver.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tests/test_visualizer.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tests/test_visualizer.py
rename to grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/README.md b/grit_src_deprecated/third_party/CenterNet2/tools/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/README.md
rename to grit_src_deprecated/third_party/CenterNet2/tools/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tools/__init__.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/__init__.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/__init__.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/analyze_model.py b/grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/analyze_model.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/benchmark.py b/grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/benchmark.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py b/grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt
rename to grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/README.md b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/README.md
rename to grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/export_model.py b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/export_model.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
rename to grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py
diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/lightning_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py
similarity index 100%
rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/lightning_train_net.py
rename to grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py
diff --git 
a/vbench/third_party/grit_src/third_party/CenterNet2/tools/plain_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/plain_train_net.py rename to grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/train_net.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/train_net.py rename to grit_src_deprecated/third_party/CenterNet2/tools/train_net.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/visualize_data.py b/grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/visualize_data.py rename to grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/tools/visualize_json_results.py b/grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py similarity index 100% rename from vbench/third_party/grit_src/third_party/CenterNet2/tools/visualize_json_results.py rename to grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py diff --git a/requirements.txt b/requirements.txt new file mode 100644 index 00000000..6a9ef8e5 --- /dev/null +++ b/requirements.txt @@ -0,0 +1,17 @@ +Pillow +numpy +matplotlib +timm>=0.9 +torch>=1.12 +torchvision>=0.13 +tensorboard +scipy +opencv-python +detectron2 +scikit-learn +requests +scikit-image +pyyaml +easydict +lvis +fairscale diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/__init__.py b/setup.cfg old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/__init__.py rename to setup.cfg diff --git a/setup.py b/setup.py new file mode 100644 index 00000000..5b240aa2 --- /dev/null +++ b/setup.py @@ -0,0 +1,35 @@ +#!/usr/bin/env python + +from setuptools import find_packages, setup +import os + +def fetch_readme(): + with open('README.md', encoding='utf-8') as f: + text = f.read() + return text + +def fetch_requirements(): + filename = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'requirements.txt') + with open(filename, 'r') as f: + envs = [line.rstrip('\n') for line in f.readlines()] + return envs + +version = open('version.txt', 'r').read().rstrip('\n') +install_requires = fetch_requirements() +setup(name='vbench', + version=version, + description='Video generation benchmark', + long_description=fetch_readme(), + long_description_content_type='text/markdown', + project_urls={ + 'Source': 'https://github.com/Vchitect/VBench', + }, + entry_points={ + 'console_scripts': ['evaluate=vbench.cli.evaluate:main'] + }, + install_requires=install_requires, + packages=find_packages(), + include_package_data=True, + license='Apache Software License 2.0', + scripts=['bin/evaluate'], +) diff --git a/vbench/aesthetic_quality.py b/vbench/aesthetic_quality.py index c2d151a3..78d79618 100755 --- a/vbench/aesthetic_quality.py +++ b/vbench/aesthetic_quality.py @@ -7,7 +7,7 @@ import torch.nn as nn import torch.nn.functional as F from urllib.request import urlretrieve -from .utils import load_video, load_dimension_info, clip_transform +from vbench.utils import load_video, load_dimension_info, clip_transform from tqdm import
tqdm diff --git a/vbench/appearance_style.py b/vbench/appearance_style.py index 944a024f..bb62a21f 100755 --- a/vbench/appearance_style.py +++ b/vbench/appearance_style.py @@ -6,7 +6,7 @@ import torch import clip from PIL import Image -from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps, clip_transform_Image +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps, clip_transform_Image def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -63,4 +63,4 @@ def compute_appearance_style(json_dir, device, submodules_list): clip_model, preprocess = clip.load(device=device, **submodules_list) _, video_dict = load_dimension_info(json_dir, dimension='appearance_style', lang='en') all_results, video_results = appearance_style(clip_model, video_dict, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/background_consistency.py b/vbench/background_consistency.py index d33cfc81..3ce0ea40 100755 --- a/vbench/background_consistency.py +++ b/vbench/background_consistency.py @@ -6,7 +6,7 @@ import torch import torch.nn as nn import torch.nn.functional as F -from .utils import load_video, load_dimension_info, clip_transform +from vbench.utils import load_video, load_dimension_info, clip_transform from tqdm import tqdm diff --git a/vbench/cli/__init__.py b/vbench/cli/__init__.py new file mode 100644 index 00000000..31b4bf0a --- /dev/null +++ b/vbench/cli/__init__.py @@ -0,0 +1,2 @@ +import sys +sys.path.append('vbench/third_party') diff --git a/vbench/cli/evaluate.py b/vbench/cli/evaluate.py new file mode 100644 index 00000000..fc04dbc6 --- /dev/null +++ b/vbench/cli/evaluate.py @@ -0,0 +1,67 @@ +import torch +from vbench import VBench + +import argparse + +def parse_args(): + + parser = argparse.ArgumentParser(description='VBench') + parser.add_argument( + "--output_path", + type=str, + default='./evaluation_results/', + help="output path to save the evaluation results", + ) + parser.add_argument( + "--full_json_dir", + type=str, + default='./VBench_full_info.json', + help="path to save the json file that contains the prompt and dimension information", + ) + parser.add_argument( + "--videos_path", + type=str, + required=True, + help="folder that contains the sampled videos", + ) + parser.add_argument( + "--dimension", + type=str, + required=True, + help="evaluation dimensions", + ) + parser.add_argument( + "--load_ckpt_from_local", + action="store_true", + required=False, + help="whether to load checkpoints from local default paths (assuming you have downloaded the checkpoints locally)", + ) + parser.add_argument( + "--read_frame", + action="store_true", + required=False, + help="whether to read frames directly instead of reading videos", + ) + args = parser.parse_args() + return args + + +def main(): + args = parse_args() + print(f'args: {args}') + + device = torch.device("cuda") + my_VBench = VBench(device, args.full_json_dir, args.output_path) + + print('start evaluation') + my_VBench.evaluate( + videos_path = args.videos_path, + name = args.dimension, + dimension_list = [args.dimension], + local=args.load_ckpt_from_local, + read_frame=args.read_frame, + ) + print('done') + +if __name__ == "__main__": + main() diff --git a/vbench/color.py b/vbench/color.py index 78bfeb2d..23c22238 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -4,8 +4,8 @@ import torch import numpy as np from tqdm import tqdm -from .utils
import load_video, load_dimension_info, read_frames_decord_by_fps -from .third_party.grit_model import DenseCaptioning +from vbench.utils import load_video, load_dimension_info, read_frames_decord_by_fps +from vbench.third_party.grit_model import DenseCaptioning import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') @@ -73,4 +73,4 @@ def compute_color(json_dir, device, submodules_dict): logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='color', lang='en') all_results, video_results = color(dense_caption_model, prompt_dict_ls, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/dynamic_degree.py b/vbench/dynamic_degree.py index d6ce3ae8..153529a8 100644 --- a/vbench/dynamic_degree.py +++ b/vbench/dynamic_degree.py @@ -7,13 +7,10 @@ from tqdm import tqdm from easydict import EasyDict as edict -from .utils import load_dimension_info - -from .third_party.RAFT.core.raft import RAFT -from .third_party.RAFT.core.utils_core.utils import InputPadder - - +from vbench.utils import load_dimension_info +from vbench.third_party.RAFT.core.raft import RAFT +from vbench.third_party.RAFT.core.utils_core.utils import InputPadder class DynamicDegree: def __init__(self, args, device): @@ -140,4 +137,4 @@ def compute_dynamic_degree(json_dir, device, submodules_list): dynamic = DynamicDegree(args_new, device) video_list, _ = load_dimension_info(json_dir, dimension='dynamic_degree', lang='en') all_results, video_results = dynamic_degree(dynamic, video_list) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/human_action.py b/vbench/human_action.py index 3b4d7867..8d3cea11 100755 --- a/vbench/human_action.py +++ b/vbench/human_action.py @@ -6,16 +6,16 @@ import torch import torch.nn as nn import torch.nn.functional as F -from .utils import load_video, load_dimension_info -from .third_party.umt.datasets.video_transforms import ( +from vbench.utils import load_video, load_dimension_info +from vbench.third_party.umt.datasets.video_transforms import ( Compose, Resize, CenterCrop, Normalize, create_random_augment, random_short_side_scale_jitter, random_crop, random_resized_crop_with_shift, random_resized_crop, horizontal_flip, random_short_side_scale_jitter, uniform_crop, ) -from .third_party.umt.datasets.volume_transforms import ClipToTensor +from vbench.third_party.umt.datasets.volume_transforms import ClipToTensor from timm.models import create_model -from .third_party.umt.models import vit_large_patch16_224 +# from vbench.third_party.umt.models import vit_large_patch16_224 from tqdm import tqdm def build_dict(): diff --git a/vbench/imaging_quality.py b/vbench/imaging_quality.py index 5af03691..86b72b78 100755 --- a/vbench/imaging_quality.py +++ b/vbench/imaging_quality.py @@ -1,7 +1,7 @@ import torch from tqdm import tqdm from pyiqa.archs.musiq_arch import MUSIQ -from .utils import load_video, load_dimension_info +from vbench.utils import load_video, load_dimension_info def transform(images): return images / 255. 
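Taken together, the packaging files (`requirements.txt`, `setup.py`) and the new CLI above mean the benchmark can now be installed with `pip install .` and driven either through the `evaluate` console script or from Python. Below is a minimal sketch of the programmatic path, mirroring the code in `vbench/cli/evaluate.py`; the video folder path and the dimension name are illustrative placeholders, and a CUDA device plus downloaded checkpoints are assumed:

```
import torch
from vbench import VBench

# Assumes `pip install .` has been run so that `vbench` is importable,
# and that the required pre-trained checkpoints are already downloaded.
device = torch.device("cuda")

# Constructor arguments mirror the CLI defaults in vbench/cli/evaluate.py:
# the full-info JSON (prompt/dimension metadata) and an output folder.
my_VBench = VBench(device, "./VBench_full_info.json", "./evaluation_results/")

# Evaluate one dimension over a folder of sampled videos; any supported
# dimension name (e.g. 'subject_consistency') can be substituted.
my_VBench.evaluate(
    videos_path="./sampled_videos",  # illustrative path, not part of the patch
    name="subject_consistency",
    dimension_list=["subject_consistency"],
    local=False,       # the values the CLI forwards via --load_ckpt_from_local
    read_frame=False,  # and --read_frame
)
```

With the package installed, the `console_scripts` entry point exposes the equivalent run as `evaluate --videos_path ./sampled_videos --dimension subject_consistency`.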
diff --git a/vbench/motion_smoothness.py b/vbench/motion_smoothness.py index 364fa071..93ae4443 100644 --- a/vbench/motion_smoothness.py +++ b/vbench/motion_smoothness.py @@ -6,14 +6,14 @@ from tqdm import tqdm from omegaconf import OmegaConf -from .utils import load_dimension_info +from vbench.utils import load_dimension_info -from .third_party.amt.utils.utils import ( +from vbench.third_party.amt.utils.utils import ( img2tensor, tensor2img, check_dim_and_resize ) -from .third_party.amt.utils.build_utils import build_from_cfg -from .third_party.amt.utils.utils import InputPadder +from vbench.third_party.amt.utils.build_utils import build_from_cfg +from vbench.third_party.amt.utils.utils import InputPadder class FrameProcess: diff --git a/vbench/multiple_objects.py b/vbench/multiple_objects.py index 6385969e..1e43c566 100755 --- a/vbench/multiple_objects.py +++ b/vbench/multiple_objects.py @@ -4,8 +4,8 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info -from .third_party.grit_model import DenseCaptioning +from vbench.utils import load_video, load_dimension_info +from vbench.third_party.grit_model import DenseCaptioning import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') @@ -59,4 +59,4 @@ def compute_multiple_objects(json_dir, device, submodules_dict): logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='multiple_objects', lang='en') all_results, video_results = multiple_objects(dense_caption_model, prompt_dict_ls, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/object_class.py b/vbench/object_class.py index a3ed0614..10cb27a1 100755 --- a/vbench/object_class.py +++ b/vbench/object_class.py @@ -4,8 +4,8 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info -from .third_party.grit_model import DenseCaptioning +from vbench.utils import load_video, load_dimension_info +from vbench.third_party.grit_model import DenseCaptioning import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') @@ -55,4 +55,4 @@ def compute_object_class(json_dir, device, submodules_dict): logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='object_class', lang='en') all_results, video_results = object_class(dense_caption_model, prompt_dict_ls, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index b229bf6b..1b4ae6fe 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -6,9 +6,9 @@ import torch import clip from tqdm import tqdm -from .third_party.ViCLIP.viclip import ViCLIP -from .third_party.ViCLIP.simple_tokenizer import SimpleTokenizer -from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps +from vbench.third_party.ViCLIP.viclip import ViCLIP +from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -59,4 +59,4 @@ def compute_overall_consistency(json_dir, device, 
submodules_list): viclip = ViCLIP(tokenizer= tokenizer, **submodules_list).to(device) _, video_dict = load_dimension_info(json_dir, dimension='overall_consistency', lang='en') all_results, video_results = overall_consistency(viclip, video_dict, tokenizer, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/scene.py b/vbench/scene.py index 8f276703..e8384468 100755 --- a/vbench/scene.py +++ b/vbench/scene.py @@ -4,8 +4,8 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info, tag2text_transform -from .third_party.tag2Text.tag2text import tag2text_caption +from vbench.utils import load_video, load_dimension_info, tag2text_transform +from vbench.third_party.tag2Text.tag2text import tag2text_caption import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') @@ -55,4 +55,4 @@ def compute_scene(json_dir, device, submodules_dict): logger.info("Initialize caption model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='scene', lang='en') all_results, video_results = scene(model, prompt_dict_ls, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/spatial_relationship.py b/vbench/spatial_relationship.py index 898b59a1..6750595b 100755 --- a/vbench/spatial_relationship.py +++ b/vbench/spatial_relationship.py @@ -4,8 +4,8 @@ import torch import numpy as np from tqdm import tqdm -from .utils import load_video, load_dimension_info -from .third_party.grit_model import DenseCaptioning +from vbench.utils import load_video, load_dimension_info +from vbench.third_party.grit_model import DenseCaptioning import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') @@ -127,4 +127,4 @@ def compute_spatial_relationship(json_dir, device, submodules_dict): logger.info("Initialize detection model success") _, prompt_dict_ls = load_dimension_info(json_dir, dimension='spatial_relationship', lang='en') all_results, video_results = spatial_relationship(dense_caption_model, prompt_dict_ls, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/subject_consistency.py b/vbench/subject_consistency.py index 8ab03c5a..d63bb2b3 100755 --- a/vbench/subject_consistency.py +++ b/vbench/subject_consistency.py @@ -11,7 +11,7 @@ import torch.nn.functional as F import torchvision.transforms as transforms -from .utils import load_video, load_dimension_info, dino_transform, dino_transform_Image +from vbench.utils import load_video, load_dimension_info, dino_transform, dino_transform_Image import logging logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') logger = logging.getLogger(__name__) diff --git a/vbench/temporal_flickering.py b/vbench/temporal_flickering.py index c704366a..556542d6 100644 --- a/vbench/temporal_flickering.py +++ b/vbench/temporal_flickering.py @@ -1,7 +1,7 @@ import numpy as np from tqdm import tqdm import cv2 -from .utils import load_dimension_info +from vbench.utils import load_dimension_info def get_frames(video_path): diff --git a/vbench/temporal_style.py b/vbench/temporal_style.py index c27f89b3..4fae32f9 100755 --- a/vbench/temporal_style.py +++ b/vbench/temporal_style.py @@ -6,9 +6,9 @@ import torch import clip from tqdm import tqdm -from 
.third_party.ViCLIP.viclip import ViCLIP -from .third_party.ViCLIP.simple_tokenizer import SimpleTokenizer -from .utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps +from vbench.third_party.ViCLIP.viclip import ViCLIP +from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: @@ -60,4 +60,4 @@ def compute_temporal_style(json_dir, device, submodules_list): viclip = ViCLIP(tokenizer= tokenizer, **submodules_list).to(device) _, video_dict = load_dimension_info(json_dir, dimension='temporal_style', lang='en') all_results, video_results = temporal_style(viclip, video_dict, tokenizer, device) - return all_results, video_results \ No newline at end of file + return all_results, video_results diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/__init__.py b/vbench/third_party/RAFT/__init__.py old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/__init__.py rename to vbench/third_party/RAFT/__init__.py diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/__init__.py b/vbench/third_party/__init__.py old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/__init__.py rename to vbench/third_party/__init__.py diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/__init__.py b/vbench/third_party/amt/__init__.py old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/__init__.py rename to vbench/third_party/amt/__init__.py diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/__init__.py b/vbench/third_party/grit_src/__init__.py old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/__init__.py rename to vbench/third_party/grit_src/__init__.py diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/config.py b/vbench/third_party/grit_src/centernet_config.py old mode 100755 new mode 100644 similarity index 100% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/config.py rename to vbench/third_party/grit_src/centernet_config.py diff --git a/vbench/third_party/grit_src/grit/config.py b/vbench/third_party/grit_src/grit/config.py index fabe7f0f..3cb449d7 100755 --- a/vbench/third_party/grit_src/grit/config.py +++ b/vbench/third_party/grit_src/grit/config.py @@ -47,4 +47,4 @@ def add_grit_config(cfg): _C.INPUT.TEST_INPUT_TYPE = 'default' _C.FIND_UNUSED_PARAM = True - _C.USE_ACT_CHECKPOINT = True \ No newline at end of file + _C.USE_ACT_CHECKPOINT = True diff --git a/vbench/third_party/grit_src/grit/data/__init__.py b/vbench/third_party/grit_src/grit/data/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/data/datasets/__init__.py b/vbench/third_party/grit_src/grit/data/datasets/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/modeling/__init__.py 
b/vbench/third_party/grit_src/grit/modeling/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/__init__.py b/vbench/third_party/grit_src/grit/modeling/backbone/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py b/vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py old mode 100755 new mode 100644 similarity index 99% rename from vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py rename to vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py index e991f9c7..cc4e7a49 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py +++ b/vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py @@ -75,4 +75,4 @@ def build_p35_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): top_block=None, fuse_type=cfg.MODEL.FPN.FUSE_TYPE, ) - return backbone \ No newline at end of file + return backbone diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py index 36d1207f..79e032bc 100755 --- a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py +++ b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py @@ -9,8 +9,9 @@ from detectron2.layers import CNNBlockBase, Conv2d, get_norm from detectron2.modeling.backbone.build import BACKBONE_REGISTRY from detectron2.layers import ShapeSpec -from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 +# from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 +from .centernet_last_layer import LastLevelP6P7_P5 import torch.utils.checkpoint as checkpoint from timm.models.layers import DropPath, Mlp, trunc_normal_ diff --git a/vbench/third_party/grit_src/grit/modeling/meta_arch/__init__.py b/vbench/third_party/grit_src/grit/modeling/meta_arch/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/modeling/meta_arch/grit.py b/vbench/third_party/grit_src/grit/modeling/meta_arch/grit.py index 47040d4c..126e0ca1 100755 --- a/vbench/third_party/grit_src/grit/modeling/meta_arch/grit.py +++ b/vbench/third_party/grit_src/grit/modeling/meta_arch/grit.py @@ -68,4 +68,4 @@ def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]): losses.update(roihead_textdecoder_losses) losses.update(proposal_losses) - return losses \ No newline at end of file + return losses diff --git a/vbench/third_party/grit_src/grit/modeling/roi_heads/__init__.py b/vbench/third_party/grit_src/grit/modeling/roi_heads/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py b/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py index 8de7e59f..0dbce5ae 100755 --- a/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py +++ b/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py @@ -16,7 +16,8 @@ from ..text.text_decoder import TransformerDecoderTextualHead, GRiTTextDecoder, AutoRegressiveBeamSearch from ..text.load_text_token import LoadTextTokens from transformers import BertTokenizer -from grit_src.grit.data.custom_dataset_mapper import ObjDescription + +# from grit.data.custom_dataset_mapper import ObjDescription from ..soft_nms 
import batched_soft_nms import logging diff --git a/vbench/third_party/grit_src/grit/modeling/text/__init__.py b/vbench/third_party/grit_src/grit/modeling/text/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/image_dense_captions.py b/vbench/third_party/grit_src/image_dense_captions.py index 3a513cfa..734b68eb 100755 --- a/vbench/third_party/grit_src/image_dense_captions.py +++ b/vbench/third_party/grit_src/image_dense_captions.py @@ -1,30 +1,42 @@ -import argparse -import multiprocessing as mp import os -import time -import cv2 -import tqdm -import sys - +import torch +from itertools import compress from detectron2.config import get_cfg from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger # constants WINDOW_NAME = "GRiT" CUR_DIR = os.path.dirname(os.path.abspath(__file__)) -sys.path.insert(0, f"{CUR_DIR}/../") -sys.path.insert(0, os.path.join(CUR_DIR,'third_party/CenterNet2/projects/CenterNet2/')) -from centernet.config import add_centernet_config -from grit_src.grit.config import add_grit_config - -from grit_src.grit.predictor import VisualizationDemo -import json - - - - +# sys.path.insert(0, f"{CUR_DIR}/../") +# print(CUR_DIR) +# sys.path.insert(0, os.path.join(CUR_DIR,'third_party/CenterNet2/projects/CenterNet2/')) +# from centernet.config import add_centernet_config +from .centernet_config import add_centernet_config +from .grit.config import add_grit_config +from .grit.predictor import VisualizationDemo + +class ObjDescription: + def __init__(self, object_descriptions): + self.data = object_descriptions + + def __getitem__(self, item): + assert type(item) == torch.Tensor + assert item.dim() == 1 + if len(item) > 0: + assert item.dtype == torch.int64 or item.dtype == torch.bool + if item.dtype == torch.int64: + return ObjDescription([self.data[x.item()] for x in item]) + elif item.dtype == torch.bool: + return ObjDescription(list(compress(self.data, item))) + + return ObjDescription(list(compress(self.data, item))) + + def __len__(self): + return len(self.data) + + def __repr__(self): + return "ObjDescription({})".format(self.data) def dense_pred_to_caption(predictions): boxes = predictions["instances"].pred_boxes if predictions["instances"].has("pred_boxes") else None diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/contributing.md b/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/contributing.md deleted file mode 120000 index 95181235..00000000 --- a/vbench/third_party/grit_src/third_party/CenterNet2/docs/notes/contributing.md +++ /dev/null @@ -1 +0,0 @@ -../../.github/CONTRIBUTING.md \ No newline at end of file diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md b/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md deleted file mode 120000 index 0ba82423..00000000 --- a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/builtin_datasets.md +++ /dev/null @@ -1 +0,0 @@ -../../datasets/README.md \ No newline at end of file diff --git a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md b/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md deleted file mode 120000 index e90bde77..00000000 --- a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/getting_started.md +++ /dev/null @@ -1 +0,0 @@ -../../GETTING_STARTED.md \ No newline at end of file diff --git 
a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/install.md b/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/install.md deleted file mode 120000 index 5f52b2be..00000000 --- a/vbench/third_party/grit_src/third_party/CenterNet2/docs/tutorials/install.md +++ /dev/null @@ -1 +0,0 @@ -../../INSTALL.md \ No newline at end of file diff --git a/vbench/third_party/tag2Text/__init__.py b/vbench/third_party/tag2Text/__init__.py new file mode 100644 index 00000000..4ef99cf1 --- /dev/null +++ b/vbench/third_party/tag2Text/__init__.py @@ -0,0 +1,2 @@ +import sys +sys.path.append('third_party/grit_src') diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.circleci/config.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.circleci/config.yml deleted file mode 100755 index 097afade..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.circleci/config.yml +++ /dev/null @@ -1,256 +0,0 @@ -version: 2.1 - -# ------------------------------------------------------------------------------------- -# Environments to run the jobs in -# ------------------------------------------------------------------------------------- -cpu: &cpu - machine: - image: ubuntu-2004:202107-02 - resource_class: medium - -gpu: &gpu - machine: - # NOTE: use a cuda vesion that's supported by all our pytorch versions - image: ubuntu-1604-cuda-11.1:202012-01 - resource_class: gpu.nvidia.small - -windows-cpu: &windows_cpu - machine: - resource_class: windows.medium - image: windows-server-2019-vs2019:stable - shell: powershell.exe - -# windows-gpu: &windows_gpu -# machine: -# resource_class: windows.gpu.nvidia.medium -# image: windows-server-2019-nvidia:stable - -version_parameters: &version_parameters - parameters: - pytorch_version: - type: string - torchvision_version: - type: string - pytorch_index: - type: string - # use test wheels index to have access to RC wheels - # https://download.pytorch.org/whl/test/torch_test.html - default: "https://download.pytorch.org/whl/torch_stable.html" - python_version: # NOTE: only affect linux - type: string - default: '3.6.8' - - environment: - PYTORCH_VERSION: << parameters.pytorch_version >> - TORCHVISION_VERSION: << parameters.torchvision_version >> - PYTORCH_INDEX: << parameters.pytorch_index >> - PYTHON_VERSION: << parameters.python_version>> - # point datasets to ~/.torch so it's cached in CI - DETECTRON2_DATASETS: ~/.torch/datasets - -# ------------------------------------------------------------------------------------- -# Re-usable commands -# ------------------------------------------------------------------------------------- -# install_nvidia_driver: &install_nvidia_driver -# - run: -# name: Install nvidia driver -# working_directory: ~/ -# command: | -# wget -q 'https://s3.amazonaws.com/ossci-linux/nvidia_driver/NVIDIA-Linux-x86_64-430.40.run' -# sudo /bin/bash ./NVIDIA-Linux-x86_64-430.40.run -s --no-drm -# nvidia-smi - -add_ssh_keys: &add_ssh_keys - # https://circleci.com/docs/2.0/add-ssh-key/ - - add_ssh_keys: - fingerprints: - - "e4:13:f2:22:d4:49:e8:e4:57:5a:ac:20:2f:3f:1f:ca" - -install_python: &install_python - - run: - name: Install Python - working_directory: ~/ - command: | - # upgrade pyenv - cd /opt/circleci/.pyenv/plugins/python-build/../.. 
&& git pull && cd - - pyenv install -s $PYTHON_VERSION - pyenv global $PYTHON_VERSION - python --version - which python - pip install --upgrade pip - -setup_venv: &setup_venv - - run: - name: Setup Virtual Env - working_directory: ~/ - command: | - python -m venv ~/venv - echo ". ~/venv/bin/activate" >> $BASH_ENV - . ~/venv/bin/activate - python --version - which python - which pip - pip install --upgrade pip - -setup_venv_win: &setup_venv_win - - run: - name: Setup Virutal Env for Windows - command: | - pip install virtualenv - python -m virtualenv env - .\env\Scripts\activate - python --version - which python - which pip - -install_linux_dep: &install_linux_dep - - run: - name: Install Dependencies - command: | - # disable crash coredump, so unittests fail fast - sudo systemctl stop apport.service - # install from github to get latest; install iopath first since fvcore depends on it - pip install --progress-bar off -U 'git+https://github.com/facebookresearch/iopath' - pip install --progress-bar off -U 'git+https://github.com/facebookresearch/fvcore' - # Don't use pytest-xdist: cuda tests are unstable under multi-process workers. - pip install --progress-bar off ninja opencv-python-headless pytest tensorboard pycocotools - pip install --progress-bar off torch==$PYTORCH_VERSION -f $PYTORCH_INDEX - if [[ "$TORCHVISION_VERSION" == "master" ]]; then - pip install git+https://github.com/pytorch/vision.git - else - pip install --progress-bar off torchvision==$TORCHVISION_VERSION -f $PYTORCH_INDEX - fi - - python -c 'import torch; print("CUDA:", torch.cuda.is_available())' - gcc --version - -install_detectron2: &install_detectron2 - - run: - name: Install Detectron2 - command: | - # Remove first, in case it's in the CI cache - pip uninstall -y detectron2 - pip install --progress-bar off -e .[all] - python -m detectron2.utils.collect_env - ./datasets/prepare_for_tests.sh - -run_unittests: &run_unittests - - run: - name: Run Unit Tests - command: | - pytest -v --durations=15 tests # parallel causes some random failures - -# ------------------------------------------------------------------------------------- -# Jobs to run -# ------------------------------------------------------------------------------------- -jobs: - linux_cpu_tests: - <<: *cpu - <<: *version_parameters - - working_directory: ~/detectron2 - - steps: - - checkout - - # Cache the venv directory that contains python, dependencies, and checkpoints - # Refresh the key when dependencies should be updated (e.g. 
when pytorch releases) - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - <<: *install_python - - <<: *install_linux_dep - - <<: *install_detectron2 - - <<: *run_unittests - - - save_cache: - paths: - - /opt/circleci/.pyenv - - ~/.torch - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - linux_gpu_tests: - <<: *gpu - <<: *version_parameters - - working_directory: ~/detectron2 - - steps: - - checkout - - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - <<: *install_python - - <<: *install_linux_dep - - <<: *install_detectron2 - - <<: *run_unittests - - - save_cache: - paths: - - /opt/circleci/.pyenv - - ~/.torch - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - windows_cpu_build: - <<: *windows_cpu - <<: *version_parameters - steps: - - <<: *add_ssh_keys - - checkout - - <<: *setup_venv_win - - # Cache the env directory that contains dependencies - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210404 - - - run: - name: Install Dependencies - command: | - pip install certifi --ignore-installed # required on windows to workaround some cert issue - pip install numpy cython # required on windows before pycocotools - pip install opencv-python-headless pytest-xdist pycocotools tensorboard - pip install -U git+https://github.com/facebookresearch/iopath - pip install -U git+https://github.com/facebookresearch/fvcore - pip install torch==$env:PYTORCH_VERSION torchvision==$env:TORCHVISION_VERSION -f $env:PYTORCH_INDEX - - - save_cache: - paths: - - env - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210404 - - - <<: *install_detectron2 - # TODO: unittest fails for now - -workflows: - version: 2 - regular_test: - jobs: - - linux_cpu_tests: - name: linux_cpu_tests_pytorch1.10 - pytorch_version: '1.10.0+cpu' - torchvision_version: '0.11.1+cpu' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.8 - pytorch_version: '1.8.1+cu111' - torchvision_version: '0.9.1+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.9 - pytorch_version: '1.9+cu111' - torchvision_version: '0.10+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.10 - pytorch_version: '1.10+cu111' - torchvision_version: '0.11.1+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.10_python39 - pytorch_version: '1.10+cu111' - torchvision_version: '0.11.1+cu111' - python_version: '3.9.6' - - windows_cpu_build: - pytorch_version: '1.10+cpu' - torchvision_version: '0.11.1+cpu' diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.clang-format b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.clang-format deleted file mode 100755 index 39b1b3d6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.clang-format +++ /dev/null @@ -1,85 +0,0 @@ -AccessModifierOffset: -1 -AlignAfterOpenBracket: AlwaysBreak -AlignConsecutiveAssignments: false -AlignConsecutiveDeclarations: false -AlignEscapedNewlinesLeft: true -AlignOperands: false -AlignTrailingComments: false -AllowAllParametersOfDeclarationOnNextLine: false -AllowShortBlocksOnASingleLine: false -AllowShortCaseLabelsOnASingleLine: false -AllowShortFunctionsOnASingleLine: Empty -AllowShortIfStatementsOnASingleLine: false -AllowShortLoopsOnASingleLine: false -AlwaysBreakAfterReturnType: None -AlwaysBreakBeforeMultilineStrings: true -AlwaysBreakTemplateDeclarations: 
true -BinPackArguments: false -BinPackParameters: false -BraceWrapping: - AfterClass: false - AfterControlStatement: false - AfterEnum: false - AfterFunction: false - AfterNamespace: false - AfterObjCDeclaration: false - AfterStruct: false - AfterUnion: false - BeforeCatch: false - BeforeElse: false - IndentBraces: false -BreakBeforeBinaryOperators: None -BreakBeforeBraces: Attach -BreakBeforeTernaryOperators: true -BreakConstructorInitializersBeforeComma: false -BreakAfterJavaFieldAnnotations: false -BreakStringLiterals: false -ColumnLimit: 80 -CommentPragmas: '^ IWYU pragma:' -ConstructorInitializerAllOnOneLineOrOnePerLine: true -ConstructorInitializerIndentWidth: 4 -ContinuationIndentWidth: 4 -Cpp11BracedListStyle: true -DerivePointerAlignment: false -DisableFormat: false -ForEachMacros: [ FOR_EACH, FOR_EACH_R, FOR_EACH_RANGE, ] -IncludeCategories: - - Regex: '^<.*\.h(pp)?>' - Priority: 1 - - Regex: '^<.*' - Priority: 2 - - Regex: '.*' - Priority: 3 -IndentCaseLabels: true -IndentWidth: 2 -IndentWrappedFunctionNames: false -KeepEmptyLinesAtTheStartOfBlocks: false -MacroBlockBegin: '' -MacroBlockEnd: '' -MaxEmptyLinesToKeep: 1 -NamespaceIndentation: None -ObjCBlockIndentWidth: 2 -ObjCSpaceAfterProperty: false -ObjCSpaceBeforeProtocolList: false -PenaltyBreakBeforeFirstCallParameter: 1 -PenaltyBreakComment: 300 -PenaltyBreakFirstLessLess: 120 -PenaltyBreakString: 1000 -PenaltyExcessCharacter: 1000000 -PenaltyReturnTypeOnItsOwnLine: 200 -PointerAlignment: Left -ReflowComments: true -SortIncludes: true -SpaceAfterCStyleCast: false -SpaceBeforeAssignmentOperators: true -SpaceBeforeParens: ControlStatements -SpaceInEmptyParentheses: false -SpacesBeforeTrailingComments: 1 -SpacesInAngles: false -SpacesInContainerLiterals: true -SpacesInCStyleCastParentheses: false -SpacesInParentheses: false -SpacesInSquareBrackets: false -Standard: Cpp11 -TabWidth: 8 -UseTab: Never diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.flake8 b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.flake8 deleted file mode 100755 index ae8edda5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.flake8 +++ /dev/null @@ -1,15 +0,0 @@ -# This is an example .flake8 config, used when developing *Black* itself. -# Keep in sync with setup.cfg which is used for source packages. - -[flake8] -ignore = W503, E203, E221, C901, C408, E741, C407, B017 -max-line-length = 100 -max-complexity = 18 -select = B,C,E,F,W,T4,B9 -exclude = build -per-file-ignores = - **/__init__.py:F401,F403,E402 - **/configs/**.py:F401,E402 - configs/**.py:F401,E402 - **/tests/config/**.py:F401,E402 - tests/config/**.py:F401,E402 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md deleted file mode 100755 index 0f7ad8bf..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md +++ /dev/null @@ -1,5 +0,0 @@ -# Code of Conduct - -Facebook has adopted a Code of Conduct that we expect project participants to adhere to. -Please read the [full text](https://code.fb.com/codeofconduct/) -so that you can understand what actions will and will not be tolerated. 
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CONTRIBUTING.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CONTRIBUTING.md deleted file mode 100755 index 9bab709c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/CONTRIBUTING.md +++ /dev/null @@ -1,68 +0,0 @@ -# Contributing to detectron2 - -## Issues -We use GitHub issues to track public bugs and questions. -Please make sure to follow one of the -[issue templates](https://github.com/facebookresearch/detectron2/issues/new/choose) -when reporting any issues. - -Facebook has a [bounty program](https://www.facebook.com/whitehat/) for the safe -disclosure of security bugs. In those cases, please go through the process -outlined on that page and do not file a public issue. - -## Pull Requests -We actively welcome pull requests. - -However, if you're adding any significant features (e.g. > 50 lines), please -make sure to discuss with maintainers about your motivation and proposals in an issue -before sending a PR. This is to save your time so you don't spend time on a PR that we'll not accept. - -We do not always accept new features, and we take the following -factors into consideration: - -1. Whether the same feature can be achieved without modifying detectron2. - Detectron2 is designed so that you can implement many extensions from the outside, e.g. - those in [projects](https://github.com/facebookresearch/detectron2/tree/master/projects). - * If some part of detectron2 is not extensible enough, you can also bring up a more general issue to - improve it. Such feature request may be useful to more users. -2. Whether the feature is potentially useful to a large audience (e.g. an impactful detection paper, a popular dataset, - a significant speedup, a widely useful utility), - or only to a small portion of users (e.g., a less-known paper, an improvement not in the object - detection field, a trick that's not very popular in the community, code to handle a non-standard type of data) - * Adoption of additional models, datasets, new task are by default not added to detectron2 before they - receive significant popularity in the community. - We sometimes accept such features in `projects/`, or as a link in `projects/README.md`. -3. Whether the proposed solution has a good design / interface. This can be discussed in the issue prior to PRs, or - in the form of a draft PR. -4. Whether the proposed solution adds extra mental/practical overhead to users who don't - need such feature. -5. Whether the proposed solution breaks existing APIs. - -To add a feature to an existing function/class `Func`, there are always two approaches: -(1) add new arguments to `Func`; (2) write a new `Func_with_new_feature`. -To meet the above criteria, we often prefer approach (2), because: - -1. It does not involve modifying or potentially breaking existing code. -2. It does not add overhead to users who do not need the new feature. -3. Adding new arguments to a function/class is not scalable w.r.t. all the possible new research ideas in the future. - -When sending a PR, please do: - -1. If a PR contains multiple orthogonal changes, split it to several PRs. -2. If you've added code that should be tested, add tests. -3. For PRs that need experiments (e.g. adding a new model or new methods), - you don't need to update model zoo, but do provide experiment results in the description of the PR. -4. If APIs are changed, update the documentation. -5. 
We use the [Google style docstrings](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html) in python. -6. Make sure your code lints with `./dev/linter.sh`. - - -## Contributor License Agreement ("CLA") -In order to accept your pull request, we need you to submit a CLA. You only need -to do this once to work on any of Facebook's open source projects. - -Complete your CLA here: - -## License -By contributing to detectron2, you agree that your contributions will be licensed -under the LICENSE file in the root directory of this source tree. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg deleted file mode 100755 index eb2d643d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg +++ /dev/null @@ -1 +0,0 @@ -Detectron2-Logo-Horz \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md deleted file mode 100755 index 5e8aaa2d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md +++ /dev/null @@ -1,5 +0,0 @@ - -Please select an issue template from -https://github.com/facebookresearch/detectron2/issues/new/choose . - -Otherwise your issue will be closed. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md deleted file mode 100755 index d0235c70..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -name: "🐛 Bugs" -about: Report bugs in detectron2 -title: Please read & provide the following - ---- - -## Instructions To Reproduce the 🐛 Bug: -1. Full runnable code or full changes you made: -``` -If making changes to the project itself, please use output of the following command: -git rev-parse HEAD; git diff - - -``` -2. What exact command you run: -3. __Full logs__ or other relevant observations: -``` - -``` -4. please simplify the steps as much as possible so they do not require additional resources to - run, such as a private dataset. - -## Expected behavior: - -If there are no obvious error in "full logs" provided above, -please tell us the expected behavior. 
- -## Environment: - -Provide your environment information using the following command: -``` -wget -nc -q https://github.com/facebookresearch/detectron2/raw/main/detectron2/utils/collect_env.py && python collect_env.py -``` - -If your issue looks like an installation issue / environment issue, -please first try to solve it yourself with the instructions in -https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml deleted file mode 100755 index c60c2e14..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml +++ /dev/null @@ -1,17 +0,0 @@ -# require an issue template to be chosen -blank_issues_enabled: false - -contact_links: - - name: How-To / All Other Questions - url: https://github.com/facebookresearch/detectron2/discussions - about: Use "github discussions" for community support on general questions that don't belong to the above issue categories - - name: Detectron2 Documentation - url: https://detectron2.readthedocs.io/index.html - about: Check if your question is answered in tutorials or API docs - -# Unexpected behaviors & bugs are split to two templates. -# When they are one template, users think "it's not a bug" and don't choose the template. -# -# But the file name is still "unexpected-problems-bugs.md" so that old references -# to this issue template still works. -# It's ok since this template should be a superset of "bugs.md" (unexpected behaviors is a superset of bugs) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md deleted file mode 100755 index 88214d62..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md +++ /dev/null @@ -1,14 +0,0 @@ ---- -name: "\U0001F4DA Documentation Issue" -about: Report a problem about existing documentation, comments, website or tutorials. -labels: documentation - ---- - -## 📚 Documentation Issue - -This issue category is for problems about existing documentation, not for asking how-to questions. - -* Provide a link to an existing documentation/comment/tutorial: - -* How should the above documentation/comment/tutorial improve: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md deleted file mode 100755 index 03a1e93d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md +++ /dev/null @@ -1,31 +0,0 @@ ---- -name: "\U0001F680Feature Request" -about: Suggest an improvement or new feature -labels: enhancement - ---- - -## 🚀 Feature -A clear and concise description of the feature proposal. - -## Motivation & Examples - -Tell us why the feature is useful. - -Describe what the feature would look like, if it is implemented. -Best demonstrated using **code examples** in addition to words. - -## Note - -We only consider adding new features if they are relevant to many users. - -If you request implementation of research papers -- we only consider papers that have enough significance and prevalance in the object detection field. 
-
-We do not take requests for most projects in the `projects/` directory, because they are research code releases, mainly intended for other researchers to reproduce results.
-
-"Make X faster/accurate" is not a valid feature request. "Implement a concrete feature that can make X faster/accurate" can be a valid feature request.
-
-Instead of adding features inside detectron2,
-you can implement many features by [extending detectron2](https://detectron2.readthedocs.io/tutorials/extend.html).
-The [projects/](https://github.com/facebookresearch/detectron2/tree/main/projects/) directory contains many such examples.
-
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md
deleted file mode 100755
index 5db8f224..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md
+++ /dev/null
@@ -1,44 +0,0 @@
----
-name: "😩 Unexpected behaviors"
-about: Report unexpected behaviors when using detectron2
-title: Please read & provide the following
-
----
-
-If you do not know the root cause of the problem, please post according to this template:
-
-## Instructions To Reproduce the Issue:
-
-Check https://stackoverflow.com/help/minimal-reproducible-example for how to ask good questions.
-Simplify the steps to reproduce the issue using suggestions from the above link, and provide them below:
-
-1. Full runnable code or full changes you made:
-```
-If making changes to the project itself, please use the output of the following command:
-git rev-parse HEAD; git diff
-
-
-```
-2. What exact command you run:
-3. __Full logs__ or other relevant observations:
-```
-
-```
-
-## Expected behavior:
-
-If there is no obvious crash in the "full logs" provided above,
-please tell us the expected behavior.
-
-If you expect a model to converge / work better, we do not help with such issues unless
-the model fails to reproduce the results in the detectron2 model zoo, or the issue proves the existence of a bug.
-
-## Environment:
-
-Paste the output of the following command:
-```
-wget -nc -nv https://github.com/facebookresearch/detectron2/raw/main/detectron2/utils/collect_env.py && python collect_env.py
-```
-
-If your issue looks like an installation issue / environment issue,
-please first check common issues in https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/pull_request_template.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/pull_request_template.md
deleted file mode 100755
index d71729ba..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/pull_request_template.md
+++ /dev/null
@@ -1,10 +0,0 @@
-Thanks for your contribution!
-
-If you're sending a large PR (e.g., >100 lines),
-please open an issue first about the feature / bug, and indicate how you want to contribute.
-
-We do not always accept features.
-See https://detectron2.readthedocs.io/notes/contributing.html#pull-requests about how we handle PRs.
-
-Before submitting a PR, please run `dev/linter.sh` to lint the code.
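The `check-template.yml` workflow deleted below decides whether an issue body is essentially the unmodified issue template: it normalizes both strings and tests whether their Levenshtein edit distance falls below a small threshold (8). A minimal Python sketch of that check; the function names here are illustrative, and the authoritative logic is the JavaScript embedded in the workflow:

```
def edit_distance(a: str, b: str) -> int:
    # Classic dynamic-programming Levenshtein distance,
    # mirroring the levenshtein.js helper shipped with the workflow.
    if not a:
        return len(b)
    if not b:
        return len(a)
    prev = list(range(len(a) + 1))
    for i, cb in enumerate(b, 1):
        cur = [i]
        for j, ca in enumerate(a, 1):
            cur.append(min(prev[j - 1] + (ca != cb),  # substitution (or match)
                           cur[j - 1] + 1,            # insertion
                           prev[j] + 1))              # deletion
        prev = cur
    return prev[len(a)]

def same_as_template(template: str, body: str) -> bool:
    # Normalize the way the workflow does: lowercase, drop everything up to
    # the second "---" (the YAML front matter), and strip newlines.
    tmpl = "".join(template.lower().split("---")[2:]).strip()
    tmpl = tmpl.replace("\r", "").replace("\n", "")
    bodyr = body.lower().strip().replace("\r", "").replace("\n", "")
    return edit_distance(tmpl, bodyr) < 8
```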
- diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/check-template.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/check-template.yml deleted file mode 100755 index 3caed9df..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/check-template.yml +++ /dev/null @@ -1,86 +0,0 @@ -name: Check issue template - -on: - issues: - types: [opened] - -jobs: - check-template: - runs-on: ubuntu-latest - # comment this out when testing with https://github.com/nektos/act - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - uses: actions/checkout@v2 - - uses: actions/github-script@v3 - with: - github-token: ${{secrets.GITHUB_TOKEN}} - script: | - // Arguments available: - // - github: A pre-authenticated octokit/rest.js client - // - context: An object containing the context of the workflow run - // - core: A reference to the @actions/core package - // - io: A reference to the @actions/io package - const fs = require('fs'); - const editDistance = require(`${process.env.GITHUB_WORKSPACE}/.github/workflows/levenshtein.js`).getEditDistance - issue = await github.issues.get({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - }); - const hasLabel = issue.data.labels.length > 0; - if (hasLabel || issue.state === "closed") { - // don't require template on them - core.debug("Issue " + issue.data.title + " was skipped."); - return; - } - - sameAsTemplate = function(filename, body) { - let tmpl = fs.readFileSync(`.github/ISSUE_TEMPLATE/${filename}`, 'utf8'); - tmpl = tmpl.toLowerCase().split("---").slice(2).join("").trim(); - tmpl = tmpl.replace(/(\r\n|\n|\r)/gm, ""); - let bodyr = body.replace(/(\r\n|\n|\r)/gm, ""); - let dist = editDistance(tmpl, bodyr); - return dist < 8; - }; - - checkFail = async function(msg) { - core.info("Processing '" + issue.data.title + "' with message: " + msg); - await github.issues.addLabels({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - labels: ["needs-more-info"], - }); - await github.issues.createComment({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - body: msg, - }); - }; - - const body = issue.data.body.toLowerCase().trim(); - - if (sameAsTemplate("bugs.md", body) || sameAsTemplate("unexpected-problems-bugs.md", body)) { - await checkFail(` - We found that not enough information is provided about this issue. - Please provide details following the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).`) - return; - } - - const hasInstructions = body.indexOf("reproduce") != -1; - const hasEnvironment = (body.indexOf("environment") != -1) || (body.indexOf("colab") != -1) || (body.indexOf("docker") != -1); - if (hasInstructions && hasEnvironment) { - core.debug("Issue " + issue.data.title + " follows template."); - return; - } - - let message = "You've chosen to report an unexpected problem or bug. 
Unless you already know the root cause of it, please include details about it by filling the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).\n"; - message += "The following information is missing: "; - if (!hasInstructions) { - message += "\"Instructions To Reproduce the Issue and __Full__ Logs\"; "; - } - if (!hasEnvironment) { - message += "\"Your Environment\"; "; - } - await checkFail(message); diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/levenshtein.js b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/levenshtein.js deleted file mode 100755 index 67a5e361..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/levenshtein.js +++ /dev/null @@ -1,44 +0,0 @@ -/* -Copyright (c) 2011 Andrei Mackenzie - -Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
-*/ - -// Compute the edit distance between the two given strings -exports.getEditDistance = function(a, b){ - if(a.length == 0) return b.length; - if(b.length == 0) return a.length; - - var matrix = []; - - // increment along the first column of each row - var i; - for(i = 0; i <= b.length; i++){ - matrix[i] = [i]; - } - - // increment each column in the first row - var j; - for(j = 0; j <= a.length; j++){ - matrix[0][j] = j; - } - - // Fill in the rest of the matrix - for(i = 1; i <= b.length; i++){ - for(j = 1; j <= a.length; j++){ - if(b.charAt(i-1) == a.charAt(j-1)){ - matrix[i][j] = matrix[i-1][j-1]; - } else { - matrix[i][j] = Math.min(matrix[i-1][j-1] + 1, // substitution - Math.min(matrix[i][j-1] + 1, // insertion - matrix[i-1][j] + 1)); // deletion - } - } - } - - return matrix[b.length][a.length]; -}; diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/needs-reply.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/needs-reply.yml deleted file mode 100755 index 4affabd3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/needs-reply.yml +++ /dev/null @@ -1,98 +0,0 @@ -name: Close/Lock issues after inactivity - -on: - schedule: - - cron: "0 0 * * *" - -jobs: - close-issues-needs-more-info: - runs-on: ubuntu-latest - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - name: Close old issues that need reply - uses: actions/github-script@v3 - with: - github-token: ${{secrets.GITHUB_TOKEN}} - # Modified from https://github.com/dwieeb/needs-reply - script: | - // Arguments available: - // - github: A pre-authenticated octokit/rest.js client - // - context: An object containing the context of the workflow run - // - core: A reference to the @actions/core package - // - io: A reference to the @actions/io package - const kLabelToCheck = "needs-more-info"; - const kInvalidLabel = "invalid/unrelated"; - const kDaysBeforeClose = 7; - const kMessage = "Requested information was not provided in 7 days, so we're closing this issue.\n\nPlease open new issue if information becomes available. Otherwise, use [github discussions](https://github.com/facebookresearch/detectron2/discussions) for free-form discussions." - - issues = await github.issues.listForRepo({ - owner: context.repo.owner, - repo: context.repo.repo, - state: 'open', - labels: kLabelToCheck, - sort: 'updated', - direction: 'asc', - per_page: 30, - page: 1, - }); - issues = issues.data; - if (issues.length === 0) { - core.info('No more issues found to process. 
Exiting.'); - return; - } - for (const issue of issues) { - if (!!issue.pull_request) - continue; - core.info(`Processing issue #${issue.number}`); - - let updatedAt = new Date(issue.updated_at).getTime(); - const numComments = issue.comments; - const comments = await github.issues.listComments({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - per_page: 30, - page: Math.floor((numComments - 1) / 30) + 1, // the last page - }); - const lastComments = comments.data - .map(l => new Date(l.created_at).getTime()) - .sort(); - if (lastComments.length > 0) { - updatedAt = lastComments[lastComments.length - 1]; - } - - const now = new Date().getTime(); - const daysSinceUpdated = (now - updatedAt) / 1000 / 60 / 60 / 24; - - if (daysSinceUpdated < kDaysBeforeClose) { - core.info(`Skipping #${issue.number} because it has been updated in the last ${daysSinceUpdated} days`); - continue; - } - core.info(`Closing #${issue.number} because it has not been updated in the last ${daysSinceUpdated} days`); - await github.issues.createComment({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - body: kMessage, - }); - const newLabels = numComments <= 2 ? [kInvalidLabel, kLabelToCheck] : issue.labels; - await github.issues.update({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - labels: newLabels, - state: 'closed', - }); - } - - lock-issues-after-closed: - runs-on: ubuntu-latest - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - name: Lock closed issues that have no activity for a while - uses: dessant/lock-threads@v2 - with: - github-token: ${{ github.token }} - issue-lock-inactive-days: '300' - process-only: 'issues' - issue-exclude-labels: 'enhancement,bug,documentation' diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml deleted file mode 100755 index 1f000b28..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml +++ /dev/null @@ -1,25 +0,0 @@ -name: Remove needs-more-info label - -on: - issue_comment: - types: [created] - issues: - types: [edited] - -jobs: - remove-needs-more-info-label: - runs-on: ubuntu-latest - # 1. issue_comment events could include PR comment, filter them out - # 2. Only trigger action if event was produced by the original author - if: ${{ !github.event.issue.pull_request && github.event.sender.login == github.event.issue.user.login }} - steps: - - name: Remove needs-more-info label - uses: octokit/request-action@v2.x - continue-on-error: true - with: - route: DELETE /repos/:repository/issues/:issue/labels/:label - repository: ${{ github.repository }} - issue: ${{ github.event.issue.number }} - label: needs-more-info - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/workflow.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/workflow.yml deleted file mode 100755 index 6085b32a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.github/workflows/workflow.yml +++ /dev/null @@ -1,81 +0,0 @@ -name: CI -on: [push, pull_request] - -# Run linter with github actions for quick feedbacks. -# Run macos tests with github actions. 
Linux (CPU & GPU) tests currently runs on CircleCI -jobs: - linter: - runs-on: ubuntu-latest - # run on PRs, or commits to facebookresearch (not internal) - if: ${{ github.repository_owner == 'facebookresearch' || github.event_name == 'pull_request' }} - steps: - - uses: actions/checkout@v2 - - name: Set up Python 3.6 - uses: actions/setup-python@v2 - with: - python-version: 3.6 - - name: Install dependencies - # flake8-bugbear flake8-comprehensions are useful but not available internally - run: | - python -m pip install --upgrade pip - python -m pip install flake8==3.8.1 isort==4.3.21 - python -m pip install black==21.4b2 - flake8 --version - - name: Lint - run: | - echo "Running isort" - isort -c -sp . - echo "Running black" - black -l 100 --check . - echo "Running flake8" - flake8 . - - macos_tests: - runs-on: macos-latest - # run on PRs, or commits to facebookresearch (not internal) - if: ${{ github.repository_owner == 'facebookresearch' || github.event_name == 'pull_request' }} - strategy: - fail-fast: false - matrix: - torch: ["1.8", "1.9", "1.10"] - include: - - torch: "1.8" - torchvision: 0.9 - - torch: "1.9" - torchvision: "0.10" - - torch: "1.10" - torchvision: "0.11.1" - env: - # point datasets to ~/.torch so it's cached by CI - DETECTRON2_DATASETS: ~/.torch/datasets - steps: - - name: Checkout - uses: actions/checkout@v2 - - name: Set up Python 3.6 - uses: actions/setup-python@v2 - with: - python-version: 3.6 - - name: Cache dependencies - uses: actions/cache@v2 - with: - path: | - ${{ env.pythonLocation }}/lib/python3.6/site-packages - ~/.torch - key: ${{ runner.os }}-torch${{ matrix.torch }}-${{ hashFiles('setup.py') }}-20210420 - - - name: Install dependencies - run: | - python -m pip install -U pip - python -m pip install ninja opencv-python-headless onnx pytest-xdist - python -m pip install torch==${{matrix.torch}} torchvision==${{matrix.torchvision}} -f https://download.pytorch.org/whl/torch_stable.html - # install from github to get latest; install iopath first since fvcore depends on it - python -m pip install -U 'git+https://github.com/facebookresearch/iopath' - python -m pip install -U 'git+https://github.com/facebookresearch/fvcore' - - - name: Build and install - run: | - CC=clang CXX=clang++ python -m pip install -e .[all] - python -m detectron2.utils.collect_env - ./datasets/prepare_for_tests.sh - - name: Run unittests - run: python -m pytest -n 4 --durations=15 -v tests/ diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.gitignore b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.gitignore deleted file mode 100755 index 8ca283cb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/.gitignore +++ /dev/null @@ -1,57 +0,0 @@ -slurm* -# output dir -output -instant_test_output -inference_test_output - - -*.png -*.json -*.diff -# *.jpg -!/projects/DensePose/doc/images/*.jpg - -# compilation and distribution -__pycache__ -_ext -*.pyc -*.pyd -*.so -*.dll -*.egg-info/ -build/ -dist/ -wheels/ - -# pytorch/python/numpy formats -*.pth -*.pkl -*.npy -*.ts -model_ts*.txt - -# ipython/jupyter notebooks -*.ipynb -**/.ipynb_checkpoints/ - -# Editor temporaries -*.swn -*.swo -*.swp -*~ - -# editor settings -.idea -.vscode -_darcs - -# project dirs -/detectron2/model_zoo/configs -/datasets/* -!/datasets/*.* -!/datasets/lvis/ -/datasets/lvis/* -!/datasets/lvis/lvis_v1_train_cat_info.json -/projects/*/datasets -/models -/snippet diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/GETTING_STARTED.md 
b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/GETTING_STARTED.md
deleted file mode 100755
index 404b0c8f..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/GETTING_STARTED.md
+++ /dev/null
@@ -1,79 +0,0 @@
-## Getting Started with Detectron2
-
-This document provides a brief intro to the usage of the builtin command-line tools in detectron2.
-
-For a tutorial that involves actual coding with the API,
-see our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5)
-which covers how to run inference with an
-existing model, and how to train a builtin model on a custom dataset.
-
-
-### Inference Demo with Pre-trained Models
-
-1. Pick a model and its config file from
-   the [model zoo](MODEL_ZOO.md),
-   for example, `mask_rcnn_R_50_FPN_3x.yaml`.
-2. We provide `demo.py`, which can run inference with builtin configs. Run it with:
-```
-cd demo/
-python demo.py --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \
-  --input input1.jpg input2.jpg \
-  [--other-options]
-  --opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl
-```
-The configs are made for training, so we need to specify `MODEL.WEIGHTS` to a model from the model zoo for evaluation.
-This command will run the inference and show visualizations in an OpenCV window.
-
-For details of the command line arguments, see `demo.py -h` or look at its source code
-to understand its behavior. Some common arguments are:
-* To run __on your webcam__, replace `--input files` with `--webcam`.
-* To run __on a video__, replace `--input files` with `--video-input video.mp4`.
-* To run __on cpu__, add `MODEL.DEVICE cpu` after `--opts`.
-* To save outputs to a directory (for images) or a file (for webcam or video), use `--output`.
-
-
-### Training & Evaluation in Command Line
-
-We provide two scripts, "tools/plain_train_net.py" and "tools/train_net.py",
-that can train all the configs provided in detectron2. You may want to
-use them as references to write your own training script.
-
-Compared to "train_net.py", "plain_train_net.py" supports fewer default
-features. It also includes fewer abstractions, and is therefore easier to extend with custom
-logic.
-
-To train a model with "train_net.py", first
-set up the corresponding datasets following
-[datasets/README.md](./datasets/README.md),
-then run:
-```
-cd tools/
-./train_net.py --num-gpus 8 \
-  --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
-```
-
-The configs are made for 8-GPU training.
-To train on 1 GPU, you may need to [change some parameters](https://arxiv.org/abs/1706.02677) (the scaling arithmetic is sketched after this file), e.g.:
-```
-./train_net.py \
-  --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \
-  --num-gpus 1 SOLVER.IMS_PER_BATCH 2 SOLVER.BASE_LR 0.0025
-```
-
-To evaluate a model's performance, use
-```
-./train_net.py \
-  --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \
-  --eval-only MODEL.WEIGHTS /path/to/checkpoint_file
-```
-For more options, see `./train_net.py -h`.
-
-### Use Detectron2 APIs in Your Code
-
-See our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5)
-to learn how to use detectron2 APIs to:
-1. run inference with an existing model
-2. train a builtin model on a custom dataset
-
-See [detectron2/projects](https://github.com/facebookresearch/detectron2/tree/main/projects)
-for more ways to build your project on detectron2.
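The 1-GPU example above follows the linear scaling rule from the linked paper: the learning rate is scaled in proportion to the total batch size. A minimal sketch of the arithmetic, assuming the 8-GPU defaults of `SOLVER.IMS_PER_BATCH = 16` and `SOLVER.BASE_LR = 0.02` (the 0.02 value is an assumption; it is what reproduces the numbers in the commands above):

```
# Linear scaling rule (https://arxiv.org/abs/1706.02677):
# the learning rate scales with the total number of images per batch.
def scaled_lr(base_lr: float, base_batch: int, new_batch: int) -> float:
    return base_lr * new_batch / base_batch

# 8-GPU default: 16 images/batch at lr 0.02 (assumed).
# Dropping to 2 images/batch on 1 GPU reproduces the 0.0025 above.
print(scaled_lr(0.02, 16, 2))  # -> 0.0025
```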
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/INSTALL.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/INSTALL.md
deleted file mode 100755
index b4076891..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/INSTALL.md
+++ /dev/null
@@ -1,261 +0,0 @@
-## Installation
-
-### Requirements
-- Linux or macOS with Python ≥ 3.6
-- PyTorch ≥ 1.8 and [torchvision](https://github.com/pytorch/vision/) that matches the PyTorch installation.
-  Install them together at [pytorch.org](https://pytorch.org) to make sure of this
-- OpenCV is optional but needed by the demo and visualization
-
-
-### Build Detectron2 from Source
-
-gcc & g++ ≥ 5.4 are required. [ninja](https://ninja-build.org/) is optional but recommended for faster builds.
-Once they are installed, run:
-```
-python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
-# (add --user if you don't have permission)
-
-# Or, to install it from a local clone:
-git clone https://github.com/facebookresearch/detectron2.git
-python -m pip install -e detectron2
-
-# On macOS, you may need to prepend the above commands with a few environment variables:
-CC=clang CXX=clang++ ARCHFLAGS="-arch x86_64" python -m pip install ...
-```
-
-To __rebuild__ detectron2 that's built from a local clone, use `rm -rf build/ **/*.so` to clean the
-old build first. You often need to rebuild detectron2 after reinstalling PyTorch.
-
-### Install Pre-Built Detectron2 (Linux only)
-
-Choose from this table to install [v0.6 (Oct 2021)](https://github.com/facebookresearch/detectron2/releases):
-
-The install command is the same for every combination below; only the wheel index URL
-changes with your CUDA and torch versions:
-
-```
-python -m pip install detectron2 -f <wheel index URL from the table>
-```
-
-| CUDA | torch 1.10 | torch 1.9 | torch 1.8 |
-|------|------------|-----------|-----------|
-| 11.3 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu113/torch1.10/index.html | | |
-| 11.1 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.8/index.html |
-| 10.2 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.8/index.html |
-| 10.1 | | | https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.8/index.html |
-| cpu | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.8/index.html |
-
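Since the wheel index URL is fully determined by the running torch build, picking the right row can be automated. A small helper sketch (not part of detectron2; the URL pattern is the one used in the table above):

```
import torch

def detectron2_wheel_index() -> str:
    # e.g. torch "1.10.1+cu113" -> "1.10"; CPU builds report torch.version.cuda == None.
    torch_minor = ".".join(torch.__version__.split(".")[:2])
    cuda = torch.version.cuda  # e.g. "11.3" or None
    tag = "cpu" if cuda is None else "cu" + cuda.replace(".", "")
    return (
        "https://dl.fbaipublicfiles.com/detectron2/wheels/"
        f"{tag}/torch{torch_minor}/index.html"
    )

# Usage: python -m pip install detectron2 -f "<printed URL>"
print(detectron2_wheel_index())
```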
-
-Note that:
-1. The pre-built packages have to be used with the corresponding version of CUDA and the official package of PyTorch.
-   Otherwise, please build detectron2 from source.
-2. New packages are released every few months. Therefore, packages may not contain the latest features from the main
-   branch and may not be compatible with the main branch of a research project that uses detectron2
-   (e.g. those in [projects](projects)).
-
-### Common Installation Issues
-
-Common issues and their solutions are listed below:
-
-**Undefined symbols that look like "TH..", "at::Tensor...", "torch...".**
-
-This usually happens when detectron2 or torchvision is not
-compiled with the version of PyTorch you're running.
-
-If the error comes from a pre-built torchvision, uninstall torchvision and pytorch and reinstall them
-following [pytorch.org](http://pytorch.org) so that the versions match.
-
-If the error comes from a pre-built detectron2, check the [release notes](https://github.com/facebookresearch/detectron2/releases),
-then uninstall and reinstall the correct pre-built detectron2 that matches the pytorch version.
-
-If the error comes from detectron2 or torchvision that you built manually from source,
-remove the files you built (`build/`, `**/*.so`) and rebuild, so it can pick up the version of pytorch currently in your environment.
-
-If the above instructions do not resolve this problem, please provide an environment (e.g. a dockerfile) that can reproduce the issue.
-
-**Missing torch dynamic libraries, OR segmentation fault immediately when using detectron2.**
-
-This usually happens when detectron2 or torchvision is not
-compiled with the version of PyTorch you're running. See the previous common issue for the solution.
-
-**Undefined C++ symbols (e.g. "GLIBCXX..") or C++ symbols not found.**
-
-Usually it's because the library is compiled with a newer C++ compiler but run with an old C++ runtime.
-
-This often happens with an old anaconda.
-It may help to run `conda update libgcc` to upgrade its runtime.
-
-The fundamental solution is to avoid the mismatch, either by compiling with an older version of the C++
-compiler, or by running the code with the proper C++ runtime.
-To run the code with a specific C++ runtime, you can use the environment variable `LD_PRELOAD=/path/to/libstdc++.so`.
-
-**"nvcc not found" or "Not compiled with GPU support" or "Detectron2 CUDA Compiler: not available".**
-
-CUDA is not found when building detectron2.
-You should make sure
-
-```
-python -c 'import torch; from torch.utils.cpp_extension import CUDA_HOME; print(torch.cuda.is_available(), CUDA_HOME)'
-```
-
-prints `(True, a directory with cuda)` at the time you build detectron2.
-
-Most models can run inference (but not training) without GPU support. To use CPUs, set `MODEL.DEVICE='cpu'` in the config.
-
-**"invalid device function" or "no kernel image is available for execution".**
-
-Two possibilities:
-
-* You built detectron2 with one version of CUDA but run it with a different version.
-
-  To check whether this is the case,
-  use `python -m detectron2.utils.collect_env` to find out inconsistent CUDA versions.
-  In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", "PyTorch built with - CUDA"
-  to contain cuda libraries of the same version.
-
-  When they are inconsistent,
-  you need to either install a different build of PyTorch (or build it yourself)
-  to match your local CUDA installation, or install a different version of CUDA to match PyTorch.
-
-* PyTorch/torchvision/Detectron2 is not built for the correct GPU SM architecture (aka. compute capability); see the check sketched after this list.
-
-  The architecture included by PyTorch/detectron2/torchvision is available in the "architecture flags" in
-  `python -m detectron2.utils.collect_env`. It must include
-  the architecture of your GPU, which can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus).
-
-  If you're using pre-built PyTorch/detectron2/torchvision, they have included support for most popular GPUs already.
-  If not supported, you need to build them from source.
-
-  When building detectron2/torchvision from source, they detect the GPU device and build only for that device.
-  This means the compiled code may not work on a different GPU device.
-  To recompile them for the correct architecture, remove all installed/compiled files,
-  and rebuild them with the `TORCH_CUDA_ARCH_LIST` environment variable set properly.
-  For example, `export TORCH_CUDA_ARCH_LIST="6.0;7.0"` makes it compile for both P100s and V100s.
-
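The second possibility above can be checked programmatically by comparing the GPU's compute capability with the architectures compiled into the running torch build. A small diagnostic sketch (note that newer GPUs can often still run older PTX code, so treat a mismatch as a first-order hint, not proof):

```
import torch

# Compute capability of the first visible GPU, e.g. (7, 0) for V100.
major, minor = torch.cuda.get_device_capability(0)
gpu_arch = f"sm_{major}{minor}"

# Architectures this torch build was compiled for, e.g. ['sm_37', ..., 'sm_86'].
compiled = torch.cuda.get_arch_list()

print(f"GPU is {gpu_arch}; this torch build targets {compiled}")
if gpu_arch not in compiled:
    print("Possible mismatch: consider rebuilding with "
          f'TORCH_CUDA_ARCH_LIST="{major}.{minor}"')
```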
-**Undefined CUDA symbols; cannot open libcudart.so.**
-
-The version of NVCC you used to build detectron2 or torchvision does
-not match the version of CUDA you are running with.
-This often happens when using anaconda's CUDA runtime.
-
-Use `python -m detectron2.utils.collect_env` to find out inconsistent CUDA versions.
-In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", "PyTorch built with - CUDA"
-to contain cuda libraries of the same version.
-
-When they are inconsistent,
-you need to either install a different build of PyTorch (or build it yourself)
-to match your local CUDA installation, or install a different version of CUDA to match PyTorch.
-
-**C++ compilation errors from NVCC / NVRTC, or "Unsupported gpu architecture".**
-
-A few possibilities (see the version-gathering sketch after this list):
-
-1. The local CUDA/NVCC version has to match the CUDA version of your PyTorch. Both can be found in `python collect_env.py`.
-   When they are inconsistent, you need to either install a different build of PyTorch (or build it yourself)
-   to match your local CUDA installation, or install a different version of CUDA to match PyTorch.
-
-2. The local CUDA/NVCC version must support the SM architecture (a.k.a. compute capability) of your GPU.
-   The capability of your GPU can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus).
-   The capability supported by NVCC is listed [here](https://gist.github.com/ax3l/9489132).
-   If your NVCC version is too old, this can be worked around by setting the environment variable
-   `TORCH_CUDA_ARCH_LIST` to a lower, supported capability.
-
-3. The combination of NVCC and GCC you use is incompatible. You need to change one of their versions.
-   See [here](https://gist.github.com/ax3l/9489132) for some valid combinations.
-   Notably, CUDA<=10.1.105 doesn't support GCC>7.3.
-
-   The CUDA/GCC version used by PyTorch can be found by `print(torch.__config__.show())`.
-
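A quick way to gather the versions that points 1 and 3 compare, assuming `nvcc` and `gcc` are on `PATH`:

```
import subprocess
import torch

# CUDA version this PyTorch build was compiled against (None for CPU-only builds).
print("PyTorch built with CUDA:", torch.version.cuda)

# Local NVCC version, as reported by the toolkit on PATH.
nvcc = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
for line in nvcc.splitlines():
    if "release" in line:
        print(line.strip())  # e.g. "Cuda compilation tools, release 10.1, V10.1.105"

# Host compiler version that NVCC will use.
gcc = subprocess.run(["gcc", "--version"], capture_output=True, text=True).stdout
print(gcc.splitlines()[0])
```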
      - -"ImportError: cannot import name '_C'". - -
      -Please build and install detectron2 following the instructions above. - -Or, if you are running code from detectron2's root directory, `cd` to a different one. -Otherwise you may not import the code that you installed. -
      - - -
      - -Any issue on windows. - -
      - -Detectron2 is continuously built on windows with [CircleCI](https://app.circleci.com/pipelines/github/facebookresearch/detectron2?branch=main). -However we do not provide official support for it. -PRs that improves code compatibility on windows are welcome. -
      - -
      - -ONNX conversion segfault after some "TraceWarning". - -
      -The ONNX package is compiled with a too old compiler. - -Please build and install ONNX from its source code using a compiler -whose version is closer to what's used by PyTorch (available in `torch.__config__.show()`). -
      - - -
      - -"library not found for -lstdc++" on older version of MacOS - -
      -See -[this stackoverflow answer](https://stackoverflow.com/questions/56083725/macos-build-issues-lstdc-not-found-while-building-python-package). - -
      - - -### Installation inside specific environments: - -* __Colab__: see our [Colab Tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) - which has step-by-step instructions. - -* __Docker__: The official [Dockerfile](docker) installs detectron2 with a few simple commands. - diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/LICENSE b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/LICENSE deleted file mode 100755 index cd1b0706..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/LICENSE +++ /dev/null @@ -1,202 +0,0 @@ -Apache License -Version 2.0, January 2004 -http://www.apache.org/licenses/ - -TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION - -1. Definitions. - -"License" shall mean the terms and conditions for use, reproduction, -and distribution as defined by Sections 1 through 9 of this document. - -"Licensor" shall mean the copyright owner or entity authorized by -the copyright owner that is granting the License. - -"Legal Entity" shall mean the union of the acting entity and all -other entities that control, are controlled by, or are under common -control with that entity. For the purposes of this definition, -"control" means (i) the power, direct or indirect, to cause the -direction or management of such entity, whether by contract or -otherwise, or (ii) ownership of fifty percent (50%) or more of the -outstanding shares, or (iii) beneficial ownership of such entity. - -"You" (or "Your") shall mean an individual or Legal Entity -exercising permissions granted by this License. - -"Source" form shall mean the preferred form for making modifications, -including but not limited to software source code, documentation -source, and configuration files. - -"Object" form shall mean any form resulting from mechanical -transformation or translation of a Source form, including but -not limited to compiled object code, generated documentation, -and conversions to other media types. - -"Work" shall mean the work of authorship, whether in Source or -Object form, made available under the License, as indicated by a -copyright notice that is included in or attached to the work -(an example is provided in the Appendix below). - -"Derivative Works" shall mean any work, whether in Source or Object -form, that is based on (or derived from) the Work and for which the -editorial revisions, annotations, elaborations, or other modifications -represent, as a whole, an original work of authorship. For the purposes -of this License, Derivative Works shall not include works that remain -separable from, or merely link (or bind by name) to the interfaces of, -the Work and Derivative Works thereof. - -"Contribution" shall mean any work of authorship, including -the original version of the Work and any modifications or additions -to that Work or Derivative Works thereof, that is intentionally -submitted to Licensor for inclusion in the Work by the copyright owner -or by an individual or Legal Entity authorized to submit on behalf of -the copyright owner. 
For the purposes of this definition, "submitted" -means any form of electronic, verbal, or written communication sent -to the Licensor or its representatives, including but not limited to -communication on electronic mailing lists, source code control systems, -and issue tracking systems that are managed by, or on behalf of, the -Licensor for the purpose of discussing and improving the Work, but -excluding communication that is conspicuously marked or otherwise -designated in writing by the copyright owner as "Not a Contribution." - -"Contributor" shall mean Licensor and any individual or Legal Entity -on behalf of whom a Contribution has been received by Licensor and -subsequently incorporated within the Work. - -2. Grant of Copyright License. Subject to the terms and conditions of -this License, each Contributor hereby grants to You a perpetual, -worldwide, non-exclusive, no-charge, royalty-free, irrevocable -copyright license to reproduce, prepare Derivative Works of, -publicly display, publicly perform, sublicense, and distribute the -Work and such Derivative Works in Source or Object form. - -3. Grant of Patent License. Subject to the terms and conditions of -this License, each Contributor hereby grants to You a perpetual, -worldwide, non-exclusive, no-charge, royalty-free, irrevocable -(except as stated in this section) patent license to make, have made, -use, offer to sell, sell, import, and otherwise transfer the Work, -where such license applies only to those patent claims licensable -by such Contributor that are necessarily infringed by their -Contribution(s) alone or by combination of their Contribution(s) -with the Work to which such Contribution(s) was submitted. If You -institute patent litigation against any entity (including a -cross-claim or counterclaim in a lawsuit) alleging that the Work -or a Contribution incorporated within the Work constitutes direct -or contributory patent infringement, then any patent licenses -granted to You under this License for that Work shall terminate -as of the date such litigation is filed. - -4. Redistribution. You may reproduce and distribute copies of the -Work or Derivative Works thereof in any medium, with or without -modifications, and in Source or Object form, provided that You -meet the following conditions: - -(a) You must give any other recipients of the Work or -Derivative Works a copy of this License; and - -(b) You must cause any modified files to carry prominent notices -stating that You changed the files; and - -(c) You must retain, in the Source form of any Derivative Works -that You distribute, all copyright, patent, trademark, and -attribution notices from the Source form of the Work, -excluding those notices that do not pertain to any part of -the Derivative Works; and - -(d) If the Work includes a "NOTICE" text file as part of its -distribution, then any Derivative Works that You distribute must -include a readable copy of the attribution notices contained -within such NOTICE file, excluding those notices that do not -pertain to any part of the Derivative Works, in at least one -of the following places: within a NOTICE text file distributed -as part of the Derivative Works; within the Source form or -documentation, if provided along with the Derivative Works; or, -within a display generated by the Derivative Works, if and -wherever such third-party notices normally appear. The contents -of the NOTICE file are for informational purposes only and -do not modify the License. 
You may add Your own attribution -notices within Derivative Works that You distribute, alongside -or as an addendum to the NOTICE text from the Work, provided -that such additional attribution notices cannot be construed -as modifying the License. - -You may add Your own copyright statement to Your modifications and -may provide additional or different license terms and conditions -for use, reproduction, or distribution of Your modifications, or -for any such Derivative Works as a whole, provided Your use, -reproduction, and distribution of the Work otherwise complies with -the conditions stated in this License. - -5. Submission of Contributions. Unless You explicitly state otherwise, -any Contribution intentionally submitted for inclusion in the Work -by You to the Licensor shall be under the terms and conditions of -this License, without any additional terms or conditions. -Notwithstanding the above, nothing herein shall supersede or modify -the terms of any separate license agreement you may have executed -with Licensor regarding such Contributions. - -6. Trademarks. This License does not grant permission to use the trade -names, trademarks, service marks, or product names of the Licensor, -except as required for reasonable and customary use in describing the -origin of the Work and reproducing the content of the NOTICE file. - -7. Disclaimer of Warranty. Unless required by applicable law or -agreed to in writing, Licensor provides the Work (and each -Contributor provides its Contributions) on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or -implied, including, without limitation, any warranties or conditions -of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A -PARTICULAR PURPOSE. You are solely responsible for determining the -appropriateness of using or redistributing the Work and assume any -risks associated with Your exercise of permissions under this License. - -8. Limitation of Liability. In no event and under no legal theory, -whether in tort (including negligence), contract, or otherwise, -unless required by applicable law (such as deliberate and grossly -negligent acts) or agreed to in writing, shall any Contributor be -liable to You for damages, including any direct, indirect, special, -incidental, or consequential damages of any character arising as a -result of this License or out of the use or inability to use the -Work (including but not limited to damages for loss of goodwill, -work stoppage, computer failure or malfunction, or any and all -other commercial damages or losses), even if such Contributor -has been advised of the possibility of such damages. - -9. Accepting Warranty or Additional Liability. While redistributing -the Work or Derivative Works thereof, You may choose to offer, -and charge a fee for, acceptance of support, warranty, indemnity, -or other liability obligations and/or rights consistent with this -License. However, in accepting such obligations, You may act only -on Your own behalf and on Your sole responsibility, not on behalf -of any other Contributor, and only if You agree to indemnify, -defend, and hold each Contributor harmless for any liability -incurred by, or claims asserted against, such Contributor by reason -of your accepting any such warranty or additional liability. - -END OF TERMS AND CONDITIONS - -APPENDIX: How to apply the Apache License to your work. 
- -To apply the Apache License to your work, attach the following -boilerplate notice, with the fields enclosed by brackets "[]" -replaced with your own identifying information. (Don't include -the brackets!) The text should be enclosed in the appropriate -comment syntax for the file format. We also recommend that a -file or class name and description of purpose be included on the -same "printed page" as the copyright notice for easier -identification within third-party archives. - -Copyright [yyyy] [name of copyright owner] - - -Licensed under the Apache License, Version 2.0 (the "License"); -you may not use this file except in compliance with the License. -You may obtain a copy of the License at - -http://www.apache.org/licenses/LICENSE-2.0 - -Unless required by applicable law or agreed to in writing, software -distributed under the License is distributed on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -See the License for the specific language governing permissions and -limitations under the License. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/MODEL_ZOO.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/MODEL_ZOO.md deleted file mode 100755 index 69db2728..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/MODEL_ZOO.md +++ /dev/null @@ -1,1052 +0,0 @@ -# Detectron2 Model Zoo and Baselines - -## Introduction - -This file documents a large collection of baselines trained -with detectron2 in Sep-Oct, 2019. -All numbers were obtained on [Big Basin](https://engineering.fb.com/data-center-engineering/introducing-big-basin-our-next-generation-ai-hardware/) -servers with 8 NVIDIA V100 GPUs & NVLink. The speed numbers are periodically updated with latest PyTorch/CUDA/cuDNN versions. -You can access these models from code using [detectron2.model_zoo](https://detectron2.readthedocs.io/modules/model_zoo.html) APIs. - -In addition to these official baseline models, you can find more models in [projects/](projects/). - -#### How to Read the Tables -* The "Name" column contains a link to the config file. Models can be reproduced using `tools/train_net.py` with the corresponding yaml config file, - or `tools/lazyconfig_train_net.py` for python config files. -* Training speed is averaged across the entire training. - We keep updating the speed with latest version of detectron2/pytorch/etc., - so they might be different from the `metrics` file. - Training speed for multi-machine jobs is not provided. -* Inference speed is measured by `tools/train_net.py --eval-only`, or [inference_on_dataset()](https://detectron2.readthedocs.io/modules/evaluation.html#detectron2.evaluation.inference_on_dataset), - with batch size 1 in detectron2 directly. - Measuring it with custom code may introduce other overhead. - Actual deployment in production should in general be faster than the given inference - speed due to more optimizations. -* The *model id* column is provided for ease of reference. - To check downloaded file integrity, any model on this page contains its md5 prefix in its file name. -* Training curves and other statistics can be found in `metrics` for each model. - -#### Common Settings for COCO Models -* All COCO models were trained on `train2017` and evaluated on `val2017`. -* The default settings are __not directly comparable__ with Detectron's standard settings. - For example, our default training data augmentation uses scale jittering in addition to horizontal flipping. 
- - To make fair comparisons with Detectron's settings, see - [Detectron1-Comparisons](configs/Detectron1-Comparisons/) for accuracy comparison, - and [benchmarks](https://detectron2.readthedocs.io/notes/benchmarks.html) - for speed comparison. -* For Faster/Mask R-CNN, we provide baselines based on __3 different backbone combinations__: - * __FPN__: Use a ResNet+FPN backbone with standard conv and FC heads for mask and box prediction, - respectively. It obtains the best - speed/accuracy tradeoff, but the other two are still useful for research. - * __C4__: Use a ResNet conv4 backbone with conv5 head. The original baseline in the Faster R-CNN paper. - * __DC5__ (Dilated-C5): Use a ResNet conv5 backbone with dilations in conv5, and standard conv and FC heads - for mask and box prediction, respectively. - This is used by the Deformable ConvNet paper. -* Most models are trained with the 3x schedule (~37 COCO epochs). - Although 1x models are heavily under-trained, we provide some ResNet-50 models with the 1x (~12 COCO epochs) - training schedule for comparison when doing quick research iteration. - -#### ImageNet Pretrained Models - -It's common to initialize from backbone models pre-trained on ImageNet classification tasks. The following backbone models are available: - -* [R-50.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/MSRA/R-50.pkl): converted copy of [MSRA's original ResNet-50](https://github.com/KaimingHe/deep-residual-networks) model. -* [R-101.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/MSRA/R-101.pkl): converted copy of [MSRA's original ResNet-101](https://github.com/KaimingHe/deep-residual-networks) model. -* [X-101-32x8d.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/FAIR/X-101-32x8d.pkl): ResNeXt-101-32x8d model trained with Caffe2 at FB. -* [R-50.pkl (torchvision)](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/torchvision/R-50.pkl): converted copy of [torchvision's ResNet-50](https://pytorch.org/docs/stable/torchvision/models.html#torchvision.models.resnet50) model. - More details can be found in [the conversion script](tools/convert-torchvision-to-d2.py). - -Note that the above models have __different__ format from those provided in Detectron: we do not fuse BatchNorm into an affine layer. -Pretrained models in Detectron's format can still be used. For example: -* [X-152-32x8d-IN5k.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/25093814/X-152-32x8d-IN5k.pkl): - ResNeXt-152-32x8d model trained on ImageNet-5k with Caffe2 at FB (see ResNeXt paper for details on ImageNet-5k). -* [R-50-GN.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/47261647/R-50-GN.pkl): - ResNet-50 with Group Normalization. -* [R-101-GN.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/47592356/R-101-GN.pkl): - ResNet-101 with Group Normalization. - -These models require slightly different settings regarding normalization and architecture. See the model zoo configs for reference. - -#### License - -All models available for download through this document are licensed under the -[Creative Commons Attribution-ShareAlike 3.0 license](https://creativecommons.org/licenses/by-sa/3.0/). - -### COCO Object Detection Baselines - -#### Faster R-CNN: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|----------|----------|
-| R50-C4 | 1x | 0.551 | 0.102 | 4.8 | 35.7 | 137257644 | model \| metrics |
-| R50-DC5 | 1x | 0.380 | 0.068 | 5.0 | 37.3 | 137847829 | model \| metrics |
-| R50-FPN | 1x | 0.210 | 0.038 | 3.0 | 37.9 | 137257794 | model \| metrics |
-| R50-C4 | 3x | 0.543 | 0.104 | 4.8 | 38.4 | 137849393 | model \| metrics |
-| R50-DC5 | 3x | 0.378 | 0.070 | 5.0 | 39.0 | 137849425 | model \| metrics |
-| R50-FPN | 3x | 0.209 | 0.038 | 3.0 | 40.2 | 137849458 | model \| metrics |
-| R101-C4 | 3x | 0.619 | 0.139 | 5.9 | 41.1 | 138204752 | model \| metrics |
-| R101-DC5 | 3x | 0.452 | 0.086 | 6.1 | 40.6 | 138204841 | model \| metrics |
-| R101-FPN | 3x | 0.286 | 0.051 | 4.1 | 42.0 | 137851257 | model \| metrics |
-| X101-FPN | 3x | 0.638 | 0.098 | 6.7 | 43.0 | 139173657 | model \| metrics |
-
-#### RetinaNet:
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|----------|----------|
-| R50 | 1x | 0.205 | 0.041 | 4.1 | 37.4 | 190397773 | model \| metrics |
-| R50 | 3x | 0.205 | 0.041 | 4.1 | 38.7 | 190397829 | model \| metrics |
-| R101 | 3x | 0.291 | 0.054 | 5.2 | 40.4 | 190397697 | model \| metrics |
-
-#### RPN & Fast R-CNN:
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | prop. AR | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|----------|----------|----------|
-| RPN R50-C4 | 1x | 0.130 | 0.034 | 1.5 | | 51.6 | 137258005 | model \| metrics |
-| RPN R50-FPN | 1x | 0.186 | 0.032 | 2.7 | | 58.0 | 137258492 | model \| metrics |
-| Fast R-CNN R50-FPN | 1x | 0.140 | 0.029 | 2.6 | 37.8 | | 137635226 | model \| metrics |
-
-### COCO Instance Segmentation Baselines with Mask R-CNN
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|---------|----------|----------|
-| R50-C4 | 1x | 0.584 | 0.110 | 5.2 | 36.8 | 32.2 | 137259246 | model \| metrics |
-| R50-DC5 | 1x | 0.471 | 0.076 | 6.5 | 38.3 | 34.2 | 137260150 | model \| metrics |
-| R50-FPN | 1x | 0.261 | 0.043 | 3.4 | 38.6 | 35.2 | 137260431 | model \| metrics |
-| R50-C4 | 3x | 0.575 | 0.111 | 5.2 | 39.8 | 34.4 | 137849525 | model \| metrics |
-| R50-DC5 | 3x | 0.470 | 0.076 | 6.5 | 40.0 | 35.9 | 137849551 | model \| metrics |
-| R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
-| R101-C4 | 3x | 0.652 | 0.145 | 6.3 | 42.6 | 36.7 | 138363239 | model \| metrics |
-| R101-DC5 | 3x | 0.545 | 0.092 | 7.6 | 41.9 | 37.3 | 138363294 | model \| metrics |
-| R101-FPN | 3x | 0.340 | 0.056 | 4.6 | 42.9 | 38.6 | 138205316 | model \| metrics |
-| X101-FPN | 3x | 0.690 | 0.103 | 7.2 | 44.3 | 39.5 | 139653917 | model \| metrics |
-
-#### New baselines using Large-Scale Jitter and Longer Training Schedule
-
-The following baselines of COCO Instance Segmentation with Mask R-CNN are generated
-using a longer training schedule and large-scale jitter as described in Google's
-[Simple Copy-Paste Data Augmentation](https://arxiv.org/pdf/2012.07177.pdf) paper. These
-models are trained from scratch using random initialization. These baselines exceed the
-previous Mask R-CNN baselines.
-
-In the following table, one epoch consists of training on 118000 COCO images.
-
-| Name | epochs | train time (s/im) | inference time (s/im) | box AP | mask AP | model id | download |
-|------|--------|-------------------|-----------------------|--------|---------|----------|----------|
-| R50-FPN | 100 | 0.376 | 0.069 | 44.6 | 40.3 | 42047764 | model \| metrics |
-| R50-FPN | 200 | 0.376 | 0.069 | 46.3 | 41.7 | 42047638 | model \| metrics |
-| R50-FPN | 400 | 0.376 | 0.069 | 47.4 | 42.5 | 42019571 | model \| metrics |
-| R101-FPN | 100 | 0.518 | 0.073 | 46.4 | 41.6 | 42025812 | model \| metrics |
-| R101-FPN | 200 | 0.518 | 0.073 | 48.0 | 43.1 | 42131867 | model \| metrics |
-| R101-FPN | 400 | 0.518 | 0.073 | 48.9 | 43.7 | 42073830 | model \| metrics |
-| regnetx_4gf_dds_FPN | 100 | 0.474 | 0.071 | 46.0 | 41.3 | 42047771 | model \| metrics |
-| regnetx_4gf_dds_FPN | 200 | 0.474 | 0.071 | 48.1 | 43.1 | 42132721 | model \| metrics |
-| regnetx_4gf_dds_FPN | 400 | 0.474 | 0.071 | 48.6 | 43.5 | 42025447 | model \| metrics |
-| regnety_4gf_dds_FPN | 100 | 0.487 | 0.073 | 46.1 | 41.6 | 42047784 | model \| metrics |
-| regnety_4gf_dds_FPN | 200 | 0.487 | 0.072 | 47.8 | 43.0 | 42047642 | model \| metrics |
-| regnety_4gf_dds_FPN | 400 | 0.487 | 0.072 | 48.2 | 43.3 | 42045954 | model \| metrics |
-
-### COCO Person Keypoint Detection Baselines with Keypoint R-CNN
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | kp. AP | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|--------|----------|----------|
-| R50-FPN | 1x | 0.315 | 0.072 | 5.0 | 53.6 | 64.0 | 137261548 | model \| metrics |
-| R50-FPN | 3x | 0.316 | 0.066 | 5.0 | 55.4 | 65.5 | 137849621 | model \| metrics |
-| R101-FPN | 3x | 0.390 | 0.076 | 6.1 | 56.4 | 66.1 | 138363331 | model \| metrics |
-| X101-FPN | 3x | 0.738 | 0.121 | 8.7 | 57.3 | 66.0 | 139686956 | model \| metrics |
-
-### COCO Panoptic Segmentation Baselines with Panoptic FPN
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | PQ | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|---------|----|----------|----------|
-| R50-FPN | 1x | 0.304 | 0.053 | 4.8 | 37.6 | 34.7 | 39.4 | 139514544 | model \| metrics |
-| R50-FPN | 3x | 0.302 | 0.053 | 4.8 | 40.0 | 36.5 | 41.5 | 139514569 | model \| metrics |
-| R101-FPN | 3x | 0.392 | 0.066 | 6.0 | 42.4 | 38.5 | 43.0 | 139514519 | model \| metrics |
-
-### LVIS Instance Segmentation Baselines with Mask R-CNN
-
-Mask R-CNN baselines on the [LVIS dataset](https://lvisdataset.org), v0.5.
-These baselines are described in Table 3(c) of the [LVIS paper](https://arxiv.org/abs/1908.03195).
-
-NOTE: the 1x schedule here has the same number of __iterations__ as the COCO 1x baselines.
-This is roughly 24 epochs of LVISv0.5 data.
-The final results of these configs have large variance across different runs.
-
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
-|------|----------|---------------------|-----------------------|----------------|--------|---------|----------|----------|
-| R50-FPN | 1x | 0.292 | 0.107 | 7.1 | 23.6 | 24.4 | 144219072 | model \| metrics |
-| R101-FPN | 1x | 0.371 | 0.114 | 7.8 | 25.6 | 25.9 | 144219035 | model \| metrics |
-| X101-FPN | 1x | 0.712 | 0.151 | 10.2 | 26.7 | 27.1 | 144219108 | model \| metrics |
-
-### Cityscapes & Pascal VOC Baselines
-
-Simple baselines for
-* Mask R-CNN on Cityscapes instance segmentation (initialized from COCO pre-training, then trained on Cityscapes fine annotations only)
-* Faster R-CNN on PASCAL VOC object detection (trained on VOC 2007 train+val + VOC 2012 train+val, tested on VOC 2007 using 11-point interpolated AP)
-
| Name | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | box AP50 | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| R50-FPN, Cityscapes | 0.240 | 0.078 | 4.4 |  |  | 36.5 | 142423278 | model \| metrics |
| R50-C4, VOC | 0.537 | 0.081 | 4.8 | 51.9 | 80.3 |  | 142202221 | model \| metrics |

### Other Settings

Ablations for Deformable Conv and Cascade R-CNN:

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| Baseline R50-FPN | 1x | 0.261 | 0.043 | 3.4 | 38.6 | 35.2 | 137260431 | model \| metrics |
| Deformable Conv | 1x | 0.342 | 0.048 | 3.5 | 41.5 | 37.5 | 138602867 | model \| metrics |
| Cascade R-CNN | 1x | 0.317 | 0.052 | 4.0 | 42.1 | 36.4 | 138602847 | model \| metrics |
| Baseline R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
| Deformable Conv | 3x | 0.349 | 0.047 | 3.5 | 42.7 | 38.5 | 144998336 | model \| metrics |
| Cascade R-CNN | 3x | 0.328 | 0.053 | 4.0 | 44.3 | 38.5 | 144998488 | model \| metrics |

Ablations for normalization methods, and a few models trained from scratch following [Rethinking ImageNet Pre-training](https://arxiv.org/abs/1811.08883).
(Note: the baseline uses a `2fc` box head while the others use a [`4conv1fc` head](https://arxiv.org/abs/1803.08494).)

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| Baseline R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
| GN | 3x | 0.309 | 0.060 | 5.6 | 42.6 | 38.6 | 138602888 | model \| metrics |
| SyncBN | 3x | 0.345 | 0.053 | 5.5 | 41.9 | 37.8 | 169527823 | model \| metrics |
| GN (from scratch) | 3x | 0.338 | 0.061 | 7.2 | 39.9 | 36.6 | 138602908 | model \| metrics |
| GN (from scratch) | 9x | N/A | 0.061 | 7.2 | 43.7 | 39.6 | 183808979 | model \| metrics |
| SyncBN (from scratch) | 9x | N/A | 0.055 | 7.2 | 43.6 | 39.3 | 184226666 | model \| metrics |

A few very large models trained for a long time, for demo purposes. They are trained using multiple machines:

| Name | inference time (s/im) | train mem (GB) | box AP | mask AP | PQ | model id | download |
|---|---|---|---|---|---|---|---|
| Panoptic FPN R101 | 0.098 | 11.4 | 47.4 | 41.3 | 46.1 | 139797668 | model \| metrics |
| Mask R-CNN X152 | 0.234 | 15.1 | 50.2 | 44.0 |  | 18131413 | model \| metrics |
| above + test-time aug. |  |  | 51.9 | 45.9 |  |  |  |
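
The "model id" and "download" columns above correspond to checkpoints that detectron2 serves through its model zoo API. As a minimal sketch of how one of these baselines is typically consumed (assuming detectron2 and OpenCV are installed; `input.jpg` is a placeholder image, and any config path from the tables can be substituted):

```python
# Hedged sketch: pull a Model Zoo baseline through detectron2's model_zoo API.
import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

CONFIG = "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"  # any table entry

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(CONFIG))    # resolve the YAML config
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(CONFIG)  # the "model" download link
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5               # confidence threshold

predictor = DefaultPredictor(cfg)             # handles preprocessing and eval mode
outputs = predictor(cv2.imread("input.jpg"))  # expects a BGR numpy image
print(outputs["instances"].pred_boxes)
```
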
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/README.md
deleted file mode 100755
index d3e1d5cf..00000000
@@ -1,85 +0,0 @@
# Probabilistic two-stage detection

Two-stage object detectors that use class-agnostic one-stage detectors as the proposal network.

> [**Probabilistic two-stage detection**](http://arxiv.org/abs/2103.07461),
> Xingyi Zhou, Vladlen Koltun, Philipp Krähenbühl,
> *arXiv technical report ([arXiv 2103.07461](http://arxiv.org/abs/2103.07461))*

Contact: [zhouxy@cs.utexas.edu](mailto:zhouxy@cs.utexas.edu). Any questions or discussions are welcome!

## Abstract

We develop a probabilistic interpretation of two-stage object detection. We show that this probabilistic interpretation motivates a number of common empirical training practices. It also suggests changes to two-stage detection pipelines. Specifically, the first stage should infer proper object-vs-background likelihoods, which should then inform the overall score of the detector. A standard region proposal network (RPN) cannot infer this likelihood sufficiently well, but many one-stage detectors can. We show how to build a probabilistic two-stage detector from any state-of-the-art one-stage detector. The resulting detectors are faster and more accurate than both their one- and two-stage precursors. Our detector achieves 56.4 mAP on COCO test-dev with single-scale testing, outperforming all published results. Using a lightweight backbone, our detector achieves 49.2 mAP on COCO at 33 fps on a Titan Xp.

## Summary

- Two-stage CenterNet: the first stage estimates object probabilities, the second stage conditionally classifies objects.
- The resulting detector is faster and more accurate than both traditional two-stage detectors (fewer proposals required) and one-stage detectors (lighter first-stage head).
- Our best model achieves 56.4 mAP on COCO test-dev.
- This repo also includes a detectron2-based CenterNet implementation with better accuracy (42.5 mAP at 70 FPS) and a new FPN version of CenterNet (40.2 mAP with Res50_1x).

## Main results

All models are trained with multi-scale training and tested with a single scale. FPS is measured on a Titan RTX GPU.
More models and details can be found in the [MODEL_ZOO](projects/CenterNet2/centernet2_docs/MODEL_ZOO.md).

#### COCO

| Model | COCO val mAP | FPS |
|-------------------------------------------|------|----|
| CenterNet-S4_DLA_8x | 42.5 | 71 |
| CenterNet2_R50_1x | 42.9 | 24 |
| CenterNet2_X101-DCN_2x | 49.9 | 8 |
| CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST | 56.1 | 5 |
| CenterNet2_DLA-BiFPN-P5_24x_ST | 49.2 | 38 |

#### LVIS

| Model | val mAP box |
|---------------------------|------|
| CenterNet2_R50_1x | 26.5 |
| CenterNet2_FedLoss_R50_1x | 28.3 |

#### Objects365

| Model | val mAP |
|-------------------|------|
| CenterNet2_R50_1x | 22.6 |

## Installation

Our project is developed on [detectron2](https://github.com/facebookresearch/detectron2). Please follow the official detectron2 [installation](https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md). All our code is under `projects/CenterNet2/`. In theory, you should be able to copy-paste `projects/CenterNet2/` into the latest detectron2 release or your own detectron2 repo to run our project. There might be API changes in future detectron2 releases that make it incompatible.

We use the default detectron2 demo script. To run inference on an image folder using our pre-trained model, run

~~~
python projects/CenterNet2/demo/demo.py --config-file projects/CenterNet2/configs/CenterNet2_R50_1x.yaml --input path/to/image/ --opts MODEL.WEIGHTS models/CenterNet2_R50_1x.pth
~~~

## Benchmark evaluation and training

Please check detectron2 [GETTING_STARTED.md](https://github.com/facebookresearch/detectron2/blob/master/GETTING_STARTED.md) for running evaluation and training. Our config files are under `projects/CenterNet2/configs` and the pre-trained models are in the [MODEL_ZOO](projects/CenterNet2/centernet2_docs/MODEL_ZOO.md).

## License

Our code under `projects/CenterNet2/` is under the [Apache 2.0 license](projects/CenterNet2/LICENSE). `projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py` is from [AdelaiDet](https://github.com/aim-uofa/AdelaiDet), which follows the original [non-commercial license](https://github.com/aim-uofa/AdelaiDet/blob/master/LICENSE). The code from detectron2 follows the original [Apache 2.0 license](LICENSE).

## Citation

If you find this project useful for your research, please use the following BibTeX entry.

    @inproceedings{zhou2021probablistic,
      title={Probabilistic two-stage detection},
      author={Zhou, Xingyi and Koltun, Vladlen and Kr{\"a}henb{\"u}hl, Philipp},
      booktitle={arXiv preprint arXiv:2103.07461},
      year={2021}
    }

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/README_D2.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/README_D2.md
deleted file mode 100755
index a88ad7e2..00000000
@@ -1,62 +0,0 @@
Detectron2 is Facebook AI Research's next generation software system that implements state-of-the-art object detection algorithms. It is a ground-up rewrite of the previous version, [Detectron](https://github.com/facebookresearch/Detectron/), and it originates from [maskrcnn-benchmark](https://github.com/facebookresearch/maskrcnn-benchmark/).

### What's New
* It is powered by the [PyTorch](https://pytorch.org) deep learning framework.
* Includes more features such as panoptic segmentation, Densepose, Cascade R-CNN, rotated bounding boxes, PointRend, DeepLab, etc.
* Can be used as a library to support [different projects](projects/) on top of it. We'll open source more research projects in this way.
* It [trains much faster](https://detectron2.readthedocs.io/notes/benchmarks.html).
* Models can be exported to TorchScript format or Caffe2 format for deployment.

See our [blog post](https://ai.facebook.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-/) to see more demos and learn about detectron2.

## Installation

See [INSTALL.md](INSTALL.md).

## Getting Started

Follow the [installation instructions](https://detectron2.readthedocs.io/tutorials/install.html) to install detectron2.

See [Getting Started with Detectron2](https://detectron2.readthedocs.io/tutorials/getting_started.html), and the [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) to learn about basic usage.

Learn more at our [documentation](https://detectron2.readthedocs.org). And see [projects/](projects/) for some projects that are built on top of detectron2.

## Model Zoo and Baselines

We provide a large set of baseline results and trained models available for download in the [Detectron2 Model Zoo](MODEL_ZOO.md).

## License

Detectron2 is released under the [Apache 2.0 license](LICENSE).

## Citing Detectron2

If you use Detectron2 in your research or wish to refer to the baseline results published in the [Model Zoo](MODEL_ZOO.md), please use the following BibTeX entry.

```BibTeX
@misc{wu2019detectron2,
  author =       {Yuxin Wu and Alexander Kirillov and Francisco Massa and
                  Wan-Yen Lo and Ross Girshick},
  title =        {Detectron2},
  howpublished = {\url{https://github.com/facebookresearch/detectron2}},
  year =         {2019}
}
```

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-C4.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-C4.yaml
deleted file mode 100755
index fbf34a0e..00000000
@@ -1,18 +0,0 @@
MODEL:
  META_ARCHITECTURE: "GeneralizedRCNN"
  RPN:
    PRE_NMS_TOPK_TEST: 6000
    POST_NMS_TOPK_TEST: 1000
  ROI_HEADS:
    NAME: "Res5ROIHeads"
DATASETS:
  TRAIN: ("coco_2017_train",)
  TEST: ("coco_2017_val",)
SOLVER:
  IMS_PER_BATCH: 16
  BASE_LR: 0.02
  STEPS: (60000, 80000)
  MAX_ITER: 90000
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
VERSION: 2

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml
deleted file mode 100755
index c0d6d16b..00000000
@@ -1,31 +0,0 @@
MODEL:
  META_ARCHITECTURE: "GeneralizedRCNN"
  RESNETS:
    OUT_FEATURES: ["res5"]
    RES5_DILATION: 2
  RPN:
    IN_FEATURES: ["res5"]
    PRE_NMS_TOPK_TEST: 6000
    POST_NMS_TOPK_TEST: 1000
  ROI_HEADS:
    NAME: "StandardROIHeads"
    IN_FEATURES: ["res5"]
  ROI_BOX_HEAD:
    NAME: "FastRCNNConvFCHead"
    NUM_FC: 2
    POOLER_RESOLUTION: 7
  ROI_MASK_HEAD:
    NAME: "MaskRCNNConvUpsampleHead"
    NUM_CONV: 4
    POOLER_RESOLUTION: 14
DATASETS:
  TRAIN: ("coco_2017_train",)
  TEST: ("coco_2017_val",)
SOLVER:
  IMS_PER_BATCH: 16
  BASE_LR: 0.02
  STEPS: (60000, 80000)
  MAX_ITER: 90000
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
VERSION: 2
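
Nearly every config deleted below starts with a `_BASE_` key pointing at one of the base files above. As a sketch of how that inheritance behaves when loaded through detectron2's yacs-style API (assuming a source checkout where these relative config paths exist):

```python
# Hedged sketch: _BASE_ resolution in detectron2's CfgNode.merge_from_file().
# The file named by _BASE_ (relative to the child config) is merged first,
# then the child's own keys are overlaid on the compiled-in defaults.
from detectron2.config import get_cfg

cfg = get_cfg()  # start from detectron2's hard-coded defaults
cfg.merge_from_file("configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml")

print(cfg.MODEL.META_ARCHITECTURE)  # "GeneralizedRCNN", from Base-RCNN-FPN.yaml
print(cfg.SOLVER.MAX_ITER)          # 270000, overridden by the 3x child config
```
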
("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml deleted file mode 100755 index 3e020f2e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml +++ /dev/null @@ -1,42 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - BACKBONE: - NAME: "build_resnet_fpn_backbone" - RESNETS: - OUT_FEATURES: ["res2", "res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res2", "res3", "res4", "res5"] - ANCHOR_GENERATOR: - SIZES: [[32], [64], [128], [256], [512]] # One size for each in feature map - ASPECT_RATIOS: [[0.5, 1.0, 2.0]] # Three aspect ratios (same for all in feature maps) - RPN: - IN_FEATURES: ["p2", "p3", "p4", "p5", "p6"] - PRE_NMS_TOPK_TRAIN: 2000 # Per FPN level - PRE_NMS_TOPK_TEST: 1000 # Per FPN level - # Detectron1 uses 2000 proposals per-batch, - # (See "modeling/rpn/rpn_outputs.py" for details of this legacy issue) - # which is approximately 1000 proposals per-image since the default batch size for FPN is 2. - POST_NMS_TOPK_TRAIN: 1000 - POST_NMS_TOPK_TEST: 1000 - ROI_HEADS: - NAME: "StandardROIHeads" - IN_FEATURES: ["p2", "p3", "p4", "p5"] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RetinaNet.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RetinaNet.yaml deleted file mode 100755 index 8b45b982..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Base-RetinaNet.yaml +++ /dev/null @@ -1,25 +0,0 @@ -MODEL: - META_ARCHITECTURE: "RetinaNet" - BACKBONE: - NAME: "build_retinanet_resnet_fpn_backbone" - RESNETS: - OUT_FEATURES: ["res3", "res4", "res5"] - ANCHOR_GENERATOR: - SIZES: !!python/object/apply:eval ["[[x, x * 2**(1.0/3), x * 2**(2.0/3) ] for x in [32, 64, 128, 256, 512 ]]"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] - RETINANET: - IOU_THRESHOLDS: [0.4, 0.5] - IOU_LABELS: [0, -1, 1] - SMOOTH_L1_LOSS_BETA: 0.0 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.01 # Note that RetinaNet uses a different default learning rate - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 773ac10e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,17 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - LOAD_PROPOSALS: True - RESNETS: - DEPTH: 50 - PROPOSAL_GENERATOR: - NAME: 
"PrecomputedProposals" -DATASETS: - TRAIN: ("coco_2017_train",) - PROPOSAL_FILES_TRAIN: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_train_box_proposals_21bc3a.pkl", ) - TEST: ("coco_2017_val",) - PROPOSAL_FILES_TEST: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", ) -DATALOADER: - # proposals are part of the dataset_dicts, and take a lot of RAM - NUM_WORKERS: 2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml deleted file mode 100755 index db142cd6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml deleted file mode 100755 index bceb6b34..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml deleted file mode 100755 index 57a098f5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml deleted file mode 100755 index f9613010..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml deleted file mode 100755 index bc51bce3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff 
--git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml deleted file mode 100755 index 0fe96f57..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml deleted file mode 100755 index 33fadeb8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 3262019a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml deleted file mode 100755 index 41395182..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml deleted file mode 100755 index 9c9b5ab7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - MASK_ON: False - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py deleted file mode 100755 index 86f83c68..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py +++ /dev/null @@ -1,11 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.fcos import model -from ..common.train import train - -dataloader.train.mapper.use_instance_mask = False -optimizer.lr = 0.01 - -model.backbone.bottom_up.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml deleted file mode 100755 index 4abb1b9a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py deleted file mode 100755 index 43057a8e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py +++ /dev/null @@ -1,11 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.retinanet import model -from ..common.train import train - -dataloader.train.mapper.use_instance_mask = False -model.backbone.bottom_up.freeze_at = 2 -optimizer.lr = 0.01 - -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml deleted file mode 100755 index 4a24ce3a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml deleted file mode 100755 index 3b5412d4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml deleted file mode 100755 index e0482115..00000000 --- 
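
The `.py` configs above (`fcos_R_50_FPN_1x.py`, `retinanet_R_50_FPN_1x.py`) use detectron2's LazyConfig system instead of yacs YAML: the file is executed and its top-level names (`model`, `optimizer`, `dataloader`, `train`) become config nodes, normally driven through detectron2's `tools/lazyconfig_train_net.py`. A minimal sketch of loading one directly, again assuming a detectron2 source checkout:

```python
# Hedged sketch: load a LazyConfig-style .py config and materialize the model.
from detectron2.config import LazyConfig, instantiate

cfg = LazyConfig.load("configs/COCO-Detection/fcos_R_50_FPN_1x.py")
cfg.optimizer.lr = 0.005        # nodes are plain attributes until instantiated
model = instantiate(cfg.model)  # recursively builds everything wrapped in LazyCall
```
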
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml
deleted file mode 100755
index e0482115..00000000
@@ -1,10 +0,0 @@
_BASE_: "../Base-RCNN-C4.yaml"
MODEL:
  META_ARCHITECTURE: "ProposalNetwork"
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: False
  RESNETS:
    DEPTH: 50
  RPN:
    PRE_NMS_TOPK_TEST: 12000
    POST_NMS_TOPK_TEST: 2000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml
deleted file mode 100755
index dc9c9520..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  META_ARCHITECTURE: "ProposalNetwork"
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: False
  RESNETS:
    DEPTH: 50
  RPN:
    POST_NMS_TOPK_TEST: 2000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml
deleted file mode 100755
index 1a94cc45..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-C4.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml
deleted file mode 100755
index 67b70cf4..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-DilatedC5.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml
deleted file mode 100755
index 1935a302..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py
deleted file mode 100755
index 22016be1..00000000
@@ -1,8 +0,0 @@
from ..common.train import train
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco import dataloader
from ..common.models.mask_rcnn_c4 import model

model.backbone.freeze_at = 2
train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl"

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml
deleted file mode 100755
index a9aeb4ea..00000000
@@ -1,6 +0,0 @@
_BASE_: "../Base-RCNN-C4.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml
deleted file mode 100755
index 38ed867d..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-C4.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml
deleted file mode 100755
index b13eefab..00000000
@@ -1,6 +0,0 @@
_BASE_: "../Base-RCNN-DilatedC5.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml
deleted file mode 100755
index d4010163..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-DilatedC5.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py
deleted file mode 100755
index 40844dde..00000000
@@ -1,8 +0,0 @@
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco import dataloader
from ..common.models.mask_rcnn_fpn import model
from ..common.train import train

model.backbone.bottom_up.freeze_at = 2
train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl"

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index d50fb866..00000000
@@ -1,6 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml
deleted file mode 100755
index bec680ee..00000000
@@ -1,12 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  RPN:
    BBOX_REG_LOSS_TYPE: "giou"
    BBOX_REG_LOSS_WEIGHT: 2.0
  ROI_BOX_HEAD:
    BBOX_REG_LOSS_TYPE: "giou"
    BBOX_REG_LOSS_WEIGHT: 10.0

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml
deleted file mode 100755
index be7d06b8..00000000
@@ -1,9 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml
deleted file mode 100755
index d14c63f7..00000000
@@ -1,13 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  MASK_ON: True
  WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl"
  PIXEL_STD: [57.375, 57.120, 58.395]
  RESNETS:
    STRIDE_IN_1X1: False  # this is a C2 model
    NUM_GROUPS: 32
    WIDTH_PER_GROUP: 8
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py
deleted file mode 100755
index d7bbdd7d..00000000
@@ -1,34 +0,0 @@
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco import dataloader
from ..common.models.mask_rcnn_fpn import model
from ..common.train import train

from detectron2.config import LazyCall as L
from detectron2.modeling.backbone import RegNet
from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock


# Replace default ResNet with RegNetX-4GF from the DDS paper. Config source:
# https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnetx/RegNetX-4.0GF_dds_8gpu.yaml#L4-L9  # noqa
model.backbone.bottom_up = L(RegNet)(
    stem_class=SimpleStem,
    stem_width=32,
    block_class=ResBottleneckBlock,
    depth=23,
    w_a=38.65,
    w_0=96,
    w_m=2.43,
    group_width=40,
    freeze_at=2,
    norm="FrozenBN",
    out_features=["s1", "s2", "s3", "s4"],
)
model.pixel_std = [57.375, 57.120, 58.395]

optimizer.weight_decay = 5e-5
train.init_checkpoint = (
    "https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906383/RegNetX-4.0GF_dds_8gpu.pyth"
)
# RegNets benefit from enabling cudnn benchmark mode
train.cudnn_benchmark = True

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py
deleted file mode 100755
index 72c6b7a5..00000000
@@ -1,35 +0,0 @@
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco import dataloader
from ..common.models.mask_rcnn_fpn import model
from ..common.train import train

from detectron2.config import LazyCall as L
from detectron2.modeling.backbone import RegNet
from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock


# Replace default ResNet with RegNetY-4GF from the DDS paper. Config source:
# https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnety/RegNetY-4.0GF_dds_8gpu.yaml#L4-L10  # noqa
model.backbone.bottom_up = L(RegNet)(
    stem_class=SimpleStem,
    stem_width=32,
    block_class=ResBottleneckBlock,
    depth=22,
    w_a=31.41,
    w_0=96,
    w_m=2.24,
    group_width=64,
    se_ratio=0.25,
    freeze_at=2,
    norm="FrozenBN",
    out_features=["s1", "s2", "s3", "s4"],
)
model.pixel_std = [57.375, 57.120, 58.395]

optimizer.weight_decay = 5e-5
train.init_checkpoint = (
    "https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth"
)
# RegNets benefit from enabling cudnn benchmark mode
train.cudnn_benchmark = True
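
The two RegNet configs above swap the backbone by assigning `L(RegNet)(...)` into the model node. `L` (detectron2's `LazyCall`) records a callable plus its arguments without executing anything; construction only happens at `instantiate()`. A minimal illustration with an ordinary callable:

```python
# Hedged sketch: LazyCall defers a call; instantiate() executes it recursively.
from detectron2.config import LazyCall as L, instantiate

deferred = L(dict)(name="regnet-demo", width=32)  # nothing is built yet
print(instantiate(deferred))  # {'name': 'regnet-demo', 'width': 32}
```
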
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml
deleted file mode 100755
index 4e03944a..00000000
@@ -1,15 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  KEYPOINT_ON: True
  ROI_HEADS:
    NUM_CLASSES: 1
  ROI_BOX_HEAD:
    SMOOTH_L1_BETA: 0.5  # Keypoint AP degrades (though box AP improves) when using plain L1 loss
  RPN:
    # Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2.
    # 1000 proposals per-image is found to hurt box AP.
    # Therefore we increase it to 1500 per-image.
    POST_NMS_TOPK_TRAIN: 1500
DATASETS:
  TRAIN: ("keypoints_coco_2017_train",)
  TEST: ("keypoints_coco_2017_val",)

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml
deleted file mode 100755
index 9309535c..00000000
@@ -1,8 +0,0 @@
_BASE_: "Base-Keypoint-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  RESNETS:
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py
deleted file mode 100755
index 1aad53bf..00000000
@@ -1,8 +0,0 @@
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco_keypoint import dataloader
from ..common.models.keypoint_rcnn_fpn import model
from ..common.train import train

model.backbone.bottom_up.freeze_at = 2
train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl"

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index 7bf85cf7..00000000
@@ -1,5 +0,0 @@
_BASE_: "Base-Keypoint-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  RESNETS:
    DEPTH: 50

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml
deleted file mode 100755
index a07f243f..00000000
@@ -1,8 +0,0 @@
_BASE_: "Base-Keypoint-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  RESNETS:
    DEPTH: 50
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml
deleted file mode 100755
index d4bfa20a..00000000
@@ -1,12 +0,0 @@
_BASE_: "Base-Keypoint-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl"
  PIXEL_STD: [57.375, 57.120, 58.395]
  RESNETS:
    STRIDE_IN_1X1: False  # this is a C2 model
    NUM_GROUPS: 32
    WIDTH_PER_GROUP: 8
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml
deleted file mode 100755
index f00d54b7..00000000
@@ -1,11 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  META_ARCHITECTURE: "PanopticFPN"
  MASK_ON: True
  SEM_SEG_HEAD:
    LOSS_WEIGHT: 0.5
DATASETS:
  TRAIN: ("coco_2017_train_panoptic_separated",)
  TEST: ("coco_2017_val_panoptic_separated",)
DATALOADER:
  FILTER_EMPTY_ANNOTATIONS: False

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml
deleted file mode 100755
index 0e01f6fb..00000000
@@ -1,8 +0,0 @@
_BASE_: "Base-Panoptic-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  RESNETS:
    DEPTH: 101
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py
deleted file mode 100755
index 40cf1813..00000000
@@ -1,8 +0,0 @@
from ..common.optim import SGD as optimizer
from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier
from ..common.data.coco_panoptic_separated import dataloader
from ..common.models.panoptic_fpn import model
from ..common.train import train

model.backbone.bottom_up.freeze_at = 2
train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl"

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml
deleted file mode 100755
index 6afa2c1c..00000000
@@ -1,5 +0,0 @@
_BASE_: "Base-Panoptic-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  RESNETS:
    DEPTH: 50

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml
deleted file mode 100755
index b956b3f6..00000000
@@ -1,8 +0,0 @@
_BASE_: "Base-Panoptic-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  RESNETS:
    DEPTH: 50
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml
deleted file mode 100755
index 1a7aaeb9..00000000
@@ -1,27 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  # WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  # For better, more stable performance initialize from COCO
  WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl"
  MASK_ON: True
  ROI_HEADS:
    NUM_CLASSES: 8
# This is similar to the setting used in Mask R-CNN paper, Appendix A
# But there are some differences, e.g., we did not initialize the output
# layer using the corresponding classes from COCO
INPUT:
  MIN_SIZE_TRAIN: (800, 832, 864, 896, 928, 960, 992, 1024)
  MIN_SIZE_TRAIN_SAMPLING: "choice"
  MIN_SIZE_TEST: 1024
  MAX_SIZE_TRAIN: 2048
  MAX_SIZE_TEST: 2048
DATASETS:
  TRAIN: ("cityscapes_fine_instance_seg_train",)
  TEST: ("cityscapes_fine_instance_seg_val",)
SOLVER:
  BASE_LR: 0.01
  STEPS: (18000,)
  MAX_ITER: 24000
  IMS_PER_BATCH: 8
TEST:
  EVAL_PERIOD: 8000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md
deleted file mode 100755
index 924fd00a..00000000
@@ -1,84 +0,0 @@
Detectron2 model zoo's experimental settings and a few implementation details are different from Detectron.

The differences in implementation details are shared in
[Compatibility with Other Libraries](../../docs/notes/compatibility.md).

The differences in model zoo's experimental settings include:
* Use scale augmentation during training. This improves AP with lower training cost.
* Use L1 loss instead of smooth L1 loss for simplicity. This sometimes improves box AP but may affect other AP.
* Use `POOLER_SAMPLING_RATIO=0` instead of 2. This does not significantly affect AP.
* Use `ROIAlignV2`. This does not significantly affect AP.

In this directory, we provide a few configs that __do not__ have the above changes.
They mimic Detectron's behavior as closely as possible, and provide a fair comparison of accuracy and speed against Detectron.

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | kp. AP | model id | download |
|---|---|---|---|---|---|---|---|---|---|
| Faster R-CNN | 1x | 0.219 | 0.038 | 3.1 | 36.9 |  |  | 137781054 | model \| metrics |
| Keypoint R-CNN | 1x | 0.313 | 0.071 | 5.0 | 53.1 |  | 64.2 | 137781195 | model \| metrics |
| Mask R-CNN | 1x | 0.273 | 0.043 | 3.4 | 37.8 | 34.9 |  | 137781281 | model \| metrics |

## Comparisons:

* Faster R-CNN: Detectron's AP is 36.7, similar to ours.
* Keypoint R-CNN: Detectron's AP is box 53.6, keypoint 64.2. Fixing a Detectron [bug](https://github.com/facebookresearch/Detectron/issues/459) leads to a drop in box AP, which can be compensated for by some parameter tuning.
* Mask R-CNN: Detectron's AP is box 37.7, mask 33.9. We're 1 AP better in mask AP, due to a more correct implementation. See [this article](https://ppwwyyxx.com/blog/2021/Where-are-Pixels/) for details.

For speed comparison, see [benchmarks](https://detectron2.readthedocs.io/notes/benchmarks.html).

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml
deleted file mode 100755
index 6ce77f13..00000000
@@ -1,17 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: False
  RESNETS:
    DEPTH: 50
  # Detectron1 uses smooth L1 loss with some magic beta values.
  # The defaults are changed to L1 loss in Detectron2.
  RPN:
    SMOOTH_L1_BETA: 0.1111
  ROI_BOX_HEAD:
    SMOOTH_L1_BETA: 1.0
    POOLER_SAMPLING_RATIO: 2
    POOLER_TYPE: "ROIAlign"
INPUT:
  # no scale augmentation
  MIN_SIZE_TRAIN: (800, )

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index aacf868b..00000000
@@ -1,27 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  KEYPOINT_ON: True
  RESNETS:
    DEPTH: 50
  ROI_HEADS:
    NUM_CLASSES: 1
  ROI_KEYPOINT_HEAD:
    POOLER_RESOLUTION: 14
    POOLER_SAMPLING_RATIO: 2
    POOLER_TYPE: "ROIAlign"
  # Detectron1 uses smooth L1 loss with some magic beta values.
  # The defaults are changed to L1 loss in Detectron2.
  ROI_BOX_HEAD:
    SMOOTH_L1_BETA: 1.0
    POOLER_SAMPLING_RATIO: 2
    POOLER_TYPE: "ROIAlign"
  RPN:
    SMOOTH_L1_BETA: 0.1111
    # Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2
    # 1000 proposals per-image is found to hurt box AP.
    # Therefore we increase it to 1500 per-image.
    POST_NMS_TOPK_TRAIN: 1500
DATASETS:
  TRAIN: ("keypoints_coco_2017_train",)
  TEST: ("keypoints_coco_2017_val",)

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml
deleted file mode 100755
index 4ea86a8d..00000000
@@ -1,20 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  # Detectron1 uses smooth L1 loss with some magic beta values.
  # The defaults are changed to L1 loss in Detectron2.
  RPN:
    SMOOTH_L1_BETA: 0.1111
  ROI_BOX_HEAD:
    SMOOTH_L1_BETA: 1.0
    POOLER_SAMPLING_RATIO: 2
    POOLER_TYPE: "ROIAlign"
  ROI_MASK_HEAD:
    POOLER_SAMPLING_RATIO: 2
    POOLER_TYPE: "ROIAlign"
INPUT:
  # no scale augmentation
  MIN_SIZE_TRAIN: (800, )

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
deleted file mode 100755
index f0c3a1bb..00000000
@@ -1,19 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 101
  ROI_HEADS:
    NUM_CLASSES: 1230
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v0.5_train",)
  TEST: ("lvis_v0.5_val",)
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index 64b4caa4..00000000
@@ -1,19 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  ROI_HEADS:
    NUM_CLASSES: 1230
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v0.5_train",)
  TEST: ("lvis_v0.5_val",)
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
deleted file mode 100755
index c8b822c6..00000000
@@ -1,23 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl"
  PIXEL_STD: [57.375, 57.120, 58.395]
  MASK_ON: True
  RESNETS:
    STRIDE_IN_1X1: False  # this is a C2 model
    NUM_GROUPS: 32
    WIDTH_PER_GROUP: 8
    DEPTH: 101
  ROI_HEADS:
    NUM_CLASSES: 1230
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v0.5_train",)
  TEST: ("lvis_v0.5_val",)
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml
deleted file mode 100755
index ca4dd971..00000000
@@ -1,22 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 101
  ROI_HEADS:
    NUM_CLASSES: 1203
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v1_train",)
  TEST: ("lvis_v1_val",)
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
SOLVER:
  STEPS: (120000, 160000)
  MAX_ITER: 180000  # 180000 * 16 / 100000 ~ 28.8 epochs
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index f313295e..00000000
@@ -1,22 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  ROI_HEADS:
    NUM_CLASSES: 1203
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v1_train",)
  TEST: ("lvis_v1_val",)
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
SOLVER:
  STEPS: (120000, 160000)
  MAX_ITER: 180000  # 180000 * 16 / 100000 ~ 28.8 epochs
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml
deleted file mode 100755
index f6528f7c..00000000
@@ -1,26 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl"
  PIXEL_STD: [57.375, 57.120, 58.395]
  MASK_ON: True
  RESNETS:
    STRIDE_IN_1X1: False  # this is a C2 model
    NUM_GROUPS: 32
    WIDTH_PER_GROUP: 8
    DEPTH: 101
  ROI_HEADS:
    NUM_CLASSES: 1203
    SCORE_THRESH_TEST: 0.0001
INPUT:
  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
DATASETS:
  TRAIN: ("lvis_v1_train",)
  TEST: ("lvis_v1_val",)
SOLVER:
  STEPS: (120000, 160000)
  MAX_ITER: 180000  # 180000 * 16 / 100000 ~ 28.8 epochs
TEST:
  DETECTIONS_PER_IMAGE: 300  # LVIS allows up to 300
DATALOADER:
  SAMPLER_TRAIN: "RepeatFactorTrainingSampler"
  REPEAT_THRESHOLD: 0.001

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml
deleted file mode 100755
index abb33b61..00000000
@@ -1,12 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  ROI_HEADS:
    NAME: CascadeROIHeads
  ROI_BOX_HEAD:
    CLS_AGNOSTIC_BBOX_REG: True
  RPN:
    POST_NMS_TOPK_TRAIN: 2000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml
deleted file mode 100755
index e2201ad5..00000000
@@ -1,15 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  ROI_HEADS:
    NAME: CascadeROIHeads
  ROI_BOX_HEAD:
    CLS_AGNOSTIC_BBOX_REG: True
  RPN:
    POST_NMS_TOPK_TRAIN: 2000
SOLVER:
  STEPS: (210000, 250000)
  MAX_ITER: 270000

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml
deleted file mode 100755
index fc117f6b..00000000
@@ -1,36 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  MASK_ON: True
  WEIGHTS: "catalog://ImageNetPretrained/FAIR/X-152-32x8d-IN5k"
  RESNETS:
    STRIDE_IN_1X1: False  # this is a C2 model
    NUM_GROUPS: 32
    WIDTH_PER_GROUP: 8
    DEPTH: 152
    DEFORM_ON_PER_STAGE: [False, True, True, True]
  ROI_HEADS:
    NAME: "CascadeROIHeads"
  ROI_BOX_HEAD:
    NAME: "FastRCNNConvFCHead"
    NUM_CONV: 4
    NUM_FC: 1
    NORM: "GN"
    CLS_AGNOSTIC_BBOX_REG: True
  ROI_MASK_HEAD:
    NUM_CONV: 8
    NORM: "GN"
  RPN:
    POST_NMS_TOPK_TRAIN: 2000
SOLVER:
  IMS_PER_BATCH: 128
  STEPS: (35000, 45000)
  MAX_ITER: 50000
  BASE_LR: 0.16
INPUT:
  MIN_SIZE_TRAIN: (640, 864)
  MIN_SIZE_TRAIN_SAMPLING: "range"
  MAX_SIZE_TRAIN: 1440
  CROP:
    ENABLED: True
TEST:
  EVAL_PERIOD: 2500

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml
deleted file mode 100755
index 4c3b767f..00000000
@@ -1,10 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
  ROI_BOX_HEAD:
    CLS_AGNOSTIC_BBOX_REG: True
  ROI_MASK_HEAD:
    CLS_AGNOSTIC_MASK: True

diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml
deleted file mode 100755
index 04ff988d..00000000
@@ -1,8 +0,0 @@
_BASE_: "../Base-RCNN-FPN.yaml"
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  MASK_ON: True
  RESNETS:
    DEPTH: 50
    DEFORM_ON_PER_STAGE: [False, True, True, True]  # on Res3,Res4,Res5
    DEFORM_MODULATED: False

diff
--git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml deleted file mode 100755 index 68c0ca58..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - DEFORM_ON_PER_STAGE: [False, True, True, True] # on Res3,Res4,Res5 - DEFORM_MODULATED: False -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml deleted file mode 100755 index 74d274e5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml +++ /dev/null @@ -1,21 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "catalog://ImageNetPretrained/FAIR/R-50-GN" - MASK_ON: True - RESNETS: - DEPTH: 50 - NORM: "GN" - STRIDE_IN_1X1: False - FPN: - NORM: "GN" - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_CONV: 4 - NUM_FC: 1 - NORM: "GN" - ROI_MASK_HEAD: - NORM: "GN" -SOLVER: - # 3x schedule - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml deleted file mode 100755 index 11ebb076..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml +++ /dev/null @@ -1,24 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - NORM: "SyncBN" - STRIDE_IN_1X1: True - FPN: - NORM: "SyncBN" - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_CONV: 4 - NUM_FC: 1 - NORM: "SyncBN" - ROI_MASK_HEAD: - NORM: "SyncBN" -SOLVER: - # 3x schedule - STEPS: (210000, 250000) - MAX_ITER: 270000 -TEST: - PRECISE_BN: - ENABLED: True diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py deleted file mode 100755 index 0f2464be..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py +++ /dev/null @@ -1,151 +0,0 @@ -# An example config to train a mmdetection model using detectron2. 
- -from ..common.data.coco import dataloader -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.optim import SGD as optimizer -from ..common.train import train - -from detectron2.modeling.mmdet_wrapper import MMDetDetector -from detectron2.config import LazyCall as L - -model = L(MMDetDetector)( - detector=dict( - type="MaskRCNN", - pretrained="torchvision://resnet50", - backbone=dict( - type="ResNet", - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - ), - neck=dict(type="FPN", in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5), - rpn_head=dict( - type="RPNHead", - in_channels=256, - feat_channels=256, - anchor_generator=dict( - type="AnchorGenerator", - scales=[8], - ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - ), - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[1.0, 1.0, 1.0, 1.0], - ), - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=True, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - roi_head=dict( - type="StandardRoIHead", - bbox_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=7, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - bbox_head=dict( - type="Shared2FCBBoxHead", - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[0.1, 0.1, 0.2, 0.2], - ), - reg_class_agnostic=False, - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=False, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - mask_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=14, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - mask_head=dict( - type="FCNMaskHead", - num_convs=4, - in_channels=256, - conv_out_channels=256, - num_classes=80, - loss_mask=dict(type="CrossEntropyLoss", use_mask=True, loss_weight=1.0), - ), - ), - # model training and testing settings - train_cfg=dict( - rpn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.7, - neg_iou_thr=0.3, - min_pos_iou=0.3, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=256, - pos_fraction=0.5, - neg_pos_ub=-1, - add_gt_as_proposals=False, - ), - allowed_border=-1, - pos_weight=-1, - debug=False, - ), - rpn_proposal=dict( - nms_pre=2000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.5, - neg_iou_thr=0.5, - min_pos_iou=0.5, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True, - ), - mask_size=28, - pos_weight=-1, - debug=False, - ), - ), - test_cfg=dict( - rpn=dict( - nms_pre=1000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - score_thr=0.05, - nms=dict(type="nms", iou_threshold=0.5), - max_per_img=100, - mask_thr_binary=0.5, - ), - ), - ), - pixel_mean=[123.675, 116.280, 103.530], - pixel_std=[58.395, 57.120, 57.375], -) - -dataloader.train.mapper.image_format = "RGB" # torchvision pretrained model -train.init_checkpoint = None # pretrained model is loaded inside backbone diff 
--git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml deleted file mode 100755 index 34016cea..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml +++ /dev/null @@ -1,26 +0,0 @@ -# A large PanopticFPN for demo purposes. -# Use GN on backbone to support semantic seg. -# Use Cascade + Deform Conv to improve localization. -_BASE_: "../COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml" -MODEL: - WEIGHTS: "catalog://ImageNetPretrained/FAIR/R-101-GN" - RESNETS: - DEPTH: 101 - NORM: "GN" - DEFORM_ON_PER_STAGE: [False, True, True, True] - STRIDE_IN_1X1: False - FPN: - NORM: "GN" - ROI_HEADS: - NAME: CascadeROIHeads - ROI_BOX_HEAD: - CLS_AGNOSTIC_BBOX_REG: True - ROI_MASK_HEAD: - NORM: "GN" - RPN: - POST_NMS_TOPK_TRAIN: 2000 -SOLVER: - STEPS: (105000, 125000) - MAX_ITER: 135000 - IMS_PER_BATCH: 32 - BASE_LR: 0.04 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml deleted file mode 100755 index f3400288..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_gn.yaml" -MODEL: - # Train from random initialization. - WEIGHTS: "" - # It makes sense to divide by STD when training from scratch - # But it seems to make no difference on the results and C2's models didn't do this. - # So we keep things consistent with C2. - # PIXEL_STD: [57.375, 57.12, 58.395] - MASK_ON: True - BACKBONE: - FREEZE_AT: 0 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml deleted file mode 100755 index d90c9ff0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_gn.yaml" -MODEL: - PIXEL_STD: [57.375, 57.12, 58.395] - WEIGHTS: "" - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False - BACKBONE: - FREEZE_AT: 0 -SOLVER: - # 9x schedule - IMS_PER_BATCH: 64 # 4x the standard - STEPS: (187500, 197500) # last 60/4==15k and last 20/4==5k - MAX_ITER: 202500 # 90k * 9 / 4 - BASE_LR: 0.08 -TEST: - EVAL_PERIOD: 2500 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. 
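The scratch "9x" schedules above (and the SyncBN variant that follows) derive all of their numbers from the standard 1x recipe of 90k iterations at 16 images per batch, rescaled for the 4x larger batch. A minimal sketch of that arithmetic, assuming the linear batch-size scaling convention these configs follow; the helper name is ours, for illustration only:

```python
# Illustrative only -- reproduces the config comments "IMS_PER_BATCH: 64  # 4x the standard",
# "STEPS: (187500, 197500)" and "MAX_ITER: 202500  # 90k * 9 / 4".
def scaled_schedule(num_x, batch_size, ref_batch=16, ref_iters=90000):
    scale = batch_size // ref_batch           # 64 // 16 == 4
    max_iter = num_x * ref_iters // scale     # 9 * 90000 // 4 == 202500
    # LR decays during the last 60k and 20k "1x-equivalent" iterations:
    steps = (max_iter - 60000 // scale, max_iter - 20000 // scale)
    return steps, max_iter

print(scaled_schedule(9, 64))  # ((187500, 197500), 202500)
```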
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml deleted file mode 100755 index 60d4e423..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_syncbn.yaml" -MODEL: - PIXEL_STD: [57.375, 57.12, 58.395] - WEIGHTS: "" - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False - BACKBONE: - FREEZE_AT: 0 -SOLVER: - # 9x schedule - IMS_PER_BATCH: 64 # 4x the standard - STEPS: (187500, 197500) # last 60/4==15k and last 20/4==5k - MAX_ITER: 202500 # 90k * 9 / 4 - BASE_LR: 0.08 -TEST: - EVAL_PERIOD: 2500 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml deleted file mode 100755 index ac256e13..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "SemanticSegmentor" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -DATASETS: - TRAIN: ("coco_2017_train_panoptic_stuffonly",) - TEST: ("coco_2017_val_panoptic_stuffonly",) -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py deleted file mode 100755 index 0d75305b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py +++ /dev/null @@ -1,150 +0,0 @@ -""" -An example config file to train an ImageNet classifier with detectron2. -Model and dataloader both come from torchvision. -This shows how to use detectron2 as a general engine for any new models and tasks. - -To run, use the following command: - -python tools/lazyconfig_train_net.py --config-file configs/Misc/torchvision_imagenet_R_50.py \ - --num-gpus 8 dataloader.train.dataset.root=/path/to/imagenet/ - -""" - - -import torch -from torch import nn -from torch.nn import functional as F -from omegaconf import OmegaConf -import torchvision -from torchvision.transforms import transforms as T -from torchvision.models.resnet import ResNet, Bottleneck -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from detectron2.solver import WarmupParamScheduler -from detectron2.solver.build import get_default_optimizer_params -from detectron2.config import LazyCall as L -from detectron2.model_zoo import get_config -from detectron2.data.samplers import TrainingSampler, InferenceSampler -from detectron2.evaluation import DatasetEvaluator -from detectron2.utils import comm - - -""" -Note: Here we put reusable code (models, evaluation, data) together with configs just as a -proof-of-concept, to easily demonstrate what's needed to train an ImageNet classifier in detectron2. -Writing code in configs offers extreme flexibility but is often not a good engineering practice. 
-In practice, you might want to put code in your project and import them instead. -""" - - -def build_data_loader(dataset, batch_size, num_workers, training=True): - return torch.utils.data.DataLoader( - dataset, - sampler=(TrainingSampler if training else InferenceSampler)(len(dataset)), - batch_size=batch_size, - num_workers=num_workers, - pin_memory=True, - ) - - -class ClassificationNet(nn.Module): - def __init__(self, model: nn.Module): - super().__init__() - self.model = model - - @property - def device(self): - return list(self.model.parameters())[0].device - - def forward(self, inputs): - image, label = inputs - pred = self.model(image.to(self.device)) - if self.training: - label = label.to(self.device) - return F.cross_entropy(pred, label) - else: - return pred - - -class ClassificationAcc(DatasetEvaluator): - def reset(self): - self.corr = self.total = 0 - - def process(self, inputs, outputs): - image, label = inputs - self.corr += (outputs.argmax(dim=1).cpu() == label.cpu()).sum().item() - self.total += len(label) - - def evaluate(self): - all_corr_total = comm.all_gather([self.corr, self.total]) - corr = sum(x[0] for x in all_corr_total) - total = sum(x[1] for x in all_corr_total) - return {"accuracy": corr / total} - - -# --- End of code that could be in a project and be imported - - -dataloader = OmegaConf.create() -dataloader.train = L(build_data_loader)( - dataset=L(torchvision.datasets.ImageNet)( - root="/path/to/imagenet", - split="train", - transform=L(T.Compose)( - transforms=[ - L(T.RandomResizedCrop)(size=224), - L(T.RandomHorizontalFlip)(), - T.ToTensor(), - L(T.Normalize)(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)), - ] - ), - ), - batch_size=256 // 8, - num_workers=4, - training=True, -) - -dataloader.test = L(build_data_loader)( - dataset=L(torchvision.datasets.ImageNet)( - root="${...train.dataset.root}", - split="val", - transform=L(T.Compose)( - transforms=[ - L(T.Resize)(size=256), - L(T.CenterCrop)(size=224), - T.ToTensor(), - L(T.Normalize)(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)), - ] - ), - ), - batch_size=256 // 8, - num_workers=4, - training=False, -) - -dataloader.evaluator = L(ClassificationAcc)() - -model = L(ClassificationNet)( - model=(ResNet)(block=Bottleneck, layers=[3, 4, 6, 3], zero_init_residual=True) -) - - -optimizer = L(torch.optim.SGD)( - params=L(get_default_optimizer_params)(), - lr=0.1, - momentum=0.9, - weight_decay=1e-4, -) - -lr_multiplier = L(WarmupParamScheduler)( - scheduler=L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01, 0.001], milestones=[30, 60, 90, 100] - ), - warmup_length=1 / 100, - warmup_factor=0.1, -) - - -train = get_config("common/train.py").train -train.init_checkpoint = None -train.max_iter = 100 * 1281167 // 256 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml deleted file mode 100755 index ea2a6baa..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 20 -INPUT: - MIN_SIZE_TRAIN: (480, 512, 544, 576, 608, 640, 672, 704, 736, 768, 800) - MIN_SIZE_TEST: 800 -DATASETS: - TRAIN: ('voc_2007_trainval', 'voc_2012_trainval') - TEST: 
('voc_2007_test',) -SOLVER: - STEPS: (12000, 16000) - MAX_ITER: 18000 # 17.4 epochs - WARMUP_ITERS: 100 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml deleted file mode 100755 index e554cab1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 20 -INPUT: - MIN_SIZE_TRAIN: (480, 512, 544, 576, 608, 640, 672, 704, 736, 768, 800) - MIN_SIZE_TEST: 800 -DATASETS: - TRAIN: ('voc_2007_trainval', 'voc_2012_trainval') - TEST: ('voc_2007_test',) -SOLVER: - STEPS: (12000, 16000) - MAX_ITER: 18000 # 17.4 epochs - WARMUP_ITERS: 100 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/README.md deleted file mode 100755 index 912cc299..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/README.md +++ /dev/null @@ -1,6 +0,0 @@ -This directory provides definitions for a few common models, dataloaders, schedulers, -and optimizers that are often used in training. -The definitions of these objects are provided in the form of lazy instantiation: -their arguments can be edited by users before constructing the objects. - -They can be imported, or loaded by the `model_zoo.get_config` API, in users' own configs. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/coco_schedule.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/coco_schedule.py deleted file mode 100755 index 355e66a1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/coco_schedule.py +++ /dev/null @@ -1,47 +0,0 @@ -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from detectron2.config import LazyCall as L -from detectron2.solver import WarmupParamScheduler - - -def default_X_scheduler(num_X): - """ - Returns the config for a default multi-step LR scheduler such as "1x", "3x", - commonly referred to in papers, where every 1x has the total length of 1440k - training images (~12 COCO epochs). LR is decayed twice at the end of training - following the strategy defined in "Rethinking ImageNet Pretraining", Sec 4. - - Args: - num_X: a positive real number - - Returns: - DictConfig: configs that define the multiplier for LR during training - """ - # total number of iterations assuming 16 batch size, using 1440000/16=90000 - total_steps_16bs = num_X * 90000 - - if num_X <= 2: - scheduler = L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - # note that the scheduler is scale-invariant. 
This is equivalent to - # milestones=[6, 8, 9] - milestones=[60000, 80000, 90000], - ) - else: - scheduler = L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - milestones=[total_steps_16bs - 60000, total_steps_16bs - 20000, total_steps_16bs], - ) - return L(WarmupParamScheduler)( - scheduler=scheduler, - warmup_length=1000 / total_steps_16bs, - warmup_method="linear", - warmup_factor=0.001, - ) - - -lr_multiplier_1x = default_X_scheduler(1) -lr_multiplier_2x = default_X_scheduler(2) -lr_multiplier_3x = default_X_scheduler(3) -lr_multiplier_6x = default_X_scheduler(6) -lr_multiplier_9x = default_X_scheduler(9) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco.py deleted file mode 100755 index 703c4385..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco.py +++ /dev/null @@ -1,48 +0,0 @@ -from omegaconf import OmegaConf - -import detectron2.data.transforms as T -from detectron2.config import LazyCall as L -from detectron2.data import ( - DatasetMapper, - build_detection_test_loader, - build_detection_train_loader, - get_detection_dataset_dicts, -) -from detectron2.evaluation import COCOEvaluator - -dataloader = OmegaConf.create() - -dataloader.train = L(build_detection_train_loader)( - dataset=L(get_detection_dataset_dicts)(names="coco_2017_train"), - mapper=L(DatasetMapper)( - is_train=True, - augmentations=[ - L(T.ResizeShortestEdge)( - short_edge_length=(640, 672, 704, 736, 768, 800), - sample_style="choice", - max_size=1333, - ), - L(T.RandomFlip)(horizontal=True), - ], - image_format="BGR", - use_instance_mask=True, - ), - total_batch_size=16, - num_workers=4, -) - -dataloader.test = L(build_detection_test_loader)( - dataset=L(get_detection_dataset_dicts)(names="coco_2017_val", filter_empty=False), - mapper=L(DatasetMapper)( - is_train=False, - augmentations=[ - L(T.ResizeShortestEdge)(short_edge_length=800, max_size=1333), - ], - image_format="${...train.mapper.image_format}", - ), - num_workers=4, -) - -dataloader.evaluator = L(COCOEvaluator)( - dataset_name="${..test.dataset.names}", -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_keypoint.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_keypoint.py deleted file mode 100755 index b4ceb066..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_keypoint.py +++ /dev/null @@ -1,13 +0,0 @@ -from detectron2.data.detection_utils import create_keypoint_hflip_indices - -from .coco import dataloader - -dataloader.train.dataset.min_keypoints = 1 -dataloader.train.dataset.names = "keypoints_coco_2017_train" -dataloader.test.dataset.names = "keypoints_coco_2017_val" - -dataloader.train.mapper.update( - use_instance_mask=False, - use_keypoint=True, - keypoint_hflip_indices=create_keypoint_hflip_indices(dataloader.train.dataset.names), -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py deleted file mode 100755 index 5ccbc77e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py +++ /dev/null @@ -1,26 +0,0 @@ -from detectron2.config import LazyCall as L -from 
detectron2.evaluation import ( - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - SemSegEvaluator, -) - -from .coco import dataloader - -dataloader.train.dataset.names = "coco_2017_train_panoptic_separated" -dataloader.train.dataset.filter_empty = False -dataloader.test.dataset.names = "coco_2017_val_panoptic_separated" - - -dataloader.evaluator = [ - L(COCOEvaluator)( - dataset_name="${...test.dataset.names}", - ), - L(SemSegEvaluator)( - dataset_name="${...test.dataset.names}", - ), - L(COCOPanopticEvaluator)( - dataset_name="${...test.dataset.names}", - ), -] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/cascade_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/cascade_rcnn.py deleted file mode 100755 index c7372a80..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/cascade_rcnn.py +++ /dev/null @@ -1,36 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.roi_heads import FastRCNNOutputLayers, FastRCNNConvFCHead, CascadeROIHeads - -from .mask_rcnn_fpn import model - -# arguments that don't exist for Cascade R-CNN -[model.roi_heads.pop(k) for k in ["box_head", "box_predictor", "proposal_matcher"]] - -model.roi_heads.update( - _target_=CascadeROIHeads, - box_heads=[ - L(FastRCNNConvFCHead)( - input_shape=ShapeSpec(channels=256, height=7, width=7), - conv_dims=[], - fc_dims=[1024, 1024], - ) - for k in range(3) - ], - box_predictors=[ - L(FastRCNNOutputLayers)( - input_shape=ShapeSpec(channels=1024), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(w1, w1, w2, w2)), - cls_agnostic_bbox_reg=True, - num_classes="${...num_classes}", - ) - for (w1, w2) in [(10, 5), (20, 10), (30, 15)] - ], - proposal_matchers=[ - L(Matcher)(thresholds=[th], labels=[0, 1], allow_low_quality_matches=False) - for th in [0.5, 0.6, 0.7] - ], -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/fcos.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/fcos.py deleted file mode 100755 index 1c752029..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/fcos.py +++ /dev/null @@ -1,23 +0,0 @@ -from detectron2.modeling.meta_arch.fcos import FCOS, FCOSHead - -from .retinanet import model - -model._target_ = FCOS - -del model.anchor_generator -del model.box2box_transform -del model.anchor_matcher -del model.input_format - -# Use P5 instead of C5 to compute P6/P7 -# (Sec 2.2 of https://arxiv.org/abs/2006.09214) -model.backbone.top_block.in_feature = "p5" -model.backbone.top_block.in_channels = 256 - -# New score threshold determined based on sqrt(cls_score * centerness) -model.test_score_thresh = 0.2 -model.test_nms_thresh = 0.6 - -model.head._target_ = FCOSHead -del model.head.num_anchors -model.head.norm = "GN" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py deleted file mode 100755 index 56b3994d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py +++ /dev/null @@ -1,33 +0,0 @@ -from detectron2.config import 
LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.roi_heads import KRCNNConvDeconvUpsampleHead - -from .mask_rcnn_fpn import model - -[model.roi_heads.pop(x) for x in ["mask_in_features", "mask_pooler", "mask_head"]] - -model.roi_heads.update( - num_classes=1, - keypoint_in_features=["p2", "p3", "p4", "p5"], - keypoint_pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - keypoint_head=L(KRCNNConvDeconvUpsampleHead)( - input_shape=ShapeSpec(channels=256, width=14, height=14), - num_keypoints=17, - conv_dims=[512] * 8, - loss_normalizer="visible", - ), -) - -# Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2. -# 1000 proposals per-image is found to hurt box AP. -# Therefore we increase it to 1500 per-image. -model.proposal_generator.post_nms_topk = (1500, 1000) - -# Keypoint AP degrades (though box AP improves) when using plain L1 loss -model.roi_heads.box_predictor.smooth_l1_beta = 0.5 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py deleted file mode 100755 index a3dcf8be..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py +++ /dev/null @@ -1,88 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone import BasicStem, BottleneckBlock, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.proposal_generator import RPN, StandardRPNHead -from detectron2.modeling.roi_heads import ( - FastRCNNOutputLayers, - MaskRCNNConvUpsampleHead, - Res5ROIHeads, -) - -model = L(GeneralizedRCNN)( - backbone=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res4"], - ), - proposal_generator=L(RPN)( - in_features=["res4"], - head=L(StandardRPNHead)(in_channels=1024, num_anchors=15), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[32, 64, 128, 256, 512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[16], - offset=0.0, - ), - anchor_matcher=L(Matcher)( - thresholds=[0.3, 0.7], labels=[0, -1, 1], allow_low_quality_matches=True - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - batch_size_per_image=256, - positive_fraction=0.5, - pre_nms_topk=(12000, 6000), - post_nms_topk=(2000, 1000), - nms_thresh=0.7, - ), - roi_heads=L(Res5ROIHeads)( - num_classes=80, - batch_size_per_image=512, - positive_fraction=0.25, - proposal_matcher=L(Matcher)( - thresholds=[0.5], labels=[0, 1], allow_low_quality_matches=False - ), - in_features=["res4"], - pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 16,), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - res5=L(ResNet.make_stage)( - block_class=BottleneckBlock, - num_blocks=3, - stride_per_block=[2, 1, 1], - in_channels=1024, - bottleneck_channels=512, - out_channels=2048, - norm="FrozenBN", - stride_in_1x1=True, - ), - 
box_predictor=L(FastRCNNOutputLayers)( - input_shape=L(ShapeSpec)(channels="${...res5.out_channels}", height=1, width=1), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(10, 10, 5, 5)), - num_classes="${..num_classes}", - ), - mask_head=L(MaskRCNNConvUpsampleHead)( - input_shape=L(ShapeSpec)( - channels="${...res5.out_channels}", - width="${...pooler.output_size}", - height="${...pooler.output_size}", - ), - num_classes="${..num_classes}", - conv_dims=[256], - ), - ), - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py deleted file mode 100755 index 744d5306..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py +++ /dev/null @@ -1,93 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone.fpn import LastLevelMaxPool -from detectron2.modeling.backbone import BasicStem, FPN, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.proposal_generator import RPN, StandardRPNHead -from detectron2.modeling.roi_heads import ( - StandardROIHeads, - FastRCNNOutputLayers, - MaskRCNNConvUpsampleHead, - FastRCNNConvFCHead, -) - -model = L(GeneralizedRCNN)( - backbone=L(FPN)( - bottom_up=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res2", "res3", "res4", "res5"], - ), - in_features="${.bottom_up.out_features}", - out_channels=256, - top_block=L(LastLevelMaxPool)(), - ), - proposal_generator=L(RPN)( - in_features=["p2", "p3", "p4", "p5", "p6"], - head=L(StandardRPNHead)(in_channels=256, num_anchors=3), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[32], [64], [128], [256], [512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - offset=0.0, - ), - anchor_matcher=L(Matcher)( - thresholds=[0.3, 0.7], labels=[0, -1, 1], allow_low_quality_matches=True - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - batch_size_per_image=256, - positive_fraction=0.5, - pre_nms_topk=(2000, 1000), - post_nms_topk=(1000, 1000), - nms_thresh=0.7, - ), - roi_heads=L(StandardROIHeads)( - num_classes=80, - batch_size_per_image=512, - positive_fraction=0.25, - proposal_matcher=L(Matcher)( - thresholds=[0.5], labels=[0, 1], allow_low_quality_matches=False - ), - box_in_features=["p2", "p3", "p4", "p5"], - box_pooler=L(ROIPooler)( - output_size=7, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - box_head=L(FastRCNNConvFCHead)( - input_shape=ShapeSpec(channels=256, height=7, width=7), - conv_dims=[], - fc_dims=[1024, 1024], - ), - box_predictor=L(FastRCNNOutputLayers)( - input_shape=ShapeSpec(channels=1024), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(10, 10, 5, 5)), - num_classes="${..num_classes}", - ), - mask_in_features=["p2", "p3", "p4", "p5"], - 
mask_pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - mask_head=L(MaskRCNNConvUpsampleHead)( - input_shape=ShapeSpec(channels=256, width=14, height=14), - num_classes="${..num_classes}", - conv_dims=[256, 256, 256, 256, 256], - ), - ), - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/panoptic_fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/panoptic_fpn.py deleted file mode 100755 index 88f55d2c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/panoptic_fpn.py +++ /dev/null @@ -1,20 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling import PanopticFPN -from detectron2.modeling.meta_arch.semantic_seg import SemSegFPNHead - -from .mask_rcnn_fpn import model - -model._target_ = PanopticFPN -model.sem_seg_head = L(SemSegFPNHead)( - input_shape={ - f: L(ShapeSpec)(stride=s, channels="${....backbone.out_channels}") - for f, s in zip(["p2", "p3", "p4", "p5"], [4, 8, 16, 32]) - }, - ignore_value=255, - num_classes=54, # COCO stuff + 1 - conv_dims=128, - common_stride=4, - loss_weight=0.5, - norm="GN", -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/retinanet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/retinanet.py deleted file mode 100755 index 83cfda4b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/models/retinanet.py +++ /dev/null @@ -1,53 +0,0 @@ -# -*- coding: utf-8 -*- - -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import RetinaNet -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone.fpn import LastLevelP6P7 -from detectron2.modeling.backbone import BasicStem, FPN, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.meta_arch.retinanet import RetinaNetHead - -model = L(RetinaNet)( - backbone=L(FPN)( - bottom_up=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res3", "res4", "res5"], - ), - in_features=["res3", "res4", "res5"], - out_channels=256, - top_block=L(LastLevelP6P7)(in_channels=2048, out_channels="${..out_channels}"), - ), - head=L(RetinaNetHead)( - # Shape for each input feature map - input_shape=[ShapeSpec(channels=256)] * 5, - num_classes="${..num_classes}", - conv_dims=[256, 256, 256, 256], - prior_prob=0.01, - num_anchors=9, - ), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[x, x * 2 ** (1.0 / 3), x * 2 ** (2.0 / 3)] for x in [32, 64, 128, 256, 512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[8, 16, 32, 64, 128], - offset=0.0, - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - anchor_matcher=L(Matcher)( - thresholds=[0.4, 0.5], labels=[0, -1, 1], allow_low_quality_matches=True - ), - num_classes=80, - head_in_features=["p3", "p4", "p5", "p6", "p7"], - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - pixel_mean=[103.530, 116.280, 123.675], - 
pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/optim.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/optim.py deleted file mode 100755 index d39d3aaa..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/optim.py +++ /dev/null @@ -1,15 +0,0 @@ -import torch - -from detectron2.config import LazyCall as L -from detectron2.solver.build import get_default_optimizer_params - -SGD = L(torch.optim.SGD)( - params=L(get_default_optimizer_params)( - # params.model is meant to be set to the model object, before instantiating - # the optimizer. - weight_decay_norm=0.0 - ), - lr=0.02, - momentum=0.9, - weight_decay=1e-4, -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/train.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/train.py deleted file mode 100755 index b6ed02bd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/common/train.py +++ /dev/null @@ -1,18 +0,0 @@ -# Common training-related configs that are designed for "tools/lazyconfig_train_net.py" -# You can use your own instead, together with your own train_net.py -train = dict( - output_dir="./output", - init_checkpoint="", - max_iter=90000, - amp=dict(enabled=False), # options for Automatic Mixed Precision - ddp=dict( # options for DistributedDataParallel - broadcast_buffers=False, - find_unused_parameters=False, - fp16_compression=False, - ), - checkpointer=dict(period=5000, max_to_keep=100), # options for PeriodicCheckpointer - eval_period=5000, - log_period=20, - device="cuda" - # ... -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py deleted file mode 100755 index 3740e9bb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py +++ /dev/null @@ -1,9 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -model.backbone.bottom_up.stages.depth = 101 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py deleted file mode 100755 index 18e5f072..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_101_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - -lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py deleted file mode 100755 index 63c54ee9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from 
.mask_rcnn_R_101_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py deleted file mode 100755 index df7a2aed..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py +++ /dev/null @@ -1,72 +0,0 @@ -import detectron2.data.transforms as T -from detectron2.config.lazy import LazyCall as L -from detectron2.layers.batch_norm import NaiveSyncBatchNorm -from detectron2.solver import WarmupParamScheduler -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_fpn import model -from ..common.optim import SGD as optimizer -from ..common.train import train - -# train from scratch -train.init_checkpoint = "" -train.amp.enabled = True -train.ddp.fp16_compression = True -model.backbone.bottom_up.freeze_at = 0 - -# SyncBN -# fmt: off -model.backbone.bottom_up.stem.norm = \ - model.backbone.bottom_up.stages.norm = \ - model.backbone.norm = "SyncBN" - -# Using NaiveSyncBatchNorm because heads may have empty input. That is not supported by -# torch.nn.SyncBatchNorm. We can remove this after -# https://github.com/pytorch/pytorch/issues/36530 is fixed. -model.roi_heads.box_head.conv_norm = \ - model.roi_heads.mask_head.conv_norm = lambda c: NaiveSyncBatchNorm(c, - stats_mode="N") -# fmt: on - -# 2conv in RPN: -# https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/modeling/architecture/heads.py#L95-L97 # noqa: E501, B950 -model.proposal_generator.head.conv_dims = [-1, -1] - -# 4conv1fc box head -model.roi_heads.box_head.conv_dims = [256, 256, 256, 256] -model.roi_heads.box_head.fc_dims = [1024] - -# resize_and_crop_image in: -# https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/utils/input_utils.py#L127 # noqa: E501, B950 -image_size = 1024 -dataloader.train.mapper.augmentations = [ - L(T.ResizeScale)( - min_scale=0.1, max_scale=2.0, target_height=image_size, target_width=image_size - ), - L(T.FixedSizeCrop)(crop_size=(image_size, image_size)), - L(T.RandomFlip)(horizontal=True), -] - -# recompute boxes due to cropping -dataloader.train.mapper.recompute_boxes = True - -# larger batch-size. -dataloader.train.total_batch_size = 64 - -# Equivalent to 100 epochs. 
-# 100 ep = 184375 iters * 64 images/iter / 118000 images/ep -train.max_iter = 184375 - -lr_multiplier = L(WarmupParamScheduler)( - scheduler=L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - milestones=[163889, 177546], - num_updates=train.max_iter, - ), - warmup_length=500 / train.max_iter, - warmup_factor=0.067, -) - -optimizer.lr = 0.1 -optimizer.weight_decay = 4e-5 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py deleted file mode 100755 index 2a7c376d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - -lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py deleted file mode 100755 index 97586b8f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py deleted file mode 100755 index 2ca1ede2..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter //= 2 # 100ep -> 50ep - -lr_multiplier.scheduler.milestones = [ - milestone // 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py deleted file mode 100755 index ef0b6d16..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py +++ /dev/null @@ -1,29 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) -from detectron2.config import LazyCall as L -from detectron2.modeling.backbone import RegNet -from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock - -# Config source: -# 
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py
deleted file mode 100755
index ef0b6d16..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py
+++ /dev/null
@@ -1,29 +0,0 @@
-from .mask_rcnn_R_50_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-from detectron2.config import LazyCall as L
-from detectron2.modeling.backbone import RegNet
-from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock
-
-# Config source:
-# https://github.com/facebookresearch/detectron2/blob/main/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py  # noqa
-model.backbone.bottom_up = L(RegNet)(
-    stem_class=SimpleStem,
-    stem_width=32,
-    block_class=ResBottleneckBlock,
-    depth=23,
-    w_a=38.65,
-    w_0=96,
-    w_m=2.43,
-    group_width=40,
-    norm="SyncBN",
-    out_features=["s1", "s2", "s3", "s4"],
-)
-model.pixel_std = [57.375, 57.120, 58.395]
-
-# RegNets benefit from enabling cudnn benchmark mode
-train.cudnn_benchmark = True
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py
deleted file mode 100755
index 731320e7..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py
+++ /dev/null
@@ -1,14 +0,0 @@
-from .mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-
-train.max_iter *= 2  # 100ep -> 200ep
-
-lr_multiplier.scheduler.milestones = [
-    milestone * 2 for milestone in lr_multiplier.scheduler.milestones
-]
-lr_multiplier.scheduler.num_updates = train.max_iter
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py
deleted file mode 100755
index 8f369a2a..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py
+++ /dev/null
@@ -1,14 +0,0 @@
-from .mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-
-train.max_iter *= 4  # 100ep -> 400ep
-
-lr_multiplier.scheduler.milestones = [
-    milestone * 4 for milestone in lr_multiplier.scheduler.milestones
-]
-lr_multiplier.scheduler.num_updates = train.max_iter
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py
deleted file mode 100755
index ba2c3274..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py
+++ /dev/null
@@ -1,30 +0,0 @@
-from .mask_rcnn_R_50_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-from detectron2.config import LazyCall as L
-from detectron2.modeling.backbone import RegNet
-from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock
-
-# Config source:
-# https://github.com/facebookresearch/detectron2/blob/main/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py  # noqa
-model.backbone.bottom_up = L(RegNet)(
-    stem_class=SimpleStem,
-    stem_width=32,
-    block_class=ResBottleneckBlock,
-    depth=22,
-    w_a=31.41,
-    w_0=96,
-    w_m=2.24,
-    group_width=64,
-    se_ratio=0.25,
-    norm="SyncBN",
-    out_features=["s1", "s2", "s3", "s4"],
-)
-model.pixel_std = [57.375, 57.120, 58.395]
-
-# RegNets benefit from enabling cudnn benchmark mode
-train.cudnn_benchmark = True
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py
deleted file mode 100755
index b867cc86..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py
+++ /dev/null
@@ -1,14 +0,0 @@
-from .mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-
-train.max_iter *= 2  # 100ep -> 200ep
-
-lr_multiplier.scheduler.milestones = [
-    milestone * 2 for milestone in lr_multiplier.scheduler.milestones
-]
-lr_multiplier.scheduler.num_updates = train.max_iter
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py
deleted file mode 100755
index 7b86ea8c..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py
+++ /dev/null
@@ -1,14 +0,0 @@
-from .mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ import (
-    dataloader,
-    lr_multiplier,
-    model,
-    optimizer,
-    train,
-)
-
-train.max_iter *= 4  # 100ep -> 400ep
-
-lr_multiplier.scheduler.milestones = [
-    milestone * 4 for milestone in lr_multiplier.scheduler.milestones
-]
-lr_multiplier.scheduler.num_updates = train.max_iter
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/README.md
deleted file mode 100755
index 4e6c82ef..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/README.md
+++ /dev/null
@@ -1,8 +0,0 @@
-These are quick configs for performance or accuracy regression tracking purposes.
-
-* `*instance_test.yaml`: can train on 2 GPUs. They are used to test whether the training can
-  successfully finish. They are not expected to produce reasonable training results.
-* `*inference_acc_test.yaml`: They should be run using `--eval-only`. They run inference using pre-trained models and verify
-  the results are as expected.
-* `*training_acc_test.yaml`: They should be trained on 8 GPUs. They finish in about an hour and verify the training accuracy
-  is within the normal range.
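The `*_acc_test.yaml` files deleted below all pin their metrics via `TEST.EXPECTED_RESULTS` entries of the form `[task, metric, reference value, absolute tolerance]`. detectron2 performs this comparison in its own testing utilities (`detectron2.evaluation.testing.verify_results`); the standalone check below is only an illustrative sketch of that contract, not the library's implementation:

```python
# Minimal sketch of how EXPECTED_RESULTS-style entries can be verified.
# This is an assumed, self-contained re-statement of the check, not the
# detectron2 code that the quick-schedule configs actually invoke.

def verify_expected_results(results: dict, expected: list) -> bool:
    # `expected` entries look like ["bbox", "AP", 50.18, 0.02]:
    # task name, metric name, reference value, absolute tolerance.
    ok = True
    for task, metric, value, tolerance in expected:
        actual = results[task][metric]
        if abs(actual - value) > tolerance:
            print(f"{task}/{metric}: got {actual}, expected {value} +/- {tolerance}")
            ok = False
    return ok

if __name__ == "__main__":
    results = {"bbox": {"AP": 50.19}, "segm": {"AP": 43.88}}
    assert verify_expected_results(
        results, [["bbox", "AP", 50.18, 0.02], ["segm", "AP", 43.87, 0.02]]
    )
```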
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index fc5a4116..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://Misc/cascade_mask_rcnn_R_50_FPN_3x/144998488/model_final_480dd8.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 50.18, 0.02], ["segm", "AP", 43.87, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml
deleted file mode 100755
index e41a0fe7..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,11 +0,0 @@
-_BASE_: "../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml"
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index a2f37e5e..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-Detection/fast_rcnn_R_50_FPN_1x/137635226/model_final_e5f7ce.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 45.70, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 52fc0ec0..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,15 +0,0 @@
-_BASE_: "../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  PROPOSAL_FILES_TRAIN: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", )
-  TEST: ("coco_2017_val_100",)
-  PROPOSAL_FILES_TEST: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", )
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index 14cf2aa8..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x/137849621/model_final_a6e10b.pkl"
-DATASETS:
-  TEST: ("keypoints_coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 52.47, 0.02], ["keypoints", "AP", 67.36, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 3dd209f6..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,16 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  KEYPOINT_ON: True
-  ROI_HEADS:
-    NUM_CLASSES: 1
-DATASETS:
-  TRAIN: ("keypoints_coco_2017_val_100",)
-  TEST: ("keypoints_coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml
deleted file mode 100755
index 4b92392f..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml
+++ /dev/null
@@ -1,30 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  KEYPOINT_ON: True
-  RESNETS:
-    DEPTH: 50
-  ROI_HEADS:
-    BATCH_SIZE_PER_IMAGE: 256
-    NUM_CLASSES: 1
-  ROI_KEYPOINT_HEAD:
-    POOLER_RESOLUTION: 14
-    POOLER_SAMPLING_RATIO: 2
-    NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS: False
-    LOSS_WEIGHT: 4.0
-  ROI_BOX_HEAD:
-    SMOOTH_L1_BETA: 1.0  # Keypoint AP degrades when using plain L1 loss
-  RPN:
-    SMOOTH_L1_BETA: 0.2  # Keypoint AP degrades when using plain L1 loss
-DATASETS:
-  TRAIN: ("keypoints_coco_2017_val",)
-  TEST: ("keypoints_coco_2017_val",)
-INPUT:
-  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
-SOLVER:
-  WARMUP_FACTOR: 0.33333333
-  WARMUP_ITERS: 100
-  STEPS: (5500, 5800)
-  MAX_ITER: 6000
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 55.35, 1.0], ["keypoints", "AP", 76.91, 1.0]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml
deleted file mode 100755
index 9bd96287..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml
+++ /dev/null
@@ -1,28 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  KEYPOINT_ON: True
-  RESNETS:
-    DEPTH: 50
-  ROI_HEADS:
-    BATCH_SIZE_PER_IMAGE: 256
-    NUM_CLASSES: 1
-  ROI_KEYPOINT_HEAD:
-    POOLER_RESOLUTION: 14
-    POOLER_SAMPLING_RATIO: 2
-  ROI_BOX_HEAD:
-    SMOOTH_L1_BETA: 1.0  # Keypoint AP degrades when using plain L1 loss
-  RPN:
-    SMOOTH_L1_BETA: 0.2  # Keypoint AP degrades when using plain L1 loss
-DATASETS:
-  TRAIN: ("keypoints_coco_2017_val",)
-  TEST: ("keypoints_coco_2017_val",)
-INPUT:
-  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
-SOLVER:
-  WARMUP_FACTOR: 0.33333333
-  WARMUP_ITERS: 100
-  STEPS: (5500, 5800)
-  MAX_ITER: 6000
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 53.5, 1.0], ["keypoints", "AP", 72.4, 1.0]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml
deleted file mode 100755
index ab6e6981..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml
+++ /dev/null
@@ -1,18 +0,0 @@
-_BASE_: "../Base-RCNN-C4.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  MASK_ON: True
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.001
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-  CLIP_GRADIENTS:
-    ENABLED: True
-    CLIP_TYPE: "value"
-    CLIP_VALUE: 1.0
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml
deleted file mode 100755
index b2d5b7ff..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x/137849525/model_final_4ce675.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 47.37, 0.02], ["segm", "AP", 40.99, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml
deleted file mode 100755
index 6c4f1214..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-_BASE_: "../Base-RCNN-C4.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  MASK_ON: True
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.001
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
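The `*_GCV_instant_test.yaml` config above exercises the `SOLVER.CLIP_GRADIENTS` block. detectron2 wires that setting into the optimizer itself, and `CLIP_TYPE: "value"` effectively amounts to element-wise value clipping as in the bare PyTorch training-step fragment below (the model and loss here are stand-ins for illustration, not the config's actual Mask R-CNN):

```python
# Sketch of what SOLVER.CLIP_GRADIENTS with CLIP_TYPE "value" amounts to,
# shown outside detectron2 with a toy model (assumption for illustration).
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

x, y = torch.randn(8, 4), torch.randn(8, 2)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# CLIP_TYPE: "value", CLIP_VALUE: 1.0 -> clamp each gradient element to [-1, 1]
torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=1.0)
optimizer.step()
```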
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml
deleted file mode 100755
index f68dd8f9..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml
+++ /dev/null
@@ -1,22 +0,0 @@
-_BASE_: "../Base-RCNN-C4.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  ROI_HEADS:
-    BATCH_SIZE_PER_IMAGE: 256
-  MASK_ON: True
-DATASETS:
-  TRAIN: ("coco_2017_val",)
-  TEST: ("coco_2017_val",)
-INPUT:
-  MIN_SIZE_TRAIN: (600,)
-  MAX_SIZE_TRAIN: 1000
-  MIN_SIZE_TEST: 800
-  MAX_SIZE_TEST: 1000
-SOLVER:
-  IMS_PER_BATCH: 8  # base uses 16
-  WARMUP_FACTOR: 0.33333
-  WARMUP_ITERS: 100
-  STEPS: (11000, 11600)
-  MAX_ITER: 12000
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 41.88, 0.7], ["segm", "AP", 33.79, 0.5]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml
deleted file mode 100755
index e3ce6cf9..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x/137849551/model_final_84107b.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 47.44, 0.02], ["segm", "AP", 42.94, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index e5454bfd..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,10 +0,0 @@
-_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 47.34, 0.02], ["segm", "AP", 42.67, 0.02], ["bbox_TTA", "AP", 49.11, 0.02], ["segm_TTA", "AP", 45.04, 0.02]]
-  AUG:
-    ENABLED: True
-    MIN_SIZES: (700, 800)  # to save some time
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 6dbfcde0..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  MASK_ON: True
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml
deleted file mode 100755
index 52f78762..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml
+++ /dev/null
@@ -1,6 +0,0 @@
-_BASE_: "./mask_rcnn_R_50_FPN_training_acc_test.yaml"
-MODEL:
-  ROI_BOX_HEAD:
-    TRAIN_ON_PRED_BOXES: True
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 42.6, 1.0], ["segm", "AP", 35.8, 0.8]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml
deleted file mode 100755
index aadae4ce..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml
+++ /dev/null
@@ -1,21 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  ROI_HEADS:
-    BATCH_SIZE_PER_IMAGE: 256
-  MASK_ON: True
-DATASETS:
-  TRAIN: ("coco_2017_val",)
-  TEST: ("coco_2017_val",)
-INPUT:
-  MIN_SIZE_TRAIN: (600,)
-  MAX_SIZE_TRAIN: 1000
-  MIN_SIZE_TEST: 800
-  MAX_SIZE_TEST: 1000
-SOLVER:
-  WARMUP_FACTOR: 0.3333333
-  WARMUP_ITERS: 100
-  STEPS: (5500, 5800)
-  MAX_ITER: 6000
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 42.5, 1.0], ["segm", "AP", 35.8, 0.8]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml
deleted file mode 100755
index 70874e3a..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-PanopticSegmentation/panoptic_fpn_R_50_3x/139514569/model_final_c10459.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100_panoptic_separated",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 46.47, 0.02], ["segm", "AP", 43.39, 0.02], ["sem_seg", "mIoU", 42.55, 0.02], ["panoptic_seg", "PQ", 38.99, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml
deleted file mode 100755
index 7cdee7bf..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml
+++ /dev/null
@@ -1,19 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  META_ARCHITECTURE: "PanopticFPN"
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  MASK_ON: True
-  RESNETS:
-    DEPTH: 50
-  SEM_SEG_HEAD:
-    LOSS_WEIGHT: 0.5
-DATASETS:
-  TRAIN: ("coco_2017_val_100_panoptic_separated",)
-  TEST: ("coco_2017_val_100_panoptic_separated",)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 1
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml
deleted file mode 100755
index f3bbf301..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml
+++ /dev/null
@@ -1,20 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  META_ARCHITECTURE: "PanopticFPN"
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  MASK_ON: True
-  RESNETS:
-    DEPTH: 50
-  SEM_SEG_HEAD:
-    LOSS_WEIGHT: 0.5
-DATASETS:
-  TRAIN: ("coco_2017_val_panoptic_separated",)
-  TEST: ("coco_2017_val_panoptic_separated",)
-SOLVER:
-  BASE_LR: 0.01
-  WARMUP_FACTOR: 0.001
-  WARMUP_ITERS: 500
-  STEPS: (5500,)
-  MAX_ITER: 7000
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 46.70, 1.1], ["segm", "AP", 39.0, 0.7], ["sem_seg", "mIoU", 64.73, 1.3], ["panoptic_seg", "PQ", 48.13, 0.8]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index cb666c1a..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-Detection/retinanet_R_50_FPN_3x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-Detection/retinanet_R_50_FPN_3x/190397829/model_final_5bd44e.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["bbox", "AP", 44.45, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 8d95c1f6..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,13 +0,0 @@
-_BASE_: "../COCO-Detection/retinanet_R_50_FPN_1x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index c7c3f908..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-_BASE_: "../COCO-Detection/rpn_R_50_FPN_1x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/model_final_02ce48.pkl"
-DATASETS:
-  TEST: ("coco_2017_val_100",)
-TEST:
-  EXPECTED_RESULTS: [["box_proposals", "AR@1000", 58.16, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 402d4324..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,13 +0,0 @@
-_BASE_: "../COCO-Detection/rpn_R_50_FPN_1x.yaml"
-MODEL:
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-DATASETS:
-  TRAIN: ("coco_2017_val_100",)
-  TEST: ("coco_2017_val_100",)
-SOLVER:
-  STEPS: (30,)
-  MAX_ITER: 40
-  BASE_LR: 0.005
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml
deleted file mode 100755
index bca74987..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml
+++ /dev/null
@@ -1,10 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  META_ARCHITECTURE: "SemanticSegmentor"
-  WEIGHTS: "detectron2://semantic_R_50_FPN_1x/111802073/model_final_c18079783c55a94968edc28b7101c5f0.pkl"
-  RESNETS:
-    DEPTH: 50
-DATASETS:
-  TEST: ("coco_2017_val_100_panoptic_stuffonly",)
-TEST:
-  EXPECTED_RESULTS: [["sem_seg", "mIoU", 39.53, 0.02], ["sem_seg", "mACC", 51.50, 0.02]]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml
deleted file mode 100755
index 14ab606f..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml
+++ /dev/null
@@ -1,18 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  META_ARCHITECTURE: "SemanticSegmentor"
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  RESNETS:
-    DEPTH: 50
-DATASETS:
-  TRAIN: ("coco_2017_val_100_panoptic_stuffonly",)
-  TEST: ("coco_2017_val_100_panoptic_stuffonly",)
-INPUT:
-  MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800)
-SOLVER:
-  BASE_LR: 0.005
-  STEPS: (30,)
-  MAX_ITER: 40
-  IMS_PER_BATCH: 4
-DATALOADER:
-  NUM_WORKERS: 2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml
deleted file mode 100755
index 1f78d775..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml
+++ /dev/null
@@ -1,20 +0,0 @@
-_BASE_: "../Base-RCNN-FPN.yaml"
-MODEL:
-  META_ARCHITECTURE: "SemanticSegmentor"
-  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
-  RESNETS:
-    DEPTH: 50
-DATASETS:
-  TRAIN: ("coco_2017_val_panoptic_stuffonly",)
-  TEST: ("coco_2017_val_panoptic_stuffonly",)
-SOLVER:
-  BASE_LR: 0.01
-  WARMUP_FACTOR: 0.001
-  WARMUP_ITERS: 300
-  STEPS: (5500,)
-  MAX_ITER: 7000
-TEST:
-  EXPECTED_RESULTS: [["sem_seg", "mIoU", 76.51, 1.0], ["sem_seg", "mACC", 83.25, 1.0]]
-INPUT:
-  # no scale augmentation
-  MIN_SIZE_TRAIN: (800, )
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/README.md
deleted file mode 100755
index 0eb44cc3..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/README.md
+++ /dev/null
@@ -1,140 +0,0 @@
-# Use Builtin Datasets
-
-A dataset can be used by accessing [DatasetCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.DatasetCatalog)
-for its data, or [MetadataCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.MetadataCatalog) for its metadata (class names, etc).
-This document explains how to setup the builtin datasets so they can be used by the above APIs.
-[Use Custom Datasets](https://detectron2.readthedocs.io/tutorials/datasets.html) gives a deeper dive on how to use `DatasetCatalog` and `MetadataCatalog`,
-and how to add new datasets to them.
-
-Detectron2 has builtin support for a few datasets.
-The datasets are assumed to exist in a directory specified by the environment variable
-`DETECTRON2_DATASETS`.
-Under this directory, detectron2 will look for datasets in the structure described below, if needed.
-```
-$DETECTRON2_DATASETS/
-  coco/
-  lvis/
-  cityscapes/
-  VOC20{07,12}/
-```
-
-You can set the location for builtin datasets by `export DETECTRON2_DATASETS=/path/to/datasets`.
-If left unset, the default is `./datasets` relative to your current working directory.
-
-The [model zoo](https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md)
-contains configs and models that use these builtin datasets.
-
-## Expected dataset structure for [COCO instance/keypoint detection](https://cocodataset.org/#download):
-
-```
-coco/
-  annotations/
-    instances_{train,val}2017.json
-    person_keypoints_{train,val}2017.json
-  {train,val}2017/
-    # image files that are mentioned in the corresponding json
-```
-
-You can use the 2014 version of the dataset as well.
-
-Some of the builtin tests (`dev/run_*_tests.sh`) uses a tiny version of the COCO dataset,
-which you can download with `./datasets/prepare_for_tests.sh`.
-
-## Expected dataset structure for PanopticFPN:
-
-Extract panoptic annotations from [COCO website](https://cocodataset.org/#download)
-into the following structure:
-```
-coco/
-  annotations/
-    panoptic_{train,val}2017.json
-  panoptic_{train,val}2017/  # png annotations
-  panoptic_stuff_{train,val}2017/  # generated by the script mentioned below
-```
-
-Install panopticapi by:
-```
-pip install git+https://github.com/cocodataset/panopticapi.git
-```
-Then, run `python datasets/prepare_panoptic_fpn.py`, to extract semantic annotations from panoptic annotations.
-
-## Expected dataset structure for [LVIS instance segmentation](https://www.lvisdataset.org/dataset):
-```
-coco/
-  {train,val,test}2017/
-lvis/
-  lvis_v0.5_{train,val}.json
-  lvis_v0.5_image_info_test.json
-  lvis_v1_{train,val}.json
-  lvis_v1_image_info_test{,_challenge}.json
-```
-
-Install lvis-api by:
-```
-pip install git+https://github.com/lvis-dataset/lvis-api.git
-```
-
-To evaluate models trained on the COCO dataset using LVIS annotations,
-run `python datasets/prepare_cocofied_lvis.py` to prepare "cocofied" LVIS annotations.
-
-## Expected dataset structure for [cityscapes](https://www.cityscapes-dataset.com/downloads/):
-```
-cityscapes/
-  gtFine/
-    train/
-      aachen/
-        color.png, instanceIds.png, labelIds.png, polygons.json,
-        labelTrainIds.png
-      ...
-    val/
-    test/
-    # below are generated Cityscapes panoptic annotation
-    cityscapes_panoptic_train.json
-    cityscapes_panoptic_train/
-    cityscapes_panoptic_val.json
-    cityscapes_panoptic_val/
-    cityscapes_panoptic_test.json
-    cityscapes_panoptic_test/
-  leftImg8bit/
-    train/
-    val/
-    test/
-```
-Install cityscapes scripts by:
-```
-pip install git+https://github.com/mcordts/cityscapesScripts.git
-```
-
-Note: to create labelTrainIds.png, first prepare the above structure, then run cityscapesescript with:
-```
-CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createTrainIdLabelImgs.py
-```
-These files are not needed for instance segmentation.
-
-Note: to generate Cityscapes panoptic dataset, run cityscapesescript with:
-```
-CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createPanopticImgs.py
-```
-These files are not needed for semantic and instance segmentation.
-
-## Expected dataset structure for [Pascal VOC](http://host.robots.ox.ac.uk/pascal/VOC/index.html):
-```
-VOC20{07,12}/
-  Annotations/
-  ImageSets/
-    Main/
-      trainval.txt
-      test.txt
-      # train.txt or val.txt, if you use these splits
-  JPEGImages/
-```
-
-## Expected dataset structure for [ADE20k Scene Parsing](http://sceneparsing.csail.mit.edu/):
-```
-ADEChallengeData2016/
-  annotations/
-  annotations_detectron2/
-  images/
-  objectInfo150.txt
-```
-The directory `annotations_detectron2` is generated by running `python datasets/prepare_ade20k_sem_seg.py`.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json
deleted file mode 100755
index 95fef092..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json
+++ /dev/null
@@ -1 +0,0 @@
-[{"name": "aerosol_can", "instance_count": 109, "def": "a dispenser that holds a substance under pressure", "synonyms": ["aerosol_can", "spray_can"], "image_count": 64, "id": 1, "frequency": "c", "synset": "aerosol.n.02"}, {"name": "air_conditioner", "instance_count": 1081, "def": "a machine that keeps air cool and dry", "synonyms": ["air_conditioner"], "image_count": 364, "id": 2, "frequency": "f", "synset": "air_conditioner.n.01"}, {"name": "airplane", "instance_count": 3720, "def": "an aircraft that has a fixed wing and is powered by propellers or jets", "synonyms": ["airplane", "aeroplane"], "image_count": 1911, "id": 3, "frequency": "f", "synset": "airplane.n.01"}, {"name": "alarm_clock", "instance_count": 158, "def": "a clock that wakes a sleeper at some preset time", "synonyms": ["alarm_clock"], "image_count": 149, "id": 4, "frequency": "f", "synset": "alarm_clock.n.01"}, {"name": "alcohol", "instance_count": 207, "def": "a liquor or brew containing alcohol as the active agent", "synonyms": ["alcohol", "alcoholic_beverage"], "image_count": 29, "id": 5, "frequency": "c", "synset": "alcohol.n.01"}, {"name": "alligator", "instance_count": 39, "def": "amphibious reptiles related to crocodiles but with shorter broader snouts", "synonyms": ["alligator", "gator"], "image_count": 26, "id": 6, "frequency": "c", "synset": "alligator.n.02"}, {"name": "almond", "instance_count": 1700, "def": "oval-shaped edible seed of the almond tree", "synonyms": ["almond"], "image_count": 59, "id": 7, "frequency": "c", "synset": "almond.n.02"}, {"name": "ambulance", "instance_count": 25, "def": "a vehicle that takes people to and from hospitals", "synonyms": ["ambulance"], "image_count": 22, "id": 8, "frequency": "c", "synset": "ambulance.n.01"}, {"name": "amplifier", "instance_count": 16, "def": "electronic equipment that increases strength of signals", "synonyms": ["amplifier"], "image_count": 12, "id": 9, "frequency": "c", "synset": "amplifier.n.01"}, {"name": "anklet", "instance_count": 39, "def": "an ornament worn around the ankle", "synonyms": ["anklet", "ankle_bracelet"], "image_count": 28, "id": 10, "frequency": "c", "synset": "anklet.n.03"}, {"name": "antenna", "instance_count": 1018, "def": "an electrical device that sends or receives radio or television signals", "synonyms": ["antenna", "aerial", "transmitting_aerial"], "image_count": 505, "id": 11, "frequency": "f", "synset": "antenna.n.01"}, {"name": "apple", "instance_count": 17451, "def": "fruit with red or yellow or green skin and sweet to tart crisp whitish flesh", "synonyms": ["apple"],
"image_count": 1207, "id": 12, "frequency": "f", "synset": "apple.n.01"}, {"name": "applesauce", "instance_count": 7, "def": "puree of stewed apples usually sweetened and spiced", "synonyms": ["applesauce"], "image_count": 4, "id": 13, "frequency": "r", "synset": "applesauce.n.01"}, {"name": "apricot", "instance_count": 62, "def": "downy yellow to rosy-colored fruit resembling a small peach", "synonyms": ["apricot"], "image_count": 10, "id": 14, "frequency": "r", "synset": "apricot.n.02"}, {"name": "apron", "instance_count": 881, "def": "a garment of cloth that is tied about the waist and worn to protect clothing", "synonyms": ["apron"], "image_count": 500, "id": 15, "frequency": "f", "synset": "apron.n.01"}, {"name": "aquarium", "instance_count": 36, "def": "a tank/pool/bowl filled with water for keeping live fish and underwater animals", "synonyms": ["aquarium", "fish_tank"], "image_count": 33, "id": 16, "frequency": "c", "synset": "aquarium.n.01"}, {"name": "arctic_(type_of_shoe)", "instance_count": 8, "def": "a waterproof overshoe that protects shoes from water or snow", "synonyms": ["arctic_(type_of_shoe)", "galosh", "golosh", "rubber_(type_of_shoe)", "gumshoe"], "image_count": 3, "id": 17, "frequency": "r", "synset": "arctic.n.02"}, {"name": "armband", "instance_count": 85, "def": "a band worn around the upper arm", "synonyms": ["armband"], "image_count": 44, "id": 18, "frequency": "c", "synset": "armband.n.02"}, {"name": "armchair", "instance_count": 1112, "def": "chair with a support on each side for arms", "synonyms": ["armchair"], "image_count": 561, "id": 19, "frequency": "f", "synset": "armchair.n.01"}, {"name": "armoire", "instance_count": 11, "def": "a large wardrobe or cabinet", "synonyms": ["armoire"], "image_count": 8, "id": 20, "frequency": "r", "synset": "armoire.n.01"}, {"name": "armor", "instance_count": 23, "def": "protective covering made of metal and used in combat", "synonyms": ["armor", "armour"], "image_count": 9, "id": 21, "frequency": "r", "synset": "armor.n.01"}, {"name": "artichoke", "instance_count": 293, "def": "a thistlelike flower head with edible fleshy leaves and heart", "synonyms": ["artichoke"], "image_count": 33, "id": 22, "frequency": "c", "synset": "artichoke.n.02"}, {"name": "trash_can", "instance_count": 2722, "def": "a bin that holds rubbish until it is collected", "synonyms": ["trash_can", "garbage_can", "wastebin", "dustbin", "trash_barrel", "trash_bin"], "image_count": 1883, "id": 23, "frequency": "f", "synset": "ashcan.n.01"}, {"name": "ashtray", "instance_count": 136, "def": "a receptacle for the ash from smokers' cigars or cigarettes", "synonyms": ["ashtray"], "image_count": 98, "id": 24, "frequency": "c", "synset": "ashtray.n.01"}, {"name": "asparagus", "instance_count": 969, "def": "edible young shoots of the asparagus plant", "synonyms": ["asparagus"], "image_count": 70, "id": 25, "frequency": "c", "synset": "asparagus.n.02"}, {"name": "atomizer", "instance_count": 67, "def": "a dispenser that turns a liquid (such as perfume) into a fine mist", "synonyms": ["atomizer", "atomiser", "spray", "sprayer", "nebulizer", "nebuliser"], "image_count": 46, "id": 26, "frequency": "c", "synset": "atomizer.n.01"}, {"name": "avocado", "instance_count": 1048, "def": "a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed", "synonyms": ["avocado"], "image_count": 117, "id": 27, "frequency": "f", "synset": "avocado.n.01"}, {"name": "award", "instance_count": 163, "def": "a tangible symbol signifying approval 
or distinction", "synonyms": ["award", "accolade"], "image_count": 41, "id": 28, "frequency": "c", "synset": "award.n.02"}, {"name": "awning", "instance_count": 4270, "def": "a canopy made of canvas to shelter people or things from rain or sun", "synonyms": ["awning"], "image_count": 1395, "id": 29, "frequency": "f", "synset": "awning.n.01"}, {"name": "ax", "instance_count": 8, "def": "an edge tool with a heavy bladed head mounted across a handle", "synonyms": ["ax", "axe"], "image_count": 7, "id": 30, "frequency": "r", "synset": "ax.n.01"}, {"name": "baboon", "instance_count": 3, "def": "large terrestrial monkeys having doglike muzzles", "synonyms": ["baboon"], "image_count": 1, "id": 31, "frequency": "r", "synset": "baboon.n.01"}, {"name": "baby_buggy", "instance_count": 447, "def": "a small vehicle with four wheels in which a baby or child is pushed around", "synonyms": ["baby_buggy", "baby_carriage", "perambulator", "pram", "stroller"], "image_count": 314, "id": 32, "frequency": "f", "synset": "baby_buggy.n.01"}, {"name": "basketball_backboard", "instance_count": 42, "def": "a raised vertical board with basket attached; used to play basketball", "synonyms": ["basketball_backboard"], "image_count": 31, "id": 33, "frequency": "c", "synset": "backboard.n.01"}, {"name": "backpack", "instance_count": 3907, "def": "a bag carried by a strap on your back or shoulder", "synonyms": ["backpack", "knapsack", "packsack", "rucksack", "haversack"], "image_count": 1905, "id": 34, "frequency": "f", "synset": "backpack.n.01"}, {"name": "handbag", "instance_count": 3947, "def": "a container used for carrying money and small personal items or accessories", "synonyms": ["handbag", "purse", "pocketbook"], "image_count": 1859, "id": 35, "frequency": "f", "synset": "bag.n.04"}, {"name": "suitcase", "instance_count": 8537, "def": "cases used to carry belongings when traveling", "synonyms": ["suitcase", "baggage", "luggage"], "image_count": 1623, "id": 36, "frequency": "f", "synset": "bag.n.06"}, {"name": "bagel", "instance_count": 372, "def": "glazed yeast-raised doughnut-shaped roll with hard crust", "synonyms": ["bagel", "beigel"], "image_count": 47, "id": 37, "frequency": "c", "synset": "bagel.n.01"}, {"name": "bagpipe", "instance_count": 6, "def": "a tubular wind instrument; the player blows air into a bag and squeezes it out", "synonyms": ["bagpipe"], "image_count": 3, "id": 38, "frequency": "r", "synset": "bagpipe.n.01"}, {"name": "baguet", "instance_count": 9, "def": "narrow French stick loaf", "synonyms": ["baguet", "baguette"], "image_count": 3, "id": 39, "frequency": "r", "synset": "baguet.n.01"}, {"name": "bait", "instance_count": 1, "def": "something used to lure fish or other animals into danger so they can be trapped or killed", "synonyms": ["bait", "lure"], "image_count": 1, "id": 40, "frequency": "r", "synset": "bait.n.02"}, {"name": "ball", "instance_count": 755, "def": "a spherical object used as a plaything", "synonyms": ["ball"], "image_count": 305, "id": 41, "frequency": "f", "synset": "ball.n.06"}, {"name": "ballet_skirt", "instance_count": 12, "def": "very short skirt worn by ballerinas", "synonyms": ["ballet_skirt", "tutu"], "image_count": 6, "id": 42, "frequency": "r", "synset": "ballet_skirt.n.01"}, {"name": "balloon", "instance_count": 1556, "def": "large tough nonrigid bag filled with gas or heated air", "synonyms": ["balloon"], "image_count": 210, "id": 43, "frequency": "f", "synset": "balloon.n.01"}, {"name": "bamboo", "instance_count": 243, "def": "woody tropical grass having 
hollow woody stems", "synonyms": ["bamboo"], "image_count": 36, "id": 44, "frequency": "c", "synset": "bamboo.n.02"}, {"name": "banana", "instance_count": 50552, "def": "elongated crescent-shaped yellow fruit with soft sweet flesh", "synonyms": ["banana"], "image_count": 1787, "id": 45, "frequency": "f", "synset": "banana.n.02"}, {"name": "Band_Aid", "instance_count": 19, "def": "trade name for an adhesive bandage to cover small cuts or blisters", "synonyms": ["Band_Aid"], "image_count": 17, "id": 46, "frequency": "c", "synset": "band_aid.n.01"}, {"name": "bandage", "instance_count": 92, "def": "a piece of soft material that covers and protects an injured part of the body", "synonyms": ["bandage"], "image_count": 51, "id": 47, "frequency": "c", "synset": "bandage.n.01"}, {"name": "bandanna", "instance_count": 219, "def": "large and brightly colored handkerchief; often used as a neckerchief", "synonyms": ["bandanna", "bandana"], "image_count": 138, "id": 48, "frequency": "f", "synset": "bandanna.n.01"}, {"name": "banjo", "instance_count": 3, "def": "a stringed instrument of the guitar family with a long neck and circular body", "synonyms": ["banjo"], "image_count": 3, "id": 49, "frequency": "r", "synset": "banjo.n.01"}, {"name": "banner", "instance_count": 5907, "def": "long strip of cloth or paper used for decoration or advertising", "synonyms": ["banner", "streamer"], "image_count": 1470, "id": 50, "frequency": "f", "synset": "banner.n.01"}, {"name": "barbell", "instance_count": 4, "def": "a bar to which heavy discs are attached at each end; used in weightlifting", "synonyms": ["barbell"], "image_count": 3, "id": 51, "frequency": "r", "synset": "barbell.n.01"}, {"name": "barge", "instance_count": 3, "def": "a flatbottom boat for carrying heavy loads (especially on canals)", "synonyms": ["barge"], "image_count": 2, "id": 52, "frequency": "r", "synset": "barge.n.01"}, {"name": "barrel", "instance_count": 707, "def": "a cylindrical container that holds liquids", "synonyms": ["barrel", "cask"], "image_count": 186, "id": 53, "frequency": "f", "synset": "barrel.n.02"}, {"name": "barrette", "instance_count": 119, "def": "a pin for holding women's hair in place", "synonyms": ["barrette"], "image_count": 76, "id": 54, "frequency": "c", "synset": "barrette.n.01"}, {"name": "barrow", "instance_count": 30, "def": "a cart for carrying small loads; has handles and one or more wheels", "synonyms": ["barrow", "garden_cart", "lawn_cart", "wheelbarrow"], "image_count": 26, "id": 55, "frequency": "c", "synset": "barrow.n.03"}, {"name": "baseball_base", "instance_count": 404, "def": "a place that the runner must touch before scoring", "synonyms": ["baseball_base"], "image_count": 303, "id": 56, "frequency": "f", "synset": "base.n.03"}, {"name": "baseball", "instance_count": 1013, "def": "a ball used in playing baseball", "synonyms": ["baseball"], "image_count": 738, "id": 57, "frequency": "f", "synset": "baseball.n.02"}, {"name": "baseball_bat", "instance_count": 2698, "def": "an implement used in baseball by the batter", "synonyms": ["baseball_bat"], "image_count": 1799, "id": 58, "frequency": "f", "synset": "baseball_bat.n.01"}, {"name": "baseball_cap", "instance_count": 9028, "def": "a cap with a bill", "synonyms": ["baseball_cap", "jockey_cap", "golf_cap"], "image_count": 1934, "id": 59, "frequency": "f", "synset": "baseball_cap.n.01"}, {"name": "baseball_glove", "instance_count": 2536, "def": "the handwear used by fielders in playing baseball", "synonyms": ["baseball_glove", "baseball_mitt"], 
"image_count": 1609, "id": 60, "frequency": "f", "synset": "baseball_glove.n.01"}, {"name": "basket", "instance_count": 3984, "def": "a container that is usually woven and has handles", "synonyms": ["basket", "handbasket"], "image_count": 1622, "id": 61, "frequency": "f", "synset": "basket.n.01"}, {"name": "basketball", "instance_count": 56, "def": "an inflated ball used in playing basketball", "synonyms": ["basketball"], "image_count": 41, "id": 62, "frequency": "c", "synset": "basketball.n.02"}, {"name": "bass_horn", "instance_count": 6, "def": "the lowest brass wind instrument", "synonyms": ["bass_horn", "sousaphone", "tuba"], "image_count": 4, "id": 63, "frequency": "r", "synset": "bass_horn.n.01"}, {"name": "bat_(animal)", "instance_count": 47, "def": "nocturnal mouselike mammal with forelimbs modified to form membranous wings", "synonyms": ["bat_(animal)"], "image_count": 11, "id": 64, "frequency": "c", "synset": "bat.n.01"}, {"name": "bath_mat", "instance_count": 336, "def": "a heavy towel or mat to stand on while drying yourself after a bath", "synonyms": ["bath_mat"], "image_count": 270, "id": 65, "frequency": "f", "synset": "bath_mat.n.01"}, {"name": "bath_towel", "instance_count": 1210, "def": "a large towel; to dry yourself after a bath", "synonyms": ["bath_towel"], "image_count": 349, "id": 66, "frequency": "f", "synset": "bath_towel.n.01"}, {"name": "bathrobe", "instance_count": 53, "def": "a loose-fitting robe of towelling; worn after a bath or swim", "synonyms": ["bathrobe"], "image_count": 42, "id": 67, "frequency": "c", "synset": "bathrobe.n.01"}, {"name": "bathtub", "instance_count": 868, "def": "a large open container that you fill with water and use to wash the body", "synonyms": ["bathtub", "bathing_tub"], "image_count": 823, "id": 68, "frequency": "f", "synset": "bathtub.n.01"}, {"name": "batter_(food)", "instance_count": 26, "def": "a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking", "synonyms": ["batter_(food)"], "image_count": 6, "id": 69, "frequency": "r", "synset": "batter.n.02"}, {"name": "battery", "instance_count": 155, "def": "a portable device that produces electricity", "synonyms": ["battery"], "image_count": 48, "id": 70, "frequency": "c", "synset": "battery.n.02"}, {"name": "beachball", "instance_count": 3, "def": "large and light ball; for play at the seaside", "synonyms": ["beachball"], "image_count": 3, "id": 71, "frequency": "r", "synset": "beach_ball.n.01"}, {"name": "bead", "instance_count": 1371, "def": "a small ball with a hole through the middle used for ornamentation, jewellery, etc.", "synonyms": ["bead"], "image_count": 42, "id": 72, "frequency": "c", "synset": "bead.n.01"}, {"name": "bean_curd", "instance_count": 231, "def": "cheeselike food made of curdled soybean milk", "synonyms": ["bean_curd", "tofu"], "image_count": 24, "id": 73, "frequency": "c", "synset": "bean_curd.n.01"}, {"name": "beanbag", "instance_count": 20, "def": "a bag filled with dried beans or similar items; used in games or to sit on", "synonyms": ["beanbag"], "image_count": 16, "id": 74, "frequency": "c", "synset": "beanbag.n.01"}, {"name": "beanie", "instance_count": 1907, "def": "a small skullcap; formerly worn by schoolboys and college freshmen", "synonyms": ["beanie", "beany"], "image_count": 605, "id": 75, "frequency": "f", "synset": "beanie.n.01"}, {"name": "bear", "instance_count": 1069, "def": "large carnivorous or omnivorous mammals with shaggy coats and claws", "synonyms": ["bear"], "image_count": 646, "id": 76, "frequency": "f", 
"synset": "bear.n.01"}, {"name": "bed", "instance_count": 2137, "def": "a piece of furniture that provides a place to sleep", "synonyms": ["bed"], "image_count": 1765, "id": 77, "frequency": "f", "synset": "bed.n.01"}, {"name": "bedpan", "instance_count": 2, "def": "a shallow vessel used by a bedridden patient for defecation and urination", "synonyms": ["bedpan"], "image_count": 2, "id": 78, "frequency": "r", "synset": "bedpan.n.01"}, {"name": "bedspread", "instance_count": 188, "def": "decorative cover for a bed", "synonyms": ["bedspread", "bedcover", "bed_covering", "counterpane", "spread"], "image_count": 125, "id": 79, "frequency": "f", "synset": "bedspread.n.01"}, {"name": "cow", "instance_count": 8085, "def": "cattle/cow", "synonyms": ["cow"], "image_count": 1420, "id": 80, "frequency": "f", "synset": "beef.n.01"}, {"name": "beef_(food)", "instance_count": 1242, "def": "meat from an adult domestic bovine", "synonyms": ["beef_(food)", "boeuf_(food)"], "image_count": 140, "id": 81, "frequency": "f", "synset": "beef.n.02"}, {"name": "beeper", "instance_count": 4, "def": "an device that beeps when the person carrying it is being paged", "synonyms": ["beeper", "pager"], "image_count": 4, "id": 82, "frequency": "r", "synset": "beeper.n.01"}, {"name": "beer_bottle", "instance_count": 1227, "def": "a bottle that holds beer", "synonyms": ["beer_bottle"], "image_count": 322, "id": 83, "frequency": "f", "synset": "beer_bottle.n.01"}, {"name": "beer_can", "instance_count": 203, "def": "a can that holds beer", "synonyms": ["beer_can"], "image_count": 60, "id": 84, "frequency": "c", "synset": "beer_can.n.01"}, {"name": "beetle", "instance_count": 9, "def": "insect with hard wing covers", "synonyms": ["beetle"], "image_count": 2, "id": 85, "frequency": "r", "synset": "beetle.n.01"}, {"name": "bell", "instance_count": 590, "def": "a hollow device made of metal that makes a ringing sound when struck", "synonyms": ["bell"], "image_count": 231, "id": 86, "frequency": "f", "synset": "bell.n.01"}, {"name": "bell_pepper", "instance_count": 4369, "def": "large bell-shaped sweet pepper in green or red or yellow or orange or black varieties", "synonyms": ["bell_pepper", "capsicum"], "image_count": 333, "id": 87, "frequency": "f", "synset": "bell_pepper.n.02"}, {"name": "belt", "instance_count": 3683, "def": "a band to tie or buckle around the body (usually at the waist)", "synonyms": ["belt"], "image_count": 1941, "id": 88, "frequency": "f", "synset": "belt.n.02"}, {"name": "belt_buckle", "instance_count": 589, "def": "the buckle used to fasten a belt", "synonyms": ["belt_buckle"], "image_count": 367, "id": 89, "frequency": "f", "synset": "belt_buckle.n.01"}, {"name": "bench", "instance_count": 4374, "def": "a long seat for more than one person", "synonyms": ["bench"], "image_count": 1922, "id": 90, "frequency": "f", "synset": "bench.n.01"}, {"name": "beret", "instance_count": 57, "def": "a cap with no brim or bill; made of soft cloth", "synonyms": ["beret"], "image_count": 18, "id": 91, "frequency": "c", "synset": "beret.n.01"}, {"name": "bib", "instance_count": 96, "def": "a napkin tied under the chin of a child while eating", "synonyms": ["bib"], "image_count": 81, "id": 92, "frequency": "c", "synset": "bib.n.02"}, {"name": "Bible", "instance_count": 2, "def": "the sacred writings of the Christian religions", "synonyms": ["Bible"], "image_count": 1, "id": 93, "frequency": "r", "synset": "bible.n.01"}, {"name": "bicycle", "instance_count": 4566, "def": "a wheeled vehicle that has two wheels and is moved by 
foot pedals", "synonyms": ["bicycle", "bike_(bicycle)"], "image_count": 1852, "id": 94, "frequency": "f", "synset": "bicycle.n.01"}, {"name": "visor", "instance_count": 777, "def": "a brim that projects to the front to shade the eyes", "synonyms": ["visor", "vizor"], "image_count": 430, "id": 95, "frequency": "f", "synset": "bill.n.09"}, {"name": "billboard", "instance_count": 1025, "def": "large outdoor signboard", "synonyms": ["billboard"], "image_count": 247, "id": 96, "frequency": "f", "synset": "billboard.n.01"}, {"name": "binder", "instance_count": 311, "def": "holds loose papers or magazines", "synonyms": ["binder", "ring-binder"], "image_count": 94, "id": 97, "frequency": "c", "synset": "binder.n.03"}, {"name": "binoculars", "instance_count": 22, "def": "an optical instrument designed for simultaneous use by both eyes", "synonyms": ["binoculars", "field_glasses", "opera_glasses"], "image_count": 21, "id": 98, "frequency": "c", "synset": "binoculars.n.01"}, {"name": "bird", "instance_count": 11557, "def": "animal characterized by feathers and wings", "synonyms": ["bird"], "image_count": 1821, "id": 99, "frequency": "f", "synset": "bird.n.01"}, {"name": "birdfeeder", "instance_count": 16, "def": "an outdoor device that supplies food for wild birds", "synonyms": ["birdfeeder"], "image_count": 16, "id": 100, "frequency": "c", "synset": "bird_feeder.n.01"}, {"name": "birdbath", "instance_count": 12, "def": "an ornamental basin (usually in a garden) for birds to bathe in", "synonyms": ["birdbath"], "image_count": 12, "id": 101, "frequency": "c", "synset": "birdbath.n.01"}, {"name": "birdcage", "instance_count": 180, "def": "a cage in which a bird can be kept", "synonyms": ["birdcage"], "image_count": 25, "id": 102, "frequency": "c", "synset": "birdcage.n.01"}, {"name": "birdhouse", "instance_count": 60, "def": "a shelter for birds", "synonyms": ["birdhouse"], "image_count": 41, "id": 103, "frequency": "c", "synset": "birdhouse.n.01"}, {"name": "birthday_cake", "instance_count": 311, "def": "decorated cake served at a birthday party", "synonyms": ["birthday_cake"], "image_count": 244, "id": 104, "frequency": "f", "synset": "birthday_cake.n.01"}, {"name": "birthday_card", "instance_count": 23, "def": "a card expressing a birthday greeting", "synonyms": ["birthday_card"], "image_count": 7, "id": 105, "frequency": "r", "synset": "birthday_card.n.01"}, {"name": "pirate_flag", "instance_count": 1, "def": "a flag usually bearing a white skull and crossbones on a black background", "synonyms": ["pirate_flag"], "image_count": 1, "id": 106, "frequency": "r", "synset": "black_flag.n.01"}, {"name": "black_sheep", "instance_count": 214, "def": "sheep with a black coat", "synonyms": ["black_sheep"], "image_count": 40, "id": 107, "frequency": "c", "synset": "black_sheep.n.02"}, {"name": "blackberry", "instance_count": 406, "def": "large sweet black or very dark purple edible aggregate fruit", "synonyms": ["blackberry"], "image_count": 40, "id": 108, "frequency": "c", "synset": "blackberry.n.01"}, {"name": "blackboard", "instance_count": 154, "def": "sheet of slate; for writing with chalk", "synonyms": ["blackboard", "chalkboard"], "image_count": 104, "id": 109, "frequency": "f", "synset": "blackboard.n.01"}, {"name": "blanket", "instance_count": 3075, "def": "bedding that keeps a person warm in bed", "synonyms": ["blanket"], "image_count": 1671, "id": 110, "frequency": "f", "synset": "blanket.n.01"}, {"name": "blazer", "instance_count": 124, "def": "lightweight jacket; often striped in the colors of a 
club or school", "synonyms": ["blazer", "sport_jacket", "sport_coat", "sports_jacket", "sports_coat"], "image_count": 49, "id": 111, "frequency": "c", "synset": "blazer.n.01"}, {"name": "blender", "instance_count": 316, "def": "an electrically powered mixer that mix or chop or liquefy foods", "synonyms": ["blender", "liquidizer", "liquidiser"], "image_count": 243, "id": 112, "frequency": "f", "synset": "blender.n.01"}, {"name": "blimp", "instance_count": 3, "def": "a small nonrigid airship used for observation or as a barrage balloon", "synonyms": ["blimp"], "image_count": 2, "id": 113, "frequency": "r", "synset": "blimp.n.02"}, {"name": "blinker", "instance_count": 1269, "def": "a light that flashes on and off; used as a signal or to send messages", "synonyms": ["blinker", "flasher"], "image_count": 242, "id": 114, "frequency": "f", "synset": "blinker.n.01"}, {"name": "blouse", "instance_count": 623, "def": "a top worn by women", "synonyms": ["blouse"], "image_count": 271, "id": 115, "frequency": "f", "synset": "blouse.n.01"}, {"name": "blueberry", "instance_count": 2114, "def": "sweet edible dark-blue berries of blueberry plants", "synonyms": ["blueberry"], "image_count": 104, "id": 116, "frequency": "f", "synset": "blueberry.n.02"}, {"name": "gameboard", "instance_count": 17, "def": "a flat portable surface (usually rectangular) designed for board games", "synonyms": ["gameboard"], "image_count": 8, "id": 117, "frequency": "r", "synset": "board.n.09"}, {"name": "boat", "instance_count": 9981, "def": "a vessel for travel on water", "synonyms": ["boat", "ship_(boat)"], "image_count": 1758, "id": 118, "frequency": "f", "synset": "boat.n.01"}, {"name": "bob", "instance_count": 2, "def": "a small float usually made of cork; attached to a fishing line", "synonyms": ["bob", "bobber", "bobfloat"], "image_count": 1, "id": 119, "frequency": "r", "synset": "bob.n.05"}, {"name": "bobbin", "instance_count": 190, "def": "a thing around which thread/tape/film or other flexible materials can be wound", "synonyms": ["bobbin", "spool", "reel"], "image_count": 48, "id": 120, "frequency": "c", "synset": "bobbin.n.01"}, {"name": "bobby_pin", "instance_count": 43, "def": "a flat wire hairpin used to hold bobbed hair in place", "synonyms": ["bobby_pin", "hairgrip"], "image_count": 14, "id": 121, "frequency": "c", "synset": "bobby_pin.n.01"}, {"name": "boiled_egg", "instance_count": 125, "def": "egg cooked briefly in the shell in gently boiling water", "synonyms": ["boiled_egg", "coddled_egg"], "image_count": 40, "id": 122, "frequency": "c", "synset": "boiled_egg.n.01"}, {"name": "bolo_tie", "instance_count": 1, "def": "a cord fastened around the neck with an ornamental clasp and worn as a necktie", "synonyms": ["bolo_tie", "bolo", "bola_tie", "bola"], "image_count": 1, "id": 123, "frequency": "r", "synset": "bolo_tie.n.01"}, {"name": "deadbolt", "instance_count": 46, "def": "the part of a lock that is engaged or withdrawn with a key", "synonyms": ["deadbolt"], "image_count": 37, "id": 124, "frequency": "c", "synset": "bolt.n.03"}, {"name": "bolt", "instance_count": 11261, "def": "a screw that screws into a nut to form a fastener", "synonyms": ["bolt"], "image_count": 1510, "id": 125, "frequency": "f", "synset": "bolt.n.06"}, {"name": "bonnet", "instance_count": 10, "def": "a hat tied under the chin", "synonyms": ["bonnet"], "image_count": 6, "id": 126, "frequency": "r", "synset": "bonnet.n.01"}, {"name": "book", "instance_count": 33353, "def": "a written work or composition that has been published", 
"synonyms": ["book"], "image_count": 1903, "id": 127, "frequency": "f", "synset": "book.n.01"}, {"name": "bookcase", "instance_count": 113, "def": "a piece of furniture with shelves for storing books", "synonyms": ["bookcase"], "image_count": 70, "id": 128, "frequency": "c", "synset": "bookcase.n.01"}, {"name": "booklet", "instance_count": 439, "def": "a small book usually having a paper cover", "synonyms": ["booklet", "brochure", "leaflet", "pamphlet"], "image_count": 86, "id": 129, "frequency": "c", "synset": "booklet.n.01"}, {"name": "bookmark", "instance_count": 15, "def": "a marker (a piece of paper or ribbon) placed between the pages of a book", "synonyms": ["bookmark", "bookmarker"], "image_count": 7, "id": 130, "frequency": "r", "synset": "bookmark.n.01"}, {"name": "boom_microphone", "instance_count": 10, "def": "a pole carrying an overhead microphone projected over a film or tv set", "synonyms": ["boom_microphone", "microphone_boom"], "image_count": 5, "id": 131, "frequency": "r", "synset": "boom.n.04"}, {"name": "boot", "instance_count": 4194, "def": "footwear that covers the whole foot and lower leg", "synonyms": ["boot"], "image_count": 1406, "id": 132, "frequency": "f", "synset": "boot.n.01"}, {"name": "bottle", "instance_count": 7969, "def": "a glass or plastic vessel used for storing drinks or other liquids", "synonyms": ["bottle"], "image_count": 1901, "id": 133, "frequency": "f", "synset": "bottle.n.01"}, {"name": "bottle_opener", "instance_count": 15, "def": "an opener for removing caps or corks from bottles", "synonyms": ["bottle_opener"], "image_count": 15, "id": 134, "frequency": "c", "synset": "bottle_opener.n.01"}, {"name": "bouquet", "instance_count": 53, "def": "an arrangement of flowers that is usually given as a present", "synonyms": ["bouquet"], "image_count": 28, "id": 135, "frequency": "c", "synset": "bouquet.n.01"}, {"name": "bow_(weapon)", "instance_count": 6, "def": "a weapon for shooting arrows", "synonyms": ["bow_(weapon)"], "image_count": 6, "id": 136, "frequency": "r", "synset": "bow.n.04"}, {"name": "bow_(decorative_ribbons)", "instance_count": 1144, "def": "a decorative interlacing of ribbons", "synonyms": ["bow_(decorative_ribbons)"], "image_count": 494, "id": 137, "frequency": "f", "synset": "bow.n.08"}, {"name": "bow-tie", "instance_count": 359, "def": "a man's tie that ties in a bow", "synonyms": ["bow-tie", "bowtie"], "image_count": 234, "id": 138, "frequency": "f", "synset": "bow_tie.n.01"}, {"name": "bowl", "instance_count": 5308, "def": "a dish that is round and open at the top for serving foods", "synonyms": ["bowl"], "image_count": 1922, "id": 139, "frequency": "f", "synset": "bowl.n.03"}, {"name": "pipe_bowl", "instance_count": 1, "def": "a small round container that is open at the top for holding tobacco", "synonyms": ["pipe_bowl"], "image_count": 1, "id": 140, "frequency": "r", "synset": "bowl.n.08"}, {"name": "bowler_hat", "instance_count": 89, "def": "a felt hat that is round and hard with a narrow brim", "synonyms": ["bowler_hat", "bowler", "derby_hat", "derby", "plug_hat"], "image_count": 35, "id": 141, "frequency": "c", "synset": "bowler_hat.n.01"}, {"name": "bowling_ball", "instance_count": 38, "def": "a large ball with finger holes used in the sport of bowling", "synonyms": ["bowling_ball"], "image_count": 5, "id": 142, "frequency": "r", "synset": "bowling_ball.n.01"}, {"name": "box", "instance_count": 7855, "def": "a (usually rectangular) container; may have a lid", "synonyms": ["box"], "image_count": 1828, "id": 143, "frequency": 
"f", "synset": "box.n.01"}, {"name": "boxing_glove", "instance_count": 22, "def": "large glove coverings the fists of a fighter worn for the sport of boxing", "synonyms": ["boxing_glove"], "image_count": 8, "id": 144, "frequency": "r", "synset": "boxing_glove.n.01"}, {"name": "suspenders", "instance_count": 88, "def": "elastic straps that hold trousers up (usually used in the plural)", "synonyms": ["suspenders"], "image_count": 63, "id": 145, "frequency": "c", "synset": "brace.n.06"}, {"name": "bracelet", "instance_count": 3219, "def": "jewelry worn around the wrist for decoration", "synonyms": ["bracelet", "bangle"], "image_count": 1668, "id": 146, "frequency": "f", "synset": "bracelet.n.02"}, {"name": "brass_plaque", "instance_count": 4, "def": "a memorial made of brass", "synonyms": ["brass_plaque"], "image_count": 4, "id": 147, "frequency": "r", "synset": "brass.n.07"}, {"name": "brassiere", "instance_count": 118, "def": "an undergarment worn by women to support their breasts", "synonyms": ["brassiere", "bra", "bandeau"], "image_count": 95, "id": 148, "frequency": "c", "synset": "brassiere.n.01"}, {"name": "bread-bin", "instance_count": 17, "def": "a container used to keep bread or cake in", "synonyms": ["bread-bin", "breadbox"], "image_count": 17, "id": 149, "frequency": "c", "synset": "bread-bin.n.01"}, {"name": "bread", "instance_count": 6550, "def": "food made from dough of flour or meal and usually raised with yeast or baking powder and then baked", "synonyms": ["bread"], "image_count": 1567, "id": 150, "frequency": "f", "synset": "bread.n.01"}, {"name": "breechcloth", "instance_count": 3, "def": "a garment that provides covering for the loins", "synonyms": ["breechcloth", "breechclout", "loincloth"], "image_count": 2, "id": 151, "frequency": "r", "synset": "breechcloth.n.01"}, {"name": "bridal_gown", "instance_count": 118, "def": "a gown worn by the bride at a wedding", "synonyms": ["bridal_gown", "wedding_gown", "wedding_dress"], "image_count": 103, "id": 152, "frequency": "f", "synset": "bridal_gown.n.01"}, {"name": "briefcase", "instance_count": 84, "def": "a case with a handle; for carrying papers or files or books", "synonyms": ["briefcase"], "image_count": 50, "id": 153, "frequency": "c", "synset": "briefcase.n.01"}, {"name": "broccoli", "instance_count": 12166, "def": "plant with dense clusters of tight green flower buds", "synonyms": ["broccoli"], "image_count": 1309, "id": 154, "frequency": "f", "synset": "broccoli.n.01"}, {"name": "broach", "instance_count": 9, "def": "a decorative pin worn by women", "synonyms": ["broach"], "image_count": 6, "id": 155, "frequency": "r", "synset": "brooch.n.01"}, {"name": "broom", "instance_count": 144, "def": "bundle of straws or twigs attached to a long handle; used for cleaning", "synonyms": ["broom"], "image_count": 92, "id": 156, "frequency": "c", "synset": "broom.n.01"}, {"name": "brownie", "instance_count": 217, "def": "square or bar of very rich chocolate cake usually with nuts", "synonyms": ["brownie"], "image_count": 19, "id": 157, "frequency": "c", "synset": "brownie.n.03"}, {"name": "brussels_sprouts", "instance_count": 590, "def": "the small edible cabbage-like buds growing along a stalk", "synonyms": ["brussels_sprouts"], "image_count": 37, "id": 158, "frequency": "c", "synset": "brussels_sprouts.n.01"}, {"name": "bubble_gum", "instance_count": 4, "def": "a kind of chewing gum that can be blown into bubbles", "synonyms": ["bubble_gum"], "image_count": 4, "id": 159, "frequency": "r", "synset": "bubble_gum.n.01"}, {"name": 
"bucket", "instance_count": 1346, "def": "a roughly cylindrical vessel that is open at the top", "synonyms": ["bucket", "pail"], "image_count": 709, "id": 160, "frequency": "f", "synset": "bucket.n.01"}, {"name": "horse_buggy", "instance_count": 19, "def": "a small lightweight carriage; drawn by a single horse", "synonyms": ["horse_buggy"], "image_count": 9, "id": 161, "frequency": "r", "synset": "buggy.n.01"}, {"name": "bull", "instance_count": 230, "def": "a cow with horns", "synonyms": ["horned_cow"], "image_count": 82, "id": 162, "frequency": "c", "synset": "bull.n.11"}, {"name": "bulldog", "instance_count": 21, "def": "a thickset short-haired dog with a large head and strong undershot lower jaw", "synonyms": ["bulldog"], "image_count": 15, "id": 163, "frequency": "c", "synset": "bulldog.n.01"}, {"name": "bulldozer", "instance_count": 4, "def": "large powerful tractor; a large blade in front flattens areas of ground", "synonyms": ["bulldozer", "dozer"], "image_count": 3, "id": 164, "frequency": "r", "synset": "bulldozer.n.01"}, {"name": "bullet_train", "instance_count": 80, "def": "a high-speed passenger train", "synonyms": ["bullet_train"], "image_count": 61, "id": 165, "frequency": "c", "synset": "bullet_train.n.01"}, {"name": "bulletin_board", "instance_count": 76, "def": "a board that hangs on a wall; displays announcements", "synonyms": ["bulletin_board", "notice_board"], "image_count": 51, "id": 166, "frequency": "c", "synset": "bulletin_board.n.02"}, {"name": "bulletproof_vest", "instance_count": 27, "def": "a vest capable of resisting the impact of a bullet", "synonyms": ["bulletproof_vest"], "image_count": 5, "id": 167, "frequency": "r", "synset": "bulletproof_vest.n.01"}, {"name": "bullhorn", "instance_count": 15, "def": "a portable loudspeaker with built-in microphone and amplifier", "synonyms": ["bullhorn", "megaphone"], "image_count": 13, "id": 168, "frequency": "c", "synset": "bullhorn.n.01"}, {"name": "bun", "instance_count": 1780, "def": "small rounded bread either plain or sweet", "synonyms": ["bun", "roll"], "image_count": 642, "id": 169, "frequency": "f", "synset": "bun.n.01"}, {"name": "bunk_bed", "instance_count": 44, "def": "beds built one above the other", "synonyms": ["bunk_bed"], "image_count": 24, "id": 170, "frequency": "c", "synset": "bunk_bed.n.01"}, {"name": "buoy", "instance_count": 1404, "def": "a float attached by rope to the seabed to mark channels in a harbor or underwater hazards", "synonyms": ["buoy"], "image_count": 255, "id": 171, "frequency": "f", "synset": "buoy.n.01"}, {"name": "burrito", "instance_count": 14, "def": "a flour tortilla folded around a filling", "synonyms": ["burrito"], "image_count": 9, "id": 172, "frequency": "r", "synset": "burrito.n.01"}, {"name": "bus_(vehicle)", "instance_count": 3281, "def": "a vehicle carrying many passengers; used for public transport", "synonyms": ["bus_(vehicle)", "autobus", "charabanc", "double-decker", "motorbus", "motorcoach"], "image_count": 1808, "id": 173, "frequency": "f", "synset": "bus.n.01"}, {"name": "business_card", "instance_count": 84, "def": "a card on which are printed the person's name and business affiliation", "synonyms": ["business_card"], "image_count": 31, "id": 174, "frequency": "c", "synset": "business_card.n.01"}, {"name": "butter", "instance_count": 308, "def": "an edible emulsion of fat globules made by churning milk or cream; for cooking and table use", "synonyms": ["butter"], "image_count": 158, "id": 175, "frequency": "f", "synset": "butter.n.01"}, {"name": "butterfly", 
"instance_count": 296, "def": "insect typically having a slender body with knobbed antennae and broad colorful wings", "synonyms": ["butterfly"], "image_count": 80, "id": 176, "frequency": "c", "synset": "butterfly.n.01"}, {"name": "button", "instance_count": 7884, "def": "a round fastener sewn to shirts and coats etc to fit through buttonholes", "synonyms": ["button"], "image_count": 1884, "id": 177, "frequency": "f", "synset": "button.n.01"}, {"name": "cab_(taxi)", "instance_count": 414, "def": "a car that takes passengers where they want to go in exchange for money", "synonyms": ["cab_(taxi)", "taxi", "taxicab"], "image_count": 158, "id": 178, "frequency": "f", "synset": "cab.n.03"}, {"name": "cabana", "instance_count": 20, "def": "a small tent used as a dressing room beside the sea or a swimming pool", "synonyms": ["cabana"], "image_count": 2, "id": 179, "frequency": "r", "synset": "cabana.n.01"}, {"name": "cabin_car", "instance_count": 14, "def": "a car on a freight train for use of the train crew; usually the last car on the train", "synonyms": ["cabin_car", "caboose"], "image_count": 12, "id": 180, "frequency": "c", "synset": "cabin_car.n.01"}, {"name": "cabinet", "instance_count": 7371, "def": "a piece of furniture resembling a cupboard with doors and shelves and drawers", "synonyms": ["cabinet"], "image_count": 1659, "id": 181, "frequency": "f", "synset": "cabinet.n.01"}, {"name": "locker", "instance_count": 95, "def": "a storage compartment for clothes and valuables; usually it has a lock", "synonyms": ["locker", "storage_locker"], "image_count": 7, "id": 182, "frequency": "r", "synset": "cabinet.n.03"}, {"name": "cake", "instance_count": 2297, "def": "baked goods made from or based on a mixture of flour, sugar, eggs, and fat", "synonyms": ["cake"], "image_count": 834, "id": 183, "frequency": "f", "synset": "cake.n.03"}, {"name": "calculator", "instance_count": 60, "def": "a small machine that is used for mathematical calculations", "synonyms": ["calculator"], "image_count": 57, "id": 184, "frequency": "c", "synset": "calculator.n.02"}, {"name": "calendar", "instance_count": 251, "def": "a list or register of events (appointments/social events/court cases, etc)", "synonyms": ["calendar"], "image_count": 174, "id": 185, "frequency": "f", "synset": "calendar.n.02"}, {"name": "calf", "instance_count": 301, "def": "young of domestic cattle", "synonyms": ["calf"], "image_count": 95, "id": 186, "frequency": "c", "synset": "calf.n.01"}, {"name": "camcorder", "instance_count": 45, "def": "a portable television camera and videocassette recorder", "synonyms": ["camcorder"], "image_count": 27, "id": 187, "frequency": "c", "synset": "camcorder.n.01"}, {"name": "camel", "instance_count": 34, "def": "cud-chewing mammal used as a draft or saddle animal in desert regions", "synonyms": ["camel"], "image_count": 22, "id": 188, "frequency": "c", "synset": "camel.n.01"}, {"name": "camera", "instance_count": 2471, "def": "equipment for taking photographs", "synonyms": ["camera"], "image_count": 1391, "id": 189, "frequency": "f", "synset": "camera.n.01"}, {"name": "camera_lens", "instance_count": 167, "def": "a lens that focuses the image in a camera", "synonyms": ["camera_lens"], "image_count": 90, "id": 190, "frequency": "c", "synset": "camera_lens.n.01"}, {"name": "camper_(vehicle)", "instance_count": 102, "def": "a recreational vehicle equipped for camping out while traveling", "synonyms": ["camper_(vehicle)", "camping_bus", "motor_home"], "image_count": 40, "id": 191, "frequency": "c", "synset": 
"camper.n.02"}, {"name": "can", "instance_count": 1424, "def": "airtight sealed metal container for food or drink or paint etc.", "synonyms": ["can", "tin_can"], "image_count": 445, "id": 192, "frequency": "f", "synset": "can.n.01"}, {"name": "can_opener", "instance_count": 22, "def": "a device for cutting cans open", "synonyms": ["can_opener", "tin_opener"], "image_count": 21, "id": 193, "frequency": "c", "synset": "can_opener.n.01"}, {"name": "candle", "instance_count": 4288, "def": "stick of wax with a wick in the middle", "synonyms": ["candle", "candlestick"], "image_count": 1132, "id": 194, "frequency": "f", "synset": "candle.n.01"}, {"name": "candle_holder", "instance_count": 530, "def": "a holder with sockets for candles", "synonyms": ["candle_holder"], "image_count": 177, "id": 195, "frequency": "f", "synset": "candlestick.n.01"}, {"name": "candy_bar", "instance_count": 29, "def": "a candy shaped as a bar", "synonyms": ["candy_bar"], "image_count": 4, "id": 196, "frequency": "r", "synset": "candy_bar.n.01"}, {"name": "candy_cane", "instance_count": 107, "def": "a hard candy in the shape of a rod (usually with stripes)", "synonyms": ["candy_cane"], "image_count": 17, "id": 197, "frequency": "c", "synset": "candy_cane.n.01"}, {"name": "walking_cane", "instance_count": 106, "def": "a stick that people can lean on to help them walk", "synonyms": ["walking_cane"], "image_count": 84, "id": 198, "frequency": "c", "synset": "cane.n.01"}, {"name": "canister", "instance_count": 218, "def": "metal container for storing dry foods such as tea or flour", "synonyms": ["canister", "cannister"], "image_count": 55, "id": 199, "frequency": "c", "synset": "canister.n.02"}, {"name": "canoe", "instance_count": 96, "def": "small and light boat; pointed at both ends; propelled with a paddle", "synonyms": ["canoe"], "image_count": 30, "id": 200, "frequency": "c", "synset": "canoe.n.01"}, {"name": "cantaloup", "instance_count": 193, "def": "the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh", "synonyms": ["cantaloup", "cantaloupe"], "image_count": 25, "id": 201, "frequency": "c", "synset": "cantaloup.n.02"}, {"name": "canteen", "instance_count": 2, "def": "a flask for carrying water; used by soldiers or travelers", "synonyms": ["canteen"], "image_count": 2, "id": 202, "frequency": "r", "synset": "canteen.n.01"}, {"name": "cap_(headwear)", "instance_count": 636, "def": "a tight-fitting headwear", "synonyms": ["cap_(headwear)"], "image_count": 125, "id": 203, "frequency": "f", "synset": "cap.n.01"}, {"name": "bottle_cap", "instance_count": 5293, "def": "a top (as for a bottle)", "synonyms": ["bottle_cap", "cap_(container_lid)"], "image_count": 1135, "id": 204, "frequency": "f", "synset": "cap.n.02"}, {"name": "cape", "instance_count": 27, "def": "a sleeveless garment like a cloak but shorter", "synonyms": ["cape"], "image_count": 19, "id": 205, "frequency": "c", "synset": "cape.n.02"}, {"name": "cappuccino", "instance_count": 87, "def": "equal parts of espresso and steamed milk", "synonyms": ["cappuccino", "coffee_cappuccino"], "image_count": 72, "id": 206, "frequency": "c", "synset": "cappuccino.n.01"}, {"name": "car_(automobile)", "instance_count": 10528, "def": "a motor vehicle with four wheels", "synonyms": ["car_(automobile)", "auto_(automobile)", "automobile"], "image_count": 1926, "id": 207, "frequency": "f", "synset": "car.n.01"}, {"name": "railcar_(part_of_a_train)", "instance_count": 928, "def": "a wheeled vehicle adapted to the rails of railroad (mark each individual 
railcar separately)", "synonyms": ["railcar_(part_of_a_train)", "railway_car_(part_of_a_train)", "railroad_car_(part_of_a_train)"], "image_count": 159, "id": 208, "frequency": "f", "synset": "car.n.02"}, {"name": "elevator_car", "instance_count": 10, "def": "where passengers ride up and down", "synonyms": ["elevator_car"], "image_count": 7, "id": 209, "frequency": "r", "synset": "car.n.04"}, {"name": "car_battery", "instance_count": 1, "def": "a battery in a motor vehicle", "synonyms": ["car_battery", "automobile_battery"], "image_count": 1, "id": 210, "frequency": "r", "synset": "car_battery.n.01"}, {"name": "identity_card", "instance_count": 16, "def": "a card certifying the identity of the bearer", "synonyms": ["identity_card"], "image_count": 13, "id": 211, "frequency": "c", "synset": "card.n.02"}, {"name": "card", "instance_count": 122, "def": "a rectangular piece of paper used to send messages (e.g. greetings or pictures)", "synonyms": ["card"], "image_count": 35, "id": 212, "frequency": "c", "synset": "card.n.03"}, {"name": "cardigan", "instance_count": 22, "def": "knitted jacket that is fastened up the front with buttons or a zipper", "synonyms": ["cardigan"], "image_count": 18, "id": 213, "frequency": "c", "synset": "cardigan.n.01"}, {"name": "cargo_ship", "instance_count": 15, "def": "a ship designed to carry cargo", "synonyms": ["cargo_ship", "cargo_vessel"], "image_count": 8, "id": 214, "frequency": "r", "synset": "cargo_ship.n.01"}, {"name": "carnation", "instance_count": 22, "def": "plant with pink to purple-red spice-scented usually double flowers", "synonyms": ["carnation"], "image_count": 6, "id": 215, "frequency": "r", "synset": "carnation.n.01"}, {"name": "horse_carriage", "instance_count": 49, "def": "a vehicle with wheels drawn by one or more horses", "synonyms": ["horse_carriage"], "image_count": 35, "id": 216, "frequency": "c", "synset": "carriage.n.02"}, {"name": "carrot", "instance_count": 18049, "def": "deep orange edible root of the cultivated carrot plant", "synonyms": ["carrot"], "image_count": 1222, "id": 217, "frequency": "f", "synset": "carrot.n.01"}, {"name": "tote_bag", "instance_count": 231, "def": "a capacious bag or basket", "synonyms": ["tote_bag"], "image_count": 103, "id": 218, "frequency": "f", "synset": "carryall.n.01"}, {"name": "cart", "instance_count": 51, "def": "a heavy open wagon usually having two wheels and drawn by an animal", "synonyms": ["cart"], "image_count": 28, "id": 219, "frequency": "c", "synset": "cart.n.01"}, {"name": "carton", "instance_count": 206, "def": "a container made of cardboard for holding food or drink", "synonyms": ["carton"], "image_count": 63, "id": 220, "frequency": "c", "synset": "carton.n.02"}, {"name": "cash_register", "instance_count": 33, "def": "a cashbox with an adding machine to register transactions", "synonyms": ["cash_register", "register_(for_cash_transactions)"], "image_count": 28, "id": 221, "frequency": "c", "synset": "cash_register.n.01"}, {"name": "casserole", "instance_count": 12, "def": "food cooked and served in a casserole", "synonyms": ["casserole"], "image_count": 5, "id": 222, "frequency": "r", "synset": "casserole.n.01"}, {"name": "cassette", "instance_count": 74, "def": "a container that holds a magnetic tape used for recording or playing sound or video", "synonyms": ["cassette"], "image_count": 7, "id": 223, "frequency": "r", "synset": "cassette.n.01"}, {"name": "cast", "instance_count": 15, "def": "bandage consisting of a firm covering that immobilizes broken bones while they heal", 
"synonyms": ["cast", "plaster_cast", "plaster_bandage"], "image_count": 14, "id": 224, "frequency": "c", "synset": "cast.n.05"}, {"name": "cat", "instance_count": 2387, "def": "a domestic house cat", "synonyms": ["cat"], "image_count": 1918, "id": 225, "frequency": "f", "synset": "cat.n.01"}, {"name": "cauliflower", "instance_count": 1035, "def": "edible compact head of white undeveloped flowers", "synonyms": ["cauliflower"], "image_count": 133, "id": 226, "frequency": "f", "synset": "cauliflower.n.02"}, {"name": "cayenne_(spice)", "instance_count": 49, "def": "ground pods and seeds of pungent red peppers of the genus Capsicum", "synonyms": ["cayenne_(spice)", "cayenne_pepper_(spice)", "red_pepper_(spice)"], "image_count": 16, "id": 227, "frequency": "c", "synset": "cayenne.n.02"}, {"name": "CD_player", "instance_count": 37, "def": "electronic equipment for playing compact discs (CDs)", "synonyms": ["CD_player"], "image_count": 27, "id": 228, "frequency": "c", "synset": "cd_player.n.01"}, {"name": "celery", "instance_count": 911, "def": "widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked", "synonyms": ["celery"], "image_count": 110, "id": 229, "frequency": "f", "synset": "celery.n.01"}, {"name": "cellular_telephone", "instance_count": 2902, "def": "a hand-held mobile telephone", "synonyms": ["cellular_telephone", "cellular_phone", "cellphone", "mobile_phone", "smart_phone"], "image_count": 1895, "id": 230, "frequency": "f", "synset": "cellular_telephone.n.01"}, {"name": "chain_mail", "instance_count": 13, "def": "(Middle Ages) flexible armor made of interlinked metal rings", "synonyms": ["chain_mail", "ring_mail", "chain_armor", "chain_armour", "ring_armor", "ring_armour"], "image_count": 4, "id": 231, "frequency": "r", "synset": "chain_mail.n.01"}, {"name": "chair", "instance_count": 11549, "def": "a seat for one person, with a support for the back", "synonyms": ["chair"], "image_count": 1927, "id": 232, "frequency": "f", "synset": "chair.n.01"}, {"name": "chaise_longue", "instance_count": 15, "def": "a long chair; for reclining", "synonyms": ["chaise_longue", "chaise", "daybed"], "image_count": 8, "id": 233, "frequency": "r", "synset": "chaise_longue.n.01"}, {"name": "chalice", "instance_count": 1, "def": "a bowl-shaped drinking vessel; especially the Eucharistic cup", "synonyms": ["chalice"], "image_count": 1, "id": 234, "frequency": "r", "synset": "chalice.n.01"}, {"name": "chandelier", "instance_count": 392, "def": "branched lighting fixture; often ornate; hangs from the ceiling", "synonyms": ["chandelier"], "image_count": 263, "id": 235, "frequency": "f", "synset": "chandelier.n.01"}, {"name": "chap", "instance_count": 19, "def": "leather leggings without a seat; worn over trousers by cowboys to protect their legs", "synonyms": ["chap"], "image_count": 10, "id": 236, "frequency": "r", "synset": "chap.n.04"}, {"name": "checkbook", "instance_count": 2, "def": "a book issued to holders of checking accounts", "synonyms": ["checkbook", "chequebook"], "image_count": 2, "id": 237, "frequency": "r", "synset": "checkbook.n.01"}, {"name": "checkerboard", "instance_count": 3, "def": "a board having 64 squares of two alternating colors", "synonyms": ["checkerboard"], "image_count": 3, "id": 238, "frequency": "r", "synset": "checkerboard.n.01"}, {"name": "cherry", "instance_count": 903, "def": "a red fruit with a single hard stone", "synonyms": ["cherry"], "image_count": 87, "id": 239, "frequency": "c", "synset": "cherry.n.03"}, {"name": "chessboard", "instance_count": 
13, "def": "a checkerboard used to play chess", "synonyms": ["chessboard"], "image_count": 9, "id": 240, "frequency": "r", "synset": "chessboard.n.01"}, {"name": "chicken_(animal)", "instance_count": 417, "def": "a domestic fowl bred for flesh or eggs", "synonyms": ["chicken_(animal)"], "image_count": 71, "id": 241, "frequency": "c", "synset": "chicken.n.02"}, {"name": "chickpea", "instance_count": 265, "def": "the seed of the chickpea plant; usually dried", "synonyms": ["chickpea", "garbanzo"], "image_count": 13, "id": 242, "frequency": "c", "synset": "chickpea.n.01"}, {"name": "chili_(vegetable)", "instance_count": 354, "def": "very hot and finely tapering pepper of special pungency", "synonyms": ["chili_(vegetable)", "chili_pepper_(vegetable)", "chilli_(vegetable)", "chilly_(vegetable)", "chile_(vegetable)"], "image_count": 18, "id": 243, "frequency": "c", "synset": "chili.n.02"}, {"name": "chime", "instance_count": 2, "def": "an instrument consisting of a set of bells that are struck with a hammer", "synonyms": ["chime", "gong"], "image_count": 2, "id": 244, "frequency": "r", "synset": "chime.n.01"}, {"name": "chinaware", "instance_count": 41, "def": "dishware made of high quality porcelain", "synonyms": ["chinaware"], "image_count": 5, "id": 245, "frequency": "r", "synset": "chinaware.n.01"}, {"name": "crisp_(potato_chip)", "instance_count": 541, "def": "a thin crisp slice of potato fried in deep fat", "synonyms": ["crisp_(potato_chip)", "potato_chip"], "image_count": 45, "id": 246, "frequency": "c", "synset": "chip.n.04"}, {"name": "poker_chip", "instance_count": 21, "def": "a small disk-shaped counter used to represent money when gambling", "synonyms": ["poker_chip"], "image_count": 1, "id": 247, "frequency": "r", "synset": "chip.n.06"}, {"name": "chocolate_bar", "instance_count": 179, "def": "a bar of chocolate candy", "synonyms": ["chocolate_bar"], "image_count": 23, "id": 248, "frequency": "c", "synset": "chocolate_bar.n.01"}, {"name": "chocolate_cake", "instance_count": 80, "def": "cake containing chocolate", "synonyms": ["chocolate_cake"], "image_count": 32, "id": 249, "frequency": "c", "synset": "chocolate_cake.n.01"}, {"name": "chocolate_milk", "instance_count": 7, "def": "milk flavored with chocolate syrup", "synonyms": ["chocolate_milk"], "image_count": 4, "id": 250, "frequency": "r", "synset": "chocolate_milk.n.01"}, {"name": "chocolate_mousse", "instance_count": 1, "def": "dessert mousse made with chocolate", "synonyms": ["chocolate_mousse"], "image_count": 1, "id": 251, "frequency": "r", "synset": "chocolate_mousse.n.01"}, {"name": "choker", "instance_count": 1380, "def": "shirt collar, animal collar, or tight-fitting necklace", "synonyms": ["choker", "collar", "neckband"], "image_count": 858, "id": 252, "frequency": "f", "synset": "choker.n.03"}, {"name": "chopping_board", "instance_count": 840, "def": "a wooden board where meats or vegetables can be cut", "synonyms": ["chopping_board", "cutting_board", "chopping_block"], "image_count": 661, "id": 253, "frequency": "f", "synset": "chopping_board.n.01"}, {"name": "chopstick", "instance_count": 557, "def": "one of a pair of slender sticks used as oriental tableware to eat food with", "synonyms": ["chopstick"], "image_count": 168, "id": 254, "frequency": "f", "synset": "chopstick.n.01"}, {"name": "Christmas_tree", "instance_count": 303, "def": "an ornamented evergreen used as a Christmas decoration", "synonyms": ["Christmas_tree"], "image_count": 210, "id": 255, "frequency": "f", "synset": "christmas_tree.n.05"}, {"name": 
"slide", "instance_count": 106, "def": "sloping channel through which things can descend", "synonyms": ["slide"], "image_count": 65, "id": 256, "frequency": "c", "synset": "chute.n.02"}, {"name": "cider", "instance_count": 38, "def": "a beverage made from juice pressed from apples", "synonyms": ["cider", "cyder"], "image_count": 4, "id": 257, "frequency": "r", "synset": "cider.n.01"}, {"name": "cigar_box", "instance_count": 3, "def": "a box for holding cigars", "synonyms": ["cigar_box"], "image_count": 2, "id": 258, "frequency": "r", "synset": "cigar_box.n.01"}, {"name": "cigarette", "instance_count": 269, "def": "finely ground tobacco wrapped in paper; for smoking", "synonyms": ["cigarette"], "image_count": 159, "id": 259, "frequency": "f", "synset": "cigarette.n.01"}, {"name": "cigarette_case", "instance_count": 35, "def": "a small flat case for holding cigarettes", "synonyms": ["cigarette_case", "cigarette_pack"], "image_count": 31, "id": 260, "frequency": "c", "synset": "cigarette_case.n.01"}, {"name": "cistern", "instance_count": 901, "def": "a tank that holds the water used to flush a toilet", "synonyms": ["cistern", "water_tank"], "image_count": 811, "id": 261, "frequency": "f", "synset": "cistern.n.02"}, {"name": "clarinet", "instance_count": 1, "def": "a single-reed instrument with a straight tube", "synonyms": ["clarinet"], "image_count": 1, "id": 262, "frequency": "r", "synset": "clarinet.n.01"}, {"name": "clasp", "instance_count": 197, "def": "a fastener (as a buckle or hook) that is used to hold two things together", "synonyms": ["clasp"], "image_count": 42, "id": 263, "frequency": "c", "synset": "clasp.n.01"}, {"name": "cleansing_agent", "instance_count": 63, "def": "a preparation used in cleaning something", "synonyms": ["cleansing_agent", "cleanser", "cleaner"], "image_count": 27, "id": 264, "frequency": "c", "synset": "cleansing_agent.n.01"}, {"name": "cleat_(for_securing_rope)", "instance_count": 8, "def": "a fastener (usually with two projecting horns) around which a rope can be secured", "synonyms": ["cleat_(for_securing_rope)"], "image_count": 2, "id": 265, "frequency": "r", "synset": "cleat.n.02"}, {"name": "clementine", "instance_count": 108, "def": "a variety of mandarin orange", "synonyms": ["clementine"], "image_count": 5, "id": 266, "frequency": "r", "synset": "clementine.n.01"}, {"name": "clip", "instance_count": 301, "def": "any of various small fasteners used to hold loose articles together", "synonyms": ["clip"], "image_count": 95, "id": 267, "frequency": "c", "synset": "clip.n.03"}, {"name": "clipboard", "instance_count": 36, "def": "a small writing board with a clip at the top for holding papers", "synonyms": ["clipboard"], "image_count": 32, "id": 268, "frequency": "c", "synset": "clipboard.n.01"}, {"name": "clippers_(for_plants)", "instance_count": 1, "def": "shears for cutting grass or shrubbery (often used in the plural)", "synonyms": ["clippers_(for_plants)"], "image_count": 1, "id": 269, "frequency": "r", "synset": "clipper.n.03"}, {"name": "cloak", "instance_count": 1, "def": "a loose outer garment", "synonyms": ["cloak"], "image_count": 1, "id": 270, "frequency": "r", "synset": "cloak.n.02"}, {"name": "clock", "instance_count": 2677, "def": "a timepiece that shows the time of day", "synonyms": ["clock", "timepiece", "timekeeper"], "image_count": 1844, "id": 271, "frequency": "f", "synset": "clock.n.01"}, {"name": "clock_tower", "instance_count": 932, "def": "a tower with a large clock visible high up on an outside face", "synonyms": ["clock_tower"], 
"image_count": 897, "id": 272, "frequency": "f", "synset": "clock_tower.n.01"}, {"name": "clothes_hamper", "instance_count": 47, "def": "a hamper that holds dirty clothes to be washed or wet clothes to be dried", "synonyms": ["clothes_hamper", "laundry_basket", "clothes_basket"], "image_count": 31, "id": 273, "frequency": "c", "synset": "clothes_hamper.n.01"}, {"name": "clothespin", "instance_count": 111, "def": "wood or plastic fastener; for holding clothes on a clothesline", "synonyms": ["clothespin", "clothes_peg"], "image_count": 23, "id": 274, "frequency": "c", "synset": "clothespin.n.01"}, {"name": "clutch_bag", "instance_count": 1, "def": "a woman's strapless purse that is carried in the hand", "synonyms": ["clutch_bag"], "image_count": 1, "id": 275, "frequency": "r", "synset": "clutch_bag.n.01"}, {"name": "coaster", "instance_count": 390, "def": "a covering (plate or mat) that protects the surface of a table", "synonyms": ["coaster"], "image_count": 202, "id": 276, "frequency": "f", "synset": "coaster.n.03"}, {"name": "coat", "instance_count": 4145, "def": "an outer garment that has sleeves and covers the body from shoulder down", "synonyms": ["coat"], "image_count": 746, "id": 277, "frequency": "f", "synset": "coat.n.01"}, {"name": "coat_hanger", "instance_count": 282, "def": "a hanger that is shaped like a person's shoulders", "synonyms": ["coat_hanger", "clothes_hanger", "dress_hanger"], "image_count": 44, "id": 278, "frequency": "c", "synset": "coat_hanger.n.01"}, {"name": "coatrack", "instance_count": 16, "def": "a rack with hooks for temporarily holding coats and hats", "synonyms": ["coatrack", "hatrack"], "image_count": 14, "id": 279, "frequency": "c", "synset": "coatrack.n.01"}, {"name": "cock", "instance_count": 132, "def": "adult male chicken", "synonyms": ["cock", "rooster"], "image_count": 26, "id": 280, "frequency": "c", "synset": "cock.n.04"}, {"name": "cockroach", "instance_count": 1, "def": "any of numerous chiefly nocturnal insects; some are domestic pests", "synonyms": ["cockroach"], "image_count": 1, "id": 281, "frequency": "r", "synset": "cockroach.n.01"}, {"name": "cocoa_(beverage)", "instance_count": 4, "def": "a beverage made from cocoa powder and milk and sugar; usually drunk hot", "synonyms": ["cocoa_(beverage)", "hot_chocolate_(beverage)", "drinking_chocolate"], "image_count": 2, "id": 282, "frequency": "r", "synset": "cocoa.n.01"}, {"name": "coconut", "instance_count": 273, "def": "large hard-shelled brown oval nut with a fibrous husk", "synonyms": ["coconut", "cocoanut"], "image_count": 25, "id": 283, "frequency": "c", "synset": "coconut.n.02"}, {"name": "coffee_maker", "instance_count": 271, "def": "a kitchen appliance for brewing coffee automatically", "synonyms": ["coffee_maker", "coffee_machine"], "image_count": 238, "id": 284, "frequency": "f", "synset": "coffee_maker.n.01"}, {"name": "coffee_table", "instance_count": 709, "def": "low table where magazines can be placed and coffee or cocktails are served", "synonyms": ["coffee_table", "cocktail_table"], "image_count": 592, "id": 285, "frequency": "f", "synset": "coffee_table.n.01"}, {"name": "coffeepot", "instance_count": 32, "def": "tall pot in which coffee is brewed", "synonyms": ["coffeepot"], "image_count": 26, "id": 286, "frequency": "c", "synset": "coffeepot.n.01"}, {"name": "coil", "instance_count": 7, "def": "tubing that is wound in a spiral", "synonyms": ["coil"], "image_count": 5, "id": 287, "frequency": "r", "synset": "coil.n.05"}, {"name": "coin", "instance_count": 305, "def": "a flat 
metal piece (usually a disc) used as money", "synonyms": ["coin"], "image_count": 42, "id": 288, "frequency": "c", "synset": "coin.n.01"}, {"name": "colander", "instance_count": 16, "def": "bowl-shaped strainer; used to wash or drain foods", "synonyms": ["colander", "cullender"], "image_count": 13, "id": 289, "frequency": "c", "synset": "colander.n.01"}, {"name": "coleslaw", "instance_count": 72, "def": "basically shredded cabbage", "synonyms": ["coleslaw", "slaw"], "image_count": 46, "id": 290, "frequency": "c", "synset": "coleslaw.n.01"}, {"name": "coloring_material", "instance_count": 1, "def": "any material used for its color", "synonyms": ["coloring_material", "colouring_material"], "image_count": 1, "id": 291, "frequency": "r", "synset": "coloring_material.n.01"}, {"name": "combination_lock", "instance_count": 13, "def": "lock that can be opened only by turning dials in a special sequence", "synonyms": ["combination_lock"], "image_count": 8, "id": 292, "frequency": "r", "synset": "combination_lock.n.01"}, {"name": "pacifier", "instance_count": 40, "def": "device used for an infant to suck or bite on", "synonyms": ["pacifier", "teething_ring"], "image_count": 34, "id": 293, "frequency": "c", "synset": "comforter.n.04"}, {"name": "comic_book", "instance_count": 97, "def": "a magazine devoted to comic strips", "synonyms": ["comic_book"], "image_count": 5, "id": 294, "frequency": "r", "synset": "comic_book.n.01"}, {"name": "compass", "instance_count": 1, "def": "navigational instrument for finding directions", "synonyms": ["compass"], "image_count": 1, "id": 295, "frequency": "r", "synset": "compass.n.01"}, {"name": "computer_keyboard", "instance_count": 2745, "def": "a keyboard that is a data input device for computers", "synonyms": ["computer_keyboard", "keyboard_(computer)"], "image_count": 1871, "id": 296, "frequency": "f", "synset": "computer_keyboard.n.01"}, {"name": "condiment", "instance_count": 2985, "def": "a preparation (a sauce or relish or spice) to enhance flavor or enjoyment", "synonyms": ["condiment"], "image_count": 717, "id": 297, "frequency": "f", "synset": "condiment.n.01"}, {"name": "cone", "instance_count": 4081, "def": "a cone-shaped object used to direct traffic", "synonyms": ["cone", "traffic_cone"], "image_count": 1010, "id": 298, "frequency": "f", "synset": "cone.n.01"}, {"name": "control", "instance_count": 1775, "def": "a mechanism that controls the operation of a machine", "synonyms": ["control", "controller"], "image_count": 679, "id": 299, "frequency": "f", "synset": "control.n.09"}, {"name": "convertible_(automobile)", "instance_count": 4, "def": "a car that has top that can be folded or removed", "synonyms": ["convertible_(automobile)"], "image_count": 3, "id": 300, "frequency": "r", "synset": "convertible.n.01"}, {"name": "sofa_bed", "instance_count": 5, "def": "a sofa that can be converted into a bed", "synonyms": ["sofa_bed"], "image_count": 4, "id": 301, "frequency": "r", "synset": "convertible.n.03"}, {"name": "cooker", "instance_count": 1, "def": "a utensil for cooking", "synonyms": ["cooker"], "image_count": 1, "id": 302, "frequency": "r", "synset": "cooker.n.01"}, {"name": "cookie", "instance_count": 1920, "def": "any of various small flat sweet cakes (`biscuit' is the British term)", "synonyms": ["cookie", "cooky", "biscuit_(cookie)"], "image_count": 166, "id": 303, "frequency": "f", "synset": "cookie.n.01"}, {"name": "cooking_utensil", "instance_count": 18, "def": "a kitchen utensil made of material that does not melt easily; used for cooking", 
"synonyms": ["cooking_utensil"], "image_count": 2, "id": 304, "frequency": "r", "synset": "cooking_utensil.n.01"}, {"name": "cooler_(for_food)", "instance_count": 499, "def": "an insulated box for storing food often with ice", "synonyms": ["cooler_(for_food)", "ice_chest"], "image_count": 266, "id": 305, "frequency": "f", "synset": "cooler.n.01"}, {"name": "cork_(bottle_plug)", "instance_count": 326, "def": "the plug in the mouth of a bottle (especially a wine bottle)", "synonyms": ["cork_(bottle_plug)", "bottle_cork"], "image_count": 101, "id": 306, "frequency": "f", "synset": "cork.n.04"}, {"name": "corkboard", "instance_count": 7, "def": "a sheet consisting of cork granules", "synonyms": ["corkboard"], "image_count": 6, "id": 307, "frequency": "r", "synset": "corkboard.n.01"}, {"name": "corkscrew", "instance_count": 15, "def": "a bottle opener that pulls corks", "synonyms": ["corkscrew", "bottle_screw"], "image_count": 14, "id": 308, "frequency": "c", "synset": "corkscrew.n.01"}, {"name": "edible_corn", "instance_count": 1883, "def": "ears or kernels of corn that can be prepared and served for human food (only mark individual ears or kernels)", "synonyms": ["edible_corn", "corn", "maize"], "image_count": 133, "id": 309, "frequency": "f", "synset": "corn.n.03"}, {"name": "cornbread", "instance_count": 10, "def": "bread made primarily of cornmeal", "synonyms": ["cornbread"], "image_count": 2, "id": 310, "frequency": "r", "synset": "cornbread.n.01"}, {"name": "cornet", "instance_count": 65, "def": "a brass musical instrument with a narrow tube and a flared bell and many valves", "synonyms": ["cornet", "horn", "trumpet"], "image_count": 38, "id": 311, "frequency": "c", "synset": "cornet.n.01"}, {"name": "cornice", "instance_count": 149, "def": "a decorative framework to conceal curtain fixtures at the top of a window casing", "synonyms": ["cornice", "valance", "valance_board", "pelmet"], "image_count": 95, "id": 312, "frequency": "c", "synset": "cornice.n.01"}, {"name": "cornmeal", "instance_count": 1, "def": "coarsely ground corn", "synonyms": ["cornmeal"], "image_count": 1, "id": 313, "frequency": "r", "synset": "cornmeal.n.01"}, {"name": "corset", "instance_count": 12, "def": "a woman's close-fitting foundation garment", "synonyms": ["corset", "girdle"], "image_count": 12, "id": 314, "frequency": "c", "synset": "corset.n.01"}, {"name": "costume", "instance_count": 124, "def": "the attire characteristic of a country or a time or a social class", "synonyms": ["costume"], "image_count": 49, "id": 315, "frequency": "c", "synset": "costume.n.04"}, {"name": "cougar", "instance_count": 6, "def": "large American feline resembling a lion", "synonyms": ["cougar", "puma", "catamount", "mountain_lion", "panther"], "image_count": 5, "id": 316, "frequency": "r", "synset": "cougar.n.01"}, {"name": "coverall", "instance_count": 12, "def": "a loose-fitting protective garment that is worn over other clothing", "synonyms": ["coverall"], "image_count": 5, "id": 317, "frequency": "r", "synset": "coverall.n.01"}, {"name": "cowbell", "instance_count": 29, "def": "a bell hung around the neck of cow so that the cow can be easily located", "synonyms": ["cowbell"], "image_count": 16, "id": 318, "frequency": "c", "synset": "cowbell.n.01"}, {"name": "cowboy_hat", "instance_count": 535, "def": "a hat with a wide brim and a soft crown; worn by American ranch hands", "synonyms": ["cowboy_hat", "ten-gallon_hat"], "image_count": 216, "id": 319, "frequency": "f", "synset": "cowboy_hat.n.01"}, {"name": "crab_(animal)", 
"instance_count": 50, "def": "decapod having eyes on short stalks and a broad flattened shell and pincers", "synonyms": ["crab_(animal)"], "image_count": 12, "id": 320, "frequency": "c", "synset": "crab.n.01"}, {"name": "crabmeat", "instance_count": 5, "def": "the edible flesh of any of various crabs", "synonyms": ["crabmeat"], "image_count": 1, "id": 321, "frequency": "r", "synset": "crab.n.05"}, {"name": "cracker", "instance_count": 510, "def": "a thin crisp wafer", "synonyms": ["cracker"], "image_count": 54, "id": 322, "frequency": "c", "synset": "cracker.n.01"}, {"name": "crape", "instance_count": 12, "def": "small very thin pancake", "synonyms": ["crape", "crepe", "French_pancake"], "image_count": 5, "id": 323, "frequency": "r", "synset": "crape.n.01"}, {"name": "crate", "instance_count": 1832, "def": "a rugged box (usually made of wood); used for shipping", "synonyms": ["crate"], "image_count": 245, "id": 324, "frequency": "f", "synset": "crate.n.01"}, {"name": "crayon", "instance_count": 59, "def": "writing or drawing implement made of a colored stick of composition wax", "synonyms": ["crayon", "wax_crayon"], "image_count": 12, "id": 325, "frequency": "c", "synset": "crayon.n.01"}, {"name": "cream_pitcher", "instance_count": 10, "def": "a small pitcher for serving cream", "synonyms": ["cream_pitcher"], "image_count": 7, "id": 326, "frequency": "r", "synset": "cream_pitcher.n.01"}, {"name": "crescent_roll", "instance_count": 152, "def": "very rich flaky crescent-shaped roll", "synonyms": ["crescent_roll", "croissant"], "image_count": 35, "id": 327, "frequency": "c", "synset": "crescent_roll.n.01"}, {"name": "crib", "instance_count": 40, "def": "baby bed with high sides made of slats", "synonyms": ["crib", "cot"], "image_count": 36, "id": 328, "frequency": "c", "synset": "crib.n.01"}, {"name": "crock_pot", "instance_count": 128, "def": "an earthen jar (made of baked clay) or a modern electric crockpot", "synonyms": ["crock_pot", "earthenware_jar"], "image_count": 32, "id": 329, "frequency": "c", "synset": "crock.n.03"}, {"name": "crossbar", "instance_count": 6991, "def": "a horizontal bar that goes across something", "synonyms": ["crossbar"], "image_count": 1027, "id": 330, "frequency": "f", "synset": "crossbar.n.01"}, {"name": "crouton", "instance_count": 140, "def": "a small piece of toasted or fried bread; served in soup or salads", "synonyms": ["crouton"], "image_count": 10, "id": 331, "frequency": "r", "synset": "crouton.n.01"}, {"name": "crow", "instance_count": 24, "def": "black birds having a raucous call", "synonyms": ["crow"], "image_count": 12, "id": 332, "frequency": "c", "synset": "crow.n.01"}, {"name": "crowbar", "instance_count": 1, "def": "a heavy iron lever with one end forged into a wedge", "synonyms": ["crowbar", "wrecking_bar", "pry_bar"], "image_count": 1, "id": 333, "frequency": "r", "synset": "crowbar.n.01"}, {"name": "crown", "instance_count": 126, "def": "an ornamental jeweled headdress signifying sovereignty", "synonyms": ["crown"], "image_count": 67, "id": 334, "frequency": "c", "synset": "crown.n.04"}, {"name": "crucifix", "instance_count": 99, "def": "representation of the cross on which Jesus died", "synonyms": ["crucifix"], "image_count": 71, "id": 335, "frequency": "c", "synset": "crucifix.n.01"}, {"name": "cruise_ship", "instance_count": 35, "def": "a passenger ship used commercially for pleasure cruises", "synonyms": ["cruise_ship", "cruise_liner"], "image_count": 30, "id": 336, "frequency": "c", "synset": "cruise_ship.n.01"}, {"name": 
"police_cruiser", "instance_count": 86, "def": "a car in which policemen cruise the streets", "synonyms": ["police_cruiser", "patrol_car", "police_car", "squad_car"], "image_count": 48, "id": 337, "frequency": "c", "synset": "cruiser.n.01"}, {"name": "crumb", "instance_count": 3021, "def": "small piece of e.g. bread or cake", "synonyms": ["crumb"], "image_count": 249, "id": 338, "frequency": "f", "synset": "crumb.n.03"}, {"name": "crutch", "instance_count": 20, "def": "a wooden or metal staff that fits under the armpit and reaches to the ground", "synonyms": ["crutch"], "image_count": 13, "id": 339, "frequency": "c", "synset": "crutch.n.01"}, {"name": "cub_(animal)", "instance_count": 55, "def": "the young of certain carnivorous mammals such as the bear or wolf or lion", "synonyms": ["cub_(animal)"], "image_count": 29, "id": 340, "frequency": "c", "synset": "cub.n.03"}, {"name": "cube", "instance_count": 189, "def": "a block in the (approximate) shape of a cube", "synonyms": ["cube", "square_block"], "image_count": 14, "id": 341, "frequency": "c", "synset": "cube.n.05"}, {"name": "cucumber", "instance_count": 1533, "def": "cylindrical green fruit with thin green rind and white flesh eaten as a vegetable", "synonyms": ["cucumber", "cuke"], "image_count": 236, "id": 342, "frequency": "f", "synset": "cucumber.n.02"}, {"name": "cufflink", "instance_count": 17, "def": "jewelry consisting of linked buttons used to fasten the cuffs of a shirt", "synonyms": ["cufflink"], "image_count": 15, "id": 343, "frequency": "c", "synset": "cufflink.n.01"}, {"name": "cup", "instance_count": 4637, "def": "a small open container usually used for drinking; usually has a handle", "synonyms": ["cup"], "image_count": 1521, "id": 344, "frequency": "f", "synset": "cup.n.01"}, {"name": "trophy_cup", "instance_count": 80, "def": "a metal award or cup-shaped vessel with handles that is awarded as a trophy to a competition winner", "synonyms": ["trophy_cup"], "image_count": 25, "id": 345, "frequency": "c", "synset": "cup.n.08"}, {"name": "cupboard", "instance_count": 1623, "def": "a small room (or recess) or cabinet used for storage space", "synonyms": ["cupboard", "closet"], "image_count": 249, "id": 346, "frequency": "f", "synset": "cupboard.n.01"}, {"name": "cupcake", "instance_count": 1628, "def": "small cake baked in a muffin tin", "synonyms": ["cupcake"], "image_count": 139, "id": 347, "frequency": "f", "synset": "cupcake.n.01"}, {"name": "hair_curler", "instance_count": 20, "def": "a cylindrical tube around which the hair is wound to curl it", "synonyms": ["hair_curler", "hair_roller", "hair_crimper"], "image_count": 2, "id": 348, "frequency": "r", "synset": "curler.n.01"}, {"name": "curling_iron", "instance_count": 2, "def": "a cylindrical home appliance that heats hair that has been curled around it", "synonyms": ["curling_iron"], "image_count": 2, "id": 349, "frequency": "r", "synset": "curling_iron.n.01"}, {"name": "curtain", "instance_count": 4506, "def": "hanging cloth used as a blind (especially for a window)", "synonyms": ["curtain", "drapery"], "image_count": 1890, "id": 350, "frequency": "f", "synset": "curtain.n.01"}, {"name": "cushion", "instance_count": 7174, "def": "a soft bag filled with air or padding such as feathers or foam rubber", "synonyms": ["cushion"], "image_count": 1240, "id": 351, "frequency": "f", "synset": "cushion.n.03"}, {"name": "cylinder", "instance_count": 3, "def": "a cylindrical container", "synonyms": ["cylinder"], "image_count": 1, "id": 352, "frequency": "r", "synset": 
"cylinder.n.04"}, {"name": "cymbal", "instance_count": 24, "def": "a percussion instrument consisting of a concave brass disk", "synonyms": ["cymbal"], "image_count": 9, "id": 353, "frequency": "r", "synset": "cymbal.n.01"}, {"name": "dagger", "instance_count": 1, "def": "a short knife with a pointed blade used for piercing or stabbing", "synonyms": ["dagger"], "image_count": 1, "id": 354, "frequency": "r", "synset": "dagger.n.01"}, {"name": "dalmatian", "instance_count": 3, "def": "a large breed having a smooth white coat with black or brown spots", "synonyms": ["dalmatian"], "image_count": 3, "id": 355, "frequency": "r", "synset": "dalmatian.n.02"}, {"name": "dartboard", "instance_count": 11, "def": "a circular board of wood or cork used as the target in the game of darts", "synonyms": ["dartboard"], "image_count": 11, "id": 356, "frequency": "c", "synset": "dartboard.n.01"}, {"name": "date_(fruit)", "instance_count": 103, "def": "sweet edible fruit of the date palm with a single long woody seed", "synonyms": ["date_(fruit)"], "image_count": 4, "id": 357, "frequency": "r", "synset": "date.n.08"}, {"name": "deck_chair", "instance_count": 1787, "def": "a folding chair for use outdoors; a wooden frame supports a length of canvas", "synonyms": ["deck_chair", "beach_chair"], "image_count": 236, "id": 358, "frequency": "f", "synset": "deck_chair.n.01"}, {"name": "deer", "instance_count": 130, "def": "distinguished from Bovidae by the male's having solid deciduous antlers", "synonyms": ["deer", "cervid"], "image_count": 44, "id": 359, "frequency": "c", "synset": "deer.n.01"}, {"name": "dental_floss", "instance_count": 20, "def": "a soft thread for cleaning the spaces between the teeth", "synonyms": ["dental_floss", "floss"], "image_count": 19, "id": 360, "frequency": "c", "synset": "dental_floss.n.01"}, {"name": "desk", "instance_count": 1662, "def": "a piece of furniture with a writing surface and usually drawers or other compartments", "synonyms": ["desk"], "image_count": 1100, "id": 361, "frequency": "f", "synset": "desk.n.01"}, {"name": "detergent", "instance_count": 11, "def": "a surface-active chemical widely used in industry and laundering", "synonyms": ["detergent"], "image_count": 7, "id": 362, "frequency": "r", "synset": "detergent.n.01"}, {"name": "diaper", "instance_count": 89, "def": "garment consisting of a folded cloth drawn up between the legs and fastened at the waist", "synonyms": ["diaper"], "image_count": 69, "id": 363, "frequency": "c", "synset": "diaper.n.01"}, {"name": "diary", "instance_count": 2, "def": "yearly planner book", "synonyms": ["diary", "journal"], "image_count": 2, "id": 364, "frequency": "r", "synset": "diary.n.01"}, {"name": "die", "instance_count": 25, "def": "a small cube with 1 to 6 spots on the six faces; used in gambling", "synonyms": ["die", "dice"], "image_count": 8, "id": 365, "frequency": "r", "synset": "die.n.01"}, {"name": "dinghy", "instance_count": 15, "def": "a small boat of shallow draft with seats and oars with which it is propelled", "synonyms": ["dinghy", "dory", "rowboat"], "image_count": 5, "id": 366, "frequency": "r", "synset": "dinghy.n.01"}, {"name": "dining_table", "instance_count": 312, "def": "a table at which meals are served", "synonyms": ["dining_table"], "image_count": 227, "id": 367, "frequency": "f", "synset": "dining_table.n.01"}, {"name": "tux", "instance_count": 10, "def": "semiformal evening dress for men", "synonyms": ["tux", "tuxedo"], "image_count": 6, "id": 368, "frequency": "r", "synset": "dinner_jacket.n.01"}, 
{"name": "dish", "instance_count": 532, "def": "a piece of dishware normally used as a container for holding or serving food", "synonyms": ["dish"], "image_count": 106, "id": 369, "frequency": "f", "synset": "dish.n.01"}, {"name": "dish_antenna", "instance_count": 153, "def": "directional antenna consisting of a parabolic reflector", "synonyms": ["dish_antenna"], "image_count": 81, "id": 370, "frequency": "c", "synset": "dish.n.05"}, {"name": "dishrag", "instance_count": 32, "def": "a cloth for washing dishes or cleaning in general", "synonyms": ["dishrag", "dishcloth"], "image_count": 17, "id": 371, "frequency": "c", "synset": "dishrag.n.01"}, {"name": "dishtowel", "instance_count": 223, "def": "a towel for drying dishes", "synonyms": ["dishtowel", "tea_towel"], "image_count": 134, "id": 372, "frequency": "f", "synset": "dishtowel.n.01"}, {"name": "dishwasher", "instance_count": 317, "def": "a machine for washing dishes", "synonyms": ["dishwasher", "dishwashing_machine"], "image_count": 312, "id": 373, "frequency": "f", "synset": "dishwasher.n.01"}, {"name": "dishwasher_detergent", "instance_count": 9, "def": "dishsoap or dish detergent designed for use in dishwashers", "synonyms": ["dishwasher_detergent", "dishwashing_detergent", "dishwashing_liquid", "dishsoap"], "image_count": 8, "id": 374, "frequency": "r", "synset": "dishwasher_detergent.n.01"}, {"name": "dispenser", "instance_count": 610, "def": "a container so designed that the contents can be used in prescribed amounts", "synonyms": ["dispenser"], "image_count": 271, "id": 375, "frequency": "f", "synset": "dispenser.n.01"}, {"name": "diving_board", "instance_count": 2, "def": "a springboard from which swimmers can dive", "synonyms": ["diving_board"], "image_count": 2, "id": 376, "frequency": "r", "synset": "diving_board.n.01"}, {"name": "Dixie_cup", "instance_count": 352, "def": "a disposable cup made of paper; for holding drinks", "synonyms": ["Dixie_cup", "paper_cup"], "image_count": 103, "id": 377, "frequency": "f", "synset": "dixie_cup.n.01"}, {"name": "dog", "instance_count": 2684, "def": "a common domesticated dog", "synonyms": ["dog"], "image_count": 1938, "id": 378, "frequency": "f", "synset": "dog.n.01"}, {"name": "dog_collar", "instance_count": 733, "def": "a collar for a dog", "synonyms": ["dog_collar"], "image_count": 574, "id": 379, "frequency": "f", "synset": "dog_collar.n.01"}, {"name": "doll", "instance_count": 398, "def": "a toy replica of a HUMAN (NOT AN ANIMAL)", "synonyms": ["doll"], "image_count": 120, "id": 380, "frequency": "f", "synset": "doll.n.01"}, {"name": "dollar", "instance_count": 2, "def": "a piece of paper money worth one dollar", "synonyms": ["dollar", "dollar_bill", "one_dollar_bill"], "image_count": 2, "id": 381, "frequency": "r", "synset": "dollar.n.02"}, {"name": "dollhouse", "instance_count": 2, "def": "a house so small that it is likened to a child's plaything", "synonyms": ["dollhouse", "doll's_house"], "image_count": 2, "id": 382, "frequency": "r", "synset": "dollhouse.n.01"}, {"name": "dolphin", "instance_count": 38, "def": "any of various small toothed whales with a beaklike snout; larger than porpoises", "synonyms": ["dolphin"], "image_count": 13, "id": 383, "frequency": "c", "synset": "dolphin.n.02"}, {"name": "domestic_ass", "instance_count": 49, "def": "domestic beast of burden descended from the African wild ass; patient but stubborn", "synonyms": ["domestic_ass", "donkey"], "image_count": 29, "id": 384, "frequency": "c", "synset": "domestic_ass.n.01"}, {"name": "doorknob", 
"instance_count": 4072, "def": "a knob used to open a door (often called `doorhandle' in Great Britain)", "synonyms": ["doorknob", "doorhandle"], "image_count": 1710, "id": 385, "frequency": "f", "synset": "doorknob.n.01"}, {"name": "doormat", "instance_count": 78, "def": "a mat placed outside an exterior door for wiping the shoes before entering", "synonyms": ["doormat", "welcome_mat"], "image_count": 66, "id": 386, "frequency": "c", "synset": "doormat.n.02"}, {"name": "doughnut", "instance_count": 11911, "def": "a small ring-shaped friedcake", "synonyms": ["doughnut", "donut"], "image_count": 1008, "id": 387, "frequency": "f", "synset": "doughnut.n.02"}, {"name": "dove", "instance_count": 2, "def": "any of numerous small pigeons", "synonyms": ["dove"], "image_count": 1, "id": 388, "frequency": "r", "synset": "dove.n.01"}, {"name": "dragonfly", "instance_count": 8, "def": "slender-bodied non-stinging insect having iridescent wings that are outspread at rest", "synonyms": ["dragonfly"], "image_count": 3, "id": 389, "frequency": "r", "synset": "dragonfly.n.01"}, {"name": "drawer", "instance_count": 7927, "def": "a boxlike container in a piece of furniture; made so as to slide in and out", "synonyms": ["drawer"], "image_count": 1942, "id": 390, "frequency": "f", "synset": "drawer.n.01"}, {"name": "underdrawers", "instance_count": 23, "def": "underpants worn by men", "synonyms": ["underdrawers", "boxers", "boxershorts"], "image_count": 19, "id": 391, "frequency": "c", "synset": "drawers.n.01"}, {"name": "dress", "instance_count": 2842, "def": "a one-piece garment for a woman; has skirt and bodice", "synonyms": ["dress", "frock"], "image_count": 1488, "id": 392, "frequency": "f", "synset": "dress.n.01"}, {"name": "dress_hat", "instance_count": 76, "def": "a man's hat with a tall crown; usually covered with silk or with beaver fur", "synonyms": ["dress_hat", "high_hat", "opera_hat", "silk_hat", "top_hat"], "image_count": 46, "id": 393, "frequency": "c", "synset": "dress_hat.n.01"}, {"name": "dress_suit", "instance_count": 306, "def": "formalwear consisting of full evening dress for men", "synonyms": ["dress_suit"], "image_count": 106, "id": 394, "frequency": "f", "synset": "dress_suit.n.01"}, {"name": "dresser", "instance_count": 152, "def": "a cabinet with shelves", "synonyms": ["dresser"], "image_count": 115, "id": 395, "frequency": "f", "synset": "dresser.n.05"}, {"name": "drill", "instance_count": 24, "def": "a tool with a sharp rotating point for making holes in hard materials", "synonyms": ["drill"], "image_count": 19, "id": 396, "frequency": "c", "synset": "drill.n.01"}, {"name": "drone", "instance_count": 2, "def": "an aircraft without a pilot that is operated by remote control", "synonyms": ["drone"], "image_count": 2, "id": 397, "frequency": "r", "synset": "drone.n.04"}, {"name": "dropper", "instance_count": 1, "def": "pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time", "synonyms": ["dropper", "eye_dropper"], "image_count": 1, "id": 398, "frequency": "r", "synset": "dropper.n.01"}, {"name": "drum_(musical_instrument)", "instance_count": 59, "def": "a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end", "synonyms": ["drum_(musical_instrument)"], "image_count": 28, "id": 399, "frequency": "c", "synset": "drum.n.01"}, {"name": "drumstick", "instance_count": 25, "def": "a stick used for playing a drum", "synonyms": ["drumstick"], "image_count": 9, "id": 400, 
"frequency": "r", "synset": "drumstick.n.02"}, {"name": "duck", "instance_count": 1090, "def": "small web-footed broad-billed swimming bird", "synonyms": ["duck"], "image_count": 192, "id": 401, "frequency": "f", "synset": "duck.n.01"}, {"name": "duckling", "instance_count": 36, "def": "young duck", "synonyms": ["duckling"], "image_count": 12, "id": 402, "frequency": "c", "synset": "duckling.n.02"}, {"name": "duct_tape", "instance_count": 77, "def": "a wide silvery adhesive tape", "synonyms": ["duct_tape"], "image_count": 21, "id": 403, "frequency": "c", "synset": "duct_tape.n.01"}, {"name": "duffel_bag", "instance_count": 666, "def": "a large cylindrical bag of heavy cloth (does not include suitcases)", "synonyms": ["duffel_bag", "duffle_bag", "duffel", "duffle"], "image_count": 247, "id": 404, "frequency": "f", "synset": "duffel_bag.n.01"}, {"name": "dumbbell", "instance_count": 13, "def": "an exercising weight with two ball-like ends connected by a short handle", "synonyms": ["dumbbell"], "image_count": 6, "id": 405, "frequency": "r", "synset": "dumbbell.n.01"}, {"name": "dumpster", "instance_count": 95, "def": "a container designed to receive and transport and dump waste", "synonyms": ["dumpster"], "image_count": 64, "id": 406, "frequency": "c", "synset": "dumpster.n.01"}, {"name": "dustpan", "instance_count": 7, "def": "a short-handled receptacle into which dust can be swept", "synonyms": ["dustpan"], "image_count": 7, "id": 407, "frequency": "r", "synset": "dustpan.n.02"}, {"name": "eagle", "instance_count": 48, "def": "large birds of prey noted for their broad wings and strong soaring flight", "synonyms": ["eagle"], "image_count": 40, "id": 408, "frequency": "c", "synset": "eagle.n.01"}, {"name": "earphone", "instance_count": 767, "def": "device for listening to audio that is held over or inserted into the ear", "synonyms": ["earphone", "earpiece", "headphone"], "image_count": 542, "id": 409, "frequency": "f", "synset": "earphone.n.01"}, {"name": "earplug", "instance_count": 39, "def": "a soft plug that is inserted into the ear canal to block sound", "synonyms": ["earplug"], "image_count": 2, "id": 410, "frequency": "r", "synset": "earplug.n.01"}, {"name": "earring", "instance_count": 3070, "def": "jewelry to ornament the ear", "synonyms": ["earring"], "image_count": 1898, "id": 411, "frequency": "f", "synset": "earring.n.01"}, {"name": "easel", "instance_count": 43, "def": "an upright tripod for displaying something (usually an artist's canvas)", "synonyms": ["easel"], "image_count": 36, "id": 412, "frequency": "c", "synset": "easel.n.01"}, {"name": "eclair", "instance_count": 39, "def": "oblong cream puff", "synonyms": ["eclair"], "image_count": 4, "id": 413, "frequency": "r", "synset": "eclair.n.01"}, {"name": "eel", "instance_count": 1, "def": "an elongate fish with fatty flesh", "synonyms": ["eel"], "image_count": 1, "id": 414, "frequency": "r", "synset": "eel.n.01"}, {"name": "egg", "instance_count": 813, "def": "oval reproductive body of a fowl (especially a hen) used as food", "synonyms": ["egg", "eggs"], "image_count": 191, "id": 415, "frequency": "f", "synset": "egg.n.02"}, {"name": "egg_roll", "instance_count": 15, "def": "minced vegetables and meat wrapped in a pancake and fried", "synonyms": ["egg_roll", "spring_roll"], "image_count": 6, "id": 416, "frequency": "r", "synset": "egg_roll.n.01"}, {"name": "egg_yolk", "instance_count": 90, "def": "the yellow spherical part of an egg", "synonyms": ["egg_yolk", "yolk_(egg)"], "image_count": 41, "id": 417, "frequency": "c", 
"synset": "egg_yolk.n.01"}, {"name": "eggbeater", "instance_count": 52, "def": "a mixer for beating eggs or whipping cream", "synonyms": ["eggbeater", "eggwhisk"], "image_count": 39, "id": 418, "frequency": "c", "synset": "eggbeater.n.02"}, {"name": "eggplant", "instance_count": 337, "def": "egg-shaped vegetable having a shiny skin typically dark purple", "synonyms": ["eggplant", "aubergine"], "image_count": 46, "id": 419, "frequency": "c", "synset": "eggplant.n.01"}, {"name": "electric_chair", "instance_count": 1, "def": "a chair-shaped instrument of execution by electrocution", "synonyms": ["electric_chair"], "image_count": 1, "id": 420, "frequency": "r", "synset": "electric_chair.n.01"}, {"name": "refrigerator", "instance_count": 1702, "def": "a refrigerator in which the coolant is pumped around by an electric motor", "synonyms": ["refrigerator"], "image_count": 1451, "id": 421, "frequency": "f", "synset": "electric_refrigerator.n.01"}, {"name": "elephant", "instance_count": 5325, "def": "a common elephant", "synonyms": ["elephant"], "image_count": 1878, "id": 422, "frequency": "f", "synset": "elephant.n.01"}, {"name": "elk", "instance_count": 29, "def": "large northern deer with enormous flattened antlers in the male", "synonyms": ["elk", "moose"], "image_count": 11, "id": 423, "frequency": "c", "synset": "elk.n.01"}, {"name": "envelope", "instance_count": 210, "def": "a flat (usually rectangular) container for a letter, thin package, etc.", "synonyms": ["envelope"], "image_count": 82, "id": 424, "frequency": "c", "synset": "envelope.n.01"}, {"name": "eraser", "instance_count": 41, "def": "an implement used to erase something", "synonyms": ["eraser"], "image_count": 18, "id": 425, "frequency": "c", "synset": "eraser.n.01"}, {"name": "escargot", "instance_count": 5, "def": "edible snail usually served in the shell with a sauce of melted butter and garlic", "synonyms": ["escargot"], "image_count": 1, "id": 426, "frequency": "r", "synset": "escargot.n.01"}, {"name": "eyepatch", "instance_count": 9, "def": "a protective cloth covering for an injured eye", "synonyms": ["eyepatch"], "image_count": 7, "id": 427, "frequency": "r", "synset": "eyepatch.n.01"}, {"name": "falcon", "instance_count": 3, "def": "birds of prey having long pointed powerful wings adapted for swift flight", "synonyms": ["falcon"], "image_count": 3, "id": 428, "frequency": "r", "synset": "falcon.n.01"}, {"name": "fan", "instance_count": 737, "def": "a device for creating a current of air by movement of a surface or surfaces", "synonyms": ["fan"], "image_count": 575, "id": 429, "frequency": "f", "synset": "fan.n.01"}, {"name": "faucet", "instance_count": 3185, "def": "a regulator for controlling the flow of a liquid from a reservoir", "synonyms": ["faucet", "spigot", "tap"], "image_count": 1907, "id": 430, "frequency": "f", "synset": "faucet.n.01"}, {"name": "fedora", "instance_count": 14, "def": "a hat made of felt with a creased crown", "synonyms": ["fedora"], "image_count": 8, "id": 431, "frequency": "r", "synset": "fedora.n.01"}, {"name": "ferret", "instance_count": 5, "def": "domesticated albino variety of the European polecat bred for hunting rats and rabbits", "synonyms": ["ferret"], "image_count": 4, "id": 432, "frequency": "r", "synset": "ferret.n.02"}, {"name": "Ferris_wheel", "instance_count": 32, "def": "a large wheel with suspended seats that remain upright as the wheel rotates", "synonyms": ["Ferris_wheel"], "image_count": 32, "id": 433, "frequency": "c", "synset": "ferris_wheel.n.01"}, {"name": "ferry", 
"instance_count": 17, "def": "a boat that transports people or vehicles across a body of water and operates on a regular schedule", "synonyms": ["ferry", "ferryboat"], "image_count": 11, "id": 434, "frequency": "c", "synset": "ferry.n.01"}, {"name": "fig_(fruit)", "instance_count": 147, "def": "fleshy sweet pear-shaped yellowish or purple fruit eaten fresh or preserved or dried", "synonyms": ["fig_(fruit)"], "image_count": 4, "id": 435, "frequency": "r", "synset": "fig.n.04"}, {"name": "fighter_jet", "instance_count": 115, "def": "a high-speed military or naval airplane designed to destroy enemy targets", "synonyms": ["fighter_jet", "fighter_aircraft", "attack_aircraft"], "image_count": 54, "id": 436, "frequency": "c", "synset": "fighter.n.02"}, {"name": "figurine", "instance_count": 1056, "def": "a small carved or molded figure", "synonyms": ["figurine"], "image_count": 202, "id": 437, "frequency": "f", "synset": "figurine.n.01"}, {"name": "file_cabinet", "instance_count": 53, "def": "office furniture consisting of a container for keeping papers in order", "synonyms": ["file_cabinet", "filing_cabinet"], "image_count": 32, "id": 438, "frequency": "c", "synset": "file.n.03"}, {"name": "file_(tool)", "instance_count": 3, "def": "a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal", "synonyms": ["file_(tool)"], "image_count": 3, "id": 439, "frequency": "r", "synset": "file.n.04"}, {"name": "fire_alarm", "instance_count": 151, "def": "an alarm that is tripped off by fire or smoke", "synonyms": ["fire_alarm", "smoke_alarm"], "image_count": 130, "id": 440, "frequency": "f", "synset": "fire_alarm.n.02"}, {"name": "fire_engine", "instance_count": 179, "def": "large trucks that carry firefighters and equipment to the site of a fire", "synonyms": ["fire_engine", "fire_truck"], "image_count": 119, "id": 441, "frequency": "f", "synset": "fire_engine.n.01"}, {"name": "fire_extinguisher", "instance_count": 165, "def": "a manually operated device for extinguishing small fires", "synonyms": ["fire_extinguisher", "extinguisher"], "image_count": 141, "id": 442, "frequency": "f", "synset": "fire_extinguisher.n.01"}, {"name": "fire_hose", "instance_count": 67, "def": "a large hose that carries water from a fire hydrant to the site of the fire", "synonyms": ["fire_hose"], "image_count": 29, "id": 443, "frequency": "c", "synset": "fire_hose.n.01"}, {"name": "fireplace", "instance_count": 530, "def": "an open recess in a wall at the base of a chimney where a fire can be built", "synonyms": ["fireplace"], "image_count": 525, "id": 444, "frequency": "f", "synset": "fireplace.n.01"}, {"name": "fireplug", "instance_count": 1458, "def": "an upright hydrant for drawing water to use in fighting a fire", "synonyms": ["fireplug", "fire_hydrant", "hydrant"], "image_count": 1323, "id": 445, "frequency": "f", "synset": "fireplug.n.01"}, {"name": "first-aid_kit", "instance_count": 2, "def": "kit consisting of a set of bandages and medicines for giving first aid", "synonyms": ["first-aid_kit"], "image_count": 2, "id": 446, "frequency": "r", "synset": "first-aid_kit.n.01"}, {"name": "fish", "instance_count": 525, "def": "any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills", "synonyms": ["fish"], "image_count": 113, "id": 447, "frequency": "f", "synset": "fish.n.01"}, {"name": "fish_(food)", "instance_count": 96, "def": "the flesh of fish used as food", "synonyms": ["fish_(food)"], "image_count": 16, "id": 448, 
"frequency": "c", "synset": "fish.n.02"}, {"name": "fishbowl", "instance_count": 33, "def": "a transparent bowl in which small fish are kept", "synonyms": ["fishbowl", "goldfish_bowl"], "image_count": 7, "id": 449, "frequency": "r", "synset": "fishbowl.n.02"}, {"name": "fishing_rod", "instance_count": 84, "def": "a rod that is used in fishing to extend the fishing line", "synonyms": ["fishing_rod", "fishing_pole"], "image_count": 35, "id": 450, "frequency": "c", "synset": "fishing_rod.n.01"}, {"name": "flag", "instance_count": 7007, "def": "emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)", "synonyms": ["flag"], "image_count": 1908, "id": 451, "frequency": "f", "synset": "flag.n.01"}, {"name": "flagpole", "instance_count": 1082, "def": "a tall staff or pole on which a flag is raised", "synonyms": ["flagpole", "flagstaff"], "image_count": 353, "id": 452, "frequency": "f", "synset": "flagpole.n.02"}, {"name": "flamingo", "instance_count": 309, "def": "large pink web-footed bird with down-bent bill", "synonyms": ["flamingo"], "image_count": 18, "id": 453, "frequency": "c", "synset": "flamingo.n.01"}, {"name": "flannel", "instance_count": 18, "def": "a soft light woolen fabric; used for clothing", "synonyms": ["flannel"], "image_count": 14, "id": 454, "frequency": "c", "synset": "flannel.n.01"}, {"name": "flap", "instance_count": 218, "def": "any broad thin covering attached at one edge, such as a mud flap next to a wheel or a flap on an airplane wing", "synonyms": ["flap"], "image_count": 77, "id": 455, "frequency": "c", "synset": "flap.n.01"}, {"name": "flash", "instance_count": 10, "def": "a lamp for providing momentary light to take a photograph", "synonyms": ["flash", "flashbulb"], "image_count": 8, "id": 456, "frequency": "r", "synset": "flash.n.10"}, {"name": "flashlight", "instance_count": 48, "def": "a small portable battery-powered electric lamp", "synonyms": ["flashlight", "torch"], "image_count": 37, "id": 457, "frequency": "c", "synset": "flashlight.n.01"}, {"name": "fleece", "instance_count": 2, "def": "a soft bulky fabric with deep pile; used chiefly for clothing", "synonyms": ["fleece"], "image_count": 1, "id": 458, "frequency": "r", "synset": "fleece.n.03"}, {"name": "flip-flop_(sandal)", "instance_count": 1103, "def": "a backless sandal held to the foot by a thong between two toes", "synonyms": ["flip-flop_(sandal)"], "image_count": 346, "id": 459, "frequency": "f", "synset": "flip-flop.n.02"}, {"name": "flipper_(footwear)", "instance_count": 49, "def": "a shoe to aid a person in swimming", "synonyms": ["flipper_(footwear)", "fin_(footwear)"], "image_count": 19, "id": 460, "frequency": "c", "synset": "flipper.n.01"}, {"name": "flower_arrangement", "instance_count": 3960, "def": "a decorative arrangement of flowers", "synonyms": ["flower_arrangement", "floral_arrangement"], "image_count": 1779, "id": 461, "frequency": "f", "synset": "flower_arrangement.n.01"}, {"name": "flute_glass", "instance_count": 86, "def": "a tall narrow wineglass", "synonyms": ["flute_glass", "champagne_flute"], "image_count": 23, "id": 462, "frequency": "c", "synset": "flute.n.02"}, {"name": "foal", "instance_count": 30, "def": "a young horse", "synonyms": ["foal"], "image_count": 25, "id": 463, "frequency": "c", "synset": "foal.n.01"}, {"name": "folding_chair", "instance_count": 303, "def": "a chair that can be folded flat for storage", "synonyms": ["folding_chair"], "image_count": 67, "id": 464, "frequency": "c", "synset": "folding_chair.n.01"}, 
{"name": "food_processor", "instance_count": 22, "def": "a kitchen appliance for shredding, blending, chopping, or slicing food", "synonyms": ["food_processor"], "image_count": 19, "id": 465, "frequency": "c", "synset": "food_processor.n.01"}, {"name": "football_(American)", "instance_count": 35, "def": "the inflated oblong ball used in playing American football", "synonyms": ["football_(American)"], "image_count": 28, "id": 466, "frequency": "c", "synset": "football.n.02"}, {"name": "football_helmet", "instance_count": 7, "def": "a padded helmet with a face mask to protect the head of football players", "synonyms": ["football_helmet"], "image_count": 4, "id": 467, "frequency": "r", "synset": "football_helmet.n.01"}, {"name": "footstool", "instance_count": 41, "def": "a low seat or a stool to rest the feet of a seated person", "synonyms": ["footstool", "footrest"], "image_count": 27, "id": 468, "frequency": "c", "synset": "footstool.n.01"}, {"name": "fork", "instance_count": 3137, "def": "cutlery used for serving and eating food", "synonyms": ["fork"], "image_count": 1861, "id": 469, "frequency": "f", "synset": "fork.n.01"}, {"name": "forklift", "instance_count": 14, "def": "an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them", "synonyms": ["forklift"], "image_count": 11, "id": 470, "frequency": "c", "synset": "forklift.n.01"}, {"name": "freight_car", "instance_count": 121, "def": "a railway car that carries freight", "synonyms": ["freight_car"], "image_count": 13, "id": 471, "frequency": "c", "synset": "freight_car.n.01"}, {"name": "French_toast", "instance_count": 41, "def": "bread slice dipped in egg and milk and fried", "synonyms": ["French_toast"], "image_count": 13, "id": 472, "frequency": "c", "synset": "french_toast.n.01"}, {"name": "freshener", "instance_count": 39, "def": "anything that freshens air by removing or covering odor", "synonyms": ["freshener", "air_freshener"], "image_count": 32, "id": 473, "frequency": "c", "synset": "freshener.n.01"}, {"name": "frisbee", "instance_count": 2332, "def": "a light, plastic disk propelled with a flip of the wrist for recreation or competition", "synonyms": ["frisbee"], "image_count": 1767, "id": 474, "frequency": "f", "synset": "frisbee.n.01"}, {"name": "frog", "instance_count": 84, "def": "a tailless stout-bodied amphibians with long hind limbs for leaping", "synonyms": ["frog", "toad", "toad_frog"], "image_count": 42, "id": 475, "frequency": "c", "synset": "frog.n.01"}, {"name": "fruit_juice", "instance_count": 37, "def": "drink produced by squeezing or crushing fruit", "synonyms": ["fruit_juice"], "image_count": 17, "id": 476, "frequency": "c", "synset": "fruit_juice.n.01"}, {"name": "frying_pan", "instance_count": 310, "def": "a pan used for frying foods", "synonyms": ["frying_pan", "frypan", "skillet"], "image_count": 128, "id": 477, "frequency": "f", "synset": "frying_pan.n.01"}, {"name": "fudge", "instance_count": 4, "def": "soft creamy candy", "synonyms": ["fudge"], "image_count": 1, "id": 478, "frequency": "r", "synset": "fudge.n.01"}, {"name": "funnel", "instance_count": 9, "def": "a cone-shaped utensil used to channel a substance into a container with a small mouth", "synonyms": ["funnel"], "image_count": 9, "id": 479, "frequency": "r", "synset": "funnel.n.02"}, {"name": "futon", "instance_count": 11, "def": "a pad that is used for sleeping on the floor or on a raised frame", "synonyms": ["futon"], "image_count": 10, "id": 480, "frequency": "r", "synset": 
"futon.n.01"}, {"name": "gag", "instance_count": 4, "def": "restraint put into a person's mouth to prevent speaking or shouting", "synonyms": ["gag", "muzzle"], "image_count": 4, "id": 481, "frequency": "r", "synset": "gag.n.02"}, {"name": "garbage", "instance_count": 18, "def": "a receptacle where waste can be discarded", "synonyms": ["garbage"], "image_count": 9, "id": 482, "frequency": "r", "synset": "garbage.n.03"}, {"name": "garbage_truck", "instance_count": 18, "def": "a truck for collecting domestic refuse", "synonyms": ["garbage_truck"], "image_count": 18, "id": 483, "frequency": "c", "synset": "garbage_truck.n.01"}, {"name": "garden_hose", "instance_count": 50, "def": "a hose used for watering a lawn or garden", "synonyms": ["garden_hose"], "image_count": 41, "id": 484, "frequency": "c", "synset": "garden_hose.n.01"}, {"name": "gargle", "instance_count": 38, "def": "a medicated solution used for gargling and rinsing the mouth", "synonyms": ["gargle", "mouthwash"], "image_count": 28, "id": 485, "frequency": "c", "synset": "gargle.n.01"}, {"name": "gargoyle", "instance_count": 8, "def": "an ornament consisting of a grotesquely carved figure of a person or animal", "synonyms": ["gargoyle"], "image_count": 3, "id": 486, "frequency": "r", "synset": "gargoyle.n.02"}, {"name": "garlic", "instance_count": 487, "def": "aromatic bulb used as seasoning", "synonyms": ["garlic", "ail"], "image_count": 65, "id": 487, "frequency": "c", "synset": "garlic.n.02"}, {"name": "gasmask", "instance_count": 12, "def": "a protective face mask with a filter", "synonyms": ["gasmask", "respirator", "gas_helmet"], "image_count": 9, "id": 488, "frequency": "r", "synset": "gasmask.n.01"}, {"name": "gazelle", "instance_count": 82, "def": "small swift graceful antelope of Africa and Asia having lustrous eyes", "synonyms": ["gazelle"], "image_count": 23, "id": 489, "frequency": "c", "synset": "gazelle.n.01"}, {"name": "gelatin", "instance_count": 248, "def": "an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods", "synonyms": ["gelatin", "jelly"], "image_count": 24, "id": 490, "frequency": "c", "synset": "gelatin.n.02"}, {"name": "gemstone", "instance_count": 2, "def": "a crystalline rock that can be cut and polished for jewelry", "synonyms": ["gemstone"], "image_count": 1, "id": 491, "frequency": "r", "synset": "gem.n.02"}, {"name": "generator", "instance_count": 2, "def": "engine that converts mechanical energy into electrical energy by electromagnetic induction", "synonyms": ["generator"], "image_count": 2, "id": 492, "frequency": "r", "synset": "generator.n.02"}, {"name": "giant_panda", "instance_count": 112, "def": "large black-and-white herbivorous mammal of bamboo forests of China and Tibet", "synonyms": ["giant_panda", "panda", "panda_bear"], "image_count": 59, "id": 493, "frequency": "c", "synset": "giant_panda.n.01"}, {"name": "gift_wrap", "instance_count": 247, "def": "attractive wrapping paper suitable for wrapping gifts", "synonyms": ["gift_wrap"], "image_count": 48, "id": 494, "frequency": "c", "synset": "gift_wrap.n.01"}, {"name": "ginger", "instance_count": 93, "def": "the root of the common ginger plant; used fresh as a seasoning", "synonyms": ["ginger", "gingerroot"], "image_count": 17, "id": 495, "frequency": "c", "synset": "ginger.n.03"}, {"name": "giraffe", "instance_count": 3923, "def": "tall animal having a spotted coat and small horns and very long neck and legs", "synonyms": ["giraffe"], "image_count": 1877, "id": 496, "frequency": "f", "synset": 
"giraffe.n.01"}, {"name": "cincture", "instance_count": 56, "def": "a band of material around the waist that strengthens a skirt or trousers", "synonyms": ["cincture", "sash", "waistband", "waistcloth"], "image_count": 18, "id": 497, "frequency": "c", "synset": "girdle.n.02"}, {"name": "glass_(drink_container)", "instance_count": 6420, "def": "a container for holding liquids while drinking", "synonyms": ["glass_(drink_container)", "drinking_glass"], "image_count": 1920, "id": 498, "frequency": "f", "synset": "glass.n.02"}, {"name": "globe", "instance_count": 59, "def": "a sphere on which a map (especially of the earth) is represented", "synonyms": ["globe"], "image_count": 50, "id": 499, "frequency": "c", "synset": "globe.n.03"}, {"name": "glove", "instance_count": 5951, "def": "handwear covering the hand", "synonyms": ["glove"], "image_count": 1890, "id": 500, "frequency": "f", "synset": "glove.n.02"}, {"name": "goat", "instance_count": 842, "def": "a common goat", "synonyms": ["goat"], "image_count": 99, "id": 501, "frequency": "c", "synset": "goat.n.01"}, {"name": "goggles", "instance_count": 3202, "def": "tight-fitting spectacles worn to protect the eyes", "synonyms": ["goggles"], "image_count": 1530, "id": 502, "frequency": "f", "synset": "goggles.n.01"}, {"name": "goldfish", "instance_count": 11, "def": "small golden or orange-red freshwater fishes used as pond or aquarium pets", "synonyms": ["goldfish"], "image_count": 3, "id": 503, "frequency": "r", "synset": "goldfish.n.01"}, {"name": "golf_club", "instance_count": 14, "def": "golf equipment used by a golfer to hit a golf ball", "synonyms": ["golf_club", "golf-club"], "image_count": 11, "id": 504, "frequency": "c", "synset": "golf_club.n.02"}, {"name": "golfcart", "instance_count": 25, "def": "a small motor vehicle in which golfers can ride between shots", "synonyms": ["golfcart"], "image_count": 19, "id": 505, "frequency": "c", "synset": "golfcart.n.01"}, {"name": "gondola_(boat)", "instance_count": 8, "def": "long narrow flat-bottomed boat propelled by sculling; traditionally used on canals of Venice", "synonyms": ["gondola_(boat)"], "image_count": 3, "id": 506, "frequency": "r", "synset": "gondola.n.02"}, {"name": "goose", "instance_count": 413, "def": "loud, web-footed long-necked aquatic birds usually larger than ducks", "synonyms": ["goose"], "image_count": 63, "id": 507, "frequency": "c", "synset": "goose.n.01"}, {"name": "gorilla", "instance_count": 10, "def": "largest ape", "synonyms": ["gorilla"], "image_count": 5, "id": 508, "frequency": "r", "synset": "gorilla.n.01"}, {"name": "gourd", "instance_count": 101, "def": "any of numerous inedible fruits with hard rinds", "synonyms": ["gourd"], "image_count": 6, "id": 509, "frequency": "r", "synset": "gourd.n.02"}, {"name": "grape", "instance_count": 6377, "def": "any of various juicy fruit with green or purple skins; grow in clusters", "synonyms": ["grape"], "image_count": 233, "id": 510, "frequency": "f", "synset": "grape.n.01"}, {"name": "grater", "instance_count": 64, "def": "utensil with sharp perforations for shredding foods (as vegetables or cheese)", "synonyms": ["grater"], "image_count": 54, "id": 511, "frequency": "c", "synset": "grater.n.01"}, {"name": "gravestone", "instance_count": 778, "def": "a stone that is used to mark a grave", "synonyms": ["gravestone", "headstone", "tombstone"], "image_count": 36, "id": 512, "frequency": "c", "synset": "gravestone.n.01"}, {"name": "gravy_boat", "instance_count": 10, "def": "a dish (often boat-shaped) for serving gravy or 
sauce", "synonyms": ["gravy_boat", "gravy_holder"], "image_count": 10, "id": 513, "frequency": "r", "synset": "gravy_boat.n.01"}, {"name": "green_bean", "instance_count": 2571, "def": "a common bean plant cultivated for its slender green edible pods", "synonyms": ["green_bean"], "image_count": 124, "id": 514, "frequency": "f", "synset": "green_bean.n.02"}, {"name": "green_onion", "instance_count": 1618, "def": "a young onion before the bulb has enlarged", "synonyms": ["green_onion", "spring_onion", "scallion"], "image_count": 101, "id": 515, "frequency": "f", "synset": "green_onion.n.01"}, {"name": "griddle", "instance_count": 4, "def": "cooking utensil consisting of a flat heated surface on which food is cooked", "synonyms": ["griddle"], "image_count": 3, "id": 516, "frequency": "r", "synset": "griddle.n.01"}, {"name": "grill", "instance_count": 747, "def": "a framework of metal bars used as a partition or a grate", "synonyms": ["grill", "grille", "grillwork", "radiator_grille"], "image_count": 363, "id": 517, "frequency": "f", "synset": "grill.n.02"}, {"name": "grits", "instance_count": 3, "def": "coarsely ground corn boiled as a breakfast dish", "synonyms": ["grits", "hominy_grits"], "image_count": 3, "id": 518, "frequency": "r", "synset": "grits.n.01"}, {"name": "grizzly", "instance_count": 44, "def": "powerful brownish-yellow bear of the uplands of western North America", "synonyms": ["grizzly", "grizzly_bear"], "image_count": 30, "id": 519, "frequency": "c", "synset": "grizzly.n.01"}, {"name": "grocery_bag", "instance_count": 46, "def": "a sack for holding customer's groceries", "synonyms": ["grocery_bag"], "image_count": 18, "id": 520, "frequency": "c", "synset": "grocery_bag.n.01"}, {"name": "guitar", "instance_count": 315, "def": "a stringed instrument usually having six strings; played by strumming or plucking", "synonyms": ["guitar"], "image_count": 199, "id": 521, "frequency": "f", "synset": "guitar.n.01"}, {"name": "gull", "instance_count": 1398, "def": "mostly white aquatic bird having long pointed wings and short legs", "synonyms": ["gull", "seagull"], "image_count": 97, "id": 522, "frequency": "c", "synset": "gull.n.02"}, {"name": "gun", "instance_count": 68, "def": "a weapon that discharges a bullet at high velocity from a metal tube", "synonyms": ["gun"], "image_count": 32, "id": 523, "frequency": "c", "synset": "gun.n.01"}, {"name": "hairbrush", "instance_count": 165, "def": "a brush used to groom a person's hair", "synonyms": ["hairbrush"], "image_count": 121, "id": 524, "frequency": "f", "synset": "hairbrush.n.01"}, {"name": "hairnet", "instance_count": 53, "def": "a small net that someone wears over their hair to keep it in place", "synonyms": ["hairnet"], "image_count": 16, "id": 525, "frequency": "c", "synset": "hairnet.n.01"}, {"name": "hairpin", "instance_count": 20, "def": "a double pronged pin used to hold women's hair in place", "synonyms": ["hairpin"], "image_count": 12, "id": 526, "frequency": "c", "synset": "hairpin.n.01"}, {"name": "halter_top", "instance_count": 3, "def": "a woman's top that fastens behind the back and neck leaving the back and arms uncovered", "synonyms": ["halter_top"], "image_count": 2, "id": 527, "frequency": "r", "synset": "halter.n.03"}, {"name": "ham", "instance_count": 1765, "def": "meat cut from the thigh of a hog (usually smoked)", "synonyms": ["ham", "jambon", "gammon"], "image_count": 214, "id": 528, "frequency": "f", "synset": "ham.n.01"}, {"name": "hamburger", "instance_count": 126, "def": "a sandwich consisting of a patty of 
minced beef served on a bun", "synonyms": ["hamburger", "beefburger", "burger"], "image_count": 48, "id": 529, "frequency": "c", "synset": "hamburger.n.01"}, {"name": "hammer", "instance_count": 41, "def": "a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking", "synonyms": ["hammer"], "image_count": 26, "id": 530, "frequency": "c", "synset": "hammer.n.02"}, {"name": "hammock", "instance_count": 15, "def": "a hanging bed of canvas or rope netting (usually suspended between two trees)", "synonyms": ["hammock"], "image_count": 13, "id": 531, "frequency": "c", "synset": "hammock.n.02"}, {"name": "hamper", "instance_count": 5, "def": "a basket usually with a cover", "synonyms": ["hamper"], "image_count": 4, "id": 532, "frequency": "r", "synset": "hamper.n.02"}, {"name": "hamster", "instance_count": 12, "def": "short-tailed burrowing rodent with large cheek pouches", "synonyms": ["hamster"], "image_count": 11, "id": 533, "frequency": "c", "synset": "hamster.n.01"}, {"name": "hair_dryer", "instance_count": 144, "def": "a hand-held electric blower that can blow warm air onto the hair", "synonyms": ["hair_dryer"], "image_count": 123, "id": 534, "frequency": "f", "synset": "hand_blower.n.01"}, {"name": "hand_glass", "instance_count": 7, "def": "a mirror intended to be held in the hand", "synonyms": ["hand_glass", "hand_mirror"], "image_count": 7, "id": 535, "frequency": "r", "synset": "hand_glass.n.01"}, {"name": "hand_towel", "instance_count": 619, "def": "a small towel used to dry the hands or face", "synonyms": ["hand_towel", "face_towel"], "image_count": 200, "id": 536, "frequency": "f", "synset": "hand_towel.n.01"}, {"name": "handcart", "instance_count": 204, "def": "wheeled vehicle that can be pushed by a person", "synonyms": ["handcart", "pushcart", "hand_truck"], "image_count": 91, "id": 537, "frequency": "c", "synset": "handcart.n.01"}, {"name": "handcuff", "instance_count": 10, "def": "shackle that consists of a metal loop that can be locked around the wrist", "synonyms": ["handcuff"], "image_count": 9, "id": 538, "frequency": "r", "synset": "handcuff.n.01"}, {"name": "handkerchief", "instance_count": 86, "def": "a square piece of cloth used for wiping the eyes or nose or as a costume accessory", "synonyms": ["handkerchief"], "image_count": 72, "id": 539, "frequency": "c", "synset": "handkerchief.n.01"}, {"name": "handle", "instance_count": 8314, "def": "the appendage to an object that is designed to be held in order to use or move it", "synonyms": ["handle", "grip", "handgrip"], "image_count": 1886, "id": 540, "frequency": "f", "synset": "handle.n.01"}, {"name": "handsaw", "instance_count": 5, "def": "a saw used with one hand for cutting wood", "synonyms": ["handsaw", "carpenter's_saw"], "image_count": 4, "id": 541, "frequency": "r", "synset": "handsaw.n.01"}, {"name": "hardback_book", "instance_count": 2, "def": "a book with cardboard or cloth or leather covers", "synonyms": ["hardback_book", "hardcover_book"], "image_count": 1, "id": 542, "frequency": "r", "synset": "hardback.n.01"}, {"name": "harmonium", "instance_count": 2, "def": "a free-reed instrument in which air is forced through the reeds by bellows", "synonyms": ["harmonium", "organ_(musical_instrument)", "reed_organ_(musical_instrument)"], "image_count": 1, "id": 543, "frequency": "r", "synset": "harmonium.n.01"}, {"name": "hat", "instance_count": 7213, "def": "headwear that protects the head from bad weather, sun, or worn for fashion", "synonyms": ["hat"], "image_count": 1932, "id": 544, 
"frequency": "f", "synset": "hat.n.01"}, {"name": "hatbox", "instance_count": 7, "def": "a round piece of luggage for carrying hats", "synonyms": ["hatbox"], "image_count": 4, "id": 545, "frequency": "r", "synset": "hatbox.n.01"}, {"name": "veil", "instance_count": 57, "def": "a garment that covers the head OR face", "synonyms": ["veil"], "image_count": 56, "id": 546, "frequency": "c", "synset": "head_covering.n.01"}, {"name": "headband", "instance_count": 1114, "def": "a band worn around or over the head", "synonyms": ["headband"], "image_count": 854, "id": 547, "frequency": "f", "synset": "headband.n.01"}, {"name": "headboard", "instance_count": 850, "def": "a vertical board or panel forming the head of a bedstead", "synonyms": ["headboard"], "image_count": 755, "id": 548, "frequency": "f", "synset": "headboard.n.01"}, {"name": "headlight", "instance_count": 7326, "def": "a powerful light with reflector; attached to the front of an automobile or locomotive", "synonyms": ["headlight", "headlamp"], "image_count": 1843, "id": 549, "frequency": "f", "synset": "headlight.n.01"}, {"name": "headscarf", "instance_count": 235, "def": "a kerchief worn over the head and tied under the chin", "synonyms": ["headscarf"], "image_count": 96, "id": 550, "frequency": "c", "synset": "headscarf.n.01"}, {"name": "headset", "instance_count": 10, "def": "receiver consisting of a pair of headphones", "synonyms": ["headset"], "image_count": 7, "id": 551, "frequency": "r", "synset": "headset.n.01"}, {"name": "headstall_(for_horses)", "instance_count": 133, "def": "the band that is the part of a bridle that fits around a horse's head", "synonyms": ["headstall_(for_horses)", "headpiece_(for_horses)"], "image_count": 74, "id": 552, "frequency": "c", "synset": "headstall.n.01"}, {"name": "heart", "instance_count": 347, "def": "a muscular organ; its contractions move the blood through the body", "synonyms": ["heart"], "image_count": 66, "id": 553, "frequency": "c", "synset": "heart.n.02"}, {"name": "heater", "instance_count": 64, "def": "device that heats water or supplies warmth to a room", "synonyms": ["heater", "warmer"], "image_count": 57, "id": 554, "frequency": "c", "synset": "heater.n.01"}, {"name": "helicopter", "instance_count": 68, "def": "an aircraft without wings that obtains its lift from the rotation of overhead blades", "synonyms": ["helicopter"], "image_count": 44, "id": 555, "frequency": "c", "synset": "helicopter.n.01"}, {"name": "helmet", "instance_count": 4845, "def": "a protective headgear made of hard material to resist blows", "synonyms": ["helmet"], "image_count": 1905, "id": 556, "frequency": "f", "synset": "helmet.n.02"}, {"name": "heron", "instance_count": 6, "def": "grey or white wading bird with long neck and long legs and (usually) long bill", "synonyms": ["heron"], "image_count": 4, "id": 557, "frequency": "r", "synset": "heron.n.02"}, {"name": "highchair", "instance_count": 98, "def": "a chair for feeding a very young child", "synonyms": ["highchair", "feeding_chair"], "image_count": 90, "id": 558, "frequency": "c", "synset": "highchair.n.01"}, {"name": "hinge", "instance_count": 5283, "def": "a joint that holds two parts together so that one can swing relative to the other", "synonyms": ["hinge"], "image_count": 1635, "id": 559, "frequency": "f", "synset": "hinge.n.01"}, {"name": "hippopotamus", "instance_count": 24, "def": "massive thick-skinned animal living in or around rivers of tropical Africa", "synonyms": ["hippopotamus"], "image_count": 8, "id": 560, "frequency": "r", "synset": 
"hippopotamus.n.01"}, {"name": "hockey_stick", "instance_count": 15, "def": "sports implement consisting of a stick used by hockey players to move the puck", "synonyms": ["hockey_stick"], "image_count": 5, "id": 561, "frequency": "r", "synset": "hockey_stick.n.01"}, {"name": "hog", "instance_count": 73, "def": "domestic swine", "synonyms": ["hog", "pig"], "image_count": 50, "id": 562, "frequency": "c", "synset": "hog.n.03"}, {"name": "home_plate_(baseball)", "instance_count": 551, "def": "(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score", "synonyms": ["home_plate_(baseball)", "home_base_(baseball)"], "image_count": 545, "id": 563, "frequency": "f", "synset": "home_plate.n.01"}, {"name": "honey", "instance_count": 90, "def": "a sweet yellow liquid produced by bees", "synonyms": ["honey"], "image_count": 20, "id": 564, "frequency": "c", "synset": "honey.n.01"}, {"name": "fume_hood", "instance_count": 208, "def": "metal covering leading to a vent that exhausts smoke or fumes", "synonyms": ["fume_hood", "exhaust_hood"], "image_count": 193, "id": 565, "frequency": "f", "synset": "hood.n.06"}, {"name": "hook", "instance_count": 1157, "def": "a curved or bent implement for suspending or pulling something", "synonyms": ["hook"], "image_count": 285, "id": 566, "frequency": "f", "synset": "hook.n.05"}, {"name": "hookah", "instance_count": 3, "def": "a tobacco pipe with a long flexible tube connected to a container where the smoke is cooled by passing through water", "synonyms": ["hookah", "narghile", "nargileh", "sheesha", "shisha", "water_pipe"], "image_count": 3, "id": 567, "frequency": "r", "synset": "hookah.n.01"}, {"name": "hornet", "instance_count": 1, "def": "large stinging wasp", "synonyms": ["hornet"], "image_count": 1, "id": 568, "frequency": "r", "synset": "hornet.n.01"}, {"name": "horse", "instance_count": 4744, "def": "a common horse", "synonyms": ["horse"], "image_count": 1904, "id": 569, "frequency": "f", "synset": "horse.n.01"}, {"name": "hose", "instance_count": 610, "def": "a flexible pipe for conveying a liquid or gas", "synonyms": ["hose", "hosepipe"], "image_count": 294, "id": 570, "frequency": "f", "synset": "hose.n.03"}, {"name": "hot-air_balloon", "instance_count": 4, "def": "balloon for travel through the air in a basket suspended below a large bag of heated air", "synonyms": ["hot-air_balloon"], "image_count": 3, "id": 571, "frequency": "r", "synset": "hot-air_balloon.n.01"}, {"name": "hotplate", "instance_count": 6, "def": "a portable electric appliance for heating or cooking or keeping food warm", "synonyms": ["hotplate"], "image_count": 5, "id": 572, "frequency": "r", "synset": "hot_plate.n.01"}, {"name": "hot_sauce", "instance_count": 70, "def": "a pungent peppery sauce", "synonyms": ["hot_sauce"], "image_count": 24, "id": 573, "frequency": "c", "synset": "hot_sauce.n.01"}, {"name": "hourglass", "instance_count": 2, "def": "a sandglass timer that runs for sixty minutes", "synonyms": ["hourglass"], "image_count": 2, "id": 574, "frequency": "r", "synset": "hourglass.n.01"}, {"name": "houseboat", "instance_count": 4, "def": "a barge that is designed and equipped for use as a dwelling", "synonyms": ["houseboat"], "image_count": 2, "id": 575, "frequency": "r", "synset": "houseboat.n.01"}, {"name": "hummingbird", "instance_count": 18, "def": "tiny American bird having brilliant iridescent plumage and long slender bills", "synonyms": ["hummingbird"], "image_count": 16, "id": 576, "frequency": "c", "synset": 
"hummingbird.n.01"}, {"name": "hummus", "instance_count": 9, "def": "a thick spread made from mashed chickpeas", "synonyms": ["hummus", "humus", "hommos", "hoummos", "humous"], "image_count": 8, "id": 577, "frequency": "r", "synset": "hummus.n.01"}, {"name": "polar_bear", "instance_count": 196, "def": "white bear of Arctic regions", "synonyms": ["polar_bear"], "image_count": 154, "id": 578, "frequency": "f", "synset": "ice_bear.n.01"}, {"name": "icecream", "instance_count": 180, "def": "frozen dessert containing cream and sugar and flavoring", "synonyms": ["icecream"], "image_count": 66, "id": 579, "frequency": "c", "synset": "ice_cream.n.01"}, {"name": "popsicle", "instance_count": 1, "def": "ice cream or water ice on a small wooden stick", "synonyms": ["popsicle"], "image_count": 1, "id": 580, "frequency": "r", "synset": "ice_lolly.n.01"}, {"name": "ice_maker", "instance_count": 26, "def": "an appliance included in some electric refrigerators for making ice cubes", "synonyms": ["ice_maker"], "image_count": 24, "id": 581, "frequency": "c", "synset": "ice_maker.n.01"}, {"name": "ice_pack", "instance_count": 4, "def": "a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling", "synonyms": ["ice_pack", "ice_bag"], "image_count": 1, "id": 582, "frequency": "r", "synset": "ice_pack.n.01"}, {"name": "ice_skate", "instance_count": 14, "def": "skate consisting of a boot with a steel blade fitted to the sole", "synonyms": ["ice_skate"], "image_count": 4, "id": 583, "frequency": "r", "synset": "ice_skate.n.01"}, {"name": "igniter", "instance_count": 77, "def": "a substance or device used to start a fire", "synonyms": ["igniter", "ignitor", "lighter"], "image_count": 75, "id": 584, "frequency": "c", "synset": "igniter.n.01"}, {"name": "inhaler", "instance_count": 7, "def": "a dispenser that produces a chemical vapor to be inhaled through mouth or nose", "synonyms": ["inhaler", "inhalator"], "image_count": 6, "id": 585, "frequency": "r", "synset": "inhaler.n.01"}, {"name": "iPod", "instance_count": 172, "def": "a pocket-sized device used to play music files", "synonyms": ["iPod"], "image_count": 126, "id": 586, "frequency": "f", "synset": "ipod.n.01"}, {"name": "iron_(for_clothing)", "instance_count": 38, "def": "home appliance consisting of a flat metal base that is heated and used to smooth cloth", "synonyms": ["iron_(for_clothing)", "smoothing_iron_(for_clothing)"], "image_count": 24, "id": 587, "frequency": "c", "synset": "iron.n.04"}, {"name": "ironing_board", "instance_count": 24, "def": "narrow padded board on collapsible supports; used for ironing clothes", "synonyms": ["ironing_board"], "image_count": 22, "id": 588, "frequency": "c", "synset": "ironing_board.n.01"}, {"name": "jacket", "instance_count": 8013, "def": "a waist-length coat", "synonyms": ["jacket"], "image_count": 1872, "id": 589, "frequency": "f", "synset": "jacket.n.01"}, {"name": "jam", "instance_count": 29, "def": "preserve of crushed fruit", "synonyms": ["jam"], "image_count": 16, "id": 590, "frequency": "c", "synset": "jam.n.01"}, {"name": "jar", "instance_count": 2002, "def": "a vessel (usually cylindrical) with a wide mouth and without handles", "synonyms": ["jar"], "image_count": 423, "id": 591, "frequency": "f", "synset": "jar.n.01"}, {"name": "jean", "instance_count": 5421, "def": "(usually plural) close-fitting trousers of heavy denim for manual work or casual wear", "synonyms": ["jean", "blue_jean", "denim"], "image_count": 1927, "id": 592, "frequency": "f", "synset": 
"jean.n.01"}, {"name": "jeep", "instance_count": 55, "def": "a car suitable for traveling over rough terrain", "synonyms": ["jeep", "landrover"], "image_count": 38, "id": 593, "frequency": "c", "synset": "jeep.n.01"}, {"name": "jelly_bean", "instance_count": 116, "def": "sugar-glazed jellied candy", "synonyms": ["jelly_bean", "jelly_egg"], "image_count": 3, "id": 594, "frequency": "r", "synset": "jelly_bean.n.01"}, {"name": "jersey", "instance_count": 8117, "def": "a close-fitting pullover shirt", "synonyms": ["jersey", "T-shirt", "tee_shirt"], "image_count": 1945, "id": 595, "frequency": "f", "synset": "jersey.n.03"}, {"name": "jet_plane", "instance_count": 87, "def": "an airplane powered by one or more jet engines", "synonyms": ["jet_plane", "jet-propelled_plane"], "image_count": 35, "id": 596, "frequency": "c", "synset": "jet.n.01"}, {"name": "jewel", "instance_count": 1, "def": "a precious or semiprecious stone incorporated into a piece of jewelry", "synonyms": ["jewel", "gem", "precious_stone"], "image_count": 1, "id": 597, "frequency": "r", "synset": "jewel.n.01"}, {"name": "jewelry", "instance_count": 51, "def": "an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)", "synonyms": ["jewelry", "jewellery"], "image_count": 13, "id": 598, "frequency": "c", "synset": "jewelry.n.01"}, {"name": "joystick", "instance_count": 12, "def": "a control device for computers consisting of a vertical handle that can move freely in two directions", "synonyms": ["joystick"], "image_count": 9, "id": 599, "frequency": "r", "synset": "joystick.n.02"}, {"name": "jumpsuit", "instance_count": 21, "def": "one-piece garment fashioned after a parachutist's uniform", "synonyms": ["jumpsuit"], "image_count": 14, "id": 600, "frequency": "c", "synset": "jump_suit.n.01"}, {"name": "kayak", "instance_count": 124, "def": "a small canoe consisting of a light frame made watertight with animal skins", "synonyms": ["kayak"], "image_count": 37, "id": 601, "frequency": "c", "synset": "kayak.n.01"}, {"name": "keg", "instance_count": 6, "def": "small cask or barrel", "synonyms": ["keg"], "image_count": 3, "id": 602, "frequency": "r", "synset": "keg.n.02"}, {"name": "kennel", "instance_count": 4, "def": "outbuilding that serves as a shelter for a dog", "synonyms": ["kennel", "doghouse"], "image_count": 4, "id": 603, "frequency": "r", "synset": "kennel.n.01"}, {"name": "kettle", "instance_count": 130, "def": "a metal pot for stewing or boiling; usually has a lid", "synonyms": ["kettle", "boiler"], "image_count": 100, "id": 604, "frequency": "c", "synset": "kettle.n.01"}, {"name": "key", "instance_count": 447, "def": "metal instrument used to unlock a lock", "synonyms": ["key"], "image_count": 195, "id": 605, "frequency": "f", "synset": "key.n.01"}, {"name": "keycard", "instance_count": 1, "def": "a plastic card used to gain access typically to a door", "synonyms": ["keycard"], "image_count": 1, "id": 606, "frequency": "r", "synset": "keycard.n.01"}, {"name": "kilt", "instance_count": 19, "def": "a knee-length pleated tartan skirt worn by men as part of the traditional dress in the Highlands of northern Scotland", "synonyms": ["kilt"], "image_count": 12, "id": 607, "frequency": "c", "synset": "kilt.n.01"}, {"name": "kimono", "instance_count": 38, "def": "a loose robe; imitated from robes originally worn by Japanese", "synonyms": ["kimono"], "image_count": 24, "id": 608, "frequency": "c", "synset": "kimono.n.01"}, {"name": "kitchen_sink", "instance_count": 519, "def": "a 
sink in a kitchen", "synonyms": ["kitchen_sink"], "image_count": 489, "id": 609, "frequency": "f", "synset": "kitchen_sink.n.01"}, {"name": "kitchen_table", "instance_count": 11, "def": "a table in the kitchen", "synonyms": ["kitchen_table"], "image_count": 10, "id": 610, "frequency": "r", "synset": "kitchen_table.n.01"}, {"name": "kite", "instance_count": 11174, "def": "plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string", "synonyms": ["kite"], "image_count": 1689, "id": 611, "frequency": "f", "synset": "kite.n.03"}, {"name": "kitten", "instance_count": 60, "def": "young domestic cat", "synonyms": ["kitten", "kitty"], "image_count": 42, "id": 612, "frequency": "c", "synset": "kitten.n.01"}, {"name": "kiwi_fruit", "instance_count": 702, "def": "fuzzy brown egg-shaped fruit with slightly tart green flesh", "synonyms": ["kiwi_fruit"], "image_count": 81, "id": 613, "frequency": "c", "synset": "kiwi.n.03"}, {"name": "knee_pad", "instance_count": 1765, "def": "protective garment consisting of a pad worn by football or baseball or hockey players", "synonyms": ["knee_pad"], "image_count": 894, "id": 614, "frequency": "f", "synset": "knee_pad.n.01"}, {"name": "knife", "instance_count": 3515, "def": "tool with a blade and point used as a cutting instrument", "synonyms": ["knife"], "image_count": 1868, "id": 615, "frequency": "f", "synset": "knife.n.01"}, {"name": "knitting_needle", "instance_count": 16, "def": "needle consisting of a slender rod with pointed ends; usually used in pairs", "synonyms": ["knitting_needle"], "image_count": 7, "id": 616, "frequency": "r", "synset": "knitting_needle.n.01"}, {"name": "knob", "instance_count": 8432, "def": "a round handle often found on a door", "synonyms": ["knob"], "image_count": 1567, "id": 617, "frequency": "f", "synset": "knob.n.02"}, {"name": "knocker_(on_a_door)", "instance_count": 10, "def": "a device (usually metal and ornamental) attached by a hinge to a door", "synonyms": ["knocker_(on_a_door)", "doorknocker"], "image_count": 10, "id": 618, "frequency": "r", "synset": "knocker.n.05"}, {"name": "koala", "instance_count": 15, "def": "sluggish tailless Australian marsupial with grey furry ears and coat", "synonyms": ["koala", "koala_bear"], "image_count": 8, "id": 619, "frequency": "r", "synset": "koala.n.01"}, {"name": "lab_coat", "instance_count": 42, "def": "a light coat worn to protect clothing from substances used while working in a laboratory", "synonyms": ["lab_coat", "laboratory_coat"], "image_count": 7, "id": 620, "frequency": "r", "synset": "lab_coat.n.01"}, {"name": "ladder", "instance_count": 975, "def": "steps consisting of two parallel members connected by rungs", "synonyms": ["ladder"], "image_count": 629, "id": 621, "frequency": "f", "synset": "ladder.n.01"}, {"name": "ladle", "instance_count": 226, "def": "a spoon-shaped vessel with a long handle frequently used to transfer liquids", "synonyms": ["ladle"], "image_count": 89, "id": 622, "frequency": "c", "synset": "ladle.n.01"}, {"name": "ladybug", "instance_count": 68, "def": "small round bright-colored and spotted beetle, typically red and black", "synonyms": ["ladybug", "ladybeetle", "ladybird_beetle"], "image_count": 15, "id": 623, "frequency": "c", "synset": "ladybug.n.01"}, {"name": "lamb_(animal)", "instance_count": 618, "def": "young sheep", "synonyms": ["lamb_(animal)"], "image_count": 134, "id": 624, "frequency": "f", "synset": "lamb.n.01"}, {"name": "lamb-chop", "instance_count": 8, "def": "chop cut from a lamb", "synonyms": 
["lamb-chop", "lambchop"], "image_count": 4, "id": 625, "frequency": "r", "synset": "lamb_chop.n.01"}, {"name": "lamp", "instance_count": 4139, "def": "a piece of furniture holding one or more electric light bulbs", "synonyms": ["lamp"], "image_count": 1802, "id": 626, "frequency": "f", "synset": "lamp.n.02"}, {"name": "lamppost", "instance_count": 2234, "def": "a metal post supporting an outdoor lamp (such as a streetlight)", "synonyms": ["lamppost"], "image_count": 595, "id": 627, "frequency": "f", "synset": "lamppost.n.01"}, {"name": "lampshade", "instance_count": 2475, "def": "a protective ornamental shade used to screen a light bulb from direct view", "synonyms": ["lampshade"], "image_count": 1210, "id": 628, "frequency": "f", "synset": "lampshade.n.01"}, {"name": "lantern", "instance_count": 364, "def": "light in a transparent protective case", "synonyms": ["lantern"], "image_count": 48, "id": 629, "frequency": "c", "synset": "lantern.n.01"}, {"name": "lanyard", "instance_count": 1065, "def": "a cord worn around the neck to hold a knife or whistle, etc.", "synonyms": ["lanyard", "laniard"], "image_count": 418, "id": 630, "frequency": "f", "synset": "lanyard.n.02"}, {"name": "laptop_computer", "instance_count": 2852, "def": "a portable computer small enough to use in your lap", "synonyms": ["laptop_computer", "notebook_computer"], "image_count": 1846, "id": 631, "frequency": "f", "synset": "laptop.n.01"}, {"name": "lasagna", "instance_count": 7, "def": "baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables", "synonyms": ["lasagna", "lasagne"], "image_count": 5, "id": 632, "frequency": "r", "synset": "lasagna.n.01"}, {"name": "latch", "instance_count": 702, "def": "a bar that can be lowered or slid into a groove to fasten a door or gate", "synonyms": ["latch"], "image_count": 221, "id": 633, "frequency": "f", "synset": "latch.n.02"}, {"name": "lawn_mower", "instance_count": 12, "def": "garden tool for mowing grass on lawns", "synonyms": ["lawn_mower"], "image_count": 10, "id": 634, "frequency": "r", "synset": "lawn_mower.n.01"}, {"name": "leather", "instance_count": 20, "def": "an animal skin made smooth and flexible by removing the hair and then tanning", "synonyms": ["leather"], "image_count": 7, "id": 635, "frequency": "r", "synset": "leather.n.01"}, {"name": "legging_(clothing)", "instance_count": 154, "def": "a garment covering the leg (usually extending from the knee to the ankle)", "synonyms": ["legging_(clothing)", "leging_(clothing)", "leg_covering"], "image_count": 76, "id": 636, "frequency": "c", "synset": "legging.n.01"}, {"name": "Lego", "instance_count": 331, "def": "a child's plastic construction set for making models from blocks", "synonyms": ["Lego", "Lego_set"], "image_count": 22, "id": 637, "frequency": "c", "synset": "lego.n.01"}, {"name": "legume", "instance_count": 333, "def": "the fruit or seed of bean or pea plants", "synonyms": ["legume"], "image_count": 10, "id": 638, "frequency": "r", "synset": "legume.n.02"}, {"name": "lemon", "instance_count": 2168, "def": "yellow oval fruit with juicy acidic flesh", "synonyms": ["lemon"], "image_count": 341, "id": 639, "frequency": "f", "synset": "lemon.n.01"}, {"name": "lemonade", "instance_count": 2, "def": "sweetened beverage of diluted lemon juice", "synonyms": ["lemonade"], "image_count": 1, "id": 640, "frequency": "r", "synset": "lemonade.n.01"}, {"name": "lettuce", "instance_count": 5500, "def": "leafy plant commonly eaten in salad or on sandwiches", "synonyms": ["lettuce"], 
"image_count": 705, "id": 641, "frequency": "f", "synset": "lettuce.n.02"}, {"name": "license_plate", "instance_count": 4392, "def": "a plate mounted on the front and back of car and bearing the car's registration number", "synonyms": ["license_plate", "numberplate"], "image_count": 1900, "id": 642, "frequency": "f", "synset": "license_plate.n.01"}, {"name": "life_buoy", "instance_count": 524, "def": "a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)", "synonyms": ["life_buoy", "lifesaver", "life_belt", "life_ring"], "image_count": 188, "id": 643, "frequency": "f", "synset": "life_buoy.n.01"}, {"name": "life_jacket", "instance_count": 689, "def": "life preserver consisting of a sleeveless jacket of buoyant or inflatable design", "synonyms": ["life_jacket", "life_vest"], "image_count": 227, "id": 644, "frequency": "f", "synset": "life_jacket.n.01"}, {"name": "lightbulb", "instance_count": 7075, "def": "lightblub/source of light", "synonyms": ["lightbulb"], "image_count": 861, "id": 645, "frequency": "f", "synset": "light_bulb.n.01"}, {"name": "lightning_rod", "instance_count": 6, "def": "a metallic conductor that is attached to a high point and leads to the ground", "synonyms": ["lightning_rod", "lightning_conductor"], "image_count": 6, "id": 646, "frequency": "r", "synset": "lightning_rod.n.02"}, {"name": "lime", "instance_count": 1134, "def": "the green acidic fruit of any of various lime trees", "synonyms": ["lime"], "image_count": 115, "id": 647, "frequency": "f", "synset": "lime.n.06"}, {"name": "limousine", "instance_count": 6, "def": "long luxurious car; usually driven by a chauffeur", "synonyms": ["limousine"], "image_count": 5, "id": 648, "frequency": "r", "synset": "limousine.n.01"}, {"name": "lion", "instance_count": 69, "def": "large gregarious predatory cat of Africa and India", "synonyms": ["lion"], "image_count": 43, "id": 649, "frequency": "c", "synset": "lion.n.01"}, {"name": "lip_balm", "instance_count": 29, "def": "a balm applied to the lips", "synonyms": ["lip_balm"], "image_count": 14, "id": 650, "frequency": "c", "synset": "lip_balm.n.01"}, {"name": "liquor", "instance_count": 66, "def": "liquor or beer", "synonyms": ["liquor", "spirits", "hard_liquor", "liqueur", "cordial"], "image_count": 6, "id": 651, "frequency": "r", "synset": "liquor.n.01"}, {"name": "lizard", "instance_count": 22, "def": "a reptile with usually two pairs of legs and a tapering tail", "synonyms": ["lizard"], "image_count": 15, "id": 652, "frequency": "c", "synset": "lizard.n.01"}, {"name": "log", "instance_count": 7363, "def": "a segment of the trunk of a tree when stripped of branches", "synonyms": ["log"], "image_count": 1167, "id": 653, "frequency": "f", "synset": "log.n.01"}, {"name": "lollipop", "instance_count": 59, "def": "hard candy on a stick", "synonyms": ["lollipop"], "image_count": 15, "id": 654, "frequency": "c", "synset": "lollipop.n.02"}, {"name": "speaker_(stero_equipment)", "instance_count": 2029, "def": "electronic device that produces sound often as part of a stereo system", "synonyms": ["speaker_(stero_equipment)"], "image_count": 994, "id": 655, "frequency": "f", "synset": "loudspeaker.n.01"}, {"name": "loveseat", "instance_count": 41, "def": "small sofa that seats two people", "synonyms": ["loveseat"], "image_count": 28, "id": 656, "frequency": "c", "synset": "love_seat.n.01"}, {"name": "machine_gun", "instance_count": 5, "def": "a rapidly firing automatic gun", "synonyms": ["machine_gun"], "image_count": 2, "id": 657, "frequency": "r", 
"synset": "machine_gun.n.01"}, {"name": "magazine", "instance_count": 1379, "def": "a paperback periodic publication", "synonyms": ["magazine"], "image_count": 338, "id": 658, "frequency": "f", "synset": "magazine.n.02"}, {"name": "magnet", "instance_count": 5638, "def": "a device that attracts iron and produces a magnetic field", "synonyms": ["magnet"], "image_count": 334, "id": 659, "frequency": "f", "synset": "magnet.n.01"}, {"name": "mail_slot", "instance_count": 16, "def": "a slot (usually in a door) through which mail can be delivered", "synonyms": ["mail_slot"], "image_count": 15, "id": 660, "frequency": "c", "synset": "mail_slot.n.01"}, {"name": "mailbox_(at_home)", "instance_count": 240, "def": "a private box for delivery of mail", "synonyms": ["mailbox_(at_home)", "letter_box_(at_home)"], "image_count": 102, "id": 661, "frequency": "f", "synset": "mailbox.n.01"}, {"name": "mallard", "instance_count": 2, "def": "wild dabbling duck from which domestic ducks are descended", "synonyms": ["mallard"], "image_count": 1, "id": 662, "frequency": "r", "synset": "mallard.n.01"}, {"name": "mallet", "instance_count": 16, "def": "a sports implement with a long handle and a hammer-like head used to hit a ball", "synonyms": ["mallet"], "image_count": 8, "id": 663, "frequency": "r", "synset": "mallet.n.01"}, {"name": "mammoth", "instance_count": 2, "def": "any of numerous extinct elephants widely distributed in the Pleistocene", "synonyms": ["mammoth"], "image_count": 1, "id": 664, "frequency": "r", "synset": "mammoth.n.01"}, {"name": "manatee", "instance_count": 1, "def": "sirenian mammal of tropical coastal waters of America", "synonyms": ["manatee"], "image_count": 1, "id": 665, "frequency": "r", "synset": "manatee.n.01"}, {"name": "mandarin_orange", "instance_count": 401, "def": "a somewhat flat reddish-orange loose skinned citrus of China", "synonyms": ["mandarin_orange"], "image_count": 28, "id": 666, "frequency": "c", "synset": "mandarin.n.05"}, {"name": "manger", "instance_count": 126, "def": "a container (usually in a barn or stable) from which cattle or horses feed", "synonyms": ["manger", "trough"], "image_count": 91, "id": 667, "frequency": "c", "synset": "manger.n.01"}, {"name": "manhole", "instance_count": 445, "def": "a hole (usually with a flush cover) through which a person can gain access to an underground structure", "synonyms": ["manhole"], "image_count": 260, "id": 668, "frequency": "f", "synset": "manhole.n.01"}, {"name": "map", "instance_count": 186, "def": "a diagrammatic representation of the earth's surface (or part of it)", "synonyms": ["map"], "image_count": 131, "id": 669, "frequency": "f", "synset": "map.n.01"}, {"name": "marker", "instance_count": 501, "def": "a writing implement for making a mark", "synonyms": ["marker"], "image_count": 128, "id": 670, "frequency": "f", "synset": "marker.n.03"}, {"name": "martini", "instance_count": 3, "def": "a cocktail made of gin (or vodka) with dry vermouth", "synonyms": ["martini"], "image_count": 3, "id": 671, "frequency": "r", "synset": "martini.n.01"}, {"name": "mascot", "instance_count": 10, "def": "a person or animal that is adopted by a team or other group as a symbolic figure", "synonyms": ["mascot"], "image_count": 10, "id": 672, "frequency": "r", "synset": "mascot.n.01"}, {"name": "mashed_potato", "instance_count": 58, "def": "potato that has been peeled and boiled and then mashed", "synonyms": ["mashed_potato"], "image_count": 39, "id": 673, "frequency": "c", "synset": "mashed_potato.n.01"}, {"name": "masher", 
"instance_count": 2, "def": "a kitchen utensil used for mashing (e.g. potatoes)", "synonyms": ["masher"], "image_count": 2, "id": 674, "frequency": "r", "synset": "masher.n.02"}, {"name": "mask", "instance_count": 1595, "def": "a protective covering worn over the face", "synonyms": ["mask", "facemask"], "image_count": 925, "id": 675, "frequency": "f", "synset": "mask.n.04"}, {"name": "mast", "instance_count": 2985, "def": "a vertical spar for supporting sails", "synonyms": ["mast"], "image_count": 354, "id": 676, "frequency": "f", "synset": "mast.n.01"}, {"name": "mat_(gym_equipment)", "instance_count": 114, "def": "sports equipment consisting of a piece of thick padding on the floor for gymnastics", "synonyms": ["mat_(gym_equipment)", "gym_mat"], "image_count": 31, "id": 677, "frequency": "c", "synset": "mat.n.03"}, {"name": "matchbox", "instance_count": 11, "def": "a box for holding matches", "synonyms": ["matchbox"], "image_count": 10, "id": 678, "frequency": "r", "synset": "matchbox.n.01"}, {"name": "mattress", "instance_count": 354, "def": "a thick pad filled with resilient material used as a bed or part of a bed", "synonyms": ["mattress"], "image_count": 215, "id": 679, "frequency": "f", "synset": "mattress.n.01"}, {"name": "measuring_cup", "instance_count": 139, "def": "graduated cup used to measure liquid or granular ingredients", "synonyms": ["measuring_cup"], "image_count": 71, "id": 680, "frequency": "c", "synset": "measuring_cup.n.01"}, {"name": "measuring_stick", "instance_count": 57, "def": "measuring instrument having a sequence of marks at regular intervals", "synonyms": ["measuring_stick", "ruler_(measuring_stick)", "measuring_rod"], "image_count": 43, "id": 681, "frequency": "c", "synset": "measuring_stick.n.01"}, {"name": "meatball", "instance_count": 174, "def": "ground meat formed into a ball and fried or simmered in broth", "synonyms": ["meatball"], "image_count": 28, "id": 682, "frequency": "c", "synset": "meatball.n.01"}, {"name": "medicine", "instance_count": 243, "def": "something that treats or prevents or alleviates the symptoms of disease", "synonyms": ["medicine"], "image_count": 34, "id": 683, "frequency": "c", "synset": "medicine.n.02"}, {"name": "melon", "instance_count": 167, "def": "fruit of the gourd family having a hard rind and sweet juicy flesh", "synonyms": ["melon"], "image_count": 16, "id": 684, "frequency": "c", "synset": "melon.n.01"}, {"name": "microphone", "instance_count": 435, "def": "device for converting sound waves into electrical energy", "synonyms": ["microphone"], "image_count": 273, "id": 685, "frequency": "f", "synset": "microphone.n.01"}, {"name": "microscope", "instance_count": 3, "def": "magnifier of the image of small objects", "synonyms": ["microscope"], "image_count": 2, "id": 686, "frequency": "r", "synset": "microscope.n.01"}, {"name": "microwave_oven", "instance_count": 1105, "def": "kitchen appliance that cooks food by passing an electromagnetic wave through it", "synonyms": ["microwave_oven"], "image_count": 999, "id": 687, "frequency": "f", "synset": "microwave.n.02"}, {"name": "milestone", "instance_count": 5, "def": "stone post at side of a road to show distances", "synonyms": ["milestone", "milepost"], "image_count": 4, "id": 688, "frequency": "r", "synset": "milestone.n.01"}, {"name": "milk", "instance_count": 227, "def": "a white nutritious liquid secreted by mammals and used as food by human beings", "synonyms": ["milk"], "image_count": 107, "id": 689, "frequency": "f", "synset": "milk.n.01"}, {"name": "milk_can", 
"instance_count": 8, "def": "can for transporting milk", "synonyms": ["milk_can"], "image_count": 2, "id": 690, "frequency": "r", "synset": "milk_can.n.01"}, {"name": "milkshake", "instance_count": 1, "def": "frothy drink of milk and flavoring and sometimes fruit or ice cream", "synonyms": ["milkshake"], "image_count": 1, "id": 691, "frequency": "r", "synset": "milkshake.n.01"}, {"name": "minivan", "instance_count": 1046, "def": "a small box-shaped passenger van", "synonyms": ["minivan"], "image_count": 454, "id": 692, "frequency": "f", "synset": "minivan.n.01"}, {"name": "mint_candy", "instance_count": 27, "def": "a candy that is flavored with a mint oil", "synonyms": ["mint_candy"], "image_count": 9, "id": 693, "frequency": "r", "synset": "mint.n.05"}, {"name": "mirror", "instance_count": 3490, "def": "polished surface that forms images by reflecting light", "synonyms": ["mirror"], "image_count": 1901, "id": 694, "frequency": "f", "synset": "mirror.n.01"}, {"name": "mitten", "instance_count": 156, "def": "glove that encases the thumb separately and the other four fingers together", "synonyms": ["mitten"], "image_count": 61, "id": 695, "frequency": "c", "synset": "mitten.n.01"}, {"name": "mixer_(kitchen_tool)", "instance_count": 108, "def": "a kitchen utensil that is used for mixing foods", "synonyms": ["mixer_(kitchen_tool)", "stand_mixer"], "image_count": 91, "id": 696, "frequency": "c", "synset": "mixer.n.04"}, {"name": "money", "instance_count": 122, "def": "the official currency issued by a government or national bank", "synonyms": ["money"], "image_count": 46, "id": 697, "frequency": "c", "synset": "money.n.03"}, {"name": "monitor_(computer_equipment) computer_monitor", "instance_count": 2955, "def": "a computer monitor", "synonyms": ["monitor_(computer_equipment) computer_monitor"], "image_count": 1402, "id": 698, "frequency": "f", "synset": "monitor.n.04"}, {"name": "monkey", "instance_count": 166, "def": "any of various long-tailed primates", "synonyms": ["monkey"], "image_count": 74, "id": 699, "frequency": "c", "synset": "monkey.n.01"}, {"name": "motor", "instance_count": 985, "def": "machine that converts other forms of energy into mechanical energy and so imparts motion", "synonyms": ["motor"], "image_count": 421, "id": 700, "frequency": "f", "synset": "motor.n.01"}, {"name": "motor_scooter", "instance_count": 720, "def": "a wheeled vehicle with small wheels and a low-powered engine", "synonyms": ["motor_scooter", "scooter"], "image_count": 226, "id": 701, "frequency": "f", "synset": "motor_scooter.n.01"}, {"name": "motor_vehicle", "instance_count": 64, "def": "a self-propelled wheeled vehicle that does not run on rails", "synonyms": ["motor_vehicle", "automotive_vehicle"], "image_count": 10, "id": 702, "frequency": "r", "synset": "motor_vehicle.n.01"}, {"name": "motorcycle", "instance_count": 5247, "def": "a motor vehicle with two wheels and a strong frame", "synonyms": ["motorcycle"], "image_count": 1720, "id": 703, "frequency": "f", "synset": "motorcycle.n.01"}, {"name": "mound_(baseball)", "instance_count": 269, "def": "(baseball) the slight elevation on which the pitcher stands", "synonyms": ["mound_(baseball)", "pitcher's_mound"], "image_count": 261, "id": 704, "frequency": "f", "synset": "mound.n.01"}, {"name": "mouse_(computer_equipment)", "instance_count": 1832, "def": "a computer input device that controls an on-screen pointer (does not include trackpads / touchpads)", "synonyms": ["mouse_(computer_equipment)", "computer_mouse"], "image_count": 1337, "id": 705, 
"frequency": "f", "synset": "mouse.n.04"}, {"name": "mousepad", "instance_count": 333, "def": "a small portable pad that provides an operating surface for a computer mouse", "synonyms": ["mousepad"], "image_count": 293, "id": 706, "frequency": "f", "synset": "mousepad.n.01"}, {"name": "muffin", "instance_count": 352, "def": "a sweet quick bread baked in a cup-shaped pan", "synonyms": ["muffin"], "image_count": 62, "id": 707, "frequency": "c", "synset": "muffin.n.01"}, {"name": "mug", "instance_count": 1785, "def": "with handle and usually cylindrical", "synonyms": ["mug"], "image_count": 814, "id": 708, "frequency": "f", "synset": "mug.n.04"}, {"name": "mushroom", "instance_count": 6257, "def": "a common mushroom", "synonyms": ["mushroom"], "image_count": 407, "id": 709, "frequency": "f", "synset": "mushroom.n.02"}, {"name": "music_stool", "instance_count": 6, "def": "a stool for piano players; usually adjustable in height", "synonyms": ["music_stool", "piano_stool"], "image_count": 6, "id": 710, "frequency": "r", "synset": "music_stool.n.01"}, {"name": "musical_instrument", "instance_count": 33, "def": "any of various devices or contrivances that can be used to produce musical tones or sounds", "synonyms": ["musical_instrument", "instrument_(musical)"], "image_count": 16, "id": 711, "frequency": "c", "synset": "musical_instrument.n.01"}, {"name": "nailfile", "instance_count": 10, "def": "a small flat file for shaping the nails", "synonyms": ["nailfile"], "image_count": 7, "id": 712, "frequency": "r", "synset": "nailfile.n.01"}, {"name": "napkin", "instance_count": 3979, "def": "a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing", "synonyms": ["napkin", "table_napkin", "serviette"], "image_count": 1791, "id": 713, "frequency": "f", "synset": "napkin.n.01"}, {"name": "neckerchief", "instance_count": 4, "def": "a kerchief worn around the neck", "synonyms": ["neckerchief"], "image_count": 2, "id": 714, "frequency": "r", "synset": "neckerchief.n.01"}, {"name": "necklace", "instance_count": 2709, "def": "jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament", "synonyms": ["necklace"], "image_count": 1915, "id": 715, "frequency": "f", "synset": "necklace.n.01"}, {"name": "necktie", "instance_count": 4069, "def": "neckwear consisting of a long narrow piece of material worn under a collar and tied in knot at the front", "synonyms": ["necktie", "tie_(necktie)"], "image_count": 1940, "id": 716, "frequency": "f", "synset": "necktie.n.01"}, {"name": "needle", "instance_count": 61, "def": "a sharp pointed implement (usually metal)", "synonyms": ["needle"], "image_count": 13, "id": 717, "frequency": "c", "synset": "needle.n.03"}, {"name": "nest", "instance_count": 20, "def": "a structure in which animals lay eggs or give birth to their young", "synonyms": ["nest"], "image_count": 16, "id": 718, "frequency": "c", "synset": "nest.n.01"}, {"name": "newspaper", "instance_count": 1179, "def": "a daily or weekly publication on folded sheets containing news, articles, and advertisements", "synonyms": ["newspaper", "paper_(newspaper)"], "image_count": 448, "id": 719, "frequency": "f", "synset": "newspaper.n.01"}, {"name": "newsstand", "instance_count": 39, "def": "a stall where newspapers and other periodicals are sold", "synonyms": ["newsstand"], "image_count": 12, "id": 720, "frequency": "c", "synset": "newsstand.n.01"}, {"name": "nightshirt", "instance_count": 35, "def": "garments designed to be worn 
in bed", "synonyms": ["nightshirt", "nightwear", "sleepwear", "nightclothes"], "image_count": 18, "id": 721, "frequency": "c", "synset": "nightwear.n.01"}, {"name": "nosebag_(for_animals)", "instance_count": 4, "def": "a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head", "synonyms": ["nosebag_(for_animals)", "feedbag"], "image_count": 4, "id": 722, "frequency": "r", "synset": "nosebag.n.01"}, {"name": "noseband_(for_animals)", "instance_count": 120, "def": "a strap that is the part of a bridle that goes over the animal's nose", "synonyms": ["noseband_(for_animals)", "nosepiece_(for_animals)"], "image_count": 71, "id": 723, "frequency": "c", "synset": "noseband.n.01"}, {"name": "notebook", "instance_count": 290, "def": "a book with blank pages for recording notes or memoranda", "synonyms": ["notebook"], "image_count": 189, "id": 724, "frequency": "f", "synset": "notebook.n.01"}, {"name": "notepad", "instance_count": 187, "def": "a pad of paper for keeping notes", "synonyms": ["notepad"], "image_count": 74, "id": 725, "frequency": "c", "synset": "notepad.n.01"}, {"name": "nut", "instance_count": 790, "def": "a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt", "synonyms": ["nut"], "image_count": 103, "id": 726, "frequency": "f", "synset": "nut.n.03"}, {"name": "nutcracker", "instance_count": 7, "def": "a hand tool used to crack nuts open", "synonyms": ["nutcracker"], "image_count": 3, "id": 727, "frequency": "r", "synset": "nutcracker.n.01"}, {"name": "oar", "instance_count": 488, "def": "an implement used to propel or steer a boat", "synonyms": ["oar"], "image_count": 110, "id": 728, "frequency": "f", "synset": "oar.n.01"}, {"name": "octopus_(food)", "instance_count": 5, "def": "tentacles of octopus prepared as food", "synonyms": ["octopus_(food)"], "image_count": 5, "id": 729, "frequency": "r", "synset": "octopus.n.01"}, {"name": "octopus_(animal)", "instance_count": 17, "def": "bottom-living cephalopod having a soft oval body with eight long tentacles", "synonyms": ["octopus_(animal)"], "image_count": 9, "id": 730, "frequency": "r", "synset": "octopus.n.02"}, {"name": "oil_lamp", "instance_count": 28, "def": "a lamp that burns oil (as kerosine) for light", "synonyms": ["oil_lamp", "kerosene_lamp", "kerosine_lamp"], "image_count": 15, "id": 731, "frequency": "c", "synset": "oil_lamp.n.01"}, {"name": "olive_oil", "instance_count": 36, "def": "oil from olives", "synonyms": ["olive_oil"], "image_count": 25, "id": 732, "frequency": "c", "synset": "olive_oil.n.01"}, {"name": "omelet", "instance_count": 10, "def": "beaten eggs cooked until just set; may be folded around e.g. 
ham or cheese or jelly", "synonyms": ["omelet", "omelette"], "image_count": 7, "id": 733, "frequency": "r", "synset": "omelet.n.01"}, {"name": "onion", "instance_count": 9779, "def": "the bulb of an onion plant", "synonyms": ["onion"], "image_count": 647, "id": 734, "frequency": "f", "synset": "onion.n.01"}, {"name": "orange_(fruit)", "instance_count": 13034, "def": "orange (FRUIT of an orange tree)", "synonyms": ["orange_(fruit)"], "image_count": 824, "id": 735, "frequency": "f", "synset": "orange.n.01"}, {"name": "orange_juice", "instance_count": 223, "def": "bottled or freshly squeezed juice of oranges", "synonyms": ["orange_juice"], "image_count": 100, "id": 736, "frequency": "c", "synset": "orange_juice.n.01"}, {"name": "ostrich", "instance_count": 71, "def": "fast-running African flightless bird with two-toed feet; largest living bird", "synonyms": ["ostrich"], "image_count": 47, "id": 737, "frequency": "c", "synset": "ostrich.n.02"}, {"name": "ottoman", "instance_count": 157, "def": "a thick standalone cushion used as a seat or footrest, often next to a chair", "synonyms": ["ottoman", "pouf", "pouffe", "hassock"], "image_count": 121, "id": 738, "frequency": "f", "synset": "ottoman.n.03"}, {"name": "oven", "instance_count": 929, "def": "kitchen appliance used for baking or roasting", "synonyms": ["oven"], "image_count": 731, "id": 739, "frequency": "f", "synset": "oven.n.01"}, {"name": "overalls_(clothing)", "instance_count": 76, "def": "work clothing consisting of denim trousers usually with a bib and shoulder straps", "synonyms": ["overalls_(clothing)"], "image_count": 73, "id": 740, "frequency": "c", "synset": "overall.n.01"}, {"name": "owl", "instance_count": 73, "def": "nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes", "synonyms": ["owl"], "image_count": 49, "id": 741, "frequency": "c", "synset": "owl.n.01"}, {"name": "packet", "instance_count": 109, "def": "a small package or bundle", "synonyms": ["packet"], "image_count": 23, "id": 742, "frequency": "c", "synset": "packet.n.03"}, {"name": "inkpad", "instance_count": 12, "def": "absorbent material saturated with ink used to transfer ink evenly to a rubber stamp", "synonyms": ["inkpad", "inking_pad", "stamp_pad"], "image_count": 4, "id": 743, "frequency": "r", "synset": "pad.n.03"}, {"name": "pad", "instance_count": 264, "def": "mostly arm/knee pads labeled", "synonyms": ["pad"], "image_count": 62, "id": 744, "frequency": "c", "synset": "pad.n.04"}, {"name": "paddle", "instance_count": 306, "def": "a short light oar used without an oarlock to propel a canoe or small boat", "synonyms": ["paddle", "boat_paddle"], "image_count": 118, "id": 745, "frequency": "f", "synset": "paddle.n.04"}, {"name": "padlock", "instance_count": 184, "def": "a detachable, portable lock", "synonyms": ["padlock"], "image_count": 99, "id": 746, "frequency": "c", "synset": "padlock.n.01"}, {"name": "paintbrush", "instance_count": 91, "def": "a brush used as an applicator to apply paint", "synonyms": ["paintbrush"], "image_count": 40, "id": 747, "frequency": "c", "synset": "paintbrush.n.01"}, {"name": "painting", "instance_count": 2645, "def": "graphic art consisting of an artistic composition made by applying paints to a surface", "synonyms": ["painting"], "image_count": 1036, "id": 748, "frequency": "f", "synset": "painting.n.01"}, {"name": "pajamas", "instance_count": 163, "def": "loose-fitting nightclothes worn for sleeping or lounging", "synonyms": ["pajamas", "pyjamas"], "image_count": 105, "id": 749, 
"frequency": "f", "synset": "pajama.n.02"}, {"name": "palette", "instance_count": 68, "def": "board that provides a flat surface on which artists mix paints and the range of colors used", "synonyms": ["palette", "pallet"], "image_count": 21, "id": 750, "frequency": "c", "synset": "palette.n.02"}, {"name": "pan_(for_cooking)", "instance_count": 643, "def": "cooking utensil consisting of a wide metal vessel", "synonyms": ["pan_(for_cooking)", "cooking_pan"], "image_count": 229, "id": 751, "frequency": "f", "synset": "pan.n.01"}, {"name": "pan_(metal_container)", "instance_count": 21, "def": "shallow container made of metal", "synonyms": ["pan_(metal_container)"], "image_count": 7, "id": 752, "frequency": "r", "synset": "pan.n.03"}, {"name": "pancake", "instance_count": 295, "def": "a flat cake of thin batter fried on both sides on a griddle", "synonyms": ["pancake"], "image_count": 72, "id": 753, "frequency": "c", "synset": "pancake.n.01"}, {"name": "pantyhose", "instance_count": 11, "def": "a woman's tights consisting of underpants and stockings", "synonyms": ["pantyhose"], "image_count": 9, "id": 754, "frequency": "r", "synset": "pantyhose.n.01"}, {"name": "papaya", "instance_count": 206, "def": "large oval melon-like tropical fruit with yellowish flesh", "synonyms": ["papaya"], "image_count": 10, "id": 755, "frequency": "r", "synset": "papaya.n.02"}, {"name": "paper_plate", "instance_count": 957, "def": "a disposable plate made of cardboard", "synonyms": ["paper_plate"], "image_count": 328, "id": 756, "frequency": "f", "synset": "paper_plate.n.01"}, {"name": "paper_towel", "instance_count": 600, "def": "a disposable towel made of absorbent paper", "synonyms": ["paper_towel"], "image_count": 468, "id": 757, "frequency": "f", "synset": "paper_towel.n.01"}, {"name": "paperback_book", "instance_count": 3, "def": "a book with paper covers", "synonyms": ["paperback_book", "paper-back_book", "softback_book", "soft-cover_book"], "image_count": 1, "id": 758, "frequency": "r", "synset": "paperback_book.n.01"}, {"name": "paperweight", "instance_count": 4, "def": "a weight used to hold down a stack of papers", "synonyms": ["paperweight"], "image_count": 2, "id": 759, "frequency": "r", "synset": "paperweight.n.01"}, {"name": "parachute", "instance_count": 61, "def": "rescue equipment consisting of a device that fills with air and retards your fall", "synonyms": ["parachute"], "image_count": 24, "id": 760, "frequency": "c", "synset": "parachute.n.01"}, {"name": "parakeet", "instance_count": 46, "def": "any of numerous small slender long-tailed parrots", "synonyms": ["parakeet", "parrakeet", "parroket", "paraquet", "paroquet", "parroquet"], "image_count": 11, "id": 761, "frequency": "c", "synset": "parakeet.n.01"}, {"name": "parasail_(sports)", "instance_count": 385, "def": "parachute that will lift a person up into the air when it is towed by a motorboat or a car", "synonyms": ["parasail_(sports)"], "image_count": 72, "id": 762, "frequency": "c", "synset": "parasail.n.01"}, {"name": "parasol", "instance_count": 45, "def": "a handheld collapsible source of shade", "synonyms": ["parasol", "sunshade"], "image_count": 17, "id": 763, "frequency": "c", "synset": "parasol.n.01"}, {"name": "parchment", "instance_count": 17, "def": "a superior paper resembling sheepskin", "synonyms": ["parchment"], "image_count": 10, "id": 764, "frequency": "r", "synset": "parchment.n.01"}, {"name": "parka", "instance_count": 89, "def": "a kind of heavy jacket (`windcheater' is a British term)", "synonyms": ["parka", "anorak"], 
"image_count": 17, "id": 765, "frequency": "c", "synset": "parka.n.01"}, {"name": "parking_meter", "instance_count": 1075, "def": "a coin-operated timer located next to a parking space", "synonyms": ["parking_meter"], "image_count": 489, "id": 766, "frequency": "f", "synset": "parking_meter.n.01"}, {"name": "parrot", "instance_count": 76, "def": "usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds", "synonyms": ["parrot"], "image_count": 47, "id": 767, "frequency": "c", "synset": "parrot.n.01"}, {"name": "passenger_car_(part_of_a_train)", "instance_count": 465, "def": "a railcar where passengers ride", "synonyms": ["passenger_car_(part_of_a_train)", "coach_(part_of_a_train)"], "image_count": 93, "id": 768, "frequency": "c", "synset": "passenger_car.n.01"}, {"name": "passenger_ship", "instance_count": 1, "def": "a ship built to carry passengers", "synonyms": ["passenger_ship"], "image_count": 1, "id": 769, "frequency": "r", "synset": "passenger_ship.n.01"}, {"name": "passport", "instance_count": 12, "def": "a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home country", "synonyms": ["passport"], "image_count": 12, "id": 770, "frequency": "c", "synset": "passport.n.02"}, {"name": "pastry", "instance_count": 4972, "def": "any of various baked foods made of dough or batter", "synonyms": ["pastry"], "image_count": 228, "id": 771, "frequency": "f", "synset": "pastry.n.02"}, {"name": "patty_(food)", "instance_count": 20, "def": "small flat mass of chopped food", "synonyms": ["patty_(food)"], "image_count": 5, "id": 772, "frequency": "r", "synset": "patty.n.01"}, {"name": "pea_(food)", "instance_count": 1869, "def": "seed of a pea plant used for food", "synonyms": ["pea_(food)"], "image_count": 76, "id": 773, "frequency": "c", "synset": "pea.n.01"}, {"name": "peach", "instance_count": 1041, "def": "downy juicy fruit with sweet yellowish or whitish flesh", "synonyms": ["peach"], "image_count": 71, "id": 774, "frequency": "c", "synset": "peach.n.03"}, {"name": "peanut_butter", "instance_count": 50, "def": "a spread made from ground peanuts", "synonyms": ["peanut_butter"], "image_count": 30, "id": 775, "frequency": "c", "synset": "peanut_butter.n.01"}, {"name": "pear", "instance_count": 1069, "def": "sweet juicy gritty-textured fruit available in many varieties", "synonyms": ["pear"], "image_count": 109, "id": 776, "frequency": "f", "synset": "pear.n.01"}, {"name": "peeler_(tool_for_fruit_and_vegetables)", "instance_count": 18, "def": "a device for peeling vegetables or fruits", "synonyms": ["peeler_(tool_for_fruit_and_vegetables)"], "image_count": 14, "id": 777, "frequency": "c", "synset": "peeler.n.03"}, {"name": "wooden_leg", "instance_count": 1, "def": "a prosthesis that replaces a missing leg", "synonyms": ["wooden_leg", "pegleg"], "image_count": 1, "id": 778, "frequency": "r", "synset": "peg.n.04"}, {"name": "pegboard", "instance_count": 9, "def": "a board perforated with regularly spaced holes into which pegs can be fitted", "synonyms": ["pegboard"], "image_count": 8, "id": 779, "frequency": "r", "synset": "pegboard.n.01"}, {"name": "pelican", "instance_count": 76, "def": "large long-winged warm-water seabird having a large bill with a distensible pouch for fish", "synonyms": ["pelican"], "image_count": 26, "id": 780, "frequency": "c", "synset": "pelican.n.01"}, {"name": "pen", "instance_count": 987, "def": "a writing implement with a point from which ink flows", "synonyms": ["pen"], "image_count": 
339, "id": 781, "frequency": "f", "synset": "pen.n.01"}, {"name": "pencil", "instance_count": 543, "def": "a thin cylindrical pointed writing implement made of wood and graphite", "synonyms": ["pencil"], "image_count": 153, "id": 782, "frequency": "f", "synset": "pencil.n.01"}, {"name": "pencil_box", "instance_count": 2, "def": "a box for holding pencils", "synonyms": ["pencil_box", "pencil_case"], "image_count": 2, "id": 783, "frequency": "r", "synset": "pencil_box.n.01"}, {"name": "pencil_sharpener", "instance_count": 4, "def": "a rotary implement for sharpening the point on pencils", "synonyms": ["pencil_sharpener"], "image_count": 3, "id": 784, "frequency": "r", "synset": "pencil_sharpener.n.01"}, {"name": "pendulum", "instance_count": 18, "def": "an apparatus consisting of an object mounted so that it swings freely under the influence of gravity", "synonyms": ["pendulum"], "image_count": 8, "id": 785, "frequency": "r", "synset": "pendulum.n.01"}, {"name": "penguin", "instance_count": 229, "def": "short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers", "synonyms": ["penguin"], "image_count": 47, "id": 786, "frequency": "c", "synset": "penguin.n.01"}, {"name": "pennant", "instance_count": 235, "def": "a flag longer than it is wide (and often tapering)", "synonyms": ["pennant"], "image_count": 8, "id": 787, "frequency": "r", "synset": "pennant.n.02"}, {"name": "penny_(coin)", "instance_count": 15, "def": "a coin worth one-hundredth of the value of the basic unit", "synonyms": ["penny_(coin)"], "image_count": 6, "id": 788, "frequency": "r", "synset": "penny.n.02"}, {"name": "pepper", "instance_count": 697, "def": "pungent seasoning from the berry of the common pepper plant; whole or ground", "synonyms": ["pepper", "peppercorn"], "image_count": 116, "id": 789, "frequency": "f", "synset": "pepper.n.03"}, {"name": "pepper_mill", "instance_count": 91, "def": "a mill for grinding pepper", "synonyms": ["pepper_mill", "pepper_grinder"], "image_count": 69, "id": 790, "frequency": "c", "synset": "pepper_mill.n.01"}, {"name": "perfume", "instance_count": 28, "def": "a toiletry that emits and diffuses a fragrant odor", "synonyms": ["perfume"], "image_count": 13, "id": 791, "frequency": "c", "synset": "perfume.n.02"}, {"name": "persimmon", "instance_count": 22, "def": "orange fruit resembling a plum; edible when fully ripe", "synonyms": ["persimmon"], "image_count": 6, "id": 792, "frequency": "r", "synset": "persimmon.n.02"}, {"name": "person", "instance_count": 13439, "def": "a human being", "synonyms": ["person", "baby", "child", "boy", "girl", "man", "woman", "human"], "image_count": 1928, "id": 793, "frequency": "f", "synset": "person.n.01"}, {"name": "pet", "instance_count": 103, "def": "a domesticated animal kept for companionship or amusement", "synonyms": ["pet"], "image_count": 79, "id": 794, "frequency": "c", "synset": "pet.n.01"}, {"name": "pew_(church_bench)", "instance_count": 194, "def": "long bench with backs; used in church by the congregation", "synonyms": ["pew_(church_bench)", "church_bench"], "image_count": 14, "id": 795, "frequency": "c", "synset": "pew.n.01"}, {"name": "phonebook", "instance_count": 24, "def": "a directory containing an alphabetical list of telephone subscribers and their telephone numbers", "synonyms": ["phonebook", "telephone_book", "telephone_directory"], "image_count": 7, "id": 796, "frequency": "r", "synset": "phonebook.n.01"}, {"name": "phonograph_record", "instance_count": 138, "def": "sound recording 
consisting of a typically black disk with a continuous groove", "synonyms": ["phonograph_record", "phonograph_recording", "record_(phonograph_recording)"], "image_count": 20, "id": 797, "frequency": "c", "synset": "phonograph_record.n.01"}, {"name": "piano", "instance_count": 126, "def": "a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds", "synonyms": ["piano"], "image_count": 114, "id": 798, "frequency": "f", "synset": "piano.n.01"}, {"name": "pickle", "instance_count": 632, "def": "vegetables (especially cucumbers) preserved in brine or vinegar", "synonyms": ["pickle"], "image_count": 221, "id": 799, "frequency": "f", "synset": "pickle.n.01"}, {"name": "pickup_truck", "instance_count": 838, "def": "a light truck with an open body and low sides and a tailboard", "synonyms": ["pickup_truck"], "image_count": 502, "id": 800, "frequency": "f", "synset": "pickup.n.01"}, {"name": "pie", "instance_count": 228, "def": "dish baked in pastry-lined pan often with a pastry top", "synonyms": ["pie"], "image_count": 62, "id": 801, "frequency": "c", "synset": "pie.n.01"}, {"name": "pigeon", "instance_count": 1850, "def": "wild and domesticated birds having a heavy body and short legs", "synonyms": ["pigeon"], "image_count": 87, "id": 802, "frequency": "c", "synset": "pigeon.n.01"}, {"name": "piggy_bank", "instance_count": 5, "def": "a child's coin bank (often shaped like a pig)", "synonyms": ["piggy_bank", "penny_bank"], "image_count": 4, "id": 803, "frequency": "r", "synset": "piggy_bank.n.01"}, {"name": "pillow", "instance_count": 6115, "def": "a cushion to support the head of a sleeping person", "synonyms": ["pillow"], "image_count": 1912, "id": 804, "frequency": "f", "synset": "pillow.n.01"}, {"name": "pin_(non_jewelry)", "instance_count": 112, "def": "a small slender (often pointed) piece of wood or metal used to support or fasten or attach things", "synonyms": ["pin_(non_jewelry)"], "image_count": 7, "id": 805, "frequency": "r", "synset": "pin.n.09"}, {"name": "pineapple", "instance_count": 1636, "def": "large sweet fleshy tropical fruit with a tuft of stiff leaves", "synonyms": ["pineapple"], "image_count": 186, "id": 806, "frequency": "f", "synset": "pineapple.n.02"}, {"name": "pinecone", "instance_count": 141, "def": "the seed-producing cone of a pine tree", "synonyms": ["pinecone"], "image_count": 18, "id": 807, "frequency": "c", "synset": "pinecone.n.01"}, {"name": "ping-pong_ball", "instance_count": 4, "def": "light hollow ball used in playing table tennis", "synonyms": ["ping-pong_ball"], "image_count": 4, "id": 808, "frequency": "r", "synset": "ping-pong_ball.n.01"}, {"name": "pinwheel", "instance_count": 172, "def": "a toy consisting of vanes of colored paper or plastic that is pinned to a stick and spins when it is pointed into the wind", "synonyms": ["pinwheel"], "image_count": 3, "id": 809, "frequency": "r", "synset": "pinwheel.n.03"}, {"name": "tobacco_pipe", "instance_count": 7, "def": "a tube with a small bowl at one end; used for smoking tobacco", "synonyms": ["tobacco_pipe"], "image_count": 7, "id": 810, "frequency": "r", "synset": "pipe.n.01"}, {"name": "pipe", "instance_count": 4762, "def": "a long tube made of metal or plastic that is used to carry water or oil or gas etc.", "synonyms": ["pipe", "piping"], "image_count": 1413, "id": 811, "frequency": "f", "synset": "pipe.n.02"}, {"name": "pistol", "instance_count": 9, "def": "a firearm that is held and fired with one hand", "synonyms": ["pistol", "handgun"], 
"image_count": 7, "id": 812, "frequency": "r", "synset": "pistol.n.01"}, {"name": "pita_(bread)", "instance_count": 28, "def": "usually small round bread that can open into a pocket for filling", "synonyms": ["pita_(bread)", "pocket_bread"], "image_count": 12, "id": 813, "frequency": "c", "synset": "pita.n.01"}, {"name": "pitcher_(vessel_for_liquid)", "instance_count": 488, "def": "an open vessel with a handle and a spout for pouring", "synonyms": ["pitcher_(vessel_for_liquid)", "ewer"], "image_count": 248, "id": 814, "frequency": "f", "synset": "pitcher.n.02"}, {"name": "pitchfork", "instance_count": 4, "def": "a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay", "synonyms": ["pitchfork"], "image_count": 4, "id": 815, "frequency": "r", "synset": "pitchfork.n.01"}, {"name": "pizza", "instance_count": 4103, "def": "Italian open pie made of thin bread dough spread with a spiced mixture of e.g. tomato sauce and cheese", "synonyms": ["pizza"], "image_count": 1881, "id": 816, "frequency": "f", "synset": "pizza.n.01"}, {"name": "place_mat", "instance_count": 1123, "def": "a mat placed on a table for an individual place setting", "synonyms": ["place_mat"], "image_count": 529, "id": 817, "frequency": "f", "synset": "place_mat.n.01"}, {"name": "plate", "instance_count": 5214, "def": "dish on which food is served or from which food is eaten", "synonyms": ["plate"], "image_count": 1932, "id": 818, "frequency": "f", "synset": "plate.n.04"}, {"name": "platter", "instance_count": 148, "def": "a large shallow dish used for serving food", "synonyms": ["platter"], "image_count": 50, "id": 819, "frequency": "c", "synset": "platter.n.01"}, {"name": "playpen", "instance_count": 3, "def": "a portable enclosure in which babies may be left to play", "synonyms": ["playpen"], "image_count": 3, "id": 820, "frequency": "r", "synset": "playpen.n.01"}, {"name": "pliers", "instance_count": 49, "def": "a gripping hand tool with two hinged arms and (usually) serrated jaws", "synonyms": ["pliers", "plyers"], "image_count": 28, "id": 821, "frequency": "c", "synset": "pliers.n.01"}, {"name": "plow_(farm_equipment)", "instance_count": 12, "def": "a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing", "synonyms": ["plow_(farm_equipment)", "plough_(farm_equipment)"], "image_count": 10, "id": 822, "frequency": "r", "synset": "plow.n.01"}, {"name": "plume", "instance_count": 11, "def": "a feather or cluster of feathers worn as an ornament", "synonyms": ["plume"], "image_count": 5, "id": 823, "frequency": "r", "synset": "plume.n.02"}, {"name": "pocket_watch", "instance_count": 20, "def": "a watch that is carried in a small watch pocket", "synonyms": ["pocket_watch"], "image_count": 5, "id": 824, "frequency": "r", "synset": "pocket_watch.n.01"}, {"name": "pocketknife", "instance_count": 21, "def": "a knife with a blade that folds into the handle; suitable for carrying in the pocket", "synonyms": ["pocketknife"], "image_count": 18, "id": 825, "frequency": "c", "synset": "pocketknife.n.01"}, {"name": "poker_(fire_stirring_tool)", "instance_count": 34, "def": "fire iron consisting of a metal rod with a handle; used to stir a fire", "synonyms": ["poker_(fire_stirring_tool)", "stove_poker", "fire_hook"], "image_count": 14, "id": 826, "frequency": "c", "synset": "poker.n.01"}, {"name": "pole", "instance_count": 14276, "def": "a long (usually round) rod of wood or metal or plastic", "synonyms": ["pole", "post"], "image_count": 1890, "id": 827, "frequency": "f", 
"synset": "pole.n.01"}, {"name": "polo_shirt", "instance_count": 1695, "def": "a shirt with short sleeves designed for comfort and casual wear", "synonyms": ["polo_shirt", "sport_shirt"], "image_count": 660, "id": 828, "frequency": "f", "synset": "polo_shirt.n.01"}, {"name": "poncho", "instance_count": 14, "def": "a blanket-like cloak with a hole in the center for the head", "synonyms": ["poncho"], "image_count": 8, "id": 829, "frequency": "r", "synset": "poncho.n.01"}, {"name": "pony", "instance_count": 57, "def": "any of various breeds of small gentle horses usually less than five feet high at the shoulder", "synonyms": ["pony"], "image_count": 25, "id": 830, "frequency": "c", "synset": "pony.n.05"}, {"name": "pool_table", "instance_count": 10, "def": "game equipment consisting of a heavy table on which pool is played", "synonyms": ["pool_table", "billiard_table", "snooker_table"], "image_count": 10, "id": 831, "frequency": "r", "synset": "pool_table.n.01"}, {"name": "pop_(soda)", "instance_count": 951, "def": "a sweet drink containing carbonated water and flavoring", "synonyms": ["pop_(soda)", "soda_(pop)", "tonic", "soft_drink"], "image_count": 218, "id": 832, "frequency": "f", "synset": "pop.n.02"}, {"name": "postbox_(public)", "instance_count": 57, "def": "public box for deposit of mail", "synonyms": ["postbox_(public)", "mailbox_(public)"], "image_count": 36, "id": 833, "frequency": "c", "synset": "postbox.n.01"}, {"name": "postcard", "instance_count": 276, "def": "a card for sending messages by post without an envelope", "synonyms": ["postcard", "postal_card", "mailing-card"], "image_count": 16, "id": 834, "frequency": "c", "synset": "postcard.n.01"}, {"name": "poster", "instance_count": 3378, "def": "a sign posted in a public place as an advertisement", "synonyms": ["poster", "placard"], "image_count": 808, "id": 835, "frequency": "f", "synset": "poster.n.01"}, {"name": "pot", "instance_count": 1719, "def": "metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid", "synonyms": ["pot"], "image_count": 479, "id": 836, "frequency": "f", "synset": "pot.n.01"}, {"name": "flowerpot", "instance_count": 3902, "def": "a container in which plants are cultivated", "synonyms": ["flowerpot"], "image_count": 1404, "id": 837, "frequency": "f", "synset": "pot.n.04"}, {"name": "potato", "instance_count": 4393, "def": "an edible tuber native to South America", "synonyms": ["potato"], "image_count": 307, "id": 838, "frequency": "f", "synset": "potato.n.01"}, {"name": "potholder", "instance_count": 112, "def": "an insulated pad for holding hot pots", "synonyms": ["potholder"], "image_count": 57, "id": 839, "frequency": "c", "synset": "potholder.n.01"}, {"name": "pottery", "instance_count": 272, "def": "ceramic ware made from clay and baked in a kiln", "synonyms": ["pottery", "clayware"], "image_count": 28, "id": 840, "frequency": "c", "synset": "pottery.n.01"}, {"name": "pouch", "instance_count": 131, "def": "a small or medium size container for holding or carrying things", "synonyms": ["pouch"], "image_count": 80, "id": 841, "frequency": "c", "synset": "pouch.n.01"}, {"name": "power_shovel", "instance_count": 16, "def": "a machine for excavating", "synonyms": ["power_shovel", "excavator", "digger"], "image_count": 11, "id": 842, "frequency": "c", "synset": "power_shovel.n.01"}, {"name": "prawn", "instance_count": 779, "def": "any of various edible decapod crustaceans", "synonyms": ["prawn", "shrimp"], "image_count": 92, "id": 843, "frequency": "c", "synset": 
"prawn.n.01"}, {"name": "pretzel", "instance_count": 179, "def": "glazed and salted cracker typically in the shape of a loose knot", "synonyms": ["pretzel"], "image_count": 20, "id": 844, "frequency": "c", "synset": "pretzel.n.01"}, {"name": "printer", "instance_count": 217, "def": "a machine that prints", "synonyms": ["printer", "printing_machine"], "image_count": 194, "id": 845, "frequency": "f", "synset": "printer.n.03"}, {"name": "projectile_(weapon)", "instance_count": 64, "def": "a weapon that is forcibly thrown or projected at a targets", "synonyms": ["projectile_(weapon)", "missile"], "image_count": 23, "id": 846, "frequency": "c", "synset": "projectile.n.01"}, {"name": "projector", "instance_count": 54, "def": "an optical instrument that projects an enlarged image onto a screen", "synonyms": ["projector"], "image_count": 52, "id": 847, "frequency": "c", "synset": "projector.n.02"}, {"name": "propeller", "instance_count": 1458, "def": "a mechanical device that rotates to push against air or water", "synonyms": ["propeller", "propellor"], "image_count": 673, "id": 848, "frequency": "f", "synset": "propeller.n.01"}, {"name": "prune", "instance_count": 8, "def": "dried plum", "synonyms": ["prune"], "image_count": 2, "id": 849, "frequency": "r", "synset": "prune.n.01"}, {"name": "pudding", "instance_count": 2, "def": "any of various soft thick unsweetened baked dishes", "synonyms": ["pudding"], "image_count": 2, "id": 850, "frequency": "r", "synset": "pudding.n.01"}, {"name": "puffer_(fish)", "instance_count": 2, "def": "fishes whose elongated spiny body can inflate itself with water or air to form a globe", "synonyms": ["puffer_(fish)", "pufferfish", "blowfish", "globefish"], "image_count": 1, "id": 851, "frequency": "r", "synset": "puffer.n.02"}, {"name": "puffin", "instance_count": 4, "def": "seabirds having short necks and brightly colored compressed bills", "synonyms": ["puffin"], "image_count": 2, "id": 852, "frequency": "r", "synset": "puffin.n.01"}, {"name": "pug-dog", "instance_count": 13, "def": "small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle", "synonyms": ["pug-dog"], "image_count": 8, "id": 853, "frequency": "r", "synset": "pug.n.01"}, {"name": "pumpkin", "instance_count": 1192, "def": "usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn", "synonyms": ["pumpkin"], "image_count": 80, "id": 854, "frequency": "c", "synset": "pumpkin.n.02"}, {"name": "puncher", "instance_count": 6, "def": "a tool for making holes or indentations", "synonyms": ["puncher"], "image_count": 3, "id": 855, "frequency": "r", "synset": "punch.n.03"}, {"name": "puppet", "instance_count": 18, "def": "a small figure of a person operated from above with strings by a puppeteer", "synonyms": ["puppet", "marionette"], "image_count": 3, "id": 856, "frequency": "r", "synset": "puppet.n.01"}, {"name": "puppy", "instance_count": 57, "def": "a young dog", "synonyms": ["puppy"], "image_count": 15, "id": 857, "frequency": "c", "synset": "puppy.n.01"}, {"name": "quesadilla", "instance_count": 6, "def": "a tortilla that is filled with cheese and heated", "synonyms": ["quesadilla"], "image_count": 2, "id": 858, "frequency": "r", "synset": "quesadilla.n.01"}, {"name": "quiche", "instance_count": 33, "def": "a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)", "synonyms": ["quiche"], "image_count": 10, "id": 859, "frequency": "r", 
"synset": "quiche.n.02"}, {"name": "quilt", "instance_count": 513, "def": "bedding made of two layers of cloth filled with stuffing and stitched together", "synonyms": ["quilt", "comforter"], "image_count": 386, "id": 860, "frequency": "f", "synset": "quilt.n.01"}, {"name": "rabbit", "instance_count": 139, "def": "any of various burrowing animals of the family Leporidae having long ears and short tails", "synonyms": ["rabbit"], "image_count": 65, "id": 861, "frequency": "c", "synset": "rabbit.n.01"}, {"name": "race_car", "instance_count": 6, "def": "a fast car that competes in races", "synonyms": ["race_car", "racing_car"], "image_count": 3, "id": 862, "frequency": "r", "synset": "racer.n.02"}, {"name": "racket", "instance_count": 64, "def": "a sports implement used to strike a ball in various games", "synonyms": ["racket", "racquet"], "image_count": 35, "id": 863, "frequency": "c", "synset": "racket.n.04"}, {"name": "radar", "instance_count": 13, "def": "measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects", "synonyms": ["radar"], "image_count": 5, "id": 864, "frequency": "r", "synset": "radar.n.01"}, {"name": "radiator", "instance_count": 195, "def": "a mechanism consisting of a metal honeycomb through which hot fluids circulate", "synonyms": ["radiator"], "image_count": 180, "id": 865, "frequency": "f", "synset": "radiator.n.03"}, {"name": "radio_receiver", "instance_count": 123, "def": "an electronic receiver that detects and demodulates and amplifies transmitted radio signals", "synonyms": ["radio_receiver", "radio_set", "radio", "tuner_(radio)"], "image_count": 99, "id": 866, "frequency": "c", "synset": "radio_receiver.n.01"}, {"name": "radish", "instance_count": 519, "def": "pungent edible root of any of various cultivated radish plants", "synonyms": ["radish", "daikon"], "image_count": 49, "id": 867, "frequency": "c", "synset": "radish.n.03"}, {"name": "raft", "instance_count": 66, "def": "a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers", "synonyms": ["raft"], "image_count": 28, "id": 868, "frequency": "c", "synset": "raft.n.01"}, {"name": "rag_doll", "instance_count": 3, "def": "a cloth doll that is stuffed and (usually) painted", "synonyms": ["rag_doll"], "image_count": 1, "id": 869, "frequency": "r", "synset": "rag_doll.n.01"}, {"name": "raincoat", "instance_count": 303, "def": "a water-resistant coat", "synonyms": ["raincoat", "waterproof_jacket"], "image_count": 52, "id": 870, "frequency": "c", "synset": "raincoat.n.01"}, {"name": "ram_(animal)", "instance_count": 132, "def": "uncastrated adult male sheep", "synonyms": ["ram_(animal)"], "image_count": 36, "id": 871, "frequency": "c", "synset": "ram.n.05"}, {"name": "raspberry", "instance_count": 778, "def": "red or black edible aggregate berries usually smaller than the related blackberries", "synonyms": ["raspberry"], "image_count": 70, "id": 872, "frequency": "c", "synset": "raspberry.n.02"}, {"name": "rat", "instance_count": 6, "def": "any of various long-tailed rodents similar to but larger than a mouse", "synonyms": ["rat"], "image_count": 6, "id": 873, "frequency": "r", "synset": "rat.n.01"}, {"name": "razorblade", "instance_count": 35, "def": "a blade that has very sharp edge", "synonyms": ["razorblade"], "image_count": 29, "id": 874, "frequency": "c", "synset": "razorblade.n.01"}, {"name": "reamer_(juicer)", "instance_count": 26, "def": "a squeezer with a conical ridged center that is used for 
squeezing juice from citrus fruit", "synonyms": ["reamer_(juicer)", "juicer", "juice_reamer"], "image_count": 24, "id": 875, "frequency": "c", "synset": "reamer.n.01"}, {"name": "rearview_mirror", "instance_count": 3650, "def": "vehicle mirror (side or rearview)", "synonyms": ["rearview_mirror"], "image_count": 1115, "id": 876, "frequency": "f", "synset": "rearview_mirror.n.01"}, {"name": "receipt", "instance_count": 89, "def": "an acknowledgment (usually tangible) that payment has been made", "synonyms": ["receipt"], "image_count": 61, "id": 877, "frequency": "c", "synset": "receipt.n.02"}, {"name": "recliner", "instance_count": 28, "def": "an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it", "synonyms": ["recliner", "reclining_chair", "lounger_(chair)"], "image_count": 18, "id": 878, "frequency": "c", "synset": "recliner.n.01"}, {"name": "record_player", "instance_count": 22, "def": "machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically", "synonyms": ["record_player", "phonograph_(record_player)", "turntable"], "image_count": 18, "id": 879, "frequency": "c", "synset": "record_player.n.01"}, {"name": "reflector", "instance_count": 3426, "def": "device that reflects light, radiation, etc.", "synonyms": ["reflector"], "image_count": 665, "id": 880, "frequency": "f", "synset": "reflector.n.01"}, {"name": "remote_control", "instance_count": 2467, "def": "a device that can be used to control a machine or apparatus from a distance", "synonyms": ["remote_control"], "image_count": 1096, "id": 881, "frequency": "f", "synset": "remote_control.n.01"}, {"name": "rhinoceros", "instance_count": 50, "def": "massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the snout", "synonyms": ["rhinoceros"], "image_count": 29, "id": 882, "frequency": "c", "synset": "rhinoceros.n.01"}, {"name": "rib_(food)", "instance_count": 32, "def": "cut of meat including one or more ribs", "synonyms": ["rib_(food)"], "image_count": 8, "id": 883, "frequency": "r", "synset": "rib.n.03"}, {"name": "rifle", "instance_count": 37, "def": "a shoulder firearm with a long barrel", "synonyms": ["rifle"], "image_count": 14, "id": 884, "frequency": "c", "synset": "rifle.n.01"}, {"name": "ring", "instance_count": 2314, "def": "jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger", "synonyms": ["ring"], "image_count": 1622, "id": 885, "frequency": "f", "synset": "ring.n.08"}, {"name": "river_boat", "instance_count": 3, "def": "a boat used on rivers or to ply a river", "synonyms": ["river_boat"], "image_count": 2, "id": 886, "frequency": "r", "synset": "river_boat.n.01"}, {"name": "road_map", "instance_count": 3, "def": "(NOT A ROAD) a MAP showing roads (for automobile travel)", "synonyms": ["road_map"], "image_count": 3, "id": 887, "frequency": "r", "synset": "road_map.n.02"}, {"name": "robe", "instance_count": 77, "def": "any loose flowing garment", "synonyms": ["robe"], "image_count": 32, "id": 888, "frequency": "c", "synset": "robe.n.01"}, {"name": "rocking_chair", "instance_count": 70, "def": "a chair mounted on rockers", "synonyms": ["rocking_chair"], "image_count": 55, "id": 889, "frequency": "c", "synset": "rocking_chair.n.01"}, {"name": "rodent", "instance_count": 2, "def": "relatively small placental mammals having a single pair of constantly growing incisor teeth specialized for gnawing", "synonyms": 
["rodent"], "image_count": 1, "id": 890, "frequency": "r", "synset": "rodent.n.01"}, {"name": "roller_skate", "instance_count": 35, "def": "a shoe with pairs of rollers (small hard wheels) fixed to the sole", "synonyms": ["roller_skate"], "image_count": 10, "id": 891, "frequency": "r", "synset": "roller_skate.n.01"}, {"name": "Rollerblade", "instance_count": 31, "def": "an in-line variant of a roller skate", "synonyms": ["Rollerblade"], "image_count": 10, "id": 892, "frequency": "r", "synset": "rollerblade.n.01"}, {"name": "rolling_pin", "instance_count": 52, "def": "utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough", "synonyms": ["rolling_pin"], "image_count": 47, "id": 893, "frequency": "c", "synset": "rolling_pin.n.01"}, {"name": "root_beer", "instance_count": 3, "def": "carbonated drink containing extracts of roots and herbs", "synonyms": ["root_beer"], "image_count": 3, "id": 894, "frequency": "r", "synset": "root_beer.n.01"}, {"name": "router_(computer_equipment)", "instance_count": 41, "def": "a device that forwards data packets between computer networks", "synonyms": ["router_(computer_equipment)"], "image_count": 29, "id": 895, "frequency": "c", "synset": "router.n.02"}, {"name": "rubber_band", "instance_count": 574, "def": "a narrow band of elastic rubber used to hold things (such as papers) together", "synonyms": ["rubber_band", "elastic_band"], "image_count": 342, "id": 896, "frequency": "f", "synset": "rubber_band.n.01"}, {"name": "runner_(carpet)", "instance_count": 32, "def": "a long narrow carpet", "synonyms": ["runner_(carpet)"], "image_count": 25, "id": 897, "frequency": "c", "synset": "runner.n.08"}, {"name": "plastic_bag", "instance_count": 3631, "def": "a bag made of paper or plastic for holding customer's purchases", "synonyms": ["plastic_bag", "paper_bag"], "image_count": 1469, "id": 898, "frequency": "f", "synset": "sack.n.01"}, {"name": "saddle_(on_an_animal)", "instance_count": 955, "def": "a seat for the rider of a horse or camel", "synonyms": ["saddle_(on_an_animal)"], "image_count": 521, "id": 899, "frequency": "f", "synset": "saddle.n.01"}, {"name": "saddle_blanket", "instance_count": 648, "def": "stable gear consisting of a blanket placed under the saddle", "synonyms": ["saddle_blanket", "saddlecloth", "horse_blanket"], "image_count": 347, "id": 900, "frequency": "f", "synset": "saddle_blanket.n.01"}, {"name": "saddlebag", "instance_count": 56, "def": "a large bag (or pair of bags) hung over a saddle", "synonyms": ["saddlebag"], "image_count": 35, "id": 901, "frequency": "c", "synset": "saddlebag.n.01"}, {"name": "safety_pin", "instance_count": 15, "def": "a pin in the form of a clasp; has a guard so the point of the pin will not stick the user", "synonyms": ["safety_pin"], "image_count": 7, "id": 902, "frequency": "r", "synset": "safety_pin.n.01"}, {"name": "sail", "instance_count": 863, "def": "a large piece of fabric by means of which wind is used to propel a sailing vessel", "synonyms": ["sail"], "image_count": 207, "id": 903, "frequency": "f", "synset": "sail.n.01"}, {"name": "salad", "instance_count": 171, "def": "food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens", "synonyms": ["salad"], "image_count": 108, "id": 904, "frequency": "f", "synset": "salad.n.01"}, {"name": "salad_plate", "instance_count": 6, "def": "a plate or bowl for individual servings of salad", "synonyms": ["salad_plate", "salad_bowl"], "image_count": 2, "id": 
905, "frequency": "r", "synset": "salad_plate.n.01"}, {"name": "salami", "instance_count": 290, "def": "highly seasoned fatty sausage of pork and beef usually dried", "synonyms": ["salami"], "image_count": 34, "id": 906, "frequency": "c", "synset": "salami.n.01"}, {"name": "salmon_(fish)", "instance_count": 27, "def": "any of various large food and game fishes of northern waters", "synonyms": ["salmon_(fish)"], "image_count": 12, "id": 907, "frequency": "c", "synset": "salmon.n.01"}, {"name": "salmon_(food)", "instance_count": 14, "def": "flesh of any of various marine or freshwater fish of the family Salmonidae", "synonyms": ["salmon_(food)"], "image_count": 10, "id": 908, "frequency": "r", "synset": "salmon.n.03"}, {"name": "salsa", "instance_count": 22, "def": "spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods", "synonyms": ["salsa"], "image_count": 13, "id": 909, "frequency": "c", "synset": "salsa.n.01"}, {"name": "saltshaker", "instance_count": 543, "def": "a shaker with a perforated top for sprinkling salt", "synonyms": ["saltshaker"], "image_count": 361, "id": 910, "frequency": "f", "synset": "saltshaker.n.01"}, {"name": "sandal_(type_of_shoe)", "instance_count": 3145, "def": "a shoe consisting of a sole fastened by straps to the foot", "synonyms": ["sandal_(type_of_shoe)"], "image_count": 1023, "id": 911, "frequency": "f", "synset": "sandal.n.01"}, {"name": "sandwich", "instance_count": 2315, "def": "two (or more) slices of bread with a filling between them", "synonyms": ["sandwich"], "image_count": 782, "id": 912, "frequency": "f", "synset": "sandwich.n.01"}, {"name": "satchel", "instance_count": 3, "def": "luggage consisting of a small case with a flat bottom and (usually) a shoulder strap", "synonyms": ["satchel"], "image_count": 2, "id": 913, "frequency": "r", "synset": "satchel.n.01"}, {"name": "saucepan", "instance_count": 26, "def": "a deep pan with a handle; used for stewing or boiling", "synonyms": ["saucepan"], "image_count": 5, "id": 914, "frequency": "r", "synset": "saucepan.n.01"}, {"name": "saucer", "instance_count": 555, "def": "a small shallow dish for holding a cup at the table", "synonyms": ["saucer"], "image_count": 247, "id": 915, "frequency": "f", "synset": "saucer.n.02"}, {"name": "sausage", "instance_count": 2704, "def": "highly seasoned minced meat stuffed in casings", "synonyms": ["sausage"], "image_count": 221, "id": 916, "frequency": "f", "synset": "sausage.n.01"}, {"name": "sawhorse", "instance_count": 5, "def": "a framework for holding wood that is being sawed", "synonyms": ["sawhorse", "sawbuck"], "image_count": 4, "id": 917, "frequency": "r", "synset": "sawhorse.n.01"}, {"name": "saxophone", "instance_count": 13, "def": "a wind instrument with a `J'-shaped form typically made of brass", "synonyms": ["saxophone"], "image_count": 8, "id": 918, "frequency": "r", "synset": "sax.n.02"}, {"name": "scale_(measuring_instrument)", "instance_count": 178, "def": "a measuring instrument for weighing; shows amount of mass", "synonyms": ["scale_(measuring_instrument)"], "image_count": 158, "id": 919, "frequency": "f", "synset": "scale.n.07"}, {"name": "scarecrow", "instance_count": 4, "def": "an effigy in the shape of a man to frighten birds away from seeds", "synonyms": ["scarecrow", "strawman"], "image_count": 3, "id": 920, "frequency": "r", "synset": "scarecrow.n.01"}, {"name": "scarf", "instance_count": 1310, "def": "a garment worn around the head or neck or shoulders for warmth or decoration", "synonyms": ["scarf"], "image_count": 
752, "id": 921, "frequency": "f", "synset": "scarf.n.01"}, {"name": "school_bus", "instance_count": 142, "def": "a bus used to transport children to or from school", "synonyms": ["school_bus"], "image_count": 64, "id": 922, "frequency": "c", "synset": "school_bus.n.01"}, {"name": "scissors", "instance_count": 1376, "def": "a tool having two crossed pivoting blades with looped handles", "synonyms": ["scissors"], "image_count": 707, "id": 923, "frequency": "f", "synset": "scissors.n.01"}, {"name": "scoreboard", "instance_count": 161, "def": "a large board for displaying the score of a contest (and some other information)", "synonyms": ["scoreboard"], "image_count": 143, "id": 924, "frequency": "f", "synset": "scoreboard.n.01"}, {"name": "scraper", "instance_count": 1, "def": "any of various hand tools for scraping", "synonyms": ["scraper"], "image_count": 1, "id": 925, "frequency": "r", "synset": "scraper.n.01"}, {"name": "screwdriver", "instance_count": 88, "def": "a hand tool for driving screws; has a tip that fits into the head of a screw", "synonyms": ["screwdriver"], "image_count": 49, "id": 926, "frequency": "c", "synset": "screwdriver.n.01"}, {"name": "scrubbing_brush", "instance_count": 141, "def": "a brush with short stiff bristles for heavy cleaning", "synonyms": ["scrubbing_brush"], "image_count": 126, "id": 927, "frequency": "f", "synset": "scrub_brush.n.01"}, {"name": "sculpture", "instance_count": 202, "def": "a three-dimensional work of art", "synonyms": ["sculpture"], "image_count": 76, "id": 928, "frequency": "c", "synset": "sculpture.n.01"}, {"name": "seabird", "instance_count": 126, "def": "a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.", "synonyms": ["seabird", "seafowl"], "image_count": 11, "id": 929, "frequency": "c", "synset": "seabird.n.01"}, {"name": "seahorse", "instance_count": 23, "def": "small fish with horse-like heads bent sharply downward and curled tails", "synonyms": ["seahorse"], "image_count": 11, "id": 930, "frequency": "c", "synset": "seahorse.n.02"}, {"name": "seaplane", "instance_count": 4, "def": "an airplane that can land on or take off from water", "synonyms": ["seaplane", "hydroplane"], "image_count": 4, "id": 931, "frequency": "r", "synset": "seaplane.n.01"}, {"name": "seashell", "instance_count": 451, "def": "the shell of a marine organism", "synonyms": ["seashell"], "image_count": 39, "id": 932, "frequency": "c", "synset": "seashell.n.01"}, {"name": "sewing_machine", "instance_count": 11, "def": "a textile machine used as a home appliance for sewing", "synonyms": ["sewing_machine"], "image_count": 11, "id": 933, "frequency": "c", "synset": "sewing_machine.n.01"}, {"name": "shaker", "instance_count": 24, "def": "a container in which something can be shaken", "synonyms": ["shaker"], "image_count": 13, "id": 934, "frequency": "c", "synset": "shaker.n.03"}, {"name": "shampoo", "instance_count": 254, "def": "cleansing agent consisting of soaps or detergents used for washing the hair", "synonyms": ["shampoo"], "image_count": 91, "id": 935, "frequency": "c", "synset": "shampoo.n.01"}, {"name": "shark", "instance_count": 20, "def": "typically large carnivorous fishes with sharpe teeth", "synonyms": ["shark"], "image_count": 14, "id": 936, "frequency": "c", "synset": "shark.n.01"}, {"name": "sharpener", "instance_count": 7, "def": "any implement that is used to make something (an edge or a point) sharper", "synonyms": ["sharpener"], "image_count": 5, "id": 937, "frequency": "r", 
"synset": "sharpener.n.01"}, {"name": "Sharpie", "instance_count": 5, "def": "a pen with indelible ink that will write on any surface", "synonyms": ["Sharpie"], "image_count": 3, "id": 938, "frequency": "r", "synset": "sharpie.n.03"}, {"name": "shaver_(electric)", "instance_count": 12, "def": "a razor powered by an electric motor", "synonyms": ["shaver_(electric)", "electric_shaver", "electric_razor"], "image_count": 10, "id": 939, "frequency": "r", "synset": "shaver.n.03"}, {"name": "shaving_cream", "instance_count": 33, "def": "toiletry consisting that forms a rich lather for softening the beard before shaving", "synonyms": ["shaving_cream", "shaving_soap"], "image_count": 18, "id": 940, "frequency": "c", "synset": "shaving_cream.n.01"}, {"name": "shawl", "instance_count": 9, "def": "cloak consisting of an oblong piece of cloth used to cover the head and shoulders", "synonyms": ["shawl"], "image_count": 9, "id": 941, "frequency": "r", "synset": "shawl.n.01"}, {"name": "shears", "instance_count": 38, "def": "large scissors with strong blades", "synonyms": ["shears"], "image_count": 6, "id": 942, "frequency": "r", "synset": "shears.n.01"}, {"name": "sheep", "instance_count": 13304, "def": "woolly usually horned ruminant mammal related to the goat", "synonyms": ["sheep"], "image_count": 951, "id": 943, "frequency": "f", "synset": "sheep.n.01"}, {"name": "shepherd_dog", "instance_count": 2, "def": "any of various usually long-haired breeds of dog reared to herd and guard sheep", "synonyms": ["shepherd_dog", "sheepdog"], "image_count": 2, "id": 944, "frequency": "r", "synset": "shepherd_dog.n.01"}, {"name": "sherbert", "instance_count": 2, "def": "a frozen dessert made primarily of fruit juice and sugar", "synonyms": ["sherbert", "sherbet"], "image_count": 1, "id": 945, "frequency": "r", "synset": "sherbert.n.01"}, {"name": "shield", "instance_count": 41, "def": "armor carried on the arm to intercept blows", "synonyms": ["shield"], "image_count": 19, "id": 946, "frequency": "c", "synset": "shield.n.02"}, {"name": "shirt", "instance_count": 10177, "def": "a garment worn on the upper half of the body", "synonyms": ["shirt"], "image_count": 1942, "id": 947, "frequency": "f", "synset": "shirt.n.01"}, {"name": "shoe", "instance_count": 9374, "def": "common footwear covering the foot", "synonyms": ["shoe", "sneaker_(type_of_shoe)", "tennis_shoe"], "image_count": 1916, "id": 948, "frequency": "f", "synset": "shoe.n.01"}, {"name": "shopping_bag", "instance_count": 377, "def": "a bag made of plastic or strong paper (often with handles); used to transport goods after shopping", "synonyms": ["shopping_bag"], "image_count": 139, "id": 949, "frequency": "f", "synset": "shopping_bag.n.01"}, {"name": "shopping_cart", "instance_count": 90, "def": "a handcart that holds groceries or other goods while shopping", "synonyms": ["shopping_cart"], "image_count": 43, "id": 950, "frequency": "c", "synset": "shopping_cart.n.01"}, {"name": "short_pants", "instance_count": 5305, "def": "trousers that end at or above the knee", "synonyms": ["short_pants", "shorts_(clothing)", "trunks_(clothing)"], "image_count": 1969, "id": 951, "frequency": "f", "synset": "short_pants.n.01"}, {"name": "shot_glass", "instance_count": 24, "def": "a small glass adequate to hold a single swallow of whiskey", "synonyms": ["shot_glass"], "image_count": 5, "id": 952, "frequency": "r", "synset": "shot_glass.n.01"}, {"name": "shoulder_bag", "instance_count": 331, "def": "a large handbag that can be carried by a strap looped over the shoulder", 
"synonyms": ["shoulder_bag"], "image_count": 134, "id": 953, "frequency": "f", "synset": "shoulder_bag.n.01"}, {"name": "shovel", "instance_count": 110, "def": "a hand tool for lifting loose material such as snow, dirt, etc.", "synonyms": ["shovel"], "image_count": 74, "id": 954, "frequency": "c", "synset": "shovel.n.01"}, {"name": "shower_head", "instance_count": 450, "def": "a plumbing fixture that sprays water over you", "synonyms": ["shower_head"], "image_count": 381, "id": 955, "frequency": "f", "synset": "shower.n.01"}, {"name": "shower_cap", "instance_count": 1, "def": "a tight cap worn to keep hair dry while showering", "synonyms": ["shower_cap"], "image_count": 1, "id": 956, "frequency": "r", "synset": "shower_cap.n.01"}, {"name": "shower_curtain", "instance_count": 479, "def": "a curtain that keeps water from splashing out of the shower area", "synonyms": ["shower_curtain"], "image_count": 381, "id": 957, "frequency": "f", "synset": "shower_curtain.n.01"}, {"name": "shredder_(for_paper)", "instance_count": 6, "def": "a device that shreds documents", "synonyms": ["shredder_(for_paper)"], "image_count": 6, "id": 958, "frequency": "r", "synset": "shredder.n.01"}, {"name": "signboard", "instance_count": 8091, "def": "structure displaying a board on which advertisements can be posted", "synonyms": ["signboard"], "image_count": 1826, "id": 959, "frequency": "f", "synset": "signboard.n.01"}, {"name": "silo", "instance_count": 95, "def": "a cylindrical tower used for storing goods", "synonyms": ["silo"], "image_count": 28, "id": 960, "frequency": "c", "synset": "silo.n.01"}, {"name": "sink", "instance_count": 2182, "def": "plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe", "synonyms": ["sink"], "image_count": 1635, "id": 961, "frequency": "f", "synset": "sink.n.01"}, {"name": "skateboard", "instance_count": 3597, "def": "a board with wheels that is ridden in a standing or crouching position and propelled by foot", "synonyms": ["skateboard"], "image_count": 1967, "id": 962, "frequency": "f", "synset": "skateboard.n.01"}, {"name": "skewer", "instance_count": 81, "def": "a long pin for holding meat in position while it is being roasted", "synonyms": ["skewer"], "image_count": 16, "id": 963, "frequency": "c", "synset": "skewer.n.01"}, {"name": "ski", "instance_count": 8496, "def": "sports equipment for skiing on snow", "synonyms": ["ski"], "image_count": 1926, "id": 964, "frequency": "f", "synset": "ski.n.01"}, {"name": "ski_boot", "instance_count": 8124, "def": "a stiff boot that is fastened to a ski with a ski binding", "synonyms": ["ski_boot"], "image_count": 1789, "id": 965, "frequency": "f", "synset": "ski_boot.n.01"}, {"name": "ski_parka", "instance_count": 1727, "def": "a parka to be worn while skiing", "synonyms": ["ski_parka", "ski_jacket"], "image_count": 401, "id": 966, "frequency": "f", "synset": "ski_parka.n.01"}, {"name": "ski_pole", "instance_count": 8263, "def": "a pole with metal points used as an aid in skiing", "synonyms": ["ski_pole"], "image_count": 1968, "id": 967, "frequency": "f", "synset": "ski_pole.n.01"}, {"name": "skirt", "instance_count": 1784, "def": "a garment hanging from the waist; worn mainly by girls and women", "synonyms": ["skirt"], "image_count": 1167, "id": 968, "frequency": "f", "synset": "skirt.n.02"}, {"name": "skullcap", "instance_count": 1, "def": "rounded brimless cap fitting the crown of the head", "synonyms": ["skullcap"], "image_count": 1, "id": 969, "frequency": "r", "synset": "skullcap.n.01"}, 
{"name": "sled", "instance_count": 102, "def": "a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.", "synonyms": ["sled", "sledge", "sleigh"], "image_count": 56, "id": 970, "frequency": "c", "synset": "sled.n.01"}, {"name": "sleeping_bag", "instance_count": 33, "def": "large padded bag designed to be slept in outdoors", "synonyms": ["sleeping_bag"], "image_count": 17, "id": 971, "frequency": "c", "synset": "sleeping_bag.n.01"}, {"name": "sling_(bandage)", "instance_count": 1, "def": "bandage to support an injured forearm; slung over the shoulder or neck", "synonyms": ["sling_(bandage)", "triangular_bandage"], "image_count": 1, "id": 972, "frequency": "r", "synset": "sling.n.05"}, {"name": "slipper_(footwear)", "instance_count": 121, "def": "low footwear that can be slipped on and off easily; usually worn indoors", "synonyms": ["slipper_(footwear)", "carpet_slipper_(footwear)"], "image_count": 58, "id": 973, "frequency": "c", "synset": "slipper.n.01"}, {"name": "smoothie", "instance_count": 53, "def": "a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk", "synonyms": ["smoothie"], "image_count": 9, "id": 974, "frequency": "r", "synset": "smoothie.n.02"}, {"name": "snake", "instance_count": 16, "def": "limbless scaly elongate reptile; some are venomous", "synonyms": ["snake", "serpent"], "image_count": 8, "id": 975, "frequency": "r", "synset": "snake.n.01"}, {"name": "snowboard", "instance_count": 2119, "def": "a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes", "synonyms": ["snowboard"], "image_count": 1124, "id": 976, "frequency": "f", "synset": "snowboard.n.01"}, {"name": "snowman", "instance_count": 61, "def": "a figure of a person made of packed snow", "synonyms": ["snowman"], "image_count": 31, "id": 977, "frequency": "c", "synset": "snowman.n.01"}, {"name": "snowmobile", "instance_count": 23, "def": "tracked vehicle for travel on snow having skis in front", "synonyms": ["snowmobile"], "image_count": 16, "id": 978, "frequency": "c", "synset": "snowmobile.n.01"}, {"name": "soap", "instance_count": 895, "def": "a cleansing agent made from the salts of vegetable or animal fats", "synonyms": ["soap"], "image_count": 491, "id": 979, "frequency": "f", "synset": "soap.n.01"}, {"name": "soccer_ball", "instance_count": 670, "def": "an inflated ball used in playing soccer (called `football' outside of the United States)", "synonyms": ["soccer_ball"], "image_count": 432, "id": 980, "frequency": "f", "synset": "soccer_ball.n.01"}, {"name": "sock", "instance_count": 6866, "def": "cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee", "synonyms": ["sock"], "image_count": 1945, "id": 981, "frequency": "f", "synset": "sock.n.01"}, {"name": "sofa", "instance_count": 2408, "def": "an upholstered seat for more than one person", "synonyms": ["sofa", "couch", "lounge"], "image_count": 1899, "id": 982, "frequency": "f", "synset": "sofa.n.01"}, {"name": "softball", "instance_count": 5, "def": "ball used in playing softball", "synonyms": ["softball"], "image_count": 5, "id": 983, "frequency": "r", "synset": "softball.n.01"}, {"name": "solar_array", "instance_count": 52, "def": "electrical device consisting of a large array of connected solar cells", "synonyms": ["solar_array", "solar_battery", "solar_panel"], "image_count": 28, "id": 984, "frequency": "c", "synset": "solar_array.n.01"}, {"name": "sombrero", "instance_count": 22, 
"def": "a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico", "synonyms": ["sombrero"], "image_count": 7, "id": 985, "frequency": "r", "synset": "sombrero.n.02"}, {"name": "soup", "instance_count": 193, "def": "liquid food especially of meat or fish or vegetable stock often containing pieces of solid food", "synonyms": ["soup"], "image_count": 146, "id": 986, "frequency": "f", "synset": "soup.n.01"}, {"name": "soup_bowl", "instance_count": 2, "def": "a bowl for serving soup", "synonyms": ["soup_bowl"], "image_count": 1, "id": 987, "frequency": "r", "synset": "soup_bowl.n.01"}, {"name": "soupspoon", "instance_count": 44, "def": "a spoon with a rounded bowl for eating soup", "synonyms": ["soupspoon"], "image_count": 25, "id": 988, "frequency": "c", "synset": "soupspoon.n.01"}, {"name": "sour_cream", "instance_count": 49, "def": "soured light cream", "synonyms": ["sour_cream", "soured_cream"], "image_count": 22, "id": 989, "frequency": "c", "synset": "sour_cream.n.01"}, {"name": "soya_milk", "instance_count": 2, "def": "a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu", "synonyms": ["soya_milk", "soybean_milk", "soymilk"], "image_count": 1, "id": 990, "frequency": "r", "synset": "soya_milk.n.01"}, {"name": "space_shuttle", "instance_count": 10, "def": "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", "synonyms": ["space_shuttle"], "image_count": 10, "id": 991, "frequency": "r", "synset": "space_shuttle.n.01"}, {"name": "sparkler_(fireworks)", "instance_count": 12, "def": "a firework that burns slowly and throws out a shower of sparks", "synonyms": ["sparkler_(fireworks)"], "image_count": 9, "id": 992, "frequency": "r", "synset": "sparkler.n.02"}, {"name": "spatula", "instance_count": 508, "def": "a hand tool with a thin flexible blade used to mix or spread soft substances", "synonyms": ["spatula"], "image_count": 308, "id": 993, "frequency": "f", "synset": "spatula.n.02"}, {"name": "spear", "instance_count": 9, "def": "a long pointed rod used as a tool or weapon", "synonyms": ["spear", "lance"], "image_count": 4, "id": 994, "frequency": "r", "synset": "spear.n.01"}, {"name": "spectacles", "instance_count": 3040, "def": "optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision", "synonyms": ["spectacles", "specs", "eyeglasses", "glasses"], "image_count": 1969, "id": 995, "frequency": "f", "synset": "spectacles.n.01"}, {"name": "spice_rack", "instance_count": 54, "def": "a rack for displaying containers filled with spices", "synonyms": ["spice_rack"], "image_count": 45, "id": 996, "frequency": "c", "synset": "spice_rack.n.01"}, {"name": "spider", "instance_count": 19, "def": "predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body", "synonyms": ["spider"], "image_count": 12, "id": 997, "frequency": "c", "synset": "spider.n.01"}, {"name": "crawfish", "instance_count": 5, "def": "large edible marine crustacean having a spiny carapace but lacking the large pincers of true lobsters", "synonyms": ["crawfish", "crayfish"], "image_count": 1, "id": 998, "frequency": "r", "synset": "spiny_lobster.n.02"}, {"name": "sponge", "instance_count": 116, "def": "a porous mass usable to absorb water typically used for cleaning", "synonyms": ["sponge"], "image_count": 85, "id": 999, "frequency": "c", "synset": "sponge.n.01"}, {"name": "spoon", "instance_count": 2111, 
"def": "a piece of cutlery with a shallow bowl-shaped container and a handle", "synonyms": ["spoon"], "image_count": 1127, "id": 1000, "frequency": "f", "synset": "spoon.n.01"}, {"name": "sportswear", "instance_count": 85, "def": "attire worn for sport or for casual wear", "synonyms": ["sportswear", "athletic_wear", "activewear"], "image_count": 11, "id": 1001, "frequency": "c", "synset": "sportswear.n.01"}, {"name": "spotlight", "instance_count": 403, "def": "a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer", "synonyms": ["spotlight"], "image_count": 60, "id": 1002, "frequency": "c", "synset": "spotlight.n.02"}, {"name": "squid_(food)", "instance_count": 6, "def": "(Italian cuisine) squid prepared as food", "synonyms": ["squid_(food)", "calamari", "calamary"], "image_count": 1, "id": 1003, "frequency": "r", "synset": "squid.n.01"}, {"name": "squirrel", "instance_count": 19, "def": "a kind of arboreal rodent having a long bushy tail", "synonyms": ["squirrel"], "image_count": 16, "id": 1004, "frequency": "c", "synset": "squirrel.n.01"}, {"name": "stagecoach", "instance_count": 1, "def": "a large coach-and-four formerly used to carry passengers and mail on regular routes between towns", "synonyms": ["stagecoach"], "image_count": 1, "id": 1005, "frequency": "r", "synset": "stagecoach.n.01"}, {"name": "stapler_(stapling_machine)", "instance_count": 68, "def": "a machine that inserts staples into sheets of paper in order to fasten them together", "synonyms": ["stapler_(stapling_machine)"], "image_count": 65, "id": 1006, "frequency": "c", "synset": "stapler.n.01"}, {"name": "starfish", "instance_count": 28, "def": "echinoderms characterized by five arms extending from a central disk", "synonyms": ["starfish", "sea_star"], "image_count": 13, "id": 1007, "frequency": "c", "synset": "starfish.n.01"}, {"name": "statue_(sculpture)", "instance_count": 1934, "def": "a sculpture representing a human or animal", "synonyms": ["statue_(sculpture)"], "image_count": 655, "id": 1008, "frequency": "f", "synset": "statue.n.01"}, {"name": "steak_(food)", "instance_count": 139, "def": "a slice of meat cut from the fleshy part of an animal or large fish", "synonyms": ["steak_(food)"], "image_count": 51, "id": 1009, "frequency": "c", "synset": "steak.n.01"}, {"name": "steak_knife", "instance_count": 1, "def": "a sharp table knife used in eating steak", "synonyms": ["steak_knife"], "image_count": 1, "id": 1010, "frequency": "r", "synset": "steak_knife.n.01"}, {"name": "steering_wheel", "instance_count": 901, "def": "a handwheel that is used for steering", "synonyms": ["steering_wheel"], "image_count": 673, "id": 1011, "frequency": "f", "synset": "steering_wheel.n.01"}, {"name": "stepladder", "instance_count": 5, "def": "a folding portable ladder hinged at the top", "synonyms": ["stepladder"], "image_count": 5, "id": 1012, "frequency": "r", "synset": "step_ladder.n.01"}, {"name": "step_stool", "instance_count": 43, "def": "a stool that has one or two steps that fold under the seat", "synonyms": ["step_stool"], "image_count": 36, "id": 1013, "frequency": "c", "synset": "step_stool.n.01"}, {"name": "stereo_(sound_system)", "instance_count": 77, "def": "electronic device for playing audio", "synonyms": ["stereo_(sound_system)"], "image_count": 54, "id": 1014, "frequency": "c", "synset": "stereo.n.01"}, {"name": "stew", "instance_count": 7, "def": "food prepared by stewing especially meat or fish with vegetables", "synonyms": ["stew"], 
"image_count": 5, "id": 1015, "frequency": "r", "synset": "stew.n.02"}, {"name": "stirrer", "instance_count": 18, "def": "an implement used for stirring", "synonyms": ["stirrer"], "image_count": 8, "id": 1016, "frequency": "r", "synset": "stirrer.n.02"}, {"name": "stirrup", "instance_count": 625, "def": "support consisting of metal loops into which rider's feet go", "synonyms": ["stirrup"], "image_count": 305, "id": 1017, "frequency": "f", "synset": "stirrup.n.01"}, {"name": "stool", "instance_count": 583, "def": "a simple seat without a back or arms", "synonyms": ["stool"], "image_count": 297, "id": 1018, "frequency": "f", "synset": "stool.n.01"}, {"name": "stop_sign", "instance_count": 1349, "def": "a traffic sign to notify drivers that they must come to a complete stop", "synonyms": ["stop_sign"], "image_count": 1053, "id": 1019, "frequency": "f", "synset": "stop_sign.n.01"}, {"name": "brake_light", "instance_count": 1334, "def": "a red light on the rear of a motor vehicle that signals when the brakes are applied", "synonyms": ["brake_light"], "image_count": 223, "id": 1020, "frequency": "f", "synset": "stoplight.n.01"}, {"name": "stove", "instance_count": 1133, "def": "a kitchen appliance used for cooking food", "synonyms": ["stove", "kitchen_stove", "range_(kitchen_appliance)", "kitchen_range", "cooking_stove"], "image_count": 1037, "id": 1021, "frequency": "f", "synset": "stove.n.01"}, {"name": "strainer", "instance_count": 99, "def": "a filter to retain larger pieces while smaller pieces and liquids pass through", "synonyms": ["strainer"], "image_count": 63, "id": 1022, "frequency": "c", "synset": "strainer.n.01"}, {"name": "strap", "instance_count": 7435, "def": "an elongated strip of material for binding things together or holding", "synonyms": ["strap"], "image_count": 1881, "id": 1023, "frequency": "f", "synset": "strap.n.01"}, {"name": "straw_(for_drinking)", "instance_count": 1154, "def": "a thin paper or plastic tube used to suck liquids into the mouth", "synonyms": ["straw_(for_drinking)", "drinking_straw"], "image_count": 507, "id": 1024, "frequency": "f", "synset": "straw.n.04"}, {"name": "strawberry", "instance_count": 4386, "def": "sweet fleshy red fruit", "synonyms": ["strawberry"], "image_count": 333, "id": 1025, "frequency": "f", "synset": "strawberry.n.01"}, {"name": "street_sign", "instance_count": 8350, "def": "a sign visible from the street", "synonyms": ["street_sign"], "image_count": 1911, "id": 1026, "frequency": "f", "synset": "street_sign.n.01"}, {"name": "streetlight", "instance_count": 7381, "def": "a lamp supported on a lamppost; for illuminating a street", "synonyms": ["streetlight", "street_lamp"], "image_count": 1765, "id": 1027, "frequency": "f", "synset": "streetlight.n.01"}, {"name": "string_cheese", "instance_count": 1, "def": "cheese formed in long strings twisted together", "synonyms": ["string_cheese"], "image_count": 1, "id": 1028, "frequency": "r", "synset": "string_cheese.n.01"}, {"name": "stylus", "instance_count": 11, "def": "a pointed tool for writing or drawing or engraving, including pens", "synonyms": ["stylus"], "image_count": 5, "id": 1029, "frequency": "r", "synset": "stylus.n.02"}, {"name": "subwoofer", "instance_count": 1, "def": "a loudspeaker that is designed to reproduce very low bass frequencies", "synonyms": ["subwoofer"], "image_count": 1, "id": 1030, "frequency": "r", "synset": "subwoofer.n.01"}, {"name": "sugar_bowl", "instance_count": 10, "def": "a dish in which sugar is served", "synonyms": ["sugar_bowl"], "image_count": 
9, "id": 1031, "frequency": "r", "synset": "sugar_bowl.n.01"}, {"name": "sugarcane_(plant)", "instance_count": 31, "def": "juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice", "synonyms": ["sugarcane_(plant)"], "image_count": 2, "id": 1032, "frequency": "r", "synset": "sugarcane.n.01"}, {"name": "suit_(clothing)", "instance_count": 461, "def": "a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color", "synonyms": ["suit_(clothing)"], "image_count": 151, "id": 1033, "frequency": "f", "synset": "suit.n.01"}, {"name": "sunflower", "instance_count": 618, "def": "any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays", "synonyms": ["sunflower"], "image_count": 82, "id": 1034, "frequency": "c", "synset": "sunflower.n.01"}, {"name": "sunglasses", "instance_count": 5603, "def": "spectacles that are darkened or polarized to protect the eyes from the glare of the sun", "synonyms": ["sunglasses"], "image_count": 1931, "id": 1035, "frequency": "f", "synset": "sunglasses.n.01"}, {"name": "sunhat", "instance_count": 170, "def": "a hat with a broad brim that protects the face from direct exposure to the sun", "synonyms": ["sunhat"], "image_count": 41, "id": 1036, "frequency": "c", "synset": "sunhat.n.01"}, {"name": "surfboard", "instance_count": 3835, "def": "a narrow buoyant board for riding surf", "synonyms": ["surfboard"], "image_count": 1895, "id": 1037, "frequency": "f", "synset": "surfboard.n.01"}, {"name": "sushi", "instance_count": 337, "def": "rice (with raw fish) wrapped in seaweed", "synonyms": ["sushi"], "image_count": 24, "id": 1038, "frequency": "c", "synset": "sushi.n.01"}, {"name": "mop", "instance_count": 22, "def": "cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors", "synonyms": ["mop"], "image_count": 22, "id": 1039, "frequency": "c", "synset": "swab.n.02"}, {"name": "sweat_pants", "instance_count": 56, "def": "loose-fitting trousers with elastic cuffs; worn by athletes", "synonyms": ["sweat_pants"], "image_count": 35, "id": 1040, "frequency": "c", "synset": "sweat_pants.n.01"}, {"name": "sweatband", "instance_count": 145, "def": "a band of material tied around the forehead or wrist to absorb sweat", "synonyms": ["sweatband"], "image_count": 69, "id": 1041, "frequency": "c", "synset": "sweatband.n.02"}, {"name": "sweater", "instance_count": 1894, "def": "a crocheted or knitted garment covering the upper part of the body", "synonyms": ["sweater"], "image_count": 962, "id": 1042, "frequency": "f", "synset": "sweater.n.01"}, {"name": "sweatshirt", "instance_count": 1482, "def": "cotton knit pullover with long sleeves worn during athletic activity", "synonyms": ["sweatshirt"], "image_count": 588, "id": 1043, "frequency": "f", "synset": "sweatshirt.n.01"}, {"name": "sweet_potato", "instance_count": 137, "def": "the edible tuberous root of the sweet potato vine", "synonyms": ["sweet_potato"], "image_count": 21, "id": 1044, "frequency": "c", "synset": "sweet_potato.n.02"}, {"name": "swimsuit", "instance_count": 3141, "def": "garment worn for swimming", "synonyms": ["swimsuit", "swimwear", "bathing_suit", "swimming_costume", "bathing_costume", "swimming_trunks", "bathing_trunks"], "image_count": 825, "id": 1045, "frequency": "f", "synset": "swimsuit.n.01"}, {"name": "sword", "instance_count": 72, "def": "a cutting or thrusting weapon that has a long metal blade", "synonyms": 
["sword"], "image_count": 52, "id": 1046, "frequency": "c", "synset": "sword.n.01"}, {"name": "syringe", "instance_count": 14, "def": "a medical instrument used to inject or withdraw fluids", "synonyms": ["syringe"], "image_count": 5, "id": 1047, "frequency": "r", "synset": "syringe.n.01"}, {"name": "Tabasco_sauce", "instance_count": 5, "def": "very spicy sauce (trade name Tabasco) made from fully-aged red peppers", "synonyms": ["Tabasco_sauce"], "image_count": 5, "id": 1048, "frequency": "r", "synset": "tabasco.n.02"}, {"name": "table-tennis_table", "instance_count": 5, "def": "a table used for playing table tennis", "synonyms": ["table-tennis_table", "ping-pong_table"], "image_count": 5, "id": 1049, "frequency": "r", "synset": "table-tennis_table.n.01"}, {"name": "table", "instance_count": 2804, "def": "a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs", "synonyms": ["table"], "image_count": 1860, "id": 1050, "frequency": "f", "synset": "table.n.02"}, {"name": "table_lamp", "instance_count": 81, "def": "a lamp that sits on a table", "synonyms": ["table_lamp"], "image_count": 56, "id": 1051, "frequency": "c", "synset": "table_lamp.n.01"}, {"name": "tablecloth", "instance_count": 2496, "def": "a covering spread over a dining table", "synonyms": ["tablecloth"], "image_count": 1582, "id": 1052, "frequency": "f", "synset": "tablecloth.n.01"}, {"name": "tachometer", "instance_count": 10, "def": "measuring instrument for indicating speed of rotation", "synonyms": ["tachometer"], "image_count": 7, "id": 1053, "frequency": "r", "synset": "tachometer.n.01"}, {"name": "taco", "instance_count": 21, "def": "a small tortilla cupped around a filling", "synonyms": ["taco"], "image_count": 2, "id": 1054, "frequency": "r", "synset": "taco.n.02"}, {"name": "tag", "instance_count": 7550, "def": "a label associated with something for the purpose of identification or information", "synonyms": ["tag"], "image_count": 1562, "id": 1055, "frequency": "f", "synset": "tag.n.02"}, {"name": "taillight", "instance_count": 9222, "def": "lamp (usually red) mounted at the rear of a motor vehicle", "synonyms": ["taillight", "rear_light"], "image_count": 1885, "id": 1056, "frequency": "f", "synset": "taillight.n.01"}, {"name": "tambourine", "instance_count": 1, "def": "a shallow drum with a single drumhead and with metallic disks in the sides", "synonyms": ["tambourine"], "image_count": 1, "id": 1057, "frequency": "r", "synset": "tambourine.n.01"}, {"name": "army_tank", "instance_count": 7, "def": "an enclosed armored military vehicle; has a cannon and moves on caterpillar treads", "synonyms": ["army_tank", "armored_combat_vehicle", "armoured_combat_vehicle"], "image_count": 5, "id": 1058, "frequency": "r", "synset": "tank.n.01"}, {"name": "tank_(storage_vessel)", "instance_count": 304, "def": "a large (usually metallic) vessel for holding gases or liquids", "synonyms": ["tank_(storage_vessel)", "storage_tank"], "image_count": 137, "id": 1059, "frequency": "f", "synset": "tank.n.02"}, {"name": "tank_top_(clothing)", "instance_count": 1799, "def": "a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening", "synonyms": ["tank_top_(clothing)"], "image_count": 1094, "id": 1060, "frequency": "f", "synset": "tank_top.n.01"}, {"name": "tape_(sticky_cloth_or_paper)", "instance_count": 560, "def": "a long thin piece of cloth or paper as used for binding or fastening", "synonyms": ["tape_(sticky_cloth_or_paper)"], "image_count": 134, "id": 1061, 
"frequency": "f", "synset": "tape.n.01"}, {"name": "tape_measure", "instance_count": 35, "def": "measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths", "synonyms": ["tape_measure", "measuring_tape"], "image_count": 29, "id": 1062, "frequency": "c", "synset": "tape.n.04"}, {"name": "tapestry", "instance_count": 29, "def": "a heavy textile with a woven design; used for curtains and upholstery", "synonyms": ["tapestry"], "image_count": 22, "id": 1063, "frequency": "c", "synset": "tapestry.n.02"}, {"name": "tarp", "instance_count": 1315, "def": "waterproofed canvas", "synonyms": ["tarp"], "image_count": 522, "id": 1064, "frequency": "f", "synset": "tarpaulin.n.01"}, {"name": "tartan", "instance_count": 68, "def": "a cloth having a crisscross design", "synonyms": ["tartan", "plaid"], "image_count": 50, "id": 1065, "frequency": "c", "synset": "tartan.n.01"}, {"name": "tassel", "instance_count": 276, "def": "adornment consisting of a bunch of cords fastened at one end", "synonyms": ["tassel"], "image_count": 68, "id": 1066, "frequency": "c", "synset": "tassel.n.01"}, {"name": "tea_bag", "instance_count": 42, "def": "a measured amount of tea in a bag for an individual serving of tea", "synonyms": ["tea_bag"], "image_count": 16, "id": 1067, "frequency": "c", "synset": "tea_bag.n.01"}, {"name": "teacup", "instance_count": 152, "def": "a cup from which tea is drunk", "synonyms": ["teacup"], "image_count": 40, "id": 1068, "frequency": "c", "synset": "teacup.n.02"}, {"name": "teakettle", "instance_count": 40, "def": "kettle for boiling water to make tea", "synonyms": ["teakettle"], "image_count": 35, "id": 1069, "frequency": "c", "synset": "teakettle.n.01"}, {"name": "teapot", "instance_count": 209, "def": "pot for brewing tea; usually has a spout and handle", "synonyms": ["teapot"], "image_count": 135, "id": 1070, "frequency": "f", "synset": "teapot.n.01"}, {"name": "teddy_bear", "instance_count": 4886, "def": "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", "synonyms": ["teddy_bear"], "image_count": 1413, "id": 1071, "frequency": "f", "synset": "teddy.n.01"}, {"name": "telephone", "instance_count": 945, "def": "electronic device for communicating by voice over long distances (includes wired and wireless/cell phones)", "synonyms": ["telephone", "phone", "telephone_set"], "image_count": 772, "id": 1072, "frequency": "f", "synset": "telephone.n.01"}, {"name": "telephone_booth", "instance_count": 62, "def": "booth for using a telephone", "synonyms": ["telephone_booth", "phone_booth", "call_box", "telephone_box", "telephone_kiosk"], "image_count": 50, "id": 1073, "frequency": "c", "synset": "telephone_booth.n.01"}, {"name": "telephone_pole", "instance_count": 3725, "def": "tall pole supporting telephone wires", "synonyms": ["telephone_pole", "telegraph_pole", "telegraph_post"], "image_count": 1015, "id": 1074, "frequency": "f", "synset": "telephone_pole.n.01"}, {"name": "telephoto_lens", "instance_count": 1, "def": "a camera lens that magnifies the image", "synonyms": ["telephoto_lens", "zoom_lens"], "image_count": 1, "id": 1075, "frequency": "r", "synset": "telephoto_lens.n.01"}, {"name": "television_camera", "instance_count": 117, "def": "television equipment for capturing and recording video", "synonyms": ["television_camera", "tv_camera"], "image_count": 65, "id": 1076, "frequency": "c", "synset": "television_camera.n.01"}, {"name": "television_set", "instance_count": 2205, "def": 
"an electronic device that receives television signals and displays them on a screen", "synonyms": ["television_set", "tv", "tv_set"], "image_count": 1900, "id": 1077, "frequency": "f", "synset": "television_receiver.n.01"}, {"name": "tennis_ball", "instance_count": 2835, "def": "ball about the size of a fist used in playing tennis", "synonyms": ["tennis_ball"], "image_count": 1302, "id": 1078, "frequency": "f", "synset": "tennis_ball.n.01"}, {"name": "tennis_racket", "instance_count": 3035, "def": "a racket used to play tennis", "synonyms": ["tennis_racket"], "image_count": 1977, "id": 1079, "frequency": "f", "synset": "tennis_racket.n.01"}, {"name": "tequila", "instance_count": 2, "def": "Mexican liquor made from fermented juices of an agave plant", "synonyms": ["tequila"], "image_count": 2, "id": 1080, "frequency": "r", "synset": "tequila.n.01"}, {"name": "thermometer", "instance_count": 33, "def": "measuring instrument for measuring temperature", "synonyms": ["thermometer"], "image_count": 29, "id": 1081, "frequency": "c", "synset": "thermometer.n.01"}, {"name": "thermos_bottle", "instance_count": 49, "def": "vacuum flask that preserves temperature of hot or cold drinks", "synonyms": ["thermos_bottle"], "image_count": 36, "id": 1082, "frequency": "c", "synset": "thermos.n.01"}, {"name": "thermostat", "instance_count": 153, "def": "a regulator for automatically regulating temperature by starting or stopping the supply of heat", "synonyms": ["thermostat"], "image_count": 138, "id": 1083, "frequency": "f", "synset": "thermostat.n.01"}, {"name": "thimble", "instance_count": 6, "def": "a small metal cap to protect the finger while sewing; can be used as a small container", "synonyms": ["thimble"], "image_count": 4, "id": 1084, "frequency": "r", "synset": "thimble.n.02"}, {"name": "thread", "instance_count": 320, "def": "a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) 
used in sewing and weaving", "synonyms": ["thread", "yarn"], "image_count": 67, "id": 1085, "frequency": "c", "synset": "thread.n.01"}, {"name": "thumbtack", "instance_count": 224, "def": "a tack for attaching papers to a bulletin board or drawing board", "synonyms": ["thumbtack", "drawing_pin", "pushpin"], "image_count": 26, "id": 1086, "frequency": "c", "synset": "thumbtack.n.01"}, {"name": "tiara", "instance_count": 31, "def": "a jeweled headdress worn by women on formal occasions", "synonyms": ["tiara"], "image_count": 25, "id": 1087, "frequency": "c", "synset": "tiara.n.01"}, {"name": "tiger", "instance_count": 67, "def": "large feline of forests in most of Asia having a tawny coat with black stripes", "synonyms": ["tiger"], "image_count": 33, "id": 1088, "frequency": "c", "synset": "tiger.n.02"}, {"name": "tights_(clothing)", "instance_count": 45, "def": "skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls", "synonyms": ["tights_(clothing)", "leotards"], "image_count": 37, "id": 1089, "frequency": "c", "synset": "tights.n.01"}, {"name": "timer", "instance_count": 62, "def": "a timepiece that measures a time interval and signals its end", "synonyms": ["timer", "stopwatch"], "image_count": 50, "id": 1090, "frequency": "c", "synset": "timer.n.01"}, {"name": "tinfoil", "instance_count": 421, "def": "foil made of tin or an alloy of tin and lead", "synonyms": ["tinfoil"], "image_count": 270, "id": 1091, "frequency": "f", "synset": "tinfoil.n.01"}, {"name": "tinsel", "instance_count": 70, "def": "a showy decoration that is basically valueless", "synonyms": ["tinsel"], "image_count": 12, "id": 1092, "frequency": "c", "synset": "tinsel.n.01"}, {"name": "tissue_paper", "instance_count": 587, "def": "a soft thin (usually translucent) paper", "synonyms": ["tissue_paper"], "image_count": 316, "id": 1093, "frequency": "f", "synset": "tissue.n.02"}, {"name": "toast_(food)", "instance_count": 125, "def": "slice of bread that has been toasted", "synonyms": ["toast_(food)"], "image_count": 41, "id": 1094, "frequency": "c", "synset": "toast.n.01"}, {"name": "toaster", "instance_count": 240, "def": "a kitchen appliance (usually electric) for toasting bread", "synonyms": ["toaster"], "image_count": 224, "id": 1095, "frequency": "f", "synset": "toaster.n.02"}, {"name": "toaster_oven", "instance_count": 114, "def": "kitchen appliance consisting of a small electric oven for toasting or warming food", "synonyms": ["toaster_oven"], "image_count": 105, "id": 1096, "frequency": "f", "synset": "toaster_oven.n.01"}, {"name": "toilet", "instance_count": 2295, "def": "a plumbing fixture for defecation and urination", "synonyms": ["toilet"], "image_count": 1925, "id": 1097, "frequency": "f", "synset": "toilet.n.02"}, {"name": "toilet_tissue", "instance_count": 1683, "def": "a soft thin absorbent paper for use in toilets", "synonyms": ["toilet_tissue", "toilet_paper", "bathroom_tissue"], "image_count": 1021, "id": 1098, "frequency": "f", "synset": "toilet_tissue.n.01"}, {"name": "tomato", "instance_count": 12338, "def": "mildly acid red or yellow pulpy fruit eaten as a vegetable", "synonyms": ["tomato"], "image_count": 1213, "id": 1099, "frequency": "f", "synset": "tomato.n.01"}, {"name": "tongs", "instance_count": 294, "def": "any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below", "synonyms": ["tongs"], "image_count": 172, "id": 1100, "frequency": "f", "synset": 
"tongs.n.01"}, {"name": "toolbox", "instance_count": 39, "def": "a box or chest or cabinet for holding hand tools", "synonyms": ["toolbox"], "image_count": 28, "id": 1101, "frequency": "c", "synset": "toolbox.n.01"}, {"name": "toothbrush", "instance_count": 1683, "def": "small brush; has long handle; used to clean teeth", "synonyms": ["toothbrush"], "image_count": 745, "id": 1102, "frequency": "f", "synset": "toothbrush.n.01"}, {"name": "toothpaste", "instance_count": 326, "def": "a dentifrice in the form of a paste", "synonyms": ["toothpaste"], "image_count": 187, "id": 1103, "frequency": "f", "synset": "toothpaste.n.01"}, {"name": "toothpick", "instance_count": 423, "def": "pick consisting of a small strip of wood or plastic; used to pick food from between the teeth", "synonyms": ["toothpick"], "image_count": 147, "id": 1104, "frequency": "f", "synset": "toothpick.n.01"}, {"name": "cover", "instance_count": 306, "def": "covering for a hole (especially a hole in the top of a container)", "synonyms": ["cover"], "image_count": 136, "id": 1105, "frequency": "f", "synset": "top.n.09"}, {"name": "tortilla", "instance_count": 135, "def": "thin unleavened pancake made from cornmeal or wheat flour", "synonyms": ["tortilla"], "image_count": 34, "id": 1106, "frequency": "c", "synset": "tortilla.n.01"}, {"name": "tow_truck", "instance_count": 45, "def": "a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)", "synonyms": ["tow_truck"], "image_count": 41, "id": 1107, "frequency": "c", "synset": "tow_truck.n.01"}, {"name": "towel", "instance_count": 2212, "def": "a rectangular piece of absorbent cloth (or paper) for drying or wiping", "synonyms": ["towel"], "image_count": 636, "id": 1108, "frequency": "f", "synset": "towel.n.01"}, {"name": "towel_rack", "instance_count": 987, "def": "a rack consisting of one or more bars on which towels can be hung", "synonyms": ["towel_rack", "towel_rail", "towel_bar"], "image_count": 570, "id": 1109, "frequency": "f", "synset": "towel_rack.n.01"}, {"name": "toy", "instance_count": 6756, "def": "a device regarded as providing amusement", "synonyms": ["toy"], "image_count": 1149, "id": 1110, "frequency": "f", "synset": "toy.n.03"}, {"name": "tractor_(farm_equipment)", "instance_count": 80, "def": "a wheeled vehicle with large wheels; used in farming and other applications", "synonyms": ["tractor_(farm_equipment)"], "image_count": 61, "id": 1111, "frequency": "c", "synset": "tractor.n.01"}, {"name": "traffic_light", "instance_count": 7298, "def": "a device to control vehicle traffic often consisting of three or more lights", "synonyms": ["traffic_light"], "image_count": 1890, "id": 1112, "frequency": "f", "synset": "traffic_light.n.01"}, {"name": "dirt_bike", "instance_count": 47, "def": "a lightweight motorcycle equipped with rugged tires and suspension for off-road use", "synonyms": ["dirt_bike"], "image_count": 18, "id": 1113, "frequency": "c", "synset": "trail_bike.n.01"}, {"name": "trailer_truck", "instance_count": 297, "def": "a truck consisting of a tractor and trailer together", "synonyms": ["trailer_truck", "tractor_trailer", "trucking_rig", "articulated_lorry", "semi_truck"], "image_count": 143, "id": 1114, "frequency": "f", "synset": "trailer_truck.n.01"}, {"name": "train_(railroad_vehicle)", "instance_count": 2192, "def": "public or private transport provided by a line of railway cars coupled together and drawn by a locomotive", "synonyms": ["train_(railroad_vehicle)", "railroad_train"], "image_count": 1517, "id": 1115, 
"frequency": "f", "synset": "train.n.01"}, {"name": "trampoline", "instance_count": 7, "def": "gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame", "synonyms": ["trampoline"], "image_count": 7, "id": 1116, "frequency": "r", "synset": "trampoline.n.01"}, {"name": "tray", "instance_count": 2397, "def": "an open receptacle for holding or displaying or serving articles or food", "synonyms": ["tray"], "image_count": 943, "id": 1117, "frequency": "f", "synset": "tray.n.01"}, {"name": "trench_coat", "instance_count": 16, "def": "a military style raincoat; belted with deep pockets", "synonyms": ["trench_coat"], "image_count": 6, "id": 1118, "frequency": "r", "synset": "trench_coat.n.01"}, {"name": "triangle_(musical_instrument)", "instance_count": 1, "def": "a percussion instrument consisting of a metal bar bent in the shape of an open triangle", "synonyms": ["triangle_(musical_instrument)"], "image_count": 1, "id": 1119, "frequency": "r", "synset": "triangle.n.05"}, {"name": "tricycle", "instance_count": 15, "def": "a vehicle with three wheels that is moved by foot pedals", "synonyms": ["tricycle"], "image_count": 11, "id": 1120, "frequency": "c", "synset": "tricycle.n.01"}, {"name": "tripod", "instance_count": 132, "def": "a three-legged rack used for support", "synonyms": ["tripod"], "image_count": 101, "id": 1121, "frequency": "f", "synset": "tripod.n.01"}, {"name": "trousers", "instance_count": 7806, "def": "a garment extending from the waist to the knee or ankle, covering each leg separately", "synonyms": ["trousers", "pants_(clothing)"], "image_count": 1909, "id": 1122, "frequency": "f", "synset": "trouser.n.01"}, {"name": "truck", "instance_count": 1797, "def": "an automotive vehicle suitable for hauling", "synonyms": ["truck"], "image_count": 800, "id": 1123, "frequency": "f", "synset": "truck.n.01"}, {"name": "truffle_(chocolate)", "instance_count": 4, "def": "creamy chocolate candy", "synonyms": ["truffle_(chocolate)", "chocolate_truffle"], "image_count": 1, "id": 1124, "frequency": "r", "synset": "truffle.n.03"}, {"name": "trunk", "instance_count": 334, "def": "luggage consisting of a large strong case used when traveling or for storage", "synonyms": ["trunk"], "image_count": 44, "id": 1125, "frequency": "c", "synset": "trunk.n.02"}, {"name": "vat", "instance_count": 15, "def": "a large vessel for holding or storing liquids", "synonyms": ["vat"], "image_count": 3, "id": 1126, "frequency": "r", "synset": "tub.n.02"}, {"name": "turban", "instance_count": 124, "def": "a traditional headdress consisting of a long scarf wrapped around the head", "synonyms": ["turban"], "image_count": 44, "id": 1127, "frequency": "c", "synset": "turban.n.01"}, {"name": "turkey_(food)", "instance_count": 120, "def": "flesh of large domesticated fowl usually roasted", "synonyms": ["turkey_(food)"], "image_count": 31, "id": 1128, "frequency": "c", "synset": "turkey.n.04"}, {"name": "turnip", "instance_count": 109, "def": "widely cultivated plant having a large fleshy edible white or yellow root", "synonyms": ["turnip"], "image_count": 7, "id": 1129, "frequency": "r", "synset": "turnip.n.01"}, {"name": "turtle", "instance_count": 31, "def": "any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming", "synonyms": ["turtle"], "image_count": 20, "id": 1130, "frequency": "c", "synset": "turtle.n.02"}, {"name": "turtleneck_(clothing)", "instance_count": 13, "def": "a sweater or jersey with a high close-fitting collar", "synonyms": 
["turtleneck_(clothing)", "polo-neck"], "image_count": 11, "id": 1131, "frequency": "c", "synset": "turtleneck.n.01"}, {"name": "typewriter", "instance_count": 14, "def": "hand-operated character printer for printing written messages one character at a time", "synonyms": ["typewriter"], "image_count": 13, "id": 1132, "frequency": "c", "synset": "typewriter.n.01"}, {"name": "umbrella", "instance_count": 9161, "def": "a lightweight handheld collapsible canopy", "synonyms": ["umbrella"], "image_count": 1924, "id": 1133, "frequency": "f", "synset": "umbrella.n.01"}, {"name": "underwear", "instance_count": 164, "def": "undergarment worn next to the skin and under the outer garments", "synonyms": ["underwear", "underclothes", "underclothing", "underpants"], "image_count": 113, "id": 1134, "frequency": "f", "synset": "underwear.n.01"}, {"name": "unicycle", "instance_count": 2, "def": "a vehicle with a single wheel that is driven by pedals", "synonyms": ["unicycle"], "image_count": 2, "id": 1135, "frequency": "r", "synset": "unicycle.n.01"}, {"name": "urinal", "instance_count": 381, "def": "a plumbing fixture (usually attached to the wall) used by men to urinate", "synonyms": ["urinal"], "image_count": 139, "id": 1136, "frequency": "f", "synset": "urinal.n.01"}, {"name": "urn", "instance_count": 81, "def": "a large vase that usually has a pedestal or feet", "synonyms": ["urn"], "image_count": 12, "id": 1137, "frequency": "c", "synset": "urn.n.01"}, {"name": "vacuum_cleaner", "instance_count": 38, "def": "an electrical home appliance that cleans by suction", "synonyms": ["vacuum_cleaner"], "image_count": 37, "id": 1138, "frequency": "c", "synset": "vacuum.n.04"}, {"name": "vase", "instance_count": 4971, "def": "an open jar of glass or porcelain used as an ornament or to hold flowers", "synonyms": ["vase"], "image_count": 1866, "id": 1139, "frequency": "f", "synset": "vase.n.01"}, {"name": "vending_machine", "instance_count": 65, "def": "a slot machine for selling goods", "synonyms": ["vending_machine"], "image_count": 47, "id": 1140, "frequency": "c", "synset": "vending_machine.n.01"}, {"name": "vent", "instance_count": 3370, "def": "a hole for the escape of gas or air", "synonyms": ["vent", "blowhole", "air_vent"], "image_count": 1468, "id": 1141, "frequency": "f", "synset": "vent.n.01"}, {"name": "vest", "instance_count": 1313, "def": "a man's sleeveless garment worn underneath a coat", "synonyms": ["vest", "waistcoat"], "image_count": 729, "id": 1142, "frequency": "f", "synset": "vest.n.01"}, {"name": "videotape", "instance_count": 228, "def": "a video recording made on magnetic tape", "synonyms": ["videotape"], "image_count": 24, "id": 1143, "frequency": "c", "synset": "videotape.n.01"}, {"name": "vinegar", "instance_count": 1, "def": "sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative", "synonyms": ["vinegar"], "image_count": 1, "id": 1144, "frequency": "r", "synset": "vinegar.n.01"}, {"name": "violin", "instance_count": 10, "def": "bowed stringed instrument that is the highest member of the violin family", "synonyms": ["violin", "fiddle"], "image_count": 10, "id": 1145, "frequency": "r", "synset": "violin.n.01"}, {"name": "vodka", "instance_count": 3, "def": "unaged colorless liquor originating in Russia", "synonyms": ["vodka"], "image_count": 3, "id": 1146, "frequency": "r", "synset": "vodka.n.01"}, {"name": "volleyball", "instance_count": 33, "def": "an inflated ball used in playing volleyball", "synonyms": 
["volleyball"], "image_count": 14, "id": 1147, "frequency": "c", "synset": "volleyball.n.02"}, {"name": "vulture", "instance_count": 16, "def": "any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion", "synonyms": ["vulture"], "image_count": 4, "id": 1148, "frequency": "r", "synset": "vulture.n.01"}, {"name": "waffle", "instance_count": 61, "def": "pancake batter baked in a waffle iron", "synonyms": ["waffle"], "image_count": 29, "id": 1149, "frequency": "c", "synset": "waffle.n.01"}, {"name": "waffle_iron", "instance_count": 4, "def": "a kitchen appliance for baking waffles", "synonyms": ["waffle_iron"], "image_count": 4, "id": 1150, "frequency": "r", "synset": "waffle_iron.n.01"}, {"name": "wagon", "instance_count": 121, "def": "any of various kinds of wheeled vehicles drawn by an animal or a tractor", "synonyms": ["wagon"], "image_count": 70, "id": 1151, "frequency": "c", "synset": "wagon.n.01"}, {"name": "wagon_wheel", "instance_count": 209, "def": "a wheel of a wagon", "synonyms": ["wagon_wheel"], "image_count": 46, "id": 1152, "frequency": "c", "synset": "wagon_wheel.n.01"}, {"name": "walking_stick", "instance_count": 21, "def": "a stick carried in the hand for support in walking", "synonyms": ["walking_stick"], "image_count": 14, "id": 1153, "frequency": "c", "synset": "walking_stick.n.01"}, {"name": "wall_clock", "instance_count": 100, "def": "a clock mounted on a wall", "synonyms": ["wall_clock"], "image_count": 48, "id": 1154, "frequency": "c", "synset": "wall_clock.n.01"}, {"name": "wall_socket", "instance_count": 3069, "def": "receptacle providing a place in a wiring system where current can be taken to run electrical devices", "synonyms": ["wall_socket", "wall_plug", "electric_outlet", "electrical_outlet", "outlet", "electric_receptacle"], "image_count": 1855, "id": 1155, "frequency": "f", "synset": "wall_socket.n.01"}, {"name": "wallet", "instance_count": 123, "def": "a pocket-size case for holding papers and paper money", "synonyms": ["wallet", "billfold"], "image_count": 113, "id": 1156, "frequency": "f", "synset": "wallet.n.01"}, {"name": "walrus", "instance_count": 1, "def": "either of two large northern marine mammals having ivory tusks and tough hide over thick blubber", "synonyms": ["walrus"], "image_count": 1, "id": 1157, "frequency": "r", "synset": "walrus.n.01"}, {"name": "wardrobe", "instance_count": 1, "def": "a tall piece of furniture that provides storage space for clothes; has a door and rails or hooks for hanging clothes", "synonyms": ["wardrobe"], "image_count": 1, "id": 1158, "frequency": "r", "synset": "wardrobe.n.01"}, {"name": "washbasin", "instance_count": 15, "def": "a bathroom sink that is permanently installed and connected to a water supply and drainpipe; where you can wash your hands and face", "synonyms": ["washbasin", "basin_(for_washing)", "washbowl", "washstand", "handbasin"], "image_count": 10, "id": 1159, "frequency": "r", "synset": "washbasin.n.01"}, {"name": "automatic_washer", "instance_count": 68, "def": "a home appliance for washing clothes and linens automatically", "synonyms": ["automatic_washer", "washing_machine"], "image_count": 54, "id": 1160, "frequency": "c", "synset": "washer.n.03"}, {"name": "watch", "instance_count": 2703, "def": "a small, portable timepiece", "synonyms": ["watch", "wristwatch"], "image_count": 1923, "id": 1161, "frequency": "f", "synset": "watch.n.01"}, {"name": "water_bottle", "instance_count": 1449, "def": "a bottle for holding water", "synonyms": 
["water_bottle"], "image_count": 630, "id": 1162, "frequency": "f", "synset": "water_bottle.n.01"}, {"name": "water_cooler", "instance_count": 39, "def": "a device for cooling and dispensing drinking water", "synonyms": ["water_cooler"], "image_count": 31, "id": 1163, "frequency": "c", "synset": "water_cooler.n.01"}, {"name": "water_faucet", "instance_count": 109, "def": "a faucet for drawing water from a pipe or cask", "synonyms": ["water_faucet", "water_tap", "tap_(water_faucet)"], "image_count": 69, "id": 1164, "frequency": "c", "synset": "water_faucet.n.01"}, {"name": "water_heater", "instance_count": 7, "def": "a heater and storage tank to supply heated water", "synonyms": ["water_heater", "hot-water_heater"], "image_count": 7, "id": 1165, "frequency": "r", "synset": "water_heater.n.01"}, {"name": "water_jug", "instance_count": 23, "def": "a jug that holds water", "synonyms": ["water_jug"], "image_count": 11, "id": 1166, "frequency": "c", "synset": "water_jug.n.01"}, {"name": "water_gun", "instance_count": 1, "def": "plaything consisting of a toy pistol that squirts water", "synonyms": ["water_gun", "squirt_gun"], "image_count": 1, "id": 1167, "frequency": "r", "synset": "water_pistol.n.01"}, {"name": "water_scooter", "instance_count": 54, "def": "a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)", "synonyms": ["water_scooter", "sea_scooter", "jet_ski"], "image_count": 30, "id": 1168, "frequency": "c", "synset": "water_scooter.n.01"}, {"name": "water_ski", "instance_count": 98, "def": "broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)", "synonyms": ["water_ski"], "image_count": 50, "id": 1169, "frequency": "c", "synset": "water_ski.n.01"}, {"name": "water_tower", "instance_count": 60, "def": "a large reservoir for water", "synonyms": ["water_tower"], "image_count": 45, "id": 1170, "frequency": "c", "synset": "water_tower.n.01"}, {"name": "watering_can", "instance_count": 44, "def": "a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants", "synonyms": ["watering_can"], "image_count": 28, "id": 1171, "frequency": "c", "synset": "watering_can.n.01"}, {"name": "watermelon", "instance_count": 814, "def": "large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp", "synonyms": ["watermelon"], "image_count": 114, "id": 1172, "frequency": "f", "synset": "watermelon.n.02"}, {"name": "weathervane", "instance_count": 237, "def": "mechanical device attached to an elevated structure; rotates freely to show the direction of the wind", "synonyms": ["weathervane", "vane_(weathervane)", "wind_vane"], "image_count": 193, "id": 1173, "frequency": "f", "synset": "weathervane.n.01"}, {"name": "webcam", "instance_count": 27, "def": "a digital camera designed to take digital photographs and transmit them over the internet", "synonyms": ["webcam"], "image_count": 21, "id": 1174, "frequency": "c", "synset": "webcam.n.01"}, {"name": "wedding_cake", "instance_count": 140, "def": "a rich cake with two or more tiers and covered with frosting and decorations; served at a wedding reception", "synonyms": ["wedding_cake", "bridecake"], "image_count": 91, "id": 1175, "frequency": "c", "synset": "wedding_cake.n.01"}, {"name": "wedding_ring", "instance_count": 49, "def": "a ring given to the bride and/or groom at the wedding", "synonyms": ["wedding_ring", "wedding_band"], "image_count": 31, "id": 1176, "frequency": "c", "synset": "wedding_ring.n.01"}, {"name": "wet_suit", 
"instance_count": 2907, "def": "a close-fitting garment made of a permeable material; worn in cold water to retain body heat", "synonyms": ["wet_suit"], "image_count": 1469, "id": 1177, "frequency": "f", "synset": "wet_suit.n.01"}, {"name": "wheel", "instance_count": 11272, "def": "a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle", "synonyms": ["wheel"], "image_count": 1924, "id": 1178, "frequency": "f", "synset": "wheel.n.01"}, {"name": "wheelchair", "instance_count": 107, "def": "a movable chair mounted on large wheels", "synonyms": ["wheelchair"], "image_count": 87, "id": 1179, "frequency": "c", "synset": "wheelchair.n.01"}, {"name": "whipped_cream", "instance_count": 201, "def": "cream that has been beaten until light and fluffy", "synonyms": ["whipped_cream"], "image_count": 77, "id": 1180, "frequency": "c", "synset": "whipped_cream.n.01"}, {"name": "whistle", "instance_count": 13, "def": "a small wind instrument that produces a whistling sound by blowing into it", "synonyms": ["whistle"], "image_count": 11, "id": 1181, "frequency": "c", "synset": "whistle.n.03"}, {"name": "wig", "instance_count": 69, "def": "hairpiece covering the head and made of real or synthetic hair", "synonyms": ["wig"], "image_count": 47, "id": 1182, "frequency": "c", "synset": "wig.n.01"}, {"name": "wind_chime", "instance_count": 28, "def": "a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle", "synonyms": ["wind_chime"], "image_count": 21, "id": 1183, "frequency": "c", "synset": "wind_chime.n.01"}, {"name": "windmill", "instance_count": 202, "def": "A mill or turbine that is powered by wind", "synonyms": ["windmill"], "image_count": 47, "id": 1184, "frequency": "c", "synset": "windmill.n.01"}, {"name": "window_box_(for_plants)", "instance_count": 253, "def": "a container for growing plants on a windowsill", "synonyms": ["window_box_(for_plants)"], "image_count": 70, "id": 1185, "frequency": "c", "synset": "window_box.n.01"}, {"name": "windshield_wiper", "instance_count": 4793, "def": "a mechanical device that cleans the windshield", "synonyms": ["windshield_wiper", "windscreen_wiper", "wiper_(for_windshield/screen)"], "image_count": 1838, "id": 1186, "frequency": "f", "synset": "windshield_wiper.n.01"}, {"name": "windsock", "instance_count": 26, "def": "a truncated cloth cone mounted on a mast/pole; shows wind direction", "synonyms": ["windsock", "air_sock", "air-sleeve", "wind_sleeve", "wind_cone"], "image_count": 19, "id": 1187, "frequency": "c", "synset": "windsock.n.01"}, {"name": "wine_bottle", "instance_count": 4449, "def": "a bottle for holding wine", "synonyms": ["wine_bottle"], "image_count": 531, "id": 1188, "frequency": "f", "synset": "wine_bottle.n.01"}, {"name": "wine_bucket", "instance_count": 21, "def": "a bucket of ice used to chill a bottle of wine", "synonyms": ["wine_bucket", "wine_cooler"], "image_count": 11, "id": 1189, "frequency": "c", "synset": "wine_bucket.n.01"}, {"name": "wineglass", "instance_count": 4259, "def": "a glass that has a stem and in which wine is served", "synonyms": ["wineglass"], "image_count": 941, "id": 1190, "frequency": "f", "synset": "wineglass.n.01"}, {"name": "blinder_(for_horses)", "instance_count": 271, "def": "blinds that prevent a horse from seeing something on either side", "synonyms": ["blinder_(for_horses)"], "image_count": 113, "id": 1191, "frequency": "f", "synset": "winker.n.02"}, {"name": "wok", "instance_count": 60, "def": "pan with a convex 
bottom; used for frying in Chinese cooking", "synonyms": ["wok"], "image_count": 26, "id": 1192, "frequency": "c", "synset": "wok.n.01"}, {"name": "wolf", "instance_count": 16, "def": "a wild carnivorous mammal of the dog family, living and hunting in packs", "synonyms": ["wolf"], "image_count": 5, "id": 1193, "frequency": "r", "synset": "wolf.n.01"}, {"name": "wooden_spoon", "instance_count": 123, "def": "a spoon made of wood", "synonyms": ["wooden_spoon"], "image_count": 56, "id": 1194, "frequency": "c", "synset": "wooden_spoon.n.02"}, {"name": "wreath", "instance_count": 119, "def": "an arrangement of flowers, leaves, or stems fastened in a ring", "synonyms": ["wreath"], "image_count": 73, "id": 1195, "frequency": "c", "synset": "wreath.n.01"}, {"name": "wrench", "instance_count": 80, "def": "a hand tool that is used to hold or twist a nut or bolt", "synonyms": ["wrench", "spanner"], "image_count": 32, "id": 1196, "frequency": "c", "synset": "wrench.n.03"}, {"name": "wristband", "instance_count": 268, "def": "band consisting of a part of a sleeve that covers the wrist", "synonyms": ["wristband"], "image_count": 128, "id": 1197, "frequency": "f", "synset": "wristband.n.01"}, {"name": "wristlet", "instance_count": 1330, "def": "a band or bracelet worn around the wrist", "synonyms": ["wristlet", "wrist_band"], "image_count": 623, "id": 1198, "frequency": "f", "synset": "wristlet.n.01"}, {"name": "yacht", "instance_count": 50, "def": "an expensive vessel propelled by sail or power and used for cruising or racing", "synonyms": ["yacht"], "image_count": 12, "id": 1199, "frequency": "c", "synset": "yacht.n.01"}, {"name": "yogurt", "instance_count": 116, "def": "a custard-like food made from curdled milk", "synonyms": ["yogurt", "yoghurt", "yoghourt"], "image_count": 52, "id": 1200, "frequency": "c", "synset": "yogurt.n.01"}, {"name": "yoke_(animal_equipment)", "instance_count": 20, "def": "gear joining two animals at the neck; NOT egg yolk", "synonyms": ["yoke_(animal_equipment)"], "image_count": 11, "id": 1201, "frequency": "c", "synset": "yoke.n.07"}, {"name": "zebra", "instance_count": 5443, "def": "any of several fleet black-and-white striped African equines", "synonyms": ["zebra"], "image_count": 1674, "id": 1202, "frequency": "f", "synset": "zebra.n.01"}, {"name": "zucchini", "instance_count": 798, "def": "small cucumber-shaped vegetable marrow; typically dark green", "synonyms": ["zucchini", "courgette"], "image_count": 81, "id": 1203, "frequency": "c", "synset": "zucchini.n.02"}] \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py deleted file mode 100755 index 8b4a58d8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py +++ /dev/null @@ -1,26 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import os -from pathlib import Path -import tqdm -from PIL import Image - - -def convert(input, output): - img = np.asarray(Image.open(input)) - assert img.dtype == np.uint8 - img = img - 1 # 0 (ignore) becomes 255. 
others are shifted by 1 - Image.fromarray(img).save(output) - - -if __name__ == "__main__": - dataset_dir = Path(os.getenv("DETECTRON2_DATASETS", "datasets")) / "ADEChallengeData2016" - for name in ["training", "validation"]: - annotation_dir = dataset_dir / "annotations" / name - output_dir = dataset_dir / "annotations_detectron2" / name - output_dir.mkdir(parents=True, exist_ok=True) - for file in tqdm.tqdm(list(annotation_dir.iterdir())): - output_file = output_dir / file.name - convert(file, output_file) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py deleted file mode 100755 index 245c8848..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py +++ /dev/null @@ -1,176 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import copy -import json -import os -from collections import defaultdict - -# This mapping is extracted from the official LVIS mapping: -# https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json -COCO_SYNSET_CATEGORIES = [ - {"synset": "person.n.01", "coco_cat_id": 1}, - {"synset": "bicycle.n.01", "coco_cat_id": 2}, - {"synset": "car.n.01", "coco_cat_id": 3}, - {"synset": "motorcycle.n.01", "coco_cat_id": 4}, - {"synset": "airplane.n.01", "coco_cat_id": 5}, - {"synset": "bus.n.01", "coco_cat_id": 6}, - {"synset": "train.n.01", "coco_cat_id": 7}, - {"synset": "truck.n.01", "coco_cat_id": 8}, - {"synset": "boat.n.01", "coco_cat_id": 9}, - {"synset": "traffic_light.n.01", "coco_cat_id": 10}, - {"synset": "fireplug.n.01", "coco_cat_id": 11}, - {"synset": "stop_sign.n.01", "coco_cat_id": 13}, - {"synset": "parking_meter.n.01", "coco_cat_id": 14}, - {"synset": "bench.n.01", "coco_cat_id": 15}, - {"synset": "bird.n.01", "coco_cat_id": 16}, - {"synset": "cat.n.01", "coco_cat_id": 17}, - {"synset": "dog.n.01", "coco_cat_id": 18}, - {"synset": "horse.n.01", "coco_cat_id": 19}, - {"synset": "sheep.n.01", "coco_cat_id": 20}, - {"synset": "beef.n.01", "coco_cat_id": 21}, - {"synset": "elephant.n.01", "coco_cat_id": 22}, - {"synset": "bear.n.01", "coco_cat_id": 23}, - {"synset": "zebra.n.01", "coco_cat_id": 24}, - {"synset": "giraffe.n.01", "coco_cat_id": 25}, - {"synset": "backpack.n.01", "coco_cat_id": 27}, - {"synset": "umbrella.n.01", "coco_cat_id": 28}, - {"synset": "bag.n.04", "coco_cat_id": 31}, - {"synset": "necktie.n.01", "coco_cat_id": 32}, - {"synset": "bag.n.06", "coco_cat_id": 33}, - {"synset": "frisbee.n.01", "coco_cat_id": 34}, - {"synset": "ski.n.01", "coco_cat_id": 35}, - {"synset": "snowboard.n.01", "coco_cat_id": 36}, - {"synset": "ball.n.06", "coco_cat_id": 37}, - {"synset": "kite.n.03", "coco_cat_id": 38}, - {"synset": "baseball_bat.n.01", "coco_cat_id": 39}, - {"synset": "baseball_glove.n.01", "coco_cat_id": 40}, - {"synset": "skateboard.n.01", "coco_cat_id": 41}, - {"synset": "surfboard.n.01", "coco_cat_id": 42}, - {"synset": "tennis_racket.n.01", "coco_cat_id": 43}, - {"synset": "bottle.n.01", "coco_cat_id": 44}, - {"synset": "wineglass.n.01", "coco_cat_id": 46}, - {"synset": "cup.n.01", "coco_cat_id": 47}, - {"synset": "fork.n.01", "coco_cat_id": 48}, - {"synset": "knife.n.01", "coco_cat_id": 49}, - {"synset": "spoon.n.01", "coco_cat_id": 50}, - {"synset": "bowl.n.03", "coco_cat_id": 51}, - {"synset": "banana.n.02", "coco_cat_id": 52}, - {"synset": "apple.n.01", 
"coco_cat_id": 53}, - {"synset": "sandwich.n.01", "coco_cat_id": 54}, - {"synset": "orange.n.01", "coco_cat_id": 55}, - {"synset": "broccoli.n.01", "coco_cat_id": 56}, - {"synset": "carrot.n.01", "coco_cat_id": 57}, - {"synset": "frank.n.02", "coco_cat_id": 58}, - {"synset": "pizza.n.01", "coco_cat_id": 59}, - {"synset": "doughnut.n.02", "coco_cat_id": 60}, - {"synset": "cake.n.03", "coco_cat_id": 61}, - {"synset": "chair.n.01", "coco_cat_id": 62}, - {"synset": "sofa.n.01", "coco_cat_id": 63}, - {"synset": "pot.n.04", "coco_cat_id": 64}, - {"synset": "bed.n.01", "coco_cat_id": 65}, - {"synset": "dining_table.n.01", "coco_cat_id": 67}, - {"synset": "toilet.n.02", "coco_cat_id": 70}, - {"synset": "television_receiver.n.01", "coco_cat_id": 72}, - {"synset": "laptop.n.01", "coco_cat_id": 73}, - {"synset": "mouse.n.04", "coco_cat_id": 74}, - {"synset": "remote_control.n.01", "coco_cat_id": 75}, - {"synset": "computer_keyboard.n.01", "coco_cat_id": 76}, - {"synset": "cellular_telephone.n.01", "coco_cat_id": 77}, - {"synset": "microwave.n.02", "coco_cat_id": 78}, - {"synset": "oven.n.01", "coco_cat_id": 79}, - {"synset": "toaster.n.02", "coco_cat_id": 80}, - {"synset": "sink.n.01", "coco_cat_id": 81}, - {"synset": "electric_refrigerator.n.01", "coco_cat_id": 82}, - {"synset": "book.n.01", "coco_cat_id": 84}, - {"synset": "clock.n.01", "coco_cat_id": 85}, - {"synset": "vase.n.01", "coco_cat_id": 86}, - {"synset": "scissors.n.01", "coco_cat_id": 87}, - {"synset": "teddy.n.01", "coco_cat_id": 88}, - {"synset": "hand_blower.n.01", "coco_cat_id": 89}, - {"synset": "toothbrush.n.01", "coco_cat_id": 90}, -] - - -def cocofy_lvis(input_filename, output_filename): - """ - Filter LVIS instance segmentation annotations to remove all categories that are not included in - COCO. The new json files can be used to evaluate COCO AP using `lvis-api`. The category ids in - the output json are the incontiguous COCO dataset ids. - - Args: - input_filename (str): path to the LVIS json file. - output_filename (str): path to the COCOfied json file. 
- """ - - with open(input_filename, "r") as f: - lvis_json = json.load(f) - - lvis_annos = lvis_json.pop("annotations") - cocofied_lvis = copy.deepcopy(lvis_json) - lvis_json["annotations"] = lvis_annos - - # Mapping from lvis cat id to coco cat id via synset - lvis_cat_id_to_synset = {cat["id"]: cat["synset"] for cat in lvis_json["categories"]} - synset_to_coco_cat_id = {x["synset"]: x["coco_cat_id"] for x in COCO_SYNSET_CATEGORIES} - # Synsets that we will keep in the dataset - synsets_to_keep = set(synset_to_coco_cat_id.keys()) - coco_cat_id_with_instances = defaultdict(int) - - new_annos = [] - ann_id = 1 - for ann in lvis_annos: - lvis_cat_id = ann["category_id"] - synset = lvis_cat_id_to_synset[lvis_cat_id] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - new_ann = copy.deepcopy(ann) - new_ann["category_id"] = coco_cat_id - new_ann["id"] = ann_id - ann_id += 1 - new_annos.append(new_ann) - coco_cat_id_with_instances[coco_cat_id] += 1 - cocofied_lvis["annotations"] = new_annos - - for image in cocofied_lvis["images"]: - for key in ["not_exhaustive_category_ids", "neg_category_ids"]: - new_category_list = [] - for lvis_cat_id in image[key]: - synset = lvis_cat_id_to_synset[lvis_cat_id] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - new_category_list.append(coco_cat_id) - coco_cat_id_with_instances[coco_cat_id] += 1 - image[key] = new_category_list - - coco_cat_id_with_instances = set(coco_cat_id_with_instances.keys()) - - new_categories = [] - for cat in lvis_json["categories"]: - synset = cat["synset"] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - if coco_cat_id not in coco_cat_id_with_instances: - continue - new_cat = copy.deepcopy(cat) - new_cat["id"] = coco_cat_id - new_categories.append(new_cat) - cocofied_lvis["categories"] = new_categories - - with open(output_filename, "w") as f: - json.dump(cocofied_lvis, f) - print("{} is COCOfied and stored in {}.".format(input_filename, output_filename)) - - -if __name__ == "__main__": - dataset_dir = os.path.join(os.getenv("DETECTRON2_DATASETS", "datasets"), "lvis") - for s in ["lvis_v0.5_train", "lvis_v0.5_val"]: - print("Start COCOfing {}.".format(s)) - cocofy_lvis( - os.path.join(dataset_dir, "{}.json".format(s)), - os.path.join(dataset_dir, "{}_cocofied.json".format(s)), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_for_tests.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_for_tests.sh deleted file mode 100755 index 67e875a4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_for_tests.sh +++ /dev/null @@ -1,31 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -# Download the mini dataset (coco val2017_100, with only 100 images) -# to be used in unittests & integration tests. - -cd "${0%/*}" - -BASE=https://dl.fbaipublicfiles.com/detectron2 -ROOT=${DETECTRON2_DATASETS:-./} -ROOT=${ROOT/#\~/$HOME} # expand ~ to HOME -mkdir -p $ROOT/coco/annotations - -for anno in instances_val2017_100 \ - person_keypoints_val2017_100 ; do - - dest=$ROOT/coco/annotations/$anno.json - [[ -s $dest ]] && { - echo "$dest exists. Skipping ..." - } || { - wget $BASE/annotations/coco/$anno.json -O $dest - } -done - -dest=$ROOT/coco/val2017_100.tgz -[[ -d $ROOT/coco/val2017 ]] && { - echo "$ROOT/coco/val2017 exists. Skipping ..." 
-} || { - wget $BASE/annotations/coco/val2017_100.tgz -O $dest - tar xzf $dest -C $ROOT/coco/ && rm -f $dest -} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py deleted file mode 100755 index 597d791a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py +++ /dev/null @@ -1,116 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import json -import multiprocessing as mp -import numpy as np -import os -import time -from fvcore.common.download import download -from panopticapi.utils import rgb2id -from PIL import Image - -from detectron2.data.datasets.builtin_meta import COCO_CATEGORIES - - -def _process_panoptic_to_semantic(input_panoptic, output_semantic, segments, id_map): - panoptic = np.asarray(Image.open(input_panoptic), dtype=np.uint32) - panoptic = rgb2id(panoptic) - output = np.zeros_like(panoptic, dtype=np.uint8) + 255 - for seg in segments: - cat_id = seg["category_id"] - new_cat_id = id_map[cat_id] - output[panoptic == seg["id"]] = new_cat_id - Image.fromarray(output).save(output_semantic) - - -def separate_coco_semantic_from_panoptic(panoptic_json, panoptic_root, sem_seg_root, categories): - """ - Create semantic segmentation annotations from panoptic segmentation - annotations, to be used by PanopticFPN. - - It maps all thing categories to class 0, and maps all unlabeled pixels to class 255. - It maps all stuff categories to contiguous ids starting from 1. - - Args: - panoptic_json (str): path to the panoptic json file, in COCO's format. - panoptic_root (str): a directory with panoptic annotation files, in COCO's format. - sem_seg_root (str): a directory to output semantic annotation files - categories (list[dict]): category metadata. Each dict needs to have: - "id": corresponds to the "category_id" in the json annotations - "isthing": 0 or 1 - """ - os.makedirs(sem_seg_root, exist_ok=True) - - stuff_ids = [k["id"] for k in categories if k["isthing"] == 0] - thing_ids = [k["id"] for k in categories if k["isthing"] == 1] - id_map = {} # map from category id to id in the output semantic annotation - assert len(stuff_ids) <= 254 - for i, stuff_id in enumerate(stuff_ids): - id_map[stuff_id] = i + 1 - for thing_id in thing_ids: - id_map[thing_id] = 0 - id_map[0] = 255 - - with open(panoptic_json) as f: - obj = json.load(f) - - pool = mp.Pool(processes=max(mp.cpu_count() // 2, 4)) - - def iter_annotations(): - for anno in obj["annotations"]: - file_name = anno["file_name"] - segments = anno["segments_info"] - input = os.path.join(panoptic_root, file_name) - output = os.path.join(sem_seg_root, file_name) - yield input, output, segments - - print("Start writing to {} ...".format(sem_seg_root)) - start = time.time() - pool.starmap( - functools.partial(_process_panoptic_to_semantic, id_map=id_map), - iter_annotations(), - chunksize=100, - ) - print("Finished. 
time: {:.2f}s".format(time.time() - start)) - - -if __name__ == "__main__": - dataset_dir = os.path.join(os.getenv("DETECTRON2_DATASETS", "datasets"), "coco") - for s in ["val2017", "train2017"]: - separate_coco_semantic_from_panoptic( - os.path.join(dataset_dir, "annotations/panoptic_{}.json".format(s)), - os.path.join(dataset_dir, "panoptic_{}".format(s)), - os.path.join(dataset_dir, "panoptic_stuff_{}".format(s)), - COCO_CATEGORIES, - ) - - # Prepare val2017_100 for quick testing: - - dest_dir = os.path.join(dataset_dir, "annotations/") - URL_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - download(URL_PREFIX + "annotations/coco/panoptic_val2017_100.json", dest_dir) - with open(os.path.join(dest_dir, "panoptic_val2017_100.json")) as f: - obj = json.load(f) - - def link_val100(dir_full, dir_100): - print("Creating " + dir_100 + " ...") - os.makedirs(dir_100, exist_ok=True) - for img in obj["images"]: - basename = os.path.splitext(img["file_name"])[0] - src = os.path.join(dir_full, basename + ".png") - dst = os.path.join(dir_100, basename + ".png") - src = os.path.relpath(src, start=dir_100) - os.symlink(src, dst) - - link_val100( - os.path.join(dataset_dir, "panoptic_val2017"), - os.path.join(dataset_dir, "panoptic_val2017_100"), - ) - - link_val100( - os.path.join(dataset_dir, "panoptic_stuff_val2017"), - os.path.join(dataset_dir, "panoptic_stuff_val2017_100"), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/README.md deleted file mode 100755 index 133d8d38..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/README.md +++ /dev/null @@ -1,8 +0,0 @@ - -## Detectron2 Demo - -We provide a command line tool to run a simple demo of builtin configs. -The usage is explained in [GETTING_STARTED.md](../GETTING_STARTED.md). - -See our [blog post](https://ai.facebook.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-) -for a high-quality demo generated with this tool. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/demo.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/demo.py deleted file mode 100755 index 4baa8767..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/demo.py +++ /dev/null @@ -1,188 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import argparse -import glob -import multiprocessing as mp -import numpy as np -import os -import tempfile -import time -import warnings -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -from predictor import VisualizationDemo - -# constants -WINDOW_NAME = "COCO detections" - - -def setup_cfg(args): - # load config from file and command-line arguments - cfg = get_cfg() - # To use demo for Panoptic-DeepLab, please uncomment the following two lines. 
- # from detectron2.projects.panoptic_deeplab import add_panoptic_deeplab_config # noqa - # add_panoptic_deeplab_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - # Set score_threshold for builtin models - cfg.MODEL.RETINANET.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args.confidence_threshold - cfg.freeze() - return cfg - - -def get_parser(): - parser = argparse.ArgumentParser(description="Detectron2 demo for builtin configs") - parser.add_argument( - "--config-file", - default="configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml", - metavar="FILE", - help="path to config file", - ) - parser.add_argument("--webcam", action="store_true", help="Take inputs from webcam.") - parser.add_argument("--video-input", help="Path to video file.") - parser.add_argument( - "--input", - nargs="+", - help="A list of space separated input images; " - "or a single glob pattern such as 'directory/*.jpg'", - ) - parser.add_argument( - "--output", - help="A file or directory to save output visualizations. " - "If not given, will show output in an OpenCV window.", - ) - - parser.add_argument( - "--confidence-threshold", - type=float, - default=0.5, - help="Minimum score for instance predictions to be shown", - ) - parser.add_argument( - "--opts", - help="Modify config options using the command-line 'KEY VALUE' pairs", - default=[], - nargs=argparse.REMAINDER, - ) - return parser - - -def test_opencv_video_format(codec, file_ext): - with tempfile.TemporaryDirectory(prefix="video_format_test") as dir: - filename = os.path.join(dir, "test_file" + file_ext) - writer = cv2.VideoWriter( - filename=filename, - fourcc=cv2.VideoWriter_fourcc(*codec), - fps=float(30), - frameSize=(10, 10), - isColor=True, - ) - [writer.write(np.zeros((10, 10, 3), np.uint8)) for _ in range(30)] - writer.release() - if os.path.isfile(filename): - return True - return False - - -if __name__ == "__main__": - mp.set_start_method("spawn", force=True) - args = get_parser().parse_args() - setup_logger(name="fvcore") - logger = setup_logger() - logger.info("Arguments: " + str(args)) - - cfg = setup_cfg(args) - - demo = VisualizationDemo(cfg) - - if args.input: - if len(args.input) == 1: - args.input = glob.glob(os.path.expanduser(args.input[0])) - assert args.input, "The input path(s) was not found" - for path in tqdm.tqdm(args.input, disable=not args.output): - # use PIL, to be consistent with evaluation - img = read_image(path, format="BGR") - start_time = time.time() - predictions, visualized_output = demo.run_on_image(img) - logger.info( - "{}: {} in {:.2f}s".format( - path, - "detected {} instances".format(len(predictions["instances"])) - if "instances" in predictions - else "finished", - time.time() - start_time, - ) - ) - - if args.output: - if os.path.isdir(args.output): - assert os.path.isdir(args.output), args.output - out_filename = os.path.join(args.output, os.path.basename(path)) - else: - assert len(args.input) == 1, "Please specify a directory with args.output" - out_filename = args.output - visualized_output.save(out_filename) - else: - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, visualized_output.get_image()[:, :, ::-1]) - if cv2.waitKey(0) == 27: - break # esc to quit - elif args.webcam: - assert args.input is None, "Cannot have both --input and --webcam!" 
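For reference, the deleted demo can also be driven programmatically rather than through the CLI flags above; a minimal sketch (assuming detectron2 is installed and the deleted `demo/predictor.py` shown below is importable; the config path and image names are placeholders):

```
# Minimal sketch (paths are placeholders): programmatic use of the demo.
from detectron2.config import get_cfg
from detectron2.data.detection_utils import read_image
from predictor import VisualizationDemo  # the deleted demo/predictor.py

cfg = get_cfg()
cfg.merge_from_file("configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # same default threshold as the CLI flag
cfg.freeze()

demo = VisualizationDemo(cfg)
img = read_image("input.jpg", format="BGR")  # PIL-based read, consistent with evaluation
predictions, vis_output = demo.run_on_image(img)
vis_output.save("output.jpg")
```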
- assert args.output is None, "output not yet supported with --webcam!" - cam = cv2.VideoCapture(0) - for vis in tqdm.tqdm(demo.run_on_video(cam)): - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, vis) - if cv2.waitKey(1) == 27: - break # esc to quit - cam.release() - cv2.destroyAllWindows() - elif args.video_input: - video = cv2.VideoCapture(args.video_input) - width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH)) - height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT)) - frames_per_second = video.get(cv2.CAP_PROP_FPS) - num_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT)) - basename = os.path.basename(args.video_input) - codec, file_ext = ( - ("x264", ".mkv") if test_opencv_video_format("x264", ".mkv") else ("mp4v", ".mp4") - ) - if codec == ".mp4v": - warnings.warn("x264 codec not available, switching to mp4v") - if args.output: - if os.path.isdir(args.output): - output_fname = os.path.join(args.output, basename) - output_fname = os.path.splitext(output_fname)[0] + file_ext - else: - output_fname = args.output - assert not os.path.isfile(output_fname), output_fname - output_file = cv2.VideoWriter( - filename=output_fname, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*codec), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - assert os.path.isfile(args.video_input) - for vis_frame in tqdm.tqdm(demo.run_on_video(video), total=num_frames): - if args.output: - output_file.write(vis_frame) - else: - cv2.namedWindow(basename, cv2.WINDOW_NORMAL) - cv2.imshow(basename, vis_frame) - if cv2.waitKey(1) == 27: - break # esc to quit - video.release() - if args.output: - output_file.release() - else: - cv2.destroyAllWindows() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/predictor.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/predictor.py deleted file mode 100755 index 7b7ebd3f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/demo/predictor.py +++ /dev/null @@ -1,220 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import atexit -import bisect -import multiprocessing as mp -from collections import deque -import cv2 -import torch - -from detectron2.data import MetadataCatalog -from detectron2.engine.defaults import DefaultPredictor -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer - - -class VisualizationDemo(object): - def __init__(self, cfg, instance_mode=ColorMode.IMAGE, parallel=False): - """ - Args: - cfg (CfgNode): - instance_mode (ColorMode): - parallel (bool): whether to run the model in different processes from visualization. - Useful since the visualization logic can be slow. - """ - self.metadata = MetadataCatalog.get( - cfg.DATASETS.TEST[0] if len(cfg.DATASETS.TEST) else "__unused" - ) - self.cpu_device = torch.device("cpu") - self.instance_mode = instance_mode - - self.parallel = parallel - if parallel: - num_gpu = torch.cuda.device_count() - self.predictor = AsyncPredictor(cfg, num_gpus=num_gpu) - else: - self.predictor = DefaultPredictor(cfg) - - def run_on_image(self, image): - """ - Args: - image (np.ndarray): an image of shape (H, W, C) (in BGR order). - This is the format used by OpenCV. - - Returns: - predictions (dict): the output of the model. - vis_output (VisImage): the visualized image output. 
- """ - vis_output = None - predictions = self.predictor(image) - # Convert image from OpenCV BGR format to Matplotlib RGB format. - image = image[:, :, ::-1] - visualizer = Visualizer(image, self.metadata, instance_mode=self.instance_mode) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_output = visualizer.draw_panoptic_seg_predictions( - panoptic_seg.to(self.cpu_device), segments_info - ) - else: - if "sem_seg" in predictions: - vis_output = visualizer.draw_sem_seg( - predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - if "instances" in predictions: - instances = predictions["instances"].to(self.cpu_device) - vis_output = visualizer.draw_instance_predictions(predictions=instances) - - return predictions, vis_output - - def _frame_from_video(self, video): - while video.isOpened(): - success, frame = video.read() - if success: - yield frame - else: - break - - def run_on_video(self, video): - """ - Visualizes predictions on frames of the input video. - - Args: - video (cv2.VideoCapture): a :class:`VideoCapture` object, whose source can be - either a webcam or a video file. - - Yields: - ndarray: BGR visualizations of each video frame. - """ - video_visualizer = VideoVisualizer(self.metadata, self.instance_mode) - - def process_predictions(frame, predictions): - frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_frame = video_visualizer.draw_panoptic_seg_predictions( - frame, panoptic_seg.to(self.cpu_device), segments_info - ) - elif "instances" in predictions: - predictions = predictions["instances"].to(self.cpu_device) - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - elif "sem_seg" in predictions: - vis_frame = video_visualizer.draw_sem_seg( - frame, predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - - # Converts Matplotlib RGB format to OpenCV BGR format - vis_frame = cv2.cvtColor(vis_frame.get_image(), cv2.COLOR_RGB2BGR) - return vis_frame - - frame_gen = self._frame_from_video(video) - if self.parallel: - buffer_size = self.predictor.default_buffer_size - - frame_data = deque() - - for cnt, frame in enumerate(frame_gen): - frame_data.append(frame) - self.predictor.put(frame) - - if cnt >= buffer_size: - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - - while len(frame_data): - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - else: - for frame in frame_gen: - yield process_predictions(frame, self.predictor(frame)) - - -class AsyncPredictor: - """ - A predictor that runs the model asynchronously, possibly on >1 GPUs. - Because rendering the visualization takes considerably amount of time, - this helps improve throughput a little bit when rendering videos. 
- """ - - class _StopToken: - pass - - class _PredictWorker(mp.Process): - def __init__(self, cfg, task_queue, result_queue): - self.cfg = cfg - self.task_queue = task_queue - self.result_queue = result_queue - super().__init__() - - def run(self): - predictor = DefaultPredictor(self.cfg) - - while True: - task = self.task_queue.get() - if isinstance(task, AsyncPredictor._StopToken): - break - idx, data = task - result = predictor(data) - self.result_queue.put((idx, result)) - - def __init__(self, cfg, num_gpus: int = 1): - """ - Args: - cfg (CfgNode): - num_gpus (int): if 0, will run on CPU - """ - num_workers = max(num_gpus, 1) - self.task_queue = mp.Queue(maxsize=num_workers * 3) - self.result_queue = mp.Queue(maxsize=num_workers * 3) - self.procs = [] - for gpuid in range(max(num_gpus, 1)): - cfg = cfg.clone() - cfg.defrost() - cfg.MODEL.DEVICE = "cuda:{}".format(gpuid) if num_gpus > 0 else "cpu" - self.procs.append( - AsyncPredictor._PredictWorker(cfg, self.task_queue, self.result_queue) - ) - - self.put_idx = 0 - self.get_idx = 0 - self.result_rank = [] - self.result_data = [] - - for p in self.procs: - p.start() - atexit.register(self.shutdown) - - def put(self, image): - self.put_idx += 1 - self.task_queue.put((self.put_idx, image)) - - def get(self): - self.get_idx += 1 # the index needed for this request - if len(self.result_rank) and self.result_rank[0] == self.get_idx: - res = self.result_data[0] - del self.result_data[0], self.result_rank[0] - return res - - while True: - # make sure the results are returned in the correct order - idx, res = self.result_queue.get() - if idx == self.get_idx: - return res - insert = bisect.bisect(self.result_rank, idx) - self.result_rank.insert(insert, idx) - self.result_data.insert(insert, res) - - def __len__(self): - return self.put_idx - self.get_idx - - def __call__(self, image): - self.put(image) - return self.get() - - def shutdown(self): - for _ in self.procs: - self.task_queue.put(AsyncPredictor._StopToken()) - - @property - def default_buffer_size(self): - return len(self.procs) * 5 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/__init__.py deleted file mode 100755 index bdd994b4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from .utils.env import setup_environment - -setup_environment() - - -# This line will be programatically read/write by setup.py. -# Leave them at the bottom of this file and don't touch them. -__version__ = "0.6" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/__init__.py deleted file mode 100755 index 99da0469..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -# File: - - -from . 
import catalog as _UNUSED # register the handler -from .detection_checkpoint import DetectionCheckpointer -from fvcore.common.checkpoint import Checkpointer, PeriodicCheckpointer - -__all__ = ["Checkpointer", "PeriodicCheckpointer", "DetectionCheckpointer"] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py deleted file mode 100755 index 8c8d181b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py +++ /dev/null @@ -1,407 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import re -from typing import Dict, List -import torch -from tabulate import tabulate - - -def convert_basic_c2_names(original_keys): - """ - Apply some basic name conversion to names in C2 weights. - It only deals with typical backbone models. - - Args: - original_keys (list[str]): - Returns: - list[str]: The same number of strings matching those in original_keys. - """ - layer_keys = copy.deepcopy(original_keys) - layer_keys = [ - {"pred_b": "linear_b", "pred_w": "linear_w"}.get(k, k) for k in layer_keys - ] # some hard-coded mappings - - layer_keys = [k.replace("_", ".") for k in layer_keys] - layer_keys = [re.sub("\\.b$", ".bias", k) for k in layer_keys] - layer_keys = [re.sub("\\.w$", ".weight", k) for k in layer_keys] - # Uniform both bn and gn names to "norm" - layer_keys = [re.sub("bn\\.s$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.bias$", "norm.bias", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.rm", "norm.running_mean", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.running.mean$", "norm.running_mean", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.riv$", "norm.running_var", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.running.var$", "norm.running_var", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.gamma$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.beta$", "norm.bias", k) for k in layer_keys] - layer_keys = [re.sub("gn\\.s$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("gn\\.bias$", "norm.bias", k) for k in layer_keys] - - # stem - layer_keys = [re.sub("^res\\.conv1\\.norm\\.", "conv1.norm.", k) for k in layer_keys] - # to avoid mis-matching with "conv1" in other components (e.g. 
detection head) - layer_keys = [re.sub("^conv1\\.", "stem.conv1.", k) for k in layer_keys] - - # layer1-4 is used by torchvision, however we follow the C2 naming strategy (res2-5) - # layer_keys = [re.sub("^res2.", "layer1.", k) for k in layer_keys] - # layer_keys = [re.sub("^res3.", "layer2.", k) for k in layer_keys] - # layer_keys = [re.sub("^res4.", "layer3.", k) for k in layer_keys] - # layer_keys = [re.sub("^res5.", "layer4.", k) for k in layer_keys] - - # blocks - layer_keys = [k.replace(".branch1.", ".shortcut.") for k in layer_keys] - layer_keys = [k.replace(".branch2a.", ".conv1.") for k in layer_keys] - layer_keys = [k.replace(".branch2b.", ".conv2.") for k in layer_keys] - layer_keys = [k.replace(".branch2c.", ".conv3.") for k in layer_keys] - - # DensePose substitutions - layer_keys = [re.sub("^body.conv.fcn", "body_conv_fcn", k) for k in layer_keys] - layer_keys = [k.replace("AnnIndex.lowres", "ann_index_lowres") for k in layer_keys] - layer_keys = [k.replace("Index.UV.lowres", "index_uv_lowres") for k in layer_keys] - layer_keys = [k.replace("U.lowres", "u_lowres") for k in layer_keys] - layer_keys = [k.replace("V.lowres", "v_lowres") for k in layer_keys] - return layer_keys - - -def convert_c2_detectron_names(weights): - """ - Map Caffe2 Detectron weight names to Detectron2 names. - - Args: - weights (dict): name -> tensor - - Returns: - dict: detectron2 names -> tensor - dict: detectron2 names -> C2 names - """ - logger = logging.getLogger(__name__) - logger.info("Renaming Caffe2 weights ......") - original_keys = sorted(weights.keys()) - layer_keys = copy.deepcopy(original_keys) - - layer_keys = convert_basic_c2_names(layer_keys) - - # -------------------------------------------------------------------------- - # RPN hidden representation conv - # -------------------------------------------------------------------------- - # FPN case - # In the C2 model, the RPN hidden layer conv is defined for FPN level 2 and then - # shared for all other levels, hence the appearance of "fpn2" - layer_keys = [ - k.replace("conv.rpn.fpn2", "proposal_generator.rpn_head.conv") for k in layer_keys - ] - # Non-FPN case - layer_keys = [k.replace("conv.rpn", "proposal_generator.rpn_head.conv") for k in layer_keys] - - # -------------------------------------------------------------------------- - # RPN box transformation conv - # -------------------------------------------------------------------------- - # FPN case (see note above about "fpn2") - layer_keys = [ - k.replace("rpn.bbox.pred.fpn2", "proposal_generator.rpn_head.anchor_deltas") - for k in layer_keys - ] - layer_keys = [ - k.replace("rpn.cls.logits.fpn2", "proposal_generator.rpn_head.objectness_logits") - for k in layer_keys - ] - # Non-FPN case - layer_keys = [ - k.replace("rpn.bbox.pred", "proposal_generator.rpn_head.anchor_deltas") for k in layer_keys - ] - layer_keys = [ - k.replace("rpn.cls.logits", "proposal_generator.rpn_head.objectness_logits") - for k in layer_keys - ] - - # -------------------------------------------------------------------------- - # Fast R-CNN box head - # -------------------------------------------------------------------------- - layer_keys = [re.sub("^bbox\\.pred", "bbox_pred", k) for k in layer_keys] - layer_keys = [re.sub("^cls\\.score", "cls_score", k) for k in layer_keys] - layer_keys = [re.sub("^fc6\\.", "box_head.fc1.", k) for k in layer_keys] - layer_keys = [re.sub("^fc7\\.", "box_head.fc2.", k) for k in layer_keys] - # 4conv1fc head tensor names: head_conv1_w, head_conv1_gn_s - layer_keys = 
[re.sub("^head\\.conv", "box_head.conv", k) for k in layer_keys] - - # -------------------------------------------------------------------------- - # FPN lateral and output convolutions - # -------------------------------------------------------------------------- - def fpn_map(name): - """ - Look for keys with the following patterns: - 1) Starts with "fpn.inner." - Example: "fpn.inner.res2.2.sum.lateral.weight" - Meaning: These are lateral pathway convolutions - 2) Starts with "fpn.res" - Example: "fpn.res2.2.sum.weight" - Meaning: These are FPN output convolutions - """ - splits = name.split(".") - norm = ".norm" if "norm" in splits else "" - if name.startswith("fpn.inner."): - # splits example: ['fpn', 'inner', 'res2', '2', 'sum', 'lateral', 'weight'] - stage = int(splits[2][len("res") :]) - return "fpn_lateral{}{}.{}".format(stage, norm, splits[-1]) - elif name.startswith("fpn.res"): - # splits example: ['fpn', 'res2', '2', 'sum', 'weight'] - stage = int(splits[1][len("res") :]) - return "fpn_output{}{}.{}".format(stage, norm, splits[-1]) - return name - - layer_keys = [fpn_map(k) for k in layer_keys] - - # -------------------------------------------------------------------------- - # Mask R-CNN mask head - # -------------------------------------------------------------------------- - # roi_heads.StandardROIHeads case - layer_keys = [k.replace(".[mask].fcn", "mask_head.mask_fcn") for k in layer_keys] - layer_keys = [re.sub("^\\.mask\\.fcn", "mask_head.mask_fcn", k) for k in layer_keys] - layer_keys = [k.replace("mask.fcn.logits", "mask_head.predictor") for k in layer_keys] - # roi_heads.Res5ROIHeads case - layer_keys = [k.replace("conv5.mask", "mask_head.deconv") for k in layer_keys] - - # -------------------------------------------------------------------------- - # Keypoint R-CNN head - # -------------------------------------------------------------------------- - # interestingly, the keypoint head convs have blob names that are simply "conv_fcnX" - layer_keys = [k.replace("conv.fcn", "roi_heads.keypoint_head.conv_fcn") for k in layer_keys] - layer_keys = [ - k.replace("kps.score.lowres", "roi_heads.keypoint_head.score_lowres") for k in layer_keys - ] - layer_keys = [k.replace("kps.score.", "roi_heads.keypoint_head.score.") for k in layer_keys] - - # -------------------------------------------------------------------------- - # Done with replacements - # -------------------------------------------------------------------------- - assert len(set(layer_keys)) == len(layer_keys) - assert len(original_keys) == len(layer_keys) - - new_weights = {} - new_keys_to_original_keys = {} - for orig, renamed in zip(original_keys, layer_keys): - new_keys_to_original_keys[renamed] = orig - if renamed.startswith("bbox_pred.") or renamed.startswith("mask_head.predictor."): - # remove the meaningless prediction weight for background class - new_start_idx = 4 if renamed.startswith("bbox_pred.") else 1 - new_weights[renamed] = weights[orig][new_start_idx:] - logger.info( - "Remove prediction weight for background class in {}. 
The shape changes from " - "{} to {}.".format( - renamed, tuple(weights[orig].shape), tuple(new_weights[renamed].shape) - ) - ) - elif renamed.startswith("cls_score."): - # move weights of bg class from original index 0 to last index - logger.info( - "Move classification weights for background class in {} from index 0 to " - "index {}.".format(renamed, weights[orig].shape[0] - 1) - ) - new_weights[renamed] = torch.cat([weights[orig][1:], weights[orig][:1]]) - else: - new_weights[renamed] = weights[orig] - - return new_weights, new_keys_to_original_keys - - -# Note the current matching is not symmetric. -# it assumes model_state_dict will have longer names. -def align_and_update_state_dicts(model_state_dict, ckpt_state_dict, c2_conversion=True): - """ - Match names between the two state-dict, and returns a new chkpt_state_dict with names - converted to match model_state_dict with heuristics. The returned dict can be later - loaded with fvcore checkpointer. - If `c2_conversion==True`, `ckpt_state_dict` is assumed to be a Caffe2 - model and will be renamed at first. - - Strategy: suppose that the models that we will create will have prefixes appended - to each of its keys, for example due to an extra level of nesting that the original - pre-trained weights from ImageNet won't contain. For example, model.state_dict() - might return backbone[0].body.res2.conv1.weight, while the pre-trained model contains - res2.conv1.weight. We thus want to match both parameters together. - For that, we look for each model weight, look among all loaded keys if there is one - that is a suffix of the current weight name, and use it if that's the case. - If multiple matches exist, take the one with longest size - of the corresponding name. For example, for the same model as before, the pretrained - weight file can contain both res2.conv1.weight, as well as conv1.weight. In this case, - we want to match backbone[0].body.conv1.weight to conv1.weight, and - backbone[0].body.res2.conv1.weight to res2.conv1.weight. - """ - model_keys = sorted(model_state_dict.keys()) - if c2_conversion: - ckpt_state_dict, original_keys = convert_c2_detectron_names(ckpt_state_dict) - # original_keys: the name in the original dict (before renaming) - else: - original_keys = {x: x for x in ckpt_state_dict.keys()} - ckpt_keys = sorted(ckpt_state_dict.keys()) - - def match(a, b): - # Matched ckpt_key should be a complete (starts with '.') suffix. - # For example, roi_heads.mesh_head.whatever_conv1 does not match conv1, - # but matches whatever_conv1 or mesh_head.whatever_conv1. - return a == b or a.endswith("." 
+ b) - - # get a matrix of string matches, where each (i, j) entry correspond to the size of the - # ckpt_key string, if it matches - match_matrix = [len(j) if match(i, j) else 0 for i in model_keys for j in ckpt_keys] - match_matrix = torch.as_tensor(match_matrix).view(len(model_keys), len(ckpt_keys)) - # use the matched one with longest size in case of multiple matches - max_match_size, idxs = match_matrix.max(1) - # remove indices that correspond to no-match - idxs[max_match_size == 0] = -1 - - logger = logging.getLogger(__name__) - # matched_pairs (matched checkpoint key --> matched model key) - matched_keys = {} - result_state_dict = {} - for idx_model, idx_ckpt in enumerate(idxs.tolist()): - if idx_ckpt == -1: - continue - key_model = model_keys[idx_model] - key_ckpt = ckpt_keys[idx_ckpt] - value_ckpt = ckpt_state_dict[key_ckpt] - shape_in_model = model_state_dict[key_model].shape - - if shape_in_model != value_ckpt.shape: - logger.warning( - "Shape of {} in checkpoint is {}, while shape of {} in model is {}.".format( - key_ckpt, value_ckpt.shape, key_model, shape_in_model - ) - ) - logger.warning( - "{} will not be loaded. Please double check and see if this is desired.".format( - key_ckpt - ) - ) - continue - - assert key_model not in result_state_dict - result_state_dict[key_model] = value_ckpt - if key_ckpt in matched_keys: # already added to matched_keys - logger.error( - "Ambiguity found for {} in checkpoint!" - "It matches at least two keys in the model ({} and {}).".format( - key_ckpt, key_model, matched_keys[key_ckpt] - ) - ) - raise ValueError("Cannot match one checkpoint key to multiple keys in the model.") - - matched_keys[key_ckpt] = key_model - - # logging: - matched_model_keys = sorted(matched_keys.values()) - if len(matched_model_keys) == 0: - logger.warning("No weights in checkpoint matched with model.") - return ckpt_state_dict - common_prefix = _longest_common_prefix(matched_model_keys) - rev_matched_keys = {v: k for k, v in matched_keys.items()} - original_keys = {k: original_keys[rev_matched_keys[k]] for k in matched_model_keys} - - model_key_groups = _group_keys_by_module(matched_model_keys, original_keys) - table = [] - memo = set() - for key_model in matched_model_keys: - if key_model in memo: - continue - if key_model in model_key_groups: - group = model_key_groups[key_model] - memo |= set(group) - shapes = [tuple(model_state_dict[k].shape) for k in group] - table.append( - ( - _longest_common_prefix([k[len(common_prefix) :] for k in group]) + "*", - _group_str([original_keys[k] for k in group]), - " ".join([str(x).replace(" ", "") for x in shapes]), - ) - ) - else: - key_checkpoint = original_keys[key_model] - shape = str(tuple(model_state_dict[key_model].shape)) - table.append((key_model[len(common_prefix) :], key_checkpoint, shape)) - table_str = tabulate( - table, tablefmt="pipe", headers=["Names in Model", "Names in Checkpoint", "Shapes"] - ) - logger.info( - "Following weights matched with " - + (f"submodule {common_prefix[:-1]}" if common_prefix else "model") - + ":\n" - + table_str - ) - - unmatched_ckpt_keys = [k for k in ckpt_keys if k not in set(matched_keys.keys())] - for k in unmatched_ckpt_keys: - result_state_dict[k] = ckpt_state_dict[k] - return result_state_dict - - -def _group_keys_by_module(keys: List[str], original_names: Dict[str, str]): - """ - Params in the same submodule are grouped together. 
- - Args: - keys: names of all parameters - original_names: mapping from parameter name to their name in the checkpoint - - Returns: - dict[name -> all other names in the same group] - """ - - def _submodule_name(key): - pos = key.rfind(".") - if pos < 0: - return None - prefix = key[: pos + 1] - return prefix - - all_submodules = [_submodule_name(k) for k in keys] - all_submodules = [x for x in all_submodules if x] - all_submodules = sorted(all_submodules, key=len) - - ret = {} - for prefix in all_submodules: - group = [k for k in keys if k.startswith(prefix)] - if len(group) <= 1: - continue - original_name_lcp = _longest_common_prefix_str([original_names[k] for k in group]) - if len(original_name_lcp) == 0: - # don't group weights if original names don't share prefix - continue - - for k in group: - if k in ret: - continue - ret[k] = group - return ret - - -def _longest_common_prefix(names: List[str]) -> str: - """ - ["abc.zfg", "abc.zef"] -> "abc." - """ - names = [n.split(".") for n in names] - m1, m2 = min(names), max(names) - ret = [a for a, b in zip(m1, m2) if a == b] - ret = ".".join(ret) + "." if len(ret) else "" - return ret - - -def _longest_common_prefix_str(names: List[str]) -> str: - m1, m2 = min(names), max(names) - lcp = [a for a, b in zip(m1, m2) if a == b] - lcp = "".join(lcp) - return lcp - - -def _group_str(names: List[str]) -> str: - """ - Turn "common1", "common2", "common3" into "common{1,2,3}" - """ - lcp = _longest_common_prefix_str(names) - rest = [x[len(lcp) :] for x in names] - rest = "{" + ",".join(rest) + "}" - ret = lcp + rest - - # add some simplification for BN specifically - ret = ret.replace("bn_{beta,running_mean,running_var,gamma}", "bn_*") - ret = ret.replace("bn_beta,bn_running_mean,bn_running_var,bn_gamma", "bn_*") - return ret diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/catalog.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/catalog.py deleted file mode 100755 index 9a857367..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/catalog.py +++ /dev/null @@ -1,115 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging - -from detectron2.utils.file_io import PathHandler, PathManager - - -class ModelCatalog(object): - """ - Store mappings from names to third-party models. - """ - - S3_C2_DETECTRON_PREFIX = "https://dl.fbaipublicfiles.com/detectron" - - # MSRA models have STRIDE_IN_1X1=True. False otherwise. - # NOTE: all BN models here have fused BN into an affine layer. - # As a result, you should only load them to a model with "FrozenBN". - # Loading them to a model with regular BN or SyncBN is wrong. - # Even when loaded to FrozenBN, it is still different from affine by an epsilon, - # which should be negligible for training. - # NOTE: all models here uses PIXEL_STD=[1,1,1] - # NOTE: Most of the BN models here are no longer used. We use the - # re-converted pre-trained models under detectron2 model zoo instead. 
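As a usage note: once `ModelCatalogHandler` (registered at the bottom of this file) is in place, a `catalog://` URI resolves through the standard `PathManager`; a minimal sketch, with the entry name taken from the `C2_IMAGENET_MODELS` table below:

```
# Minimal sketch: resolving a legacy catalog:// URI via the handler registered below.
from detectron2.utils.file_io import PathManager

# Downloads (or returns a cached copy of) the Caffe2 ImageNet-pretrained R-50 weights.
local_path = PathManager.get_local_path("catalog://ImageNetPretrained/MSRA/R-50")
```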
- C2_IMAGENET_MODELS = { - "MSRA/R-50": "ImageNetPretrained/MSRA/R-50.pkl", - "MSRA/R-101": "ImageNetPretrained/MSRA/R-101.pkl", - "FAIR/R-50-GN": "ImageNetPretrained/47261647/R-50-GN.pkl", - "FAIR/R-101-GN": "ImageNetPretrained/47592356/R-101-GN.pkl", - "FAIR/X-101-32x8d": "ImageNetPretrained/20171220/X-101-32x8d.pkl", - "FAIR/X-101-64x4d": "ImageNetPretrained/FBResNeXt/X-101-64x4d.pkl", - "FAIR/X-152-32x8d-IN5k": "ImageNetPretrained/25093814/X-152-32x8d-IN5k.pkl", - } - - C2_DETECTRON_PATH_FORMAT = ( - "{prefix}/{url}/output/train/{dataset}/{type}/model_final.pkl" # noqa B950 - ) - - C2_DATASET_COCO = "coco_2014_train%3Acoco_2014_valminusminival" - C2_DATASET_COCO_KEYPOINTS = "keypoints_coco_2014_train%3Akeypoints_coco_2014_valminusminival" - - # format: {model_name} -> part of the url - C2_DETECTRON_MODELS = { - "35857197/e2e_faster_rcnn_R-50-C4_1x": "35857197/12_2017_baselines/e2e_faster_rcnn_R-50-C4_1x.yaml.01_33_49.iAX0mXvW", # noqa B950 - "35857345/e2e_faster_rcnn_R-50-FPN_1x": "35857345/12_2017_baselines/e2e_faster_rcnn_R-50-FPN_1x.yaml.01_36_30.cUF7QR7I", # noqa B950 - "35857890/e2e_faster_rcnn_R-101-FPN_1x": "35857890/12_2017_baselines/e2e_faster_rcnn_R-101-FPN_1x.yaml.01_38_50.sNxI7sX7", # noqa B950 - "36761737/e2e_faster_rcnn_X-101-32x8d-FPN_1x": "36761737/12_2017_baselines/e2e_faster_rcnn_X-101-32x8d-FPN_1x.yaml.06_31_39.5MIHi1fZ", # noqa B950 - "35858791/e2e_mask_rcnn_R-50-C4_1x": "35858791/12_2017_baselines/e2e_mask_rcnn_R-50-C4_1x.yaml.01_45_57.ZgkA7hPB", # noqa B950 - "35858933/e2e_mask_rcnn_R-50-FPN_1x": "35858933/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml.01_48_14.DzEQe4wC", # noqa B950 - "35861795/e2e_mask_rcnn_R-101-FPN_1x": "35861795/12_2017_baselines/e2e_mask_rcnn_R-101-FPN_1x.yaml.02_31_37.KqyEK4tT", # noqa B950 - "36761843/e2e_mask_rcnn_X-101-32x8d-FPN_1x": "36761843/12_2017_baselines/e2e_mask_rcnn_X-101-32x8d-FPN_1x.yaml.06_35_59.RZotkLKI", # noqa B950 - "48616381/e2e_mask_rcnn_R-50-FPN_2x_gn": "GN/48616381/04_2018_gn_baselines/e2e_mask_rcnn_R-50-FPN_2x_gn_0416.13_23_38.bTlTI97Q", # noqa B950 - "37697547/e2e_keypoint_rcnn_R-50-FPN_1x": "37697547/12_2017_baselines/e2e_keypoint_rcnn_R-50-FPN_1x.yaml.08_42_54.kdzV35ao", # noqa B950 - "35998355/rpn_R-50-C4_1x": "35998355/12_2017_baselines/rpn_R-50-C4_1x.yaml.08_00_43.njH5oD9L", # noqa B950 - "35998814/rpn_R-50-FPN_1x": "35998814/12_2017_baselines/rpn_R-50-FPN_1x.yaml.08_06_03.Axg0r179", # noqa B950 - "36225147/fast_R-50-FPN_1x": "36225147/12_2017_baselines/fast_rcnn_R-50-FPN_1x.yaml.08_39_09.L3obSdQ2", # noqa B950 - } - - @staticmethod - def get(name): - if name.startswith("Caffe2Detectron/COCO"): - return ModelCatalog._get_c2_detectron_baseline(name) - if name.startswith("ImageNetPretrained/"): - return ModelCatalog._get_c2_imagenet_pretrained(name) - raise RuntimeError("model not present in the catalog: {}".format(name)) - - @staticmethod - def _get_c2_imagenet_pretrained(name): - prefix = ModelCatalog.S3_C2_DETECTRON_PREFIX - name = name[len("ImageNetPretrained/") :] - name = ModelCatalog.C2_IMAGENET_MODELS[name] - url = "/".join([prefix, name]) - return url - - @staticmethod - def _get_c2_detectron_baseline(name): - name = name[len("Caffe2Detectron/COCO/") :] - url = ModelCatalog.C2_DETECTRON_MODELS[name] - if "keypoint_rcnn" in name: - dataset = ModelCatalog.C2_DATASET_COCO_KEYPOINTS - else: - dataset = ModelCatalog.C2_DATASET_COCO - - if "35998355/rpn_R-50-C4_1x" in name: - # this one model is somehow different from others .. 
- type = "rpn" - else: - type = "generalized_rcnn" - - # Detectron C2 models are stored in the structure defined in `C2_DETECTRON_PATH_FORMAT`. - url = ModelCatalog.C2_DETECTRON_PATH_FORMAT.format( - prefix=ModelCatalog.S3_C2_DETECTRON_PREFIX, url=url, type=type, dataset=dataset - ) - return url - - -class ModelCatalogHandler(PathHandler): - """ - Resolve URL like catalog://. - """ - - PREFIX = "catalog://" - - def _get_supported_prefixes(self): - return [self.PREFIX] - - def _get_local_path(self, path, **kwargs): - logger = logging.getLogger(__name__) - catalog_path = ModelCatalog.get(path[len(self.PREFIX) :]) - logger.info("Catalog entry {} points to {}".format(path, catalog_path)) - return PathManager.get_local_path(catalog_path, **kwargs) - - def _open(self, path, mode="r", **kwargs): - return PathManager.open(self._get_local_path(path), mode, **kwargs) - - -PathManager.register_handler(ModelCatalogHandler()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py deleted file mode 100755 index 82fd3b2d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py +++ /dev/null @@ -1,120 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import os -import pickle -import torch -from fvcore.common.checkpoint import Checkpointer -from torch.nn.parallel import DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.utils.file_io import PathManager - -from .c2_model_loading import align_and_update_state_dicts - - -class DetectionCheckpointer(Checkpointer): - """ - Same as :class:`Checkpointer`, but is able to: - 1. handle models in detectron & detectron2 model zoo, and apply conversions for legacy models. - 2. correctly load checkpoints that are only available on the master worker - """ - - def __init__(self, model, save_dir="", *, save_to_disk=None, **checkpointables): - is_main_process = comm.is_main_process() - super().__init__( - model, - save_dir, - save_to_disk=is_main_process if save_to_disk is None else save_to_disk, - **checkpointables, - ) - self.path_manager = PathManager - - def load(self, path, *args, **kwargs): - need_sync = False - - if path and isinstance(self.model, DistributedDataParallel): - logger = logging.getLogger(__name__) - path = self.path_manager.get_local_path(path) - has_file = os.path.isfile(path) - all_has_file = comm.all_gather(has_file) - if not all_has_file[0]: - raise OSError(f"File {path} not found on main worker.") - if not all(all_has_file): - logger.warning( - f"Not all workers can read checkpoint {path}. " - "Training may fail to fully resume." - ) - # TODO: broadcast the checkpoint file contents from main - # worker, and load from it instead. 
- need_sync = True - if not has_file: - path = None # don't load if not readable - ret = super().load(path, *args, **kwargs) - - if need_sync: - logger.info("Broadcasting model states from main worker ...") - self.model._sync_params_and_buffers() - return ret - - def _load_file(self, filename): - if filename.endswith(".pkl"): - with PathManager.open(filename, "rb") as f: - data = pickle.load(f, encoding="latin1") - if "model" in data and "__author__" in data: - # file is in Detectron2 model zoo format - self.logger.info("Reading a file from '{}'".format(data["__author__"])) - return data - else: - # assume file is from Caffe2 / Detectron1 model zoo - if "blobs" in data: - # Detection models have "blobs", but ImageNet models don't - data = data["blobs"] - data = {k: v for k, v in data.items() if not k.endswith("_momentum")} - return {"model": data, "__author__": "Caffe2", "matching_heuristics": True} - elif filename.endswith(".pyth"): - # assume file is from pycls; no one else seems to use the ".pyth" extension - with PathManager.open(filename, "rb") as f: - data = torch.load(f) - assert ( - "model_state" in data - ), f"Cannot load .pyth file {filename}; pycls checkpoints must contain 'model_state'." - model_state = { - k: v - for k, v in data["model_state"].items() - if not k.endswith("num_batches_tracked") - } - return {"model": model_state, "__author__": "pycls", "matching_heuristics": True} - - loaded = super()._load_file(filename) # load native pth checkpoint - if "model" not in loaded: - loaded = {"model": loaded} - return loaded - - def _load_model(self, checkpoint): - if checkpoint.get("matching_heuristics", False): - self._convert_ndarray_to_tensor(checkpoint["model"]) - # convert weights by name-matching heuristics - checkpoint["model"] = align_and_update_state_dicts( - self.model.state_dict(), - checkpoint["model"], - c2_conversion=checkpoint.get("__author__", None) == "Caffe2", - ) - # for non-caffe2 models, use standard ways to load it - incompatible = super()._load_model(checkpoint) - - model_buffers = dict(self.model.named_buffers(recurse=False)) - for k in ["pixel_mean", "pixel_std"]: - # Ignore missing key message about pixel_mean/std. - # Though they may be missing in old checkpoints, they will be correctly - # initialized from config anyway. - if k in model_buffers: - try: - incompatible.missing_keys.remove(k) - except ValueError: - pass - for k in incompatible.unexpected_keys[:]: - # Ignore unexpected keys about cell anchors. They exist in old checkpoints - # but now they are non-persistent buffers and will not be in new checkpoints. - if "anchor_generator.cell_anchors" in k: - incompatible.unexpected_keys.remove(k) - return incompatible diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/__init__.py deleted file mode 100755 index 4e648e63..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/__init__.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
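A minimal usage sketch for the `DetectionCheckpointer` removed above (assuming a standard detectron2 install; the weight URI reuses the catalog entry shown earlier, and a `.pkl` file takes the Caffe2 conversion and suffix-matching path):

```
# Minimal sketch: loading legacy Caffe2 weights with the heuristics above.
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import get_cfg
from detectron2.modeling import build_model

model = build_model(get_cfg())  # meta-architecture from the default config
# align_and_update_state_dicts() renames and suffix-matches the checkpoint keys.
DetectionCheckpointer(model).load("catalog://ImageNetPretrained/MSRA/R-50")
```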
-from .compat import downgrade_config, upgrade_config -from .config import CfgNode, get_cfg, global_cfg, set_global_cfg, configurable -from .instantiate import instantiate -from .lazy import LazyCall, LazyConfig - -__all__ = [ - "CfgNode", - "get_cfg", - "global_cfg", - "set_global_cfg", - "downgrade_config", - "upgrade_config", - "configurable", - "instantiate", - "LazyCall", - "LazyConfig", -] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/compat.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/compat.py deleted file mode 100755 index 11a08c43..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/compat.py +++ /dev/null @@ -1,229 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Backward compatibility of configs. - -Instructions to bump version: -+ It's not needed to bump version if new keys are added. - It's only needed when backward-incompatible changes happen - (i.e., some existing keys disappear, or the meaning of a key changes) -+ To bump version, do the following: - 1. Increment _C.VERSION in defaults.py - 2. Add a converter in this file. - - Each ConverterVX has a function "upgrade" which in-place upgrades config from X-1 to X, - and a function "downgrade" which in-place downgrades config from X to X-1 - - In each function, VERSION is left unchanged. - - Each converter assumes that its input has the relevant keys - (i.e., the input is not a partial config). - 3. Run the tests (test_config.py) to make sure the upgrade & downgrade - functions are consistent. -""" - -import logging -from typing import List, Optional, Tuple - -from .config import CfgNode as CN -from .defaults import _C - -__all__ = ["upgrade_config", "downgrade_config"] - - -def upgrade_config(cfg: CN, to_version: Optional[int] = None) -> CN: - """ - Upgrade a config from its current version to a newer version. - - Args: - cfg (CfgNode): - to_version (int): defaults to the latest version. - """ - cfg = cfg.clone() - if to_version is None: - to_version = _C.VERSION - - assert cfg.VERSION <= to_version, "Cannot upgrade from v{} to v{}!".format( - cfg.VERSION, to_version - ) - for k in range(cfg.VERSION, to_version): - converter = globals()["ConverterV" + str(k + 1)] - converter.upgrade(cfg) - cfg.VERSION = k + 1 - return cfg - - -def downgrade_config(cfg: CN, to_version: int) -> CN: - """ - Downgrade a config from its current version to an older version. - - Args: - cfg (CfgNode): - to_version (int): - - Note: - A general downgrade of arbitrary configs is not always possible due to the - different functionalities in different versions. - The purpose of downgrade is only to recover the defaults in old versions, - allowing it to load an old partial yaml config. - Therefore, the implementation only needs to fill in the default values - in the old version when a general downgrade is not possible. - """ - cfg = cfg.clone() - assert cfg.VERSION >= to_version, "Cannot downgrade from v{} to v{}!".format( - cfg.VERSION, to_version - ) - for k in range(cfg.VERSION, to_version, -1): - converter = globals()["ConverterV" + str(k)] - converter.downgrade(cfg) - cfg.VERSION = k - 1 - return cfg - - -def guess_version(cfg: CN, filename: str) -> int: - """ - Guess the version of a partial config where the VERSION field is not specified. 
- Returns the version, or the latest if cannot make a guess. - - This makes it easier for users to migrate. - """ - logger = logging.getLogger(__name__) - - def _has(name: str) -> bool: - cur = cfg - for n in name.split("."): - if n not in cur: - return False - cur = cur[n] - return True - - # Most users' partial configs have "MODEL.WEIGHT", so guess on it - ret = None - if _has("MODEL.WEIGHT") or _has("TEST.AUG_ON"): - ret = 1 - - if ret is not None: - logger.warning("Config '{}' has no VERSION. Assuming it to be v{}.".format(filename, ret)) - else: - ret = _C.VERSION - logger.warning( - "Config '{}' has no VERSION. Assuming it to be compatible with latest v{}.".format( - filename, ret - ) - ) - return ret - - -def _rename(cfg: CN, old: str, new: str) -> None: - old_keys = old.split(".") - new_keys = new.split(".") - - def _set(key_seq: List[str], val: str) -> None: - cur = cfg - for k in key_seq[:-1]: - if k not in cur: - cur[k] = CN() - cur = cur[k] - cur[key_seq[-1]] = val - - def _get(key_seq: List[str]) -> CN: - cur = cfg - for k in key_seq: - cur = cur[k] - return cur - - def _del(key_seq: List[str]) -> None: - cur = cfg - for k in key_seq[:-1]: - cur = cur[k] - del cur[key_seq[-1]] - if len(cur) == 0 and len(key_seq) > 1: - _del(key_seq[:-1]) - - _set(new_keys, _get(old_keys)) - _del(old_keys) - - -class _RenameConverter: - """ - A converter that handles simple rename. - """ - - RENAME: List[Tuple[str, str]] = [] # list of tuples of (old name, new name) - - @classmethod - def upgrade(cls, cfg: CN) -> None: - for old, new in cls.RENAME: - _rename(cfg, old, new) - - @classmethod - def downgrade(cls, cfg: CN) -> None: - for old, new in cls.RENAME[::-1]: - _rename(cfg, new, old) - - -class ConverterV1(_RenameConverter): - RENAME = [("MODEL.RPN_HEAD.NAME", "MODEL.RPN.HEAD_NAME")] - - -class ConverterV2(_RenameConverter): - """ - A large bulk of rename, before public release. 
- """ - - RENAME = [ - ("MODEL.WEIGHT", "MODEL.WEIGHTS"), - ("MODEL.PANOPTIC_FPN.SEMANTIC_LOSS_SCALE", "MODEL.SEM_SEG_HEAD.LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.RPN_LOSS_SCALE", "MODEL.RPN.LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.INSTANCE_LOSS_SCALE", "MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.COMBINE_ON", "MODEL.PANOPTIC_FPN.COMBINE.ENABLED"), - ( - "MODEL.PANOPTIC_FPN.COMBINE_OVERLAP_THRESHOLD", - "MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH", - ), - ( - "MODEL.PANOPTIC_FPN.COMBINE_STUFF_AREA_LIMIT", - "MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT", - ), - ( - "MODEL.PANOPTIC_FPN.COMBINE_INSTANCES_CONFIDENCE_THRESHOLD", - "MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH", - ), - ("MODEL.ROI_HEADS.SCORE_THRESH", "MODEL.ROI_HEADS.SCORE_THRESH_TEST"), - ("MODEL.ROI_HEADS.NMS", "MODEL.ROI_HEADS.NMS_THRESH_TEST"), - ("MODEL.RETINANET.INFERENCE_SCORE_THRESHOLD", "MODEL.RETINANET.SCORE_THRESH_TEST"), - ("MODEL.RETINANET.INFERENCE_TOPK_CANDIDATES", "MODEL.RETINANET.TOPK_CANDIDATES_TEST"), - ("MODEL.RETINANET.INFERENCE_NMS_THRESHOLD", "MODEL.RETINANET.NMS_THRESH_TEST"), - ("TEST.DETECTIONS_PER_IMG", "TEST.DETECTIONS_PER_IMAGE"), - ("TEST.AUG_ON", "TEST.AUG.ENABLED"), - ("TEST.AUG_MIN_SIZES", "TEST.AUG.MIN_SIZES"), - ("TEST.AUG_MAX_SIZE", "TEST.AUG.MAX_SIZE"), - ("TEST.AUG_FLIP", "TEST.AUG.FLIP"), - ] - - @classmethod - def upgrade(cls, cfg: CN) -> None: - super().upgrade(cfg) - - if cfg.MODEL.META_ARCHITECTURE == "RetinaNet": - _rename( - cfg, "MODEL.RETINANET.ANCHOR_ASPECT_RATIOS", "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS" - ) - _rename(cfg, "MODEL.RETINANET.ANCHOR_SIZES", "MODEL.ANCHOR_GENERATOR.SIZES") - del cfg["MODEL"]["RPN"]["ANCHOR_SIZES"] - del cfg["MODEL"]["RPN"]["ANCHOR_ASPECT_RATIOS"] - else: - _rename(cfg, "MODEL.RPN.ANCHOR_ASPECT_RATIOS", "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS") - _rename(cfg, "MODEL.RPN.ANCHOR_SIZES", "MODEL.ANCHOR_GENERATOR.SIZES") - del cfg["MODEL"]["RETINANET"]["ANCHOR_SIZES"] - del cfg["MODEL"]["RETINANET"]["ANCHOR_ASPECT_RATIOS"] - del cfg["MODEL"]["RETINANET"]["ANCHOR_STRIDES"] - - @classmethod - def downgrade(cls, cfg: CN) -> None: - super().downgrade(cfg) - - _rename(cfg, "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS", "MODEL.RPN.ANCHOR_ASPECT_RATIOS") - _rename(cfg, "MODEL.ANCHOR_GENERATOR.SIZES", "MODEL.RPN.ANCHOR_SIZES") - cfg.MODEL.RETINANET.ANCHOR_ASPECT_RATIOS = cfg.MODEL.RPN.ANCHOR_ASPECT_RATIOS - cfg.MODEL.RETINANET.ANCHOR_SIZES = cfg.MODEL.RPN.ANCHOR_SIZES - cfg.MODEL.RETINANET.ANCHOR_STRIDES = [] # this is not used anywhere in any version diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/config.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/config.py deleted file mode 100755 index 49a55b1b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/config.py +++ /dev/null @@ -1,265 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import inspect -import logging -from fvcore.common.config import CfgNode as _CfgNode - -from detectron2.utils.file_io import PathManager - - -class CfgNode(_CfgNode): - """ - The same as `fvcore.common.config.CfgNode`, but different in: - - 1. Use unsafe yaml loading by default. - Note that this may lead to arbitrary code execution: you must not - load a config file from untrusted sources before manually inspecting - the content of the file. - 2. Support config versioning. 
- When attempting to merge an old config, it will convert the old config automatically. - - .. automethod:: clone - .. automethod:: freeze - .. automethod:: defrost - .. automethod:: is_frozen - .. automethod:: load_yaml_with_base - .. automethod:: merge_from_list - .. automethod:: merge_from_other_cfg - """ - - @classmethod - def _open_cfg(cls, filename): - return PathManager.open(filename, "r") - - # Note that the default value of allow_unsafe is changed to True - def merge_from_file(self, cfg_filename: str, allow_unsafe: bool = True) -> None: - """ - Load content from the given config file and merge it into self. - - Args: - cfg_filename: config filename - allow_unsafe: allow unsafe yaml syntax - """ - assert PathManager.isfile(cfg_filename), f"Config file '{cfg_filename}' does not exist!" - loaded_cfg = self.load_yaml_with_base(cfg_filename, allow_unsafe=allow_unsafe) - loaded_cfg = type(self)(loaded_cfg) - - # defaults.py needs to import CfgNode - from .defaults import _C - - latest_ver = _C.VERSION - assert ( - latest_ver == self.VERSION - ), "CfgNode.merge_from_file is only allowed on a config object of latest version!" - - logger = logging.getLogger(__name__) - - loaded_ver = loaded_cfg.get("VERSION", None) - if loaded_ver is None: - from .compat import guess_version - - loaded_ver = guess_version(loaded_cfg, cfg_filename) - assert loaded_ver <= self.VERSION, "Cannot merge a v{} config into a v{} config.".format( - loaded_ver, self.VERSION - ) - - if loaded_ver == self.VERSION: - self.merge_from_other_cfg(loaded_cfg) - else: - # compat.py needs to import CfgNode - from .compat import upgrade_config, downgrade_config - - logger.warning( - "Loading an old v{} config file '{}' by automatically upgrading to v{}. " - "See docs/CHANGELOG.md for instructions to update your files.".format( - loaded_ver, cfg_filename, self.VERSION - ) - ) - # To convert, first obtain a full config at an old version - old_self = downgrade_config(self, to_version=loaded_ver) - old_self.merge_from_other_cfg(loaded_cfg) - new_config = upgrade_config(old_self) - self.clear() - self.update(new_config) - - def dump(self, *args, **kwargs): - """ - Returns: - str: a yaml string representation of the config - """ - # to make it show up in docs - return super().dump(*args, **kwargs) - - -global_cfg = CfgNode() - - -def get_cfg() -> CfgNode: - """ - Get a copy of the default config. - - Returns: - a detectron2 CfgNode instance. - """ - from .defaults import _C - - return _C.clone() - - -def set_global_cfg(cfg: CfgNode) -> None: - """ - Let the global config point to the given cfg. - - Assume that the given "cfg" has the key "KEY", after calling - `set_global_cfg(cfg)`, the key can be accessed by: - :: - from detectron2.config import global_cfg - print(global_cfg.KEY) - - By using a hacky global config, you can access these configs anywhere, - without having to pass the config object or the values deep into the code. - This is a hacky feature introduced for quick prototyping / research exploration. - """ - global global_cfg - global_cfg.clear() - global_cfg.update(cfg) - - -def configurable(init_func=None, *, from_config=None): - """ - Decorate a function or a class's __init__ method so that it can be called - with a :class:`CfgNode` object using a :func:`from_config` function that translates - :class:`CfgNode` to arguments. 
-
-    Examples:
-    ::
-        # Usage 1: Decorator on __init__:
-        class A:
-            @configurable
-            def __init__(self, a, b=2, c=3):
-                pass
-
-            @classmethod
-            def from_config(cls, cfg):   # 'cfg' must be the first argument
-                # Returns kwargs to be passed to __init__
-                return {"a": cfg.A, "b": cfg.B}
-
-        a1 = A(a=1, b=2)  # regular construction
-        a2 = A(cfg)       # construct with a cfg
-        a3 = A(cfg, b=3, c=4)  # construct with extra overwrite
-
-        # Usage 2: Decorator on any function. Needs an extra from_config argument:
-        @configurable(from_config=lambda cfg: {"a": cfg.A, "b": cfg.B})
-        def a_func(a, b=2, c=3):
-            pass
-
-        a1 = a_func(a=1, b=2)  # regular call
-        a2 = a_func(cfg)       # call with a cfg
-        a3 = a_func(cfg, b=3, c=4)  # call with extra overwrite
-
-    Args:
-        init_func (callable): a class's ``__init__`` method in usage 1. The
-            class must have a ``from_config`` classmethod which takes `cfg` as
-            the first argument.
-        from_config (callable): the from_config function in usage 2. It must take `cfg`
-            as its first argument.
-    """
-
-    if init_func is not None:
-        assert (
-            inspect.isfunction(init_func)
-            and from_config is None
-            and init_func.__name__ == "__init__"
-        ), "Incorrect use of @configurable. Check API documentation for examples."
-
-        @functools.wraps(init_func)
-        def wrapped(self, *args, **kwargs):
-            try:
-                from_config_func = type(self).from_config
-            except AttributeError as e:
-                raise AttributeError(
-                    "Class with @configurable must have a 'from_config' classmethod."
-                ) from e
-            if not inspect.ismethod(from_config_func):
-                raise TypeError("Class with @configurable must have a 'from_config' classmethod.")
-
-            if _called_with_cfg(*args, **kwargs):
-                explicit_args = _get_args_from_config(from_config_func, *args, **kwargs)
-                init_func(self, **explicit_args)
-            else:
-                init_func(self, *args, **kwargs)
-
-        return wrapped
-
-    else:
-        if from_config is None:
-            return configurable  # @configurable() is made equivalent to @configurable
-        assert inspect.isfunction(
-            from_config
-        ), "from_config argument of configurable must be a function!"
-
-        def wrapper(orig_func):
-            @functools.wraps(orig_func)
-            def wrapped(*args, **kwargs):
-                if _called_with_cfg(*args, **kwargs):
-                    explicit_args = _get_args_from_config(from_config, *args, **kwargs)
-                    return orig_func(**explicit_args)
-                else:
-                    return orig_func(*args, **kwargs)
-
-            wrapped.from_config = from_config
-            return wrapped
-
-        return wrapper
-
-
-def _get_args_from_config(from_config_func, *args, **kwargs):
-    """
-    Use `from_config` to obtain explicit arguments.
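-
-    A sketch of the forwarding behavior implemented below: keyword arguments
-    that ``from_config`` cannot accept (when it declares no ``*args`` or
-    ``**kwargs`` catch-all) are withheld from it and instead forwarded to
-    ``__init__`` together with the dict it returns.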
- - Returns: - dict: arguments to be used for cls.__init__ - """ - signature = inspect.signature(from_config_func) - if list(signature.parameters.keys())[0] != "cfg": - if inspect.isfunction(from_config_func): - name = from_config_func.__name__ - else: - name = f"{from_config_func.__self__}.from_config" - raise TypeError(f"{name} must take 'cfg' as the first argument!") - support_var_arg = any( - param.kind in [param.VAR_POSITIONAL, param.VAR_KEYWORD] - for param in signature.parameters.values() - ) - if support_var_arg: # forward all arguments to from_config, if from_config accepts them - ret = from_config_func(*args, **kwargs) - else: - # forward supported arguments to from_config - supported_arg_names = set(signature.parameters.keys()) - extra_kwargs = {} - for name in list(kwargs.keys()): - if name not in supported_arg_names: - extra_kwargs[name] = kwargs.pop(name) - ret = from_config_func(*args, **kwargs) - # forward the other arguments to __init__ - ret.update(extra_kwargs) - return ret - - -def _called_with_cfg(*args, **kwargs): - """ - Returns: - bool: whether the arguments contain CfgNode and should be considered - forwarded to from_config. - """ - from omegaconf import DictConfig - - if len(args) and isinstance(args[0], (_CfgNode, DictConfig)): - return True - if isinstance(kwargs.pop("cfg", None), (_CfgNode, DictConfig)): - return True - # `from_config`'s first argument is forced to be "cfg". - # So the above check covers all cases. - return False diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/defaults.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/defaults.py deleted file mode 100755 index 848486df..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/defaults.py +++ /dev/null @@ -1,635 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .config import CfgNode as CN - -# NOTE: given the new config system -# (https://detectron2.readthedocs.io/en/latest/tutorials/lazyconfigs.html), -# we will stop adding new functionalities to default CfgNode. - -# ----------------------------------------------------------------------------- -# Convention about Training / Test specific parameters -# ----------------------------------------------------------------------------- -# Whenever an argument can be either used for training or for testing, the -# corresponding name will be post-fixed by a _TRAIN for a training parameter, -# or _TEST for a test-specific parameter. -# For example, the number of images during training will be -# IMAGES_PER_BATCH_TRAIN, while the number of images for testing will be -# IMAGES_PER_BATCH_TEST - -# ----------------------------------------------------------------------------- -# Config definition -# ----------------------------------------------------------------------------- - -_C = CN() - -# The version number, to upgrade from old configs to new ones if any -# changes happen. It's recommended to keep a VERSION in your config file. -_C.VERSION = 2 - -_C.MODEL = CN() -_C.MODEL.LOAD_PROPOSALS = False -_C.MODEL.MASK_ON = False -_C.MODEL.KEYPOINT_ON = False -_C.MODEL.DEVICE = "cuda" -_C.MODEL.META_ARCHITECTURE = "GeneralizedRCNN" - -# Path (a file path, or URL like detectron2://.., https://..) to a checkpoint file -# to be loaded to the model. You can find available models in the model zoo. -_C.MODEL.WEIGHTS = "" - -# Values to be used for image normalization (BGR order, since INPUT.FORMAT defaults to BGR). 
-# To train on images with a different number of channels, just set different mean & std.
-# Default values are the mean pixel value from ImageNet: [103.53, 116.28, 123.675]
-_C.MODEL.PIXEL_MEAN = [103.530, 116.280, 123.675]
-# When using pre-trained models in Detectron1 or any MSRA models,
-# std has been absorbed into its conv1 weights, so the std needs to be set to 1.
-# Otherwise, you can use [57.375, 57.120, 58.395] (ImageNet std)
-_C.MODEL.PIXEL_STD = [1.0, 1.0, 1.0]
-
-
-# -----------------------------------------------------------------------------
-# INPUT
-# -----------------------------------------------------------------------------
-_C.INPUT = CN()
-# By default, {MIN,MAX}_SIZE options are used in transforms.ResizeShortestEdge.
-# Please refer to ResizeShortestEdge for detailed definition.
-# Size of the smallest side of the image during training
-_C.INPUT.MIN_SIZE_TRAIN = (800,)
-# Sample size of smallest side by choice or random selection from the range given by
-# INPUT.MIN_SIZE_TRAIN
-_C.INPUT.MIN_SIZE_TRAIN_SAMPLING = "choice"
-# Maximum size of the side of the image during training
-_C.INPUT.MAX_SIZE_TRAIN = 1333
-# Size of the smallest side of the image during testing. Set to zero to disable resize in testing.
-_C.INPUT.MIN_SIZE_TEST = 800
-# Maximum size of the side of the image during testing
-_C.INPUT.MAX_SIZE_TEST = 1333
-# Mode for flipping images used in data augmentation during training
-# choose one of ["horizontal", "vertical", "none"]
-_C.INPUT.RANDOM_FLIP = "horizontal"
-
-# `True` if cropping is used for data augmentation during training
-_C.INPUT.CROP = CN({"ENABLED": False})
-# Cropping type. See documentation of `detectron2.data.transforms.RandomCrop` for explanation.
-_C.INPUT.CROP.TYPE = "relative_range"
-# Size of crop in range (0, 1] if CROP.TYPE is "relative" or "relative_range" and in number of
-# pixels if CROP.TYPE is "absolute"
-_C.INPUT.CROP.SIZE = [0.9, 0.9]
-
-
-# Whether the model needs RGB, YUV, HSV etc.
-# Should be one of the modes defined here, as we use PIL to read the image:
-# https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes
-# with BGR being the one exception. One can set the image format to BGR; we will
-# internally use RGB for conversion and flip the channels over
-_C.INPUT.FORMAT = "BGR"
-# The ground truth mask format that the model will use.
-# Mask R-CNN supports either "polygon" or "bitmask" as ground truth.
-_C.INPUT.MASK_FORMAT = "polygon"  # alternative: "bitmask"
-
-
-# -----------------------------------------------------------------------------
-# Dataset
-# -----------------------------------------------------------------------------
-_C.DATASETS = CN()
-# List of the dataset names for training. Must be registered in DatasetCatalog
-# Samples from these datasets will be merged and used as one dataset.
-_C.DATASETS.TRAIN = ()
-# List of the pre-computed proposal files for training, which must be consistent
-# with datasets listed in DATASETS.TRAIN.
-_C.DATASETS.PROPOSAL_FILES_TRAIN = ()
-# Number of top scoring precomputed proposals to keep for training
-_C.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TRAIN = 2000
-# List of the dataset names for testing. Must be registered in DatasetCatalog
-_C.DATASETS.TEST = ()
-# List of the pre-computed proposal files for test, which must be consistent
-# with datasets listed in DATASETS.TEST.
-_C.DATASETS.PROPOSAL_FILES_TEST = ()
-# Number of top scoring precomputed proposals to keep for test
-_C.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TEST = 1000
-
-# -----------------------------------------------------------------------------
-# DataLoader
-# -----------------------------------------------------------------------------
-_C.DATALOADER = CN()
-# Number of data loading threads
-_C.DATALOADER.NUM_WORKERS = 4
-# If True, each batch should contain only images for which the aspect ratio
-# is compatible. This groups portrait images together, and landscape images
-# are not batched with portrait images.
-_C.DATALOADER.ASPECT_RATIO_GROUPING = True
-# Options: TrainingSampler, RepeatFactorTrainingSampler
-_C.DATALOADER.SAMPLER_TRAIN = "TrainingSampler"
-# Repeat threshold for RepeatFactorTrainingSampler
-_C.DATALOADER.REPEAT_THRESHOLD = 0.0
-# If True, when working on datasets that have instance annotations, the
-# training dataloader will filter out images without associated annotations
-_C.DATALOADER.FILTER_EMPTY_ANNOTATIONS = True
-
-# ---------------------------------------------------------------------------- #
-# Backbone options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.BACKBONE = CN()
-
-_C.MODEL.BACKBONE.NAME = "build_resnet_backbone"
-# Freeze the first several stages so they are not trained.
-# There are 5 stages in ResNet. The first is a convolution, and the following
-# stages are each group of residual blocks.
-_C.MODEL.BACKBONE.FREEZE_AT = 2
-
-
-# ---------------------------------------------------------------------------- #
-# FPN options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.FPN = CN()
-# Names of the input feature maps to be used by FPN
-# They must have contiguous power of 2 strides
-# e.g., ["res2", "res3", "res4", "res5"]
-_C.MODEL.FPN.IN_FEATURES = []
-_C.MODEL.FPN.OUT_CHANNELS = 256
-
-# Options: "" (no norm), "GN"
-_C.MODEL.FPN.NORM = ""
-
-# Types for fusing the FPN top-down and lateral features. Can be either "sum" or "avg"
-_C.MODEL.FPN.FUSE_TYPE = "sum"
-
-
-# ---------------------------------------------------------------------------- #
-# Proposal generator options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.PROPOSAL_GENERATOR = CN()
-# Current proposal generators include "RPN", "RRPN" and "PrecomputedProposals"
-_C.MODEL.PROPOSAL_GENERATOR.NAME = "RPN"
-# Proposal height and width both need to be greater than MIN_SIZE
-# (at the scale used during training or inference)
-_C.MODEL.PROPOSAL_GENERATOR.MIN_SIZE = 0
-
-
-# ---------------------------------------------------------------------------- #
-# Anchor generator options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.ANCHOR_GENERATOR = CN()
-# The generator can be any name in the ANCHOR_GENERATOR registry
-_C.MODEL.ANCHOR_GENERATOR.NAME = "DefaultAnchorGenerator"
-# Anchor sizes (i.e. sqrt of area) in absolute pixels w.r.t. the network input.
-# Format: list[list[float]]. SIZES[i] specifies the list of sizes to use for
-# IN_FEATURES[i]; len(SIZES) must be equal to len(IN_FEATURES) or 1.
-# When len(SIZES) == 1, SIZES[0] is used for all IN_FEATURES.
-_C.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64, 128, 256, 512]]
-# Anchor aspect ratios. For each area given in `SIZES`, anchors with different aspect
-# ratios are generated by an anchor generator.
-# Format: list[list[float]]. ASPECT_RATIOS[i] specifies the list of aspect ratios (H/W)
-# to use for IN_FEATURES[i]; len(ASPECT_RATIOS) == len(IN_FEATURES) must be true,
-# or len(ASPECT_RATIOS) == 1 is true and aspect ratio list ASPECT_RATIOS[0] is used
-# for all IN_FEATURES.
-_C.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.5, 1.0, 2.0]]
-# Anchor angles.
-# list[list[float]], the angle in degrees, for each input feature map.
-# ANGLES[i] specifies the list of angles for IN_FEATURES[i].
-_C.MODEL.ANCHOR_GENERATOR.ANGLES = [[-90, 0, 90]]
-# Relative offset between the center of the first anchor and the top-left corner of the image
-# Value has to be in [0, 1). The recommended value is 0.5, which means half stride.
-# The value is not expected to affect model accuracy.
-_C.MODEL.ANCHOR_GENERATOR.OFFSET = 0.0
-
-# ---------------------------------------------------------------------------- #
-# RPN options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.RPN = CN()
-_C.MODEL.RPN.HEAD_NAME = "StandardRPNHead"  # used by RPN_HEAD_REGISTRY
-
-# Names of the input feature maps to be used by RPN
-# e.g., ["p2", "p3", "p4", "p5", "p6"] for FPN
-_C.MODEL.RPN.IN_FEATURES = ["res4"]
-# Remove RPN anchors that go outside the image by BOUNDARY_THRESH pixels
-# Set to -1 or a large value, e.g. 100000, to disable pruning anchors
-_C.MODEL.RPN.BOUNDARY_THRESH = -1
-# IOU overlap ratios [BG_IOU_THRESHOLD, FG_IOU_THRESHOLD]
-# Minimum overlap required between an anchor and ground-truth box for the
-# (anchor, gt box) pair to be a positive example (IoU >= FG_IOU_THRESHOLD
-# ==> positive RPN example: 1)
-# Maximum overlap allowed between an anchor and ground-truth box for the
-# (anchor, gt box) pair to be a negative example (IoU < BG_IOU_THRESHOLD
-# ==> negative RPN example: 0)
-# Anchors with overlap in between (BG_IOU_THRESHOLD <= IoU < FG_IOU_THRESHOLD)
-# are ignored (-1)
-_C.MODEL.RPN.IOU_THRESHOLDS = [0.3, 0.7]
-_C.MODEL.RPN.IOU_LABELS = [0, -1, 1]
-# Number of regions per image used to train RPN
-_C.MODEL.RPN.BATCH_SIZE_PER_IMAGE = 256
-# Target fraction of foreground (positive) examples per RPN minibatch
-_C.MODEL.RPN.POSITIVE_FRACTION = 0.5
-# Options are: "smooth_l1", "giou", "diou", "ciou"
-_C.MODEL.RPN.BBOX_REG_LOSS_TYPE = "smooth_l1"
-_C.MODEL.RPN.BBOX_REG_LOSS_WEIGHT = 1.0
-# Weights on (dx, dy, dw, dh) for normalizing RPN anchor regression targets
-_C.MODEL.RPN.BBOX_REG_WEIGHTS = (1.0, 1.0, 1.0, 1.0)
-# The transition point from L1 to L2 loss. Set to 0.0 to make the loss simply L1.
-_C.MODEL.RPN.SMOOTH_L1_BETA = 0.0
-_C.MODEL.RPN.LOSS_WEIGHT = 1.0
-# Number of top scoring RPN proposals to keep before applying NMS
-# When FPN is used, this is *per FPN level* (not total)
-_C.MODEL.RPN.PRE_NMS_TOPK_TRAIN = 12000
-_C.MODEL.RPN.PRE_NMS_TOPK_TEST = 6000
-# Number of top scoring RPN proposals to keep after applying NMS
-# When FPN is used, this limit is applied per level and then again to the union
-# of proposals from all levels
-# NOTE: When FPN is used, the meaning of this config is different from Detectron1.
-# It means per-batch topk in Detectron1, but per-image topk here.
-# See the "find_top_rpn_proposals" function for details.
-_C.MODEL.RPN.POST_NMS_TOPK_TRAIN = 2000
-_C.MODEL.RPN.POST_NMS_TOPK_TEST = 1000
-# NMS threshold used on RPN proposals
-_C.MODEL.RPN.NMS_THRESH = 0.7
-# Set this to -1 to use the same number of output channels as input channels.
-_C.MODEL.RPN.CONV_DIMS = [-1]
-
-# ---------------------------------------------------------------------------- #
-# ROI HEADS options
-# ---------------------------------------------------------------------------- #
-_C.MODEL.ROI_HEADS = CN()
-_C.MODEL.ROI_HEADS.NAME = "Res5ROIHeads"
-# Number of foreground classes
-_C.MODEL.ROI_HEADS.NUM_CLASSES = 80
-# Names of the input feature maps to be used by ROI heads
-# Currently all heads (box, mask, ...) use the same input feature map list
-# e.g., ["p2", "p3", "p4", "p5"] is commonly used for FPN
-_C.MODEL.ROI_HEADS.IN_FEATURES = ["res4"]
-# IOU overlap ratios [IOU_THRESHOLD]
-# Overlap threshold for an RoI to be considered background (if < IOU_THRESHOLD)
-# Overlap threshold for an RoI to be considered foreground (if >= IOU_THRESHOLD)
-_C.MODEL.ROI_HEADS.IOU_THRESHOLDS = [0.5]
-_C.MODEL.ROI_HEADS.IOU_LABELS = [0, 1]
-# RoI minibatch size *per image* (number of regions of interest [ROIs]) during training
-# Total number of RoIs per training minibatch =
-#   ROI_HEADS.BATCH_SIZE_PER_IMAGE * SOLVER.IMS_PER_BATCH
-# E.g., a common configuration is: 512 * 16 = 8192
-_C.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 512
-# Target fraction of RoI minibatch that is labeled foreground (i.e. class > 0)
-_C.MODEL.ROI_HEADS.POSITIVE_FRACTION = 0.25
-
-# Only used in test mode
-
-# Minimum score threshold (assuming scores in a [0, 1] range); a value chosen to
-# balance obtaining high recall with not having too many low precision
-# detections that will slow down inference post processing steps (like NMS)
-# A default threshold of 0.0 increases AP by ~0.2-0.3 but significantly slows down
-# inference.
-_C.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.05
-# Overlap threshold used for non-maximum suppression (suppress boxes with
-# IoU >= this threshold)
-_C.MODEL.ROI_HEADS.NMS_THRESH_TEST = 0.5
-# If True, augment proposals with ground-truth boxes before sampling proposals to
-# train ROI heads.
-_C.MODEL.ROI_HEADS.PROPOSAL_APPEND_GT = True
-
-# ---------------------------------------------------------------------------- #
-# Box Head
-# ---------------------------------------------------------------------------- #
-_C.MODEL.ROI_BOX_HEAD = CN()
-# C4 models don't use the head name option
-# Options for non-C4 models: FastRCNNConvFCHead,
-_C.MODEL.ROI_BOX_HEAD.NAME = ""
-# Options are: "smooth_l1", "giou", "diou", "ciou"
-_C.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_TYPE = "smooth_l1"
-# The final scaling coefficient on the box regression loss, used to balance the magnitude of its
-# gradients with other losses in the model. See also `MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT`.
-_C.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_WEIGHT = 1.0
-# Default weights on (dx, dy, dw, dh) for normalizing bbox regression targets
-# These are empirically chosen to approximately lead to unit variance targets
-_C.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10.0, 10.0, 5.0, 5.0)
-# The transition point from L1 to L2 loss. Set to 0.0 to make the loss simply L1.
-_C.MODEL.ROI_BOX_HEAD.SMOOTH_L1_BETA = 0.0
-_C.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION = 14
-_C.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO = 0
-# Type of pooling operation applied to the incoming feature map for each RoI
-_C.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2"
-
-_C.MODEL.ROI_BOX_HEAD.NUM_FC = 0
-# Hidden layer dimension for FC layers in the RoI box head
-_C.MODEL.ROI_BOX_HEAD.FC_DIM = 1024
-_C.MODEL.ROI_BOX_HEAD.NUM_CONV = 0
-# Channel dimension for Conv layers in the RoI box head
-_C.MODEL.ROI_BOX_HEAD.CONV_DIM = 256
-# Normalization method for the convolution layers.
-# Options: "" (no norm), "GN", "SyncBN". -_C.MODEL.ROI_BOX_HEAD.NORM = "" -# Whether to use class agnostic for bbox regression -_C.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG = False -# If true, RoI heads use bounding boxes predicted by the box head rather than proposal boxes. -_C.MODEL.ROI_BOX_HEAD.TRAIN_ON_PRED_BOXES = False - -# ---------------------------------------------------------------------------- # -# Cascaded Box Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_BOX_CASCADE_HEAD = CN() -# The number of cascade stages is implicitly defined by the length of the following two configs. -_C.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS = ( - (10.0, 10.0, 5.0, 5.0), - (20.0, 20.0, 10.0, 10.0), - (30.0, 30.0, 15.0, 15.0), -) -_C.MODEL.ROI_BOX_CASCADE_HEAD.IOUS = (0.5, 0.6, 0.7) - - -# ---------------------------------------------------------------------------- # -# Mask Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_MASK_HEAD = CN() -_C.MODEL.ROI_MASK_HEAD.NAME = "MaskRCNNConvUpsampleHead" -_C.MODEL.ROI_MASK_HEAD.POOLER_RESOLUTION = 14 -_C.MODEL.ROI_MASK_HEAD.POOLER_SAMPLING_RATIO = 0 -_C.MODEL.ROI_MASK_HEAD.NUM_CONV = 0 # The number of convs in the mask head -_C.MODEL.ROI_MASK_HEAD.CONV_DIM = 256 -# Normalization method for the convolution layers. -# Options: "" (no norm), "GN", "SyncBN". -_C.MODEL.ROI_MASK_HEAD.NORM = "" -# Whether to use class agnostic for mask prediction -_C.MODEL.ROI_MASK_HEAD.CLS_AGNOSTIC_MASK = False -# Type of pooling operation applied to the incoming feature map for each RoI -_C.MODEL.ROI_MASK_HEAD.POOLER_TYPE = "ROIAlignV2" - - -# ---------------------------------------------------------------------------- # -# Keypoint Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_KEYPOINT_HEAD = CN() -_C.MODEL.ROI_KEYPOINT_HEAD.NAME = "KRCNNConvDeconvUpsampleHead" -_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_RESOLUTION = 14 -_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_SAMPLING_RATIO = 0 -_C.MODEL.ROI_KEYPOINT_HEAD.CONV_DIMS = tuple(512 for _ in range(8)) -_C.MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS = 17 # 17 is the number of keypoints in COCO. - -# Images with too few (or no) keypoints are excluded from training. -_C.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE = 1 -# Normalize by the total number of visible keypoints in the minibatch if True. -# Otherwise, normalize by the total number of keypoints that could ever exist -# in the minibatch. -# The keypoint softmax loss is only calculated on visible keypoints. -# Since the number of visible keypoints can vary significantly between -# minibatches, this has the effect of up-weighting the importance of -# minibatches with few visible keypoints. (Imagine the extreme case of -# only one visible keypoint versus N: in the case of N, each one -# contributes 1/N to the gradient compared to the single keypoint -# determining the gradient direction). Instead, we can normalize the -# loss by the total number of keypoints, if it were the case that all -# keypoints were visible in a full minibatch. (Returning to the example, -# this means that the one visible keypoint contributes as much as each -# of the N keypoints.) 
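-# A toy numeric sketch of the two normalizations (the counts below are assumed
-# only for illustration): if a minibatch contains 2 visible keypoints out of 64
-# possible ones, True divides the summed loss by 2 (each visible keypoint
-# contributes 1/2 of the gradient), while False divides it by 64 (each
-# contributes only 1/64).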
-_C.MODEL.ROI_KEYPOINT_HEAD.NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS = True
-# Multi-task loss weight to use for keypoints
-# Recommended values:
-#   - use 1.0 if NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS is True
-#   - use 4.0 if NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS is False
-_C.MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT = 1.0
-# Type of pooling operation applied to the incoming feature map for each RoI
-_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_TYPE = "ROIAlignV2"
-
-# ---------------------------------------------------------------------------- #
-# Semantic Segmentation Head
-# ---------------------------------------------------------------------------- #
-_C.MODEL.SEM_SEG_HEAD = CN()
-_C.MODEL.SEM_SEG_HEAD.NAME = "SemSegFPNHead"
-_C.MODEL.SEM_SEG_HEAD.IN_FEATURES = ["p2", "p3", "p4", "p5"]
-# Label in the semantic segmentation ground truth that is ignored, i.e., no loss is calculated for
-# the corresponding pixel.
-_C.MODEL.SEM_SEG_HEAD.IGNORE_VALUE = 255
-# Number of classes in the semantic segmentation head
-_C.MODEL.SEM_SEG_HEAD.NUM_CLASSES = 54
-# Number of channels in the 3x3 convs inside semantic-FPN heads.
-_C.MODEL.SEM_SEG_HEAD.CONVS_DIM = 128
-# Outputs from semantic-FPN heads are up-scaled to the COMMON_STRIDE stride.
-_C.MODEL.SEM_SEG_HEAD.COMMON_STRIDE = 4
-# Normalization method for the convolution layers. Options: "" (no norm), "GN".
-_C.MODEL.SEM_SEG_HEAD.NORM = "GN"
-_C.MODEL.SEM_SEG_HEAD.LOSS_WEIGHT = 1.0
-
-_C.MODEL.PANOPTIC_FPN = CN()
-# Scaling of all losses from instance detection / segmentation head.
-_C.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT = 1.0
-
-# options when combining instance & semantic segmentation outputs
-_C.MODEL.PANOPTIC_FPN.COMBINE = CN({"ENABLED": True})  # "COMBINE.ENABLED" is deprecated & not used
-_C.MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH = 0.5
-_C.MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT = 4096
-_C.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = 0.5
-
-
-# ---------------------------------------------------------------------------- #
-# RetinaNet Head
-# ---------------------------------------------------------------------------- #
-_C.MODEL.RETINANET = CN()
-
-# This is the number of foreground classes.
-_C.MODEL.RETINANET.NUM_CLASSES = 80
-
-_C.MODEL.RETINANET.IN_FEATURES = ["p3", "p4", "p5", "p6", "p7"]
-
-# Convolutions to use in the cls and bbox tower
-# NOTE: this doesn't include the last conv for logits
-_C.MODEL.RETINANET.NUM_CONVS = 4
-
-# IoU overlap ratio [bg, fg] for labeling anchors.
-# Anchors with < bg are labeled negative (0)
-# Anchors with >= bg and < fg are ignored (-1)
-# Anchors with >= fg are labeled positive (1)
-_C.MODEL.RETINANET.IOU_THRESHOLDS = [0.4, 0.5]
-_C.MODEL.RETINANET.IOU_LABELS = [0, -1, 1]
-
-# Prior prob for rare case (i.e. foreground) at the beginning of training.
-# This is used to set the bias for the logits layer of the classifier subnet.
-# This improves training stability in the case of heavy class imbalance.
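-# As a sketch of how such a prior is typically turned into a bias init (this is
-# the standard focal-loss initialization; the formula is not spelled out in
-# this file): bias = -log((1 - p) / p) with p = PRIOR_PROB, so that
-# sigmoid(bias) = p; p = 0.01 gives bias ~= -4.6, i.e. near-zero initial
-# foreground scores for all anchors.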
-_C.MODEL.RETINANET.PRIOR_PROB = 0.01
-
-# Inference cls score threshold, only anchors with score > INFERENCE_TH are
-# considered for inference (to improve speed)
-_C.MODEL.RETINANET.SCORE_THRESH_TEST = 0.05
-# Select topk candidates before NMS
-_C.MODEL.RETINANET.TOPK_CANDIDATES_TEST = 1000
-_C.MODEL.RETINANET.NMS_THRESH_TEST = 0.5
-
-# Weights on (dx, dy, dw, dh) for normalizing Retinanet anchor regression targets
-_C.MODEL.RETINANET.BBOX_REG_WEIGHTS = (1.0, 1.0, 1.0, 1.0)
-
-# Loss parameters
-_C.MODEL.RETINANET.FOCAL_LOSS_GAMMA = 2.0
-_C.MODEL.RETINANET.FOCAL_LOSS_ALPHA = 0.25
-_C.MODEL.RETINANET.SMOOTH_L1_LOSS_BETA = 0.1
-# Options are: "smooth_l1", "giou", "diou", "ciou"
-_C.MODEL.RETINANET.BBOX_REG_LOSS_TYPE = "smooth_l1"
-
-# One of BN, SyncBN, FrozenBN, GN
-# Only supports GN until unshared norm is implemented
-_C.MODEL.RETINANET.NORM = ""
-
-
-# ---------------------------------------------------------------------------- #
-# ResNe[X]t options (ResNets = {ResNet, ResNeXt})
-# Note that parts of a resnet may be used for both the backbone and the head
-# These options apply to both
-# ---------------------------------------------------------------------------- #
-_C.MODEL.RESNETS = CN()
-
-_C.MODEL.RESNETS.DEPTH = 50
-_C.MODEL.RESNETS.OUT_FEATURES = ["res4"]  # res4 for C4 backbone, res2..5 for FPN backbone
-
-# Number of groups to use; 1 ==> ResNet; > 1 ==> ResNeXt
-_C.MODEL.RESNETS.NUM_GROUPS = 1
-
-# Options: FrozenBN, GN, "SyncBN", "BN"
-_C.MODEL.RESNETS.NORM = "FrozenBN"
-
-# Baseline width of each group.
-# Scaling this parameter will scale the width of all bottleneck layers.
-_C.MODEL.RESNETS.WIDTH_PER_GROUP = 64
-
-# Place the stride 2 conv on the 1x1 filter
-# Use True only for the original MSRA ResNet; use False for C2 and Torch models
-_C.MODEL.RESNETS.STRIDE_IN_1X1 = True
-
-# Apply dilation in stage "res5"
-_C.MODEL.RESNETS.RES5_DILATION = 1
-
-# Output width of res2. Scaling this parameter will scale the width of all 1x1 convs in ResNet
-# For R18 and R34, this needs to be set to 64
-_C.MODEL.RESNETS.RES2_OUT_CHANNELS = 256
-_C.MODEL.RESNETS.STEM_OUT_CHANNELS = 64
-
-# Apply Deformable Convolution in stages
-# Specify whether to apply deform_conv on Res2, Res3, Res4, Res5
-_C.MODEL.RESNETS.DEFORM_ON_PER_STAGE = [False, False, False, False]
-# Use True to use modulated deform_conv (DeformableV2, https://arxiv.org/abs/1811.11168);
-# Use False for DeformableV1.
-_C.MODEL.RESNETS.DEFORM_MODULATED = False
-# Number of groups in deformable conv.
-_C.MODEL.RESNETS.DEFORM_NUM_GROUPS = 1
-
-
-# ---------------------------------------------------------------------------- #
-# Solver
-# ---------------------------------------------------------------------------- #
-_C.SOLVER = CN()
-
-# Options: WarmupMultiStepLR, WarmupCosineLR.
-# See detectron2/solver/build.py for definition.
-_C.SOLVER.LR_SCHEDULER_NAME = "WarmupMultiStepLR"
-
-_C.SOLVER.MAX_ITER = 40000
-
-_C.SOLVER.BASE_LR = 0.001
-
-_C.SOLVER.MOMENTUM = 0.9
-
-_C.SOLVER.NESTEROV = False
-
-_C.SOLVER.WEIGHT_DECAY = 0.0001
-# The weight decay that's applied to parameters of normalization layers
-# (typically the affine transformation)
-_C.SOLVER.WEIGHT_DECAY_NORM = 0.0
-
-_C.SOLVER.GAMMA = 0.1
-# The iteration numbers at which to decrease the learning rate by GAMMA.
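-# A worked example with the defaults above (assuming the usual multi-step
-# semantics: LR = BASE_LR * GAMMA^k after the k-th step): BASE_LR = 0.001,
-# GAMMA = 0.1 and STEPS = (30000,) give an LR of 0.001 for iterations
-# [0, 30000) and 0.0001 from iteration 30000 until MAX_ITER = 40000, on top
-# of the linear warmup applied during the first WARMUP_ITERS iterations.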
-_C.SOLVER.STEPS = (30000,)
-
-_C.SOLVER.WARMUP_FACTOR = 1.0 / 1000
-_C.SOLVER.WARMUP_ITERS = 1000
-_C.SOLVER.WARMUP_METHOD = "linear"
-
-# Save a checkpoint every this many iterations
-_C.SOLVER.CHECKPOINT_PERIOD = 5000
-
-# Number of images per batch across all machines. This is also the number
-# of training images per step (i.e. per iteration). If we use 16 GPUs
-# and IMS_PER_BATCH = 32, each GPU will see 2 images per batch.
-# May be adjusted automatically if REFERENCE_WORLD_SIZE is set.
-_C.SOLVER.IMS_PER_BATCH = 16
-
-# The reference number of workers (GPUs) this config is meant to train with.
-# It has no effect when set to 0.
-# With a non-zero value, it will be used by DefaultTrainer to compute a desired
-# per-worker batch size, and then scale the other related configs (total batch size,
-# learning rate, etc) to match the per-worker batch size.
-# See documentation of `DefaultTrainer.auto_scale_workers` for details:
-_C.SOLVER.REFERENCE_WORLD_SIZE = 0
-
-# Detectron v1 (and previous detection code) used a 2x higher LR and 0 WD for
-# biases. This is not useful (at least for recent models). You should avoid
-# changing these and they exist only to reproduce Detectron v1 training if
-# desired.
-_C.SOLVER.BIAS_LR_FACTOR = 1.0
-_C.SOLVER.WEIGHT_DECAY_BIAS = None  # None means following WEIGHT_DECAY
-
-# Gradient clipping
-_C.SOLVER.CLIP_GRADIENTS = CN({"ENABLED": False})
-# Type of gradient clipping, currently 2 values are supported:
-# - "value": the absolute values of elements of each gradient are clipped
-# - "norm": the norm of the gradient for each parameter is clipped thus
-#   affecting all elements in the parameter
-_C.SOLVER.CLIP_GRADIENTS.CLIP_TYPE = "value"
-# Maximum absolute value used for clipping gradients
-_C.SOLVER.CLIP_GRADIENTS.CLIP_VALUE = 1.0
-# Floating point number p for L-p norm to be used with the "norm"
-# gradient clipping type; for L-inf, please specify .inf
-_C.SOLVER.CLIP_GRADIENTS.NORM_TYPE = 2.0
-
-# Enable automatic mixed precision for training
-# Note that this does not change model's inference behavior.
-# To use AMP in inference, run inference under autocast()
-_C.SOLVER.AMP = CN({"ENABLED": False})
-
-# ---------------------------------------------------------------------------- #
-# Specific test options
-# ---------------------------------------------------------------------------- #
-_C.TEST = CN()
-# For end-to-end tests to verify the expected accuracy.
-# Each item is [task, metric, value, tolerance]
-# e.g.: [['bbox', 'AP', 38.5, 0.2]]
-_C.TEST.EXPECTED_RESULTS = []
-# The period (in terms of steps) to evaluate the model during training.
-# Set to 0 to disable.
-_C.TEST.EVAL_PERIOD = 0
-# The sigmas used to calculate keypoint OKS. See http://cocodataset.org/#keypoints-eval
-# When empty, it will use the defaults in COCO.
-# Otherwise it should be a list[float] with the same length as ROI_KEYPOINT_HEAD.NUM_KEYPOINTS.
-_C.TEST.KEYPOINT_OKS_SIGMAS = []
-# Maximum number of detections to return per image during inference (100 is
-# based on the limit established for the COCO dataset).
-_C.TEST.DETECTIONS_PER_IMAGE = 100 - -_C.TEST.AUG = CN({"ENABLED": False}) -_C.TEST.AUG.MIN_SIZES = (400, 500, 600, 700, 800, 900, 1000, 1100, 1200) -_C.TEST.AUG.MAX_SIZE = 4000 -_C.TEST.AUG.FLIP = True - -_C.TEST.PRECISE_BN = CN({"ENABLED": False}) -_C.TEST.PRECISE_BN.NUM_ITER = 200 - -# ---------------------------------------------------------------------------- # -# Misc options -# ---------------------------------------------------------------------------- # -# Directory where output files are written -_C.OUTPUT_DIR = "./output" -# Set seed to negative to fully randomize everything. -# Set seed to positive to use a fixed seed. Note that a fixed seed increases -# reproducibility but does not guarantee fully deterministic behavior. -# Disabling all parallelism further increases reproducibility. -_C.SEED = -1 -# Benchmark different cudnn algorithms. -# If input images have very different sizes, this option will have large overhead -# for about 10k iterations. It usually hurts total time, but can benefit for certain models. -# If input images have the same or similar sizes, benchmark is often helpful. -_C.CUDNN_BENCHMARK = False -# The period (in terms of steps) for minibatch visualization at train time. -# Set to 0 to disable. -_C.VIS_PERIOD = 0 - -# global config is for quick hack purposes. -# You can set them in command line or config files, -# and access it with: -# -# from detectron2.config import global_cfg -# print(global_cfg.HACK) -# -# Do not commit any configs into it. -_C.GLOBAL = CN() -_C.GLOBAL.HACK = 1.0 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/instantiate.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/instantiate.py deleted file mode 100755 index cbb32e19..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/instantiate.py +++ /dev/null @@ -1,82 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import dataclasses -import logging -from collections import abc -from typing import Any - -from detectron2.utils.registry import _convert_target_to_string, locate - -__all__ = ["dump_dataclass", "instantiate"] - - -def dump_dataclass(obj: Any): - """ - Dump a dataclass recursively into a dict that can be later instantiated. - - Args: - obj: a dataclass object - - Returns: - dict - """ - assert dataclasses.is_dataclass(obj) and not isinstance( - obj, type - ), "dump_dataclass() requires an instance of a dataclass." - ret = {"_target_": _convert_target_to_string(type(obj))} - for f in dataclasses.fields(obj): - v = getattr(obj, f.name) - if dataclasses.is_dataclass(v): - v = dump_dataclass(v) - if isinstance(v, (list, tuple)): - v = [dump_dataclass(x) if dataclasses.is_dataclass(x) else x for x in v] - ret[f.name] = v - return ret - - -def instantiate(cfg): - """ - Recursively instantiate objects defined in dictionaries by - "_target_" and arguments. 
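-
-    A minimal usage sketch (the target below is chosen only for illustration;
-    any locatable callable works):
-    ::
-        cfg = {"_target_": "collections.OrderedDict", "a": 1, "b": 2}
-        obj = instantiate(cfg)  # equivalent to collections.OrderedDict(a=1, b=2)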
-
-    Args:
-        cfg: a dict-like object with "_target_" that defines the caller, and
-            other keys that define the arguments
-
-    Returns:
-        object instantiated by cfg
-    """
-    from omegaconf import ListConfig
-
-    if isinstance(cfg, ListConfig):
-        lst = [instantiate(x) for x in cfg]
-        return ListConfig(lst, flags={"allow_objects": True})
-    if isinstance(cfg, list):
-        # Specialize for list, because many classes take
-        # list[objects] as arguments, such as ResNet, DatasetMapper
-        return [instantiate(x) for x in cfg]
-
-    if isinstance(cfg, abc.Mapping) and "_target_" in cfg:
-        # conceptually equivalent to hydra.utils.instantiate(cfg) with _convert_=all,
-        # but faster: https://github.com/facebookresearch/hydra/issues/1200
-        cfg = {k: instantiate(v) for k, v in cfg.items()}
-        cls = cfg.pop("_target_")
-        cls = instantiate(cls)
-
-        if isinstance(cls, str):
-            cls_name = cls
-            cls = locate(cls_name)
-            assert cls is not None, cls_name
-        else:
-            try:
-                cls_name = cls.__module__ + "." + cls.__qualname__
-            except Exception:
-                # target could be anything, so the above could fail
-                cls_name = str(cls)
-        assert callable(cls), f"_target_ {cls} does not define a callable object"
-        try:
-            return cls(**cfg)
-        except TypeError:
-            logger = logging.getLogger(__name__)
-            logger.error(f"Error when instantiating {cls_name}!")
-            raise
-    return cfg  # return as-is if we don't know what to do
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/lazy.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/lazy.py
deleted file mode 100755
index fa5d86b4..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/config/lazy.py
+++ /dev/null
@@ -1,399 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import ast
-import builtins
-import importlib
-import inspect
-import logging
-import os
-import uuid
-from collections import abc
-from contextlib import contextmanager
-from copy import deepcopy
-from dataclasses import is_dataclass
-from typing import List, Tuple, Union
-import cloudpickle
-import yaml
-from omegaconf import DictConfig, ListConfig, OmegaConf
-
-from detectron2.utils.file_io import PathManager
-from detectron2.utils.registry import _convert_target_to_string
-
-__all__ = ["LazyCall", "LazyConfig"]
-
-
-class LazyCall:
-    """
-    Wrap a callable so that when it's called, the call will not be executed,
-    but returns a dict that describes the call.
-
-    LazyCall object has to be called with only keyword arguments. Positional
-    arguments are not yet supported.
-
-    Examples:
-    ::
-        from detectron2.config import instantiate, LazyCall
-
-        layer_cfg = LazyCall(nn.Conv2d)(in_channels=32, out_channels=32)
-        layer_cfg.out_channels = 64    # can edit it afterwards
-        layer = instantiate(layer_cfg)
-    """
-
-    def __init__(self, target):
-        if not (callable(target) or isinstance(target, (str, abc.Mapping))):
-            raise TypeError(
-                f"target of LazyCall must be a callable or define a callable! Got {target}"
-            )
-        self._target = target
-
-    def __call__(self, **kwargs):
-        if is_dataclass(self._target):
-            # omegaconf object cannot hold dataclass type
-            # https://github.com/omry/omegaconf/issues/784
-            target = _convert_target_to_string(self._target)
-        else:
-            target = self._target
-        kwargs["_target_"] = target
-
-        return DictConfig(content=kwargs, flags={"allow_objects": True})
-
-
-def _visit_dict_config(cfg, func):
-    """
-    Apply func recursively to all DictConfig in cfg.
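-    For example, :meth:`LazyConfig.save` below uses this to rewrite callable
-    ``_target_`` values into strings before dumping the config to yaml.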
-    """
-    if isinstance(cfg, DictConfig):
-        func(cfg)
-        for v in cfg.values():
-            _visit_dict_config(v, func)
-    elif isinstance(cfg, ListConfig):
-        for v in cfg:
-            _visit_dict_config(v, func)
-
-
-def _validate_py_syntax(filename):
-    # see also https://github.com/open-mmlab/mmcv/blob/master/mmcv/utils/config.py
-    with PathManager.open(filename, "r") as f:
-        content = f.read()
-    try:
-        ast.parse(content)
-    except SyntaxError as e:
-        raise SyntaxError(f"Config file {filename} has syntax error!") from e
-
-
-def _cast_to_config(obj):
-    # if given a dict, return DictConfig instead
-    if isinstance(obj, dict):
-        return DictConfig(obj, flags={"allow_objects": True})
-    return obj
-
-
-_CFG_PACKAGE_NAME = "detectron2._cfg_loader"
-"""
-A namespace to put all imported config into.
-"""
-
-
-def _random_package_name(filename):
-    # generate a random package name when loading config files
-    return _CFG_PACKAGE_NAME + str(uuid.uuid4())[:4] + "." + os.path.basename(filename)
-
-
-@contextmanager
-def _patch_import():
-    """
-    Enhance relative import statements in config files, so that they:
-    1. locate files purely based on relative location, regardless of packages.
-       e.g. you can import file without having __init__
-    2. do not cache modules globally; modifications of module state have no side effects
-    3. support other storage systems through PathManager
-    4. imported dicts are turned into omegaconf.DictConfig automatically
-    """
-    old_import = builtins.__import__
-
-    def find_relative_file(original_file, relative_import_path, level):
-        cur_file = os.path.dirname(original_file)
-        for _ in range(level - 1):
-            cur_file = os.path.dirname(cur_file)
-        cur_name = relative_import_path.lstrip(".")
-        for part in cur_name.split("."):
-            cur_file = os.path.join(cur_file, part)
-        # NOTE: directory import is not handled. Because then it's unclear
-        # if such import should produce python module or DictConfig. This can
-        # be discussed further if needed.
-        if not cur_file.endswith(".py"):
-            cur_file += ".py"
-        if not PathManager.isfile(cur_file):
-            raise ImportError(
-                f"Cannot import name {relative_import_path} from "
-                f"{original_file}: {cur_file} has to exist."
-            )
-        return cur_file
-
-    def new_import(name, globals=None, locals=None, fromlist=(), level=0):
-        if (
-            # Only deal with relative imports inside config files
-            level != 0
-            and globals is not None
-            and (globals.get("__package__", "") or "").startswith(_CFG_PACKAGE_NAME)
-        ):
-            cur_file = find_relative_file(globals["__file__"], name, level)
-            _validate_py_syntax(cur_file)
-            spec = importlib.machinery.ModuleSpec(
-                _random_package_name(cur_file), None, origin=cur_file
-            )
-            module = importlib.util.module_from_spec(spec)
-            module.__file__ = cur_file
-            with PathManager.open(cur_file) as f:
-                content = f.read()
-            exec(compile(content, cur_file, "exec"), module.__dict__)
-            for name in fromlist:  # turn imported dict into DictConfig automatically
-                val = _cast_to_config(module.__dict__[name])
-                module.__dict__[name] = val
-            return module
-        return old_import(name, globals, locals, fromlist=fromlist, level=level)
-
-    builtins.__import__ = new_import
-    yield new_import
-    builtins.__import__ = old_import
-
-
-class LazyConfig:
-    """
-    Provide methods to save, load, and override an omegaconf config object
-    which may contain definitions of lazily-constructed objects.
-    """
-
-    @staticmethod
-    def load_rel(filename: str, keys: Union[None, str, Tuple[str, ...]] = None):
-        """
-        Similar to :meth:`load()`, but load path relative to the caller's
-        source file.
- - This has the same functionality as a relative import, except that this method - accepts filename as a string, so more characters are allowed in the filename. - """ - caller_frame = inspect.stack()[1] - caller_fname = caller_frame[0].f_code.co_filename - assert caller_fname != "", "load_rel Unable to find caller" - caller_dir = os.path.dirname(caller_fname) - filename = os.path.join(caller_dir, filename) - return LazyConfig.load(filename, keys) - - @staticmethod - def load(filename: str, keys: Union[None, str, Tuple[str, ...]] = None): - """ - Load a config file. - - Args: - filename: absolute path or relative path w.r.t. the current working directory - keys: keys to load and return. If not given, return all keys - (whose values are config objects) in a dict. - """ - has_keys = keys is not None - filename = filename.replace("/./", "/") # redundant - if os.path.splitext(filename)[1] not in [".py", ".yaml", ".yml"]: - raise ValueError(f"Config file {filename} has to be a python or yaml file.") - if filename.endswith(".py"): - _validate_py_syntax(filename) - - with _patch_import(): - # Record the filename - module_namespace = { - "__file__": filename, - "__package__": _random_package_name(filename), - } - with PathManager.open(filename) as f: - content = f.read() - # Compile first with filename to: - # 1. make filename appears in stacktrace - # 2. make load_rel able to find its parent's (possibly remote) location - exec(compile(content, filename, "exec"), module_namespace) - - ret = module_namespace - else: - with PathManager.open(filename) as f: - obj = yaml.unsafe_load(f) - ret = OmegaConf.create(obj, flags={"allow_objects": True}) - - if has_keys: - if isinstance(keys, str): - return _cast_to_config(ret[keys]) - else: - return tuple(_cast_to_config(ret[a]) for a in keys) - else: - if filename.endswith(".py"): - # when not specified, only load those that are config objects - ret = DictConfig( - { - name: _cast_to_config(value) - for name, value in ret.items() - if isinstance(value, (DictConfig, ListConfig, dict)) - and not name.startswith("_") - }, - flags={"allow_objects": True}, - ) - return ret - - @staticmethod - def save(cfg, filename: str): - """ - Save a config object to a yaml file. - Note that when the config dictionary contains complex objects (e.g. lambda), - it can't be saved to yaml. In that case we will print an error and - attempt to save to a pkl file instead. - - Args: - cfg: an omegaconf config object - filename: yaml file name to save the config file - """ - logger = logging.getLogger(__name__) - try: - cfg = deepcopy(cfg) - except Exception: - pass - else: - # if it's deep-copyable, then... - def _replace_type_by_name(x): - if "_target_" in x and callable(x._target_): - try: - x._target_ = _convert_target_to_string(x._target_) - except AttributeError: - pass - - # not necessary, but makes yaml looks nicer - _visit_dict_config(cfg, _replace_type_by_name) - - save_pkl = False - try: - dict = OmegaConf.to_container(cfg, resolve=False) - dumped = yaml.dump(dict, default_flow_style=None, allow_unicode=True, width=9999) - with PathManager.open(filename, "w") as f: - f.write(dumped) - - try: - _ = yaml.unsafe_load(dumped) # test that it is loadable - except Exception: - logger.warning( - "The config contains objects that cannot serialize to a valid yaml. " - f"{filename} is human-readable but cannot be loaded." - ) - save_pkl = True - except Exception: - logger.exception("Unable to serialize the config to yaml. 
Error:") - save_pkl = True - - if save_pkl: - new_filename = filename + ".pkl" - try: - # retry by pickle - with PathManager.open(new_filename, "wb") as f: - cloudpickle.dump(cfg, f) - logger.warning(f"Config is saved using cloudpickle at {new_filename}.") - except Exception: - pass - - @staticmethod - def apply_overrides(cfg, overrides: List[str]): - """ - In-place override contents of cfg. - - Args: - cfg: an omegaconf config object - overrides: list of strings in the format of "a=b" to override configs. - See https://hydra.cc/docs/next/advanced/override_grammar/basic/ - for syntax. - - Returns: - the cfg object - """ - - def safe_update(cfg, key, value): - parts = key.split(".") - for idx in range(1, len(parts)): - prefix = ".".join(parts[:idx]) - v = OmegaConf.select(cfg, prefix, default=None) - if v is None: - break - if not OmegaConf.is_config(v): - raise KeyError( - f"Trying to update key {key}, but {prefix} " - f"is not a config, but has type {type(v)}." - ) - OmegaConf.update(cfg, key, value, merge=True) - - from hydra.core.override_parser.overrides_parser import OverridesParser - - parser = OverridesParser.create() - overrides = parser.parse_overrides(overrides) - for o in overrides: - key = o.key_or_group - value = o.value() - if o.is_delete(): - # TODO support this - raise NotImplementedError("deletion is not yet a supported override") - safe_update(cfg, key, value) - return cfg - - @staticmethod - def to_py(cfg, prefix: str = "cfg."): - """ - Try to convert a config object into Python-like psuedo code. - - Note that perfect conversion is not always possible. So the returned - results are mainly meant to be human-readable, and not meant to be executed. - - Args: - cfg: an omegaconf config object - prefix: root name for the resulting code (default: "cfg.") - - - Returns: - str of formatted Python code - """ - import black - - cfg = OmegaConf.to_container(cfg, resolve=True) - - def _to_str(obj, prefix=None, inside_call=False): - if prefix is None: - prefix = [] - if isinstance(obj, abc.Mapping) and "_target_" in obj: - # Dict representing a function call - target = _convert_target_to_string(obj.pop("_target_")) - args = [] - for k, v in sorted(obj.items()): - args.append(f"{k}={_to_str(v, inside_call=True)}") - args = ", ".join(args) - call = f"{target}({args})" - return "".join(prefix) + call - elif isinstance(obj, abc.Mapping) and not inside_call: - # Dict that is not inside a call is a list of top-level config objects that we - # render as one object per line with dot separated prefixes - key_list = [] - for k, v in sorted(obj.items()): - if isinstance(v, abc.Mapping) and "_target_" not in v: - key_list.append(_to_str(v, prefix=prefix + [k + "."])) - else: - key = "".join(prefix) + k - key_list.append(f"{key}={_to_str(v)}") - return "\n".join(key_list) - elif isinstance(obj, abc.Mapping): - # Dict that is inside a call is rendered as a regular dict - return ( - "{" - + ",".join( - f"{repr(k)}: {_to_str(v, inside_call=inside_call)}" - for k, v in sorted(obj.items()) - ) - + "}" - ) - elif isinstance(obj, list): - return "[" + ",".join(_to_str(x, inside_call=inside_call) for x in obj) + "]" - else: - return repr(obj) - - py_str = _to_str(cfg, prefix=[prefix]) - try: - return black.format_str(py_str, mode=black.Mode()) - except black.InvalidInput: - return py_str diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/__init__.py deleted file mode 100755 
index 259f669b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/__init__.py +++ /dev/null @@ -1,19 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from . import transforms # isort:skip - -from .build import ( - build_batch_data_loader, - build_detection_test_loader, - build_detection_train_loader, - get_detection_dataset_dicts, - load_proposals_into_dataset, - print_instances_class_histogram, -) -from .catalog import DatasetCatalog, MetadataCatalog, Metadata -from .common import DatasetFromList, MapDataset, ToIterableDataset -from .dataset_mapper import DatasetMapper - -# ensure the builtin datasets are registered -from . import datasets, samplers # isort:skip - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/benchmark.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/benchmark.py deleted file mode 100755 index ac2f372a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/benchmark.py +++ /dev/null @@ -1,225 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -from itertools import count -from typing import List, Tuple -import torch -import tqdm -from fvcore.common.timer import Timer - -from detectron2.utils import comm - -from .build import build_batch_data_loader -from .common import DatasetFromList, MapDataset -from .samplers import TrainingSampler - -logger = logging.getLogger(__name__) - - -class _EmptyMapDataset(torch.utils.data.Dataset): - """ - Map anything to emptiness. - """ - - def __init__(self, dataset): - self.ds = dataset - - def __len__(self): - return len(self.ds) - - def __getitem__(self, idx): - _ = self.ds[idx] - return [0] - - -def iter_benchmark( - iterator, num_iter: int, warmup: int = 5, max_time_seconds: float = 60 -) -> Tuple[float, List[float]]: - """ - Benchmark an iterator/iterable for `num_iter` iterations with an extra - `warmup` iterations of warmup. - End early if `max_time_seconds` time is spent on iterations. - - Returns: - float: average time (seconds) per iteration - list[float]: time spent on each iteration. Sometimes useful for further analysis. - """ - num_iter, warmup = int(num_iter), int(warmup) - - iterator = iter(iterator) - for _ in range(warmup): - next(iterator) - timer = Timer() - all_times = [] - for curr_iter in tqdm.trange(num_iter): - start = timer.seconds() - if start > max_time_seconds: - num_iter = curr_iter - break - next(iterator) - all_times.append(timer.seconds() - start) - avg = timer.seconds() / num_iter - return avg, all_times - - -class DataLoaderBenchmark: - """ - Some common benchmarks that help understand perf bottleneck of a standard dataloader - made of dataset, mapper and sampler. 
- """ - - def __init__( - self, - dataset, - *, - mapper, - sampler=None, - total_batch_size, - num_workers=0, - max_time_seconds: int = 90, - ): - """ - Args: - max_time_seconds (int): maximum time to spent for each benchmark - other args: same as in `build.py:build_detection_train_loader` - """ - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False, serialize=True) - if sampler is None: - sampler = TrainingSampler(len(dataset)) - - self.dataset = dataset - self.mapper = mapper - self.sampler = sampler - self.total_batch_size = total_batch_size - self.num_workers = num_workers - self.per_gpu_batch_size = self.total_batch_size // comm.get_world_size() - - self.max_time_seconds = max_time_seconds - - def _benchmark(self, iterator, num_iter, warmup, msg=None): - avg, all_times = iter_benchmark(iterator, num_iter, warmup, self.max_time_seconds) - if msg is not None: - self._log_time(msg, avg, all_times) - return avg, all_times - - def _log_time(self, msg, avg, all_times, distributed=False): - percentiles = [np.percentile(all_times, k, interpolation="nearest") for k in [1, 5, 95, 99]] - if not distributed: - logger.info( - f"{msg}: avg={1.0/avg:.1f} it/s, " - f"p1={percentiles[0]:.2g}s, p5={percentiles[1]:.2g}s, " - f"p95={percentiles[2]:.2g}s, p99={percentiles[3]:.2g}s." - ) - return - avg_per_gpu = comm.all_gather(avg) - percentiles_per_gpu = comm.all_gather(percentiles) - if comm.get_rank() > 0: - return - for idx, avg, percentiles in zip(count(), avg_per_gpu, percentiles_per_gpu): - logger.info( - f"GPU{idx} {msg}: avg={1.0/avg:.1f} it/s, " - f"p1={percentiles[0]:.2g}s, p5={percentiles[1]:.2g}s, " - f"p95={percentiles[2]:.2g}s, p99={percentiles[3]:.2g}s." - ) - - def benchmark_dataset(self, num_iter, warmup=5): - """ - Benchmark the speed of taking raw samples from the dataset. - """ - - def loader(): - while True: - for k in self.sampler: - yield self.dataset[k] - - self._benchmark(loader(), num_iter, warmup, "Dataset Alone") - - def benchmark_mapper(self, num_iter, warmup=5): - """ - Benchmark the speed of taking raw samples from the dataset and map - them in a single process. - """ - - def loader(): - while True: - for k in self.sampler: - yield self.mapper(self.dataset[k]) - - self._benchmark(loader(), num_iter, warmup, "Single Process Mapper (sec/sample)") - - def benchmark_workers(self, num_iter, warmup=10): - """ - Benchmark the dataloader by tuning num_workers to [0, 1, self.num_workers]. - """ - candidates = [0, 1] - if self.num_workers not in candidates: - candidates.append(self.num_workers) - - dataset = MapDataset(self.dataset, self.mapper) - for n in candidates: - loader = build_batch_data_loader( - dataset, - self.sampler, - self.total_batch_size, - num_workers=n, - ) - self._benchmark( - iter(loader), - num_iter * max(n, 1), - warmup * max(n, 1), - f"DataLoader ({n} workers, bs={self.per_gpu_batch_size})", - ) - del loader - - def benchmark_IPC(self, num_iter, warmup=10): - """ - Benchmark the dataloader where each worker outputs nothing. This - eliminates the IPC overhead compared to the regular dataloader. - - PyTorch multiprocessing's IPC only optimizes for torch tensors. - Large numpy arrays or other data structure may incur large IPC overhead. 
- """ - n = self.num_workers - dataset = _EmptyMapDataset(MapDataset(self.dataset, self.mapper)) - loader = build_batch_data_loader( - dataset, self.sampler, self.total_batch_size, num_workers=n - ) - self._benchmark( - iter(loader), - num_iter * max(n, 1), - warmup * max(n, 1), - f"DataLoader ({n} workers, bs={self.per_gpu_batch_size}) w/o comm", - ) - - def benchmark_distributed(self, num_iter, warmup=10): - """ - Benchmark the dataloader in each distributed worker, and log results of - all workers. This helps understand the final performance as well as - the variances among workers. - - It also prints startup time (first iter) of the dataloader. - """ - gpu = comm.get_world_size() - dataset = MapDataset(self.dataset, self.mapper) - n = self.num_workers - loader = build_batch_data_loader( - dataset, self.sampler, self.total_batch_size, num_workers=n - ) - - timer = Timer() - loader = iter(loader) - next(loader) - startup_time = timer.seconds() - logger.info("Dataloader startup time: {:.2f} seconds".format(startup_time)) - - comm.synchronize() - - avg, all_times = self._benchmark(loader, num_iter * max(n, 1), warmup * max(n, 1)) - del loader - self._log_time( - f"DataLoader ({gpu} GPUs x {n} workers, total bs={self.total_batch_size})", - avg, - all_times, - True, - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/build.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/build.py deleted file mode 100755 index a31369d1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/build.py +++ /dev/null @@ -1,542 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import logging -import numpy as np -import operator -import pickle -from typing import Any, Callable, Dict, List, Optional, Union -import torch -import torch.utils.data as torchdata -from tabulate import tabulate -from termcolor import colored - -from detectron2.config import configurable -from detectron2.structures import BoxMode -from detectron2.utils.comm import get_world_size -from detectron2.utils.env import seed_all_rng -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import _log_api_usage, log_first_n - -from .catalog import DatasetCatalog, MetadataCatalog -from .common import AspectRatioGroupedDataset, DatasetFromList, MapDataset, ToIterableDataset -from .dataset_mapper import DatasetMapper -from .detection_utils import check_metadata_consistency -from .samplers import ( - InferenceSampler, - RandomSubsetTrainingSampler, - RepeatFactorTrainingSampler, - TrainingSampler, -) - -""" -This file contains the default logic to build a dataloader for training or testing. -""" - -__all__ = [ - "build_batch_data_loader", - "build_detection_train_loader", - "build_detection_test_loader", - "get_detection_dataset_dicts", - "load_proposals_into_dataset", - "print_instances_class_histogram", -] - - -def filter_images_with_only_crowd_annotations(dataset_dicts): - """ - Filter out images with none annotations or only crowd annotations - (i.e., images without non-crowd annotations). - A common training-time preprocessing on COCO dataset. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - - Returns: - list[dict]: the same format, but filtered. 
- """ - num_before = len(dataset_dicts) - - def valid(anns): - for ann in anns: - if ann.get("iscrowd", 0) == 0: - return True - return False - - dataset_dicts = [x for x in dataset_dicts if valid(x["annotations"])] - num_after = len(dataset_dicts) - logger = logging.getLogger(__name__) - logger.info( - "Removed {} images with no usable annotations. {} images left.".format( - num_before - num_after, num_after - ) - ) - return dataset_dicts - - -def filter_images_with_few_keypoints(dataset_dicts, min_keypoints_per_image): - """ - Filter out images with too few number of keypoints. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - - Returns: - list[dict]: the same format as dataset_dicts, but filtered. - """ - num_before = len(dataset_dicts) - - def visible_keypoints_in_image(dic): - # Each keypoints field has the format [x1, y1, v1, ...], where v is visibility - annotations = dic["annotations"] - return sum( - (np.array(ann["keypoints"][2::3]) > 0).sum() - for ann in annotations - if "keypoints" in ann - ) - - dataset_dicts = [ - x for x in dataset_dicts if visible_keypoints_in_image(x) >= min_keypoints_per_image - ] - num_after = len(dataset_dicts) - logger = logging.getLogger(__name__) - logger.info( - "Removed {} images with fewer than {} keypoints.".format( - num_before - num_after, min_keypoints_per_image - ) - ) - return dataset_dicts - - -def load_proposals_into_dataset(dataset_dicts, proposal_file): - """ - Load precomputed object proposals into the dataset. - - The proposal file should be a pickled dict with the following keys: - - - "ids": list[int] or list[str], the image ids - - "boxes": list[np.ndarray], each is an Nx4 array of boxes corresponding to the image id - - "objectness_logits": list[np.ndarray], each is an N sized array of objectness scores - corresponding to the boxes. - - "bbox_mode": the BoxMode of the boxes array. Defaults to ``BoxMode.XYXY_ABS``. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - proposal_file (str): file path of pre-computed proposals, in pkl format. - - Returns: - list[dict]: the same format as dataset_dicts, but added proposal field. - """ - logger = logging.getLogger(__name__) - logger.info("Loading proposals from: {}".format(proposal_file)) - - with PathManager.open(proposal_file, "rb") as f: - proposals = pickle.load(f, encoding="latin1") - - # Rename the key names in D1 proposal files - rename_keys = {"indexes": "ids", "scores": "objectness_logits"} - for key in rename_keys: - if key in proposals: - proposals[rename_keys[key]] = proposals.pop(key) - - # Fetch the indexes of all proposals that are in the dataset - # Convert image_id to str since they could be int. 
- img_ids = set({str(record["image_id"]) for record in dataset_dicts}) - id_to_index = {str(id): i for i, id in enumerate(proposals["ids"]) if str(id) in img_ids} - - # Assuming default bbox_mode of precomputed proposals are 'XYXY_ABS' - bbox_mode = BoxMode(proposals["bbox_mode"]) if "bbox_mode" in proposals else BoxMode.XYXY_ABS - - for record in dataset_dicts: - # Get the index of the proposal - i = id_to_index[str(record["image_id"])] - - boxes = proposals["boxes"][i] - objectness_logits = proposals["objectness_logits"][i] - # Sort the proposals in descending order of the scores - inds = objectness_logits.argsort()[::-1] - record["proposal_boxes"] = boxes[inds] - record["proposal_objectness_logits"] = objectness_logits[inds] - record["proposal_bbox_mode"] = bbox_mode - - return dataset_dicts - - -def print_instances_class_histogram(dataset_dicts, class_names): - """ - Args: - dataset_dicts (list[dict]): list of dataset dicts. - class_names (list[str]): list of class names (zero-indexed). - """ - num_classes = len(class_names) - hist_bins = np.arange(num_classes + 1) - histogram = np.zeros((num_classes,), dtype=np.int) - for entry in dataset_dicts: - annos = entry["annotations"] - classes = np.asarray( - [x["category_id"] for x in annos if not x.get("iscrowd", 0)], dtype=np.int - ) - if len(classes): - assert classes.min() >= 0, f"Got an invalid category_id={classes.min()}" - assert ( - classes.max() < num_classes - ), f"Got an invalid category_id={classes.max()} for a dataset of {num_classes} classes" - histogram += np.histogram(classes, bins=hist_bins)[0] - - N_COLS = min(6, len(class_names) * 2) - - def short_name(x): - # make long class names shorter. useful for lvis - if len(x) > 13: - return x[:11] + ".." - return x - - data = list( - itertools.chain(*[[short_name(class_names[i]), int(v)] for i, v in enumerate(histogram)]) - ) - total_num_instances = sum(data[1::2]) - data.extend([None] * (N_COLS - (len(data) % N_COLS))) - if num_classes > 1: - data.extend(["total", total_num_instances]) - data = itertools.zip_longest(*[data[i::N_COLS] for i in range(N_COLS)]) - table = tabulate( - data, - headers=["category", "#instances"] * (N_COLS // 2), - tablefmt="pipe", - numalign="left", - stralign="center", - ) - log_first_n( - logging.INFO, - "Distribution of instances among all {} categories:\n".format(num_classes) - + colored(table, "cyan"), - key="message", - ) - - -def get_detection_dataset_dicts( - names, - filter_empty=True, - min_keypoints=0, - proposal_files=None, - check_consistency=True, -): - """ - Load and prepare dataset dicts for instance detection/segmentation and semantic segmentation. - - Args: - names (str or list[str]): a dataset name or a list of dataset names - filter_empty (bool): whether to filter out images without instance annotations - min_keypoints (int): filter out images with fewer keypoints than - `min_keypoints`. Set to 0 to do nothing. - proposal_files (list[str]): if given, a list of object proposal files - that match each dataset in `names`. - check_consistency (bool): whether to check if datasets have consistent metadata. - - Returns: - list[dict]: a list of dicts following the standard dataset dict format. 
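For reference, a proposal file in the format `load_proposals_into_dataset` expects can be produced as below; the ids and boxes are made-up values, and `bbox_mode` is omitted so it defaults to `BoxMode.XYXY_ABS`:

```
import pickle
import numpy as np

proposals = {
    "ids": [42, 43],                         # image ids matching dataset_dicts' "image_id"
    "boxes": [                               # one Nx4 array of XYXY boxes per image
        np.array([[0, 0, 50, 50], [10, 10, 80, 80]], dtype=np.float32),
        np.array([[5, 5, 60, 40]], dtype=np.float32),
    ],
    "objectness_logits": [                   # one length-N score array per image
        np.array([0.9, 0.3], dtype=np.float32),
        np.array([0.7], dtype=np.float32),
    ],
}
with open("proposals.pkl", "wb") as f:
    pickle.dump(proposals, f)
```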
- """ - if isinstance(names, str): - names = [names] - assert len(names), names - dataset_dicts = [DatasetCatalog.get(dataset_name) for dataset_name in names] - for dataset_name, dicts in zip(names, dataset_dicts): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - - if proposal_files is not None: - assert len(names) == len(proposal_files) - # load precomputed proposals from proposal files - dataset_dicts = [ - load_proposals_into_dataset(dataset_i_dicts, proposal_file) - for dataset_i_dicts, proposal_file in zip(dataset_dicts, proposal_files) - ] - - if isinstance(dataset_dicts[0], torchdata.Dataset): - return torchdata.ConcatDataset(dataset_dicts) - - dataset_dicts = list(itertools.chain.from_iterable(dataset_dicts)) - - has_instances = "annotations" in dataset_dicts[0] - if filter_empty and has_instances: - dataset_dicts = filter_images_with_only_crowd_annotations(dataset_dicts) - if min_keypoints > 0 and has_instances: - dataset_dicts = filter_images_with_few_keypoints(dataset_dicts, min_keypoints) - - if check_consistency and has_instances: - try: - class_names = MetadataCatalog.get(names[0]).thing_classes - check_metadata_consistency("thing_classes", names) - print_instances_class_histogram(dataset_dicts, class_names) - except AttributeError: # class names are not available for this dataset - pass - - assert len(dataset_dicts), "No valid data found in {}.".format(",".join(names)) - return dataset_dicts - - -def build_batch_data_loader( - dataset, - sampler, - total_batch_size, - *, - aspect_ratio_grouping=False, - num_workers=0, - collate_fn=None, -): - """ - Build a batched dataloader. The main differences from `torch.utils.data.DataLoader` are: - 1. support aspect ratio grouping options - 2. use no "batch collation", because this is common for detection training - - Args: - dataset (torch.utils.data.Dataset): a pytorch map-style or iterable dataset. - sampler (torch.utils.data.sampler.Sampler or None): a sampler that produces indices. - Must be provided iff. ``dataset`` is a map-style dataset. - total_batch_size, aspect_ratio_grouping, num_workers, collate_fn: see - :func:`build_detection_train_loader`. - - Returns: - iterable[list]. Length of each list is the batch size of the current - GPU. Each element in the list comes from the dataset. 
- """ - world_size = get_world_size() - assert ( - total_batch_size > 0 and total_batch_size % world_size == 0 - ), "Total batch size ({}) must be divisible by the number of gpus ({}).".format( - total_batch_size, world_size - ) - batch_size = total_batch_size // world_size - - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - dataset = ToIterableDataset(dataset, sampler) - - if aspect_ratio_grouping: - data_loader = torchdata.DataLoader( - dataset, - num_workers=num_workers, - collate_fn=operator.itemgetter(0), # don't batch, but yield individual elements - worker_init_fn=worker_init_reset_seed, - ) # yield individual mapped dict - data_loader = AspectRatioGroupedDataset(data_loader, batch_size) - if collate_fn is None: - return data_loader - return MapDataset(data_loader, collate_fn) - else: - return torchdata.DataLoader( - dataset, - batch_size=batch_size, - drop_last=True, - num_workers=num_workers, - collate_fn=trivial_batch_collator if collate_fn is None else collate_fn, - worker_init_fn=worker_init_reset_seed, - ) - - -def _train_loader_from_config(cfg, mapper=None, *, dataset=None, sampler=None): - if dataset is None: - dataset = get_detection_dataset_dicts( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - _log_api_usage("dataset." + cfg.DATASETS.TRAIN[0]) - - if mapper is None: - mapper = DatasetMapper(cfg, True) - - if sampler is None: - sampler_name = cfg.DATALOADER.SAMPLER_TRAIN - logger = logging.getLogger(__name__) - logger.info("Using training sampler {}".format(sampler_name)) - if sampler_name == "TrainingSampler": - sampler = TrainingSampler(len(dataset)) - elif sampler_name == "RepeatFactorTrainingSampler": - repeat_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency( - dataset, cfg.DATALOADER.REPEAT_THRESHOLD - ) - sampler = RepeatFactorTrainingSampler(repeat_factors) - elif sampler_name == "RandomSubsetTrainingSampler": - sampler = RandomSubsetTrainingSampler(len(dataset), cfg.DATALOADER.RANDOM_SUBSET_RATIO) - else: - raise ValueError("Unknown training sampler: {}".format(sampler_name)) - - return { - "dataset": dataset, - "sampler": sampler, - "mapper": mapper, - "total_batch_size": cfg.SOLVER.IMS_PER_BATCH, - "aspect_ratio_grouping": cfg.DATALOADER.ASPECT_RATIO_GROUPING, - "num_workers": cfg.DATALOADER.NUM_WORKERS, - } - - -@configurable(from_config=_train_loader_from_config) -def build_detection_train_loader( - dataset, - *, - mapper, - sampler=None, - total_batch_size, - aspect_ratio_grouping=True, - num_workers=0, - collate_fn=None, -): - """ - Build a dataloader for object detection with some default features. - - Args: - dataset (list or torch.utils.data.Dataset): a list of dataset dicts, - or a pytorch dataset (either map-style or iterable). It can be obtained - by using :func:`DatasetCatalog.get` or :func:`get_detection_dataset_dicts`. - mapper (callable): a callable which takes a sample (dict) from dataset and - returns the format to be consumed by the model. - When using cfg, the default choice is ``DatasetMapper(cfg, is_train=True)``. - sampler (torch.utils.data.sampler.Sampler or None): a sampler that produces - indices to be applied on ``dataset``. 
- If ``dataset`` is map-style, the default sampler is a :class:`TrainingSampler`, - which coordinates an infinite random shuffle sequence across all workers. - Sampler must be None if ``dataset`` is iterable. - total_batch_size (int): total batch size across all workers. - aspect_ratio_grouping (bool): whether to group images with similar - aspect ratio for efficiency. When enabled, it requires each - element in dataset be a dict with keys "width" and "height". - num_workers (int): number of parallel data loading workers - collate_fn: a function that determines how to do batching, same as the argument of - `torch.utils.data.DataLoader`. Defaults to do no collation and return a list of - data. No collation is OK for small batch size and simple data structures. - If your batch size is large and each sample contains too many small tensors, - it's more efficient to collate them in data loader. - - Returns: - torch.utils.data.DataLoader: - a dataloader. Each output from it is a ``list[mapped_element]`` of length - ``total_batch_size / num_workers``, where ``mapped_element`` is produced - by the ``mapper``. - """ - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False) - if mapper is not None: - dataset = MapDataset(dataset, mapper) - - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - if sampler is None: - sampler = TrainingSampler(len(dataset)) - assert isinstance(sampler, torchdata.Sampler), f"Expect a Sampler but got {type(sampler)}" - return build_batch_data_loader( - dataset, - sampler, - total_batch_size, - aspect_ratio_grouping=aspect_ratio_grouping, - num_workers=num_workers, - collate_fn=collate_fn, - ) - - -def _test_loader_from_config(cfg, dataset_name, mapper=None): - """ - Uses the given `dataset_name` argument (instead of the names in cfg), because the - standard practice is to evaluate each test set individually (not combining them). - """ - if isinstance(dataset_name, str): - dataset_name = [dataset_name] - - dataset = get_detection_dataset_dicts( - dataset_name, - filter_empty=False, - proposal_files=[ - cfg.DATASETS.PROPOSAL_FILES_TEST[list(cfg.DATASETS.TEST).index(x)] for x in dataset_name - ] - if cfg.MODEL.LOAD_PROPOSALS - else None, - ) - if mapper is None: - mapper = DatasetMapper(cfg, False) - return { - "dataset": dataset, - "mapper": mapper, - "num_workers": cfg.DATALOADER.NUM_WORKERS, - "sampler": InferenceSampler(len(dataset)), - } - - -@configurable(from_config=_test_loader_from_config) -def build_detection_test_loader( - dataset: Union[List[Any], torchdata.Dataset], - *, - mapper: Callable[[Dict[str, Any]], Any], - sampler: Optional[torchdata.Sampler] = None, - batch_size: int = 1, - num_workers: int = 0, - collate_fn: Optional[Callable[[List[Any]], Any]] = None, -) -> torchdata.DataLoader: - """ - Similar to `build_detection_train_loader`, with default batch size = 1, - and sampler = :class:`InferenceSampler`. This sampler coordinates all workers - to produce the exact set of all samples. - - Args: - dataset: a list of dataset dicts, - or a pytorch dataset (either map-style or iterable). They can be obtained - by using :func:`DatasetCatalog.get` or :func:`get_detection_dataset_dicts`. - mapper: a callable which takes a sample (dict) from dataset - and returns the format to be consumed by the model. - When using cfg, the default choice is ``DatasetMapper(cfg, is_train=False)``. - sampler: a sampler that produces - indices to be applied on ``dataset``. 
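A sketch of the explicit (non-cfg) calling convention of `build_detection_train_loader`, assuming detectron2 is installed and the COCO files are present; the batch size and worker count below are arbitrary:

```
from detectron2.config import get_cfg
from detectron2.data import DatasetCatalog, DatasetMapper, build_detection_train_loader

cfg = get_cfg()
loader = build_detection_train_loader(
    DatasetCatalog.get("coco_2017_train"),   # list of dataset dicts
    mapper=DatasetMapper(cfg, is_train=True),
    total_batch_size=16,                     # must be divisible by the number of GPUs
    num_workers=4,
)
for batch in loader:                         # infinite stream; break manually
    print(len(batch))                        # per-GPU batch size
    break
```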
Default to :class:`InferenceSampler`, - which splits the dataset across all workers. Sampler must be None - if `dataset` is iterable. - batch_size: the batch size of the data loader to be created. - Default to 1 image per worker since this is the standard when reporting - inference time in papers. - num_workers: number of parallel data loading workers - collate_fn: same as the argument of `torch.utils.data.DataLoader`. - Defaults to do no collation and return a list of data. - - Returns: - DataLoader: a torch DataLoader, that loads the given detection - dataset, with test-time transformation and batching. - - Examples: - :: - data_loader = build_detection_test_loader( - DatasetRegistry.get("my_test"), - mapper=DatasetMapper(...)) - - # or, instantiate with a CfgNode: - data_loader = build_detection_test_loader(cfg, "my_test") - """ - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False) - if mapper is not None: - dataset = MapDataset(dataset, mapper) - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - if sampler is None: - sampler = InferenceSampler(len(dataset)) - return torchdata.DataLoader( - dataset, - batch_size=batch_size, - sampler=sampler, - drop_last=False, - num_workers=num_workers, - collate_fn=trivial_batch_collator if collate_fn is None else collate_fn, - ) - - -def trivial_batch_collator(batch): - """ - A batch collator that does nothing. - """ - return batch - - -def worker_init_reset_seed(worker_id): - initial_seed = torch.initial_seed() % 2 ** 31 - seed_all_rng(initial_seed + worker_id) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/catalog.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/catalog.py deleted file mode 100755 index 45c110c1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/catalog.py +++ /dev/null @@ -1,236 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import types -from collections import UserDict -from typing import List - -from detectron2.utils.logger import log_first_n - -__all__ = ["DatasetCatalog", "MetadataCatalog", "Metadata"] - - -class _DatasetCatalog(UserDict): - """ - A global dictionary that stores information about the datasets and how to obtain them. - - It contains a mapping from strings - (which are names that identify a dataset, e.g. "coco_2014_train") - to a function which parses the dataset and returns the samples in the - format of `list[dict]`. - - The returned dicts should be in Detectron2 Dataset format (See DATASETS.md for details) - if used with the data loader functionalities in `data/build.py,data/detection_transform.py`. - - The purpose of having this catalog is to make it easy to choose - different datasets, by just using the strings in the config. - """ - - def register(self, name, func): - """ - Args: - name (str): the name that identifies a dataset, e.g. "coco_2014_train". - func (callable): a callable which takes no arguments and returns a list of dicts. - It must return the same results if called multiple times. - """ - assert callable(func), "You must register a function with `DatasetCatalog.register`!" - assert name not in self, "Dataset '{}' is already registered!".format(name) - self[name] = func - - def get(self, name): - """ - Call the registered function and return its results. - - Args: - name (str): the name that identifies a dataset, e.g. 
"coco_2014_train". - - Returns: - list[dict]: dataset annotations. - """ - try: - f = self[name] - except KeyError as e: - raise KeyError( - "Dataset '{}' is not registered! Available datasets are: {}".format( - name, ", ".join(list(self.keys())) - ) - ) from e - return f() - - def list(self) -> List[str]: - """ - List all registered datasets. - - Returns: - list[str] - """ - return list(self.keys()) - - def remove(self, name): - """ - Alias of ``pop``. - """ - self.pop(name) - - def __str__(self): - return "DatasetCatalog(registered datasets: {})".format(", ".join(self.keys())) - - __repr__ = __str__ - - -DatasetCatalog = _DatasetCatalog() -DatasetCatalog.__doc__ = ( - _DatasetCatalog.__doc__ - + """ - .. automethod:: detectron2.data.catalog.DatasetCatalog.register - .. automethod:: detectron2.data.catalog.DatasetCatalog.get -""" -) - - -class Metadata(types.SimpleNamespace): - """ - A class that supports simple attribute setter/getter. - It is intended for storing metadata of a dataset and make it accessible globally. - - Examples: - :: - # somewhere when you load the data: - MetadataCatalog.get("mydataset").thing_classes = ["person", "dog"] - - # somewhere when you print statistics or visualize: - classes = MetadataCatalog.get("mydataset").thing_classes - """ - - # the name of the dataset - # set default to N/A so that `self.name` in the errors will not trigger getattr again - name: str = "N/A" - - _RENAMED = { - "class_names": "thing_classes", - "dataset_id_to_contiguous_id": "thing_dataset_id_to_contiguous_id", - "stuff_class_names": "stuff_classes", - } - - def __getattr__(self, key): - if key in self._RENAMED: - log_first_n( - logging.WARNING, - "Metadata '{}' was renamed to '{}'!".format(key, self._RENAMED[key]), - n=10, - ) - return getattr(self, self._RENAMED[key]) - - # "name" exists in every metadata - if len(self.__dict__) > 1: - raise AttributeError( - "Attribute '{}' does not exist in the metadata of dataset '{}'. Available " - "keys are {}.".format(key, self.name, str(self.__dict__.keys())) - ) - else: - raise AttributeError( - f"Attribute '{key}' does not exist in the metadata of dataset '{self.name}': " - "metadata is empty." - ) - - def __setattr__(self, key, val): - if key in self._RENAMED: - log_first_n( - logging.WARNING, - "Metadata '{}' was renamed to '{}'!".format(key, self._RENAMED[key]), - n=10, - ) - setattr(self, self._RENAMED[key], val) - - # Ensure that metadata of the same name stays consistent - try: - oldval = getattr(self, key) - assert oldval == val, ( - "Attribute '{}' in the metadata of '{}' cannot be set " - "to a different value!\n{} != {}".format(key, self.name, oldval, val) - ) - except AttributeError: - super().__setattr__(key, val) - - def as_dict(self): - """ - Returns all the metadata as a dict. - Note that modifications to the returned dict will not reflect on the Metadata object. - """ - return copy.copy(self.__dict__) - - def set(self, **kwargs): - """ - Set multiple metadata with kwargs. - """ - for k, v in kwargs.items(): - setattr(self, k, v) - return self - - def get(self, key, default=None): - """ - Access an attribute and return its value if exists. - Otherwise return default. - """ - try: - return getattr(self, key) - except AttributeError: - return default - - -class _MetadataCatalog(UserDict): - """ - MetadataCatalog is a global dictionary that provides access to - :class:`Metadata` of a given dataset. 
- - The metadata associated with a certain name is a singleton: once created, the - metadata will stay alive and will be returned by future calls to ``get(name)``. - - It's like global variables, so don't abuse it. - It's meant for storing knowledge that's constant and shared across the execution - of the program, e.g.: the class names in COCO. - """ - - def get(self, name): - """ - Args: - name (str): name of a dataset (e.g. coco_2014_train). - - Returns: - Metadata: The :class:`Metadata` instance associated with this name, - or create an empty one if none is available. - """ - assert len(name) - r = super().get(name, None) - if r is None: - r = self[name] = Metadata(name=name) - return r - - def list(self): - """ - List all registered metadata. - - Returns: - list[str]: keys (names of datasets) of all registered metadata - """ - return list(self.keys()) - - def remove(self, name): - """ - Alias of ``pop``. - """ - self.pop(name) - - def __str__(self): - return "MetadataCatalog(registered metadata: {})".format(", ".join(self.keys())) - - __repr__ = __str__ - - -MetadataCatalog = _MetadataCatalog() -MetadataCatalog.__doc__ = ( - _MetadataCatalog.__doc__ - + """ - .. automethod:: detectron2.data.catalog.MetadataCatalog.get -""" -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/common.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/common.py deleted file mode 100755 index d6b87424..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/common.py +++ /dev/null @@ -1,241 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import logging -import numpy as np -import pickle -import random -import torch.utils.data as data -from torch.utils.data.sampler import Sampler - -from detectron2.utils.serialize import PicklableWrapper - -__all__ = ["MapDataset", "DatasetFromList", "AspectRatioGroupedDataset", "ToIterableDataset"] - - -def _shard_iterator_dataloader_worker(iterable): - # Shard the iterable if we're currently inside pytorch dataloader worker. - worker_info = data.get_worker_info() - if worker_info is None or worker_info.num_workers == 1: - # do nothing - yield from iterable - else: - yield from itertools.islice(iterable, worker_info.id, None, worker_info.num_workers) - - -class _MapIterableDataset(data.IterableDataset): - """ - Map a function over elements in an IterableDataset. - - Similar to pytorch's MapIterDataPipe, but support filtering when map_func - returns None. - - This class is not public-facing. Will be called by `MapDataset`. - """ - - def __init__(self, dataset, map_func): - self._dataset = dataset - self._map_func = PicklableWrapper(map_func) # wrap so that a lambda will work - - def __len__(self): - return len(self._dataset) - - def __iter__(self): - for x in map(self._map_func, self._dataset): - if x is not None: - yield x - - -class MapDataset(data.Dataset): - """ - Map a function over the elements in a dataset. - """ - - def __init__(self, dataset, map_func): - """ - Args: - dataset: a dataset where map function is applied. Can be either - map-style or iterable dataset. When given an iterable dataset, - the returned object will also be an iterable dataset. - map_func: a callable which maps the element in dataset. map_func can - return None to skip the data (e.g. in case of errors). - How None is handled depends on the style of `dataset`. - If `dataset` is map-style, it randomly tries other elements. 
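The None-filtering behavior described here is easy to see on a toy iterable dataset; a sketch assuming detectron2 is importable:

```
import torch.utils.data as data
from detectron2.data import MapDataset

class Numbers(data.IterableDataset):
    def __iter__(self):
        yield from range(10)
    def __len__(self):
        return 10

def drop_odds(x):
    return x if x % 2 == 0 else None   # None means "skip this sample"

print(list(MapDataset(Numbers(), drop_odds)))  # [0, 2, 4, 6, 8]
```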
- If `dataset` is iterable, it skips the data and tries the next. - """ - self._dataset = dataset - self._map_func = PicklableWrapper(map_func) # wrap so that a lambda will work - - self._rng = random.Random(42) - self._fallback_candidates = set(range(len(dataset))) - - def __new__(cls, dataset, map_func): - is_iterable = isinstance(dataset, data.IterableDataset) - if is_iterable: - return _MapIterableDataset(dataset, map_func) - else: - return super().__new__(cls) - - def __getnewargs__(self): - return self._dataset, self._map_func - - def __len__(self): - return len(self._dataset) - - def __getitem__(self, idx): - retry_count = 0 - cur_idx = int(idx) - - while True: - data = self._map_func(self._dataset[cur_idx]) - if data is not None: - self._fallback_candidates.add(cur_idx) - return data - - # _map_func fails for this idx, use a random new index from the pool - retry_count += 1 - self._fallback_candidates.discard(cur_idx) - cur_idx = self._rng.sample(self._fallback_candidates, k=1)[0] - - if retry_count >= 3: - logger = logging.getLogger(__name__) - logger.warning( - "Failed to apply `_map_func` for idx: {}, retry count: {}".format( - idx, retry_count - ) - ) - - -class DatasetFromList(data.Dataset): - """ - Wrap a list to a torch Dataset. It produces elements of the list as data. - """ - - def __init__(self, lst: list, copy: bool = True, serialize: bool = True): - """ - Args: - lst (list): a list which contains elements to produce. - copy (bool): whether to deepcopy the element when producing it, - so that the result can be modified in place without affecting the - source in the list. - serialize (bool): whether to hold memory using serialized objects, when - enabled, data loader workers can use shared RAM from master - process instead of making a copy. - """ - self._lst = lst - self._copy = copy - self._serialize = serialize - - def _serialize(data): - buffer = pickle.dumps(data, protocol=-1) - return np.frombuffer(buffer, dtype=np.uint8) - - if self._serialize: - logger = logging.getLogger(__name__) - logger.info( - "Serializing {} elements to byte tensors and concatenating them all ...".format( - len(self._lst) - ) - ) - self._lst = [_serialize(x) for x in self._lst] - self._addr = np.asarray([len(x) for x in self._lst], dtype=np.int64) - self._addr = np.cumsum(self._addr) - self._lst = np.concatenate(self._lst) - logger.info("Serialized dataset takes {:.2f} MiB".format(len(self._lst) / 1024 ** 2)) - - def __len__(self): - if self._serialize: - return len(self._addr) - else: - return len(self._lst) - - def __getitem__(self, idx): - if self._serialize: - start_addr = 0 if idx == 0 else self._addr[idx - 1].item() - end_addr = self._addr[idx].item() - bytes = memoryview(self._lst[start_addr:end_addr]) - return pickle.loads(bytes) - elif self._copy: - return copy.deepcopy(self._lst[idx]) - else: - return self._lst[idx] - - -class ToIterableDataset(data.IterableDataset): - """ - Convert an old indices-based (also called map-style) dataset - to an iterable-style dataset. - """ - - def __init__(self, dataset: data.Dataset, sampler: Sampler, shard_sampler: bool = True): - """ - Args: - dataset: an old-style dataset with ``__getitem__`` - sampler: a cheap iterable that produces indices to be applied on ``dataset``. - shard_sampler: whether to shard the sampler based on the current pytorch data loader - worker id. 
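The `serialize=True` trick in `DatasetFromList` can be reproduced in a few self-contained lines: pickle every element into one shared uint8 buffer plus a cumulative address array, so dataloader workers read shared memory instead of copying a Python list:

```
import pickle
import numpy as np

lst = [{"id": i, "label": "dog" if i % 2 == 0 else "cat"} for i in range(4)]
blobs = [np.frombuffer(pickle.dumps(x, protocol=-1), dtype=np.uint8) for x in lst]
addr = np.cumsum([len(b) for b in blobs])   # end offset of each element
buf = np.concatenate(blobs)                 # one flat buffer for all elements

def get(idx):
    start = 0 if idx == 0 else int(addr[idx - 1])
    return pickle.loads(memoryview(buf[start:int(addr[idx])]))

print(get(2))  # {'id': 2, 'label': 'dog'}
```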
When an IterableDataset is forked by pytorch's DataLoader into multiple - workers, it is responsible for sharding its data based on worker id so that workers - don't produce identical data. - - Most samplers (like our TrainingSampler) do not shard based on dataloader worker id - and this argument should be set to True. But certain samplers may be already - sharded, in that case this argument should be set to False. - """ - assert not isinstance(dataset, data.IterableDataset), dataset - assert isinstance(sampler, Sampler), sampler - self.dataset = dataset - self.sampler = sampler - self.shard_sampler = shard_sampler - - def __iter__(self): - if not self.shard_sampler: - sampler = self.sampler - else: - # With map-style dataset, `DataLoader(dataset, sampler)` runs the - # sampler in main process only. But `DataLoader(ToIterableDataset(dataset, sampler))` - # will run sampler in every of the N worker. So we should only keep 1/N of the ids on - # each worker. The assumption is that sampler is cheap to iterate so it's fine to - # discard ids in workers. - sampler = _shard_iterator_dataloader_worker(self.sampler) - for idx in sampler: - yield self.dataset[idx] - - def __len__(self): - return len(self.sampler) - - -class AspectRatioGroupedDataset(data.IterableDataset): - """ - Batch data that have similar aspect ratio together. - In this implementation, images whose aspect ratio < (or >) 1 will - be batched together. - This improves training speed because the images then need less padding - to form a batch. - - It assumes the underlying dataset produces dicts with "width" and "height" keys. - It will then produce a list of original dicts with length = batch_size, - all with similar aspect ratios. - """ - - def __init__(self, dataset, batch_size): - """ - Args: - dataset: an iterable. Each element must be a dict with keys - "width" and "height", which will be used to batch data. - batch_size (int): - """ - self.dataset = dataset - self.batch_size = batch_size - self._buckets = [[] for _ in range(2)] - # Hard-coded two aspect ratio groups: w > h and w < h. - # Can add support for more aspect ratio groups, but doesn't seem useful - - def __iter__(self): - for d in self.dataset: - w, h = d["width"], d["height"] - bucket_id = 0 if w > h else 1 - bucket = self._buckets[bucket_id] - bucket.append(d) - if len(bucket) == self.batch_size: - yield bucket[:] - del bucket[:] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/dataset_mapper.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/dataset_mapper.py deleted file mode 100755 index a8714f79..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/dataset_mapper.py +++ /dev/null @@ -1,191 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import numpy as np -from typing import List, Optional, Union -import torch - -from detectron2.config import configurable - -from . import detection_utils as utils -from . import transforms as T - -""" -This file contains the default mapping that's applied to "dataset dicts". -""" - -__all__ = ["DatasetMapper"] - - -class DatasetMapper: - """ - A callable which takes a dataset dict in Detectron2 Dataset format, - and map it into a format used by the model. - - This is the default callable to be used to map your dataset dict into training data. - You may need to follow it to implement your own one for customized logic, - such as a different way to read or transform images. 
- See :doc:`/tutorials/data_loading` for details. - - The callable currently does the following: - - 1. Read the image from "file_name" - 2. Applies cropping/geometric transforms to the image and annotations - 3. Prepare data and annotations to Tensor and :class:`Instances` - """ - - @configurable - def __init__( - self, - is_train: bool, - *, - augmentations: List[Union[T.Augmentation, T.Transform]], - image_format: str, - use_instance_mask: bool = False, - use_keypoint: bool = False, - instance_mask_format: str = "polygon", - keypoint_hflip_indices: Optional[np.ndarray] = None, - precomputed_proposal_topk: Optional[int] = None, - recompute_boxes: bool = False, - ): - """ - NOTE: this interface is experimental. - - Args: - is_train: whether it's used in training or inference - augmentations: a list of augmentations or deterministic transforms to apply - image_format: an image format supported by :func:`detection_utils.read_image`. - use_instance_mask: whether to process instance segmentation annotations, if available - use_keypoint: whether to process keypoint annotations if available - instance_mask_format: one of "polygon" or "bitmask". Process instance segmentation - masks into this format. - keypoint_hflip_indices: see :func:`detection_utils.create_keypoint_hflip_indices` - precomputed_proposal_topk: if given, will load pre-computed - proposals from dataset_dict and keep the top k proposals for each image. - recompute_boxes: whether to overwrite bounding box annotations - by computing tight bounding boxes from instance mask annotations. - """ - if recompute_boxes: - assert use_instance_mask, "recompute_boxes requires instance masks" - # fmt: off - self.is_train = is_train - self.augmentations = T.AugmentationList(augmentations) - self.image_format = image_format - self.use_instance_mask = use_instance_mask - self.instance_mask_format = instance_mask_format - self.use_keypoint = use_keypoint - self.keypoint_hflip_indices = keypoint_hflip_indices - self.proposal_topk = precomputed_proposal_topk - self.recompute_boxes = recompute_boxes - # fmt: on - logger = logging.getLogger(__name__) - mode = "training" if is_train else "inference" - logger.info(f"[DatasetMapper] Augmentations used in {mode}: {augmentations}") - - @classmethod - def from_config(cls, cfg, is_train: bool = True): - augs = utils.build_augmentation(cfg, is_train) - if cfg.INPUT.CROP.ENABLED and is_train: - augs.insert(0, T.RandomCrop(cfg.INPUT.CROP.TYPE, cfg.INPUT.CROP.SIZE)) - recompute_boxes = cfg.MODEL.MASK_ON - else: - recompute_boxes = False - - ret = { - "is_train": is_train, - "augmentations": augs, - "image_format": cfg.INPUT.FORMAT, - "use_instance_mask": cfg.MODEL.MASK_ON, - "instance_mask_format": cfg.INPUT.MASK_FORMAT, - "use_keypoint": cfg.MODEL.KEYPOINT_ON, - "recompute_boxes": recompute_boxes, - } - - if cfg.MODEL.KEYPOINT_ON: - ret["keypoint_hflip_indices"] = utils.create_keypoint_hflip_indices(cfg.DATASETS.TRAIN) - - if cfg.MODEL.LOAD_PROPOSALS: - ret["precomputed_proposal_topk"] = ( - cfg.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TRAIN - if is_train - else cfg.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TEST - ) - return ret - - def _transform_annotations(self, dataset_dict, transforms, image_shape): - # USER: Modify this if you want to keep them for some reason. 
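Following the contract described above, a stripped-down custom mapper only has to turn one dataset dict into model input; everything below except `read_image` is an illustrative assumption:

```
import copy
import numpy as np
import torch
from detectron2.data import detection_utils as utils

def my_mapper(dataset_dict):
    dataset_dict = copy.deepcopy(dataset_dict)  # it will be modified below
    image = utils.read_image(dataset_dict["file_name"], format="BGR")
    # ... apply augmentations and transform the annotations here ...
    dataset_dict["image"] = torch.as_tensor(
        np.ascontiguousarray(image.transpose(2, 0, 1))  # HWC -> CHW tensor
    )
    return dataset_dict
```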
- for anno in dataset_dict["annotations"]: - if not self.use_instance_mask: - anno.pop("segmentation", None) - if not self.use_keypoint: - anno.pop("keypoints", None) - - # USER: Implement additional transformations if you have other types of data - annos = [ - utils.transform_instance_annotations( - obj, transforms, image_shape, keypoint_hflip_indices=self.keypoint_hflip_indices - ) - for obj in dataset_dict.pop("annotations") - if obj.get("iscrowd", 0) == 0 - ] - instances = utils.annotations_to_instances( - annos, image_shape, mask_format=self.instance_mask_format - ) - - # After transforms such as cropping are applied, the bounding box may no longer - # tightly bound the object. As an example, imagine a triangle object - # [(0,0), (2,0), (0,2)] cropped by a box [(1,0),(2,2)] (XYXY format). The tight - # bounding box of the cropped triangle should be [(1,0),(2,1)], which is not equal to - # the intersection of original bounding box and the cropping box. - if self.recompute_boxes: - instances.gt_boxes = instances.gt_masks.get_bounding_boxes() - dataset_dict["instances"] = utils.filter_empty_instances(instances) - - def __call__(self, dataset_dict): - """ - Args: - dataset_dict (dict): Metadata of one image, in Detectron2 Dataset format. - - Returns: - dict: a format that builtin models in detectron2 accept - """ - dataset_dict = copy.deepcopy(dataset_dict) # it will be modified by code below - # USER: Write your own image loading if it's not from a file - image = utils.read_image(dataset_dict["file_name"], format=self.image_format) - utils.check_image_size(dataset_dict, image) - - # USER: Remove if you don't do semantic/panoptic segmentation. - if "sem_seg_file_name" in dataset_dict: - sem_seg_gt = utils.read_image(dataset_dict.pop("sem_seg_file_name"), "L").squeeze(2) - else: - sem_seg_gt = None - - aug_input = T.AugInput(image, sem_seg=sem_seg_gt) - transforms = self.augmentations(aug_input) - image, sem_seg_gt = aug_input.image, aug_input.sem_seg - - image_shape = image.shape[:2] # h, w - # Pytorch's dataloader is efficient on torch.Tensor due to shared-memory, - # but not efficient on large generic data structures due to the use of pickle & mp.Queue. - # Therefore it's important to use torch.Tensor. - dataset_dict["image"] = torch.as_tensor(np.ascontiguousarray(image.transpose(2, 0, 1))) - if sem_seg_gt is not None: - dataset_dict["sem_seg"] = torch.as_tensor(sem_seg_gt.astype("long")) - - # USER: Remove if you don't use pre-computed proposals. - # Most users would not need this feature. - if self.proposal_topk is not None: - utils.transform_proposals( - dataset_dict, image_shape, transforms, proposal_topk=self.proposal_topk - ) - - if not self.is_train: - # USER: Modify this if you want to keep them for some reason. - dataset_dict.pop("annotations", None) - dataset_dict.pop("sem_seg_file_name", None) - return dataset_dict - - if "annotations" in dataset_dict: - self._transform_annotations(dataset_dict, transforms, image_shape) - - return dataset_dict diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/README.md deleted file mode 100755 index 9fb3e4f7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/README.md +++ /dev/null @@ -1,9 +0,0 @@ - - -### Common Datasets - -The dataset implemented here do not need to load the data into the final format. 
-It should provide the minimal data structure needed to use the dataset, so it can be very efficient.
-
-For example, for an image dataset, just provide the file names and labels, but don't read the images.
-Let the downstream decide how to read.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/__init__.py
deleted file mode 100755
index a44bedc1..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/__init__.py
+++ /dev/null
@@ -1,9 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from .coco import load_coco_json, load_sem_seg, register_coco_instances, convert_to_coco_json
-from .coco_panoptic import register_coco_panoptic, register_coco_panoptic_separated
-from .lvis import load_lvis_json, register_lvis_instances, get_lvis_instances_meta
-from .pascal_voc import load_voc_instances, register_pascal_voc
-from . import builtin as _builtin  # ensure the builtin datasets are registered
-
-
-__all__ = [k for k in globals().keys() if not k.startswith("_")]
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin.py
deleted file mode 100755
index c3a68aa8..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin.py
+++ /dev/null
@@ -1,259 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-
-"""
-This file registers pre-defined datasets at hard-coded paths, and their metadata.
-
-We hard-code metadata for common datasets. This will enable:
-1. Consistency check when loading the datasets
-2. Use models on these standard datasets directly and run demos,
-   without having to download the dataset annotations
-
-We hard-code some paths to the dataset that's assumed to
-exist in "./datasets/".
-
-Users SHOULD NOT use this file to create new dataset / metadata for new dataset.
-To add new dataset, refer to the tutorial "docs/DATASETS.md".
-""" - -import os - -from detectron2.data import DatasetCatalog, MetadataCatalog - -from .builtin_meta import ADE20K_SEM_SEG_CATEGORIES, _get_builtin_metadata -from .cityscapes import load_cityscapes_instances, load_cityscapes_semantic -from .cityscapes_panoptic import register_all_cityscapes_panoptic -from .coco import load_sem_seg, register_coco_instances -from .coco_panoptic import register_coco_panoptic, register_coco_panoptic_separated -from .lvis import get_lvis_instances_meta, register_lvis_instances -from .pascal_voc import register_pascal_voc - -# ==== Predefined datasets and splits for COCO ========== - -_PREDEFINED_SPLITS_COCO = {} -_PREDEFINED_SPLITS_COCO["coco"] = { - "coco_2014_train": ("coco/train2014", "coco/annotations/instances_train2014.json"), - "coco_2014_val": ("coco/val2014", "coco/annotations/instances_val2014.json"), - "coco_2014_minival": ("coco/val2014", "coco/annotations/instances_minival2014.json"), - "coco_2014_valminusminival": ( - "coco/val2014", - "coco/annotations/instances_valminusminival2014.json", - ), - "coco_2017_train": ("coco/train2017", "coco/annotations/instances_train2017.json"), - "coco_2017_val": ("coco/val2017", "coco/annotations/instances_val2017.json"), - "coco_2017_test": ("coco/test2017", "coco/annotations/image_info_test2017.json"), - "coco_2017_test-dev": ("coco/test2017", "coco/annotations/image_info_test-dev2017.json"), - "coco_2017_val_100": ("coco/val2017", "coco/annotations/instances_val2017_100.json"), -} - -_PREDEFINED_SPLITS_COCO["coco_person"] = { - "keypoints_coco_2014_train": ( - "coco/train2014", - "coco/annotations/person_keypoints_train2014.json", - ), - "keypoints_coco_2014_val": ("coco/val2014", "coco/annotations/person_keypoints_val2014.json"), - "keypoints_coco_2014_minival": ( - "coco/val2014", - "coco/annotations/person_keypoints_minival2014.json", - ), - "keypoints_coco_2014_valminusminival": ( - "coco/val2014", - "coco/annotations/person_keypoints_valminusminival2014.json", - ), - "keypoints_coco_2017_train": ( - "coco/train2017", - "coco/annotations/person_keypoints_train2017.json", - ), - "keypoints_coco_2017_val": ("coco/val2017", "coco/annotations/person_keypoints_val2017.json"), - "keypoints_coco_2017_val_100": ( - "coco/val2017", - "coco/annotations/person_keypoints_val2017_100.json", - ), -} - - -_PREDEFINED_SPLITS_COCO_PANOPTIC = { - "coco_2017_train_panoptic": ( - # This is the original panoptic annotation directory - "coco/panoptic_train2017", - "coco/annotations/panoptic_train2017.json", - # This directory contains semantic annotations that are - # converted from panoptic annotations. - # It is used by PanopticFPN. - # You can use the script at detectron2/datasets/prepare_panoptic_fpn.py - # to create these directories. - "coco/panoptic_stuff_train2017", - ), - "coco_2017_val_panoptic": ( - "coco/panoptic_val2017", - "coco/annotations/panoptic_val2017.json", - "coco/panoptic_stuff_val2017", - ), - "coco_2017_val_100_panoptic": ( - "coco/panoptic_val2017_100", - "coco/annotations/panoptic_val2017_100.json", - "coco/panoptic_stuff_val2017_100", - ), -} - - -def register_all_coco(root): - for dataset_name, splits_per_dataset in _PREDEFINED_SPLITS_COCO.items(): - for key, (image_root, json_file) in splits_per_dataset.items(): - # Assume pre-defined datasets live in `./datasets`. 
- register_coco_instances( - key, - _get_builtin_metadata(dataset_name), - os.path.join(root, json_file) if "://" not in json_file else json_file, - os.path.join(root, image_root), - ) - - for ( - prefix, - (panoptic_root, panoptic_json, semantic_root), - ) in _PREDEFINED_SPLITS_COCO_PANOPTIC.items(): - prefix_instances = prefix[: -len("_panoptic")] - instances_meta = MetadataCatalog.get(prefix_instances) - image_root, instances_json = instances_meta.image_root, instances_meta.json_file - # The "separated" version of COCO panoptic segmentation dataset, - # e.g. used by Panoptic FPN - register_coco_panoptic_separated( - prefix, - _get_builtin_metadata("coco_panoptic_separated"), - image_root, - os.path.join(root, panoptic_root), - os.path.join(root, panoptic_json), - os.path.join(root, semantic_root), - instances_json, - ) - # The "standard" version of COCO panoptic segmentation dataset, - # e.g. used by Panoptic-DeepLab - register_coco_panoptic( - prefix, - _get_builtin_metadata("coco_panoptic_standard"), - image_root, - os.path.join(root, panoptic_root), - os.path.join(root, panoptic_json), - instances_json, - ) - - -# ==== Predefined datasets and splits for LVIS ========== - - -_PREDEFINED_SPLITS_LVIS = { - "lvis_v1": { - "lvis_v1_train": ("coco/", "lvis/lvis_v1_train.json"), - "lvis_v1_val": ("coco/", "lvis/lvis_v1_val.json"), - "lvis_v1_test_dev": ("coco/", "lvis/lvis_v1_image_info_test_dev.json"), - "lvis_v1_test_challenge": ("coco/", "lvis/lvis_v1_image_info_test_challenge.json"), - }, - "lvis_v0.5": { - "lvis_v0.5_train": ("coco/", "lvis/lvis_v0.5_train.json"), - "lvis_v0.5_val": ("coco/", "lvis/lvis_v0.5_val.json"), - "lvis_v0.5_val_rand_100": ("coco/", "lvis/lvis_v0.5_val_rand_100.json"), - "lvis_v0.5_test": ("coco/", "lvis/lvis_v0.5_image_info_test.json"), - }, - "lvis_v0.5_cocofied": { - "lvis_v0.5_train_cocofied": ("coco/", "lvis/lvis_v0.5_train_cocofied.json"), - "lvis_v0.5_val_cocofied": ("coco/", "lvis/lvis_v0.5_val_cocofied.json"), - }, -} - - -def register_all_lvis(root): - for dataset_name, splits_per_dataset in _PREDEFINED_SPLITS_LVIS.items(): - for key, (image_root, json_file) in splits_per_dataset.items(): - register_lvis_instances( - key, - get_lvis_instances_meta(dataset_name), - os.path.join(root, json_file) if "://" not in json_file else json_file, - os.path.join(root, image_root), - ) - - -# ==== Predefined splits for raw cityscapes images =========== -_RAW_CITYSCAPES_SPLITS = { - "cityscapes_fine_{task}_train": ("cityscapes/leftImg8bit/train/", "cityscapes/gtFine/train/"), - "cityscapes_fine_{task}_val": ("cityscapes/leftImg8bit/val/", "cityscapes/gtFine/val/"), - "cityscapes_fine_{task}_test": ("cityscapes/leftImg8bit/test/", "cityscapes/gtFine/test/"), -} - - -def register_all_cityscapes(root): - for key, (image_dir, gt_dir) in _RAW_CITYSCAPES_SPLITS.items(): - meta = _get_builtin_metadata("cityscapes") - image_dir = os.path.join(root, image_dir) - gt_dir = os.path.join(root, gt_dir) - - inst_key = key.format(task="instance_seg") - DatasetCatalog.register( - inst_key, - lambda x=image_dir, y=gt_dir: load_cityscapes_instances( - x, y, from_json=True, to_polygons=True - ), - ) - MetadataCatalog.get(inst_key).set( - image_dir=image_dir, gt_dir=gt_dir, evaluator_type="cityscapes_instance", **meta - ) - - sem_key = key.format(task="sem_seg") - DatasetCatalog.register( - sem_key, lambda x=image_dir, y=gt_dir: load_cityscapes_semantic(x, y) - ) - MetadataCatalog.get(sem_key).set( - image_dir=image_dir, - gt_dir=gt_dir, - evaluator_type="cityscapes_sem_seg", - 
ignore_label=255, - **meta, - ) - - -# ==== Predefined splits for PASCAL VOC =========== -def register_all_pascal_voc(root): - SPLITS = [ - ("voc_2007_trainval", "VOC2007", "trainval"), - ("voc_2007_train", "VOC2007", "train"), - ("voc_2007_val", "VOC2007", "val"), - ("voc_2007_test", "VOC2007", "test"), - ("voc_2012_trainval", "VOC2012", "trainval"), - ("voc_2012_train", "VOC2012", "train"), - ("voc_2012_val", "VOC2012", "val"), - ] - for name, dirname, split in SPLITS: - year = 2007 if "2007" in name else 2012 - register_pascal_voc(name, os.path.join(root, dirname), split, year) - MetadataCatalog.get(name).evaluator_type = "pascal_voc" - - -def register_all_ade20k(root): - root = os.path.join(root, "ADEChallengeData2016") - for name, dirname in [("train", "training"), ("val", "validation")]: - image_dir = os.path.join(root, "images", dirname) - gt_dir = os.path.join(root, "annotations_detectron2", dirname) - name = f"ade20k_sem_seg_{name}" - DatasetCatalog.register( - name, lambda x=image_dir, y=gt_dir: load_sem_seg(y, x, gt_ext="png", image_ext="jpg") - ) - MetadataCatalog.get(name).set( - stuff_classes=ADE20K_SEM_SEG_CATEGORIES[:], - image_root=image_dir, - sem_seg_root=gt_dir, - evaluator_type="sem_seg", - ignore_label=255, - ) - - -# True for open source; -# Internally at fb, we register them elsewhere -if __name__.endswith(".builtin"): - # Assume pre-defined datasets live in `./datasets`. - _root = os.path.expanduser(os.getenv("DETECTRON2_DATASETS", "datasets")) - register_all_coco(_root) - register_all_lvis(_root) - register_all_cityscapes(_root) - register_all_cityscapes_panoptic(_root) - register_all_pascal_voc(_root) - register_all_ade20k(_root) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py deleted file mode 100755 index 63c7a1a3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py +++ /dev/null @@ -1,350 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -Note: -For your custom dataset, there is no need to hard-code metadata anywhere in the code. -For example, for COCO-format dataset, metadata will be obtained automatically -when calling `load_coco_json`. For other dataset, metadata may also be obtained in other ways -during loading. - -However, we hard-coded metadata for a few common dataset here. -The only goal is to allow users who don't have these dataset to use pre-trained models. -Users don't have to download a COCO json (which contains metadata), in order to visualize a -COCO model (with correct class names and colors). 
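The `register_all_cityscapes` and `register_all_ade20k` loops above bind loop variables through lambda default arguments (`lambda x=image_dir, y=gt_dir: ...`). A short standalone illustration of why that is necessary: Python closures capture variables late, so without the defaults every registered lambda would see the values from the final loop iteration.

```
loaders_buggy, loaders_ok = [], []
for split in ["train", "val", "test"]:
    loaders_buggy.append(lambda: f"loading {split}")   # late binding: all see "test"
    loaders_ok.append(lambda s=split: f"loading {s}")  # value bound at definition

print([f() for f in loaders_buggy])  # ['loading test', 'loading test', 'loading test']
print([f() for f in loaders_ok])     # ['loading train', 'loading val', 'loading test']
```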
-""" - - -# All coco categories, together with their nice-looking visualization colors -# It's from https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json -COCO_CATEGORIES = [ - {"color": [220, 20, 60], "isthing": 1, "id": 1, "name": "person"}, - {"color": [119, 11, 32], "isthing": 1, "id": 2, "name": "bicycle"}, - {"color": [0, 0, 142], "isthing": 1, "id": 3, "name": "car"}, - {"color": [0, 0, 230], "isthing": 1, "id": 4, "name": "motorcycle"}, - {"color": [106, 0, 228], "isthing": 1, "id": 5, "name": "airplane"}, - {"color": [0, 60, 100], "isthing": 1, "id": 6, "name": "bus"}, - {"color": [0, 80, 100], "isthing": 1, "id": 7, "name": "train"}, - {"color": [0, 0, 70], "isthing": 1, "id": 8, "name": "truck"}, - {"color": [0, 0, 192], "isthing": 1, "id": 9, "name": "boat"}, - {"color": [250, 170, 30], "isthing": 1, "id": 10, "name": "traffic light"}, - {"color": [100, 170, 30], "isthing": 1, "id": 11, "name": "fire hydrant"}, - {"color": [220, 220, 0], "isthing": 1, "id": 13, "name": "stop sign"}, - {"color": [175, 116, 175], "isthing": 1, "id": 14, "name": "parking meter"}, - {"color": [250, 0, 30], "isthing": 1, "id": 15, "name": "bench"}, - {"color": [165, 42, 42], "isthing": 1, "id": 16, "name": "bird"}, - {"color": [255, 77, 255], "isthing": 1, "id": 17, "name": "cat"}, - {"color": [0, 226, 252], "isthing": 1, "id": 18, "name": "dog"}, - {"color": [182, 182, 255], "isthing": 1, "id": 19, "name": "horse"}, - {"color": [0, 82, 0], "isthing": 1, "id": 20, "name": "sheep"}, - {"color": [120, 166, 157], "isthing": 1, "id": 21, "name": "cow"}, - {"color": [110, 76, 0], "isthing": 1, "id": 22, "name": "elephant"}, - {"color": [174, 57, 255], "isthing": 1, "id": 23, "name": "bear"}, - {"color": [199, 100, 0], "isthing": 1, "id": 24, "name": "zebra"}, - {"color": [72, 0, 118], "isthing": 1, "id": 25, "name": "giraffe"}, - {"color": [255, 179, 240], "isthing": 1, "id": 27, "name": "backpack"}, - {"color": [0, 125, 92], "isthing": 1, "id": 28, "name": "umbrella"}, - {"color": [209, 0, 151], "isthing": 1, "id": 31, "name": "handbag"}, - {"color": [188, 208, 182], "isthing": 1, "id": 32, "name": "tie"}, - {"color": [0, 220, 176], "isthing": 1, "id": 33, "name": "suitcase"}, - {"color": [255, 99, 164], "isthing": 1, "id": 34, "name": "frisbee"}, - {"color": [92, 0, 73], "isthing": 1, "id": 35, "name": "skis"}, - {"color": [133, 129, 255], "isthing": 1, "id": 36, "name": "snowboard"}, - {"color": [78, 180, 255], "isthing": 1, "id": 37, "name": "sports ball"}, - {"color": [0, 228, 0], "isthing": 1, "id": 38, "name": "kite"}, - {"color": [174, 255, 243], "isthing": 1, "id": 39, "name": "baseball bat"}, - {"color": [45, 89, 255], "isthing": 1, "id": 40, "name": "baseball glove"}, - {"color": [134, 134, 103], "isthing": 1, "id": 41, "name": "skateboard"}, - {"color": [145, 148, 174], "isthing": 1, "id": 42, "name": "surfboard"}, - {"color": [255, 208, 186], "isthing": 1, "id": 43, "name": "tennis racket"}, - {"color": [197, 226, 255], "isthing": 1, "id": 44, "name": "bottle"}, - {"color": [171, 134, 1], "isthing": 1, "id": 46, "name": "wine glass"}, - {"color": [109, 63, 54], "isthing": 1, "id": 47, "name": "cup"}, - {"color": [207, 138, 255], "isthing": 1, "id": 48, "name": "fork"}, - {"color": [151, 0, 95], "isthing": 1, "id": 49, "name": "knife"}, - {"color": [9, 80, 61], "isthing": 1, "id": 50, "name": "spoon"}, - {"color": [84, 105, 51], "isthing": 1, "id": 51, "name": "bowl"}, - {"color": [74, 65, 105], "isthing": 1, "id": 52, "name": "banana"}, - {"color": [166, 196, 
102], "isthing": 1, "id": 53, "name": "apple"}, - {"color": [208, 195, 210], "isthing": 1, "id": 54, "name": "sandwich"}, - {"color": [255, 109, 65], "isthing": 1, "id": 55, "name": "orange"}, - {"color": [0, 143, 149], "isthing": 1, "id": 56, "name": "broccoli"}, - {"color": [179, 0, 194], "isthing": 1, "id": 57, "name": "carrot"}, - {"color": [209, 99, 106], "isthing": 1, "id": 58, "name": "hot dog"}, - {"color": [5, 121, 0], "isthing": 1, "id": 59, "name": "pizza"}, - {"color": [227, 255, 205], "isthing": 1, "id": 60, "name": "donut"}, - {"color": [147, 186, 208], "isthing": 1, "id": 61, "name": "cake"}, - {"color": [153, 69, 1], "isthing": 1, "id": 62, "name": "chair"}, - {"color": [3, 95, 161], "isthing": 1, "id": 63, "name": "couch"}, - {"color": [163, 255, 0], "isthing": 1, "id": 64, "name": "potted plant"}, - {"color": [119, 0, 170], "isthing": 1, "id": 65, "name": "bed"}, - {"color": [0, 182, 199], "isthing": 1, "id": 67, "name": "dining table"}, - {"color": [0, 165, 120], "isthing": 1, "id": 70, "name": "toilet"}, - {"color": [183, 130, 88], "isthing": 1, "id": 72, "name": "tv"}, - {"color": [95, 32, 0], "isthing": 1, "id": 73, "name": "laptop"}, - {"color": [130, 114, 135], "isthing": 1, "id": 74, "name": "mouse"}, - {"color": [110, 129, 133], "isthing": 1, "id": 75, "name": "remote"}, - {"color": [166, 74, 118], "isthing": 1, "id": 76, "name": "keyboard"}, - {"color": [219, 142, 185], "isthing": 1, "id": 77, "name": "cell phone"}, - {"color": [79, 210, 114], "isthing": 1, "id": 78, "name": "microwave"}, - {"color": [178, 90, 62], "isthing": 1, "id": 79, "name": "oven"}, - {"color": [65, 70, 15], "isthing": 1, "id": 80, "name": "toaster"}, - {"color": [127, 167, 115], "isthing": 1, "id": 81, "name": "sink"}, - {"color": [59, 105, 106], "isthing": 1, "id": 82, "name": "refrigerator"}, - {"color": [142, 108, 45], "isthing": 1, "id": 84, "name": "book"}, - {"color": [196, 172, 0], "isthing": 1, "id": 85, "name": "clock"}, - {"color": [95, 54, 80], "isthing": 1, "id": 86, "name": "vase"}, - {"color": [128, 76, 255], "isthing": 1, "id": 87, "name": "scissors"}, - {"color": [201, 57, 1], "isthing": 1, "id": 88, "name": "teddy bear"}, - {"color": [246, 0, 122], "isthing": 1, "id": 89, "name": "hair drier"}, - {"color": [191, 162, 208], "isthing": 1, "id": 90, "name": "toothbrush"}, - {"color": [255, 255, 128], "isthing": 0, "id": 92, "name": "banner"}, - {"color": [147, 211, 203], "isthing": 0, "id": 93, "name": "blanket"}, - {"color": [150, 100, 100], "isthing": 0, "id": 95, "name": "bridge"}, - {"color": [168, 171, 172], "isthing": 0, "id": 100, "name": "cardboard"}, - {"color": [146, 112, 198], "isthing": 0, "id": 107, "name": "counter"}, - {"color": [210, 170, 100], "isthing": 0, "id": 109, "name": "curtain"}, - {"color": [92, 136, 89], "isthing": 0, "id": 112, "name": "door-stuff"}, - {"color": [218, 88, 184], "isthing": 0, "id": 118, "name": "floor-wood"}, - {"color": [241, 129, 0], "isthing": 0, "id": 119, "name": "flower"}, - {"color": [217, 17, 255], "isthing": 0, "id": 122, "name": "fruit"}, - {"color": [124, 74, 181], "isthing": 0, "id": 125, "name": "gravel"}, - {"color": [70, 70, 70], "isthing": 0, "id": 128, "name": "house"}, - {"color": [255, 228, 255], "isthing": 0, "id": 130, "name": "light"}, - {"color": [154, 208, 0], "isthing": 0, "id": 133, "name": "mirror-stuff"}, - {"color": [193, 0, 92], "isthing": 0, "id": 138, "name": "net"}, - {"color": [76, 91, 113], "isthing": 0, "id": 141, "name": "pillow"}, - {"color": [255, 180, 195], "isthing": 0, "id": 144, "name": 
"platform"}, - {"color": [106, 154, 176], "isthing": 0, "id": 145, "name": "playingfield"}, - {"color": [230, 150, 140], "isthing": 0, "id": 147, "name": "railroad"}, - {"color": [60, 143, 255], "isthing": 0, "id": 148, "name": "river"}, - {"color": [128, 64, 128], "isthing": 0, "id": 149, "name": "road"}, - {"color": [92, 82, 55], "isthing": 0, "id": 151, "name": "roof"}, - {"color": [254, 212, 124], "isthing": 0, "id": 154, "name": "sand"}, - {"color": [73, 77, 174], "isthing": 0, "id": 155, "name": "sea"}, - {"color": [255, 160, 98], "isthing": 0, "id": 156, "name": "shelf"}, - {"color": [255, 255, 255], "isthing": 0, "id": 159, "name": "snow"}, - {"color": [104, 84, 109], "isthing": 0, "id": 161, "name": "stairs"}, - {"color": [169, 164, 131], "isthing": 0, "id": 166, "name": "tent"}, - {"color": [225, 199, 255], "isthing": 0, "id": 168, "name": "towel"}, - {"color": [137, 54, 74], "isthing": 0, "id": 171, "name": "wall-brick"}, - {"color": [135, 158, 223], "isthing": 0, "id": 175, "name": "wall-stone"}, - {"color": [7, 246, 231], "isthing": 0, "id": 176, "name": "wall-tile"}, - {"color": [107, 255, 200], "isthing": 0, "id": 177, "name": "wall-wood"}, - {"color": [58, 41, 149], "isthing": 0, "id": 178, "name": "water-other"}, - {"color": [183, 121, 142], "isthing": 0, "id": 180, "name": "window-blind"}, - {"color": [255, 73, 97], "isthing": 0, "id": 181, "name": "window-other"}, - {"color": [107, 142, 35], "isthing": 0, "id": 184, "name": "tree-merged"}, - {"color": [190, 153, 153], "isthing": 0, "id": 185, "name": "fence-merged"}, - {"color": [146, 139, 141], "isthing": 0, "id": 186, "name": "ceiling-merged"}, - {"color": [70, 130, 180], "isthing": 0, "id": 187, "name": "sky-other-merged"}, - {"color": [134, 199, 156], "isthing": 0, "id": 188, "name": "cabinet-merged"}, - {"color": [209, 226, 140], "isthing": 0, "id": 189, "name": "table-merged"}, - {"color": [96, 36, 108], "isthing": 0, "id": 190, "name": "floor-other-merged"}, - {"color": [96, 96, 96], "isthing": 0, "id": 191, "name": "pavement-merged"}, - {"color": [64, 170, 64], "isthing": 0, "id": 192, "name": "mountain-merged"}, - {"color": [152, 251, 152], "isthing": 0, "id": 193, "name": "grass-merged"}, - {"color": [208, 229, 228], "isthing": 0, "id": 194, "name": "dirt-merged"}, - {"color": [206, 186, 171], "isthing": 0, "id": 195, "name": "paper-merged"}, - {"color": [152, 161, 64], "isthing": 0, "id": 196, "name": "food-other-merged"}, - {"color": [116, 112, 0], "isthing": 0, "id": 197, "name": "building-other-merged"}, - {"color": [0, 114, 143], "isthing": 0, "id": 198, "name": "rock-merged"}, - {"color": [102, 102, 156], "isthing": 0, "id": 199, "name": "wall-other-merged"}, - {"color": [250, 141, 255], "isthing": 0, "id": 200, "name": "rug-merged"}, -] - -# fmt: off -COCO_PERSON_KEYPOINT_NAMES = ( - "nose", - "left_eye", "right_eye", - "left_ear", "right_ear", - "left_shoulder", "right_shoulder", - "left_elbow", "right_elbow", - "left_wrist", "right_wrist", - "left_hip", "right_hip", - "left_knee", "right_knee", - "left_ankle", "right_ankle", -) -# fmt: on - -# Pairs of keypoints that should be exchanged under horizontal flipping -COCO_PERSON_KEYPOINT_FLIP_MAP = ( - ("left_eye", "right_eye"), - ("left_ear", "right_ear"), - ("left_shoulder", "right_shoulder"), - ("left_elbow", "right_elbow"), - ("left_wrist", "right_wrist"), - ("left_hip", "right_hip"), - ("left_knee", "right_knee"), - ("left_ankle", "right_ankle"), -) - -# rules for pairs of keypoints to draw a line between, and the line color to use. 
-KEYPOINT_CONNECTION_RULES = [ - # face - ("left_ear", "left_eye", (102, 204, 255)), - ("right_ear", "right_eye", (51, 153, 255)), - ("left_eye", "nose", (102, 0, 204)), - ("nose", "right_eye", (51, 102, 255)), - # upper-body - ("left_shoulder", "right_shoulder", (255, 128, 0)), - ("left_shoulder", "left_elbow", (153, 255, 204)), - ("right_shoulder", "right_elbow", (128, 229, 255)), - ("left_elbow", "left_wrist", (153, 255, 153)), - ("right_elbow", "right_wrist", (102, 255, 224)), - # lower-body - ("left_hip", "right_hip", (255, 102, 0)), - ("left_hip", "left_knee", (255, 255, 77)), - ("right_hip", "right_knee", (153, 255, 204)), - ("left_knee", "left_ankle", (191, 255, 128)), - ("right_knee", "right_ankle", (255, 195, 77)), -] - -# All Cityscapes categories, together with their nice-looking visualization colors -# It's from https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/helpers/labels.py # noqa -CITYSCAPES_CATEGORIES = [ - {"color": (128, 64, 128), "isthing": 0, "id": 7, "trainId": 0, "name": "road"}, - {"color": (244, 35, 232), "isthing": 0, "id": 8, "trainId": 1, "name": "sidewalk"}, - {"color": (70, 70, 70), "isthing": 0, "id": 11, "trainId": 2, "name": "building"}, - {"color": (102, 102, 156), "isthing": 0, "id": 12, "trainId": 3, "name": "wall"}, - {"color": (190, 153, 153), "isthing": 0, "id": 13, "trainId": 4, "name": "fence"}, - {"color": (153, 153, 153), "isthing": 0, "id": 17, "trainId": 5, "name": "pole"}, - {"color": (250, 170, 30), "isthing": 0, "id": 19, "trainId": 6, "name": "traffic light"}, - {"color": (220, 220, 0), "isthing": 0, "id": 20, "trainId": 7, "name": "traffic sign"}, - {"color": (107, 142, 35), "isthing": 0, "id": 21, "trainId": 8, "name": "vegetation"}, - {"color": (152, 251, 152), "isthing": 0, "id": 22, "trainId": 9, "name": "terrain"}, - {"color": (70, 130, 180), "isthing": 0, "id": 23, "trainId": 10, "name": "sky"}, - {"color": (220, 20, 60), "isthing": 1, "id": 24, "trainId": 11, "name": "person"}, - {"color": (255, 0, 0), "isthing": 1, "id": 25, "trainId": 12, "name": "rider"}, - {"color": (0, 0, 142), "isthing": 1, "id": 26, "trainId": 13, "name": "car"}, - {"color": (0, 0, 70), "isthing": 1, "id": 27, "trainId": 14, "name": "truck"}, - {"color": (0, 60, 100), "isthing": 1, "id": 28, "trainId": 15, "name": "bus"}, - {"color": (0, 80, 100), "isthing": 1, "id": 31, "trainId": 16, "name": "train"}, - {"color": (0, 0, 230), "isthing": 1, "id": 32, "trainId": 17, "name": "motorcycle"}, - {"color": (119, 11, 32), "isthing": 1, "id": 33, "trainId": 18, "name": "bicycle"}, -] - -# fmt: off -ADE20K_SEM_SEG_CATEGORIES = [ - "wall", "building", "sky", "floor", "tree", "ceiling", "road, route", "bed", "window ", "grass", "cabinet", "sidewalk, pavement", "person", "earth, ground", "door", "table", "mountain, mount", "plant", "curtain", "chair", "car", "water", "painting, picture", "sofa", "shelf", "house", "sea", "mirror", "rug", "field", "armchair", "seat", "fence", "desk", "rock, stone", "wardrobe, closet, press", "lamp", "tub", "rail", "cushion", "base, pedestal, stand", "box", "column, pillar", "signboard, sign", "chest of drawers, chest, bureau, dresser", "counter", "sand", "sink", "skyscraper", "fireplace", "refrigerator, icebox", "grandstand, covered stand", "path", "stairs", "runway", "case, display case, showcase, vitrine", "pool table, billiard table, snooker table", "pillow", "screen door, screen", "stairway, staircase", "river", "bridge, span", "bookcase", "blind, screen", "coffee table", "toilet, can, commode, crapper, pot, 
potty, stool, throne", "flower", "book", "hill", "bench", "countertop", "stove", "palm, palm tree", "kitchen island", "computer", "swivel chair", "boat", "bar", "arcade machine", "hovel, hut, hutch, shack, shanty", "bus", "towel", "light", "truck", "tower", "chandelier", "awning, sunshade, sunblind", "street lamp", "booth", "tv", "plane", "dirt track", "clothes", "pole", "land, ground, soil", "bannister, banister, balustrade, balusters, handrail", "escalator, moving staircase, moving stairway", "ottoman, pouf, pouffe, puff, hassock", "bottle", "buffet, counter, sideboard", "poster, posting, placard, notice, bill, card", "stage", "van", "ship", "fountain", "conveyer belt, conveyor belt, conveyer, conveyor, transporter", "canopy", "washer, automatic washer, washing machine", "plaything, toy", "pool", "stool", "barrel, cask", "basket, handbasket", "falls", "tent", "bag", "minibike, motorbike", "cradle", "oven", "ball", "food, solid food", "step, stair", "tank, storage tank", "trade name", "microwave", "pot", "animal", "bicycle", "lake", "dishwasher", "screen", "blanket, cover", "sculpture", "hood, exhaust hood", "sconce", "vase", "traffic light", "tray", "trash can", "fan", "pier", "crt screen", "plate", "monitor", "bulletin board", "shower", "radiator", "glass, drinking glass", "clock", "flag", # noqa -] -# After processed by `prepare_ade20k_sem_seg.py`, id 255 means ignore -# fmt: on - - -def _get_coco_instances_meta(): - thing_ids = [k["id"] for k in COCO_CATEGORIES if k["isthing"] == 1] - thing_colors = [k["color"] for k in COCO_CATEGORIES if k["isthing"] == 1] - assert len(thing_ids) == 80, len(thing_ids) - # Mapping from the incontiguous COCO category id to an id in [0, 79] - thing_dataset_id_to_contiguous_id = {k: i for i, k in enumerate(thing_ids)} - thing_classes = [k["name"] for k in COCO_CATEGORIES if k["isthing"] == 1] - ret = { - "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id, - "thing_classes": thing_classes, - "thing_colors": thing_colors, - } - return ret - - -def _get_coco_panoptic_separated_meta(): - """ - Returns metadata for "separated" version of the panoptic segmentation dataset. - """ - stuff_ids = [k["id"] for k in COCO_CATEGORIES if k["isthing"] == 0] - assert len(stuff_ids) == 53, len(stuff_ids) - - # For semantic segmentation, this mapping maps from contiguous stuff id - # (in [0, 53], used in models) to ids in the dataset (used for processing results) - # The id 0 is mapped to an extra category "thing". 
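`_get_coco_instances_meta` above builds `thing_dataset_id_to_contiguous_id` because COCO category ids skip numbers, while classifiers want labels in `[0, #classes)`. A toy illustration of the mapping and its inverse (the inverse is what gets used when exporting predictions back to COCO ids):

```
dataset_ids = [1, 2, 3, 5, 7]                       # non-contiguous, as in COCO
id_map = {k: i for i, k in enumerate(dataset_ids)}  # dataset id -> contiguous id
inverse = {v: k for k, v in id_map.items()}         # contiguous id -> dataset id

print(id_map)      # {1: 0, 2: 1, 3: 2, 5: 3, 7: 4}
print(inverse[3])  # 5
```

The "separated" panoptic metadata that continues below uses the same idea but shifts stuff ids by one, reserving contiguous id 0 for the merged "thing" category.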
- stuff_dataset_id_to_contiguous_id = {k: i + 1 for i, k in enumerate(stuff_ids)} - # When converting COCO panoptic annotations to semantic annotations - # We label the "thing" category to 0 - stuff_dataset_id_to_contiguous_id[0] = 0 - - # 54 names for COCO stuff categories (including "things") - stuff_classes = ["things"] + [ - k["name"].replace("-other", "").replace("-merged", "") - for k in COCO_CATEGORIES - if k["isthing"] == 0 - ] - - # NOTE: I randomly picked a color for things - stuff_colors = [[82, 18, 128]] + [k["color"] for k in COCO_CATEGORIES if k["isthing"] == 0] - ret = { - "stuff_dataset_id_to_contiguous_id": stuff_dataset_id_to_contiguous_id, - "stuff_classes": stuff_classes, - "stuff_colors": stuff_colors, - } - ret.update(_get_coco_instances_meta()) - return ret - - -def _get_builtin_metadata(dataset_name): - if dataset_name == "coco": - return _get_coco_instances_meta() - if dataset_name == "coco_panoptic_separated": - return _get_coco_panoptic_separated_meta() - elif dataset_name == "coco_panoptic_standard": - meta = {} - # The following metadata maps contiguous id from [0, #thing categories + - # #stuff categories) to their names and colors. We have to replica of the - # same name and color under "thing_*" and "stuff_*" because the current - # visualization function in D2 handles thing and class classes differently - # due to some heuristic used in Panoptic FPN. We keep the same naming to - # enable reusing existing visualization functions. - thing_classes = [k["name"] for k in COCO_CATEGORIES] - thing_colors = [k["color"] for k in COCO_CATEGORIES] - stuff_classes = [k["name"] for k in COCO_CATEGORIES] - stuff_colors = [k["color"] for k in COCO_CATEGORIES] - - meta["thing_classes"] = thing_classes - meta["thing_colors"] = thing_colors - meta["stuff_classes"] = stuff_classes - meta["stuff_colors"] = stuff_colors - - # Convert category id for training: - # category id: like semantic segmentation, it is the class id for each - # pixel. Since there are some classes not used in evaluation, the category - # id is not always contiguous and thus we have two set of category ids: - # - original category id: category id in the original dataset, mainly - # used for evaluation. - # - contiguous category id: [0, #classes), in order to train the linear - # softmax classifier. 
- thing_dataset_id_to_contiguous_id = {} - stuff_dataset_id_to_contiguous_id = {} - - for i, cat in enumerate(COCO_CATEGORIES): - if cat["isthing"]: - thing_dataset_id_to_contiguous_id[cat["id"]] = i - else: - stuff_dataset_id_to_contiguous_id[cat["id"]] = i - - meta["thing_dataset_id_to_contiguous_id"] = thing_dataset_id_to_contiguous_id - meta["stuff_dataset_id_to_contiguous_id"] = stuff_dataset_id_to_contiguous_id - - return meta - elif dataset_name == "coco_person": - return { - "thing_classes": ["person"], - "keypoint_names": COCO_PERSON_KEYPOINT_NAMES, - "keypoint_flip_map": COCO_PERSON_KEYPOINT_FLIP_MAP, - "keypoint_connection_rules": KEYPOINT_CONNECTION_RULES, - } - elif dataset_name == "cityscapes": - # fmt: off - CITYSCAPES_THING_CLASSES = [ - "person", "rider", "car", "truck", - "bus", "train", "motorcycle", "bicycle", - ] - CITYSCAPES_STUFF_CLASSES = [ - "road", "sidewalk", "building", "wall", "fence", "pole", "traffic light", - "traffic sign", "vegetation", "terrain", "sky", "person", "rider", "car", - "truck", "bus", "train", "motorcycle", "bicycle", - ] - # fmt: on - return { - "thing_classes": CITYSCAPES_THING_CLASSES, - "stuff_classes": CITYSCAPES_STUFF_CLASSES, - } - raise KeyError("No built-in metadata for dataset {}".format(dataset_name)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py deleted file mode 100755 index 1e84a5bd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py +++ /dev/null @@ -1,329 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import functools -import json -import logging -import multiprocessing as mp -import numpy as np -import os -from itertools import chain -import pycocotools.mask as mask_util -from PIL import Image - -from detectron2.structures import BoxMode -from detectron2.utils.comm import get_world_size -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - - -logger = logging.getLogger(__name__) - - -def _get_cityscapes_files(image_dir, gt_dir): - files = [] - # scan through the directory - cities = PathManager.ls(image_dir) - logger.info(f"{len(cities)} cities found in '{image_dir}'.") - for city in cities: - city_img_dir = os.path.join(image_dir, city) - city_gt_dir = os.path.join(gt_dir, city) - for basename in PathManager.ls(city_img_dir): - image_file = os.path.join(city_img_dir, basename) - - suffix = "leftImg8bit.png" - assert basename.endswith(suffix), basename - basename = basename[: -len(suffix)] - - instance_file = os.path.join(city_gt_dir, basename + "gtFine_instanceIds.png") - label_file = os.path.join(city_gt_dir, basename + "gtFine_labelIds.png") - json_file = os.path.join(city_gt_dir, basename + "gtFine_polygons.json") - - files.append((image_file, instance_file, label_file, json_file)) - assert len(files), "No images found in {}".format(image_dir) - for f in files[0]: - assert PathManager.isfile(f), f - return files - - -def load_cityscapes_instances(image_dir, gt_dir, from_json=True, to_polygons=True): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., "~/cityscapes/gtFine/train". 
- from_json (bool): whether to read annotations from the raw json file or the png files. - to_polygons (bool): whether to represent the segmentation as polygons - (COCO's format) instead of masks (cityscapes's format). - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets `_ ) - """ - if from_json: - assert to_polygons, ( - "Cityscapes's json annotations are in polygon format. " - "Converting to mask format is not supported now." - ) - files = _get_cityscapes_files(image_dir, gt_dir) - - logger.info("Preprocessing cityscapes annotations ...") - # This is still not fast: all workers will execute duplicate works and will - # take up to 10m on a 8GPU server. - pool = mp.Pool(processes=max(mp.cpu_count() // get_world_size() // 2, 4)) - - ret = pool.map( - functools.partial(_cityscapes_files_to_dict, from_json=from_json, to_polygons=to_polygons), - files, - ) - logger.info("Loaded {} images from {}".format(len(ret), image_dir)) - - # Map cityscape ids to contiguous ids - from cityscapesscripts.helpers.labels import labels - - labels = [l for l in labels if l.hasInstances and not l.ignoreInEval] - dataset_id_to_contiguous_id = {l.id: idx for idx, l in enumerate(labels)} - for dict_per_image in ret: - for anno in dict_per_image["annotations"]: - anno["category_id"] = dataset_id_to_contiguous_id[anno["category_id"]] - return ret - - -def load_cityscapes_semantic(image_dir, gt_dir): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., "~/cityscapes/gtFine/train". - - Returns: - list[dict]: a list of dict, each has "file_name" and - "sem_seg_file_name". - """ - ret = [] - # gt_dir is small and contain many small files. make sense to fetch to local first - gt_dir = PathManager.get_local_path(gt_dir) - for image_file, _, label_file, json_file in _get_cityscapes_files(image_dir, gt_dir): - label_file = label_file.replace("labelIds", "labelTrainIds") - - with PathManager.open(json_file, "r") as f: - jsonobj = json.load(f) - ret.append( - { - "file_name": image_file, - "sem_seg_file_name": label_file, - "height": jsonobj["imgHeight"], - "width": jsonobj["imgWidth"], - } - ) - assert len(ret), f"No images found in {image_dir}!" - assert PathManager.isfile( - ret[0]["sem_seg_file_name"] - ), "Please generate labelTrainIds.png with cityscapesscripts/preparation/createTrainIdLabelImgs.py" # noqa - return ret - - -def _cityscapes_files_to_dict(files, from_json, to_polygons): - """ - Parse cityscapes annotation files to a instance segmentation dataset dict. - - Args: - files (tuple): consists of (image_file, instance_id_file, label_id_file, json_file) - from_json (bool): whether to read annotations from the raw json file or the png files. - to_polygons (bool): whether to represent the segmentation as polygons - (COCO's format) instead of masks (cityscapes's format). - - Returns: - A dict in Detectron2 Dataset format. - """ - from cityscapesscripts.helpers.labels import id2label, name2label - - image_file, instance_id_file, _, json_file = files - - annos = [] - - if from_json: - from shapely.geometry import MultiPolygon, Polygon - - with PathManager.open(json_file, "r") as f: - jsonobj = json.load(f) - ret = { - "file_name": image_file, - "image_id": os.path.basename(image_file), - "height": jsonobj["imgHeight"], - "width": jsonobj["imgWidth"], - } - - # `polygons_union` contains the union of all valid polygons. 
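The comment above introduces the overlap-resolution loop that follows: polygons are visited in reverse drawing order, and each one keeps only the region not already claimed, via shapely `difference`/`union`. A standalone sketch of the scheme with toy squares:

```
from shapely.geometry import Polygon

drawn_last = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])   # overwrites earlier ones
drawn_first = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)])  # partially covered

union = Polygon()
kept = []
for poly in [drawn_last, drawn_first]:   # reverse drawing order
    kept.append(poly.difference(union))  # only the uncovered part survives
    union = union.union(poly)

print([p.area for p in kept])  # [16.0, 12.0]: the 2x2 overlap goes to drawn_last
```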
- polygons_union = Polygon() - - # CityscapesScripts draw the polygons in sequential order - # and each polygon *overwrites* existing ones. See - # (https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py) # noqa - # We use reverse order, and each polygon *avoids* early ones. - # This will resolve the ploygon overlaps in the same way as CityscapesScripts. - for obj in jsonobj["objects"][::-1]: - if "deleted" in obj: # cityscapes data format specific - continue - label_name = obj["label"] - - try: - label = name2label[label_name] - except KeyError: - if label_name.endswith("group"): # crowd area - label = name2label[label_name[: -len("group")]] - else: - raise - if label.id < 0: # cityscapes data format - continue - - # Cityscapes's raw annotations uses integer coordinates - # Therefore +0.5 here - poly_coord = np.asarray(obj["polygon"], dtype="f4") + 0.5 - # CityscapesScript uses PIL.ImageDraw.polygon to rasterize - # polygons for evaluation. This function operates in integer space - # and draws each pixel whose center falls into the polygon. - # Therefore it draws a polygon which is 0.5 "fatter" in expectation. - # We therefore dilate the input polygon by 0.5 as our input. - poly = Polygon(poly_coord).buffer(0.5, resolution=4) - - if not label.hasInstances or label.ignoreInEval: - # even if we won't store the polygon it still contributes to overlaps resolution - polygons_union = polygons_union.union(poly) - continue - - # Take non-overlapping part of the polygon - poly_wo_overlaps = poly.difference(polygons_union) - if poly_wo_overlaps.is_empty: - continue - polygons_union = polygons_union.union(poly) - - anno = {} - anno["iscrowd"] = label_name.endswith("group") - anno["category_id"] = label.id - - if isinstance(poly_wo_overlaps, Polygon): - poly_list = [poly_wo_overlaps] - elif isinstance(poly_wo_overlaps, MultiPolygon): - poly_list = poly_wo_overlaps.geoms - else: - raise NotImplementedError("Unknown geometric structure {}".format(poly_wo_overlaps)) - - poly_coord = [] - for poly_el in poly_list: - # COCO API can work only with exterior boundaries now, hence we store only them. - # TODO: store both exterior and interior boundaries once other parts of the - # codebase support holes in polygons. 
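Right after the comment above, each shapely polygon's exterior ring is flattened into the flat `[x0, y0, x1, y1, ...]` list that the COCO API expects. A minimal illustration of that flattening step:

```
from itertools import chain
from shapely.geometry import Polygon

poly = Polygon([(0, 0), (3, 0), (3, 3), (0, 3)])
flat = list(chain(*poly.exterior.coords))  # shapely closes the ring, repeating (0, 0)
print(flat)  # [0.0, 0.0, 3.0, 0.0, 3.0, 3.0, 0.0, 3.0, 0.0, 0.0]
```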
- poly_coord.append(list(chain(*poly_el.exterior.coords))) - anno["segmentation"] = poly_coord - (xmin, ymin, xmax, ymax) = poly_wo_overlaps.bounds - - anno["bbox"] = (xmin, ymin, xmax, ymax) - anno["bbox_mode"] = BoxMode.XYXY_ABS - - annos.append(anno) - else: - # See also the official annotation parsing scripts at - # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py # noqa - with PathManager.open(instance_id_file, "rb") as f: - inst_image = np.asarray(Image.open(f), order="F") - # ids < 24 are stuff labels (filtering them first is about 5% faster) - flattened_ids = np.unique(inst_image[inst_image >= 24]) - - ret = { - "file_name": image_file, - "image_id": os.path.basename(image_file), - "height": inst_image.shape[0], - "width": inst_image.shape[1], - } - - for instance_id in flattened_ids: - # For non-crowd annotations, instance_id // 1000 is the label_id - # Crowd annotations have <1000 instance ids - label_id = instance_id // 1000 if instance_id >= 1000 else instance_id - label = id2label[label_id] - if not label.hasInstances or label.ignoreInEval: - continue - - anno = {} - anno["iscrowd"] = instance_id < 1000 - anno["category_id"] = label.id - - mask = np.asarray(inst_image == instance_id, dtype=np.uint8, order="F") - - inds = np.nonzero(mask) - ymin, ymax = inds[0].min(), inds[0].max() - xmin, xmax = inds[1].min(), inds[1].max() - anno["bbox"] = (xmin, ymin, xmax, ymax) - if xmax <= xmin or ymax <= ymin: - continue - anno["bbox_mode"] = BoxMode.XYXY_ABS - if to_polygons: - # This conversion comes from D4809743 and D5171122, - # when Mask-RCNN was first developed. - contours = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)[ - -2 - ] - polygons = [c.reshape(-1).tolist() for c in contours if len(c) >= 3] - # opencv's can produce invalid polygons - if len(polygons) == 0: - continue - anno["segmentation"] = polygons - else: - anno["segmentation"] = mask_util.encode(mask[:, :, None])[0] - annos.append(anno) - ret["annotations"] = annos - return ret - - -if __name__ == "__main__": - """ - Test the cityscapes dataset loader. 
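The `else` branch above converts binary instance masks to polygons with `cv2.findContours`; indexing the result with `[-2]` keeps the call compatible across OpenCV versions that return two or three values. A minimal standalone sketch on a synthetic mask:

```
import numpy as np
import cv2

mask = np.zeros((10, 10), dtype=np.uint8)
mask[2:7, 3:8] = 1  # a filled 5x5 square

contours = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)[-2]
polygons = [c.reshape(-1).tolist() for c in contours if len(c) >= 3]
print(len(polygons), polygons[0][:6])  # one polygon; first three (x, y) pairs
```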
- - Usage: - python -m detectron2.data.datasets.cityscapes \ - cityscapes/leftImg8bit/train cityscapes/gtFine/train - """ - import argparse - - parser = argparse.ArgumentParser() - parser.add_argument("image_dir") - parser.add_argument("gt_dir") - parser.add_argument("--type", choices=["instance", "semantic"], default="instance") - args = parser.parse_args() - from detectron2.data.catalog import Metadata - from detectron2.utils.visualizer import Visualizer - from cityscapesscripts.helpers.labels import labels - - logger = setup_logger(name=__name__) - - dirname = "cityscapes-data-vis" - os.makedirs(dirname, exist_ok=True) - - if args.type == "instance": - dicts = load_cityscapes_instances( - args.image_dir, args.gt_dir, from_json=True, to_polygons=True - ) - logger.info("Done loading {} samples.".format(len(dicts))) - - thing_classes = [k.name for k in labels if k.hasInstances and not k.ignoreInEval] - meta = Metadata().set(thing_classes=thing_classes) - - else: - dicts = load_cityscapes_semantic(args.image_dir, args.gt_dir) - logger.info("Done loading {} samples.".format(len(dicts))) - - stuff_classes = [k.name for k in labels if k.trainId != 255] - stuff_colors = [k.color for k in labels if k.trainId != 255] - meta = Metadata().set(stuff_classes=stuff_classes, stuff_colors=stuff_colors) - - for d in dicts: - img = np.array(Image.open(PathManager.open(d["file_name"], "rb"))) - visualizer = Visualizer(img, metadata=meta) - vis = visualizer.draw_dataset_dict(d) - # cv2.imshow("a", vis.get_image()[:, :, ::-1]) - # cv2.waitKey() - fpath = os.path.join(dirname, os.path.basename(d["file_name"])) - vis.save(fpath) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py deleted file mode 100755 index 48c136f1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py +++ /dev/null @@ -1,187 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import json -import logging -import os - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.data.datasets.builtin_meta import CITYSCAPES_CATEGORIES -from detectron2.utils.file_io import PathManager - -""" -This file contains functions to register the Cityscapes panoptic dataset to the DatasetCatalog. 
-""" - - -logger = logging.getLogger(__name__) - - -def get_cityscapes_panoptic_files(image_dir, gt_dir, json_info): - files = [] - # scan through the directory - cities = PathManager.ls(image_dir) - logger.info(f"{len(cities)} cities found in '{image_dir}'.") - image_dict = {} - for city in cities: - city_img_dir = os.path.join(image_dir, city) - for basename in PathManager.ls(city_img_dir): - image_file = os.path.join(city_img_dir, basename) - - suffix = "_leftImg8bit.png" - assert basename.endswith(suffix), basename - basename = os.path.basename(basename)[: -len(suffix)] - - image_dict[basename] = image_file - - for ann in json_info["annotations"]: - image_file = image_dict.get(ann["image_id"], None) - assert image_file is not None, "No image {} found for annotation {}".format( - ann["image_id"], ann["file_name"] - ) - label_file = os.path.join(gt_dir, ann["file_name"]) - segments_info = ann["segments_info"] - - files.append((image_file, label_file, segments_info)) - - assert len(files), "No images found in {}".format(image_dir) - assert PathManager.isfile(files[0][0]), files[0][0] - assert PathManager.isfile(files[0][1]), files[0][1] - return files - - -def load_cityscapes_panoptic(image_dir, gt_dir, gt_json, meta): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., - "~/cityscapes/gtFine/cityscapes_panoptic_train". - gt_json (str): path to the json file. e.g., - "~/cityscapes/gtFine/cityscapes_panoptic_train.json". - meta (dict): dictionary containing "thing_dataset_id_to_contiguous_id" - and "stuff_dataset_id_to_contiguous_id" to map category ids to - contiguous ids for training. - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets `_ ) - """ - - def _convert_category_id(segment_info, meta): - if segment_info["category_id"] in meta["thing_dataset_id_to_contiguous_id"]: - segment_info["category_id"] = meta["thing_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - else: - segment_info["category_id"] = meta["stuff_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - return segment_info - - assert os.path.exists( - gt_json - ), "Please run `python cityscapesscripts/preparation/createPanopticImgs.py` to generate label files." # noqa - with open(gt_json) as f: - json_info = json.load(f) - files = get_cityscapes_panoptic_files(image_dir, gt_dir, json_info) - ret = [] - for image_file, label_file, segments_info in files: - sem_label_file = ( - image_file.replace("leftImg8bit", "gtFine").split(".")[0] + "_labelTrainIds.png" - ) - segments_info = [_convert_category_id(x, meta) for x in segments_info] - ret.append( - { - "file_name": image_file, - "image_id": "_".join( - os.path.splitext(os.path.basename(image_file))[0].split("_")[:3] - ), - "sem_seg_file_name": sem_label_file, - "pan_seg_file_name": label_file, - "segments_info": segments_info, - } - ) - assert len(ret), f"No images found in {image_dir}!" 
- assert PathManager.isfile( - ret[0]["sem_seg_file_name"] - ), "Please generate labelTrainIds.png with cityscapesscripts/preparation/createTrainIdLabelImgs.py" # noqa - assert PathManager.isfile( - ret[0]["pan_seg_file_name"] - ), "Please generate panoptic annotation with python cityscapesscripts/preparation/createPanopticImgs.py" # noqa - return ret - - -_RAW_CITYSCAPES_PANOPTIC_SPLITS = { - "cityscapes_fine_panoptic_train": ( - "cityscapes/leftImg8bit/train", - "cityscapes/gtFine/cityscapes_panoptic_train", - "cityscapes/gtFine/cityscapes_panoptic_train.json", - ), - "cityscapes_fine_panoptic_val": ( - "cityscapes/leftImg8bit/val", - "cityscapes/gtFine/cityscapes_panoptic_val", - "cityscapes/gtFine/cityscapes_panoptic_val.json", - ), - # "cityscapes_fine_panoptic_test": not supported yet -} - - -def register_all_cityscapes_panoptic(root): - meta = {} - # The following metadata maps contiguous id from [0, #thing categories + - # #stuff categories) to their names and colors. We have to replica of the - # same name and color under "thing_*" and "stuff_*" because the current - # visualization function in D2 handles thing and class classes differently - # due to some heuristic used in Panoptic FPN. We keep the same naming to - # enable reusing existing visualization functions. - thing_classes = [k["name"] for k in CITYSCAPES_CATEGORIES] - thing_colors = [k["color"] for k in CITYSCAPES_CATEGORIES] - stuff_classes = [k["name"] for k in CITYSCAPES_CATEGORIES] - stuff_colors = [k["color"] for k in CITYSCAPES_CATEGORIES] - - meta["thing_classes"] = thing_classes - meta["thing_colors"] = thing_colors - meta["stuff_classes"] = stuff_classes - meta["stuff_colors"] = stuff_colors - - # There are three types of ids in cityscapes panoptic segmentation: - # (1) category id: like semantic segmentation, it is the class id for each - # pixel. Since there are some classes not used in evaluation, the category - # id is not always contiguous and thus we have two set of category ids: - # - original category id: category id in the original dataset, mainly - # used for evaluation. - # - contiguous category id: [0, #classes), in order to train the classifier - # (2) instance id: this id is used to differentiate different instances from - # the same category. For "stuff" classes, the instance id is always 0; for - # "thing" classes, the instance id starts from 1 and 0 is reserved for - # ignored instances (e.g. crowd annotation). - # (3) panoptic id: this is the compact id that encode both category and - # instance id by: category_id * 1000 + instance_id. 
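The comment block above documents how Cityscapes panoptic ids pack the category and the instance into a single integer. A two-function encode/decode sketch of that scheme:

```
def encode_panoptic(category_id, instance_id):
    return category_id * 1000 + instance_id

def decode_panoptic(panoptic_id):
    return panoptic_id // 1000, panoptic_id % 1000  # (category_id, instance_id)

pid = encode_panoptic(26, 7)      # e.g. Cityscapes "car", instance 7
print(pid, decode_panoptic(pid))  # 26007 (26, 7)
```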
- thing_dataset_id_to_contiguous_id = {} - stuff_dataset_id_to_contiguous_id = {} - - for k in CITYSCAPES_CATEGORIES: - if k["isthing"] == 1: - thing_dataset_id_to_contiguous_id[k["id"]] = k["trainId"] - else: - stuff_dataset_id_to_contiguous_id[k["id"]] = k["trainId"] - - meta["thing_dataset_id_to_contiguous_id"] = thing_dataset_id_to_contiguous_id - meta["stuff_dataset_id_to_contiguous_id"] = stuff_dataset_id_to_contiguous_id - - for key, (image_dir, gt_dir, gt_json) in _RAW_CITYSCAPES_PANOPTIC_SPLITS.items(): - image_dir = os.path.join(root, image_dir) - gt_dir = os.path.join(root, gt_dir) - gt_json = os.path.join(root, gt_json) - - DatasetCatalog.register( - key, lambda x=image_dir, y=gt_dir, z=gt_json: load_cityscapes_panoptic(x, y, z, meta) - ) - MetadataCatalog.get(key).set( - panoptic_root=gt_dir, - image_root=image_dir, - panoptic_json=gt_json, - gt_dir=gt_dir.replace("cityscapes_panoptic_", ""), - evaluator_type="cityscapes_panoptic_seg", - ignore_label=255, - label_divisor=1000, - **meta, - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco.py deleted file mode 100755 index ed4f7ccb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco.py +++ /dev/null @@ -1,539 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import datetime -import io -import json -import logging -import numpy as np -import os -import shutil -import pycocotools.mask as mask_util -from fvcore.common.timer import Timer -from iopath.common.file_io import file_lock -from PIL import Image - -from detectron2.structures import Boxes, BoxMode, PolygonMasks, RotatedBoxes -from detectron2.utils.file_io import PathManager - -from .. import DatasetCatalog, MetadataCatalog - -""" -This file contains functions to parse COCO-format annotations into dicts in "Detectron2 format". -""" - - -logger = logging.getLogger(__name__) - -__all__ = ["load_coco_json", "load_sem_seg", "convert_to_coco_json", "register_coco_instances"] - - -def load_coco_json(json_file, image_root, dataset_name=None, extra_annotation_keys=None): - """ - Load a json file with COCO's instances annotation format. - Currently supports instance detection, instance segmentation, - and person keypoints annotations. - - Args: - json_file (str): full path to the json file in COCO instances annotation format. - image_root (str or path-like): the directory where the images in this json file exists. - dataset_name (str or None): the name of the dataset (e.g., coco_2017_train). - When provided, this function will also do the following: - - * Put "thing_classes" into the metadata associated with this dataset. - * Map the category ids into a contiguous range (needed by standard dataset format), - and add "thing_dataset_id_to_contiguous_id" to the metadata associated - with this dataset. - - This option should usually be provided, unless users need to load - the original json content and apply more processing manually. - extra_annotation_keys (list[str]): list of per-annotation keys that should also be - loaded into the dataset dict (besides "iscrowd", "bbox", "keypoints", - "category_id", "segmentation"). The values for these keys will be returned as-is. - For example, the densepose annotations are loaded in this way. 
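A hedged usage sketch for `load_coco_json` as documented above; the paths are hypothetical placeholders and an actual COCO-format json must exist on disk for the call to succeed. Passing a `dataset_name` additionally fills the matching `MetadataCatalog` entry with `thing_classes` and the id mapping, while `None` keeps the raw category ids:

```
from detectron2.data.datasets import load_coco_json

dicts = load_coco_json(
    "datasets/my_dataset/annotations/val.json",  # hypothetical annotation file
    "datasets/my_dataset/val",                   # hypothetical image root
    dataset_name=None,  # None: keep the raw, possibly non-contiguous category_ids
)
print(len(dicts))  # each dict carries file_name/height/width/image_id/annotations
```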
- - Returns: - list[dict]: a list of dicts in Detectron2 standard dataset dicts format (See - `Using Custom Datasets `_ ) when `dataset_name` is not None. - If `dataset_name` is None, the returned `category_ids` may be - incontiguous and may not conform to the Detectron2 standard format. - - Notes: - 1. This function does not read the image files. - The results do not have the "image" field. - """ - from pycocotools.coco import COCO - - timer = Timer() - json_file = PathManager.get_local_path(json_file) - with contextlib.redirect_stdout(io.StringIO()): - coco_api = COCO(json_file) - if timer.seconds() > 1: - logger.info("Loading {} takes {:.2f} seconds.".format(json_file, timer.seconds())) - - id_map = None - if dataset_name is not None: - meta = MetadataCatalog.get(dataset_name) - cat_ids = sorted(coco_api.getCatIds()) - cats = coco_api.loadCats(cat_ids) - # The categories in a custom json file may not be sorted. - thing_classes = [c["name"] for c in sorted(cats, key=lambda x: x["id"])] - meta.thing_classes = thing_classes - - # In COCO, certain category ids are artificially removed, - # and by convention they are always ignored. - # We deal with COCO's id issue and translate - # the category ids to contiguous ids in [0, 80). - - # It works by looking at the "categories" field in the json, therefore - # if users' own json also have incontiguous ids, we'll - # apply this mapping as well but print a warning. - if not (min(cat_ids) == 1 and max(cat_ids) == len(cat_ids)): - if "coco" not in dataset_name: - logger.warning( - """ -Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you. -""" - ) - id_map = {v: i for i, v in enumerate(cat_ids)} - meta.thing_dataset_id_to_contiguous_id = id_map - - # sort indices for reproducible results - img_ids = sorted(coco_api.imgs.keys()) - # imgs is a list of dicts, each looks something like: - # {'license': 4, - # 'url': 'http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg', - # 'file_name': 'COCO_val2014_000000001268.jpg', - # 'height': 427, - # 'width': 640, - # 'date_captured': '2013-11-17 05:57:24', - # 'id': 1268} - imgs = coco_api.loadImgs(img_ids) - # anns is a list[list[dict]], where each dict is an annotation - # record for an object. The inner list enumerates the objects in an image - # and the outer list enumerates over images. Example of anns[0]: - # [{'segmentation': [[192.81, - # 247.09, - # ... - # 219.03, - # 249.06]], - # 'area': 1035.749, - # 'iscrowd': 0, - # 'image_id': 1268, - # 'bbox': [192.81, 224.8, 74.73, 33.43], - # 'category_id': 16, - # 'id': 42986}, - # ...] - anns = [coco_api.imgToAnns[img_id] for img_id in img_ids] - total_num_valid_anns = sum([len(x) for x in anns]) - total_num_anns = len(coco_api.anns) - if total_num_valid_anns < total_num_anns: - logger.warning( - f"{json_file} contains {total_num_anns} annotations, but only " - f"{total_num_valid_anns} of them match to images in the file." - ) - - if "minival" not in json_file: - # The popular valminusminival & minival annotations for COCO2014 contain this bug. - # However the ratio of buggy annotations there is tiny and does not affect accuracy. - # Therefore we explicitly white-list them. 
- ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image] - assert len(set(ann_ids)) == len(ann_ids), "Annotation ids in '{}' are not unique!".format( - json_file - ) - - imgs_anns = list(zip(imgs, anns)) - logger.info("Loaded {} images in COCO format from {}".format(len(imgs_anns), json_file)) - - dataset_dicts = [] - - ann_keys = ["iscrowd", "bbox", "keypoints", "category_id"] + (extra_annotation_keys or []) - - num_instances_without_valid_segmentation = 0 - - for (img_dict, anno_dict_list) in imgs_anns: - record = {} - record["file_name"] = os.path.join(image_root, img_dict["file_name"]) - record["height"] = img_dict["height"] - record["width"] = img_dict["width"] - image_id = record["image_id"] = img_dict["id"] - - objs = [] - for anno in anno_dict_list: - # Check that the image_id in this annotation is the same as - # the image_id we're looking at. - # This fails only when the data parsing logic or the annotation file is buggy. - - # The original COCO valminusminival2014 & minival2014 annotation files - # actually contains bugs that, together with certain ways of using COCO API, - # can trigger this assertion. - assert anno["image_id"] == image_id - - assert anno.get("ignore", 0) == 0, '"ignore" in COCO json file is not supported.' - - obj = {key: anno[key] for key in ann_keys if key in anno} - if "bbox" in obj and len(obj["bbox"]) == 0: - raise ValueError( - f"One annotation of image {image_id} contains empty 'bbox' value! " - "This json does not have valid COCO format." - ) - - segm = anno.get("segmentation", None) - if segm: # either list[list[float]] or dict(RLE) - if isinstance(segm, dict): - if isinstance(segm["counts"], list): - # convert to compressed RLE - segm = mask_util.frPyObjects(segm, *segm["size"]) - else: - # filter out invalid polygons (< 3 points) - segm = [poly for poly in segm if len(poly) % 2 == 0 and len(poly) >= 6] - if len(segm) == 0: - num_instances_without_valid_segmentation += 1 - continue # ignore this instance - obj["segmentation"] = segm - - keypts = anno.get("keypoints", None) - if keypts: # list[int] - for idx, v in enumerate(keypts): - if idx % 3 != 2: - # COCO's segmentation coordinates are floating points in [0, H or W], - # but keypoint coordinates are integers in [0, H-1 or W-1] - # Therefore we assume the coordinates are "pixel indices" and - # add 0.5 to convert to floating point coordinates. - keypts[idx] = v + 0.5 - obj["keypoints"] = keypts - - obj["bbox_mode"] = BoxMode.XYWH_ABS - if id_map: - annotation_category_id = obj["category_id"] - try: - obj["category_id"] = id_map[annotation_category_id] - except KeyError as e: - raise KeyError( - f"Encountered category_id={annotation_category_id} " - "but this id does not exist in 'categories' of the json file." - ) from e - objs.append(obj) - record["annotations"] = objs - dataset_dicts.append(record) - - if num_instances_without_valid_segmentation > 0: - logger.warning( - "Filtered out {} instances without valid segmentation. ".format( - num_instances_without_valid_segmentation - ) - + "There might be issues in your dataset generation process. Please " - "check https://detectron2.readthedocs.io/en/latest/tutorials/datasets.html carefully" - ) - return dataset_dicts - - -def load_sem_seg(gt_root, image_root, gt_ext="png", image_ext="jpg"): - """ - Load semantic segmentation datasets. All files under "gt_root" with "gt_ext" extension are - treated as ground truth annotations and all files under "image_root" with "image_ext" extension - as input images. 
Ground truth and input images are matched using file paths relative to - "gt_root" and "image_root" respectively without taking into account file extensions. - This works for COCO as well as some other datasets. - - Args: - gt_root (str): full path to ground truth semantic segmentation files. Semantic segmentation - annotations are stored as images with integer values in pixels that represent - corresponding semantic labels. - image_root (str): the directory where the input images are. - gt_ext (str): file extension for ground truth annotations. - image_ext (str): file extension for input images. - - Returns: - list[dict]: - a list of dicts in detectron2 standard format without instance-level - annotation. - - Notes: - 1. This function does not read the image and ground truth files. - The results do not have the "image" and "sem_seg" fields. - """ - - # We match input images with ground truth based on their relative filepaths (without file - # extensions) starting from 'image_root' and 'gt_root' respectively. - def file2id(folder_path, file_path): - # extract relative path starting from `folder_path` - image_id = os.path.normpath(os.path.relpath(file_path, start=folder_path)) - # remove file extension - image_id = os.path.splitext(image_id)[0] - return image_id - - input_files = sorted( - (os.path.join(image_root, f) for f in PathManager.ls(image_root) if f.endswith(image_ext)), - key=lambda file_path: file2id(image_root, file_path), - ) - gt_files = sorted( - (os.path.join(gt_root, f) for f in PathManager.ls(gt_root) if f.endswith(gt_ext)), - key=lambda file_path: file2id(gt_root, file_path), - ) - - assert len(gt_files) > 0, "No annotations found in {}.".format(gt_root) - - # Use the intersection, so that val2017_100 annotations can run smoothly with val2017 images - if len(input_files) != len(gt_files): - logger.warn( - "Directory {} and {} has {} and {} files, respectively.".format( - image_root, gt_root, len(input_files), len(gt_files) - ) - ) - input_basenames = [os.path.basename(f)[: -len(image_ext)] for f in input_files] - gt_basenames = [os.path.basename(f)[: -len(gt_ext)] for f in gt_files] - intersect = list(set(input_basenames) & set(gt_basenames)) - # sort, otherwise each worker may obtain a list[dict] in different order - intersect = sorted(intersect) - logger.warn("Will use their intersection of {} files.".format(len(intersect))) - input_files = [os.path.join(image_root, f + image_ext) for f in intersect] - gt_files = [os.path.join(gt_root, f + gt_ext) for f in intersect] - - logger.info( - "Loaded {} images with semantic segmentation from {}".format(len(input_files), image_root) - ) - - dataset_dicts = [] - for (img_path, gt_path) in zip(input_files, gt_files): - record = {} - record["file_name"] = img_path - record["sem_seg_file_name"] = gt_path - dataset_dicts.append(record) - - return dataset_dicts - - -def convert_to_coco_dict(dataset_name): - """ - Convert an instance detection/segmentation or keypoint detection dataset - in detectron2's standard format into COCO json format. - - Generic dataset description can be found here: - https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset - - COCO data format description can be found here: - http://cocodataset.org/#format-data - - Args: - dataset_name (str): - name of the source dataset - Must be registered in DatastCatalog and in detectron2's standard format. 
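`load_sem_seg` above pairs images with ground truth by their extension-stripped path relative to each root, keeping only the intersection when the file counts differ. A standalone sketch of that `file2id` matching key (POSIX-style paths shown):

```
import os

def file2id(folder, path):
    rel = os.path.normpath(os.path.relpath(path, start=folder))
    return os.path.splitext(rel)[0]  # relative path without extension

# The same key comes out of both trees, so the two files pair up.
print(file2id("images", "images/city_a/0001.jpg"))            # city_a/0001
print(file2id("annotations", "annotations/city_a/0001.png"))  # city_a/0001
```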
- Must have corresponding metadata "thing_classes" - Returns: - coco_dict: serializable dict in COCO json format - """ - - dataset_dicts = DatasetCatalog.get(dataset_name) - metadata = MetadataCatalog.get(dataset_name) - - # unmap the category mapping ids for COCO - if hasattr(metadata, "thing_dataset_id_to_contiguous_id"): - reverse_id_mapping = {v: k for k, v in metadata.thing_dataset_id_to_contiguous_id.items()} - reverse_id_mapper = lambda contiguous_id: reverse_id_mapping[contiguous_id] # noqa - else: - reverse_id_mapper = lambda contiguous_id: contiguous_id # noqa - - categories = [ - {"id": reverse_id_mapper(id), "name": name} - for id, name in enumerate(metadata.thing_classes) - ] - - logger.info("Converting dataset dicts into COCO format") - coco_images = [] - coco_annotations = [] - - for image_id, image_dict in enumerate(dataset_dicts): - coco_image = { - "id": image_dict.get("image_id", image_id), - "width": int(image_dict["width"]), - "height": int(image_dict["height"]), - "file_name": str(image_dict["file_name"]), - } - coco_images.append(coco_image) - - anns_per_image = image_dict.get("annotations", []) - for annotation in anns_per_image: - # create a new dict with only COCO fields - coco_annotation = {} - - # COCO requirement: XYWH box format for axis-aligned and XYWHA for rotated - bbox = annotation["bbox"] - if isinstance(bbox, np.ndarray): - if bbox.ndim != 1: - raise ValueError(f"bbox has to be 1-dimensional. Got shape={bbox.shape}.") - bbox = bbox.tolist() - if len(bbox) not in [4, 5]: - raise ValueError(f"bbox has to have length 4 or 5. Got {bbox}.") - from_bbox_mode = annotation["bbox_mode"] - to_bbox_mode = BoxMode.XYWH_ABS if len(bbox) == 4 else BoxMode.XYWHA_ABS - bbox = BoxMode.convert(bbox, from_bbox_mode, to_bbox_mode) - - # COCO requirement: instance area - if "segmentation" in annotation: - # Computing areas for instances by counting the pixels - segmentation = annotation["segmentation"] - # TODO: check segmentation type: RLE, BinaryMask or Polygon - if isinstance(segmentation, list): - polygons = PolygonMasks([segmentation]) - area = polygons.area()[0].item() - elif isinstance(segmentation, dict): # RLE - area = mask_util.area(segmentation).item() - else: - raise TypeError(f"Unknown segmentation type {type(segmentation)}!") - else: - # Computing areas using bounding boxes - if to_bbox_mode == BoxMode.XYWH_ABS: - bbox_xy = BoxMode.convert(bbox, to_bbox_mode, BoxMode.XYXY_ABS) - area = Boxes([bbox_xy]).area()[0].item() - else: - area = RotatedBoxes([bbox]).area()[0].item() - - if "keypoints" in annotation: - keypoints = annotation["keypoints"] # list[int] - for idx, v in enumerate(keypoints): - if idx % 3 != 2: - # COCO's segmentation coordinates are floating points in [0, H or W], - # but keypoint coordinates are integers in [0, H-1 or W-1] - # For COCO format consistency we subtract 0.5 - # https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 - keypoints[idx] = v - 0.5 - if "num_keypoints" in annotation: - num_keypoints = annotation["num_keypoints"] - else: - num_keypoints = sum(kp > 0 for kp in keypoints[2::3]) - - # COCO requirement: - # linking annotations to images - # "id" field must start with 1 - coco_annotation["id"] = len(coco_annotations) + 1 - coco_annotation["image_id"] = coco_image["id"] - coco_annotation["bbox"] = [round(float(x), 3) for x in bbox] - coco_annotation["area"] = float(area) - coco_annotation["iscrowd"] = int(annotation.get("iscrowd", 0)) - coco_annotation["category_id"] =
int(reverse_id_mapper(annotation["category_id"])) - - # Add optional fields - if "keypoints" in annotation: - coco_annotation["keypoints"] = keypoints - coco_annotation["num_keypoints"] = num_keypoints - - if "segmentation" in annotation: - seg = coco_annotation["segmentation"] = annotation["segmentation"] - if isinstance(seg, dict): # RLE - counts = seg["counts"] - if not isinstance(counts, str): - # make it json-serializable - seg["counts"] = counts.decode("ascii") - - coco_annotations.append(coco_annotation) - - logger.info( - "Conversion finished, " - f"#images: {len(coco_images)}, #annotations: {len(coco_annotations)}" - ) - - info = { - "date_created": str(datetime.datetime.now()), - "description": "Automatically generated COCO json file for Detectron2.", - } - coco_dict = {"info": info, "images": coco_images, "categories": categories, "licenses": None} - if len(coco_annotations) > 0: - coco_dict["annotations"] = coco_annotations - return coco_dict - - -def convert_to_coco_json(dataset_name, output_file, allow_cached=True): - """ - Converts dataset into COCO format and saves it to a json file. - dataset_name must be registered in DatasetCatalog and in detectron2's standard format. - - Args: - dataset_name: - reference from the config file to the catalogs - must be registered in DatasetCatalog and in detectron2's standard format - output_file: path of json file that will be saved to - allow_cached: if json file is already present then skip conversion - """ - - # TODO: The dataset or the conversion script *may* change, - # a checksum would be useful for validating the cached data - - PathManager.mkdirs(os.path.dirname(output_file)) - with file_lock(output_file): - if PathManager.exists(output_file) and allow_cached: - logger.warning( - f"Using previously cached COCO format annotations at '{output_file}'. " - "You need to clear the cache file if your dataset has been modified." - ) - else: - logger.info(f"Converting annotations of dataset '{dataset_name}' to COCO format ...") - coco_dict = convert_to_coco_dict(dataset_name) - - logger.info(f"Caching COCO format annotations at '{output_file}' ...") - tmp_file = output_file + ".tmp" - with PathManager.open(tmp_file, "w") as f: - json.dump(coco_dict, f) - shutil.move(tmp_file, output_file) - - -def register_coco_instances(name, metadata, json_file, image_root): - """ - Register a dataset in COCO's json annotation format for - instance detection, instance segmentation and keypoint detection. - (i.e., Type 1 and 2 in http://cocodataset.org/#format-data. - `instances*.json` and `person_keypoints*.json` in the dataset). - - This is an example of how to register a new dataset. - You can do something similar to this function to register new datasets. - - Args: - name (str): the name that identifies a dataset, e.g. "coco_2014_train". - metadata (dict): extra metadata associated with this dataset. You can - leave it as an empty dict. - json_file (str): path to the json instance annotation file. - image_root (str or path-like): directory which contains all the images. - """ - assert isinstance(name, str), name - assert isinstance(json_file, (str, os.PathLike)), json_file - assert isinstance(image_root, (str, os.PathLike)), image_root - # 1. register a function which returns dicts - DatasetCatalog.register(name, lambda: load_coco_json(json_file, image_root, name)) - - # 2.
Optionally, add metadata about this dataset, - # since they might be useful in evaluation, visualization or logging - MetadataCatalog.get(name).set( - json_file=json_file, image_root=image_root, evaluator_type="coco", **metadata - ) - - -if __name__ == "__main__": - """ - Test the COCO json dataset loader. - - Usage: - python -m detectron2.data.datasets.coco \ - path/to/json path/to/image_root dataset_name - - "dataset_name" can be "coco_2014_minival_100", or other - pre-registered ones - """ - from detectron2.utils.logger import setup_logger - from detectron2.utils.visualizer import Visualizer - import detectron2.data.datasets # noqa # add pre-defined metadata - import sys - - logger = setup_logger(name=__name__) - assert sys.argv[3] in DatasetCatalog.list() - meta = MetadataCatalog.get(sys.argv[3]) - - dicts = load_coco_json(sys.argv[1], sys.argv[2], sys.argv[3]) - logger.info("Done loading {} samples.".format(len(dicts))) - - dirname = "coco-data-vis" - os.makedirs(dirname, exist_ok=True) - for d in dicts: - img = np.array(Image.open(d["file_name"])) - visualizer = Visualizer(img, metadata=meta) - vis = visualizer.draw_dataset_dict(d) - fpath = os.path.join(dirname, os.path.basename(d["file_name"])) - vis.save(fpath) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py deleted file mode 100755 index b8dae443..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py +++ /dev/null @@ -1,228 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import json -import os - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.utils.file_io import PathManager - -from .coco import load_coco_json, load_sem_seg - -__all__ = ["register_coco_panoptic", "register_coco_panoptic_separated"] - - -def load_coco_panoptic_json(json_file, image_dir, gt_dir, meta): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/coco/train2017". - gt_dir (str): path to the raw annotations. e.g., "~/coco/panoptic_train2017". - json_file (str): path to the json file. e.g., "~/coco/annotations/panoptic_train2017.json". - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets <https://detectron2.readthedocs.io/tutorials/datasets.html>`_ ) - """ - - def _convert_category_id(segment_info, meta): - if segment_info["category_id"] in meta["thing_dataset_id_to_contiguous_id"]: - segment_info["category_id"] = meta["thing_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - segment_info["isthing"] = True - else: - segment_info["category_id"] = meta["stuff_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - segment_info["isthing"] = False - return segment_info - - with PathManager.open(json_file) as f: - json_info = json.load(f) - - ret = [] - for ann in json_info["annotations"]: - image_id = int(ann["image_id"]) - # TODO: currently we assume image and label have the same filename but - # different extension, and images have extension ".jpg" for COCO. Need - # to make image extension a user-provided argument if we extend this - # function to support other COCO-like datasets.
- image_file = os.path.join(image_dir, os.path.splitext(ann["file_name"])[0] + ".jpg") - label_file = os.path.join(gt_dir, ann["file_name"]) - segments_info = [_convert_category_id(x, meta) for x in ann["segments_info"]] - ret.append( - { - "file_name": image_file, - "image_id": image_id, - "pan_seg_file_name": label_file, - "segments_info": segments_info, - } - ) - assert len(ret), f"No images found in {image_dir}!" - assert PathManager.isfile(ret[0]["file_name"]), ret[0]["file_name"] - assert PathManager.isfile(ret[0]["pan_seg_file_name"]), ret[0]["pan_seg_file_name"] - return ret - - -def register_coco_panoptic( - name, metadata, image_root, panoptic_root, panoptic_json, instances_json=None -): - """ - Register a "standard" version of COCO panoptic segmentation dataset named `name`. - The dictionaries in this registered dataset follow detectron2's standard format. - Hence it's called "standard". - - Args: - name (str): the name that identifies a dataset, - e.g. "coco_2017_train_panoptic" - metadata (dict): extra metadata associated with this dataset. - image_root (str): directory which contains all the images - panoptic_root (str): directory which contains panoptic annotation images in COCO format - panoptic_json (str): path to the json panoptic annotation file in COCO format - sem_seg_root (none): not used, to be consistent with - `register_coco_panoptic_separated`. - instances_json (str): path to the json instance annotation file - """ - panoptic_name = name - DatasetCatalog.register( - panoptic_name, - lambda: load_coco_panoptic_json(panoptic_json, image_root, panoptic_root, metadata), - ) - MetadataCatalog.get(panoptic_name).set( - panoptic_root=panoptic_root, - image_root=image_root, - panoptic_json=panoptic_json, - json_file=instances_json, - evaluator_type="coco_panoptic_seg", - ignore_label=255, - label_divisor=1000, - **metadata, - ) - - -def register_coco_panoptic_separated( - name, metadata, image_root, panoptic_root, panoptic_json, sem_seg_root, instances_json -): - """ - Register a "separated" version of COCO panoptic segmentation dataset named `name`. - The annotations in this registered dataset will contain both instance annotations and - semantic annotations, each with its own contiguous ids. Hence it's called "separated". - - It follows the setting used by the PanopticFPN paper: - - 1. The instance annotations directly come from polygons in the COCO - instances annotation task, rather than from the masks in the COCO panoptic annotations. - - The two formats have small differences: - Polygons in the instance annotations may have overlaps. - The mask annotations are produced by labeling the overlapped polygons - with depth ordering. - - 2. The semantic annotations are converted from panoptic annotations, where - all "things" are assigned a semantic id of 0. - All semantic categories will therefore have ids in contiguous - range [1, #stuff_categories]. - - This function will also register a pure semantic segmentation dataset - named ``name + '_stuffonly'``. - - Args: - name (str): the name that identifies a dataset, - e.g. "coco_2017_train_panoptic" - metadata (dict): extra metadata associated with this dataset. - image_root (str): directory which contains all the images - panoptic_root (str): directory which contains panoptic annotation images - panoptic_json (str): path to the json panoptic annotation file - sem_seg_root (str): directory which contains all the ground truth segmentation annotations.
- instances_json (str): path to the json instance annotation file - """ - panoptic_name = name + "_separated" - DatasetCatalog.register( - panoptic_name, - lambda: merge_to_panoptic( - load_coco_json(instances_json, image_root, panoptic_name), - load_sem_seg(sem_seg_root, image_root), - ), - ) - MetadataCatalog.get(panoptic_name).set( - panoptic_root=panoptic_root, - image_root=image_root, - panoptic_json=panoptic_json, - sem_seg_root=sem_seg_root, - json_file=instances_json, # TODO rename - evaluator_type="coco_panoptic_seg", - ignore_label=255, - **metadata, - ) - - semantic_name = name + "_stuffonly" - DatasetCatalog.register(semantic_name, lambda: load_sem_seg(sem_seg_root, image_root)) - MetadataCatalog.get(semantic_name).set( - sem_seg_root=sem_seg_root, - image_root=image_root, - evaluator_type="sem_seg", - ignore_label=255, - **metadata, - ) - - -def merge_to_panoptic(detection_dicts, sem_seg_dicts): - """ - Create dataset dicts for panoptic segmentation, by - merging two dicts using "file_name" field to match their entries. - - Args: - detection_dicts (list[dict]): lists of dicts for object detection or instance segmentation. - sem_seg_dicts (list[dict]): lists of dicts for semantic segmentation. - - Returns: - list[dict] (one per input image): Each dict contains all (key, value) pairs from dicts in - both detection_dicts and sem_seg_dicts that correspond to the same image. - The function assumes that the same key in different dicts has the same value. - """ - results = [] - sem_seg_file_to_entry = {x["file_name"]: x for x in sem_seg_dicts} - assert len(sem_seg_file_to_entry) > 0 - - for det_dict in detection_dicts: - dic = copy.copy(det_dict) - dic.update(sem_seg_file_to_entry[dic["file_name"]]) - results.append(dic) - return results - - -if __name__ == "__main__": - """ - Test the COCO panoptic dataset loader. - - Usage: - python -m detectron2.data.datasets.coco_panoptic \ - path/to/image_root path/to/panoptic_root path/to/panoptic_json dataset_name 10 - - "dataset_name" can be "coco_2017_train_panoptic", or other - pre-registered ones - """ - from detectron2.utils.logger import setup_logger - from detectron2.utils.visualizer import Visualizer - import detectron2.data.datasets # noqa # add pre-defined metadata - import sys - from PIL import Image - import numpy as np - - logger = setup_logger(name=__name__) - assert sys.argv[4] in DatasetCatalog.list() - meta = MetadataCatalog.get(sys.argv[4]) - - dicts = load_coco_panoptic_json(sys.argv[3], sys.argv[1], sys.argv[2], meta.as_dict()) - logger.info("Done loading {} samples.".format(len(dicts))) - - dirname = "coco-data-vis" - os.makedirs(dirname, exist_ok=True) - num_imgs_to_vis = int(sys.argv[5]) - for i, d in enumerate(dicts): - img = np.array(Image.open(d["file_name"])) - visualizer = Visualizer(img, metadata=meta) - vis = visualizer.draw_dataset_dict(d) - fpath = os.path.join(dirname, os.path.basename(d["file_name"])) - vis.save(fpath) - if i + 1 >= num_imgs_to_vis: - break diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis.py deleted file mode 100755 index 78b39653..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis.py +++ /dev/null @@ -1,240 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
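For reference, the two panoptic helpers deleted above were the public entry points of `coco_panoptic.py`. The following is a minimal usage sketch, not part of this patch, assuming a detectron2 installation that still ships the module; the dataset name and every path are hypothetical placeholders.

```python
# Hypothetical sketch: registering a COCO panoptic dataset with the helper
# removed above. All names and paths are placeholders.
from detectron2.data.datasets.coco_panoptic import register_coco_panoptic

register_coco_panoptic(
    "my_panoptic_train",                               # hypothetical dataset name
    metadata={},                                       # extra metadata; may be empty
    image_root="datasets/coco/train2017",              # raw images
    panoptic_root="datasets/coco/panoptic_train2017",  # panoptic annotation PNGs
    panoptic_json="datasets/coco/annotations/panoptic_train2017.json",
    instances_json="datasets/coco/annotations/instances_train2017.json",
)
```

`register_coco_panoptic_separated` takes the same arguments plus `sem_seg_root`, and per its docstring registers the ``name + "_separated"`` and ``name + "_stuffonly"`` variants instead.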
-import logging -import os -from fvcore.common.timer import Timer - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.structures import BoxMode -from detectron2.utils.file_io import PathManager - -from .builtin_meta import _get_coco_instances_meta -from .lvis_v0_5_categories import LVIS_CATEGORIES as LVIS_V0_5_CATEGORIES -from .lvis_v1_categories import LVIS_CATEGORIES as LVIS_V1_CATEGORIES - -""" -This file contains functions to parse LVIS-format annotations into dicts in the -"Detectron2 format". -""" - -logger = logging.getLogger(__name__) - -__all__ = ["load_lvis_json", "register_lvis_instances", "get_lvis_instances_meta"] - - -def register_lvis_instances(name, metadata, json_file, image_root): - """ - Register a dataset in LVIS's json annotation format for instance detection and segmentation. - - Args: - name (str): a name that identifies the dataset, e.g. "lvis_v0.5_train". - metadata (dict): extra metadata associated with this dataset. It can be an empty dict. - json_file (str): path to the json instance annotation file. - image_root (str or path-like): directory which contains all the images. - """ - DatasetCatalog.register(name, lambda: load_lvis_json(json_file, image_root, name)) - MetadataCatalog.get(name).set( - json_file=json_file, image_root=image_root, evaluator_type="lvis", **metadata - ) - - -def load_lvis_json(json_file, image_root, dataset_name=None, extra_annotation_keys=None): - """ - Load a json file in LVIS's annotation format. - - Args: - json_file (str): full path to the LVIS json annotation file. - image_root (str): the directory where the images in this json file exist. - dataset_name (str): the name of the dataset (e.g., "lvis_v0.5_train"). - If provided, this function will put "thing_classes" into the metadata - associated with this dataset. - extra_annotation_keys (list[str]): list of per-annotation keys that should also be - loaded into the dataset dict (besides "bbox", "bbox_mode", "category_id", - "segmentation"). The values for these keys will be returned as-is. - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets <https://detectron2.readthedocs.io/tutorials/datasets.html>`_ ) - - Notes: - 1. This function does not read the image files. - The results do not have the "image" field. - """ - from lvis import LVIS - - json_file = PathManager.get_local_path(json_file) - - timer = Timer() - lvis_api = LVIS(json_file) - if timer.seconds() > 1: - logger.info("Loading {} takes {:.2f} seconds.".format(json_file, timer.seconds())) - - if dataset_name is not None: - meta = get_lvis_instances_meta(dataset_name) - MetadataCatalog.get(dataset_name).set(**meta) - - # sort indices for reproducible results - img_ids = sorted(lvis_api.imgs.keys()) - # imgs is a list of dicts, each looks something like: - # {'license': 4, - # 'url': 'http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg', - # 'file_name': 'COCO_val2014_000000001268.jpg', - # 'height': 427, - # 'width': 640, - # 'date_captured': '2013-11-17 05:57:24', - # 'id': 1268} - imgs = lvis_api.load_imgs(img_ids) - # anns is a list[list[dict]], where each dict is an annotation - # record for an object. The inner list enumerates the objects in an image - # and the outer list enumerates over images. Example of anns[0]: - # [{'segmentation': [[192.81, - # 247.09, - # ... - # 219.03, - # 249.06]], - # 'area': 1035.749, - # 'image_id': 1268, - # 'bbox': [192.81, 224.8, 74.73, 33.43], - # 'category_id': 16, - # 'id': 42986}, - # ...]
- anns = [lvis_api.img_ann_map[img_id] for img_id in img_ids] - - # Sanity check that each annotation has a unique id - ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image] - assert len(set(ann_ids)) == len(ann_ids), "Annotation ids in '{}' are not unique".format( - json_file - ) - - imgs_anns = list(zip(imgs, anns)) - - logger.info("Loaded {} images in the LVIS format from {}".format(len(imgs_anns), json_file)) - - if extra_annotation_keys: - logger.info( - "The following extra annotation keys will be loaded: {} ".format(extra_annotation_keys) - ) - else: - extra_annotation_keys = [] - - def get_file_name(img_root, img_dict): - # Determine the path including the split folder ("train2017", "val2017", "test2017") from - # the coco_url field. Example: - # 'coco_url': 'http://images.cocodataset.org/train2017/000000155379.jpg' - split_folder, file_name = img_dict["coco_url"].split("/")[-2:] - return os.path.join(img_root + split_folder, file_name) - - dataset_dicts = [] - - for (img_dict, anno_dict_list) in imgs_anns: - record = {} - record["file_name"] = get_file_name(image_root, img_dict) - record["height"] = img_dict["height"] - record["width"] = img_dict["width"] - record["not_exhaustive_category_ids"] = img_dict.get("not_exhaustive_category_ids", []) - record["neg_category_ids"] = img_dict.get("neg_category_ids", []) - image_id = record["image_id"] = img_dict["id"] - - objs = [] - for anno in anno_dict_list: - # Check that the image_id in this annotation is the same as - # the image_id we're looking at. - # This fails only when the data parsing logic or the annotation file is buggy. - assert anno["image_id"] == image_id - obj = {"bbox": anno["bbox"], "bbox_mode": BoxMode.XYWH_ABS} - # LVIS data loader can be used to load COCO dataset categories. In this case `meta` - # variable will have a field with COCO-specific category mapping. - if dataset_name is not None and "thing_dataset_id_to_contiguous_id" in meta: - obj["category_id"] = meta["thing_dataset_id_to_contiguous_id"][anno["category_id"]] - else: - obj["category_id"] = anno["category_id"] - 1 # Convert 1-indexed to 0-indexed - segm = anno["segmentation"] # list[list[float]] - # filter out invalid polygons (< 3 points) - valid_segm = [poly for poly in segm if len(poly) % 2 == 0 and len(poly) >= 6] - assert len(segm) == len( - valid_segm - ), "Annotation contains an invalid polygon with < 3 points" - assert len(segm) > 0 - obj["segmentation"] = segm - for extra_ann_key in extra_annotation_keys: - obj[extra_ann_key] = anno[extra_ann_key] - objs.append(obj) - record["annotations"] = objs - dataset_dicts.append(record) - - return dataset_dicts - - -def get_lvis_instances_meta(dataset_name): - """ - Load LVIS metadata. - - Args: - dataset_name (str): LVIS dataset name without the split name (e.g., "lvis_v0.5"). 
- - Returns: - dict: LVIS metadata with keys: thing_classes - """ - if "cocofied" in dataset_name: - return _get_coco_instances_meta() - if "v0.5" in dataset_name: - return _get_lvis_instances_meta_v0_5() - elif "v1" in dataset_name: - return _get_lvis_instances_meta_v1() - raise ValueError("No built-in metadata for dataset {}".format(dataset_name)) - - -def _get_lvis_instances_meta_v0_5(): - assert len(LVIS_V0_5_CATEGORIES) == 1230 - cat_ids = [k["id"] for k in LVIS_V0_5_CATEGORIES] - assert min(cat_ids) == 1 and max(cat_ids) == len( - cat_ids - ), "Category ids are not in [1, #categories], as expected" - # Ensure that the category list is sorted by id - lvis_categories = sorted(LVIS_V0_5_CATEGORIES, key=lambda x: x["id"]) - thing_classes = [k["synonyms"][0] for k in lvis_categories] - meta = {"thing_classes": thing_classes} - return meta - - -def _get_lvis_instances_meta_v1(): - assert len(LVIS_V1_CATEGORIES) == 1203 - cat_ids = [k["id"] for k in LVIS_V1_CATEGORIES] - assert min(cat_ids) == 1 and max(cat_ids) == len( - cat_ids - ), "Category ids are not in [1, #categories], as expected" - # Ensure that the category list is sorted by id - lvis_categories = sorted(LVIS_V1_CATEGORIES, key=lambda x: x["id"]) - thing_classes = [k["synonyms"][0] for k in lvis_categories] - meta = {"thing_classes": thing_classes} - return meta - - -if __name__ == "__main__": - """ - Test the LVIS json dataset loader. - - Usage: - python -m detectron2.data.datasets.lvis \ - path/to/json path/to/image_root dataset_name vis_limit - """ - import sys - import numpy as np - from detectron2.utils.logger import setup_logger - from PIL import Image - import detectron2.data.datasets # noqa # add pre-defined metadata - from detectron2.utils.visualizer import Visualizer - - logger = setup_logger(name=__name__) - meta = MetadataCatalog.get(sys.argv[3]) - - dicts = load_lvis_json(sys.argv[1], sys.argv[2], sys.argv[3]) - logger.info("Done loading {} samples.".format(len(dicts))) - - dirname = "lvis-data-vis" - os.makedirs(dirname, exist_ok=True) - for d in dicts[: int(sys.argv[4])]: - img = np.array(Image.open(d["file_name"])) - visualizer = Visualizer(img, metadata=meta) - vis = visualizer.draw_dataset_dict(d) - fpath = os.path.join(dirname, os.path.basename(d["file_name"])) - vis.save(fpath) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py deleted file mode 100755 index d3dab619..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py +++ /dev/null @@ -1,13 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
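The COCO and LVIS loaders deleted above share the same registration pattern: a name is bound to a loader lambda in `DatasetCatalog`, the parsing is deferred until the dataset is first queried, and `MetadataCatalog` carries the paths and evaluator type. A minimal sketch, not part of this patch; all dataset names and paths are hypothetical placeholders.

```python
# Hypothetical sketch: registering and querying the instance-level datasets
# whose loaders are removed above. Placeholders throughout.
from detectron2.data import DatasetCatalog, MetadataCatalog
from detectron2.data.datasets.coco import register_coco_instances
from detectron2.data.datasets.lvis import register_lvis_instances

register_coco_instances(
    "my_coco_train", {},                    # dataset name and (empty) extra metadata
    "datasets/my_coco/annotations.json",    # COCO-format instance annotations
    "datasets/my_coco/images",              # directory containing all images
)
# load_lvis_json builds each file path as image_root + split folder taken from
# the record's "coco_url", so image_root is given with a trailing slash here.
register_lvis_instances(
    "my_lvis_v0.5_train", {},
    "datasets/lvis/lvis_v0.5_train.json",
    "datasets/coco/",
)

dicts = DatasetCatalog.get("my_coco_train")  # lazily invokes load_coco_json
meta = MetadataCatalog.get("my_coco_train")  # json_file, image_root, evaluator_type, ...
```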
-# Autogen with -# with open("lvis_v0.5_val.json", "r") as f: -# a = json.load(f) -# c = a["categories"] -# for x in c: -# del x["image_count"] -# del x["instance_count"] -# LVIS_CATEGORIES = repr(c) + " # noqa" - -# fmt: off -LVIS_CATEGORIES = [{'frequency': 'r', 'id': 1, 'synset': 'acorn.n.01', 'synonyms': ['acorn'], 'def': 'nut from an oak tree', 'name': 'acorn'}, {'frequency': 'c', 'id': 2, 'synset': 'aerosol.n.02', 'synonyms': ['aerosol_can', 'spray_can'], 'def': 'a dispenser that holds a substance under pressure', 'name': 'aerosol_can'}, {'frequency': 'f', 'id': 3, 'synset': 'air_conditioner.n.01', 'synonyms': ['air_conditioner'], 'def': 'a machine that keeps air cool and dry', 'name': 'air_conditioner'}, {'frequency': 'f', 'id': 4, 'synset': 'airplane.n.01', 'synonyms': ['airplane', 'aeroplane'], 'def': 'an aircraft that has a fixed wing and is powered by propellers or jets', 'name': 'airplane'}, {'frequency': 'c', 'id': 5, 'synset': 'alarm_clock.n.01', 'synonyms': ['alarm_clock'], 'def': 'a clock that wakes a sleeper at some preset time', 'name': 'alarm_clock'}, {'frequency': 'c', 'id': 6, 'synset': 'alcohol.n.01', 'synonyms': ['alcohol', 'alcoholic_beverage'], 'def': 'a liquor or brew containing alcohol as the active agent', 'name': 'alcohol'}, {'frequency': 'r', 'id': 7, 'synset': 'alligator.n.02', 'synonyms': ['alligator', 'gator'], 'def': 'amphibious reptiles related to crocodiles but with shorter broader snouts', 'name': 'alligator'}, {'frequency': 'c', 'id': 8, 'synset': 'almond.n.02', 'synonyms': ['almond'], 'def': 'oval-shaped edible seed of the almond tree', 'name': 'almond'}, {'frequency': 'c', 'id': 9, 'synset': 'ambulance.n.01', 'synonyms': ['ambulance'], 'def': 'a vehicle that takes people to and from hospitals', 'name': 'ambulance'}, {'frequency': 'r', 'id': 10, 'synset': 'amplifier.n.01', 'synonyms': ['amplifier'], 'def': 'electronic equipment that increases strength of signals', 'name': 'amplifier'}, {'frequency': 'c', 'id': 11, 'synset': 'anklet.n.03', 'synonyms': ['anklet', 'ankle_bracelet'], 'def': 'an ornament worn around the ankle', 'name': 'anklet'}, {'frequency': 'f', 'id': 12, 'synset': 'antenna.n.01', 'synonyms': ['antenna', 'aerial', 'transmitting_aerial'], 'def': 'an electrical device that sends or receives radio or television signals', 'name': 'antenna'}, {'frequency': 'f', 'id': 13, 'synset': 'apple.n.01', 'synonyms': ['apple'], 'def': 'fruit with red or yellow or green skin and sweet to tart crisp whitish flesh', 'name': 'apple'}, {'frequency': 'r', 'id': 14, 'synset': 'apple_juice.n.01', 'synonyms': ['apple_juice'], 'def': 'the juice of apples', 'name': 'apple_juice'}, {'frequency': 'r', 'id': 15, 'synset': 'applesauce.n.01', 'synonyms': ['applesauce'], 'def': 'puree of stewed apples usually sweetened and spiced', 'name': 'applesauce'}, {'frequency': 'r', 'id': 16, 'synset': 'apricot.n.02', 'synonyms': ['apricot'], 'def': 'downy yellow to rosy-colored fruit resembling a small peach', 'name': 'apricot'}, {'frequency': 'f', 'id': 17, 'synset': 'apron.n.01', 'synonyms': ['apron'], 'def': 'a garment of cloth that is tied about the waist and worn to protect clothing', 'name': 'apron'}, {'frequency': 'c', 'id': 18, 'synset': 'aquarium.n.01', 'synonyms': ['aquarium', 'fish_tank'], 'def': 'a tank/pool/bowl filled with water for keeping live fish and underwater animals', 'name': 'aquarium'}, {'frequency': 'c', 'id': 19, 'synset': 'armband.n.02', 'synonyms': ['armband'], 'def': 'a band worn around the upper arm', 'name': 'armband'}, {'frequency': 'f', 'id': 20, 
'synset': 'armchair.n.01', 'synonyms': ['armchair'], 'def': 'chair with a support on each side for arms', 'name': 'armchair'}, {'frequency': 'r', 'id': 21, 'synset': 'armoire.n.01', 'synonyms': ['armoire'], 'def': 'a large wardrobe or cabinet', 'name': 'armoire'}, {'frequency': 'r', 'id': 22, 'synset': 'armor.n.01', 'synonyms': ['armor', 'armour'], 'def': 'protective covering made of metal and used in combat', 'name': 'armor'}, {'frequency': 'c', 'id': 23, 'synset': 'artichoke.n.02', 'synonyms': ['artichoke'], 'def': 'a thistlelike flower head with edible fleshy leaves and heart', 'name': 'artichoke'}, {'frequency': 'f', 'id': 24, 'synset': 'ashcan.n.01', 'synonyms': ['trash_can', 'garbage_can', 'wastebin', 'dustbin', 'trash_barrel', 'trash_bin'], 'def': 'a bin that holds rubbish until it is collected', 'name': 'trash_can'}, {'frequency': 'c', 'id': 25, 'synset': 'ashtray.n.01', 'synonyms': ['ashtray'], 'def': "a receptacle for the ash from smokers' cigars or cigarettes", 'name': 'ashtray'}, {'frequency': 'c', 'id': 26, 'synset': 'asparagus.n.02', 'synonyms': ['asparagus'], 'def': 'edible young shoots of the asparagus plant', 'name': 'asparagus'}, {'frequency': 'c', 'id': 27, 'synset': 'atomizer.n.01', 'synonyms': ['atomizer', 'atomiser', 'spray', 'sprayer', 'nebulizer', 'nebuliser'], 'def': 'a dispenser that turns a liquid (such as perfume) into a fine mist', 'name': 'atomizer'}, {'frequency': 'c', 'id': 28, 'synset': 'avocado.n.01', 'synonyms': ['avocado'], 'def': 'a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed', 'name': 'avocado'}, {'frequency': 'c', 'id': 29, 'synset': 'award.n.02', 'synonyms': ['award', 'accolade'], 'def': 'a tangible symbol signifying approval or distinction', 'name': 'award'}, {'frequency': 'f', 'id': 30, 'synset': 'awning.n.01', 'synonyms': ['awning'], 'def': 'a canopy made of canvas to shelter people or things from rain or sun', 'name': 'awning'}, {'frequency': 'r', 'id': 31, 'synset': 'ax.n.01', 'synonyms': ['ax', 'axe'], 'def': 'an edge tool with a heavy bladed head mounted across a handle', 'name': 'ax'}, {'frequency': 'f', 'id': 32, 'synset': 'baby_buggy.n.01', 'synonyms': ['baby_buggy', 'baby_carriage', 'perambulator', 'pram', 'stroller'], 'def': 'a small vehicle with four wheels in which a baby or child is pushed around', 'name': 'baby_buggy'}, {'frequency': 'c', 'id': 33, 'synset': 'backboard.n.01', 'synonyms': ['basketball_backboard'], 'def': 'a raised vertical board with basket attached; used to play basketball', 'name': 'basketball_backboard'}, {'frequency': 'f', 'id': 34, 'synset': 'backpack.n.01', 'synonyms': ['backpack', 'knapsack', 'packsack', 'rucksack', 'haversack'], 'def': 'a bag carried by a strap on your back or shoulder', 'name': 'backpack'}, {'frequency': 'f', 'id': 35, 'synset': 'bag.n.04', 'synonyms': ['handbag', 'purse', 'pocketbook'], 'def': 'a container used for carrying money and small personal items or accessories', 'name': 'handbag'}, {'frequency': 'f', 'id': 36, 'synset': 'bag.n.06', 'synonyms': ['suitcase', 'baggage', 'luggage'], 'def': 'cases used to carry belongings when traveling', 'name': 'suitcase'}, {'frequency': 'c', 'id': 37, 'synset': 'bagel.n.01', 'synonyms': ['bagel', 'beigel'], 'def': 'glazed yeast-raised doughnut-shaped roll with hard crust', 'name': 'bagel'}, {'frequency': 'r', 'id': 38, 'synset': 'bagpipe.n.01', 'synonyms': ['bagpipe'], 'def': 'a tubular wind instrument; the player blows air into a bag and squeezes it out', 'name': 'bagpipe'}, {'frequency': 'r', 
'id': 39, 'synset': 'baguet.n.01', 'synonyms': ['baguet', 'baguette'], 'def': 'narrow French stick loaf', 'name': 'baguet'}, {'frequency': 'r', 'id': 40, 'synset': 'bait.n.02', 'synonyms': ['bait', 'lure'], 'def': 'something used to lure fish or other animals into danger so they can be trapped or killed', 'name': 'bait'}, {'frequency': 'f', 'id': 41, 'synset': 'ball.n.06', 'synonyms': ['ball'], 'def': 'a spherical object used as a plaything', 'name': 'ball'}, {'frequency': 'r', 'id': 42, 'synset': 'ballet_skirt.n.01', 'synonyms': ['ballet_skirt', 'tutu'], 'def': 'very short skirt worn by ballerinas', 'name': 'ballet_skirt'}, {'frequency': 'f', 'id': 43, 'synset': 'balloon.n.01', 'synonyms': ['balloon'], 'def': 'large tough nonrigid bag filled with gas or heated air', 'name': 'balloon'}, {'frequency': 'c', 'id': 44, 'synset': 'bamboo.n.02', 'synonyms': ['bamboo'], 'def': 'woody tropical grass having hollow woody stems', 'name': 'bamboo'}, {'frequency': 'f', 'id': 45, 'synset': 'banana.n.02', 'synonyms': ['banana'], 'def': 'elongated crescent-shaped yellow fruit with soft sweet flesh', 'name': 'banana'}, {'frequency': 'r', 'id': 46, 'synset': 'band_aid.n.01', 'synonyms': ['Band_Aid'], 'def': 'trade name for an adhesive bandage to cover small cuts or blisters', 'name': 'Band_Aid'}, {'frequency': 'c', 'id': 47, 'synset': 'bandage.n.01', 'synonyms': ['bandage'], 'def': 'a piece of soft material that covers and protects an injured part of the body', 'name': 'bandage'}, {'frequency': 'c', 'id': 48, 'synset': 'bandanna.n.01', 'synonyms': ['bandanna', 'bandana'], 'def': 'large and brightly colored handkerchief; often used as a neckerchief', 'name': 'bandanna'}, {'frequency': 'r', 'id': 49, 'synset': 'banjo.n.01', 'synonyms': ['banjo'], 'def': 'a stringed instrument of the guitar family with a long neck and circular body', 'name': 'banjo'}, {'frequency': 'f', 'id': 50, 'synset': 'banner.n.01', 'synonyms': ['banner', 'streamer'], 'def': 'long strip of cloth or paper used for decoration or advertising', 'name': 'banner'}, {'frequency': 'r', 'id': 51, 'synset': 'barbell.n.01', 'synonyms': ['barbell'], 'def': 'a bar to which heavy discs are attached at each end; used in weightlifting', 'name': 'barbell'}, {'frequency': 'r', 'id': 52, 'synset': 'barge.n.01', 'synonyms': ['barge'], 'def': 'a flatbottom boat for carrying heavy loads (especially on canals)', 'name': 'barge'}, {'frequency': 'f', 'id': 53, 'synset': 'barrel.n.02', 'synonyms': ['barrel', 'cask'], 'def': 'a cylindrical container that holds liquids', 'name': 'barrel'}, {'frequency': 'c', 'id': 54, 'synset': 'barrette.n.01', 'synonyms': ['barrette'], 'def': "a pin for holding women's hair in place", 'name': 'barrette'}, {'frequency': 'c', 'id': 55, 'synset': 'barrow.n.03', 'synonyms': ['barrow', 'garden_cart', 'lawn_cart', 'wheelbarrow'], 'def': 'a cart for carrying small loads; has handles and one or more wheels', 'name': 'barrow'}, {'frequency': 'f', 'id': 56, 'synset': 'base.n.03', 'synonyms': ['baseball_base'], 'def': 'a place that the runner must touch before scoring', 'name': 'baseball_base'}, {'frequency': 'f', 'id': 57, 'synset': 'baseball.n.02', 'synonyms': ['baseball'], 'def': 'a ball used in playing baseball', 'name': 'baseball'}, {'frequency': 'f', 'id': 58, 'synset': 'baseball_bat.n.01', 'synonyms': ['baseball_bat'], 'def': 'an implement used in baseball by the batter', 'name': 'baseball_bat'}, {'frequency': 'f', 'id': 59, 'synset': 'baseball_cap.n.01', 'synonyms': ['baseball_cap', 'jockey_cap', 'golf_cap'], 'def': 'a cap with a 
bill', 'name': 'baseball_cap'}, {'frequency': 'f', 'id': 60, 'synset': 'baseball_glove.n.01', 'synonyms': ['baseball_glove', 'baseball_mitt'], 'def': 'the handwear used by fielders in playing baseball', 'name': 'baseball_glove'}, {'frequency': 'f', 'id': 61, 'synset': 'basket.n.01', 'synonyms': ['basket', 'handbasket'], 'def': 'a container that is usually woven and has handles', 'name': 'basket'}, {'frequency': 'c', 'id': 62, 'synset': 'basket.n.03', 'synonyms': ['basketball_hoop'], 'def': 'metal hoop supporting a net through which players try to throw the basketball', 'name': 'basketball_hoop'}, {'frequency': 'c', 'id': 63, 'synset': 'basketball.n.02', 'synonyms': ['basketball'], 'def': 'an inflated ball used in playing basketball', 'name': 'basketball'}, {'frequency': 'r', 'id': 64, 'synset': 'bass_horn.n.01', 'synonyms': ['bass_horn', 'sousaphone', 'tuba'], 'def': 'the lowest brass wind instrument', 'name': 'bass_horn'}, {'frequency': 'r', 'id': 65, 'synset': 'bat.n.01', 'synonyms': ['bat_(animal)'], 'def': 'nocturnal mouselike mammal with forelimbs modified to form membranous wings', 'name': 'bat_(animal)'}, {'frequency': 'f', 'id': 66, 'synset': 'bath_mat.n.01', 'synonyms': ['bath_mat'], 'def': 'a heavy towel or mat to stand on while drying yourself after a bath', 'name': 'bath_mat'}, {'frequency': 'f', 'id': 67, 'synset': 'bath_towel.n.01', 'synonyms': ['bath_towel'], 'def': 'a large towel; to dry yourself after a bath', 'name': 'bath_towel'}, {'frequency': 'c', 'id': 68, 'synset': 'bathrobe.n.01', 'synonyms': ['bathrobe'], 'def': 'a loose-fitting robe of towelling; worn after a bath or swim', 'name': 'bathrobe'}, {'frequency': 'f', 'id': 69, 'synset': 'bathtub.n.01', 'synonyms': ['bathtub', 'bathing_tub'], 'def': 'a large open container that you fill with water and use to wash the body', 'name': 'bathtub'}, {'frequency': 'r', 'id': 70, 'synset': 'batter.n.02', 'synonyms': ['batter_(food)'], 'def': 'a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking', 'name': 'batter_(food)'}, {'frequency': 'c', 'id': 71, 'synset': 'battery.n.02', 'synonyms': ['battery'], 'def': 'a portable device that produces electricity', 'name': 'battery'}, {'frequency': 'r', 'id': 72, 'synset': 'beach_ball.n.01', 'synonyms': ['beachball'], 'def': 'large and light ball; for play at the seaside', 'name': 'beachball'}, {'frequency': 'c', 'id': 73, 'synset': 'bead.n.01', 'synonyms': ['bead'], 'def': 'a small ball with a hole through the middle used for ornamentation, jewellery, etc.', 'name': 'bead'}, {'frequency': 'r', 'id': 74, 'synset': 'beaker.n.01', 'synonyms': ['beaker'], 'def': 'a flatbottomed jar made of glass or plastic; used for chemistry', 'name': 'beaker'}, {'frequency': 'c', 'id': 75, 'synset': 'bean_curd.n.01', 'synonyms': ['bean_curd', 'tofu'], 'def': 'cheeselike food made of curdled soybean milk', 'name': 'bean_curd'}, {'frequency': 'c', 'id': 76, 'synset': 'beanbag.n.01', 'synonyms': ['beanbag'], 'def': 'a bag filled with dried beans or similar items; used in games or to sit on', 'name': 'beanbag'}, {'frequency': 'f', 'id': 77, 'synset': 'beanie.n.01', 'synonyms': ['beanie', 'beany'], 'def': 'a small skullcap; formerly worn by schoolboys and college freshmen', 'name': 'beanie'}, {'frequency': 'f', 'id': 78, 'synset': 'bear.n.01', 'synonyms': ['bear'], 'def': 'large carnivorous or omnivorous mammals with shaggy coats and claws', 'name': 'bear'}, {'frequency': 'f', 'id': 79, 'synset': 'bed.n.01', 'synonyms': ['bed'], 'def': 'a piece of furniture that provides a place to 
sleep', 'name': 'bed'}, {'frequency': 'c', 'id': 80, 'synset': 'bedspread.n.01', 'synonyms': ['bedspread', 'bedcover', 'bed_covering', 'counterpane', 'spread'], 'def': 'decorative cover for a bed', 'name': 'bedspread'}, {'frequency': 'f', 'id': 81, 'synset': 'beef.n.01', 'synonyms': ['cow'], 'def': 'cattle that are reared for their meat', 'name': 'cow'}, {'frequency': 'c', 'id': 82, 'synset': 'beef.n.02', 'synonyms': ['beef_(food)', 'boeuf_(food)'], 'def': 'meat from an adult domestic bovine', 'name': 'beef_(food)'}, {'frequency': 'r', 'id': 83, 'synset': 'beeper.n.01', 'synonyms': ['beeper', 'pager'], 'def': 'an device that beeps when the person carrying it is being paged', 'name': 'beeper'}, {'frequency': 'f', 'id': 84, 'synset': 'beer_bottle.n.01', 'synonyms': ['beer_bottle'], 'def': 'a bottle that holds beer', 'name': 'beer_bottle'}, {'frequency': 'c', 'id': 85, 'synset': 'beer_can.n.01', 'synonyms': ['beer_can'], 'def': 'a can that holds beer', 'name': 'beer_can'}, {'frequency': 'r', 'id': 86, 'synset': 'beetle.n.01', 'synonyms': ['beetle'], 'def': 'insect with hard wing covers', 'name': 'beetle'}, {'frequency': 'f', 'id': 87, 'synset': 'bell.n.01', 'synonyms': ['bell'], 'def': 'a hollow device made of metal that makes a ringing sound when struck', 'name': 'bell'}, {'frequency': 'f', 'id': 88, 'synset': 'bell_pepper.n.02', 'synonyms': ['bell_pepper', 'capsicum'], 'def': 'large bell-shaped sweet pepper in green or red or yellow or orange or black varieties', 'name': 'bell_pepper'}, {'frequency': 'f', 'id': 89, 'synset': 'belt.n.02', 'synonyms': ['belt'], 'def': 'a band to tie or buckle around the body (usually at the waist)', 'name': 'belt'}, {'frequency': 'f', 'id': 90, 'synset': 'belt_buckle.n.01', 'synonyms': ['belt_buckle'], 'def': 'the buckle used to fasten a belt', 'name': 'belt_buckle'}, {'frequency': 'f', 'id': 91, 'synset': 'bench.n.01', 'synonyms': ['bench'], 'def': 'a long seat for more than one person', 'name': 'bench'}, {'frequency': 'c', 'id': 92, 'synset': 'beret.n.01', 'synonyms': ['beret'], 'def': 'a cap with no brim or bill; made of soft cloth', 'name': 'beret'}, {'frequency': 'c', 'id': 93, 'synset': 'bib.n.02', 'synonyms': ['bib'], 'def': 'a napkin tied under the chin of a child while eating', 'name': 'bib'}, {'frequency': 'r', 'id': 94, 'synset': 'bible.n.01', 'synonyms': ['Bible'], 'def': 'the sacred writings of the Christian religions', 'name': 'Bible'}, {'frequency': 'f', 'id': 95, 'synset': 'bicycle.n.01', 'synonyms': ['bicycle', 'bike_(bicycle)'], 'def': 'a wheeled vehicle that has two wheels and is moved by foot pedals', 'name': 'bicycle'}, {'frequency': 'f', 'id': 96, 'synset': 'bill.n.09', 'synonyms': ['visor', 'vizor'], 'def': 'a brim that projects to the front to shade the eyes', 'name': 'visor'}, {'frequency': 'c', 'id': 97, 'synset': 'binder.n.03', 'synonyms': ['binder', 'ring-binder'], 'def': 'holds loose papers or magazines', 'name': 'binder'}, {'frequency': 'c', 'id': 98, 'synset': 'binoculars.n.01', 'synonyms': ['binoculars', 'field_glasses', 'opera_glasses'], 'def': 'an optical instrument designed for simultaneous use by both eyes', 'name': 'binoculars'}, {'frequency': 'f', 'id': 99, 'synset': 'bird.n.01', 'synonyms': ['bird'], 'def': 'animal characterized by feathers and wings', 'name': 'bird'}, {'frequency': 'r', 'id': 100, 'synset': 'bird_feeder.n.01', 'synonyms': ['birdfeeder'], 'def': 'an outdoor device that supplies food for wild birds', 'name': 'birdfeeder'}, {'frequency': 'r', 'id': 101, 'synset': 'birdbath.n.01', 'synonyms': ['birdbath'], 
'def': 'an ornamental basin (usually in a garden) for birds to bathe in', 'name': 'birdbath'}, {'frequency': 'c', 'id': 102, 'synset': 'birdcage.n.01', 'synonyms': ['birdcage'], 'def': 'a cage in which a bird can be kept', 'name': 'birdcage'}, {'frequency': 'c', 'id': 103, 'synset': 'birdhouse.n.01', 'synonyms': ['birdhouse'], 'def': 'a shelter for birds', 'name': 'birdhouse'}, {'frequency': 'f', 'id': 104, 'synset': 'birthday_cake.n.01', 'synonyms': ['birthday_cake'], 'def': 'decorated cake served at a birthday party', 'name': 'birthday_cake'}, {'frequency': 'r', 'id': 105, 'synset': 'birthday_card.n.01', 'synonyms': ['birthday_card'], 'def': 'a card expressing a birthday greeting', 'name': 'birthday_card'}, {'frequency': 'r', 'id': 106, 'synset': 'biscuit.n.01', 'synonyms': ['biscuit_(bread)'], 'def': 'small round bread leavened with baking-powder or soda', 'name': 'biscuit_(bread)'}, {'frequency': 'r', 'id': 107, 'synset': 'black_flag.n.01', 'synonyms': ['pirate_flag'], 'def': 'a flag usually bearing a white skull and crossbones on a black background', 'name': 'pirate_flag'}, {'frequency': 'c', 'id': 108, 'synset': 'black_sheep.n.02', 'synonyms': ['black_sheep'], 'def': 'sheep with a black coat', 'name': 'black_sheep'}, {'frequency': 'c', 'id': 109, 'synset': 'blackboard.n.01', 'synonyms': ['blackboard', 'chalkboard'], 'def': 'sheet of slate; for writing with chalk', 'name': 'blackboard'}, {'frequency': 'f', 'id': 110, 'synset': 'blanket.n.01', 'synonyms': ['blanket'], 'def': 'bedding that keeps a person warm in bed', 'name': 'blanket'}, {'frequency': 'c', 'id': 111, 'synset': 'blazer.n.01', 'synonyms': ['blazer', 'sport_jacket', 'sport_coat', 'sports_jacket', 'sports_coat'], 'def': 'lightweight jacket; often striped in the colors of a club or school', 'name': 'blazer'}, {'frequency': 'f', 'id': 112, 'synset': 'blender.n.01', 'synonyms': ['blender', 'liquidizer', 'liquidiser'], 'def': 'an electrically powered mixer that mix or chop or liquefy foods', 'name': 'blender'}, {'frequency': 'r', 'id': 113, 'synset': 'blimp.n.02', 'synonyms': ['blimp'], 'def': 'a small nonrigid airship used for observation or as a barrage balloon', 'name': 'blimp'}, {'frequency': 'c', 'id': 114, 'synset': 'blinker.n.01', 'synonyms': ['blinker', 'flasher'], 'def': 'a light that flashes on and off; used as a signal or to send messages', 'name': 'blinker'}, {'frequency': 'c', 'id': 115, 'synset': 'blueberry.n.02', 'synonyms': ['blueberry'], 'def': 'sweet edible dark-blue berries of blueberry plants', 'name': 'blueberry'}, {'frequency': 'r', 'id': 116, 'synset': 'boar.n.02', 'synonyms': ['boar'], 'def': 'an uncastrated male hog', 'name': 'boar'}, {'frequency': 'r', 'id': 117, 'synset': 'board.n.09', 'synonyms': ['gameboard'], 'def': 'a flat portable surface (usually rectangular) designed for board games', 'name': 'gameboard'}, {'frequency': 'f', 'id': 118, 'synset': 'boat.n.01', 'synonyms': ['boat', 'ship_(boat)'], 'def': 'a vessel for travel on water', 'name': 'boat'}, {'frequency': 'c', 'id': 119, 'synset': 'bobbin.n.01', 'synonyms': ['bobbin', 'spool', 'reel'], 'def': 'a thing around which thread/tape/film or other flexible materials can be wound', 'name': 'bobbin'}, {'frequency': 'r', 'id': 120, 'synset': 'bobby_pin.n.01', 'synonyms': ['bobby_pin', 'hairgrip'], 'def': 'a flat wire hairpin used to hold bobbed hair in place', 'name': 'bobby_pin'}, {'frequency': 'c', 'id': 121, 'synset': 'boiled_egg.n.01', 'synonyms': ['boiled_egg', 'coddled_egg'], 'def': 'egg cooked briefly in the shell in gently boiling water', 
'name': 'boiled_egg'}, {'frequency': 'r', 'id': 122, 'synset': 'bolo_tie.n.01', 'synonyms': ['bolo_tie', 'bolo', 'bola_tie', 'bola'], 'def': 'a cord fastened around the neck with an ornamental clasp and worn as a necktie', 'name': 'bolo_tie'}, {'frequency': 'c', 'id': 123, 'synset': 'bolt.n.03', 'synonyms': ['deadbolt'], 'def': 'the part of a lock that is engaged or withdrawn with a key', 'name': 'deadbolt'}, {'frequency': 'f', 'id': 124, 'synset': 'bolt.n.06', 'synonyms': ['bolt'], 'def': 'a screw that screws into a nut to form a fastener', 'name': 'bolt'}, {'frequency': 'r', 'id': 125, 'synset': 'bonnet.n.01', 'synonyms': ['bonnet'], 'def': 'a hat tied under the chin', 'name': 'bonnet'}, {'frequency': 'f', 'id': 126, 'synset': 'book.n.01', 'synonyms': ['book'], 'def': 'a written work or composition that has been published', 'name': 'book'}, {'frequency': 'r', 'id': 127, 'synset': 'book_bag.n.01', 'synonyms': ['book_bag'], 'def': 'a bag in which students carry their books', 'name': 'book_bag'}, {'frequency': 'c', 'id': 128, 'synset': 'bookcase.n.01', 'synonyms': ['bookcase'], 'def': 'a piece of furniture with shelves for storing books', 'name': 'bookcase'}, {'frequency': 'c', 'id': 129, 'synset': 'booklet.n.01', 'synonyms': ['booklet', 'brochure', 'leaflet', 'pamphlet'], 'def': 'a small book usually having a paper cover', 'name': 'booklet'}, {'frequency': 'r', 'id': 130, 'synset': 'bookmark.n.01', 'synonyms': ['bookmark', 'bookmarker'], 'def': 'a marker (a piece of paper or ribbon) placed between the pages of a book', 'name': 'bookmark'}, {'frequency': 'r', 'id': 131, 'synset': 'boom.n.04', 'synonyms': ['boom_microphone', 'microphone_boom'], 'def': 'a pole carrying an overhead microphone projected over a film or tv set', 'name': 'boom_microphone'}, {'frequency': 'f', 'id': 132, 'synset': 'boot.n.01', 'synonyms': ['boot'], 'def': 'footwear that covers the whole foot and lower leg', 'name': 'boot'}, {'frequency': 'f', 'id': 133, 'synset': 'bottle.n.01', 'synonyms': ['bottle'], 'def': 'a glass or plastic vessel used for storing drinks or other liquids', 'name': 'bottle'}, {'frequency': 'c', 'id': 134, 'synset': 'bottle_opener.n.01', 'synonyms': ['bottle_opener'], 'def': 'an opener for removing caps or corks from bottles', 'name': 'bottle_opener'}, {'frequency': 'c', 'id': 135, 'synset': 'bouquet.n.01', 'synonyms': ['bouquet'], 'def': 'an arrangement of flowers that is usually given as a present', 'name': 'bouquet'}, {'frequency': 'r', 'id': 136, 'synset': 'bow.n.04', 'synonyms': ['bow_(weapon)'], 'def': 'a weapon for shooting arrows', 'name': 'bow_(weapon)'}, {'frequency': 'f', 'id': 137, 'synset': 'bow.n.08', 'synonyms': ['bow_(decorative_ribbons)'], 'def': 'a decorative interlacing of ribbons', 'name': 'bow_(decorative_ribbons)'}, {'frequency': 'f', 'id': 138, 'synset': 'bow_tie.n.01', 'synonyms': ['bow-tie', 'bowtie'], 'def': "a man's tie that ties in a bow", 'name': 'bow-tie'}, {'frequency': 'f', 'id': 139, 'synset': 'bowl.n.03', 'synonyms': ['bowl'], 'def': 'a dish that is round and open at the top for serving foods', 'name': 'bowl'}, {'frequency': 'r', 'id': 140, 'synset': 'bowl.n.08', 'synonyms': ['pipe_bowl'], 'def': 'a small round container that is open at the top for holding tobacco', 'name': 'pipe_bowl'}, {'frequency': 'c', 'id': 141, 'synset': 'bowler_hat.n.01', 'synonyms': ['bowler_hat', 'bowler', 'derby_hat', 'derby', 'plug_hat'], 'def': 'a felt hat that is round and hard with a narrow brim', 'name': 'bowler_hat'}, {'frequency': 'r', 'id': 142, 'synset': 'bowling_ball.n.01', 
'synonyms': ['bowling_ball'], 'def': 'a large ball with finger holes used in the sport of bowling', 'name': 'bowling_ball'}, {'frequency': 'r', 'id': 143, 'synset': 'bowling_pin.n.01', 'synonyms': ['bowling_pin'], 'def': 'a club-shaped wooden object used in bowling', 'name': 'bowling_pin'}, {'frequency': 'r', 'id': 144, 'synset': 'boxing_glove.n.01', 'synonyms': ['boxing_glove'], 'def': 'large glove coverings the fists of a fighter worn for the sport of boxing', 'name': 'boxing_glove'}, {'frequency': 'c', 'id': 145, 'synset': 'brace.n.06', 'synonyms': ['suspenders'], 'def': 'elastic straps that hold trousers up (usually used in the plural)', 'name': 'suspenders'}, {'frequency': 'f', 'id': 146, 'synset': 'bracelet.n.02', 'synonyms': ['bracelet', 'bangle'], 'def': 'jewelry worn around the wrist for decoration', 'name': 'bracelet'}, {'frequency': 'r', 'id': 147, 'synset': 'brass.n.07', 'synonyms': ['brass_plaque'], 'def': 'a memorial made of brass', 'name': 'brass_plaque'}, {'frequency': 'c', 'id': 148, 'synset': 'brassiere.n.01', 'synonyms': ['brassiere', 'bra', 'bandeau'], 'def': 'an undergarment worn by women to support their breasts', 'name': 'brassiere'}, {'frequency': 'c', 'id': 149, 'synset': 'bread-bin.n.01', 'synonyms': ['bread-bin', 'breadbox'], 'def': 'a container used to keep bread or cake in', 'name': 'bread-bin'}, {'frequency': 'r', 'id': 150, 'synset': 'breechcloth.n.01', 'synonyms': ['breechcloth', 'breechclout', 'loincloth'], 'def': 'a garment that provides covering for the loins', 'name': 'breechcloth'}, {'frequency': 'c', 'id': 151, 'synset': 'bridal_gown.n.01', 'synonyms': ['bridal_gown', 'wedding_gown', 'wedding_dress'], 'def': 'a gown worn by the bride at a wedding', 'name': 'bridal_gown'}, {'frequency': 'c', 'id': 152, 'synset': 'briefcase.n.01', 'synonyms': ['briefcase'], 'def': 'a case with a handle; for carrying papers or files or books', 'name': 'briefcase'}, {'frequency': 'c', 'id': 153, 'synset': 'bristle_brush.n.01', 'synonyms': ['bristle_brush'], 'def': 'a brush that is made with the short stiff hairs of an animal or plant', 'name': 'bristle_brush'}, {'frequency': 'f', 'id': 154, 'synset': 'broccoli.n.01', 'synonyms': ['broccoli'], 'def': 'plant with dense clusters of tight green flower buds', 'name': 'broccoli'}, {'frequency': 'r', 'id': 155, 'synset': 'brooch.n.01', 'synonyms': ['broach'], 'def': 'a decorative pin worn by women', 'name': 'broach'}, {'frequency': 'c', 'id': 156, 'synset': 'broom.n.01', 'synonyms': ['broom'], 'def': 'bundle of straws or twigs attached to a long handle; used for cleaning', 'name': 'broom'}, {'frequency': 'c', 'id': 157, 'synset': 'brownie.n.03', 'synonyms': ['brownie'], 'def': 'square or bar of very rich chocolate cake usually with nuts', 'name': 'brownie'}, {'frequency': 'c', 'id': 158, 'synset': 'brussels_sprouts.n.01', 'synonyms': ['brussels_sprouts'], 'def': 'the small edible cabbage-like buds growing along a stalk', 'name': 'brussels_sprouts'}, {'frequency': 'r', 'id': 159, 'synset': 'bubble_gum.n.01', 'synonyms': ['bubble_gum'], 'def': 'a kind of chewing gum that can be blown into bubbles', 'name': 'bubble_gum'}, {'frequency': 'f', 'id': 160, 'synset': 'bucket.n.01', 'synonyms': ['bucket', 'pail'], 'def': 'a roughly cylindrical vessel that is open at the top', 'name': 'bucket'}, {'frequency': 'r', 'id': 161, 'synset': 'buggy.n.01', 'synonyms': ['horse_buggy'], 'def': 'a small lightweight carriage; drawn by a single horse', 'name': 'horse_buggy'}, {'frequency': 'c', 'id': 162, 'synset': 'bull.n.11', 'synonyms': ['bull'], 
'def': 'mature male cow', 'name': 'bull'}, {'frequency': 'r', 'id': 163, 'synset': 'bulldog.n.01', 'synonyms': ['bulldog'], 'def': 'a thickset short-haired dog with a large head and strong undershot lower jaw', 'name': 'bulldog'}, {'frequency': 'r', 'id': 164, 'synset': 'bulldozer.n.01', 'synonyms': ['bulldozer', 'dozer'], 'def': 'large powerful tractor; a large blade in front flattens areas of ground', 'name': 'bulldozer'}, {'frequency': 'c', 'id': 165, 'synset': 'bullet_train.n.01', 'synonyms': ['bullet_train'], 'def': 'a high-speed passenger train', 'name': 'bullet_train'}, {'frequency': 'c', 'id': 166, 'synset': 'bulletin_board.n.02', 'synonyms': ['bulletin_board', 'notice_board'], 'def': 'a board that hangs on a wall; displays announcements', 'name': 'bulletin_board'}, {'frequency': 'r', 'id': 167, 'synset': 'bulletproof_vest.n.01', 'synonyms': ['bulletproof_vest'], 'def': 'a vest capable of resisting the impact of a bullet', 'name': 'bulletproof_vest'}, {'frequency': 'c', 'id': 168, 'synset': 'bullhorn.n.01', 'synonyms': ['bullhorn', 'megaphone'], 'def': 'a portable loudspeaker with built-in microphone and amplifier', 'name': 'bullhorn'}, {'frequency': 'r', 'id': 169, 'synset': 'bully_beef.n.01', 'synonyms': ['corned_beef', 'corn_beef'], 'def': 'beef cured or pickled in brine', 'name': 'corned_beef'}, {'frequency': 'f', 'id': 170, 'synset': 'bun.n.01', 'synonyms': ['bun', 'roll'], 'def': 'small rounded bread either plain or sweet', 'name': 'bun'}, {'frequency': 'c', 'id': 171, 'synset': 'bunk_bed.n.01', 'synonyms': ['bunk_bed'], 'def': 'beds built one above the other', 'name': 'bunk_bed'}, {'frequency': 'f', 'id': 172, 'synset': 'buoy.n.01', 'synonyms': ['buoy'], 'def': 'a float attached by rope to the seabed to mark channels in a harbor or underwater hazards', 'name': 'buoy'}, {'frequency': 'r', 'id': 173, 'synset': 'burrito.n.01', 'synonyms': ['burrito'], 'def': 'a flour tortilla folded around a filling', 'name': 'burrito'}, {'frequency': 'f', 'id': 174, 'synset': 'bus.n.01', 'synonyms': ['bus_(vehicle)', 'autobus', 'charabanc', 'double-decker', 'motorbus', 'motorcoach'], 'def': 'a vehicle carrying many passengers; used for public transport', 'name': 'bus_(vehicle)'}, {'frequency': 'c', 'id': 175, 'synset': 'business_card.n.01', 'synonyms': ['business_card'], 'def': "a card on which are printed the person's name and business affiliation", 'name': 'business_card'}, {'frequency': 'c', 'id': 176, 'synset': 'butcher_knife.n.01', 'synonyms': ['butcher_knife'], 'def': 'a large sharp knife for cutting or trimming meat', 'name': 'butcher_knife'}, {'frequency': 'c', 'id': 177, 'synset': 'butter.n.01', 'synonyms': ['butter'], 'def': 'an edible emulsion of fat globules made by churning milk or cream; for cooking and table use', 'name': 'butter'}, {'frequency': 'c', 'id': 178, 'synset': 'butterfly.n.01', 'synonyms': ['butterfly'], 'def': 'insect typically having a slender body with knobbed antennae and broad colorful wings', 'name': 'butterfly'}, {'frequency': 'f', 'id': 179, 'synset': 'button.n.01', 'synonyms': ['button'], 'def': 'a round fastener sewn to shirts and coats etc to fit through buttonholes', 'name': 'button'}, {'frequency': 'f', 'id': 180, 'synset': 'cab.n.03', 'synonyms': ['cab_(taxi)', 'taxi', 'taxicab'], 'def': 'a car that takes passengers where they want to go in exchange for money', 'name': 'cab_(taxi)'}, {'frequency': 'r', 'id': 181, 'synset': 'cabana.n.01', 'synonyms': ['cabana'], 'def': 'a small tent used as a dressing room beside the sea or a swimming pool', 'name': 
'cabana'}, {'frequency': 'r', 'id': 182, 'synset': 'cabin_car.n.01', 'synonyms': ['cabin_car', 'caboose'], 'def': 'a car on a freight train for use of the train crew; usually the last car on the train', 'name': 'cabin_car'}, {'frequency': 'f', 'id': 183, 'synset': 'cabinet.n.01', 'synonyms': ['cabinet'], 'def': 'a piece of furniture resembling a cupboard with doors and shelves and drawers', 'name': 'cabinet'}, {'frequency': 'r', 'id': 184, 'synset': 'cabinet.n.03', 'synonyms': ['locker', 'storage_locker'], 'def': 'a storage compartment for clothes and valuables; usually it has a lock', 'name': 'locker'}, {'frequency': 'f', 'id': 185, 'synset': 'cake.n.03', 'synonyms': ['cake'], 'def': 'baked goods made from or based on a mixture of flour, sugar, eggs, and fat', 'name': 'cake'}, {'frequency': 'c', 'id': 186, 'synset': 'calculator.n.02', 'synonyms': ['calculator'], 'def': 'a small machine that is used for mathematical calculations', 'name': 'calculator'}, {'frequency': 'f', 'id': 187, 'synset': 'calendar.n.02', 'synonyms': ['calendar'], 'def': 'a list or register of events (appointments/social events/court cases, etc)', 'name': 'calendar'}, {'frequency': 'c', 'id': 188, 'synset': 'calf.n.01', 'synonyms': ['calf'], 'def': 'young of domestic cattle', 'name': 'calf'}, {'frequency': 'c', 'id': 189, 'synset': 'camcorder.n.01', 'synonyms': ['camcorder'], 'def': 'a portable television camera and videocassette recorder', 'name': 'camcorder'}, {'frequency': 'c', 'id': 190, 'synset': 'camel.n.01', 'synonyms': ['camel'], 'def': 'cud-chewing mammal used as a draft or saddle animal in desert regions', 'name': 'camel'}, {'frequency': 'f', 'id': 191, 'synset': 'camera.n.01', 'synonyms': ['camera'], 'def': 'equipment for taking photographs', 'name': 'camera'}, {'frequency': 'c', 'id': 192, 'synset': 'camera_lens.n.01', 'synonyms': ['camera_lens'], 'def': 'a lens that focuses the image in a camera', 'name': 'camera_lens'}, {'frequency': 'c', 'id': 193, 'synset': 'camper.n.02', 'synonyms': ['camper_(vehicle)', 'camping_bus', 'motor_home'], 'def': 'a recreational vehicle equipped for camping out while traveling', 'name': 'camper_(vehicle)'}, {'frequency': 'f', 'id': 194, 'synset': 'can.n.01', 'synonyms': ['can', 'tin_can'], 'def': 'airtight sealed metal container for food or drink or paint etc.', 'name': 'can'}, {'frequency': 'c', 'id': 195, 'synset': 'can_opener.n.01', 'synonyms': ['can_opener', 'tin_opener'], 'def': 'a device for cutting cans open', 'name': 'can_opener'}, {'frequency': 'r', 'id': 196, 'synset': 'candelabrum.n.01', 'synonyms': ['candelabrum', 'candelabra'], 'def': 'branched candlestick; ornamental; has several lights', 'name': 'candelabrum'}, {'frequency': 'f', 'id': 197, 'synset': 'candle.n.01', 'synonyms': ['candle', 'candlestick'], 'def': 'stick of wax with a wick in the middle', 'name': 'candle'}, {'frequency': 'f', 'id': 198, 'synset': 'candlestick.n.01', 'synonyms': ['candle_holder'], 'def': 'a holder with sockets for candles', 'name': 'candle_holder'}, {'frequency': 'r', 'id': 199, 'synset': 'candy_bar.n.01', 'synonyms': ['candy_bar'], 'def': 'a candy shaped as a bar', 'name': 'candy_bar'}, {'frequency': 'c', 'id': 200, 'synset': 'candy_cane.n.01', 'synonyms': ['candy_cane'], 'def': 'a hard candy in the shape of a rod (usually with stripes)', 'name': 'candy_cane'}, {'frequency': 'c', 'id': 201, 'synset': 'cane.n.01', 'synonyms': ['walking_cane'], 'def': 'a stick that people can lean on to help them walk', 'name': 'walking_cane'}, {'frequency': 'c', 'id': 202, 'synset': 'canister.n.02', 
'synonyms': ['canister', 'cannister'], 'def': 'metal container for storing dry foods such as tea or flour', 'name': 'canister'}, {'frequency': 'r', 'id': 203, 'synset': 'cannon.n.02', 'synonyms': ['cannon'], 'def': 'heavy gun fired from a tank', 'name': 'cannon'}, {'frequency': 'c', 'id': 204, 'synset': 'canoe.n.01', 'synonyms': ['canoe'], 'def': 'small and light boat; pointed at both ends; propelled with a paddle', 'name': 'canoe'}, {'frequency': 'r', 'id': 205, 'synset': 'cantaloup.n.02', 'synonyms': ['cantaloup', 'cantaloupe'], 'def': 'the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh', 'name': 'cantaloup'}, {'frequency': 'r', 'id': 206, 'synset': 'canteen.n.01', 'synonyms': ['canteen'], 'def': 'a flask for carrying water; used by soldiers or travelers', 'name': 'canteen'}, {'frequency': 'c', 'id': 207, 'synset': 'cap.n.01', 'synonyms': ['cap_(headwear)'], 'def': 'a tight-fitting headwear', 'name': 'cap_(headwear)'}, {'frequency': 'f', 'id': 208, 'synset': 'cap.n.02', 'synonyms': ['bottle_cap', 'cap_(container_lid)'], 'def': 'a top (as for a bottle)', 'name': 'bottle_cap'}, {'frequency': 'r', 'id': 209, 'synset': 'cape.n.02', 'synonyms': ['cape'], 'def': 'a sleeveless garment like a cloak but shorter', 'name': 'cape'}, {'frequency': 'c', 'id': 210, 'synset': 'cappuccino.n.01', 'synonyms': ['cappuccino', 'coffee_cappuccino'], 'def': 'equal parts of espresso and steamed milk', 'name': 'cappuccino'}, {'frequency': 'f', 'id': 211, 'synset': 'car.n.01', 'synonyms': ['car_(automobile)', 'auto_(automobile)', 'automobile'], 'def': 'a motor vehicle with four wheels', 'name': 'car_(automobile)'}, {'frequency': 'f', 'id': 212, 'synset': 'car.n.02', 'synonyms': ['railcar_(part_of_a_train)', 'railway_car_(part_of_a_train)', 'railroad_car_(part_of_a_train)'], 'def': 'a wheeled vehicle adapted to the rails of railroad', 'name': 'railcar_(part_of_a_train)'}, {'frequency': 'r', 'id': 213, 'synset': 'car.n.04', 'synonyms': ['elevator_car'], 'def': 'where passengers ride up and down', 'name': 'elevator_car'}, {'frequency': 'r', 'id': 214, 'synset': 'car_battery.n.01', 'synonyms': ['car_battery', 'automobile_battery'], 'def': 'a battery in a motor vehicle', 'name': 'car_battery'}, {'frequency': 'c', 'id': 215, 'synset': 'card.n.02', 'synonyms': ['identity_card'], 'def': 'a card certifying the identity of the bearer', 'name': 'identity_card'}, {'frequency': 'c', 'id': 216, 'synset': 'card.n.03', 'synonyms': ['card'], 'def': 'a rectangular piece of paper used to send messages (e.g. 
greetings or pictures)', 'name': 'card'}, {'frequency': 'r', 'id': 217, 'synset': 'cardigan.n.01', 'synonyms': ['cardigan'], 'def': 'knitted jacket that is fastened up the front with buttons or a zipper', 'name': 'cardigan'}, {'frequency': 'r', 'id': 218, 'synset': 'cargo_ship.n.01', 'synonyms': ['cargo_ship', 'cargo_vessel'], 'def': 'a ship designed to carry cargo', 'name': 'cargo_ship'}, {'frequency': 'r', 'id': 219, 'synset': 'carnation.n.01', 'synonyms': ['carnation'], 'def': 'plant with pink to purple-red spice-scented usually double flowers', 'name': 'carnation'}, {'frequency': 'c', 'id': 220, 'synset': 'carriage.n.02', 'synonyms': ['horse_carriage'], 'def': 'a vehicle with wheels drawn by one or more horses', 'name': 'horse_carriage'}, {'frequency': 'f', 'id': 221, 'synset': 'carrot.n.01', 'synonyms': ['carrot'], 'def': 'deep orange edible root of the cultivated carrot plant', 'name': 'carrot'}, {'frequency': 'c', 'id': 222, 'synset': 'carryall.n.01', 'synonyms': ['tote_bag'], 'def': 'a capacious bag or basket', 'name': 'tote_bag'}, {'frequency': 'c', 'id': 223, 'synset': 'cart.n.01', 'synonyms': ['cart'], 'def': 'a heavy open wagon usually having two wheels and drawn by an animal', 'name': 'cart'}, {'frequency': 'c', 'id': 224, 'synset': 'carton.n.02', 'synonyms': ['carton'], 'def': 'a box made of cardboard; opens by flaps on top', 'name': 'carton'}, {'frequency': 'c', 'id': 225, 'synset': 'cash_register.n.01', 'synonyms': ['cash_register', 'register_(for_cash_transactions)'], 'def': 'a cashbox with an adding machine to register transactions', 'name': 'cash_register'}, {'frequency': 'r', 'id': 226, 'synset': 'casserole.n.01', 'synonyms': ['casserole'], 'def': 'food cooked and served in a casserole', 'name': 'casserole'}, {'frequency': 'r', 'id': 227, 'synset': 'cassette.n.01', 'synonyms': ['cassette'], 'def': 'a container that holds a magnetic tape used for recording or playing sound or video', 'name': 'cassette'}, {'frequency': 'c', 'id': 228, 'synset': 'cast.n.05', 'synonyms': ['cast', 'plaster_cast', 'plaster_bandage'], 'def': 'bandage consisting of a firm covering that immobilizes broken bones while they heal', 'name': 'cast'}, {'frequency': 'f', 'id': 229, 'synset': 'cat.n.01', 'synonyms': ['cat'], 'def': 'a domestic house cat', 'name': 'cat'}, {'frequency': 'c', 'id': 230, 'synset': 'cauliflower.n.02', 'synonyms': ['cauliflower'], 'def': 'edible compact head of white undeveloped flowers', 'name': 'cauliflower'}, {'frequency': 'r', 'id': 231, 'synset': 'caviar.n.01', 'synonyms': ['caviar', 'caviare'], 'def': "salted roe of sturgeon or other large fish; usually served as an hors d'oeuvre", 'name': 'caviar'}, {'frequency': 'c', 'id': 232, 'synset': 'cayenne.n.02', 'synonyms': ['cayenne_(spice)', 'cayenne_pepper_(spice)', 'red_pepper_(spice)'], 'def': 'ground pods and seeds of pungent red peppers of the genus Capsicum', 'name': 'cayenne_(spice)'}, {'frequency': 'c', 'id': 233, 'synset': 'cd_player.n.01', 'synonyms': ['CD_player'], 'def': 'electronic equipment for playing compact discs (CDs)', 'name': 'CD_player'}, {'frequency': 'c', 'id': 234, 'synset': 'celery.n.01', 'synonyms': ['celery'], 'def': 'widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked', 'name': 'celery'}, {'frequency': 'f', 'id': 235, 'synset': 'cellular_telephone.n.01', 'synonyms': ['cellular_telephone', 'cellular_phone', 'cellphone', 'mobile_phone', 'smart_phone'], 'def': 'a hand-held mobile telephone', 'name': 'cellular_telephone'}, {'frequency': 'r', 'id': 236, 'synset': 
'chain_mail.n.01', 'synonyms': ['chain_mail', 'ring_mail', 'chain_armor', 'chain_armour', 'ring_armor', 'ring_armour'], 'def': '(Middle Ages) flexible armor made of interlinked metal rings', 'name': 'chain_mail'}, {'frequency': 'f', 'id': 237, 'synset': 'chair.n.01', 'synonyms': ['chair'], 'def': 'a seat for one person, with a support for the back', 'name': 'chair'}, {'frequency': 'r', 'id': 238, 'synset': 'chaise_longue.n.01', 'synonyms': ['chaise_longue', 'chaise', 'daybed'], 'def': 'a long chair; for reclining', 'name': 'chaise_longue'}, {'frequency': 'r', 'id': 239, 'synset': 'champagne.n.01', 'synonyms': ['champagne'], 'def': 'a white sparkling wine produced in Champagne or resembling that produced there', 'name': 'champagne'}, {'frequency': 'f', 'id': 240, 'synset': 'chandelier.n.01', 'synonyms': ['chandelier'], 'def': 'branched lighting fixture; often ornate; hangs from the ceiling', 'name': 'chandelier'}, {'frequency': 'r', 'id': 241, 'synset': 'chap.n.04', 'synonyms': ['chap'], 'def': 'leather leggings without a seat; worn over trousers by cowboys to protect their legs', 'name': 'chap'}, {'frequency': 'r', 'id': 242, 'synset': 'checkbook.n.01', 'synonyms': ['checkbook', 'chequebook'], 'def': 'a book issued to holders of checking accounts', 'name': 'checkbook'}, {'frequency': 'r', 'id': 243, 'synset': 'checkerboard.n.01', 'synonyms': ['checkerboard'], 'def': 'a board having 64 squares of two alternating colors', 'name': 'checkerboard'}, {'frequency': 'c', 'id': 244, 'synset': 'cherry.n.03', 'synonyms': ['cherry'], 'def': 'a red fruit with a single hard stone', 'name': 'cherry'}, {'frequency': 'r', 'id': 245, 'synset': 'chessboard.n.01', 'synonyms': ['chessboard'], 'def': 'a checkerboard used to play chess', 'name': 'chessboard'}, {'frequency': 'r', 'id': 246, 'synset': 'chest_of_drawers.n.01', 'synonyms': ['chest_of_drawers_(furniture)', 'bureau_(furniture)', 'chest_(furniture)'], 'def': 'furniture with drawers for keeping clothes', 'name': 'chest_of_drawers_(furniture)'}, {'frequency': 'c', 'id': 247, 'synset': 'chicken.n.02', 'synonyms': ['chicken_(animal)'], 'def': 'a domestic fowl bred for flesh or eggs', 'name': 'chicken_(animal)'}, {'frequency': 'c', 'id': 248, 'synset': 'chicken_wire.n.01', 'synonyms': ['chicken_wire'], 'def': 'a galvanized wire network with a hexagonal mesh; used to build fences', 'name': 'chicken_wire'}, {'frequency': 'r', 'id': 249, 'synset': 'chickpea.n.01', 'synonyms': ['chickpea', 'garbanzo'], 'def': 'the seed of the chickpea plant; usually dried', 'name': 'chickpea'}, {'frequency': 'r', 'id': 250, 'synset': 'chihuahua.n.03', 'synonyms': ['Chihuahua'], 'def': 'an old breed of tiny short-haired dog with protruding eyes from Mexico', 'name': 'Chihuahua'}, {'frequency': 'r', 'id': 251, 'synset': 'chili.n.02', 'synonyms': ['chili_(vegetable)', 'chili_pepper_(vegetable)', 'chilli_(vegetable)', 'chilly_(vegetable)', 'chile_(vegetable)'], 'def': 'very hot and finely tapering pepper of special pungency', 'name': 'chili_(vegetable)'}, {'frequency': 'r', 'id': 252, 'synset': 'chime.n.01', 'synonyms': ['chime', 'gong'], 'def': 'an instrument consisting of a set of bells that are struck with a hammer', 'name': 'chime'}, {'frequency': 'r', 'id': 253, 'synset': 'chinaware.n.01', 'synonyms': ['chinaware'], 'def': 'dishware made of high quality porcelain', 'name': 'chinaware'}, {'frequency': 'c', 'id': 254, 'synset': 'chip.n.04', 'synonyms': ['crisp_(potato_chip)', 'potato_chip'], 'def': 'a thin crisp slice of potato fried in deep fat', 'name': 'crisp_(potato_chip)'}, 
{'frequency': 'r', 'id': 255, 'synset': 'chip.n.06', 'synonyms': ['poker_chip'], 'def': 'a small disk-shaped counter used to represent money when gambling', 'name': 'poker_chip'}, {'frequency': 'c', 'id': 256, 'synset': 'chocolate_bar.n.01', 'synonyms': ['chocolate_bar'], 'def': 'a bar of chocolate candy', 'name': 'chocolate_bar'}, {'frequency': 'c', 'id': 257, 'synset': 'chocolate_cake.n.01', 'synonyms': ['chocolate_cake'], 'def': 'cake containing chocolate', 'name': 'chocolate_cake'}, {'frequency': 'r', 'id': 258, 'synset': 'chocolate_milk.n.01', 'synonyms': ['chocolate_milk'], 'def': 'milk flavored with chocolate syrup', 'name': 'chocolate_milk'}, {'frequency': 'r', 'id': 259, 'synset': 'chocolate_mousse.n.01', 'synonyms': ['chocolate_mousse'], 'def': 'dessert mousse made with chocolate', 'name': 'chocolate_mousse'}, {'frequency': 'f', 'id': 260, 'synset': 'choker.n.03', 'synonyms': ['choker', 'collar', 'neckband'], 'def': 'necklace that fits tightly around the neck', 'name': 'choker'}, {'frequency': 'f', 'id': 261, 'synset': 'chopping_board.n.01', 'synonyms': ['chopping_board', 'cutting_board', 'chopping_block'], 'def': 'a wooden board where meats or vegetables can be cut', 'name': 'chopping_board'}, {'frequency': 'c', 'id': 262, 'synset': 'chopstick.n.01', 'synonyms': ['chopstick'], 'def': 'one of a pair of slender sticks used as oriental tableware to eat food with', 'name': 'chopstick'}, {'frequency': 'f', 'id': 263, 'synset': 'christmas_tree.n.05', 'synonyms': ['Christmas_tree'], 'def': 'an ornamented evergreen used as a Christmas decoration', 'name': 'Christmas_tree'}, {'frequency': 'c', 'id': 264, 'synset': 'chute.n.02', 'synonyms': ['slide'], 'def': 'sloping channel through which things can descend', 'name': 'slide'}, {'frequency': 'r', 'id': 265, 'synset': 'cider.n.01', 'synonyms': ['cider', 'cyder'], 'def': 'a beverage made from juice pressed from apples', 'name': 'cider'}, {'frequency': 'r', 'id': 266, 'synset': 'cigar_box.n.01', 'synonyms': ['cigar_box'], 'def': 'a box for holding cigars', 'name': 'cigar_box'}, {'frequency': 'c', 'id': 267, 'synset': 'cigarette.n.01', 'synonyms': ['cigarette'], 'def': 'finely ground tobacco wrapped in paper; for smoking', 'name': 'cigarette'}, {'frequency': 'c', 'id': 268, 'synset': 'cigarette_case.n.01', 'synonyms': ['cigarette_case', 'cigarette_pack'], 'def': 'a small flat case for holding cigarettes', 'name': 'cigarette_case'}, {'frequency': 'f', 'id': 269, 'synset': 'cistern.n.02', 'synonyms': ['cistern', 'water_tank'], 'def': 'a tank that holds the water used to flush a toilet', 'name': 'cistern'}, {'frequency': 'r', 'id': 270, 'synset': 'clarinet.n.01', 'synonyms': ['clarinet'], 'def': 'a single-reed instrument with a straight tube', 'name': 'clarinet'}, {'frequency': 'r', 'id': 271, 'synset': 'clasp.n.01', 'synonyms': ['clasp'], 'def': 'a fastener (as a buckle or hook) that is used to hold two things together', 'name': 'clasp'}, {'frequency': 'c', 'id': 272, 'synset': 'cleansing_agent.n.01', 'synonyms': ['cleansing_agent', 'cleanser', 'cleaner'], 'def': 'a preparation used in cleaning something', 'name': 'cleansing_agent'}, {'frequency': 'r', 'id': 273, 'synset': 'clementine.n.01', 'synonyms': ['clementine'], 'def': 'a variety of mandarin orange', 'name': 'clementine'}, {'frequency': 'c', 'id': 274, 'synset': 'clip.n.03', 'synonyms': ['clip'], 'def': 'any of various small fasteners used to hold loose articles together', 'name': 'clip'}, {'frequency': 'c', 'id': 275, 'synset': 'clipboard.n.01', 'synonyms': ['clipboard'], 'def': 'a small 
writing board with a clip at the top for holding papers', 'name': 'clipboard'}, {'frequency': 'f', 'id': 276, 'synset': 'clock.n.01', 'synonyms': ['clock', 'timepiece', 'timekeeper'], 'def': 'a timepiece that shows the time of day', 'name': 'clock'}, {'frequency': 'f', 'id': 277, 'synset': 'clock_tower.n.01', 'synonyms': ['clock_tower'], 'def': 'a tower with a large clock visible high up on an outside face', 'name': 'clock_tower'}, {'frequency': 'c', 'id': 278, 'synset': 'clothes_hamper.n.01', 'synonyms': ['clothes_hamper', 'laundry_basket', 'clothes_basket'], 'def': 'a hamper that holds dirty clothes to be washed or wet clothes to be dried', 'name': 'clothes_hamper'}, {'frequency': 'c', 'id': 279, 'synset': 'clothespin.n.01', 'synonyms': ['clothespin', 'clothes_peg'], 'def': 'wood or plastic fastener; for holding clothes on a clothesline', 'name': 'clothespin'}, {'frequency': 'r', 'id': 280, 'synset': 'clutch_bag.n.01', 'synonyms': ['clutch_bag'], 'def': "a woman's strapless purse that is carried in the hand", 'name': 'clutch_bag'}, {'frequency': 'f', 'id': 281, 'synset': 'coaster.n.03', 'synonyms': ['coaster'], 'def': 'a covering (plate or mat) that protects the surface of a table', 'name': 'coaster'}, {'frequency': 'f', 'id': 282, 'synset': 'coat.n.01', 'synonyms': ['coat'], 'def': 'an outer garment that has sleeves and covers the body from shoulder down', 'name': 'coat'}, {'frequency': 'c', 'id': 283, 'synset': 'coat_hanger.n.01', 'synonyms': ['coat_hanger', 'clothes_hanger', 'dress_hanger'], 'def': "a hanger that is shaped like a person's shoulders", 'name': 'coat_hanger'}, {'frequency': 'r', 'id': 284, 'synset': 'coatrack.n.01', 'synonyms': ['coatrack', 'hatrack'], 'def': 'a rack with hooks for temporarily holding coats and hats', 'name': 'coatrack'}, {'frequency': 'c', 'id': 285, 'synset': 'cock.n.04', 'synonyms': ['cock', 'rooster'], 'def': 'adult male chicken', 'name': 'cock'}, {'frequency': 'c', 'id': 286, 'synset': 'coconut.n.02', 'synonyms': ['coconut', 'cocoanut'], 'def': 'large hard-shelled brown oval nut with a fibrous husk', 'name': 'coconut'}, {'frequency': 'r', 'id': 287, 'synset': 'coffee_filter.n.01', 'synonyms': ['coffee_filter'], 'def': 'filter (usually of paper) that passes the coffee and retains the coffee grounds', 'name': 'coffee_filter'}, {'frequency': 'f', 'id': 288, 'synset': 'coffee_maker.n.01', 'synonyms': ['coffee_maker', 'coffee_machine'], 'def': 'a kitchen appliance for brewing coffee automatically', 'name': 'coffee_maker'}, {'frequency': 'f', 'id': 289, 'synset': 'coffee_table.n.01', 'synonyms': ['coffee_table', 'cocktail_table'], 'def': 'low table where magazines can be placed and coffee or cocktails are served', 'name': 'coffee_table'}, {'frequency': 'c', 'id': 290, 'synset': 'coffeepot.n.01', 'synonyms': ['coffeepot'], 'def': 'tall pot in which coffee is brewed', 'name': 'coffeepot'}, {'frequency': 'r', 'id': 291, 'synset': 'coil.n.05', 'synonyms': ['coil'], 'def': 'tubing that is wound in a spiral', 'name': 'coil'}, {'frequency': 'c', 'id': 292, 'synset': 'coin.n.01', 'synonyms': ['coin'], 'def': 'a flat metal piece (usually a disc) used as money', 'name': 'coin'}, {'frequency': 'r', 'id': 293, 'synset': 'colander.n.01', 'synonyms': ['colander', 'cullender'], 'def': 'bowl-shaped strainer; used to wash or drain foods', 'name': 'colander'}, {'frequency': 'c', 'id': 294, 'synset': 'coleslaw.n.01', 'synonyms': ['coleslaw', 'slaw'], 'def': 'basically shredded cabbage', 'name': 'coleslaw'}, {'frequency': 'r', 'id': 295, 'synset': 'coloring_material.n.01', 
'synonyms': ['coloring_material', 'colouring_material'], 'def': 'any material used for its color', 'name': 'coloring_material'}, {'frequency': 'r', 'id': 296, 'synset': 'combination_lock.n.01', 'synonyms': ['combination_lock'], 'def': 'lock that can be opened only by turning dials in a special sequence', 'name': 'combination_lock'}, {'frequency': 'c', 'id': 297, 'synset': 'comforter.n.04', 'synonyms': ['pacifier', 'teething_ring'], 'def': 'device used for an infant to suck or bite on', 'name': 'pacifier'}, {'frequency': 'r', 'id': 298, 'synset': 'comic_book.n.01', 'synonyms': ['comic_book'], 'def': 'a magazine devoted to comic strips', 'name': 'comic_book'}, {'frequency': 'f', 'id': 299, 'synset': 'computer_keyboard.n.01', 'synonyms': ['computer_keyboard', 'keyboard_(computer)'], 'def': 'a keyboard that is a data input device for computers', 'name': 'computer_keyboard'}, {'frequency': 'r', 'id': 300, 'synset': 'concrete_mixer.n.01', 'synonyms': ['concrete_mixer', 'cement_mixer'], 'def': 'a machine with a large revolving drum in which cement/concrete is mixed', 'name': 'concrete_mixer'}, {'frequency': 'f', 'id': 301, 'synset': 'cone.n.01', 'synonyms': ['cone', 'traffic_cone'], 'def': 'a cone-shaped object used to direct traffic', 'name': 'cone'}, {'frequency': 'f', 'id': 302, 'synset': 'control.n.09', 'synonyms': ['control', 'controller'], 'def': 'a mechanism that controls the operation of a machine', 'name': 'control'}, {'frequency': 'r', 'id': 303, 'synset': 'convertible.n.01', 'synonyms': ['convertible_(automobile)'], 'def': 'a car that has top that can be folded or removed', 'name': 'convertible_(automobile)'}, {'frequency': 'r', 'id': 304, 'synset': 'convertible.n.03', 'synonyms': ['sofa_bed'], 'def': 'a sofa that can be converted into a bed', 'name': 'sofa_bed'}, {'frequency': 'c', 'id': 305, 'synset': 'cookie.n.01', 'synonyms': ['cookie', 'cooky', 'biscuit_(cookie)'], 'def': "any of various small flat sweet cakes (`biscuit' is the British term)", 'name': 'cookie'}, {'frequency': 'r', 'id': 306, 'synset': 'cookie_jar.n.01', 'synonyms': ['cookie_jar', 'cooky_jar'], 'def': 'a jar in which cookies are kept (and sometimes money is hidden)', 'name': 'cookie_jar'}, {'frequency': 'r', 'id': 307, 'synset': 'cooking_utensil.n.01', 'synonyms': ['cooking_utensil'], 'def': 'a kitchen utensil made of material that does not melt easily; used for cooking', 'name': 'cooking_utensil'}, {'frequency': 'f', 'id': 308, 'synset': 'cooler.n.01', 'synonyms': ['cooler_(for_food)', 'ice_chest'], 'def': 'an insulated box for storing food often with ice', 'name': 'cooler_(for_food)'}, {'frequency': 'c', 'id': 309, 'synset': 'cork.n.04', 'synonyms': ['cork_(bottle_plug)', 'bottle_cork'], 'def': 'the plug in the mouth of a bottle (especially a wine bottle)', 'name': 'cork_(bottle_plug)'}, {'frequency': 'r', 'id': 310, 'synset': 'corkboard.n.01', 'synonyms': ['corkboard'], 'def': 'a sheet consisting of cork granules', 'name': 'corkboard'}, {'frequency': 'r', 'id': 311, 'synset': 'corkscrew.n.01', 'synonyms': ['corkscrew', 'bottle_screw'], 'def': 'a bottle opener that pulls corks', 'name': 'corkscrew'}, {'frequency': 'c', 'id': 312, 'synset': 'corn.n.03', 'synonyms': ['edible_corn', 'corn', 'maize'], 'def': 'ears of corn that can be prepared and served for human food', 'name': 'edible_corn'}, {'frequency': 'r', 'id': 313, 'synset': 'cornbread.n.01', 'synonyms': ['cornbread'], 'def': 'bread made primarily of cornmeal', 'name': 'cornbread'}, {'frequency': 'c', 'id': 314, 'synset': 'cornet.n.01', 'synonyms': ['cornet', 
'horn', 'trumpet'], 'def': 'a brass musical instrument with a narrow tube and a flared bell and many valves', 'name': 'cornet'}, {'frequency': 'c', 'id': 315, 'synset': 'cornice.n.01', 'synonyms': ['cornice', 'valance', 'valance_board', 'pelmet'], 'def': 'a decorative framework to conceal curtain fixtures at the top of a window casing', 'name': 'cornice'}, {'frequency': 'r', 'id': 316, 'synset': 'cornmeal.n.01', 'synonyms': ['cornmeal'], 'def': 'coarsely ground corn', 'name': 'cornmeal'}, {'frequency': 'r', 'id': 317, 'synset': 'corset.n.01', 'synonyms': ['corset', 'girdle'], 'def': "a woman's close-fitting foundation garment", 'name': 'corset'}, {'frequency': 'r', 'id': 318, 'synset': 'cos.n.02', 'synonyms': ['romaine_lettuce'], 'def': 'lettuce with long dark-green leaves in a loosely packed elongated head', 'name': 'romaine_lettuce'}, {'frequency': 'c', 'id': 319, 'synset': 'costume.n.04', 'synonyms': ['costume'], 'def': 'the attire characteristic of a country or a time or a social class', 'name': 'costume'}, {'frequency': 'r', 'id': 320, 'synset': 'cougar.n.01', 'synonyms': ['cougar', 'puma', 'catamount', 'mountain_lion', 'panther'], 'def': 'large American feline resembling a lion', 'name': 'cougar'}, {'frequency': 'r', 'id': 321, 'synset': 'coverall.n.01', 'synonyms': ['coverall'], 'def': 'a loose-fitting protective garment that is worn over other clothing', 'name': 'coverall'}, {'frequency': 'r', 'id': 322, 'synset': 'cowbell.n.01', 'synonyms': ['cowbell'], 'def': 'a bell hung around the neck of cow so that the cow can be easily located', 'name': 'cowbell'}, {'frequency': 'f', 'id': 323, 'synset': 'cowboy_hat.n.01', 'synonyms': ['cowboy_hat', 'ten-gallon_hat'], 'def': 'a hat with a wide brim and a soft crown; worn by American ranch hands', 'name': 'cowboy_hat'}, {'frequency': 'r', 'id': 324, 'synset': 'crab.n.01', 'synonyms': ['crab_(animal)'], 'def': 'decapod having eyes on short stalks and a broad flattened shell and pincers', 'name': 'crab_(animal)'}, {'frequency': 'c', 'id': 325, 'synset': 'cracker.n.01', 'synonyms': ['cracker'], 'def': 'a thin crisp wafer', 'name': 'cracker'}, {'frequency': 'r', 'id': 326, 'synset': 'crape.n.01', 'synonyms': ['crape', 'crepe', 'French_pancake'], 'def': 'small very thin pancake', 'name': 'crape'}, {'frequency': 'f', 'id': 327, 'synset': 'crate.n.01', 'synonyms': ['crate'], 'def': 'a rugged box (usually made of wood); used for shipping', 'name': 'crate'}, {'frequency': 'r', 'id': 328, 'synset': 'crayon.n.01', 'synonyms': ['crayon', 'wax_crayon'], 'def': 'writing or drawing implement made of a colored stick of composition wax', 'name': 'crayon'}, {'frequency': 'r', 'id': 329, 'synset': 'cream_pitcher.n.01', 'synonyms': ['cream_pitcher'], 'def': 'a small pitcher for serving cream', 'name': 'cream_pitcher'}, {'frequency': 'r', 'id': 330, 'synset': 'credit_card.n.01', 'synonyms': ['credit_card', 'charge_card', 'debit_card'], 'def': 'a card, usually plastic, used to pay for goods and services', 'name': 'credit_card'}, {'frequency': 'c', 'id': 331, 'synset': 'crescent_roll.n.01', 'synonyms': ['crescent_roll', 'croissant'], 'def': 'very rich flaky crescent-shaped roll', 'name': 'crescent_roll'}, {'frequency': 'c', 'id': 332, 'synset': 'crib.n.01', 'synonyms': ['crib', 'cot'], 'def': 'baby bed with high sides made of slats', 'name': 'crib'}, {'frequency': 'c', 'id': 333, 'synset': 'crock.n.03', 'synonyms': ['crock_pot', 'earthenware_jar'], 'def': 'an earthen jar (made of baked clay)', 'name': 'crock_pot'}, {'frequency': 'f', 'id': 334, 'synset': 
'crossbar.n.01', 'synonyms': ['crossbar'], 'def': 'a horizontal bar that goes across something', 'name': 'crossbar'}, {'frequency': 'r', 'id': 335, 'synset': 'crouton.n.01', 'synonyms': ['crouton'], 'def': 'a small piece of toasted or fried bread; served in soup or salads', 'name': 'crouton'}, {'frequency': 'r', 'id': 336, 'synset': 'crow.n.01', 'synonyms': ['crow'], 'def': 'black birds having a raucous call', 'name': 'crow'}, {'frequency': 'c', 'id': 337, 'synset': 'crown.n.04', 'synonyms': ['crown'], 'def': 'an ornamental jeweled headdress signifying sovereignty', 'name': 'crown'}, {'frequency': 'c', 'id': 338, 'synset': 'crucifix.n.01', 'synonyms': ['crucifix'], 'def': 'representation of the cross on which Jesus died', 'name': 'crucifix'}, {'frequency': 'c', 'id': 339, 'synset': 'cruise_ship.n.01', 'synonyms': ['cruise_ship', 'cruise_liner'], 'def': 'a passenger ship used commercially for pleasure cruises', 'name': 'cruise_ship'}, {'frequency': 'c', 'id': 340, 'synset': 'cruiser.n.01', 'synonyms': ['police_cruiser', 'patrol_car', 'police_car', 'squad_car'], 'def': 'a car in which policemen cruise the streets', 'name': 'police_cruiser'}, {'frequency': 'c', 'id': 341, 'synset': 'crumb.n.03', 'synonyms': ['crumb'], 'def': 'small piece of e.g. bread or cake', 'name': 'crumb'}, {'frequency': 'r', 'id': 342, 'synset': 'crutch.n.01', 'synonyms': ['crutch'], 'def': 'a wooden or metal staff that fits under the armpit and reaches to the ground', 'name': 'crutch'}, {'frequency': 'c', 'id': 343, 'synset': 'cub.n.03', 'synonyms': ['cub_(animal)'], 'def': 'the young of certain carnivorous mammals such as the bear or wolf or lion', 'name': 'cub_(animal)'}, {'frequency': 'r', 'id': 344, 'synset': 'cube.n.05', 'synonyms': ['cube', 'square_block'], 'def': 'a block in the (approximate) shape of a cube', 'name': 'cube'}, {'frequency': 'f', 'id': 345, 'synset': 'cucumber.n.02', 'synonyms': ['cucumber', 'cuke'], 'def': 'cylindrical green fruit with thin green rind and white flesh eaten as a vegetable', 'name': 'cucumber'}, {'frequency': 'c', 'id': 346, 'synset': 'cufflink.n.01', 'synonyms': ['cufflink'], 'def': 'jewelry consisting of linked buttons used to fasten the cuffs of a shirt', 'name': 'cufflink'}, {'frequency': 'f', 'id': 347, 'synset': 'cup.n.01', 'synonyms': ['cup'], 'def': 'a small open container usually used for drinking; usually has a handle', 'name': 'cup'}, {'frequency': 'c', 'id': 348, 'synset': 'cup.n.08', 'synonyms': ['trophy_cup'], 'def': 'a metal vessel with handles that is awarded as a trophy to a competition winner', 'name': 'trophy_cup'}, {'frequency': 'c', 'id': 349, 'synset': 'cupcake.n.01', 'synonyms': ['cupcake'], 'def': 'small cake baked in a muffin tin', 'name': 'cupcake'}, {'frequency': 'r', 'id': 350, 'synset': 'curler.n.01', 'synonyms': ['hair_curler', 'hair_roller', 'hair_crimper'], 'def': 'a cylindrical tube around which the hair is wound to curl it', 'name': 'hair_curler'}, {'frequency': 'r', 'id': 351, 'synset': 'curling_iron.n.01', 'synonyms': ['curling_iron'], 'def': 'a cylindrical home appliance that heats hair that has been curled around it', 'name': 'curling_iron'}, {'frequency': 'f', 'id': 352, 'synset': 'curtain.n.01', 'synonyms': ['curtain', 'drapery'], 'def': 'hanging cloth used as a blind (especially for a window)', 'name': 'curtain'}, {'frequency': 'f', 'id': 353, 'synset': 'cushion.n.03', 'synonyms': ['cushion'], 'def': 'a soft bag filled with air or padding such as feathers or foam rubber', 'name': 'cushion'}, {'frequency': 'r', 'id': 354, 'synset': 
'custard.n.01', 'synonyms': ['custard'], 'def': 'sweetened mixture of milk and eggs baked or boiled or frozen', 'name': 'custard'}, {'frequency': 'c', 'id': 355, 'synset': 'cutter.n.06', 'synonyms': ['cutting_tool'], 'def': 'a cutting implement; a tool for cutting', 'name': 'cutting_tool'}, {'frequency': 'r', 'id': 356, 'synset': 'cylinder.n.04', 'synonyms': ['cylinder'], 'def': 'a cylindrical container', 'name': 'cylinder'}, {'frequency': 'r', 'id': 357, 'synset': 'cymbal.n.01', 'synonyms': ['cymbal'], 'def': 'a percussion instrument consisting of a concave brass disk', 'name': 'cymbal'}, {'frequency': 'r', 'id': 358, 'synset': 'dachshund.n.01', 'synonyms': ['dachshund', 'dachsie', 'badger_dog'], 'def': 'small long-bodied short-legged breed of dog having a short sleek coat and long drooping ears', 'name': 'dachshund'}, {'frequency': 'r', 'id': 359, 'synset': 'dagger.n.01', 'synonyms': ['dagger'], 'def': 'a short knife with a pointed blade used for piercing or stabbing', 'name': 'dagger'}, {'frequency': 'r', 'id': 360, 'synset': 'dartboard.n.01', 'synonyms': ['dartboard'], 'def': 'a circular board of wood or cork used as the target in the game of darts', 'name': 'dartboard'}, {'frequency': 'r', 'id': 361, 'synset': 'date.n.08', 'synonyms': ['date_(fruit)'], 'def': 'sweet edible fruit of the date palm with a single long woody seed', 'name': 'date_(fruit)'}, {'frequency': 'f', 'id': 362, 'synset': 'deck_chair.n.01', 'synonyms': ['deck_chair', 'beach_chair'], 'def': 'a folding chair for use outdoors; a wooden frame supports a length of canvas', 'name': 'deck_chair'}, {'frequency': 'c', 'id': 363, 'synset': 'deer.n.01', 'synonyms': ['deer', 'cervid'], 'def': "distinguished from Bovidae by the male's having solid deciduous antlers", 'name': 'deer'}, {'frequency': 'c', 'id': 364, 'synset': 'dental_floss.n.01', 'synonyms': ['dental_floss', 'floss'], 'def': 'a soft thread for cleaning the spaces between the teeth', 'name': 'dental_floss'}, {'frequency': 'f', 'id': 365, 'synset': 'desk.n.01', 'synonyms': ['desk'], 'def': 'a piece of furniture with a writing surface and usually drawers or other compartments', 'name': 'desk'}, {'frequency': 'r', 'id': 366, 'synset': 'detergent.n.01', 'synonyms': ['detergent'], 'def': 'a surface-active chemical widely used in industry and laundering', 'name': 'detergent'}, {'frequency': 'c', 'id': 367, 'synset': 'diaper.n.01', 'synonyms': ['diaper'], 'def': 'garment consisting of a folded cloth drawn up between the legs and fastened at the waist', 'name': 'diaper'}, {'frequency': 'r', 'id': 368, 'synset': 'diary.n.01', 'synonyms': ['diary', 'journal'], 'def': 'a daily written record of (usually personal) experiences and observations', 'name': 'diary'}, {'frequency': 'r', 'id': 369, 'synset': 'die.n.01', 'synonyms': ['die', 'dice'], 'def': 'a small cube with 1 to 6 spots on the six faces; used in gambling', 'name': 'die'}, {'frequency': 'r', 'id': 370, 'synset': 'dinghy.n.01', 'synonyms': ['dinghy', 'dory', 'rowboat'], 'def': 'a small boat of shallow draft with seats and oars with which it is propelled', 'name': 'dinghy'}, {'frequency': 'f', 'id': 371, 'synset': 'dining_table.n.01', 'synonyms': ['dining_table'], 'def': 'a table at which meals are served', 'name': 'dining_table'}, {'frequency': 'r', 'id': 372, 'synset': 'dinner_jacket.n.01', 'synonyms': ['tux', 'tuxedo'], 'def': 'semiformal evening dress for men', 'name': 'tux'}, {'frequency': 'c', 'id': 373, 'synset': 'dish.n.01', 'synonyms': ['dish'], 'def': 'a piece of dishware normally used as a container for 
holding or serving food', 'name': 'dish'}, {'frequency': 'c', 'id': 374, 'synset': 'dish.n.05', 'synonyms': ['dish_antenna'], 'def': 'directional antenna consisting of a parabolic reflector', 'name': 'dish_antenna'}, {'frequency': 'c', 'id': 375, 'synset': 'dishrag.n.01', 'synonyms': ['dishrag', 'dishcloth'], 'def': 'a cloth for washing dishes', 'name': 'dishrag'}, {'frequency': 'c', 'id': 376, 'synset': 'dishtowel.n.01', 'synonyms': ['dishtowel', 'tea_towel'], 'def': 'a towel for drying dishes', 'name': 'dishtowel'}, {'frequency': 'f', 'id': 377, 'synset': 'dishwasher.n.01', 'synonyms': ['dishwasher', 'dishwashing_machine'], 'def': 'a machine for washing dishes', 'name': 'dishwasher'}, {'frequency': 'r', 'id': 378, 'synset': 'dishwasher_detergent.n.01', 'synonyms': ['dishwasher_detergent', 'dishwashing_detergent', 'dishwashing_liquid'], 'def': 'a low-sudsing detergent designed for use in dishwashers', 'name': 'dishwasher_detergent'}, {'frequency': 'r', 'id': 379, 'synset': 'diskette.n.01', 'synonyms': ['diskette', 'floppy', 'floppy_disk'], 'def': 'a small plastic magnetic disk enclosed in a stiff envelope used to store data', 'name': 'diskette'}, {'frequency': 'c', 'id': 380, 'synset': 'dispenser.n.01', 'synonyms': ['dispenser'], 'def': 'a container so designed that the contents can be used in prescribed amounts', 'name': 'dispenser'}, {'frequency': 'c', 'id': 381, 'synset': 'dixie_cup.n.01', 'synonyms': ['Dixie_cup', 'paper_cup'], 'def': 'a disposable cup made of paper; for holding drinks', 'name': 'Dixie_cup'}, {'frequency': 'f', 'id': 382, 'synset': 'dog.n.01', 'synonyms': ['dog'], 'def': 'a common domesticated dog', 'name': 'dog'}, {'frequency': 'f', 'id': 383, 'synset': 'dog_collar.n.01', 'synonyms': ['dog_collar'], 'def': 'a collar for a dog', 'name': 'dog_collar'}, {'frequency': 'c', 'id': 384, 'synset': 'doll.n.01', 'synonyms': ['doll'], 'def': 'a toy replica of a HUMAN (NOT AN ANIMAL)', 'name': 'doll'}, {'frequency': 'r', 'id': 385, 'synset': 'dollar.n.02', 'synonyms': ['dollar', 'dollar_bill', 'one_dollar_bill'], 'def': 'a piece of paper money worth one dollar', 'name': 'dollar'}, {'frequency': 'r', 'id': 386, 'synset': 'dolphin.n.02', 'synonyms': ['dolphin'], 'def': 'any of various small toothed whales with a beaklike snout; larger than porpoises', 'name': 'dolphin'}, {'frequency': 'c', 'id': 387, 'synset': 'domestic_ass.n.01', 'synonyms': ['domestic_ass', 'donkey'], 'def': 'domestic beast of burden descended from the African wild ass; patient but stubborn', 'name': 'domestic_ass'}, {'frequency': 'r', 'id': 388, 'synset': 'domino.n.03', 'synonyms': ['eye_mask'], 'def': 'a mask covering the upper part of the face but with holes for the eyes', 'name': 'eye_mask'}, {'frequency': 'r', 'id': 389, 'synset': 'doorbell.n.01', 'synonyms': ['doorbell', 'buzzer'], 'def': 'a button at an outer door that gives a ringing or buzzing signal when pushed', 'name': 'doorbell'}, {'frequency': 'f', 'id': 390, 'synset': 'doorknob.n.01', 'synonyms': ['doorknob', 'doorhandle'], 'def': "a knob used to open a door (often called `doorhandle' in Great Britain)", 'name': 'doorknob'}, {'frequency': 'c', 'id': 391, 'synset': 'doormat.n.02', 'synonyms': ['doormat', 'welcome_mat'], 'def': 'a mat placed outside an exterior door for wiping the shoes before entering', 'name': 'doormat'}, {'frequency': 'f', 'id': 392, 'synset': 'doughnut.n.02', 'synonyms': ['doughnut', 'donut'], 'def': 'a small ring-shaped friedcake', 'name': 'doughnut'}, {'frequency': 'r', 'id': 393, 'synset': 'dove.n.01', 'synonyms': ['dove'], 
'def': 'any of numerous small pigeons', 'name': 'dove'}, {'frequency': 'r', 'id': 394, 'synset': 'dragonfly.n.01', 'synonyms': ['dragonfly'], 'def': 'slender-bodied non-stinging insect having iridescent wings that are outspread at rest', 'name': 'dragonfly'}, {'frequency': 'f', 'id': 395, 'synset': 'drawer.n.01', 'synonyms': ['drawer'], 'def': 'a boxlike container in a piece of furniture; made so as to slide in and out', 'name': 'drawer'}, {'frequency': 'c', 'id': 396, 'synset': 'drawers.n.01', 'synonyms': ['underdrawers', 'boxers', 'boxershorts'], 'def': 'underpants worn by men', 'name': 'underdrawers'}, {'frequency': 'f', 'id': 397, 'synset': 'dress.n.01', 'synonyms': ['dress', 'frock'], 'def': 'a one-piece garment for a woman; has skirt and bodice', 'name': 'dress'}, {'frequency': 'c', 'id': 398, 'synset': 'dress_hat.n.01', 'synonyms': ['dress_hat', 'high_hat', 'opera_hat', 'silk_hat', 'top_hat'], 'def': "a man's hat with a tall crown; usually covered with silk or with beaver fur", 'name': 'dress_hat'}, {'frequency': 'c', 'id': 399, 'synset': 'dress_suit.n.01', 'synonyms': ['dress_suit'], 'def': 'formalwear consisting of full evening dress for men', 'name': 'dress_suit'}, {'frequency': 'c', 'id': 400, 'synset': 'dresser.n.05', 'synonyms': ['dresser'], 'def': 'a cabinet with shelves', 'name': 'dresser'}, {'frequency': 'c', 'id': 401, 'synset': 'drill.n.01', 'synonyms': ['drill'], 'def': 'a tool with a sharp rotating point for making holes in hard materials', 'name': 'drill'}, {'frequency': 'r', 'id': 402, 'synset': 'drinking_fountain.n.01', 'synonyms': ['drinking_fountain'], 'def': 'a public fountain to provide a jet of drinking water', 'name': 'drinking_fountain'}, {'frequency': 'r', 'id': 403, 'synset': 'drone.n.04', 'synonyms': ['drone'], 'def': 'an aircraft without a pilot that is operated by remote control', 'name': 'drone'}, {'frequency': 'r', 'id': 404, 'synset': 'dropper.n.01', 'synonyms': ['dropper', 'eye_dropper'], 'def': 'pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time', 'name': 'dropper'}, {'frequency': 'c', 'id': 405, 'synset': 'drum.n.01', 'synonyms': ['drum_(musical_instrument)'], 'def': 'a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end', 'name': 'drum_(musical_instrument)'}, {'frequency': 'r', 'id': 406, 'synset': 'drumstick.n.02', 'synonyms': ['drumstick'], 'def': 'a stick used for playing a drum', 'name': 'drumstick'}, {'frequency': 'f', 'id': 407, 'synset': 'duck.n.01', 'synonyms': ['duck'], 'def': 'small web-footed broad-billed swimming bird', 'name': 'duck'}, {'frequency': 'r', 'id': 408, 'synset': 'duckling.n.02', 'synonyms': ['duckling'], 'def': 'young duck', 'name': 'duckling'}, {'frequency': 'c', 'id': 409, 'synset': 'duct_tape.n.01', 'synonyms': ['duct_tape'], 'def': 'a wide silvery adhesive tape', 'name': 'duct_tape'}, {'frequency': 'f', 'id': 410, 'synset': 'duffel_bag.n.01', 'synonyms': ['duffel_bag', 'duffle_bag', 'duffel', 'duffle'], 'def': 'a large cylindrical bag of heavy cloth', 'name': 'duffel_bag'}, {'frequency': 'r', 'id': 411, 'synset': 'dumbbell.n.01', 'synonyms': ['dumbbell'], 'def': 'an exercising weight with two ball-like ends connected by a short handle', 'name': 'dumbbell'}, {'frequency': 'c', 'id': 412, 'synset': 'dumpster.n.01', 'synonyms': ['dumpster'], 'def': 'a container designed to receive and transport and dump waste', 'name': 'dumpster'}, {'frequency': 'r', 'id': 413, 'synset': 'dustpan.n.02', 
'synonyms': ['dustpan'], 'def': 'a short-handled receptacle into which dust can be swept', 'name': 'dustpan'}, {'frequency': 'r', 'id': 414, 'synset': 'dutch_oven.n.02', 'synonyms': ['Dutch_oven'], 'def': 'iron or earthenware cooking pot; used for stews', 'name': 'Dutch_oven'}, {'frequency': 'c', 'id': 415, 'synset': 'eagle.n.01', 'synonyms': ['eagle'], 'def': 'large birds of prey noted for their broad wings and strong soaring flight', 'name': 'eagle'}, {'frequency': 'f', 'id': 416, 'synset': 'earphone.n.01', 'synonyms': ['earphone', 'earpiece', 'headphone'], 'def': 'device for listening to audio that is held over or inserted into the ear', 'name': 'earphone'}, {'frequency': 'r', 'id': 417, 'synset': 'earplug.n.01', 'synonyms': ['earplug'], 'def': 'a soft plug that is inserted into the ear canal to block sound', 'name': 'earplug'}, {'frequency': 'f', 'id': 418, 'synset': 'earring.n.01', 'synonyms': ['earring'], 'def': 'jewelry to ornament the ear', 'name': 'earring'}, {'frequency': 'c', 'id': 419, 'synset': 'easel.n.01', 'synonyms': ['easel'], 'def': "an upright tripod for displaying something (usually an artist's canvas)", 'name': 'easel'}, {'frequency': 'r', 'id': 420, 'synset': 'eclair.n.01', 'synonyms': ['eclair'], 'def': 'oblong cream puff', 'name': 'eclair'}, {'frequency': 'r', 'id': 421, 'synset': 'eel.n.01', 'synonyms': ['eel'], 'def': 'an elongate fish with fatty flesh', 'name': 'eel'}, {'frequency': 'f', 'id': 422, 'synset': 'egg.n.02', 'synonyms': ['egg', 'eggs'], 'def': 'oval reproductive body of a fowl (especially a hen) used as food', 'name': 'egg'}, {'frequency': 'r', 'id': 423, 'synset': 'egg_roll.n.01', 'synonyms': ['egg_roll', 'spring_roll'], 'def': 'minced vegetables and meat wrapped in a pancake and fried', 'name': 'egg_roll'}, {'frequency': 'c', 'id': 424, 'synset': 'egg_yolk.n.01', 'synonyms': ['egg_yolk', 'yolk_(egg)'], 'def': 'the yellow spherical part of an egg', 'name': 'egg_yolk'}, {'frequency': 'c', 'id': 425, 'synset': 'eggbeater.n.02', 'synonyms': ['eggbeater', 'eggwhisk'], 'def': 'a mixer for beating eggs or whipping cream', 'name': 'eggbeater'}, {'frequency': 'c', 'id': 426, 'synset': 'eggplant.n.01', 'synonyms': ['eggplant', 'aubergine'], 'def': 'egg-shaped vegetable having a shiny skin typically dark purple', 'name': 'eggplant'}, {'frequency': 'r', 'id': 427, 'synset': 'electric_chair.n.01', 'synonyms': ['electric_chair'], 'def': 'a chair-shaped instrument of execution by electrocution', 'name': 'electric_chair'}, {'frequency': 'f', 'id': 428, 'synset': 'electric_refrigerator.n.01', 'synonyms': ['refrigerator'], 'def': 'a refrigerator in which the coolant is pumped around by an electric motor', 'name': 'refrigerator'}, {'frequency': 'f', 'id': 429, 'synset': 'elephant.n.01', 'synonyms': ['elephant'], 'def': 'a common elephant', 'name': 'elephant'}, {'frequency': 'r', 'id': 430, 'synset': 'elk.n.01', 'synonyms': ['elk', 'moose'], 'def': 'large northern deer with enormous flattened antlers in the male', 'name': 'elk'}, {'frequency': 'c', 'id': 431, 'synset': 'envelope.n.01', 'synonyms': ['envelope'], 'def': 'a flat (usually rectangular) container for a letter, thin package, etc.', 'name': 'envelope'}, {'frequency': 'c', 'id': 432, 'synset': 'eraser.n.01', 'synonyms': ['eraser'], 'def': 'an implement used to erase something', 'name': 'eraser'}, {'frequency': 'r', 'id': 433, 'synset': 'escargot.n.01', 'synonyms': ['escargot'], 'def': 'edible snail usually served in the shell with a sauce of melted butter and garlic', 'name': 'escargot'}, {'frequency': 'r', 
'id': 434, 'synset': 'eyepatch.n.01', 'synonyms': ['eyepatch'], 'def': 'a protective cloth covering for an injured eye', 'name': 'eyepatch'}, {'frequency': 'r', 'id': 435, 'synset': 'falcon.n.01', 'synonyms': ['falcon'], 'def': 'birds of prey having long pointed powerful wings adapted for swift flight', 'name': 'falcon'}, {'frequency': 'f', 'id': 436, 'synset': 'fan.n.01', 'synonyms': ['fan'], 'def': 'a device for creating a current of air by movement of a surface or surfaces', 'name': 'fan'}, {'frequency': 'f', 'id': 437, 'synset': 'faucet.n.01', 'synonyms': ['faucet', 'spigot', 'tap'], 'def': 'a regulator for controlling the flow of a liquid from a reservoir', 'name': 'faucet'}, {'frequency': 'r', 'id': 438, 'synset': 'fedora.n.01', 'synonyms': ['fedora'], 'def': 'a hat made of felt with a creased crown', 'name': 'fedora'}, {'frequency': 'r', 'id': 439, 'synset': 'ferret.n.02', 'synonyms': ['ferret'], 'def': 'domesticated albino variety of the European polecat bred for hunting rats and rabbits', 'name': 'ferret'}, {'frequency': 'c', 'id': 440, 'synset': 'ferris_wheel.n.01', 'synonyms': ['Ferris_wheel'], 'def': 'a large wheel with suspended seats that remain upright as the wheel rotates', 'name': 'Ferris_wheel'}, {'frequency': 'r', 'id': 441, 'synset': 'ferry.n.01', 'synonyms': ['ferry', 'ferryboat'], 'def': 'a boat that transports people or vehicles across a body of water and operates on a regular schedule', 'name': 'ferry'}, {'frequency': 'r', 'id': 442, 'synset': 'fig.n.04', 'synonyms': ['fig_(fruit)'], 'def': 'fleshy sweet pear-shaped yellowish or purple fruit eaten fresh or preserved or dried', 'name': 'fig_(fruit)'}, {'frequency': 'c', 'id': 443, 'synset': 'fighter.n.02', 'synonyms': ['fighter_jet', 'fighter_aircraft', 'attack_aircraft'], 'def': 'a high-speed military or naval airplane designed to destroy enemy targets', 'name': 'fighter_jet'}, {'frequency': 'f', 'id': 444, 'synset': 'figurine.n.01', 'synonyms': ['figurine'], 'def': 'a small carved or molded figure', 'name': 'figurine'}, {'frequency': 'c', 'id': 445, 'synset': 'file.n.03', 'synonyms': ['file_cabinet', 'filing_cabinet'], 'def': 'office furniture consisting of a container for keeping papers in order', 'name': 'file_cabinet'}, {'frequency': 'r', 'id': 446, 'synset': 'file.n.04', 'synonyms': ['file_(tool)'], 'def': 'a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal', 'name': 'file_(tool)'}, {'frequency': 'f', 'id': 447, 'synset': 'fire_alarm.n.02', 'synonyms': ['fire_alarm', 'smoke_alarm'], 'def': 'an alarm that is tripped off by fire or smoke', 'name': 'fire_alarm'}, {'frequency': 'c', 'id': 448, 'synset': 'fire_engine.n.01', 'synonyms': ['fire_engine', 'fire_truck'], 'def': 'large trucks that carry firefighters and equipment to the site of a fire', 'name': 'fire_engine'}, {'frequency': 'c', 'id': 449, 'synset': 'fire_extinguisher.n.01', 'synonyms': ['fire_extinguisher', 'extinguisher'], 'def': 'a manually operated device for extinguishing small fires', 'name': 'fire_extinguisher'}, {'frequency': 'c', 'id': 450, 'synset': 'fire_hose.n.01', 'synonyms': ['fire_hose'], 'def': 'a large hose that carries water from a fire hydrant to the site of the fire', 'name': 'fire_hose'}, {'frequency': 'f', 'id': 451, 'synset': 'fireplace.n.01', 'synonyms': ['fireplace'], 'def': 'an open recess in a wall at the base of a chimney where a fire can be built', 'name': 'fireplace'}, {'frequency': 'f', 'id': 452, 'synset': 'fireplug.n.01', 'synonyms': ['fireplug', 'fire_hydrant', 
'hydrant'], 'def': 'an upright hydrant for drawing water to use in fighting a fire', 'name': 'fireplug'}, {'frequency': 'c', 'id': 453, 'synset': 'fish.n.01', 'synonyms': ['fish'], 'def': 'any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills', 'name': 'fish'}, {'frequency': 'r', 'id': 454, 'synset': 'fish.n.02', 'synonyms': ['fish_(food)'], 'def': 'the flesh of fish used as food', 'name': 'fish_(food)'}, {'frequency': 'r', 'id': 455, 'synset': 'fishbowl.n.02', 'synonyms': ['fishbowl', 'goldfish_bowl'], 'def': 'a transparent bowl in which small fish are kept', 'name': 'fishbowl'}, {'frequency': 'r', 'id': 456, 'synset': 'fishing_boat.n.01', 'synonyms': ['fishing_boat', 'fishing_vessel'], 'def': 'a vessel for fishing', 'name': 'fishing_boat'}, {'frequency': 'c', 'id': 457, 'synset': 'fishing_rod.n.01', 'synonyms': ['fishing_rod', 'fishing_pole'], 'def': 'a rod that is used in fishing to extend the fishing line', 'name': 'fishing_rod'}, {'frequency': 'f', 'id': 458, 'synset': 'flag.n.01', 'synonyms': ['flag'], 'def': 'emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)', 'name': 'flag'}, {'frequency': 'f', 'id': 459, 'synset': 'flagpole.n.02', 'synonyms': ['flagpole', 'flagstaff'], 'def': 'a tall staff or pole on which a flag is raised', 'name': 'flagpole'}, {'frequency': 'c', 'id': 460, 'synset': 'flamingo.n.01', 'synonyms': ['flamingo'], 'def': 'large pink web-footed bird with down-bent bill', 'name': 'flamingo'}, {'frequency': 'c', 'id': 461, 'synset': 'flannel.n.01', 'synonyms': ['flannel'], 'def': 'a soft light woolen fabric; used for clothing', 'name': 'flannel'}, {'frequency': 'r', 'id': 462, 'synset': 'flash.n.10', 'synonyms': ['flash', 'flashbulb'], 'def': 'a lamp for providing momentary light to take a photograph', 'name': 'flash'}, {'frequency': 'c', 'id': 463, 'synset': 'flashlight.n.01', 'synonyms': ['flashlight', 'torch'], 'def': 'a small portable battery-powered electric lamp', 'name': 'flashlight'}, {'frequency': 'r', 'id': 464, 'synset': 'fleece.n.03', 'synonyms': ['fleece'], 'def': 'a soft bulky fabric with deep pile; used chiefly for clothing', 'name': 'fleece'}, {'frequency': 'f', 'id': 465, 'synset': 'flip-flop.n.02', 'synonyms': ['flip-flop_(sandal)'], 'def': 'a backless sandal held to the foot by a thong between two toes', 'name': 'flip-flop_(sandal)'}, {'frequency': 'c', 'id': 466, 'synset': 'flipper.n.01', 'synonyms': ['flipper_(footwear)', 'fin_(footwear)'], 'def': 'a shoe to aid a person in swimming', 'name': 'flipper_(footwear)'}, {'frequency': 'f', 'id': 467, 'synset': 'flower_arrangement.n.01', 'synonyms': ['flower_arrangement', 'floral_arrangement'], 'def': 'a decorative arrangement of flowers', 'name': 'flower_arrangement'}, {'frequency': 'c', 'id': 468, 'synset': 'flute.n.02', 'synonyms': ['flute_glass', 'champagne_flute'], 'def': 'a tall narrow wineglass', 'name': 'flute_glass'}, {'frequency': 'r', 'id': 469, 'synset': 'foal.n.01', 'synonyms': ['foal'], 'def': 'a young horse', 'name': 'foal'}, {'frequency': 'c', 'id': 470, 'synset': 'folding_chair.n.01', 'synonyms': ['folding_chair'], 'def': 'a chair that can be folded flat for storage', 'name': 'folding_chair'}, {'frequency': 'c', 'id': 471, 'synset': 'food_processor.n.01', 'synonyms': ['food_processor'], 'def': 'a kitchen appliance for shredding, blending, chopping, or slicing food', 'name': 'food_processor'}, {'frequency': 'c', 'id': 472, 'synset': 'football.n.02', 'synonyms': ['football_(American)'], 
'def': 'the inflated oblong ball used in playing American football', 'name': 'football_(American)'}, {'frequency': 'r', 'id': 473, 'synset': 'football_helmet.n.01', 'synonyms': ['football_helmet'], 'def': 'a padded helmet with a face mask to protect the head of football players', 'name': 'football_helmet'}, {'frequency': 'c', 'id': 474, 'synset': 'footstool.n.01', 'synonyms': ['footstool', 'footrest'], 'def': 'a low seat or a stool to rest the feet of a seated person', 'name': 'footstool'}, {'frequency': 'f', 'id': 475, 'synset': 'fork.n.01', 'synonyms': ['fork'], 'def': 'cutlery used for serving and eating food', 'name': 'fork'}, {'frequency': 'r', 'id': 476, 'synset': 'forklift.n.01', 'synonyms': ['forklift'], 'def': 'an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them', 'name': 'forklift'}, {'frequency': 'r', 'id': 477, 'synset': 'freight_car.n.01', 'synonyms': ['freight_car'], 'def': 'a railway car that carries freight', 'name': 'freight_car'}, {'frequency': 'r', 'id': 478, 'synset': 'french_toast.n.01', 'synonyms': ['French_toast'], 'def': 'bread slice dipped in egg and milk and fried', 'name': 'French_toast'}, {'frequency': 'c', 'id': 479, 'synset': 'freshener.n.01', 'synonyms': ['freshener', 'air_freshener'], 'def': 'anything that freshens', 'name': 'freshener'}, {'frequency': 'f', 'id': 480, 'synset': 'frisbee.n.01', 'synonyms': ['frisbee'], 'def': 'a light, plastic disk propelled with a flip of the wrist for recreation or competition', 'name': 'frisbee'}, {'frequency': 'c', 'id': 481, 'synset': 'frog.n.01', 'synonyms': ['frog', 'toad', 'toad_frog'], 'def': 'a tailless stout-bodied amphibians with long hind limbs for leaping', 'name': 'frog'}, {'frequency': 'c', 'id': 482, 'synset': 'fruit_juice.n.01', 'synonyms': ['fruit_juice'], 'def': 'drink produced by squeezing or crushing fruit', 'name': 'fruit_juice'}, {'frequency': 'r', 'id': 483, 'synset': 'fruit_salad.n.01', 'synonyms': ['fruit_salad'], 'def': 'salad composed of fruits', 'name': 'fruit_salad'}, {'frequency': 'c', 'id': 484, 'synset': 'frying_pan.n.01', 'synonyms': ['frying_pan', 'frypan', 'skillet'], 'def': 'a pan used for frying foods', 'name': 'frying_pan'}, {'frequency': 'r', 'id': 485, 'synset': 'fudge.n.01', 'synonyms': ['fudge'], 'def': 'soft creamy candy', 'name': 'fudge'}, {'frequency': 'r', 'id': 486, 'synset': 'funnel.n.02', 'synonyms': ['funnel'], 'def': 'a cone-shaped utensil used to channel a substance into a container with a small mouth', 'name': 'funnel'}, {'frequency': 'c', 'id': 487, 'synset': 'futon.n.01', 'synonyms': ['futon'], 'def': 'a pad that is used for sleeping on the floor or on a raised frame', 'name': 'futon'}, {'frequency': 'r', 'id': 488, 'synset': 'gag.n.02', 'synonyms': ['gag', 'muzzle'], 'def': "restraint put into a person's mouth to prevent speaking or shouting", 'name': 'gag'}, {'frequency': 'r', 'id': 489, 'synset': 'garbage.n.03', 'synonyms': ['garbage'], 'def': 'a receptacle where waste can be discarded', 'name': 'garbage'}, {'frequency': 'c', 'id': 490, 'synset': 'garbage_truck.n.01', 'synonyms': ['garbage_truck'], 'def': 'a truck for collecting domestic refuse', 'name': 'garbage_truck'}, {'frequency': 'c', 'id': 491, 'synset': 'garden_hose.n.01', 'synonyms': ['garden_hose'], 'def': 'a hose used for watering a lawn or garden', 'name': 'garden_hose'}, {'frequency': 'c', 'id': 492, 'synset': 'gargle.n.01', 'synonyms': ['gargle', 'mouthwash'], 'def': 'a medicated solution used for gargling and rinsing the mouth', 'name': 
'gargle'}, {'frequency': 'r', 'id': 493, 'synset': 'gargoyle.n.02', 'synonyms': ['gargoyle'], 'def': 'an ornament consisting of a grotesquely carved figure of a person or animal', 'name': 'gargoyle'}, {'frequency': 'c', 'id': 494, 'synset': 'garlic.n.02', 'synonyms': ['garlic', 'ail'], 'def': 'aromatic bulb used as seasoning', 'name': 'garlic'}, {'frequency': 'r', 'id': 495, 'synset': 'gasmask.n.01', 'synonyms': ['gasmask', 'respirator', 'gas_helmet'], 'def': 'a protective face mask with a filter', 'name': 'gasmask'}, {'frequency': 'r', 'id': 496, 'synset': 'gazelle.n.01', 'synonyms': ['gazelle'], 'def': 'small swift graceful antelope of Africa and Asia having lustrous eyes', 'name': 'gazelle'}, {'frequency': 'c', 'id': 497, 'synset': 'gelatin.n.02', 'synonyms': ['gelatin', 'jelly'], 'def': 'an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods', 'name': 'gelatin'}, {'frequency': 'r', 'id': 498, 'synset': 'gem.n.02', 'synonyms': ['gemstone'], 'def': 'a crystalline rock that can be cut and polished for jewelry', 'name': 'gemstone'}, {'frequency': 'c', 'id': 499, 'synset': 'giant_panda.n.01', 'synonyms': ['giant_panda', 'panda', 'panda_bear'], 'def': 'large black-and-white herbivorous mammal of bamboo forests of China and Tibet', 'name': 'giant_panda'}, {'frequency': 'c', 'id': 500, 'synset': 'gift_wrap.n.01', 'synonyms': ['gift_wrap'], 'def': 'attractive wrapping paper suitable for wrapping gifts', 'name': 'gift_wrap'}, {'frequency': 'c', 'id': 501, 'synset': 'ginger.n.03', 'synonyms': ['ginger', 'gingerroot'], 'def': 'the root of the common ginger plant; used fresh as a seasoning', 'name': 'ginger'}, {'frequency': 'f', 'id': 502, 'synset': 'giraffe.n.01', 'synonyms': ['giraffe'], 'def': 'tall animal having a spotted coat and small horns and very long neck and legs', 'name': 'giraffe'}, {'frequency': 'c', 'id': 503, 'synset': 'girdle.n.02', 'synonyms': ['cincture', 'sash', 'waistband', 'waistcloth'], 'def': 'a band of material around the waist that strengthens a skirt or trousers', 'name': 'cincture'}, {'frequency': 'f', 'id': 504, 'synset': 'glass.n.02', 'synonyms': ['glass_(drink_container)', 'drinking_glass'], 'def': 'a container for holding liquids while drinking', 'name': 'glass_(drink_container)'}, {'frequency': 'c', 'id': 505, 'synset': 'globe.n.03', 'synonyms': ['globe'], 'def': 'a sphere on which a map (especially of the earth) is represented', 'name': 'globe'}, {'frequency': 'f', 'id': 506, 'synset': 'glove.n.02', 'synonyms': ['glove'], 'def': 'handwear covering the hand', 'name': 'glove'}, {'frequency': 'c', 'id': 507, 'synset': 'goat.n.01', 'synonyms': ['goat'], 'def': 'a common goat', 'name': 'goat'}, {'frequency': 'f', 'id': 508, 'synset': 'goggles.n.01', 'synonyms': ['goggles'], 'def': 'tight-fitting spectacles worn to protect the eyes', 'name': 'goggles'}, {'frequency': 'r', 'id': 509, 'synset': 'goldfish.n.01', 'synonyms': ['goldfish'], 'def': 'small golden or orange-red freshwater fishes used as pond or aquarium pets', 'name': 'goldfish'}, {'frequency': 'r', 'id': 510, 'synset': 'golf_club.n.02', 'synonyms': ['golf_club', 'golf-club'], 'def': 'golf equipment used by a golfer to hit a golf ball', 'name': 'golf_club'}, {'frequency': 'c', 'id': 511, 'synset': 'golfcart.n.01', 'synonyms': ['golfcart'], 'def': 'a small motor vehicle in which golfers can ride between shots', 'name': 'golfcart'}, {'frequency': 'r', 'id': 512, 'synset': 'gondola.n.02', 'synonyms': ['gondola_(boat)'], 'def': 'long narrow flat-bottomed boat propelled by 
sculling; traditionally used on canals of Venice', 'name': 'gondola_(boat)'}, {'frequency': 'c', 'id': 513, 'synset': 'goose.n.01', 'synonyms': ['goose'], 'def': 'loud, web-footed long-necked aquatic birds usually larger than ducks', 'name': 'goose'}, {'frequency': 'r', 'id': 514, 'synset': 'gorilla.n.01', 'synonyms': ['gorilla'], 'def': 'largest ape', 'name': 'gorilla'}, {'frequency': 'r', 'id': 515, 'synset': 'gourd.n.02', 'synonyms': ['gourd'], 'def': 'any of numerous inedible fruits with hard rinds', 'name': 'gourd'}, {'frequency': 'r', 'id': 516, 'synset': 'gown.n.04', 'synonyms': ['surgical_gown', 'scrubs_(surgical_clothing)'], 'def': 'protective garment worn by surgeons during operations', 'name': 'surgical_gown'}, {'frequency': 'f', 'id': 517, 'synset': 'grape.n.01', 'synonyms': ['grape'], 'def': 'any of various juicy fruit with green or purple skins; grow in clusters', 'name': 'grape'}, {'frequency': 'r', 'id': 518, 'synset': 'grasshopper.n.01', 'synonyms': ['grasshopper'], 'def': 'plant-eating insect with hind legs adapted for leaping', 'name': 'grasshopper'}, {'frequency': 'c', 'id': 519, 'synset': 'grater.n.01', 'synonyms': ['grater'], 'def': 'utensil with sharp perforations for shredding foods (as vegetables or cheese)', 'name': 'grater'}, {'frequency': 'c', 'id': 520, 'synset': 'gravestone.n.01', 'synonyms': ['gravestone', 'headstone', 'tombstone'], 'def': 'a stone that is used to mark a grave', 'name': 'gravestone'}, {'frequency': 'r', 'id': 521, 'synset': 'gravy_boat.n.01', 'synonyms': ['gravy_boat', 'gravy_holder'], 'def': 'a dish (often boat-shaped) for serving gravy or sauce', 'name': 'gravy_boat'}, {'frequency': 'c', 'id': 522, 'synset': 'green_bean.n.02', 'synonyms': ['green_bean'], 'def': 'a common bean plant cultivated for its slender green edible pods', 'name': 'green_bean'}, {'frequency': 'c', 'id': 523, 'synset': 'green_onion.n.01', 'synonyms': ['green_onion', 'spring_onion', 'scallion'], 'def': 'a young onion before the bulb has enlarged', 'name': 'green_onion'}, {'frequency': 'r', 'id': 524, 'synset': 'griddle.n.01', 'synonyms': ['griddle'], 'def': 'cooking utensil consisting of a flat heated surface on which food is cooked', 'name': 'griddle'}, {'frequency': 'r', 'id': 525, 'synset': 'grillroom.n.01', 'synonyms': ['grillroom', 'grill_(restaurant)'], 'def': 'a restaurant where food is cooked on a grill', 'name': 'grillroom'}, {'frequency': 'r', 'id': 526, 'synset': 'grinder.n.04', 'synonyms': ['grinder_(tool)'], 'def': 'a machine tool that polishes metal', 'name': 'grinder_(tool)'}, {'frequency': 'r', 'id': 527, 'synset': 'grits.n.01', 'synonyms': ['grits', 'hominy_grits'], 'def': 'coarsely ground corn boiled as a breakfast dish', 'name': 'grits'}, {'frequency': 'c', 'id': 528, 'synset': 'grizzly.n.01', 'synonyms': ['grizzly', 'grizzly_bear'], 'def': 'powerful brownish-yellow bear of the uplands of western North America', 'name': 'grizzly'}, {'frequency': 'c', 'id': 529, 'synset': 'grocery_bag.n.01', 'synonyms': ['grocery_bag'], 'def': "a sack for holding customer's groceries", 'name': 'grocery_bag'}, {'frequency': 'r', 'id': 530, 'synset': 'guacamole.n.01', 'synonyms': ['guacamole'], 'def': 'a dip made of mashed avocado mixed with chopped onions and other seasonings', 'name': 'guacamole'}, {'frequency': 'f', 'id': 531, 'synset': 'guitar.n.01', 'synonyms': ['guitar'], 'def': 'a stringed instrument usually having six strings; played by strumming or plucking', 'name': 'guitar'}, {'frequency': 'c', 'id': 532, 'synset': 'gull.n.02', 'synonyms': ['gull', 'seagull'], 
'def': 'mostly white aquatic bird having long pointed wings and short legs', 'name': 'gull'}, {'frequency': 'c', 'id': 533, 'synset': 'gun.n.01', 'synonyms': ['gun'], 'def': 'a weapon that discharges a bullet at high velocity from a metal tube', 'name': 'gun'}, {'frequency': 'r', 'id': 534, 'synset': 'hair_spray.n.01', 'synonyms': ['hair_spray'], 'def': 'substance sprayed on the hair to hold it in place', 'name': 'hair_spray'}, {'frequency': 'c', 'id': 535, 'synset': 'hairbrush.n.01', 'synonyms': ['hairbrush'], 'def': "a brush used to groom a person's hair", 'name': 'hairbrush'}, {'frequency': 'c', 'id': 536, 'synset': 'hairnet.n.01', 'synonyms': ['hairnet'], 'def': 'a small net that someone wears over their hair to keep it in place', 'name': 'hairnet'}, {'frequency': 'c', 'id': 537, 'synset': 'hairpin.n.01', 'synonyms': ['hairpin'], 'def': "a double pronged pin used to hold women's hair in place", 'name': 'hairpin'}, {'frequency': 'f', 'id': 538, 'synset': 'ham.n.01', 'synonyms': ['ham', 'jambon', 'gammon'], 'def': 'meat cut from the thigh of a hog (usually smoked)', 'name': 'ham'}, {'frequency': 'c', 'id': 539, 'synset': 'hamburger.n.01', 'synonyms': ['hamburger', 'beefburger', 'burger'], 'def': 'a sandwich consisting of a patty of minced beef served on a bun', 'name': 'hamburger'}, {'frequency': 'c', 'id': 540, 'synset': 'hammer.n.02', 'synonyms': ['hammer'], 'def': 'a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking', 'name': 'hammer'}, {'frequency': 'r', 'id': 541, 'synset': 'hammock.n.02', 'synonyms': ['hammock'], 'def': 'a hanging bed of canvas or rope netting (usually suspended between two trees)', 'name': 'hammock'}, {'frequency': 'r', 'id': 542, 'synset': 'hamper.n.02', 'synonyms': ['hamper'], 'def': 'a basket usually with a cover', 'name': 'hamper'}, {'frequency': 'r', 'id': 543, 'synset': 'hamster.n.01', 'synonyms': ['hamster'], 'def': 'short-tailed burrowing rodent with large cheek pouches', 'name': 'hamster'}, {'frequency': 'c', 'id': 544, 'synset': 'hand_blower.n.01', 'synonyms': ['hair_dryer'], 'def': 'a hand-held electric blower that can blow warm air onto the hair', 'name': 'hair_dryer'}, {'frequency': 'r', 'id': 545, 'synset': 'hand_glass.n.01', 'synonyms': ['hand_glass', 'hand_mirror'], 'def': 'a mirror intended to be held in the hand', 'name': 'hand_glass'}, {'frequency': 'f', 'id': 546, 'synset': 'hand_towel.n.01', 'synonyms': ['hand_towel', 'face_towel'], 'def': 'a small towel used to dry the hands or face', 'name': 'hand_towel'}, {'frequency': 'c', 'id': 547, 'synset': 'handcart.n.01', 'synonyms': ['handcart', 'pushcart', 'hand_truck'], 'def': 'wheeled vehicle that can be pushed by a person', 'name': 'handcart'}, {'frequency': 'r', 'id': 548, 'synset': 'handcuff.n.01', 'synonyms': ['handcuff'], 'def': 'shackle that consists of a metal loop that can be locked around the wrist', 'name': 'handcuff'}, {'frequency': 'c', 'id': 549, 'synset': 'handkerchief.n.01', 'synonyms': ['handkerchief'], 'def': 'a square piece of cloth used for wiping the eyes or nose or as a costume accessory', 'name': 'handkerchief'}, {'frequency': 'f', 'id': 550, 'synset': 'handle.n.01', 'synonyms': ['handle', 'grip', 'handgrip'], 'def': 'the appendage to an object that is designed to be held in order to use or move it', 'name': 'handle'}, {'frequency': 'r', 'id': 551, 'synset': 'handsaw.n.01', 'synonyms': ['handsaw', "carpenter's_saw"], 'def': 'a saw used with one hand for cutting wood', 'name': 'handsaw'}, {'frequency': 'r', 'id': 552, 'synset': 
'hardback.n.01', 'synonyms': ['hardback_book', 'hardcover_book'], 'def': 'a book with cardboard or cloth or leather covers', 'name': 'hardback_book'}, {'frequency': 'r', 'id': 553, 'synset': 'harmonium.n.01', 'synonyms': ['harmonium', 'organ_(musical_instrument)', 'reed_organ_(musical_instrument)'], 'def': 'a free-reed instrument in which air is forced through the reeds by bellows', 'name': 'harmonium'}, {'frequency': 'f', 'id': 554, 'synset': 'hat.n.01', 'synonyms': ['hat'], 'def': 'headwear that protects the head from bad weather, sun, or worn for fashion', 'name': 'hat'}, {'frequency': 'r', 'id': 555, 'synset': 'hatbox.n.01', 'synonyms': ['hatbox'], 'def': 'a round piece of luggage for carrying hats', 'name': 'hatbox'}, {'frequency': 'r', 'id': 556, 'synset': 'hatch.n.03', 'synonyms': ['hatch'], 'def': 'a movable barrier covering a hatchway', 'name': 'hatch'}, {'frequency': 'c', 'id': 557, 'synset': 'head_covering.n.01', 'synonyms': ['veil'], 'def': 'a garment that covers the head and face', 'name': 'veil'}, {'frequency': 'f', 'id': 558, 'synset': 'headband.n.01', 'synonyms': ['headband'], 'def': 'a band worn around or over the head', 'name': 'headband'}, {'frequency': 'f', 'id': 559, 'synset': 'headboard.n.01', 'synonyms': ['headboard'], 'def': 'a vertical board or panel forming the head of a bedstead', 'name': 'headboard'}, {'frequency': 'f', 'id': 560, 'synset': 'headlight.n.01', 'synonyms': ['headlight', 'headlamp'], 'def': 'a powerful light with reflector; attached to the front of an automobile or locomotive', 'name': 'headlight'}, {'frequency': 'c', 'id': 561, 'synset': 'headscarf.n.01', 'synonyms': ['headscarf'], 'def': 'a kerchief worn over the head and tied under the chin', 'name': 'headscarf'}, {'frequency': 'r', 'id': 562, 'synset': 'headset.n.01', 'synonyms': ['headset'], 'def': 'receiver consisting of a pair of headphones', 'name': 'headset'}, {'frequency': 'c', 'id': 563, 'synset': 'headstall.n.01', 'synonyms': ['headstall_(for_horses)', 'headpiece_(for_horses)'], 'def': "the band that is the part of a bridle that fits around a horse's head", 'name': 'headstall_(for_horses)'}, {'frequency': 'r', 'id': 564, 'synset': 'hearing_aid.n.02', 'synonyms': ['hearing_aid'], 'def': 'an acoustic device used to direct sound to the ear of a hearing-impaired person', 'name': 'hearing_aid'}, {'frequency': 'c', 'id': 565, 'synset': 'heart.n.02', 'synonyms': ['heart'], 'def': 'a muscular organ; its contractions move the blood through the body', 'name': 'heart'}, {'frequency': 'c', 'id': 566, 'synset': 'heater.n.01', 'synonyms': ['heater', 'warmer'], 'def': 'device that heats water or supplies warmth to a room', 'name': 'heater'}, {'frequency': 'c', 'id': 567, 'synset': 'helicopter.n.01', 'synonyms': ['helicopter'], 'def': 'an aircraft without wings that obtains its lift from the rotation of overhead blades', 'name': 'helicopter'}, {'frequency': 'f', 'id': 568, 'synset': 'helmet.n.02', 'synonyms': ['helmet'], 'def': 'a protective headgear made of hard material to resist blows', 'name': 'helmet'}, {'frequency': 'r', 'id': 569, 'synset': 'heron.n.02', 'synonyms': ['heron'], 'def': 'grey or white wading bird with long neck and long legs and (usually) long bill', 'name': 'heron'}, {'frequency': 'c', 'id': 570, 'synset': 'highchair.n.01', 'synonyms': ['highchair', 'feeding_chair'], 'def': 'a chair for feeding a very young child', 'name': 'highchair'}, {'frequency': 'f', 'id': 571, 'synset': 'hinge.n.01', 'synonyms': ['hinge'], 'def': 'a joint that holds two parts together so that one can swing 
relative to the other', 'name': 'hinge'}, {'frequency': 'r', 'id': 572, 'synset': 'hippopotamus.n.01', 'synonyms': ['hippopotamus'], 'def': 'massive thick-skinned animal living in or around rivers of tropical Africa', 'name': 'hippopotamus'}, {'frequency': 'r', 'id': 573, 'synset': 'hockey_stick.n.01', 'synonyms': ['hockey_stick'], 'def': 'sports implement consisting of a stick used by hockey players to move the puck', 'name': 'hockey_stick'}, {'frequency': 'c', 'id': 574, 'synset': 'hog.n.03', 'synonyms': ['hog', 'pig'], 'def': 'domestic swine', 'name': 'hog'}, {'frequency': 'f', 'id': 575, 'synset': 'home_plate.n.01', 'synonyms': ['home_plate_(baseball)', 'home_base_(baseball)'], 'def': '(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score', 'name': 'home_plate_(baseball)'}, {'frequency': 'c', 'id': 576, 'synset': 'honey.n.01', 'synonyms': ['honey'], 'def': 'a sweet yellow liquid produced by bees', 'name': 'honey'}, {'frequency': 'f', 'id': 577, 'synset': 'hood.n.06', 'synonyms': ['fume_hood', 'exhaust_hood'], 'def': 'metal covering leading to a vent that exhausts smoke or fumes', 'name': 'fume_hood'}, {'frequency': 'f', 'id': 578, 'synset': 'hook.n.05', 'synonyms': ['hook'], 'def': 'a curved or bent implement for suspending or pulling something', 'name': 'hook'}, {'frequency': 'f', 'id': 579, 'synset': 'horse.n.01', 'synonyms': ['horse'], 'def': 'a common horse', 'name': 'horse'}, {'frequency': 'f', 'id': 580, 'synset': 'hose.n.03', 'synonyms': ['hose', 'hosepipe'], 'def': 'a flexible pipe for conveying a liquid or gas', 'name': 'hose'}, {'frequency': 'r', 'id': 581, 'synset': 'hot-air_balloon.n.01', 'synonyms': ['hot-air_balloon'], 'def': 'balloon for travel through the air in a basket suspended below a large bag of heated air', 'name': 'hot-air_balloon'}, {'frequency': 'r', 'id': 582, 'synset': 'hot_plate.n.01', 'synonyms': ['hotplate'], 'def': 'a portable electric appliance for heating or cooking or keeping food warm', 'name': 'hotplate'}, {'frequency': 'c', 'id': 583, 'synset': 'hot_sauce.n.01', 'synonyms': ['hot_sauce'], 'def': 'a pungent peppery sauce', 'name': 'hot_sauce'}, {'frequency': 'r', 'id': 584, 'synset': 'hourglass.n.01', 'synonyms': ['hourglass'], 'def': 'a sandglass timer that runs for sixty minutes', 'name': 'hourglass'}, {'frequency': 'r', 'id': 585, 'synset': 'houseboat.n.01', 'synonyms': ['houseboat'], 'def': 'a barge that is designed and equipped for use as a dwelling', 'name': 'houseboat'}, {'frequency': 'r', 'id': 586, 'synset': 'hummingbird.n.01', 'synonyms': ['hummingbird'], 'def': 'tiny American bird having brilliant iridescent plumage and long slender bills', 'name': 'hummingbird'}, {'frequency': 'r', 'id': 587, 'synset': 'hummus.n.01', 'synonyms': ['hummus', 'humus', 'hommos', 'hoummos', 'humous'], 'def': 'a thick spread made from mashed chickpeas', 'name': 'hummus'}, {'frequency': 'c', 'id': 588, 'synset': 'ice_bear.n.01', 'synonyms': ['polar_bear'], 'def': 'white bear of Arctic regions', 'name': 'polar_bear'}, {'frequency': 'c', 'id': 589, 'synset': 'ice_cream.n.01', 'synonyms': ['icecream'], 'def': 'frozen dessert containing cream and sugar and flavoring', 'name': 'icecream'}, {'frequency': 'r', 'id': 590, 'synset': 'ice_lolly.n.01', 'synonyms': ['popsicle'], 'def': 'ice cream or water ice on a small wooden stick', 'name': 'popsicle'}, {'frequency': 'c', 'id': 591, 'synset': 'ice_maker.n.01', 'synonyms': ['ice_maker'], 'def': 'an appliance included in some electric refrigerators for making ice cubes', 
'name': 'ice_maker'}, {'frequency': 'r', 'id': 592, 'synset': 'ice_pack.n.01', 'synonyms': ['ice_pack', 'ice_bag'], 'def': 'a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling', 'name': 'ice_pack'}, {'frequency': 'r', 'id': 593, 'synset': 'ice_skate.n.01', 'synonyms': ['ice_skate'], 'def': 'skate consisting of a boot with a steel blade fitted to the sole', 'name': 'ice_skate'}, {'frequency': 'r', 'id': 594, 'synset': 'ice_tea.n.01', 'synonyms': ['ice_tea', 'iced_tea'], 'def': 'strong tea served over ice', 'name': 'ice_tea'}, {'frequency': 'c', 'id': 595, 'synset': 'igniter.n.01', 'synonyms': ['igniter', 'ignitor', 'lighter'], 'def': 'a substance or device used to start a fire', 'name': 'igniter'}, {'frequency': 'r', 'id': 596, 'synset': 'incense.n.01', 'synonyms': ['incense'], 'def': 'a substance that produces a fragrant odor when burned', 'name': 'incense'}, {'frequency': 'r', 'id': 597, 'synset': 'inhaler.n.01', 'synonyms': ['inhaler', 'inhalator'], 'def': 'a dispenser that produces a chemical vapor to be inhaled through mouth or nose', 'name': 'inhaler'}, {'frequency': 'c', 'id': 598, 'synset': 'ipod.n.01', 'synonyms': ['iPod'], 'def': 'a pocket-sized device used to play music files', 'name': 'iPod'}, {'frequency': 'c', 'id': 599, 'synset': 'iron.n.04', 'synonyms': ['iron_(for_clothing)', 'smoothing_iron_(for_clothing)'], 'def': 'home appliance consisting of a flat metal base that is heated and used to smooth cloth', 'name': 'iron_(for_clothing)'}, {'frequency': 'r', 'id': 600, 'synset': 'ironing_board.n.01', 'synonyms': ['ironing_board'], 'def': 'narrow padded board on collapsible supports; used for ironing clothes', 'name': 'ironing_board'}, {'frequency': 'f', 'id': 601, 'synset': 'jacket.n.01', 'synonyms': ['jacket'], 'def': 'a waist-length coat', 'name': 'jacket'}, {'frequency': 'r', 'id': 602, 'synset': 'jam.n.01', 'synonyms': ['jam'], 'def': 'preserve of crushed fruit', 'name': 'jam'}, {'frequency': 'f', 'id': 603, 'synset': 'jean.n.01', 'synonyms': ['jean', 'blue_jean', 'denim'], 'def': '(usually plural) close-fitting trousers of heavy denim for manual work or casual wear', 'name': 'jean'}, {'frequency': 'c', 'id': 604, 'synset': 'jeep.n.01', 'synonyms': ['jeep', 'landrover'], 'def': 'a car suitable for traveling over rough terrain', 'name': 'jeep'}, {'frequency': 'r', 'id': 605, 'synset': 'jelly_bean.n.01', 'synonyms': ['jelly_bean', 'jelly_egg'], 'def': 'sugar-glazed jellied candy', 'name': 'jelly_bean'}, {'frequency': 'f', 'id': 606, 'synset': 'jersey.n.03', 'synonyms': ['jersey', 'T-shirt', 'tee_shirt'], 'def': 'a close-fitting pullover shirt', 'name': 'jersey'}, {'frequency': 'c', 'id': 607, 'synset': 'jet.n.01', 'synonyms': ['jet_plane', 'jet-propelled_plane'], 'def': 'an airplane powered by one or more jet engines', 'name': 'jet_plane'}, {'frequency': 'c', 'id': 608, 'synset': 'jewelry.n.01', 'synonyms': ['jewelry', 'jewellery'], 'def': 'an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)', 'name': 'jewelry'}, {'frequency': 'r', 'id': 609, 'synset': 'joystick.n.02', 'synonyms': ['joystick'], 'def': 'a control device for computers consisting of a vertical handle that can move freely in two directions', 'name': 'joystick'}, {'frequency': 'r', 'id': 610, 'synset': 'jump_suit.n.01', 'synonyms': ['jumpsuit'], 'def': "one-piece garment fashioned after a parachutist's uniform", 'name': 'jumpsuit'}, {'frequency': 'c', 'id': 611, 'synset': 'kayak.n.01', 'synonyms': 
['kayak'], 'def': 'a small canoe consisting of a light frame made watertight with animal skins', 'name': 'kayak'}, {'frequency': 'r', 'id': 612, 'synset': 'keg.n.02', 'synonyms': ['keg'], 'def': 'small cask or barrel', 'name': 'keg'}, {'frequency': 'r', 'id': 613, 'synset': 'kennel.n.01', 'synonyms': ['kennel', 'doghouse'], 'def': 'outbuilding that serves as a shelter for a dog', 'name': 'kennel'}, {'frequency': 'c', 'id': 614, 'synset': 'kettle.n.01', 'synonyms': ['kettle', 'boiler'], 'def': 'a metal pot for stewing or boiling; usually has a lid', 'name': 'kettle'}, {'frequency': 'f', 'id': 615, 'synset': 'key.n.01', 'synonyms': ['key'], 'def': 'metal instrument used to unlock a lock', 'name': 'key'}, {'frequency': 'r', 'id': 616, 'synset': 'keycard.n.01', 'synonyms': ['keycard'], 'def': 'a plastic card used to gain access typically to a door', 'name': 'keycard'}, {'frequency': 'r', 'id': 617, 'synset': 'kilt.n.01', 'synonyms': ['kilt'], 'def': 'a knee-length pleated tartan skirt worn by men as part of the traditional dress in the Highlands of northern Scotland', 'name': 'kilt'}, {'frequency': 'c', 'id': 618, 'synset': 'kimono.n.01', 'synonyms': ['kimono'], 'def': 'a loose robe; imitated from robes originally worn by Japanese', 'name': 'kimono'}, {'frequency': 'f', 'id': 619, 'synset': 'kitchen_sink.n.01', 'synonyms': ['kitchen_sink'], 'def': 'a sink in a kitchen', 'name': 'kitchen_sink'}, {'frequency': 'c', 'id': 620, 'synset': 'kitchen_table.n.01', 'synonyms': ['kitchen_table'], 'def': 'a table in the kitchen', 'name': 'kitchen_table'}, {'frequency': 'f', 'id': 621, 'synset': 'kite.n.03', 'synonyms': ['kite'], 'def': 'plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string', 'name': 'kite'}, {'frequency': 'c', 'id': 622, 'synset': 'kitten.n.01', 'synonyms': ['kitten', 'kitty'], 'def': 'young domestic cat', 'name': 'kitten'}, {'frequency': 'c', 'id': 623, 'synset': 'kiwi.n.03', 'synonyms': ['kiwi_fruit'], 'def': 'fuzzy brown egg-shaped fruit with slightly tart green flesh', 'name': 'kiwi_fruit'}, {'frequency': 'f', 'id': 624, 'synset': 'knee_pad.n.01', 'synonyms': ['knee_pad'], 'def': 'protective garment consisting of a pad worn by football or baseball or hockey players', 'name': 'knee_pad'}, {'frequency': 'f', 'id': 625, 'synset': 'knife.n.01', 'synonyms': ['knife'], 'def': 'tool with a blade and point used as a cutting instrument', 'name': 'knife'}, {'frequency': 'r', 'id': 626, 'synset': 'knight.n.02', 'synonyms': ['knight_(chess_piece)', 'horse_(chess_piece)'], 'def': 'a chess game piece shaped to resemble the head of a horse', 'name': 'knight_(chess_piece)'}, {'frequency': 'r', 'id': 627, 'synset': 'knitting_needle.n.01', 'synonyms': ['knitting_needle'], 'def': 'needle consisting of a slender rod with pointed ends; usually used in pairs', 'name': 'knitting_needle'}, {'frequency': 'f', 'id': 628, 'synset': 'knob.n.02', 'synonyms': ['knob'], 'def': 'a round handle often found on a door', 'name': 'knob'}, {'frequency': 'r', 'id': 629, 'synset': 'knocker.n.05', 'synonyms': ['knocker_(on_a_door)', 'doorknocker'], 'def': 'a device (usually metal and ornamental) attached by a hinge to a door', 'name': 'knocker_(on_a_door)'}, {'frequency': 'r', 'id': 630, 'synset': 'koala.n.01', 'synonyms': ['koala', 'koala_bear'], 'def': 'sluggish tailless Australian marsupial with grey furry ears and coat', 'name': 'koala'}, {'frequency': 'r', 'id': 631, 'synset': 'lab_coat.n.01', 'synonyms': ['lab_coat', 'laboratory_coat'], 'def': 'a light coat worn to protect 
clothing from substances used while working in a laboratory', 'name': 'lab_coat'}, {'frequency': 'f', 'id': 632, 'synset': 'ladder.n.01', 'synonyms': ['ladder'], 'def': 'steps consisting of two parallel members connected by rungs', 'name': 'ladder'}, {'frequency': 'c', 'id': 633, 'synset': 'ladle.n.01', 'synonyms': ['ladle'], 'def': 'a spoon-shaped vessel with a long handle frequently used to transfer liquids', 'name': 'ladle'}, {'frequency': 'r', 'id': 634, 'synset': 'ladybug.n.01', 'synonyms': ['ladybug', 'ladybeetle', 'ladybird_beetle'], 'def': 'small round bright-colored and spotted beetle, typically red and black', 'name': 'ladybug'}, {'frequency': 'c', 'id': 635, 'synset': 'lamb.n.01', 'synonyms': ['lamb_(animal)'], 'def': 'young sheep', 'name': 'lamb_(animal)'}, {'frequency': 'r', 'id': 636, 'synset': 'lamb_chop.n.01', 'synonyms': ['lamb-chop', 'lambchop'], 'def': 'chop cut from a lamb', 'name': 'lamb-chop'}, {'frequency': 'f', 'id': 637, 'synset': 'lamp.n.02', 'synonyms': ['lamp'], 'def': 'a piece of furniture holding one or more electric light bulbs', 'name': 'lamp'}, {'frequency': 'f', 'id': 638, 'synset': 'lamppost.n.01', 'synonyms': ['lamppost'], 'def': 'a metal post supporting an outdoor lamp (such as a streetlight)', 'name': 'lamppost'}, {'frequency': 'f', 'id': 639, 'synset': 'lampshade.n.01', 'synonyms': ['lampshade'], 'def': 'a protective ornamental shade used to screen a light bulb from direct view', 'name': 'lampshade'}, {'frequency': 'c', 'id': 640, 'synset': 'lantern.n.01', 'synonyms': ['lantern'], 'def': 'light in a transparent protective case', 'name': 'lantern'}, {'frequency': 'f', 'id': 641, 'synset': 'lanyard.n.02', 'synonyms': ['lanyard', 'laniard'], 'def': 'a cord worn around the neck to hold a knife or whistle, etc.', 'name': 'lanyard'}, {'frequency': 'f', 'id': 642, 'synset': 'laptop.n.01', 'synonyms': ['laptop_computer', 'notebook_computer'], 'def': 'a portable computer small enough to use in your lap', 'name': 'laptop_computer'}, {'frequency': 'r', 'id': 643, 'synset': 'lasagna.n.01', 'synonyms': ['lasagna', 'lasagne'], 'def': 'baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables', 'name': 'lasagna'}, {'frequency': 'c', 'id': 644, 'synset': 'latch.n.02', 'synonyms': ['latch'], 'def': 'a bar that can be lowered or slid into a groove to fasten a door or gate', 'name': 'latch'}, {'frequency': 'r', 'id': 645, 'synset': 'lawn_mower.n.01', 'synonyms': ['lawn_mower'], 'def': 'garden tool for mowing grass on lawns', 'name': 'lawn_mower'}, {'frequency': 'r', 'id': 646, 'synset': 'leather.n.01', 'synonyms': ['leather'], 'def': 'an animal skin made smooth and flexible by removing the hair and then tanning', 'name': 'leather'}, {'frequency': 'c', 'id': 647, 'synset': 'legging.n.01', 'synonyms': ['legging_(clothing)', 'leging_(clothing)', 'leg_covering'], 'def': 'a garment covering the leg (usually extending from the knee to the ankle)', 'name': 'legging_(clothing)'}, {'frequency': 'c', 'id': 648, 'synset': 'lego.n.01', 'synonyms': ['Lego', 'Lego_set'], 'def': "a child's plastic construction set for making models from blocks", 'name': 'Lego'}, {'frequency': 'f', 'id': 649, 'synset': 'lemon.n.01', 'synonyms': ['lemon'], 'def': 'yellow oval fruit with juicy acidic flesh', 'name': 'lemon'}, {'frequency': 'r', 'id': 650, 'synset': 'lemonade.n.01', 'synonyms': ['lemonade'], 'def': 'sweetened beverage of diluted lemon juice', 'name': 'lemonade'}, {'frequency': 'f', 'id': 651, 'synset': 'lettuce.n.02', 'synonyms': ['lettuce'], 'def': 'leafy plant 
commonly eaten in salad or on sandwiches', 'name': 'lettuce'}, {'frequency': 'f', 'id': 652, 'synset': 'license_plate.n.01', 'synonyms': ['license_plate', 'numberplate'], 'def': "a plate mounted on the front and back of car and bearing the car's registration number", 'name': 'license_plate'}, {'frequency': 'f', 'id': 653, 'synset': 'life_buoy.n.01', 'synonyms': ['life_buoy', 'lifesaver', 'life_belt', 'life_ring'], 'def': 'a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)', 'name': 'life_buoy'}, {'frequency': 'f', 'id': 654, 'synset': 'life_jacket.n.01', 'synonyms': ['life_jacket', 'life_vest'], 'def': 'life preserver consisting of a sleeveless jacket of buoyant or inflatable design', 'name': 'life_jacket'}, {'frequency': 'f', 'id': 655, 'synset': 'light_bulb.n.01', 'synonyms': ['lightbulb'], 'def': 'glass bulb or tube shaped electric device that emits light (DO NOT MARK LAMPS AS A WHOLE)', 'name': 'lightbulb'}, {'frequency': 'r', 'id': 656, 'synset': 'lightning_rod.n.02', 'synonyms': ['lightning_rod', 'lightning_conductor'], 'def': 'a metallic conductor that is attached to a high point and leads to the ground', 'name': 'lightning_rod'}, {'frequency': 'c', 'id': 657, 'synset': 'lime.n.06', 'synonyms': ['lime'], 'def': 'the green acidic fruit of any of various lime trees', 'name': 'lime'}, {'frequency': 'r', 'id': 658, 'synset': 'limousine.n.01', 'synonyms': ['limousine'], 'def': 'long luxurious car; usually driven by a chauffeur', 'name': 'limousine'}, {'frequency': 'r', 'id': 659, 'synset': 'linen.n.02', 'synonyms': ['linen_paper'], 'def': 'a high-quality paper made of linen fibers or with a linen finish', 'name': 'linen_paper'}, {'frequency': 'c', 'id': 660, 'synset': 'lion.n.01', 'synonyms': ['lion'], 'def': 'large gregarious predatory cat of Africa and India', 'name': 'lion'}, {'frequency': 'c', 'id': 661, 'synset': 'lip_balm.n.01', 'synonyms': ['lip_balm'], 'def': 'a balm applied to the lips', 'name': 'lip_balm'}, {'frequency': 'c', 'id': 662, 'synset': 'lipstick.n.01', 'synonyms': ['lipstick', 'lip_rouge'], 'def': 'makeup that is used to color the lips', 'name': 'lipstick'}, {'frequency': 'r', 'id': 663, 'synset': 'liquor.n.01', 'synonyms': ['liquor', 'spirits', 'hard_liquor', 'liqueur', 'cordial'], 'def': 'an alcoholic beverage that is distilled rather than fermented', 'name': 'liquor'}, {'frequency': 'r', 'id': 664, 'synset': 'lizard.n.01', 'synonyms': ['lizard'], 'def': 'a reptile with usually two pairs of legs and a tapering tail', 'name': 'lizard'}, {'frequency': 'r', 'id': 665, 'synset': 'loafer.n.02', 'synonyms': ['Loafer_(type_of_shoe)'], 'def': 'a low leather step-in shoe', 'name': 'Loafer_(type_of_shoe)'}, {'frequency': 'f', 'id': 666, 'synset': 'log.n.01', 'synonyms': ['log'], 'def': 'a segment of the trunk of a tree when stripped of branches', 'name': 'log'}, {'frequency': 'c', 'id': 667, 'synset': 'lollipop.n.02', 'synonyms': ['lollipop'], 'def': 'hard candy on a stick', 'name': 'lollipop'}, {'frequency': 'c', 'id': 668, 'synset': 'lotion.n.01', 'synonyms': ['lotion'], 'def': 'any of various cosmetic preparations that are applied to the skin', 'name': 'lotion'}, {'frequency': 'f', 'id': 669, 'synset': 'loudspeaker.n.01', 'synonyms': ['speaker_(stero_equipment)'], 'def': 'electronic device that produces sound often as part of a stereo system', 'name': 'speaker_(stero_equipment)'}, {'frequency': 'c', 'id': 670, 'synset': 'love_seat.n.01', 'synonyms': ['loveseat'], 'def': 'small sofa that seats two people', 'name': 'loveseat'}, {'frequency': 
'r', 'id': 671, 'synset': 'machine_gun.n.01', 'synonyms': ['machine_gun'], 'def': 'a rapidly firing automatic gun', 'name': 'machine_gun'}, {'frequency': 'f', 'id': 672, 'synset': 'magazine.n.02', 'synonyms': ['magazine'], 'def': 'a paperback periodic publication', 'name': 'magazine'}, {'frequency': 'f', 'id': 673, 'synset': 'magnet.n.01', 'synonyms': ['magnet'], 'def': 'a device that attracts iron and produces a magnetic field', 'name': 'magnet'}, {'frequency': 'r', 'id': 674, 'synset': 'mail_slot.n.01', 'synonyms': ['mail_slot'], 'def': 'a slot (usually in a door) through which mail can be delivered', 'name': 'mail_slot'}, {'frequency': 'c', 'id': 675, 'synset': 'mailbox.n.01', 'synonyms': ['mailbox_(at_home)', 'letter_box_(at_home)'], 'def': 'a private box for delivery of mail', 'name': 'mailbox_(at_home)'}, {'frequency': 'r', 'id': 676, 'synset': 'mallet.n.01', 'synonyms': ['mallet'], 'def': 'a sports implement with a long handle and a hammer-like head used to hit a ball', 'name': 'mallet'}, {'frequency': 'r', 'id': 677, 'synset': 'mammoth.n.01', 'synonyms': ['mammoth'], 'def': 'any of numerous extinct elephants widely distributed in the Pleistocene', 'name': 'mammoth'}, {'frequency': 'c', 'id': 678, 'synset': 'mandarin.n.05', 'synonyms': ['mandarin_orange'], 'def': 'a somewhat flat reddish-orange loose skinned citrus of China', 'name': 'mandarin_orange'}, {'frequency': 'c', 'id': 679, 'synset': 'manger.n.01', 'synonyms': ['manger', 'trough'], 'def': 'a container (usually in a barn or stable) from which cattle or horses feed', 'name': 'manger'}, {'frequency': 'f', 'id': 680, 'synset': 'manhole.n.01', 'synonyms': ['manhole'], 'def': 'a hole (usually with a flush cover) through which a person can gain access to an underground structure', 'name': 'manhole'}, {'frequency': 'c', 'id': 681, 'synset': 'map.n.01', 'synonyms': ['map'], 'def': "a diagrammatic representation of the earth's surface (or part of it)", 'name': 'map'}, {'frequency': 'c', 'id': 682, 'synset': 'marker.n.03', 'synonyms': ['marker'], 'def': 'a writing implement for making a mark', 'name': 'marker'}, {'frequency': 'r', 'id': 683, 'synset': 'martini.n.01', 'synonyms': ['martini'], 'def': 'a cocktail made of gin (or vodka) with dry vermouth', 'name': 'martini'}, {'frequency': 'r', 'id': 684, 'synset': 'mascot.n.01', 'synonyms': ['mascot'], 'def': 'a person or animal that is adopted by a team or other group as a symbolic figure', 'name': 'mascot'}, {'frequency': 'c', 'id': 685, 'synset': 'mashed_potato.n.01', 'synonyms': ['mashed_potato'], 'def': 'potato that has been peeled and boiled and then mashed', 'name': 'mashed_potato'}, {'frequency': 'r', 'id': 686, 'synset': 'masher.n.02', 'synonyms': ['masher'], 'def': 'a kitchen utensil used for mashing (e.g. 
potatoes)', 'name': 'masher'}, {'frequency': 'f', 'id': 687, 'synset': 'mask.n.04', 'synonyms': ['mask', 'facemask'], 'def': 'a protective covering worn over the face', 'name': 'mask'}, {'frequency': 'f', 'id': 688, 'synset': 'mast.n.01', 'synonyms': ['mast'], 'def': 'a vertical spar for supporting sails', 'name': 'mast'}, {'frequency': 'c', 'id': 689, 'synset': 'mat.n.03', 'synonyms': ['mat_(gym_equipment)', 'gym_mat'], 'def': 'sports equipment consisting of a piece of thick padding on the floor for gymnastics', 'name': 'mat_(gym_equipment)'}, {'frequency': 'r', 'id': 690, 'synset': 'matchbox.n.01', 'synonyms': ['matchbox'], 'def': 'a box for holding matches', 'name': 'matchbox'}, {'frequency': 'f', 'id': 691, 'synset': 'mattress.n.01', 'synonyms': ['mattress'], 'def': 'a thick pad filled with resilient material used as a bed or part of a bed', 'name': 'mattress'}, {'frequency': 'c', 'id': 692, 'synset': 'measuring_cup.n.01', 'synonyms': ['measuring_cup'], 'def': 'graduated cup used to measure liquid or granular ingredients', 'name': 'measuring_cup'}, {'frequency': 'c', 'id': 693, 'synset': 'measuring_stick.n.01', 'synonyms': ['measuring_stick', 'ruler_(measuring_stick)', 'measuring_rod'], 'def': 'measuring instrument having a sequence of marks at regular intervals', 'name': 'measuring_stick'}, {'frequency': 'c', 'id': 694, 'synset': 'meatball.n.01', 'synonyms': ['meatball'], 'def': 'ground meat formed into a ball and fried or simmered in broth', 'name': 'meatball'}, {'frequency': 'c', 'id': 695, 'synset': 'medicine.n.02', 'synonyms': ['medicine'], 'def': 'something that treats or prevents or alleviates the symptoms of disease', 'name': 'medicine'}, {'frequency': 'r', 'id': 696, 'synset': 'melon.n.01', 'synonyms': ['melon'], 'def': 'fruit of the gourd family having a hard rind and sweet juicy flesh', 'name': 'melon'}, {'frequency': 'f', 'id': 697, 'synset': 'microphone.n.01', 'synonyms': ['microphone'], 'def': 'device for converting sound waves into electrical energy', 'name': 'microphone'}, {'frequency': 'r', 'id': 698, 'synset': 'microscope.n.01', 'synonyms': ['microscope'], 'def': 'magnifier of the image of small objects', 'name': 'microscope'}, {'frequency': 'f', 'id': 699, 'synset': 'microwave.n.02', 'synonyms': ['microwave_oven'], 'def': 'kitchen appliance that cooks food by passing an electromagnetic wave through it', 'name': 'microwave_oven'}, {'frequency': 'r', 'id': 700, 'synset': 'milestone.n.01', 'synonyms': ['milestone', 'milepost'], 'def': 'stone post at side of a road to show distances', 'name': 'milestone'}, {'frequency': 'c', 'id': 701, 'synset': 'milk.n.01', 'synonyms': ['milk'], 'def': 'a white nutritious liquid secreted by mammals and used as food by human beings', 'name': 'milk'}, {'frequency': 'f', 'id': 702, 'synset': 'minivan.n.01', 'synonyms': ['minivan'], 'def': 'a small box-shaped passenger van', 'name': 'minivan'}, {'frequency': 'r', 'id': 703, 'synset': 'mint.n.05', 'synonyms': ['mint_candy'], 'def': 'a candy that is flavored with a mint oil', 'name': 'mint_candy'}, {'frequency': 'f', 'id': 704, 'synset': 'mirror.n.01', 'synonyms': ['mirror'], 'def': 'polished surface that forms images by reflecting light', 'name': 'mirror'}, {'frequency': 'c', 'id': 705, 'synset': 'mitten.n.01', 'synonyms': ['mitten'], 'def': 'glove that encases the thumb separately and the other four fingers together', 'name': 'mitten'}, {'frequency': 'c', 'id': 706, 'synset': 'mixer.n.04', 'synonyms': ['mixer_(kitchen_tool)', 'stand_mixer'], 'def': 'a kitchen utensil that is used for mixing 
foods', 'name': 'mixer_(kitchen_tool)'}, {'frequency': 'c', 'id': 707, 'synset': 'money.n.03', 'synonyms': ['money'], 'def': 'the official currency issued by a government or national bank', 'name': 'money'}, {'frequency': 'f', 'id': 708, 'synset': 'monitor.n.04', 'synonyms': ['monitor_(computer_equipment) computer_monitor'], 'def': 'a computer monitor', 'name': 'monitor_(computer_equipment) computer_monitor'}, {'frequency': 'c', 'id': 709, 'synset': 'monkey.n.01', 'synonyms': ['monkey'], 'def': 'any of various long-tailed primates', 'name': 'monkey'}, {'frequency': 'f', 'id': 710, 'synset': 'motor.n.01', 'synonyms': ['motor'], 'def': 'machine that converts other forms of energy into mechanical energy and so imparts motion', 'name': 'motor'}, {'frequency': 'f', 'id': 711, 'synset': 'motor_scooter.n.01', 'synonyms': ['motor_scooter', 'scooter'], 'def': 'a wheeled vehicle with small wheels and a low-powered engine', 'name': 'motor_scooter'}, {'frequency': 'r', 'id': 712, 'synset': 'motor_vehicle.n.01', 'synonyms': ['motor_vehicle', 'automotive_vehicle'], 'def': 'a self-propelled wheeled vehicle that does not run on rails', 'name': 'motor_vehicle'}, {'frequency': 'r', 'id': 713, 'synset': 'motorboat.n.01', 'synonyms': ['motorboat', 'powerboat'], 'def': 'a boat propelled by an internal-combustion engine', 'name': 'motorboat'}, {'frequency': 'f', 'id': 714, 'synset': 'motorcycle.n.01', 'synonyms': ['motorcycle'], 'def': 'a motor vehicle with two wheels and a strong frame', 'name': 'motorcycle'}, {'frequency': 'f', 'id': 715, 'synset': 'mound.n.01', 'synonyms': ['mound_(baseball)', "pitcher's_mound"], 'def': '(baseball) the slight elevation on which the pitcher stands', 'name': 'mound_(baseball)'}, {'frequency': 'r', 'id': 716, 'synset': 'mouse.n.01', 'synonyms': ['mouse_(animal_rodent)'], 'def': 'a small rodent with pointed snouts and small ears on elongated bodies with slender usually hairless tails', 'name': 'mouse_(animal_rodent)'}, {'frequency': 'f', 'id': 717, 'synset': 'mouse.n.04', 'synonyms': ['mouse_(computer_equipment)', 'computer_mouse'], 'def': 'a computer input device that controls an on-screen pointer', 'name': 'mouse_(computer_equipment)'}, {'frequency': 'f', 'id': 718, 'synset': 'mousepad.n.01', 'synonyms': ['mousepad'], 'def': 'a small portable pad that provides an operating surface for a computer mouse', 'name': 'mousepad'}, {'frequency': 'c', 'id': 719, 'synset': 'muffin.n.01', 'synonyms': ['muffin'], 'def': 'a sweet quick bread baked in a cup-shaped pan', 'name': 'muffin'}, {'frequency': 'f', 'id': 720, 'synset': 'mug.n.04', 'synonyms': ['mug'], 'def': 'with handle and usually cylindrical', 'name': 'mug'}, {'frequency': 'f', 'id': 721, 'synset': 'mushroom.n.02', 'synonyms': ['mushroom'], 'def': 'a common mushroom', 'name': 'mushroom'}, {'frequency': 'r', 'id': 722, 'synset': 'music_stool.n.01', 'synonyms': ['music_stool', 'piano_stool'], 'def': 'a stool for piano players; usually adjustable in height', 'name': 'music_stool'}, {'frequency': 'r', 'id': 723, 'synset': 'musical_instrument.n.01', 'synonyms': ['musical_instrument', 'instrument_(musical)'], 'def': 'any of various devices or contrivances that can be used to produce musical tones or sounds', 'name': 'musical_instrument'}, {'frequency': 'r', 'id': 724, 'synset': 'nailfile.n.01', 'synonyms': ['nailfile'], 'def': 'a small flat file for shaping the nails', 'name': 'nailfile'}, {'frequency': 'r', 'id': 725, 'synset': 'nameplate.n.01', 'synonyms': ['nameplate'], 'def': 'a plate bearing a name', 'name': 'nameplate'}, 
{'frequency': 'f', 'id': 726, 'synset': 'napkin.n.01', 'synonyms': ['napkin', 'table_napkin', 'serviette'], 'def': 'a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing', 'name': 'napkin'}, {'frequency': 'r', 'id': 727, 'synset': 'neckerchief.n.01', 'synonyms': ['neckerchief'], 'def': 'a kerchief worn around the neck', 'name': 'neckerchief'}, {'frequency': 'f', 'id': 728, 'synset': 'necklace.n.01', 'synonyms': ['necklace'], 'def': 'jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament', 'name': 'necklace'}, {'frequency': 'f', 'id': 729, 'synset': 'necktie.n.01', 'synonyms': ['necktie', 'tie_(necktie)'], 'def': 'neckwear consisting of a long narrow piece of material worn under a collar and tied in knot at the front', 'name': 'necktie'}, {'frequency': 'r', 'id': 730, 'synset': 'needle.n.03', 'synonyms': ['needle'], 'def': 'a sharp pointed implement (usually metal)', 'name': 'needle'}, {'frequency': 'c', 'id': 731, 'synset': 'nest.n.01', 'synonyms': ['nest'], 'def': 'a structure in which animals lay eggs or give birth to their young', 'name': 'nest'}, {'frequency': 'r', 'id': 732, 'synset': 'newsstand.n.01', 'synonyms': ['newsstand'], 'def': 'a stall where newspapers and other periodicals are sold', 'name': 'newsstand'}, {'frequency': 'c', 'id': 733, 'synset': 'nightwear.n.01', 'synonyms': ['nightshirt', 'nightwear', 'sleepwear', 'nightclothes'], 'def': 'garments designed to be worn in bed', 'name': 'nightshirt'}, {'frequency': 'r', 'id': 734, 'synset': 'nosebag.n.01', 'synonyms': ['nosebag_(for_animals)', 'feedbag'], 'def': 'a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head', 'name': 'nosebag_(for_animals)'}, {'frequency': 'r', 'id': 735, 'synset': 'noseband.n.01', 'synonyms': ['noseband_(for_animals)', 'nosepiece_(for_animals)'], 'def': "a strap that is the part of a bridle that goes over the animal's nose", 'name': 'noseband_(for_animals)'}, {'frequency': 'f', 'id': 736, 'synset': 'notebook.n.01', 'synonyms': ['notebook'], 'def': 'a book with blank pages for recording notes or memoranda', 'name': 'notebook'}, {'frequency': 'c', 'id': 737, 'synset': 'notepad.n.01', 'synonyms': ['notepad'], 'def': 'a pad of paper for keeping notes', 'name': 'notepad'}, {'frequency': 'c', 'id': 738, 'synset': 'nut.n.03', 'synonyms': ['nut'], 'def': 'a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt', 'name': 'nut'}, {'frequency': 'r', 'id': 739, 'synset': 'nutcracker.n.01', 'synonyms': ['nutcracker'], 'def': 'a hand tool used to crack nuts open', 'name': 'nutcracker'}, {'frequency': 'c', 'id': 740, 'synset': 'oar.n.01', 'synonyms': ['oar'], 'def': 'an implement used to propel or steer a boat', 'name': 'oar'}, {'frequency': 'r', 'id': 741, 'synset': 'octopus.n.01', 'synonyms': ['octopus_(food)'], 'def': 'tentacles of octopus prepared as food', 'name': 'octopus_(food)'}, {'frequency': 'r', 'id': 742, 'synset': 'octopus.n.02', 'synonyms': ['octopus_(animal)'], 'def': 'bottom-living cephalopod having a soft oval body with eight long tentacles', 'name': 'octopus_(animal)'}, {'frequency': 'c', 'id': 743, 'synset': 'oil_lamp.n.01', 'synonyms': ['oil_lamp', 'kerosene_lamp', 'kerosine_lamp'], 'def': 'a lamp that burns oil (as kerosine) for light', 'name': 'oil_lamp'}, {'frequency': 'c', 'id': 744, 'synset': 'olive_oil.n.01', 'synonyms': ['olive_oil'], 'def': 'oil from olives', 'name': 'olive_oil'}, 
{'frequency': 'r', 'id': 745, 'synset': 'omelet.n.01', 'synonyms': ['omelet', 'omelette'], 'def': 'beaten eggs cooked until just set; may be folded around e.g. ham or cheese or jelly', 'name': 'omelet'}, {'frequency': 'f', 'id': 746, 'synset': 'onion.n.01', 'synonyms': ['onion'], 'def': 'the bulb of an onion plant', 'name': 'onion'}, {'frequency': 'f', 'id': 747, 'synset': 'orange.n.01', 'synonyms': ['orange_(fruit)'], 'def': 'orange (FRUIT of an orange tree)', 'name': 'orange_(fruit)'}, {'frequency': 'c', 'id': 748, 'synset': 'orange_juice.n.01', 'synonyms': ['orange_juice'], 'def': 'bottled or freshly squeezed juice of oranges', 'name': 'orange_juice'}, {'frequency': 'r', 'id': 749, 'synset': 'oregano.n.01', 'synonyms': ['oregano', 'marjoram'], 'def': 'aromatic Eurasian perennial herb used in cooking and baking', 'name': 'oregano'}, {'frequency': 'c', 'id': 750, 'synset': 'ostrich.n.02', 'synonyms': ['ostrich'], 'def': 'fast-running African flightless bird with two-toed feet; largest living bird', 'name': 'ostrich'}, {'frequency': 'c', 'id': 751, 'synset': 'ottoman.n.03', 'synonyms': ['ottoman', 'pouf', 'pouffe', 'hassock'], 'def': 'thick cushion used as a seat', 'name': 'ottoman'}, {'frequency': 'c', 'id': 752, 'synset': 'overall.n.01', 'synonyms': ['overalls_(clothing)'], 'def': 'work clothing consisting of denim trousers usually with a bib and shoulder straps', 'name': 'overalls_(clothing)'}, {'frequency': 'c', 'id': 753, 'synset': 'owl.n.01', 'synonyms': ['owl'], 'def': 'nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes', 'name': 'owl'}, {'frequency': 'c', 'id': 754, 'synset': 'packet.n.03', 'synonyms': ['packet'], 'def': 'a small package or bundle', 'name': 'packet'}, {'frequency': 'r', 'id': 755, 'synset': 'pad.n.03', 'synonyms': ['inkpad', 'inking_pad', 'stamp_pad'], 'def': 'absorbent material saturated with ink used to transfer ink evenly to a rubber stamp', 'name': 'inkpad'}, {'frequency': 'c', 'id': 756, 'synset': 'pad.n.04', 'synonyms': ['pad'], 'def': 'a flat mass of soft material used for protection, stuffing, or comfort', 'name': 'pad'}, {'frequency': 'c', 'id': 757, 'synset': 'paddle.n.04', 'synonyms': ['paddle', 'boat_paddle'], 'def': 'a short light oar used without an oarlock to propel a canoe or small boat', 'name': 'paddle'}, {'frequency': 'c', 'id': 758, 'synset': 'padlock.n.01', 'synonyms': ['padlock'], 'def': 'a detachable, portable lock', 'name': 'padlock'}, {'frequency': 'r', 'id': 759, 'synset': 'paintbox.n.01', 'synonyms': ['paintbox'], 'def': "a box containing a collection of cubes or tubes of artists' paint", 'name': 'paintbox'}, {'frequency': 'c', 'id': 760, 'synset': 'paintbrush.n.01', 'synonyms': ['paintbrush'], 'def': 'a brush used as an applicator to apply paint', 'name': 'paintbrush'}, {'frequency': 'f', 'id': 761, 'synset': 'painting.n.01', 'synonyms': ['painting'], 'def': 'graphic art consisting of an artistic composition made by applying paints to a surface', 'name': 'painting'}, {'frequency': 'c', 'id': 762, 'synset': 'pajama.n.02', 'synonyms': ['pajamas', 'pyjamas'], 'def': 'loose-fitting nightclothes worn for sleeping or lounging', 'name': 'pajamas'}, {'frequency': 'c', 'id': 763, 'synset': 'palette.n.02', 'synonyms': ['palette', 'pallet'], 'def': 'board that provides a flat surface on which artists mix paints and the range of colors used', 'name': 'palette'}, {'frequency': 'f', 'id': 764, 'synset': 'pan.n.01', 'synonyms': ['pan_(for_cooking)', 'cooking_pan'], 'def': 'cooking utensil consisting of a wide 
metal vessel', 'name': 'pan_(for_cooking)'}, {'frequency': 'r', 'id': 765, 'synset': 'pan.n.03', 'synonyms': ['pan_(metal_container)'], 'def': 'shallow container made of metal', 'name': 'pan_(metal_container)'}, {'frequency': 'c', 'id': 766, 'synset': 'pancake.n.01', 'synonyms': ['pancake'], 'def': 'a flat cake of thin batter fried on both sides on a griddle', 'name': 'pancake'}, {'frequency': 'r', 'id': 767, 'synset': 'pantyhose.n.01', 'synonyms': ['pantyhose'], 'def': "a woman's tights consisting of underpants and stockings", 'name': 'pantyhose'}, {'frequency': 'r', 'id': 768, 'synset': 'papaya.n.02', 'synonyms': ['papaya'], 'def': 'large oval melon-like tropical fruit with yellowish flesh', 'name': 'papaya'}, {'frequency': 'r', 'id': 769, 'synset': 'paper_clip.n.01', 'synonyms': ['paperclip'], 'def': 'a wire or plastic clip for holding sheets of paper together', 'name': 'paperclip'}, {'frequency': 'f', 'id': 770, 'synset': 'paper_plate.n.01', 'synonyms': ['paper_plate'], 'def': 'a disposable plate made of cardboard', 'name': 'paper_plate'}, {'frequency': 'f', 'id': 771, 'synset': 'paper_towel.n.01', 'synonyms': ['paper_towel'], 'def': 'a disposable towel made of absorbent paper', 'name': 'paper_towel'}, {'frequency': 'r', 'id': 772, 'synset': 'paperback_book.n.01', 'synonyms': ['paperback_book', 'paper-back_book', 'softback_book', 'soft-cover_book'], 'def': 'a book with paper covers', 'name': 'paperback_book'}, {'frequency': 'r', 'id': 773, 'synset': 'paperweight.n.01', 'synonyms': ['paperweight'], 'def': 'a weight used to hold down a stack of papers', 'name': 'paperweight'}, {'frequency': 'c', 'id': 774, 'synset': 'parachute.n.01', 'synonyms': ['parachute'], 'def': 'rescue equipment consisting of a device that fills with air and retards your fall', 'name': 'parachute'}, {'frequency': 'r', 'id': 775, 'synset': 'parakeet.n.01', 'synonyms': ['parakeet', 'parrakeet', 'parroket', 'paraquet', 'paroquet', 'parroquet'], 'def': 'any of numerous small slender long-tailed parrots', 'name': 'parakeet'}, {'frequency': 'c', 'id': 776, 'synset': 'parasail.n.01', 'synonyms': ['parasail_(sports)'], 'def': 'parachute that will lift a person up into the air when it is towed by a motorboat or a car', 'name': 'parasail_(sports)'}, {'frequency': 'r', 'id': 777, 'synset': 'parchment.n.01', 'synonyms': ['parchment'], 'def': 'a superior paper resembling sheepskin', 'name': 'parchment'}, {'frequency': 'r', 'id': 778, 'synset': 'parka.n.01', 'synonyms': ['parka', 'anorak'], 'def': "a kind of heavy jacket (`windcheater' is a British term)", 'name': 'parka'}, {'frequency': 'f', 'id': 779, 'synset': 'parking_meter.n.01', 'synonyms': ['parking_meter'], 'def': 'a coin-operated timer located next to a parking space', 'name': 'parking_meter'}, {'frequency': 'c', 'id': 780, 'synset': 'parrot.n.01', 'synonyms': ['parrot'], 'def': 'usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds', 'name': 'parrot'}, {'frequency': 'c', 'id': 781, 'synset': 'passenger_car.n.01', 'synonyms': ['passenger_car_(part_of_a_train)', 'coach_(part_of_a_train)'], 'def': 'a railcar where passengers ride', 'name': 'passenger_car_(part_of_a_train)'}, {'frequency': 'r', 'id': 782, 'synset': 'passenger_ship.n.01', 'synonyms': ['passenger_ship'], 'def': 'a ship built to carry passengers', 'name': 'passenger_ship'}, {'frequency': 'r', 'id': 783, 'synset': 'passport.n.02', 'synonyms': ['passport'], 'def': 'a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home 
country', 'name': 'passport'}, {'frequency': 'f', 'id': 784, 'synset': 'pastry.n.02', 'synonyms': ['pastry'], 'def': 'any of various baked foods made of dough or batter', 'name': 'pastry'}, {'frequency': 'r', 'id': 785, 'synset': 'patty.n.01', 'synonyms': ['patty_(food)'], 'def': 'small flat mass of chopped food', 'name': 'patty_(food)'}, {'frequency': 'c', 'id': 786, 'synset': 'pea.n.01', 'synonyms': ['pea_(food)'], 'def': 'seed of a pea plant used for food', 'name': 'pea_(food)'}, {'frequency': 'c', 'id': 787, 'synset': 'peach.n.03', 'synonyms': ['peach'], 'def': 'downy juicy fruit with sweet yellowish or whitish flesh', 'name': 'peach'}, {'frequency': 'c', 'id': 788, 'synset': 'peanut_butter.n.01', 'synonyms': ['peanut_butter'], 'def': 'a spread made from ground peanuts', 'name': 'peanut_butter'}, {'frequency': 'c', 'id': 789, 'synset': 'pear.n.01', 'synonyms': ['pear'], 'def': 'sweet juicy gritty-textured fruit available in many varieties', 'name': 'pear'}, {'frequency': 'r', 'id': 790, 'synset': 'peeler.n.03', 'synonyms': ['peeler_(tool_for_fruit_and_vegetables)'], 'def': 'a device for peeling vegetables or fruits', 'name': 'peeler_(tool_for_fruit_and_vegetables)'}, {'frequency': 'r', 'id': 791, 'synset': 'pegboard.n.01', 'synonyms': ['pegboard'], 'def': 'a board perforated with regularly spaced holes into which pegs can be fitted', 'name': 'pegboard'}, {'frequency': 'c', 'id': 792, 'synset': 'pelican.n.01', 'synonyms': ['pelican'], 'def': 'large long-winged warm-water seabird having a large bill with a distensible pouch for fish', 'name': 'pelican'}, {'frequency': 'f', 'id': 793, 'synset': 'pen.n.01', 'synonyms': ['pen'], 'def': 'a writing implement with a point from which ink flows', 'name': 'pen'}, {'frequency': 'c', 'id': 794, 'synset': 'pencil.n.01', 'synonyms': ['pencil'], 'def': 'a thin cylindrical pointed writing implement made of wood and graphite', 'name': 'pencil'}, {'frequency': 'r', 'id': 795, 'synset': 'pencil_box.n.01', 'synonyms': ['pencil_box', 'pencil_case'], 'def': 'a box for holding pencils', 'name': 'pencil_box'}, {'frequency': 'r', 'id': 796, 'synset': 'pencil_sharpener.n.01', 'synonyms': ['pencil_sharpener'], 'def': 'a rotary implement for sharpening the point on pencils', 'name': 'pencil_sharpener'}, {'frequency': 'r', 'id': 797, 'synset': 'pendulum.n.01', 'synonyms': ['pendulum'], 'def': 'an apparatus consisting of an object mounted so that it swings freely under the influence of gravity', 'name': 'pendulum'}, {'frequency': 'c', 'id': 798, 'synset': 'penguin.n.01', 'synonyms': ['penguin'], 'def': 'short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers', 'name': 'penguin'}, {'frequency': 'r', 'id': 799, 'synset': 'pennant.n.02', 'synonyms': ['pennant'], 'def': 'a flag longer than it is wide (and often tapering)', 'name': 'pennant'}, {'frequency': 'r', 'id': 800, 'synset': 'penny.n.02', 'synonyms': ['penny_(coin)'], 'def': 'a coin worth one-hundredth of the value of the basic unit', 'name': 'penny_(coin)'}, {'frequency': 'c', 'id': 801, 'synset': 'pepper.n.03', 'synonyms': ['pepper', 'peppercorn'], 'def': 'pungent seasoning from the berry of the common pepper plant; whole or ground', 'name': 'pepper'}, {'frequency': 'c', 'id': 802, 'synset': 'pepper_mill.n.01', 'synonyms': ['pepper_mill', 'pepper_grinder'], 'def': 'a mill for grinding pepper', 'name': 'pepper_mill'}, {'frequency': 'c', 'id': 803, 'synset': 'perfume.n.02', 'synonyms': ['perfume'], 'def': 'a toiletry that emits and diffuses a fragrant odor', 
'name': 'perfume'}, {'frequency': 'r', 'id': 804, 'synset': 'persimmon.n.02', 'synonyms': ['persimmon'], 'def': 'orange fruit resembling a plum; edible when fully ripe', 'name': 'persimmon'}, {'frequency': 'f', 'id': 805, 'synset': 'person.n.01', 'synonyms': ['baby', 'child', 'boy', 'girl', 'man', 'woman', 'person', 'human'], 'def': 'a human being', 'name': 'baby'}, {'frequency': 'r', 'id': 806, 'synset': 'pet.n.01', 'synonyms': ['pet'], 'def': 'a domesticated animal kept for companionship or amusement', 'name': 'pet'}, {'frequency': 'r', 'id': 807, 'synset': 'petfood.n.01', 'synonyms': ['petfood', 'pet-food'], 'def': 'food prepared for animal pets', 'name': 'petfood'}, {'frequency': 'r', 'id': 808, 'synset': 'pew.n.01', 'synonyms': ['pew_(church_bench)', 'church_bench'], 'def': 'long bench with backs; used in church by the congregation', 'name': 'pew_(church_bench)'}, {'frequency': 'r', 'id': 809, 'synset': 'phonebook.n.01', 'synonyms': ['phonebook', 'telephone_book', 'telephone_directory'], 'def': 'a directory containing an alphabetical list of telephone subscribers and their telephone numbers', 'name': 'phonebook'}, {'frequency': 'c', 'id': 810, 'synset': 'phonograph_record.n.01', 'synonyms': ['phonograph_record', 'phonograph_recording', 'record_(phonograph_recording)'], 'def': 'sound recording consisting of a typically black disk with a continuous groove', 'name': 'phonograph_record'}, {'frequency': 'c', 'id': 811, 'synset': 'piano.n.01', 'synonyms': ['piano'], 'def': 'a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds', 'name': 'piano'}, {'frequency': 'f', 'id': 812, 'synset': 'pickle.n.01', 'synonyms': ['pickle'], 'def': 'vegetables (especially cucumbers) preserved in brine or vinegar', 'name': 'pickle'}, {'frequency': 'f', 'id': 813, 'synset': 'pickup.n.01', 'synonyms': ['pickup_truck'], 'def': 'a light truck with an open body and low sides and a tailboard', 'name': 'pickup_truck'}, {'frequency': 'c', 'id': 814, 'synset': 'pie.n.01', 'synonyms': ['pie'], 'def': 'dish baked in pastry-lined pan often with a pastry top', 'name': 'pie'}, {'frequency': 'c', 'id': 815, 'synset': 'pigeon.n.01', 'synonyms': ['pigeon'], 'def': 'wild and domesticated birds having a heavy body and short legs', 'name': 'pigeon'}, {'frequency': 'r', 'id': 816, 'synset': 'piggy_bank.n.01', 'synonyms': ['piggy_bank', 'penny_bank'], 'def': "a child's coin bank (often shaped like a pig)", 'name': 'piggy_bank'}, {'frequency': 'f', 'id': 817, 'synset': 'pillow.n.01', 'synonyms': ['pillow'], 'def': 'a cushion to support the head of a sleeping person', 'name': 'pillow'}, {'frequency': 'r', 'id': 818, 'synset': 'pin.n.09', 'synonyms': ['pin_(non_jewelry)'], 'def': 'a small slender (often pointed) piece of wood or metal used to support or fasten or attach things', 'name': 'pin_(non_jewelry)'}, {'frequency': 'f', 'id': 819, 'synset': 'pineapple.n.02', 'synonyms': ['pineapple'], 'def': 'large sweet fleshy tropical fruit with a tuft of stiff leaves', 'name': 'pineapple'}, {'frequency': 'c', 'id': 820, 'synset': 'pinecone.n.01', 'synonyms': ['pinecone'], 'def': 'the seed-producing cone of a pine tree', 'name': 'pinecone'}, {'frequency': 'r', 'id': 821, 'synset': 'ping-pong_ball.n.01', 'synonyms': ['ping-pong_ball'], 'def': 'light hollow ball used in playing table tennis', 'name': 'ping-pong_ball'}, {'frequency': 'r', 'id': 822, 'synset': 'pinwheel.n.03', 'synonyms': ['pinwheel'], 'def': 'a toy consisting of vanes of colored paper or plastic that is pinned to a 
stick and spins when it is pointed into the wind', 'name': 'pinwheel'}, {'frequency': 'r', 'id': 823, 'synset': 'pipe.n.01', 'synonyms': ['tobacco_pipe'], 'def': 'a tube with a small bowl at one end; used for smoking tobacco', 'name': 'tobacco_pipe'}, {'frequency': 'f', 'id': 824, 'synset': 'pipe.n.02', 'synonyms': ['pipe', 'piping'], 'def': 'a long tube made of metal or plastic that is used to carry water or oil or gas etc.', 'name': 'pipe'}, {'frequency': 'r', 'id': 825, 'synset': 'pistol.n.01', 'synonyms': ['pistol', 'handgun'], 'def': 'a firearm that is held and fired with one hand', 'name': 'pistol'}, {'frequency': 'r', 'id': 826, 'synset': 'pita.n.01', 'synonyms': ['pita_(bread)', 'pocket_bread'], 'def': 'usually small round bread that can open into a pocket for filling', 'name': 'pita_(bread)'}, {'frequency': 'f', 'id': 827, 'synset': 'pitcher.n.02', 'synonyms': ['pitcher_(vessel_for_liquid)', 'ewer'], 'def': 'an open vessel with a handle and a spout for pouring', 'name': 'pitcher_(vessel_for_liquid)'}, {'frequency': 'r', 'id': 828, 'synset': 'pitchfork.n.01', 'synonyms': ['pitchfork'], 'def': 'a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay', 'name': 'pitchfork'}, {'frequency': 'f', 'id': 829, 'synset': 'pizza.n.01', 'synonyms': ['pizza'], 'def': 'Italian open pie made of thin bread dough spread with a spiced mixture of e.g. tomato sauce and cheese', 'name': 'pizza'}, {'frequency': 'f', 'id': 830, 'synset': 'place_mat.n.01', 'synonyms': ['place_mat'], 'def': 'a mat placed on a table for an individual place setting', 'name': 'place_mat'}, {'frequency': 'f', 'id': 831, 'synset': 'plate.n.04', 'synonyms': ['plate'], 'def': 'dish on which food is served or from which food is eaten', 'name': 'plate'}, {'frequency': 'c', 'id': 832, 'synset': 'platter.n.01', 'synonyms': ['platter'], 'def': 'a large shallow dish used for serving food', 'name': 'platter'}, {'frequency': 'r', 'id': 833, 'synset': 'playing_card.n.01', 'synonyms': ['playing_card'], 'def': 'one of a pack of cards that are used to play card games', 'name': 'playing_card'}, {'frequency': 'r', 'id': 834, 'synset': 'playpen.n.01', 'synonyms': ['playpen'], 'def': 'a portable enclosure in which babies may be left to play', 'name': 'playpen'}, {'frequency': 'c', 'id': 835, 'synset': 'pliers.n.01', 'synonyms': ['pliers', 'plyers'], 'def': 'a gripping hand tool with two hinged arms and (usually) serrated jaws', 'name': 'pliers'}, {'frequency': 'r', 'id': 836, 'synset': 'plow.n.01', 'synonyms': ['plow_(farm_equipment)', 'plough_(farm_equipment)'], 'def': 'a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing', 'name': 'plow_(farm_equipment)'}, {'frequency': 'r', 'id': 837, 'synset': 'pocket_watch.n.01', 'synonyms': ['pocket_watch'], 'def': 'a watch that is carried in a small watch pocket', 'name': 'pocket_watch'}, {'frequency': 'c', 'id': 838, 'synset': 'pocketknife.n.01', 'synonyms': ['pocketknife'], 'def': 'a knife with a blade that folds into the handle; suitable for carrying in the pocket', 'name': 'pocketknife'}, {'frequency': 'c', 'id': 839, 'synset': 'poker.n.01', 'synonyms': ['poker_(fire_stirring_tool)', 'stove_poker', 'fire_hook'], 'def': 'fire iron consisting of a metal rod with a handle; used to stir a fire', 'name': 'poker_(fire_stirring_tool)'}, {'frequency': 'f', 'id': 840, 'synset': 'pole.n.01', 'synonyms': ['pole', 'post'], 'def': 'a long (usually round) rod of wood or metal or plastic', 'name': 'pole'}, {'frequency': 'r', 'id': 841, 'synset': 
'police_van.n.01', 'synonyms': ['police_van', 'police_wagon', 'paddy_wagon', 'patrol_wagon'], 'def': 'van used by police to transport prisoners', 'name': 'police_van'}, {'frequency': 'f', 'id': 842, 'synset': 'polo_shirt.n.01', 'synonyms': ['polo_shirt', 'sport_shirt'], 'def': 'a shirt with short sleeves designed for comfort and casual wear', 'name': 'polo_shirt'}, {'frequency': 'r', 'id': 843, 'synset': 'poncho.n.01', 'synonyms': ['poncho'], 'def': 'a blanket-like cloak with a hole in the center for the head', 'name': 'poncho'}, {'frequency': 'c', 'id': 844, 'synset': 'pony.n.05', 'synonyms': ['pony'], 'def': 'any of various breeds of small gentle horses usually less than five feet high at the shoulder', 'name': 'pony'}, {'frequency': 'r', 'id': 845, 'synset': 'pool_table.n.01', 'synonyms': ['pool_table', 'billiard_table', 'snooker_table'], 'def': 'game equipment consisting of a heavy table on which pool is played', 'name': 'pool_table'}, {'frequency': 'f', 'id': 846, 'synset': 'pop.n.02', 'synonyms': ['pop_(soda)', 'soda_(pop)', 'tonic', 'soft_drink'], 'def': 'a sweet drink containing carbonated water and flavoring', 'name': 'pop_(soda)'}, {'frequency': 'r', 'id': 847, 'synset': 'portrait.n.02', 'synonyms': ['portrait', 'portrayal'], 'def': 'any likeness of a person, in any medium', 'name': 'portrait'}, {'frequency': 'c', 'id': 848, 'synset': 'postbox.n.01', 'synonyms': ['postbox_(public)', 'mailbox_(public)'], 'def': 'public box for deposit of mail', 'name': 'postbox_(public)'}, {'frequency': 'c', 'id': 849, 'synset': 'postcard.n.01', 'synonyms': ['postcard', 'postal_card', 'mailing-card'], 'def': 'a card for sending messages by post without an envelope', 'name': 'postcard'}, {'frequency': 'f', 'id': 850, 'synset': 'poster.n.01', 'synonyms': ['poster', 'placard'], 'def': 'a sign posted in a public place as an advertisement', 'name': 'poster'}, {'frequency': 'f', 'id': 851, 'synset': 'pot.n.01', 'synonyms': ['pot'], 'def': 'metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid', 'name': 'pot'}, {'frequency': 'f', 'id': 852, 'synset': 'pot.n.04', 'synonyms': ['flowerpot'], 'def': 'a container in which plants are cultivated', 'name': 'flowerpot'}, {'frequency': 'f', 'id': 853, 'synset': 'potato.n.01', 'synonyms': ['potato'], 'def': 'an edible tuber native to South America', 'name': 'potato'}, {'frequency': 'c', 'id': 854, 'synset': 'potholder.n.01', 'synonyms': ['potholder'], 'def': 'an insulated pad for holding hot pots', 'name': 'potholder'}, {'frequency': 'c', 'id': 855, 'synset': 'pottery.n.01', 'synonyms': ['pottery', 'clayware'], 'def': 'ceramic ware made from clay and baked in a kiln', 'name': 'pottery'}, {'frequency': 'c', 'id': 856, 'synset': 'pouch.n.01', 'synonyms': ['pouch'], 'def': 'a small or medium size container for holding or carrying things', 'name': 'pouch'}, {'frequency': 'r', 'id': 857, 'synset': 'power_shovel.n.01', 'synonyms': ['power_shovel', 'excavator', 'digger'], 'def': 'a machine for excavating', 'name': 'power_shovel'}, {'frequency': 'c', 'id': 858, 'synset': 'prawn.n.01', 'synonyms': ['prawn', 'shrimp'], 'def': 'any of various edible decapod crustaceans', 'name': 'prawn'}, {'frequency': 'f', 'id': 859, 'synset': 'printer.n.03', 'synonyms': ['printer', 'printing_machine'], 'def': 'a machine that prints', 'name': 'printer'}, {'frequency': 'c', 'id': 860, 'synset': 'projectile.n.01', 'synonyms': ['projectile_(weapon)', 'missile'], 'def': 'a weapon that is forcibly thrown or projected at a target', 'name': 
'projectile_(weapon)'}, {'frequency': 'c', 'id': 861, 'synset': 'projector.n.02', 'synonyms': ['projector'], 'def': 'an optical instrument that projects an enlarged image onto a screen', 'name': 'projector'}, {'frequency': 'f', 'id': 862, 'synset': 'propeller.n.01', 'synonyms': ['propeller', 'propellor'], 'def': 'a mechanical device that rotates to push against air or water', 'name': 'propeller'}, {'frequency': 'r', 'id': 863, 'synset': 'prune.n.01', 'synonyms': ['prune'], 'def': 'dried plum', 'name': 'prune'}, {'frequency': 'r', 'id': 864, 'synset': 'pudding.n.01', 'synonyms': ['pudding'], 'def': 'any of various soft thick unsweetened baked dishes', 'name': 'pudding'}, {'frequency': 'r', 'id': 865, 'synset': 'puffer.n.02', 'synonyms': ['puffer_(fish)', 'pufferfish', 'blowfish', 'globefish'], 'def': 'fishes whose elongated spiny body can inflate itself with water or air to form a globe', 'name': 'puffer_(fish)'}, {'frequency': 'r', 'id': 866, 'synset': 'puffin.n.01', 'synonyms': ['puffin'], 'def': 'seabirds having short necks and brightly colored compressed bills', 'name': 'puffin'}, {'frequency': 'r', 'id': 867, 'synset': 'pug.n.01', 'synonyms': ['pug-dog'], 'def': 'small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle', 'name': 'pug-dog'}, {'frequency': 'c', 'id': 868, 'synset': 'pumpkin.n.02', 'synonyms': ['pumpkin'], 'def': 'usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn', 'name': 'pumpkin'}, {'frequency': 'r', 'id': 869, 'synset': 'punch.n.03', 'synonyms': ['puncher'], 'def': 'a tool for making holes or indentations', 'name': 'puncher'}, {'frequency': 'r', 'id': 870, 'synset': 'puppet.n.01', 'synonyms': ['puppet', 'marionette'], 'def': 'a small figure of a person operated from above with strings by a puppeteer', 'name': 'puppet'}, {'frequency': 'r', 'id': 871, 'synset': 'puppy.n.01', 'synonyms': ['puppy'], 'def': 'a young dog', 'name': 'puppy'}, {'frequency': 'r', 'id': 872, 'synset': 'quesadilla.n.01', 'synonyms': ['quesadilla'], 'def': 'a tortilla that is filled with cheese and heated', 'name': 'quesadilla'}, {'frequency': 'r', 'id': 873, 'synset': 'quiche.n.02', 'synonyms': ['quiche'], 'def': 'a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)', 'name': 'quiche'}, {'frequency': 'f', 'id': 874, 'synset': 'quilt.n.01', 'synonyms': ['quilt', 'comforter'], 'def': 'bedding made of two layers of cloth filled with stuffing and stitched together', 'name': 'quilt'}, {'frequency': 'c', 'id': 875, 'synset': 'rabbit.n.01', 'synonyms': ['rabbit'], 'def': 'any of various burrowing animals of the family Leporidae having long ears and short tails', 'name': 'rabbit'}, {'frequency': 'r', 'id': 876, 'synset': 'racer.n.02', 'synonyms': ['race_car', 'racing_car'], 'def': 'a fast car that competes in races', 'name': 'race_car'}, {'frequency': 'c', 'id': 877, 'synset': 'racket.n.04', 'synonyms': ['racket', 'racquet'], 'def': 'a sports implement used to strike a ball in various games', 'name': 'racket'}, {'frequency': 'r', 'id': 878, 'synset': 'radar.n.01', 'synonyms': ['radar'], 'def': 'measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects', 'name': 'radar'}, {'frequency': 'c', 'id': 879, 'synset': 'radiator.n.03', 'synonyms': ['radiator'], 'def': 'a mechanism consisting of a metal honeycomb through which hot fluids circulate', 'name': 'radiator'}, 
{'frequency': 'c', 'id': 880, 'synset': 'radio_receiver.n.01', 'synonyms': ['radio_receiver', 'radio_set', 'radio', 'tuner_(radio)'], 'def': 'an electronic receiver that detects and demodulates and amplifies transmitted radio signals', 'name': 'radio_receiver'}, {'frequency': 'c', 'id': 881, 'synset': 'radish.n.03', 'synonyms': ['radish', 'daikon'], 'def': 'pungent edible root of any of various cultivated radish plants', 'name': 'radish'}, {'frequency': 'c', 'id': 882, 'synset': 'raft.n.01', 'synonyms': ['raft'], 'def': 'a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers', 'name': 'raft'}, {'frequency': 'r', 'id': 883, 'synset': 'rag_doll.n.01', 'synonyms': ['rag_doll'], 'def': 'a cloth doll that is stuffed and (usually) painted', 'name': 'rag_doll'}, {'frequency': 'c', 'id': 884, 'synset': 'raincoat.n.01', 'synonyms': ['raincoat', 'waterproof_jacket'], 'def': 'a water-resistant coat', 'name': 'raincoat'}, {'frequency': 'c', 'id': 885, 'synset': 'ram.n.05', 'synonyms': ['ram_(animal)'], 'def': 'uncastrated adult male sheep', 'name': 'ram_(animal)'}, {'frequency': 'c', 'id': 886, 'synset': 'raspberry.n.02', 'synonyms': ['raspberry'], 'def': 'red or black edible aggregate berries usually smaller than the related blackberries', 'name': 'raspberry'}, {'frequency': 'r', 'id': 887, 'synset': 'rat.n.01', 'synonyms': ['rat'], 'def': 'any of various long-tailed rodents similar to but larger than a mouse', 'name': 'rat'}, {'frequency': 'c', 'id': 888, 'synset': 'razorblade.n.01', 'synonyms': ['razorblade'], 'def': 'a blade that has a very sharp edge', 'name': 'razorblade'}, {'frequency': 'c', 'id': 889, 'synset': 'reamer.n.01', 'synonyms': ['reamer_(juicer)', 'juicer', 'juice_reamer'], 'def': 'a squeezer with a conical ridged center that is used for squeezing juice from citrus fruit', 'name': 'reamer_(juicer)'}, {'frequency': 'f', 'id': 890, 'synset': 'rearview_mirror.n.01', 'synonyms': ['rearview_mirror'], 'def': 'car mirror that reflects the view out of the rear window', 'name': 'rearview_mirror'}, {'frequency': 'c', 'id': 891, 'synset': 'receipt.n.02', 'synonyms': ['receipt'], 'def': 'an acknowledgment (usually tangible) that payment has been made', 'name': 'receipt'}, {'frequency': 'c', 'id': 892, 'synset': 'recliner.n.01', 'synonyms': ['recliner', 'reclining_chair', 'lounger_(chair)'], 'def': 'an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it', 'name': 'recliner'}, {'frequency': 'r', 'id': 893, 'synset': 'record_player.n.01', 'synonyms': ['record_player', 'phonograph_(record_player)', 'turntable'], 'def': 'machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically', 'name': 'record_player'}, {'frequency': 'r', 'id': 894, 'synset': 'red_cabbage.n.02', 'synonyms': ['red_cabbage'], 'def': 'compact head of purplish-red leaves', 'name': 'red_cabbage'}, {'frequency': 'f', 'id': 895, 'synset': 'reflector.n.01', 'synonyms': ['reflector'], 'def': 'device that reflects light, radiation, etc.', 'name': 'reflector'}, {'frequency': 'f', 'id': 896, 'synset': 'remote_control.n.01', 'synonyms': ['remote_control'], 'def': 'a device that can be used to control a machine or apparatus from a distance', 'name': 'remote_control'}, {'frequency': 'c', 'id': 897, 'synset': 'rhinoceros.n.01', 'synonyms': ['rhinoceros'], 'def': 'massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the 
snout', 'name': 'rhinoceros'}, {'frequency': 'r', 'id': 898, 'synset': 'rib.n.03', 'synonyms': ['rib_(food)'], 'def': 'cut of meat including one or more ribs', 'name': 'rib_(food)'}, {'frequency': 'r', 'id': 899, 'synset': 'rifle.n.01', 'synonyms': ['rifle'], 'def': 'a shoulder firearm with a long barrel', 'name': 'rifle'}, {'frequency': 'f', 'id': 900, 'synset': 'ring.n.08', 'synonyms': ['ring'], 'def': 'jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger', 'name': 'ring'}, {'frequency': 'r', 'id': 901, 'synset': 'river_boat.n.01', 'synonyms': ['river_boat'], 'def': 'a boat used on rivers or to ply a river', 'name': 'river_boat'}, {'frequency': 'r', 'id': 902, 'synset': 'road_map.n.02', 'synonyms': ['road_map'], 'def': '(NOT A ROAD) a MAP showing roads (for automobile travel)', 'name': 'road_map'}, {'frequency': 'c', 'id': 903, 'synset': 'robe.n.01', 'synonyms': ['robe'], 'def': 'any loose flowing garment', 'name': 'robe'}, {'frequency': 'c', 'id': 904, 'synset': 'rocking_chair.n.01', 'synonyms': ['rocking_chair'], 'def': 'a chair mounted on rockers', 'name': 'rocking_chair'}, {'frequency': 'r', 'id': 905, 'synset': 'roller_skate.n.01', 'synonyms': ['roller_skate'], 'def': 'a shoe with pairs of rollers (small hard wheels) fixed to the sole', 'name': 'roller_skate'}, {'frequency': 'r', 'id': 906, 'synset': 'rollerblade.n.01', 'synonyms': ['Rollerblade'], 'def': 'an in-line variant of a roller skate', 'name': 'Rollerblade'}, {'frequency': 'c', 'id': 907, 'synset': 'rolling_pin.n.01', 'synonyms': ['rolling_pin'], 'def': 'utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough', 'name': 'rolling_pin'}, {'frequency': 'r', 'id': 908, 'synset': 'root_beer.n.01', 'synonyms': ['root_beer'], 'def': 'carbonated drink containing extracts of roots and herbs', 'name': 'root_beer'}, {'frequency': 'c', 'id': 909, 'synset': 'router.n.02', 'synonyms': ['router_(computer_equipment)'], 'def': 'a device that forwards data packets between computer networks', 'name': 'router_(computer_equipment)'}, {'frequency': 'f', 'id': 910, 'synset': 'rubber_band.n.01', 'synonyms': ['rubber_band', 'elastic_band'], 'def': 'a narrow band of elastic rubber used to hold things (such as papers) together', 'name': 'rubber_band'}, {'frequency': 'c', 'id': 911, 'synset': 'runner.n.08', 'synonyms': ['runner_(carpet)'], 'def': 'a long narrow carpet', 'name': 'runner_(carpet)'}, {'frequency': 'f', 'id': 912, 'synset': 'sack.n.01', 'synonyms': ['plastic_bag', 'paper_bag'], 'def': "a bag made of paper or plastic for holding customer's purchases", 'name': 'plastic_bag'}, {'frequency': 'f', 'id': 913, 'synset': 'saddle.n.01', 'synonyms': ['saddle_(on_an_animal)'], 'def': 'a seat for the rider of a horse or camel', 'name': 'saddle_(on_an_animal)'}, {'frequency': 'f', 'id': 914, 'synset': 'saddle_blanket.n.01', 'synonyms': ['saddle_blanket', 'saddlecloth', 'horse_blanket'], 'def': 'stable gear consisting of a blanket placed under the saddle', 'name': 'saddle_blanket'}, {'frequency': 'c', 'id': 915, 'synset': 'saddlebag.n.01', 'synonyms': ['saddlebag'], 'def': 'a large bag (or pair of bags) hung over a saddle', 'name': 'saddlebag'}, {'frequency': 'r', 'id': 916, 'synset': 'safety_pin.n.01', 'synonyms': ['safety_pin'], 'def': 'a pin in the form of a clasp; has a guard so the point of the pin will not stick the user', 'name': 'safety_pin'}, {'frequency': 'c', 'id': 917, 'synset': 'sail.n.01', 'synonyms': ['sail'], 'def': 'a large piece of fabric by means of 
which wind is used to propel a sailing vessel', 'name': 'sail'}, {'frequency': 'c', 'id': 918, 'synset': 'salad.n.01', 'synonyms': ['salad'], 'def': 'food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens', 'name': 'salad'}, {'frequency': 'r', 'id': 919, 'synset': 'salad_plate.n.01', 'synonyms': ['salad_plate', 'salad_bowl'], 'def': 'a plate or bowl for individual servings of salad', 'name': 'salad_plate'}, {'frequency': 'r', 'id': 920, 'synset': 'salami.n.01', 'synonyms': ['salami'], 'def': 'highly seasoned fatty sausage of pork and beef usually dried', 'name': 'salami'}, {'frequency': 'r', 'id': 921, 'synset': 'salmon.n.01', 'synonyms': ['salmon_(fish)'], 'def': 'any of various large food and game fishes of northern waters', 'name': 'salmon_(fish)'}, {'frequency': 'r', 'id': 922, 'synset': 'salmon.n.03', 'synonyms': ['salmon_(food)'], 'def': 'flesh of any of various marine or freshwater fish of the family Salmonidae', 'name': 'salmon_(food)'}, {'frequency': 'r', 'id': 923, 'synset': 'salsa.n.01', 'synonyms': ['salsa'], 'def': 'spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods', 'name': 'salsa'}, {'frequency': 'f', 'id': 924, 'synset': 'saltshaker.n.01', 'synonyms': ['saltshaker'], 'def': 'a shaker with a perforated top for sprinkling salt', 'name': 'saltshaker'}, {'frequency': 'f', 'id': 925, 'synset': 'sandal.n.01', 'synonyms': ['sandal_(type_of_shoe)'], 'def': 'a shoe consisting of a sole fastened by straps to the foot', 'name': 'sandal_(type_of_shoe)'}, {'frequency': 'f', 'id': 926, 'synset': 'sandwich.n.01', 'synonyms': ['sandwich'], 'def': 'two (or more) slices of bread with a filling between them', 'name': 'sandwich'}, {'frequency': 'r', 'id': 927, 'synset': 'satchel.n.01', 'synonyms': ['satchel'], 'def': 'luggage consisting of a small case with a flat bottom and (usually) a shoulder strap', 'name': 'satchel'}, {'frequency': 'r', 'id': 928, 'synset': 'saucepan.n.01', 'synonyms': ['saucepan'], 'def': 'a deep pan with a handle; used for stewing or boiling', 'name': 'saucepan'}, {'frequency': 'f', 'id': 929, 'synset': 'saucer.n.02', 'synonyms': ['saucer'], 'def': 'a small shallow dish for holding a cup at the table', 'name': 'saucer'}, {'frequency': 'f', 'id': 930, 'synset': 'sausage.n.01', 'synonyms': ['sausage'], 'def': 'highly seasoned minced meat stuffed in casings', 'name': 'sausage'}, {'frequency': 'r', 'id': 931, 'synset': 'sawhorse.n.01', 'synonyms': ['sawhorse', 'sawbuck'], 'def': 'a framework for holding wood that is being sawed', 'name': 'sawhorse'}, {'frequency': 'r', 'id': 932, 'synset': 'sax.n.02', 'synonyms': ['saxophone'], 'def': "a wind instrument with a `J'-shaped form typically made of brass", 'name': 'saxophone'}, {'frequency': 'f', 'id': 933, 'synset': 'scale.n.07', 'synonyms': ['scale_(measuring_instrument)'], 'def': 'a measuring instrument for weighing; shows amount of mass', 'name': 'scale_(measuring_instrument)'}, {'frequency': 'r', 'id': 934, 'synset': 'scarecrow.n.01', 'synonyms': ['scarecrow', 'strawman'], 'def': 'an effigy in the shape of a man to frighten birds away from seeds', 'name': 'scarecrow'}, {'frequency': 'f', 'id': 935, 'synset': 'scarf.n.01', 'synonyms': ['scarf'], 'def': 'a garment worn around the head or neck or shoulders for warmth or decoration', 'name': 'scarf'}, {'frequency': 'c', 'id': 936, 'synset': 'school_bus.n.01', 'synonyms': ['school_bus'], 'def': 'a bus used to transport children to or from school', 'name': 'school_bus'}, 
{'frequency': 'f', 'id': 937, 'synset': 'scissors.n.01', 'synonyms': ['scissors'], 'def': 'a tool having two crossed pivoting blades with looped handles', 'name': 'scissors'}, {'frequency': 'c', 'id': 938, 'synset': 'scoreboard.n.01', 'synonyms': ['scoreboard'], 'def': 'a large board for displaying the score of a contest (and some other information)', 'name': 'scoreboard'}, {'frequency': 'c', 'id': 939, 'synset': 'scrambled_eggs.n.01', 'synonyms': ['scrambled_eggs'], 'def': 'eggs beaten and cooked to a soft firm consistency while stirring', 'name': 'scrambled_eggs'}, {'frequency': 'r', 'id': 940, 'synset': 'scraper.n.01', 'synonyms': ['scraper'], 'def': 'any of various hand tools for scraping', 'name': 'scraper'}, {'frequency': 'r', 'id': 941, 'synset': 'scratcher.n.03', 'synonyms': ['scratcher'], 'def': 'a device used for scratching', 'name': 'scratcher'}, {'frequency': 'c', 'id': 942, 'synset': 'screwdriver.n.01', 'synonyms': ['screwdriver'], 'def': 'a hand tool for driving screws; has a tip that fits into the head of a screw', 'name': 'screwdriver'}, {'frequency': 'c', 'id': 943, 'synset': 'scrub_brush.n.01', 'synonyms': ['scrubbing_brush'], 'def': 'a brush with short stiff bristles for heavy cleaning', 'name': 'scrubbing_brush'}, {'frequency': 'c', 'id': 944, 'synset': 'sculpture.n.01', 'synonyms': ['sculpture'], 'def': 'a three-dimensional work of art', 'name': 'sculpture'}, {'frequency': 'r', 'id': 945, 'synset': 'seabird.n.01', 'synonyms': ['seabird', 'seafowl'], 'def': 'a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.', 'name': 'seabird'}, {'frequency': 'r', 'id': 946, 'synset': 'seahorse.n.02', 'synonyms': ['seahorse'], 'def': 'small fish with horse-like heads bent sharply downward and curled tails', 'name': 'seahorse'}, {'frequency': 'r', 'id': 947, 'synset': 'seaplane.n.01', 'synonyms': ['seaplane', 'hydroplane'], 'def': 'an airplane that can land on or take off from water', 'name': 'seaplane'}, {'frequency': 'c', 'id': 948, 'synset': 'seashell.n.01', 'synonyms': ['seashell'], 'def': 'the shell of a marine organism', 'name': 'seashell'}, {'frequency': 'r', 'id': 949, 'synset': 'seedling.n.01', 'synonyms': ['seedling'], 'def': 'young plant or tree grown from a seed', 'name': 'seedling'}, {'frequency': 'c', 'id': 950, 'synset': 'serving_dish.n.01', 'synonyms': ['serving_dish'], 'def': 'a dish used for serving food', 'name': 'serving_dish'}, {'frequency': 'r', 'id': 951, 'synset': 'sewing_machine.n.01', 'synonyms': ['sewing_machine'], 'def': 'a textile machine used as a home appliance for sewing', 'name': 'sewing_machine'}, {'frequency': 'r', 'id': 952, 'synset': 'shaker.n.03', 'synonyms': ['shaker'], 'def': 'a container in which something can be shaken', 'name': 'shaker'}, {'frequency': 'c', 'id': 953, 'synset': 'shampoo.n.01', 'synonyms': ['shampoo'], 'def': 'cleansing agent consisting of soaps or detergents used for washing the hair', 'name': 'shampoo'}, {'frequency': 'r', 'id': 954, 'synset': 'shark.n.01', 'synonyms': ['shark'], 'def': 'typically large carnivorous fishes with sharp teeth', 'name': 'shark'}, {'frequency': 'r', 'id': 955, 'synset': 'sharpener.n.01', 'synonyms': ['sharpener'], 'def': 'any implement that is used to make something (an edge or a point) sharper', 'name': 'sharpener'}, {'frequency': 'r', 'id': 956, 'synset': 'sharpie.n.03', 'synonyms': ['Sharpie'], 'def': 'a pen with indelible ink that will write on any surface', 'name': 'Sharpie'}, {'frequency': 'r', 'id': 957, 'synset': 
'shaver.n.03', 'synonyms': ['shaver_(electric)', 'electric_shaver', 'electric_razor'], 'def': 'a razor powered by an electric motor', 'name': 'shaver_(electric)'}, {'frequency': 'c', 'id': 958, 'synset': 'shaving_cream.n.01', 'synonyms': ['shaving_cream', 'shaving_soap'], 'def': 'toiletry that forms a rich lather for softening the beard before shaving', 'name': 'shaving_cream'}, {'frequency': 'r', 'id': 959, 'synset': 'shawl.n.01', 'synonyms': ['shawl'], 'def': 'cloak consisting of an oblong piece of cloth used to cover the head and shoulders', 'name': 'shawl'}, {'frequency': 'r', 'id': 960, 'synset': 'shears.n.01', 'synonyms': ['shears'], 'def': 'large scissors with strong blades', 'name': 'shears'}, {'frequency': 'f', 'id': 961, 'synset': 'sheep.n.01', 'synonyms': ['sheep'], 'def': 'woolly usually horned ruminant mammal related to the goat', 'name': 'sheep'}, {'frequency': 'r', 'id': 962, 'synset': 'shepherd_dog.n.01', 'synonyms': ['shepherd_dog', 'sheepdog'], 'def': 'any of various usually long-haired breeds of dog reared to herd and guard sheep', 'name': 'shepherd_dog'}, {'frequency': 'r', 'id': 963, 'synset': 'sherbert.n.01', 'synonyms': ['sherbert', 'sherbet'], 'def': 'a frozen dessert made primarily of fruit juice and sugar', 'name': 'sherbert'}, {'frequency': 'r', 'id': 964, 'synset': 'shield.n.02', 'synonyms': ['shield'], 'def': 'armor carried on the arm to intercept blows', 'name': 'shield'}, {'frequency': 'f', 'id': 965, 'synset': 'shirt.n.01', 'synonyms': ['shirt'], 'def': 'a garment worn on the upper half of the body', 'name': 'shirt'}, {'frequency': 'f', 'id': 966, 'synset': 'shoe.n.01', 'synonyms': ['shoe', 'sneaker_(type_of_shoe)', 'tennis_shoe'], 'def': 'common footwear covering the foot', 'name': 'shoe'}, {'frequency': 'c', 'id': 967, 'synset': 'shopping_bag.n.01', 'synonyms': ['shopping_bag'], 'def': 'a bag made of plastic or strong paper (often with handles); used to transport goods after shopping', 'name': 'shopping_bag'}, {'frequency': 'c', 'id': 968, 'synset': 'shopping_cart.n.01', 'synonyms': ['shopping_cart'], 'def': 'a handcart that holds groceries or other goods while shopping', 'name': 'shopping_cart'}, {'frequency': 'f', 'id': 969, 'synset': 'short_pants.n.01', 'synonyms': ['short_pants', 'shorts_(clothing)', 'trunks_(clothing)'], 'def': 'trousers that end at or above the knee', 'name': 'short_pants'}, {'frequency': 'r', 'id': 970, 'synset': 'shot_glass.n.01', 'synonyms': ['shot_glass'], 'def': 'a small glass adequate to hold a single swallow of whiskey', 'name': 'shot_glass'}, {'frequency': 'c', 'id': 971, 'synset': 'shoulder_bag.n.01', 'synonyms': ['shoulder_bag'], 'def': 'a large handbag that can be carried by a strap looped over the shoulder', 'name': 'shoulder_bag'}, {'frequency': 'c', 'id': 972, 'synset': 'shovel.n.01', 'synonyms': ['shovel'], 'def': 'a hand tool for lifting loose material such as snow, dirt, etc.', 'name': 'shovel'}, {'frequency': 'f', 'id': 973, 'synset': 'shower.n.01', 'synonyms': ['shower_head'], 'def': 'a plumbing fixture that sprays water over you', 'name': 'shower_head'}, {'frequency': 'f', 'id': 974, 'synset': 'shower_curtain.n.01', 'synonyms': ['shower_curtain'], 'def': 'a curtain that keeps water from splashing out of the shower area', 'name': 'shower_curtain'}, {'frequency': 'r', 'id': 975, 'synset': 'shredder.n.01', 'synonyms': ['shredder_(for_paper)'], 'def': 'a device that shreds documents', 'name': 'shredder_(for_paper)'}, {'frequency': 'r', 'id': 976, 'synset': 'sieve.n.01', 'synonyms': ['sieve', 
'screen_(sieve)'], 'def': 'a strainer for separating lumps from powdered material or grading particles', 'name': 'sieve'}, {'frequency': 'f', 'id': 977, 'synset': 'signboard.n.01', 'synonyms': ['signboard'], 'def': 'structure displaying a board on which advertisements can be posted', 'name': 'signboard'}, {'frequency': 'c', 'id': 978, 'synset': 'silo.n.01', 'synonyms': ['silo'], 'def': 'a cylindrical tower used for storing goods', 'name': 'silo'}, {'frequency': 'f', 'id': 979, 'synset': 'sink.n.01', 'synonyms': ['sink'], 'def': 'plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe', 'name': 'sink'}, {'frequency': 'f', 'id': 980, 'synset': 'skateboard.n.01', 'synonyms': ['skateboard'], 'def': 'a board with wheels that is ridden in a standing or crouching position and propelled by foot', 'name': 'skateboard'}, {'frequency': 'c', 'id': 981, 'synset': 'skewer.n.01', 'synonyms': ['skewer'], 'def': 'a long pin for holding meat in position while it is being roasted', 'name': 'skewer'}, {'frequency': 'f', 'id': 982, 'synset': 'ski.n.01', 'synonyms': ['ski'], 'def': 'sports equipment for skiing on snow', 'name': 'ski'}, {'frequency': 'f', 'id': 983, 'synset': 'ski_boot.n.01', 'synonyms': ['ski_boot'], 'def': 'a stiff boot that is fastened to a ski with a ski binding', 'name': 'ski_boot'}, {'frequency': 'f', 'id': 984, 'synset': 'ski_parka.n.01', 'synonyms': ['ski_parka', 'ski_jacket'], 'def': 'a parka to be worn while skiing', 'name': 'ski_parka'}, {'frequency': 'f', 'id': 985, 'synset': 'ski_pole.n.01', 'synonyms': ['ski_pole'], 'def': 'a pole with metal points used as an aid in skiing', 'name': 'ski_pole'}, {'frequency': 'f', 'id': 986, 'synset': 'skirt.n.02', 'synonyms': ['skirt'], 'def': 'a garment hanging from the waist; worn mainly by girls and women', 'name': 'skirt'}, {'frequency': 'c', 'id': 987, 'synset': 'sled.n.01', 'synonyms': ['sled', 'sledge', 'sleigh'], 'def': 'a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.', 'name': 'sled'}, {'frequency': 'c', 'id': 988, 'synset': 'sleeping_bag.n.01', 'synonyms': ['sleeping_bag'], 'def': 'large padded bag designed to be slept in outdoors', 'name': 'sleeping_bag'}, {'frequency': 'r', 'id': 989, 'synset': 'sling.n.05', 'synonyms': ['sling_(bandage)', 'triangular_bandage'], 'def': 'bandage to support an injured forearm; slung over the shoulder or neck', 'name': 'sling_(bandage)'}, {'frequency': 'c', 'id': 990, 'synset': 'slipper.n.01', 'synonyms': ['slipper_(footwear)', 'carpet_slipper_(footwear)'], 'def': 'low footwear that can be slipped on and off easily; usually worn indoors', 'name': 'slipper_(footwear)'}, {'frequency': 'r', 'id': 991, 'synset': 'smoothie.n.02', 'synonyms': ['smoothie'], 'def': 'a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk', 'name': 'smoothie'}, {'frequency': 'r', 'id': 992, 'synset': 'snake.n.01', 'synonyms': ['snake', 'serpent'], 'def': 'limbless scaly elongate reptile; some are venomous', 'name': 'snake'}, {'frequency': 'f', 'id': 993, 'synset': 'snowboard.n.01', 'synonyms': ['snowboard'], 'def': 'a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes', 'name': 'snowboard'}, {'frequency': 'c', 'id': 994, 'synset': 'snowman.n.01', 'synonyms': ['snowman'], 'def': 'a figure of a person made of packed snow', 'name': 'snowman'}, {'frequency': 'c', 'id': 995, 'synset': 'snowmobile.n.01', 'synonyms': ['snowmobile'], 'def': 'tracked vehicle for 
travel on snow having skis in front', 'name': 'snowmobile'}, {'frequency': 'f', 'id': 996, 'synset': 'soap.n.01', 'synonyms': ['soap'], 'def': 'a cleansing agent made from the salts of vegetable or animal fats', 'name': 'soap'}, {'frequency': 'f', 'id': 997, 'synset': 'soccer_ball.n.01', 'synonyms': ['soccer_ball'], 'def': "an inflated ball used in playing soccer (called `football' outside of the United States)", 'name': 'soccer_ball'}, {'frequency': 'f', 'id': 998, 'synset': 'sock.n.01', 'synonyms': ['sock'], 'def': 'cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee', 'name': 'sock'}, {'frequency': 'r', 'id': 999, 'synset': 'soda_fountain.n.02', 'synonyms': ['soda_fountain'], 'def': 'an apparatus for dispensing soda water', 'name': 'soda_fountain'}, {'frequency': 'r', 'id': 1000, 'synset': 'soda_water.n.01', 'synonyms': ['carbonated_water', 'club_soda', 'seltzer', 'sparkling_water'], 'def': 'effervescent beverage artificially charged with carbon dioxide', 'name': 'carbonated_water'}, {'frequency': 'f', 'id': 1001, 'synset': 'sofa.n.01', 'synonyms': ['sofa', 'couch', 'lounge'], 'def': 'an upholstered seat for more than one person', 'name': 'sofa'}, {'frequency': 'r', 'id': 1002, 'synset': 'softball.n.01', 'synonyms': ['softball'], 'def': 'ball used in playing softball', 'name': 'softball'}, {'frequency': 'c', 'id': 1003, 'synset': 'solar_array.n.01', 'synonyms': ['solar_array', 'solar_battery', 'solar_panel'], 'def': 'electrical device consisting of a large array of connected solar cells', 'name': 'solar_array'}, {'frequency': 'r', 'id': 1004, 'synset': 'sombrero.n.02', 'synonyms': ['sombrero'], 'def': 'a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico', 'name': 'sombrero'}, {'frequency': 'c', 'id': 1005, 'synset': 'soup.n.01', 'synonyms': ['soup'], 'def': 'liquid food especially of meat or fish or vegetable stock often containing pieces of solid food', 'name': 'soup'}, {'frequency': 'r', 'id': 1006, 'synset': 'soup_bowl.n.01', 'synonyms': ['soup_bowl'], 'def': 'a bowl for serving soup', 'name': 'soup_bowl'}, {'frequency': 'c', 'id': 1007, 'synset': 'soupspoon.n.01', 'synonyms': ['soupspoon'], 'def': 'a spoon with a rounded bowl for eating soup', 'name': 'soupspoon'}, {'frequency': 'c', 'id': 1008, 'synset': 'sour_cream.n.01', 'synonyms': ['sour_cream', 'soured_cream'], 'def': 'soured light cream', 'name': 'sour_cream'}, {'frequency': 'r', 'id': 1009, 'synset': 'soya_milk.n.01', 'synonyms': ['soya_milk', 'soybean_milk', 'soymilk'], 'def': 'a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu', 'name': 'soya_milk'}, {'frequency': 'r', 'id': 1010, 'synset': 'space_shuttle.n.01', 'synonyms': ['space_shuttle'], 'def': "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", 'name': 'space_shuttle'}, {'frequency': 'r', 'id': 1011, 'synset': 'sparkler.n.02', 'synonyms': ['sparkler_(fireworks)'], 'def': 'a firework that burns slowly and throws out a shower of sparks', 'name': 'sparkler_(fireworks)'}, {'frequency': 'f', 'id': 1012, 'synset': 'spatula.n.02', 'synonyms': ['spatula'], 'def': 'a hand tool with a thin flexible blade used to mix or spread soft substances', 'name': 'spatula'}, {'frequency': 'r', 'id': 1013, 'synset': 'spear.n.01', 'synonyms': ['spear', 'lance'], 'def': 'a long pointed rod used as a tool or weapon', 'name': 'spear'}, {'frequency': 'f', 'id': 1014, 'synset': 'spectacles.n.01', 'synonyms': ['spectacles', 'specs', 
'eyeglasses', 'glasses'], 'def': 'optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision', 'name': 'spectacles'}, {'frequency': 'c', 'id': 1015, 'synset': 'spice_rack.n.01', 'synonyms': ['spice_rack'], 'def': 'a rack for displaying containers filled with spices', 'name': 'spice_rack'}, {'frequency': 'r', 'id': 1016, 'synset': 'spider.n.01', 'synonyms': ['spider'], 'def': 'predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body', 'name': 'spider'}, {'frequency': 'c', 'id': 1017, 'synset': 'sponge.n.01', 'synonyms': ['sponge'], 'def': 'a porous mass usable to absorb water typically used for cleaning', 'name': 'sponge'}, {'frequency': 'f', 'id': 1018, 'synset': 'spoon.n.01', 'synonyms': ['spoon'], 'def': 'a piece of cutlery with a shallow bowl-shaped container and a handle', 'name': 'spoon'}, {'frequency': 'c', 'id': 1019, 'synset': 'sportswear.n.01', 'synonyms': ['sportswear', 'athletic_wear', 'activewear'], 'def': 'attire worn for sport or for casual wear', 'name': 'sportswear'}, {'frequency': 'c', 'id': 1020, 'synset': 'spotlight.n.02', 'synonyms': ['spotlight'], 'def': 'a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer', 'name': 'spotlight'}, {'frequency': 'r', 'id': 1021, 'synset': 'squirrel.n.01', 'synonyms': ['squirrel'], 'def': 'a kind of arboreal rodent having a long bushy tail', 'name': 'squirrel'}, {'frequency': 'c', 'id': 1022, 'synset': 'stapler.n.01', 'synonyms': ['stapler_(stapling_machine)'], 'def': 'a machine that inserts staples into sheets of paper in order to fasten them together', 'name': 'stapler_(stapling_machine)'}, {'frequency': 'r', 'id': 1023, 'synset': 'starfish.n.01', 'synonyms': ['starfish', 'sea_star'], 'def': 'echinoderms characterized by five arms extending from a central disk', 'name': 'starfish'}, {'frequency': 'f', 'id': 1024, 'synset': 'statue.n.01', 'synonyms': ['statue_(sculpture)'], 'def': 'a sculpture representing a human or animal', 'name': 'statue_(sculpture)'}, {'frequency': 'c', 'id': 1025, 'synset': 'steak.n.01', 'synonyms': ['steak_(food)'], 'def': 'a slice of meat cut from the fleshy part of an animal or large fish', 'name': 'steak_(food)'}, {'frequency': 'r', 'id': 1026, 'synset': 'steak_knife.n.01', 'synonyms': ['steak_knife'], 'def': 'a sharp table knife used in eating steak', 'name': 'steak_knife'}, {'frequency': 'r', 'id': 1027, 'synset': 'steamer.n.02', 'synonyms': ['steamer_(kitchen_appliance)'], 'def': 'a cooking utensil that can be used to cook food by steaming it', 'name': 'steamer_(kitchen_appliance)'}, {'frequency': 'f', 'id': 1028, 'synset': 'steering_wheel.n.01', 'synonyms': ['steering_wheel'], 'def': 'a handwheel that is used for steering', 'name': 'steering_wheel'}, {'frequency': 'r', 'id': 1029, 'synset': 'stencil.n.01', 'synonyms': ['stencil'], 'def': 'a sheet of material (metal, plastic, etc.) 
that has been perforated with a pattern; ink or paint can pass through the perforations to create the printed pattern on the surface below', 'name': 'stencil'}, {'frequency': 'r', 'id': 1030, 'synset': 'step_ladder.n.01', 'synonyms': ['stepladder'], 'def': 'a folding portable ladder hinged at the top', 'name': 'stepladder'}, {'frequency': 'c', 'id': 1031, 'synset': 'step_stool.n.01', 'synonyms': ['step_stool'], 'def': 'a stool that has one or two steps that fold under the seat', 'name': 'step_stool'}, {'frequency': 'c', 'id': 1032, 'synset': 'stereo.n.01', 'synonyms': ['stereo_(sound_system)'], 'def': 'electronic device for playing audio', 'name': 'stereo_(sound_system)'}, {'frequency': 'r', 'id': 1033, 'synset': 'stew.n.02', 'synonyms': ['stew'], 'def': 'food prepared by stewing especially meat or fish with vegetables', 'name': 'stew'}, {'frequency': 'r', 'id': 1034, 'synset': 'stirrer.n.02', 'synonyms': ['stirrer'], 'def': 'an implement used for stirring', 'name': 'stirrer'}, {'frequency': 'f', 'id': 1035, 'synset': 'stirrup.n.01', 'synonyms': ['stirrup'], 'def': "support consisting of metal loops into which rider's feet go", 'name': 'stirrup'}, {'frequency': 'c', 'id': 1036, 'synset': 'stocking.n.01', 'synonyms': ['stockings_(leg_wear)'], 'def': 'close-fitting hosiery to cover the foot and leg; come in matched pairs', 'name': 'stockings_(leg_wear)'}, {'frequency': 'f', 'id': 1037, 'synset': 'stool.n.01', 'synonyms': ['stool'], 'def': 'a simple seat without a back or arms', 'name': 'stool'}, {'frequency': 'f', 'id': 1038, 'synset': 'stop_sign.n.01', 'synonyms': ['stop_sign'], 'def': 'a traffic sign to notify drivers that they must come to a complete stop', 'name': 'stop_sign'}, {'frequency': 'f', 'id': 1039, 'synset': 'stoplight.n.01', 'synonyms': ['brake_light'], 'def': 'a red light on the rear of a motor vehicle that signals when the brakes are applied', 'name': 'brake_light'}, {'frequency': 'f', 'id': 1040, 'synset': 'stove.n.01', 'synonyms': ['stove', 'kitchen_stove', 'range_(kitchen_appliance)', 'kitchen_range', 'cooking_stove'], 'def': 'a kitchen appliance used for cooking food', 'name': 'stove'}, {'frequency': 'c', 'id': 1041, 'synset': 'strainer.n.01', 'synonyms': ['strainer'], 'def': 'a filter to retain larger pieces while smaller pieces and liquids pass through', 'name': 'strainer'}, {'frequency': 'f', 'id': 1042, 'synset': 'strap.n.01', 'synonyms': ['strap'], 'def': 'an elongated strip of material for binding things together or holding', 'name': 'strap'}, {'frequency': 'f', 'id': 1043, 'synset': 'straw.n.04', 'synonyms': ['straw_(for_drinking)', 'drinking_straw'], 'def': 'a thin paper or plastic tube used to suck liquids into the mouth', 'name': 'straw_(for_drinking)'}, {'frequency': 'f', 'id': 1044, 'synset': 'strawberry.n.01', 'synonyms': ['strawberry'], 'def': 'sweet fleshy red fruit', 'name': 'strawberry'}, {'frequency': 'f', 'id': 1045, 'synset': 'street_sign.n.01', 'synonyms': ['street_sign'], 'def': 'a sign visible from the street', 'name': 'street_sign'}, {'frequency': 'f', 'id': 1046, 'synset': 'streetlight.n.01', 'synonyms': ['streetlight', 'street_lamp'], 'def': 'a lamp supported on a lamppost; for illuminating a street', 'name': 'streetlight'}, {'frequency': 'r', 'id': 1047, 'synset': 'string_cheese.n.01', 'synonyms': ['string_cheese'], 'def': 'cheese formed in long strings twisted together', 'name': 'string_cheese'}, {'frequency': 'r', 'id': 1048, 'synset': 'stylus.n.02', 'synonyms': ['stylus'], 'def': 'a pointed tool for writing or drawing or engraving', 'name': 
'stylus'}, {'frequency': 'r', 'id': 1049, 'synset': 'subwoofer.n.01', 'synonyms': ['subwoofer'], 'def': 'a loudspeaker that is designed to reproduce very low bass frequencies', 'name': 'subwoofer'}, {'frequency': 'r', 'id': 1050, 'synset': 'sugar_bowl.n.01', 'synonyms': ['sugar_bowl'], 'def': 'a dish in which sugar is served', 'name': 'sugar_bowl'}, {'frequency': 'r', 'id': 1051, 'synset': 'sugarcane.n.01', 'synonyms': ['sugarcane_(plant)'], 'def': 'juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice', 'name': 'sugarcane_(plant)'}, {'frequency': 'c', 'id': 1052, 'synset': 'suit.n.01', 'synonyms': ['suit_(clothing)'], 'def': 'a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color', 'name': 'suit_(clothing)'}, {'frequency': 'c', 'id': 1053, 'synset': 'sunflower.n.01', 'synonyms': ['sunflower'], 'def': 'any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays', 'name': 'sunflower'}, {'frequency': 'f', 'id': 1054, 'synset': 'sunglasses.n.01', 'synonyms': ['sunglasses'], 'def': 'spectacles that are darkened or polarized to protect the eyes from the glare of the sun', 'name': 'sunglasses'}, {'frequency': 'c', 'id': 1055, 'synset': 'sunhat.n.01', 'synonyms': ['sunhat'], 'def': 'a hat with a broad brim that protects the face from direct exposure to the sun', 'name': 'sunhat'}, {'frequency': 'r', 'id': 1056, 'synset': 'sunscreen.n.01', 'synonyms': ['sunscreen', 'sunblock'], 'def': 'a cream spread on the skin; contains a chemical to filter out ultraviolet light and so protect from sunburn', 'name': 'sunscreen'}, {'frequency': 'f', 'id': 1057, 'synset': 'surfboard.n.01', 'synonyms': ['surfboard'], 'def': 'a narrow buoyant board for riding surf', 'name': 'surfboard'}, {'frequency': 'c', 'id': 1058, 'synset': 'sushi.n.01', 'synonyms': ['sushi'], 'def': 'rice (with raw fish) wrapped in seaweed', 'name': 'sushi'}, {'frequency': 'c', 'id': 1059, 'synset': 'swab.n.02', 'synonyms': ['mop'], 'def': 'cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors', 'name': 'mop'}, {'frequency': 'c', 'id': 1060, 'synset': 'sweat_pants.n.01', 'synonyms': ['sweat_pants'], 'def': 'loose-fitting trousers with elastic cuffs; worn by athletes', 'name': 'sweat_pants'}, {'frequency': 'c', 'id': 1061, 'synset': 'sweatband.n.02', 'synonyms': ['sweatband'], 'def': 'a band of material tied around the forehead or wrist to absorb sweat', 'name': 'sweatband'}, {'frequency': 'f', 'id': 1062, 'synset': 'sweater.n.01', 'synonyms': ['sweater'], 'def': 'a crocheted or knitted garment covering the upper part of the body', 'name': 'sweater'}, {'frequency': 'f', 'id': 1063, 'synset': 'sweatshirt.n.01', 'synonyms': ['sweatshirt'], 'def': 'cotton knit pullover with long sleeves worn during athletic activity', 'name': 'sweatshirt'}, {'frequency': 'c', 'id': 1064, 'synset': 'sweet_potato.n.02', 'synonyms': ['sweet_potato'], 'def': 'the edible tuberous root of the sweet potato vine', 'name': 'sweet_potato'}, {'frequency': 'f', 'id': 1065, 'synset': 'swimsuit.n.01', 'synonyms': ['swimsuit', 'swimwear', 'bathing_suit', 'swimming_costume', 'bathing_costume', 'swimming_trunks', 'bathing_trunks'], 'def': 'garment worn for swimming', 'name': 'swimsuit'}, {'frequency': 'c', 'id': 1066, 'synset': 'sword.n.01', 'synonyms': ['sword'], 'def': 'a cutting or thrusting weapon that has a long metal blade', 'name': 'sword'}, {'frequency': 'r', 'id': 1067, 
'synset': 'syringe.n.01', 'synonyms': ['syringe'], 'def': 'a medical instrument used to inject or withdraw fluids', 'name': 'syringe'}, {'frequency': 'r', 'id': 1068, 'synset': 'tabasco.n.02', 'synonyms': ['Tabasco_sauce'], 'def': 'very spicy sauce (trade name Tabasco) made from fully-aged red peppers', 'name': 'Tabasco_sauce'}, {'frequency': 'r', 'id': 1069, 'synset': 'table-tennis_table.n.01', 'synonyms': ['table-tennis_table', 'ping-pong_table'], 'def': 'a table used for playing table tennis', 'name': 'table-tennis_table'}, {'frequency': 'f', 'id': 1070, 'synset': 'table.n.02', 'synonyms': ['table'], 'def': 'a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs', 'name': 'table'}, {'frequency': 'c', 'id': 1071, 'synset': 'table_lamp.n.01', 'synonyms': ['table_lamp'], 'def': 'a lamp that sits on a table', 'name': 'table_lamp'}, {'frequency': 'f', 'id': 1072, 'synset': 'tablecloth.n.01', 'synonyms': ['tablecloth'], 'def': 'a covering spread over a dining table', 'name': 'tablecloth'}, {'frequency': 'r', 'id': 1073, 'synset': 'tachometer.n.01', 'synonyms': ['tachometer'], 'def': 'measuring instrument for indicating speed of rotation', 'name': 'tachometer'}, {'frequency': 'r', 'id': 1074, 'synset': 'taco.n.02', 'synonyms': ['taco'], 'def': 'a small tortilla cupped around a filling', 'name': 'taco'}, {'frequency': 'f', 'id': 1075, 'synset': 'tag.n.02', 'synonyms': ['tag'], 'def': 'a label associated with something for the purpose of identification or information', 'name': 'tag'}, {'frequency': 'f', 'id': 1076, 'synset': 'taillight.n.01', 'synonyms': ['taillight', 'rear_light'], 'def': 'lamp (usually red) mounted at the rear of a motor vehicle', 'name': 'taillight'}, {'frequency': 'r', 'id': 1077, 'synset': 'tambourine.n.01', 'synonyms': ['tambourine'], 'def': 'a shallow drum with a single drumhead and with metallic disks in the sides', 'name': 'tambourine'}, {'frequency': 'r', 'id': 1078, 'synset': 'tank.n.01', 'synonyms': ['army_tank', 'armored_combat_vehicle', 'armoured_combat_vehicle'], 'def': 'an enclosed armored military vehicle; has a cannon and moves on caterpillar treads', 'name': 'army_tank'}, {'frequency': 'c', 'id': 1079, 'synset': 'tank.n.02', 'synonyms': ['tank_(storage_vessel)', 'storage_tank'], 'def': 'a large (usually metallic) vessel for holding gases or liquids', 'name': 'tank_(storage_vessel)'}, {'frequency': 'f', 'id': 1080, 'synset': 'tank_top.n.01', 'synonyms': ['tank_top_(clothing)'], 'def': 'a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening', 'name': 'tank_top_(clothing)'}, {'frequency': 'c', 'id': 1081, 'synset': 'tape.n.01', 'synonyms': ['tape_(sticky_cloth_or_paper)'], 'def': 'a long thin piece of cloth or paper as used for binding or fastening', 'name': 'tape_(sticky_cloth_or_paper)'}, {'frequency': 'c', 'id': 1082, 'synset': 'tape.n.04', 'synonyms': ['tape_measure', 'measuring_tape'], 'def': 'measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths', 'name': 'tape_measure'}, {'frequency': 'c', 'id': 1083, 'synset': 'tapestry.n.02', 'synonyms': ['tapestry'], 'def': 'a heavy textile with a woven design; used for curtains and upholstery', 'name': 'tapestry'}, {'frequency': 'f', 'id': 1084, 'synset': 'tarpaulin.n.01', 'synonyms': ['tarp'], 'def': 'waterproofed canvas', 'name': 'tarp'}, {'frequency': 'c', 'id': 1085, 'synset': 'tartan.n.01', 'synonyms': ['tartan', 'plaid'], 'def': 'a cloth having a 
crisscross design', 'name': 'tartan'}, {'frequency': 'c', 'id': 1086, 'synset': 'tassel.n.01', 'synonyms': ['tassel'], 'def': 'adornment consisting of a bunch of cords fastened at one end', 'name': 'tassel'}, {'frequency': 'r', 'id': 1087, 'synset': 'tea_bag.n.01', 'synonyms': ['tea_bag'], 'def': 'a measured amount of tea in a bag for an individual serving of tea', 'name': 'tea_bag'}, {'frequency': 'c', 'id': 1088, 'synset': 'teacup.n.02', 'synonyms': ['teacup'], 'def': 'a cup from which tea is drunk', 'name': 'teacup'}, {'frequency': 'c', 'id': 1089, 'synset': 'teakettle.n.01', 'synonyms': ['teakettle'], 'def': 'kettle for boiling water to make tea', 'name': 'teakettle'}, {'frequency': 'c', 'id': 1090, 'synset': 'teapot.n.01', 'synonyms': ['teapot'], 'def': 'pot for brewing tea; usually has a spout and handle', 'name': 'teapot'}, {'frequency': 'f', 'id': 1091, 'synset': 'teddy.n.01', 'synonyms': ['teddy_bear'], 'def': "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", 'name': 'teddy_bear'}, {'frequency': 'f', 'id': 1092, 'synset': 'telephone.n.01', 'synonyms': ['telephone', 'phone', 'telephone_set'], 'def': 'electronic device for communicating by voice over long distances', 'name': 'telephone'}, {'frequency': 'c', 'id': 1093, 'synset': 'telephone_booth.n.01', 'synonyms': ['telephone_booth', 'phone_booth', 'call_box', 'telephone_box', 'telephone_kiosk'], 'def': 'booth for using a telephone', 'name': 'telephone_booth'}, {'frequency': 'f', 'id': 1094, 'synset': 'telephone_pole.n.01', 'synonyms': ['telephone_pole', 'telegraph_pole', 'telegraph_post'], 'def': 'tall pole supporting telephone wires', 'name': 'telephone_pole'}, {'frequency': 'r', 'id': 1095, 'synset': 'telephoto_lens.n.01', 'synonyms': ['telephoto_lens', 'zoom_lens'], 'def': 'a camera lens that magnifies the image', 'name': 'telephoto_lens'}, {'frequency': 'c', 'id': 1096, 'synset': 'television_camera.n.01', 'synonyms': ['television_camera', 'tv_camera'], 'def': 'television equipment for capturing and recording video', 'name': 'television_camera'}, {'frequency': 'f', 'id': 1097, 'synset': 'television_receiver.n.01', 'synonyms': ['television_set', 'tv', 'tv_set'], 'def': 'an electronic device that receives television signals and displays them on a screen', 'name': 'television_set'}, {'frequency': 'f', 'id': 1098, 'synset': 'tennis_ball.n.01', 'synonyms': ['tennis_ball'], 'def': 'ball about the size of a fist used in playing tennis', 'name': 'tennis_ball'}, {'frequency': 'f', 'id': 1099, 'synset': 'tennis_racket.n.01', 'synonyms': ['tennis_racket'], 'def': 'a racket used to play tennis', 'name': 'tennis_racket'}, {'frequency': 'r', 'id': 1100, 'synset': 'tequila.n.01', 'synonyms': ['tequila'], 'def': 'Mexican liquor made from fermented juices of an agave plant', 'name': 'tequila'}, {'frequency': 'c', 'id': 1101, 'synset': 'thermometer.n.01', 'synonyms': ['thermometer'], 'def': 'measuring instrument for measuring temperature', 'name': 'thermometer'}, {'frequency': 'c', 'id': 1102, 'synset': 'thermos.n.01', 'synonyms': ['thermos_bottle'], 'def': 'vacuum flask that preserves temperature of hot or cold drinks', 'name': 'thermos_bottle'}, {'frequency': 'c', 'id': 1103, 'synset': 'thermostat.n.01', 'synonyms': ['thermostat'], 'def': 'a regulator for automatically regulating temperature by starting or stopping the supply of heat', 'name': 'thermostat'}, {'frequency': 'r', 'id': 1104, 'synset': 'thimble.n.02', 'synonyms': ['thimble'], 'def': 'a small metal cap to protect the finger while sewing; 
can be used as a small container', 'name': 'thimble'}, {'frequency': 'c', 'id': 1105, 'synset': 'thread.n.01', 'synonyms': ['thread', 'yarn'], 'def': 'a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) used in sewing and weaving', 'name': 'thread'}, {'frequency': 'c', 'id': 1106, 'synset': 'thumbtack.n.01', 'synonyms': ['thumbtack', 'drawing_pin', 'pushpin'], 'def': 'a tack for attaching papers to a bulletin board or drawing board', 'name': 'thumbtack'}, {'frequency': 'c', 'id': 1107, 'synset': 'tiara.n.01', 'synonyms': ['tiara'], 'def': 'a jeweled headdress worn by women on formal occasions', 'name': 'tiara'}, {'frequency': 'c', 'id': 1108, 'synset': 'tiger.n.02', 'synonyms': ['tiger'], 'def': 'large feline of forests in most of Asia having a tawny coat with black stripes', 'name': 'tiger'}, {'frequency': 'c', 'id': 1109, 'synset': 'tights.n.01', 'synonyms': ['tights_(clothing)', 'leotards'], 'def': 'skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls', 'name': 'tights_(clothing)'}, {'frequency': 'c', 'id': 1110, 'synset': 'timer.n.01', 'synonyms': ['timer', 'stopwatch'], 'def': 'a timepiece that measures a time interval and signals its end', 'name': 'timer'}, {'frequency': 'f', 'id': 1111, 'synset': 'tinfoil.n.01', 'synonyms': ['tinfoil'], 'def': 'foil made of tin or an alloy of tin and lead', 'name': 'tinfoil'}, {'frequency': 'r', 'id': 1112, 'synset': 'tinsel.n.01', 'synonyms': ['tinsel'], 'def': 'a showy decoration that is basically valueless', 'name': 'tinsel'}, {'frequency': 'f', 'id': 1113, 'synset': 'tissue.n.02', 'synonyms': ['tissue_paper'], 'def': 'a soft thin (usually translucent) paper', 'name': 'tissue_paper'}, {'frequency': 'c', 'id': 1114, 'synset': 'toast.n.01', 'synonyms': ['toast_(food)'], 'def': 'slice of bread that has been toasted', 'name': 'toast_(food)'}, {'frequency': 'f', 'id': 1115, 'synset': 'toaster.n.02', 'synonyms': ['toaster'], 'def': 'a kitchen appliance (usually electric) for toasting bread', 'name': 'toaster'}, {'frequency': 'c', 'id': 1116, 'synset': 'toaster_oven.n.01', 'synonyms': ['toaster_oven'], 'def': 'kitchen appliance consisting of a small electric oven for toasting or warming food', 'name': 'toaster_oven'}, {'frequency': 'f', 'id': 1117, 'synset': 'toilet.n.02', 'synonyms': ['toilet'], 'def': 'a plumbing fixture for defecation and urination', 'name': 'toilet'}, {'frequency': 'f', 'id': 1118, 'synset': 'toilet_tissue.n.01', 'synonyms': ['toilet_tissue', 'toilet_paper', 'bathroom_tissue'], 'def': 'a soft thin absorbent paper for use in toilets', 'name': 'toilet_tissue'}, {'frequency': 'f', 'id': 1119, 'synset': 'tomato.n.01', 'synonyms': ['tomato'], 'def': 'mildly acid red or yellow pulpy fruit eaten as a vegetable', 'name': 'tomato'}, {'frequency': 'c', 'id': 1120, 'synset': 'tongs.n.01', 'synonyms': ['tongs'], 'def': 'any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below', 'name': 'tongs'}, {'frequency': 'c', 'id': 1121, 'synset': 'toolbox.n.01', 'synonyms': ['toolbox'], 'def': 'a box or chest or cabinet for holding hand tools', 'name': 'toolbox'}, {'frequency': 'f', 'id': 1122, 'synset': 'toothbrush.n.01', 'synonyms': ['toothbrush'], 'def': 'small brush; has long handle; used to clean teeth', 'name': 'toothbrush'}, {'frequency': 'f', 'id': 1123, 'synset': 'toothpaste.n.01', 'synonyms': ['toothpaste'], 'def': 'a dentifrice in the form of a paste', 'name': 'toothpaste'}, 
{'frequency': 'c', 'id': 1124, 'synset': 'toothpick.n.01', 'synonyms': ['toothpick'], 'def': 'pick consisting of a small strip of wood or plastic; used to pick food from between the teeth', 'name': 'toothpick'}, {'frequency': 'c', 'id': 1125, 'synset': 'top.n.09', 'synonyms': ['cover'], 'def': 'covering for a hole (especially a hole in the top of a container)', 'name': 'cover'}, {'frequency': 'c', 'id': 1126, 'synset': 'tortilla.n.01', 'synonyms': ['tortilla'], 'def': 'thin unleavened pancake made from cornmeal or wheat flour', 'name': 'tortilla'}, {'frequency': 'c', 'id': 1127, 'synset': 'tow_truck.n.01', 'synonyms': ['tow_truck'], 'def': 'a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)', 'name': 'tow_truck'}, {'frequency': 'f', 'id': 1128, 'synset': 'towel.n.01', 'synonyms': ['towel'], 'def': 'a rectangular piece of absorbent cloth (or paper) for drying or wiping', 'name': 'towel'}, {'frequency': 'f', 'id': 1129, 'synset': 'towel_rack.n.01', 'synonyms': ['towel_rack', 'towel_rail', 'towel_bar'], 'def': 'a rack consisting of one or more bars on which towels can be hung', 'name': 'towel_rack'}, {'frequency': 'f', 'id': 1130, 'synset': 'toy.n.03', 'synonyms': ['toy'], 'def': 'a device regarded as providing amusement', 'name': 'toy'}, {'frequency': 'c', 'id': 1131, 'synset': 'tractor.n.01', 'synonyms': ['tractor_(farm_equipment)'], 'def': 'a wheeled vehicle with large wheels; used in farming and other applications', 'name': 'tractor_(farm_equipment)'}, {'frequency': 'f', 'id': 1132, 'synset': 'traffic_light.n.01', 'synonyms': ['traffic_light'], 'def': 'a device to control vehicle traffic often consisting of three or more lights', 'name': 'traffic_light'}, {'frequency': 'r', 'id': 1133, 'synset': 'trail_bike.n.01', 'synonyms': ['dirt_bike'], 'def': 'a lightweight motorcycle equipped with rugged tires and suspension for off-road use', 'name': 'dirt_bike'}, {'frequency': 'c', 'id': 1134, 'synset': 'trailer_truck.n.01', 'synonyms': ['trailer_truck', 'tractor_trailer', 'trucking_rig', 'articulated_lorry', 'semi_truck'], 'def': 'a truck consisting of a tractor and trailer together', 'name': 'trailer_truck'}, {'frequency': 'f', 'id': 1135, 'synset': 'train.n.01', 'synonyms': ['train_(railroad_vehicle)', 'railroad_train'], 'def': 'public or private transport provided by a line of railway cars coupled together and drawn by a locomotive', 'name': 'train_(railroad_vehicle)'}, {'frequency': 'r', 'id': 1136, 'synset': 'trampoline.n.01', 'synonyms': ['trampoline'], 'def': 'gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame', 'name': 'trampoline'}, {'frequency': 'f', 'id': 1137, 'synset': 'tray.n.01', 'synonyms': ['tray'], 'def': 'an open receptacle for holding or displaying or serving articles or food', 'name': 'tray'}, {'frequency': 'r', 'id': 1138, 'synset': 'tree_house.n.01', 'synonyms': ['tree_house'], 'def': '(NOT A TREE) a PLAYHOUSE built in the branches of a tree', 'name': 'tree_house'}, {'frequency': 'r', 'id': 1139, 'synset': 'trench_coat.n.01', 'synonyms': ['trench_coat'], 'def': 'a military style raincoat; belted with deep pockets', 'name': 'trench_coat'}, {'frequency': 'r', 'id': 1140, 'synset': 'triangle.n.05', 'synonyms': ['triangle_(musical_instrument)'], 'def': 'a percussion instrument consisting of a metal bar bent in the shape of an open triangle', 'name': 'triangle_(musical_instrument)'}, {'frequency': 'r', 'id': 1141, 'synset': 'tricycle.n.01', 'synonyms': ['tricycle'], 'def': 'a vehicle with three 
wheels that is moved by foot pedals', 'name': 'tricycle'}, {'frequency': 'c', 'id': 1142, 'synset': 'tripod.n.01', 'synonyms': ['tripod'], 'def': 'a three-legged rack used for support', 'name': 'tripod'}, {'frequency': 'f', 'id': 1143, 'synset': 'trouser.n.01', 'synonyms': ['trousers', 'pants_(clothing)'], 'def': 'a garment extending from the waist to the knee or ankle, covering each leg separately', 'name': 'trousers'}, {'frequency': 'f', 'id': 1144, 'synset': 'truck.n.01', 'synonyms': ['truck'], 'def': 'an automotive vehicle suitable for hauling', 'name': 'truck'}, {'frequency': 'r', 'id': 1145, 'synset': 'truffle.n.03', 'synonyms': ['truffle_(chocolate)', 'chocolate_truffle'], 'def': 'creamy chocolate candy', 'name': 'truffle_(chocolate)'}, {'frequency': 'c', 'id': 1146, 'synset': 'trunk.n.02', 'synonyms': ['trunk'], 'def': 'luggage consisting of a large strong case used when traveling or for storage', 'name': 'trunk'}, {'frequency': 'r', 'id': 1147, 'synset': 'tub.n.02', 'synonyms': ['vat'], 'def': 'a large open vessel for holding or storing liquids', 'name': 'vat'}, {'frequency': 'c', 'id': 1148, 'synset': 'turban.n.01', 'synonyms': ['turban'], 'def': 'a traditional headdress consisting of a long scarf wrapped around the head', 'name': 'turban'}, {'frequency': 'r', 'id': 1149, 'synset': 'turkey.n.01', 'synonyms': ['turkey_(bird)'], 'def': 'large gallinaceous bird with fan-shaped tail; widely domesticated for food', 'name': 'turkey_(bird)'}, {'frequency': 'c', 'id': 1150, 'synset': 'turkey.n.04', 'synonyms': ['turkey_(food)'], 'def': 'flesh of large domesticated fowl usually roasted', 'name': 'turkey_(food)'}, {'frequency': 'r', 'id': 1151, 'synset': 'turnip.n.01', 'synonyms': ['turnip'], 'def': 'widely cultivated plant having a large fleshy edible white or yellow root', 'name': 'turnip'}, {'frequency': 'c', 'id': 1152, 'synset': 'turtle.n.02', 'synonyms': ['turtle'], 'def': 'any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming', 'name': 'turtle'}, {'frequency': 'r', 'id': 1153, 'synset': 'turtleneck.n.01', 'synonyms': ['turtleneck_(clothing)', 'polo-neck'], 'def': 'a sweater or jersey with a high close-fitting collar', 'name': 'turtleneck_(clothing)'}, {'frequency': 'r', 'id': 1154, 'synset': 'typewriter.n.01', 'synonyms': ['typewriter'], 'def': 'hand-operated character printer for printing written messages one character at a time', 'name': 'typewriter'}, {'frequency': 'f', 'id': 1155, 'synset': 'umbrella.n.01', 'synonyms': ['umbrella'], 'def': 'a lightweight handheld collapsible canopy', 'name': 'umbrella'}, {'frequency': 'c', 'id': 1156, 'synset': 'underwear.n.01', 'synonyms': ['underwear', 'underclothes', 'underclothing', 'underpants'], 'def': 'undergarment worn next to the skin and under the outer garments', 'name': 'underwear'}, {'frequency': 'r', 'id': 1157, 'synset': 'unicycle.n.01', 'synonyms': ['unicycle'], 'def': 'a vehicle with a single wheel that is driven by pedals', 'name': 'unicycle'}, {'frequency': 'c', 'id': 1158, 'synset': 'urinal.n.01', 'synonyms': ['urinal'], 'def': 'a plumbing fixture (usually attached to the wall) used by men to urinate', 'name': 'urinal'}, {'frequency': 'r', 'id': 1159, 'synset': 'urn.n.01', 'synonyms': ['urn'], 'def': 'a large vase that usually has a pedestal or feet', 'name': 'urn'}, {'frequency': 'c', 'id': 1160, 'synset': 'vacuum.n.04', 'synonyms': ['vacuum_cleaner'], 'def': 'an electrical home appliance that cleans by suction', 'name': 'vacuum_cleaner'}, {'frequency': 'c', 'id': 1161, 'synset': 
'valve.n.03', 'synonyms': ['valve'], 'def': 'control consisting of a mechanical device for controlling the flow of a fluid', 'name': 'valve'}, {'frequency': 'f', 'id': 1162, 'synset': 'vase.n.01', 'synonyms': ['vase'], 'def': 'an open jar of glass or porcelain used as an ornament or to hold flowers', 'name': 'vase'}, {'frequency': 'c', 'id': 1163, 'synset': 'vending_machine.n.01', 'synonyms': ['vending_machine'], 'def': 'a slot machine for selling goods', 'name': 'vending_machine'}, {'frequency': 'f', 'id': 1164, 'synset': 'vent.n.01', 'synonyms': ['vent', 'blowhole', 'air_vent'], 'def': 'a hole for the escape of gas or air', 'name': 'vent'}, {'frequency': 'c', 'id': 1165, 'synset': 'videotape.n.01', 'synonyms': ['videotape'], 'def': 'a video recording made on magnetic tape', 'name': 'videotape'}, {'frequency': 'r', 'id': 1166, 'synset': 'vinegar.n.01', 'synonyms': ['vinegar'], 'def': 'sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative', 'name': 'vinegar'}, {'frequency': 'r', 'id': 1167, 'synset': 'violin.n.01', 'synonyms': ['violin', 'fiddle'], 'def': 'bowed stringed instrument that is the highest member of the violin family', 'name': 'violin'}, {'frequency': 'r', 'id': 1168, 'synset': 'vodka.n.01', 'synonyms': ['vodka'], 'def': 'unaged colorless liquor originating in Russia', 'name': 'vodka'}, {'frequency': 'r', 'id': 1169, 'synset': 'volleyball.n.02', 'synonyms': ['volleyball'], 'def': 'an inflated ball used in playing volleyball', 'name': 'volleyball'}, {'frequency': 'r', 'id': 1170, 'synset': 'vulture.n.01', 'synonyms': ['vulture'], 'def': 'any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion', 'name': 'vulture'}, {'frequency': 'c', 'id': 1171, 'synset': 'waffle.n.01', 'synonyms': ['waffle'], 'def': 'pancake batter baked in a waffle iron', 'name': 'waffle'}, {'frequency': 'r', 'id': 1172, 'synset': 'waffle_iron.n.01', 'synonyms': ['waffle_iron'], 'def': 'a kitchen appliance for baking waffles', 'name': 'waffle_iron'}, {'frequency': 'c', 'id': 1173, 'synset': 'wagon.n.01', 'synonyms': ['wagon'], 'def': 'any of various kinds of wheeled vehicles drawn by an animal or a tractor', 'name': 'wagon'}, {'frequency': 'c', 'id': 1174, 'synset': 'wagon_wheel.n.01', 'synonyms': ['wagon_wheel'], 'def': 'a wheel of a wagon', 'name': 'wagon_wheel'}, {'frequency': 'c', 'id': 1175, 'synset': 'walking_stick.n.01', 'synonyms': ['walking_stick'], 'def': 'a stick carried in the hand for support in walking', 'name': 'walking_stick'}, {'frequency': 'c', 'id': 1176, 'synset': 'wall_clock.n.01', 'synonyms': ['wall_clock'], 'def': 'a clock mounted on a wall', 'name': 'wall_clock'}, {'frequency': 'f', 'id': 1177, 'synset': 'wall_socket.n.01', 'synonyms': ['wall_socket', 'wall_plug', 'electric_outlet', 'electrical_outlet', 'outlet', 'electric_receptacle'], 'def': 'receptacle providing a place in a wiring system where current can be taken to run electrical devices', 'name': 'wall_socket'}, {'frequency': 'c', 'id': 1178, 'synset': 'wallet.n.01', 'synonyms': ['wallet', 'billfold'], 'def': 'a pocket-size case for holding papers and paper money', 'name': 'wallet'}, {'frequency': 'r', 'id': 1179, 'synset': 'walrus.n.01', 'synonyms': ['walrus'], 'def': 'either of two large northern marine mammals having ivory tusks and tough hide over thick blubber', 'name': 'walrus'}, {'frequency': 'r', 'id': 1180, 'synset': 'wardrobe.n.01', 'synonyms': ['wardrobe'], 'def': 'a tall piece of furniture that provides 
storage space for clothes; has a door and rails or hooks for hanging clothes', 'name': 'wardrobe'}, {'frequency': 'r', 'id': 1181, 'synset': 'wasabi.n.02', 'synonyms': ['wasabi'], 'def': 'the thick green root of the wasabi plant that the Japanese use in cooking and that tastes like strong horseradish', 'name': 'wasabi'}, {'frequency': 'c', 'id': 1182, 'synset': 'washer.n.03', 'synonyms': ['automatic_washer', 'washing_machine'], 'def': 'a home appliance for washing clothes and linens automatically', 'name': 'automatic_washer'}, {'frequency': 'f', 'id': 1183, 'synset': 'watch.n.01', 'synonyms': ['watch', 'wristwatch'], 'def': 'a small, portable timepiece', 'name': 'watch'}, {'frequency': 'f', 'id': 1184, 'synset': 'water_bottle.n.01', 'synonyms': ['water_bottle'], 'def': 'a bottle for holding water', 'name': 'water_bottle'}, {'frequency': 'c', 'id': 1185, 'synset': 'water_cooler.n.01', 'synonyms': ['water_cooler'], 'def': 'a device for cooling and dispensing drinking water', 'name': 'water_cooler'}, {'frequency': 'c', 'id': 1186, 'synset': 'water_faucet.n.01', 'synonyms': ['water_faucet', 'water_tap', 'tap_(water_faucet)'], 'def': 'a faucet for drawing water from a pipe or cask', 'name': 'water_faucet'}, {'frequency': 'r', 'id': 1187, 'synset': 'water_filter.n.01', 'synonyms': ['water_filter'], 'def': 'a filter to remove impurities from the water supply', 'name': 'water_filter'}, {'frequency': 'r', 'id': 1188, 'synset': 'water_heater.n.01', 'synonyms': ['water_heater', 'hot-water_heater'], 'def': 'a heater and storage tank to supply heated water', 'name': 'water_heater'}, {'frequency': 'r', 'id': 1189, 'synset': 'water_jug.n.01', 'synonyms': ['water_jug'], 'def': 'a jug that holds water', 'name': 'water_jug'}, {'frequency': 'r', 'id': 1190, 'synset': 'water_pistol.n.01', 'synonyms': ['water_gun', 'squirt_gun'], 'def': 'plaything consisting of a toy pistol that squirts water', 'name': 'water_gun'}, {'frequency': 'c', 'id': 1191, 'synset': 'water_scooter.n.01', 'synonyms': ['water_scooter', 'sea_scooter', 'jet_ski'], 'def': 'a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)', 'name': 'water_scooter'}, {'frequency': 'c', 'id': 1192, 'synset': 'water_ski.n.01', 'synonyms': ['water_ski'], 'def': 'broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)', 'name': 'water_ski'}, {'frequency': 'c', 'id': 1193, 'synset': 'water_tower.n.01', 'synonyms': ['water_tower'], 'def': 'a large reservoir for water', 'name': 'water_tower'}, {'frequency': 'c', 'id': 1194, 'synset': 'watering_can.n.01', 'synonyms': ['watering_can'], 'def': 'a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants', 'name': 'watering_can'}, {'frequency': 'c', 'id': 1195, 'synset': 'watermelon.n.02', 'synonyms': ['watermelon'], 'def': 'large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp', 'name': 'watermelon'}, {'frequency': 'f', 'id': 1196, 'synset': 'weathervane.n.01', 'synonyms': ['weathervane', 'vane_(weathervane)', 'wind_vane'], 'def': 'mechanical device attached to an elevated structure; rotates freely to show the direction of the wind', 'name': 'weathervane'}, {'frequency': 'c', 'id': 1197, 'synset': 'webcam.n.01', 'synonyms': ['webcam'], 'def': 'a digital camera designed to take digital photographs and transmit them over the internet', 'name': 'webcam'}, {'frequency': 'c', 'id': 1198, 'synset': 'wedding_cake.n.01', 'synonyms': ['wedding_cake', 'bridecake'], 'def': 'a rich cake with two or more 
tiers and covered with frosting and decorations; served at a wedding reception', 'name': 'wedding_cake'}, {'frequency': 'c', 'id': 1199, 'synset': 'wedding_ring.n.01', 'synonyms': ['wedding_ring', 'wedding_band'], 'def': 'a ring given to the bride and/or groom at the wedding', 'name': 'wedding_ring'}, {'frequency': 'f', 'id': 1200, 'synset': 'wet_suit.n.01', 'synonyms': ['wet_suit'], 'def': 'a close-fitting garment made of a permeable material; worn in cold water to retain body heat', 'name': 'wet_suit'}, {'frequency': 'f', 'id': 1201, 'synset': 'wheel.n.01', 'synonyms': ['wheel'], 'def': 'a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle', 'name': 'wheel'}, {'frequency': 'c', 'id': 1202, 'synset': 'wheelchair.n.01', 'synonyms': ['wheelchair'], 'def': 'a movable chair mounted on large wheels', 'name': 'wheelchair'}, {'frequency': 'c', 'id': 1203, 'synset': 'whipped_cream.n.01', 'synonyms': ['whipped_cream'], 'def': 'cream that has been beaten until light and fluffy', 'name': 'whipped_cream'}, {'frequency': 'r', 'id': 1204, 'synset': 'whiskey.n.01', 'synonyms': ['whiskey'], 'def': 'a liquor made from fermented mash of grain', 'name': 'whiskey'}, {'frequency': 'r', 'id': 1205, 'synset': 'whistle.n.03', 'synonyms': ['whistle'], 'def': 'a small wind instrument that produces a whistling sound by blowing into it', 'name': 'whistle'}, {'frequency': 'r', 'id': 1206, 'synset': 'wick.n.02', 'synonyms': ['wick'], 'def': 'a loosely woven cord in a candle or oil lamp that is lit on fire', 'name': 'wick'}, {'frequency': 'c', 'id': 1207, 'synset': 'wig.n.01', 'synonyms': ['wig'], 'def': 'hairpiece covering the head and made of real or synthetic hair', 'name': 'wig'}, {'frequency': 'c', 'id': 1208, 'synset': 'wind_chime.n.01', 'synonyms': ['wind_chime'], 'def': 'a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle', 'name': 'wind_chime'}, {'frequency': 'c', 'id': 1209, 'synset': 'windmill.n.01', 'synonyms': ['windmill'], 'def': 'a mill that is powered by the wind', 'name': 'windmill'}, {'frequency': 'c', 'id': 1210, 'synset': 'window_box.n.01', 'synonyms': ['window_box_(for_plants)'], 'def': 'a container for growing plants on a windowsill', 'name': 'window_box_(for_plants)'}, {'frequency': 'f', 'id': 1211, 'synset': 'windshield_wiper.n.01', 'synonyms': ['windshield_wiper', 'windscreen_wiper', 'wiper_(for_windshield/screen)'], 'def': 'a mechanical device that cleans the windshield', 'name': 'windshield_wiper'}, {'frequency': 'c', 'id': 1212, 'synset': 'windsock.n.01', 'synonyms': ['windsock', 'air_sock', 'air-sleeve', 'wind_sleeve', 'wind_cone'], 'def': 'a truncated cloth cone mounted on a mast/pole; shows wind direction', 'name': 'windsock'}, {'frequency': 'f', 'id': 1213, 'synset': 'wine_bottle.n.01', 'synonyms': ['wine_bottle'], 'def': 'a bottle for holding wine', 'name': 'wine_bottle'}, {'frequency': 'r', 'id': 1214, 'synset': 'wine_bucket.n.01', 'synonyms': ['wine_bucket', 'wine_cooler'], 'def': 'a bucket of ice used to chill a bottle of wine', 'name': 'wine_bucket'}, {'frequency': 'f', 'id': 1215, 'synset': 'wineglass.n.01', 'synonyms': ['wineglass'], 'def': 'a glass that has a stem and in which wine is served', 'name': 'wineglass'}, {'frequency': 'r', 'id': 1216, 'synset': 'wing_chair.n.01', 'synonyms': ['wing_chair'], 'def': 'easy chair having wings on each side of a high back', 'name': 'wing_chair'}, {'frequency': 'c', 'id': 1217, 'synset': 'winker.n.02', 'synonyms': ['blinder_(for_horses)'], 
'def': 'blinds that prevent a horse from seeing something on either side', 'name': 'blinder_(for_horses)'}, {'frequency': 'c', 'id': 1218, 'synset': 'wok.n.01', 'synonyms': ['wok'], 'def': 'pan with a convex bottom; used for frying in Chinese cooking', 'name': 'wok'}, {'frequency': 'r', 'id': 1219, 'synset': 'wolf.n.01', 'synonyms': ['wolf'], 'def': 'a wild carnivorous mammal of the dog family, living and hunting in packs', 'name': 'wolf'}, {'frequency': 'c', 'id': 1220, 'synset': 'wooden_spoon.n.02', 'synonyms': ['wooden_spoon'], 'def': 'a spoon made of wood', 'name': 'wooden_spoon'}, {'frequency': 'c', 'id': 1221, 'synset': 'wreath.n.01', 'synonyms': ['wreath'], 'def': 'an arrangement of flowers, leaves, or stems fastened in a ring', 'name': 'wreath'}, {'frequency': 'c', 'id': 1222, 'synset': 'wrench.n.03', 'synonyms': ['wrench', 'spanner'], 'def': 'a hand tool that is used to hold or twist a nut or bolt', 'name': 'wrench'}, {'frequency': 'c', 'id': 1223, 'synset': 'wristband.n.01', 'synonyms': ['wristband'], 'def': 'band consisting of a part of a sleeve that covers the wrist', 'name': 'wristband'}, {'frequency': 'f', 'id': 1224, 'synset': 'wristlet.n.01', 'synonyms': ['wristlet', 'wrist_band'], 'def': 'a band or bracelet worn around the wrist', 'name': 'wristlet'}, {'frequency': 'r', 'id': 1225, 'synset': 'yacht.n.01', 'synonyms': ['yacht'], 'def': 'an expensive vessel propelled by sail or power and used for cruising or racing', 'name': 'yacht'}, {'frequency': 'r', 'id': 1226, 'synset': 'yak.n.02', 'synonyms': ['yak'], 'def': 'large long-haired wild ox of Tibet often domesticated', 'name': 'yak'}, {'frequency': 'c', 'id': 1227, 'synset': 'yogurt.n.01', 'synonyms': ['yogurt', 'yoghurt', 'yoghourt'], 'def': 'a custard-like food made from curdled milk', 'name': 'yogurt'}, {'frequency': 'r', 'id': 1228, 'synset': 'yoke.n.07', 'synonyms': ['yoke_(animal_equipment)'], 'def': 'gear joining two animals at the neck; NOT egg yolk', 'name': 'yoke_(animal_equipment)'}, {'frequency': 'f', 'id': 1229, 'synset': 'zebra.n.01', 'synonyms': ['zebra'], 'def': 'any of several fleet black-and-white striped African equines', 'name': 'zebra'}, {'frequency': 'c', 'id': 1230, 'synset': 'zucchini.n.02', 'synonyms': ['zucchini', 'courgette'], 'def': 'small cucumber-shaped vegetable marrow; typically dark green', 'name': 'zucchini'}] # noqa
-# fmt: on
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
deleted file mode 100755
index 7374e696..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# Autogen with
-# with open("lvis_v1_val.json", "r") as f:
-#     a = json.load(f)
-# c = a["categories"]
-# for x in c:
-#     del x["image_count"]
-#     del x["instance_count"]
-# LVIS_CATEGORIES = repr(c) + " # noqa"
-# with open("/tmp/lvis_categories.py", "wt") as f:
-#     f.write(f"LVIS_CATEGORIES = {LVIS_CATEGORIES}")
-# Then paste the contents of that file below
-
-# fmt: off
-LVIS_CATEGORIES = [{'frequency': 'c', 'synset': 'aerosol.n.02', 'synonyms': ['aerosol_can', 'spray_can'], 'id': 1, 'def': 'a dispenser that holds a substance under pressure', 'name': 'aerosol_can'}, {'frequency': 'f', 'synset': 'air_conditioner.n.01', 'synonyms': ['air_conditioner'], 'id': 2, 'def': 'a machine that keeps air cool and dry', 'name': 'air_conditioner'}, {'frequency': 'f', 'synset': 'airplane.n.01', 'synonyms': ['airplane', 'aeroplane'], 'id': 3, 'def': 'an aircraft that has a fixed wing and is powered by propellers or jets', 'name': 'airplane'}, {'frequency': 'f', 'synset': 'alarm_clock.n.01', 'synonyms': ['alarm_clock'], 'id': 4, 'def': 'a clock that wakes a sleeper at some preset time', 'name': 'alarm_clock'}, {'frequency': 'c', 'synset': 'alcohol.n.01', 'synonyms': ['alcohol', 'alcoholic_beverage'], 'id': 5, 'def': 'a liquor or brew containing alcohol as the active agent', 'name': 'alcohol'}, {'frequency': 'c', 'synset': 'alligator.n.02', 'synonyms': ['alligator', 'gator'], 'id': 6, 'def': 'amphibious reptiles related to crocodiles but with shorter broader snouts', 'name': 'alligator'}, {'frequency': 'c', 'synset': 'almond.n.02', 'synonyms': ['almond'], 'id': 7, 'def': 'oval-shaped edible seed of the almond tree', 'name': 'almond'}, {'frequency': 'c', 'synset': 'ambulance.n.01', 'synonyms': ['ambulance'], 'id': 8, 'def': 'a vehicle that takes people to and from hospitals', 'name': 'ambulance'}, {'frequency': 'c', 'synset': 'amplifier.n.01', 'synonyms': ['amplifier'], 'id': 9, 'def': 'electronic equipment that increases strength of signals', 'name': 'amplifier'}, {'frequency': 'c', 'synset': 'anklet.n.03', 'synonyms': ['anklet', 'ankle_bracelet'], 'id': 10, 'def': 'an ornament worn around the ankle', 'name': 'anklet'}, {'frequency': 'f', 'synset': 'antenna.n.01', 'synonyms': ['antenna', 'aerial', 'transmitting_aerial'], 'id': 11, 'def': 'an electrical device that sends or receives radio or television signals', 'name': 'antenna'}, {'frequency': 'f', 'synset': 'apple.n.01', 'synonyms': ['apple'], 'id': 12, 'def': 'fruit with red or yellow or green skin and sweet to tart crisp whitish flesh', 'name': 'apple'}, {'frequency': 'r', 'synset': 'applesauce.n.01', 'synonyms': ['applesauce'], 'id': 13, 'def': 'puree of stewed apples usually sweetened and spiced', 'name': 'applesauce'}, {'frequency': 'r', 'synset': 'apricot.n.02', 'synonyms': ['apricot'], 'id': 14, 'def': 'downy yellow to rosy-colored fruit resembling a small peach', 'name': 'apricot'}, {'frequency': 'f', 'synset': 'apron.n.01', 'synonyms': ['apron'], 'id': 15, 'def': 'a garment of cloth that is tied about the waist and worn to protect clothing', 'name': 'apron'}, {'frequency': 'c', 'synset': 'aquarium.n.01', 'synonyms': ['aquarium', 'fish_tank'], 'id': 16, 'def': 'a tank/pool/bowl filled with water for keeping live fish and underwater animals', 'name': 'aquarium'}, {'frequency': 'r', 'synset': 'arctic.n.02', 'synonyms': ['arctic_(type_of_shoe)', 'galosh', 'golosh', 'rubber_(type_of_shoe)', 'gumshoe'], 'id': 17, 'def': 'a waterproof overshoe that protects shoes from water or snow', 'name': 'arctic_(type_of_shoe)'}, {'frequency': 'c', 'synset': 
'armband.n.02', 'synonyms': ['armband'], 'id': 18, 'def': 'a band worn around the upper arm', 'name': 'armband'}, {'frequency': 'f', 'synset': 'armchair.n.01', 'synonyms': ['armchair'], 'id': 19, 'def': 'chair with a support on each side for arms', 'name': 'armchair'}, {'frequency': 'r', 'synset': 'armoire.n.01', 'synonyms': ['armoire'], 'id': 20, 'def': 'a large wardrobe or cabinet', 'name': 'armoire'}, {'frequency': 'r', 'synset': 'armor.n.01', 'synonyms': ['armor', 'armour'], 'id': 21, 'def': 'protective covering made of metal and used in combat', 'name': 'armor'}, {'frequency': 'c', 'synset': 'artichoke.n.02', 'synonyms': ['artichoke'], 'id': 22, 'def': 'a thistlelike flower head with edible fleshy leaves and heart', 'name': 'artichoke'}, {'frequency': 'f', 'synset': 'ashcan.n.01', 'synonyms': ['trash_can', 'garbage_can', 'wastebin', 'dustbin', 'trash_barrel', 'trash_bin'], 'id': 23, 'def': 'a bin that holds rubbish until it is collected', 'name': 'trash_can'}, {'frequency': 'c', 'synset': 'ashtray.n.01', 'synonyms': ['ashtray'], 'id': 24, 'def': "a receptacle for the ash from smokers' cigars or cigarettes", 'name': 'ashtray'}, {'frequency': 'c', 'synset': 'asparagus.n.02', 'synonyms': ['asparagus'], 'id': 25, 'def': 'edible young shoots of the asparagus plant', 'name': 'asparagus'}, {'frequency': 'c', 'synset': 'atomizer.n.01', 'synonyms': ['atomizer', 'atomiser', 'spray', 'sprayer', 'nebulizer', 'nebuliser'], 'id': 26, 'def': 'a dispenser that turns a liquid (such as perfume) into a fine mist', 'name': 'atomizer'}, {'frequency': 'f', 'synset': 'avocado.n.01', 'synonyms': ['avocado'], 'id': 27, 'def': 'a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed', 'name': 'avocado'}, {'frequency': 'c', 'synset': 'award.n.02', 'synonyms': ['award', 'accolade'], 'id': 28, 'def': 'a tangible symbol signifying approval or distinction', 'name': 'award'}, {'frequency': 'f', 'synset': 'awning.n.01', 'synonyms': ['awning'], 'id': 29, 'def': 'a canopy made of canvas to shelter people or things from rain or sun', 'name': 'awning'}, {'frequency': 'r', 'synset': 'ax.n.01', 'synonyms': ['ax', 'axe'], 'id': 30, 'def': 'an edge tool with a heavy bladed head mounted across a handle', 'name': 'ax'}, {'frequency': 'r', 'synset': 'baboon.n.01', 'synonyms': ['baboon'], 'id': 31, 'def': 'large terrestrial monkeys having doglike muzzles', 'name': 'baboon'}, {'frequency': 'f', 'synset': 'baby_buggy.n.01', 'synonyms': ['baby_buggy', 'baby_carriage', 'perambulator', 'pram', 'stroller'], 'id': 32, 'def': 'a small vehicle with four wheels in which a baby or child is pushed around', 'name': 'baby_buggy'}, {'frequency': 'c', 'synset': 'backboard.n.01', 'synonyms': ['basketball_backboard'], 'id': 33, 'def': 'a raised vertical board with basket attached; used to play basketball', 'name': 'basketball_backboard'}, {'frequency': 'f', 'synset': 'backpack.n.01', 'synonyms': ['backpack', 'knapsack', 'packsack', 'rucksack', 'haversack'], 'id': 34, 'def': 'a bag carried by a strap on your back or shoulder', 'name': 'backpack'}, {'frequency': 'f', 'synset': 'bag.n.04', 'synonyms': ['handbag', 'purse', 'pocketbook'], 'id': 35, 'def': 'a container used for carrying money and small personal items or accessories', 'name': 'handbag'}, {'frequency': 'f', 'synset': 'bag.n.06', 'synonyms': ['suitcase', 'baggage', 'luggage'], 'id': 36, 'def': 'cases used to carry belongings when traveling', 'name': 'suitcase'}, {'frequency': 'c', 'synset': 'bagel.n.01', 'synonyms': ['bagel', 'beigel'], 'id': 
37, 'def': 'glazed yeast-raised doughnut-shaped roll with hard crust', 'name': 'bagel'}, {'frequency': 'r', 'synset': 'bagpipe.n.01', 'synonyms': ['bagpipe'], 'id': 38, 'def': 'a tubular wind instrument; the player blows air into a bag and squeezes it out', 'name': 'bagpipe'}, {'frequency': 'r', 'synset': 'baguet.n.01', 'synonyms': ['baguet', 'baguette'], 'id': 39, 'def': 'narrow French stick loaf', 'name': 'baguet'}, {'frequency': 'r', 'synset': 'bait.n.02', 'synonyms': ['bait', 'lure'], 'id': 40, 'def': 'something used to lure fish or other animals into danger so they can be trapped or killed', 'name': 'bait'}, {'frequency': 'f', 'synset': 'ball.n.06', 'synonyms': ['ball'], 'id': 41, 'def': 'a spherical object used as a plaything', 'name': 'ball'}, {'frequency': 'r', 'synset': 'ballet_skirt.n.01', 'synonyms': ['ballet_skirt', 'tutu'], 'id': 42, 'def': 'very short skirt worn by ballerinas', 'name': 'ballet_skirt'}, {'frequency': 'f', 'synset': 'balloon.n.01', 'synonyms': ['balloon'], 'id': 43, 'def': 'large tough nonrigid bag filled with gas or heated air', 'name': 'balloon'}, {'frequency': 'c', 'synset': 'bamboo.n.02', 'synonyms': ['bamboo'], 'id': 44, 'def': 'woody tropical grass having hollow woody stems', 'name': 'bamboo'}, {'frequency': 'f', 'synset': 'banana.n.02', 'synonyms': ['banana'], 'id': 45, 'def': 'elongated crescent-shaped yellow fruit with soft sweet flesh', 'name': 'banana'}, {'frequency': 'c', 'synset': 'band_aid.n.01', 'synonyms': ['Band_Aid'], 'id': 46, 'def': 'trade name for an adhesive bandage to cover small cuts or blisters', 'name': 'Band_Aid'}, {'frequency': 'c', 'synset': 'bandage.n.01', 'synonyms': ['bandage'], 'id': 47, 'def': 'a piece of soft material that covers and protects an injured part of the body', 'name': 'bandage'}, {'frequency': 'f', 'synset': 'bandanna.n.01', 'synonyms': ['bandanna', 'bandana'], 'id': 48, 'def': 'large and brightly colored handkerchief; often used as a neckerchief', 'name': 'bandanna'}, {'frequency': 'r', 'synset': 'banjo.n.01', 'synonyms': ['banjo'], 'id': 49, 'def': 'a stringed instrument of the guitar family with a long neck and circular body', 'name': 'banjo'}, {'frequency': 'f', 'synset': 'banner.n.01', 'synonyms': ['banner', 'streamer'], 'id': 50, 'def': 'long strip of cloth or paper used for decoration or advertising', 'name': 'banner'}, {'frequency': 'r', 'synset': 'barbell.n.01', 'synonyms': ['barbell'], 'id': 51, 'def': 'a bar to which heavy discs are attached at each end; used in weightlifting', 'name': 'barbell'}, {'frequency': 'r', 'synset': 'barge.n.01', 'synonyms': ['barge'], 'id': 52, 'def': 'a flatbottom boat for carrying heavy loads (especially on canals)', 'name': 'barge'}, {'frequency': 'f', 'synset': 'barrel.n.02', 'synonyms': ['barrel', 'cask'], 'id': 53, 'def': 'a cylindrical container that holds liquids', 'name': 'barrel'}, {'frequency': 'c', 'synset': 'barrette.n.01', 'synonyms': ['barrette'], 'id': 54, 'def': "a pin for holding women's hair in place", 'name': 'barrette'}, {'frequency': 'c', 'synset': 'barrow.n.03', 'synonyms': ['barrow', 'garden_cart', 'lawn_cart', 'wheelbarrow'], 'id': 55, 'def': 'a cart for carrying small loads; has handles and one or more wheels', 'name': 'barrow'}, {'frequency': 'f', 'synset': 'base.n.03', 'synonyms': ['baseball_base'], 'id': 56, 'def': 'a place that the runner must touch before scoring', 'name': 'baseball_base'}, {'frequency': 'f', 'synset': 'baseball.n.02', 'synonyms': ['baseball'], 'id': 57, 'def': 'a ball used in playing baseball', 'name': 'baseball'}, {'frequency': 
'f', 'synset': 'baseball_bat.n.01', 'synonyms': ['baseball_bat'], 'id': 58, 'def': 'an implement used in baseball by the batter', 'name': 'baseball_bat'}, {'frequency': 'f', 'synset': 'baseball_cap.n.01', 'synonyms': ['baseball_cap', 'jockey_cap', 'golf_cap'], 'id': 59, 'def': 'a cap with a bill', 'name': 'baseball_cap'}, {'frequency': 'f', 'synset': 'baseball_glove.n.01', 'synonyms': ['baseball_glove', 'baseball_mitt'], 'id': 60, 'def': 'the handwear used by fielders in playing baseball', 'name': 'baseball_glove'}, {'frequency': 'f', 'synset': 'basket.n.01', 'synonyms': ['basket', 'handbasket'], 'id': 61, 'def': 'a container that is usually woven and has handles', 'name': 'basket'}, {'frequency': 'c', 'synset': 'basketball.n.02', 'synonyms': ['basketball'], 'id': 62, 'def': 'an inflated ball used in playing basketball', 'name': 'basketball'}, {'frequency': 'r', 'synset': 'bass_horn.n.01', 'synonyms': ['bass_horn', 'sousaphone', 'tuba'], 'id': 63, 'def': 'the lowest brass wind instrument', 'name': 'bass_horn'}, {'frequency': 'c', 'synset': 'bat.n.01', 'synonyms': ['bat_(animal)'], 'id': 64, 'def': 'nocturnal mouselike mammal with forelimbs modified to form membranous wings', 'name': 'bat_(animal)'}, {'frequency': 'f', 'synset': 'bath_mat.n.01', 'synonyms': ['bath_mat'], 'id': 65, 'def': 'a heavy towel or mat to stand on while drying yourself after a bath', 'name': 'bath_mat'}, {'frequency': 'f', 'synset': 'bath_towel.n.01', 'synonyms': ['bath_towel'], 'id': 66, 'def': 'a large towel; to dry yourself after a bath', 'name': 'bath_towel'}, {'frequency': 'c', 'synset': 'bathrobe.n.01', 'synonyms': ['bathrobe'], 'id': 67, 'def': 'a loose-fitting robe of towelling; worn after a bath or swim', 'name': 'bathrobe'}, {'frequency': 'f', 'synset': 'bathtub.n.01', 'synonyms': ['bathtub', 'bathing_tub'], 'id': 68, 'def': 'a large open container that you fill with water and use to wash the body', 'name': 'bathtub'}, {'frequency': 'r', 'synset': 'batter.n.02', 'synonyms': ['batter_(food)'], 'id': 69, 'def': 'a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking', 'name': 'batter_(food)'}, {'frequency': 'c', 'synset': 'battery.n.02', 'synonyms': ['battery'], 'id': 70, 'def': 'a portable device that produces electricity', 'name': 'battery'}, {'frequency': 'r', 'synset': 'beach_ball.n.01', 'synonyms': ['beachball'], 'id': 71, 'def': 'large and light ball; for play at the seaside', 'name': 'beachball'}, {'frequency': 'c', 'synset': 'bead.n.01', 'synonyms': ['bead'], 'id': 72, 'def': 'a small ball with a hole through the middle used for ornamentation, jewellery, etc.', 'name': 'bead'}, {'frequency': 'c', 'synset': 'bean_curd.n.01', 'synonyms': ['bean_curd', 'tofu'], 'id': 73, 'def': 'cheeselike food made of curdled soybean milk', 'name': 'bean_curd'}, {'frequency': 'c', 'synset': 'beanbag.n.01', 'synonyms': ['beanbag'], 'id': 74, 'def': 'a bag filled with dried beans or similar items; used in games or to sit on', 'name': 'beanbag'}, {'frequency': 'f', 'synset': 'beanie.n.01', 'synonyms': ['beanie', 'beany'], 'id': 75, 'def': 'a small skullcap; formerly worn by schoolboys and college freshmen', 'name': 'beanie'}, {'frequency': 'f', 'synset': 'bear.n.01', 'synonyms': ['bear'], 'id': 76, 'def': 'large carnivorous or omnivorous mammals with shaggy coats and claws', 'name': 'bear'}, {'frequency': 'f', 'synset': 'bed.n.01', 'synonyms': ['bed'], 'id': 77, 'def': 'a piece of furniture that provides a place to sleep', 'name': 'bed'}, {'frequency': 'r', 'synset': 'bedpan.n.01', 'synonyms': 
['bedpan'], 'id': 78, 'def': 'a shallow vessel used by a bedridden patient for defecation and urination', 'name': 'bedpan'}, {'frequency': 'f', 'synset': 'bedspread.n.01', 'synonyms': ['bedspread', 'bedcover', 'bed_covering', 'counterpane', 'spread'], 'id': 79, 'def': 'decorative cover for a bed', 'name': 'bedspread'}, {'frequency': 'f', 'synset': 'beef.n.01', 'synonyms': ['cow'], 'id': 80, 'def': 'cattle/cow', 'name': 'cow'}, {'frequency': 'f', 'synset': 'beef.n.02', 'synonyms': ['beef_(food)', 'boeuf_(food)'], 'id': 81, 'def': 'meat from an adult domestic bovine', 'name': 'beef_(food)'}, {'frequency': 'r', 'synset': 'beeper.n.01', 'synonyms': ['beeper', 'pager'], 'id': 82, 'def': 'a device that beeps when the person carrying it is being paged', 'name': 'beeper'}, {'frequency': 'f', 'synset': 'beer_bottle.n.01', 'synonyms': ['beer_bottle'], 'id': 83, 'def': 'a bottle that holds beer', 'name': 'beer_bottle'}, {'frequency': 'c', 'synset': 'beer_can.n.01', 'synonyms': ['beer_can'], 'id': 84, 'def': 'a can that holds beer', 'name': 'beer_can'}, {'frequency': 'r', 'synset': 'beetle.n.01', 'synonyms': ['beetle'], 'id': 85, 'def': 'insect with hard wing covers', 'name': 'beetle'}, {'frequency': 'f', 'synset': 'bell.n.01', 'synonyms': ['bell'], 'id': 86, 'def': 'a hollow device made of metal that makes a ringing sound when struck', 'name': 'bell'}, {'frequency': 'f', 'synset': 'bell_pepper.n.02', 'synonyms': ['bell_pepper', 'capsicum'], 'id': 87, 'def': 'large bell-shaped sweet pepper in green or red or yellow or orange or black varieties', 'name': 'bell_pepper'}, {'frequency': 'f', 'synset': 'belt.n.02', 'synonyms': ['belt'], 'id': 88, 'def': 'a band to tie or buckle around the body (usually at the waist)', 'name': 'belt'}, {'frequency': 'f', 'synset': 'belt_buckle.n.01', 'synonyms': ['belt_buckle'], 'id': 89, 'def': 'the buckle used to fasten a belt', 'name': 'belt_buckle'}, {'frequency': 'f', 'synset': 'bench.n.01', 'synonyms': ['bench'], 'id': 90, 'def': 'a long seat for more than one person', 'name': 'bench'}, {'frequency': 'c', 'synset': 'beret.n.01', 'synonyms': ['beret'], 'id': 91, 'def': 'a cap with no brim or bill; made of soft cloth', 'name': 'beret'}, {'frequency': 'c', 'synset': 'bib.n.02', 'synonyms': ['bib'], 'id': 92, 'def': 'a napkin tied under the chin of a child while eating', 'name': 'bib'}, {'frequency': 'r', 'synset': 'bible.n.01', 'synonyms': ['Bible'], 'id': 93, 'def': 'the sacred writings of the Christian religions', 'name': 'Bible'}, {'frequency': 'f', 'synset': 'bicycle.n.01', 'synonyms': ['bicycle', 'bike_(bicycle)'], 'id': 94, 'def': 'a wheeled vehicle that has two wheels and is moved by foot pedals', 'name': 'bicycle'}, {'frequency': 'f', 'synset': 'bill.n.09', 'synonyms': ['visor', 'vizor'], 'id': 95, 'def': 'a brim that projects to the front to shade the eyes', 'name': 'visor'}, {'frequency': 'f', 'synset': 'billboard.n.01', 'synonyms': ['billboard'], 'id': 96, 'def': 'large outdoor signboard', 'name': 'billboard'}, {'frequency': 'c', 'synset': 'binder.n.03', 'synonyms': ['binder', 'ring-binder'], 'id': 97, 'def': 'holds loose papers or magazines', 'name': 'binder'}, {'frequency': 'c', 'synset': 'binoculars.n.01', 'synonyms': ['binoculars', 'field_glasses', 'opera_glasses'], 'id': 98, 'def': 'an optical instrument designed for simultaneous use by both eyes', 'name': 'binoculars'}, {'frequency': 'f', 'synset': 'bird.n.01', 'synonyms': ['bird'], 'id': 99, 'def': 'animal characterized by feathers and wings', 'name': 'bird'}, {'frequency': 'c', 'synset': 
'bird_feeder.n.01', 'synonyms': ['birdfeeder'], 'id': 100, 'def': 'an outdoor device that supplies food for wild birds', 'name': 'birdfeeder'}, {'frequency': 'c', 'synset': 'birdbath.n.01', 'synonyms': ['birdbath'], 'id': 101, 'def': 'an ornamental basin (usually in a garden) for birds to bathe in', 'name': 'birdbath'}, {'frequency': 'c', 'synset': 'birdcage.n.01', 'synonyms': ['birdcage'], 'id': 102, 'def': 'a cage in which a bird can be kept', 'name': 'birdcage'}, {'frequency': 'c', 'synset': 'birdhouse.n.01', 'synonyms': ['birdhouse'], 'id': 103, 'def': 'a shelter for birds', 'name': 'birdhouse'}, {'frequency': 'f', 'synset': 'birthday_cake.n.01', 'synonyms': ['birthday_cake'], 'id': 104, 'def': 'decorated cake served at a birthday party', 'name': 'birthday_cake'}, {'frequency': 'r', 'synset': 'birthday_card.n.01', 'synonyms': ['birthday_card'], 'id': 105, 'def': 'a card expressing a birthday greeting', 'name': 'birthday_card'}, {'frequency': 'r', 'synset': 'black_flag.n.01', 'synonyms': ['pirate_flag'], 'id': 106, 'def': 'a flag usually bearing a white skull and crossbones on a black background', 'name': 'pirate_flag'}, {'frequency': 'c', 'synset': 'black_sheep.n.02', 'synonyms': ['black_sheep'], 'id': 107, 'def': 'sheep with a black coat', 'name': 'black_sheep'}, {'frequency': 'c', 'synset': 'blackberry.n.01', 'synonyms': ['blackberry'], 'id': 108, 'def': 'large sweet black or very dark purple edible aggregate fruit', 'name': 'blackberry'}, {'frequency': 'f', 'synset': 'blackboard.n.01', 'synonyms': ['blackboard', 'chalkboard'], 'id': 109, 'def': 'sheet of slate; for writing with chalk', 'name': 'blackboard'}, {'frequency': 'f', 'synset': 'blanket.n.01', 'synonyms': ['blanket'], 'id': 110, 'def': 'bedding that keeps a person warm in bed', 'name': 'blanket'}, {'frequency': 'c', 'synset': 'blazer.n.01', 'synonyms': ['blazer', 'sport_jacket', 'sport_coat', 'sports_jacket', 'sports_coat'], 'id': 111, 'def': 'lightweight jacket; often striped in the colors of a club or school', 'name': 'blazer'}, {'frequency': 'f', 'synset': 'blender.n.01', 'synonyms': ['blender', 'liquidizer', 'liquidiser'], 'id': 112, 'def': 'an electrically powered mixer that mixes or chops or liquefies foods', 'name': 'blender'}, {'frequency': 'r', 'synset': 'blimp.n.02', 'synonyms': ['blimp'], 'id': 113, 'def': 'a small nonrigid airship used for observation or as a barrage balloon', 'name': 'blimp'}, {'frequency': 'f', 'synset': 'blinker.n.01', 'synonyms': ['blinker', 'flasher'], 'id': 114, 'def': 'a light that flashes on and off; used as a signal or to send messages', 'name': 'blinker'}, {'frequency': 'f', 'synset': 'blouse.n.01', 'synonyms': ['blouse'], 'id': 115, 'def': 'a top worn by women', 'name': 'blouse'}, {'frequency': 'f', 'synset': 'blueberry.n.02', 'synonyms': ['blueberry'], 'id': 116, 'def': 'sweet edible dark-blue berries of blueberry plants', 'name': 'blueberry'}, {'frequency': 'r', 'synset': 'board.n.09', 'synonyms': ['gameboard'], 'id': 117, 'def': 'a flat portable surface (usually rectangular) designed for board games', 'name': 'gameboard'}, {'frequency': 'f', 'synset': 'boat.n.01', 'synonyms': ['boat', 'ship_(boat)'], 'id': 118, 'def': 'a vessel for travel on water', 'name': 'boat'}, {'frequency': 'r', 'synset': 'bob.n.05', 'synonyms': ['bob', 'bobber', 'bobfloat'], 'id': 119, 'def': 'a small float usually made of cork; attached to a fishing line', 'name': 'bob'}, {'frequency': 'c', 'synset': 'bobbin.n.01', 'synonyms': ['bobbin', 'spool', 'reel'], 'id': 120, 'def': 'a thing around which thread/tape/film 
or other flexible materials can be wound', 'name': 'bobbin'}, {'frequency': 'c', 'synset': 'bobby_pin.n.01', 'synonyms': ['bobby_pin', 'hairgrip'], 'id': 121, 'def': 'a flat wire hairpin used to hold bobbed hair in place', 'name': 'bobby_pin'}, {'frequency': 'c', 'synset': 'boiled_egg.n.01', 'synonyms': ['boiled_egg', 'coddled_egg'], 'id': 122, 'def': 'egg cooked briefly in the shell in gently boiling water', 'name': 'boiled_egg'}, {'frequency': 'r', 'synset': 'bolo_tie.n.01', 'synonyms': ['bolo_tie', 'bolo', 'bola_tie', 'bola'], 'id': 123, 'def': 'a cord fastened around the neck with an ornamental clasp and worn as a necktie', 'name': 'bolo_tie'}, {'frequency': 'c', 'synset': 'bolt.n.03', 'synonyms': ['deadbolt'], 'id': 124, 'def': 'the part of a lock that is engaged or withdrawn with a key', 'name': 'deadbolt'}, {'frequency': 'f', 'synset': 'bolt.n.06', 'synonyms': ['bolt'], 'id': 125, 'def': 'a screw that screws into a nut to form a fastener', 'name': 'bolt'}, {'frequency': 'r', 'synset': 'bonnet.n.01', 'synonyms': ['bonnet'], 'id': 126, 'def': 'a hat tied under the chin', 'name': 'bonnet'}, {'frequency': 'f', 'synset': 'book.n.01', 'synonyms': ['book'], 'id': 127, 'def': 'a written work or composition that has been published', 'name': 'book'}, {'frequency': 'c', 'synset': 'bookcase.n.01', 'synonyms': ['bookcase'], 'id': 128, 'def': 'a piece of furniture with shelves for storing books', 'name': 'bookcase'}, {'frequency': 'c', 'synset': 'booklet.n.01', 'synonyms': ['booklet', 'brochure', 'leaflet', 'pamphlet'], 'id': 129, 'def': 'a small book usually having a paper cover', 'name': 'booklet'}, {'frequency': 'r', 'synset': 'bookmark.n.01', 'synonyms': ['bookmark', 'bookmarker'], 'id': 130, 'def': 'a marker (a piece of paper or ribbon) placed between the pages of a book', 'name': 'bookmark'}, {'frequency': 'r', 'synset': 'boom.n.04', 'synonyms': ['boom_microphone', 'microphone_boom'], 'id': 131, 'def': 'a pole carrying an overhead microphone projected over a film or tv set', 'name': 'boom_microphone'}, {'frequency': 'f', 'synset': 'boot.n.01', 'synonyms': ['boot'], 'id': 132, 'def': 'footwear that covers the whole foot and lower leg', 'name': 'boot'}, {'frequency': 'f', 'synset': 'bottle.n.01', 'synonyms': ['bottle'], 'id': 133, 'def': 'a glass or plastic vessel used for storing drinks or other liquids', 'name': 'bottle'}, {'frequency': 'c', 'synset': 'bottle_opener.n.01', 'synonyms': ['bottle_opener'], 'id': 134, 'def': 'an opener for removing caps or corks from bottles', 'name': 'bottle_opener'}, {'frequency': 'c', 'synset': 'bouquet.n.01', 'synonyms': ['bouquet'], 'id': 135, 'def': 'an arrangement of flowers that is usually given as a present', 'name': 'bouquet'}, {'frequency': 'r', 'synset': 'bow.n.04', 'synonyms': ['bow_(weapon)'], 'id': 136, 'def': 'a weapon for shooting arrows', 'name': 'bow_(weapon)'}, {'frequency': 'f', 'synset': 'bow.n.08', 'synonyms': ['bow_(decorative_ribbons)'], 'id': 137, 'def': 'a decorative interlacing of ribbons', 'name': 'bow_(decorative_ribbons)'}, {'frequency': 'f', 'synset': 'bow_tie.n.01', 'synonyms': ['bow-tie', 'bowtie'], 'id': 138, 'def': "a man's tie that ties in a bow", 'name': 'bow-tie'}, {'frequency': 'f', 'synset': 'bowl.n.03', 'synonyms': ['bowl'], 'id': 139, 'def': 'a dish that is round and open at the top for serving foods', 'name': 'bowl'}, {'frequency': 'r', 'synset': 'bowl.n.08', 'synonyms': ['pipe_bowl'], 'id': 140, 'def': 'a small round container that is open at the top for holding tobacco', 'name': 'pipe_bowl'}, {'frequency': 'c', 
'synset': 'bowler_hat.n.01', 'synonyms': ['bowler_hat', 'bowler', 'derby_hat', 'derby', 'plug_hat'], 'id': 141, 'def': 'a felt hat that is round and hard with a narrow brim', 'name': 'bowler_hat'}, {'frequency': 'r', 'synset': 'bowling_ball.n.01', 'synonyms': ['bowling_ball'], 'id': 142, 'def': 'a large ball with finger holes used in the sport of bowling', 'name': 'bowling_ball'}, {'frequency': 'f', 'synset': 'box.n.01', 'synonyms': ['box'], 'id': 143, 'def': 'a (usually rectangular) container; may have a lid', 'name': 'box'}, {'frequency': 'r', 'synset': 'boxing_glove.n.01', 'synonyms': ['boxing_glove'], 'id': 144, 'def': 'large glove covering the fists of a fighter; worn for the sport of boxing', 'name': 'boxing_glove'}, {'frequency': 'c', 'synset': 'brace.n.06', 'synonyms': ['suspenders'], 'id': 145, 'def': 'elastic straps that hold trousers up (usually used in the plural)', 'name': 'suspenders'}, {'frequency': 'f', 'synset': 'bracelet.n.02', 'synonyms': ['bracelet', 'bangle'], 'id': 146, 'def': 'jewelry worn around the wrist for decoration', 'name': 'bracelet'}, {'frequency': 'r', 'synset': 'brass.n.07', 'synonyms': ['brass_plaque'], 'id': 147, 'def': 'a memorial made of brass', 'name': 'brass_plaque'}, {'frequency': 'c', 'synset': 'brassiere.n.01', 'synonyms': ['brassiere', 'bra', 'bandeau'], 'id': 148, 'def': 'an undergarment worn by women to support their breasts', 'name': 'brassiere'}, {'frequency': 'c', 'synset': 'bread-bin.n.01', 'synonyms': ['bread-bin', 'breadbox'], 'id': 149, 'def': 'a container used to keep bread or cake in', 'name': 'bread-bin'}, {'frequency': 'f', 'synset': 'bread.n.01', 'synonyms': ['bread'], 'id': 150, 'def': 'food made from dough of flour or meal and usually raised with yeast or baking powder and then baked', 'name': 'bread'}, {'frequency': 'r', 'synset': 'breechcloth.n.01', 'synonyms': ['breechcloth', 'breechclout', 'loincloth'], 'id': 151, 'def': 'a garment that provides covering for the loins', 'name': 'breechcloth'}, {'frequency': 'f', 'synset': 'bridal_gown.n.01', 'synonyms': ['bridal_gown', 'wedding_gown', 'wedding_dress'], 'id': 152, 'def': 'a gown worn by the bride at a wedding', 'name': 'bridal_gown'}, {'frequency': 'c', 'synset': 'briefcase.n.01', 'synonyms': ['briefcase'], 'id': 153, 'def': 'a case with a handle; for carrying papers or files or books', 'name': 'briefcase'}, {'frequency': 'f', 'synset': 'broccoli.n.01', 'synonyms': ['broccoli'], 'id': 154, 'def': 'plant with dense clusters of tight green flower buds', 'name': 'broccoli'}, {'frequency': 'r', 'synset': 'brooch.n.01', 'synonyms': ['broach'], 'id': 155, 'def': 'a decorative pin worn by women', 'name': 'broach'}, {'frequency': 'c', 'synset': 'broom.n.01', 'synonyms': ['broom'], 'id': 156, 'def': 'bundle of straws or twigs attached to a long handle; used for cleaning', 'name': 'broom'}, {'frequency': 'c', 'synset': 'brownie.n.03', 'synonyms': ['brownie'], 'id': 157, 'def': 'square or bar of very rich chocolate cake usually with nuts', 'name': 'brownie'}, {'frequency': 'c', 'synset': 'brussels_sprouts.n.01', 'synonyms': ['brussels_sprouts'], 'id': 158, 'def': 'the small edible cabbage-like buds growing along a stalk', 'name': 'brussels_sprouts'}, {'frequency': 'r', 'synset': 'bubble_gum.n.01', 'synonyms': ['bubble_gum'], 'id': 159, 'def': 'a kind of chewing gum that can be blown into bubbles', 'name': 'bubble_gum'}, {'frequency': 'f', 'synset': 'bucket.n.01', 'synonyms': ['bucket', 'pail'], 'id': 160, 'def': 'a roughly cylindrical vessel that is open at the top', 'name': 'bucket'}, 
{'frequency': 'r', 'synset': 'buggy.n.01', 'synonyms': ['horse_buggy'], 'id': 161, 'def': 'a small lightweight carriage; drawn by a single horse', 'name': 'horse_buggy'}, {'frequency': 'c', 'synset': 'bull.n.11', 'synonyms': ['horned_cow'], 'id': 162, 'def': 'a cow with horns', 'name': 'bull'}, {'frequency': 'c', 'synset': 'bulldog.n.01', 'synonyms': ['bulldog'], 'id': 163, 'def': 'a thickset short-haired dog with a large head and strong undershot lower jaw', 'name': 'bulldog'}, {'frequency': 'r', 'synset': 'bulldozer.n.01', 'synonyms': ['bulldozer', 'dozer'], 'id': 164, 'def': 'large powerful tractor; a large blade in front flattens areas of ground', 'name': 'bulldozer'}, {'frequency': 'c', 'synset': 'bullet_train.n.01', 'synonyms': ['bullet_train'], 'id': 165, 'def': 'a high-speed passenger train', 'name': 'bullet_train'}, {'frequency': 'c', 'synset': 'bulletin_board.n.02', 'synonyms': ['bulletin_board', 'notice_board'], 'id': 166, 'def': 'a board that hangs on a wall; displays announcements', 'name': 'bulletin_board'}, {'frequency': 'r', 'synset': 'bulletproof_vest.n.01', 'synonyms': ['bulletproof_vest'], 'id': 167, 'def': 'a vest capable of resisting the impact of a bullet', 'name': 'bulletproof_vest'}, {'frequency': 'c', 'synset': 'bullhorn.n.01', 'synonyms': ['bullhorn', 'megaphone'], 'id': 168, 'def': 'a portable loudspeaker with built-in microphone and amplifier', 'name': 'bullhorn'}, {'frequency': 'f', 'synset': 'bun.n.01', 'synonyms': ['bun', 'roll'], 'id': 169, 'def': 'small rounded bread either plain or sweet', 'name': 'bun'}, {'frequency': 'c', 'synset': 'bunk_bed.n.01', 'synonyms': ['bunk_bed'], 'id': 170, 'def': 'beds built one above the other', 'name': 'bunk_bed'}, {'frequency': 'f', 'synset': 'buoy.n.01', 'synonyms': ['buoy'], 'id': 171, 'def': 'a float attached by rope to the seabed to mark channels in a harbor or underwater hazards', 'name': 'buoy'}, {'frequency': 'r', 'synset': 'burrito.n.01', 'synonyms': ['burrito'], 'id': 172, 'def': 'a flour tortilla folded around a filling', 'name': 'burrito'}, {'frequency': 'f', 'synset': 'bus.n.01', 'synonyms': ['bus_(vehicle)', 'autobus', 'charabanc', 'double-decker', 'motorbus', 'motorcoach'], 'id': 173, 'def': 'a vehicle carrying many passengers; used for public transport', 'name': 'bus_(vehicle)'}, {'frequency': 'c', 'synset': 'business_card.n.01', 'synonyms': ['business_card'], 'id': 174, 'def': "a card on which are printed the person's name and business affiliation", 'name': 'business_card'}, {'frequency': 'f', 'synset': 'butter.n.01', 'synonyms': ['butter'], 'id': 175, 'def': 'an edible emulsion of fat globules made by churning milk or cream; for cooking and table use', 'name': 'butter'}, {'frequency': 'c', 'synset': 'butterfly.n.01', 'synonyms': ['butterfly'], 'id': 176, 'def': 'insect typically having a slender body with knobbed antennae and broad colorful wings', 'name': 'butterfly'}, {'frequency': 'f', 'synset': 'button.n.01', 'synonyms': ['button'], 'id': 177, 'def': 'a round fastener sewn to shirts and coats etc to fit through buttonholes', 'name': 'button'}, {'frequency': 'f', 'synset': 'cab.n.03', 'synonyms': ['cab_(taxi)', 'taxi', 'taxicab'], 'id': 178, 'def': 'a car that takes passengers where they want to go in exchange for money', 'name': 'cab_(taxi)'}, {'frequency': 'r', 'synset': 'cabana.n.01', 'synonyms': ['cabana'], 'id': 179, 'def': 'a small tent used as a dressing room beside the sea or a swimming pool', 'name': 'cabana'}, {'frequency': 'c', 'synset': 'cabin_car.n.01', 'synonyms': ['cabin_car', 'caboose'], 
'id': 180, 'def': 'a car on a freight train for use of the train crew; usually the last car on the train', 'name': 'cabin_car'}, {'frequency': 'f', 'synset': 'cabinet.n.01', 'synonyms': ['cabinet'], 'id': 181, 'def': 'a piece of furniture resembling a cupboard with doors and shelves and drawers', 'name': 'cabinet'}, {'frequency': 'r', 'synset': 'cabinet.n.03', 'synonyms': ['locker', 'storage_locker'], 'id': 182, 'def': 'a storage compartment for clothes and valuables; usually it has a lock', 'name': 'locker'}, {'frequency': 'f', 'synset': 'cake.n.03', 'synonyms': ['cake'], 'id': 183, 'def': 'baked goods made from or based on a mixture of flour, sugar, eggs, and fat', 'name': 'cake'}, {'frequency': 'c', 'synset': 'calculator.n.02', 'synonyms': ['calculator'], 'id': 184, 'def': 'a small machine that is used for mathematical calculations', 'name': 'calculator'}, {'frequency': 'f', 'synset': 'calendar.n.02', 'synonyms': ['calendar'], 'id': 185, 'def': 'a list or register of events (appointments/social events/court cases, etc)', 'name': 'calendar'}, {'frequency': 'c', 'synset': 'calf.n.01', 'synonyms': ['calf'], 'id': 186, 'def': 'young of domestic cattle', 'name': 'calf'}, {'frequency': 'c', 'synset': 'camcorder.n.01', 'synonyms': ['camcorder'], 'id': 187, 'def': 'a portable television camera and videocassette recorder', 'name': 'camcorder'}, {'frequency': 'c', 'synset': 'camel.n.01', 'synonyms': ['camel'], 'id': 188, 'def': 'cud-chewing mammal used as a draft or saddle animal in desert regions', 'name': 'camel'}, {'frequency': 'f', 'synset': 'camera.n.01', 'synonyms': ['camera'], 'id': 189, 'def': 'equipment for taking photographs', 'name': 'camera'}, {'frequency': 'c', 'synset': 'camera_lens.n.01', 'synonyms': ['camera_lens'], 'id': 190, 'def': 'a lens that focuses the image in a camera', 'name': 'camera_lens'}, {'frequency': 'c', 'synset': 'camper.n.02', 'synonyms': ['camper_(vehicle)', 'camping_bus', 'motor_home'], 'id': 191, 'def': 'a recreational vehicle equipped for camping out while traveling', 'name': 'camper_(vehicle)'}, {'frequency': 'f', 'synset': 'can.n.01', 'synonyms': ['can', 'tin_can'], 'id': 192, 'def': 'airtight sealed metal container for food or drink or paint etc.', 'name': 'can'}, {'frequency': 'c', 'synset': 'can_opener.n.01', 'synonyms': ['can_opener', 'tin_opener'], 'id': 193, 'def': 'a device for cutting cans open', 'name': 'can_opener'}, {'frequency': 'f', 'synset': 'candle.n.01', 'synonyms': ['candle', 'candlestick'], 'id': 194, 'def': 'stick of wax with a wick in the middle', 'name': 'candle'}, {'frequency': 'f', 'synset': 'candlestick.n.01', 'synonyms': ['candle_holder'], 'id': 195, 'def': 'a holder with sockets for candles', 'name': 'candle_holder'}, {'frequency': 'r', 'synset': 'candy_bar.n.01', 'synonyms': ['candy_bar'], 'id': 196, 'def': 'a candy shaped as a bar', 'name': 'candy_bar'}, {'frequency': 'c', 'synset': 'candy_cane.n.01', 'synonyms': ['candy_cane'], 'id': 197, 'def': 'a hard candy in the shape of a rod (usually with stripes)', 'name': 'candy_cane'}, {'frequency': 'c', 'synset': 'cane.n.01', 'synonyms': ['walking_cane'], 'id': 198, 'def': 'a stick that people can lean on to help them walk', 'name': 'walking_cane'}, {'frequency': 'c', 'synset': 'canister.n.02', 'synonyms': ['canister', 'cannister'], 'id': 199, 'def': 'metal container for storing dry foods such as tea or flour', 'name': 'canister'}, {'frequency': 'c', 'synset': 'canoe.n.01', 'synonyms': ['canoe'], 'id': 200, 'def': 'small and light boat; pointed at both ends; propelled with a paddle', 
'name': 'canoe'}, {'frequency': 'c', 'synset': 'cantaloup.n.02', 'synonyms': ['cantaloup', 'cantaloupe'], 'id': 201, 'def': 'the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh', 'name': 'cantaloup'}, {'frequency': 'r', 'synset': 'canteen.n.01', 'synonyms': ['canteen'], 'id': 202, 'def': 'a flask for carrying water; used by soldiers or travelers', 'name': 'canteen'}, {'frequency': 'f', 'synset': 'cap.n.01', 'synonyms': ['cap_(headwear)'], 'id': 203, 'def': 'a tight-fitting headwear', 'name': 'cap_(headwear)'}, {'frequency': 'f', 'synset': 'cap.n.02', 'synonyms': ['bottle_cap', 'cap_(container_lid)'], 'id': 204, 'def': 'a top (as for a bottle)', 'name': 'bottle_cap'}, {'frequency': 'c', 'synset': 'cape.n.02', 'synonyms': ['cape'], 'id': 205, 'def': 'a sleeveless garment like a cloak but shorter', 'name': 'cape'}, {'frequency': 'c', 'synset': 'cappuccino.n.01', 'synonyms': ['cappuccino', 'coffee_cappuccino'], 'id': 206, 'def': 'equal parts of espresso and steamed milk', 'name': 'cappuccino'}, {'frequency': 'f', 'synset': 'car.n.01', 'synonyms': ['car_(automobile)', 'auto_(automobile)', 'automobile'], 'id': 207, 'def': 'a motor vehicle with four wheels', 'name': 'car_(automobile)'}, {'frequency': 'f', 'synset': 'car.n.02', 'synonyms': ['railcar_(part_of_a_train)', 'railway_car_(part_of_a_train)', 'railroad_car_(part_of_a_train)'], 'id': 208, 'def': 'a wheeled vehicle adapted to the rails of railroad (mark each individual railcar separately)', 'name': 'railcar_(part_of_a_train)'}, {'frequency': 'r', 'synset': 'car.n.04', 'synonyms': ['elevator_car'], 'id': 209, 'def': 'where passengers ride up and down', 'name': 'elevator_car'}, {'frequency': 'r', 'synset': 'car_battery.n.01', 'synonyms': ['car_battery', 'automobile_battery'], 'id': 210, 'def': 'a battery in a motor vehicle', 'name': 'car_battery'}, {'frequency': 'c', 'synset': 'card.n.02', 'synonyms': ['identity_card'], 'id': 211, 'def': 'a card certifying the identity of the bearer', 'name': 'identity_card'}, {'frequency': 'c', 'synset': 'card.n.03', 'synonyms': ['card'], 'id': 212, 'def': 'a rectangular piece of paper used to send messages (e.g. 
greetings or pictures)', 'name': 'card'}, {'frequency': 'c', 'synset': 'cardigan.n.01', 'synonyms': ['cardigan'], 'id': 213, 'def': 'knitted jacket that is fastened up the front with buttons or a zipper', 'name': 'cardigan'}, {'frequency': 'r', 'synset': 'cargo_ship.n.01', 'synonyms': ['cargo_ship', 'cargo_vessel'], 'id': 214, 'def': 'a ship designed to carry cargo', 'name': 'cargo_ship'}, {'frequency': 'r', 'synset': 'carnation.n.01', 'synonyms': ['carnation'], 'id': 215, 'def': 'plant with pink to purple-red spice-scented usually double flowers', 'name': 'carnation'}, {'frequency': 'c', 'synset': 'carriage.n.02', 'synonyms': ['horse_carriage'], 'id': 216, 'def': 'a vehicle with wheels drawn by one or more horses', 'name': 'horse_carriage'}, {'frequency': 'f', 'synset': 'carrot.n.01', 'synonyms': ['carrot'], 'id': 217, 'def': 'deep orange edible root of the cultivated carrot plant', 'name': 'carrot'}, {'frequency': 'f', 'synset': 'carryall.n.01', 'synonyms': ['tote_bag'], 'id': 218, 'def': 'a capacious bag or basket', 'name': 'tote_bag'}, {'frequency': 'c', 'synset': 'cart.n.01', 'synonyms': ['cart'], 'id': 219, 'def': 'a heavy open wagon usually having two wheels and drawn by an animal', 'name': 'cart'}, {'frequency': 'c', 'synset': 'carton.n.02', 'synonyms': ['carton'], 'id': 220, 'def': 'a container made of cardboard for holding food or drink', 'name': 'carton'}, {'frequency': 'c', 'synset': 'cash_register.n.01', 'synonyms': ['cash_register', 'register_(for_cash_transactions)'], 'id': 221, 'def': 'a cashbox with an adding machine to register transactions', 'name': 'cash_register'}, {'frequency': 'r', 'synset': 'casserole.n.01', 'synonyms': ['casserole'], 'id': 222, 'def': 'food cooked and served in a casserole', 'name': 'casserole'}, {'frequency': 'r', 'synset': 'cassette.n.01', 'synonyms': ['cassette'], 'id': 223, 'def': 'a container that holds a magnetic tape used for recording or playing sound or video', 'name': 'cassette'}, {'frequency': 'c', 'synset': 'cast.n.05', 'synonyms': ['cast', 'plaster_cast', 'plaster_bandage'], 'id': 224, 'def': 'bandage consisting of a firm covering that immobilizes broken bones while they heal', 'name': 'cast'}, {'frequency': 'f', 'synset': 'cat.n.01', 'synonyms': ['cat'], 'id': 225, 'def': 'a domestic house cat', 'name': 'cat'}, {'frequency': 'f', 'synset': 'cauliflower.n.02', 'synonyms': ['cauliflower'], 'id': 226, 'def': 'edible compact head of white undeveloped flowers', 'name': 'cauliflower'}, {'frequency': 'c', 'synset': 'cayenne.n.02', 'synonyms': ['cayenne_(spice)', 'cayenne_pepper_(spice)', 'red_pepper_(spice)'], 'id': 227, 'def': 'ground pods and seeds of pungent red peppers of the genus Capsicum', 'name': 'cayenne_(spice)'}, {'frequency': 'c', 'synset': 'cd_player.n.01', 'synonyms': ['CD_player'], 'id': 228, 'def': 'electronic equipment for playing compact discs (CDs)', 'name': 'CD_player'}, {'frequency': 'f', 'synset': 'celery.n.01', 'synonyms': ['celery'], 'id': 229, 'def': 'widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked', 'name': 'celery'}, {'frequency': 'f', 'synset': 'cellular_telephone.n.01', 'synonyms': ['cellular_telephone', 'cellular_phone', 'cellphone', 'mobile_phone', 'smart_phone'], 'id': 230, 'def': 'a hand-held mobile telephone', 'name': 'cellular_telephone'}, {'frequency': 'r', 'synset': 'chain_mail.n.01', 'synonyms': ['chain_mail', 'ring_mail', 'chain_armor', 'chain_armour', 'ring_armor', 'ring_armour'], 'id': 231, 'def': '(Middle Ages) flexible armor made of interlinked metal rings', 'name': 
'chain_mail'}, {'frequency': 'f', 'synset': 'chair.n.01', 'synonyms': ['chair'], 'id': 232, 'def': 'a seat for one person, with a support for the back', 'name': 'chair'}, {'frequency': 'r', 'synset': 'chaise_longue.n.01', 'synonyms': ['chaise_longue', 'chaise', 'daybed'], 'id': 233, 'def': 'a long chair; for reclining', 'name': 'chaise_longue'}, {'frequency': 'r', 'synset': 'chalice.n.01', 'synonyms': ['chalice'], 'id': 234, 'def': 'a bowl-shaped drinking vessel; especially the Eucharistic cup', 'name': 'chalice'}, {'frequency': 'f', 'synset': 'chandelier.n.01', 'synonyms': ['chandelier'], 'id': 235, 'def': 'branched lighting fixture; often ornate; hangs from the ceiling', 'name': 'chandelier'}, {'frequency': 'r', 'synset': 'chap.n.04', 'synonyms': ['chap'], 'id': 236, 'def': 'leather leggings without a seat; worn over trousers by cowboys to protect their legs', 'name': 'chap'}, {'frequency': 'r', 'synset': 'checkbook.n.01', 'synonyms': ['checkbook', 'chequebook'], 'id': 237, 'def': 'a book issued to holders of checking accounts', 'name': 'checkbook'}, {'frequency': 'r', 'synset': 'checkerboard.n.01', 'synonyms': ['checkerboard'], 'id': 238, 'def': 'a board having 64 squares of two alternating colors', 'name': 'checkerboard'}, {'frequency': 'c', 'synset': 'cherry.n.03', 'synonyms': ['cherry'], 'id': 239, 'def': 'a red fruit with a single hard stone', 'name': 'cherry'}, {'frequency': 'r', 'synset': 'chessboard.n.01', 'synonyms': ['chessboard'], 'id': 240, 'def': 'a checkerboard used to play chess', 'name': 'chessboard'}, {'frequency': 'c', 'synset': 'chicken.n.02', 'synonyms': ['chicken_(animal)'], 'id': 241, 'def': 'a domestic fowl bred for flesh or eggs', 'name': 'chicken_(animal)'}, {'frequency': 'c', 'synset': 'chickpea.n.01', 'synonyms': ['chickpea', 'garbanzo'], 'id': 242, 'def': 'the seed of the chickpea plant; usually dried', 'name': 'chickpea'}, {'frequency': 'c', 'synset': 'chili.n.02', 'synonyms': ['chili_(vegetable)', 'chili_pepper_(vegetable)', 'chilli_(vegetable)', 'chilly_(vegetable)', 'chile_(vegetable)'], 'id': 243, 'def': 'very hot and finely tapering pepper of special pungency', 'name': 'chili_(vegetable)'}, {'frequency': 'r', 'synset': 'chime.n.01', 'synonyms': ['chime', 'gong'], 'id': 244, 'def': 'an instrument consisting of a set of bells that are struck with a hammer', 'name': 'chime'}, {'frequency': 'r', 'synset': 'chinaware.n.01', 'synonyms': ['chinaware'], 'id': 245, 'def': 'dishware made of high quality porcelain', 'name': 'chinaware'}, {'frequency': 'c', 'synset': 'chip.n.04', 'synonyms': ['crisp_(potato_chip)', 'potato_chip'], 'id': 246, 'def': 'a thin crisp slice of potato fried in deep fat', 'name': 'crisp_(potato_chip)'}, {'frequency': 'r', 'synset': 'chip.n.06', 'synonyms': ['poker_chip'], 'id': 247, 'def': 'a small disk-shaped counter used to represent money when gambling', 'name': 'poker_chip'}, {'frequency': 'c', 'synset': 'chocolate_bar.n.01', 'synonyms': ['chocolate_bar'], 'id': 248, 'def': 'a bar of chocolate candy', 'name': 'chocolate_bar'}, {'frequency': 'c', 'synset': 'chocolate_cake.n.01', 'synonyms': ['chocolate_cake'], 'id': 249, 'def': 'cake containing chocolate', 'name': 'chocolate_cake'}, {'frequency': 'r', 'synset': 'chocolate_milk.n.01', 'synonyms': ['chocolate_milk'], 'id': 250, 'def': 'milk flavored with chocolate syrup', 'name': 'chocolate_milk'}, {'frequency': 'r', 'synset': 'chocolate_mousse.n.01', 'synonyms': ['chocolate_mousse'], 'id': 251, 'def': 'dessert mousse made with chocolate', 'name': 'chocolate_mousse'}, {'frequency': 'f', 
'synset': 'choker.n.03', 'synonyms': ['choker', 'collar', 'neckband'], 'id': 252, 'def': 'shirt collar, animal collar, or tight-fitting necklace', 'name': 'choker'}, {'frequency': 'f', 'synset': 'chopping_board.n.01', 'synonyms': ['chopping_board', 'cutting_board', 'chopping_block'], 'id': 253, 'def': 'a wooden board where meats or vegetables can be cut', 'name': 'chopping_board'}, {'frequency': 'f', 'synset': 'chopstick.n.01', 'synonyms': ['chopstick'], 'id': 254, 'def': 'one of a pair of slender sticks used as oriental tableware to eat food with', 'name': 'chopstick'}, {'frequency': 'f', 'synset': 'christmas_tree.n.05', 'synonyms': ['Christmas_tree'], 'id': 255, 'def': 'an ornamented evergreen used as a Christmas decoration', 'name': 'Christmas_tree'}, {'frequency': 'c', 'synset': 'chute.n.02', 'synonyms': ['slide'], 'id': 256, 'def': 'sloping channel through which things can descend', 'name': 'slide'}, {'frequency': 'r', 'synset': 'cider.n.01', 'synonyms': ['cider', 'cyder'], 'id': 257, 'def': 'a beverage made from juice pressed from apples', 'name': 'cider'}, {'frequency': 'r', 'synset': 'cigar_box.n.01', 'synonyms': ['cigar_box'], 'id': 258, 'def': 'a box for holding cigars', 'name': 'cigar_box'}, {'frequency': 'f', 'synset': 'cigarette.n.01', 'synonyms': ['cigarette'], 'id': 259, 'def': 'finely ground tobacco wrapped in paper; for smoking', 'name': 'cigarette'}, {'frequency': 'c', 'synset': 'cigarette_case.n.01', 'synonyms': ['cigarette_case', 'cigarette_pack'], 'id': 260, 'def': 'a small flat case for holding cigarettes', 'name': 'cigarette_case'}, {'frequency': 'f', 'synset': 'cistern.n.02', 'synonyms': ['cistern', 'water_tank'], 'id': 261, 'def': 'a tank that holds the water used to flush a toilet', 'name': 'cistern'}, {'frequency': 'r', 'synset': 'clarinet.n.01', 'synonyms': ['clarinet'], 'id': 262, 'def': 'a single-reed instrument with a straight tube', 'name': 'clarinet'}, {'frequency': 'c', 'synset': 'clasp.n.01', 'synonyms': ['clasp'], 'id': 263, 'def': 'a fastener (as a buckle or hook) that is used to hold two things together', 'name': 'clasp'}, {'frequency': 'c', 'synset': 'cleansing_agent.n.01', 'synonyms': ['cleansing_agent', 'cleanser', 'cleaner'], 'id': 264, 'def': 'a preparation used in cleaning something', 'name': 'cleansing_agent'}, {'frequency': 'r', 'synset': 'cleat.n.02', 'synonyms': ['cleat_(for_securing_rope)'], 'id': 265, 'def': 'a fastener (usually with two projecting horns) around which a rope can be secured', 'name': 'cleat_(for_securing_rope)'}, {'frequency': 'r', 'synset': 'clementine.n.01', 'synonyms': ['clementine'], 'id': 266, 'def': 'a variety of mandarin orange', 'name': 'clementine'}, {'frequency': 'c', 'synset': 'clip.n.03', 'synonyms': ['clip'], 'id': 267, 'def': 'any of various small fasteners used to hold loose articles together', 'name': 'clip'}, {'frequency': 'c', 'synset': 'clipboard.n.01', 'synonyms': ['clipboard'], 'id': 268, 'def': 'a small writing board with a clip at the top for holding papers', 'name': 'clipboard'}, {'frequency': 'r', 'synset': 'clipper.n.03', 'synonyms': ['clippers_(for_plants)'], 'id': 269, 'def': 'shears for cutting grass or shrubbery (often used in the plural)', 'name': 'clippers_(for_plants)'}, {'frequency': 'r', 'synset': 'cloak.n.02', 'synonyms': ['cloak'], 'id': 270, 'def': 'a loose outer garment', 'name': 'cloak'}, {'frequency': 'f', 'synset': 'clock.n.01', 'synonyms': ['clock', 'timepiece', 'timekeeper'], 'id': 271, 'def': 'a timepiece that shows the time of day', 'name': 'clock'}, {'frequency': 'f', 'synset': 
'clock_tower.n.01', 'synonyms': ['clock_tower'], 'id': 272, 'def': 'a tower with a large clock visible high up on an outside face', 'name': 'clock_tower'}, {'frequency': 'c', 'synset': 'clothes_hamper.n.01', 'synonyms': ['clothes_hamper', 'laundry_basket', 'clothes_basket'], 'id': 273, 'def': 'a hamper that holds dirty clothes to be washed or wet clothes to be dried', 'name': 'clothes_hamper'}, {'frequency': 'c', 'synset': 'clothespin.n.01', 'synonyms': ['clothespin', 'clothes_peg'], 'id': 274, 'def': 'wood or plastic fastener; for holding clothes on a clothesline', 'name': 'clothespin'}, {'frequency': 'r', 'synset': 'clutch_bag.n.01', 'synonyms': ['clutch_bag'], 'id': 275, 'def': "a woman's strapless purse that is carried in the hand", 'name': 'clutch_bag'}, {'frequency': 'f', 'synset': 'coaster.n.03', 'synonyms': ['coaster'], 'id': 276, 'def': 'a covering (plate or mat) that protects the surface of a table', 'name': 'coaster'}, {'frequency': 'f', 'synset': 'coat.n.01', 'synonyms': ['coat'], 'id': 277, 'def': 'an outer garment that has sleeves and covers the body from shoulder down', 'name': 'coat'}, {'frequency': 'c', 'synset': 'coat_hanger.n.01', 'synonyms': ['coat_hanger', 'clothes_hanger', 'dress_hanger'], 'id': 278, 'def': "a hanger that is shaped like a person's shoulders", 'name': 'coat_hanger'}, {'frequency': 'c', 'synset': 'coatrack.n.01', 'synonyms': ['coatrack', 'hatrack'], 'id': 279, 'def': 'a rack with hooks for temporarily holding coats and hats', 'name': 'coatrack'}, {'frequency': 'c', 'synset': 'cock.n.04', 'synonyms': ['cock', 'rooster'], 'id': 280, 'def': 'adult male chicken', 'name': 'cock'}, {'frequency': 'r', 'synset': 'cockroach.n.01', 'synonyms': ['cockroach'], 'id': 281, 'def': 'any of numerous chiefly nocturnal insects; some are domestic pests', 'name': 'cockroach'}, {'frequency': 'r', 'synset': 'cocoa.n.01', 'synonyms': ['cocoa_(beverage)', 'hot_chocolate_(beverage)', 'drinking_chocolate'], 'id': 282, 'def': 'a beverage made from cocoa powder and milk and sugar; usually drunk hot', 'name': 'cocoa_(beverage)'}, {'frequency': 'c', 'synset': 'coconut.n.02', 'synonyms': ['coconut', 'cocoanut'], 'id': 283, 'def': 'large hard-shelled brown oval nut with a fibrous husk', 'name': 'coconut'}, {'frequency': 'f', 'synset': 'coffee_maker.n.01', 'synonyms': ['coffee_maker', 'coffee_machine'], 'id': 284, 'def': 'a kitchen appliance for brewing coffee automatically', 'name': 'coffee_maker'}, {'frequency': 'f', 'synset': 'coffee_table.n.01', 'synonyms': ['coffee_table', 'cocktail_table'], 'id': 285, 'def': 'low table where magazines can be placed and coffee or cocktails are served', 'name': 'coffee_table'}, {'frequency': 'c', 'synset': 'coffeepot.n.01', 'synonyms': ['coffeepot'], 'id': 286, 'def': 'tall pot in which coffee is brewed', 'name': 'coffeepot'}, {'frequency': 'r', 'synset': 'coil.n.05', 'synonyms': ['coil'], 'id': 287, 'def': 'tubing that is wound in a spiral', 'name': 'coil'}, {'frequency': 'c', 'synset': 'coin.n.01', 'synonyms': ['coin'], 'id': 288, 'def': 'a flat metal piece (usually a disc) used as money', 'name': 'coin'}, {'frequency': 'c', 'synset': 'colander.n.01', 'synonyms': ['colander', 'cullender'], 'id': 289, 'def': 'bowl-shaped strainer; used to wash or drain foods', 'name': 'colander'}, {'frequency': 'c', 'synset': 'coleslaw.n.01', 'synonyms': ['coleslaw', 'slaw'], 'id': 290, 'def': 'basically shredded cabbage', 'name': 'coleslaw'}, {'frequency': 'r', 'synset': 'coloring_material.n.01', 'synonyms': ['coloring_material', 'colouring_material'], 'id': 291, 
'def': 'any material used for its color', 'name': 'coloring_material'}, {'frequency': 'r', 'synset': 'combination_lock.n.01', 'synonyms': ['combination_lock'], 'id': 292, 'def': 'lock that can be opened only by turning dials in a special sequence', 'name': 'combination_lock'}, {'frequency': 'c', 'synset': 'comforter.n.04', 'synonyms': ['pacifier', 'teething_ring'], 'id': 293, 'def': 'device used for an infant to suck or bite on', 'name': 'pacifier'}, {'frequency': 'r', 'synset': 'comic_book.n.01', 'synonyms': ['comic_book'], 'id': 294, 'def': 'a magazine devoted to comic strips', 'name': 'comic_book'}, {'frequency': 'r', 'synset': 'compass.n.01', 'synonyms': ['compass'], 'id': 295, 'def': 'navigational instrument for finding directions', 'name': 'compass'}, {'frequency': 'f', 'synset': 'computer_keyboard.n.01', 'synonyms': ['computer_keyboard', 'keyboard_(computer)'], 'id': 296, 'def': 'a keyboard that is a data input device for computers', 'name': 'computer_keyboard'}, {'frequency': 'f', 'synset': 'condiment.n.01', 'synonyms': ['condiment'], 'id': 297, 'def': 'a preparation (a sauce or relish or spice) to enhance flavor or enjoyment', 'name': 'condiment'}, {'frequency': 'f', 'synset': 'cone.n.01', 'synonyms': ['cone', 'traffic_cone'], 'id': 298, 'def': 'a cone-shaped object used to direct traffic', 'name': 'cone'}, {'frequency': 'f', 'synset': 'control.n.09', 'synonyms': ['control', 'controller'], 'id': 299, 'def': 'a mechanism that controls the operation of a machine', 'name': 'control'}, {'frequency': 'r', 'synset': 'convertible.n.01', 'synonyms': ['convertible_(automobile)'], 'id': 300, 'def': 'a car that has top that can be folded or removed', 'name': 'convertible_(automobile)'}, {'frequency': 'r', 'synset': 'convertible.n.03', 'synonyms': ['sofa_bed'], 'id': 301, 'def': 'a sofa that can be converted into a bed', 'name': 'sofa_bed'}, {'frequency': 'r', 'synset': 'cooker.n.01', 'synonyms': ['cooker'], 'id': 302, 'def': 'a utensil for cooking', 'name': 'cooker'}, {'frequency': 'f', 'synset': 'cookie.n.01', 'synonyms': ['cookie', 'cooky', 'biscuit_(cookie)'], 'id': 303, 'def': "any of various small flat sweet cakes (`biscuit' is the British term)", 'name': 'cookie'}, {'frequency': 'r', 'synset': 'cooking_utensil.n.01', 'synonyms': ['cooking_utensil'], 'id': 304, 'def': 'a kitchen utensil made of material that does not melt easily; used for cooking', 'name': 'cooking_utensil'}, {'frequency': 'f', 'synset': 'cooler.n.01', 'synonyms': ['cooler_(for_food)', 'ice_chest'], 'id': 305, 'def': 'an insulated box for storing food often with ice', 'name': 'cooler_(for_food)'}, {'frequency': 'f', 'synset': 'cork.n.04', 'synonyms': ['cork_(bottle_plug)', 'bottle_cork'], 'id': 306, 'def': 'the plug in the mouth of a bottle (especially a wine bottle)', 'name': 'cork_(bottle_plug)'}, {'frequency': 'r', 'synset': 'corkboard.n.01', 'synonyms': ['corkboard'], 'id': 307, 'def': 'a sheet consisting of cork granules', 'name': 'corkboard'}, {'frequency': 'c', 'synset': 'corkscrew.n.01', 'synonyms': ['corkscrew', 'bottle_screw'], 'id': 308, 'def': 'a bottle opener that pulls corks', 'name': 'corkscrew'}, {'frequency': 'f', 'synset': 'corn.n.03', 'synonyms': ['edible_corn', 'corn', 'maize'], 'id': 309, 'def': 'ears or kernels of corn that can be prepared and served for human food (only mark individual ears or kernels)', 'name': 'edible_corn'}, {'frequency': 'r', 'synset': 'cornbread.n.01', 'synonyms': ['cornbread'], 'id': 310, 'def': 'bread made primarily of cornmeal', 'name': 'cornbread'}, {'frequency': 'c', 
'synset': 'cornet.n.01', 'synonyms': ['cornet', 'horn', 'trumpet'], 'id': 311, 'def': 'a brass musical instrument with a narrow tube and a flared bell and many valves', 'name': 'cornet'}, {'frequency': 'c', 'synset': 'cornice.n.01', 'synonyms': ['cornice', 'valance', 'valance_board', 'pelmet'], 'id': 312, 'def': 'a decorative framework to conceal curtain fixtures at the top of a window casing', 'name': 'cornice'}, {'frequency': 'r', 'synset': 'cornmeal.n.01', 'synonyms': ['cornmeal'], 'id': 313, 'def': 'coarsely ground corn', 'name': 'cornmeal'}, {'frequency': 'c', 'synset': 'corset.n.01', 'synonyms': ['corset', 'girdle'], 'id': 314, 'def': "a woman's close-fitting foundation garment", 'name': 'corset'}, {'frequency': 'c', 'synset': 'costume.n.04', 'synonyms': ['costume'], 'id': 315, 'def': 'the attire characteristic of a country or a time or a social class', 'name': 'costume'}, {'frequency': 'r', 'synset': 'cougar.n.01', 'synonyms': ['cougar', 'puma', 'catamount', 'mountain_lion', 'panther'], 'id': 316, 'def': 'large American feline resembling a lion', 'name': 'cougar'}, {'frequency': 'r', 'synset': 'coverall.n.01', 'synonyms': ['coverall'], 'id': 317, 'def': 'a loose-fitting protective garment that is worn over other clothing', 'name': 'coverall'}, {'frequency': 'c', 'synset': 'cowbell.n.01', 'synonyms': ['cowbell'], 'id': 318, 'def': 'a bell hung around the neck of a cow so that the cow can be easily located', 'name': 'cowbell'}, {'frequency': 'f', 'synset': 'cowboy_hat.n.01', 'synonyms': ['cowboy_hat', 'ten-gallon_hat'], 'id': 319, 'def': 'a hat with a wide brim and a soft crown; worn by American ranch hands', 'name': 'cowboy_hat'}, {'frequency': 'c', 'synset': 'crab.n.01', 'synonyms': ['crab_(animal)'], 'id': 320, 'def': 'decapod having eyes on short stalks and a broad flattened shell and pincers', 'name': 'crab_(animal)'}, {'frequency': 'r', 'synset': 'crab.n.05', 'synonyms': ['crabmeat'], 'id': 321, 'def': 'the edible flesh of any of various crabs', 'name': 'crabmeat'}, {'frequency': 'c', 'synset': 'cracker.n.01', 'synonyms': ['cracker'], 'id': 322, 'def': 'a thin crisp wafer', 'name': 'cracker'}, {'frequency': 'r', 'synset': 'crape.n.01', 'synonyms': ['crape', 'crepe', 'French_pancake'], 'id': 323, 'def': 'small very thin pancake', 'name': 'crape'}, {'frequency': 'f', 'synset': 'crate.n.01', 'synonyms': ['crate'], 'id': 324, 'def': 'a rugged box (usually made of wood); used for shipping', 'name': 'crate'}, {'frequency': 'c', 'synset': 'crayon.n.01', 'synonyms': ['crayon', 'wax_crayon'], 'id': 325, 'def': 'writing or drawing implement made of a colored stick of composition wax', 'name': 'crayon'}, {'frequency': 'r', 'synset': 'cream_pitcher.n.01', 'synonyms': ['cream_pitcher'], 'id': 326, 'def': 'a small pitcher for serving cream', 'name': 'cream_pitcher'}, {'frequency': 'c', 'synset': 'crescent_roll.n.01', 'synonyms': ['crescent_roll', 'croissant'], 'id': 327, 'def': 'very rich flaky crescent-shaped roll', 'name': 'crescent_roll'}, {'frequency': 'c', 'synset': 'crib.n.01', 'synonyms': ['crib', 'cot'], 'id': 328, 'def': 'baby bed with high sides made of slats', 'name': 'crib'}, {'frequency': 'c', 'synset': 'crock.n.03', 'synonyms': ['crock_pot', 'earthenware_jar'], 'id': 329, 'def': 'an earthen jar (made of baked clay) or a modern electric crockpot', 'name': 'crock_pot'}, {'frequency': 'f', 'synset': 'crossbar.n.01', 'synonyms': ['crossbar'], 'id': 330, 'def': 'a horizontal bar that goes across something', 'name': 'crossbar'}, {'frequency': 'r', 'synset': 'crouton.n.01', 'synonyms': 
['crouton'], 'id': 331, 'def': 'a small piece of toasted or fried bread; served in soup or salads', 'name': 'crouton'}, {'frequency': 'c', 'synset': 'crow.n.01', 'synonyms': ['crow'], 'id': 332, 'def': 'black birds having a raucous call', 'name': 'crow'}, {'frequency': 'r', 'synset': 'crowbar.n.01', 'synonyms': ['crowbar', 'wrecking_bar', 'pry_bar'], 'id': 333, 'def': 'a heavy iron lever with one end forged into a wedge', 'name': 'crowbar'}, {'frequency': 'c', 'synset': 'crown.n.04', 'synonyms': ['crown'], 'id': 334, 'def': 'an ornamental jeweled headdress signifying sovereignty', 'name': 'crown'}, {'frequency': 'c', 'synset': 'crucifix.n.01', 'synonyms': ['crucifix'], 'id': 335, 'def': 'representation of the cross on which Jesus died', 'name': 'crucifix'}, {'frequency': 'c', 'synset': 'cruise_ship.n.01', 'synonyms': ['cruise_ship', 'cruise_liner'], 'id': 336, 'def': 'a passenger ship used commercially for pleasure cruises', 'name': 'cruise_ship'}, {'frequency': 'c', 'synset': 'cruiser.n.01', 'synonyms': ['police_cruiser', 'patrol_car', 'police_car', 'squad_car'], 'id': 337, 'def': 'a car in which policemen cruise the streets', 'name': 'police_cruiser'}, {'frequency': 'f', 'synset': 'crumb.n.03', 'synonyms': ['crumb'], 'id': 338, 'def': 'small piece of e.g. bread or cake', 'name': 'crumb'}, {'frequency': 'c', 'synset': 'crutch.n.01', 'synonyms': ['crutch'], 'id': 339, 'def': 'a wooden or metal staff that fits under the armpit and reaches to the ground', 'name': 'crutch'}, {'frequency': 'c', 'synset': 'cub.n.03', 'synonyms': ['cub_(animal)'], 'id': 340, 'def': 'the young of certain carnivorous mammals such as the bear or wolf or lion', 'name': 'cub_(animal)'}, {'frequency': 'c', 'synset': 'cube.n.05', 'synonyms': ['cube', 'square_block'], 'id': 341, 'def': 'a block in the (approximate) shape of a cube', 'name': 'cube'}, {'frequency': 'f', 'synset': 'cucumber.n.02', 'synonyms': ['cucumber', 'cuke'], 'id': 342, 'def': 'cylindrical green fruit with thin green rind and white flesh eaten as a vegetable', 'name': 'cucumber'}, {'frequency': 'c', 'synset': 'cufflink.n.01', 'synonyms': ['cufflink'], 'id': 343, 'def': 'jewelry consisting of linked buttons used to fasten the cuffs of a shirt', 'name': 'cufflink'}, {'frequency': 'f', 'synset': 'cup.n.01', 'synonyms': ['cup'], 'id': 344, 'def': 'a small open container usually used for drinking; usually has a handle', 'name': 'cup'}, {'frequency': 'c', 'synset': 'cup.n.08', 'synonyms': ['trophy_cup'], 'id': 345, 'def': 'a metal award or cup-shaped vessel with handles that is awarded as a trophy to a competition winner', 'name': 'trophy_cup'}, {'frequency': 'f', 'synset': 'cupboard.n.01', 'synonyms': ['cupboard', 'closet'], 'id': 346, 'def': 'a small room (or recess) or cabinet used for storage space', 'name': 'cupboard'}, {'frequency': 'f', 'synset': 'cupcake.n.01', 'synonyms': ['cupcake'], 'id': 347, 'def': 'small cake baked in a muffin tin', 'name': 'cupcake'}, {'frequency': 'r', 'synset': 'curler.n.01', 'synonyms': ['hair_curler', 'hair_roller', 'hair_crimper'], 'id': 348, 'def': 'a cylindrical tube around which the hair is wound to curl it', 'name': 'hair_curler'}, {'frequency': 'r', 'synset': 'curling_iron.n.01', 'synonyms': ['curling_iron'], 'id': 349, 'def': 'a cylindrical home appliance that heats hair that has been curled around it', 'name': 'curling_iron'}, {'frequency': 'f', 'synset': 'curtain.n.01', 'synonyms': ['curtain', 'drapery'], 'id': 350, 'def': 'hanging cloth used as a blind (especially for a window)', 'name': 'curtain'}, 
{'frequency': 'f', 'synset': 'cushion.n.03', 'synonyms': ['cushion'], 'id': 351, 'def': 'a soft bag filled with air or padding such as feathers or foam rubber', 'name': 'cushion'}, {'frequency': 'r', 'synset': 'cylinder.n.04', 'synonyms': ['cylinder'], 'id': 352, 'def': 'a cylindrical container', 'name': 'cylinder'}, {'frequency': 'r', 'synset': 'cymbal.n.01', 'synonyms': ['cymbal'], 'id': 353, 'def': 'a percussion instrument consisting of a concave brass disk', 'name': 'cymbal'}, {'frequency': 'r', 'synset': 'dagger.n.01', 'synonyms': ['dagger'], 'id': 354, 'def': 'a short knife with a pointed blade used for piercing or stabbing', 'name': 'dagger'}, {'frequency': 'r', 'synset': 'dalmatian.n.02', 'synonyms': ['dalmatian'], 'id': 355, 'def': 'a large breed having a smooth white coat with black or brown spots', 'name': 'dalmatian'}, {'frequency': 'c', 'synset': 'dartboard.n.01', 'synonyms': ['dartboard'], 'id': 356, 'def': 'a circular board of wood or cork used as the target in the game of darts', 'name': 'dartboard'}, {'frequency': 'r', 'synset': 'date.n.08', 'synonyms': ['date_(fruit)'], 'id': 357, 'def': 'sweet edible fruit of the date palm with a single long woody seed', 'name': 'date_(fruit)'}, {'frequency': 'f', 'synset': 'deck_chair.n.01', 'synonyms': ['deck_chair', 'beach_chair'], 'id': 358, 'def': 'a folding chair for use outdoors; a wooden frame supports a length of canvas', 'name': 'deck_chair'}, {'frequency': 'c', 'synset': 'deer.n.01', 'synonyms': ['deer', 'cervid'], 'id': 359, 'def': "distinguished from Bovidae by the male's having solid deciduous antlers", 'name': 'deer'}, {'frequency': 'c', 'synset': 'dental_floss.n.01', 'synonyms': ['dental_floss', 'floss'], 'id': 360, 'def': 'a soft thread for cleaning the spaces between the teeth', 'name': 'dental_floss'}, {'frequency': 'f', 'synset': 'desk.n.01', 'synonyms': ['desk'], 'id': 361, 'def': 'a piece of furniture with a writing surface and usually drawers or other compartments', 'name': 'desk'}, {'frequency': 'r', 'synset': 'detergent.n.01', 'synonyms': ['detergent'], 'id': 362, 'def': 'a surface-active chemical widely used in industry and laundering', 'name': 'detergent'}, {'frequency': 'c', 'synset': 'diaper.n.01', 'synonyms': ['diaper'], 'id': 363, 'def': 'garment consisting of a folded cloth drawn up between the legs and fastened at the waist', 'name': 'diaper'}, {'frequency': 'r', 'synset': 'diary.n.01', 'synonyms': ['diary', 'journal'], 'id': 364, 'def': 'yearly planner book', 'name': 'diary'}, {'frequency': 'r', 'synset': 'die.n.01', 'synonyms': ['die', 'dice'], 'id': 365, 'def': 'a small cube with 1 to 6 spots on the six faces; used in gambling', 'name': 'die'}, {'frequency': 'r', 'synset': 'dinghy.n.01', 'synonyms': ['dinghy', 'dory', 'rowboat'], 'id': 366, 'def': 'a small boat of shallow draft with seats and oars with which it is propelled', 'name': 'dinghy'}, {'frequency': 'f', 'synset': 'dining_table.n.01', 'synonyms': ['dining_table'], 'id': 367, 'def': 'a table at which meals are served', 'name': 'dining_table'}, {'frequency': 'r', 'synset': 'dinner_jacket.n.01', 'synonyms': ['tux', 'tuxedo'], 'id': 368, 'def': 'semiformal evening dress for men', 'name': 'tux'}, {'frequency': 'f', 'synset': 'dish.n.01', 'synonyms': ['dish'], 'id': 369, 'def': 'a piece of dishware normally used as a container for holding or serving food', 'name': 'dish'}, {'frequency': 'c', 'synset': 'dish.n.05', 'synonyms': ['dish_antenna'], 'id': 370, 'def': 'directional antenna consisting of a parabolic reflector', 'name': 'dish_antenna'}, 
{'frequency': 'c', 'synset': 'dishrag.n.01', 'synonyms': ['dishrag', 'dishcloth'], 'id': 371, 'def': 'a cloth for washing dishes or cleaning in general', 'name': 'dishrag'}, {'frequency': 'f', 'synset': 'dishtowel.n.01', 'synonyms': ['dishtowel', 'tea_towel'], 'id': 372, 'def': 'a towel for drying dishes', 'name': 'dishtowel'}, {'frequency': 'f', 'synset': 'dishwasher.n.01', 'synonyms': ['dishwasher', 'dishwashing_machine'], 'id': 373, 'def': 'a machine for washing dishes', 'name': 'dishwasher'}, {'frequency': 'r', 'synset': 'dishwasher_detergent.n.01', 'synonyms': ['dishwasher_detergent', 'dishwashing_detergent', 'dishwashing_liquid', 'dishsoap'], 'id': 374, 'def': 'dishsoap or dish detergent designed for use in dishwashers', 'name': 'dishwasher_detergent'}, {'frequency': 'f', 'synset': 'dispenser.n.01', 'synonyms': ['dispenser'], 'id': 375, 'def': 'a container so designed that the contents can be used in prescribed amounts', 'name': 'dispenser'}, {'frequency': 'r', 'synset': 'diving_board.n.01', 'synonyms': ['diving_board'], 'id': 376, 'def': 'a springboard from which swimmers can dive', 'name': 'diving_board'}, {'frequency': 'f', 'synset': 'dixie_cup.n.01', 'synonyms': ['Dixie_cup', 'paper_cup'], 'id': 377, 'def': 'a disposable cup made of paper; for holding drinks', 'name': 'Dixie_cup'}, {'frequency': 'f', 'synset': 'dog.n.01', 'synonyms': ['dog'], 'id': 378, 'def': 'a common domesticated dog', 'name': 'dog'}, {'frequency': 'f', 'synset': 'dog_collar.n.01', 'synonyms': ['dog_collar'], 'id': 379, 'def': 'a collar for a dog', 'name': 'dog_collar'}, {'frequency': 'f', 'synset': 'doll.n.01', 'synonyms': ['doll'], 'id': 380, 'def': 'a toy replica of a HUMAN (NOT AN ANIMAL)', 'name': 'doll'}, {'frequency': 'r', 'synset': 'dollar.n.02', 'synonyms': ['dollar', 'dollar_bill', 'one_dollar_bill'], 'id': 381, 'def': 'a piece of paper money worth one dollar', 'name': 'dollar'}, {'frequency': 'r', 'synset': 'dollhouse.n.01', 'synonyms': ['dollhouse', "doll's_house"], 'id': 382, 'def': "a house so small that it is likened to a child's plaything", 'name': 'dollhouse'}, {'frequency': 'c', 'synset': 'dolphin.n.02', 'synonyms': ['dolphin'], 'id': 383, 'def': 'any of various small toothed whales with a beaklike snout; larger than porpoises', 'name': 'dolphin'}, {'frequency': 'c', 'synset': 'domestic_ass.n.01', 'synonyms': ['domestic_ass', 'donkey'], 'id': 384, 'def': 'domestic beast of burden descended from the African wild ass; patient but stubborn', 'name': 'domestic_ass'}, {'frequency': 'f', 'synset': 'doorknob.n.01', 'synonyms': ['doorknob', 'doorhandle'], 'id': 385, 'def': "a knob used to open a door (often called `doorhandle' in Great Britain)", 'name': 'doorknob'}, {'frequency': 'c', 'synset': 'doormat.n.02', 'synonyms': ['doormat', 'welcome_mat'], 'id': 386, 'def': 'a mat placed outside an exterior door for wiping the shoes before entering', 'name': 'doormat'}, {'frequency': 'f', 'synset': 'doughnut.n.02', 'synonyms': ['doughnut', 'donut'], 'id': 387, 'def': 'a small ring-shaped friedcake', 'name': 'doughnut'}, {'frequency': 'r', 'synset': 'dove.n.01', 'synonyms': ['dove'], 'id': 388, 'def': 'any of numerous small pigeons', 'name': 'dove'}, {'frequency': 'r', 'synset': 'dragonfly.n.01', 'synonyms': ['dragonfly'], 'id': 389, 'def': 'slender-bodied non-stinging insect having iridescent wings that are outspread at rest', 'name': 'dragonfly'}, {'frequency': 'f', 'synset': 'drawer.n.01', 'synonyms': ['drawer'], 'id': 390, 'def': 'a boxlike container in a piece of furniture; made so as to slide in and 
out', 'name': 'drawer'}, {'frequency': 'c', 'synset': 'drawers.n.01', 'synonyms': ['underdrawers', 'boxers', 'boxershorts'], 'id': 391, 'def': 'underpants worn by men', 'name': 'underdrawers'}, {'frequency': 'f', 'synset': 'dress.n.01', 'synonyms': ['dress', 'frock'], 'id': 392, 'def': 'a one-piece garment for a woman; has skirt and bodice', 'name': 'dress'}, {'frequency': 'c', 'synset': 'dress_hat.n.01', 'synonyms': ['dress_hat', 'high_hat', 'opera_hat', 'silk_hat', 'top_hat'], 'id': 393, 'def': "a man's hat with a tall crown; usually covered with silk or with beaver fur", 'name': 'dress_hat'}, {'frequency': 'f', 'synset': 'dress_suit.n.01', 'synonyms': ['dress_suit'], 'id': 394, 'def': 'formalwear consisting of full evening dress for men', 'name': 'dress_suit'}, {'frequency': 'f', 'synset': 'dresser.n.05', 'synonyms': ['dresser'], 'id': 395, 'def': 'a cabinet with shelves', 'name': 'dresser'}, {'frequency': 'c', 'synset': 'drill.n.01', 'synonyms': ['drill'], 'id': 396, 'def': 'a tool with a sharp rotating point for making holes in hard materials', 'name': 'drill'}, {'frequency': 'r', 'synset': 'drone.n.04', 'synonyms': ['drone'], 'id': 397, 'def': 'an aircraft without a pilot that is operated by remote control', 'name': 'drone'}, {'frequency': 'r', 'synset': 'dropper.n.01', 'synonyms': ['dropper', 'eye_dropper'], 'id': 398, 'def': 'pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time', 'name': 'dropper'}, {'frequency': 'c', 'synset': 'drum.n.01', 'synonyms': ['drum_(musical_instrument)'], 'id': 399, 'def': 'a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end', 'name': 'drum_(musical_instrument)'}, {'frequency': 'r', 'synset': 'drumstick.n.02', 'synonyms': ['drumstick'], 'id': 400, 'def': 'a stick used for playing a drum', 'name': 'drumstick'}, {'frequency': 'f', 'synset': 'duck.n.01', 'synonyms': ['duck'], 'id': 401, 'def': 'small web-footed broad-billed swimming bird', 'name': 'duck'}, {'frequency': 'c', 'synset': 'duckling.n.02', 'synonyms': ['duckling'], 'id': 402, 'def': 'young duck', 'name': 'duckling'}, {'frequency': 'c', 'synset': 'duct_tape.n.01', 'synonyms': ['duct_tape'], 'id': 403, 'def': 'a wide silvery adhesive tape', 'name': 'duct_tape'}, {'frequency': 'f', 'synset': 'duffel_bag.n.01', 'synonyms': ['duffel_bag', 'duffle_bag', 'duffel', 'duffle'], 'id': 404, 'def': 'a large cylindrical bag of heavy cloth (does not include suitcases)', 'name': 'duffel_bag'}, {'frequency': 'r', 'synset': 'dumbbell.n.01', 'synonyms': ['dumbbell'], 'id': 405, 'def': 'an exercising weight with two ball-like ends connected by a short handle', 'name': 'dumbbell'}, {'frequency': 'c', 'synset': 'dumpster.n.01', 'synonyms': ['dumpster'], 'id': 406, 'def': 'a container designed to receive and transport and dump waste', 'name': 'dumpster'}, {'frequency': 'r', 'synset': 'dustpan.n.02', 'synonyms': ['dustpan'], 'id': 407, 'def': 'a short-handled receptacle into which dust can be swept', 'name': 'dustpan'}, {'frequency': 'c', 'synset': 'eagle.n.01', 'synonyms': ['eagle'], 'id': 408, 'def': 'large birds of prey noted for their broad wings and strong soaring flight', 'name': 'eagle'}, {'frequency': 'f', 'synset': 'earphone.n.01', 'synonyms': ['earphone', 'earpiece', 'headphone'], 'id': 409, 'def': 'device for listening to audio that is held over or inserted into the ear', 'name': 'earphone'}, {'frequency': 'r', 'synset': 'earplug.n.01', 'synonyms': ['earplug'], 'id': 410, 'def': 'a 
soft plug that is inserted into the ear canal to block sound', 'name': 'earplug'}, {'frequency': 'f', 'synset': 'earring.n.01', 'synonyms': ['earring'], 'id': 411, 'def': 'jewelry to ornament the ear', 'name': 'earring'}, {'frequency': 'c', 'synset': 'easel.n.01', 'synonyms': ['easel'], 'id': 412, 'def': "an upright tripod for displaying something (usually an artist's canvas)", 'name': 'easel'}, {'frequency': 'r', 'synset': 'eclair.n.01', 'synonyms': ['eclair'], 'id': 413, 'def': 'oblong cream puff', 'name': 'eclair'}, {'frequency': 'r', 'synset': 'eel.n.01', 'synonyms': ['eel'], 'id': 414, 'def': 'an elongate fish with fatty flesh', 'name': 'eel'}, {'frequency': 'f', 'synset': 'egg.n.02', 'synonyms': ['egg', 'eggs'], 'id': 415, 'def': 'oval reproductive body of a fowl (especially a hen) used as food', 'name': 'egg'}, {'frequency': 'r', 'synset': 'egg_roll.n.01', 'synonyms': ['egg_roll', 'spring_roll'], 'id': 416, 'def': 'minced vegetables and meat wrapped in a pancake and fried', 'name': 'egg_roll'}, {'frequency': 'c', 'synset': 'egg_yolk.n.01', 'synonyms': ['egg_yolk', 'yolk_(egg)'], 'id': 417, 'def': 'the yellow spherical part of an egg', 'name': 'egg_yolk'}, {'frequency': 'c', 'synset': 'eggbeater.n.02', 'synonyms': ['eggbeater', 'eggwhisk'], 'id': 418, 'def': 'a mixer for beating eggs or whipping cream', 'name': 'eggbeater'}, {'frequency': 'c', 'synset': 'eggplant.n.01', 'synonyms': ['eggplant', 'aubergine'], 'id': 419, 'def': 'egg-shaped vegetable having a shiny skin typically dark purple', 'name': 'eggplant'}, {'frequency': 'r', 'synset': 'electric_chair.n.01', 'synonyms': ['electric_chair'], 'id': 420, 'def': 'a chair-shaped instrument of execution by electrocution', 'name': 'electric_chair'}, {'frequency': 'f', 'synset': 'electric_refrigerator.n.01', 'synonyms': ['refrigerator'], 'id': 421, 'def': 'a refrigerator in which the coolant is pumped around by an electric motor', 'name': 'refrigerator'}, {'frequency': 'f', 'synset': 'elephant.n.01', 'synonyms': ['elephant'], 'id': 422, 'def': 'a common elephant', 'name': 'elephant'}, {'frequency': 'c', 'synset': 'elk.n.01', 'synonyms': ['elk', 'moose'], 'id': 423, 'def': 'large northern deer with enormous flattened antlers in the male', 'name': 'elk'}, {'frequency': 'c', 'synset': 'envelope.n.01', 'synonyms': ['envelope'], 'id': 424, 'def': 'a flat (usually rectangular) container for a letter, thin package, etc.', 'name': 'envelope'}, {'frequency': 'c', 'synset': 'eraser.n.01', 'synonyms': ['eraser'], 'id': 425, 'def': 'an implement used to erase something', 'name': 'eraser'}, {'frequency': 'r', 'synset': 'escargot.n.01', 'synonyms': ['escargot'], 'id': 426, 'def': 'edible snail usually served in the shell with a sauce of melted butter and garlic', 'name': 'escargot'}, {'frequency': 'r', 'synset': 'eyepatch.n.01', 'synonyms': ['eyepatch'], 'id': 427, 'def': 'a protective cloth covering for an injured eye', 'name': 'eyepatch'}, {'frequency': 'r', 'synset': 'falcon.n.01', 'synonyms': ['falcon'], 'id': 428, 'def': 'birds of prey having long pointed powerful wings adapted for swift flight', 'name': 'falcon'}, {'frequency': 'f', 'synset': 'fan.n.01', 'synonyms': ['fan'], 'id': 429, 'def': 'a device for creating a current of air by movement of a surface or surfaces', 'name': 'fan'}, {'frequency': 'f', 'synset': 'faucet.n.01', 'synonyms': ['faucet', 'spigot', 'tap'], 'id': 430, 'def': 'a regulator for controlling the flow of a liquid from a reservoir', 'name': 'faucet'}, {'frequency': 'r', 'synset': 'fedora.n.01', 'synonyms': ['fedora'], 'id': 
431, 'def': 'a hat made of felt with a creased crown', 'name': 'fedora'}, {'frequency': 'r', 'synset': 'ferret.n.02', 'synonyms': ['ferret'], 'id': 432, 'def': 'domesticated albino variety of the European polecat bred for hunting rats and rabbits', 'name': 'ferret'}, {'frequency': 'c', 'synset': 'ferris_wheel.n.01', 'synonyms': ['Ferris_wheel'], 'id': 433, 'def': 'a large wheel with suspended seats that remain upright as the wheel rotates', 'name': 'Ferris_wheel'}, {'frequency': 'c', 'synset': 'ferry.n.01', 'synonyms': ['ferry', 'ferryboat'], 'id': 434, 'def': 'a boat that transports people or vehicles across a body of water and operates on a regular schedule', 'name': 'ferry'}, {'frequency': 'r', 'synset': 'fig.n.04', 'synonyms': ['fig_(fruit)'], 'id': 435, 'def': 'fleshy sweet pear-shaped yellowish or purple fruit eaten fresh or preserved or dried', 'name': 'fig_(fruit)'}, {'frequency': 'c', 'synset': 'fighter.n.02', 'synonyms': ['fighter_jet', 'fighter_aircraft', 'attack_aircraft'], 'id': 436, 'def': 'a high-speed military or naval airplane designed to destroy enemy targets', 'name': 'fighter_jet'}, {'frequency': 'f', 'synset': 'figurine.n.01', 'synonyms': ['figurine'], 'id': 437, 'def': 'a small carved or molded figure', 'name': 'figurine'}, {'frequency': 'c', 'synset': 'file.n.03', 'synonyms': ['file_cabinet', 'filing_cabinet'], 'id': 438, 'def': 'office furniture consisting of a container for keeping papers in order', 'name': 'file_cabinet'}, {'frequency': 'r', 'synset': 'file.n.04', 'synonyms': ['file_(tool)'], 'id': 439, 'def': 'a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal', 'name': 'file_(tool)'}, {'frequency': 'f', 'synset': 'fire_alarm.n.02', 'synonyms': ['fire_alarm', 'smoke_alarm'], 'id': 440, 'def': 'an alarm that is tripped off by fire or smoke', 'name': 'fire_alarm'}, {'frequency': 'f', 'synset': 'fire_engine.n.01', 'synonyms': ['fire_engine', 'fire_truck'], 'id': 441, 'def': 'large trucks that carry firefighters and equipment to the site of a fire', 'name': 'fire_engine'}, {'frequency': 'f', 'synset': 'fire_extinguisher.n.01', 'synonyms': ['fire_extinguisher', 'extinguisher'], 'id': 442, 'def': 'a manually operated device for extinguishing small fires', 'name': 'fire_extinguisher'}, {'frequency': 'c', 'synset': 'fire_hose.n.01', 'synonyms': ['fire_hose'], 'id': 443, 'def': 'a large hose that carries water from a fire hydrant to the site of the fire', 'name': 'fire_hose'}, {'frequency': 'f', 'synset': 'fireplace.n.01', 'synonyms': ['fireplace'], 'id': 444, 'def': 'an open recess in a wall at the base of a chimney where a fire can be built', 'name': 'fireplace'}, {'frequency': 'f', 'synset': 'fireplug.n.01', 'synonyms': ['fireplug', 'fire_hydrant', 'hydrant'], 'id': 445, 'def': 'an upright hydrant for drawing water to use in fighting a fire', 'name': 'fireplug'}, {'frequency': 'r', 'synset': 'first-aid_kit.n.01', 'synonyms': ['first-aid_kit'], 'id': 446, 'def': 'kit consisting of a set of bandages and medicines for giving first aid', 'name': 'first-aid_kit'}, {'frequency': 'f', 'synset': 'fish.n.01', 'synonyms': ['fish'], 'id': 447, 'def': 'any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills', 'name': 'fish'}, {'frequency': 'c', 'synset': 'fish.n.02', 'synonyms': ['fish_(food)'], 'id': 448, 'def': 'the flesh of fish used as food', 'name': 'fish_(food)'}, {'frequency': 'r', 'synset': 'fishbowl.n.02', 'synonyms': ['fishbowl', 'goldfish_bowl'], 'id': 449, 'def': 'a 
transparent bowl in which small fish are kept', 'name': 'fishbowl'}, {'frequency': 'c', 'synset': 'fishing_rod.n.01', 'synonyms': ['fishing_rod', 'fishing_pole'], 'id': 450, 'def': 'a rod that is used in fishing to extend the fishing line', 'name': 'fishing_rod'}, {'frequency': 'f', 'synset': 'flag.n.01', 'synonyms': ['flag'], 'id': 451, 'def': 'emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)', 'name': 'flag'}, {'frequency': 'f', 'synset': 'flagpole.n.02', 'synonyms': ['flagpole', 'flagstaff'], 'id': 452, 'def': 'a tall staff or pole on which a flag is raised', 'name': 'flagpole'}, {'frequency': 'c', 'synset': 'flamingo.n.01', 'synonyms': ['flamingo'], 'id': 453, 'def': 'large pink web-footed bird with down-bent bill', 'name': 'flamingo'}, {'frequency': 'c', 'synset': 'flannel.n.01', 'synonyms': ['flannel'], 'id': 454, 'def': 'a soft light woolen fabric; used for clothing', 'name': 'flannel'}, {'frequency': 'c', 'synset': 'flap.n.01', 'synonyms': ['flap'], 'id': 455, 'def': 'any broad thin covering attached at one edge, such as a mud flap next to a wheel or a flap on an airplane wing', 'name': 'flap'}, {'frequency': 'r', 'synset': 'flash.n.10', 'synonyms': ['flash', 'flashbulb'], 'id': 456, 'def': 'a lamp for providing momentary light to take a photograph', 'name': 'flash'}, {'frequency': 'c', 'synset': 'flashlight.n.01', 'synonyms': ['flashlight', 'torch'], 'id': 457, 'def': 'a small portable battery-powered electric lamp', 'name': 'flashlight'}, {'frequency': 'r', 'synset': 'fleece.n.03', 'synonyms': ['fleece'], 'id': 458, 'def': 'a soft bulky fabric with deep pile; used chiefly for clothing', 'name': 'fleece'}, {'frequency': 'f', 'synset': 'flip-flop.n.02', 'synonyms': ['flip-flop_(sandal)'], 'id': 459, 'def': 'a backless sandal held to the foot by a thong between two toes', 'name': 'flip-flop_(sandal)'}, {'frequency': 'c', 'synset': 'flipper.n.01', 'synonyms': ['flipper_(footwear)', 'fin_(footwear)'], 'id': 460, 'def': 'a shoe to aid a person in swimming', 'name': 'flipper_(footwear)'}, {'frequency': 'f', 'synset': 'flower_arrangement.n.01', 'synonyms': ['flower_arrangement', 'floral_arrangement'], 'id': 461, 'def': 'a decorative arrangement of flowers', 'name': 'flower_arrangement'}, {'frequency': 'c', 'synset': 'flute.n.02', 'synonyms': ['flute_glass', 'champagne_flute'], 'id': 462, 'def': 'a tall narrow wineglass', 'name': 'flute_glass'}, {'frequency': 'c', 'synset': 'foal.n.01', 'synonyms': ['foal'], 'id': 463, 'def': 'a young horse', 'name': 'foal'}, {'frequency': 'c', 'synset': 'folding_chair.n.01', 'synonyms': ['folding_chair'], 'id': 464, 'def': 'a chair that can be folded flat for storage', 'name': 'folding_chair'}, {'frequency': 'c', 'synset': 'food_processor.n.01', 'synonyms': ['food_processor'], 'id': 465, 'def': 'a kitchen appliance for shredding, blending, chopping, or slicing food', 'name': 'food_processor'}, {'frequency': 'c', 'synset': 'football.n.02', 'synonyms': ['football_(American)'], 'id': 466, 'def': 'the inflated oblong ball used in playing American football', 'name': 'football_(American)'}, {'frequency': 'r', 'synset': 'football_helmet.n.01', 'synonyms': ['football_helmet'], 'id': 467, 'def': 'a padded helmet with a face mask to protect the head of football players', 'name': 'football_helmet'}, {'frequency': 'c', 'synset': 'footstool.n.01', 'synonyms': ['footstool', 'footrest'], 'id': 468, 'def': 'a low seat or a stool to rest the feet of a seated person', 'name': 'footstool'}, {'frequency': 'f', 'synset': 
'fork.n.01', 'synonyms': ['fork'], 'id': 469, 'def': 'cutlery used for serving and eating food', 'name': 'fork'}, {'frequency': 'c', 'synset': 'forklift.n.01', 'synonyms': ['forklift'], 'id': 470, 'def': 'an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them', 'name': 'forklift'}, {'frequency': 'c', 'synset': 'freight_car.n.01', 'synonyms': ['freight_car'], 'id': 471, 'def': 'a railway car that carries freight', 'name': 'freight_car'}, {'frequency': 'c', 'synset': 'french_toast.n.01', 'synonyms': ['French_toast'], 'id': 472, 'def': 'bread slice dipped in egg and milk and fried', 'name': 'French_toast'}, {'frequency': 'c', 'synset': 'freshener.n.01', 'synonyms': ['freshener', 'air_freshener'], 'id': 473, 'def': 'anything that freshens air by removing or covering odor', 'name': 'freshener'}, {'frequency': 'f', 'synset': 'frisbee.n.01', 'synonyms': ['frisbee'], 'id': 474, 'def': 'a light, plastic disk propelled with a flip of the wrist for recreation or competition', 'name': 'frisbee'}, {'frequency': 'c', 'synset': 'frog.n.01', 'synonyms': ['frog', 'toad', 'toad_frog'], 'id': 475, 'def': 'a tailless stout-bodied amphibian with long hind limbs for leaping', 'name': 'frog'}, {'frequency': 'c', 'synset': 'fruit_juice.n.01', 'synonyms': ['fruit_juice'], 'id': 476, 'def': 'drink produced by squeezing or crushing fruit', 'name': 'fruit_juice'}, {'frequency': 'f', 'synset': 'frying_pan.n.01', 'synonyms': ['frying_pan', 'frypan', 'skillet'], 'id': 477, 'def': 'a pan used for frying foods', 'name': 'frying_pan'}, {'frequency': 'r', 'synset': 'fudge.n.01', 'synonyms': ['fudge'], 'id': 478, 'def': 'soft creamy candy', 'name': 'fudge'}, {'frequency': 'r', 'synset': 'funnel.n.02', 'synonyms': ['funnel'], 'id': 479, 'def': 'a cone-shaped utensil used to channel a substance into a container with a small mouth', 'name': 'funnel'}, {'frequency': 'r', 'synset': 'futon.n.01', 'synonyms': ['futon'], 'id': 480, 'def': 'a pad that is used for sleeping on the floor or on a raised frame', 'name': 'futon'}, {'frequency': 'r', 'synset': 'gag.n.02', 'synonyms': ['gag', 'muzzle'], 'id': 481, 'def': "restraint put into a person's mouth to prevent speaking or shouting", 'name': 'gag'}, {'frequency': 'r', 'synset': 'garbage.n.03', 'synonyms': ['garbage'], 'id': 482, 'def': 'a receptacle where waste can be discarded', 'name': 'garbage'}, {'frequency': 'c', 'synset': 'garbage_truck.n.01', 'synonyms': ['garbage_truck'], 'id': 483, 'def': 'a truck for collecting domestic refuse', 'name': 'garbage_truck'}, {'frequency': 'c', 'synset': 'garden_hose.n.01', 'synonyms': ['garden_hose'], 'id': 484, 'def': 'a hose used for watering a lawn or garden', 'name': 'garden_hose'}, {'frequency': 'c', 'synset': 'gargle.n.01', 'synonyms': ['gargle', 'mouthwash'], 'id': 485, 'def': 'a medicated solution used for gargling and rinsing the mouth', 'name': 'gargle'}, {'frequency': 'r', 'synset': 'gargoyle.n.02', 'synonyms': ['gargoyle'], 'id': 486, 'def': 'an ornament consisting of a grotesquely carved figure of a person or animal', 'name': 'gargoyle'}, {'frequency': 'c', 'synset': 'garlic.n.02', 'synonyms': ['garlic', 'ail'], 'id': 487, 'def': 'aromatic bulb used as seasoning', 'name': 'garlic'}, {'frequency': 'r', 'synset': 'gasmask.n.01', 'synonyms': ['gasmask', 'respirator', 'gas_helmet'], 'id': 488, 'def': 'a protective face mask with a filter', 'name': 'gasmask'}, {'frequency': 'c', 'synset': 'gazelle.n.01', 'synonyms': ['gazelle'], 'id': 489, 'def': 'small swift graceful antelope of 
Africa and Asia having lustrous eyes', 'name': 'gazelle'}, {'frequency': 'c', 'synset': 'gelatin.n.02', 'synonyms': ['gelatin', 'jelly'], 'id': 490, 'def': 'an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods', 'name': 'gelatin'}, {'frequency': 'r', 'synset': 'gem.n.02', 'synonyms': ['gemstone'], 'id': 491, 'def': 'a crystalline rock that can be cut and polished for jewelry', 'name': 'gemstone'}, {'frequency': 'r', 'synset': 'generator.n.02', 'synonyms': ['generator'], 'id': 492, 'def': 'engine that converts mechanical energy into electrical energy by electromagnetic induction', 'name': 'generator'}, {'frequency': 'c', 'synset': 'giant_panda.n.01', 'synonyms': ['giant_panda', 'panda', 'panda_bear'], 'id': 493, 'def': 'large black-and-white herbivorous mammal of bamboo forests of China and Tibet', 'name': 'giant_panda'}, {'frequency': 'c', 'synset': 'gift_wrap.n.01', 'synonyms': ['gift_wrap'], 'id': 494, 'def': 'attractive wrapping paper suitable for wrapping gifts', 'name': 'gift_wrap'}, {'frequency': 'c', 'synset': 'ginger.n.03', 'synonyms': ['ginger', 'gingerroot'], 'id': 495, 'def': 'the root of the common ginger plant; used fresh as a seasoning', 'name': 'ginger'}, {'frequency': 'f', 'synset': 'giraffe.n.01', 'synonyms': ['giraffe'], 'id': 496, 'def': 'tall animal having a spotted coat and small horns and very long neck and legs', 'name': 'giraffe'}, {'frequency': 'c', 'synset': 'girdle.n.02', 'synonyms': ['cincture', 'sash', 'waistband', 'waistcloth'], 'id': 497, 'def': 'a band of material around the waist that strengthens a skirt or trousers', 'name': 'cincture'}, {'frequency': 'f', 'synset': 'glass.n.02', 'synonyms': ['glass_(drink_container)', 'drinking_glass'], 'id': 498, 'def': 'a container for holding liquids while drinking', 'name': 'glass_(drink_container)'}, {'frequency': 'c', 'synset': 'globe.n.03', 'synonyms': ['globe'], 'id': 499, 'def': 'a sphere on which a map (especially of the earth) is represented', 'name': 'globe'}, {'frequency': 'f', 'synset': 'glove.n.02', 'synonyms': ['glove'], 'id': 500, 'def': 'handwear covering the hand', 'name': 'glove'}, {'frequency': 'c', 'synset': 'goat.n.01', 'synonyms': ['goat'], 'id': 501, 'def': 'a common goat', 'name': 'goat'}, {'frequency': 'f', 'synset': 'goggles.n.01', 'synonyms': ['goggles'], 'id': 502, 'def': 'tight-fitting spectacles worn to protect the eyes', 'name': 'goggles'}, {'frequency': 'r', 'synset': 'goldfish.n.01', 'synonyms': ['goldfish'], 'id': 503, 'def': 'small golden or orange-red freshwater fishes used as pond or aquarium pets', 'name': 'goldfish'}, {'frequency': 'c', 'synset': 'golf_club.n.02', 'synonyms': ['golf_club', 'golf-club'], 'id': 504, 'def': 'golf equipment used by a golfer to hit a golf ball', 'name': 'golf_club'}, {'frequency': 'c', 'synset': 'golfcart.n.01', 'synonyms': ['golfcart'], 'id': 505, 'def': 'a small motor vehicle in which golfers can ride between shots', 'name': 'golfcart'}, {'frequency': 'r', 'synset': 'gondola.n.02', 'synonyms': ['gondola_(boat)'], 'id': 506, 'def': 'long narrow flat-bottomed boat propelled by sculling; traditionally used on canals of Venice', 'name': 'gondola_(boat)'}, {'frequency': 'c', 'synset': 'goose.n.01', 'synonyms': ['goose'], 'id': 507, 'def': 'loud, web-footed long-necked aquatic birds usually larger than ducks', 'name': 'goose'}, {'frequency': 'r', 'synset': 'gorilla.n.01', 'synonyms': ['gorilla'], 'id': 508, 'def': 'largest ape', 'name': 'gorilla'}, {'frequency': 'r', 'synset': 'gourd.n.02', 'synonyms': ['gourd'], 
'id': 509, 'def': 'any of numerous inedible fruits with hard rinds', 'name': 'gourd'}, {'frequency': 'f', 'synset': 'grape.n.01', 'synonyms': ['grape'], 'id': 510, 'def': 'any of various juicy fruit with green or purple skins; grow in clusters', 'name': 'grape'}, {'frequency': 'c', 'synset': 'grater.n.01', 'synonyms': ['grater'], 'id': 511, 'def': 'utensil with sharp perforations for shredding foods (as vegetables or cheese)', 'name': 'grater'}, {'frequency': 'c', 'synset': 'gravestone.n.01', 'synonyms': ['gravestone', 'headstone', 'tombstone'], 'id': 512, 'def': 'a stone that is used to mark a grave', 'name': 'gravestone'}, {'frequency': 'r', 'synset': 'gravy_boat.n.01', 'synonyms': ['gravy_boat', 'gravy_holder'], 'id': 513, 'def': 'a dish (often boat-shaped) for serving gravy or sauce', 'name': 'gravy_boat'}, {'frequency': 'f', 'synset': 'green_bean.n.02', 'synonyms': ['green_bean'], 'id': 514, 'def': 'a common bean plant cultivated for its slender green edible pods', 'name': 'green_bean'}, {'frequency': 'f', 'synset': 'green_onion.n.01', 'synonyms': ['green_onion', 'spring_onion', 'scallion'], 'id': 515, 'def': 'a young onion before the bulb has enlarged', 'name': 'green_onion'}, {'frequency': 'r', 'synset': 'griddle.n.01', 'synonyms': ['griddle'], 'id': 516, 'def': 'cooking utensil consisting of a flat heated surface on which food is cooked', 'name': 'griddle'}, {'frequency': 'f', 'synset': 'grill.n.02', 'synonyms': ['grill', 'grille', 'grillwork', 'radiator_grille'], 'id': 517, 'def': 'a framework of metal bars used as a partition or a grate', 'name': 'grill'}, {'frequency': 'r', 'synset': 'grits.n.01', 'synonyms': ['grits', 'hominy_grits'], 'id': 518, 'def': 'coarsely ground corn boiled as a breakfast dish', 'name': 'grits'}, {'frequency': 'c', 'synset': 'grizzly.n.01', 'synonyms': ['grizzly', 'grizzly_bear'], 'id': 519, 'def': 'powerful brownish-yellow bear of the uplands of western North America', 'name': 'grizzly'}, {'frequency': 'c', 'synset': 'grocery_bag.n.01', 'synonyms': ['grocery_bag'], 'id': 520, 'def': "a sack for holding customer's groceries", 'name': 'grocery_bag'}, {'frequency': 'f', 'synset': 'guitar.n.01', 'synonyms': ['guitar'], 'id': 521, 'def': 'a stringed instrument usually having six strings; played by strumming or plucking', 'name': 'guitar'}, {'frequency': 'c', 'synset': 'gull.n.02', 'synonyms': ['gull', 'seagull'], 'id': 522, 'def': 'mostly white aquatic bird having long pointed wings and short legs', 'name': 'gull'}, {'frequency': 'c', 'synset': 'gun.n.01', 'synonyms': ['gun'], 'id': 523, 'def': 'a weapon that discharges a bullet at high velocity from a metal tube', 'name': 'gun'}, {'frequency': 'f', 'synset': 'hairbrush.n.01', 'synonyms': ['hairbrush'], 'id': 524, 'def': "a brush used to groom a person's hair", 'name': 'hairbrush'}, {'frequency': 'c', 'synset': 'hairnet.n.01', 'synonyms': ['hairnet'], 'id': 525, 'def': 'a small net that someone wears over their hair to keep it in place', 'name': 'hairnet'}, {'frequency': 'c', 'synset': 'hairpin.n.01', 'synonyms': ['hairpin'], 'id': 526, 'def': "a double pronged pin used to hold women's hair in place", 'name': 'hairpin'}, {'frequency': 'r', 'synset': 'halter.n.03', 'synonyms': ['halter_top'], 'id': 527, 'def': "a woman's top that fastens behind the back and neck leaving the back and arms uncovered", 'name': 'halter_top'}, {'frequency': 'f', 'synset': 'ham.n.01', 'synonyms': ['ham', 'jambon', 'gammon'], 'id': 528, 'def': 'meat cut from the thigh of a hog (usually smoked)', 'name': 'ham'}, {'frequency': 'c', 
'synset': 'hamburger.n.01', 'synonyms': ['hamburger', 'beefburger', 'burger'], 'id': 529, 'def': 'a sandwich consisting of a patty of minced beef served on a bun', 'name': 'hamburger'}, {'frequency': 'c', 'synset': 'hammer.n.02', 'synonyms': ['hammer'], 'id': 530, 'def': 'a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking', 'name': 'hammer'}, {'frequency': 'c', 'synset': 'hammock.n.02', 'synonyms': ['hammock'], 'id': 531, 'def': 'a hanging bed of canvas or rope netting (usually suspended between two trees)', 'name': 'hammock'}, {'frequency': 'r', 'synset': 'hamper.n.02', 'synonyms': ['hamper'], 'id': 532, 'def': 'a basket usually with a cover', 'name': 'hamper'}, {'frequency': 'c', 'synset': 'hamster.n.01', 'synonyms': ['hamster'], 'id': 533, 'def': 'short-tailed burrowing rodent with large cheek pouches', 'name': 'hamster'}, {'frequency': 'f', 'synset': 'hand_blower.n.01', 'synonyms': ['hair_dryer'], 'id': 534, 'def': 'a hand-held electric blower that can blow warm air onto the hair', 'name': 'hair_dryer'}, {'frequency': 'r', 'synset': 'hand_glass.n.01', 'synonyms': ['hand_glass', 'hand_mirror'], 'id': 535, 'def': 'a mirror intended to be held in the hand', 'name': 'hand_glass'}, {'frequency': 'f', 'synset': 'hand_towel.n.01', 'synonyms': ['hand_towel', 'face_towel'], 'id': 536, 'def': 'a small towel used to dry the hands or face', 'name': 'hand_towel'}, {'frequency': 'c', 'synset': 'handcart.n.01', 'synonyms': ['handcart', 'pushcart', 'hand_truck'], 'id': 537, 'def': 'wheeled vehicle that can be pushed by a person', 'name': 'handcart'}, {'frequency': 'r', 'synset': 'handcuff.n.01', 'synonyms': ['handcuff'], 'id': 538, 'def': 'shackle that consists of a metal loop that can be locked around the wrist', 'name': 'handcuff'}, {'frequency': 'c', 'synset': 'handkerchief.n.01', 'synonyms': ['handkerchief'], 'id': 539, 'def': 'a square piece of cloth used for wiping the eyes or nose or as a costume accessory', 'name': 'handkerchief'}, {'frequency': 'f', 'synset': 'handle.n.01', 'synonyms': ['handle', 'grip', 'handgrip'], 'id': 540, 'def': 'the appendage to an object that is designed to be held in order to use or move it', 'name': 'handle'}, {'frequency': 'r', 'synset': 'handsaw.n.01', 'synonyms': ['handsaw', "carpenter's_saw"], 'id': 541, 'def': 'a saw used with one hand for cutting wood', 'name': 'handsaw'}, {'frequency': 'r', 'synset': 'hardback.n.01', 'synonyms': ['hardback_book', 'hardcover_book'], 'id': 542, 'def': 'a book with cardboard or cloth or leather covers', 'name': 'hardback_book'}, {'frequency': 'r', 'synset': 'harmonium.n.01', 'synonyms': ['harmonium', 'organ_(musical_instrument)', 'reed_organ_(musical_instrument)'], 'id': 543, 'def': 'a free-reed instrument in which air is forced through the reeds by bellows', 'name': 'harmonium'}, {'frequency': 'f', 'synset': 'hat.n.01', 'synonyms': ['hat'], 'id': 544, 'def': 'headwear that protects the head from bad weather or sun, or is worn for fashion', 'name': 'hat'}, {'frequency': 'r', 'synset': 'hatbox.n.01', 'synonyms': ['hatbox'], 'id': 545, 'def': 'a round piece of luggage for carrying hats', 'name': 'hatbox'}, {'frequency': 'c', 'synset': 'head_covering.n.01', 'synonyms': ['veil'], 'id': 546, 'def': 'a garment that covers the head OR face', 'name': 'veil'}, {'frequency': 'f', 'synset': 'headband.n.01', 'synonyms': ['headband'], 'id': 547, 'def': 'a band worn around or over the head', 'name': 'headband'}, {'frequency': 'f', 'synset': 'headboard.n.01', 'synonyms': ['headboard'], 'id': 548, 'def': 'a 
vertical board or panel forming the head of a bedstead', 'name': 'headboard'}, {'frequency': 'f', 'synset': 'headlight.n.01', 'synonyms': ['headlight', 'headlamp'], 'id': 549, 'def': 'a powerful light with reflector; attached to the front of an automobile or locomotive', 'name': 'headlight'}, {'frequency': 'c', 'synset': 'headscarf.n.01', 'synonyms': ['headscarf'], 'id': 550, 'def': 'a kerchief worn over the head and tied under the chin', 'name': 'headscarf'}, {'frequency': 'r', 'synset': 'headset.n.01', 'synonyms': ['headset'], 'id': 551, 'def': 'receiver consisting of a pair of headphones', 'name': 'headset'}, {'frequency': 'c', 'synset': 'headstall.n.01', 'synonyms': ['headstall_(for_horses)', 'headpiece_(for_horses)'], 'id': 552, 'def': "the band that is the part of a bridle that fits around a horse's head", 'name': 'headstall_(for_horses)'}, {'frequency': 'c', 'synset': 'heart.n.02', 'synonyms': ['heart'], 'id': 553, 'def': 'a muscular organ; its contractions move the blood through the body', 'name': 'heart'}, {'frequency': 'c', 'synset': 'heater.n.01', 'synonyms': ['heater', 'warmer'], 'id': 554, 'def': 'device that heats water or supplies warmth to a room', 'name': 'heater'}, {'frequency': 'c', 'synset': 'helicopter.n.01', 'synonyms': ['helicopter'], 'id': 555, 'def': 'an aircraft without wings that obtains its lift from the rotation of overhead blades', 'name': 'helicopter'}, {'frequency': 'f', 'synset': 'helmet.n.02', 'synonyms': ['helmet'], 'id': 556, 'def': 'a protective headgear made of hard material to resist blows', 'name': 'helmet'}, {'frequency': 'r', 'synset': 'heron.n.02', 'synonyms': ['heron'], 'id': 557, 'def': 'grey or white wading bird with long neck and long legs and (usually) long bill', 'name': 'heron'}, {'frequency': 'c', 'synset': 'highchair.n.01', 'synonyms': ['highchair', 'feeding_chair'], 'id': 558, 'def': 'a chair for feeding a very young child', 'name': 'highchair'}, {'frequency': 'f', 'synset': 'hinge.n.01', 'synonyms': ['hinge'], 'id': 559, 'def': 'a joint that holds two parts together so that one can swing relative to the other', 'name': 'hinge'}, {'frequency': 'r', 'synset': 'hippopotamus.n.01', 'synonyms': ['hippopotamus'], 'id': 560, 'def': 'massive thick-skinned animal living in or around rivers of tropical Africa', 'name': 'hippopotamus'}, {'frequency': 'r', 'synset': 'hockey_stick.n.01', 'synonyms': ['hockey_stick'], 'id': 561, 'def': 'sports implement consisting of a stick used by hockey players to move the puck', 'name': 'hockey_stick'}, {'frequency': 'c', 'synset': 'hog.n.03', 'synonyms': ['hog', 'pig'], 'id': 562, 'def': 'domestic swine', 'name': 'hog'}, {'frequency': 'f', 'synset': 'home_plate.n.01', 'synonyms': ['home_plate_(baseball)', 'home_base_(baseball)'], 'id': 563, 'def': '(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score', 'name': 'home_plate_(baseball)'}, {'frequency': 'c', 'synset': 'honey.n.01', 'synonyms': ['honey'], 'id': 564, 'def': 'a sweet yellow liquid produced by bees', 'name': 'honey'}, {'frequency': 'f', 'synset': 'hood.n.06', 'synonyms': ['fume_hood', 'exhaust_hood'], 'id': 565, 'def': 'metal covering leading to a vent that exhausts smoke or fumes', 'name': 'fume_hood'}, {'frequency': 'f', 'synset': 'hook.n.05', 'synonyms': ['hook'], 'id': 566, 'def': 'a curved or bent implement for suspending or pulling something', 'name': 'hook'}, {'frequency': 'r', 'synset': 'hookah.n.01', 'synonyms': ['hookah', 'narghile', 'nargileh', 'sheesha', 'shisha', 'water_pipe'], 'id': 567, 
'def': 'a tobacco pipe with a long flexible tube connected to a container where the smoke is cooled by passing through water', 'name': 'hookah'}, {'frequency': 'r', 'synset': 'hornet.n.01', 'synonyms': ['hornet'], 'id': 568, 'def': 'large stinging wasp', 'name': 'hornet'}, {'frequency': 'f', 'synset': 'horse.n.01', 'synonyms': ['horse'], 'id': 569, 'def': 'a common horse', 'name': 'horse'}, {'frequency': 'f', 'synset': 'hose.n.03', 'synonyms': ['hose', 'hosepipe'], 'id': 570, 'def': 'a flexible pipe for conveying a liquid or gas', 'name': 'hose'}, {'frequency': 'r', 'synset': 'hot-air_balloon.n.01', 'synonyms': ['hot-air_balloon'], 'id': 571, 'def': 'balloon for travel through the air in a basket suspended below a large bag of heated air', 'name': 'hot-air_balloon'}, {'frequency': 'r', 'synset': 'hot_plate.n.01', 'synonyms': ['hotplate'], 'id': 572, 'def': 'a portable electric appliance for heating or cooking or keeping food warm', 'name': 'hotplate'}, {'frequency': 'c', 'synset': 'hot_sauce.n.01', 'synonyms': ['hot_sauce'], 'id': 573, 'def': 'a pungent peppery sauce', 'name': 'hot_sauce'}, {'frequency': 'r', 'synset': 'hourglass.n.01', 'synonyms': ['hourglass'], 'id': 574, 'def': 'a sandglass timer that runs for sixty minutes', 'name': 'hourglass'}, {'frequency': 'r', 'synset': 'houseboat.n.01', 'synonyms': ['houseboat'], 'id': 575, 'def': 'a barge that is designed and equipped for use as a dwelling', 'name': 'houseboat'}, {'frequency': 'c', 'synset': 'hummingbird.n.01', 'synonyms': ['hummingbird'], 'id': 576, 'def': 'tiny American bird having brilliant iridescent plumage and long slender bills', 'name': 'hummingbird'}, {'frequency': 'r', 'synset': 'hummus.n.01', 'synonyms': ['hummus', 'humus', 'hommos', 'hoummos', 'humous'], 'id': 577, 'def': 'a thick spread made from mashed chickpeas', 'name': 'hummus'}, {'frequency': 'f', 'synset': 'ice_bear.n.01', 'synonyms': ['polar_bear'], 'id': 578, 'def': 'white bear of Arctic regions', 'name': 'polar_bear'}, {'frequency': 'c', 'synset': 'ice_cream.n.01', 'synonyms': ['icecream'], 'id': 579, 'def': 'frozen dessert containing cream and sugar and flavoring', 'name': 'icecream'}, {'frequency': 'r', 'synset': 'ice_lolly.n.01', 'synonyms': ['popsicle'], 'id': 580, 'def': 'ice cream or water ice on a small wooden stick', 'name': 'popsicle'}, {'frequency': 'c', 'synset': 'ice_maker.n.01', 'synonyms': ['ice_maker'], 'id': 581, 'def': 'an appliance included in some electric refrigerators for making ice cubes', 'name': 'ice_maker'}, {'frequency': 'r', 'synset': 'ice_pack.n.01', 'synonyms': ['ice_pack', 'ice_bag'], 'id': 582, 'def': 'a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling', 'name': 'ice_pack'}, {'frequency': 'r', 'synset': 'ice_skate.n.01', 'synonyms': ['ice_skate'], 'id': 583, 'def': 'skate consisting of a boot with a steel blade fitted to the sole', 'name': 'ice_skate'}, {'frequency': 'c', 'synset': 'igniter.n.01', 'synonyms': ['igniter', 'ignitor', 'lighter'], 'id': 584, 'def': 'a substance or device used to start a fire', 'name': 'igniter'}, {'frequency': 'r', 'synset': 'inhaler.n.01', 'synonyms': ['inhaler', 'inhalator'], 'id': 585, 'def': 'a dispenser that produces a chemical vapor to be inhaled through mouth or nose', 'name': 'inhaler'}, {'frequency': 'f', 'synset': 'ipod.n.01', 'synonyms': ['iPod'], 'id': 586, 'def': 'a pocket-sized device used to play music files', 'name': 'iPod'}, {'frequency': 'c', 'synset': 'iron.n.04', 'synonyms': ['iron_(for_clothing)', 
'smoothing_iron_(for_clothing)'], 'id': 587, 'def': 'home appliance consisting of a flat metal base that is heated and used to smooth cloth', 'name': 'iron_(for_clothing)'}, {'frequency': 'c', 'synset': 'ironing_board.n.01', 'synonyms': ['ironing_board'], 'id': 588, 'def': 'narrow padded board on collapsible supports; used for ironing clothes', 'name': 'ironing_board'}, {'frequency': 'f', 'synset': 'jacket.n.01', 'synonyms': ['jacket'], 'id': 589, 'def': 'a waist-length coat', 'name': 'jacket'}, {'frequency': 'c', 'synset': 'jam.n.01', 'synonyms': ['jam'], 'id': 590, 'def': 'preserve of crushed fruit', 'name': 'jam'}, {'frequency': 'f', 'synset': 'jar.n.01', 'synonyms': ['jar'], 'id': 591, 'def': 'a vessel (usually cylindrical) with a wide mouth and without handles', 'name': 'jar'}, {'frequency': 'f', 'synset': 'jean.n.01', 'synonyms': ['jean', 'blue_jean', 'denim'], 'id': 592, 'def': '(usually plural) close-fitting trousers of heavy denim for manual work or casual wear', 'name': 'jean'}, {'frequency': 'c', 'synset': 'jeep.n.01', 'synonyms': ['jeep', 'landrover'], 'id': 593, 'def': 'a car suitable for traveling over rough terrain', 'name': 'jeep'}, {'frequency': 'r', 'synset': 'jelly_bean.n.01', 'synonyms': ['jelly_bean', 'jelly_egg'], 'id': 594, 'def': 'sugar-glazed jellied candy', 'name': 'jelly_bean'}, {'frequency': 'f', 'synset': 'jersey.n.03', 'synonyms': ['jersey', 'T-shirt', 'tee_shirt'], 'id': 595, 'def': 'a close-fitting pullover shirt', 'name': 'jersey'}, {'frequency': 'c', 'synset': 'jet.n.01', 'synonyms': ['jet_plane', 'jet-propelled_plane'], 'id': 596, 'def': 'an airplane powered by one or more jet engines', 'name': 'jet_plane'}, {'frequency': 'r', 'synset': 'jewel.n.01', 'synonyms': ['jewel', 'gem', 'precious_stone'], 'id': 597, 'def': 'a precious or semiprecious stone incorporated into a piece of jewelry', 'name': 'jewel'}, {'frequency': 'c', 'synset': 'jewelry.n.01', 'synonyms': ['jewelry', 'jewellery'], 'id': 598, 'def': 'an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)', 'name': 'jewelry'}, {'frequency': 'r', 'synset': 'joystick.n.02', 'synonyms': ['joystick'], 'id': 599, 'def': 'a control device for computers consisting of a vertical handle that can move freely in two directions', 'name': 'joystick'}, {'frequency': 'c', 'synset': 'jump_suit.n.01', 'synonyms': ['jumpsuit'], 'id': 600, 'def': "one-piece garment fashioned after a parachutist's uniform", 'name': 'jumpsuit'}, {'frequency': 'c', 'synset': 'kayak.n.01', 'synonyms': ['kayak'], 'id': 601, 'def': 'a small canoe consisting of a light frame made watertight with animal skins', 'name': 'kayak'}, {'frequency': 'r', 'synset': 'keg.n.02', 'synonyms': ['keg'], 'id': 602, 'def': 'small cask or barrel', 'name': 'keg'}, {'frequency': 'r', 'synset': 'kennel.n.01', 'synonyms': ['kennel', 'doghouse'], 'id': 603, 'def': 'outbuilding that serves as a shelter for a dog', 'name': 'kennel'}, {'frequency': 'c', 'synset': 'kettle.n.01', 'synonyms': ['kettle', 'boiler'], 'id': 604, 'def': 'a metal pot for stewing or boiling; usually has a lid', 'name': 'kettle'}, {'frequency': 'f', 'synset': 'key.n.01', 'synonyms': ['key'], 'id': 605, 'def': 'metal instrument used to unlock a lock', 'name': 'key'}, {'frequency': 'r', 'synset': 'keycard.n.01', 'synonyms': ['keycard'], 'id': 606, 'def': 'a plastic card used to gain access typically to a door', 'name': 'keycard'}, {'frequency': 'c', 'synset': 'kilt.n.01', 'synonyms': ['kilt'], 'id': 607, 'def': 'a knee-length pleated tartan 
skirt worn by men as part of the traditional dress in the Highlands of northern Scotland', 'name': 'kilt'}, {'frequency': 'c', 'synset': 'kimono.n.01', 'synonyms': ['kimono'], 'id': 608, 'def': 'a loose robe; imitated from robes originally worn by Japanese', 'name': 'kimono'}, {'frequency': 'f', 'synset': 'kitchen_sink.n.01', 'synonyms': ['kitchen_sink'], 'id': 609, 'def': 'a sink in a kitchen', 'name': 'kitchen_sink'}, {'frequency': 'r', 'synset': 'kitchen_table.n.01', 'synonyms': ['kitchen_table'], 'id': 610, 'def': 'a table in the kitchen', 'name': 'kitchen_table'}, {'frequency': 'f', 'synset': 'kite.n.03', 'synonyms': ['kite'], 'id': 611, 'def': 'plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string', 'name': 'kite'}, {'frequency': 'c', 'synset': 'kitten.n.01', 'synonyms': ['kitten', 'kitty'], 'id': 612, 'def': 'young domestic cat', 'name': 'kitten'}, {'frequency': 'c', 'synset': 'kiwi.n.03', 'synonyms': ['kiwi_fruit'], 'id': 613, 'def': 'fuzzy brown egg-shaped fruit with slightly tart green flesh', 'name': 'kiwi_fruit'}, {'frequency': 'f', 'synset': 'knee_pad.n.01', 'synonyms': ['knee_pad'], 'id': 614, 'def': 'protective garment consisting of a pad worn by football or baseball or hockey players', 'name': 'knee_pad'}, {'frequency': 'f', 'synset': 'knife.n.01', 'synonyms': ['knife'], 'id': 615, 'def': 'tool with a blade and point used as a cutting instrument', 'name': 'knife'}, {'frequency': 'r', 'synset': 'knitting_needle.n.01', 'synonyms': ['knitting_needle'], 'id': 616, 'def': 'needle consisting of a slender rod with pointed ends; usually used in pairs', 'name': 'knitting_needle'}, {'frequency': 'f', 'synset': 'knob.n.02', 'synonyms': ['knob'], 'id': 617, 'def': 'a round handle often found on a door', 'name': 'knob'}, {'frequency': 'r', 'synset': 'knocker.n.05', 'synonyms': ['knocker_(on_a_door)', 'doorknocker'], 'id': 618, 'def': 'a device (usually metal and ornamental) attached by a hinge to a door', 'name': 'knocker_(on_a_door)'}, {'frequency': 'r', 'synset': 'koala.n.01', 'synonyms': ['koala', 'koala_bear'], 'id': 619, 'def': 'sluggish tailless Australian marsupial with grey furry ears and coat', 'name': 'koala'}, {'frequency': 'r', 'synset': 'lab_coat.n.01', 'synonyms': ['lab_coat', 'laboratory_coat'], 'id': 620, 'def': 'a light coat worn to protect clothing from substances used while working in a laboratory', 'name': 'lab_coat'}, {'frequency': 'f', 'synset': 'ladder.n.01', 'synonyms': ['ladder'], 'id': 621, 'def': 'steps consisting of two parallel members connected by rungs', 'name': 'ladder'}, {'frequency': 'c', 'synset': 'ladle.n.01', 'synonyms': ['ladle'], 'id': 622, 'def': 'a spoon-shaped vessel with a long handle frequently used to transfer liquids', 'name': 'ladle'}, {'frequency': 'c', 'synset': 'ladybug.n.01', 'synonyms': ['ladybug', 'ladybeetle', 'ladybird_beetle'], 'id': 623, 'def': 'small round bright-colored and spotted beetle, typically red and black', 'name': 'ladybug'}, {'frequency': 'f', 'synset': 'lamb.n.01', 'synonyms': ['lamb_(animal)'], 'id': 624, 'def': 'young sheep', 'name': 'lamb_(animal)'}, {'frequency': 'r', 'synset': 'lamb_chop.n.01', 'synonyms': ['lamb-chop', 'lambchop'], 'id': 625, 'def': 'chop cut from a lamb', 'name': 'lamb-chop'}, {'frequency': 'f', 'synset': 'lamp.n.02', 'synonyms': ['lamp'], 'id': 626, 'def': 'a piece of furniture holding one or more electric light bulbs', 'name': 'lamp'}, {'frequency': 'f', 'synset': 'lamppost.n.01', 'synonyms': ['lamppost'], 'id': 627, 'def': 'a metal post supporting 
an outdoor lamp (such as a streetlight)', 'name': 'lamppost'}, {'frequency': 'f', 'synset': 'lampshade.n.01', 'synonyms': ['lampshade'], 'id': 628, 'def': 'a protective ornamental shade used to screen a light bulb from direct view', 'name': 'lampshade'}, {'frequency': 'c', 'synset': 'lantern.n.01', 'synonyms': ['lantern'], 'id': 629, 'def': 'light in a transparent protective case', 'name': 'lantern'}, {'frequency': 'f', 'synset': 'lanyard.n.02', 'synonyms': ['lanyard', 'laniard'], 'id': 630, 'def': 'a cord worn around the neck to hold a knife or whistle, etc.', 'name': 'lanyard'}, {'frequency': 'f', 'synset': 'laptop.n.01', 'synonyms': ['laptop_computer', 'notebook_computer'], 'id': 631, 'def': 'a portable computer small enough to use in your lap', 'name': 'laptop_computer'}, {'frequency': 'r', 'synset': 'lasagna.n.01', 'synonyms': ['lasagna', 'lasagne'], 'id': 632, 'def': 'baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables', 'name': 'lasagna'}, {'frequency': 'f', 'synset': 'latch.n.02', 'synonyms': ['latch'], 'id': 633, 'def': 'a bar that can be lowered or slid into a groove to fasten a door or gate', 'name': 'latch'}, {'frequency': 'r', 'synset': 'lawn_mower.n.01', 'synonyms': ['lawn_mower'], 'id': 634, 'def': 'garden tool for mowing grass on lawns', 'name': 'lawn_mower'}, {'frequency': 'r', 'synset': 'leather.n.01', 'synonyms': ['leather'], 'id': 635, 'def': 'an animal skin made smooth and flexible by removing the hair and then tanning', 'name': 'leather'}, {'frequency': 'c', 'synset': 'legging.n.01', 'synonyms': ['legging_(clothing)', 'leging_(clothing)', 'leg_covering'], 'id': 636, 'def': 'a garment covering the leg (usually extending from the knee to the ankle)', 'name': 'legging_(clothing)'}, {'frequency': 'c', 'synset': 'lego.n.01', 'synonyms': ['Lego', 'Lego_set'], 'id': 637, 'def': "a child's plastic construction set for making models from blocks", 'name': 'Lego'}, {'frequency': 'r', 'synset': 'legume.n.02', 'synonyms': ['legume'], 'id': 638, 'def': 'the fruit or seed of bean or pea plants', 'name': 'legume'}, {'frequency': 'f', 'synset': 'lemon.n.01', 'synonyms': ['lemon'], 'id': 639, 'def': 'yellow oval fruit with juicy acidic flesh', 'name': 'lemon'}, {'frequency': 'r', 'synset': 'lemonade.n.01', 'synonyms': ['lemonade'], 'id': 640, 'def': 'sweetened beverage of diluted lemon juice', 'name': 'lemonade'}, {'frequency': 'f', 'synset': 'lettuce.n.02', 'synonyms': ['lettuce'], 'id': 641, 'def': 'leafy plant commonly eaten in salad or on sandwiches', 'name': 'lettuce'}, {'frequency': 'f', 'synset': 'license_plate.n.01', 'synonyms': ['license_plate', 'numberplate'], 'id': 642, 'def': "a plate mounted on the front and back of car and bearing the car's registration number", 'name': 'license_plate'}, {'frequency': 'f', 'synset': 'life_buoy.n.01', 'synonyms': ['life_buoy', 'lifesaver', 'life_belt', 'life_ring'], 'id': 643, 'def': 'a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)', 'name': 'life_buoy'}, {'frequency': 'f', 'synset': 'life_jacket.n.01', 'synonyms': ['life_jacket', 'life_vest'], 'id': 644, 'def': 'life preserver consisting of a sleeveless jacket of buoyant or inflatable design', 'name': 'life_jacket'}, {'frequency': 'f', 'synset': 'light_bulb.n.01', 'synonyms': ['lightbulb'], 'id': 645, 'def': 'lightblub/source of light', 'name': 'lightbulb'}, {'frequency': 'r', 'synset': 'lightning_rod.n.02', 'synonyms': ['lightning_rod', 'lightning_conductor'], 'id': 646, 'def': 'a metallic conductor that is attached to a 
high point and leads to the ground', 'name': 'lightning_rod'}, {'frequency': 'f', 'synset': 'lime.n.06', 'synonyms': ['lime'], 'id': 647, 'def': 'the green acidic fruit of any of various lime trees', 'name': 'lime'}, {'frequency': 'r', 'synset': 'limousine.n.01', 'synonyms': ['limousine'], 'id': 648, 'def': 'long luxurious car; usually driven by a chauffeur', 'name': 'limousine'}, {'frequency': 'c', 'synset': 'lion.n.01', 'synonyms': ['lion'], 'id': 649, 'def': 'large gregarious predatory cat of Africa and India', 'name': 'lion'}, {'frequency': 'c', 'synset': 'lip_balm.n.01', 'synonyms': ['lip_balm'], 'id': 650, 'def': 'a balm applied to the lips', 'name': 'lip_balm'}, {'frequency': 'r', 'synset': 'liquor.n.01', 'synonyms': ['liquor', 'spirits', 'hard_liquor', 'liqueur', 'cordial'], 'id': 651, 'def': 'liquor or beer', 'name': 'liquor'}, {'frequency': 'c', 'synset': 'lizard.n.01', 'synonyms': ['lizard'], 'id': 652, 'def': 'a reptile with usually two pairs of legs and a tapering tail', 'name': 'lizard'}, {'frequency': 'f', 'synset': 'log.n.01', 'synonyms': ['log'], 'id': 653, 'def': 'a segment of the trunk of a tree when stripped of branches', 'name': 'log'}, {'frequency': 'c', 'synset': 'lollipop.n.02', 'synonyms': ['lollipop'], 'id': 654, 'def': 'hard candy on a stick', 'name': 'lollipop'}, {'frequency': 'f', 'synset': 'loudspeaker.n.01', 'synonyms': ['speaker_(stero_equipment)'], 'id': 655, 'def': 'electronic device that produces sound often as part of a stereo system', 'name': 'speaker_(stero_equipment)'}, {'frequency': 'c', 'synset': 'love_seat.n.01', 'synonyms': ['loveseat'], 'id': 656, 'def': 'small sofa that seats two people', 'name': 'loveseat'}, {'frequency': 'r', 'synset': 'machine_gun.n.01', 'synonyms': ['machine_gun'], 'id': 657, 'def': 'a rapidly firing automatic gun', 'name': 'machine_gun'}, {'frequency': 'f', 'synset': 'magazine.n.02', 'synonyms': ['magazine'], 'id': 658, 'def': 'a paperback periodic publication', 'name': 'magazine'}, {'frequency': 'f', 'synset': 'magnet.n.01', 'synonyms': ['magnet'], 'id': 659, 'def': 'a device that attracts iron and produces a magnetic field', 'name': 'magnet'}, {'frequency': 'c', 'synset': 'mail_slot.n.01', 'synonyms': ['mail_slot'], 'id': 660, 'def': 'a slot (usually in a door) through which mail can be delivered', 'name': 'mail_slot'}, {'frequency': 'f', 'synset': 'mailbox.n.01', 'synonyms': ['mailbox_(at_home)', 'letter_box_(at_home)'], 'id': 661, 'def': 'a private box for delivery of mail', 'name': 'mailbox_(at_home)'}, {'frequency': 'r', 'synset': 'mallard.n.01', 'synonyms': ['mallard'], 'id': 662, 'def': 'wild dabbling duck from which domestic ducks are descended', 'name': 'mallard'}, {'frequency': 'r', 'synset': 'mallet.n.01', 'synonyms': ['mallet'], 'id': 663, 'def': 'a sports implement with a long handle and a hammer-like head used to hit a ball', 'name': 'mallet'}, {'frequency': 'r', 'synset': 'mammoth.n.01', 'synonyms': ['mammoth'], 'id': 664, 'def': 'any of numerous extinct elephants widely distributed in the Pleistocene', 'name': 'mammoth'}, {'frequency': 'r', 'synset': 'manatee.n.01', 'synonyms': ['manatee'], 'id': 665, 'def': 'sirenian mammal of tropical coastal waters of America', 'name': 'manatee'}, {'frequency': 'c', 'synset': 'mandarin.n.05', 'synonyms': ['mandarin_orange'], 'id': 666, 'def': 'a somewhat flat reddish-orange loose skinned citrus of China', 'name': 'mandarin_orange'}, {'frequency': 'c', 'synset': 'manger.n.01', 'synonyms': ['manger', 'trough'], 'id': 667, 'def': 'a container (usually in a barn or stable) 
from which cattle or horses feed', 'name': 'manger'}, {'frequency': 'f', 'synset': 'manhole.n.01', 'synonyms': ['manhole'], 'id': 668, 'def': 'a hole (usually with a flush cover) through which a person can gain access to an underground structure', 'name': 'manhole'}, {'frequency': 'f', 'synset': 'map.n.01', 'synonyms': ['map'], 'id': 669, 'def': "a diagrammatic representation of the earth's surface (or part of it)", 'name': 'map'}, {'frequency': 'f', 'synset': 'marker.n.03', 'synonyms': ['marker'], 'id': 670, 'def': 'a writing implement for making a mark', 'name': 'marker'}, {'frequency': 'r', 'synset': 'martini.n.01', 'synonyms': ['martini'], 'id': 671, 'def': 'a cocktail made of gin (or vodka) with dry vermouth', 'name': 'martini'}, {'frequency': 'r', 'synset': 'mascot.n.01', 'synonyms': ['mascot'], 'id': 672, 'def': 'a person or animal that is adopted by a team or other group as a symbolic figure', 'name': 'mascot'}, {'frequency': 'c', 'synset': 'mashed_potato.n.01', 'synonyms': ['mashed_potato'], 'id': 673, 'def': 'potato that has been peeled and boiled and then mashed', 'name': 'mashed_potato'}, {'frequency': 'r', 'synset': 'masher.n.02', 'synonyms': ['masher'], 'id': 674, 'def': 'a kitchen utensil used for mashing (e.g. potatoes)', 'name': 'masher'}, {'frequency': 'f', 'synset': 'mask.n.04', 'synonyms': ['mask', 'facemask'], 'id': 675, 'def': 'a protective covering worn over the face', 'name': 'mask'}, {'frequency': 'f', 'synset': 'mast.n.01', 'synonyms': ['mast'], 'id': 676, 'def': 'a vertical spar for supporting sails', 'name': 'mast'}, {'frequency': 'c', 'synset': 'mat.n.03', 'synonyms': ['mat_(gym_equipment)', 'gym_mat'], 'id': 677, 'def': 'sports equipment consisting of a piece of thick padding on the floor for gymnastics', 'name': 'mat_(gym_equipment)'}, {'frequency': 'r', 'synset': 'matchbox.n.01', 'synonyms': ['matchbox'], 'id': 678, 'def': 'a box for holding matches', 'name': 'matchbox'}, {'frequency': 'f', 'synset': 'mattress.n.01', 'synonyms': ['mattress'], 'id': 679, 'def': 'a thick pad filled with resilient material used as a bed or part of a bed', 'name': 'mattress'}, {'frequency': 'c', 'synset': 'measuring_cup.n.01', 'synonyms': ['measuring_cup'], 'id': 680, 'def': 'graduated cup used to measure liquid or granular ingredients', 'name': 'measuring_cup'}, {'frequency': 'c', 'synset': 'measuring_stick.n.01', 'synonyms': ['measuring_stick', 'ruler_(measuring_stick)', 'measuring_rod'], 'id': 681, 'def': 'measuring instrument having a sequence of marks at regular intervals', 'name': 'measuring_stick'}, {'frequency': 'c', 'synset': 'meatball.n.01', 'synonyms': ['meatball'], 'id': 682, 'def': 'ground meat formed into a ball and fried or simmered in broth', 'name': 'meatball'}, {'frequency': 'c', 'synset': 'medicine.n.02', 'synonyms': ['medicine'], 'id': 683, 'def': 'something that treats or prevents or alleviates the symptoms of disease', 'name': 'medicine'}, {'frequency': 'c', 'synset': 'melon.n.01', 'synonyms': ['melon'], 'id': 684, 'def': 'fruit of the gourd family having a hard rind and sweet juicy flesh', 'name': 'melon'}, {'frequency': 'f', 'synset': 'microphone.n.01', 'synonyms': ['microphone'], 'id': 685, 'def': 'device for converting sound waves into electrical energy', 'name': 'microphone'}, {'frequency': 'r', 'synset': 'microscope.n.01', 'synonyms': ['microscope'], 'id': 686, 'def': 'magnifier of the image of small objects', 'name': 'microscope'}, {'frequency': 'f', 'synset': 'microwave.n.02', 'synonyms': ['microwave_oven'], 'id': 687, 'def': 'kitchen appliance that 
cooks food by passing an electromagnetic wave through it', 'name': 'microwave_oven'}, {'frequency': 'r', 'synset': 'milestone.n.01', 'synonyms': ['milestone', 'milepost'], 'id': 688, 'def': 'stone post at side of a road to show distances', 'name': 'milestone'}, {'frequency': 'f', 'synset': 'milk.n.01', 'synonyms': ['milk'], 'id': 689, 'def': 'a white nutritious liquid secreted by mammals and used as food by human beings', 'name': 'milk'}, {'frequency': 'r', 'synset': 'milk_can.n.01', 'synonyms': ['milk_can'], 'id': 690, 'def': 'can for transporting milk', 'name': 'milk_can'}, {'frequency': 'r', 'synset': 'milkshake.n.01', 'synonyms': ['milkshake'], 'id': 691, 'def': 'frothy drink of milk and flavoring and sometimes fruit or ice cream', 'name': 'milkshake'}, {'frequency': 'f', 'synset': 'minivan.n.01', 'synonyms': ['minivan'], 'id': 692, 'def': 'a small box-shaped passenger van', 'name': 'minivan'}, {'frequency': 'r', 'synset': 'mint.n.05', 'synonyms': ['mint_candy'], 'id': 693, 'def': 'a candy that is flavored with a mint oil', 'name': 'mint_candy'}, {'frequency': 'f', 'synset': 'mirror.n.01', 'synonyms': ['mirror'], 'id': 694, 'def': 'polished surface that forms images by reflecting light', 'name': 'mirror'}, {'frequency': 'c', 'synset': 'mitten.n.01', 'synonyms': ['mitten'], 'id': 695, 'def': 'glove that encases the thumb separately and the other four fingers together', 'name': 'mitten'}, {'frequency': 'c', 'synset': 'mixer.n.04', 'synonyms': ['mixer_(kitchen_tool)', 'stand_mixer'], 'id': 696, 'def': 'a kitchen utensil that is used for mixing foods', 'name': 'mixer_(kitchen_tool)'}, {'frequency': 'c', 'synset': 'money.n.03', 'synonyms': ['money'], 'id': 697, 'def': 'the official currency issued by a government or national bank', 'name': 'money'}, {'frequency': 'f', 'synset': 'monitor.n.04', 'synonyms': ['monitor_(computer_equipment) computer_monitor'], 'id': 698, 'def': 'a computer monitor', 'name': 'monitor_(computer_equipment) computer_monitor'}, {'frequency': 'c', 'synset': 'monkey.n.01', 'synonyms': ['monkey'], 'id': 699, 'def': 'any of various long-tailed primates', 'name': 'monkey'}, {'frequency': 'f', 'synset': 'motor.n.01', 'synonyms': ['motor'], 'id': 700, 'def': 'machine that converts other forms of energy into mechanical energy and so imparts motion', 'name': 'motor'}, {'frequency': 'f', 'synset': 'motor_scooter.n.01', 'synonyms': ['motor_scooter', 'scooter'], 'id': 701, 'def': 'a wheeled vehicle with small wheels and a low-powered engine', 'name': 'motor_scooter'}, {'frequency': 'r', 'synset': 'motor_vehicle.n.01', 'synonyms': ['motor_vehicle', 'automotive_vehicle'], 'id': 702, 'def': 'a self-propelled wheeled vehicle that does not run on rails', 'name': 'motor_vehicle'}, {'frequency': 'f', 'synset': 'motorcycle.n.01', 'synonyms': ['motorcycle'], 'id': 703, 'def': 'a motor vehicle with two wheels and a strong frame', 'name': 'motorcycle'}, {'frequency': 'f', 'synset': 'mound.n.01', 'synonyms': ['mound_(baseball)', "pitcher's_mound"], 'id': 704, 'def': '(baseball) the slight elevation on which the pitcher stands', 'name': 'mound_(baseball)'}, {'frequency': 'f', 'synset': 'mouse.n.04', 'synonyms': ['mouse_(computer_equipment)', 'computer_mouse'], 'id': 705, 'def': 'a computer input device that controls an on-screen pointer (does not include trackpads / touchpads)', 'name': 'mouse_(computer_equipment)'}, {'frequency': 'f', 'synset': 'mousepad.n.01', 'synonyms': ['mousepad'], 'id': 706, 'def': 'a small portable pad that provides an operating surface for a computer mouse', 'name': 
'mousepad'}, {'frequency': 'c', 'synset': 'muffin.n.01', 'synonyms': ['muffin'], 'id': 707, 'def': 'a sweet quick bread baked in a cup-shaped pan', 'name': 'muffin'}, {'frequency': 'f', 'synset': 'mug.n.04', 'synonyms': ['mug'], 'id': 708, 'def': 'with handle and usually cylindrical', 'name': 'mug'}, {'frequency': 'f', 'synset': 'mushroom.n.02', 'synonyms': ['mushroom'], 'id': 709, 'def': 'a common mushroom', 'name': 'mushroom'}, {'frequency': 'r', 'synset': 'music_stool.n.01', 'synonyms': ['music_stool', 'piano_stool'], 'id': 710, 'def': 'a stool for piano players; usually adjustable in height', 'name': 'music_stool'}, {'frequency': 'c', 'synset': 'musical_instrument.n.01', 'synonyms': ['musical_instrument', 'instrument_(musical)'], 'id': 711, 'def': 'any of various devices or contrivances that can be used to produce musical tones or sounds', 'name': 'musical_instrument'}, {'frequency': 'r', 'synset': 'nailfile.n.01', 'synonyms': ['nailfile'], 'id': 712, 'def': 'a small flat file for shaping the nails', 'name': 'nailfile'}, {'frequency': 'f', 'synset': 'napkin.n.01', 'synonyms': ['napkin', 'table_napkin', 'serviette'], 'id': 713, 'def': 'a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing', 'name': 'napkin'}, {'frequency': 'r', 'synset': 'neckerchief.n.01', 'synonyms': ['neckerchief'], 'id': 714, 'def': 'a kerchief worn around the neck', 'name': 'neckerchief'}, {'frequency': 'f', 'synset': 'necklace.n.01', 'synonyms': ['necklace'], 'id': 715, 'def': 'jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament', 'name': 'necklace'}, {'frequency': 'f', 'synset': 'necktie.n.01', 'synonyms': ['necktie', 'tie_(necktie)'], 'id': 716, 'def': 'neckwear consisting of a long narrow piece of material worn under a collar and tied in knot at the front', 'name': 'necktie'}, {'frequency': 'c', 'synset': 'needle.n.03', 'synonyms': ['needle'], 'id': 717, 'def': 'a sharp pointed implement (usually metal)', 'name': 'needle'}, {'frequency': 'c', 'synset': 'nest.n.01', 'synonyms': ['nest'], 'id': 718, 'def': 'a structure in which animals lay eggs or give birth to their young', 'name': 'nest'}, {'frequency': 'f', 'synset': 'newspaper.n.01', 'synonyms': ['newspaper', 'paper_(newspaper)'], 'id': 719, 'def': 'a daily or weekly publication on folded sheets containing news, articles, and advertisements', 'name': 'newspaper'}, {'frequency': 'c', 'synset': 'newsstand.n.01', 'synonyms': ['newsstand'], 'id': 720, 'def': 'a stall where newspapers and other periodicals are sold', 'name': 'newsstand'}, {'frequency': 'c', 'synset': 'nightwear.n.01', 'synonyms': ['nightshirt', 'nightwear', 'sleepwear', 'nightclothes'], 'id': 721, 'def': 'garments designed to be worn in bed', 'name': 'nightshirt'}, {'frequency': 'r', 'synset': 'nosebag.n.01', 'synonyms': ['nosebag_(for_animals)', 'feedbag'], 'id': 722, 'def': 'a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head', 'name': 'nosebag_(for_animals)'}, {'frequency': 'c', 'synset': 'noseband.n.01', 'synonyms': ['noseband_(for_animals)', 'nosepiece_(for_animals)'], 'id': 723, 'def': "a strap that is the part of a bridle that goes over the animal's nose", 'name': 'noseband_(for_animals)'}, {'frequency': 'f', 'synset': 'notebook.n.01', 'synonyms': ['notebook'], 'id': 724, 'def': 'a book with blank pages for recording notes or memoranda', 'name': 'notebook'}, {'frequency': 'c', 'synset': 'notepad.n.01', 'synonyms': 
['notepad'], 'id': 725, 'def': 'a pad of paper for keeping notes', 'name': 'notepad'}, {'frequency': 'f', 'synset': 'nut.n.03', 'synonyms': ['nut'], 'id': 726, 'def': 'a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt', 'name': 'nut'}, {'frequency': 'r', 'synset': 'nutcracker.n.01', 'synonyms': ['nutcracker'], 'id': 727, 'def': 'a hand tool used to crack nuts open', 'name': 'nutcracker'}, {'frequency': 'f', 'synset': 'oar.n.01', 'synonyms': ['oar'], 'id': 728, 'def': 'an implement used to propel or steer a boat', 'name': 'oar'}, {'frequency': 'r', 'synset': 'octopus.n.01', 'synonyms': ['octopus_(food)'], 'id': 729, 'def': 'tentacles of octopus prepared as food', 'name': 'octopus_(food)'}, {'frequency': 'r', 'synset': 'octopus.n.02', 'synonyms': ['octopus_(animal)'], 'id': 730, 'def': 'bottom-living cephalopod having a soft oval body with eight long tentacles', 'name': 'octopus_(animal)'}, {'frequency': 'c', 'synset': 'oil_lamp.n.01', 'synonyms': ['oil_lamp', 'kerosene_lamp', 'kerosine_lamp'], 'id': 731, 'def': 'a lamp that burns oil (as kerosine) for light', 'name': 'oil_lamp'}, {'frequency': 'c', 'synset': 'olive_oil.n.01', 'synonyms': ['olive_oil'], 'id': 732, 'def': 'oil from olives', 'name': 'olive_oil'}, {'frequency': 'r', 'synset': 'omelet.n.01', 'synonyms': ['omelet', 'omelette'], 'id': 733, 'def': 'beaten eggs cooked until just set; may be folded around e.g. ham or cheese or jelly', 'name': 'omelet'}, {'frequency': 'f', 'synset': 'onion.n.01', 'synonyms': ['onion'], 'id': 734, 'def': 'the bulb of an onion plant', 'name': 'onion'}, {'frequency': 'f', 'synset': 'orange.n.01', 'synonyms': ['orange_(fruit)'], 'id': 735, 'def': 'orange (FRUIT of an orange tree)', 'name': 'orange_(fruit)'}, {'frequency': 'c', 'synset': 'orange_juice.n.01', 'synonyms': ['orange_juice'], 'id': 736, 'def': 'bottled or freshly squeezed juice of oranges', 'name': 'orange_juice'}, {'frequency': 'c', 'synset': 'ostrich.n.02', 'synonyms': ['ostrich'], 'id': 737, 'def': 'fast-running African flightless bird with two-toed feet; largest living bird', 'name': 'ostrich'}, {'frequency': 'f', 'synset': 'ottoman.n.03', 'synonyms': ['ottoman', 'pouf', 'pouffe', 'hassock'], 'id': 738, 'def': 'a thick standalone cushion used as a seat or footrest, often next to a chair', 'name': 'ottoman'}, {'frequency': 'f', 'synset': 'oven.n.01', 'synonyms': ['oven'], 'id': 739, 'def': 'kitchen appliance used for baking or roasting', 'name': 'oven'}, {'frequency': 'c', 'synset': 'overall.n.01', 'synonyms': ['overalls_(clothing)'], 'id': 740, 'def': 'work clothing consisting of denim trousers usually with a bib and shoulder straps', 'name': 'overalls_(clothing)'}, {'frequency': 'c', 'synset': 'owl.n.01', 'synonyms': ['owl'], 'id': 741, 'def': 'nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes', 'name': 'owl'}, {'frequency': 'c', 'synset': 'packet.n.03', 'synonyms': ['packet'], 'id': 742, 'def': 'a small package or bundle', 'name': 'packet'}, {'frequency': 'r', 'synset': 'pad.n.03', 'synonyms': ['inkpad', 'inking_pad', 'stamp_pad'], 'id': 743, 'def': 'absorbent material saturated with ink used to transfer ink evenly to a rubber stamp', 'name': 'inkpad'}, {'frequency': 'c', 'synset': 'pad.n.04', 'synonyms': ['pad'], 'id': 744, 'def': 'mostly arm/knee pads labeled', 'name': 'pad'}, {'frequency': 'f', 'synset': 'paddle.n.04', 'synonyms': ['paddle', 'boat_paddle'], 'id': 745, 'def': 'a short light oar used without an oarlock to propel a canoe or small 
boat', 'name': 'paddle'}, {'frequency': 'c', 'synset': 'padlock.n.01', 'synonyms': ['padlock'], 'id': 746, 'def': 'a detachable, portable lock', 'name': 'padlock'}, {'frequency': 'c', 'synset': 'paintbrush.n.01', 'synonyms': ['paintbrush'], 'id': 747, 'def': 'a brush used as an applicator to apply paint', 'name': 'paintbrush'}, {'frequency': 'f', 'synset': 'painting.n.01', 'synonyms': ['painting'], 'id': 748, 'def': 'graphic art consisting of an artistic composition made by applying paints to a surface', 'name': 'painting'}, {'frequency': 'f', 'synset': 'pajama.n.02', 'synonyms': ['pajamas', 'pyjamas'], 'id': 749, 'def': 'loose-fitting nightclothes worn for sleeping or lounging', 'name': 'pajamas'}, {'frequency': 'c', 'synset': 'palette.n.02', 'synonyms': ['palette', 'pallet'], 'id': 750, 'def': 'board that provides a flat surface on which artists mix paints and the range of colors used', 'name': 'palette'}, {'frequency': 'f', 'synset': 'pan.n.01', 'synonyms': ['pan_(for_cooking)', 'cooking_pan'], 'id': 751, 'def': 'cooking utensil consisting of a wide metal vessel', 'name': 'pan_(for_cooking)'}, {'frequency': 'r', 'synset': 'pan.n.03', 'synonyms': ['pan_(metal_container)'], 'id': 752, 'def': 'shallow container made of metal', 'name': 'pan_(metal_container)'}, {'frequency': 'c', 'synset': 'pancake.n.01', 'synonyms': ['pancake'], 'id': 753, 'def': 'a flat cake of thin batter fried on both sides on a griddle', 'name': 'pancake'}, {'frequency': 'r', 'synset': 'pantyhose.n.01', 'synonyms': ['pantyhose'], 'id': 754, 'def': "a woman's tights consisting of underpants and stockings", 'name': 'pantyhose'}, {'frequency': 'r', 'synset': 'papaya.n.02', 'synonyms': ['papaya'], 'id': 755, 'def': 'large oval melon-like tropical fruit with yellowish flesh', 'name': 'papaya'}, {'frequency': 'f', 'synset': 'paper_plate.n.01', 'synonyms': ['paper_plate'], 'id': 756, 'def': 'a disposable plate made of cardboard', 'name': 'paper_plate'}, {'frequency': 'f', 'synset': 'paper_towel.n.01', 'synonyms': ['paper_towel'], 'id': 757, 'def': 'a disposable towel made of absorbent paper', 'name': 'paper_towel'}, {'frequency': 'r', 'synset': 'paperback_book.n.01', 'synonyms': ['paperback_book', 'paper-back_book', 'softback_book', 'soft-cover_book'], 'id': 758, 'def': 'a book with paper covers', 'name': 'paperback_book'}, {'frequency': 'r', 'synset': 'paperweight.n.01', 'synonyms': ['paperweight'], 'id': 759, 'def': 'a weight used to hold down a stack of papers', 'name': 'paperweight'}, {'frequency': 'c', 'synset': 'parachute.n.01', 'synonyms': ['parachute'], 'id': 760, 'def': 'rescue equipment consisting of a device that fills with air and retards your fall', 'name': 'parachute'}, {'frequency': 'c', 'synset': 'parakeet.n.01', 'synonyms': ['parakeet', 'parrakeet', 'parroket', 'paraquet', 'paroquet', 'parroquet'], 'id': 761, 'def': 'any of numerous small slender long-tailed parrots', 'name': 'parakeet'}, {'frequency': 'c', 'synset': 'parasail.n.01', 'synonyms': ['parasail_(sports)'], 'id': 762, 'def': 'parachute that will lift a person up into the air when it is towed by a motorboat or a car', 'name': 'parasail_(sports)'}, {'frequency': 'c', 'synset': 'parasol.n.01', 'synonyms': ['parasol', 'sunshade'], 'id': 763, 'def': 'a handheld collapsible source of shade', 'name': 'parasol'}, {'frequency': 'r', 'synset': 'parchment.n.01', 'synonyms': ['parchment'], 'id': 764, 'def': 'a superior paper resembling sheepskin', 'name': 'parchment'}, {'frequency': 'c', 'synset': 'parka.n.01', 'synonyms': ['parka', 'anorak'], 'id': 765, 
'def': "a kind of heavy jacket (`windcheater' is a British term)", 'name': 'parka'}, {'frequency': 'f', 'synset': 'parking_meter.n.01', 'synonyms': ['parking_meter'], 'id': 766, 'def': 'a coin-operated timer located next to a parking space', 'name': 'parking_meter'}, {'frequency': 'c', 'synset': 'parrot.n.01', 'synonyms': ['parrot'], 'id': 767, 'def': 'usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds', 'name': 'parrot'}, {'frequency': 'c', 'synset': 'passenger_car.n.01', 'synonyms': ['passenger_car_(part_of_a_train)', 'coach_(part_of_a_train)'], 'id': 768, 'def': 'a railcar where passengers ride', 'name': 'passenger_car_(part_of_a_train)'}, {'frequency': 'r', 'synset': 'passenger_ship.n.01', 'synonyms': ['passenger_ship'], 'id': 769, 'def': 'a ship built to carry passengers', 'name': 'passenger_ship'}, {'frequency': 'c', 'synset': 'passport.n.02', 'synonyms': ['passport'], 'id': 770, 'def': 'a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home country', 'name': 'passport'}, {'frequency': 'f', 'synset': 'pastry.n.02', 'synonyms': ['pastry'], 'id': 771, 'def': 'any of various baked foods made of dough or batter', 'name': 'pastry'}, {'frequency': 'r', 'synset': 'patty.n.01', 'synonyms': ['patty_(food)'], 'id': 772, 'def': 'small flat mass of chopped food', 'name': 'patty_(food)'}, {'frequency': 'c', 'synset': 'pea.n.01', 'synonyms': ['pea_(food)'], 'id': 773, 'def': 'seed of a pea plant used for food', 'name': 'pea_(food)'}, {'frequency': 'c', 'synset': 'peach.n.03', 'synonyms': ['peach'], 'id': 774, 'def': 'downy juicy fruit with sweet yellowish or whitish flesh', 'name': 'peach'}, {'frequency': 'c', 'synset': 'peanut_butter.n.01', 'synonyms': ['peanut_butter'], 'id': 775, 'def': 'a spread made from ground peanuts', 'name': 'peanut_butter'}, {'frequency': 'f', 'synset': 'pear.n.01', 'synonyms': ['pear'], 'id': 776, 'def': 'sweet juicy gritty-textured fruit available in many varieties', 'name': 'pear'}, {'frequency': 'c', 'synset': 'peeler.n.03', 'synonyms': ['peeler_(tool_for_fruit_and_vegetables)'], 'id': 777, 'def': 'a device for peeling vegetables or fruits', 'name': 'peeler_(tool_for_fruit_and_vegetables)'}, {'frequency': 'r', 'synset': 'peg.n.04', 'synonyms': ['wooden_leg', 'pegleg'], 'id': 778, 'def': 'a prosthesis that replaces a missing leg', 'name': 'wooden_leg'}, {'frequency': 'r', 'synset': 'pegboard.n.01', 'synonyms': ['pegboard'], 'id': 779, 'def': 'a board perforated with regularly spaced holes into which pegs can be fitted', 'name': 'pegboard'}, {'frequency': 'c', 'synset': 'pelican.n.01', 'synonyms': ['pelican'], 'id': 780, 'def': 'large long-winged warm-water seabird having a large bill with a distensible pouch for fish', 'name': 'pelican'}, {'frequency': 'f', 'synset': 'pen.n.01', 'synonyms': ['pen'], 'id': 781, 'def': 'a writing implement with a point from which ink flows', 'name': 'pen'}, {'frequency': 'f', 'synset': 'pencil.n.01', 'synonyms': ['pencil'], 'id': 782, 'def': 'a thin cylindrical pointed writing implement made of wood and graphite', 'name': 'pencil'}, {'frequency': 'r', 'synset': 'pencil_box.n.01', 'synonyms': ['pencil_box', 'pencil_case'], 'id': 783, 'def': 'a box for holding pencils', 'name': 'pencil_box'}, {'frequency': 'r', 'synset': 'pencil_sharpener.n.01', 'synonyms': ['pencil_sharpener'], 'id': 784, 'def': 'a rotary implement for sharpening the point on pencils', 'name': 'pencil_sharpener'}, {'frequency': 'r', 'synset': 'pendulum.n.01', 'synonyms': 
['pendulum'], 'id': 785, 'def': 'an apparatus consisting of an object mounted so that it swings freely under the influence of gravity', 'name': 'pendulum'}, {'frequency': 'c', 'synset': 'penguin.n.01', 'synonyms': ['penguin'], 'id': 786, 'def': 'short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers', 'name': 'penguin'}, {'frequency': 'r', 'synset': 'pennant.n.02', 'synonyms': ['pennant'], 'id': 787, 'def': 'a flag longer than it is wide (and often tapering)', 'name': 'pennant'}, {'frequency': 'r', 'synset': 'penny.n.02', 'synonyms': ['penny_(coin)'], 'id': 788, 'def': 'a coin worth one-hundredth of the value of the basic unit', 'name': 'penny_(coin)'}, {'frequency': 'f', 'synset': 'pepper.n.03', 'synonyms': ['pepper', 'peppercorn'], 'id': 789, 'def': 'pungent seasoning from the berry of the common pepper plant; whole or ground', 'name': 'pepper'}, {'frequency': 'c', 'synset': 'pepper_mill.n.01', 'synonyms': ['pepper_mill', 'pepper_grinder'], 'id': 790, 'def': 'a mill for grinding pepper', 'name': 'pepper_mill'}, {'frequency': 'c', 'synset': 'perfume.n.02', 'synonyms': ['perfume'], 'id': 791, 'def': 'a toiletry that emits and diffuses a fragrant odor', 'name': 'perfume'}, {'frequency': 'r', 'synset': 'persimmon.n.02', 'synonyms': ['persimmon'], 'id': 792, 'def': 'orange fruit resembling a plum; edible when fully ripe', 'name': 'persimmon'}, {'frequency': 'f', 'synset': 'person.n.01', 'synonyms': ['person', 'baby', 'child', 'boy', 'girl', 'man', 'woman', 'human'], 'id': 793, 'def': 'a human being', 'name': 'person'}, {'frequency': 'c', 'synset': 'pet.n.01', 'synonyms': ['pet'], 'id': 794, 'def': 'a domesticated animal kept for companionship or amusement', 'name': 'pet'}, {'frequency': 'c', 'synset': 'pew.n.01', 'synonyms': ['pew_(church_bench)', 'church_bench'], 'id': 795, 'def': 'long bench with backs; used in church by the congregation', 'name': 'pew_(church_bench)'}, {'frequency': 'r', 'synset': 'phonebook.n.01', 'synonyms': ['phonebook', 'telephone_book', 'telephone_directory'], 'id': 796, 'def': 'a directory containing an alphabetical list of telephone subscribers and their telephone numbers', 'name': 'phonebook'}, {'frequency': 'c', 'synset': 'phonograph_record.n.01', 'synonyms': ['phonograph_record', 'phonograph_recording', 'record_(phonograph_recording)'], 'id': 797, 'def': 'sound recording consisting of a typically black disk with a continuous groove', 'name': 'phonograph_record'}, {'frequency': 'f', 'synset': 'piano.n.01', 'synonyms': ['piano'], 'id': 798, 'def': 'a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds', 'name': 'piano'}, {'frequency': 'f', 'synset': 'pickle.n.01', 'synonyms': ['pickle'], 'id': 799, 'def': 'vegetables (especially cucumbers) preserved in brine or vinegar', 'name': 'pickle'}, {'frequency': 'f', 'synset': 'pickup.n.01', 'synonyms': ['pickup_truck'], 'id': 800, 'def': 'a light truck with an open body and low sides and a tailboard', 'name': 'pickup_truck'}, {'frequency': 'c', 'synset': 'pie.n.01', 'synonyms': ['pie'], 'id': 801, 'def': 'dish baked in pastry-lined pan often with a pastry top', 'name': 'pie'}, {'frequency': 'c', 'synset': 'pigeon.n.01', 'synonyms': ['pigeon'], 'id': 802, 'def': 'wild and domesticated birds having a heavy body and short legs', 'name': 'pigeon'}, {'frequency': 'r', 'synset': 'piggy_bank.n.01', 'synonyms': ['piggy_bank', 'penny_bank'], 'id': 803, 'def': "a child's coin bank (often shaped like a pig)", 'name': 
'piggy_bank'}, {'frequency': 'f', 'synset': 'pillow.n.01', 'synonyms': ['pillow'], 'id': 804, 'def': 'a cushion to support the head of a sleeping person', 'name': 'pillow'}, {'frequency': 'r', 'synset': 'pin.n.09', 'synonyms': ['pin_(non_jewelry)'], 'id': 805, 'def': 'a small slender (often pointed) piece of wood or metal used to support or fasten or attach things', 'name': 'pin_(non_jewelry)'}, {'frequency': 'f', 'synset': 'pineapple.n.02', 'synonyms': ['pineapple'], 'id': 806, 'def': 'large sweet fleshy tropical fruit with a tuft of stiff leaves', 'name': 'pineapple'}, {'frequency': 'c', 'synset': 'pinecone.n.01', 'synonyms': ['pinecone'], 'id': 807, 'def': 'the seed-producing cone of a pine tree', 'name': 'pinecone'}, {'frequency': 'r', 'synset': 'ping-pong_ball.n.01', 'synonyms': ['ping-pong_ball'], 'id': 808, 'def': 'light hollow ball used in playing table tennis', 'name': 'ping-pong_ball'}, {'frequency': 'r', 'synset': 'pinwheel.n.03', 'synonyms': ['pinwheel'], 'id': 809, 'def': 'a toy consisting of vanes of colored paper or plastic that is pinned to a stick and spins when it is pointed into the wind', 'name': 'pinwheel'}, {'frequency': 'r', 'synset': 'pipe.n.01', 'synonyms': ['tobacco_pipe'], 'id': 810, 'def': 'a tube with a small bowl at one end; used for smoking tobacco', 'name': 'tobacco_pipe'}, {'frequency': 'f', 'synset': 'pipe.n.02', 'synonyms': ['pipe', 'piping'], 'id': 811, 'def': 'a long tube made of metal or plastic that is used to carry water or oil or gas etc.', 'name': 'pipe'}, {'frequency': 'r', 'synset': 'pistol.n.01', 'synonyms': ['pistol', 'handgun'], 'id': 812, 'def': 'a firearm that is held and fired with one hand', 'name': 'pistol'}, {'frequency': 'c', 'synset': 'pita.n.01', 'synonyms': ['pita_(bread)', 'pocket_bread'], 'id': 813, 'def': 'usually small round bread that can open into a pocket for filling', 'name': 'pita_(bread)'}, {'frequency': 'f', 'synset': 'pitcher.n.02', 'synonyms': ['pitcher_(vessel_for_liquid)', 'ewer'], 'id': 814, 'def': 'an open vessel with a handle and a spout for pouring', 'name': 'pitcher_(vessel_for_liquid)'}, {'frequency': 'r', 'synset': 'pitchfork.n.01', 'synonyms': ['pitchfork'], 'id': 815, 'def': 'a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay', 'name': 'pitchfork'}, {'frequency': 'f', 'synset': 'pizza.n.01', 'synonyms': ['pizza'], 'id': 816, 'def': 'Italian open pie made of thin bread dough spread with a spiced mixture of e.g. 
tomato sauce and cheese', 'name': 'pizza'}, {'frequency': 'f', 'synset': 'place_mat.n.01', 'synonyms': ['place_mat'], 'id': 817, 'def': 'a mat placed on a table for an individual place setting', 'name': 'place_mat'}, {'frequency': 'f', 'synset': 'plate.n.04', 'synonyms': ['plate'], 'id': 818, 'def': 'dish on which food is served or from which food is eaten', 'name': 'plate'}, {'frequency': 'c', 'synset': 'platter.n.01', 'synonyms': ['platter'], 'id': 819, 'def': 'a large shallow dish used for serving food', 'name': 'platter'}, {'frequency': 'r', 'synset': 'playpen.n.01', 'synonyms': ['playpen'], 'id': 820, 'def': 'a portable enclosure in which babies may be left to play', 'name': 'playpen'}, {'frequency': 'c', 'synset': 'pliers.n.01', 'synonyms': ['pliers', 'plyers'], 'id': 821, 'def': 'a gripping hand tool with two hinged arms and (usually) serrated jaws', 'name': 'pliers'}, {'frequency': 'r', 'synset': 'plow.n.01', 'synonyms': ['plow_(farm_equipment)', 'plough_(farm_equipment)'], 'id': 822, 'def': 'a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing', 'name': 'plow_(farm_equipment)'}, {'frequency': 'r', 'synset': 'plume.n.02', 'synonyms': ['plume'], 'id': 823, 'def': 'a feather or cluster of feathers worn as an ornament', 'name': 'plume'}, {'frequency': 'r', 'synset': 'pocket_watch.n.01', 'synonyms': ['pocket_watch'], 'id': 824, 'def': 'a watch that is carried in a small watch pocket', 'name': 'pocket_watch'}, {'frequency': 'c', 'synset': 'pocketknife.n.01', 'synonyms': ['pocketknife'], 'id': 825, 'def': 'a knife with a blade that folds into the handle; suitable for carrying in the pocket', 'name': 'pocketknife'}, {'frequency': 'c', 'synset': 'poker.n.01', 'synonyms': ['poker_(fire_stirring_tool)', 'stove_poker', 'fire_hook'], 'id': 826, 'def': 'fire iron consisting of a metal rod with a handle; used to stir a fire', 'name': 'poker_(fire_stirring_tool)'}, {'frequency': 'f', 'synset': 'pole.n.01', 'synonyms': ['pole', 'post'], 'id': 827, 'def': 'a long (usually round) rod of wood or metal or plastic', 'name': 'pole'}, {'frequency': 'f', 'synset': 'polo_shirt.n.01', 'synonyms': ['polo_shirt', 'sport_shirt'], 'id': 828, 'def': 'a shirt with short sleeves designed for comfort and casual wear', 'name': 'polo_shirt'}, {'frequency': 'r', 'synset': 'poncho.n.01', 'synonyms': ['poncho'], 'id': 829, 'def': 'a blanket-like cloak with a hole in the center for the head', 'name': 'poncho'}, {'frequency': 'c', 'synset': 'pony.n.05', 'synonyms': ['pony'], 'id': 830, 'def': 'any of various breeds of small gentle horses usually less than five feet high at the shoulder', 'name': 'pony'}, {'frequency': 'r', 'synset': 'pool_table.n.01', 'synonyms': ['pool_table', 'billiard_table', 'snooker_table'], 'id': 831, 'def': 'game equipment consisting of a heavy table on which pool is played', 'name': 'pool_table'}, {'frequency': 'f', 'synset': 'pop.n.02', 'synonyms': ['pop_(soda)', 'soda_(pop)', 'tonic', 'soft_drink'], 'id': 832, 'def': 'a sweet drink containing carbonated water and flavoring', 'name': 'pop_(soda)'}, {'frequency': 'c', 'synset': 'postbox.n.01', 'synonyms': ['postbox_(public)', 'mailbox_(public)'], 'id': 833, 'def': 'public box for deposit of mail', 'name': 'postbox_(public)'}, {'frequency': 'c', 'synset': 'postcard.n.01', 'synonyms': ['postcard', 'postal_card', 'mailing-card'], 'id': 834, 'def': 'a card for sending messages by post without an envelope', 'name': 'postcard'}, {'frequency': 'f', 'synset': 'poster.n.01', 'synonyms': ['poster', 'placard'], 'id': 
835, 'def': 'a sign posted in a public place as an advertisement', 'name': 'poster'}, {'frequency': 'f', 'synset': 'pot.n.01', 'synonyms': ['pot'], 'id': 836, 'def': 'metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid', 'name': 'pot'}, {'frequency': 'f', 'synset': 'pot.n.04', 'synonyms': ['flowerpot'], 'id': 837, 'def': 'a container in which plants are cultivated', 'name': 'flowerpot'}, {'frequency': 'f', 'synset': 'potato.n.01', 'synonyms': ['potato'], 'id': 838, 'def': 'an edible tuber native to South America', 'name': 'potato'}, {'frequency': 'c', 'synset': 'potholder.n.01', 'synonyms': ['potholder'], 'id': 839, 'def': 'an insulated pad for holding hot pots', 'name': 'potholder'}, {'frequency': 'c', 'synset': 'pottery.n.01', 'synonyms': ['pottery', 'clayware'], 'id': 840, 'def': 'ceramic ware made from clay and baked in a kiln', 'name': 'pottery'}, {'frequency': 'c', 'synset': 'pouch.n.01', 'synonyms': ['pouch'], 'id': 841, 'def': 'a small or medium size container for holding or carrying things', 'name': 'pouch'}, {'frequency': 'c', 'synset': 'power_shovel.n.01', 'synonyms': ['power_shovel', 'excavator', 'digger'], 'id': 842, 'def': 'a machine for excavating', 'name': 'power_shovel'}, {'frequency': 'c', 'synset': 'prawn.n.01', 'synonyms': ['prawn', 'shrimp'], 'id': 843, 'def': 'any of various edible decapod crustaceans', 'name': 'prawn'}, {'frequency': 'c', 'synset': 'pretzel.n.01', 'synonyms': ['pretzel'], 'id': 844, 'def': 'glazed and salted cracker typically in the shape of a loose knot', 'name': 'pretzel'}, {'frequency': 'f', 'synset': 'printer.n.03', 'synonyms': ['printer', 'printing_machine'], 'id': 845, 'def': 'a machine that prints', 'name': 'printer'}, {'frequency': 'c', 'synset': 'projectile.n.01', 'synonyms': ['projectile_(weapon)', 'missile'], 'id': 846, 'def': 'a weapon that is forcibly thrown or projected at a targets', 'name': 'projectile_(weapon)'}, {'frequency': 'c', 'synset': 'projector.n.02', 'synonyms': ['projector'], 'id': 847, 'def': 'an optical instrument that projects an enlarged image onto a screen', 'name': 'projector'}, {'frequency': 'f', 'synset': 'propeller.n.01', 'synonyms': ['propeller', 'propellor'], 'id': 848, 'def': 'a mechanical device that rotates to push against air or water', 'name': 'propeller'}, {'frequency': 'r', 'synset': 'prune.n.01', 'synonyms': ['prune'], 'id': 849, 'def': 'dried plum', 'name': 'prune'}, {'frequency': 'r', 'synset': 'pudding.n.01', 'synonyms': ['pudding'], 'id': 850, 'def': 'any of various soft thick unsweetened baked dishes', 'name': 'pudding'}, {'frequency': 'r', 'synset': 'puffer.n.02', 'synonyms': ['puffer_(fish)', 'pufferfish', 'blowfish', 'globefish'], 'id': 851, 'def': 'fishes whose elongated spiny body can inflate itself with water or air to form a globe', 'name': 'puffer_(fish)'}, {'frequency': 'r', 'synset': 'puffin.n.01', 'synonyms': ['puffin'], 'id': 852, 'def': 'seabirds having short necks and brightly colored compressed bills', 'name': 'puffin'}, {'frequency': 'r', 'synset': 'pug.n.01', 'synonyms': ['pug-dog'], 'id': 853, 'def': 'small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle', 'name': 'pug-dog'}, {'frequency': 'c', 'synset': 'pumpkin.n.02', 'synonyms': ['pumpkin'], 'id': 854, 'def': 'usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn', 'name': 'pumpkin'}, {'frequency': 'r', 'synset': 'punch.n.03', 'synonyms': ['puncher'], 'id': 855, 'def': 'a tool for 
making holes or indentations', 'name': 'puncher'}, {'frequency': 'r', 'synset': 'puppet.n.01', 'synonyms': ['puppet', 'marionette'], 'id': 856, 'def': 'a small figure of a person operated from above with strings by a puppeteer', 'name': 'puppet'}, {'frequency': 'c', 'synset': 'puppy.n.01', 'synonyms': ['puppy'], 'id': 857, 'def': 'a young dog', 'name': 'puppy'}, {'frequency': 'r', 'synset': 'quesadilla.n.01', 'synonyms': ['quesadilla'], 'id': 858, 'def': 'a tortilla that is filled with cheese and heated', 'name': 'quesadilla'}, {'frequency': 'r', 'synset': 'quiche.n.02', 'synonyms': ['quiche'], 'id': 859, 'def': 'a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)', 'name': 'quiche'}, {'frequency': 'f', 'synset': 'quilt.n.01', 'synonyms': ['quilt', 'comforter'], 'id': 860, 'def': 'bedding made of two layers of cloth filled with stuffing and stitched together', 'name': 'quilt'}, {'frequency': 'c', 'synset': 'rabbit.n.01', 'synonyms': ['rabbit'], 'id': 861, 'def': 'any of various burrowing animals of the family Leporidae having long ears and short tails', 'name': 'rabbit'}, {'frequency': 'r', 'synset': 'racer.n.02', 'synonyms': ['race_car', 'racing_car'], 'id': 862, 'def': 'a fast car that competes in races', 'name': 'race_car'}, {'frequency': 'c', 'synset': 'racket.n.04', 'synonyms': ['racket', 'racquet'], 'id': 863, 'def': 'a sports implement used to strike a ball in various games', 'name': 'racket'}, {'frequency': 'r', 'synset': 'radar.n.01', 'synonyms': ['radar'], 'id': 864, 'def': 'measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects', 'name': 'radar'}, {'frequency': 'f', 'synset': 'radiator.n.03', 'synonyms': ['radiator'], 'id': 865, 'def': 'a mechanism consisting of a metal honeycomb through which hot fluids circulate', 'name': 'radiator'}, {'frequency': 'c', 'synset': 'radio_receiver.n.01', 'synonyms': ['radio_receiver', 'radio_set', 'radio', 'tuner_(radio)'], 'id': 866, 'def': 'an electronic receiver that detects and demodulates and amplifies transmitted radio signals', 'name': 'radio_receiver'}, {'frequency': 'c', 'synset': 'radish.n.03', 'synonyms': ['radish', 'daikon'], 'id': 867, 'def': 'pungent edible root of any of various cultivated radish plants', 'name': 'radish'}, {'frequency': 'c', 'synset': 'raft.n.01', 'synonyms': ['raft'], 'id': 868, 'def': 'a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers', 'name': 'raft'}, {'frequency': 'r', 'synset': 'rag_doll.n.01', 'synonyms': ['rag_doll'], 'id': 869, 'def': 'a cloth doll that is stuffed and (usually) painted', 'name': 'rag_doll'}, {'frequency': 'c', 'synset': 'raincoat.n.01', 'synonyms': ['raincoat', 'waterproof_jacket'], 'id': 870, 'def': 'a water-resistant coat', 'name': 'raincoat'}, {'frequency': 'c', 'synset': 'ram.n.05', 'synonyms': ['ram_(animal)'], 'id': 871, 'def': 'uncastrated adult male sheep', 'name': 'ram_(animal)'}, {'frequency': 'c', 'synset': 'raspberry.n.02', 'synonyms': ['raspberry'], 'id': 872, 'def': 'red or black edible aggregate berries usually smaller than the related blackberries', 'name': 'raspberry'}, {'frequency': 'r', 'synset': 'rat.n.01', 'synonyms': ['rat'], 'id': 873, 'def': 'any of various long-tailed rodents similar to but larger than a mouse', 'name': 'rat'}, {'frequency': 'c', 'synset': 'razorblade.n.01', 'synonyms': ['razorblade'], 'id': 874, 'def': 'a blade that has very sharp edge', 'name': 
'razorblade'}, {'frequency': 'c', 'synset': 'reamer.n.01', 'synonyms': ['reamer_(juicer)', 'juicer', 'juice_reamer'], 'id': 875, 'def': 'a squeezer with a conical ridged center that is used for squeezing juice from citrus fruit', 'name': 'reamer_(juicer)'}, {'frequency': 'f', 'synset': 'rearview_mirror.n.01', 'synonyms': ['rearview_mirror'], 'id': 876, 'def': 'vehicle mirror (side or rearview)', 'name': 'rearview_mirror'}, {'frequency': 'c', 'synset': 'receipt.n.02', 'synonyms': ['receipt'], 'id': 877, 'def': 'an acknowledgment (usually tangible) that payment has been made', 'name': 'receipt'}, {'frequency': 'c', 'synset': 'recliner.n.01', 'synonyms': ['recliner', 'reclining_chair', 'lounger_(chair)'], 'id': 878, 'def': 'an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it', 'name': 'recliner'}, {'frequency': 'c', 'synset': 'record_player.n.01', 'synonyms': ['record_player', 'phonograph_(record_player)', 'turntable'], 'id': 879, 'def': 'machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically', 'name': 'record_player'}, {'frequency': 'f', 'synset': 'reflector.n.01', 'synonyms': ['reflector'], 'id': 880, 'def': 'device that reflects light, radiation, etc.', 'name': 'reflector'}, {'frequency': 'f', 'synset': 'remote_control.n.01', 'synonyms': ['remote_control'], 'id': 881, 'def': 'a device that can be used to control a machine or apparatus from a distance', 'name': 'remote_control'}, {'frequency': 'c', 'synset': 'rhinoceros.n.01', 'synonyms': ['rhinoceros'], 'id': 882, 'def': 'massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the snout', 'name': 'rhinoceros'}, {'frequency': 'r', 'synset': 'rib.n.03', 'synonyms': ['rib_(food)'], 'id': 883, 'def': 'cut of meat including one or more ribs', 'name': 'rib_(food)'}, {'frequency': 'c', 'synset': 'rifle.n.01', 'synonyms': ['rifle'], 'id': 884, 'def': 'a shoulder firearm with a long barrel', 'name': 'rifle'}, {'frequency': 'f', 'synset': 'ring.n.08', 'synonyms': ['ring'], 'id': 885, 'def': 'jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger', 'name': 'ring'}, {'frequency': 'r', 'synset': 'river_boat.n.01', 'synonyms': ['river_boat'], 'id': 886, 'def': 'a boat used on rivers or to ply a river', 'name': 'river_boat'}, {'frequency': 'r', 'synset': 'road_map.n.02', 'synonyms': ['road_map'], 'id': 887, 'def': '(NOT A ROAD) a MAP showing roads (for automobile travel)', 'name': 'road_map'}, {'frequency': 'c', 'synset': 'robe.n.01', 'synonyms': ['robe'], 'id': 888, 'def': 'any loose flowing garment', 'name': 'robe'}, {'frequency': 'c', 'synset': 'rocking_chair.n.01', 'synonyms': ['rocking_chair'], 'id': 889, 'def': 'a chair mounted on rockers', 'name': 'rocking_chair'}, {'frequency': 'r', 'synset': 'rodent.n.01', 'synonyms': ['rodent'], 'id': 890, 'def': 'relatively small placental mammals having a single pair of constantly growing incisor teeth specialized for gnawing', 'name': 'rodent'}, {'frequency': 'r', 'synset': 'roller_skate.n.01', 'synonyms': ['roller_skate'], 'id': 891, 'def': 'a shoe with pairs of rollers (small hard wheels) fixed to the sole', 'name': 'roller_skate'}, {'frequency': 'r', 'synset': 'rollerblade.n.01', 'synonyms': ['Rollerblade'], 'id': 892, 'def': 'an in-line variant of a roller skate', 'name': 'Rollerblade'}, {'frequency': 'c', 'synset': 'rolling_pin.n.01', 'synonyms': ['rolling_pin'], 'id': 893, 'def': 
'utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough', 'name': 'rolling_pin'}, {'frequency': 'r', 'synset': 'root_beer.n.01', 'synonyms': ['root_beer'], 'id': 894, 'def': 'carbonated drink containing extracts of roots and herbs', 'name': 'root_beer'}, {'frequency': 'c', 'synset': 'router.n.02', 'synonyms': ['router_(computer_equipment)'], 'id': 895, 'def': 'a device that forwards data packets between computer networks', 'name': 'router_(computer_equipment)'}, {'frequency': 'f', 'synset': 'rubber_band.n.01', 'synonyms': ['rubber_band', 'elastic_band'], 'id': 896, 'def': 'a narrow band of elastic rubber used to hold things (such as papers) together', 'name': 'rubber_band'}, {'frequency': 'c', 'synset': 'runner.n.08', 'synonyms': ['runner_(carpet)'], 'id': 897, 'def': 'a long narrow carpet', 'name': 'runner_(carpet)'}, {'frequency': 'f', 'synset': 'sack.n.01', 'synonyms': ['plastic_bag', 'paper_bag'], 'id': 898, 'def': "a bag made of paper or plastic for holding customer's purchases", 'name': 'plastic_bag'}, {'frequency': 'f', 'synset': 'saddle.n.01', 'synonyms': ['saddle_(on_an_animal)'], 'id': 899, 'def': 'a seat for the rider of a horse or camel', 'name': 'saddle_(on_an_animal)'}, {'frequency': 'f', 'synset': 'saddle_blanket.n.01', 'synonyms': ['saddle_blanket', 'saddlecloth', 'horse_blanket'], 'id': 900, 'def': 'stable gear consisting of a blanket placed under the saddle', 'name': 'saddle_blanket'}, {'frequency': 'c', 'synset': 'saddlebag.n.01', 'synonyms': ['saddlebag'], 'id': 901, 'def': 'a large bag (or pair of bags) hung over a saddle', 'name': 'saddlebag'}, {'frequency': 'r', 'synset': 'safety_pin.n.01', 'synonyms': ['safety_pin'], 'id': 902, 'def': 'a pin in the form of a clasp; has a guard so the point of the pin will not stick the user', 'name': 'safety_pin'}, {'frequency': 'f', 'synset': 'sail.n.01', 'synonyms': ['sail'], 'id': 903, 'def': 'a large piece of fabric by means of which wind is used to propel a sailing vessel', 'name': 'sail'}, {'frequency': 'f', 'synset': 'salad.n.01', 'synonyms': ['salad'], 'id': 904, 'def': 'food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens', 'name': 'salad'}, {'frequency': 'r', 'synset': 'salad_plate.n.01', 'synonyms': ['salad_plate', 'salad_bowl'], 'id': 905, 'def': 'a plate or bowl for individual servings of salad', 'name': 'salad_plate'}, {'frequency': 'c', 'synset': 'salami.n.01', 'synonyms': ['salami'], 'id': 906, 'def': 'highly seasoned fatty sausage of pork and beef usually dried', 'name': 'salami'}, {'frequency': 'c', 'synset': 'salmon.n.01', 'synonyms': ['salmon_(fish)'], 'id': 907, 'def': 'any of various large food and game fishes of northern waters', 'name': 'salmon_(fish)'}, {'frequency': 'r', 'synset': 'salmon.n.03', 'synonyms': ['salmon_(food)'], 'id': 908, 'def': 'flesh of any of various marine or freshwater fish of the family Salmonidae', 'name': 'salmon_(food)'}, {'frequency': 'c', 'synset': 'salsa.n.01', 'synonyms': ['salsa'], 'id': 909, 'def': 'spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods', 'name': 'salsa'}, {'frequency': 'f', 'synset': 'saltshaker.n.01', 'synonyms': ['saltshaker'], 'id': 910, 'def': 'a shaker with a perforated top for sprinkling salt', 'name': 'saltshaker'}, {'frequency': 'f', 'synset': 'sandal.n.01', 'synonyms': ['sandal_(type_of_shoe)'], 'id': 911, 'def': 'a shoe consisting of a sole fastened by straps to the foot', 'name': 'sandal_(type_of_shoe)'}, 
{'frequency': 'f', 'synset': 'sandwich.n.01', 'synonyms': ['sandwich'], 'id': 912, 'def': 'two (or more) slices of bread with a filling between them', 'name': 'sandwich'}, {'frequency': 'r', 'synset': 'satchel.n.01', 'synonyms': ['satchel'], 'id': 913, 'def': 'luggage consisting of a small case with a flat bottom and (usually) a shoulder strap', 'name': 'satchel'}, {'frequency': 'r', 'synset': 'saucepan.n.01', 'synonyms': ['saucepan'], 'id': 914, 'def': 'a deep pan with a handle; used for stewing or boiling', 'name': 'saucepan'}, {'frequency': 'f', 'synset': 'saucer.n.02', 'synonyms': ['saucer'], 'id': 915, 'def': 'a small shallow dish for holding a cup at the table', 'name': 'saucer'}, {'frequency': 'f', 'synset': 'sausage.n.01', 'synonyms': ['sausage'], 'id': 916, 'def': 'highly seasoned minced meat stuffed in casings', 'name': 'sausage'}, {'frequency': 'r', 'synset': 'sawhorse.n.01', 'synonyms': ['sawhorse', 'sawbuck'], 'id': 917, 'def': 'a framework for holding wood that is being sawed', 'name': 'sawhorse'}, {'frequency': 'r', 'synset': 'sax.n.02', 'synonyms': ['saxophone'], 'id': 918, 'def': "a wind instrument with a `J'-shaped form typically made of brass", 'name': 'saxophone'}, {'frequency': 'f', 'synset': 'scale.n.07', 'synonyms': ['scale_(measuring_instrument)'], 'id': 919, 'def': 'a measuring instrument for weighing; shows amount of mass', 'name': 'scale_(measuring_instrument)'}, {'frequency': 'r', 'synset': 'scarecrow.n.01', 'synonyms': ['scarecrow', 'strawman'], 'id': 920, 'def': 'an effigy in the shape of a man to frighten birds away from seeds', 'name': 'scarecrow'}, {'frequency': 'f', 'synset': 'scarf.n.01', 'synonyms': ['scarf'], 'id': 921, 'def': 'a garment worn around the head or neck or shoulders for warmth or decoration', 'name': 'scarf'}, {'frequency': 'c', 'synset': 'school_bus.n.01', 'synonyms': ['school_bus'], 'id': 922, 'def': 'a bus used to transport children to or from school', 'name': 'school_bus'}, {'frequency': 'f', 'synset': 'scissors.n.01', 'synonyms': ['scissors'], 'id': 923, 'def': 'a tool having two crossed pivoting blades with looped handles', 'name': 'scissors'}, {'frequency': 'f', 'synset': 'scoreboard.n.01', 'synonyms': ['scoreboard'], 'id': 924, 'def': 'a large board for displaying the score of a contest (and some other information)', 'name': 'scoreboard'}, {'frequency': 'r', 'synset': 'scraper.n.01', 'synonyms': ['scraper'], 'id': 925, 'def': 'any of various hand tools for scraping', 'name': 'scraper'}, {'frequency': 'c', 'synset': 'screwdriver.n.01', 'synonyms': ['screwdriver'], 'id': 926, 'def': 'a hand tool for driving screws; has a tip that fits into the head of a screw', 'name': 'screwdriver'}, {'frequency': 'f', 'synset': 'scrub_brush.n.01', 'synonyms': ['scrubbing_brush'], 'id': 927, 'def': 'a brush with short stiff bristles for heavy cleaning', 'name': 'scrubbing_brush'}, {'frequency': 'c', 'synset': 'sculpture.n.01', 'synonyms': ['sculpture'], 'id': 928, 'def': 'a three-dimensional work of art', 'name': 'sculpture'}, {'frequency': 'c', 'synset': 'seabird.n.01', 'synonyms': ['seabird', 'seafowl'], 'id': 929, 'def': 'a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.', 'name': 'seabird'}, {'frequency': 'c', 'synset': 'seahorse.n.02', 'synonyms': ['seahorse'], 'id': 930, 'def': 'small fish with horse-like heads bent sharply downward and curled tails', 'name': 'seahorse'}, {'frequency': 'r', 'synset': 'seaplane.n.01', 'synonyms': ['seaplane', 'hydroplane'], 'id': 931, 'def': 
'an airplane that can land on or take off from water', 'name': 'seaplane'}, {'frequency': 'c', 'synset': 'seashell.n.01', 'synonyms': ['seashell'], 'id': 932, 'def': 'the shell of a marine organism', 'name': 'seashell'}, {'frequency': 'c', 'synset': 'sewing_machine.n.01', 'synonyms': ['sewing_machine'], 'id': 933, 'def': 'a textile machine used as a home appliance for sewing', 'name': 'sewing_machine'}, {'frequency': 'c', 'synset': 'shaker.n.03', 'synonyms': ['shaker'], 'id': 934, 'def': 'a container in which something can be shaken', 'name': 'shaker'}, {'frequency': 'c', 'synset': 'shampoo.n.01', 'synonyms': ['shampoo'], 'id': 935, 'def': 'cleansing agent consisting of soaps or detergents used for washing the hair', 'name': 'shampoo'}, {'frequency': 'c', 'synset': 'shark.n.01', 'synonyms': ['shark'], 'id': 936, 'def': 'typically large carnivorous fishes with sharp teeth', 'name': 'shark'}, {'frequency': 'r', 'synset': 'sharpener.n.01', 'synonyms': ['sharpener'], 'id': 937, 'def': 'any implement that is used to make something (an edge or a point) sharper', 'name': 'sharpener'}, {'frequency': 'r', 'synset': 'sharpie.n.03', 'synonyms': ['Sharpie'], 'id': 938, 'def': 'a pen with indelible ink that will write on any surface', 'name': 'Sharpie'}, {'frequency': 'r', 'synset': 'shaver.n.03', 'synonyms': ['shaver_(electric)', 'electric_shaver', 'electric_razor'], 'id': 939, 'def': 'a razor powered by an electric motor', 'name': 'shaver_(electric)'}, {'frequency': 'c', 'synset': 'shaving_cream.n.01', 'synonyms': ['shaving_cream', 'shaving_soap'], 'id': 940, 'def': 'toiletry that forms a rich lather for softening the beard before shaving', 'name': 'shaving_cream'}, {'frequency': 'r', 'synset': 'shawl.n.01', 'synonyms': ['shawl'], 'id': 941, 'def': 'cloak consisting of an oblong piece of cloth used to cover the head and shoulders', 'name': 'shawl'}, {'frequency': 'r', 'synset': 'shears.n.01', 'synonyms': ['shears'], 'id': 942, 'def': 'large scissors with strong blades', 'name': 'shears'}, {'frequency': 'f', 'synset': 'sheep.n.01', 'synonyms': ['sheep'], 'id': 943, 'def': 'woolly usually horned ruminant mammal related to the goat', 'name': 'sheep'}, {'frequency': 'r', 'synset': 'shepherd_dog.n.01', 'synonyms': ['shepherd_dog', 'sheepdog'], 'id': 944, 'def': 'any of various usually long-haired breeds of dog reared to herd and guard sheep', 'name': 'shepherd_dog'}, {'frequency': 'r', 'synset': 'sherbert.n.01', 'synonyms': ['sherbert', 'sherbet'], 'id': 945, 'def': 'a frozen dessert made primarily of fruit juice and sugar', 'name': 'sherbert'}, {'frequency': 'c', 'synset': 'shield.n.02', 'synonyms': ['shield'], 'id': 946, 'def': 'armor carried on the arm to intercept blows', 'name': 'shield'}, {'frequency': 'f', 'synset': 'shirt.n.01', 'synonyms': ['shirt'], 'id': 947, 'def': 'a garment worn on the upper half of the body', 'name': 'shirt'}, {'frequency': 'f', 'synset': 'shoe.n.01', 'synonyms': ['shoe', 'sneaker_(type_of_shoe)', 'tennis_shoe'], 'id': 948, 'def': 'common footwear covering the foot', 'name': 'shoe'}, {'frequency': 'f', 'synset': 'shopping_bag.n.01', 'synonyms': ['shopping_bag'], 'id': 949, 'def': 'a bag made of plastic or strong paper (often with handles); used to transport goods after shopping', 'name': 'shopping_bag'}, {'frequency': 'c', 'synset': 'shopping_cart.n.01', 'synonyms': ['shopping_cart'], 'id': 950, 'def': 'a handcart that holds groceries or other goods while shopping', 'name': 'shopping_cart'}, {'frequency': 'f', 'synset': 'short_pants.n.01', 'synonyms': 
['short_pants', 'shorts_(clothing)', 'trunks_(clothing)'], 'id': 951, 'def': 'trousers that end at or above the knee', 'name': 'short_pants'}, {'frequency': 'r', 'synset': 'shot_glass.n.01', 'synonyms': ['shot_glass'], 'id': 952, 'def': 'a small glass adequate to hold a single swallow of whiskey', 'name': 'shot_glass'}, {'frequency': 'f', 'synset': 'shoulder_bag.n.01', 'synonyms': ['shoulder_bag'], 'id': 953, 'def': 'a large handbag that can be carried by a strap looped over the shoulder', 'name': 'shoulder_bag'}, {'frequency': 'c', 'synset': 'shovel.n.01', 'synonyms': ['shovel'], 'id': 954, 'def': 'a hand tool for lifting loose material such as snow, dirt, etc.', 'name': 'shovel'}, {'frequency': 'f', 'synset': 'shower.n.01', 'synonyms': ['shower_head'], 'id': 955, 'def': 'a plumbing fixture that sprays water over you', 'name': 'shower_head'}, {'frequency': 'r', 'synset': 'shower_cap.n.01', 'synonyms': ['shower_cap'], 'id': 956, 'def': 'a tight cap worn to keep hair dry while showering', 'name': 'shower_cap'}, {'frequency': 'f', 'synset': 'shower_curtain.n.01', 'synonyms': ['shower_curtain'], 'id': 957, 'def': 'a curtain that keeps water from splashing out of the shower area', 'name': 'shower_curtain'}, {'frequency': 'r', 'synset': 'shredder.n.01', 'synonyms': ['shredder_(for_paper)'], 'id': 958, 'def': 'a device that shreds documents', 'name': 'shredder_(for_paper)'}, {'frequency': 'f', 'synset': 'signboard.n.01', 'synonyms': ['signboard'], 'id': 959, 'def': 'structure displaying a board on which advertisements can be posted', 'name': 'signboard'}, {'frequency': 'c', 'synset': 'silo.n.01', 'synonyms': ['silo'], 'id': 960, 'def': 'a cylindrical tower used for storing goods', 'name': 'silo'}, {'frequency': 'f', 'synset': 'sink.n.01', 'synonyms': ['sink'], 'id': 961, 'def': 'plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe', 'name': 'sink'}, {'frequency': 'f', 'synset': 'skateboard.n.01', 'synonyms': ['skateboard'], 'id': 962, 'def': 'a board with wheels that is ridden in a standing or crouching position and propelled by foot', 'name': 'skateboard'}, {'frequency': 'c', 'synset': 'skewer.n.01', 'synonyms': ['skewer'], 'id': 963, 'def': 'a long pin for holding meat in position while it is being roasted', 'name': 'skewer'}, {'frequency': 'f', 'synset': 'ski.n.01', 'synonyms': ['ski'], 'id': 964, 'def': 'sports equipment for skiing on snow', 'name': 'ski'}, {'frequency': 'f', 'synset': 'ski_boot.n.01', 'synonyms': ['ski_boot'], 'id': 965, 'def': 'a stiff boot that is fastened to a ski with a ski binding', 'name': 'ski_boot'}, {'frequency': 'f', 'synset': 'ski_parka.n.01', 'synonyms': ['ski_parka', 'ski_jacket'], 'id': 966, 'def': 'a parka to be worn while skiing', 'name': 'ski_parka'}, {'frequency': 'f', 'synset': 'ski_pole.n.01', 'synonyms': ['ski_pole'], 'id': 967, 'def': 'a pole with metal points used as an aid in skiing', 'name': 'ski_pole'}, {'frequency': 'f', 'synset': 'skirt.n.02', 'synonyms': ['skirt'], 'id': 968, 'def': 'a garment hanging from the waist; worn mainly by girls and women', 'name': 'skirt'}, {'frequency': 'r', 'synset': 'skullcap.n.01', 'synonyms': ['skullcap'], 'id': 969, 'def': 'rounded brimless cap fitting the crown of the head', 'name': 'skullcap'}, {'frequency': 'c', 'synset': 'sled.n.01', 'synonyms': ['sled', 'sledge', 'sleigh'], 'id': 970, 'def': 'a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.', 'name': 'sled'}, {'frequency': 'c', 'synset': 'sleeping_bag.n.01', 'synonyms': 
['sleeping_bag'], 'id': 971, 'def': 'large padded bag designed to be slept in outdoors', 'name': 'sleeping_bag'}, {'frequency': 'r', 'synset': 'sling.n.05', 'synonyms': ['sling_(bandage)', 'triangular_bandage'], 'id': 972, 'def': 'bandage to support an injured forearm; slung over the shoulder or neck', 'name': 'sling_(bandage)'}, {'frequency': 'c', 'synset': 'slipper.n.01', 'synonyms': ['slipper_(footwear)', 'carpet_slipper_(footwear)'], 'id': 973, 'def': 'low footwear that can be slipped on and off easily; usually worn indoors', 'name': 'slipper_(footwear)'}, {'frequency': 'r', 'synset': 'smoothie.n.02', 'synonyms': ['smoothie'], 'id': 974, 'def': 'a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk', 'name': 'smoothie'}, {'frequency': 'r', 'synset': 'snake.n.01', 'synonyms': ['snake', 'serpent'], 'id': 975, 'def': 'limbless scaly elongate reptile; some are venomous', 'name': 'snake'}, {'frequency': 'f', 'synset': 'snowboard.n.01', 'synonyms': ['snowboard'], 'id': 976, 'def': 'a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes', 'name': 'snowboard'}, {'frequency': 'c', 'synset': 'snowman.n.01', 'synonyms': ['snowman'], 'id': 977, 'def': 'a figure of a person made of packed snow', 'name': 'snowman'}, {'frequency': 'c', 'synset': 'snowmobile.n.01', 'synonyms': ['snowmobile'], 'id': 978, 'def': 'tracked vehicle for travel on snow having skis in front', 'name': 'snowmobile'}, {'frequency': 'f', 'synset': 'soap.n.01', 'synonyms': ['soap'], 'id': 979, 'def': 'a cleansing agent made from the salts of vegetable or animal fats', 'name': 'soap'}, {'frequency': 'f', 'synset': 'soccer_ball.n.01', 'synonyms': ['soccer_ball'], 'id': 980, 'def': "an inflated ball used in playing soccer (called `football' outside of the United States)", 'name': 'soccer_ball'}, {'frequency': 'f', 'synset': 'sock.n.01', 'synonyms': ['sock'], 'id': 981, 'def': 'cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee', 'name': 'sock'}, {'frequency': 'f', 'synset': 'sofa.n.01', 'synonyms': ['sofa', 'couch', 'lounge'], 'id': 982, 'def': 'an upholstered seat for more than one person', 'name': 'sofa'}, {'frequency': 'r', 'synset': 'softball.n.01', 'synonyms': ['softball'], 'id': 983, 'def': 'ball used in playing softball', 'name': 'softball'}, {'frequency': 'c', 'synset': 'solar_array.n.01', 'synonyms': ['solar_array', 'solar_battery', 'solar_panel'], 'id': 984, 'def': 'electrical device consisting of a large array of connected solar cells', 'name': 'solar_array'}, {'frequency': 'r', 'synset': 'sombrero.n.02', 'synonyms': ['sombrero'], 'id': 985, 'def': 'a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico', 'name': 'sombrero'}, {'frequency': 'f', 'synset': 'soup.n.01', 'synonyms': ['soup'], 'id': 986, 'def': 'liquid food especially of meat or fish or vegetable stock often containing pieces of solid food', 'name': 'soup'}, {'frequency': 'r', 'synset': 'soup_bowl.n.01', 'synonyms': ['soup_bowl'], 'id': 987, 'def': 'a bowl for serving soup', 'name': 'soup_bowl'}, {'frequency': 'c', 'synset': 'soupspoon.n.01', 'synonyms': ['soupspoon'], 'id': 988, 'def': 'a spoon with a rounded bowl for eating soup', 'name': 'soupspoon'}, {'frequency': 'c', 'synset': 'sour_cream.n.01', 'synonyms': ['sour_cream', 'soured_cream'], 'id': 989, 'def': 'soured light cream', 'name': 'sour_cream'}, {'frequency': 'r', 'synset': 'soya_milk.n.01', 'synonyms': ['soya_milk', 
'soybean_milk', 'soymilk'], 'id': 990, 'def': 'a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu', 'name': 'soya_milk'}, {'frequency': 'r', 'synset': 'space_shuttle.n.01', 'synonyms': ['space_shuttle'], 'id': 991, 'def': "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", 'name': 'space_shuttle'}, {'frequency': 'r', 'synset': 'sparkler.n.02', 'synonyms': ['sparkler_(fireworks)'], 'id': 992, 'def': 'a firework that burns slowly and throws out a shower of sparks', 'name': 'sparkler_(fireworks)'}, {'frequency': 'f', 'synset': 'spatula.n.02', 'synonyms': ['spatula'], 'id': 993, 'def': 'a hand tool with a thin flexible blade used to mix or spread soft substances', 'name': 'spatula'}, {'frequency': 'r', 'synset': 'spear.n.01', 'synonyms': ['spear', 'lance'], 'id': 994, 'def': 'a long pointed rod used as a tool or weapon', 'name': 'spear'}, {'frequency': 'f', 'synset': 'spectacles.n.01', 'synonyms': ['spectacles', 'specs', 'eyeglasses', 'glasses'], 'id': 995, 'def': 'optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision', 'name': 'spectacles'}, {'frequency': 'c', 'synset': 'spice_rack.n.01', 'synonyms': ['spice_rack'], 'id': 996, 'def': 'a rack for displaying containers filled with spices', 'name': 'spice_rack'}, {'frequency': 'c', 'synset': 'spider.n.01', 'synonyms': ['spider'], 'id': 997, 'def': 'predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body', 'name': 'spider'}, {'frequency': 'r', 'synset': 'spiny_lobster.n.02', 'synonyms': ['crawfish', 'crayfish'], 'id': 998, 'def': 'large edible marine crustacean having a spiny carapace but lacking the large pincers of true lobsters', 'name': 'crawfish'}, {'frequency': 'c', 'synset': 'sponge.n.01', 'synonyms': ['sponge'], 'id': 999, 'def': 'a porous mass usable to absorb water typically used for cleaning', 'name': 'sponge'}, {'frequency': 'f', 'synset': 'spoon.n.01', 'synonyms': ['spoon'], 'id': 1000, 'def': 'a piece of cutlery with a shallow bowl-shaped container and a handle', 'name': 'spoon'}, {'frequency': 'c', 'synset': 'sportswear.n.01', 'synonyms': ['sportswear', 'athletic_wear', 'activewear'], 'id': 1001, 'def': 'attire worn for sport or for casual wear', 'name': 'sportswear'}, {'frequency': 'c', 'synset': 'spotlight.n.02', 'synonyms': ['spotlight'], 'id': 1002, 'def': 'a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer', 'name': 'spotlight'}, {'frequency': 'r', 'synset': 'squid.n.01', 'synonyms': ['squid_(food)', 'calamari', 'calamary'], 'id': 1003, 'def': '(Italian cuisine) squid prepared as food', 'name': 'squid_(food)'}, {'frequency': 'c', 'synset': 'squirrel.n.01', 'synonyms': ['squirrel'], 'id': 1004, 'def': 'a kind of arboreal rodent having a long bushy tail', 'name': 'squirrel'}, {'frequency': 'r', 'synset': 'stagecoach.n.01', 'synonyms': ['stagecoach'], 'id': 1005, 'def': 'a large coach-and-four formerly used to carry passengers and mail on regular routes between towns', 'name': 'stagecoach'}, {'frequency': 'c', 'synset': 'stapler.n.01', 'synonyms': ['stapler_(stapling_machine)'], 'id': 1006, 'def': 'a machine that inserts staples into sheets of paper in order to fasten them together', 'name': 'stapler_(stapling_machine)'}, {'frequency': 'c', 'synset': 'starfish.n.01', 'synonyms': ['starfish', 'sea_star'], 'id': 1007, 'def': 'echinoderms characterized 
by five arms extending from a central disk', 'name': 'starfish'}, {'frequency': 'f', 'synset': 'statue.n.01', 'synonyms': ['statue_(sculpture)'], 'id': 1008, 'def': 'a sculpture representing a human or animal', 'name': 'statue_(sculpture)'}, {'frequency': 'c', 'synset': 'steak.n.01', 'synonyms': ['steak_(food)'], 'id': 1009, 'def': 'a slice of meat cut from the fleshy part of an animal or large fish', 'name': 'steak_(food)'}, {'frequency': 'r', 'synset': 'steak_knife.n.01', 'synonyms': ['steak_knife'], 'id': 1010, 'def': 'a sharp table knife used in eating steak', 'name': 'steak_knife'}, {'frequency': 'f', 'synset': 'steering_wheel.n.01', 'synonyms': ['steering_wheel'], 'id': 1011, 'def': 'a handwheel that is used for steering', 'name': 'steering_wheel'}, {'frequency': 'r', 'synset': 'step_ladder.n.01', 'synonyms': ['stepladder'], 'id': 1012, 'def': 'a folding portable ladder hinged at the top', 'name': 'stepladder'}, {'frequency': 'c', 'synset': 'step_stool.n.01', 'synonyms': ['step_stool'], 'id': 1013, 'def': 'a stool that has one or two steps that fold under the seat', 'name': 'step_stool'}, {'frequency': 'c', 'synset': 'stereo.n.01', 'synonyms': ['stereo_(sound_system)'], 'id': 1014, 'def': 'electronic device for playing audio', 'name': 'stereo_(sound_system)'}, {'frequency': 'r', 'synset': 'stew.n.02', 'synonyms': ['stew'], 'id': 1015, 'def': 'food prepared by stewing especially meat or fish with vegetables', 'name': 'stew'}, {'frequency': 'r', 'synset': 'stirrer.n.02', 'synonyms': ['stirrer'], 'id': 1016, 'def': 'an implement used for stirring', 'name': 'stirrer'}, {'frequency': 'f', 'synset': 'stirrup.n.01', 'synonyms': ['stirrup'], 'id': 1017, 'def': "support consisting of metal loops into which rider's feet go", 'name': 'stirrup'}, {'frequency': 'f', 'synset': 'stool.n.01', 'synonyms': ['stool'], 'id': 1018, 'def': 'a simple seat without a back or arms', 'name': 'stool'}, {'frequency': 'f', 'synset': 'stop_sign.n.01', 'synonyms': ['stop_sign'], 'id': 1019, 'def': 'a traffic sign to notify drivers that they must come to a complete stop', 'name': 'stop_sign'}, {'frequency': 'f', 'synset': 'stoplight.n.01', 'synonyms': ['brake_light'], 'id': 1020, 'def': 'a red light on the rear of a motor vehicle that signals when the brakes are applied', 'name': 'brake_light'}, {'frequency': 'f', 'synset': 'stove.n.01', 'synonyms': ['stove', 'kitchen_stove', 'range_(kitchen_appliance)', 'kitchen_range', 'cooking_stove'], 'id': 1021, 'def': 'a kitchen appliance used for cooking food', 'name': 'stove'}, {'frequency': 'c', 'synset': 'strainer.n.01', 'synonyms': ['strainer'], 'id': 1022, 'def': 'a filter to retain larger pieces while smaller pieces and liquids pass through', 'name': 'strainer'}, {'frequency': 'f', 'synset': 'strap.n.01', 'synonyms': ['strap'], 'id': 1023, 'def': 'an elongated strip of material for binding things together or holding', 'name': 'strap'}, {'frequency': 'f', 'synset': 'straw.n.04', 'synonyms': ['straw_(for_drinking)', 'drinking_straw'], 'id': 1024, 'def': 'a thin paper or plastic tube used to suck liquids into the mouth', 'name': 'straw_(for_drinking)'}, {'frequency': 'f', 'synset': 'strawberry.n.01', 'synonyms': ['strawberry'], 'id': 1025, 'def': 'sweet fleshy red fruit', 'name': 'strawberry'}, {'frequency': 'f', 'synset': 'street_sign.n.01', 'synonyms': ['street_sign'], 'id': 1026, 'def': 'a sign visible from the street', 'name': 'street_sign'}, {'frequency': 'f', 'synset': 'streetlight.n.01', 'synonyms': ['streetlight', 'street_lamp'], 'id': 1027, 'def': 'a lamp 
supported on a lamppost; for illuminating a street', 'name': 'streetlight'}, {'frequency': 'r', 'synset': 'string_cheese.n.01', 'synonyms': ['string_cheese'], 'id': 1028, 'def': 'cheese formed in long strings twisted together', 'name': 'string_cheese'}, {'frequency': 'r', 'synset': 'stylus.n.02', 'synonyms': ['stylus'], 'id': 1029, 'def': 'a pointed tool for writing or drawing or engraving, including pens', 'name': 'stylus'}, {'frequency': 'r', 'synset': 'subwoofer.n.01', 'synonyms': ['subwoofer'], 'id': 1030, 'def': 'a loudspeaker that is designed to reproduce very low bass frequencies', 'name': 'subwoofer'}, {'frequency': 'r', 'synset': 'sugar_bowl.n.01', 'synonyms': ['sugar_bowl'], 'id': 1031, 'def': 'a dish in which sugar is served', 'name': 'sugar_bowl'}, {'frequency': 'r', 'synset': 'sugarcane.n.01', 'synonyms': ['sugarcane_(plant)'], 'id': 1032, 'def': 'juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice', 'name': 'sugarcane_(plant)'}, {'frequency': 'f', 'synset': 'suit.n.01', 'synonyms': ['suit_(clothing)'], 'id': 1033, 'def': 'a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color', 'name': 'suit_(clothing)'}, {'frequency': 'c', 'synset': 'sunflower.n.01', 'synonyms': ['sunflower'], 'id': 1034, 'def': 'any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays', 'name': 'sunflower'}, {'frequency': 'f', 'synset': 'sunglasses.n.01', 'synonyms': ['sunglasses'], 'id': 1035, 'def': 'spectacles that are darkened or polarized to protect the eyes from the glare of the sun', 'name': 'sunglasses'}, {'frequency': 'c', 'synset': 'sunhat.n.01', 'synonyms': ['sunhat'], 'id': 1036, 'def': 'a hat with a broad brim that protects the face from direct exposure to the sun', 'name': 'sunhat'}, {'frequency': 'f', 'synset': 'surfboard.n.01', 'synonyms': ['surfboard'], 'id': 1037, 'def': 'a narrow buoyant board for riding surf', 'name': 'surfboard'}, {'frequency': 'c', 'synset': 'sushi.n.01', 'synonyms': ['sushi'], 'id': 1038, 'def': 'rice (with raw fish) wrapped in seaweed', 'name': 'sushi'}, {'frequency': 'c', 'synset': 'swab.n.02', 'synonyms': ['mop'], 'id': 1039, 'def': 'cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors', 'name': 'mop'}, {'frequency': 'c', 'synset': 'sweat_pants.n.01', 'synonyms': ['sweat_pants'], 'id': 1040, 'def': 'loose-fitting trousers with elastic cuffs; worn by athletes', 'name': 'sweat_pants'}, {'frequency': 'c', 'synset': 'sweatband.n.02', 'synonyms': ['sweatband'], 'id': 1041, 'def': 'a band of material tied around the forehead or wrist to absorb sweat', 'name': 'sweatband'}, {'frequency': 'f', 'synset': 'sweater.n.01', 'synonyms': ['sweater'], 'id': 1042, 'def': 'a crocheted or knitted garment covering the upper part of the body', 'name': 'sweater'}, {'frequency': 'f', 'synset': 'sweatshirt.n.01', 'synonyms': ['sweatshirt'], 'id': 1043, 'def': 'cotton knit pullover with long sleeves worn during athletic activity', 'name': 'sweatshirt'}, {'frequency': 'c', 'synset': 'sweet_potato.n.02', 'synonyms': ['sweet_potato'], 'id': 1044, 'def': 'the edible tuberous root of the sweet potato vine', 'name': 'sweet_potato'}, {'frequency': 'f', 'synset': 'swimsuit.n.01', 'synonyms': ['swimsuit', 'swimwear', 'bathing_suit', 'swimming_costume', 'bathing_costume', 'swimming_trunks', 'bathing_trunks'], 'id': 1045, 'def': 'garment worn for swimming', 'name': 'swimsuit'}, {'frequency': 
'c', 'synset': 'sword.n.01', 'synonyms': ['sword'], 'id': 1046, 'def': 'a cutting or thrusting weapon that has a long metal blade', 'name': 'sword'}, {'frequency': 'r', 'synset': 'syringe.n.01', 'synonyms': ['syringe'], 'id': 1047, 'def': 'a medical instrument used to inject or withdraw fluids', 'name': 'syringe'}, {'frequency': 'r', 'synset': 'tabasco.n.02', 'synonyms': ['Tabasco_sauce'], 'id': 1048, 'def': 'very spicy sauce (trade name Tabasco) made from fully-aged red peppers', 'name': 'Tabasco_sauce'}, {'frequency': 'r', 'synset': 'table-tennis_table.n.01', 'synonyms': ['table-tennis_table', 'ping-pong_table'], 'id': 1049, 'def': 'a table used for playing table tennis', 'name': 'table-tennis_table'}, {'frequency': 'f', 'synset': 'table.n.02', 'synonyms': ['table'], 'id': 1050, 'def': 'a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs', 'name': 'table'}, {'frequency': 'c', 'synset': 'table_lamp.n.01', 'synonyms': ['table_lamp'], 'id': 1051, 'def': 'a lamp that sits on a table', 'name': 'table_lamp'}, {'frequency': 'f', 'synset': 'tablecloth.n.01', 'synonyms': ['tablecloth'], 'id': 1052, 'def': 'a covering spread over a dining table', 'name': 'tablecloth'}, {'frequency': 'r', 'synset': 'tachometer.n.01', 'synonyms': ['tachometer'], 'id': 1053, 'def': 'measuring instrument for indicating speed of rotation', 'name': 'tachometer'}, {'frequency': 'r', 'synset': 'taco.n.02', 'synonyms': ['taco'], 'id': 1054, 'def': 'a small tortilla cupped around a filling', 'name': 'taco'}, {'frequency': 'f', 'synset': 'tag.n.02', 'synonyms': ['tag'], 'id': 1055, 'def': 'a label associated with something for the purpose of identification or information', 'name': 'tag'}, {'frequency': 'f', 'synset': 'taillight.n.01', 'synonyms': ['taillight', 'rear_light'], 'id': 1056, 'def': 'lamp (usually red) mounted at the rear of a motor vehicle', 'name': 'taillight'}, {'frequency': 'r', 'synset': 'tambourine.n.01', 'synonyms': ['tambourine'], 'id': 1057, 'def': 'a shallow drum with a single drumhead and with metallic disks in the sides', 'name': 'tambourine'}, {'frequency': 'r', 'synset': 'tank.n.01', 'synonyms': ['army_tank', 'armored_combat_vehicle', 'armoured_combat_vehicle'], 'id': 1058, 'def': 'an enclosed armored military vehicle; has a cannon and moves on caterpillar treads', 'name': 'army_tank'}, {'frequency': 'f', 'synset': 'tank.n.02', 'synonyms': ['tank_(storage_vessel)', 'storage_tank'], 'id': 1059, 'def': 'a large (usually metallic) vessel for holding gases or liquids', 'name': 'tank_(storage_vessel)'}, {'frequency': 'f', 'synset': 'tank_top.n.01', 'synonyms': ['tank_top_(clothing)'], 'id': 1060, 'def': 'a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening', 'name': 'tank_top_(clothing)'}, {'frequency': 'f', 'synset': 'tape.n.01', 'synonyms': ['tape_(sticky_cloth_or_paper)'], 'id': 1061, 'def': 'a long thin piece of cloth or paper as used for binding or fastening', 'name': 'tape_(sticky_cloth_or_paper)'}, {'frequency': 'c', 'synset': 'tape.n.04', 'synonyms': ['tape_measure', 'measuring_tape'], 'id': 1062, 'def': 'measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths', 'name': 'tape_measure'}, {'frequency': 'c', 'synset': 'tapestry.n.02', 'synonyms': ['tapestry'], 'id': 1063, 'def': 'a heavy textile with a woven design; used for curtains and upholstery', 'name': 'tapestry'}, {'frequency': 'f', 'synset': 'tarpaulin.n.01', 'synonyms': ['tarp'], 
'id': 1064, 'def': 'waterproofed canvas', 'name': 'tarp'}, {'frequency': 'c', 'synset': 'tartan.n.01', 'synonyms': ['tartan', 'plaid'], 'id': 1065, 'def': 'a cloth having a crisscross design', 'name': 'tartan'}, {'frequency': 'c', 'synset': 'tassel.n.01', 'synonyms': ['tassel'], 'id': 1066, 'def': 'adornment consisting of a bunch of cords fastened at one end', 'name': 'tassel'}, {'frequency': 'c', 'synset': 'tea_bag.n.01', 'synonyms': ['tea_bag'], 'id': 1067, 'def': 'a measured amount of tea in a bag for an individual serving of tea', 'name': 'tea_bag'}, {'frequency': 'c', 'synset': 'teacup.n.02', 'synonyms': ['teacup'], 'id': 1068, 'def': 'a cup from which tea is drunk', 'name': 'teacup'}, {'frequency': 'c', 'synset': 'teakettle.n.01', 'synonyms': ['teakettle'], 'id': 1069, 'def': 'kettle for boiling water to make tea', 'name': 'teakettle'}, {'frequency': 'f', 'synset': 'teapot.n.01', 'synonyms': ['teapot'], 'id': 1070, 'def': 'pot for brewing tea; usually has a spout and handle', 'name': 'teapot'}, {'frequency': 'f', 'synset': 'teddy.n.01', 'synonyms': ['teddy_bear'], 'id': 1071, 'def': "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", 'name': 'teddy_bear'}, {'frequency': 'f', 'synset': 'telephone.n.01', 'synonyms': ['telephone', 'phone', 'telephone_set'], 'id': 1072, 'def': 'electronic device for communicating by voice over long distances (includes wired and wireless/cell phones)', 'name': 'telephone'}, {'frequency': 'c', 'synset': 'telephone_booth.n.01', 'synonyms': ['telephone_booth', 'phone_booth', 'call_box', 'telephone_box', 'telephone_kiosk'], 'id': 1073, 'def': 'booth for using a telephone', 'name': 'telephone_booth'}, {'frequency': 'f', 'synset': 'telephone_pole.n.01', 'synonyms': ['telephone_pole', 'telegraph_pole', 'telegraph_post'], 'id': 1074, 'def': 'tall pole supporting telephone wires', 'name': 'telephone_pole'}, {'frequency': 'r', 'synset': 'telephoto_lens.n.01', 'synonyms': ['telephoto_lens', 'zoom_lens'], 'id': 1075, 'def': 'a camera lens that magnifies the image', 'name': 'telephoto_lens'}, {'frequency': 'c', 'synset': 'television_camera.n.01', 'synonyms': ['television_camera', 'tv_camera'], 'id': 1076, 'def': 'television equipment for capturing and recording video', 'name': 'television_camera'}, {'frequency': 'f', 'synset': 'television_receiver.n.01', 'synonyms': ['television_set', 'tv', 'tv_set'], 'id': 1077, 'def': 'an electronic device that receives television signals and displays them on a screen', 'name': 'television_set'}, {'frequency': 'f', 'synset': 'tennis_ball.n.01', 'synonyms': ['tennis_ball'], 'id': 1078, 'def': 'ball about the size of a fist used in playing tennis', 'name': 'tennis_ball'}, {'frequency': 'f', 'synset': 'tennis_racket.n.01', 'synonyms': ['tennis_racket'], 'id': 1079, 'def': 'a racket used to play tennis', 'name': 'tennis_racket'}, {'frequency': 'r', 'synset': 'tequila.n.01', 'synonyms': ['tequila'], 'id': 1080, 'def': 'Mexican liquor made from fermented juices of an agave plant', 'name': 'tequila'}, {'frequency': 'c', 'synset': 'thermometer.n.01', 'synonyms': ['thermometer'], 'id': 1081, 'def': 'measuring instrument for measuring temperature', 'name': 'thermometer'}, {'frequency': 'c', 'synset': 'thermos.n.01', 'synonyms': ['thermos_bottle'], 'id': 1082, 'def': 'vacuum flask that preserves temperature of hot or cold drinks', 'name': 'thermos_bottle'}, {'frequency': 'f', 'synset': 'thermostat.n.01', 'synonyms': ['thermostat'], 'id': 1083, 'def': 'a regulator for automatically regulating 
temperature by starting or stopping the supply of heat', 'name': 'thermostat'}, {'frequency': 'r', 'synset': 'thimble.n.02', 'synonyms': ['thimble'], 'id': 1084, 'def': 'a small metal cap to protect the finger while sewing; can be used as a small container', 'name': 'thimble'}, {'frequency': 'c', 'synset': 'thread.n.01', 'synonyms': ['thread', 'yarn'], 'id': 1085, 'def': 'a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) used in sewing and weaving', 'name': 'thread'}, {'frequency': 'c', 'synset': 'thumbtack.n.01', 'synonyms': ['thumbtack', 'drawing_pin', 'pushpin'], 'id': 1086, 'def': 'a tack for attaching papers to a bulletin board or drawing board', 'name': 'thumbtack'}, {'frequency': 'c', 'synset': 'tiara.n.01', 'synonyms': ['tiara'], 'id': 1087, 'def': 'a jeweled headdress worn by women on formal occasions', 'name': 'tiara'}, {'frequency': 'c', 'synset': 'tiger.n.02', 'synonyms': ['tiger'], 'id': 1088, 'def': 'large feline of forests in most of Asia having a tawny coat with black stripes', 'name': 'tiger'}, {'frequency': 'c', 'synset': 'tights.n.01', 'synonyms': ['tights_(clothing)', 'leotards'], 'id': 1089, 'def': 'skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls', 'name': 'tights_(clothing)'}, {'frequency': 'c', 'synset': 'timer.n.01', 'synonyms': ['timer', 'stopwatch'], 'id': 1090, 'def': 'a timepiece that measures a time interval and signals its end', 'name': 'timer'}, {'frequency': 'f', 'synset': 'tinfoil.n.01', 'synonyms': ['tinfoil'], 'id': 1091, 'def': 'foil made of tin or an alloy of tin and lead', 'name': 'tinfoil'}, {'frequency': 'c', 'synset': 'tinsel.n.01', 'synonyms': ['tinsel'], 'id': 1092, 'def': 'a showy decoration that is basically valueless', 'name': 'tinsel'}, {'frequency': 'f', 'synset': 'tissue.n.02', 'synonyms': ['tissue_paper'], 'id': 1093, 'def': 'a soft thin (usually translucent) paper', 'name': 'tissue_paper'}, {'frequency': 'c', 'synset': 'toast.n.01', 'synonyms': ['toast_(food)'], 'id': 1094, 'def': 'slice of bread that has been toasted', 'name': 'toast_(food)'}, {'frequency': 'f', 'synset': 'toaster.n.02', 'synonyms': ['toaster'], 'id': 1095, 'def': 'a kitchen appliance (usually electric) for toasting bread', 'name': 'toaster'}, {'frequency': 'f', 'synset': 'toaster_oven.n.01', 'synonyms': ['toaster_oven'], 'id': 1096, 'def': 'kitchen appliance consisting of a small electric oven for toasting or warming food', 'name': 'toaster_oven'}, {'frequency': 'f', 'synset': 'toilet.n.02', 'synonyms': ['toilet'], 'id': 1097, 'def': 'a plumbing fixture for defecation and urination', 'name': 'toilet'}, {'frequency': 'f', 'synset': 'toilet_tissue.n.01', 'synonyms': ['toilet_tissue', 'toilet_paper', 'bathroom_tissue'], 'id': 1098, 'def': 'a soft thin absorbent paper for use in toilets', 'name': 'toilet_tissue'}, {'frequency': 'f', 'synset': 'tomato.n.01', 'synonyms': ['tomato'], 'id': 1099, 'def': 'mildly acid red or yellow pulpy fruit eaten as a vegetable', 'name': 'tomato'}, {'frequency': 'f', 'synset': 'tongs.n.01', 'synonyms': ['tongs'], 'id': 1100, 'def': 'any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below', 'name': 'tongs'}, {'frequency': 'c', 'synset': 'toolbox.n.01', 'synonyms': ['toolbox'], 'id': 1101, 'def': 'a box or chest or cabinet for holding hand tools', 'name': 'toolbox'}, {'frequency': 'f', 'synset': 'toothbrush.n.01', 'synonyms': ['toothbrush'], 'id': 1102, 'def': 'small brush; has 
long handle; used to clean teeth', 'name': 'toothbrush'}, {'frequency': 'f', 'synset': 'toothpaste.n.01', 'synonyms': ['toothpaste'], 'id': 1103, 'def': 'a dentifrice in the form of a paste', 'name': 'toothpaste'}, {'frequency': 'f', 'synset': 'toothpick.n.01', 'synonyms': ['toothpick'], 'id': 1104, 'def': 'pick consisting of a small strip of wood or plastic; used to pick food from between the teeth', 'name': 'toothpick'}, {'frequency': 'f', 'synset': 'top.n.09', 'synonyms': ['cover'], 'id': 1105, 'def': 'covering for a hole (especially a hole in the top of a container)', 'name': 'cover'}, {'frequency': 'c', 'synset': 'tortilla.n.01', 'synonyms': ['tortilla'], 'id': 1106, 'def': 'thin unleavened pancake made from cornmeal or wheat flour', 'name': 'tortilla'}, {'frequency': 'c', 'synset': 'tow_truck.n.01', 'synonyms': ['tow_truck'], 'id': 1107, 'def': 'a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)', 'name': 'tow_truck'}, {'frequency': 'f', 'synset': 'towel.n.01', 'synonyms': ['towel'], 'id': 1108, 'def': 'a rectangular piece of absorbent cloth (or paper) for drying or wiping', 'name': 'towel'}, {'frequency': 'f', 'synset': 'towel_rack.n.01', 'synonyms': ['towel_rack', 'towel_rail', 'towel_bar'], 'id': 1109, 'def': 'a rack consisting of one or more bars on which towels can be hung', 'name': 'towel_rack'}, {'frequency': 'f', 'synset': 'toy.n.03', 'synonyms': ['toy'], 'id': 1110, 'def': 'a device regarded as providing amusement', 'name': 'toy'}, {'frequency': 'c', 'synset': 'tractor.n.01', 'synonyms': ['tractor_(farm_equipment)'], 'id': 1111, 'def': 'a wheeled vehicle with large wheels; used in farming and other applications', 'name': 'tractor_(farm_equipment)'}, {'frequency': 'f', 'synset': 'traffic_light.n.01', 'synonyms': ['traffic_light'], 'id': 1112, 'def': 'a device to control vehicle traffic often consisting of three or more lights', 'name': 'traffic_light'}, {'frequency': 'c', 'synset': 'trail_bike.n.01', 'synonyms': ['dirt_bike'], 'id': 1113, 'def': 'a lightweight motorcycle equipped with rugged tires and suspension for off-road use', 'name': 'dirt_bike'}, {'frequency': 'f', 'synset': 'trailer_truck.n.01', 'synonyms': ['trailer_truck', 'tractor_trailer', 'trucking_rig', 'articulated_lorry', 'semi_truck'], 'id': 1114, 'def': 'a truck consisting of a tractor and trailer together', 'name': 'trailer_truck'}, {'frequency': 'f', 'synset': 'train.n.01', 'synonyms': ['train_(railroad_vehicle)', 'railroad_train'], 'id': 1115, 'def': 'public or private transport provided by a line of railway cars coupled together and drawn by a locomotive', 'name': 'train_(railroad_vehicle)'}, {'frequency': 'r', 'synset': 'trampoline.n.01', 'synonyms': ['trampoline'], 'id': 1116, 'def': 'gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame', 'name': 'trampoline'}, {'frequency': 'f', 'synset': 'tray.n.01', 'synonyms': ['tray'], 'id': 1117, 'def': 'an open receptacle for holding or displaying or serving articles or food', 'name': 'tray'}, {'frequency': 'r', 'synset': 'trench_coat.n.01', 'synonyms': ['trench_coat'], 'id': 1118, 'def': 'a military style raincoat; belted with deep pockets', 'name': 'trench_coat'}, {'frequency': 'r', 'synset': 'triangle.n.05', 'synonyms': ['triangle_(musical_instrument)'], 'id': 1119, 'def': 'a percussion instrument consisting of a metal bar bent in the shape of an open triangle', 'name': 'triangle_(musical_instrument)'}, {'frequency': 'c', 'synset': 'tricycle.n.01', 'synonyms': ['tricycle'], 'id': 
1120, 'def': 'a vehicle with three wheels that is moved by foot pedals', 'name': 'tricycle'}, {'frequency': 'f', 'synset': 'tripod.n.01', 'synonyms': ['tripod'], 'id': 1121, 'def': 'a three-legged rack used for support', 'name': 'tripod'}, {'frequency': 'f', 'synset': 'trouser.n.01', 'synonyms': ['trousers', 'pants_(clothing)'], 'id': 1122, 'def': 'a garment extending from the waist to the knee or ankle, covering each leg separately', 'name': 'trousers'}, {'frequency': 'f', 'synset': 'truck.n.01', 'synonyms': ['truck'], 'id': 1123, 'def': 'an automotive vehicle suitable for hauling', 'name': 'truck'}, {'frequency': 'r', 'synset': 'truffle.n.03', 'synonyms': ['truffle_(chocolate)', 'chocolate_truffle'], 'id': 1124, 'def': 'creamy chocolate candy', 'name': 'truffle_(chocolate)'}, {'frequency': 'c', 'synset': 'trunk.n.02', 'synonyms': ['trunk'], 'id': 1125, 'def': 'luggage consisting of a large strong case used when traveling or for storage', 'name': 'trunk'}, {'frequency': 'r', 'synset': 'tub.n.02', 'synonyms': ['vat'], 'id': 1126, 'def': 'a large vessel for holding or storing liquids', 'name': 'vat'}, {'frequency': 'c', 'synset': 'turban.n.01', 'synonyms': ['turban'], 'id': 1127, 'def': 'a traditional headdress consisting of a long scarf wrapped around the head', 'name': 'turban'}, {'frequency': 'c', 'synset': 'turkey.n.04', 'synonyms': ['turkey_(food)'], 'id': 1128, 'def': 'flesh of large domesticated fowl usually roasted', 'name': 'turkey_(food)'}, {'frequency': 'r', 'synset': 'turnip.n.01', 'synonyms': ['turnip'], 'id': 1129, 'def': 'widely cultivated plant having a large fleshy edible white or yellow root', 'name': 'turnip'}, {'frequency': 'c', 'synset': 'turtle.n.02', 'synonyms': ['turtle'], 'id': 1130, 'def': 'any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming', 'name': 'turtle'}, {'frequency': 'c', 'synset': 'turtleneck.n.01', 'synonyms': ['turtleneck_(clothing)', 'polo-neck'], 'id': 1131, 'def': 'a sweater or jersey with a high close-fitting collar', 'name': 'turtleneck_(clothing)'}, {'frequency': 'c', 'synset': 'typewriter.n.01', 'synonyms': ['typewriter'], 'id': 1132, 'def': 'hand-operated character printer for printing written messages one character at a time', 'name': 'typewriter'}, {'frequency': 'f', 'synset': 'umbrella.n.01', 'synonyms': ['umbrella'], 'id': 1133, 'def': 'a lightweight handheld collapsible canopy', 'name': 'umbrella'}, {'frequency': 'f', 'synset': 'underwear.n.01', 'synonyms': ['underwear', 'underclothes', 'underclothing', 'underpants'], 'id': 1134, 'def': 'undergarment worn next to the skin and under the outer garments', 'name': 'underwear'}, {'frequency': 'r', 'synset': 'unicycle.n.01', 'synonyms': ['unicycle'], 'id': 1135, 'def': 'a vehicle with a single wheel that is driven by pedals', 'name': 'unicycle'}, {'frequency': 'f', 'synset': 'urinal.n.01', 'synonyms': ['urinal'], 'id': 1136, 'def': 'a plumbing fixture (usually attached to the wall) used by men to urinate', 'name': 'urinal'}, {'frequency': 'c', 'synset': 'urn.n.01', 'synonyms': ['urn'], 'id': 1137, 'def': 'a large vase that usually has a pedestal or feet', 'name': 'urn'}, {'frequency': 'c', 'synset': 'vacuum.n.04', 'synonyms': ['vacuum_cleaner'], 'id': 1138, 'def': 'an electrical home appliance that cleans by suction', 'name': 'vacuum_cleaner'}, {'frequency': 'f', 'synset': 'vase.n.01', 'synonyms': ['vase'], 'id': 1139, 'def': 'an open jar of glass or porcelain used as an ornament or to hold flowers', 'name': 'vase'}, {'frequency': 'c', 'synset': 
'vending_machine.n.01', 'synonyms': ['vending_machine'], 'id': 1140, 'def': 'a slot machine for selling goods', 'name': 'vending_machine'}, {'frequency': 'f', 'synset': 'vent.n.01', 'synonyms': ['vent', 'blowhole', 'air_vent'], 'id': 1141, 'def': 'a hole for the escape of gas or air', 'name': 'vent'}, {'frequency': 'f', 'synset': 'vest.n.01', 'synonyms': ['vest', 'waistcoat'], 'id': 1142, 'def': "a man's sleeveless garment worn underneath a coat", 'name': 'vest'}, {'frequency': 'c', 'synset': 'videotape.n.01', 'synonyms': ['videotape'], 'id': 1143, 'def': 'a video recording made on magnetic tape', 'name': 'videotape'}, {'frequency': 'r', 'synset': 'vinegar.n.01', 'synonyms': ['vinegar'], 'id': 1144, 'def': 'sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative', 'name': 'vinegar'}, {'frequency': 'r', 'synset': 'violin.n.01', 'synonyms': ['violin', 'fiddle'], 'id': 1145, 'def': 'bowed stringed instrument that is the highest member of the violin family', 'name': 'violin'}, {'frequency': 'r', 'synset': 'vodka.n.01', 'synonyms': ['vodka'], 'id': 1146, 'def': 'unaged colorless liquor originating in Russia', 'name': 'vodka'}, {'frequency': 'c', 'synset': 'volleyball.n.02', 'synonyms': ['volleyball'], 'id': 1147, 'def': 'an inflated ball used in playing volleyball', 'name': 'volleyball'}, {'frequency': 'r', 'synset': 'vulture.n.01', 'synonyms': ['vulture'], 'id': 1148, 'def': 'any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion', 'name': 'vulture'}, {'frequency': 'c', 'synset': 'waffle.n.01', 'synonyms': ['waffle'], 'id': 1149, 'def': 'pancake batter baked in a waffle iron', 'name': 'waffle'}, {'frequency': 'r', 'synset': 'waffle_iron.n.01', 'synonyms': ['waffle_iron'], 'id': 1150, 'def': 'a kitchen appliance for baking waffles', 'name': 'waffle_iron'}, {'frequency': 'c', 'synset': 'wagon.n.01', 'synonyms': ['wagon'], 'id': 1151, 'def': 'any of various kinds of wheeled vehicles drawn by an animal or a tractor', 'name': 'wagon'}, {'frequency': 'c', 'synset': 'wagon_wheel.n.01', 'synonyms': ['wagon_wheel'], 'id': 1152, 'def': 'a wheel of a wagon', 'name': 'wagon_wheel'}, {'frequency': 'c', 'synset': 'walking_stick.n.01', 'synonyms': ['walking_stick'], 'id': 1153, 'def': 'a stick carried in the hand for support in walking', 'name': 'walking_stick'}, {'frequency': 'c', 'synset': 'wall_clock.n.01', 'synonyms': ['wall_clock'], 'id': 1154, 'def': 'a clock mounted on a wall', 'name': 'wall_clock'}, {'frequency': 'f', 'synset': 'wall_socket.n.01', 'synonyms': ['wall_socket', 'wall_plug', 'electric_outlet', 'electrical_outlet', 'outlet', 'electric_receptacle'], 'id': 1155, 'def': 'receptacle providing a place in a wiring system where current can be taken to run electrical devices', 'name': 'wall_socket'}, {'frequency': 'f', 'synset': 'wallet.n.01', 'synonyms': ['wallet', 'billfold'], 'id': 1156, 'def': 'a pocket-size case for holding papers and paper money', 'name': 'wallet'}, {'frequency': 'r', 'synset': 'walrus.n.01', 'synonyms': ['walrus'], 'id': 1157, 'def': 'either of two large northern marine mammals having ivory tusks and tough hide over thick blubber', 'name': 'walrus'}, {'frequency': 'r', 'synset': 'wardrobe.n.01', 'synonyms': ['wardrobe'], 'id': 1158, 'def': 'a tall piece of furniture that provides storage space for clothes; has a door and rails or hooks for hanging clothes', 'name': 'wardrobe'}, {'frequency': 'r', 'synset': 'washbasin.n.01', 'synonyms': ['washbasin', 
'basin_(for_washing)', 'washbowl', 'washstand', 'handbasin'], 'id': 1159, 'def': 'a bathroom sink that is permanently installed and connected to a water supply and drainpipe; where you can wash your hands and face', 'name': 'washbasin'}, {'frequency': 'c', 'synset': 'washer.n.03', 'synonyms': ['automatic_washer', 'washing_machine'], 'id': 1160, 'def': 'a home appliance for washing clothes and linens automatically', 'name': 'automatic_washer'}, {'frequency': 'f', 'synset': 'watch.n.01', 'synonyms': ['watch', 'wristwatch'], 'id': 1161, 'def': 'a small, portable timepiece', 'name': 'watch'}, {'frequency': 'f', 'synset': 'water_bottle.n.01', 'synonyms': ['water_bottle'], 'id': 1162, 'def': 'a bottle for holding water', 'name': 'water_bottle'}, {'frequency': 'c', 'synset': 'water_cooler.n.01', 'synonyms': ['water_cooler'], 'id': 1163, 'def': 'a device for cooling and dispensing drinking water', 'name': 'water_cooler'}, {'frequency': 'c', 'synset': 'water_faucet.n.01', 'synonyms': ['water_faucet', 'water_tap', 'tap_(water_faucet)'], 'id': 1164, 'def': 'a faucet for drawing water from a pipe or cask', 'name': 'water_faucet'}, {'frequency': 'r', 'synset': 'water_heater.n.01', 'synonyms': ['water_heater', 'hot-water_heater'], 'id': 1165, 'def': 'a heater and storage tank to supply heated water', 'name': 'water_heater'}, {'frequency': 'c', 'synset': 'water_jug.n.01', 'synonyms': ['water_jug'], 'id': 1166, 'def': 'a jug that holds water', 'name': 'water_jug'}, {'frequency': 'r', 'synset': 'water_pistol.n.01', 'synonyms': ['water_gun', 'squirt_gun'], 'id': 1167, 'def': 'plaything consisting of a toy pistol that squirts water', 'name': 'water_gun'}, {'frequency': 'c', 'synset': 'water_scooter.n.01', 'synonyms': ['water_scooter', 'sea_scooter', 'jet_ski'], 'id': 1168, 'def': 'a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)', 'name': 'water_scooter'}, {'frequency': 'c', 'synset': 'water_ski.n.01', 'synonyms': ['water_ski'], 'id': 1169, 'def': 'broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)', 'name': 'water_ski'}, {'frequency': 'c', 'synset': 'water_tower.n.01', 'synonyms': ['water_tower'], 'id': 1170, 'def': 'a large reservoir for water', 'name': 'water_tower'}, {'frequency': 'c', 'synset': 'watering_can.n.01', 'synonyms': ['watering_can'], 'id': 1171, 'def': 'a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants', 'name': 'watering_can'}, {'frequency': 'f', 'synset': 'watermelon.n.02', 'synonyms': ['watermelon'], 'id': 1172, 'def': 'large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp', 'name': 'watermelon'}, {'frequency': 'f', 'synset': 'weathervane.n.01', 'synonyms': ['weathervane', 'vane_(weathervane)', 'wind_vane'], 'id': 1173, 'def': 'mechanical device attached to an elevated structure; rotates freely to show the direction of the wind', 'name': 'weathervane'}, {'frequency': 'c', 'synset': 'webcam.n.01', 'synonyms': ['webcam'], 'id': 1174, 'def': 'a digital camera designed to take digital photographs and transmit them over the internet', 'name': 'webcam'}, {'frequency': 'c', 'synset': 'wedding_cake.n.01', 'synonyms': ['wedding_cake', 'bridecake'], 'id': 1175, 'def': 'a rich cake with two or more tiers and covered with frosting and decorations; served at a wedding reception', 'name': 'wedding_cake'}, {'frequency': 'c', 'synset': 'wedding_ring.n.01', 'synonyms': ['wedding_ring', 'wedding_band'], 'id': 1176, 'def': 'a ring given to the bride and/or groom at 
the wedding', 'name': 'wedding_ring'}, {'frequency': 'f', 'synset': 'wet_suit.n.01', 'synonyms': ['wet_suit'], 'id': 1177, 'def': 'a close-fitting garment made of a permeable material; worn in cold water to retain body heat', 'name': 'wet_suit'}, {'frequency': 'f', 'synset': 'wheel.n.01', 'synonyms': ['wheel'], 'id': 1178, 'def': 'a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle', 'name': 'wheel'}, {'frequency': 'c', 'synset': 'wheelchair.n.01', 'synonyms': ['wheelchair'], 'id': 1179, 'def': 'a movable chair mounted on large wheels', 'name': 'wheelchair'}, {'frequency': 'c', 'synset': 'whipped_cream.n.01', 'synonyms': ['whipped_cream'], 'id': 1180, 'def': 'cream that has been beaten until light and fluffy', 'name': 'whipped_cream'}, {'frequency': 'c', 'synset': 'whistle.n.03', 'synonyms': ['whistle'], 'id': 1181, 'def': 'a small wind instrument that produces a whistling sound by blowing into it', 'name': 'whistle'}, {'frequency': 'c', 'synset': 'wig.n.01', 'synonyms': ['wig'], 'id': 1182, 'def': 'hairpiece covering the head and made of real or synthetic hair', 'name': 'wig'}, {'frequency': 'c', 'synset': 'wind_chime.n.01', 'synonyms': ['wind_chime'], 'id': 1183, 'def': 'a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle', 'name': 'wind_chime'}, {'frequency': 'c', 'synset': 'windmill.n.01', 'synonyms': ['windmill'], 'id': 1184, 'def': 'A mill or turbine that is powered by wind', 'name': 'windmill'}, {'frequency': 'c', 'synset': 'window_box.n.01', 'synonyms': ['window_box_(for_plants)'], 'id': 1185, 'def': 'a container for growing plants on a windowsill', 'name': 'window_box_(for_plants)'}, {'frequency': 'f', 'synset': 'windshield_wiper.n.01', 'synonyms': ['windshield_wiper', 'windscreen_wiper', 'wiper_(for_windshield/screen)'], 'id': 1186, 'def': 'a mechanical device that cleans the windshield', 'name': 'windshield_wiper'}, {'frequency': 'c', 'synset': 'windsock.n.01', 'synonyms': ['windsock', 'air_sock', 'air-sleeve', 'wind_sleeve', 'wind_cone'], 'id': 1187, 'def': 'a truncated cloth cone mounted on a mast/pole; shows wind direction', 'name': 'windsock'}, {'frequency': 'f', 'synset': 'wine_bottle.n.01', 'synonyms': ['wine_bottle'], 'id': 1188, 'def': 'a bottle for holding wine', 'name': 'wine_bottle'}, {'frequency': 'c', 'synset': 'wine_bucket.n.01', 'synonyms': ['wine_bucket', 'wine_cooler'], 'id': 1189, 'def': 'a bucket of ice used to chill a bottle of wine', 'name': 'wine_bucket'}, {'frequency': 'f', 'synset': 'wineglass.n.01', 'synonyms': ['wineglass'], 'id': 1190, 'def': 'a glass that has a stem and in which wine is served', 'name': 'wineglass'}, {'frequency': 'f', 'synset': 'winker.n.02', 'synonyms': ['blinder_(for_horses)'], 'id': 1191, 'def': 'blinds that prevent a horse from seeing something on either side', 'name': 'blinder_(for_horses)'}, {'frequency': 'c', 'synset': 'wok.n.01', 'synonyms': ['wok'], 'id': 1192, 'def': 'pan with a convex bottom; used for frying in Chinese cooking', 'name': 'wok'}, {'frequency': 'r', 'synset': 'wolf.n.01', 'synonyms': ['wolf'], 'id': 1193, 'def': 'a wild carnivorous mammal of the dog family, living and hunting in packs', 'name': 'wolf'}, {'frequency': 'c', 'synset': 'wooden_spoon.n.02', 'synonyms': ['wooden_spoon'], 'id': 1194, 'def': 'a spoon made of wood', 'name': 'wooden_spoon'}, {'frequency': 'c', 'synset': 'wreath.n.01', 'synonyms': ['wreath'], 'id': 1195, 'def': 'an arrangement of flowers, leaves, or stems fastened in a ring', 
'name': 'wreath'}, {'frequency': 'c', 'synset': 'wrench.n.03', 'synonyms': ['wrench', 'spanner'], 'id': 1196, 'def': 'a hand tool that is used to hold or twist a nut or bolt', 'name': 'wrench'}, {'frequency': 'f', 'synset': 'wristband.n.01', 'synonyms': ['wristband'], 'id': 1197, 'def': 'band consisting of a part of a sleeve that covers the wrist', 'name': 'wristband'}, {'frequency': 'f', 'synset': 'wristlet.n.01', 'synonyms': ['wristlet', 'wrist_band'], 'id': 1198, 'def': 'a band or bracelet worn around the wrist', 'name': 'wristlet'}, {'frequency': 'c', 'synset': 'yacht.n.01', 'synonyms': ['yacht'], 'id': 1199, 'def': 'an expensive vessel propelled by sail or power and used for cruising or racing', 'name': 'yacht'}, {'frequency': 'c', 'synset': 'yogurt.n.01', 'synonyms': ['yogurt', 'yoghurt', 'yoghourt'], 'id': 1200, 'def': 'a custard-like food made from curdled milk', 'name': 'yogurt'}, {'frequency': 'c', 'synset': 'yoke.n.07', 'synonyms': ['yoke_(animal_equipment)'], 'id': 1201, 'def': 'gear joining two animals at the neck; NOT egg yolk', 'name': 'yoke_(animal_equipment)'}, {'frequency': 'f', 'synset': 'zebra.n.01', 'synonyms': ['zebra'], 'id': 1202, 'def': 'any of several fleet black-and-white striped African equines', 'name': 'zebra'}, {'frequency': 'c', 'synset': 'zucchini.n.02', 'synonyms': ['zucchini', 'courgette'], 'id': 1203, 'def': 'small cucumber-shaped vegetable marrow; typically dark green', 'name': 'zucchini'}] # noqa -# fmt: on diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py deleted file mode 100755 index dbbf82cb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py +++ /dev/null @@ -1,82 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import numpy as np -import os -import xml.etree.ElementTree as ET -from typing import List, Tuple, Union - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.structures import BoxMode -from detectron2.utils.file_io import PathManager - -__all__ = ["load_voc_instances", "register_pascal_voc"] - - -# fmt: off -CLASS_NAMES = ( - "aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car", "cat", - "chair", "cow", "diningtable", "dog", "horse", "motorbike", "person", - "pottedplant", "sheep", "sofa", "train", "tvmonitor" -) -# fmt: on - - -def load_voc_instances(dirname: str, split: str, class_names: Union[List[str], Tuple[str, ...]]): - """ - Load Pascal VOC detection annotations to Detectron2 format. - - Args: - dirname: Contain "Annotations", "ImageSets", "JPEGImages" - split (str): one of "train", "test", "val", "trainval" - class_names: list or tuple of class names - """ - with PathManager.open(os.path.join(dirname, "ImageSets", "Main", split + ".txt")) as f: - fileids = np.loadtxt(f, dtype=np.str) - - # Needs to read many small annotation files. 
Makes sense at local - annotation_dirname = PathManager.get_local_path(os.path.join(dirname, "Annotations/")) - dicts = [] - for fileid in fileids: - anno_file = os.path.join(annotation_dirname, fileid + ".xml") - jpeg_file = os.path.join(dirname, "JPEGImages", fileid + ".jpg") - - with PathManager.open(anno_file) as f: - tree = ET.parse(f) - - r = { - "file_name": jpeg_file, - "image_id": fileid, - "height": int(tree.findall("./size/height")[0].text), - "width": int(tree.findall("./size/width")[0].text), - } - instances = [] - - for obj in tree.findall("object"): - cls = obj.find("name").text - # We include "difficult" samples in training. - # Based on limited experiments, they don't hurt accuracy. - # difficult = int(obj.find("difficult").text) - # if difficult == 1: - # continue - bbox = obj.find("bndbox") - bbox = [float(bbox.find(x).text) for x in ["xmin", "ymin", "xmax", "ymax"]] - # Original annotations are integers in the range [1, W or H] - # Assuming they mean 1-based pixel indices (inclusive), - # a box with annotation (xmin=1, xmax=W) covers the whole image. - # In coordinate space this is represented by (xmin=0, xmax=W) - bbox[0] -= 1.0 - bbox[1] -= 1.0 - instances.append( - {"category_id": class_names.index(cls), "bbox": bbox, "bbox_mode": BoxMode.XYXY_ABS} - ) - r["annotations"] = instances - dicts.append(r) - return dicts - - -def register_pascal_voc(name, dirname, split, year, class_names=CLASS_NAMES): - DatasetCatalog.register(name, lambda: load_voc_instances(dirname, split, class_names)) - MetadataCatalog.get(name).set( - thing_classes=list(class_names), dirname=dirname, year=year, split=split - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/register_coco.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/register_coco.py deleted file mode 100755 index e564438d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/datasets/register_coco.py +++ /dev/null @@ -1,3 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .coco import register_coco_instances # noqa -from .coco_panoptic import register_coco_panoptic_separated # noqa diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/detection_utils.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/detection_utils.py deleted file mode 100755 index 2707eb43..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/detection_utils.py +++ /dev/null @@ -1,623 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -Common data processing utilities that are used in a -typical object detection data pipeline. -""" -import logging -import numpy as np -from typing import List, Union -import pycocotools.mask as mask_util -import torch -from PIL import Image - -from detectron2.structures import ( - BitMasks, - Boxes, - BoxMode, - Instances, - Keypoints, - PolygonMasks, - RotatedBoxes, - polygons_to_bitmask, -) -from detectron2.utils.file_io import PathManager - -from . 
import transforms as T -from .catalog import MetadataCatalog - -__all__ = [ - "SizeMismatchError", - "convert_image_to_rgb", - "check_image_size", - "transform_proposals", - "transform_instance_annotations", - "annotations_to_instances", - "annotations_to_instances_rotated", - "build_augmentation", - "build_transform_gen", - "create_keypoint_hflip_indices", - "filter_empty_instances", - "read_image", -] - - -class SizeMismatchError(ValueError): - """ - When loaded image has difference width/height compared with annotation. - """ - - -# https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 -_M_RGB2YUV = [[0.299, 0.587, 0.114], [-0.14713, -0.28886, 0.436], [0.615, -0.51499, -0.10001]] -_M_YUV2RGB = [[1.0, 0.0, 1.13983], [1.0, -0.39465, -0.58060], [1.0, 2.03211, 0.0]] - -# https://www.exiv2.org/tags.html -_EXIF_ORIENT = 274 # exif 'Orientation' tag - - -def convert_PIL_to_numpy(image, format): - """ - Convert PIL image to numpy array of target format. - - Args: - image (PIL.Image): a PIL image - format (str): the format of output image - - Returns: - (np.ndarray): also see `read_image` - """ - if format is not None: - # PIL only supports RGB, so convert to RGB and flip channels over below - conversion_format = format - if format in ["BGR", "YUV-BT.601"]: - conversion_format = "RGB" - image = image.convert(conversion_format) - image = np.asarray(image) - # PIL squeezes out the channel dimension for "L", so make it HWC - if format == "L": - image = np.expand_dims(image, -1) - - # handle formats not supported by PIL - elif format == "BGR": - # flip channels if needed - image = image[:, :, ::-1] - elif format == "YUV-BT.601": - image = image / 255.0 - image = np.dot(image, np.array(_M_RGB2YUV).T) - - return image - - -def convert_image_to_rgb(image, format): - """ - Convert an image from given format to RGB. - - Args: - image (np.ndarray or Tensor): an HWC image - format (str): the format of input image, also see `read_image` - - Returns: - (np.ndarray): (H,W,3) RGB image in 0-255 range, can be either float or uint8 - """ - if isinstance(image, torch.Tensor): - image = image.cpu().numpy() - if format == "BGR": - image = image[:, :, [2, 1, 0]] - elif format == "YUV-BT.601": - image = np.dot(image, np.array(_M_YUV2RGB).T) - image = image * 255.0 - else: - if format == "L": - image = image[:, :, 0] - image = image.astype(np.uint8) - image = np.asarray(Image.fromarray(image, mode=format).convert("RGB")) - return image - - -def _apply_exif_orientation(image): - """ - Applies the exif orientation correctly. - - This code exists per the bug: - https://github.com/python-pillow/Pillow/issues/3973 - with the function `ImageOps.exif_transpose`. 
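The docstring here explains why a hand-rolled EXIF handler exists: the linked Pillow bug in `ImageOps.exif_transpose`. As a hedged aside, on Pillow releases where that bug is fixed, the library one-liner achieves the same orientation bake-in; a minimal sketch:

```python
# Sketch only -- assumes a recent Pillow where ImageOps.exif_transpose
# is reliable; it returns an upright image when an Orientation tag is set.
from PIL import Image, ImageOps

def read_upright(path):
    with Image.open(path) as im:
        return ImageOps.exif_transpose(im).convert("RGB")
```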
The Pillow source raises errors with - various methods, especially `tobytes` - - Function based on: - https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 - https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 - - Args: - image (PIL.Image): a PIL image - - Returns: - (PIL.Image): the PIL image with exif orientation applied, if applicable - """ - if not hasattr(image, "getexif"): - return image - - try: - exif = image.getexif() - except Exception: # https://github.com/facebookresearch/detectron2/issues/1885 - exif = None - - if exif is None: - return image - - orientation = exif.get(_EXIF_ORIENT) - - method = { - 2: Image.FLIP_LEFT_RIGHT, - 3: Image.ROTATE_180, - 4: Image.FLIP_TOP_BOTTOM, - 5: Image.TRANSPOSE, - 6: Image.ROTATE_270, - 7: Image.TRANSVERSE, - 8: Image.ROTATE_90, - }.get(orientation) - - if method is not None: - return image.transpose(method) - return image - - -def read_image(file_name, format=None): - """ - Read an image into the given format. - Will apply rotation and flipping if the image has such exif information. - - Args: - file_name (str): image file path - format (str): one of the supported image modes in PIL, or "BGR" or "YUV-BT.601". - - Returns: - image (np.ndarray): - an HWC image in the given format, which is 0-255, uint8 for - supported image modes in PIL or "BGR"; float (0-1 for Y) for YUV-BT.601. - """ - with PathManager.open(file_name, "rb") as f: - image = Image.open(f) - - # work around this bug: https://github.com/python-pillow/Pillow/issues/3973 - image = _apply_exif_orientation(image) - return convert_PIL_to_numpy(image, format) - - -def check_image_size(dataset_dict, image): - """ - Raise an error if the image does not match the size specified in the dict. - """ - if "width" in dataset_dict or "height" in dataset_dict: - image_wh = (image.shape[1], image.shape[0]) - expected_wh = (dataset_dict["width"], dataset_dict["height"]) - if not image_wh == expected_wh: - raise SizeMismatchError( - "Mismatched image shape{}, got {}, expect {}.".format( - " for image " + dataset_dict["file_name"] - if "file_name" in dataset_dict - else "", - image_wh, - expected_wh, - ) - + " Please check the width/height in your annotation." - ) - - # To ensure bbox always remap to original image size - if "width" not in dataset_dict: - dataset_dict["width"] = image.shape[1] - if "height" not in dataset_dict: - dataset_dict["height"] = image.shape[0] - - -def transform_proposals(dataset_dict, image_shape, transforms, *, proposal_topk, min_box_size=0): - """ - Apply transformations to the proposals in dataset_dict, if any. - - Args: - dataset_dict (dict): a dict read from the dataset, possibly - contains fields "proposal_boxes", "proposal_objectness_logits", "proposal_bbox_mode" - image_shape (tuple): height, width - transforms (TransformList): - proposal_topk (int): only keep top-K scoring proposals - min_box_size (int): proposals with either side smaller than this - threshold are removed - - The input dict is modified in-place, with abovementioned keys removed. A new - key "proposals" will be added. Its value is an `Instances` - object which contains the transformed proposals in its field - "proposal_boxes" and "objectness_logits". 
- """ - if "proposal_boxes" in dataset_dict: - # Transform proposal boxes - boxes = transforms.apply_box( - BoxMode.convert( - dataset_dict.pop("proposal_boxes"), - dataset_dict.pop("proposal_bbox_mode"), - BoxMode.XYXY_ABS, - ) - ) - boxes = Boxes(boxes) - objectness_logits = torch.as_tensor( - dataset_dict.pop("proposal_objectness_logits").astype("float32") - ) - - boxes.clip(image_shape) - keep = boxes.nonempty(threshold=min_box_size) - boxes = boxes[keep] - objectness_logits = objectness_logits[keep] - - proposals = Instances(image_shape) - proposals.proposal_boxes = boxes[:proposal_topk] - proposals.objectness_logits = objectness_logits[:proposal_topk] - dataset_dict["proposals"] = proposals - - -def transform_instance_annotations( - annotation, transforms, image_size, *, keypoint_hflip_indices=None -): - """ - Apply transforms to box, segmentation and keypoints annotations of a single instance. - - It will use `transforms.apply_box` for the box, and - `transforms.apply_coords` for segmentation polygons & keypoints. - If you need anything more specially designed for each data structure, - you'll need to implement your own version of this function or the transforms. - - Args: - annotation (dict): dict of instance annotations for a single instance. - It will be modified in-place. - transforms (TransformList or list[Transform]): - image_size (tuple): the height, width of the transformed image - keypoint_hflip_indices (ndarray[int]): see `create_keypoint_hflip_indices`. - - Returns: - dict: - the same input dict with fields "bbox", "segmentation", "keypoints" - transformed according to `transforms`. - The "bbox_mode" field will be set to XYXY_ABS. - """ - if isinstance(transforms, (tuple, list)): - transforms = T.TransformList(transforms) - # bbox is 1d (per-instance bounding box) - bbox = BoxMode.convert(annotation["bbox"], annotation["bbox_mode"], BoxMode.XYXY_ABS) - # clip transformed bbox to image size - bbox = transforms.apply_box(np.array([bbox]))[0].clip(min=0) - annotation["bbox"] = np.minimum(bbox, list(image_size + image_size)[::-1]) - annotation["bbox_mode"] = BoxMode.XYXY_ABS - - if "segmentation" in annotation: - # each instance contains 1 or more polygons - segm = annotation["segmentation"] - if isinstance(segm, list): - # polygons - polygons = [np.asarray(p).reshape(-1, 2) for p in segm] - annotation["segmentation"] = [ - p.reshape(-1) for p in transforms.apply_polygons(polygons) - ] - elif isinstance(segm, dict): - # RLE - mask = mask_util.decode(segm) - mask = transforms.apply_segmentation(mask) - assert tuple(mask.shape[:2]) == image_size - annotation["segmentation"] = mask - else: - raise ValueError( - "Cannot transform segmentation of type '{}'!" - "Supported types are: polygons as list[list[float] or ndarray]," - " COCO-style RLE as a dict.".format(type(segm)) - ) - - if "keypoints" in annotation: - keypoints = transform_keypoint_annotations( - annotation["keypoints"], transforms, image_size, keypoint_hflip_indices - ) - annotation["keypoints"] = keypoints - - return annotation - - -def transform_keypoint_annotations(keypoints, transforms, image_size, keypoint_hflip_indices=None): - """ - Transform keypoint annotations of an image. - If a keypoint is transformed out of image boundary, it will be marked "unlabeled" (visibility=0) - - Args: - keypoints (list[float]): Nx3 float in Detectron2's Dataset format. - Each point is represented by (x, y, visibility). 
- transforms (TransformList): - image_size (tuple): the height, width of the transformed image - keypoint_hflip_indices (ndarray[int]): see `create_keypoint_hflip_indices`. - When `transforms` includes horizontal flip, will use the index - mapping to flip keypoints. - """ - # (N*3,) -> (N, 3) - keypoints = np.asarray(keypoints, dtype="float64").reshape(-1, 3) - keypoints_xy = transforms.apply_coords(keypoints[:, :2]) - - # Set all out-of-boundary points to "unlabeled" - inside = (keypoints_xy >= np.array([0, 0])) & (keypoints_xy <= np.array(image_size[::-1])) - inside = inside.all(axis=1) - keypoints[:, :2] = keypoints_xy - keypoints[:, 2][~inside] = 0 - - # This assumes that HorizFlipTransform is the only one that does flip - do_hflip = sum(isinstance(t, T.HFlipTransform) for t in transforms.transforms) % 2 == 1 - - # Alternative way: check if probe points was horizontally flipped. - # probe = np.asarray([[0.0, 0.0], [image_width, 0.0]]) - # probe_aug = transforms.apply_coords(probe.copy()) - # do_hflip = np.sign(probe[1][0] - probe[0][0]) != np.sign(probe_aug[1][0] - probe_aug[0][0]) # noqa - - # If flipped, swap each keypoint with its opposite-handed equivalent - if do_hflip: - if keypoint_hflip_indices is None: - raise ValueError("Cannot flip keypoints without providing flip indices!") - if len(keypoints) != len(keypoint_hflip_indices): - raise ValueError( - "Keypoint data has {} points, but metadata " - "contains {} points!".format(len(keypoints), len(keypoint_hflip_indices)) - ) - keypoints = keypoints[np.asarray(keypoint_hflip_indices, dtype=np.int32), :] - - # Maintain COCO convention that if visibility == 0 (unlabeled), then x, y = 0 - keypoints[keypoints[:, 2] == 0] = 0 - return keypoints - - -def annotations_to_instances(annos, image_size, mask_format="polygon"): - """ - Create an :class:`Instances` object used by the models, - from instance annotations in the dataset dict. - - Args: - annos (list[dict]): a list of instance annotations in one image, each - element for one instance. - image_size (tuple): height, width - - Returns: - Instances: - It will contain fields "gt_boxes", "gt_classes", - "gt_masks", "gt_keypoints", if they can be obtained from `annos`. - This is the format that builtin models expect. - """ - boxes = ( - np.stack( - [BoxMode.convert(obj["bbox"], obj["bbox_mode"], BoxMode.XYXY_ABS) for obj in annos] - ) - if len(annos) - else np.zeros((0, 4)) - ) - target = Instances(image_size) - target.gt_boxes = Boxes(boxes) - - classes = [int(obj["category_id"]) for obj in annos] - classes = torch.tensor(classes, dtype=torch.int64) - target.gt_classes = classes - - if len(annos) and "segmentation" in annos[0]: - segms = [obj["segmentation"] for obj in annos] - if mask_format == "polygon": - try: - masks = PolygonMasks(segms) - except ValueError as e: - raise ValueError( - "Failed to use mask_format=='polygon' from the given annotations!" - ) from e - else: - assert mask_format == "bitmask", mask_format - masks = [] - for segm in segms: - if isinstance(segm, list): - # polygon - masks.append(polygons_to_bitmask(segm, *image_size)) - elif isinstance(segm, dict): - # COCO RLE - masks.append(mask_util.decode(segm)) - elif isinstance(segm, np.ndarray): - assert segm.ndim == 2, "Expect segmentation of 2 dimensions, got {}.".format( - segm.ndim - ) - # mask array - masks.append(segm) - else: - raise ValueError( - "Cannot convert segmentation of type '{}' to BitMasks!" 
- "Supported types are: polygons as list[list[float] or ndarray]," - " COCO-style RLE as a dict, or a binary segmentation mask " - " in a 2D numpy array of shape HxW.".format(type(segm)) - ) - # torch.from_numpy does not support array with negative stride. - masks = BitMasks( - torch.stack([torch.from_numpy(np.ascontiguousarray(x)) for x in masks]) - ) - target.gt_masks = masks - - if len(annos) and "keypoints" in annos[0]: - kpts = [obj.get("keypoints", []) for obj in annos] - target.gt_keypoints = Keypoints(kpts) - - return target - - -def annotations_to_instances_rotated(annos, image_size): - """ - Create an :class:`Instances` object used by the models, - from instance annotations in the dataset dict. - Compared to `annotations_to_instances`, this function is for rotated boxes only - - Args: - annos (list[dict]): a list of instance annotations in one image, each - element for one instance. - image_size (tuple): height, width - - Returns: - Instances: - Containing fields "gt_boxes", "gt_classes", - if they can be obtained from `annos`. - This is the format that builtin models expect. - """ - boxes = [obj["bbox"] for obj in annos] - target = Instances(image_size) - boxes = target.gt_boxes = RotatedBoxes(boxes) - boxes.clip(image_size) - - classes = [obj["category_id"] for obj in annos] - classes = torch.tensor(classes, dtype=torch.int64) - target.gt_classes = classes - - return target - - -def filter_empty_instances( - instances, by_box=True, by_mask=True, box_threshold=1e-5, return_mask=False -): - """ - Filter out empty instances in an `Instances` object. - - Args: - instances (Instances): - by_box (bool): whether to filter out instances with empty boxes - by_mask (bool): whether to filter out instances with empty masks - box_threshold (float): minimum width and height to be considered non-empty - return_mask (bool): whether to return boolean mask of filtered instances - - Returns: - Instances: the filtered instances. - tensor[bool], optional: boolean mask of filtered instances - """ - assert by_box or by_mask - r = [] - if by_box: - r.append(instances.gt_boxes.nonempty(threshold=box_threshold)) - if instances.has("gt_masks") and by_mask: - r.append(instances.gt_masks.nonempty()) - - # TODO: can also filter visible keypoints - - if not r: - return instances - m = r[0] - for x in r[1:]: - m = m & x - if return_mask: - return instances[m], m - return instances[m] - - -def create_keypoint_hflip_indices(dataset_names: Union[str, List[str]]) -> List[int]: - """ - Args: - dataset_names: list of dataset names - - Returns: - list[int]: a list of size=#keypoints, storing the - horizontally-flipped keypoint indices. - """ - if isinstance(dataset_names, str): - dataset_names = [dataset_names] - - check_metadata_consistency("keypoint_names", dataset_names) - check_metadata_consistency("keypoint_flip_map", dataset_names) - - meta = MetadataCatalog.get(dataset_names[0]) - names = meta.keypoint_names - # TODO flip -> hflip - flip_map = dict(meta.keypoint_flip_map) - flip_map.update({v: k for k, v in flip_map.items()}) - flipped_names = [i if i not in flip_map else flip_map[i] for i in names] - flip_indices = [names.index(i) for i in flipped_names] - return flip_indices - - -def gen_crop_transform_with_instance(crop_size, image_size, instance): - """ - Generate a CropTransform so that the cropping region contains - the center of the given instance. 
- - Args: - crop_size (tuple): h, w in pixels - image_size (tuple): h, w - instance (dict): an annotation dict of one instance, in Detectron2's - dataset format. - """ - crop_size = np.asarray(crop_size, dtype=np.int32) - bbox = BoxMode.convert(instance["bbox"], instance["bbox_mode"], BoxMode.XYXY_ABS) - center_yx = (bbox[1] + bbox[3]) * 0.5, (bbox[0] + bbox[2]) * 0.5 - assert ( - image_size[0] >= center_yx[0] and image_size[1] >= center_yx[1] - ), "The annotation bounding box is outside of the image!" - assert ( - image_size[0] >= crop_size[0] and image_size[1] >= crop_size[1] - ), "Crop size is larger than image size!" - - min_yx = np.maximum(np.floor(center_yx).astype(np.int32) - crop_size, 0) - max_yx = np.maximum(np.asarray(image_size, dtype=np.int32) - crop_size, 0) - max_yx = np.minimum(max_yx, np.ceil(center_yx).astype(np.int32)) - - y0 = np.random.randint(min_yx[0], max_yx[0] + 1) - x0 = np.random.randint(min_yx[1], max_yx[1] + 1) - return T.CropTransform(x0, y0, crop_size[1], crop_size[0]) - - -def check_metadata_consistency(key, dataset_names): - """ - Check that the datasets have consistent metadata. - - Args: - key (str): a metadata key - dataset_names (list[str]): a list of dataset names - - Raises: - AttributeError: if the key does not exist in the metadata - ValueError: if the given datasets do not have the same metadata values defined by key - """ - if len(dataset_names) == 0: - return - logger = logging.getLogger(__name__) - entries_per_dataset = [getattr(MetadataCatalog.get(d), key) for d in dataset_names] - for idx, entry in enumerate(entries_per_dataset): - if entry != entries_per_dataset[0]: - logger.error( - "Metadata '{}' for dataset '{}' is '{}'".format(key, dataset_names[idx], str(entry)) - ) - logger.error( - "Metadata '{}' for dataset '{}' is '{}'".format( - key, dataset_names[0], str(entries_per_dataset[0]) - ) - ) - raise ValueError("Datasets have different metadata '{}'!".format(key)) - - -def build_augmentation(cfg, is_train): - """ - Create a list of default :class:`Augmentation` from config. - Now it includes resizing and flipping. - - Returns: - list[Augmentation] - """ - if is_train: - min_size = cfg.INPUT.MIN_SIZE_TRAIN - max_size = cfg.INPUT.MAX_SIZE_TRAIN - sample_style = cfg.INPUT.MIN_SIZE_TRAIN_SAMPLING - else: - min_size = cfg.INPUT.MIN_SIZE_TEST - max_size = cfg.INPUT.MAX_SIZE_TEST - sample_style = "choice" - augmentation = [T.ResizeShortestEdge(min_size, max_size, sample_style)] - if is_train and cfg.INPUT.RANDOM_FLIP != "none": - augmentation.append( - T.RandomFlip( - horizontal=cfg.INPUT.RANDOM_FLIP == "horizontal", - vertical=cfg.INPUT.RANDOM_FLIP == "vertical", - ) - ) - return augmentation - - -build_transform_gen = build_augmentation -""" -Alias for backward-compatibility. -""" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/__init__.py deleted file mode 100755 index 85c9f1a9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
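For one illustrative config (the values below are invented for the sketch, not taken from any real cfg), the list returned by `build_augmentation(cfg, is_train=True)` boils down to a resize plus an optional flip; this assumes a detectron2 installation is importable:

```python
# Hand-rolled equivalent of build_augmentation for an illustrative config:
# MIN_SIZE_TRAIN=(640, 800), MAX_SIZE_TRAIN=1333, RANDOM_FLIP="horizontal".
from detectron2.data import transforms as T

augmentation = [
    T.ResizeShortestEdge((640, 800), 1333, "range"),
    T.RandomFlip(horizontal=True, vertical=False),
]
```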
-from .distributed_sampler import ( - InferenceSampler, - RandomSubsetTrainingSampler, - RepeatFactorTrainingSampler, - TrainingSampler, -) - -from .grouped_batch_sampler import GroupedBatchSampler - -__all__ = [ - "GroupedBatchSampler", - "TrainingSampler", - "RandomSubsetTrainingSampler", - "InferenceSampler", - "RepeatFactorTrainingSampler", -] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py deleted file mode 100755 index a098e6ac..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py +++ /dev/null @@ -1,278 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import logging -import math -from collections import defaultdict -from typing import Optional -import torch -from torch.utils.data.sampler import Sampler - -from detectron2.utils import comm - -logger = logging.getLogger(__name__) - - -class TrainingSampler(Sampler): - """ - In training, we only care about the "infinite stream" of training data. - So this sampler produces an infinite stream of indices and - all workers cooperate to correctly shuffle the indices and sample different indices. - - The samplers in each worker effectively produces `indices[worker_id::num_workers]` - where `indices` is an infinite stream of indices consisting of - `shuffle(range(size)) + shuffle(range(size)) + ...` (if shuffle is True) - or `range(size) + range(size) + ...` (if shuffle is False) - - Note that this sampler does not shard based on pytorch DataLoader worker id. - A sampler passed to pytorch DataLoader is used only with map-style dataset - and will not be executed inside workers. - But if this sampler is used in a way that it gets execute inside a dataloader - worker, then extra work needs to be done to shard its outputs based on worker id. - This is required so that workers don't produce identical data. - :class:`ToIterableDataset` implements this logic. - This note is true for all samplers in detectron2. - """ - - def __init__(self, size: int, shuffle: bool = True, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - shuffle (bool): whether to shuffle the indices or not - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - if not isinstance(size, int): - raise TypeError(f"TrainingSampler(size=) expects an int. Got type {type(size)}.") - if size <= 0: - raise ValueError(f"TrainingSampler(size=) expects a positive int. Got {size}.") - self._size = size - self._shuffle = shuffle - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - def __iter__(self): - start = self._rank - yield from itertools.islice(self._infinite_indices(), start, None, self._world_size) - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - if self._shuffle: - yield from torch.randperm(self._size, generator=g).tolist() - else: - yield from torch.arange(self._size).tolist() - - -class RandomSubsetTrainingSampler(TrainingSampler): - """ - Similar to TrainingSampler, but only sample a random subset of indices. 
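The `__iter__`/`_infinite_indices` pair above shards one seed-shared shuffled stream so that rank `r` consumes `indices[r::world_size]`. A runnable sketch of that scheme with toy sizes:

```python
# Sketch of TrainingSampler's rank sharding: every rank slices the same
# seeded infinite permutation stream at a different offset.
import itertools
import torch

def infinite_indices(size, seed=7):
    g = torch.Generator()
    g.manual_seed(seed)
    while True:
        yield from torch.randperm(size, generator=g).tolist()

size, world_size = 5, 2
per_rank = [list(itertools.islice(infinite_indices(size), rank, 10, world_size))
            for rank in range(world_size)]
print(per_rank)  # the two ranks interleave one common stream without overlap
```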
- This is useful when you want to estimate the accuracy vs data-number curves by - training the model with different subset_ratio. - """ - - def __init__( - self, - size: int, - subset_ratio: float, - shuffle: bool = True, - seed_shuffle: Optional[int] = None, - seed_subset: Optional[int] = None, - ): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - subset_ratio (float): the ratio of subset data to sample from the underlying dataset - shuffle (bool): whether to shuffle the indices or not - seed_shuffle (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - seed_subset (int): the seed to randomize the subset to be sampled. - Must be the same across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - super().__init__(size=size, shuffle=shuffle, seed=seed_shuffle) - - assert 0.0 < subset_ratio <= 1.0 - self._size_subset = int(size * subset_ratio) - assert self._size_subset > 0 - if seed_subset is None: - seed_subset = comm.shared_random_seed() - self._seed_subset = int(seed_subset) - - # randomly generate the subset indexes to be sampled from - g = torch.Generator() - g.manual_seed(self._seed_subset) - indexes_randperm = torch.randperm(self._size, generator=g) - self._indexes_subset = indexes_randperm[: self._size_subset] - - logger.info("Using RandomSubsetTrainingSampler......") - logger.info(f"Randomly sample {self._size_subset} data from the original {self._size} data") - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) # self._seed equals seed_shuffle from __init__() - while True: - if self._shuffle: - # generate a random permutation to shuffle self._indexes_subset - randperm = torch.randperm(self._size_subset, generator=g) - yield from self._indexes_subset[randperm].tolist() - else: - yield from self._indexes_subset.tolist() - - -class RepeatFactorTrainingSampler(Sampler): - """ - Similar to TrainingSampler, but a sample may appear more times than others based - on its "repeat factor". This is suitable for training on class imbalanced datasets like LVIS. - """ - - def __init__(self, repeat_factors, *, shuffle=True, seed=None): - """ - Args: - repeat_factors (Tensor): a float vector, the repeat factor for each indice. When it's - full of ones, it is equivalent to ``TrainingSampler(len(repeat_factors), ...)``. - shuffle (bool): whether to shuffle the indices or not - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - self._shuffle = shuffle - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - # Split into whole number (_int_part) and fractional (_frac_part) parts. - self._int_part = torch.trunc(repeat_factors) - self._frac_part = repeat_factors - self._int_part - - @staticmethod - def repeat_factors_from_category_frequency(dataset_dicts, repeat_thresh): - """ - Compute (fractional) per-image repeat factors based on category frequency. - The repeat factor for an image is a function of the frequency of the rarest - category labeled in that image. 
The "frequency of category c" in [0, 1] is defined - as the fraction of images in the training set (without repeats) in which category c - appears. - See :paper:`lvis` (>= v2) Appendix B.2. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 dataset format. - repeat_thresh (float): frequency threshold below which data is repeated. - If the frequency is half of `repeat_thresh`, the image will be - repeated twice. - - Returns: - torch.Tensor: - the i-th element is the repeat factor for the dataset image at index i. - """ - # 1. For each category c, compute the fraction of images that contain it: f(c) - category_freq = defaultdict(int) - for dataset_dict in dataset_dicts: # For each image (without repeats) - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - for cat_id in cat_ids: - category_freq[cat_id] += 1 - num_images = len(dataset_dicts) - for k, v in category_freq.items(): - category_freq[k] = v / num_images - - # 2. For each category c, compute the category-level repeat factor: - # r(c) = max(1, sqrt(t / f(c))) - category_rep = { - cat_id: max(1.0, math.sqrt(repeat_thresh / cat_freq)) - for cat_id, cat_freq in category_freq.items() - } - - # 3. For each image I, compute the image-level repeat factor: - # r(I) = max_{c in I} r(c) - rep_factors = [] - for dataset_dict in dataset_dicts: - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - rep_factor = max({category_rep[cat_id] for cat_id in cat_ids}, default=1.0) - rep_factors.append(rep_factor) - - return torch.tensor(rep_factors, dtype=torch.float32) - - def _get_epoch_indices(self, generator): - """ - Create a list of dataset indices (with repeats) to use for one epoch. - - Args: - generator (torch.Generator): pseudo random number generator used for - stochastic rounding. - - Returns: - torch.Tensor: list of dataset indices to use in one epoch. Each index - is repeated based on its calculated repeat factor. - """ - # Since repeat factors are fractional, we use stochastic rounding so - # that the target repeat factor is achieved in expectation over the - # course of training - rands = torch.rand(len(self._frac_part), generator=generator) - rep_factors = self._int_part + (rands < self._frac_part).float() - # Construct a list of indices in which we repeat images as specified - indices = [] - for dataset_index, rep_factor in enumerate(rep_factors): - indices.extend([dataset_index] * int(rep_factor.item())) - return torch.tensor(indices, dtype=torch.int64) - - def __iter__(self): - start = self._rank - yield from itertools.islice(self._infinite_indices(), start, None, self._world_size) - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - # Sample indices with repeats determined by stochastic rounding; each - # "epoch" may have a slightly different size due to the rounding. - indices = self._get_epoch_indices(g) - if self._shuffle: - randperm = torch.randperm(len(indices), generator=g) - yield from indices[randperm].tolist() - else: - yield from indices.tolist() - - -class InferenceSampler(Sampler): - """ - Produce indices for inference across all workers. - Inference needs to run on the __exact__ set of samples, - therefore when the total number of samples is not divisible by the number of workers, - this sampler produces different number of samples on different workers. 
- """ - - def __init__(self, size: int): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - """ - self._size = size - assert size > 0 - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - self._local_indices = self._get_local_indices(size, self._world_size, self._rank) - - @staticmethod - def _get_local_indices(total_size, world_size, rank): - shard_size = total_size // world_size - left = total_size % world_size - shard_sizes = [shard_size + int(r < left) for r in range(world_size)] - - begin = sum(shard_sizes[:rank]) - end = min(sum(shard_sizes[: rank + 1]), total_size) - return range(begin, end) - - def __iter__(self): - yield from self._local_indices - - def __len__(self): - return len(self._local_indices) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py deleted file mode 100755 index 5b247730..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py +++ /dev/null @@ -1,47 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from torch.utils.data.sampler import BatchSampler, Sampler - - -class GroupedBatchSampler(BatchSampler): - """ - Wraps another sampler to yield a mini-batch of indices. - It enforces that the batch only contain elements from the same group. - It also tries to provide mini-batches which follows an ordering which is - as close as possible to the ordering from the original sampler. - """ - - def __init__(self, sampler, group_ids, batch_size): - """ - Args: - sampler (Sampler): Base sampler. - group_ids (list[int]): If the sampler produces indices in range [0, N), - `group_ids` must be a list of `N` ints which contains the group id of each sample. - The group ids must be a set of integers in the range [0, num_groups). - batch_size (int): Size of mini-batch. - """ - if not isinstance(sampler, Sampler): - raise ValueError( - "sampler should be an instance of " - "torch.utils.data.Sampler, but got sampler={}".format(sampler) - ) - self.sampler = sampler - self.group_ids = np.asarray(group_ids) - assert self.group_ids.ndim == 1 - self.batch_size = batch_size - groups = np.unique(self.group_ids).tolist() - - # buffer the indices of each group until batch size is reached - self.buffer_per_group = {k: [] for k in groups} - - def __iter__(self): - for idx in self.sampler: - group_id = self.group_ids[idx] - group_buffer = self.buffer_per_group[group_id] - group_buffer.append(idx) - if len(group_buffer) == self.batch_size: - yield group_buffer[:] # yield a copy of the list - del group_buffer[:] - - def __len__(self): - raise NotImplementedError("len() of GroupedBatchSampler is not well-defined.") diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/__init__.py deleted file mode 100755 index ab3c63b5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from fvcore.transforms.transform import Transform, TransformList # order them first -from fvcore.transforms.transform import * -from .transform import * -from .augmentation import * -from .augmentation_impl import * - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation.py deleted file mode 100755 index 48be5b1b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation.py +++ /dev/null @@ -1,377 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import inspect -import numpy as np -import pprint -from typing import Any, List, Optional, Tuple, Union -from fvcore.transforms.transform import Transform, TransformList - -""" -See "Data Augmentation" tutorial for an overview of the system: -https://detectron2.readthedocs.io/tutorials/augmentation.html -""" - - -__all__ = [ - "Augmentation", - "AugmentationList", - "AugInput", - "TransformGen", - "apply_transform_gens", - "StandardAugInput", - "apply_augmentations", -] - - -def _check_img_dtype(img): - assert isinstance(img, np.ndarray), "[Augmentation] Needs an numpy array, but got a {}!".format( - type(img) - ) - assert not isinstance(img.dtype, np.integer) or ( - img.dtype == np.uint8 - ), "[Augmentation] Got image of type {}, use uint8 or floating points instead!".format( - img.dtype - ) - assert img.ndim in [2, 3], img.ndim - - -def _get_aug_input_args(aug, aug_input) -> List[Any]: - """ - Get the arguments to be passed to ``aug.get_transform`` from the input ``aug_input``. - """ - if aug.input_args is None: - # Decide what attributes are needed automatically - prms = list(inspect.signature(aug.get_transform).parameters.items()) - # The default behavior is: if there is one parameter, then its "image" - # (work automatically for majority of use cases, and also avoid BC breaking), - # Otherwise, use the argument names. - if len(prms) == 1: - names = ("image",) - else: - names = [] - for name, prm in prms: - if prm.kind in (inspect.Parameter.VAR_POSITIONAL, inspect.Parameter.VAR_KEYWORD): - raise TypeError( - f""" \ -The default implementation of `{type(aug)}.__call__` does not allow \ -`{type(aug)}.get_transform` to use variable-length arguments (*args, **kwargs)! \ -If arguments are unknown, reimplement `__call__` instead. \ -""" - ) - names.append(name) - aug.input_args = tuple(names) - - args = [] - for f in aug.input_args: - try: - args.append(getattr(aug_input, f)) - except AttributeError as e: - raise AttributeError( - f"{type(aug)}.get_transform needs input attribute '{f}', " - f"but it is not an attribute of {type(aug_input)}!" - ) from e - return args - - -class Augmentation: - """ - Augmentation defines (often random) policies/strategies to generate :class:`Transform` - from data. It is often used for pre-processing of input data. - - A "policy" that generates a :class:`Transform` may, in the most general case, - need arbitrary information from input data in order to determine what transforms - to apply. Therefore, each :class:`Augmentation` instance defines the arguments - needed by its :meth:`get_transform` method. 
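Because `_get_aug_input_args` above resolves a single `get_transform` parameter as `"image"`, a custom policy only needs to return a `Transform`. A minimal sketch, assuming detectron2 is importable (the policy itself is invented for illustration):

```python
# Minimal custom Augmentation: flip only images wider than they are tall.
import numpy as np
from detectron2.data import transforms as T
from fvcore.transforms.transform import HFlipTransform, NoOpTransform

class FlipWideImages(T.Augmentation):
    """Toy policy: horizontal flip iff width > height."""
    def get_transform(self, image):
        h, w = image.shape[:2]
        return HFlipTransform(w) if w > h else NoOpTransform()

aug_input = T.AugInput(np.zeros((8, 16, 3), dtype=np.uint8))
tfm = FlipWideImages()(aug_input)   # __call__ applies the transform in place
print(type(tfm).__name__)           # HFlipTransform
```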
When called with the positional arguments, - the :meth:`get_transform` method executes the policy. - - Note that :class:`Augmentation` defines the policies to create a :class:`Transform`, - but not how to execute the actual transform operations to those data. - Its :meth:`__call__` method will use :meth:`AugInput.transform` to execute the transform. - - The returned `Transform` object is meant to describe deterministic transformation, which means - it can be re-applied on associated data, e.g. the geometry of an image and its segmentation - masks need to be transformed together. - (If such re-application is not needed, then determinism is not a crucial requirement.) - """ - - input_args: Optional[Tuple[str]] = None - """ - Stores the attribute names needed by :meth:`get_transform`, e.g. ``("image", "sem_seg")``. - By default, it is just a tuple of argument names in :meth:`self.get_transform`, which often only - contain "image". As long as the argument name convention is followed, there is no need for - users to touch this attribute. - """ - - def _init(self, params=None): - if params: - for k, v in params.items(): - if k != "self" and not k.startswith("_"): - setattr(self, k, v) - - def get_transform(self, *args) -> Transform: - """ - Execute the policy based on input data, and decide what transform to apply to inputs. - - Args: - args: Any fixed-length positional arguments. By default, the name of the arguments - should exist in the :class:`AugInput` to be used. - - Returns: - Transform: Returns the deterministic transform to apply to the input. - - Examples: - :: - class MyAug: - # if a policy needs to know both image and semantic segmentation - def get_transform(image, sem_seg) -> T.Transform: - pass - tfm: Transform = MyAug().get_transform(image, sem_seg) - new_image = tfm.apply_image(image) - - Notes: - Users can freely use arbitrary new argument names in custom - :meth:`get_transform` method, as long as they are available in the - input data. In detectron2 we use the following convention: - - * image: (H,W) or (H,W,C) ndarray of type uint8 in range [0, 255], or - floating point in range [0, 1] or [0, 255]. - * boxes: (N,4) ndarray of float32. It represents the instance bounding boxes - of N instances. Each is in XYXY format in unit of absolute coordinates. - * sem_seg: (H,W) ndarray of type uint8. Each element is an integer label of pixel. - - We do not specify convention for other types and do not include builtin - :class:`Augmentation` that uses other types in detectron2. - """ - raise NotImplementedError - - def __call__(self, aug_input) -> Transform: - """ - Augment the given `aug_input` **in-place**, and return the transform that's used. - - This method will be called to apply the augmentation. In most augmentation, it - is enough to use the default implementation, which calls :meth:`get_transform` - using the inputs. But a subclass can overwrite it to have more complicated logic. - - Args: - aug_input (AugInput): an object that has attributes needed by this augmentation - (defined by ``self.get_transform``). Its ``transform`` method will be called - to in-place transform it. - - Returns: - Transform: the transform that is applied on the input. - """ - args = _get_aug_input_args(self, aug_input) - tfm = self.get_transform(*args) - assert isinstance(tfm, (Transform, TransformList)), ( - f"{type(self)}.get_transform must return an instance of Transform! " - f"Got {type(tfm)} instead." 
- ) - aug_input.transform(tfm) - return tfm - - def _rand_range(self, low=1.0, high=None, size=None): - """ - Uniform float random number between low and high. - """ - if high is None: - low, high = 0, low - if size is None: - size = [] - return np.random.uniform(low, high, size) - - def __repr__(self): - """ - Produce something like: - "MyAugmentation(field1={self.field1}, field2={self.field2})" - """ - try: - sig = inspect.signature(self.__init__) - classname = type(self).__name__ - argstr = [] - for name, param in sig.parameters.items(): - assert ( - param.kind != param.VAR_POSITIONAL and param.kind != param.VAR_KEYWORD - ), "The default __repr__ doesn't support *args or **kwargs" - assert hasattr(self, name), ( - "Attribute {} not found! " - "Default __repr__ only works if attributes match the constructor.".format(name) - ) - attr = getattr(self, name) - default = param.default - if default is attr: - continue - attr_str = pprint.pformat(attr) - if "\n" in attr_str: - # don't show it if pformat decides to use >1 lines - attr_str = "..." - argstr.append("{}={}".format(name, attr_str)) - return "{}({})".format(classname, ", ".join(argstr)) - except AssertionError: - return super().__repr__() - - __str__ = __repr__ - - -def _transform_to_aug(tfm_or_aug): - """ - Wrap Transform into Augmentation. - Private, used internally to implement augmentations. - """ - assert isinstance(tfm_or_aug, (Transform, Augmentation)), tfm_or_aug - if isinstance(tfm_or_aug, Augmentation): - return tfm_or_aug - else: - - class _TransformToAug(Augmentation): - def __init__(self, tfm: Transform): - self.tfm = tfm - - def get_transform(self, *args): - return self.tfm - - def __repr__(self): - return repr(self.tfm) - - __str__ = __repr__ - - return _TransformToAug(tfm_or_aug) - - -class AugmentationList(Augmentation): - """ - Apply a sequence of augmentations. - - It has ``__call__`` method to apply the augmentations. - - Note that :meth:`get_transform` method is impossible (will throw error if called) - for :class:`AugmentationList`, because in order to apply a sequence of augmentations, - the kth augmentation must be applied first, to provide inputs needed by the (k+1)th - augmentation. - """ - - def __init__(self, augs): - """ - Args: - augs (list[Augmentation or Transform]): - """ - super().__init__() - self.augs = [_transform_to_aug(x) for x in augs] - - def __call__(self, aug_input) -> Transform: - tfms = [] - for x in self.augs: - tfm = x(aug_input) - tfms.append(tfm) - return TransformList(tfms) - - def __repr__(self): - msgs = [str(x) for x in self.augs] - return "AugmentationList[{}]".format(", ".join(msgs)) - - __str__ = __repr__ - - -class AugInput: - """ - Input that can be used with :meth:`Augmentation.__call__`. - This is a standard implementation for the majority of use cases. - This class provides the standard attributes **"image", "boxes", "sem_seg"** - defined in :meth:`__init__` and they may be needed by different augmentations. - Most augmentation policies do not need attributes beyond these three. - - After applying augmentations to these attributes (using :meth:`AugInput.transform`), - the returned transforms can then be used to transform other data structures that users have. 
- - Examples: - :: - input = AugInput(image, boxes=boxes) - tfms = augmentation(input) - transformed_image = input.image - transformed_boxes = input.boxes - transformed_other_data = tfms.apply_other(other_data) - - An extended project that works with new data types may implement augmentation policies - that need other inputs. An algorithm may need to transform inputs in a way different - from the standard approach defined in this class. In those rare situations, users can - implement a class similar to this class, that satify the following condition: - - * The input must provide access to these data in the form of attribute access - (``getattr``). For example, if an :class:`Augmentation` to be applied needs "image" - and "sem_seg" arguments, its input must have the attribute "image" and "sem_seg". - * The input must have a ``transform(tfm: Transform) -> None`` method which - in-place transforms all its attributes. - """ - - # TODO maybe should support more builtin data types here - def __init__( - self, - image: np.ndarray, - *, - boxes: Optional[np.ndarray] = None, - sem_seg: Optional[np.ndarray] = None, - ): - """ - Args: - image (ndarray): (H,W) or (H,W,C) ndarray of type uint8 in range [0, 255], or - floating point in range [0, 1] or [0, 255]. The meaning of C is up - to users. - boxes (ndarray or None): Nx4 float32 boxes in XYXY_ABS mode - sem_seg (ndarray or None): HxW uint8 semantic segmentation mask. Each element - is an integer label of pixel. - """ - _check_img_dtype(image) - self.image = image - self.boxes = boxes - self.sem_seg = sem_seg - - def transform(self, tfm: Transform) -> None: - """ - In-place transform all attributes of this class. - - By "in-place", it means after calling this method, accessing an attribute such - as ``self.image`` will return transformed data. - """ - self.image = tfm.apply_image(self.image) - if self.boxes is not None: - self.boxes = tfm.apply_box(self.boxes) - if self.sem_seg is not None: - self.sem_seg = tfm.apply_segmentation(self.sem_seg) - - def apply_augmentations( - self, augmentations: List[Union[Augmentation, Transform]] - ) -> TransformList: - """ - Equivalent of ``AugmentationList(augmentations)(self)`` - """ - return AugmentationList(augmentations)(self) - - -def apply_augmentations(augmentations: List[Union[Transform, Augmentation]], inputs): - """ - Use ``T.AugmentationList(augmentations)(inputs)`` instead. - """ - if isinstance(inputs, np.ndarray): - # handle the common case of image-only Augmentation, also for backward compatibility - image_only = True - inputs = AugInput(inputs) - else: - image_only = False - tfms = inputs.apply_augmentations(augmentations) - return inputs.image if image_only else inputs, tfms - - -apply_transform_gens = apply_augmentations -""" -Alias for backward-compatibility. -""" - -TransformGen = Augmentation -""" -Alias for Augmentation, since it is something that generates :class:`Transform`s -""" - -StandardAugInput = AugInput -""" -Alias for compatibility. It's not worth the complexity to have two classes. -""" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py deleted file mode 100755 index 652a34a9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py +++ /dev/null @@ -1,614 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. 
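An end-to-end sketch of the `AugInput` contract described above, assuming detectron2 and fvcore are importable: image and boxes stay geometrically consistent because both pass through the same `TransformList`, which can then be reused on other data:

```python
import numpy as np
from detectron2.data import transforms as T

image = np.zeros((100, 200, 3), dtype=np.uint8)
boxes = np.array([[10.0, 10.0, 50.0, 40.0]], dtype=np.float32)  # XYXY_ABS

aug_input = T.AugInput(image, boxes=boxes)
tfms = T.AugmentationList([T.RandomFlip(prob=1.0, horizontal=True)])(aug_input)

print(aug_input.boxes)                            # [[150. 10. 190. 40.]]
print(tfms.apply_coords(np.array([[0.0, 0.0]])))  # reuse on other data -> [[200. 0.]]
```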
-""" -Implement many useful :class:`Augmentation`. -""" -import numpy as np -import sys -from typing import Tuple -import torch -from fvcore.transforms.transform import ( - BlendTransform, - CropTransform, - HFlipTransform, - NoOpTransform, - PadTransform, - Transform, - TransformList, - VFlipTransform, -) -from PIL import Image - -from .augmentation import Augmentation, _transform_to_aug -from .transform import ExtentTransform, ResizeTransform, RotationTransform - -__all__ = [ - "FixedSizeCrop", - "RandomApply", - "RandomBrightness", - "RandomContrast", - "RandomCrop", - "RandomExtent", - "RandomFlip", - "RandomSaturation", - "RandomLighting", - "RandomRotation", - "Resize", - "ResizeScale", - "ResizeShortestEdge", - "RandomCrop_CategoryAreaConstraint", -] - - -class RandomApply(Augmentation): - """ - Randomly apply an augmentation with a given probability. - """ - - def __init__(self, tfm_or_aug, prob=0.5): - """ - Args: - tfm_or_aug (Transform, Augmentation): the transform or augmentation - to be applied. It can either be a `Transform` or `Augmentation` - instance. - prob (float): probability between 0.0 and 1.0 that - the wrapper transformation is applied - """ - super().__init__() - self.aug = _transform_to_aug(tfm_or_aug) - assert 0.0 <= prob <= 1.0, f"Probablity must be between 0.0 and 1.0 (given: {prob})" - self.prob = prob - - def get_transform(self, *args): - do = self._rand_range() < self.prob - if do: - return self.aug.get_transform(*args) - else: - return NoOpTransform() - - def __call__(self, aug_input): - do = self._rand_range() < self.prob - if do: - return self.aug(aug_input) - else: - return NoOpTransform() - - -class RandomFlip(Augmentation): - """ - Flip the image horizontally or vertically with the given probability. - """ - - def __init__(self, prob=0.5, *, horizontal=True, vertical=False): - """ - Args: - prob (float): probability of flip. - horizontal (boolean): whether to apply horizontal flipping - vertical (boolean): whether to apply vertical flipping - """ - super().__init__() - - if horizontal and vertical: - raise ValueError("Cannot do both horiz and vert. Please use two Flip instead.") - if not horizontal and not vertical: - raise ValueError("At least one of horiz or vert has to be True!") - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - do = self._rand_range() < self.prob - if do: - if self.horizontal: - return HFlipTransform(w) - elif self.vertical: - return VFlipTransform(h) - else: - return NoOpTransform() - - -class Resize(Augmentation): - """Resize image to a fixed target size""" - - def __init__(self, shape, interp=Image.BILINEAR): - """ - Args: - shape: (h, w) tuple or a int - interp: PIL interpolation method - """ - if isinstance(shape, int): - shape = (shape, shape) - shape = tuple(shape) - self._init(locals()) - - def get_transform(self, image): - return ResizeTransform( - image.shape[0], image.shape[1], self.shape[0], self.shape[1], self.interp - ) - - -class ResizeShortestEdge(Augmentation): - """ - Resize the image while keeping the aspect ratio unchanged. - It attempts to scale the shorter edge to the given `short_edge_length`, - as long as the longer edge does not exceed `max_size`. - If `max_size` is reached, then downscale so that the longer edge does not exceed max_size. 
- """ - - @torch.jit.unused - def __init__( - self, short_edge_length, max_size=sys.maxsize, sample_style="range", interp=Image.BILINEAR - ): - """ - Args: - short_edge_length (list[int]): If ``sample_style=="range"``, - a [min, max] interval from which to sample the shortest edge length. - If ``sample_style=="choice"``, a list of shortest edge lengths to sample from. - max_size (int): maximum allowed longest edge length. - sample_style (str): either "range" or "choice". - """ - super().__init__() - assert sample_style in ["range", "choice"], sample_style - - self.is_range = sample_style == "range" - if isinstance(short_edge_length, int): - short_edge_length = (short_edge_length, short_edge_length) - if self.is_range: - assert len(short_edge_length) == 2, ( - "short_edge_length must be two values using 'range' sample style." - f" Got {short_edge_length}!" - ) - self._init(locals()) - - @torch.jit.unused - def get_transform(self, image): - h, w = image.shape[:2] - if self.is_range: - size = np.random.randint(self.short_edge_length[0], self.short_edge_length[1] + 1) - else: - size = np.random.choice(self.short_edge_length) - if size == 0: - return NoOpTransform() - - newh, neww = ResizeShortestEdge.get_output_shape(h, w, size, self.max_size) - return ResizeTransform(h, w, newh, neww, self.interp) - - @staticmethod - def get_output_shape( - oldh: int, oldw: int, short_edge_length: int, max_size: int - ) -> Tuple[int, int]: - """ - Compute the output size given input size and target short edge length. - """ - h, w = oldh, oldw - size = short_edge_length * 1.0 - scale = size / min(h, w) - if h < w: - newh, neww = size, scale * w - else: - newh, neww = scale * h, size - if max(newh, neww) > max_size: - scale = max_size * 1.0 / max(newh, neww) - newh = newh * scale - neww = neww * scale - neww = int(neww + 0.5) - newh = int(newh + 0.5) - return (newh, neww) - - -class ResizeScale(Augmentation): - """ - Takes target size as input and randomly scales the given target size between `min_scale` - and `max_scale`. It then scales the input image such that it fits inside the scaled target - box, keeping the aspect ratio constant. - This implements the resize part of the Google's 'resize_and_crop' data augmentation: - https://github.com/tensorflow/tpu/blob/master/models/official/detection/utils/input_utils.py#L127 - """ - - def __init__( - self, - min_scale: float, - max_scale: float, - target_height: int, - target_width: int, - interp: int = Image.BILINEAR, - ): - """ - Args: - min_scale: minimum image scale range. - max_scale: maximum image scale range. - target_height: target image height. - target_width: target image width. - interp: image interpolation method. - """ - super().__init__() - self._init(locals()) - - def _get_resize(self, image: np.ndarray, scale: float) -> Transform: - input_size = image.shape[:2] - - # Compute new target size given a scale. - target_size = (self.target_height, self.target_width) - target_scale_size = np.multiply(target_size, scale) - - # Compute actual rescaling applied to input image and output size. 
- output_scale = np.minimum( - target_scale_size[0] / input_size[0], target_scale_size[1] / input_size[1] - ) - output_size = np.round(np.multiply(input_size, output_scale)).astype(int) - - return ResizeTransform( - input_size[0], input_size[1], output_size[0], output_size[1], self.interp - ) - - def get_transform(self, image: np.ndarray) -> Transform: - random_scale = np.random.uniform(self.min_scale, self.max_scale) - return self._get_resize(image, random_scale) - - -class RandomRotation(Augmentation): - """ - This method returns a copy of this image, rotated the given - number of degrees counter clockwise around the given center. - """ - - def __init__(self, angle, expand=True, center=None, sample_style="range", interp=None): - """ - Args: - angle (list[float]): If ``sample_style=="range"``, - a [min, max] interval from which to sample the angle (in degrees). - If ``sample_style=="choice"``, a list of angles to sample from - expand (bool): choose if the image should be resized to fit the whole - rotated image (default), or simply cropped - center (list[[float, float]]): If ``sample_style=="range"``, - a [[minx, miny], [maxx, maxy]] relative interval from which to sample the center, - [0, 0] being the top left of the image and [1, 1] the bottom right. - If ``sample_style=="choice"``, a list of centers to sample from - Default: None, which means that the center of rotation is the center of the image - center has no effect if expand=True because it only affects shifting - """ - super().__init__() - assert sample_style in ["range", "choice"], sample_style - self.is_range = sample_style == "range" - if isinstance(angle, (float, int)): - angle = (angle, angle) - if center is not None and isinstance(center[0], (float, int)): - center = (center, center) - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - center = None - if self.is_range: - angle = np.random.uniform(self.angle[0], self.angle[1]) - if self.center is not None: - center = ( - np.random.uniform(self.center[0][0], self.center[1][0]), - np.random.uniform(self.center[0][1], self.center[1][1]), - ) - else: - angle = np.random.choice(self.angle) - if self.center is not None: - center = np.random.choice(self.center) - - if center is not None: - center = (w * center[0], h * center[1]) # Convert to absolute coordinates - - if angle % 360 == 0: - return NoOpTransform() - - return RotationTransform(h, w, angle, expand=self.expand, center=center, interp=self.interp) - - -class FixedSizeCrop(Augmentation): - """ - If `crop_size` is smaller than the input image size, then it uses a random crop of - the crop size. If `crop_size` is larger than the input image size, then it pads - the right and the bottom of the image to the crop size if `pad` is True, otherwise - it returns the smaller image. - """ - - def __init__(self, crop_size: Tuple[int], pad: bool = True, pad_value: float = 128.0): - """ - Args: - crop_size: target image (height, width). - pad: if True, will pad images smaller than `crop_size` up to `crop_size` - pad_value: the padding value. - """ - super().__init__() - self._init(locals()) - - def _get_crop(self, image: np.ndarray) -> Transform: - # Compute the image scale and scaled size. - input_size = image.shape[:2] - output_size = self.crop_size - - # Add random crop if the image is scaled up. 
- max_offset = np.subtract(input_size, output_size) - max_offset = np.maximum(max_offset, 0) - offset = np.multiply(max_offset, np.random.uniform(0.0, 1.0)) - offset = np.round(offset).astype(int) - return CropTransform( - offset[1], offset[0], output_size[1], output_size[0], input_size[1], input_size[0] - ) - - def _get_pad(self, image: np.ndarray) -> Transform: - # Compute the image scale and scaled size. - input_size = image.shape[:2] - output_size = self.crop_size - - # Add padding if the image is scaled down. - pad_size = np.subtract(output_size, input_size) - pad_size = np.maximum(pad_size, 0) - original_size = np.minimum(input_size, output_size) - return PadTransform( - 0, 0, pad_size[1], pad_size[0], original_size[1], original_size[0], self.pad_value - ) - - def get_transform(self, image: np.ndarray) -> TransformList: - transforms = [self._get_crop(image)] - if self.pad: - transforms.append(self._get_pad(image)) - return TransformList(transforms) - - -class RandomCrop(Augmentation): - """ - Randomly crop a rectangle region out of an image. - """ - - def __init__(self, crop_type: str, crop_size): - """ - Args: - crop_type (str): one of "relative_range", "relative", "absolute", "absolute_range". - crop_size (tuple[float, float]): two floats, explained below. - - - "relative": crop a (H * crop_size[0], W * crop_size[1]) region from an input image of - size (H, W). crop size should be in (0, 1] - - "relative_range": uniformly sample two values from [crop_size[0], 1] - and [crop_size[1]], 1], and use them as in "relative" crop type. - - "absolute" crop a (crop_size[0], crop_size[1]) region from input image. - crop_size must be smaller than the input image size. - - "absolute_range", for an input of size (H, W), uniformly sample H_crop in - [crop_size[0], min(H, crop_size[1])] and W_crop in [crop_size[0], min(W, crop_size[1])]. - Then crop a region (H_crop, W_crop). 
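The four `crop_type` variants documented above differ only in how the pixel crop size is derived; the two deterministic cases for a 480x640 input (the `*_range` types additionally sample, so they are omitted from this sketch):

```python
# Deterministic cases of RandomCrop.get_crop_size for a 480x640 input.
h, w = 480, 640

# "relative": fractions of the input size, rounded to pixels
ch, cw = 0.5, 0.75
print(int(h * ch + 0.5), int(w * cw + 0.5))  # 240 480

# "absolute": a fixed pixel size, clamped to the input
print(min(300, h), min(800, w))              # 300 640
```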
- """ - # TODO style of relative_range and absolute_range are not consistent: - # one takes (h, w) but another takes (min, max) - super().__init__() - assert crop_type in ["relative_range", "relative", "absolute", "absolute_range"] - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - croph, cropw = self.get_crop_size((h, w)) - assert h >= croph and w >= cropw, "Shape computation in {} has bugs.".format(self) - h0 = np.random.randint(h - croph + 1) - w0 = np.random.randint(w - cropw + 1) - return CropTransform(w0, h0, cropw, croph) - - def get_crop_size(self, image_size): - """ - Args: - image_size (tuple): height, width - - Returns: - crop_size (tuple): height, width in absolute pixels - """ - h, w = image_size - if self.crop_type == "relative": - ch, cw = self.crop_size - return int(h * ch + 0.5), int(w * cw + 0.5) - elif self.crop_type == "relative_range": - crop_size = np.asarray(self.crop_size, dtype=np.float32) - ch, cw = crop_size + np.random.rand(2) * (1 - crop_size) - return int(h * ch + 0.5), int(w * cw + 0.5) - elif self.crop_type == "absolute": - return (min(self.crop_size[0], h), min(self.crop_size[1], w)) - elif self.crop_type == "absolute_range": - assert self.crop_size[0] <= self.crop_size[1] - ch = np.random.randint(min(h, self.crop_size[0]), min(h, self.crop_size[1]) + 1) - cw = np.random.randint(min(w, self.crop_size[0]), min(w, self.crop_size[1]) + 1) - return ch, cw - else: - raise NotImplementedError("Unknown crop type {}".format(self.crop_type)) - - -class RandomCrop_CategoryAreaConstraint(Augmentation): - """ - Similar to :class:`RandomCrop`, but find a cropping window such that no single category - occupies a ratio of more than `single_category_max_area` in semantic segmentation ground - truth, which can cause unstability in training. The function attempts to find such a valid - cropping window for at most 10 times. - """ - - def __init__( - self, - crop_type: str, - crop_size, - single_category_max_area: float = 1.0, - ignored_category: int = None, - ): - """ - Args: - crop_type, crop_size: same as in :class:`RandomCrop` - single_category_max_area: the maximum allowed area ratio of a - category. Set to 1.0 to disable - ignored_category: allow this category in the semantic segmentation - ground truth to exceed the area ratio. Usually set to the category - that's ignored in training. - """ - self.crop_aug = RandomCrop(crop_type, crop_size) - self._init(locals()) - - def get_transform(self, image, sem_seg): - if self.single_category_max_area >= 1.0: - return self.crop_aug.get_transform(image) - else: - h, w = sem_seg.shape - for _ in range(10): - crop_size = self.crop_aug.get_crop_size((h, w)) - y0 = np.random.randint(h - crop_size[0] + 1) - x0 = np.random.randint(w - crop_size[1] + 1) - sem_seg_temp = sem_seg[y0 : y0 + crop_size[0], x0 : x0 + crop_size[1]] - labels, cnt = np.unique(sem_seg_temp, return_counts=True) - if self.ignored_category is not None: - cnt = cnt[labels != self.ignored_category] - if len(cnt) > 1 and np.max(cnt) < np.sum(cnt) * self.single_category_max_area: - break - crop_tfm = CropTransform(x0, y0, crop_size[1], crop_size[0]) - return crop_tfm - - -class RandomExtent(Augmentation): - """ - Outputs an image by cropping a random "subrect" of the source image. - - The subrect can be parameterized to include pixels outside the source image, - in which case they will be set to zeros (i.e. black). The size of the output - image will vary with the size of the random subrect. 
- """ - - def __init__(self, scale_range, shift_range): - """ - Args: - output_size (h, w): Dimensions of output image - scale_range (l, h): Range of input-to-output size scaling factor - shift_range (x, y): Range of shifts of the cropped subrect. The rect - is shifted by [w / 2 * Uniform(-x, x), h / 2 * Uniform(-y, y)], - where (w, h) is the (width, height) of the input image. Set each - component to zero to crop at the image's center. - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - img_h, img_w = image.shape[:2] - - # Initialize src_rect to fit the input image. - src_rect = np.array([-0.5 * img_w, -0.5 * img_h, 0.5 * img_w, 0.5 * img_h]) - - # Apply a random scaling to the src_rect. - src_rect *= np.random.uniform(self.scale_range[0], self.scale_range[1]) - - # Apply a random shift to the coordinates origin. - src_rect[0::2] += self.shift_range[0] * img_w * (np.random.rand() - 0.5) - src_rect[1::2] += self.shift_range[1] * img_h * (np.random.rand() - 0.5) - - # Map src_rect coordinates into image coordinates (center at corner). - src_rect[0::2] += 0.5 * img_w - src_rect[1::2] += 0.5 * img_h - - return ExtentTransform( - src_rect=(src_rect[0], src_rect[1], src_rect[2], src_rect[3]), - output_size=(int(src_rect[3] - src_rect[1]), int(src_rect[2] - src_rect[0])), - ) - - -class RandomContrast(Augmentation): - """ - Randomly transforms image contrast. - - Contrast intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce contrast - - intensity = 1 will preserve the input image - - intensity > 1 will increase contrast - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation - intensity_max (float): Maximum augmentation - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - w = np.random.uniform(self.intensity_min, self.intensity_max) - return BlendTransform(src_image=image.mean(), src_weight=1 - w, dst_weight=w) - - -class RandomBrightness(Augmentation): - """ - Randomly transforms image brightness. - - Brightness intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce brightness - - intensity = 1 will preserve the input image - - intensity > 1 will increase brightness - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation - intensity_max (float): Maximum augmentation - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - w = np.random.uniform(self.intensity_min, self.intensity_max) - return BlendTransform(src_image=0, src_weight=1 - w, dst_weight=w) - - -class RandomSaturation(Augmentation): - """ - Randomly transforms saturation of an RGB image. - Input images are assumed to have 'RGB' channel order. - - Saturation intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce saturation (make the image more grayscale) - - intensity = 1 will preserve the input image - - intensity > 1 will increase saturation - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation (1 preserves input). - intensity_max (float): Maximum augmentation (1 preserves input). 
- """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - assert image.shape[-1] == 3, "RandomSaturation only works on RGB images" - w = np.random.uniform(self.intensity_min, self.intensity_max) - grayscale = image.dot([0.299, 0.587, 0.114])[:, :, np.newaxis] - return BlendTransform(src_image=grayscale, src_weight=1 - w, dst_weight=w) - - -class RandomLighting(Augmentation): - """ - The "lighting" augmentation described in AlexNet, using fixed PCA over ImageNet. - Input images are assumed to have 'RGB' channel order. - - The degree of color jittering is randomly sampled via a normal distribution, - with standard deviation given by the scale parameter. - """ - - def __init__(self, scale): - """ - Args: - scale (float): Standard deviation of principal component weighting. - """ - super().__init__() - self._init(locals()) - self.eigen_vecs = np.array( - [[-0.5675, 0.7192, 0.4009], [-0.5808, -0.0045, -0.8140], [-0.5836, -0.6948, 0.4203]] - ) - self.eigen_vals = np.array([0.2175, 0.0188, 0.0045]) - - def get_transform(self, image): - assert image.shape[-1] == 3, "RandomLighting only works on RGB images" - weights = np.random.normal(scale=self.scale, size=3) - return BlendTransform( - src_image=self.eigen_vecs.dot(weights * self.eigen_vals), src_weight=1.0, dst_weight=1.0 - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/transform.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/transform.py deleted file mode 100755 index de44b991..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/data/transforms/transform.py +++ /dev/null @@ -1,351 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -See "Data Augmentation" tutorial for an overview of the system: -https://detectron2.readthedocs.io/tutorials/augmentation.html -""" - -import numpy as np -import torch -import torch.nn.functional as F -from fvcore.transforms.transform import ( - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - TransformList, -) -from PIL import Image - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - -__all__ = [ - "ExtentTransform", - "ResizeTransform", - "RotationTransform", - "ColorTransform", - "PILColorTransform", -] - - -class ExtentTransform(Transform): - """ - Extracts a subregion from the source image and scales it to the output size. - - The fill color is used to map pixels from the source rect that fall outside - the source image. 
- - See: https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform - """ - - def __init__(self, src_rect, output_size, interp=Image.LINEAR, fill=0): - """ - Args: - src_rect (x0, y0, x1, y1): src coordinates - output_size (h, w): dst image size - interp: PIL interpolation methods - fill: Fill color used when src_rect extends outside image - """ - super().__init__() - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - h, w = self.output_size - if len(img.shape) > 2 and img.shape[2] == 1: - pil_image = Image.fromarray(img[:, :, 0], mode="L") - else: - pil_image = Image.fromarray(img) - pil_image = pil_image.transform( - size=(w, h), - method=Image.EXTENT, - data=self.src_rect, - resample=interp if interp else self.interp, - fill=self.fill, - ) - ret = np.asarray(pil_image) - if len(img.shape) > 2 and img.shape[2] == 1: - ret = np.expand_dims(ret, -1) - return ret - - def apply_coords(self, coords): - # Transform image center from source coordinates into output coordinates - # and then map the new origin to the corner of the output image. - h, w = self.output_size - x0, y0, x1, y1 = self.src_rect - new_coords = coords.astype(np.float32) - new_coords[:, 0] -= 0.5 * (x0 + x1) - new_coords[:, 1] -= 0.5 * (y0 + y1) - new_coords[:, 0] *= w / (x1 - x0) - new_coords[:, 1] *= h / (y1 - y0) - new_coords[:, 0] += 0.5 * w - new_coords[:, 1] += 0.5 * h - return new_coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - -class ResizeTransform(Transform): - """ - Resize the image to a target size. - """ - - def __init__(self, h, w, new_h, new_w, interp=None): - """ - Args: - h, w (int): original image size - new_h, new_w (int): new image size - interp: PIL interpolation methods, defaults to bilinear. 
- """ - # TODO decide on PIL vs opencv - super().__init__() - if interp is None: - interp = Image.BILINEAR - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - assert img.shape[:2] == (self.h, self.w) - assert len(img.shape) <= 4 - interp_method = interp if interp is not None else self.interp - - if img.dtype == np.uint8: - if len(img.shape) > 2 and img.shape[2] == 1: - pil_image = Image.fromarray(img[:, :, 0], mode="L") - else: - pil_image = Image.fromarray(img) - pil_image = pil_image.resize((self.new_w, self.new_h), interp_method) - ret = np.asarray(pil_image) - if len(img.shape) > 2 and img.shape[2] == 1: - ret = np.expand_dims(ret, -1) - else: - # PIL only supports uint8 - if any(x < 0 for x in img.strides): - img = np.ascontiguousarray(img) - img = torch.from_numpy(img) - shape = list(img.shape) - shape_4d = shape[:2] + [1] * (4 - len(shape)) + shape[2:] - img = img.view(shape_4d).permute(2, 3, 0, 1) # hw(c) -> nchw - _PIL_RESIZE_TO_INTERPOLATE_MODE = { - Image.NEAREST: "nearest", - Image.BILINEAR: "bilinear", - Image.BICUBIC: "bicubic", - } - mode = _PIL_RESIZE_TO_INTERPOLATE_MODE[interp_method] - align_corners = None if mode == "nearest" else False - img = F.interpolate( - img, (self.new_h, self.new_w), mode=mode, align_corners=align_corners - ) - shape[:2] = (self.new_h, self.new_w) - ret = img.permute(2, 3, 0, 1).view(shape).numpy() # nchw -> hw(c) - - return ret - - def apply_coords(self, coords): - coords[:, 0] = coords[:, 0] * (self.new_w * 1.0 / self.w) - coords[:, 1] = coords[:, 1] * (self.new_h * 1.0 / self.h) - return coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - def inverse(self): - return ResizeTransform(self.new_h, self.new_w, self.h, self.w, self.interp) - - -class RotationTransform(Transform): - """ - This method returns a copy of this image, rotated the given - number of degrees counter clockwise around its center. 
- """ - - def __init__(self, h, w, angle, expand=True, center=None, interp=None): - """ - Args: - h, w (int): original image size - angle (float): degrees for rotation - expand (bool): choose if the image should be resized to fit the whole - rotated image (default), or simply cropped - center (tuple (width, height)): coordinates of the rotation center - if left to None, the center will be fit to the center of each image - center has no effect if expand=True because it only affects shifting - interp: cv2 interpolation method, default cv2.INTER_LINEAR - """ - super().__init__() - image_center = np.array((w / 2, h / 2)) - if center is None: - center = image_center - if interp is None: - interp = cv2.INTER_LINEAR - abs_cos, abs_sin = (abs(np.cos(np.deg2rad(angle))), abs(np.sin(np.deg2rad(angle)))) - if expand: - # find the new width and height bounds - bound_w, bound_h = np.rint( - [h * abs_sin + w * abs_cos, h * abs_cos + w * abs_sin] - ).astype(int) - else: - bound_w, bound_h = w, h - - self._set_attributes(locals()) - self.rm_coords = self.create_rotation_matrix() - # Needed because of this problem https://github.com/opencv/opencv/issues/11784 - self.rm_image = self.create_rotation_matrix(offset=-0.5) - - def apply_image(self, img, interp=None): - """ - img should be a numpy array, formatted as Height * Width * Nchannels - """ - if len(img) == 0 or self.angle % 360 == 0: - return img - assert img.shape[:2] == (self.h, self.w) - interp = interp if interp is not None else self.interp - return cv2.warpAffine(img, self.rm_image, (self.bound_w, self.bound_h), flags=interp) - - def apply_coords(self, coords): - """ - coords should be a N * 2 array-like, containing N couples of (x, y) points - """ - coords = np.asarray(coords, dtype=float) - if len(coords) == 0 or self.angle % 360 == 0: - return coords - return cv2.transform(coords[:, np.newaxis, :], self.rm_coords)[:, 0, :] - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=cv2.INTER_NEAREST) - return segmentation - - def create_rotation_matrix(self, offset=0): - center = (self.center[0] + offset, self.center[1] + offset) - rm = cv2.getRotationMatrix2D(tuple(center), self.angle, 1) - if self.expand: - # Find the coordinates of the center of rotation in the new image - # The only point for which we know the future coordinates is the center of the image - rot_im_center = cv2.transform(self.image_center[None, None, :] + offset, rm)[0, 0, :] - new_center = np.array([self.bound_w / 2, self.bound_h / 2]) + offset - rot_im_center - # shift the rotation center to the new coordinates - rm[:, 2] += new_center - return rm - - def inverse(self): - """ - The inverse is to rotate it back with expand, and crop to get the original shape. - """ - if not self.expand: # Not possible to inverse if a part of the image is lost - raise NotImplementedError() - rotation = RotationTransform( - self.bound_h, self.bound_w, -self.angle, True, None, self.interp - ) - crop = CropTransform( - (rotation.bound_w - self.w) // 2, (rotation.bound_h - self.h) // 2, self.w, self.h - ) - return TransformList([rotation, crop]) - - -class ColorTransform(Transform): - """ - Generic wrapper for any photometric transforms. - These transformations should only affect the color space and - not the coordinate space of the image (e.g. 
annotation - coordinates such as bounding boxes should not be changed) - """ - - def __init__(self, op): - """ - Args: - op (Callable): operation to be applied to the image, - which takes in an ndarray and returns an ndarray. - """ - if not callable(op): - raise ValueError("op parameter should be callable") - super().__init__() - self._set_attributes(locals()) - - def apply_image(self, img): - return self.op(img) - - def apply_coords(self, coords): - return coords - - def inverse(self): - return NoOpTransform() - - def apply_segmentation(self, segmentation): - return segmentation - - -class PILColorTransform(ColorTransform): - """ - Generic wrapper for PIL Photometric image transforms, - which affect the color space and not the coordinate - space of the image - """ - - def __init__(self, op): - """ - Args: - op (Callable): operation to be applied to the image, - which takes in a PIL Image and returns a transformed - PIL Image. - For reference on possible operations see: - - https://pillow.readthedocs.io/en/stable/ - """ - if not callable(op): - raise ValueError("op parameter should be callable") - super().__init__(op) - - def apply_image(self, img): - img = Image.fromarray(img) - return np.asarray(super().apply_image(img)) - - -def HFlip_rotated_box(transform, rotated_boxes): - """ - Apply the horizontal flip transform on rotated boxes. - - Args: - rotated_boxes (ndarray): Nx5 floating point array of - (x_center, y_center, width, height, angle_degrees) format - in absolute coordinates. - """ - # Transform x_center - rotated_boxes[:, 0] = transform.width - rotated_boxes[:, 0] - # Transform angle - rotated_boxes[:, 4] = -rotated_boxes[:, 4] - return rotated_boxes - - -def Resize_rotated_box(transform, rotated_boxes): - """ - Apply the resizing transform on rotated boxes. For details of how these (approximation) - formulas are derived, please refer to :meth:`RotatedBoxes.scale`. - - Args: - rotated_boxes (ndarray): Nx5 floating point array of - (x_center, y_center, width, height, angle_degrees) format - in absolute coordinates. - """ - scale_factor_x = transform.new_w * 1.0 / transform.w - scale_factor_y = transform.new_h * 1.0 / transform.h - rotated_boxes[:, 0] *= scale_factor_x - rotated_boxes[:, 1] *= scale_factor_y - theta = rotated_boxes[:, 4] * np.pi / 180.0 - c = np.cos(theta) - s = np.sin(theta) - rotated_boxes[:, 2] *= np.sqrt(np.square(scale_factor_x * c) + np.square(scale_factor_y * s)) - rotated_boxes[:, 3] *= np.sqrt(np.square(scale_factor_x * s) + np.square(scale_factor_y * c)) - rotated_boxes[:, 4] = np.arctan2(scale_factor_x * s, scale_factor_y * c) * 180 / np.pi - - return rotated_boxes - - -HFlipTransform.register_type("rotated_box", HFlip_rotated_box) -ResizeTransform.register_type("rotated_box", Resize_rotated_box) - -# not necessary any more with latest fvcore -NoOpTransform.register_type("rotated_box", lambda t, x: x) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/__init__.py deleted file mode 100755 index 08a61572..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
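The `Resize_rotated_box` formulas above reduce to plain per-axis scaling when the box is axis-aligned (angle = 0): width scales by `scale_x`, height by `scale_y`, and the angle stays 0. A hedged NumPy check of that special case (the helper name is illustrative):

```python
import numpy as np

def resize_rotated_box(box, scale_x, scale_y):
    """Apply the (approximate) rotated-box resize formulas from above to one
    (cx, cy, w, h, angle_deg) box. Standalone illustrative sketch."""
    cx, cy, w, h, a = box
    t = np.deg2rad(a)
    c, s = np.cos(t), np.sin(t)
    return np.array([
        cx * scale_x,
        cy * scale_y,
        w * np.hypot(scale_x * c, scale_y * s),
        h * np.hypot(scale_x * s, scale_y * c),
        np.rad2deg(np.arctan2(scale_x * s, scale_y * c)),
    ])

out = resize_rotated_box(np.array([100.0, 50.0, 40.0, 20.0, 0.0]), 2.0, 3.0)
assert np.allclose(out, [200.0, 150.0, 80.0, 60.0, 0.0])  # axis-aligned case
```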
- -from .launch import * -from .train_loop import * - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -# prefer to let hooks and defaults live in separate namespaces (therefore not in __all__) -# but still make them available here -from .hooks import * -from .defaults import * diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/defaults.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/defaults.py deleted file mode 100755 index cc3faa15..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/defaults.py +++ /dev/null @@ -1,715 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -This file contains components with some default boilerplate logic user may need -in training / testing. They will not work for everyone, but many users may find them useful. - -The behavior of functions/classes in this file is subject to change, -since they are meant to represent the "common default behavior" people need in their projects. -""" - -import argparse -import logging -import os -import sys -import weakref -from collections import OrderedDict -from typing import Optional -import torch -from fvcore.nn.precise_bn import get_bn_modules -from omegaconf import OmegaConf -from torch.nn.parallel import DistributedDataParallel - -import detectron2.data.transforms as T -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import CfgNode, LazyConfig -from detectron2.data import ( - MetadataCatalog, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.evaluation import ( - DatasetEvaluator, - inference_on_dataset, - print_csv_format, - verify_results, -) -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils import comm -from detectron2.utils.collect_env import collect_env_info -from detectron2.utils.env import seed_all_rng -from detectron2.utils.events import CommonMetricPrinter, JSONWriter, TensorboardXWriter -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger - -from . import hooks -from .train_loop import AMPTrainer, SimpleTrainer, TrainerBase - -__all__ = [ - "create_ddp_model", - "default_argument_parser", - "default_setup", - "default_writers", - "DefaultPredictor", - "DefaultTrainer", -] - - -def create_ddp_model(model, *, fp16_compression=False, **kwargs): - """ - Create a DistributedDataParallel model if there are >1 processes. - - Args: - model: a torch.nn.Module - fp16_compression: add fp16 compression hooks to the ddp object. - See more at https://pytorch.org/docs/stable/ddp_comm_hooks.html#torch.distributed.algorithms.ddp_comm_hooks.default_hooks.fp16_compress_hook - kwargs: other arguments of :module:`torch.nn.parallel.DistributedDataParallel`. - """ # noqa - if comm.get_world_size() == 1: - return model - if "device_ids" not in kwargs: - kwargs["device_ids"] = [comm.get_local_rank()] - ddp = DistributedDataParallel(model, **kwargs) - if fp16_compression: - from torch.distributed.algorithms.ddp_comm_hooks import default as comm_hooks - - ddp.register_comm_hook(state=None, hook=comm_hooks.fp16_compress_hook) - return ddp - - -def default_argument_parser(epilog=None): - """ - Create a parser with some common arguments used by detectron2 users. - - Args: - epilog (str): epilog passed to ArgumentParser describing the usage. 
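`create_ddp_model` above boils down to wrapping the model only when more than one process participates. A minimal hedged sketch of that guard; `maybe_ddp` is an illustrative name, and the real helper additionally forwards `device_ids` defaults and optional fp16 gradient-compression hooks:

```python
import torch.nn as nn

def maybe_ddp(model: nn.Module, world_size: int, local_rank: int = 0) -> nn.Module:
    """Wrap in DistributedDataParallel only when >1 process is running.
    The DDP branch assumes an initialized process group (illustrative sketch)."""
    if world_size == 1:
        return model
    return nn.parallel.DistributedDataParallel(model, device_ids=[local_rank])

m = maybe_ddp(nn.Linear(4, 2), world_size=1)  # single process: left unwrapped
assert isinstance(m, nn.Linear)
```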
- - Returns: - argparse.ArgumentParser: - """ - parser = argparse.ArgumentParser( - epilog=epilog - or f""" -Examples: - -Run on single machine: - $ {sys.argv[0]} --num-gpus 8 --config-file cfg.yaml - -Change some config options: - $ {sys.argv[0]} --config-file cfg.yaml MODEL.WEIGHTS /path/to/weight.pth SOLVER.BASE_LR 0.001 - -Run on multiple machines: - (machine0)$ {sys.argv[0]} --machine-rank 0 --num-machines 2 --dist-url [--other-flags] - (machine1)$ {sys.argv[0]} --machine-rank 1 --num-machines 2 --dist-url [--other-flags] -""", - formatter_class=argparse.RawDescriptionHelpFormatter, - ) - parser.add_argument("--config-file", default="", metavar="FILE", help="path to config file") - parser.add_argument( - "--resume", - action="store_true", - help="Whether to attempt to resume from the checkpoint directory. " - "See documentation of `DefaultTrainer.resume_or_load()` for what it means.", - ) - parser.add_argument("--eval-only", action="store_true", help="perform evaluation only") - parser.add_argument("--num-gpus", type=int, default=1, help="number of gpus *per machine*") - parser.add_argument("--num-machines", type=int, default=1, help="total number of machines") - parser.add_argument( - "--machine-rank", type=int, default=0, help="the rank of this machine (unique per machine)" - ) - - # PyTorch still may leave orphan processes in multi-gpu training. - # Therefore we use a deterministic way to obtain port, - # so that users are aware of orphan processes by seeing the port occupied. - port = 2 ** 15 + 2 ** 14 + hash(os.getuid() if sys.platform != "win32" else 1) % 2 ** 14 - parser.add_argument( - "--dist-url", - default="tcp://127.0.0.1:{}".format(port), - help="initialization URL for pytorch distributed backend. See " - "https://pytorch.org/docs/stable/distributed.html for details.", - ) - parser.add_argument( - "opts", - help=""" -Modify config options at the end of the command. For Yacs configs, use -space-separated "PATH.KEY VALUE" pairs. -For python-based LazyConfig, use "path.key=value". - """.strip(), - default=None, - nargs=argparse.REMAINDER, - ) - return parser - - -def _try_get_key(cfg, *keys, default=None): - """ - Try select keys from cfg until the first key that exists. Otherwise return default. - """ - if isinstance(cfg, CfgNode): - cfg = OmegaConf.create(cfg.dump()) - for k in keys: - none = object() - p = OmegaConf.select(cfg, k, default=none) - if p is not none: - return p - return default - - -def _highlight(code, filename): - try: - import pygments - except ImportError: - return code - - from pygments.lexers import Python3Lexer, YamlLexer - from pygments.formatters import Terminal256Formatter - - lexer = Python3Lexer() if filename.endswith(".py") else YamlLexer() - code = pygments.highlight(code, lexer, Terminal256Formatter(style="monokai")) - return code - - -def default_setup(cfg, args): - """ - Perform some basic common setups at the beginning of a job, including: - - 1. Set up the detectron2 logger - 2. Log basic information about environment, cmdline arguments, and config - 3. 
Backup the config to the output directory - - Args: - cfg (CfgNode or omegaconf.DictConfig): the full config to be used - args (argparse.NameSpace): the command line arguments to be logged - """ - output_dir = _try_get_key(cfg, "OUTPUT_DIR", "output_dir", "train.output_dir") - if comm.is_main_process() and output_dir: - PathManager.mkdirs(output_dir) - - rank = comm.get_rank() - setup_logger(output_dir, distributed_rank=rank, name="fvcore") - logger = setup_logger(output_dir, distributed_rank=rank) - - logger.info("Rank of current process: {}. World size: {}".format(rank, comm.get_world_size())) - logger.info("Environment info:\n" + collect_env_info()) - - logger.info("Command line arguments: " + str(args)) - if hasattr(args, "config_file") and args.config_file != "": - logger.info( - "Contents of args.config_file={}:\n{}".format( - args.config_file, - _highlight(PathManager.open(args.config_file, "r").read(), args.config_file), - ) - ) - - if comm.is_main_process() and output_dir: - # Note: some of our scripts may expect the existence of - # config.yaml in output directory - path = os.path.join(output_dir, "config.yaml") - if isinstance(cfg, CfgNode): - logger.info("Running with full config:\n{}".format(_highlight(cfg.dump(), ".yaml"))) - with PathManager.open(path, "w") as f: - f.write(cfg.dump()) - else: - LazyConfig.save(cfg, path) - logger.info("Full config saved to {}".format(path)) - - # make sure each worker has a different, yet deterministic seed if specified - seed = _try_get_key(cfg, "SEED", "train.seed", default=-1) - seed_all_rng(None if seed < 0 else seed + rank) - - # cudnn benchmark has large overhead. It shouldn't be used considering the small size of - # typical validation set. - if not (hasattr(args, "eval_only") and args.eval_only): - torch.backends.cudnn.benchmark = _try_get_key( - cfg, "CUDNN_BENCHMARK", "train.cudnn_benchmark", default=False - ) - - -def default_writers(output_dir: str, max_iter: Optional[int] = None): - """ - Build a list of :class:`EventWriter` to be used. - It now consists of a :class:`CommonMetricPrinter`, - :class:`TensorboardXWriter` and :class:`JSONWriter`. - - Args: - output_dir: directory to store JSON metrics and tensorboard events - max_iter: the total number of iterations - - Returns: - list[EventWriter]: a list of :class:`EventWriter` objects. - """ - PathManager.mkdirs(output_dir) - return [ - # It may not always print what you want to see, since it prints "common" metrics only. - CommonMetricPrinter(max_iter), - JSONWriter(os.path.join(output_dir, "metrics.json")), - TensorboardXWriter(output_dir), - ] - - -class DefaultPredictor: - """ - Create a simple end-to-end predictor with the given config that runs on - single device for a single input image. - - Compared to using the model directly, this class does the following additions: - - 1. Load checkpoint from `cfg.MODEL.WEIGHTS`. - 2. Always take BGR image as the input and apply conversion defined by `cfg.INPUT.FORMAT`. - 3. Apply resizing defined by `cfg.INPUT.{MIN,MAX}_SIZE_TEST`. - 4. Take one input image and produce a single output, instead of a batch. - - This is meant for simple demo purposes, so it does the above steps automatically. - This is not meant for benchmarks or running complicated inference logic. - If you'd like to do anything more complicated, please refer to its source code as - examples to build and use the model manually. - - Attributes: - metadata (Metadata): the metadata of the underlying dataset, obtained from - cfg.DATASETS.TEST. 
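`default_setup` above seeds each worker with `seed + rank`, so a run is deterministic overall while every worker still draws a different random stream. A small hedged sketch of that rule (`seed_for_rank` is an illustrative name):

```python
import random

def seed_for_rank(base_seed: int, rank: int):
    """Sketch of default_setup's seeding rule: each worker gets a different
    but deterministic seed (base + rank); a negative base means 'random'."""
    if base_seed < 0:
        return None  # caller would fall back to an entropy-based seed
    return base_seed + rank

# Two workers share the base seed but still draw different streams:
r0 = random.Random(seed_for_rank(42, rank=0))
r1 = random.Random(seed_for_rank(42, rank=1))
assert r0.random() != r1.random()
```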
- - Examples: - :: - pred = DefaultPredictor(cfg) - inputs = cv2.imread("input.jpg") - outputs = pred(inputs) - """ - - def __init__(self, cfg): - self.cfg = cfg.clone() # cfg can be modified by model - self.model = build_model(self.cfg) - self.model.eval() - if len(cfg.DATASETS.TEST): - self.metadata = MetadataCatalog.get(cfg.DATASETS.TEST[0]) - - checkpointer = DetectionCheckpointer(self.model) - checkpointer.load(cfg.MODEL.WEIGHTS) - - self.aug = T.ResizeShortestEdge( - [cfg.INPUT.MIN_SIZE_TEST, cfg.INPUT.MIN_SIZE_TEST], cfg.INPUT.MAX_SIZE_TEST - ) - - self.input_format = cfg.INPUT.FORMAT - assert self.input_format in ["RGB", "BGR"], self.input_format - - def __call__(self, original_image): - """ - Args: - original_image (np.ndarray): an image of shape (H, W, C) (in BGR order). - - Returns: - predictions (dict): - the output of the model for one image only. - See :doc:`/tutorials/models` for details about the format. - """ - with torch.no_grad(): # https://github.com/sphinx-doc/sphinx/issues/4258 - # Apply pre-processing to image. - if self.input_format == "RGB": - # whether the model expects BGR inputs or RGB - original_image = original_image[:, :, ::-1] - height, width = original_image.shape[:2] - image = self.aug.get_transform(original_image).apply_image(original_image) - image = torch.as_tensor(image.astype("float32").transpose(2, 0, 1)) - - inputs = {"image": image, "height": height, "width": width} - predictions = self.model([inputs])[0] - return predictions - - -class DefaultTrainer(TrainerBase): - """ - A trainer with default training logic. It does the following: - - 1. Create a :class:`SimpleTrainer` using model, optimizer, dataloader - defined by the given config. Create a LR scheduler defined by the config. - 2. Load the last checkpoint or `cfg.MODEL.WEIGHTS`, if exists, when - `resume_or_load` is called. - 3. Register a few common hooks defined by the config. - - It is created to simplify the **standard model training workflow** and reduce code boilerplate - for users who only need the standard training workflow, with standard features. - It means this class makes *many assumptions* about your training logic that - may easily become invalid in a new research. In fact, any assumptions beyond those made in the - :class:`SimpleTrainer` are too much for research. - - The code of this class has been annotated about restrictive assumptions it makes. - When they do not work for you, you're encouraged to: - - 1. Overwrite methods of this class, OR: - 2. Use :class:`SimpleTrainer`, which only does minimal SGD training and - nothing else. You can then add your own hooks if needed. OR: - 3. Write your own training loop similar to `tools/plain_train_net.py`. - - See the :doc:`/tutorials/training` tutorials for more details. - - Note that the behavior of this class, like other functions/classes in - this file, is not stable, since it is meant to represent the "common default behavior". - It is only guaranteed to work well with the standard models and training workflow in detectron2. - To obtain more stable behavior, write your own training logic with other public APIs. 
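The per-image preprocessing in `DefaultPredictor.__call__` above is worth restating on its own: an optional BGR-to-RGB channel flip, then a float32 HWC-to-CHW conversion. A hedged sketch that omits the `ResizeShortestEdge` step (`preprocess_bgr` is an illustrative name):

```python
import numpy as np
import torch

def preprocess_bgr(original_image: np.ndarray, input_format: str = "RGB"):
    """Sketch of DefaultPredictor's per-image preprocessing (minus resizing):
    channel flip for RGB models, then HWC uint8 -> CHW float32 tensor."""
    if input_format == "RGB":
        original_image = original_image[:, :, ::-1]  # BGR -> RGB
    height, width = original_image.shape[:2]
    image = torch.as_tensor(
        original_image.astype("float32").transpose(2, 0, 1)  # HWC -> CHW
    )
    return {"image": image, "height": height, "width": width}

inputs = preprocess_bgr(np.zeros((480, 640, 3), dtype=np.uint8))
assert inputs["image"].shape == (3, 480, 640)
```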
- - Examples: - :: - trainer = DefaultTrainer(cfg) - trainer.resume_or_load() # load last checkpoint or MODEL.WEIGHTS - trainer.train() - - Attributes: - scheduler: - checkpointer (DetectionCheckpointer): - cfg (CfgNode): - """ - - def __init__(self, cfg): - """ - Args: - cfg (CfgNode): - """ - super().__init__() - logger = logging.getLogger("detectron2") - if not logger.isEnabledFor(logging.INFO): # setup_logger is not called for d2 - setup_logger() - cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - - # Assume these objects must be constructed in this order. - model = self.build_model(cfg) - optimizer = self.build_optimizer(cfg, model) - data_loader = self.build_train_loader(cfg) - - model = create_ddp_model(model, broadcast_buffers=False) - self._trainer = (AMPTrainer if cfg.SOLVER.AMP.ENABLED else SimpleTrainer)( - model, data_loader, optimizer - ) - - self.scheduler = self.build_lr_scheduler(cfg, optimizer) - self.checkpointer = DetectionCheckpointer( - # Assume you want to save checkpoints together with logs/statistics - model, - cfg.OUTPUT_DIR, - trainer=weakref.proxy(self), - ) - self.start_iter = 0 - self.max_iter = cfg.SOLVER.MAX_ITER - self.cfg = cfg - - self.register_hooks(self.build_hooks()) - - def resume_or_load(self, resume=True): - """ - If `resume==True` and `cfg.OUTPUT_DIR` contains the last checkpoint (defined by - a `last_checkpoint` file), resume from the file. Resuming means loading all - available states (eg. optimizer and scheduler) and update iteration counter - from the checkpoint. ``cfg.MODEL.WEIGHTS`` will not be used. - - Otherwise, this is considered as an independent training. The method will load model - weights from the file `cfg.MODEL.WEIGHTS` (but will not load other states) and start - from iteration 0. - - Args: - resume (bool): whether to do resume or not - """ - self.checkpointer.resume_or_load(self.cfg.MODEL.WEIGHTS, resume=resume) - if resume and self.checkpointer.has_checkpoint(): - # The checkpoint stores the training iteration that just finished, thus we start - # at the next iteration - self.start_iter = self.iter + 1 - - def build_hooks(self): - """ - Build a list of default hooks, including timing, evaluation, - checkpointing, lr scheduling, precise BN, writing events. - - Returns: - list[HookBase]: - """ - cfg = self.cfg.clone() - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 0 # save some memory and time for PreciseBN - - ret = [ - hooks.IterationTimer(), - hooks.LRScheduler(), - hooks.PreciseBN( - # Run at the same freq as (but before) evaluation. - cfg.TEST.EVAL_PERIOD, - self.model, - # Build a new data loader to not affect training - self.build_train_loader(cfg), - cfg.TEST.PRECISE_BN.NUM_ITER, - ) - if cfg.TEST.PRECISE_BN.ENABLED and get_bn_modules(self.model) - else None, - ] - - # Do PreciseBN before checkpointer, because it updates the model and need to - # be saved by checkpointer. - # This is not always the best: if checkpointing has a different frequency, - # some checkpoints may have more precise statistics than others. - if comm.is_main_process(): - ret.append(hooks.PeriodicCheckpointer(self.checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD)) - - def test_and_save_results(): - self._last_eval_results = self.test(self.cfg, self.model) - return self._last_eval_results - - # Do evaluation after checkpointer, because then if it fails, - # we can use the saved checkpoint to debug. 
- ret.append(hooks.EvalHook(cfg.TEST.EVAL_PERIOD, test_and_save_results)) - - if comm.is_main_process(): - # Here the default print/log frequency of each writer is used. - # run writers in the end, so that evaluation metrics are written - ret.append(hooks.PeriodicWriter(self.build_writers(), period=20)) - return ret - - def build_writers(self): - """ - Build a list of writers to be used using :func:`default_writers()`. - If you'd like a different list of writers, you can overwrite it in - your trainer. - - Returns: - list[EventWriter]: a list of :class:`EventWriter` objects. - """ - return default_writers(self.cfg.OUTPUT_DIR, self.max_iter) - - def train(self): - """ - Run training. - - Returns: - OrderedDict of results, if evaluation is enabled. Otherwise None. - """ - super().train(self.start_iter, self.max_iter) - if len(self.cfg.TEST.EXPECTED_RESULTS) and comm.is_main_process(): - assert hasattr( - self, "_last_eval_results" - ), "No evaluation results obtained during training!" - verify_results(self.cfg, self._last_eval_results) - return self._last_eval_results - - def run_step(self): - self._trainer.iter = self.iter - self._trainer.run_step() - - def state_dict(self): - ret = super().state_dict() - ret["_trainer"] = self._trainer.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self._trainer.load_state_dict(state_dict["_trainer"]) - - @classmethod - def build_model(cls, cfg): - """ - Returns: - torch.nn.Module: - - It now calls :func:`detectron2.modeling.build_model`. - Overwrite it if you'd like a different model. - """ - model = build_model(cfg) - logger = logging.getLogger(__name__) - logger.info("Model:\n{}".format(model)) - return model - - @classmethod - def build_optimizer(cls, cfg, model): - """ - Returns: - torch.optim.Optimizer: - - It now calls :func:`detectron2.solver.build_optimizer`. - Overwrite it if you'd like a different optimizer. - """ - return build_optimizer(cfg, model) - - @classmethod - def build_lr_scheduler(cls, cfg, optimizer): - """ - It now calls :func:`detectron2.solver.build_lr_scheduler`. - Overwrite it if you'd like a different scheduler. - """ - return build_lr_scheduler(cfg, optimizer) - - @classmethod - def build_train_loader(cls, cfg): - """ - Returns: - iterable - - It now calls :func:`detectron2.data.build_detection_train_loader`. - Overwrite it if you'd like a different data loader. - """ - return build_detection_train_loader(cfg) - - @classmethod - def build_test_loader(cls, cfg, dataset_name): - """ - Returns: - iterable - - It now calls :func:`detectron2.data.build_detection_test_loader`. - Overwrite it if you'd like a different data loader. - """ - return build_detection_test_loader(cfg, dataset_name) - - @classmethod - def build_evaluator(cls, cfg, dataset_name): - """ - Returns: - DatasetEvaluator or None - - It is not implemented by default. - """ - raise NotImplementedError( - """ -If you want DefaultTrainer to automatically run evaluation, -please implement `build_evaluator()` in subclasses (see train_net.py for example). -Alternatively, you can call evaluation functions yourself (see Colab balloon tutorial for example). -""" - ) - - @classmethod - def test(cls, cfg, model, evaluators=None): - """ - Evaluate the given model. The given model is expected to already contain - weights to evaluate. - - Args: - cfg (CfgNode): - model (nn.Module): - evaluators (list[DatasetEvaluator] or None): if None, will call - :meth:`build_evaluator`. 
Otherwise, must have the same length as - ``cfg.DATASETS.TEST``. - - Returns: - dict: a dict of result metrics - """ - logger = logging.getLogger(__name__) - if isinstance(evaluators, DatasetEvaluator): - evaluators = [evaluators] - if evaluators is not None: - assert len(cfg.DATASETS.TEST) == len(evaluators), "{} != {}".format( - len(cfg.DATASETS.TEST), len(evaluators) - ) - - results = OrderedDict() - for idx, dataset_name in enumerate(cfg.DATASETS.TEST): - data_loader = cls.build_test_loader(cfg, dataset_name) - # When evaluators are passed in as arguments, - # implicitly assume that evaluators can be created before data_loader. - if evaluators is not None: - evaluator = evaluators[idx] - else: - try: - evaluator = cls.build_evaluator(cfg, dataset_name) - except NotImplementedError: - logger.warn( - "No evaluator found. Use `DefaultTrainer.test(evaluators=)`, " - "or implement its `build_evaluator` method." - ) - results[dataset_name] = {} - continue - results_i = inference_on_dataset(model, data_loader, evaluator) - results[dataset_name] = results_i - if comm.is_main_process(): - assert isinstance( - results_i, dict - ), "Evaluator must return a dict on the main process. Got {} instead.".format( - results_i - ) - logger.info("Evaluation results for {} in csv format:".format(dataset_name)) - print_csv_format(results_i) - - if len(results) == 1: - results = list(results.values())[0] - return results - - @staticmethod - def auto_scale_workers(cfg, num_workers: int): - """ - When the config is defined for certain number of workers (according to - ``cfg.SOLVER.REFERENCE_WORLD_SIZE``) that's different from the number of - workers currently in use, returns a new cfg where the total batch size - is scaled so that the per-GPU batch size stays the same as the - original ``IMS_PER_BATCH // REFERENCE_WORLD_SIZE``. - - Other config options are also scaled accordingly: - * training steps and warmup steps are scaled inverse proportionally. - * learning rate are scaled proportionally, following :paper:`ImageNet in 1h`. - - For example, with the original config like the following: - - .. code-block:: yaml - - IMS_PER_BATCH: 16 - BASE_LR: 0.1 - REFERENCE_WORLD_SIZE: 8 - MAX_ITER: 5000 - STEPS: (4000,) - CHECKPOINT_PERIOD: 1000 - - When this config is used on 16 GPUs instead of the reference number 8, - calling this method will return a new config with: - - .. code-block:: yaml - - IMS_PER_BATCH: 32 - BASE_LR: 0.2 - REFERENCE_WORLD_SIZE: 16 - MAX_ITER: 2500 - STEPS: (2000,) - CHECKPOINT_PERIOD: 500 - - Note that both the original config and this new config can be trained on 16 GPUs. - It's up to user whether to enable this feature (by setting ``REFERENCE_WORLD_SIZE``). - - Returns: - CfgNode: a new config. Same as original if ``cfg.SOLVER.REFERENCE_WORLD_SIZE==0``. - """ - old_world_size = cfg.SOLVER.REFERENCE_WORLD_SIZE - if old_world_size == 0 or old_world_size == num_workers: - return cfg - cfg = cfg.clone() - frozen = cfg.is_frozen() - cfg.defrost() - - assert ( - cfg.SOLVER.IMS_PER_BATCH % old_world_size == 0 - ), "Invalid REFERENCE_WORLD_SIZE in config!" 
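The scaling rule documented above (and implemented just below) keeps the per-GPU batch size fixed: total batch size and learning rate scale linearly with the number of workers, while iteration counts scale inversely. A hedged sketch, checked against the docstring's own 8-to-16-GPU example (`auto_scale` is an illustrative name):

```python
def auto_scale(ims_per_batch, base_lr, max_iter, steps, ref_world_size, num_workers):
    """Sketch of the documented rule: per-GPU batch size stays constant,
    LR scales linearly, iteration counts scale inversely."""
    scale = num_workers / ref_world_size
    return (
        int(round(ims_per_batch * scale)),           # total batch size
        base_lr * scale,                             # linear LR scaling
        int(round(max_iter / scale)),                # fewer iters at larger batch
        tuple(int(round(s / scale)) for s in steps),
    )

# The worked example from the docstring above: 8 reference GPUs -> 16 GPUs.
assert auto_scale(16, 0.1, 5000, (4000,), 8, 16) == (32, 0.2, 2500, (2000,))
```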
- scale = num_workers / old_world_size - bs = cfg.SOLVER.IMS_PER_BATCH = int(round(cfg.SOLVER.IMS_PER_BATCH * scale)) - lr = cfg.SOLVER.BASE_LR = cfg.SOLVER.BASE_LR * scale - max_iter = cfg.SOLVER.MAX_ITER = int(round(cfg.SOLVER.MAX_ITER / scale)) - warmup_iter = cfg.SOLVER.WARMUP_ITERS = int(round(cfg.SOLVER.WARMUP_ITERS / scale)) - cfg.SOLVER.STEPS = tuple(int(round(s / scale)) for s in cfg.SOLVER.STEPS) - cfg.TEST.EVAL_PERIOD = int(round(cfg.TEST.EVAL_PERIOD / scale)) - cfg.SOLVER.CHECKPOINT_PERIOD = int(round(cfg.SOLVER.CHECKPOINT_PERIOD / scale)) - cfg.SOLVER.REFERENCE_WORLD_SIZE = num_workers # maintain invariant - logger = logging.getLogger(__name__) - logger.info( - f"Auto-scaling the config to batch_size={bs}, learning_rate={lr}, " - f"max_iter={max_iter}, warmup={warmup_iter}." - ) - - if frozen: - cfg.freeze() - return cfg - - -# Access basic attributes from the underlying trainer -for _attr in ["model", "data_loader", "optimizer"]: - setattr( - DefaultTrainer, - _attr, - property( - # getter - lambda self, x=_attr: getattr(self._trainer, x), - # setter - lambda self, value, x=_attr: setattr(self._trainer, x, value), - ), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/hooks.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/hooks.py deleted file mode 100755 index 52c321f9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/hooks.py +++ /dev/null @@ -1,686 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import datetime -import itertools -import logging -import math -import operator -import os -import tempfile -import time -import warnings -from collections import Counter -import torch -from fvcore.common.checkpoint import Checkpointer -from fvcore.common.checkpoint import PeriodicCheckpointer as _PeriodicCheckpointer -from fvcore.common.param_scheduler import ParamScheduler -from fvcore.common.timer import Timer -from fvcore.nn.precise_bn import get_bn_modules, update_bn_stats - -import detectron2.utils.comm as comm -from detectron2.evaluation.testing import flatten_results_dict -from detectron2.solver import LRMultiplier -from detectron2.utils.events import EventStorage, EventWriter -from detectron2.utils.file_io import PathManager - -from .train_loop import HookBase - -__all__ = [ - "CallbackHook", - "IterationTimer", - "PeriodicWriter", - "PeriodicCheckpointer", - "BestCheckpointer", - "LRScheduler", - "AutogradProfiler", - "EvalHook", - "PreciseBN", - "TorchProfiler", - "TorchMemoryStats", -] - - -""" -Implement some common hooks. -""" - - -class CallbackHook(HookBase): - """ - Create a hook using callback functions provided by the user. - """ - - def __init__(self, *, before_train=None, after_train=None, before_step=None, after_step=None): - """ - Each argument is a function that takes one argument: the trainer. - """ - self._before_train = before_train - self._before_step = before_step - self._after_step = after_step - self._after_train = after_train - - def before_train(self): - if self._before_train: - self._before_train(self.trainer) - - def after_train(self): - if self._after_train: - self._after_train(self.trainer) - # The functions may be closures that hold reference to the trainer - # Therefore, delete them to avoid circular reference. 
- del self._before_train, self._after_train - del self._before_step, self._after_step - - def before_step(self): - if self._before_step: - self._before_step(self.trainer) - - def after_step(self): - if self._after_step: - self._after_step(self.trainer) - - -class IterationTimer(HookBase): - """ - Track the time spent for each iteration (each run_step call in the trainer). - Print a summary in the end of training. - - This hook uses the time between the call to its :meth:`before_step` - and :meth:`after_step` methods. - Under the convention that :meth:`before_step` of all hooks should only - take negligible amount of time, the :class:`IterationTimer` hook should be - placed at the beginning of the list of hooks to obtain accurate timing. - """ - - def __init__(self, warmup_iter=3): - """ - Args: - warmup_iter (int): the number of iterations at the beginning to exclude - from timing. - """ - self._warmup_iter = warmup_iter - self._step_timer = Timer() - self._start_time = time.perf_counter() - self._total_timer = Timer() - - def before_train(self): - self._start_time = time.perf_counter() - self._total_timer.reset() - self._total_timer.pause() - - def after_train(self): - logger = logging.getLogger(__name__) - total_time = time.perf_counter() - self._start_time - total_time_minus_hooks = self._total_timer.seconds() - hook_time = total_time - total_time_minus_hooks - - num_iter = self.trainer.storage.iter + 1 - self.trainer.start_iter - self._warmup_iter - - if num_iter > 0 and total_time_minus_hooks > 0: - # Speed is meaningful only after warmup - # NOTE this format is parsed by grep in some scripts - logger.info( - "Overall training speed: {} iterations in {} ({:.4f} s / it)".format( - num_iter, - str(datetime.timedelta(seconds=int(total_time_minus_hooks))), - total_time_minus_hooks / num_iter, - ) - ) - - logger.info( - "Total training time: {} ({} on hooks)".format( - str(datetime.timedelta(seconds=int(total_time))), - str(datetime.timedelta(seconds=int(hook_time))), - ) - ) - - def before_step(self): - self._step_timer.reset() - self._total_timer.resume() - - def after_step(self): - # +1 because we're in after_step, the current step is done - # but not yet counted - iter_done = self.trainer.storage.iter - self.trainer.start_iter + 1 - if iter_done >= self._warmup_iter: - sec = self._step_timer.seconds() - self.trainer.storage.put_scalars(time=sec) - else: - self._start_time = time.perf_counter() - self._total_timer.reset() - - self._total_timer.pause() - - -class PeriodicWriter(HookBase): - """ - Write events to EventStorage (by calling ``writer.write()``) periodically. - - It is executed every ``period`` iterations and after the last iteration. - Note that ``period`` does not affect how data is smoothed by each writer. - """ - - def __init__(self, writers, period=20): - """ - Args: - writers (list[EventWriter]): a list of EventWriter objects - period (int): - """ - self._writers = writers - for w in writers: - assert isinstance(w, EventWriter), w - self._period = period - - def after_step(self): - if (self.trainer.iter + 1) % self._period == 0 or ( - self.trainer.iter == self.trainer.max_iter - 1 - ): - for writer in self._writers: - writer.write() - - def after_train(self): - for writer in self._writers: - # If any new data is found (e.g. produced by other after_train), - # write them before closing - writer.write() - writer.close() - - -class PeriodicCheckpointer(_PeriodicCheckpointer, HookBase): - """ - Same as :class:`detectron2.checkpoint.PeriodicCheckpointer`, but as a hook. 
- - Note that when used as a hook, - it is unable to save additional data other than what's defined - by the given `checkpointer`. - - It is executed every ``period`` iterations and after the last iteration. - """ - - def before_train(self): - self.max_iter = self.trainer.max_iter - - def after_step(self): - # No way to use **kwargs - self.step(self.trainer.iter) - - -class BestCheckpointer(HookBase): - """ - Checkpoints best weights based off given metric. - - This hook should be used in conjunction to and executed after the hook - that produces the metric, e.g. `EvalHook`. - """ - - def __init__( - self, - eval_period: int, - checkpointer: Checkpointer, - val_metric: str, - mode: str = "max", - file_prefix: str = "model_best", - ) -> None: - """ - Args: - eval_period (int): the period `EvalHook` is set to run. - checkpointer: the checkpointer object used to save checkpoints. - val_metric (str): validation metric to track for best checkpoint, e.g. "bbox/AP50" - mode (str): one of {'max', 'min'}. controls whether the chosen val metric should be - maximized or minimized, e.g. for "bbox/AP50" it should be "max" - file_prefix (str): the prefix of checkpoint's filename, defaults to "model_best" - """ - self._logger = logging.getLogger(__name__) - self._period = eval_period - self._val_metric = val_metric - assert mode in [ - "max", - "min", - ], f'Mode "{mode}" to `BestCheckpointer` is unknown. It should be one of {"max", "min"}.' - if mode == "max": - self._compare = operator.gt - else: - self._compare = operator.lt - self._checkpointer = checkpointer - self._file_prefix = file_prefix - self.best_metric = None - self.best_iter = None - - def _update_best(self, val, iteration): - if math.isnan(val) or math.isinf(val): - return False - self.best_metric = val - self.best_iter = iteration - return True - - def _best_checking(self): - metric_tuple = self.trainer.storage.latest().get(self._val_metric) - if metric_tuple is None: - self._logger.warning( - f"Given val metric {self._val_metric} does not seem to be computed/stored." - "Will not be checkpointing based on it." - ) - return - else: - latest_metric, metric_iter = metric_tuple - - if self.best_metric is None: - if self._update_best(latest_metric, metric_iter): - additional_state = {"iteration": metric_iter} - self._checkpointer.save(f"{self._file_prefix}", **additional_state) - self._logger.info( - f"Saved first model at {self.best_metric:0.5f} @ {self.best_iter} steps" - ) - elif self._compare(latest_metric, self.best_metric): - additional_state = {"iteration": metric_iter} - self._checkpointer.save(f"{self._file_prefix}", **additional_state) - self._logger.info( - f"Saved best model as latest eval score for {self._val_metric} is " - f"{latest_metric:0.5f}, better than last best score " - f"{self.best_metric:0.5f} @ iteration {self.best_iter}." - ) - self._update_best(latest_metric, metric_iter) - else: - self._logger.info( - f"Not saving as latest eval score for {self._val_metric} is {latest_metric:0.5f}, " - f"not better than best score {self.best_metric:0.5f} @ iteration {self.best_iter}." 
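`BestCheckpointer`'s decision logic above is a small state machine: track the best metric value seen so far under a "max" or "min" mode, skip NaN/inf values, and save whenever a new value beats the best. A hedged standalone sketch with an illustrative `BestTracker` class:

```python
import math
import operator

class BestTracker:
    """Sketch of BestCheckpointer's comparison logic (illustrative class)."""
    def __init__(self, mode="max"):
        self.compare = operator.gt if mode == "max" else operator.lt
        self.best = None

    def update(self, value) -> bool:
        """Return True when `value` should trigger a new 'model_best' save."""
        if math.isnan(value) or math.isinf(value):
            return False
        if self.best is None or self.compare(value, self.best):
            self.best = value
            return True
        return False

t = BestTracker("max")
assert [t.update(v) for v in [0.31, 0.29, float("nan"), 0.35]] == [True, False, False, True]
```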
- ) - - def after_step(self): - # same conditions as `EvalHook` - next_iter = self.trainer.iter + 1 - if ( - self._period > 0 - and next_iter % self._period == 0 - and next_iter != self.trainer.max_iter - ): - self._best_checking() - - def after_train(self): - # same conditions as `EvalHook` - if self.trainer.iter + 1 >= self.trainer.max_iter: - self._best_checking() - - -class LRScheduler(HookBase): - """ - A hook which executes a torch builtin LR scheduler and summarizes the LR. - It is executed after every iteration. - """ - - def __init__(self, optimizer=None, scheduler=None): - """ - Args: - optimizer (torch.optim.Optimizer): - scheduler (torch.optim.LRScheduler or fvcore.common.param_scheduler.ParamScheduler): - if a :class:`ParamScheduler` object, it defines the multiplier over the base LR - in the optimizer. - - If any argument is not given, will try to obtain it from the trainer. - """ - self._optimizer = optimizer - self._scheduler = scheduler - - def before_train(self): - self._optimizer = self._optimizer or self.trainer.optimizer - if isinstance(self.scheduler, ParamScheduler): - self._scheduler = LRMultiplier( - self._optimizer, - self.scheduler, - self.trainer.max_iter, - last_iter=self.trainer.iter - 1, - ) - self._best_param_group_id = LRScheduler.get_best_param_group_id(self._optimizer) - - @staticmethod - def get_best_param_group_id(optimizer): - # NOTE: some heuristics on what LR to summarize - # summarize the param group with most parameters - largest_group = max(len(g["params"]) for g in optimizer.param_groups) - - if largest_group == 1: - # If all groups have one parameter, - # then find the most common initial LR, and use it for summary - lr_count = Counter([g["lr"] for g in optimizer.param_groups]) - lr = lr_count.most_common()[0][0] - for i, g in enumerate(optimizer.param_groups): - if g["lr"] == lr: - return i - else: - for i, g in enumerate(optimizer.param_groups): - if len(g["params"]) == largest_group: - return i - - def after_step(self): - lr = self._optimizer.param_groups[self._best_param_group_id]["lr"] - self.trainer.storage.put_scalar("lr", lr, smoothing_hint=False) - self.scheduler.step() - - @property - def scheduler(self): - return self._scheduler or self.trainer.scheduler - - def state_dict(self): - if isinstance(self.scheduler, torch.optim.lr_scheduler._LRScheduler): - return self.scheduler.state_dict() - return {} - - def load_state_dict(self, state_dict): - if isinstance(self.scheduler, torch.optim.lr_scheduler._LRScheduler): - logger = logging.getLogger(__name__) - logger.info("Loading scheduler from state_dict ...") - self.scheduler.load_state_dict(state_dict) - - -class TorchProfiler(HookBase): - """ - A hook which runs `torch.profiler.profile`. - - Examples: - :: - hooks.TorchProfiler( - lambda trainer: 10 < trainer.iter < 20, self.cfg.OUTPUT_DIR - ) - - The above example will run the profiler for iteration 10~20 and dump - results to ``OUTPUT_DIR``. We did not profile the first few iterations - because they are typically slower than the rest. - The result files can be loaded in the ``chrome://tracing`` page in chrome browser, - and the tensorboard visualizations can be visualized using - ``tensorboard --logdir OUTPUT_DIR/log`` - """ - - def __init__(self, enable_predicate, output_dir, *, activities=None, save_tensorboard=True): - """ - Args: - enable_predicate (callable[trainer -> bool]): a function which takes a trainer, - and returns whether to enable the profiler. 
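The `get_best_param_group_id` heuristic above picks which optimizer param group's LR to log: the group with the most parameters, or, when every group holds exactly one parameter, the group with the most common initial LR. A hedged restatement (`best_param_group_id` is an illustrative name):

```python
from collections import Counter

def best_param_group_id(param_groups):
    """Sketch of the LR-summary heuristic above for a list of dicts, each
    with 'params' and 'lr' keys as in torch optimizer param groups."""
    largest = max(len(g["params"]) for g in param_groups)
    if largest == 1:
        lr = Counter(g["lr"] for g in param_groups).most_common(1)[0][0]
        return next(i for i, g in enumerate(param_groups) if g["lr"] == lr)
    return next(i for i, g in enumerate(param_groups) if len(g["params"]) == largest)

groups = [{"params": [0], "lr": 0.1}, {"params": [1, 2, 3], "lr": 0.01}]
assert best_param_group_id(groups) == 1  # the 3-parameter group wins
```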
- It will be called once every step, and can be used to select which steps to profile. - output_dir (str): the output directory to dump tracing files. - activities (iterable): same as in `torch.profiler.profile`. - save_tensorboard (bool): whether to save tensorboard visualizations at (output_dir)/log/ - """ - self._enable_predicate = enable_predicate - self._activities = activities - self._output_dir = output_dir - self._save_tensorboard = save_tensorboard - - def before_step(self): - if self._enable_predicate(self.trainer): - if self._save_tensorboard: - on_trace_ready = torch.profiler.tensorboard_trace_handler( - os.path.join( - self._output_dir, - "log", - "profiler-tensorboard-iter{}".format(self.trainer.iter), - ), - f"worker{comm.get_rank()}", - ) - else: - on_trace_ready = None - self._profiler = torch.profiler.profile( - activities=self._activities, - on_trace_ready=on_trace_ready, - record_shapes=True, - profile_memory=True, - with_stack=True, - with_flops=True, - ) - self._profiler.__enter__() - else: - self._profiler = None - - def after_step(self): - if self._profiler is None: - return - self._profiler.__exit__(None, None, None) - if not self._save_tensorboard: - PathManager.mkdirs(self._output_dir) - out_file = os.path.join( - self._output_dir, "profiler-trace-iter{}.json".format(self.trainer.iter) - ) - if "://" not in out_file: - self._profiler.export_chrome_trace(out_file) - else: - # Support non-posix filesystems - with tempfile.TemporaryDirectory(prefix="detectron2_profiler") as d: - tmp_file = os.path.join(d, "tmp.json") - self._profiler.export_chrome_trace(tmp_file) - with open(tmp_file) as f: - content = f.read() - with PathManager.open(out_file, "w") as f: - f.write(content) - - -class AutogradProfiler(TorchProfiler): - """ - A hook which runs `torch.autograd.profiler.profile`. - - Examples: - :: - hooks.AutogradProfiler( - lambda trainer: 10 < trainer.iter < 20, self.cfg.OUTPUT_DIR - ) - - The above example will run the profiler for iteration 10~20 and dump - results to ``OUTPUT_DIR``. We did not profile the first few iterations - because they are typically slower than the rest. - The result files can be loaded in the ``chrome://tracing`` page in chrome browser. - - Note: - When used together with NCCL on older version of GPUs, - autograd profiler may cause deadlock because it unnecessarily allocates - memory on every device it sees. The memory management calls, if - interleaved with NCCL calls, lead to deadlock on GPUs that do not - support ``cudaLaunchCooperativeKernelMultiDevice``. - """ - - def __init__(self, enable_predicate, output_dir, *, use_cuda=True): - """ - Args: - enable_predicate (callable[trainer -> bool]): a function which takes a trainer, - and returns whether to enable the profiler. - It will be called once every step, and can be used to select which steps to profile. - output_dir (str): the output directory to dump tracing files. - use_cuda (bool): same as in `torch.autograd.profiler.profile`. - """ - warnings.warn("AutogradProfiler has been deprecated in favor of TorchProfiler.") - self._enable_predicate = enable_predicate - self._use_cuda = use_cuda - self._output_dir = output_dir - - def before_step(self): - if self._enable_predicate(self.trainer): - self._profiler = torch.autograd.profiler.profile(use_cuda=self._use_cuda) - self._profiler.__enter__() - else: - self._profiler = None - - -class EvalHook(HookBase): - """ - Run an evaluation function periodically, and at the end of training. 
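The `TorchProfiler` hook above is registered as, e.g., `hooks.TorchProfiler(lambda t: 10 < t.iter < 20, output_dir)`. Under the hood it is plain `torch.profiler`; a rough standalone equivalent of one profiled step (the output path and toy workload are illustrative):

```
import os
import torch
from torch.profiler import profile, tensorboard_trace_handler

output_dir = "./output"  # illustrative
with profile(
    on_trace_ready=tensorboard_trace_handler(os.path.join(output_dir, "log")),
    record_shapes=True,
    profile_memory=True,
    with_stack=True,
    with_flops=True,
):
    x = torch.randn(64, 128)
    torch.nn.Linear(128, 128)(x).sum().backward()
```

The trace can then be viewed with `tensorboard --logdir ./output/log`, matching what the hook writes per profiled iteration.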
- - It is executed every ``eval_period`` iterations and after the last iteration. - """ - - def __init__(self, eval_period, eval_function): - """ - Args: - eval_period (int): the period to run `eval_function`. Set to 0 to - not evaluate periodically (but still after the last iteration). - eval_function (callable): a function which takes no arguments, and - returns a nested dict of evaluation metrics. - - Note: - This hook must be enabled in all or none workers. - If you would like only certain workers to perform evaluation, - give other workers a no-op function (`eval_function=lambda: None`). - """ - self._period = eval_period - self._func = eval_function - - def _do_eval(self): - results = self._func() - - if results: - assert isinstance( - results, dict - ), "Eval function must return a dict. Got {} instead.".format(results) - - flattened_results = flatten_results_dict(results) - for k, v in flattened_results.items(): - try: - v = float(v) - except Exception as e: - raise ValueError( - "[EvalHook] eval_function should return a nested dict of float. " - "Got '{}: {}' instead.".format(k, v) - ) from e - self.trainer.storage.put_scalars(**flattened_results, smoothing_hint=False) - - # Evaluation may take different time among workers. - # A barrier make them start the next iteration together. - comm.synchronize() - - def after_step(self): - next_iter = self.trainer.iter + 1 - if self._period > 0 and next_iter % self._period == 0: - # do the last eval in after_train - if next_iter != self.trainer.max_iter: - self._do_eval() - - def after_train(self): - # This condition is to prevent the eval from running after a failed training - if self.trainer.iter + 1 >= self.trainer.max_iter: - self._do_eval() - # func is likely a closure that holds reference to the trainer - # therefore we clean it to avoid circular reference in the end - del self._func - - -class PreciseBN(HookBase): - """ - The standard implementation of BatchNorm uses EMA in inference, which is - sometimes suboptimal. - This class computes the true average of statistics rather than the moving average, - and put true averages to every BN layer in the given model. - - It is executed every ``period`` iterations and after the last iteration. - """ - - def __init__(self, period, model, data_loader, num_iter): - """ - Args: - period (int): the period this hook is run, or 0 to not run during training. - The hook will always run in the end of training. - model (nn.Module): a module whose all BN layers in training mode will be - updated by precise BN. - Note that user is responsible for ensuring the BN layers to be - updated are in training mode when this hook is triggered. - data_loader (iterable): it will produce data to be run by `model(data)`. - num_iter (int): number of iterations used to compute the precise - statistics. - """ - self._logger = logging.getLogger(__name__) - if len(get_bn_modules(model)) == 0: - self._logger.info( - "PreciseBN is disabled because model does not contain BN layers in training mode." - ) - self._disabled = True - return - - self._model = model - self._data_loader = data_loader - self._num_iter = num_iter - self._period = period - self._disabled = False - - self._data_iter = None - - def after_step(self): - next_iter = self.trainer.iter + 1 - is_final = next_iter == self.trainer.max_iter - if is_final or (self._period > 0 and next_iter % self._period == 0): - self.update_stats() - - def update_stats(self): - """ - Update the model with precise statistics. Users can manually call this method. 
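Taken together with `BestCheckpointer` further up: `EvalHook` runs a zero-argument callable returning a (possibly nested) dict of floats, flattens the keys with `/` separators, and stores them, which is exactly where `BestCheckpointer` reads its `val_metric` from. A minimal sketch of the pairing (here `cfg` is a fully populated detectron2 config, assumed to be built elsewhere):

```
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.engine import DefaultTrainer, hooks

trainer = DefaultTrainer(cfg)  # `cfg` assumed to exist
trainer.resume_or_load(resume=False)
# DefaultTrainer already registers an EvalHook that runs every
# cfg.TEST.EVAL_PERIOD iterations and stores flattened metrics such as
# "bbox/AP50"; the BestCheckpointer below only needs to come after it.
trainer.register_hooks([
    hooks.BestCheckpointer(
        cfg.TEST.EVAL_PERIOD,
        DetectionCheckpointer(trainer.model, cfg.OUTPUT_DIR),
        "bbox/AP50",
        mode="max",
    )
])
trainer.train()
```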
- """ - if self._disabled: - return - - if self._data_iter is None: - self._data_iter = iter(self._data_loader) - - def data_loader(): - for num_iter in itertools.count(1): - if num_iter % 100 == 0: - self._logger.info( - "Running precise-BN ... {}/{} iterations.".format(num_iter, self._num_iter) - ) - # This way we can reuse the same iterator - yield next(self._data_iter) - - with EventStorage(): # capture events in a new storage to discard them - self._logger.info( - "Running precise-BN for {} iterations... ".format(self._num_iter) - + "Note that this could produce different statistics every time." - ) - update_bn_stats(self._model, data_loader(), self._num_iter) - - -class TorchMemoryStats(HookBase): - """ - Writes pytorch's cuda memory statistics periodically. - """ - - def __init__(self, period=20, max_runs=10): - """ - Args: - period (int): Output stats each 'period' iterations - max_runs (int): Stop the logging after 'max_runs' - """ - - self._logger = logging.getLogger(__name__) - self._period = period - self._max_runs = max_runs - self._runs = 0 - - def after_step(self): - if self._runs > self._max_runs: - return - - if (self.trainer.iter + 1) % self._period == 0 or ( - self.trainer.iter == self.trainer.max_iter - 1 - ): - if torch.cuda.is_available(): - max_reserved_mb = torch.cuda.max_memory_reserved() / 1024.0 / 1024.0 - reserved_mb = torch.cuda.memory_reserved() / 1024.0 / 1024.0 - max_allocated_mb = torch.cuda.max_memory_allocated() / 1024.0 / 1024.0 - allocated_mb = torch.cuda.memory_allocated() / 1024.0 / 1024.0 - - self._logger.info( - ( - " iter: {} " - " max_reserved_mem: {:.0f}MB " - " reserved_mem: {:.0f}MB " - " max_allocated_mem: {:.0f}MB " - " allocated_mem: {:.0f}MB " - ).format( - self.trainer.iter, - max_reserved_mb, - reserved_mb, - max_allocated_mb, - allocated_mb, - ) - ) - - self._runs += 1 - if self._runs == self._max_runs: - mem_summary = torch.cuda.memory_summary() - self._logger.info("\n" + mem_summary) - - torch.cuda.reset_peak_memory_stats() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/launch.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/launch.py deleted file mode 100755 index 46f98691..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/launch.py +++ /dev/null @@ -1,126 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -from datetime import timedelta -import torch -import torch.distributed as dist -import torch.multiprocessing as mp - -from detectron2.utils import comm - -__all__ = ["DEFAULT_TIMEOUT", "launch"] - -DEFAULT_TIMEOUT = timedelta(minutes=30) - - -def _find_free_port(): - import socket - - sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) - # Binding to port 0 will cause the OS to find an available port for us - sock.bind(("", 0)) - port = sock.getsockname()[1] - sock.close() - # NOTE: there is still a chance the port could be taken by other processes. - return port - - -def launch( - main_func, - num_gpus_per_machine, - num_machines=1, - machine_rank=0, - dist_url=None, - args=(), - timeout=DEFAULT_TIMEOUT, -): - """ - Launch multi-gpu or distributed training. - This function must be called on all machines involved in the training. - It will spawn child processes (defined by ``num_gpus_per_machine``) on each machine. 
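The `PreciseBN` hook above ultimately delegates to fvcore's `update_bn_stats`. A standalone sketch with a toy model and an endless dummy loader (both illustrative); the guard mirrors how the hook disables itself when no BN layer is in training mode:

```
import itertools
import torch
from fvcore.nn.precise_bn import get_bn_modules, update_bn_stats

model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.BatchNorm2d(8))
data = (torch.randn(4, 3, 32, 32) for _ in itertools.count())

model.train()  # stats are only recomputed for BN layers in training mode
if len(get_bn_modules(model)) > 0:
    update_bn_stats(model, data, num_iters=200)
```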
- - Args: - main_func: a function that will be called by `main_func(*args)` - num_gpus_per_machine (int): number of GPUs per machine - num_machines (int): the total number of machines - machine_rank (int): the rank of this machine - dist_url (str): url to connect to for distributed jobs, including protocol - e.g. "tcp://127.0.0.1:8686". - Can be set to "auto" to automatically select a free port on localhost - timeout (timedelta): timeout of the distributed workers - args (tuple): arguments passed to main_func - """ - world_size = num_machines * num_gpus_per_machine - if world_size > 1: - # https://github.com/pytorch/pytorch/pull/14391 - # TODO prctl in spawned processes - - if dist_url == "auto": - assert num_machines == 1, "dist_url=auto not supported in multi-machine jobs." - port = _find_free_port() - dist_url = f"tcp://127.0.0.1:{port}" - if num_machines > 1 and dist_url.startswith("file://"): - logger = logging.getLogger(__name__) - logger.warning( - "file:// is not a reliable init_method in multi-machine jobs. Prefer tcp://" - ) - - mp.spawn( - _distributed_worker, - nprocs=num_gpus_per_machine, - args=( - main_func, - world_size, - num_gpus_per_machine, - machine_rank, - dist_url, - args, - timeout, - ), - daemon=False, - ) - else: - main_func(*args) - - -def _distributed_worker( - local_rank, - main_func, - world_size, - num_gpus_per_machine, - machine_rank, - dist_url, - args, - timeout=DEFAULT_TIMEOUT, -): - assert torch.cuda.is_available(), "cuda is not available. Please check your installation." - global_rank = machine_rank * num_gpus_per_machine + local_rank - try: - dist.init_process_group( - backend="NCCL", - init_method=dist_url, - world_size=world_size, - rank=global_rank, - timeout=timeout, - ) - except Exception as e: - logger = logging.getLogger(__name__) - logger.error("Process group URL: {}".format(dist_url)) - raise e - - # Setup the local process group (which contains ranks within the same machine) - assert comm._LOCAL_PROCESS_GROUP is None - num_machines = world_size // num_gpus_per_machine - for i in range(num_machines): - ranks_on_i = list(range(i * num_gpus_per_machine, (i + 1) * num_gpus_per_machine)) - pg = dist.new_group(ranks_on_i) - if i == machine_rank: - comm._LOCAL_PROCESS_GROUP = pg - - assert num_gpus_per_machine <= torch.cuda.device_count() - torch.cuda.set_device(local_rank) - - # synchronize is needed here to prevent a possible timeout after calling init_process_group - # See: https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 - comm.synchronize() - - main_func(*args) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/train_loop.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/train_loop.py deleted file mode 100755 index c4a86b52..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/engine/train_loop.py +++ /dev/null @@ -1,417 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -import time -import weakref -from typing import List, Mapping, Optional -import torch -from torch.nn.parallel import DataParallel, DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.utils.events import EventStorage, get_event_storage -from detectron2.utils.logger import _log_api_usage - -__all__ = ["HookBase", "TrainerBase", "SimpleTrainer", "AMPTrainer"] - - -class HookBase: - """ - Base class for hooks that can be registered with :class:`TrainerBase`. 
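Typical use of `launch` above is a small `__main__` stub; each spawned process ends up calling `main_func(*args)` with the process group already initialized. A minimal sketch (assumes a machine with two GPUs; the message argument is illustrative):

```
from detectron2.engine import launch
from detectron2.utils import comm

def main_fn(message):
    # Runs once per spawned process; rank comes from the process group.
    print(f"rank {comm.get_rank()}/{comm.get_world_size()}: {message}")

if __name__ == "__main__":
    # Single machine, two GPUs; "auto" picks a free localhost port.
    launch(main_fn, num_gpus_per_machine=2, dist_url="auto", args=("hello",))
```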
- - Each hook can implement 4 methods. The way they are called is demonstrated - in the following snippet: - :: - hook.before_train() - for iter in range(start_iter, max_iter): - hook.before_step() - trainer.run_step() - hook.after_step() - iter += 1 - hook.after_train() - - Notes: - 1. In the hook method, users can access ``self.trainer`` to access more - properties about the context (e.g., model, current iteration, or config - if using :class:`DefaultTrainer`). - - 2. A hook that does something in :meth:`before_step` can often be - implemented equivalently in :meth:`after_step`. - If the hook takes non-trivial time, it is strongly recommended to - implement the hook in :meth:`after_step` instead of :meth:`before_step`. - The convention is that :meth:`before_step` should only take negligible time. - - Following this convention will allow hooks that do care about the difference - between :meth:`before_step` and :meth:`after_step` (e.g., timer) to - function properly. - - """ - - trainer: "TrainerBase" = None - """ - A weak reference to the trainer object. Set by the trainer when the hook is registered. - """ - - def before_train(self): - """ - Called before the first iteration. - """ - pass - - def after_train(self): - """ - Called after the last iteration. - """ - pass - - def before_step(self): - """ - Called before each iteration. - """ - pass - - def after_step(self): - """ - Called after each iteration. - """ - pass - - def state_dict(self): - """ - Hooks are stateless by default, but can be made checkpointable by - implementing `state_dict` and `load_state_dict`. - """ - return {} - - -class TrainerBase: - """ - Base class for iterative trainer with hooks. - - The only assumption we made here is: the training runs in a loop. - A subclass can implement what the loop is. - We made no assumptions about the existence of dataloader, optimizer, model, etc. - - Attributes: - iter(int): the current iteration. - - start_iter(int): The iteration to start with. - By convention the minimum possible value is 0. - - max_iter(int): The iteration to end training. - - storage(EventStorage): An EventStorage that's opened during the course of training. - """ - - def __init__(self) -> None: - self._hooks: List[HookBase] = [] - self.iter: int = 0 - self.start_iter: int = 0 - self.max_iter: int - self.storage: EventStorage - _log_api_usage("trainer." + self.__class__.__name__) - - def register_hooks(self, hooks: List[Optional[HookBase]]) -> None: - """ - Register hooks to the trainer. The hooks are executed in the order - they are registered. - - Args: - hooks (list[Optional[HookBase]]): list of hooks - """ - hooks = [h for h in hooks if h is not None] - for h in hooks: - assert isinstance(h, HookBase) - # To avoid circular reference, hooks and trainer cannot own each other. 
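Writing a custom hook against `HookBase` above only requires overriding the callbacks you need; `self.trainer` gives access to the iteration counter and the metric storage. A small sketch (the hook name, period, and printed metric are illustrative):

```
from detectron2.engine import HookBase

class LossPrinter(HookBase):
    """Print the latest total_loss every `period` iterations (illustrative)."""

    def __init__(self, period=100):
        self._period = period

    def after_step(self):
        if (self.trainer.iter + 1) % self._period == 0:
            latest = self.trainer.storage.latest()  # name -> (value, iter)
            if "total_loss" in latest:
                value, it = latest["total_loss"]
                print(f"iter {it}: total_loss={value:.4f}")

# registered like any other hook, e.g. trainer.register_hooks([LossPrinter()])
```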
- # This normally does not matter, but will cause memory leak if the - # involved objects contain __del__: - # See http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ - h.trainer = weakref.proxy(self) - self._hooks.extend(hooks) - - def train(self, start_iter: int, max_iter: int): - """ - Args: - start_iter, max_iter (int): See docs above - """ - logger = logging.getLogger(__name__) - logger.info("Starting training from iteration {}".format(start_iter)) - - self.iter = self.start_iter = start_iter - self.max_iter = max_iter - - with EventStorage(start_iter) as self.storage: - try: - self.before_train() - for self.iter in range(start_iter, max_iter): - self.before_step() - self.run_step() - self.after_step() - # self.iter == max_iter can be used by `after_train` to - # tell whether the training successfully finished or failed - # due to exceptions. - self.iter += 1 - except Exception: - logger.exception("Exception during training:") - raise - finally: - self.after_train() - - def before_train(self): - for h in self._hooks: - h.before_train() - - def after_train(self): - self.storage.iter = self.iter - for h in self._hooks: - h.after_train() - - def before_step(self): - # Maintain the invariant that storage.iter == trainer.iter - # for the entire execution of each step - self.storage.iter = self.iter - - for h in self._hooks: - h.before_step() - - def after_step(self): - for h in self._hooks: - h.after_step() - - def run_step(self): - raise NotImplementedError - - def state_dict(self): - ret = {"iteration": self.iter} - hooks_state = {} - for h in self._hooks: - sd = h.state_dict() - if sd: - name = type(h).__qualname__ - if name in hooks_state: - # TODO handle repetitive stateful hooks - continue - hooks_state[name] = sd - if hooks_state: - ret["hooks"] = hooks_state - return ret - - def load_state_dict(self, state_dict): - logger = logging.getLogger(__name__) - self.iter = state_dict["iteration"] - for key, value in state_dict.get("hooks", {}).items(): - for h in self._hooks: - try: - name = type(h).__qualname__ - except AttributeError: - continue - if name == key: - h.load_state_dict(value) - break - else: - logger.warning(f"Cannot find the hook '{key}', its state_dict is ignored.") - - -class SimpleTrainer(TrainerBase): - """ - A simple trainer for the most common type of task: - single-cost single-optimizer single-data-source iterative optimization, - optionally using data-parallelism. - It assumes that every step, you: - - 1. Compute the loss with a data from the data_loader. - 2. Compute the gradients with the above loss. - 3. Update the model with the optimizer. - - All other tasks during training (checkpointing, logging, evaluation, LR schedule) - are maintained by hooks, which can be registered by :meth:`TrainerBase.register_hooks`. - - If you want to do anything fancier than this, - either subclass TrainerBase and implement your own `run_step`, - or write your own training loop. - """ - - def __init__(self, model, data_loader, optimizer): - """ - Args: - model: a torch Module. Takes a data from data_loader and returns a - dict of losses. - data_loader: an iterable. Contains data to be used to call model. - optimizer: a torch optimizer. - """ - super().__init__() - - """ - We set the model to training mode in the trainer. - However it's valid to train a model that's in eval mode. - If you want your model (or a submodule of it) to behave - like evaluation during training, you can overwrite its train() method. 
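`TrainerBase` above makes almost no assumptions: `run_step` is the only method a subclass must provide, and `self.storage` is available inside it for logging scalars. A self-contained toy subclass (model, data, and loss are illustrative):

```
import torch
from detectron2.engine import TrainerBase

class ToyTrainer(TrainerBase):
    def __init__(self, model, optimizer):
        super().__init__()
        self.model = model
        self.optimizer = optimizer

    def run_step(self):
        x = torch.randn(16, 4)
        loss = self.model(x).pow(2).mean()  # drive outputs toward zero
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
        self.storage.put_scalar("total_loss", loss.item())

model = torch.nn.Linear(4, 1)
trainer = ToyTrainer(model, torch.optim.SGD(model.parameters(), lr=0.01))
trainer.train(start_iter=0, max_iter=50)
```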
- """ - model.train() - - self.model = model - self.data_loader = data_loader - self._data_loader_iter = iter(data_loader) - self.optimizer = optimizer - - def run_step(self): - """ - Implement the standard training logic described above. - """ - assert self.model.training, "[SimpleTrainer] model was changed to eval mode!" - start = time.perf_counter() - """ - If you want to do something with the data, you can wrap the dataloader. - """ - data = next(self._data_loader_iter) - data_time = time.perf_counter() - start - - """ - If you want to do something with the losses, you can wrap the model. - """ - loss_dict = self.model(data) - if isinstance(loss_dict, torch.Tensor): - losses = loss_dict - loss_dict = {"total_loss": loss_dict} - else: - losses = sum(loss_dict.values()) - - """ - If you need to accumulate gradients or do something similar, you can - wrap the optimizer with your custom `zero_grad()` method. - """ - self.optimizer.zero_grad() - losses.backward() - - self._write_metrics(loss_dict, data_time) - - """ - If you need gradient clipping/scaling or other processing, you can - wrap the optimizer with your custom `step()` method. But it is - suboptimal as explained in https://arxiv.org/abs/2006.15704 Sec 3.2.4 - """ - self.optimizer.step() - - def _write_metrics( - self, - loss_dict: Mapping[str, torch.Tensor], - data_time: float, - prefix: str = "", - ) -> None: - SimpleTrainer.write_metrics(loss_dict, data_time, prefix) - - @staticmethod - def write_metrics( - loss_dict: Mapping[str, torch.Tensor], - data_time: float, - prefix: str = "", - ) -> None: - """ - Args: - loss_dict (dict): dict of scalar losses - data_time (float): time taken by the dataloader iteration - prefix (str): prefix for logging keys - """ - metrics_dict = {k: v.detach().cpu().item() for k, v in loss_dict.items()} - metrics_dict["data_time"] = data_time - - # Gather metrics among all workers for logging - # This assumes we do DDP-style training, which is currently the only - # supported method in detectron2. - all_metrics_dict = comm.gather(metrics_dict) - - if comm.is_main_process(): - storage = get_event_storage() - - # data_time among workers can have high variance. The actual latency - # caused by data_time is the maximum among workers. - data_time = np.max([x.pop("data_time") for x in all_metrics_dict]) - storage.put_scalar("data_time", data_time) - - # average the rest metrics - metrics_dict = { - k: np.mean([x[k] for x in all_metrics_dict]) for k in all_metrics_dict[0].keys() - } - total_losses_reduced = sum(metrics_dict.values()) - if not np.isfinite(total_losses_reduced): - raise FloatingPointError( - f"Loss became infinite or NaN at iteration={storage.iter}!\n" - f"loss_dict = {metrics_dict}" - ) - - storage.put_scalar("{}total_loss".format(prefix), total_losses_reduced) - if len(metrics_dict) > 1: - storage.put_scalars(**metrics_dict) - - def state_dict(self): - ret = super().state_dict() - ret["optimizer"] = self.optimizer.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self.optimizer.load_state_dict(state_dict["optimizer"]) - - -class AMPTrainer(SimpleTrainer): - """ - Like :class:`SimpleTrainer`, but uses PyTorch's native automatic mixed precision - in the training loop. - """ - - def __init__(self, model, data_loader, optimizer, grad_scaler=None): - """ - Args: - model, data_loader, optimizer: same as in :class:`SimpleTrainer`. - grad_scaler: torch GradScaler to automatically scale gradients. 
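`SimpleTrainer` above expects a loader yielding whatever `model(data)` accepts, and a model that, in training mode, returns a dict of losses (or a single loss tensor). A toy end-to-end sketch (model, loader, and loss name are illustrative):

```
import torch
from detectron2.engine import SimpleTrainer

class ToyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 1)

    def forward(self, batch):
        x, y = batch
        return {"loss_mse": torch.nn.functional.mse_loss(self.fc(x), y)}

def loader():
    while True:  # SimpleTrainer just calls next() on it every step
        yield torch.randn(8, 4), torch.randn(8, 1)

model = ToyModel()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
trainer = SimpleTrainer(model, loader(), opt)
trainer.train(0, 100)
```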
- """ - unsupported = "AMPTrainer does not support single-process multi-device training!" - if isinstance(model, DistributedDataParallel): - assert not (model.device_ids and len(model.device_ids) > 1), unsupported - assert not isinstance(model, DataParallel), unsupported - - super().__init__(model, data_loader, optimizer) - - if grad_scaler is None: - from torch.cuda.amp import GradScaler - - grad_scaler = GradScaler() - self.grad_scaler = grad_scaler - - def run_step(self): - """ - Implement the AMP training logic. - """ - assert self.model.training, "[AMPTrainer] model was changed to eval mode!" - assert torch.cuda.is_available(), "[AMPTrainer] CUDA is required for AMP training!" - from torch.cuda.amp import autocast - - start = time.perf_counter() - data = next(self._data_loader_iter) - data_time = time.perf_counter() - start - - with autocast(): - loss_dict = self.model(data) - if isinstance(loss_dict, torch.Tensor): - losses = loss_dict - loss_dict = {"total_loss": loss_dict} - else: - losses = sum(loss_dict.values()) - - self.optimizer.zero_grad() - self.grad_scaler.scale(losses).backward() - - self._write_metrics(loss_dict, data_time) - - self.grad_scaler.step(self.optimizer) - self.grad_scaler.update() - - def state_dict(self): - ret = super().state_dict() - ret["grad_scaler"] = self.grad_scaler.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self.grad_scaler.load_state_dict(state_dict["grad_scaler"]) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/__init__.py deleted file mode 100755 index d96609e8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .cityscapes_evaluation import CityscapesInstanceEvaluator, CityscapesSemSegEvaluator -from .coco_evaluation import COCOEvaluator -from .rotated_coco_evaluation import RotatedCOCOEvaluator -from .evaluator import DatasetEvaluator, DatasetEvaluators, inference_context, inference_on_dataset -from .lvis_evaluation import LVISEvaluator -from .panoptic_evaluation import COCOPanopticEvaluator -from .pascal_voc_evaluation import PascalVOCDetectionEvaluator -from .sem_seg_evaluation import SemSegEvaluator -from .testing import print_csv_format, verify_results - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py deleted file mode 100755 index 3fb6c4cd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py +++ /dev/null @@ -1,194 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import glob -import logging -import numpy as np -import os -import tempfile -from collections import OrderedDict -import torch -from PIL import Image - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class CityscapesEvaluator(DatasetEvaluator): - """ - Base class for evaluation using cityscapes API. 
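`AMPTrainer` above is `SimpleTrainer` plus the standard gradient-scaling recipe. The same recipe in plain PyTorch, for reference (toy model and loop; note this code predates the newer `torch.amp` namespace):

```
import torch
from torch.cuda.amp import GradScaler, autocast

assert torch.cuda.is_available(), "AMP as used here requires a GPU"
model = torch.nn.Linear(128, 128).cuda()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = GradScaler()

for _ in range(10):
    x = torch.randn(32, 128, device="cuda")
    with autocast():                   # forward pass in mixed precision
        loss = model(x).pow(2).mean()
    opt.zero_grad()
    scaler.scale(loss).backward()      # scale to avoid fp16 gradient underflow
    scaler.step(opt)                   # unscales; skips the step on inf/nan
    scaler.update()                    # adapts the scale factor
```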
- """ - - def __init__(self, dataset_name): - """ - Args: - dataset_name (str): the name of the dataset. - It must have the following metadata associated with it: - "thing_classes", "gt_dir". - """ - self._metadata = MetadataCatalog.get(dataset_name) - self._cpu_device = torch.device("cpu") - self._logger = logging.getLogger(__name__) - - def reset(self): - self._working_dir = tempfile.TemporaryDirectory(prefix="cityscapes_eval_") - self._temp_dir = self._working_dir.name - # All workers will write to the same results directory - # TODO this does not work in distributed training - self._temp_dir = comm.all_gather(self._temp_dir)[0] - if self._temp_dir != self._working_dir.name: - self._working_dir.cleanup() - self._logger.info( - "Writing cityscapes results to temporary directory {} ...".format(self._temp_dir) - ) - - -class CityscapesInstanceEvaluator(CityscapesEvaluator): - """ - Evaluate instance segmentation results on cityscapes dataset using cityscapes API. - - Note: - * It does not work in multi-machine distributed training. - * It contains a synchronization, therefore has to be used on all ranks. - * Only the main process runs evaluation. - """ - - def process(self, inputs, outputs): - from cityscapesscripts.helpers.labels import name2label - - for input, output in zip(inputs, outputs): - file_name = input["file_name"] - basename = os.path.splitext(os.path.basename(file_name))[0] - pred_txt = os.path.join(self._temp_dir, basename + "_pred.txt") - - if "instances" in output: - output = output["instances"].to(self._cpu_device) - num_instances = len(output) - with open(pred_txt, "w") as fout: - for i in range(num_instances): - pred_class = output.pred_classes[i] - classes = self._metadata.thing_classes[pred_class] - class_id = name2label[classes].id - score = output.scores[i] - mask = output.pred_masks[i].numpy().astype("uint8") - png_filename = os.path.join( - self._temp_dir, basename + "_{}_{}.png".format(i, classes) - ) - - Image.fromarray(mask * 255).save(png_filename) - fout.write( - "{} {} {}\n".format(os.path.basename(png_filename), class_id, score) - ) - else: - # Cityscapes requires a prediction file for every ground truth image. - with open(pred_txt, "w") as fout: - pass - - def evaluate(self): - """ - Returns: - dict: has a key "segm", whose value is a dict of "AP" and "AP50". - """ - comm.synchronize() - if comm.get_rank() > 0: - return - import cityscapesscripts.evaluation.evalInstanceLevelSemanticLabeling as cityscapes_eval - - self._logger.info("Evaluating results under {} ...".format(self._temp_dir)) - - # set some global states in cityscapes evaluation API, before evaluating - cityscapes_eval.args.predictionPath = os.path.abspath(self._temp_dir) - cityscapes_eval.args.predictionWalk = None - cityscapes_eval.args.JSONOutput = False - cityscapes_eval.args.colorized = False - cityscapes_eval.args.gtInstancesFile = os.path.join(self._temp_dir, "gtInstances.json") - - # These lines are adopted from - # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py # noqa - gt_dir = PathManager.get_local_path(self._metadata.gt_dir) - groundTruthImgList = glob.glob(os.path.join(gt_dir, "*", "*_gtFine_instanceIds.png")) - assert len( - groundTruthImgList - ), "Cannot find any ground truth images to use for evaluation. 
Searched for: {}".format( - cityscapes_eval.args.groundTruthSearch - ) - predictionImgList = [] - for gt in groundTruthImgList: - predictionImgList.append(cityscapes_eval.getPrediction(gt, cityscapes_eval.args)) - results = cityscapes_eval.evaluateImgLists( - predictionImgList, groundTruthImgList, cityscapes_eval.args - )["averages"] - - ret = OrderedDict() - ret["segm"] = {"AP": results["allAp"] * 100, "AP50": results["allAp50%"] * 100} - self._working_dir.cleanup() - return ret - - -class CityscapesSemSegEvaluator(CityscapesEvaluator): - """ - Evaluate semantic segmentation results on cityscapes dataset using cityscapes API. - - Note: - * It does not work in multi-machine distributed training. - * It contains a synchronization, therefore has to be used on all ranks. - * Only the main process runs evaluation. - """ - - def process(self, inputs, outputs): - from cityscapesscripts.helpers.labels import trainId2label - - for input, output in zip(inputs, outputs): - file_name = input["file_name"] - basename = os.path.splitext(os.path.basename(file_name))[0] - pred_filename = os.path.join(self._temp_dir, basename + "_pred.png") - - output = output["sem_seg"].argmax(dim=0).to(self._cpu_device).numpy() - pred = 255 * np.ones(output.shape, dtype=np.uint8) - for train_id, label in trainId2label.items(): - if label.ignoreInEval: - continue - pred[output == train_id] = label.id - Image.fromarray(pred).save(pred_filename) - - def evaluate(self): - comm.synchronize() - if comm.get_rank() > 0: - return - # Load the Cityscapes eval script *after* setting the required env var, - # since the script reads CITYSCAPES_DATASET into global variables at load time. - import cityscapesscripts.evaluation.evalPixelLevelSemanticLabeling as cityscapes_eval - - self._logger.info("Evaluating results under {} ...".format(self._temp_dir)) - - # set some global states in cityscapes evaluation API, before evaluating - cityscapes_eval.args.predictionPath = os.path.abspath(self._temp_dir) - cityscapes_eval.args.predictionWalk = None - cityscapes_eval.args.JSONOutput = False - cityscapes_eval.args.colorized = False - - # These lines are adopted from - # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py # noqa - gt_dir = PathManager.get_local_path(self._metadata.gt_dir) - groundTruthImgList = glob.glob(os.path.join(gt_dir, "*", "*_gtFine_labelIds.png")) - assert len( - groundTruthImgList - ), "Cannot find any ground truth images to use for evaluation. 
Searched for: {}".format( - cityscapes_eval.args.groundTruthSearch - ) - predictionImgList = [] - for gt in groundTruthImgList: - predictionImgList.append(cityscapes_eval.getPrediction(cityscapes_eval.args, gt)) - results = cityscapes_eval.evaluateImgLists( - predictionImgList, groundTruthImgList, cityscapes_eval.args - ) - ret = OrderedDict() - ret["sem_seg"] = { - "IoU": 100.0 * results["averageScoreClasses"], - "iIoU": 100.0 * results["averageScoreInstClasses"], - "IoU_sup": 100.0 * results["averageScoreCategories"], - "iIoU_sup": 100.0 * results["averageScoreInstCategories"], - } - self._working_dir.cleanup() - return ret diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py deleted file mode 100755 index aad7f5a6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py +++ /dev/null @@ -1,710 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import copy -import io -import itertools -import json -import logging -import numpy as np -import os -import pickle -from collections import OrderedDict -import pycocotools.mask as mask_util -import torch -from pycocotools.coco import COCO -from pycocotools.cocoeval import COCOeval -from tabulate import tabulate - -import detectron2.utils.comm as comm -from detectron2.config import CfgNode -from detectron2.data import MetadataCatalog -from detectron2.data.datasets.coco import convert_to_coco_json -from detectron2.evaluation.fast_eval_api import COCOeval_opt -from detectron2.structures import Boxes, BoxMode, pairwise_iou -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import create_small_table - -from .evaluator import DatasetEvaluator - - -class COCOEvaluator(DatasetEvaluator): - """ - Evaluate AR for object proposals, AP for instance detection/segmentation, AP - for keypoint detection outputs using COCO's metrics. - See http://cocodataset.org/#detection-eval and - http://cocodataset.org/#keypoints-eval to understand its metrics. - The metrics range from 0 to 100 (instead of 0 to 1), where a -1 or NaN means - the metric cannot be computed (e.g. due to no predictions made). - - In addition to COCO, this evaluator is able to support any bounding box detection, - instance segmentation, or keypoint detection dataset. - """ - - def __init__( - self, - dataset_name, - tasks=None, - distributed=True, - output_dir=None, - *, - max_dets_per_image=None, - use_fast_impl=True, - kpt_oks_sigmas=(), - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - It must have either the following corresponding metadata: - - "json_file": the path to the COCO format annotation - - Or it must be in detectron2's standard dataset format - so it can be converted to COCO format automatically. - tasks (tuple[str]): tasks that can be evaluated under the given - configuration. A task is one of "bbox", "segm", "keypoints". - By default, will infer this automatically from predictions. - distributed (True): if True, will collect results from all ranks and run evaluation - in the main process. - Otherwise, will only evaluate the results in the current process. - output_dir (str): optional, an output directory to dump all - results predicted on the dataset. The dump contains two files: - - 1. 
"instances_predictions.pth" a file that can be loaded with `torch.load` and - contains all the results in the format they are produced by the model. - 2. "coco_instances_results.json" a json file in COCO's result format. - max_dets_per_image (int): limit on the maximum number of detections per image. - By default in COCO, this limit is to 100, but this can be customized - to be greater, as is needed in evaluation metrics AP fixed and AP pool - (see https://arxiv.org/pdf/2102.01066.pdf) - This doesn't affect keypoint evaluation. - use_fast_impl (bool): use a fast but **unofficial** implementation to compute AP. - Although the results should be very close to the official implementation in COCO - API, it is still recommended to compute results with the official API for use in - papers. The faster implementation also uses more RAM. - kpt_oks_sigmas (list[float]): The sigmas used to calculate keypoint OKS. - See http://cocodataset.org/#keypoints-eval - When empty, it will use the defaults in COCO. - Otherwise it should be the same length as ROI_KEYPOINT_HEAD.NUM_KEYPOINTS. - """ - self._logger = logging.getLogger(__name__) - self._distributed = distributed - self._output_dir = output_dir - self._use_fast_impl = use_fast_impl - - # COCOeval requires the limit on the number of detections per image (maxDets) to be a list - # with at least 3 elements. The default maxDets in COCOeval is [1, 10, 100], in which the - # 3rd element (100) is used as the limit on the number of detections per image when - # evaluating AP. COCOEvaluator expects an integer for max_dets_per_image, so for COCOeval, - # we reformat max_dets_per_image into [1, 10, max_dets_per_image], based on the defaults. - if max_dets_per_image is None: - max_dets_per_image = [1, 10, 100] - else: - max_dets_per_image = [1, 10, max_dets_per_image] - self._max_dets_per_image = max_dets_per_image - - if tasks is not None and isinstance(tasks, CfgNode): - kpt_oks_sigmas = ( - tasks.TEST.KEYPOINT_OKS_SIGMAS if not kpt_oks_sigmas else kpt_oks_sigmas - ) - self._logger.warn( - "COCO Evaluator instantiated using config, this is deprecated behavior." - " Please pass in explicit arguments instead." - ) - self._tasks = None # Infering it from predictions should be better - else: - self._tasks = tasks - - self._cpu_device = torch.device("cpu") - - self._metadata = MetadataCatalog.get(dataset_name) - if not hasattr(self._metadata, "json_file"): - if output_dir is None: - raise ValueError( - "output_dir must be provided to COCOEvaluator " - "for datasets not in COCO format." - ) - self._logger.info(f"Trying to convert '{dataset_name}' to COCO format ...") - - cache_path = os.path.join(output_dir, f"{dataset_name}_coco_format.json") - self._metadata.json_file = cache_path - convert_to_coco_json(dataset_name, cache_path) - - json_file = PathManager.get_local_path(self._metadata.json_file) - with contextlib.redirect_stdout(io.StringIO()): - self._coco_api = COCO(json_file) - - # Test set json files do not contain annotations (evaluation must be - # performed using the COCO evaluation server). - self._do_evaluation = "annotations" in self._coco_api.dataset - if self._do_evaluation: - self._kpt_oks_sigmas = kpt_oks_sigmas - - def reset(self): - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a COCO model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a COCO model. 
It is a list of dicts with key - "instances" that contains :class:`Instances`. - """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - prediction["instances"] = instances_to_coco_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - if len(prediction) > 1: - self._predictions.append(prediction) - - def evaluate(self, img_ids=None): - """ - Args: - img_ids: a list of image IDs to evaluate on. Default to None for the whole dataset - """ - if self._distributed: - comm.synchronize() - predictions = comm.gather(self._predictions, dst=0) - predictions = list(itertools.chain(*predictions)) - - if not comm.is_main_process(): - return {} - else: - predictions = self._predictions - - if len(predictions) == 0: - self._logger.warning("[COCOEvaluator] Did not receive valid predictions.") - return {} - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "instances_predictions.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(predictions, f) - - self._results = OrderedDict() - if "proposals" in predictions[0]: - self._eval_box_proposals(predictions) - if "instances" in predictions[0]: - self._eval_predictions(predictions, img_ids=img_ids) - # Copy so the caller can do whatever with results - return copy.deepcopy(self._results) - - def _tasks_from_predictions(self, predictions): - """ - Get COCO API "tasks" (i.e. iou_type) from COCO-format predictions. - """ - tasks = {"bbox"} - for pred in predictions: - if "segmentation" in pred: - tasks.add("segm") - if "keypoints" in pred: - tasks.add("keypoints") - return sorted(tasks) - - def _eval_predictions(self, predictions, img_ids=None): - """ - Evaluate predictions. Fill self._results with the metrics of the tasks. - """ - self._logger.info("Preparing results for COCO format ...") - coco_results = list(itertools.chain(*[x["instances"] for x in predictions])) - tasks = self._tasks or self._tasks_from_predictions(coco_results) - - # unmap the category ids for COCO - if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - dataset_id_to_contiguous_id = self._metadata.thing_dataset_id_to_contiguous_id - all_contiguous_ids = list(dataset_id_to_contiguous_id.values()) - num_classes = len(all_contiguous_ids) - assert min(all_contiguous_ids) == 0 and max(all_contiguous_ids) == num_classes - 1 - - reverse_id_mapping = {v: k for k, v in dataset_id_to_contiguous_id.items()} - for result in coco_results: - category_id = result["category_id"] - assert category_id < num_classes, ( - f"A prediction has class={category_id}, " - f"but the dataset only has {num_classes} classes and " - f"predicted class id should be in [0, {num_classes - 1}]." 
- ) - result["category_id"] = reverse_id_mapping[category_id] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "coco_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(coco_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info( - "Evaluating predictions with {} COCO API...".format( - "unofficial" if self._use_fast_impl else "official" - ) - ) - for task in sorted(tasks): - assert task in {"bbox", "segm", "keypoints"}, f"Got unknown task: {task}!" - coco_eval = ( - _evaluate_predictions_on_coco( - self._coco_api, - coco_results, - task, - kpt_oks_sigmas=self._kpt_oks_sigmas, - use_fast_impl=self._use_fast_impl, - img_ids=img_ids, - max_dets_per_image=self._max_dets_per_image, - ) - if len(coco_results) > 0 - else None # cocoapi does not handle empty results very well - ) - - res = self._derive_coco_results( - coco_eval, task, class_names=self._metadata.get("thing_classes") - ) - self._results[task] = res - - def _eval_box_proposals(self, predictions): - """ - Evaluate the box proposals in predictions. - Fill self._results with the metrics for "box_proposals" task. - """ - if self._output_dir: - # Saving generated box proposals to file. - # Predicted box_proposals are in XYXY_ABS mode. - bbox_mode = BoxMode.XYXY_ABS.value - ids, boxes, objectness_logits = [], [], [] - for prediction in predictions: - ids.append(prediction["image_id"]) - boxes.append(prediction["proposals"].proposal_boxes.tensor.numpy()) - objectness_logits.append(prediction["proposals"].objectness_logits.numpy()) - - proposal_data = { - "boxes": boxes, - "objectness_logits": objectness_logits, - "ids": ids, - "bbox_mode": bbox_mode, - } - with PathManager.open(os.path.join(self._output_dir, "box_proposals.pkl"), "wb") as f: - pickle.dump(proposal_data, f) - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating bbox proposals ...") - res = {} - areas = {"all": "", "small": "s", "medium": "m", "large": "l"} - for limit in [100, 1000]: - for area, suffix in areas.items(): - stats = _evaluate_box_proposals(predictions, self._coco_api, area=area, limit=limit) - key = "AR{}@{:d}".format(suffix, limit) - res[key] = float(stats["ar"].item() * 100) - self._logger.info("Proposal metrics: \n" + create_small_table(res)) - self._results["box_proposals"] = res - - def _derive_coco_results(self, coco_eval, iou_type, class_names=None): - """ - Derive the desired score numbers from summarized COCOeval. - - Args: - coco_eval (None or COCOEval): None represents no predictions from model. - iou_type (str): - class_names (None or list[str]): if provided, will use it to predict - per-category AP. 
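End to end, the `COCOEvaluator` above is usually driven through `inference_on_dataset`. A minimal sketch, where `cfg` and `model` are a populated detectron2 config and a trained model (both assumed to exist) and the dataset name is the standard COCO val split:

```
from detectron2.data import build_detection_test_loader
from detectron2.evaluation import COCOEvaluator, inference_on_dataset

evaluator = COCOEvaluator("coco_2017_val", output_dir="./output")
val_loader = build_detection_test_loader(cfg, "coco_2017_val")
results = inference_on_dataset(model, val_loader, evaluator)
print(results["bbox"]["AP"], results["bbox"]["AP50"])
```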
- - Returns: - a dict of {metric name: score} - """ - - metrics = { - "bbox": ["AP", "AP50", "AP75", "APs", "APm", "APl"], - "segm": ["AP", "AP50", "AP75", "APs", "APm", "APl"], - "keypoints": ["AP", "AP50", "AP75", "APm", "APl"], - }[iou_type] - - if coco_eval is None: - self._logger.warn("No predictions from the model!") - return {metric: float("nan") for metric in metrics} - - # the standard metrics - results = { - metric: float(coco_eval.stats[idx] * 100 if coco_eval.stats[idx] >= 0 else "nan") - for idx, metric in enumerate(metrics) - } - self._logger.info( - "Evaluation results for {}: \n".format(iou_type) + create_small_table(results) - ) - if not np.isfinite(sum(results.values())): - self._logger.info("Some metrics cannot be computed and is shown as NaN.") - - if class_names is None or len(class_names) <= 1: - return results - # Compute per-category AP - # from https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 # noqa - precisions = coco_eval.eval["precision"] - # precision has dims (iou, recall, cls, area range, max dets) - assert len(class_names) == precisions.shape[2] - - results_per_category = [] - for idx, name in enumerate(class_names): - # area range index 0: all area ranges - # max dets index -1: typically 100 per image - precision = precisions[:, :, idx, 0, -1] - precision = precision[precision > -1] - ap = np.mean(precision) if precision.size else float("nan") - results_per_category.append(("{}".format(name), float(ap * 100))) - - # tabulate it - N_COLS = min(6, len(results_per_category) * 2) - results_flatten = list(itertools.chain(*results_per_category)) - results_2d = itertools.zip_longest(*[results_flatten[i::N_COLS] for i in range(N_COLS)]) - table = tabulate( - results_2d, - tablefmt="pipe", - floatfmt=".3f", - headers=["category", "AP"] * (N_COLS // 2), - numalign="left", - ) - self._logger.info("Per-category {} AP: \n".format(iou_type) + table) - - results.update({"AP-" + name: ap for name, ap in results_per_category}) - return results - - -def instances_to_coco_json(instances, img_id): - """ - Dump an "Instances" object to a COCO-format json that's used for evaluation. - - Args: - instances (Instances): - img_id (int): the image id - - Returns: - list[dict]: list of json annotations in COCO format. - """ - num_instance = len(instances) - if num_instance == 0: - return [] - - boxes = instances.pred_boxes.tensor.numpy() - boxes = BoxMode.convert(boxes, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - boxes = boxes.tolist() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - - has_mask = instances.has("pred_masks") - if has_mask: - # use RLE to encode the masks, because they are too large and takes memory - # since this evaluator stores outputs of the entire dataset - rles = [ - mask_util.encode(np.array(mask[:, :, None], order="F", dtype="uint8"))[0] - for mask in instances.pred_masks - ] - for rle in rles: - # "counts" is an array encoded by mask_util as a byte-stream. Python3's - # json writer which always produces strings cannot serialize a bytestream - # unless you decode it. Thankfully, utf-8 works out (which is also what - # the pycocotools/_mask.pyx does). 
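The RLE handling above (encode a fortran-ordered binary mask, then decode `counts` to UTF-8 so it is JSON-serializable) can be exercised standalone; a small round-trip sketch with a random mask (shapes illustrative):

```
import numpy as np
import pycocotools.mask as mask_util

mask = (np.random.rand(480, 640) > 0.5).astype("uint8")
rle = mask_util.encode(np.asarray(mask[:, :, None], order="F"))[0]
restored = mask_util.decode(rle)               # back to an HxW uint8 array
assert (restored == mask).all()
rle["counts"] = rle["counts"].decode("utf-8")  # now JSON-serializable
```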
- rle["counts"] = rle["counts"].decode("utf-8") - - has_keypoints = instances.has("pred_keypoints") - if has_keypoints: - keypoints = instances.pred_keypoints - - results = [] - for k in range(num_instance): - result = { - "image_id": img_id, - "category_id": classes[k], - "bbox": boxes[k], - "score": scores[k], - } - if has_mask: - result["segmentation"] = rles[k] - if has_keypoints: - # In COCO annotations, - # keypoints coordinates are pixel indices. - # However our predictions are floating point coordinates. - # Therefore we subtract 0.5 to be consistent with the annotation format. - # This is the inverse of data loading logic in `datasets/coco.py`. - keypoints[k][:, :2] -= 0.5 - result["keypoints"] = keypoints[k].flatten().tolist() - results.append(result) - return results - - -# inspired from Detectron: -# https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 # noqa -def _evaluate_box_proposals(dataset_predictions, coco_api, thresholds=None, area="all", limit=None): - """ - Evaluate detection proposal recall metrics. This function is a much - faster alternative to the official COCO API recall evaluation code. However, - it produces slightly different results. - """ - # Record max overlap value for each gt box - # Return vector of overlap values - areas = { - "all": 0, - "small": 1, - "medium": 2, - "large": 3, - "96-128": 4, - "128-256": 5, - "256-512": 6, - "512-inf": 7, - } - area_ranges = [ - [0 ** 2, 1e5 ** 2], # all - [0 ** 2, 32 ** 2], # small - [32 ** 2, 96 ** 2], # medium - [96 ** 2, 1e5 ** 2], # large - [96 ** 2, 128 ** 2], # 96-128 - [128 ** 2, 256 ** 2], # 128-256 - [256 ** 2, 512 ** 2], # 256-512 - [512 ** 2, 1e5 ** 2], - ] # 512-inf - assert area in areas, "Unknown area range: {}".format(area) - area_range = area_ranges[areas[area]] - gt_overlaps = [] - num_pos = 0 - - for prediction_dict in dataset_predictions: - predictions = prediction_dict["proposals"] - - # sort predictions in descending order - # TODO maybe remove this and make it explicit in the documentation - inds = predictions.objectness_logits.sort(descending=True)[1] - predictions = predictions[inds] - - ann_ids = coco_api.getAnnIds(imgIds=prediction_dict["image_id"]) - anno = coco_api.loadAnns(ann_ids) - gt_boxes = [ - BoxMode.convert(obj["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) - for obj in anno - if obj["iscrowd"] == 0 - ] - gt_boxes = torch.as_tensor(gt_boxes).reshape(-1, 4) # guard against no boxes - gt_boxes = Boxes(gt_boxes) - gt_areas = torch.as_tensor([obj["area"] for obj in anno if obj["iscrowd"] == 0]) - - if len(gt_boxes) == 0 or len(predictions) == 0: - continue - - valid_gt_inds = (gt_areas >= area_range[0]) & (gt_areas <= area_range[1]) - gt_boxes = gt_boxes[valid_gt_inds] - - num_pos += len(gt_boxes) - - if len(gt_boxes) == 0: - continue - - if limit is not None and len(predictions) > limit: - predictions = predictions[:limit] - - overlaps = pairwise_iou(predictions.proposal_boxes, gt_boxes) - - _gt_overlaps = torch.zeros(len(gt_boxes)) - for j in range(min(len(predictions), len(gt_boxes))): - # find which proposal box maximally covers each gt box - # and get the iou amount of coverage for each gt box - max_overlaps, argmax_overlaps = overlaps.max(dim=0) - - # find which gt box is 'best' covered (i.e. 
'best' = most iou) - gt_ovr, gt_ind = max_overlaps.max(dim=0) - assert gt_ovr >= 0 - # find the proposal box that covers the best covered gt box - box_ind = argmax_overlaps[gt_ind] - # record the iou coverage of this gt box - _gt_overlaps[j] = overlaps[box_ind, gt_ind] - assert _gt_overlaps[j] == gt_ovr - # mark the proposal box and the gt box as used - overlaps[box_ind, :] = -1 - overlaps[:, gt_ind] = -1 - - # append recorded iou coverage level - gt_overlaps.append(_gt_overlaps) - gt_overlaps = ( - torch.cat(gt_overlaps, dim=0) if len(gt_overlaps) else torch.zeros(0, dtype=torch.float32) - ) - gt_overlaps, _ = torch.sort(gt_overlaps) - - if thresholds is None: - step = 0.05 - thresholds = torch.arange(0.5, 0.95 + 1e-5, step, dtype=torch.float32) - recalls = torch.zeros_like(thresholds) - # compute recall for each iou threshold - for i, t in enumerate(thresholds): - recalls[i] = (gt_overlaps >= t).float().sum() / float(num_pos) - # ar = 2 * np.trapz(recalls, thresholds) - ar = recalls.mean() - return { - "ar": ar, - "recalls": recalls, - "thresholds": thresholds, - "gt_overlaps": gt_overlaps, - "num_pos": num_pos, - } - - -def _evaluate_predictions_on_coco( - coco_gt, - coco_results, - iou_type, - kpt_oks_sigmas=None, - use_fast_impl=True, - img_ids=None, - max_dets_per_image=None, -): - """ - Evaluate the coco results using COCOEval API. - """ - assert len(coco_results) > 0 - - if iou_type == "segm": - coco_results = copy.deepcopy(coco_results) - # When evaluating mask AP, if the results contain bbox, cocoapi will - # use the box area as the area of the instance, instead of the mask area. - # This leads to a different definition of small/medium/large. - # We remove the bbox field to let mask AP use mask area. - for c in coco_results: - c.pop("bbox", None) - - coco_dt = coco_gt.loadRes(coco_results) - coco_eval = (COCOeval_opt if use_fast_impl else COCOeval)(coco_gt, coco_dt, iou_type) - # For COCO, the default max_dets_per_image is [1, 10, 100]. - if max_dets_per_image is None: - max_dets_per_image = [1, 10, 100] # Default from COCOEval - else: - assert ( - len(max_dets_per_image) >= 3 - ), "COCOeval requires maxDets (and max_dets_per_image) to have length at least 3" - # In the case that user supplies a custom input for max_dets_per_image, - # apply COCOevalMaxDets to evaluate AP with the custom input. - if max_dets_per_image[2] != 100: - coco_eval = COCOevalMaxDets(coco_gt, coco_dt, iou_type) - if iou_type != "keypoints": - coco_eval.params.maxDets = max_dets_per_image - - if img_ids is not None: - coco_eval.params.imgIds = img_ids - - if iou_type == "keypoints": - # Use the COCO default keypoint OKS sigmas unless overrides are specified - if kpt_oks_sigmas: - assert hasattr(coco_eval.params, "kpt_oks_sigmas"), "pycocotools is too old!" - coco_eval.params.kpt_oks_sigmas = np.array(kpt_oks_sigmas) - # COCOAPI requires every detection and every gt to have keypoints, so - # we just take the first entry from both - num_keypoints_dt = len(coco_results[0]["keypoints"]) // 3 - num_keypoints_gt = len(next(iter(coco_gt.anns.values()))["keypoints"]) // 3 - num_keypoints_oks = len(coco_eval.params.kpt_oks_sigmas) - assert num_keypoints_oks == num_keypoints_dt == num_keypoints_gt, ( - f"[COCOEvaluator] Prediction contain {num_keypoints_dt} keypoints. " - f"Ground truth contains {num_keypoints_gt} keypoints. " - f"The length of cfg.TEST.KEYPOINT_OKS_SIGMAS is {num_keypoints_oks}. " - "They have to agree with each other. 
For meaning of OKS, please refer to " - "http://cocodataset.org/#keypoints-eval." - ) - - coco_eval.evaluate() - coco_eval.accumulate() - coco_eval.summarize() - - return coco_eval - - -class COCOevalMaxDets(COCOeval): - """ - Modified version of COCOeval for evaluating AP with a custom - maxDets (by default for COCO, maxDets is 100) - """ - - def summarize(self): - """ - Compute and display summary metrics for evaluation results given - a custom value for max_dets_per_image - """ - - def _summarize(ap=1, iouThr=None, areaRng="all", maxDets=100): - p = self.params - iStr = " {:<18} {} @[ IoU={:<9} | area={:>6s} | maxDets={:>3d} ] = {:0.3f}" - titleStr = "Average Precision" if ap == 1 else "Average Recall" - typeStr = "(AP)" if ap == 1 else "(AR)" - iouStr = ( - "{:0.2f}:{:0.2f}".format(p.iouThrs[0], p.iouThrs[-1]) - if iouThr is None - else "{:0.2f}".format(iouThr) - ) - - aind = [i for i, aRng in enumerate(p.areaRngLbl) if aRng == areaRng] - mind = [i for i, mDet in enumerate(p.maxDets) if mDet == maxDets] - if ap == 1: - # dimension of precision: [TxRxKxAxM] - s = self.eval["precision"] - # IoU - if iouThr is not None: - t = np.where(iouThr == p.iouThrs)[0] - s = s[t] - s = s[:, :, :, aind, mind] - else: - # dimension of recall: [TxKxAxM] - s = self.eval["recall"] - if iouThr is not None: - t = np.where(iouThr == p.iouThrs)[0] - s = s[t] - s = s[:, :, aind, mind] - if len(s[s > -1]) == 0: - mean_s = -1 - else: - mean_s = np.mean(s[s > -1]) - print(iStr.format(titleStr, typeStr, iouStr, areaRng, maxDets, mean_s)) - return mean_s - - def _summarizeDets(): - stats = np.zeros((12,)) - # Evaluate AP using the custom limit on maximum detections per image - stats[0] = _summarize(1, maxDets=self.params.maxDets[2]) - stats[1] = _summarize(1, iouThr=0.5, maxDets=self.params.maxDets[2]) - stats[2] = _summarize(1, iouThr=0.75, maxDets=self.params.maxDets[2]) - stats[3] = _summarize(1, areaRng="small", maxDets=self.params.maxDets[2]) - stats[4] = _summarize(1, areaRng="medium", maxDets=self.params.maxDets[2]) - stats[5] = _summarize(1, areaRng="large", maxDets=self.params.maxDets[2]) - stats[6] = _summarize(0, maxDets=self.params.maxDets[0]) - stats[7] = _summarize(0, maxDets=self.params.maxDets[1]) - stats[8] = _summarize(0, maxDets=self.params.maxDets[2]) - stats[9] = _summarize(0, areaRng="small", maxDets=self.params.maxDets[2]) - stats[10] = _summarize(0, areaRng="medium", maxDets=self.params.maxDets[2]) - stats[11] = _summarize(0, areaRng="large", maxDets=self.params.maxDets[2]) - return stats - - def _summarizeKps(): - stats = np.zeros((10,)) - stats[0] = _summarize(1, maxDets=20) - stats[1] = _summarize(1, maxDets=20, iouThr=0.5) - stats[2] = _summarize(1, maxDets=20, iouThr=0.75) - stats[3] = _summarize(1, maxDets=20, areaRng="medium") - stats[4] = _summarize(1, maxDets=20, areaRng="large") - stats[5] = _summarize(0, maxDets=20) - stats[6] = _summarize(0, maxDets=20, iouThr=0.5) - stats[7] = _summarize(0, maxDets=20, iouThr=0.75) - stats[8] = _summarize(0, maxDets=20, areaRng="medium") - stats[9] = _summarize(0, maxDets=20, areaRng="large") - return stats - - if not self.eval: - raise Exception("Please run accumulate() first") - iouType = self.params.iouType - if iouType == "segm" or iouType == "bbox": - summarize = _summarizeDets - elif iouType == "keypoints": - summarize = _summarizeKps - self.stats = summarize() - - def __str__(self): - self.summarize() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/evaluator.py 
b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/evaluator.py deleted file mode 100755 index baf99600..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/evaluator.py +++ /dev/null @@ -1,224 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import datetime -import logging -import time -from collections import OrderedDict, abc -from contextlib import ExitStack, contextmanager -from typing import List, Union -import torch -from torch import nn - -from detectron2.utils.comm import get_world_size, is_main_process -from detectron2.utils.logger import log_every_n_seconds - - -class DatasetEvaluator: - """ - Base class for a dataset evaluator. - - The function :func:`inference_on_dataset` runs the model over - all samples in the dataset, and have a DatasetEvaluator to process the inputs/outputs. - - This class will accumulate information of the inputs/outputs (by :meth:`process`), - and produce evaluation results in the end (by :meth:`evaluate`). - """ - - def reset(self): - """ - Preparation for a new round of evaluation. - Should be called before starting a round of evaluation. - """ - pass - - def process(self, inputs, outputs): - """ - Process the pair of inputs and outputs. - If they contain batches, the pairs can be consumed one-by-one using `zip`: - - .. code-block:: python - - for input_, output in zip(inputs, outputs): - # do evaluation on single input/output pair - ... - - Args: - inputs (list): the inputs that's used to call the model. - outputs (list): the return value of `model(inputs)` - """ - pass - - def evaluate(self): - """ - Evaluate/summarize the performance, after processing all input/output pairs. - - Returns: - dict: - A new evaluator class can return a dict of arbitrary format - as long as the user can process the results. - In our train_net.py, we expect the following format: - - * key: the name of the task (e.g., bbox) - * value: a dict of {metric name: score}, e.g.: {"AP50": 80} - """ - pass - - -class DatasetEvaluators(DatasetEvaluator): - """ - Wrapper class to combine multiple :class:`DatasetEvaluator` instances. - - This class dispatches every evaluation call to - all of its :class:`DatasetEvaluator`. - """ - - def __init__(self, evaluators): - """ - Args: - evaluators (list): the evaluators to combine. - """ - super().__init__() - self._evaluators = evaluators - - def reset(self): - for evaluator in self._evaluators: - evaluator.reset() - - def process(self, inputs, outputs): - for evaluator in self._evaluators: - evaluator.process(inputs, outputs) - - def evaluate(self): - results = OrderedDict() - for evaluator in self._evaluators: - result = evaluator.evaluate() - if is_main_process() and result is not None: - for k, v in result.items(): - assert ( - k not in results - ), "Different evaluators produce results with the same key {}".format(k) - results[k] = v - return results - - -def inference_on_dataset( - model, data_loader, evaluator: Union[DatasetEvaluator, List[DatasetEvaluator], None] -): - """ - Run model on the data_loader and evaluate the metrics with evaluator. - Also benchmark the inference speed of `model.__call__` accurately. - The model will be used in eval mode. - - Args: - model (callable): a callable which takes an object from - `data_loader` and returns some outputs. - - If it's an nn.Module, it will be temporarily set to `eval` mode. 
- If you wish to evaluate a model in `training` mode instead, you can - wrap the given model and override its behavior of `.eval()` and `.train()`. - data_loader: an iterable object with a length. - The elements it generates will be the inputs to the model. - evaluator: the evaluator(s) to run. Use `None` if you only want to benchmark, - but don't want to do any evaluation. - - Returns: - The return value of `evaluator.evaluate()` - """ - num_devices = get_world_size() - logger = logging.getLogger(__name__) - logger.info("Start inference on {} batches".format(len(data_loader))) - - total = len(data_loader) # inference data loader must have a fixed length - if evaluator is None: - # create a no-op evaluator - evaluator = DatasetEvaluators([]) - if isinstance(evaluator, abc.MutableSequence): - evaluator = DatasetEvaluators(evaluator) - evaluator.reset() - - num_warmup = min(5, total - 1) - start_time = time.perf_counter() - total_data_time = 0 - total_compute_time = 0 - total_eval_time = 0 - with ExitStack() as stack: - if isinstance(model, nn.Module): - stack.enter_context(inference_context(model)) - stack.enter_context(torch.no_grad()) - - start_data_time = time.perf_counter() - for idx, inputs in enumerate(data_loader): - total_data_time += time.perf_counter() - start_data_time - if idx == num_warmup: - start_time = time.perf_counter() - total_data_time = 0 - total_compute_time = 0 - total_eval_time = 0 - - start_compute_time = time.perf_counter() - outputs = model(inputs) - if torch.cuda.is_available(): - torch.cuda.synchronize() - total_compute_time += time.perf_counter() - start_compute_time - - start_eval_time = time.perf_counter() - evaluator.process(inputs, outputs) - total_eval_time += time.perf_counter() - start_eval_time - - iters_after_start = idx + 1 - num_warmup * int(idx >= num_warmup) - data_seconds_per_iter = total_data_time / iters_after_start - compute_seconds_per_iter = total_compute_time / iters_after_start - eval_seconds_per_iter = total_eval_time / iters_after_start - total_seconds_per_iter = (time.perf_counter() - start_time) / iters_after_start - if idx >= num_warmup * 2 or compute_seconds_per_iter > 5: - eta = datetime.timedelta(seconds=int(total_seconds_per_iter * (total - idx - 1))) - log_every_n_seconds( - logging.INFO, - ( - f"Inference done {idx + 1}/{total}. " - f"Dataloading: {data_seconds_per_iter:.4f} s/iter. " - f"Inference: {compute_seconds_per_iter:.4f} s/iter. " - f"Eval: {eval_seconds_per_iter:.4f} s/iter. " - f"Total: {total_seconds_per_iter:.4f} s/iter. " - f"ETA={eta}" - ), - n=5, - ) - start_data_time = time.perf_counter() - - # Measure the time only for this worker (before the synchronization barrier) - total_time = time.perf_counter() - start_time - total_time_str = str(datetime.timedelta(seconds=total_time)) - # NOTE this format is parsed by grep - logger.info( - "Total inference time: {} ({:.6f} s / iter per device, on {} devices)".format( - total_time_str, total_time / (total - num_warmup), num_devices - ) - ) - total_compute_time_str = str(datetime.timedelta(seconds=int(total_compute_time))) - logger.info( - "Total inference pure compute time: {} ({:.6f} s / iter per device, on {} devices)".format( - total_compute_time_str, total_compute_time / (total - num_warmup), num_devices - ) - ) - - results = evaluator.evaluate() - # An evaluator may return None when not in main process. 
- # Replace it by an empty dict instead to make it easier for downstream code to handle - if results is None: - results = {} - return results - - -@contextmanager -def inference_context(model): - """ - A context where the model is temporarily changed to eval mode, - and restored to previous mode afterwards. - - Args: - model: a torch Module - """ - training_mode = model.training - model.eval() - yield - model.train(training_mode) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py deleted file mode 100755 index 2eb202bd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py +++ /dev/null @@ -1,121 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import numpy as np -import time -from pycocotools.cocoeval import COCOeval - -from detectron2 import _C - -logger = logging.getLogger(__name__) - - -class COCOeval_opt(COCOeval): - """ - This is a slightly modified version of the original COCO API, where the functions evaluateImg() - and accumulate() are implemented in C++ to speedup evaluation - """ - - def evaluate(self): - """ - Run per image evaluation on given images and store results in self.evalImgs_cpp, a - datastructure that isn't readable from Python but is used by a c++ implementation of - accumulate(). Unlike the original COCO PythonAPI, we don't populate the datastructure - self.evalImgs because this datastructure is a computational bottleneck. - :return: None - """ - tic = time.time() - - p = self.params - # add backward compatibility if useSegm is specified in params - if p.useSegm is not None: - p.iouType = "segm" if p.useSegm == 1 else "bbox" - logger.info("Evaluate annotation type *{}*".format(p.iouType)) - p.imgIds = list(np.unique(p.imgIds)) - if p.useCats: - p.catIds = list(np.unique(p.catIds)) - p.maxDets = sorted(p.maxDets) - self.params = p - - self._prepare() # bottleneck - - # loop through images, area range, max detection number - catIds = p.catIds if p.useCats else [-1] - - if p.iouType == "segm" or p.iouType == "bbox": - computeIoU = self.computeIoU - elif p.iouType == "keypoints": - computeIoU = self.computeOks - self.ious = { - (imgId, catId): computeIoU(imgId, catId) for imgId in p.imgIds for catId in catIds - } # bottleneck - - maxDet = p.maxDets[-1] - - # <<<< Beginning of code differences with original COCO API - def convert_instances_to_cpp(instances, is_det=False): - # Convert annotations for a list of instances in an image to a format that's fast - # to access in C++ - instances_cpp = [] - for instance in instances: - instance_cpp = _C.InstanceAnnotation( - int(instance["id"]), - instance["score"] if is_det else instance.get("score", 0.0), - instance["area"], - bool(instance.get("iscrowd", 0)), - bool(instance.get("ignore", 0)), - ) - instances_cpp.append(instance_cpp) - return instances_cpp - - # Convert GT annotations, detections, and IOUs to a format that's fast to access in C++ - ground_truth_instances = [ - [convert_instances_to_cpp(self._gts[imgId, catId]) for catId in p.catIds] - for imgId in p.imgIds - ] - detected_instances = [ - [convert_instances_to_cpp(self._dts[imgId, catId], is_det=True) for catId in p.catIds] - for imgId in p.imgIds - ] - ious = [[self.ious[imgId, catId] for catId in catIds] for imgId in p.imgIds] - - if not p.useCats: - # For each image, flatten per-category lists into a 
single list - ground_truth_instances = [[[o for c in i for o in c]] for i in ground_truth_instances] - detected_instances = [[[o for c in i for o in c]] for i in detected_instances] - - # Call C++ implementation of self.evaluateImgs() - self._evalImgs_cpp = _C.COCOevalEvaluateImages( - p.areaRng, maxDet, p.iouThrs, ious, ground_truth_instances, detected_instances - ) - self._evalImgs = None - - self._paramsEval = copy.deepcopy(self.params) - toc = time.time() - logger.info("COCOeval_opt.evaluate() finished in {:0.2f} seconds.".format(toc - tic)) - # >>>> End of code differences with original COCO API - - def accumulate(self): - """ - Accumulate per image evaluation results and store the result in self.eval. Does not - support changing parameter settings from those used by self.evaluate() - """ - logger.info("Accumulating evaluation results...") - tic = time.time() - assert hasattr( - self, "_evalImgs_cpp" - ), "evaluate() must be called before accumulate() is called." - - self.eval = _C.COCOevalAccumulate(self._paramsEval, self._evalImgs_cpp) - - # recall is num_iou_thresholds X num_categories X num_area_ranges X num_max_detections - self.eval["recall"] = np.array(self.eval["recall"]).reshape( - self.eval["counts"][:1] + self.eval["counts"][2:] - ) - - # precision and scores are num_iou_thresholds X num_recall_thresholds X num_categories X - # num_area_ranges X num_max_detections - self.eval["precision"] = np.array(self.eval["precision"]).reshape(self.eval["counts"]) - self.eval["scores"] = np.array(self.eval["scores"]).reshape(self.eval["counts"]) - toc = time.time() - logger.info("COCOeval_opt.accumulate() finished in {:0.2f} seconds.".format(toc - tic)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py deleted file mode 100755 index 0604feaa..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py +++ /dev/null @@ -1,380 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import json -import logging -import os -import pickle -from collections import OrderedDict -import torch - -import detectron2.utils.comm as comm -from detectron2.config import CfgNode -from detectron2.data import MetadataCatalog -from detectron2.structures import Boxes, BoxMode, pairwise_iou -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import create_small_table - -from .coco_evaluation import instances_to_coco_json -from .evaluator import DatasetEvaluator - - -class LVISEvaluator(DatasetEvaluator): - """ - Evaluate object proposal and instance detection/segmentation outputs using - LVIS's metrics and evaluation API. - """ - - def __init__( - self, - dataset_name, - tasks=None, - distributed=True, - output_dir=None, - *, - max_dets_per_image=None, - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - It must have the following corresponding metadata: - "json_file": the path to the LVIS format annotation - tasks (tuple[str]): tasks that can be evaluated under the given - configuration. A task is one of "bbox", "segm". - By default, will infer this automatically from predictions. - distributed (True): if True, will collect results from all ranks for evaluation. - Otherwise, will evaluate the results in the current process. - output_dir (str): optional, an output directory to dump results.
- max_dets_per_image (None or int): limit on maximum detections per image in evaluating AP - This limit, by default of the LVIS dataset, is 300. - """ - from lvis import LVIS - - self._logger = logging.getLogger(__name__) - - if tasks is not None and isinstance(tasks, CfgNode): - self._logger.warn( - "COCO Evaluator instantiated using config, this is deprecated behavior." - " Please pass in explicit arguments instead." - ) - self._tasks = None # Infering it from predictions should be better - else: - self._tasks = tasks - - self._distributed = distributed - self._output_dir = output_dir - self._max_dets_per_image = max_dets_per_image - - self._cpu_device = torch.device("cpu") - - self._metadata = MetadataCatalog.get(dataset_name) - json_file = PathManager.get_local_path(self._metadata.json_file) - self._lvis_api = LVIS(json_file) - # Test set json files do not contain annotations (evaluation must be - # performed using the LVIS evaluation server). - self._do_evaluation = len(self._lvis_api.get_ann_ids()) > 0 - - def reset(self): - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a LVIS model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a LVIS model. It is a list of dicts with key - "instances" that contains :class:`Instances`. - """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - prediction["instances"] = instances_to_coco_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - self._predictions.append(prediction) - - def evaluate(self): - if self._distributed: - comm.synchronize() - predictions = comm.gather(self._predictions, dst=0) - predictions = list(itertools.chain(*predictions)) - - if not comm.is_main_process(): - return - else: - predictions = self._predictions - - if len(predictions) == 0: - self._logger.warning("[LVISEvaluator] Did not receive valid predictions.") - return {} - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "instances_predictions.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(predictions, f) - - self._results = OrderedDict() - if "proposals" in predictions[0]: - self._eval_box_proposals(predictions) - if "instances" in predictions[0]: - self._eval_predictions(predictions) - # Copy so the caller can do whatever with results - return copy.deepcopy(self._results) - - def _tasks_from_predictions(self, predictions): - for pred in predictions: - if "segmentation" in pred: - return ("bbox", "segm") - return ("bbox",) - - def _eval_predictions(self, predictions): - """ - Evaluate predictions. Fill self._results with the metrics of the tasks. - - Args: - predictions (list[dict]): list of outputs from the model - """ - self._logger.info("Preparing results in the LVIS format ...") - lvis_results = list(itertools.chain(*[x["instances"] for x in predictions])) - tasks = self._tasks or self._tasks_from_predictions(lvis_results) - - # LVIS evaluator can be used to evaluate results for COCO dataset categories. - # In this case `_metadata` variable will have a field with COCO-specific category mapping. 
- if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - reverse_id_mapping = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - for result in lvis_results: - result["category_id"] = reverse_id_mapping[result["category_id"]] - else: - # unmap the category ids for LVIS (from 0-indexed to 1-indexed) - for result in lvis_results: - result["category_id"] += 1 - - if self._output_dir: - file_path = os.path.join(self._output_dir, "lvis_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(lvis_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating predictions ...") - for task in sorted(tasks): - res = _evaluate_predictions_on_lvis( - self._lvis_api, - lvis_results, - task, - max_dets_per_image=self._max_dets_per_image, - class_names=self._metadata.get("thing_classes"), - ) - self._results[task] = res - - def _eval_box_proposals(self, predictions): - """ - Evaluate the box proposals in predictions. - Fill self._results with the metrics for "box_proposals" task. - """ - if self._output_dir: - # Saving generated box proposals to file. - # Predicted box_proposals are in XYXY_ABS mode. - bbox_mode = BoxMode.XYXY_ABS.value - ids, boxes, objectness_logits = [], [], [] - for prediction in predictions: - ids.append(prediction["image_id"]) - boxes.append(prediction["proposals"].proposal_boxes.tensor.numpy()) - objectness_logits.append(prediction["proposals"].objectness_logits.numpy()) - - proposal_data = { - "boxes": boxes, - "objectness_logits": objectness_logits, - "ids": ids, - "bbox_mode": bbox_mode, - } - with PathManager.open(os.path.join(self._output_dir, "box_proposals.pkl"), "wb") as f: - pickle.dump(proposal_data, f) - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating bbox proposals ...") - res = {} - areas = {"all": "", "small": "s", "medium": "m", "large": "l"} - for limit in [100, 1000]: - for area, suffix in areas.items(): - stats = _evaluate_box_proposals(predictions, self._lvis_api, area=area, limit=limit) - key = "AR{}@{:d}".format(suffix, limit) - res[key] = float(stats["ar"].item() * 100) - self._logger.info("Proposal metrics: \n" + create_small_table(res)) - self._results["box_proposals"] = res - - -# inspired from Detectron: -# https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 # noqa -def _evaluate_box_proposals(dataset_predictions, lvis_api, thresholds=None, area="all", limit=None): - """ - Evaluate detection proposal recall metrics. This function is a much - faster alternative to the official LVIS API recall evaluation code. However, - it produces slightly different results. 
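
The greedy matching loop that follows (the same scheme used by the COCO proposal evaluator earlier in this patch) is easier to see in isolation. Below is a minimal sketch with toy IoU values; `greedy_gt_coverage` is a hypothetical name, not part of the codebase:

```python
import torch

def greedy_gt_coverage(overlaps: torch.Tensor) -> torch.Tensor:
    """For each gt box, record the IoU of the proposal that covers it best,
    retiring both boxes after every match (hypothetical helper name)."""
    overlaps = overlaps.clone()
    covered = torch.zeros(min(overlaps.shape))
    for j in range(len(covered)):
        max_overlaps, argmax_overlaps = overlaps.max(dim=0)  # best proposal per gt
        gt_ovr, gt_ind = max_overlaps.max(dim=0)             # best-covered gt box
        covered[j] = gt_ovr
        overlaps[argmax_overlaps[gt_ind], :] = -1            # retire that proposal
        overlaps[:, gt_ind] = -1                             # retire that gt box
    return covered

ious = torch.tensor([[0.9, 0.1], [0.6, 0.7]])  # proposals x gt (toy values)
cov = greedy_gt_coverage(ious)
thresholds = torch.arange(0.5, 0.95 + 1e-5, 0.05)
recalls = torch.stack([(cov >= t).float().mean() for t in thresholds])
print(recalls.mean())  # average recall (AR) over IoU 0.5:0.95
```
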
- """ - # Record max overlap value for each gt box - # Return vector of overlap values - areas = { - "all": 0, - "small": 1, - "medium": 2, - "large": 3, - "96-128": 4, - "128-256": 5, - "256-512": 6, - "512-inf": 7, - } - area_ranges = [ - [0 ** 2, 1e5 ** 2], # all - [0 ** 2, 32 ** 2], # small - [32 ** 2, 96 ** 2], # medium - [96 ** 2, 1e5 ** 2], # large - [96 ** 2, 128 ** 2], # 96-128 - [128 ** 2, 256 ** 2], # 128-256 - [256 ** 2, 512 ** 2], # 256-512 - [512 ** 2, 1e5 ** 2], - ] # 512-inf - assert area in areas, "Unknown area range: {}".format(area) - area_range = area_ranges[areas[area]] - gt_overlaps = [] - num_pos = 0 - - for prediction_dict in dataset_predictions: - predictions = prediction_dict["proposals"] - - # sort predictions in descending order - # TODO maybe remove this and make it explicit in the documentation - inds = predictions.objectness_logits.sort(descending=True)[1] - predictions = predictions[inds] - - ann_ids = lvis_api.get_ann_ids(img_ids=[prediction_dict["image_id"]]) - anno = lvis_api.load_anns(ann_ids) - gt_boxes = [ - BoxMode.convert(obj["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) for obj in anno - ] - gt_boxes = torch.as_tensor(gt_boxes).reshape(-1, 4) # guard against no boxes - gt_boxes = Boxes(gt_boxes) - gt_areas = torch.as_tensor([obj["area"] for obj in anno]) - - if len(gt_boxes) == 0 or len(predictions) == 0: - continue - - valid_gt_inds = (gt_areas >= area_range[0]) & (gt_areas <= area_range[1]) - gt_boxes = gt_boxes[valid_gt_inds] - - num_pos += len(gt_boxes) - - if len(gt_boxes) == 0: - continue - - if limit is not None and len(predictions) > limit: - predictions = predictions[:limit] - - overlaps = pairwise_iou(predictions.proposal_boxes, gt_boxes) - - _gt_overlaps = torch.zeros(len(gt_boxes)) - for j in range(min(len(predictions), len(gt_boxes))): - # find which proposal box maximally covers each gt box - # and get the iou amount of coverage for each gt box - max_overlaps, argmax_overlaps = overlaps.max(dim=0) - - # find which gt box is 'best' covered (i.e. 'best' = most iou) - gt_ovr, gt_ind = max_overlaps.max(dim=0) - assert gt_ovr >= 0 - # find the proposal box that covers the best covered gt box - box_ind = argmax_overlaps[gt_ind] - # record the iou coverage of this gt box - _gt_overlaps[j] = overlaps[box_ind, gt_ind] - assert _gt_overlaps[j] == gt_ovr - # mark the proposal box and the gt box as used - overlaps[box_ind, :] = -1 - overlaps[:, gt_ind] = -1 - - # append recorded iou coverage level - gt_overlaps.append(_gt_overlaps) - gt_overlaps = ( - torch.cat(gt_overlaps, dim=0) if len(gt_overlaps) else torch.zeros(0, dtype=torch.float32) - ) - gt_overlaps, _ = torch.sort(gt_overlaps) - - if thresholds is None: - step = 0.05 - thresholds = torch.arange(0.5, 0.95 + 1e-5, step, dtype=torch.float32) - recalls = torch.zeros_like(thresholds) - # compute recall for each iou threshold - for i, t in enumerate(thresholds): - recalls[i] = (gt_overlaps >= t).float().sum() / float(num_pos) - # ar = 2 * np.trapz(recalls, thresholds) - ar = recalls.mean() - return { - "ar": ar, - "recalls": recalls, - "thresholds": thresholds, - "gt_overlaps": gt_overlaps, - "num_pos": num_pos, - } - - -def _evaluate_predictions_on_lvis( - lvis_gt, lvis_results, iou_type, max_dets_per_image=None, class_names=None -): - """ - Args: - iou_type (str): - max_dets_per_image (None or int): limit on maximum detections per image in evaluating AP - This limit, by default of the LVIS dataset, is 300. 
- class_names (None or list[str]): if provided, will use it to predict - per-category AP. - - Returns: - a dict of {metric name: score} - """ - metrics = { - "bbox": ["AP", "AP50", "AP75", "APs", "APm", "APl", "APr", "APc", "APf"], - "segm": ["AP", "AP50", "AP75", "APs", "APm", "APl", "APr", "APc", "APf"], - }[iou_type] - - logger = logging.getLogger(__name__) - - if len(lvis_results) == 0: # TODO: check if needed - logger.warn("No predictions from the model!") - return {metric: float("nan") for metric in metrics} - - if iou_type == "segm": - lvis_results = copy.deepcopy(lvis_results) - # When evaluating mask AP, if the results contain bbox, LVIS API will - # use the box area as the area of the instance, instead of the mask area. - # This leads to a different definition of small/medium/large. - # We remove the bbox field to let mask AP use mask area. - for c in lvis_results: - c.pop("bbox", None) - - if max_dets_per_image is None: - max_dets_per_image = 300 # Default for LVIS dataset - - from lvis import LVISEval, LVISResults - - logger.info(f"Evaluating with max detections per image = {max_dets_per_image}") - lvis_results = LVISResults(lvis_gt, lvis_results, max_dets=max_dets_per_image) - lvis_eval = LVISEval(lvis_gt, lvis_results, iou_type) - lvis_eval.run() - lvis_eval.print_results() - - # Pull the standard metrics from the LVIS results - results = lvis_eval.get_results() - results = {metric: float(results[metric] * 100) for metric in metrics} - logger.info("Evaluation results for {}: \n".format(iou_type) + create_small_table(results)) - return results diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py deleted file mode 100755 index 9fb3462b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py +++ /dev/null @@ -1,199 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import io -import itertools -import json -import logging -import numpy as np -import os -import tempfile -from collections import OrderedDict -from typing import Optional -from PIL import Image -from tabulate import tabulate - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - -logger = logging.getLogger(__name__) - - -class COCOPanopticEvaluator(DatasetEvaluator): - """ - Evaluate Panoptic Quality metrics on COCO using PanopticAPI. - It saves panoptic segmentation prediction in `output_dir` - - It contains a synchronize call and has to be called from all workers. - """ - - def __init__(self, dataset_name: str, output_dir: Optional[str] = None): - """ - Args: - dataset_name: name of the dataset - output_dir: output directory to save results for evaluation. 
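
A small sketch of the contiguous-id reverse mapping this evaluator builds just below; the ids here are toy values standing in for real dataset metadata:

```python
# Training uses contiguous category ids [0..N); the COCO panoptic json uses
# the original dataset ids, so predictions are mapped back before writing.
thing_dataset_id_to_contiguous_id = {1: 0, 3: 1}  # toy metadata mapping
thing_contiguous_id_to_dataset_id = {
    v: k for k, v in thing_dataset_id_to_contiguous_id.items()
}

segment_info = {"category_id": 1, "isthing": True}
segment_info["category_id"] = thing_contiguous_id_to_dataset_id[
    segment_info["category_id"]
]
print(segment_info)  # {'category_id': 3, 'isthing': True}
```
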
- """ - self._metadata = MetadataCatalog.get(dataset_name) - self._thing_contiguous_id_to_dataset_id = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - self._stuff_contiguous_id_to_dataset_id = { - v: k for k, v in self._metadata.stuff_dataset_id_to_contiguous_id.items() - } - - self._output_dir = output_dir - if self._output_dir is not None: - PathManager.mkdirs(self._output_dir) - - def reset(self): - self._predictions = [] - - def _convert_category_id(self, segment_info): - isthing = segment_info.pop("isthing", None) - if isthing is None: - # the model produces panoptic category id directly. No more conversion needed - return segment_info - if isthing is True: - segment_info["category_id"] = self._thing_contiguous_id_to_dataset_id[ - segment_info["category_id"] - ] - else: - segment_info["category_id"] = self._stuff_contiguous_id_to_dataset_id[ - segment_info["category_id"] - ] - return segment_info - - def process(self, inputs, outputs): - from panopticapi.utils import id2rgb - - for input, output in zip(inputs, outputs): - panoptic_img, segments_info = output["panoptic_seg"] - panoptic_img = panoptic_img.cpu().numpy() - if segments_info is None: - # If "segments_info" is None, we assume "panoptic_img" is a - # H*W int32 image storing the panoptic_id in the format of - # category_id * label_divisor + instance_id. We reserve -1 for - # VOID label, and add 1 to panoptic_img since the official - # evaluation script uses 0 for VOID label. - label_divisor = self._metadata.label_divisor - segments_info = [] - for panoptic_label in np.unique(panoptic_img): - if panoptic_label == -1: - # VOID region. - continue - pred_class = panoptic_label // label_divisor - isthing = ( - pred_class in self._metadata.thing_dataset_id_to_contiguous_id.values() - ) - segments_info.append( - { - "id": int(panoptic_label) + 1, - "category_id": int(pred_class), - "isthing": bool(isthing), - } - ) - # Official evaluation script uses 0 for VOID label. 
- panoptic_img += 1 - - file_name = os.path.basename(input["file_name"]) - file_name_png = os.path.splitext(file_name)[0] + ".png" - with io.BytesIO() as out: - Image.fromarray(id2rgb(panoptic_img)).save(out, format="PNG") - segments_info = [self._convert_category_id(x) for x in segments_info] - self._predictions.append( - { - "image_id": input["image_id"], - "file_name": file_name_png, - "png_string": out.getvalue(), - "segments_info": segments_info, - } - ) - - def evaluate(self): - comm.synchronize() - - self._predictions = comm.gather(self._predictions) - self._predictions = list(itertools.chain(*self._predictions)) - if not comm.is_main_process(): - return - - # PanopticApi requires local files - gt_json = PathManager.get_local_path(self._metadata.panoptic_json) - gt_folder = PathManager.get_local_path(self._metadata.panoptic_root) - - with tempfile.TemporaryDirectory(prefix="panoptic_eval") as pred_dir: - logger.info("Writing all panoptic predictions to {} ...".format(pred_dir)) - for p in self._predictions: - with open(os.path.join(pred_dir, p["file_name"]), "wb") as f: - f.write(p.pop("png_string")) - - with open(gt_json, "r") as f: - json_data = json.load(f) - json_data["annotations"] = self._predictions - - output_dir = self._output_dir or pred_dir - predictions_json = os.path.join(output_dir, "predictions.json") - with PathManager.open(predictions_json, "w") as f: - f.write(json.dumps(json_data)) - - from panopticapi.evaluation import pq_compute - - with contextlib.redirect_stdout(io.StringIO()): - pq_res = pq_compute( - gt_json, - PathManager.get_local_path(predictions_json), - gt_folder=gt_folder, - pred_folder=pred_dir, - ) - - res = {} - res["PQ"] = 100 * pq_res["All"]["pq"] - res["SQ"] = 100 * pq_res["All"]["sq"] - res["RQ"] = 100 * pq_res["All"]["rq"] - res["PQ_th"] = 100 * pq_res["Things"]["pq"] - res["SQ_th"] = 100 * pq_res["Things"]["sq"] - res["RQ_th"] = 100 * pq_res["Things"]["rq"] - res["PQ_st"] = 100 * pq_res["Stuff"]["pq"] - res["SQ_st"] = 100 * pq_res["Stuff"]["sq"] - res["RQ_st"] = 100 * pq_res["Stuff"]["rq"] - - results = OrderedDict({"panoptic_seg": res}) - _print_panoptic_results(pq_res) - - return results - - -def _print_panoptic_results(pq_res): - headers = ["", "PQ", "SQ", "RQ", "#categories"] - data = [] - for name in ["All", "Things", "Stuff"]: - row = [name] + [pq_res[name][k] * 100 for k in ["pq", "sq", "rq"]] + [pq_res[name]["n"]] - data.append(row) - table = tabulate( - data, headers=headers, tablefmt="pipe", floatfmt=".3f", stralign="center", numalign="center" - ) - logger.info("Panoptic Evaluation Results:\n" + table) - - -if __name__ == "__main__": - from detectron2.utils.logger import setup_logger - - logger = setup_logger() - import argparse - - parser = argparse.ArgumentParser() - parser.add_argument("--gt-json") - parser.add_argument("--gt-dir") - parser.add_argument("--pred-json") - parser.add_argument("--pred-dir") - args = parser.parse_args() - - from panopticapi.evaluation import pq_compute - - with contextlib.redirect_stdout(io.StringIO()): - pq_res = pq_compute( - args.gt_json, args.pred_json, gt_folder=args.gt_dir, pred_folder=args.pred_dir - ) - _print_panoptic_results(pq_res) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py deleted file mode 100755 index 1d1abcde..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py +++ /dev/null @@ -1,300 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -import os -import tempfile -import xml.etree.ElementTree as ET -from collections import OrderedDict, defaultdict -from functools import lru_cache -import torch - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class PascalVOCDetectionEvaluator(DatasetEvaluator): - """ - Evaluate Pascal VOC style AP for Pascal VOC dataset. - It contains a synchronization step, and therefore has to be called from all ranks. - - Note that the concept of AP can be implemented in different ways and may not - produce identical results. This class mimics the implementation of the official - Pascal VOC Matlab API, and should produce similar but not identical results to the - official API. - """ - - def __init__(self, dataset_name): - """ - Args: - dataset_name (str): name of the dataset, e.g., "voc_2007_test" - """ - self._dataset_name = dataset_name - meta = MetadataCatalog.get(dataset_name) - - # Too many tiny files, download all to local for speed. - annotation_dir_local = PathManager.get_local_path( - os.path.join(meta.dirname, "Annotations/") - ) - self._anno_file_template = os.path.join(annotation_dir_local, "{}.xml") - self._image_set_path = os.path.join(meta.dirname, "ImageSets", "Main", meta.split + ".txt") - self._class_names = meta.thing_classes - assert meta.year in [2007, 2012], meta.year - self._is_2007 = meta.year == 2007 - self._cpu_device = torch.device("cpu") - self._logger = logging.getLogger(__name__) - - def reset(self): - self._predictions = defaultdict(list) # class name -> list of prediction strings - - def process(self, inputs, outputs): - for input, output in zip(inputs, outputs): - image_id = input["image_id"] - instances = output["instances"].to(self._cpu_device) - boxes = instances.pred_boxes.tensor.numpy() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - for box, score, cls in zip(boxes, scores, classes): - xmin, ymin, xmax, ymax = box - # The inverse of data loading logic in `datasets/pascal_voc.py` - xmin += 1 - ymin += 1 - self._predictions[cls].append( - f"{image_id} {score:.3f} {xmin:.1f} {ymin:.1f} {xmax:.1f} {ymax:.1f}" - ) - - def evaluate(self): - """ - Returns: - dict: has a key "bbox", whose value is a dict of "AP", "AP50", and "AP75". - """ - all_predictions = comm.gather(self._predictions, dst=0) - if not comm.is_main_process(): - return - predictions = defaultdict(list) - for predictions_per_rank in all_predictions: - for clsid, lines in predictions_per_rank.items(): - predictions[clsid].extend(lines) - del all_predictions - - self._logger.info( - "Evaluating {} using {} metric. 
" - "Note that results do not use the official Matlab API.".format( - self._dataset_name, 2007 if self._is_2007 else 2012 - ) - ) - - with tempfile.TemporaryDirectory(prefix="pascal_voc_eval_") as dirname: - res_file_template = os.path.join(dirname, "{}.txt") - - aps = defaultdict(list) # iou -> ap per class - for cls_id, cls_name in enumerate(self._class_names): - lines = predictions.get(cls_id, [""]) - - with open(res_file_template.format(cls_name), "w") as f: - f.write("\n".join(lines)) - - for thresh in range(50, 100, 5): - rec, prec, ap = voc_eval( - res_file_template, - self._anno_file_template, - self._image_set_path, - cls_name, - ovthresh=thresh / 100.0, - use_07_metric=self._is_2007, - ) - aps[thresh].append(ap * 100) - - ret = OrderedDict() - mAP = {iou: np.mean(x) for iou, x in aps.items()} - ret["bbox"] = {"AP": np.mean(list(mAP.values())), "AP50": mAP[50], "AP75": mAP[75]} - return ret - - -############################################################################## -# -# Below code is modified from -# https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py -# -------------------------------------------------------- -# Fast/er R-CNN -# Licensed under The MIT License [see LICENSE for details] -# Written by Bharath Hariharan -# -------------------------------------------------------- - -"""Python implementation of the PASCAL VOC devkit's AP evaluation code.""" - - -@lru_cache(maxsize=None) -def parse_rec(filename): - """Parse a PASCAL VOC xml file.""" - with PathManager.open(filename) as f: - tree = ET.parse(f) - objects = [] - for obj in tree.findall("object"): - obj_struct = {} - obj_struct["name"] = obj.find("name").text - obj_struct["pose"] = obj.find("pose").text - obj_struct["truncated"] = int(obj.find("truncated").text) - obj_struct["difficult"] = int(obj.find("difficult").text) - bbox = obj.find("bndbox") - obj_struct["bbox"] = [ - int(bbox.find("xmin").text), - int(bbox.find("ymin").text), - int(bbox.find("xmax").text), - int(bbox.find("ymax").text), - ] - objects.append(obj_struct) - - return objects - - -def voc_ap(rec, prec, use_07_metric=False): - """Compute VOC AP given precision and recall. If use_07_metric is true, uses - the VOC 07 11-point method (default:False). - """ - if use_07_metric: - # 11 point metric - ap = 0.0 - for t in np.arange(0.0, 1.1, 0.1): - if np.sum(rec >= t) == 0: - p = 0 - else: - p = np.max(prec[rec >= t]) - ap = ap + p / 11.0 - else: - # correct AP calculation - # first append sentinel values at the end - mrec = np.concatenate(([0.0], rec, [1.0])) - mpre = np.concatenate(([0.0], prec, [0.0])) - - # compute the precision envelope - for i in range(mpre.size - 1, 0, -1): - mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i]) - - # to calculate area under PR curve, look for points - # where X axis (recall) changes value - i = np.where(mrec[1:] != mrec[:-1])[0] - - # and sum (\Delta recall) * prec - ap = np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1]) - return ap - - -def voc_eval(detpath, annopath, imagesetfile, classname, ovthresh=0.5, use_07_metric=False): - """rec, prec, ap = voc_eval(detpath, - annopath, - imagesetfile, - classname, - [ovthresh], - [use_07_metric]) - - Top level function that does the PASCAL VOC evaluation. - - detpath: Path to detections - detpath.format(classname) should produce the detection results file. - annopath: Path to annotations - annopath.format(imagename) should be the xml annotations file. - imagesetfile: Text file containing the list of images, one image per line. 
- classname: Category name (duh) - [ovthresh]: Overlap threshold (default = 0.5) - [use_07_metric]: Whether to use VOC07's 11 point AP computation - (default False) - """ - # assumes detections are in detpath.format(classname) - # assumes annotations are in annopath.format(imagename) - # assumes imagesetfile is a text file with each line an image name - - # first load gt - # read list of images - with PathManager.open(imagesetfile, "r") as f: - lines = f.readlines() - imagenames = [x.strip() for x in lines] - - # load annots - recs = {} - for imagename in imagenames: - recs[imagename] = parse_rec(annopath.format(imagename)) - - # extract gt objects for this class - class_recs = {} - npos = 0 - for imagename in imagenames: - R = [obj for obj in recs[imagename] if obj["name"] == classname] - bbox = np.array([x["bbox"] for x in R]) - difficult = np.array([x["difficult"] for x in R]).astype(np.bool) - # difficult = np.array([False for x in R]).astype(np.bool) # treat all "difficult" as GT - det = [False] * len(R) - npos = npos + sum(~difficult) - class_recs[imagename] = {"bbox": bbox, "difficult": difficult, "det": det} - - # read dets - detfile = detpath.format(classname) - with open(detfile, "r") as f: - lines = f.readlines() - - splitlines = [x.strip().split(" ") for x in lines] - image_ids = [x[0] for x in splitlines] - confidence = np.array([float(x[1]) for x in splitlines]) - BB = np.array([[float(z) for z in x[2:]] for x in splitlines]).reshape(-1, 4) - - # sort by confidence - sorted_ind = np.argsort(-confidence) - BB = BB[sorted_ind, :] - image_ids = [image_ids[x] for x in sorted_ind] - - # go down dets and mark TPs and FPs - nd = len(image_ids) - tp = np.zeros(nd) - fp = np.zeros(nd) - for d in range(nd): - R = class_recs[image_ids[d]] - bb = BB[d, :].astype(float) - ovmax = -np.inf - BBGT = R["bbox"].astype(float) - - if BBGT.size > 0: - # compute overlaps - # intersection - ixmin = np.maximum(BBGT[:, 0], bb[0]) - iymin = np.maximum(BBGT[:, 1], bb[1]) - ixmax = np.minimum(BBGT[:, 2], bb[2]) - iymax = np.minimum(BBGT[:, 3], bb[3]) - iw = np.maximum(ixmax - ixmin + 1.0, 0.0) - ih = np.maximum(iymax - iymin + 1.0, 0.0) - inters = iw * ih - - # union - uni = ( - (bb[2] - bb[0] + 1.0) * (bb[3] - bb[1] + 1.0) - + (BBGT[:, 2] - BBGT[:, 0] + 1.0) * (BBGT[:, 3] - BBGT[:, 1] + 1.0) - - inters - ) - - overlaps = inters / uni - ovmax = np.max(overlaps) - jmax = np.argmax(overlaps) - - if ovmax > ovthresh: - if not R["difficult"][jmax]: - if not R["det"][jmax]: - tp[d] = 1.0 - R["det"][jmax] = 1 - else: - fp[d] = 1.0 - else: - fp[d] = 1.0 - - # compute precision recall - fp = np.cumsum(fp) - tp = np.cumsum(tp) - rec = tp / float(npos) - # avoid divide by zero in case the first detection matches a difficult - # ground truth - prec = tp / np.maximum(tp + fp, np.finfo(np.float64).eps) - ap = voc_ap(rec, prec, use_07_metric) - - return rec, prec, ap diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py deleted file mode 100755 index ea6d1b38..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py +++ /dev/null @@ -1,207 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import itertools -import json -import numpy as np -import os -import torch -from pycocotools.cocoeval import COCOeval, maskUtils - -from detectron2.structures import BoxMode, RotatedBoxes, pairwise_iou_rotated -from detectron2.utils.file_io import PathManager - -from .coco_evaluation import COCOEvaluator - - -class RotatedCOCOeval(COCOeval): - @staticmethod - def is_rotated(box_list): - if type(box_list) == np.ndarray: - return box_list.shape[1] == 5 - elif type(box_list) == list: - if box_list == []: # cannot decide the box_dim - return False - return np.all( - np.array( - [ - (len(obj) == 5) and ((type(obj) == list) or (type(obj) == np.ndarray)) - for obj in box_list - ] - ) - ) - return False - - @staticmethod - def boxlist_to_tensor(boxlist, output_box_dim): - if type(boxlist) == np.ndarray: - box_tensor = torch.from_numpy(boxlist) - elif type(boxlist) == list: - if boxlist == []: - return torch.zeros((0, output_box_dim), dtype=torch.float32) - else: - box_tensor = torch.FloatTensor(boxlist) - else: - raise Exception("Unrecognized boxlist type") - - input_box_dim = box_tensor.shape[1] - if input_box_dim != output_box_dim: - if input_box_dim == 4 and output_box_dim == 5: - box_tensor = BoxMode.convert(box_tensor, BoxMode.XYWH_ABS, BoxMode.XYWHA_ABS) - else: - raise Exception( - "Unable to convert from {}-dim box to {}-dim box".format( - input_box_dim, output_box_dim - ) - ) - return box_tensor - - def compute_iou_dt_gt(self, dt, gt, is_crowd): - if self.is_rotated(dt) or self.is_rotated(gt): - # TODO: take is_crowd into consideration - assert all(c == 0 for c in is_crowd) - dt = RotatedBoxes(self.boxlist_to_tensor(dt, output_box_dim=5)) - gt = RotatedBoxes(self.boxlist_to_tensor(gt, output_box_dim=5)) - return pairwise_iou_rotated(dt, gt) - else: - # This is the same as the classical COCO evaluation - return maskUtils.iou(dt, gt, is_crowd) - - def computeIoU(self, imgId, catId): - p = self.params - if p.useCats: - gt = self._gts[imgId, catId] - dt = self._dts[imgId, catId] - else: - gt = [_ for cId in p.catIds for _ in self._gts[imgId, cId]] - dt = [_ for cId in p.catIds for _ in self._dts[imgId, cId]] - if len(gt) == 0 and len(dt) == 0: - return [] - inds = np.argsort([-d["score"] for d in dt], kind="mergesort") - dt = [dt[i] for i in inds] - if len(dt) > p.maxDets[-1]: - dt = dt[0 : p.maxDets[-1]] - - assert p.iouType == "bbox", "unsupported iouType for iou computation" - - g = [g["bbox"] for g in gt] - d = [d["bbox"] for d in dt] - - # compute iou between each dt and gt region - iscrowd = [int(o["iscrowd"]) for o in gt] - - # Note: this function is copied from cocoeval.py in cocoapi - # and the major difference is here. - ious = self.compute_iou_dt_gt(d, g, iscrowd) - return ious - - -class RotatedCOCOEvaluator(COCOEvaluator): - """ - Evaluate object proposal/instance detection outputs using COCO-like metrics and APIs, - with rotated boxes support. - Note: this uses IOU only and does not consider angle differences. - """ - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a COCO model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a COCO model. It is a list of dicts with key - "instances" that contains :class:`Instances`. 
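
The rotated evaluator above matches detections to ground truth with a rotated-IoU kernel, first promoting 4-element (x, y, w, h) boxes to 5-element (cx, cy, w, h, angle) form. A minimal sketch with toy boxes, assuming detectron2 is installed (the imports mirror those in the file above):

```python
import torch
from detectron2.structures import BoxMode, RotatedBoxes, pairwise_iou_rotated

# An axis-aligned box becomes a rotated box with angle 0.
xywh = torch.tensor([[10.0, 10.0, 20.0, 10.0]])
xywha = BoxMode.convert(xywh, BoxMode.XYWH_ABS, BoxMode.XYWHA_ABS)

dt = RotatedBoxes(xywha)                                           # detection
gt = RotatedBoxes(torch.tensor([[20.0, 15.0, 20.0, 10.0, 30.0]]))  # ground truth
print(pairwise_iou_rotated(dt, gt))  # 1x1 IoU matrix
```
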
- """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - - prediction["instances"] = self.instances_to_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - self._predictions.append(prediction) - - def instances_to_json(self, instances, img_id): - num_instance = len(instances) - if num_instance == 0: - return [] - - boxes = instances.pred_boxes.tensor.numpy() - if boxes.shape[1] == 4: - boxes = BoxMode.convert(boxes, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - boxes = boxes.tolist() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - - results = [] - for k in range(num_instance): - result = { - "image_id": img_id, - "category_id": classes[k], - "bbox": boxes[k], - "score": scores[k], - } - - results.append(result) - return results - - def _eval_predictions(self, predictions, img_ids=None): # img_ids: unused - """ - Evaluate predictions on the given tasks. - Fill self._results with the metrics of the tasks. - """ - self._logger.info("Preparing results for COCO format ...") - coco_results = list(itertools.chain(*[x["instances"] for x in predictions])) - - # unmap the category ids for COCO - if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - reverse_id_mapping = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - for result in coco_results: - result["category_id"] = reverse_id_mapping[result["category_id"]] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "coco_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(coco_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating predictions ...") - - assert self._tasks is None or set(self._tasks) == { - "bbox" - }, "[RotatedCOCOEvaluator] Only bbox evaluation is supported" - coco_eval = ( - self._evaluate_predictions_on_coco(self._coco_api, coco_results) - if len(coco_results) > 0 - else None # cocoapi does not handle empty results very well - ) - - task = "bbox" - res = self._derive_coco_results( - coco_eval, task, class_names=self._metadata.get("thing_classes") - ) - self._results[task] = res - - def _evaluate_predictions_on_coco(self, coco_gt, coco_results): - """ - Evaluate the coco results using COCOEval API. - """ - assert len(coco_results) > 0 - - coco_dt = coco_gt.loadRes(coco_results) - - # Only bbox is supported for now - coco_eval = RotatedCOCOeval(coco_gt, coco_dt, iouType="bbox") - - coco_eval.evaluate() - coco_eval.accumulate() - coco_eval.summarize() - - return coco_eval diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py deleted file mode 100755 index 7a19db71..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py +++ /dev/null @@ -1,184 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import itertools -import json -import logging -import numpy as np -import os -from collections import OrderedDict -import PIL.Image as Image -import pycocotools.mask as mask_util -import torch - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.utils.comm import all_gather, is_main_process, synchronize -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class SemSegEvaluator(DatasetEvaluator): - """ - Evaluate semantic segmentation metrics. - """ - - def __init__( - self, - dataset_name, - distributed=True, - output_dir=None, - *, - num_classes=None, - ignore_label=None, - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - distributed (bool): if True, will collect results from all ranks for evaluation. - Otherwise, will evaluate the results in the current process. - output_dir (str): an output directory to dump results. - num_classes, ignore_label: deprecated argument - """ - self._logger = logging.getLogger(__name__) - if num_classes is not None: - self._logger.warn( - "SemSegEvaluator(num_classes) is deprecated! It should be obtained from metadata." - ) - if ignore_label is not None: - self._logger.warn( - "SemSegEvaluator(ignore_label) is deprecated! It should be obtained from metadata." - ) - self._dataset_name = dataset_name - self._distributed = distributed - self._output_dir = output_dir - - self._cpu_device = torch.device("cpu") - - self.input_file_to_gt_file = { - dataset_record["file_name"]: dataset_record["sem_seg_file_name"] - for dataset_record in DatasetCatalog.get(dataset_name) - } - - meta = MetadataCatalog.get(dataset_name) - # Dict that maps contiguous training ids to COCO category ids - try: - c2d = meta.stuff_dataset_id_to_contiguous_id - self._contiguous_id_to_dataset_id = {v: k for k, v in c2d.items()} - except AttributeError: - self._contiguous_id_to_dataset_id = None - self._class_names = meta.stuff_classes - self._num_classes = len(meta.stuff_classes) - if num_classes is not None: - assert self._num_classes == num_classes, f"{self._num_classes} != {num_classes}" - self._ignore_label = ignore_label if ignore_label is not None else meta.ignore_label - - def reset(self): - self._conf_matrix = np.zeros((self._num_classes + 1, self._num_classes + 1), dtype=np.int64) - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a model. - It is a list of dicts. Each dict corresponds to an image and - contains keys like "height", "width", "file_name". - outputs: the outputs of a model. It is either list of semantic segmentation predictions - (Tensor [H, W]) or list of dicts with key "sem_seg" that contains semantic - segmentation prediction in the same format. 
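
The evaluator below accumulates its confusion matrix with a single `np.bincount` over flattened (prediction, ground truth) index pairs. A toy two-class illustration of that indexing trick (the extra row and column absorb the ignore label):

```python
import numpy as np

num_classes = 2
pred = np.array([0, 0, 1, 1])  # predicted class per pixel
gt = np.array([0, 1, 1, 1])    # ground-truth class per pixel

conf = np.bincount(
    (num_classes + 1) * pred + gt,
    minlength=(num_classes + 1) ** 2,
).reshape(num_classes + 1, num_classes + 1)

print(conf)  # conf[i, j] = number of pixels predicted as i with ground truth j
```
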
- """ - for input, output in zip(inputs, outputs): - output = output["sem_seg"].argmax(dim=0).to(self._cpu_device) - pred = np.array(output, dtype=np.int) - with PathManager.open(self.input_file_to_gt_file[input["file_name"]], "rb") as f: - gt = np.array(Image.open(f), dtype=np.int) - - gt[gt == self._ignore_label] = self._num_classes - - self._conf_matrix += np.bincount( - (self._num_classes + 1) * pred.reshape(-1) + gt.reshape(-1), - minlength=self._conf_matrix.size, - ).reshape(self._conf_matrix.shape) - - self._predictions.extend(self.encode_json_sem_seg(pred, input["file_name"])) - - def evaluate(self): - """ - Evaluates standard semantic segmentation metrics (http://cocodataset.org/#stuff-eval): - - * Mean intersection-over-union averaged across classes (mIoU) - * Frequency Weighted IoU (fwIoU) - * Mean pixel accuracy averaged across classes (mACC) - * Pixel Accuracy (pACC) - """ - if self._distributed: - synchronize() - conf_matrix_list = all_gather(self._conf_matrix) - self._predictions = all_gather(self._predictions) - self._predictions = list(itertools.chain(*self._predictions)) - if not is_main_process(): - return - - self._conf_matrix = np.zeros_like(self._conf_matrix) - for conf_matrix in conf_matrix_list: - self._conf_matrix += conf_matrix - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "sem_seg_predictions.json") - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(self._predictions)) - - acc = np.full(self._num_classes, np.nan, dtype=np.float) - iou = np.full(self._num_classes, np.nan, dtype=np.float) - tp = self._conf_matrix.diagonal()[:-1].astype(np.float) - pos_gt = np.sum(self._conf_matrix[:-1, :-1], axis=0).astype(np.float) - class_weights = pos_gt / np.sum(pos_gt) - pos_pred = np.sum(self._conf_matrix[:-1, :-1], axis=1).astype(np.float) - acc_valid = pos_gt > 0 - acc[acc_valid] = tp[acc_valid] / pos_gt[acc_valid] - iou_valid = (pos_gt + pos_pred) > 0 - union = pos_gt + pos_pred - tp - iou[acc_valid] = tp[acc_valid] / union[acc_valid] - macc = np.sum(acc[acc_valid]) / np.sum(acc_valid) - miou = np.sum(iou[acc_valid]) / np.sum(iou_valid) - fiou = np.sum(iou[acc_valid] * class_weights[acc_valid]) - pacc = np.sum(tp) / np.sum(pos_gt) - - res = {} - res["mIoU"] = 100 * miou - res["fwIoU"] = 100 * fiou - for i, name in enumerate(self._class_names): - res["IoU-{}".format(name)] = 100 * iou[i] - res["mACC"] = 100 * macc - res["pACC"] = 100 * pacc - for i, name in enumerate(self._class_names): - res["ACC-{}".format(name)] = 100 * acc[i] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "sem_seg_evaluation.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(res, f) - results = OrderedDict({"sem_seg": res}) - self._logger.info(results) - return results - - def encode_json_sem_seg(self, sem_seg, input_file_name): - """ - Convert semantic segmentation to COCO stuff format with segments encoded as RLEs. 
- See http://cocodataset.org/#format-results - """ - json_list = [] - for label in np.unique(sem_seg): - if self._contiguous_id_to_dataset_id is not None: - assert ( - label in self._contiguous_id_to_dataset_id - ), "Label {} is not in the metadata info for {}".format(label, self._dataset_name) - dataset_id = self._contiguous_id_to_dataset_id[label] - else: - dataset_id = int(label) - mask = (sem_seg == label).astype(np.uint8) - mask_rle = mask_util.encode(np.array(mask[:, :, None], order="F"))[0] - mask_rle["counts"] = mask_rle["counts"].decode("utf-8") - json_list.append( - {"file_name": input_file_name, "category_id": dataset_id, "segmentation": mask_rle} - ) - return json_list diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/testing.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/testing.py deleted file mode 100755 index 9e5ae625..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/evaluation/testing.py +++ /dev/null @@ -1,85 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -import pprint -import sys -from collections.abc import Mapping - - -def print_csv_format(results): - """ - Print main metrics in a format similar to Detectron, - so that they are easy to copypaste into a spreadsheet. - - Args: - results (OrderedDict[dict]): task_name -> {metric -> score} - unordered dict can also be printed, but in arbitrary order - """ - assert isinstance(results, Mapping) or not len(results), results - logger = logging.getLogger(__name__) - for task, res in results.items(): - if isinstance(res, Mapping): - # Don't print "AP-category" metrics since they are usually not tracked. - important_res = [(k, v) for k, v in res.items() if "-" not in k] - logger.info("copypaste: Task: {}".format(task)) - logger.info("copypaste: " + ",".join([k[0] for k in important_res])) - logger.info("copypaste: " + ",".join(["{0:.4f}".format(k[1]) for k in important_res])) - else: - logger.info(f"copypaste: {task}={res}") - - -def verify_results(cfg, results): - """ - Args: - results (OrderedDict[dict]): task_name -> {metric -> score} - - Returns: - bool: whether the verification succeeds or not - """ - expected_results = cfg.TEST.EXPECTED_RESULTS - if not len(expected_results): - return True - - ok = True - for task, metric, expected, tolerance in expected_results: - actual = results[task].get(metric, None) - if actual is None: - ok = False - continue - if not np.isfinite(actual): - ok = False - continue - diff = abs(actual - expected) - if diff > tolerance: - ok = False - - logger = logging.getLogger(__name__) - if not ok: - logger.error("Result verification failed!") - logger.error("Expected Results: " + str(expected_results)) - logger.error("Actual Results: " + pprint.pformat(results)) - - sys.exit(1) - else: - logger.info("Results verification passed.") - return ok - - -def flatten_results_dict(results): - """ - Expand a hierarchical dict of scalars into a flat dict of scalars. - If results[k1][k2][k3] = v, the returned dict will have the entry - {"k1/k2/k3": v}. 
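
An equivalent sketch of the flattening described above, applied to a toy results dict (`flatten` is a stand-in name for illustration):

```python
def flatten(results):
    # Recursively join nested keys with "/" (sketch of flatten_results_dict).
    flat = {}
    for k, v in results.items():
        if isinstance(v, dict):
            for kk, vv in flatten(v).items():
                flat[k + "/" + kk] = vv
        else:
            flat[k] = v
    return flat

print(flatten({"bbox": {"AP": 40.0, "AP50": 60.0}, "iteration": 90000}))
# {'bbox/AP': 40.0, 'bbox/AP50': 60.0, 'iteration': 90000}
```
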
- - Args: - results (dict): - """ - r = {} - for k, v in results.items(): - if isinstance(v, Mapping): - v = flatten_results_dict(v) - for kk, vv in v.items(): - r[k + "/" + kk] = vv - else: - r[k] = v - return r diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/README.md deleted file mode 100755 index 9fcd3351..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/README.md +++ /dev/null @@ -1,13 +0,0 @@ - -This directory contains code to prepare a detectron2 model for deployment. -Currently it supports exporting a detectron2 model to Caffe2 format through ONNX. - -Please see [documentation](https://detectron2.readthedocs.io/tutorials/deployment.html) for its usage. - - -### Acknowledgements - -Thanks to Mobile Vision team at Facebook for developing the Caffe2 conversion tools. - -Thanks to Computing Platform Department - PAI team at Alibaba Group (@bddpqq, @chenbohua3) who -help export Detectron2 models to TorchScript. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/__init__.py deleted file mode 100755 index 25e5c946..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/__init__.py +++ /dev/null @@ -1,15 +0,0 @@ -# -*- coding: utf-8 -*- - -try: - from caffe2.proto import caffe2_pb2 as _tmp - - # caffe2 is optional -except ImportError: - pass -else: - from .api import * - -from .flatten import TracingAdapter -from .torchscript import scripting_with_instances, dump_torchscript_IR - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/api.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/api.py deleted file mode 100755 index ad427218..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/api.py +++ /dev/null @@ -1,235 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import os -import torch -from caffe2.proto import caffe2_pb2 -from torch import nn - -from detectron2.config import CfgNode -from detectron2.utils.file_io import PathManager - -from .caffe2_inference import ProtobufDetectionModel -from .caffe2_modeling import META_ARCH_CAFFE2_EXPORT_TYPE_MAP, convert_batched_inputs_to_c2_format -from .shared import get_pb_arg_vali, get_pb_arg_vals, save_graph - -__all__ = [ - "add_export_config", - "Caffe2Model", - "Caffe2Tracer", -] - - -def add_export_config(cfg): - return cfg - - -class Caffe2Tracer: - """ - Make a detectron2 model traceable with Caffe2 operators. - This class creates a traceable version of a detectron2 model which: - - 1. Rewrite parts of the model using ops in Caffe2. Note that some ops do - not have GPU implementation in Caffe2. - 2. Remove post-processing and only produce raw layer outputs - - After making a traceable model, the class provide methods to export such a - model to different deployment formats. - Exported graph produced by this class take two input tensors: - - 1. (1, C, H, W) float "data" which is an image (usually in [0, 255]). - (H, W) often has to be padded to multiple of 32 (depend on the model - architecture). - 2. 1x3 float "im_info", each row of which is (height, width, 1.0). 
- Height and width are true image shapes before padding. - - The class currently only supports models using builtin meta architectures. - Batch inference is not supported, and contributions are welcome. - """ - - def __init__(self, cfg: CfgNode, model: nn.Module, inputs): - """ - Args: - cfg (CfgNode): a detectron2 config used to construct caffe2-compatible model. - model (nn.Module): An original pytorch model. Must be among a few official models - in detectron2 that can be converted to become caffe2-compatible automatically. - Weights have to be already loaded to this model. - inputs: sample inputs that the given model takes for inference. - Will be used to trace the model. For most models, random inputs with - no detected objects will not work as they lead to wrong traces. - """ - assert isinstance(cfg, CfgNode), cfg - assert isinstance(model, torch.nn.Module), type(model) - - # TODO make it support custom models, by passing in c2 model directly - C2MetaArch = META_ARCH_CAFFE2_EXPORT_TYPE_MAP[cfg.MODEL.META_ARCHITECTURE] - self.traceable_model = C2MetaArch(cfg, copy.deepcopy(model)) - self.inputs = inputs - self.traceable_inputs = self.traceable_model.get_caffe2_inputs(inputs) - - def export_caffe2(self): - """ - Export the model to Caffe2's protobuf format. - The returned object can be saved with its :meth:`.save_protobuf()` method. - The result can be loaded and executed using Caffe2 runtime. - - Returns: - :class:`Caffe2Model` - """ - from .caffe2_export import export_caffe2_detection_model - - predict_net, init_net = export_caffe2_detection_model( - self.traceable_model, self.traceable_inputs - ) - return Caffe2Model(predict_net, init_net) - - def export_onnx(self): - """ - Export the model to ONNX format. - Note that the exported model contains custom ops only available in caffe2, therefore it - cannot be directly executed by other runtime (such as onnxruntime or TensorRT). - Post-processing or transformation passes may be applied on the model to accommodate - different runtimes, but we currently do not provide support for them. - - Returns: - onnx.ModelProto: an onnx model. - """ - from .caffe2_export import export_onnx_model as export_onnx_model_impl - - return export_onnx_model_impl(self.traceable_model, (self.traceable_inputs,)) - - def export_torchscript(self): - """ - Export the model to a ``torch.jit.TracedModule`` by tracing. - The returned object can be saved to a file by ``.save()``. - - Returns: - torch.jit.TracedModule: a torch TracedModule - """ - logger = logging.getLogger(__name__) - logger.info("Tracing the model with torch.jit.trace ...") - with torch.no_grad(): - return torch.jit.trace(self.traceable_model, (self.traceable_inputs,)) - - -class Caffe2Model(nn.Module): - """ - A wrapper around the traced model in Caffe2's protobuf format. - The exported graph has different inputs/outputs from the original Pytorch - model, as explained in :class:`Caffe2Tracer`. This class wraps around the - exported graph to simulate the same interface as the original Pytorch model. - It also provides functions to save/load models in Caffe2's format.' 
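Putting the `Caffe2Tracer`/`Caffe2Model` API shown here together, a hypothetical export-and-reload session could look as follows. This is only a sketch: `cfg` and `torch_model` are assumed to be a loaded detectron2 config and a weight-loaded compatible model, `real_image_chw` an actual image array, and, per the `__init__` docstring above, the sample batch should come from a real image, since random inputs with no detections lead to wrong traces:

```
import torch

# assumed in scope: cfg (CfgNode), torch_model (e.g. GeneralizedRCNN), real_image_chw
sample_inputs = [{"image": torch.as_tensor(real_image_chw)}]

tracer = Caffe2Tracer(cfg, torch_model, sample_inputs)
c2_model = tracer.export_caffe2()   # Caffe2Model wrapper
c2_model.save_protobuf("./deploy")  # writes model.pb / model_init.pb / model.pbtxt

# later, in the deployment environment:
reloaded = Caffe2Model.load_protobuf("./deploy")
outputs = reloaded(sample_inputs)   # mimics the original model's I/O format
```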
- - Examples: - :: - c2_model = Caffe2Tracer(cfg, torch_model, inputs).export_caffe2() - inputs = [{"image": img_tensor_CHW}] - outputs = c2_model(inputs) - orig_outputs = torch_model(inputs) - """ - - def __init__(self, predict_net, init_net): - super().__init__() - self.eval() # always in eval mode - self._predict_net = predict_net - self._init_net = init_net - self._predictor = None - - __init__.__HIDE_SPHINX_DOC__ = True - - @property - def predict_net(self): - """ - caffe2.core.Net: the underlying caffe2 predict net - """ - return self._predict_net - - @property - def init_net(self): - """ - caffe2.core.Net: the underlying caffe2 init net - """ - return self._init_net - - def save_protobuf(self, output_dir): - """ - Save the model as caffe2's protobuf format. - It saves the following files: - - * "model.pb": definition of the graph. Can be visualized with - tools like `netron `_. - * "model_init.pb": model parameters - * "model.pbtxt": human-readable definition of the graph. Not - needed for deployment. - - Args: - output_dir (str): the output directory to save protobuf files. - """ - logger = logging.getLogger(__name__) - logger.info("Saving model to {} ...".format(output_dir)) - if not PathManager.exists(output_dir): - PathManager.mkdirs(output_dir) - - with PathManager.open(os.path.join(output_dir, "model.pb"), "wb") as f: - f.write(self._predict_net.SerializeToString()) - with PathManager.open(os.path.join(output_dir, "model.pbtxt"), "w") as f: - f.write(str(self._predict_net)) - with PathManager.open(os.path.join(output_dir, "model_init.pb"), "wb") as f: - f.write(self._init_net.SerializeToString()) - - def save_graph(self, output_file, inputs=None): - """ - Save the graph as SVG format. - - Args: - output_file (str): a SVG file - inputs: optional inputs given to the model. - If given, the inputs will be used to run the graph to record - shape of every tensor. The shape information will be - saved together with the graph. - """ - from .caffe2_export import run_and_save_graph - - if inputs is None: - save_graph(self._predict_net, output_file, op_only=False) - else: - size_divisibility = get_pb_arg_vali(self._predict_net, "size_divisibility", 0) - device = get_pb_arg_vals(self._predict_net, "device", b"cpu").decode("ascii") - inputs = convert_batched_inputs_to_c2_format(inputs, size_divisibility, device) - inputs = [x.cpu().numpy() for x in inputs] - run_and_save_graph(self._predict_net, self._init_net, inputs, output_file) - - @staticmethod - def load_protobuf(dir): - """ - Args: - dir (str): a directory used to save Caffe2Model with - :meth:`save_protobuf`. - The files "model.pb" and "model_init.pb" are needed. - - Returns: - Caffe2Model: the caffe2 model loaded from this directory. - """ - predict_net = caffe2_pb2.NetDef() - with PathManager.open(os.path.join(dir, "model.pb"), "rb") as f: - predict_net.ParseFromString(f.read()) - - init_net = caffe2_pb2.NetDef() - with PathManager.open(os.path.join(dir, "model_init.pb"), "rb") as f: - init_net.ParseFromString(f.read()) - - return Caffe2Model(predict_net, init_net) - - def __call__(self, inputs): - """ - An interface that wraps around a Caffe2 model and mimics detectron2's models' - input/output format. See details about the format at :doc:`/tutorials/models`. - This is used to compare the outputs of caffe2 model with its original torch model. - - Due to the extra conversion between Pytorch/Caffe2, this method is not meant for - benchmark. 
Because of the conversion, this method also has dependency - on detectron2 in order to convert to detectron2's output format. - """ - if self._predictor is None: - self._predictor = ProtobufDetectionModel(self._predict_net, self._init_net) - return self._predictor(inputs) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/c10.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/c10.py deleted file mode 100755 index 25ee2300..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/c10.py +++ /dev/null @@ -1,534 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import math -import torch -import torch.nn.functional as F - -from detectron2.layers import cat -from detectron2.layers.roi_align_rotated import ROIAlignRotated -from detectron2.modeling import poolers -from detectron2.modeling.proposal_generator import rpn -from detectron2.modeling.roi_heads.mask_head import mask_rcnn_inference -from detectron2.structures import Boxes, ImageList, Instances, Keypoints - -from .shared import alias, to_device - - -""" -This file contains caffe2-compatible implementation of several detectron2 components. -""" - - -class Caffe2Boxes(Boxes): - """ - Representing a list of detectron2.structures.Boxes from minibatch, each box - is represented by a 5d vector (batch index + 4 coordinates), or a 6d vector - (batch index + 5 coordinates) for RotatedBoxes. - """ - - def __init__(self, tensor): - assert isinstance(tensor, torch.Tensor) - assert tensor.dim() == 2 and tensor.size(-1) in [4, 5, 6], tensor.size() - # TODO: make tensor immutable when dim is Nx5 for Boxes, - # and Nx6 for RotatedBoxes? - self.tensor = tensor - - -# TODO clean up this class, maybe just extend Instances -class InstancesList(object): - """ - Tensor representation of a list of Instances object for a batch of images. - - When dealing with a batch of images with Caffe2 ops, a list of bboxes - (instances) are usually represented by single Tensor with size - (sigma(Ni), 5) or (sigma(Ni), 4) plus a batch split Tensor. This class is - for providing common functions to convert between these two representations. - """ - - def __init__(self, im_info, indices, extra_fields=None): - # [N, 3] -> (H, W, Scale) - self.im_info = im_info - # [N,] -> indice of batch to which the instance belongs - self.indices = indices - # [N, ...] 
- self.batch_extra_fields = extra_fields or {} - - self.image_size = self.im_info - - def get_fields(self): - """like `get_fields` in the Instances object, - but return each field in tensor representations""" - ret = {} - for k, v in self.batch_extra_fields.items(): - # if isinstance(v, torch.Tensor): - # tensor_rep = v - # elif isinstance(v, (Boxes, Keypoints)): - # tensor_rep = v.tensor - # else: - # raise ValueError("Can't find tensor representation for: {}".format()) - ret[k] = v - return ret - - def has(self, name): - return name in self.batch_extra_fields - - def set(self, name, value): - data_len = len(value) - if len(self.batch_extra_fields): - assert ( - len(self) == data_len - ), "Adding a field of length {} to a Instances of length {}".format(data_len, len(self)) - self.batch_extra_fields[name] = value - - def __setattr__(self, name, val): - if name in ["im_info", "indices", "batch_extra_fields", "image_size"]: - super().__setattr__(name, val) - else: - self.set(name, val) - - def __getattr__(self, name): - if name not in self.batch_extra_fields: - raise AttributeError("Cannot find field '{}' in the given Instances!".format(name)) - return self.batch_extra_fields[name] - - def __len__(self): - return len(self.indices) - - def flatten(self): - ret = [] - for _, v in self.batch_extra_fields.items(): - if isinstance(v, (Boxes, Keypoints)): - ret.append(v.tensor) - else: - ret.append(v) - return ret - - @staticmethod - def to_d2_instances_list(instances_list): - """ - Convert InstancesList to List[Instances]. The input `instances_list` can - also be a List[Instances], in this case this method is a non-op. - """ - if not isinstance(instances_list, InstancesList): - assert all(isinstance(x, Instances) for x in instances_list) - return instances_list - - ret = [] - for i, info in enumerate(instances_list.im_info): - instances = Instances(torch.Size([int(info[0].item()), int(info[1].item())])) - - ids = instances_list.indices == i - for k, v in instances_list.batch_extra_fields.items(): - if isinstance(v, torch.Tensor): - instances.set(k, v[ids]) - continue - elif isinstance(v, Boxes): - instances.set(k, v[ids, -4:]) - continue - - target_type, tensor_source = v - assert isinstance(tensor_source, torch.Tensor) - assert tensor_source.shape[0] == instances_list.indices.shape[0] - tensor_source = tensor_source[ids] - - if issubclass(target_type, Boxes): - instances.set(k, Boxes(tensor_source[:, -4:])) - elif issubclass(target_type, Keypoints): - instances.set(k, Keypoints(tensor_source)) - elif issubclass(target_type, torch.Tensor): - instances.set(k, tensor_source) - else: - raise ValueError("Can't handle targe type: {}".format(target_type)) - - ret.append(instances) - return ret - - -class Caffe2Compatible(object): - """ - A model can inherit this class to indicate that it can be traced and deployed with caffe2. - """ - - def _get_tensor_mode(self): - return self._tensor_mode - - def _set_tensor_mode(self, v): - self._tensor_mode = v - - tensor_mode = property(_get_tensor_mode, _set_tensor_mode) - """ - If true, the model expects C2-style tensor only inputs/outputs format. 
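`InstancesList.to_d2_instances_list` above undoes the flat-batch packing by masking each field on the per-row batch index. The core move in isolation, with toy boxes and detectron2's `Instances`/`Boxes` assumed importable:

```
import torch
from detectron2.structures import Boxes, Instances

# flat batch: 3 boxes total, the first two from image 0, the last from image 1
boxes = torch.tensor([[0.0, 0.0, 10.0, 10.0], [5.0, 5.0, 20.0, 20.0], [1.0, 1.0, 4.0, 4.0]])
indices = torch.tensor([0, 0, 1])
image_sizes = [(480, 640), (320, 320)]

per_image = []
for i, hw in enumerate(image_sizes):
    inst = Instances(hw)
    inst.pred_boxes = Boxes(boxes[indices == i])  # select this image's rows
    per_image.append(inst)
print([len(x) for x in per_image])  # [2, 1]
```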
- """ - - -class Caffe2RPN(Caffe2Compatible, rpn.RPN): - def _generate_proposals( - self, images, objectness_logits_pred, anchor_deltas_pred, gt_instances=None - ): - assert isinstance(images, ImageList) - if self.tensor_mode: - im_info = images.image_sizes - else: - im_info = torch.tensor([[im_sz[0], im_sz[1], 1.0] for im_sz in images.image_sizes]).to( - images.tensor.device - ) - assert isinstance(im_info, torch.Tensor) - - rpn_rois_list = [] - rpn_roi_probs_list = [] - for scores, bbox_deltas, cell_anchors_tensor, feat_stride in zip( - objectness_logits_pred, - anchor_deltas_pred, - iter(self.anchor_generator.cell_anchors), - self.anchor_generator.strides, - ): - scores = scores.detach() - bbox_deltas = bbox_deltas.detach() - - rpn_rois, rpn_roi_probs = torch.ops._caffe2.GenerateProposals( - scores, - bbox_deltas, - im_info, - cell_anchors_tensor, - spatial_scale=1.0 / feat_stride, - pre_nms_topN=self.pre_nms_topk[self.training], - post_nms_topN=self.post_nms_topk[self.training], - nms_thresh=self.nms_thresh, - min_size=self.min_box_size, - # correct_transform_coords=True, # deprecated argument - angle_bound_on=True, # Default - angle_bound_lo=-180, - angle_bound_hi=180, - clip_angle_thresh=1.0, # Default - legacy_plus_one=False, - ) - rpn_rois_list.append(rpn_rois) - rpn_roi_probs_list.append(rpn_roi_probs) - - # For FPN in D2, in RPN all proposals from different levels are concated - # together, ranked and picked by top post_nms_topk. Then in ROIPooler - # it calculates level_assignments and calls the RoIAlign from - # the corresponding level. - - if len(objectness_logits_pred) == 1: - rpn_rois = rpn_rois_list[0] - rpn_roi_probs = rpn_roi_probs_list[0] - else: - assert len(rpn_rois_list) == len(rpn_roi_probs_list) - rpn_post_nms_topN = self.post_nms_topk[self.training] - - device = rpn_rois_list[0].device - input_list = [to_device(x, "cpu") for x in (rpn_rois_list + rpn_roi_probs_list)] - - # TODO remove this after confirming rpn_max_level/rpn_min_level - # is not needed in CollectRpnProposals. - feature_strides = list(self.anchor_generator.strides) - rpn_min_level = int(math.log2(feature_strides[0])) - rpn_max_level = int(math.log2(feature_strides[-1])) - assert (rpn_max_level - rpn_min_level + 1) == len( - rpn_rois_list - ), "CollectRpnProposals requires continuous levels" - - rpn_rois = torch.ops._caffe2.CollectRpnProposals( - input_list, - # NOTE: in current implementation, rpn_max_level and rpn_min_level - # are not needed, only the subtraction of two matters and it - # can be infer from the number of inputs. Keep them now for - # consistency. 
- rpn_max_level=2 + len(rpn_rois_list) - 1, - rpn_min_level=2, - rpn_post_nms_topN=rpn_post_nms_topN, - ) - rpn_rois = to_device(rpn_rois, device) - rpn_roi_probs = [] - - proposals = self.c2_postprocess(im_info, rpn_rois, rpn_roi_probs, self.tensor_mode) - return proposals, {} - - def forward(self, images, features, gt_instances=None): - assert not self.training - features = [features[f] for f in self.in_features] - objectness_logits_pred, anchor_deltas_pred = self.rpn_head(features) - return self._generate_proposals( - images, - objectness_logits_pred, - anchor_deltas_pred, - gt_instances, - ) - - @staticmethod - def c2_postprocess(im_info, rpn_rois, rpn_roi_probs, tensor_mode): - proposals = InstancesList( - im_info=im_info, - indices=rpn_rois[:, 0], - extra_fields={ - "proposal_boxes": Caffe2Boxes(rpn_rois), - "objectness_logits": (torch.Tensor, rpn_roi_probs), - }, - ) - if not tensor_mode: - proposals = InstancesList.to_d2_instances_list(proposals) - else: - proposals = [proposals] - return proposals - - -class Caffe2ROIPooler(Caffe2Compatible, poolers.ROIPooler): - @staticmethod - def c2_preprocess(box_lists): - assert all(isinstance(x, Boxes) for x in box_lists) - if all(isinstance(x, Caffe2Boxes) for x in box_lists): - # input is pure-tensor based - assert len(box_lists) == 1 - pooler_fmt_boxes = box_lists[0].tensor - else: - pooler_fmt_boxes = poolers.convert_boxes_to_pooler_format(box_lists) - return pooler_fmt_boxes - - def forward(self, x, box_lists): - assert not self.training - - pooler_fmt_boxes = self.c2_preprocess(box_lists) - num_level_assignments = len(self.level_poolers) - - if num_level_assignments == 1: - if isinstance(self.level_poolers[0], ROIAlignRotated): - c2_roi_align = torch.ops._caffe2.RoIAlignRotated - aligned = True - else: - c2_roi_align = torch.ops._caffe2.RoIAlign - aligned = self.level_poolers[0].aligned - - x0 = x[0] - if x0.is_quantized: - x0 = x0.dequantize() - - out = c2_roi_align( - x0, - pooler_fmt_boxes, - order="NCHW", - spatial_scale=float(self.level_poolers[0].spatial_scale), - pooled_h=int(self.output_size[0]), - pooled_w=int(self.output_size[1]), - sampling_ratio=int(self.level_poolers[0].sampling_ratio), - aligned=aligned, - ) - return out - - device = pooler_fmt_boxes.device - assert ( - self.max_level - self.min_level + 1 == 4 - ), "Currently DistributeFpnProposals only support 4 levels" - fpn_outputs = torch.ops._caffe2.DistributeFpnProposals( - to_device(pooler_fmt_boxes, "cpu"), - roi_canonical_scale=self.canonical_box_size, - roi_canonical_level=self.canonical_level, - roi_max_level=self.max_level, - roi_min_level=self.min_level, - legacy_plus_one=False, - ) - fpn_outputs = [to_device(x, device) for x in fpn_outputs] - - rois_fpn_list = fpn_outputs[:-1] - rois_idx_restore_int32 = fpn_outputs[-1] - - roi_feat_fpn_list = [] - for roi_fpn, x_level, pooler in zip(rois_fpn_list, x, self.level_poolers): - if isinstance(pooler, ROIAlignRotated): - c2_roi_align = torch.ops._caffe2.RoIAlignRotated - aligned = True - else: - c2_roi_align = torch.ops._caffe2.RoIAlign - aligned = bool(pooler.aligned) - - if x_level.is_quantized: - x_level = x_level.dequantize() - - roi_feat_fpn = c2_roi_align( - x_level, - roi_fpn, - order="NCHW", - spatial_scale=float(pooler.spatial_scale), - pooled_h=int(self.output_size[0]), - pooled_w=int(self.output_size[1]), - sampling_ratio=int(pooler.sampling_ratio), - aligned=aligned, - ) - roi_feat_fpn_list.append(roi_feat_fpn) - - roi_feat_shuffled = cat(roi_feat_fpn_list, dim=0) - assert roi_feat_shuffled.numel() > 0 
and rois_idx_restore_int32.numel() > 0, ( - "Caffe2 export requires tracing with a model checkpoint + input that can produce valid" - " detections. But no detections were obtained with the given checkpoint and input!" - ) - roi_feat = torch.ops._caffe2.BatchPermutation(roi_feat_shuffled, rois_idx_restore_int32) - return roi_feat - - -class Caffe2FastRCNNOutputsInference: - def __init__(self, tensor_mode): - self.tensor_mode = tensor_mode # whether the output is caffe2 tensor mode - - def __call__(self, box_predictor, predictions, proposals): - """equivalent to FastRCNNOutputLayers.inference""" - num_classes = box_predictor.num_classes - score_thresh = box_predictor.test_score_thresh - nms_thresh = box_predictor.test_nms_thresh - topk_per_image = box_predictor.test_topk_per_image - is_rotated = len(box_predictor.box2box_transform.weights) == 5 - - if is_rotated: - box_dim = 5 - assert box_predictor.box2box_transform.weights[4] == 1, ( - "The weights for Rotated BBoxTransform in C2 have only 4 dimensions," - + " thus enforcing the angle weight to be 1 for now" - ) - box2box_transform_weights = box_predictor.box2box_transform.weights[:4] - else: - box_dim = 4 - box2box_transform_weights = box_predictor.box2box_transform.weights - - class_logits, box_regression = predictions - if num_classes + 1 == class_logits.shape[1]: - class_prob = F.softmax(class_logits, -1) - else: - assert num_classes == class_logits.shape[1] - class_prob = F.sigmoid(class_logits) - # BoxWithNMSLimit will infer num_classes from the shape of the class_prob - # So append a zero column as placeholder for the background class - class_prob = torch.cat((class_prob, torch.zeros(class_prob.shape[0], 1)), dim=1) - - assert box_regression.shape[1] % box_dim == 0 - cls_agnostic_bbox_reg = box_regression.shape[1] // box_dim == 1 - - input_tensor_mode = proposals[0].proposal_boxes.tensor.shape[1] == box_dim + 1 - - rois = type(proposals[0].proposal_boxes).cat([p.proposal_boxes for p in proposals]) - device, dtype = rois.tensor.device, rois.tensor.dtype - if input_tensor_mode: - im_info = proposals[0].image_size - rois = rois.tensor - else: - im_info = torch.tensor( - [[sz[0], sz[1], 1.0] for sz in [x.image_size for x in proposals]] - ) - batch_ids = cat( - [ - torch.full((b, 1), i, dtype=dtype, device=device) - for i, b in enumerate(len(p) for p in proposals) - ], - dim=0, - ) - rois = torch.cat([batch_ids, rois.tensor], dim=1) - - roi_pred_bbox, roi_batch_splits = torch.ops._caffe2.BBoxTransform( - to_device(rois, "cpu"), - to_device(box_regression, "cpu"), - to_device(im_info, "cpu"), - weights=box2box_transform_weights, - apply_scale=True, - rotated=is_rotated, - angle_bound_on=True, - angle_bound_lo=-180, - angle_bound_hi=180, - clip_angle_thresh=1.0, - legacy_plus_one=False, - ) - roi_pred_bbox = to_device(roi_pred_bbox, device) - roi_batch_splits = to_device(roi_batch_splits, device) - - nms_outputs = torch.ops._caffe2.BoxWithNMSLimit( - to_device(class_prob, "cpu"), - to_device(roi_pred_bbox, "cpu"), - to_device(roi_batch_splits, "cpu"), - score_thresh=float(score_thresh), - nms=float(nms_thresh), - detections_per_im=int(topk_per_image), - soft_nms_enabled=False, - soft_nms_method="linear", - soft_nms_sigma=0.5, - soft_nms_min_score_thres=0.001, - rotated=is_rotated, - cls_agnostic_bbox_reg=cls_agnostic_bbox_reg, - input_boxes_include_bg_cls=False, - output_classes_include_bg_cls=False, - legacy_plus_one=False, - ) - roi_score_nms = to_device(nms_outputs[0], device) - roi_bbox_nms = to_device(nms_outputs[1], device) - 
roi_class_nms = to_device(nms_outputs[2], device) - roi_batch_splits_nms = to_device(nms_outputs[3], device) - roi_keeps_nms = to_device(nms_outputs[4], device) - roi_keeps_size_nms = to_device(nms_outputs[5], device) - if not self.tensor_mode: - roi_class_nms = roi_class_nms.to(torch.int64) - - roi_batch_ids = cat( - [ - torch.full((b, 1), i, dtype=dtype, device=device) - for i, b in enumerate(int(x.item()) for x in roi_batch_splits_nms) - ], - dim=0, - ) - - roi_class_nms = alias(roi_class_nms, "class_nms") - roi_score_nms = alias(roi_score_nms, "score_nms") - roi_bbox_nms = alias(roi_bbox_nms, "bbox_nms") - roi_batch_splits_nms = alias(roi_batch_splits_nms, "batch_splits_nms") - roi_keeps_nms = alias(roi_keeps_nms, "keeps_nms") - roi_keeps_size_nms = alias(roi_keeps_size_nms, "keeps_size_nms") - - results = InstancesList( - im_info=im_info, - indices=roi_batch_ids[:, 0], - extra_fields={ - "pred_boxes": Caffe2Boxes(roi_bbox_nms), - "scores": roi_score_nms, - "pred_classes": roi_class_nms, - }, - ) - - if not self.tensor_mode: - results = InstancesList.to_d2_instances_list(results) - batch_splits = roi_batch_splits_nms.int().tolist() - kept_indices = list(roi_keeps_nms.to(torch.int64).split(batch_splits)) - else: - results = [results] - kept_indices = [roi_keeps_nms] - - return results, kept_indices - - -class Caffe2MaskRCNNInference: - def __call__(self, pred_mask_logits, pred_instances): - """equivalent to mask_head.mask_rcnn_inference""" - if all(isinstance(x, InstancesList) for x in pred_instances): - assert len(pred_instances) == 1 - mask_probs_pred = pred_mask_logits.sigmoid() - mask_probs_pred = alias(mask_probs_pred, "mask_fcn_probs") - pred_instances[0].pred_masks = mask_probs_pred - else: - mask_rcnn_inference(pred_mask_logits, pred_instances) - - -class Caffe2KeypointRCNNInference: - def __init__(self, use_heatmap_max_keypoint): - self.use_heatmap_max_keypoint = use_heatmap_max_keypoint - - def __call__(self, pred_keypoint_logits, pred_instances): - # just return the keypoint heatmap for now, - # there will be option to call HeatmapMaxKeypointOp - output = alias(pred_keypoint_logits, "kps_score") - if all(isinstance(x, InstancesList) for x in pred_instances): - assert len(pred_instances) == 1 - if self.use_heatmap_max_keypoint: - device = output.device - output = torch.ops._caffe2.HeatmapMaxKeypoint( - to_device(output, "cpu"), - pred_instances[0].pred_boxes.tensor, - should_output_softmax=True, # worth make it configerable? - ) - output = to_device(output, device) - output = alias(output, "keypoints_out") - pred_instances[0].pred_keypoints = output - return pred_keypoint_logits diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_export.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_export.py deleted file mode 100755 index 74ac123a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_export.py +++ /dev/null @@ -1,207 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -import copy -import io -import logging -import numpy as np -from typing import List -import onnx -import torch -from caffe2.proto import caffe2_pb2 -from caffe2.python import core -from caffe2.python.onnx.backend import Caffe2Backend -from tabulate import tabulate -from termcolor import colored -from torch.onnx import OperatorExportTypes - -from .shared import ( - ScopedWS, - construct_init_net_from_params, - fuse_alias_placeholder, - fuse_copy_between_cpu_and_gpu, - get_params_from_init_net, - group_norm_replace_aten_with_caffe2, - infer_device_type, - remove_dead_end_ops, - remove_reshape_for_fc, - save_graph, -) - -logger = logging.getLogger(__name__) - - -def export_onnx_model(model, inputs): - """ - Trace and export a model to onnx format. - - Args: - model (nn.Module): - inputs (tuple[args]): the model will be called by `model(*inputs)` - - Returns: - an onnx model - """ - assert isinstance(model, torch.nn.Module) - - # make sure all modules are in eval mode, onnx may change the training state - # of the module if the states are not consistent - def _check_eval(module): - assert not module.training - - model.apply(_check_eval) - - # Export the model to ONNX - with torch.no_grad(): - with io.BytesIO() as f: - torch.onnx.export( - model, - inputs, - f, - operator_export_type=OperatorExportTypes.ONNX_ATEN_FALLBACK, - # verbose=True, # NOTE: uncomment this for debugging - # export_params=True, - ) - onnx_model = onnx.load_from_string(f.getvalue()) - - # Apply ONNX's Optimization - all_passes = onnx.optimizer.get_available_passes() - passes = ["fuse_bn_into_conv"] - assert all(p in all_passes for p in passes) - onnx_model = onnx.optimizer.optimize(onnx_model, passes) - return onnx_model - - -def _op_stats(net_def): - type_count = {} - for t in [op.type for op in net_def.op]: - type_count[t] = type_count.get(t, 0) + 1 - type_count_list = sorted(type_count.items(), key=lambda kv: kv[0]) # alphabet - type_count_list = sorted(type_count_list, key=lambda kv: -kv[1]) # count - return "\n".join("{:>4}x {}".format(count, name) for name, count in type_count_list) - - -def _assign_device_option( - predict_net: caffe2_pb2.NetDef, init_net: caffe2_pb2.NetDef, tensor_inputs: List[torch.Tensor] -): - """ - ONNX exported network doesn't have concept of device, assign necessary - device option for each op in order to make it runable on GPU runtime. 
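`export_onnx_model` above is essentially a guarded call to `torch.onnx.export` into an in-memory buffer (the `onnx.optimizer` passes it then applies were later moved out of the `onnx` package into the separate `onnxoptimizer` project). The core pattern, sketched on a stand-in module rather than a real detectron2 model:

```
import io
import onnx
import torch

model = torch.nn.Conv2d(3, 8, 3).eval()  # stand-in for a traceable model
dummy = torch.randn(1, 3, 32, 32)

with torch.no_grad(), io.BytesIO() as f:
    torch.onnx.export(model, (dummy,), f)
    onnx_model = onnx.load_from_string(f.getvalue())
print(len(onnx_model.graph.node))
```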
- """ - - def _get_device_type(torch_tensor): - assert torch_tensor.device.type in ["cpu", "cuda"] - assert torch_tensor.device.index == 0 - return torch_tensor.device.type - - def _assign_op_device_option(net_proto, net_ssa, blob_device_types): - for op, ssa_i in zip(net_proto.op, net_ssa): - if op.type in ["CopyCPUToGPU", "CopyGPUToCPU"]: - op.device_option.CopyFrom(core.DeviceOption(caffe2_pb2.CUDA, 0)) - else: - devices = [blob_device_types[b] for b in ssa_i[0] + ssa_i[1]] - assert all(d == devices[0] for d in devices) - if devices[0] == "cuda": - op.device_option.CopyFrom(core.DeviceOption(caffe2_pb2.CUDA, 0)) - - # update ops in predict_net - predict_net_input_device_types = { - (name, 0): _get_device_type(tensor) - for name, tensor in zip(predict_net.external_input, tensor_inputs) - } - predict_net_device_types = infer_device_type( - predict_net, known_status=predict_net_input_device_types, device_name_style="pytorch" - ) - predict_net_ssa, _ = core.get_ssa(predict_net) - _assign_op_device_option(predict_net, predict_net_ssa, predict_net_device_types) - - # update ops in init_net - init_net_ssa, versions = core.get_ssa(init_net) - init_net_output_device_types = { - (name, versions[name]): predict_net_device_types[(name, 0)] - for name in init_net.external_output - } - init_net_device_types = infer_device_type( - init_net, known_status=init_net_output_device_types, device_name_style="pytorch" - ) - _assign_op_device_option(init_net, init_net_ssa, init_net_device_types) - - -def export_caffe2_detection_model(model: torch.nn.Module, tensor_inputs: List[torch.Tensor]): - """ - Export a caffe2-compatible Detectron2 model to caffe2 format via ONNX. - - Arg: - model: a caffe2-compatible version of detectron2 model, defined in caffe2_modeling.py - tensor_inputs: a list of tensors that caffe2 model takes as input. - """ - model = copy.deepcopy(model) - assert isinstance(model, torch.nn.Module) - assert hasattr(model, "encode_additional_info") - - # Export via ONNX - logger.info( - "Exporting a {} model via ONNX ...".format(type(model).__name__) - + " Some warnings from ONNX are expected and are usually not to worry about." - ) - onnx_model = export_onnx_model(model, (tensor_inputs,)) - # Convert ONNX model to Caffe2 protobuf - init_net, predict_net = Caffe2Backend.onnx_graph_to_caffe2_net(onnx_model) - ops_table = [[op.type, op.input, op.output] for op in predict_net.op] - table = tabulate(ops_table, headers=["type", "input", "output"], tablefmt="pipe") - logger.info( - "ONNX export Done. Exported predict_net (before optimizations):\n" + colored(table, "cyan") - ) - - # Apply protobuf optimization - fuse_alias_placeholder(predict_net, init_net) - if any(t.device.type != "cpu" for t in tensor_inputs): - fuse_copy_between_cpu_and_gpu(predict_net) - remove_dead_end_ops(init_net) - _assign_device_option(predict_net, init_net, tensor_inputs) - params, device_options = get_params_from_init_net(init_net) - predict_net, params = remove_reshape_for_fc(predict_net, params) - init_net = construct_init_net_from_params(params, device_options) - group_norm_replace_aten_with_caffe2(predict_net) - - # Record necessary information for running the pb model in Detectron2 system. 
- model.encode_additional_info(predict_net, init_net) - - logger.info("Operators used in predict_net: \n{}".format(_op_stats(predict_net))) - logger.info("Operators used in init_net: \n{}".format(_op_stats(init_net))) - - return predict_net, init_net - - -def run_and_save_graph(predict_net, init_net, tensor_inputs, graph_save_path): - """ - Run the caffe2 model on given inputs, recording the shape and draw the graph. - - predict_net/init_net: caffe2 model. - tensor_inputs: a list of tensors that caffe2 model takes as input. - graph_save_path: path for saving graph of exported model. - """ - - logger.info("Saving graph of ONNX exported model to {} ...".format(graph_save_path)) - save_graph(predict_net, graph_save_path, op_only=False) - - # Run the exported Caffe2 net - logger.info("Running ONNX exported model ...") - with ScopedWS("__ws_tmp__", True) as ws: - ws.RunNetOnce(init_net) - initialized_blobs = set(ws.Blobs()) - uninitialized = [inp for inp in predict_net.external_input if inp not in initialized_blobs] - for name, blob in zip(uninitialized, tensor_inputs): - ws.FeedBlob(name, blob) - - try: - ws.RunNetOnce(predict_net) - except RuntimeError as e: - logger.warning("Encountered RuntimeError: \n{}".format(str(e))) - - ws_blobs = {b: ws.FetchBlob(b) for b in ws.Blobs()} - blob_sizes = {b: ws_blobs[b].shape for b in ws_blobs if isinstance(ws_blobs[b], np.ndarray)} - - logger.info("Saving graph with blob shapes to {} ...".format(graph_save_path)) - save_graph(predict_net, graph_save_path, op_only=False, blob_sizes=blob_sizes) - - return ws_blobs diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_inference.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_inference.py deleted file mode 100755 index deb886c0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_inference.py +++ /dev/null @@ -1,161 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -from itertools import count -import torch -from caffe2.proto import caffe2_pb2 -from caffe2.python import core - -from .caffe2_modeling import META_ARCH_CAFFE2_EXPORT_TYPE_MAP, convert_batched_inputs_to_c2_format -from .shared import ScopedWS, get_pb_arg_vali, get_pb_arg_vals, infer_device_type - -logger = logging.getLogger(__name__) - - -# ===== ref: mobile-vision predictor's 'Caffe2Wrapper' class ====== -class ProtobufModel(torch.nn.Module): - """ - Wrapper of a caffe2's protobuf model. - It works just like nn.Module, but running caffe2 under the hood. - Input/Output are tuple[tensor] that match the caffe2 net's external_input/output. 
- """ - - _ids = count(0) - - def __init__(self, predict_net, init_net): - logger.info(f"Initializing ProtobufModel for: {predict_net.name} ...") - super().__init__() - assert isinstance(predict_net, caffe2_pb2.NetDef) - assert isinstance(init_net, caffe2_pb2.NetDef) - # create unique temporary workspace for each instance - self.ws_name = "__tmp_ProtobufModel_{}__".format(next(self._ids)) - self.net = core.Net(predict_net) - - logger.info("Running init_net once to fill the parameters ...") - with ScopedWS(self.ws_name, is_reset=True, is_cleanup=False) as ws: - ws.RunNetOnce(init_net) - uninitialized_external_input = [] - for blob in self.net.Proto().external_input: - if blob not in ws.Blobs(): - uninitialized_external_input.append(blob) - ws.CreateBlob(blob) - ws.CreateNet(self.net) - - self._error_msgs = set() - self._input_blobs = uninitialized_external_input - - def _infer_output_devices(self, inputs): - """ - Returns: - list[str]: list of device for each external output - """ - - def _get_device_type(torch_tensor): - assert torch_tensor.device.type in ["cpu", "cuda"] - assert torch_tensor.device.index == 0 - return torch_tensor.device.type - - predict_net = self.net.Proto() - input_device_types = { - (name, 0): _get_device_type(tensor) for name, tensor in zip(self._input_blobs, inputs) - } - device_type_map = infer_device_type( - predict_net, known_status=input_device_types, device_name_style="pytorch" - ) - ssa, versions = core.get_ssa(predict_net) - versioned_outputs = [(name, versions[name]) for name in predict_net.external_output] - output_devices = [device_type_map[outp] for outp in versioned_outputs] - return output_devices - - def forward(self, inputs): - """ - Args: - inputs (tuple[torch.Tensor]) - - Returns: - tuple[torch.Tensor] - """ - assert len(inputs) == len(self._input_blobs), ( - f"Length of inputs ({len(inputs)}) " - f"doesn't match the required input blobs: {self._input_blobs}" - ) - - with ScopedWS(self.ws_name, is_reset=False, is_cleanup=False) as ws: - for b, tensor in zip(self._input_blobs, inputs): - ws.FeedBlob(b, tensor) - - try: - ws.RunNet(self.net.Proto().name) - except RuntimeError as e: - if not str(e) in self._error_msgs: - self._error_msgs.add(str(e)) - logger.warning("Encountered new RuntimeError: \n{}".format(str(e))) - logger.warning("Catch the error and use partial results.") - - c2_outputs = [ws.FetchBlob(b) for b in self.net.Proto().external_output] - # Remove outputs of current run, this is necessary in order to - # prevent fetching the result from previous run if the model fails - # in the middle. - for b in self.net.Proto().external_output: - # Needs to create uninitialized blob to make the net runable. - # This is "equivalent" to: ws.RemoveBlob(b) then ws.CreateBlob(b), - # but there'no such API. 
- ws.FeedBlob(b, f"{b}, a C++ native class of type nullptr (uninitialized).") - - # Cast output to torch.Tensor on the desired device - output_devices = ( - self._infer_output_devices(inputs) - if any(t.device.type != "cpu" for t in inputs) - else ["cpu" for _ in self.net.Proto().external_output] - ) - - outputs = [] - for name, c2_output, device in zip( - self.net.Proto().external_output, c2_outputs, output_devices - ): - if not isinstance(c2_output, np.ndarray): - raise RuntimeError( - "Invalid output for blob {}, received: {}".format(name, c2_output) - ) - outputs.append(torch.tensor(c2_output).to(device=device)) - return tuple(outputs) - - -class ProtobufDetectionModel(torch.nn.Module): - """ - A class works just like a pytorch meta arch in terms of inference, but running - caffe2 model under the hood. - """ - - def __init__(self, predict_net, init_net, *, convert_outputs=None): - """ - Args: - predict_net, init_net (core.Net): caffe2 nets - convert_outptus (callable): a function that converts caffe2 - outputs to the same format of the original pytorch model. - By default, use the one defined in the caffe2 meta_arch. - """ - super().__init__() - self.protobuf_model = ProtobufModel(predict_net, init_net) - self.size_divisibility = get_pb_arg_vali(predict_net, "size_divisibility", 0) - self.device = get_pb_arg_vals(predict_net, "device", b"cpu").decode("ascii") - - if convert_outputs is None: - meta_arch = get_pb_arg_vals(predict_net, "meta_architecture", b"GeneralizedRCNN") - meta_arch = META_ARCH_CAFFE2_EXPORT_TYPE_MAP[meta_arch.decode("ascii")] - self._convert_outputs = meta_arch.get_outputs_converter(predict_net, init_net) - else: - self._convert_outputs = convert_outputs - - def _convert_inputs(self, batched_inputs): - # currently all models convert inputs in the same way - return convert_batched_inputs_to_c2_format( - batched_inputs, self.size_divisibility, self.device - ) - - def forward(self, batched_inputs): - c2_inputs = self._convert_inputs(batched_inputs) - c2_results = self.protobuf_model(c2_inputs) - c2_results = dict(zip(self.protobuf_model.net.Proto().external_output, c2_results)) - return self._convert_outputs(batched_inputs, c2_inputs, c2_results) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_modeling.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_modeling.py deleted file mode 100755 index e00de4ad..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_modeling.py +++ /dev/null @@ -1,419 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import io -import struct -import types -import torch - -from detectron2.modeling import meta_arch -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.roi_heads import keypoint_head -from detectron2.structures import Boxes, ImageList, Instances, RotatedBoxes - -from .c10 import Caffe2Compatible -from .caffe2_patch import ROIHeadsPatcher, patch_generalized_rcnn -from .shared import ( - alias, - check_set_pb_arg, - get_pb_arg_floats, - get_pb_arg_valf, - get_pb_arg_vali, - get_pb_arg_vals, - mock_torch_nn_functional_interpolate, -) - - -def assemble_rcnn_outputs_by_name(image_sizes, tensor_outputs, force_mask_on=False): - """ - A function to assemble caffe2 model's outputs (i.e. Dict[str, Tensor]) - to detectron2's format (i.e. list of Instances instance). 
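`ProtobufDetectionModel`, just shown, keeps the pytorch-style calling convention, so swapping it in for the original model is a one-liner. A hedged sketch, where `predict_net`/`init_net` are assumed to be deserialized `caffe2_pb2.NetDef` protos and `img` a CHW image tensor already in scope:

```
# run a converted model on one image, detectron2-style batched-input format
c2_detector = ProtobufDetectionModel(predict_net, init_net)
outputs = c2_detector([{"image": img, "height": 480, "width": 640}])
```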
- This only works when the model follows the Caffe2 Detectron naming convention.
-
- Args:
- image_sizes (List[List[int, int]]): [H, W] of every image.
- tensor_outputs (Dict[str, Tensor]): external_output to its tensor.
-
- force_mask_on (Bool): if true, it makes sure there'll be pred_masks even
- if the mask is not found in tensor_outputs (usually due to model crash)
- """
-
- results = [Instances(image_size) for image_size in image_sizes]
-
- batch_splits = tensor_outputs.get("batch_splits", None)
- if batch_splits:
- raise NotImplementedError()
- assert len(image_sizes) == 1
- result = results[0]
-
- bbox_nms = tensor_outputs["bbox_nms"]
- score_nms = tensor_outputs["score_nms"]
- class_nms = tensor_outputs["class_nms"]
- # Detection will always succeed because Conv supports 0-batch
- assert bbox_nms is not None
- assert score_nms is not None
- assert class_nms is not None
- if bbox_nms.shape[1] == 5:
- result.pred_boxes = RotatedBoxes(bbox_nms)
- else:
- result.pred_boxes = Boxes(bbox_nms)
- result.scores = score_nms
- result.pred_classes = class_nms.to(torch.int64)
-
- mask_fcn_probs = tensor_outputs.get("mask_fcn_probs", None)
- if mask_fcn_probs is not None:
- # finish the mask pred
- mask_probs_pred = mask_fcn_probs
- num_masks = mask_probs_pred.shape[0]
- class_pred = result.pred_classes
- indices = torch.arange(num_masks, device=class_pred.device)
- mask_probs_pred = mask_probs_pred[indices, class_pred][:, None]
- result.pred_masks = mask_probs_pred
- elif force_mask_on:
- # NOTE: there's no way to know the height/width of mask here, it won't be
- # used anyway when batch size is 0, so just set them to 0.
- result.pred_masks = torch.zeros([0, 1, 0, 0], dtype=torch.uint8)
-
- keypoints_out = tensor_outputs.get("keypoints_out", None)
- kps_score = tensor_outputs.get("kps_score", None)
- if keypoints_out is not None:
- # keypoints_out: [N, 4, #keypoints], where 4 is in order of (x, y, score, prob)
- keypoints_tensor = keypoints_out
- # NOTE: it's possible that prob is not calculated if "should_output_softmax"
- # is set to False in HeatmapMaxKeypoint, so we just use the raw score; it
- # doesn't seem to affect mAP. TODO: check more carefully.
- keypoint_xyp = keypoints_tensor.transpose(1, 2)[:, :, [0, 1, 2]]
- result.pred_keypoints = keypoint_xyp
- elif kps_score is not None:
- # keypoint heatmap to sparse data structure
- pred_keypoint_logits = kps_score
- keypoint_head.keypoint_rcnn_inference(pred_keypoint_logits, [result])
-
- return results
-
-
- def _cast_to_f32(f64):
- return struct.unpack("f", struct.pack("f", f64))[0]
-
-
- def set_caffe2_compatible_tensor_mode(model, enable=True):
- def _fn(m):
- if isinstance(m, Caffe2Compatible):
- m.tensor_mode = enable
-
- model.apply(_fn)
-
-
- def convert_batched_inputs_to_c2_format(batched_inputs, size_divisibility, device):
- """
- See get_caffe2_inputs() below.
- """
- assert all(isinstance(x, dict) for x in batched_inputs)
- assert all(x["image"].dim() == 3 for x in batched_inputs)
-
- images = [x["image"] for x in batched_inputs]
- images = ImageList.from_tensors(images, size_divisibility)
-
- im_info = []
- for input_per_image, image_size in zip(batched_inputs, images.image_sizes):
- target_height = input_per_image.get("height", image_size[0])
- target_width = input_per_image.get("width", image_size[1]) # noqa
- # NOTE: The scale inside im_info is kept as convention and for providing
- # post-processing information if further processing is needed.
For - # current Caffe2 model definitions that don't include post-processing inside - # the model, this number is not used. - # NOTE: There can be a slight difference between width and height - # scales, using a single number can results in numerical difference - # compared with D2's post-processing. - scale = target_height / image_size[0] - im_info.append([image_size[0], image_size[1], scale]) - im_info = torch.Tensor(im_info) - - return images.tensor.to(device), im_info.to(device) - - -class Caffe2MetaArch(Caffe2Compatible, torch.nn.Module): - """ - Base class for caffe2-compatible implementation of a meta architecture. - The forward is traceable and its traced graph can be converted to caffe2 - graph through ONNX. - """ - - def __init__(self, cfg, torch_model): - """ - Args: - cfg (CfgNode): - torch_model (nn.Module): the detectron2 model (meta_arch) to be - converted. - """ - super().__init__() - self._wrapped_model = torch_model - self.eval() - set_caffe2_compatible_tensor_mode(self, True) - - def get_caffe2_inputs(self, batched_inputs): - """ - Convert pytorch-style structured inputs to caffe2-style inputs that - are tuples of tensors. - - Args: - batched_inputs (list[dict]): inputs to a detectron2 model - in its standard format. Each dict has "image" (CHW tensor), and optionally - "height" and "width". - - Returns: - tuple[Tensor]: - tuple of tensors that will be the inputs to the - :meth:`forward` method. For existing models, the first - is an NCHW tensor (padded and batched); the second is - a im_info Nx3 tensor, where the rows are - (height, width, unused legacy parameter) - """ - return convert_batched_inputs_to_c2_format( - batched_inputs, - self._wrapped_model.backbone.size_divisibility, - self._wrapped_model.device, - ) - - def encode_additional_info(self, predict_net, init_net): - """ - Save extra metadata that will be used by inference in the output protobuf. - """ - pass - - def forward(self, inputs): - """ - Run the forward in caffe2-style. It has to use caffe2-compatible ops - and the method will be used for tracing. - - Args: - inputs (tuple[Tensor]): inputs defined by :meth:`get_caffe2_input`. - They will be the inputs of the converted caffe2 graph. - - Returns: - tuple[Tensor]: output tensors. They will be the outputs of the - converted caffe2 graph. - """ - raise NotImplementedError - - def _caffe2_preprocess_image(self, inputs): - """ - Caffe2 implementation of preprocess_image, which is called inside each MetaArch's forward. - It normalizes the input images, and the final caffe2 graph assumes the - inputs have been batched already. - """ - data, im_info = inputs - data = alias(data, "data") - im_info = alias(im_info, "im_info") - mean, std = self._wrapped_model.pixel_mean, self._wrapped_model.pixel_std - normalized_data = (data - mean) / std - normalized_data = alias(normalized_data, "normalized_data") - - # Pack (data, im_info) into ImageList which is recognized by self.inference. - images = ImageList(tensor=normalized_data, image_sizes=im_info) - return images - - @staticmethod - def get_outputs_converter(predict_net, init_net): - """ - Creates a function that converts outputs of the caffe2 model to - detectron2's standard format. - The function uses information in `predict_net` and `init_net` that are - available at inferene time. Therefore the function logic can be used in inference. 
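Whatever the meta architecture, the converters returned by `get_outputs_converter` ultimately rebuild `Instances` from named output blobs, as `assemble_rcnn_outputs_by_name` did earlier in this file. In miniature, with stand-in tensors taking the place of the caffe2 blobs:

```
import torch
from detectron2.structures import Boxes, Instances

bbox_nms = torch.tensor([[2.0, 3.0, 40.0, 50.0]])  # stand-in "bbox_nms" blob
score_nms = torch.tensor([0.9])                    # stand-in "score_nms" blob
class_nms = torch.tensor([7.0])                    # stand-in "class_nms" blob

result = Instances((480, 640))  # (H, W) of the source image
result.pred_boxes = Boxes(bbox_nms)
result.scores = score_nms
result.pred_classes = class_nms.to(torch.int64)
```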
- - The returned function has the following signature: - - def convert(batched_inputs, c2_inputs, c2_results) -> detectron2_outputs - - Where - - * batched_inputs (list[dict]): the original input format of the meta arch - * c2_inputs (tuple[Tensor]): the caffe2 inputs. - * c2_results (dict[str, Tensor]): the caffe2 output format, - corresponding to the outputs of the :meth:`forward` function. - * detectron2_outputs: the original output format of the meta arch. - - This function can be used to compare the outputs of the original meta arch and - the converted caffe2 graph. - - Returns: - callable: a callable of the above signature. - """ - raise NotImplementedError - - -class Caffe2GeneralizedRCNN(Caffe2MetaArch): - def __init__(self, cfg, torch_model): - assert isinstance(torch_model, meta_arch.GeneralizedRCNN) - torch_model = patch_generalized_rcnn(torch_model) - super().__init__(cfg, torch_model) - - try: - use_heatmap_max_keypoint = cfg.EXPORT_CAFFE2.USE_HEATMAP_MAX_KEYPOINT - except AttributeError: - use_heatmap_max_keypoint = False - self.roi_heads_patcher = ROIHeadsPatcher( - self._wrapped_model.roi_heads, use_heatmap_max_keypoint - ) - - def encode_additional_info(self, predict_net, init_net): - size_divisibility = self._wrapped_model.backbone.size_divisibility - check_set_pb_arg(predict_net, "size_divisibility", "i", size_divisibility) - check_set_pb_arg( - predict_net, "device", "s", str.encode(str(self._wrapped_model.device), "ascii") - ) - check_set_pb_arg(predict_net, "meta_architecture", "s", b"GeneralizedRCNN") - - @mock_torch_nn_functional_interpolate() - def forward(self, inputs): - if not self.tensor_mode: - return self._wrapped_model.inference(inputs) - images = self._caffe2_preprocess_image(inputs) - features = self._wrapped_model.backbone(images.tensor) - proposals, _ = self._wrapped_model.proposal_generator(images, features) - with self.roi_heads_patcher.mock_roi_heads(): - detector_results, _ = self._wrapped_model.roi_heads(images, features, proposals) - return tuple(detector_results[0].flatten()) - - @staticmethod - def get_outputs_converter(predict_net, init_net): - def f(batched_inputs, c2_inputs, c2_results): - _, im_info = c2_inputs - image_sizes = [[int(im[0]), int(im[1])] for im in im_info] - results = assemble_rcnn_outputs_by_name(image_sizes, c2_results) - return meta_arch.GeneralizedRCNN._postprocess(results, batched_inputs, image_sizes) - - return f - - -class Caffe2RetinaNet(Caffe2MetaArch): - def __init__(self, cfg, torch_model): - assert isinstance(torch_model, meta_arch.RetinaNet) - super().__init__(cfg, torch_model) - - @mock_torch_nn_functional_interpolate() - def forward(self, inputs): - assert self.tensor_mode - images = self._caffe2_preprocess_image(inputs) - - # explicitly return the images sizes to avoid removing "im_info" by ONNX - # since it's not used in the forward path - return_tensors = [images.image_sizes] - - features = self._wrapped_model.backbone(images.tensor) - features = [features[f] for f in self._wrapped_model.head_in_features] - for i, feature_i in enumerate(features): - features[i] = alias(feature_i, "feature_{}".format(i), is_backward=True) - return_tensors.append(features[i]) - - pred_logits, pred_anchor_deltas = self._wrapped_model.head(features) - for i, (box_cls_i, box_delta_i) in enumerate(zip(pred_logits, pred_anchor_deltas)): - return_tensors.append(alias(box_cls_i, "box_cls_{}".format(i))) - return_tensors.append(alias(box_delta_i, "box_delta_{}".format(i))) - - return tuple(return_tensors) - - def 
encode_additional_info(self, predict_net, init_net): - size_divisibility = self._wrapped_model.backbone.size_divisibility - check_set_pb_arg(predict_net, "size_divisibility", "i", size_divisibility) - check_set_pb_arg( - predict_net, "device", "s", str.encode(str(self._wrapped_model.device), "ascii") - ) - check_set_pb_arg(predict_net, "meta_architecture", "s", b"RetinaNet") - - # Inference parameters: - check_set_pb_arg( - predict_net, "score_threshold", "f", _cast_to_f32(self._wrapped_model.test_score_thresh) - ) - check_set_pb_arg( - predict_net, "topk_candidates", "i", self._wrapped_model.test_topk_candidates - ) - check_set_pb_arg( - predict_net, "nms_threshold", "f", _cast_to_f32(self._wrapped_model.test_nms_thresh) - ) - check_set_pb_arg( - predict_net, - "max_detections_per_image", - "i", - self._wrapped_model.max_detections_per_image, - ) - - check_set_pb_arg( - predict_net, - "bbox_reg_weights", - "floats", - [_cast_to_f32(w) for w in self._wrapped_model.box2box_transform.weights], - ) - self._encode_anchor_generator_cfg(predict_net) - - def _encode_anchor_generator_cfg(self, predict_net): - # serialize anchor_generator for future use - serialized_anchor_generator = io.BytesIO() - torch.save(self._wrapped_model.anchor_generator, serialized_anchor_generator) - # Ideally we can put anchor generating inside the model, then we don't - # need to store this information. - bytes = serialized_anchor_generator.getvalue() - check_set_pb_arg(predict_net, "serialized_anchor_generator", "s", bytes) - - @staticmethod - def get_outputs_converter(predict_net, init_net): - self = types.SimpleNamespace() - serialized_anchor_generator = io.BytesIO( - get_pb_arg_vals(predict_net, "serialized_anchor_generator", None) - ) - self.anchor_generator = torch.load(serialized_anchor_generator) - bbox_reg_weights = get_pb_arg_floats(predict_net, "bbox_reg_weights", None) - self.box2box_transform = Box2BoxTransform(weights=tuple(bbox_reg_weights)) - self.test_score_thresh = get_pb_arg_valf(predict_net, "score_threshold", None) - self.test_topk_candidates = get_pb_arg_vali(predict_net, "topk_candidates", None) - self.test_nms_thresh = get_pb_arg_valf(predict_net, "nms_threshold", None) - self.max_detections_per_image = get_pb_arg_vali( - predict_net, "max_detections_per_image", None - ) - - # hack to reuse inference code from RetinaNet - for meth in [ - "forward_inference", - "inference_single_image", - "_transpose_dense_predictions", - "_decode_multi_level_predictions", - "_decode_per_level_predictions", - ]: - setattr(self, meth, functools.partial(getattr(meta_arch.RetinaNet, meth), self)) - - def f(batched_inputs, c2_inputs, c2_results): - _, im_info = c2_inputs - image_sizes = [[int(im[0]), int(im[1])] for im in im_info] - dummy_images = ImageList( - torch.randn( - ( - len(im_info), - 3, - ) - + tuple(image_sizes[0]) - ), - image_sizes, - ) - - num_features = len([x for x in c2_results.keys() if x.startswith("box_cls_")]) - pred_logits = [c2_results["box_cls_{}".format(i)] for i in range(num_features)] - pred_anchor_deltas = [c2_results["box_delta_{}".format(i)] for i in range(num_features)] - - # For each feature level, feature should have the same batch size and - # spatial dimension as the box_cls and box_delta. 
- dummy_features = [x.clone()[:, 0:0, :, :] for x in pred_logits] - # self.num_classess can be inferred - self.num_classes = pred_logits[0].shape[1] // (pred_anchor_deltas[0].shape[1] // 4) - - results = self.forward_inference( - dummy_images, dummy_features, [pred_logits, pred_anchor_deltas] - ) - return meta_arch.GeneralizedRCNN._postprocess(results, batched_inputs, image_sizes) - - return f - - -META_ARCH_CAFFE2_EXPORT_TYPE_MAP = { - "GeneralizedRCNN": Caffe2GeneralizedRCNN, - "RetinaNet": Caffe2RetinaNet, -} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_patch.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_patch.py deleted file mode 100755 index c9eee594..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/caffe2_patch.py +++ /dev/null @@ -1,152 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import contextlib -from unittest import mock -import torch - -from detectron2.modeling import poolers -from detectron2.modeling.proposal_generator import rpn -from detectron2.modeling.roi_heads import keypoint_head, mask_head -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers - -from .c10 import ( - Caffe2Compatible, - Caffe2FastRCNNOutputsInference, - Caffe2KeypointRCNNInference, - Caffe2MaskRCNNInference, - Caffe2ROIPooler, - Caffe2RPN, -) - - -class GenericMixin(object): - pass - - -class Caffe2CompatibleConverter(object): - """ - A GenericUpdater which implements the `create_from` interface, by modifying - module object and assign it with another class replaceCls. - """ - - def __init__(self, replaceCls): - self.replaceCls = replaceCls - - def create_from(self, module): - # update module's class to the new class - assert isinstance(module, torch.nn.Module) - if issubclass(self.replaceCls, GenericMixin): - # replaceCls should act as mixin, create a new class on-the-fly - new_class = type( - "{}MixedWith{}".format(self.replaceCls.__name__, module.__class__.__name__), - (self.replaceCls, module.__class__), - {}, # {"new_method": lambda self: ...}, - ) - module.__class__ = new_class - else: - # replaceCls is complete class, this allow arbitrary class swap - module.__class__ = self.replaceCls - - # initialize Caffe2Compatible - if isinstance(module, Caffe2Compatible): - module.tensor_mode = False - - return module - - -def patch(model, target, updater, *args, **kwargs): - """ - recursively (post-order) update all modules with the target type and its - subclasses, make a initialization/composition/inheritance/... via the - updater.create_from. 
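`Caffe2CompatibleConverter.create_from` above swaps behavior by reassigning `__class__` on a live module, optionally through an on-the-fly mixin class. The trick in isolation, on a plain `nn.Linear` (the `LoudLinear` subclass is invented for illustration):

```
import torch

class LoudLinear(torch.nn.Linear):
    def forward(self, x):  # replacement behavior, analogous to Caffe2RPN etc.
        print("patched forward")
        return super().forward(x)

m = torch.nn.Linear(4, 2)
m.__class__ = LoudLinear  # same object and weights, new class and methods
m(torch.randn(1, 4))
```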
- """ - for name, module in model.named_children(): - model._modules[name] = patch(module, target, updater, *args, **kwargs) - if isinstance(model, target): - return updater.create_from(model, *args, **kwargs) - return model - - -def patch_generalized_rcnn(model): - ccc = Caffe2CompatibleConverter - model = patch(model, rpn.RPN, ccc(Caffe2RPN)) - model = patch(model, poolers.ROIPooler, ccc(Caffe2ROIPooler)) - - return model - - -@contextlib.contextmanager -def mock_fastrcnn_outputs_inference( - tensor_mode, check=True, box_predictor_type=FastRCNNOutputLayers -): - with mock.patch.object( - box_predictor_type, - "inference", - autospec=True, - side_effect=Caffe2FastRCNNOutputsInference(tensor_mode), - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -@contextlib.contextmanager -def mock_mask_rcnn_inference(tensor_mode, patched_module, check=True): - with mock.patch( - "{}.mask_rcnn_inference".format(patched_module), side_effect=Caffe2MaskRCNNInference() - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -@contextlib.contextmanager -def mock_keypoint_rcnn_inference(tensor_mode, patched_module, use_heatmap_max_keypoint, check=True): - with mock.patch( - "{}.keypoint_rcnn_inference".format(patched_module), - side_effect=Caffe2KeypointRCNNInference(use_heatmap_max_keypoint), - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -class ROIHeadsPatcher: - def __init__(self, heads, use_heatmap_max_keypoint): - self.heads = heads - self.use_heatmap_max_keypoint = use_heatmap_max_keypoint - - @contextlib.contextmanager - def mock_roi_heads(self, tensor_mode=True): - """ - Patching several inference functions inside ROIHeads and its subclasses - - Args: - tensor_mode (bool): whether the inputs/outputs are caffe2's tensor - format or not. Default to True. - """ - # NOTE: this requries the `keypoint_rcnn_inference` and `mask_rcnn_inference` - # are called inside the same file as BaseXxxHead due to using mock.patch. - kpt_heads_mod = keypoint_head.BaseKeypointRCNNHead.__module__ - mask_head_mod = mask_head.BaseMaskRCNNHead.__module__ - - mock_ctx_managers = [ - mock_fastrcnn_outputs_inference( - tensor_mode=tensor_mode, - check=True, - box_predictor_type=type(self.heads.box_predictor), - ) - ] - if getattr(self.heads, "keypoint_on", False): - mock_ctx_managers += [ - mock_keypoint_rcnn_inference( - tensor_mode, kpt_heads_mod, self.use_heatmap_max_keypoint - ) - ] - if getattr(self.heads, "mask_on", False): - mock_ctx_managers += [mock_mask_rcnn_inference(tensor_mode, mask_head_mod)] - - with contextlib.ExitStack() as stack: # python 3.3+ - for mgr in mock_ctx_managers: - stack.enter_context(mgr) - yield diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/flatten.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/flatten.py deleted file mode 100755 index f5ba4297..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/flatten.py +++ /dev/null @@ -1,330 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import collections -from dataclasses import dataclass -from typing import Callable, List, Optional, Tuple -import torch -from torch import nn - -from detectron2.structures import Boxes, Instances, ROIMasks -from detectron2.utils.registry import _convert_target_to_string, locate - -from .torchscript_patch import patch_builtin_len - - -@dataclass -class Schema: - """ - A Schema defines how to flatten a possibly hierarchical object into tuple of - primitive objects, so it can be used as inputs/outputs of PyTorch's tracing. - - PyTorch does not support tracing a function that produces rich output - structures (e.g. dict, Instances, Boxes). To trace such a function, we - flatten the rich object into tuple of tensors, and return this tuple of tensors - instead. Meanwhile, we also need to know how to "rebuild" the original object - from the flattened results, so we can evaluate the flattened results. - A Schema defines how to flatten an object, and while flattening it, it records - necessary schemas so that the object can be rebuilt using the flattened outputs. - - The flattened object and the schema object is returned by ``.flatten`` classmethod. - Then the original object can be rebuilt with the ``__call__`` method of schema. - - A Schema is a dataclass that can be serialized easily. - """ - - # inspired by FetchMapper in tensorflow/python/client/session.py - - @classmethod - def flatten(cls, obj): - raise NotImplementedError - - def __call__(self, values): - raise NotImplementedError - - @staticmethod - def _concat(values): - ret = () - sizes = [] - for v in values: - assert isinstance(v, tuple), "Flattened results must be a tuple" - ret = ret + v - sizes.append(len(v)) - return ret, sizes - - @staticmethod - def _split(values, sizes): - if len(sizes): - expected_len = sum(sizes) - assert ( - len(values) == expected_len - ), f"Values has length {len(values)} but expect length {expected_len}." - ret = [] - for k in range(len(sizes)): - begin, end = sum(sizes[:k]), sum(sizes[: k + 1]) - ret.append(values[begin:end]) - return ret - - -@dataclass -class ListSchema(Schema): - schemas: List[Schema] # the schemas that define how to flatten each element in the list - sizes: List[int] # the flattened length of each element - - def __call__(self, values): - values = self._split(values, self.sizes) - if len(values) != len(self.schemas): - raise ValueError( - f"Values has length {len(values)} but schemas " f"has length {len(self.schemas)}!" 
- ) - values = [m(v) for m, v in zip(self.schemas, values)] - return list(values) - - @classmethod - def flatten(cls, obj): - res = [flatten_to_tuple(k) for k in obj] - values, sizes = cls._concat([k[0] for k in res]) - return values, cls([k[1] for k in res], sizes) - - -@dataclass -class TupleSchema(ListSchema): - def __call__(self, values): - return tuple(super().__call__(values)) - - -@dataclass -class IdentitySchema(Schema): - def __call__(self, values): - return values[0] - - @classmethod - def flatten(cls, obj): - return (obj,), cls() - - -@dataclass -class DictSchema(ListSchema): - keys: List[str] - - def __call__(self, values): - values = super().__call__(values) - return dict(zip(self.keys, values)) - - @classmethod - def flatten(cls, obj): - for k in obj.keys(): - if not isinstance(k, str): - raise KeyError("Only support flattening dictionaries if keys are str.") - keys = sorted(obj.keys()) - values = [obj[k] for k in keys] - ret, schema = ListSchema.flatten(values) - return ret, cls(schema.schemas, schema.sizes, keys) - - -@dataclass -class InstancesSchema(DictSchema): - def __call__(self, values): - image_size, fields = values[-1], values[:-1] - fields = super().__call__(fields) - return Instances(image_size, **fields) - - @classmethod - def flatten(cls, obj): - ret, schema = super().flatten(obj.get_fields()) - size = obj.image_size - if not isinstance(size, torch.Tensor): - size = torch.tensor(size) - return ret + (size,), schema - - -@dataclass -class TensorWrapSchema(Schema): - """ - For classes that are simple wrapper of tensors, e.g. - Boxes, RotatedBoxes, BitMasks - """ - - class_name: str - - def __call__(self, values): - return locate(self.class_name)(values[0]) - - @classmethod - def flatten(cls, obj): - return (obj.tensor,), cls(_convert_target_to_string(type(obj))) - - -# if more custom structures needed in the future, can allow -# passing in extra schemas for custom types -def flatten_to_tuple(obj): - """ - Flatten an object so it can be used for PyTorch tracing. - Also returns how to rebuild the original object from the flattened outputs. - - Returns: - res (tuple): the flattened results that can be used as tracing outputs - schema: an object with a ``__call__`` method such that ``schema(res) == obj``. - It is a pure dataclass that can be serialized. - """ - schemas = [ - ((str, bytes), IdentitySchema), - (list, ListSchema), - (tuple, TupleSchema), - (collections.abc.Mapping, DictSchema), - (Instances, InstancesSchema), - ((Boxes, ROIMasks), TensorWrapSchema), - ] - for klass, schema in schemas: - if isinstance(obj, klass): - F = schema - break - else: - F = IdentitySchema - - return F.flatten(obj) - - -class TracingAdapter(nn.Module): - """ - A model may take rich input/output format (e.g. dict or custom classes), - but `torch.jit.trace` requires tuple of tensors as input/output. - This adapter flattens input/output format of a model so it becomes traceable. - - It also records the necessary schema to rebuild model's inputs/outputs from flattened - inputs/outputs. 
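The `Schema` classes deleted above implement a flatten/rebuild contract: `flatten_to_tuple(obj)` yields a flat tuple of tensors plus a schema object satisfying `schema(flat) == obj`. A toy re-implementation of that contract for nested dicts of tensors only (the deleted code additionally handles lists, tuples, `Instances`, and `Boxes`):

```
# Toy flatten/rebuild for nested dicts of tensors; illustrative, not the deleted code.
import torch

def flatten_to_tuple(obj):
    if isinstance(obj, torch.Tensor):
        return (obj,), lambda values: values[0]
    if isinstance(obj, dict):
        keys = sorted(obj.keys())
        flat, schemas, sizes = (), [], []
        for k in keys:
            f, s = flatten_to_tuple(obj[k])
            flat, schemas, sizes = flat + f, schemas + [s], sizes + [len(f)]

        def rebuild(values, keys=keys, schemas=schemas, sizes=sizes):
            out, begin = {}, 0
            for k, s, n in zip(keys, schemas, sizes):
                out[k] = s(values[begin : begin + n])
                begin += n
            return out

        return flat, rebuild
    raise TypeError(f"unsupported type: {type(obj)}")

obj = {"scores": torch.rand(3), "boxes": torch.rand(3, 4)}
flat, schema = flatten_to_tuple(obj)   # flat is a plain tuple of tensors
rebuilt = schema(flat)                 # schema(flat) reconstructs the dict
assert torch.equal(rebuilt["boxes"], obj["boxes"])
```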
- - Example: - :: - outputs = model(inputs) # inputs/outputs may be rich structure - adapter = TracingAdapter(model, inputs) - - # can now trace the model, with adapter.flattened_inputs, or another - # tuple of tensors with the same length and meaning - traced = torch.jit.trace(adapter, adapter.flattened_inputs) - - # traced model can only produce flattened outputs (tuple of tensors) - flattened_outputs = traced(*adapter.flattened_inputs) - # adapter knows the schema to convert it back (new_outputs == outputs) - new_outputs = adapter.outputs_schema(flattened_outputs) - """ - - flattened_inputs: Tuple[torch.Tensor] = None - """ - Flattened version of inputs given to this class's constructor. - """ - - inputs_schema: Schema = None - """ - Schema of the inputs given to this class's constructor. - """ - - outputs_schema: Schema = None - """ - Schema of the output produced by calling the given model with inputs. - """ - - def __init__( - self, - model: nn.Module, - inputs, - inference_func: Optional[Callable] = None, - allow_non_tensor: bool = False, - ): - """ - Args: - model: an nn.Module - inputs: An input argument or a tuple of input arguments used to call model. - After flattening, it has to only consist of tensors. - inference_func: a callable that takes (model, *inputs), calls the - model with inputs, and return outputs. By default it - is ``lambda model, *inputs: model(*inputs)``. Can be override - if you need to call the model differently. - allow_non_tensor: allow inputs/outputs to contain non-tensor objects. - This option will filter out non-tensor objects to make the - model traceable, but ``inputs_schema``/``outputs_schema`` cannot be - used anymore because inputs/outputs cannot be rebuilt from pure tensors. - This is useful when you're only interested in the single trace of - execution (e.g. for flop count), but not interested in - generalizing the traced graph to new inputs. - """ - super().__init__() - if isinstance(model, (nn.parallel.distributed.DistributedDataParallel, nn.DataParallel)): - model = model.module - self.model = model - if not isinstance(inputs, tuple): - inputs = (inputs,) - self.inputs = inputs - self.allow_non_tensor = allow_non_tensor - - if inference_func is None: - inference_func = lambda model, *inputs: model(*inputs) # noqa - self.inference_func = inference_func - - self.flattened_inputs, self.inputs_schema = flatten_to_tuple(inputs) - - if all(isinstance(x, torch.Tensor) for x in self.flattened_inputs): - return - if self.allow_non_tensor: - self.flattened_inputs = tuple( - [x for x in self.flattened_inputs if isinstance(x, torch.Tensor)] - ) - self.inputs_schema = None - else: - for input in self.flattened_inputs: - if not isinstance(input, torch.Tensor): - raise ValueError( - "Inputs for tracing must only contain tensors. " - f"Got a {type(input)} instead." - ) - - def forward(self, *args: torch.Tensor): - with torch.no_grad(), patch_builtin_len(): - if self.inputs_schema is not None: - inputs_orig_format = self.inputs_schema(args) - else: - if len(args) != len(self.flattened_inputs) or any( - x is not y for x, y in zip(args, self.flattened_inputs) - ): - raise ValueError( - "TracingAdapter does not contain valid inputs_schema." - " So it cannot generalize to other inputs and must be" - " traced with `.flattened_inputs`." 
- ) - inputs_orig_format = self.inputs - - outputs = self.inference_func(self.model, *inputs_orig_format) - flattened_outputs, schema = flatten_to_tuple(outputs) - - flattened_output_tensors = tuple( - [x for x in flattened_outputs if isinstance(x, torch.Tensor)] - ) - if len(flattened_output_tensors) < len(flattened_outputs): - if self.allow_non_tensor: - flattened_outputs = flattened_output_tensors - self.outputs_schema = None - else: - raise ValueError( - "Model cannot be traced because some model outputs " - "cannot flatten to tensors." - ) - else: # schema is valid - if self.outputs_schema is None: - self.outputs_schema = schema - else: - assert self.outputs_schema == schema, ( - "Model should always return outputs with the same " - "structure so it can be traced!" - ) - return flattened_outputs - - def _create_wrapper(self, traced_model): - """ - Return a function that has an input/output interface the same as the - original model, but it calls the given traced model under the hood. - """ - - def forward(*args): - flattened_inputs, _ = flatten_to_tuple(args) - flattened_outputs = traced_model(*flattened_inputs) - return self.outputs_schema(flattened_outputs) - - return forward diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/shared.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/shared.py deleted file mode 100755 index 2d0f7bf3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/shared.py +++ /dev/null @@ -1,1034 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import collections -import contextlib -import copy -import functools -import logging -import numpy as np -import os -from typing import Any, Callable, Dict, List, Optional, Tuple, Union -from unittest import mock -import caffe2.python.utils as putils -import torch -import torch.nn.functional as F -from caffe2.proto import caffe2_pb2 -from caffe2.python import core, net_drawer, workspace -from torch.nn.functional import interpolate as interp - -logger = logging.getLogger(__name__) - - -# ==== torch/utils_toffee/cast.py ======================================= - - -def to_device(t, device_str): - """ - This function is a replacement of .to(another_device) such that it allows the - casting to be traced properly by explicitly calling the underlying copy ops. - It also avoids introducing unncessary op when casting to the same device. 
- """ - src = t.device - dst = torch.device(device_str) - - if src == dst: - return t - elif src.type == "cuda" and dst.type == "cpu": - return torch.ops._caffe2.CopyGPUToCPU(t) - elif src.type == "cpu" and dst.type == "cuda": - return torch.ops._caffe2.CopyCPUToGPU(t) - else: - raise RuntimeError("Can't cast tensor from device {} to device {}".format(src, dst)) - - -# ==== torch/utils_toffee/interpolate.py ======================================= - - -# Note: borrowed from vision/detection/fair/detectron/detectron/modeling/detector.py -def BilinearInterpolation(tensor_in, up_scale): - assert up_scale % 2 == 0, "Scale should be even" - - def upsample_filt(size): - factor = (size + 1) // 2 - if size % 2 == 1: - center = factor - 1 - else: - center = factor - 0.5 - - og = np.ogrid[:size, :size] - return (1 - abs(og[0] - center) / factor) * (1 - abs(og[1] - center) / factor) - - kernel_size = int(up_scale) * 2 - bil_filt = upsample_filt(kernel_size) - - dim = int(tensor_in.shape[1]) - kernel = np.zeros((dim, dim, kernel_size, kernel_size), dtype=np.float32) - kernel[range(dim), range(dim), :, :] = bil_filt - - tensor_out = F.conv_transpose2d( - tensor_in, - weight=to_device(torch.Tensor(kernel), tensor_in.device), - bias=None, - stride=int(up_scale), - padding=int(up_scale / 2), - ) - - return tensor_out - - -# NOTE: ONNX is incompatible with traced torch.nn.functional.interpolate if -# using dynamic `scale_factor` rather than static `size`. (T43166860) -# NOTE: Caffe2 Int8 conversion might not be able to quantize `size` properly. -def onnx_compatibale_interpolate( - input, size=None, scale_factor=None, mode="nearest", align_corners=None -): - # NOTE: The input dimensions are interpreted in the form: - # `mini-batch x channels x [optional depth] x [optional height] x width`. - if size is None and scale_factor is not None: - if input.dim() == 4: - if isinstance(scale_factor, (int, float)): - height_scale, width_scale = (scale_factor, scale_factor) - else: - assert isinstance(scale_factor, (tuple, list)) - assert len(scale_factor) == 2 - height_scale, width_scale = scale_factor - - assert not align_corners, "No matching C2 op for align_corners == True" - if mode == "nearest": - return torch.ops._caffe2.ResizeNearest( - input, order="NCHW", width_scale=width_scale, height_scale=height_scale - ) - elif mode == "bilinear": - logger.warning( - "Use F.conv_transpose2d for bilinear interpolate" - " because there's no such C2 op, this may cause significant" - " slowdown and the boundary pixels won't be as same as" - " using F.interpolate due to padding." 
- ) - assert height_scale == width_scale - return BilinearInterpolation(input, up_scale=height_scale) - logger.warning("Output size is not static, it might cause ONNX conversion issue") - - return interp(input, size, scale_factor, mode, align_corners) - - -@contextlib.contextmanager -def mock_torch_nn_functional_interpolate(): - if torch.onnx.is_in_onnx_export(): - with mock.patch( - "torch.nn.functional.interpolate", side_effect=onnx_compatibale_interpolate - ): - yield - else: - yield - - -# ==== torch/utils_caffe2/ws_utils.py ========================================== - - -class ScopedWS(object): - def __init__(self, ws_name, is_reset, is_cleanup=False): - self.ws_name = ws_name - self.is_reset = is_reset - self.is_cleanup = is_cleanup - self.org_ws = "" - - def __enter__(self): - self.org_ws = workspace.CurrentWorkspace() - if self.ws_name is not None: - workspace.SwitchWorkspace(self.ws_name, True) - if self.is_reset: - workspace.ResetWorkspace() - - return workspace - - def __exit__(self, *args): - if self.is_cleanup: - workspace.ResetWorkspace() - if self.ws_name is not None: - workspace.SwitchWorkspace(self.org_ws) - - -def fetch_any_blob(name): - bb = None - try: - bb = workspace.FetchBlob(name) - except TypeError: - bb = workspace.FetchInt8Blob(name) - except Exception as e: - logger.error("Get blob {} error: {}".format(name, e)) - - return bb - - -# ==== torch/utils_caffe2/protobuf.py ========================================== - - -def get_pb_arg(pb, arg_name): - for x in pb.arg: - if x.name == arg_name: - return x - return None - - -def get_pb_arg_valf(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.f if arg is not None else default_val - - -def get_pb_arg_floats(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(map(float, arg.floats)) if arg is not None else default_val - - -def get_pb_arg_ints(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(map(int, arg.ints)) if arg is not None else default_val - - -def get_pb_arg_vali(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.i if arg is not None else default_val - - -def get_pb_arg_vals(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.s if arg is not None else default_val - - -def get_pb_arg_valstrings(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(arg.strings) if arg is not None else default_val - - -def check_set_pb_arg(pb, arg_name, arg_attr, arg_value, allow_override=False): - arg = get_pb_arg(pb, arg_name) - if arg is None: - arg = putils.MakeArgument(arg_name, arg_value) - assert hasattr(arg, arg_attr) - pb.arg.extend([arg]) - if allow_override and getattr(arg, arg_attr) != arg_value: - logger.warning( - "Override argument {}: {} -> {}".format(arg_name, getattr(arg, arg_attr), arg_value) - ) - setattr(arg, arg_attr, arg_value) - else: - assert arg is not None - assert getattr(arg, arg_attr) == arg_value, "Existing value {}, new value {}".format( - getattr(arg, arg_attr), arg_value - ) - - -def _create_const_fill_op_from_numpy(name, tensor, device_option=None): - assert type(tensor) == np.ndarray - kTypeNameMapper = { - np.dtype("float32"): "GivenTensorFill", - np.dtype("int32"): "GivenTensorIntFill", - np.dtype("int64"): "GivenTensorInt64Fill", - np.dtype("uint8"): "GivenTensorStringFill", - } - - args_dict = {} - if tensor.dtype == np.dtype("uint8"): - args_dict.update({"values": [str(tensor.data)], "shape": [1]}) - else: - args_dict.update({"values": tensor, "shape": 
tensor.shape}) - - if device_option is not None: - args_dict["device_option"] = device_option - - return core.CreateOperator(kTypeNameMapper[tensor.dtype], [], [name], **args_dict) - - -def _create_const_fill_op_from_c2_int8_tensor(name, int8_tensor): - assert type(int8_tensor) == workspace.Int8Tensor - kTypeNameMapper = { - np.dtype("int32"): "Int8GivenIntTensorFill", - np.dtype("uint8"): "Int8GivenTensorFill", - } - - tensor = int8_tensor.data - assert tensor.dtype in [np.dtype("uint8"), np.dtype("int32")] - values = tensor.tobytes() if tensor.dtype == np.dtype("uint8") else tensor - - return core.CreateOperator( - kTypeNameMapper[tensor.dtype], - [], - [name], - values=values, - shape=tensor.shape, - Y_scale=int8_tensor.scale, - Y_zero_point=int8_tensor.zero_point, - ) - - -def create_const_fill_op( - name: str, - blob: Union[np.ndarray, workspace.Int8Tensor], - device_option: Optional[caffe2_pb2.DeviceOption] = None, -) -> caffe2_pb2.OperatorDef: - """ - Given a blob object, return the Caffe2 operator that creates this blob - as constant. Currently support NumPy tensor and Caffe2 Int8Tensor. - """ - - tensor_type = type(blob) - assert tensor_type in [ - np.ndarray, - workspace.Int8Tensor, - ], 'Error when creating const fill op for "{}", unsupported blob type: {}'.format( - name, type(blob) - ) - - if tensor_type == np.ndarray: - return _create_const_fill_op_from_numpy(name, blob, device_option) - elif tensor_type == workspace.Int8Tensor: - assert device_option is None - return _create_const_fill_op_from_c2_int8_tensor(name, blob) - - -def construct_init_net_from_params( - params: Dict[str, Any], device_options: Optional[Dict[str, caffe2_pb2.DeviceOption]] = None -) -> caffe2_pb2.NetDef: - """ - Construct the init_net from params dictionary - """ - init_net = caffe2_pb2.NetDef() - device_options = device_options or {} - for name, blob in params.items(): - if isinstance(blob, str): - logger.warning( - ( - "Blob {} with type {} is not supported in generating init net," - " skipped.".format(name, type(blob)) - ) - ) - continue - init_net.op.extend( - [create_const_fill_op(name, blob, device_option=device_options.get(name, None))] - ) - init_net.external_output.append(name) - return init_net - - -def get_producer_map(ssa): - """ - Return dict from versioned blob to (i, j), - where i is index of producer op, j is the index of output of that op. - """ - producer_map = {} - for i in range(len(ssa)): - outputs = ssa[i][1] - for j, outp in enumerate(outputs): - producer_map[outp] = (i, j) - return producer_map - - -def get_consumer_map(ssa): - """ - Return dict from versioned blob to list of (i, j), - where i is index of consumer op, j is the index of input of that op. - """ - consumer_map = collections.defaultdict(list) - for i in range(len(ssa)): - inputs = ssa[i][0] - for j, inp in enumerate(inputs): - consumer_map[inp].append((i, j)) - return consumer_map - - -def get_params_from_init_net( - init_net: caffe2_pb2.NetDef, -) -> [Dict[str, Any], Dict[str, caffe2_pb2.DeviceOption]]: - """ - Take the output blobs from init_net by running it. - Outputs: - params: dict from blob name to numpy array - device_options: dict from blob name to the device option of its creating op - """ - # NOTE: this assumes that the params is determined by producer op with the - # only exception be CopyGPUToCPU which is CUDA op but returns CPU tensor. 
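`get_producer_map` and `get_consumer_map` above invert an SSA description of the net, mapping each versioned blob to the op that writes it and to the ops that read it. A standalone toy example of that inversion, with a hand-written `ssa` list (illustrative only):

```
# Each ssa entry is ([inputs], [outputs]) of versioned blobs for one operator.
import collections

ssa = [
    ([("data", 0)], [("conv1", 0)]),               # op 0
    ([("conv1", 0)], [("relu1", 0)]),              # op 1
    ([("conv1", 0), ("relu1", 0)], [("sum", 0)]),  # op 2
]

producer_map = {}  # versioned blob -> (op index, output index)
for i, (_, outputs) in enumerate(ssa):
    for j, outp in enumerate(outputs):
        producer_map[outp] = (i, j)

consumer_map = collections.defaultdict(list)  # versioned blob -> [(op, input idx)]
for i, (inputs, _) in enumerate(ssa):
    for j, inp in enumerate(inputs):
        consumer_map[inp].append((i, j))

assert producer_map[("sum", 0)] == (2, 0)
assert consumer_map[("conv1", 0)] == [(1, 0), (2, 0)]
```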
- def _get_device_option(producer_op): - if producer_op.type == "CopyGPUToCPU": - return caffe2_pb2.DeviceOption() - else: - return producer_op.device_option - - with ScopedWS("__get_params_from_init_net__", is_reset=True, is_cleanup=True) as ws: - ws.RunNetOnce(init_net) - params = {b: fetch_any_blob(b) for b in init_net.external_output} - ssa, versions = core.get_ssa(init_net) - producer_map = get_producer_map(ssa) - device_options = { - b: _get_device_option(init_net.op[producer_map[(b, versions[b])][0]]) - for b in init_net.external_output - } - return params, device_options - - -def _updater_raise(op, input_types, output_types): - raise RuntimeError( - "Failed to apply updater for op {} given input_types {} and" - " output_types {}".format(op, input_types, output_types) - ) - - -def _generic_status_identifier( - predict_net: caffe2_pb2.NetDef, - status_updater: Callable, - known_status: Dict[Tuple[str, int], Any], -) -> Dict[Tuple[str, int], Any]: - """ - Statically infer the status of each blob, the status can be such as device type - (CPU/GPU), layout (NCHW/NHWC), data type (float32/int8), etc. "Blob" here - is versioned blob (Tuple[str, int]) in the format compatible with ssa. - Inputs: - predict_net: the caffe2 network - status_updater: a callable, given an op and the status of its input/output, - it returns the updated status of input/output. `None` is used for - representing unknown status. - known_status: a dict containing known status, used as initialization. - Outputs: - A dict mapping from versioned blob to its status - """ - ssa, versions = core.get_ssa(predict_net) - versioned_ext_input = [(b, 0) for b in predict_net.external_input] - versioned_ext_output = [(b, versions[b]) for b in predict_net.external_output] - all_versioned_blobs = set().union(*[set(x[0] + x[1]) for x in ssa]) - - allowed_vbs = all_versioned_blobs.union(versioned_ext_input).union(versioned_ext_output) - assert all(k in allowed_vbs for k in known_status) - assert all(v is not None for v in known_status.values()) - _known_status = copy.deepcopy(known_status) - - def _check_and_update(key, value): - assert value is not None - if key in _known_status: - if not _known_status[key] == value: - raise RuntimeError( - "Confilict status for {}, existing status {}, new status {}".format( - key, _known_status[key], value - ) - ) - _known_status[key] = value - - def _update_i(op, ssa_i): - versioned_inputs = ssa_i[0] - versioned_outputs = ssa_i[1] - - inputs_status = [_known_status.get(b, None) for b in versioned_inputs] - outputs_status = [_known_status.get(b, None) for b in versioned_outputs] - - new_inputs_status, new_outputs_status = status_updater(op, inputs_status, outputs_status) - - for versioned_blob, status in zip( - versioned_inputs + versioned_outputs, new_inputs_status + new_outputs_status - ): - if status is not None: - _check_and_update(versioned_blob, status) - - for op, ssa_i in zip(predict_net.op, ssa): - _update_i(op, ssa_i) - for op, ssa_i in zip(reversed(predict_net.op), reversed(ssa)): - _update_i(op, ssa_i) - - # NOTE: This strictly checks all the blob from predict_net must be assgined - # a known status. However sometimes it's impossible (eg. having deadend op), - # we may relax this constraint if - for k in all_versioned_blobs: - if k not in _known_status: - raise NotImplementedError( - "Can not infer the status for {}. 
Currently only support the case where" - " a single forward and backward pass can identify status for all blobs.".format(k) - ) - - return _known_status - - -def infer_device_type( - predict_net: caffe2_pb2.NetDef, - known_status: Dict[Tuple[str, int], Any], - device_name_style: str = "caffe2", -) -> Dict[Tuple[str, int], str]: - """Return the device type ("cpu" or "gpu"/"cuda") of each (versioned) blob""" - - assert device_name_style in ["caffe2", "pytorch"] - _CPU_STR = "cpu" - _GPU_STR = "gpu" if device_name_style == "caffe2" else "cuda" - - def _copy_cpu_to_gpu_updater(op, input_types, output_types): - if input_types[0] == _GPU_STR or output_types[0] == _CPU_STR: - _updater_raise(op, input_types, output_types) - return ([_CPU_STR], [_GPU_STR]) - - def _copy_gpu_to_cpu_updater(op, input_types, output_types): - if input_types[0] == _CPU_STR or output_types[0] == _GPU_STR: - _updater_raise(op, input_types, output_types) - return ([_GPU_STR], [_CPU_STR]) - - def _other_ops_updater(op, input_types, output_types): - non_none_types = [x for x in input_types + output_types if x is not None] - if len(non_none_types) > 0: - the_type = non_none_types[0] - if not all(x == the_type for x in non_none_types): - _updater_raise(op, input_types, output_types) - else: - the_type = None - return ([the_type for _ in op.input], [the_type for _ in op.output]) - - def _device_updater(op, *args, **kwargs): - return { - "CopyCPUToGPU": _copy_cpu_to_gpu_updater, - "CopyGPUToCPU": _copy_gpu_to_cpu_updater, - }.get(op.type, _other_ops_updater)(op, *args, **kwargs) - - return _generic_status_identifier(predict_net, _device_updater, known_status) - - -# ==== torch/utils_caffe2/vis.py =============================================== - - -def _modify_blob_names(ops, blob_rename_f): - ret = [] - - def _replace_list(blob_list, replaced_list): - del blob_list[:] - blob_list.extend(replaced_list) - - for x in ops: - cur = copy.deepcopy(x) - _replace_list(cur.input, list(map(blob_rename_f, cur.input))) - _replace_list(cur.output, list(map(blob_rename_f, cur.output))) - ret.append(cur) - - return ret - - -def _rename_blob(name, blob_sizes, blob_ranges): - def _list_to_str(bsize): - ret = ", ".join([str(x) for x in bsize]) - ret = "[" + ret + "]" - return ret - - ret = name - if blob_sizes is not None and name in blob_sizes: - ret += "\n" + _list_to_str(blob_sizes[name]) - if blob_ranges is not None and name in blob_ranges: - ret += "\n" + _list_to_str(blob_ranges[name]) - - return ret - - -# graph_name could not contain word 'graph' -def save_graph(net, file_name, graph_name="net", op_only=True, blob_sizes=None, blob_ranges=None): - blob_rename_f = functools.partial(_rename_blob, blob_sizes=blob_sizes, blob_ranges=blob_ranges) - return save_graph_base(net, file_name, graph_name, op_only, blob_rename_f) - - -def save_graph_base(net, file_name, graph_name="net", op_only=True, blob_rename_func=None): - graph = None - ops = net.op - if blob_rename_func is not None: - ops = _modify_blob_names(ops, blob_rename_func) - if not op_only: - graph = net_drawer.GetPydotGraph(ops, graph_name, rankdir="TB") - else: - graph = net_drawer.GetPydotGraphMinimal( - ops, graph_name, rankdir="TB", minimal_dependency=True - ) - - try: - par_dir = os.path.dirname(file_name) - if not os.path.exists(par_dir): - os.makedirs(par_dir) - - format = os.path.splitext(os.path.basename(file_name))[-1] - if format == ".png": - graph.write_png(file_name) - elif format == ".pdf": - graph.write_pdf(file_name) - elif format == ".svg": - graph.write_svg(file_name) 
- else: - print("Incorrect format {}".format(format)) - except Exception as e: - print("Error when writing graph to image {}".format(e)) - - return graph - - -# ==== torch/utils_toffee/aten_to_caffe2.py ==================================== - - -def group_norm_replace_aten_with_caffe2(predict_net: caffe2_pb2.NetDef): - """ - For ONNX exported model, GroupNorm will be represented as ATen op, - this can be a drop in replacement from ATen to GroupNorm - """ - count = 0 - for op in predict_net.op: - if op.type == "ATen": - op_name = get_pb_arg_vals(op, "operator", None) # return byte in py3 - if op_name and op_name.decode() == "group_norm": - op.arg.remove(get_pb_arg(op, "operator")) - - if get_pb_arg_vali(op, "cudnn_enabled", None): - op.arg.remove(get_pb_arg(op, "cudnn_enabled")) - - num_groups = get_pb_arg_vali(op, "num_groups", None) - if num_groups is not None: - op.arg.remove(get_pb_arg(op, "num_groups")) - check_set_pb_arg(op, "group", "i", num_groups) - - op.type = "GroupNorm" - count += 1 - if count > 1: - logger.info("Replaced {} ATen operator to GroupNormOp".format(count)) - - -# ==== torch/utils_toffee/alias.py ============================================= - - -def alias(x, name, is_backward=False): - if not torch.onnx.is_in_onnx_export(): - return x - assert isinstance(x, torch.Tensor) - return torch.ops._caffe2.AliasWithName(x, name, is_backward=is_backward) - - -def fuse_alias_placeholder(predict_net, init_net): - """Remove AliasWithName placeholder and rename the input/output of it""" - # First we finish all the re-naming - for i, op in enumerate(predict_net.op): - if op.type == "AliasWithName": - assert len(op.input) == 1 - assert len(op.output) == 1 - name = get_pb_arg_vals(op, "name", None).decode() - is_backward = bool(get_pb_arg_vali(op, "is_backward", 0)) - rename_op_input(predict_net, init_net, i, 0, name, from_producer=is_backward) - rename_op_output(predict_net, i, 0, name) - - # Remove AliasWithName, should be very safe since it's a non-op - new_ops = [] - for op in predict_net.op: - if op.type != "AliasWithName": - new_ops.append(op) - else: - # safety check - assert op.input == op.output - assert op.input[0] == op.arg[0].s.decode() - del predict_net.op[:] - predict_net.op.extend(new_ops) - - -# ==== torch/utils_caffe2/graph_transform.py =================================== - - -class IllegalGraphTransformError(ValueError): - """When a graph transform function call can't be executed.""" - - -def _rename_versioned_blob_in_proto( - proto: caffe2_pb2.NetDef, - old_name: str, - new_name: str, - version: int, - ssa: List[Tuple[List[Tuple[str, int]], List[Tuple[str, int]]]], - start_versions: Dict[str, int], - end_versions: Dict[str, int], -): - """In given proto, rename all blobs with matched version""" - # Operater list - for op, i_th_ssa in zip(proto.op, ssa): - versioned_inputs, versioned_outputs = i_th_ssa - for i in range(len(op.input)): - if versioned_inputs[i] == (old_name, version): - op.input[i] = new_name - for i in range(len(op.output)): - if versioned_outputs[i] == (old_name, version): - op.output[i] = new_name - # external_input - if start_versions.get(old_name, 0) == version: - for i in range(len(proto.external_input)): - if proto.external_input[i] == old_name: - proto.external_input[i] = new_name - # external_output - if end_versions.get(old_name, 0) == version: - for i in range(len(proto.external_output)): - if proto.external_output[i] == old_name: - proto.external_output[i] = new_name - - -def rename_op_input( - predict_net: caffe2_pb2.NetDef, - init_net: 
caffe2_pb2.NetDef,
-    op_id: int,
-    input_id: int,
-    new_name: str,
-    from_producer: bool = False,
-):
-    """
-    Rename the op_id-th operator in predict_net, changing its input_id-th input's
-    name to the new_name. It also does automatic re-routing and changes
-    external_input and init_net if necessary.
-    - It requires the input is only consumed by this op.
-    - This function modifies predict_net and init_net in-place.
-    - When from_producer is enabled, this also updates other operators that consume
-    the same input. Be cautious, because this may trigger unintended behavior.
-    """
-    assert isinstance(predict_net, caffe2_pb2.NetDef)
-    assert isinstance(init_net, caffe2_pb2.NetDef)
-
-    init_net_ssa, init_net_versions = core.get_ssa(init_net)
-    predict_net_ssa, predict_net_versions = core.get_ssa(
-        predict_net, copy.deepcopy(init_net_versions)
-    )
-
-    versioned_inputs, versioned_outputs = predict_net_ssa[op_id]
-    old_name, version = versioned_inputs[input_id]
-
-    if from_producer:
-        producer_map = get_producer_map(predict_net_ssa)
-        if not (old_name, version) in producer_map:
-            raise NotImplementedError(
-                "Can't find producer, the input {} is probably from"
-                " init_net, this is not supported yet.".format(old_name)
-            )
-        producer = producer_map[(old_name, version)]
-        rename_op_output(predict_net, producer[0], producer[1], new_name)
-        return
-
-    def contain_targets(op_ssa):
-        return (old_name, version) in op_ssa[0]
-
-    is_consumer = [contain_targets(op_ssa) for op_ssa in predict_net_ssa]
-    if sum(is_consumer) > 1:
-        raise IllegalGraphTransformError(
-            (
-                "Input '{}' of operator(#{}) is consumed by other ops, please use"
-                + " rename_op_output on the producer instead. Offending op: \n{}"
-            ).format(old_name, op_id, predict_net.op[op_id])
-        )
-
-    # update init_net
-    _rename_versioned_blob_in_proto(
-        init_net, old_name, new_name, version, init_net_ssa, {}, init_net_versions
-    )
-    # update predict_net
-    _rename_versioned_blob_in_proto(
-        predict_net,
-        old_name,
-        new_name,
-        version,
-        predict_net_ssa,
-        init_net_versions,
-        predict_net_versions,
-    )
-
-
-def rename_op_output(predict_net: caffe2_pb2.NetDef, op_id: int, output_id: int, new_name: str):
-    """
-    Rename the op_id-th operator in predict_net, changing its output_id-th output's
-    name to the new_name. It also does automatic re-routing and changes
-    external_output if necessary.
-    - It allows multiple consumers of its output.
-    - This function modifies predict_net in-place and doesn't need init_net.
-    """
-    assert isinstance(predict_net, caffe2_pb2.NetDef)
-
-    ssa, blob_versions = core.get_ssa(predict_net)
-
-    versioned_inputs, versioned_outputs = ssa[op_id]
-    old_name, version = versioned_outputs[output_id]
-
-    # update predict_net
-    _rename_versioned_blob_in_proto(
-        predict_net, old_name, new_name, version, ssa, {}, blob_versions
-    )
-
-
-def get_sub_graph_external_input_output(
-    predict_net: caffe2_pb2.NetDef, sub_graph_op_indices: List[int]
-) -> Tuple[List[Tuple[str, int]], List[Tuple[str, int]]]:
-    """
-    Return the list of external inputs/outputs of the sub-graph;
-    each element is a tuple of the name and corresponding version in predict_net.
-
-    external input/output is defined the same way as caffe2 NetDef.
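`get_sub_graph_external_input_output` computes sub-graph boundaries from the same SSA view: external inputs are blobs read inside the sub-graph but never produced there, and external outputs are blobs produced inside and consumed outside. A toy sketch of that computation with a hand-written SSA (illustrative only):

```
# Hand-written SSA for a 4-op chain; ops 1 and 2 form the sub-graph of interest.
ssa = [
    ([("x", 0)], [("a", 0)]),  # op 0
    ([("a", 0)], [("b", 0)]),  # op 1  (in sub-graph)
    ([("b", 0)], [("c", 0)]),  # op 2  (in sub-graph)
    ([("c", 0)], [("y", 0)]),  # op 3
]
sub = [1, 2]

ins = [inp for op_id in sub for inp in ssa[op_id][0]]
outs = [outp for op_id in sub for outp in ssa[op_id][1]]

# read inside but never produced inside -> external inputs
ext_inputs = [inp for inp in ins if inp not in outs]
# produced inside and read by ops outside the sub-graph -> external outputs
outside_reads = [inp for k, op in enumerate(ssa) if k not in sub for inp in op[0]]
ext_outputs = [outp for outp in outs if outp in outside_reads]

assert ext_inputs == [("a", 0)] and ext_outputs == [("c", 0)]
```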
- """ - ssa, versions = core.get_ssa(predict_net) - - all_inputs = [] - all_outputs = [] - for op_id in sub_graph_op_indices: - all_inputs += [inp for inp in ssa[op_id][0] if inp not in all_inputs] - all_outputs += list(ssa[op_id][1]) # ssa output won't repeat - - # for versioned blobs, external inputs are just those blob in all_inputs - # but not in all_outputs - ext_inputs = [inp for inp in all_inputs if inp not in all_outputs] - - # external outputs are essentially outputs of this subgraph that are used - # outside of this sub-graph (including predict_net.external_output) - all_other_inputs = sum( - (ssa[i][0] for i in range(len(ssa)) if i not in sub_graph_op_indices), - [(outp, versions[outp]) for outp in predict_net.external_output], - ) - ext_outputs = [outp for outp in all_outputs if outp in set(all_other_inputs)] - - return ext_inputs, ext_outputs - - -class DiGraph: - """A DAG representation of caffe2 graph, each vertice is a versioned blob.""" - - def __init__(self): - self.vertices = set() - self.graph = collections.defaultdict(list) - - def add_edge(self, u, v): - self.graph[u].append(v) - self.vertices.add(u) - self.vertices.add(v) - - # grab from https://www.geeksforgeeks.org/find-paths-given-source-destination/ - def get_all_paths(self, s, d): - visited = {k: False for k in self.vertices} - path = [] - all_paths = [] - - def _get_all_paths_util(graph, u, d, visited, path): - visited[u] = True - path.append(u) - if u == d: - all_paths.append(copy.deepcopy(path)) - else: - for i in graph[u]: - if not visited[i]: - _get_all_paths_util(graph, i, d, visited, path) - path.pop() - visited[u] = False - - _get_all_paths_util(self.graph, s, d, visited, path) - return all_paths - - @staticmethod - def from_ssa(ssa): - graph = DiGraph() - for op_id in range(len(ssa)): - for inp in ssa[op_id][0]: - for outp in ssa[op_id][1]: - graph.add_edge(inp, outp) - return graph - - -def _get_dependency_chain(ssa, versioned_target, versioned_source): - """ - Return the index list of relevant operator to produce target blob from source blob, - if there's no dependency, return empty list. - """ - - # finding all paths between nodes can be O(N!), thus we can only search - # in the subgraph using the op starting from the first consumer of source blob - # to the producer of the target blob. - consumer_map = get_consumer_map(ssa) - producer_map = get_producer_map(ssa) - start_op = min(x[0] for x in consumer_map[versioned_source]) - 15 - end_op = ( - producer_map[versioned_target][0] + 15 if versioned_target in producer_map else start_op - ) - sub_graph_ssa = ssa[start_op : end_op + 1] - if len(sub_graph_ssa) > 30: - logger.warning( - "Subgraph bebetween {} and {} is large (from op#{} to op#{}), it" - " might take non-trival time to find all paths between them.".format( - versioned_source, versioned_target, start_op, end_op - ) - ) - - dag = DiGraph.from_ssa(sub_graph_ssa) - paths = dag.get_all_paths(versioned_source, versioned_target) # include two ends - ops_in_paths = [[producer_map[blob][0] for blob in path[1:]] for path in paths] - return sorted(set().union(*[set(ops) for ops in ops_in_paths])) - - -def identify_reshape_sub_graph(predict_net: caffe2_pb2.NetDef) -> List[List[int]]: - """ - Idenfity the reshape sub-graph in a protobuf. - The reshape sub-graph is defined as matching the following pattern: - - (input_blob) -> Op_1 -> ... 
-> Op_N -> (new_shape) -─┐ - └-------------------------------------------> Reshape -> (output_blob) - - Return: - List of sub-graphs, each sub-graph is represented as a list of indices - of the relavent ops, [Op_1, Op_2, ..., Op_N, Reshape] - """ - - ssa, _ = core.get_ssa(predict_net) - - ret = [] - for i, op in enumerate(predict_net.op): - if op.type == "Reshape": - assert len(op.input) == 2 - input_ssa = ssa[i][0] - data_source = input_ssa[0] - shape_source = input_ssa[1] - op_indices = _get_dependency_chain(ssa, shape_source, data_source) - ret.append(op_indices + [i]) - return ret - - -def remove_reshape_for_fc(predict_net, params): - """ - In PyTorch nn.Linear has to take 2D tensor, this often leads to reshape - a 4D tensor to 2D by calling .view(). However this (dynamic) reshaping - doesn't work well with ONNX and Int8 tools, and cause using extra - ops (eg. ExpandDims) that might not be available on mobile. - Luckily Caffe2 supports 4D tensor for FC, so we can remove those reshape - after exporting ONNX model. - """ - from caffe2.python import core - - # find all reshape sub-graph that can be removed, which is now all Reshape - # sub-graph whose output is only consumed by FC. - # TODO: to make it safer, we may need the actually value to better determine - # if a Reshape before FC is removable. - reshape_sub_graphs = identify_reshape_sub_graph(predict_net) - sub_graphs_to_remove = [] - for reshape_sub_graph in reshape_sub_graphs: - reshape_op_id = reshape_sub_graph[-1] - assert predict_net.op[reshape_op_id].type == "Reshape" - ssa, _ = core.get_ssa(predict_net) - reshape_output = ssa[reshape_op_id][1][0] - consumers = [i for i in range(len(ssa)) if reshape_output in ssa[i][0]] - if all(predict_net.op[consumer].type == "FC" for consumer in consumers): - # safety check if the sub-graph is isolated, for this reshape sub-graph, - # it means it has one non-param external input and one external output. - ext_inputs, ext_outputs = get_sub_graph_external_input_output( - predict_net, reshape_sub_graph - ) - non_params_ext_inputs = [inp for inp in ext_inputs if inp[1] != 0] - if len(non_params_ext_inputs) == 1 and len(ext_outputs) == 1: - sub_graphs_to_remove.append(reshape_sub_graph) - - # perform removing subgraph by: - # 1: rename the Reshape's output to its input, then the graph can be - # seen as in-place itentify, meaning whose external input/output are the same. - # 2: simply remove those ops. 
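`_get_dependency_chain` above relies on `DiGraph.get_all_paths`, a depth-first enumeration of all paths between two versioned blobs. A minimal sketch of that DFS on plain strings, assuming an acyclic graph (the deleted version additionally tracks visited vertices so cyclic inputs cannot recurse forever):

```
# DFS path enumeration; assumes a DAG. Blob names here are illustrative.
import collections
import copy

graph = collections.defaultdict(list)
for u, v in [("in", "a"), ("a", "shape"), ("in", "out"), ("shape", "out")]:
    graph[u].append(v)

def get_all_paths(s, d, path=None, all_paths=None):
    path = (path or []) + [s]
    all_paths = all_paths if all_paths is not None else []
    if s == d:
        all_paths.append(copy.deepcopy(path))
    else:
        for nxt in graph[s]:
            get_all_paths(nxt, d, path, all_paths)
    return all_paths

print(get_all_paths("in", "out"))
# [['in', 'a', 'shape', 'out'], ['in', 'out']]
```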
- remove_op_ids = [] - params_to_remove = [] - for sub_graph in sub_graphs_to_remove: - logger.info( - "Remove Reshape sub-graph:\n{}".format( - "".join(["(#{:>4})\n{}".format(i, predict_net.op[i]) for i in sub_graph]) - ) - ) - reshape_op_id = sub_graph[-1] - new_reshap_output = predict_net.op[reshape_op_id].input[0] - rename_op_output(predict_net, reshape_op_id, 0, new_reshap_output) - ext_inputs, ext_outputs = get_sub_graph_external_input_output(predict_net, sub_graph) - non_params_ext_inputs = [inp for inp in ext_inputs if inp[1] != 0] - params_ext_inputs = [inp for inp in ext_inputs if inp[1] == 0] - assert len(non_params_ext_inputs) == 1 and len(ext_outputs) == 1 - assert ext_outputs[0][0] == non_params_ext_inputs[0][0] - assert ext_outputs[0][1] == non_params_ext_inputs[0][1] + 1 - remove_op_ids.extend(sub_graph) - params_to_remove.extend(params_ext_inputs) - - predict_net = copy.deepcopy(predict_net) - new_ops = [op for i, op in enumerate(predict_net.op) if i not in remove_op_ids] - del predict_net.op[:] - predict_net.op.extend(new_ops) - for versioned_params in params_to_remove: - name = versioned_params[0] - logger.info("Remove params: {} from init_net and predict_net.external_input".format(name)) - del params[name] - predict_net.external_input.remove(name) - - return predict_net, params - - -def fuse_copy_between_cpu_and_gpu(predict_net: caffe2_pb2.NetDef): - """ - In-place fuse extra copy ops between cpu/gpu for the following case: - a -CopyAToB-> b -CopyBToA> c1 -NextOp1-> d1 - -CopyBToA> c2 -NextOp2-> d2 - The fused network will look like: - a -NextOp1-> d1 - -NextOp2-> d2 - """ - - _COPY_OPS = ["CopyCPUToGPU", "CopyGPUToCPU"] - - def _fuse_once(predict_net): - ssa, blob_versions = core.get_ssa(predict_net) - consumer_map = get_consumer_map(ssa) - versioned_external_output = [ - (name, blob_versions[name]) for name in predict_net.external_output - ] - - for op_id, op in enumerate(predict_net.op): - if op.type in _COPY_OPS: - fw_copy_versioned_output = ssa[op_id][1][0] - consumer_ids = [x[0] for x in consumer_map[fw_copy_versioned_output]] - reverse_op_type = _COPY_OPS[1 - _COPY_OPS.index(op.type)] - - is_fusable = ( - len(consumer_ids) > 0 - and fw_copy_versioned_output not in versioned_external_output - and all( - predict_net.op[_op_id].type == reverse_op_type - and ssa[_op_id][1][0] not in versioned_external_output - for _op_id in consumer_ids - ) - ) - - if is_fusable: - for rv_copy_op_id in consumer_ids: - # making each NextOp uses "a" directly and removing Copy ops - rs_copy_versioned_output = ssa[rv_copy_op_id][1][0] - next_op_id, inp_id = consumer_map[rs_copy_versioned_output][0] - predict_net.op[next_op_id].input[inp_id] = op.input[0] - # remove CopyOps - new_ops = [ - op - for i, op in enumerate(predict_net.op) - if i != op_id and i not in consumer_ids - ] - del predict_net.op[:] - predict_net.op.extend(new_ops) - return True - - return False - - # _fuse_once returns False is nothing can be fused - while _fuse_once(predict_net): - pass - - -def remove_dead_end_ops(net_def: caffe2_pb2.NetDef): - """remove ops if its output is not used or not in external_output""" - ssa, versions = core.get_ssa(net_def) - versioned_external_output = [(name, versions[name]) for name in net_def.external_output] - consumer_map = get_consumer_map(ssa) - removed_op_ids = set() - - def _is_dead_end(versioned_blob): - return not ( - versioned_blob in versioned_external_output - or ( - len(consumer_map[versioned_blob]) > 0 - and all(x[0] not in removed_op_ids for x in 
consumer_map[versioned_blob]) - ) - ) - - for i, ssa_i in reversed(list(enumerate(ssa))): - versioned_outputs = ssa_i[1] - if all(_is_dead_end(outp) for outp in versioned_outputs): - removed_op_ids.add(i) - - # simply removing those deadend ops should have no effect to external_output - new_ops = [op for i, op in enumerate(net_def.op) if i not in removed_op_ids] - del net_def.op[:] - net_def.op.extend(new_ops) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript.py deleted file mode 100755 index 24fe59bd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript.py +++ /dev/null @@ -1,132 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import os -import torch - -from detectron2.utils.file_io import PathManager - -from .torchscript_patch import freeze_training_mode, patch_instances - -__all__ = ["scripting_with_instances", "dump_torchscript_IR"] - - -def scripting_with_instances(model, fields): - """ - Run :func:`torch.jit.script` on a model that uses the :class:`Instances` class. Since - attributes of :class:`Instances` are "dynamically" added in eager mode,it is difficult - for scripting to support it out of the box. This function is made to support scripting - a model that uses :class:`Instances`. It does the following: - - 1. Create a scriptable ``new_Instances`` class which behaves similarly to ``Instances``, - but with all attributes been "static". - The attributes need to be statically declared in the ``fields`` argument. - 2. Register ``new_Instances``, and force scripting compiler to - use it when trying to compile ``Instances``. - - After this function, the process will be reverted. User should be able to script another model - using different fields. - - Example: - Assume that ``Instances`` in the model consist of two attributes named - ``proposal_boxes`` and ``objectness_logits`` with type :class:`Boxes` and - :class:`Tensor` respectively during inference. You can call this function like: - :: - fields = {"proposal_boxes": Boxes, "objectness_logits": torch.Tensor} - torchscipt_model = scripting_with_instances(model, fields) - - Note: - It only support models in evaluation mode. - - Args: - model (nn.Module): The input model to be exported by scripting. - fields (Dict[str, type]): Attribute names and corresponding type that - ``Instances`` will use in the model. Note that all attributes used in ``Instances`` - need to be added, regardless of whether they are inputs/outputs of the model. - Data type not defined in detectron2 is not supported for now. - - Returns: - torch.jit.ScriptModule: the model in torchscript format - """ - assert ( - not model.training - ), "Currently we only support exporting models in evaluation mode to torchscript" - - with freeze_training_mode(model), patch_instances(fields): - scripted_model = torch.jit.script(model) - return scripted_model - - -# alias for old name -export_torchscript_with_instances = scripting_with_instances - - -def dump_torchscript_IR(model, dir): - """ - Dump IR of a TracedModule/ScriptModule/Function in various format (code, graph, - inlined graph). Useful for debugging. - - Args: - model (TracedModule/ScriptModule/ScriptFUnction): traced or scripted module - dir (str): output directory to dump files. 
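`scripting_with_instances` and `dump_torchscript_IR` above wrap `torch.jit.script` plus IR inspection. A hedged usage sketch on a toy module (`Toy` is a stand-in; a real detectron2 model would additionally need the `fields` patching for `Instances`):

```
# Toy module standing in for a detectron2 model; Toy is illustrative only.
import torch
from torch import nn

class Toy(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x) + 1.0

model = Toy().eval()  # the deleted helper likewise requires evaluation mode
scripted = torch.jit.script(model)
assert torch.equal(scripted(torch.zeros(2)), torch.ones(2))

# dump_torchscript_IR writes dumps like these to files for debugging:
print(scripted.code)           # pretty-printed TorchScript code
print(scripted.inlined_graph)  # IR with all submodules inlined
```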
- """ - dir = os.path.expanduser(dir) - PathManager.mkdirs(dir) - - def _get_script_mod(mod): - if isinstance(mod, torch.jit.TracedModule): - return mod._actual_script_module - return mod - - # Dump pretty-printed code: https://pytorch.org/docs/stable/jit.html#inspecting-code - with PathManager.open(os.path.join(dir, "model_ts_code.txt"), "w") as f: - - def get_code(mod): - # Try a few ways to get code using private attributes. - try: - # This contains more information than just `mod.code` - return _get_script_mod(mod)._c.code - except AttributeError: - pass - try: - return mod.code - except AttributeError: - return None - - def dump_code(prefix, mod): - code = get_code(mod) - name = prefix or "root model" - if code is None: - f.write(f"Could not found code for {name} (type={mod.original_name})\n") - f.write("\n") - else: - f.write(f"\nCode for {name}, type={mod.original_name}:\n") - f.write(code) - f.write("\n") - f.write("-" * 80) - - for name, m in mod.named_children(): - dump_code(prefix + "." + name, m) - - if isinstance(model, torch.jit.ScriptFunction): - f.write(get_code(model)) - else: - dump_code("", model) - - def _get_graph(model): - try: - # Recursively dump IR of all modules - return _get_script_mod(model)._c.dump_to_str(True, False, False) - except AttributeError: - return model.graph.str() - - with PathManager.open(os.path.join(dir, "model_ts_IR.txt"), "w") as f: - f.write(_get_graph(model)) - - # Dump IR of the entire graph (all submodules inlined) - with PathManager.open(os.path.join(dir, "model_ts_IR_inlined.txt"), "w") as f: - f.write(str(model.inlined_graph)) - - if not isinstance(model, torch.jit.ScriptFunction): - # Dump the model structure in pytorch style - with PathManager.open(os.path.join(dir, "model.txt"), "w") as f: - f.write(str(model)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript_patch.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript_patch.py deleted file mode 100755 index da9b324f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/export/torchscript_patch.py +++ /dev/null @@ -1,406 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import os -import sys -import tempfile -from contextlib import ExitStack, contextmanager -from copy import deepcopy -from unittest import mock -import torch -from torch import nn - -# need some explicit imports due to https://github.com/pytorch/pytorch/issues/38964 -import detectron2 # noqa F401 -from detectron2.structures import Boxes, Instances -from detectron2.utils.env import _import_file - -_counter = 0 - - -def _clear_jit_cache(): - from torch.jit._recursive import concrete_type_store - from torch.jit._state import _jit_caching_layer - - concrete_type_store.type_store.clear() # for modules - _jit_caching_layer.clear() # for free functions - - -def _add_instances_conversion_methods(newInstances): - """ - Add from_instances methods to the scripted Instances class. 
- """ - cls_name = newInstances.__name__ - - @torch.jit.unused - def from_instances(instances: Instances): - """ - Create scripted Instances from original Instances - """ - fields = instances.get_fields() - image_size = instances.image_size - ret = newInstances(image_size) - for name, val in fields.items(): - assert hasattr(ret, f"_{name}"), f"No attribute named {name} in {cls_name}" - setattr(ret, name, deepcopy(val)) - return ret - - newInstances.from_instances = from_instances - - -@contextmanager -def patch_instances(fields): - """ - A contextmanager, under which the Instances class in detectron2 is replaced - by a statically-typed scriptable class, defined by `fields`. - See more in `scripting_with_instances`. - """ - - with tempfile.TemporaryDirectory(prefix="detectron2") as dir, tempfile.NamedTemporaryFile( - mode="w", encoding="utf-8", suffix=".py", dir=dir, delete=False - ) as f: - try: - # Objects that use Instances should not reuse previously-compiled - # results in cache, because `Instances` could be a new class each time. - _clear_jit_cache() - - cls_name, s = _gen_instance_module(fields) - f.write(s) - f.flush() - f.close() - - module = _import(f.name) - new_instances = getattr(module, cls_name) - _ = torch.jit.script(new_instances) - # let torchscript think Instances was scripted already - Instances.__torch_script_class__ = True - # let torchscript find new_instances when looking for the jit type of Instances - Instances._jit_override_qualname = torch._jit_internal._qualified_name(new_instances) - - _add_instances_conversion_methods(new_instances) - yield new_instances - finally: - try: - del Instances.__torch_script_class__ - del Instances._jit_override_qualname - except AttributeError: - pass - sys.modules.pop(module.__name__) - - -def _gen_instance_class(fields): - """ - Args: - fields (dict[name: type]) - """ - - class _FieldType: - def __init__(self, name, type_): - assert isinstance(name, str), f"Field name must be str, got {name}" - self.name = name - self.type_ = type_ - self.annotation = f"{type_.__module__}.{type_.__name__}" - - fields = [_FieldType(k, v) for k, v in fields.items()] - - def indent(level, s): - return " " * 4 * level + s - - lines = [] - - global _counter - _counter += 1 - - cls_name = "ScriptedInstances{}".format(_counter) - - field_names = tuple(x.name for x in fields) - extra_args = ", ".join([f"{f.name}: Optional[{f.annotation}] = None" for f in fields]) - lines.append( - f""" -class {cls_name}: - def __init__(self, image_size: Tuple[int, int], {extra_args}): - self.image_size = image_size - self._field_names = {field_names} -""" - ) - - for f in fields: - lines.append( - indent(2, f"self._{f.name} = torch.jit.annotate(Optional[{f.annotation}], {f.name})") - ) - - for f in fields: - lines.append( - f""" - @property - def {f.name}(self) -> {f.annotation}: - # has to use a local for type refinement - # https://pytorch.org/docs/stable/jit_language_reference.html#optional-type-refinement - t = self._{f.name} - assert t is not None, "{f.name} is None and cannot be accessed!" 
- return t - - @{f.name}.setter - def {f.name}(self, value: {f.annotation}) -> None: - self._{f.name} = value -""" - ) - - # support method `__len__` - lines.append( - """ - def __len__(self) -> int: -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - return len(t) -""" - ) - lines.append( - """ - raise NotImplementedError("Empty Instances does not support __len__!") -""" - ) - - # support method `has` - lines.append( - """ - def has(self, name: str) -> bool: -""" - ) - for f in fields: - lines.append( - f""" - if name == "{f.name}": - return self._{f.name} is not None -""" - ) - lines.append( - """ - return False -""" - ) - - # support method `to` - none_args = ", None" * len(fields) - lines.append( - f""" - def to(self, device: torch.device) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - if hasattr(f.type_, "to"): - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret._{f.name} = t.to(device) -""" - ) - else: - # For now, ignore fields that cannot be moved to devices. - # Maybe can support other tensor-like classes (e.g. __torch_function__) - pass - lines.append( - """ - return ret -""" - ) - - # support method `getitem` - none_args = ", None" * len(fields) - lines.append( - f""" - def __getitem__(self, item) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret._{f.name} = t[item] -""" - ) - lines.append( - """ - return ret -""" - ) - - # support method `cat` - # this version does not contain checks that all instances have same size and fields - none_args = ", None" * len(fields) - lines.append( - f""" - def cat(self, instances: List["{cls_name}"]) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - values: List[{f.annotation}] = [x.{f.name} for x in instances] - if torch.jit.isinstance(t, torch.Tensor): - ret._{f.name} = torch.cat(values, dim=0) - else: - ret._{f.name} = t.cat(values) -""" - ) - lines.append( - """ - return ret""" - ) - - # support method `get_fields()` - lines.append( - """ - def get_fields(self) -> Dict[str, Tensor]: - ret = {} - """ - ) - for f in fields: - if f.type_ == Boxes: - stmt = "t.tensor" - elif f.type_ == torch.Tensor: - stmt = "t" - else: - stmt = f'assert False, "unsupported type {str(f.type_)}"' - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret["{f.name}"] = {stmt} - """ - ) - lines.append( - """ - return ret""" - ) - return cls_name, os.linesep.join(lines) - - -def _gen_instance_module(fields): - # TODO: find a more automatic way to enable import of other classes - s = """ -from copy import deepcopy -import torch -from torch import Tensor -import typing -from typing import * - -import detectron2 -from detectron2.structures import Boxes, Instances - -""" - - cls_name, cls_def = _gen_instance_class(fields) - s += cls_def - return cls_name, s - - -def _import(path): - return _import_file( - "{}{}".format(sys.modules[__name__].__name__, _counter), path, make_importable=True - ) - - -@contextmanager -def patch_builtin_len(modules=()): - """ - Patch the builtin len() function of a few detectron2 modules - to use __len__ instead, because __len__ does not convert values to - integers and therefore is friendly to tracing. 
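`_gen_instance_class` and `_gen_instance_module` above build the scripted class as Python source and import it from a real file so TorchScript can inspect the code. A minimal sketch of that generate-write-import loop, assuming a single hardcoded field (`ScriptedToy` and `scores` are illustrative names):

```
# Generate class source, write it to a real .py file (TorchScript needs
# inspectable source), import it, and use it.
import importlib.util
import tempfile

lines = [
    "import torch",
    "",
    "class ScriptedToy:",
    "    def __init__(self, scores: torch.Tensor):",
    "        self._scores = scores",
]

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("\n".join(lines))
    path = f.name

spec = importlib.util.spec_from_file_location("scripted_toy", path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

import torch
obj = module.ScriptedToy(torch.zeros(3))
assert obj._scores.shape == (3,)
```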
-
-    Args:
-        modules (list[str]): names of extra modules to patch len(), in
-            addition to those in detectron2.
-    """
-
-    def _new_len(obj):
-        return obj.__len__()
-
-    with ExitStack() as stack:
-        MODULES = [
-            "detectron2.modeling.roi_heads.fast_rcnn",
-            "detectron2.modeling.roi_heads.mask_head",
-            "detectron2.modeling.roi_heads.keypoint_head",
-        ] + list(modules)
-        ctxs = [stack.enter_context(mock.patch(mod + ".len")) for mod in MODULES]
-        for m in ctxs:
-            m.side_effect = _new_len
-        yield
-
-
-def patch_nonscriptable_classes():
-    """
-    Apply patches on a few nonscriptable detectron2 classes.
-    Should not have side-effects on eager usage.
-    """
-    # __prepare_scriptable__ can also be added to models for easier maintenance.
-    # But it complicates the clean model code.
-
-    from detectron2.modeling.backbone import ResNet, FPN
-
-    # Due to https://github.com/pytorch/pytorch/issues/36061,
-    # we change backbone to use ModuleList for scripting.
-    # (note: this changes param names in state_dict)
-
-    def prepare_resnet(self):
-        ret = deepcopy(self)
-        ret.stages = nn.ModuleList(ret.stages)
-        for k in self.stage_names:
-            delattr(ret, k)
-        return ret
-
-    ResNet.__prepare_scriptable__ = prepare_resnet
-
-    def prepare_fpn(self):
-        ret = deepcopy(self)
-        ret.lateral_convs = nn.ModuleList(ret.lateral_convs)
-        ret.output_convs = nn.ModuleList(ret.output_convs)
-        for name, _ in self.named_children():
-            if name.startswith("fpn_"):
-                delattr(ret, name)
-        return ret
-
-    FPN.__prepare_scriptable__ = prepare_fpn
-
-    # Annotate some attributes to be constants for the purpose of scripting,
-    # even though they are not constants in eager mode.
-    from detectron2.modeling.roi_heads import StandardROIHeads
-
-    if hasattr(StandardROIHeads, "__annotations__"):
-        # copy first to avoid editing annotations of base class
-        StandardROIHeads.__annotations__ = deepcopy(StandardROIHeads.__annotations__)
-        StandardROIHeads.__annotations__["mask_on"] = torch.jit.Final[bool]
-        StandardROIHeads.__annotations__["keypoint_on"] = torch.jit.Final[bool]
-
-
-# These patches are not supposed to have side-effects.
-patch_nonscriptable_classes()
-
-
-@contextmanager
-def freeze_training_mode(model):
-    """
-    A context manager that annotates the "training" attribute of every submodule
-    to constant, so that the training codepath in these modules can be
-    meta-compiled away. Upon exiting, the annotations are reverted.
-    """
-    classes = {type(x) for x in model.modules()}
-    # __constants__ is the old way to annotate constants and is not compatible
-    # with __annotations__.
-    classes = {x for x in classes if not hasattr(x, "__constants__")}
-    for cls in classes:
-        cls.__annotations__["training"] = torch.jit.Final[bool]
-    yield
-    for cls in classes:
-        cls.__annotations__["training"] = bool
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/__init__.py
deleted file mode 100755
index 3d015c53..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/__init__.py
+++ /dev/null
@@ -1,24 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
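
The `torch.jit.Final[bool]` annotation used in `freeze_training_mode` above is TorchScript's documented way of turning an attribute into a compile-time constant, which lets the compiler fold `if self.training` branches away. A minimal sketch of that mechanism in isolation (the `Toy` module is hypothetical):

```
import torch
from torch import nn

class Toy(nn.Module):
    def forward(self, x):
        if self.training:
            x = x * 0.5  # training-only path
        return x + 1

# Annotate `training` as Final before scripting, as freeze_training_mode does;
# assigning a fresh dict avoids mutating nn.Module's inherited annotations.
Toy.__annotations__ = {"training": torch.jit.Final[bool]}
scripted = torch.jit.script(Toy().eval())
print(scripted.graph)  # the training branch should be compiled away
```
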
-from .batch_norm import FrozenBatchNorm2d, get_norm, NaiveSyncBatchNorm, CycleBatchNormList -from .deform_conv import DeformConv, ModulatedDeformConv -from .mask_ops import paste_masks_in_image -from .nms import batched_nms, batched_nms_rotated, nms, nms_rotated -from .roi_align import ROIAlign, roi_align -from .roi_align_rotated import ROIAlignRotated, roi_align_rotated -from .shape_spec import ShapeSpec -from .wrappers import ( - BatchNorm2d, - Conv2d, - ConvTranspose2d, - cat, - interpolate, - Linear, - nonzero_tuple, - cross_entropy, - shapes_to_tensor, -) -from .blocks import CNNBlockBase, DepthwiseSeparableConv2d -from .aspp import ASPP -from .losses import ciou_loss, diou_loss - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/aspp.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/aspp.py deleted file mode 100755 index 14861aa9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/aspp.py +++ /dev/null @@ -1,144 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from copy import deepcopy -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn -from torch.nn import functional as F - -from .batch_norm import get_norm -from .blocks import DepthwiseSeparableConv2d -from .wrappers import Conv2d - - -class ASPP(nn.Module): - """ - Atrous Spatial Pyramid Pooling (ASPP). - """ - - def __init__( - self, - in_channels, - out_channels, - dilations, - *, - norm, - activation, - pool_kernel_size=None, - dropout: float = 0.0, - use_depthwise_separable_conv=False, - ): - """ - Args: - in_channels (int): number of input channels for ASPP. - out_channels (int): number of output channels. - dilations (list): a list of 3 dilations in ASPP. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. norm is - applied to all conv layers except the conv following - global average pooling. - activation (callable): activation function. - pool_kernel_size (tuple, list): the average pooling size (kh, kw) - for image pooling layer in ASPP. If set to None, it always - performs global average pooling. If not None, it must be - divisible by the shape of inputs in forward(). It is recommended - to use a fixed input feature size in training, and set this - option to match this size, so that it performs global average - pooling in training, and the size of the pooling window stays - consistent in inference. - dropout (float): apply dropout on the output of ASPP. It is used in - the official DeepLab implementation with a rate of 0.1: - https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 # noqa - use_depthwise_separable_conv (bool): use DepthwiseSeparableConv2d - for 3x3 convs in ASPP, proposed in :paper:`DeepLabV3+`. 
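
As a usage sketch of the ASPP module documented above (channel sizes and input resolution are made-up examples, not from the original repo): with `pool_kernel_size=None` the image-pooling branch is a global average, so any input resolution works, and the output keeps the input's spatial size:

```
import torch
from torch import nn

# hypothetical instantiation of the ASPP class defined in this file
aspp = ASPP(in_channels=256, out_channels=128, dilations=[6, 12, 18],
            norm="BN", activation=nn.ReLU())
y = aspp(torch.randn(1, 256, 32, 32))
print(y.shape)  # torch.Size([1, 128, 32, 32])
```
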
- """ - super(ASPP, self).__init__() - assert len(dilations) == 3, "ASPP expects 3 dilations, got {}".format(len(dilations)) - self.pool_kernel_size = pool_kernel_size - self.dropout = dropout - use_bias = norm == "" - self.convs = nn.ModuleList() - # conv 1x1 - self.convs.append( - Conv2d( - in_channels, - out_channels, - kernel_size=1, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - ) - weight_init.c2_xavier_fill(self.convs[-1]) - # atrous convs - for dilation in dilations: - if use_depthwise_separable_conv: - self.convs.append( - DepthwiseSeparableConv2d( - in_channels, - out_channels, - kernel_size=3, - padding=dilation, - dilation=dilation, - norm1=norm, - activation1=deepcopy(activation), - norm2=norm, - activation2=deepcopy(activation), - ) - ) - else: - self.convs.append( - Conv2d( - in_channels, - out_channels, - kernel_size=3, - padding=dilation, - dilation=dilation, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - ) - weight_init.c2_xavier_fill(self.convs[-1]) - # image pooling - # We do not add BatchNorm because the spatial resolution is 1x1, - # the original TF implementation has BatchNorm. - if pool_kernel_size is None: - image_pooling = nn.Sequential( - nn.AdaptiveAvgPool2d(1), - Conv2d(in_channels, out_channels, 1, bias=True, activation=deepcopy(activation)), - ) - else: - image_pooling = nn.Sequential( - nn.AvgPool2d(kernel_size=pool_kernel_size, stride=1), - Conv2d(in_channels, out_channels, 1, bias=True, activation=deepcopy(activation)), - ) - weight_init.c2_xavier_fill(image_pooling[1]) - self.convs.append(image_pooling) - - self.project = Conv2d( - 5 * out_channels, - out_channels, - kernel_size=1, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - weight_init.c2_xavier_fill(self.project) - - def forward(self, x): - size = x.shape[-2:] - if self.pool_kernel_size is not None: - if size[0] % self.pool_kernel_size[0] or size[1] % self.pool_kernel_size[1]: - raise ValueError( - "`pool_kernel_size` must be divisible by the shape of inputs. " - "Input size: {} `pool_kernel_size`: {}".format(size, self.pool_kernel_size) - ) - res = [] - for conv in self.convs: - res.append(conv(x)) - res[-1] = F.interpolate(res[-1], size=size, mode="bilinear", align_corners=False) - res = torch.cat(res, dim=1) - res = self.project(res) - res = F.dropout(res, self.dropout, training=self.training) if self.dropout > 0 else res - return res diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/batch_norm.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/batch_norm.py deleted file mode 100755 index 09a6c66c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/batch_norm.py +++ /dev/null @@ -1,276 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -import torch.distributed as dist -from fvcore.nn.distributed import differentiable_all_reduce -from torch import nn -from torch.nn import functional as F - -from detectron2.utils import comm, env - -from .wrappers import BatchNorm2d - - -class FrozenBatchNorm2d(nn.Module): - """ - BatchNorm2d where the batch statistics and the affine parameters are fixed. - - It contains non-trainable buffers called - "weight" and "bias", "running_mean", "running_var", - initialized to perform identity transformation. 
-
-    The pre-trained backbone models from Caffe2 only contain "weight" and "bias",
-    which are computed from the original four parameters of BN.
-    The affine transform `x * weight + bias` will perform the equivalent
-    computation of `(x - running_mean) / sqrt(running_var) * weight + bias`.
-    When loading a backbone model from Caffe2, "running_mean" and "running_var"
-    will be left unchanged as identity transformation.
-
-    Other pre-trained backbone models may contain all 4 parameters.
-
-    The forward is implemented by `F.batch_norm(..., training=False)`.
-    """
-
-    _version = 3
-
-    def __init__(self, num_features, eps=1e-5):
-        super().__init__()
-        self.num_features = num_features
-        self.eps = eps
-        self.register_buffer("weight", torch.ones(num_features))
-        self.register_buffer("bias", torch.zeros(num_features))
-        self.register_buffer("running_mean", torch.zeros(num_features))
-        self.register_buffer("running_var", torch.ones(num_features) - eps)
-
-    def forward(self, x):
-        if x.requires_grad:
-            # When gradients are needed, F.batch_norm will use extra memory
-            # because its backward op computes gradients for weight/bias as well.
-            scale = self.weight * (self.running_var + self.eps).rsqrt()
-            bias = self.bias - self.running_mean * scale
-            scale = scale.reshape(1, -1, 1, 1)
-            bias = bias.reshape(1, -1, 1, 1)
-            out_dtype = x.dtype  # may be half
-            return x * scale.to(out_dtype) + bias.to(out_dtype)
-        else:
-            # When gradients are not needed, F.batch_norm is a single fused op
-            # and provides more optimization opportunities.
-            return F.batch_norm(
-                x,
-                self.running_mean,
-                self.running_var,
-                self.weight,
-                self.bias,
-                training=False,
-                eps=self.eps,
-            )
-
-    def _load_from_state_dict(
-        self, state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs
-    ):
-        version = local_metadata.get("version", None)
-
-        if version is None or version < 2:
-            # No running_mean/var in early versions.
-            # This silences the warnings.
-            if prefix + "running_mean" not in state_dict:
-                state_dict[prefix + "running_mean"] = torch.zeros_like(self.running_mean)
-            if prefix + "running_var" not in state_dict:
-                state_dict[prefix + "running_var"] = torch.ones_like(self.running_var)
-
-        super()._load_from_state_dict(
-            state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs
-        )
-
-    def __repr__(self):
-        return "FrozenBatchNorm2d(num_features={}, eps={})".format(self.num_features, self.eps)
-
-    @classmethod
-    def convert_frozen_batchnorm(cls, module):
-        """
-        Convert all BatchNorm/SyncBatchNorm in module into FrozenBatchNorm.
-
-        Args:
-            module (torch.nn.Module):
-
-        Returns:
-            If module is BatchNorm/SyncBatchNorm, returns a new module.
-            Otherwise, in-place convert module and return it.
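
The equivalence the docstring claims between the folded affine form and eval-mode batch norm is easy to verify numerically; a quick self-contained check (tensor shapes chosen arbitrarily):

```
import torch
import torch.nn.functional as F

num_features, eps = 3, 1e-5
x = torch.randn(2, num_features, 4, 4)
weight, bias = torch.rand(num_features), torch.rand(num_features)
mean, var = torch.rand(num_features), torch.rand(num_features)

# the folded form used in FrozenBatchNorm2d.forward when gradients are needed
scale = weight * (var + eps).rsqrt()
shift = bias - mean * scale
folded = x * scale.reshape(1, -1, 1, 1) + shift.reshape(1, -1, 1, 1)

ref = F.batch_norm(x, mean, var, weight, bias, training=False, eps=eps)
print(torch.allclose(folded, ref, atol=1e-6))  # True
```
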
-
-        Similar to convert_sync_batchnorm in
-        https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py
-        """
-        bn_module = nn.modules.batchnorm
-        bn_module = (bn_module.BatchNorm2d, bn_module.SyncBatchNorm)
-        res = module
-        if isinstance(module, bn_module):
-            res = cls(module.num_features)
-            if module.affine:
-                res.weight.data = module.weight.data.clone().detach()
-                res.bias.data = module.bias.data.clone().detach()
-            res.running_mean.data = module.running_mean.data
-            res.running_var.data = module.running_var.data
-            res.eps = module.eps
-        else:
-            for name, child in module.named_children():
-                new_child = cls.convert_frozen_batchnorm(child)
-                if new_child is not child:
-                    res.add_module(name, new_child)
-        return res
-
-
-def get_norm(norm, out_channels):
-    """
-    Args:
-        norm (str or callable): either one of BN, SyncBN, FrozenBN, GN;
-            or a callable that takes a channel number and returns
-            the normalization layer as a nn.Module.
-
-    Returns:
-        nn.Module or None: the normalization layer
-    """
-    if norm is None:
-        return None
-    if isinstance(norm, str):
-        if len(norm) == 0:
-            return None
-        norm = {
-            "BN": BatchNorm2d,
-            # Fixed in https://github.com/pytorch/pytorch/pull/36382
-            "SyncBN": NaiveSyncBatchNorm if env.TORCH_VERSION <= (1, 5) else nn.SyncBatchNorm,
-            "FrozenBN": FrozenBatchNorm2d,
-            "GN": lambda channels: nn.GroupNorm(32, channels),
-            # for debugging:
-            "nnSyncBN": nn.SyncBatchNorm,
-            "naiveSyncBN": NaiveSyncBatchNorm,
-            # expose stats_mode N as an option to caller, required for zero-len inputs
-            "naiveSyncBN_N": lambda channels: NaiveSyncBatchNorm(channels, stats_mode="N"),
-        }[norm]
-    return norm(out_channels)
-
-
-class NaiveSyncBatchNorm(BatchNorm2d):
-    """
-    In PyTorch<=1.5, ``nn.SyncBatchNorm`` has incorrect gradient
-    when the batch size on each worker is different.
-    (e.g., when scale augmentation is used, or when it is applied to mask head).
-
-    This is a slower but correct alternative to `nn.SyncBatchNorm`.
-
-    Note:
-        There isn't a single definition of Sync BatchNorm.
-
-        When ``stats_mode==""``, this module computes overall statistics by using
-        statistics of each worker with equal weight. The result is true statistics
-        of all samples (as if they are all on one worker) only when all workers
-        have the same (N, H, W). This mode does not support inputs with zero batch size.
-
-        When ``stats_mode=="N"``, this module computes overall statistics by weighting
-        the statistics of each worker by their ``N``. The result is true statistics
-        of all samples (as if they are all on one worker) only when all workers
-        have the same (H, W). It is slower than ``stats_mode==""``.
-
-        Even though the result of this module may not be the true statistics of all samples,
-        it may still be reasonable because it might be preferable to assign equal weights
-        to all workers, regardless of their (H, W) dimension, instead of putting larger weight
-        on larger images. From preliminary experiments, little difference is found between such
-        a simplified implementation and an accurate computation of overall mean & variance.
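
The `stats_mode==""` behaviour described above reduces to simple moment averaging: average per-worker E[x] and E[x^2], then take var = E[x^2] - E[x]^2. A small single-process sketch of the math (four in-memory chunks stand in for four workers with equal N):

```
import torch

chunks = [torch.randn(8, 3) for _ in range(4)]  # four "workers"
mean = torch.stack([c.mean(0) for c in chunks]).mean(0)           # E[x]
meansqr = torch.stack([(c * c).mean(0) for c in chunks]).mean(0)  # E[x^2]
var = meansqr - mean * mean                                       # biased variance

full = torch.cat(chunks)
print(torch.allclose(mean, full.mean(0), atol=1e-6))                 # True
print(torch.allclose(var, full.var(0, unbiased=False), atol=1e-5))   # True
```
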
- """ - - def __init__(self, *args, stats_mode="", **kwargs): - super().__init__(*args, **kwargs) - assert stats_mode in ["", "N"] - self._stats_mode = stats_mode - - def forward(self, input): - if comm.get_world_size() == 1 or not self.training: - return super().forward(input) - - B, C = input.shape[0], input.shape[1] - - half_input = input.dtype == torch.float16 - if half_input: - # fp16 does not have good enough numerics for the reduction here - input = input.float() - mean = torch.mean(input, dim=[0, 2, 3]) - meansqr = torch.mean(input * input, dim=[0, 2, 3]) - - if self._stats_mode == "": - assert B > 0, 'SyncBatchNorm(stats_mode="") does not support zero batch size.' - vec = torch.cat([mean, meansqr], dim=0) - vec = differentiable_all_reduce(vec) * (1.0 / dist.get_world_size()) - mean, meansqr = torch.split(vec, C) - momentum = self.momentum - else: - if B == 0: - vec = torch.zeros([2 * C + 1], device=mean.device, dtype=mean.dtype) - vec = vec + input.sum() # make sure there is gradient w.r.t input - else: - vec = torch.cat( - [mean, meansqr, torch.ones([1], device=mean.device, dtype=mean.dtype)], dim=0 - ) - vec = differentiable_all_reduce(vec * B) - - total_batch = vec[-1].detach() - momentum = total_batch.clamp(max=1) * self.momentum # no update if total_batch is 0 - mean, meansqr, _ = torch.split(vec / total_batch.clamp(min=1), C) # avoid div-by-zero - - var = meansqr - mean * mean - invstd = torch.rsqrt(var + self.eps) - scale = self.weight * invstd - bias = self.bias - mean * scale - scale = scale.reshape(1, -1, 1, 1) - bias = bias.reshape(1, -1, 1, 1) - - self.running_mean += momentum * (mean.detach() - self.running_mean) - self.running_var += momentum * (var.detach() - self.running_var) - ret = input * scale + bias - if half_input: - ret = ret.half() - return ret - - -class CycleBatchNormList(nn.ModuleList): - """ - Implement domain-specific BatchNorm by cycling. - - When a BatchNorm layer is used for multiple input domains or input - features, it might need to maintain a separate test-time statistics - for each domain. See Sec 5.2 in :paper:`rethinking-batchnorm`. - - This module implements it by using N separate BN layers - and it cycles through them every time a forward() is called. - - NOTE: The caller of this module MUST guarantee to always call - this module by multiple of N times. Otherwise its test-time statistics - will be incorrect. - """ - - def __init__(self, length: int, bn_class=nn.BatchNorm2d, **kwargs): - """ - Args: - length: number of BatchNorm layers to cycle. - bn_class: the BatchNorm class to use - kwargs: arguments of the BatchNorm class, such as num_features. 
- """ - self._affine = kwargs.pop("affine", True) - super().__init__([bn_class(**kwargs, affine=False) for k in range(length)]) - if self._affine: - # shared affine, domain-specific BN - channels = self[0].num_features - self.weight = nn.Parameter(torch.ones(channels)) - self.bias = nn.Parameter(torch.zeros(channels)) - self._pos = 0 - - def forward(self, x): - ret = self[self._pos](x) - self._pos = (self._pos + 1) % len(self) - - if self._affine: - w = self.weight.reshape(1, -1, 1, 1) - b = self.bias.reshape(1, -1, 1, 1) - return ret * w + b - else: - return ret - - def extra_repr(self): - return f"affine={self._affine}" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/blocks.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/blocks.py deleted file mode 100755 index 1995a4bf..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/blocks.py +++ /dev/null @@ -1,111 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import fvcore.nn.weight_init as weight_init -from torch import nn - -from .batch_norm import FrozenBatchNorm2d, get_norm -from .wrappers import Conv2d - - -""" -CNN building blocks. -""" - - -class CNNBlockBase(nn.Module): - """ - A CNN block is assumed to have input channels, output channels and a stride. - The input and output of `forward()` method must be NCHW tensors. - The method can perform arbitrary computation but must match the given - channels and stride specification. - - Attribute: - in_channels (int): - out_channels (int): - stride (int): - """ - - def __init__(self, in_channels, out_channels, stride): - """ - The `__init__` method of any subclass should also contain these arguments. - - Args: - in_channels (int): - out_channels (int): - stride (int): - """ - super().__init__() - self.in_channels = in_channels - self.out_channels = out_channels - self.stride = stride - - def freeze(self): - """ - Make this block not trainable. - This method sets all parameters to `requires_grad=False`, - and convert all BatchNorm layers to FrozenBatchNorm - - Returns: - the block itself - """ - for p in self.parameters(): - p.requires_grad = False - FrozenBatchNorm2d.convert_frozen_batchnorm(self) - return self - - -class DepthwiseSeparableConv2d(nn.Module): - """ - A kxk depthwise convolution + a 1x1 convolution. - - In :paper:`xception`, norm & activation are applied on the second conv. - :paper:`mobilenet` uses norm & activation on both convs. - """ - - def __init__( - self, - in_channels, - out_channels, - kernel_size=3, - padding=1, - dilation=1, - *, - norm1=None, - activation1=None, - norm2=None, - activation2=None, - ): - """ - Args: - norm1, norm2 (str or callable): normalization for the two conv layers. - activation1, activation2 (callable(Tensor) -> Tensor): activation - function for the two conv layers. 
- """ - super().__init__() - self.depthwise = Conv2d( - in_channels, - in_channels, - kernel_size=kernel_size, - padding=padding, - dilation=dilation, - groups=in_channels, - bias=not norm1, - norm=get_norm(norm1, in_channels), - activation=activation1, - ) - self.pointwise = Conv2d( - in_channels, - out_channels, - kernel_size=1, - bias=not norm2, - norm=get_norm(norm2, out_channels), - activation=activation2, - ) - - # default initialization - weight_init.c2_msra_fill(self.depthwise) - weight_init.c2_msra_fill(self.pointwise) - - def forward(self, x): - return self.pointwise(self.depthwise(x)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/README.md deleted file mode 100755 index 778ed3da..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/README.md +++ /dev/null @@ -1,7 +0,0 @@ - - -To add a new Op: - -1. Create a new directory -2. Implement new ops there -3. Delcare its Python interface in `vision.cpp`. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h deleted file mode 100755 index 03f42110..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h +++ /dev/null @@ -1,115 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -at::Tensor ROIAlignRotated_forward_cpu( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio); - -at::Tensor ROIAlignRotated_backward_cpu( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio); - -#if defined(WITH_CUDA) || defined(WITH_HIP) -at::Tensor ROIAlignRotated_forward_cuda( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio); - -at::Tensor ROIAlignRotated_backward_cuda( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio); -#endif - -// Interface for Python -inline at::Tensor ROIAlignRotated_forward( - const at::Tensor& input, - const at::Tensor& rois, - const double spatial_scale, - const int64_t pooled_height, - const int64_t pooled_width, - const int64_t sampling_ratio) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return ROIAlignRotated_forward_cuda( - input, - rois, - spatial_scale, - pooled_height, - pooled_width, - sampling_ratio); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - return ROIAlignRotated_forward_cpu( - input, rois, spatial_scale, pooled_height, pooled_width, sampling_ratio); -} - -inline at::Tensor ROIAlignRotated_backward( - const at::Tensor& grad, - const at::Tensor& rois, - const double spatial_scale, - const int64_t pooled_height, - const int64_t pooled_width, - 
const int64_t batch_size, - const int64_t channels, - const int64_t height, - const int64_t width, - const int64_t sampling_ratio) { - if (grad.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return ROIAlignRotated_backward_cuda( - grad, - rois, - spatial_scale, - pooled_height, - pooled_width, - batch_size, - channels, - height, - width, - sampling_ratio); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - return ROIAlignRotated_backward_cpu( - grad, - rois, - spatial_scale, - pooled_height, - pooled_width, - batch_size, - channels, - height, - width, - sampling_ratio); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp deleted file mode 100755 index 2a3d3056..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp +++ /dev/null @@ -1,522 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include "ROIAlignRotated.h" - -// Note: this implementation originates from the Caffe2 ROIAlignRotated Op -// and PyTorch ROIAlign (non-rotated) Op implementations. -// The key difference between this implementation and those ones is -// we don't do "legacy offset" in this version, as there aren't many previous -// works, if any, using the "legacy" ROIAlignRotated Op. -// This would make the interface a bit cleaner. - -namespace detectron2 { - -namespace { -template -struct PreCalc { - int pos1; - int pos2; - int pos3; - int pos4; - T w1; - T w2; - T w3; - T w4; -}; - -template -void pre_calc_for_bilinear_interpolate( - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int iy_upper, - const int ix_upper, - T roi_start_h, - T roi_start_w, - T bin_size_h, - T bin_size_w, - int roi_bin_grid_h, - int roi_bin_grid_w, - T roi_center_h, - T roi_center_w, - T cos_theta, - T sin_theta, - std::vector>& pre_calc) { - int pre_calc_index = 0; - for (int ph = 0; ph < pooled_height; ph++) { - for (int pw = 0; pw < pooled_width; pw++) { - for (int iy = 0; iy < iy_upper; iy++) { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < ix_upper; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - // In image space, (y, x) is the order for Right Handed System, - // and this is essentially multiplying the point by a rotation matrix - // to rotate it counterclockwise through angle theta. 
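
The rotation described in this comment is an ordinary 2D rotation applied to each sampling offset, followed by a translation to the RoI center; in Python the same transform reads (values are made-up):

```
import math

theta = math.radians(30.0)  # box angle, degrees -> radians
cy, cx = 10.0, 20.0         # RoI center on the feature map
yy, xx = 1.5, -2.0          # sample offset relative to the center

y = yy * math.cos(theta) - xx * math.sin(theta) + cy
x = yy * math.sin(theta) + xx * math.cos(theta) + cx
print(y, x)  # the rotated, then translated, sampling location
```
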
- T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - // deal with: inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - PreCalc pc; - pc.pos1 = 0; - pc.pos2 = 0; - pc.pos3 = 0; - pc.pos4 = 0; - pc.w1 = 0; - pc.w2 = 0; - pc.w3 = 0; - pc.w4 = 0; - pre_calc[pre_calc_index] = pc; - pre_calc_index += 1; - continue; - } - - if (y < 0) { - y = 0; - } - if (x < 0) { - x = 0; - } - - int y_low = (int)y; - int x_low = (int)x; - int y_high; - int x_high; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. - lx; - T w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - // save weights and indices - PreCalc pc; - pc.pos1 = y_low * width + x_low; - pc.pos2 = y_low * width + x_high; - pc.pos3 = y_high * width + x_low; - pc.pos4 = y_high * width + x_high; - pc.w1 = w1; - pc.w2 = w2; - pc.w3 = w3; - pc.w4 = w4; - pre_calc[pre_calc_index] = pc; - - pre_calc_index += 1; - } - } - } - } -} - -template -void bilinear_interpolate_gradient( - const int height, - const int width, - T y, - T x, - T& w1, - T& w2, - T& w3, - T& w4, - int& x_low, - int& x_high, - int& y_low, - int& y_high) { - // deal with cases that inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - w1 = w2 = w3 = w4 = 0.; - x_low = x_high = y_low = y_high = -1; - return; - } - - if (y < 0) { - y = 0; - } - - if (x < 0) { - x = 0; - } - - y_low = (int)y; - x_low = (int)x; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. 
- lx; - - // reference in forward - // T v1 = input[y_low * width + x_low]; - // T v2 = input[y_low * width + x_high]; - // T v3 = input[y_high * width + x_low]; - // T v4 = input[y_high * width + x_high]; - // T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - - w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - return; -} - -template -inline void add(T* address, const T& val) { - *address += val; -} - -} // namespace - -template -void ROIAlignRotatedForward( - const int nthreads, - const T* input, - const T& spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - const T* rois, - T* output) { - int n_rois = nthreads / channels / pooled_width / pooled_height; - // (n, c, ph, pw) is an element in the pooled output - // can be parallelized using omp - // #pragma omp parallel for num_threads(32) - for (int n = 0; n < n_rois; n++) { - int index_n = n * channels * pooled_width * pooled_height; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - AT_ASSERTM( - roi_width >= 0 && roi_height >= 0, - "ROIs in ROIAlignRotated do not have non-negative size!"); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // We do average (integral) pooling inside a bin - const T count = std::max(roi_bin_grid_h * roi_bin_grid_w, 1); // e.g. = 4 - - // we want to precalculate indices and weights shared by all channels, - // this is the key point of optimization - std::vector> pre_calc( - roi_bin_grid_h * roi_bin_grid_w * pooled_width * pooled_height); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. 
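
The four weights `w1..w4` computed throughout these kernels are standard bilinear-interpolation coefficients: they sum to one, and for a linear image they reproduce the exact value. A small self-check in Python (sample point chosen arbitrarily):

```
import torch

y, x = 2.3, 4.7
y0, x0 = int(y), int(x)
ly, lx = y - y0, x - x0
hy, hx = 1 - ly, 1 - lx
w1, w2, w3, w4 = hy * hx, hy * lx, ly * hx, ly * lx
print(w1 + w2 + w3 + w4)  # 1.0

img = torch.arange(64.0).reshape(8, 8)  # img[r, c] = 8*r + c, a linear ramp
val = (w1 * img[y0, x0] + w2 * img[y0, x0 + 1]
       + w3 * img[y0 + 1, x0] + w4 * img[y0 + 1, x0 + 1])
print(float(val))  # 23.1 == 8*2.3 + 4.7, exact for a linear image
```
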
- T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - pre_calc_for_bilinear_interpolate( - height, - width, - pooled_height, - pooled_width, - roi_bin_grid_h, - roi_bin_grid_w, - roi_start_h, - roi_start_w, - bin_size_h, - bin_size_w, - roi_bin_grid_h, - roi_bin_grid_w, - roi_center_h, - roi_center_w, - cos_theta, - sin_theta, - pre_calc); - - for (int c = 0; c < channels; c++) { - int index_n_c = index_n + c * pooled_width * pooled_height; - const T* offset_input = - input + (roi_batch_ind * channels + c) * height * width; - int pre_calc_index = 0; - - for (int ph = 0; ph < pooled_height; ph++) { - for (int pw = 0; pw < pooled_width; pw++) { - int index = index_n_c + ph * pooled_width + pw; - - T output_val = 0.; - for (int iy = 0; iy < roi_bin_grid_h; iy++) { - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - PreCalc pc = pre_calc[pre_calc_index]; - output_val += pc.w1 * offset_input[pc.pos1] + - pc.w2 * offset_input[pc.pos2] + - pc.w3 * offset_input[pc.pos3] + pc.w4 * offset_input[pc.pos4]; - - pre_calc_index += 1; - } - } - output_val /= count; - - output[index] = output_val; - } // for pw - } // for ph - } // for c - } // for n -} - -template -void ROIAlignRotatedBackward( - const int nthreads, - // may not be contiguous. should index using n_stride, etc - const T* grad_output, - const T& spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - T* grad_input, - const T* rois, - const int n_stride, - const int c_stride, - const int h_stride, - const int w_stride) { - for (int index = 0; index < nthreads; index++) { - // (n, c, ph, pw) is an element in the pooled output - int pw = index % pooled_width; - int ph = (index / pooled_width) % pooled_height; - int c = (index / pooled_width / pooled_height) % channels; - int n = index / pooled_width / pooled_height / channels; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - AT_ASSERTM( - roi_width >= 0 && roi_height >= 0, - "ROIs in ROIAlignRotated do not have non-negative size!"); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - T* offset_grad_input = - grad_input + ((roi_batch_ind * channels + c) * height * width); - - int output_offset = n * n_stride + c * c_stride; - const T* offset_grad_output = grad_output + output_offset; - const T grad_output_this_bin = - offset_grad_output[ph * h_stride + pw * w_stride]; - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. 
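
The sampling-grid logic that the forward and backward passes share picks `sampling_ratio` points per bin axis when it is positive, and otherwise adapts to the bin size; a sketch of the same logic in Python (the `bin_grid` helper and the numbers are illustrative only):

```
import math

def bin_grid(sampling_ratio, roi_extent, pooled_extent):
    # mirrors: (sampling_ratio > 0) ? sampling_ratio : ceil(roi / pooled)
    return sampling_ratio if sampling_ratio > 0 else math.ceil(roi_extent / pooled_extent)

print(bin_grid(2, 14.0, 7))  # 2, explicitly requested
print(bin_grid(0, 14.0, 7))  # 2, adaptive: ceil(14 / 7)
print(bin_grid(0, 10.0, 7))  # 2, adaptive: ceil(10 / 7)
```
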
- T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - // We do average (integral) pooling inside a bin - const T count = roi_bin_grid_h * roi_bin_grid_w; // e.g. = 4 - - for (int iy = 0; iy < roi_bin_grid_h; iy++) { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T w1, w2, w3, w4; - int x_low, x_high, y_low, y_high; - - bilinear_interpolate_gradient( - height, width, y, x, w1, w2, w3, w4, x_low, x_high, y_low, y_high); - - T g1 = grad_output_this_bin * w1 / count; - T g2 = grad_output_this_bin * w2 / count; - T g3 = grad_output_this_bin * w3 / count; - T g4 = grad_output_this_bin * w4 / count; - - if (x_low >= 0 && x_high >= 0 && y_low >= 0 && y_high >= 0) { - // atomic add is not needed for now since it is single threaded - add(offset_grad_input + y_low * width + x_low, static_cast(g1)); - add(offset_grad_input + y_low * width + x_high, static_cast(g2)); - add(offset_grad_input + y_high * width + x_low, static_cast(g3)); - add(offset_grad_input + y_high * width + x_high, static_cast(g4)); - } // if - } // ix - } // iy - } // for -} // ROIAlignRotatedBackward - -at::Tensor ROIAlignRotated_forward_cpu( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio) { - AT_ASSERTM(input.device().is_cpu(), "input must be a CPU tensor"); - AT_ASSERTM(rois.device().is_cpu(), "rois must be a CPU tensor"); - - at::TensorArg input_t{input, "input", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlign_forward_cpu"; - at::checkAllSameType(c, {input_t, rois_t}); - - auto num_rois = rois.size(0); - auto channels = input.size(1); - auto height = input.size(2); - auto width = input.size(3); - - at::Tensor output = at::zeros( - {num_rois, channels, pooled_height, pooled_width}, input.options()); - - auto output_size = num_rois * pooled_height * pooled_width * channels; - - if (output.numel() == 0) { - return output; - } - - auto input_ = input.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - input.scalar_type(), "ROIAlignRotated_forward", [&] { - ROIAlignRotatedForward( - output_size, - input_.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - rois_.data_ptr(), - output.data_ptr()); - }); - return output; -} - -at::Tensor ROIAlignRotated_backward_cpu( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio) { - AT_ASSERTM(grad.device().is_cpu(), "grad must be a CPU tensor"); - AT_ASSERTM(rois.device().is_cpu(), "rois must be a CPU tensor"); - - at::TensorArg grad_t{grad, "grad", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlignRotated_backward_cpu"; - at::checkAllSameType(c, {grad_t, rois_t}); - - at::Tensor grad_input = - at::zeros({batch_size, channels, height, width}, grad.options()); - - // handle possibly empty gradients - if (grad.numel() == 0) { - return 
grad_input; - } - - // get stride values to ensure indexing into gradients is correct. - int n_stride = grad.stride(0); - int c_stride = grad.stride(1); - int h_stride = grad.stride(2); - int w_stride = grad.stride(3); - - auto rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - grad.scalar_type(), "ROIAlignRotated_forward", [&] { - ROIAlignRotatedBackward( - grad.numel(), - grad.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - grad_input.data_ptr(), - rois_.data_ptr(), - n_stride, - c_stride, - h_stride, - w_stride); - }); - return grad_input; -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu deleted file mode 100755 index fca18651..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu +++ /dev/null @@ -1,443 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include - -// TODO make it in a common file -#define CUDA_1D_KERNEL_LOOP(i, n) \ - for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n; \ - i += blockDim.x * gridDim.x) - -// Note: this implementation originates from the Caffe2 ROIAlignRotated Op -// and PyTorch ROIAlign (non-rotated) Op implementations. -// The key difference between this implementation and those ones is -// we don't do "legacy offset" in this version, as there aren't many previous -// works, if any, using the "legacy" ROIAlignRotated Op. -// This would make the interface a bit cleaner. - -namespace detectron2 { - -namespace { - -template -__device__ T bilinear_interpolate( - const T* input, - const int height, - const int width, - T y, - T x) { - // deal with cases that inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - return 0; - } - - if (y < 0) { - y = 0; - } - - if (x < 0) { - x = 0; - } - - int y_low = (int)y; - int x_low = (int)x; - int y_high; - int x_high; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. 
- lx;
-  // do bilinear interpolation
-  T v1 = input[y_low * width + x_low];
-  T v2 = input[y_low * width + x_high];
-  T v3 = input[y_high * width + x_low];
-  T v4 = input[y_high * width + x_high];
-  T w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx;
-
-  T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4);
-
-  return val;
-}
-
-template <typename T>
-__device__ void bilinear_interpolate_gradient(
-    const int height,
-    const int width,
-    T y,
-    T x,
-    T& w1,
-    T& w2,
-    T& w3,
-    T& w4,
-    int& x_low,
-    int& x_high,
-    int& y_low,
-    int& y_high) {
-  // deal with cases that inverse elements are out of feature map boundary
-  if (y < -1.0 || y > height || x < -1.0 || x > width) {
-    // empty
-    w1 = w2 = w3 = w4 = 0.;
-    x_low = x_high = y_low = y_high = -1;
-    return;
-  }
-
-  if (y < 0) {
-    y = 0;
-  }
-
-  if (x < 0) {
-    x = 0;
-  }
-
-  y_low = (int)y;
-  x_low = (int)x;
-
-  if (y_low >= height - 1) {
-    y_high = y_low = height - 1;
-    y = (T)y_low;
-  } else {
-    y_high = y_low + 1;
-  }
-
-  if (x_low >= width - 1) {
-    x_high = x_low = width - 1;
-    x = (T)x_low;
-  } else {
-    x_high = x_low + 1;
-  }
-
-  T ly = y - y_low;
-  T lx = x - x_low;
-  T hy = 1. - ly, hx = 1. - lx;
-
-  // reference in forward
-  // T v1 = input[y_low * width + x_low];
-  // T v2 = input[y_low * width + x_high];
-  // T v3 = input[y_high * width + x_low];
-  // T v4 = input[y_high * width + x_high];
-  // T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4);
-
-  w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx;
-
-  return;
-}
-
-} // namespace
-
-template <typename T>
-__global__ void RoIAlignRotatedForward(
-    const int nthreads,
-    const T* input,
-    const T spatial_scale,
-    const int channels,
-    const int height,
-    const int width,
-    const int pooled_height,
-    const int pooled_width,
-    const int sampling_ratio,
-    const T* rois,
-    T* top_data) {
-  CUDA_1D_KERNEL_LOOP(index, nthreads) {
-    // (n, c, ph, pw) is an element in the pooled output
-    int pw = index % pooled_width;
-    int ph = (index / pooled_width) % pooled_height;
-    int c = (index / pooled_width / pooled_height) % channels;
-    int n = index / pooled_width / pooled_height / channels;
-
-    const T* current_roi = rois + n * 6;
-    int roi_batch_ind = current_roi[0];
-
-    // Do not use rounding; this implementation detail is critical
-    // ROIAlignRotated supports align == true, i.e., continuous coordinate
-    // by default, thus the 0.5 offset
-    T offset = (T)0.5;
-    T roi_center_w = current_roi[1] * spatial_scale - offset;
-    T roi_center_h = current_roi[2] * spatial_scale - offset;
-    T roi_width = current_roi[3] * spatial_scale;
-    T roi_height = current_roi[4] * spatial_scale;
-    T theta = current_roi[5] * M_PI / 180.0;
-    T cos_theta = cos(theta);
-    T sin_theta = sin(theta);
-
-    T bin_size_h = static_cast<T>(roi_height) / static_cast<T>(pooled_height);
-    T bin_size_w = static_cast<T>(roi_width) / static_cast<T>(pooled_width);
-
-    const T* offset_input =
-        input + (roi_batch_ind * channels + c) * height * width;
-
-    // We use roi_bin_grid to sample the grid and mimic integral
-    int roi_bin_grid_h = (sampling_ratio > 0)
-        ? sampling_ratio
-        : ceil(roi_height / pooled_height); // e.g., = 2
-    int roi_bin_grid_w =
-        (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width);
-
-    // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y).
-    // Appropriate translation needs to be applied after.
-    T roi_start_h = -roi_height / 2.0;
-    T roi_start_w = -roi_width / 2.0;
-
-    // We do average (integral) pooling inside a bin
-    const T count = max(roi_bin_grid_h * roi_bin_grid_w, 1); // e.g.
= 4 - - T output_val = 0.; - for (int iy = 0; iy < roi_bin_grid_h; iy++) // e.g., iy = 0, 1 - { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T val = bilinear_interpolate(offset_input, height, width, y, x); - output_val += val; - } - } - output_val /= count; - - top_data[index] = output_val; - } -} - -template -__global__ void RoIAlignRotatedBackwardFeature( - const int nthreads, - const T* top_diff, - const int num_rois, - const T spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - T* bottom_diff, - const T* rois) { - CUDA_1D_KERNEL_LOOP(index, nthreads) { - // (n, c, ph, pw) is an element in the pooled output - int pw = index % pooled_width; - int ph = (index / pooled_width) % pooled_height; - int c = (index / pooled_width / pooled_height) % channels; - int n = index / pooled_width / pooled_height / channels; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - T* offset_bottom_diff = - bottom_diff + (roi_batch_ind * channels + c) * height * width; - - int top_offset = (n * channels + c) * pooled_height * pooled_width; - const T* offset_top_diff = top_diff + top_offset; - const T top_diff_this_bin = offset_top_diff[ph * pooled_width + pw]; - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. - T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - // We do average (integral) pooling inside a bin - const T count = roi_bin_grid_h * roi_bin_grid_w; // e.g. 
= 4 - - for (int iy = 0; iy < roi_bin_grid_h; iy++) // e.g., iy = 0, 1 - { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T w1, w2, w3, w4; - int x_low, x_high, y_low, y_high; - - bilinear_interpolate_gradient( - height, width, y, x, w1, w2, w3, w4, x_low, x_high, y_low, y_high); - - T g1 = top_diff_this_bin * w1 / count; - T g2 = top_diff_this_bin * w2 / count; - T g3 = top_diff_this_bin * w3 / count; - T g4 = top_diff_this_bin * w4 / count; - - if (x_low >= 0 && x_high >= 0 && y_low >= 0 && y_high >= 0) { - atomicAdd( - offset_bottom_diff + y_low * width + x_low, static_cast(g1)); - atomicAdd( - offset_bottom_diff + y_low * width + x_high, static_cast(g2)); - atomicAdd( - offset_bottom_diff + y_high * width + x_low, static_cast(g3)); - atomicAdd( - offset_bottom_diff + y_high * width + x_high, static_cast(g4)); - } // if - } // ix - } // iy - } // CUDA_1D_KERNEL_LOOP -} // RoIAlignRotatedBackward - -at::Tensor ROIAlignRotated_forward_cuda( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio) { - AT_ASSERTM(input.device().is_cuda(), "input must be a CUDA tensor"); - AT_ASSERTM(rois.device().is_cuda(), "rois must be a CUDA tensor"); - at::TensorArg input_t{input, "input", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlignRotated_forward_cuda"; - at::checkAllSameGPU(c, {input_t, rois_t}); - at::checkAllSameType(c, {input_t, rois_t}); - at::cuda::CUDAGuard device_guard(input.device()); - - auto num_rois = rois.size(0); - auto channels = input.size(1); - auto height = input.size(2); - auto width = input.size(3); - - auto output = at::empty( - {num_rois, channels, pooled_height, pooled_width}, input.options()); - auto output_size = num_rois * pooled_height * pooled_width * channels; - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - dim3 grid(std::min( - at::cuda::ATenCeilDiv( - static_cast(output_size), static_cast(512)), - static_cast(4096))); - dim3 block(512); - - if (output.numel() == 0) { - AT_CUDA_CHECK(cudaGetLastError()); - return output; - } - - auto input_ = input.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES( - input.scalar_type(), "ROIAlignRotated_forward", [&] { - RoIAlignRotatedForward<<>>( - output_size, - input_.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - rois_.data_ptr(), - output.data_ptr()); - }); - cudaDeviceSynchronize(); - AT_CUDA_CHECK(cudaGetLastError()); - return output; -} - -// TODO remove the dependency on input and use instead its sizes -> save memory -at::Tensor ROIAlignRotated_backward_cuda( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio) { - AT_ASSERTM(grad.device().is_cuda(), "grad must be a CUDA tensor"); - AT_ASSERTM(rois.device().is_cuda(), "rois must be a CUDA tensor"); - - at::TensorArg grad_t{grad, "grad", 1}, 
rois_t{rois, "rois", 2}; - at::CheckedFrom c = "ROIAlign_backward_cuda"; - at::checkAllSameGPU(c, {grad_t, rois_t}); - at::checkAllSameType(c, {grad_t, rois_t}); - at::cuda::CUDAGuard device_guard(grad.device()); - - auto num_rois = rois.size(0); - auto grad_input = - at::zeros({batch_size, channels, height, width}, grad.options()); - - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - dim3 grid(std::min( - at::cuda::ATenCeilDiv( - static_cast(grad.numel()), static_cast(512)), - static_cast(4096))); - dim3 block(512); - - // handle possibly empty gradients - if (grad.numel() == 0) { - AT_CUDA_CHECK(cudaGetLastError()); - return grad_input; - } - - auto grad_ = grad.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES( - grad.scalar_type(), "ROIAlignRotated_backward", [&] { - RoIAlignRotatedBackwardFeature<<>>( - grad.numel(), - grad_.data_ptr(), - num_rois, - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - grad_input.data_ptr(), - rois_.data_ptr()); - }); - AT_CUDA_CHECK(cudaGetLastError()); - return grad_input; -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h deleted file mode 100755 index 3bf383b8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h +++ /dev/null @@ -1,35 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -at::Tensor box_iou_rotated_cpu( - const at::Tensor& boxes1, - const at::Tensor& boxes2); - -#if defined(WITH_CUDA) || defined(WITH_HIP) -at::Tensor box_iou_rotated_cuda( - const at::Tensor& boxes1, - const at::Tensor& boxes2); -#endif - -// Interface for Python -// inline is needed to prevent multiple function definitions when this header is -// included by different cpps -inline at::Tensor box_iou_rotated( - const at::Tensor& boxes1, - const at::Tensor& boxes2) { - assert(boxes1.device().is_cuda() == boxes2.device().is_cuda()); - if (boxes1.device().is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return box_iou_rotated_cuda(boxes1.contiguous(), boxes2.contiguous()); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - - return box_iou_rotated_cpu(boxes1.contiguous(), boxes2.contiguous()); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp deleted file mode 100755 index c843487b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp +++ /dev/null @@ -1,39 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. 
-#include "box_iou_rotated.h" -#include "box_iou_rotated_utils.h" - -namespace detectron2 { - -template -void box_iou_rotated_cpu_kernel( - const at::Tensor& boxes1, - const at::Tensor& boxes2, - at::Tensor& ious) { - auto num_boxes1 = boxes1.size(0); - auto num_boxes2 = boxes2.size(0); - - for (int i = 0; i < num_boxes1; i++) { - for (int j = 0; j < num_boxes2; j++) { - ious[i * num_boxes2 + j] = single_box_iou_rotated( - boxes1[i].data_ptr(), boxes2[j].data_ptr()); - } - } -} - -at::Tensor box_iou_rotated_cpu( - // input must be contiguous: - const at::Tensor& boxes1, - const at::Tensor& boxes2) { - auto num_boxes1 = boxes1.size(0); - auto num_boxes2 = boxes2.size(0); - at::Tensor ious = - at::empty({num_boxes1 * num_boxes2}, boxes1.options().dtype(at::kFloat)); - - box_iou_rotated_cpu_kernel(boxes1, boxes2, ious); - - // reshape from 1d array to 2d array - auto shape = std::vector{num_boxes1, num_boxes2}; - return ious.reshape(shape); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu deleted file mode 100755 index 952710e5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu +++ /dev/null @@ -1,130 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include -#include "box_iou_rotated_utils.h" - -namespace detectron2 { - -// 2D block with 32 * 16 = 512 threads per block -const int BLOCK_DIM_X = 32; -const int BLOCK_DIM_Y = 16; - -template -__global__ void box_iou_rotated_cuda_kernel( - const int n_boxes1, - const int n_boxes2, - const T* dev_boxes1, - const T* dev_boxes2, - T* dev_ious) { - const int row_start = blockIdx.x * blockDim.x; - const int col_start = blockIdx.y * blockDim.y; - - const int row_size = min(n_boxes1 - row_start, blockDim.x); - const int col_size = min(n_boxes2 - col_start, blockDim.y); - - __shared__ float block_boxes1[BLOCK_DIM_X * 5]; - __shared__ float block_boxes2[BLOCK_DIM_Y * 5]; - - // It's safe to copy using threadIdx.x since BLOCK_DIM_X >= BLOCK_DIM_Y - if (threadIdx.x < row_size && threadIdx.y == 0) { - block_boxes1[threadIdx.x * 5 + 0] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 0]; - block_boxes1[threadIdx.x * 5 + 1] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 1]; - block_boxes1[threadIdx.x * 5 + 2] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 2]; - block_boxes1[threadIdx.x * 5 + 3] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 3]; - block_boxes1[threadIdx.x * 5 + 4] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 4]; - } - - if (threadIdx.x < col_size && threadIdx.y == 0) { - block_boxes2[threadIdx.x * 5 + 0] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 0]; - block_boxes2[threadIdx.x * 5 + 1] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 1]; - block_boxes2[threadIdx.x * 5 + 2] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 2]; - block_boxes2[threadIdx.x * 5 + 3] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 3]; - block_boxes2[threadIdx.x * 5 + 4] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 4]; - } - __syncthreads(); - - if (threadIdx.x < row_size && threadIdx.y < col_size) { - int offset = (row_start + threadIdx.x) * n_boxes2 + col_start + threadIdx.y; - dev_ious[offset] = single_box_iou_rotated( - block_boxes1 + threadIdx.x * 5, block_boxes2 + threadIdx.y * 5); - } -} - 
-at::Tensor box_iou_rotated_cuda(
-    // input must be contiguous
-    const at::Tensor& boxes1,
-    const at::Tensor& boxes2) {
-  using scalar_t = float;
-  AT_ASSERTM(
-      boxes1.scalar_type() == at::kFloat, "boxes1 must be a float tensor");
-  AT_ASSERTM(
-      boxes2.scalar_type() == at::kFloat, "boxes2 must be a float tensor");
-  AT_ASSERTM(boxes1.is_cuda(), "boxes1 must be a CUDA tensor");
-  AT_ASSERTM(boxes2.is_cuda(), "boxes2 must be a CUDA tensor");
-  at::cuda::CUDAGuard device_guard(boxes1.device());
-
-  auto num_boxes1 = boxes1.size(0);
-  auto num_boxes2 = boxes2.size(0);
-
-  at::Tensor ious =
-      at::empty({num_boxes1 * num_boxes2}, boxes1.options().dtype(at::kFloat));
-
-  bool transpose = false;
-  if (num_boxes1 > 0 && num_boxes2 > 0) {
-    scalar_t *data1 = boxes1.data_ptr<scalar_t>(),
-             *data2 = boxes2.data_ptr<scalar_t>();
-
-    if (num_boxes2 > 65535 * BLOCK_DIM_Y) {
-      AT_ASSERTM(
-          num_boxes1 <= 65535 * BLOCK_DIM_Y,
-          "Too many boxes for box_iou_rotated_cuda!");
-      // x dim is allowed to be large, but y dim cannot,
-      // so we transpose the two to avoid "invalid configuration argument"
-      // error. We assume one of them is small. Otherwise the result is hard to
-      // fit in memory anyway.
-      std::swap(num_boxes1, num_boxes2);
-      std::swap(data1, data2);
-      transpose = true;
-    }
-
-    const int blocks_x =
-        at::cuda::ATenCeilDiv(static_cast<int>(num_boxes1), BLOCK_DIM_X);
-    const int blocks_y =
-        at::cuda::ATenCeilDiv(static_cast<int>(num_boxes2), BLOCK_DIM_Y);
-
-    dim3 blocks(blocks_x, blocks_y);
-    dim3 threads(BLOCK_DIM_X, BLOCK_DIM_Y);
-    cudaStream_t stream = at::cuda::getCurrentCUDAStream();
-
-    box_iou_rotated_cuda_kernel<scalar_t><<<blocks, threads, 0, stream>>>(
-        num_boxes1,
-        num_boxes2,
-        data1,
-        data2,
-        (scalar_t*)ious.data_ptr<scalar_t>());
-
-    AT_CUDA_CHECK(cudaGetLastError());
-  }
-
-  // reshape from 1d array to 2d array
-  auto shape = std::vector<int64_t>{num_boxes1, num_boxes2};
-  if (transpose) {
-    return ious.view(shape).t();
-  } else {
-    return ious.view(shape);
-  }
-}
-
-} // namespace detectron2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h
deleted file mode 100755
index b54a5dde..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h
+++ /dev/null
@@ -1,370 +0,0 @@
-// Copyright (c) Facebook, Inc. and its affiliates.
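The launcher above guards against CUDA's 65535 limit on `gridDim.y`: only `65535 * BLOCK_DIM_Y` boxes fit along y, so a larger second operand is swapped onto the x axis and the result is transposed back. The capacity involved (illustrative sketch only):

```
// Illustrative sketch only -- the gridDim.y capacity behind the swap above.
#include <cstdio>

int main() {
  const long long MAX_GRID_Y = 65535, BLOCK_DIM_Y = 16;
  std::printf("y capacity: %lld boxes\n", MAX_GRID_Y * BLOCK_DIM_Y); // 1048560
  // gridDim.x may be up to 2^31 - 1, so the larger operand goes there.
}
```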
-#pragma once
-
-#include <cassert>
-#include <cmath>
-
-#if defined(__CUDACC__) || __HCC__ == 1 || __HIP__ == 1
-// Designates functions callable from the host (CPU) and the device (GPU)
-#define HOST_DEVICE __host__ __device__
-#define HOST_DEVICE_INLINE HOST_DEVICE __forceinline__
-#else
-#include <algorithm>
-#define HOST_DEVICE
-#define HOST_DEVICE_INLINE HOST_DEVICE inline
-#endif
-
-namespace detectron2 {
-
-namespace {
-
-template <typename T>
-struct RotatedBox {
-  T x_ctr, y_ctr, w, h, a;
-};
-
-template <typename T>
-struct Point {
-  T x, y;
-  HOST_DEVICE_INLINE Point(const T& px = 0, const T& py = 0) : x(px), y(py) {}
-  HOST_DEVICE_INLINE Point operator+(const Point& p) const {
-    return Point(x + p.x, y + p.y);
-  }
-  HOST_DEVICE_INLINE Point& operator+=(const Point& p) {
-    x += p.x;
-    y += p.y;
-    return *this;
-  }
-  HOST_DEVICE_INLINE Point operator-(const Point& p) const {
-    return Point(x - p.x, y - p.y);
-  }
-  HOST_DEVICE_INLINE Point operator*(const T coeff) const {
-    return Point(x * coeff, y * coeff);
-  }
-};
-
-template <typename T>
-HOST_DEVICE_INLINE T dot_2d(const Point<T>& A, const Point<T>& B) {
-  return A.x * B.x + A.y * B.y;
-}
-
-// R: result type. can be different from input type
-template <typename T, typename R = T>
-HOST_DEVICE_INLINE R cross_2d(const Point<T>& A, const Point<T>& B) {
-  return static_cast<R>(A.x) * static_cast<R>(B.y) -
-      static_cast<R>(B.x) * static_cast<R>(A.y);
-}
-
-template <typename T>
-HOST_DEVICE_INLINE void get_rotated_vertices(
-    const RotatedBox<T>& box,
-    Point<T> (&pts)[4]) {
-  // M_PI / 180. == 0.01745329251
-  double theta = box.a * 0.01745329251;
-  T cosTheta2 = (T)cos(theta) * 0.5f;
-  T sinTheta2 = (T)sin(theta) * 0.5f;
-
-  // y: top --> down; x: left --> right
-  pts[0].x = box.x_ctr + sinTheta2 * box.h + cosTheta2 * box.w;
-  pts[0].y = box.y_ctr + cosTheta2 * box.h - sinTheta2 * box.w;
-  pts[1].x = box.x_ctr - sinTheta2 * box.h + cosTheta2 * box.w;
-  pts[1].y = box.y_ctr - cosTheta2 * box.h - sinTheta2 * box.w;
-  pts[2].x = 2 * box.x_ctr - pts[0].x;
-  pts[2].y = 2 * box.y_ctr - pts[0].y;
-  pts[3].x = 2 * box.x_ctr - pts[1].x;
-  pts[3].y = 2 * box.y_ctr - pts[1].y;
-}
-
-template <typename T>
-HOST_DEVICE_INLINE int get_intersection_points(
-    const Point<T> (&pts1)[4],
-    const Point<T> (&pts2)[4],
-    Point<T> (&intersections)[24]) {
-  // Line vector
-  // A line from p1 to p2 is: p1 + (p2-p1)*t, t=[0,1]
-  Point<T> vec1[4], vec2[4];
-  for (int i = 0; i < 4; i++) {
-    vec1[i] = pts1[(i + 1) % 4] - pts1[i];
-    vec2[i] = pts2[(i + 1) % 4] - pts2[i];
-  }
-
-  // When computing the intersection area, it doesn't hurt if we have
-  // more (duplicated/approximate) intersections/vertices than needed,
-  // while it can cause drastic difference if we miss an intersection/vertex.
-  // Therefore, we add an epsilon to relax the comparisons between
-  // the float point numbers that decide the intersection points.
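// Illustration (not from the original file): the loop below solves each
// segment pair p1 + t1*v1 = p2 + t2*v2 by Cramer's rule with 2D cross
// products:
//   t1 = cross(v2, p2 - p1) / cross(v2, v1)
//   t2 = cross(v1, p2 - p1) / cross(v2, v1)
// e.g. p1 = (0,0), v1 = (2,0), p2 = (1,-1), v2 = (0,2) gives
// det = cross(v2, v1) = -4 and t1 = t2 = 0.5, so the segments meet at (1,0).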
-  double EPS = 1e-5;
-
-  // Line test - test all line combos for intersection
-  int num = 0; // number of intersections
-  for (int i = 0; i < 4; i++) {
-    for (int j = 0; j < 4; j++) {
-      // Solve for 2x2 Ax=b
-      T det = cross_2d<T>(vec2[j], vec1[i]);
-
-      // This takes care of parallel lines
-      if (fabs(det) <= 1e-14) {
-        continue;
-      }
-
-      auto vec12 = pts2[j] - pts1[i];
-
-      T t1 = cross_2d<T>(vec2[j], vec12) / det;
-      T t2 = cross_2d<T>(vec1[i], vec12) / det;
-
-      if (t1 > -EPS && t1 < 1.0f + EPS && t2 > -EPS && t2 < 1.0f + EPS) {
-        intersections[num++] = pts1[i] + vec1[i] * t1;
-      }
-    }
-  }
-
-  // Check for vertices of rect1 inside rect2
-  {
-    const auto& AB = vec2[0];
-    const auto& DA = vec2[3];
-    auto ABdotAB = dot_2d<T>(AB, AB);
-    auto ADdotAD = dot_2d<T>(DA, DA);
-    for (int i = 0; i < 4; i++) {
-      // assume ABCD is the rectangle, and P is the point to be judged
-      // P is inside ABCD iff. P's projection on AB lies within AB
-      // and P's projection on AD lies within AD
-
-      auto AP = pts1[i] - pts2[0];
-
-      auto APdotAB = dot_2d<T>(AP, AB);
-      auto APdotAD = -dot_2d<T>(AP, DA);
-
-      if ((APdotAB > -EPS) && (APdotAD > -EPS) && (APdotAB < ABdotAB + EPS) &&
-          (APdotAD < ADdotAD + EPS)) {
-        intersections[num++] = pts1[i];
-      }
-    }
-  }
-
-  // Reverse the check - check for vertices of rect2 inside rect1
-  {
-    const auto& AB = vec1[0];
-    const auto& DA = vec1[3];
-    auto ABdotAB = dot_2d<T>(AB, AB);
-    auto ADdotAD = dot_2d<T>(DA, DA);
-    for (int i = 0; i < 4; i++) {
-      auto AP = pts2[i] - pts1[0];
-
-      auto APdotAB = dot_2d<T>(AP, AB);
-      auto APdotAD = -dot_2d<T>(AP, DA);
-
-      if ((APdotAB > -EPS) && (APdotAD > -EPS) && (APdotAB < ABdotAB + EPS) &&
-          (APdotAD < ADdotAD + EPS)) {
-        intersections[num++] = pts2[i];
-      }
-    }
-  }
-
-  return num;
-}
-
-template <typename T>
-HOST_DEVICE_INLINE int convex_hull_graham(
-    const Point<T> (&p)[24],
-    const int& num_in,
-    Point<T> (&q)[24],
-    bool shift_to_zero = false) {
-  assert(num_in >= 2);
-
-  // Step 1:
-  // Find point with minimum y
-  // if more than 1 points have the same minimum y,
-  // pick the one with the minimum x.
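// Illustration (not from the original file): Step 1 below picks the
// Graham-scan pivot -- the lowest point, with ties broken towards smaller x.
// Among (1,1), (2,0) and (0,0), both (2,0) and (0,0) share the minimum
// y = 0, and the tie-break selects (0,0). Every other point then spans a
// polar angle within [0, pi) around the pivot, so the cross-product
// comparator in Step 3 yields a total order.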
-  int t = 0;
-  for (int i = 1; i < num_in; i++) {
-    if (p[i].y < p[t].y || (p[i].y == p[t].y && p[i].x < p[t].x)) {
-      t = i;
-    }
-  }
-  auto& start = p[t]; // starting point
-
-  // Step 2:
-  // Subtract starting point from every points (for sorting in the next step)
-  for (int i = 0; i < num_in; i++) {
-    q[i] = p[i] - start;
-  }
-
-  // Swap the starting point to position 0
-  auto tmp = q[0];
-  q[0] = q[t];
-  q[t] = tmp;
-
-  // Step 3:
-  // Sort point 1 ~ num_in according to their relative cross-product values
-  // (essentially sorting according to angles)
-  // If the angles are the same, sort according to their distance to origin
-  T dist[24];
-#if defined(__CUDACC__) || __HCC__ == 1 || __HIP__ == 1
-  // compute distance to origin before sort, and sort them together with the
-  // points
-  for (int i = 0; i < num_in; i++) {
-    dist[i] = dot_2d<T>(q[i], q[i]);
-  }
-
-  // CUDA version
-  // In the future, we can potentially use thrust
-  // for sorting here to improve speed (though not guaranteed)
-  for (int i = 1; i < num_in - 1; i++) {
-    for (int j = i + 1; j < num_in; j++) {
-      T crossProduct = cross_2d<T>(q[i], q[j]);
-      if ((crossProduct < -1e-6) ||
-          (fabs(crossProduct) < 1e-6 && dist[i] > dist[j])) {
-        auto q_tmp = q[i];
-        q[i] = q[j];
-        q[j] = q_tmp;
-        auto dist_tmp = dist[i];
-        dist[i] = dist[j];
-        dist[j] = dist_tmp;
-      }
-    }
-  }
-#else
-  // CPU version
-  std::sort(
-      q + 1, q + num_in, [](const Point<T>& A, const Point<T>& B) -> bool {
-        T temp = cross_2d<T>(A, B);
-        if (fabs(temp) < 1e-6) {
-          return dot_2d<T>(A, A) < dot_2d<T>(B, B);
-        } else {
-          return temp > 0;
-        }
-      });
-  // compute distance to origin after sort, since the points are now different.
-  for (int i = 0; i < num_in; i++) {
-    dist[i] = dot_2d<T>(q[i], q[i]);
-  }
-#endif
-
-  // Step 4:
-  // Make sure there are at least 2 points (that don't overlap with each other)
-  // in the stack
-  int k; // index of the non-overlapped second point
-  for (k = 1; k < num_in; k++) {
-    if (dist[k] > 1e-8) {
-      break;
-    }
-  }
-  if (k == num_in) {
-    // We reach the end, which means the convex hull is just one point
-    q[0] = p[t];
-    return 1;
-  }
-  q[1] = q[k];
-  int m = 2; // 2 points in the stack
-  // Step 5:
-  // Finally we can start the scanning process.
-  // When a non-convex relationship between the 3 points is found
-  // (either concave shape or duplicated points),
-  // we pop the previous point from the stack
-  // until the 3-point relationship is convex again, or
-  // until the stack only contains two points
-  for (int i = k + 1; i < num_in; i++) {
-    while (m > 1) {
-      auto q1 = q[i] - q[m - 2], q2 = q[m - 1] - q[m - 2];
-      // cross_2d() uses FMA and therefore computes round(round(q1.x*q2.y) -
-      // q2.x*q1.y) So it may not return 0 even when q1==q2. Therefore we
-      // compare round(q1.x*q2.y) and round(q2.x*q1.y) directly. (round means
-      // round to nearest floating point).
-      if (q1.x * q2.y >= q2.x * q1.y)
-        m--;
-      else
-        break;
-    }
-    // Using double also helps, but float can solve the issue for now.
-    // while (m > 1 && cross_2d<T>(q[i] - q[m - 2], q[m - 1] - q[m - 2])
-    //     >= 0) {
-    //   m--;
-    // }
-    q[m++] = q[i];
-  }
-
-  // Step 6 (Optional):
-  // In general sense we need the original coordinates, so we
-  // need to shift the points back (reverting Step 2)
-  // But if we're only interested in getting the area/perimeter of the shape
-  // We can simply return.
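// Illustration (not from the original file): the pop test above is
// cross_2d(q1, q2) = q1.x*q2.y - q2.x*q1.y >= 0. With q[m-2] = (0,0),
// q[m-1] = (1,0) and candidate q[i] = (2,1): q1 = (2,1), q2 = (1,0), so
// q1.x*q2.y = 0 < q2.x*q1.y = 1 and (1,0) is kept (a left turn). For the
// collinear candidate q[i] = (2,0), 0 >= 0 holds and (1,0) is popped.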
-  if (!shift_to_zero) {
-    for (int i = 0; i < m; i++) {
-      q[i] += start;
-    }
-  }
-
-  return m;
-}
-
-template <typename T>
-HOST_DEVICE_INLINE T polygon_area(const Point<T> (&q)[24], const int& m) {
-  if (m <= 2) {
-    return 0;
-  }
-
-  T area = 0;
-  for (int i = 1; i < m - 1; i++) {
-    area += fabs(cross_2d<T>(q[i] - q[0], q[i + 1] - q[0]));
-  }
-
-  return area / 2.0;
-}
-
-template <typename T>
-HOST_DEVICE_INLINE T rotated_boxes_intersection(
-    const RotatedBox<T>& box1,
-    const RotatedBox<T>& box2) {
-  // There are up to 4 x 4 + 4 + 4 = 24 intersections (including dups) returned
-  // from rotated_rect_intersection_pts
-  Point<T> intersectPts[24], orderedPts[24];
-
-  Point<T> pts1[4];
-  Point<T> pts2[4];
-  get_rotated_vertices<T>(box1, pts1);
-  get_rotated_vertices<T>(box2, pts2);
-
-  int num = get_intersection_points<T>(pts1, pts2, intersectPts);
-
-  if (num <= 2) {
-    return 0.0;
-  }
-
-  // Convex Hull to order the intersection points in clockwise order and find
-  // the contour area.
-  int num_convex = convex_hull_graham<T>(intersectPts, num, orderedPts, true);
-  return polygon_area<T>(orderedPts, num_convex);
-}
-
-} // namespace
-
-template <typename T>
-HOST_DEVICE_INLINE T
-single_box_iou_rotated(T const* const box1_raw, T const* const box2_raw) {
-  // shift center to the middle point to achieve higher precision in result
-  RotatedBox<T> box1, box2;
-  auto center_shift_x = (box1_raw[0] + box2_raw[0]) / 2.0;
-  auto center_shift_y = (box1_raw[1] + box2_raw[1]) / 2.0;
-  box1.x_ctr = box1_raw[0] - center_shift_x;
-  box1.y_ctr = box1_raw[1] - center_shift_y;
-  box1.w = box1_raw[2];
-  box1.h = box1_raw[3];
-  box1.a = box1_raw[4];
-  box2.x_ctr = box2_raw[0] - center_shift_x;
-  box2.y_ctr = box2_raw[1] - center_shift_y;
-  box2.w = box2_raw[2];
-  box2.h = box2_raw[3];
-  box2.a = box2_raw[4];
-
-  T area1 = box1.w * box1.h;
-  T area2 = box2.w * box2.h;
-  if (area1 < 1e-14 || area2 < 1e-14) {
-    return 0.f;
-  }
-
-  T intersection = rotated_boxes_intersection<T>(box1, box2);
-  T iou = intersection / (area1 + area2 - intersection);
-  return iou;
-}
-
-} // namespace detectron2
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp
deleted file mode 100755
index 0a5b7b90..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp
+++ /dev/null
@@ -1,507 +0,0 @@
-// Copyright (c) Facebook, Inc. and its affiliates.
-#include "cocoeval.h"
-#include <time.h>
-#include <functional>
-#include <iostream>
-#include <numeric>
-
-using namespace pybind11::literals;
-
-namespace detectron2 {
-
-namespace COCOeval {
-
-// Sort detections from highest score to lowest, such that
-// detection_instances[detection_sorted_indices[t]] >=
-// detection_instances[detection_sorted_indices[t+1]].
Use stable_sort to match -// original COCO API -void SortInstancesByDetectionScore( - const std::vector& detection_instances, - std::vector* detection_sorted_indices) { - detection_sorted_indices->resize(detection_instances.size()); - std::iota( - detection_sorted_indices->begin(), detection_sorted_indices->end(), 0); - std::stable_sort( - detection_sorted_indices->begin(), - detection_sorted_indices->end(), - [&detection_instances](size_t j1, size_t j2) { - return detection_instances[j1].score > detection_instances[j2].score; - }); -} - -// Partition the ground truth objects based on whether or not to ignore them -// based on area -void SortInstancesByIgnore( - const std::array& area_range, - const std::vector& ground_truth_instances, - std::vector* ground_truth_sorted_indices, - std::vector* ignores) { - ignores->clear(); - ignores->reserve(ground_truth_instances.size()); - for (auto o : ground_truth_instances) { - ignores->push_back( - o.ignore || o.area < area_range[0] || o.area > area_range[1]); - } - - ground_truth_sorted_indices->resize(ground_truth_instances.size()); - std::iota( - ground_truth_sorted_indices->begin(), - ground_truth_sorted_indices->end(), - 0); - std::stable_sort( - ground_truth_sorted_indices->begin(), - ground_truth_sorted_indices->end(), - [&ignores](size_t j1, size_t j2) { - return (int)(*ignores)[j1] < (int)(*ignores)[j2]; - }); -} - -// For each IOU threshold, greedily match each detected instance to a ground -// truth instance (if possible) and store the results -void MatchDetectionsToGroundTruth( - const std::vector& detection_instances, - const std::vector& detection_sorted_indices, - const std::vector& ground_truth_instances, - const std::vector& ground_truth_sorted_indices, - const std::vector& ignores, - const std::vector>& ious, - const std::vector& iou_thresholds, - const std::array& area_range, - ImageEvaluation* results) { - // Initialize memory to store return data matches and ignore - const int num_iou_thresholds = iou_thresholds.size(); - const int num_ground_truth = ground_truth_sorted_indices.size(); - const int num_detections = detection_sorted_indices.size(); - std::vector ground_truth_matches( - num_iou_thresholds * num_ground_truth, 0); - std::vector& detection_matches = results->detection_matches; - std::vector& detection_ignores = results->detection_ignores; - std::vector& ground_truth_ignores = results->ground_truth_ignores; - detection_matches.resize(num_iou_thresholds * num_detections, 0); - detection_ignores.resize(num_iou_thresholds * num_detections, false); - ground_truth_ignores.resize(num_ground_truth); - for (auto g = 0; g < num_ground_truth; ++g) { - ground_truth_ignores[g] = ignores[ground_truth_sorted_indices[g]]; - } - - for (auto t = 0; t < num_iou_thresholds; ++t) { - for (auto d = 0; d < num_detections; ++d) { - // information about best match so far (match=-1 -> unmatched) - double best_iou = std::min(iou_thresholds[t], 1 - 1e-10); - int match = -1; - for (auto g = 0; g < num_ground_truth; ++g) { - // if this ground truth instance is already matched and not a - // crowd, it cannot be matched to another detection - if (ground_truth_matches[t * num_ground_truth + g] > 0 && - !ground_truth_instances[ground_truth_sorted_indices[g]].is_crowd) { - continue; - } - - // if detected instance matched to a regular ground truth - // instance, we can break on the first ground truth instance - // tagged as ignore (because they are sorted by the ignore tag) - if (match >= 0 && !ground_truth_ignores[match] && - ground_truth_ignores[g]) 
{ - break; - } - - // if IOU overlap is the best so far, store the match appropriately - if (ious[d][ground_truth_sorted_indices[g]] >= best_iou) { - best_iou = ious[d][ground_truth_sorted_indices[g]]; - match = g; - } - } - // if match was made, store id of match for both detection and - // ground truth - if (match >= 0) { - detection_ignores[t * num_detections + d] = ground_truth_ignores[match]; - detection_matches[t * num_detections + d] = - ground_truth_instances[ground_truth_sorted_indices[match]].id; - ground_truth_matches[t * num_ground_truth + match] = - detection_instances[detection_sorted_indices[d]].id; - } - - // set unmatched detections outside of area range to ignore - const InstanceAnnotation& detection = - detection_instances[detection_sorted_indices[d]]; - detection_ignores[t * num_detections + d] = - detection_ignores[t * num_detections + d] || - (detection_matches[t * num_detections + d] == 0 && - (detection.area < area_range[0] || detection.area > area_range[1])); - } - } - - // store detection score results - results->detection_scores.resize(detection_sorted_indices.size()); - for (size_t d = 0; d < detection_sorted_indices.size(); ++d) { - results->detection_scores[d] = - detection_instances[detection_sorted_indices[d]].score; - } -} - -std::vector EvaluateImages( - const std::vector>& area_ranges, - int max_detections, - const std::vector& iou_thresholds, - const ImageCategoryInstances>& image_category_ious, - const ImageCategoryInstances& - image_category_ground_truth_instances, - const ImageCategoryInstances& - image_category_detection_instances) { - const int num_area_ranges = area_ranges.size(); - const int num_images = image_category_ground_truth_instances.size(); - const int num_categories = - image_category_ious.size() > 0 ? image_category_ious[0].size() : 0; - std::vector detection_sorted_indices; - std::vector ground_truth_sorted_indices; - std::vector ignores; - std::vector results_all( - num_images * num_area_ranges * num_categories); - - // Store results for each image, category, and area range combination. 
Results - // for each IOU threshold are packed into the same ImageEvaluation object - for (auto i = 0; i < num_images; ++i) { - for (auto c = 0; c < num_categories; ++c) { - const std::vector& ground_truth_instances = - image_category_ground_truth_instances[i][c]; - const std::vector& detection_instances = - image_category_detection_instances[i][c]; - - SortInstancesByDetectionScore( - detection_instances, &detection_sorted_indices); - if ((int)detection_sorted_indices.size() > max_detections) { - detection_sorted_indices.resize(max_detections); - } - - for (size_t a = 0; a < area_ranges.size(); ++a) { - SortInstancesByIgnore( - area_ranges[a], - ground_truth_instances, - &ground_truth_sorted_indices, - &ignores); - - MatchDetectionsToGroundTruth( - detection_instances, - detection_sorted_indices, - ground_truth_instances, - ground_truth_sorted_indices, - ignores, - image_category_ious[i][c], - iou_thresholds, - area_ranges[a], - &results_all - [c * num_area_ranges * num_images + a * num_images + i]); - } - } - } - - return results_all; -} - -// Convert a python list to a vector -template -std::vector list_to_vec(const py::list& l) { - std::vector v(py::len(l)); - for (int i = 0; i < (int)py::len(l); ++i) { - v[i] = l[i].cast(); - } - return v; -} - -// Helper function to Accumulate() -// Considers the evaluation results applicable to a particular category, area -// range, and max_detections parameter setting, which begin at -// evaluations[evaluation_index]. Extracts a sorted list of length n of all -// applicable detection instances concatenated across all images in the dataset, -// which are represented by the outputs evaluation_indices, detection_scores, -// image_detection_indices, and detection_sorted_indices--all of which are -// length n. evaluation_indices[i] stores the applicable index into -// evaluations[] for instance i, which has detection score detection_score[i], -// and is the image_detection_indices[i]'th of the list of detections -// for the image containing i. 
detection_sorted_indices[] defines a sorted -// permutation of the 3 other outputs -int BuildSortedDetectionList( - const std::vector& evaluations, - const int64_t evaluation_index, - const int64_t num_images, - const int max_detections, - std::vector* evaluation_indices, - std::vector* detection_scores, - std::vector* detection_sorted_indices, - std::vector* image_detection_indices) { - assert(evaluations.size() >= evaluation_index + num_images); - - // Extract a list of object instances of the applicable category, area - // range, and max detections requirements such that they can be sorted - image_detection_indices->clear(); - evaluation_indices->clear(); - detection_scores->clear(); - image_detection_indices->reserve(num_images * max_detections); - evaluation_indices->reserve(num_images * max_detections); - detection_scores->reserve(num_images * max_detections); - int num_valid_ground_truth = 0; - for (auto i = 0; i < num_images; ++i) { - const ImageEvaluation& evaluation = evaluations[evaluation_index + i]; - - for (int d = 0; - d < (int)evaluation.detection_scores.size() && d < max_detections; - ++d) { // detected instances - evaluation_indices->push_back(evaluation_index + i); - image_detection_indices->push_back(d); - detection_scores->push_back(evaluation.detection_scores[d]); - } - for (auto ground_truth_ignore : evaluation.ground_truth_ignores) { - if (!ground_truth_ignore) { - ++num_valid_ground_truth; - } - } - } - - // Sort detections by decreasing score, using stable sort to match - // python implementation - detection_sorted_indices->resize(detection_scores->size()); - std::iota( - detection_sorted_indices->begin(), detection_sorted_indices->end(), 0); - std::stable_sort( - detection_sorted_indices->begin(), - detection_sorted_indices->end(), - [&detection_scores](size_t j1, size_t j2) { - return (*detection_scores)[j1] > (*detection_scores)[j2]; - }); - - return num_valid_ground_truth; -} - -// Helper function to Accumulate() -// Compute a precision recall curve given a sorted list of detected instances -// encoded in evaluations, evaluation_indices, detection_scores, -// detection_sorted_indices, image_detection_indices (see -// BuildSortedDetectionList()). Using vectors precisions and recalls -// and temporary storage, output the results into precisions_out, recalls_out, -// and scores_out, which are large buffers containing many precion/recall curves -// for all possible parameter settings, with precisions_out_index and -// recalls_out_index defining the applicable indices to store results. 
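The core of the curve computation below is a running tally over the score-sorted detections: each true positive raises recall, and precision is re-evaluated at every step. A self-contained sketch of that arithmetic (illustrative only, independent of the pybind11 plumbing):

```
// Illustrative sketch only -- the running precision/recall tally used below.
#include <cstdio>
#include <vector>

int main() {
  // Score-sorted detections: true = matched (TP), false = unmatched (FP),
  // evaluated against 4 valid ground-truth instances.
  std::vector<bool> is_true_positive = {true, true, false, true, false};
  int num_valid_ground_truth = 4;
  int tp = 0, fp = 0;
  for (bool hit : is_true_positive) {
    if (hit) ++tp; else ++fp;
    double recall = static_cast<double>(tp) / num_valid_ground_truth;
    double precision = static_cast<double>(tp) / (tp + fp);
    std::printf("recall = %.2f, precision = %.2f\n", recall, precision);
  }
  // COCOeval then makes the precision array non-increasing from the right
  // and samples it at the fixed recall thresholds in params.recThrs.
}
```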
-void ComputePrecisionRecallCurve( - const int64_t precisions_out_index, - const int64_t precisions_out_stride, - const int64_t recalls_out_index, - const std::vector& recall_thresholds, - const int iou_threshold_index, - const int num_iou_thresholds, - const int num_valid_ground_truth, - const std::vector& evaluations, - const std::vector& evaluation_indices, - const std::vector& detection_scores, - const std::vector& detection_sorted_indices, - const std::vector& image_detection_indices, - std::vector* precisions, - std::vector* recalls, - std::vector* precisions_out, - std::vector* scores_out, - std::vector* recalls_out) { - assert(recalls_out->size() > recalls_out_index); - - // Compute precision/recall for each instance in the sorted list of detections - int64_t true_positives_sum = 0, false_positives_sum = 0; - precisions->clear(); - recalls->clear(); - precisions->reserve(detection_sorted_indices.size()); - recalls->reserve(detection_sorted_indices.size()); - assert(!evaluations.empty() || detection_sorted_indices.empty()); - for (auto detection_sorted_index : detection_sorted_indices) { - const ImageEvaluation& evaluation = - evaluations[evaluation_indices[detection_sorted_index]]; - const auto num_detections = - evaluation.detection_matches.size() / num_iou_thresholds; - const auto detection_index = iou_threshold_index * num_detections + - image_detection_indices[detection_sorted_index]; - assert(evaluation.detection_matches.size() > detection_index); - assert(evaluation.detection_ignores.size() > detection_index); - const int64_t detection_match = - evaluation.detection_matches[detection_index]; - const bool detection_ignores = - evaluation.detection_ignores[detection_index]; - const auto true_positive = detection_match > 0 && !detection_ignores; - const auto false_positive = detection_match == 0 && !detection_ignores; - if (true_positive) { - ++true_positives_sum; - } - if (false_positive) { - ++false_positives_sum; - } - - const double recall = - static_cast(true_positives_sum) / num_valid_ground_truth; - recalls->push_back(recall); - const int64_t num_valid_detections = - true_positives_sum + false_positives_sum; - const double precision = num_valid_detections > 0 - ? static_cast(true_positives_sum) / num_valid_detections - : 0.0; - precisions->push_back(precision); - } - - (*recalls_out)[recalls_out_index] = !recalls->empty() ? 
recalls->back() : 0; - - for (int64_t i = static_cast(precisions->size()) - 1; i > 0; --i) { - if ((*precisions)[i] > (*precisions)[i - 1]) { - (*precisions)[i - 1] = (*precisions)[i]; - } - } - - // Sample the per instance precision/recall list at each recall threshold - for (size_t r = 0; r < recall_thresholds.size(); ++r) { - // first index in recalls >= recall_thresholds[r] - std::vector::iterator low = std::lower_bound( - recalls->begin(), recalls->end(), recall_thresholds[r]); - size_t precisions_index = low - recalls->begin(); - - const auto results_ind = precisions_out_index + r * precisions_out_stride; - assert(results_ind < precisions_out->size()); - assert(results_ind < scores_out->size()); - if (precisions_index < precisions->size()) { - (*precisions_out)[results_ind] = (*precisions)[precisions_index]; - (*scores_out)[results_ind] = - detection_scores[detection_sorted_indices[precisions_index]]; - } else { - (*precisions_out)[results_ind] = 0; - (*scores_out)[results_ind] = 0; - } - } -} -py::dict Accumulate( - const py::object& params, - const std::vector& evaluations) { - const std::vector recall_thresholds = - list_to_vec(params.attr("recThrs")); - const std::vector max_detections = - list_to_vec(params.attr("maxDets")); - const int num_iou_thresholds = py::len(params.attr("iouThrs")); - const int num_recall_thresholds = py::len(params.attr("recThrs")); - const int num_categories = params.attr("useCats").cast() == 1 - ? py::len(params.attr("catIds")) - : 1; - const int num_area_ranges = py::len(params.attr("areaRng")); - const int num_max_detections = py::len(params.attr("maxDets")); - const int num_images = py::len(params.attr("imgIds")); - - std::vector precisions_out( - num_iou_thresholds * num_recall_thresholds * num_categories * - num_area_ranges * num_max_detections, - -1); - std::vector recalls_out( - num_iou_thresholds * num_categories * num_area_ranges * - num_max_detections, - -1); - std::vector scores_out( - num_iou_thresholds * num_recall_thresholds * num_categories * - num_area_ranges * num_max_detections, - -1); - - // Consider the list of all detected instances in the entire dataset in one - // large list. evaluation_indices, detection_scores, - // image_detection_indices, and detection_sorted_indices all have the same - // length as this list, such that each entry corresponds to one detected - // instance - std::vector evaluation_indices; // indices into evaluations[] - std::vector detection_scores; // detection scores of each instance - std::vector detection_sorted_indices; // sorted indices of all - // instances in the dataset - std::vector - image_detection_indices; // indices into the list of detected instances in - // the same image as each instance - std::vector precisions, recalls; - - for (auto c = 0; c < num_categories; ++c) { - for (auto a = 0; a < num_area_ranges; ++a) { - for (auto m = 0; m < num_max_detections; ++m) { - // The COCO PythonAPI assumes evaluations[] (the return value of - // COCOeval::EvaluateImages() is one long list storing results for each - // combination of category, area range, and image id, with categories in - // the outermost loop and images in the innermost loop. 
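// Illustration (not from the original file): with num_categories = 80,
// num_area_ranges = 4 and num_images = 5000, the block for category c = 2,
// area range a = 1 starts at evaluations[2*4*5000 + 1*5000] =
// evaluations[45000], and image i = 7 of that block sits at index 45007 --
// the category-outermost / image-innermost packing described above.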
- const int64_t evaluations_index = - c * num_area_ranges * num_images + a * num_images; - int num_valid_ground_truth = BuildSortedDetectionList( - evaluations, - evaluations_index, - num_images, - max_detections[m], - &evaluation_indices, - &detection_scores, - &detection_sorted_indices, - &image_detection_indices); - - if (num_valid_ground_truth == 0) { - continue; - } - - for (auto t = 0; t < num_iou_thresholds; ++t) { - // recalls_out is a flattened vectors representing a - // num_iou_thresholds X num_categories X num_area_ranges X - // num_max_detections matrix - const int64_t recalls_out_index = - t * num_categories * num_area_ranges * num_max_detections + - c * num_area_ranges * num_max_detections + - a * num_max_detections + m; - - // precisions_out and scores_out are flattened vectors - // representing a num_iou_thresholds X num_recall_thresholds X - // num_categories X num_area_ranges X num_max_detections matrix - const int64_t precisions_out_stride = - num_categories * num_area_ranges * num_max_detections; - const int64_t precisions_out_index = t * num_recall_thresholds * - num_categories * num_area_ranges * num_max_detections + - c * num_area_ranges * num_max_detections + - a * num_max_detections + m; - - ComputePrecisionRecallCurve( - precisions_out_index, - precisions_out_stride, - recalls_out_index, - recall_thresholds, - t, - num_iou_thresholds, - num_valid_ground_truth, - evaluations, - evaluation_indices, - detection_scores, - detection_sorted_indices, - image_detection_indices, - &precisions, - &recalls, - &precisions_out, - &scores_out, - &recalls_out); - } - } - } - } - - time_t rawtime; - struct tm local_time; - std::array buffer; - time(&rawtime); -#ifdef _WIN32 - localtime_s(&local_time, &rawtime); -#else - localtime_r(&rawtime, &local_time); -#endif - strftime( - buffer.data(), 200, "%Y-%m-%d %H:%num_max_detections:%S", &local_time); - return py::dict( - "params"_a = params, - "counts"_a = std::vector( - {num_iou_thresholds, - num_recall_thresholds, - num_categories, - num_area_ranges, - num_max_detections}), - "date"_a = buffer, - "precision"_a = precisions_out, - "recall"_a = recalls_out, - "scores"_a = scores_out); -} - -} // namespace COCOeval - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h deleted file mode 100755 index db246e49..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h +++ /dev/null @@ -1,88 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once - -#include -#include -#include -#include -#include - -namespace py = pybind11; - -namespace detectron2 { - -namespace COCOeval { - -// Annotation data for a single object instance in an image -struct InstanceAnnotation { - InstanceAnnotation( - uint64_t id, - double score, - double area, - bool is_crowd, - bool ignore) - : id{id}, score{score}, area{area}, is_crowd{is_crowd}, ignore{ignore} {} - uint64_t id; - double score = 0.; - double area = 0.; - bool is_crowd = false; - bool ignore = false; -}; - -// Stores intermediate results for evaluating detection results for a single -// image that has D detected instances and G ground truth instances. 
This stores -// matches between detected and ground truth instances -struct ImageEvaluation { - // For each of the D detected instances, the id of the matched ground truth - // instance, or 0 if unmatched - std::vector detection_matches; - - // The detection score of each of the D detected instances - std::vector detection_scores; - - // Marks whether or not each of G instances was ignored from evaluation (e.g., - // because it's outside area_range) - std::vector ground_truth_ignores; - - // Marks whether or not each of D instances was ignored from evaluation (e.g., - // because it's outside aRng) - std::vector detection_ignores; -}; - -template -using ImageCategoryInstances = std::vector>>; - -// C++ implementation of COCO API cocoeval.py::COCOeval.evaluateImg(). For each -// combination of image, category, area range settings, and IOU thresholds to -// evaluate, it matches detected instances to ground truth instances and stores -// the results into a vector of ImageEvaluation results, which will be -// interpreted by the COCOeval::Accumulate() function to produce precion-recall -// curves. The parameters of nested vectors have the following semantics: -// image_category_ious[i][c][d][g] is the intersection over union of the d'th -// detected instance and g'th ground truth instance of -// category category_ids[c] in image image_ids[i] -// image_category_ground_truth_instances[i][c] is a vector of ground truth -// instances in image image_ids[i] of category category_ids[c] -// image_category_detection_instances[i][c] is a vector of detected -// instances in image image_ids[i] of category category_ids[c] -std::vector EvaluateImages( - const std::vector>& area_ranges, // vector of 2-tuples - int max_detections, - const std::vector& iou_thresholds, - const ImageCategoryInstances>& image_category_ious, - const ImageCategoryInstances& - image_category_ground_truth_instances, - const ImageCategoryInstances& - image_category_detection_instances); - -// C++ implementation of COCOeval.accumulate(), which generates precision -// recall curves for each set of category, IOU threshold, detection area range, -// and max number of detections parameters. It is assumed that the parameter -// evaluations is the return value of the functon COCOeval::EvaluateImages(), -// which was called with the same parameter settings params -py::dict Accumulate( - const py::object& params, - const std::vector& evalutations); - -} // namespace COCOeval -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu deleted file mode 100755 index 6dfe1b90..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu +++ /dev/null @@ -1,26 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -#include - -namespace detectron2 { -int get_cudart_version() { -// Not a ROCM platform: Either HIP is not used, or -// it is used, but platform is not ROCM (i.e. it is CUDA) -#if !defined(__HIP_PLATFORM_HCC__) - return CUDART_VERSION; -#else - int version = 0; - -#if HIP_VERSION_MAJOR != 0 - // Create a convention similar to that of CUDA, as assumed by other - // parts of the code. 
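// Illustration (not from the original file): the convention below mirrors
// CUDA's CUDART_VERSION encoding (e.g. 10020 for CUDA 10.2); HIP 3.7 yields
// version = 7 + 3 * 100 = 307.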
- - version = HIP_VERSION_MINOR; - version += (HIP_VERSION_MAJOR * 100); -#else - hipRuntimeGetVersion(&version); -#endif - return version; -#endif -} -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h deleted file mode 100755 index 965c1bfd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h +++ /dev/null @@ -1,377 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -#if defined(WITH_CUDA) || defined(WITH_HIP) -int deform_conv_forward_cuda( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step); - -int deform_conv_backward_input_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step); - -int deform_conv_backward_parameters_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step); - -void modulated_deform_conv_cuda_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias); - -void modulated_deform_conv_cuda_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, - int dilation_w, - int group, - int deformable_group, - const bool with_bias); - -#endif - -inline int deform_conv_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_forward_cuda( - input, - weight, - offset, - output, - columns, - ones, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); 
-#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline int deform_conv_backward_input( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - if (gradOutput.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_backward_input_cuda( - input, - offset, - gradOutput, - gradInput, - gradOffset, - weight, - columns, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline int deform_conv_backward_filter( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step) { - if (gradOutput.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_backward_parameters_cuda( - input, - offset, - gradOutput, - gradWeight, - columns, - ones, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - scale, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline void modulated_deform_conv_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(bias.is_cuda(), "bias tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return modulated_deform_conv_cuda_forward( - input, - weight, - bias, - ones, - offset, - mask, - output, - columns, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group, - with_bias); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline void modulated_deform_conv_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, 
- int dilation_w, - int group, - int deformable_group, - const bool with_bias) { - if (grad_output.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(bias.is_cuda(), "bias tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return modulated_deform_conv_cuda_backward( - input, - weight, - bias, - ones, - offset, - mask, - columns, - grad_input, - grad_weight, - grad_bias, - grad_offset, - grad_mask, - grad_output, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group, - with_bias); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu deleted file mode 100755 index 2072bb85..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu +++ /dev/null @@ -1,1223 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -// modified from -// https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp -// Original license: Apache 2.0 - -// modify from -// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c -// Original license: Apache 2.0 - -#include - -#include "deform_conv.h" - -#include -#include - -namespace detectron2 { - -void deformable_im2col( - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor data_col); - -void deformable_col2im( - const at::Tensor data_col, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_im); - -void deformable_col2im_coord( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_offset); - -void modulated_deformable_im2col_cuda( - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor 
data_col); - -void modulated_deformable_col2im_cuda( - const at::Tensor data_col, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_im); - -void modulated_deformable_col2im_coord_cuda( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_offset, - at::Tensor grad_mask); - -void shape_check( - at::Tensor input, - at::Tensor offset, - at::Tensor* gradOutput, - at::Tensor weight, - int kH, - int kW, - int dH, - int dW, - int padH, - int padW, - int dilationH, - int dilationW, - int group, - int deformable_group) { - TORCH_CHECK( - weight.ndimension() == 4, - "4D weight tensor (nOutputPlane,nInputPlane,kH,kW) expected, " - "but got: %s", - weight.ndimension()); - - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - TORCH_CHECK( - kW > 0 && kH > 0, - "kernel size should be greater than zero, but got kH: %d kW: %d", - kH, - kW); - - TORCH_CHECK( - (weight.size(2) == kH && weight.size(3) == kW), - "kernel size should be consistent with weight, ", - "but got kH: %d kW: %d weight.size(2): %d, weight.size(3): %d", - kH, - kW, - weight.size(2), - weight.size(3)); - - TORCH_CHECK( - dW > 0 && dH > 0, - "stride should be greater than zero, but got dH: %d dW: %d", - dH, - dW); - - TORCH_CHECK( - dilationW > 0 && dilationH > 0, - "dilation should be greater than 0, but got dilationH: %d dilationW: %d", - dilationH, - dilationW); - - int ndim = input.ndimension(); - int dimf = 0; - int dimh = 1; - int dimw = 2; - - if (ndim == 4) { - dimf++; - dimh++; - dimw++; - } - - TORCH_CHECK( - ndim == 3 || ndim == 4, - "3D or 4D input tensor expected but got: %s", - ndim); - - long nInputPlane = weight.size(1) * group; - long inputHeight = input.size(dimh); - long inputWidth = input.size(dimw); - long nOutputPlane = weight.size(0); - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - - TORCH_CHECK( - nInputPlane % deformable_group == 0, - "input channels must divide deformable group size"); - - if (outputWidth < 1 || outputHeight < 1) - AT_ERROR( - "Given input size: (%ld x %ld x %ld). " - "Calculated output size: (%ld x %ld x %ld). 
Output size is too small", - nInputPlane, - inputHeight, - inputWidth, - nOutputPlane, - outputHeight, - outputWidth); - - TORCH_CHECK( - input.size(1) == nInputPlane, - "invalid number of input planes, expected: %d, but got: %d", - nInputPlane, - input.size(1)); - - TORCH_CHECK( - (inputHeight + 2 * padH >= kH && inputWidth + 2 * padW >= kW), - "input image is smaller than kernel"); - - TORCH_CHECK( - (offset.size(2) == outputHeight && offset.size(3) == outputWidth), - "invalid spatial size of offset, expected height: %d width: %d, but " - "got height: %d width: %d", - outputHeight, - outputWidth, - offset.size(2), - offset.size(3)); - - TORCH_CHECK( - (offset.size(1) == deformable_group * 2 * kH * kW), - "invalid number of channels of offset"); - - if (gradOutput != NULL) { - TORCH_CHECK( - gradOutput->size(dimf) == nOutputPlane, - "invalid number of gradOutput planes, expected: %d, but got: %d", - nOutputPlane, - gradOutput->size(dimf)); - - TORCH_CHECK( - (gradOutput->size(dimh) == outputHeight && - gradOutput->size(dimw) == outputWidth), - "invalid size of gradOutput, expected height: %d width: %d , but " - "got height: %d width: %d", - outputHeight, - outputWidth, - gradOutput->size(dimh), - gradOutput->size(dimw)); - } -} - -int deform_conv_forward_cuda( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - // todo: resize columns to include im2col: done - // todo: add im2col_step as input - // todo: add new output buffer and transpose it to output (or directly - // transpose output) todo: possibly change data indexing because of - // parallel_imgs - - shape_check( - input, - offset, - NULL, - weight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - weight = weight.contiguous(); - - int batch = 1; - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input.unsqueeze_(0); - offset.unsqueeze_(0); - } - - // todo: assert batchsize dividable by im2col_step - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = weight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), "invalid batch size of offset"); - - output = output.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - if (ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < outputHeight * outputWidth) { - ones = at::ones({outputHeight, outputWidth}, input.options()); - } - - input = input.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - at::Tensor output_buffer = at::zeros( - {batchSize / im2col_step, - nOutputPlane, - im2col_step * outputHeight, - outputWidth}, - output.options()); - - output_buffer = output_buffer.view( - {output_buffer.size(0), - group, - 
output_buffer.size(1) / group, - output_buffer.size(2), - output_buffer.size(3)}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - deformable_im2col( - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - columns); - - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - - for (int g = 0; g < group; g++) { - output_buffer[elt][g] = output_buffer[elt][g] - .flatten(1) - .addmm_(weight[g].flatten(1), columns[g]) - .view_as(output_buffer[elt][g]); - } - } - - output_buffer = output_buffer.view( - {output_buffer.size(0), - output_buffer.size(1) * output_buffer.size(2), - output_buffer.size(3), - output_buffer.size(4)}); - - output_buffer = output_buffer.view( - {batchSize / im2col_step, - nOutputPlane, - im2col_step, - outputHeight, - outputWidth}); - output_buffer.transpose_(1, 2); - output.copy_(output_buffer); - output = output.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - output = output.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - offset = offset.view({offset.size(1), offset.size(2), offset.size(3)}); - } - - return 1; -} - -int deform_conv_backward_input_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - shape_check( - input, - offset, - &gradOutput, - weight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - gradOutput = gradOutput.contiguous(); - weight = weight.contiguous(); - - int batch = 1; - - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input = input.view({1, input.size(0), input.size(1), input.size(2)}); - offset = offset.view({1, offset.size(0), offset.size(1), offset.size(2)}); - gradOutput = gradOutput.view( - {1, gradOutput.size(0), gradOutput.size(1), gradOutput.size(2)}); - } - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = weight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), 3, "invalid batch size of offset"); - gradInput = gradInput.view({batchSize, nInputPlane, inputHeight, inputWidth}); - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - // change order of grad output - gradOutput = gradOutput.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - gradOutput.transpose_(1, 2); - - gradInput = gradInput.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - input = input.view( - 
{batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - gradOffset = gradOffset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - // divide into groups - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - gradOutput = gradOutput.view( - {gradOutput.size(0), - group, - gradOutput.size(1) / group, - gradOutput.size(2), - gradOutput.size(3), - gradOutput.size(4)}); - - for (int g = 0; g < group; g++) { - columns[g] = columns[g].addmm_( - weight[g].flatten(1).transpose(0, 1), - gradOutput[elt][g].flatten(1), - 0.0f, - 1.0f); - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - gradOutput = gradOutput.view( - {gradOutput.size(0), - gradOutput.size(1) * gradOutput.size(2), - gradOutput.size(3), - gradOutput.size(4), - gradOutput.size(5)}); - - deformable_col2im_coord( - columns, - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - gradOffset[elt]); - - deformable_col2im( - columns, - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - gradInput[elt]); - } - - gradOutput.transpose_(1, 2); - gradOutput = - gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - gradInput = gradInput.view({batchSize, nInputPlane, inputHeight, inputWidth}); - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - gradOffset = gradOffset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - gradInput = gradInput.view({nInputPlane, inputHeight, inputWidth}); - offset = offset.view({offset.size(1), offset.size(2), offset.size(3)}); - gradOffset = - gradOffset.view({offset.size(1), offset.size(2), offset.size(3)}); - } - - return 1; -} - -int deform_conv_backward_parameters_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step) { - // todo: transpose and reshape outGrad - // todo: reshape columns - // todo: add im2col_step as input - - shape_check( - input, - offset, - &gradOutput, - gradWeight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - gradOutput = gradOutput.contiguous(); - - int batch = 1; - - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input = input.view( - at::IntList({1, input.size(0), input.size(1), input.size(2)})); - gradOutput = gradOutput.view( - {1, gradOutput.size(0), gradOutput.size(1), 
gradOutput.size(2)}); - } - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = gradWeight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), "invalid batch size of offset"); - - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - gradOutput = gradOutput.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - gradOutput.transpose_(1, 2); - - at::Tensor gradOutputBuffer = at::zeros_like(gradOutput); - gradOutputBuffer = gradOutputBuffer.view( - {batchSize / im2col_step, - nOutputPlane, - im2col_step, - outputHeight, - outputWidth}); - gradOutputBuffer.copy_(gradOutput); - // gradOutput is not contiguous, so we do reshape (instead of view) next - gradOutputBuffer = gradOutputBuffer.reshape( - {batchSize / im2col_step, - nOutputPlane, - im2col_step * outputHeight, - outputWidth}); - - gradOutput.transpose_(1, 2); - gradOutput = - gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - input = input.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - deformable_im2col( - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - columns); - - // divide into group - gradOutputBuffer = gradOutputBuffer.view( - {gradOutputBuffer.size(0), - group, - gradOutputBuffer.size(1) / group, - gradOutputBuffer.size(2), - gradOutputBuffer.size(3)}); - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - gradWeight = gradWeight.view( - {group, - gradWeight.size(0) / group, - gradWeight.size(1), - gradWeight.size(2), - gradWeight.size(3)}); - - for (int g = 0; g < group; g++) { - gradWeight[g] = gradWeight[g] - .flatten(1) - .addmm_( - gradOutputBuffer[elt][g].flatten(1), - columns[g].transpose(1, 0), - 1.0, - scale) - .view_as(gradWeight[g]); - } - gradOutputBuffer = gradOutputBuffer.view( - {gradOutputBuffer.size(0), - gradOutputBuffer.size(1) * gradOutputBuffer.size(2), - gradOutputBuffer.size(3), - gradOutputBuffer.size(4)}); - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - gradWeight = gradWeight.view( - {gradWeight.size(0) * gradWeight.size(1), - gradWeight.size(2), - gradWeight.size(3), - gradWeight.size(4)}); - } - - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - } - - return 1; -} - -void modulated_deform_conv_cuda_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const 
int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias) { - shape_check( - input, - offset, - NULL, - weight, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group); - - TORCH_CHECK(input.is_contiguous(), "input tensor has to be contiguous"); - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - const int batch = input.size(0); - const int channels = input.size(1); - const int height = input.size(2); - const int width = input.size(3); - - const int channels_out = weight.size(0); - const int channels_kernel = weight.size(1); - const int kernel_h_ = weight.size(2); - const int kernel_w_ = weight.size(3); - - if (kernel_h_ != kernel_h || kernel_w_ != kernel_w) - AT_ERROR( - "Input shape and kernel shape wont match: (%d x %d vs %d x %d).", - kernel_h_, - kernel_w, - kernel_h_, - kernel_w_); - if (channels != channels_kernel * group) - AT_ERROR( - "Input shape and kernel channels wont match: (%d vs %d).", - channels, - channels_kernel * group); - - const int height_out = - (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1; - const int width_out = - (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1; - - // mask shape check - TORCH_CHECK( - (mask.size(2) == height_out && mask.size(3) == width_out), - "invalid spatial size of mask, expected height: %d width: %d, but " - "got height: %d width: %d", - height_out, - width_out, - mask.size(2), - mask.size(3)); - - TORCH_CHECK( - (mask.size(1) == deformable_group * kernel_h * kernel_w), - "invalid number of channels of mask"); - - if (ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < height_out * width_out) { - // Resize plane and fill with ones... 
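// Note: the `ones` plane is a height_out x width_out buffer of 1s; the forward
// pass only keeps it correctly sized, while modulated_deform_conv_cuda_backward
// below reuses it as a column vector so grad_bias is accumulated with a single
// addmm_ against the flattened grad_output.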
- ones = at::ones({height_out, width_out}, input.options()); - } - - // resize output - output = output.view({batch, channels_out, height_out, width_out}).zero_(); - // resize temporary columns - columns = at::zeros( - {channels * kernel_h * kernel_w, 1 * height_out * width_out}, - input.options()); - - output = output.view( - {output.size(0), - group, - output.size(1) / group, - output.size(2), - output.size(3)}); - - for (int b = 0; b < batch; b++) { - modulated_deformable_im2col_cuda( - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - columns); - - // divide into group - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - - for (int g = 0; g < group; g++) { - output[b][g] = output[b][g] - .flatten(1) - .addmm_(weight[g].flatten(1), columns[g]) - .view_as(output[b][g]); - } - - weight = weight.view( - {weight.size(0) * weight.size(1), - weight.size(2), - weight.size(3), - weight.size(4)}); - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - } - - output = output.view( - {output.size(0), - output.size(1) * output.size(2), - output.size(3), - output.size(4)}); - - if (with_bias) { - output += bias.view({1, bias.size(0), 1, 1}); - } -} - -void modulated_deform_conv_cuda_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, - int dilation_w, - int group, - int deformable_group, - const bool with_bias) { - shape_check( - input, - offset, - &grad_output, - weight, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group); - - TORCH_CHECK(input.is_contiguous(), "input tensor has to be contiguous"); - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - const int batch = input.size(0); - const int channels = input.size(1); - const int height = input.size(2); - const int width = input.size(3); - - const int channels_kernel = weight.size(1); - const int kernel_h_ = weight.size(2); - const int kernel_w_ = weight.size(3); - if (kernel_h_ != kernel_h || kernel_w_ != kernel_w) - AT_ERROR( - "Input shape and kernel shape wont match: (%d x %d vs %d x %d).", - kernel_h_, - kernel_w, - kernel_h_, - kernel_w_); - if (channels != channels_kernel * group) - AT_ERROR( - "Input shape and kernel channels wont match: (%d vs %d).", - channels, - channels_kernel * group); - - const int height_out = - (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1; - const int width_out = - (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1; - - // mask shape check - TORCH_CHECK( - (mask.size(2) == height_out && mask.size(3) == width_out), - "invalid spatial size of mask, expected height: %d width: %d, but " - "got height: %d width: %d", - height_out, - width_out, - mask.size(2), - mask.size(3)); - - TORCH_CHECK( - (mask.size(1) == deformable_group * kernel_h * kernel_w), - "invalid number of channels of mask"); - - if 
(ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < height_out * width_out) { - // Resize plane and fill with ones... - ones = at::ones({height_out, width_out}, input.options()); - } - - grad_input = grad_input.view({batch, channels, height, width}); - columns = at::zeros( - {channels * kernel_h * kernel_w, height_out * width_out}, - input.options()); - - grad_output = grad_output.view( - {grad_output.size(0), - group, - grad_output.size(1) / group, - grad_output.size(2), - grad_output.size(3)}); - - for (int b = 0; b < batch; b++) { - // divide int group - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - - for (int g = 0; g < group; g++) { - columns[g].addmm_( - weight[g].flatten(1).transpose(0, 1), - grad_output[b][g].flatten(1), - 0.0f, - 1.0f); - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - weight = weight.view( - {weight.size(0) * weight.size(1), - weight.size(2), - weight.size(3), - weight.size(4)}); - - // gradient w.r.t. input coordinate data - modulated_deformable_col2im_coord_cuda( - columns, - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - grad_offset[b], - grad_mask[b]); - // gradient w.r.t. input data - modulated_deformable_col2im_cuda( - columns, - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - grad_input[b]); - - // gradient w.r.t. 
weight, dWeight should accumulate across the batch and - // group - modulated_deformable_im2col_cuda( - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - columns); - - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - grad_weight = grad_weight.view( - {group, - grad_weight.size(0) / group, - grad_weight.size(1), - grad_weight.size(2), - grad_weight.size(3)}); - if (with_bias) - grad_bias = grad_bias.view({group, grad_bias.size(0) / group}); - - for (int g = 0; g < group; g++) { - grad_weight[g] = - grad_weight[g] - .flatten(1) - .addmm_(grad_output[b][g].flatten(1), columns[g].transpose(0, 1)) - .view_as(grad_weight[g]); - if (with_bias) { - grad_bias[g] = - grad_bias[g] - .view({-1, 1}) - .addmm_(grad_output[b][g].flatten(1), ones.view({-1, 1})) - .view(-1); - } - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - grad_weight = grad_weight.view( - {grad_weight.size(0) * grad_weight.size(1), - grad_weight.size(2), - grad_weight.size(3), - grad_weight.size(4)}); - if (with_bias) - grad_bias = grad_bias.view({grad_bias.size(0) * grad_bias.size(1)}); - } - grad_output = grad_output.view( - {grad_output.size(0) * grad_output.size(1), - grad_output.size(2), - grad_output.size(3), - grad_output.size(4)}); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu deleted file mode 100755 index f299c7ad..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu +++ /dev/null @@ -1,1288 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -// modified from -// https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu -// Original license: Apache 2.0 -// clang-format off - -// modify from -// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu - -/*! - ******************* BEGIN Caffe Copyright Notice and Disclaimer ***************** - * - * COPYRIGHT - * - * All contributions by the University of California: - * Copyright (c) 2014-2017 The Regents of the University of California (Regents) - * All rights reserved. - * - * All other contributions: - * Copyright (c) 2014-2017, the respective contributors - * All rights reserved. - * - * Caffe uses a shared copyright model: each contributor holds copyright over - * their contributions to Caffe. The project versioning records all such - * contribution and copyright details. If a contributor wants to further mark - * their specific copyright on a particular contribution, they should indicate - * their copyright solely in the commit message of the change when it is - * committed. - * - * LICENSE - * - * Redistribution and use in source and binary forms, with or without - * modification, are permitted provided that the following conditions are met: - * - * 1. Redistributions of source code must retain the above copyright notice, this - * list of conditions and the following disclaimer. - * 2. 
Redistributions in binary form must reproduce the above copyright notice, - * this list of conditions and the following disclaimer in the documentation - * and/or other materials provided with the distribution. - * - * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" - *AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE - *IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE - * DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE - *FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL - *DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR - *SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER - *CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, - *OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE - *OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - * - * CONTRIBUTION AGREEMENT - * - * By contributing to the BVLC/caffe repository through pull-request, comment, - * or otherwise, the contributor releases their content to the - * license and copyright terms herein. - * - ***************** END Caffe Copyright Notice and Disclaimer ********************* - * - * Copyright (c) 2018 Microsoft - * Licensed under The MIT License [see LICENSE for details] - * \file modulated_deformable_im2col.cuh - * \brief Function definitions of converting an image to - * column matrix based on kernel, padding, dilation, and offset. - * These functions are mainly used in deformable convolution operators. - * \ref: https://arxiv.org/abs/1703.06211 - * \author Yuwen Xiong, Haozhi Qi, Jifeng Dai, Xizhou Zhu, Han Hu, Dazhi Cheng - */ - -#include -#include -#include -#include -#include -#include - -using namespace at; - -#define CUDA_KERNEL_LOOP(i, n) \ - for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < (n); \ - i += blockDim.x * gridDim.x) - - -namespace { - -const int CUDA_NUM_THREADS = 1024; -const int kMaxGridNum = 65535; - -inline int GET_BLOCKS(const int N) { - return std::min(kMaxGridNum, (N + CUDA_NUM_THREADS - 1) / CUDA_NUM_THREADS); -} - -} - -template -__device__ scalar_t deformable_im2col_bilinear( - const scalar_t* bottom_data, - const int data_width, - const int height, - const int width, - scalar_t h, - scalar_t w) { - int h_low = floor(h); - int w_low = floor(w); - int h_high = h_low + 1; - int w_high = w_low + 1; - - scalar_t lh = h - h_low; - scalar_t lw = w - w_low; - scalar_t hh = 1 - lh, hw = 1 - lw; - - scalar_t v1 = 0; - if (h_low >= 0 && w_low >= 0) - v1 = bottom_data[h_low * data_width + w_low]; - scalar_t v2 = 0; - if (h_low >= 0 && w_high <= width - 1) - v2 = bottom_data[h_low * data_width + w_high]; - scalar_t v3 = 0; - if (h_high <= height - 1 && w_low >= 0) - v3 = bottom_data[h_high * data_width + w_low]; - scalar_t v4 = 0; - if (h_high <= height - 1 && w_high <= width - 1) - v4 = bottom_data[h_high * data_width + w_high]; - - scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw; - - scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - return val; -} - -template -__device__ scalar_t get_gradient_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int h, - const int w, - const int height, - const int width) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = 
argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - if (h == argmax_h_low && w == argmax_w_low) - weight = (h + 1 - argmax_h) * (w + 1 - argmax_w); - if (h == argmax_h_low && w == argmax_w_high) - weight = (h + 1 - argmax_h) * (argmax_w + 1 - w); - if (h == argmax_h_high && w == argmax_w_low) - weight = (argmax_h + 1 - h) * (w + 1 - argmax_w); - if (h == argmax_h_high && w == argmax_w_high) - weight = (argmax_h + 1 - h) * (argmax_w + 1 - w); - return weight; -} - -template -__device__ scalar_t get_coordinate_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int height, - const int width, - const scalar_t* im_data, - const int data_width, - const int bp_dir) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - - if (bp_dir == 0) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += -1 * (argmax_w - argmax_w_low) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_w - argmax_w_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } else if (bp_dir == 1) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += -1 * (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } - - return weight; -} - -template -__global__ void deformable_im2col_gpu_kernel( - const int n, - const scalar_t* data_im, - const scalar_t* data_offset, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int num_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* data_col) { - CUDA_KERNEL_LOOP(index, n) { - // index index of output matrix - const int w_col = index % width_col; - const int h_col = (index / width_col) % height_col; - const int b_col = (index / width_col / height_col) % batch_size; - const int c_im = (index / width_col / height_col) / batch_size; - const int c_col = c_im * kernel_h * kernel_w; - - // compute deformable group index - const int deformable_group_index = c_im / channel_per_deformable_group; - - const int h_in = h_col * stride_h - pad_h; - const int w_in = w_col * stride_w - pad_w; - scalar_t* data_col_ptr = data_col + - ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col; - // const scalar_t* data_im_ptr = data_im + 
((b_col * num_channels + c_im) * - // height + h_in) * width + w_in; - const scalar_t* data_im_ptr = - data_im + (b_col * num_channels + c_im) * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - for (int i = 0; i < kernel_h; ++i) { - for (int j = 0; j < kernel_w; ++j) { - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + - w_col; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - scalar_t val = static_cast(0); - const scalar_t h_im = h_in + i * dilation_h + offset_h; - const scalar_t w_im = w_in + j * dilation_w + offset_w; - if (h_im > -1 && w_im > -1 && h_im < height && w_im < width) { - // const scalar_t map_h = i * dilation_h + offset_h; - // const scalar_t map_w = j * dilation_w + offset_w; - // const int cur_height = height - h_in; - // const int cur_width = width - w_in; - // val = deformable_im2col_bilinear(data_im_ptr, width, cur_height, - // cur_width, map_h, map_w); - val = deformable_im2col_bilinear( - data_im_ptr, width, height, width, h_im, w_im); - } - *data_col_ptr = val; - data_col_ptr += batch_size * height_col * width_col; - } - } - } -} - - -template -__global__ void deformable_col2im_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_offset, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_im) { - CUDA_KERNEL_LOOP(index, n) { - const int j = (index / width_col / height_col / batch_size) % kernel_w; - const int i = - (index / width_col / height_col / batch_size / kernel_w) % kernel_h; - const int c = - index / width_col / height_col / batch_size / kernel_w / kernel_h; - // compute the start and end of the output - - const int deformable_group_index = c / channel_per_deformable_group; - - int w_out = index % width_col; - int h_out = (index / width_col) % height_col; - int b = (index / width_col / height_col) % batch_size; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h; - const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w; - - const scalar_t cur_top_grad = data_col[index]; - const int cur_h = (int)cur_inv_h_data; - const int cur_w = (int)cur_inv_w_data; - for (int dy = -2; dy <= 2; dy++) { - for (int dx = -2; dx <= 2; dx++) { - if (cur_h + dy >= 0 && cur_h + dy < height && cur_w + dx >= 0 && - cur_w + dx < width && abs(cur_inv_h_data - (cur_h + dy)) < 1 && - 
abs(cur_inv_w_data - (cur_w + dx)) < 1) { - int cur_bottom_grad_pos = - ((b * channels + c) * height + cur_h + dy) * width + cur_w + dx; - scalar_t weight = get_gradient_weight( - cur_inv_h_data, - cur_inv_w_data, - cur_h + dy, - cur_w + dx, - height, - width); - atomicAdd(grad_im + cur_bottom_grad_pos, weight * cur_top_grad); - } - } - } - } -} - - -template -__global__ void deformable_col2im_coord_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_im, - const scalar_t* data_offset, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int offset_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_offset) { - CUDA_KERNEL_LOOP(index, n) { - scalar_t val = 0; - int w = index % width_col; - int h = (index / width_col) % height_col; - int c = (index / width_col / height_col) % offset_channels; - int b = (index / width_col / height_col) / offset_channels; - // compute the start and end of the output - - const int deformable_group_index = c / (2 * kernel_h * kernel_w); - const int col_step = kernel_h * kernel_w; - int cnt = 0; - const scalar_t* data_col_ptr = data_col + - deformable_group_index * channel_per_deformable_group * batch_size * - width_col * height_col; - const scalar_t* data_im_ptr = data_im + - (b * deformable_group + deformable_group_index) * - channel_per_deformable_group / kernel_h / kernel_w * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w; - - for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; - col_c += col_step) { - const int col_pos = - (((col_c * batch_size + b) * height_col) + h) * width_col + w; - const int bp_dir = offset_c % 2; - - int j = (col_pos / width_col / height_col / batch_size) % kernel_w; - int i = - (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h; - int w_out = col_pos % width_col; - int h_out = (col_pos / width_col) % height_col; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - const int data_offset_h_ptr = - (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out); - const int data_offset_w_ptr = - (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + - w_out); - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - scalar_t inv_h = h_in + i * dilation_h + offset_h; - scalar_t inv_w = w_in + j * dilation_w + offset_w; - if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width) { - inv_h = inv_w = -2; - } - const scalar_t weight = get_coordinate_weight( - inv_h, - inv_w, - height, - width, - data_im_ptr + cnt * height * width, - width, - bp_dir); - val += weight * data_col_ptr[col_pos]; - cnt += 1; - } - - grad_offset[index] = val; - } -} - - -namespace detectron2 { - -void deformable_im2col( - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int 
dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor data_col) { - // num_axes should be smaller than block size - // todo: check parallel_imgs is correctly passed in - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = channels * height_col * width_col * parallel_imgs; - int channel_per_deformable_group = channels / deformable_group; - - at::cuda::CUDAGuard device_guard(data_im.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_im.scalar_type(), "deformable_im2col_gpu", ([&] { - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* data_col_ = data_col.data_ptr(); - - deformable_im2col_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_im_, - data_offset_, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - channels, - deformable_group, - height_col, - width_col, - data_col_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf("error in deformable_im2col: %s\n", cudaGetErrorString(err)); - } -} - - -void deformable_col2im( - const at::Tensor data_col, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_im) { - // todo: make sure parallel_imgs is passed in correctly - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = - channels * ksize_h * ksize_w * height_col * width_col * parallel_imgs; - int channel_per_deformable_group = channels / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "deformable_col2im_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* grad_im_ = grad_im.data_ptr(); - - deformable_col2im_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_offset_, - channels, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - deformable_group, - height_col, - width_col, - grad_im_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf("error in deformable_col2im: %s\n", cudaGetErrorString(err)); - } -} - - -void deformable_col2im_coord( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - 
at::Tensor grad_offset) { - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = height_col * width_col * 2 * ksize_h * ksize_w * - deformable_group * parallel_imgs; - int channel_per_deformable_group = - channels * ksize_h * ksize_w / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "deformable_col2im_coord_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* grad_offset_ = grad_offset.data_ptr(); - - deformable_col2im_coord_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_im_, - data_offset_, - channels, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - 2 * ksize_h * ksize_w * deformable_group, - deformable_group, - height_col, - width_col, - grad_offset_); - })); -} - -} // namespace detectron2 - - -template -__device__ scalar_t dmcn_im2col_bilinear( - const scalar_t* bottom_data, - const int data_width, - const int height, - const int width, - scalar_t h, - scalar_t w) { - int h_low = floor(h); - int w_low = floor(w); - int h_high = h_low + 1; - int w_high = w_low + 1; - - scalar_t lh = h - h_low; - scalar_t lw = w - w_low; - scalar_t hh = 1 - lh, hw = 1 - lw; - - scalar_t v1 = 0; - if (h_low >= 0 && w_low >= 0) - v1 = bottom_data[h_low * data_width + w_low]; - scalar_t v2 = 0; - if (h_low >= 0 && w_high <= width - 1) - v2 = bottom_data[h_low * data_width + w_high]; - scalar_t v3 = 0; - if (h_high <= height - 1 && w_low >= 0) - v3 = bottom_data[h_high * data_width + w_low]; - scalar_t v4 = 0; - if (h_high <= height - 1 && w_high <= width - 1) - v4 = bottom_data[h_high * data_width + w_high]; - - scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw; - - scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - return val; -} - -template -__device__ scalar_t dmcn_get_gradient_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int h, - const int w, - const int height, - const int width) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - if (h == argmax_h_low && w == argmax_w_low) - weight = (h + 1 - argmax_h) * (w + 1 - argmax_w); - if (h == argmax_h_low && w == argmax_w_high) - weight = (h + 1 - argmax_h) * (argmax_w + 1 - w); - if (h == argmax_h_high && w == argmax_w_low) - weight = (argmax_h + 1 - h) * (w + 1 - argmax_w); - if (h == argmax_h_high && w == argmax_w_high) - weight = (argmax_h + 1 - h) * (argmax_w + 1 - w); - return weight; -} - -template -__device__ scalar_t dmcn_get_coordinate_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int height, - const int width, - const scalar_t* im_data, - const int data_width, - const int bp_dir) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = 
floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - - if (bp_dir == 0) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += -1 * (argmax_w - argmax_w_low) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_w - argmax_w_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } else if (bp_dir == 1) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += -1 * (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } - - return weight; -} - -template -__global__ void modulated_deformable_im2col_gpu_kernel( - const int n, - const scalar_t* data_im, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int num_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* data_col) { - CUDA_KERNEL_LOOP(index, n) { - // index index of output matrix - const int w_col = index % width_col; - const int h_col = (index / width_col) % height_col; - const int b_col = (index / width_col / height_col) % batch_size; - const int c_im = (index / width_col / height_col) / batch_size; - const int c_col = c_im * kernel_h * kernel_w; - - // compute deformable group index - const int deformable_group_index = c_im / channel_per_deformable_group; - - const int h_in = h_col * stride_h - pad_h; - const int w_in = w_col * stride_w - pad_w; - - scalar_t* data_col_ptr = data_col + - ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col; - // const float* data_im_ptr = data_im + ((b_col * num_channels + c_im) * - // height + h_in) * width + w_in; - const scalar_t* data_im_ptr = - data_im + (b_col * num_channels + c_im) * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - const scalar_t* data_mask_ptr = data_mask + - (b_col * deformable_group + deformable_group_index) * kernel_h * - kernel_w * height_col * width_col; - - for (int i = 0; i < kernel_h; ++i) { - for (int j = 0; j < kernel_w; ++j) { - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + - w_col; - const int data_mask_hw_ptr = - ((i * kernel_w + j) * height_col + h_col) * 
width_col + w_col; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - scalar_t val = static_cast(0); - const scalar_t h_im = h_in + i * dilation_h + offset_h; - const scalar_t w_im = w_in + j * dilation_w + offset_w; - // if (h_im >= 0 && w_im >= 0 && h_im < height && w_im < width) { - if (h_im > -1 && w_im > -1 && h_im < height && w_im < width) { - // const float map_h = i * dilation_h + offset_h; - // const float map_w = j * dilation_w + offset_w; - // const int cur_height = height - h_in; - // const int cur_width = width - w_in; - // val = dmcn_im2col_bilinear(data_im_ptr, width, cur_height, - // cur_width, map_h, map_w); - val = dmcn_im2col_bilinear( - data_im_ptr, width, height, width, h_im, w_im); - } - *data_col_ptr = val * mask; - data_col_ptr += batch_size * height_col * width_col; - // data_col_ptr += height_col * width_col; - } - } - } -} - -template -__global__ void modulated_deformable_col2im_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_im) { - CUDA_KERNEL_LOOP(index, n) { - const int j = (index / width_col / height_col / batch_size) % kernel_w; - const int i = - (index / width_col / height_col / batch_size / kernel_w) % kernel_h; - const int c = - index / width_col / height_col / batch_size / kernel_w / kernel_h; - // compute the start and end of the output - - const int deformable_group_index = c / channel_per_deformable_group; - - int w_out = index % width_col; - int h_out = (index / width_col) % height_col; - int b = (index / width_col / height_col) % batch_size; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const scalar_t* data_mask_ptr = data_mask + - (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * - height_col * width_col; - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out; - const int data_mask_hw_ptr = - ((i * kernel_w + j) * height_col + h_out) * width_col + w_out; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h; - const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w; - - const scalar_t cur_top_grad = data_col[index] * mask; - const int cur_h = (int)cur_inv_h_data; - const int cur_w = (int)cur_inv_w_data; - for (int dy = -2; dy <= 2; dy++) { - for (int dx = -2; dx <= 2; dx++) { - if (cur_h + dy >= 0 && cur_h + dy < height && cur_w + dx >= 0 && - cur_w + dx < width && abs(cur_inv_h_data - (cur_h + dy)) < 1 && - abs(cur_inv_w_data - (cur_w + dx)) < 1) { - int cur_bottom_grad_pos = - ((b * channels + c) 
* height + cur_h + dy) * width + cur_w + dx; - scalar_t weight = dmcn_get_gradient_weight( - cur_inv_h_data, - cur_inv_w_data, - cur_h + dy, - cur_w + dx, - height, - width); - atomicAdd(grad_im + cur_bottom_grad_pos, weight * cur_top_grad); - } - } - } - } -} - -template -__global__ void modulated_deformable_col2im_coord_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_im, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int offset_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_offset, - scalar_t* grad_mask) { - CUDA_KERNEL_LOOP(index, n) { - scalar_t val = 0, mval = 0; - int w = index % width_col; - int h = (index / width_col) % height_col; - int c = (index / width_col / height_col) % offset_channels; - int b = (index / width_col / height_col) / offset_channels; - // compute the start and end of the output - - const int deformable_group_index = c / (2 * kernel_h * kernel_w); - const int col_step = kernel_h * kernel_w; - int cnt = 0; - const scalar_t* data_col_ptr = data_col + - deformable_group_index * channel_per_deformable_group * batch_size * - width_col * height_col; - const scalar_t* data_im_ptr = data_im + - (b * deformable_group + deformable_group_index) * - channel_per_deformable_group / kernel_h / kernel_w * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const scalar_t* data_mask_ptr = data_mask + - (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * - height_col * width_col; - - const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w; - - for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; - col_c += col_step) { - const int col_pos = - (((col_c * batch_size + b) * height_col) + h) * width_col + w; - const int bp_dir = offset_c % 2; - - int j = (col_pos / width_col / height_col / batch_size) % kernel_w; - int i = - (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h; - int w_out = col_pos % width_col; - int h_out = (col_pos / width_col) % height_col; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - const int data_offset_h_ptr = - (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out); - const int data_offset_w_ptr = - (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + - w_out); - const int data_mask_hw_ptr = - (((i * kernel_w + j) * height_col + h_out) * width_col + w_out); - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - scalar_t inv_h = h_in + i * dilation_h + offset_h; - scalar_t inv_w = w_in + j * dilation_w + offset_w; - if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width) { - inv_h = inv_w = -2; - } else { - mval += data_col_ptr[col_pos] * - dmcn_im2col_bilinear( - data_im_ptr + cnt * height * width, - width, - height, - width, - inv_h, - inv_w); - } - const scalar_t weight = dmcn_get_coordinate_weight( - inv_h, - inv_w, - height, - width, - data_im_ptr + cnt * 
height * width, - width, - bp_dir); - val += weight * data_col_ptr[col_pos] * mask; - cnt += 1; - } - // KERNEL_ASSIGN(grad_offset[index], offset_req, val); - grad_offset[index] = val; - if (offset_c % 2 == 0) - // KERNEL_ASSIGN(grad_mask[(((b * deformable_group + - // deformable_group_index) * kernel_h * kernel_w + offset_c / 2) * - // height_col + h) * width_col + w], mask_req, mval); - grad_mask - [(((b * deformable_group + deformable_group_index) * kernel_h * - kernel_w + - offset_c / 2) * - height_col + - h) * - width_col + - w] = mval; - } -} - - -namespace detectron2 { - -void modulated_deformable_im2col_cuda( - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor data_col) { - // num_axes should be smaller than block size - const int channel_per_deformable_group = channels / deformable_group; - const int num_kernels = channels * batch_size * height_col * width_col; - - at::cuda::CUDAGuard device_guard(data_im.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_im.scalar_type(), "modulated_deformable_im2col_gpu", ([&] { - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* data_col_ = data_col.data_ptr(); - - modulated_deformable_im2col_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_im_, - data_offset_, - data_mask_, - height_im, - width_im, - kernel_h, - kenerl_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - channels, - deformable_group, - height_col, - width_col, - data_col_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_im2col_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -void modulated_deformable_col2im_cuda( - const at::Tensor data_col, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_im) { - const int channel_per_deformable_group = channels / deformable_group; - const int num_kernels = - channels * kernel_h * kernel_w * batch_size * height_col * width_col; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "modulated_deformable_col2im_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* grad_im_ = grad_im.data_ptr(); - - modulated_deformable_col2im_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_offset_, - data_mask_, - 
channels, - height_im, - width_im, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - deformable_group, - height_col, - width_col, - grad_im_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_col2im_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -void modulated_deformable_col2im_coord_cuda( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_offset, - at::Tensor grad_mask) { - const int num_kernels = batch_size * height_col * width_col * 2 * kernel_h * - kernel_w * deformable_group; - const int channel_per_deformable_group = - channels * kernel_h * kernel_w / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "modulated_deformable_col2im_coord_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* grad_offset_ = grad_offset.data_ptr(); - scalar_t* grad_mask_ = grad_mask.data_ptr(); - - modulated_deformable_col2im_coord_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_im_, - data_offset_, - data_mask_, - channels, - height_im, - width_im, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - 2 * kernel_h * kernel_w * deformable_group, - deformable_group, - height_col, - width_col, - grad_offset_, - grad_mask_); - })); - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_col2im_coord_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h deleted file mode 100755 index 12aca388..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h +++ /dev/null @@ -1,39 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. 
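// This header declares rotated-box NMS: a CPU kernel entry point, a CUDA one
// guarded by WITH_CUDA/WITH_HIP, and an inline nms_rotated() wrapper that
// dispatches on the device of `dets`/`scores`.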
-#pragma once -#include - -namespace detectron2 { - -at::Tensor nms_rotated_cpu( - const at::Tensor& dets, - const at::Tensor& scores, - const double iou_threshold); - -#if defined(WITH_CUDA) || defined(WITH_HIP) -at::Tensor nms_rotated_cuda( - const at::Tensor& dets, - const at::Tensor& scores, - const double iou_threshold); -#endif - -// Interface for Python -// inline is needed to prevent multiple function definitions when this header is -// included by different cpps -inline at::Tensor nms_rotated( - const at::Tensor& dets, - const at::Tensor& scores, - const double iou_threshold) { - assert(dets.device().is_cuda() == scores.device().is_cuda()); - if (dets.device().is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return nms_rotated_cuda( - dets.contiguous(), scores.contiguous(), iou_threshold); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - - return nms_rotated_cpu(dets.contiguous(), scores.contiguous(), iou_threshold); -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp deleted file mode 100755 index d7556e64..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp +++ /dev/null @@ -1,75 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include "../box_iou_rotated/box_iou_rotated_utils.h" -#include "nms_rotated.h" - -namespace detectron2 { - -template -at::Tensor nms_rotated_cpu_kernel( - const at::Tensor& dets, - const at::Tensor& scores, - const double iou_threshold) { - // nms_rotated_cpu_kernel is modified from torchvision's nms_cpu_kernel, - // however, the code in this function is much shorter because - // we delegate the IoU computation for rotated boxes to - // the single_box_iou_rotated function in box_iou_rotated_utils.h - AT_ASSERTM(dets.device().is_cpu(), "dets must be a CPU tensor"); - AT_ASSERTM(scores.device().is_cpu(), "scores must be a CPU tensor"); - AT_ASSERTM( - dets.scalar_type() == scores.scalar_type(), - "dets should have the same type as scores"); - - if (dets.numel() == 0) { - return at::empty({0}, dets.options().dtype(at::kLong)); - } - - auto order_t = std::get<1>(scores.sort(0, /* descending=*/true)); - - auto ndets = dets.size(0); - at::Tensor suppressed_t = at::zeros({ndets}, dets.options().dtype(at::kByte)); - at::Tensor keep_t = at::zeros({ndets}, dets.options().dtype(at::kLong)); - - auto suppressed = suppressed_t.data_ptr(); - auto keep = keep_t.data_ptr(); - auto order = order_t.data_ptr(); - - int64_t num_to_keep = 0; - - for (int64_t _i = 0; _i < ndets; _i++) { - auto i = order[_i]; - if (suppressed[i] == 1) { - continue; - } - - keep[num_to_keep++] = i; - - for (int64_t _j = _i + 1; _j < ndets; _j++) { - auto j = order[_j]; - if (suppressed[j] == 1) { - continue; - } - - auto ovr = single_box_iou_rotated( - dets[i].data_ptr(), dets[j].data_ptr()); - if (ovr >= iou_threshold) { - suppressed[j] = 1; - } - } - } - return keep_t.narrow(/*dim=*/0, /*start=*/0, /*length=*/num_to_keep); -} - -at::Tensor nms_rotated_cpu( - // input must be contiguous - const at::Tensor& dets, - const at::Tensor& scores, - const double iou_threshold) { - auto result = at::empty({0}, dets.options()); - - AT_DISPATCH_FLOATING_TYPES(dets.scalar_type(), "nms_rotated", [&] { - result = nms_rotated_cpu_kernel(dets, scores, 
iou_threshold); - }); - return result; -} - -} // namespace detectron2 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu deleted file mode 100755 index 2a3db5c6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu +++ /dev/null @@ -1,145 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include -#ifdef WITH_CUDA -#include "../box_iou_rotated/box_iou_rotated_utils.h" -#endif -// TODO avoid this when pytorch supports "same directory" hipification -#ifdef WITH_HIP -#include "box_iou_rotated/box_iou_rotated_utils.h" -#endif - -using namespace detectron2; - -namespace { -int const threadsPerBlock = sizeof(unsigned long long) * 8; -} - -template -__global__ void nms_rotated_cuda_kernel( - const int n_boxes, - const double iou_threshold, - const T* dev_boxes, - unsigned long long* dev_mask) { - // nms_rotated_cuda_kernel is modified from torchvision's nms_cuda_kernel - - const int row_start = blockIdx.y; - const int col_start = blockIdx.x; - - // if (row_start > col_start) return; - - const int row_size = - min(n_boxes - row_start * threadsPerBlock, threadsPerBlock); - const int col_size = - min(n_boxes - col_start * threadsPerBlock, threadsPerBlock); - - // Compared to nms_cuda_kernel, where each box is represented with 4 values - // (x1, y1, x2, y2), each rotated box is represented with 5 values - // (x_center, y_center, width, height, angle_degrees) here. - __shared__ T block_boxes[threadsPerBlock * 5]; - if (threadIdx.x < col_size) { - block_boxes[threadIdx.x * 5 + 0] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 0]; - block_boxes[threadIdx.x * 5 + 1] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 1]; - block_boxes[threadIdx.x * 5 + 2] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 2]; - block_boxes[threadIdx.x * 5 + 3] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 3]; - block_boxes[threadIdx.x * 5 + 4] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 4]; - } - __syncthreads(); - - if (threadIdx.x < row_size) { - const int cur_box_idx = threadsPerBlock * row_start + threadIdx.x; - const T* cur_box = dev_boxes + cur_box_idx * 5; - int i = 0; - unsigned long long t = 0; - int start = 0; - if (row_start == col_start) { - start = threadIdx.x + 1; - } - for (i = start; i < col_size; i++) { - // Instead of devIoU used by original horizontal nms, here - // we use the single_box_iou_rotated function from box_iou_rotated_utils.h - if (single_box_iou_rotated(cur_box, block_boxes + i * 5) > - iou_threshold) { - t |= 1ULL << i; - } - } - const int col_blocks = at::cuda::ATenCeilDiv(n_boxes, threadsPerBlock); - dev_mask[cur_box_idx * col_blocks + col_start] = t; - } -} - -namespace detectron2 { - -at::Tensor nms_rotated_cuda( - // input must be contiguous - const at::Tensor& dets, - const at::Tensor& scores, - double iou_threshold) { - // using scalar_t = float; - AT_ASSERTM(dets.is_cuda(), "dets must be a CUDA tensor"); - AT_ASSERTM(scores.is_cuda(), "scores must be a CUDA tensor"); - at::cuda::CUDAGuard device_guard(dets.device()); - - auto order_t = std::get<1>(scores.sort(0, /* descending=*/true)); - auto dets_sorted = dets.index_select(0, order_t); - - auto dets_num = dets.size(0); - - const int col_blocks 
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp deleted file mode 100755 index c9a2cd4f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp +++ /dev/null @@ -1,117 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -#include <torch/extension.h> -#include "ROIAlignRotated/ROIAlignRotated.h" -#include "box_iou_rotated/box_iou_rotated.h" -#include "cocoeval/cocoeval.h" -#include "deformable/deform_conv.h" -#include "nms_rotated/nms_rotated.h" - -namespace detectron2 { - -#if defined(WITH_CUDA) || defined(WITH_HIP) -extern int get_cudart_version(); -#endif - -std::string get_cuda_version() { -#if defined(WITH_CUDA) || defined(WITH_HIP) - std::ostringstream oss; - -#if defined(WITH_CUDA) - oss << "CUDA "; -#else - oss << "HIP "; -#endif - - // copied from - // https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 - auto printCudaStyleVersion = [&](int v) { - oss << (v / 1000) << "." << (v / 10 % 100); - if (v % 10 != 0) { - oss << "." << (v % 10); - } - }; - printCudaStyleVersion(get_cudart_version()); - return oss.str(); -#else // neither CUDA nor HIP - return std::string("not available"); -#endif -} - -bool has_cuda() { -#if defined(WITH_CUDA) - return true; -#else - return false; -#endif -} - -// similar to - -// https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp -std::string get_compiler_version() { - std::ostringstream ss; -#if defined(__GNUC__) -#ifndef __clang__ - -#if ((__GNUC__ <= 4) && (__GNUC_MINOR__ <= 8)) -#error "GCC >= 4.9 is required!" -#endif - - { ss << "GCC " << __GNUC__ << "." << __GNUC_MINOR__; } -#endif -#endif - -#if defined(__clang_major__) - { - ss << "clang " << __clang_major__ << "." << __clang_minor__ << "." - << __clang_patchlevel__; - } -#endif - -#if defined(_MSC_VER) - { ss << "MSVC " << _MSC_FULL_VER; } -#endif - return ss.str(); -} - -PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) { - m.def("get_compiler_version", &get_compiler_version, "get_compiler_version"); - m.def("get_cuda_version", &get_cuda_version, "get_cuda_version"); - m.def("has_cuda", &has_cuda, "has_cuda"); - - m.def("deform_conv_forward", &deform_conv_forward, "deform_conv_forward"); - m.def( - "deform_conv_backward_input", - &deform_conv_backward_input, - "deform_conv_backward_input"); - m.def( - "deform_conv_backward_filter", - &deform_conv_backward_filter, - "deform_conv_backward_filter"); - m.def( - "modulated_deform_conv_forward", - &modulated_deform_conv_forward, - "modulated_deform_conv_forward"); - m.def( - "modulated_deform_conv_backward", - &modulated_deform_conv_backward, - "modulated_deform_conv_backward"); - - m.def("COCOevalAccumulate", &COCOeval::Accumulate, "COCOeval::Accumulate"); - m.def( - "COCOevalEvaluateImages", - &COCOeval::EvaluateImages, - "COCOeval::EvaluateImages"); - pybind11::class_<COCOeval::InstanceAnnotation>(m, "InstanceAnnotation") - .def(pybind11::init<uint64_t, double, double, bool, bool>()); - pybind11::class_<COCOeval::ImageEvaluation>(m, "ImageEvaluation") - .def(pybind11::init<>()); -} - -TORCH_LIBRARY(detectron2, m) { - m.def("nms_rotated", &nms_rotated); - m.def("box_iou_rotated", &box_iou_rotated); - m.def("roi_align_rotated_forward", &ROIAlignRotated_forward); - m.def("roi_align_rotated_backward", &ROIAlignRotated_backward); -} -} // namespace detectron2
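
`PYBIND11_MODULE` above exposes the helper functions on the compiled `detectron2._C` extension, while `TORCH_LIBRARY` additionally registers the four ops with the dispatcher under the `detectron2` namespace; that is why the Python wrappers deleted below call them as `torch.ops.detectron2.*`. A sketch of both entry points (requires a built copy of the extension):

```python
import torch
from detectron2 import _C  # importing the extension also registers the torch.ops entries

# Direct pybind11 bindings:
print(_C.has_cuda(), _C.get_cuda_version(), _C.get_compiler_version())

# Dispatcher-registered op, visible to TorchScript as well:
boxes = torch.tensor([[10.0, 10.0, 4.0, 2.0, 0.0],
                      [10.0, 10.0, 4.0, 2.0, 5.0]])  # (x_ctr, y_ctr, w, h, angle)
scores = torch.tensor([0.9, 0.8])
keep = torch.ops.detectron2.nms_rotated(boxes, scores, 0.5)
```
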
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/deform_conv.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/deform_conv.py deleted file mode 100755 index eca070f5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/deform_conv.py +++ /dev/null @@ -1,501 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from functools import lru_cache -import torch -from torch import nn -from torch.autograd import Function -from torch.autograd.function import once_differentiable -from torch.nn.modules.utils import _pair -from torchvision.ops import deform_conv2d - -from detectron2 import _C - -from .wrappers import _NewEmptyTensorOp - - -class _DeformConv(Function): - @staticmethod - def forward( - ctx, - input, - offset, - weight, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - im2col_step=64, - ): - if input is not None and input.dim() != 4: - raise ValueError( - "Expected 4D tensor as input, got {}D tensor instead.".format(input.dim()) - ) - ctx.stride = _pair(stride) - ctx.padding = _pair(padding) - ctx.dilation = _pair(dilation) - ctx.groups = groups - ctx.deformable_groups = deformable_groups - ctx.im2col_step = im2col_step - - ctx.save_for_backward(input, offset, weight) - - output = input.new_empty( - _DeformConv._output_size(input, weight, ctx.padding, ctx.dilation, ctx.stride) - ) - - ctx.bufs_ = [input.new_empty(0), input.new_empty(0)] # columns, ones - - if not input.is_cuda: - if deformable_groups != 1: - raise NotImplementedError( - "Deformable Conv with deformable_groups != 1 is not supported on CPUs!"
- ) - return deform_conv2d( - input, offset, weight, stride=stride, padding=padding, dilation=dilation - ) - else: - cur_im2col_step = _DeformConv._cal_im2col_step(input.shape[0], ctx.im2col_step) - assert (input.shape[0] % cur_im2col_step) == 0, "im2col step must divide batchsize" - - _C.deform_conv_forward( - input, - weight, - offset, - output, - ctx.bufs_[0], - ctx.bufs_[1], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - cur_im2col_step, - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - input, offset, weight = ctx.saved_tensors - - grad_input = grad_offset = grad_weight = None - - if not grad_output.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - else: - cur_im2col_step = _DeformConv._cal_im2col_step(input.shape[0], ctx.im2col_step) - assert (input.shape[0] % cur_im2col_step) == 0, "im2col step must divide batchsize" - - if ctx.needs_input_grad[0] or ctx.needs_input_grad[1]: - grad_input = torch.zeros_like(input) - grad_offset = torch.zeros_like(offset) - _C.deform_conv_backward_input( - input, - offset, - grad_output, - grad_input, - grad_offset, - weight, - ctx.bufs_[0], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - cur_im2col_step, - ) - - if ctx.needs_input_grad[2]: - grad_weight = torch.zeros_like(weight) - _C.deform_conv_backward_filter( - input, - offset, - grad_output, - grad_weight, - ctx.bufs_[0], - ctx.bufs_[1], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - 1, - cur_im2col_step, - ) - - return grad_input, grad_offset, grad_weight, None, None, None, None, None, None - - @staticmethod - def _output_size(input, weight, padding, dilation, stride): - channels = weight.size(0) - output_size = (input.size(0), channels) - for d in range(input.dim() - 2): - in_size = input.size(d + 2) - pad = padding[d] - kernel = dilation[d] * (weight.size(d + 2) - 1) + 1 - stride_ = stride[d] - output_size += ((in_size + (2 * pad) - kernel) // stride_ + 1,) - if not all(map(lambda s: s > 0, output_size)): - raise ValueError( - "convolution input is too small (output would be {})".format( - "x".join(map(str, output_size)) - ) - ) - return output_size - - @staticmethod - @lru_cache(maxsize=128) - def _cal_im2col_step(input_size, default_size): - """ - Calculate proper im2col step size, which should be divisible by input_size and not larger - than prefer_size. Meanwhile the step size should be as large as possible to be more - efficient. So we choose the largest one among all divisors of input_size which are smaller - than prefer_size. - :param input_size: input batch size . - :param default_size: default preferred im2col step size. - :return: the largest proper step size. 
- """ - if input_size <= default_size: - return input_size - best_step = 1 - for step in range(2, min(int(math.sqrt(input_size)) + 1, default_size)): - if input_size % step == 0: - if input_size // step <= default_size: - return input_size // step - best_step = step - - return best_step - - -class _ModulatedDeformConv(Function): - @staticmethod - def forward( - ctx, - input, - offset, - mask, - weight, - bias=None, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - ): - ctx.stride = stride - ctx.padding = padding - ctx.dilation = dilation - ctx.groups = groups - ctx.deformable_groups = deformable_groups - ctx.with_bias = bias is not None - if not ctx.with_bias: - bias = input.new_empty(1) # fake tensor - if not input.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - if ( - weight.requires_grad - or mask.requires_grad - or offset.requires_grad - or input.requires_grad - ): - ctx.save_for_backward(input, offset, mask, weight, bias) - output = input.new_empty(_ModulatedDeformConv._infer_shape(ctx, input, weight)) - ctx._bufs = [input.new_empty(0), input.new_empty(0)] - _C.modulated_deform_conv_forward( - input, - weight, - bias, - ctx._bufs[0], - offset, - mask, - output, - ctx._bufs[1], - weight.shape[2], - weight.shape[3], - ctx.stride, - ctx.stride, - ctx.padding, - ctx.padding, - ctx.dilation, - ctx.dilation, - ctx.groups, - ctx.deformable_groups, - ctx.with_bias, - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - if not grad_output.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - input, offset, mask, weight, bias = ctx.saved_tensors - grad_input = torch.zeros_like(input) - grad_offset = torch.zeros_like(offset) - grad_mask = torch.zeros_like(mask) - grad_weight = torch.zeros_like(weight) - grad_bias = torch.zeros_like(bias) - _C.modulated_deform_conv_backward( - input, - weight, - bias, - ctx._bufs[0], - offset, - mask, - ctx._bufs[1], - grad_input, - grad_weight, - grad_bias, - grad_offset, - grad_mask, - grad_output, - weight.shape[2], - weight.shape[3], - ctx.stride, - ctx.stride, - ctx.padding, - ctx.padding, - ctx.dilation, - ctx.dilation, - ctx.groups, - ctx.deformable_groups, - ctx.with_bias, - ) - if not ctx.with_bias: - grad_bias = None - - return ( - grad_input, - grad_offset, - grad_mask, - grad_weight, - grad_bias, - None, - None, - None, - None, - None, - ) - - @staticmethod - def _infer_shape(ctx, input, weight): - n = input.size(0) - channels_out = weight.size(0) - height, width = input.shape[2:4] - kernel_h, kernel_w = weight.shape[2:4] - height_out = ( - height + 2 * ctx.padding - (ctx.dilation * (kernel_h - 1) + 1) - ) // ctx.stride + 1 - width_out = ( - width + 2 * ctx.padding - (ctx.dilation * (kernel_w - 1) + 1) - ) // ctx.stride + 1 - return n, channels_out, height_out, width_out - - -deform_conv = _DeformConv.apply -modulated_deform_conv = _ModulatedDeformConv.apply - - -class DeformConv(nn.Module): - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - bias=False, - norm=None, - activation=None, - ): - """ - Deformable convolution from :paper:`deformconv`. - - Arguments are similar to :class:`Conv2D`. Extra arguments: - - Args: - deformable_groups (int): number of groups used in deformable convolution. 
- norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - """ - super(DeformConv, self).__init__() - - assert not bias - assert in_channels % groups == 0, "in_channels {} cannot be divisible by groups {}".format( - in_channels, groups - ) - assert ( - out_channels % groups == 0 - ), "out_channels {} cannot be divisible by groups {}".format(out_channels, groups) - - self.in_channels = in_channels - self.out_channels = out_channels - self.kernel_size = _pair(kernel_size) - self.stride = _pair(stride) - self.padding = _pair(padding) - self.dilation = _pair(dilation) - self.groups = groups - self.deformable_groups = deformable_groups - self.norm = norm - self.activation = activation - - self.weight = nn.Parameter( - torch.Tensor(out_channels, in_channels // self.groups, *self.kernel_size) - ) - self.bias = None - - nn.init.kaiming_uniform_(self.weight, nonlinearity="relu") - - def forward(self, x, offset): - if x.numel() == 0: - # When input is empty, we want to return a empty tensor with "correct" shape, - # So that the following operations will not panic - # if they check for the shape of the tensor. - # This computes the height and width of the output tensor - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // s + 1 - for i, p, di, k, s in zip( - x.shape[-2:], self.padding, self.dilation, self.kernel_size, self.stride - ) - ] - output_shape = [x.shape[0], self.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) - - x = deform_conv( - x, - offset, - self.weight, - self.stride, - self.padding, - self.dilation, - self.groups, - self.deformable_groups, - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - def extra_repr(self): - tmpstr = "in_channels=" + str(self.in_channels) - tmpstr += ", out_channels=" + str(self.out_channels) - tmpstr += ", kernel_size=" + str(self.kernel_size) - tmpstr += ", stride=" + str(self.stride) - tmpstr += ", padding=" + str(self.padding) - tmpstr += ", dilation=" + str(self.dilation) - tmpstr += ", groups=" + str(self.groups) - tmpstr += ", deformable_groups=" + str(self.deformable_groups) - tmpstr += ", bias=False" - return tmpstr - - -class ModulatedDeformConv(nn.Module): - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - bias=True, - norm=None, - activation=None, - ): - """ - Modulated deformable convolution from :paper:`deformconv2`. - - Arguments are similar to :class:`Conv2D`. Extra arguments: - - Args: - deformable_groups (int): number of groups used in deformable convolution. 
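
`DeformConv` takes its sampling offsets as a second input rather than computing them itself. The usual pairing, a sketch under the assumption that this detectron2 build is importable, is a plain conv that predicts two offsets (dx, dy) per kernel tap:

```python
import torch
from torch import nn
from detectron2.layers import DeformConv  # the module defined above

kh = kw = 3
offset_conv = nn.Conv2d(64, 2 * kh * kw, kernel_size=3, padding=1)  # 18 channels
deform = DeformConv(64, 128, kernel_size=3, padding=1)

x = torch.randn(2, 64, 32, 32)
offset = offset_conv(x)   # (2, 18, 32, 32): one (dx, dy) per tap per location
y = deform(x, offset)     # (2, 128, 32, 32)
```

`ModulatedDeformConv` below (DCNv2) additionally takes a per-tap mask; a common pattern, not shown in this file, predicts offset and mask from a single conv with `3 * kh * kw` channels and applies a sigmoid to the mask slice. Note that its forward raises `NotImplementedError` on CPU, as the code above shows.
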
- norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - """ - super(ModulatedDeformConv, self).__init__() - self.in_channels = in_channels - self.out_channels = out_channels - self.kernel_size = _pair(kernel_size) - self.stride = stride - self.padding = padding - self.dilation = dilation - self.groups = groups - self.deformable_groups = deformable_groups - self.with_bias = bias - self.norm = norm - self.activation = activation - - self.weight = nn.Parameter( - torch.Tensor(out_channels, in_channels // groups, *self.kernel_size) - ) - if bias: - self.bias = nn.Parameter(torch.Tensor(out_channels)) - else: - self.bias = None - - nn.init.kaiming_uniform_(self.weight, nonlinearity="relu") - if self.bias is not None: - nn.init.constant_(self.bias, 0) - - def forward(self, x, offset, mask): - if x.numel() == 0: - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // s + 1 - for i, p, di, k, s in zip( - x.shape[-2:], self.padding, self.dilation, self.kernel_size, self.stride - ) - ] - output_shape = [x.shape[0], self.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) - - x = modulated_deform_conv( - x, - offset, - mask, - self.weight, - self.bias, - self.stride, - self.padding, - self.dilation, - self.groups, - self.deformable_groups, - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - def extra_repr(self): - tmpstr = "in_channels=" + str(self.in_channels) - tmpstr += ", out_channels=" + str(self.out_channels) - tmpstr += ", kernel_size=" + str(self.kernel_size) - tmpstr += ", stride=" + str(self.stride) - tmpstr += ", padding=" + str(self.padding) - tmpstr += ", dilation=" + str(self.dilation) - tmpstr += ", groups=" + str(self.groups) - tmpstr += ", deformable_groups=" + str(self.deformable_groups) - tmpstr += ", bias=" + str(self.with_bias) - return tmpstr diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/losses.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/losses.py deleted file mode 100755 index cf4d5e9b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/losses.py +++ /dev/null @@ -1,133 +0,0 @@ -import math -import torch - - -def diou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Distance Intersection over Union Loss (Zhaohui Zheng et. al) - https://arxiv.org/abs/1911.08287 - Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. 
- eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - # TODO: use torch._assert_async() when pytorch 1.8 support is dropped - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsct = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsct[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - union = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsct + eps - iou = intsct / union - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - diag_len = ((xc2 - xc1) ** 2) + ((yc2 - yc1) ** 2) + eps - - # centers of boxes - x_p = (x2 + x1) / 2 - y_p = (y2 + y1) / 2 - x_g = (x1g + x2g) / 2 - y_g = (y1g + y2g) / 2 - distance = ((x_p - x_g) ** 2) + ((y_p - y_g) ** 2) - - # Eqn. (7) - loss = 1 - iou + (distance / diag_len) - if reduction == "mean": - loss = loss.mean() if loss.numel() > 0 else 0.0 * loss.sum() - elif reduction == "sum": - loss = loss.sum() - - return loss - - -def ciou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Complete Intersection over Union Loss (Zhaohui Zheng et. al) - https://arxiv.org/abs/1911.08287 - Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. - eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - # TODO: use torch._assert_async() when pytorch 1.8 support is dropped - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsct = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsct[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - union = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsct + eps - iou = intsct / union - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - diag_len = ((xc2 - xc1) ** 2) + ((yc2 - yc1) ** 2) + eps - - # centers of boxes - x_p = (x2 + x1) / 2 - y_p = (y2 + y1) / 2 - x_g = (x1g + x2g) / 2 - y_g = (y1g + y2g) / 2 - distance = ((x_p - x_g) ** 2) + ((y_p - y_g) ** 2) - - # width and height of boxes - w_pred = x2 - x1 - h_pred = y2 - y1 - w_gt = x2g - x1g - h_gt = y2g - y1g - v = (4 / (math.pi ** 2)) * torch.pow((torch.atan(w_gt / h_gt) - torch.atan(w_pred / h_pred)), 2) - with torch.no_grad(): - alpha = v / (1 - iou + v + eps) - - # Eqn. 
(10) - loss = 1 - iou + (distance / diag_len) + alpha * v - if reduction == "mean": - loss = loss.mean() if loss.numel() > 0 else 0.0 * loss.sum() - elif reduction == "sum": - loss = loss.sum() - - return loss diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/mask_ops.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/mask_ops.py deleted file mode 100755 index e7a9f3a3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/mask_ops.py +++ /dev/null @@ -1,275 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import Tuple -import torch -from PIL import Image -from torch.nn import functional as F - -__all__ = ["paste_masks_in_image"] - - -BYTES_PER_FLOAT = 4 -# TODO: This memory limit may be too much or too little. It would be better to -# determine it based on available resources. -GPU_MEM_LIMIT = 1024 ** 3 # 1 GB memory limit - - -def _do_paste_mask(masks, boxes, img_h: int, img_w: int, skip_empty: bool = True): - """ - Args: - masks: N, 1, H, W - boxes: N, 4 - img_h, img_w (int): - skip_empty (bool): only paste masks within the region that - tightly bound all boxes, and returns the results this region only. - An important optimization for CPU. - - Returns: - if skip_empty == False, a mask of shape (N, img_h, img_w) - if skip_empty == True, a mask of shape (N, h', w'), and the slice - object for the corresponding region. - """ - # On GPU, paste all masks together (up to chunk size) - # by using the entire image to sample the masks - # Compared to pasting them one by one, - # this has more operations but is faster on COCO-scale dataset. - device = masks.device - - if skip_empty and not torch.jit.is_scripting(): - x0_int, y0_int = torch.clamp(boxes.min(dim=0).values.floor()[:2] - 1, min=0).to( - dtype=torch.int32 - ) - x1_int = torch.clamp(boxes[:, 2].max().ceil() + 1, max=img_w).to(dtype=torch.int32) - y1_int = torch.clamp(boxes[:, 3].max().ceil() + 1, max=img_h).to(dtype=torch.int32) - else: - x0_int, y0_int = 0, 0 - x1_int, y1_int = img_w, img_h - x0, y0, x1, y1 = torch.split(boxes, 1, dim=1) # each is Nx1 - - N = masks.shape[0] - - img_y = torch.arange(y0_int, y1_int, device=device, dtype=torch.float32) + 0.5 - img_x = torch.arange(x0_int, x1_int, device=device, dtype=torch.float32) + 0.5 - img_y = (img_y - y0) / (y1 - y0) * 2 - 1 - img_x = (img_x - x0) / (x1 - x0) * 2 - 1 - # img_x, img_y have shapes (N, w), (N, h) - - gx = img_x[:, None, :].expand(N, img_y.size(1), img_x.size(1)) - gy = img_y[:, :, None].expand(N, img_y.size(1), img_x.size(1)) - grid = torch.stack([gx, gy], dim=3) - - if not torch.jit.is_scripting(): - if not masks.dtype.is_floating_point: - masks = masks.float() - img_masks = F.grid_sample(masks, grid.to(masks.dtype), align_corners=False) - - if skip_empty and not torch.jit.is_scripting(): - return img_masks[:, 0], (slice(y0_int, y1_int), slice(x0_int, x1_int)) - else: - return img_masks[:, 0], () - - -# Annotate boxes as Tensor (but not Boxes) in order to use scripting -@torch.jit.script_if_tracing -def paste_masks_in_image( - masks: torch.Tensor, boxes: torch.Tensor, image_shape: Tuple[int, int], threshold: float = 0.5 -): - """ - Paste a set of masks that are of a fixed resolution (e.g., 28 x 28) into an image. - The location, height, and width for pasting each mask is determined by their - corresponding bounding boxes in boxes. - - Note: - This is a complicated but more accurate implementation. 
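
A worked instance of the DIoU/CIoU definitions above, with hand-checked numbers; the import path assumes an installed build of this file:

```python
import torch
from detectron2.layers.losses import diou_loss, ciou_loss  # deleted above

b1 = torch.tensor([[0.0, 0.0, 2.0, 2.0]])
b2 = torch.tensor([[1.0, 1.0, 3.0, 3.0]])
# intersection 1, union 7        -> IoU = 1/7
# enclosing box (0,0)-(3,3)      -> diag_len = 9 + 9 = 18
# centers (1,1) vs (2,2)         -> distance = 2
# diou = 1 - 1/7 + 2/18 ~= 0.9683
print(diou_loss(b1, b2))
# both boxes are square, so the aspect-ratio term v is 0 and ciou == diou here
print(ciou_loss(b1, b2))
```
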
In actual deployment, it is - often enough to use a faster but less accurate implementation. - See :func:`paste_mask_in_image_old` in this file for an alternative implementation. - - Args: - masks (tensor): Tensor of shape (Bimg, Hmask, Wmask), where Bimg is the number of - detected object instances in the image and Hmask, Wmask are the mask width and mask - height of the predicted mask (e.g., Hmask = Wmask = 28). Values are in [0, 1]. - boxes (Boxes or Tensor): A Boxes of length Bimg or Tensor of shape (Bimg, 4). - boxes[i] and masks[i] correspond to the same object instance. - image_shape (tuple): height, width - threshold (float): A threshold in [0, 1] for converting the (soft) masks to - binary masks. - - Returns: - img_masks (Tensor): A tensor of shape (Bimg, Himage, Wimage), where Bimg is the - number of detected object instances and Himage, Wimage are the image width - and height. img_masks[i] is a binary mask for object instance i. - """ - - assert masks.shape[-1] == masks.shape[-2], "Only square mask predictions are supported" - N = len(masks) - if N == 0: - return masks.new_empty((0,) + image_shape, dtype=torch.uint8) - if not isinstance(boxes, torch.Tensor): - boxes = boxes.tensor - device = boxes.device - assert len(boxes) == N, boxes.shape - - img_h, img_w = image_shape - - # The actual implementation split the input into chunks, - # and paste them chunk by chunk. - if device.type == "cpu" or torch.jit.is_scripting(): - # CPU is most efficient when they are pasted one by one with skip_empty=True - # so that it performs minimal number of operations. - num_chunks = N - else: - # GPU benefits from parallelism for larger chunks, but may have memory issue - # int(img_h) because shape may be tensors in tracing - num_chunks = int(np.ceil(N * int(img_h) * int(img_w) * BYTES_PER_FLOAT / GPU_MEM_LIMIT)) - assert ( - num_chunks <= N - ), "Default GPU_MEM_LIMIT in mask_ops.py is too small; try increasing it" - chunks = torch.chunk(torch.arange(N, device=device), num_chunks) - - img_masks = torch.zeros( - N, img_h, img_w, device=device, dtype=torch.bool if threshold >= 0 else torch.uint8 - ) - for inds in chunks: - masks_chunk, spatial_inds = _do_paste_mask( - masks[inds, None, :, :], boxes[inds], img_h, img_w, skip_empty=device.type == "cpu" - ) - - if threshold >= 0: - masks_chunk = (masks_chunk >= threshold).to(dtype=torch.bool) - else: - # for visualization and debugging - masks_chunk = (masks_chunk * 255).to(dtype=torch.uint8) - - if torch.jit.is_scripting(): # Scripting does not use the optimized codepath - img_masks[inds] = masks_chunk - else: - img_masks[(inds,) + spatial_inds] = masks_chunk - return img_masks - - -# The below are the original paste function (from Detectron1) which has -# larger quantization error. -# It is faster on CPU, while the aligned one is faster on GPU thanks to grid_sample. - - -def paste_mask_in_image_old(mask, box, img_h, img_w, threshold): - """ - Paste a single mask in an image. - This is a per-box implementation of :func:`paste_masks_in_image`. - This function has larger quantization error due to incorrect pixel - modeling and is not used any more. - - Args: - mask (Tensor): A tensor of shape (Hmask, Wmask) storing the mask of a single - object instance. Values are in [0, 1]. - box (Tensor): A tensor of shape (4, ) storing the x0, y0, x1, y1 box corners - of the object instance. - img_h, img_w (int): Image height and width. - threshold (float): Mask binarization threshold in [0, 1]. 
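
The chunking in `paste_masks_in_image` above is driven by simple arithmetic on the full-image paste buffer; a standalone sketch of the same computation:

```python
import numpy as np

BYTES_PER_FLOAT = 4
GPU_MEM_LIMIT = 1024 ** 3  # 1 GB, as defined in mask_ops.py above

def num_paste_chunks(n: int, img_h: int, img_w: int) -> int:
    # how many passes are needed so n * H * W float32 masks fit in the limit
    return int(np.ceil(n * img_h * img_w * BYTES_PER_FLOAT / GPU_MEM_LIMIT))

print(num_paste_chunks(100, 800, 1333))  # 1 -> all masks pasted in one pass
print(num_paste_chunks(500, 800, 1333))  # 2 -> split to stay under the cap
```
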
- - Returns: - im_mask (Tensor): - The resized and binarized object mask pasted into the original - image plane (a tensor of shape (img_h, img_w)). - """ - # Conversion from continuous box coordinates to discrete pixel coordinates - # via truncation (cast to int32). This determines which pixels to paste the - # mask onto. - box = box.to(dtype=torch.int32) # Continuous to discrete coordinate conversion - # An example (1D) box with continuous coordinates (x0=0.7, x1=4.3) will map to - # a discrete coordinates (x0=0, x1=4). Note that box is mapped to 5 = x1 - x0 + 1 - # pixels (not x1 - x0 pixels). - samples_w = box[2] - box[0] + 1 # Number of pixel samples, *not* geometric width - samples_h = box[3] - box[1] + 1 # Number of pixel samples, *not* geometric height - - # Resample the mask from it's original grid to the new samples_w x samples_h grid - mask = Image.fromarray(mask.cpu().numpy()) - mask = mask.resize((samples_w, samples_h), resample=Image.BILINEAR) - mask = np.array(mask, copy=False) - - if threshold >= 0: - mask = np.array(mask > threshold, dtype=np.uint8) - mask = torch.from_numpy(mask) - else: - # for visualization and debugging, we also - # allow it to return an unmodified mask - mask = torch.from_numpy(mask * 255).to(torch.uint8) - - im_mask = torch.zeros((img_h, img_w), dtype=torch.uint8) - x_0 = max(box[0], 0) - x_1 = min(box[2] + 1, img_w) - y_0 = max(box[1], 0) - y_1 = min(box[3] + 1, img_h) - - im_mask[y_0:y_1, x_0:x_1] = mask[ - (y_0 - box[1]) : (y_1 - box[1]), (x_0 - box[0]) : (x_1 - box[0]) - ] - return im_mask - - -# Our pixel modeling requires extrapolation for any continuous -# coordinate < 0.5 or > length - 0.5. When sampling pixels on the masks, -# we would like this extrapolation to be an interpolation between boundary values and zero, -# instead of using absolute zero or boundary values. -# Therefore `paste_mask_in_image_old` is often used with zero padding around the masks like this: -# masks, scale = pad_masks(masks[:, 0, :, :], 1) -# boxes = scale_boxes(boxes.tensor, scale) - - -def pad_masks(masks, padding): - """ - Args: - masks (tensor): A tensor of shape (B, M, M) representing B masks. - padding (int): Number of cells to pad on all sides. - - Returns: - The padded masks and the scale factor of the padding size / original size. - """ - B = masks.shape[0] - M = masks.shape[-1] - pad2 = 2 * padding - scale = float(M + pad2) / M - padded_masks = masks.new_zeros((B, M + pad2, M + pad2)) - padded_masks[:, padding:-padding, padding:-padding] = masks - return padded_masks, scale - - -def scale_boxes(boxes, scale): - """ - Args: - boxes (tensor): A tensor of shape (B, 4) representing B boxes with 4 - coords representing the corners x0, y0, x1, y1, - scale (float): The box scaling factor. - - Returns: - Scaled boxes. - """ - w_half = (boxes[:, 2] - boxes[:, 0]) * 0.5 - h_half = (boxes[:, 3] - boxes[:, 1]) * 0.5 - x_c = (boxes[:, 2] + boxes[:, 0]) * 0.5 - y_c = (boxes[:, 3] + boxes[:, 1]) * 0.5 - - w_half *= scale - h_half *= scale - - scaled_boxes = torch.zeros_like(boxes) - scaled_boxes[:, 0] = x_c - w_half - scaled_boxes[:, 2] = x_c + w_half - scaled_boxes[:, 1] = y_c - h_half - scaled_boxes[:, 3] = y_c + h_half - return scaled_boxes - - -@torch.jit.script_if_tracing -def _paste_masks_tensor_shape( - masks: torch.Tensor, - boxes: torch.Tensor, - image_shape: Tuple[torch.Tensor, torch.Tensor], - threshold: float = 0.5, -): - """ - A wrapper of paste_masks_in_image where image_shape is Tensor. - During tracing, shapes might be tensors instead of ints. 
The Tensor->int - conversion should be scripted rather than traced. - """ - return paste_masks_in_image(masks, boxes, (int(image_shape[0]), int(image_shape[1])), threshold) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/nms.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/nms.py deleted file mode 100755 index 6b6be71c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/nms.py +++ /dev/null @@ -1,139 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import torch -from torchvision.ops import boxes as box_ops -from torchvision.ops import nms # noqa . for compatibility - - -def batched_nms( - boxes: torch.Tensor, scores: torch.Tensor, idxs: torch.Tensor, iou_threshold: float -): - """ - Same as torchvision.ops.boxes.batched_nms, but with float(). - """ - assert boxes.shape[-1] == 4 - # Note: Torchvision already has a strategy (https://github.com/pytorch/vision/issues/1311) - # to decide whether to use coordinate trick or for loop to implement batched_nms. So we - # just call it directly. - # Fp16 does not have enough range for batched NMS, so adding float(). - return box_ops.batched_nms(boxes.float(), scores, idxs, iou_threshold) - - -# Note: this function (nms_rotated) might be moved into -# torchvision/ops/boxes.py in the future -def nms_rotated(boxes, scores, iou_threshold): - """ - Performs non-maximum suppression (NMS) on the rotated boxes according - to their intersection-over-union (IoU). - - Rotated NMS iteratively removes lower scoring rotated boxes which have an - IoU greater than iou_threshold with another (higher scoring) rotated box. - - Note that RotatedBox (5, 3, 4, 2, -90) covers exactly the same region as - RotatedBox (5, 3, 4, 2, 90) does, and their IoU will be 1. However, they - can be representing completely different objects in certain tasks, e.g., OCR. - - As for the question of whether rotated-NMS should treat them as faraway boxes - even though their IOU is 1, it depends on the application and/or ground truth annotation. - - As an extreme example, consider a single character v and the square box around it. - - If the angle is 0 degree, the object (text) would be read as 'v'; - - If the angle is 90 degrees, the object (text) would become '>'; - - If the angle is 180 degrees, the object (text) would become '^'; - - If the angle is 270/-90 degrees, the object (text) would become '<' - - All of these cases have IoU of 1 to each other, and rotated NMS that only - uses IoU as criterion would only keep one of them with the highest score - - which, practically, still makes sense in most cases because typically - only one of theses orientations is the correct one. Also, it does not matter - as much if the box is only used to classify the object (instead of transcribing - them with a sequential OCR recognition model) later. - - On the other hand, when we use IoU to filter proposals that are close to the - ground truth during training, we should definitely take the angle into account if - we know the ground truth is labeled with the strictly correct orientation (as in, - upside-down words are annotated with -180 degrees even though they can be covered - with a 0/90/-90 degree box, etc.) - - The way the original dataset is annotated also matters. 
For example, if the dataset - is a 4-point polygon dataset that does not enforce ordering of vertices/orientation, - we can estimate a minimum rotated bounding box to this polygon, but there's no way - we can tell the correct angle with 100% confidence (as shown above, there could be 4 different - rotated boxes, with angles differed by 90 degrees to each other, covering the exactly - same region). In that case we have to just use IoU to determine the box - proximity (as many detection benchmarks (even for text) do) unless there're other - assumptions we can make (like width is always larger than height, or the object is not - rotated by more than 90 degrees CCW/CW, etc.) - - In summary, not considering angles in rotated NMS seems to be a good option for now, - but we should be aware of its implications. - - Args: - boxes (Tensor[N, 5]): Rotated boxes to perform NMS on. They are expected to be in - (x_center, y_center, width, height, angle_degrees) format. - scores (Tensor[N]): Scores for each one of the rotated boxes - iou_threshold (float): Discards all overlapping rotated boxes with IoU < iou_threshold - - Returns: - keep (Tensor): int64 tensor with the indices of the elements that have been kept - by Rotated NMS, sorted in decreasing order of scores - """ - return torch.ops.detectron2.nms_rotated(boxes, scores, iou_threshold) - - -# Note: this function (batched_nms_rotated) might be moved into -# torchvision/ops/boxes.py in the future -def batched_nms_rotated(boxes, scores, idxs, iou_threshold): - """ - Performs non-maximum suppression in a batched fashion. - - Each index value correspond to a category, and NMS - will not be applied between elements of different categories. - - Args: - boxes (Tensor[N, 5]): - boxes where NMS will be performed. They - are expected to be in (x_ctr, y_ctr, width, height, angle_degrees) format - scores (Tensor[N]): - scores for each one of the boxes - idxs (Tensor[N]): - indices of the categories for each one of the boxes. - iou_threshold (float): - discards all overlapping boxes - with IoU < iou_threshold - - Returns: - Tensor: - int64 tensor with the indices of the elements that have been kept - by NMS, sorted in decreasing order of scores - """ - assert boxes.shape[-1] == 5 - - if boxes.numel() == 0: - return torch.empty((0,), dtype=torch.int64, device=boxes.device) - boxes = boxes.float() # fp16 does not have enough range for batched NMS - # Strategy: in order to perform NMS independently per class, - # we add an offset to all the boxes. The offset is dependent - # only on the class idx, and is large enough so that boxes - # from different classes do not overlap - - # Note that batched_nms in torchvision/ops/boxes.py only uses max_coordinate, - # which won't handle negative coordinates correctly. - # Here by using min_coordinate we can make sure the negative coordinates are - # correctly handled. 
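
The angle-equivalence discussed above is easy to verify once the extension is built (a sketch; requires the compiled `detectron2._C`):

```python
import torch
from detectron2 import _C  # noqa: F401  (populates torch.ops.detectron2)

# The two boxes from the docstring: same region, angles -90 vs 90.
a = torch.tensor([[5.0, 3.0, 4.0, 2.0, -90.0]])
b = torch.tensor([[5.0, 3.0, 4.0, 2.0, 90.0]])
print(torch.ops.detectron2.box_iou_rotated(a, b))  # tensor([[1.]])
```
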
- max_coordinate = ( - torch.max(boxes[:, 0], boxes[:, 1]) + torch.max(boxes[:, 2], boxes[:, 3]) / 2 - ).max() - min_coordinate = ( - torch.min(boxes[:, 0], boxes[:, 1]) - torch.max(boxes[:, 2], boxes[:, 3]) / 2 - ).min() - offsets = idxs.to(boxes) * (max_coordinate - min_coordinate + 1) - boxes_for_nms = boxes.clone() # avoid modifying the original values in boxes - boxes_for_nms[:, :2] += offsets[:, None] - keep = nms_rotated(boxes_for_nms, scores, iou_threshold) - return keep diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align.py deleted file mode 100755 index 163462e1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align.py +++ /dev/null @@ -1,74 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from torch import nn -from torchvision.ops import roi_align - - -# NOTE: torchvision's RoIAlign has a different default aligned=False -class ROIAlign(nn.Module): - def __init__(self, output_size, spatial_scale, sampling_ratio, aligned=True): - """ - Args: - output_size (tuple): h, w - spatial_scale (float): scale the input boxes by this number - sampling_ratio (int): number of inputs samples to take for each output - sample. 0 to take samples densely. - aligned (bool): if False, use the legacy implementation in - Detectron. If True, align the results more perfectly. - - Note: - The meaning of aligned=True: - - Given a continuous coordinate c, its two neighboring pixel indices (in our - pixel model) are computed by floor(c - 0.5) and ceil(c - 0.5). For example, - c=1.3 has pixel neighbors with discrete indices [0] and [1] (which are sampled - from the underlying signal at continuous coordinates 0.5 and 1.5). But the original - roi_align (aligned=False) does not subtract the 0.5 when computing neighboring - pixel indices and therefore it uses pixels with a slightly incorrect alignment - (relative to our pixel model) when performing bilinear interpolation. - - With `aligned=True`, - we first appropriately scale the ROI and then shift it by -0.5 - prior to calling roi_align. This produces the correct neighbors; see - detectron2/tests/test_roi_align.py for verification. - - The difference does not make a difference to the model's performance if - ROIAlign is used together with conv layers. - """ - super().__init__() - self.output_size = output_size - self.spatial_scale = spatial_scale - self.sampling_ratio = sampling_ratio - self.aligned = aligned - - from torchvision import __version__ - - version = tuple(int(x) for x in __version__.split(".")[:2]) - # https://github.com/pytorch/vision/pull/2438 - assert version >= (0, 7), "Require torchvision >= 0.7" - - def forward(self, input, rois): - """ - Args: - input: NCHW images - rois: Bx5 boxes. First column is the index into N. The other 4 columns are xyxy. 
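
The offset computation above is the standard trick for class-independent NMS: shift each class's boxes into a disjoint coordinate region, so one global NMS pass can never suppress across classes. A standalone sketch with toy values:

```python
import torch

boxes = torch.tensor([[10.0, 10.0, 4.0, 4.0, 0.0],
                      [10.5, 10.0, 4.0, 4.0, 0.0]])  # heavily overlapping
idxs = torch.tensor([0, 1])  # but different classes

# same offset formula as batched_nms_rotated above
max_c = (torch.max(boxes[:, 0], boxes[:, 1]) + torch.max(boxes[:, 2], boxes[:, 3]) / 2).max()
min_c = (torch.min(boxes[:, 0], boxes[:, 1]) - torch.max(boxes[:, 2], boxes[:, 3]) / 2).min()
offsets = idxs.to(boxes) * (max_c - min_c + 1)
boxes_for_nms = boxes.clone()
boxes_for_nms[:, :2] += offsets[:, None]  # class 1 now lives far from class 0
```
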
- """ - assert rois.dim() == 2 and rois.size(1) == 5 - if input.is_quantized: - input = input.dequantize() - return roi_align( - input, - rois.to(dtype=input.dtype), - self.output_size, - self.spatial_scale, - self.sampling_ratio, - self.aligned, - ) - - def __repr__(self): - tmpstr = self.__class__.__name__ + "(" - tmpstr += "output_size=" + str(self.output_size) - tmpstr += ", spatial_scale=" + str(self.spatial_scale) - tmpstr += ", sampling_ratio=" + str(self.sampling_ratio) - tmpstr += ", aligned=" + str(self.aligned) - tmpstr += ")" - return tmpstr diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py deleted file mode 100755 index d097326c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py +++ /dev/null @@ -1,91 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -from torch import nn -from torch.autograd import Function -from torch.autograd.function import once_differentiable -from torch.nn.modules.utils import _pair - - -class _ROIAlignRotated(Function): - @staticmethod - def forward(ctx, input, roi, output_size, spatial_scale, sampling_ratio): - ctx.save_for_backward(roi) - ctx.output_size = _pair(output_size) - ctx.spatial_scale = spatial_scale - ctx.sampling_ratio = sampling_ratio - ctx.input_shape = input.size() - output = torch.ops.detectron2.roi_align_rotated_forward( - input, roi, spatial_scale, output_size[0], output_size[1], sampling_ratio - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - (rois,) = ctx.saved_tensors - output_size = ctx.output_size - spatial_scale = ctx.spatial_scale - sampling_ratio = ctx.sampling_ratio - bs, ch, h, w = ctx.input_shape - grad_input = torch.ops.detectron2.roi_align_rotated_backward( - grad_output, - rois, - spatial_scale, - output_size[0], - output_size[1], - bs, - ch, - h, - w, - sampling_ratio, - ) - return grad_input, None, None, None, None, None - - -roi_align_rotated = _ROIAlignRotated.apply - - -class ROIAlignRotated(nn.Module): - def __init__(self, output_size, spatial_scale, sampling_ratio): - """ - Args: - output_size (tuple): h, w - spatial_scale (float): scale the input boxes by this number - sampling_ratio (int): number of inputs samples to take for each output - sample. 0 to take samples densely. - - Note: - ROIAlignRotated supports continuous coordinate by default: - Given a continuous coordinate c, its two neighboring pixel indices (in our - pixel model) are computed by floor(c - 0.5) and ceil(c - 0.5). For example, - c=1.3 has pixel neighbors with discrete indices [0] and [1] (which are sampled - from the underlying signal at continuous coordinates 0.5 and 1.5). - """ - super(ROIAlignRotated, self).__init__() - self.output_size = output_size - self.spatial_scale = spatial_scale - self.sampling_ratio = sampling_ratio - - def forward(self, input, rois): - """ - Args: - input: NCHW images - rois: Bx6 boxes. First column is the index into N. - The other 5 columns are (x_ctr, y_ctr, width, height, angle_degrees). 
- """ - assert rois.dim() == 2 and rois.size(1) == 6 - orig_dtype = input.dtype - if orig_dtype == torch.float16: - input = input.float() - rois = rois.float() - return roi_align_rotated( - input, rois, self.output_size, self.spatial_scale, self.sampling_ratio - ).to(dtype=orig_dtype) - - def __repr__(self): - tmpstr = self.__class__.__name__ + "(" - tmpstr += "output_size=" + str(self.output_size) - tmpstr += ", spatial_scale=" + str(self.spatial_scale) - tmpstr += ", sampling_ratio=" + str(self.sampling_ratio) - tmpstr += ")" - return tmpstr diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/rotated_boxes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/rotated_boxes.py deleted file mode 100755 index 03f73b3b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/rotated_boxes.py +++ /dev/null @@ -1,21 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import absolute_import, division, print_function, unicode_literals -import torch - - -def pairwise_iou_rotated(boxes1, boxes2): - """ - Return intersection-over-union (Jaccard index) of boxes. - - Both sets of boxes are expected to be in - (x_center, y_center, width, height, angle) format. - - Arguments: - boxes1 (Tensor[N, 5]) - boxes2 (Tensor[M, 5]) - - Returns: - iou (Tensor[N, M]): the NxM matrix containing the pairwise - IoU values for every element in boxes1 and boxes2 - """ - return torch.ops.detectron2.box_iou_rotated(boxes1, boxes2) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/shape_spec.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/shape_spec.py deleted file mode 100755 index fe7e8e26..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/shape_spec.py +++ /dev/null @@ -1,20 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -from collections import namedtuple - - -class ShapeSpec(namedtuple("_ShapeSpec", ["channels", "height", "width", "stride"])): - """ - A simple structure that contains basic shape specification about a tensor. - It is often used as the auxiliary inputs/outputs of models, - to complement the lack of shape inference ability among pytorch modules. - - Attributes: - channels: - height: - width: - stride: - """ - - def __new__(cls, channels=None, height=None, width=None, stride=None): - return super().__new__(cls, channels, height, width, stride) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py deleted file mode 100755 index 29d0ef91..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/layers/wrappers.py +++ /dev/null @@ -1,132 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Wrappers around on some nn functions, mainly to support empty tensors. - -Ideally, add support directly in PyTorch to empty tensors in those functions. - -These can be removed once https://github.com/pytorch/pytorch/issues/12013 -is implemented -""" - -from typing import List, Optional -import torch -from torch.nn import functional as F - - -def shapes_to_tensor(x: List[int], device: Optional[torch.device] = None) -> torch.Tensor: - """ - Turn a list of integer scalars or integer Tensor scalars into a vector, - in a way that's both traceable and scriptable. 
- - In tracing, `x` should be a list of scalar Tensor, so the output can trace to the inputs. - In scripting or eager, `x` should be a list of int. - """ - if torch.jit.is_scripting(): - return torch.as_tensor(x, device=device) - if torch.jit.is_tracing(): - assert all( - [isinstance(t, torch.Tensor) for t in x] - ), "Shape should be tensor during tracing!" - # as_tensor should not be used in tracing because it records a constant - ret = torch.stack(x) - if ret.device != device: # avoid recording a hard-coded device if not necessary - ret = ret.to(device=device) - return ret - return torch.as_tensor(x, device=device) - - -def cat(tensors: List[torch.Tensor], dim: int = 0): - """ - Efficient version of torch.cat that avoids a copy if there is only a single element in a list - """ - assert isinstance(tensors, (list, tuple)) - if len(tensors) == 1: - return tensors[0] - return torch.cat(tensors, dim) - - -def cross_entropy(input, target, *, reduction="mean", **kwargs): - """ - Same as `torch.nn.functional.cross_entropy`, but returns 0 (instead of nan) - for empty inputs. - """ - if target.numel() == 0 and reduction == "mean": - return input.sum() * 0.0 # connect the gradient - return F.cross_entropy(input, target, reduction=reduction, **kwargs) - - -class _NewEmptyTensorOp(torch.autograd.Function): - @staticmethod - def forward(ctx, x, new_shape): - ctx.shape = x.shape - return x.new_empty(new_shape) - - @staticmethod - def backward(ctx, grad): - shape = ctx.shape - return _NewEmptyTensorOp.apply(grad, shape), None - - -class Conv2d(torch.nn.Conv2d): - """ - A wrapper around :class:`torch.nn.Conv2d` to support empty inputs and more features. - """ - - def __init__(self, *args, **kwargs): - """ - Extra keyword arguments supported in addition to those in `torch.nn.Conv2d`: - - Args: - norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - - It assumes that norm layer is used before activation. - """ - norm = kwargs.pop("norm", None) - activation = kwargs.pop("activation", None) - super().__init__(*args, **kwargs) - - self.norm = norm - self.activation = activation - - def forward(self, x): - # torchscript does not support SyncBatchNorm yet - # https://github.com/pytorch/pytorch/issues/40507 - # and we skip these codes in torchscript since: - # 1. currently we only support torchscript in evaluation mode - # 2. features needed by exporting module to torchscript are added in PyTorch 1.6 or - # later version, `Conv2d` in these PyTorch versions has already supported empty inputs. - if not torch.jit.is_scripting(): - if x.numel() == 0 and self.training: - # https://github.com/pytorch/pytorch/issues/12013 - assert not isinstance( - self.norm, torch.nn.SyncBatchNorm - ), "SyncBatchNorm does not support empty inputs!" - - x = F.conv2d( - x, self.weight, self.bias, self.stride, self.padding, self.dilation, self.groups - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - -ConvTranspose2d = torch.nn.ConvTranspose2d -BatchNorm2d = torch.nn.BatchNorm2d -interpolate = F.interpolate -Linear = torch.nn.Linear - - -def nonzero_tuple(x): - """ - A 'as_tuple=True' version of torch.nonzero to support torchscript. 
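
The `Conv2d` wrapper above fuses an optional norm and activation after the convolution, in that order. A minimal usage sketch (assuming detectron2 is importable):

```python
import torch
from torch import nn
from torch.nn import functional as F
from detectron2.layers import Conv2d  # the wrapper defined above

conv = Conv2d(8, 16, kernel_size=3, padding=1,
              norm=nn.GroupNorm(4, 16), activation=F.relu)
y = conv(torch.randn(2, 8, 32, 32))  # conv -> norm -> activation
print(y.shape)  # torch.Size([2, 16, 32, 32])
```

The empty-input handling in the wrapper matters for detectors that can predict zero instances: the layer must still produce a correctly shaped output so downstream shape checks do not fail.
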
- because of https://github.com/pytorch/pytorch/issues/38718 - """ - if torch.jit.is_scripting(): - if x.dim() == 0: - return x.unsqueeze(0).nonzero().unbind(1) - return x.nonzero().unbind(1) - else: - return x.nonzero(as_tuple=True) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/__init__.py deleted file mode 100755 index 62042081..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Model Zoo API for Detectron2: a collection of functions to create common model architectures -listed in `MODEL_ZOO.md <https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md>`_, -and optionally load their pre-trained weights. -""" - -from .model_zoo import get, get_config_file, get_checkpoint_url, get_config - -__all__ = ["get_checkpoint_url", "get", "get_config_file", "get_config"] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py deleted file mode 100755 index 5b90bc9a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py +++ /dev/null @@ -1,213 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import os -from typing import Optional -import pkg_resources -import torch - -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import CfgNode, LazyConfig, get_cfg, instantiate -from detectron2.modeling import build_model - - -class _ModelZooUrls(object): - """ - Mapping from names to officially released Detectron2 pre-trained models. 
- """ - - S3_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - - # format: {config_path.yaml} -> model_id/model_final_{commit}.pkl - CONFIG_PATH_TO_URL_SUFFIX = { - # COCO Detection with Faster R-CNN - "COCO-Detection/faster_rcnn_R_50_C4_1x": "137257644/model_final_721ade.pkl", - "COCO-Detection/faster_rcnn_R_50_DC5_1x": "137847829/model_final_51d356.pkl", - "COCO-Detection/faster_rcnn_R_50_FPN_1x": "137257794/model_final_b275ba.pkl", - "COCO-Detection/faster_rcnn_R_50_C4_3x": "137849393/model_final_f97cb7.pkl", - "COCO-Detection/faster_rcnn_R_50_DC5_3x": "137849425/model_final_68d202.pkl", - "COCO-Detection/faster_rcnn_R_50_FPN_3x": "137849458/model_final_280758.pkl", - "COCO-Detection/faster_rcnn_R_101_C4_3x": "138204752/model_final_298dad.pkl", - "COCO-Detection/faster_rcnn_R_101_DC5_3x": "138204841/model_final_3e0943.pkl", - "COCO-Detection/faster_rcnn_R_101_FPN_3x": "137851257/model_final_f6e8b1.pkl", - "COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x": "139173657/model_final_68b088.pkl", - # COCO Detection with RetinaNet - "COCO-Detection/retinanet_R_50_FPN_1x": "190397773/model_final_bfca0b.pkl", - "COCO-Detection/retinanet_R_50_FPN_3x": "190397829/model_final_5bd44e.pkl", - "COCO-Detection/retinanet_R_101_FPN_3x": "190397697/model_final_971ab9.pkl", - # COCO Detection with RPN and Fast R-CNN - "COCO-Detection/rpn_R_50_C4_1x": "137258005/model_final_450694.pkl", - "COCO-Detection/rpn_R_50_FPN_1x": "137258492/model_final_02ce48.pkl", - "COCO-Detection/fast_rcnn_R_50_FPN_1x": "137635226/model_final_e5f7ce.pkl", - # COCO Instance Segmentation Baselines with Mask R-CNN - "COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x": "137259246/model_final_9243eb.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x": "137260150/model_final_4f86c3.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x": "137260431/model_final_a54504.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x": "137849525/model_final_4ce675.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x": "137849551/model_final_84107b.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x": "137849600/model_final_f10217.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x": "138363239/model_final_a2914c.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x": "138363294/model_final_0464b7.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x": "138205316/model_final_a3ec72.pkl", - "COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x": "139653917/model_final_2d9806.pkl", # noqa - # New baselines using Large-Scale Jitter and Longer Training Schedule - "new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ": "42047764/model_final_bb69de.pkl", - "new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ": "42047638/model_final_89a8d3.pkl", - "new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ": "42019571/model_final_14d201.pkl", - "new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ": "42025812/model_final_4f7b58.pkl", - "new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ": "42131867/model_final_0bb7ae.pkl", - "new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ": "42073830/model_final_f96b26.pkl", - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ": "42047771/model_final_b7fbab.pkl", # noqa - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ": "42132721/model_final_5d87c1.pkl", # noqa - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ": "42025447/model_final_f1362d.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ": "42047784/model_final_6ba57e.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ": 
"42047642/model_final_27b9c1.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ": "42045954/model_final_ef3a80.pkl", # noqa - # COCO Person Keypoint Detection Baselines with Keypoint R-CNN - "COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x": "137261548/model_final_04e291.pkl", - "COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x": "137849621/model_final_a6e10b.pkl", - "COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x": "138363331/model_final_997cc7.pkl", - "COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x": "139686956/model_final_5ad38f.pkl", - # COCO Panoptic Segmentation Baselines with Panoptic FPN - "COCO-PanopticSegmentation/panoptic_fpn_R_50_1x": "139514544/model_final_dbfeb4.pkl", - "COCO-PanopticSegmentation/panoptic_fpn_R_50_3x": "139514569/model_final_c10459.pkl", - "COCO-PanopticSegmentation/panoptic_fpn_R_101_3x": "139514519/model_final_cafdb1.pkl", - # LVIS Instance Segmentation Baselines with Mask R-CNN - "LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x": "144219072/model_final_571f7c.pkl", # noqa - "LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x": "144219035/model_final_824ab5.pkl", # noqa - "LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x": "144219108/model_final_5e3439.pkl", # noqa - # Cityscapes & Pascal VOC Baselines - "Cityscapes/mask_rcnn_R_50_FPN": "142423278/model_final_af9cf5.pkl", - "PascalVOC-Detection/faster_rcnn_R_50_C4": "142202221/model_final_b1acc2.pkl", - # Other Settings - "Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5": "138602867/model_final_65c703.pkl", - "Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5": "144998336/model_final_821d0b.pkl", - "Misc/cascade_mask_rcnn_R_50_FPN_1x": "138602847/model_final_e9d89b.pkl", - "Misc/cascade_mask_rcnn_R_50_FPN_3x": "144998488/model_final_480dd8.pkl", - "Misc/mask_rcnn_R_50_FPN_3x_syncbn": "169527823/model_final_3b3c51.pkl", - "Misc/mask_rcnn_R_50_FPN_3x_gn": "138602888/model_final_dc5d9e.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_3x_gn": "138602908/model_final_01ca85.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_9x_gn": "183808979/model_final_da7b4c.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn": "184226666/model_final_5ce33e.pkl", - "Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x": "139797668/model_final_be35db.pkl", - "Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv": "18131413/model_0039999_e76410.pkl", # noqa - # D1 Comparisons - "Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x": "137781054/model_final_7ab50c.pkl", # noqa - "Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x": "137781281/model_final_62ca52.pkl", # noqa - "Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x": "137781195/model_final_cce136.pkl", - } - - @staticmethod - def query(config_path: str) -> Optional[str]: - """ - Args: - config_path: relative config filename - """ - name = config_path.replace(".yaml", "").replace(".py", "") - if name in _ModelZooUrls.CONFIG_PATH_TO_URL_SUFFIX: - suffix = _ModelZooUrls.CONFIG_PATH_TO_URL_SUFFIX[name] - return _ModelZooUrls.S3_PREFIX + name + "/" + suffix - return None - - -def get_checkpoint_url(config_path): - """ - Returns the URL to the model trained using the given config - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - Returns: - str: a URL to the model - """ - url = _ModelZooUrls.query(config_path) - if url is None: - raise RuntimeError("Pretrained model for {} is not available!".format(config_path)) - return url - - -def get_config_file(config_path): - """ - Returns path to a builtin 
config file. - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - Returns: - str: the real path to the config file. - """ - cfg_file = pkg_resources.resource_filename( - "detectron2.model_zoo", os.path.join("configs", config_path) - ) - if not os.path.exists(cfg_file): - raise RuntimeError("{} not available in Model Zoo!".format(config_path)) - return cfg_file - - -def get_config(config_path, trained: bool = False): - """ - Returns a config object for a model in model zoo. - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - trained (bool): If True, will set ``MODEL.WEIGHTS`` to trained model zoo weights. - If False, the checkpoint specified in the config file's ``MODEL.WEIGHTS`` is used - instead; this will typically (though not always) initialize a subset of weights using - an ImageNet pre-trained model, while randomly initializing the other weights. - - Returns: - CfgNode or omegaconf.DictConfig: a config object - """ - cfg_file = get_config_file(config_path) - if cfg_file.endswith(".yaml"): - cfg = get_cfg() - cfg.merge_from_file(cfg_file) - if trained: - cfg.MODEL.WEIGHTS = get_checkpoint_url(config_path) - return cfg - elif cfg_file.endswith(".py"): - cfg = LazyConfig.load(cfg_file) - if trained: - url = get_checkpoint_url(config_path) - if "train" in cfg and "init_checkpoint" in cfg.train: - cfg.train.init_checkpoint = url - else: - raise NotImplementedError - return cfg - - -def get(config_path, trained: bool = False, device: Optional[str] = None): - """ - Get a model specified by relative path under Detectron2's official ``configs/`` directory. - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - trained (bool): see :func:`get_config`. - device (str or None): overwrite the device in config, if given. - - Returns: - nn.Module: a detectron2 model. Will be in training mode. - - Example: - :: - from detectron2 import model_zoo - model = model_zoo.get("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml", trained=True) - """ - cfg = get_config(config_path, trained) - if device is None and not torch.cuda.is_available(): - device = "cpu" - if device is not None and isinstance(cfg, CfgNode): - cfg.MODEL.DEVICE = device - - if isinstance(cfg, CfgNode): - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - else: - model = instantiate(cfg.model) - if device is not None: - model = model.to(device) - if "train" in cfg and "init_checkpoint" in cfg.train: - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - return model diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/__init__.py deleted file mode 100755 index 576493de..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/__init__.py +++ /dev/null @@ -1,59 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
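# ---- illustrative sketch (not from the deleted sources) ----
# Typical use of the model-zoo API deleted above, assuming a detectron2
# installation; the call signatures are taken from the removed file itself.
from detectron2 import model_zoo

cfg = model_zoo.get_config(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml", trained=True
)
model = model_zoo.get(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml",
    trained=True,
    device="cpu",  # overrides cfg.MODEL.DEVICE, as in the deleted get()
)
# the returned nn.Module is in training mode, per the docstring above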
-from detectron2.layers import ShapeSpec - -from .anchor_generator import build_anchor_generator, ANCHOR_GENERATOR_REGISTRY -from .backbone import ( - BACKBONE_REGISTRY, - FPN, - Backbone, - ResNet, - ResNetBlockBase, - build_backbone, - build_resnet_backbone, - make_stage, -) -from .meta_arch import ( - META_ARCH_REGISTRY, - SEM_SEG_HEADS_REGISTRY, - GeneralizedRCNN, - PanopticFPN, - ProposalNetwork, - RetinaNet, - SemanticSegmentor, - build_model, - build_sem_seg_head, - FCOS, -) -from .postprocessing import detector_postprocess -from .proposal_generator import ( - PROPOSAL_GENERATOR_REGISTRY, - build_proposal_generator, - RPN_HEAD_REGISTRY, - build_rpn_head, -) -from .roi_heads import ( - ROI_BOX_HEAD_REGISTRY, - ROI_HEADS_REGISTRY, - ROI_KEYPOINT_HEAD_REGISTRY, - ROI_MASK_HEAD_REGISTRY, - ROIHeads, - StandardROIHeads, - BaseMaskRCNNHead, - BaseKeypointRCNNHead, - FastRCNNOutputLayers, - build_box_head, - build_keypoint_head, - build_mask_head, - build_roi_heads, -) -from .test_time_augmentation import DatasetMapperTTA, GeneralizedRCNNWithTTA -from .mmdet_wrapper import MMDetBackbone, MMDetDetector - -_EXCLUDE = {"ShapeSpec"} -__all__ = [k for k in globals().keys() if k not in _EXCLUDE and not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/anchor_generator.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/anchor_generator.py deleted file mode 100755 index ee4b9881..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/anchor_generator.py +++ /dev/null @@ -1,382 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import collections -import math -from typing import List -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, RotatedBoxes -from detectron2.utils.registry import Registry - -ANCHOR_GENERATOR_REGISTRY = Registry("ANCHOR_GENERATOR") -ANCHOR_GENERATOR_REGISTRY.__doc__ = """ -Registry for modules that creates object detection anchors for feature maps. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -class BufferList(nn.Module): - """ - Similar to nn.ParameterList, but for buffers - """ - - def __init__(self, buffers): - super().__init__() - for i, buffer in enumerate(buffers): - # Use non-persistent buffer so the values are not saved in checkpoint - self.register_buffer(str(i), buffer, persistent=False) - - def __len__(self): - return len(self._buffers) - - def __iter__(self): - return iter(self._buffers.values()) - - -def _create_grid_offsets(size: List[int], stride: int, offset: float, device: torch.device): - grid_height, grid_width = size - shifts_x = torch.arange( - offset * stride, grid_width * stride, step=stride, dtype=torch.float32, device=device - ) - shifts_y = torch.arange( - offset * stride, grid_height * stride, step=stride, dtype=torch.float32, device=device - ) - - shift_y, shift_x = torch.meshgrid(shifts_y, shifts_x) - shift_x = shift_x.reshape(-1) - shift_y = shift_y.reshape(-1) - return shift_x, shift_y - - -def _broadcast_params(params, num_features, name): - """ - If one size (or aspect ratio) is specified and there are multiple feature - maps, we "broadcast" anchors of that single size (or aspect ratio) - over all feature maps. 
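# ---- illustrative sketch (not from the deleted sources) ----
# _create_grid_offsets above produces one (x, y) shift per feature-map cell,
# expressed in input-image pixels. Standalone replica (assumes PyTorch >= 1.10
# for the explicit indexing= argument):
import torch

def grid_offsets(size, stride, offset=0.5):
    h, w = size
    xs = torch.arange(offset * stride, w * stride, step=stride)
    ys = torch.arange(offset * stride, h * stride, step=stride)
    sy, sx = torch.meshgrid(ys, xs, indexing="ij")
    return sx.reshape(-1), sy.reshape(-1)

sx, sy = grid_offsets((2, 3), stride=16)  # 2x3 feature map, stride 16
assert sx.tolist() == [8.0, 24.0, 40.0, 8.0, 24.0, 40.0]
assert sy.tolist() == [8.0, 8.0, 8.0, 24.0, 24.0, 24.0]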
- - If params is list[float], or list[list[float]] with len(params) == 1, repeat - it num_features time. - - Returns: - list[list[float]]: param for each feature - """ - assert isinstance( - params, collections.abc.Sequence - ), f"{name} in anchor generator has to be a list! Got {params}." - assert len(params), f"{name} in anchor generator cannot be empty!" - if not isinstance(params[0], collections.abc.Sequence): # params is list[float] - return [params] * num_features - if len(params) == 1: - return list(params) * num_features - assert len(params) == num_features, ( - f"Got {name} of length {len(params)} in anchor generator, " - f"but the number of input features is {num_features}!" - ) - return params - - -@ANCHOR_GENERATOR_REGISTRY.register() -class DefaultAnchorGenerator(nn.Module): - """ - Compute anchors in the standard ways described in - "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks". - """ - - box_dim: torch.jit.Final[int] = 4 - """ - the dimension of each anchor box. - """ - - @configurable - def __init__(self, *, sizes, aspect_ratios, strides, offset=0.5): - """ - This interface is experimental. - - Args: - sizes (list[list[float]] or list[float]): - If ``sizes`` is list[list[float]], ``sizes[i]`` is the list of anchor sizes - (i.e. sqrt of anchor area) to use for the i-th feature map. - If ``sizes`` is list[float], ``sizes`` is used for all feature maps. - Anchor sizes are given in absolute lengths in units of - the input image; they do not dynamically scale if the input image size changes. - aspect_ratios (list[list[float]] or list[float]): list of aspect ratios - (i.e. height / width) to use for anchors. Same "broadcast" rule for `sizes` applies. - strides (list[int]): stride of each input feature. - offset (float): Relative offset between the center of the first anchor and the top-left - corner of the image. Value has to be in [0, 1). - Recommend to use 0.5, which means half stride. - """ - super().__init__() - - self.strides = strides - self.num_features = len(self.strides) - sizes = _broadcast_params(sizes, self.num_features, "sizes") - aspect_ratios = _broadcast_params(aspect_ratios, self.num_features, "aspect_ratios") - self.cell_anchors = self._calculate_anchors(sizes, aspect_ratios) - - self.offset = offset - assert 0.0 <= self.offset < 1.0, self.offset - - @classmethod - def from_config(cls, cfg, input_shape: List[ShapeSpec]): - return { - "sizes": cfg.MODEL.ANCHOR_GENERATOR.SIZES, - "aspect_ratios": cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS, - "strides": [x.stride for x in input_shape], - "offset": cfg.MODEL.ANCHOR_GENERATOR.OFFSET, - } - - def _calculate_anchors(self, sizes, aspect_ratios): - cell_anchors = [ - self.generate_cell_anchors(s, a).float() for s, a in zip(sizes, aspect_ratios) - ] - return BufferList(cell_anchors) - - @property - @torch.jit.unused - def num_cell_anchors(self): - """ - Alias of `num_anchors`. - """ - return self.num_anchors - - @property - @torch.jit.unused - def num_anchors(self): - """ - Returns: - list[int]: Each int is the number of anchors at every pixel - location, on that feature map. - For example, if at every pixel we use anchors of 3 aspect - ratios and 5 sizes, the number of anchors is 15. - (See also ANCHOR_GENERATOR.SIZES and ANCHOR_GENERATOR.ASPECT_RATIOS in config) - - In standard RPN models, `num_anchors` on every feature map is the same. 
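# ---- illustrative sketch (not from the deleted sources) ----
# The broadcast rule implemented by _broadcast_params above, in isolation:
def broadcast(params, num_features):
    if not isinstance(params[0], (list, tuple)):   # list[float]
        return [list(params)] * num_features
    if len(params) == 1:                           # list[list[float]], len 1
        return list(params) * num_features
    assert len(params) == num_features             # one entry per feature map
    return params

assert broadcast([32, 64], 3) == [[32, 64]] * 3
assert broadcast([[32, 64]], 3) == [[32, 64]] * 3
assert broadcast([[32], [64], [128]], 3) == [[32], [64], [128]]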
- """ - return [len(cell_anchors) for cell_anchors in self.cell_anchors] - - def _grid_anchors(self, grid_sizes: List[List[int]]): - """ - Returns: - list[Tensor]: #featuremap tensors, each is (#locations x #cell_anchors) x 4 - """ - anchors = [] - # buffers() not supported by torchscript. use named_buffers() instead - buffers: List[torch.Tensor] = [x[1] for x in self.cell_anchors.named_buffers()] - for size, stride, base_anchors in zip(grid_sizes, self.strides, buffers): - shift_x, shift_y = _create_grid_offsets(size, stride, self.offset, base_anchors.device) - shifts = torch.stack((shift_x, shift_y, shift_x, shift_y), dim=1) - - anchors.append((shifts.view(-1, 1, 4) + base_anchors.view(1, -1, 4)).reshape(-1, 4)) - - return anchors - - def generate_cell_anchors(self, sizes=(32, 64, 128, 256, 512), aspect_ratios=(0.5, 1, 2)): - """ - Generate a tensor storing canonical anchor boxes, which are all anchor - boxes of different sizes and aspect_ratios centered at (0, 0). - We can later build the set of anchors for a full feature map by - shifting and tiling these tensors (see `meth:_grid_anchors`). - - Args: - sizes (tuple[float]): - aspect_ratios (tuple[float]]): - - Returns: - Tensor of shape (len(sizes) * len(aspect_ratios), 4) storing anchor boxes - in XYXY format. - """ - - # This is different from the anchor generator defined in the original Faster R-CNN - # code or Detectron. They yield the same AP, however the old version defines cell - # anchors in a less natural way with a shift relative to the feature grid and - # quantization that results in slightly different sizes for different aspect ratios. - # See also https://github.com/facebookresearch/Detectron/issues/227 - - anchors = [] - for size in sizes: - area = size ** 2.0 - for aspect_ratio in aspect_ratios: - # s * s = w * h - # a = h / w - # ... some algebra ... - # w = sqrt(s * s / a) - # h = a * w - w = math.sqrt(area / aspect_ratio) - h = aspect_ratio * w - x0, y0, x1, y1 = -w / 2.0, -h / 2.0, w / 2.0, h / 2.0 - anchors.append([x0, y0, x1, y1]) - return torch.tensor(anchors) - - def forward(self, features: List[torch.Tensor]): - """ - Args: - features (list[Tensor]): list of backbone feature maps on which to generate anchors. - - Returns: - list[Boxes]: a list of Boxes containing all the anchors for each feature map - (i.e. the cell anchors repeated over all locations in the feature map). - The number of anchors of each feature map is Hi x Wi x num_cell_anchors, - where Hi, Wi are resolution of the feature map divided by anchor stride. - """ - grid_sizes = [feature_map.shape[-2:] for feature_map in features] - anchors_over_all_feature_maps = self._grid_anchors(grid_sizes) - return [Boxes(x) for x in anchors_over_all_feature_maps] - - -@ANCHOR_GENERATOR_REGISTRY.register() -class RotatedAnchorGenerator(nn.Module): - """ - Compute rotated anchors used by Rotated RPN (RRPN), described in - "Arbitrary-Oriented Scene Text Detection via Rotation Proposals". - """ - - box_dim: int = 5 - """ - the dimension of each anchor box. - """ - - @configurable - def __init__(self, *, sizes, aspect_ratios, strides, angles, offset=0.5): - """ - This interface is experimental. - - Args: - sizes (list[list[float]] or list[float]): - If sizes is list[list[float]], sizes[i] is the list of anchor sizes - (i.e. sqrt of anchor area) to use for the i-th feature map. - If sizes is list[float], the sizes are used for all feature maps. 
- Anchor sizes are given in absolute lengths in units of - the input image; they do not dynamically scale if the input image size changes. - aspect_ratios (list[list[float]] or list[float]): list of aspect ratios - (i.e. height / width) to use for anchors. Same "broadcast" rule for `sizes` applies. - strides (list[int]): stride of each input feature. - angles (list[list[float]] or list[float]): list of angles (in degrees CCW) - to use for anchors. Same "broadcast" rule for `sizes` applies. - offset (float): Relative offset between the center of the first anchor and the top-left - corner of the image. Value has to be in [0, 1). - Recommend to use 0.5, which means half stride. - """ - super().__init__() - - self.strides = strides - self.num_features = len(self.strides) - sizes = _broadcast_params(sizes, self.num_features, "sizes") - aspect_ratios = _broadcast_params(aspect_ratios, self.num_features, "aspect_ratios") - angles = _broadcast_params(angles, self.num_features, "angles") - self.cell_anchors = self._calculate_anchors(sizes, aspect_ratios, angles) - - self.offset = offset - assert 0.0 <= self.offset < 1.0, self.offset - - @classmethod - def from_config(cls, cfg, input_shape: List[ShapeSpec]): - return { - "sizes": cfg.MODEL.ANCHOR_GENERATOR.SIZES, - "aspect_ratios": cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS, - "strides": [x.stride for x in input_shape], - "offset": cfg.MODEL.ANCHOR_GENERATOR.OFFSET, - "angles": cfg.MODEL.ANCHOR_GENERATOR.ANGLES, - } - - def _calculate_anchors(self, sizes, aspect_ratios, angles): - cell_anchors = [ - self.generate_cell_anchors(size, aspect_ratio, angle).float() - for size, aspect_ratio, angle in zip(sizes, aspect_ratios, angles) - ] - return BufferList(cell_anchors) - - @property - def num_cell_anchors(self): - """ - Alias of `num_anchors`. - """ - return self.num_anchors - - @property - def num_anchors(self): - """ - Returns: - list[int]: Each int is the number of anchors at every pixel - location, on that feature map. - For example, if at every pixel we use anchors of 3 aspect - ratios, 2 sizes and 5 angles, the number of anchors is 30. - (See also ANCHOR_GENERATOR.SIZES, ANCHOR_GENERATOR.ASPECT_RATIOS - and ANCHOR_GENERATOR.ANGLES in config) - - In standard RRPN models, `num_anchors` on every feature map is the same. - """ - return [len(cell_anchors) for cell_anchors in self.cell_anchors] - - def _grid_anchors(self, grid_sizes): - anchors = [] - for size, stride, base_anchors in zip(grid_sizes, self.strides, self.cell_anchors): - shift_x, shift_y = _create_grid_offsets(size, stride, self.offset, base_anchors.device) - zeros = torch.zeros_like(shift_x) - shifts = torch.stack((shift_x, shift_y, zeros, zeros, zeros), dim=1) - - anchors.append((shifts.view(-1, 1, 5) + base_anchors.view(1, -1, 5)).reshape(-1, 5)) - - return anchors - - def generate_cell_anchors( - self, - sizes=(32, 64, 128, 256, 512), - aspect_ratios=(0.5, 1, 2), - angles=(-90, -60, -30, 0, 30, 60, 90), - ): - """ - Generate a tensor storing canonical anchor boxes, which are all anchor - boxes of different sizes, aspect_ratios, angles centered at (0, 0). - We can later build the set of anchors for a full feature map by - shifting and tiling these tensors (see `meth:_grid_anchors`). - - Args: - sizes (tuple[float]): - aspect_ratios (tuple[float]]): - angles (tuple[float]]): - - Returns: - Tensor of shape (len(sizes) * len(aspect_ratios) * len(angles), 5) - storing anchor boxes in (x_ctr, y_ctr, w, h, angle) format. 
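# ---- illustrative sketch (not from the deleted sources) ----
# Rotated anchors above are (x_ctr, y_ctr, w, h, angle) boxes; only the two
# center coordinates receive the per-location shift, which is why
# _grid_anchors pads the shift vector with three zeros:
import torch

base = torch.tensor([[0.0, 0.0, 32.0, 16.0, 30.0]])  # one cell anchor
shift = torch.tensor([[8.0, 8.0, 0.0, 0.0, 0.0]])    # first cell, stride 16
out = (shift.view(-1, 1, 5) + base.view(1, -1, 5)).reshape(-1, 5)
assert out.tolist() == [[8.0, 8.0, 32.0, 16.0, 30.0]]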
- """ - anchors = [] - for size in sizes: - area = size ** 2.0 - for aspect_ratio in aspect_ratios: - # s * s = w * h - # a = h / w - # ... some algebra ... - # w = sqrt(s * s / a) - # h = a * w - w = math.sqrt(area / aspect_ratio) - h = aspect_ratio * w - anchors.extend([0, 0, w, h, a] for a in angles) - - return torch.tensor(anchors) - - def forward(self, features): - """ - Args: - features (list[Tensor]): list of backbone feature maps on which to generate anchors. - - Returns: - list[RotatedBoxes]: a list of Boxes containing all the anchors for each feature map - (i.e. the cell anchors repeated over all locations in the feature map). - The number of anchors of each feature map is Hi x Wi x num_cell_anchors, - where Hi, Wi are resolution of the feature map divided by anchor stride. - """ - grid_sizes = [feature_map.shape[-2:] for feature_map in features] - anchors_over_all_feature_maps = self._grid_anchors(grid_sizes) - return [RotatedBoxes(x) for x in anchors_over_all_feature_maps] - - -def build_anchor_generator(cfg, input_shape): - """ - Built an anchor generator from `cfg.MODEL.ANCHOR_GENERATOR.NAME`. - """ - anchor_generator = cfg.MODEL.ANCHOR_GENERATOR.NAME - return ANCHOR_GENERATOR_REGISTRY.get(anchor_generator)(cfg, input_shape) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py deleted file mode 100755 index 55b265d5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .build import build_backbone, BACKBONE_REGISTRY # noqa F401 isort:skip - -from .backbone import Backbone -from .fpn import FPN -from .regnet import RegNet -from .resnet import ( - BasicStem, - ResNet, - ResNetBlockBase, - build_resnet_backbone, - make_stage, - BottleneckBlock, -) - -__all__ = [k for k in globals().keys() if not k.startswith("_")] -# TODO can expose more resnet blocks after careful consideration diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py deleted file mode 100755 index 369fb884..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py +++ /dev/null @@ -1,53 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from abc import ABCMeta, abstractmethod -import torch.nn as nn - -from detectron2.layers import ShapeSpec - -__all__ = ["Backbone"] - - -class Backbone(nn.Module, metaclass=ABCMeta): - """ - Abstract base class for network backbones. - """ - - def __init__(self): - """ - The `__init__` method of any subclass can specify its own set of arguments. - """ - super().__init__() - - @abstractmethod - def forward(self): - """ - Subclasses must override this method, but adhere to the same return type. - - Returns: - dict[str->Tensor]: mapping from feature name (e.g., "res2") to tensor - """ - pass - - @property - def size_divisibility(self) -> int: - """ - Some backbones require the input height and width to be divisible by a - specific integer. This is typically true for encoder / decoder type networks - with lateral connection (e.g., FPN) for which feature maps need to match - dimension in the "bottom up" and "top down" paths. 
Set to 0 if no specific - input size divisibility is required. - """ - return 0 - - def output_shape(self): - """ - Returns: - dict[str->ShapeSpec] - """ - # this is a backward-compatible default - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/build.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/build.py deleted file mode 100755 index af021411..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/build.py +++ /dev/null @@ -1,33 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.layers import ShapeSpec -from detectron2.utils.registry import Registry - -from .backbone import Backbone - -BACKBONE_REGISTRY = Registry("BACKBONE") -BACKBONE_REGISTRY.__doc__ = """ -Registry for backbones, which extract feature maps from images - -The registered object must be a callable that accepts two arguments: - -1. A :class:`detectron2.config.CfgNode` -2. A :class:`detectron2.layers.ShapeSpec`, which contains the input shape specification. - -Registered object must return instance of :class:`Backbone`. -""" - - -def build_backbone(cfg, input_shape=None): - """ - Build a backbone from `cfg.MODEL.BACKBONE.NAME`. - - Returns: - an instance of :class:`Backbone` - """ - if input_shape is None: - input_shape = ShapeSpec(channels=len(cfg.MODEL.PIXEL_MEAN)) - - backbone_name = cfg.MODEL.BACKBONE.NAME - backbone = BACKBONE_REGISTRY.get(backbone_name)(cfg, input_shape) - assert isinstance(backbone, Backbone) - return backbone diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py deleted file mode 100755 index d0bdfc9d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py +++ /dev/null @@ -1,255 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import Conv2d, ShapeSpec, get_norm - -from .backbone import Backbone -from .build import BACKBONE_REGISTRY -from .resnet import build_resnet_backbone - -__all__ = ["build_resnet_fpn_backbone", "build_retinanet_resnet_fpn_backbone", "FPN"] - - -class FPN(Backbone): - """ - This module implements :paper:`FPN`. - It creates pyramid features built on top of some input feature maps. - """ - - _fuse_type: torch.jit.Final[str] - - def __init__( - self, bottom_up, in_features, out_channels, norm="", top_block=None, fuse_type="sum" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. - norm (str): the normalization to use. 
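# ---- illustrative sketch (not from the deleted sources) ----
# Backbone.output_shape above reports, per feature map, its channel count and
# total stride w.r.t. the input; FPN reads exactly these fields when sizing
# its lateral convs. Plain-dataclass stand-in with ResNet-50-style numbers
# (the concrete values here are assumptions for illustration):
from dataclasses import dataclass

@dataclass
class ShapeSpec:
    channels: int
    stride: int

out_shape = {
    "res2": ShapeSpec(256, 4),
    "res3": ShapeSpec(512, 8),
    "res4": ShapeSpec(1024, 16),
    "res5": ShapeSpec(2048, 32),
}
in_features = ["res3", "res4", "res5"]
strides = [out_shape[f].stride for f in in_features]      # [8, 16, 32]
channels = [out_shape[f].channels for f in in_features]   # [512, 1024, 2048]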
- top_block (nn.Module or None): if provided, an extra operation will - be performed on the output of the last (smallest resolution) - FPN output, and the result will extend the result list. The top_block - further downsamples the feature map. It must have an attribute - "num_levels", meaning the number of extra FPN levels added by - this block, and "in_feature", which is a string representing - its input feature (e.g., p5). - fuse_type (str): types for fusing the top down features and the lateral - ones. It can be "sum" (default), which sums up element-wise; or "avg", - which takes the element-wise mean of the two. - """ - super(FPN, self).__init__() - assert isinstance(bottom_up, Backbone) - assert in_features, in_features - - # Feature map strides and channels from the bottom up network (e.g. ResNet) - input_shapes = bottom_up.output_shape() - strides = [input_shapes[f].stride for f in in_features] - in_channels_per_feature = [input_shapes[f].channels for f in in_features] - - _assert_strides_are_log2_contiguous(strides) - lateral_convs = [] - output_convs = [] - - use_bias = norm == "" - for idx, in_channels in enumerate(in_channels_per_feature): - lateral_norm = get_norm(norm, out_channels) - output_norm = get_norm(norm, out_channels) - - lateral_conv = Conv2d( - in_channels, out_channels, kernel_size=1, bias=use_bias, norm=lateral_norm - ) - output_conv = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=use_bias, - norm=output_norm, - ) - weight_init.c2_xavier_fill(lateral_conv) - weight_init.c2_xavier_fill(output_conv) - stage = int(math.log2(strides[idx])) - self.add_module("fpn_lateral{}".format(stage), lateral_conv) - self.add_module("fpn_output{}".format(stage), output_conv) - - lateral_convs.append(lateral_conv) - output_convs.append(output_conv) - # Place convs into top-down order (from low to high resolution) - # to make the top-down computation in forward clearer. - self.lateral_convs = lateral_convs[::-1] - self.output_convs = output_convs[::-1] - self.top_block = top_block - self.in_features = tuple(in_features) - self.bottom_up = bottom_up - # Return feature names are "p", like ["p2", "p3", ..., "p6"] - self._out_feature_strides = {"p{}".format(int(math.log2(s))): s for s in strides} - # top block output feature maps. - if self.top_block is not None: - for s in range(stage, stage + self.top_block.num_levels): - self._out_feature_strides["p{}".format(s + 1)] = 2 ** (s + 1) - - self._out_features = list(self._out_feature_strides.keys()) - self._out_feature_channels = {k: out_channels for k in self._out_features} - self._size_divisibility = strides[-1] - assert fuse_type in {"avg", "sum"} - self._fuse_type = fuse_type - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "res5") to - feature map tensor for each feature level in high to low resolution order. - - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["p2", "p3", ..., "p6"]. 
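# ---- illustrative sketch (not from the deleted sources) ----
# The core of the top-down pass in FPN.forward below: upsample the coarser
# level 2x (nearest) and add the lateral 1x1 projection of the finer level;
# "avg" fuse_type would halve the sum. Shapes only, without the learned convs:
import torch
import torch.nn.functional as F

p5 = torch.randn(1, 256, 8, 8)          # coarser FPN level
lateral4 = torch.randn(1, 256, 16, 16)  # lateral conv output, finer level
p4 = lateral4 + F.interpolate(p5, scale_factor=2.0, mode="nearest")
assert p4.shape == (1, 256, 16, 16)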
- """ - bottom_up_features = self.bottom_up(x) - results = [] - prev_features = self.lateral_convs[0](bottom_up_features[self.in_features[-1]]) - results.append(self.output_convs[0](prev_features)) - - # Reverse feature maps into top-down order (from low to high resolution) - for idx, (lateral_conv, output_conv) in enumerate( - zip(self.lateral_convs, self.output_convs) - ): - # Slicing of ModuleList is not supported https://github.com/pytorch/pytorch/issues/47336 - # Therefore we loop over all modules but skip the first one - if idx > 0: - features = self.in_features[-idx - 1] - features = bottom_up_features[features] - top_down_features = F.interpolate(prev_features, scale_factor=2.0, mode="nearest") - lateral_features = lateral_conv(features) - prev_features = lateral_features + top_down_features - if self._fuse_type == "avg": - prev_features /= 2 - results.insert(0, output_conv(prev_features)) - - if self.top_block is not None: - if self.top_block.in_feature in bottom_up_features: - top_block_in_feature = bottom_up_features[self.top_block.in_feature] - else: - top_block_in_feature = results[self._out_features.index(self.top_block.in_feature)] - results.extend(self.top_block(top_block_in_feature)) - assert len(self._out_features) == len(results) - return {f: res for f, res in zip(self._out_features, results)} - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - -def _assert_strides_are_log2_contiguous(strides): - """ - Assert that each stride is 2x times its preceding stride, i.e. "contiguous in log2". - """ - for i, stride in enumerate(strides[1:], 1): - assert stride == 2 * strides[i - 1], "Strides {} {} are not log2 contiguous".format( - stride, strides[i - 1] - ) - - -class LastLevelMaxPool(nn.Module): - """ - This module is used in the original FPN to generate a downsampled - P6 feature from P5. - """ - - def __init__(self): - super().__init__() - self.num_levels = 1 - self.in_feature = "p5" - - def forward(self, x): - return [F.max_pool2d(x, kernel_size=1, stride=2, padding=0)] - - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels, in_feature="res5"): - super().__init__() - self.num_levels = 2 - self.in_feature = in_feature - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - - -@BACKBONE_REGISTRY.register() -def build_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelMaxPool(), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_retinanet_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_p6p7 = bottom_up.output_shape()["res5"].channels - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_p6p7, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py deleted file mode 100755 index 3533d633..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py +++ /dev/null @@ -1,452 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -""" -Implementation of RegNet models from :paper:`dds` and :paper:`scaling`. - -This code is adapted from https://github.com/facebookresearch/pycls with minimal modifications. -Some code duplication exists between RegNet and ResNets (e.g., ResStem) in order to simplify -model loading. -""" - -import numpy as np -from torch import nn - -from detectron2.layers import CNNBlockBase, ShapeSpec, get_norm - -from .backbone import Backbone - -__all__ = [ - "AnyNet", - "RegNet", - "ResStem", - "SimpleStem", - "VanillaBlock", - "ResBasicBlock", - "ResBottleneckBlock", -] - - -def conv2d(w_in, w_out, k, *, stride=1, groups=1, bias=False): - """Helper for building a conv2d layer.""" - assert k % 2 == 1, "Only odd size kernels supported to avoid padding issues." - s, p, g, b = stride, (k - 1) // 2, groups, bias - return nn.Conv2d(w_in, w_out, k, stride=s, padding=p, groups=g, bias=b) - - -def gap2d(): - """Helper for building a global average pooling layer.""" - return nn.AdaptiveAvgPool2d((1, 1)) - - -def pool2d(k, *, stride=1): - """Helper for building a pool2d layer.""" - assert k % 2 == 1, "Only odd size kernels supported to avoid padding issues." 
- return nn.MaxPool2d(k, stride=stride, padding=(k - 1) // 2) - - -def init_weights(m): - """Performs ResNet-style weight initialization.""" - if isinstance(m, nn.Conv2d): - # Note that there is no bias due to BN - fan_out = m.kernel_size[0] * m.kernel_size[1] * m.out_channels - m.weight.data.normal_(mean=0.0, std=np.sqrt(2.0 / fan_out)) - elif isinstance(m, nn.BatchNorm2d): - m.weight.data.fill_(1.0) - m.bias.data.zero_() - elif isinstance(m, nn.Linear): - m.weight.data.normal_(mean=0.0, std=0.01) - m.bias.data.zero_() - - -class ResStem(CNNBlockBase): - """ResNet stem for ImageNet: 7x7, BN, AF, MaxPool.""" - - def __init__(self, w_in, w_out, norm, activation_class): - super().__init__(w_in, w_out, 4) - self.conv = conv2d(w_in, w_out, 7, stride=2) - self.bn = get_norm(norm, w_out) - self.af = activation_class() - self.pool = pool2d(3, stride=2) - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class SimpleStem(CNNBlockBase): - """Simple stem for ImageNet: 3x3, BN, AF.""" - - def __init__(self, w_in, w_out, norm, activation_class): - super().__init__(w_in, w_out, 2) - self.conv = conv2d(w_in, w_out, 3, stride=2) - self.bn = get_norm(norm, w_out) - self.af = activation_class() - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class SE(nn.Module): - """Squeeze-and-Excitation (SE) block: AvgPool, FC, Act, FC, Sigmoid.""" - - def __init__(self, w_in, w_se, activation_class): - super().__init__() - self.avg_pool = gap2d() - self.f_ex = nn.Sequential( - conv2d(w_in, w_se, 1, bias=True), - activation_class(), - conv2d(w_se, w_in, 1, bias=True), - nn.Sigmoid(), - ) - - def forward(self, x): - return x * self.f_ex(self.avg_pool(x)) - - -class VanillaBlock(CNNBlockBase): - """Vanilla block: [3x3 conv, BN, Relu] x2.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, _params): - super().__init__(w_in, w_out, stride) - self.a = conv2d(w_in, w_out, 3, stride=stride) - self.a_bn = get_norm(norm, w_out) - self.a_af = activation_class() - self.b = conv2d(w_out, w_out, 3) - self.b_bn = get_norm(norm, w_out) - self.b_af = activation_class() - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class BasicTransform(nn.Module): - """Basic transformation: [3x3 conv, BN, Relu] x2.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, _params): - super().__init__() - self.a = conv2d(w_in, w_out, 3, stride=stride) - self.a_bn = get_norm(norm, w_out) - self.a_af = activation_class() - self.b = conv2d(w_out, w_out, 3) - self.b_bn = get_norm(norm, w_out) - self.b_bn.final_bn = True - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class ResBasicBlock(CNNBlockBase): - """Residual basic block: x + f(x), f = basic transform.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__(w_in, w_out, stride) - self.proj, self.bn = None, None - if (w_in != w_out) or (stride != 1): - self.proj = conv2d(w_in, w_out, 1, stride=stride) - self.bn = get_norm(norm, w_out) - self.f = BasicTransform(w_in, w_out, stride, norm, activation_class, params) - self.af = activation_class() - - def forward(self, x): - x_p = self.bn(self.proj(x)) if self.proj else x - return self.af(x_p + self.f(x)) - - -class BottleneckTransform(nn.Module): - """Bottleneck transformation: 1x1, 3x3 [+SE], 1x1.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__() - w_b = int(round(w_out * 
params["bot_mul"])) - w_se = int(round(w_in * params["se_r"])) - groups = w_b // params["group_w"] - self.a = conv2d(w_in, w_b, 1) - self.a_bn = get_norm(norm, w_b) - self.a_af = activation_class() - self.b = conv2d(w_b, w_b, 3, stride=stride, groups=groups) - self.b_bn = get_norm(norm, w_b) - self.b_af = activation_class() - self.se = SE(w_b, w_se, activation_class) if w_se else None - self.c = conv2d(w_b, w_out, 1) - self.c_bn = get_norm(norm, w_out) - self.c_bn.final_bn = True - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class ResBottleneckBlock(CNNBlockBase): - """Residual bottleneck block: x + f(x), f = bottleneck transform.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__(w_in, w_out, stride) - self.proj, self.bn = None, None - if (w_in != w_out) or (stride != 1): - self.proj = conv2d(w_in, w_out, 1, stride=stride) - self.bn = get_norm(norm, w_out) - self.f = BottleneckTransform(w_in, w_out, stride, norm, activation_class, params) - self.af = activation_class() - - def forward(self, x): - x_p = self.bn(self.proj(x)) if self.proj else x - return self.af(x_p + self.f(x)) - - -class AnyStage(nn.Module): - """AnyNet stage (sequence of blocks w/ the same output shape).""" - - def __init__(self, w_in, w_out, stride, d, block_class, norm, activation_class, params): - super().__init__() - for i in range(d): - block = block_class(w_in, w_out, stride, norm, activation_class, params) - self.add_module("b{}".format(i + 1), block) - stride, w_in = 1, w_out - - def forward(self, x): - for block in self.children(): - x = block(x) - return x - - -class AnyNet(Backbone): - """AnyNet model. See :paper:`dds`.""" - - def __init__( - self, - *, - stem_class, - stem_width, - block_class, - depths, - widths, - group_widths, - strides, - bottleneck_ratios, - se_ratio, - activation_class, - freeze_at=0, - norm="BN", - out_features=None, - ): - """ - Args: - stem_class (callable): A callable taking 4 arguments (channels in, channels out, - normalization, callable returning an activation function) that returns another - callable implementing the stem module. - stem_width (int): The number of output channels that the stem produces. - block_class (callable): A callable taking 6 arguments (channels in, channels out, - stride, normalization, callable returning an activation function, a dict of - block-specific parameters) that returns another callable implementing the repeated - block module. - depths (list[int]): Number of blocks in each stage. - widths (list[int]): For each stage, the number of output channels of each block. - group_widths (list[int]): For each stage, the number of channels per group in group - convolution, if the block uses group convolution. - strides (list[int]): The stride that each network stage applies to its input. - bottleneck_ratios (list[float]): For each stage, the ratio of the number of bottleneck - channels to the number of block input channels (or, equivalently, output channels), - if the block uses a bottleneck. - se_ratio (float): The ratio of the number of channels used inside the squeeze-excitation - (SE) module to it number of input channels, if SE the block uses SE. - activation_class (callable): A callable taking no arguments that returns another - callable implementing an activation function. - freeze_at (int): The number of stages at the beginning to freeze. - see :meth:`freeze` for detailed explanation. - norm (str or callable): normalization for all conv layers. 
- See :func:`layers.get_norm` for supported format. - out_features (list[str]): name of the layers whose outputs should - be returned in forward. RegNet's use "stem" and "s1", "s2", etc for the stages after - the stem. If None, will return the output of the last layer. - """ - super().__init__() - self.stem = stem_class(3, stem_width, norm, activation_class) - - current_stride = self.stem.stride - self._out_feature_strides = {"stem": current_stride} - self._out_feature_channels = {"stem": self.stem.out_channels} - self.stages_and_names = [] - prev_w = stem_width - - for i, (d, w, s, b, g) in enumerate( - zip(depths, widths, strides, bottleneck_ratios, group_widths) - ): - params = {"bot_mul": b, "group_w": g, "se_r": se_ratio} - stage = AnyStage(prev_w, w, s, d, block_class, norm, activation_class, params) - name = "s{}".format(i + 1) - self.add_module(name, stage) - self.stages_and_names.append((stage, name)) - self._out_feature_strides[name] = current_stride = int( - current_stride * np.prod([k.stride for k in stage.children()]) - ) - self._out_feature_channels[name] = list(stage.children())[-1].out_channels - prev_w = w - - self.apply(init_weights) - - if out_features is None: - out_features = [name] - self._out_features = out_features - assert len(self._out_features) - children = [x[0] for x in self.named_children()] - for out_feature in self._out_features: - assert out_feature in children, "Available children: {} does not include {}".format( - ", ".join(children), out_feature - ) - self.freeze(freeze_at) - - def forward(self, x): - """ - Args: - x: Tensor of shape (N,C,H,W). H, W must be a multiple of ``self.size_divisibility``. - - Returns: - dict[str->Tensor]: names and the corresponding features - """ - assert x.dim() == 4, f"Model takes an input of shape (N, C, H, W). Got {x.shape} instead!" - outputs = {} - x = self.stem(x) - if "stem" in self._out_features: - outputs["stem"] = x - for stage, name in self.stages_and_names: - x = stage(x) - if name in self._out_features: - outputs[name] = x - return outputs - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - def freeze(self, freeze_at=0): - """ - Freeze the first several stages of the model. Commonly used in fine-tuning. - - Layers that produce the same feature map spatial size are defined as one - "stage" by :paper:`FPN`. - - Args: - freeze_at (int): number of stages to freeze. - `1` means freezing the stem. `2` means freezing the stem and - one residual stage, etc. 
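# ---- illustrative sketch (not from the deleted sources) ----
# Stage-stride bookkeeping from AnyNet.__init__ above: the running stride is
# the stem stride times the product of every block stride seen so far.
import numpy as np

stem_stride = 2                      # SimpleStem downsamples once
block_strides = [2, 1, 1, 1]         # first block of s1 downsamples, rest don't
s1_stride = int(stem_stride * np.prod(block_strides))
assert s1_stride == 4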
- - Returns: - nn.Module: this model itself - """ - if freeze_at >= 1: - self.stem.freeze() - for idx, (stage, _) in enumerate(self.stages_and_names, start=2): - if freeze_at >= idx: - for block in stage.children(): - block.freeze() - return self - - -def adjust_block_compatibility(ws, bs, gs): - """Adjusts the compatibility of widths, bottlenecks, and groups.""" - assert len(ws) == len(bs) == len(gs) - assert all(w > 0 and b > 0 and g > 0 for w, b, g in zip(ws, bs, gs)) - vs = [int(max(1, w * b)) for w, b in zip(ws, bs)] - gs = [int(min(g, v)) for g, v in zip(gs, vs)] - ms = [np.lcm(g, b) if b > 1 else g for g, b in zip(gs, bs)] - vs = [max(m, int(round(v / m) * m)) for v, m in zip(vs, ms)] - ws = [int(v / b) for v, b in zip(vs, bs)] - assert all(w * b % g == 0 for w, b, g in zip(ws, bs, gs)) - return ws, bs, gs - - -def generate_regnet_parameters(w_a, w_0, w_m, d, q=8): - """Generates per stage widths and depths from RegNet parameters.""" - assert w_a >= 0 and w_0 > 0 and w_m > 1 and w_0 % q == 0 - # Generate continuous per-block ws - ws_cont = np.arange(d) * w_a + w_0 - # Generate quantized per-block ws - ks = np.round(np.log(ws_cont / w_0) / np.log(w_m)) - ws_all = w_0 * np.power(w_m, ks) - ws_all = np.round(np.divide(ws_all, q)).astype(int) * q - # Generate per stage ws and ds (assumes ws_all are sorted) - ws, ds = np.unique(ws_all, return_counts=True) - # Compute number of actual stages and total possible stages - num_stages, total_stages = len(ws), ks.max() + 1 - # Convert numpy arrays to lists and return - ws, ds, ws_all, ws_cont = (x.tolist() for x in (ws, ds, ws_all, ws_cont)) - return ws, ds, num_stages, total_stages, ws_all, ws_cont - - -class RegNet(AnyNet): - """RegNet model. See :paper:`dds`.""" - - def __init__( - self, - *, - stem_class, - stem_width, - block_class, - depth, - w_a, - w_0, - w_m, - group_width, - stride=2, - bottleneck_ratio=1.0, - se_ratio=0.0, - activation_class=None, - freeze_at=0, - norm="BN", - out_features=None, - ): - """ - Build a RegNet from the parameterization described in :paper:`dds` Section 3.3. - - Args: - See :class:`AnyNet` for arguments that are not listed here. - depth (int): Total number of blocks in the RegNet. - w_a (float): Factor by which block width would increase prior to quantizing block widths - by stage. See :paper:`dds` Section 3.3. - w_0 (int): Initial block width. See :paper:`dds` Section 3.3. - w_m (float): Parameter controlling block width quantization. - See :paper:`dds` Section 3.3. - group_width (int): Number of channels per group in group convolution, if the block uses - group convolution. - bottleneck_ratio (float): The ratio of the number of bottleneck channels to the number - of block input channels (or, equivalently, output channels), if the block uses a - bottleneck. - stride (int): The stride that each network stage applies to its input. 
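# ---- illustrative sketch (not from the deleted sources) ----
# generate_regnet_parameters above quantizes a linear width ramp into a few
# constant-width stages. Worked through with small numbers
# (w_a=8, w_0=16, w_m=2, d=6, q=8, chosen for readability):
import numpy as np

w_a, w_0, w_m, d, q = 8.0, 16, 2.0, 6, 8
ws_cont = np.arange(d) * w_a + w_0                  # [16 24 32 40 48 56]
ks = np.round(np.log(ws_cont / w_0) / np.log(w_m))  # [0 1 1 1 2 2]
ws_all = w_0 * np.power(w_m, ks)                    # quantized per-block widths
ws_all = np.round(np.divide(ws_all, q)).astype(int) * q
ws, ds = np.unique(ws_all, return_counts=True)
assert ws.tolist() == [16, 32, 64] and ds.tolist() == [1, 3, 2]  # 3 stages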
- """ - ws, ds = generate_regnet_parameters(w_a, w_0, w_m, depth)[0:2] - ss = [stride for _ in ws] - bs = [bottleneck_ratio for _ in ws] - gs = [group_width for _ in ws] - ws, bs, gs = adjust_block_compatibility(ws, bs, gs) - - def default_activation_class(): - return nn.ReLU(inplace=True) - - super().__init__( - stem_class=stem_class, - stem_width=stem_width, - block_class=block_class, - depths=ds, - widths=ws, - strides=ss, - group_widths=gs, - bottleneck_ratios=bs, - se_ratio=se_ratio, - activation_class=default_activation_class - if activation_class is None - else activation_class, - freeze_at=freeze_at, - norm=norm, - out_features=out_features, - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py deleted file mode 100755 index 5b8e842c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py +++ /dev/null @@ -1,694 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import ( - CNNBlockBase, - Conv2d, - DeformConv, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from .backbone import Backbone -from .build import BACKBONE_REGISTRY - -__all__ = [ - "ResNetBlockBase", - "BasicBlock", - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", - "ResNet", - "make_stage", - "build_resnet_backbone", -] - - -class BasicBlock(CNNBlockBase): - """ - The basic residual block for ResNet-18 and ResNet-34 defined in :paper:`ResNet`, - with two 3x3 conv layers and a projection shortcut if needed. - """ - - def __init__(self, in_channels, out_channels, *, stride=1, norm="BN"): - """ - Args: - in_channels (int): Number of input channels. - out_channels (int): Number of output channels. - stride (int): Stride for the first conv. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=stride, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - self.conv2 = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - out = self.conv2(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BottleneckBlock(CNNBlockBase): - """ - The standard bottleneck residual block used by ResNet-50, 101 and 152 - defined in :paper:`ResNet`. It contains 3 conv layers with kernels - 1x1, 3x3, 1x1, and a projection shortcut if needed. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - ): - """ - Args: - bottleneck_channels (int): number of output channels for the 3x3 - "bottleneck" conv layers. - num_groups (int): number of groups for the 3x3 conv layer. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - stride_in_1x1 (bool): when stride>1, whether to put stride in the - first 1x1 convolution or the bottleneck 3x3 convolution. - dilation (int): the dilation rate of the 3x3 conv layer. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - # The original MSRA ResNet models have stride in the first 1x1 conv - # The subsequent fb.torch.resnet and Caffe2 ResNe[X]t implementations have - # stride in the 3x3 conv - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv2 = Conv2d( - bottleneck_channels, - bottleneck_channels, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - # Zero-initialize the last normalization in each residual branch, - # so that at the beginning, the residual branch starts with zeros, - # and each residual block behaves like an identity. - # See Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "For BN layers, the learnable scaling coefficient γ is initialized - # to be 1, except for each residual block's last BN - # where γ is initialized to be 0." - - # nn.init.constant_(self.conv3.norm.weight, 0) - # TODO this somehow hurts performance when training GN models from scratch. - # Add it as an option when we need to use this code to train a backbone. - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - out = self.conv2(out) - out = F.relu_(out) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class DeformBottleneckBlock(CNNBlockBase): - """ - Similar to :class:`BottleneckBlock`, but with :paper:`deformable conv ` - in the 3x3 convolution. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - deform_modulated=False, - deform_num_groups=1, - ): - super().__init__(in_channels, out_channels, stride) - self.deform_modulated = deform_modulated - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - if deform_modulated: - deform_conv_op = ModulatedDeformConv - # offset channels are 2 or 3 (if with modulated) * kernel_size * kernel_size - offset_channels = 27 - else: - deform_conv_op = DeformConv - offset_channels = 18 - - self.conv2_offset = Conv2d( - bottleneck_channels, - offset_channels * deform_num_groups, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - dilation=dilation, - ) - self.conv2 = deform_conv_op( - bottleneck_channels, - bottleneck_channels, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - deformable_groups=deform_num_groups, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - nn.init.constant_(self.conv2_offset.weight, 0) - nn.init.constant_(self.conv2_offset.bias, 0) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - if self.deform_modulated: - offset_mask = self.conv2_offset(out) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - out = self.conv2(out, offset, mask) - else: - offset = self.conv2_offset(out) - out = self.conv2(out, offset) - out = F.relu_(out) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BasicStem(CNNBlockBase): - """ - The standard ResNet stem (layers before the first residual block), - with a conv, relu and max_pool. - """ - - def __init__(self, in_channels=3, out_channels=64, norm="BN"): - """ - Args: - norm (str or callable): norm after the first conv layer. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, 4) - self.in_channels = in_channels - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=7, - stride=2, - padding=3, - bias=False, - norm=get_norm(norm, out_channels), - ) - weight_init.c2_msra_fill(self.conv1) - - def forward(self, x): - x = self.conv1(x) - x = F.relu_(x) - x = F.max_pool2d(x, kernel_size=3, stride=2, padding=1) - return x - - -class ResNet(Backbone): - """ - Implement :paper:`ResNet`. - """ - - def __init__(self, stem, stages, num_classes=None, out_features=None, freeze_at=0): - """ - Args: - stem (nn.Module): a stem module - stages (list[list[CNNBlockBase]]): several (typically 4) stages, - each contains multiple :class:`CNNBlockBase`. 
- num_classes (None or int): if None, will not perform classification. - Otherwise, will create a linear layer. - out_features (list[str]): name of the layers whose outputs should - be returned in forward. Can be anything in "stem", "linear", or "res2" ... - If None, will return the output of the last layer. - freeze_at (int): The number of stages at the beginning to freeze. - see :meth:`freeze` for detailed explanation. - """ - super().__init__() - self.stem = stem - self.num_classes = num_classes - - current_stride = self.stem.stride - self._out_feature_strides = {"stem": current_stride} - self._out_feature_channels = {"stem": self.stem.out_channels} - - self.stage_names, self.stages = [], [] - - if out_features is not None: - # Avoid keeping unused layers in this module. They consume extra memory - # and may cause allreduce to fail - num_stages = max( - [{"res2": 1, "res3": 2, "res4": 3, "res5": 4}.get(f, 0) for f in out_features] - ) - stages = stages[:num_stages] - for i, blocks in enumerate(stages): - assert len(blocks) > 0, len(blocks) - for block in blocks: - assert isinstance(block, CNNBlockBase), block - - name = "res" + str(i + 2) - stage = nn.Sequential(*blocks) - - self.add_module(name, stage) - self.stage_names.append(name) - self.stages.append(stage) - - self._out_feature_strides[name] = current_stride = int( - current_stride * np.prod([k.stride for k in blocks]) - ) - self._out_feature_channels[name] = curr_channels = blocks[-1].out_channels - self.stage_names = tuple(self.stage_names) # Make it static for scripting - - if num_classes is not None: - self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) - self.linear = nn.Linear(curr_channels, num_classes) - - # Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "The 1000-way fully-connected layer is initialized by - # drawing weights from a zero-mean Gaussian with standard deviation of 0.01." - nn.init.normal_(self.linear.weight, std=0.01) - name = "linear" - - if out_features is None: - out_features = [name] - self._out_features = out_features - assert len(self._out_features) - children = [x[0] for x in self.named_children()] - for out_feature in self._out_features: - assert out_feature in children, "Available children: {}".format(", ".join(children)) - self.freeze(freeze_at) - - def forward(self, x): - """ - Args: - x: Tensor of shape (N,C,H,W). H, W must be a multiple of ``self.size_divisibility``. - - Returns: - dict[str->Tensor]: names and the corresponding features - """ - assert x.dim() == 4, f"ResNet takes an input of shape (N, C, H, W). Got {x.shape} instead!" - outputs = {} - x = self.stem(x) - if "stem" in self._out_features: - outputs["stem"] = x - for name, stage in zip(self.stage_names, self.stages): - x = stage(x) - if name in self._out_features: - outputs[name] = x - if self.num_classes is not None: - x = self.avgpool(x) - x = torch.flatten(x, 1) - x = self.linear(x) - if "linear" in self._out_features: - outputs["linear"] = x - return outputs - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - def freeze(self, freeze_at=0): - """ - Freeze the first several stages of the ResNet. Commonly used in - fine-tuning. - - Layers that produce the same feature map spatial size are defined as one - "stage" by :paper:`FPN`. - - Args: - freeze_at (int): number of stages to freeze. - `1` means freezing the stem. `2` means freezing the stem and - one residual stage, etc. 
-
- Returns:
- nn.Module: this ResNet itself
- """
- if freeze_at >= 1:
- self.stem.freeze()
- for idx, stage in enumerate(self.stages, start=2):
- if freeze_at >= idx:
- for block in stage.children():
- block.freeze()
- return self
-
- @staticmethod
- def make_stage(block_class, num_blocks, *, in_channels, out_channels, **kwargs):
- """
- Create a list of blocks of the same type that forms one ResNet stage.
-
- Args:
- block_class (type): a subclass of CNNBlockBase that's used to create all blocks in this
- stage. A module of this type must not change spatial resolution of inputs unless its
- stride != 1.
- num_blocks (int): number of blocks in this stage
- in_channels (int): input channels of the entire stage.
- out_channels (int): output channels of **every block** in the stage.
- kwargs: other arguments passed to the constructor of
- `block_class`. If the argument name is "xx_per_block", the
- argument is a list of values to be passed to each block in the
- stage. Otherwise, the same argument is passed to every block
- in the stage.
-
- Returns:
- list[CNNBlockBase]: a list of block modules.
-
- Examples:
- ::
- stage = ResNet.make_stage(
- BottleneckBlock, 3, in_channels=16, out_channels=64,
- bottleneck_channels=16, num_groups=1,
- stride_per_block=[2, 1, 1],
- dilations_per_block=[1, 1, 2]
- )
-
- Usually, layers that produce the same feature map spatial size are defined as one
- "stage" (in :paper:`FPN`). Under such definition, ``stride_per_block[1:]`` should
- all be 1.
- """
- blocks = []
- for i in range(num_blocks):
- curr_kwargs = {}
- for k, v in kwargs.items():
- if k.endswith("_per_block"):
- assert len(v) == num_blocks, (
- f"Argument '{k}' of make_stage should have the "
- f"same length as num_blocks={num_blocks}."
- )
- newk = k[: -len("_per_block")]
- assert newk not in kwargs, f"Cannot call make_stage with both {k} and {newk}!"
- curr_kwargs[newk] = v[i]
- else:
- curr_kwargs[k] = v
-
- blocks.append(
- block_class(in_channels=in_channels, out_channels=out_channels, **curr_kwargs)
- )
- in_channels = out_channels
- return blocks
-
- @staticmethod
- def make_default_stages(depth, block_class=None, **kwargs):
- """
- Create a list of ResNet stages from a pre-defined depth (one of 18, 34, 50, 101, 152).
- If it doesn't create the ResNet variant you need, please use :meth:`make_stage`
- instead for fine-grained customization.
-
- Args:
- depth (int): depth of ResNet
- block_class (type): the CNN block class. Has to accept
- `bottleneck_channels` argument for depth > 50.
- By default it is BasicBlock or BottleneckBlock, based on the
- depth.
- kwargs:
- other arguments to pass to `make_stage`. Should not contain
- stride and channels, as they are predefined for each depth.
-
- Returns:
- list[list[CNNBlockBase]]: modules in all stages; see arguments of
- :class:`ResNet.__init__`.
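-
- Example (editor's illustrative sketch; the norm choice is arbitrary)::
-
- stages = ResNet.make_default_stages(50, norm="FrozenBN")
- model = ResNet(BasicStem(norm="FrozenBN"), stages, out_features=["res4", "res5"])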
- """ - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - if block_class is None: - block_class = BasicBlock if depth < 50 else BottleneckBlock - if depth < 50: - in_channels = [64, 64, 128, 256] - out_channels = [64, 128, 256, 512] - else: - in_channels = [64, 256, 512, 1024] - out_channels = [256, 512, 1024, 2048] - ret = [] - for (n, s, i, o) in zip(num_blocks_per_stage, [1, 2, 2, 2], in_channels, out_channels): - if depth >= 50: - kwargs["bottleneck_channels"] = o // 4 - ret.append( - ResNet.make_stage( - block_class=block_class, - num_blocks=n, - stride_per_block=[s] + [1] * (n - 1), - in_channels=i, - out_channels=o, - **kwargs, - ) - ) - return ret - - -ResNetBlockBase = CNNBlockBase -""" -Alias for backward compatibiltiy. -""" - - -def make_stage(*args, **kwargs): - """ - Deprecated alias for backward compatibiltiy. - """ - return ResNet.make_stage(*args, **kwargs) - - -@BACKBONE_REGISTRY.register() -def build_resnet_backbone(cfg, input_shape): - """ - Create a ResNet instance from config. - - Returns: - ResNet: a :class:`ResNet` instance. - """ - # need registration of new blocks/stems? - norm = cfg.MODEL.RESNETS.NORM - stem = BasicStem( - in_channels=input_shape.channels, - out_channels=cfg.MODEL.RESNETS.STEM_OUT_CHANNELS, - norm=norm, - ) - - # fmt: off - freeze_at = cfg.MODEL.BACKBONE.FREEZE_AT - out_features = cfg.MODEL.RESNETS.OUT_FEATURES - depth = cfg.MODEL.RESNETS.DEPTH - num_groups = cfg.MODEL.RESNETS.NUM_GROUPS - width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP - bottleneck_channels = num_groups * width_per_group - in_channels = cfg.MODEL.RESNETS.STEM_OUT_CHANNELS - out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS - stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 - res5_dilation = cfg.MODEL.RESNETS.RES5_DILATION - deform_on_per_stage = cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE - deform_modulated = cfg.MODEL.RESNETS.DEFORM_MODULATED - deform_num_groups = cfg.MODEL.RESNETS.DEFORM_NUM_GROUPS - # fmt: on - assert res5_dilation in {1, 2}, "res5_dilation cannot be {}.".format(res5_dilation) - - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - - if depth in [18, 34]: - assert out_channels == 64, "Must set MODEL.RESNETS.RES2_OUT_CHANNELS = 64 for R18/R34" - assert not any( - deform_on_per_stage - ), "MODEL.RESNETS.DEFORM_ON_PER_STAGE unsupported for R18/R34" - assert res5_dilation == 1, "Must set MODEL.RESNETS.RES5_DILATION = 1 for R18/R34" - assert num_groups == 1, "Must set MODEL.RESNETS.NUM_GROUPS = 1 for R18/R34" - - stages = [] - - for idx, stage_idx in enumerate(range(2, 6)): - # res5_dilation is used this way as a convention in R-FCN & Deformable Conv paper - dilation = res5_dilation if stage_idx == 5 else 1 - first_stride = 1 if idx == 0 or (stage_idx == 5 and dilation == 2) else 2 - stage_kargs = { - "num_blocks": num_blocks_per_stage[idx], - "stride_per_block": [first_stride] + [1] * (num_blocks_per_stage[idx] - 1), - "in_channels": in_channels, - "out_channels": out_channels, - "norm": norm, - } - # Use BasicBlock for R18 and R34. 
- if depth in [18, 34]: - stage_kargs["block_class"] = BasicBlock - else: - stage_kargs["bottleneck_channels"] = bottleneck_channels - stage_kargs["stride_in_1x1"] = stride_in_1x1 - stage_kargs["dilation"] = dilation - stage_kargs["num_groups"] = num_groups - if deform_on_per_stage[idx]: - stage_kargs["block_class"] = DeformBottleneckBlock - stage_kargs["deform_modulated"] = deform_modulated - stage_kargs["deform_num_groups"] = deform_num_groups - else: - stage_kargs["block_class"] = BottleneckBlock - blocks = ResNet.make_stage(**stage_kargs) - in_channels = out_channels - out_channels *= 2 - bottleneck_channels *= 2 - stages.append(blocks) - return ResNet(stem, stages, out_features=out_features, freeze_at=freeze_at) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/box_regression.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/box_regression.py deleted file mode 100755 index b24c123f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/box_regression.py +++ /dev/null @@ -1,369 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from typing import List, Tuple, Union -import torch -from fvcore.nn import giou_loss, smooth_l1_loss -from torch.nn import functional as F - -from detectron2.layers import cat, ciou_loss, diou_loss -from detectron2.structures import Boxes - -# Value for clamping large dw and dh predictions. The heuristic is that we clamp -# such that dw and dh are no larger than what would transform a 16px box into a -# 1000px box (based on a small anchor, 16px, and a typical image size, 1000px). -_DEFAULT_SCALE_CLAMP = math.log(1000.0 / 16) - - -__all__ = ["Box2BoxTransform", "Box2BoxTransformRotated", "Box2BoxTransformLinear"] - - -@torch.jit.script -class Box2BoxTransform(object): - """ - The box-to-box transform defined in R-CNN. The transformation is parameterized - by 4 deltas: (dx, dy, dw, dh). The transformation scales the box's width and height - by exp(dw), exp(dh) and shifts a box's center by the offset (dx * width, dy * height). - """ - - def __init__( - self, weights: Tuple[float, float, float, float], scale_clamp: float = _DEFAULT_SCALE_CLAMP - ): - """ - Args: - weights (4-element tuple): Scaling factors that are applied to the - (dx, dy, dw, dh) deltas. In Fast R-CNN, these were originally set - such that the deltas have unit variance; now they are treated as - hyperparameters of the system. - scale_clamp (float): When predicting deltas, the predicted box scaling - factors (dw and dh) are clamped such that they are <= scale_clamp. - """ - self.weights = weights - self.scale_clamp = scale_clamp - - def get_deltas(self, src_boxes, target_boxes): - """ - Get box regression transformation deltas (dx, dy, dw, dh) that can be used - to transform the `src_boxes` into the `target_boxes`. That is, the relation - ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true (unless - any delta is too large and is clamped). - - Args: - src_boxes (Tensor): source boxes, e.g., object proposals - target_boxes (Tensor): target of the transformation, e.g., ground-truth - boxes. 
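-
- Example (editor's illustrative sketch; unit weights, hand-picked boxes)::
-
- tfm = Box2BoxTransform(weights=(1.0, 1.0, 1.0, 1.0))
- src = torch.tensor([[0.0, 0.0, 10.0, 10.0]])
- tgt = torch.tensor([[5.0, 0.0, 15.0, 10.0]])
- tfm.get_deltas(src, tgt) # tensor([[0.5, 0., 0., 0.]]): shift ctr_x by half a width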
- """ - assert isinstance(src_boxes, torch.Tensor), type(src_boxes) - assert isinstance(target_boxes, torch.Tensor), type(target_boxes) - - src_widths = src_boxes[:, 2] - src_boxes[:, 0] - src_heights = src_boxes[:, 3] - src_boxes[:, 1] - src_ctr_x = src_boxes[:, 0] + 0.5 * src_widths - src_ctr_y = src_boxes[:, 1] + 0.5 * src_heights - - target_widths = target_boxes[:, 2] - target_boxes[:, 0] - target_heights = target_boxes[:, 3] - target_boxes[:, 1] - target_ctr_x = target_boxes[:, 0] + 0.5 * target_widths - target_ctr_y = target_boxes[:, 1] + 0.5 * target_heights - - wx, wy, ww, wh = self.weights - dx = wx * (target_ctr_x - src_ctr_x) / src_widths - dy = wy * (target_ctr_y - src_ctr_y) / src_heights - dw = ww * torch.log(target_widths / src_widths) - dh = wh * torch.log(target_heights / src_heights) - - deltas = torch.stack((dx, dy, dw, dh), dim=1) - assert (src_widths > 0).all().item(), "Input boxes to Box2BoxTransform are not valid!" - return deltas - - def apply_deltas(self, deltas, boxes): - """ - Apply transformation `deltas` (dx, dy, dw, dh) to `boxes`. - - Args: - deltas (Tensor): transformation deltas of shape (N, k*4), where k >= 1. - deltas[i] represents k potentially different class-specific - box transformations for the single box boxes[i]. - boxes (Tensor): boxes to transform, of shape (N, 4) - """ - deltas = deltas.float() # ensure fp32 for decoding precision - boxes = boxes.to(deltas.dtype) - - widths = boxes[:, 2] - boxes[:, 0] - heights = boxes[:, 3] - boxes[:, 1] - ctr_x = boxes[:, 0] + 0.5 * widths - ctr_y = boxes[:, 1] + 0.5 * heights - - wx, wy, ww, wh = self.weights - dx = deltas[:, 0::4] / wx - dy = deltas[:, 1::4] / wy - dw = deltas[:, 2::4] / ww - dh = deltas[:, 3::4] / wh - - # Prevent sending too large values into torch.exp() - dw = torch.clamp(dw, max=self.scale_clamp) - dh = torch.clamp(dh, max=self.scale_clamp) - - pred_ctr_x = dx * widths[:, None] + ctr_x[:, None] - pred_ctr_y = dy * heights[:, None] + ctr_y[:, None] - pred_w = torch.exp(dw) * widths[:, None] - pred_h = torch.exp(dh) * heights[:, None] - - x1 = pred_ctr_x - 0.5 * pred_w - y1 = pred_ctr_y - 0.5 * pred_h - x2 = pred_ctr_x + 0.5 * pred_w - y2 = pred_ctr_y + 0.5 * pred_h - pred_boxes = torch.stack((x1, y1, x2, y2), dim=-1) - return pred_boxes.reshape(deltas.shape) - - -@torch.jit.script -class Box2BoxTransformRotated(object): - """ - The box-to-box transform defined in Rotated R-CNN. The transformation is parameterized - by 5 deltas: (dx, dy, dw, dh, da). The transformation scales the box's width and height - by exp(dw), exp(dh), shifts a box's center by the offset (dx * width, dy * height), - and rotate a box's angle by da (radians). - Note: angles of deltas are in radians while angles of boxes are in degrees. - """ - - def __init__( - self, - weights: Tuple[float, float, float, float, float], - scale_clamp: float = _DEFAULT_SCALE_CLAMP, - ): - """ - Args: - weights (5-element tuple): Scaling factors that are applied to the - (dx, dy, dw, dh, da) deltas. These are treated as - hyperparameters of the system. - scale_clamp (float): When predicting deltas, the predicted box scaling - factors (dw and dh) are clamped such that they are <= scale_clamp. - """ - self.weights = weights - self.scale_clamp = scale_clamp - - def get_deltas(self, src_boxes, target_boxes): - """ - Get box regression transformation deltas (dx, dy, dw, dh, da) that can be used - to transform the `src_boxes` into the `target_boxes`. 
That is, the relation - ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true (unless - any delta is too large and is clamped). - - Args: - src_boxes (Tensor): Nx5 source boxes, e.g., object proposals - target_boxes (Tensor): Nx5 target of the transformation, e.g., ground-truth - boxes. - """ - assert isinstance(src_boxes, torch.Tensor), type(src_boxes) - assert isinstance(target_boxes, torch.Tensor), type(target_boxes) - - src_ctr_x, src_ctr_y, src_widths, src_heights, src_angles = torch.unbind(src_boxes, dim=1) - - target_ctr_x, target_ctr_y, target_widths, target_heights, target_angles = torch.unbind( - target_boxes, dim=1 - ) - - wx, wy, ww, wh, wa = self.weights - dx = wx * (target_ctr_x - src_ctr_x) / src_widths - dy = wy * (target_ctr_y - src_ctr_y) / src_heights - dw = ww * torch.log(target_widths / src_widths) - dh = wh * torch.log(target_heights / src_heights) - # Angles of deltas are in radians while angles of boxes are in degrees. - # the conversion to radians serve as a way to normalize the values - da = target_angles - src_angles - da = (da + 180.0) % 360.0 - 180.0 # make it in [-180, 180) - da *= wa * math.pi / 180.0 - - deltas = torch.stack((dx, dy, dw, dh, da), dim=1) - assert ( - (src_widths > 0).all().item() - ), "Input boxes to Box2BoxTransformRotated are not valid!" - return deltas - - def apply_deltas(self, deltas, boxes): - """ - Apply transformation `deltas` (dx, dy, dw, dh, da) to `boxes`. - - Args: - deltas (Tensor): transformation deltas of shape (N, k*5). - deltas[i] represents box transformation for the single box boxes[i]. - boxes (Tensor): boxes to transform, of shape (N, 5) - """ - assert deltas.shape[1] % 5 == 0 and boxes.shape[1] == 5 - - boxes = boxes.to(deltas.dtype).unsqueeze(2) - - ctr_x = boxes[:, 0] - ctr_y = boxes[:, 1] - widths = boxes[:, 2] - heights = boxes[:, 3] - angles = boxes[:, 4] - - wx, wy, ww, wh, wa = self.weights - - dx = deltas[:, 0::5] / wx - dy = deltas[:, 1::5] / wy - dw = deltas[:, 2::5] / ww - dh = deltas[:, 3::5] / wh - da = deltas[:, 4::5] / wa - - # Prevent sending too large values into torch.exp() - dw = torch.clamp(dw, max=self.scale_clamp) - dh = torch.clamp(dh, max=self.scale_clamp) - - pred_boxes = torch.zeros_like(deltas) - pred_boxes[:, 0::5] = dx * widths + ctr_x # x_ctr - pred_boxes[:, 1::5] = dy * heights + ctr_y # y_ctr - pred_boxes[:, 2::5] = torch.exp(dw) * widths # width - pred_boxes[:, 3::5] = torch.exp(dh) * heights # height - - # Following original RRPN implementation, - # angles of deltas are in radians while angles of boxes are in degrees. - pred_angle = da * 180.0 / math.pi + angles - pred_angle = (pred_angle + 180.0) % 360.0 - 180.0 # make it in [-180, 180) - - pred_boxes[:, 4::5] = pred_angle - - return pred_boxes - - -class Box2BoxTransformLinear(object): - """ - The linear box-to-box transform defined in FCOS. The transformation is parameterized - by the distance from the center of (square) src box to 4 edges of the target box. - """ - - def __init__(self, normalize_by_size=True): - """ - Args: - normalize_by_size: normalize deltas by the size of src (anchor) boxes. - """ - self.normalize_by_size = normalize_by_size - - def get_deltas(self, src_boxes, target_boxes): - """ - Get box regression transformation deltas (dx1, dy1, dx2, dy2) that can be used - to transform the `src_boxes` into the `target_boxes`. That is, the relation - ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true. - The center of src must be inside target boxes. 
- - Args: - src_boxes (Tensor): square source boxes, e.g., anchors - target_boxes (Tensor): target of the transformation, e.g., ground-truth - boxes. - """ - assert isinstance(src_boxes, torch.Tensor), type(src_boxes) - assert isinstance(target_boxes, torch.Tensor), type(target_boxes) - - src_ctr_x = 0.5 * (src_boxes[:, 0] + src_boxes[:, 2]) - src_ctr_y = 0.5 * (src_boxes[:, 1] + src_boxes[:, 3]) - - target_l = src_ctr_x - target_boxes[:, 0] - target_t = src_ctr_y - target_boxes[:, 1] - target_r = target_boxes[:, 2] - src_ctr_x - target_b = target_boxes[:, 3] - src_ctr_y - - deltas = torch.stack((target_l, target_t, target_r, target_b), dim=1) - if self.normalize_by_size: - stride_w = src_boxes[:, 2] - src_boxes[:, 0] - stride_h = src_boxes[:, 3] - src_boxes[:, 1] - strides = torch.stack([stride_w, stride_h, stride_w, stride_h], axis=1) - deltas = deltas / strides - - return deltas - - def apply_deltas(self, deltas, boxes): - """ - Apply transformation `deltas` (dx1, dy1, dx2, dy2) to `boxes`. - - Args: - deltas (Tensor): transformation deltas of shape (N, k*4), where k >= 1. - deltas[i] represents k potentially different class-specific - box transformations for the single box boxes[i]. - boxes (Tensor): boxes to transform, of shape (N, 4) - """ - # Ensure the output is a valid box. See Sec 2.1 of https://arxiv.org/abs/2006.09214 - deltas = F.relu(deltas) - boxes = boxes.to(deltas.dtype) - - ctr_x = 0.5 * (boxes[:, 0] + boxes[:, 2]) - ctr_y = 0.5 * (boxes[:, 1] + boxes[:, 3]) - if self.normalize_by_size: - stride_w = boxes[:, 2] - boxes[:, 0] - stride_h = boxes[:, 3] - boxes[:, 1] - strides = torch.stack([stride_w, stride_h, stride_w, stride_h], axis=1) - deltas = deltas * strides - - l = deltas[:, 0::4] - t = deltas[:, 1::4] - r = deltas[:, 2::4] - b = deltas[:, 3::4] - - pred_boxes = torch.zeros_like(deltas) - pred_boxes[:, 0::4] = ctr_x[:, None] - l # x1 - pred_boxes[:, 1::4] = ctr_y[:, None] - t # y1 - pred_boxes[:, 2::4] = ctr_x[:, None] + r # x2 - pred_boxes[:, 3::4] = ctr_y[:, None] + b # y2 - return pred_boxes - - -def _dense_box_regression_loss( - anchors: List[Union[Boxes, torch.Tensor]], - box2box_transform: Box2BoxTransform, - pred_anchor_deltas: List[torch.Tensor], - gt_boxes: List[torch.Tensor], - fg_mask: torch.Tensor, - box_reg_loss_type="smooth_l1", - smooth_l1_beta=0.0, -): - """ - Compute loss for dense multi-level box regression. - Loss is accumulated over ``fg_mask``. - - Args: - anchors: #lvl anchor boxes, each is (HixWixA, 4) - pred_anchor_deltas: #lvl predictions, each is (N, HixWixA, 4) - gt_boxes: N ground truth boxes, each has shape (R, 4) (R = sum(Hi * Wi * A)) - fg_mask: the foreground boolean mask of shape (N, R) to compute loss on - box_reg_loss_type (str): Loss type to use. Supported losses: "smooth_l1", "giou", - "diou", "ciou". - smooth_l1_beta (float): beta parameter for the smooth L1 regression loss. Default to - use L1 loss. 
Only used when `box_reg_loss_type` is "smooth_l1" - """ - if isinstance(anchors[0], Boxes): - anchors = type(anchors[0]).cat(anchors).tensor # (R, 4) - else: - anchors = cat(anchors) - if box_reg_loss_type == "smooth_l1": - gt_anchor_deltas = [box2box_transform.get_deltas(anchors, k) for k in gt_boxes] - gt_anchor_deltas = torch.stack(gt_anchor_deltas) # (N, R, 4) - loss_box_reg = smooth_l1_loss( - cat(pred_anchor_deltas, dim=1)[fg_mask], - gt_anchor_deltas[fg_mask], - beta=smooth_l1_beta, - reduction="sum", - ) - elif box_reg_loss_type == "giou": - pred_boxes = [ - box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1) - ] - loss_box_reg = giou_loss( - torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum" - ) - elif box_reg_loss_type == "diou": - pred_boxes = [ - box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1) - ] - loss_box_reg = diou_loss( - torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum" - ) - elif box_reg_loss_type == "ciou": - pred_boxes = [ - box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1) - ] - loss_box_reg = ciou_loss( - torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum" - ) - else: - raise ValueError(f"Invalid dense box regression loss type '{box_reg_loss_type}'") - return loss_box_reg diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/matcher.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/matcher.py deleted file mode 100755 index c7597cab..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/matcher.py +++ /dev/null @@ -1,127 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from typing import List -import torch - -from detectron2.layers import nonzero_tuple - - -# TODO: the name is too general -class Matcher(object): - """ - This class assigns to each predicted "element" (e.g., a box) a ground-truth - element. Each predicted element will have exactly zero or one matches; each - ground-truth element may be matched to zero or more predicted elements. - - The matching is determined by the MxN match_quality_matrix, that characterizes - how well each (ground-truth, prediction)-pair match each other. For example, - if the elements are boxes, this matrix may contain box intersection-over-union - overlap values. - - The matcher returns (a) a vector of length N containing the index of the - ground-truth element m in [0, M) that matches to prediction n in [0, N). - (b) a vector of length N containing the labels for each prediction. - """ - - def __init__( - self, thresholds: List[float], labels: List[int], allow_low_quality_matches: bool = False - ): - """ - Args: - thresholds (list): a list of thresholds used to stratify predictions - into levels. - labels (list): a list of values to label predictions belonging at - each level. A label can be one of {-1, 0, 1} signifying - {ignore, negative class, positive class}, respectively. - allow_low_quality_matches (bool): if True, produce additional matches - for predictions with maximum match quality lower than high_threshold. - See set_low_quality_matches_ for more details. - - For example, - thresholds = [0.3, 0.5] - labels = [0, -1, 1] - All predictions with iou < 0.3 will be marked with 0 and - thus will be considered as false positives while training. 
- All predictions with 0.3 <= iou < 0.5 will be marked with -1 and
- thus will be ignored.
- All predictions with 0.5 <= iou will be marked with 1 and
- thus will be considered as true positives.
- """
- # Add -inf and +inf to first and last position in thresholds
- thresholds = thresholds[:]
- assert thresholds[0] > 0
- thresholds.insert(0, -float("inf"))
- thresholds.append(float("inf"))
- # Currently torchscript does not support all + generator
- assert all([low <= high for (low, high) in zip(thresholds[:-1], thresholds[1:])])
- assert all([l in [-1, 0, 1] for l in labels])
- assert len(labels) == len(thresholds) - 1
- self.thresholds = thresholds
- self.labels = labels
- self.allow_low_quality_matches = allow_low_quality_matches
-
- def __call__(self, match_quality_matrix):
- """
- Args:
- match_quality_matrix (Tensor[float]): an MxN tensor, containing the
- pairwise quality between M ground-truth elements and N predicted
- elements. All elements must be >= 0 (due to the use of `torch.nonzero`
- for selecting indices in :meth:`set_low_quality_matches_`).
-
- Returns:
- matches (Tensor[int64]): a vector of length N, where matches[i] is a matched
- ground-truth index in [0, M)
- match_labels (Tensor[int8]): a vector of length N, where match_labels[i] indicates
- whether a prediction is a true or false positive or ignored
- """
- assert match_quality_matrix.dim() == 2
- if match_quality_matrix.numel() == 0:
- default_matches = match_quality_matrix.new_full(
- (match_quality_matrix.size(1),), 0, dtype=torch.int64
- )
- # When no gt boxes exist, we define IOU = 0 and therefore set labels
- # to `self.labels[0]`, which usually defaults to background class 0
- # To ignore instead, one can use labels=[-1,0,-1,1] and set appropriate thresholds
- default_match_labels = match_quality_matrix.new_full(
- (match_quality_matrix.size(1),), self.labels[0], dtype=torch.int8
- )
- return default_matches, default_match_labels
-
- assert torch.all(match_quality_matrix >= 0)
-
- # match_quality_matrix is M (gt) x N (predicted)
- # Max over gt elements (dim 0) to find best gt candidate for each prediction
- matched_vals, matches = match_quality_matrix.max(dim=0)
-
- match_labels = matches.new_full(matches.size(), 1, dtype=torch.int8)
-
- for (l, low, high) in zip(self.labels, self.thresholds[:-1], self.thresholds[1:]):
- low_high = (matched_vals >= low) & (matched_vals < high)
- match_labels[low_high] = l
-
- if self.allow_low_quality_matches:
- self.set_low_quality_matches_(match_labels, match_quality_matrix)
-
- return matches, match_labels
-
- def set_low_quality_matches_(self, match_labels, match_quality_matrix):
- """
- Produce additional matches for predictions that have only low-quality matches.
- Specifically, for each ground-truth G find the set of predictions that have
- maximum overlap with it (including ties); for each prediction in that set, if
- it is unmatched, then match it to the ground-truth G.
-
- This function implements the RPN assignment case (i) in Sec. 3.1.2 of
- :paper:`Faster R-CNN`.
- """
- # For each gt, find the prediction with which it has highest quality
- highest_quality_foreach_gt, _ = match_quality_matrix.max(dim=1)
- # Find the highest quality match available, even if it is low, including ties.
- # Note that the match qualities must be positive due to the use of
- # `torch.nonzero`.
- _, pred_inds_with_highest_quality = nonzero_tuple(
- match_quality_matrix == highest_quality_foreach_gt[:, None]
- )
- # If an anchor was labeled positive only due to a low-quality match
- # with gt_A, but it has larger overlap with gt_B, its matched index will still be gt_B.
- # This follows the implementation in Detectron, and is found to have no significant impact.
- match_labels[pred_inds_with_highest_quality] = 1
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py
deleted file mode 100755
index 6b066815..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-from .build import META_ARCH_REGISTRY, build_model # isort:skip
-
-from .panoptic_fpn import PanopticFPN
-
-# import all the meta_arch, so they will be registered
-from .rcnn import GeneralizedRCNN, ProposalNetwork
-from .dense_detector import DenseDetector
-from .retinanet import RetinaNet
-from .fcos import FCOS
-from .semantic_seg import SEM_SEG_HEADS_REGISTRY, SemanticSegmentor, build_sem_seg_head
-
-
-__all__ = list(globals().keys())
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py
deleted file mode 100755
index 34272157..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py
+++ /dev/null
@@ -1,25 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import torch
-
-from detectron2.utils.logger import _log_api_usage
-from detectron2.utils.registry import Registry
-
-META_ARCH_REGISTRY = Registry("META_ARCH") # noqa F401 isort:skip
-META_ARCH_REGISTRY.__doc__ = """
-Registry for meta-architectures, i.e. the whole model.
-
-The registered object will be called with `obj(cfg)`
-and expected to return a `nn.Module` object.
-"""
-
-
-def build_model(cfg):
- """
- Build the whole model architecture, defined by ``cfg.MODEL.META_ARCHITECTURE``.
- Note that it does not load any weights from ``cfg``.
- """
- meta_arch = cfg.MODEL.META_ARCHITECTURE
- model = META_ARCH_REGISTRY.get(meta_arch)(cfg)
- model.to(torch.device(cfg.MODEL.DEVICE))
- _log_api_usage("modeling.meta_arch."
+ meta_arch) - return model diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py deleted file mode 100755 index 382eab97..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py +++ /dev/null @@ -1,282 +0,0 @@ -import numpy as np -from typing import Dict, List, Optional, Tuple -import torch -from torch import Tensor, nn - -from detectron2.data.detection_utils import convert_image_to_rgb -from detectron2.modeling import Backbone -from detectron2.structures import Boxes, ImageList, Instances -from detectron2.utils.events import get_event_storage - -from ..postprocessing import detector_postprocess - - -def permute_to_N_HWA_K(tensor, K: int): - """ - Transpose/reshape a tensor from (N, (Ai x K), H, W) to (N, (HxWxAi), K) - """ - assert tensor.dim() == 4, tensor.shape - N, _, H, W = tensor.shape - tensor = tensor.view(N, -1, K, H, W) - tensor = tensor.permute(0, 3, 4, 1, 2) - tensor = tensor.reshape(N, -1, K) # Size=(N,HWA,K) - return tensor - - -class DenseDetector(nn.Module): - """ - Base class for dense detector. We define a dense detector as a fully-convolutional model that - makes per-pixel (i.e. dense) predictions. - """ - - def __init__( - self, - backbone: Backbone, - head: nn.Module, - head_in_features: Optional[List[str]] = None, - *, - pixel_mean, - pixel_std, - ): - """ - Args: - backbone: backbone module - head: head module - head_in_features: backbone features to use in head. Default to all backbone features. - pixel_mean (Tuple[float]): - Values to be used for image normalization (BGR order). - To train on images of different number of channels, set different mean & std. - Default values are the mean pixel value from ImageNet: [103.53, 116.28, 123.675] - pixel_std (Tuple[float]): - When using pre-trained models in Detectron1 or any MSRA models, - std has been absorbed into its conv1 weights, so the std needs to be set 1. - Otherwise, you can use [57.375, 57.120, 58.395] (ImageNet std) - """ - super().__init__() - - self.backbone = backbone - self.head = head - if head_in_features is None: - shapes = self.backbone.output_shape() - self.head_in_features = sorted(shapes.keys(), key=lambda x: shapes[x].stride) - else: - self.head_in_features = head_in_features - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - - @property - def device(self): - return self.pixel_mean.device - - def forward(self, batched_inputs: List[Dict[str, Tensor]]): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper` . - Each item in the list contains the inputs for one image. - For now, each item in the list is a dict that contains: - - * image: Tensor, image in (C, H, W) format. - * instances: Instances - - Other information that's included in the original dicts, such as: - - * "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - In training, dict[str, Tensor]: mapping from a named loss to a tensor storing the - loss. Used during training only. In inference, the standard output format, described - in :doc:`/tutorials/models`. 
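-
- Inference sketch (editor's illustration; ``model`` stands for any concrete
- DenseDetector subclass in eval mode)::
-
- inputs = [{"image": torch.randn(3, 480, 640), "height": 480, "width": 640}]
- with torch.no_grad():
- outputs = model(inputs)
- boxes = outputs[0]["instances"].pred_boxes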
- """ - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - features = [features[f] for f in self.head_in_features] - predictions = self.head(features) - - if self.training: - assert not torch.jit.is_scripting(), "Not supported" - assert "instances" in batched_inputs[0], "Instance annotations are missing in training!" - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - return self.forward_training(images, features, predictions, gt_instances) - else: - results = self.forward_inference(images, features, predictions) - if torch.jit.is_scripting(): - return results - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - results, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - return processed_results - - def forward_training(self, images, features, predictions, gt_instances): - raise NotImplementedError() - - def preprocess_image(self, batched_inputs: List[Dict[str, Tensor]]): - """ - Normalize, pad and batch the input images. - """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - return images - - def _transpose_dense_predictions( - self, predictions: List[List[Tensor]], dims_per_anchor: List[int] - ) -> List[List[Tensor]]: - """ - Transpose the dense per-level predictions. - - Args: - predictions: a list of outputs, each is a list of per-level - predictions with shape (N, Ai x K, Hi, Wi), where N is the - number of images, Ai is the number of anchors per location on - level i, K is the dimension of predictions per anchor. - dims_per_anchor: the value of K for each predictions. e.g. 4 for - box prediction, #classes for classification prediction. - - Returns: - List[List[Tensor]]: each prediction is transposed to (N, Hi x Wi x Ai, K). - """ - assert len(predictions) == len(dims_per_anchor) - res: List[List[Tensor]] = [] - for pred, dim_per_anchor in zip(predictions, dims_per_anchor): - pred = [permute_to_N_HWA_K(x, dim_per_anchor) for x in pred] - res.append(pred) - return res - - def _ema_update(self, name: str, value: float, initial_value: float, momentum: float = 0.9): - """ - Apply EMA update to `self.name` using `value`. - - This is mainly used for loss normalizer. In Detectron1, loss is normalized by number - of foreground samples in the batch. When batch size is 1 per GPU, #foreground has a - large variance and using it lead to lower performance. Therefore we maintain an EMA of - #foreground to stabilize the normalizer. - - Args: - name: name of the normalizer - value: the new value to update - initial_value: the initial value to start with - momentum: momentum of EMA - - Returns: - float: the updated EMA value - """ - if hasattr(self, name): - old = getattr(self, name) - else: - old = initial_value - new = old * momentum + value * (1 - momentum) - setattr(self, name, new) - return new - - def _decode_per_level_predictions( - self, - anchors: Boxes, - pred_scores: Tensor, - pred_deltas: Tensor, - score_thresh: float, - topk_candidates: int, - image_size: Tuple[int, int], - ) -> Instances: - """ - Decode boxes and classification predictions of one featuer level, by - the following steps: - 1. 
filter the predictions based on score threshold and top K scores.
- 2. transform the box regression outputs
- 3. return the predicted scores, classes and boxes
-
- Args:
- anchors: Boxes, anchor for this feature level
- pred_scores: HxWxA,K
- pred_deltas: HxWxA,4
-
- Returns:
- Instances: with fields "scores", "pred_boxes", "pred_classes".
- """
- # Apply two filtering steps to make NMS faster.
- # 1. Keep boxes with confidence score higher than threshold
- keep_idxs = pred_scores > score_thresh
- pred_scores = pred_scores[keep_idxs]
- topk_idxs = torch.nonzero(keep_idxs) # Kx2
-
- # 2. Keep only the top k scoring boxes
- num_topk = min(topk_candidates, topk_idxs.size(0))
- pred_scores, idxs = pred_scores.topk(num_topk)
- topk_idxs = topk_idxs[idxs]
-
- anchor_idxs, classes_idxs = topk_idxs.unbind(dim=1)
-
- pred_boxes = self.box2box_transform.apply_deltas(
- pred_deltas[anchor_idxs], anchors.tensor[anchor_idxs]
- )
- return Instances(
- image_size, pred_boxes=Boxes(pred_boxes), scores=pred_scores, pred_classes=classes_idxs
- )
-
- def _decode_multi_level_predictions(
- self,
- anchors: List[Boxes],
- pred_scores: List[Tensor],
- pred_deltas: List[Tensor],
- score_thresh: float,
- topk_candidates: int,
- image_size: Tuple[int, int],
- ) -> Instances:
- """
- Run `_decode_per_level_predictions` for all feature levels and concat the results.
- """
- predictions = [
- self._decode_per_level_predictions(
- anchors_i,
- box_cls_i,
- box_reg_i,
- self.test_score_thresh,
- self.test_topk_candidates,
- image_size,
- )
- # Iterate over every feature level
- for box_cls_i, box_reg_i, anchors_i in zip(pred_scores, pred_deltas, anchors)
- ]
- return predictions[0].cat(predictions) # 'Instances.cat' is not scriptable but this is
-
- def visualize_training(self, batched_inputs, results):
- """
- A function used to visualize ground truth images and final network predictions.
- It shows ground truth bounding boxes on the original image and up to 20
- predicted object bounding boxes on the original image.
-
- Args:
- batched_inputs (list): a list that contains input to the model.
- results (List[Instances]): a list of #images elements returned by forward_inference().
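-
- Typical call site (editor's sketch; ``vis_period`` is a hypothetical
- config value, not defined in this class)::
-
- storage = get_event_storage()
- if storage.iter % vis_period == 0:
- self.visualize_training(batched_inputs, results)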
- """ - from detectron2.utils.visualizer import Visualizer - - assert len(batched_inputs) == len( - results - ), "Cannot visualize inputs and results of different sizes" - storage = get_event_storage() - max_boxes = 20 - - image_index = 0 # only visualize a single image - img = batched_inputs[image_index]["image"] - img = convert_image_to_rgb(img.permute(1, 2, 0), self.input_format) - v_gt = Visualizer(img, None) - v_gt = v_gt.overlay_instances(boxes=batched_inputs[image_index]["instances"].gt_boxes) - anno_img = v_gt.get_image() - processed_results = detector_postprocess(results[image_index], img.shape[0], img.shape[1]) - predicted_boxes = processed_results.pred_boxes.tensor.detach().cpu().numpy() - - v_pred = Visualizer(img, None) - v_pred = v_pred.overlay_instances(boxes=predicted_boxes[0:max_boxes]) - prop_img = v_pred.get_image() - vis_img = np.vstack((anno_img, prop_img)) - vis_img = vis_img.transpose(2, 0, 1) - vis_name = f"Top: GT bounding boxes; Bottom: {max_boxes} Highest Scoring Results" - storage.put_image(vis_name, vis_img) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py deleted file mode 100755 index 55cdb76e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py +++ /dev/null @@ -1,303 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -from typing import List, Optional, Tuple -import torch -from fvcore.nn import sigmoid_focal_loss_jit -from torch import Tensor, nn -from torch.nn import functional as F - -from detectron2.layers import ShapeSpec, batched_nms -from detectron2.structures import Boxes, ImageList, Instances, pairwise_point_box_distance -from detectron2.utils.events import get_event_storage - -from ..anchor_generator import DefaultAnchorGenerator -from ..backbone import Backbone -from ..box_regression import Box2BoxTransformLinear, _dense_box_regression_loss -from .dense_detector import DenseDetector -from .retinanet import RetinaNetHead - -__all__ = ["FCOS"] - - -logger = logging.getLogger(__name__) - - -class FCOS(DenseDetector): - """ - Implement FCOS in :paper:`fcos`. - """ - - def __init__( - self, - *, - backbone: Backbone, - head: nn.Module, - head_in_features: Optional[List[str]] = None, - box2box_transform=None, - num_classes, - center_sampling_radius: float = 1.5, - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - test_score_thresh=0.2, - test_topk_candidates=1000, - test_nms_thresh=0.6, - max_detections_per_image=100, - pixel_mean, - pixel_std, - ): - """ - Args: - center_sampling_radius: radius of the "center" of a groundtruth box, - within which all anchor points are labeled positive. - Other arguments mean the same as in :class:`RetinaNet`. - """ - super().__init__( - backbone, head, head_in_features, pixel_mean=pixel_mean, pixel_std=pixel_std - ) - - self.num_classes = num_classes - - # FCOS uses one anchor point per location. - # We represent the anchor point by a box whose size equals the anchor stride. - feature_shapes = backbone.output_shape() - fpn_strides = [feature_shapes[k].stride for k in self.head_in_features] - self.anchor_generator = DefaultAnchorGenerator( - sizes=[[k] for k in fpn_strides], aspect_ratios=[1.0], strides=fpn_strides - ) - - # FCOS parameterizes box regression by a linear transform, - # where predictions are normalized by anchor stride (equal to anchor size). 
- if box2box_transform is None: - box2box_transform = Box2BoxTransformLinear(normalize_by_size=True) - self.box2box_transform = box2box_transform - - self.center_sampling_radius = float(center_sampling_radius) - - # Loss parameters: - self.focal_loss_alpha = focal_loss_alpha - self.focal_loss_gamma = focal_loss_gamma - - # Inference parameters: - self.test_score_thresh = test_score_thresh - self.test_topk_candidates = test_topk_candidates - self.test_nms_thresh = test_nms_thresh - self.max_detections_per_image = max_detections_per_image - - def forward_training(self, images, features, predictions, gt_instances): - # Transpose the Hi*Wi*A dimension to the middle: - pred_logits, pred_anchor_deltas, pred_centerness = self._transpose_dense_predictions( - predictions, [self.num_classes, 4, 1] - ) - anchors = self.anchor_generator(features) - gt_labels, gt_boxes = self.label_anchors(anchors, gt_instances) - return self.losses( - anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes, pred_centerness - ) - - @torch.no_grad() - def match_anchors(self, anchors: List[Boxes], gt_instances: List[Instances]): - """ - Match anchors with ground truth boxes. - - Args: - anchors: #level boxes, from the highest resolution to lower resolution - gt_instances: ground truth instances per image - - Returns: - List[Tensor]: - #image tensors, each is a vector of matched gt - indices (or -1 for unmatched anchors) for all anchors. - """ - num_anchors_per_level = [len(x) for x in anchors] - anchors = Boxes.cat(anchors) # Rx4 - anchor_centers = anchors.get_centers() # Rx2 - anchor_sizes = anchors.tensor[:, 2] - anchors.tensor[:, 0] # R - - lower_bound = anchor_sizes * 4 - lower_bound[: num_anchors_per_level[0]] = 0 - upper_bound = anchor_sizes * 8 - upper_bound[-num_anchors_per_level[-1] :] = float("inf") - - matched_indices = [] - for gt_per_image in gt_instances: - gt_centers = gt_per_image.gt_boxes.get_centers() # Nx2 - # FCOS with center sampling: anchor point must be close enough to gt center. - pairwise_match = (anchor_centers[:, None, :] - gt_centers[None, :, :]).abs_().max( - dim=2 - ).values < self.center_sampling_radius * anchor_sizes[:, None] - pairwise_dist = pairwise_point_box_distance(anchor_centers, gt_per_image.gt_boxes) - - # The original FCOS anchor matching rule: anchor point must be inside gt - pairwise_match &= pairwise_dist.min(dim=2).values > 0 - - # Multilevel anchor matching in FCOS: each anchor is only responsible - # for certain scale range. - pairwise_dist = pairwise_dist.max(dim=2).values - pairwise_match &= (pairwise_dist > lower_bound[:, None]) & ( - pairwise_dist < upper_bound[:, None] - ) - - # Match the GT box with minimum area, if there are multiple GT matches - gt_areas = gt_per_image.gt_boxes.area() # N - pairwise_match = pairwise_match.to(torch.float32) * (1e8 - gt_areas[None, :]) - min_values, matched_idx = pairwise_match.max(dim=1) # R, per-anchor match - matched_idx[min_values < 1e-5] = -1 # Unmatched anchors are assigned -1 - - matched_indices.append(matched_idx) - return matched_indices - - @torch.no_grad() - def label_anchors(self, anchors, gt_instances): - """ - Same interface as :meth:`RetinaNet.label_anchors`, but implemented with FCOS - anchor matching rule. - - Unlike RetinaNet, there are no ignored anchors. 
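-
- Sketch of the returned values (editor's illustration)::
-
- gt_labels, gt_boxes = self.label_anchors(anchors, gt_instances)
- # gt_labels[i] has one entry per anchor of image i; entries equal to
- # self.num_classes mark background (unmatched) anchors.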
- """ - matched_indices = self.match_anchors(anchors, gt_instances) - - matched_labels, matched_boxes = [], [] - for gt_index, gt_per_image in zip(matched_indices, gt_instances): - label = gt_per_image.gt_classes[gt_index.clip(min=0)] - label[gt_index < 0] = self.num_classes # background - - matched_gt_boxes = gt_per_image.gt_boxes[gt_index.clip(min=0)] - - matched_labels.append(label) - matched_boxes.append(matched_gt_boxes) - return matched_labels, matched_boxes - - def losses( - self, anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes, pred_centerness - ): - """ - This method is almost identical to :meth:`RetinaNet.losses`, with an extra - "loss_centerness" in the returned dict. - """ - num_images = len(gt_labels) - gt_labels = torch.stack(gt_labels) # (N, R) - - pos_mask = (gt_labels >= 0) & (gt_labels != self.num_classes) - num_pos_anchors = pos_mask.sum().item() - get_event_storage().put_scalar("num_pos_anchors", num_pos_anchors / num_images) - normalizer = self._ema_update("loss_normalizer", max(num_pos_anchors, 1), 300) - - # classification and regression loss - gt_labels_target = F.one_hot(gt_labels, num_classes=self.num_classes + 1)[ - :, :, :-1 - ] # no loss for the last (background) class - loss_cls = sigmoid_focal_loss_jit( - torch.cat(pred_logits, dim=1), - gt_labels_target.to(pred_logits[0].dtype), - alpha=self.focal_loss_alpha, - gamma=self.focal_loss_gamma, - reduction="sum", - ) - - loss_box_reg = _dense_box_regression_loss( - anchors, - self.box2box_transform, - pred_anchor_deltas, - [x.tensor for x in gt_boxes], - pos_mask, - box_reg_loss_type="giou", - ) - - ctrness_targets = self.compute_ctrness_targets(anchors, gt_boxes) # NxR - pred_centerness = torch.cat(pred_centerness, dim=1).squeeze(dim=2) # NxR - ctrness_loss = F.binary_cross_entropy_with_logits( - pred_centerness[pos_mask], ctrness_targets[pos_mask], reduction="sum" - ) - return { - "loss_fcos_cls": loss_cls / normalizer, - "loss_fcos_loc": loss_box_reg / normalizer, - "loss_fcos_ctr": ctrness_loss / normalizer, - } - - def compute_ctrness_targets(self, anchors, gt_boxes): # NxR - anchors = Boxes.cat(anchors).tensor # Rx4 - reg_targets = [self.box2box_transform.get_deltas(anchors, m.tensor) for m in gt_boxes] - reg_targets = torch.stack(reg_targets, dim=0) # NxRx4 - if len(reg_targets) == 0: - return reg_targets.new_zeros(len(reg_targets)) - left_right = reg_targets[:, :, [0, 2]] - top_bottom = reg_targets[:, :, [1, 3]] - ctrness = (left_right.min(dim=-1)[0] / left_right.max(dim=-1)[0]) * ( - top_bottom.min(dim=-1)[0] / top_bottom.max(dim=-1)[0] - ) - return torch.sqrt(ctrness) - - def forward_inference( - self, images: ImageList, features: List[Tensor], predictions: List[List[Tensor]] - ): - pred_logits, pred_anchor_deltas, pred_centerness = self._transpose_dense_predictions( - predictions, [self.num_classes, 4, 1] - ) - anchors = self.anchor_generator(features) - - results: List[Instances] = [] - for img_idx, image_size in enumerate(images.image_sizes): - scores_per_image = [ - # Multiply and sqrt centerness & classification scores - # (See eqn. 
4 in https://arxiv.org/abs/2006.09214) - torch.sqrt(x[img_idx].sigmoid_() * y[img_idx].sigmoid_()) - for x, y in zip(pred_logits, pred_centerness) - ] - deltas_per_image = [x[img_idx] for x in pred_anchor_deltas] - results_per_image = self.inference_single_image( - anchors, scores_per_image, deltas_per_image, image_size - ) - results.append(results_per_image) - return results - - def inference_single_image( - self, - anchors: List[Boxes], - box_cls: List[Tensor], - box_delta: List[Tensor], - image_size: Tuple[int, int], - ): - """ - Identical to :meth:`RetinaNet.inference_single_image. - """ - pred = self._decode_multi_level_predictions( - anchors, - box_cls, - box_delta, - self.test_score_thresh, - self.test_topk_candidates, - image_size, - ) - keep = batched_nms( - pred.pred_boxes.tensor, pred.scores, pred.pred_classes, self.test_nms_thresh - ) - return pred[keep[: self.max_detections_per_image]] - - -class FCOSHead(RetinaNetHead): - """ - The head used in :paper:`fcos`. It adds an additional centerness - prediction branch on top of :class:`RetinaNetHead`. - """ - - def __init__(self, *, input_shape: List[ShapeSpec], conv_dims: List[int], **kwargs): - super().__init__(input_shape=input_shape, conv_dims=conv_dims, num_anchors=1, **kwargs) - # Unlike original FCOS, we do not add an additional learnable scale layer - # because it's found to have no benefits after normalizing regression targets by stride. - self._num_features = len(input_shape) - self.ctrness = nn.Conv2d(conv_dims[-1], 1, kernel_size=3, stride=1, padding=1) - torch.nn.init.normal_(self.ctrness.weight, std=0.01) - torch.nn.init.constant_(self.ctrness.bias, 0) - - def forward(self, features): - assert len(features) == self._num_features - logits = [] - bbox_reg = [] - ctrness = [] - for feature in features: - logits.append(self.cls_score(self.cls_subnet(feature))) - bbox_feature = self.bbox_subnet(feature) - bbox_reg.append(self.bbox_pred(bbox_feature)) - ctrness.append(self.ctrness(bbox_feature)) - return logits, bbox_reg, ctrness diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py deleted file mode 100755 index 13aeabce..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py +++ /dev/null @@ -1,266 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -from typing import Dict, List -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.structures import ImageList - -from ..postprocessing import detector_postprocess, sem_seg_postprocess -from .build import META_ARCH_REGISTRY -from .rcnn import GeneralizedRCNN -from .semantic_seg import build_sem_seg_head - -__all__ = ["PanopticFPN"] - - -@META_ARCH_REGISTRY.register() -class PanopticFPN(GeneralizedRCNN): - """ - Implement the paper :paper:`PanopticFPN`. - """ - - @configurable - def __init__( - self, - *, - sem_seg_head: nn.Module, - combine_overlap_thresh: float = 0.5, - combine_stuff_area_thresh: float = 4096, - combine_instances_score_thresh: float = 0.5, - **kwargs, - ): - """ - NOTE: this interface is experimental. - - Args: - sem_seg_head: a module for the semantic segmentation head. 
- combine_overlap_thresh: combine masks into one instances if - they have enough overlap - combine_stuff_area_thresh: ignore stuff areas smaller than this threshold - combine_instances_score_thresh: ignore instances whose score is - smaller than this threshold - - Other arguments are the same as :class:`GeneralizedRCNN`. - """ - super().__init__(**kwargs) - self.sem_seg_head = sem_seg_head - # options when combining instance & semantic outputs - self.combine_overlap_thresh = combine_overlap_thresh - self.combine_stuff_area_thresh = combine_stuff_area_thresh - self.combine_instances_score_thresh = combine_instances_score_thresh - - @classmethod - def from_config(cls, cfg): - ret = super().from_config(cfg) - ret.update( - { - "combine_overlap_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH, - "combine_stuff_area_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT, - "combine_instances_score_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH, # noqa - } - ) - ret["sem_seg_head"] = build_sem_seg_head(cfg, ret["backbone"].output_shape()) - logger = logging.getLogger(__name__) - if not cfg.MODEL.PANOPTIC_FPN.COMBINE.ENABLED: - logger.warning( - "PANOPTIC_FPN.COMBINED.ENABLED is no longer used. " - " model.inference(do_postprocess=) should be used to toggle postprocessing." - ) - if cfg.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT != 1.0: - w = cfg.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT - logger.warning( - "PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT should be replaced by weights on each ROI head." - ) - - def update_weight(x): - if isinstance(x, dict): - return {k: v * w for k, v in x.items()} - else: - return x * w - - roi_heads = ret["roi_heads"] - roi_heads.box_predictor.loss_weight = update_weight(roi_heads.box_predictor.loss_weight) - roi_heads.mask_head.loss_weight = update_weight(roi_heads.mask_head.loss_weight) - return ret - - def forward(self, batched_inputs): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper`. - Each item in the list contains the inputs for one image. - - For now, each item in the list is a dict that contains: - - * "image": Tensor, image in (C, H, W) format. - * "instances": Instances - * "sem_seg": semantic segmentation ground truth. - * Other information that's included in the original dicts, such as: - "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - list[dict]: - each dict has the results for one image. The dict contains the following keys: - - * "instances": see :meth:`GeneralizedRCNN.forward` for its format. - * "sem_seg": see :meth:`SemanticSegmentor.forward` for its format. - * "panoptic_seg": See the return value of - :func:`combine_semantic_and_instance_outputs` for its format. 
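-
- Inference sketch (editor's illustration; ``model`` is a PanopticFPN in eval
- mode, ``inputs`` as described above)::
-
- outputs = model(inputs)
- panoptic_seg, segments_info = outputs[0]["panoptic_seg"]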
- """ - if not self.training: - return self.inference(batched_inputs) - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - - assert "sem_seg" in batched_inputs[0] - gt_sem_seg = [x["sem_seg"].to(self.device) for x in batched_inputs] - gt_sem_seg = ImageList.from_tensors( - gt_sem_seg, self.backbone.size_divisibility, self.sem_seg_head.ignore_value - ).tensor - sem_seg_results, sem_seg_losses = self.sem_seg_head(features, gt_sem_seg) - - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - detector_results, detector_losses = self.roi_heads( - images, features, proposals, gt_instances - ) - - losses = sem_seg_losses - losses.update(proposal_losses) - losses.update(detector_losses) - return losses - - def inference(self, batched_inputs: List[Dict[str, torch.Tensor]], do_postprocess: bool = True): - """ - Run inference on the given inputs. - - Args: - batched_inputs (list[dict]): same as in :meth:`forward` - do_postprocess (bool): whether to apply post-processing on the outputs. - - Returns: - When do_postprocess=True, see docs in :meth:`forward`. - Otherwise, returns a (list[Instances], list[Tensor]) that contains - the raw detector outputs, and raw semantic segmentation outputs. - """ - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - sem_seg_results, sem_seg_losses = self.sem_seg_head(features, None) - proposals, _ = self.proposal_generator(images, features, None) - detector_results, _ = self.roi_heads(images, features, proposals, None) - - if do_postprocess: - processed_results = [] - for sem_seg_result, detector_result, input_per_image, image_size in zip( - sem_seg_results, detector_results, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - sem_seg_r = sem_seg_postprocess(sem_seg_result, image_size, height, width) - detector_r = detector_postprocess(detector_result, height, width) - - processed_results.append({"sem_seg": sem_seg_r, "instances": detector_r}) - - panoptic_r = combine_semantic_and_instance_outputs( - detector_r, - sem_seg_r.argmax(dim=0), - self.combine_overlap_thresh, - self.combine_stuff_area_thresh, - self.combine_instances_score_thresh, - ) - processed_results[-1]["panoptic_seg"] = panoptic_r - return processed_results - else: - return detector_results, sem_seg_results - - -def combine_semantic_and_instance_outputs( - instance_results, - semantic_results, - overlap_threshold, - stuff_area_thresh, - instances_score_thresh, -): - """ - Implement a simple combining logic following - "combine_semantic_and_instance_predictions.py" in panopticapi - to produce panoptic segmentation outputs. - - Args: - instance_results: output of :func:`detector_postprocess`. - semantic_results: an (H, W) tensor, each element is the contiguous semantic - category id - - Returns: - panoptic_seg (Tensor): of shape (height, width) where the values are ids for each segment. - segments_info (list[dict]): Describe each segment in `panoptic_seg`. - Each dict contains keys "id", "category_id", "isthing". 
- """ - panoptic_seg = torch.zeros_like(semantic_results, dtype=torch.int32) - - # sort instance outputs by scores - sorted_inds = torch.argsort(-instance_results.scores) - - current_segment_id = 0 - segments_info = [] - - instance_masks = instance_results.pred_masks.to(dtype=torch.bool, device=panoptic_seg.device) - - # Add instances one-by-one, check for overlaps with existing ones - for inst_id in sorted_inds: - score = instance_results.scores[inst_id].item() - if score < instances_score_thresh: - break - mask = instance_masks[inst_id] # H,W - mask_area = mask.sum().item() - - if mask_area == 0: - continue - - intersect = (mask > 0) & (panoptic_seg > 0) - intersect_area = intersect.sum().item() - - if intersect_area * 1.0 / mask_area > overlap_threshold: - continue - - if intersect_area > 0: - mask = mask & (panoptic_seg == 0) - - current_segment_id += 1 - panoptic_seg[mask] = current_segment_id - segments_info.append( - { - "id": current_segment_id, - "isthing": True, - "score": score, - "category_id": instance_results.pred_classes[inst_id].item(), - "instance_id": inst_id.item(), - } - ) - - # Add semantic results to remaining empty areas - semantic_labels = torch.unique(semantic_results).cpu().tolist() - for semantic_label in semantic_labels: - if semantic_label == 0: # 0 is a special "thing" class - continue - mask = (semantic_results == semantic_label) & (panoptic_seg == 0) - mask_area = mask.sum().item() - if mask_area < stuff_area_thresh: - continue - - current_segment_id += 1 - panoptic_seg[mask] = current_segment_id - segments_info.append( - { - "id": current_segment_id, - "isthing": False, - "category_id": semantic_label, - "area": mask_area, - } - ) - - return panoptic_seg, segments_info diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py deleted file mode 100755 index 7b45363e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py +++ /dev/null @@ -1,327 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -from typing import Dict, List, Optional, Tuple -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.data.detection_utils import convert_image_to_rgb -from detectron2.structures import ImageList, Instances -from detectron2.utils.events import get_event_storage -from detectron2.utils.logger import log_first_n - -from ..backbone import Backbone, build_backbone -from ..postprocessing import detector_postprocess -from ..proposal_generator import build_proposal_generator -from ..roi_heads import build_roi_heads -from .build import META_ARCH_REGISTRY - -__all__ = ["GeneralizedRCNN", "ProposalNetwork"] - - -@META_ARCH_REGISTRY.register() -class GeneralizedRCNN(nn.Module): - """ - Generalized R-CNN. Any models that contains the following three components: - 1. Per-image feature extraction (aka backbone) - 2. Region proposal generation - 3. 
Per-region feature extraction and prediction - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - proposal_generator: nn.Module, - roi_heads: nn.Module, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - input_format: Optional[str] = None, - vis_period: int = 0, - ): - """ - Args: - backbone: a backbone module, must follow detectron2's backbone interface - proposal_generator: a module that generates proposals using backbone features - roi_heads: a ROI head that performs per-region computation - pixel_mean, pixel_std: list or tuple with #channels element, representing - the per-channel mean and std to be used to normalize the input image - input_format: describe the meaning of channels of input. Needed by visualization - vis_period: the period to run visualization. Set to 0 to disable. - """ - super().__init__() - self.backbone = backbone - self.proposal_generator = proposal_generator - self.roi_heads = roi_heads - - self.input_format = input_format - self.vis_period = vis_period - if vis_period > 0: - assert input_format is not None, "input_format is required for visualization!" - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - assert ( - self.pixel_mean.shape == self.pixel_std.shape - ), f"{self.pixel_mean} and {self.pixel_std} have different shapes!" - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - return { - "backbone": backbone, - "proposal_generator": build_proposal_generator(cfg, backbone.output_shape()), - "roi_heads": build_roi_heads(cfg, backbone.output_shape()), - "input_format": cfg.INPUT.FORMAT, - "vis_period": cfg.VIS_PERIOD, - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - } - - @property - def device(self): - return self.pixel_mean.device - - def visualize_training(self, batched_inputs, proposals): - """ - A function used to visualize images and proposals. It shows ground truth - bounding boxes on the original image and up to 20 top-scoring predicted - object proposals on the original image. Users can implement different - visualization functions for different models. - - Args: - batched_inputs (list): a list that contains input to the model. - proposals (list): a list that contains predicted proposals. Both - batched_inputs and proposals should have the same length. - """ - from detectron2.utils.visualizer import Visualizer - - storage = get_event_storage() - max_vis_prop = 20 - - for input, prop in zip(batched_inputs, proposals): - img = input["image"] - img = convert_image_to_rgb(img.permute(1, 2, 0), self.input_format) - v_gt = Visualizer(img, None) - v_gt = v_gt.overlay_instances(boxes=input["instances"].gt_boxes) - anno_img = v_gt.get_image() - box_size = min(len(prop.proposal_boxes), max_vis_prop) - v_pred = Visualizer(img, None) - v_pred = v_pred.overlay_instances( - boxes=prop.proposal_boxes[0:box_size].tensor.cpu().numpy() - ) - prop_img = v_pred.get_image() - vis_img = np.concatenate((anno_img, prop_img), axis=1) - vis_img = vis_img.transpose(2, 0, 1) - vis_name = "Left: GT bounding boxes; Right: Predicted proposals" - storage.put_image(vis_name, vis_img) - break # only visualize one image in a batch - - def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper` . - Each item in the list contains the inputs for one image. 
- For now, each item in the list is a dict that contains: - - * image: Tensor, image in (C, H, W) format. - * instances (optional): groundtruth :class:`Instances` - * proposals (optional): :class:`Instances`, precomputed proposals. - - Other information that's included in the original dicts, such as: - - * "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - list[dict]: - Each dict is the output for one input image. - The dict contains one key "instances" whose value is a :class:`Instances`. - The :class:`Instances` object has the following keys: - "pred_boxes", "pred_classes", "scores", "pred_masks", "pred_keypoints" - """ - if not self.training: - return self.inference(batched_inputs) - - images = self.preprocess_image(batched_inputs) - if "instances" in batched_inputs[0]: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - else: - gt_instances = None - - features = self.backbone(images.tensor) - - if self.proposal_generator is not None: - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - else: - assert "proposals" in batched_inputs[0] - proposals = [x["proposals"].to(self.device) for x in batched_inputs] - proposal_losses = {} - - _, detector_losses = self.roi_heads(images, features, proposals, gt_instances) - if self.vis_period > 0: - storage = get_event_storage() - if storage.iter % self.vis_period == 0: - self.visualize_training(batched_inputs, proposals) - - losses = {} - losses.update(detector_losses) - losses.update(proposal_losses) - return losses - - def inference( - self, - batched_inputs: List[Dict[str, torch.Tensor]], - detected_instances: Optional[List[Instances]] = None, - do_postprocess: bool = True, - ): - """ - Run inference on the given inputs. - - Args: - batched_inputs (list[dict]): same as in :meth:`forward` - detected_instances (None or list[Instances]): if not None, it - contains an `Instances` object per image. The `Instances` - object contains "pred_boxes" and "pred_classes" which are - known boxes in the image. - The inference will then skip the detection of bounding boxes, - and only predict other per-ROI outputs. - do_postprocess (bool): whether to apply post-processing on the outputs. - - Returns: - When do_postprocess=True, same as in :meth:`forward`. - Otherwise, a list[Instances] containing raw network outputs. - """ - assert not self.training - - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - - if detected_instances is None: - if self.proposal_generator is not None: - proposals, _ = self.proposal_generator(images, features, None) - else: - assert "proposals" in batched_inputs[0] - proposals = [x["proposals"].to(self.device) for x in batched_inputs] - - results, _ = self.roi_heads(images, features, proposals, None) - else: - detected_instances = [x.to(self.device) for x in detected_instances] - results = self.roi_heads.forward_with_given_boxes(features, detected_instances) - - if do_postprocess: - assert not torch.jit.is_scripting(), "Scripting is not supported for postprocess." - return GeneralizedRCNN._postprocess(results, batched_inputs, images.image_sizes) - else: - return results - - def preprocess_image(self, batched_inputs: List[Dict[str, torch.Tensor]]): - """ - Normalize, pad and batch the input images. 
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - return images - - @staticmethod - def _postprocess(instances, batched_inputs: List[Dict[str, torch.Tensor]], image_sizes): - """ - Rescale the output instances to the target size. - """ - # note: private function; subject to changes - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - instances, batched_inputs, image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - return processed_results - - -@META_ARCH_REGISTRY.register() -class ProposalNetwork(nn.Module): - """ - A meta architecture that only predicts object proposals. - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - proposal_generator: nn.Module, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - ): - """ - Args: - backbone: a backbone module, must follow detectron2's backbone interface - proposal_generator: a module that generates proposals using backbone features - pixel_mean, pixel_std: list or tuple with #channels element, representing - the per-channel mean and std to be used to normalize the input image - """ - super().__init__() - self.backbone = backbone - self.proposal_generator = proposal_generator - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - return { - "backbone": backbone, - "proposal_generator": build_proposal_generator(cfg, backbone.output_shape()), - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - } - - @property - def device(self): - return self.pixel_mean.device - - def forward(self, batched_inputs): - """ - Args: - Same as in :class:`GeneralizedRCNN.forward` - - Returns: - list[dict]: - Each dict is the output for one input image. - The dict contains one key "proposals" whose value is a - :class:`Instances` with keys "proposal_boxes" and "objectness_logits". - """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - features = self.backbone(images.tensor) - - if "instances" in batched_inputs[0]: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - elif "targets" in batched_inputs[0]: - log_first_n( - logging.WARN, "'targets' in the model inputs is now renamed to 'instances'!", n=10 - ) - gt_instances = [x["targets"].to(self.device) for x in batched_inputs] - else: - gt_instances = None - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - # In training, the proposals are not useful at all but we generate them anyway. - # This makes RPN-only models about 5% slower. 
- if self.training: - return proposal_losses - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - proposals, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"proposals": r}) - return processed_results diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py deleted file mode 100755 index 3ea88f61..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py +++ /dev/null @@ -1,439 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from typing import List, Tuple -import torch -from fvcore.nn import sigmoid_focal_loss_jit -from torch import Tensor, nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import CycleBatchNormList, ShapeSpec, batched_nms, cat, get_norm -from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from ..anchor_generator import build_anchor_generator -from ..backbone import Backbone, build_backbone -from ..box_regression import Box2BoxTransform, _dense_box_regression_loss -from ..matcher import Matcher -from .build import META_ARCH_REGISTRY -from .dense_detector import DenseDetector, permute_to_N_HWA_K # noqa - -__all__ = ["RetinaNet"] - - -logger = logging.getLogger(__name__) - - -@META_ARCH_REGISTRY.register() -class RetinaNet(DenseDetector): - """ - Implement RetinaNet in :paper:`RetinaNet`. - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - head: nn.Module, - head_in_features, - anchor_generator, - box2box_transform, - anchor_matcher, - num_classes, - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - smooth_l1_beta=0.0, - box_reg_loss_type="smooth_l1", - test_score_thresh=0.05, - test_topk_candidates=1000, - test_nms_thresh=0.5, - max_detections_per_image=100, - pixel_mean, - pixel_std, - vis_period=0, - input_format="BGR", - ): - """ - NOTE: this interface is experimental. - - Args: - backbone: a backbone module, must follow detectron2's backbone interface - head (nn.Module): a module that predicts logits and regression deltas - for each level from a list of per-level features - head_in_features (Tuple[str]): Names of the input feature maps to be used in head - anchor_generator (nn.Module): a module that creates anchors from a - list of features. Usually an instance of :class:`AnchorGenerator` - box2box_transform (Box2BoxTransform): defines the transform from anchors boxes to - instance boxes - anchor_matcher (Matcher): label the anchors by matching them with ground truth. - num_classes (int): number of classes. Used to label background proposals. 
- - # Loss parameters: - focal_loss_alpha (float): focal_loss_alpha - focal_loss_gamma (float): focal_loss_gamma - smooth_l1_beta (float): smooth_l1_beta - box_reg_loss_type (str): Options are "smooth_l1", "giou", "diou", "ciou" - - # Inference parameters: - test_score_thresh (float): Inference cls score threshold, only anchors with - score > INFERENCE_TH are considered for inference (to improve speed) - test_topk_candidates (int): Select topk candidates before NMS - test_nms_thresh (float): Overlap threshold used for non-maximum suppression - (suppress boxes with IoU >= this threshold) - max_detections_per_image (int): - Maximum number of detections to return per image during inference - (100 is based on the limit established for the COCO dataset). - - pixel_mean, pixel_std: see :class:`DenseDetector`. - """ - super().__init__( - backbone, head, head_in_features, pixel_mean=pixel_mean, pixel_std=pixel_std - ) - self.num_classes = num_classes - - # Anchors - self.anchor_generator = anchor_generator - self.box2box_transform = box2box_transform - self.anchor_matcher = anchor_matcher - - # Loss parameters: - self.focal_loss_alpha = focal_loss_alpha - self.focal_loss_gamma = focal_loss_gamma - self.smooth_l1_beta = smooth_l1_beta - self.box_reg_loss_type = box_reg_loss_type - # Inference parameters: - self.test_score_thresh = test_score_thresh - self.test_topk_candidates = test_topk_candidates - self.test_nms_thresh = test_nms_thresh - self.max_detections_per_image = max_detections_per_image - # Vis parameters - self.vis_period = vis_period - self.input_format = input_format - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - backbone_shape = backbone.output_shape() - feature_shapes = [backbone_shape[f] for f in cfg.MODEL.RETINANET.IN_FEATURES] - head = RetinaNetHead(cfg, feature_shapes) - anchor_generator = build_anchor_generator(cfg, feature_shapes) - return { - "backbone": backbone, - "head": head, - "anchor_generator": anchor_generator, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.RETINANET.BBOX_REG_WEIGHTS), - "anchor_matcher": Matcher( - cfg.MODEL.RETINANET.IOU_THRESHOLDS, - cfg.MODEL.RETINANET.IOU_LABELS, - allow_low_quality_matches=True, - ), - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - "num_classes": cfg.MODEL.RETINANET.NUM_CLASSES, - "head_in_features": cfg.MODEL.RETINANET.IN_FEATURES, - # Loss parameters: - "focal_loss_alpha": cfg.MODEL.RETINANET.FOCAL_LOSS_ALPHA, - "focal_loss_gamma": cfg.MODEL.RETINANET.FOCAL_LOSS_GAMMA, - "smooth_l1_beta": cfg.MODEL.RETINANET.SMOOTH_L1_LOSS_BETA, - "box_reg_loss_type": cfg.MODEL.RETINANET.BBOX_REG_LOSS_TYPE, - # Inference parameters: - "test_score_thresh": cfg.MODEL.RETINANET.SCORE_THRESH_TEST, - "test_topk_candidates": cfg.MODEL.RETINANET.TOPK_CANDIDATES_TEST, - "test_nms_thresh": cfg.MODEL.RETINANET.NMS_THRESH_TEST, - "max_detections_per_image": cfg.TEST.DETECTIONS_PER_IMAGE, - # Vis parameters - "vis_period": cfg.VIS_PERIOD, - "input_format": cfg.INPUT.FORMAT, - } - - def forward_training(self, images, features, predictions, gt_instances): - # Transpose the Hi*Wi*A dimension to the middle: - pred_logits, pred_anchor_deltas = self._transpose_dense_predictions( - predictions, [self.num_classes, 4] - ) - anchors = self.anchor_generator(features) - gt_labels, gt_boxes = self.label_anchors(anchors, gt_instances) - return self.losses(anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes) - - def losses(self, anchors, pred_logits, gt_labels, pred_anchor_deltas, 
gt_boxes): - """ - Args: - anchors (list[Boxes]): a list of #feature level Boxes - gt_labels, gt_boxes: see output of :meth:`RetinaNet.label_anchors`. - Their shapes are (N, R) and (N, R, 4), respectively, where R is - the total number of anchors across levels, i.e. sum(Hi x Wi x Ai) - pred_logits, pred_anchor_deltas: both are list[Tensor]. Each element in the - list corresponds to one level and has shape (N, Hi * Wi * Ai, K or 4). - Where K is the number of classes used in `pred_logits`. - - Returns: - dict[str, Tensor]: - mapping from a named loss to a scalar tensor storing the loss. - Used during training only. The dict keys are: "loss_cls" and "loss_box_reg" - """ - num_images = len(gt_labels) - gt_labels = torch.stack(gt_labels) # (N, R) - - valid_mask = gt_labels >= 0 - pos_mask = (gt_labels >= 0) & (gt_labels != self.num_classes) - num_pos_anchors = pos_mask.sum().item() - get_event_storage().put_scalar("num_pos_anchors", num_pos_anchors / num_images) - normalizer = self._ema_update("loss_normalizer", max(num_pos_anchors, 1), 100) - - # classification and regression loss - gt_labels_target = F.one_hot(gt_labels[valid_mask], num_classes=self.num_classes + 1)[ - :, :-1 - ] # no loss for the last (background) class - loss_cls = sigmoid_focal_loss_jit( - cat(pred_logits, dim=1)[valid_mask], - gt_labels_target.to(pred_logits[0].dtype), - alpha=self.focal_loss_alpha, - gamma=self.focal_loss_gamma, - reduction="sum", - ) - - loss_box_reg = _dense_box_regression_loss( - anchors, - self.box2box_transform, - pred_anchor_deltas, - gt_boxes, - pos_mask, - box_reg_loss_type=self.box_reg_loss_type, - smooth_l1_beta=self.smooth_l1_beta, - ) - - return { - "loss_cls": loss_cls / normalizer, - "loss_box_reg": loss_box_reg / normalizer, - } - - @torch.no_grad() - def label_anchors(self, anchors, gt_instances): - """ - Args: - anchors (list[Boxes]): A list of #feature level Boxes. - The Boxes contains anchors of this image on the specific feature level. - gt_instances (list[Instances]): a list of N `Instances`s. The i-th - `Instances` contains the ground-truth per-instance annotations - for the i-th input image. - - Returns: - list[Tensor]: List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across all feature maps (sum(Hi * Wi * A)). - Label values are in {-1, 0, ..., K}, with -1 means ignore, and K means background. - - list[Tensor]: i-th element is a Rx4 tensor, where R is the total number of anchors - across feature maps. The values are the matched gt boxes for each anchor. - Values are undefined for those anchors not labeled as foreground. - """ - anchors = Boxes.cat(anchors) # Rx4 - - gt_labels = [] - matched_gt_boxes = [] - for gt_per_image in gt_instances: - match_quality_matrix = pairwise_iou(gt_per_image.gt_boxes, anchors) - matched_idxs, anchor_labels = self.anchor_matcher(match_quality_matrix) - del match_quality_matrix - - if len(gt_per_image) > 0: - matched_gt_boxes_i = gt_per_image.gt_boxes.tensor[matched_idxs] - - gt_labels_i = gt_per_image.gt_classes[matched_idxs] - # Anchors with label 0 are treated as background. - gt_labels_i[anchor_labels == 0] = self.num_classes - # Anchors with label -1 are ignored. 
- gt_labels_i[anchor_labels == -1] = -1 - else: - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - gt_labels_i = torch.zeros_like(matched_idxs) + self.num_classes - - gt_labels.append(gt_labels_i) - matched_gt_boxes.append(matched_gt_boxes_i) - - return gt_labels, matched_gt_boxes - - def forward_inference( - self, images: ImageList, features: List[Tensor], predictions: List[List[Tensor]] - ): - pred_logits, pred_anchor_deltas = self._transpose_dense_predictions( - predictions, [self.num_classes, 4] - ) - anchors = self.anchor_generator(features) - - results: List[Instances] = [] - for img_idx, image_size in enumerate(images.image_sizes): - scores_per_image = [x[img_idx].sigmoid_() for x in pred_logits] - deltas_per_image = [x[img_idx] for x in pred_anchor_deltas] - results_per_image = self.inference_single_image( - anchors, scores_per_image, deltas_per_image, image_size - ) - results.append(results_per_image) - return results - - def inference_single_image( - self, - anchors: List[Boxes], - box_cls: List[Tensor], - box_delta: List[Tensor], - image_size: Tuple[int, int], - ): - """ - Single-image inference. Return bounding-box detection results by thresholding - on scores and applying non-maximum suppression (NMS). - - Arguments: - anchors (list[Boxes]): list of #feature levels. Each entry contains - a Boxes object, which contains all the anchors in that feature level. - box_cls (list[Tensor]): list of #feature levels. Each entry contains - tensor of size (H x W x A, K) - box_delta (list[Tensor]): Same shape as 'box_cls' except that K becomes 4. - image_size (tuple(H, W)): a tuple of the image height and width. - - Returns: - Same as `inference`, but for only one image. - """ - pred = self._decode_multi_level_predictions( - anchors, - box_cls, - box_delta, - self.test_score_thresh, - self.test_topk_candidates, - image_size, - ) - keep = batched_nms( # per-class NMS - pred.pred_boxes.tensor, pred.scores, pred.pred_classes, self.test_nms_thresh - ) - return pred[keep[: self.max_detections_per_image]] - - -class RetinaNetHead(nn.Module): - """ - The head used in RetinaNet for object classification and box regression. - It has two subnets for the two tasks, with a common structure but separate parameters. - """ - - @configurable - def __init__( - self, - *, - input_shape: List[ShapeSpec], - num_classes, - num_anchors, - conv_dims: List[int], - norm="", - prior_prob=0.01, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (List[ShapeSpec]): input shape - num_classes (int): number of classes. Used to label background proposals. - num_anchors (int): number of generated anchors - conv_dims (List[int]): dimensions for each convolution layer - norm (str or callable): - Normalization for conv layers except for the two output layers. - See :func:`detectron2.layers.get_norm` for supported types. - prior_prob (float): Prior weight for computing bias - """ - super().__init__() - - self._num_features = len(input_shape) - if norm == "BN" or norm == "SyncBN": - logger.info( - f"Using domain-specific {norm} in RetinaNetHead with len={self._num_features}." - ) - bn_class = nn.BatchNorm2d if norm == "BN" else nn.SyncBatchNorm - - def norm(c): - return CycleBatchNormList( - length=self._num_features, bn_class=bn_class, num_features=c - ) - - else: - norm_name = str(type(get_norm(norm, 1))) - if "BN" in norm_name: - logger.warning( - f"Shared BatchNorm (type={norm_name}) may not work well in RetinaNetHead." 
- ) - - cls_subnet = [] - bbox_subnet = [] - for in_channels, out_channels in zip( - [input_shape[0].channels] + list(conv_dims), conv_dims - ): - cls_subnet.append( - nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1) - ) - if norm: - cls_subnet.append(get_norm(norm, out_channels)) - cls_subnet.append(nn.ReLU()) - bbox_subnet.append( - nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1) - ) - if norm: - bbox_subnet.append(get_norm(norm, out_channels)) - bbox_subnet.append(nn.ReLU()) - - self.cls_subnet = nn.Sequential(*cls_subnet) - self.bbox_subnet = nn.Sequential(*bbox_subnet) - self.cls_score = nn.Conv2d( - conv_dims[-1], num_anchors * num_classes, kernel_size=3, stride=1, padding=1 - ) - self.bbox_pred = nn.Conv2d( - conv_dims[-1], num_anchors * 4, kernel_size=3, stride=1, padding=1 - ) - - # Initialization - for modules in [self.cls_subnet, self.bbox_subnet, self.cls_score, self.bbox_pred]: - for layer in modules.modules(): - if isinstance(layer, nn.Conv2d): - torch.nn.init.normal_(layer.weight, mean=0, std=0.01) - torch.nn.init.constant_(layer.bias, 0) - - # Use prior in model initialization to improve stability - bias_value = -(math.log((1 - prior_prob) / prior_prob)) - torch.nn.init.constant_(self.cls_score.bias, bias_value) - - @classmethod - def from_config(cls, cfg, input_shape: List[ShapeSpec]): - num_anchors = build_anchor_generator(cfg, input_shape).num_cell_anchors - assert ( - len(set(num_anchors)) == 1 - ), "Using different number of anchors between levels is not currently supported!" - num_anchors = num_anchors[0] - - return { - "input_shape": input_shape, - "num_classes": cfg.MODEL.RETINANET.NUM_CLASSES, - "conv_dims": [input_shape[0].channels] * cfg.MODEL.RETINANET.NUM_CONVS, - "prior_prob": cfg.MODEL.RETINANET.PRIOR_PROB, - "norm": cfg.MODEL.RETINANET.NORM, - "num_anchors": num_anchors, - } - - def forward(self, features: List[Tensor]): - """ - Arguments: - features (list[Tensor]): FPN feature map tensors in high to low resolution. - Each tensor in the list correspond to different feature levels. - - Returns: - logits (list[Tensor]): #lvl tensors, each has shape (N, AxK, Hi, Wi). - The tensor predicts the classification probability - at each spatial position for each of the A anchors and K object - classes. - bbox_reg (list[Tensor]): #lvl tensors, each has shape (N, Ax4, Hi, Wi). - The tensor predicts 4-vector (dx,dy,dw,dh) box - regression values for every anchor. These values are the - relative offset between the anchor and the ground truth box. - """ - assert len(features) == self._num_features - logits = [] - bbox_reg = [] - for feature in features: - logits.append(self.cls_score(self.cls_subnet(feature))) - bbox_reg.append(self.bbox_pred(self.bbox_subnet(feature))) - return logits, bbox_reg diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py deleted file mode 100755 index 6dd3dc23..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py +++ /dev/null @@ -1,260 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
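
One detail of the removed `RetinaNetHead` worth spelling out is the `prior_prob` bias initialization: setting the classification bias to `-log((1 - p) / p)` makes every anchor's initial sigmoid score equal to `p`, which keeps the focal loss from being swamped by background early in training. A small self-contained check of that identity (channel and anchor counts are illustrative):

```
import math
import torch
from torch import nn

prior_prob = 0.01                 # detectron2's default RETINANET.PRIOR_PROB
num_anchors, num_classes = 9, 80  # illustrative A and K
cls_score = nn.Conv2d(256, num_anchors * num_classes, kernel_size=3, padding=1)

# bias = -log((1 - p) / p)  =>  sigmoid(bias) == p
bias_value = -math.log((1 - prior_prob) / prior_prob)
nn.init.constant_(cls_score.bias, bias_value)

assert abs(torch.sigmoid(cls_score.bias[0]).item() - prior_prob) < 1e-6
```
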
-import numpy as np -from typing import Callable, Dict, Optional, Tuple, Union -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ShapeSpec, get_norm -from detectron2.structures import ImageList -from detectron2.utils.registry import Registry - -from ..backbone import Backbone, build_backbone -from ..postprocessing import sem_seg_postprocess -from .build import META_ARCH_REGISTRY - -__all__ = [ - "SemanticSegmentor", - "SEM_SEG_HEADS_REGISTRY", - "SemSegFPNHead", - "build_sem_seg_head", -] - - -SEM_SEG_HEADS_REGISTRY = Registry("SEM_SEG_HEADS") -SEM_SEG_HEADS_REGISTRY.__doc__ = """ -Registry for semantic segmentation heads, which make semantic segmentation predictions -from feature maps. -""" - - -@META_ARCH_REGISTRY.register() -class SemanticSegmentor(nn.Module): - """ - Main class for semantic segmentation architectures. - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - sem_seg_head: nn.Module, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - ): - """ - Args: - backbone: a backbone module, must follow detectron2's backbone interface - sem_seg_head: a module that predicts semantic segmentation from backbone features - pixel_mean, pixel_std: list or tuple with #channels element, representing - the per-channel mean and std to be used to normalize the input image - """ - super().__init__() - self.backbone = backbone - self.sem_seg_head = sem_seg_head - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - sem_seg_head = build_sem_seg_head(cfg, backbone.output_shape()) - return { - "backbone": backbone, - "sem_seg_head": sem_seg_head, - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - } - - @property - def device(self): - return self.pixel_mean.device - - def forward(self, batched_inputs): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper`. - Each item in the list contains the inputs for one image. - - For now, each item in the list is a dict that contains: - - * "image": Tensor, image in (C, H, W) format. - * "sem_seg": semantic segmentation ground truth - * Other information that's included in the original dicts, such as: - "height", "width" (int): the output resolution of the model (may be different - from input resolution), used in inference. - - - Returns: - list[dict]: - Each dict is the output for one input image. - The dict contains one key "sem_seg" whose value is a - Tensor that represents the - per-pixel segmentation predicted by the head. - The prediction has shape KxHxW that represents the logits of - each class for each pixel.
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - - features = self.backbone(images.tensor) - - if "sem_seg" in batched_inputs[0]: - targets = [x["sem_seg"].to(self.device) for x in batched_inputs] - targets = ImageList.from_tensors( - targets, self.backbone.size_divisibility, self.sem_seg_head.ignore_value - ).tensor - else: - targets = None - results, losses = self.sem_seg_head(features, targets) - - if self.training: - return losses - - processed_results = [] - for result, input_per_image, image_size in zip(results, batched_inputs, images.image_sizes): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = sem_seg_postprocess(result, image_size, height, width) - processed_results.append({"sem_seg": r}) - return processed_results - - -def build_sem_seg_head(cfg, input_shape): - """ - Build a semantic segmentation head from `cfg.MODEL.SEM_SEG_HEAD.NAME`. - """ - name = cfg.MODEL.SEM_SEG_HEAD.NAME - return SEM_SEG_HEADS_REGISTRY.get(name)(cfg, input_shape) - - -@SEM_SEG_HEADS_REGISTRY.register() -class SemSegFPNHead(nn.Module): - """ - A semantic segmentation head described in :paper:`PanopticFPN`. - It takes a list of FPN features as input, and applies a sequence of - 3x3 convs and upsampling to scale all of them to the stride defined by - ``common_stride``. Then these features are added and used to make final - predictions by another 1x1 conv layer. - """ - - @configurable - def __init__( - self, - input_shape: Dict[str, ShapeSpec], - *, - num_classes: int, - conv_dims: int, - common_stride: int, - loss_weight: float = 1.0, - norm: Optional[Union[str, Callable]] = None, - ignore_value: int = -1, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape: shapes (channels and stride) of the input features - num_classes: number of classes to predict - conv_dims: number of output channels for the intermediate conv layers. - common_stride: the common stride that all features will be upscaled to - loss_weight: loss weight - norm (str or callable): normalization for all conv layers - ignore_value: category id to be ignored during training. 
- """ - super().__init__() - input_shape = sorted(input_shape.items(), key=lambda x: x[1].stride) - if not len(input_shape): - raise ValueError("SemSegFPNHead(input_shape=) cannot be empty!") - self.in_features = [k for k, v in input_shape] - feature_strides = [v.stride for k, v in input_shape] - feature_channels = [v.channels for k, v in input_shape] - - self.ignore_value = ignore_value - self.common_stride = common_stride - self.loss_weight = loss_weight - - self.scale_heads = [] - for in_feature, stride, channels in zip( - self.in_features, feature_strides, feature_channels - ): - head_ops = [] - head_length = max(1, int(np.log2(stride) - np.log2(self.common_stride))) - for k in range(head_length): - norm_module = get_norm(norm, conv_dims) - conv = Conv2d( - channels if k == 0 else conv_dims, - conv_dims, - kernel_size=3, - stride=1, - padding=1, - bias=not norm, - norm=norm_module, - activation=F.relu, - ) - weight_init.c2_msra_fill(conv) - head_ops.append(conv) - if stride != self.common_stride: - head_ops.append( - nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False) - ) - self.scale_heads.append(nn.Sequential(*head_ops)) - self.add_module(in_feature, self.scale_heads[-1]) - self.predictor = Conv2d(conv_dims, num_classes, kernel_size=1, stride=1, padding=0) - weight_init.c2_msra_fill(self.predictor) - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - return { - "input_shape": { - k: v for k, v in input_shape.items() if k in cfg.MODEL.SEM_SEG_HEAD.IN_FEATURES - }, - "ignore_value": cfg.MODEL.SEM_SEG_HEAD.IGNORE_VALUE, - "num_classes": cfg.MODEL.SEM_SEG_HEAD.NUM_CLASSES, - "conv_dims": cfg.MODEL.SEM_SEG_HEAD.CONVS_DIM, - "common_stride": cfg.MODEL.SEM_SEG_HEAD.COMMON_STRIDE, - "norm": cfg.MODEL.SEM_SEG_HEAD.NORM, - "loss_weight": cfg.MODEL.SEM_SEG_HEAD.LOSS_WEIGHT, - } - - def forward(self, features, targets=None): - """ - Returns: - In training, returns (None, dict of losses) - In inference, returns (CxHxW logits, {}) - """ - x = self.layers(features) - if self.training: - return None, self.losses(x, targets) - else: - x = F.interpolate( - x, scale_factor=self.common_stride, mode="bilinear", align_corners=False - ) - return x, {} - - def layers(self, features): - for i, f in enumerate(self.in_features): - if i == 0: - x = self.scale_heads[i](features[f]) - else: - x = x + self.scale_heads[i](features[f]) - x = self.predictor(x) - return x - - def losses(self, predictions, targets): - predictions = predictions.float() # https://github.com/pytorch/pytorch/issues/48163 - predictions = F.interpolate( - predictions, - scale_factor=self.common_stride, - mode="bilinear", - align_corners=False, - ) - loss = F.cross_entropy( - predictions, targets, reduction="mean", ignore_index=self.ignore_value - ) - losses = {"loss_sem_seg": loss * self.loss_weight} - return losses diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py deleted file mode 100755 index 386e9296..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py +++ /dev/null @@ -1,274 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import itertools -import logging -import numpy as np -from collections import OrderedDict -from collections.abc import Mapping -from typing import Dict, List, Optional, Tuple, Union -import torch -from omegaconf import DictConfig, OmegaConf -from torch import Tensor, nn - -from detectron2.layers import ShapeSpec -from detectron2.structures import BitMasks, Boxes, ImageList, Instances -from detectron2.utils.events import get_event_storage - -from .backbone import Backbone - -logger = logging.getLogger(__name__) - - -def _to_container(cfg): - """ - mmdet will assert the type of dict/list. - So convert omegaconf objects to dict/list. - """ - if isinstance(cfg, DictConfig): - cfg = OmegaConf.to_container(cfg, resolve=True) - from mmcv.utils import ConfigDict - - return ConfigDict(cfg) - - -class MMDetBackbone(Backbone): - """ - Wrapper of mmdetection backbones to use in detectron2. - - mmdet backbones produce list/tuple of tensors, while detectron2 backbones - produce a dict of tensors. This class wraps the given backbone to produce - output in detectron2's convention, so it can be used in place of detectron2 - backbones. - """ - - def __init__( - self, - backbone: Union[nn.Module, Mapping], - neck: Union[nn.Module, Mapping, None] = None, - *, - output_shapes: List[ShapeSpec], - output_names: Optional[List[str]] = None, - ): - """ - Args: - backbone: either a backbone module or a mmdet config dict that defines a - backbone. The backbone takes a 4D image tensor and returns a - sequence of tensors. - neck: either a backbone module or a mmdet config dict that defines a - neck. The neck takes outputs of backbone and returns a - sequence of tensors. If None, no neck is used. - pretrained_backbone: defines the backbone weights that can be loaded by - mmdet, such as "torchvision://resnet50". - output_shapes: shape for every output of the backbone (or neck, if given). - stride and channels are often needed. - output_names: names for every output of the backbone (or neck, if given). - By default, will use "out0", "out1", ... - """ - super().__init__() - if isinstance(backbone, Mapping): - from mmdet.models import build_backbone - - backbone = build_backbone(_to_container(backbone)) - self.backbone = backbone - - if isinstance(neck, Mapping): - from mmdet.models import build_neck - - neck = build_neck(_to_container(neck)) - self.neck = neck - - # "Neck" weights, if any, are part of neck itself. This is the interface - # of mmdet so we follow it. Reference: - # https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/two_stage.py - logger.info("Initializing mmdet backbone weights...") - self.backbone.init_weights() - # train() in mmdet modules is non-trivial, and has to be explicitly - # called. Reference: - # https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/backbones/resnet.py - self.backbone.train() - if self.neck is not None: - logger.info("Initializing mmdet neck weights ...") - if isinstance(self.neck, nn.Sequential): - for m in self.neck: - m.init_weights() - else: - self.neck.init_weights() - self.neck.train() - - self._output_shapes = output_shapes - if not output_names: - output_names = [f"out{i}" for i in range(len(output_shapes))] - self._output_names = output_names - - def forward(self, x) -> Dict[str, Tensor]: - outs = self.backbone(x) - if self.neck is not None: - outs = self.neck(outs) - assert isinstance( - outs, (list, tuple) - ), "mmdet backbone should return a list/tuple of tensors!" 
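
As an aside to `MMDetBackbone.forward` above, a minimal mmdet-free sketch of the output contract it enforces: a backbone's list/tuple of per-level tensors becomes a dict keyed by the documented default names `out0`, `out1`, ... (tensor shapes here are dummies):

```
from typing import Dict, List, Optional
import torch
from torch import Tensor

def to_feature_dict(outs: List[Tensor], names: Optional[List[str]] = None) -> Dict[str, Tensor]:
    # Validate the sequence output, then key each level by name so that
    # downstream detectron2 code can address features as a dict.
    assert isinstance(outs, (list, tuple)), "backbone should return a list/tuple of tensors!"
    if names is None:
        names = [f"out{i}" for i in range(len(outs))]  # the documented default naming
    assert len(names) == len(outs), "need one name per output level"
    return dict(zip(names, outs))

feats = to_feature_dict([torch.zeros(1, 256, 64, 64), torch.zeros(1, 256, 32, 32)])
print(list(feats))  # ['out0', 'out1']
```
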
- if len(outs) != len(self._output_shapes): - raise ValueError( - "Length of output_shapes does not match outputs from the mmdet backbone: " - f"{len(outs)} != {len(self._output_shapes)}" - ) - return {k: v for k, v in zip(self._output_names, outs)} - - def output_shape(self) -> Dict[str, ShapeSpec]: - return {k: v for k, v in zip(self._output_names, self._output_shapes)} - - -class MMDetDetector(nn.Module): - """ - Wrapper of a mmdetection detector model, for detection and instance segmentation. - Input/output formats of this class follow detectron2's convention, so a - mmdetection model can be trained and evaluated in detectron2. - """ - - def __init__( - self, - detector: Union[nn.Module, Mapping], - *, - # Default is 32 regardless of model: - # https://github.com/open-mmlab/mmdetection/tree/master/configs/_base_/datasets - size_divisibility=32, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - ): - """ - Args: - detector: a mmdet detector, or a mmdet config dict that defines a detector. - size_divisibility: pad input images to multiple of this number - pixel_mean: per-channel mean to normalize input image - pixel_std: per-channel stddev to normalize input image - """ - super().__init__() - if isinstance(detector, Mapping): - from mmdet.models import build_detector - - detector = build_detector(_to_container(detector)) - self.detector = detector - self.size_divisibility = size_divisibility - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - assert ( - self.pixel_mean.shape == self.pixel_std.shape - ), f"{self.pixel_mean} and {self.pixel_std} have different shapes!" - - def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]): - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, size_divisibility=self.size_divisibility).tensor - metas = [] - rescale = {"height" in x for x in batched_inputs} - if len(rescale) != 1: - raise ValueError("Some inputs have original height/width, but some don't!") - rescale = list(rescale)[0] - output_shapes = [] - for input in batched_inputs: - meta = {} - c, h, w = input["image"].shape - meta["img_shape"] = meta["ori_shape"] = (h, w, c) - if rescale: - scale_factor = np.array( - [w / input["width"], h / input["height"]] * 2, dtype="float32" - ) - ori_shape = (input["height"], input["width"]) - output_shapes.append(ori_shape) - meta["ori_shape"] = ori_shape + (c,) - else: - scale_factor = 1.0 - output_shapes.append((h, w)) - meta["scale_factor"] = scale_factor - meta["flip"] = False - padh, padw = images.shape[-2:] - meta["pad_shape"] = (padh, padw, c) - metas.append(meta) - - if self.training: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - if gt_instances[0].has("gt_masks"): - from mmdet.core import PolygonMasks as mm_PolygonMasks, BitmapMasks as mm_BitMasks - - def convert_mask(m, shape): - # mmdet mask format - if isinstance(m, BitMasks): - return mm_BitMasks(m.tensor.cpu().numpy(), shape[0], shape[1]) - else: - return mm_PolygonMasks(m.polygons, shape[0], shape[1]) - - gt_masks = [convert_mask(x.gt_masks, x.image_size) for x in gt_instances] - losses_and_metrics = self.detector.forward_train( - images, - metas, - [x.gt_boxes.tensor for x in gt_instances], - [x.gt_classes for x in gt_instances], - gt_masks=gt_masks, - ) - else: - losses_and_metrics = 
self.detector.forward_train( - images, - metas, - [x.gt_boxes.tensor for x in gt_instances], - [x.gt_classes for x in gt_instances], - ) - return _parse_losses(losses_and_metrics) - else: - results = self.detector.simple_test(images, metas, rescale=rescale) - results = [ - {"instances": _convert_mmdet_result(r, shape)} - for r, shape in zip(results, output_shapes) - ] - return results - - @property - def device(self): - return self.pixel_mean.device - - -# Reference: show_result() in -# https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/base.py -def _convert_mmdet_result(result, shape: Tuple[int, int]) -> Instances: - if isinstance(result, tuple): - bbox_result, segm_result = result - if isinstance(segm_result, tuple): - segm_result = segm_result[0] - else: - bbox_result, segm_result = result, None - - bboxes = torch.from_numpy(np.vstack(bbox_result)) # Nx5 - bboxes, scores = bboxes[:, :4], bboxes[:, -1] - labels = [ - torch.full((bbox.shape[0],), i, dtype=torch.int32) for i, bbox in enumerate(bbox_result) - ] - labels = torch.cat(labels) - inst = Instances(shape) - inst.pred_boxes = Boxes(bboxes) - inst.scores = scores - inst.pred_classes = labels - - if segm_result is not None and len(labels) > 0: - segm_result = list(itertools.chain(*segm_result)) - segm_result = [torch.from_numpy(x) if isinstance(x, np.ndarray) else x for x in segm_result] - segm_result = torch.stack(segm_result, dim=0) - inst.pred_masks = segm_result - return inst - - -# reference: https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/base.py -def _parse_losses(losses: Dict[str, Tensor]) -> Dict[str, Tensor]: - log_vars = OrderedDict() - for loss_name, loss_value in losses.items(): - if isinstance(loss_value, torch.Tensor): - log_vars[loss_name] = loss_value.mean() - elif isinstance(loss_value, list): - log_vars[loss_name] = sum(_loss.mean() for _loss in loss_value) - else: - raise TypeError(f"{loss_name} is not a tensor or list of tensors") - - if "loss" not in loss_name: - # put metrics to storage; don't return them - storage = get_event_storage() - value = log_vars.pop(loss_name).cpu().item() - storage.put_scalar(loss_name, value) - return log_vars diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/poolers.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/poolers.py deleted file mode 100755 index 6bea77af..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/poolers.py +++ /dev/null @@ -1,245 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from typing import List -import torch -from torch import nn -from torchvision.ops import RoIPool - -from detectron2.layers import ROIAlign, ROIAlignRotated, cat, nonzero_tuple, shapes_to_tensor -from detectron2.structures import Boxes - -""" -To export ROIPooler to torchscript, in this file, variables that should be annotated with -`Union[List[Boxes], List[RotatedBoxes]]` are only annotated with `List[Boxes]`. - -TODO: Correct these annotations when torchscript support `Union`. -https://github.com/pytorch/pytorch/issues/41412 -""" - -__all__ = ["ROIPooler"] - - -def assign_boxes_to_levels( - box_lists: List[Boxes], - min_level: int, - max_level: int, - canonical_box_size: int, - canonical_level: int, -): - """ - Map each box in `box_lists` to a feature map level index and return the assignment - vector. 
- - Args: - box_lists (list[Boxes] | list[RotatedBoxes]): A list of N Boxes or N RotatedBoxes, - where N is the number of images in the batch. - min_level (int): Smallest feature map level index. The input is considered index 0, - the output of stage 1 is index 1, and so on. - max_level (int): Largest feature map level index. - canonical_box_size (int): A canonical box size in pixels (sqrt(box area)). - canonical_level (int): The feature map level index on which a canonically-sized box - should be placed. - - Returns: - A tensor of length M, where M is the total number of boxes aggregated over all - N batch images. The memory layout corresponds to the concatenation of boxes - from all images. Each element is the feature map index, as an offset from - `self.min_level`, for the corresponding box (so value i means the box is at - `self.min_level + i`). - """ - box_sizes = torch.sqrt(cat([boxes.area() for boxes in box_lists])) - # Eqn.(1) in FPN paper - level_assignments = torch.floor( - canonical_level + torch.log2(box_sizes / canonical_box_size + 1e-8) - ) - # clamp level to (min, max), in case the box size is too large or too small - # for the available feature maps - level_assignments = torch.clamp(level_assignments, min=min_level, max=max_level) - return level_assignments.to(torch.int64) - min_level - - -def convert_boxes_to_pooler_format(box_lists: List[Boxes]): - """ - Convert all boxes in `box_lists` to the low-level format used by ROI pooling ops - (see description under Returns). - - Args: - box_lists (list[Boxes] | list[RotatedBoxes]): - A list of N Boxes or N RotatedBoxes, where N is the number of images in the batch. - - Returns: - When input is list[Boxes]: - A tensor of shape (M, 5), where M is the total number of boxes aggregated over all - N batch images. - The 5 columns are (batch index, x0, y0, x1, y1), where batch index - is the index in [0, N) identifying which batch image the box with corners at - (x0, y0, x1, y1) comes from. - When input is list[RotatedBoxes]: - A tensor of shape (M, 6), where M is the total number of boxes aggregated over all - N batch images. - The 6 columns are (batch index, x_ctr, y_ctr, width, height, angle_degrees), - where batch index is the index in [0, N) identifying which batch image the - rotated box (x_ctr, y_ctr, width, height, angle_degrees) comes from. - """ - boxes = torch.cat([x.tensor for x in box_lists], dim=0) - # __len__ returns Tensor in tracing. - sizes = shapes_to_tensor([x.__len__() for x in box_lists], device=boxes.device) - indices = torch.repeat_interleave( - torch.arange(len(box_lists), dtype=boxes.dtype, device=boxes.device), sizes - ) - return cat([indices[:, None], boxes], dim=1) - - -class ROIPooler(nn.Module): - """ - Region of interest feature map pooler that supports pooling from one or more - feature maps. - """ - - def __init__( - self, - output_size, - scales, - sampling_ratio, - pooler_type, - canonical_box_size=224, - canonical_level=4, - ): - """ - Args: - output_size (int, tuple[int] or list[int]): output size of the pooled region, - e.g., 14 x 14. If tuple or list is given, the length must be 2. - scales (list[float]): The scale for each low-level pooling op relative to - the input image. For a feature map with stride s relative to the input - image, scale is defined as 1/s. The stride must be a power of 2. - When there are multiple scales, they must form a pyramid, i.e. they must be - a monotonically decreasing geometric sequence with a factor of 1/2.
- sampling_ratio (int): The `sampling_ratio` parameter for the ROIAlign op. - pooler_type (string): Name of the type of pooling operation that should be applied. - For instance, "ROIPool" or "ROIAlignV2". - canonical_box_size (int): A canonical box size in pixels (sqrt(box area)). The default - is heuristically defined as 224 pixels in the FPN paper (based on ImageNet - pre-training). - canonical_level (int): The feature map level index from which a canonically-sized box - should be placed. The default is defined as level 4 (stride=16) in the FPN paper, - i.e., a box of size 224x224 will be placed on the feature with stride=16. - The box placement for all boxes will be determined from their sizes w.r.t - canonical_box_size. For example, a box whose area is 4x that of a canonical box - should be used to pool features from feature level ``canonical_level+1``. - - Note that the actual input feature maps given to this module may not have - sufficiently many levels for the input boxes. If the boxes are too large or too - small for the input feature maps, the closest level will be used. - """ - super().__init__() - - if isinstance(output_size, int): - output_size = (output_size, output_size) - assert len(output_size) == 2 - assert isinstance(output_size[0], int) and isinstance(output_size[1], int) - self.output_size = output_size - - if pooler_type == "ROIAlign": - self.level_poolers = nn.ModuleList( - ROIAlign( - output_size, spatial_scale=scale, sampling_ratio=sampling_ratio, aligned=False - ) - for scale in scales - ) - elif pooler_type == "ROIAlignV2": - self.level_poolers = nn.ModuleList( - ROIAlign( - output_size, spatial_scale=scale, sampling_ratio=sampling_ratio, aligned=True - ) - for scale in scales - ) - elif pooler_type == "ROIPool": - self.level_poolers = nn.ModuleList( - RoIPool(output_size, spatial_scale=scale) for scale in scales - ) - elif pooler_type == "ROIAlignRotated": - self.level_poolers = nn.ModuleList( - ROIAlignRotated(output_size, spatial_scale=scale, sampling_ratio=sampling_ratio) - for scale in scales - ) - else: - raise ValueError("Unknown pooler type: {}".format(pooler_type)) - - # Map scale (defined as 1 / stride) to its feature map level under the - # assumption that stride is a power of 2. - min_level = -(math.log2(scales[0])) - max_level = -(math.log2(scales[-1])) - assert math.isclose(min_level, int(min_level)) and math.isclose( - max_level, int(max_level) - ), "Featuremap stride is not power of 2!" - self.min_level = int(min_level) - self.max_level = int(max_level) - assert ( - len(scales) == self.max_level - self.min_level + 1 - ), "[ROIPooler] Sizes of input featuremaps do not form a pyramid!" - assert 0 <= self.min_level and self.min_level <= self.max_level - self.canonical_level = canonical_level - assert canonical_box_size > 0 - self.canonical_box_size = canonical_box_size - - def forward(self, x: List[torch.Tensor], box_lists: List[Boxes]): - """ - Args: - x (list[Tensor]): A list of feature maps of NCHW shape, with scales matching those - used to construct this module. - box_lists (list[Boxes] | list[RotatedBoxes]): - A list of N Boxes or N RotatedBoxes, where N is the number of images in the batch. - The box coordinates are defined on the original image and - will be scaled by the `scales` argument of :class:`ROIPooler`. - - Returns: - Tensor: - A tensor of shape (M, C, output_size, output_size) where M is the total number of - boxes aggregated over all N batch images and C is the number of channels in `x`. 
- """ - num_level_assignments = len(self.level_poolers) - - assert isinstance(x, list) and isinstance( - box_lists, list - ), "Arguments to pooler must be lists" - assert ( - len(x) == num_level_assignments - ), "unequal value, num_level_assignments={}, but x is list of {} Tensors".format( - num_level_assignments, len(x) - ) - - assert len(box_lists) == x[0].size( - 0 - ), "unequal value, x[0] batch dim 0 is {}, but box_list has length {}".format( - x[0].size(0), len(box_lists) - ) - if len(box_lists) == 0: - return torch.zeros( - (0, x[0].shape[1]) + self.output_size, device=x[0].device, dtype=x[0].dtype - ) - - pooler_fmt_boxes = convert_boxes_to_pooler_format(box_lists) - - if num_level_assignments == 1: - return self.level_poolers[0](x[0], pooler_fmt_boxes) - - level_assignments = assign_boxes_to_levels( - box_lists, self.min_level, self.max_level, self.canonical_box_size, self.canonical_level - ) - - num_boxes = pooler_fmt_boxes.size(0) - num_channels = x[0].shape[1] - output_size = self.output_size[0] - - dtype, device = x[0].dtype, x[0].device - output = torch.zeros( - (num_boxes, num_channels, output_size, output_size), dtype=dtype, device=device - ) - - for level, pooler in enumerate(self.level_poolers): - inds = nonzero_tuple(level_assignments == level)[0] - pooler_fmt_boxes_level = pooler_fmt_boxes[inds] - # Use index_put_ instead of advance indexing, to avoid pytorch/issues/49852 - output.index_put_((inds,), pooler(x[level], pooler_fmt_boxes_level)) - - return output diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/postprocessing.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/postprocessing.py deleted file mode 100755 index 52f273bb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/postprocessing.py +++ /dev/null @@ -1,101 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -from torch.nn import functional as F - -from detectron2.structures import Instances, ROIMasks - - -# perhaps should rename to "resize_instance" -def detector_postprocess( - results: Instances, output_height: int, output_width: int, mask_threshold: float = 0.5 -): - """ - Resize the output instances. - The input images are often resized when entering an object detector. - As a result, we often need the outputs of the detector in a different - resolution from its inputs. - - This function will resize the raw outputs of an R-CNN detector - to produce outputs according to the desired output resolution. - - Args: - results (Instances): the raw outputs from the detector. - `results.image_size` contains the input image resolution the detector sees. - This object might be modified in-place. - output_height, output_width: the desired output resolution. - - Returns: - Instances: the resized output from the model, based on the output resolution - """ - if isinstance(output_width, torch.Tensor): - # This shape might (but not necessarily) be tensors during tracing. - # Converts integer tensors to float temporaries to ensure true - # division is performed when computing scale_x and scale_y. 
-        output_width_tmp = output_width.float()
-        output_height_tmp = output_height.float()
-        new_size = torch.stack([output_height, output_width])
-    else:
-        new_size = (output_height, output_width)
-        output_width_tmp = output_width
-        output_height_tmp = output_height
-
-    scale_x, scale_y = (
-        output_width_tmp / results.image_size[1],
-        output_height_tmp / results.image_size[0],
-    )
-    results = Instances(new_size, **results.get_fields())
-
-    if results.has("pred_boxes"):
-        output_boxes = results.pred_boxes
-    elif results.has("proposal_boxes"):
-        output_boxes = results.proposal_boxes
-    else:
-        output_boxes = None
-    assert output_boxes is not None, "Predictions must contain boxes!"
-
-    output_boxes.scale(scale_x, scale_y)
-    output_boxes.clip(results.image_size)
-
-    results = results[output_boxes.nonempty()]
-
-    if results.has("pred_masks"):
-        if isinstance(results.pred_masks, ROIMasks):
-            roi_masks = results.pred_masks
-        else:
-            # pred_masks is a tensor of shape (N, 1, M, M)
-            roi_masks = ROIMasks(results.pred_masks[:, 0, :, :])
-        results.pred_masks = roi_masks.to_bitmasks(
-            results.pred_boxes, output_height, output_width, mask_threshold
-        ).tensor  # TODO return ROIMasks/BitMask object in the future
-
-    if results.has("pred_keypoints"):
-        results.pred_keypoints[:, :, 0] *= scale_x
-        results.pred_keypoints[:, :, 1] *= scale_y
-
-    return results
-
-
-def sem_seg_postprocess(result, img_size, output_height, output_width):
-    """
-    Return semantic segmentation predictions in the original resolution.
-
-    The input images are often resized when entering the semantic segmentor. Moreover, in some
-    cases, they are also padded inside the segmentor to be divisible by the maximum network
-    stride. As a result, we often need the predictions of the segmentor in a different
-    resolution from its inputs.
-
-    Args:
-        result (Tensor): semantic segmentation prediction logits. A tensor of shape (C, H, W),
-            where C is the number of classes, and H, W are the height and width of the prediction.
-        img_size (tuple): image size that segmentor is taking as input.
-        output_height, output_width: the desired output resolution.
-
-    Returns:
-        semantic segmentation prediction (Tensor): A tensor of the shape
-            (C, output_height, output_width) that contains per-pixel soft predictions.
-    """
-    result = result[:, : img_size[0], : img_size[1]].expand(1, -1, -1, -1)
-    result = F.interpolate(
-        result, size=(output_height, output_width), mode="bilinear", align_corners=False
-    )[0]
-    return result
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py
deleted file mode 100755
index 3f4e4df7..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py
+++ /dev/null
@@ -1,5 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from .build import PROPOSAL_GENERATOR_REGISTRY, build_proposal_generator -from .rpn import RPN_HEAD_REGISTRY, build_rpn_head, RPN, StandardRPNHead - -__all__ = list(globals().keys()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py deleted file mode 100755 index 34eb12d0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.utils.registry import Registry - -PROPOSAL_GENERATOR_REGISTRY = Registry("PROPOSAL_GENERATOR") -PROPOSAL_GENERATOR_REGISTRY.__doc__ = """ -Registry for proposal generator, which produces object proposals from feature maps. - -The registered object will be called with `obj(cfg, input_shape)`. -The call should return a `nn.Module` object. -""" - -from . import rpn, rrpn # noqa F401 isort:skip - - -def build_proposal_generator(cfg, input_shape): - """ - Build a proposal generator from `cfg.MODEL.PROPOSAL_GENERATOR.NAME`. - The name can be "PrecomputedProposals" to use no proposal generator. - """ - name = cfg.MODEL.PROPOSAL_GENERATOR.NAME - if name == "PrecomputedProposals": - return None - - return PROPOSAL_GENERATOR_REGISTRY.get(name)(cfg, input_shape) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py deleted file mode 100755 index 47032198..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py +++ /dev/null @@ -1,196 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from typing import List, Tuple, Union -import torch - -from detectron2.layers import batched_nms, cat -from detectron2.structures import Boxes, Instances - -logger = logging.getLogger(__name__) - - -def _is_tracing(): - # (fixed in TORCH_VERSION >= 1.9) - if torch.jit.is_scripting(): - # https://github.com/pytorch/pytorch/issues/47379 - return False - else: - return torch.jit.is_tracing() - - -def find_top_rpn_proposals( - proposals: List[torch.Tensor], - pred_objectness_logits: List[torch.Tensor], - image_sizes: List[Tuple[int, int]], - nms_thresh: float, - pre_nms_topk: int, - post_nms_topk: int, - min_box_size: float, - training: bool, -): - """ - For each feature map, select the `pre_nms_topk` highest scoring proposals, - apply NMS, clip proposals, and remove small boxes. Return the `post_nms_topk` - highest scoring proposals among all the feature maps for each image. - - Args: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A, 4). - All proposal predictions on the feature maps. - pred_objectness_logits (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A). - image_sizes (list[tuple]): sizes (h, w) for each image - nms_thresh (float): IoU threshold to use for NMS - pre_nms_topk (int): number of top k scoring proposals to keep before applying NMS. - When RPN is run on multiple feature maps (as in FPN) this number is per - feature map. - post_nms_topk (int): number of top k scoring proposals to keep after applying NMS. 
- When RPN is run on multiple feature maps (as in FPN) this number is total, - over all feature maps. - min_box_size (float): minimum proposal box side length in pixels (absolute units - wrt input images). - training (bool): True if proposals are to be used in training, otherwise False. - This arg exists only to support a legacy bug; look for the "NB: Legacy bug ..." - comment. - - Returns: - list[Instances]: list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i, sorted by their - objectness score in descending order. - """ - num_images = len(image_sizes) - device = proposals[0].device - - # 1. Select top-k anchor for every level and every image - topk_scores = [] # #lvl Tensor, each of shape N x topk - topk_proposals = [] - level_ids = [] # #lvl Tensor, each of shape (topk,) - batch_idx = torch.arange(num_images, device=device) - for level_id, (proposals_i, logits_i) in enumerate(zip(proposals, pred_objectness_logits)): - Hi_Wi_A = logits_i.shape[1] - if isinstance(Hi_Wi_A, torch.Tensor): # it's a tensor in tracing - num_proposals_i = torch.clamp(Hi_Wi_A, max=pre_nms_topk) - else: - num_proposals_i = min(Hi_Wi_A, pre_nms_topk) - - topk_scores_i, topk_idx = logits_i.topk(num_proposals_i, dim=1) - - # each is N x topk - topk_proposals_i = proposals_i[batch_idx[:, None], topk_idx] # N x topk x 4 - - topk_proposals.append(topk_proposals_i) - topk_scores.append(topk_scores_i) - level_ids.append(torch.full((num_proposals_i,), level_id, dtype=torch.int64, device=device)) - - # 2. Concat all levels together - topk_scores = cat(topk_scores, dim=1) - topk_proposals = cat(topk_proposals, dim=1) - level_ids = cat(level_ids, dim=0) - - # 3. For each image, run a per-level NMS, and choose topk results. - results: List[Instances] = [] - for n, image_size in enumerate(image_sizes): - boxes = Boxes(topk_proposals[n]) - scores_per_img = topk_scores[n] - lvl = level_ids - - valid_mask = torch.isfinite(boxes.tensor).all(dim=1) & torch.isfinite(scores_per_img) - if not valid_mask.all(): - if training: - raise FloatingPointError( - "Predicted boxes or scores contain Inf/NaN. Training has diverged." - ) - boxes = boxes[valid_mask] - scores_per_img = scores_per_img[valid_mask] - lvl = lvl[valid_mask] - boxes.clip(image_size) - - # filter empty boxes - keep = boxes.nonempty(threshold=min_box_size) - if _is_tracing() or keep.sum().item() != len(boxes): - boxes, scores_per_img, lvl = boxes[keep], scores_per_img[keep], lvl[keep] - - keep = batched_nms(boxes.tensor, scores_per_img, lvl, nms_thresh) - # In Detectron1, there was different behavior during training vs. testing. - # (https://github.com/facebookresearch/Detectron/issues/459) - # During training, topk is over the proposals from *all* images in the training batch. - # During testing, it is over the proposals for each image separately. - # As a result, the training behavior becomes batch-dependent, - # and the configuration "POST_NMS_TOPK_TRAIN" end up relying on the batch size. - # This bug is addressed in Detectron2 to make the behavior independent of batch size. - keep = keep[:post_nms_topk] # keep is already sorted - - res = Instances(image_size) - res.proposal_boxes = boxes[keep] - res.objectness_logits = scores_per_img[keep] - results.append(res) - return results - - -def add_ground_truth_to_proposals( - gt: Union[List[Instances], List[Boxes]], proposals: List[Instances] -) -> List[Instances]: - """ - Call `add_ground_truth_to_proposals_single_image` for all images. 
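
Note the asymmetry documented above: `pre_nms_topk` caps proposals per feature level before NMS, while `post_nms_topk` caps the final total across all levels. A small sketch of just that bookkeeping, with hypothetical per-level counts and the NMS step omitted:

```python
import torch

pre_nms_topk, post_nms_topk = 1000, 300
# Hypothetical objectness logits for 3 FPN levels of one image.
levels = [torch.randn(4000), torch.randn(900), torch.randn(250)]

# 1. Keep at most pre_nms_topk proposals *per level*.
kept = [lvl.topk(min(lvl.numel(), pre_nms_topk)).values for lvl in levels]
print([k.numel() for k in kept])  # [1000, 900, 250]

# 2. After (per-level) NMS, keep post_nms_topk proposals *in total*.
merged = torch.cat(kept).sort(descending=True).values
final = merged[:post_nms_topk]
print(final.numel())  # 300
```
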
-
-    Args:
-        gt (Union[List[Instances], List[Boxes]]): list of N elements. Element i is an Instances
-            representing the ground-truth for image i.
-        proposals (list[Instances]): list of N elements. Element i is an Instances
-            representing the proposals for image i.
-
-    Returns:
-        list[Instances]: list of N Instances. Each is the proposals for the image,
-            with fields "proposal_boxes" and "objectness_logits".
-    """
-    assert gt is not None
-
-    if len(proposals) != len(gt):
-        raise ValueError("proposals and gt should have the same length as the number of images!")
-    if len(proposals) == 0:
-        return proposals
-
-    return [
-        add_ground_truth_to_proposals_single_image(gt_i, proposals_i)
-        for gt_i, proposals_i in zip(gt, proposals)
-    ]
-
-
-def add_ground_truth_to_proposals_single_image(
-    gt: Union[Instances, Boxes], proposals: Instances
-) -> Instances:
-    """
-    Augment `proposals` with `gt`.
-
-    Args:
-        Same as `add_ground_truth_to_proposals`, but with gt and proposals
-        per image.
-
-    Returns:
-        Same as `add_ground_truth_to_proposals`, but for only one image.
-    """
-    if isinstance(gt, Boxes):
-        # convert Boxes to Instances
-        gt = Instances(proposals.image_size, gt_boxes=gt)
-
-    gt_boxes = gt.gt_boxes
-    device = proposals.objectness_logits.device
-    # Assign all ground-truth boxes an objectness logit corresponding to
-    # P(object) = sigmoid(logit) =~ 1.
-    gt_logit_value = math.log((1.0 - 1e-10) / (1 - (1.0 - 1e-10)))
-    gt_logits = gt_logit_value * torch.ones(len(gt_boxes), device=device)
-
-    # Concatenating gt_boxes with proposals requires them to have the same fields
-    gt_proposal = Instances(proposals.image_size, **gt.get_fields())
-    gt_proposal.proposal_boxes = gt_boxes
-    gt_proposal.objectness_logits = gt_logits
-
-    for key in proposals.get_fields().keys():
-        assert gt_proposal.has(
-            key
-        ), "The attribute '{}' in `proposals` does not exist in `gt`".format(key)
-
-    # NOTE: Instances.cat only uses fields from the first item. Extra fields in latter items
-    # will be thrown away.
-    new_proposals = Instances.cat([proposals, gt_proposal])
-
-    return new_proposals
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py
deleted file mode 100755
index 99cd536d..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py
+++ /dev/null
@@ -1,533 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from typing import Dict, List, Optional, Tuple, Union
-import torch
-import torch.nn.functional as F
-from torch import nn
-
-from detectron2.config import configurable
-from detectron2.layers import Conv2d, ShapeSpec, cat
-from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou
-from detectron2.utils.events import get_event_storage
-from detectron2.utils.memory import retry_if_cuda_oom
-from detectron2.utils.registry import Registry
-
-from ..anchor_generator import build_anchor_generator
-from ..box_regression import Box2BoxTransform, _dense_box_regression_loss
-from ..matcher import Matcher
-from ..sampling import subsample_labels
-from .build import PROPOSAL_GENERATOR_REGISTRY
-from .proposal_utils import find_top_rpn_proposals
-
-RPN_HEAD_REGISTRY = Registry("RPN_HEAD")
-RPN_HEAD_REGISTRY.__doc__ = """
-Registry for RPN heads, which take feature maps and perform
-objectness classification and bounding box regression for anchors.
- -The registered object will be called with `obj(cfg, input_shape)`. -The call should return a `nn.Module` object. -""" - - -""" -Shape shorthand in this module: - - N: number of images in the minibatch - L: number of feature maps per image on which RPN is run - A: number of cell anchors (must be the same for all feature maps) - Hi, Wi: height and width of the i-th feature map - B: size of the box parameterization - -Naming convention: - - objectness: refers to the binary classification of an anchor as object vs. not object. - - deltas: refers to the 4-d (dx, dy, dw, dh) deltas that parameterize the box2box - transform (see :class:`box_regression.Box2BoxTransform`), or 5d for rotated boxes. - - pred_objectness_logits: predicted objectness scores in [-inf, +inf]; use - sigmoid(pred_objectness_logits) to estimate P(object). - - gt_labels: ground-truth binary classification labels for objectness - - pred_anchor_deltas: predicted box2box transform deltas - - gt_anchor_deltas: ground-truth box2box transform deltas -""" - - -def build_rpn_head(cfg, input_shape): - """ - Build an RPN head defined by `cfg.MODEL.RPN.HEAD_NAME`. - """ - name = cfg.MODEL.RPN.HEAD_NAME - return RPN_HEAD_REGISTRY.get(name)(cfg, input_shape) - - -@RPN_HEAD_REGISTRY.register() -class StandardRPNHead(nn.Module): - """ - Standard RPN classification and regression heads described in :paper:`Faster R-CNN`. - Uses a 3x3 conv to produce a shared hidden state from which one 1x1 conv predicts - objectness logits for each anchor and a second 1x1 conv predicts bounding-box deltas - specifying how to deform each anchor into an object proposal. - """ - - @configurable - def __init__( - self, *, in_channels: int, num_anchors: int, box_dim: int = 4, conv_dims: List[int] = (-1,) - ): - """ - NOTE: this interface is experimental. - - Args: - in_channels (int): number of input feature channels. When using multiple - input features, they must have the same number of channels. - num_anchors (int): number of anchors to predict for *each spatial position* - on the feature map. The total number of anchors for each - feature map will be `num_anchors * H * W`. - box_dim (int): dimension of a box, which is also the number of box regression - predictions to make for each anchor. An axis aligned box has - box_dim=4, while a rotated box has box_dim=5. - conv_dims (list[int]): a list of integers representing the output channels - of N conv layers. Set it to -1 to use the same number of output channels - as input channels. - """ - super().__init__() - cur_channels = in_channels - # Keeping the old variable names and structure for backwards compatiblity. - # Otherwise the old checkpoints will fail to load. - if len(conv_dims) == 1: - out_channels = cur_channels if conv_dims[0] == -1 else conv_dims[0] - # 3x3 conv for the hidden representation - self.conv = self._get_rpn_conv(cur_channels, out_channels) - cur_channels = out_channels - else: - self.conv = nn.Sequential() - for k, conv_dim in enumerate(conv_dims): - out_channels = cur_channels if conv_dim == -1 else conv_dim - if out_channels <= 0: - raise ValueError( - f"Conv output channels should be greater than 0. 
Got {out_channels}" - ) - conv = self._get_rpn_conv(cur_channels, out_channels) - self.conv.add_module(f"conv{k}", conv) - cur_channels = out_channels - # 1x1 conv for predicting objectness logits - self.objectness_logits = nn.Conv2d(cur_channels, num_anchors, kernel_size=1, stride=1) - # 1x1 conv for predicting box2box transform deltas - self.anchor_deltas = nn.Conv2d(cur_channels, num_anchors * box_dim, kernel_size=1, stride=1) - - # Keeping the order of weights initialization same for backwards compatiblility. - for layer in self.modules(): - if isinstance(layer, nn.Conv2d): - nn.init.normal_(layer.weight, std=0.01) - nn.init.constant_(layer.bias, 0) - - def _get_rpn_conv(self, in_channels, out_channels): - return Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - activation=nn.ReLU(), - ) - - @classmethod - def from_config(cls, cfg, input_shape): - # Standard RPN is shared across levels: - in_channels = [s.channels for s in input_shape] - assert len(set(in_channels)) == 1, "Each level must have the same channel!" - in_channels = in_channels[0] - - # RPNHead should take the same input as anchor generator - # NOTE: it assumes that creating an anchor generator does not have unwanted side effect. - anchor_generator = build_anchor_generator(cfg, input_shape) - num_anchors = anchor_generator.num_anchors - box_dim = anchor_generator.box_dim - assert ( - len(set(num_anchors)) == 1 - ), "Each level must have the same number of anchors per spatial position" - return { - "in_channels": in_channels, - "num_anchors": num_anchors[0], - "box_dim": box_dim, - "conv_dims": cfg.MODEL.RPN.CONV_DIMS, - } - - def forward(self, features: List[torch.Tensor]): - """ - Args: - features (list[Tensor]): list of feature maps - - Returns: - list[Tensor]: A list of L elements. - Element i is a tensor of shape (N, A, Hi, Wi) representing - the predicted objectness logits for all anchors. A is the number of cell anchors. - list[Tensor]: A list of L elements. Element i is a tensor of shape - (N, A*box_dim, Hi, Wi) representing the predicted "deltas" used to transform anchors - to proposals. - """ - pred_objectness_logits = [] - pred_anchor_deltas = [] - for x in features: - t = self.conv(x) - pred_objectness_logits.append(self.objectness_logits(t)) - pred_anchor_deltas.append(self.anchor_deltas(t)) - return pred_objectness_logits, pred_anchor_deltas - - -@PROPOSAL_GENERATOR_REGISTRY.register() -class RPN(nn.Module): - """ - Region Proposal Network, introduced by :paper:`Faster R-CNN`. - """ - - @configurable - def __init__( - self, - *, - in_features: List[str], - head: nn.Module, - anchor_generator: nn.Module, - anchor_matcher: Matcher, - box2box_transform: Box2BoxTransform, - batch_size_per_image: int, - positive_fraction: float, - pre_nms_topk: Tuple[float, float], - post_nms_topk: Tuple[float, float], - nms_thresh: float = 0.7, - min_box_size: float = 0.0, - anchor_boundary_thresh: float = -1.0, - loss_weight: Union[float, Dict[str, float]] = 1.0, - box_reg_loss_type: str = "smooth_l1", - smooth_l1_beta: float = 0.0, - ): - """ - NOTE: this interface is experimental. - - Args: - in_features (list[str]): list of names of input features to use - head (nn.Module): a module that predicts logits and regression deltas - for each level from a list of per-level features - anchor_generator (nn.Module): a module that creates anchors from a - list of features. Usually an instance of :class:`AnchorGenerator` - anchor_matcher (Matcher): label the anchors by matching them with ground truth. 
- box2box_transform (Box2BoxTransform): defines the transform from anchors boxes to - instance boxes - batch_size_per_image (int): number of anchors per image to sample for training - positive_fraction (float): fraction of foreground anchors to sample for training - pre_nms_topk (tuple[float]): (train, test) that represents the - number of top k proposals to select before NMS, in - training and testing. - post_nms_topk (tuple[float]): (train, test) that represents the - number of top k proposals to select after NMS, in - training and testing. - nms_thresh (float): NMS threshold used to de-duplicate the predicted proposals - min_box_size (float): remove proposal boxes with any side smaller than this threshold, - in the unit of input image pixels - anchor_boundary_thresh (float): legacy option - loss_weight (float|dict): weights to use for losses. Can be single float for weighting - all rpn losses together, or a dict of individual weightings. Valid dict keys are: - "loss_rpn_cls" - applied to classification loss - "loss_rpn_loc" - applied to box regression loss - box_reg_loss_type (str): Loss type to use. Supported losses: "smooth_l1", "giou". - smooth_l1_beta (float): beta parameter for the smooth L1 regression loss. Default to - use L1 loss. Only used when `box_reg_loss_type` is "smooth_l1" - """ - super().__init__() - self.in_features = in_features - self.rpn_head = head - self.anchor_generator = anchor_generator - self.anchor_matcher = anchor_matcher - self.box2box_transform = box2box_transform - self.batch_size_per_image = batch_size_per_image - self.positive_fraction = positive_fraction - # Map from self.training state to train/test settings - self.pre_nms_topk = {True: pre_nms_topk[0], False: pre_nms_topk[1]} - self.post_nms_topk = {True: post_nms_topk[0], False: post_nms_topk[1]} - self.nms_thresh = nms_thresh - self.min_box_size = float(min_box_size) - self.anchor_boundary_thresh = anchor_boundary_thresh - if isinstance(loss_weight, float): - loss_weight = {"loss_rpn_cls": loss_weight, "loss_rpn_loc": loss_weight} - self.loss_weight = loss_weight - self.box_reg_loss_type = box_reg_loss_type - self.smooth_l1_beta = smooth_l1_beta - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - in_features = cfg.MODEL.RPN.IN_FEATURES - ret = { - "in_features": in_features, - "min_box_size": cfg.MODEL.PROPOSAL_GENERATOR.MIN_SIZE, - "nms_thresh": cfg.MODEL.RPN.NMS_THRESH, - "batch_size_per_image": cfg.MODEL.RPN.BATCH_SIZE_PER_IMAGE, - "positive_fraction": cfg.MODEL.RPN.POSITIVE_FRACTION, - "loss_weight": { - "loss_rpn_cls": cfg.MODEL.RPN.LOSS_WEIGHT, - "loss_rpn_loc": cfg.MODEL.RPN.BBOX_REG_LOSS_WEIGHT * cfg.MODEL.RPN.LOSS_WEIGHT, - }, - "anchor_boundary_thresh": cfg.MODEL.RPN.BOUNDARY_THRESH, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.RPN.BBOX_REG_WEIGHTS), - "box_reg_loss_type": cfg.MODEL.RPN.BBOX_REG_LOSS_TYPE, - "smooth_l1_beta": cfg.MODEL.RPN.SMOOTH_L1_BETA, - } - - ret["pre_nms_topk"] = (cfg.MODEL.RPN.PRE_NMS_TOPK_TRAIN, cfg.MODEL.RPN.PRE_NMS_TOPK_TEST) - ret["post_nms_topk"] = (cfg.MODEL.RPN.POST_NMS_TOPK_TRAIN, cfg.MODEL.RPN.POST_NMS_TOPK_TEST) - - ret["anchor_generator"] = build_anchor_generator(cfg, [input_shape[f] for f in in_features]) - ret["anchor_matcher"] = Matcher( - cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS, allow_low_quality_matches=True - ) - ret["head"] = build_rpn_head(cfg, [input_shape[f] for f in in_features]) - return ret - - def _subsample_labels(self, label): - """ - Randomly sample a subset of positive and 
negative examples, and overwrite - the label vector to the ignore value (-1) for all elements that are not - included in the sample. - - Args: - labels (Tensor): a vector of -1, 0, 1. Will be modified in-place and returned. - """ - pos_idx, neg_idx = subsample_labels( - label, self.batch_size_per_image, self.positive_fraction, 0 - ) - # Fill with the ignore label (-1), then set positive and negative labels - label.fill_(-1) - label.scatter_(0, pos_idx, 1) - label.scatter_(0, neg_idx, 0) - return label - - @torch.jit.unused - @torch.no_grad() - def label_and_sample_anchors( - self, anchors: List[Boxes], gt_instances: List[Instances] - ) -> Tuple[List[torch.Tensor], List[torch.Tensor]]: - """ - Args: - anchors (list[Boxes]): anchors for each feature map. - gt_instances: the ground-truth instances for each image. - - Returns: - list[Tensor]: - List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across all feature maps R = sum(Hi * Wi * A). - Label values are in {-1, 0, 1}, with meanings: -1 = ignore; 0 = negative - class; 1 = positive class. - list[Tensor]: - i-th element is a Rx4 tensor. The values are the matched gt boxes for each - anchor. Values are undefined for those anchors not labeled as 1. - """ - anchors = Boxes.cat(anchors) - - gt_boxes = [x.gt_boxes for x in gt_instances] - image_sizes = [x.image_size for x in gt_instances] - del gt_instances - - gt_labels = [] - matched_gt_boxes = [] - for image_size_i, gt_boxes_i in zip(image_sizes, gt_boxes): - """ - image_size_i: (h, w) for the i-th image - gt_boxes_i: ground-truth boxes for i-th image - """ - - match_quality_matrix = retry_if_cuda_oom(pairwise_iou)(gt_boxes_i, anchors) - matched_idxs, gt_labels_i = retry_if_cuda_oom(self.anchor_matcher)(match_quality_matrix) - # Matching is memory-expensive and may result in CPU tensors. But the result is small - gt_labels_i = gt_labels_i.to(device=gt_boxes_i.device) - del match_quality_matrix - - if self.anchor_boundary_thresh >= 0: - # Discard anchors that go out of the boundaries of the image - # NOTE: This is legacy functionality that is turned off by default in Detectron2 - anchors_inside_image = anchors.inside_box(image_size_i, self.anchor_boundary_thresh) - gt_labels_i[~anchors_inside_image] = -1 - - # A vector of labels (-1, 0, 1) for each anchor - gt_labels_i = self._subsample_labels(gt_labels_i) - - if len(gt_boxes_i) == 0: - # These values won't be used anyway since the anchor is labeled as background - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - else: - # TODO wasted indexing computation for ignored boxes - matched_gt_boxes_i = gt_boxes_i[matched_idxs].tensor - - gt_labels.append(gt_labels_i) # N,AHW - matched_gt_boxes.append(matched_gt_boxes_i) - return gt_labels, matched_gt_boxes - - @torch.jit.unused - def losses( - self, - anchors: List[Boxes], - pred_objectness_logits: List[torch.Tensor], - gt_labels: List[torch.Tensor], - pred_anchor_deltas: List[torch.Tensor], - gt_boxes: List[torch.Tensor], - ) -> Dict[str, torch.Tensor]: - """ - Return the losses from a set of RPN predictions and their associated ground-truth. - - Args: - anchors (list[Boxes or RotatedBoxes]): anchors for each feature map, each - has shape (Hi*Wi*A, B), where B is box dimension (4 or 5). - pred_objectness_logits (list[Tensor]): A list of L elements. - Element i is a tensor of shape (N, Hi*Wi*A) representing - the predicted objectness logits for all anchors. - gt_labels (list[Tensor]): Output of :meth:`label_and_sample_anchors`. 
- pred_anchor_deltas (list[Tensor]): A list of L elements. Element i is a tensor of shape - (N, Hi*Wi*A, 4 or 5) representing the predicted "deltas" used to transform anchors - to proposals. - gt_boxes (list[Tensor]): Output of :meth:`label_and_sample_anchors`. - - Returns: - dict[loss name -> loss value]: A dict mapping from loss name to loss value. - Loss names are: `loss_rpn_cls` for objectness classification and - `loss_rpn_loc` for proposal localization. - """ - num_images = len(gt_labels) - gt_labels = torch.stack(gt_labels) # (N, sum(Hi*Wi*Ai)) - - # Log the number of positive/negative anchors per-image that's used in training - pos_mask = gt_labels == 1 - num_pos_anchors = pos_mask.sum().item() - num_neg_anchors = (gt_labels == 0).sum().item() - storage = get_event_storage() - storage.put_scalar("rpn/num_pos_anchors", num_pos_anchors / num_images) - storage.put_scalar("rpn/num_neg_anchors", num_neg_anchors / num_images) - - localization_loss = _dense_box_regression_loss( - anchors, - self.box2box_transform, - pred_anchor_deltas, - gt_boxes, - pos_mask, - box_reg_loss_type=self.box_reg_loss_type, - smooth_l1_beta=self.smooth_l1_beta, - ) - - valid_mask = gt_labels >= 0 - objectness_loss = F.binary_cross_entropy_with_logits( - cat(pred_objectness_logits, dim=1)[valid_mask], - gt_labels[valid_mask].to(torch.float32), - reduction="sum", - ) - normalizer = self.batch_size_per_image * num_images - losses = { - "loss_rpn_cls": objectness_loss / normalizer, - # The original Faster R-CNN paper uses a slightly different normalizer - # for loc loss. But it doesn't matter in practice - "loss_rpn_loc": localization_loss / normalizer, - } - losses = {k: v * self.loss_weight.get(k, 1.0) for k, v in losses.items()} - return losses - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - gt_instances: Optional[List[Instances]] = None, - ): - """ - Args: - images (ImageList): input images of length `N` - features (dict[str, Tensor]): input data as a mapping from feature - map name to tensor. Axis 0 represents the number of images `N` in - the input data; axes 1-3 are channels, height, and width, which may - vary between feature maps (e.g., if a feature pyramid is used). - gt_instances (list[Instances], optional): a length `N` list of `Instances`s. - Each `Instances` stores ground-truth instances for the corresponding image. - - Returns: - proposals: list[Instances]: contains fields "proposal_boxes", "objectness_logits" - loss: dict[Tensor] or None - """ - features = [features[f] for f in self.in_features] - anchors = self.anchor_generator(features) - - pred_objectness_logits, pred_anchor_deltas = self.rpn_head(features) - # Transpose the Hi*Wi*A dimension to the middle: - pred_objectness_logits = [ - # (N, A, Hi, Wi) -> (N, Hi, Wi, A) -> (N, Hi*Wi*A) - score.permute(0, 2, 3, 1).flatten(1) - for score in pred_objectness_logits - ] - pred_anchor_deltas = [ - # (N, A*B, Hi, Wi) -> (N, A, B, Hi, Wi) -> (N, Hi, Wi, A, B) -> (N, Hi*Wi*A, B) - x.view(x.shape[0], -1, self.anchor_generator.box_dim, x.shape[-2], x.shape[-1]) - .permute(0, 3, 4, 1, 2) - .flatten(1, -2) - for x in pred_anchor_deltas - ] - - if self.training: - assert gt_instances is not None, "RPN requires gt_instances in training!" 
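
The permute/flatten chain above maps the head outputs of shape (N, A*B, Hi, Wi) to (N, Hi*Wi*A, B). A quick shape check on dummy tensors (hypothetical sizes N=2, A=3, B=4, Hi=Wi=5):

```python
import torch

N, A, B, H, W = 2, 3, 4, 5, 5
x = torch.randn(N, A * B, H, W)      # raw anchor-delta head output
y = (
    x.view(N, A, B, H, W)            # (N, A, B, Hi, Wi)
    .permute(0, 3, 4, 1, 2)          # (N, Hi, Wi, A, B)
    .flatten(1, -2)                  # (N, Hi*Wi*A, B)
)
assert y.shape == (N, H * W * A, B)  # (2, 75, 4)
```
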
- gt_labels, gt_boxes = self.label_and_sample_anchors(anchors, gt_instances) - losses = self.losses( - anchors, pred_objectness_logits, gt_labels, pred_anchor_deltas, gt_boxes - ) - else: - losses = {} - proposals = self.predict_proposals( - anchors, pred_objectness_logits, pred_anchor_deltas, images.image_sizes - ) - return proposals, losses - - def predict_proposals( - self, - anchors: List[Boxes], - pred_objectness_logits: List[torch.Tensor], - pred_anchor_deltas: List[torch.Tensor], - image_sizes: List[Tuple[int, int]], - ): - """ - Decode all the predicted box regression deltas to proposals. Find the top proposals - by applying NMS and removing boxes that are too small. - - Returns: - proposals (list[Instances]): list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i, sorted by their - objectness score in descending order. - """ - # The proposals are treated as fixed for joint training with roi heads. - # This approach ignores the derivative w.r.t. the proposal boxes’ coordinates that - # are also network responses. - with torch.no_grad(): - pred_proposals = self._decode_proposals(anchors, pred_anchor_deltas) - return find_top_rpn_proposals( - pred_proposals, - pred_objectness_logits, - image_sizes, - self.nms_thresh, - self.pre_nms_topk[self.training], - self.post_nms_topk[self.training], - self.min_box_size, - self.training, - ) - - def _decode_proposals(self, anchors: List[Boxes], pred_anchor_deltas: List[torch.Tensor]): - """ - Transform anchors into proposals by applying the predicted anchor deltas. - - Returns: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape - (N, Hi*Wi*A, B) - """ - N = pred_anchor_deltas[0].shape[0] - proposals = [] - # For each feature map - for anchors_i, pred_anchor_deltas_i in zip(anchors, pred_anchor_deltas): - B = anchors_i.tensor.size(1) - pred_anchor_deltas_i = pred_anchor_deltas_i.reshape(-1, B) - # Expand anchors to shape (N*Hi*Wi*A, B) - anchors_i = anchors_i.tensor.unsqueeze(0).expand(N, -1, -1).reshape(-1, B) - proposals_i = self.box2box_transform.apply_deltas(pred_anchor_deltas_i, anchors_i) - # Append feature map proposals with shape (N, Hi*Wi*A, B) - proposals.append(proposals_i.view(N, -1, B)) - return proposals diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py deleted file mode 100755 index d51b92b7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py +++ /dev/null @@ -1,203 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import itertools -import logging -from typing import Dict, List -import torch - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, batched_nms_rotated, cat -from detectron2.structures import Instances, RotatedBoxes, pairwise_iou_rotated -from detectron2.utils.memory import retry_if_cuda_oom - -from ..box_regression import Box2BoxTransformRotated -from .build import PROPOSAL_GENERATOR_REGISTRY -from .proposal_utils import _is_tracing -from .rpn import RPN - -logger = logging.getLogger(__name__) - - -def find_top_rrpn_proposals( - proposals, - pred_objectness_logits, - image_sizes, - nms_thresh, - pre_nms_topk, - post_nms_topk, - min_box_size, - training, -): - """ - For each feature map, select the `pre_nms_topk` highest scoring proposals, - apply NMS, clip proposals, and remove small boxes. Return the `post_nms_topk` - highest scoring proposals among all the feature maps if `training` is True, - otherwise, returns the highest `post_nms_topk` scoring proposals for each - feature map. - - Args: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A, 5). - All proposal predictions on the feature maps. - pred_objectness_logits (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A). - image_sizes (list[tuple]): sizes (h, w) for each image - nms_thresh (float): IoU threshold to use for NMS - pre_nms_topk (int): number of top k scoring proposals to keep before applying NMS. - When RRPN is run on multiple feature maps (as in FPN) this number is per - feature map. - post_nms_topk (int): number of top k scoring proposals to keep after applying NMS. - When RRPN is run on multiple feature maps (as in FPN) this number is total, - over all feature maps. - min_box_size(float): minimum proposal box side length in pixels (absolute units wrt - input images). - training (bool): True if proposals are to be used in training, otherwise False. - This arg exists only to support a legacy bug; look for the "NB: Legacy bug ..." - comment. - - Returns: - proposals (list[Instances]): list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i. - """ - num_images = len(image_sizes) - device = proposals[0].device - - # 1. Select top-k anchor for every level and every image - topk_scores = [] # #lvl Tensor, each of shape N x topk - topk_proposals = [] - level_ids = [] # #lvl Tensor, each of shape (topk,) - batch_idx = torch.arange(num_images, device=device) - for level_id, proposals_i, logits_i in zip( - itertools.count(), proposals, pred_objectness_logits - ): - Hi_Wi_A = logits_i.shape[1] - if isinstance(Hi_Wi_A, torch.Tensor): # it's a tensor in tracing - num_proposals_i = torch.clamp(Hi_Wi_A, max=pre_nms_topk) - else: - num_proposals_i = min(Hi_Wi_A, pre_nms_topk) - - topk_scores_i, topk_idx = logits_i.topk(num_proposals_i, dim=1) - - # each is N x topk - topk_proposals_i = proposals_i[batch_idx[:, None], topk_idx] # N x topk x 5 - - topk_proposals.append(topk_proposals_i) - topk_scores.append(topk_scores_i) - level_ids.append(torch.full((num_proposals_i,), level_id, dtype=torch.int64, device=device)) - - # 2. Concat all levels together - topk_scores = cat(topk_scores, dim=1) - topk_proposals = cat(topk_proposals, dim=1) - level_ids = cat(level_ids, dim=0) - - # 3. For each image, run a per-level NMS, and choose topk results. 
- results = [] - for n, image_size in enumerate(image_sizes): - boxes = RotatedBoxes(topk_proposals[n]) - scores_per_img = topk_scores[n] - valid_mask = torch.isfinite(boxes.tensor).all(dim=1) & torch.isfinite(scores_per_img) - if not valid_mask.all(): - boxes = boxes[valid_mask] - scores_per_img = scores_per_img[valid_mask] - boxes.clip(image_size) - - # filter empty boxes - keep = boxes.nonempty(threshold=min_box_size) - lvl = level_ids - if _is_tracing() or keep.sum().item() != len(boxes): - boxes, scores_per_img, lvl = (boxes[keep], scores_per_img[keep], level_ids[keep]) - - keep = batched_nms_rotated(boxes.tensor, scores_per_img, lvl, nms_thresh) - # In Detectron1, there was different behavior during training vs. testing. - # (https://github.com/facebookresearch/Detectron/issues/459) - # During training, topk is over the proposals from *all* images in the training batch. - # During testing, it is over the proposals for each image separately. - # As a result, the training behavior becomes batch-dependent, - # and the configuration "POST_NMS_TOPK_TRAIN" end up relying on the batch size. - # This bug is addressed in Detectron2 to make the behavior independent of batch size. - keep = keep[:post_nms_topk] - - res = Instances(image_size) - res.proposal_boxes = boxes[keep] - res.objectness_logits = scores_per_img[keep] - results.append(res) - return results - - -@PROPOSAL_GENERATOR_REGISTRY.register() -class RRPN(RPN): - """ - Rotated Region Proposal Network described in :paper:`RRPN`. - """ - - @configurable - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - if self.anchor_boundary_thresh >= 0: - raise NotImplementedError( - "anchor_boundary_thresh is a legacy option not implemented for RRPN." - ) - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - ret = super().from_config(cfg, input_shape) - ret["box2box_transform"] = Box2BoxTransformRotated(weights=cfg.MODEL.RPN.BBOX_REG_WEIGHTS) - return ret - - @torch.no_grad() - def label_and_sample_anchors(self, anchors: List[RotatedBoxes], gt_instances: List[Instances]): - """ - Args: - anchors (list[RotatedBoxes]): anchors for each feature map. - gt_instances: the ground-truth instances for each image. - - Returns: - list[Tensor]: - List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across feature maps. Label values are in {-1, 0, 1}, - with meanings: -1 = ignore; 0 = negative class; 1 = positive class. - list[Tensor]: - i-th element is a Nx5 tensor, where N is the total number of anchors across - feature maps. The values are the matched gt boxes for each anchor. - Values are undefined for those anchors not labeled as 1. - """ - anchors = RotatedBoxes.cat(anchors) - - gt_boxes = [x.gt_boxes for x in gt_instances] - del gt_instances - - gt_labels = [] - matched_gt_boxes = [] - for gt_boxes_i in gt_boxes: - """ - gt_boxes_i: ground-truth boxes for i-th image - """ - match_quality_matrix = retry_if_cuda_oom(pairwise_iou_rotated)(gt_boxes_i, anchors) - matched_idxs, gt_labels_i = retry_if_cuda_oom(self.anchor_matcher)(match_quality_matrix) - # Matching is memory-expensive and may result in CPU tensors. 
But the result is small - gt_labels_i = gt_labels_i.to(device=gt_boxes_i.device) - - # A vector of labels (-1, 0, 1) for each anchor - gt_labels_i = self._subsample_labels(gt_labels_i) - - if len(gt_boxes_i) == 0: - # These values won't be used anyway since the anchor is labeled as background - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - else: - # TODO wasted indexing computation for ignored boxes - matched_gt_boxes_i = gt_boxes_i[matched_idxs].tensor - - gt_labels.append(gt_labels_i) # N,AHW - matched_gt_boxes.append(matched_gt_boxes_i) - return gt_labels, matched_gt_boxes - - @torch.no_grad() - def predict_proposals(self, anchors, pred_objectness_logits, pred_anchor_deltas, image_sizes): - pred_proposals = self._decode_proposals(anchors, pred_anchor_deltas) - return find_top_rrpn_proposals( - pred_proposals, - pred_objectness_logits, - image_sizes, - self.nms_thresh, - self.pre_nms_topk[self.training], - self.post_nms_topk[self.training], - self.min_box_size, - self.training, - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py deleted file mode 100755 index d13e9c57..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py +++ /dev/null @@ -1,29 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .box_head import ROI_BOX_HEAD_REGISTRY, build_box_head, FastRCNNConvFCHead -from .keypoint_head import ( - ROI_KEYPOINT_HEAD_REGISTRY, - build_keypoint_head, - BaseKeypointRCNNHead, - KRCNNConvDeconvUpsampleHead, -) -from .mask_head import ( - ROI_MASK_HEAD_REGISTRY, - build_mask_head, - BaseMaskRCNNHead, - MaskRCNNConvUpsampleHead, -) -from .roi_heads import ( - ROI_HEADS_REGISTRY, - ROIHeads, - Res5ROIHeads, - StandardROIHeads, - build_roi_heads, - select_foreground_proposals, -) -from .cascade_rcnn import CascadeROIHeads -from .rotated_fast_rcnn import RROIHeads -from .fast_rcnn import FastRCNNOutputLayers - -from . import cascade_rcnn # isort:skip - -__all__ = list(globals().keys()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py deleted file mode 100755 index 5d0370b0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py +++ /dev/null @@ -1,118 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import List -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ShapeSpec, get_norm -from detectron2.utils.registry import Registry - -__all__ = ["FastRCNNConvFCHead", "build_box_head", "ROI_BOX_HEAD_REGISTRY"] - -ROI_BOX_HEAD_REGISTRY = Registry("ROI_BOX_HEAD") -ROI_BOX_HEAD_REGISTRY.__doc__ = """ -Registry for box heads, which make box predictions from per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). 
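
Shape-wise, the head defined below is simply a conv stack followed by an fc stack over pooled ROI features. A stripped-down sketch of the same idea, assuming a hypothetical 256-channel 7x7 pooled input (not the registry-built head itself):

```python
import torch
from torch import nn

# Pooled ROI features: (num_boxes, channels, 7, 7)
x = torch.randn(10, 256, 7, 7)

head = nn.Sequential(
    nn.Conv2d(256, 256, kernel_size=3, padding=1),  # "conv1" analogue
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(256 * 7 * 7, 1024),                   # "fc1" analogue
    nn.ReLU(),
)
print(head(x).shape)  # torch.Size([10, 1024])
```
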
-@ROI_BOX_HEAD_REGISTRY.register() -class FastRCNNConvFCHead(nn.Sequential): - """ - A head with several 3x3 conv layers (each followed by norm & relu) and then - several fc layers (each followed by relu). - """ - - @configurable - def __init__( - self, input_shape: ShapeSpec, *, conv_dims: List[int], fc_dims: List[int], conv_norm="" - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature. - conv_dims (list[int]): the output dimensions of the conv layers - fc_dims (list[int]): the output dimensions of the fc layers - conv_norm (str or callable): normalization for the conv layers. - See :func:`detectron2.layers.get_norm` for supported types. - """ - super().__init__() - assert len(conv_dims) + len(fc_dims) > 0 - - self._output_size = (input_shape.channels, input_shape.height, input_shape.width) - - self.conv_norm_relus = [] - for k, conv_dim in enumerate(conv_dims): - conv = Conv2d( - self._output_size[0], - conv_dim, - kernel_size=3, - padding=1, - bias=not conv_norm, - norm=get_norm(conv_norm, conv_dim), - activation=nn.ReLU(), - ) - self.add_module("conv{}".format(k + 1), conv) - self.conv_norm_relus.append(conv) - self._output_size = (conv_dim, self._output_size[1], self._output_size[2]) - - self.fcs = [] - for k, fc_dim in enumerate(fc_dims): - if k == 0: - self.add_module("flatten", nn.Flatten()) - fc = nn.Linear(int(np.prod(self._output_size)), fc_dim) - self.add_module("fc{}".format(k + 1), fc) - self.add_module("fc_relu{}".format(k + 1), nn.ReLU()) - self.fcs.append(fc) - self._output_size = fc_dim - - for layer in self.conv_norm_relus: - weight_init.c2_msra_fill(layer) - for layer in self.fcs: - weight_init.c2_xavier_fill(layer) - - @classmethod - def from_config(cls, cfg, input_shape): - num_conv = cfg.MODEL.ROI_BOX_HEAD.NUM_CONV - conv_dim = cfg.MODEL.ROI_BOX_HEAD.CONV_DIM - num_fc = cfg.MODEL.ROI_BOX_HEAD.NUM_FC - fc_dim = cfg.MODEL.ROI_BOX_HEAD.FC_DIM - return { - "input_shape": input_shape, - "conv_dims": [conv_dim] * num_conv, - "fc_dims": [fc_dim] * num_fc, - "conv_norm": cfg.MODEL.ROI_BOX_HEAD.NORM, - } - - def forward(self, x): - for layer in self: - x = layer(x) - return x - - @property - @torch.jit.unused - def output_shape(self): - """ - Returns: - ShapeSpec: the output feature shape - """ - o = self._output_size - if isinstance(o, int): - return ShapeSpec(channels=o) - else: - return ShapeSpec(channels=o[0], height=o[1], width=o[2]) - - -def build_box_head(cfg, input_shape): - """ - Build a box head defined by `cfg.MODEL.ROI_BOX_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_BOX_HEAD.NAME - return ROI_BOX_HEAD_REGISTRY.get(name)(cfg, input_shape) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py deleted file mode 100755 index a0ca70fe..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py +++ /dev/null @@ -1,299 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from typing import List -import torch -from torch import nn -from torch.autograd.function import Function - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from ..box_regression import Box2BoxTransform -from ..matcher import Matcher -from ..poolers import ROIPooler -from .box_head import build_box_head -from .fast_rcnn import FastRCNNOutputLayers, fast_rcnn_inference -from .roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads - - -class _ScaleGradient(Function): - @staticmethod - def forward(ctx, input, scale): - ctx.scale = scale - return input - - @staticmethod - def backward(ctx, grad_output): - return grad_output * ctx.scale, None - - -@ROI_HEADS_REGISTRY.register() -class CascadeROIHeads(StandardROIHeads): - """ - The ROI heads that implement :paper:`Cascade R-CNN`. - """ - - @configurable - def __init__( - self, - *, - box_in_features: List[str], - box_pooler: ROIPooler, - box_heads: List[nn.Module], - box_predictors: List[nn.Module], - proposal_matchers: List[Matcher], - **kwargs, - ): - """ - NOTE: this interface is experimental. - - Args: - box_pooler (ROIPooler): pooler that extracts region features from given boxes - box_heads (list[nn.Module]): box head for each cascade stage - box_predictors (list[nn.Module]): box predictor for each cascade stage - proposal_matchers (list[Matcher]): matcher with different IoU thresholds to - match boxes with ground truth for each stage. The first matcher matches - RPN proposals with ground truth, the other matchers use boxes predicted - by the previous stage as proposals and match them with ground truth. - """ - assert "proposal_matcher" not in kwargs, ( - "CascadeROIHeads takes 'proposal_matchers=' for each stage instead " - "of one 'proposal_matcher='." - ) - # The first matcher matches RPN proposals with ground truth, done in the base class - kwargs["proposal_matcher"] = proposal_matchers[0] - num_stages = self.num_cascade_stages = len(box_heads) - box_heads = nn.ModuleList(box_heads) - box_predictors = nn.ModuleList(box_predictors) - assert len(box_predictors) == num_stages, f"{len(box_predictors)} != {num_stages}!" - assert len(proposal_matchers) == num_stages, f"{len(proposal_matchers)} != {num_stages}!" - super().__init__( - box_in_features=box_in_features, - box_pooler=box_pooler, - box_head=box_heads, - box_predictor=box_predictors, - **kwargs, - ) - self.proposal_matchers = proposal_matchers - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - ret.pop("proposal_matcher") - return ret - - @classmethod - def _init_box_head(cls, cfg, input_shape): - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS - cascade_ious = cfg.MODEL.ROI_BOX_CASCADE_HEAD.IOUS - assert len(cascade_bbox_reg_weights) == len(cascade_ious) - assert cfg.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG, \ - "CascadeROIHeads only support class-agnostic regression now!" 
- assert cascade_ious[0] == cfg.MODEL.ROI_HEADS.IOU_THRESHOLDS[0] - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features] - # Check all channel counts are equal - assert len(set(in_channels)) == 1, in_channels - in_channels = in_channels[0] - - box_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - pooled_shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - - box_heads, box_predictors, proposal_matchers = [], [], [] - for match_iou, bbox_reg_weights in zip(cascade_ious, cascade_bbox_reg_weights): - box_head = build_box_head(cfg, pooled_shape) - box_heads.append(box_head) - box_predictors.append( - FastRCNNOutputLayers( - cfg, - box_head.output_shape, - box2box_transform=Box2BoxTransform(weights=bbox_reg_weights), - ) - ) - proposal_matchers.append(Matcher([match_iou], [0, 1], allow_low_quality_matches=False)) - return { - "box_in_features": in_features, - "box_pooler": box_pooler, - "box_heads": box_heads, - "box_predictors": box_predictors, - "proposal_matchers": proposal_matchers, - } - - def forward(self, images, features, proposals, targets=None): - del images - if self.training: - proposals = self.label_and_sample_proposals(proposals, targets) - - if self.training: - # Need targets to box head - losses = self._forward_box(features, proposals, targets) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - return pred_instances, {} - - def _forward_box(self, features, proposals, targets=None): - """ - Args: - features, targets: the same as in - Same as in :meth:`ROIHeads.forward`. - proposals (list[Instances]): the per-image object proposals with - their matching ground truth. - Each has fields "proposal_boxes", and "objectness_logits", - "gt_classes", "gt_boxes". - """ - features = [features[f] for f in self.box_in_features] - head_outputs = [] # (predictor, predictions, proposals) - prev_pred_boxes = None - image_sizes = [x.image_size for x in proposals] - for k in range(self.num_cascade_stages): - if k > 0: - # The output boxes of the previous stage are used to create the input - # proposals of the next stage. - proposals = self._create_proposals_from_boxes(prev_pred_boxes, image_sizes) - if self.training: - proposals = self._match_and_label_boxes(proposals, k, targets) - predictions = self._run_stage(features, proposals, k) - prev_pred_boxes = self.box_predictor[k].predict_boxes(predictions, proposals) - head_outputs.append((self.box_predictor[k], predictions, proposals)) - - if self.training: - losses = {} - storage = get_event_storage() - for stage, (predictor, predictions, proposals) in enumerate(head_outputs): - with storage.name_scope("stage{}".format(stage)): - stage_losses = predictor.losses(predictions, proposals) - losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()}) - return losses - else: - # Each is a list[Tensor] of length #image. 
Each tensor is Ri x (K+1) - scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs] - - # Average the scores across heads - scores = [ - sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages) - for scores_per_image in zip(*scores_per_stage) - ] - # Use the boxes of the last head - predictor, predictions, proposals = head_outputs[-1] - boxes = predictor.predict_boxes(predictions, proposals) - pred_instances, _ = fast_rcnn_inference( - boxes, - scores, - image_sizes, - predictor.test_score_thresh, - predictor.test_nms_thresh, - predictor.test_topk_per_image, - ) - return pred_instances - - @torch.no_grad() - def _match_and_label_boxes(self, proposals, stage, targets): - """ - Match proposals with groundtruth using the matcher at the given stage. - Label the proposals as foreground or background based on the match. - - Args: - proposals (list[Instances]): One Instances for each image, with - the field "proposal_boxes". - stage (int): the current stage - targets (list[Instances]): the ground truth instances - - Returns: - list[Instances]: the same proposals, but with fields "gt_classes" and "gt_boxes" - """ - num_fg_samples, num_bg_samples = [], [] - for proposals_per_image, targets_per_image in zip(proposals, targets): - match_quality_matrix = pairwise_iou( - targets_per_image.gt_boxes, proposals_per_image.proposal_boxes - ) - # proposal_labels are 0 or 1 - matched_idxs, proposal_labels = self.proposal_matchers[stage](match_quality_matrix) - if len(targets_per_image) > 0: - gt_classes = targets_per_image.gt_classes[matched_idxs] - # Label unmatched proposals (0 label from matcher) as background (label=num_classes) - gt_classes[proposal_labels == 0] = self.num_classes - gt_boxes = targets_per_image.gt_boxes[matched_idxs] - else: - gt_classes = torch.zeros_like(matched_idxs) + self.num_classes - gt_boxes = Boxes( - targets_per_image.gt_boxes.tensor.new_zeros((len(proposals_per_image), 4)) - ) - proposals_per_image.gt_classes = gt_classes - proposals_per_image.gt_boxes = gt_boxes - - num_fg_samples.append((proposal_labels == 1).sum().item()) - num_bg_samples.append(proposal_labels.numel() - num_fg_samples[-1]) - - # Log the number of fg/bg samples in each stage - storage = get_event_storage() - storage.put_scalar( - "stage{}/roi_head/num_fg_samples".format(stage), - sum(num_fg_samples) / len(num_fg_samples), - ) - storage.put_scalar( - "stage{}/roi_head/num_bg_samples".format(stage), - sum(num_bg_samples) / len(num_bg_samples), - ) - return proposals - - def _run_stage(self, features, proposals, stage): - """ - Args: - features (list[Tensor]): #lvl input features to ROIHeads - proposals (list[Instances]): #image Instances, with the field "proposal_boxes" - stage (int): the current stage - - Returns: - Same output as `FastRCNNOutputLayers.forward()`. - """ - box_features = self.box_pooler(features, [x.proposal_boxes for x in proposals]) - # The original implementation averages the losses among heads, - # but scale up the parameter gradients of the heads. - # This is equivalent to adding the losses among heads, - # but scale down the gradients on features. 
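
The equivalence noted above rests on `_ScaleGradient`: the forward pass is the identity, but the backward pass multiplies incoming gradients by a constant factor. A minimal standalone demonstration of the same pattern:

```python
import torch
from torch.autograd.function import Function

class ScaleGradient(Function):
    @staticmethod
    def forward(ctx, x, scale):
        ctx.scale = scale
        return x                      # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output * ctx.scale, None  # scaled gradient, none for `scale`

x = torch.ones(3, requires_grad=True)
y = ScaleGradient.apply(x, 1.0 / 3)   # e.g. 3 cascade stages
y.sum().backward()
print(x.grad)                         # tensor([0.3333, 0.3333, 0.3333])
```
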
- if self.training: - box_features = _ScaleGradient.apply(box_features, 1.0 / self.num_cascade_stages) - box_features = self.box_head[stage](box_features) - return self.box_predictor[stage](box_features) - - def _create_proposals_from_boxes(self, boxes, image_sizes): - """ - Args: - boxes (list[Tensor]): per-image predicted boxes, each of shape Ri x 4 - image_sizes (list[tuple]): list of image shapes in (h, w) - - Returns: - list[Instances]: per-image proposals with the given boxes. - """ - # Just like RPN, the proposals should not have gradients - boxes = [Boxes(b.detach()) for b in boxes] - proposals = [] - for boxes_per_image, image_size in zip(boxes, image_sizes): - boxes_per_image.clip(image_size) - if self.training: - # do not filter empty boxes at inference time, - # because the scores from each stage need to be aligned and added later - boxes_per_image = boxes_per_image[boxes_per_image.nonempty()] - prop = Instances(image_size) - prop.proposal_boxes = boxes_per_image - proposals.append(prop) - return proposals diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py deleted file mode 100755 index 42eba210..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py +++ /dev/null @@ -1,462 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -from typing import Dict, List, Tuple, Union -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, batched_nms, cat, cross_entropy, nonzero_tuple -from detectron2.modeling.box_regression import Box2BoxTransform, _dense_box_regression_loss -from detectron2.structures import Boxes, Instances -from detectron2.utils.events import get_event_storage - -__all__ = ["fast_rcnn_inference", "FastRCNNOutputLayers"] - - -logger = logging.getLogger(__name__) - -""" -Shape shorthand in this module: - - N: number of images in the minibatch - R: number of ROIs, combined over all images, in the minibatch - Ri: number of ROIs in image i - K: number of foreground classes. E.g.,there are 80 foreground classes in COCO. - -Naming convention: - - deltas: refers to the 4-d (dx, dy, dw, dh) deltas that parameterize the box2box - transform (see :class:`box_regression.Box2BoxTransform`). - - pred_class_logits: predicted class scores in [-inf, +inf]; use - softmax(pred_class_logits) to estimate P(class). - - gt_classes: ground-truth classification labels in [0, K], where [0, K) represent - foreground object classes and K represents the background class. - - pred_proposal_deltas: predicted box2box transform deltas for transforming proposals - to detection box predictions. - - gt_proposal_deltas: ground-truth box2box transform deltas -""" - - -def fast_rcnn_inference( - boxes: List[torch.Tensor], - scores: List[torch.Tensor], - image_shapes: List[Tuple[int, int]], - score_thresh: float, - nms_thresh: float, - topk_per_image: int, -): - """ - Call `fast_rcnn_inference_single_image` for all images. - - Args: - boxes (list[Tensor]): A list of Tensors of predicted class-specific or class-agnostic - boxes for each image. Element i has shape (Ri, K * 4) if doing - class-specific regression, or (Ri, 4) if doing class-agnostic - regression, where Ri is the number of predicted objects for image i. 
- This is compatible with the output of :meth:`FastRCNNOutputLayers.predict_boxes`. - scores (list[Tensor]): A list of Tensors of predicted class scores for each image. - Element i has shape (Ri, K + 1), where Ri is the number of predicted objects - for image i. Compatible with the output of :meth:`FastRCNNOutputLayers.predict_probs`. - image_shapes (list[tuple]): A list of (width, height) tuples for each image in the batch. - score_thresh (float): Only return detections with a confidence score exceeding this - threshold. - nms_thresh (float): The threshold to use for box non-maximum suppression. Value in [0, 1]. - topk_per_image (int): The number of top scoring detections to return. Set < 0 to return - all detections. - - Returns: - instances: (list[Instances]): A list of N instances, one for each image in the batch, - that stores the topk most confidence detections. - kept_indices: (list[Tensor]): A list of 1D tensor of length of N, each element indicates - the corresponding boxes/scores index in [0, Ri) from the input, for image i. - """ - result_per_image = [ - fast_rcnn_inference_single_image( - boxes_per_image, scores_per_image, image_shape, score_thresh, nms_thresh, topk_per_image - ) - for scores_per_image, boxes_per_image, image_shape in zip(scores, boxes, image_shapes) - ] - return [x[0] for x in result_per_image], [x[1] for x in result_per_image] - - -def _log_classification_stats(pred_logits, gt_classes, prefix="fast_rcnn"): - """ - Log the classification metrics to EventStorage. - - Args: - pred_logits: Rx(K+1) logits. The last column is for background class. - gt_classes: R labels - """ - num_instances = gt_classes.numel() - if num_instances == 0: - return - pred_classes = pred_logits.argmax(dim=1) - bg_class_ind = pred_logits.shape[1] - 1 - - fg_inds = (gt_classes >= 0) & (gt_classes < bg_class_ind) - num_fg = fg_inds.nonzero().numel() - fg_gt_classes = gt_classes[fg_inds] - fg_pred_classes = pred_classes[fg_inds] - - num_false_negative = (fg_pred_classes == bg_class_ind).nonzero().numel() - num_accurate = (pred_classes == gt_classes).nonzero().numel() - fg_num_accurate = (fg_pred_classes == fg_gt_classes).nonzero().numel() - - storage = get_event_storage() - storage.put_scalar(f"{prefix}/cls_accuracy", num_accurate / num_instances) - if num_fg > 0: - storage.put_scalar(f"{prefix}/fg_cls_accuracy", fg_num_accurate / num_fg) - storage.put_scalar(f"{prefix}/false_negative", num_false_negative / num_fg) - - -def fast_rcnn_inference_single_image( - boxes, - scores, - image_shape: Tuple[int, int], - score_thresh: float, - nms_thresh: float, - topk_per_image: int, -): - """ - Single-image inference. Return bounding-box detection results by thresholding - on scores and applying non-maximum suppression (NMS). - - Args: - Same as `fast_rcnn_inference`, but with boxes, scores, and image shapes - per image. - - Returns: - Same as `fast_rcnn_inference`, but for only one image. - """ - valid_mask = torch.isfinite(boxes).all(dim=1) & torch.isfinite(scores).all(dim=1) - if not valid_mask.all(): - boxes = boxes[valid_mask] - scores = scores[valid_mask] - - scores = scores[:, :-1] - num_bbox_reg_classes = boxes.shape[1] // 4 - # Convert to Boxes to use the `clip` function ... - boxes = Boxes(boxes.reshape(-1, 4)) - boxes.clip(image_shape) - boxes = boxes.tensor.view(-1, num_bbox_reg_classes, 4) # R x C x 4 - - # 1. Filter results based on detection scores. It can make NMS more efficient - # by filtering out low-confidence detections. - filter_mask = scores > score_thresh # R x K - # R' x 2. 
First column contains indices of the R predictions; - # Second column contains indices of classes. - filter_inds = filter_mask.nonzero() - if num_bbox_reg_classes == 1: - boxes = boxes[filter_inds[:, 0], 0] - else: - boxes = boxes[filter_mask] - scores = scores[filter_mask] - - # 2. Apply NMS for each class independently. - keep = batched_nms(boxes, scores, filter_inds[:, 1], nms_thresh) - if topk_per_image >= 0: - keep = keep[:topk_per_image] - boxes, scores, filter_inds = boxes[keep], scores[keep], filter_inds[keep] - - result = Instances(image_shape) - result.pred_boxes = Boxes(boxes) - result.scores = scores - result.pred_classes = filter_inds[:, 1] - return result, filter_inds[:, 0] - - -class FastRCNNOutputLayers(nn.Module): - """ - Two linear layers for predicting Fast R-CNN outputs: - - 1. proposal-to-detection box regression deltas - 2. classification scores - """ - - @configurable - def __init__( - self, - input_shape: ShapeSpec, - *, - box2box_transform, - num_classes: int, - test_score_thresh: float = 0.0, - test_nms_thresh: float = 0.5, - test_topk_per_image: int = 100, - cls_agnostic_bbox_reg: bool = False, - smooth_l1_beta: float = 0.0, - box_reg_loss_type: str = "smooth_l1", - loss_weight: Union[float, Dict[str, float]] = 1.0, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature to this module - box2box_transform (Box2BoxTransform or Box2BoxTransformRotated): - num_classes (int): number of foreground classes - test_score_thresh (float): threshold to filter predictions results. - test_nms_thresh (float): NMS threshold for prediction results. - test_topk_per_image (int): number of top predictions to produce per image. - cls_agnostic_bbox_reg (bool): whether to use class agnostic for bbox regression - smooth_l1_beta (float): transition point from L1 to L2 loss. Only used if - `box_reg_loss_type` is "smooth_l1" - box_reg_loss_type (str): Box regression loss type. One of: "smooth_l1", "giou", - "diou", "ciou" - loss_weight (float|dict): weights to use for losses. Can be single float for weighting - all losses, or a dict of individual weightings. 
Valid dict keys are: - * "loss_cls": applied to classification loss - * "loss_box_reg": applied to box regression loss - """ - super().__init__() - if isinstance(input_shape, int): # some backward compatibility - input_shape = ShapeSpec(channels=input_shape) - self.num_classes = num_classes - input_size = input_shape.channels * (input_shape.width or 1) * (input_shape.height or 1) - # prediction layer for num_classes foreground classes and one background class (hence + 1) - self.cls_score = nn.Linear(input_size, num_classes + 1) - num_bbox_reg_classes = 1 if cls_agnostic_bbox_reg else num_classes - box_dim = len(box2box_transform.weights) - self.bbox_pred = nn.Linear(input_size, num_bbox_reg_classes * box_dim) - - nn.init.normal_(self.cls_score.weight, std=0.01) - nn.init.normal_(self.bbox_pred.weight, std=0.001) - for l in [self.cls_score, self.bbox_pred]: - nn.init.constant_(l.bias, 0) - - self.box2box_transform = box2box_transform - self.smooth_l1_beta = smooth_l1_beta - self.test_score_thresh = test_score_thresh - self.test_nms_thresh = test_nms_thresh - self.test_topk_per_image = test_topk_per_image - self.box_reg_loss_type = box_reg_loss_type - if isinstance(loss_weight, float): - loss_weight = {"loss_cls": loss_weight, "loss_box_reg": loss_weight} - self.loss_weight = loss_weight - - @classmethod - def from_config(cls, cfg, input_shape): - return { - "input_shape": input_shape, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS), - # fmt: off - "num_classes" : cfg.MODEL.ROI_HEADS.NUM_CLASSES, - "cls_agnostic_bbox_reg" : cfg.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG, - "smooth_l1_beta" : cfg.MODEL.ROI_BOX_HEAD.SMOOTH_L1_BETA, - "test_score_thresh" : cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST, - "test_nms_thresh" : cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST, - "test_topk_per_image" : cfg.TEST.DETECTIONS_PER_IMAGE, - "box_reg_loss_type" : cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_TYPE, - "loss_weight" : {"loss_box_reg": cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_WEIGHT}, - # fmt: on - } - - def forward(self, x): - """ - Args: - x: per-region features of shape (N, ...) for N bounding boxes to predict. - - Returns: - (Tensor, Tensor): - First tensor: shape (N,K+1), scores for each of the N box. Each row contains the - scores for K object categories and 1 background class. - - Second tensor: bounding box regression deltas for each box. Shape is shape (N,Kx4), - or (N,4) for class-agnostic regression. - """ - if x.dim() > 2: - x = torch.flatten(x, start_dim=1) - scores = self.cls_score(x) - proposal_deltas = self.bbox_pred(x) - return scores, proposal_deltas - - def losses(self, predictions, proposals): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were used - to compute predictions. The fields ``proposal_boxes``, ``gt_boxes``, - ``gt_classes`` are expected. - - Returns: - Dict[str, Tensor]: dict of losses - """ - scores, proposal_deltas = predictions - - # parse classification outputs - gt_classes = ( - cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0) - ) - _log_classification_stats(scores, gt_classes) - - # parse box regression outputs - if len(proposals): - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) # Nx4 - assert not proposal_boxes.requires_grad, "Proposals should not require gradients!" - # If "gt_boxes" does not exist, the proposals must be all negative and - # should not be included in regression loss computation. 
- # Here we just use proposal_boxes as an arbitrary placeholder because its - # value won't be used in self.box_reg_loss(). - gt_boxes = cat( - [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals], - dim=0, - ) - else: - proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device) - - losses = { - "loss_cls": cross_entropy(scores, gt_classes, reduction="mean"), - "loss_box_reg": self.box_reg_loss( - proposal_boxes, gt_boxes, proposal_deltas, gt_classes - ), - } - return {k: v * self.loss_weight.get(k, 1.0) for k, v in losses.items()} - - def box_reg_loss(self, proposal_boxes, gt_boxes, pred_deltas, gt_classes): - """ - Args: - proposal_boxes/gt_boxes are tensors with the same shape (R, 4 or 5). - pred_deltas has shape (R, 4 or 5), or (R, num_classes * (4 or 5)). - gt_classes is a long tensor of shape R, the gt class label of each proposal. - R shall be the number of proposals. - """ - box_dim = proposal_boxes.shape[1] # 4 or 5 - # Regression loss is only computed for foreground proposals (those matched to a GT) - fg_inds = nonzero_tuple((gt_classes >= 0) & (gt_classes < self.num_classes))[0] - if pred_deltas.shape[1] == box_dim: # cls-agnostic regression - fg_pred_deltas = pred_deltas[fg_inds] - else: - fg_pred_deltas = pred_deltas.view(-1, self.num_classes, box_dim)[ - fg_inds, gt_classes[fg_inds] - ] - - loss_box_reg = _dense_box_regression_loss( - [proposal_boxes[fg_inds]], - self.box2box_transform, - [fg_pred_deltas.unsqueeze(0)], - [gt_boxes[fg_inds]], - ..., - self.box_reg_loss_type, - self.smooth_l1_beta, - ) - - # The reg loss is normalized using the total number of regions (R), not the number - # of foreground regions even though the box regression loss is only defined on - # foreground regions. Why? Because doing so gives equal training influence to - # each foreground example. To see how, consider two different minibatches: - # (1) Contains a single foreground region - # (2) Contains 100 foreground regions - # If we normalize by the number of foreground regions, the single example in - # minibatch (1) will be given 100 times as much influence as each foreground - # example in minibatch (2). Normalizing by the total number of regions, R, - # means that the single example in minibatch (1) and each of the 100 examples - # in minibatch (2) are given equal influence. - return loss_box_reg / max(gt_classes.numel(), 1.0) # return 0 if empty - - def inference(self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances]): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. The ``proposal_boxes`` field is expected. - - Returns: - list[Instances]: same as `fast_rcnn_inference`. - list[Tensor]: same as `fast_rcnn_inference`. - """ - boxes = self.predict_boxes(predictions, proposals) - scores = self.predict_probs(predictions, proposals) - image_shapes = [x.image_size for x in proposals] - return fast_rcnn_inference( - boxes, - scores, - image_shapes, - self.test_score_thresh, - self.test_nms_thresh, - self.test_topk_per_image, - ) - - def predict_boxes_for_gt_classes(self, predictions, proposals): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were used - to compute predictions. The fields ``proposal_boxes``, ``gt_classes`` are expected. 
- - Returns: - list[Tensor]: - A list of Tensors of predicted boxes for GT classes in case of - class-specific box head. Element i of the list has shape (Ri, B), where Ri is - the number of proposals for image i and B is the box dimension (4 or 5) - """ - if not len(proposals): - return [] - scores, proposal_deltas = predictions - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) - N, B = proposal_boxes.shape - predict_boxes = self.box2box_transform.apply_deltas( - proposal_deltas, proposal_boxes - ) # Nx(KxB) - - K = predict_boxes.shape[1] // B - if K > 1: - gt_classes = torch.cat([p.gt_classes for p in proposals], dim=0) - # Some proposals are ignored or have a background class. Their gt_classes - # cannot be used as index. - gt_classes = gt_classes.clamp_(0, K - 1) - - predict_boxes = predict_boxes.view(N, K, B)[ - torch.arange(N, dtype=torch.long, device=predict_boxes.device), gt_classes - ] - num_prop_per_image = [len(p) for p in proposals] - return predict_boxes.split(num_prop_per_image) - - def predict_boxes( - self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances] - ): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. The ``proposal_boxes`` field is expected. - - Returns: - list[Tensor]: - A list of Tensors of predicted class-specific or class-agnostic boxes - for each image. Element i has shape (Ri, K * B) or (Ri, B), where Ri is - the number of proposals for image i and B is the box dimension (4 or 5) - """ - if not len(proposals): - return [] - _, proposal_deltas = predictions - num_prop_per_image = [len(p) for p in proposals] - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) - predict_boxes = self.box2box_transform.apply_deltas( - proposal_deltas, - proposal_boxes, - ) # Nx(KxB) - return predict_boxes.split(num_prop_per_image) - - def predict_probs( - self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances] - ): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. - - Returns: - list[Tensor]: - A list of Tensors of predicted class probabilities for each image. - Element i has shape (Ri, K + 1), where Ri is the number of proposals for image i. - """ - scores, _ = predictions - num_inst_per_image = [len(p) for p in proposals] - probs = F.softmax(scores, dim=-1) - return probs.split(num_inst_per_image, dim=0) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py deleted file mode 100755 index e0acc138..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py +++ /dev/null @@ -1,272 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from typing import List -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ConvTranspose2d, cat, interpolate -from detectron2.structures import Instances, heatmaps_to_keypoints -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -_TOTAL_SKIPPED = 0 - - -__all__ = [ - "ROI_KEYPOINT_HEAD_REGISTRY", - "build_keypoint_head", - "BaseKeypointRCNNHead", - "KRCNNConvDeconvUpsampleHead", -] - - -ROI_KEYPOINT_HEAD_REGISTRY = Registry("ROI_KEYPOINT_HEAD") -ROI_KEYPOINT_HEAD_REGISTRY.__doc__ = """ -Registry for keypoint heads, which make keypoint predictions from per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -def build_keypoint_head(cfg, input_shape): - """ - Build a keypoint head from `cfg.MODEL.ROI_KEYPOINT_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_KEYPOINT_HEAD.NAME - return ROI_KEYPOINT_HEAD_REGISTRY.get(name)(cfg, input_shape) - - -def keypoint_rcnn_loss(pred_keypoint_logits, instances, normalizer): - """ - Arguments: - pred_keypoint_logits (Tensor): A tensor of shape (N, K, S, S) where N is the total number - of instances in the batch, K is the number of keypoints, and S is the side length - of the keypoint heatmap. The values are spatial logits. - instances (list[Instances]): A list of M Instances, where M is the batch size. - These instances are predictions from the model - that are in 1:1 correspondence with pred_keypoint_logits. - Each Instances should contain a `gt_keypoints` field containing a `structures.Keypoint` - instance. - normalizer (float): Normalize the loss by this amount. - If not specified, we normalize by the number of visible keypoints in the minibatch. - - Returns a scalar tensor containing the loss. - """ - heatmaps = [] - valid = [] - - keypoint_side_len = pred_keypoint_logits.shape[2] - for instances_per_image in instances: - if len(instances_per_image) == 0: - continue - keypoints = instances_per_image.gt_keypoints - heatmaps_per_image, valid_per_image = keypoints.to_heatmap( - instances_per_image.proposal_boxes.tensor, keypoint_side_len - ) - heatmaps.append(heatmaps_per_image.view(-1)) - valid.append(valid_per_image.view(-1)) - - if len(heatmaps): - keypoint_targets = cat(heatmaps, dim=0) - valid = cat(valid, dim=0).to(dtype=torch.uint8) - valid = torch.nonzero(valid).squeeze(1) - - # torch.mean (in binary_cross_entropy_with_logits) doesn't - # accept empty tensors, so handle it separately - if len(heatmaps) == 0 or valid.numel() == 0: - global _TOTAL_SKIPPED - _TOTAL_SKIPPED += 1 - storage = get_event_storage() - storage.put_scalar("kpts_num_skipped_batches", _TOTAL_SKIPPED, smoothing_hint=False) - return pred_keypoint_logits.sum() * 0 - - N, K, H, W = pred_keypoint_logits.shape - pred_keypoint_logits = pred_keypoint_logits.view(N * K, H * W) - - keypoint_loss = F.cross_entropy( - pred_keypoint_logits[valid], keypoint_targets[valid], reduction="sum" - ) - - # If a normalizer isn't specified, normalize by the number of visible keypoints in the minibatch - if normalizer is None: - normalizer = valid.numel() - keypoint_loss /= normalizer - - return keypoint_loss - - -def keypoint_rcnn_inference(pred_keypoint_logits: torch.Tensor, pred_instances: List[Instances]): - """ - Post process each predicted keypoint heatmap in `pred_keypoint_logits` into (x, y, score) - and add it to the `pred_instances` as a `pred_keypoints` field. 
- - Args: - pred_keypoint_logits (Tensor): A tensor of shape (R, K, S, S) where R is the total number - of instances in the batch, K is the number of keypoints, and S is the side length of - the keypoint heatmap. The values are spatial logits. - pred_instances (list[Instances]): A list of N Instances, where N is the number of images. - - Returns: - None. Each element in pred_instances will contain extra "pred_keypoints" and - "pred_keypoint_heatmaps" fields. "pred_keypoints" is a tensor of shape - (#instance, K, 3) where the last dimension corresponds to (x, y, score). - The scores are larger than 0. "pred_keypoint_heatmaps" contains the raw - keypoint logits as passed to this function. - """ - # flatten all bboxes from all images together (list[Boxes] -> Rx4 tensor) - bboxes_flat = cat([b.pred_boxes.tensor for b in pred_instances], dim=0) - - pred_keypoint_logits = pred_keypoint_logits.detach() - keypoint_results = heatmaps_to_keypoints(pred_keypoint_logits, bboxes_flat.detach()) - num_instances_per_image = [len(i) for i in pred_instances] - keypoint_results = keypoint_results[:, :, [0, 1, 3]].split(num_instances_per_image, dim=0) - heatmap_results = pred_keypoint_logits.split(num_instances_per_image, dim=0) - - for keypoint_results_per_image, heatmap_results_per_image, instances_per_image in zip( - keypoint_results, heatmap_results, pred_instances - ): - # keypoint_results_per_image is (num instances)x(num keypoints)x(x, y, score) - # heatmap_results_per_image is (num instances)x(num keypoints)x(side)x(side) - instances_per_image.pred_keypoints = keypoint_results_per_image - instances_per_image.pred_keypoint_heatmaps = heatmap_results_per_image - - -class BaseKeypointRCNNHead(nn.Module): - """ - Implement the basic Keypoint R-CNN losses and inference logic described in - Sec. 5 of :paper:`Mask R-CNN`. - """ - - @configurable - def __init__(self, *, num_keypoints, loss_weight=1.0, loss_normalizer=1.0): - """ - NOTE: this interface is experimental. - - Args: - num_keypoints (int): number of keypoints to predict - loss_weight (float): weight to multiple on the keypoint loss - loss_normalizer (float or str): - If float, divide the loss by `loss_normalizer * #images`. - If 'visible', the loss is normalized by the total number of - visible keypoints across images. - """ - super().__init__() - self.num_keypoints = num_keypoints - self.loss_weight = loss_weight - assert loss_normalizer == "visible" or isinstance(loss_normalizer, float), loss_normalizer - self.loss_normalizer = loss_normalizer - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - "loss_weight": cfg.MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT, - "num_keypoints": cfg.MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS, - } - normalize_by_visible = ( - cfg.MODEL.ROI_KEYPOINT_HEAD.NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS - ) # noqa - if not normalize_by_visible: - batch_size_per_image = cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE - positive_sample_fraction = cfg.MODEL.ROI_HEADS.POSITIVE_FRACTION - ret["loss_normalizer"] = ( - ret["num_keypoints"] * batch_size_per_image * positive_sample_fraction - ) - else: - ret["loss_normalizer"] = "visible" - return ret - - def forward(self, x, instances: List[Instances]): - """ - Args: - x: input 4D region feature(s) provided by :class:`ROIHeads`. - instances (list[Instances]): contains the boxes & labels corresponding - to the input features. - Exact format is up to its caller to decide. - Typically, this is the foreground instances in training, with - "proposal_boxes" field and other gt annotations. 
- In inference, it contains boxes that are already predicted. - - Returns: - A dict of losses if in training. The predicted "instances" if in inference. - """ - x = self.layers(x) - if self.training: - num_images = len(instances) - normalizer = ( - None if self.loss_normalizer == "visible" else num_images * self.loss_normalizer - ) - return { - "loss_keypoint": keypoint_rcnn_loss(x, instances, normalizer=normalizer) - * self.loss_weight - } - else: - keypoint_rcnn_inference(x, instances) - return instances - - def layers(self, x): - """ - Neural network layers that makes predictions from regional input features. - """ - raise NotImplementedError - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). -@ROI_KEYPOINT_HEAD_REGISTRY.register() -class KRCNNConvDeconvUpsampleHead(BaseKeypointRCNNHead, nn.Sequential): - """ - A standard keypoint head containing a series of 3x3 convs, followed by - a transpose convolution and bilinear interpolation for upsampling. - It is described in Sec. 5 of :paper:`Mask R-CNN`. - """ - - @configurable - def __init__(self, input_shape, *, num_keypoints, conv_dims, **kwargs): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature - conv_dims: an iterable of output channel counts for each conv in the head - e.g. (512, 512, 512) for three convs outputting 512 channels. - """ - super().__init__(num_keypoints=num_keypoints, **kwargs) - - # default up_scale to 2.0 (this can be made an option) - up_scale = 2.0 - in_channels = input_shape.channels - - for idx, layer_channels in enumerate(conv_dims, 1): - module = Conv2d(in_channels, layer_channels, 3, stride=1, padding=1) - self.add_module("conv_fcn{}".format(idx), module) - self.add_module("conv_fcn_relu{}".format(idx), nn.ReLU()) - in_channels = layer_channels - - deconv_kernel = 4 - self.score_lowres = ConvTranspose2d( - in_channels, num_keypoints, deconv_kernel, stride=2, padding=deconv_kernel // 2 - 1 - ) - self.up_scale = up_scale - - for name, param in self.named_parameters(): - if "bias" in name: - nn.init.constant_(param, 0) - elif "weight" in name: - # Caffe2 implementation uses MSRAFill, which in fact - # corresponds to kaiming_normal_ in PyTorch - nn.init.kaiming_normal_(param, mode="fan_out", nonlinearity="relu") - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - ret["input_shape"] = input_shape - ret["conv_dims"] = cfg.MODEL.ROI_KEYPOINT_HEAD.CONV_DIMS - return ret - - def layers(self, x): - for layer in self: - x = layer(x) - x = interpolate(x, scale_factor=self.up_scale, mode="bilinear", align_corners=False) - return x diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py deleted file mode 100755 index 5ac5c4b9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py +++ /dev/null @@ -1,292 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from typing import List -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ConvTranspose2d, ShapeSpec, cat, get_norm -from detectron2.structures import Instances -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -__all__ = [ - "BaseMaskRCNNHead", - "MaskRCNNConvUpsampleHead", - "build_mask_head", - "ROI_MASK_HEAD_REGISTRY", -] - - -ROI_MASK_HEAD_REGISTRY = Registry("ROI_MASK_HEAD") -ROI_MASK_HEAD_REGISTRY.__doc__ = """ -Registry for mask heads, which predicts instance masks given -per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -@torch.jit.unused -def mask_rcnn_loss(pred_mask_logits: torch.Tensor, instances: List[Instances], vis_period: int = 0): - """ - Compute the mask prediction loss defined in the Mask R-CNN paper. - - Args: - pred_mask_logits (Tensor): A tensor of shape (B, C, Hmask, Wmask) or (B, 1, Hmask, Wmask) - for class-specific or class-agnostic, where B is the total number of predicted masks - in all images, C is the number of foreground classes, and Hmask, Wmask are the height - and width of the mask predictions. The values are logits. - instances (list[Instances]): A list of N Instances, where N is the number of images - in the batch. These instances are in 1:1 - correspondence with the pred_mask_logits. The ground-truth labels (class, box, mask, - ...) associated with each instance are stored in fields. - vis_period (int): the period (in steps) to dump visualization. - - Returns: - mask_loss (Tensor): A scalar tensor containing the loss. - """ - cls_agnostic_mask = pred_mask_logits.size(1) == 1 - total_num_masks = pred_mask_logits.size(0) - mask_side_len = pred_mask_logits.size(2) - assert pred_mask_logits.size(2) == pred_mask_logits.size(3), "Mask prediction must be square!" 
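> Editor's note: before the ground-truth assembly that follows in `mask_rcnn_loss`, the core of the loss can be seen in isolation: gather the logit channel matching each instance's ground-truth class, then average binary cross-entropy over all mask pixels. A self-contained toy sketch; the shapes and values are made up for illustration, not taken from the repo:

```python
import torch
import torch.nn.functional as F

# Toy shapes: 5 predicted masks, 3 foreground classes, 28x28 heatmaps.
pred_mask_logits = torch.randn(5, 3, 28, 28)
gt_classes = torch.tensor([0, 2, 1, 1, 0])
gt_masks = (torch.rand(5, 28, 28) > 0.5).float()

# Class-specific heads predict one mask per class; select the channel that
# matches each instance's ground-truth class before computing the loss.
indices = torch.arange(pred_mask_logits.size(0))
per_instance_logits = pred_mask_logits[indices, gt_classes]  # (5, 28, 28)

mask_loss = F.binary_cross_entropy_with_logits(
    per_instance_logits, gt_masks, reduction="mean"
)
```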
- - gt_classes = [] - gt_masks = [] - for instances_per_image in instances: - if len(instances_per_image) == 0: - continue - if not cls_agnostic_mask: - gt_classes_per_image = instances_per_image.gt_classes.to(dtype=torch.int64) - gt_classes.append(gt_classes_per_image) - - gt_masks_per_image = instances_per_image.gt_masks.crop_and_resize( - instances_per_image.proposal_boxes.tensor, mask_side_len - ).to(device=pred_mask_logits.device) - # A tensor of shape (N, M, M), N=#instances in the image; M=mask_side_len - gt_masks.append(gt_masks_per_image) - - if len(gt_masks) == 0: - return pred_mask_logits.sum() * 0 - - gt_masks = cat(gt_masks, dim=0) - - if cls_agnostic_mask: - pred_mask_logits = pred_mask_logits[:, 0] - else: - indices = torch.arange(total_num_masks) - gt_classes = cat(gt_classes, dim=0) - pred_mask_logits = pred_mask_logits[indices, gt_classes] - - if gt_masks.dtype == torch.bool: - gt_masks_bool = gt_masks - else: - # Here we allow gt_masks to be float as well (depend on the implementation of rasterize()) - gt_masks_bool = gt_masks > 0.5 - gt_masks = gt_masks.to(dtype=torch.float32) - - # Log the training accuracy (using gt classes and 0.5 threshold) - mask_incorrect = (pred_mask_logits > 0.0) != gt_masks_bool - mask_accuracy = 1 - (mask_incorrect.sum().item() / max(mask_incorrect.numel(), 1.0)) - num_positive = gt_masks_bool.sum().item() - false_positive = (mask_incorrect & ~gt_masks_bool).sum().item() / max( - gt_masks_bool.numel() - num_positive, 1.0 - ) - false_negative = (mask_incorrect & gt_masks_bool).sum().item() / max(num_positive, 1.0) - - storage = get_event_storage() - storage.put_scalar("mask_rcnn/accuracy", mask_accuracy) - storage.put_scalar("mask_rcnn/false_positive", false_positive) - storage.put_scalar("mask_rcnn/false_negative", false_negative) - if vis_period > 0 and storage.iter % vis_period == 0: - pred_masks = pred_mask_logits.sigmoid() - vis_masks = torch.cat([pred_masks, gt_masks], axis=2) - name = "Left: mask prediction; Right: mask GT" - for idx, vis_mask in enumerate(vis_masks): - vis_mask = torch.stack([vis_mask] * 3, axis=0) - storage.put_image(name + f" ({idx})", vis_mask) - - mask_loss = F.binary_cross_entropy_with_logits(pred_mask_logits, gt_masks, reduction="mean") - return mask_loss - - -def mask_rcnn_inference(pred_mask_logits: torch.Tensor, pred_instances: List[Instances]): - """ - Convert pred_mask_logits to estimated foreground probability masks while also - extracting only the masks for the predicted classes in pred_instances. For each - predicted box, the mask of the same class is attached to the instance by adding a - new "pred_masks" field to pred_instances. - - Args: - pred_mask_logits (Tensor): A tensor of shape (B, C, Hmask, Wmask) or (B, 1, Hmask, Wmask) - for class-specific or class-agnostic, where B is the total number of predicted masks - in all images, C is the number of foreground classes, and Hmask, Wmask are the height - and width of the mask predictions. The values are logits. - pred_instances (list[Instances]): A list of N Instances, where N is the number of images - in the batch. Each Instances must have field "pred_classes". - - Returns: - None. pred_instances will contain an extra "pred_masks" field storing a mask of size (Hmask, - Wmask) for predicted class. Note that the masks are returned as a soft (non-quantized) - masks the resolution predicted by the network; post-processing steps, such as resizing - the predicted masks to the original image resolution and/or binarizing them, is left - to the caller. 
- """ - cls_agnostic_mask = pred_mask_logits.size(1) == 1 - - if cls_agnostic_mask: - mask_probs_pred = pred_mask_logits.sigmoid() - else: - # Select masks corresponding to the predicted classes - num_masks = pred_mask_logits.shape[0] - class_pred = cat([i.pred_classes for i in pred_instances]) - indices = torch.arange(num_masks, device=class_pred.device) - mask_probs_pred = pred_mask_logits[indices, class_pred][:, None].sigmoid() - # mask_probs_pred.shape: (B, 1, Hmask, Wmask) - - num_boxes_per_image = [len(i) for i in pred_instances] - mask_probs_pred = mask_probs_pred.split(num_boxes_per_image, dim=0) - - for prob, instances in zip(mask_probs_pred, pred_instances): - instances.pred_masks = prob # (1, Hmask, Wmask) - - -class BaseMaskRCNNHead(nn.Module): - """ - Implement the basic Mask R-CNN losses and inference logic described in :paper:`Mask R-CNN` - """ - - @configurable - def __init__(self, *, loss_weight: float = 1.0, vis_period: int = 0): - """ - NOTE: this interface is experimental. - - Args: - loss_weight (float): multiplier of the loss - vis_period (int): visualization period - """ - super().__init__() - self.vis_period = vis_period - self.loss_weight = loss_weight - - @classmethod - def from_config(cls, cfg, input_shape): - return {"vis_period": cfg.VIS_PERIOD} - - def forward(self, x, instances: List[Instances]): - """ - Args: - x: input region feature(s) provided by :class:`ROIHeads`. - instances (list[Instances]): contains the boxes & labels corresponding - to the input features. - Exact format is up to its caller to decide. - Typically, this is the foreground instances in training, with - "proposal_boxes" field and other gt annotations. - In inference, it contains boxes that are already predicted. - - Returns: - A dict of losses in training. The predicted "instances" in inference. - """ - x = self.layers(x) - if self.training: - return {"loss_mask": mask_rcnn_loss(x, instances, self.vis_period) * self.loss_weight} - else: - mask_rcnn_inference(x, instances) - return instances - - def layers(self, x): - """ - Neural network layers that makes predictions from input features. - """ - raise NotImplementedError - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). -@ROI_MASK_HEAD_REGISTRY.register() -class MaskRCNNConvUpsampleHead(BaseMaskRCNNHead, nn.Sequential): - """ - A mask head with several conv layers, plus an upsample layer (with `ConvTranspose2d`). - Predictions are made with a final 1x1 conv layer. - """ - - @configurable - def __init__(self, input_shape: ShapeSpec, *, num_classes, conv_dims, conv_norm="", **kwargs): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature - num_classes (int): the number of foreground classes (i.e. background is not - included). 1 if using class agnostic prediction. - conv_dims (list[int]): a list of N>0 integers representing the output dimensions - of N-1 conv layers and the last upsample layer. - conv_norm (str or callable): normalization for the conv layers. - See :func:`detectron2.layers.get_norm` for supported types. - """ - super().__init__(**kwargs) - assert len(conv_dims) >= 1, "conv_dims have to be non-empty!" 
- - self.conv_norm_relus = [] - - cur_channels = input_shape.channels - for k, conv_dim in enumerate(conv_dims[:-1]): - conv = Conv2d( - cur_channels, - conv_dim, - kernel_size=3, - stride=1, - padding=1, - bias=not conv_norm, - norm=get_norm(conv_norm, conv_dim), - activation=nn.ReLU(), - ) - self.add_module("mask_fcn{}".format(k + 1), conv) - self.conv_norm_relus.append(conv) - cur_channels = conv_dim - - self.deconv = ConvTranspose2d( - cur_channels, conv_dims[-1], kernel_size=2, stride=2, padding=0 - ) - self.add_module("deconv_relu", nn.ReLU()) - cur_channels = conv_dims[-1] - - self.predictor = Conv2d(cur_channels, num_classes, kernel_size=1, stride=1, padding=0) - - for layer in self.conv_norm_relus + [self.deconv]: - weight_init.c2_msra_fill(layer) - # use normal distribution initialization for mask prediction layer - nn.init.normal_(self.predictor.weight, std=0.001) - if self.predictor.bias is not None: - nn.init.constant_(self.predictor.bias, 0) - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - conv_dim = cfg.MODEL.ROI_MASK_HEAD.CONV_DIM - num_conv = cfg.MODEL.ROI_MASK_HEAD.NUM_CONV - ret.update( - conv_dims=[conv_dim] * (num_conv + 1), # +1 for ConvTranspose - conv_norm=cfg.MODEL.ROI_MASK_HEAD.NORM, - input_shape=input_shape, - ) - if cfg.MODEL.ROI_MASK_HEAD.CLS_AGNOSTIC_MASK: - ret["num_classes"] = 1 - else: - ret["num_classes"] = cfg.MODEL.ROI_HEADS.NUM_CLASSES - return ret - - def layers(self, x): - for layer in self: - x = layer(x) - return x - - -def build_mask_head(cfg, input_shape): - """ - Build a mask head defined by `cfg.MODEL.ROI_MASK_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_MASK_HEAD.NAME - return ROI_MASK_HEAD_REGISTRY.get(name)(cfg, input_shape) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py deleted file mode 100755 index 13dd57a0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py +++ /dev/null @@ -1,877 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import inspect -import logging -import numpy as np -from typing import Dict, List, Optional, Tuple -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, nonzero_tuple -from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -from ..backbone.resnet import BottleneckBlock, ResNet -from ..matcher import Matcher -from ..poolers import ROIPooler -from ..proposal_generator.proposal_utils import add_ground_truth_to_proposals -from ..sampling import subsample_labels -from .box_head import build_box_head -from .fast_rcnn import FastRCNNOutputLayers -from .keypoint_head import build_keypoint_head -from .mask_head import build_mask_head - -ROI_HEADS_REGISTRY = Registry("ROI_HEADS") -ROI_HEADS_REGISTRY.__doc__ = """ -Registry for ROI heads in a generalized R-CNN model. -ROIHeads take feature maps and region proposals, and -perform per-region computation. - -The registered object will be called with `obj(cfg, input_shape)`. -The call is expected to return an :class:`ROIHeads`. -""" - -logger = logging.getLogger(__name__) - - -def build_roi_heads(cfg, input_shape): - """ - Build ROIHeads defined by `cfg.MODEL.ROI_HEADS.NAME`. 
- """ - name = cfg.MODEL.ROI_HEADS.NAME - return ROI_HEADS_REGISTRY.get(name)(cfg, input_shape) - - -def select_foreground_proposals( - proposals: List[Instances], bg_label: int -) -> Tuple[List[Instances], List[torch.Tensor]]: - """ - Given a list of N Instances (for N images), each containing a `gt_classes` field, - return a list of Instances that contain only instances with `gt_classes != -1 && - gt_classes != bg_label`. - - Args: - proposals (list[Instances]): A list of N Instances, where N is the number of - images in the batch. - bg_label: label index of background class. - - Returns: - list[Instances]: N Instances, each contains only the selected foreground instances. - list[Tensor]: N boolean vector, correspond to the selection mask of - each Instances object. True for selected instances. - """ - assert isinstance(proposals, (list, tuple)) - assert isinstance(proposals[0], Instances) - assert proposals[0].has("gt_classes") - fg_proposals = [] - fg_selection_masks = [] - for proposals_per_image in proposals: - gt_classes = proposals_per_image.gt_classes - fg_selection_mask = (gt_classes != -1) & (gt_classes != bg_label) - fg_idxs = fg_selection_mask.nonzero().squeeze(1) - fg_proposals.append(proposals_per_image[fg_idxs]) - fg_selection_masks.append(fg_selection_mask) - return fg_proposals, fg_selection_masks - - -def select_proposals_with_visible_keypoints(proposals: List[Instances]) -> List[Instances]: - """ - Args: - proposals (list[Instances]): a list of N Instances, where N is the - number of images. - - Returns: - proposals: only contains proposals with at least one visible keypoint. - - Note that this is still slightly different from Detectron. - In Detectron, proposals for training keypoint head are re-sampled from - all the proposals with IOU>threshold & >=1 visible keypoint. - - Here, the proposals are first sampled from all proposals with - IOU>threshold, then proposals with no visible keypoint are filtered out. - This strategy seems to make no difference on Detectron and is easier to implement. - """ - ret = [] - all_num_fg = [] - for proposals_per_image in proposals: - # If empty/unannotated image (hard negatives), skip filtering for train - if len(proposals_per_image) == 0: - ret.append(proposals_per_image) - continue - gt_keypoints = proposals_per_image.gt_keypoints.tensor - # #fg x K x 3 - vis_mask = gt_keypoints[:, :, 2] >= 1 - xs, ys = gt_keypoints[:, :, 0], gt_keypoints[:, :, 1] - proposal_boxes = proposals_per_image.proposal_boxes.tensor.unsqueeze(dim=1) # #fg x 1 x 4 - kp_in_box = ( - (xs >= proposal_boxes[:, :, 0]) - & (xs <= proposal_boxes[:, :, 2]) - & (ys >= proposal_boxes[:, :, 1]) - & (ys <= proposal_boxes[:, :, 3]) - ) - selection = (kp_in_box & vis_mask).any(dim=1) - selection_idxs = nonzero_tuple(selection)[0] - all_num_fg.append(selection_idxs.numel()) - ret.append(proposals_per_image[selection_idxs]) - - storage = get_event_storage() - storage.put_scalar("keypoint_head/num_fg_samples", np.mean(all_num_fg)) - return ret - - -class ROIHeads(torch.nn.Module): - """ - ROIHeads perform all per-region computation in an R-CNN. - - It typically contains logic to - - 1. (in training only) match proposals with ground truth and sample them - 2. crop the regions and extract per-region features using proposals - 3. make per-region predictions with different heads - - It can have many variants, implemented as subclasses of this class. - This base class contains the logic to match/sample proposals. 
- But it is not necessary to inherit this class if the sampling logic is not needed. - """ - - @configurable - def __init__( - self, - *, - num_classes, - batch_size_per_image, - positive_fraction, - proposal_matcher, - proposal_append_gt=True, - ): - """ - NOTE: this interface is experimental. - - Args: - num_classes (int): number of foreground classes (i.e. background is not included) - batch_size_per_image (int): number of proposals to sample for training - positive_fraction (float): fraction of positive (foreground) proposals - to sample for training. - proposal_matcher (Matcher): matcher that matches proposals and ground truth - proposal_append_gt (bool): whether to include ground truth as proposals as well - """ - super().__init__() - self.batch_size_per_image = batch_size_per_image - self.positive_fraction = positive_fraction - self.num_classes = num_classes - self.proposal_matcher = proposal_matcher - self.proposal_append_gt = proposal_append_gt - - @classmethod - def from_config(cls, cfg): - return { - "batch_size_per_image": cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE, - "positive_fraction": cfg.MODEL.ROI_HEADS.POSITIVE_FRACTION, - "num_classes": cfg.MODEL.ROI_HEADS.NUM_CLASSES, - "proposal_append_gt": cfg.MODEL.ROI_HEADS.PROPOSAL_APPEND_GT, - # Matcher to assign box proposals to gt boxes - "proposal_matcher": Matcher( - cfg.MODEL.ROI_HEADS.IOU_THRESHOLDS, - cfg.MODEL.ROI_HEADS.IOU_LABELS, - allow_low_quality_matches=False, - ), - } - - def _sample_proposals( - self, matched_idxs: torch.Tensor, matched_labels: torch.Tensor, gt_classes: torch.Tensor - ) -> Tuple[torch.Tensor, torch.Tensor]: - """ - Based on the matching between N proposals and M groundtruth, - sample the proposals and set their classification labels. - - Args: - matched_idxs (Tensor): a vector of length N, each is the best-matched - gt index in [0, M) for each proposal. - matched_labels (Tensor): a vector of length N, the matcher's label - (one of cfg.MODEL.ROI_HEADS.IOU_LABELS) for each proposal. - gt_classes (Tensor): a vector of length M. - - Returns: - Tensor: a vector of indices of sampled proposals. Each is in [0, N). - Tensor: a vector of the same length, the classification label for - each sampled proposal. Each sample is labeled as either a category in - [0, num_classes) or the background (num_classes). - """ - has_gt = gt_classes.numel() > 0 - # Get the corresponding GT for each proposal - if has_gt: - gt_classes = gt_classes[matched_idxs] - # Label unmatched proposals (0 label from matcher) as background (label=num_classes) - gt_classes[matched_labels == 0] = self.num_classes - # Label ignore proposals (-1 label) - gt_classes[matched_labels == -1] = -1 - else: - gt_classes = torch.zeros_like(matched_idxs) + self.num_classes - - sampled_fg_idxs, sampled_bg_idxs = subsample_labels( - gt_classes, self.batch_size_per_image, self.positive_fraction, self.num_classes - ) - - sampled_idxs = torch.cat([sampled_fg_idxs, sampled_bg_idxs], dim=0) - return sampled_idxs, gt_classes[sampled_idxs] - - @torch.no_grad() - def label_and_sample_proposals( - self, proposals: List[Instances], targets: List[Instances] - ) -> List[Instances]: - """ - Prepare some proposals to be used to train the ROI heads. - It performs box matching between `proposals` and `targets`, and assigns - training labels to the proposals. - It returns ``self.batch_size_per_image`` random samples from proposals and groundtruth - boxes, with a fraction of positives that is no larger than - ``self.positive_fraction``. 
- - Args: - See :meth:`ROIHeads.forward` - - Returns: - list[Instances]: - length `N` list of `Instances`s containing the proposals - sampled for training. Each `Instances` has the following fields: - - - proposal_boxes: the proposal boxes - - gt_boxes: the ground-truth box that the proposal is assigned to - (this is only meaningful if the proposal has a label > 0; if label = 0 - then the ground-truth box is random) - - Other fields such as "gt_classes", "gt_masks", that's included in `targets`. - """ - # Augment proposals with ground-truth boxes. - # In the case of learned proposals (e.g., RPN), when training starts - # the proposals will be low quality due to random initialization. - # It's possible that none of these initial - # proposals have high enough overlap with the gt objects to be used - # as positive examples for the second stage components (box head, - # cls head, mask head). Adding the gt boxes to the set of proposals - # ensures that the second stage components will have some positive - # examples from the start of training. For RPN, this augmentation improves - # convergence and empirically improves box AP on COCO by about 0.5 - # points (under one tested configuration). - if self.proposal_append_gt: - proposals = add_ground_truth_to_proposals(targets, proposals) - - proposals_with_gt = [] - - num_fg_samples = [] - num_bg_samples = [] - for proposals_per_image, targets_per_image in zip(proposals, targets): - has_gt = len(targets_per_image) > 0 - match_quality_matrix = pairwise_iou( - targets_per_image.gt_boxes, proposals_per_image.proposal_boxes - ) - matched_idxs, matched_labels = self.proposal_matcher(match_quality_matrix) - sampled_idxs, gt_classes = self._sample_proposals( - matched_idxs, matched_labels, targets_per_image.gt_classes - ) - - # Set target attributes of the sampled proposals: - proposals_per_image = proposals_per_image[sampled_idxs] - proposals_per_image.gt_classes = gt_classes - - if has_gt: - sampled_targets = matched_idxs[sampled_idxs] - # We index all the attributes of targets that start with "gt_" - # and have not been added to proposals yet (="gt_classes"). - # NOTE: here the indexing waste some compute, because heads - # like masks, keypoints, etc, will filter the proposals again, - # (by foreground/background, or number of keypoints in the image, etc) - # so we essentially index the data twice. - for (trg_name, trg_value) in targets_per_image.get_fields().items(): - if trg_name.startswith("gt_") and not proposals_per_image.has(trg_name): - proposals_per_image.set(trg_name, trg_value[sampled_targets]) - # If no GT is given in the image, we don't know what a dummy gt value can be. - # Therefore the returned proposals won't have any gt_* fields, except for a - # gt_classes full of background label. 
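> Editor's note: the matching performed above (`pairwise_iou` plus the configured `Matcher`) reduces to building an M x N IoU matrix, taking the best ground-truth box per proposal, and thresholding. The sketch below is a simplified stand-in, assuming XYXY boxes and a single 0.5 IoU threshold; the real `Matcher` additionally supports an ignore band and low-quality matches:

```python
import torch

def toy_pairwise_iou(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """IoU between every box in `a` (M x 4) and every box in `b` (N x 4), XYXY format."""
    area_a = (a[:, 2] - a[:, 0]) * (a[:, 3] - a[:, 1])
    area_b = (b[:, 2] - b[:, 0]) * (b[:, 3] - b[:, 1])
    lt = torch.max(a[:, None, :2], b[None, :, :2])  # intersection top-left
    rb = torch.min(a[:, None, 2:], b[None, :, 2:])  # intersection bottom-right
    wh = (rb - lt).clamp(min=0)
    inter = wh[..., 0] * wh[..., 1]
    return inter / (area_a[:, None] + area_b[None, :] - inter)

gt = torch.tensor([[0.0, 0.0, 10.0, 10.0]])                             # one GT box
props = torch.tensor([[1.0, 1.0, 9.0, 9.0], [20.0, 20.0, 30.0, 30.0]])  # two proposals
quality = toy_pairwise_iou(gt, props)            # M x N match-quality matrix
matched_vals, matched_idxs = quality.max(dim=0)  # best GT per proposal
matched_labels = (matched_vals >= 0.5).long()    # 1 = foreground, 0 = background
```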
- - num_bg_samples.append((gt_classes == self.num_classes).sum().item()) - num_fg_samples.append(gt_classes.numel() - num_bg_samples[-1]) - proposals_with_gt.append(proposals_per_image) - - # Log the number of fg/bg samples that are selected for training ROI heads - storage = get_event_storage() - storage.put_scalar("roi_head/num_fg_samples", np.mean(num_fg_samples)) - storage.put_scalar("roi_head/num_bg_samples", np.mean(num_bg_samples)) - - return proposals_with_gt - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - proposals: List[Instances], - targets: Optional[List[Instances]] = None, - ) -> Tuple[List[Instances], Dict[str, torch.Tensor]]: - """ - Args: - images (ImageList): - features (dict[str,Tensor]): input data as a mapping from feature - map name to tensor. Axis 0 represents the number of images `N` in - the input data; axes 1-3 are channels, height, and width, which may - vary between feature maps (e.g., if a feature pyramid is used). - proposals (list[Instances]): length `N` list of `Instances`. The i-th - `Instances` contains object proposals for the i-th input image, - with fields "proposal_boxes" and "objectness_logits". - targets (list[Instances], optional): length `N` list of `Instances`. The i-th - `Instances` contains the ground-truth per-instance annotations - for the i-th input image. Specify `targets` during training only. - It may have the following fields: - - - gt_boxes: the bounding box of each instance. - - gt_classes: the label for each instance with a category ranging in [0, #class]. - - gt_masks: PolygonMasks or BitMasks, the ground-truth masks of each instance. - - gt_keypoints: NxKx3, the groud-truth keypoints for each instance. - - Returns: - list[Instances]: length `N` list of `Instances` containing the - detected instances. Returned during inference only; may be [] during training. - - dict[str->Tensor]: - mapping from a named loss to a tensor storing the loss. Used during training only. - """ - raise NotImplementedError() - - -@ROI_HEADS_REGISTRY.register() -class Res5ROIHeads(ROIHeads): - """ - The ROIHeads in a typical "C4" R-CNN model, where - the box and mask head share the cropping and - the per-region feature computation by a Res5 block. - See :paper:`ResNet` Appendix A. - """ - - @configurable - def __init__( - self, - *, - in_features: List[str], - pooler: ROIPooler, - res5: nn.Module, - box_predictor: nn.Module, - mask_head: Optional[nn.Module] = None, - **kwargs, - ): - """ - NOTE: this interface is experimental. - - Args: - in_features (list[str]): list of backbone feature map names to use for - feature extraction - pooler (ROIPooler): pooler to extra region features from backbone - res5 (nn.Sequential): a CNN to compute per-region features, to be used by - ``box_predictor`` and ``mask_head``. Typically this is a "res5" - block from a ResNet. - box_predictor (nn.Module): make box predictions from the feature. - Should have the same interface as :class:`FastRCNNOutputLayers`. 
- mask_head (nn.Module): transform features to make mask predictions - """ - super().__init__(**kwargs) - self.in_features = in_features - self.pooler = pooler - if isinstance(res5, (list, tuple)): - res5 = nn.Sequential(*res5) - self.res5 = res5 - self.box_predictor = box_predictor - self.mask_on = mask_head is not None - if self.mask_on: - self.mask_head = mask_head - - @classmethod - def from_config(cls, cfg, input_shape): - # fmt: off - ret = super().from_config(cfg) - in_features = ret["in_features"] = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - pooler_scales = (1.0 / input_shape[in_features[0]].stride, ) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - mask_on = cfg.MODEL.MASK_ON - # fmt: on - assert not cfg.MODEL.KEYPOINT_ON - assert len(in_features) == 1 - - ret["pooler"] = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - - # Compatbility with old moco code. Might be useful. - # See notes in StandardROIHeads.from_config - if not inspect.ismethod(cls._build_res5_block): - logger.warning( - "The behavior of _build_res5_block may change. " - "Please do not depend on private methods." - ) - cls._build_res5_block = classmethod(cls._build_res5_block) - - ret["res5"], out_channels = cls._build_res5_block(cfg) - ret["box_predictor"] = FastRCNNOutputLayers( - cfg, ShapeSpec(channels=out_channels, height=1, width=1) - ) - - if mask_on: - ret["mask_head"] = build_mask_head( - cfg, - ShapeSpec(channels=out_channels, width=pooler_resolution, height=pooler_resolution), - ) - return ret - - @classmethod - def _build_res5_block(cls, cfg): - # fmt: off - stage_channel_factor = 2 ** 3 # res5 is 8x res2 - num_groups = cfg.MODEL.RESNETS.NUM_GROUPS - width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP - bottleneck_channels = num_groups * width_per_group * stage_channel_factor - out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS * stage_channel_factor - stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 - norm = cfg.MODEL.RESNETS.NORM - assert not cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE[-1], \ - "Deformable conv is not yet supported in res5 head." - # fmt: on - - blocks = ResNet.make_stage( - BottleneckBlock, - 3, - stride_per_block=[2, 1, 1], - in_channels=out_channels // 2, - bottleneck_channels=bottleneck_channels, - out_channels=out_channels, - num_groups=num_groups, - norm=norm, - stride_in_1x1=stride_in_1x1, - ) - return nn.Sequential(*blocks), out_channels - - def _shared_roi_transform(self, features: List[torch.Tensor], boxes: List[Boxes]): - x = self.pooler(features, boxes) - return self.res5(x) - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - proposals: List[Instances], - targets: Optional[List[Instances]] = None, - ): - """ - See :meth:`ROIHeads.forward`. 
- """ - del images - - if self.training: - assert targets - proposals = self.label_and_sample_proposals(proposals, targets) - del targets - - proposal_boxes = [x.proposal_boxes for x in proposals] - box_features = self._shared_roi_transform( - [features[f] for f in self.in_features], proposal_boxes - ) - predictions = self.box_predictor(box_features.mean(dim=[2, 3])) - - if self.training: - del features - losses = self.box_predictor.losses(predictions, proposals) - if self.mask_on: - proposals, fg_selection_masks = select_foreground_proposals( - proposals, self.num_classes - ) - # Since the ROI feature transform is shared between boxes and masks, - # we don't need to recompute features. The mask loss is only defined - # on foreground proposals, so we need to select out the foreground - # features. - mask_features = box_features[torch.cat(fg_selection_masks, dim=0)] - del box_features - losses.update(self.mask_head(mask_features, proposals)) - return [], losses - else: - pred_instances, _ = self.box_predictor.inference(predictions, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - return pred_instances, {} - - def forward_with_given_boxes( - self, features: Dict[str, torch.Tensor], instances: List[Instances] - ) -> List[Instances]: - """ - Use the given boxes in `instances` to produce other (non-box) per-ROI outputs. - - Args: - features: same as in `forward()` - instances (list[Instances]): instances to predict other outputs. Expect the keys - "pred_boxes" and "pred_classes" to exist. - - Returns: - instances (Instances): - the same `Instances` object, with extra - fields such as `pred_masks` or `pred_keypoints`. - """ - assert not self.training - assert instances[0].has("pred_boxes") and instances[0].has("pred_classes") - - if self.mask_on: - feature_list = [features[f] for f in self.in_features] - x = self._shared_roi_transform(feature_list, [x.pred_boxes for x in instances]) - return self.mask_head(x, instances) - else: - return instances - - -@ROI_HEADS_REGISTRY.register() -class StandardROIHeads(ROIHeads): - """ - It's "standard" in a sense that there is no ROI transform sharing - or feature sharing between tasks. - Each head independently processes the input features by each head's - own pooler and head. - - This class is used by most models, such as FPN and C5. - To implement more models, you can subclass it and implement a different - :meth:`forward()` or a head. - """ - - @configurable - def __init__( - self, - *, - box_in_features: List[str], - box_pooler: ROIPooler, - box_head: nn.Module, - box_predictor: nn.Module, - mask_in_features: Optional[List[str]] = None, - mask_pooler: Optional[ROIPooler] = None, - mask_head: Optional[nn.Module] = None, - keypoint_in_features: Optional[List[str]] = None, - keypoint_pooler: Optional[ROIPooler] = None, - keypoint_head: Optional[nn.Module] = None, - train_on_pred_boxes: bool = False, - **kwargs, - ): - """ - NOTE: this interface is experimental. - - Args: - box_in_features (list[str]): list of feature names to use for the box head. - box_pooler (ROIPooler): pooler to extra region features for box head - box_head (nn.Module): transform features to make box predictions - box_predictor (nn.Module): make box predictions from the feature. - Should have the same interface as :class:`FastRCNNOutputLayers`. - mask_in_features (list[str]): list of feature names to use for the mask - pooler or mask head. None if not using mask head. 
- mask_pooler (ROIPooler): pooler to extract region features from image features. - The mask head will then take region features to make predictions. - If None, the mask head will directly take the dict of image features - defined by `mask_in_features` - mask_head (nn.Module): transform features to make mask predictions - keypoint_in_features, keypoint_pooler, keypoint_head: similar to ``mask_*``. - train_on_pred_boxes (bool): whether to use proposal boxes or - predicted boxes from the box head to train other heads. - """ - super().__init__(**kwargs) - # keep self.in_features for backward compatibility - self.in_features = self.box_in_features = box_in_features - self.box_pooler = box_pooler - self.box_head = box_head - self.box_predictor = box_predictor - - self.mask_on = mask_in_features is not None - if self.mask_on: - self.mask_in_features = mask_in_features - self.mask_pooler = mask_pooler - self.mask_head = mask_head - - self.keypoint_on = keypoint_in_features is not None - if self.keypoint_on: - self.keypoint_in_features = keypoint_in_features - self.keypoint_pooler = keypoint_pooler - self.keypoint_head = keypoint_head - - self.train_on_pred_boxes = train_on_pred_boxes - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg) - ret["train_on_pred_boxes"] = cfg.MODEL.ROI_BOX_HEAD.TRAIN_ON_PRED_BOXES - # Subclasses that have not been updated to use from_config style construction - # may have overridden _init_*_head methods. In this case, those overridden methods - # will not be classmethods and we need to avoid trying to call them here. - # We test for this with ismethod which only returns True for bound methods of cls. - # Such subclasses will need to handle calling their overridden _init_*_head methods. - if inspect.ismethod(cls._init_box_head): - ret.update(cls._init_box_head(cfg, input_shape)) - if inspect.ismethod(cls._init_mask_head): - ret.update(cls._init_mask_head(cfg, input_shape)) - if inspect.ismethod(cls._init_keypoint_head): - ret.update(cls._init_keypoint_head(cfg, input_shape)) - return ret - - @classmethod - def _init_box_head(cls, cfg, input_shape): - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - # fmt: on - - # If StandardROIHeads is applied on multiple feature maps (as in FPN), - # then we share the same predictors and therefore the channel counts must be the same - in_channels = [input_shape[f].channels for f in in_features] - # Check all channel counts are equal - assert len(set(in_channels)) == 1, in_channels - in_channels = in_channels[0] - - box_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - # Here we split "box head" and "box predictor", which is mainly due to historical reasons. - # They are used together so the "box predictor" layers should be part of the "box head". - # New subclasses of ROIHeads do not need "box predictor"s. 
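The scale arithmetic in `_init_box_head` above is worth making concrete. A minimal sketch, using plain dicts with hypothetical FPN strides in place of the real `ShapeSpec` inputs, of how the per-level pooler scales and the shared-channel check come out:

```python
# Hypothetical FPN shapes; detectron2 passes ShapeSpec objects, not dicts.
input_shape = {
    "p2": {"stride": 4, "channels": 256},
    "p3": {"stride": 8, "channels": 256},
    "p4": {"stride": 16, "channels": 256},
    "p5": {"stride": 32, "channels": 256},
}
in_features = ["p2", "p3", "p4", "p5"]

# One scale per level: ROI coordinates are multiplied by 1/stride to index
# into that feature map.
pooler_scales = tuple(1.0 / input_shape[k]["stride"] for k in in_features)
assert pooler_scales == (0.25, 0.125, 0.0625, 0.03125)

# A single box head is shared across all levels, so channel counts must agree.
in_channels = [input_shape[f]["channels"] for f in in_features]
assert len(set(in_channels)) == 1
```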
- box_head = build_box_head( - cfg, ShapeSpec(channels=in_channels, height=pooler_resolution, width=pooler_resolution) - ) - box_predictor = FastRCNNOutputLayers(cfg, box_head.output_shape) - return { - "box_in_features": in_features, - "box_pooler": box_pooler, - "box_head": box_head, - "box_predictor": box_predictor, - } - - @classmethod - def _init_mask_head(cls, cfg, input_shape): - if not cfg.MODEL.MASK_ON: - return {} - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_MASK_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_MASK_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_MASK_HEAD.POOLER_TYPE - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features][0] - - ret = {"mask_in_features": in_features} - ret["mask_pooler"] = ( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - if pooler_type - else None - ) - if pooler_type: - shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - else: - shape = {f: input_shape[f] for f in in_features} - ret["mask_head"] = build_mask_head(cfg, shape) - return ret - - @classmethod - def _init_keypoint_head(cls, cfg, input_shape): - if not cfg.MODEL.KEYPOINT_ON: - return {} - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) # noqa - sampling_ratio = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_TYPE - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features][0] - - ret = {"keypoint_in_features": in_features} - ret["keypoint_pooler"] = ( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - if pooler_type - else None - ) - if pooler_type: - shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - else: - shape = {f: input_shape[f] for f in in_features} - ret["keypoint_head"] = build_keypoint_head(cfg, shape) - return ret - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - proposals: List[Instances], - targets: Optional[List[Instances]] = None, - ) -> Tuple[List[Instances], Dict[str, torch.Tensor]]: - """ - See :class:`ROIHeads.forward`. - """ - del images - if self.training: - assert targets, "'targets' argument is required during training" - proposals = self.label_and_sample_proposals(proposals, targets) - del targets - - if self.training: - losses = self._forward_box(features, proposals) - # Usually the original proposals used by the box head are used by the mask, keypoint - # heads. But when `self.train_on_pred_boxes is True`, proposals will contain boxes - # predicted by the box head. - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - # During inference cascaded prediction is used: the mask and keypoints heads are only - # applied to the top scoring box detections. 
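The comment above summarizes the whole control flow: at train time every branch contributes losses on the sampled proposals, while at inference the box branch runs first and the other branches only see its surviving detections. A condensed, purely illustrative sketch of that split, with stub branches standing in for the real heads (all names here are hypothetical):

```python
# Stub branches: in training they return loss dicts; in inference they
# return (possibly filtered) detections.
def box_branch(feats, rois, training):
    return {"loss_box": 1.0} if training else rois[:100]

def mask_branch(feats, rois, training):
    return {"loss_mask": 0.5} if training else rois

def run_roi_heads(feats, proposals, training):
    if training:
        losses = {}
        losses.update(box_branch(feats, proposals, True))
        losses.update(mask_branch(feats, proposals, True))
        return proposals, losses
    # Inference: box branch first; the mask branch only sees its top detections.
    detections = box_branch(feats, proposals, False)
    detections = mask_branch(feats, detections, False)
    return detections, {}

print(run_roi_heads(None, list(range(1000)), True)[1])        # {'loss_box': 1.0, 'loss_mask': 0.5}
print(len(run_roi_heads(None, list(range(1000)), False)[0]))  # 100
```

Running the expensive per-ROI branches only on the kept detections is what the "cascaded prediction" comment refers to: the mask and keypoint heads never pay for proposals the box head has already discarded.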
-        pred_instances = self.forward_with_given_boxes(features, pred_instances)
-        return pred_instances, {}
-
-    def forward_with_given_boxes(
-        self, features: Dict[str, torch.Tensor], instances: List[Instances]
-    ) -> List[Instances]:
-        """
-        Use the given boxes in `instances` to produce other (non-box) per-ROI outputs.
-
-        This is useful for downstream tasks where a box is known, but other
-        attributes (outputs of other heads) need to be obtained.
-        Test-time augmentation also uses this.
-
-        Args:
-            features: same as in `forward()`
-            instances (list[Instances]): instances to predict other outputs. Expect the keys
-                "pred_boxes" and "pred_classes" to exist.
-
-        Returns:
-            list[Instances]:
-                the same `Instances` objects, with extra
-                fields such as `pred_masks` or `pred_keypoints`.
-        """
-        assert not self.training
-        assert instances[0].has("pred_boxes") and instances[0].has("pred_classes")
-
-        instances = self._forward_mask(features, instances)
-        instances = self._forward_keypoint(features, instances)
-        return instances
-
-    def _forward_box(self, features: Dict[str, torch.Tensor], proposals: List[Instances]):
-        """
-        Forward logic of the box prediction branch. If `self.train_on_pred_boxes is True`,
-        the function puts predicted boxes in the `proposal_boxes` field of `proposals` argument.
-
-        Args:
-            features (dict[str, Tensor]): mapping from feature map names to tensor.
-                Same as in :meth:`ROIHeads.forward`.
-            proposals (list[Instances]): the per-image object proposals with
-                their matching ground truth.
-                Each has fields "proposal_boxes", "objectness_logits",
-                "gt_classes", and "gt_boxes".
-
-        Returns:
-            In training, a dict of losses.
-            In inference, a list of `Instances`, the predicted instances.
-        """
-        features = [features[f] for f in self.box_in_features]
-        box_features = self.box_pooler(features, [x.proposal_boxes for x in proposals])
-        box_features = self.box_head(box_features)
-        predictions = self.box_predictor(box_features)
-        del box_features
-
-        if self.training:
-            losses = self.box_predictor.losses(predictions, proposals)
-            # proposals is modified in-place below, so losses must be computed first.
-            if self.train_on_pred_boxes:
-                with torch.no_grad():
-                    pred_boxes = self.box_predictor.predict_boxes_for_gt_classes(
-                        predictions, proposals
-                    )
-                    for proposals_per_image, pred_boxes_per_image in zip(proposals, pred_boxes):
-                        proposals_per_image.proposal_boxes = Boxes(pred_boxes_per_image)
-            return losses
-        else:
-            pred_instances, _ = self.box_predictor.inference(predictions, proposals)
-            return pred_instances
-
-    def _forward_mask(self, features: Dict[str, torch.Tensor], instances: List[Instances]):
-        """
-        Forward logic of the mask prediction branch.
-
-        Args:
-            features (dict[str, Tensor]): mapping from feature map names to tensor.
-                Same as in :meth:`ROIHeads.forward`.
-            instances (list[Instances]): the per-image instances to train/predict masks.
-                In training, they can be the proposals.
-                In inference, they can be the boxes predicted by R-CNN box head.
-
-        Returns:
-            In training, a dict of losses.
-            In inference, update `instances` with new fields "pred_masks" and return it.
-        """
-        if not self.mask_on:
-            return {} if self.training else instances
-
-        if self.training:
-            # head is only trained on positive proposals.
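Concretely, "positive" here means ROIs whose matched class is a real foreground class rather than the background label `num_classes`. A small stand-alone sketch of that selection with made-up labels, using essentially the same predicate as `select_foreground_proposals`:

```python
import torch

num_classes = 80                                   # background label == 80
gt_classes = torch.tensor([3, 80, 17, 80, 80, 0])  # per-ROI matched classes

# Keep ROIs that are valid (not -1 / ignored) and not background.
fg_mask = (gt_classes != -1) & (gt_classes != num_classes)
fg_indices = fg_mask.nonzero().squeeze(1)
print(fg_indices.tolist())  # [0, 2, 5] -> only these ROIs reach the mask head
```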
-            instances, _ = select_foreground_proposals(instances, self.num_classes)
-
-        if self.mask_pooler is not None:
-            features = [features[f] for f in self.mask_in_features]
-            boxes = [x.proposal_boxes if self.training else x.pred_boxes for x in instances]
-            features = self.mask_pooler(features, boxes)
-        else:
-            features = {f: features[f] for f in self.mask_in_features}
-        return self.mask_head(features, instances)
-
-    def _forward_keypoint(self, features: Dict[str, torch.Tensor], instances: List[Instances]):
-        """
-        Forward logic of the keypoint prediction branch.
-
-        Args:
-            features (dict[str, Tensor]): mapping from feature map names to tensor.
-                Same as in :meth:`ROIHeads.forward`.
-            instances (list[Instances]): the per-image instances to train/predict keypoints.
-                In training, they can be the proposals.
-                In inference, they can be the boxes predicted by R-CNN box head.
-
-        Returns:
-            In training, a dict of losses.
-            In inference, update `instances` with new fields "pred_keypoints" and return it.
-        """
-        if not self.keypoint_on:
-            return {} if self.training else instances
-
-        if self.training:
-            # head is only trained on positive proposals with >=1 visible keypoints.
-            instances, _ = select_foreground_proposals(instances, self.num_classes)
-            instances = select_proposals_with_visible_keypoints(instances)
-
-        if self.keypoint_pooler is not None:
-            features = [features[f] for f in self.keypoint_in_features]
-            boxes = [x.proposal_boxes if self.training else x.pred_boxes for x in instances]
-            features = self.keypoint_pooler(features, boxes)
-        else:
-            features = {f: features[f] for f in self.keypoint_in_features}
-        return self.keypoint_head(features, instances)
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py
deleted file mode 100755
index b1eedeeb..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py
+++ /dev/null
@@ -1,270 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import logging
-import numpy as np
-import torch
-
-from detectron2.config import configurable
-from detectron2.layers import ShapeSpec, batched_nms_rotated
-from detectron2.structures import Instances, RotatedBoxes, pairwise_iou_rotated
-from detectron2.utils.events import get_event_storage
-
-from ..box_regression import Box2BoxTransformRotated
-from ..poolers import ROIPooler
-from ..proposal_generator.proposal_utils import add_ground_truth_to_proposals
-from .box_head import build_box_head
-from .fast_rcnn import FastRCNNOutputLayers
-from .roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads
-
-logger = logging.getLogger(__name__)
-
-"""
-Shape shorthand in this module:
-
-    N: number of images in the minibatch
-    R: number of ROIs, combined over all images, in the minibatch
-    Ri: number of ROIs in image i
-    K: number of foreground classes. E.g., there are 80 foreground classes in COCO.
-
-Naming convention:
-
-    deltas: refers to the 5-d (dx, dy, dw, dh, da) deltas that parameterize the box2box
-    transform (see :class:`box_regression.Box2BoxTransformRotated`).
-
-    pred_class_logits: predicted class scores in [-inf, +inf]; use
-        softmax(pred_class_logits) to estimate P(class).
-
-    gt_classes: ground-truth classification labels in [0, K], where [0, K) represent
-        foreground object classes and K represents the background class.
-
-    pred_proposal_deltas: predicted rotated box2box transform deltas for transforming proposals
-        to detection box predictions.
-
-    gt_proposal_deltas: ground-truth rotated box2box transform deltas
-"""
-
-
-def fast_rcnn_inference_rotated(
-    boxes, scores, image_shapes, score_thresh, nms_thresh, topk_per_image
-):
-    """
-    Call `fast_rcnn_inference_single_image_rotated` for all images.
-
-    Args:
-        boxes (list[Tensor]): A list of Tensors of predicted class-specific or class-agnostic
-            boxes for each image. Element i has shape (Ri, K * 5) if doing
-            class-specific regression, or (Ri, 5) if doing class-agnostic
-            regression, where Ri is the number of predicted objects for image i.
-            This is compatible with the output of :meth:`FastRCNNOutputLayers.predict_boxes`.
-        scores (list[Tensor]): A list of Tensors of predicted class scores for each image.
-            Element i has shape (Ri, K + 1), where Ri is the number of predicted objects
-            for image i. Compatible with the output of :meth:`FastRCNNOutputLayers.predict_probs`.
-        image_shapes (list[tuple]): A list of (height, width) tuples for each image in the batch.
-        score_thresh (float): Only return detections with a confidence score exceeding this
-            threshold.
-        nms_thresh (float): The threshold to use for box non-maximum suppression. Value in [0, 1].
-        topk_per_image (int): The number of top scoring detections to return. Set < 0 to return
-            all detections.
-
-    Returns:
-        instances: (list[Instances]): A list of N instances, one for each image in the batch,
-            that stores the topk most confident detections.
-        kept_indices: (list[Tensor]): A list of 1D tensor of length of N, each element indicates
-            the corresponding boxes/scores index in [0, Ri) from the input, for image i.
-    """
-    result_per_image = [
-        fast_rcnn_inference_single_image_rotated(
-            boxes_per_image, scores_per_image, image_shape, score_thresh, nms_thresh, topk_per_image
-        )
-        for scores_per_image, boxes_per_image, image_shape in zip(scores, boxes, image_shapes)
-    ]
-    return [x[0] for x in result_per_image], [x[1] for x in result_per_image]
-
-
-def fast_rcnn_inference_single_image_rotated(
-    boxes, scores, image_shape, score_thresh, nms_thresh, topk_per_image
-):
-    """
-    Single-image inference. Return rotated bounding-box detection results by thresholding
-    on scores and applying rotated non-maximum suppression (Rotated NMS).
-
-    Args:
-        Same as `fast_rcnn_inference_rotated`, but with rotated boxes, scores, and image shapes
-        per image.
-
-    Returns:
-        Same as `fast_rcnn_inference_rotated`, but for only one image.
-    """
-    valid_mask = torch.isfinite(boxes).all(dim=1) & torch.isfinite(scores).all(dim=1)
-    if not valid_mask.all():
-        boxes = boxes[valid_mask]
-        scores = scores[valid_mask]
-
-    B = 5  # box dimension
-    scores = scores[:, :-1]
-    num_bbox_reg_classes = boxes.shape[1] // B
-    # Convert to Boxes to use the `clip` function ...
-    boxes = RotatedBoxes(boxes.reshape(-1, B))
-    boxes.clip(image_shape)
-    boxes = boxes.tensor.view(-1, num_bbox_reg_classes, B)  # R x C x B
-    # Filter results based on detection scores
-    filter_mask = scores > score_thresh  # R x K
-    # R' x 2. First column contains indices of the R predictions;
-    # Second column contains indices of classes.
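The shape of that `nonzero()` output is easy to see on a toy score matrix (values made up):

```python
import torch

scores = torch.tensor([[0.90, 0.10],
                       [0.20, 0.70],
                       [0.05, 0.03]])  # R x K class scores
filter_mask = scores > 0.5             # R x K boolean mask
filter_inds = filter_mask.nonzero()    # R' x 2: (box index, class index)
print(filter_inds.tolist())            # [[0, 0], [1, 1]]
```

Each surviving (box, class) pair becomes one detection candidate, which is why the same box can appear several times with different class indices before NMS.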
-    filter_inds = filter_mask.nonzero()
-    if num_bbox_reg_classes == 1:
-        boxes = boxes[filter_inds[:, 0], 0]
-    else:
-        boxes = boxes[filter_mask]
-    scores = scores[filter_mask]
-
-    # Apply per-class Rotated NMS
-    keep = batched_nms_rotated(boxes, scores, filter_inds[:, 1], nms_thresh)
-    if topk_per_image >= 0:
-        keep = keep[:topk_per_image]
-    boxes, scores, filter_inds = boxes[keep], scores[keep], filter_inds[keep]
-
-    result = Instances(image_shape)
-    result.pred_boxes = RotatedBoxes(boxes)
-    result.scores = scores
-    result.pred_classes = filter_inds[:, 1]
-
-    return result, filter_inds[:, 0]
-
-
-class RotatedFastRCNNOutputLayers(FastRCNNOutputLayers):
-    """
-    Two linear layers for predicting Rotated Fast R-CNN outputs.
-    """
-
-    @classmethod
-    def from_config(cls, cfg, input_shape):
-        args = super().from_config(cfg, input_shape)
-        args["box2box_transform"] = Box2BoxTransformRotated(
-            weights=cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS
-        )
-        return args
-
-    def inference(self, predictions, proposals):
-        """
-        Returns:
-            list[Instances]: same as `fast_rcnn_inference_rotated`.
-            list[Tensor]: same as `fast_rcnn_inference_rotated`.
-        """
-        boxes = self.predict_boxes(predictions, proposals)
-        scores = self.predict_probs(predictions, proposals)
-        image_shapes = [x.image_size for x in proposals]
-
-        return fast_rcnn_inference_rotated(
-            boxes,
-            scores,
-            image_shapes,
-            self.test_score_thresh,
-            self.test_nms_thresh,
-            self.test_topk_per_image,
-        )
-
-
-@ROI_HEADS_REGISTRY.register()
-class RROIHeads(StandardROIHeads):
-    """
-    This class is used by Rotated Fast R-CNN to detect rotated boxes.
-    For now, it only supports box predictions but not mask or keypoints.
-    """
-
-    @configurable
-    def __init__(self, **kwargs):
-        """
-        NOTE: this interface is experimental.
-        """
-        super().__init__(**kwargs)
-        assert (
-            not self.mask_on and not self.keypoint_on
-        ), "Mask/Keypoints not supported in Rotated ROIHeads."
-        assert not self.train_on_pred_boxes, "train_on_pred_boxes not implemented for RROIHeads!"
-
-    @classmethod
-    def _init_box_head(cls, cfg, input_shape):
-        # fmt: off
-        in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES
-        pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION
-        pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features)
-        sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO
-        pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE
-        # fmt: on
-        assert pooler_type in ["ROIAlignRotated"], pooler_type
-        # assume all channel counts are equal
-        in_channels = [input_shape[f].channels for f in in_features][0]
-
-        box_pooler = ROIPooler(
-            output_size=pooler_resolution,
-            scales=pooler_scales,
-            sampling_ratio=sampling_ratio,
-            pooler_type=pooler_type,
-        )
-        box_head = build_box_head(
-            cfg, ShapeSpec(channels=in_channels, height=pooler_resolution, width=pooler_resolution)
-        )
-        # This line is the only difference v.s. StandardROIHeads
-        box_predictor = RotatedFastRCNNOutputLayers(cfg, box_head.output_shape)
-        return {
-            "box_in_features": in_features,
-            "box_pooler": box_pooler,
-            "box_head": box_head,
-            "box_predictor": box_predictor,
-        }
-
-    @torch.no_grad()
-    def label_and_sample_proposals(self, proposals, targets):
-        """
-        Prepare some proposals to be used to train the RROI heads.
-        It performs box matching between `proposals` and `targets`, and assigns
-        training labels to the proposals.
-        It returns `self.batch_size_per_image` random samples from proposals and ground-truth boxes,
-        with a fraction of positives that is no larger than `self.positive_fraction`.
-
-        Args:
-            See :meth:`StandardROIHeads.forward`
-
-        Returns:
-            list[Instances]: length `N` list of `Instances`s containing the proposals
-            sampled for training. Each `Instances` has the following fields:
-
-            - proposal_boxes: the rotated proposal boxes
-            - gt_boxes: the ground-truth rotated boxes that the proposal is assigned to
-              (this is only meaningful if the proposal has a label > 0; if label = 0
-              then the ground-truth box is random)
-            - gt_classes: the ground-truth classification label for each proposal
-        """
-        if self.proposal_append_gt:
-            proposals = add_ground_truth_to_proposals(targets, proposals)
-
-        proposals_with_gt = []
-
-        num_fg_samples = []
-        num_bg_samples = []
-        for proposals_per_image, targets_per_image in zip(proposals, targets):
-            has_gt = len(targets_per_image) > 0
-            match_quality_matrix = pairwise_iou_rotated(
-                targets_per_image.gt_boxes, proposals_per_image.proposal_boxes
-            )
-            matched_idxs, matched_labels = self.proposal_matcher(match_quality_matrix)
-            sampled_idxs, gt_classes = self._sample_proposals(
-                matched_idxs, matched_labels, targets_per_image.gt_classes
-            )
-
-            proposals_per_image = proposals_per_image[sampled_idxs]
-            proposals_per_image.gt_classes = gt_classes
-
-            if has_gt:
-                sampled_targets = matched_idxs[sampled_idxs]
-                proposals_per_image.gt_boxes = targets_per_image.gt_boxes[sampled_targets]
-
-            num_bg_samples.append((gt_classes == self.num_classes).sum().item())
-            num_fg_samples.append(gt_classes.numel() - num_bg_samples[-1])
-            proposals_with_gt.append(proposals_per_image)
-
-        # Log the number of fg/bg samples that are selected for training ROI heads
-        storage = get_event_storage()
-        storage.put_scalar("roi_head/num_fg_samples", np.mean(num_fg_samples))
-        storage.put_scalar("roi_head/num_bg_samples", np.mean(num_bg_samples))
-
-        return proposals_with_gt
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/sampling.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/sampling.py
deleted file mode 100755
index a2d0f664..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/sampling.py
+++ /dev/null
@@ -1,54 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import torch
-
-from detectron2.layers import nonzero_tuple
-
-__all__ = ["subsample_labels"]
-
-
-def subsample_labels(
-    labels: torch.Tensor, num_samples: int, positive_fraction: float, bg_label: int
-):
-    """
-    Return `num_samples` (or fewer, if not enough found)
-    random samples from `labels` which is a mixture of positives & negatives.
-    It will try to return as many positives as possible without
-    exceeding `positive_fraction * num_samples`, and then try to
-    fill the remaining slots with negatives.
-
-    Args:
-        labels (Tensor): (N, ) label vector with values:
-            * -1: ignore
-            * bg_label: background ("negative") class
-            * otherwise: one or more foreground ("positive") classes
-        num_samples (int): The total number of labels with value >= 0 to return.
-            Values that are not sampled will be filled with -1 (ignore).
-        positive_fraction (float): The number of subsampled labels with values > 0
-            is `min(num_positives, int(positive_fraction * num_samples))`. The number
-            of negatives sampled is `min(num_negatives, num_samples - num_positives_sampled)`.
-            In other words, if there are not enough positives, the sample is filled with
-            negatives. If there are also not enough negatives, then as many elements are
-            sampled as is possible.
- bg_label (int): label index of background ("negative") class. - - Returns: - pos_idx, neg_idx (Tensor): - 1D vector of indices. The total length of both is `num_samples` or fewer. - """ - positive = nonzero_tuple((labels != -1) & (labels != bg_label))[0] - negative = nonzero_tuple(labels == bg_label)[0] - - num_pos = int(num_samples * positive_fraction) - # protect against not enough positive examples - num_pos = min(positive.numel(), num_pos) - num_neg = num_samples - num_pos - # protect against not enough negative examples - num_neg = min(negative.numel(), num_neg) - - # randomly select positive and negative examples - perm1 = torch.randperm(positive.numel(), device=positive.device)[:num_pos] - perm2 = torch.randperm(negative.numel(), device=negative.device)[:num_neg] - - pos_idx = positive[perm1] - neg_idx = negative[perm2] - return pos_idx, neg_idx diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py deleted file mode 100755 index 373e6bf0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py +++ /dev/null @@ -1,307 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import numpy as np -from contextlib import contextmanager -from itertools import count -from typing import List -import torch -from fvcore.transforms import HFlipTransform, NoOpTransform -from torch import nn -from torch.nn.parallel import DistributedDataParallel - -from detectron2.config import configurable -from detectron2.data.detection_utils import read_image -from detectron2.data.transforms import ( - RandomFlip, - ResizeShortestEdge, - ResizeTransform, - apply_augmentations, -) -from detectron2.structures import Boxes, Instances - -from .meta_arch import GeneralizedRCNN -from .postprocessing import detector_postprocess -from .roi_heads.fast_rcnn import fast_rcnn_inference_single_image - -__all__ = ["DatasetMapperTTA", "GeneralizedRCNNWithTTA"] - - -class DatasetMapperTTA: - """ - Implement test-time augmentation for detection data. - It is a callable which takes a dataset dict from a detection dataset, - and returns a list of dataset dicts where the images - are augmented from the input image by the transformations defined in the config. - This is used for test-time augmentation. - """ - - @configurable - def __init__(self, min_sizes: List[int], max_size: int, flip: bool): - """ - Args: - min_sizes: list of short-edge size to resize the image to - max_size: maximum height or width of resized images - flip: whether to apply flipping augmentation - """ - self.min_sizes = min_sizes - self.max_size = max_size - self.flip = flip - - @classmethod - def from_config(cls, cfg): - return { - "min_sizes": cfg.TEST.AUG.MIN_SIZES, - "max_size": cfg.TEST.AUG.MAX_SIZE, - "flip": cfg.TEST.AUG.FLIP, - } - - def __call__(self, dataset_dict): - """ - Args: - dict: a dict in standard model input format. See tutorials for details. - - Returns: - list[dict]: - a list of dicts, which contain augmented version of the input image. - The total number of dicts is ``len(min_sizes) * (2 if flip else 1)``. - Each dict has field "transforms" which is a TransformList, - containing the transforms that are used to generate this image. 
- """ - numpy_image = dataset_dict["image"].permute(1, 2, 0).numpy() - shape = numpy_image.shape - orig_shape = (dataset_dict["height"], dataset_dict["width"]) - if shape[:2] != orig_shape: - # It transforms the "original" image in the dataset to the input image - pre_tfm = ResizeTransform(orig_shape[0], orig_shape[1], shape[0], shape[1]) - else: - pre_tfm = NoOpTransform() - - # Create all combinations of augmentations to use - aug_candidates = [] # each element is a list[Augmentation] - for min_size in self.min_sizes: - resize = ResizeShortestEdge(min_size, self.max_size) - aug_candidates.append([resize]) # resize only - if self.flip: - flip = RandomFlip(prob=1.0) - aug_candidates.append([resize, flip]) # resize + flip - - # Apply all the augmentations - ret = [] - for aug in aug_candidates: - new_image, tfms = apply_augmentations(aug, np.copy(numpy_image)) - torch_image = torch.from_numpy(np.ascontiguousarray(new_image.transpose(2, 0, 1))) - - dic = copy.deepcopy(dataset_dict) - dic["transforms"] = pre_tfm + tfms - dic["image"] = torch_image - ret.append(dic) - return ret - - -class GeneralizedRCNNWithTTA(nn.Module): - """ - A GeneralizedRCNN with test-time augmentation enabled. - Its :meth:`__call__` method has the same interface as :meth:`GeneralizedRCNN.forward`. - """ - - def __init__(self, cfg, model, tta_mapper=None, batch_size=3): - """ - Args: - cfg (CfgNode): - model (GeneralizedRCNN): a GeneralizedRCNN to apply TTA on. - tta_mapper (callable): takes a dataset dict and returns a list of - augmented versions of the dataset dict. Defaults to - `DatasetMapperTTA(cfg)`. - batch_size (int): batch the augmented images into this batch size for inference. - """ - super().__init__() - if isinstance(model, DistributedDataParallel): - model = model.module - assert isinstance( - model, GeneralizedRCNN - ), "TTA is only supported on GeneralizedRCNN. Got a model of type {}".format(type(model)) - self.cfg = cfg.clone() - assert not self.cfg.MODEL.KEYPOINT_ON, "TTA for keypoint is not supported yet" - assert ( - not self.cfg.MODEL.LOAD_PROPOSALS - ), "TTA for pre-computed proposals is not supported yet" - - self.model = model - - if tta_mapper is None: - tta_mapper = DatasetMapperTTA(cfg) - self.tta_mapper = tta_mapper - self.batch_size = batch_size - - @contextmanager - def _turn_off_roi_heads(self, attrs): - """ - Open a context where some heads in `model.roi_heads` are temporarily turned off. - Args: - attr (list[str]): the attribute in `model.roi_heads` which can be used - to turn off a specific head, e.g., "mask_on", "keypoint_on". - """ - roi_heads = self.model.roi_heads - old = {} - for attr in attrs: - try: - old[attr] = getattr(roi_heads, attr) - except AttributeError: - # The head may not be implemented in certain ROIHeads - pass - - if len(old.keys()) == 0: - yield - else: - for attr in old.keys(): - setattr(roi_heads, attr, False) - yield - for attr in old.keys(): - setattr(roi_heads, attr, old[attr]) - - def _batch_inference(self, batched_inputs, detected_instances=None): - """ - Execute inference on a list of inputs, - using batch size = self.batch_size, instead of the length of the list. 
- - Inputs & outputs have the same format as :meth:`GeneralizedRCNN.inference` - """ - if detected_instances is None: - detected_instances = [None] * len(batched_inputs) - - outputs = [] - inputs, instances = [], [] - for idx, input, instance in zip(count(), batched_inputs, detected_instances): - inputs.append(input) - instances.append(instance) - if len(inputs) == self.batch_size or idx == len(batched_inputs) - 1: - outputs.extend( - self.model.inference( - inputs, - instances if instances[0] is not None else None, - do_postprocess=False, - ) - ) - inputs, instances = [], [] - return outputs - - def __call__(self, batched_inputs): - """ - Same input/output format as :meth:`GeneralizedRCNN.forward` - """ - - def _maybe_read_image(dataset_dict): - ret = copy.copy(dataset_dict) - if "image" not in ret: - image = read_image(ret.pop("file_name"), self.model.input_format) - image = torch.from_numpy(np.ascontiguousarray(image.transpose(2, 0, 1))) # CHW - ret["image"] = image - if "height" not in ret and "width" not in ret: - ret["height"] = image.shape[1] - ret["width"] = image.shape[2] - return ret - - return [self._inference_one_image(_maybe_read_image(x)) for x in batched_inputs] - - def _inference_one_image(self, input): - """ - Args: - input (dict): one dataset dict with "image" field being a CHW tensor - - Returns: - dict: one output dict - """ - orig_shape = (input["height"], input["width"]) - augmented_inputs, tfms = self._get_augmented_inputs(input) - # Detect boxes from all augmented versions - with self._turn_off_roi_heads(["mask_on", "keypoint_on"]): - # temporarily disable roi heads - all_boxes, all_scores, all_classes = self._get_augmented_boxes(augmented_inputs, tfms) - # merge all detected boxes to obtain final predictions for boxes - merged_instances = self._merge_detections(all_boxes, all_scores, all_classes, orig_shape) - - if self.cfg.MODEL.MASK_ON: - # Use the detected boxes to obtain masks - augmented_instances = self._rescale_detected_boxes( - augmented_inputs, merged_instances, tfms - ) - # run forward on the detected boxes - outputs = self._batch_inference(augmented_inputs, augmented_instances) - # Delete now useless variables to avoid being out of memory - del augmented_inputs, augmented_instances - # average the predictions - merged_instances.pred_masks = self._reduce_pred_masks(outputs, tfms) - merged_instances = detector_postprocess(merged_instances, *orig_shape) - return {"instances": merged_instances} - else: - return {"instances": merged_instances} - - def _get_augmented_inputs(self, input): - augmented_inputs = self.tta_mapper(input) - tfms = [x.pop("transforms") for x in augmented_inputs] - return augmented_inputs, tfms - - def _get_augmented_boxes(self, augmented_inputs, tfms): - # 1: forward with all augmented images - outputs = self._batch_inference(augmented_inputs) - # 2: union the results - all_boxes = [] - all_scores = [] - all_classes = [] - for output, tfm in zip(outputs, tfms): - # Need to inverse the transforms on boxes, to obtain results on original image - pred_boxes = output.pred_boxes.tensor - original_pred_boxes = tfm.inverse().apply_box(pred_boxes.cpu().numpy()) - all_boxes.append(torch.from_numpy(original_pred_boxes).to(pred_boxes.device)) - - all_scores.extend(output.scores) - all_classes.extend(output.pred_classes) - all_boxes = torch.cat(all_boxes, dim=0) - return all_boxes, all_scores, all_classes - - def _merge_detections(self, all_boxes, all_scores, all_classes, shape_hw): - # select from the union of all results - num_boxes = 
len(all_boxes) - num_classes = self.cfg.MODEL.ROI_HEADS.NUM_CLASSES - # +1 because fast_rcnn_inference expects background scores as well - all_scores_2d = torch.zeros(num_boxes, num_classes + 1, device=all_boxes.device) - for idx, cls, score in zip(count(), all_classes, all_scores): - all_scores_2d[idx, cls] = score - - merged_instances, _ = fast_rcnn_inference_single_image( - all_boxes, - all_scores_2d, - shape_hw, - 1e-8, - self.cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST, - self.cfg.TEST.DETECTIONS_PER_IMAGE, - ) - - return merged_instances - - def _rescale_detected_boxes(self, augmented_inputs, merged_instances, tfms): - augmented_instances = [] - for input, tfm in zip(augmented_inputs, tfms): - # Transform the target box to the augmented image's coordinate space - pred_boxes = merged_instances.pred_boxes.tensor.cpu().numpy() - pred_boxes = torch.from_numpy(tfm.apply_box(pred_boxes)) - - aug_instances = Instances( - image_size=input["image"].shape[1:3], - pred_boxes=Boxes(pred_boxes), - pred_classes=merged_instances.pred_classes, - scores=merged_instances.scores, - ) - augmented_instances.append(aug_instances) - return augmented_instances - - def _reduce_pred_masks(self, outputs, tfms): - # Should apply inverse transforms on masks. - # We assume only resize & flip are used. pred_masks is a scale-invariant - # representation, so we handle flip specially - for output, tfm in zip(outputs, tfms): - if any(isinstance(t, HFlipTransform) for t in tfm.transforms): - output.pred_masks = output.pred_masks.flip(dims=[3]) - all_pred_masks = torch.stack([o.pred_masks for o in outputs], dim=0) - avg_pred_masks = torch.mean(all_pred_masks, dim=0) - return avg_pred_masks diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/README.md deleted file mode 100755 index 95afe7ff..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/README.md +++ /dev/null @@ -1,2 +0,0 @@ - -Projects live in the [`projects` directory](../../projects) under the root of this repository, but not here. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/__init__.py deleted file mode 100755 index a68207db..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/projects/__init__.py +++ /dev/null @@ -1,31 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
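Looking back at `_reduce_pred_masks` above: the flip-and-average reduction is simple enough to check in isolation. A toy sketch with made-up shapes, assuming one un-flipped and one horizontally flipped TTA pass:

```python
import torch

pred_a = torch.rand(5, 1, 28, 28)  # masks from the un-flipped TTA pass
pred_b = torch.rand(5, 1, 28, 28)  # masks from the horizontally flipped pass

# Undo the horizontal flip along the width axis, then average the passes.
pred_b = pred_b.flip(dims=[3])
avg = torch.stack([pred_a, pred_b], dim=0).mean(dim=0)
print(avg.shape)                   # torch.Size([5, 1, 28, 28])
```

Only the flip needs special handling because mask logits are predicted in a box-relative, scale-invariant grid; resizing the input image does not change that grid, so resize transforms need no inverse here.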
-import importlib -from pathlib import Path - -_PROJECTS = { - "point_rend": "PointRend", - "deeplab": "DeepLab", - "panoptic_deeplab": "Panoptic-DeepLab", -} -_PROJECT_ROOT = Path(__file__).resolve().parent.parent.parent / "projects" - -if _PROJECT_ROOT.is_dir(): - # This is true only for in-place installation (pip install -e, setup.py develop), - # where setup(package_dir=) does not work: https://github.com/pypa/setuptools/issues/230 - - class _D2ProjectsFinder(importlib.abc.MetaPathFinder): - def find_spec(self, name, path, target=None): - if not name.startswith("detectron2.projects."): - return - project_name = name.split(".")[-1] - project_dir = _PROJECTS.get(project_name) - if not project_dir: - return - target_file = _PROJECT_ROOT / f"{project_dir}/{project_name}/__init__.py" - if not target_file.is_file(): - return - return importlib.util.spec_from_file_location(name, target_file) - - import sys - - sys.meta_path.append(_D2ProjectsFinder()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/__init__.py deleted file mode 100755 index 9a2dbd35..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .build import build_lr_scheduler, build_optimizer, get_default_optimizer_params -from .lr_scheduler import WarmupCosineLR, WarmupMultiStepLR, LRMultiplier, WarmupParamScheduler - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/build.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/build.py deleted file mode 100755 index 1989dfcd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/build.py +++ /dev/null @@ -1,285 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import logging -from collections import defaultdict -from enum import Enum -from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Type, Union -import torch -from fvcore.common.param_scheduler import CosineParamScheduler, MultiStepParamScheduler - -from detectron2.config import CfgNode - -from .lr_scheduler import LRMultiplier, WarmupParamScheduler - -_GradientClipperInput = Union[torch.Tensor, Iterable[torch.Tensor]] -_GradientClipper = Callable[[_GradientClipperInput], None] - - -class GradientClipType(Enum): - VALUE = "value" - NORM = "norm" - - -def _create_gradient_clipper(cfg: CfgNode) -> _GradientClipper: - """ - Creates gradient clipping closure to clip by value or by norm, - according to the provided config. 
- """ - cfg = copy.deepcopy(cfg) - - def clip_grad_norm(p: _GradientClipperInput): - torch.nn.utils.clip_grad_norm_(p, cfg.CLIP_VALUE, cfg.NORM_TYPE) - - def clip_grad_value(p: _GradientClipperInput): - torch.nn.utils.clip_grad_value_(p, cfg.CLIP_VALUE) - - _GRADIENT_CLIP_TYPE_TO_CLIPPER = { - GradientClipType.VALUE: clip_grad_value, - GradientClipType.NORM: clip_grad_norm, - } - return _GRADIENT_CLIP_TYPE_TO_CLIPPER[GradientClipType(cfg.CLIP_TYPE)] - - -def _generate_optimizer_class_with_gradient_clipping( - optimizer: Type[torch.optim.Optimizer], - *, - per_param_clipper: Optional[_GradientClipper] = None, - global_clipper: Optional[_GradientClipper] = None, -) -> Type[torch.optim.Optimizer]: - """ - Dynamically creates a new type that inherits the type of a given instance - and overrides the `step` method to add gradient clipping - """ - assert ( - per_param_clipper is None or global_clipper is None - ), "Not allowed to use both per-parameter clipping and global clipping" - - def optimizer_wgc_step(self, closure=None): - if per_param_clipper is not None: - for group in self.param_groups: - for p in group["params"]: - per_param_clipper(p) - else: - # global clipper for future use with detr - # (https://github.com/facebookresearch/detr/pull/287) - all_params = itertools.chain(*[g["params"] for g in self.param_groups]) - global_clipper(all_params) - super(type(self), self).step(closure) - - OptimizerWithGradientClip = type( - optimizer.__name__ + "WithGradientClip", - (optimizer,), - {"step": optimizer_wgc_step}, - ) - return OptimizerWithGradientClip - - -def maybe_add_gradient_clipping( - cfg: CfgNode, optimizer: Type[torch.optim.Optimizer] -) -> Type[torch.optim.Optimizer]: - """ - If gradient clipping is enabled through config options, wraps the existing - optimizer type to become a new dynamically created class OptimizerWithGradientClip - that inherits the given optimizer and overrides the `step` method to - include gradient clipping. - - Args: - cfg: CfgNode, configuration options - optimizer: type. A subclass of torch.optim.Optimizer - - Return: - type: either the input `optimizer` (if gradient clipping is disabled), or - a subclass of it with gradient clipping included in the `step` method. - """ - if not cfg.SOLVER.CLIP_GRADIENTS.ENABLED: - return optimizer - if isinstance(optimizer, torch.optim.Optimizer): - optimizer_type = type(optimizer) - else: - assert issubclass(optimizer, torch.optim.Optimizer), optimizer - optimizer_type = optimizer - - grad_clipper = _create_gradient_clipper(cfg.SOLVER.CLIP_GRADIENTS) - OptimizerWithGradientClip = _generate_optimizer_class_with_gradient_clipping( - optimizer_type, per_param_clipper=grad_clipper - ) - if isinstance(optimizer, torch.optim.Optimizer): - optimizer.__class__ = OptimizerWithGradientClip # a bit hacky, not recommended - return optimizer - else: - return OptimizerWithGradientClip - - -def build_optimizer(cfg: CfgNode, model: torch.nn.Module) -> torch.optim.Optimizer: - """ - Build an optimizer from config. 
- """ - params = get_default_optimizer_params( - model, - base_lr=cfg.SOLVER.BASE_LR, - weight_decay_norm=cfg.SOLVER.WEIGHT_DECAY_NORM, - bias_lr_factor=cfg.SOLVER.BIAS_LR_FACTOR, - weight_decay_bias=cfg.SOLVER.WEIGHT_DECAY_BIAS, - ) - return maybe_add_gradient_clipping(cfg, torch.optim.SGD)( - params, - lr=cfg.SOLVER.BASE_LR, - momentum=cfg.SOLVER.MOMENTUM, - nesterov=cfg.SOLVER.NESTEROV, - weight_decay=cfg.SOLVER.WEIGHT_DECAY, - ) - - -def get_default_optimizer_params( - model: torch.nn.Module, - base_lr: Optional[float] = None, - weight_decay: Optional[float] = None, - weight_decay_norm: Optional[float] = None, - bias_lr_factor: Optional[float] = 1.0, - weight_decay_bias: Optional[float] = None, - overrides: Optional[Dict[str, Dict[str, float]]] = None, -) -> List[Dict[str, Any]]: - """ - Get default param list for optimizer, with support for a few types of - overrides. If no overrides needed, this is equivalent to `model.parameters()`. - - Args: - base_lr: lr for every group by default. Can be omitted to use the one in optimizer. - weight_decay: weight decay for every group by default. Can be omitted to use the one - in optimizer. - weight_decay_norm: override weight decay for params in normalization layers - bias_lr_factor: multiplier of lr for bias parameters. - weight_decay_bias: override weight decay for bias parameters - overrides: if not `None`, provides values for optimizer hyperparameters - (LR, weight decay) for module parameters with a given name; e.g. - ``{"embedding": {"lr": 0.01, "weight_decay": 0.1}}`` will set the LR and - weight decay values for all module parameters named `embedding`. - - For common detection models, ``weight_decay_norm`` is the only option - needed to be set. ``bias_lr_factor,weight_decay_bias`` are legacy settings - from Detectron1 that are not found useful. - - Example: - :: - torch.optim.SGD(get_default_optimizer_params(model, weight_decay_norm=0), - lr=0.01, weight_decay=1e-4, momentum=0.9) - """ - if overrides is None: - overrides = {} - defaults = {} - if base_lr is not None: - defaults["lr"] = base_lr - if weight_decay is not None: - defaults["weight_decay"] = weight_decay - bias_overrides = {} - if bias_lr_factor is not None and bias_lr_factor != 1.0: - # NOTE: unlike Detectron v1, we now by default make bias hyperparameters - # exactly the same as regular weights. 
- if base_lr is None: - raise ValueError("bias_lr_factor requires base_lr") - bias_overrides["lr"] = base_lr * bias_lr_factor - if weight_decay_bias is not None: - bias_overrides["weight_decay"] = weight_decay_bias - if len(bias_overrides): - if "bias" in overrides: - raise ValueError("Conflicting overrides for 'bias'") - overrides["bias"] = bias_overrides - - norm_module_types = ( - torch.nn.BatchNorm1d, - torch.nn.BatchNorm2d, - torch.nn.BatchNorm3d, - torch.nn.SyncBatchNorm, - # NaiveSyncBatchNorm inherits from BatchNorm2d - torch.nn.GroupNorm, - torch.nn.InstanceNorm1d, - torch.nn.InstanceNorm2d, - torch.nn.InstanceNorm3d, - torch.nn.LayerNorm, - torch.nn.LocalResponseNorm, - ) - params: List[Dict[str, Any]] = [] - memo: Set[torch.nn.parameter.Parameter] = set() - for module in model.modules(): - for module_param_name, value in module.named_parameters(recurse=False): - if not value.requires_grad: - continue - # Avoid duplicating parameters - if value in memo: - continue - memo.add(value) - - hyperparams = copy.copy(defaults) - if isinstance(module, norm_module_types) and weight_decay_norm is not None: - hyperparams["weight_decay"] = weight_decay_norm - hyperparams.update(overrides.get(module_param_name, {})) - params.append({"params": [value], **hyperparams}) - return reduce_param_groups(params) - - -def _expand_param_groups(params: List[Dict[str, Any]]) -> List[Dict[str, Any]]: - # Transform parameter groups into per-parameter structure. - # Later items in `params` can overwrite parameters set in previous items. - ret = defaultdict(dict) - for item in params: - assert "params" in item - cur_params = {x: y for x, y in item.items() if x != "params"} - for param in item["params"]: - ret[param].update({"params": [param], **cur_params}) - return list(ret.values()) - - -def reduce_param_groups(params: List[Dict[str, Any]]) -> List[Dict[str, Any]]: - # Reorganize the parameter groups and merge duplicated groups. - # The number of parameter groups needs to be as small as possible in order - # to efficiently use the PyTorch multi-tensor optimizer. Therefore instead - # of using a parameter_group per single parameter, we reorganize the - # parameter groups and merge duplicated groups. This approach speeds - # up multi-tensor optimizer significantly. - params = _expand_param_groups(params) - groups = defaultdict(list) # re-group all parameter groups by their hyperparams - for item in params: - cur_params = tuple((x, y) for x, y in item.items() if x != "params") - groups[cur_params].extend(item["params"]) - ret = [] - for param_keys, param_values in groups.items(): - cur = {kv[0]: kv[1] for kv in param_keys} - cur["params"] = param_values - ret.append(cur) - return ret - - -def build_lr_scheduler( - cfg: CfgNode, optimizer: torch.optim.Optimizer -) -> torch.optim.lr_scheduler._LRScheduler: - """ - Build a LR scheduler from config. - """ - name = cfg.SOLVER.LR_SCHEDULER_NAME - - if name == "WarmupMultiStepLR": - steps = [x for x in cfg.SOLVER.STEPS if x <= cfg.SOLVER.MAX_ITER] - if len(steps) != len(cfg.SOLVER.STEPS): - logger = logging.getLogger(__name__) - logger.warning( - "SOLVER.STEPS contains values larger than SOLVER.MAX_ITER. " - "These values will be ignored." 
- ) - sched = MultiStepParamScheduler( - values=[cfg.SOLVER.GAMMA ** k for k in range(len(steps) + 1)], - milestones=steps, - num_updates=cfg.SOLVER.MAX_ITER, - ) - elif name == "WarmupCosineLR": - sched = CosineParamScheduler(1, 0) - else: - raise ValueError("Unknown LR scheduler: {}".format(name)) - - sched = WarmupParamScheduler( - sched, - cfg.SOLVER.WARMUP_FACTOR, - min(cfg.SOLVER.WARMUP_ITERS / cfg.SOLVER.MAX_ITER, 1.0), - cfg.SOLVER.WARMUP_METHOD, - ) - return LRMultiplier(optimizer, multiplier=sched, max_iter=cfg.SOLVER.MAX_ITER) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/lr_scheduler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/lr_scheduler.py deleted file mode 100755 index 8803e87b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/solver/lr_scheduler.py +++ /dev/null @@ -1,238 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from bisect import bisect_right -from typing import List -import torch -from fvcore.common.param_scheduler import ( - CompositeParamScheduler, - ConstantParamScheduler, - LinearParamScheduler, - ParamScheduler, -) - -logger = logging.getLogger(__name__) - - -class WarmupParamScheduler(CompositeParamScheduler): - """ - Add an initial warmup stage to another scheduler. - """ - - def __init__( - self, - scheduler: ParamScheduler, - warmup_factor: float, - warmup_length: float, - warmup_method: str = "linear", - ): - """ - Args: - scheduler: warmup will be added at the beginning of this scheduler - warmup_factor: the factor w.r.t the initial value of ``scheduler``, e.g. 0.001 - warmup_length: the relative length (in [0, 1]) of warmup steps w.r.t the entire - training, e.g. 0.01 - warmup_method: one of "linear" or "constant" - """ - end_value = scheduler(warmup_length) # the value to reach when warmup ends - start_value = warmup_factor * scheduler(0.0) - if warmup_method == "constant": - warmup = ConstantParamScheduler(start_value) - elif warmup_method == "linear": - warmup = LinearParamScheduler(start_value, end_value) - else: - raise ValueError("Unknown warmup method: {}".format(warmup_method)) - super().__init__( - [warmup, scheduler], - interval_scaling=["rescaled", "fixed"], - lengths=[warmup_length, 1 - warmup_length], - ) - - -class LRMultiplier(torch.optim.lr_scheduler._LRScheduler): - """ - A LRScheduler which uses fvcore :class:`ParamScheduler` to multiply the - learning rate of each param in the optimizer. - Every step, the learning rate of each parameter becomes its initial value - multiplied by the output of the given :class:`ParamScheduler`. - - The absolute learning rate value of each parameter can be different. - This scheduler can be used as long as the relative scale among them do - not change during training. - - Examples: - :: - LRMultiplier( - opt, - WarmupParamScheduler( - MultiStepParamScheduler( - [1, 0.1, 0.01], - milestones=[60000, 80000], - num_updates=90000, - ), 0.001, 100 / 90000 - ), - max_iter=90000 - ) - """ - - # NOTES: in the most general case, every LR can use its own scheduler. - # Supporting this requires interaction with the optimizer when its parameter - # group is initialized. For example, classyvision implements its own optimizer - # that allows different schedulers for every parameter group. - # To avoid this complexity, we use this class to support the most common cases - # where the relative scale among all LRs stay unchanged during training. 
In this
-    # case we only need a total of one scheduler that defines the relative LR multiplier.
-
-    def __init__(
-        self,
-        optimizer: torch.optim.Optimizer,
-        multiplier: ParamScheduler,
-        max_iter: int,
-        last_iter: int = -1,
-    ):
-        """
-        Args:
-            optimizer, last_iter: See ``torch.optim.lr_scheduler._LRScheduler``.
-                ``last_iter`` is the same as ``last_epoch``.
-            multiplier: a fvcore ParamScheduler that defines the multiplier on
-                every LR of the optimizer
-            max_iter: the total number of training iterations
-        """
-        if not isinstance(multiplier, ParamScheduler):
-            raise ValueError(
-                "_LRMultiplier(multiplier=) must be an instance of fvcore "
-                f"ParamScheduler. Got {multiplier} instead."
-            )
-        self._multiplier = multiplier
-        self._max_iter = max_iter
-        super().__init__(optimizer, last_epoch=last_iter)
-
-    def state_dict(self):
-        # fvcore schedulers are stateless. Only keep pytorch scheduler states
-        return {"base_lrs": self.base_lrs, "last_epoch": self.last_epoch}
-
-    def get_lr(self) -> List[float]:
-        multiplier = self._multiplier(self.last_epoch / self._max_iter)
-        return [base_lr * multiplier for base_lr in self.base_lrs]
-
-
-"""
-Content below is no longer needed!
-"""
-
-# NOTE: PyTorch's LR scheduler interface uses names that assume the LR changes
-# only on epoch boundaries. We typically use iteration based schedules instead.
-# As a result, "epoch" (e.g., as in self.last_epoch) should be understood to mean
-# "iteration" instead.
-
-# FIXME: ideally this would be achieved with a CombinedLRScheduler, separating
-# MultiStepLR from WarmupLR, but the current LRScheduler design doesn't allow it.
-
-
-class WarmupMultiStepLR(torch.optim.lr_scheduler._LRScheduler):
-    def __init__(
-        self,
-        optimizer: torch.optim.Optimizer,
-        milestones: List[int],
-        gamma: float = 0.1,
-        warmup_factor: float = 0.001,
-        warmup_iters: int = 1000,
-        warmup_method: str = "linear",
-        last_epoch: int = -1,
-    ):
-        logger.warning(
-            "WarmupMultiStepLR is deprecated! Use LRMultiplier with fvcore ParamScheduler instead!"
-        )
-        if not list(milestones) == sorted(milestones):
-            raise ValueError(
-                "Milestones should be a list of increasing integers. Got {}".format(milestones)
-            )
-        self.milestones = milestones
-        self.gamma = gamma
-        self.warmup_factor = warmup_factor
-        self.warmup_iters = warmup_iters
-        self.warmup_method = warmup_method
-        super().__init__(optimizer, last_epoch)
-
-    def get_lr(self) -> List[float]:
-        warmup_factor = _get_warmup_factor_at_iter(
-            self.warmup_method, self.last_epoch, self.warmup_iters, self.warmup_factor
-        )
-        return [
-            base_lr * warmup_factor * self.gamma ** bisect_right(self.milestones, self.last_epoch)
-            for base_lr in self.base_lrs
-        ]
-
-    def _compute_values(self) -> List[float]:
-        # The new interface
-        return self.get_lr()
-
-
-class WarmupCosineLR(torch.optim.lr_scheduler._LRScheduler):
-    def __init__(
-        self,
-        optimizer: torch.optim.Optimizer,
-        max_iters: int,
-        warmup_factor: float = 0.001,
-        warmup_iters: int = 1000,
-        warmup_method: str = "linear",
-        last_epoch: int = -1,
-    ):
-        logger.warning(
-            "WarmupCosineLR is deprecated! Use LRMultiplier with fvcore ParamScheduler instead!"
- ) - self.max_iters = max_iters - self.warmup_factor = warmup_factor - self.warmup_iters = warmup_iters - self.warmup_method = warmup_method - super().__init__(optimizer, last_epoch) - - def get_lr(self) -> List[float]: - warmup_factor = _get_warmup_factor_at_iter( - self.warmup_method, self.last_epoch, self.warmup_iters, self.warmup_factor - ) - # Different definitions of half-cosine with warmup are possible. For - # simplicity we multiply the standard half-cosine schedule by the warmup - # factor. An alternative is to start the period of the cosine at warmup_iters - # instead of at 0. In the case that warmup_iters << max_iters the two are - # very close to each other. - return [ - base_lr - * warmup_factor - * 0.5 - * (1.0 + math.cos(math.pi * self.last_epoch / self.max_iters)) - for base_lr in self.base_lrs - ] - - def _compute_values(self) -> List[float]: - # The new interface - return self.get_lr() - - -def _get_warmup_factor_at_iter( - method: str, iter: int, warmup_iters: int, warmup_factor: float -) -> float: - """ - Return the learning rate warmup factor at a specific iteration. - See :paper:`ImageNet in 1h` for more details. - - Args: - method (str): warmup method; either "constant" or "linear". - iter (int): iteration at which to calculate the warmup factor. - warmup_iters (int): the number of warmup iterations. - warmup_factor (float): the base warmup factor (the meaning changes according - to the method used). - - Returns: - float: the effective warmup factor at the given iteration. - """ - if iter >= warmup_iters: - return 1.0 - - if method == "constant": - return warmup_factor - elif method == "linear": - alpha = iter / warmup_iters - return warmup_factor * (1 - alpha) + alpha - else: - raise ValueError("Unknown warmup method: {}".format(method)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/__init__.py deleted file mode 100755 index f3ee6057..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .boxes import Boxes, BoxMode, pairwise_iou, pairwise_ioa, pairwise_point_box_distance -from .image_list import ImageList - -from .instances import Instances -from .keypoints import Keypoints, heatmaps_to_keypoints -from .masks import BitMasks, PolygonMasks, polygons_to_bitmask, ROIMasks -from .rotated_boxes import RotatedBoxes -from .rotated_boxes import pairwise_iou as pairwise_iou_rotated - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/boxes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/boxes.py deleted file mode 100755 index ae543c61..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/boxes.py +++ /dev/null @@ -1,423 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
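The `_get_warmup_factor_at_iter` helper deleted above reduces to a few lines of arithmetic. An illustrative restatement with a worked value (not the original function, just its math):

```python
def warmup_factor(method: str, it: int, warmup_iters: int, base: float) -> float:
    """Restatement of _get_warmup_factor_at_iter (illustrative only)."""
    if it >= warmup_iters:
        return 1.0  # warmup finished
    if method == "constant":
        return base
    alpha = it / warmup_iters  # linear interpolation from `base` to 1.0
    return base * (1 - alpha) + alpha

# Halfway through a 1000-iter linear warmup with base 0.001:
assert abs(warmup_factor("linear", 500, 1000, 0.001) - 0.5005) < 1e-9
```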
-import math -import numpy as np -from enum import IntEnum, unique -from typing import List, Tuple, Union -import torch -from torch import device - -_RawBoxType = Union[List[float], Tuple[float, ...], torch.Tensor, np.ndarray] - - -@unique -class BoxMode(IntEnum): - """ - Enum of different ways to represent a box. - """ - - XYXY_ABS = 0 - """ - (x0, y0, x1, y1) in absolute floating points coordinates. - The coordinates in range [0, width or height]. - """ - XYWH_ABS = 1 - """ - (x0, y0, w, h) in absolute floating points coordinates. - """ - XYXY_REL = 2 - """ - Not yet supported! - (x0, y0, x1, y1) in range [0, 1]. They are relative to the size of the image. - """ - XYWH_REL = 3 - """ - Not yet supported! - (x0, y0, w, h) in range [0, 1]. They are relative to the size of the image. - """ - XYWHA_ABS = 4 - """ - (xc, yc, w, h, a) in absolute floating points coordinates. - (xc, yc) is the center of the rotated box, and the angle a is in degrees ccw. - """ - - @staticmethod - def convert(box: _RawBoxType, from_mode: "BoxMode", to_mode: "BoxMode") -> _RawBoxType: - """ - Args: - box: can be a k-tuple, k-list or an Nxk array/tensor, where k = 4 or 5 - from_mode, to_mode (BoxMode) - - Returns: - The converted box of the same type. - """ - if from_mode == to_mode: - return box - - original_type = type(box) - is_numpy = isinstance(box, np.ndarray) - single_box = isinstance(box, (list, tuple)) - if single_box: - assert len(box) == 4 or len(box) == 5, ( - "BoxMode.convert takes either a k-tuple/list or an Nxk array/tensor," - " where k == 4 or 5" - ) - arr = torch.tensor(box)[None, :] - else: - # avoid modifying the input box - if is_numpy: - arr = torch.from_numpy(np.asarray(box)).clone() - else: - arr = box.clone() - - assert to_mode not in [BoxMode.XYXY_REL, BoxMode.XYWH_REL] and from_mode not in [ - BoxMode.XYXY_REL, - BoxMode.XYWH_REL, - ], "Relative mode not yet supported!" - - if from_mode == BoxMode.XYWHA_ABS and to_mode == BoxMode.XYXY_ABS: - assert ( - arr.shape[-1] == 5 - ), "The last dimension of input shape must be 5 for XYWHA format" - original_dtype = arr.dtype - arr = arr.double() - - w = arr[:, 2] - h = arr[:, 3] - a = arr[:, 4] - c = torch.abs(torch.cos(a * math.pi / 180.0)) - s = torch.abs(torch.sin(a * math.pi / 180.0)) - # This basically computes the horizontal bounding rectangle of the rotated box - new_w = c * w + s * h - new_h = c * h + s * w - - # convert center to top-left corner - arr[:, 0] -= new_w / 2.0 - arr[:, 1] -= new_h / 2.0 - # bottom-right corner - arr[:, 2] = arr[:, 0] + new_w - arr[:, 3] = arr[:, 1] + new_h - - arr = arr[:, :4].to(dtype=original_dtype) - elif from_mode == BoxMode.XYWH_ABS and to_mode == BoxMode.XYWHA_ABS: - original_dtype = arr.dtype - arr = arr.double() - arr[:, 0] += arr[:, 2] / 2.0 - arr[:, 1] += arr[:, 3] / 2.0 - angles = torch.zeros((arr.shape[0], 1), dtype=arr.dtype) - arr = torch.cat((arr, angles), axis=1).to(dtype=original_dtype) - else: - if to_mode == BoxMode.XYXY_ABS and from_mode == BoxMode.XYWH_ABS: - arr[:, 2] += arr[:, 0] - arr[:, 3] += arr[:, 1] - elif from_mode == BoxMode.XYXY_ABS and to_mode == BoxMode.XYWH_ABS: - arr[:, 2] -= arr[:, 0] - arr[:, 3] -= arr[:, 1] - else: - raise NotImplementedError( - "Conversion from BoxMode {} to {} is not supported yet".format( - from_mode, to_mode - ) - ) - - if single_box: - return original_type(arr.flatten().tolist()) - if is_numpy: - return arr.numpy() - else: - return arr - - -class Boxes: - """ - This structure stores a list of boxes as a Nx4 torch.Tensor. 
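A small usage sketch of the `BoxMode.convert` path above for the common XYWH to XYXY case; the box values are illustrative:

```python
# (x0, y0, w, h) -> (x0, y0, x1, y1): width/height are added to the top-left corner.
box_xyxy = BoxMode.convert([10.0, 20.0, 30.0, 40.0], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS)
assert box_xyxy == [10.0, 20.0, 40.0, 60.0]  # returned in the input's type (list)
```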
- It supports some common methods about boxes - (`area`, `clip`, `nonempty`, etc), - and also behaves like a Tensor - (support indexing, `to(device)`, `.device`, and iteration over all boxes) - - Attributes: - tensor (torch.Tensor): float matrix of Nx4. Each row is (x1, y1, x2, y2). - """ - - def __init__(self, tensor: torch.Tensor): - """ - Args: - tensor (Tensor[float]): a Nx4 matrix. Each row is (x1, y1, x2, y2). - """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.float32, device=device) - if tensor.numel() == 0: - # Use reshape, so we don't end up creating a new tensor that does not depend on - # the inputs (and consequently confuses jit) - tensor = tensor.reshape((-1, 4)).to(dtype=torch.float32, device=device) - assert tensor.dim() == 2 and tensor.size(-1) == 4, tensor.size() - - self.tensor = tensor - - def clone(self) -> "Boxes": - """ - Clone the Boxes. - - Returns: - Boxes - """ - return Boxes(self.tensor.clone()) - - def to(self, device: torch.device): - # Boxes are assumed float32 and does not support to(dtype) - return Boxes(self.tensor.to(device=device)) - - def area(self) -> torch.Tensor: - """ - Computes the area of all the boxes. - - Returns: - torch.Tensor: a vector with areas of each box. - """ - box = self.tensor - area = (box[:, 2] - box[:, 0]) * (box[:, 3] - box[:, 1]) - return area - - def clip(self, box_size: Tuple[int, int]) -> None: - """ - Clip (in place) the boxes by limiting x coordinates to the range [0, width] - and y coordinates to the range [0, height]. - - Args: - box_size (height, width): The clipping box's size. - """ - assert torch.isfinite(self.tensor).all(), "Box tensor contains infinite or NaN!" - h, w = box_size - x1 = self.tensor[:, 0].clamp(min=0, max=w) - y1 = self.tensor[:, 1].clamp(min=0, max=h) - x2 = self.tensor[:, 2].clamp(min=0, max=w) - y2 = self.tensor[:, 3].clamp(min=0, max=h) - self.tensor = torch.stack((x1, y1, x2, y2), dim=-1) - - def nonempty(self, threshold: float = 0.0) -> torch.Tensor: - """ - Find boxes that are non-empty. - A box is considered empty, if either of its side is no larger than threshold. - - Returns: - Tensor: - a binary vector which represents whether each box is empty - (False) or non-empty (True). - """ - box = self.tensor - widths = box[:, 2] - box[:, 0] - heights = box[:, 3] - box[:, 1] - keep = (widths > threshold) & (heights > threshold) - return keep - - def __getitem__(self, item) -> "Boxes": - """ - Args: - item: int, slice, or a BoolTensor - - Returns: - Boxes: Create a new :class:`Boxes` by indexing. - - The following usage are allowed: - - 1. `new_boxes = boxes[3]`: return a `Boxes` which contains only one box. - 2. `new_boxes = boxes[2:10]`: return a slice of boxes. - 3. `new_boxes = boxes[vector]`, where vector is a torch.BoolTensor - with `length = len(boxes)`. Nonzero elements in the vector will be selected. - - Note that the returned Boxes might share storage with this Boxes, - subject to Pytorch's indexing semantics. 
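A short sketch of the `Boxes` methods above: clipping to an image and filtering degenerate boxes (the numbers are illustrative):

```python
import torch

boxes = Boxes(torch.tensor([
    [0.0, 0.0, 10.0, 10.0],   # valid, but extends past the 8x8 image
    [3.0, 3.0, 3.0, 8.0],     # zero width -> empty
]))
boxes.clip((8, 8))            # in place: coordinates clamped into [0, 8]
keep = boxes.nonempty()       # tensor([True, False])
boxes = boxes[keep]           # indexing returns a new Boxes of length 1
```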
- """ - if isinstance(item, int): - return Boxes(self.tensor[item].view(1, -1)) - b = self.tensor[item] - assert b.dim() == 2, "Indexing on Boxes with {} failed to return a matrix!".format(item) - return Boxes(b) - - def __len__(self) -> int: - return self.tensor.shape[0] - - def __repr__(self) -> str: - return "Boxes(" + str(self.tensor) + ")" - - def inside_box(self, box_size: Tuple[int, int], boundary_threshold: int = 0) -> torch.Tensor: - """ - Args: - box_size (height, width): Size of the reference box. - boundary_threshold (int): Boxes that extend beyond the reference box - boundary by more than boundary_threshold are considered "outside". - - Returns: - a binary vector, indicating whether each box is inside the reference box. - """ - height, width = box_size - inds_inside = ( - (self.tensor[..., 0] >= -boundary_threshold) - & (self.tensor[..., 1] >= -boundary_threshold) - & (self.tensor[..., 2] < width + boundary_threshold) - & (self.tensor[..., 3] < height + boundary_threshold) - ) - return inds_inside - - def get_centers(self) -> torch.Tensor: - """ - Returns: - The box centers in a Nx2 array of (x, y). - """ - return (self.tensor[:, :2] + self.tensor[:, 2:]) / 2 - - def scale(self, scale_x: float, scale_y: float) -> None: - """ - Scale the box with horizontal and vertical scaling factors - """ - self.tensor[:, 0::2] *= scale_x - self.tensor[:, 1::2] *= scale_y - - @classmethod - def cat(cls, boxes_list: List["Boxes"]) -> "Boxes": - """ - Concatenates a list of Boxes into a single Boxes - - Arguments: - boxes_list (list[Boxes]) - - Returns: - Boxes: the concatenated Boxes - """ - assert isinstance(boxes_list, (list, tuple)) - if len(boxes_list) == 0: - return cls(torch.empty(0)) - assert all([isinstance(box, Boxes) for box in boxes_list]) - - # use torch.cat (v.s. layers.cat) so the returned boxes never share storage with input - cat_boxes = cls(torch.cat([b.tensor for b in boxes_list], dim=0)) - return cat_boxes - - @property - def device(self) -> device: - return self.tensor.device - - # type "Iterator[torch.Tensor]", yield, and iter() not supported by torchscript - # https://github.com/pytorch/pytorch/issues/18627 - @torch.jit.unused - def __iter__(self): - """ - Yield a box as a Tensor of shape (4,) at a time. - """ - yield from self.tensor - - -def pairwise_intersection(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Given two lists of boxes of size N and M, - compute the intersection area between __all__ N x M pairs of boxes. - The box order must be (xmin, ymin, xmax, ymax) - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: intersection, sized [N,M]. - """ - boxes1, boxes2 = boxes1.tensor, boxes2.tensor - width_height = torch.min(boxes1[:, None, 2:], boxes2[:, 2:]) - torch.max( - boxes1[:, None, :2], boxes2[:, :2] - ) # [N,M,2] - - width_height.clamp_(min=0) # [N,M,2] - intersection = width_height.prod(dim=2) # [N,M] - return intersection - - -# implementation from https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py -# with slight modifications -def pairwise_iou(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Given two lists of boxes of size N and M, compute the IoU - (intersection over union) between **all** N x M pairs of boxes. - The box order must be (xmin, ymin, xmax, ymax). - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: IoU, sized [N,M]. 
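Worked numbers for the pairwise IoU defined above; the [N, M] broadcasting is the key point:

```python
import torch

a = Boxes(torch.tensor([[0.0, 0.0, 10.0, 10.0]]))                            # N = 1
b = Boxes(torch.tensor([[5.0, 5.0, 15.0, 15.0], [20.0, 20.0, 30.0, 30.0]]))  # M = 2
iou = pairwise_iou(a, b)  # shape [1, 2]
# pair 1: intersection = 5*5 = 25, union = 100 + 100 - 25 = 175 -> 25/175 ~ 0.1429
# pair 2: disjoint boxes -> 0.0
```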
- """ - area1 = boxes1.area() # [N] - area2 = boxes2.area() # [M] - inter = pairwise_intersection(boxes1, boxes2) - - # handle empty boxes - iou = torch.where( - inter > 0, - inter / (area1[:, None] + area2 - inter), - torch.zeros(1, dtype=inter.dtype, device=inter.device), - ) - return iou - - -def pairwise_ioa(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Similar to :func:`pariwise_iou` but compute the IoA (intersection over boxes2 area). - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: IoA, sized [N,M]. - """ - area2 = boxes2.area() # [M] - inter = pairwise_intersection(boxes1, boxes2) - - # handle empty boxes - ioa = torch.where( - inter > 0, inter / area2, torch.zeros(1, dtype=inter.dtype, device=inter.device) - ) - return ioa - - -def pairwise_point_box_distance(points: torch.Tensor, boxes: Boxes): - """ - Pairwise distance between N points and M boxes. The distance between a - point and a box is represented by the distance from the point to 4 edges - of the box. Distances are all positive when the point is inside the box. - - Args: - points: Nx2 coordinates. Each row is (x, y) - boxes: M boxes - - Returns: - Tensor: distances of size (N, M, 4). The 4 values are distances from - the point to the left, top, right, bottom of the box. - """ - x, y = points.unsqueeze(dim=2).unbind(dim=1) # (N, 1) - x0, y0, x1, y1 = boxes.tensor.unsqueeze(dim=0).unbind(dim=2) # (1, M) - return torch.stack([x - x0, y - y0, x1 - x, y1 - y], dim=2) - - -def matched_pairwise_iou(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Compute pairwise intersection over union (IOU) of two sets of matched - boxes that have the same number of boxes. - Similar to :func:`pairwise_iou`, but computes only diagonal elements of the matrix. - - Args: - boxes1 (Boxes): bounding boxes, sized [N,4]. - boxes2 (Boxes): same length as boxes1 - Returns: - Tensor: iou, sized [N]. - """ - assert len(boxes1) == len( - boxes2 - ), "boxlists should have the same" "number of entries, got {}, {}".format( - len(boxes1), len(boxes2) - ) - area1 = boxes1.area() # [N] - area2 = boxes2.area() # [N] - box1, box2 = boxes1.tensor, boxes2.tensor - lt = torch.max(box1[:, :2], box2[:, :2]) # [N,2] - rb = torch.min(box1[:, 2:], box2[:, 2:]) # [N,2] - wh = (rb - lt).clamp(min=0) # [N,2] - inter = wh[:, 0] * wh[:, 1] # [N] - iou = inter / (area1 + area2 - inter) # [N] - return iou diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/image_list.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/image_list.py deleted file mode 100755 index b31b2d39..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/image_list.py +++ /dev/null @@ -1,110 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import division -from typing import Any, List, Tuple -import torch -from torch import device -from torch.nn import functional as F - -from detectron2.layers.wrappers import shapes_to_tensor - - -class ImageList(object): - """ - Structure that holds a list of images (of possibly - varying sizes) as a single tensor. - This works by padding the images to the same size. - The original sizes of each image is stored in `image_sizes`. - - Attributes: - image_sizes (list[tuple[int, int]]): each tuple is (h, w). - During tracing, it becomes list[Tensor] instead. 
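`matched_pairwise_iou` above computes only the diagonal of that [N, M] matrix in O(N); a worked pair:

```python
import torch

b1 = Boxes(torch.tensor([[0.0, 0.0, 4.0, 4.0], [0.0, 0.0, 2.0, 2.0]]))
b2 = Boxes(torch.tensor([[2.0, 2.0, 6.0, 6.0], [0.0, 0.0, 2.0, 2.0]]))
iou = matched_pairwise_iou(b1, b2)  # shape [2]
# element 0: intersection = 2*2 = 4, union = 16 + 16 - 4 = 28 -> 4/28 ~ 0.1429
# element 1: identical boxes -> 1.0
```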
- """ - - def __init__(self, tensor: torch.Tensor, image_sizes: List[Tuple[int, int]]): - """ - Arguments: - tensor (Tensor): of shape (N, H, W) or (N, C_1, ..., C_K, H, W) where K >= 1 - image_sizes (list[tuple[int, int]]): Each tuple is (h, w). It can - be smaller than (H, W) due to padding. - """ - self.tensor = tensor - self.image_sizes = image_sizes - - def __len__(self) -> int: - return len(self.image_sizes) - - def __getitem__(self, idx) -> torch.Tensor: - """ - Access the individual image in its original size. - - Args: - idx: int or slice - - Returns: - Tensor: an image of shape (H, W) or (C_1, ..., C_K, H, W) where K >= 1 - """ - size = self.image_sizes[idx] - return self.tensor[idx, ..., : size[0], : size[1]] - - @torch.jit.unused - def to(self, *args: Any, **kwargs: Any) -> "ImageList": - cast_tensor = self.tensor.to(*args, **kwargs) - return ImageList(cast_tensor, self.image_sizes) - - @property - def device(self) -> device: - return self.tensor.device - - @staticmethod - def from_tensors( - tensors: List[torch.Tensor], size_divisibility: int = 0, pad_value: float = 0.0 - ) -> "ImageList": - """ - Args: - tensors: a tuple or list of `torch.Tensor`, each of shape (Hi, Wi) or - (C_1, ..., C_K, Hi, Wi) where K >= 1. The Tensors will be padded - to the same shape with `pad_value`. - size_divisibility (int): If `size_divisibility > 0`, add padding to ensure - the common height and width is divisible by `size_divisibility`. - This depends on the model and many models need a divisibility of 32. - pad_value (float): value to pad - - Returns: - an `ImageList`. - """ - assert len(tensors) > 0 - assert isinstance(tensors, (tuple, list)) - for t in tensors: - assert isinstance(t, torch.Tensor), type(t) - assert t.shape[:-2] == tensors[0].shape[:-2], t.shape - - image_sizes = [(im.shape[-2], im.shape[-1]) for im in tensors] - image_sizes_tensor = [shapes_to_tensor(x) for x in image_sizes] - max_size = torch.stack(image_sizes_tensor).max(0).values - - if size_divisibility > 1: - stride = size_divisibility - # the last two dims are H,W, both subject to divisibility requirement - max_size = (max_size + (stride - 1)).div(stride, rounding_mode="floor") * stride - - # handle weirdness of scripting and tracing ... - if torch.jit.is_scripting(): - max_size: List[int] = max_size.to(dtype=torch.long).tolist() - else: - if torch.jit.is_tracing(): - image_sizes = image_sizes_tensor - - if len(tensors) == 1: - # This seems slightly (2%) faster. - # TODO: check whether it's faster for multiple images as well - image_size = image_sizes[0] - padding_size = [0, max_size[-1] - image_size[1], 0, max_size[-2] - image_size[0]] - batched_imgs = F.pad(tensors[0], padding_size, value=pad_value).unsqueeze_(0) - else: - # max_size can be a tensor in tracing mode, therefore convert to list - batch_shape = [len(tensors)] + list(tensors[0].shape[:-2]) + list(max_size) - batched_imgs = tensors[0].new_full(batch_shape, pad_value) - for img, pad_img in zip(tensors, batched_imgs): - pad_img[..., : img.shape[-2], : img.shape[-1]].copy_(img) - - return ImageList(batched_imgs.contiguous(), image_sizes) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/instances.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/instances.py deleted file mode 100755 index 612e66f5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/instances.py +++ /dev/null @@ -1,192 +0,0 @@ -# Copyright (c) Facebook, Inc. 
and its affiliates. -import itertools -from typing import Any, Dict, List, Tuple, Union -import torch - - -class Instances: - """ - This class represents a list of instances in an image. - It stores the attributes of instances (e.g., boxes, masks, labels, scores) as "fields". - All fields must have the same ``__len__`` which is the number of instances. - - All other (non-field) attributes of this class are considered private: - they must start with '_' and are not modifiable by a user. - - Some basic usage: - - 1. Set/get/check a field: - - .. code-block:: python - - instances.gt_boxes = Boxes(...) - print(instances.pred_masks) # a tensor of shape (N, H, W) - print('gt_masks' in instances) - - 2. ``len(instances)`` returns the number of instances - 3. Indexing: ``instances[indices]`` will apply the indexing on all the fields - and returns a new :class:`Instances`. - Typically, ``indices`` is a integer vector of indices, - or a binary mask of length ``num_instances`` - - .. code-block:: python - - category_3_detections = instances[instances.pred_classes == 3] - confident_detections = instances[instances.scores > 0.9] - """ - - def __init__(self, image_size: Tuple[int, int], **kwargs: Any): - """ - Args: - image_size (height, width): the spatial size of the image. - kwargs: fields to add to this `Instances`. - """ - self._image_size = image_size - self._fields: Dict[str, Any] = {} - for k, v in kwargs.items(): - self.set(k, v) - - @property - def image_size(self) -> Tuple[int, int]: - """ - Returns: - tuple: height, width - """ - return self._image_size - - def __setattr__(self, name: str, val: Any) -> None: - if name.startswith("_"): - super().__setattr__(name, val) - else: - self.set(name, val) - - def __getattr__(self, name: str) -> Any: - if name == "_fields" or name not in self._fields: - raise AttributeError("Cannot find field '{}' in the given Instances!".format(name)) - return self._fields[name] - - def set(self, name: str, value: Any) -> None: - """ - Set the field named `name` to `value`. - The length of `value` must be the number of instances, - and must agree with other existing fields in this object. - """ - data_len = len(value) - if len(self._fields): - assert ( - len(self) == data_len - ), "Adding a field of length {} to a Instances of length {}".format(data_len, len(self)) - self._fields[name] = value - - def has(self, name: str) -> bool: - """ - Returns: - bool: whether the field called `name` exists. - """ - return name in self._fields - - def remove(self, name: str) -> None: - """ - Remove the field called `name`. - """ - del self._fields[name] - - def get(self, name: str) -> Any: - """ - Returns the field called `name`. - """ - return self._fields[name] - - def get_fields(self) -> Dict[str, Any]: - """ - Returns: - dict: a dict which maps names (str) to data of the fields - - Modifying the returned dict will modify this instance. - """ - return self._fields - - # Tensor-like methods - def to(self, *args: Any, **kwargs: Any) -> "Instances": - """ - Returns: - Instances: all fields are called with a `to(device)`, if the field has this method. - """ - ret = Instances(self._image_size) - for k, v in self._fields.items(): - if hasattr(v, "to"): - v = v.to(*args, **kwargs) - ret.set(k, v) - return ret - - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "Instances": - """ - Args: - item: an index-like object and will be used to index all the fields. - - Returns: - If `item` is a string, return the data in the corresponding field. 
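A sketch of the field discipline and indexing described in the `Instances` docstring above; the field names here (`scores`, `pred_classes`) are just the conventional examples from that docstring:

```python
import torch

inst = Instances((480, 640))                   # (height, width)
inst.scores = torch.tensor([0.9, 0.3, 0.7])    # every field must share one length
inst.pred_classes = torch.tensor([1, 2, 1])
confident = inst[inst.scores > 0.5]            # boolean indexing hits all fields
assert len(confident) == 2
```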
- Otherwise, returns an `Instances` where all fields are indexed by `item`. - """ - if type(item) == int: - if item >= len(self) or item < -len(self): - raise IndexError("Instances index out of range!") - else: - item = slice(item, None, len(self)) - - ret = Instances(self._image_size) - for k, v in self._fields.items(): - ret.set(k, v[item]) - return ret - - def __len__(self) -> int: - for v in self._fields.values(): - # use __len__ because len() has to be int and is not friendly to tracing - return v.__len__() - raise NotImplementedError("Empty Instances does not support __len__!") - - def __iter__(self): - raise NotImplementedError("`Instances` object is not iterable!") - - @staticmethod - def cat(instance_lists: List["Instances"]) -> "Instances": - """ - Args: - instance_lists (list[Instances]) - - Returns: - Instances - """ - assert all(isinstance(i, Instances) for i in instance_lists) - assert len(instance_lists) > 0 - if len(instance_lists) == 1: - return instance_lists[0] - - image_size = instance_lists[0].image_size - if not isinstance(image_size, torch.Tensor): # could be a tensor in tracing - for i in instance_lists[1:]: - assert i.image_size == image_size - ret = Instances(image_size) - for k in instance_lists[0]._fields.keys(): - values = [i.get(k) for i in instance_lists] - v0 = values[0] - if isinstance(v0, torch.Tensor): - values = torch.cat(values, dim=0) - elif isinstance(v0, list): - values = list(itertools.chain(*values)) - elif hasattr(type(v0), "cat"): - values = type(v0).cat(values) - else: - raise ValueError("Unsupported type {} for concatenation".format(type(v0))) - ret.set(k, values) - return ret - - def __str__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={}, ".format(len(self)) - s += "image_height={}, ".format(self._image_size[0]) - s += "image_width={}, ".format(self._image_size[1]) - s += "fields=[{}])".format(", ".join((f"{k}: {v}" for k, v in self._fields.items()))) - return s - - __repr__ = __str__ diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/keypoints.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/keypoints.py deleted file mode 100755 index d0ee8724..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/keypoints.py +++ /dev/null @@ -1,239 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import Any, List, Tuple, Union -import torch -from torch.nn import functional as F - - -class Keypoints: - """ - Stores keypoint **annotation** data. GT Instances have a `gt_keypoints` property - containing the x,y location and visibility flag of each keypoint. This tensor has shape - (N, K, 3) where N is the number of instances and K is the number of keypoints per instance. - - The visibility flag follows the COCO format and must be one of three integers: - - * v=0: not labeled (in which case x=y=0) - * v=1: labeled but not visible - * v=2: labeled and visible - """ - - def __init__(self, keypoints: Union[torch.Tensor, np.ndarray, List[List[float]]]): - """ - Arguments: - keypoints: A Tensor, numpy array, or list of the x, y, and visibility of each keypoint. - The shape should be (N, K, 3) where N is the number of - instances, and K is the number of keypoints per instance. 
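A small sketch of the (N, K, 3) layout and COCO visibility flags described above:

```python
import torch

# One instance, two keypoints; the last channel is the COCO visibility flag.
kpts = Keypoints(torch.tensor([[
    [10.0, 20.0, 2.0],  # v=2: labeled and visible
    [ 0.0,  0.0, 0.0],  # v=0: not labeled (x = y = 0 by convention)
]]))
assert len(kpts) == 1 and kpts.tensor.shape == (1, 2, 3)
```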
- """ - device = keypoints.device if isinstance(keypoints, torch.Tensor) else torch.device("cpu") - keypoints = torch.as_tensor(keypoints, dtype=torch.float32, device=device) - assert keypoints.dim() == 3 and keypoints.shape[2] == 3, keypoints.shape - self.tensor = keypoints - - def __len__(self) -> int: - return self.tensor.size(0) - - def to(self, *args: Any, **kwargs: Any) -> "Keypoints": - return type(self)(self.tensor.to(*args, **kwargs)) - - @property - def device(self) -> torch.device: - return self.tensor.device - - def to_heatmap(self, boxes: torch.Tensor, heatmap_size: int) -> torch.Tensor: - """ - Convert keypoint annotations to a heatmap of one-hot labels for training, - as described in :paper:`Mask R-CNN`. - - Arguments: - boxes: Nx4 tensor, the boxes to draw the keypoints to - - Returns: - heatmaps: - A tensor of shape (N, K), each element is integer spatial label - in the range [0, heatmap_size**2 - 1] for each keypoint in the input. - valid: - A tensor of shape (N, K) containing whether each keypoint is in the roi or not. - """ - return _keypoints_to_heatmap(self.tensor, boxes, heatmap_size) - - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "Keypoints": - """ - Create a new `Keypoints` by indexing on this `Keypoints`. - - The following usage are allowed: - - 1. `new_kpts = kpts[3]`: return a `Keypoints` which contains only one instance. - 2. `new_kpts = kpts[2:10]`: return a slice of key points. - 3. `new_kpts = kpts[vector]`, where vector is a torch.ByteTensor - with `length = len(kpts)`. Nonzero elements in the vector will be selected. - - Note that the returned Keypoints might share storage with this Keypoints, - subject to Pytorch's indexing semantics. - """ - if isinstance(item, int): - return Keypoints([self.tensor[item]]) - return Keypoints(self.tensor[item]) - - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.tensor)) - return s - - @staticmethod - def cat(keypoints_list: List["Keypoints"]) -> "Keypoints": - """ - Concatenates a list of Keypoints into a single Keypoints - - Arguments: - keypoints_list (list[Keypoints]) - - Returns: - Keypoints: the concatenated Keypoints - """ - assert isinstance(keypoints_list, (list, tuple)) - assert len(keypoints_list) > 0 - assert all(isinstance(keypoints, Keypoints) for keypoints in keypoints_list) - - cat_kpts = type(keypoints_list[0])( - torch.cat([kpts.tensor for kpts in keypoints_list], dim=0) - ) - return cat_kpts - - -# TODO make this nicer, this is a direct translation from C2 (but removing the inner loop) -def _keypoints_to_heatmap( - keypoints: torch.Tensor, rois: torch.Tensor, heatmap_size: int -) -> Tuple[torch.Tensor, torch.Tensor]: - """ - Encode keypoint locations into a target heatmap for use in SoftmaxWithLoss across space. - - Maps keypoints from the half-open interval [x1, x2) on continuous image coordinates to the - closed interval [0, heatmap_size - 1] on discrete image coordinates. We use the - continuous-discrete conversion from Heckbert 1990 ("What is the coordinate of a pixel?"): - d = floor(c) and c = d + 0.5, where d is a discrete coordinate and c is a continuous coordinate. - - Arguments: - keypoints: tensor of keypoint locations in of shape (N, K, 3). - rois: Nx4 tensor of rois in xyxy format - heatmap_size: integer side length of square heatmap. - - Returns: - heatmaps: A tensor of shape (N, K) containing an integer spatial label - in the range [0, heatmap_size**2 - 1] for each keypoint in the input. 
- valid: A tensor of shape (N, K) containing whether each keypoint is in - the roi or not. - """ - - if rois.numel() == 0: - return rois.new().long(), rois.new().long() - offset_x = rois[:, 0] - offset_y = rois[:, 1] - scale_x = heatmap_size / (rois[:, 2] - rois[:, 0]) - scale_y = heatmap_size / (rois[:, 3] - rois[:, 1]) - - offset_x = offset_x[:, None] - offset_y = offset_y[:, None] - scale_x = scale_x[:, None] - scale_y = scale_y[:, None] - - x = keypoints[..., 0] - y = keypoints[..., 1] - - x_boundary_inds = x == rois[:, 2][:, None] - y_boundary_inds = y == rois[:, 3][:, None] - - x = (x - offset_x) * scale_x - x = x.floor().long() - y = (y - offset_y) * scale_y - y = y.floor().long() - - x[x_boundary_inds] = heatmap_size - 1 - y[y_boundary_inds] = heatmap_size - 1 - - valid_loc = (x >= 0) & (y >= 0) & (x < heatmap_size) & (y < heatmap_size) - vis = keypoints[..., 2] > 0 - valid = (valid_loc & vis).long() - - lin_ind = y * heatmap_size + x - heatmaps = lin_ind * valid - - return heatmaps, valid - - -@torch.jit.script_if_tracing -def heatmaps_to_keypoints(maps: torch.Tensor, rois: torch.Tensor) -> torch.Tensor: - """ - Extract predicted keypoint locations from heatmaps. - - Args: - maps (Tensor): (#ROIs, #keypoints, POOL_H, POOL_W). The predicted heatmap of logits for - each ROI and each keypoint. - rois (Tensor): (#ROIs, 4). The box of each ROI. - - Returns: - Tensor of shape (#ROIs, #keypoints, 4) with the last dimension corresponding to - (x, y, logit, score) for each keypoint. - - When converting discrete pixel indices in an NxN image to a continuous keypoint coordinate, - we maintain consistency with :meth:`Keypoints.to_heatmap` by using the conversion from - Heckbert 1990: c = d + 0.5, where d is a discrete coordinate and c is a continuous coordinate. - """ - # The decorator use of torch.no_grad() was not supported by torchscript. 
- # https://github.com/pytorch/pytorch/issues/44768 - maps = maps.detach() - rois = rois.detach() - - offset_x = rois[:, 0] - offset_y = rois[:, 1] - - widths = (rois[:, 2] - rois[:, 0]).clamp(min=1) - heights = (rois[:, 3] - rois[:, 1]).clamp(min=1) - widths_ceil = widths.ceil() - heights_ceil = heights.ceil() - - num_rois, num_keypoints = maps.shape[:2] - xy_preds = maps.new_zeros(rois.shape[0], num_keypoints, 4) - - width_corrections = widths / widths_ceil - height_corrections = heights / heights_ceil - - keypoints_idx = torch.arange(num_keypoints, device=maps.device) - - for i in range(num_rois): - outsize = (int(heights_ceil[i]), int(widths_ceil[i])) - roi_map = F.interpolate( - maps[[i]], size=outsize, mode="bicubic", align_corners=False - ).squeeze( - 0 - ) # #keypoints x H x W - - # softmax over the spatial region - max_score, _ = roi_map.view(num_keypoints, -1).max(1) - max_score = max_score.view(num_keypoints, 1, 1) - tmp_full_resolution = (roi_map - max_score).exp_() - tmp_pool_resolution = (maps[i] - max_score).exp_() - # Produce scores over the region H x W, but normalize with POOL_H x POOL_W, - # so that the scores of objects of different absolute sizes will be more comparable - roi_map_scores = tmp_full_resolution / tmp_pool_resolution.sum((1, 2), keepdim=True) - - w = roi_map.shape[2] - pos = roi_map.view(num_keypoints, -1).argmax(1) - - x_int = pos % w - y_int = (pos - x_int) // w - - assert ( - roi_map_scores[keypoints_idx, y_int, x_int] - == roi_map_scores.view(num_keypoints, -1).max(1)[0] - ).all() - - x = (x_int.float() + 0.5) * width_corrections[i] - y = (y_int.float() + 0.5) * height_corrections[i] - - xy_preds[i, :, 0] = x + offset_x[i] - xy_preds[i, :, 1] = y + offset_y[i] - xy_preds[i, :, 2] = roi_map[keypoints_idx, y_int, x_int] - xy_preds[i, :, 3] = roi_map_scores[keypoints_idx, y_int, x_int] - - return xy_preds diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/masks.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/masks.py deleted file mode 100755 index 8f8e72dd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/masks.py +++ /dev/null @@ -1,532 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
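`_keypoints_to_heatmap` and `heatmaps_to_keypoints` above form an encode/decode pair. A usage sketch of the decode side, where random logits stand in for real predictions:

```python
import torch

rois = torch.tensor([[0.0, 0.0, 100.0, 200.0], [50.0, 50.0, 150.0, 250.0]])
maps = torch.randn(2, 17, 56, 56)          # 2 ROIs, 17 COCO keypoints, 56x56 logits

preds = heatmaps_to_keypoints(maps, rois)  # (2, 17, 4): x, y, logit, score
# Both directions use Heckbert's convention (index d <-> coordinate c = d + 0.5),
# so a keypoint decoded at pixel index 7 maps to continuous x = 7.5 in ROI space.
```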
-import copy -import itertools -import numpy as np -from typing import Any, Iterator, List, Union -import pycocotools.mask as mask_util -import torch -from torch import device - -from detectron2.layers.roi_align import ROIAlign -from detectron2.utils.memory import retry_if_cuda_oom - -from .boxes import Boxes - - -def polygon_area(x, y): - # Using the shoelace formula - # https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates - return 0.5 * np.abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1))) - - -def polygons_to_bitmask(polygons: List[np.ndarray], height: int, width: int) -> np.ndarray: - """ - Args: - polygons (list[ndarray]): each array has shape (Nx2,) - height, width (int) - - Returns: - ndarray: a bool mask of shape (height, width) - """ - if len(polygons) == 0: - # COCOAPI does not support empty polygons - return np.zeros((height, width)).astype(np.bool) - rles = mask_util.frPyObjects(polygons, height, width) - rle = mask_util.merge(rles) - return mask_util.decode(rle).astype(np.bool) - - -def rasterize_polygons_within_box( - polygons: List[np.ndarray], box: np.ndarray, mask_size: int -) -> torch.Tensor: - """ - Rasterize the polygons into a mask image and - crop the mask content in the given box. - The cropped mask is resized to (mask_size, mask_size). - - This function is used when generating training targets for mask head in Mask R-CNN. - Given original ground-truth masks for an image, new ground-truth mask - training targets in the size of `mask_size x mask_size` - must be provided for each predicted box. This function will be called to - produce such targets. - - Args: - polygons (list[ndarray[float]]): a list of polygons, which represents an instance. - box: 4-element numpy array - mask_size (int): - - Returns: - Tensor: BoolTensor of shape (mask_size, mask_size) - """ - # 1. Shift the polygons w.r.t the boxes - w, h = box[2] - box[0], box[3] - box[1] - - polygons = copy.deepcopy(polygons) - for p in polygons: - p[0::2] = p[0::2] - box[0] - p[1::2] = p[1::2] - box[1] - - # 2. Rescale the polygons to the new box size - # max() to avoid division by small number - ratio_h = mask_size / max(h, 0.1) - ratio_w = mask_size / max(w, 0.1) - - if ratio_h == ratio_w: - for p in polygons: - p *= ratio_h - else: - for p in polygons: - p[0::2] *= ratio_w - p[1::2] *= ratio_h - - # 3. Rasterize the polygons with coco api - mask = polygons_to_bitmask(polygons, mask_size, mask_size) - mask = torch.from_numpy(mask) - return mask - - -class BitMasks: - """ - This class stores the segmentation masks for all objects in one image, in - the form of bitmaps. - - Attributes: - tensor: bool Tensor of N,H,W, representing N instances in the image. - """ - - def __init__(self, tensor: Union[torch.Tensor, np.ndarray]): - """ - Args: - tensor: bool Tensor of N,H,W, representing N instances in the image. - """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.bool, device=device) - assert tensor.dim() == 3, tensor.size() - self.image_size = tensor.shape[1:] - self.tensor = tensor - - @torch.jit.unused - def to(self, *args: Any, **kwargs: Any) -> "BitMasks": - return BitMasks(self.tensor.to(*args, **kwargs)) - - @property - def device(self) -> torch.device: - return self.tensor.device - - @torch.jit.unused - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "BitMasks": - """ - Returns: - BitMasks: Create a new :class:`BitMasks` by indexing. 
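The shoelace formula in `polygon_area` above, checked on a unit square:

```python
import numpy as np

x = np.array([0.0, 1.0, 1.0, 0.0])  # unit square, vertices in order
y = np.array([0.0, 0.0, 1.0, 1.0])
assert polygon_area(x, y) == 1.0    # 0.5 * |x . roll(y) - y . roll(x)|
```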
- - The following usage are allowed: - - 1. `new_masks = masks[3]`: return a `BitMasks` which contains only one mask. - 2. `new_masks = masks[2:10]`: return a slice of masks. - 3. `new_masks = masks[vector]`, where vector is a torch.BoolTensor - with `length = len(masks)`. Nonzero elements in the vector will be selected. - - Note that the returned object might share storage with this object, - subject to Pytorch's indexing semantics. - """ - if isinstance(item, int): - return BitMasks(self.tensor[item].unsqueeze(0)) - m = self.tensor[item] - assert m.dim() == 3, "Indexing on BitMasks with {} returns a tensor with shape {}!".format( - item, m.shape - ) - return BitMasks(m) - - @torch.jit.unused - def __iter__(self) -> torch.Tensor: - yield from self.tensor - - @torch.jit.unused - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.tensor)) - return s - - def __len__(self) -> int: - return self.tensor.shape[0] - - def nonempty(self) -> torch.Tensor: - """ - Find masks that are non-empty. - - Returns: - Tensor: a BoolTensor which represents - whether each mask is empty (False) or non-empty (True). - """ - return self.tensor.flatten(1).any(dim=1) - - @staticmethod - def from_polygon_masks( - polygon_masks: Union["PolygonMasks", List[List[np.ndarray]]], height: int, width: int - ) -> "BitMasks": - """ - Args: - polygon_masks (list[list[ndarray]] or PolygonMasks) - height, width (int) - """ - if isinstance(polygon_masks, PolygonMasks): - polygon_masks = polygon_masks.polygons - masks = [polygons_to_bitmask(p, height, width) for p in polygon_masks] - if len(masks): - return BitMasks(torch.stack([torch.from_numpy(x) for x in masks])) - else: - return BitMasks(torch.empty(0, height, width, dtype=torch.bool)) - - @staticmethod - def from_roi_masks(roi_masks: "ROIMasks", height: int, width: int) -> "BitMasks": - """ - Args: - roi_masks: - height, width (int): - """ - return roi_masks.to_bitmasks(height, width) - - def crop_and_resize(self, boxes: torch.Tensor, mask_size: int) -> torch.Tensor: - """ - Crop each bitmask by the given box, and resize results to (mask_size, mask_size). - This can be used to prepare training targets for Mask R-CNN. - It has less reconstruction error compared to rasterization with polygons. - However we observe no difference in accuracy, - but BitMasks requires more memory to store all the masks. - - Args: - boxes (Tensor): Nx4 tensor storing the boxes for each mask - mask_size (int): the size of the rasterized mask. - - Returns: - Tensor: - A bool tensor of shape (N, mask_size, mask_size), where - N is the number of predicted boxes for this image. - """ - assert len(boxes) == len(self), "{} != {}".format(len(boxes), len(self)) - device = self.tensor.device - - batch_inds = torch.arange(len(boxes), device=device).to(dtype=boxes.dtype)[:, None] - rois = torch.cat([batch_inds, boxes], dim=1) # Nx5 - - bit_masks = self.tensor.to(dtype=torch.float32) - rois = rois.to(device=device) - output = ( - ROIAlign((mask_size, mask_size), 1.0, 0, aligned=True) - .forward(bit_masks[:, None, :, :], rois) - .squeeze(1) - ) - output = output >= 0.5 - return output - - def get_bounding_boxes(self) -> Boxes: - """ - Returns: - Boxes: tight bounding boxes around bitmasks. - If a mask is empty, it's bounding box will be all zero. 
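A sketch of `crop_and_resize` above, which turns full-image bitmasks into fixed-size Mask R-CNN training targets. The shapes are illustrative, and the detectron2 `ROIAlign` layer it calls is assumed to be importable:

```python
import torch

masks = BitMasks(torch.zeros(2, 64, 64, dtype=torch.bool))
boxes = torch.tensor([[0.0, 0.0, 32.0, 32.0], [16.0, 16.0, 48.0, 48.0]])
targets = masks.crop_and_resize(boxes, mask_size=28)  # bool tensor (2, 28, 28)
```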
- """ - boxes = torch.zeros(self.tensor.shape[0], 4, dtype=torch.float32) - x_any = torch.any(self.tensor, dim=1) - y_any = torch.any(self.tensor, dim=2) - for idx in range(self.tensor.shape[0]): - x = torch.where(x_any[idx, :])[0] - y = torch.where(y_any[idx, :])[0] - if len(x) > 0 and len(y) > 0: - boxes[idx, :] = torch.as_tensor( - [x[0], y[0], x[-1] + 1, y[-1] + 1], dtype=torch.float32 - ) - return Boxes(boxes) - - @staticmethod - def cat(bitmasks_list: List["BitMasks"]) -> "BitMasks": - """ - Concatenates a list of BitMasks into a single BitMasks - - Arguments: - bitmasks_list (list[BitMasks]) - - Returns: - BitMasks: the concatenated BitMasks - """ - assert isinstance(bitmasks_list, (list, tuple)) - assert len(bitmasks_list) > 0 - assert all(isinstance(bitmask, BitMasks) for bitmask in bitmasks_list) - - cat_bitmasks = type(bitmasks_list[0])(torch.cat([bm.tensor for bm in bitmasks_list], dim=0)) - return cat_bitmasks - - -class PolygonMasks: - """ - This class stores the segmentation masks for all objects in one image, in the form of polygons. - - Attributes: - polygons: list[list[ndarray]]. Each ndarray is a float64 vector representing a polygon. - """ - - def __init__(self, polygons: List[List[Union[torch.Tensor, np.ndarray]]]): - """ - Arguments: - polygons (list[list[np.ndarray]]): The first - level of the list correspond to individual instances, - the second level to all the polygons that compose the - instance, and the third level to the polygon coordinates. - The third level array should have the format of - [x0, y0, x1, y1, ..., xn, yn] (n >= 3). - """ - if not isinstance(polygons, list): - raise ValueError( - "Cannot create PolygonMasks: Expect a list of list of polygons per image. " - "Got '{}' instead.".format(type(polygons)) - ) - - def _make_array(t: Union[torch.Tensor, np.ndarray]) -> np.ndarray: - # Use float64 for higher precision, because why not? - # Always put polygons on CPU (self.to is a no-op) since they - # are supposed to be small tensors. - # May need to change this assumption if GPU placement becomes useful - if isinstance(t, torch.Tensor): - t = t.cpu().numpy() - return np.asarray(t).astype("float64") - - def process_polygons( - polygons_per_instance: List[Union[torch.Tensor, np.ndarray]] - ) -> List[np.ndarray]: - if not isinstance(polygons_per_instance, list): - raise ValueError( - "Cannot create polygons: Expect a list of polygons per instance. " - "Got '{}' instead.".format(type(polygons_per_instance)) - ) - # transform each polygon to a numpy array - polygons_per_instance = [_make_array(p) for p in polygons_per_instance] - for polygon in polygons_per_instance: - if len(polygon) % 2 != 0 or len(polygon) < 6: - raise ValueError(f"Cannot create a polygon from {len(polygon)} coordinates.") - return polygons_per_instance - - self.polygons: List[List[np.ndarray]] = [ - process_polygons(polygons_per_instance) for polygons_per_instance in polygons - ] - - def to(self, *args: Any, **kwargs: Any) -> "PolygonMasks": - return self - - @property - def device(self) -> torch.device: - return torch.device("cpu") - - def get_bounding_boxes(self) -> Boxes: - """ - Returns: - Boxes: tight bounding boxes around polygon masks. 
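`BitMasks.get_bounding_boxes` above scans per-row and per-column occupancy; a worked mask:

```python
import torch

m = torch.zeros(1, 8, 8, dtype=torch.bool)
m[0, 2:5, 3:7] = True  # occupied rows 2..4, columns 3..6
box = BitMasks(m).get_bounding_boxes().tensor
# tensor([[3., 2., 7., 5.]]): x0, y0, then exclusive x1 = 6 + 1 and y1 = 4 + 1
```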
- """ - boxes = torch.zeros(len(self.polygons), 4, dtype=torch.float32) - for idx, polygons_per_instance in enumerate(self.polygons): - minxy = torch.as_tensor([float("inf"), float("inf")], dtype=torch.float32) - maxxy = torch.zeros(2, dtype=torch.float32) - for polygon in polygons_per_instance: - coords = torch.from_numpy(polygon).view(-1, 2).to(dtype=torch.float32) - minxy = torch.min(minxy, torch.min(coords, dim=0).values) - maxxy = torch.max(maxxy, torch.max(coords, dim=0).values) - boxes[idx, :2] = minxy - boxes[idx, 2:] = maxxy - return Boxes(boxes) - - def nonempty(self) -> torch.Tensor: - """ - Find masks that are non-empty. - - Returns: - Tensor: - a BoolTensor which represents whether each mask is empty (False) or not (True). - """ - keep = [1 if len(polygon) > 0 else 0 for polygon in self.polygons] - return torch.from_numpy(np.asarray(keep, dtype=np.bool)) - - def __getitem__(self, item: Union[int, slice, List[int], torch.BoolTensor]) -> "PolygonMasks": - """ - Support indexing over the instances and return a `PolygonMasks` object. - `item` can be: - - 1. An integer. It will return an object with only one instance. - 2. A slice. It will return an object with the selected instances. - 3. A list[int]. It will return an object with the selected instances, - correpsonding to the indices in the list. - 4. A vector mask of type BoolTensor, whose length is num_instances. - It will return an object with the instances whose mask is nonzero. - """ - if isinstance(item, int): - selected_polygons = [self.polygons[item]] - elif isinstance(item, slice): - selected_polygons = self.polygons[item] - elif isinstance(item, list): - selected_polygons = [self.polygons[i] for i in item] - elif isinstance(item, torch.Tensor): - # Polygons is a list, so we have to move the indices back to CPU. - if item.dtype == torch.bool: - assert item.dim() == 1, item.shape - item = item.nonzero().squeeze(1).cpu().numpy().tolist() - elif item.dtype in [torch.int32, torch.int64]: - item = item.cpu().numpy().tolist() - else: - raise ValueError("Unsupported tensor dtype={} for indexing!".format(item.dtype)) - selected_polygons = [self.polygons[i] for i in item] - return PolygonMasks(selected_polygons) - - def __iter__(self) -> Iterator[List[np.ndarray]]: - """ - Yields: - list[ndarray]: the polygons for one instance. - Each Tensor is a float64 vector representing a polygon. - """ - return iter(self.polygons) - - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.polygons)) - return s - - def __len__(self) -> int: - return len(self.polygons) - - def crop_and_resize(self, boxes: torch.Tensor, mask_size: int) -> torch.Tensor: - """ - Crop each mask by the given box, and resize results to (mask_size, mask_size). - This can be used to prepare training targets for Mask R-CNN. - - Args: - boxes (Tensor): Nx4 tensor storing the boxes for each mask - mask_size (int): the size of the rasterized mask. - - Returns: - Tensor: A bool tensor of shape (N, mask_size, mask_size), where - N is the number of predicted boxes for this image. 
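Constructing a `PolygonMasks` per the nesting rules above (instances, then polygons per instance, then flattened coordinates), with its tight bounding box:

```python
import numpy as np

# One instance made of a single triangle, flattened as [x0, y0, x1, y1, x2, y2].
pm = PolygonMasks([[np.array([0.0, 0.0, 4.0, 0.0, 4.0, 3.0])]])
pm.get_bounding_boxes().tensor  # tensor([[0., 0., 4., 3.]])
len(pm)                         # 1 instance
```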
- """ - assert len(boxes) == len(self), "{} != {}".format(len(boxes), len(self)) - - device = boxes.device - # Put boxes on the CPU, as the polygon representation is not efficient GPU-wise - # (several small tensors for representing a single instance mask) - boxes = boxes.to(torch.device("cpu")) - - results = [ - rasterize_polygons_within_box(poly, box.numpy(), mask_size) - for poly, box in zip(self.polygons, boxes) - ] - """ - poly: list[list[float]], the polygons for one instance - box: a tensor of shape (4,) - """ - if len(results) == 0: - return torch.empty(0, mask_size, mask_size, dtype=torch.bool, device=device) - return torch.stack(results, dim=0).to(device=device) - - def area(self): - """ - Computes area of the mask. - Only works with Polygons, using the shoelace formula: - https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates - - Returns: - Tensor: a vector, area for each instance - """ - - area = [] - for polygons_per_instance in self.polygons: - area_per_instance = 0 - for p in polygons_per_instance: - area_per_instance += polygon_area(p[0::2], p[1::2]) - area.append(area_per_instance) - - return torch.tensor(area) - - @staticmethod - def cat(polymasks_list: List["PolygonMasks"]) -> "PolygonMasks": - """ - Concatenates a list of PolygonMasks into a single PolygonMasks - - Arguments: - polymasks_list (list[PolygonMasks]) - - Returns: - PolygonMasks: the concatenated PolygonMasks - """ - assert isinstance(polymasks_list, (list, tuple)) - assert len(polymasks_list) > 0 - assert all(isinstance(polymask, PolygonMasks) for polymask in polymasks_list) - - cat_polymasks = type(polymasks_list[0])( - list(itertools.chain.from_iterable(pm.polygons for pm in polymasks_list)) - ) - return cat_polymasks - - -class ROIMasks: - """ - Represent masks by N smaller masks defined in some ROIs. Once ROI boxes are given, - full-image bitmask can be obtained by "pasting" the mask on the region defined - by the corresponding ROI box. - """ - - def __init__(self, tensor: torch.Tensor): - """ - Args: - tensor: (N, M, M) mask tensor that defines the mask within each ROI. - """ - if tensor.dim() != 3: - raise ValueError("ROIMasks must take a masks of 3 dimension.") - self.tensor = tensor - - def to(self, device: torch.device) -> "ROIMasks": - return ROIMasks(self.tensor.to(device)) - - @property - def device(self) -> device: - return self.tensor.device - - def __len__(self): - return self.tensor.shape[0] - - def __getitem__(self, item) -> "ROIMasks": - """ - Returns: - ROIMasks: Create a new :class:`ROIMasks` by indexing. - - The following usage are allowed: - - 1. `new_masks = masks[2:10]`: return a slice of masks. - 2. `new_masks = masks[vector]`, where vector is a torch.BoolTensor - with `length = len(masks)`. Nonzero elements in the vector will be selected. - - Note that the returned object might share storage with this object, - subject to Pytorch's indexing semantics. - """ - t = self.tensor[item] - if t.dim() != 3: - raise ValueError( - f"Indexing on ROIMasks with {item} returns a tensor with shape {t.shape}!" - ) - return ROIMasks(t) - - @torch.jit.unused - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.tensor)) - return s - - @torch.jit.unused - def to_bitmasks(self, boxes: torch.Tensor, height, width, threshold=0.5): - """ - Args: see documentation of :func:`paste_masks_in_image`. 
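`ROIMasks` above defers pasting until the boxes are known; a sketch, assuming `paste_masks_in_image` from `detectron2.layers.mask_ops` is importable:

```python
import torch

roi_masks = ROIMasks(torch.rand(1, 28, 28))            # one 28x28 soft mask
boxes = Boxes(torch.tensor([[10.0, 10.0, 50.0, 60.0]]))
full = roi_masks.to_bitmasks(boxes, 480, 640)          # BitMasks on a 480x640 image
```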
- """ - from detectron2.layers.mask_ops import paste_masks_in_image, _paste_masks_tensor_shape - - if torch.jit.is_tracing(): - if isinstance(height, torch.Tensor): - paste_func = _paste_masks_tensor_shape - else: - paste_func = paste_masks_in_image - else: - paste_func = retry_if_cuda_oom(paste_masks_in_image) - bitmasks = paste_func(self.tensor, boxes.tensor, (height, width), threshold=threshold) - return BitMasks(bitmasks) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/rotated_boxes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/rotated_boxes.py deleted file mode 100755 index 4ec8e4c7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/structures/rotated_boxes.py +++ /dev/null @@ -1,503 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from typing import List, Tuple -import torch - -from detectron2.layers.rotated_boxes import pairwise_iou_rotated - -from .boxes import Boxes - - -class RotatedBoxes(Boxes): - """ - This structure stores a list of rotated boxes as a Nx5 torch.Tensor. - It supports some common methods about boxes - (`area`, `clip`, `nonempty`, etc), - and also behaves like a Tensor - (support indexing, `to(device)`, `.device`, and iteration over all boxes) - """ - - def __init__(self, tensor: torch.Tensor): - """ - Args: - tensor (Tensor[float]): a Nx5 matrix. Each row is - (x_center, y_center, width, height, angle), - in which angle is represented in degrees. - While there's no strict range restriction for it, - the recommended principal range is between [-180, 180) degrees. - - Assume we have a horizontal box B = (x_center, y_center, width, height), - where width is along the x-axis and height is along the y-axis. - The rotated box B_rot (x_center, y_center, width, height, angle) - can be seen as: - - 1. When angle == 0: - B_rot == B - 2. When angle > 0: - B_rot is obtained by rotating B w.r.t its center by :math:`|angle|` degrees CCW; - 3. When angle < 0: - B_rot is obtained by rotating B w.r.t its center by :math:`|angle|` degrees CW. - - Mathematically, since the right-handed coordinate system for image space - is (y, x), where y is top->down and x is left->right, the 4 vertices of the - rotated rectangle :math:`(yr_i, xr_i)` (i = 1, 2, 3, 4) can be obtained from - the vertices of the horizontal rectangle :math:`(y_i, x_i)` (i = 1, 2, 3, 4) - in the following way (:math:`\\theta = angle*\\pi/180` is the angle in radians, - :math:`(y_c, x_c)` is the center of the rectangle): - - .. math:: - - yr_i = \\cos(\\theta) (y_i - y_c) - \\sin(\\theta) (x_i - x_c) + y_c, - - xr_i = \\sin(\\theta) (y_i - y_c) + \\cos(\\theta) (x_i - x_c) + x_c, - - which is the standard rigid-body rotation transformation. - - Intuitively, the angle is - (1) the rotation angle from y-axis in image space - to the height vector (top->down in the box's local coordinate system) - of the box in CCW, and - (2) the rotation angle from x-axis in image space - to the width vector (left->right in the box's local coordinate system) - of the box in CCW. - - More intuitively, consider the following horizontal box ABCD represented - in (x1, y1, x2, y2): (3, 2, 7, 4), - covering the [3, 7] x [2, 4] region of the continuous coordinate system - which looks like this: - - .. code:: none - - O--------> x - | - | A---B - | | | - | D---C - | - v y - - Note that each capital letter represents one 0-dimensional geometric point - instead of a 'square pixel' here. 
- - In the example above, using (x, y) to represent a point we have: - - .. math:: - - O = (0, 0), A = (3, 2), B = (7, 2), C = (7, 4), D = (3, 4) - - We name vector AB = vector DC as the width vector in box's local coordinate system, and - vector AD = vector BC as the height vector in box's local coordinate system. Initially, - when angle = 0 degree, they're aligned with the positive directions of x-axis and y-axis - in the image space, respectively. - - For better illustration, we denote the center of the box as E, - - .. code:: none - - O--------> x - | - | A---B - | | E | - | D---C - | - v y - - where the center E = ((3+7)/2, (2+4)/2) = (5, 3). - - Also, - - .. math:: - - width = |AB| = |CD| = 7 - 3 = 4, - height = |AD| = |BC| = 4 - 2 = 2. - - Therefore, the corresponding representation for the same shape in rotated box in - (x_center, y_center, width, height, angle) format is: - - (5, 3, 4, 2, 0), - - Now, let's consider (5, 3, 4, 2, 90), which is rotated by 90 degrees - CCW (counter-clockwise) by definition. It looks like this: - - .. code:: none - - O--------> x - | B-C - | | | - | |E| - | | | - | A-D - v y - - The center E is still located at the same point (5, 3), while the vertices - ABCD are rotated by 90 degrees CCW with regard to E: - A = (4, 5), B = (4, 1), C = (6, 1), D = (6, 5) - - Here, 90 degrees can be seen as the CCW angle to rotate from y-axis to - vector AD or vector BC (the top->down height vector in box's local coordinate system), - or the CCW angle to rotate from x-axis to vector AB or vector DC (the left->right - width vector in box's local coordinate system). - - .. math:: - - width = |AB| = |CD| = 5 - 1 = 4, - height = |AD| = |BC| = 6 - 4 = 2. - - Next, how about (5, 3, 4, 2, -90), which is rotated by 90 degrees CW (clockwise) - by definition? It looks like this: - - .. code:: none - - O--------> x - | D-A - | | | - | |E| - | | | - | C-B - v y - - The center E is still located at the same point (5, 3), while the vertices - ABCD are rotated by 90 degrees CW with regard to E: - A = (6, 1), B = (6, 5), C = (4, 5), D = (4, 1) - - .. math:: - - width = |AB| = |CD| = 5 - 1 = 4, - height = |AD| = |BC| = 6 - 4 = 2. - - This covers exactly the same region as (5, 3, 4, 2, 90) does, and their IoU - will be 1. However, these two will generate different RoI Pooling results and - should not be treated as an identical box. - - On the other hand, it's easy to see that (X, Y, W, H, A) is identical to - (X, Y, W, H, A+360N), for any integer N. For example (5, 3, 4, 2, 270) would be - identical to (5, 3, 4, 2, -90), because rotating the shape 270 degrees CCW is - equivalent to rotating the same shape 90 degrees CW. - - We could rotate further to get (5, 3, 4, 2, 180), or (5, 3, 4, 2, -180): - - .. code:: none - - O--------> x - | - | C---D - | | E | - | B---A - | - v y - - .. math:: - - A = (7, 4), B = (3, 4), C = (3, 2), D = (7, 2), - - width = |AB| = |CD| = 7 - 3 = 4, - height = |AD| = |BC| = 4 - 2 = 2. - - Finally, this is a very inaccurate (heavily quantized) illustration of - how (5, 3, 4, 2, 60) looks like in case anyone wonders: - - .. code:: none - - O--------> x - | B\ - | / C - | /E / - | A / - | `D - v y - - It's still a rectangle with center of (5, 3), width of 4 and height of 2, - but its angle (and thus orientation) is somewhere between - (5, 3, 4, 2, 0) and (5, 3, 4, 2, 90). 
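The long convention note above can be checked numerically. This sketch (not part of the original file) applies the documented rigid-body rotation to recover the 90-degree vertices, up to float rounding:

```python
import math

def xywha_corners(xc, yc, w, h, a):
    """Rotate the four axis-aligned corners about (xc, yc) by `a` degrees CCW
    (in y-down image space), per the transform in the docstring above."""
    t = math.radians(a)
    cos, sin = math.cos(t), math.sin(t)
    offsets = [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    return [(xc + dx * cos + dy * sin, yc + dy * cos - dx * sin) for dx, dy in offsets]

# (5, 3, 4, 2, 90) -> A=(4, 5), B=(4, 1), C=(6, 1), D=(6, 5), matching the figure.
print(xywha_corners(5, 3, 4, 2, 90))
```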
- """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.float32, device=device) - if tensor.numel() == 0: - # Use reshape, so we don't end up creating a new tensor that does not depend on - # the inputs (and consequently confuses jit) - tensor = tensor.reshape((0, 5)).to(dtype=torch.float32, device=device) - assert tensor.dim() == 2 and tensor.size(-1) == 5, tensor.size() - - self.tensor = tensor - - def clone(self) -> "RotatedBoxes": - """ - Clone the RotatedBoxes. - - Returns: - RotatedBoxes - """ - return RotatedBoxes(self.tensor.clone()) - - def to(self, device: torch.device): - # Boxes are assumed float32 and does not support to(dtype) - return RotatedBoxes(self.tensor.to(device=device)) - - def area(self) -> torch.Tensor: - """ - Computes the area of all the boxes. - - Returns: - torch.Tensor: a vector with areas of each box. - """ - box = self.tensor - area = box[:, 2] * box[:, 3] - return area - - def normalize_angles(self) -> None: - """ - Restrict angles to the range of [-180, 180) degrees - """ - self.tensor[:, 4] = (self.tensor[:, 4] + 180.0) % 360.0 - 180.0 - - def clip(self, box_size: Tuple[int, int], clip_angle_threshold: float = 1.0) -> None: - """ - Clip (in place) the boxes by limiting x coordinates to the range [0, width] - and y coordinates to the range [0, height]. - - For RRPN: - Only clip boxes that are almost horizontal with a tolerance of - clip_angle_threshold to maintain backward compatibility. - - Rotated boxes beyond this threshold are not clipped for two reasons: - - 1. There are potentially multiple ways to clip a rotated box to make it - fit within the image. - 2. It's tricky to make the entire rectangular box fit within the image - and still be able to not leave out pixels of interest. - - Therefore we rely on ops like RoIAlignRotated to safely handle this. - - Args: - box_size (height, width): The clipping box's size. - clip_angle_threshold: - Iff. abs(normalized(angle)) <= clip_angle_threshold (in degrees), - we do the clipping as horizontal boxes. - """ - h, w = box_size - - # normalize angles to be within (-180, 180] degrees - self.normalize_angles() - - idx = torch.where(torch.abs(self.tensor[:, 4]) <= clip_angle_threshold)[0] - - # convert to (x1, y1, x2, y2) - x1 = self.tensor[idx, 0] - self.tensor[idx, 2] / 2.0 - y1 = self.tensor[idx, 1] - self.tensor[idx, 3] / 2.0 - x2 = self.tensor[idx, 0] + self.tensor[idx, 2] / 2.0 - y2 = self.tensor[idx, 1] + self.tensor[idx, 3] / 2.0 - - # clip - x1.clamp_(min=0, max=w) - y1.clamp_(min=0, max=h) - x2.clamp_(min=0, max=w) - y2.clamp_(min=0, max=h) - - # convert back to (xc, yc, w, h) - self.tensor[idx, 0] = (x1 + x2) / 2.0 - self.tensor[idx, 1] = (y1 + y2) / 2.0 - # make sure widths and heights do not increase due to numerical errors - self.tensor[idx, 2] = torch.min(self.tensor[idx, 2], x2 - x1) - self.tensor[idx, 3] = torch.min(self.tensor[idx, 3], y2 - y1) - - def nonempty(self, threshold: float = 0.0) -> torch.Tensor: - """ - Find boxes that are non-empty. - A box is considered empty, if either of its side is no larger than threshold. - - Returns: - Tensor: a binary vector which represents - whether each box is empty (False) or non-empty (True). - """ - box = self.tensor - widths = box[:, 2] - heights = box[:, 3] - keep = (widths > threshold) & (heights > threshold) - return keep - - def __getitem__(self, item) -> "RotatedBoxes": - """ - Returns: - RotatedBoxes: Create a new :class:`RotatedBoxes` by indexing. 
-
-        The following usages are allowed:
-
-        1. `new_boxes = boxes[3]`: return a `RotatedBoxes` which contains only one box.
-        2. `new_boxes = boxes[2:10]`: return a slice of boxes.
-        3. `new_boxes = boxes[vector]`, where vector is a torch.ByteTensor
-           with `length = len(boxes)`. Nonzero elements in the vector will be selected.
-
-        Note that the returned RotatedBoxes might share storage with this RotatedBoxes,
-        subject to PyTorch's indexing semantics.
-        """
-        if isinstance(item, int):
-            return RotatedBoxes(self.tensor[item].view(1, -1))
-        b = self.tensor[item]
-        assert b.dim() == 2, "Indexing on RotatedBoxes with {} failed to return a matrix!".format(
-            item
-        )
-        return RotatedBoxes(b)
-
-    def __len__(self) -> int:
-        return self.tensor.shape[0]
-
-    def __repr__(self) -> str:
-        return "RotatedBoxes(" + str(self.tensor) + ")"
-
-    def inside_box(self, box_size: Tuple[int, int], boundary_threshold: int = 0) -> torch.Tensor:
-        """
-        Args:
-            box_size (height, width): Size of the reference box covering
-                [0, width] x [0, height]
-            boundary_threshold (int): Boxes that extend beyond the reference box
-                boundary by more than boundary_threshold are considered "outside".
-
-        For RRPN, it might not be necessary to call this function since it's common
-        for rotated boxes to extend outside of the image boundaries
-        (the clip function only clips the near-horizontal boxes)
-
-        Returns:
-            a binary vector, indicating whether each box is inside the reference box.
-        """
-        height, width = box_size
-
-        cnt_x = self.tensor[..., 0]
-        cnt_y = self.tensor[..., 1]
-        half_w = self.tensor[..., 2] / 2.0
-        half_h = self.tensor[..., 3] / 2.0
-        a = self.tensor[..., 4]
-        c = torch.abs(torch.cos(a * math.pi / 180.0))
-        s = torch.abs(torch.sin(a * math.pi / 180.0))
-        # This basically computes the horizontal bounding rectangle of the rotated box
-        max_rect_dx = c * half_w + s * half_h
-        max_rect_dy = c * half_h + s * half_w
-
-        inds_inside = (
-            (cnt_x - max_rect_dx >= -boundary_threshold)
-            & (cnt_y - max_rect_dy >= -boundary_threshold)
-            & (cnt_x + max_rect_dx < width + boundary_threshold)
-            & (cnt_y + max_rect_dy < height + boundary_threshold)
-        )
-
-        return inds_inside
-
-    def get_centers(self) -> torch.Tensor:
-        """
-        Returns:
-            The box centers in a Nx2 array of (x, y).
-        """
-        return self.tensor[:, :2]
-
-    def scale(self, scale_x: float, scale_y: float) -> None:
-        """
-        Scale the rotated box with horizontal and vertical scaling factors
-        Note: when scale_factor_x != scale_factor_y,
-        the rotated box does not preserve the rectangular shape when the angle
-        is not a multiple of 90 degrees under resize transformation.
-        Instead, the shape is a parallelogram (that has skew).
-        Here we make an approximation by fitting a rotated rectangle to the parallelogram. 
-        """
-        self.tensor[:, 0] *= scale_x
-        self.tensor[:, 1] *= scale_y
-        theta = self.tensor[:, 4] * math.pi / 180.0
-        c = torch.cos(theta)
-        s = torch.sin(theta)
-
-        # In image space, y is top->down and x is left->right
-        # Consider the local coordinate system for the rotated box,
-        # where the box center is located at (0, 0), and the four vertices ABCD are
-        # A(-w / 2, -h / 2), B(w / 2, -h / 2), C(w / 2, h / 2), D(-w / 2, h / 2)
-        # the midpoint of the left edge AD of the rotated box E is:
-        # E = (A+D)/2 = (-w / 2, 0)
-        # the midpoint of the top edge AB of the rotated box F is:
-        # F(0, -h / 2)
-        # To get the old coordinates in the global system, apply the rotation transformation
-        # (Note: the right-handed coordinate system for image space is yOx):
-        # (old_x, old_y) = (s * y + c * x, c * y - s * x)
-        # E(old) = (s * 0 + c * (-w/2), c * 0 - s * (-w/2)) = (-c * w / 2, s * w / 2)
-        # F(old) = (s * (-h / 2) + c * 0, c * (-h / 2) - s * 0) = (-s * h / 2, -c * h / 2)
-        # After applying the scaling factor (sfx, sfy):
-        # E(new) = (-sfx * c * w / 2, sfy * s * w / 2)
-        # F(new) = (-sfx * s * h / 2, -sfy * c * h / 2)
-        # The new width after scaling transformation becomes:
-
-        # w(new) = |E(new) - O| * 2
-        #        = sqrt[(sfx * c * w / 2)^2 + (sfy * s * w / 2)^2] * 2
-        #        = sqrt[(sfx * c)^2 + (sfy * s)^2] * w
-        # i.e., scale_factor_w = sqrt[(sfx * c)^2 + (sfy * s)^2]
-        #
-        # For example,
-        # when angle = 0 or 180, |c| = 1, s = 0, scale_factor_w == scale_factor_x;
-        # when |angle| = 90, c = 0, |s| = 1, scale_factor_w == scale_factor_y
-        self.tensor[:, 2] *= torch.sqrt((scale_x * c) ** 2 + (scale_y * s) ** 2)
-
-        # h(new) = |F(new) - O| * 2
-        #        = sqrt[(sfx * s * h / 2)^2 + (sfy * c * h / 2)^2] * 2
-        #        = sqrt[(sfx * s)^2 + (sfy * c)^2] * h
-        # i.e., scale_factor_h = sqrt[(sfx * s)^2 + (sfy * c)^2]
-        #
-        # For example,
-        # when angle = 0 or 180, |c| = 1, s = 0, scale_factor_h == scale_factor_y;
-        # when |angle| = 90, c = 0, |s| = 1, scale_factor_h == scale_factor_x
-        self.tensor[:, 3] *= torch.sqrt((scale_x * s) ** 2 + (scale_y * c) ** 2)
-
-        # The angle is the rotation angle from y-axis in image space to the height
-        # vector (top->down in the box's local coordinate system) of the box in CCW.
-        #
-        # angle(new) = angle_yOx(O - F(new))
-        #            = angle_yOx( (sfx * s * h / 2, sfy * c * h / 2) )
-        #            = atan2(sfx * s * h / 2, sfy * c * h / 2)
-        #            = atan2(sfx * s, sfy * c)
-        #
-        # For example,
-        # when sfx == sfy, angle(new) == atan2(s, c) == angle(old)
-        self.tensor[:, 4] = torch.atan2(scale_x * s, scale_y * c) * 180 / math.pi
-
-    @classmethod
-    def cat(cls, boxes_list: List["RotatedBoxes"]) -> "RotatedBoxes":
-        """
-        Concatenates a list of RotatedBoxes into a single RotatedBoxes
-
-        Arguments:
-            boxes_list (list[RotatedBoxes])
-
-        Returns:
-            RotatedBoxes: the concatenated RotatedBoxes
-        """
-        assert isinstance(boxes_list, (list, tuple))
-        if len(boxes_list) == 0:
-            return cls(torch.empty(0))
-        assert all([isinstance(box, RotatedBoxes) for box in boxes_list])
-
-        # use torch.cat (v.s. layers.cat) so the returned boxes never share storage with input
-        cat_boxes = cls(torch.cat([b.tensor for b in boxes_list], dim=0))
-        return cat_boxes
-
-    @property
-    def device(self) -> torch.device:
-        return self.tensor.device
-
-    @torch.jit.unused
-    def __iter__(self):
-        """
-        Yield a box as a Tensor of shape (5,) at a time. 
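
To make the scaling derivation above concrete, here is a small sketch (assuming the same module is importable as below). At |angle| = 90 the width is scaled by `scale_y` and the height by `scale_x`, exactly as the comments derive:

```
import torch
from detectron2.structures.rotated_boxes import RotatedBoxes

# A 4x2 box rotated by 90 degrees, scaled anisotropically by (2, 1):
# c = cos(90) = 0, s = sin(90) = 1, so
#   scale_factor_w = sqrt((2*0)^2 + (1*1)^2) = 1  -> width stays 4
#   scale_factor_h = sqrt((2*1)^2 + (1*0)^2) = 2  -> height becomes 4
#   angle(new)     = atan2(2*1, 1*0) = 90 degrees (unchanged)
r = RotatedBoxes(torch.tensor([[5.0, 3.0, 4.0, 2.0, 90.0]]))
r.scale(scale_x=2.0, scale_y=1.0)
print(r.tensor)  # tensor([[10., 3., 4., 4., 90.]])
```
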
-        """
-        yield from self.tensor
-
-
-def pairwise_iou(boxes1: RotatedBoxes, boxes2: RotatedBoxes) -> torch.Tensor:
-    """
-    Given two lists of rotated boxes of size N and M,
-    compute the IoU (intersection over union)
-    between **all** N x M pairs of boxes.
-    The box order must be (x_center, y_center, width, height, angle).
-
-    Args:
-        boxes1, boxes2 (RotatedBoxes):
-            two `RotatedBoxes`. Contains N & M rotated boxes, respectively.
-
-    Returns:
-        Tensor: IoU, sized [N,M].
-    """
-
-    return pairwise_iou_rotated(boxes1.tensor, boxes2.tensor)
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/README.md
deleted file mode 100755
index 9765b24a..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/README.md
+++ /dev/null
@@ -1,5 +0,0 @@
-# Utility functions
-
-This folder contains utility functions that are not used in the
-core library, but are useful for building models or training
-code using the config system.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/__init__.py
deleted file mode 100755
index 9020c2df..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/analysis.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/analysis.py
deleted file mode 100755
index 178da796..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/analysis.py
+++ /dev/null
@@ -1,188 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# -*- coding: utf-8 -*-
-
-import typing
-from typing import Any, List
-import fvcore
-from fvcore.nn import activation_count, flop_count, parameter_count, parameter_count_table
-from torch import nn
-
-from detectron2.export import TracingAdapter
-
-__all__ = [
-    "activation_count_operators",
-    "flop_count_operators",
-    "parameter_count_table",
-    "parameter_count",
-    "FlopCountAnalysis",
-]
-
-FLOPS_MODE = "flops"
-ACTIVATIONS_MODE = "activations"
-
-
-# Some extra ops to ignore from counting, including elementwise and reduction ops
-_IGNORED_OPS = {
-    "aten::add",
-    "aten::add_",
-    "aten::argmax",
-    "aten::argsort",
-    "aten::batch_norm",
-    "aten::constant_pad_nd",
-    "aten::div",
-    "aten::div_",
-    "aten::exp",
-    "aten::log2",
-    "aten::max_pool2d",
-    "aten::meshgrid",
-    "aten::mul",
-    "aten::mul_",
-    "aten::neg",
-    "aten::nonzero_numpy",
-    "aten::reciprocal",
-    "aten::repeat_interleave",
-    "aten::rsub",
-    "aten::sigmoid",
-    "aten::sigmoid_",
-    "aten::softmax",
-    "aten::sort",
-    "aten::sqrt",
-    "aten::sub",
-    "torchvision::nms",  # TODO estimate flop for nms
-}
-
-
-class FlopCountAnalysis(fvcore.nn.FlopCountAnalysis):
-    """
-    Same as :class:`fvcore.nn.FlopCountAnalysis`, but supports detectron2 models.
-    """
-
-    def __init__(self, model, inputs):
-        """
-        Args:
-            model (nn.Module):
-            inputs (Any): inputs of the given model. Does not have to be tuple of tensors. 
- """ - wrapper = TracingAdapter(model, inputs, allow_non_tensor=True) - super().__init__(wrapper, wrapper.flattened_inputs) - self.set_op_handle(**{k: None for k in _IGNORED_OPS}) - - -def flop_count_operators(model: nn.Module, inputs: list) -> typing.DefaultDict[str, float]: - """ - Implement operator-level flops counting using jit. - This is a wrapper of :func:`fvcore.nn.flop_count` and adds supports for standard - detection models in detectron2. - Please use :class:`FlopCountAnalysis` for more advanced functionalities. - - Note: - The function runs the input through the model to compute flops. - The flops of a detection model is often input-dependent, for example, - the flops of box & mask head depends on the number of proposals & - the number of detected objects. - Therefore, the flops counting using a single input may not accurately - reflect the computation cost of a model. It's recommended to average - across a number of inputs. - - Args: - model: a detectron2 model that takes `list[dict]` as input. - inputs (list[dict]): inputs to model, in detectron2's standard format. - Only "image" key will be used. - supported_ops (dict[str, Handle]): see documentation of :func:`fvcore.nn.flop_count` - - Returns: - Counter: Gflop count per operator - """ - old_train = model.training - model.eval() - ret = FlopCountAnalysis(model, inputs).by_operator() - model.train(old_train) - return {k: v / 1e9 for k, v in ret.items()} - - -def activation_count_operators( - model: nn.Module, inputs: list, **kwargs -) -> typing.DefaultDict[str, float]: - """ - Implement operator-level activations counting using jit. - This is a wrapper of fvcore.nn.activation_count, that supports standard detection models - in detectron2. - - Note: - The function runs the input through the model to compute activations. - The activations of a detection model is often input-dependent, for example, - the activations of box & mask head depends on the number of proposals & - the number of detected objects. - - Args: - model: a detectron2 model that takes `list[dict]` as input. - inputs (list[dict]): inputs to model, in detectron2's standard format. - Only "image" key will be used. 
- - Returns: - Counter: activation count per operator - """ - return _wrapper_count_operators(model=model, inputs=inputs, mode=ACTIVATIONS_MODE, **kwargs) - - -def _wrapper_count_operators( - model: nn.Module, inputs: list, mode: str, **kwargs -) -> typing.DefaultDict[str, float]: - # ignore some ops - supported_ops = {k: lambda *args, **kwargs: {} for k in _IGNORED_OPS} - supported_ops.update(kwargs.pop("supported_ops", {})) - kwargs["supported_ops"] = supported_ops - - assert len(inputs) == 1, "Please use batch size=1" - tensor_input = inputs[0]["image"] - inputs = [{"image": tensor_input}] # remove other keys, in case there are any - - old_train = model.training - if isinstance(model, (nn.parallel.distributed.DistributedDataParallel, nn.DataParallel)): - model = model.module - wrapper = TracingAdapter(model, inputs) - wrapper.eval() - if mode == FLOPS_MODE: - ret = flop_count(wrapper, (tensor_input,), **kwargs) - elif mode == ACTIVATIONS_MODE: - ret = activation_count(wrapper, (tensor_input,), **kwargs) - else: - raise NotImplementedError("Count for mode {} is not supported yet.".format(mode)) - # compatible with change in fvcore - if isinstance(ret, tuple): - ret = ret[0] - model.train(old_train) - return ret - - -def find_unused_parameters(model: nn.Module, inputs: Any) -> List[str]: - """ - Given a model, find parameters that do not contribute - to the loss. - - Args: - model: a model in training mode that returns losses - inputs: argument or a tuple of arguments. Inputs of the model - - Returns: - list[str]: the name of unused parameters - """ - assert model.training - for _, prm in model.named_parameters(): - prm.grad = None - - if isinstance(inputs, tuple): - losses = model(*inputs) - else: - losses = model(inputs) - - if isinstance(losses, dict): - losses = sum(losses.values()) - losses.backward() - - unused: List[str] = [] - for name, prm in model.named_parameters(): - if prm.grad is None: - unused.append(name) - prm.grad = None - return unused diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/collect_env.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/collect_env.py deleted file mode 100755 index 807b6c7e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/collect_env.py +++ /dev/null @@ -1,242 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
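
For reference, the flop-counting entry point above can be exercised as in this sketch. The toy `nn.Sequential` model and the input shape are made up for illustration; real detectron2 models take `list[dict]` inputs, which `TracingAdapter` flattens:

```
import torch
from torch import nn
from detectron2.utils.analysis import FlopCountAnalysis

# Hypothetical toy network standing in for a detection model.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU()).eval()
inputs = (torch.randn(1, 3, 32, 32),)

flops = FlopCountAnalysis(model, inputs)
print(flops.total())        # total flop count for this single input
print(flops.by_operator())  # per-operator breakdown, e.g. Counter({'conv': ...})
```
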
-import importlib -import numpy as np -import os -import re -import subprocess -import sys -from collections import defaultdict -import PIL -import torch -import torchvision -from tabulate import tabulate - -__all__ = ["collect_env_info"] - - -def collect_torch_env(): - try: - import torch.__config__ - - return torch.__config__.show() - except ImportError: - # compatible with older versions of pytorch - from torch.utils.collect_env import get_pretty_env_info - - return get_pretty_env_info() - - -def get_env_module(): - var_name = "DETECTRON2_ENV_MODULE" - return var_name, os.environ.get(var_name, "") - - -def detect_compute_compatibility(CUDA_HOME, so_file): - try: - cuobjdump = os.path.join(CUDA_HOME, "bin", "cuobjdump") - if os.path.isfile(cuobjdump): - output = subprocess.check_output( - "'{}' --list-elf '{}'".format(cuobjdump, so_file), shell=True - ) - output = output.decode("utf-8").strip().split("\n") - arch = [] - for line in output: - line = re.findall(r"\.sm_([0-9]*)\.", line)[0] - arch.append(".".join(line)) - arch = sorted(set(arch)) - return ", ".join(arch) - else: - return so_file + "; cannot find cuobjdump" - except Exception: - # unhandled failure - return so_file - - -def collect_env_info(): - has_gpu = torch.cuda.is_available() # true for both CUDA & ROCM - torch_version = torch.__version__ - - # NOTE that CUDA_HOME/ROCM_HOME could be None even when CUDA runtime libs are functional - from torch.utils.cpp_extension import CUDA_HOME, ROCM_HOME - - has_rocm = False - if (getattr(torch.version, "hip", None) is not None) and (ROCM_HOME is not None): - has_rocm = True - has_cuda = has_gpu and (not has_rocm) - - data = [] - data.append(("sys.platform", sys.platform)) # check-template.yml depends on it - data.append(("Python", sys.version.replace("\n", ""))) - data.append(("numpy", np.__version__)) - - try: - import detectron2 # noqa - - data.append( - ("detectron2", detectron2.__version__ + " @" + os.path.dirname(detectron2.__file__)) - ) - except ImportError: - data.append(("detectron2", "failed to import")) - except AttributeError: - data.append(("detectron2", "imported a wrong installation")) - - try: - import detectron2._C as _C - except ImportError as e: - data.append(("detectron2._C", f"not built correctly: {e}")) - - # print system compilers when extension fails to build - if sys.platform != "win32": # don't know what to do for windows - try: - # this is how torch/utils/cpp_extensions.py choose compiler - cxx = os.environ.get("CXX", "c++") - cxx = subprocess.check_output("'{}' --version".format(cxx), shell=True) - cxx = cxx.decode("utf-8").strip().split("\n")[0] - except subprocess.SubprocessError: - cxx = "Not found" - data.append(("Compiler ($CXX)", cxx)) - - if has_cuda and CUDA_HOME is not None: - try: - nvcc = os.path.join(CUDA_HOME, "bin", "nvcc") - nvcc = subprocess.check_output("'{}' -V".format(nvcc), shell=True) - nvcc = nvcc.decode("utf-8").strip().split("\n")[-1] - except subprocess.SubprocessError: - nvcc = "Not found" - data.append(("CUDA compiler", nvcc)) - if has_cuda and sys.platform != "win32": - try: - so_file = importlib.util.find_spec("detectron2._C").origin - except (ImportError, AttributeError): - pass - else: - data.append( - ("detectron2 arch flags", detect_compute_compatibility(CUDA_HOME, so_file)) - ) - else: - # print compilers that are used to build extension - data.append(("Compiler", _C.get_compiler_version())) - data.append(("CUDA compiler", _C.get_cuda_version())) # cuda or hip - if has_cuda and getattr(_C, "has_cuda", lambda: True)(): - 
data.append( - ("detectron2 arch flags", detect_compute_compatibility(CUDA_HOME, _C.__file__)) - ) - - data.append(get_env_module()) - data.append(("PyTorch", torch_version + " @" + os.path.dirname(torch.__file__))) - data.append(("PyTorch debug build", torch.version.debug)) - - if not has_gpu: - has_gpu_text = "No: torch.cuda.is_available() == False" - else: - has_gpu_text = "Yes" - data.append(("GPU available", has_gpu_text)) - if has_gpu: - devices = defaultdict(list) - for k in range(torch.cuda.device_count()): - cap = ".".join((str(x) for x in torch.cuda.get_device_capability(k))) - name = torch.cuda.get_device_name(k) + f" (arch={cap})" - devices[name].append(str(k)) - for name, devids in devices.items(): - data.append(("GPU " + ",".join(devids), name)) - - if has_rocm: - msg = " - invalid!" if not (ROCM_HOME and os.path.isdir(ROCM_HOME)) else "" - data.append(("ROCM_HOME", str(ROCM_HOME) + msg)) - else: - try: - from torch.utils.collect_env import get_nvidia_driver_version, run as _run - - data.append(("Driver version", get_nvidia_driver_version(_run))) - except Exception: - pass - msg = " - invalid!" if not (CUDA_HOME and os.path.isdir(CUDA_HOME)) else "" - data.append(("CUDA_HOME", str(CUDA_HOME) + msg)) - - cuda_arch_list = os.environ.get("TORCH_CUDA_ARCH_LIST", None) - if cuda_arch_list: - data.append(("TORCH_CUDA_ARCH_LIST", cuda_arch_list)) - data.append(("Pillow", PIL.__version__)) - - try: - data.append( - ( - "torchvision", - str(torchvision.__version__) + " @" + os.path.dirname(torchvision.__file__), - ) - ) - if has_cuda: - try: - torchvision_C = importlib.util.find_spec("torchvision._C").origin - msg = detect_compute_compatibility(CUDA_HOME, torchvision_C) - data.append(("torchvision arch flags", msg)) - except (ImportError, AttributeError): - data.append(("torchvision._C", "Not found")) - except AttributeError: - data.append(("torchvision", "unknown")) - - try: - import fvcore - - data.append(("fvcore", fvcore.__version__)) - except (ImportError, AttributeError): - pass - - try: - import iopath - - data.append(("iopath", iopath.__version__)) - except (ImportError, AttributeError): - pass - - try: - import cv2 - - data.append(("cv2", cv2.__version__)) - except (ImportError, AttributeError): - data.append(("cv2", "Not found")) - env_str = tabulate(data) + "\n" - env_str += collect_torch_env() - return env_str - - -def test_nccl_ops(): - num_gpu = torch.cuda.device_count() - if os.access("/tmp", os.W_OK): - import torch.multiprocessing as mp - - dist_url = "file:///tmp/nccl_tmp_file" - print("Testing NCCL connectivity ... this should not hang.") - mp.spawn(_test_nccl_worker, nprocs=num_gpu, args=(num_gpu, dist_url), daemon=False) - print("NCCL succeeded.") - - -def _test_nccl_worker(rank, num_gpu, dist_url): - import torch.distributed as dist - - dist.init_process_group(backend="NCCL", init_method=dist_url, rank=rank, world_size=num_gpu) - dist.barrier(device_ids=[rank]) - - -if __name__ == "__main__": - try: - from detectron2.utils.collect_env import collect_env_info as f - - print(f()) - except ImportError: - print(collect_env_info()) - - if torch.cuda.is_available(): - num_gpu = torch.cuda.device_count() - for k in range(num_gpu): - device = f"cuda:{k}" - try: - x = torch.tensor([1, 2.0], dtype=torch.float32) - x = x.to(device) - except Exception as e: - print( - f"Unable to copy tensor to device={device}: {e}. " - "Your CUDA environment is broken." 
- ) - if num_gpu > 1: - test_nccl_ops() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/colormap.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/colormap.py deleted file mode 100755 index 150ccc37..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/colormap.py +++ /dev/null @@ -1,140 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -An awesome colormap for really neat visualizations. -Copied from Detectron, and removed gray colors. -""" - -import numpy as np - -__all__ = ["colormap", "random_color"] - -# fmt: off -# RGB: -_COLORS = np.array( - [ - 0.000, 0.447, 0.741, - 0.850, 0.325, 0.098, - 0.929, 0.694, 0.125, - 0.494, 0.184, 0.556, - 0.466, 0.674, 0.188, - 0.301, 0.745, 0.933, - 0.635, 0.078, 0.184, - 0.300, 0.300, 0.300, - 0.600, 0.600, 0.600, - 1.000, 0.000, 0.000, - 1.000, 0.500, 0.000, - 0.749, 0.749, 0.000, - 0.000, 1.000, 0.000, - 0.000, 0.000, 1.000, - 0.667, 0.000, 1.000, - 0.333, 0.333, 0.000, - 0.333, 0.667, 0.000, - 0.333, 1.000, 0.000, - 0.667, 0.333, 0.000, - 0.667, 0.667, 0.000, - 0.667, 1.000, 0.000, - 1.000, 0.333, 0.000, - 1.000, 0.667, 0.000, - 1.000, 1.000, 0.000, - 0.000, 0.333, 0.500, - 0.000, 0.667, 0.500, - 0.000, 1.000, 0.500, - 0.333, 0.000, 0.500, - 0.333, 0.333, 0.500, - 0.333, 0.667, 0.500, - 0.333, 1.000, 0.500, - 0.667, 0.000, 0.500, - 0.667, 0.333, 0.500, - 0.667, 0.667, 0.500, - 0.667, 1.000, 0.500, - 1.000, 0.000, 0.500, - 1.000, 0.333, 0.500, - 1.000, 0.667, 0.500, - 1.000, 1.000, 0.500, - 0.000, 0.333, 1.000, - 0.000, 0.667, 1.000, - 0.000, 1.000, 1.000, - 0.333, 0.000, 1.000, - 0.333, 0.333, 1.000, - 0.333, 0.667, 1.000, - 0.333, 1.000, 1.000, - 0.667, 0.000, 1.000, - 0.667, 0.333, 1.000, - 0.667, 0.667, 1.000, - 0.667, 1.000, 1.000, - 1.000, 0.000, 1.000, - 1.000, 0.333, 1.000, - 1.000, 0.667, 1.000, - 0.333, 0.000, 0.000, - 0.500, 0.000, 0.000, - 0.667, 0.000, 0.000, - 0.833, 0.000, 0.000, - 1.000, 0.000, 0.000, - 0.000, 0.167, 0.000, - 0.000, 0.333, 0.000, - 0.000, 0.500, 0.000, - 0.000, 0.667, 0.000, - 0.000, 0.833, 0.000, - 0.000, 1.000, 0.000, - 0.000, 0.000, 0.167, - 0.000, 0.000, 0.333, - 0.000, 0.000, 0.500, - 0.000, 0.000, 0.667, - 0.000, 0.000, 0.833, - 0.000, 0.000, 1.000, - 0.000, 0.000, 0.000, - 0.143, 0.143, 0.143, - 0.857, 0.857, 0.857, - 1.000, 1.000, 1.000 - ] -).astype(np.float32).reshape(-1, 3) -# fmt: on - - -def colormap(rgb=False, maximum=255): - """ - Args: - rgb (bool): whether to return RGB colors or BGR colors. - maximum (int): either 255 or 1 - - Returns: - ndarray: a float32 array of Nx3 colors, in range [0, 255] or [0, 1] - """ - assert maximum in [255, 1], maximum - c = _COLORS * maximum - if not rgb: - c = c[:, ::-1] - return c - - -def random_color(rgb=False, maximum=255): - """ - Args: - rgb (bool): whether to return RGB colors or BGR colors. 
- maximum (int): either 255 or 1 - - Returns: - ndarray: a vector of 3 numbers - """ - idx = np.random.randint(0, len(_COLORS)) - ret = _COLORS[idx] * maximum - if not rgb: - ret = ret[::-1] - return ret - - -if __name__ == "__main__": - import cv2 - - size = 100 - H, W = 10, 10 - canvas = np.random.rand(H * size, W * size, 3).astype("float32") - for h in range(H): - for w in range(W): - idx = h * W + w - if idx >= len(_COLORS): - break - canvas[h * size : (h + 1) * size, w * size : (w + 1) * size] = _COLORS[idx] - cv2.imshow("a", canvas) - cv2.waitKey(0) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/comm.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/comm.py deleted file mode 100755 index 7e2a0c44..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/comm.py +++ /dev/null @@ -1,199 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -This file contains primitives for multi-gpu communication. -This is useful when doing distributed training. -""" - -import functools -import numpy as np -import torch -import torch.distributed as dist - -_LOCAL_PROCESS_GROUP = None -""" -A torch process group which only includes processes that on the same machine as the current process. -This variable is set when processes are spawned by `launch()` in "engine/launch.py". -""" - - -def get_world_size() -> int: - if not dist.is_available(): - return 1 - if not dist.is_initialized(): - return 1 - return dist.get_world_size() - - -def get_rank() -> int: - if not dist.is_available(): - return 0 - if not dist.is_initialized(): - return 0 - return dist.get_rank() - - -def get_local_rank() -> int: - """ - Returns: - The rank of the current process within the local (per-machine) process group. - """ - if not dist.is_available(): - return 0 - if not dist.is_initialized(): - return 0 - assert ( - _LOCAL_PROCESS_GROUP is not None - ), "Local process group is not created! Please use launch() to spawn processes!" - return dist.get_rank(group=_LOCAL_PROCESS_GROUP) - - -def get_local_size() -> int: - """ - Returns: - The size of the per-machine process group, - i.e. the number of processes per machine. - """ - if not dist.is_available(): - return 1 - if not dist.is_initialized(): - return 1 - return dist.get_world_size(group=_LOCAL_PROCESS_GROUP) - - -def is_main_process() -> bool: - return get_rank() == 0 - - -def synchronize(): - """ - Helper function to synchronize (barrier) among all processes when - using distributed training - """ - if not dist.is_available(): - return - if not dist.is_initialized(): - return - world_size = dist.get_world_size() - if world_size == 1: - return - if dist.get_backend() == dist.Backend.NCCL: - # This argument is needed to avoid warnings. - # It's valid only for NCCL backend. - dist.barrier(device_ids=[torch.cuda.current_device()]) - else: - dist.barrier() - - -@functools.lru_cache() -def _get_global_gloo_group(): - """ - Return a process group based on gloo backend, containing all the ranks - The result is cached. - """ - if dist.get_backend() == "nccl": - return dist.new_group(backend="gloo") - else: - return dist.group.WORLD - - -def all_gather(data, group=None): - """ - Run all_gather on arbitrary picklable data (not necessarily tensors). - - Args: - data: any picklable object - group: a torch process group. By default, will use a group which - contains all ranks on gloo backend. 
- - Returns: - list[data]: list of data gathered from each rank - """ - if get_world_size() == 1: - return [data] - if group is None: - group = _get_global_gloo_group() # use CPU group by default, to reduce GPU RAM usage. - world_size = dist.get_world_size(group) - if world_size == 1: - return [data] - - output = [None for _ in range(world_size)] - dist.all_gather_object(output, data, group=group) - return output - - -def gather(data, dst=0, group=None): - """ - Run gather on arbitrary picklable data (not necessarily tensors). - - Args: - data: any picklable object - dst (int): destination rank - group: a torch process group. By default, will use a group which - contains all ranks on gloo backend. - - Returns: - list[data]: on dst, a list of data gathered from each rank. Otherwise, - an empty list. - """ - if get_world_size() == 1: - return [data] - if group is None: - group = _get_global_gloo_group() - world_size = dist.get_world_size(group=group) - if world_size == 1: - return [data] - rank = dist.get_rank(group=group) - - if rank == dst: - output = [None for _ in range(world_size)] - dist.gather_object(data, output, dst=dst, group=group) - return output - else: - dist.gather_object(data, None, dst=dst, group=group) - return [] - - -def shared_random_seed(): - """ - Returns: - int: a random number that is the same across all workers. - If workers need a shared RNG, they can use this shared seed to - create one. - - All workers must call this function, otherwise it will deadlock. - """ - ints = np.random.randint(2 ** 31) - all_ints = all_gather(ints) - return all_ints[0] - - -def reduce_dict(input_dict, average=True): - """ - Reduce the values in the dictionary from all processes so that process with rank - 0 has the reduced results. - - Args: - input_dict (dict): inputs to be reduced. All the values must be scalar CUDA Tensor. - average (bool): whether to do average or sum - - Returns: - a dict with the same keys as input_dict, after reduction. - """ - world_size = get_world_size() - if world_size < 2: - return input_dict - with torch.no_grad(): - names = [] - values = [] - # sort the keys so that they are consistent across processes - for k in sorted(input_dict.keys()): - names.append(k) - values.append(input_dict[k]) - values = torch.stack(values, dim=0) - dist.reduce(values, dst=0) - if dist.get_rank() == 0 and average: - # only main process gets accumulated, so only divide by - # world_size in this case - values /= world_size - reduced_dict = {k: v for k, v in zip(names, values)} - return reduced_dict diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/env.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/env.py deleted file mode 100755 index 40634c17..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/env.py +++ /dev/null @@ -1,170 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import importlib -import importlib.util -import logging -import numpy as np -import os -import random -import sys -from datetime import datetime -import torch - -__all__ = ["seed_all_rng"] - - -TORCH_VERSION = tuple(int(x) for x in torch.__version__.split(".")[:2]) -""" -PyTorch version as a tuple of 2 ints. Useful for comparison. -""" - - -DOC_BUILDING = os.getenv("_DOC_BUILDING", False) # set in docs/conf.py -""" -Whether we're building documentation. -""" - - -def seed_all_rng(seed=None): - """ - Set the random seed for the RNG in torch, numpy and python. 
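
As a sketch of the single-process fallback behavior of the communication helpers above (no process group initialized; `comm` refers to the `detectron2.utils.comm` module shown here):

```
import torch
from detectron2.utils import comm

# Without torch.distributed initialized, the helpers degrade gracefully:
print(comm.get_world_size())      # 1
print(comm.get_rank())            # 0
print(comm.is_main_process())     # True
print(comm.all_gather({"a": 1}))  # [{'a': 1}] -- just wraps the local data

# reduce_dict is a no-op when world_size < 2:
losses = {"loss_cls": torch.tensor(0.5), "loss_box": torch.tensor(0.2)}
print(comm.reduce_dict(losses))   # returned unchanged
```
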
- - Args: - seed (int): if None, will use a strong random seed. - """ - if seed is None: - seed = ( - os.getpid() - + int(datetime.now().strftime("%S%f")) - + int.from_bytes(os.urandom(2), "big") - ) - logger = logging.getLogger(__name__) - logger.info("Using a generated random seed {}".format(seed)) - np.random.seed(seed) - torch.manual_seed(seed) - random.seed(seed) - os.environ["PYTHONHASHSEED"] = str(seed) - - -# from https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path -def _import_file(module_name, file_path, make_importable=False): - spec = importlib.util.spec_from_file_location(module_name, file_path) - module = importlib.util.module_from_spec(spec) - spec.loader.exec_module(module) - if make_importable: - sys.modules[module_name] = module - return module - - -def _configure_libraries(): - """ - Configurations for some libraries. - """ - # An environment option to disable `import cv2` globally, - # in case it leads to negative performance impact - disable_cv2 = int(os.environ.get("DETECTRON2_DISABLE_CV2", False)) - if disable_cv2: - sys.modules["cv2"] = None - else: - # Disable opencl in opencv since its interaction with cuda often has negative effects - # This envvar is supported after OpenCV 3.4.0 - os.environ["OPENCV_OPENCL_RUNTIME"] = "disabled" - try: - import cv2 - - if int(cv2.__version__.split(".")[0]) >= 3: - cv2.ocl.setUseOpenCL(False) - except ModuleNotFoundError: - # Other types of ImportError, if happened, should not be ignored. - # Because a failed opencv import could mess up address space - # https://github.com/skvark/opencv-python/issues/381 - pass - - def get_version(module, digit=2): - return tuple(map(int, module.__version__.split(".")[:digit])) - - # fmt: off - assert get_version(torch) >= (1, 4), "Requires torch>=1.4" - import fvcore - assert get_version(fvcore, 3) >= (0, 1, 2), "Requires fvcore>=0.1.2" - import yaml - assert get_version(yaml) >= (5, 1), "Requires pyyaml>=5.1" - # fmt: on - - -_ENV_SETUP_DONE = False - - -def setup_environment(): - """Perform environment setup work. The default setup is a no-op, but this - function allows the user to specify a Python source file or a module in - the $DETECTRON2_ENV_MODULE environment variable, that performs - custom setup work that may be necessary to their computing environment. - """ - global _ENV_SETUP_DONE - if _ENV_SETUP_DONE: - return - _ENV_SETUP_DONE = True - - _configure_libraries() - - custom_module_path = os.environ.get("DETECTRON2_ENV_MODULE") - - if custom_module_path: - setup_custom_environment(custom_module_path) - else: - # The default setup is a no-op - pass - - -def setup_custom_environment(custom_module): - """ - Load custom environment setup by importing a Python source file or a - module, and run the setup function. - """ - if custom_module.endswith(".py"): - module = _import_file("detectron2.utils.env.custom_module", custom_module) - else: - module = importlib.import_module(custom_module) - assert hasattr(module, "setup_environment") and callable(module.setup_environment), ( - "Custom environment module defined in {} does not have the " - "required callable attribute 'setup_environment'." - ).format(custom_module) - module.setup_environment() - - -def fixup_module_metadata(module_name, namespace, keys=None): - """ - Fix the __qualname__ of module members to be their exported api name, so - when they are referenced in docs, sphinx can find them. 
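
The seeding helper above can be checked for reproducibility with a short sketch:

```
import random
import numpy as np
import torch
from detectron2.utils.env import seed_all_rng

# Seeding torch, numpy, and the stdlib `random` module together makes
# subsequent draws reproducible; seed=None derives a strong seed instead.
seed_all_rng(42)
draws = (np.random.rand(), random.random(), torch.rand(1).item())

seed_all_rng(42)
assert draws == (np.random.rand(), random.random(), torch.rand(1).item())
```
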
Reference: - https://github.com/python-trio/trio/blob/6754c74eacfad9cc5c92d5c24727a2f3b620624e/trio/_util.py#L216-L241 - """ - if not DOC_BUILDING: - return - seen_ids = set() - - def fix_one(qualname, name, obj): - # avoid infinite recursion (relevant when using - # typing.Generic, for example) - if id(obj) in seen_ids: - return - seen_ids.add(id(obj)) - - mod = getattr(obj, "__module__", None) - if mod is not None and (mod.startswith(module_name) or mod.startswith("fvcore.")): - obj.__module__ = module_name - # Modules, unlike everything else in Python, put fully-qualitied - # names into their __name__ attribute. We check for "." to avoid - # rewriting these. - if hasattr(obj, "__name__") and "." not in obj.__name__: - obj.__name__ = name - obj.__qualname__ = qualname - if isinstance(obj, type): - for attr_name, attr_value in obj.__dict__.items(): - fix_one(objname + "." + attr_name, attr_name, attr_value) - - if keys is None: - keys = namespace.keys() - for objname in keys: - if not objname.startswith("_"): - obj = namespace[objname] - fix_one(objname, objname, obj) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/events.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/events.py deleted file mode 100755 index 5dee954b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/events.py +++ /dev/null @@ -1,486 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import datetime -import json -import logging -import os -import time -from collections import defaultdict -from contextlib import contextmanager -from typing import Optional -import torch -from fvcore.common.history_buffer import HistoryBuffer - -from detectron2.utils.file_io import PathManager - -__all__ = [ - "get_event_storage", - "JSONWriter", - "TensorboardXWriter", - "CommonMetricPrinter", - "EventStorage", -] - -_CURRENT_STORAGE_STACK = [] - - -def get_event_storage(): - """ - Returns: - The :class:`EventStorage` object that's currently being used. - Throws an error if no :class:`EventStorage` is currently enabled. - """ - assert len( - _CURRENT_STORAGE_STACK - ), "get_event_storage() has to be called inside a 'with EventStorage(...)' context!" - return _CURRENT_STORAGE_STACK[-1] - - -class EventWriter: - """ - Base class for writers that obtain events from :class:`EventStorage` and process them. - """ - - def write(self): - raise NotImplementedError - - def close(self): - pass - - -class JSONWriter(EventWriter): - """ - Write scalars to a json file. - - It saves scalars as one json per line (instead of a big json) for easy parsing. - - Examples parsing such a json file: - :: - $ cat metrics.json | jq -s '.[0:2]' - [ - { - "data_time": 0.008433341979980469, - "iteration": 19, - "loss": 1.9228371381759644, - "loss_box_reg": 0.050025828182697296, - "loss_classifier": 0.5316952466964722, - "loss_mask": 0.7236229181289673, - "loss_rpn_box": 0.0856662318110466, - "loss_rpn_cls": 0.48198649287223816, - "lr": 0.007173333333333333, - "time": 0.25401854515075684 - }, - { - "data_time": 0.007216215133666992, - "iteration": 39, - "loss": 1.282649278640747, - "loss_box_reg": 0.06222952902317047, - "loss_classifier": 0.30682939291000366, - "loss_mask": 0.6970193982124329, - "loss_rpn_box": 0.038663312792778015, - "loss_rpn_cls": 0.1471673548221588, - "lr": 0.007706666666666667, - "time": 0.2490077018737793 - } - ] - - $ cat metrics.json | jq '.loss_mask' - 0.7126231789588928 - 0.689423680305481 - 0.6776131987571716 - ... 
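
A sketch of how such a metrics file might be produced (the `./metrics.json` path is made up; `JSONWriter` appends one JSON object per line, as parsed above):

```
from detectron2.utils.events import EventStorage, JSONWriter

writer = JSONWriter("./metrics.json", window_size=20)
with EventStorage(start_iter=0) as storage:
    for it in range(3):
        storage.put_scalar("loss", 1.0 - 0.1 * it)
        storage.step()
        writer.write()  # writes only scalars newer than the last write
writer.close()
```
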
- - """ - - def __init__(self, json_file, window_size=20): - """ - Args: - json_file (str): path to the json file. New data will be appended if the file exists. - window_size (int): the window size of median smoothing for the scalars whose - `smoothing_hint` are True. - """ - self._file_handle = PathManager.open(json_file, "a") - self._window_size = window_size - self._last_write = -1 - - def write(self): - storage = get_event_storage() - to_save = defaultdict(dict) - - for k, (v, iter) in storage.latest_with_smoothing_hint(self._window_size).items(): - # keep scalars that have not been written - if iter <= self._last_write: - continue - to_save[iter][k] = v - if len(to_save): - all_iters = sorted(to_save.keys()) - self._last_write = max(all_iters) - - for itr, scalars_per_iter in to_save.items(): - scalars_per_iter["iteration"] = itr - self._file_handle.write(json.dumps(scalars_per_iter, sort_keys=True) + "\n") - self._file_handle.flush() - try: - os.fsync(self._file_handle.fileno()) - except AttributeError: - pass - - def close(self): - self._file_handle.close() - - -class TensorboardXWriter(EventWriter): - """ - Write all scalars to a tensorboard file. - """ - - def __init__(self, log_dir: str, window_size: int = 20, **kwargs): - """ - Args: - log_dir (str): the directory to save the output events - window_size (int): the scalars will be median-smoothed by this window size - - kwargs: other arguments passed to `torch.utils.tensorboard.SummaryWriter(...)` - """ - self._window_size = window_size - from torch.utils.tensorboard import SummaryWriter - - self._writer = SummaryWriter(log_dir, **kwargs) - self._last_write = -1 - - def write(self): - storage = get_event_storage() - new_last_write = self._last_write - for k, (v, iter) in storage.latest_with_smoothing_hint(self._window_size).items(): - if iter > self._last_write: - self._writer.add_scalar(k, v, iter) - new_last_write = max(new_last_write, iter) - self._last_write = new_last_write - - # storage.put_{image,histogram} is only meant to be used by - # tensorboard writer. So we access its internal fields directly from here. - if len(storage._vis_data) >= 1: - for img_name, img, step_num in storage._vis_data: - self._writer.add_image(img_name, img, step_num) - # Storage stores all image data and rely on this writer to clear them. - # As a result it assumes only one writer will use its image data. - # An alternative design is to let storage store limited recent - # data (e.g. only the most recent image) that all writers can access. - # In that case a writer may not see all image data if its period is long. - storage.clear_images() - - if len(storage._histograms) >= 1: - for params in storage._histograms: - self._writer.add_histogram_raw(**params) - storage.clear_histograms() - - def close(self): - if hasattr(self, "_writer"): # doesn't exist when the code fails at import - self._writer.close() - - -class CommonMetricPrinter(EventWriter): - """ - Print **common** metrics to the terminal, including - iteration time, ETA, memory, all losses, and the learning rate. - It also applies smoothing using a window of 20 elements. - - It's meant to print common metrics in common ways. - To print something in more customized ways, please implement a similar printer by yourself. - """ - - def __init__(self, max_iter: Optional[int] = None, window_size: int = 20): - """ - Args: - max_iter: the maximum number of iterations to train. - Used to compute ETA. If not given, ETA will not be printed. 
- window_size (int): the losses will be median-smoothed by this window size - """ - self.logger = logging.getLogger(__name__) - self._max_iter = max_iter - self._window_size = window_size - self._last_write = None # (step, time) of last call to write(). Used to compute ETA - - def _get_eta(self, storage) -> Optional[str]: - if self._max_iter is None: - return "" - iteration = storage.iter - try: - eta_seconds = storage.history("time").median(1000) * (self._max_iter - iteration - 1) - storage.put_scalar("eta_seconds", eta_seconds, smoothing_hint=False) - return str(datetime.timedelta(seconds=int(eta_seconds))) - except KeyError: - # estimate eta on our own - more noisy - eta_string = None - if self._last_write is not None: - estimate_iter_time = (time.perf_counter() - self._last_write[1]) / ( - iteration - self._last_write[0] - ) - eta_seconds = estimate_iter_time * (self._max_iter - iteration - 1) - eta_string = str(datetime.timedelta(seconds=int(eta_seconds))) - self._last_write = (iteration, time.perf_counter()) - return eta_string - - def write(self): - storage = get_event_storage() - iteration = storage.iter - if iteration == self._max_iter: - # This hook only reports training progress (loss, ETA, etc) but not other data, - # therefore do not write anything after training succeeds, even if this method - # is called. - return - - try: - data_time = storage.history("data_time").avg(20) - except KeyError: - # they may not exist in the first few iterations (due to warmup) - # or when SimpleTrainer is not used - data_time = None - try: - iter_time = storage.history("time").global_avg() - except KeyError: - iter_time = None - try: - lr = "{:.5g}".format(storage.history("lr").latest()) - except KeyError: - lr = "N/A" - - eta_string = self._get_eta(storage) - - if torch.cuda.is_available(): - max_mem_mb = torch.cuda.max_memory_allocated() / 1024.0 / 1024.0 - else: - max_mem_mb = None - - # NOTE: max_mem is parsed by grep in "dev/parse_results.sh" - self.logger.info( - " {eta}iter: {iter} {losses} {time}{data_time}lr: {lr} {memory}".format( - eta=f"eta: {eta_string} " if eta_string else "", - iter=iteration, - losses=" ".join( - [ - "{}: {:.4g}".format(k, v.median(self._window_size)) - for k, v in storage.histories().items() - if "loss" in k - ] - ), - time="time: {:.4f} ".format(iter_time) if iter_time is not None else "", - data_time="data_time: {:.4f} ".format(data_time) if data_time is not None else "", - lr=lr, - memory="max_mem: {:.0f}M".format(max_mem_mb) if max_mem_mb is not None else "", - ) - ) - - -class EventStorage: - """ - The user-facing class that provides metric storage functionalities. - - In the future we may add support for storing / logging other types of data if needed. - """ - - def __init__(self, start_iter=0): - """ - Args: - start_iter (int): the iteration number to start with - """ - self._history = defaultdict(HistoryBuffer) - self._smoothing_hints = {} - self._latest_scalars = {} - self._iter = start_iter - self._current_prefix = "" - self._vis_data = [] - self._histograms = [] - - def put_image(self, img_name, img_tensor): - """ - Add an `img_tensor` associated with `img_name`, to be shown on - tensorboard. - - Args: - img_name (str): The name of the image to put into tensorboard. - img_tensor (torch.Tensor or numpy.array): An `uint8` or `float` - Tensor of shape `[channel, height, width]` where `channel` is - 3. The image format should be RGB. The elements in img_tensor - can either have values in [0, 1] (float32) or [0, 255] (uint8). 
- The `img_tensor` will be visualized in tensorboard. - """ - self._vis_data.append((img_name, img_tensor, self._iter)) - - def put_scalar(self, name, value, smoothing_hint=True): - """ - Add a scalar `value` to the `HistoryBuffer` associated with `name`. - - Args: - smoothing_hint (bool): a 'hint' on whether this scalar is noisy and should be - smoothed when logged. The hint will be accessible through - :meth:`EventStorage.smoothing_hints`. A writer may ignore the hint - and apply custom smoothing rule. - - It defaults to True because most scalars we save need to be smoothed to - provide any useful signal. - """ - name = self._current_prefix + name - history = self._history[name] - value = float(value) - history.update(value, self._iter) - self._latest_scalars[name] = (value, self._iter) - - existing_hint = self._smoothing_hints.get(name) - if existing_hint is not None: - assert ( - existing_hint == smoothing_hint - ), "Scalar {} was put with a different smoothing_hint!".format(name) - else: - self._smoothing_hints[name] = smoothing_hint - - def put_scalars(self, *, smoothing_hint=True, **kwargs): - """ - Put multiple scalars from keyword arguments. - - Examples: - - storage.put_scalars(loss=my_loss, accuracy=my_accuracy, smoothing_hint=True) - """ - for k, v in kwargs.items(): - self.put_scalar(k, v, smoothing_hint=smoothing_hint) - - def put_histogram(self, hist_name, hist_tensor, bins=1000): - """ - Create a histogram from a tensor. - - Args: - hist_name (str): The name of the histogram to put into tensorboard. - hist_tensor (torch.Tensor): A Tensor of arbitrary shape to be converted - into a histogram. - bins (int): Number of histogram bins. - """ - ht_min, ht_max = hist_tensor.min().item(), hist_tensor.max().item() - - # Create a histogram with PyTorch - hist_counts = torch.histc(hist_tensor, bins=bins) - hist_edges = torch.linspace(start=ht_min, end=ht_max, steps=bins + 1, dtype=torch.float32) - - # Parameter for the add_histogram_raw function of SummaryWriter - hist_params = dict( - tag=hist_name, - min=ht_min, - max=ht_max, - num=len(hist_tensor), - sum=float(hist_tensor.sum()), - sum_squares=float(torch.sum(hist_tensor ** 2)), - bucket_limits=hist_edges[1:].tolist(), - bucket_counts=hist_counts.tolist(), - global_step=self._iter, - ) - self._histograms.append(hist_params) - - def history(self, name): - """ - Returns: - HistoryBuffer: the scalar history for name - """ - ret = self._history.get(name, None) - if ret is None: - raise KeyError("No history metric available for {}!".format(name)) - return ret - - def histories(self): - """ - Returns: - dict[name -> HistoryBuffer]: the HistoryBuffer for all scalars - """ - return self._history - - def latest(self): - """ - Returns: - dict[str -> (float, int)]: mapping from the name of each scalar to the most - recent value and the iteration number its added. - """ - return self._latest_scalars - - def latest_with_smoothing_hint(self, window_size=20): - """ - Similar to :meth:`latest`, but the returned values - are either the un-smoothed original latest value, - or a median of the given window_size, - depend on whether the smoothing_hint is True. - - This provides a default behavior that other writers can use. 
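
For instance, a sketch of the difference the smoothing hint makes when reading values back:

```
from detectron2.utils.events import EventStorage

with EventStorage(start_iter=0) as storage:
    for i in range(5):
        storage.put_scalar("loss", 1.0 / (i + 1))             # smoothed (default)
        storage.put_scalar("lr", 0.01, smoothing_hint=False)  # raw latest value
        storage.step()
    # "loss" comes back as a window median, "lr" as the raw latest value.
    print(storage.latest_with_smoothing_hint(window_size=3))
```
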
- """ - result = {} - for k, (v, itr) in self._latest_scalars.items(): - result[k] = ( - self._history[k].median(window_size) if self._smoothing_hints[k] else v, - itr, - ) - return result - - def smoothing_hints(self): - """ - Returns: - dict[name -> bool]: the user-provided hint on whether the scalar - is noisy and needs smoothing. - """ - return self._smoothing_hints - - def step(self): - """ - User should either: (1) Call this function to increment storage.iter when needed. Or - (2) Set `storage.iter` to the correct iteration number before each iteration. - - The storage will then be able to associate the new data with an iteration number. - """ - self._iter += 1 - - @property - def iter(self): - """ - Returns: - int: The current iteration number. When used together with a trainer, - this is ensured to be the same as trainer.iter. - """ - return self._iter - - @iter.setter - def iter(self, val): - self._iter = int(val) - - @property - def iteration(self): - # for backward compatibility - return self._iter - - def __enter__(self): - _CURRENT_STORAGE_STACK.append(self) - return self - - def __exit__(self, exc_type, exc_val, exc_tb): - assert _CURRENT_STORAGE_STACK[-1] == self - _CURRENT_STORAGE_STACK.pop() - - @contextmanager - def name_scope(self, name): - """ - Yields: - A context within which all the events added to this storage - will be prefixed by the name scope. - """ - old_prefix = self._current_prefix - self._current_prefix = name.rstrip("/") + "/" - yield - self._current_prefix = old_prefix - - def clear_images(self): - """ - Delete all the stored images for visualization. This should be called - after images are written to tensorboard. - """ - self._vis_data = [] - - def clear_histograms(self): - """ - Delete all the stored histograms for visualization. - This should be called after histograms are written to tensorboard. - """ - self._histograms = [] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/file_io.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/file_io.py deleted file mode 100755 index 46ee4ec3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/file_io.py +++ /dev/null @@ -1,37 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from iopath.common.file_io import HTTPURLHandler, OneDrivePathHandler, PathHandler -from iopath.common.file_io import PathManager as PathManagerBase - -__all__ = ["PathManager", "PathHandler"] - - -PathManager = PathManagerBase() -""" -This is a detectron2 project-specific PathManager. -We try to stay away from global PathManager in fvcore as it -introduces potential conflicts among other libraries. -""" - - -class Detectron2Handler(PathHandler): - """ - Resolve anything that's hosted under detectron2's namespace. 
- """ - - PREFIX = "detectron2://" - S3_DETECTRON2_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - - def _get_supported_prefixes(self): - return [self.PREFIX] - - def _get_local_path(self, path, **kwargs): - name = path[len(self.PREFIX) :] - return PathManager.get_local_path(self.S3_DETECTRON2_PREFIX + name, **kwargs) - - def _open(self, path, mode="r", **kwargs): - return PathManager.open(self._get_local_path(path), mode, **kwargs) - - -PathManager.register_handler(HTTPURLHandler()) -PathManager.register_handler(OneDrivePathHandler()) -PathManager.register_handler(Detectron2Handler()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/logger.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/logger.py deleted file mode 100755 index 7c7890f8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/logger.py +++ /dev/null @@ -1,237 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import atexit -import functools -import logging -import os -import sys -import time -from collections import Counter -import torch -from tabulate import tabulate -from termcolor import colored - -from detectron2.utils.file_io import PathManager - -__all__ = ["setup_logger", "log_first_n", "log_every_n", "log_every_n_seconds"] - - -class _ColorfulFormatter(logging.Formatter): - def __init__(self, *args, **kwargs): - self._root_name = kwargs.pop("root_name") + "." - self._abbrev_name = kwargs.pop("abbrev_name", "") - if len(self._abbrev_name): - self._abbrev_name = self._abbrev_name + "." - super(_ColorfulFormatter, self).__init__(*args, **kwargs) - - def formatMessage(self, record): - record.name = record.name.replace(self._root_name, self._abbrev_name) - log = super(_ColorfulFormatter, self).formatMessage(record) - if record.levelno == logging.WARNING: - prefix = colored("WARNING", "red", attrs=["blink"]) - elif record.levelno == logging.ERROR or record.levelno == logging.CRITICAL: - prefix = colored("ERROR", "red", attrs=["blink", "underline"]) - else: - return log - return prefix + " " + log - - -@functools.lru_cache() # so that calling setup_logger multiple times won't add many handlers -def setup_logger( - output=None, distributed_rank=0, *, color=True, name="detectron2", abbrev_name=None -): - """ - Initialize the detectron2 logger and set its verbosity level to "DEBUG". - - Args: - output (str): a file name or a directory to save log. If None, will not save log file. - If ends with ".txt" or ".log", assumed to be a file name. - Otherwise, logs will be saved to `output/log.txt`. - name (str): the root module name of this logger - abbrev_name (str): an abbreviation of the module, to avoid long names in logs. - Set to "" to not log the root module in logs. - By default, will abbreviate "detectron2" to "d2" and leave other - modules unchanged. 
- - Returns: - logging.Logger: a logger - """ - logger = logging.getLogger(name) - logger.setLevel(logging.DEBUG) - logger.propagate = False - - if abbrev_name is None: - abbrev_name = "d2" if name == "detectron2" else name - - plain_formatter = logging.Formatter( - "[%(asctime)s] %(name)s %(levelname)s: %(message)s", datefmt="%m/%d %H:%M:%S" - ) - # stdout logging: master only - if distributed_rank == 0: - ch = logging.StreamHandler(stream=sys.stdout) - ch.setLevel(logging.DEBUG) - if color: - formatter = _ColorfulFormatter( - colored("[%(asctime)s %(name)s]: ", "green") + "%(message)s", - datefmt="%m/%d %H:%M:%S", - root_name=name, - abbrev_name=str(abbrev_name), - ) - else: - formatter = plain_formatter - ch.setFormatter(formatter) - logger.addHandler(ch) - - # file logging: all workers - if output is not None: - if output.endswith(".txt") or output.endswith(".log"): - filename = output - else: - filename = os.path.join(output, "log.txt") - if distributed_rank > 0: - filename = filename + ".rank{}".format(distributed_rank) - PathManager.mkdirs(os.path.dirname(filename)) - - fh = logging.StreamHandler(_cached_log_stream(filename)) - fh.setLevel(logging.DEBUG) - fh.setFormatter(plain_formatter) - logger.addHandler(fh) - - return logger - - -# cache the opened file object, so that different calls to `setup_logger` -# with the same file name can safely write to the same file. -@functools.lru_cache(maxsize=None) -def _cached_log_stream(filename): - # use 1K buffer if writing to cloud storage - io = PathManager.open(filename, "a", buffering=1024 if "://" in filename else -1) - atexit.register(io.close) - return io - - -""" -Below are some other convenient logging methods. -They are mainly adopted from -https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py -""" - - -def _find_caller(): - """ - Returns: - str: module name of the caller - tuple: a hashable key to be used to identify different callers - """ - frame = sys._getframe(2) - while frame: - code = frame.f_code - if os.path.join("utils", "logger.") not in code.co_filename: - mod_name = frame.f_globals["__name__"] - if mod_name == "__main__": - mod_name = "detectron2" - return mod_name, (code.co_filename, frame.f_lineno, code.co_name) - frame = frame.f_back - - -_LOG_COUNTER = Counter() -_LOG_TIMER = {} - - -def log_first_n(lvl, msg, n=1, *, name=None, key="caller"): - """ - Log only for the first n times. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - key (str or tuple[str]): the string(s) can be one of "caller" or - "message", which defines how to identify duplicated logs. - For example, if called with `n=1, key="caller"`, this function - will only log the first call from the same caller, regardless of - the message content. - If called with `n=1, key="message"`, this function will log the - same content only once, even if they are called from different places. - If called with `n=1, key=("caller", "message")`, this function - will not log only if the same caller has logged the same message before. 
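
# A minimal sketch of setup_logger as defined above; the output directory,
# project name, and abbreviation are placeholders.
from detectron2.utils.logger import setup_logger

logger = setup_logger(output="./logs", name="myproject", abbrev_name="mp")
logger.info("to stdout on rank 0, and to ./logs/log.txt "
            "(rank-suffixed on other workers)")
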
- """ - if isinstance(key, str): - key = (key,) - assert len(key) > 0 - - caller_module, caller_key = _find_caller() - hash_key = () - if "caller" in key: - hash_key = hash_key + caller_key - if "message" in key: - hash_key = hash_key + (msg,) - - _LOG_COUNTER[hash_key] += 1 - if _LOG_COUNTER[hash_key] <= n: - logging.getLogger(name or caller_module).log(lvl, msg) - - -def log_every_n(lvl, msg, n=1, *, name=None): - """ - Log once per n times. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - """ - caller_module, key = _find_caller() - _LOG_COUNTER[key] += 1 - if n == 1 or _LOG_COUNTER[key] % n == 1: - logging.getLogger(name or caller_module).log(lvl, msg) - - -def log_every_n_seconds(lvl, msg, n=1, *, name=None): - """ - Log no more than once per n seconds. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - """ - caller_module, key = _find_caller() - last_logged = _LOG_TIMER.get(key, None) - current_time = time.time() - if last_logged is None or current_time - last_logged >= n: - logging.getLogger(name or caller_module).log(lvl, msg) - _LOG_TIMER[key] = current_time - - -def create_small_table(small_dict): - """ - Create a small table using the keys of small_dict as headers. This is only - suitable for small dictionaries. - - Args: - small_dict (dict): a result dictionary of only a few items. - - Returns: - str: the table as a string. - """ - keys, values = tuple(zip(*small_dict.items())) - table = tabulate( - [values], - headers=keys, - tablefmt="pipe", - floatfmt=".3f", - stralign="center", - numalign="center", - ) - return table - - -def _log_api_usage(identifier: str): - """ - Internal function used to log the usage of different detectron2 components - inside facebook's infra. - """ - torch._C._log_api_usage_once("detectron2." + identifier) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/memory.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/memory.py deleted file mode 100755 index bd494780..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/memory.py +++ /dev/null @@ -1,84 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -from contextlib import contextmanager -from functools import wraps -import torch - -__all__ = ["retry_if_cuda_oom"] - - -@contextmanager -def _ignore_torch_cuda_oom(): - """ - A context which ignores CUDA OOM exception from pytorch. - """ - try: - yield - except RuntimeError as e: - # NOTE: the string may change? - if "CUDA out of memory. " in str(e): - pass - else: - raise - - -def retry_if_cuda_oom(func): - """ - Makes a function retry itself after encountering - pytorch's CUDA OOM error. - It will first retry after calling `torch.cuda.empty_cache()`. - - If that still fails, it will then retry by trying to convert inputs to CPUs. - In this case, it expects the function to dispatch to CPU implementation. - The return values may become CPU tensors as well and it's user's - responsibility to convert it back to CUDA tensor if needed. - - Args: - func: a stateless callable that takes tensor-like objects as arguments - - Returns: - a callable which retries `func` if OOM is encountered. 
- - Examples: - :: - output = retry_if_cuda_oom(some_torch_function)(input1, input2) - # output may be on CPU even if inputs are on GPU - - Note: - 1. When converting inputs to CPU, it will only look at each argument and check - if it has `.device` and `.to` for conversion. Nested structures of tensors - are not supported. - - 2. Since the function might be called more than once, it has to be - stateless. - """ - - def maybe_to_cpu(x): - try: - like_gpu_tensor = x.device.type == "cuda" and hasattr(x, "to") - except AttributeError: - like_gpu_tensor = False - if like_gpu_tensor: - return x.to(device="cpu") - else: - return x - - @wraps(func) - def wrapped(*args, **kwargs): - with _ignore_torch_cuda_oom(): - return func(*args, **kwargs) - - # Clear cache and retry - torch.cuda.empty_cache() - with _ignore_torch_cuda_oom(): - return func(*args, **kwargs) - - # Try on CPU. This slows down the code significantly, therefore print a notice. - logger = logging.getLogger(__name__) - logger.info("Attempting to copy inputs of {} to CPU due to CUDA OOM".format(str(func))) - new_args = (maybe_to_cpu(x) for x in args) - new_kwargs = {k: maybe_to_cpu(v) for k, v in kwargs.items()} - return func(*new_args, **new_kwargs) - - return wrapped diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/registry.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/registry.py deleted file mode 100755 index 4b01e900..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/registry.py +++ /dev/null @@ -1,60 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from typing import Any -import pydoc -from fvcore.common.registry import Registry # for backward compatibility. - -""" -``Registry`` and `locate` provide ways to map a string (typically found -in config files) to callable objects. -""" - -__all__ = ["Registry", "locate"] - - -def _convert_target_to_string(t: Any) -> str: - """ - Inverse of ``locate()``. - - Args: - t: any object with ``__module__`` and ``__qualname__`` - """ - module, qualname = t.__module__, t.__qualname__ - - # Compress the path to this object, e.g. ``module.submodule._impl.class`` - # may become ``module.submodule.class``, if the later also resolves to the same - # object. This simplifies the string, and also is less affected by moving the - # class implementation. - module_parts = module.split(".") - for k in range(1, len(module_parts)): - prefix = ".".join(module_parts[:k]) - candidate = f"{prefix}.{qualname}" - try: - if locate(candidate) is t: - return candidate - except ImportError: - pass - return f"{module}.{qualname}" - - -def locate(name: str) -> Any: - """ - Locate and return an object ``x`` using an input string ``{x.__module__}.{x.__qualname__}``, - such as "module.submodule.class_name". - - Raise Exception if it cannot be found. - """ - obj = pydoc.locate(name) - - # Some cases (e.g. torch.optim.sgd.SGD) not handled correctly - # by pydoc.locate. Try a private function from hydra. 
- if obj is None: - try: - # from hydra.utils import get_method - will print many errors - from hydra.utils import _locate - except ImportError as e: - raise ImportError(f"Cannot dynamically locate object {name}!") from e - else: - obj = _locate(name) # it raises if fails - - return obj diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/serialize.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/serialize.py deleted file mode 100755 index 0b388628..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/serialize.py +++ /dev/null @@ -1,32 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import cloudpickle - - -class PicklableWrapper(object): - """ - Wrap an object to make it more picklable, note that it uses - heavy weight serialization libraries that are slower than pickle. - It's best to use it only on closures (which are usually not picklable). - - This is a simplified version of - https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py - """ - - def __init__(self, obj): - while isinstance(obj, PicklableWrapper): - # Wrapping an object twice is no-op - obj = obj._obj - self._obj = obj - - def __reduce__(self): - s = cloudpickle.dumps(self._obj) - return cloudpickle.loads, (s,) - - def __call__(self, *args, **kwargs): - return self._obj(*args, **kwargs) - - def __getattr__(self, attr): - # Ensure that the wrapped object can be used seamlessly as the previous object. - if attr not in ["_obj"]: - return getattr(self._obj, attr) - return getattr(self, attr) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/testing.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/testing.py deleted file mode 100755 index 161fa6b8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/testing.py +++ /dev/null @@ -1,137 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import io -import numpy as np -import torch - -from detectron2 import model_zoo -from detectron2.data import DatasetCatalog -from detectron2.data.detection_utils import read_image -from detectron2.modeling import build_model -from detectron2.structures import Boxes, Instances, ROIMasks -from detectron2.utils.file_io import PathManager - - -""" -Internal utilities for tests. Don't use except for writing tests. -""" - - -def get_model_no_weights(config_path): - """ - Like model_zoo.get, but do not load any weights (even pretrained) - """ - cfg = model_zoo.get_config(config_path) - if not torch.cuda.is_available(): - cfg.MODEL.DEVICE = "cpu" - return build_model(cfg) - - -def random_boxes(num_boxes, max_coord=100, device="cpu"): - """ - Create a random Nx4 boxes tensor, with coordinates < max_coord. - """ - boxes = torch.rand(num_boxes, 4, device=device) * (max_coord * 0.5) - boxes.clamp_(min=1.0) # tiny boxes cause numerical instability in box regression - # Note: the implementation of this function in torchvision is: - # boxes[:, 2:] += torch.rand(N, 2) * 100 - # but it does not guarantee non-negative widths/heights constraints: - # boxes[:, 2] >= boxes[:, 0] and boxes[:, 3] >= boxes[:, 1]: - boxes[:, 2:] += boxes[:, :2] - return boxes - - -def get_sample_coco_image(tensor=True): - """ - Args: - tensor (bool): if True, returns 3xHxW tensor. - else, returns a HxWx3 numpy array. - - Returns: - an image, in BGR color. 
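
# A minimal sketch of locate()/_convert_target_to_string and PicklableWrapper
# from the two files above; requires torch and cloudpickle.
import pickle
import torch
from detectron2.utils.registry import _convert_target_to_string, locate
from detectron2.utils.serialize import PicklableWrapper

assert locate("torch.nn.Conv2d") is torch.nn.Conv2d
print(_convert_target_to_string(torch.nn.Conv2d))  # compressed back to the same

scale = 10
fn = PicklableWrapper(lambda x: x * scale)  # plain pickle would reject a closure
fn2 = pickle.loads(pickle.dumps(fn))        # round-trips via cloudpickle
assert fn2(3) == 30
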
- """ - try: - file_name = DatasetCatalog.get("coco_2017_val_100")[0]["file_name"] - if not PathManager.exists(file_name): - raise FileNotFoundError() - except IOError: - # for public CI to run - file_name = PathManager.get_local_path( - "http://images.cocodataset.org/train2017/000000000009.jpg" - ) - ret = read_image(file_name, format="BGR") - if tensor: - ret = torch.from_numpy(np.ascontiguousarray(ret.transpose(2, 0, 1))) - return ret - - -def convert_scripted_instances(instances): - """ - Convert a scripted Instances object to a regular :class:`Instances` object - """ - assert hasattr( - instances, "image_size" - ), f"Expect an Instances object, but got {type(instances)}!" - ret = Instances(instances.image_size) - for name in instances._field_names: - val = getattr(instances, "_" + name, None) - if val is not None: - ret.set(name, val) - return ret - - -def assert_instances_allclose(input, other, *, rtol=1e-5, msg="", size_as_tensor=False): - """ - Args: - input, other (Instances): - size_as_tensor: compare image_size of the Instances as tensors (instead of tuples). - Useful for comparing outputs of tracing. - """ - if not isinstance(input, Instances): - input = convert_scripted_instances(input) - if not isinstance(other, Instances): - other = convert_scripted_instances(other) - - if not msg: - msg = "Two Instances are different! " - else: - msg = msg.rstrip() + " " - - size_error_msg = msg + f"image_size is {input.image_size} vs. {other.image_size}!" - if size_as_tensor: - assert torch.equal( - torch.tensor(input.image_size), torch.tensor(other.image_size) - ), size_error_msg - else: - assert input.image_size == other.image_size, size_error_msg - fields = sorted(input.get_fields().keys()) - fields_other = sorted(other.get_fields().keys()) - assert fields == fields_other, msg + f"Fields are {fields} vs {fields_other}!" - - for f in fields: - val1, val2 = input.get(f), other.get(f) - if isinstance(val1, (Boxes, ROIMasks)): - # boxes in the range of O(100) and can have a larger tolerance - assert torch.allclose(val1.tensor, val2.tensor, atol=100 * rtol), ( - msg + f"Field {f} differs too much!" - ) - elif isinstance(val1, torch.Tensor): - if val1.dtype.is_floating_point: - mag = torch.abs(val1).max().cpu().item() - assert torch.allclose(val1, val2, atol=mag * rtol), ( - msg + f"Field {f} differs too much!" - ) - else: - assert torch.equal(val1, val2), msg + f"Field {f} is different!" - else: - raise ValueError(f"Don't know how to compare type {type(val1)}") - - -def reload_script_model(module): - """ - Save a jit module and load it back. - Similar to the `getExportImportCopy` function in torch/testing/ - """ - buffer = io.BytesIO() - torch.jit.save(module, buffer) - buffer.seek(0) - return torch.jit.load(buffer) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/video_visualizer.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/video_visualizer.py deleted file mode 100755 index 9d8a366d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/video_visualizer.py +++ /dev/null @@ -1,252 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import numpy as np -import pycocotools.mask as mask_util - -from detectron2.utils.visualizer import ( - ColorMode, - Visualizer, - _create_text_labels, - _PanopticPrediction, -) - -from .colormap import random_color - - -class _DetectedInstance: - """ - Used to store data about detected objects in video frame, - in order to transfer color to objects in the future frames. - - Attributes: - label (int): - bbox (tuple[float]): - mask_rle (dict): - color (tuple[float]): RGB colors in range (0, 1) - ttl (int): time-to-live for the instance. For example, if ttl=2, - the instance color can be transferred to objects in the next two frames. - """ - - __slots__ = ["label", "bbox", "mask_rle", "color", "ttl"] - - def __init__(self, label, bbox, mask_rle, color, ttl): - self.label = label - self.bbox = bbox - self.mask_rle = mask_rle - self.color = color - self.ttl = ttl - - -class VideoVisualizer: - def __init__(self, metadata, instance_mode=ColorMode.IMAGE): - """ - Args: - metadata (MetadataCatalog): image metadata. - """ - self.metadata = metadata - self._old_instances = [] - assert instance_mode in [ - ColorMode.IMAGE, - ColorMode.IMAGE_BW, - ], "Other mode not supported yet." - self._instance_mode = instance_mode - - def draw_instance_predictions(self, frame, predictions): - """ - Draw instance-level prediction results on an image. - - Args: - frame (ndarray): an RGB image of shape (H, W, C), in the range [0, 255]. - predictions (Instances): the output of an instance detection/segmentation - model. Following fields will be used to draw: - "pred_boxes", "pred_classes", "scores", "pred_masks" (or "pred_masks_rle"). - - Returns: - output (VisImage): image object with visualizations. - """ - frame_visualizer = Visualizer(frame, self.metadata) - num_instances = len(predictions) - if num_instances == 0: - return frame_visualizer.output - - boxes = predictions.pred_boxes.tensor.numpy() if predictions.has("pred_boxes") else None - scores = predictions.scores if predictions.has("scores") else None - classes = predictions.pred_classes.numpy() if predictions.has("pred_classes") else None - keypoints = predictions.pred_keypoints if predictions.has("pred_keypoints") else None - colors = predictions.COLOR if predictions.has("COLOR") else [None] * len(predictions) - durations = predictions.ID_duration if predictions.has("ID_duration") else None - duration_threshold = self.metadata.get("duration_threshold", 0) - visibilities = None if durations is None else [x > duration_threshold for x in durations] - - if predictions.has("pred_masks"): - masks = predictions.pred_masks - # mask IOU is not yet enabled - # masks_rles = mask_util.encode(np.asarray(masks.permute(1, 2, 0), order="F")) - # assert len(masks_rles) == num_instances - else: - masks = None - - detected = [ - _DetectedInstance(classes[i], boxes[i], mask_rle=None, color=colors[i], ttl=8) - for i in range(num_instances) - ] - if not predictions.has("COLOR"): - colors = self._assign_colors(detected) - - labels = _create_text_labels(classes, scores, self.metadata.get("thing_classes", None)) - - if self._instance_mode == ColorMode.IMAGE_BW: - # any() returns uint8 tensor - frame_visualizer.output.reset_image( - frame_visualizer._create_grayscale_image( - (masks.any(dim=0) > 0).numpy() if masks is not None else None - ) - ) - alpha = 0.3 - else: - alpha = 0.5 - - labels = ( - None - if labels is None - else [y[0] for y in filter(lambda x: x[1], zip(labels, visibilities))] - ) # noqa - assigned_colors = ( - None - if colors is None - else [y[0] for y in 
filter(lambda x: x[1], zip(colors, visibilities))] - ) # noqa - frame_visualizer.overlay_instances( - boxes=None if masks is not None else boxes[visibilities], # boxes are a bit distracting - masks=None if masks is None else masks[visibilities], - labels=labels, - keypoints=None if keypoints is None else keypoints[visibilities], - assigned_colors=assigned_colors, - alpha=alpha, - ) - - return frame_visualizer.output - - def draw_sem_seg(self, frame, sem_seg, area_threshold=None): - """ - Args: - sem_seg (ndarray or Tensor): semantic segmentation of shape (H, W), - each value is the integer label. - area_threshold (Optional[int]): only draw segmentations larger than the threshold - """ - # don't need to do anything special - frame_visualizer = Visualizer(frame, self.metadata) - frame_visualizer.draw_sem_seg(sem_seg, area_threshold=None) - return frame_visualizer.output - - def draw_panoptic_seg_predictions( - self, frame, panoptic_seg, segments_info, area_threshold=None, alpha=0.5 - ): - frame_visualizer = Visualizer(frame, self.metadata) - pred = _PanopticPrediction(panoptic_seg, segments_info, self.metadata) - - if self._instance_mode == ColorMode.IMAGE_BW: - frame_visualizer.output.reset_image( - frame_visualizer._create_grayscale_image(pred.non_empty_mask()) - ) - - # draw mask for all semantic segments first i.e. "stuff" - for mask, sinfo in pred.semantic_masks(): - category_idx = sinfo["category_id"] - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[category_idx]] - except AttributeError: - mask_color = None - - frame_visualizer.draw_binary_mask( - mask, - color=mask_color, - text=self.metadata.stuff_classes[category_idx], - alpha=alpha, - area_threshold=area_threshold, - ) - - all_instances = list(pred.instance_masks()) - if len(all_instances) == 0: - return frame_visualizer.output - # draw mask for all instances second - masks, sinfo = list(zip(*all_instances)) - num_instances = len(masks) - masks_rles = mask_util.encode( - np.asarray(np.asarray(masks).transpose(1, 2, 0), dtype=np.uint8, order="F") - ) - assert len(masks_rles) == num_instances - - category_ids = [x["category_id"] for x in sinfo] - detected = [ - _DetectedInstance(category_ids[i], bbox=None, mask_rle=masks_rles[i], color=None, ttl=8) - for i in range(num_instances) - ] - colors = self._assign_colors(detected) - labels = [self.metadata.thing_classes[k] for k in category_ids] - - frame_visualizer.overlay_instances( - boxes=None, - masks=masks, - labels=labels, - keypoints=None, - assigned_colors=colors, - alpha=alpha, - ) - return frame_visualizer.output - - def _assign_colors(self, instances): - """ - Naive tracking heuristics to assign same color to the same instance, - will update the internal state of tracked instances. - - Returns: - list[tuple[float]]: list of colors. 
- """ - - # Compute iou with either boxes or masks: - is_crowd = np.zeros((len(instances),), dtype=np.bool) - if instances[0].bbox is None: - assert instances[0].mask_rle is not None - # use mask iou only when box iou is None - # because box seems good enough - rles_old = [x.mask_rle for x in self._old_instances] - rles_new = [x.mask_rle for x in instances] - ious = mask_util.iou(rles_old, rles_new, is_crowd) - threshold = 0.5 - else: - boxes_old = [x.bbox for x in self._old_instances] - boxes_new = [x.bbox for x in instances] - ious = mask_util.iou(boxes_old, boxes_new, is_crowd) - threshold = 0.6 - if len(ious) == 0: - ious = np.zeros((len(self._old_instances), len(instances)), dtype="float32") - - # Only allow matching instances of the same label: - for old_idx, old in enumerate(self._old_instances): - for new_idx, new in enumerate(instances): - if old.label != new.label: - ious[old_idx, new_idx] = 0 - - matched_new_per_old = np.asarray(ious).argmax(axis=1) - max_iou_per_old = np.asarray(ious).max(axis=1) - - # Try to find match for each old instance: - extra_instances = [] - for idx, inst in enumerate(self._old_instances): - if max_iou_per_old[idx] > threshold: - newidx = matched_new_per_old[idx] - if instances[newidx].color is None: - instances[newidx].color = inst.color - continue - # If an old instance does not match any new instances, - # keep it for the next frame in case it is just missed by the detector - inst.ttl -= 1 - if inst.ttl > 0: - extra_instances.append(inst) - - # Assign random color to newly-detected instances: - for inst in instances: - if inst.color is None: - inst.color = random_color(rgb=True, maximum=1) - self._old_instances = instances[:] + extra_instances - return [d.color for d in instances] diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/visualizer.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/visualizer.py deleted file mode 100755 index 8e145181..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/detectron2/utils/visualizer.py +++ /dev/null @@ -1,1267 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import colorsys -import logging -import math -import numpy as np -from enum import Enum, unique -import cv2 -import matplotlib as mpl -import matplotlib.colors as mplc -import matplotlib.figure as mplfigure -import pycocotools.mask as mask_util -import torch -from matplotlib.backends.backend_agg import FigureCanvasAgg -from PIL import Image - -from detectron2.data import MetadataCatalog -from detectron2.structures import BitMasks, Boxes, BoxMode, Keypoints, PolygonMasks, RotatedBoxes -from detectron2.utils.file_io import PathManager - -from .colormap import random_color - -logger = logging.getLogger(__name__) - -__all__ = ["ColorMode", "VisImage", "Visualizer"] - - -_SMALL_OBJECT_AREA_THRESH = 1000 -_LARGE_MASK_AREA_THRESH = 120000 -_OFF_WHITE = (1.0, 1.0, 240.0 / 255) -_BLACK = (0, 0, 0) -_RED = (1.0, 0, 0) - -_KEYPOINT_THRESHOLD = 0.05 - - -@unique -class ColorMode(Enum): - """ - Enum of different color modes to use for instance visualizations. - """ - - IMAGE = 0 - """ - Picks a random color for every instance and overlay segmentations with low opacity. - """ - SEGMENTATION = 1 - """ - Let instances of the same category have similar colors - (from metadata.thing_colors), and overlay them with - high opacity. This provides more attention on the quality of segmentation. 
- """ - IMAGE_BW = 2 - """ - Same as IMAGE, but convert all areas without masks to gray-scale. - Only available for drawing per-instance mask predictions. - """ - - -class GenericMask: - """ - Attribute: - polygons (list[ndarray]): list[ndarray]: polygons for this mask. - Each ndarray has format [x, y, x, y, ...] - mask (ndarray): a binary mask - """ - - def __init__(self, mask_or_polygons, height, width): - self._mask = self._polygons = self._has_holes = None - self.height = height - self.width = width - - m = mask_or_polygons - if isinstance(m, dict): - # RLEs - assert "counts" in m and "size" in m - if isinstance(m["counts"], list): # uncompressed RLEs - h, w = m["size"] - assert h == height and w == width - m = mask_util.frPyObjects(m, h, w) - self._mask = mask_util.decode(m)[:, :] - return - - if isinstance(m, list): # list[ndarray] - self._polygons = [np.asarray(x).reshape(-1) for x in m] - return - - if isinstance(m, np.ndarray): # assumed to be a binary mask - assert m.shape[1] != 2, m.shape - assert m.shape == ( - height, - width, - ), f"mask shape: {m.shape}, target dims: {height}, {width}" - self._mask = m.astype("uint8") - return - - raise ValueError("GenericMask cannot handle object {} of type '{}'".format(m, type(m))) - - @property - def mask(self): - if self._mask is None: - self._mask = self.polygons_to_mask(self._polygons) - return self._mask - - @property - def polygons(self): - if self._polygons is None: - self._polygons, self._has_holes = self.mask_to_polygons(self._mask) - return self._polygons - - @property - def has_holes(self): - if self._has_holes is None: - if self._mask is not None: - self._polygons, self._has_holes = self.mask_to_polygons(self._mask) - else: - self._has_holes = False # if original format is polygon, does not have holes - return self._has_holes - - def mask_to_polygons(self, mask): - # cv2.RETR_CCOMP flag retrieves all the contours and arranges them to a 2-level - # hierarchy. External contours (boundary) of the object are placed in hierarchy-1. - # Internal contours (holes) are placed in hierarchy-2. - # cv2.CHAIN_APPROX_NONE flag gets vertices of polygons from contours. - mask = np.ascontiguousarray(mask) # some versions of cv2 does not support incontiguous arr - res = cv2.findContours(mask.astype("uint8"), cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE) - hierarchy = res[-1] - if hierarchy is None: # empty mask - return [], False - has_holes = (hierarchy.reshape(-1, 4)[:, 3] >= 0).sum() > 0 - res = res[-2] - res = [x.flatten() for x in res] - # These coordinates from OpenCV are integers in range [0, W-1 or H-1]. - # We add 0.5 to turn them into real-value coordinate space. A better solution - # would be to first +0.5 and then dilate the returned polygon by 0.5. 
- res = [x + 0.5 for x in res if len(x) >= 6] - return res, has_holes - - def polygons_to_mask(self, polygons): - rle = mask_util.frPyObjects(polygons, self.height, self.width) - rle = mask_util.merge(rle) - return mask_util.decode(rle)[:, :] - - def area(self): - return self.mask.sum() - - def bbox(self): - p = mask_util.frPyObjects(self.polygons, self.height, self.width) - p = mask_util.merge(p) - bbox = mask_util.toBbox(p) - bbox[2] += bbox[0] - bbox[3] += bbox[1] - return bbox - - -class _PanopticPrediction: - """ - Unify different panoptic annotation/prediction formats - """ - - def __init__(self, panoptic_seg, segments_info, metadata=None): - if segments_info is None: - assert metadata is not None - # If "segments_info" is None, we assume "panoptic_img" is a - # H*W int32 image storing the panoptic_id in the format of - # category_id * label_divisor + instance_id. We reserve -1 for - # VOID label. - label_divisor = metadata.label_divisor - segments_info = [] - for panoptic_label in np.unique(panoptic_seg.numpy()): - if panoptic_label == -1: - # VOID region. - continue - pred_class = panoptic_label // label_divisor - isthing = pred_class in metadata.thing_dataset_id_to_contiguous_id.values() - segments_info.append( - { - "id": int(panoptic_label), - "category_id": int(pred_class), - "isthing": bool(isthing), - } - ) - del metadata - - self._seg = panoptic_seg - - self._sinfo = {s["id"]: s for s in segments_info} # seg id -> seg info - segment_ids, areas = torch.unique(panoptic_seg, sorted=True, return_counts=True) - areas = areas.numpy() - sorted_idxs = np.argsort(-areas) - self._seg_ids, self._seg_areas = segment_ids[sorted_idxs], areas[sorted_idxs] - self._seg_ids = self._seg_ids.tolist() - for sid, area in zip(self._seg_ids, self._seg_areas): - if sid in self._sinfo: - self._sinfo[sid]["area"] = float(area) - - def non_empty_mask(self): - """ - Returns: - (H, W) array, a mask for all pixels that have a prediction - """ - empty_ids = [] - for id in self._seg_ids: - if id not in self._sinfo: - empty_ids.append(id) - if len(empty_ids) == 0: - return np.zeros(self._seg.shape, dtype=np.uint8) - assert ( - len(empty_ids) == 1 - ), ">1 ids corresponds to no labels. This is currently not supported" - return (self._seg != empty_ids[0]).numpy().astype(np.bool) - - def semantic_masks(self): - for sid in self._seg_ids: - sinfo = self._sinfo.get(sid) - if sinfo is None or sinfo["isthing"]: - # Some pixels (e.g. id 0 in PanopticFPN) have no instance or semantic predictions. 
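
# A minimal sketch of GenericMask's format conversions defined above; the
# square blob is synthetic, and the class is assumed importable from an
# installed detectron2.
import numpy as np
from detectron2.utils.visualizer import GenericMask

m = np.zeros((64, 64), dtype=np.uint8)
m[16:48, 16:48] = 1                    # a filled 32x32 square
gm = GenericMask(m, 64, 64)            # binary-mask input
polys = gm.polygons                    # via cv2.findContours, +0.5 offset
rt = GenericMask(polys, 64, 64).mask   # polygon input, rasterized back
print(gm.area(), gm.bbox(), rt.sum())  # ~1024 px, bbox around the square
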
- continue - yield (self._seg == sid).numpy().astype(np.bool), sinfo - - def instance_masks(self): - for sid in self._seg_ids: - sinfo = self._sinfo.get(sid) - if sinfo is None or not sinfo["isthing"]: - continue - mask = (self._seg == sid).numpy().astype(np.bool) - if mask.sum() > 0: - yield mask, sinfo - - -def _create_text_labels(classes, scores, class_names, is_crowd=None): - """ - Args: - classes (list[int] or None): - scores (list[float] or None): - class_names (list[str] or None): - is_crowd (list[bool] or None): - - Returns: - list[str] or None - """ - labels = None - if classes is not None: - if class_names is not None and len(class_names) > 0: - labels = [class_names[i] for i in classes] - else: - labels = [str(i) for i in classes] - if scores is not None: - if labels is None: - labels = ["{:.0f}%".format(s * 100) for s in scores] - else: - labels = ["{} {:.0f}%".format(l, s * 100) for l, s in zip(labels, scores)] - if labels is not None and is_crowd is not None: - labels = [l + ("|crowd" if crowd else "") for l, crowd in zip(labels, is_crowd)] - return labels - - -class VisImage: - def __init__(self, img, scale=1.0): - """ - Args: - img (ndarray): an RGB image of shape (H, W, 3) in range [0, 255]. - scale (float): scale the input image - """ - self.img = img - self.scale = scale - self.width, self.height = img.shape[1], img.shape[0] - self._setup_figure(img) - - def _setup_figure(self, img): - """ - Args: - Same as in :meth:`__init__()`. - - Returns: - fig (matplotlib.pyplot.figure): top level container for all the image plot elements. - ax (matplotlib.pyplot.Axes): contains figure elements and sets the coordinate system. - """ - fig = mplfigure.Figure(frameon=False) - self.dpi = fig.get_dpi() - # add a small 1e-2 to avoid precision lost due to matplotlib's truncation - # (https://github.com/matplotlib/matplotlib/issues/15363) - fig.set_size_inches( - (self.width * self.scale + 1e-2) / self.dpi, - (self.height * self.scale + 1e-2) / self.dpi, - ) - self.canvas = FigureCanvasAgg(fig) - # self.canvas = mpl.backends.backend_cairo.FigureCanvasCairo(fig) - ax = fig.add_axes([0.0, 0.0, 1.0, 1.0]) - ax.axis("off") - self.fig = fig - self.ax = ax - self.reset_image(img) - - def reset_image(self, img): - """ - Args: - img: same as in __init__ - """ - img = img.astype("uint8") - self.ax.imshow(img, extent=(0, self.width, self.height, 0), interpolation="nearest") - - def save(self, filepath): - """ - Args: - filepath (str): a string that contains the absolute path, including the file name, where - the visualized image will be saved. - """ - self.fig.savefig(filepath) - - def get_image(self): - """ - Returns: - ndarray: - the visualized image of shape (H, W, 3) (RGB) in uint8 type. - The shape is scaled w.r.t the input image using the given `scale` argument. - """ - canvas = self.canvas - s, (width, height) = canvas.print_to_buffer() - # buf = io.BytesIO() # works for cairo backend - # canvas.print_rgba(buf) - # width, height = self.width, self.height - # s = buf.getvalue() - - buffer = np.frombuffer(s, dtype="uint8") - - img_rgba = buffer.reshape(height, width, 4) - rgb, alpha = np.split(img_rgba, [3], axis=2) - return rgb.astype("uint8") - - -class Visualizer: - """ - Visualizer that draws data about detection/segmentation on images. 
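
# A minimal sketch of _create_text_labels above; ids, scores, and class names
# are made up.
from detectron2.utils.visualizer import _create_text_labels

print(_create_text_labels(
    classes=[0, 1], scores=[0.97, 0.51],
    class_names=["person", "dog"], is_crowd=[False, True],
))  # -> ['person 97%', 'dog 51%|crowd']
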
- - It contains methods like `draw_{text,box,circle,line,binary_mask,polygon}` - that draw primitive objects to images, as well as high-level wrappers like - `draw_{instance_predictions,sem_seg,panoptic_seg_predictions,dataset_dict}` - that draw composite data in some pre-defined style. - - Note that the exact visualization style for the high-level wrappers are subject to change. - Style such as color, opacity, label contents, visibility of labels, or even the visibility - of objects themselves (e.g. when the object is too small) may change according - to different heuristics, as long as the results still look visually reasonable. - - To obtain a consistent style, you can implement custom drawing functions with the - abovementioned primitive methods instead. If you need more customized visualization - styles, you can process the data yourself following their format documented in - tutorials (:doc:`/tutorials/models`, :doc:`/tutorials/datasets`). This class does not - intend to satisfy everyone's preference on drawing styles. - - This visualizer focuses on high rendering quality rather than performance. It is not - designed to be used for real-time applications. - """ - - # TODO implement a fast, rasterized version using OpenCV - - def __init__(self, img_rgb, metadata=None, scale=1.0, instance_mode=ColorMode.IMAGE): - """ - Args: - img_rgb: a numpy array of shape (H, W, C), where H and W correspond to - the height and width of the image respectively. C is the number of - color channels. The image is required to be in RGB format since that - is a requirement of the Matplotlib library. The image is also expected - to be in the range [0, 255]. - metadata (Metadata): dataset metadata (e.g. class names and colors) - instance_mode (ColorMode): defines one of the pre-defined style for drawing - instances on an image. - """ - self.img = np.asarray(img_rgb).clip(0, 255).astype(np.uint8) - if metadata is None: - metadata = MetadataCatalog.get("__nonexist__") - self.metadata = metadata - self.output = VisImage(self.img, scale=scale) - self.cpu_device = torch.device("cpu") - - # too small texts are useless, therefore clamp to 9 - self._default_font_size = max( - np.sqrt(self.output.height * self.output.width) // 90, 10 // scale - ) - self._instance_mode = instance_mode - self.keypoint_threshold = _KEYPOINT_THRESHOLD - - def draw_instance_predictions(self, predictions): - """ - Draw instance-level prediction results on an image. - - Args: - predictions (Instances): the output of an instance detection/segmentation - model. Following fields will be used to draw: - "pred_boxes", "pred_classes", "scores", "pred_masks" (or "pred_masks_rle"). - - Returns: - output (VisImage): image object with visualizations. 
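
# A minimal end-to-end sketch of draw_instance_predictions; `img` (an RGB
# array) and `outputs` (a detector's output dict) are hypothetical
# placeholders for a loaded image and a DefaultPredictor result.
from detectron2.data import MetadataCatalog
from detectron2.utils.visualizer import ColorMode, Visualizer

vis = Visualizer(img, MetadataCatalog.get("coco_2017_val"),
                 scale=1.2, instance_mode=ColorMode.SEGMENTATION)
out = vis.draw_instance_predictions(outputs["instances"].to("cpu"))
out.save("prediction.jpg")  # VisImage.save, defined earlier in this file
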
- """ - boxes = predictions.pred_boxes if predictions.has("pred_boxes") else None - scores = predictions.scores if predictions.has("scores") else None - classes = predictions.pred_classes.tolist() if predictions.has("pred_classes") else None - labels = _create_text_labels(classes, scores, self.metadata.get("thing_classes", None)) - keypoints = predictions.pred_keypoints if predictions.has("pred_keypoints") else None - - if predictions.has("pred_masks"): - masks = np.asarray(predictions.pred_masks) - masks = [GenericMask(x, self.output.height, self.output.width) for x in masks] - else: - masks = None - - if self._instance_mode == ColorMode.SEGMENTATION and self.metadata.get("thing_colors"): - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) for c in classes - ] - alpha = 0.8 - else: - colors = None - alpha = 0.5 - - if self._instance_mode == ColorMode.IMAGE_BW: - self.output.reset_image( - self._create_grayscale_image( - (predictions.pred_masks.any(dim=0) > 0).numpy() - if predictions.has("pred_masks") - else None - ) - ) - alpha = 0.3 - - self.overlay_instances( - masks=masks, - boxes=boxes, - labels=labels, - keypoints=keypoints, - assigned_colors=colors, - alpha=alpha, - ) - return self.output - - def draw_sem_seg(self, sem_seg, area_threshold=None, alpha=0.8): - """ - Draw semantic segmentation predictions/labels. - - Args: - sem_seg (Tensor or ndarray): the segmentation of shape (H, W). - Each value is the integer label of the pixel. - area_threshold (int): segments with less than `area_threshold` are not drawn. - alpha (float): the larger it is, the more opaque the segmentations are. - - Returns: - output (VisImage): image object with visualizations. - """ - if isinstance(sem_seg, torch.Tensor): - sem_seg = sem_seg.numpy() - labels, areas = np.unique(sem_seg, return_counts=True) - sorted_idxs = np.argsort(-areas).tolist() - labels = labels[sorted_idxs] - for label in filter(lambda l: l < len(self.metadata.stuff_classes), labels): - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[label]] - except (AttributeError, IndexError): - mask_color = None - - binary_mask = (sem_seg == label).astype(np.uint8) - text = self.metadata.stuff_classes[label] - self.draw_binary_mask( - binary_mask, - color=mask_color, - edge_color=_OFF_WHITE, - text=text, - alpha=alpha, - area_threshold=area_threshold, - ) - return self.output - - def draw_panoptic_seg(self, panoptic_seg, segments_info, area_threshold=None, alpha=0.7): - """ - Draw panoptic prediction annotations or results. - - Args: - panoptic_seg (Tensor): of shape (height, width) where the values are ids for each - segment. - segments_info (list[dict] or None): Describe each segment in `panoptic_seg`. - If it is a ``list[dict]``, each dict contains keys "id", "category_id". - If None, category id of each pixel is computed by - ``pixel // metadata.label_divisor``. - area_threshold (int): stuff segments with less than `area_threshold` are not drawn. - - Returns: - output (VisImage): image object with visualizations. - """ - pred = _PanopticPrediction(panoptic_seg, segments_info, self.metadata) - - if self._instance_mode == ColorMode.IMAGE_BW: - self.output.reset_image(self._create_grayscale_image(pred.non_empty_mask())) - - # draw mask for all semantic segments first i.e. 
"stuff" - for mask, sinfo in pred.semantic_masks(): - category_idx = sinfo["category_id"] - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[category_idx]] - except AttributeError: - mask_color = None - - text = self.metadata.stuff_classes[category_idx] - self.draw_binary_mask( - mask, - color=mask_color, - edge_color=_OFF_WHITE, - text=text, - alpha=alpha, - area_threshold=area_threshold, - ) - - # draw mask for all instances second - all_instances = list(pred.instance_masks()) - if len(all_instances) == 0: - return self.output - masks, sinfo = list(zip(*all_instances)) - category_ids = [x["category_id"] for x in sinfo] - - try: - scores = [x["score"] for x in sinfo] - except KeyError: - scores = None - labels = _create_text_labels( - category_ids, scores, self.metadata.thing_classes, [x.get("iscrowd", 0) for x in sinfo] - ) - - try: - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) for c in category_ids - ] - except AttributeError: - colors = None - self.overlay_instances(masks=masks, labels=labels, assigned_colors=colors, alpha=alpha) - - return self.output - - draw_panoptic_seg_predictions = draw_panoptic_seg # backward compatibility - - def draw_dataset_dict(self, dic): - """ - Draw annotations/segmentaions in Detectron2 Dataset format. - - Args: - dic (dict): annotation/segmentation data of one image, in Detectron2 Dataset format. - - Returns: - output (VisImage): image object with visualizations. - """ - annos = dic.get("annotations", None) - if annos: - if "segmentation" in annos[0]: - masks = [x["segmentation"] for x in annos] - else: - masks = None - if "keypoints" in annos[0]: - keypts = [x["keypoints"] for x in annos] - keypts = np.array(keypts).reshape(len(annos), -1, 3) - else: - keypts = None - - boxes = [ - BoxMode.convert(x["bbox"], x["bbox_mode"], BoxMode.XYXY_ABS) - if len(x["bbox"]) == 4 - else x["bbox"] - for x in annos - ] - - colors = None - category_ids = [x["category_id"] for x in annos] - if self._instance_mode == ColorMode.SEGMENTATION and self.metadata.get("thing_colors"): - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) - for c in category_ids - ] - names = self.metadata.get("thing_classes", None) - labels = _create_text_labels( - category_ids, - scores=None, - class_names=names, - is_crowd=[x.get("iscrowd", 0) for x in annos], - ) - self.overlay_instances( - labels=labels, boxes=boxes, masks=masks, keypoints=keypts, assigned_colors=colors - ) - - sem_seg = dic.get("sem_seg", None) - if sem_seg is None and "sem_seg_file_name" in dic: - with PathManager.open(dic["sem_seg_file_name"], "rb") as f: - sem_seg = Image.open(f) - sem_seg = np.asarray(sem_seg, dtype="uint8") - if sem_seg is not None: - self.draw_sem_seg(sem_seg, area_threshold=0, alpha=0.5) - - pan_seg = dic.get("pan_seg", None) - if pan_seg is None and "pan_seg_file_name" in dic: - with PathManager.open(dic["pan_seg_file_name"], "rb") as f: - pan_seg = Image.open(f) - pan_seg = np.asarray(pan_seg) - from panopticapi.utils import rgb2id - - pan_seg = rgb2id(pan_seg) - if pan_seg is not None: - segments_info = dic["segments_info"] - pan_seg = torch.tensor(pan_seg) - self.draw_panoptic_seg(pan_seg, segments_info, area_threshold=0, alpha=0.5) - return self.output - - def overlay_instances( - self, - *, - boxes=None, - labels=None, - masks=None, - keypoints=None, - assigned_colors=None, - alpha=0.5, - ): - """ - Args: - boxes (Boxes, RotatedBoxes or ndarray): either a :class:`Boxes`, - or an Nx4 numpy array of XYXY_ABS format for the N 
objects in a single image, - or a :class:`RotatedBoxes`, - or an Nx5 numpy array of (x_center, y_center, width, height, angle_degrees) format - for the N objects in a single image, - labels (list[str]): the text to be displayed for each instance. - masks (masks-like object): Supported types are: - - * :class:`detectron2.structures.PolygonMasks`, - :class:`detectron2.structures.BitMasks`. - * list[list[ndarray]]: contains the segmentation masks for all objects in one image. - The first level of the list corresponds to individual instances. The second - level to all the polygon that compose the instance, and the third level - to the polygon coordinates. The third level should have the format of - [x0, y0, x1, y1, ..., xn, yn] (n >= 3). - * list[ndarray]: each ndarray is a binary mask of shape (H, W). - * list[dict]: each dict is a COCO-style RLE. - keypoints (Keypoint or array like): an array-like object of shape (N, K, 3), - where the N is the number of instances and K is the number of keypoints. - The last dimension corresponds to (x, y, visibility or score). - assigned_colors (list[matplotlib.colors]): a list of colors, where each color - corresponds to each mask or box in the image. Refer to 'matplotlib.colors' - for full list of formats that the colors are accepted in. - Returns: - output (VisImage): image object with visualizations. - """ - num_instances = 0 - if boxes is not None: - boxes = self._convert_boxes(boxes) - num_instances = len(boxes) - if masks is not None: - masks = self._convert_masks(masks) - if num_instances: - assert len(masks) == num_instances - else: - num_instances = len(masks) - if keypoints is not None: - if num_instances: - assert len(keypoints) == num_instances - else: - num_instances = len(keypoints) - keypoints = self._convert_keypoints(keypoints) - if labels is not None: - assert len(labels) == num_instances - if assigned_colors is None: - assigned_colors = [random_color(rgb=True, maximum=1) for _ in range(num_instances)] - if num_instances == 0: - return self.output - if boxes is not None and boxes.shape[1] == 5: - return self.overlay_rotated_instances( - boxes=boxes, labels=labels, assigned_colors=assigned_colors - ) - - # Display in largest to smallest order to reduce occlusion. - areas = None - if boxes is not None: - areas = np.prod(boxes[:, 2:] - boxes[:, :2], axis=1) - elif masks is not None: - areas = np.asarray([x.area() for x in masks]) - - if areas is not None: - sorted_idxs = np.argsort(-areas).tolist() - # Re-order overlapped instances in descending order. - boxes = boxes[sorted_idxs] if boxes is not None else None - labels = [labels[k] for k in sorted_idxs] if labels is not None else None - masks = [masks[idx] for idx in sorted_idxs] if masks is not None else None - assigned_colors = [assigned_colors[idx] for idx in sorted_idxs] - keypoints = keypoints[sorted_idxs] if keypoints is not None else None - - for i in range(num_instances): - color = assigned_colors[i] - if boxes is not None: - self.draw_box(boxes[i], edge_color=color) - - if masks is not None: - for segment in masks[i].polygons: - self.draw_polygon(segment.reshape(-1, 2), color, alpha=alpha) - - if labels is not None: - # first get a box - if boxes is not None: - x0, y0, x1, y1 = boxes[i] - text_pos = (x0, y0) # if drawing boxes, put text on the box corner. 
- horiz_align = "left" - elif masks is not None: - # skip small mask without polygon - if len(masks[i].polygons) == 0: - continue - - x0, y0, x1, y1 = masks[i].bbox() - - # draw text in the center (defined by median) when box is not drawn - # median is less sensitive to outliers. - text_pos = np.median(masks[i].mask.nonzero(), axis=1)[::-1] - horiz_align = "center" - else: - continue # drawing the box confidence for keypoints isn't very useful. - # for small objects, draw text at the side to avoid occlusion - instance_area = (y1 - y0) * (x1 - x0) - if ( - instance_area < _SMALL_OBJECT_AREA_THRESH * self.output.scale - or y1 - y0 < 40 * self.output.scale - ): - if y1 >= self.output.height - 5: - text_pos = (x1, y0) - else: - text_pos = (x0, y1) - - height_ratio = (y1 - y0) / np.sqrt(self.output.height * self.output.width) - lighter_color = self._change_color_brightness(color, brightness_factor=0.7) - font_size = ( - np.clip((height_ratio - 0.02) / 0.08 + 1, 1.2, 2) - * 0.5 - * self._default_font_size - ) - self.draw_text( - labels[i], - text_pos, - color=lighter_color, - horizontal_alignment=horiz_align, - font_size=font_size, - ) - - # draw keypoints - if keypoints is not None: - for keypoints_per_instance in keypoints: - self.draw_and_connect_keypoints(keypoints_per_instance) - - return self.output - - def overlay_rotated_instances(self, boxes=None, labels=None, assigned_colors=None): - """ - Args: - boxes (ndarray): an Nx5 numpy array of - (x_center, y_center, width, height, angle_degrees) format - for the N objects in a single image. - labels (list[str]): the text to be displayed for each instance. - assigned_colors (list[matplotlib.colors]): a list of colors, where each color - corresponds to each mask or box in the image. Refer to 'matplotlib.colors' - for full list of formats that the colors are accepted in. - - Returns: - output (VisImage): image object with visualizations. - """ - num_instances = len(boxes) - - if assigned_colors is None: - assigned_colors = [random_color(rgb=True, maximum=1) for _ in range(num_instances)] - if num_instances == 0: - return self.output - - # Display in largest to smallest order to reduce occlusion. - if boxes is not None: - areas = boxes[:, 2] * boxes[:, 3] - - sorted_idxs = np.argsort(-areas).tolist() - # Re-order overlapped instances in descending order. - boxes = boxes[sorted_idxs] - labels = [labels[k] for k in sorted_idxs] if labels is not None else None - colors = [assigned_colors[idx] for idx in sorted_idxs] - - for i in range(num_instances): - self.draw_rotated_box_with_label( - boxes[i], edge_color=colors[i], label=labels[i] if labels is not None else None - ) - - return self.output - - def draw_and_connect_keypoints(self, keypoints): - """ - Draws keypoints of an instance and follows the rules for keypoint connections - to draw lines between appropriate keypoints. This follows color heuristics for - line color. - - Args: - keypoints (Tensor): a tensor of shape (K, 3), where K is the number of keypoints - and the last dimension corresponds to (x, y, probability). - - Returns: - output (VisImage): image object with visualizations. 
- """ - visible = {} - keypoint_names = self.metadata.get("keypoint_names") - for idx, keypoint in enumerate(keypoints): - - # draw keypoint - x, y, prob = keypoint - if prob > self.keypoint_threshold: - self.draw_circle((x, y), color=_RED) - if keypoint_names: - keypoint_name = keypoint_names[idx] - visible[keypoint_name] = (x, y) - - if self.metadata.get("keypoint_connection_rules"): - for kp0, kp1, color in self.metadata.keypoint_connection_rules: - if kp0 in visible and kp1 in visible: - x0, y0 = visible[kp0] - x1, y1 = visible[kp1] - color = tuple(x / 255.0 for x in color) - self.draw_line([x0, x1], [y0, y1], color=color) - - # draw lines from nose to mid-shoulder and mid-shoulder to mid-hip - # Note that this strategy is specific to person keypoints. - # For other keypoints, it should just do nothing - try: - ls_x, ls_y = visible["left_shoulder"] - rs_x, rs_y = visible["right_shoulder"] - mid_shoulder_x, mid_shoulder_y = (ls_x + rs_x) / 2, (ls_y + rs_y) / 2 - except KeyError: - pass - else: - # draw line from nose to mid-shoulder - nose_x, nose_y = visible.get("nose", (None, None)) - if nose_x is not None: - self.draw_line([nose_x, mid_shoulder_x], [nose_y, mid_shoulder_y], color=_RED) - - try: - # draw line from mid-shoulder to mid-hip - lh_x, lh_y = visible["left_hip"] - rh_x, rh_y = visible["right_hip"] - except KeyError: - pass - else: - mid_hip_x, mid_hip_y = (lh_x + rh_x) / 2, (lh_y + rh_y) / 2 - self.draw_line([mid_hip_x, mid_shoulder_x], [mid_hip_y, mid_shoulder_y], color=_RED) - return self.output - - """ - Primitive drawing functions: - """ - - def draw_text( - self, - text, - position, - *, - font_size=None, - color="g", - horizontal_alignment="center", - rotation=0, - ): - """ - Args: - text (str): class label - position (tuple): a tuple of the x and y coordinates to place text on image. - font_size (int, optional): font of the text. If not provided, a font size - proportional to the image width is calculated and used. - color: color of the text. Refer to `matplotlib.colors` for full list - of formats that are accepted. - horizontal_alignment (str): see `matplotlib.text.Text` - rotation: rotation angle in degrees CCW - - Returns: - output (VisImage): image object with text drawn. - """ - if not font_size: - font_size = self._default_font_size - - # since the text background is dark, we don't want the text to be dark - color = np.maximum(list(mplc.to_rgb(color)), 0.2) - color[np.argmax(color)] = max(0.8, np.max(color)) - - x, y = position - self.output.ax.text( - x, - y, - text, - size=font_size * self.output.scale, - family="sans-serif", - bbox={"facecolor": "black", "alpha": 0.8, "pad": 0.7, "edgecolor": "none"}, - verticalalignment="top", - horizontalalignment=horizontal_alignment, - color=color, - zorder=10, - rotation=rotation, - ) - return self.output - - def draw_box(self, box_coord, alpha=0.5, edge_color="g", line_style="-"): - """ - Args: - box_coord (tuple): a tuple containing x0, y0, x1, y1 coordinates, where x0 and y0 - are the coordinates of the image's top left corner. x1 and y1 are the - coordinates of the image's bottom right corner. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - edge_color: color of the outline of the box. Refer to `matplotlib.colors` - for full list of formats that are accepted. - line_style (string): the string to use to create the outline of the boxes. - - Returns: - output (VisImage): image object with box drawn. 
- """ - x0, y0, x1, y1 = box_coord - width = x1 - x0 - height = y1 - y0 - - linewidth = max(self._default_font_size / 4, 1) - - self.output.ax.add_patch( - mpl.patches.Rectangle( - (x0, y0), - width, - height, - fill=False, - edgecolor=edge_color, - linewidth=linewidth * self.output.scale, - alpha=alpha, - linestyle=line_style, - ) - ) - return self.output - - def draw_rotated_box_with_label( - self, rotated_box, alpha=0.5, edge_color="g", line_style="-", label=None - ): - """ - Draw a rotated box with label on its top-left corner. - - Args: - rotated_box (tuple): a tuple containing (cnt_x, cnt_y, w, h, angle), - where cnt_x and cnt_y are the center coordinates of the box. - w and h are the width and height of the box. angle represents how - many degrees the box is rotated CCW with regard to the 0-degree box. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - edge_color: color of the outline of the box. Refer to `matplotlib.colors` - for full list of formats that are accepted. - line_style (string): the string to use to create the outline of the boxes. - label (string): label for rotated box. It will not be rendered when set to None. - - Returns: - output (VisImage): image object with box drawn. - """ - cnt_x, cnt_y, w, h, angle = rotated_box - area = w * h - # use thinner lines when the box is small - linewidth = self._default_font_size / ( - 6 if area < _SMALL_OBJECT_AREA_THRESH * self.output.scale else 3 - ) - - theta = angle * math.pi / 180.0 - c = math.cos(theta) - s = math.sin(theta) - rect = [(-w / 2, h / 2), (-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2)] - # x: left->right ; y: top->down - rotated_rect = [(s * yy + c * xx + cnt_x, c * yy - s * xx + cnt_y) for (xx, yy) in rect] - for k in range(4): - j = (k + 1) % 4 - self.draw_line( - [rotated_rect[k][0], rotated_rect[j][0]], - [rotated_rect[k][1], rotated_rect[j][1]], - color=edge_color, - linestyle="--" if k == 1 else line_style, - linewidth=linewidth, - ) - - if label is not None: - text_pos = rotated_rect[1] # topleft corner - - height_ratio = h / np.sqrt(self.output.height * self.output.width) - label_color = self._change_color_brightness(edge_color, brightness_factor=0.7) - font_size = ( - np.clip((height_ratio - 0.02) / 0.08 + 1, 1.2, 2) * 0.5 * self._default_font_size - ) - self.draw_text(label, text_pos, color=label_color, font_size=font_size, rotation=angle) - - return self.output - - def draw_circle(self, circle_coord, color, radius=3): - """ - Args: - circle_coord (list(int) or tuple(int)): contains the x and y coordinates - of the center of the circle. - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - radius (int): radius of the circle. - - Returns: - output (VisImage): image object with box drawn. - """ - x, y = circle_coord - self.output.ax.add_patch( - mpl.patches.Circle(circle_coord, radius=radius, fill=True, color=color) - ) - return self.output - - def draw_line(self, x_data, y_data, color, linestyle="-", linewidth=None): - """ - Args: - x_data (list[int]): a list containing x values of all the points being drawn. - Length of list should match the length of y_data. - y_data (list[int]): a list containing y values of all the points being drawn. - Length of list should match the length of x_data. - color: color of the line. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - linestyle: style of the line. Refer to `matplotlib.lines.Line2D` - for a full list of formats that are accepted. 
-            linewidth (float or None): width of the line. When it's None,
-                a default value will be computed and used.
-
-        Returns:
-            output (VisImage): image object with line drawn.
-        """
-        if linewidth is None:
-            linewidth = self._default_font_size / 3
-        linewidth = max(linewidth, 1)
-        self.output.ax.add_line(
-            mpl.lines.Line2D(
-                x_data,
-                y_data,
-                linewidth=linewidth * self.output.scale,
-                color=color,
-                linestyle=linestyle,
-            )
-        )
-        return self.output
-
-    def draw_binary_mask(
-        self, binary_mask, color=None, *, edge_color=None, text=None, alpha=0.5, area_threshold=10
-    ):
-        """
-        Args:
-            binary_mask (ndarray): numpy array of shape (H, W), where H is the image height and
-                W is the image width. Each value in the array is either a 0 or 1 value of uint8
-                type.
-            color: color of the mask. Refer to `matplotlib.colors` for a full list of
-                formats that are accepted. If None, will pick a random color.
-            edge_color: color of the polygon edges. Refer to `matplotlib.colors` for a
-                full list of formats that are accepted.
-            text (str): if not None, will be drawn on the object
-            alpha (float): blending coefficient. Smaller values lead to more transparent masks.
-            area_threshold (float): a connected component smaller than this area will not be shown.
-
-        Returns:
-            output (VisImage): image object with mask drawn.
-        """
-        if color is None:
-            color = random_color(rgb=True, maximum=1)
-        color = mplc.to_rgb(color)
-
-        has_valid_segment = False
-        binary_mask = binary_mask.astype("uint8")  # opencv needs uint8
-        mask = GenericMask(binary_mask, self.output.height, self.output.width)
-        shape2d = (binary_mask.shape[0], binary_mask.shape[1])
-
-        if not mask.has_holes:
-            # draw polygons for regular masks
-            for segment in mask.polygons:
-                area = mask_util.area(mask_util.frPyObjects([segment], shape2d[0], shape2d[1]))
-                if area < (area_threshold or 0):
-                    continue
-                has_valid_segment = True
-                segment = segment.reshape(-1, 2)
-                self.draw_polygon(segment, color=color, edge_color=edge_color, alpha=alpha)
-        else:
-            # TODO: Use Path/PathPatch to draw vector graphics:
-            # https://stackoverflow.com/questions/8919719/how-to-plot-a-complex-polygon
-            rgba = np.zeros(shape2d + (4,), dtype="float32")
-            rgba[:, :, :3] = color
-            rgba[:, :, 3] = (mask.mask == 1).astype("float32") * alpha
-            has_valid_segment = True
-            self.output.ax.imshow(rgba, extent=(0, self.output.width, self.output.height, 0))
-
-        if text is not None and has_valid_segment:
-            lighter_color = self._change_color_brightness(color, brightness_factor=0.7)
-            self._draw_text_in_mask(binary_mask, text, lighter_color)
-        return self.output
-
-    def draw_soft_mask(self, soft_mask, color=None, *, text=None, alpha=0.5):
-        """
-        Args:
-            soft_mask (ndarray): float array of shape (H, W), each value in [0, 1].
-            color: color of the mask. Refer to `matplotlib.colors` for a full list of
-                formats that are accepted. If None, will pick a random color.
-            text (str): if not None, will be drawn on the object
-            alpha (float): blending coefficient. Smaller values lead to more transparent masks.
-
-        Returns:
-            output (VisImage): image object with mask drawn.
- """ - if color is None: - color = random_color(rgb=True, maximum=1) - color = mplc.to_rgb(color) - - shape2d = (soft_mask.shape[0], soft_mask.shape[1]) - rgba = np.zeros(shape2d + (4,), dtype="float32") - rgba[:, :, :3] = color - rgba[:, :, 3] = soft_mask * alpha - self.output.ax.imshow(rgba, extent=(0, self.output.width, self.output.height, 0)) - - if text is not None: - lighter_color = self._change_color_brightness(color, brightness_factor=0.7) - binary_mask = (soft_mask > 0.5).astype("uint8") - self._draw_text_in_mask(binary_mask, text, lighter_color) - return self.output - - def draw_polygon(self, segment, color, edge_color=None, alpha=0.5): - """ - Args: - segment: numpy array of shape Nx2, containing all the points in the polygon. - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - edge_color: color of the polygon edges. Refer to `matplotlib.colors` for a - full list of formats that are accepted. If not provided, a darker shade - of the polygon color will be used instead. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - - Returns: - output (VisImage): image object with polygon drawn. - """ - if edge_color is None: - # make edge color darker than the polygon color - if alpha > 0.8: - edge_color = self._change_color_brightness(color, brightness_factor=-0.7) - else: - edge_color = color - edge_color = mplc.to_rgb(edge_color) + (1,) - - polygon = mpl.patches.Polygon( - segment, - fill=True, - facecolor=mplc.to_rgb(color) + (alpha,), - edgecolor=edge_color, - linewidth=max(self._default_font_size // 15 * self.output.scale, 1), - ) - self.output.ax.add_patch(polygon) - return self.output - - """ - Internal methods: - """ - - def _jitter(self, color): - """ - Randomly modifies given color to produce a slightly different color than the color given. - - Args: - color (tuple[double]): a tuple of 3 elements, containing the RGB values of the color - picked. The values in the list are in the [0.0, 1.0] range. - - Returns: - jittered_color (tuple[double]): a tuple of 3 elements, containing the RGB values of the - color after being jittered. The values in the list are in the [0.0, 1.0] range. - """ - color = mplc.to_rgb(color) - vec = np.random.rand(3) - # better to do it in another color space - vec = vec / np.linalg.norm(vec) * 0.5 - res = np.clip(vec + color, 0, 1) - return tuple(res) - - def _create_grayscale_image(self, mask=None): - """ - Create a grayscale version of the original image. - The colors in masked area, if given, will be kept. - """ - img_bw = self.img.astype("f4").mean(axis=2) - img_bw = np.stack([img_bw] * 3, axis=2) - if mask is not None: - img_bw[mask] = self.img[mask] - return img_bw - - def _change_color_brightness(self, color, brightness_factor): - """ - Depending on the brightness_factor, gives a lighter or darker color i.e. a color with - less or more saturation than the original color. - - Args: - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - brightness_factor (float): a value in [-1.0, 1.0] range. A lightness factor of - 0 will correspond to no change, a factor in [-1.0, 0) range will result in - a darker color and a factor in (0, 1.0] range will result in a lighter color. - - Returns: - modified_color (tuple[double]): a tuple containing the RGB values of the - modified color. Each value in the tuple is in the [0.0, 1.0] range. 
- """ - assert brightness_factor >= -1.0 and brightness_factor <= 1.0 - color = mplc.to_rgb(color) - polygon_color = colorsys.rgb_to_hls(*mplc.to_rgb(color)) - modified_lightness = polygon_color[1] + (brightness_factor * polygon_color[1]) - modified_lightness = 0.0 if modified_lightness < 0.0 else modified_lightness - modified_lightness = 1.0 if modified_lightness > 1.0 else modified_lightness - modified_color = colorsys.hls_to_rgb(polygon_color[0], modified_lightness, polygon_color[2]) - return modified_color - - def _convert_boxes(self, boxes): - """ - Convert different format of boxes to an NxB array, where B = 4 or 5 is the box dimension. - """ - if isinstance(boxes, Boxes) or isinstance(boxes, RotatedBoxes): - return boxes.tensor.detach().numpy() - else: - return np.asarray(boxes) - - def _convert_masks(self, masks_or_polygons): - """ - Convert different format of masks or polygons to a tuple of masks and polygons. - - Returns: - list[GenericMask]: - """ - - m = masks_or_polygons - if isinstance(m, PolygonMasks): - m = m.polygons - if isinstance(m, BitMasks): - m = m.tensor.numpy() - if isinstance(m, torch.Tensor): - m = m.numpy() - ret = [] - for x in m: - if isinstance(x, GenericMask): - ret.append(x) - else: - ret.append(GenericMask(x, self.output.height, self.output.width)) - return ret - - def _draw_text_in_mask(self, binary_mask, text, color): - """ - Find proper places to draw text given a binary mask. - """ - # TODO sometimes drawn on wrong objects. the heuristics here can improve. - _num_cc, cc_labels, stats, centroids = cv2.connectedComponentsWithStats(binary_mask, 8) - if stats[1:, -1].size == 0: - return - largest_component_id = np.argmax(stats[1:, -1]) + 1 - - # draw text on the largest component, as well as other very large components. - for cid in range(1, _num_cc): - if cid == largest_component_id or stats[cid, -1] > _LARGE_MASK_AREA_THRESH: - # median is more stable than centroid - # center = centroids[largest_component_id] - center = np.median((cc_labels == cid).nonzero(), axis=1)[::-1] - self.draw_text(text, center, color=color) - - def _convert_keypoints(self, keypoints): - if isinstance(keypoints, Keypoints): - keypoints = keypoints.tensor - keypoints = np.asarray(keypoints) - return keypoints - - def get_output(self): - """ - Returns: - output (VisImage): the image output containing the visualizations added - to the image. - """ - return self.output diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/README.md deleted file mode 100755 index bec811ad..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/README.md +++ /dev/null @@ -1,7 +0,0 @@ - -## Some scripts for developers to use, include: - -- `linter.sh`: lint the codebase before commit. -- `run_{inference,instant}_tests.sh`: run inference/training for a few iterations. - Note that these tests require 2 GPUs. -- `parse_results.sh`: parse results from a log file. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/linter.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/linter.sh deleted file mode 100755 index e873186f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/linter.sh +++ /dev/null @@ -1,42 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -# cd to detectron2 project root -cd "$(dirname "${BASH_SOURCE[0]}")/.." - -{ - black --version | grep -E "21\." 
-} || {
-  echo "Linter requires 'black==21.*' !"
-  exit 1
-}
-
-ISORT_VERSION=$(isort --version-number)
-if [[ "$ISORT_VERSION" != 4.3* ]]; then
-  echo "Linter requires isort==4.3.21 !"
-  exit 1
-fi
-
-set -v
-
-echo "Running isort ..."
-isort -y -sp . --atomic
-
-echo "Running black ..."
-black -l 100 .
-
-echo "Running flake8 ..."
-if [ -x "$(command -v flake8-3)" ]; then
-  flake8-3 .
-else
-  python3 -m flake8 .
-fi
-
-# echo "Running mypy ..."
-# Pytorch does not have enough type annotations
-# mypy detectron2/solver detectron2/structures detectron2/config
-
-echo "Running clang-format ..."
-find . -regex ".*\.\(cpp\|c\|cc\|cu\|cxx\|h\|hh\|hpp\|hxx\|tcc\|mm\|m\)" -print0 | xargs -0 clang-format -i
-
-command -v arc > /dev/null && arc lint
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/README.md
deleted file mode 100755
index 0174b7dd..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/README.md
+++ /dev/null
@@ -1,17 +0,0 @@
-
-## To build a cu101 wheel for release:
-
-```
-$ nvidia-docker run -it --storage-opt "size=20GB" --name pt pytorch/manylinux-cuda101
-# inside the container:
-# git clone https://github.com/facebookresearch/detectron2/
-# cd detectron2
-# export CU_VERSION=cu101 D2_VERSION_SUFFIX= PYTHON_VERSION=3.7 PYTORCH_VERSION=1.8
-# ./dev/packaging/build_wheel.sh
-```
-
-## To build all wheels for combinations of CUDA and Python
-```
-./dev/packaging/build_all_wheels.sh
-./dev/packaging/gen_wheel_index.sh /path/to/wheels
-```
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
deleted file mode 100755
index 98b5e444..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
+++ /dev/null
@@ -1,65 +0,0 @@
-#!/bin/bash -e
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-[[ -d "dev/packaging" ]] || {
-  echo "Please run this script at detectron2 root!"
-  exit 1
-}
-
-build_one() {
-  cu=$1
-  pytorch_ver=$2
-
-  case "$cu" in
-    cu*)
-      container_name=manylinux-cuda${cu/cu/}
-      ;;
-    cpu)
-      container_name=manylinux-cuda101
-      ;;
-    *)
-      echo "Unrecognized cu=$cu"
-      exit 1
-      ;;
-  esac
-
-  echo "Launching container $container_name ..."
-  container_id="$container_name"_"$cu"_"$pytorch_ver"
-
-  py_versions=(3.6 3.7 3.8 3.9)
-
-  for py in "${py_versions[@]}"; do
-    docker run -itd \
-      --name "$container_id" \
-      --mount type=bind,source="$(pwd)",target=/detectron2 \
-      pytorch/$container_name
-
-    cat <<EOF | docker exec -i "$container_id" sh
-      export CU_VERSION=$cu D2_VERSION_SUFFIX=+$cu PYTHON_VERSION=$py
-      export PYTORCH_VERSION=$pytorch_ver
-      cd /detectron2 && ./dev/packaging/build_wheel.sh
-EOF
-
-    docker container stop "$container_id"
-    docker container rm "$container_id"
-  done
-}
-
-if [[ -n "$1" ]] && [[ -n "$2" ]]; then
-  build_one "$1" "$2"
-else
-  build_one cu113 1.10
-  build_one cu111 1.10
-  build_one cu102 1.10
-  build_one cpu 1.10
-
-  build_one cu111 1.9
-  build_one cu102 1.9
-  build_one cpu 1.9
-
-  build_one cu111 1.8
-  build_one cu102 1.8
-  build_one cu101 1.8
-  build_one cpu 1.8
-fi
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh
deleted file mode 100755
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/build_wheel.sh
+++ /dev/null
-#!/bin/bash -ex
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-. "$script_dir/pkg_helpers.bash"
-
-echo "Build Settings:"
-echo "CU_VERSION: $CU_VERSION"               # e.g. cu101
-echo "D2_VERSION_SUFFIX: $D2_VERSION_SUFFIX" # e.g. +cu101 or ""
-echo "PYTHON_VERSION: $PYTHON_VERSION"       # e.g. 3.6
-echo "PYTORCH_VERSION: $PYTORCH_VERSION"     # e.g. 1.4
-
-setup_cuda
-setup_wheel_python
-
-yum install ninja-build -y
-ln -sv /usr/bin/ninja-build /usr/bin/ninja || true
-
-pip_install pip numpy -U
-pip_install "torch==$PYTORCH_VERSION" \
-  -f https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html
-
-# use separate directories to allow parallel build
-BASE_BUILD_DIR=build/$CU_VERSION-py$PYTHON_VERSION-pt$PYTORCH_VERSION
-python setup.py \
-  build -b "$BASE_BUILD_DIR" \
-  bdist_wheel -b "$BASE_BUILD_DIR/build_dist" -d "wheels/$CU_VERSION/torch$PYTORCH_VERSION"
-rm -rf "$BASE_BUILD_DIR"
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py
deleted file mode 100755
index b4c852dc..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_install_table.py
+++ /dev/null
@@ -1,63 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) Facebook, Inc. and its affiliates.
-# -*- coding: utf-8 -*-
-
-import argparse
-
-template = """<details><summary> install </summary><pre><code>\
-python -m pip install detectron2{d2_version} -f \\
-  https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html
-</code></pre> </details>"""
-CUDA_SUFFIX = {
-    "11.3": "cu113",
-    "11.1": "cu111",
-    "11.0": "cu110",
-    "10.2": "cu102",
-    "10.1": "cu101",
-    "10.0": "cu100",
-    "9.2": "cu92",
-    "cpu": "cpu",
-}
-
-
-def gen_header(torch_versions):
-    return '<table class="docutils"><tbody><th width="80"> CUDA </th>' + "".join(
-        [
-            '<th valign="bottom" align="left" width="100">torch {}</th>'.format(t)
-            for t in torch_versions
-        ]
-    )
-
-
-if __name__ == "__main__":
-    parser = argparse.ArgumentParser()
-    parser.add_argument("--d2-version", help="detectron2 version number, default to empty")
-    args = parser.parse_args()
-    d2_version = f"=={args.d2_version}" if args.d2_version else ""
-
-    all_versions = (
-        [("1.8", k) for k in ["11.1", "10.2", "10.1", "cpu"]]
-        + [("1.9", k) for k in ["11.1", "10.2", "cpu"]]
-        + [("1.10", k) for k in ["11.3", "11.1", "10.2", "cpu"]]
-    )
-
-    torch_versions = sorted(
-        {k[0] for k in all_versions}, key=lambda x: int(x.split(".")[1]), reverse=True
-    )
-    cuda_versions = sorted(
-        {k[1] for k in all_versions}, key=lambda x: float(x) if x != "cpu" else 0, reverse=True
-    )
-
-    table = gen_header(torch_versions)
-    for cu in cuda_versions:
-        table += f"""<tr><td align="left">{cu}</td>"""
-        cu_suffix = CUDA_SUFFIX[cu]
-        for torch in torch_versions:
-            if (torch, cu) in all_versions:
-                cell = template.format(d2_version=d2_version, cuda=cu_suffix, torch=torch)
-            else:
-                cell = ""
-            table += f"""<td align="left">{cell} </td> """
-        table += "</tr>"
-    table += "</tbody></table>"
-    print(table)
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
deleted file mode 100755
index ec96a27d..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
+++ /dev/null
@@ -1,46 +0,0 @@
-#!/bin/bash -e
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-
-root=$(readlink -f $1)
-if [[ -z "$root" ]]; then
-  echo "Usage: ./gen_wheel_index.sh /absolute/path/to/wheels"
-  exit
-fi
-
-export LC_ALL=C  # reproducible sort
-# NOTE: all sort in this script might not work when xx.10 is released
-
-index=$root/index.html
-
-cd "$root"
-for cu in cpu cu92 cu100 cu101 cu102 cu110 cu111 cu113; do
-  mkdir -p "$root/$cu"
-  cd "$root/$cu"
-  echo "Creating $PWD/index.html ..."
-  # First sort by torch version, then stable sort by d2 version with unique.
-  # As a result, the latest torch version for each d2 version is kept.
-  for whl in $(find -type f -name '*.whl' -printf '%P\n' \
-    | sort -k 1 -r | sort -t '/' -k 2 --stable -r --unique); do
-    echo "<a href=\"${whl/+/%2B}\">$whl</a><br>"
-  done > index.html
-
-
-  for torch in torch*; do
-    cd "$root/$cu/$torch"
-
-    # list all whl for each cuda,torch version
-    echo "Creating $PWD/index.html ..."
-    for whl in $(find . -type f -name '*.whl' -printf '%P\n' | sort -r); do
-      echo "<a href=\"${whl/+/%2B}\">$whl</a><br>"
-    done > index.html
-  done
-done
-
-cd "$root"
-# Just list everything:
-echo "Creating $index ..."
-for whl in $(find . -type f -name '*.whl' -printf '%P\n' | sort -r); do
-  echo "<a href=\"${whl/+/%2B}\">$whl</a><br>"
-done > "$index"
-
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
deleted file mode 100755
index ed9acb00..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
+++ /dev/null
@@ -1,76 +0,0 @@
-#!/bin/bash -e
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-# Function to retry functions that sometimes timeout or have flaky failures
-retry () {
-    $* || (sleep 1 && $*) || (sleep 2 && $*) || (sleep 4 && $*) || (sleep 8 && $*)
-}
-# Install with pip a bit more robustly than the default
-pip_install() {
-    retry pip install --progress-bar off "$@"
-}
-
-
-setup_cuda() {
-  # Now work out the CUDA settings
-  # Like other torch domain libraries, we choose common GPU architectures only.
-  # See https://github.com/pytorch/pytorch/blob/master/torch/utils/cpp_extension.py
-  # and https://github.com/pytorch/vision/blob/main/packaging/pkg_helpers.bash for reference.
-  export FORCE_CUDA=1
-  case "$CU_VERSION" in
-    cu113)
-      export CUDA_HOME=/usr/local/cuda-11.3/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX"
-      ;;
-    cu112)
-      export CUDA_HOME=/usr/local/cuda-11.2/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX"
-      ;;
-    cu111)
-      export CUDA_HOME=/usr/local/cuda-11.1/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX"
-      ;;
-    cu110)
-      export CUDA_HOME=/usr/local/cuda-11.0/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0+PTX"
-      ;;
-    cu102)
-      export CUDA_HOME=/usr/local/cuda-10.2/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX"
-      ;;
-    cu101)
-      export CUDA_HOME=/usr/local/cuda-10.1/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX"
-      ;;
-    cu100)
-      export CUDA_HOME=/usr/local/cuda-10.0/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX"
-      ;;
-    cu92)
-      export CUDA_HOME=/usr/local/cuda-9.2/
-      export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0+PTX"
-      ;;
-    cpu)
-      unset FORCE_CUDA
-      export CUDA_VISIBLE_DEVICES=
-      ;;
-    *)
-      echo "Unrecognized CU_VERSION=$CU_VERSION"
-      exit 1
-      ;;
-  esac
-}
-
-setup_wheel_python() {
-  case "$PYTHON_VERSION" in
-    3.6) python_abi=cp36-cp36m ;;
-    3.7) python_abi=cp37-cp37m ;;
-    3.8) python_abi=cp38-cp38 ;;
-    3.9) python_abi=cp39-cp39 ;;
-    *)
-      echo "Unrecognized PYTHON_VERSION=$PYTHON_VERSION"
-      exit 1
-      ;;
-  esac
-  export PATH="/opt/python/$python_abi/bin:$PATH"
-}
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/parse_results.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/parse_results.sh
deleted file mode 100755
index 80768a40..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/parse_results.sh
+++ /dev/null
@@ -1,45 +0,0 @@
-#!/bin/bash
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-# A shell script that parses metrics from the log file.
-# Make it easier for developers to track performance of models.
-
-LOG="$1"
-
-if [[ -z "$LOG" ]]; then
-  echo "Usage: $0 /path/to/log/file"
-  exit 1
-fi
-
-# [12/15 11:47:32] trainer INFO: Total training time: 12:15:04.446477 (0.4900 s / it)
-# [12/15 11:49:03] inference INFO: Total inference time: 0:01:25.326167 (0.13652186737060548 s / img per device, on 8 devices)
-# [12/15 11:49:03] inference INFO: Total inference pure compute time: .....
- -# training time -trainspeed=$(grep -o 'Overall training.*' "$LOG" | grep -Eo '\(.*\)' | grep -o '[0-9\.]*') -echo "Training speed: $trainspeed s/it" - -# inference time: there could be multiple inference during training -inferencespeed=$(grep -o 'Total inference pure.*' "$LOG" | tail -n1 | grep -Eo '\(.*\)' | grep -o '[0-9\.]*' | head -n1) -echo "Inference speed: $inferencespeed s/it" - -# [12/15 11:47:18] trainer INFO: eta: 0:00:00 iter: 90000 loss: 0.5407 (0.7256) loss_classifier: 0.1744 (0.2446) loss_box_reg: 0.0838 (0.1160) loss_mask: 0.2159 (0.2722) loss_objectness: 0.0244 (0.0429) loss_rpn_box_reg: 0.0279 (0.0500) time: 0.4487 (0.4899) data: 0.0076 (0.0975) lr: 0.000200 max mem: 4161 -memory=$(grep -o 'max[_ ]mem: [0-9]*' "$LOG" | tail -n1 | grep -o '[0-9]*') -echo "Training memory: $memory MB" - -echo "Easy to copypaste:" -echo "$trainspeed","$inferencespeed","$memory" - -echo "------------------------------" - -# [12/26 17:26:32] engine.coco_evaluation: copypaste: Task: bbox -# [12/26 17:26:32] engine.coco_evaluation: copypaste: AP,AP50,AP75,APs,APm,APl -# [12/26 17:26:32] engine.coco_evaluation: copypaste: 0.0017,0.0024,0.0017,0.0005,0.0019,0.0011 -# [12/26 17:26:32] engine.coco_evaluation: copypaste: Task: segm -# [12/26 17:26:32] engine.coco_evaluation: copypaste: AP,AP50,AP75,APs,APm,APl -# [12/26 17:26:32] engine.coco_evaluation: copypaste: 0.0014,0.0021,0.0016,0.0005,0.0016,0.0011 - -echo "COCO Results:" -num_tasks=$(grep -o 'copypaste:.*Task.*' "$LOG" | sort -u | wc -l) -# each task has 3 lines -grep -o 'copypaste:.*' "$LOG" | cut -d ' ' -f 2- | tail -n $((num_tasks * 3)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh deleted file mode 100755 index bc9dcc56..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_inference_tests.sh +++ /dev/null @@ -1,44 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -BIN="python tools/train_net.py" -OUTPUT="inference_test_output" -NUM_GPUS=2 - -CFG_LIST=( "${@:1}" ) - -if [ ${#CFG_LIST[@]} -eq 0 ]; then - CFG_LIST=( ./configs/quick_schedules/*inference_acc_test.yaml ) -fi - -echo "========================================================================" -echo "Configs to run:" -echo "${CFG_LIST[@]}" -echo "========================================================================" - - -for cfg in "${CFG_LIST[@]}"; do - echo "========================================================================" - echo "Running $cfg ..." - echo "========================================================================" - $BIN \ - --eval-only \ - --num-gpus $NUM_GPUS \ - --config-file "$cfg" \ - OUTPUT_DIR $OUTPUT - rm -rf $OUTPUT -done - - -echo "========================================================================" -echo "Running demo.py ..." 
-echo "========================================================================" -DEMO_BIN="python demo/demo.py" -COCO_DIR=datasets/coco/val2014 -mkdir -pv $OUTPUT - -set -v - -$DEMO_BIN --config-file ./configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml \ - --input $COCO_DIR/COCO_val2014_0000001933* --output $OUTPUT -rm -rf $OUTPUT diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh deleted file mode 100755 index 9fd9ba0c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/dev/run_instant_tests.sh +++ /dev/null @@ -1,27 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -BIN="python tools/train_net.py" -OUTPUT="instant_test_output" -NUM_GPUS=2 - -CFG_LIST=( "${@:1}" ) -if [ ${#CFG_LIST[@]} -eq 0 ]; then - CFG_LIST=( ./configs/quick_schedules/*instant_test.yaml ) -fi - -echo "========================================================================" -echo "Configs to run:" -echo "${CFG_LIST[@]}" -echo "========================================================================" - -for cfg in "${CFG_LIST[@]}"; do - echo "========================================================================" - echo "Running $cfg ..." - echo "========================================================================" - $BIN --num-gpus $NUM_GPUS --config-file "$cfg" \ - SOLVER.IMS_PER_BATCH $(($NUM_GPUS * 2)) \ - OUTPUT_DIR "$OUTPUT" - rm -rf "$OUTPUT" -done - diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/Dockerfile b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/Dockerfile deleted file mode 100755 index 4eec16dd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/Dockerfile +++ /dev/null @@ -1,47 +0,0 @@ -FROM nvidia/cuda:11.1.1-cudnn8-devel-ubuntu18.04 -# use an older system (18.04) to avoid opencv incompatibility (issue#3524) - -ENV DEBIAN_FRONTEND noninteractive -RUN apt-get update && apt-get install -y \ - python3-opencv ca-certificates python3-dev git wget sudo ninja-build -RUN ln -sv /usr/bin/python3 /usr/bin/python - -# create a non-root user -ARG USER_ID=1000 -RUN useradd -m --no-log-init --system --uid ${USER_ID} appuser -g sudo -RUN echo '%sudo ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers -USER appuser -WORKDIR /home/appuser - -ENV PATH="/home/appuser/.local/bin:${PATH}" -RUN wget https://bootstrap.pypa.io/get-pip.py && \ - python3 get-pip.py --user && \ - rm get-pip.py - -# install dependencies -# See https://pytorch.org/ for other options if you use a different version of CUDA -RUN pip install --user tensorboard cmake # cmake from apt-get is too old -RUN pip install --user torch==1.10 torchvision==0.11.1 -f https://download.pytorch.org/whl/cu111/torch_stable.html - -RUN pip install --user 'git+https://github.com/facebookresearch/fvcore' -# install detectron2 -RUN git clone https://github.com/facebookresearch/detectron2 detectron2_repo -# set FORCE_CUDA because during `docker build` cuda is not accessible -ENV FORCE_CUDA="1" -# This will by default build detectron2 for all common cuda architectures and take a lot more time, -# because inside `docker build`, there is no way to tell which architecture will be used. 
-ARG TORCH_CUDA_ARCH_LIST="Kepler;Kepler+Tesla;Maxwell;Maxwell+Tegra;Pascal;Volta;Turing" -ENV TORCH_CUDA_ARCH_LIST="${TORCH_CUDA_ARCH_LIST}" - -RUN pip install --user -e detectron2_repo - -# Set a fixed model cache directory. -ENV FVCORE_CACHE="/tmp" -WORKDIR /home/appuser/detectron2_repo - -# run detectron2 under user "appuser": -# wget http://images.cocodataset.org/val2017/000000439715.jpg -O input.jpg -# python3 demo/demo.py \ - #--config-file configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - #--input input.jpg --output outputs/ \ - #--opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/README.md deleted file mode 100755 index ea709f33..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/README.md +++ /dev/null @@ -1,45 +0,0 @@ - -## Use the container (with docker ≥ 19.03) - -``` -cd docker/ -# Build: -docker build --build-arg USER_ID=$UID -t detectron2:v0 . -# Launch (require GPUs): -docker run --gpus all -it \ - --shm-size=8gb --env="DISPLAY" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" \ - --name=detectron2 detectron2:v0 - -# Grant docker access to host X server to show images -xhost +local:`docker inspect --format='{{ .Config.Hostname }}' detectron2` -``` - -## Use the container (with docker-compose ≥ 1.28.0) - -Install docker-compose and nvidia-docker-toolkit, then run: -``` -cd docker && USER_ID=$UID docker-compose run detectron2 -``` - -## Use the deployment container (to test C++ examples) -After building the base detectron2 container as above, do: -``` -# Build: -docker build -t detectron2-deploy:v0 -f deploy.Dockerfile . -# Launch: -docker run --gpus all -it detectron2-deploy:v0 -``` - -#### Using a persistent cache directory - -You can prevent models from being re-downloaded on every run, -by storing them in a cache directory. - -To do this, add `--volume=$HOME/.torch/fvcore_cache:/tmp:rw` in the run command. - -## Install new dependencies -Add the following to `Dockerfile` to make persistent changes. -``` -RUN sudo apt-get update && sudo apt-get install -y vim -``` -Or run them in the container to make temporary changes. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile deleted file mode 100755 index 30b4ed77..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/deploy.Dockerfile +++ /dev/null @@ -1,32 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# This file defines a container that compiles the C++ examples of detectron2. -# See docker/README.md for usage. - -# Depends on the image produced by "./Dockerfile" -FROM detectron2:v0 - -USER appuser -ENV HOME=/home/appuser -WORKDIR $HOME - -# Let torchvision find libtorch -ENV CMAKE_PREFIX_PATH=$HOME/.local/lib/python3.6/site-packages/torch/ - -RUN sudo apt-get update && sudo apt-get install libopencv-dev --yes - -# install libtorchvision -RUN git clone --branch v0.11.1 https://github.com/pytorch/vision/ -RUN mkdir vision/build && cd vision/build && \ - cmake .. 
-DCMAKE_INSTALL_PREFIX=$HOME/.local -DCMAKE_BUILD_TYPE=Release -DWITH_CUDA=on -DTORCH_CUDA_ARCH_LIST=$TORCH_CUDA_ARCH_LIST && \ - make -j && make install - -# make our installation take effect -ENV CPATH=$HOME/.local/include \ - LIBRARY_PATH=$HOME/.local/lib \ - LD_LIBRARY_PATH=$HOME/.local/lib - - -# build C++ examples of detectron2 -RUN cd detectron2_repo/tools/deploy && mkdir build && cd build && \ - cmake -DTORCH_CUDA_ARCH_LIST=$TORCH_CUDA_ARCH_LIST .. && make -# binaries will be available under tools/deploy/build diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/docker-compose.yml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/docker-compose.yml deleted file mode 100755 index 6665ab4c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docker/docker-compose.yml +++ /dev/null @@ -1,26 +0,0 @@ -version: "2.3" -services: - detectron2: - build: - context: . - dockerfile: Dockerfile - args: - USER_ID: ${USER_ID:-1000} - deploy: - resources: - reservations: - devices: - - capabilities: - - gpu - shm_size: "8gb" - ulimits: - memlock: -1 - stack: 67108864 - volumes: - - /tmp/.X11-unix:/tmp/.X11-unix:ro - environment: - - DISPLAY=$DISPLAY - - NVIDIA_VISIBLE_DEVICES=all - # Uncomment with proper source to access webcam from docker - # devices: - # - /dev/video0:/dev/video0 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/.gitignore b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/.gitignore deleted file mode 100755 index e35d8850..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/.gitignore +++ /dev/null @@ -1 +0,0 @@ -_build diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/Makefile b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/Makefile deleted file mode 100755 index 718eddce..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/Makefile +++ /dev/null @@ -1,19 +0,0 @@ -# Minimal makefile for Sphinx documentation -# Copyright (c) Facebook, Inc. and its affiliates. - -# You can set these variables from the command line. -SPHINXOPTS = -SPHINXBUILD = sphinx-build -SOURCEDIR = . -BUILDDIR = _build - -# Put it first so that "make" without argument is like "make help". -help: - @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) - -.PHONY: help Makefile - -# Catch-all target: route all unknown targets to Sphinx using the new -# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). -%: Makefile - @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/README.md deleted file mode 100755 index 8531cafd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/README.md +++ /dev/null @@ -1,15 +0,0 @@ -# Read the docs: - -The latest documentation built from this directory is available at [detectron2.readthedocs.io](https://detectron2.readthedocs.io/). -Documents in this directory are not meant to be read on github. - -# Build the docs: - -1. Install detectron2 according to [INSTALL.md](../INSTALL.md). -2. Install additional libraries required to build docs: - - docutils==0.16 - - Sphinx==3.2.0 - - recommonmark==0.6.0 - - sphinx_rtd_theme - -3. Run `make html` from this directory. 
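(Editor's sketch: the doc-build steps in the removed README above condense to the following shell session. It assumes detectron2 is already installed per INSTALL.md; the package pins are copied from the README's list, and `_build` is the output directory set by the removed `docs/Makefile`.)

```bash
# install the documentation toolchain pinned by the removed docs/README.md
pip install docutils==0.16 Sphinx==3.2.0 recommonmark==0.6.0 sphinx_rtd_theme

# build the HTML docs from the docs/ directory; output lands in docs/_build/html
cd docs
make html
```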
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/_static/css/custom.css b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/_static/css/custom.css deleted file mode 100755 index 6c511764..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/_static/css/custom.css +++ /dev/null @@ -1,30 +0,0 @@ -/* - * Copyright (c) Facebook, Inc. and its affiliates. - * some extra css to make markdown look similar between github/sphinx - */ - -/* - * Below is for install.md: - */ -.rst-content code { - white-space: pre; - border: 0px; -} - -.rst-content th { - border: 1px solid #e1e4e5; -} - -.rst-content th p { - /* otherwise will be default 24px for regular paragraph */ - margin-bottom: 0px; -} - -.rst-content .line-block { - /* otherwise will be 24px */ - margin-bottom: 0px; -} - -div.section > details { - padding-bottom: 1em; -} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/conf.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/conf.py deleted file mode 100755 index c7232f41..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/conf.py +++ /dev/null @@ -1,382 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -# flake8: noqa - -# Configuration file for the Sphinx documentation builder. -# -# This file does only contain a selection of the most common options. For a -# full list see the documentation: -# http://www.sphinx-doc.org/en/master/config - -# -- Path setup -------------------------------------------------------------- - -# If extensions (or modules to document with autodoc) are in another directory, -# add these directories to sys.path here. If the directory is relative to the -# documentation root, use os.path.abspath to make it absolute, like shown here. -# -import os -import sys -from unittest import mock -from sphinx.domains import Domain -from typing import Dict, List, Tuple - -# The theme to use for HTML and HTML Help pages. See the documentation for -# a list of builtin themes. -# -import sphinx_rtd_theme - - -class GithubURLDomain(Domain): - """ - Resolve certain links in markdown files to github source. - """ - - name = "githuburl" - ROOT = "https://github.com/facebookresearch/detectron2/blob/main/" - LINKED_DOC = ["tutorials/install", "tutorials/getting_started"] - - def resolve_any_xref(self, env, fromdocname, builder, target, node, contnode): - github_url = None - if not target.endswith("html") and target.startswith("../../"): - url = target.replace("../", "") - github_url = url - if fromdocname in self.LINKED_DOC: - # unresolved links in these docs are all github links - github_url = target - - if github_url is not None: - if github_url.endswith("MODEL_ZOO") or github_url.endswith("README"): - # bug of recommonmark. 
- # https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 - github_url += ".md" - print("Ref {} resolved to github:{}".format(target, github_url)) - contnode["refuri"] = self.ROOT + github_url - return [("githuburl:any", contnode)] - else: - return [] - - -# to support markdown -from recommonmark.parser import CommonMarkParser - -sys.path.insert(0, os.path.abspath("../")) -os.environ["_DOC_BUILDING"] = "True" -DEPLOY = os.environ.get("READTHEDOCS") == "True" - - -# -- Project information ----------------------------------------------------- - -# fmt: off -try: - import torch # noqa -except ImportError: - for m in [ - "torch", "torchvision", "torch.nn", "torch.nn.parallel", "torch.distributed", "torch.multiprocessing", "torch.autograd", - "torch.autograd.function", "torch.nn.modules", "torch.nn.modules.utils", "torch.utils", "torch.utils.data", "torch.onnx", - "torchvision", "torchvision.ops", - ]: - sys.modules[m] = mock.Mock(name=m) - sys.modules['torch'].__version__ = "1.7" # fake version - HAS_TORCH = False -else: - try: - torch.ops.detectron2 = mock.Mock(name="torch.ops.detectron2") - except: - pass - HAS_TORCH = True - -for m in [ - "cv2", "scipy", "portalocker", "detectron2._C", - "pycocotools", "pycocotools.mask", "pycocotools.coco", "pycocotools.cocoeval", - "google", "google.protobuf", "google.protobuf.internal", "onnx", - "caffe2", "caffe2.proto", "caffe2.python", "caffe2.python.utils", "caffe2.python.onnx", "caffe2.python.onnx.backend", -]: - sys.modules[m] = mock.Mock(name=m) -# fmt: on -sys.modules["cv2"].__version__ = "3.4" - -import detectron2 # isort: skip - -if HAS_TORCH: - from detectron2.utils.env import fixup_module_metadata - - fixup_module_metadata("torch.nn", torch.nn.__dict__) - fixup_module_metadata("torch.utils.data", torch.utils.data.__dict__) - - -project = "detectron2" -copyright = "2019-2020, detectron2 contributors" -author = "detectron2 contributors" - -# The short X.Y version -version = detectron2.__version__ -# The full version, including alpha/beta/rc tags -release = version - - -# -- General configuration --------------------------------------------------- - -# If your documentation needs a minimal Sphinx version, state it here. -# -needs_sphinx = "3.0" - -# Add any Sphinx extension module names here, as strings. They can be -# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom -# ones. -extensions = [ - "recommonmark", - "sphinx.ext.autodoc", - "sphinx.ext.napoleon", - "sphinx.ext.intersphinx", - "sphinx.ext.todo", - "sphinx.ext.coverage", - "sphinx.ext.mathjax", - "sphinx.ext.viewcode", - "sphinx.ext.githubpages", -] - -# -- Configurations for plugins ------------ -napoleon_google_docstring = True -napoleon_include_init_with_doc = True -napoleon_include_special_with_doc = True -napoleon_numpy_docstring = False -napoleon_use_rtype = False -autodoc_inherit_docstrings = False -autodoc_member_order = "bysource" - -if DEPLOY: - intersphinx_timeout = 10 -else: - # skip this when building locally - intersphinx_timeout = 0.5 -intersphinx_mapping = { - "python": ("https://docs.python.org/3.6", None), - "numpy": ("https://docs.scipy.org/doc/numpy/", None), - "torch": ("https://pytorch.org/docs/master/", None), -} -# ------------------------- - - -# Add any paths that contain templates here, relative to this directory. -templates_path = ["_templates"] - -source_suffix = [".rst", ".md"] - -# The master toctree document. 
-master_doc = "index" - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# -# This is also used if you do content translation via gettext catalogs. -# Usually you set "language" from the command line for these cases. -language = None - -# List of patterns, relative to source directory, that match files and -# directories to ignore when looking for source files. -# This pattern also affects html_static_path and html_extra_path. -exclude_patterns = ["_build", "Thumbs.db", ".DS_Store", "build", "README.md", "tutorials/README.md"] - -# The name of the Pygments (syntax highlighting) style to use. -pygments_style = "sphinx" - - -# -- Options for HTML output ------------------------------------------------- - -html_theme = "sphinx_rtd_theme" -html_theme_path = [sphinx_rtd_theme.get_html_theme_path()] - -# Theme options are theme-specific and customize the look and feel of a theme -# further. For a list of options available for each theme, see the -# documentation. -# -# html_theme_options = {} - -# Add any paths that contain custom static files (such as style sheets) here, -# relative to this directory. They are copied after the builtin static files, -# so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = ["_static"] -html_css_files = ["css/custom.css"] - -# Custom sidebar templates, must be a dictionary that maps document names -# to template names. -# -# The default sidebars (for documents that don't match any pattern) are -# defined by theme itself. Builtin themes are using these templates by -# default: ``['localtoc.html', 'relations.html', 'sourcelink.html', -# 'searchbox.html']``. -# -# html_sidebars = {} - - -# -- Options for HTMLHelp output --------------------------------------------- - -# Output file base name for HTML help builder. -htmlhelp_basename = "detectron2doc" - - -# -- Options for LaTeX output ------------------------------------------------ - -latex_elements = { - # The paper size ('letterpaper' or 'a4paper'). - # - # 'papersize': 'letterpaper', - # The font size ('10pt', '11pt' or '12pt'). - # - # 'pointsize': '10pt', - # Additional stuff for the LaTeX preamble. - # - # 'preamble': '', - # Latex figure (float) alignment - # - # 'figure_align': 'htbp', -} - -# Grouping the document tree into LaTeX files. List of tuples -# (source start file, target name, title, -# author, documentclass [howto, manual, or own class]). -latex_documents = [ - (master_doc, "detectron2.tex", "detectron2 Documentation", "detectron2 contributors", "manual") -] - - -# -- Options for manual page output ------------------------------------------ - -# One entry per manual page. List of tuples -# (source start file, name, description, authors, manual section). -man_pages = [(master_doc, "detectron2", "detectron2 Documentation", [author], 1)] - - -# -- Options for Texinfo output ---------------------------------------------- - -# Grouping the document tree into Texinfo files. List of tuples -# (source start file, target name, title, author, -# dir menu entry, description, category) -texinfo_documents = [ - ( - master_doc, - "detectron2", - "detectron2 Documentation", - author, - "detectron2", - "One line description of project.", - "Miscellaneous", - ) -] - - -# -- Options for todo extension ---------------------------------------------- - -# If true, `todo` and `todoList` produce output, else they produce nothing. 
-todo_include_todos = True - - -def autodoc_skip_member(app, what, name, obj, skip, options): - # we hide something deliberately - if getattr(obj, "__HIDE_SPHINX_DOC__", False): - return True - - # Hide some that are deprecated or not intended to be used - HIDDEN = { - "ResNetBlockBase", - "GroupedBatchSampler", - "build_transform_gen", - "apply_transform_gens", - "TransformGen", - "apply_augmentations", - "StandardAugInput", - "build_batch_data_loader", - "draw_panoptic_seg_predictions", - "WarmupCosineLR", - "WarmupMultiStepLR", - "downgrade_config", - "upgrade_config", - "add_export_config", - } - try: - if name in HIDDEN or ( - hasattr(obj, "__doc__") and obj.__doc__.lower().strip().startswith("deprecated") - ): - print("Skipping deprecated object: {}".format(name)) - return True - except: - pass - return skip - - -_PAPER_DATA = { - "resnet": ("1512.03385", "Deep Residual Learning for Image Recognition"), - "fpn": ("1612.03144", "Feature Pyramid Networks for Object Detection"), - "mask r-cnn": ("1703.06870", "Mask R-CNN"), - "faster r-cnn": ( - "1506.01497", - "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks", - ), - "deformconv": ("1703.06211", "Deformable Convolutional Networks"), - "deformconv2": ("1811.11168", "Deformable ConvNets v2: More Deformable, Better Results"), - "panopticfpn": ("1901.02446", "Panoptic Feature Pyramid Networks"), - "retinanet": ("1708.02002", "Focal Loss for Dense Object Detection"), - "cascade r-cnn": ("1712.00726", "Cascade R-CNN: Delving into High Quality Object Detection"), - "lvis": ("1908.03195", "LVIS: A Dataset for Large Vocabulary Instance Segmentation"), - "rrpn": ("1703.01086", "Arbitrary-Oriented Scene Text Detection via Rotation Proposals"), - "imagenet in 1h": ("1706.02677", "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour"), - "xception": ("1610.02357", "Xception: Deep Learning with Depthwise Separable Convolutions"), - "mobilenet": ( - "1704.04861", - "MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications", - ), - "deeplabv3+": ( - "1802.02611", - "Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation", - ), - "dds": ("2003.13678", "Designing Network Design Spaces"), - "scaling": ("2103.06877", "Fast and Accurate Model Scaling"), - "fcos": ("2006.09214", "FCOS: A Simple and Strong Anchor-free Object Detector"), - "rethinking-batchnorm": ("2105.07576", 'Rethinking "Batch" in BatchNorm'), -} - - -def paper_ref_role( - typ: str, - rawtext: str, - text: str, - lineno: int, - inliner, - options: Dict = {}, - content: List[str] = [], -): - """ - Parse :paper:`xxx`. Similar to the "extlinks" sphinx extension. 
- """ - from docutils import nodes, utils - from sphinx.util.nodes import split_explicit_title - - text = utils.unescape(text) - has_explicit_title, title, link = split_explicit_title(text) - link = link.lower() - if link not in _PAPER_DATA: - inliner.reporter.warning("Cannot find paper " + link) - paper_url, paper_title = "#", link - else: - paper_url, paper_title = _PAPER_DATA[link] - if "/" not in paper_url: - paper_url = "https://arxiv.org/abs/" + paper_url - if not has_explicit_title: - title = paper_title - pnode = nodes.reference(title, title, internal=False, refuri=paper_url) - return [pnode], [] - - -def setup(app): - from recommonmark.transform import AutoStructify - - app.add_domain(GithubURLDomain) - app.connect("autodoc-skip-member", autodoc_skip_member) - app.add_role("paper", paper_ref_role) - app.add_config_value( - "recommonmark_config", - {"enable_math": True, "enable_inline_math": True, "enable_eval_rst": True}, - True, - ) - app.add_transform(AutoStructify) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/index.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/index.rst deleted file mode 100755 index 8634b7b1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/index.rst +++ /dev/null @@ -1,14 +0,0 @@ -.. detectron2 documentation master file, created by - sphinx-quickstart on Sat Sep 21 13:46:45 2019. - You can adapt this file completely to your liking, but it should at least - contain the root `toctree` directive. - -Welcome to detectron2's documentation! -====================================== - -.. toctree:: - :maxdepth: 2 - - tutorials/index - notes/index - modules/index diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst deleted file mode 100755 index 449caaff..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/checkpoint.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.checkpoint -============================= - -.. automodule:: detectron2.checkpoint - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/config.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/config.rst deleted file mode 100755 index c76913d8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/config.rst +++ /dev/null @@ -1,18 +0,0 @@ -detectron2.config -========================= - -Related tutorials: :doc:`../tutorials/configs`, :doc:`../tutorials/extend`. - -.. automodule:: detectron2.config - :members: - :undoc-members: - :show-inheritance: - - -Yaml Config References ------------------ - -.. literalinclude:: ../../detectron2/config/defaults.py - :language: python - :linenos: - :lines: 7- diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data.rst deleted file mode 100755 index 0d5bd891..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data.rst +++ /dev/null @@ -1,37 +0,0 @@ -detectron2.data -======================= - -.. autodata:: detectron2.data.DatasetCatalog(dict) - :annotation: - -.. autodata:: detectron2.data.MetadataCatalog(dict) - :annotation: - -.. 
automodule:: detectron2.data - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.detection\_utils module ---------------------------------------- - -.. automodule:: detectron2.data.detection_utils - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.datasets module ---------------------------------------- - -.. automodule:: detectron2.data.datasets - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.samplers module ---------------------------------------- - -.. automodule:: detectron2.data.samplers - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst deleted file mode 100755 index 1533a434..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/data_transforms.rst +++ /dev/null @@ -1,10 +0,0 @@ -detectron2.data.transforms -==================================== - -Related tutorial: :doc:`../tutorials/augmentation`. - -.. automodule:: detectron2.data.transforms - :members: - :undoc-members: - :show-inheritance: - :imported-members: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/engine.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/engine.rst deleted file mode 100755 index 7e0d2b07..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/engine.rst +++ /dev/null @@ -1,26 +0,0 @@ -detectron2.engine -========================= - -Related tutorial: :doc:`../tutorials/training`. - -.. automodule:: detectron2.engine - :members: - :undoc-members: - :show-inheritance: - - -detectron2.engine.defaults module ---------------------------------- - -.. automodule:: detectron2.engine.defaults - :members: - :undoc-members: - :show-inheritance: - -detectron2.engine.hooks module ---------------------------------- - -.. automodule:: detectron2.engine.hooks - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst deleted file mode 100755 index 69bfc4b9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/evaluation.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.evaluation -============================= - -.. automodule:: detectron2.evaluation - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/export.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/export.rst deleted file mode 100755 index dcee14f8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/export.rst +++ /dev/null @@ -1,9 +0,0 @@ -detectron2.export -========================= - -Related tutorial: :doc:`../tutorials/deployment`. - -.. 
automodule:: detectron2.export - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst deleted file mode 100755 index c8bf9f58..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/fvcore.rst +++ /dev/null @@ -1,49 +0,0 @@ -fvcore documentation -==================== - -Detectron2 depends on utilities in -`fvcore `_. -We include part of fvcore documentation here for easier reference. - -fvcore.nn ------------------ - -.. automodule:: fvcore.nn - :members: - :inherited-members: - :undoc-members: - :show-inheritance: - -fvcore.common ---------------------- - -.. automodule:: fvcore.common.checkpoint - :members: - :undoc-members: - :show-inheritance: - -.. automodule:: fvcore.common.config - :members: - :undoc-members: - :show-inheritance: - -.. automodule:: fvcore.common.history_buffer - :members: - :undoc-members: - :show-inheritance: - -.. automodule:: fvcore.common.param_scheduler - :members: - :inherited-members: - :undoc-members: - :show-inheritance: - -.. automodule:: fvcore.common.registry - :members: - :undoc-members: - :show-inheritance: - -.. automodule:: fvcore.common.timer - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/index.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/index.rst deleted file mode 100755 index 14b75439..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/index.rst +++ /dev/null @@ -1,19 +0,0 @@ -API Documentation -================== - -.. toctree:: - - checkpoint - config - data - data_transforms - engine - evaluation - layers - model_zoo - modeling - solver - structures - utils - export - fvcore diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/layers.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/layers.rst deleted file mode 100755 index b43b42a7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/layers.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.layers -========================= - -.. automodule:: detectron2.layers - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst deleted file mode 100755 index 5abbad1f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/model_zoo.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.model_zoo -============================ - -.. automodule:: detectron2.model_zoo - :members: - :undoc-members: - :show-inheritance: diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/modeling.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/modeling.rst deleted file mode 100755 index a22c7ed3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/modeling.rst +++ /dev/null @@ -1,58 +0,0 @@ -detectron2.modeling -=========================== - -.. automodule:: detectron2.modeling - :members: - :undoc-members: - :show-inheritance: - - -detectron2.modeling.poolers module ---------------------------------------- - -.. 
automodule:: detectron2.modeling.poolers
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.modeling.sampling module
------------------------------------
-
-.. automodule:: detectron2.modeling.sampling
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.modeling.box_regression module
------------------------------------------
-
-.. automodule:: detectron2.modeling.box_regression
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-Model Registries
------------------
-
-These are the different registries provided in modeling.
-Each registry provides you the ability to replace it with your customized component,
-without having to modify detectron2's code.
-
-Note that it is impossible to allow users to customize any line of code directly.
-Even just to add one line somewhere,
-you'll likely need to find the smallest registry which contains that line,
-and register your component to that registry.
-
-
-.. autodata:: detectron2.modeling.META_ARCH_REGISTRY
-.. autodata:: detectron2.modeling.BACKBONE_REGISTRY
-.. autodata:: detectron2.modeling.PROPOSAL_GENERATOR_REGISTRY
-.. autodata:: detectron2.modeling.RPN_HEAD_REGISTRY
-.. autodata:: detectron2.modeling.ANCHOR_GENERATOR_REGISTRY
-.. autodata:: detectron2.modeling.ROI_HEADS_REGISTRY
-.. autodata:: detectron2.modeling.ROI_BOX_HEAD_REGISTRY
-.. autodata:: detectron2.modeling.ROI_MASK_HEAD_REGISTRY
-.. autodata:: detectron2.modeling.ROI_KEYPOINT_HEAD_REGISTRY
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/solver.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/solver.rst
deleted file mode 100755
index 59d98c72..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/solver.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-detectron2.solver
-=========================
-
-.. automodule:: detectron2.solver
-   :members:
-   :undoc-members:
-   :show-inheritance:
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/structures.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/structures.rst
deleted file mode 100755
index 1369dc08..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/structures.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-detectron2.structures
-=============================
-
-.. automodule:: detectron2.structures
-   :members:
-   :undoc-members:
-   :show-inheritance:
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/utils.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/utils.rst
deleted file mode 100755
index ab58f2ca..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/modules/utils.rst
+++ /dev/null
@@ -1,80 +0,0 @@
-detectron2.utils
-========================
-
-detectron2.utils.colormap module
---------------------------------
-
-.. automodule:: detectron2.utils.colormap
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-detectron2.utils.comm module
-----------------------------
-
-.. automodule:: detectron2.utils.comm
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.utils.events module
-------------------------------
-
-.. automodule:: detectron2.utils.events
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.utils.logger module
-------------------------------
-
-.. 
automodule:: detectron2.utils.logger - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.registry module --------------------------------- - -.. automodule:: detectron2.utils.registry - :members: - :undoc-members: - :show-inheritance: - -detectron2.utils.memory module ----------------------------------- - -.. automodule:: detectron2.utils.memory - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.analysis module ----------------------------------- - -.. automodule:: detectron2.utils.analysis - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.visualizer module ----------------------------------- - -.. automodule:: detectron2.utils.visualizer - :members: - :undoc-members: - :show-inheritance: - -detectron2.utils.video\_visualizer module ------------------------------------------ - -.. automodule:: detectron2.utils.video_visualizer - :members: - :undoc-members: - :show-inheritance: - diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md deleted file mode 100755 index b41588da..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/benchmarks.md +++ /dev/null @@ -1,196 +0,0 @@ - -# Benchmarks - -Here we benchmark the training speed of a Mask R-CNN in detectron2, -with some other popular open source Mask R-CNN implementations. - - -### Settings - -* Hardware: 8 NVIDIA V100s with NVLink. -* Software: Python 3.7, CUDA 10.1, cuDNN 7.6.5, PyTorch 1.5, - TensorFlow 1.15.0rc2, Keras 2.2.5, MxNet 1.6.0b20190820. -* Model: an end-to-end R-50-FPN Mask-RCNN model, using the same hyperparameter as the - [Detectron baseline config](https://github.com/facebookresearch/Detectron/blob/master/configs/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml) - (it does not have scale augmentation). -* Metrics: We use the average throughput in iterations 100-500 to skip GPU warmup time. - Note that for R-CNN-style models, the throughput of a model typically changes during training, because - it depends on the predictions of the model. Therefore this metric is not directly comparable with - "train speed" in model zoo, which is the average speed of the entire training run. - - -### Main Results - -```eval_rst -+-------------------------------+--------------------+ -| Implementation | Throughput (img/s) | -+===============================+====================+ -| |D2| |PT| | 62 | -+-------------------------------+--------------------+ -| mmdetection_ |PT| | 53 | -+-------------------------------+--------------------+ -| maskrcnn-benchmark_ |PT| | 53 | -+-------------------------------+--------------------+ -| tensorpack_ |TF| | 50 | -+-------------------------------+--------------------+ -| simpledet_ |mxnet| | 39 | -+-------------------------------+--------------------+ -| Detectron_ |C2| | 19 | -+-------------------------------+--------------------+ -| `matterport/Mask_RCNN`__ |TF| | 14 | -+-------------------------------+--------------------+ - -.. _maskrcnn-benchmark: https://github.com/facebookresearch/maskrcnn-benchmark/ -.. _tensorpack: https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN -.. _mmdetection: https://github.com/open-mmlab/mmdetection/ -.. _simpledet: https://github.com/TuSimple/simpledet/ -.. _Detectron: https://github.com/facebookresearch/Detectron -__ https://github.com/matterport/Mask_RCNN/ - -.. 
|D2| image:: https://github.com/facebookresearch/detectron2/raw/main/.github/Detectron2-Logo-Horz.svg?sanitize=true - :height: 15pt - :target: https://github.com/facebookresearch/detectron2/ -.. |PT| image:: https://pytorch.org/assets/images/logo-icon.svg - :width: 15pt - :height: 15pt - :target: https://pytorch.org -.. |TF| image:: https://static.nvidiagrid.net/ngc/containers/tensorflow.png - :width: 15pt - :height: 15pt - :target: https://tensorflow.org -.. |mxnet| image:: https://github.com/dmlc/web-data/raw/master/mxnet/image/mxnet_favicon.png - :width: 15pt - :height: 15pt - :target: https://mxnet.apache.org/ -.. |C2| image:: https://caffe2.ai/static/logo.svg - :width: 15pt - :height: 15pt - :target: https://caffe2.ai -``` - - -Details for each implementation: - -* __Detectron2__: with release v0.1.2, run: - ``` - python tools/train_net.py --config-file configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml --num-gpus 8 - ``` - -* __mmdetection__: at commit `b0d845f`, run - ``` - ./tools/dist_train.sh configs/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco.py 8 - ``` - -* __maskrcnn-benchmark__: use commit `0ce8f6f` with `sed -i 's/torch.uint8/torch.bool/g' **/*.py; sed -i 's/AT_CHECK/TORCH_CHECK/g' **/*.cu` - to make it compatible with PyTorch 1.5. Then, run training with - ``` - python -m torch.distributed.launch --nproc_per_node=8 tools/train_net.py --config-file configs/e2e_mask_rcnn_R_50_FPN_1x.yaml - ``` - The speed we observed is faster than its model zoo, likely due to different software versions. - -* __tensorpack__: at commit `caafda`, `export TF_CUDNN_USE_AUTOTUNE=0`, then run - ``` - mpirun -np 8 ./train.py --config DATA.BASEDIR=/data/coco TRAINER=horovod BACKBONE.STRIDE_1X1=True TRAIN.STEPS_PER_EPOCH=50 --load ImageNet-R50-AlignPadding.npz - ``` - -* __SimpleDet__: at commit `9187a1`, run - ``` - python detection_train.py --config config/mask_r50v1_fpn_1x.py - ``` - -* __Detectron__: run - ``` - python tools/train_net.py --cfg configs/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml - ``` - Note that many of its ops run on CPUs, therefore the performance is limited. - -* __matterport/Mask_RCNN__: at commit `3deaec`, apply the following diff, `export TF_CUDNN_USE_AUTOTUNE=0`, then run - ``` - python coco.py train --dataset=/data/coco/ --model=imagenet - ``` - Note that many small details in this implementation might be different - from Detectron's standards. - -
      - - (diff to make it use the same hyperparameters - click to expand) - - - ```diff - diff --git i/mrcnn/model.py w/mrcnn/model.py - index 62cb2b0..61d7779 100644 - --- i/mrcnn/model.py - +++ w/mrcnn/model.py - @@ -2367,8 +2367,8 @@ class MaskRCNN(): - epochs=epochs, - steps_per_epoch=self.config.STEPS_PER_EPOCH, - callbacks=callbacks, - - validation_data=val_generator, - - validation_steps=self.config.VALIDATION_STEPS, - + #validation_data=val_generator, - + #validation_steps=self.config.VALIDATION_STEPS, - max_queue_size=100, - workers=workers, - use_multiprocessing=True, - diff --git i/mrcnn/parallel_model.py w/mrcnn/parallel_model.py - index d2bf53b..060172a 100644 - --- i/mrcnn/parallel_model.py - +++ w/mrcnn/parallel_model.py - @@ -32,6 +32,7 @@ class ParallelModel(KM.Model): - keras_model: The Keras model to parallelize - gpu_count: Number of GPUs. Must be > 1 - """ - + super().__init__() - self.inner_model = keras_model - self.gpu_count = gpu_count - merged_outputs = self.make_parallel() - diff --git i/samples/coco/coco.py w/samples/coco/coco.py - index 5d172b5..239ed75 100644 - --- i/samples/coco/coco.py - +++ w/samples/coco/coco.py - @@ -81,7 +81,10 @@ class CocoConfig(Config): - IMAGES_PER_GPU = 2 - - # Uncomment to train on 8 GPUs (default is 1) - - # GPU_COUNT = 8 - + GPU_COUNT = 8 - + BACKBONE = "resnet50" - + STEPS_PER_EPOCH = 50 - + TRAIN_ROIS_PER_IMAGE = 512 - - # Number of classes (including background) - NUM_CLASSES = 1 + 80 # COCO has 80 classes - @@ -496,29 +499,10 @@ if __name__ == '__main__': - # *** This training schedule is an example. Update to your needs *** - - # Training - Stage 1 - - print("Training network heads") - model.train(dataset_train, dataset_val, - learning_rate=config.LEARNING_RATE, - epochs=40, - - layers='heads', - - augmentation=augmentation) - - - - # Training - Stage 2 - - # Finetune layers from ResNet stage 4 and up - - print("Fine tune Resnet stage 4 and up") - - model.train(dataset_train, dataset_val, - - learning_rate=config.LEARNING_RATE, - - epochs=120, - - layers='4+', - - augmentation=augmentation) - - - - # Training - Stage 3 - - # Fine tune all layers - - print("Fine tune all layers") - - model.train(dataset_train, dataset_val, - - learning_rate=config.LEARNING_RATE / 10, - - epochs=160, - - layers='all', - + layers='3+', - augmentation=augmentation) - - elif args.command == "evaluate": - ``` - -
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/changelog.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/changelog.md deleted file mode 100755 index 000e9f88..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/changelog.md +++ /dev/null @@ -1,48 +0,0 @@ -# Change Log and Backward Compatibility - -### Releases -See release logs at -[https://github.com/facebookresearch/detectron2/releases](https://github.com/facebookresearch/detectron2/releases) -for new updates. - -### Backward Compatibility - -Due to the research nature of what the library does, there might be backward incompatible changes. -But we try to reduce users' disruption in the following ways: -* APIs listed in [API documentation](https://detectron2.readthedocs.io/modules/index.html), including - function/class names, their arguments, and documented class attributes, are considered *stable* unless - otherwise noted in the documentation. - They are less likely to be broken, but if needed, will trigger a deprecation warning for a reasonable period - before getting broken, and will be documented in release logs. -* Other functions/classes/attributes are considered internal, and are more likely to change. - However, we're aware that some of them may already be used by other projects, and in particular we may - use them for convenience among projects under `detectron2/projects`. - For such APIs, we may treat them as stable APIs and also apply the above strategies. - They may be promoted to stable when we're ready. -* Projects under "detectron2/projects" or imported with "detectron2.projects" are research projects - and are all considered experimental. -* Classes/functions that contain the word "default" or are explicitly documented to produce - "default behavior" may change their behaviors when new features are added. - -Despite the possible breakage, if a third-party project would like to keep up with the latest updates -in detectron2, using it as a library will still be less disruptive than forking, because -the frequency and scope of API changes will be much smaller than code changes. - -To see such changes, search for "incompatible changes" in [release logs](https://github.com/facebookresearch/detectron2/releases). - -### Config Version Change Log - -Detectron2's config version has not been changed since open source. -There is no need for an open source user to worry about this. - -* v1: Rename `RPN_HEAD.NAME` to `RPN.HEAD_NAME`. -* v2: A batch of renames of many configurations before release. - -### Silent Regressions in Historical Versions - -We list a few silent regressions, since they may silently produce incorrect results and be hard to debug. - -* 04/01/2020 - 05/11/2020: Bad accuracy if `TRAIN_ON_PRED_BOXES` is set to True. -* 03/30/2020 - 04/01/2020: ResNets are not correctly built. -* 12/19/2019 - 12/26/2019: Using aspect ratio grouping causes a drop in accuracy. -* - 11/9/2019: Test time augmentation does not predict the last category.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/compatibility.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/compatibility.md deleted file mode 100755 index 83d93f51..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/compatibility.md +++ /dev/null @@ -1,84 +0,0 @@ -# Compatibility with Other Libraries - -## Compatibility with Detectron (and maskrcnn-benchmark) - -Detectron2 addresses some legacy issues left in Detectron. As a result, their models -are not compatible: -running inference with the same model weights will produce different results in the two code bases. - -The major differences regarding inference are: - -- The height and width of a box with corners (x1, y1) and (x2, y2) is now computed more naturally as - width = x2 - x1 and height = y2 - y1; - In Detectron, a "+ 1" was added to both height and width. - - Note that the relevant ops in Caffe2 have [adopted this change of convention](https://github.com/pytorch/pytorch/pull/20550) - with an extra option. - So it is still possible to run inference with a Detectron2-trained model in Caffe2. - - The change in height/width calculations most notably changes: - - encoding/decoding in bounding box regression. - - non-maximum suppression. The effect here is negligible, though. - -- RPN now uses simpler anchors with fewer quantization artifacts. - - In Detectron, the anchors were quantized and - [do not have accurate areas](https://github.com/facebookresearch/Detectron/issues/227). - In Detectron2, the anchors are center-aligned to feature grid points and not quantized. - -- Classification layers have a different ordering of class labels. - - This involves any trainable parameter with shape (..., num_categories + 1, ...). - In Detectron2, integer labels [0, K-1] correspond to the K = num_categories object categories - and the label "K" corresponds to the special "background" category. - In Detectron, label "0" means background, and labels [1, K] correspond to the K categories. - -- ROIAlign is implemented differently. The new implementation is [available in Caffe2](https://github.com/pytorch/pytorch/pull/23706). - - 1. All the ROIs are shifted by half a pixel compared to Detectron in order to create better image-feature-map alignment. - See `layers/roi_align.py` for details. - To enable the old behavior, use `ROIAlign(aligned=False)`, or `POOLER_TYPE=ROIAlign` instead of - `ROIAlignV2` (the default). - - 1. The ROIs are not required to have a minimum size of 1. - This will lead to tiny differences in the output, but should be negligible. - -- The mask inference function is different. - - In Detectron2, the "paste_mask" function is different and should be more accurate than in Detectron. This change - can improve mask AP on COCO by ~0.5% absolute. - -There are some other differences in training as well, but they won't affect -model-level compatibility. The major ones are: - -- We fixed a [bug](https://github.com/facebookresearch/Detectron/issues/459) in - Detectron, by making `RPN.POST_NMS_TOPK_TRAIN` per-image, rather than per-batch. - The fix may lead to a small accuracy drop for a few models (e.g. keypoint - detection) and will require some parameter tuning to match the Detectron results. -- For simplicity, we change the default loss in bounding box regression to L1 loss, instead of smooth L1 loss.
- We have observed that this tends to slightly decrease box AP50 while improving box AP for higher - overlap thresholds (and leading to a slight overall improvement in box AP). -- We interpret the coordinates in COCO bounding box and segmentation annotations - as coordinates in range `[0, width]` or `[0, height]`. The coordinates in - COCO keypoint annotations are interpreted as pixel indices in range `[0, width - 1]` or `[0, height - 1]`. - Note that this affects how flip augmentation is implemented. - - -[This article](https://ppwwyyxx.com/blog/2021/Where-are-Pixels/) -explains more details on the above mentioned issues -about pixels, coordinates, and "+1"s. - - -## Compatibility with Caffe2 - -As mentioned above, despite the incompatibilities with Detectron, the relevant -ops have been implemented in Caffe2. -Therefore, models trained with detectron2 can be converted in Caffe2. -See [Deployment](../tutorials/deployment.md) for the tutorial. - -## Compatibility with TensorFlow - -Most ops are available in TensorFlow, although some tiny differences in -the implementation of resize / ROIAlign / padding need to be addressed. -A working conversion script is provided by [tensorpack Faster R-CNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN/convert_d2) -to run a standard detectron2 model in TensorFlow. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/index.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/index.rst deleted file mode 100755 index 63cf907b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/notes/index.rst +++ /dev/null @@ -1,10 +0,0 @@ -Notes -====================================== - -.. toctree:: - :maxdepth: 2 - - benchmarks - compatibility - contributing - changelog diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/requirements.txt b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/requirements.txt deleted file mode 100755 index 58d3c2a5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/requirements.txt +++ /dev/null @@ -1,21 +0,0 @@ -docutils==0.16 -# https://github.com/sphinx-doc/sphinx/commit/7acd3ada3f38076af7b2b5c9f3b60bb9c2587a3d -sphinx==3.2.0 -recommonmark==0.6.0 -sphinx_rtd_theme -# Dependencies here are only those required by import -termcolor -numpy -tqdm -matplotlib -termcolor -yacs -tabulate -cloudpickle -Pillow -future -git+git://github.com/facebookresearch/fvcore.git -https://download.pytorch.org/whl/cpu/torch-1.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl -https://download.pytorch.org/whl/cpu/torchvision-0.9.1%2Bcpu-cp37-cp37m-linux_x86_64.whl -omegaconf>=2.1.0.dev24 -hydra-core>=1.1.0.dev5 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/README.md deleted file mode 100755 index 1ca9c94d..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/README.md +++ /dev/null @@ -1,4 +0,0 @@ -# Read the docs: - -The latest documentation built from this directory is available at [detectron2.readthedocs.io](https://detectron2.readthedocs.io/). -Documents in this directory are not meant to be read on github. 
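To make the box and label conventions from the compatibility notes above concrete, here is a small worked example in plain Python (the values are illustrative):

```python
# Width/height of a box with corners (x1, y1) and (x2, y2):
x1, y1, x2, y2 = 10.0, 20.0, 30.0, 50.0
w_d2, h_d2 = x2 - x1, y2 - y1            # detectron2 convention: 20.0, 30.0
w_d1, h_d1 = x2 - x1 + 1, y2 - y1 + 1    # legacy Detectron "+ 1" convention: 21.0, 31.0

# Class-label ordering for K = num_categories:
K = 80
d2_background_label = K          # detectron2: objects are 0..K-1, background is K
detectron_background_label = 0   # Detectron: background is 0, objects are 1..K
```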
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md deleted file mode 100755 index 7601a082..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/augmentation.md +++ /dev/null @@ -1,186 +0,0 @@ - -# Data Augmentation - -Augmentation is an important part of training. -Detectron2's data augmentation system aims at addressing the following goals: - -1. Allow augmenting multiple data types together - (e.g., images together with their bounding boxes and masks) -2. Allow applying a sequence of statically-declared augmentations -3. Allow adding custom new data types to augment (rotated bounding boxes, video clips, etc.) -4. Process and manipulate the __operations__ that are applied by augmentations - -The first two features cover most of the common use cases, and are also -available in other libraries such as [albumentations](https://medium.com/pytorch/multi-target-in-albumentations-16a777e9006e). -Supporting the other features adds some overhead to detectron2's augmentation API, -which we'll explain in this tutorial. - -This tutorial focuses on how to use augmentations when writing new data loaders, -and how to write new augmentations. -If you use the default data loader in detectron2, it already supports taking a user-provided list of custom augmentations, -as explained in the [Dataloader tutorial](data_loading). - -## Basic Usage - -The basic usage of features (1) and (2) is like the following: -```python -from detectron2.data import transforms as T -# Define a sequence of augmentations: -augs = T.AugmentationList([ - T.RandomBrightness(0.9, 1.1), - T.RandomFlip(prob=0.5), - T.RandomCrop("absolute", (640, 640)) -]) # type: T.Augmentation - -# Define the augmentation input ("image" required, others optional): -input = T.AugInput(image, boxes=boxes, sem_seg=sem_seg) -# Apply the augmentation: -transform = augs(input) # type: T.Transform -image_transformed = input.image # new image -sem_seg_transformed = input.sem_seg # new semantic segmentation - -# For any extra data that needs to be augmented together, use transform, e.g.: -image2_transformed = transform.apply_image(image2) -polygons_transformed = transform.apply_polygons(polygons) -``` - -Three basic concepts are involved here. They are: -* [T.Augmentation](../modules/data_transforms.html#detectron2.data.transforms.Augmentation) defines the __"policy"__ to modify inputs. - * its `__call__(AugInput) -> Transform` method augments the inputs in-place, and returns the operation that is applied -* [T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform) - implements the actual __operations__ to transform data - * it has methods such as `apply_image`, `apply_coords` that define how to transform each data type -* [T.AugInput](../modules/data_transforms.html#detectron2.data.transforms.AugInput) - stores inputs needed by `T.Augmentation` and how they should be transformed. - This concept is needed for some advanced usage. - Using this class directly should be sufficient for all common use cases, - since extra data not in `T.AugInput` can be augmented using the returned - `transform`, as shown in the above example. - -## Write New Augmentations - -Most 2D augmentations only need to know about the input image.
Such augmentations can be implemented easily like this: - -```python -import numpy as np - -class MyColorAugmentation(T.Augmentation): - def get_transform(self, image): - r = np.random.rand(2) - return T.ColorTransform(lambda x: x * r[0] + r[1] * 10) - -class MyCustomResize(T.Augmentation): - def get_transform(self, image): - old_h, old_w = image.shape[:2] - new_h, new_w = int(old_h * np.random.rand()), int(old_w * 1.5) - return T.ResizeTransform(old_h, old_w, new_h, new_w) - -augs = MyCustomResize() -transform = augs(input) -``` - -In addition to the image, any attributes of the given `AugInput` can be used as long -as they are part of the function signature, e.g.: - -```python -class MyCustomCrop(T.Augmentation): - def get_transform(self, image, sem_seg): - # decide where to crop using both image and sem_seg - return T.CropTransform(...) - -augs = MyCustomCrop() -assert hasattr(input, "image") and hasattr(input, "sem_seg") -transform = augs(input) -``` - -New transform operations can also be added by subclassing -[T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform). - -## Advanced Usage - -We give a few examples of advanced usages that -are enabled by our system. -These options can be interesting for new research, -although changing them is often not needed -for standard use cases. - -### Custom transform strategy - -Instead of only returning the augmented data, detectron2's `Augmentation` returns the __operations__ as `T.Transform`. -This allows users to apply a custom transform strategy to their data. -We use keypoint data as an example. - -Keypoints are (x, y) coordinates, but they are not so trivial to augment due to the semantic meaning they carry. -Such meaning is only known to the users; therefore, users may want to augment them manually -by looking at the returned `transform`. -For example, when an image is horizontally flipped, we'd like to swap the keypoint annotations for "left eye" and "right eye". -This can be done like this (included by default in detectron2's default data loader): -```python -# augs, input are defined as in previous examples -transform = augs(input) # type: T.Transform -keypoints_xy = transform.apply_coords(keypoints_xy) # transform the coordinates - -# get a list of all transforms that were applied -transforms = T.TransformList([transform]).transforms -# check if it is flipped an odd number of times -do_hflip = sum(isinstance(t, T.HFlipTransform) for t in transforms) % 2 == 1 -if do_hflip: - keypoints_xy = keypoints_xy[flip_indices_mapping] -``` - -As another example, keypoint annotations often have a "visibility" field. -A sequence of augmentations might augment a visible keypoint out of the image boundary (e.g. with cropping), -but then bring it back within the boundary afterwards (e.g. with image padding). -If users decide to label such keypoints "invisible", -then the visibility check has to happen after every transform step.
-This can be achieved by: - -```python -transform = augs(input) # type: T.TransformList -assert isinstance(transform, T.TransformList) -for t in transform.transforms: - keypoints_xy = t.apply_coords(keypoints_xy) - visibility &= ((keypoints_xy >= [0, 0]) & (keypoints_xy <= [W, H])).all(axis=1) - -# btw, detectron2's `transform_keypoint_annotations` function chooses to label such keypoints "visible": - # keypoints_xy = transform.apply_coords(keypoints_xy) - # visibility &= ((keypoints_xy >= [0, 0]) & (keypoints_xy <= [W, H])).all(axis=1) -``` - - -### Geometrically invert the transform -If images are pre-processed by augmentations before inference, the predicted results -such as segmentation masks are localized on the augmented image. -We'd like to invert the applied augmentation with the [inverse()](../modules/data_transforms.html#detectron2.data.transforms.Transform.inverse) -API, to obtain results on the original image: -```python -transform = augs(input) -pred_mask = make_prediction(input.image) -inv_transform = transform.inverse() -pred_mask_orig = inv_transform.apply_segmentation(pred_mask) -``` - -### Add new data types - -[T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform) -supports a few common data types to transform, including images, coordinates, masks, boxes, polygons. -It allows registering new data types, e.g.: -```python -@T.HFlipTransform.register_type("rotated_boxes") -def func(flip_transform: T.HFlipTransform, rotated_boxes: Any): - # do the work - return flipped_rotated_boxes - -t = T.HFlipTransform(width=800) -transformed_rotated_boxes = t.apply_rotated_boxes(rotated_boxes) # func will be called -``` - -### Extend T.AugInput - -An augmentation can only access attributes available in the given input. -[T.AugInput](../modules/data_transforms.html#detectron2.data.transforms.StandardAugInput) defines "image", "boxes", "sem_seg", -which are sufficient for common augmentation strategies to decide how to augment. -If not, a custom implementation is needed. - -By re-implementing the "transform()" method in AugInput, it is also possible to -augment different fields in ways that are dependent on each other. -Such use cases are uncommon (e.g., post-processing bounding boxes based on augmented masks), but allowed by the system. - diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/configs.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/configs.md deleted file mode 100755 index 751e4eb6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/configs.md +++ /dev/null @@ -1,62 +0,0 @@ -# Yacs Configs - -Detectron2 provides a key-value based config system that can be -used to obtain standard, common behaviors. - -This system uses YAML and [yacs](https://github.com/rbgirshick/yacs). -YAML is a very limited language, -so we do not expect all features in detectron2 to be available through configs. -If you need something that's not available in the config space, -please write code using detectron2's API. - -With the introduction of a more powerful [LazyConfig system](lazyconfigs.md), -we no longer add functionality / new keys to the Yacs/Yaml-based config system. - -### Basic Usage - -Some basic usage of the `CfgNode` object is shown here. See more in [documentation](../modules/config.html#detectron2.config.CfgNode).
-```python -from detectron2.config import get_cfg -cfg = get_cfg() # obtain detectron2's default config -cfg.xxx = yyy # add new configs for your own custom components -cfg.merge_from_file("my_cfg.yaml") # load values from a file - -cfg.merge_from_list(["MODEL.WEIGHTS", "weights.pth"]) # can also load values from a list of str -print(cfg.dump()) # print formatted configs -with open("output.yaml", "w") as f: - f.write(cfg.dump()) # save config to file -``` - -In addition to the basic Yaml syntax, the config file can -define a `_BASE_: base.yaml` field, which will load a base config file first. -Values in the base config will be overwritten in sub-configs, if there are any conflicts. -We provided several base configs for standard model architectures. - -Many builtin tools in detectron2 accept command line config overwrite: -Key-value pairs provided in the command line will overwrite the existing values in the config file. -For example, [demo.py](../../demo/demo.py) can be used with -``` -./demo.py --config-file config.yaml [--other-options] \ - --opts MODEL.WEIGHTS /path/to/weights INPUT.MIN_SIZE_TEST 1000 -``` - -To see a list of available configs in detectron2 and what they mean, -check [Config References](../modules/config.html#config-references) - -### Configs in Projects - -A project that lives outside the detectron2 library may define its own configs, which will need to be added -for the project to be functional, e.g.: -```python -from detectron2.projects.point_rend import add_pointrend_config -cfg = get_cfg() # obtain detectron2's default config -add_pointrend_config(cfg) # add pointrend's default config -# ... ... -``` - -### Best Practice with Configs - -1. Treat the configs you write as "code": avoid copying them or duplicating them; use `_BASE_` - to share common parts between configs. - -2. Keep the configs you write simple: don't include keys that do not affect the experimental setting. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md deleted file mode 100755 index 1d2769fc..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/data_loading.md +++ /dev/null @@ -1,95 +0,0 @@ - -# Dataloader - -Dataloader is the component that provides data to models. -A dataloader usually (but not necessarily) takes raw information from [datasets](./datasets.md), -and process them into a format needed by the model. - -## How the Existing Dataloader Works - -Detectron2 contains a builtin data loading pipeline. -It's good to understand how it works, in case you need to write a custom one. - -Detectron2 provides two functions -[build_detection_{train,test}_loader](../modules/data.html#detectron2.data.build_detection_train_loader) -that create a default data loader from a given config. -Here is how `build_detection_{train,test}_loader` work: - -1. It takes the name of a registered dataset (e.g., "coco_2017_train") and loads a `list[dict]` representing the dataset items - in a lightweight format. These dataset items are not yet ready to be used by the model (e.g., images are - not loaded into memory, random augmentations have not been applied, etc.). - Details about the dataset format and dataset registration can be found in - [datasets](./datasets.md). -2. 
Each dict in this list is mapped by a function ("mapper"): - * Users can customize this mapping function by specifying the "mapper" argument in - `build_detection_{train,test}_loader`. The default mapper is [DatasetMapper](../modules/data.html#detectron2.data.DatasetMapper). - * The output format of the mapper can be arbitrary, as long as it is accepted by the consumer of this data loader (usually the model). - The outputs of the default mapper, after batching, follow the default model input format documented in - [Use Models](./models.html#model-input-format). - * The role of the mapper is to transform the lightweight representation of a dataset item into a format - that is ready for the model to consume (including, e.g., reading images, performing random data augmentation, and converting to torch Tensors). - If you would like to perform custom transformations to data, you often want a custom mapper. -3. The outputs of the mapper are batched (simply into a list). -4. This batched data is the output of the data loader. Typically, it's also the input of - `model.forward()`. - - -## Write a Custom Dataloader - -Using a different "mapper" with `build_detection_{train,test}_loader(mapper=)` works for most use cases -of custom data loading. -For example, if you want to resize all images to a fixed size for training, use: - -```python -import detectron2.data.transforms as T -from detectron2.data import DatasetMapper # the default mapper -dataloader = build_detection_train_loader(cfg, - mapper=DatasetMapper(cfg, is_train=True, augmentations=[ - T.Resize((800, 800)) - ])) -# use this dataloader instead of the default -``` -If the arguments of the default [DatasetMapper](../modules/data.html#detectron2.data.DatasetMapper) -do not provide what you need, you may write a custom mapper function and use it instead, e.g.: - -```python -import copy -import torch -import detectron2.data.transforms as T -from detectron2.data import detection_utils as utils -# Show how to implement a minimal mapper, similar to the default DatasetMapper -def mapper(dataset_dict): - dataset_dict = copy.deepcopy(dataset_dict) # it will be modified by code below - # can use other ways to read image - image = utils.read_image(dataset_dict["file_name"], format="BGR") - # See "Data Augmentation" tutorial for detailed usage - auginput = T.AugInput(image) - transform = T.Resize((800, 800))(auginput) - image = torch.from_numpy(auginput.image.transpose(2, 0, 1)) - annos = [ - utils.transform_instance_annotations(annotation, [transform], image.shape[1:]) - for annotation in dataset_dict.pop("annotations") - ] - return { - # create the format that the model expects - "image": image, - "instances": utils.annotations_to_instances(annos, image.shape[1:]) - } -dataloader = build_detection_train_loader(cfg, mapper=mapper) -``` - -If you want to change not only the mapper (e.g., in order to implement different sampling or batching logic), -`build_detection_train_loader` won't work and you will need to write a different data loader. -The data loader is simply a -Python iterator that produces [the format](./models.md) that the model accepts. -You can implement it using any tools you like. - -Whatever you implement, it's recommended to -check out [API documentation of detectron2.data](../modules/data) to learn more about the APIs of -these functions. - -## Use a Custom Dataloader - -If you use [DefaultTrainer](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer), -you can overwrite its `build_{train,test}_loader` method to use your own dataloader.
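As a hedged sketch of the "write a different data loader" case just described: any Python iterator that yields batches in the model's input format works. The helper below is an illustrative assumption (not detectron2's own implementation); it assumes a dataset registered in `DatasetCatalog` and a `mapper` like the one shown above:

```python
import random
from detectron2.data import DatasetCatalog

def my_train_loader(dataset_name, mapper, batch_size=2):
    dicts = DatasetCatalog.get(dataset_name)  # lightweight list[dict]
    while True:  # training loops consume an endless iterator of batches
        yield [mapper(random.choice(dicts)) for _ in range(batch_size)]
```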
-See the [deeplab dataloader](../../projects/DeepLab/train_net.py) -for an example. - -If you write your own training loop, you can plug in your data loader easily. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md deleted file mode 100755 index 91103f64..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/datasets.md +++ /dev/null @@ -1,290 +0,0 @@ -# Use Custom Datasets - -This document explains how the dataset APIs -([DatasetCatalog](../modules/data.html#detectron2.data.DatasetCatalog), [MetadataCatalog](../modules/data.html#detectron2.data.MetadataCatalog)) -work, and how to use them to add custom datasets. - -Datasets that have builtin support in detectron2 are listed in [builtin datasets](builtin_datasets.md). -If you want to use a custom dataset while also reusing detectron2's data loaders, -you will need to: - -1. __Register__ your dataset (i.e., tell detectron2 how to obtain your dataset). -2. Optionally, __register metadata__ for your dataset. - -Next, we explain the above two concepts in detail. - -The [Colab tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -has a live example of how to register and train on a dataset of custom formats. - -### Register a Dataset - -To let detectron2 know how to obtain a dataset named "my_dataset", users need to implement -a function that returns the items in your dataset and then tell detectron2 about this -function: -```python -def my_dataset_function(): - ... - return list[dict] in the following format - -from detectron2.data import DatasetCatalog -DatasetCatalog.register("my_dataset", my_dataset_function) -# later, to access the data: -data: List[Dict] = DatasetCatalog.get("my_dataset") -``` - -Here, the snippet associates a dataset named "my_dataset" with a function that returns the data. -The function must return the same data (with same order) if called multiple times. -The registration stays effective until the process exits. - -The function can do arbitrary things and should return the data in `list[dict]`, each dict in either -of the following formats: -1. Detectron2's standard dataset dict, described below. This will make it work with many other builtin - features in detectron2, so it's recommended to use it when it's sufficient. -2. Any custom format. You can also return arbitrary dicts in your own format, - such as adding extra keys for new tasks. - Then you will need to handle them properly downstream as well. - See below for more details. - -#### Standard Dataset Dicts - -For standard tasks -(instance detection, instance/semantic/panoptic segmentation, keypoint detection), -we load the original dataset into `list[dict]` with a specification similar to COCO's annotations. -This is our standard representation for a dataset. - -Each dict contains information about one image. -The dict may have the following fields, -and the required fields vary based on what the dataloader or the task needs (see more below). - -```eval_rst -.. list-table:: - :header-rows: 1 - - * - Task - - Fields - * - Common - - file_name, height, width, image_id - - * - Instance detection/segmentation - - annotations - - * - Semantic segmentation - - sem_seg_file_name - - * - Panoptic segmentation - - pan_seg_file_name, segments_info -``` - -+ `file_name`: the full path to the image file. -+ `height`, `width`: integer. The shape of the image. 
-+ `image_id` (str or int): a unique id that identifies this image. Required by many - evaluators to identify the images, but a dataset may use it for different purposes. -+ `annotations` (list[dict]): Required by __instance detection/segmentation or keypoint detection__ tasks. - Each dict corresponds to annotations of one instance in this image, and - may contain the following keys: - + `bbox` (list[float], required): list of 4 numbers representing the bounding box of the instance. - + `bbox_mode` (int, required): the format of bbox. It must be a member of - [structures.BoxMode](../modules/structures.html#detectron2.structures.BoxMode). - Currently supports: `BoxMode.XYXY_ABS`, `BoxMode.XYWH_ABS`. - + `category_id` (int, required): an integer in the range [0, num_categories-1] representing the category label. - The value num_categories is reserved to represent the "background" category, if applicable. - + `segmentation` (list[list[float]] or dict): the segmentation mask of the instance. - + If `list[list[float]]`, it represents a list of polygons, one for each connected component - of the object. Each `list[float]` is one simple polygon in the format of `[x1, y1, ..., xn, yn]` (n≥3). - The Xs and Ys are absolute coordinates in unit of pixels. - + If `dict`, it represents the per-pixel segmentation mask in COCO's compressed RLE format. - The dict should have keys "size" and "counts". You can convert a uint8 segmentation mask of 0s and - 1s into such dict by `pycocotools.mask.encode(np.asarray(mask, order="F"))`. - `cfg.INPUT.MASK_FORMAT` must be set to `bitmask` if using the default data loader with such format. - + `keypoints` (list[float]): in the format of [x1, y1, v1,..., xn, yn, vn]. - v[i] means the [visibility](http://cocodataset.org/#format-data) of this keypoint. - `n` must be equal to the number of keypoint categories. - The Xs and Ys are absolute real-value coordinates in range [0, W or H]. - - (Note that the keypoint coordinates in COCO format are integers in range [0, W-1 or H-1], which is different - from our standard format. Detectron2 adds 0.5 to COCO keypoint coordinates to convert them from discrete - pixel indices to floating point coordinates.) - + `iscrowd`: 0 (default) or 1. Whether this instance is labeled as COCO's "crowd - region". Don't include this field if you don't know what it means. - - If `annotations` is an empty list, it means the image is labeled to have no objects. - Such images will by default be removed from training, - but can be included using `DATALOADER.FILTER_EMPTY_ANNOTATIONS`. - -+ `sem_seg_file_name` (str): - The full path to the semantic segmentation ground truth file. - It should be a grayscale image whose pixel values are integer labels. -+ `pan_seg_file_name` (str): - The full path to panoptic segmentation ground truth file. - It should be an RGB image whose pixel values are integer ids encoded using the - [panopticapi.utils.id2rgb](https://github.com/cocodataset/panopticapi/) function. - The ids are defined by `segments_info`. - If an id does not appear in `segments_info`, the pixel is considered unlabeled - and is usually ignored in training & evaluation. -+ `segments_info` (list[dict]): defines the meaning of each id in panoptic segmentation ground truth. - Each dict has the following keys: - + `id` (int): integer that appears in the ground truth image. - + `category_id` (int): an integer in the range [0, num_categories-1] representing the category label. - + `iscrowd`: 0 (default) or 1. 
Whether this instance is labeled as COCO's "crowd region". - - -```eval_rst - -.. note:: - - The PanopticFPN model does not use the panoptic segmentation - format defined here, but a combination of both instance segmentation and semantic segmentation data - format. See :doc:`builtin_datasets` for instructions on COCO. - -``` - -Fast R-CNN (with pre-computed proposals) models are rarely used today. -To train a Fast R-CNN, the following extra keys are needed: - -+ `proposal_boxes` (array): 2D numpy array with shape (K, 4) representing K precomputed proposal boxes for this image. -+ `proposal_objectness_logits` (array): numpy array with shape (K, ), which corresponds to the objectness - logits of proposals in 'proposal_boxes'. -+ `proposal_bbox_mode` (int): the format of the precomputed proposal bbox. - It must be a member of - [structures.BoxMode](../modules/structures.html#detectron2.structures.BoxMode). - Default is `BoxMode.XYXY_ABS`. - - - -#### Custom Dataset Dicts for New Tasks - -In the `list[dict]` that your dataset function returns, the dictionary can also have __arbitrary custom data__. -This will be useful for a new task that needs extra information not covered -by the standard dataset dicts. In this case, you need to make sure the downstream code can handle your data -correctly. Usually this requires writing a new `mapper` for the dataloader (see [Use Custom Dataloaders](./data_loading.md)). - -When designing a custom format, note that all dicts are stored in memory -(sometimes serialized and with multiple copies). -To save memory, each dict is meant to contain __small__ but sufficient information -about each sample, such as file names and annotations. -Loading full samples typically happens in the data loader. - -For attributes shared among the entire dataset, use `Metadata` (see below). -To avoid extra memory, do not save such information inside each sample. - -### "Metadata" for Datasets - -Each dataset is associated with some metadata, accessible through -`MetadataCatalog.get(dataset_name).some_metadata`. -Metadata is a key-value mapping that contains information that's shared among -the entire dataset, and usually is used to interpret what's in the dataset, e.g., -names of classes, colors of classes, root of files, etc. -This information will be useful for augmentation, evaluation, visualization, logging, etc. -The structure of metadata depends on what is needed from the corresponding downstream code. - -If you register a new dataset through `DatasetCatalog.register`, -you may also want to add its corresponding metadata through -`MetadataCatalog.get(dataset_name).some_key = some_value`, to enable any features that need the metadata. -You can do it like this (using the metadata key "thing_classes" as an example): - -```python -from detectron2.data import MetadataCatalog -MetadataCatalog.get("my_dataset").thing_classes = ["person", "dog"] -``` - -Here is a list of metadata keys that are used by builtin features in detectron2. -If you add your own dataset without these metadata, some features may be -unavailable to you: - -* `thing_classes` (list[str]): Used by all instance detection/segmentation tasks. - A list of names for each instance/thing category. - If you load a COCO format dataset, it will be automatically set by the function `load_coco_json`. - -* `thing_colors` (list[tuple(r, g, b)]): Pre-defined color (in [0, 255]) for each thing category. - Used for visualization. If not given, random colors will be used. 
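Before continuing with the metadata keys, here is a hypothetical single entry tying together the standard dataset-dict fields described above (the file name, sizes, and coordinates are made up):

```python
from detectron2.structures import BoxMode

record = {
    "file_name": "images/0001.jpg",  # hypothetical path
    "height": 480,
    "width": 640,
    "image_id": 1,
    "annotations": [{
        "bbox": [100.0, 120.0, 200.0, 260.0],
        "bbox_mode": BoxMode.XYXY_ABS,
        "category_id": 0,
        # one simple polygon per connected component, [x1, y1, ..., xn, yn]:
        "segmentation": [[100.0, 120.0, 200.0, 120.0, 200.0, 260.0, 100.0, 260.0]],
        "iscrowd": 0,
    }],
}
```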
- -* `stuff_classes` (list[str]): Used by semantic and panoptic segmentation tasks. - A list of names for each stuff category. - -* `stuff_colors` (list[tuple(r, g, b)]): Pre-defined color (in [0, 255]) for each stuff category. - Used for visualization. If not given, random colors are used. - -* `ignore_label` (int): Used by semantic and panoptic segmentation tasks. Pixels in ground-truth - annotations with this category label should be ignored in evaluation. Typically these are "unlabeled" - pixels. - -* `keypoint_names` (list[str]): Used by keypoint detection. A list of names for each keypoint. - -* `keypoint_flip_map` (list[tuple[str]]): Used by keypoint detection. A list of pairs of names, - where each pair are the two keypoints that should be flipped if the image is - flipped horizontally during augmentation. -* `keypoint_connection_rules`: list[tuple(str, str, (r, g, b))]. Each tuple specifies a pair of keypoints - that are connected and the color (in [0, 255]) to use for the line between them when visualized. - -Some additional metadata that are specific to the evaluation of certain datasets (e.g. COCO): - -* `thing_dataset_id_to_contiguous_id` (dict[int->int]): Used by all instance detection/segmentation tasks in the COCO format. - A mapping from instance class ids in the dataset to contiguous ids in range [0, #class). - Will be automatically set by the function `load_coco_json`. - -* `stuff_dataset_id_to_contiguous_id` (dict[int->int]): Used when generating prediction json files for - semantic/panoptic segmentation. - A mapping from semantic segmentation class ids in the dataset - to contiguous ids in [0, num_categories). It is useful for evaluation only. - -* `json_file`: The COCO annotation json file. Used by COCO evaluation for COCO-format datasets. -* `panoptic_root`, `panoptic_json`: Used by COCO-format panoptic evaluation. -* `evaluator_type`: Used by the builtin main training script to select - evaluator. Don't use it in a new training script. - You can just provide the [DatasetEvaluator](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluator) - for your dataset directly in your main script. - -```eval_rst -.. note:: - - In recognition, sometimes we use the term "thing" for instance-level tasks, - and "stuff" for semantic segmentation tasks. - Both are used in panoptic segmentation tasks. - For background on the concept of "thing" and "stuff", see - `On Seeing Stuff: The Perception of Materials by Humans and Machines - `_. -``` - -### Register a COCO Format Dataset - -If your instance-level (detection, segmentation, keypoint) dataset is already a json file in the COCO format, -the dataset and its associated metadata can be registered easily with: -```python -from detectron2.data.datasets import register_coco_instances -register_coco_instances("my_dataset", {}, "json_annotation.json", "path/to/image/dir") -``` - -If your dataset is in COCO format but need to be further processed, or has extra custom per-instance annotations, -the [load_coco_json](../modules/data.html#detectron2.data.datasets.load_coco_json) -function might be useful. - -### Update the Config for New Datasets - -Once you've registered the dataset, you can use the name of the dataset (e.g., "my_dataset" in -example above) in `cfg.DATASETS.{TRAIN,TEST}`. -There are other configs you might want to change to train or evaluate on new datasets: - -* `MODEL.ROI_HEADS.NUM_CLASSES` and `MODEL.RETINANET.NUM_CLASSES` are the number of thing classes - for R-CNN and RetinaNet models, respectively. 
-* `MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS` sets the number of keypoints for Keypoint R-CNN. - You'll also need to set [Keypoint OKS](http://cocodataset.org/#keypoints-eval) - with `TEST.KEYPOINT_OKS_SIGMAS` for evaluation. -* `MODEL.SEM_SEG_HEAD.NUM_CLASSES` sets the number of stuff classes for Semantic FPN & Panoptic FPN. -* `TEST.DETECTIONS_PER_IMAGE` controls the maximum number of objects to be detected. - Set it to a larger number if test images may contain >100 objects. -* If you're training Fast R-CNN (with precomputed proposals), `DATASETS.PROPOSAL_FILES_{TRAIN,TEST}` - need to match the datasets. The format of proposal files are documented - [here](../modules/data.html#detectron2.data.load_proposals_into_dataset). - -New models -(e.g. [TensorMask](../../projects/TensorMask), -[PointRend](../../projects/PointRend)) -often have similar configs of their own that need to be changed as well. - -```eval_rst -.. tip:: - - After changing the number of classes, certain layers in a pre-trained model will become incompatible - and therefore cannot be loaded to the new model. - This is expected, and loading such pre-trained models will produce warnings about such layers. -``` diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md deleted file mode 100755 index 173b9a0e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/deployment.md +++ /dev/null @@ -1,137 +0,0 @@ -# Deployment - -Models written in Python need to go through an export process to become a deployable artifact. -A few basic concepts about this process: - -__"Export method"__ is how a Python model is fully serialized to a deployable format. -We support the following export methods: - -* `tracing`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) to learn about it -* `scripting`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) to learn about it -* `caffe2_tracing`: replace parts of the model by caffe2 operators, then use tracing. - -__"Format"__ is how a serialized model is described in a file, e.g. -TorchScript, Caffe2 protobuf, ONNX format. -__"Runtime"__ is an engine that loads a serialized model and executes it, -e.g., PyTorch, Caffe2, TensorFlow, onnxruntime, TensorRT, etc. -A runtime is often tied to a specific format -(e.g. PyTorch needs TorchScript format, Caffe2 needs protobuf format). 
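Returning briefly to the dataset-config keys listed just before this deployment section begins: a minimal sketch of wiring a newly registered dataset into the config (the dataset names and class count are illustrative assumptions):

```python
from detectron2.config import get_cfg

cfg = get_cfg()
cfg.DATASETS.TRAIN = ("my_dataset",)     # names previously registered via DatasetCatalog
cfg.DATASETS.TEST = ("my_dataset_val",)  # hypothetical validation split
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 2      # e.g. ["person", "dog"]
cfg.TEST.DETECTIONS_PER_IMAGE = 200      # raise if test images may contain >100 objects
```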
-We currently support the following combination and each has some limitations: - -```eval_rst -+----------------------------+-------------+-------------+-----------------------------+ -| Export Method | tracing | scripting | caffe2_tracing | -+============================+=============+=============+=============================+ -| **Formats** | TorchScript | TorchScript | Caffe2, TorchScript, ONNX | -+----------------------------+-------------+-------------+-----------------------------+ -| **Runtime** | PyTorch | PyTorch | Caffe2, PyTorch | -+----------------------------+-------------+-------------+-----------------------------+ -| C++/Python inference | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| Dynamic resolution | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| Batch size requirement | Constant | Dynamic | Batch inference unsupported | -+----------------------------+-------------+-------------+-----------------------------+ -| Extra runtime deps | torchvision | torchvision | Caffe2 ops (usually already | -| | | | | -| | | | included in PyTorch) | -+----------------------------+-------------+-------------+-----------------------------+ -| Faster/Mask/Keypoint R-CNN | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| RetinaNet | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| PointRend R-CNN | ✅ | ❌ | ❌ | -+----------------------------+-------------+-------------+-----------------------------+ -| Cascade R-CNN | ✅ | ❌ | ❌ | -+----------------------------+-------------+-------------+-----------------------------+ - -``` - -`caffe2_tracing` is going to be deprecated. -We don't plan to work on additional support for other formats/runtime, but contributions are welcome. - - -## Deployment with Tracing or Scripting - -Models can be exported to TorchScript format, by either -[tracing or scripting](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html). -The output model file can be loaded without detectron2 dependency in either Python or C++. -The exported model often requires torchvision (or its C++ library) dependency for some custom ops. - -This feature requires PyTorch ≥ 1.8. - -### Coverage -Most official models under the meta architectures `GeneralizedRCNN` and `RetinaNet` -are supported in both tracing and scripting mode. -Cascade R-CNN and PointRend are currently supported in tracing. -Users' custom extensions are supported if they are also scriptable or traceable. - -For models exported with tracing, dynamic input resolution is allowed, but batch size -(number of input images) must be fixed. -Scripting can support dynamic batch size. - -### Usage - -The main export APIs for tracing and scripting are [TracingAdapter](../modules/export.html#detectron2.export.TracingAdapter) -and [scripting_with_instances](../modules/export.html#detectron2.export.scripting_with_instances). -Their usage is currently demonstrated in [test_export_torchscript.py](../../tests/test_export_torchscript.py) -(see `TestScripting` and `TestTracing`) -as well as the [deployment example](../../tools/deploy). -Please check that these examples can run, and then modify for your use cases. -The usage now requires some user effort and necessary knowledge for each model to workaround the limitation of scripting and tracing. 
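As a hedged sketch of the tracing export path referenced above (mirroring the `TracingAdapter` usage the docs point to; exact details may vary across detectron2 versions):

```python
import torch
from detectron2.export import TracingAdapter

def export_traced(model, sample_inputs, out_path="model.ts"):
    model.eval()
    adapter = TracingAdapter(model, sample_inputs)  # flattens dict inputs/outputs into tensors
    traced = torch.jit.trace(adapter, adapter.flattened_inputs)
    traced.save(out_path)  # the saved module loads without a detectron2 dependency
```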
-In the future we plan to wrap these under simpler APIs to lower the barrier to using them. - -## Deployment with Caffe2-tracing -We provide [Caffe2Tracer](../modules/export.html#detectron2.export.Caffe2Tracer) -that performs the export logic. -It replaces parts of the model with Caffe2 operators, -and then exports the model into Caffe2, TorchScript, or ONNX format. - -The converted model is able to run in either Python or C++ without detectron2/torchvision dependency, on CPU or GPUs. -It has a runtime optimized for CPU & mobile inference, but not optimized for GPU inference. - -This feature requires 1.9 > ONNX ≥ 1.6. - -### Coverage - -Most official models under these 3 common meta architectures: `GeneralizedRCNN`, `RetinaNet`, `PanopticFPN` -are supported. Cascade R-CNN is not supported. Batch inference is not supported. - -Users' custom extensions under these architectures (added through registration) are supported -as long as they do not contain control flow or operators not available in Caffe2 (e.g. deformable convolution). -For example, custom backbones and heads are often supported out of the box. - -### Usage - -The APIs are listed at [the API documentation](../modules/export). -We provide [export_model.py](../../tools/deploy/) as an example that uses -these APIs to convert a standard model. For custom models/datasets, you can add them to this script. - -### Use the model in C++/Python - -The model can be loaded in C++ and deployed with -either the Caffe2 or PyTorch runtime. [C++ examples](../../tools/deploy/) for Mask R-CNN -are given as a reference. Note that: - -* Models exported with the `caffe2_tracing` method take a special input format - described in [documentation](../modules/export.html#detectron2.export.Caffe2Tracer). - This was taken care of in the C++ example. - -* The converted models do not contain post-processing operations that - transform raw layer outputs into formatted predictions. - For example, the C++ examples only produce raw outputs (28x28 masks) from the final - layers that are not post-processed, because in actual deployment, an application often needs - its custom lightweight post-processing, so this step is left for users. - -To help use the Caffe2-format model in Python, -we provide a Python wrapper around the converted model, in the -[Caffe2Model.\_\_call\_\_](../modules/export.html#detectron2.export.Caffe2Model.__call__) method. -This method has an interface that's identical to the [PyTorch versions of models](./models.md), -and it internally applies pre/post-processing code to match the formats. -This wrapper can serve as a reference for how to use Caffe2's Python API, -or for how to implement pre/post-processing in actual deployment. - -## Conversion to TensorFlow -[tensorpack Faster R-CNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN/convert_d2) -provides scripts to convert a few standard detectron2 R-CNN models to TensorFlow's pb format. -It works by translating configs and weights, and therefore supports only a few models. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md deleted file mode 100755 index bd924a3b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/evaluation.md +++ /dev/null @@ -1,68 +0,0 @@ - -# Evaluation - -Evaluation is a process that takes a number of input/output pairs and aggregates them.
-You can always [use the model](./models.md) directly and just parse its inputs/outputs manually to perform -evaluation. -Alternatively, evaluation is implemented in detectron2 using the [DatasetEvaluator](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluator) -interface. - -Detectron2 includes a few `DatasetEvaluator` that computes metrics using standard dataset-specific -APIs (e.g., COCO, LVIS). -You can also implement your own `DatasetEvaluator` that performs some other jobs -using the inputs/outputs pairs. -For example, to count how many instances are detected on the validation set: - -``` -class Counter(DatasetEvaluator): - def reset(self): - self.count = 0 - def process(self, inputs, outputs): - for output in outputs: - self.count += len(output["instances"]) - def evaluate(self): - # save self.count somewhere, or print it, or return it. - return {"count": self.count} -``` - -## Use evaluators - -To evaluate using the methods of evaluators manually: -``` -def get_all_inputs_outputs(): - for data in data_loader: - yield data, model(data) - -evaluator.reset() -for inputs, outputs in get_all_inputs_outputs(): - evaluator.process(inputs, outputs) -eval_results = evaluator.evaluate() -``` - -Evaluators can also be used with [inference_on_dataset](../modules/evaluation.html#detectron2.evaluation.inference_on_dataset). -For example, - -```python -eval_results = inference_on_dataset( - model, - data_loader, - DatasetEvaluators([COCOEvaluator(...), Counter()])) -``` -This will execute `model` on all inputs from `data_loader`, and call evaluator to process them. - -Compared to running the evaluation manually using the model, the benefit of this function is that -evaluators can be merged together using [DatasetEvaluators](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluators), -and all the evaluation can finish in one forward pass over the dataset. -This function also provides accurate speed benchmarks for the given model and dataset. - -## Evaluators for custom dataset - -Many evaluators in detectron2 are made for specific datasets, -in order to obtain scores using each dataset's official API. -In addition to that, two evaluators are able to evaluate any generic dataset -that follows detectron2's [standard dataset format](./datasets.md), so they -can be used to evaluate custom datasets: - -* [COCOEvaluator](../modules/evaluation.html#detectron2.evaluation.COCOEvaluator) is able to evaluate AP (Average Precision) for box detection, - instance segmentation, keypoint detection on any custom dataset. -* [SemSegEvaluator](../modules/evaluation.html#detectron2.evaluation.SemSegEvaluator) is able to evaluate semantic segmentation metrics on any custom dataset. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/extend.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/extend.md deleted file mode 100755 index a6af550f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/extend.md +++ /dev/null @@ -1,141 +0,0 @@ -# Extend Detectron2's Defaults - -__Research is about doing things in new ways__. -This brings a tension in how to create abstractions in code, -which is a challenge for any research engineering project of a significant size: - -1. On one hand, it needs to have very thin abstractions to allow for the possibility of doing - everything in new ways. It should be reasonably easy to break existing - abstractions and replace them with new ones. - -2. 
On the other hand, such a project also needs reasonably high-level
-   abstractions, so that users can easily do things in standard ways,
-   without worrying too much about the details that only certain researchers care about.
-
-In detectron2, the following types of interfaces address this tension together:
-
-1. Functions and classes that take a config (`cfg`) argument
-   created from a yaml file
-   (sometimes with a few extra arguments).
-
-   Such functions and classes implement
-   the "standard default" behavior: they read what they need from a given
-   config and do the "standard" thing.
-   Users only need to load an expert-made config and pass it around, without having to worry about
-   which arguments are used and what they all mean.
-
-   See [Yacs Configs](configs.md) for a detailed tutorial.
-
-2. Functions and classes that have well-defined explicit arguments.
-
-   Each of these is a small building block of the entire system.
-   They require users' expertise to understand what each argument should be,
-   and require more effort to stitch together into a larger system.
-   But they can be stitched together in more flexible ways.
-
-   When you need to implement something not supported by the "standard defaults"
-   included in detectron2, these well-defined components can be reused.
-
-   The [LazyConfig system](lazyconfigs.md) relies on such functions and classes.
-
-3. A few functions and classes are implemented with the
-   [@configurable](../modules/config.html#detectron2.config.configurable)
-   decorator - they can be called with either a config, or with explicit arguments, or a mixture of both.
-   Their explicit argument interfaces are currently experimental.
-
-   As an example, a Mask R-CNN model can be built in the following ways:
-
-   1. Config-only:
-      ```python
-      # load proper yaml config file, then
-      model = build_model(cfg)
-      ```
-
-   2. Mixture of config and additional argument overrides:
-      ```python
-      model = GeneralizedRCNN(
-        cfg,
-        roi_heads=StandardROIHeads(cfg, batch_size_per_image=666),
-        pixel_std=[57.0, 57.0, 57.0])
-      ```
-
-   3. Full explicit arguments:
-
-      ```python
-      model = GeneralizedRCNN(
-          backbone=FPN(
-              ResNet(
-                  BasicStem(3, 64, norm="FrozenBN"),
-                  ResNet.make_default_stages(50, stride_in_1x1=True, norm="FrozenBN"),
-                  out_features=["res2", "res3", "res4", "res5"],
-              ).freeze(2),
-              ["res2", "res3", "res4", "res5"],
-              256,
-              top_block=LastLevelMaxPool(),
-          ),
-          proposal_generator=RPN(
-              in_features=["p2", "p3", "p4", "p5", "p6"],
-              head=StandardRPNHead(in_channels=256, num_anchors=3),
-              anchor_generator=DefaultAnchorGenerator(
-                  sizes=[[32], [64], [128], [256], [512]],
-                  aspect_ratios=[0.5, 1.0, 2.0],
-                  strides=[4, 8, 16, 32, 64],
-                  offset=0.0,
-              ),
-              anchor_matcher=Matcher([0.3, 0.7], [0, -1, 1], allow_low_quality_matches=True),
-              box2box_transform=Box2BoxTransform([1.0, 1.0, 1.0, 1.0]),
-              batch_size_per_image=256,
-              positive_fraction=0.5,
-              pre_nms_topk=(2000, 1000),
-              post_nms_topk=(1000, 1000),
-              nms_thresh=0.7,
-          ),
-          roi_heads=StandardROIHeads(
-              num_classes=80,
-              batch_size_per_image=512,
-              positive_fraction=0.25,
-              proposal_matcher=Matcher([0.5], [0, 1], allow_low_quality_matches=False),
-              box_in_features=["p2", "p3", "p4", "p5"],
-              box_pooler=ROIPooler(7, (1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), 0, "ROIAlignV2"),
-              box_head=FastRCNNConvFCHead(
-                  ShapeSpec(channels=256, height=7, width=7), conv_dims=[], fc_dims=[1024, 1024]
-              ),
-              box_predictor=FastRCNNOutputLayers(
-                  ShapeSpec(channels=1024),
-                  test_score_thresh=0.05,
-                  box2box_transform=Box2BoxTransform((10, 10, 5, 5)),
-                  num_classes=80,
-              ),
-              mask_in_features=["p2", "p3", "p4", "p5"],
-              mask_pooler=ROIPooler(14, (1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), 0, "ROIAlignV2"),
-              mask_head=MaskRCNNConvUpsampleHead(
-                  ShapeSpec(channels=256, width=14, height=14),
-                  num_classes=80,
-                  conv_dims=[256, 256, 256, 256, 256],
-              ),
-          ),
-          pixel_mean=[103.530, 116.280, 123.675],
-          pixel_std=[1.0, 1.0, 1.0],
-          input_format="BGR",
-      )
-      ```
-
      - - -If you only need the standard behavior, the [Beginner's Tutorial](./getting_started.md) -should suffice. If you need to extend detectron2 to your own needs, -see the following tutorials for more details: - -* Detectron2 includes a few standard datasets. To use custom ones, see - [Use Custom Datasets](./datasets.md). -* Detectron2 contains the standard logic that creates a data loader for training/testing from a - dataset, but you can write your own as well. See [Use Custom Data Loaders](./data_loading.md). -* Detectron2 implements many standard detection models, and provide ways for you - to overwrite their behaviors. See [Use Models](./models.md) and [Write Models](./write-models.md). -* Detectron2 provides a default training loop that is good for common training tasks. - You can customize it with hooks, or write your own loop instead. See [training](./training.md). diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/index.rst b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/index.rst deleted file mode 100755 index 850b95cf..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/index.rst +++ /dev/null @@ -1,20 +0,0 @@ -Tutorials -====================================== - -.. toctree:: - :maxdepth: 2 - - install - getting_started - builtin_datasets - extend - datasets - data_loading - augmentation - models - write-models - training - evaluation - configs - lazyconfigs - deployment diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md deleted file mode 100755 index ca9de305..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/lazyconfigs.md +++ /dev/null @@ -1,170 +0,0 @@ -# Lazy Configs - -The traditional yacs-based config system provides basic, standard functionalities. -However, it does not offer enough flexibility for many new projects. -We develop an alternative, non-intrusive config system that can be used with -detectron2 or potentially any other complex projects. - -## Python Syntax - -Our config objects are still dictionaries. Instead of using Yaml to define dictionaries, -we create dictionaries in Python directly. This gives users the following power that -doesn't exist in Yaml: - -* Easily manipulate the dictionary (addition & deletion) using Python. -* Write simple arithmetics or call simple functions. -* Use more data types / objects. -* Import / compose other config files, using the familiar Python import syntax. - -A Python config file can be loaded like this: -```python -# config.py: -a = dict(x=1, y=2, z=dict(xx=1)) -b = dict(x=3, y=4) - -# my_code.py: -from detectron2.config import LazyConfig -cfg = LazyConfig.load("path/to/config.py") # an omegaconf dictionary -assert cfg.a.z.xx == 1 -``` - -After [LazyConfig.load](../modules/config.html#detectron2.config.LazyConfig.load), `cfg` will be a dictionary that contains all dictionaries -defined in the global scope of the config file. Note that: -* All dictionaries are turned to an [omegaconf](https://omegaconf.readthedocs.io/) - config object during loading. This enables access to omegaconf features, - such as its [access syntax](https://omegaconf.readthedocs.io/en/2.1_branch/usage.html#access-and-manipulation) - and [interpolation](https://omegaconf.readthedocs.io/en/2.1_branch/usage.html#variable-interpolation). 
-* Absolute imports in `config.py` work the same as in regular Python.
-* Relative imports can only import dictionaries from config files.
-  They are simply syntactic sugar for [LazyConfig.load_rel](../modules/config.html#detectron2.config.LazyConfig.load_rel).
-  They can load Python files at relative paths without requiring `__init__.py`.
-
-[LazyConfig.save](../modules/config.html#detectron2.config.LazyConfig.save) can save a config object to yaml.
-Note that this is not always successful if non-serializable objects appear in the config file (e.g. lambdas).
-It is up to users whether to sacrifice the ability to save in exchange for flexibility.
-
-## Recursive Instantiation
-
-The LazyConfig system heavily uses recursive instantiation, which is a pattern that
-uses a dictionary to describe a
-call to a function/class. The dictionary consists of:
-
-1. A "\_target\_" key which contains the path to the callable, such as "module.submodule.class_name".
-2. Other keys that represent arguments to pass to the callable. Arguments themselves can be defined
-   using recursive instantiation.
-
-We provide a helper function [LazyCall](../modules/config.html#detectron2.config.LazyCall) that helps create such dictionaries.
-The following code, using `LazyCall`,
-```python
-from detectron2.config import LazyCall as L
-from my_app import Trainer, Optimizer
-cfg = L(Trainer)(
-  optimizer=L(Optimizer)(
-    lr=0.01,
-    algo="SGD"
-  )
-)
-```
-creates a dictionary like this:
-```
-cfg = {
-  "_target_": "my_app.Trainer",
-  "optimizer": {
-    "_target_": "my_app.Optimizer",
-    "lr": 0.01, "algo": "SGD"
-  }
-}
-```
-
-By representing objects using such dictionaries, a general
-[instantiate](../modules/config.html#detectron2.config.instantiate)
-function can turn them into actual objects, i.e.:
-```python
-from detectron2.config import instantiate
-trainer = instantiate(cfg)
-# equivalent to:
-# from my_app import Trainer, Optimizer
-# trainer = Trainer(optimizer=Optimizer(lr=0.01, algo="SGD"))
-```
-
-This pattern is powerful enough to describe very complex objects, e.g.:
-
-A full Mask R-CNN, described with recursive instantiation:
-
-```eval_rst
-.. literalinclude:: ../../configs/common/models/mask_rcnn_fpn.py
-   :language: python
-   :linenos:
-```
-
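-Because such a config is just a nested dictionary until `instantiate` is called, any nested
-argument can be overridden first. A minimal sketch (the loaded path and the exact attribute
-names are illustrative):
-
-```python
-from detectron2.config import LazyConfig, instantiate
-
-cfg = LazyConfig.load("configs/common/models/mask_rcnn_fpn.py")
-cfg.model.roi_heads.num_classes = 3   # override one nested constructor argument
-model = instantiate(cfg.model)        # objects are only created at this point
-```
-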
-
-There are also objects or logic that cannot be described simply by a dictionary,
-such as reused objects or method calls. They may require some refactoring
-to work with recursive instantiation.
-
-## Using Model Zoo LazyConfigs
-
-We provide some configs in the model zoo using the LazyConfig system, for example:
-
-* [common baselines](../../configs/common/).
-* [new Mask R-CNN baselines](../../configs/new_baselines/)
-
-After installing detectron2, they can be loaded by the model zoo API
-[model_zoo.get_config](../modules/model_zoo.html#detectron2.model_zoo.get_config).
-
-Using these as references, you're free to define a custom config structure / fields for your own
-project, as long as your training script can understand them.
-Despite this, our model zoo configs still follow some simple conventions for consistency, e.g.
-`cfg.model` defines a model object, `cfg.dataloader.{train,test}` define dataloader objects,
-and `cfg.train` contains training options in key-value form.
-In addition to `print()`, a better way to view the structure of a config is like this:
-```
-from detectron2.model_zoo import get_config
-from detectron2.config import LazyConfig
-print(LazyConfig.to_py(get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py")))
-```
-From the output it's easier to find relevant options to change, e.g.
-`dataloader.train.total_batch_size` for the batch size, or `optimizer.lr` for the base learning rate.
-
-We provide a reference training script,
-[tools/lazyconfig_train_net.py](../../tools/lazyconfig_train_net.py),
-that can train/eval our model zoo configs.
-It also shows how to support command line value overrides.
-
-To demonstrate the power and flexibility of the new system, we show that
-[a simple config file](../../configs/Misc/torchvision_imagenet_R_50.py)
-can let detectron2 train an ImageNet classification model from torchvision, even though
-detectron2 contains no features for ImageNet classification.
-This can serve as a reference for using detectron2 in other deep learning tasks.
-
-## Summary
-
-By using recursive instantiation to create objects,
-we avoid passing a giant config to many places, because `cfg` is only passed to `instantiate`.
-This has the following benefits:
-
-* It's __non-intrusive__: objects to be constructed are config-agnostic, regular Python
-  functions/classes.
-  They can even live in other libraries. For example,
-  `{"_target_": "torch.nn.Conv2d", "in_channels": 10, "out_channels": 10, "kernel_size": 1}`
-  defines a conv layer.
-* __Clarity__ of what functions/classes will be called, and what arguments they use.
-* `cfg` doesn't need pre-defined keys and structures. It's valid as long as it translates to valid
-  code. This gives a lot more __flexibility__.
-* You can still pass huge dictionaries as arguments, just like the old way.
-
-Recursive instantiation and Python syntax are orthogonal: you can use one without the other.
-But by putting them together, the config file looks a lot like the code that will be executed:
-
-![img](./lazyconfig.jpg)
-
-However, the config file just defines dictionaries, which can be easily manipulated further
-by composition or overrides.
-The corresponding code will only be executed
-later when `instantiate` is called. In a sense,
-in config files we're writing "editable code" that will be "lazily executed" later when needed.
-That's why we call this system "LazyConfig".
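-
-As a final sketch of this "editable code" idea, a config can be loaded, overridden with
-dotted key-value strings, and dumped again (the config path and keys are illustrative,
-following the model zoo conventions above):
-
-```python
-from detectron2.config import LazyConfig
-
-cfg = LazyConfig.load("path/to/config.py")
-cfg = LazyConfig.apply_overrides(cfg, ["train.max_iter=100", "optimizer.lr=0.02"])
-LazyConfig.save(cfg, "dumped_config.yaml")  # may fail for non-serializable objects
-```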
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/models.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/models.md deleted file mode 100755 index 3cf918e7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/models.md +++ /dev/null @@ -1,180 +0,0 @@ -# Use Models - -## Build Models from Yacs Config -From a yacs config object, -models (and their sub-models) can be built by -functions such as `build_model`, `build_backbone`, `build_roi_heads`: -```python -from detectron2.modeling import build_model -model = build_model(cfg) # returns a torch.nn.Module -``` - -`build_model` only builds the model structure and fills it with random parameters. -See below for how to load an existing checkpoint to the model and how to use the `model` object. - -### Load/Save a Checkpoint -```python -from detectron2.checkpoint import DetectionCheckpointer -DetectionCheckpointer(model).load(file_path_or_url) # load a file, usually from cfg.MODEL.WEIGHTS - -checkpointer = DetectionCheckpointer(model, save_dir="output") -checkpointer.save("model_999") # save to output/model_999.pth -``` - -Detectron2's checkpointer recognizes models in pytorch's `.pth` format, as well as the `.pkl` files -in our model zoo. -See [API doc](../modules/checkpoint.html#detectron2.checkpoint.DetectionCheckpointer) -for more details about its usage. - -The model files can be arbitrarily manipulated using `torch.{load,save}` for `.pth` files or -`pickle.{dump,load}` for `.pkl` files. - -### Use a Model - -A model can be called by `outputs = model(inputs)`, where `inputs` is a `list[dict]`. -Each dict corresponds to one image and the required keys -depend on the type of model, and whether the model is in training or evaluation mode. -For example, in order to do inference, -all existing models expect the "image" key, and optionally "height" and "width". -The detailed format of inputs and outputs of existing models are explained below. - -__Training__: When in training mode, all models are required to be used under an `EventStorage`. -The training statistics will be put into the storage: -```python -from detectron2.utils.events import EventStorage -with EventStorage() as storage: - losses = model(inputs) -``` - -__Inference__: If you only want to do simple inference using an existing model, -[DefaultPredictor](../modules/engine.html#detectron2.engine.defaults.DefaultPredictor) -is a wrapper around model that provides such basic functionality. -It includes default behavior including model loading, preprocessing, -and operates on single image rather than batches. See its documentation for usage. - -You can also run inference directly like this: -``` -model.eval() -with torch.no_grad(): - outputs = model(inputs) -``` - -### Model Input Format - -Users can implement custom models that support any arbitrary input format. -Here we describe the standard input format that all builtin models support in detectron2. -They all take a `list[dict]` as the inputs. Each dict -corresponds to information about one image. - -The dict may contain the following keys: - -* "image": `Tensor` in (C, H, W) format. The meaning of channels are defined by `cfg.INPUT.FORMAT`. - Image normalization, if any, will be performed inside the model using - `cfg.MODEL.PIXEL_{MEAN,STD}`. -* "height", "width": the **desired** output height and width **in inference**, which is not necessarily the same - as the height or width of the `image` field. 
- For example, the `image` field contains the resized image, if resize is used as a preprocessing step. - But you may want the outputs to be in **original** resolution. - If provided, the model will produce output in this resolution, - rather than in the resolution of the `image` as input into the model. This is more efficient and accurate. -* "instances": an [Instances](../modules/structures.html#detectron2.structures.Instances) - object for training, with the following fields: - + "gt_boxes": a [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing N boxes, one for each instance. - + "gt_classes": `Tensor` of long type, a vector of N labels, in range [0, num_categories). - + "gt_masks": a [PolygonMasks](../modules/structures.html#detectron2.structures.PolygonMasks) - or [BitMasks](../modules/structures.html#detectron2.structures.BitMasks) object storing N masks, one for each instance. - + "gt_keypoints": a [Keypoints](../modules/structures.html#detectron2.structures.Keypoints) - object storing N keypoint sets, one for each instance. -* "sem_seg": `Tensor[int]` in (H, W) format. The semantic segmentation ground truth for training. - Values represent category labels starting from 0. -* "proposals": an [Instances](../modules/structures.html#detectron2.structures.Instances) - object used only in Fast R-CNN style models, with the following fields: - + "proposal_boxes": a [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing P proposal boxes. - + "objectness_logits": `Tensor`, a vector of P scores, one for each proposal. - -For inference of builtin models, only "image" key is required, and "width/height" are optional. - -We currently don't define standard input format for panoptic segmentation training, -because models now use custom formats produced by custom data loaders. - -#### How it connects to data loader: - -The output of the default [DatasetMapper]( ../modules/data.html#detectron2.data.DatasetMapper) is a dict -that follows the above format. -After the data loader performs batching, it becomes `list[dict]` which the builtin models support. - - -### Model Output Format - -When in training mode, the builtin models output a `dict[str->ScalarTensor]` with all the losses. - -When in inference mode, the builtin models output a `list[dict]`, one dict for each image. -Based on the tasks the model is doing, each dict may contain the following fields: - -* "instances": [Instances](../modules/structures.html#detectron2.structures.Instances) - object with the following fields: - * "pred_boxes": [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing N boxes, one for each detected instance. - * "scores": `Tensor`, a vector of N confidence scores. - * "pred_classes": `Tensor`, a vector of N labels in range [0, num_categories). - + "pred_masks": a `Tensor` of shape (N, H, W), masks for each detected instance. - + "pred_keypoints": a `Tensor` of shape (N, num_keypoint, 3). - Each row in the last dimension is (x, y, score). Confidence scores are larger than 0. -* "sem_seg": `Tensor` of (num_categories, H, W), the semantic segmentation prediction. -* "proposals": [Instances](../modules/structures.html#detectron2.structures.Instances) - object with the following fields: - * "proposal_boxes": [Boxes](../modules/structures.html#detectron2.structures.Boxes) - object storing N boxes. - * "objectness_logits": a torch vector of N confidence scores. -* "panoptic_seg": A tuple of `(pred: Tensor, segments_info: Optional[list[dict]])`. 
- The `pred` tensor has shape (H, W), containing the segment id of each pixel. - - * If `segments_info` exists, each dict describes one segment id in `pred` and has the following fields: - - * "id": the segment id - * "isthing": whether the segment is a thing or stuff - * "category_id": the category id of this segment. - - If a pixel's id does not exist in `segments_info`, it is considered to be void label - defined in [Panoptic Segmentation](https://arxiv.org/abs/1801.00868). - - * If `segments_info` is None, all pixel values in `pred` must be ≥ -1. - Pixels with value -1 are assigned void labels. - Otherwise, the category id of each pixel is obtained by - `category_id = pixel // metadata.label_divisor`. - - -### Partially execute a model: - -Sometimes you may want to obtain an intermediate tensor inside a model, -such as the input of certain layer, the output before post-processing. -Since there are typically hundreds of intermediate tensors, there isn't an API that provides you -the intermediate result you need. -You have the following options: - -1. Write a (sub)model. Following the [tutorial](./write-models.md), you can - rewrite a model component (e.g. a head of a model), such that it - does the same thing as the existing component, but returns the output - you need. -2. Partially execute a model. You can create the model as usual, - but use custom code to execute it instead of its `forward()`. For example, - the following code obtains mask features before mask head. - - ```python - images = ImageList.from_tensors(...) # preprocessed input tensor - model = build_model(cfg) - model.eval() - features = model.backbone(images.tensor) - proposals, _ = model.proposal_generator(images, features) - instances, _ = model.roi_heads(images, features, proposals) - mask_features = [features[f] for f in model.roi_heads.in_features] - mask_features = model.roi_heads.mask_pooler(mask_features, [x.pred_boxes for x in instances]) - ``` - -3. Use [forward hooks](https://pytorch.org/tutorials/beginner/former_torchies/nnft_tutorial.html#forward-and-backward-function-hooks). - Forward hooks can help you obtain inputs or outputs of a certain module. - If they are not exactly what you want, they can at least be used together with partial execution - to obtain other tensors. - -All options require you to read documentation and sometimes code -of the existing models to understand the internal logic, -in order to write code to obtain the internal tensors. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/training.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/training.md deleted file mode 100755 index 7e2987e4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/training.md +++ /dev/null @@ -1,67 +0,0 @@ -# Training - -From the previous tutorials, you may now have a custom model and a data loader. -To run training, users typically have a preference in one of the following two styles: - -### Custom Training Loop - -With a model and a data loader ready, everything else needed to write a training loop can -be found in PyTorch, and you are free to write the training loop yourself. -This style allows researchers to manage the entire training logic more clearly and have full control. -One such example is provided in [tools/plain_train_net.py](../../tools/plain_train_net.py). - -Any customization on the training logic is then easily controlled by the user. 
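-
-For reference, the core of such a loop can be as small as the sketch below. It assumes
-`cfg`, `model`, `data_loader`, and `max_iter` already exist, and it mirrors, in simplified
-form, the structure of [tools/plain_train_net.py](../../tools/plain_train_net.py):
-
-```python
-from detectron2.solver import build_lr_scheduler, build_optimizer
-from detectron2.utils.events import EventStorage
-
-model.train()
-optimizer = build_optimizer(cfg, model)
-scheduler = build_lr_scheduler(cfg, optimizer)
-with EventStorage(start_iter=0) as storage:  # models require an EventStorage in training
-    for iteration, data in zip(range(max_iter), data_loader):
-        loss_dict = model(data)              # dict of scalar loss tensors in training mode
-        losses = sum(loss_dict.values())
-        optimizer.zero_grad()
-        losses.backward()
-        optimizer.step()
-        scheduler.step()
-        storage.step()
-```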
- -### Trainer Abstraction - -We also provide a standardized "trainer" abstraction with a -hook system that helps simplify the standard training behavior. -It includes the following two instantiations: - -* [SimpleTrainer](../modules/engine.html#detectron2.engine.SimpleTrainer) - provides a minimal training loop for single-cost single-optimizer single-data-source training, with nothing else. - Other tasks (checkpointing, logging, etc) can be implemented using - [the hook system](../modules/engine.html#detectron2.engine.HookBase). -* [DefaultTrainer](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer) is a `SimpleTrainer` initialized from a - yacs config, used by - [tools/train_net.py](../../tools/train_net.py) and many scripts. - It includes more standard default behaviors that one might want to opt in, - including default configurations for optimizer, learning rate schedule, - logging, evaluation, checkpointing etc. - -To customize a `DefaultTrainer`: - -1. For simple customizations (e.g. change optimizer, evaluator, LR scheduler, data loader, etc.), overwrite [its methods](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer) in a subclass, just like [tools/train_net.py](../../tools/train_net.py). -2. For extra tasks during training, check the - [hook system](../modules/engine.html#detectron2.engine.HookBase) to see if it's supported. - - As an example, to print hello during training: - ```python - class HelloHook(HookBase): - def after_step(self): - if self.trainer.iter % 100 == 0: - print(f"Hello at iteration {self.trainer.iter}!") - ``` -3. Using a trainer+hook system means there will always be some non-standard behaviors that cannot be supported, especially in research. - For this reason, we intentionally keep the trainer & hook system minimal, rather than powerful. - If anything cannot be achieved by such a system, it's easier to start from [tools/plain_train_net.py](../../tools/plain_train_net.py) to implement custom training logic manually. - -### Logging of Metrics - -During training, detectron2 models and trainer put metrics to a centralized [EventStorage](../modules/utils.html#detectron2.utils.events.EventStorage). -You can use the following code to access it and log metrics to it: -``` -from detectron2.utils.events import get_event_storage - -# inside the model: -if self.training: - value = # compute the value from inputs - storage = get_event_storage() - storage.put_scalar("some_accuracy", value) -``` - -Refer to its documentation for more details. - -Metrics are then written to various destinations with [EventWriter](../modules/utils.html#module-detectron2.utils.events). -DefaultTrainer enables a few `EventWriter` with default configurations. -See above for how to customize them. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md deleted file mode 100755 index 967d1265..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/docs/tutorials/write-models.md +++ /dev/null @@ -1,90 +0,0 @@ -# Write Models - -If you are trying to do something completely new, you may wish to implement -a model entirely from scratch. However, in many situations you may -be interested in modifying or extending some components of an existing model. -Therefore, we also provide mechanisms that let users override the -behavior of certain internal components of standard models. 
- - -## Register New Components - -For common concepts that users often want to customize, such as "backbone feature extractor", "box head", -we provide a registration mechanism for users to inject custom implementation that -will be immediately available to use in config files. - -For example, to add a new backbone, import this code in your code: -```python -from detectron2.modeling import BACKBONE_REGISTRY, Backbone, ShapeSpec - -@BACKBONE_REGISTRY.register() -class ToyBackbone(Backbone): - def __init__(self, cfg, input_shape): - super().__init__() - # create your own backbone - self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=16, padding=3) - - def forward(self, image): - return {"conv1": self.conv1(image)} - - def output_shape(self): - return {"conv1": ShapeSpec(channels=64, stride=16)} -``` - -In this code, we implement a new backbone following the interface of the -[Backbone](../modules/modeling.html#detectron2.modeling.Backbone) class, -and register it into the [BACKBONE_REGISTRY](../modules/modeling.html#detectron2.modeling.BACKBONE_REGISTRY) -which requires subclasses of `Backbone`. -After importing this code, detectron2 can link the name of the class to its implementation. Therefore you can write the following code: - -```python -cfg = ... # read a config -cfg.MODEL.BACKBONE.NAME = 'ToyBackbone' # or set it in the config file -model = build_model(cfg) # it will find `ToyBackbone` defined above -``` - -As another example, to add new abilities to the ROI heads in the Generalized R-CNN meta-architecture, -you can implement a new -[ROIHeads](../modules/modeling.html#detectron2.modeling.ROIHeads) subclass and put it in the `ROI_HEADS_REGISTRY`. -[DensePose](../../projects/DensePose) -and [MeshRCNN](https://github.com/facebookresearch/meshrcnn) -are two examples that implement new ROIHeads to perform new tasks. -And [projects/](../../projects/) -contains more examples that implement different architectures. - -A complete list of registries can be found in [API documentation](../modules/modeling.html#model-registries). -You can register components in these registries to customize different parts of a model, or the -entire model. - -## Construct Models with Explicit Arguments - -Registry is a bridge to connect names in config files to the actual code. -They are meant to cover a few main components that users frequently need to replace. -However, the capability of a text-based config file is sometimes limited and -some deeper customization may be available only through writing code. - -Most model components in detectron2 have a clear `__init__` interface that documents -what input arguments it needs. Calling them with custom arguments will give you a custom variant -of the model. - -As an example, to use __custom loss function__ in the box head of a Faster R-CNN, we can do the following: - -1. Losses are currently computed in [FastRCNNOutputLayers](../modules/modeling.html#detectron2.modeling.FastRCNNOutputLayers). - We need to implement a variant or a subclass of it, with custom loss functions, named `MyRCNNOutput`. -2. Call `StandardROIHeads` with `box_predictor=MyRCNNOutput()` argument instead of the builtin `FastRCNNOutputLayers`. - If all other arguments should stay unchanged, this can be easily achieved by using the [configurable `__init__`](../modules/config.html#detectron2.config.configurable) mechanism: - - ```python - roi_heads = StandardROIHeads( - cfg, backbone.output_shape(), - box_predictor=MyRCNNOutput(...) - ) - ``` -3. 
(optional) If we want to enable this new model from a config file, registration is needed: - ```python - @ROI_HEADS_REGISTRY.register() - class MyStandardROIHeads(StandardROIHeads): - def __init__(self, cfg, input_shape): - super().__init__(cfg, input_shape, - box_predictor=MyRCNNOutput(...)) - ``` diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore deleted file mode 100755 index 51c17688..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/.gitignore +++ /dev/null @@ -1,10 +0,0 @@ -# compilation and distribution -__pycache__ -_ext -*.pyc -*.pyd -*.so -centernet.egg-info/ -build/ -dist/ -wheels/ diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py deleted file mode 100755 index e17db317..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -from .modeling.meta_arch.centernet_detector import CenterNetDetector -from .modeling.dense_heads.centernet import CenterNet -from .modeling.roi_heads.custom_roi_heads import CustomROIHeads, CustomCascadeROIHeads - -from .modeling.backbone.fpn_p5 import build_p67_resnet_fpn_backbone -from .modeling.backbone.dla import build_dla_backbone -from .modeling.backbone.dlafpn import build_dla_fpn3_backbone -from .modeling.backbone.bifpn import build_resnet_bifpn_backbone -from .modeling.backbone.bifpn_fcos import build_fcos_resnet_bifpn_backbone -from .modeling.backbone.res2net import build_p67_res2net_fpn_backbone - -from .data.datasets.objects365 import categories_v1 -from .data.datasets.coco import _PREDEFINED_SPLITS_COCO -from .data.datasets import nuimages diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py deleted file mode 100755 index 7d91f21e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py +++ /dev/null @@ -1,59 +0,0 @@ -import logging -import numpy as np -import pycocotools.mask as mask_util -import torch -from fvcore.common.file_io import PathManager -from PIL import Image - -from detectron2.structures import ( - BitMasks, - Boxes, - BoxMode, - Instances, - Keypoints, - PolygonMasks, - RotatedBoxes, - polygons_to_bitmask, -) - -from detectron2.data import transforms as T -from .transforms.custom_augmentation_impl import EfficientDetResizeCrop - -def build_custom_augmentation(cfg, is_train): - """ - Create a list of default :class:`Augmentation` from config. - Now it includes resizing and flipping. 
- - Returns: - list[Augmentation] - """ - if cfg.INPUT.CUSTOM_AUG == 'ResizeShortestEdge': - if is_train: - min_size = cfg.INPUT.MIN_SIZE_TRAIN - max_size = cfg.INPUT.MAX_SIZE_TRAIN - sample_style = cfg.INPUT.MIN_SIZE_TRAIN_SAMPLING - else: - min_size = cfg.INPUT.MIN_SIZE_TEST - max_size = cfg.INPUT.MAX_SIZE_TEST - sample_style = "choice" - augmentation = [T.ResizeShortestEdge(min_size, max_size, sample_style)] - elif cfg.INPUT.CUSTOM_AUG == 'EfficientDetResizeCrop': - if is_train: - scale = cfg.INPUT.SCALE_RANGE - size = cfg.INPUT.TRAIN_SIZE - else: - scale = (1, 1) - size = cfg.INPUT.TEST_SIZE - augmentation = [EfficientDetResizeCrop(size, scale)] - else: - assert 0, cfg.INPUT.CUSTOM_AUG - - if is_train: - augmentation.append(T.RandomFlip()) - return augmentation - - -build_custom_transform_gen = build_custom_augmentation -""" -Alias for backward-compatibility. -""" \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py deleted file mode 100755 index 4e9844c9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py +++ /dev/null @@ -1,229 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import copy -import logging -import numpy as np -import operator -import torch -import torch.utils.data -import json -from detectron2.utils.comm import get_world_size - -from detectron2.data import samplers -from torch.utils.data.sampler import BatchSampler, Sampler -from detectron2.data.common import DatasetFromList, MapDataset -from detectron2.data.dataset_mapper import DatasetMapper -from detectron2.data.build import get_detection_dataset_dicts, build_batch_data_loader -from detectron2.data.samplers import TrainingSampler, RepeatFactorTrainingSampler -from detectron2.data.build import worker_init_reset_seed, print_instances_class_histogram -from detectron2.data.build import filter_images_with_only_crowd_annotations -from detectron2.data.build import filter_images_with_few_keypoints -from detectron2.data.build import check_metadata_consistency -from detectron2.data.catalog import MetadataCatalog, DatasetCatalog -from detectron2.utils import comm -import itertools -import math -from collections import defaultdict -from typing import Optional - -# from .custom_build_augmentation import build_custom_augmentation - -def build_custom_train_loader(cfg, mapper=None): - """ - Modified from detectron2.data.build.build_custom_train_loader, but supports - different samplers - """ - source_aware = cfg.DATALOADER.SOURCE_AWARE - if source_aware: - dataset_dicts = get_detection_dataset_dicts_with_source( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - sizes = [0 for _ in range(len(cfg.DATASETS.TRAIN))] - for d in dataset_dicts: - sizes[d['dataset_source']] += 1 - print('dataset sizes', sizes) - else: - dataset_dicts = get_detection_dataset_dicts( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - 
proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - dataset = DatasetFromList(dataset_dicts, copy=False) - - if mapper is None: - assert 0 - # mapper = DatasetMapper(cfg, True) - dataset = MapDataset(dataset, mapper) - - sampler_name = cfg.DATALOADER.SAMPLER_TRAIN - logger = logging.getLogger(__name__) - logger.info("Using training sampler {}".format(sampler_name)) - # TODO avoid if-else? - if sampler_name == "TrainingSampler": - sampler = TrainingSampler(len(dataset)) - elif sampler_name == "MultiDatasetSampler": - assert source_aware - sampler = MultiDatasetSampler(cfg, sizes, dataset_dicts) - elif sampler_name == "RepeatFactorTrainingSampler": - repeat_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency( - dataset_dicts, cfg.DATALOADER.REPEAT_THRESHOLD - ) - sampler = RepeatFactorTrainingSampler(repeat_factors) - elif sampler_name == "ClassAwareSampler": - sampler = ClassAwareSampler(dataset_dicts) - else: - raise ValueError("Unknown training sampler: {}".format(sampler_name)) - - return build_batch_data_loader( - dataset, - sampler, - cfg.SOLVER.IMS_PER_BATCH, - aspect_ratio_grouping=cfg.DATALOADER.ASPECT_RATIO_GROUPING, - num_workers=cfg.DATALOADER.NUM_WORKERS, - ) - - -class ClassAwareSampler(Sampler): - def __init__(self, dataset_dicts, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - self._size = len(dataset_dicts) - assert self._size > 0 - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - self.weights = self._get_class_balance_factor(dataset_dicts) - - - def __iter__(self): - start = self._rank - yield from itertools.islice( - self._infinite_indices(), start, None, self._world_size) - - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - ids = torch.multinomial( - self.weights, self._size, generator=g, - replacement=True) - yield from ids - - - def _get_class_balance_factor(self, dataset_dicts, l=1.): - # 1. For each category c, compute the fraction of images that contain it: f(c) - ret = [] - category_freq = defaultdict(int) - for dataset_dict in dataset_dicts: # For each image (without repeats) - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - for cat_id in cat_ids: - category_freq[cat_id] += 1 - for i, dataset_dict in enumerate(dataset_dicts): - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - ret.append(sum( - [1. 
/ (category_freq[cat_id] ** l) for cat_id in cat_ids])) - return torch.tensor(ret).float() - - -def get_detection_dataset_dicts_with_source( - dataset_names, filter_empty=True, min_keypoints=0, proposal_files=None -): - assert len(dataset_names) - dataset_dicts = [DatasetCatalog.get(dataset_name) for dataset_name in dataset_names] - for dataset_name, dicts in zip(dataset_names, dataset_dicts): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - - for source_id, (dataset_name, dicts) in \ - enumerate(zip(dataset_names, dataset_dicts)): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - for d in dicts: - d['dataset_source'] = source_id - - if "annotations" in dicts[0]: - try: - class_names = MetadataCatalog.get(dataset_name).thing_classes - check_metadata_consistency("thing_classes", dataset_name) - print_instances_class_histogram(dicts, class_names) - except AttributeError: # class names are not available for this dataset - pass - - assert proposal_files is None - - dataset_dicts = list(itertools.chain.from_iterable(dataset_dicts)) - - has_instances = "annotations" in dataset_dicts[0] - if filter_empty and has_instances: - dataset_dicts = filter_images_with_only_crowd_annotations(dataset_dicts) - if min_keypoints > 0 and has_instances: - dataset_dicts = filter_images_with_few_keypoints(dataset_dicts, min_keypoints) - - return dataset_dicts - -class MultiDatasetSampler(Sampler): - def __init__(self, cfg, sizes, dataset_dicts, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). 
- """ - self.sizes = sizes - dataset_ratio = cfg.DATALOADER.DATASET_RATIO - self._batch_size = cfg.SOLVER.IMS_PER_BATCH - assert len(dataset_ratio) == len(sizes), \ - 'length of dataset ratio {} should be equal to number if dataset {}'.format( - len(dataset_ratio), len(sizes) - ) - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - self._ims_per_gpu = self._batch_size // self._world_size - self.dataset_ids = torch.tensor( - [d['dataset_source'] for d in dataset_dicts], dtype=torch.long) - - dataset_weight = [torch.ones(s) * max(sizes) / s * r / sum(dataset_ratio) \ - for i, (r, s) in enumerate(zip(dataset_ratio, sizes))] - dataset_weight = torch.cat(dataset_weight) - self.weights = dataset_weight - self.sample_epoch_size = len(self.weights) - - def __iter__(self): - start = self._rank - yield from itertools.islice( - self._infinite_indices(), start, None, self._world_size) - - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - ids = torch.multinomial( - self.weights, self.sample_epoch_size, generator=g, - replacement=True) - nums = [(self.dataset_ids[ids] == i).sum().int().item() \ - for i in range(len(self.sizes))] - print('_rank, len, nums', self._rank, len(ids), nums, flush=True) - # print('_rank, len, nums, self.dataset_ids[ids[:10]], ', - # self._rank, len(ids), nums, self.dataset_ids[ids[:10]], - # flush=True) - yield from ids \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py deleted file mode 100755 index f8496aac..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py +++ /dev/null @@ -1,49 +0,0 @@ -import os - -from detectron2.data.datasets.register_coco import register_coco_instances -from detectron2.data.datasets.coco import load_coco_json -from detectron2.data.datasets.builtin_meta import _get_builtin_metadata -from detectron2.data import DatasetCatalog, MetadataCatalog - - -def register_distill_coco_instances(name, metadata, json_file, image_root): - """ - add extra_annotation_keys - """ - assert isinstance(name, str), name - assert isinstance(json_file, (str, os.PathLike)), json_file - assert isinstance(image_root, (str, os.PathLike)), image_root - # 1. register a function which returns dicts - DatasetCatalog.register(name, lambda: load_coco_json( - json_file, image_root, name, extra_annotation_keys=['score'])) - - # 2. 
Optionally, add metadata about this dataset, - # since they might be useful in evaluation, visualization or logging - MetadataCatalog.get(name).set( - json_file=json_file, image_root=image_root, evaluator_type="coco", **metadata - ) - - -_PREDEFINED_SPLITS_COCO = { - "coco_2017_unlabeled": ("coco/unlabeled2017", "coco/annotations/image_info_unlabeled2017.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS_COCO.items(): - register_coco_instances( - key, - _get_builtin_metadata('coco'), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) - -_PREDEFINED_SPLITS_DISTILL_COCO = { - "coco_un_yolov4_55_0.5": ("coco/unlabeled2017", "coco/annotations/yolov4_cocounlabeled_55_ann0.5.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS_DISTILL_COCO.items(): - register_distill_coco_instances( - key, - _get_builtin_metadata('coco'), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py deleted file mode 100755 index 52736e33..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py +++ /dev/null @@ -1,37 +0,0 @@ -from detectron2.data.datasets.register_coco import register_coco_instances -import os - -categories = [ - {'id': 0, 'name': 'car'}, - {'id': 1, 'name': 'truck'}, - {'id': 2, 'name': 'trailer'}, - {'id': 3, 'name': 'bus'}, - {'id': 4, 'name': 'construction_vehicle'}, - {'id': 5, 'name': 'bicycle'}, - {'id': 6, 'name': 'motorcycle'}, - {'id': 7, 'name': 'pedestrian'}, - {'id': 8, 'name': 'traffic_cone'}, - {'id': 9, 'name': 'barrier'}, -] - -def _get_builtin_metadata(): - id_to_name = {x['id']: x['name'] for x in categories} - thing_dataset_id_to_contiguous_id = {i: i for i in range(len(categories))} - thing_classes = [id_to_name[k] for k in sorted(id_to_name)] - return { - "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id, - "thing_classes": thing_classes} - -_PREDEFINED_SPLITS = { - "nuimages_train": ("nuimages", "nuimages/annotations/nuimages_v1.0-train.json"), - "nuimages_val": ("nuimages", "nuimages/annotations/nuimages_v1.0-val.json"), - "nuimages_mini": ("nuimages", "nuimages/annotations/nuimages_v1.0-mini.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS.items(): - register_coco_instances( - key, - _get_builtin_metadata(), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py deleted file mode 100755 index 41395bdd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py +++ /dev/null @@ -1,394 +0,0 @@ -from detectron2.data.datasets.register_coco import register_coco_instances -import os - -categories_v1 = [ -{'id': 164, 'name': 'cutting/chopping board'} , -{'id': 49, 'name': 'tie'} , -{'id': 306, 'name': 'crosswalk sign'} , -{'id': 145, 'name': 
'gun'} , -{'id': 14, 'name': 'street lights'} , -{'id': 223, 'name': 'bar soap'} , -{'id': 74, 'name': 'wild bird'} , -{'id': 219, 'name': 'ice cream'} , -{'id': 37, 'name': 'stool'} , -{'id': 25, 'name': 'storage box'} , -{'id': 153, 'name': 'giraffe'} , -{'id': 52, 'name': 'pen/pencil'} , -{'id': 61, 'name': 'high heels'} , -{'id': 340, 'name': 'mangosteen'} , -{'id': 22, 'name': 'bracelet'} , -{'id': 155, 'name': 'piano'} , -{'id': 162, 'name': 'vent'} , -{'id': 75, 'name': 'laptop'} , -{'id': 236, 'name': 'toaster'} , -{'id': 231, 'name': 'fire truck'} , -{'id': 42, 'name': 'basket'} , -{'id': 150, 'name': 'zebra'} , -{'id': 124, 'name': 'head phone'} , -{'id': 90, 'name': 'sheep'} , -{'id': 322, 'name': 'steak'} , -{'id': 39, 'name': 'couch'} , -{'id': 209, 'name': 'toothbrush'} , -{'id': 59, 'name': 'bicycle'} , -{'id': 336, 'name': 'red cabbage'} , -{'id': 228, 'name': 'golf ball'} , -{'id': 120, 'name': 'tomato'} , -{'id': 132, 'name': 'computer box'} , -{'id': 8, 'name': 'cup'} , -{'id': 183, 'name': 'basketball'} , -{'id': 298, 'name': 'butterfly'} , -{'id': 250, 'name': 'garlic'} , -{'id': 12, 'name': 'desk'} , -{'id': 141, 'name': 'microwave'} , -{'id': 171, 'name': 'strawberry'} , -{'id': 200, 'name': 'kettle'} , -{'id': 63, 'name': 'van'} , -{'id': 300, 'name': 'cheese'} , -{'id': 215, 'name': 'marker'} , -{'id': 100, 'name': 'blackboard/whiteboard'} , -{'id': 186, 'name': 'printer'} , -{'id': 333, 'name': 'bread/bun'} , -{'id': 243, 'name': 'penguin'} , -{'id': 364, 'name': 'iron'} , -{'id': 180, 'name': 'ladder'} , -{'id': 34, 'name': 'flag'} , -{'id': 78, 'name': 'cell phone'} , -{'id': 97, 'name': 'fan'} , -{'id': 224, 'name': 'scale'} , -{'id': 151, 'name': 'duck'} , -{'id': 319, 'name': 'flute'} , -{'id': 156, 'name': 'stop sign'} , -{'id': 290, 'name': 'rickshaw'} , -{'id': 128, 'name': 'sailboat'} , -{'id': 165, 'name': 'tennis racket'} , -{'id': 241, 'name': 'cigar'} , -{'id': 101, 'name': 'balloon'} , -{'id': 308, 'name': 'hair drier'} , -{'id': 167, 'name': 'skating and skiing shoes'} , -{'id': 237, 'name': 'helicopter'} , -{'id': 65, 'name': 'sink'} , -{'id': 129, 'name': 'tangerine'} , -{'id': 330, 'name': 'crab'} , -{'id': 320, 'name': 'measuring cup'} , -{'id': 260, 'name': 'fishing rod'} , -{'id': 346, 'name': 'saw'} , -{'id': 216, 'name': 'ship'} , -{'id': 46, 'name': 'coffee table'} , -{'id': 194, 'name': 'facial mask'} , -{'id': 281, 'name': 'stapler'} , -{'id': 118, 'name': 'refrigerator'} , -{'id': 40, 'name': 'belt'} , -{'id': 349, 'name': 'starfish'} , -{'id': 87, 'name': 'hanger'} , -{'id': 116, 'name': 'baseball glove'} , -{'id': 261, 'name': 'cherry'} , -{'id': 334, 'name': 'baozi'} , -{'id': 267, 'name': 'screwdriver'} , -{'id': 158, 'name': 'converter'} , -{'id': 335, 'name': 'lion'} , -{'id': 170, 'name': 'baseball'} , -{'id': 111, 'name': 'skis'} , -{'id': 136, 'name': 'broccoli'} , -{'id': 342, 'name': 'eraser'} , -{'id': 337, 'name': 'polar bear'} , -{'id': 139, 'name': 'shovel'} , -{'id': 193, 'name': 'extension cord'} , -{'id': 284, 'name': 'goldfish'} , -{'id': 174, 'name': 'pepper'} , -{'id': 138, 'name': 'stroller'} , -{'id': 328, 'name': 'yak'} , -{'id': 83, 'name': 'clock'} , -{'id': 235, 'name': 'tricycle'} , -{'id': 248, 'name': 'parking meter'} , -{'id': 274, 'name': 'trophy'} , -{'id': 324, 'name': 'binoculars'} , -{'id': 51, 'name': 'traffic light'} , -{'id': 314, 'name': 'donkey'} , -{'id': 45, 'name': 'barrel/bucket'} , -{'id': 292, 'name': 'pomegranate'} , -{'id': 13, 'name': 'handbag'} , -{'id': 262, 'name': 'tablet'} , -{'id': 
68, 'name': 'apple'} , -{'id': 226, 'name': 'cabbage'} , -{'id': 23, 'name': 'flower'} , -{'id': 58, 'name': 'faucet'} , -{'id': 206, 'name': 'tong'} , -{'id': 291, 'name': 'trombone'} , -{'id': 160, 'name': 'carrot'} , -{'id': 172, 'name': 'bow tie'} , -{'id': 122, 'name': 'tent'} , -{'id': 163, 'name': 'cookies'} , -{'id': 115, 'name': 'remote'} , -{'id': 175, 'name': 'coffee machine'} , -{'id': 238, 'name': 'green beans'} , -{'id': 233, 'name': 'cello'} , -{'id': 28, 'name': 'wine glass'} , -{'id': 295, 'name': 'mushroom'} , -{'id': 344, 'name': 'scallop'} , -{'id': 125, 'name': 'lantern'} , -{'id': 123, 'name': 'shampoo/shower gel'} , -{'id': 285, 'name': 'meat balls'} , -{'id': 266, 'name': 'key'} , -{'id': 296, 'name': 'calculator'} , -{'id': 168, 'name': 'scissors'} , -{'id': 103, 'name': 'cymbal'} , -{'id': 6, 'name': 'bottle'} , -{'id': 264, 'name': 'nuts'} , -{'id': 234, 'name': 'notepaper'} , -{'id': 211, 'name': 'mango'} , -{'id': 287, 'name': 'toothpaste'} , -{'id': 196, 'name': 'chopsticks'} , -{'id': 140, 'name': 'baseball bat'} , -{'id': 244, 'name': 'hurdle'} , -{'id': 195, 'name': 'tennis ball'} , -{'id': 144, 'name': 'surveillance camera'} , -{'id': 271, 'name': 'volleyball'} , -{'id': 94, 'name': 'keyboard'} , -{'id': 339, 'name': 'seal'} , -{'id': 11, 'name': 'picture/frame'} , -{'id': 348, 'name': 'okra'} , -{'id': 191, 'name': 'sausage'} , -{'id': 166, 'name': 'candy'} , -{'id': 62, 'name': 'ring'} , -{'id': 311, 'name': 'dolphin'} , -{'id': 273, 'name': 'eggplant'} , -{'id': 84, 'name': 'drum'} , -{'id': 143, 'name': 'surfboard'} , -{'id': 288, 'name': 'antelope'} , -{'id': 204, 'name': 'clutch'} , -{'id': 207, 'name': 'slide'} , -{'id': 43, 'name': 'towel/napkin'} , -{'id': 352, 'name': 'durian'} , -{'id': 276, 'name': 'board eraser'} , -{'id': 315, 'name': 'electric drill'} , -{'id': 312, 'name': 'sushi'} , -{'id': 198, 'name': 'pie'} , -{'id': 106, 'name': 'pickup truck'} , -{'id': 176, 'name': 'bathtub'} , -{'id': 26, 'name': 'vase'} , -{'id': 133, 'name': 'elephant'} , -{'id': 256, 'name': 'sandwich'} , -{'id': 327, 'name': 'noodles'} , -{'id': 10, 'name': 'glasses'} , -{'id': 109, 'name': 'airplane'} , -{'id': 95, 'name': 'tripod'} , -{'id': 247, 'name': 'CD'} , -{'id': 121, 'name': 'machinery vehicle'} , -{'id': 365, 'name': 'flashlight'} , -{'id': 53, 'name': 'microphone'} , -{'id': 270, 'name': 'pliers'} , -{'id': 362, 'name': 'chainsaw'} , -{'id': 259, 'name': 'bear'} , -{'id': 197, 'name': 'electronic stove and gas stove'} , -{'id': 89, 'name': 'pot/pan'} , -{'id': 220, 'name': 'tape'} , -{'id': 338, 'name': 'lighter'} , -{'id': 177, 'name': 'snowboard'} , -{'id': 214, 'name': 'violin'} , -{'id': 217, 'name': 'chicken'} , -{'id': 2, 'name': 'sneakers'} , -{'id': 161, 'name': 'washing machine'} , -{'id': 131, 'name': 'kite'} , -{'id': 354, 'name': 'rabbit'} , -{'id': 86, 'name': 'bus'} , -{'id': 275, 'name': 'dates'} , -{'id': 282, 'name': 'camel'} , -{'id': 88, 'name': 'nightstand'} , -{'id': 179, 'name': 'grapes'} , -{'id': 229, 'name': 'pine apple'} , -{'id': 56, 'name': 'necklace'} , -{'id': 18, 'name': 'leather shoes'} , -{'id': 358, 'name': 'hoverboard'} , -{'id': 345, 'name': 'pencil case'} , -{'id': 359, 'name': 'pasta'} , -{'id': 157, 'name': 'radiator'} , -{'id': 201, 'name': 'hamburger'} , -{'id': 268, 'name': 'globe'} , -{'id': 332, 'name': 'barbell'} , -{'id': 329, 'name': 'mop'} , -{'id': 252, 'name': 'horn'} , -{'id': 350, 'name': 'eagle'} , -{'id': 169, 'name': 'folder'} , -{'id': 137, 'name': 'toilet'} , -{'id': 5, 'name': 'lamp'} , 
-{'id': 27, 'name': 'bench'} , -{'id': 249, 'name': 'swan'} , -{'id': 76, 'name': 'knife'} , -{'id': 341, 'name': 'comb'} , -{'id': 64, 'name': 'watch'} , -{'id': 105, 'name': 'telephone'} , -{'id': 3, 'name': 'chair'} , -{'id': 33, 'name': 'boat'} , -{'id': 107, 'name': 'orange'} , -{'id': 60, 'name': 'bread'} , -{'id': 147, 'name': 'cat'} , -{'id': 135, 'name': 'gas stove'} , -{'id': 307, 'name': 'papaya'} , -{'id': 227, 'name': 'router/modem'} , -{'id': 357, 'name': 'asparagus'} , -{'id': 73, 'name': 'motorcycle'} , -{'id': 77, 'name': 'traffic sign'} , -{'id': 67, 'name': 'fish'} , -{'id': 326, 'name': 'radish'} , -{'id': 213, 'name': 'egg'} , -{'id': 203, 'name': 'cucumber'} , -{'id': 17, 'name': 'helmet'} , -{'id': 110, 'name': 'luggage'} , -{'id': 80, 'name': 'truck'} , -{'id': 199, 'name': 'frisbee'} , -{'id': 232, 'name': 'peach'} , -{'id': 1, 'name': 'person'} , -{'id': 29, 'name': 'boots'} , -{'id': 310, 'name': 'chips'} , -{'id': 142, 'name': 'skateboard'} , -{'id': 44, 'name': 'slippers'} , -{'id': 4, 'name': 'hat'} , -{'id': 178, 'name': 'suitcase'} , -{'id': 24, 'name': 'tv'} , -{'id': 119, 'name': 'train'} , -{'id': 82, 'name': 'power outlet'} , -{'id': 245, 'name': 'swing'} , -{'id': 15, 'name': 'book'} , -{'id': 294, 'name': 'jellyfish'} , -{'id': 192, 'name': 'fire extinguisher'} , -{'id': 212, 'name': 'deer'} , -{'id': 181, 'name': 'pear'} , -{'id': 347, 'name': 'table tennis paddle'} , -{'id': 113, 'name': 'trolley'} , -{'id': 91, 'name': 'guitar'} , -{'id': 202, 'name': 'golf club'} , -{'id': 221, 'name': 'wheelchair'} , -{'id': 254, 'name': 'saxophone'} , -{'id': 117, 'name': 'paper towel'} , -{'id': 303, 'name': 'race car'} , -{'id': 240, 'name': 'carriage'} , -{'id': 246, 'name': 'radio'} , -{'id': 318, 'name': 'parrot'} , -{'id': 251, 'name': 'french fries'} , -{'id': 98, 'name': 'dog'} , -{'id': 112, 'name': 'soccer'} , -{'id': 355, 'name': 'french horn'} , -{'id': 79, 'name': 'paddle'} , -{'id': 283, 'name': 'lettuce'} , -{'id': 9, 'name': 'car'} , -{'id': 258, 'name': 'kiwi fruit'} , -{'id': 325, 'name': 'llama'} , -{'id': 187, 'name': 'billiards'} , -{'id': 210, 'name': 'facial cleanser'} , -{'id': 81, 'name': 'cow'} , -{'id': 331, 'name': 'microscope'} , -{'id': 148, 'name': 'lemon'} , -{'id': 302, 'name': 'pomelo'} , -{'id': 85, 'name': 'fork'} , -{'id': 154, 'name': 'pumpkin'} , -{'id': 289, 'name': 'shrimp'} , -{'id': 71, 'name': 'teddy bear'} , -{'id': 184, 'name': 'potato'} , -{'id': 102, 'name': 'air conditioner'} , -{'id': 208, 'name': 'hot dog'} , -{'id': 222, 'name': 'plum'} , -{'id': 316, 'name': 'spring rolls'} , -{'id': 230, 'name': 'crane'} , -{'id': 149, 'name': 'liquid soap'} , -{'id': 55, 'name': 'canned'} , -{'id': 35, 'name': 'speaker'} , -{'id': 108, 'name': 'banana'} , -{'id': 297, 'name': 'treadmill'} , -{'id': 99, 'name': 'spoon'} , -{'id': 104, 'name': 'mouse'} , -{'id': 182, 'name': 'american football'} , -{'id': 299, 'name': 'egg tart'} , -{'id': 127, 'name': 'cleaning products'} , -{'id': 313, 'name': 'urinal'} , -{'id': 286, 'name': 'medal'} , -{'id': 239, 'name': 'brush'} , -{'id': 96, 'name': 'hockey'} , -{'id': 279, 'name': 'dumbbell'} , -{'id': 32, 'name': 'umbrella'} , -{'id': 272, 'name': 'hammer'} , -{'id': 16, 'name': 'plate'} , -{'id': 21, 'name': 'potted plant'} , -{'id': 242, 'name': 'earphone'} , -{'id': 70, 'name': 'candle'} , -{'id': 185, 'name': 'paint brush'} , -{'id': 48, 'name': 'toy'} , -{'id': 130, 'name': 'pizza'} , -{'id': 255, 'name': 'trumpet'} , -{'id': 361, 'name': 'hotair balloon'} , -{'id': 188, 'name': 
'fire hydrant'} ,
-{'id': 50, 'name': 'bed'} ,
-{'id': 253, 'name': 'avocado'} ,
-{'id': 293, 'name': 'coconut'} ,
-{'id': 257, 'name': 'cue'} ,
-{'id': 280, 'name': 'hamimelon'} ,
-{'id': 66, 'name': 'horse'} ,
-{'id': 173, 'name': 'pigeon'} ,
-{'id': 190, 'name': 'projector'} ,
-{'id': 69, 'name': 'camera'} ,
-{'id': 30, 'name': 'bowl'} ,
-{'id': 269, 'name': 'broom'} ,
-{'id': 343, 'name': 'pitaya'} ,
-{'id': 305, 'name': 'tuba'} ,
-{'id': 309, 'name': 'green onion'} ,
-{'id': 363, 'name': 'lobster'} ,
-{'id': 225, 'name': 'watermelon'} ,
-{'id': 47, 'name': 'suv'} ,
-{'id': 31, 'name': 'dining table'} ,
-{'id': 54, 'name': 'sandals'} ,
-{'id': 351, 'name': 'monkey'} ,
-{'id': 218, 'name': 'onion'} ,
-{'id': 36, 'name': 'trash bin/can'} ,
-{'id': 20, 'name': 'glove'} ,
-{'id': 277, 'name': 'rice'} ,
-{'id': 152, 'name': 'sports car'} ,
-{'id': 360, 'name': 'target'} ,
-{'id': 205, 'name': 'blender'} ,
-{'id': 19, 'name': 'pillow'} ,
-{'id': 72, 'name': 'cake'} ,
-{'id': 93, 'name': 'tea pot'} ,
-{'id': 353, 'name': 'game board'} ,
-{'id': 38, 'name': 'backpack'} ,
-{'id': 356, 'name': 'ambulance'} ,
-{'id': 146, 'name': 'life saver'} ,
-{'id': 189, 'name': 'goose'} ,
-{'id': 278, 'name': 'tape measure/ruler'} ,
-{'id': 92, 'name': 'traffic cone'} ,
-{'id': 134, 'name': 'toiletries'} ,
-{'id': 114, 'name': 'oven'} ,
-{'id': 317, 'name': 'tortoise/turtle'} ,
-{'id': 265, 'name': 'corn'} ,
-{'id': 126, 'name': 'donut'} ,
-{'id': 57, 'name': 'mirror'} ,
-{'id': 7, 'name': 'cabinet/shelf'} ,
-{'id': 263, 'name': 'green vegetables'} ,
-{'id': 159, 'name': 'tissue '} ,
-{'id': 321, 'name': 'shark'} ,
-{'id': 301, 'name': 'pig'} ,
-{'id': 41, 'name': 'carpet'} ,
-{'id': 304, 'name': 'rice cooker'} ,
-{'id': 323, 'name': 'poker card'} ,
-]
-
-def _get_builtin_metadata(version):
-    if version == 'v1':
-        id_to_name = {x['id']: x['name'] for x in categories_v1}
-    else:
-        assert 0, version
-    thing_dataset_id_to_contiguous_id = {i + 1: i for i in range(365)}
-    thing_classes = [id_to_name[k] for k in sorted(id_to_name)]
-    return {
-        "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id,
-        "thing_classes": thing_classes}
-
-_PREDEFINED_SPLITS_OBJECTS365 = {
-    "objects365_train": ("objects365/train", "objects365/annotations/objects365_train.json"),
-    "objects365_val": ("objects365/val", "objects365/annotations/objects365_val.json"),
-}
-
-for key, (image_root, json_file) in _PREDEFINED_SPLITS_OBJECTS365.items():
-    register_coco_instances(
-        key,
-        _get_builtin_metadata('v1'),
-        os.path.join("datasets", json_file) if "://" not in json_file else json_file,
-        os.path.join("datasets", image_root),
-    )
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
deleted file mode 100755
index 5a69e178..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
+++ /dev/null
@@ -1,63 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-# Modified by Xingyi Zhou
-"""
-Implement many useful :class:`Augmentation`.
-""" -import numpy as np -import sys -from fvcore.transforms.transform import ( - BlendTransform, - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - VFlipTransform, -) -from PIL import Image - -from detectron2.data.transforms.augmentation import Augmentation -from .custom_transform import EfficientDetResizeCropTransform - -__all__ = [ - "EfficientDetResizeCrop", -] - - -class EfficientDetResizeCrop(Augmentation): - """ - Scale the shorter edge to the given size, with a limit of `max_size` on the longer edge. - If `max_size` is reached, then downscale so that the longer edge does not exceed max_size. - """ - - def __init__( - self, size, scale, interp=Image.BILINEAR - ): - """ - Args: - """ - super().__init__() - self.target_size = (size, size) - self.scale = scale - self.interp = interp - - def get_transform(self, img): - # Select a random scale factor. - scale_factor = np.random.uniform(*self.scale) - scaled_target_height = scale_factor * self.target_size[0] - scaled_target_width = scale_factor * self.target_size[1] - # Recompute the accurate scale_factor using rounded scaled image size. - width, height = img.shape[1], img.shape[0] - img_scale_y = scaled_target_height / height - img_scale_x = scaled_target_width / width - img_scale = min(img_scale_y, img_scale_x) - - # Select non-zero random offset (x, y) if scaled image is larger than target size - scaled_h = int(height * img_scale) - scaled_w = int(width * img_scale) - offset_y = scaled_h - self.target_size[0] - offset_x = scaled_w - self.target_size[1] - offset_y = int(max(0.0, float(offset_y)) * np.random.uniform(0, 1)) - offset_x = int(max(0.0, float(offset_x)) * np.random.uniform(0, 1)) - return EfficientDetResizeCropTransform( - scaled_h, scaled_w, offset_y, offset_x, img_scale, self.target_size, self.interp) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py deleted file mode 100755 index 654d65d9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py +++ /dev/null @@ -1,94 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Modified by Xingyi Zhou -# File: transform.py - -import numpy as np -import torch -import torch.nn.functional as F -from fvcore.transforms.transform import ( - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - TransformList, -) -from PIL import Image - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - -__all__ = [ - "EfficientDetResizeCropTransform", -] - - -class EfficientDetResizeCropTransform(Transform): - """ - """ - - def __init__(self, scaled_h, scaled_w, offset_y, offset_x, img_scale, target_size, interp=None): - """ - Args: - h, w (int): original image size - new_h, new_w (int): new image size - interp: PIL interpolation methods, defaults to bilinear. 
- """ - # TODO decide on PIL vs opencv - super().__init__() - if interp is None: - interp = Image.BILINEAR - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - # assert img.shape[:2] == (self.h, self.w) - assert len(img.shape) <= 4 - - if img.dtype == np.uint8: - pil_image = Image.fromarray(img) - interp_method = interp if interp is not None else self.interp - pil_image = pil_image.resize((self.scaled_w, self.scaled_h), interp_method) - ret = np.asarray(pil_image) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - # img = img.crop((self.offset_x, self.offset_y, right, lower)) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - else: - # PIL only supports uint8 - img = torch.from_numpy(img) - shape = list(img.shape) - shape_4d = shape[:2] + [1] * (4 - len(shape)) + shape[2:] - img = img.view(shape_4d).permute(2, 3, 0, 1) # hw(c) -> nchw - _PIL_RESIZE_TO_INTERPOLATE_MODE = {Image.BILINEAR: "bilinear", Image.BICUBIC: "bicubic"} - mode = _PIL_RESIZE_TO_INTERPOLATE_MODE[self.interp] - img = F.interpolate(img, (self.scaled_h, self.scaled_w), mode=mode, align_corners=False) - shape[:2] = (self.scaled_h, self.scaled_w) - ret = img.permute(2, 3, 0, 1).view(shape).numpy() # nchw -> hw(c) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - return ret - - def apply_coords(self, coords): - coords[:, 0] = coords[:, 0] * self.img_scale - coords[:, 1] = coords[:, 1] * self.img_scale - coords[:, 0] -= self.offset_x - coords[:, 1] -= self.offset_y - return coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - def inverse(self): - raise NotImplementedError - # return ResizeTransform(self.new_h, self.new_w, self.h, self.w, self.interp) \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py deleted file mode 100755 index 565e2940..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py +++ /dev/null @@ -1,425 +0,0 @@ -# Modified from https://github.com/rwightman/efficientdet-pytorch/blob/master/effdet/efficientdet.py -# The original file is under Apache-2.0 License -import math -from os.path import join -import numpy as np -from collections import OrderedDict -from typing import List - -import torch -from torch import nn -import torch.utils.model_zoo as model_zoo -import torch.nn.functional as F -import fvcore.nn.weight_init as weight_init - -from detectron2.layers import ShapeSpec, Conv2d -from detectron2.modeling.backbone.resnet import build_resnet_backbone -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.layers.batch_norm import get_norm -from detectron2.modeling.backbone import Backbone -from .dlafpn import dla34 - -def get_fpn_config(base_reduction=8): - """BiFPN config with sum.""" - p = { - 'nodes': [ - {'reduction': base_reduction << 3, 
'inputs_offsets': [3, 4]}, - {'reduction': base_reduction << 2, 'inputs_offsets': [2, 5]}, - {'reduction': base_reduction << 1, 'inputs_offsets': [1, 6]}, - {'reduction': base_reduction, 'inputs_offsets': [0, 7]}, - {'reduction': base_reduction << 1, 'inputs_offsets': [1, 7, 8]}, - {'reduction': base_reduction << 2, 'inputs_offsets': [2, 6, 9]}, - {'reduction': base_reduction << 3, 'inputs_offsets': [3, 5, 10]}, - {'reduction': base_reduction << 4, 'inputs_offsets': [4, 11]}, - ], - 'weight_method': 'fastattn', - } - return p - - -def swish(x, inplace: bool = False): - """Swish - Described in: https://arxiv.org/abs/1710.05941 - """ - return x.mul_(x.sigmoid()) if inplace else x.mul(x.sigmoid()) - - -class Swish(nn.Module): - def __init__(self, inplace: bool = False): - super(Swish, self).__init__() - self.inplace = inplace - - def forward(self, x): - return swish(x, self.inplace) - - -class SequentialAppend(nn.Sequential): - def __init__(self, *args): - super(SequentialAppend, self).__init__(*args) - - def forward(self, x): - for module in self: - x.append(module(x)) - return x - - -class SequentialAppendLast(nn.Sequential): - def __init__(self, *args): - super(SequentialAppendLast, self).__init__(*args) - - # def forward(self, x: List[torch.Tensor]): - def forward(self, x): - for module in self: - x.append(module(x[-1])) - return x - - -class ConvBnAct2d(nn.Module): - def __init__(self, in_channels, out_channels, kernel_size, stride=1, dilation=1, padding='', bias=False, - norm='', act_layer=Swish): - super(ConvBnAct2d, self).__init__() - # self.conv = create_conv2d( - # in_channels, out_channels, kernel_size, stride=stride, dilation=dilation, padding=padding, bias=bias) - self.conv = Conv2d( - in_channels, out_channels, kernel_size=kernel_size, stride=stride, - padding=kernel_size // 2, bias=(norm == '')) - self.bn = get_norm(norm, out_channels) - self.act = None if act_layer is None else act_layer(inplace=True) - - def forward(self, x): - x = self.conv(x) - if self.bn is not None: - x = self.bn(x) - if self.act is not None: - x = self.act(x) - return x - - -class SeparableConv2d(nn.Module): - """ Separable Conv - """ - def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, dilation=1, padding='', bias=False, - channel_multiplier=1.0, pw_kernel_size=1, act_layer=Swish, - norm=''): - super(SeparableConv2d, self).__init__() - - # self.conv_dw = create_conv2d( - # in_channels, int(in_channels * channel_multiplier), kernel_size, - # stride=stride, dilation=dilation, padding=padding, depthwise=True) - - self.conv_dw = Conv2d( - in_channels, int(in_channels * channel_multiplier), - kernel_size=kernel_size, stride=stride, padding=kernel_size // 2, bias=bias, - groups=out_channels) - # print('conv_dw', kernel_size, stride) - # self.conv_pw = create_conv2d( - # int(in_channels * channel_multiplier), out_channels, pw_kernel_size, padding=padding, bias=bias) - - self.conv_pw = Conv2d( - int(in_channels * channel_multiplier), out_channels, - kernel_size=pw_kernel_size, padding=pw_kernel_size // 2, bias=(norm=='')) - # print('conv_pw', pw_kernel_size) - - self.bn = get_norm(norm, out_channels) - self.act = None if act_layer is None else act_layer(inplace=True) - - def forward(self, x): - x = self.conv_dw(x) - x = self.conv_pw(x) - if self.bn is not None: - x = self.bn(x) - if self.act is not None: - x = self.act(x) - return x - - -class ResampleFeatureMap(nn.Sequential): - def __init__(self, in_channels, out_channels, reduction_ratio=1., pad_type='', pooling_type='max', - norm='', 
apply_bn=False, conv_after_downsample=False, - redundant_bias=False): - super(ResampleFeatureMap, self).__init__() - pooling_type = pooling_type or 'max' - self.in_channels = in_channels - self.out_channels = out_channels - self.reduction_ratio = reduction_ratio - self.conv_after_downsample = conv_after_downsample - - conv = None - if in_channels != out_channels: - conv = ConvBnAct2d( - in_channels, out_channels, kernel_size=1, padding=pad_type, - norm=norm if apply_bn else '', - bias=not apply_bn or redundant_bias, act_layer=None) - - if reduction_ratio > 1: - stride_size = int(reduction_ratio) - if conv is not None and not self.conv_after_downsample: - self.add_module('conv', conv) - self.add_module( - 'downsample', - # create_pool2d( - # pooling_type, kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) - # nn.MaxPool2d(kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) - nn.MaxPool2d(kernel_size=stride_size, stride=stride_size) - ) - if conv is not None and self.conv_after_downsample: - self.add_module('conv', conv) - else: - if conv is not None: - self.add_module('conv', conv) - if reduction_ratio < 1: - scale = int(1 // reduction_ratio) - self.add_module('upsample', nn.UpsamplingNearest2d(scale_factor=scale)) - - -class FpnCombine(nn.Module): - def __init__(self, feature_info, fpn_config, fpn_channels, inputs_offsets, target_reduction, pad_type='', - pooling_type='max', norm='', apply_bn_for_resampling=False, - conv_after_downsample=False, redundant_bias=False, weight_method='attn'): - super(FpnCombine, self).__init__() - self.inputs_offsets = inputs_offsets - self.weight_method = weight_method - - self.resample = nn.ModuleDict() - for idx, offset in enumerate(inputs_offsets): - in_channels = fpn_channels - if offset < len(feature_info): - in_channels = feature_info[offset]['num_chs'] - input_reduction = feature_info[offset]['reduction'] - else: - node_idx = offset - len(feature_info) - # print('node_idx, len', node_idx, len(fpn_config['nodes'])) - input_reduction = fpn_config['nodes'][node_idx]['reduction'] - reduction_ratio = target_reduction / input_reduction - self.resample[str(offset)] = ResampleFeatureMap( - in_channels, fpn_channels, reduction_ratio=reduction_ratio, pad_type=pad_type, - pooling_type=pooling_type, norm=norm, - apply_bn=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, - redundant_bias=redundant_bias) - - if weight_method == 'attn' or weight_method == 'fastattn': - # WSM - self.edge_weights = nn.Parameter(torch.ones(len(inputs_offsets)), requires_grad=True) - else: - self.edge_weights = None - - def forward(self, x): - dtype = x[0].dtype - nodes = [] - for offset in self.inputs_offsets: - input_node = x[offset] - input_node = self.resample[str(offset)](input_node) - nodes.append(input_node) - - if self.weight_method == 'attn': - normalized_weights = torch.softmax(self.edge_weights.type(dtype), dim=0) - x = torch.stack(nodes, dim=-1) * normalized_weights - elif self.weight_method == 'fastattn': - edge_weights = nn.functional.relu(self.edge_weights.type(dtype)) - weights_sum = torch.sum(edge_weights) - x = torch.stack( - [(nodes[i] * edge_weights[i]) / (weights_sum + 0.0001) for i in range(len(nodes))], dim=-1) - elif self.weight_method == 'sum': - x = torch.stack(nodes, dim=-1) - else: - raise ValueError('unknown weight_method {}'.format(self.weight_method)) - x = torch.sum(x, dim=-1) - return x - - -class BiFpnLayer(nn.Module): - def __init__(self, feature_info, fpn_config, fpn_channels, num_levels=5, 
pad_type='', - pooling_type='max', norm='', act_layer=Swish, - apply_bn_for_resampling=False, conv_after_downsample=True, conv_bn_relu_pattern=False, - separable_conv=True, redundant_bias=False): - super(BiFpnLayer, self).__init__() - self.fpn_config = fpn_config - self.num_levels = num_levels - self.conv_bn_relu_pattern = False - - self.feature_info = [] - self.fnode = SequentialAppend() - for i, fnode_cfg in enumerate(fpn_config['nodes']): - # logging.debug('fnode {} : {}'.format(i, fnode_cfg)) - # print('fnode {} : {}'.format(i, fnode_cfg)) - fnode_layers = OrderedDict() - - # combine features - reduction = fnode_cfg['reduction'] - fnode_layers['combine'] = FpnCombine( - feature_info, fpn_config, fpn_channels, fnode_cfg['inputs_offsets'], target_reduction=reduction, - pad_type=pad_type, pooling_type=pooling_type, norm=norm, - apply_bn_for_resampling=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, - redundant_bias=redundant_bias, weight_method=fpn_config['weight_method']) - self.feature_info.append(dict(num_chs=fpn_channels, reduction=reduction)) - - # after combine ops - after_combine = OrderedDict() - if not conv_bn_relu_pattern: - after_combine['act'] = act_layer(inplace=True) - conv_bias = redundant_bias - conv_act = None - else: - conv_bias = False - conv_act = act_layer - conv_kwargs = dict( - in_channels=fpn_channels, out_channels=fpn_channels, kernel_size=3, padding=pad_type, - bias=conv_bias, norm=norm, act_layer=conv_act) - after_combine['conv'] = SeparableConv2d(**conv_kwargs) if separable_conv else ConvBnAct2d(**conv_kwargs) - fnode_layers['after_combine'] = nn.Sequential(after_combine) - - self.fnode.add_module(str(i), nn.Sequential(fnode_layers)) - - self.feature_info = self.feature_info[-num_levels::] - - def forward(self, x): - x = self.fnode(x) - return x[-self.num_levels::] - - -class BiFPN(Backbone): - def __init__( - self, cfg, bottom_up, in_features, out_channels, norm='', - num_levels=5, num_bifpn=4, separable_conv=False, - ): - super(BiFPN, self).__init__() - assert isinstance(bottom_up, Backbone) - - # Feature map strides and channels from the bottom up network (e.g. 
ResNet) - input_shapes = bottom_up.output_shape() - in_strides = [input_shapes[f].stride for f in in_features] - in_channels = [input_shapes[f].channels for f in in_features] - - self.num_levels = num_levels - self.num_bifpn = num_bifpn - self.bottom_up = bottom_up - self.in_features = in_features - self._size_divisibility = 128 - levels = [int(math.log2(s)) for s in in_strides] - self._out_feature_strides = { - "p{}".format(int(math.log2(s))): s for s in in_strides} - if len(in_features) < num_levels: - for l in range(num_levels - len(in_features)): - s = l + levels[-1] - self._out_feature_strides["p{}".format(s + 1)] = 2 ** (s + 1) - self._out_features = list(sorted(self._out_feature_strides.keys())) - self._out_feature_channels = {k: out_channels for k in self._out_features} - - # print('self._out_feature_strides', self._out_feature_strides) - # print('self._out_feature_channels', self._out_feature_channels) - - feature_info = [ - {'num_chs': in_channels[level], 'reduction': in_strides[level]} \ - for level in range(len(self.in_features)) - ] - # self.config = config - fpn_config = get_fpn_config() - self.resample = SequentialAppendLast() - for level in range(num_levels): - if level < len(feature_info): - in_chs = in_channels[level] # feature_info[level]['num_chs'] - reduction = in_strides[level] # feature_info[level]['reduction'] - else: - # Adds a coarser level by downsampling the last feature map - reduction_ratio = 2 - self.resample.add_module(str(level), ResampleFeatureMap( - in_channels=in_chs, - out_channels=out_channels, - pad_type='same', - pooling_type=None, - norm=norm, - reduction_ratio=reduction_ratio, - apply_bn=True, - conv_after_downsample=False, - redundant_bias=False, - )) - in_chs = out_channels - reduction = int(reduction * reduction_ratio) - feature_info.append(dict(num_chs=in_chs, reduction=reduction)) - - self.cell = nn.Sequential() - for rep in range(self.num_bifpn): - # logging.debug('building cell {}'.format(rep)) - # print('building cell {}'.format(rep)) - fpn_layer = BiFpnLayer( - feature_info=feature_info, - fpn_config=fpn_config, - fpn_channels=out_channels, - num_levels=self.num_levels, - pad_type='same', - pooling_type=None, - norm=norm, - act_layer=Swish, - separable_conv=separable_conv, - apply_bn_for_resampling=True, - conv_after_downsample=False, - conv_bn_relu_pattern=False, - redundant_bias=False, - ) - self.cell.add_module(str(rep), fpn_layer) - feature_info = fpn_layer.feature_info - # import pdb; pdb.set_trace() - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - # print('input shapes', x.shape) - bottom_up_features = self.bottom_up(x) - x = [bottom_up_features[f] for f in self.in_features] - assert len(self.resample) == self.num_levels - len(x) - x = self.resample(x) - shapes = [xx.shape for xx in x] - # print('resample shapes', shapes) - x = self.cell(x) - out = {f: xx for f, xx in zip(self._out_features, x)} - # import pdb; pdb.set_trace() - return out - - -@BACKBONE_REGISTRY.register() -def build_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
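
The `FpnCombine` module deleted above fuses its resampled inputs with the `'fastattn'` scheme: learned per-edge weights are passed through ReLU and normalized to sum to one, a cheaper stand-in for softmax attention. A minimal sketch of just that fusion step, assuming same-shape inputs; `fast_attention_fuse` is an illustrative name:

```python
import torch
import torch.nn.functional as F

def fast_attention_fuse(nodes, edge_weights, eps=1e-4):
    """'fastattn' weighting from FpnCombine.forward: ReLU-gated weights,
    normalized to sum to one, then a weighted sum of the feature maps."""
    w = F.relu(edge_weights)            # non-negative gate per input edge
    w = w / (w.sum() + eps)             # normalize instead of softmax
    return torch.stack([n * wi for n, wi in zip(nodes, w)], dim=-1).sum(dim=-1)

feats = [torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)]
edge_weights = torch.nn.Parameter(torch.ones(len(feats)))  # as in FpnCombine
print(fast_attention_fuse(feats, edge_weights).shape)      # torch.Size([1, 64, 32, 32])
```
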
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone - -@BACKBONE_REGISTRY.register() -def build_p37_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 - - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py deleted file mode 100755 index 17f2904c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py +++ /dev/null @@ -1,469 +0,0 @@ -# This file is modified from https://github.com/aim-uofa/AdelaiDet/blob/master/adet/modeling/backbone/bifpn.py -# The original file is under 2-clause BSD License for academic use, and *non-commercial use*. -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import Conv2d, ShapeSpec, get_norm - -from detectron2.modeling.backbone import Backbone, build_resnet_backbone -from detectron2.modeling import BACKBONE_REGISTRY -from .dlafpn import dla34 - -__all__ = [] - - -def swish(x): - return x * x.sigmoid() - - -def split_name(name): - for i, c in enumerate(name): - if not c.isalpha(): - return name[:i], int(name[i:]) - raise ValueError() - - -class FeatureMapResampler(nn.Module): - def __init__(self, in_channels, out_channels, stride, norm=""): - super(FeatureMapResampler, self).__init__() - if in_channels != out_channels: - self.reduction = Conv2d( - in_channels, out_channels, kernel_size=1, - bias=(norm == ""), - norm=get_norm(norm, out_channels), - activation=None - ) - else: - self.reduction = None - - assert stride <= 2 - self.stride = stride - - def forward(self, x): - if self.reduction is not None: - x = self.reduction(x) - - if self.stride == 2: - x = F.max_pool2d( - x, kernel_size=self.stride + 1, - stride=self.stride, padding=1 - ) - elif self.stride == 1: - pass - else: - raise NotImplementedError() - return x - - -class BackboneWithTopLevels(Backbone): - def __init__(self, backbone, out_channels, num_top_levels, norm=""): - super(BackboneWithTopLevels, self).__init__() - self.backbone = backbone - backbone_output_shape = backbone.output_shape() - - self._out_feature_channels = {name: shape.channels for name, shape in backbone_output_shape.items()} - self._out_feature_strides = {name: shape.stride for name, shape in backbone_output_shape.items()} - self._out_features = list(self._out_feature_strides.keys()) - - last_feature_name = max(self._out_feature_strides.keys(), key=lambda x: split_name(x)[1]) - self.last_feature_name = 
last_feature_name - self.num_top_levels = num_top_levels - - last_channels = self._out_feature_channels[last_feature_name] - last_stride = self._out_feature_strides[last_feature_name] - - prefix, suffix = split_name(last_feature_name) - prev_channels = last_channels - for i in range(num_top_levels): - name = prefix + str(suffix + i + 1) - self.add_module(name, FeatureMapResampler( - prev_channels, out_channels, 2, norm - )) - prev_channels = out_channels - - self._out_feature_channels[name] = out_channels - self._out_feature_strides[name] = last_stride * 2 ** (i + 1) - self._out_features.append(name) - - def forward(self, x): - outputs = self.backbone(x) - last_features = outputs[self.last_feature_name] - prefix, suffix = split_name(self.last_feature_name) - - x = last_features - for i in range(self.num_top_levels): - name = prefix + str(suffix + i + 1) - x = self.__getattr__(name)(x) - outputs[name] = x - - return outputs - - -class SingleBiFPN(Backbone): - """ - This module implements Feature Pyramid Network. - It creates pyramid features built on top of some input feature maps. - """ - - def __init__( - self, in_channels_list, out_channels, norm="" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. - norm (str): the normalization to use. 
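
`split_name` and `BackboneWithTopLevels` above generate the extra coarse levels (e.g. p6/p7) by parsing the last backbone feature name and counting upward. A small sketch of that naming logic, duplicated here for illustration; `top_level_names` is our name:

```python
def split_name(name):
    """Split 'res5' into ('res', 5), as in bifpn_fcos.py above."""
    for i, c in enumerate(name):
        if not c.isalpha():
            return name[:i], int(name[i:])
    raise ValueError(f"invalid feature name: {name}")

def top_level_names(last_feature, num_top_levels):
    """Names of the coarser levels appended by BackboneWithTopLevels."""
    prefix, suffix = split_name(last_feature)
    return [prefix + str(suffix + i + 1) for i in range(num_top_levels)]

assert top_level_names("res5", 2) == ["res6", "res7"]
```
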
- """ - super(SingleBiFPN, self).__init__() - - self.out_channels = out_channels - # build 5-levels bifpn - if len(in_channels_list) == 5: - self.nodes = [ - {'feat_level': 3, 'inputs_offsets': [3, 4]}, - {'feat_level': 2, 'inputs_offsets': [2, 5]}, - {'feat_level': 1, 'inputs_offsets': [1, 6]}, - {'feat_level': 0, 'inputs_offsets': [0, 7]}, - {'feat_level': 1, 'inputs_offsets': [1, 7, 8]}, - {'feat_level': 2, 'inputs_offsets': [2, 6, 9]}, - {'feat_level': 3, 'inputs_offsets': [3, 5, 10]}, - {'feat_level': 4, 'inputs_offsets': [4, 11]}, - ] - elif len(in_channels_list) == 3: - self.nodes = [ - {'feat_level': 1, 'inputs_offsets': [1, 2]}, - {'feat_level': 0, 'inputs_offsets': [0, 3]}, - {'feat_level': 1, 'inputs_offsets': [1, 3, 4]}, - {'feat_level': 2, 'inputs_offsets': [2, 5]}, - ] - else: - raise NotImplementedError - - node_info = [_ for _ in in_channels_list] - - num_output_connections = [0 for _ in in_channels_list] - for fnode in self.nodes: - feat_level = fnode["feat_level"] - inputs_offsets = fnode["inputs_offsets"] - inputs_offsets_str = "_".join(map(str, inputs_offsets)) - for input_offset in inputs_offsets: - num_output_connections[input_offset] += 1 - - in_channels = node_info[input_offset] - if in_channels != out_channels: - lateral_conv = Conv2d( - in_channels, - out_channels, - kernel_size=1, - norm=get_norm(norm, out_channels) - ) - self.add_module( - "lateral_{}_f{}".format(input_offset, feat_level), lateral_conv - ) - node_info.append(out_channels) - num_output_connections.append(0) - - # generate attention weights - name = "weights_f{}_{}".format(feat_level, inputs_offsets_str) - self.__setattr__(name, nn.Parameter( - torch.ones(len(inputs_offsets), dtype=torch.float32), - requires_grad=True - )) - - # generate convolutions after combination - name = "outputs_f{}_{}".format(feat_level, inputs_offsets_str) - self.add_module(name, Conv2d( - out_channels, - out_channels, - kernel_size=3, - padding=1, - norm=get_norm(norm, out_channels), - bias=(norm == "") - )) - - def forward(self, feats): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "p5") to - feature map tensor for each feature level in high to low resolution order. - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["n2", "n3", ..., "n6"]. 
- """ - feats = [_ for _ in feats] - num_levels = len(feats) - num_output_connections = [0 for _ in feats] - for fnode in self.nodes: - feat_level = fnode["feat_level"] - inputs_offsets = fnode["inputs_offsets"] - inputs_offsets_str = "_".join(map(str, inputs_offsets)) - input_nodes = [] - _, _, target_h, target_w = feats[feat_level].size() - for input_offset in inputs_offsets: - num_output_connections[input_offset] += 1 - input_node = feats[input_offset] - - # reduction - if input_node.size(1) != self.out_channels: - name = "lateral_{}_f{}".format(input_offset, feat_level) - input_node = self.__getattr__(name)(input_node) - - # maybe downsample - _, _, h, w = input_node.size() - if h > target_h and w > target_w: - height_stride_size = int((h - 1) // target_h + 1) - width_stride_size = int((w - 1) // target_w + 1) - assert height_stride_size == width_stride_size == 2 - input_node = F.max_pool2d( - input_node, kernel_size=(height_stride_size + 1, width_stride_size + 1), - stride=(height_stride_size, width_stride_size), padding=1 - ) - elif h <= target_h and w <= target_w: - if h < target_h or w < target_w: - input_node = F.interpolate( - input_node, - size=(target_h, target_w), - mode="nearest" - ) - else: - raise NotImplementedError() - input_nodes.append(input_node) - - # attention - name = "weights_f{}_{}".format(feat_level, inputs_offsets_str) - weights = F.relu(self.__getattr__(name)) - norm_weights = weights / (weights.sum() + 0.0001) - - new_node = torch.stack(input_nodes, dim=-1) - new_node = (norm_weights * new_node).sum(dim=-1) - new_node = swish(new_node) - - name = "outputs_f{}_{}".format(feat_level, inputs_offsets_str) - feats.append(self.__getattr__(name)(new_node)) - - num_output_connections.append(0) - - output_feats = [] - for idx in range(num_levels): - for i, fnode in enumerate(reversed(self.nodes)): - if fnode['feat_level'] == idx: - output_feats.append(feats[-1 - i]) - break - else: - raise ValueError() - return output_feats - - -class BiFPN(Backbone): - """ - This module implements Feature Pyramid Network. - It creates pyramid features built on top of some input feature maps. - """ - - def __init__( - self, bottom_up, in_features, out_channels, num_top_levels, num_repeats, norm="" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. - num_top_levels (int): the number of the top levels (p6 or p7). - num_repeats (int): the number of repeats of BiFPN. - norm (str): the normalization to use. 
- """ - super(BiFPN, self).__init__() - assert isinstance(bottom_up, Backbone) - - # add extra feature levels (i.e., 6 and 7) - self.bottom_up = BackboneWithTopLevels( - bottom_up, out_channels, - num_top_levels, norm - ) - bottom_up_output_shapes = self.bottom_up.output_shape() - - in_features = sorted(in_features, key=lambda x: split_name(x)[1]) - self._size_divisibility = 128 #bottom_up_output_shapes[in_features[-1]].stride - self.out_channels = out_channels - self.min_level = split_name(in_features[0])[1] - - # add the names for top blocks - prefix, last_suffix = split_name(in_features[-1]) - for i in range(num_top_levels): - in_features.append(prefix + str(last_suffix + i + 1)) - self.in_features = in_features - - # generate output features - self._out_features = ["p{}".format(split_name(name)[1]) for name in in_features] - self._out_feature_strides = { - out_name: bottom_up_output_shapes[in_name].stride - for out_name, in_name in zip(self._out_features, in_features) - } - self._out_feature_channels = {k: out_channels for k in self._out_features} - - # build bifpn - self.repeated_bifpn = nn.ModuleList() - for i in range(num_repeats): - if i == 0: - in_channels_list = [ - bottom_up_output_shapes[name].channels for name in in_features - ] - else: - in_channels_list = [ - self._out_feature_channels[name] for name in self._out_features - ] - self.repeated_bifpn.append(SingleBiFPN( - in_channels_list, out_channels, norm - )) - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "p5") to - feature map tensor for each feature level in high to low resolution order. - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["n2", "n3", ..., "n6"]. - """ - bottom_up_features = self.bottom_up(x) - feats = [bottom_up_features[f] for f in self.in_features] - - for bifpn in self.repeated_bifpn: - feats = bifpn(feats) - - return dict(zip(self._out_features, feats)) - - -def _assert_strides_are_log2_contiguous(strides): - """ - Assert that each stride is 2x times its preceding stride, i.e. "contiguous in log2". - """ - for i, stride in enumerate(strides[1:], 1): - assert stride == 2 * strides[i - 1], "Strides {} {} are not log2 contiguous".format( - stride, strides[i - 1] - ) - - -@BACKBONE_REGISTRY.register() -def build_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 2 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - - - -@BACKBONE_REGISTRY.register() -def build_p35_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 0 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_p35_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 0 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - -@BACKBONE_REGISTRY.register() -def build_p37_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 - top_levels = 2 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py deleted file mode 100755 index 9f15f840..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py +++ /dev/null @@ -1,479 +0,0 @@ -import numpy as np -import math -from os.path import join -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn -import torch.utils.model_zoo as model_zoo - -from detectron2.modeling.backbone.resnet import ( - BasicStem, BottleneckBlock, DeformBottleneckBlock) -from detectron2.layers import ( - Conv2d, - DeformConv, - FrozenBatchNorm2d, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from detectron2.modeling.backbone.backbone import Backbone -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.modeling.backbone.fpn import FPN - -__all__ = [ - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", -] - -DCNV1 = False - -HASH = { - 34: 'ba72cf86', - 60: '24839fc4', -} - -def get_model_url(data, name, hash): - return join('http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) - -class BasicBlock(nn.Module): - def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): - super(BasicBlock, self).__init__() - self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn1 = get_norm(norm, planes) - self.relu = nn.ReLU(inplace=True) - self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, - stride=1, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = 
get_norm(norm, planes) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - - out += residual - out = self.relu(out) - - return out - -class Bottleneck(nn.Module): - expansion = 2 - - def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): - super(Bottleneck, self).__init__() - expansion = Bottleneck.expansion - bottle_planes = planes // expansion - self.conv1 = nn.Conv2d(inplanes, bottle_planes, - kernel_size=1, bias=False) - self.bn1 = get_norm(norm, bottle_planes) - self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(norm, bottle_planes) - self.conv3 = nn.Conv2d(bottle_planes, planes, - kernel_size=1, bias=False) - self.bn3 = get_norm(norm, planes) - self.relu = nn.ReLU(inplace=True) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - out = self.relu(out) - - out = self.conv3(out) - out = self.bn3(out) - - out += residual - out = self.relu(out) - - return out - -class Root(nn.Module): - def __init__(self, in_channels, out_channels, kernel_size, residual, norm='BN'): - super(Root, self).__init__() - self.conv = nn.Conv2d( - in_channels, out_channels, 1, - stride=1, bias=False, padding=(kernel_size - 1) // 2) - self.bn = get_norm(norm, out_channels) - self.relu = nn.ReLU(inplace=True) - self.residual = residual - - def forward(self, *x): - children = x - x = self.conv(torch.cat(x, 1)) - x = self.bn(x) - if self.residual: - x += children[0] - x = self.relu(x) - - return x - - -class Tree(nn.Module): - def __init__(self, levels, block, in_channels, out_channels, stride=1, - level_root=False, root_dim=0, root_kernel_size=1, - dilation=1, root_residual=False, norm='BN'): - super(Tree, self).__init__() - if root_dim == 0: - root_dim = 2 * out_channels - if level_root: - root_dim += in_channels - if levels == 1: - self.tree1 = block(in_channels, out_channels, stride, - dilation=dilation, norm=norm) - self.tree2 = block(out_channels, out_channels, 1, - dilation=dilation, norm=norm) - else: - self.tree1 = Tree(levels - 1, block, in_channels, out_channels, - stride, root_dim=0, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual, - norm=norm) - self.tree2 = Tree(levels - 1, block, out_channels, out_channels, - root_dim=root_dim + out_channels, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual, - norm=norm) - if levels == 1: - self.root = Root(root_dim, out_channels, root_kernel_size, - root_residual, norm=norm) - self.level_root = level_root - self.root_dim = root_dim - self.downsample = None - self.project = None - self.levels = levels - if stride > 1: - self.downsample = nn.MaxPool2d(stride, stride=stride) - if in_channels != out_channels: - self.project = nn.Sequential( - nn.Conv2d(in_channels, out_channels, - kernel_size=1, stride=1, bias=False), - get_norm(norm, out_channels) - ) - - def forward(self, x, residual=None, children=None): - children = [] if children is None else children - bottom = self.downsample(x) if self.downsample else x - residual = self.project(bottom) if self.project else bottom - if self.level_root: - children.append(bottom) - x1 = self.tree1(x, 
residual) - if self.levels == 1: - x2 = self.tree2(x1) - x = self.root(x2, x1, *children) - else: - children.append(x1) - x = self.tree2(x1, children=children) - return x - -class DLA(nn.Module): - def __init__(self, num_layers, levels, channels, - block=BasicBlock, residual_root=False, norm='BN'): - """ - Args: - """ - super(DLA, self).__init__() - self.norm = norm - self.channels = channels - self.base_layer = nn.Sequential( - nn.Conv2d(3, channels[0], kernel_size=7, stride=1, - padding=3, bias=False), - get_norm(self.norm, channels[0]), - nn.ReLU(inplace=True)) - self.level0 = self._make_conv_level( - channels[0], channels[0], levels[0]) - self.level1 = self._make_conv_level( - channels[0], channels[1], levels[1], stride=2) - self.level2 = Tree(levels[2], block, channels[1], channels[2], 2, - level_root=False, - root_residual=residual_root, norm=norm) - self.level3 = Tree(levels[3], block, channels[2], channels[3], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.level4 = Tree(levels[4], block, channels[3], channels[4], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.level5 = Tree(levels[5], block, channels[4], channels[5], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.load_pretrained_model( - data='imagenet', name='dla{}'.format(num_layers), - hash=HASH[num_layers]) - - def load_pretrained_model(self, data, name, hash): - model_url = get_model_url(data, name, hash) - model_weights = model_zoo.load_url(model_url) - num_classes = len(model_weights[list(model_weights.keys())[-1]]) - self.fc = nn.Conv2d( - self.channels[-1], num_classes, - kernel_size=1, stride=1, padding=0, bias=True) - print('Loading pretrained') - self.load_state_dict(model_weights, strict=False) - - def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): - modules = [] - for i in range(convs): - modules.extend([ - nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride if i == 0 else 1, - padding=dilation, bias=False, dilation=dilation), - get_norm(self.norm, planes), - nn.ReLU(inplace=True)]) - inplanes = planes - return nn.Sequential(*modules) - - def forward(self, x): - y = [] - x = self.base_layer(x) - for i in range(6): - x = getattr(self, 'level{}'.format(i))(x) - y.append(x) - return y - - -def fill_up_weights(up): - w = up.weight.data - f = math.ceil(w.size(2) / 2) - c = (2 * f - 1 - f % 2) / (2. 
* f) - for i in range(w.size(2)): - for j in range(w.size(3)): - w[0, 0, i, j] = \ - (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) - for c in range(1, w.size(0)): - w[c, 0, :, :] = w[0, 0, :, :] - - -class _DeformConv(nn.Module): - def __init__(self, chi, cho, norm='BN'): - super(_DeformConv, self).__init__() - self.actf = nn.Sequential( - get_norm(norm, cho), - nn.ReLU(inplace=True) - ) - if DCNV1: - self.offset = Conv2d( - chi, 18, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = DeformConv( - chi, cho, kernel_size=(3,3), stride=1, padding=1, - dilation=1, deformable_groups=1) - else: - self.offset = Conv2d( - chi, 27, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = ModulatedDeformConv( - chi, cho, kernel_size=3, stride=1, padding=1, - dilation=1, deformable_groups=1) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - - def forward(self, x): - if DCNV1: - offset = self.offset(x) - x = self.conv(x, offset) - else: - offset_mask = self.offset(x) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - x = self.conv(x, offset, mask) - x = self.actf(x) - return x - - -class IDAUp(nn.Module): - def __init__(self, o, channels, up_f, norm='BN'): - super(IDAUp, self).__init__() - for i in range(1, len(channels)): - c = channels[i] - f = int(up_f[i]) - proj = _DeformConv(c, o, norm=norm) - node = _DeformConv(o, o, norm=norm) - - up = nn.ConvTranspose2d(o, o, f * 2, stride=f, - padding=f // 2, output_padding=0, - groups=o, bias=False) - fill_up_weights(up) - - setattr(self, 'proj_' + str(i), proj) - setattr(self, 'up_' + str(i), up) - setattr(self, 'node_' + str(i), node) - - - def forward(self, layers, startp, endp): - for i in range(startp + 1, endp): - upsample = getattr(self, 'up_' + str(i - startp)) - project = getattr(self, 'proj_' + str(i - startp)) - layers[i] = upsample(project(layers[i])) - node = getattr(self, 'node_' + str(i - startp)) - layers[i] = node(layers[i] + layers[i - 1]) - - -class DLAUp(nn.Module): - def __init__(self, startp, channels, scales, in_channels=None, norm='BN'): - super(DLAUp, self).__init__() - self.startp = startp - if in_channels is None: - in_channels = channels - self.channels = channels - channels = list(channels) - scales = np.array(scales, dtype=int) - for i in range(len(channels) - 1): - j = -i - 2 - setattr(self, 'ida_{}'.format(i), - IDAUp(channels[j], in_channels[j:], - scales[j:] // scales[j], norm=norm)) - scales[j + 1:] = scales[j] - in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] - - def forward(self, layers): - out = [layers[-1]] # start with 32 - for i in range(len(layers) - self.startp - 1): - ida = getattr(self, 'ida_{}'.format(i)) - ida(layers, len(layers) -i - 2, len(layers)) - out.insert(0, layers[-1]) - return out - -DLA_CONFIGS = { - 34: ([1, 1, 1, 2, 2, 1], [16, 32, 64, 128, 256, 512], BasicBlock), - 60: ([1, 1, 1, 2, 3, 1], [16, 32, 128, 256, 512, 1024], Bottleneck) -} - - -class DLASeg(Backbone): - def __init__(self, num_layers, out_features, use_dla_up=True, - ms_output=False, norm='BN'): - super(DLASeg, self).__init__() - # depth = 34 - levels, channels, Block = DLA_CONFIGS[num_layers] - self.base = DLA(num_layers=num_layers, - levels=levels, channels=channels, block=Block, norm=norm) - down_ratio = 4 - self.first_level = int(np.log2(down_ratio)) - self.ms_output = ms_output - self.last_level = 5 if not self.ms_output else 6 - channels = self.base.channels - 
scales = [2 ** i for i in range(len(channels[self.first_level:]))] - self.use_dla_up = use_dla_up - if self.use_dla_up: - self.dla_up = DLAUp( - self.first_level, channels[self.first_level:], scales, - norm=norm) - out_channel = channels[self.first_level] - if not self.ms_output: # stride 4 DLA - self.ida_up = IDAUp( - out_channel, channels[self.first_level:self.last_level], - [2 ** i for i in range(self.last_level - self.first_level)], - norm=norm) - self._out_features = out_features - self._out_feature_channels = { - 'dla{}'.format(i): channels[i] for i in range(6)} - self._out_feature_strides = { - 'dla{}'.format(i): 2 ** i for i in range(6)} - self._size_divisibility = 32 - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - x = self.base(x) - if self.use_dla_up: - x = self.dla_up(x) - if not self.ms_output: # stride 4 dla - y = [] - for i in range(self.last_level - self.first_level): - y.append(x[i].clone()) - self.ida_up(y, 0, len(y)) - ret = {} - for i in range(self.last_level - self.first_level): - out_feature = 'dla{}'.format(i) - if out_feature in self._out_features: - ret[out_feature] = y[i] - else: - ret = {} - st = self.first_level if self.use_dla_up else 0 - for i in range(self.last_level - st): - out_feature = 'dla{}'.format(i + st) - if out_feature in self._out_features: - ret[out_feature] = x[i] - - return ret - - -@BACKBONE_REGISTRY.register() -def build_dla_backbone(cfg, input_shape): - """ - Create a ResNet instance from config. - - Returns: - ResNet: a :class:`ResNet` instance. - """ - return DLASeg( - out_features=cfg.MODEL.DLA.OUT_FEATURES, - num_layers=cfg.MODEL.DLA.NUM_LAYERS, - use_dla_up=cfg.MODEL.DLA.USE_DLA_UP, - ms_output=cfg.MODEL.DLA.MS_OUTPUT, - norm=cfg.MODEL.DLA.NORM) - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels): - super().__init__() - self.num_levels = 2 - self.in_feature = "dla5" - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - -@BACKBONE_REGISTRY.register() -def build_retinanet_dla_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
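
`fill_up_weights` above (and again in dlafpn.py below) initializes a depthwise transposed convolution to a fixed separable bilinear kernel, so IDAUp's upsampling starts out as plain bilinear interpolation. A minimal self-contained sketch; `bilinear_transpose_conv` is our name:

```python
import math
import torch
from torch import nn

def bilinear_transpose_conv(channels, stride):
    """Depthwise ConvTranspose2d initialized like fill_up_weights above."""
    up = nn.ConvTranspose2d(channels, channels, kernel_size=stride * 2,
                            stride=stride, padding=stride // 2,
                            groups=channels, bias=False)
    w = up.weight.data
    f = math.ceil(w.size(2) / 2)
    c = (2 * f - 1 - f % 2) / (2.0 * f)
    for i in range(w.size(2)):
        for j in range(w.size(3)):
            # Separable bilinear kernel: weight falls off linearly from center.
            w[0, 0, i, j] = (1 - abs(i / f - c)) * (1 - abs(j / f - c))
    w[1:, 0] = w[0, 0]   # share the kernel across all channels
    return up

x = torch.arange(16.0).reshape(1, 1, 4, 4)
print(bilinear_transpose_conv(1, 2)(x).shape)   # torch.Size([1, 1, 8, 8])
```
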
- """ - bottom_up = build_dla_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_p6p7 = bottom_up.output_shape()['dla5'].channels - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_p6p7, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py deleted file mode 100755 index 2a33c66b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py +++ /dev/null @@ -1,493 +0,0 @@ -#!/usr/bin/env python -# -*- coding: utf-8 -*- - -# this file is from https://github.com/ucbdrive/dla/blob/master/dla.py. - -import math -from os.path import join -import numpy as np - -import torch -from torch import nn -import torch.utils.model_zoo as model_zoo -import torch.nn.functional as F -import fvcore.nn.weight_init as weight_init - -from detectron2.modeling.backbone import FPN -from detectron2.layers import ShapeSpec, ModulatedDeformConv, Conv2d -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.layers.batch_norm import get_norm -from detectron2.modeling.backbone import Backbone - -WEB_ROOT = 'http://dl.yf.io/dla/models' - - -def get_model_url(data, name, hash): - return join( - 'http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) - - -def conv3x3(in_planes, out_planes, stride=1): - "3x3 convolution with padding" - return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, - padding=1, bias=False) - - -class BasicBlock(nn.Module): - def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): - super(BasicBlock, self).__init__() - self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn1 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.relu = nn.ReLU(inplace=True) - self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, - stride=1, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - - out += residual - out = self.relu(out) - - return out - - -class Bottleneck(nn.Module): - expansion = 2 - - def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): - super(Bottleneck, self).__init__() - expansion = Bottleneck.expansion - bottle_planes = planes // expansion - self.conv1 = nn.Conv2d(inplanes, bottle_planes, - kernel_size=1, bias=False) - self.bn1 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) - self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) - self.conv3 = nn.Conv2d(bottle_planes, planes, - kernel_size=1, bias=False) - self.bn3 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.relu = nn.ReLU(inplace=True) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - 
out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - out = self.relu(out) - - out = self.conv3(out) - out = self.bn3(out) - - out += residual - out = self.relu(out) - - return out - - -class Root(nn.Module): - def __init__(self, cfg, in_channels, out_channels, kernel_size, residual): - super(Root, self).__init__() - self.conv = nn.Conv2d( - in_channels, out_channels, kernel_size, - stride=1, bias=False, padding=(kernel_size - 1) // 2) - self.bn = get_norm(cfg.MODEL.DLA.NORM, out_channels) - self.relu = nn.ReLU(inplace=True) - self.residual = residual - - def forward(self, *x): - children = x - x = self.conv(torch.cat(x, 1)) - x = self.bn(x) - if self.residual: - x += children[0] - x = self.relu(x) - - return x - - -class Tree(nn.Module): - def __init__(self, cfg, levels, block, in_channels, out_channels, stride=1, - level_root=False, root_dim=0, root_kernel_size=1, - dilation=1, root_residual=False): - super(Tree, self).__init__() - if root_dim == 0: - root_dim = 2 * out_channels - if level_root: - root_dim += in_channels - if levels == 1: - self.tree1 = block(cfg, in_channels, out_channels, stride, - dilation=dilation) - self.tree2 = block(cfg, out_channels, out_channels, 1, - dilation=dilation) - else: - self.tree1 = Tree(cfg, levels - 1, block, in_channels, out_channels, - stride, root_dim=0, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual) - self.tree2 = Tree(cfg, levels - 1, block, out_channels, out_channels, - root_dim=root_dim + out_channels, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual) - if levels == 1: - self.root = Root(cfg, root_dim, out_channels, root_kernel_size, - root_residual) - self.level_root = level_root - self.root_dim = root_dim - self.downsample = None - self.project = None - self.levels = levels - if stride > 1: - self.downsample = nn.MaxPool2d(stride, stride=stride) - if in_channels != out_channels: - self.project = nn.Sequential( - nn.Conv2d(in_channels, out_channels, - kernel_size=1, stride=1, bias=False), - get_norm(cfg.MODEL.DLA.NORM, out_channels) - ) - - def forward(self, x, residual=None, children=None): - if self.training and residual is not None: - x = x + residual.sum() * 0.0 - children = [] if children is None else children - bottom = self.downsample(x) if self.downsample else x - residual = self.project(bottom) if self.project else bottom - if self.level_root: - children.append(bottom) - x1 = self.tree1(x, residual) - if self.levels == 1: - x2 = self.tree2(x1) - x = self.root(x2, x1, *children) - else: - children.append(x1) - x = self.tree2(x1, children=children) - return x - - -class DLA(Backbone): - def __init__(self, cfg, levels, channels, block=BasicBlock, residual_root=False): - super(DLA, self).__init__() - self.cfg = cfg - self.channels = channels - - self._out_features = ["dla{}".format(i) for i in range(6)] - self._out_feature_channels = {k: channels[i] for i, k in enumerate(self._out_features)} - self._out_feature_strides = {k: 2 ** i for i, k in enumerate(self._out_features)} - - self.base_layer = nn.Sequential( - nn.Conv2d(3, channels[0], kernel_size=7, stride=1, - padding=3, bias=False), - get_norm(cfg.MODEL.DLA.NORM, channels[0]), - nn.ReLU(inplace=True)) - self.level0 = self._make_conv_level( - channels[0], channels[0], levels[0]) - self.level1 = self._make_conv_level( - channels[0], channels[1], levels[1], stride=2) - self.level2 = Tree(cfg, levels[2], block, channels[1], channels[2], 2, - level_root=False, - root_residual=residual_root) - 
self.level3 = Tree(cfg, levels[3], block, channels[2], channels[3], 2, - level_root=True, root_residual=residual_root) - self.level4 = Tree(cfg, levels[4], block, channels[3], channels[4], 2, - level_root=True, root_residual=residual_root) - self.level5 = Tree(cfg, levels[5], block, channels[4], channels[5], 2, - level_root=True, root_residual=residual_root) - - for m in self.modules(): - if isinstance(m, nn.Conv2d): - n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels - m.weight.data.normal_(0, math.sqrt(2. / n)) - - self.load_pretrained_model( - data='imagenet', name='dla34', hash='ba72cf86') - - def load_pretrained_model(self, data, name, hash): - model_url = get_model_url(data, name, hash) - model_weights = model_zoo.load_url(model_url) - del model_weights['fc.weight'] - del model_weights['fc.bias'] - print('Loading pretrained DLA!') - self.load_state_dict(model_weights, strict=True) - - def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): - modules = [] - for i in range(convs): - modules.extend([ - nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride if i == 0 else 1, - padding=dilation, bias=False, dilation=dilation), - get_norm(self.cfg.MODEL.DLA.NORM, planes), - nn.ReLU(inplace=True)]) - inplanes = planes - return nn.Sequential(*modules) - - def forward(self, x): - y = {} - x = self.base_layer(x) - for i in range(6): - name = 'level{}'.format(i) - x = getattr(self, name)(x) - y['dla{}'.format(i)] = x - return y - - -def fill_up_weights(up): - w = up.weight.data - f = math.ceil(w.size(2) / 2) - c = (2 * f - 1 - f % 2) / (2. * f) - for i in range(w.size(2)): - for j in range(w.size(3)): - w[0, 0, i, j] = \ - (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) - for c in range(1, w.size(0)): - w[c, 0, :, :] = w[0, 0, :, :] - - -class Conv(nn.Module): - def __init__(self, chi, cho, norm): - super(Conv, self).__init__() - self.conv = nn.Sequential( - nn.Conv2d(chi, cho, kernel_size=1, stride=1, bias=False), - get_norm(norm, cho), - nn.ReLU(inplace=True)) - - def forward(self, x): - return self.conv(x) - - -class DeformConv(nn.Module): - def __init__(self, chi, cho, norm): - super(DeformConv, self).__init__() - self.actf = nn.Sequential( - get_norm(norm, cho), - nn.ReLU(inplace=True) - ) - self.offset = Conv2d( - chi, 27, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = ModulatedDeformConv( - chi, cho, kernel_size=3, stride=1, padding=1, - dilation=1, deformable_groups=1) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - - def forward(self, x): - offset_mask = self.offset(x) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - x = self.conv(x, offset, mask) - x = self.actf(x) - return x - - -class IDAUp(nn.Module): - def __init__(self, o, channels, up_f, norm='FrozenBN', node_type=Conv): - super(IDAUp, self).__init__() - for i in range(1, len(channels)): - c = channels[i] - f = int(up_f[i]) - proj = node_type(c, o, norm) - node = node_type(o, o, norm) - - up = nn.ConvTranspose2d(o, o, f * 2, stride=f, - padding=f // 2, output_padding=0, - groups=o, bias=False) - fill_up_weights(up) - - setattr(self, 'proj_' + str(i), proj) - setattr(self, 'up_' + str(i), up) - setattr(self, 'node_' + str(i), node) - - - def forward(self, layers, startp, endp): - for i in range(startp + 1, endp): - upsample = getattr(self, 'up_' + str(i - startp)) - project = getattr(self, 'proj_' + str(i - startp)) - layers[i] = 
upsample(project(layers[i])) - node = getattr(self, 'node_' + str(i - startp)) - layers[i] = node(layers[i] + layers[i - 1]) - - -DLAUP_NODE_MAP = { - 'conv': Conv, - 'dcn': DeformConv, -} - -class DLAUP(Backbone): - def __init__(self, bottom_up, in_features, norm, dlaup_node='conv'): - super(DLAUP, self).__init__() - assert isinstance(bottom_up, Backbone) - self.bottom_up = bottom_up - input_shapes = bottom_up.output_shape() - in_strides = [input_shapes[f].stride for f in in_features] - in_channels = [input_shapes[f].channels for f in in_features] - in_levels = [int(math.log2(input_shapes[f].stride)) for f in in_features] - self.in_features = in_features - out_features = ['dlaup{}'.format(l) for l in in_levels] - self._out_features = out_features - self._out_feature_channels = { - 'dlaup{}'.format(l): in_channels[i] for i, l in enumerate(in_levels)} - self._out_feature_strides = { - 'dlaup{}'.format(l): 2 ** l for l in in_levels} - - print('self._out_features', self._out_features) - print('self._out_feature_channels', self._out_feature_channels) - print('self._out_feature_strides', self._out_feature_strides) - self._size_divisibility = 32 - - node_type = DLAUP_NODE_MAP[dlaup_node] - - self.startp = int(math.log2(in_strides[0])) - self.channels = in_channels - channels = list(in_channels) - scales = np.array([2 ** i for i in range(len(out_features))], dtype=int) - for i in range(len(channels) - 1): - j = -i - 2 - setattr(self, 'ida_{}'.format(i), - IDAUp(channels[j], in_channels[j:], - scales[j:] // scales[j], - norm=norm, - node_type=node_type)) - scales[j + 1:] = scales[j] - in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - bottom_up_features = self.bottom_up(x) - layers = [bottom_up_features[f] for f in self.in_features] - out = [layers[-1]] # start with 32 - for i in range(len(layers) - 1): - ida = getattr(self, 'ida_{}'.format(i)) - ida(layers, len(layers) - i - 2, len(layers)) - out.insert(0, layers[-1]) - ret = {} - for k, v in zip(self._out_features, out): - ret[k] = v - # import pdb; pdb.set_trace() - return ret - - -def dla34(cfg, pretrained=None): # DLA-34 - model = DLA(cfg, [1, 1, 1, 2, 2, 1], - [16, 32, 64, 128, 256, 512], - block=BasicBlock) - return model - - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels): - super().__init__() - self.num_levels = 2 - self.in_feature = "dla5" - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - - -@BACKBONE_REGISTRY.register() -def build_dla_fpn3_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=None, - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - - return backbone - -@BACKBONE_REGISTRY.register() -def build_dla_fpn5_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_top = bottom_up.output_shape()['dla5'].channels - - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_top, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - - return backbone - - -@BACKBONE_REGISTRY.register() -def build_dlaup_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - - backbone = DLAUP( - bottom_up=bottom_up, - in_features=cfg.MODEL.DLA.DLAUP_IN_FEATURES, - norm=cfg.MODEL.DLA.NORM, - dlaup_node=cfg.MODEL.DLA.DLAUP_NODE, - ) - - return backbone diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py deleted file mode 100755 index 1d0d40ad..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py +++ /dev/null @@ -1,802 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# This file is modified from https://github.com/Res2Net/Res2Net-detectron2/blob/master/detectron2/modeling/backbone/resnet.py -# The original file is under Apache-2.0 License -import numpy as np -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import ( - CNNBlockBase, - Conv2d, - DeformConv, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from detectron2.modeling.backbone import Backbone -from detectron2.modeling.backbone.fpn import FPN -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from .fpn_p5 import LastLevelP6P7_P5 -from .bifpn import BiFPN - -__all__ = [ - "ResNetBlockBase", - "BasicBlock", - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", - "ResNet", - "make_stage", - "build_res2net_backbone", -] - - -ResNetBlockBase = CNNBlockBase -""" -Alias for backward compatibiltiy. -""" - - -class BasicBlock(CNNBlockBase): - """ - The basic residual block for ResNet-18 and ResNet-34, with two 3x3 conv layers - and a projection shortcut if needed. - """ - - def __init__(self, in_channels, out_channels, *, stride=1, norm="BN"): - """ - Args: - in_channels (int): Number of input channels. - out_channels (int): Number of output channels. - stride (int): Stride for the first conv. 
- norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=stride, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - self.conv2 = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - out = self.conv2(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BottleneckBlock(CNNBlockBase): - """ - The standard bottle2neck residual block used by Res2Net-50, 101 and 152. - """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - basewidth=26, - scale=4, - ): - """ - Args: - bottleneck_channels (int): number of output channels for the 3x3 - "bottleneck" conv layers. - num_groups (int): number of groups for the 3x3 conv layer. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - stride_in_1x1 (bool): when stride>1, whether to put stride in the - first 1x1 convolution or the bottleneck 3x3 convolution. - dilation (int): the dilation rate of the 3x3 conv layer. 
- """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = nn.Sequential( - nn.AvgPool2d(kernel_size=stride, stride=stride, - ceil_mode=True, count_include_pad=False), - Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - ) - else: - self.shortcut = None - - # The original MSRA ResNet models have stride in the first 1x1 conv - # The subsequent fb.torch.resnet and Caffe2 ResNe[X]t implementations have - # stride in the 3x3 conv - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - width = bottleneck_channels//scale - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - if scale == 1: - self.nums = 1 - else: - self.nums = scale -1 - if self.in_channels!=self.out_channels and stride_3x3!=2: - self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) - - convs = [] - bns = [] - for i in range(self.nums): - convs.append(nn.Conv2d( - width, - width, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - )) - bns.append(get_norm(norm, width)) - self.convs = nn.ModuleList(convs) - self.bns = nn.ModuleList(bns) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - self.scale = scale - self.width = width - self.in_channels = in_channels - self.out_channels = out_channels - self.stride_3x3 = stride_3x3 - for layer in [self.conv1, self.conv3]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - if self.shortcut is not None: - for layer in self.shortcut.modules(): - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - for layer in self.convs: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - # Zero-initialize the last normalization in each residual branch, - # so that at the beginning, the residual branch starts with zeros, - # and each residual block behaves like an identity. - # See Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "For BN layers, the learnable scaling coefficient γ is initialized - # to be 1, except for each residual block's last BN - # where γ is initialized to be 0." - - # nn.init.constant_(self.conv3.norm.weight, 0) - # TODO this somehow hurts performance when training GN models from scratch. - # Add it as an option when we need to use this code to train a backbone. - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - spx = torch.split(out, self.width, 1) - for i in range(self.nums): - if i==0 or self.in_channels!=self.out_channels: - sp = spx[i] - else: - sp = sp + spx[i] - sp = self.convs[i](sp) - sp = F.relu_(self.bns[i](sp)) - if i==0: - out = sp - else: - out = torch.cat((out, sp), 1) - if self.scale!=1 and self.stride_3x3==1: - out = torch.cat((out, spx[self.nums]), 1) - elif self.scale != 1 and self.stride_3x3==2: - out = torch.cat((out, self.pool(spx[self.nums])), 1) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class DeformBottleneckBlock(ResNetBlockBase): - """ - Not implemented for res2net yet. - Similar to :class:`BottleneckBlock`, but with deformable conv in the 3x3 convolution. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - deform_modulated=False, - deform_num_groups=1, - basewidth=26, - scale=4, - ): - super().__init__(in_channels, out_channels, stride) - self.deform_modulated = deform_modulated - - if in_channels != out_channels: - # self.shortcut = Conv2d( - # in_channels, - # out_channels, - # kernel_size=1, - # stride=stride, - # bias=False, - # norm=get_norm(norm, out_channels), - # ) - self.shortcut = nn.Sequential( - nn.AvgPool2d(kernel_size=stride, stride=stride, - ceil_mode=True, count_include_pad=False), - Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - ) - else: - self.shortcut = None - - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - width = bottleneck_channels//scale - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - if scale == 1: - self.nums = 1 - else: - self.nums = scale -1 - if self.in_channels!=self.out_channels and stride_3x3!=2: - self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) - - if deform_modulated: - deform_conv_op = ModulatedDeformConv - # offset channels are 2 or 3 (if with modulated) * kernel_size * kernel_size - offset_channels = 27 - else: - deform_conv_op = DeformConv - offset_channels = 18 - - # self.conv2_offset = Conv2d( - # bottleneck_channels, - # offset_channels * deform_num_groups, - # kernel_size=3, - # stride=stride_3x3, - # padding=1 * dilation, - # dilation=dilation, - # ) - # self.conv2 = deform_conv_op( - # bottleneck_channels, - # bottleneck_channels, - # kernel_size=3, - # stride=stride_3x3, - # padding=1 * dilation, - # bias=False, - # groups=num_groups, - # dilation=dilation, - # deformable_groups=deform_num_groups, - # norm=get_norm(norm, bottleneck_channels), - # ) - - conv2_offsets = [] - convs = [] - bns = [] - for i in range(self.nums): - conv2_offsets.append(Conv2d( - width, - offset_channels * deform_num_groups, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - )) - convs.append(deform_conv_op( - width, - width, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - deformable_groups=deform_num_groups, - )) - bns.append(get_norm(norm, width)) - self.conv2_offsets = nn.ModuleList(conv2_offsets) - self.convs = nn.ModuleList(convs) - self.bns = nn.ModuleList(bns) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - self.scale = scale - self.width = width - self.in_channels = in_channels - self.out_channels = out_channels - self.stride_3x3 = stride_3x3 - # for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - # if layer is not None: # shortcut can be None - # weight_init.c2_msra_fill(layer) - - # nn.init.constant_(self.conv2_offset.weight, 0) - # nn.init.constant_(self.conv2_offset.bias, 0) - for layer in [self.conv1, self.conv3]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - if self.shortcut is not None: - for layer in self.shortcut.modules(): - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - for layer in self.convs: - if layer is not None: # shortcut can be None - 
weight_init.c2_msra_fill(layer) - - for layer in self.conv2_offsets: - if layer.weight is not None: - nn.init.constant_(layer.weight, 0) - if layer.bias is not None: - nn.init.constant_(layer.bias, 0) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - # if self.deform_modulated: - # offset_mask = self.conv2_offset(out) - # offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - # offset = torch.cat((offset_x, offset_y), dim=1) - # mask = mask.sigmoid() - # out = self.conv2(out, offset, mask) - # else: - # offset = self.conv2_offset(out) - # out = self.conv2(out, offset) - # out = F.relu_(out) - - spx = torch.split(out, self.width, 1) - for i in range(self.nums): - if i==0 or self.in_channels!=self.out_channels: - sp = spx[i].contiguous() - else: - sp = sp + spx[i].contiguous() - - # sp = self.convs[i](sp) - if self.deform_modulated: - offset_mask = self.conv2_offsets[i](sp) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - sp = self.convs[i](sp, offset, mask) - else: - offset = self.conv2_offsets[i](sp) - sp = self.convs[i](sp, offset) - sp = F.relu_(self.bns[i](sp)) - if i==0: - out = sp - else: - out = torch.cat((out, sp), 1) - if self.scale!=1 and self.stride_3x3==1: - out = torch.cat((out, spx[self.nums]), 1) - elif self.scale != 1 and self.stride_3x3==2: - out = torch.cat((out, self.pool(spx[self.nums])), 1) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -def make_stage(block_class, num_blocks, first_stride, *, in_channels, out_channels, **kwargs): - """ - Create a list of blocks just like those in a ResNet stage. - Args: - block_class (type): a subclass of ResNetBlockBase - num_blocks (int): - first_stride (int): the stride of the first block. The other blocks will have stride=1. - in_channels (int): input channels of the entire stage. - out_channels (int): output channels of **every block** in the stage. - kwargs: other arguments passed to the constructor of every block. - Returns: - list[nn.Module]: a list of block module. - """ - assert "stride" not in kwargs, "Stride of blocks in make_stage cannot be changed." - blocks = [] - for i in range(num_blocks): - blocks.append( - block_class( - in_channels=in_channels, - out_channels=out_channels, - stride=first_stride if i == 0 else 1, - **kwargs, - ) - ) - in_channels = out_channels - return blocks - - -class BasicStem(CNNBlockBase): - """ - The standard ResNet stem (layers before the first residual block). - """ - - def __init__(self, in_channels=3, out_channels=64, norm="BN"): - """ - Args: - norm (str or callable): norm after the first conv layer. - See :func:`layers.get_norm` for supported format. 
- """ - super().__init__(in_channels, out_channels, 4) - self.in_channels = in_channels - self.conv1 = nn.Sequential( - Conv2d( - in_channels, - 32, - kernel_size=3, - stride=2, - padding=1, - bias=False, - ), - get_norm(norm, 32), - nn.ReLU(inplace=True), - Conv2d( - 32, - 32, - kernel_size=3, - stride=1, - padding=1, - bias=False, - ), - get_norm(norm, 32), - nn.ReLU(inplace=True), - Conv2d( - 32, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - ), - ) - self.bn1 = get_norm(norm, out_channels) - - for layer in self.conv1: - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - def forward(self, x): - x = self.conv1(x) - x = self.bn1(x) - x = F.relu_(x) - x = F.max_pool2d(x, kernel_size=3, stride=2, padding=1) - return x - - -class ResNet(Backbone): - def __init__(self, stem, stages, num_classes=None, out_features=None): - """ - Args: - stem (nn.Module): a stem module - stages (list[list[CNNBlockBase]]): several (typically 4) stages, - each contains multiple :class:`CNNBlockBase`. - num_classes (None or int): if None, will not perform classification. - Otherwise, will create a linear layer. - out_features (list[str]): name of the layers whose outputs should - be returned in forward. Can be anything in "stem", "linear", or "res2" ... - If None, will return the output of the last layer. - """ - super(ResNet, self).__init__() - self.stem = stem - self.num_classes = num_classes - - current_stride = self.stem.stride - self._out_feature_strides = {"stem": current_stride} - self._out_feature_channels = {"stem": self.stem.out_channels} - - self.stages_and_names = [] - for i, blocks in enumerate(stages): - assert len(blocks) > 0, len(blocks) - for block in blocks: - assert isinstance(block, CNNBlockBase), block - - name = "res" + str(i + 2) - stage = nn.Sequential(*blocks) - - self.add_module(name, stage) - self.stages_and_names.append((stage, name)) - - self._out_feature_strides[name] = current_stride = int( - current_stride * np.prod([k.stride for k in blocks]) - ) - self._out_feature_channels[name] = curr_channels = blocks[-1].out_channels - - if num_classes is not None: - self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) - self.linear = nn.Linear(curr_channels, num_classes) - - # Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "The 1000-way fully-connected layer is initialized by - # drawing weights from a zero-mean Gaussian with standard deviation of 0.01." - nn.init.normal_(self.linear.weight, std=0.01) - name = "linear" - - if out_features is None: - out_features = [name] - self._out_features = out_features - assert len(self._out_features) - children = [x[0] for x in self.named_children()] - for out_feature in self._out_features: - assert out_feature in children, "Available children: {}".format(", ".join(children)) - - def forward(self, x): - outputs = {} - x = self.stem(x) - if "stem" in self._out_features: - outputs["stem"] = x - for stage, name in self.stages_and_names: - x = stage(x) - if name in self._out_features: - outputs[name] = x - if self.num_classes is not None: - x = self.avgpool(x) - x = torch.flatten(x, 1) - x = self.linear(x) - if "linear" in self._out_features: - outputs["linear"] = x - return outputs - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - def freeze(self, freeze_at=0): - """ - Freeze the first several stages of the ResNet. Commonly used in - fine-tuning. 
- Args: - freeze_at (int): number of stem and stages to freeze. - `1` means freezing the stem. `2` means freezing the stem and - the first stage, etc. - Returns: - nn.Module: this ResNet itself - """ - if freeze_at >= 1: - self.stem.freeze() - for idx, (stage, _) in enumerate(self.stages_and_names, start=2): - if freeze_at >= idx: - for block in stage.children(): - block.freeze() - return self - - -@BACKBONE_REGISTRY.register() -def build_res2net_backbone(cfg, input_shape): - """ - Create a Res2Net instance from config. - Returns: - ResNet: a :class:`ResNet` instance. - """ - # need registration of new blocks/stems? - norm = cfg.MODEL.RESNETS.NORM - stem = BasicStem( - in_channels=input_shape.channels, - out_channels=cfg.MODEL.RESNETS.STEM_OUT_CHANNELS, - norm=norm, - ) - - # fmt: off - freeze_at = cfg.MODEL.BACKBONE.FREEZE_AT - out_features = cfg.MODEL.RESNETS.OUT_FEATURES - depth = cfg.MODEL.RESNETS.DEPTH - num_groups = cfg.MODEL.RESNETS.NUM_GROUPS - width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP - scale = 4 - bottleneck_channels = num_groups * width_per_group * scale - in_channels = cfg.MODEL.RESNETS.STEM_OUT_CHANNELS - out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS - stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 - res5_dilation = cfg.MODEL.RESNETS.RES5_DILATION - deform_on_per_stage = cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE - deform_modulated = cfg.MODEL.RESNETS.DEFORM_MODULATED - deform_num_groups = cfg.MODEL.RESNETS.DEFORM_NUM_GROUPS - # fmt: on - assert res5_dilation in {1, 2}, "res5_dilation cannot be {}.".format(res5_dilation) - - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - - if depth in [18, 34]: - assert out_channels == 64, "Must set MODEL.RESNETS.RES2_OUT_CHANNELS = 64 for R18/R34" - assert not any( - deform_on_per_stage - ), "MODEL.RESNETS.DEFORM_ON_PER_STAGE unsupported for R18/R34" - assert res5_dilation == 1, "Must set MODEL.RESNETS.RES5_DILATION = 1 for R18/R34" - assert num_groups == 1, "Must set MODEL.RESNETS.NUM_GROUPS = 1 for R18/R34" - - stages = [] - - # Avoid creating variables without gradients - # It consumes extra memory and may cause allreduce to fail - out_stage_idx = [{"res2": 2, "res3": 3, "res4": 4, "res5": 5}[f] for f in out_features] - max_stage_idx = max(out_stage_idx) - for idx, stage_idx in enumerate(range(2, max_stage_idx + 1)): - dilation = res5_dilation if stage_idx == 5 else 1 - first_stride = 1 if idx == 0 or (stage_idx == 5 and dilation == 2) else 2 - stage_kargs = { - "num_blocks": num_blocks_per_stage[idx], - "first_stride": first_stride, - "in_channels": in_channels, - "out_channels": out_channels, - "norm": norm, - } - # Use BasicBlock for R18 and R34. 
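-        # Note on this loop (values assume detectron2's defaults, e.g.
-        # RES2_OUT_CHANNELS = 256 for depth 50): each iteration builds one
-        # residual stage (res2..res5) from the block class chosen below.
-        # After a stage is appended, `in_channels` becomes that stage's
-        # output width and both `out_channels` and `bottleneck_channels`
-        # double, so the stages output 256, 512, 1024, and 2048 channels.
-        # Inside each bottle2neck block, `bottleneck_channels` is further
-        # split into `scale` chunks of `bottleneck_channels // scale`
-        # channels that are convolved hierarchically and re-concatenated.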
- if depth in [18, 34]: - stage_kargs["block_class"] = BasicBlock - else: - stage_kargs["bottleneck_channels"] = bottleneck_channels - stage_kargs["stride_in_1x1"] = stride_in_1x1 - stage_kargs["dilation"] = dilation - stage_kargs["num_groups"] = num_groups - stage_kargs["scale"] = scale - - if deform_on_per_stage[idx]: - stage_kargs["block_class"] = DeformBottleneckBlock - stage_kargs["deform_modulated"] = deform_modulated - stage_kargs["deform_num_groups"] = deform_num_groups - else: - stage_kargs["block_class"] = BottleneckBlock - blocks = make_stage(**stage_kargs) - in_channels = out_channels - out_channels *= 2 - bottleneck_channels *= 2 - stages.append(blocks) - return ResNet(stem, stages, out_features=out_features).freeze(freeze_at) - - -@BACKBONE_REGISTRY.register() -def build_p67_res2net_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_res2net_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_res2net_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_res2net_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py deleted file mode 100755 index 0a4437fb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py +++ /dev/null @@ -1,283 +0,0 @@ -import cv2 -import numpy as np -import torch -import torch.nn.functional as F - -COLORS = ((np.random.rand(1300, 3) * 0.4 + 0.6) * 255).astype( - np.uint8).reshape(1300, 1, 1, 3) - -def _get_color_image(heatmap): - heatmap = heatmap.reshape( - heatmap.shape[0], heatmap.shape[1], heatmap.shape[2], 1) - if heatmap.shape[0] == 1: - color_map = (heatmap * np.ones((1, 1, 1, 3), np.uint8) * 255).max( - axis=0).astype(np.uint8) # H, W, 3 - else: - color_map = (heatmap * COLORS[:heatmap.shape[0]]).max(axis=0).astype(np.uint8) # H, W, 3 - - return color_map - -def _blend_image(image, color_map, a=0.7): - color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) - ret = np.clip(image * (1 - a) + color_map * a, 0, 255).astype(np.uint8) - return ret - -def _blend_image_heatmaps(image, color_maps, a=0.7): - merges = np.zeros((image.shape[0], image.shape[1], 3), np.float32) - for color_map in color_maps: - color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) - merges = np.maximum(merges, color_map) - ret = np.clip(image * (1 - a) + merges * a, 0, 
255).astype(np.uint8) - return ret - -def _decompose_level(x, shapes_per_level, N): - ''' - x: LNHiWi x C - ''' - x = x.view(x.shape[0], -1) - ret = [] - st = 0 - for l in range(len(shapes_per_level)): - ret.append([]) - h = shapes_per_level[l][0].int().item() - w = shapes_per_level[l][1].int().item() - for i in range(N): - ret[l].append(x[st + h * w * i:st + h * w * (i + 1)].view( - h, w, -1).permute(2, 0, 1)) - st += h * w * N - return ret - -def _imagelist_to_tensor(images): - images = [x for x in images] - image_sizes = [x.shape[-2:] for x in images] - h = max([size[0] for size in image_sizes]) - w = max([size[1] for size in image_sizes]) - S = 32 - h, w = ((h - 1) // S + 1) * S, ((w - 1) // S + 1) * S - images = [F.pad(x, (0, w - x.shape[2], 0, h - x.shape[1], 0, 0)) \ - for x in images] - images = torch.stack(images) - return images - - -def _ind2il(ind, shapes_per_level, N): - r = ind - l = 0 - S = 0 - while r - S >= N * shapes_per_level[l][0] * shapes_per_level[l][1]: - S += N * shapes_per_level[l][0] * shapes_per_level[l][1] - l += 1 - i = (r - S) // (shapes_per_level[l][0] * shapes_per_level[l][1]) - return i, l - -def debug_train( - images, gt_instances, flattened_hms, reg_targets, labels, pos_inds, - shapes_per_level, locations, strides): - ''' - images: N x 3 x H x W - flattened_hms: LNHiWi x C - shapes_per_level: L x 2 [(H_i, W_i)] - locations: LNHiWi x 2 - ''' - reg_inds = torch.nonzero( - reg_targets.max(dim=1)[0] > 0).squeeze(1) - N = len(images) - images = _imagelist_to_tensor(images) - repeated_locations = [torch.cat([loc] * N, dim=0) \ - for loc in locations] - locations = torch.cat(repeated_locations, dim=0) - gt_hms = _decompose_level(flattened_hms, shapes_per_level, N) - masks = flattened_hms.new_zeros((flattened_hms.shape[0], 1)) - masks[pos_inds] = 1 - masks = _decompose_level(masks, shapes_per_level, N) - for i in range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0) - color_maps = [] - for l in range(len(gt_hms)): - color_map = _get_color_image( - gt_hms[l][i].detach().cpu().numpy()) - color_maps.append(color_map) - cv2.imshow('gthm_{}'.format(l), color_map) - blend = _blend_image_heatmaps(image.copy(), color_maps) - if gt_instances is not None: - bboxes = gt_instances[i].gt_boxes.tensor - for j in range(len(bboxes)): - bbox = bboxes[j] - cv2.rectangle( - blend, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (0, 0, 255), 3, cv2.LINE_AA) - - for j in range(len(pos_inds)): - image_id, l = _ind2il(pos_inds[j], shapes_per_level, N) - if image_id != i: - continue - loc = locations[pos_inds[j]] - cv2.drawMarker( - blend, (int(loc[0]), int(loc[1])), (0, 255, 255), - markerSize=(l + 1) * 16) - - for j in range(len(reg_inds)): - image_id, l = _ind2il(reg_inds[j], shapes_per_level, N) - if image_id != i: - continue - ltrb = reg_targets[reg_inds[j]] - ltrb *= strides[l] - loc = locations[reg_inds[j]] - bbox = [(loc[0] - ltrb[0]), (loc[1] - ltrb[1]), - (loc[0] + ltrb[2]), (loc[1] + ltrb[3])] - cv2.rectangle( - blend, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (255, 0, 0), 1, cv2.LINE_AA) - cv2.circle(blend, (int(loc[0]), int(loc[1])), 2, (255, 0, 0), -1) - - cv2.imshow('blend', blend) - cv2.waitKey() - - -def debug_test( - images, logits_pred, reg_pred, agn_hm_pred=[], preds=[], - vis_thresh=0.3, debug_show_name=False, mult_agn=False): - ''' - images: N x 3 x H x W - class_target: LNHiWi x C - cat_agn_heatmap: LNHiWi - shapes_per_level: L x 2 [(H_i, W_i)] - ''' - N = len(images) - for i in 
range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0) - result = image.copy().astype(np.uint8) - pred_image = image.copy().astype(np.uint8) - color_maps = [] - L = len(logits_pred) - for l in range(L): - if logits_pred[0] is not None: - stride = min(image.shape[0], image.shape[1]) / min( - logits_pred[l][i].shape[1], logits_pred[l][i].shape[2]) - else: - stride = min(image.shape[0], image.shape[1]) / min( - agn_hm_pred[l][i].shape[1], agn_hm_pred[l][i].shape[2]) - stride = stride if stride < 60 else 64 if stride < 100 else 128 - if logits_pred[0] is not None: - if mult_agn: - logits_pred[l][i] = logits_pred[l][i] * agn_hm_pred[l][i] - color_map = _get_color_image( - logits_pred[l][i].detach().cpu().numpy()) - color_maps.append(color_map) - cv2.imshow('predhm_{}'.format(l), color_map) - - if debug_show_name: - from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES - cat2name = [x['name'] for x in LVIS_CATEGORIES] - for j in range(len(preds[i].scores) if preds is not None else 0): - if preds[i].scores[j] > vis_thresh: - bbox = preds[i].proposal_boxes[j] \ - if preds[i].has('proposal_boxes') else \ - preds[i].pred_boxes[j] - bbox = bbox.tensor[0].detach().cpu().numpy().astype(np.int32) - cat = int(preds[i].pred_classes[j]) \ - if preds[i].has('pred_classes') else 0 - cl = COLORS[cat, 0, 0] - cv2.rectangle( - pred_image, (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (int(cl[0]), int(cl[1]), int(cl[2])), 2, cv2.LINE_AA) - if debug_show_name: - txt = '{}{:.1f}'.format( - cat2name[cat] if cat > 0 else '', - preds[i].scores[j]) - font = cv2.FONT_HERSHEY_SIMPLEX - cat_size = cv2.getTextSize(txt, font, 0.5, 2)[0] - cv2.rectangle( - pred_image, - (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), - (int(bbox[0] + cat_size[0]), int(bbox[1] - 2)), - (int(cl[0]), int(cl[1]), int(cl[2])), -1) - cv2.putText( - pred_image, txt, (int(bbox[0]), int(bbox[1] - 2)), - font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) - - - if agn_hm_pred[l] is not None: - agn_hm_ = agn_hm_pred[l][i, 0, :, :, None].detach().cpu().numpy() - agn_hm_ = (agn_hm_ * np.array([255, 255, 255]).reshape( - 1, 1, 3)).astype(np.uint8) - cv2.imshow('agn_hm_{}'.format(l), agn_hm_) - blend = _blend_image_heatmaps(image.copy(), color_maps) - cv2.imshow('blend', blend) - cv2.imshow('preds', pred_image) - cv2.waitKey() - -global cnt -cnt = 0 - -def debug_second_stage(images, instances, proposals=None, vis_thresh=0.3, - save_debug=False, debug_show_name=False): - images = _imagelist_to_tensor(images) - if debug_show_name: - from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES - cat2name = [x['name'] for x in LVIS_CATEGORIES] - for i in range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() - if instances[i].has('gt_boxes'): - bboxes = instances[i].gt_boxes.tensor.cpu().numpy() - scores = np.ones(bboxes.shape[0]) - cats = instances[i].gt_classes.cpu().numpy() - else: - bboxes = instances[i].pred_boxes.tensor.cpu().numpy() - scores = instances[i].scores.cpu().numpy() - cats = instances[i].pred_classes.cpu().numpy() - for j in range(len(bboxes)): - if scores[j] > vis_thresh: - bbox = bboxes[j] - cl = COLORS[cats[j], 0, 0] - cl = (int(cl[0]), int(cl[1]), int(cl[2])) - cv2.rectangle( - image, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - cl, 2, cv2.LINE_AA) - if debug_show_name: - cat = cats[j] - txt = '{}{:.1f}'.format( - cat2name[cat] if cat > 0 else '', - scores[j]) - font = 
cv2.FONT_HERSHEY_SIMPLEX - cat_size = cv2.getTextSize(txt, font, 0.5, 2)[0] - cv2.rectangle( - image, - (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), - (int(bbox[0] + cat_size[0]), int(bbox[1] - 2)), - (int(cl[0]), int(cl[1]), int(cl[2])), -1) - cv2.putText( - image, txt, (int(bbox[0]), int(bbox[1] - 2)), - font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) - if proposals is not None: - proposal_image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() - bboxes = proposals[i].proposal_boxes.tensor.cpu().numpy() - if proposals[i].has('scores'): - scores = proposals[i].scores.cpu().numpy() - else: - scores = proposals[i].objectness_logits.sigmoid().cpu().numpy() - for j in range(len(bboxes)): - if scores[j] > vis_thresh: - bbox = bboxes[j] - cl = (209, 159, 83) - cv2.rectangle( - proposal_image, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - cl, 2, cv2.LINE_AA) - - cv2.imshow('image', image) - if proposals is not None: - cv2.imshow('proposals', proposal_image) - if save_debug: - global cnt - cnt += 1 - cv2.imwrite('output/save_debug/{}.jpg'.format(cnt), proposal_image) - cv2.waitKey() \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py deleted file mode 100755 index feb7a822..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py +++ /dev/null @@ -1,864 +0,0 @@ - -import math -import json -import copy -from typing import List, Dict -import numpy as np -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.modeling.proposal_generator.build import PROPOSAL_GENERATOR_REGISTRY -from detectron2.layers import ShapeSpec, cat -from detectron2.structures import Instances, Boxes -from detectron2.modeling import detector_postprocess -from detectron2.utils.comm import get_world_size -from detectron2.config import configurable - -from ..layers.heatmap_focal_loss import heatmap_focal_loss_jit -from ..layers.heatmap_focal_loss import binary_heatmap_focal_loss -from ..layers.iou_loss import IOULoss -from ..layers.ml_nms import ml_nms -from ..debug import debug_train, debug_test -from .utils import reduce_sum, _transpose -from .centernet_head import CenterNetHead - -__all__ = ["CenterNet"] - -INF = 100000000 - -@PROPOSAL_GENERATOR_REGISTRY.register() -class CenterNet(nn.Module): - @configurable - def __init__(self, - # input_shape: Dict[str, ShapeSpec], - in_channels=256, - *, - num_classes=80, - in_features=("p3", "p4", "p5", "p6", "p7"), - strides=(8, 16, 32, 64, 128), - score_thresh=0.05, - hm_min_overlap=0.8, - loc_loss_type='giou', - min_radius=4, - hm_focal_alpha=0.25, - hm_focal_beta=4, - loss_gamma=2.0, - reg_weight=2.0, - not_norm_reg=True, - with_agn_hm=False, - only_proposal=False, - as_proposal=False, - not_nms=False, - pos_weight=1., - neg_weight=1., - sigmoid_clamp=1e-4, - ignore_high_fp=-1., - center_nms=False, - sizes_of_interest=[[0,80],[64,160],[128,320],[256,640],[512,10000000]], - more_pos=False, - more_pos_thresh=0.2, - more_pos_topk=9, - pre_nms_topk_train=1000, - pre_nms_topk_test=1000, - post_nms_topk_train=100, - post_nms_topk_test=100, - nms_thresh_train=0.6, - nms_thresh_test=0.6, - no_reduce=False, - debug=False, - vis_thresh=0.5, - pixel_mean=[103.530,116.280,123.675], - 
pixel_std=[1.0,1.0,1.0], - device='cuda', - centernet_head=None, - ): - super().__init__() - self.num_classes = num_classes - self.in_features = in_features - self.strides = strides - self.score_thresh = score_thresh - self.min_radius = min_radius - self.hm_focal_alpha = hm_focal_alpha - self.hm_focal_beta = hm_focal_beta - self.loss_gamma = loss_gamma - self.reg_weight = reg_weight - self.not_norm_reg = not_norm_reg - self.with_agn_hm = with_agn_hm - self.only_proposal = only_proposal - self.as_proposal = as_proposal - self.not_nms = not_nms - self.pos_weight = pos_weight - self.neg_weight = neg_weight - self.sigmoid_clamp = sigmoid_clamp - self.ignore_high_fp = ignore_high_fp - self.center_nms = center_nms - self.sizes_of_interest = sizes_of_interest - self.more_pos = more_pos - self.more_pos_thresh = more_pos_thresh - self.more_pos_topk = more_pos_topk - self.pre_nms_topk_train = pre_nms_topk_train - self.pre_nms_topk_test = pre_nms_topk_test - self.post_nms_topk_train = post_nms_topk_train - self.post_nms_topk_test = post_nms_topk_test - self.nms_thresh_train = nms_thresh_train - self.nms_thresh_test = nms_thresh_test - self.no_reduce = no_reduce - self.debug = debug - self.vis_thresh = vis_thresh - if self.center_nms: - self.not_nms = True - self.iou_loss = IOULoss(loc_loss_type) - assert (not self.only_proposal) or self.with_agn_hm - # delta for rendering heatmap - self.delta = (1 - hm_min_overlap) / (1 + hm_min_overlap) - if centernet_head is None: - self.centernet_head = CenterNetHead( - in_channels=in_channels, - num_levels=len(in_features), - with_agn_hm=with_agn_hm, - only_proposal=only_proposal) - else: - self.centernet_head = centernet_head - if self.debug: - pixel_mean = torch.Tensor(pixel_mean).to( - torch.device(device)).view(3, 1, 1) - pixel_std = torch.Tensor(pixel_std).to( - torch.device(device)).view(3, 1, 1) - self.denormalizer = lambda x: x * pixel_std + pixel_mean - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - # 'input_shape': input_shape, - 'in_channels': input_shape[ - cfg.MODEL.CENTERNET.IN_FEATURES[0]].channels, - 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, - 'in_features': cfg.MODEL.CENTERNET.IN_FEATURES, - 'strides': cfg.MODEL.CENTERNET.FPN_STRIDES, - 'score_thresh': cfg.MODEL.CENTERNET.INFERENCE_TH, - 'loc_loss_type': cfg.MODEL.CENTERNET.LOC_LOSS_TYPE, - 'hm_min_overlap': cfg.MODEL.CENTERNET.HM_MIN_OVERLAP, - 'min_radius': cfg.MODEL.CENTERNET.MIN_RADIUS, - 'hm_focal_alpha': cfg.MODEL.CENTERNET.HM_FOCAL_ALPHA, - 'hm_focal_beta': cfg.MODEL.CENTERNET.HM_FOCAL_BETA, - 'loss_gamma': cfg.MODEL.CENTERNET.LOSS_GAMMA, - 'reg_weight': cfg.MODEL.CENTERNET.REG_WEIGHT, - 'not_norm_reg': cfg.MODEL.CENTERNET.NOT_NORM_REG, - 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, - 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, - 'as_proposal': cfg.MODEL.CENTERNET.AS_PROPOSAL, - 'not_nms': cfg.MODEL.CENTERNET.NOT_NMS, - 'pos_weight': cfg.MODEL.CENTERNET.POS_WEIGHT, - 'neg_weight': cfg.MODEL.CENTERNET.NEG_WEIGHT, - 'sigmoid_clamp': cfg.MODEL.CENTERNET.SIGMOID_CLAMP, - 'ignore_high_fp': cfg.MODEL.CENTERNET.IGNORE_HIGH_FP, - 'center_nms': cfg.MODEL.CENTERNET.CENTER_NMS, - 'sizes_of_interest': cfg.MODEL.CENTERNET.SOI, - 'more_pos': cfg.MODEL.CENTERNET.MORE_POS, - 'more_pos_thresh': cfg.MODEL.CENTERNET.MORE_POS_THRESH, - 'more_pos_topk': cfg.MODEL.CENTERNET.MORE_POS_TOPK, - 'pre_nms_topk_train': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN, - 'pre_nms_topk_test': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TEST, - 'post_nms_topk_train': 
cfg.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN,
-            'post_nms_topk_test': cfg.MODEL.CENTERNET.POST_NMS_TOPK_TEST,
-            'nms_thresh_train': cfg.MODEL.CENTERNET.NMS_TH_TRAIN,
-            'nms_thresh_test': cfg.MODEL.CENTERNET.NMS_TH_TEST,
-            'no_reduce': cfg.MODEL.CENTERNET.NO_REDUCE,
-            'debug': cfg.DEBUG,
-            'vis_thresh': cfg.VIS_THRESH,
-            'pixel_mean': cfg.MODEL.PIXEL_MEAN,
-            'pixel_std': cfg.MODEL.PIXEL_STD,
-            'device': cfg.MODEL.DEVICE,
-            'centernet_head': CenterNetHead(
-                cfg, [input_shape[f] for f in cfg.MODEL.CENTERNET.IN_FEATURES]),
-        }
-        return ret
-
-
-    def forward(self, images, features_dict, gt_instances):
-        features = [features_dict[f] for f in self.in_features]
-        clss_per_level, reg_pred_per_level, agn_hm_pred_per_level = \
-            self.centernet_head(features)
-        grids = self.compute_grids(features)
-        shapes_per_level = grids[0].new_tensor(
-            [(x.shape[2], x.shape[3]) for x in reg_pred_per_level])
-
-        if not self.training:
-            return self.inference(
-                images, clss_per_level, reg_pred_per_level,
-                agn_hm_pred_per_level, grids)
-        else:
-            pos_inds, labels, reg_targets, flattened_hms = \
-                self._get_ground_truth(
-                    grids, shapes_per_level, gt_instances)
-            # logits_pred: M x F, reg_pred: M x 4, agn_hm_pred: M
-            logits_pred, reg_pred, agn_hm_pred = self._flatten_outputs(
-                clss_per_level, reg_pred_per_level, agn_hm_pred_per_level)
-
-            if self.more_pos:
-                # add more pixels as positive if \
-                #   1. they are within the center3x3 region of an object
-                #   2. their regression losses are small (<self.more_pos_thresh)
-                pos_inds, labels = self._add_more_pos(
-                    reg_pred, gt_instances, shapes_per_level)
-
-            losses = self.losses(
-                pos_inds, labels, reg_targets, flattened_hms,
-                logits_pred, reg_pred, agn_hm_pred)
-
-            proposals = None
-            if self.only_proposal:
-                agn_hm_pred_per_level = [x.sigmoid() for x in agn_hm_pred_per_level]
-                proposals = self.predict_instances(
-                    grids, agn_hm_pred_per_level, reg_pred_per_level,
-                    images.image_sizes, [None for _ in agn_hm_pred_per_level])
-            elif self.as_proposal: # category specific bbox as agnostic proposals
-                clss_per_level = [x.sigmoid() for x in clss_per_level]
-                proposals = self.predict_instances(
-                    grids, clss_per_level, reg_pred_per_level,
-                    images.image_sizes, agn_hm_pred_per_level)
-            if self.only_proposal or self.as_proposal:
-                for p in range(len(proposals)):
-                    proposals[p].proposal_boxes = proposals[p].get('pred_boxes')
-                    proposals[p].objectness_logits = proposals[p].get('scores')
-                    proposals[p].remove('pred_boxes')
-                    proposals[p].remove('scores')
-                    proposals[p].remove('pred_classes')
-
-            if self.debug:
-                debug_train(
-                    [self.denormalizer(x) for x in images],
-                    gt_instances, flattened_hms, reg_targets,
-                    labels, pos_inds, shapes_per_level, grids, self.strides)
-
-            return proposals, losses
-
-
-    def losses(
-        self, pos_inds, labels, reg_targets, flattened_hms,
-        logits_pred, reg_pred, agn_hm_pred):
-        '''
-        Inputs:
-            pos_inds: N
-            labels: N
-            reg_targets: M x 4
-            flattened_hms: M x C
-            logits_pred: M x C
-            reg_pred: M x 4
-            agn_hm_pred: M x 1 or None
-            N: number of positive locations in all images
-            M: number of pixels from all FPN levels
-            C: number of classes
-        '''
-        assert (torch.isfinite(reg_pred).all().item())
-        num_pos_local = pos_inds.numel()
-        num_gpus = get_world_size()
-        if self.no_reduce:
-            total_num_pos = num_pos_local * num_gpus
-        else:
-            total_num_pos = reduce_sum(
-                pos_inds.new_tensor([num_pos_local])).item()
-        num_pos_avg = max(total_num_pos / num_gpus, 1.0)
-
-        losses = {}
-        if not self.only_proposal:
-            pos_loss, neg_loss = heatmap_focal_loss_jit(
-                logits_pred, flattened_hms, pos_inds, labels,
-                alpha=self.hm_focal_alpha,
-                beta=self.hm_focal_beta,
-                gamma=self.loss_gamma,
-                reduction='sum',
-                sigmoid_clamp=self.sigmoid_clamp,
-                ignore_high_fp=self.ignore_high_fp,
-            )
-            pos_loss = self.pos_weight * pos_loss / num_pos_avg
-            neg_loss = self.neg_weight * neg_loss / num_pos_avg
-            losses['loss_centernet_pos'] = pos_loss
-            losses['loss_centernet_neg'] = neg_loss
-
-        reg_inds = torch.nonzero(reg_targets.max(dim=1)[0] >= 0).squeeze(1)
-        reg_pred = reg_pred[reg_inds]
-        reg_targets_pos = reg_targets[reg_inds]
-        reg_weight_map = flattened_hms.max(dim=1)[0]
-        reg_weight_map = reg_weight_map[reg_inds]
-        reg_weight_map = reg_weight_map * 0 + 1 \
-            if self.not_norm_reg else reg_weight_map
-        if self.no_reduce:
-            reg_norm = max(reg_weight_map.sum(), 1)
-        else:
-            reg_norm = max(reduce_sum(reg_weight_map.sum()).item() / num_gpus, 1)
-
-        reg_loss = self.reg_weight * self.iou_loss(
-            reg_pred, reg_targets_pos, reg_weight_map,
-            reduction='sum') / reg_norm
-        losses['loss_centernet_loc'] = reg_loss
-
-        if self.with_agn_hm:
-            cat_agn_heatmap = flattened_hms.max(dim=1)[0] # M
-            agn_pos_loss, agn_neg_loss = binary_heatmap_focal_loss(
-                agn_hm_pred, cat_agn_heatmap, pos_inds,
-                alpha=self.hm_focal_alpha,
-                beta=self.hm_focal_beta,
-                gamma=self.loss_gamma,
-                sigmoid_clamp=self.sigmoid_clamp,
-                ignore_high_fp=self.ignore_high_fp,
-            )
-            agn_pos_loss = self.pos_weight * agn_pos_loss / num_pos_avg
-            agn_neg_loss = self.neg_weight * agn_neg_loss / num_pos_avg
-            losses['loss_centernet_agn_pos'] = agn_pos_loss
-            losses['loss_centernet_agn_neg'] = agn_neg_loss
-
-        if self.debug:
-            print('losses', losses)
-            print('total_num_pos', total_num_pos)
-        return losses
-
-
-    def compute_grids(self, features):
-        grids = []
-        for level, feature in enumerate(features):
-            h, w = feature.size()[-2:]
-            shifts_x = torch.arange(
-                0, w * self.strides[level],
-                step=self.strides[level],
-                dtype=torch.float32, device=feature.device)
-            shifts_y = torch.arange(
-                0, h * self.strides[level],
-                step=self.strides[level],
-                dtype=torch.float32, device=feature.device)
-            shift_y, shift_x = torch.meshgrid(shifts_y, shifts_x)
-            shift_x = shift_x.reshape(-1)
-            shift_y = shift_y.reshape(-1)
-            grids_per_level = torch.stack((shift_x, shift_y), dim=1) + \
-                self.strides[level] // 2
-            grids.append(grids_per_level)
-        return grids
-
-
-    def _get_ground_truth(self, grids, shapes_per_level, gt_instances):
-        '''
-        Input:
-            grids: list of tensors [(hl x wl, 2)]_l
-            shapes_per_level: list of tuples L x 2:
-            gt_instances: gt instances
-        Return:
-            pos_inds: N
-            labels: N
-            reg_targets: M x 4
-            flattened_hms: M x C or M x 1
-            N: number of objects in all images
-            M: number of pixels from all FPN levels
-        '''
-
-        # get positive pixel index
-        if not self.more_pos:
-            pos_inds, labels = self._get_label_inds(
-                gt_instances, shapes_per_level)
-        else:
-            pos_inds, labels = None, None
-        heatmap_channels = self.num_classes
-        L = len(grids)
-        num_loc_list = [len(loc) for loc in grids]
-        strides = torch.cat([
-            shapes_per_level.new_ones(num_loc_list[l]) * self.strides[l] \
-            for l in range(L)]).float() # M
-        reg_size_ranges = torch.cat([
-            shapes_per_level.new_tensor(self.sizes_of_interest[l]).float().view(
-                1, 2).expand(num_loc_list[l], 2) for l in range(L)]) # M x 2
-        grids = torch.cat(grids, dim=0) # M x 2
-        M = grids.shape[0]
-
-        reg_targets = []
-        flattened_hms = []
-        for i in range(len(gt_instances)): # images
-            boxes = gt_instances[i].gt_boxes.tensor # N x 4
-            area = gt_instances[i].gt_boxes.area() # N
-            gt_classes = gt_instances[i].gt_classes # N in [0, self.num_classes]
-
-            N = boxes.shape[0]
-            if N == 0:
-                reg_targets.append(grids.new_zeros((M, 4)) - INF)
-                flattened_hms.append(
-                    grids.new_zeros((
-                        M, 1 if self.only_proposal else heatmap_channels)))
-                continue
-
-            l = grids[:, 0].view(M, 1) - boxes[:, 0].view(1, N) # M x N
-            t = grids[:, 1].view(M, 1) - boxes[:, 1].view(1, N) # M x N
-            r = boxes[:, 2].view(1, N) - grids[:, 0].view(M, 1) # M x N
-            b = boxes[:, 3].view(1, N) - grids[:, 1].view(M, 1) # M x N
-            reg_target = torch.stack([l, t, r, b], dim=2) # M x N x 4
-
-            centers = ((boxes[:, [0, 1]] + boxes[:, [2, 3]]) / 2) # N x 2
-            centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2
-            strides_expanded = strides.view(M, 1, 1).expand(M, N, 2)
-            centers_discret = ((centers_expanded / strides_expanded).int() * \
-                strides_expanded).float() + strides_expanded / 2 # M x N x 2
-
-            is_peak = (((grids.view(M, 1, 2).expand(M, N, 2) - \
-                centers_discret) ** 2).sum(dim=2) == 0) # M x N
-            is_in_boxes = reg_target.min(dim=2)[0] > 0 # M x N
-            is_center3x3 = self.get_center3x3(
-                grids, centers, strides) & is_in_boxes # M x N
-            is_cared_in_the_level = self.assign_reg_fpn(
-                reg_target, reg_size_ranges) # M x N
-            reg_mask = is_center3x3 & is_cared_in_the_level # M x N
-
-            dist2 = ((grids.view(M, 1, 2).expand(M, N, 2) - \
-                centers_expanded) ** 2).sum(dim=2) # M x N
-            dist2[is_peak] = 0
-            radius2 = self.delta ** 2 * 2 * area # N
-            radius2 = torch.clamp(
-                radius2, min=self.min_radius ** 2)
-            weighted_dist2 = dist2 / radius2.view(1, N).expand(M, N) # M x N
-            reg_target = self._get_reg_targets(
-                reg_target, weighted_dist2.clone(), reg_mask, area) # M x 4
-
-            if self.only_proposal:
-                flattened_hm = self._create_agn_heatmaps_from_dist(
-                    weighted_dist2.clone()) # M x 1
-            else:
-                flattened_hm = self._create_heatmaps_from_dist(
-                    weighted_dist2.clone(), gt_classes,
-                    channels=heatmap_channels) # M x C
-
-            reg_targets.append(reg_target)
-            flattened_hms.append(flattened_hm)
-
-        # transpose image-first training targets to level-first ones
-        reg_targets = _transpose(reg_targets, num_loc_list)
-        flattened_hms = _transpose(flattened_hms, num_loc_list)
-        for l in range(len(reg_targets)):
-            reg_targets[l] = reg_targets[l] / float(self.strides[l])
-        reg_targets = cat([x for x in reg_targets], dim=0) # MB x 4
-        flattened_hms = cat([x for x in flattened_hms], dim=0) # MB x C
-
-        return pos_inds, labels, reg_targets, flattened_hms
-
-
-    def _get_label_inds(self, gt_instances, shapes_per_level):
-        '''
-        Inputs:
-
gt_instances: [n_i], sum n_i = N - shapes_per_level: L x 2 [(h_l, w_l)]_L - Returns: - pos_inds: N' - labels: N' - ''' - pos_inds = [] - labels = [] - L = len(self.strides) - B = len(gt_instances) - shapes_per_level = shapes_per_level.long() - loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L - level_bases = [] - s = 0 - for l in range(L): - level_bases.append(s) - s = s + B * loc_per_level[l] - level_bases = shapes_per_level.new_tensor(level_bases).long() # L - strides_default = shapes_per_level.new_tensor(self.strides).float() # L - for im_i in range(B): - targets_per_im = gt_instances[im_i] - bboxes = targets_per_im.gt_boxes.tensor # n x 4 - n = bboxes.shape[0] - centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 - centers = centers.view(n, 1, 2).expand(n, L, 2) - strides = strides_default.view(1, L, 1).expand(n, L, 2) - centers_inds = (centers / strides).long() # n x L x 2 - Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) - pos_ind = level_bases.view(1, L).expand(n, L) + \ - im_i * loc_per_level.view(1, L).expand(n, L) + \ - centers_inds[:, :, 1] * Ws + \ - centers_inds[:, :, 0] # n x L - is_cared_in_the_level = self.assign_fpn_level(bboxes) - pos_ind = pos_ind[is_cared_in_the_level].view(-1) - label = targets_per_im.gt_classes.view( - n, 1).expand(n, L)[is_cared_in_the_level].view(-1) - - pos_inds.append(pos_ind) # n' - labels.append(label) # n' - pos_inds = torch.cat(pos_inds, dim=0).long() - labels = torch.cat(labels, dim=0) - return pos_inds, labels # N, N - - - def assign_fpn_level(self, boxes): - ''' - Inputs: - boxes: n x 4 - size_ranges: L x 2 - Return: - is_cared_in_the_level: n x L - ''' - size_ranges = boxes.new_tensor( - self.sizes_of_interest).view(len(self.sizes_of_interest), 2) # L x 2 - crit = ((boxes[:, 2:] - boxes[:, :2]) **2).sum(dim=1) ** 0.5 / 2 # n - n, L = crit.shape[0], size_ranges.shape[0] - crit = crit.view(n, 1).expand(n, L) - size_ranges_expand = size_ranges.view(1, L, 2).expand(n, L, 2) - is_cared_in_the_level = (crit >= size_ranges_expand[:, :, 0]) & \ - (crit <= size_ranges_expand[:, :, 1]) - return is_cared_in_the_level - - - def assign_reg_fpn(self, reg_targets_per_im, size_ranges): - ''' - TODO (Xingyi): merge it with assign_fpn_level - Inputs: - reg_targets_per_im: M x N x 4 - size_ranges: M x 2 - ''' - crit = ((reg_targets_per_im[:, :, :2] + \ - reg_targets_per_im[:, :, 2:])**2).sum(dim=2) ** 0.5 / 2 # M x N - is_cared_in_the_level = (crit >= size_ranges[:, [0]]) & \ - (crit <= size_ranges[:, [1]]) - return is_cared_in_the_level - - - def _get_reg_targets(self, reg_targets, dist, mask, area): - ''' - reg_targets (M x N x 4): long tensor - dist (M x N) - is_*: M x N - ''' - dist[mask == 0] = INF * 1.0 - min_dist, min_inds = dist.min(dim=1) # M - reg_targets_per_im = reg_targets[ - range(len(reg_targets)), min_inds] # M x N x 4 --> M x 4 - reg_targets_per_im[min_dist == INF] = - INF - return reg_targets_per_im - - - def _create_heatmaps_from_dist(self, dist, labels, channels): - ''' - dist: M x N - labels: N - return: - heatmaps: M x C - ''' - heatmaps = dist.new_zeros((dist.shape[0], channels)) - for c in range(channels): - inds = (labels == c) # N - if inds.int().sum() == 0: - continue - heatmaps[:, c] = torch.exp(-dist[:, inds].min(dim=1)[0]) - zeros = heatmaps[:, c] < 1e-4 - heatmaps[zeros, c] = 0 - return heatmaps - - - def _create_agn_heatmaps_from_dist(self, dist): - ''' - TODO (Xingyi): merge it with _create_heatmaps_from_dist - dist: M x N - return: - heatmaps: M x 1 - ''' - heatmaps = 
dist.new_zeros((dist.shape[0], 1)) - heatmaps[:, 0] = torch.exp(-dist.min(dim=1)[0]) - zeros = heatmaps < 1e-4 - heatmaps[zeros] = 0 - return heatmaps - - - def _flatten_outputs(self, clss, reg_pred, agn_hm_pred): - # Reshape: (N, F, Hl, Wl) -> (N, Hl, Wl, F) -> (sum_l N*Hl*Wl, F) - clss = cat([x.permute(0, 2, 3, 1).reshape(-1, x.shape[1]) \ - for x in clss], dim=0) if clss[0] is not None else None - reg_pred = cat( - [x.permute(0, 2, 3, 1).reshape(-1, 4) for x in reg_pred], dim=0) - agn_hm_pred = cat([x.permute(0, 2, 3, 1).reshape(-1) \ - for x in agn_hm_pred], dim=0) if self.with_agn_hm else None - return clss, reg_pred, agn_hm_pred - - - def get_center3x3(self, locations, centers, strides): - ''' - Inputs: - locations: M x 2 - centers: N x 2 - strides: M - ''' - M, N = locations.shape[0], centers.shape[0] - locations_expanded = locations.view(M, 1, 2).expand(M, N, 2) # M x N x 2 - centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2 - strides_expanded = strides.view(M, 1, 1).expand(M, N, 2) # M x N - centers_discret = ((centers_expanded / strides_expanded).int() * \ - strides_expanded).float() + strides_expanded / 2 # M x N x 2 - dist_x = (locations_expanded[:, :, 0] - centers_discret[:, :, 0]).abs() - dist_y = (locations_expanded[:, :, 1] - centers_discret[:, :, 1]).abs() - return (dist_x <= strides_expanded[:, :, 0]) & \ - (dist_y <= strides_expanded[:, :, 0]) - - - def inference(self, images, clss_per_level, reg_pred_per_level, - agn_hm_pred_per_level, grids): - logits_pred = [x.sigmoid() if x is not None else None \ - for x in clss_per_level] - agn_hm_pred_per_level = [x.sigmoid() if x is not None else None \ - for x in agn_hm_pred_per_level] - - if self.only_proposal: - proposals = self.predict_instances( - grids, agn_hm_pred_per_level, reg_pred_per_level, - images.image_sizes, [None for _ in agn_hm_pred_per_level]) - else: - proposals = self.predict_instances( - grids, logits_pred, reg_pred_per_level, - images.image_sizes, agn_hm_pred_per_level) - if self.as_proposal or self.only_proposal: - for p in range(len(proposals)): - proposals[p].proposal_boxes = proposals[p].get('pred_boxes') - proposals[p].objectness_logits = proposals[p].get('scores') - proposals[p].remove('pred_boxes') - - if self.debug: - debug_test( - [self.denormalizer(x) for x in images], - logits_pred, reg_pred_per_level, - agn_hm_pred_per_level, preds=proposals, - vis_thresh=self.vis_thresh, - debug_show_name=False) - return proposals, {} - - - def predict_instances( - self, grids, logits_pred, reg_pred, image_sizes, agn_hm_pred, - is_proposal=False): - sampled_boxes = [] - for l in range(len(grids)): - sampled_boxes.append(self.predict_single_level( - grids[l], logits_pred[l], reg_pred[l] * self.strides[l], - image_sizes, agn_hm_pred[l], l, is_proposal=is_proposal)) - boxlists = list(zip(*sampled_boxes)) - boxlists = [Instances.cat(boxlist) for boxlist in boxlists] - boxlists = self.nms_and_topK( - boxlists, nms=not self.not_nms) - return boxlists - - - def predict_single_level( - self, grids, heatmap, reg_pred, image_sizes, agn_hm, level, - is_proposal=False): - N, C, H, W = heatmap.shape - # put in the same format as grids - if self.center_nms: - heatmap_nms = nn.functional.max_pool2d( - heatmap, (3, 3), stride=1, padding=1) - heatmap = heatmap * (heatmap_nms == heatmap).float() - heatmap = heatmap.permute(0, 2, 3, 1) # N x H x W x C - heatmap = heatmap.reshape(N, -1, C) # N x HW x C - box_regression = reg_pred.view(N, 4, H, W).permute(0, 2, 3, 1) # N x H x W x 4 - box_regression = 
box_regression.reshape(N, -1, 4) - - candidate_inds = heatmap > self.score_thresh # 0.05 - pre_nms_top_n = candidate_inds.view(N, -1).sum(1) # N - pre_nms_topk = self.pre_nms_topk_train if self.training else self.pre_nms_topk_test - pre_nms_top_n = pre_nms_top_n.clamp(max=pre_nms_topk) # N - - if agn_hm is not None: - agn_hm = agn_hm.view(N, 1, H, W).permute(0, 2, 3, 1) - agn_hm = agn_hm.reshape(N, -1) - heatmap = heatmap * agn_hm[:, :, None] - - results = [] - for i in range(N): - per_box_cls = heatmap[i] # HW x C - per_candidate_inds = candidate_inds[i] # n - per_box_cls = per_box_cls[per_candidate_inds] # n - - per_candidate_nonzeros = per_candidate_inds.nonzero() # n - per_box_loc = per_candidate_nonzeros[:, 0] # n - per_class = per_candidate_nonzeros[:, 1] # n - - per_box_regression = box_regression[i] # HW x 4 - per_box_regression = per_box_regression[per_box_loc] # n x 4 - per_grids = grids[per_box_loc] # n x 2 - - per_pre_nms_top_n = pre_nms_top_n[i] # 1 - - if per_candidate_inds.sum().item() > per_pre_nms_top_n.item(): - per_box_cls, top_k_indices = \ - per_box_cls.topk(per_pre_nms_top_n, sorted=False) - per_class = per_class[top_k_indices] - per_box_regression = per_box_regression[top_k_indices] - per_grids = per_grids[top_k_indices] - - detections = torch.stack([ - per_grids[:, 0] - per_box_regression[:, 0], - per_grids[:, 1] - per_box_regression[:, 1], - per_grids[:, 0] + per_box_regression[:, 2], - per_grids[:, 1] + per_box_regression[:, 3], - ], dim=1) # n x 4 - - # avoid invalid boxes in RoI heads - detections[:, 2] = torch.max(detections[:, 2], detections[:, 0] + 0.01) - detections[:, 3] = torch.max(detections[:, 3], detections[:, 1] + 0.01) - boxlist = Instances(image_sizes[i]) - boxlist.scores = torch.sqrt(per_box_cls) \ - if self.with_agn_hm else per_box_cls # n - # import pdb; pdb.set_trace() - boxlist.pred_boxes = Boxes(detections) - boxlist.pred_classes = per_class - results.append(boxlist) - return results - - - def nms_and_topK(self, boxlists, nms=True): - num_images = len(boxlists) - results = [] - for i in range(num_images): - nms_thresh = self.nms_thresh_train if self.training else \ - self.nms_thresh_test - result = ml_nms(boxlists[i], nms_thresh) if nms else boxlists[i] - if self.debug: - print('#proposals before nms', len(boxlists[i])) - print('#proposals after nms', len(result)) - num_dets = len(result) - post_nms_topk = self.post_nms_topk_train if self.training else \ - self.post_nms_topk_test - if num_dets > post_nms_topk: - cls_scores = result.scores - image_thresh, _ = torch.kthvalue( - cls_scores.float().cpu(), - num_dets - post_nms_topk + 1 - ) - keep = cls_scores >= image_thresh.item() - keep = torch.nonzero(keep).squeeze(1) - result = result[keep] - if self.debug: - print('#proposals after filter', len(result)) - results.append(result) - return results - - - def _add_more_pos(self, reg_pred, gt_instances, shapes_per_level): - labels, level_masks, c33_inds, c33_masks, c33_regs = \ - self._get_c33_inds(gt_instances, shapes_per_level) - N, L, K = labels.shape[0], len(self.strides), 9 - c33_inds[c33_masks == 0] = 0 - reg_pred_c33 = reg_pred[c33_inds].detach() # N x L x K - invalid_reg = c33_masks == 0 - c33_regs_expand = c33_regs.view(N * L * K, 4).clamp(min=0) - if N > 0: - with torch.no_grad(): - c33_reg_loss = self.iou_loss( - reg_pred_c33.view(N * L * K, 4), - c33_regs_expand, None, - reduction='none').view(N, L, K).detach() # N x L x K - else: - c33_reg_loss = reg_pred_c33.new_zeros((N, L, K)).detach() - c33_reg_loss[invalid_reg] = INF # N x L x K - 
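- # (Comment added.) Offset index 4 of the K=9 grid is the exact center of - # the 3x3 window; its loss is zeroed next so the true center is favored - # when the per-object k-th-smallest-loss threshold admits extra positives.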
c33_reg_loss.view(N * L, K)[level_masks.view(N * L), 4] = 0 # real center - c33_reg_loss = c33_reg_loss.view(N, L * K) - if N == 0: - loss_thresh = c33_reg_loss.new_ones((N)).float() - else: - loss_thresh = torch.kthvalue( - c33_reg_loss, self.more_pos_topk, dim=1)[0] # N - loss_thresh[loss_thresh > self.more_pos_thresh] = self.more_pos_thresh # N - new_pos = c33_reg_loss.view(N, L, K) < \ - loss_thresh.view(N, 1, 1).expand(N, L, K) - pos_inds = c33_inds[new_pos].view(-1) # P - labels = labels.view(N, 1, 1).expand(N, L, K)[new_pos].view(-1) - return pos_inds, labels - - - def _get_c33_inds(self, gt_instances, shapes_per_level): - ''' - TODO (Xingyi): The current implementation is ugly. Refactor. - Get the center (and the 3x3 region near center) locations of each objects - Inputs: - gt_instances: [n_i], sum n_i = N - shapes_per_level: L x 2 [(h_l, w_l)]_L - ''' - labels = [] - level_masks = [] - c33_inds = [] - c33_masks = [] - c33_regs = [] - L = len(self.strides) - B = len(gt_instances) - shapes_per_level = shapes_per_level.long() - loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L - level_bases = [] - s = 0 - for l in range(L): - level_bases.append(s) - s = s + B * loc_per_level[l] - level_bases = shapes_per_level.new_tensor(level_bases).long() # L - strides_default = shapes_per_level.new_tensor(self.strides).float() # L - K = 9 - dx = shapes_per_level.new_tensor([-1, 0, 1, -1, 0, 1, -1, 0, 1]).long() - dy = shapes_per_level.new_tensor([-1, -1, -1, 0, 0, 0, 1, 1, 1]).long() - for im_i in range(B): - targets_per_im = gt_instances[im_i] - bboxes = targets_per_im.gt_boxes.tensor # n x 4 - n = bboxes.shape[0] - if n == 0: - continue - centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 - centers = centers.view(n, 1, 2).expand(n, L, 2) - - strides = strides_default.view(1, L, 1).expand(n, L, 2) # - centers_inds = (centers / strides).long() # n x L x 2 - center_grids = centers_inds * strides + strides // 2# n x L x 2 - l = center_grids[:, :, 0] - bboxes[:, 0].view(n, 1).expand(n, L) - t = center_grids[:, :, 1] - bboxes[:, 1].view(n, 1).expand(n, L) - r = bboxes[:, 2].view(n, 1).expand(n, L) - center_grids[:, :, 0] - b = bboxes[:, 3].view(n, 1).expand(n, L) - center_grids[:, :, 1] # n x L - reg = torch.stack([l, t, r, b], dim=2) # n x L x 4 - reg = reg / strides_default.view(1, L, 1).expand(n, L, 4).float() - - Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) - Hs = shapes_per_level[:, 0].view(1, L).expand(n, L) - expand_Ws = Ws.view(n, L, 1).expand(n, L, K) - expand_Hs = Hs.view(n, L, 1).expand(n, L, K) - label = targets_per_im.gt_classes.view(n).clone() - mask = reg.min(dim=2)[0] >= 0 # n x L - mask = mask & self.assign_fpn_level(bboxes) - labels.append(label) # n - level_masks.append(mask) # n x L - - Dy = dy.view(1, 1, K).expand(n, L, K) - Dx = dx.view(1, 1, K).expand(n, L, K) - c33_ind = level_bases.view(1, L, 1).expand(n, L, K) + \ - im_i * loc_per_level.view(1, L, 1).expand(n, L, K) + \ - (centers_inds[:, :, 1:2].expand(n, L, K) + Dy) * expand_Ws + \ - (centers_inds[:, :, 0:1].expand(n, L, K) + Dx) # n x L x K - - c33_mask = \ - ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) < expand_Hs) & \ - ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) >= 0) & \ - ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) < expand_Ws) & \ - ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) >= 0) - # TODO (Xingyi): think about better way to implement this - # Currently it hard codes the 3x3 region - c33_reg = reg.view(n, L, 1, 4).expand(n, L, K, 4).clone() - c33_reg[:, 
:, [0, 3, 6], 0] -= 1 - c33_reg[:, :, [0, 3, 6], 2] += 1 - c33_reg[:, :, [2, 5, 8], 0] += 1 - c33_reg[:, :, [2, 5, 8], 2] -= 1 - c33_reg[:, :, [0, 1, 2], 1] -= 1 - c33_reg[:, :, [0, 1, 2], 3] += 1 - c33_reg[:, :, [6, 7, 8], 1] += 1 - c33_reg[:, :, [6, 7, 8], 3] -= 1 - c33_mask = c33_mask & (c33_reg.min(dim=3)[0] >= 0) # n x L x K - c33_inds.append(c33_ind) - c33_masks.append(c33_mask) - c33_regs.append(c33_reg) - - if len(level_masks) > 0: - labels = torch.cat(labels, dim=0) - level_masks = torch.cat(level_masks, dim=0) - c33_inds = torch.cat(c33_inds, dim=0).long() - c33_regs = torch.cat(c33_regs, dim=0) - c33_masks = torch.cat(c33_masks, dim=0) - else: - labels = shapes_per_level.new_zeros((0)).long() - level_masks = shapes_per_level.new_zeros((0, L)).bool() - c33_inds = shapes_per_level.new_zeros((0, L, K)).long() - c33_regs = shapes_per_level.new_zeros((0, L, K, 4)).float() - c33_masks = shapes_per_level.new_zeros((0, L, K)).bool() - return labels, level_masks, c33_inds, c33_masks, c33_regs # N x L, N x L x K \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py deleted file mode 100755 index 57e0960a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py +++ /dev/null @@ -1,162 +0,0 @@ -import math -from typing import List -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.layers import ShapeSpec, get_norm -from detectron2.config import configurable -from ..layers.deform_conv import DFConv2d - -__all__ = ["CenterNetHead"] - -class Scale(nn.Module): - def __init__(self, init_value=1.0): - super(Scale, self).__init__() - self.scale = nn.Parameter(torch.FloatTensor([init_value])) - - def forward(self, input): - return input * self.scale - -class CenterNetHead(nn.Module): - @configurable - def __init__(self, - # input_shape: List[ShapeSpec], - in_channels, - num_levels, - *, - num_classes=80, - with_agn_hm=False, - only_proposal=False, - norm='GN', - num_cls_convs=4, - num_box_convs=4, - num_share_convs=0, - use_deformable=False, - prior_prob=0.01): - super().__init__() - self.num_classes = num_classes - self.with_agn_hm = with_agn_hm - self.only_proposal = only_proposal - self.out_kernel = 3 - - head_configs = { - "cls": (num_cls_convs if not self.only_proposal else 0, \ - use_deformable), - "bbox": (num_box_convs, use_deformable), - "share": (num_share_convs, use_deformable)} - - # in_channels = [s.channels for s in input_shape] - # assert len(set(in_channels)) == 1, \ - # "Each level must have the same channel!" 
- # in_channels = in_channels[0] - channels = { - 'cls': in_channels, - 'bbox': in_channels, - 'share': in_channels, - } - for head in head_configs: - tower = [] - num_convs, use_deformable = head_configs[head] - channel = channels[head] - for i in range(num_convs): - if use_deformable and i == num_convs - 1: - conv_func = DFConv2d - else: - conv_func = nn.Conv2d - tower.append(conv_func( - in_channels if i == 0 else channel, - channel, - kernel_size=3, stride=1, - padding=1, bias=True - )) - if norm == 'GN' and channel % 32 != 0: - tower.append(nn.GroupNorm(25, channel)) - elif norm != '': - tower.append(get_norm(norm, channel)) - tower.append(nn.ReLU()) - self.add_module('{}_tower'.format(head), - nn.Sequential(*tower)) - - self.bbox_pred = nn.Conv2d( - in_channels, 4, kernel_size=self.out_kernel, - stride=1, padding=self.out_kernel // 2 - ) - - self.scales = nn.ModuleList( - [Scale(init_value=1.0) for _ in range(num_levels)]) - - for modules in [ - self.cls_tower, self.bbox_tower, - self.share_tower, - self.bbox_pred, - ]: - for l in modules.modules(): - if isinstance(l, nn.Conv2d): - torch.nn.init.normal_(l.weight, std=0.01) - torch.nn.init.constant_(l.bias, 0) - - torch.nn.init.constant_(self.bbox_pred.bias, 8.) - prior_prob = prior_prob - bias_value = -math.log((1 - prior_prob) / prior_prob) - - if self.with_agn_hm: - self.agn_hm = nn.Conv2d( - in_channels, 1, kernel_size=self.out_kernel, - stride=1, padding=self.out_kernel // 2 - ) - torch.nn.init.constant_(self.agn_hm.bias, bias_value) - torch.nn.init.normal_(self.agn_hm.weight, std=0.01) - - if not self.only_proposal: - cls_kernel_size = self.out_kernel - self.cls_logits = nn.Conv2d( - in_channels, self.num_classes, - kernel_size=cls_kernel_size, - stride=1, - padding=cls_kernel_size // 2, - ) - - torch.nn.init.constant_(self.cls_logits.bias, bias_value) - torch.nn.init.normal_(self.cls_logits.weight, std=0.01) - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - # 'input_shape': input_shape, - 'in_channels': [s.channels for s in input_shape][0], - 'num_levels': len(input_shape), - 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, - 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, - 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, - 'norm': cfg.MODEL.CENTERNET.NORM, - 'num_cls_convs': cfg.MODEL.CENTERNET.NUM_CLS_CONVS, - 'num_box_convs': cfg.MODEL.CENTERNET.NUM_BOX_CONVS, - 'num_share_convs': cfg.MODEL.CENTERNET.NUM_SHARE_CONVS, - 'use_deformable': cfg.MODEL.CENTERNET.USE_DEFORMABLE, - 'prior_prob': cfg.MODEL.CENTERNET.PRIOR_PROB, - } - return ret - - def forward(self, x): - clss = [] - bbox_reg = [] - agn_hms = [] - for l, feature in enumerate(x): - feature = self.share_tower(feature) - cls_tower = self.cls_tower(feature) - bbox_tower = self.bbox_tower(feature) - if not self.only_proposal: - clss.append(self.cls_logits(cls_tower)) - else: - clss.append(None) - - if self.with_agn_hm: - agn_hms.append(self.agn_hm(bbox_tower)) - else: - agn_hms.append(None) - reg = self.bbox_pred(bbox_tower) - reg = self.scales[l](reg) - bbox_reg.append(F.relu(reg)) - - return clss, bbox_reg, agn_hms \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py deleted file mode 100755 index c9efa287..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py +++ /dev/null @@ -1,38 +0,0 @@ -import cv2 -import torch -from torch import nn -from detectron2.utils.comm import get_world_size -from detectron2.structures import pairwise_iou, Boxes -# from .data import CenterNetCrop -import torch.nn.functional as F -import numpy as np -from detectron2.structures import Boxes, ImageList, Instances - -__all__ = ['reduce_sum', '_transpose'] - -INF = 1000000000 - -def _transpose(training_targets, num_loc_list): - ''' - This function is used to transpose image first training targets to - level first ones - :return: level first training targets - ''' - for im_i in range(len(training_targets)): - training_targets[im_i] = torch.split( - training_targets[im_i], num_loc_list, dim=0) - - targets_level_first = [] - for targets_per_level in zip(*training_targets): - targets_level_first.append( - torch.cat(targets_per_level, dim=0)) - return targets_level_first - - -def reduce_sum(tensor): - world_size = get_world_size() - if world_size < 2: - return tensor - tensor = tensor.clone() - torch.distributed.all_reduce(tensor, op=torch.distributed.ReduceOp.SUM) - return tensor \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py deleted file mode 100755 index e5650c40..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py +++ /dev/null @@ -1,116 +0,0 @@ -import torch -from torch import nn - -from detectron2.layers import Conv2d - - -class _NewEmptyTensorOp(torch.autograd.Function): - @staticmethod - def forward(ctx, x, new_shape): - ctx.shape = x.shape - return x.new_empty(new_shape) - - @staticmethod - def backward(ctx, grad): - shape = ctx.shape - return _NewEmptyTensorOp.apply(grad, shape), None - - -class DFConv2d(nn.Module): - """Deformable convolutional layer""" - def __init__( - self, - in_channels, - out_channels, - with_modulated_dcn=True, - kernel_size=3, - stride=1, - groups=1, - dilation=1, - deformable_groups=1, - bias=False, - padding=None - ): - super(DFConv2d, self).__init__() - if isinstance(kernel_size, (list, tuple)): - assert isinstance(stride, (list, tuple)) - assert isinstance(dilation, (list, tuple)) - assert len(kernel_size) == 2 - assert len(stride) == 2 - assert len(dilation) == 2 - padding = ( - dilation[0] * (kernel_size[0] - 1) // 2, - dilation[1] * (kernel_size[1] - 1) // 2 - ) - offset_base_channels = kernel_size[0] * kernel_size[1] - else: - padding = dilation * (kernel_size - 1) // 2 - offset_base_channels = kernel_size * kernel_size - if with_modulated_dcn: - from detectron2.layers.deform_conv import ModulatedDeformConv - offset_channels = offset_base_channels * 3 # default: 27 - conv_block = ModulatedDeformConv - else: - from detectron2.layers.deform_conv import DeformConv - offset_channels = offset_base_channels * 2 # default: 18 - conv_block = DeformConv - self.offset = Conv2d( - in_channels, - deformable_groups * offset_channels, - kernel_size=kernel_size, - stride=stride, - padding=padding, - groups=1, - dilation=dilation - ) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - ''' - for l in [self.offset, ]: - nn.init.kaiming_uniform_(l.weight, a=1) - 
torch.nn.init.constant_(l.bias, 0.) - ''' - self.conv = conv_block( - in_channels, - out_channels, - kernel_size=kernel_size, - stride=stride, - padding=padding, - dilation=dilation, - groups=groups, - deformable_groups=deformable_groups, - bias=bias - ) - self.with_modulated_dcn = with_modulated_dcn - self.kernel_size = kernel_size - self.stride = stride - self.padding = padding - self.dilation = dilation - self.offset_split = offset_base_channels * deformable_groups * 2 - - def forward(self, x, return_offset=False): - if x.numel() > 0: - if not self.with_modulated_dcn: - offset_mask = self.offset(x) - x = self.conv(x, offset_mask) - else: - offset_mask = self.offset(x) - offset = offset_mask[:, :self.offset_split, :, :] - mask = offset_mask[:, self.offset_split:, :, :].sigmoid() - x = self.conv(x, offset, mask) - if return_offset: - return x, offset_mask - return x - # get output shape - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // d + 1 - for i, p, di, k, d in zip( - x.shape[-2:], - self.padding, - self.dilation, - self.kernel_size, - self.stride - ) - ] - output_shape = [x.shape[0], self.conv.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py deleted file mode 100755 index d4693b21..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py +++ /dev/null @@ -1,92 +0,0 @@ -import torch -from torch.nn import functional as F - -# TODO: merge these two function -def heatmap_focal_loss( - inputs, - targets, - pos_inds, - labels, - alpha: float = -1, - beta: float = 4, - gamma: float = 2, - reduction: str = 'sum', - sigmoid_clamp: float = 1e-4, - ignore_high_fp: float = -1., -): - """ - Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002. - Args: - inputs: (sum_l N*Hl*Wl, C) - targets: (sum_l N*Hl*Wl, C) - pos_inds: N - labels: N - Returns: - Loss tensor with the reduction option applied. - """ - pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) - neg_weights = torch.pow(1 - targets, beta) - pos_pred_pix = pred[pos_inds] # N x C - pos_pred = pos_pred_pix.gather(1, labels.unsqueeze(1)) - pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) - neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights - - if ignore_high_fp > 0: - not_high_fp = (pred < ignore_high_fp).float() - neg_loss = not_high_fp * neg_loss - - if reduction == "sum": - pos_loss = pos_loss.sum() - neg_loss = neg_loss.sum() - - if alpha >= 0: - pos_loss = alpha * pos_loss - neg_loss = (1 - alpha) * neg_loss - - return - pos_loss, - neg_loss - -heatmap_focal_loss_jit = torch.jit.script(heatmap_focal_loss) -# heatmap_focal_loss_jit = heatmap_focal_loss - -def binary_heatmap_focal_loss( - inputs, - targets, - pos_inds, - alpha: float = -1, - beta: float = 4, - gamma: float = 2, - sigmoid_clamp: float = 1e-4, - ignore_high_fp: float = -1., -): - """ - Args: - inputs: (sum_l N*Hl*Wl,) - targets: (sum_l N*Hl*Wl,) - pos_inds: N - Returns: - Loss tensor with the reduction option applied. 
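- Note (added): positive indices that fall outside pred are clamped to the - last valid location by the guard loop below before gathering positives.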
- """ - pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) - neg_weights = torch.pow(1 - targets, beta) - for i, ind in enumerate(pos_inds): - if ind >= pred.shape[0]: - print('%'*100) - print(pred.shape, ind, pos_inds) - pos_inds[i] = pred.shape[0] - 1 - pos_pred = pred[pos_inds] # N - pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) - neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights - if ignore_high_fp > 0: - not_high_fp = (pred < ignore_high_fp).float() - neg_loss = not_high_fp * neg_loss - - pos_loss = - pos_loss.sum() - neg_loss = - neg_loss.sum() - - if alpha >= 0: - pos_loss = alpha * pos_loss - neg_loss = (1 - alpha) * neg_loss - - return pos_loss, neg_loss - -# binary_heatmap_focal_loss_jit = torch.jit.script(binary_heatmap_focal_loss) \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py deleted file mode 100755 index 6a024646..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py +++ /dev/null @@ -1,121 +0,0 @@ -import torch -from torch import nn - - -class IOULoss(nn.Module): - def __init__(self, loc_loss_type='iou'): - super(IOULoss, self).__init__() - self.loc_loss_type = loc_loss_type - - def forward(self, pred, target, weight=None, reduction='sum'): - pred_left = pred[:, 0] - pred_top = pred[:, 1] - pred_right = pred[:, 2] - pred_bottom = pred[:, 3] - - target_left = target[:, 0] - target_top = target[:, 1] - target_right = target[:, 2] - target_bottom = target[:, 3] - - target_aera = (target_left + target_right) * \ - (target_top + target_bottom) - pred_aera = (pred_left + pred_right) * \ - (pred_top + pred_bottom) - - w_intersect = torch.min(pred_left, target_left) + \ - torch.min(pred_right, target_right) - h_intersect = torch.min(pred_bottom, target_bottom) + \ - torch.min(pred_top, target_top) - - g_w_intersect = torch.max(pred_left, target_left) + \ - torch.max(pred_right, target_right) - g_h_intersect = torch.max(pred_bottom, target_bottom) + \ - torch.max(pred_top, target_top) - ac_uion = g_w_intersect * g_h_intersect - - area_intersect = w_intersect * h_intersect - area_union = target_aera + pred_aera - area_intersect - - ious = (area_intersect + 1.0) / (area_union + 1.0) - gious = ious - (ac_uion - area_union) / ac_uion - if self.loc_loss_type == 'iou': - losses = -torch.log(ious) - elif self.loc_loss_type == 'linear_iou': - losses = 1 - ious - elif self.loc_loss_type == 'giou': - losses = 1 - gious - else: - raise NotImplementedError - - if weight is not None: - losses = losses * weight - else: - losses = losses - - if reduction == 'sum': - return losses.sum() - elif reduction == 'batch': - return losses.sum(dim=[1]) - elif reduction == 'none': - return losses - else: - raise NotImplementedError - - -def giou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Generalized Intersection over Union Loss (Hamid Rezatofighi et. al) - https://arxiv.org/abs/1902.09630 - Gradient-friendly IoU loss with an additional penalty that is non-zero when the - boxes do not overlap and scales with the size of their smallest enclosing box. - This loss is symmetric, so the boxes1 and boxes2 arguments are interchangeable. 
- Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. - eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsctk = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsctk[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - unionk = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsctk - iouk = intsctk / (unionk + eps) - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - - area_c = (xc2 - xc1) * (yc2 - yc1) - miouk = iouk - ((area_c - unionk) / (area_c + eps)) - - loss = 1 - miouk - - if reduction == "mean": - loss = loss.mean() - elif reduction == "sum": - loss = loss.sum() - - return loss \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py deleted file mode 100755 index 325d709a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py +++ /dev/null @@ -1,31 +0,0 @@ -from detectron2.layers import batched_nms - - -def ml_nms(boxlist, nms_thresh, max_proposals=-1, - score_field="scores", label_field="labels"): - """ - Performs non-maximum suppression on a boxlist, with scores specified - in a boxlist field via score_field. 
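- Note (added): batched_nms uses the labels as NMS group ids, so boxes of - different classes never suppress one another; class-agnostic proposals all - share the dummy label 0 created below.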
- Arguments: - boxlist(BoxList) - nms_thresh (float) - max_proposals (int): if > 0, then only the top max_proposals are kept - after non-maximum suppression - score_field (str) - """ - if nms_thresh <= 0: - return boxlist - if boxlist.has('pred_boxes'): - boxes = boxlist.pred_boxes.tensor - labels = boxlist.pred_classes - else: - boxes = boxlist.proposal_boxes.tensor - labels = boxlist.proposal_boxes.tensor.new_zeros( - len(boxlist.proposal_boxes.tensor)) - scores = boxlist.scores - - keep = batched_nms(boxes, scores, labels, nms_thresh) - if max_proposals > 0: - keep = keep[: max_proposals] - boxlist = boxlist[keep] - return boxlist diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py deleted file mode 100755 index b7525c7b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py +++ /dev/null @@ -1,69 +0,0 @@ -import math -import json -import numpy as np -import torch -from torch import nn - -from detectron2.modeling.meta_arch.build import META_ARCH_REGISTRY -from detectron2.modeling import build_backbone, build_proposal_generator -from detectron2.modeling import detector_postprocess -from detectron2.structures import ImageList - -@META_ARCH_REGISTRY.register() -class CenterNetDetector(nn.Module): - def __init__(self, cfg): - super().__init__() - self.mean, self.std = cfg.MODEL.PIXEL_MEAN, cfg.MODEL.PIXEL_STD - self.register_buffer("pixel_mean", torch.Tensor(cfg.MODEL.PIXEL_MEAN).view(-1, 1, 1)) - self.register_buffer("pixel_std", torch.Tensor(cfg.MODEL.PIXEL_STD).view(-1, 1, 1)) - - self.backbone = build_backbone(cfg) - self.proposal_generator = build_proposal_generator( - cfg, self.backbone.output_shape()) # TODO: change to a more precise name - - - def forward(self, batched_inputs): - if not self.training: - return self.inference(batched_inputs) - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - - _, proposal_losses = self.proposal_generator( - images, features, gt_instances) - return proposal_losses - - - @property - def device(self): - return self.pixel_mean.device - - - @torch.no_grad() - def inference(self, batched_inputs, do_postprocess=True): - images = self.preprocess_image(batched_inputs) - inp = images.tensor - features = self.backbone(inp) - proposals, _ = self.proposal_generator(images, features, None) - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - proposals, batched_inputs, images.image_sizes): - if do_postprocess: - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - else: - r = results_per_image - processed_results.append(r) - return processed_results - - def preprocess_image(self, batched_inputs): - """ - Normalize, pad and batch the input images. 
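- Padding follows self.backbone.size_divisibility, so the downsampled - feature maps keep integral spatial sizes across FPN levels. (Note added.)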
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - return images diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py deleted file mode 100755 index b6d95690..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py +++ /dev/null @@ -1,124 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Part of the code is from https://github.com/tztztztztz/eql.detectron2/blob/master/projects/EQL/eql/fast_rcnn.py -import logging -import math -import json -from typing import Dict, Union -import torch -from fvcore.nn import giou_loss, smooth_l1_loss -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Linear, ShapeSpec, batched_nms, cat, nonzero_tuple -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.structures import Boxes, Instances -from detectron2.utils.events import get_event_storage -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers -from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference -from detectron2.modeling.roi_heads.fast_rcnn import _log_classification_stats -from detectron2.utils.comm import get_world_size -from .fed_loss import load_class_freq, get_fed_loss_inds - -__all__ = ["CustomFastRCNNOutputLayers"] - -class CustomFastRCNNOutputLayers(FastRCNNOutputLayers): - def __init__( - self, - cfg, - input_shape: ShapeSpec, - **kwargs - ): - super().__init__(cfg, input_shape, **kwargs) - - self.cfg = cfg - - def losses(self, predictions, proposals): - """ - enable advanced loss - """ - scores, proposal_deltas = predictions - gt_classes = ( - cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0) - ) - num_classes = self.num_classes - _log_classification_stats(scores, gt_classes) - - if len(proposals): - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) # Nx4 - assert not proposal_boxes.requires_grad, "Proposals should not require gradients!" - gt_boxes = cat( - [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals], - dim=0, - ) - else: - proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device) - - loss_cls = self.softmax_cross_entropy_loss(scores, gt_classes) - return { - "loss_cls": loss_cls, - "loss_box_reg": self.box_reg_loss( - proposal_boxes, gt_boxes, proposal_deltas, gt_classes) - } - - - def sigmoid_cross_entropy_loss(self, pred_class_logits, gt_classes): - if pred_class_logits.numel() == 0: - return pred_class_logits.new_zeros([1])[0] # This is more robust than .sum() * 0. 
- - B = pred_class_logits.shape[0] - C = pred_class_logits.shape[1] - 1 - - target = pred_class_logits.new_zeros(B, C + 1) - target[range(len(gt_classes)), gt_classes] = 1 # B x (C + 1) - target = target[:, :C] # B x C - - weight = 1 - - cls_loss = F.binary_cross_entropy_with_logits( - pred_class_logits[:, :-1], target, reduction='none') # B x C - loss = torch.sum(cls_loss * weight) / B - return loss - - - def softmax_cross_entropy_loss(self, pred_class_logits, gt_classes): - """ - change _no_instance handling - """ - if pred_class_logits.numel() == 0: - return pred_class_logits.new_zeros([1])[0] - - loss = F.cross_entropy( - pred_class_logits, gt_classes, reduction="mean") - return loss - - - def inference(self, predictions, proposals): - """ - enable use proposal boxes - """ - boxes = self.predict_boxes(predictions, proposals) - scores = self.predict_probs(predictions, proposals) - if self.cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE: - proposal_scores = [p.get('objectness_logits') for p in proposals] - scores = [(s * ps[:, None]) ** 0.5 \ - for s, ps in zip(scores, proposal_scores)] - image_shapes = [x.image_size for x in proposals] - return fast_rcnn_inference( - boxes, - scores, - image_shapes, - self.test_score_thresh, - self.test_nms_thresh, - self.test_topk_per_image, - ) - - - def predict_probs(self, predictions, proposals): - """ - support sigmoid - """ - scores, _ = predictions - num_inst_per_image = [len(p) for p in proposals] - probs = F.softmax(scores, dim=-1) - return probs.split(num_inst_per_image, dim=0) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py deleted file mode 100755 index 90fadf1a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py +++ /dev/null @@ -1,185 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved -import numpy as np -import json -import math -import torch -from torch import nn -from torch.autograd.function import Function -from typing import Dict, List, Optional, Tuple, Union - -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference -from detectron2.modeling.roi_heads.roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads -from detectron2.modeling.roi_heads.cascade_rcnn import CascadeROIHeads -from detectron2.modeling.roi_heads.box_head import build_box_head -from .custom_fast_rcnn import CustomFastRCNNOutputLayers - - -@ROI_HEADS_REGISTRY.register() -class CustomROIHeads(StandardROIHeads): - @classmethod - def _init_box_head(self, cfg, input_shape): - ret = super()._init_box_head(cfg, input_shape) - del ret['box_predictor'] - ret['box_predictor'] = CustomFastRCNNOutputLayers( - cfg, ret['box_head'].output_shape) - self.debug = cfg.DEBUG - if self.debug: - self.debug_show_name = cfg.DEBUG_SHOW_NAME - self.save_debug = cfg.SAVE_DEBUG - self.vis_thresh = cfg.VIS_THRESH - self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - return ret - - def forward(self, images, features, proposals, targets=None): - """ - enable debug - """ - if not self.debug: - del images - if self.training: - assert targets - proposals = self.label_and_sample_proposals(proposals, targets) - del targets - - if self.training: - losses = self._forward_box(features, proposals) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - if self.debug: - from ..debug import debug_second_stage - denormalizer = lambda x: x * self.pixel_std + self.pixel_mean - debug_second_stage( - [denormalizer(images[0].clone())], - pred_instances, proposals=proposals, - debug_show_name=self.debug_show_name) - return pred_instances, {} - - -@ROI_HEADS_REGISTRY.register() -class CustomCascadeROIHeads(CascadeROIHeads): - @classmethod - def _init_box_head(self, cfg, input_shape): - self.mult_proposal_score = cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE - ret = super()._init_box_head(cfg, input_shape) - del ret['box_predictors'] - cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS - box_predictors = [] - for box_head, bbox_reg_weights in zip(ret['box_heads'], cascade_bbox_reg_weights): - box_predictors.append( - CustomFastRCNNOutputLayers( - cfg, box_head.output_shape, - box2box_transform=Box2BoxTransform(weights=bbox_reg_weights) - )) - ret['box_predictors'] = box_predictors - self.debug = cfg.DEBUG - if self.debug: - self.debug_show_name = cfg.DEBUG_SHOW_NAME - self.save_debug = cfg.SAVE_DEBUG - self.vis_thresh = cfg.VIS_THRESH - self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - return ret - - - def _forward_box(self, features, proposals, targets=None): - """ - Add mult proposal scores at testing - """ - if (not 
self.training) and self.mult_proposal_score: - if len(proposals) > 0 and proposals[0].has('scores'): - proposal_scores = [ - p.get('scores') for p in proposals] - else: - proposal_scores = [ - p.get('objectness_logits') for p in proposals] - - features = [features[f] for f in self.box_in_features] - head_outputs = [] # (predictor, predictions, proposals) - prev_pred_boxes = None - image_sizes = [x.image_size for x in proposals] - for k in range(self.num_cascade_stages): - if k > 0: - proposals = self._create_proposals_from_boxes(prev_pred_boxes, image_sizes) - if self.training: - proposals = self._match_and_label_boxes(proposals, k, targets) - predictions = self._run_stage(features, proposals, k) - prev_pred_boxes = self.box_predictor[k].predict_boxes(predictions, proposals) - head_outputs.append((self.box_predictor[k], predictions, proposals)) - - if self.training: - losses = {} - storage = get_event_storage() - for stage, (predictor, predictions, proposals) in enumerate(head_outputs): - with storage.name_scope("stage{}".format(stage)): - stage_losses = predictor.losses(predictions, proposals) - losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()}) - return losses - else: - # Each is a list[Tensor] of length #image. Each tensor is Ri x (K+1) - scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs] - scores = [ - sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages) - for scores_per_image in zip(*scores_per_stage) - ] - - if self.mult_proposal_score: - scores = [(s * ps[:, None]) ** 0.5 \ - for s, ps in zip(scores, proposal_scores)] - - predictor, predictions, proposals = head_outputs[-1] - boxes = predictor.predict_boxes(predictions, proposals) - pred_instances, _ = fast_rcnn_inference( - boxes, - scores, - image_sizes, - predictor.test_score_thresh, - predictor.test_nms_thresh, - predictor.test_topk_per_image, - ) - - return pred_instances - - def forward(self, images, features, proposals, targets=None): - ''' - enable debug - ''' - if not self.debug: - del images - if self.training: - proposals = self.label_and_sample_proposals(proposals, targets) - - if self.training: - losses = self._forward_box(features, proposals, targets) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - # import pdb; pdb.set_trace() - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - if self.debug: - from ..debug import debug_second_stage - denormalizer = lambda x: x * self.pixel_std + self.pixel_mean - debug_second_stage( - [denormalizer(x.clone()) for x in images], - pred_instances, proposals=proposals, - save_debug=self.save_debug, - debug_show_name=self.debug_show_name, - vis_thresh=self.vis_thresh) - return pred_instances, {} - - diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py deleted file mode 100755 index 290f0f07..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py +++ /dev/null @@ -1,31 +0,0 @@ -import torch -import json -import numpy as np -from torch.nn import functional as F - -def load_class_freq( - path='datasets/lvis/lvis_v1_train_cat_info.json', - freq_weight=0.5): - 
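- # (Comment added.) Reads per-class image counts from the LVIS category-info - # json and raises them to freq_weight, yielding the class-sampling weights - # used by the federated loss helper below.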
cat_info = json.load(open(path, 'r')) - cat_info = torch.tensor( - [c['image_count'] for c in sorted(cat_info, key=lambda x: x['id'])]) - freq_weight = cat_info.float() ** freq_weight - return freq_weight - -def get_fed_loss_inds( - gt_classes, num_sample_cats=50, C=1203, \ - weight=None, fed_cls_inds=-1): - appeared = torch.unique(gt_classes) # C' - prob = appeared.new_ones(C + 1).float() - prob[-1] = 0 - if len(appeared) < num_sample_cats: - if weight is not None: - prob[:C] = weight.float().clone() - prob[appeared] = 0 - if fed_cls_inds > 0: - prob[fed_cls_inds:] = 0 - more_appeared = torch.multinomial( - prob, num_sample_cats - len(appeared), - replacement=False) - appeared = torch.cat([appeared, more_appeared]) - return appeared \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md deleted file mode 100755 index 7a2a92b6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md +++ /dev/null @@ -1,73 +0,0 @@ -# MODEL_ZOO - -### Common settings and notes - -- Multi-scale training is used by default in all models. The results are all reported using single-scale testing. -- We report runtime on our local workstation with a Titan Xp GPU and a Titan RTX GPU. -- All models are trained on 8-GPU servers by default. The 1280 models are trained on 24G GPUs. Reducing the batch size with the linear learning-rate rule should be fine. -- All models can be downloaded directly from [Google drive](https://drive.google.com/drive/folders/1eae1cTX8tvIaCeof36sBgxrXEXALYlf-?usp=sharing). - - -## COCO - -### CenterNet - -| Model | val mAP | FPS (Titan Xp/ Titan RTX) | links | -|-------------------------------------------|---------|---------|-----------| -| CenterNet-S4_DLA_8x | 42.5 | 50 / 71 |[config](../configs/CenterNet-S4_DLA_8x.yaml)/[model](https://drive.google.com/file/d/1lNBhVHnZAEBRD66MFaHjm5Ij6Z4KYrJq/view?usp=sharing)| -| CenterNet-FPN_R50_1x | 40.2 | 20 / 24 |[config](../configs/CenterNet-FPN_R50_1x.yaml)/[model](https://drive.google.com/file/d/1rVG1YTthMXvutC6jr9KoE2DthT5-jhGj/view?usp=sharing)| - -#### Note - -- `CenterNet-S4_DLA_8x` is a re-implemented version of the original CenterNet (stride 4), with several changes, including - - Using top-left-right-bottom box encoding and GIoU Loss; adding regression loss to the center 3x3 region. - - Adding extra positive pixels for the heatmap loss: locations inside the center 3x3 region whose regression loss is small. - - Using heavier crop augmentation (EfficientDet-style crop ratio 0.1-2), and removing color augmentations. - - Using standard NMS instead of max pooling. - - Using a RetinaNet-style optimizer (SGD), learning-rate rule (0.01 per batch size of 16), and schedule (8x12 epochs). -- `CenterNet-FPN_R50_1x` is a (new) FPN version of CenterNet. It includes the changes above, and assigns objects to FPN levels based on a fixed size range. The model is trained with standard short-edge 640-800 multi-scale training for 12 epochs (1x).
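The configs linked in the table above are standard detectron2 project configs, so a downloaded checkpoint can be exercised with the stock `DefaultPredictor`. A minimal sketch (an illustration, not part of the removed file), assuming this project's `add_centernet_config` helper is importable and that the config/weight paths point at files fetched via the links above:

```
import cv2
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor
from centernet.config import add_centernet_config  # project-local config hook

cfg = get_cfg()
add_centernet_config(cfg)                        # register MODEL.CENTERNET keys
cfg.merge_from_file("configs/CenterNet-FPN_R50_1x.yaml")
cfg.MODEL.WEIGHTS = "CenterNet-FPN_R50_1x.pth"   # hypothetical local checkpoint path

predictor = DefaultPredictor(cfg)
outputs = predictor(cv2.imread("input.jpg"))     # BGR image of any size
print(outputs["instances"].pred_boxes, outputs["instances"].scores)
```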
- - -### CenterNet2 - -| Model | val mAP | FPS (Titan Xp/ Titan RTX) | links | -|-------------------------------------------|---------|---------|-----------| -| CenterNet2-F_R50_1x | 41.7 | 22 / 27 |[config](../configs/CenterNet2-F_R50_1x.yaml)/[model](X)| -| CenterNet2_R50_1x | 42.9 | 18 / 24 |[config](../configs/CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1Osu1J_sskt_1FaGdfJKa4vd2N71TWS9W/view?usp=sharing)| -| CenterNet2_X101-DCN_2x | 49.9 | 6 / 8 |[config](../configs/CenterNet2_X101-DCN_2x.yaml)/[model](https://drive.google.com/file/d/1IHgpUHVJWpvMuFUUetgKWsw27pRNN2oK/view?usp=sharing)| -| CenterNet2_DLA-BiFPN-P3_4x | 43.8 | 40 / 50|[config](../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml)/[model](https://drive.google.com/file/d/12GUNlDW9RmOs40UEMSiiUsk5QK_lpGsE/view?usp=sharing)| -| CenterNet2_DLA-BiFPN-P3_24x | 45.6 | 40 / 50 |[config](../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml)/[model](https://drive.google.com/file/d/15ZES1ySxubDPzKsHPA7pYg8o_Vwmf-Mb/view?usp=sharing)| -| CenterNet2_R2-101-DCN_896_4x | 51.2 | 9 / 13 |[config](../configs/CenterNet2_R2-101-DCN_896_4x.yaml)/[model](https://drive.google.com/file/d/1S7_GE8ZDQBWuLEfKHkxzeF3KBsxsbABg/view?usp=sharing)| -| CenterNet2_R2-101-DCN-BiFPN_1280_4x | 52.9 | 6 / 8 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml)/[model](https://drive.google.com/file/d/14EBHNMagBCNTQjOXcHoZwLYIi2lFIm7F/view?usp=sharing)| -| CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST | 56.1 | 3 / 5 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml)/[model](https://drive.google.com/file/d/11ww9VlOi_nhpdsU_vBAecSxBU0dR_JzW/view?usp=sharing)| -| CenterNet2_DLA-BiFPN-P5_640_24x_ST | 49.2 | 33 / 38 |[config](../configs/CenterNet2_DLA-BiFPN-P5_640_24x_ST.yaml)/[model](https://drive.google.com/file/d/1qsHp2HrM1u8WrtBzF5S0oCoLMz-B40wk/view?usp=sharing)| - -#### Note - -- `CenterNet2-F_R50_1x` uses Faster RCNN as the second stage. All other CenterNet2 models use Cascade RCNN as the second stage. -- `CenterNet2_DLA-BiFPN-P3_4x` follows the same training setting as [realtime-FCOS](https://github.com/aim-uofa/AdelaiDet/blob/master/configs/FCOS-Detection/README.md). -- `CenterNet2_DLA-BiFPN-P3_24x` is trained by repeating the `4x` schedule (starting from learning rate 0.01) 6 times. -- R2 means [Res2Net](https://github.com/Res2Net/Res2Net-detectron2) backbone. To train Res2Net models, you need to download the ImageNet pre-trained weight [here](https://github.com/Res2Net/Res2Net-detectron2) and place it in `output/r2_101.pkl`. -- The last 4 models in the table are trained with the EfficientDet-style resize-and-crop augmentation, instead of the default random resizing short edge in detectron2. We found this trains faster (per-iteration) and gives better performance under a long schedule. -- `_ST` means using [self-training](https://arxiv.org/abs/2006.06882) using pseudo-labels produced by [Scaled-YOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4) on COCO unlabeled images, with a hard score threshold 0.5. Our processed pseudo-labels can be downloaded [here](https://drive.google.com/file/d/1LMBjtHhLp6dYf6MjwEQmzCLWQLkmWPpw/view?usp=sharing). -- `CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST` finetunes from `CenterNet2_R2-101-DCN-BiFPN_1280_4x` for an additional `4x` schedule with the self-training data. It is trained under `1280x1280` but tested under `1560x1560`. 
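The `LVIS_CenterNet2_R50_Fed_1x` entry in the next section relies on the federated loss whose helper was deleted earlier in this patch. For intuition, here is a toy run of that helper; a sketch assuming `get_fed_loss_inds` is still importable from the module path shown above:

```
import torch
from centernet.modeling.roi_heads.fed_loss import get_fed_loss_inds

# Two distinct classes (0 and 7) appear among three ground-truth boxes.
gt_classes = torch.tensor([0, 0, 7])

# The helper keeps every appeared class and randomly pads the set with absent
# classes until num_sample_cats classes take part in the loss; the background
# bin (index C) is assigned probability 0 and is never sampled.
inds = get_fed_loss_inds(gt_classes, num_sample_cats=5, C=10)
assert len(inds) == 5 and {0, 7}.issubset(set(inds.tolist()))
```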
- -## LVIS v1 - -| Model | val mAP box | links | -|-------------------------------------------|--------------|-----------| -| LVIS_CenterNet2_R50_1x | 26.5 |[config](../configs/LVIS_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1gT9e-tNw8uzEBaCadQuoOOP2TEYa4kKP/view?usp=sharing)| -| LVIS_CenterNet2_R50_Fed_1x | 28.3 |[config](../configs/LVIS_CenterNet2_R50_Fed_1x.yaml)/[model](https://drive.google.com/file/d/1a9UjheMCKax0qAKEwPVpq2ZHN6vpqJv8/view?usp=sharing)| - -- The models are trained with repeat-factor sampling. -- `LVIS_CenterNet2_R50_Fed_1x` is CenterNet2 with our federated loss. Check our Appendix D of our [paper](https://arxiv.org/abs/2103.07461) or our [technical report at LVIS challenge](https://www.lvisdataset.org/assets/challenge_reports/2020/CenterNet2.pdf) for references. - -## Objects365 - -| Model | val mAP| links | -|-------------------------------------------|---------|-----------| -| O365_CenterNet2_R50_1x | 22.6 |[config](../configs/O365_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/18fG6xGchAlpNp5sx8RAtwadGkS-gdIBU/view?usp=sharing)| - -#### Note -- Objects365 dataset can be downloaded [here](https://www.objects365.org/overview.html). -- The model is trained with class-aware sampling. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml deleted file mode 100755 index bef3dc10..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml +++ /dev/null @@ -1,28 +0,0 @@ -MODEL: - META_ARCHITECTURE: "CenterNetDetector" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - BACKBONE: - NAME: "build_p67_resnet_fpn_backbone" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 - OUT_FEATURES: ["res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.01 - STEPS: (60000, 80000) - MAX_ITER: 90000 - CHECKPOINT_PERIOD: 1000000000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml deleted file mode 100755 index 68937231..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml +++ /dev/null @@ -1,56 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - BACKBONE: - NAME: "build_p67_resnet_fpn_backbone" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 - OUT_FEATURES: ["res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] - ROI_HEADS: - NAME: CustomCascadeROIHeads - IN_FEATURES: ["p3", "p4", "p5", "p6", "p7"] - IOU_THRESHOLDS: [0.6] - NMS_THRESH_TEST: 0.7 - ROI_BOX_CASCADE_HEAD: - IOUS: [0.6, 0.7, 0.8] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - CLS_AGNOSTIC_BBOX_REG: True - MULT_PROPOSAL_SCORE: True - CENTERNET: - REG_WEIGHT: 1. 
- NOT_NORM_REG: True - ONLY_PROPOSAL: True - WITH_AGN_HM: True - INFERENCE_TH: 0.0001 - PRE_NMS_TOPK_TRAIN: 4000 - POST_NMS_TOPK_TRAIN: 2000 - PRE_NMS_TOPK_TEST: 1000 - POST_NMS_TOPK_TEST: 256 - NMS_TH_TRAIN: 0.9 - NMS_TH_TEST: 0.9 - POS_WEIGHT: 0.5 - NEG_WEIGHT: 0.5 - IGNORE_HIGH_FP: 0.85 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 - CHECKPOINT_PERIOD: 1000000000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml deleted file mode 100755 index 7e01be7e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml +++ /dev/null @@ -1,40 +0,0 @@ -MODEL: - META_ARCHITECTURE: "CenterNetDetector" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - PIXEL_STD: [57.375, 57.120, 58.395] - BACKBONE: - NAME: "build_dla_backbone" - DLA: - NORM: "BN" - CENTERNET: - IN_FEATURES: ["dla2"] - FPN_STRIDES: [4] - SOI: [[0, 1000000]] - NUM_CLS_CONVS: 1 - NUM_BOX_CONVS: 1 - REG_WEIGHT: 1. - MORE_POS: True - HM_FOCAL_ALPHA: 0.25 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 90000 - BASE_LR: 0.04 - IMS_PER_BATCH: 64 - WEIGHT_DECAY: 0.0001 - CHECKPOINT_PERIOD: 1000000 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -TEST: - EVAL_PERIOD: 7500 -DATALOADER: - NUM_WORKERS: 8 -OUTPUT_DIR: "output/CenterNet2/auto" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml deleted file mode 100755 index 6ea7d9b7..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml +++ /dev/null @@ -1,4 +0,0 @@ -_BASE_: "Base-CenterNet-FPN.yaml" -MODEL: - CENTERNET: - MORE_POS: True \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml deleted file mode 100755 index b3d88be9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "Base_S4_DLA.yaml" -SOLVER: - MAX_ITER: 90000 - BASE_LR: 0.08 - IMS_PER_BATCH: 128 \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml deleted file mode 100755 index c40eecc1..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml +++ /dev/null @@ -1,4 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NAME: CustomROIHeads \ No newline at end of file diff --git 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml deleted file mode 100755 index d7491447..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p35_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 3 - NUM_BIFPN: 4 - DLA: - NUM_LAYERS: 34 - NORM: "SyncBN" - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] - ROI_HEADS: - IN_FEATURES: ["p3", "p4", "p5"] - CENTERNET: - POST_NMS_TOPK_TEST: 128 - FPN_STRIDES: [8, 16, 32] - IN_FEATURES: ['p3', 'p4', 'p5'] - SOI: [[0, 64], [48, 192], [128, 1000000]] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (300000, 340000) - MAX_ITER: 360000 - CHECKPOINT_PERIOD: 100000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 -INPUT: - MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) - MAX_SIZE_TRAIN: 900 - MAX_SIZE_TEST: 736 - MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml deleted file mode 100755 index d7491447..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p35_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 3 - NUM_BIFPN: 4 - DLA: - NUM_LAYERS: 34 - NORM: "SyncBN" - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] - ROI_HEADS: - IN_FEATURES: ["p3", "p4", "p5"] - CENTERNET: - POST_NMS_TOPK_TEST: 128 - FPN_STRIDES: [8, 16, 32] - IN_FEATURES: ['p3', 'p4', 'p5'] - SOI: [[0, 64], [48, 192], [128, 1000000]] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (300000, 340000) - MAX_ITER: 360000 - CHECKPOINT_PERIOD: 100000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 -INPUT: - MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) - MAX_SIZE_TRAIN: 900 - MAX_SIZE_TEST: 736 - MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml deleted file mode 100755 index 80413a62..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml +++ /dev/null @@ -1,29 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 - CHECKPOINT_PERIOD: 90000 -TEST: - 
EVAL_PERIOD: 7500 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml deleted file mode 100755 index 8813b39c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml +++ /dev/null @@ -1,30 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -TEST: - EVAL_PERIOD: 7500 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -DATASETS: - TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml deleted file mode 100755 index f94f1358..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml +++ /dev/null @@ -1,30 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -TEST: - EVAL_PERIOD: 7500 -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -DATASETS: - TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml deleted file mode 100755 index e07574b3..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml +++ /dev/null @@ -1,32 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_res2net_bifpn_backbone" - BIFPN: - NUM_BIFPN: 7 - OUT_CHANNELS: 288 - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -INPUT: - FORMAT: RGB -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 60000 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -INPUT: - 
CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 1280 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml deleted file mode 100755 index 81fcab09..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_res2net_bifpn_backbone" - BIFPN: - NUM_BIFPN: 7 - OUT_CHANNELS: 288 - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 7500 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -DATASETS: - TRAIN: "('coco_2017_train', 'coco_un_yolov4_55_0.5')" -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 1280 - TEST_SIZE: 1560 - TEST_INPUT_TYPE: 'square' - \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml deleted file mode 100755 index fd6c49ee..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml +++ /dev/null @@ -1,29 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p67_res2net_fpn_backbone" - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -INPUT: - FORMAT: RGB -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 600000 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -INPUT: - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 896 \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml deleted file mode 100755 index 9dcdf5b8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml +++ /dev/null @@ -1 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml deleted file mode 100755 index 009c6808..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml +++ /dev/null @@ -1,22 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - CENTERNET: - USE_DEFORMABLE: True - WEIGHTS: 
"detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -SOLVER: - STEPS: (120000, 160000) - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 40000 -INPUT: - MIN_SIZE_TRAIN: (480, 960) - MIN_SIZE_TRAIN_SAMPLING: "range" diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml deleted file mode 100755 index 912e8925..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml +++ /dev/null @@ -1,17 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.02 - NMS_THRESH_TEST: 0.5 - CENTERNET: - NUM_CLASSES: 1203 - -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 -TEST: - DETECTIONS_PER_IMAGE: 300 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml deleted file mode 100755 index d6b6c823..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.02 - NMS_THRESH_TEST: 0.5 - CENTERNET: - NUM_CLASSES: 1203 - ROI_BOX_HEAD: - USE_SIGMOID_CE: True - USE_FED_LOSS: True -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 -TEST: - DETECTIONS_PER_IMAGE: 300 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml deleted file mode 100755 index 514e52cd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 365 - CENTERNET: - NUM_CLASSES: 365 -DATASETS: - TRAIN: ("objects365_train",) - TEST: ("objects365_val",) -DATALOADER: - SAMPLER_TRAIN: "ClassAwareSampler" -TEST: - DETECTIONS_PER_IMAGE: 300 \ No newline at end of file diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml deleted file mode 100755 index c400e92c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml +++ /dev/null @@ -1,42 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - MASK_ON: True - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 - ROI_HEADS: - NUM_CLASSES: 10 - IN_FEATURES: ["dla2"] - BACKBONE: - NAME: 
"build_dla_backbone" - DLA: - NORM: "BN" - CENTERNET: - IN_FEATURES: ["dla2"] - FPN_STRIDES: [4] - SOI: [[0, 1000000]] - NUM_CLS_CONVS: 1 - NUM_BOX_CONVS: 1 - REG_WEIGHT: 1. - MORE_POS: True - HM_FOCAL_ALPHA: 0.25 - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] -SOLVER: - MAX_ITER: 180000 - STEPS: (120000, 160000) - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 - MASK_FORMAT: bitmask -DATASETS: - TRAIN: ("nuimages_train",) - TEST: ("nuimages_val",) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py deleted file mode 100755 index 5213faf4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/demo.py +++ /dev/null @@ -1,185 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import argparse -import glob -import multiprocessing as mp -import os -import time -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -from predictor import VisualizationDemo -from centernet.config import add_centernet_config -# constants -WINDOW_NAME = "CenterNet2 detections" - -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer -from detectron2.data import MetadataCatalog - -def setup_cfg(args): - # load config from file and command-line arguments - cfg = get_cfg() - add_centernet_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - # Set score_threshold for builtin models - cfg.MODEL.RETINANET.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args.confidence_threshold - if cfg.MODEL.META_ARCHITECTURE in ['ProposalNetwork', 'CenterNetDetector']: - cfg.MODEL.CENTERNET.INFERENCE_TH = args.confidence_threshold - cfg.MODEL.CENTERNET.NMS_TH = cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args.confidence_threshold - cfg.freeze() - return cfg - - -def get_parser(): - parser = argparse.ArgumentParser(description="Detectron2 demo for builtin models") - parser.add_argument( - "--config-file", - default="configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml", - metavar="FILE", - help="path to config file", - ) - parser.add_argument("--webcam", action="store_true", help="Take inputs from webcam.") - parser.add_argument("--video-input", help="Path to video file.") - parser.add_argument("--input", nargs="+", help="A list of space separated input images") - parser.add_argument( - "--output", - help="A file or directory to save output visualizations. 
" - "If not given, will show output in an OpenCV window.", - ) - - parser.add_argument( - "--confidence-threshold", - type=float, - default=0.3, - help="Minimum score for instance predictions to be shown", - ) - parser.add_argument( - "--opts", - help="Modify config options using the command-line 'KEY VALUE' pairs", - default=[], - nargs=argparse.REMAINDER, - ) - return parser - - -if __name__ == "__main__": - mp.set_start_method("spawn", force=True) - args = get_parser().parse_args() - logger = setup_logger() - logger.info("Arguments: " + str(args)) - - cfg = setup_cfg(args) - - demo = VisualizationDemo(cfg) - output_file = None - if args.input: - if len(args.input) == 1: - args.input = glob.glob(os.path.expanduser(args.input[0])) - files = os.listdir(args.input[0]) - args.input = [args.input[0] + x for x in files] - assert args.input, "The input path(s) was not found" - visualizer = VideoVisualizer( - MetadataCatalog.get( - cfg.DATASETS.TEST[0] if len(cfg.DATASETS.TEST) else "__unused" - ), - instance_mode=ColorMode.IMAGE) - for path in tqdm.tqdm(args.input, disable=not args.output): - # use PIL, to be consistent with evaluation - img = read_image(path, format="BGR") - start_time = time.time() - predictions, visualized_output = demo.run_on_image( - img, visualizer=visualizer) - if 'instances' in predictions: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["instances"]), time.time() - start_time - ) - ) - else: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["proposals"]), time.time() - start_time - ) - ) - - if args.output: - if os.path.isdir(args.output): - assert os.path.isdir(args.output), args.output - out_filename = os.path.join(args.output, os.path.basename(path)) - visualized_output.save(out_filename) - else: - # assert len(args.input) == 1, "Please specify a directory with args.output" - # out_filename = args.output - if output_file is None: - width = visualized_output.get_image().shape[1] - height = visualized_output.get_image().shape[0] - frames_per_second = 15 - output_file = cv2.VideoWriter( - filename=args.output, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*"x264"), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - output_file.write(visualized_output.get_image()[:, :, ::-1]) - else: - # cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, visualized_output.get_image()[:, :, ::-1]) - if cv2.waitKey(1 ) == 27: - break # esc to quit - elif args.webcam: - assert args.input is None, "Cannot have both --input and --webcam!" 
- cam = cv2.VideoCapture(0) - for vis in tqdm.tqdm(demo.run_on_video(cam)): - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, vis) - if cv2.waitKey(1) == 27: - break # esc to quit - cv2.destroyAllWindows() - elif args.video_input: - video = cv2.VideoCapture(args.video_input) - width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH)) - height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT)) - frames_per_second = 15 # video.get(cv2.CAP_PROP_FPS) - num_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT)) - basename = os.path.basename(args.video_input) - - if args.output: - if os.path.isdir(args.output): - output_fname = os.path.join(args.output, basename) - output_fname = os.path.splitext(output_fname)[0] + ".mkv" - else: - output_fname = args.output - # assert not os.path.isfile(output_fname), output_fname - output_file = cv2.VideoWriter( - filename=output_fname, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*"x264"), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - assert os.path.isfile(args.video_input) - for vis_frame in tqdm.tqdm(demo.run_on_video(video), total=num_frames): - if args.output: - output_file.write(vis_frame) - - cv2.namedWindow(basename, cv2.WINDOW_NORMAL) - cv2.imshow(basename, vis_frame) - if cv2.waitKey(1) == 27: - break # esc to quit - video.release() - if args.output: - output_file.release() - else: - cv2.destroyAllWindows() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py deleted file mode 100755 index 8a036bde..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/predictor.py +++ /dev/null @@ -1,243 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import atexit -import bisect -import multiprocessing as mp -from collections import deque -import cv2 -import torch - -from detectron2.data import MetadataCatalog -from detectron2.engine.defaults import DefaultPredictor -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer - - -class VisualizationDemo(object): - def __init__(self, cfg, instance_mode=ColorMode.IMAGE, parallel=False): - """ - Args: - cfg (CfgNode): - instance_mode (ColorMode): - parallel (bool): whether to run the model in different processes from visualization. - Useful since the visualization logic can be slow. - """ - self.metadata = MetadataCatalog.get( - cfg.DATASETS.TRAIN[0] if len(cfg.DATASETS.TRAIN) else "__unused" - ) - self.cpu_device = torch.device("cpu") - self.instance_mode = instance_mode - - self.parallel = parallel - if parallel: - num_gpu = torch.cuda.device_count() - self.predictor = AsyncPredictor(cfg, num_gpus=num_gpu) - else: - self.predictor = DefaultPredictor(cfg) - - def run_on_image(self, image, visualizer=None): - """ - Args: - image (np.ndarray): an image of shape (H, W, C) (in BGR order). - This is the format used by OpenCV. - - Returns: - predictions (dict): the output of the model. - vis_output (VisImage): the visualized image output. - """ - vis_output = None - predictions = self.predictor(image) - # Convert image from OpenCV BGR format to Matplotlib RGB format. 
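        # (Reversing the channel axis with [:, :, ::-1] converts BGR to RGB as
        # a cheap view, without copying the array.)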
- image = image[:, :, ::-1] - use_video_vis = True - if visualizer is None: - use_video_vis = False - visualizer = Visualizer(image, self.metadata, instance_mode=self.instance_mode) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_output = visualizer.draw_panoptic_seg_predictions( - panoptic_seg.to(self.cpu_device), segments_info - ) - else: - if "sem_seg" in predictions: - vis_output = visualizer.draw_sem_seg( - predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - if "instances" in predictions: - instances = predictions["instances"].to(self.cpu_device) - if use_video_vis: - vis_output = visualizer.draw_instance_predictions( - image, predictions=instances) - else: - vis_output = visualizer.draw_instance_predictions(predictions=instances) - elif "proposals" in predictions: - instances = predictions["proposals"].to(self.cpu_device) - instances.pred_boxes = instances.proposal_boxes - instances.scores = instances.objectness_logits - instances.pred_classes[:] = -1 - if use_video_vis: - vis_output = visualizer.draw_instance_predictions( - image, predictions=instances) - else: - vis_output = visualizer.draw_instance_predictions(predictions=instances) - - return predictions, vis_output - - def _frame_from_video(self, video): - while video.isOpened(): - success, frame = video.read() - if success: - yield frame - else: - break - - def run_on_video(self, video): - """ - Visualizes predictions on frames of the input video. - - Args: - video (cv2.VideoCapture): a :class:`VideoCapture` object, whose source can be - either a webcam or a video file. - - Yields: - ndarray: BGR visualizations of each video frame. - """ - video_visualizer = VideoVisualizer(self.metadata, self.instance_mode) - - def process_predictions(frame, predictions): - frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_frame = video_visualizer.draw_panoptic_seg_predictions( - frame, panoptic_seg.to(self.cpu_device), segments_info - ) - elif "instances" in predictions: - predictions = predictions["instances"].to(self.cpu_device) - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - elif "sem_seg" in predictions: - vis_frame = video_visualizer.draw_sem_seg( - frame, predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - elif "proposals" in predictions: - predictions = predictions["proposals"].to(self.cpu_device) - predictions.pred_boxes = predictions.proposal_boxes - predictions.scores = predictions.objectness_logits - predictions.pred_classes[:] = -1 - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - - # Converts Matplotlib RGB format to OpenCV BGR format - vis_frame = cv2.cvtColor(vis_frame.get_image(), cv2.COLOR_RGB2BGR) - return vis_frame - - frame_gen = self._frame_from_video(video) - if self.parallel: - buffer_size = self.predictor.default_buffer_size - - frame_data = deque() - - for cnt, frame in enumerate(frame_gen): - frame_data.append(frame) - self.predictor.put(frame) - - if cnt >= buffer_size: - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - - while len(frame_data): - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - else: - for frame in frame_gen: - yield process_predictions(frame, self.predictor(frame)) - - -class AsyncPredictor: - """ - A predictor that runs the model 
asynchronously, possibly on >1 GPUs. - Because rendering the visualization takes a considerable amount of time, - this helps improve throughput when rendering videos. - """ - - class _StopToken: - pass - - class _PredictWorker(mp.Process): - def __init__(self, cfg, task_queue, result_queue): - self.cfg = cfg - self.task_queue = task_queue - self.result_queue = result_queue - super().__init__() - - def run(self): - predictor = DefaultPredictor(self.cfg) - - while True: - task = self.task_queue.get() - if isinstance(task, AsyncPredictor._StopToken): - break - idx, data = task - result = predictor(data) - self.result_queue.put((idx, result)) - - def __init__(self, cfg, num_gpus: int = 1): - """ - Args: - cfg (CfgNode): - num_gpus (int): if 0, will run on CPU - """ - num_workers = max(num_gpus, 1) - self.task_queue = mp.Queue(maxsize=num_workers * 3) - self.result_queue = mp.Queue(maxsize=num_workers * 3) - self.procs = [] - for gpuid in range(max(num_gpus, 1)): - cfg = cfg.clone() - cfg.defrost() - cfg.MODEL.DEVICE = "cuda:{}".format(gpuid) if num_gpus > 0 else "cpu" - self.procs.append( - AsyncPredictor._PredictWorker(cfg, self.task_queue, self.result_queue) - ) - - self.put_idx = 0 - self.get_idx = 0 - self.result_rank = [] - self.result_data = [] - - for p in self.procs: - p.start() - atexit.register(self.shutdown) - - def put(self, image): - self.put_idx += 1 - self.task_queue.put((self.put_idx, image)) - - def get(self): - self.get_idx += 1 # the index needed for this request - if len(self.result_rank) and self.result_rank[0] == self.get_idx: - res = self.result_data[0] - del self.result_data[0], self.result_rank[0] - return res - - while True: - # make sure the results are returned in the correct order - idx, res = self.result_queue.get() - if idx == self.get_idx: - return res - insert = bisect.bisect(self.result_rank, idx) - self.result_rank.insert(insert, idx) - self.result_data.insert(insert, res) - - def __len__(self): - return self.put_idx - self.get_idx - - def __call__(self, image): - self.put(image) - return self.get() - - def shutdown(self): - for _ in self.procs: - self.task_queue.put(AsyncPredictor._StopToken()) - - @property - def default_buffer_size(self): - return len(self.procs) * 5 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py deleted file mode 100755 index d903efde..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/projects/CenterNet2/train_net.py +++ /dev/null @@ -1,228 +0,0 @@ -import logging -import os -from collections import OrderedDict -import torch -from torch.nn.parallel import DistributedDataParallel -import time -import datetime -import json - -from fvcore.common.timer import Timer -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer, PeriodicCheckpointer -from detectron2.config import get_cfg -from detectron2.data import ( - MetadataCatalog, - build_detection_test_loader, -) -from detectron2.engine import default_argument_parser, default_setup, launch - -from detectron2.evaluation import ( - COCOEvaluator, - LVISEvaluator, - inference_on_dataset, - print_csv_format, -) -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils.events import ( - CommonMetricPrinter, - EventStorage, - JSONWriter, - TensorboardXWriter, -) -from detectron2.modeling.test_time_augmentation 
import GeneralizedRCNNWithTTA -from detectron2.data.dataset_mapper import DatasetMapper -from detectron2.data.build import build_detection_train_loader - -from centernet.config import add_centernet_config -from centernet.data.custom_build_augmentation import build_custom_augmentation - -logger = logging.getLogger("detectron2") - -def do_test(cfg, model): - results = OrderedDict() - for dataset_name in cfg.DATASETS.TEST: - mapper = None if cfg.INPUT.TEST_INPUT_TYPE == 'default' else \ - DatasetMapper( - cfg, False, augmentations=build_custom_augmentation(cfg, False)) - data_loader = build_detection_test_loader(cfg, dataset_name, mapper=mapper) - output_folder = os.path.join( - cfg.OUTPUT_DIR, "inference_{}".format(dataset_name)) - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - - if evaluator_type == "lvis": - evaluator = LVISEvaluator(dataset_name, cfg, True, output_folder) - elif evaluator_type == 'coco': - evaluator = COCOEvaluator(dataset_name, cfg, True, output_folder) - else: - assert 0, evaluator_type - - results[dataset_name] = inference_on_dataset( - model, data_loader, evaluator) - if comm.is_main_process(): - logger.info("Evaluation results for {} in csv format:".format( - dataset_name)) - print_csv_format(results[dataset_name]) - if len(results) == 1: - results = list(results.values())[0] - return results - -def do_train(cfg, model, resume=False): - model.train() - optimizer = build_optimizer(cfg, model) - scheduler = build_lr_scheduler(cfg, optimizer) - - checkpointer = DetectionCheckpointer( - model, cfg.OUTPUT_DIR, optimizer=optimizer, scheduler=scheduler - ) - - start_iter = ( - checkpointer.resume_or_load( - cfg.MODEL.WEIGHTS, resume=resume, - ).get("iteration", -1) + 1 - ) - if cfg.SOLVER.RESET_ITER: - logger.info('Reset loaded iteration. 
Start training from iteration 0.') - start_iter = 0 - max_iter = cfg.SOLVER.MAX_ITER if cfg.SOLVER.TRAIN_ITER < 0 else cfg.SOLVER.TRAIN_ITER - - periodic_checkpointer = PeriodicCheckpointer( - checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD, max_iter=max_iter - ) - - writers = ( - [ - CommonMetricPrinter(max_iter), - JSONWriter(os.path.join(cfg.OUTPUT_DIR, "metrics.json")), - TensorboardXWriter(cfg.OUTPUT_DIR), - ] - if comm.is_main_process() - else [] - ) - - - mapper = DatasetMapper(cfg, True) if cfg.INPUT.CUSTOM_AUG == '' else \ - DatasetMapper(cfg, True, augmentations=build_custom_augmentation(cfg, True)) - if cfg.DATALOADER.SAMPLER_TRAIN in ['TrainingSampler', 'RepeatFactorTrainingSampler']: - data_loader = build_detection_train_loader(cfg, mapper=mapper) - else: - from centernet.data.custom_dataset_dataloader import build_custom_train_loader - data_loader = build_custom_train_loader(cfg, mapper=mapper) - - - logger.info("Starting training from iteration {}".format(start_iter)) - with EventStorage(start_iter) as storage: - step_timer = Timer() - data_timer = Timer() - start_time = time.perf_counter() - for data, iteration in zip(data_loader, range(start_iter, max_iter)): - data_time = data_timer.seconds() - storage.put_scalars(data_time=data_time) - step_timer.reset() - iteration = iteration + 1 - storage.step() - loss_dict = model(data) - - losses = sum( - loss for k, loss in loss_dict.items()) - assert torch.isfinite(losses).all(), loss_dict - - loss_dict_reduced = {k: v.item() \ - for k, v in comm.reduce_dict(loss_dict).items()} - losses_reduced = sum(loss for loss in loss_dict_reduced.values()) - if comm.is_main_process(): - storage.put_scalars( - total_loss=losses_reduced, **loss_dict_reduced) - - optimizer.zero_grad() - losses.backward() - optimizer.step() - - storage.put_scalar( - "lr", optimizer.param_groups[0]["lr"], smoothing_hint=False) - - step_time = step_timer.seconds() - storage.put_scalars(time=step_time) - data_timer.reset() - scheduler.step() - - if ( - cfg.TEST.EVAL_PERIOD > 0 - and iteration % cfg.TEST.EVAL_PERIOD == 0 - and iteration != max_iter - ): - do_test(cfg, model) - comm.synchronize() - - if iteration - start_iter > 5 and \ - (iteration % 20 == 0 or iteration == max_iter): - for writer in writers: - writer.write() - periodic_checkpointer.step(iteration) - - total_time = time.perf_counter() - start_time - logger.info( - "Total training time: {}".format( - str(datetime.timedelta(seconds=int(total_time))))) - -def setup(args): - """ - Create configs and perform basic setups. 
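    When OUTPUT_DIR contains '/auto', the code below swaps it for the config
    file's base name, so each config gets its own output directory.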
- """ - cfg = get_cfg() - add_centernet_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - if '/auto' in cfg.OUTPUT_DIR: - file_name = os.path.basename(args.config_file)[:-5] - cfg.OUTPUT_DIR = cfg.OUTPUT_DIR.replace('/auto', '/{}'.format(file_name)) - logger.info('OUTPUT_DIR: {}'.format(cfg.OUTPUT_DIR)) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -def main(args): - cfg = setup(args) - - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if args.eval_only: - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - if cfg.TEST.AUG.ENABLED: - logger.info("Running inference with test-time augmentation ...") - model = GeneralizedRCNNWithTTA(cfg, model, batch_size=1) - - return do_test(cfg, model) - - distributed = comm.get_world_size() > 1 - if distributed: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False, - find_unused_parameters=True - ) - - do_train(cfg, model, resume=args.resume) - return do_test(cfg, model) - - -if __name__ == "__main__": - args = default_argument_parser() - args.add_argument('--manual_device', default='') - args = args.parse_args() - if args.manual_device != '': - os.environ['CUDA_VISIBLE_DEVICES'] = args.manual_device - args.dist_url = 'tcp://127.0.0.1:{}'.format( - torch.randint(11111, 60000, (1,))[0].item()) - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.cfg b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.cfg deleted file mode 100755 index 2a1ccd4e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.cfg +++ /dev/null @@ -1,26 +0,0 @@ -[isort] -line_length=100 -multi_line_output=3 -include_trailing_comma=True -known_standard_library=numpy,setuptools,mock -skip=./datasets,docs -skip_glob=*/__init__.py,**/configs/**,tests/config/** -known_myself=detectron2 -known_third_party=fvcore,matplotlib,cv2,torch,torchvision,PIL,pycocotools,yacs,termcolor,cityscapesscripts,tabulate,tqdm,scipy,lvis,psutil,pkg_resources,caffe2,onnx,panopticapi,black,isort,av,iopath,omegaconf,hydra,yaml,pydoc,submitit,cloudpickle -no_lines_before=STDLIB,THIRDPARTY -sections=FUTURE,STDLIB,THIRDPARTY,myself,FIRSTPARTY,LOCALFOLDER -default_section=FIRSTPARTY - -[mypy] -python_version=3.6 -ignore_missing_imports = True -warn_unused_configs = True -disallow_untyped_defs = True -check_untyped_defs = True -warn_unused_ignores = True -warn_redundant_casts = True -show_column_numbers = True -follow_imports = silent -allow_redefinition = True -; Require all functions to be annotated -disallow_incomplete_defs = True diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.py deleted file mode 100755 index 50a5e23e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/setup.py +++ /dev/null @@ -1,206 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. 
- -import glob -import os -import shutil -from os import path -from setuptools import find_packages, setup -from typing import List -import torch -from torch.utils.cpp_extension import CUDA_HOME, CppExtension, CUDAExtension - -torch_ver = [int(x) for x in torch.__version__.split(".")[:2]] -assert torch_ver >= [1, 8], "Requires PyTorch >= 1.8" - - -def get_version(): - init_py_path = path.join(path.abspath(path.dirname(__file__)), "detectron2", "__init__.py") - init_py = open(init_py_path, "r").readlines() - version_line = [l.strip() for l in init_py if l.startswith("__version__")][0] - version = version_line.split("=")[-1].strip().strip("'\"") - - # The following is used to build release packages. - # Users should never use it. - suffix = os.getenv("D2_VERSION_SUFFIX", "") - version = version + suffix - if os.getenv("BUILD_NIGHTLY", "0") == "1": - from datetime import datetime - - date_str = datetime.today().strftime("%y%m%d") - version = version + ".dev" + date_str - - new_init_py = [l for l in init_py if not l.startswith("__version__")] - new_init_py.append('__version__ = "{}"\n'.format(version)) - with open(init_py_path, "w") as f: - f.write("".join(new_init_py)) - return version - - -def get_extensions(): - this_dir = path.dirname(path.abspath(__file__)) - extensions_dir = path.join(this_dir, "detectron2", "layers", "csrc") - - main_source = path.join(extensions_dir, "vision.cpp") - sources = glob.glob(path.join(extensions_dir, "**", "*.cpp")) - - from torch.utils.cpp_extension import ROCM_HOME - - is_rocm_pytorch = ( - True if ((torch.version.hip is not None) and (ROCM_HOME is not None)) else False - ) - if is_rocm_pytorch: - assert torch_ver >= [1, 8], "ROCM support requires PyTorch >= 1.8!" - - # common code between cuda and rocm platforms, for hipify version [1,0,0] and later. - source_cuda = glob.glob(path.join(extensions_dir, "**", "*.cu")) + glob.glob( - path.join(extensions_dir, "*.cu") - ) - sources = [main_source] + sources - - extension = CppExtension - - extra_compile_args = {"cxx": []} - define_macros = [] - - if (torch.cuda.is_available() and ((CUDA_HOME is not None) or is_rocm_pytorch)) or os.getenv( - "FORCE_CUDA", "0" - ) == "1": - extension = CUDAExtension - sources += source_cuda - - if not is_rocm_pytorch: - define_macros += [("WITH_CUDA", None)] - extra_compile_args["nvcc"] = [ - "-O3", - "-DCUDA_HAS_FP16=1", - "-D__CUDA_NO_HALF_OPERATORS__", - "-D__CUDA_NO_HALF_CONVERSIONS__", - "-D__CUDA_NO_HALF2_OPERATORS__", - ] - else: - define_macros += [("WITH_HIP", None)] - extra_compile_args["nvcc"] = [] - - if torch_ver < [1, 7]: - # supported by https://github.com/pytorch/pytorch/pull/43931 - CC = os.environ.get("CC", None) - if CC is not None: - extra_compile_args["nvcc"].append("-ccbin={}".format(CC)) - - include_dirs = [extensions_dir] - - ext_modules = [ - extension( - "detectron2._C", - sources, - include_dirs=include_dirs, - define_macros=define_macros, - extra_compile_args=extra_compile_args, - ) - ] - - return ext_modules - - -def get_model_zoo_configs() -> List[str]: - """ - Return a list of configs to include in package for model zoo. Copy over these configs inside - detectron2/model_zoo. - """ - - # Use absolute paths while symlinking. - source_configs_dir = path.join(path.dirname(path.realpath(__file__)), "configs") - destination = path.join( - path.dirname(path.realpath(__file__)), "detectron2", "model_zoo", "configs" - ) - # Symlink the config directory inside package to have a cleaner pip install. - - # Remove stale symlink/directory from a previous build. 
- if path.exists(source_configs_dir): - if path.islink(destination): - os.unlink(destination) - elif path.isdir(destination): - shutil.rmtree(destination) - - if not path.exists(destination): - try: - os.symlink(source_configs_dir, destination) - except OSError: - # Fall back to copying if symlink fails: e.g., on Windows. - shutil.copytree(source_configs_dir, destination) - - config_paths = glob.glob("configs/**/*.yaml", recursive=True) + glob.glob( - "configs/**/*.py", recursive=True - ) - return config_paths - - -# For projects that are relatively small and provide features that are very close -# to detectron2's core functionalities, we install them under detectron2.projects -PROJECTS = { - -} - -setup( - name="detectron2", - version=get_version(), - author="FAIR", - url="https://github.com/facebookresearch/detectron2", - description="Detectron2 is FAIR's next-generation research " - "platform for object detection and segmentation.", - packages=find_packages(exclude=("configs", "tests*")) + list(PROJECTS.keys()), - package_dir=PROJECTS, - package_data={"detectron2.model_zoo": get_model_zoo_configs()}, - python_requires=">=3.6", - install_requires=[ - # These dependencies are not pure-python. - # In general, avoid adding more dependencies like them because they are not - # guaranteed to be installable by `pip install` on all platforms. - # To tell if a package is pure-python, go to https://pypi.org/project/{name}/#files - "Pillow>=7.1", # or use pillow-simd for better performance - "matplotlib", # TODO move it to optional after we add opencv visualization - "pycocotools>=2.0.2", # corresponds to https://github.com/ppwwyyxx/cocoapi - # Do not add opencv here. Just like pytorch, users should install - # opencv themselves, preferably by the OS's package manager, or by - # choosing the proper pypi package name at https://github.com/skvark/opencv-python - # The following are pure-python dependencies that should be easily installable - "termcolor>=1.1", - "yacs>=0.1.8", - "tabulate", - "cloudpickle", - "tqdm>4.29.0", - "tensorboard", - # Lock version of fvcore/iopath because they may have breaking changes - # NOTE: when updating fvcore/iopath version, make sure fvcore depends - # on compatible version of iopath. - "fvcore>=0.1.5,<0.1.6", # required like this to make it pip installable - "iopath>=0.1.7,<0.1.10", - "future", # used by caffe2 - "pydot", # used to save caffe2 SVGs - "dataclasses; python_version<'3.7'", - "omegaconf>=2.1", - "hydra-core>=1.1", - "black==21.4b2", - # If a new dependency is required at import time (in addition to runtime), it - # probably needs to exist in docs/requirements.txt, or as a mock in docs/conf.py - ], - extras_require={ - # optional dependencies, required by some features - "all": [ - "shapely", - "pygments>=2.2", - "psutil", - "panopticapi @ https://github.com/cocodataset/panopticapi/archive/master.zip", - ], - # dev dependencies. 
Install them by `pip install 'detectron2[dev]'` - "dev": [ - "flake8==3.8.1", - "isort==4.3.21", - "flake8-bugbear", - "flake8-comprehensions", - ], - }, - ext_modules=get_extensions(), - cmdclass={"build_ext": torch.utils.cpp_extension.BuildExtension}, -) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/README.md deleted file mode 100755 index f5603840..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/README.md +++ /dev/null @@ -1,9 +0,0 @@ -## Unit Tests - -To run the unittests, do: -``` -cd detectron2 -python -m unittest discover -v -s ./tests -``` - -There are also end-to-end inference & training tests, in [dev/run_*_tests.sh](../dev). diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/__init__.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/__init__.py deleted file mode 100755 index 9020c2df..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py deleted file mode 100755 index a9399551..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_a.py +++ /dev/null @@ -1,3 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -dir1a_str = "base_a_1" -dir1a_dict = {"a": 1, "b": 2} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py deleted file mode 100755 index 2dcb54cb..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/dir1/dir1_b.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.config import LazyConfig - -# equivalent to relative import -dir1a_str, dir1a_dict = LazyConfig.load_rel("dir1_a.py", ("dir1a_str", "dir1a_dict")) - -dir1b_str = dir1a_str + "_from_b" -dir1b_dict = dir1a_dict - -# Every import is a reload: not modified by other config files -assert dir1a_dict.a == 1 diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/root_cfg.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/root_cfg.py deleted file mode 100755 index 33d1d4bd..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/root_cfg.py +++ /dev/null @@ -1,14 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from itertools import count - -from detectron2.config import LazyCall as L - -from .dir1.dir1_a import dir1a_dict, dir1a_str - -dir1a_dict.a = "modified" - -# modification above won't affect future imports -from .dir1.dir1_b import dir1b_dict, dir1b_str - - -lazyobj = L(count)(x=dir1a_str, y=dir1b_str) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py deleted file mode 100755 index b76f71b9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_instantiate_config.py +++ /dev/null @@ -1,100 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
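# (A minimal sketch of the lazy-construction pattern exercised below, using a
# toy class that is not part of this file:
#
#     conf = L(Doubler)(k=2)    # records the target and args; builds nothing yet
#     conf.k = 5                # still plain, editable config
#     obj = instantiate(conf)   # only now is Doubler(k=5) constructed
# )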
- -import os -import tempfile -import unittest -import yaml -from omegaconf import OmegaConf -from omegaconf import __version__ as oc_version -from dataclasses import dataclass - -from detectron2.config import instantiate, LazyCall as L -from detectron2.layers import ShapeSpec - -OC_VERSION = tuple(int(x) for x in oc_version.split(".")[:2]) - - -class TestClass: - def __init__(self, int_arg, list_arg=None, dict_arg=None, extra_arg=None): - self.int_arg = int_arg - self.list_arg = list_arg - self.dict_arg = dict_arg - self.extra_arg = extra_arg - - def __call__(self, call_arg): - return call_arg + self.int_arg - - -@dataclass -class TestDataClass: - x: int - y: str - - -@unittest.skipIf(OC_VERSION < (2, 1), "omegaconf version too old") -class TestConstruction(unittest.TestCase): - def test_basic_construct(self): - objconf = L(TestClass)( - int_arg=3, - list_arg=[10], - dict_arg={}, - extra_arg=L(TestClass)(int_arg=4, list_arg="${..list_arg}"), - ) - - obj = instantiate(objconf) - self.assertIsInstance(obj, TestClass) - self.assertEqual(obj.int_arg, 3) - self.assertEqual(obj.extra_arg.int_arg, 4) - self.assertEqual(obj.extra_arg.list_arg, obj.list_arg) - - objconf.extra_arg.list_arg = [5] - obj = instantiate(objconf) - self.assertIsInstance(obj, TestClass) - self.assertEqual(obj.extra_arg.list_arg, [5]) - - def test_instantiate_other_obj(self): - # do nothing for other obj - self.assertEqual(instantiate(5), 5) - x = [3, 4, 5] - self.assertEqual(instantiate(x), x) - x = TestClass(1) - self.assertIs(instantiate(x), x) - x = {"xx": "yy"} - self.assertIs(instantiate(x), x) - - def test_instantiate_lazy_target(self): - # _target_ is result of instantiate - objconf = L(L(len)(int_arg=3))(call_arg=4) - objconf._target_._target_ = TestClass - self.assertEqual(instantiate(objconf), 7) - - def test_instantiate_lst(self): - lst = [1, 2, L(TestClass)(int_arg=1)] - x = L(TestClass)(int_arg=lst) # list as an argument should be recursively instantiated - x = instantiate(x).int_arg - self.assertEqual(x[:2], [1, 2]) - self.assertIsInstance(x[2], TestClass) - self.assertEqual(x[2].int_arg, 1) - - def test_instantiate_namedtuple(self): - x = L(TestClass)(int_arg=ShapeSpec(channels=1, width=3)) - # test serialization - with tempfile.TemporaryDirectory() as d: - fname = os.path.join(d, "d2_test.yaml") - OmegaConf.save(x, fname) - with open(fname) as f: - x = yaml.unsafe_load(f) - - x = instantiate(x) - self.assertIsInstance(x.int_arg, ShapeSpec) - self.assertEqual(x.int_arg.channels, 1) - - def test_bad_lazycall(self): - with self.assertRaises(Exception): - L(3) - - def test_instantiate_dataclass(self): - a = L(TestDataClass)(x=1, y="s") - a = instantiate(a) - self.assertEqual(a.x, 1) - self.assertEqual(a.y, "s") diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py deleted file mode 100755 index 6ff5b6dc..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_lazy_config.py +++ /dev/null @@ -1,79 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
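# (These tests load the root_cfg.py defined above and check that LazyConfig
# round-trips through YAML, applies dotted-key overrides, and can dump a config
# back to Python source via to_py.)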
-import os -import unittest -import tempfile -from itertools import count - -from detectron2.config import LazyConfig, LazyCall as L -from omegaconf import DictConfig - - -class TestLazyPythonConfig(unittest.TestCase): - def setUp(self): - self.root_filename = os.path.join(os.path.dirname(__file__), "root_cfg.py") - - def test_load(self): - cfg = LazyConfig.load(self.root_filename) - - self.assertEqual(cfg.dir1a_dict.a, "modified") - self.assertEqual(cfg.dir1b_dict.a, 1) - self.assertEqual(cfg.lazyobj.x, "base_a_1") - - cfg.lazyobj.x = "new_x" - # reload - cfg = LazyConfig.load(self.root_filename) - self.assertEqual(cfg.lazyobj.x, "base_a_1") - - def test_save_load(self): - cfg = LazyConfig.load(self.root_filename) - with tempfile.TemporaryDirectory(prefix="detectron2") as d: - fname = os.path.join(d, "test_config.yaml") - LazyConfig.save(cfg, fname) - cfg2 = LazyConfig.load(fname) - - self.assertEqual(cfg2.lazyobj._target_, "itertools.count") - self.assertEqual(cfg.lazyobj._target_, count) - cfg2.lazyobj.pop("_target_") - cfg.lazyobj.pop("_target_") - # the rest are equal - self.assertEqual(cfg, cfg2) - - def test_failed_save(self): - cfg = DictConfig({"x": lambda: 3}, flags={"allow_objects": True}) - with tempfile.TemporaryDirectory(prefix="detectron2") as d: - fname = os.path.join(d, "test_config.yaml") - LazyConfig.save(cfg, fname) - self.assertTrue(os.path.exists(fname)) - self.assertTrue(os.path.exists(fname + ".pkl")) - - def test_overrides(self): - cfg = LazyConfig.load(self.root_filename) - LazyConfig.apply_overrides(cfg, ["lazyobj.x=123", 'dir1b_dict.a="123"']) - self.assertEqual(cfg.dir1b_dict.a, "123") - self.assertEqual(cfg.lazyobj.x, 123) - - def test_invalid_overrides(self): - cfg = LazyConfig.load(self.root_filename) - with self.assertRaises(KeyError): - LazyConfig.apply_overrides(cfg, ["lazyobj.x.xxx=123"]) - - def test_to_py(self): - cfg = LazyConfig.load(self.root_filename) - cfg.lazyobj.x = {"a": 1, "b": 2, "c": L(count)(x={"r": "a", "s": 2.4, "t": [1, 2, 3, "z"]})} - cfg.list = ["a", 1, "b", 3.2] - py_str = LazyConfig.to_py(cfg) - expected = """cfg.dir1a_dict.a = "modified" -cfg.dir1a_dict.b = 2 -cfg.dir1b_dict.a = 1 -cfg.dir1b_dict.b = 2 -cfg.lazyobj = itertools.count( - x={ - "a": 1, - "b": 2, - "c": itertools.count(x={"r": "a", "s": 2.4, "t": [1, 2, 3, "z"]}), - }, - y="base_a_1_from_b", -) -cfg.list = ["a", 1, "b", 3.2] -""" - self.assertEqual(py_str, expected) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py deleted file mode 100755 index 01dd6955..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/config/test_yacs_config.py +++ /dev/null @@ -1,270 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. 
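# (Covered below: automatic upgrade/downgrade of versioned yacs configs, and
# the @configurable decorator, which lets the same __init__ accept either
# explicit keyword arguments or a CfgNode routed through from_config.)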
- - -import os -import tempfile -import unittest -import torch -from omegaconf import OmegaConf - -from detectron2 import model_zoo -from detectron2.config import configurable, downgrade_config, get_cfg, upgrade_config -from detectron2.layers import ShapeSpec -from detectron2.modeling import build_model - -_V0_CFG = """ -MODEL: - RPN_HEAD: - NAME: "TEST" -VERSION: 0 -""" - -_V1_CFG = """ -MODEL: - WEIGHT: "/path/to/weight" -""" - - -class TestConfigVersioning(unittest.TestCase): - def test_upgrade_downgrade_consistency(self): - cfg = get_cfg() - # check that custom is preserved - cfg.USER_CUSTOM = 1 - - down = downgrade_config(cfg, to_version=0) - up = upgrade_config(down) - self.assertTrue(up == cfg) - - def _merge_cfg_str(self, cfg, merge_str): - f = tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) - try: - f.write(merge_str) - f.close() - cfg.merge_from_file(f.name) - finally: - os.remove(f.name) - return cfg - - def test_auto_upgrade(self): - cfg = get_cfg() - latest_ver = cfg.VERSION - cfg.USER_CUSTOM = 1 - - self._merge_cfg_str(cfg, _V0_CFG) - - self.assertEqual(cfg.MODEL.RPN.HEAD_NAME, "TEST") - self.assertEqual(cfg.VERSION, latest_ver) - - def test_guess_v1(self): - cfg = get_cfg() - latest_ver = cfg.VERSION - self._merge_cfg_str(cfg, _V1_CFG) - self.assertEqual(cfg.VERSION, latest_ver) - - -class _TestClassA(torch.nn.Module): - @configurable - def __init__(self, arg1, arg2, arg3=3): - super().__init__() - self.arg1 = arg1 - self.arg2 = arg2 - self.arg3 = arg3 - assert arg1 == 1 - assert arg2 == 2 - assert arg3 == 3 - - @classmethod - def from_config(cls, cfg): - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - return args - - -class _TestClassB(_TestClassA): - @configurable - def __init__(self, input_shape, arg1, arg2, arg3=3): - """ - Doc of _TestClassB - """ - assert input_shape == "shape" - super().__init__(arg1, arg2, arg3) - - @classmethod - def from_config(cls, cfg, input_shape): # test extra positional arg in from_config - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - args["input_shape"] = input_shape - return args - - -class _LegacySubClass(_TestClassB): - # an old subclass written in cfg style - def __init__(self, cfg, input_shape, arg4=4): - super().__init__(cfg, input_shape) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _NewSubClassNewInit(_TestClassB): - # test new subclass with a new __init__ - @configurable - def __init__(self, input_shape, arg4=4, **kwargs): - super().__init__(input_shape, **kwargs) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _LegacySubClassNotCfg(_TestClassB): - # an old subclass written in cfg style, but argument is not called "cfg" - def __init__(self, config, input_shape): - super().__init__(config, input_shape) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _TestClassC(_TestClassB): - @classmethod - def from_config(cls, cfg, input_shape, **kwargs): # test extra kwarg overwrite - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - args["input_shape"] = input_shape - args.update(kwargs) - return args - - -class _TestClassD(_TestClassA): - @configurable - def __init__(self, input_shape: ShapeSpec, arg1: int, arg2, arg3=3): - assert input_shape == "shape" - super().__init__(arg1, arg2, arg3) - - # _TestClassA.from_config does not have input_shape args. 
- # Test whether input_shape will be forwarded to __init__ - - -@configurable(from_config=lambda cfg, arg2: {"arg1": cfg.ARG1, "arg2": arg2, "arg3": cfg.ARG3}) -def _test_func(arg1, arg2=2, arg3=3, arg4=4): - return arg1, arg2, arg3, arg4 - - -class TestConfigurable(unittest.TestCase): - def testInitWithArgs(self): - _ = _TestClassA(arg1=1, arg2=2, arg3=3) - _ = _TestClassB("shape", arg1=1, arg2=2) - _ = _TestClassC("shape", arg1=1, arg2=2) - _ = _TestClassD("shape", arg1=1, arg2=2, arg3=3) - - def testPatchedAttr(self): - self.assertTrue("Doc" in _TestClassB.__init__.__doc__) - self.assertEqual(_TestClassD.__init__.__annotations__["arg1"], int) - - def testInitWithCfg(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 2 - cfg.ARG3 = 3 - _ = _TestClassA(cfg) - _ = _TestClassB(cfg, input_shape="shape") - _ = _TestClassC(cfg, input_shape="shape") - _ = _TestClassD(cfg, input_shape="shape") - _ = _LegacySubClass(cfg, input_shape="shape") - _ = _NewSubClassNewInit(cfg, input_shape="shape") - _ = _LegacySubClassNotCfg(cfg, input_shape="shape") - with self.assertRaises(TypeError): - # disallow forwarding positional args to __init__ since it's prone to errors - _ = _TestClassD(cfg, "shape") - - # call with kwargs instead - _ = _TestClassA(cfg=cfg) - _ = _TestClassB(cfg=cfg, input_shape="shape") - _ = _TestClassC(cfg=cfg, input_shape="shape") - _ = _TestClassD(cfg=cfg, input_shape="shape") - _ = _LegacySubClass(cfg=cfg, input_shape="shape") - _ = _NewSubClassNewInit(cfg=cfg, input_shape="shape") - _ = _LegacySubClassNotCfg(config=cfg, input_shape="shape") - - def testInitWithCfgOverwrite(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 999 # wrong config - with self.assertRaises(AssertionError): - _ = _TestClassA(cfg, arg3=3) - - # overwrite arg2 with correct config later: - _ = _TestClassA(cfg, arg2=2, arg3=3) - _ = _TestClassB(cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassC(cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassD(cfg, input_shape="shape", arg2=2, arg3=3) - - # call with kwargs cfg=cfg instead - _ = _TestClassA(cfg=cfg, arg2=2, arg3=3) - _ = _TestClassB(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassC(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassD(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - - def testInitWithCfgWrongArgs(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 2 - with self.assertRaises(TypeError): - _ = _TestClassB(cfg, "shape", not_exist=1) - with self.assertRaises(TypeError): - _ = _TestClassC(cfg, "shape", not_exist=1) - with self.assertRaises(TypeError): - _ = _TestClassD(cfg, "shape", not_exist=1) - - def testBadClass(self): - class _BadClass1: - @configurable - def __init__(self, a=1, b=2): - pass - - class _BadClass2: - @configurable - def __init__(self, a=1, b=2): - pass - - def from_config(self, cfg): # noqa - pass - - class _BadClass3: - @configurable - def __init__(self, a=1, b=2): - pass - - # bad name: must be cfg - @classmethod - def from_config(cls, config): # noqa - pass - - with self.assertRaises(AttributeError): - _ = _BadClass1(a=1) - - with self.assertRaises(TypeError): - _ = _BadClass2(a=1) - - with self.assertRaises(TypeError): - _ = _BadClass3(get_cfg()) - - def testFuncWithCfg(self): - cfg = get_cfg() - cfg.ARG1 = 10 - cfg.ARG3 = 30 - - self.assertEqual(_test_func(1), (1, 2, 3, 4)) - with self.assertRaises(TypeError): - _test_func(cfg) - self.assertEqual(_test_func(cfg, arg2=2), (10, 2, 30, 4)) - self.assertEqual(_test_func(cfg, arg1=100, arg2=20), (100, 20, 30, 4)) - 
self.assertEqual(_test_func(cfg, arg1=100, arg2=20, arg4=40), (100, 20, 30, 40)) - - self.assertTrue(callable(_test_func.from_config)) - - def testOmegaConf(self): - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - cfg = OmegaConf.create(cfg.dump()) - if not torch.cuda.is_available(): - cfg.MODEL.DEVICE = "cpu" - # test that a model can be built with omegaconf config as well - build_model(cfg) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco.py deleted file mode 100755 index caabead5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco.py +++ /dev/null @@ -1,139 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import json -import numpy as np -import os -import tempfile -import unittest -import pycocotools.mask as mask_util - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.data.datasets.coco import convert_to_coco_dict, load_coco_json -from detectron2.structures import BoxMode - - -def make_mask(): - """ - Makes a donut shaped binary mask. - """ - H = 100 - W = 100 - mask = np.zeros([H, W], dtype=np.uint8) - for x in range(W): - for y in range(H): - d = np.linalg.norm(np.array([W, H]) / 2 - np.array([x, y])) - if d > 10 and d < 20: - mask[y, x] = 1 - return mask - - -def uncompressed_rle(mask): - l = mask.flatten(order="F").tolist() - counts = [] - p = False - cnt = 0 - for i in l: - if i == p: - cnt += 1 - else: - counts.append(cnt) - p = i - cnt = 1 - counts.append(cnt) - return {"counts": counts, "size": [mask.shape[0], mask.shape[1]]} - - -def make_dataset_dicts(mask, compressed: bool = True): - """ - Returns a list of dicts that represents a single COCO data point for - object detection. The single instance given by `mask` is represented by - RLE, either compressed or uncompressed. - """ - record = {} - record["file_name"] = "test" - record["image_id"] = 0 - record["height"] = mask.shape[0] - record["width"] = mask.shape[1] - - y, x = np.nonzero(mask) - if compressed: - segmentation = mask_util.encode(np.asarray(mask, order="F")) - else: - segmentation = uncompressed_rle(mask) - min_x = np.min(x) - max_x = np.max(x) - min_y = np.min(y) - max_y = np.max(y) - obj = { - "bbox": [min_x, min_y, max_x, max_y], - "bbox_mode": BoxMode.XYXY_ABS, - "category_id": 0, - "iscrowd": 0, - "segmentation": segmentation, - } - record["annotations"] = [obj] - return [record] - - -class TestRLEToJson(unittest.TestCase): - def test(self): - # Make a dummy dataset. - mask = make_mask() - DatasetCatalog.register("test_dataset", lambda: make_dataset_dicts(mask)) - MetadataCatalog.get("test_dataset").set(thing_classes=["test_label"]) - - # Dump to json. - json_dict = convert_to_coco_dict("test_dataset") - with tempfile.TemporaryDirectory() as tmpdir: - json_file_name = os.path.join(tmpdir, "test.json") - with open(json_file_name, "w") as f: - json.dump(json_dict, f) - # Load from json. - dicts = load_coco_json(json_file_name, "") - - # Check the loaded mask matches the original. 
- anno = dicts[0]["annotations"][0] - loaded_mask = mask_util.decode(anno["segmentation"]) - self.assertTrue(np.array_equal(loaded_mask, mask)) - DatasetCatalog.pop("test_dataset") - MetadataCatalog.pop("test_dataset") - - def test_uncompressed_RLE(self): - mask = make_mask() - rle = mask_util.encode(np.asarray(mask, order="F")) - uncompressed = uncompressed_rle(mask) - compressed = mask_util.frPyObjects(uncompressed, *rle["size"]) - self.assertEqual(rle, compressed) - - -class TestConvertCOCO(unittest.TestCase): - @staticmethod - def generate_data(): - record = { - "file_name": "test", - "image_id": 0, - "height": 100, - "width": 100, - "annotations": [ - { - "bbox": [10, 10, 10, 10, 5], - "bbox_mode": BoxMode.XYWHA_ABS, - "category_id": 0, - "iscrowd": 0, - }, - { - "bbox": [15, 15, 3, 3], - "bbox_mode": BoxMode.XYXY_ABS, - "category_id": 0, - "iscrowd": 0, - }, - ], - } - - return [record] - - def test_convert_to_coco(self): - DatasetCatalog.register("test_dataset", lambda: TestConvertCOCO.generate_data()) - MetadataCatalog.get("test_dataset").set(thing_classes=["test_label"]) - convert_to_coco_dict("test_dataset") - DatasetCatalog.pop("test_dataset") - MetadataCatalog.pop("test_dataset") diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py deleted file mode 100755 index 964f0028..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_coco_evaluation.py +++ /dev/null @@ -1,138 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import copy -import io -import json -import numpy as np -import os -import tempfile -import unittest -import torch -from pycocotools.coco import COCO -from pycocotools.cocoeval import COCOeval - -from detectron2.data import DatasetCatalog -from detectron2.evaluation import COCOEvaluator -from detectron2.evaluation.fast_eval_api import COCOeval_opt -from detectron2.structures import Boxes, Instances - - -class TestCOCOeval(unittest.TestCase): - def test_fast_eval(self): - # A small set of images/categories from COCO val - # fmt: off - detections = [{"image_id": 139, "category_id": 1, "bbox": [417.3332824707031, 159.27003479003906, 47.66064453125, 143.00193786621094], "score": 0.9949821829795837, "segmentation": {"size": [426, 640], "counts": "Tc`52W=3N0N4aNN^E7]:4XE1g:8kDMT;U100000001O1gE[Nk8h1dFiNY9Z1aFkN]9g2J3NdN`FlN`9S1cFRN07]9g1bFoM6;X9c1cFoM=8R9g1bFQN>3U9Y30O01OO1O001N2O1N1O4L4L5UNoE3V:CVF6Q:@YF9l9@ZF 0 else 0.0 - msg = "%s: comparing COCO APIs, %s differs by %f" % (name, k, abs_diff) - self.assertTrue(abs_diff < 1e-4, msg=msg) - - def test_unknown_category(self): - dataset = "coco_2017_val_100" - evaluator = COCOEvaluator(dataset) - evaluator.reset() - inputs = DatasetCatalog.get(dataset)[:2] - pred = Instances((100, 100)) - pred.pred_boxes = Boxes(torch.rand(2, 4)) - pred.scores = torch.rand(2) - pred.pred_classes = torch.tensor([10, 80]) - output = {"instances": pred} - evaluator.process(inputs, [output, output]) - with self.assertRaises(AssertionError): - evaluator.evaluate() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_dataset.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_dataset.py deleted file mode 100755 index 7d16ec4c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_dataset.py +++ /dev/null @@ -1,134 +0,0 @@ -# 
Copyright (c) Facebook, Inc. and its affiliates. - -import os -import pickle -import sys -import unittest -from functools import partial -import torch -from iopath.common.file_io import LazyPath - -from detectron2 import model_zoo -from detectron2.config import instantiate -from detectron2.data import ( - DatasetFromList, - MapDataset, - ToIterableDataset, - build_batch_data_loader, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.data.samplers import InferenceSampler, TrainingSampler - - -def _a_slow_func(x): - return "path/{}".format(x) - - -class TestDatasetFromList(unittest.TestCase): - # Failing for py3.6, likely due to pickle - @unittest.skipIf(sys.version_info.minor <= 6, "Not supported in Python 3.6") - def test_using_lazy_path(self): - dataset = [] - for i in range(10): - dataset.append({"file_name": LazyPath(partial(_a_slow_func, i))}) - - dataset = DatasetFromList(dataset) - for i in range(10): - path = dataset[i]["file_name"] - self.assertTrue(isinstance(path, LazyPath)) - self.assertEqual(os.fspath(path), _a_slow_func(i)) - - -class TestMapDataset(unittest.TestCase): - @staticmethod - def map_func(x): - if x == 2: - return None - return x * 2 - - def test_map_style(self): - ds = DatasetFromList([1, 2, 3]) - ds = MapDataset(ds, TestMapDataset.map_func) - self.assertEqual(ds[0], 2) - self.assertEqual(ds[2], 6) - self.assertIn(ds[1], [2, 6]) - - def test_iter_style(self): - class DS(torch.utils.data.IterableDataset): - def __iter__(self): - yield from [1, 2, 3] - - ds = DS() - ds = MapDataset(ds, TestMapDataset.map_func) - self.assertIsInstance(ds, torch.utils.data.IterableDataset) - - data = list(iter(ds)) - self.assertEqual(data, [2, 6]) - - def test_pickleability(self): - ds = DatasetFromList([1, 2, 3]) - ds = MapDataset(ds, lambda x: x * 2) - ds = pickle.loads(pickle.dumps(ds)) - self.assertEqual(ds[0], 2) - - -class TestDataLoader(unittest.TestCase): - def _get_kwargs(self): - # get kwargs of build_detection_train_loader - cfg = model_zoo.get_config("common/data/coco.py").dataloader.train - cfg.dataset.names = "coco_2017_val_100" - cfg.pop("_target_") - kwargs = {k: instantiate(v) for k, v in cfg.items()} - return kwargs - - def test_build_dataloader_train(self): - kwargs = self._get_kwargs() - dl = build_detection_train_loader(**kwargs) - next(iter(dl)) - - def test_build_iterable_dataloader_train(self): - kwargs = self._get_kwargs() - ds = DatasetFromList(kwargs.pop("dataset")) - ds = ToIterableDataset(ds, TrainingSampler(len(ds))) - dl = build_detection_train_loader(dataset=ds, **kwargs) - next(iter(dl)) - - def _check_is_range(self, data_loader, N): - # check that data_loader produces range(N) - data = list(iter(data_loader)) - data = [x for batch in data for x in batch] # flatten the batches - self.assertEqual(len(data), N) - self.assertEqual(set(data), set(range(N))) - - def test_build_batch_dataloader_inference(self): - # Test that build_batch_data_loader can be used for inference - N = 96 - ds = DatasetFromList(list(range(N))) - sampler = InferenceSampler(len(ds)) - dl = build_batch_data_loader(ds, sampler, 8, num_workers=3) - self._check_is_range(dl, N) - - def test_build_dataloader_inference(self): - N = 50 - ds = DatasetFromList(list(range(N))) - sampler = InferenceSampler(len(ds)) - # test that parallel loader works correctly - dl = build_detection_test_loader( - dataset=ds, sampler=sampler, mapper=lambda x: x, num_workers=3 - ) - self._check_is_range(dl, N) - - # test that batch_size works correctly - dl = 
build_detection_test_loader( - dataset=ds, sampler=sampler, mapper=lambda x: x, batch_size=4, num_workers=0 - ) - self._check_is_range(dl, N) - - def test_build_iterable_dataloader_inference(self): - # Test that build_detection_test_loader supports iterable dataset - N = 50 - ds = DatasetFromList(list(range(N))) - ds = ToIterableDataset(ds, InferenceSampler(len(ds))) - dl = build_detection_test_loader(dataset=ds, mapper=lambda x: x, num_workers=3) - self._check_is_range(dl, N) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py deleted file mode 100755 index aac56c07..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_detection_utils.py +++ /dev/null @@ -1,176 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import copy -import numpy as np -import os -import unittest -import pycocotools.mask as mask_util - -from detectron2.data import MetadataCatalog, detection_utils -from detectron2.data import transforms as T -from detectron2.structures import BitMasks, BoxMode -from detectron2.utils.file_io import PathManager - - -class TestTransformAnnotations(unittest.TestCase): - def test_transform_simple_annotation(self): - transforms = T.TransformList([T.HFlipTransform(400)]) - anno = { - "bbox": np.asarray([10, 10, 200, 300]), - "bbox_mode": BoxMode.XYXY_ABS, - "category_id": 3, - "segmentation": [[10, 10, 100, 100, 100, 10], [150, 150, 200, 150, 200, 200]], - } - - output = detection_utils.transform_instance_annotations(anno, transforms, (400, 400)) - self.assertTrue(np.allclose(output["bbox"], [200, 10, 390, 300])) - self.assertEqual(len(output["segmentation"]), len(anno["segmentation"])) - self.assertTrue(np.allclose(output["segmentation"][0], [390, 10, 300, 100, 300, 10])) - - detection_utils.annotations_to_instances([output, output], (400, 400)) - - def test_transform_empty_annotation(self): - detection_utils.annotations_to_instances([], (400, 400)) - - def test_flip_keypoints(self): - transforms = T.TransformList([T.HFlipTransform(400)]) - anno = { - "bbox": np.asarray([10, 10, 200, 300]), - "bbox_mode": BoxMode.XYXY_ABS, - "keypoints": np.random.rand(17, 3) * 50 + 15, - } - - output = detection_utils.transform_instance_annotations( - copy.deepcopy(anno), - transforms, - (400, 400), - keypoint_hflip_indices=detection_utils.create_keypoint_hflip_indices( - ["keypoints_coco_2017_train"] - ), - ) - # The first keypoint is nose - self.assertTrue(np.allclose(output["keypoints"][0, 0], 400 - anno["keypoints"][0, 0])) - # The last 16 keypoints are 8 left-right pairs - self.assertTrue( - np.allclose( - output["keypoints"][1:, 0].reshape(-1, 2)[:, ::-1], - 400 - anno["keypoints"][1:, 0].reshape(-1, 2), - ) - ) - self.assertTrue( - np.allclose( - output["keypoints"][1:, 1:].reshape(-1, 2, 2)[:, ::-1, :], - anno["keypoints"][1:, 1:].reshape(-1, 2, 2), - ) - ) - - def test_crop(self): - transforms = T.TransformList([T.CropTransform(300, 300, 10, 10)]) - keypoints = np.random.rand(17, 3) * 50 + 15 - keypoints[:, 2] = 2 - anno = { - "bbox": np.asarray([10, 10, 200, 400]), - "bbox_mode": BoxMode.XYXY_ABS, - "keypoints": keypoints, - } - - output = detection_utils.transform_instance_annotations( - copy.deepcopy(anno), transforms, (10, 10) - ) - # box is shifted and cropped - self.assertTrue((output["bbox"] == np.asarray([0, 0, 0, 10])).all()) - # keypoints are no longer visible - 
self.assertTrue((output["keypoints"][:, 2] == 0).all()) - - def test_transform_RLE(self): - transforms = T.TransformList([T.HFlipTransform(400)]) - mask = np.zeros((300, 400), order="F").astype("uint8") - mask[:, :200] = 1 - - anno = { - "bbox": np.asarray([10, 10, 200, 300]), - "bbox_mode": BoxMode.XYXY_ABS, - "segmentation": mask_util.encode(mask[:, :, None])[0], - "category_id": 3, - } - output = detection_utils.transform_instance_annotations( - copy.deepcopy(anno), transforms, (300, 400) - ) - mask = output["segmentation"] - self.assertTrue((mask[:, 200:] == 1).all()) - self.assertTrue((mask[:, :200] == 0).all()) - - inst = detection_utils.annotations_to_instances( - [output, output], (400, 400), mask_format="bitmask" - ) - self.assertTrue(isinstance(inst.gt_masks, BitMasks)) - - def test_transform_RLE_resize(self): - transforms = T.TransformList( - [T.HFlipTransform(400), T.ScaleTransform(300, 400, 400, 400, "bilinear")] - ) - mask = np.zeros((300, 400), order="F").astype("uint8") - mask[:, :200] = 1 - - anno = { - "bbox": np.asarray([10, 10, 200, 300]), - "bbox_mode": BoxMode.XYXY_ABS, - "segmentation": mask_util.encode(mask[:, :, None])[0], - "category_id": 3, - } - output = detection_utils.transform_instance_annotations( - copy.deepcopy(anno), transforms, (400, 400) - ) - - inst = detection_utils.annotations_to_instances( - [output, output], (400, 400), mask_format="bitmask" - ) - self.assertTrue(isinstance(inst.gt_masks, BitMasks)) - - def test_gen_crop(self): - instance = {"bbox": [10, 10, 100, 100], "bbox_mode": BoxMode.XYXY_ABS} - t = detection_utils.gen_crop_transform_with_instance((10, 10), (150, 150), instance) - # the box center must fall into the cropped region - self.assertTrue(t.x0 <= 55 <= t.x0 + t.w) - - def test_gen_crop_outside_boxes(self): - instance = {"bbox": [10, 10, 100, 100], "bbox_mode": BoxMode.XYXY_ABS} - with self.assertRaises(AssertionError): - detection_utils.gen_crop_transform_with_instance((10, 10), (15, 15), instance) - - def test_read_sem_seg(self): - cityscapes_dir = MetadataCatalog.get("cityscapes_fine_sem_seg_val").gt_dir - sem_seg_gt_path = os.path.join( - cityscapes_dir, "frankfurt", "frankfurt_000001_083852_gtFine_labelIds.png" - ) - if not PathManager.exists(sem_seg_gt_path): - raise unittest.SkipTest( - "Semantic segmentation ground truth {} not found.".format(sem_seg_gt_path) - ) - sem_seg = detection_utils.read_image(sem_seg_gt_path, "L") - self.assertEqual(sem_seg.ndim, 3) - self.assertEqual(sem_seg.shape[2], 1) - self.assertEqual(sem_seg.dtype, np.uint8) - self.assertEqual(sem_seg.max(), 32) - self.assertEqual(sem_seg.min(), 1) - - def test_read_exif_orientation(self): - # https://github.com/recurser/exif-orientation-examples/raw/master/Landscape_5.jpg - URL = "detectron2://assets/Landscape_5.jpg" - img = detection_utils.read_image(URL, "RGB") - self.assertEqual(img.ndim, 3) - self.assertEqual(img.dtype, np.uint8) - self.assertEqual(img.shape, (1200, 1800, 3)) # check that shape is not transposed - - def test_opencv_exif_orientation(self): - import cv2 - - URL = "detectron2://assets/Landscape_5.jpg" - with PathManager.open(URL, "rb") as f: - img = cv2.imdecode(np.frombuffer(f.read(), dtype="uint8"), cv2.IMREAD_COLOR) - self.assertEqual(img.dtype, np.uint8) - self.assertEqual(img.shape, (1200, 1800, 3)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py 
b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py deleted file mode 100755 index 0e8299ed..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_rotation_transform.py +++ /dev/null @@ -1,71 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import unittest - -from detectron2.data.transforms.transform import RotationTransform - - -class TestRotationTransform(unittest.TestCase): - def assertEqualsArrays(self, a1, a2): - self.assertTrue(np.allclose(a1, a2)) - - def randomData(self, h=5, w=5): - image = np.random.rand(h, w) - coords = np.array([[i, j] for j in range(h + 1) for i in range(w + 1)], dtype=float) - return image, coords, h, w - - def test180(self): - image, coords, h, w = self.randomData(6, 6) - rot = RotationTransform(h, w, 180, expand=False, center=None) - self.assertEqualsArrays(rot.apply_image(image), image[::-1, ::-1]) - rotated_coords = [[w - c[0], h - c[1]] for c in coords] - self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords) - - def test45_coords(self): - _, coords, h, w = self.randomData(4, 6) - rot = RotationTransform(h, w, 45, expand=False, center=None) - rotated_coords = [ - [(x + y - (h + w) / 2) / np.sqrt(2) + w / 2, h / 2 + (y + (w - h) / 2 - x) / np.sqrt(2)] - for (x, y) in coords - ] - self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords) - - def test90(self): - image, coords, h, w = self.randomData() - rot = RotationTransform(h, w, 90, expand=False, center=None) - self.assertEqualsArrays(rot.apply_image(image), image.T[::-1]) - rotated_coords = [[c[1], w - c[0]] for c in coords] - self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords) - - def test90_expand(self): # non-square image - image, coords, h, w = self.randomData(h=5, w=8) - rot = RotationTransform(h, w, 90, expand=True, center=None) - self.assertEqualsArrays(rot.apply_image(image), image.T[::-1]) - rotated_coords = [[c[1], w - c[0]] for c in coords] - self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords) - - def test_center_expand(self): - # center has no effect if expand=True because it only affects shifting - image, coords, h, w = self.randomData(h=5, w=8) - angle = np.random.randint(360) - rot1 = RotationTransform(h, w, angle, expand=True, center=None) - rot2 = RotationTransform(h, w, angle, expand=True, center=(0, 0)) - rot3 = RotationTransform(h, w, angle, expand=True, center=(h, w)) - rot4 = RotationTransform(h, w, angle, expand=True, center=(2, 5)) - for r1 in [rot1, rot2, rot3, rot4]: - for r2 in [rot1, rot2, rot3, rot4]: - self.assertEqualsArrays(r1.apply_image(image), r2.apply_image(image)) - self.assertEqualsArrays(r1.apply_coords(coords), r2.apply_coords(coords)) - - def test_inverse_transform(self): - image, coords, h, w = self.randomData(h=5, w=8) - rot = RotationTransform(h, w, 90, expand=True, center=None) - rot_image = rot.apply_image(image) - self.assertEqualsArrays(rot.inverse().apply_image(rot_image), image) - rot = RotationTransform(h, w, 65, expand=True, center=None) - rotated_coords = rot.apply_coords(coords) - self.assertEqualsArrays(rot.inverse().apply_coords(rotated_coords), coords) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_sampler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_sampler.py deleted file mode 100755 index 0d278439..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_sampler.py +++ /dev/null @@ -1,111 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import math -import operator -import unittest -import torch -from torch.utils import data -from torch.utils.data.sampler import SequentialSampler - -from detectron2.data.build import worker_init_reset_seed -from detectron2.data.common import DatasetFromList, ToIterableDataset -from detectron2.data.samplers import ( - GroupedBatchSampler, - InferenceSampler, - RepeatFactorTrainingSampler, - TrainingSampler, -) -from detectron2.utils.env import seed_all_rng - - -class TestGroupedBatchSampler(unittest.TestCase): - def test_missing_group_id(self): - sampler = SequentialSampler(list(range(100))) - group_ids = [1] * 100 - samples = GroupedBatchSampler(sampler, group_ids, 2) - - for mini_batch in samples: - self.assertEqual(len(mini_batch), 2) - - def test_groups(self): - sampler = SequentialSampler(list(range(100))) - group_ids = [1, 0] * 50 - samples = GroupedBatchSampler(sampler, group_ids, 2) - - for mini_batch in samples: - self.assertEqual((mini_batch[0] + mini_batch[1]) % 2, 0) - - -class TestSamplerDeterministic(unittest.TestCase): - def test_to_iterable(self): - sampler = TrainingSampler(100, seed=10) - gt_output = list(itertools.islice(sampler, 100)) - self.assertEqual(set(gt_output), set(range(100))) - - dataset = DatasetFromList(list(range(100))) - dataset = ToIterableDataset(dataset, sampler) - data_loader = data.DataLoader(dataset, num_workers=0, collate_fn=operator.itemgetter(0)) - - output = list(itertools.islice(data_loader, 100)) - self.assertEqual(output, gt_output) - - data_loader = data.DataLoader( - dataset, - num_workers=2, - collate_fn=operator.itemgetter(0), - worker_init_fn=worker_init_reset_seed, - # reset seed should not affect behavior of TrainingSampler - ) - output = list(itertools.islice(data_loader, 100)) - # multiple workers should not lead to duplicate or different data - self.assertEqual(output, gt_output) - - def test_training_sampler_seed(self): - seed_all_rng(42) - sampler = TrainingSampler(30) - data = list(itertools.islice(sampler, 65)) - - seed_all_rng(42) - sampler = TrainingSampler(30) - seed_all_rng(999) # should be ineffective - data2 = list(itertools.islice(sampler, 65)) - self.assertEqual(data, data2) - - -class TestRepeatFactorTrainingSampler(unittest.TestCase): - def test_repeat_factors_from_category_frequency(self): - repeat_thresh = 0.5 - - dataset_dicts = [ - {"annotations": [{"category_id": 0}, {"category_id": 1}]}, - {"annotations": [{"category_id": 0}]}, - {"annotations": []}, - ] - - rep_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency( - dataset_dicts, repeat_thresh - ) - - expected_rep_factors = torch.tensor([math.sqrt(3 / 2), 1.0, 1.0]) - self.assertTrue(torch.allclose(rep_factors, expected_rep_factors)) - - -class TestInferenceSampler(unittest.TestCase): - def test_local_indices(self): - sizes = [0, 16, 2, 42] - world_sizes = [5, 2, 3, 4] - - expected_results = [ - [range(0) for _ in range(5)], - [range(8), range(8, 16)], - [range(1), range(1, 2), range(0)], - [range(11), range(11, 22), range(22, 32), range(32, 42)], - ] - - for size, world_size, expected_result in zip(sizes, world_sizes, expected_results): - with self.subTest(f"size={size}, world_size={world_size}"): - local_indices = [ - InferenceSampler._get_local_indices(size, world_size, r) - for r in range(world_size) - ] - self.assertEqual(local_indices, 
expected_result) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_transforms.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_transforms.py deleted file mode 100755 index 382048e5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/data/test_transforms.py +++ /dev/null @@ -1,268 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -import unittest -from unittest import mock -import torch -from PIL import Image, ImageOps -from torch.nn import functional as F - -from detectron2.config import get_cfg -from detectron2.data import detection_utils -from detectron2.data import transforms as T -from detectron2.utils.logger import setup_logger - -logger = logging.getLogger(__name__) - - -def polygon_allclose(poly1, poly2): - """ - Test whether two polygons are the same. - Both arguments are nx2 numpy arrays. - """ - # ABCD and CDAB are the same polygon. So it's important to check after rolling - for k in range(len(poly1)): - rolled_poly1 = np.roll(poly1, k, axis=0) - if np.allclose(rolled_poly1, poly2): - return True - return False - - -class TestTransforms(unittest.TestCase): - def setUp(self): - setup_logger() - - def test_apply_rotated_boxes(self): - np.random.seed(125) - cfg = get_cfg() - is_train = True - augs = detection_utils.build_augmentation(cfg, is_train) - image = np.random.rand(200, 300) - image, transforms = T.apply_augmentations(augs, image) - image_shape = image.shape[:2] # h, w - assert image_shape == (800, 1200) - annotation = {"bbox": [179, 97, 62, 40, -56]} - - boxes = np.array([annotation["bbox"]], dtype=np.float64) # boxes.shape = (1, 5) - transformed_bbox = transforms.apply_rotated_box(boxes)[0] - - expected_bbox = np.array([484, 388, 248, 160, 56], dtype=np.float64) - err_msg = "transformed_bbox = {}, expected {}".format(transformed_bbox, expected_bbox) - assert np.allclose(transformed_bbox, expected_bbox), err_msg - - def test_resize_and_crop(self): - np.random.seed(125) - min_scale = 0.2 - max_scale = 2.0 - target_height = 1100 - target_width = 1000 - resize_aug = T.ResizeScale(min_scale, max_scale, target_height, target_width) - fixed_size_crop_aug = T.FixedSizeCrop((target_height, target_width)) - hflip_aug = T.RandomFlip() - augs = [resize_aug, fixed_size_crop_aug, hflip_aug] - original_image = np.random.rand(900, 800) - image, transforms = T.apply_augmentations(augs, original_image) - image_shape = image.shape[:2] # h, w - self.assertEqual((1100, 1000), image_shape) - - boxes = np.array( - [[91, 46, 144, 111], [523, 251, 614, 295]], - dtype=np.float64, - ) - transformed_bboxs = transforms.apply_box(boxes) - expected_bboxs = np.array( - [ - [895.42, 33.42666667, 933.91125, 80.66], - [554.0825, 182.39333333, 620.17125, 214.36666667], - ], - dtype=np.float64, - ) - err_msg = "transformed_bbox = {}, expected {}".format(transformed_bboxs, expected_bboxs) - self.assertTrue(np.allclose(transformed_bboxs, expected_bboxs), err_msg) - - polygon = np.array([[91, 46], [144, 46], [144, 111], [91, 111]]) - transformed_polygons = transforms.apply_polygons([polygon]) - expected_polygon = np.array([[934.0, 33.0], [934.0, 80.0], [896.0, 80.0], [896.0, 33.0]]) - self.assertEqual(1, len(transformed_polygons)) - err_msg = "transformed_polygon = {}, expected {}".format( - transformed_polygons[0], expected_polygon - ) - self.assertTrue(polygon_allclose(transformed_polygons[0], expected_polygon), err_msg) - - def 
test_apply_rotated_boxes_unequal_scaling_factor(self): - np.random.seed(125) - h, w = 400, 200 - newh, neww = 800, 800 - image = np.random.rand(h, w) - augs = [] - augs.append(T.Resize(shape=(newh, neww))) - image, transforms = T.apply_augmentations(augs, image) - image_shape = image.shape[:2] # h, w - assert image_shape == (newh, neww) - - boxes = np.array( - [ - [150, 100, 40, 20, 0], - [150, 100, 40, 20, 30], - [150, 100, 40, 20, 90], - [150, 100, 40, 20, -90], - ], - dtype=np.float64, - ) - transformed_boxes = transforms.apply_rotated_box(boxes) - - expected_bboxes = np.array( - [ - [600, 200, 160, 40, 0], - [600, 200, 144.22205102, 52.91502622, 49.10660535], - [600, 200, 80, 80, 90], - [600, 200, 80, 80, -90], - ], - dtype=np.float64, - ) - err_msg = "transformed_boxes = {}, expected {}".format(transformed_boxes, expected_bboxes) - assert np.allclose(transformed_boxes, expected_bboxes), err_msg - - def test_print_augmentation(self): - t = T.RandomCrop("relative", (100, 100)) - self.assertEqual(str(t), "RandomCrop(crop_type='relative', crop_size=(100, 100))") - - t0 = T.RandomFlip(prob=0.5) - self.assertEqual(str(t0), "RandomFlip(prob=0.5)") - - t1 = T.RandomFlip() - self.assertEqual(str(t1), "RandomFlip()") - - t = T.AugmentationList([t0, t1]) - self.assertEqual(str(t), f"AugmentationList[{t0}, {t1}]") - - def test_random_apply_prob_out_of_range_check(self): - test_probabilities = {0.0: True, 0.5: True, 1.0: True, -0.01: False, 1.01: False} - - for given_probability, is_valid in test_probabilities.items(): - if not is_valid: - self.assertRaises(AssertionError, T.RandomApply, None, prob=given_probability) - else: - T.RandomApply(T.NoOpTransform(), prob=given_probability) - - def test_random_apply_wrapping_aug_probability_occured_evaluation(self): - transform_mock = mock.MagicMock(name="MockTransform", spec=T.Augmentation) - image_mock = mock.MagicMock(name="MockImage") - random_apply = T.RandomApply(transform_mock, prob=0.001) - - with mock.patch.object(random_apply, "_rand_range", return_value=0.0001): - transform = random_apply.get_transform(image_mock) - transform_mock.get_transform.assert_called_once_with(image_mock) - self.assertIsNot(transform, transform_mock) - - def test_random_apply_wrapping_std_transform_probability_occured_evaluation(self): - transform_mock = mock.MagicMock(name="MockTransform", spec=T.Transform) - image_mock = mock.MagicMock(name="MockImage") - random_apply = T.RandomApply(transform_mock, prob=0.001) - - with mock.patch.object(random_apply, "_rand_range", return_value=0.0001): - transform = random_apply.get_transform(image_mock) - self.assertIs(transform, transform_mock) - - def test_random_apply_probability_not_occured_evaluation(self): - transform_mock = mock.MagicMock(name="MockTransform", spec=T.Augmentation) - image_mock = mock.MagicMock(name="MockImage") - random_apply = T.RandomApply(transform_mock, prob=0.001) - - with mock.patch.object(random_apply, "_rand_range", return_value=0.9): - transform = random_apply.get_transform(image_mock) - transform_mock.get_transform.assert_not_called() - self.assertIsInstance(transform, T.NoOpTransform) - - def test_augmentation_input_args(self): - input_shape = (100, 100) - output_shape = (50, 50) - - # define two augmentations with different args - class TG1(T.Augmentation): - def get_transform(self, image, sem_seg): - return T.ResizeTransform( - input_shape[0], input_shape[1], output_shape[0], output_shape[1] - ) - - class TG2(T.Augmentation): - def get_transform(self, image): - assert image.shape[:2] == 
output_shape # check that TG1 is applied - return T.HFlipTransform(output_shape[1]) - - image = np.random.rand(*input_shape).astype("float32") - sem_seg = (np.random.rand(*input_shape) < 0.5).astype("uint8") - inputs = T.AugInput(image, sem_seg=sem_seg) # provide two args - tfms = inputs.apply_augmentations([TG1(), TG2()]) - self.assertIsInstance(tfms[0], T.ResizeTransform) - self.assertIsInstance(tfms[1], T.HFlipTransform) - self.assertTrue(inputs.image.shape[:2] == output_shape) - self.assertTrue(inputs.sem_seg.shape[:2] == output_shape) - - class TG3(T.Augmentation): - def get_transform(self, image, nonexist): - pass - - with self.assertRaises(AttributeError): - inputs.apply_augmentations([TG3()]) - - def test_augmentation_list(self): - input_shape = (100, 100) - image = np.random.rand(*input_shape).astype("float32") - sem_seg = (np.random.rand(*input_shape) < 0.5).astype("uint8") - inputs = T.AugInput(image, sem_seg=sem_seg) # provide two args - - augs = T.AugmentationList([T.RandomFlip(), T.Resize(20)]) - _ = T.AugmentationList([augs, T.Resize(30)])(inputs) - # 3 in latest fvcore (flattened transformlist), 2 in older - # self.assertEqual(len(tfms), 3) - - def test_color_transforms(self): - rand_img = np.random.random((100, 100, 3)) * 255 - rand_img = rand_img.astype("uint8") - - # Test no-op - noop_transform = T.ColorTransform(lambda img: img) - self.assertTrue(np.array_equal(rand_img, noop_transform.apply_image(rand_img))) - - # Test a ImageOps operation - magnitude = np.random.randint(0, 256) - solarize_transform = T.PILColorTransform(lambda img: ImageOps.solarize(img, magnitude)) - expected_img = ImageOps.solarize(Image.fromarray(rand_img), magnitude) - self.assertTrue(np.array_equal(expected_img, solarize_transform.apply_image(rand_img))) - - def test_resize_transform(self): - input_shapes = [(100, 100), (100, 100, 1), (100, 100, 3)] - output_shapes = [(200, 200), (200, 200, 1), (200, 200, 3)] - for in_shape, out_shape in zip(input_shapes, output_shapes): - in_img = np.random.randint(0, 255, size=in_shape, dtype=np.uint8) - tfm = T.ResizeTransform(in_shape[0], in_shape[1], out_shape[0], out_shape[1]) - out_img = tfm.apply_image(in_img) - self.assertEqual(out_img.shape, out_shape) - - def test_resize_shorted_edge_scriptable(self): - def f(image): - newh, neww = T.ResizeShortestEdge.get_output_shape( - image.shape[-2], image.shape[-1], 80, 133 - ) - return F.interpolate(image.unsqueeze(0), size=(newh, neww)) - - input = torch.randn(3, 10, 10) - script_f = torch.jit.script(f) - self.assertTrue(torch.allclose(f(input), script_f(input))) - - # generalize to new shapes - input = torch.randn(3, 8, 100) - self.assertTrue(torch.allclose(f(input), script_f(input))) - - def test_extent_transform(self): - input_shapes = [(100, 100), (100, 100, 1), (100, 100, 3)] - src_rect = (20, 20, 80, 80) - output_shapes = [(200, 200), (200, 200, 1), (200, 200, 3)] - for in_shape, out_shape in zip(input_shapes, output_shapes): - in_img = np.random.randint(0, 255, size=in_shape, dtype=np.uint8) - tfm = T.ExtentTransform(src_rect, out_shape[:2]) - out_img = tfm.apply_image(in_img) - self.assertTrue(out_img.shape == out_shape) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py deleted file mode 100755 index 5a0488ad..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_blocks.py +++ /dev/null @@ -1,51 +0,0 @@ -# Copyright (c) 
Facebook, Inc. and its affiliates. - -import unittest -import torch -from torch import nn - -from detectron2.layers import ASPP, DepthwiseSeparableConv2d, FrozenBatchNorm2d -from detectron2.modeling.backbone.resnet import BasicStem, ResNet - - -""" -Test for misc layers. -""" - - -class TestBlocks(unittest.TestCase): - def test_separable_conv(self): - DepthwiseSeparableConv2d(3, 10, norm1="BN", activation1=nn.PReLU()) - - def test_aspp(self): - m = ASPP(3, 10, [2, 3, 4], norm="", activation=nn.PReLU()) - self.assertIsNot(m.convs[0].activation.weight, m.convs[1].activation.weight) - self.assertIsNot(m.convs[0].activation.weight, m.project.activation.weight) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_frozen_batchnorm_fp16(self): - from torch.cuda.amp import autocast - - C = 10 - input = torch.rand(1, C, 10, 10).cuda() - m = FrozenBatchNorm2d(C).cuda() - with autocast(): - output = m(input.half()) - self.assertEqual(output.dtype, torch.float16) - - # requires_grad triggers a different codepath - input.requires_grad_() - with autocast(): - output = m(input.half()) - self.assertEqual(output.dtype, torch.float16) - - def test_resnet_unused_stages(self): - resnet = ResNet(BasicStem(), ResNet.make_default_stages(18), out_features=["res2"]) - self.assertTrue(hasattr(resnet, "res2")) - self.assertFalse(hasattr(resnet, "res3")) - self.assertFalse(hasattr(resnet, "res5")) - - resnet = ResNet(BasicStem(), ResNet.make_default_stages(18), out_features=["res2", "res5"]) - self.assertTrue(hasattr(resnet, "res2")) - self.assertTrue(hasattr(resnet, "res4")) - self.assertTrue(hasattr(resnet, "res5")) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py deleted file mode 100755 index 4aa319fc..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_deformable.py +++ /dev/null @@ -1,175 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
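One behavior from the block tests above that is worth spelling out: `FrozenBatchNorm2d` is batch norm with fixed buffers, i.e. just a per-channel affine map. The sketch below is illustrative, assuming detectron2 is installed and its default buffer initialization (weight=1, bias=0, mean=0, var=1); the tolerance is an assumption to absorb the eps term.

```python
# Sketch of FrozenBatchNorm2d semantics, as exercised by the fp16 test
# above: fixed statistics and affine, so the op reduces to x * scale + bias.
import torch
from detectron2.layers import FrozenBatchNorm2d

C = 10
m = FrozenBatchNorm2d(C)          # default buffers: weight=1, bias=0, mean=0, var=1
x = torch.rand(1, C, 10, 10)
y = m(x)

# With the default buffers the op is the identity up to the eps term.
assert torch.allclose(y, x, atol=1e-4)
# The layer registers buffers, not trainable parameters: nothing to optimize.
assert len(list(m.parameters())) == 0
```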
-import numpy as np -import unittest -import torch - -from detectron2.layers import DeformConv, ModulatedDeformConv -from detectron2.utils.env import TORCH_VERSION - - -@unittest.skipIf( - TORCH_VERSION == (1, 8) and torch.cuda.is_available(), - "This test fails under cuda11 + torch1.8.", -) -class DeformableTest(unittest.TestCase): - @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu") - def test_forward_output(self): - device = torch.device("cuda") - N, C, H, W = shape = 1, 1, 5, 5 - kernel_size = 3 - padding = 1 - - inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape).to(device) - """ - 0 1 2 3 4 - 5 6 7 8 9 - 10 11 12 13 14 - 15 16 17 18 19 - 20 21 22 23 24 - """ - offset_channels = kernel_size * kernel_size * 2 - offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32).to(device) - - # Test DCN v1 - deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device) - deform.weight = torch.nn.Parameter(torch.ones_like(deform.weight)) - output = deform(inputs, offset) - output = output.detach().cpu().numpy() - deform_results = np.array( - [ - [30, 41.25, 48.75, 45, 28.75], - [62.25, 81, 90, 80.25, 50.25], - [99.75, 126, 135, 117.75, 72.75], - [105, 131.25, 138.75, 120, 73.75], - [71.75, 89.25, 93.75, 80.75, 49.5], - ] - ) - self.assertTrue(np.allclose(output.flatten(), deform_results.flatten())) - - # Test DCN v2 - mask_channels = kernel_size * kernel_size - mask = torch.full((N, mask_channels, H, W), 0.5, dtype=torch.float32).to(device) - modulate_deform = ModulatedDeformConv(C, C, kernel_size, padding=padding, bias=False).to( - device - ) - modulate_deform.weight = deform.weight - output = modulate_deform(inputs, offset, mask) - output = output.detach().cpu().numpy() - self.assertTrue(np.allclose(output.flatten(), deform_results.flatten() * 0.5)) - - def test_forward_output_on_cpu(self): - device = torch.device("cpu") - N, C, H, W = shape = 1, 1, 5, 5 - kernel_size = 3 - padding = 1 - - inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape).to(device) - - offset_channels = kernel_size * kernel_size * 2 - offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32).to(device) - - # Test DCN v1 on cpu - deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device) - deform.weight = torch.nn.Parameter(torch.ones_like(deform.weight)) - output = deform(inputs, offset) - output = output.detach().cpu().numpy() - deform_results = np.array( - [ - [30, 41.25, 48.75, 45, 28.75], - [62.25, 81, 90, 80.25, 50.25], - [99.75, 126, 135, 117.75, 72.75], - [105, 131.25, 138.75, 120, 73.75], - [71.75, 89.25, 93.75, 80.75, 49.5], - ] - ) - self.assertTrue(np.allclose(output.flatten(), deform_results.flatten())) - - @unittest.skipIf(not torch.cuda.is_available(), "This test requires gpu access") - def test_forward_output_on_cpu_equals_output_on_gpu(self): - N, C, H, W = shape = 2, 4, 10, 10 - kernel_size = 3 - padding = 1 - - for groups in [1, 2]: - inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape) - offset_channels = kernel_size * kernel_size * 2 - offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32) - - deform_gpu = DeformConv( - C, C, kernel_size=kernel_size, padding=padding, groups=groups - ).to("cuda") - deform_gpu.weight = torch.nn.Parameter(torch.ones_like(deform_gpu.weight)) - output_gpu = deform_gpu(inputs.to("cuda"), offset.to("cuda")).detach().cpu().numpy() - - deform_cpu = DeformConv( - C, C, kernel_size=kernel_size, 
padding=padding, groups=groups - ).to("cpu") - deform_cpu.weight = torch.nn.Parameter(torch.ones_like(deform_cpu.weight)) - output_cpu = deform_cpu(inputs.to("cpu"), offset.to("cpu")).detach().numpy() - - self.assertTrue(np.allclose(output_gpu.flatten(), output_cpu.flatten())) - - @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu") - def test_small_input(self): - device = torch.device("cuda") - for kernel_size in [3, 5]: - padding = kernel_size // 2 - N, C, H, W = shape = (1, 1, kernel_size - 1, kernel_size - 1) - - inputs = torch.rand(shape).to(device) # input size is smaller than kernel size - - offset_channels = kernel_size * kernel_size * 2 - offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device) - deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device) - output = deform(inputs, offset) - self.assertTrue(output.shape == inputs.shape) - - mask_channels = kernel_size * kernel_size - mask = torch.ones((N, mask_channels, H, W), dtype=torch.float32).to(device) - modulate_deform = ModulatedDeformConv( - C, C, kernel_size, padding=padding, bias=False - ).to(device) - output = modulate_deform(inputs, offset, mask) - self.assertTrue(output.shape == inputs.shape) - - @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu") - def test_raise_exception(self): - device = torch.device("cuda") - N, C, H, W = shape = 1, 1, 3, 3 - kernel_size = 3 - padding = 1 - - inputs = torch.rand(shape, dtype=torch.float32).to(device) - offset_channels = kernel_size * kernel_size # This is wrong channels for offset - offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device) - deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device) - self.assertRaises(RuntimeError, deform, inputs, offset) - - offset_channels = kernel_size * kernel_size * 2 - offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device) - mask_channels = kernel_size * kernel_size * 2 # This is wrong channels for mask - mask = torch.ones((N, mask_channels, H, W), dtype=torch.float32).to(device) - modulate_deform = ModulatedDeformConv(C, C, kernel_size, padding=padding, bias=False).to( - device - ) - self.assertRaises(RuntimeError, modulate_deform, inputs, offset, mask) - - def test_repr(self): - module = DeformConv(3, 10, kernel_size=3, padding=1, deformable_groups=2) - correct_string = ( - "DeformConv(in_channels=3, out_channels=10, kernel_size=(3, 3), " - "stride=(1, 1), padding=(1, 1), dilation=(1, 1), " - "groups=1, deformable_groups=2, bias=False)" - ) - self.assertEqual(repr(module), correct_string) - - module = ModulatedDeformConv(3, 10, kernel_size=3, padding=1, deformable_groups=2) - correct_string = ( - "ModulatedDeformConv(in_channels=3, out_channels=10, kernel_size=(3, 3), " - "stride=1, padding=1, dilation=1, groups=1, deformable_groups=2, bias=True)" - ) - self.assertEqual(repr(module), correct_string) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_losses.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_losses.py deleted file mode 100755 index d7492024..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_losses.py +++ /dev/null @@ -1,82 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import numpy as np -import unittest -import torch - -from detectron2.layers import ciou_loss, diou_loss - - -class TestLosses(unittest.TestCase): - def test_diou_loss(self): - """ - loss = 1 - iou + d/c - where, - d = (distance between centers of the 2 boxes)^2 - c = (diagonal length of the smallest enclosing box covering the 2 boxes)^2 - """ - # Identical boxes should have loss of 0 - box = torch.tensor([-1, -1, 1, 1], dtype=torch.float32) - loss = diou_loss(box, box) - self.assertTrue(np.allclose(loss, [0.0])) - - # Half size box inside other box - # iou = 0.5, d = 0.25, c = 8 - box2 = torch.tensor([0, -1, 1, 1], dtype=torch.float32) - loss = diou_loss(box, box2) - self.assertTrue(np.allclose(loss, [0.53125])) - - # Two diagonally adjacent boxes - # iou = 0, d = 2, c = 8 - box3 = torch.tensor([0, 0, 1, 1], dtype=torch.float32) - box4 = torch.tensor([1, 1, 2, 2], dtype=torch.float32) - loss = diou_loss(box3, box4) - self.assertTrue(np.allclose(loss, [1.25])) - - # Test batched loss and reductions - box1s = torch.stack([box, box3], dim=0) - box2s = torch.stack([box2, box4], dim=0) - - loss = diou_loss(box1s, box2s, reduction="sum") - self.assertTrue(np.allclose(loss, [1.78125])) - - loss = diou_loss(box1s, box2s, reduction="mean") - self.assertTrue(np.allclose(loss, [0.890625])) - - def test_ciou_loss(self): - """ - loss = 1 - iou + d/c + alpha*v - where, - d = (distance between centers of the 2 boxes)^2 - c = (diagonal length of the smallest enclosing box covering the 2 boxes)^2 - v = (4/pi^2) * (arctan(box1_w/box1_h) - arctan(box2_w/box2_h))^2 - alpha = v/(1 - iou + v) - """ - # Identical boxes should have loss of 0 - box = torch.tensor([-1, -1, 1, 1], dtype=torch.float32) - loss = ciou_loss(box, box) - self.assertTrue(np.allclose(loss, [0.0])) - - # Half size box inside other box - # iou = 0.5, d = 0.25, c = 8 - # v = (4/pi^2) * (arctan(1) - arctan(0.5))^2 = 0.042 - # alpha = 0.0775 - box2 = torch.tensor([0, -1, 1, 1], dtype=torch.float32) - loss = ciou_loss(box, box2) - self.assertTrue(np.allclose(loss, [0.5345])) - - # Two diagonally adjacent boxes - # iou = 0, d = 2, c = 8, v = 0, alpha = 0 - box3 = torch.tensor([0, 0, 1, 1], dtype=torch.float32) - box4 = torch.tensor([1, 1, 2, 2], dtype=torch.float32) - loss = ciou_loss(box3, box4) - self.assertTrue(np.allclose(loss, [1.25])) - - # Test batched loss and reductions - box1s = torch.stack([box, box3], dim=0) - box2s = torch.stack([box2, box4], dim=0) - - loss = ciou_loss(box1s, box2s, reduction="sum") - self.assertTrue(np.allclose(loss, [1.7845])) - - loss = ciou_loss(box1s, box2s, reduction="mean") - self.assertTrue(np.allclose(loss, [0.89225])) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py deleted file mode 100755 index 162c449c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_mask_ops.py +++ /dev/null @@ -1,202 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. 
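The 0.53125 asserted in `test_diou_loss` above follows directly from the docstring formula, loss = 1 - IoU + d/c. Here is the arithmetic worked out once, as a sketch assuming only `detectron2.layers.diou_loss` as imported in the deleted test:

```python
# Worked check of the half-size-box DIoU value asserted above.
import torch
from detectron2.layers import diou_loss

box = torch.tensor([-1.0, -1.0, 1.0, 1.0])   # 2x2 box centered at the origin
box2 = torch.tensor([0.0, -1.0, 1.0, 1.0])   # its right half

# IoU = 2 / 4 = 0.5 (intersection is box2 itself, union is box).
# Centers are (0, 0) and (0.5, 0), so d = 0.5**2 = 0.25.
# The enclosing box is `box`, whose squared diagonal is c = 2**2 + 2**2 = 8.
# loss = 1 - 0.5 + 0.25 / 8 = 0.53125, matching the test.
assert torch.allclose(diou_loss(box, box2), torch.tensor(0.53125))
```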
- -import contextlib -import io -import numpy as np -import os -import tempfile -import unittest -from collections import defaultdict -import torch -import tqdm -from fvcore.common.benchmark import benchmark -from pycocotools.coco import COCO -from tabulate import tabulate -from torch.nn import functional as F - -from detectron2.data import MetadataCatalog -from detectron2.layers.mask_ops import ( - pad_masks, - paste_mask_in_image_old, - paste_masks_in_image, - scale_boxes, -) -from detectron2.structures import BitMasks, Boxes, BoxMode, PolygonMasks -from detectron2.structures.masks import polygons_to_bitmask -from detectron2.utils.file_io import PathManager -from detectron2.utils.testing import random_boxes - - -def iou_between_full_image_bit_masks(a, b): - intersect = (a & b).sum() - union = (a | b).sum() - return intersect / union - - -def rasterize_polygons_with_grid_sample(full_image_bit_mask, box, mask_size, threshold=0.5): - x0, y0, x1, y1 = box[0], box[1], box[2], box[3] - - img_h, img_w = full_image_bit_mask.shape - - mask_y = np.arange(0.0, mask_size) + 0.5 # mask y sample coords in [0.5, mask_size - 0.5] - mask_x = np.arange(0.0, mask_size) + 0.5 # mask x sample coords in [0.5, mask_size - 0.5] - mask_y = mask_y / mask_size * (y1 - y0) + y0 - mask_x = mask_x / mask_size * (x1 - x0) + x0 - - mask_x = (mask_x - 0.5) / (img_w - 1) * 2 + -1 - mask_y = (mask_y - 0.5) / (img_h - 1) * 2 + -1 - gy, gx = torch.meshgrid(torch.from_numpy(mask_y), torch.from_numpy(mask_x)) - ind = torch.stack([gx, gy], dim=-1).to(dtype=torch.float32) - - full_image_bit_mask = torch.from_numpy(full_image_bit_mask) - mask = F.grid_sample( - full_image_bit_mask[None, None, :, :].to(dtype=torch.float32), - ind[None, :, :, :], - align_corners=True, - ) - - return mask[0, 0] >= threshold - - -class TestMaskCropPaste(unittest.TestCase): - def setUp(self): - json_file = MetadataCatalog.get("coco_2017_val_100").json_file - if not PathManager.isfile(json_file): - raise unittest.SkipTest("{} not found".format(json_file)) - with contextlib.redirect_stdout(io.StringIO()): - json_file = PathManager.get_local_path(json_file) - self.coco = COCO(json_file) - - def test_crop_paste_consistency(self): - """ - rasterize_polygons_within_box (used in training) - and - paste_masks_in_image (used in inference) - should be inverse operations to each other. - - This function runs several implementations of the above two operations and prints - the reconstruction error.
- """ - - anns = self.coco.loadAnns(self.coco.getAnnIds(iscrowd=False)) # avoid crowd annotations - - selected_anns = anns[:100] - - ious = [] - for ann in tqdm.tqdm(selected_anns): - results = self.process_annotation(ann) - ious.append([k[2] for k in results]) - - ious = np.array(ious) - mean_ious = ious.mean(axis=0) - table = [] - res_dic = defaultdict(dict) - for row, iou in zip(results, mean_ious): - table.append((row[0], row[1], iou)) - res_dic[row[0]][row[1]] = iou - print(tabulate(table, headers=["rasterize", "paste", "iou"], tablefmt="simple")) - # assert that the reconstruction is good: - self.assertTrue(res_dic["polygon"]["aligned"] > 0.94) - self.assertTrue(res_dic["roialign"]["aligned"] > 0.95) - - def process_annotation(self, ann, mask_side_len=28): - # Parse annotation data - img_info = self.coco.loadImgs(ids=[ann["image_id"]])[0] - height, width = img_info["height"], img_info["width"] - gt_polygons = [np.array(p, dtype=np.float64) for p in ann["segmentation"]] - gt_bbox = BoxMode.convert(ann["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) - gt_bit_mask = polygons_to_bitmask(gt_polygons, height, width) - - # Run rasterize .. - torch_gt_bbox = torch.tensor(gt_bbox).to(dtype=torch.float32).reshape(-1, 4) - box_bitmasks = { - "polygon": PolygonMasks([gt_polygons]).crop_and_resize(torch_gt_bbox, mask_side_len)[0], - "gridsample": rasterize_polygons_with_grid_sample(gt_bit_mask, gt_bbox, mask_side_len), - "roialign": BitMasks(torch.from_numpy(gt_bit_mask[None, :, :])).crop_and_resize( - torch_gt_bbox, mask_side_len - )[0], - } - - # Run paste .. - results = defaultdict(dict) - for k, box_bitmask in box_bitmasks.items(): - padded_bitmask, scale = pad_masks(box_bitmask[None, :, :], 1) - scaled_boxes = scale_boxes(torch_gt_bbox, scale) - - r = results[k] - r["old"] = paste_mask_in_image_old( - padded_bitmask[0], scaled_boxes[0], height, width, threshold=0.5 - ) - r["aligned"] = paste_masks_in_image( - box_bitmask[None, :, :], Boxes(torch_gt_bbox), (height, width) - )[0] - - table = [] - for rasterize_method, r in results.items(): - for paste_method, mask in r.items(): - mask = np.asarray(mask) - iou = iou_between_full_image_bit_masks(gt_bit_mask.astype("uint8"), mask) - table.append((rasterize_method, paste_method, iou)) - return table - - def test_polygon_area(self): - # Draw polygon boxes - for d in [5.0, 10.0, 1000.0]: - polygon = PolygonMasks([[[0, 0, 0, d, d, d, d, 0]]]) - area = polygon.area()[0] - target = d ** 2 - self.assertEqual(area, target) - - # Draw polygon triangles - for d in [5.0, 10.0, 1000.0]: - polygon = PolygonMasks([[[0, 0, 0, d, d, d]]]) - area = polygon.area()[0] - target = d ** 2 / 2 - self.assertEqual(area, target) - - def test_paste_mask_scriptable(self): - scripted_f = torch.jit.script(paste_masks_in_image) - N = 10 - masks = torch.rand(N, 28, 28) - boxes = Boxes(random_boxes(N, 100)).tensor - image_shape = (150, 150) - - out = paste_masks_in_image(masks, boxes, image_shape) - scripted_out = scripted_f(masks, boxes, image_shape) - self.assertTrue(torch.equal(out, scripted_out)) - - -def benchmark_paste(): - S = 800 - H, W = image_shape = (S, S) - N = 64 - torch.manual_seed(42) - masks = torch.rand(N, 28, 28) - - center = torch.rand(N, 2) * 600 + 100 - wh = torch.clamp(torch.randn(N, 2) * 40 + 200, min=50) - x0y0 = torch.clamp(center - wh * 0.5, min=0.0) - x1y1 = torch.clamp(center + wh * 0.5, max=S) - boxes = Boxes(torch.cat([x0y0, x1y1], axis=1)) - - def func(device, n=3): - m = masks.to(device=device) - b = boxes.to(device=device) - - def bench(): - for _ 
in range(n): - paste_masks_in_image(m, b, image_shape) - if device.type == "cuda": - torch.cuda.synchronize() - - return bench - - specs = [{"device": torch.device("cpu"), "n": 3}] - if torch.cuda.is_available(): - specs.append({"device": torch.device("cuda"), "n": 3}) - - benchmark(func, "paste_masks", specs, num_iters=10, warmup_iters=2) - - -if __name__ == "__main__": - benchmark_paste() - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms.py deleted file mode 100755 index a042db61..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms.py +++ /dev/null @@ -1,33 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import absolute_import, division, print_function, unicode_literals -import unittest -import torch - -from detectron2.layers import batched_nms -from detectron2.utils.testing import random_boxes - - -class TestNMS(unittest.TestCase): - def _create_tensors(self, N): - boxes = random_boxes(N, 200) - scores = torch.rand(N) - return boxes, scores - - def test_nms_scriptability(self): - N = 2000 - num_classes = 50 - boxes, scores = self._create_tensors(N) - idxs = torch.randint(0, num_classes, (N,)) - scripted_batched_nms = torch.jit.script(batched_nms) - err_msg = "NMS is incompatible with jit-scripted NMS for IoU={}" - - for iou in [0.2, 0.5, 0.8]: - keep_ref = batched_nms(boxes, scores, idxs, iou) - backup = boxes.clone() - scripted_keep = scripted_batched_nms(boxes, scores, idxs, iou) - assert torch.allclose(boxes, backup), "boxes modified by jit-scripted batched_nms" - self.assertTrue(torch.equal(keep_ref, scripted_keep), err_msg.format(iou)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py deleted file mode 100755 index 4b453848..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_nms_rotated.py +++ /dev/null @@ -1,172 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import absolute_import, division, print_function, unicode_literals -import numpy as np -import unittest -from copy import deepcopy -import torch -from torchvision import ops - -from detectron2.layers import batched_nms, batched_nms_rotated, nms_rotated -from detectron2.utils.testing import random_boxes - - -def nms_edit_distance(keep1, keep2): - """ - Compare the "keep" results of two NMS calls. - They are allowed to be different in terms of edit distance - due to floating point precision issues, e.g., - if a box happens to have an IoU of 0.5 with another box, - one implementation may choose to keep it while another may discard it.
- """ - keep1, keep2 = keep1.cpu(), keep2.cpu() - if torch.equal(keep1, keep2): - # they should be equal most of the time - return 0 - keep1, keep2 = tuple(keep1), tuple(keep2) - m, n = len(keep1), len(keep2) - - # edit distance with DP - f = [np.arange(n + 1), np.arange(n + 1)] - for i in range(m): - cur_row = i % 2 - other_row = (i + 1) % 2 - f[other_row][0] = i + 1 - for j in range(n): - f[other_row][j + 1] = ( - f[cur_row][j] - if keep1[i] == keep2[j] - else min(min(f[cur_row][j], f[cur_row][j + 1]), f[other_row][j]) + 1 - ) - return f[m % 2][n] - - -class TestNMSRotated(unittest.TestCase): - def reference_horizontal_nms(self, boxes, scores, iou_threshold): - """ - Args: - box_scores (N, 5): boxes in corner-form and probabilities. - (Note here 5 == 4 + 1, i.e., 4-dim horizontal box + 1-dim prob) - iou_threshold: intersection over union threshold. - Returns: - picked: a list of indexes of the kept boxes - """ - picked = [] - _, indexes = scores.sort(descending=True) - while len(indexes) > 0: - current = indexes[0] - picked.append(current.item()) - if len(indexes) == 1: - break - current_box = boxes[current, :] - indexes = indexes[1:] - rest_boxes = boxes[indexes, :] - iou = ops.box_iou(rest_boxes, current_box.unsqueeze(0)).squeeze(1) - indexes = indexes[iou <= iou_threshold] - - return torch.as_tensor(picked) - - def _create_tensors(self, N, device="cpu"): - boxes = random_boxes(N, 200, device=device) - scores = torch.rand(N, device=device) - return boxes, scores - - def test_batched_nms_rotated_0_degree_cpu(self, device="cpu"): - N = 2000 - num_classes = 50 - boxes, scores = self._create_tensors(N, device=device) - idxs = torch.randint(0, num_classes, (N,)) - rotated_boxes = torch.zeros(N, 5, device=device) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - err_msg = "Rotated NMS with 0 degree is incompatible with horizontal NMS for IoU={}" - for iou in [0.2, 0.5, 0.8]: - backup = boxes.clone() - keep_ref = batched_nms(boxes, scores, idxs, iou) - assert torch.allclose(boxes, backup), "boxes modified by batched_nms" - backup = rotated_boxes.clone() - keep = batched_nms_rotated(rotated_boxes, scores, idxs, iou) - assert torch.allclose( - rotated_boxes, backup - ), "rotated_boxes modified by batched_nms_rotated" - # Occasionally the gap can be large if there are many IOU on the threshold boundary - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 5, err_msg.format(iou)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_batched_nms_rotated_0_degree_cuda(self): - self.test_batched_nms_rotated_0_degree_cpu(device="cuda") - - def test_nms_rotated_0_degree_cpu(self, device="cpu"): - N = 1000 - boxes, scores = self._create_tensors(N, device=device) - rotated_boxes = torch.zeros(N, 5, device=device) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not 
available") - def test_nms_rotated_0_degree_cuda(self): - self.test_nms_rotated_0_degree_cpu(device="cuda") - - def test_nms_rotated_90_degrees_cpu(self): - N = 1000 - boxes, scores = self._create_tensors(N) - rotated_boxes = torch.zeros(N, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - # Note for rotated_boxes[:, 2] and rotated_boxes[:, 3]: - # widths and heights are intentionally swapped here for 90 degrees case - # so that the reference horizontal nms could be used - rotated_boxes[:, 2] = boxes[:, 3] - boxes[:, 1] - rotated_boxes[:, 3] = boxes[:, 2] - boxes[:, 0] - - rotated_boxes[:, 4] = torch.ones(N) * 90 - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - def test_nms_rotated_180_degrees_cpu(self): - N = 1000 - boxes, scores = self._create_tensors(N) - rotated_boxes = torch.zeros(N, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - rotated_boxes[:, 4] = torch.ones(N) * 180 - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - -class TestScriptable(unittest.TestCase): - def setUp(self): - class TestingModule(torch.nn.Module): - def forward(self, boxes, scores, threshold): - return nms_rotated(boxes, scores, threshold) - - self.module = TestingModule() - - def test_scriptable_cpu(self): - m = deepcopy(self.module).cpu() - _ = torch.jit.script(m) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_scriptable_cuda(self): - m = deepcopy(self.module).cuda() - _ = torch.jit.script(m) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py deleted file mode 100755 index b6fd8ede..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align.py +++ /dev/null @@ -1,210 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import numpy as np -import unittest -from copy import copy -import cv2 -import torch -from fvcore.common.benchmark import benchmark -from torch.nn import functional as F - -from detectron2.layers.roi_align import ROIAlign, roi_align - - -class ROIAlignTest(unittest.TestCase): - def test_forward_output(self): - input = np.arange(25).reshape(5, 5).astype("float32") - """ - 0 1 2 3 4 - 5 6 7 8 9 - 10 11 12 13 14 - 15 16 17 18 19 - 20 21 22 23 24 - """ - - output = self._simple_roialign(input, [1, 1, 3, 3], (4, 4), aligned=False) - output_correct = self._simple_roialign(input, [1, 1, 3, 3], (4, 4), aligned=True) - - # without correction: - old_results = [ - [7.5, 8, 8.5, 9], - [10, 10.5, 11, 11.5], - [12.5, 13, 13.5, 14], - [15, 15.5, 16, 16.5], - ] - - # with 0.5 correction: - correct_results = [ - [4.5, 5.0, 5.5, 6.0], - [7.0, 7.5, 8.0, 8.5], - [9.5, 10.0, 10.5, 11.0], - [12.0, 12.5, 13.0, 13.5], - ] - # This is an upsampled version of [[6, 7], [11, 12]] - - self.assertTrue(np.allclose(output.flatten(), np.asarray(old_results).flatten())) - self.assertTrue( - np.allclose(output_correct.flatten(), np.asarray(correct_results).flatten()) - ) - - # Also see similar issues in tensorflow at - # https://github.com/tensorflow/tensorflow/issues/26278 - - def test_resize(self): - H, W = 30, 30 - input = np.random.rand(H, W).astype("float32") * 100 - box = [10, 10, 20, 20] - output = self._simple_roialign(input, box, (5, 5), aligned=True) - - input2x = cv2.resize(input, (W // 2, H // 2), interpolation=cv2.INTER_LINEAR) - box2x = [x / 2 for x in box] - output2x = self._simple_roialign(input2x, box2x, (5, 5), aligned=True) - diff = np.abs(output2x - output) - self.assertTrue(diff.max() < 1e-4) - - def test_grid_sample_equivalence(self): - H, W = 30, 30 - input = np.random.rand(H, W).astype("float32") * 100 - box = [10, 10, 20, 20] - for ratio in [1, 2, 3]: - output = self._simple_roialign(input, box, (5, 5), sampling_ratio=ratio) - output_grid_sample = grid_sample_roi_align( - torch.from_numpy(input[None, None, :, :]).float(), - torch.as_tensor(box).float()[None, :], - 5, - 1.0, - ratio, - ) - self.assertTrue(torch.allclose(output, output_grid_sample)) - - def _simple_roialign(self, img, box, resolution, sampling_ratio=0, aligned=True): - """ - RoiAlign with scale 1.0. 
- """ - if isinstance(resolution, int): - resolution = (resolution, resolution) - op = ROIAlign(resolution, 1.0, sampling_ratio, aligned=aligned) - input = torch.from_numpy(img[None, None, :, :].astype("float32")) - - rois = [0] + list(box) - rois = torch.from_numpy(np.asarray(rois)[None, :].astype("float32")) - output = op.forward(input, rois) - if torch.cuda.is_available(): - output_cuda = op.forward(input.cuda(), rois.cuda()).cpu() - self.assertTrue(torch.allclose(output, output_cuda)) - return output[0, 0] - - def _simple_roialign_with_grad(self, img, box, resolution, device): - if isinstance(resolution, int): - resolution = (resolution, resolution) - - op = ROIAlign(resolution, 1.0, 0, aligned=True) - input = torch.from_numpy(img[None, None, :, :].astype("float32")) - - rois = [0] + list(box) - rois = torch.from_numpy(np.asarray(rois)[None, :].astype("float32")) - input = input.to(device=device) - rois = rois.to(device=device) - input.requires_grad = True - output = op.forward(input, rois) - return input, output - - def test_empty_box(self): - img = np.random.rand(5, 5) - box = [3, 4, 5, 4] - o = self._simple_roialign(img, box, 7) - self.assertTrue(o.shape == (7, 7)) - self.assertTrue((o == 0).all()) - - for dev in ["cpu"] + ["cuda"] if torch.cuda.is_available() else []: - input, output = self._simple_roialign_with_grad(img, box, 7, torch.device(dev)) - output.sum().backward() - self.assertTrue(torch.allclose(input.grad, torch.zeros_like(input))) - - def test_empty_batch(self): - input = torch.zeros(0, 3, 10, 10, dtype=torch.float32) - rois = torch.zeros(0, 5, dtype=torch.float32) - op = ROIAlign((7, 7), 1.0, 0, aligned=True) - output = op.forward(input, rois) - self.assertTrue(output.shape == (0, 3, 7, 7)) - - -def grid_sample_roi_align(input, boxes, output_size, scale, sampling_ratio): - # unlike true roi_align, this does not support different batch_idx - from detectron2.projects.point_rend.point_features import ( - generate_regular_grid_point_coords, - get_point_coords_wrt_image, - point_sample, - ) - - N, _, H, W = input.shape - R = len(boxes) - assert N == 1 - boxes = boxes * scale - grid = generate_regular_grid_point_coords(R, output_size * sampling_ratio, device=boxes.device) - coords = get_point_coords_wrt_image(boxes, grid) - coords = coords / torch.as_tensor([W, H], device=coords.device) # R, s^2, 2 - res = point_sample(input, coords.unsqueeze(0), align_corners=False) # 1,C, R,s^2 - res = ( - res.squeeze(0) - .permute(1, 0, 2) - .reshape(R, -1, output_size * sampling_ratio, output_size * sampling_ratio) - ) - res = F.avg_pool2d(res, sampling_ratio) - return res - - -def benchmark_roi_align(): - def random_boxes(mean_box, stdev, N, maxsize): - ret = torch.rand(N, 4) * stdev + torch.tensor(mean_box, dtype=torch.float) - ret.clamp_(min=0, max=maxsize) - return ret - - def func(shape, nboxes_per_img, sampling_ratio, device, box_size="large"): - N, _, H, _ = shape - input = torch.rand(*shape) - boxes = [] - batch_idx = [] - for k in range(N): - if box_size == "large": - b = random_boxes([80, 80, 130, 130], 24, nboxes_per_img, H) - else: - b = random_boxes([100, 100, 110, 110], 4, nboxes_per_img, H) - boxes.append(b) - batch_idx.append(torch.zeros(nboxes_per_img, 1, dtype=torch.float32) + k) - boxes = torch.cat(boxes, axis=0) - batch_idx = torch.cat(batch_idx, axis=0) - boxes = torch.cat([batch_idx, boxes], axis=1) - - input = input.to(device=device) - boxes = boxes.to(device=device) - - def bench(): - if False and sampling_ratio > 0 and N == 1: - # enable to benchmark grid_sample 
(slower) - grid_sample_roi_align(input, boxes[:, 1:], 7, 1.0, sampling_ratio) - else: - roi_align(input, boxes, 7, 1.0, sampling_ratio, True) - if device == "cuda": - torch.cuda.synchronize() - - return bench - - def gen_args(arg): - args = [] - for size in ["small", "large"]: - for ratio in [0, 2]: - args.append(copy(arg)) - args[-1]["sampling_ratio"] = ratio - args[-1]["box_size"] = size - return args - - arg = dict(shape=(1, 512, 256, 256), nboxes_per_img=512, device="cuda") - benchmark(func, "cuda_roialign", gen_args(arg), num_iters=20, warmup_iters=1) - arg.update({"device": "cpu", "shape": (1, 256, 128, 128)}) - benchmark(func, "cpu_roialign", gen_args(arg), num_iters=5, warmup_iters=1) - - -if __name__ == "__main__": - if torch.cuda.is_available(): - benchmark_roi_align() - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py deleted file mode 100755 index 7323d7d5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py +++ /dev/null @@ -1,176 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import cv2 -import torch -from torch.autograd import Variable, gradcheck - -from detectron2.layers.roi_align import ROIAlign -from detectron2.layers.roi_align_rotated import ROIAlignRotated - -logger = logging.getLogger(__name__) - - -class ROIAlignRotatedTest(unittest.TestCase): - def _box_to_rotated_box(self, box, angle): - return [ - (box[0] + box[2]) / 2.0, - (box[1] + box[3]) / 2.0, - box[2] - box[0], - box[3] - box[1], - angle, - ] - - def _rot90(self, img, num): - num = num % 4 # note: -1 % 4 == 3 - for _ in range(num): - img = img.transpose(0, 1).flip(0) - return img - - def test_forward_output_0_90_180_270(self): - for i in range(4): - # i = 0, 1, 2, 3 corresponding to 0, 90, 180, 270 degrees - img = torch.arange(25, dtype=torch.float32).reshape(5, 5) - """ - 0 1 2 3 4 - 5 6 7 8 9 - 10 11 12 13 14 - 15 16 17 18 19 - 20 21 22 23 24 - """ - box = [1, 1, 3, 3] - rotated_box = self._box_to_rotated_box(box=box, angle=90 * i) - - result = self._simple_roi_align_rotated(img=img, box=rotated_box, resolution=(4, 4)) - - # Here's an explanation for 0 degree case: - # point 0 in the original input lies at [0.5, 0.5] - # (the center of bin [0, 1] x [0, 1]) - # point 1 in the original input lies at [1.5, 0.5], etc. 
- # since the resolution is (4, 4) that divides [1, 3] x [1, 3] - # into 4 x 4 equal bins, - # the top-left bin is [1, 1.5] x [1, 1.5], and its center - # (1.25, 1.25) lies at the 3/4 position - # between point 0 and point 1, point 5 and point 6, - # point 0 and point 5, point 1 and point 6, so it can be calculated as - # 0.25*(0*0.25+1*0.75)+(5*0.25+6*0.75)*0.75 = 4.5 - result_expected = torch.tensor( - [ - [4.5, 5.0, 5.5, 6.0], - [7.0, 7.5, 8.0, 8.5], - [9.5, 10.0, 10.5, 11.0], - [12.0, 12.5, 13.0, 13.5], - ] - ) - # This is also an upsampled version of [[6, 7], [11, 12]] - - # When the box is rotated by 90 degrees CCW, - # the result would be rotated by 90 degrees CW, thus it's -i here - result_expected = self._rot90(result_expected, -i) - - assert torch.allclose(result, result_expected) - - def test_resize(self): - H, W = 30, 30 - input = torch.rand(H, W) * 100 - box = [10, 10, 20, 20] - rotated_box = self._box_to_rotated_box(box, angle=0) - output = self._simple_roi_align_rotated(img=input, box=rotated_box, resolution=(5, 5)) - - input2x = cv2.resize(input.numpy(), (W // 2, H // 2), interpolation=cv2.INTER_LINEAR) - input2x = torch.from_numpy(input2x) - box2x = [x / 2 for x in box] - rotated_box2x = self._box_to_rotated_box(box2x, angle=0) - output2x = self._simple_roi_align_rotated(img=input2x, box=rotated_box2x, resolution=(5, 5)) - assert torch.allclose(output2x, output) - - def _simple_roi_align_rotated(self, img, box, resolution): - """ - RoiAlignRotated with scale 1.0 and 0 sample ratio. - """ - op = ROIAlignRotated(output_size=resolution, spatial_scale=1.0, sampling_ratio=0) - input = img[None, None, :, :] - - rois = [0] + list(box) - rois = torch.tensor(rois, dtype=torch.float32)[None, :] - result_cpu = op.forward(input, rois) - if torch.cuda.is_available(): - result_cuda = op.forward(input.cuda(), rois.cuda()) - assert torch.allclose(result_cpu, result_cuda.cpu()) - return result_cpu[0, 0] - - def test_empty_box(self): - img = torch.rand(5, 5) - out = self._simple_roi_align_rotated(img, [2, 3, 0, 0, 0], (7, 7)) - self.assertTrue((out == 0).all()) - - def test_roi_align_rotated_gradcheck_cpu(self): - dtype = torch.float64 - device = torch.device("cpu") - roi_align_rotated_op = ROIAlignRotated( - output_size=(5, 5), spatial_scale=0.5, sampling_ratio=1 - ).to(dtype=dtype, device=device) - x = torch.rand(1, 1, 10, 10, dtype=dtype, device=device, requires_grad=True) - # roi format is (batch index, x_center, y_center, width, height, angle) - rois = torch.tensor( - [[0, 4.5, 4.5, 9, 9, 0], [0, 2, 7, 4, 4, 0], [0, 7, 7, 4, 4, 0]], - dtype=dtype, - device=device, - ) - - def func(input): - return roi_align_rotated_op(input, rois) - - assert gradcheck(func, (x,)), "gradcheck failed for RoIAlignRotated CPU" - assert gradcheck(func, (x.transpose(2, 3),)), "gradcheck failed for RoIAlignRotated CPU" - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_roi_align_rotated_gradient_cuda(self): - """ - Compute gradients for ROIAlignRotated with multiple bounding boxes on the GPU, - and compare the result with ROIAlign - """ - # torch.manual_seed(123) - dtype = torch.float64 - device = torch.device("cuda") - pool_h, pool_w = (5, 5) - - roi_align = ROIAlign(output_size=(pool_h, pool_w), spatial_scale=1, sampling_ratio=2).to( - device=device - ) - - roi_align_rotated = ROIAlignRotated( - output_size=(pool_h, pool_w), spatial_scale=1, sampling_ratio=2 - ).to(device=device) - - x = torch.rand(1, 1, 10, 10, dtype=dtype, device=device, requires_grad=True) - # x_rotated 
= x.clone() won't work (will lead to grad_fun=CloneBackward)! - x_rotated = Variable(x.data.clone(), requires_grad=True) - - # roi_rotated format is (batch index, x_center, y_center, width, height, angle) - rois_rotated = torch.tensor( - [[0, 4.5, 4.5, 9, 9, 0], [0, 2, 7, 4, 4, 0], [0, 7, 7, 4, 4, 0]], - dtype=dtype, - device=device, - ) - - y_rotated = roi_align_rotated(x_rotated, rois_rotated) - s_rotated = y_rotated.sum() - s_rotated.backward() - - # roi format is (batch index, x1, y1, x2, y2) - rois = torch.tensor( - [[0, 0, 0, 9, 9], [0, 0, 5, 4, 9], [0, 5, 5, 9, 9]], dtype=dtype, device=device - ) - - y = roi_align(x, rois) - s = y.sum() - s.backward() - - assert torch.allclose( - x.grad, x_rotated.grad - ), "gradients for ROIAlign and ROIAlignRotated mismatch on CUDA" - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py deleted file mode 100755 index 13a808e5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_anchor_generator.py +++ /dev/null @@ -1,120 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.config import get_cfg -from detectron2.layers import ShapeSpec -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator, RotatedAnchorGenerator - -logger = logging.getLogger(__name__) - - -class TestAnchorGenerator(unittest.TestCase): - def test_default_anchor_generator(self): - cfg = get_cfg() - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1, 4]] - - anchor_generator = DefaultAnchorGenerator(cfg, [ShapeSpec(stride=4)]) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - anchors = anchor_generator([features["stage3"]]) - expected_anchor_tensor = torch.tensor( - [ - [-32.0, -8.0, 32.0, 8.0], - [-16.0, -16.0, 16.0, 16.0], - [-8.0, -32.0, 8.0, 32.0], - [-64.0, -16.0, 64.0, 16.0], - [-32.0, -32.0, 32.0, 32.0], - [-16.0, -64.0, 16.0, 64.0], - [-28.0, -8.0, 36.0, 8.0], # -28.0 == -32.0 + STRIDE (4) - [-12.0, -16.0, 20.0, 16.0], - [-4.0, -32.0, 12.0, 32.0], - [-60.0, -16.0, 68.0, 16.0], - [-28.0, -32.0, 36.0, 32.0], - [-12.0, -64.0, 20.0, 64.0], - ] - ) - - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - def test_default_anchor_generator_centered(self): - # test explicit args - anchor_generator = DefaultAnchorGenerator( - sizes=[32, 64], aspect_ratios=[0.25, 1, 4], strides=[4] - ) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - expected_anchor_tensor = torch.tensor( - [ - [-30.0, -6.0, 34.0, 10.0], - [-14.0, -14.0, 18.0, 18.0], - [-6.0, -30.0, 10.0, 34.0], - [-62.0, -14.0, 66.0, 18.0], - [-30.0, -30.0, 34.0, 34.0], - [-14.0, -62.0, 18.0, 66.0], - [-26.0, -6.0, 38.0, 10.0], - [-10.0, -14.0, 22.0, 18.0], - [-2.0, -30.0, 14.0, 34.0], - [-58.0, -14.0, 70.0, 18.0], - [-26.0, -30.0, 38.0, 34.0], - [-10.0, -62.0, 22.0, 66.0], - ] - ) - - anchors = anchor_generator([features["stage3"]]) - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - anchors = torch.jit.script(anchor_generator)([features["stage3"]]) - self.assertTrue(torch.allclose(anchors[0].tensor, 
expected_anchor_tensor)) - - def test_rrpn_anchor_generator(self): - cfg = get_cfg() - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1, 4]] - cfg.MODEL.ANCHOR_GENERATOR.ANGLES = [0, 45] # test single list[float] - anchor_generator = RotatedAnchorGenerator(cfg, [ShapeSpec(stride=4)]) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - anchors = anchor_generator([features["stage3"]]) - expected_anchor_tensor = torch.tensor( - [ - [0.0, 0.0, 64.0, 16.0, 0.0], - [0.0, 0.0, 64.0, 16.0, 45.0], - [0.0, 0.0, 32.0, 32.0, 0.0], - [0.0, 0.0, 32.0, 32.0, 45.0], - [0.0, 0.0, 16.0, 64.0, 0.0], - [0.0, 0.0, 16.0, 64.0, 45.0], - [0.0, 0.0, 128.0, 32.0, 0.0], - [0.0, 0.0, 128.0, 32.0, 45.0], - [0.0, 0.0, 64.0, 64.0, 0.0], - [0.0, 0.0, 64.0, 64.0, 45.0], - [0.0, 0.0, 32.0, 128.0, 0.0], - [0.0, 0.0, 32.0, 128.0, 45.0], - [4.0, 0.0, 64.0, 16.0, 0.0], # 4.0 == 0.0 + STRIDE (4) - [4.0, 0.0, 64.0, 16.0, 45.0], - [4.0, 0.0, 32.0, 32.0, 0.0], - [4.0, 0.0, 32.0, 32.0, 45.0], - [4.0, 0.0, 16.0, 64.0, 0.0], - [4.0, 0.0, 16.0, 64.0, 45.0], - [4.0, 0.0, 128.0, 32.0, 0.0], - [4.0, 0.0, 128.0, 32.0, 45.0], - [4.0, 0.0, 64.0, 64.0, 0.0], - [4.0, 0.0, 64.0, 64.0, 45.0], - [4.0, 0.0, 32.0, 128.0, 0.0], - [4.0, 0.0, 32.0, 128.0, 45.0], - ] - ) - - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py deleted file mode 100755 index 3bb100f9..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_backbone.py +++ /dev/null @@ -1,34 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved - -import unittest -import torch - -import detectron2.export.torchscript # apply patch # noqa -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.layers import ShapeSpec -from detectron2.modeling.backbone import build_resnet_backbone -from detectron2.modeling.backbone.fpn import build_resnet_fpn_backbone - - -class TestBackBone(unittest.TestCase): - def test_resnet_scriptability(self): - cfg = get_cfg() - resnet = build_resnet_backbone(cfg, ShapeSpec(channels=3)) - - scripted_resnet = torch.jit.script(resnet) - - inp = torch.rand(2, 3, 100, 100) - out1 = resnet(inp)["res4"] - out2 = scripted_resnet(inp)["res4"] - self.assertTrue(torch.allclose(out1, out2)) - - def test_fpn_scriptability(self): - cfg = model_zoo.get_config("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml") - bb = build_resnet_fpn_backbone(cfg, ShapeSpec(channels=3)) - bb_s = torch.jit.script(bb) - - inp = torch.rand(2, 3, 128, 128) - out1 = bb(inp)["p5"] - out2 = bb_s(inp)["p5"] - self.assertTrue(torch.allclose(out1, out2)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py deleted file mode 100755 index fd3a7b79..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_box2box_transform.py +++ /dev/null @@ -1,94 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
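The expected tensor in the anchor generator tests above encodes a fixed enumeration over sizes and aspect ratios. A minimal sketch of that arithmetic, assuming the usual area-preserving parameterization (the `base_anchors` helper is hypothetical, not a detectron2 API): each (size, ratio) pair yields a box of area size**2 with h/w = ratio, centered at the origin, and the second half of the expected tensor is the same set shifted by the stride (4) along x, one shift per feature-map cell.

```python
import math

def base_anchors(sizes, aspect_ratios):
    """Enumerate centered anchors: one box of area size**2 per (size, ratio)."""
    anchors = []
    for size in sizes:
        area = size ** 2
        for ratio in aspect_ratios:
            w = math.sqrt(area / ratio)  # solve w * h = area with h = ratio * w
            h = ratio * w
            anchors.append((-w / 2, -h / 2, w / 2, h / 2))
    return anchors

# (-32, -8, 32, 8), (-16, -16, 16, 16), (-8, -32, 8, 32), (-64, -16, 64, 16), ...
# i.e. the first six rows of expected_anchor_tensor above.
print(base_anchors([32, 64], [0.25, 1, 4]))
```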
-import logging -import unittest -import torch - -from detectron2.modeling.box_regression import ( - Box2BoxTransform, - Box2BoxTransformLinear, - Box2BoxTransformRotated, -) -from detectron2.utils.testing import random_boxes - -logger = logging.getLogger(__name__) - - -class TestBox2BoxTransform(unittest.TestCase): - def test_reconstruction(self): - weights = (5, 5, 10, 10) - b2b_tfm = Box2BoxTransform(weights=weights) - src_boxes = random_boxes(10) - dst_boxes = random_boxes(10) - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_tfm.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_tfm.apply_deltas(deltas, src_boxes) - self.assertTrue(torch.allclose(dst_boxes, dst_boxes_reconstructed)) - - def test_apply_deltas_tracing(self): - weights = (5, 5, 10, 10) - b2b_tfm = Box2BoxTransform(weights=weights) - - with torch.no_grad(): - func = torch.jit.trace(b2b_tfm.apply_deltas, (torch.randn(10, 20), torch.randn(10, 4))) - - o = func(torch.randn(10, 20), torch.randn(10, 4)) - self.assertEqual(o.shape, (10, 20)) - o = func(torch.randn(5, 20), torch.randn(5, 4)) - self.assertEqual(o.shape, (5, 20)) - - -def random_rotated_boxes(mean_box, std_length, std_angle, N): - return torch.cat( - [torch.rand(N, 4) * std_length, torch.rand(N, 1) * std_angle], dim=1 - ) + torch.tensor(mean_box, dtype=torch.float) - - -class TestBox2BoxTransformRotated(unittest.TestCase): - def test_reconstruction(self): - weights = (5, 5, 10, 10, 1) - b2b_transform = Box2BoxTransformRotated(weights=weights) - src_boxes = random_rotated_boxes([10, 10, 20, 20, -30], 5, 60.0, 10) - dst_boxes = random_rotated_boxes([10, 10, 20, 20, -30], 5, 60.0, 10) - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_transform.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_transform.apply_deltas(deltas, src_boxes) - assert torch.allclose(dst_boxes[:, :4], dst_boxes_reconstructed[:, :4], atol=1e-5) - # angle difference has to be normalized - assert torch.allclose( - (dst_boxes[:, 4] - dst_boxes_reconstructed[:, 4] + 180.0) % 360.0 - 180.0, - torch.zeros_like(dst_boxes[:, 4]), - atol=1e-4, - ) - - -class TestBox2BoxTransformLinear(unittest.TestCase): - def test_reconstruction(self): - b2b_tfm = Box2BoxTransformLinear() - src_boxes = random_boxes(10) - dst_boxes = torch.tensor([0, 0, 101, 101] * 10).reshape(10, 4).float() - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_tfm.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_tfm.apply_deltas(deltas, src_boxes) - self.assertTrue(torch.allclose(dst_boxes, dst_boxes_reconstructed, atol=1e-3)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py deleted file mode 100755 index e29b944b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py +++ /dev/null @@ -1,171 +0,0 @@ -# Copyright 
(c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.layers import ShapeSpec -from detectron2.modeling.box_regression import Box2BoxTransform, Box2BoxTransformRotated -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers -from detectron2.modeling.roi_heads.rotated_fast_rcnn import RotatedFastRCNNOutputLayers -from detectron2.structures import Boxes, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage - -logger = logging.getLogger(__name__) - - -class FastRCNNTest(unittest.TestCase): - def test_fast_rcnn(self): - torch.manual_seed(132) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - feature_pooled = torch.rand(2, box_head_output_size) - predictions = box_predictor(feature_pooled) - - proposal_boxes = torch.tensor([[0.8, 1.1, 3.2, 2.8], [2.3, 2.5, 7, 8]], dtype=torch.float32) - gt_boxes = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - proposal = Instances((10, 10)) - proposal.proposal_boxes = Boxes(proposal_boxes) - proposal.gt_boxes = Boxes(gt_boxes) - proposal.gt_classes = torch.tensor([1, 2]) - - with EventStorage(): # capture events in a new storage to discard them - losses = box_predictor.losses(predictions, [proposal]) - - expected_losses = { - "loss_cls": torch.tensor(1.7951188087), - "loss_box_reg": torch.tensor(4.0357131958), - } - for name in expected_losses.keys(): - assert torch.allclose(losses[name], expected_losses[name]) - - def test_fast_rcnn_empty_batch(self, device="cpu"): - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=10), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=8, - ).to(device=device) - - logits = torch.randn(0, 100, requires_grad=True, device=device) - deltas = torch.randn(0, 4, requires_grad=True, device=device) - losses = box_predictor.losses([logits, deltas], []) - for value in losses.values(): - self.assertTrue(torch.allclose(value, torch.zeros_like(value))) - sum(losses.values()).backward() - self.assertTrue(logits.grad is not None) - self.assertTrue(deltas.grad is not None) - - predictions, _ = box_predictor.inference([logits, deltas], []) - self.assertEqual(len(predictions), 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_fast_rcnn_empty_batch_cuda(self): - self.test_fast_rcnn_empty_batch(device=torch.device("cuda")) - - def test_fast_rcnn_rotated(self): - torch.manual_seed(132) - box_head_output_size = 8 - - box_predictor = RotatedFastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransformRotated(weights=(10, 10, 5, 5, 1)), - num_classes=5, - ) - feature_pooled = torch.rand(2, box_head_output_size) - predictions = box_predictor(feature_pooled) - proposal_boxes = torch.tensor( - [[2, 1.95, 2.4, 1.7, 0], [4.65, 5.25, 4.7, 5.5, 0]], dtype=torch.float32 - ) - gt_boxes = torch.tensor([[2, 2, 2, 2, 0], [4, 4, 4, 4, 0]], dtype=torch.float32) - proposal = Instances((10, 10)) - proposal.proposal_boxes = RotatedBoxes(proposal_boxes) - proposal.gt_boxes = RotatedBoxes(gt_boxes) - proposal.gt_classes = torch.tensor([1, 2]) - - with EventStorage(): # capture events in a new storage to discard them - losses = box_predictor.losses(predictions, [proposal]) - - # Note: the expected losses are slightly different even if - # the boxes are essentially the same as in the FastRCNNOutput test, 
because - # bbox_pred in FastRCNNOutputLayers have different Linear layers/initialization - # between the two cases. - expected_losses = { - "loss_cls": torch.tensor(1.7920907736), - "loss_box_reg": torch.tensor(4.0410838127), - } - for name in expected_losses.keys(): - assert torch.allclose(losses[name], expected_losses[name]) - - def test_predict_boxes_tracing(self): - class Model(torch.nn.Module): - def __init__(self, output_layer): - super(Model, self).__init__() - self._output_layer = output_layer - - def forward(self, proposal_deltas, proposal_boxes): - instances = Instances((10, 10)) - instances.proposal_boxes = Boxes(proposal_boxes) - return self._output_layer.predict_boxes((None, proposal_deltas), [instances]) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - - model = Model(box_predictor) - - from detectron2.export.torchscript_patch import patch_builtin_len - - with torch.no_grad(), patch_builtin_len(): - func = torch.jit.trace(model, (torch.randn(10, 20), torch.randn(10, 4))) - - o = func(torch.randn(10, 20), torch.randn(10, 4)) - self.assertEqual(o[0].shape, (10, 20)) - o = func(torch.randn(5, 20), torch.randn(5, 4)) - self.assertEqual(o[0].shape, (5, 20)) - o = func(torch.randn(20, 20), torch.randn(20, 4)) - self.assertEqual(o[0].shape, (20, 20)) - - def test_predict_probs_tracing(self): - class Model(torch.nn.Module): - def __init__(self, output_layer): - super(Model, self).__init__() - self._output_layer = output_layer - - def forward(self, scores, proposal_boxes): - instances = Instances((10, 10)) - instances.proposal_boxes = Boxes(proposal_boxes) - return self._output_layer.predict_probs((scores, None), [instances]) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - - model = Model(box_predictor) - - from detectron2.export.torchscript_patch import patch_builtin_len - - with torch.no_grad(), patch_builtin_len(): - func = torch.jit.trace(model, (torch.randn(10, 6), torch.rand(10, 4))) - o = func(torch.randn(10, 6), torch.randn(10, 4)) - self.assertEqual(o[0].shape, (10, 6)) - o = func(torch.randn(5, 6), torch.randn(5, 4)) - self.assertEqual(o[0].shape, (5, 6)) - o = func(torch.randn(20, 6), torch.randn(20, 4)) - self.assertEqual(o[0].shape, (20, 6)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py deleted file mode 100755 index 6eb2db0c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_matcher.py +++ /dev/null @@ -1,42 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
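The tracing tests above pass at several batch sizes because `torch.jit.trace` records the tensor operations it sees, not Python control flow, so a trace captured at one input size stays valid at other sizes as long as every traced op is shape-agnostic; checking sizes 5, 10, and 20 guards exactly that. A toy illustration (the `probs_like` function is hypothetical, unrelated to detectron2):

```python
import torch

def probs_like(scores: torch.Tensor) -> torch.Tensor:
    # A shape-agnostic op: the trace generalizes across batch sizes.
    return scores.softmax(dim=-1)

traced = torch.jit.trace(probs_like, torch.randn(10, 6))
print(traced(torch.randn(5, 6)).shape)   # torch.Size([5, 6])
print(traced(torch.randn(20, 6)).shape)  # torch.Size([20, 6])
```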
-import unittest -from typing import List -import torch - -from detectron2.config import get_cfg -from detectron2.modeling.matcher import Matcher - - -class TestMatcher(unittest.TestCase): - def test_scriptability(self): - cfg = get_cfg() - anchor_matcher = Matcher( - cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS, allow_low_quality_matches=True - ) - match_quality_matrix = torch.tensor( - [[0.15, 0.45, 0.2, 0.6], [0.3, 0.65, 0.05, 0.1], [0.05, 0.4, 0.25, 0.4]] - ) - expected_matches = torch.tensor([1, 1, 2, 0]) - expected_match_labels = torch.tensor([-1, 1, 0, 1], dtype=torch.int8) - - matches, match_labels = anchor_matcher(match_quality_matrix) - self.assertTrue(torch.allclose(matches, expected_matches)) - self.assertTrue(torch.allclose(match_labels, expected_match_labels)) - - # nonzero_tuple must be imported explicitly to let jit know what it is. - # https://github.com/pytorch/pytorch/issues/38964 - from detectron2.layers import nonzero_tuple # noqa F401 - - def f(thresholds: List[float], labels: List[int]): - return Matcher(thresholds, labels, allow_low_quality_matches=True) - - scripted_anchor_matcher = torch.jit.script(f)( - cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS - ) - matches, match_labels = scripted_anchor_matcher(match_quality_matrix) - self.assertTrue(torch.allclose(matches, expected_matches)) - self.assertTrue(torch.allclose(match_labels, expected_match_labels)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py deleted file mode 100755 index a743b0b6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_mmdet.py +++ /dev/null @@ -1,186 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
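The expected values in the Matcher test above can be reproduced by hand from the RPN defaults (`IOU_THRESHOLDS = [0.3, 0.7]`, `IOU_LABELS = [0, -1, 1]`). The sketch below walks through those semantics in plain torch; it is an illustration of the matching rule, not the actual Matcher implementation:

```python
import torch

quality = torch.tensor(
    [[0.15, 0.45, 0.2, 0.6], [0.3, 0.65, 0.05, 0.1], [0.05, 0.4, 0.25, 0.4]]
)
# Each column (prediction) is matched to its highest-IoU row (ground truth).
matched_vals, matches = quality.max(dim=0)  # [0.30, 0.65, 0.25, 0.60], [1, 1, 2, 0]
labels = torch.full_like(matches, -1, dtype=torch.int8)  # between thresholds: ignore
labels[matched_vals < 0.3] = 0   # below the low threshold: background
labels[matched_vals >= 0.7] = 1  # at or above the high threshold: foreground
# allow_low_quality_matches=True: each gt's best prediction(s) become foreground.
best_per_gt = quality.max(dim=1).values  # [0.60, 0.65, 0.40]
for g in range(quality.shape[0]):
    labels[quality[g] == best_per_gt[g]] = 1
print(matches)  # tensor([1, 1, 2, 0])
print(labels)   # tensor([-1,  1,  0,  1], dtype=torch.int8)
```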
-import unittest - -from detectron2.layers import ShapeSpec -from detectron2.modeling.mmdet_wrapper import MMDetBackbone, MMDetDetector - -try: - import mmdet.models # noqa - - HAS_MMDET = True -except ImportError: - HAS_MMDET = False - - -@unittest.skipIf(not HAS_MMDET, "mmdet not available") -class TestMMDetWrapper(unittest.TestCase): - def test_backbone(self): - MMDetBackbone( - backbone=dict( - type="DetectoRS_ResNet", - conv_cfg=dict(type="ConvAWS"), - sac=dict(type="SAC", use_deform=True), - stage_with_sac=(False, True, True, True), - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - ), - neck=dict( - type="FPN", - in_channels=[256, 512, 1024, 2048], - out_channels=256, - num_outs=5, - ), - # skip pretrained model for tests - # pretrained_backbone="torchvision://resnet50", - output_shapes=[ShapeSpec(channels=256, stride=s) for s in [4, 8, 16, 32, 64]], - output_names=["p2", "p3", "p4", "p5", "p6"], - ) - - def test_detector(self): - # a basic R50 Mask R-CNN - MMDetDetector( - detector=dict( - type="MaskRCNN", - backbone=dict( - type="ResNet", - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - # skip pretrained model for tests - # init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')) - ), - neck=dict( - type="FPN", in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5 - ), - rpn_head=dict( - type="RPNHead", - in_channels=256, - feat_channels=256, - anchor_generator=dict( - type="AnchorGenerator", - scales=[8], - ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - ), - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[1.0, 1.0, 1.0, 1.0], - ), - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=True, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - roi_head=dict( - type="StandardRoIHead", - bbox_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=7, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - bbox_head=dict( - type="Shared2FCBBoxHead", - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[0.1, 0.1, 0.2, 0.2], - ), - reg_class_agnostic=False, - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=False, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - mask_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=14, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - mask_head=dict( - type="FCNMaskHead", - num_convs=4, - in_channels=256, - conv_out_channels=256, - num_classes=80, - loss_mask=dict(type="CrossEntropyLoss", use_mask=True, loss_weight=1.0), - ), - ), - # model training and testing settings - train_cfg=dict( - rpn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.7, - neg_iou_thr=0.3, - min_pos_iou=0.3, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=256, - pos_fraction=0.5, - neg_pos_ub=-1, - add_gt_as_proposals=False, - ), - allowed_border=-1, - pos_weight=-1, - debug=False, - ), - rpn_proposal=dict( - nms_pre=2000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - 
), - rcnn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.5, - neg_iou_thr=0.5, - min_pos_iou=0.5, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True, - ), - mask_size=28, - pos_weight=-1, - debug=False, - ), - ), - test_cfg=dict( - rpn=dict( - nms_pre=1000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - score_thr=0.05, - nms=dict(type="nms", iou_threshold=0.5), - max_per_img=100, - mask_thr_binary=0.5, - ), - ), - ), - pixel_mean=[1, 2, 3], - pixel_std=[1, 2, 3], - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py deleted file mode 100755 index 5da35205..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_model_e2e.py +++ /dev/null @@ -1,223 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - - -import itertools -import unittest -from contextlib import contextmanager -from copy import deepcopy -import torch - -from detectron2.structures import BitMasks, Boxes, ImageList, Instances -from detectron2.utils.events import EventStorage -from detectron2.utils.testing import get_model_no_weights - - -@contextmanager -def typecheck_hook(model, *, in_dtype=None, out_dtype=None): - """ - Check that the model must be called with the given input/output dtype - """ - if not isinstance(in_dtype, set): - in_dtype = {in_dtype} - if not isinstance(out_dtype, set): - out_dtype = {out_dtype} - - def flatten(x): - if isinstance(x, torch.Tensor): - return [x] - if isinstance(x, (list, tuple)): - return list(itertools.chain(*[flatten(t) for t in x])) - if isinstance(x, dict): - return flatten(list(x.values())) - return [] - - def hook(module, input, output): - if in_dtype is not None: - dtypes = {x.dtype for x in flatten(input)} - assert ( - dtypes == in_dtype - ), f"Expected input dtype of {type(module)} is {in_dtype}. Got {dtypes} instead!" - - if out_dtype is not None: - dtypes = {x.dtype for x in flatten(output)} - assert ( - dtypes == out_dtype - ), f"Expected output dtype of {type(module)} is {out_dtype}. Got {dtypes} instead!" 
- - with model.register_forward_hook(hook): - yield - - -def create_model_input(img, inst=None): - if inst is not None: - return {"image": img, "instances": inst} - else: - return {"image": img} - - -def get_empty_instance(h, w): - inst = Instances((h, w)) - inst.gt_boxes = Boxes(torch.rand(0, 4)) - inst.gt_classes = torch.tensor([]).to(dtype=torch.int64) - inst.gt_masks = BitMasks(torch.rand(0, h, w)) - return inst - - -def get_regular_bitmask_instances(h, w): - inst = Instances((h, w)) - inst.gt_boxes = Boxes(torch.rand(3, 4)) - inst.gt_boxes.tensor[:, 2:] += inst.gt_boxes.tensor[:, :2] - inst.gt_classes = torch.tensor([3, 4, 5]).to(dtype=torch.int64) - inst.gt_masks = BitMasks((torch.rand(3, h, w) > 0.5)) - return inst - - -class InstanceModelE2ETest: - def setUp(self): - torch.manual_seed(43) - self.model = get_model_no_weights(self.CONFIG_PATH) - - def _test_eval(self, input_sizes): - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - self.model.eval() - self.model(inputs) - - def _test_train(self, input_sizes, instances): - assert len(input_sizes) == len(instances) - inputs = [ - create_model_input(torch.rand(3, s[0], s[1]), inst) - for s, inst in zip(input_sizes, instances) - ] - self.model.train() - with EventStorage(): - losses = self.model(inputs) - sum(losses.values()).backward() - del losses - - def _inf_tensor(self, *shape): - return 1.0 / torch.zeros(*shape, device=self.model.device) - - def _nan_tensor(self, *shape): - return torch.zeros(*shape, device=self.model.device).fill_(float("nan")) - - def test_empty_data(self): - instances = [get_empty_instance(200, 250), get_empty_instance(200, 249)] - self._test_eval([(200, 250), (200, 249)]) - self._test_train([(200, 250), (200, 249)], instances) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA unavailable") - def test_eval_tocpu(self): - model = deepcopy(self.model).cpu() - model.eval() - input_sizes = [(200, 250), (200, 249)] - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - model(inputs) - - -class MaskRCNNE2ETest(InstanceModelE2ETest, unittest.TestCase): - CONFIG_PATH = "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - def test_half_empty_data(self): - instances = [get_empty_instance(200, 250), get_regular_bitmask_instances(200, 249)] - self._test_train([(200, 250), (200, 249)], instances) - - # This test is flaky because in some environment the output features are zero due to relu - # def test_rpn_inf_nan_data(self): - # self.model.eval() - # for tensor in [self._inf_tensor, self._nan_tensor]: - # images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - # features = { - # "p2": tensor(1, 256, 256, 256), - # "p3": tensor(1, 256, 128, 128), - # "p4": tensor(1, 256, 64, 64), - # "p5": tensor(1, 256, 32, 32), - # "p6": tensor(1, 256, 16, 16), - # } - # props, _ = self.model.proposal_generator(images, features) - # self.assertEqual(len(props[0]), 0) - - def test_roiheads_inf_nan_data(self): - self.model.eval() - for tensor in [self._inf_tensor, self._nan_tensor]: - images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - features = { - "p2": tensor(1, 256, 256, 256), - "p3": tensor(1, 256, 128, 128), - "p4": tensor(1, 256, 64, 64), - "p5": tensor(1, 256, 32, 32), - "p6": tensor(1, 256, 16, 16), - } - props = [Instances((510, 510))] - props[0].proposal_boxes = Boxes([[10, 10, 20, 20]]).to(device=self.model.device) - props[0].objectness_logits = torch.tensor([1.0]).reshape(1, 1) - det, _ = self.model.roi_heads(images, features, props) - 
self.assertEqual(len(det[0]), 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_autocast(self): - from torch.cuda.amp import autocast - - inputs = [{"image": torch.rand(3, 100, 100)}] - self.model.eval() - with autocast(), typecheck_hook( - self.model.backbone, in_dtype=torch.float32, out_dtype=torch.float16 - ), typecheck_hook( - self.model.roi_heads.box_predictor, in_dtype=torch.float16, out_dtype=torch.float16 - ): - out = self.model.inference(inputs, do_postprocess=False)[0] - self.assertEqual(out.pred_boxes.tensor.dtype, torch.float32) - self.assertEqual(out.pred_masks.dtype, torch.float16) - self.assertEqual(out.scores.dtype, torch.float32) # scores comes from softmax - - -class RetinaNetE2ETest(InstanceModelE2ETest, unittest.TestCase): - CONFIG_PATH = "COCO-Detection/retinanet_R_50_FPN_1x.yaml" - - def test_inf_nan_data(self): - self.model.eval() - self.model.score_threshold = -999999999 - for tensor in [self._inf_tensor, self._nan_tensor]: - images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - features = [ - tensor(1, 256, 128, 128), - tensor(1, 256, 64, 64), - tensor(1, 256, 32, 32), - tensor(1, 256, 16, 16), - tensor(1, 256, 8, 8), - ] - pred_logits, pred_anchor_deltas = self.model.head(features) - pred_logits = [tensor(*x.shape) for x in pred_logits] - pred_anchor_deltas = [tensor(*x.shape) for x in pred_anchor_deltas] - det = self.model.forward_inference(images, features, [pred_logits, pred_anchor_deltas]) - # all predictions (if any) are infinite or nan - if len(det[0]): - self.assertTrue(torch.isfinite(det[0].pred_boxes.tensor).sum() == 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_autocast(self): - from torch.cuda.amp import autocast - - inputs = [{"image": torch.rand(3, 100, 100)}] - self.model.eval() - with autocast(), typecheck_hook( - self.model.backbone, in_dtype=torch.float32, out_dtype=torch.float16 - ), typecheck_hook(self.model.head, in_dtype=torch.float16, out_dtype=torch.float16): - out = self.model(inputs)[0]["instances"] - self.assertEqual(out.pred_boxes.tensor.dtype, torch.float32) - self.assertEqual(out.scores.dtype, torch.float16) - - -class SemSegE2ETest(unittest.TestCase): - CONFIG_PATH = "Misc/semantic_R_50_FPN_1x.yaml" - - def setUp(self): - torch.manual_seed(43) - self.model = get_model_no_weights(self.CONFIG_PATH) - - def _test_eval(self, input_sizes): - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - self.model.eval() - self.model(inputs) - - def test_forward(self): - self._test_eval([(200, 250), (200, 249)]) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py deleted file mode 100755 index 6af160ef..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_heads.py +++ /dev/null @@ -1,323 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
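The autocast assertions in the end-to-end tests above rest on PyTorch's per-op autocast policy: matmul and convolution ops run in float16 under CUDA autocast, while numerically sensitive ops such as softmax are kept in float32, which is why `scores` (computed via softmax) stays float32 while masks come out float16. A small self-contained illustration, assuming a CUDA device:

```python
import torch
from torch.cuda.amp import autocast

if torch.cuda.is_available():
    with autocast():
        x = torch.rand(4, 8, device="cuda")
        w = torch.rand(8, 8, device="cuda")
        y = x @ w              # matmul autocasts to float16
        p = y.softmax(dim=-1)  # softmax runs in float32 for numerical stability
    print(y.dtype, p.dtype)    # torch.float16 torch.float32
```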
-import logging -import unittest -from copy import deepcopy -import torch -from torch import nn - -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.export.torchscript_patch import ( - freeze_training_mode, - patch_builtin_len, - patch_instances, -) -from detectron2.layers import ShapeSpec -from detectron2.modeling.proposal_generator.build import build_proposal_generator -from detectron2.modeling.roi_heads import ( - FastRCNNConvFCHead, - KRCNNConvDeconvUpsampleHead, - MaskRCNNConvUpsampleHead, - StandardROIHeads, - build_roi_heads, -) -from detectron2.projects import point_rend -from detectron2.structures import BitMasks, Boxes, ImageList, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage -from detectron2.utils.testing import assert_instances_allclose, random_boxes - -logger = logging.getLogger(__name__) - -""" -Make sure the losses of ROIHeads/RPN do not change, to avoid -breaking the forward logic by mistake. -This relies on the assumption that pytorch's RNG is stable. -""" - - -class ROIHeadsTest(unittest.TestCase): - def test_roi_heads(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5) - cfg.MODEL.MASK_ON = True - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - image_shape = (15, 15) - gt_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - gt_instance0 = Instances(image_shape) - gt_instance0.gt_boxes = Boxes(gt_boxes0) - gt_instance0.gt_classes = torch.tensor([2, 1]) - gt_instance0.gt_masks = BitMasks(torch.rand((2,) + image_shape) > 0.5) - gt_boxes1 = torch.tensor([[1, 5, 2, 8], [7, 3, 10, 5]], dtype=torch.float32) - gt_instance1 = Instances(image_shape) - gt_instance1.gt_boxes = Boxes(gt_boxes1) - gt_instance1.gt_classes = torch.tensor([1, 2]) - gt_instance1.gt_masks = BitMasks(torch.rand((2,) + image_shape) > 0.5) - gt_instances = [gt_instance0, gt_instance1] - - proposal_generator = build_proposal_generator(cfg, feature_shape) - roi_heads = StandardROIHeads(cfg, feature_shape) - - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator(images, features, gt_instances) - _, detector_losses = roi_heads(images, features, proposals, gt_instances) - - detector_losses.update(proposal_losses) - expected_losses = { - "loss_cls": 4.5253729820251465, - "loss_box_reg": 0.009785720147192478, - "loss_mask": 0.693184494972229, - "loss_rpn_cls": 0.08186662942171097, - "loss_rpn_loc": 0.1104838103055954, - } - succ = all( - torch.allclose(detector_losses[name], torch.tensor(expected_losses.get(name, 0.0))) - for name in detector_losses.keys() - ) - self.assertTrue( - succ, - "Losses have changed! 
New losses: {}".format( - {k: v.item() for k, v in detector_losses.items()} - ), - ) - - def test_rroi_heads(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.PROPOSAL_GENERATOR.NAME = "RRPN" - cfg.MODEL.ANCHOR_GENERATOR.NAME = "RotatedAnchorGenerator" - cfg.MODEL.ROI_HEADS.NAME = "RROIHeads" - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.RPN.BBOX_REG_WEIGHTS = (1, 1, 1, 1, 1) - cfg.MODEL.RPN.HEAD_NAME = "StandardRPNHead" - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignRotated" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5, 1) - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - image_shape = (15, 15) - gt_boxes0 = torch.tensor([[2, 2, 2, 2, 30], [4, 4, 4, 4, 0]], dtype=torch.float32) - gt_instance0 = Instances(image_shape) - gt_instance0.gt_boxes = RotatedBoxes(gt_boxes0) - gt_instance0.gt_classes = torch.tensor([2, 1]) - gt_boxes1 = torch.tensor([[1.5, 5.5, 1, 3, 0], [8.5, 4, 3, 2, -50]], dtype=torch.float32) - gt_instance1 = Instances(image_shape) - gt_instance1.gt_boxes = RotatedBoxes(gt_boxes1) - gt_instance1.gt_classes = torch.tensor([1, 2]) - gt_instances = [gt_instance0, gt_instance1] - - proposal_generator = build_proposal_generator(cfg, feature_shape) - roi_heads = build_roi_heads(cfg, feature_shape) - - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator(images, features, gt_instances) - _, detector_losses = roi_heads(images, features, proposals, gt_instances) - - detector_losses.update(proposal_losses) - expected_losses = { - "loss_cls": 4.365657806396484, - "loss_box_reg": 0.0015851043863222003, - "loss_rpn_cls": 0.2427729219198227, - "loss_rpn_loc": 0.3646621108055115, - } - succ = all( - torch.allclose(detector_losses[name], torch.tensor(expected_losses.get(name, 0.0))) - for name in detector_losses.keys() - ) - self.assertTrue( - succ, - "Losses has changed! 
New losses: {}".format( - {k: v.item() for k, v in detector_losses.items()} - ), - ) - - def test_box_head_scriptability(self): - input_shape = ShapeSpec(channels=1024, height=14, width=14) - box_features = torch.randn(4, 1024, 14, 14) - - box_head = FastRCNNConvFCHead( - input_shape, conv_dims=[512, 512], fc_dims=[1024, 1024] - ).eval() - script_box_head = torch.jit.script(box_head) - - origin_output = box_head(box_features) - script_output = script_box_head(box_features) - self.assertTrue(torch.equal(origin_output, script_output)) - - def test_mask_head_scriptability(self): - input_shape = ShapeSpec(channels=1024) - mask_features = torch.randn(4, 1024, 14, 14) - - image_shapes = [(10, 10), (15, 15)] - pred_instance0 = Instances(image_shapes[0]) - pred_classes0 = torch.tensor([1, 2, 3], dtype=torch.int64) - pred_instance0.pred_classes = pred_classes0 - pred_instance1 = Instances(image_shapes[1]) - pred_classes1 = torch.tensor([4], dtype=torch.int64) - pred_instance1.pred_classes = pred_classes1 - - mask_head = MaskRCNNConvUpsampleHead( - input_shape, num_classes=80, conv_dims=[256, 256] - ).eval() - # pred_instance will be in-place changed during the inference - # process of `MaskRCNNConvUpsampleHead` - origin_outputs = mask_head(mask_features, deepcopy([pred_instance0, pred_instance1])) - - fields = {"pred_masks": torch.Tensor, "pred_classes": torch.Tensor} - with freeze_training_mode(mask_head), patch_instances(fields) as NewInstances: - sciript_mask_head = torch.jit.script(mask_head) - pred_instance0 = NewInstances.from_instances(pred_instance0) - pred_instance1 = NewInstances.from_instances(pred_instance1) - script_outputs = sciript_mask_head(mask_features, [pred_instance0, pred_instance1]) - - for origin_ins, script_ins in zip(origin_outputs, script_outputs): - assert_instances_allclose(origin_ins, script_ins, rtol=0) - - def test_keypoint_head_scriptability(self): - input_shape = ShapeSpec(channels=1024, height=14, width=14) - keypoint_features = torch.randn(4, 1024, 14, 14) - - image_shapes = [(10, 10), (15, 15)] - pred_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6], [1, 5, 2, 8]], dtype=torch.float32) - pred_instance0 = Instances(image_shapes[0]) - pred_instance0.pred_boxes = Boxes(pred_boxes0) - pred_boxes1 = torch.tensor([[7, 3, 10, 5]], dtype=torch.float32) - pred_instance1 = Instances(image_shapes[1]) - pred_instance1.pred_boxes = Boxes(pred_boxes1) - - keypoint_head = KRCNNConvDeconvUpsampleHead( - input_shape, num_keypoints=17, conv_dims=[512, 512] - ).eval() - origin_outputs = keypoint_head( - keypoint_features, deepcopy([pred_instance0, pred_instance1]) - ) - - fields = { - "pred_boxes": Boxes, - "pred_keypoints": torch.Tensor, - "pred_keypoint_heatmaps": torch.Tensor, - } - with freeze_training_mode(keypoint_head), patch_instances(fields) as NewInstances: - sciript_keypoint_head = torch.jit.script(keypoint_head) - pred_instance0 = NewInstances.from_instances(pred_instance0) - pred_instance1 = NewInstances.from_instances(pred_instance1) - script_outputs = sciript_keypoint_head( - keypoint_features, [pred_instance0, pred_instance1] - ) - - for origin_ins, script_ins in zip(origin_outputs, script_outputs): - assert_instances_allclose(origin_ins, script_ins, rtol=0) - - def test_StandardROIHeads_scriptability(self): - cfg = get_cfg() - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5) - cfg.MODEL.MASK_ON = True - 
cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST = 0.01 - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.01 - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - roi_heads = StandardROIHeads(cfg, feature_shape).eval() - - proposal0 = Instances(image_sizes[0]) - proposal_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - proposal0.proposal_boxes = Boxes(proposal_boxes0) - proposal0.objectness_logits = torch.tensor([0.5, 0.7], dtype=torch.float32) - - proposal1 = Instances(image_sizes[1]) - proposal_boxes1 = torch.tensor([[1, 5, 2, 8], [7, 3, 10, 5]], dtype=torch.float32) - proposal1.proposal_boxes = Boxes(proposal_boxes1) - proposal1.objectness_logits = torch.tensor([0.1, 0.9], dtype=torch.float32) - proposals = [proposal0, proposal1] - - pred_instances, _ = roi_heads(images, features, proposals) - fields = { - "objectness_logits": torch.Tensor, - "proposal_boxes": Boxes, - "pred_classes": torch.Tensor, - "scores": torch.Tensor, - "pred_masks": torch.Tensor, - "pred_boxes": Boxes, - "pred_keypoints": torch.Tensor, - "pred_keypoint_heatmaps": torch.Tensor, - } - with freeze_training_mode(roi_heads), patch_instances(fields) as new_instances: - proposal0 = new_instances.from_instances(proposal0) - proposal1 = new_instances.from_instances(proposal1) - proposals = [proposal0, proposal1] - scripted_rot_heads = torch.jit.script(roi_heads) - scripted_pred_instances, _ = scripted_rot_heads(images, features, proposals) - - for instance, scripted_instance in zip(pred_instances, scripted_pred_instances): - assert_instances_allclose(instance, scripted_instance, rtol=0) - - def test_PointRend_mask_head_tracing(self): - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - point_rend.add_pointrend_config(cfg) - cfg.MODEL.ROI_HEADS.IN_FEATURES = ["p2", "p3"] - cfg.MODEL.ROI_MASK_HEAD.NAME = "PointRendMaskHead" - cfg.MODEL.ROI_MASK_HEAD.POOLER_TYPE = "" - cfg.MODEL.ROI_MASK_HEAD.POINT_HEAD_ON = True - chan = 256 - head = point_rend.PointRendMaskHead( - cfg, - { - "p2": ShapeSpec(channels=chan, stride=4), - "p3": ShapeSpec(channels=chan, stride=8), - }, - ) - - def gen_inputs(h, w, N): - p2 = torch.rand(1, chan, h, w) - p3 = torch.rand(1, chan, h // 2, w // 2) - boxes = random_boxes(N, max_coord=h) - return p2, p3, boxes - - class Wrap(nn.ModuleDict): - def forward(self, p2, p3, boxes): - features = { - "p2": p2, - "p3": p3, - } - inst = Instances((p2.shape[2] * 4, p2.shape[3] * 4)) - inst.pred_boxes = Boxes(boxes) - inst.pred_classes = torch.zeros(inst.__len__(), dtype=torch.long) - out = self.head(features, [inst])[0] - return out.pred_masks - - model = Wrap({"head": head}) - model.eval() - with torch.no_grad(), patch_builtin_len(): - traced = torch.jit.trace(model, gen_inputs(302, 208, 20)) - inputs = gen_inputs(100, 120, 30) - out_eager = model(*inputs) - out_trace = traced(*inputs) - self.assertTrue(torch.allclose(out_eager, out_trace)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py deleted file mode 100755 index b93b7ae6..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_roi_pooler.py +++ /dev/null @@ -1,165 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.modeling.poolers import ROIPooler -from detectron2.structures import Boxes, RotatedBoxes -from detectron2.utils.testing import random_boxes - -logger = logging.getLogger(__name__) - - -class TestROIPooler(unittest.TestCase): - def _test_roialignv2_roialignrotated_match(self, device): - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor,) - sampling_ratio = 0 - - N, C, H, W = 2, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - - features = [feature.to(device)] - - rois = [] - rois_rotated = [] - for _ in range(N): - boxes = random_boxes(N_rois, W * canonical_scale_factor) - rotated_boxes = torch.zeros(N_rois, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - rois.append(Boxes(boxes).to(device)) - rois_rotated.append(RotatedBoxes(rotated_boxes).to(device)) - - roialignv2_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignV2", - ) - - roialignv2_out = roialignv2_pooler(features, rois) - - roialignrotated_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignRotated", - ) - - roialignrotated_out = roialignrotated_pooler(features, rois_rotated) - - self.assertTrue(torch.allclose(roialignv2_out, roialignrotated_out, atol=1e-4)) - - def test_roialignv2_roialignrotated_match_cpu(self): - self._test_roialignv2_roialignrotated_match(device="cpu") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_roialignv2_roialignrotated_match_cuda(self): - self._test_roialignv2_roialignrotated_match(device="cuda") - - def _test_scriptability(self, device): - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor,) - sampling_ratio = 0 - - N, C, H, W = 2, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - - features = [feature.to(device)] - - rois = [] - for _ in range(N): - boxes = random_boxes(N_rois, W * canonical_scale_factor) - - rois.append(Boxes(boxes).to(device)) - - roialignv2_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignV2", - ) - - roialignv2_out = roialignv2_pooler(features, rois) - scripted_roialignv2_out = torch.jit.script(roialignv2_pooler)(features, rois) - self.assertTrue(torch.equal(roialignv2_out, scripted_roialignv2_out)) - - def test_scriptability_cpu(self): - self._test_scriptability(device="cpu") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_scriptability_gpu(self): - self._test_scriptability(device="cuda") - - def test_no_images(self): - N, C, H, W = 0, 32, 32, 32 - feature = torch.rand(N, C, H, W) - 0.5 - features = [feature] - pooler = ROIPooler( - output_size=14, scales=(1.0,), sampling_ratio=0.0, pooler_type="ROIAlignV2" - ) - output = pooler.forward(features, []) - 
self.assertEqual(output.shape, (0, C, 14, 14)) - - def test_roi_pooler_tracing(self): - class Model(torch.nn.Module): - def __init__(self, roi): - super(Model, self).__init__() - self.roi = roi - - def forward(self, x, boxes): - return self.roi(x, [Boxes(boxes)]) - - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor, 0.5 / canonical_scale_factor) - sampling_ratio = 0 - - N, C, H, W = 1, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - feature = [feature, feature] - - rois = random_boxes(N_rois, W * canonical_scale_factor) - # Add one larger box so that this level has only one box. - # This may trigger the bug https://github.com/pytorch/pytorch/issues/49852 - # that we shall workaround. - rois = torch.cat([rois, torch.tensor([[0, 0, 448, 448]])]) - - model = Model( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlign", - ) - ) - - with torch.no_grad(): - func = torch.jit.trace(model, (feature, rois)) - o = func(feature, rois) - self.assertEqual(o.shape, (11, 4, 14, 14)) - o = func(feature, rois[:5]) - self.assertEqual(o.shape, (5, 4, 14, 14)) - o = func(feature, random_boxes(20, W * canonical_scale_factor)) - self.assertEqual(o.shape, (20, 4, 14, 14)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py deleted file mode 100755 index f14faae5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/modeling/test_rpn.py +++ /dev/null @@ -1,262 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
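The file removed below covered detectron2's RPN and RRPN proposal generators. As a reference for what those tests exercised, here is a minimal sketch of the proposal-generator API, assuming detectron2 is installed; the tensor shapes and ground-truth boxes mirror the deleted `get_gt_and_features` helper rather than real data.

```python
# Minimal sketch of the RPN API exercised by the deleted test_rpn.py.
# Assumes detectron2 is installed; shapes mirror the deleted helper.
import torch

from detectron2.config import get_cfg
from detectron2.modeling.backbone import build_backbone
from detectron2.modeling.proposal_generator import RPN
from detectron2.structures import Boxes, ImageList, Instances
from detectron2.utils.events import EventStorage

cfg = get_cfg()
backbone = build_backbone(cfg)             # default ResNet backbone
rpn = RPN(cfg, backbone.output_shape())    # head sized from the "res4" feature map

images = ImageList(torch.rand(2, 20, 30), [(10, 10), (20, 30)])
features = {"res4": torch.rand(2, 1024, 1, 2)}
gt = Instances((15, 15))
gt.gt_boxes = Boxes(torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32))

with EventStorage():  # RPN logs training metrics, so a storage must be active
    proposals, losses = rpn(images, features, [gt, gt])
print(sorted(losses))  # ['loss_rpn_cls', 'loss_rpn_loc']
```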
-import logging -import unittest -import torch - -from detectron2.config import get_cfg -from detectron2.export import scripting_with_instances -from detectron2.layers import ShapeSpec -from detectron2.modeling.backbone import build_backbone -from detectron2.modeling.proposal_generator import RPN, build_proposal_generator -from detectron2.modeling.proposal_generator.proposal_utils import ( - add_ground_truth_to_proposals, - find_top_rpn_proposals, -) -from detectron2.structures import Boxes, ImageList, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage - -logger = logging.getLogger(__name__) - - -class RPNTest(unittest.TestCase): - def get_gt_and_features(self): - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - image_shape = (15, 15) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - gt_boxes = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - gt_instances = Instances(image_shape) - gt_instances.gt_boxes = Boxes(gt_boxes) - return (gt_instances, features, images, image_sizes) - - def test_rpn(self): - torch.manual_seed(121) - cfg = get_cfg() - backbone = build_backbone(cfg) - proposal_generator = RPN(cfg, backbone.output_shape()) - (gt_instances, features, images, image_sizes) = self.get_gt_and_features() - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - - expected_losses = { - "loss_rpn_cls": torch.tensor(0.08011703193), - "loss_rpn_loc": torch.tensor(0.101470276), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - ) - self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - self.assertEqual(len(proposals), len(image_sizes)) - for proposal, im_size in zip(proposals, image_sizes): - self.assertEqual(proposal.image_size, im_size) - - expected_proposal_box = torch.tensor([[0, 0, 10, 10], [7.2702, 0, 10, 10]]) - expected_objectness_logit = torch.tensor([0.1596, -0.0007]) - self.assertTrue( - torch.allclose(proposals[0].proposal_boxes.tensor, expected_proposal_box, atol=1e-4) - ) - self.assertTrue( - torch.allclose(proposals[0].objectness_logits, expected_objectness_logit, atol=1e-4) - ) - - def verify_rpn(self, conv_dims, expected_conv_dims): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.RPN.CONV_DIMS = conv_dims - backbone = build_backbone(cfg) - proposal_generator = RPN(cfg, backbone.output_shape()) - for k, conv in enumerate(proposal_generator.rpn_head.conv): - self.assertEqual(expected_conv_dims[k], conv.out_channels) - return proposal_generator - - def test_rpn_larger_num_convs(self): - conv_dims = [64, 64, 64, 64, 64] - proposal_generator = self.verify_rpn(conv_dims, conv_dims) - (gt_instances, features, images, image_sizes) = self.get_gt_and_features() - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - expected_losses = { - "loss_rpn_cls": torch.tensor(0.08122821152), - "loss_rpn_loc": torch.tensor(0.10064548254), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - ) - 
self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - def test_rpn_conv_dims_not_set(self): - conv_dims = [-1, -1, -1] - expected_conv_dims = [1024, 1024, 1024] - self.verify_rpn(conv_dims, expected_conv_dims) - - def test_rpn_scriptability(self): - cfg = get_cfg() - proposal_generator = RPN(cfg, {"res4": ShapeSpec(channels=1024, stride=16)}).eval() - num_images = 2 - images_tensor = torch.rand(num_images, 30, 40) - image_sizes = [(32, 32), (30, 40)] - images = ImageList(images_tensor, image_sizes) - features = {"res4": torch.rand(num_images, 1024, 1, 2)} - - fields = {"proposal_boxes": Boxes, "objectness_logits": torch.Tensor} - proposal_generator_ts = scripting_with_instances(proposal_generator, fields) - - proposals, _ = proposal_generator(images, features) - proposals_ts, _ = proposal_generator_ts(images, features) - - for proposal, proposal_ts in zip(proposals, proposals_ts): - self.assertEqual(proposal.image_size, proposal_ts.image_size) - self.assertTrue( - torch.equal(proposal.proposal_boxes.tensor, proposal_ts.proposal_boxes.tensor) - ) - self.assertTrue(torch.equal(proposal.objectness_logits, proposal_ts.objectness_logits)) - - def test_rrpn(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.PROPOSAL_GENERATOR.NAME = "RRPN" - cfg.MODEL.ANCHOR_GENERATOR.NAME = "RotatedAnchorGenerator" - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1]] - cfg.MODEL.ANCHOR_GENERATOR.ANGLES = [[0, 60]] - cfg.MODEL.RPN.BBOX_REG_WEIGHTS = (1, 1, 1, 1, 1) - cfg.MODEL.RPN.HEAD_NAME = "StandardRPNHead" - backbone = build_backbone(cfg) - proposal_generator = build_proposal_generator(cfg, backbone.output_shape()) - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - image_shape = (15, 15) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - gt_boxes = torch.tensor([[2, 2, 2, 2, 0], [4, 4, 4, 4, 0]], dtype=torch.float32) - gt_instances = Instances(image_shape) - gt_instances.gt_boxes = RotatedBoxes(gt_boxes) - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - - expected_losses = { - "loss_rpn_cls": torch.tensor(0.04291602224), - "loss_rpn_loc": torch.tensor(0.145077362), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - ) - self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - expected_proposal_box = torch.tensor( - [ - [-1.77999556, 0.78155339, 68.04367828, 14.78156471, 60.59333801], - [13.82740974, -1.50282836, 34.67269897, 29.19676590, -3.81942749], - [8.10392570, -0.99071521, 145.39100647, 32.13126373, 3.67242432], - [5.00000000, 4.57370186, 10.00000000, 9.14740372, 0.89196777], - ] - ) - - expected_objectness_logit = torch.tensor([0.10924313, 0.09881870, 0.07649877, 0.05858029]) - - torch.set_printoptions(precision=8, sci_mode=False) - - self.assertEqual(len(proposals), len(image_sizes)) - - proposal = proposals[0] - # It seems that there's some randomness in the result across different machines: - # This test can be run on a local machine for 100 times with exactly the same result, - # However, a different machine might produce slightly different results, - # thus the atol here. 
- err_msg = "computed proposal boxes = {}, expected {}".format( - proposal.proposal_boxes.tensor, expected_proposal_box - ) - self.assertTrue( - torch.allclose(proposal.proposal_boxes.tensor[:4], expected_proposal_box, atol=1e-5), - err_msg, - ) - - err_msg = "computed objectness logits = {}, expected {}".format( - proposal.objectness_logits, expected_objectness_logit - ) - self.assertTrue( - torch.allclose(proposal.objectness_logits[:4], expected_objectness_logit, atol=1e-5), - err_msg, - ) - - def test_find_rpn_proposals_inf(self): - N, Hi, Wi, A = 3, 3, 3, 3 - proposals = [torch.rand(N, Hi * Wi * A, 4)] - pred_logits = [torch.rand(N, Hi * Wi * A)] - pred_logits[0][1][3:5].fill_(float("inf")) - find_top_rpn_proposals(proposals, pred_logits, [(10, 10)], 0.5, 1000, 1000, 0, False) - - def test_find_rpn_proposals_tracing(self): - N, Hi, Wi, A = 3, 50, 50, 9 - proposal = torch.rand(N, Hi * Wi * A, 4) - pred_logit = torch.rand(N, Hi * Wi * A) - - def func(proposal, logit, image_size): - r = find_top_rpn_proposals( - [proposal], [logit], [image_size], 0.7, 1000, 1000, 0, False - )[0] - size = r.image_size - if not isinstance(size, torch.Tensor): - size = torch.tensor(size) - return (size, r.proposal_boxes.tensor, r.objectness_logits) - - other_inputs = [] - # test that it generalizes to other shapes - for Hi, Wi, shp in [(30, 30, 60), (10, 10, 800)]: - other_inputs.append( - ( - torch.rand(N, Hi * Wi * A, 4), - torch.rand(N, Hi * Wi * A), - torch.tensor([shp, shp]), - ) - ) - torch.jit.trace( - func, (proposal, pred_logit, torch.tensor([100, 100])), check_inputs=other_inputs - ) - - def test_append_gt_to_proposal(self): - proposals = Instances( - (10, 10), - **{ - "proposal_boxes": Boxes(torch.empty((0, 4))), - "objectness_logits": torch.tensor([]), - "custom_attribute": torch.tensor([]), - } - ) - gt_boxes = Boxes(torch.tensor([[0, 0, 1, 1]])) - - self.assertRaises(AssertionError, add_ground_truth_to_proposals, [gt_boxes], [proposals]) - - gt_instances = Instances((10, 10)) - gt_instances.gt_boxes = gt_boxes - - self.assertRaises( - AssertionError, add_ground_truth_to_proposals, [gt_instances], [proposals] - ) - - gt_instances.custom_attribute = torch.tensor([1]) - gt_instances.custom_attribute2 = torch.tensor([1]) - new_proposals = add_ground_truth_to_proposals([gt_instances], [proposals])[0] - - self.assertEqual(new_proposals.custom_attribute[0], 1) - # new proposals should only include the attributes in proposals - self.assertRaises(AttributeError, lambda: new_proposals.custom_attribute2) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py deleted file mode 100755 index 10119181..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_boxes.py +++ /dev/null @@ -1,223 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import json -import math -import numpy as np -import unittest -import torch - -from detectron2.structures import Boxes, BoxMode, pairwise_ioa, pairwise_iou -from detectron2.utils.testing import reload_script_model - - -class TestBoxMode(unittest.TestCase): - def _convert_xy_to_wh(self, x): - return BoxMode.convert(x, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - - def _convert_xywha_to_xyxy(self, x): - return BoxMode.convert(x, BoxMode.XYWHA_ABS, BoxMode.XYXY_ABS) - - def _convert_xywh_to_xywha(self, x): - return BoxMode.convert(x, BoxMode.XYWH_ABS, BoxMode.XYWHA_ABS) - - def test_convert_int_mode(self): - BoxMode.convert([1, 2, 3, 4], 0, 1) - - def test_box_convert_list(self): - for tp in [list, tuple]: - box = tp([5.0, 5.0, 10.0, 10.0]) - output = self._convert_xy_to_wh(box) - self.assertIsInstance(output, tp) - self.assertIsInstance(output[0], float) - self.assertEqual(output, tp([5.0, 5.0, 5.0, 5.0])) - - with self.assertRaises(Exception): - self._convert_xy_to_wh([box]) - - def test_box_convert_array(self): - box = np.asarray([[5, 5, 10, 10], [1, 1, 2, 3]]) - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - def test_box_convert_cpu_tensor(self): - box = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - output = output.numpy() - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_box_convert_cuda_tensor(self): - box = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]).cuda() - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - self.assertEqual(output.device, box.device) - output = output.cpu().numpy() - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - def test_box_convert_xywha_to_xyxy_list(self): - for tp in [list, tuple]: - box = tp([50, 50, 30, 20, 0]) - output = self._convert_xywha_to_xyxy(box) - self.assertIsInstance(output, tp) - self.assertEqual(output, tp([35, 40, 65, 60])) - - with self.assertRaises(Exception): - self._convert_xywha_to_xyxy([box]) - - def test_box_convert_xywha_to_xyxy_array(self): - for dtype in [np.float64, np.float32]: - box = np.asarray( - [ - [50, 50, 30, 20, 0], - [50, 50, 30, 20, 90], - [1, 1, math.sqrt(2), math.sqrt(2), -45], - ], - dtype=dtype, - ) - output = self._convert_xywha_to_xyxy(box) - self.assertEqual(output.dtype, box.dtype) - expected = np.asarray([[35, 40, 65, 60], [40, 35, 60, 65], [0, 0, 2, 2]], dtype=dtype) - self.assertTrue(np.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywha_to_xyxy_tensor(self): - for dtype in [torch.float32, torch.float64]: - box = torch.tensor( - [ - [50, 50, 30, 20, 0], - [50, 50, 30, 20, 90], - [1, 1, math.sqrt(2), math.sqrt(2), -45], - ], - dtype=dtype, - ) - output = self._convert_xywha_to_xyxy(box) - self.assertEqual(output.dtype, box.dtype) - expected = torch.tensor([[35, 40, 65, 60], [40, 35, 60, 65], [0, 0, 2, 2]], dtype=dtype) - - self.assertTrue(torch.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywh_to_xywha_list(self): - for tp in [list, tuple]: - box 
= tp([50, 50, 30, 20]) - output = self._convert_xywh_to_xywha(box) - self.assertIsInstance(output, tp) - self.assertEqual(output, tp([65, 60, 30, 20, 0])) - - with self.assertRaises(Exception): - self._convert_xywh_to_xywha([box]) - - def test_box_convert_xywh_to_xywha_array(self): - for dtype in [np.float64, np.float32]: - box = np.asarray([[30, 40, 70, 60], [30, 40, 60, 70], [-1, -1, 2, 2]], dtype=dtype) - output = self._convert_xywh_to_xywha(box) - self.assertEqual(output.dtype, box.dtype) - expected = np.asarray( - [[65, 70, 70, 60, 0], [60, 75, 60, 70, 0], [0, 0, 2, 2, 0]], dtype=dtype - ) - self.assertTrue(np.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywh_to_xywha_tensor(self): - for dtype in [torch.float32, torch.float64]: - box = torch.tensor([[30, 40, 70, 60], [30, 40, 60, 70], [-1, -1, 2, 2]], dtype=dtype) - output = self._convert_xywh_to_xywha(box) - self.assertEqual(output.dtype, box.dtype) - expected = torch.tensor( - [[65, 70, 70, 60, 0], [60, 75, 60, 70, 0], [0, 0, 2, 2, 0]], dtype=dtype - ) - - self.assertTrue(torch.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_json_serializable(self): - payload = {"box_mode": BoxMode.XYWH_REL} - try: - json.dumps(payload) - except Exception: - self.fail("JSON serialization failed") - - def test_json_deserializable(self): - payload = '{"box_mode": 2}' - obj = json.loads(payload) - try: - obj["box_mode"] = BoxMode(obj["box_mode"]) - except Exception: - self.fail("JSON deserialization failed") - - -class TestBoxIOU(unittest.TestCase): - def create_boxes(self): - boxes1 = torch.tensor([[0.0, 0.0, 1.0, 1.0], [0.0, 0.0, 1.0, 1.0]]) - - boxes2 = torch.tensor( - [ - [0.0, 0.0, 1.0, 1.0], - [0.0, 0.0, 0.5, 1.0], - [0.0, 0.0, 1.0, 0.5], - [0.0, 0.0, 0.5, 0.5], - [0.5, 0.5, 1.0, 1.0], - [0.5, 0.5, 1.5, 1.5], - ] - ) - return boxes1, boxes2 - - def test_pairwise_iou(self): - boxes1, boxes2 = self.create_boxes() - expected_ious = torch.tensor( - [ - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - ] - ) - - ious = pairwise_iou(Boxes(boxes1), Boxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_ioa(self): - boxes1, boxes2 = self.create_boxes() - expected_ioas = torch.tensor( - [[1.0, 1.0, 1.0, 1.0, 1.0, 0.25], [1.0, 1.0, 1.0, 1.0, 1.0, 0.25]] - ) - ioas = pairwise_ioa(Boxes(boxes1), Boxes(boxes2)) - self.assertTrue(torch.allclose(ioas, expected_ioas)) - - -class TestBoxes(unittest.TestCase): - def test_empty_cat(self): - x = Boxes.cat([]) - self.assertTrue(x.tensor.shape, (0, 4)) - - def test_to(self): - x = Boxes(torch.rand(3, 4)) - self.assertEqual(x.to(device="cpu").tensor.device.type, "cpu") - - def test_scriptability(self): - def func(x): - boxes = Boxes(x) - test = boxes.to(torch.device("cpu")).tensor - return boxes.area(), test - - f = torch.jit.script(func) - f = reload_script_model(f) - f(torch.rand((3, 4))) - - data = torch.rand((3, 4)) - - def func_cat(x: torch.Tensor): - boxes1 = Boxes(x) - boxes2 = Boxes(x) - # boxes3 = Boxes.cat([boxes1, boxes2]) # this is not supported by torchsript for now. 
- boxes3 = boxes1.cat([boxes1, boxes2]) - return boxes3 - - f = torch.jit.script(func_cat) - script_box = f(data) - self.assertTrue(torch.equal(torch.cat([data, data]), script_box.tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py deleted file mode 100755 index e446e44a..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_imagelist.py +++ /dev/null @@ -1,75 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import unittest -from typing import List, Sequence, Tuple -import torch - -from detectron2.structures import ImageList - - -class TestImageList(unittest.TestCase): - def test_imagelist_padding_tracing(self): - # test that the trace does not contain hard-coded constant sizes - def to_imagelist(tensors: Sequence[torch.Tensor]): - image_list = ImageList.from_tensors(tensors, 4) - return image_list.tensor, image_list.image_sizes - - def _tensor(*shape): - return torch.ones(shape, dtype=torch.float32) - - # test CHW (inputs needs padding vs. no padding) - for shape in [(3, 10, 10), (3, 12, 12)]: - func = torch.jit.trace(to_imagelist, ([_tensor(*shape)],)) - tensor, image_sizes = func([_tensor(3, 15, 20)]) - self.assertEqual(tensor.shape, (1, 3, 16, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [15, 20], image_sizes[0]) - - # test HW - func = torch.jit.trace(to_imagelist, ([_tensor(10, 10)],)) - tensor, image_sizes = func([_tensor(15, 20)]) - self.assertEqual(tensor.shape, (1, 16, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [15, 20], image_sizes[0]) - - # test 2x CHW - func = torch.jit.trace( - to_imagelist, - ([_tensor(3, 16, 10), _tensor(3, 13, 11)],), - ) - tensor, image_sizes = func([_tensor(3, 25, 20), _tensor(3, 10, 10)]) - self.assertEqual(tensor.shape, (2, 3, 28, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [25, 20], image_sizes[0]) - self.assertEqual(image_sizes[1].tolist(), [10, 10], image_sizes[1]) - # support calling with different spatial sizes, but not with different #images - - def test_imagelist_scriptability(self): - image_nums = 2 - image_tensor = torch.randn((image_nums, 10, 20), dtype=torch.float32) - image_shape = [(10, 20)] * image_nums - - def f(image_tensor, image_shape: List[Tuple[int, int]]): - return ImageList(image_tensor, image_shape) - - ret = f(image_tensor, image_shape) - ret_script = torch.jit.script(f)(image_tensor, image_shape) - - self.assertEqual(len(ret), len(ret_script)) - for i in range(image_nums): - self.assertTrue(torch.equal(ret[i], ret_script[i])) - - def test_imagelist_from_tensors_scriptability(self): - image_tensor_0 = torch.randn(10, 20, dtype=torch.float32) - image_tensor_1 = torch.randn(12, 22, dtype=torch.float32) - inputs = [image_tensor_0, image_tensor_1] - - def f(image_tensor: List[torch.Tensor]): - return ImageList.from_tensors(image_tensor, 10) - - ret = f(inputs) - ret_script = torch.jit.script(f)(inputs) - - self.assertEqual(len(ret), len(ret_script)) - self.assertTrue(torch.equal(ret.tensor, ret_script.tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_instances.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_instances.py deleted file mode 100755 index a352f743..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_instances.py +++ /dev/null @@ -1,219 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch -from torch import Tensor - -from detectron2.export.torchscript import patch_instances -from detectron2.structures import Boxes, Instances -from detectron2.utils.testing import convert_scripted_instances - - -class TestInstances(unittest.TestCase): - def test_int_indexing(self): - attr1 = torch.tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 0.5], [0.0, 0.0, 1.0], [0.0, 0.5, 0.5]]) - attr2 = torch.tensor([0.1, 0.2, 0.3, 0.4]) - instances = Instances((100, 100)) - instances.attr1 = attr1 - instances.attr2 = attr2 - for i in range(-len(instances), len(instances)): - inst = instances[i] - self.assertEqual((inst.attr1 == attr1[i]).all(), True) - self.assertEqual((inst.attr2 == attr2[i]).all(), True) - - self.assertRaises(IndexError, lambda: instances[len(instances)]) - self.assertRaises(IndexError, lambda: instances[-len(instances) - 1]) - - def test_script_new_fields(self): - def get_mask(x: Instances) -> torch.Tensor: - return x.mask - - class f(torch.nn.Module): - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes # noqa F841 - objectness_logits = x.objectness_logits # noqa F841 - return x - - class g(torch.nn.Module): - def forward(self, x: Instances): - return get_mask(x) - - class g2(torch.nn.Module): - def __init__(self): - super().__init__() - self.g = g() - - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes # noqa F841 - return x, self.g(x) - - fields = {"proposal_boxes": Boxes, "objectness_logits": Tensor} - with patch_instances(fields): - torch.jit.script(f()) - - # can't script anymore after exiting the context - with self.assertRaises(Exception): - # will create a ConcreteType for g - torch.jit.script(g2()) - - new_fields = {"mask": Tensor} - with patch_instances(new_fields): - # will compile g with a different Instances; this should pass - torch.jit.script(g()) - with self.assertRaises(Exception): - torch.jit.script(g2()) - - new_fields = {"mask": Tensor, "proposal_boxes": Boxes} - with patch_instances(new_fields) as NewInstances: - # get_mask will be compiled with a different Instances; this should pass - scripted_g2 = torch.jit.script(g2()) - x = NewInstances((3, 4)) - x.mask = torch.rand(3) - x.proposal_boxes = Boxes(torch.rand(3, 4)) - scripted_g2(x) # it should accept the new Instances object and run successfully - - def test_script_access_fields(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes - objectness_logits = x.objectness_logits - return proposal_boxes.tensor + objectness_logits - - fields = {"proposal_boxes": Boxes, "objectness_logits": Tensor} - with patch_instances(fields): - torch.jit.script(f()) - - def test_script_len(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return len(x) - - class g(torch.nn.Module): - def forward(self, x: Instances): - return len(x) - - image_shape = (15, 15) - - fields = {"proposal_boxes": Boxes} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - with self.assertRaises(Exception): - script_module(x) - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - length = script_module(x) - self.assertEqual(length, 2) - - fields = {"objectness_logits": Tensor} - with patch_instances(fields) as new_instance: - 
script_module = torch.jit.script(g()) - x = new_instance(image_shape) - objectness_logits = torch.tensor([1.0]).reshape(1, 1) - x.objectness_logits = objectness_logits - length = script_module(x) - self.assertEqual(length, 1) - - def test_script_has(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return x.has("proposal_boxes") - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - self.assertFalse(script_module(x)) - - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - self.assertTrue(script_module(x)) - - def test_script_to(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return x.to(torch.device("cpu")) - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - script_module(x) - - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - x.a = box_tensors - script_module(x) - - def test_script_getitem(self): - class f(torch.nn.Module): - def forward(self, x: Instances, idx): - return x[idx] - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes, "a": Tensor} - inst = Instances(image_shape) - inst.proposal_boxes = Boxes(torch.rand(4, 4)) - inst.a = torch.rand(4, 10) - idx = torch.tensor([True, False, True, False]) - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - - out = f()(inst, idx) - out_scripted = script_module(new_instance.from_instances(inst), idx) - self.assertTrue( - torch.equal(out.proposal_boxes.tensor, out_scripted.proposal_boxes.tensor) - ) - self.assertTrue(torch.equal(out.a, out_scripted.a)) - - def test_from_to_instances(self): - orig = Instances((30, 30)) - orig.proposal_boxes = Boxes(torch.rand(3, 4)) - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields) as NewInstances: - # convert to NewInstances and back - new1 = NewInstances.from_instances(orig) - new2 = convert_scripted_instances(new1) - self.assertTrue(torch.equal(orig.proposal_boxes.tensor, new1.proposal_boxes.tensor)) - self.assertTrue(torch.equal(orig.proposal_boxes.tensor, new2.proposal_boxes.tensor)) - - def test_script_init_args(self): - def f(x: Tensor): - image_shape = (15, 15) - # __init__ can take arguments - inst = Instances(image_shape, a=x, proposal_boxes=Boxes(x)) - inst2 = Instances(image_shape, a=x) - return inst.a, inst2.a - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields): - script_f = torch.jit.script(f) - x = torch.randn(3, 4) - outputs = script_f(x) - self.assertTrue(torch.equal(outputs[0], x)) - self.assertTrue(torch.equal(outputs[1], x)) - - def test_script_cat(self): - def f(x: Tensor): - image_shape = (15, 15) - # __init__ can take arguments - inst = Instances(image_shape, a=x) - inst2 = Instances(image_shape, a=x) - - inst3 = Instances(image_shape, proposal_boxes=Boxes(x)) - return inst.cat([inst, inst2]), inst3.cat([inst3, inst3]) - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields): - script_f = torch.jit.script(f) - x = torch.randn(3, 4) - output, output2 = script_f(x) - self.assertTrue(torch.equal(output.a, torch.cat([x, x]))) - self.assertFalse(output.has("proposal_boxes")) - self.assertTrue(torch.equal(output2.proposal_boxes.tensor, 
torch.cat([x, x]))) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py deleted file mode 100755 index adc616e4..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_keypoints.py +++ /dev/null @@ -1,19 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.structures.keypoints import Keypoints - - -class TestKeypoints(unittest.TestCase): - def test_cat_keypoints(self): - keypoints1 = Keypoints(torch.rand(2, 21, 3)) - keypoints2 = Keypoints(torch.rand(4, 21, 3)) - - cat_keypoints = keypoints1.cat([keypoints1, keypoints2]) - self.assertTrue(torch.all(cat_keypoints.tensor[:2] == keypoints1.tensor).item()) - self.assertTrue(torch.all(cat_keypoints.tensor[2:] == keypoints2.tensor).item()) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_masks.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_masks.py deleted file mode 100755 index 7991eb0b..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_masks.py +++ /dev/null @@ -1,53 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.structures.masks import BitMasks, PolygonMasks, polygons_to_bitmask - - -class TestBitMask(unittest.TestCase): - def test_get_bounding_box(self): - masks = torch.tensor( - [ - [ - [False, False, False, True], - [False, False, True, True], - [False, True, True, False], - [False, True, True, False], - ], - [ - [False, False, False, False], - [False, False, True, False], - [False, True, True, False], - [False, True, True, False], - ], - torch.zeros(4, 4), - ] - ) - bitmask = BitMasks(masks) - box_true = torch.tensor([[1, 0, 4, 4], [1, 1, 3, 4], [0, 0, 0, 0]], dtype=torch.float32) - box = bitmask.get_bounding_boxes() - self.assertTrue(torch.all(box.tensor == box_true).item()) - - for box in box_true: - poly = box[[0, 1, 2, 1, 2, 3, 0, 3]].numpy() - mask = polygons_to_bitmask([poly], 4, 4) - reconstruct_box = BitMasks(mask[None, :, :]).get_bounding_boxes()[0].tensor - self.assertTrue(torch.all(box == reconstruct_box).item()) - - reconstruct_box = PolygonMasks([[poly]]).get_bounding_boxes()[0].tensor - self.assertTrue(torch.all(box == reconstruct_box).item()) - - def test_from_empty_polygons(self): - masks = BitMasks.from_polygon_masks([], 100, 100) - self.assertEqual(masks.tensor.shape, (0, 100, 100)) - - def test_getitem(self): - masks = BitMasks(torch.ones(3, 10, 10)) - self.assertEqual(masks[1].tensor.shape, (1, 10, 10)) - self.assertEqual(masks[1:3].tensor.shape, (2, 10, 10)) - self.assertEqual(masks[torch.tensor([True, False, False])].tensor.shape, (1, 10, 10)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py deleted file mode 100755 index 27812374..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/structures/test_rotated_boxes.py +++ /dev/null @@ -1,437 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
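The `test_rotated_boxes.py` file removed below covered the `(x_ctr, y_ctr, w, h, angle_in_degrees)` rotated-box representation. A minimal sketch of the two IoU entry points it exercised, assuming detectron2 is installed; the boxes reproduce the deleted half-overlap case.

```python
# Minimal sketch of the rotated-box IoU API covered by the deleted tests.
# Assumes detectron2 is installed; boxes mirror the half-overlap test case.
import torch

from detectron2.layers.rotated_boxes import pairwise_iou_rotated
from detectron2.structures.rotated_boxes import RotatedBoxes, pairwise_iou

# A rotated box is (x_ctr, y_ctr, width, height, angle_in_degrees).
b1 = torch.tensor([[0.5, 0.5, 1.0, 1.0, 0.0]])
b2 = torch.tensor([[0.25, 0.5, 0.5, 1.0, 0.0]])

# The raw layer operates on (N, 5) tensors ...
print(pairwise_iou_rotated(b1, b2))                      # tensor([[0.5]])
# ... and the structure-level wrapper on RotatedBoxes objects.
print(pairwise_iou(RotatedBoxes(b1), RotatedBoxes(b2)))  # tensor([[0.5]])
```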
-from __future__ import absolute_import, division, print_function, unicode_literals -import logging -import math -import random -import unittest -import torch -from fvcore.common.benchmark import benchmark - -from detectron2.layers.rotated_boxes import pairwise_iou_rotated -from detectron2.structures.boxes import Boxes -from detectron2.structures.rotated_boxes import RotatedBoxes, pairwise_iou -from detectron2.utils.testing import reload_script_model - -logger = logging.getLogger(__name__) - - -class TestRotatedBoxesLayer(unittest.TestCase): - def test_iou_0_dim_cpu(self): - boxes1 = torch.rand(0, 5, dtype=torch.float32) - boxes2 = torch.rand(10, 5, dtype=torch.float32) - expected_ious = torch.zeros(0, 10, dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - boxes1 = torch.rand(10, 5, dtype=torch.float32) - boxes2 = torch.rand(0, 5, dtype=torch.float32) - expected_ious = torch.zeros(10, 0, dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_0_dim_cuda(self): - boxes1 = torch.rand(0, 5, dtype=torch.float32) - boxes2 = torch.rand(10, 5, dtype=torch.float32) - expected_ious = torch.zeros(0, 10, dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - boxes1 = torch.rand(10, 5, dtype=torch.float32) - boxes2 = torch.rand(0, 5, dtype=torch.float32) - expected_ious = torch.zeros(10, 0, dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - def test_iou_half_overlap_cpu(self): - boxes1 = torch.tensor([[0.5, 0.5, 1.0, 1.0, 0.0]], dtype=torch.float32) - boxes2 = torch.tensor([[0.25, 0.5, 0.5, 1.0, 0.0]], dtype=torch.float32) - expected_ious = torch.tensor([[0.5]], dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_half_overlap_cuda(self): - boxes1 = torch.tensor([[0.5, 0.5, 1.0, 1.0, 0.0]], dtype=torch.float32) - boxes2 = torch.tensor([[0.25, 0.5, 0.5, 1.0, 0.0]], dtype=torch.float32) - expected_ious = torch.tensor([[0.5]], dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - def test_iou_precision(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor([[565, 565, 10, 10.0, 0]], dtype=torch.float32, device=device) - boxes2 = torch.tensor([[565, 565, 10, 8.3, 0]], dtype=torch.float32, device=device) - iou = 8.3 / 10.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_too_many_boxes_cuda(self): - s1, s2 = 5, 1289035 - boxes1 = torch.zeros(s1, 5) - boxes2 = torch.zeros(s2, 5) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTupleEqual(tuple(ious_cuda.shape), (s1, s2)) - - def test_iou_extreme(self): - # Cause floating point issues in cuda kernels (#1266) - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = 
torch.tensor([[160.0, 153.0, 230.0, 23.0, -37.0]], device=device) - boxes2 = torch.tensor( - [ - [ - -1.117407639806935e17, - 1.3858420478349148e18, - 1000.0000610351562, - 1000.0000610351562, - 1612.0, - ] - ], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(ious.min() >= 0, ious) - - def test_iou_issue_2154(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [ - 296.6620178222656, - 458.73883056640625, - 23.515729904174805, - 47.677001953125, - 0.08795166015625, - ] - ], - device=device, - ) - boxes2 = torch.tensor( - [[296.66201, 458.73882000000003, 23.51573, 47.67702, 0.087951]], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - expected_ious = torch.tensor([[1.0]], dtype=torch.float32) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - def test_iou_issue_2167(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [ - 2563.74462890625000000000, - 1436.79016113281250000000, - 2174.70336914062500000000, - 214.09500122070312500000, - 115.11834716796875000000, - ] - ], - device=device, - ) - boxes2 = torch.tensor( - [ - [ - 2563.74462890625000000000, - 1436.79028320312500000000, - 2174.70288085937500000000, - 214.09495544433593750000, - 115.11835479736328125000, - ] - ], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - expected_ious = torch.tensor([[1.0]], dtype=torch.float32) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - -class TestRotatedBoxesStructure(unittest.TestCase): - def test_clip_area_0_degree(self): - for _ in range(50): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - # Convert from (x_ctr, y_ctr, w, h, 0) to (x1, y1, x2, y2) - boxes_4d = torch.zeros(num_boxes, 4) - boxes_4d[:, 0] = boxes_5d[:, 0] - boxes_5d[:, 2] / 2.0 - boxes_4d[:, 1] = boxes_5d[:, 1] - boxes_5d[:, 3] / 2.0 - boxes_4d[:, 2] = boxes_5d[:, 0] + boxes_5d[:, 2] / 2.0 - boxes_4d[:, 3] = boxes_5d[:, 1] + boxes_5d[:, 3] / 2.0 - - image_size = (500, 600) - test_boxes_4d = Boxes(boxes_4d) - test_boxes_5d = RotatedBoxes(boxes_5d) - # Before clip - areas_4d = test_boxes_4d.area() - areas_5d = test_boxes_5d.area() - self.assertTrue(torch.allclose(areas_4d, areas_5d, atol=1e-1, rtol=1e-5)) - # After clip - test_boxes_4d.clip(image_size) - test_boxes_5d.clip(image_size) - areas_4d = test_boxes_4d.area() - areas_5d = test_boxes_5d.area() - self.assertTrue(torch.allclose(areas_4d, areas_5d, atol=1e-1, rtol=1e-5)) - - def test_clip_area_arbitrary_angle(self): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - clip_angle_threshold = random.uniform(0, 180) - - image_size = (500, 600) - test_boxes_5d = RotatedBoxes(boxes_5d) - # Before clip - areas_before = test_boxes_5d.area() - # After clip - test_boxes_5d.clip(image_size, clip_angle_threshold) - areas_diff = test_boxes_5d.area() - areas_before 
- - # the areas should only decrease after clipping - self.assertTrue(torch.all(areas_diff <= 0)) - # whenever the box is clipped (thus the area shrinks), - # the angle for the box must be within the clip_angle_threshold - # Note that the clip function will normalize the angle range - # to be within (-180, 180] - self.assertTrue( - torch.all(torch.abs(boxes_5d[:, 4][torch.where(areas_diff < 0)]) < clip_angle_threshold) - ) - - def test_normalize_angles(self): - # torch.manual_seed(0) - for _ in range(50): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - rotated_boxes = RotatedBoxes(boxes_5d) - normalized_boxes = rotated_boxes.clone() - normalized_boxes.normalize_angles() - self.assertTrue(torch.all(normalized_boxes.tensor[:, 4] >= -180)) - self.assertTrue(torch.all(normalized_boxes.tensor[:, 4] < 180)) - # x, y, w, h should not change - self.assertTrue(torch.allclose(boxes_5d[:, :4], normalized_boxes.tensor[:, :4])) - # the cos/sin values of the angles should stay the same - - self.assertTrue( - torch.allclose( - torch.cos(boxes_5d[:, 4] * math.pi / 180), - torch.cos(normalized_boxes.tensor[:, 4] * math.pi / 180), - atol=1e-5, - ) - ) - - self.assertTrue( - torch.allclose( - torch.sin(boxes_5d[:, 4] * math.pi / 180), - torch.sin(normalized_boxes.tensor[:, 4] * math.pi / 180), - atol=1e-5, - ) - ) - - def test_pairwise_iou_0_degree(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [[0.5, 0.5, 1.0, 1.0, 0.0], [0.5, 0.5, 1.0, 1.0, 0.0]], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor( - [ - [0.5, 0.5, 1.0, 1.0, 0.0], - [0.25, 0.5, 0.5, 1.0, 0.0], - [0.5, 0.25, 1.0, 0.5, 0.0], - [0.25, 0.25, 0.5, 0.5, 0.0], - [0.75, 0.75, 0.5, 0.5, 0.0], - [1.0, 1.0, 1.0, 1.0, 0.0], - ], - dtype=torch.float32, - device=device, - ) - expected_ious = torch.tensor( - [ - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - ], - dtype=torch.float32, - device=device, - ) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_45_degrees(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [1, 1, math.sqrt(2), math.sqrt(2), 45], - [1, 1, 2 * math.sqrt(2), 2 * math.sqrt(2), -45], - ], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor([[1, 1, 2, 2, 0]], dtype=torch.float32, device=device) - expected_ious = torch.tensor([[0.5], [0.5]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_orthogonal(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor([[5, 5, 10, 6, 55]], dtype=torch.float32, device=device) - boxes2 = torch.tensor([[5, 5, 10, 6, -35]], dtype=torch.float32, device=device) - iou = (6.0 * 6.0) / (6.0 * 6.0 + 4.0 * 6.0 + 4.0 * 6.0) - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - 
self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_large_close_boxes(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [[299.500000, 417.370422, 600.000000, 364.259186, 27.1828]], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor( - [[299.500000, 417.370422, 600.000000, 364.259155, 27.1828]], - dtype=torch.float32, - device=device, - ) - iou = 364.259155 / 364.259186 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_many_boxes(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - num_boxes1 = 100 - num_boxes2 = 200 - boxes1 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 10, 0], - dtype=torch.float32, - device=device, - ) - for i in range(num_boxes1) - ] - ) - boxes2 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 1 + 9 * i / num_boxes2, 0], - dtype=torch.float32, - device=device, - ) - for i in range(num_boxes2) - ] - ) - expected_ious = torch.zeros(num_boxes1, num_boxes2, dtype=torch.float32, device=device) - for i in range(min(num_boxes1, num_boxes2)): - expected_ious[i][i] = (1 + 9 * i / num_boxes2) / 10.0 - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_issue1207_simplified(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - # Simplified test case of D2-issue-1207 - boxes1 = torch.tensor([[3, 3, 8, 2, -45.0]], device=device) - boxes2 = torch.tensor([[6, 0, 8, 2, -45.0]], device=device) - iou = 0.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_issue1207(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - # The original test case in D2-issue-1207 - boxes1 = torch.tensor([[160.0, 153.0, 230.0, 23.0, -37.0]], device=device) - boxes2 = torch.tensor([[190.0, 127.0, 80.0, 21.0, -46.0]], device=device) - - iou = 0.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_empty_cat(self): - x = RotatedBoxes.cat([]) - self.assertTrue(x.tensor.shape, (0, 5)) - - def test_scriptability(self): - def func(x): - boxes = RotatedBoxes(x) - test = boxes.to(torch.device("cpu")).tensor - return boxes.area(), test - - f = torch.jit.script(func) - f = reload_script_model(f) - f(torch.rand((3, 5))) - - data = torch.rand((3, 5)) - - def func_cat(x: torch.Tensor): - boxes1 = RotatedBoxes(x) - boxes2 = RotatedBoxes(x) - # this is not supported by torchscript for now. 
- # boxes3 = RotatedBoxes.cat([boxes1, boxes2]) - boxes3 = boxes1.cat([boxes1, boxes2]) - return boxes3 - - f = torch.jit.script(func_cat) - script_box = f(data) - self.assertTrue(torch.equal(torch.cat([data, data]), script_box.tensor)) - - -def benchmark_rotated_iou(): - num_boxes1 = 200 - num_boxes2 = 500 - boxes1 = torch.stack( - [ - torch.tensor([5 + 20 * i, 5 + 20 * i, 10, 10, 0], dtype=torch.float32) - for i in range(num_boxes1) - ] - ) - boxes2 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 1 + 9 * i / num_boxes2, 0], - dtype=torch.float32, - ) - for i in range(num_boxes2) - ] - ) - - def func(dev, n=1): - b1 = boxes1.to(device=dev) - b2 = boxes2.to(device=dev) - - def bench(): - for _ in range(n): - pairwise_iou_rotated(b1, b2) - if dev.type == "cuda": - torch.cuda.synchronize() - - return bench - - # only run it once per timed loop, since it's slow - args = [{"dev": torch.device("cpu"), "n": 1}] - if torch.cuda.is_available(): - args.append({"dev": torch.device("cuda"), "n": 10}) - - benchmark(func, "rotated_iou", args, warmup_iters=3) - - -if __name__ == "__main__": - unittest.main() - benchmark_rotated_iou() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_checkpoint.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_checkpoint.py deleted file mode 100755 index ab0bfbd5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_checkpoint.py +++ /dev/null @@ -1,49 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -from collections import OrderedDict -import torch -from torch import nn - -from detectron2.checkpoint.c2_model_loading import align_and_update_state_dicts -from detectron2.utils.logger import setup_logger - - -class TestCheckpointer(unittest.TestCase): - def setUp(self): - setup_logger() - - def create_complex_model(self): - m = nn.Module() - m.block1 = nn.Module() - m.block1.layer1 = nn.Linear(2, 3) - m.layer2 = nn.Linear(3, 2) - m.res = nn.Module() - m.res.layer2 = nn.Linear(3, 2) - - state_dict = OrderedDict() - state_dict["layer1.weight"] = torch.rand(3, 2) - state_dict["layer1.bias"] = torch.rand(3) - state_dict["layer2.weight"] = torch.rand(2, 3) - state_dict["layer2.bias"] = torch.rand(2) - state_dict["res.layer2.weight"] = torch.rand(2, 3) - state_dict["res.layer2.bias"] = torch.rand(2) - return m, state_dict - - def test_complex_model_loaded(self): - for add_data_parallel in [False, True]: - model, state_dict = self.create_complex_model() - if add_data_parallel: - model = nn.DataParallel(model) - model_sd = model.state_dict() - - sd_to_load = align_and_update_state_dicts(model_sd, state_dict) - model.load_state_dict(sd_to_load) - for loaded, stored in zip(model_sd.values(), state_dict.values()): - # different tensor references - self.assertFalse(id(loaded) == id(stored)) - # same content - self.assertTrue(loaded.to(stored).equal(stored)) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_engine.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_engine.py deleted file mode 100755 index 6f6a0997..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_engine.py +++ /dev/null @@ -1,186 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
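The `test_engine.py` file removed below exercised `SimpleTrainer` and its hook system. A minimal sketch of that training loop, with a toy model and data loader standing in for real ones (assumes detectron2 is installed):

```python
# Minimal sketch of the SimpleTrainer / hook API covered by the deleted tests.
# Assumes detectron2 is installed; the model and data loader are toy stand-ins.
import torch
from torch import nn

from detectron2.engine import SimpleTrainer, hooks


class ToyModel(nn.Module):
    """Returns a dict of losses, as detectron2 trainers expect."""

    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(10, 20)

    def forward(self, x):
        return {"loss": x.sum() + sum(p.mean() for p in self.parameters())}


def data_loader():
    while True:  # infinite iterator of "batches"
        yield torch.rand(3, 3)


model = ToyModel()
trainer = SimpleTrainer(model, data_loader(), torch.optim.SGD(model.parameters(), 0.1))
# Hooks run at registered points; EvalHook with period 0 runs once after the last iter.
trainer.register_hooks([hooks.EvalHook(0, lambda: {"metric": 100})])
trainer.train(0, 10)  # iterations [0, 10)
```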
- -import json -import math -import os -import tempfile -import time -import unittest -from unittest import mock -import torch -from fvcore.common.checkpoint import Checkpointer -from torch import nn - -from detectron2 import model_zoo -from detectron2.config import configurable, get_cfg -from detectron2.engine import DefaultTrainer, SimpleTrainer, default_setup, hooks -from detectron2.modeling.meta_arch import META_ARCH_REGISTRY -from detectron2.utils.events import CommonMetricPrinter, JSONWriter - - -@META_ARCH_REGISTRY.register() -class _SimpleModel(nn.Module): - @configurable - def __init__(self, sleep_sec=0): - super().__init__() - self.mod = nn.Linear(10, 20) - self.sleep_sec = sleep_sec - - @classmethod - def from_config(cls, cfg): - return {} - - def forward(self, x): - if self.sleep_sec > 0: - time.sleep(self.sleep_sec) - return {"loss": x.sum() + sum([x.mean() for x in self.parameters()])} - - -class TestTrainer(unittest.TestCase): - def _data_loader(self, device): - device = torch.device(device) - while True: - yield torch.rand(3, 3).to(device) - - def test_simple_trainer(self, device="cpu"): - model = _SimpleModel().to(device=device) - trainer = SimpleTrainer( - model, self._data_loader(device), torch.optim.SGD(model.parameters(), 0.1) - ) - trainer.train(0, 10) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_simple_trainer_cuda(self): - self.test_simple_trainer(device="cuda") - - def test_writer_hooks(self): - model = _SimpleModel(sleep_sec=0.1) - trainer = SimpleTrainer( - model, self._data_loader("cpu"), torch.optim.SGD(model.parameters(), 0.1) - ) - - max_iter = 50 - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - json_file = os.path.join(d, "metrics.json") - writers = [CommonMetricPrinter(max_iter), JSONWriter(json_file)] - - trainer.register_hooks( - [hooks.EvalHook(0, lambda: {"metric": 100}), hooks.PeriodicWriter(writers)] - ) - with self.assertLogs(writers[0].logger) as logs: - trainer.train(0, max_iter) - - with open(json_file, "r") as f: - data = [json.loads(line.strip()) for line in f] - self.assertEqual([x["iteration"] for x in data], [19, 39, 49, 50]) - # the eval metric is in the last line with iter 50 - self.assertIn("metric", data[-1], "Eval metric must be in last line of JSON!") - - # test logged messages from CommonMetricPrinter - self.assertEqual(len(logs.output), 3) - for log, iter in zip(logs.output, [19, 39, 49]): - self.assertIn(f"iter: {iter}", log) - - self.assertIn("eta: 0:00:00", logs.output[-1], "Last ETA must be 0!") - - def test_default_trainer(self): - # TODO: this test requires manifold access, so changed device to CPU. 
see: T88318502 - cfg = get_cfg() - cfg.MODEL.DEVICE = "cpu" - cfg.MODEL.META_ARCHITECTURE = "_SimpleModel" - cfg.DATASETS.TRAIN = ("coco_2017_val_100",) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - cfg.OUTPUT_DIR = d - trainer = DefaultTrainer(cfg) - - # test property - self.assertIs(trainer.model, trainer._trainer.model) - trainer.model = _SimpleModel() - self.assertIs(trainer.model, trainer._trainer.model) - - def test_checkpoint_resume(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - scheduler = torch.optim.lr_scheduler.StepLR(opt, 3) - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - trainer = SimpleTrainer(model, dataloader, opt) - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - - trainer.register_hooks( - [ - hooks.LRScheduler(scheduler=scheduler), - # checkpoint after scheduler to properly save the state of scheduler - hooks.PeriodicCheckpointer(checkpointer, 10), - ] - ) - - trainer.train(0, 12) - self.assertAlmostEqual(opt.param_groups[0]["lr"], 1e-5) - self.assertEqual(scheduler.last_epoch, 12) - del trainer - - opt = torch.optim.SGD(model.parameters(), 999) # lr will be loaded - trainer = SimpleTrainer(model, dataloader, opt) - scheduler = torch.optim.lr_scheduler.StepLR(opt, 3) - trainer.register_hooks( - [ - hooks.LRScheduler(scheduler=scheduler), - ] - ) - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - checkpointer.resume_or_load("non_exist.pth") - self.assertEqual(trainer.iter, 11) # last finished iter number (0-based in Trainer) - # number of times `scheduler.step()` was called (1-based) - self.assertEqual(scheduler.last_epoch, 12) - self.assertAlmostEqual(opt.param_groups[0]["lr"], 1e-5) - - def test_eval_hook(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - - for total_iter, period, eval_count in [(30, 15, 2), (31, 15, 3), (20, 0, 1)]: - test_func = mock.Mock(return_value={"metric": 3.0}) - trainer = SimpleTrainer(model, dataloader, opt) - trainer.register_hooks([hooks.EvalHook(period, test_func)]) - trainer.train(0, total_iter) - self.assertEqual(test_func.call_count, eval_count) - - def test_best_checkpointer(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - metric_name = "metric" - total_iter = 40 - test_period = 10 - test_cases = [ - ("max", iter([0.3, 0.4, 0.35, 0.5]), 3), - ("min", iter([1.0, 0.8, 0.9, 0.9]), 2), - ("min", iter([math.nan, 0.8, 0.9, 0.9]), 1), - ] - for mode, metrics, call_count in test_cases: - trainer = SimpleTrainer(model, dataloader, opt) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - trainer.register_hooks( - [ - hooks.EvalHook(test_period, lambda: {metric_name: next(metrics)}), - hooks.BestCheckpointer(test_period, checkpointer, metric_name, mode=mode), - ] - ) - with mock.patch.object(checkpointer, "save") as mock_save_method: - trainer.train(0, total_iter) - self.assertEqual(mock_save_method.call_count, call_count) - - def test_setup_config(self): - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - cfg = get_cfg() - cfg.OUTPUT_DIR = os.path.join(d, "yacs") - default_setup(cfg, {}) - - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py") - cfg.train.output_dir = os.path.join(d, "omegaconf") - default_setup(cfg, {}) 
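The trainer tests deleted above all drive the same pattern: a model that returns a dict of losses, a `SimpleTrainer` fed by an infinite data loader, and behavior layered on through registered hooks. For orientation, a minimal sketch of that pattern follows, assuming only the public `SimpleTrainer`, `hooks`, and `Checkpointer` APIs imported in the deleted file; the toy model, batch shapes, and output directory are made up for illustration, and the sketch is not part of the patch.

```
import torch
from torch import nn
from fvcore.common.checkpoint import Checkpointer
from detectron2.engine import SimpleTrainer, hooks


class ToyModel(nn.Module):
    """SimpleTrainer expects forward() to return a dict of losses."""

    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 1)

    def forward(self, x):
        return {"loss": self.fc(x).abs().mean()}


def data_loader():
    # An infinite stream of dummy batches, mirroring _data_loader() in the tests.
    while True:
        yield torch.rand(4, 10)


model = ToyModel()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
trainer = SimpleTrainer(model, data_loader(), opt)
# Passing the trainer as a checkpointable lets resume_or_load() restore trainer.iter.
checkpointer = Checkpointer(model, "./output", opt=opt, trainer=trainer)
trainer.register_hooks(
    [
        hooks.EvalHook(10, lambda: {"metric": 1.0}),  # run a callable every 10 iterations
        hooks.PeriodicCheckpointer(checkpointer, 10),  # save every 10 iterations
    ]
)
trainer.train(0, 20)  # run iterations [0, 20)
```

Resuming mirrors `test_checkpoint_resume` above: build a fresh trainer, register the hooks again, and call `checkpointer.resume_or_load(path)` so both the weights and the iteration counter are restored.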
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_events.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_events.py
deleted file mode 100755
index c1b03e4d..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_events.py
+++ /dev/null
@@ -1,64 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import json
-import os
-import tempfile
-import unittest
-
-from detectron2.utils.events import CommonMetricPrinter, EventStorage, JSONWriter
-
-
-class TestEventWriter(unittest.TestCase):
-    def testScalar(self):
-        with tempfile.TemporaryDirectory(
-            prefix="detectron2_tests"
-        ) as dir, EventStorage() as storage:
-            json_file = os.path.join(dir, "test.json")
-            writer = JSONWriter(json_file)
-            for k in range(60):
-                storage.put_scalar("key", k, smoothing_hint=False)
-                if (k + 1) % 20 == 0:
-                    writer.write()
-                storage.step()
-            writer.close()
-            with open(json_file) as f:
-                data = [json.loads(l) for l in f]
-                self.assertTrue([int(k["key"]) for k in data] == [19, 39, 59])
-
-    def testScalarMismatchedPeriod(self):
-        with tempfile.TemporaryDirectory(
-            prefix="detectron2_tests"
-        ) as dir, EventStorage() as storage:
-            json_file = os.path.join(dir, "test.json")
-
-            writer = JSONWriter(json_file)
-            for k in range(60):
-                if k % 17 == 0:  # write in a different period
-                    storage.put_scalar("key2", k, smoothing_hint=False)
-                storage.put_scalar("key", k, smoothing_hint=False)
-                if (k + 1) % 20 == 0:
-                    writer.write()
-                storage.step()
-            writer.close()
-            with open(json_file) as f:
-                data = [json.loads(l) for l in f]
-                self.assertTrue([int(k.get("key2", 0)) for k in data] == [17, 0, 34, 0, 51, 0])
-                self.assertTrue([int(k.get("key", 0)) for k in data] == [0, 19, 0, 39, 0, 59])
-                self.assertTrue([int(k["iteration"]) for k in data] == [17, 19, 34, 39, 51, 59])
-
-    def testPrintETA(self):
-        with EventStorage() as s:
-            p1 = CommonMetricPrinter(10)
-            p2 = CommonMetricPrinter()
-
-            s.put_scalar("time", 1.0)
-            s.step()
-            s.put_scalar("time", 1.0)
-            s.step()
-
-            with self.assertLogs("detectron2.utils.events") as logs:
-                p1.write()
-            self.assertIn("eta", logs.output[0])
-
-            with self.assertLogs("detectron2.utils.events") as logs:
-                p2.write()
-            self.assertNotIn("eta", logs.output[0])
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py
deleted file mode 100755
index 9a5e155f..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_caffe2.py
+++ /dev/null
@@ -1,52 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# -*- coding: utf-8 -*-
-
-import copy
-import os
-import tempfile
-import unittest
-import torch
-
-from detectron2 import model_zoo
-from detectron2.export import Caffe2Model, Caffe2Tracer
-from detectron2.utils.logger import setup_logger
-from detectron2.utils.testing import get_sample_coco_image
-
-
-# TODO: this test requires manifold access, see: T88318502
-# Running it on CircleCI causes a crash, not sure why.
-@unittest.skipIf(os.environ.get("CIRCLECI"), "Caffe2 tests crash on CircleCI.") -class TestCaffe2Export(unittest.TestCase): - def setUp(self): - setup_logger() - - def _test_model(self, config_path, device="cpu"): - cfg = model_zoo.get_config(config_path) - cfg.MODEL.DEVICE = device - model = model_zoo.get(config_path, trained=True, device=device) - - inputs = [{"image": get_sample_coco_image()}] - tracer = Caffe2Tracer(cfg, model, copy.deepcopy(inputs)) - - with tempfile.TemporaryDirectory(prefix="detectron2_unittest") as d: - if not os.environ.get("CI"): - # This requires onnx, which is not yet available on public CI - c2_model = tracer.export_caffe2() - c2_model.save_protobuf(d) - c2_model.save_graph(os.path.join(d, "test.svg"), inputs=copy.deepcopy(inputs)) - - c2_model = Caffe2Model.load_protobuf(d) - c2_model(inputs)[0]["instances"] - - ts_model = tracer.export_torchscript() - ts_model.save(os.path.join(d, "model.ts")) - - def testMaskRCNN(self): - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def testMaskRCNNGPU(self): - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", device="cuda") - - def testRetinaNet(self): - self._test_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml") diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py deleted file mode 100755 index e9a0ff58..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_export_torchscript.py +++ /dev/null @@ -1,296 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import json -import os -import random -import tempfile -import unittest -import torch -from torch import Tensor, nn - -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.config.instantiate import dump_dataclass, instantiate -from detectron2.export import dump_torchscript_IR, scripting_with_instances -from detectron2.export.flatten import TracingAdapter, flatten_to_tuple -from detectron2.export.torchscript_patch import patch_builtin_len -from detectron2.layers import ShapeSpec -from detectron2.modeling import build_backbone -from detectron2.modeling.postprocessing import detector_postprocess -from detectron2.modeling.roi_heads import KRCNNConvDeconvUpsampleHead -from detectron2.structures import Boxes, Instances -from detectron2.utils.env import TORCH_VERSION -from detectron2.utils.testing import ( - assert_instances_allclose, - convert_scripted_instances, - get_sample_coco_image, - random_boxes, -) - -""" -https://detectron2.readthedocs.io/tutorials/deployment.html -contains some explanations of this file. 
-""" - -SLOW_PUBLIC_CPU_TEST = unittest.skipIf( - os.environ.get("CI") and not torch.cuda.is_available(), - "The test is too slow on CPUs and will be executed on CircleCI's GPU jobs.", -) - - -class TestScripting(unittest.TestCase): - def testMaskRCNNFPN(self): - self._test_rcnn_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml") - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNC4(self): - self._test_rcnn_model("COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml") - - def testRetinaNet(self): - self._test_retinanet_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml") - - def _test_rcnn_model(self, config_path): - model = model_zoo.get(config_path, trained=True) - model.eval() - - fields = { - "proposal_boxes": Boxes, - "objectness_logits": Tensor, - "pred_boxes": Boxes, - "scores": Tensor, - "pred_classes": Tensor, - "pred_masks": Tensor, - } - script_model = scripting_with_instances(model, fields) - - # Test that batch inference with different shapes are supported - image = get_sample_coco_image() - small_image = nn.functional.interpolate(image, scale_factor=0.5) - inputs = [{"image": image}, {"image": small_image}] - with torch.no_grad(): - instance = model.inference(inputs, do_postprocess=False)[0] - scripted_instance = script_model.inference(inputs, do_postprocess=False)[0] - assert_instances_allclose(instance, scripted_instance) - - def _test_retinanet_model(self, config_path): - model = model_zoo.get(config_path, trained=True) - model.eval() - - fields = { - "pred_boxes": Boxes, - "scores": Tensor, - "pred_classes": Tensor, - } - script_model = scripting_with_instances(model, fields) - - img = get_sample_coco_image() - inputs = [{"image": img}] * 2 - with torch.no_grad(): - instance = model(inputs)[0]["instances"] - scripted_instance = convert_scripted_instances(script_model(inputs)[0]) - scripted_instance = detector_postprocess(scripted_instance, img.shape[1], img.shape[2]) - assert_instances_allclose(instance, scripted_instance) - # Note that the model currently cannot be saved and loaded into a new process: - # https://github.com/pytorch/pytorch/issues/46944 - - -# TODO: this test requires manifold access, see: T88318502 -class TestTracing(unittest.TestCase): - def testMaskRCNNFPN(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - def testMaskRCNNFPN_with_postproc(self): - def inference_func(model, image): - inputs = [{"image": image, "height": image.shape[1], "width": image.shape[2]}] - return model.inference(inputs, do_postprocess=True)[0]["instances"] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNC4(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml", inference_func) - - @SLOW_PUBLIC_CPU_TEST - def testCascadeRCNN(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - # bug fixed by https://github.com/pytorch/pytorch/pull/67734 - @unittest.skipIf(TORCH_VERSION == (1, 10) and os.environ.get("CI"), "1.10 has bugs.") - def testRetinaNet(self): - def inference_func(model, image): - return 
model.forward([{"image": image}])[0]["instances"] - - self._test_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml", inference_func) - - def _test_model(self, config_path, inference_func, batch=1): - model = model_zoo.get(config_path, trained=True) - image = get_sample_coco_image() - inputs = tuple(image.clone() for _ in range(batch)) - - wrapper = TracingAdapter(model, inputs, inference_func) - wrapper.eval() - with torch.no_grad(): - # trace with smaller images, and the trace must still work - trace_inputs = tuple( - nn.functional.interpolate(image, scale_factor=random.uniform(0.5, 0.7)) - for _ in range(batch) - ) - traced_model = torch.jit.trace(wrapper, trace_inputs) - - outputs = inference_func(model, *inputs) - traced_outputs = wrapper.outputs_schema(traced_model(*inputs)) - if batch > 1: - for output, traced_output in zip(outputs, traced_outputs): - assert_instances_allclose(output, traced_output, size_as_tensor=True) - else: - assert_instances_allclose(outputs, traced_outputs, size_as_tensor=True) - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNFPN_batched(self): - def inference_func(model, image1, image2): - inputs = [{"image": image1}, {"image": image2}] - return model.inference(inputs, do_postprocess=False) - - self._test_model( - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func, batch=2 - ) - - def testKeypointHead(self): - class M(nn.Module): - def __init__(self): - super().__init__() - self.model = KRCNNConvDeconvUpsampleHead( - ShapeSpec(channels=4, height=14, width=14), num_keypoints=17, conv_dims=(4,) - ) - - def forward(self, x, predbox1, predbox2): - inst = [ - Instances((100, 100), pred_boxes=Boxes(predbox1)), - Instances((100, 100), pred_boxes=Boxes(predbox2)), - ] - ret = self.model(x, inst) - return tuple(x.pred_keypoints for x in ret) - - model = M() - model.eval() - - def gen_input(num1, num2): - feat = torch.randn((num1 + num2, 4, 14, 14)) - box1 = random_boxes(num1) - box2 = random_boxes(num2) - return feat, box1, box2 - - with torch.no_grad(), patch_builtin_len(): - trace = torch.jit.trace(model, gen_input(15, 15), check_trace=False) - - inputs = gen_input(12, 10) - trace_outputs = trace(*inputs) - true_outputs = model(*inputs) - for trace_output, true_output in zip(trace_outputs, true_outputs): - self.assertTrue(torch.allclose(trace_output, true_output)) - - -class TestTorchscriptUtils(unittest.TestCase): - # TODO: add test to dump scripting - def test_dump_IR_tracing(self): - cfg = get_cfg() - cfg.MODEL.RESNETS.DEPTH = 18 - cfg.MODEL.RESNETS.RES2_OUT_CHANNELS = 64 - - class Mod(nn.Module): - def forward(self, x): - return tuple(self.m(x).values()) - - model = Mod() - model.m = build_backbone(cfg) - model.eval() - - with torch.no_grad(): - ts_model = torch.jit.trace(model, (torch.rand(2, 3, 224, 224),)) - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - dump_torchscript_IR(ts_model, d) - # check that the files are created - for name in ["model_ts_code", "model_ts_IR", "model_ts_IR_inlined", "model"]: - fname = os.path.join(d, name + ".txt") - self.assertTrue(os.stat(fname).st_size > 0, fname) - - def test_dump_IR_function(self): - @torch.jit.script - def gunc(x, y): - return x + y - - def func(x, y): - return x + y + gunc(x, y) - - ts_model = torch.jit.trace(func, (torch.rand(3), torch.rand(3))) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - dump_torchscript_IR(ts_model, d) - for name in ["model_ts_code", "model_ts_IR", "model_ts_IR_inlined"]: - fname = os.path.join(d, name + ".txt") - 
self.assertTrue(os.stat(fname).st_size > 0, fname) - - def test_flatten_basic(self): - obj = [3, ([5, 6], {"name": [7, 9], "name2": 3})] - res, schema = flatten_to_tuple(obj) - self.assertEqual(res, (3, 5, 6, 7, 9, 3)) - new_obj = schema(res) - self.assertEqual(new_obj, obj) - - _, new_schema = flatten_to_tuple(new_obj) - self.assertEqual(schema, new_schema) # test __eq__ - self._check_schema(schema) - - def _check_schema(self, schema): - dumped_schema = dump_dataclass(schema) - # Check that the schema is json-serializable - # Although in reality you might want to use yaml because it often has many levels - json.dumps(dumped_schema) - - # Check that the schema can be deserialized - new_schema = instantiate(dumped_schema) - self.assertEqual(schema, new_schema) - - def test_flatten_instances_boxes(self): - inst = Instances( - torch.tensor([5, 8]), pred_masks=torch.tensor([3]), pred_boxes=Boxes(torch.ones((1, 4))) - ) - obj = [3, ([5, 6], inst)] - res, schema = flatten_to_tuple(obj) - self.assertEqual(res[:3], (3, 5, 6)) - for r, expected in zip(res[3:], (inst.pred_boxes.tensor, inst.pred_masks, inst.image_size)): - self.assertIs(r, expected) - new_obj = schema(res) - assert_instances_allclose(new_obj[1][1], inst, rtol=0.0, size_as_tensor=True) - - self._check_schema(schema) - - def test_allow_non_tensor(self): - data = (torch.tensor([5, 8]), 3) # contains non-tensor - - class M(nn.Module): - def forward(self, input, number): - return input - - model = M() - with self.assertRaisesRegex(ValueError, "must only contain tensors"): - adap = TracingAdapter(model, data, allow_non_tensor=False) - - adap = TracingAdapter(model, data, allow_non_tensor=True) - _ = adap(*adap.flattened_inputs) - - newdata = (data[0].clone(),) - with self.assertRaisesRegex(ValueError, "cannot generalize"): - _ = adap(*newdata) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_analysis.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_analysis.py deleted file mode 100755 index c01b7af0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_analysis.py +++ /dev/null @@ -1,80 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - - -import unittest -import torch -from torch import nn - -from detectron2.utils.analysis import find_unused_parameters, flop_count_operators, parameter_count -from detectron2.utils.testing import get_model_no_weights - - -class RetinaNetTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-Detection/retinanet_R_50_FPN_1x.yaml") - - def test_flop(self): - # RetinaNet supports flop-counting with random inputs - inputs = [{"image": torch.rand(3, 800, 800), "test_unused": "abcd"}] - res = flop_count_operators(self.model, inputs) - self.assertEqual(int(res["conv"]), 146) # 146B flops - - def test_param_count(self): - res = parameter_count(self.model) - self.assertEqual(res[""], 37915572) - self.assertEqual(res["backbone"], 31452352) - - -class FasterRCNNTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml") - - def test_flop(self): - # Faster R-CNN supports flop-counting with random inputs - inputs = [{"image": torch.rand(3, 800, 800)}] - res = flop_count_operators(self.model, inputs) - - # This only checks flops for backbone & proposal generator - # Flops for box head is not conv, and depends on #proposals, which is - # almost 0 for random inputs. 
- self.assertEqual(int(res["conv"]), 117) - - def test_flop_with_output_shape(self): - inputs = [{"image": torch.rand(3, 800, 800), "height": 700, "width": 700}] - res = flop_count_operators(self.model, inputs) - self.assertEqual(int(res["conv"]), 117) - - def test_param_count(self): - res = parameter_count(self.model) - self.assertEqual(res[""], 41699936) - self.assertEqual(res["backbone"], 26799296) - - -class MaskRCNNTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - - def test_flop(self): - inputs1 = [{"image": torch.rand(3, 800, 800)}] - inputs2 = [{"image": torch.rand(3, 800, 800), "height": 700, "width": 700}] - - for inputs in [inputs1, inputs2]: - res = flop_count_operators(self.model, inputs) - # The mask head could have extra conv flops, so total >= 117 - self.assertGreaterEqual(int(res["conv"]), 117) - - -class UnusedParamTest(unittest.TestCase): - def test_unused(self): - class TestMod(nn.Module): - def __init__(self): - super().__init__() - self.fc1 = nn.Linear(10, 10) - self.t = nn.Linear(10, 10) - - def forward(self, x): - return self.fc1(x).mean() - - m = TestMod() - ret = find_unused_parameters(m, torch.randn(10, 10)) - self.assertEqual(set(ret), {"t.weight", "t.bias"}) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_zoo.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_zoo.py deleted file mode 100755 index e3360a74..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_model_zoo.py +++ /dev/null @@ -1,50 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest - -from detectron2 import model_zoo -from detectron2.config import instantiate -from detectron2.modeling import FPN, GeneralizedRCNN - -logger = logging.getLogger(__name__) - - -class TestModelZoo(unittest.TestCase): - def test_get_returns_model(self): - model = model_zoo.get("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml", trained=False) - self.assertIsInstance(model, GeneralizedRCNN) - self.assertIsInstance(model.backbone, FPN) - - def test_get_invalid_model(self): - self.assertRaises(RuntimeError, model_zoo.get, "Invalid/config.yaml") - - def test_get_url(self): - url = model_zoo.get_checkpoint_url("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml") - self.assertEqual( - url, - "https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl", # noqa - ) - url2 = model_zoo.get_checkpoint_url("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.py") - self.assertEqual(url, url2) - - def _build_lazy_model(self, name): - cfg = model_zoo.get_config("common/models/" + name) - instantiate(cfg.model) - - def test_mask_rcnn_fpn(self): - self._build_lazy_model("mask_rcnn_fpn.py") - - def test_mask_rcnn_c4(self): - self._build_lazy_model("mask_rcnn_c4.py") - - def test_panoptic_fpn(self): - self._build_lazy_model("panoptic_fpn.py") - - def test_schedule(self): - cfg = model_zoo.get_config("common/coco_schedule.py") - for _, v in cfg.items(): - instantiate(v) - - -if __name__ == "__main__": - unittest.main() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_packaging.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_packaging.py deleted file mode 100755 index a5b1661e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_packaging.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) 
Facebook, Inc. and its affiliates. -import unittest - -from detectron2.utils.collect_env import collect_env_info - - -class TestProjects(unittest.TestCase): - def test_import(self): - from detectron2.projects import point_rend - - _ = point_rend.add_pointrend_config - - import detectron2.projects.deeplab as deeplab - - _ = deeplab.add_deeplab_config - - # import detectron2.projects.panoptic_deeplab as panoptic_deeplab - - # _ = panoptic_deeplab.add_panoptic_deeplab_config - - -class TestCollectEnv(unittest.TestCase): - def test(self): - _ = collect_env_info() diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_registry.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_registry.py deleted file mode 100755 index 4e425a6e..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_registry.py +++ /dev/null @@ -1,45 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.utils.registry import _convert_target_to_string, locate - - -class A: - class B: - pass - - -class TestLocate(unittest.TestCase): - def _test_obj(self, obj): - name = _convert_target_to_string(obj) - newobj = locate(name) - self.assertIs(obj, newobj) - - def test_basic(self): - self._test_obj(GeneralizedRCNN) - - def test_inside_class(self): - # requires using __qualname__ instead of __name__ - self._test_obj(A.B) - - def test_builtin(self): - self._test_obj(len) - self._test_obj(dict) - - def test_pytorch_optim(self): - # pydoc.locate does not work for it - self._test_obj(torch.optim.SGD) - - def test_failure(self): - with self.assertRaises(ImportError): - locate("asdf") - - def test_compress_target(self): - from detectron2.data.transforms import RandomCrop - - name = _convert_target_to_string(RandomCrop) - # name shouldn't contain 'augmentation_impl' - self.assertEqual(name, "detectron2.data.transforms.RandomCrop") - self.assertIs(RandomCrop, locate(name)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_scheduler.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_scheduler.py deleted file mode 100755 index 6cccb03f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_scheduler.py +++ /dev/null @@ -1,68 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -import math -import numpy as np -from unittest import TestCase -import torch -from fvcore.common.param_scheduler import CosineParamScheduler, MultiStepParamScheduler -from torch import nn - -from detectron2.solver import LRMultiplier, WarmupParamScheduler - - -class TestScheduler(TestCase): - def test_warmup_multistep(self): - p = nn.Parameter(torch.zeros(0)) - opt = torch.optim.SGD([p], lr=5) - - multiplier = WarmupParamScheduler( - MultiStepParamScheduler( - [1, 0.1, 0.01, 0.001], - milestones=[10, 15, 20], - num_updates=30, - ), - 0.001, - 5 / 30, - ) - sched = LRMultiplier(opt, multiplier, 30) - # This is an equivalent of: - # sched = WarmupMultiStepLR( - # opt, milestones=[10, 15, 20], gamma=0.1, warmup_factor=0.001, warmup_iters=5) - - p.sum().backward() - opt.step() - - lrs = [0.005] - for _ in range(30): - sched.step() - lrs.append(opt.param_groups[0]["lr"]) - self.assertTrue(np.allclose(lrs[:5], [0.005, 1.004, 2.003, 3.002, 4.001])) - self.assertTrue(np.allclose(lrs[5:10], 5.0)) - self.assertTrue(np.allclose(lrs[10:15], 0.5)) - self.assertTrue(np.allclose(lrs[15:20], 0.05)) - self.assertTrue(np.allclose(lrs[20:], 0.005)) - - def test_warmup_cosine(self): - p = nn.Parameter(torch.zeros(0)) - opt = torch.optim.SGD([p], lr=5) - multiplier = WarmupParamScheduler( - CosineParamScheduler(1, 0), - 0.001, - 5 / 30, - ) - sched = LRMultiplier(opt, multiplier, 30) - - p.sum().backward() - opt.step() - self.assertEqual(opt.param_groups[0]["lr"], 0.005) - lrs = [0.005] - - for _ in range(30): - sched.step() - lrs.append(opt.param_groups[0]["lr"]) - for idx, lr in enumerate(lrs): - expected_cosine = 2.5 * (1.0 + math.cos(math.pi * idx / 30)) - if idx >= 5: - self.assertAlmostEqual(lr, expected_cosine) - else: - self.assertNotAlmostEqual(lr, expected_cosine) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_solver.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_solver.py deleted file mode 100755 index 6b3ae84c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_solver.py +++ /dev/null @@ -1,66 +0,0 @@ -import unittest - -from detectron2.solver.build import _expand_param_groups, reduce_param_groups - - -class TestOptimizer(unittest.TestCase): - def testExpandParamsGroups(self): - params = [ - { - "params": ["p1", "p2", "p3", "p4"], - "lr": 1.0, - "weight_decay": 3.0, - }, - { - "params": ["p2", "p3", "p5"], - "lr": 2.0, - "momentum": 2.0, - }, - { - "params": ["p1"], - "weight_decay": 4.0, - }, - ] - out = _expand_param_groups(params) - gt = [ - dict(params=["p1"], lr=1.0, weight_decay=4.0), # noqa - dict(params=["p2"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p3"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p4"], lr=1.0, weight_decay=3.0), # noqa - dict(params=["p5"], lr=2.0, momentum=2.0), # noqa - ] - self.assertEqual(out, gt) - - def testReduceParamGroups(self): - params = [ - dict(params=["p1"], lr=1.0, weight_decay=4.0), # noqa - dict(params=["p2", "p6"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p3"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p4"], lr=1.0, weight_decay=3.0), # noqa - dict(params=["p5"], lr=2.0, momentum=2.0), # noqa - ] - gt_groups = [ - { - "lr": 1.0, - "weight_decay": 4.0, - "params": ["p1"], - }, - { - "lr": 2.0, - "weight_decay": 3.0, - "momentum": 2.0, - "params": ["p2", "p6", "p3"], - }, - { - "lr": 1.0, - "weight_decay": 3.0, - "params": ["p4"], - }, - { - "lr": 2.0, - "momentum": 
2.0, - "params": ["p5"], - }, - ] - out = reduce_param_groups(params) - self.assertEqual(out, gt_groups) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_visualizer.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_visualizer.py deleted file mode 100755 index 1005000f..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tests/test_visualizer.py +++ /dev/null @@ -1,278 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import numpy as np -import os -import tempfile -import unittest -import cv2 -import torch - -from detectron2.data import MetadataCatalog -from detectron2.structures import BoxMode, Instances, RotatedBoxes -from detectron2.utils.visualizer import ColorMode, Visualizer - - -class TestVisualizer(unittest.TestCase): - def _random_data(self): - H, W = 100, 100 - N = 10 - img = np.random.rand(H, W, 3) * 255 - boxxy = np.random.rand(N, 2) * (H // 2) - boxes = np.concatenate((boxxy, boxxy + H // 2), axis=1) - - def _rand_poly(): - return np.random.rand(3, 2).flatten() * H - - polygons = [[_rand_poly() for _ in range(np.random.randint(1, 5))] for _ in range(N)] - - mask = np.zeros_like(img[:, :, 0], dtype=np.bool) - mask[:40, 10:20] = 1 - - labels = [str(i) for i in range(N)] - return img, boxes, labels, polygons, [mask] * N - - @property - def metadata(self): - return MetadataCatalog.get("coco_2017_train") - - def test_draw_dataset_dict(self): - img = np.random.rand(512, 512, 3) * 255 - dic = { - "annotations": [ - { - "bbox": [ - 368.9946492271106, - 330.891438763377, - 13.148537455410235, - 13.644708680142685, - ], - "bbox_mode": BoxMode.XYWH_ABS, - "category_id": 0, - "iscrowd": 1, - "segmentation": { - "counts": "_jh52m?2N2N2N2O100O10O001N1O2MceP2", - "size": [512, 512], - }, - } - ], - "height": 512, - "image_id": 1, - "width": 512, - } - v = Visualizer(img) - v.draw_dataset_dict(dic) - - v = Visualizer(img, self.metadata) - v.draw_dataset_dict(dic) - - def test_draw_rotated_dataset_dict(self): - img = np.random.rand(512, 512, 3) * 255 - dic = { - "annotations": [ - { - "bbox": [ - 368.9946492271106, - 330.891438763377, - 13.148537455410235, - 13.644708680142685, - 45.0, - ], - "bbox_mode": BoxMode.XYWHA_ABS, - "category_id": 0, - "iscrowd": 1, - } - ], - "height": 512, - "image_id": 1, - "width": 512, - } - v = Visualizer(img, self.metadata) - v.draw_dataset_dict(dic) - - def test_overlay_instances(self): - img, boxes, labels, polygons, masks = self._random_data() - - v = Visualizer(img, self.metadata) - output = v.overlay_instances(masks=polygons, boxes=boxes, labels=labels).get_image() - self.assertEqual(output.shape, img.shape) - - # Test 2x scaling - v = Visualizer(img, self.metadata, scale=2.0) - output = v.overlay_instances(masks=polygons, boxes=boxes, labels=labels).get_image() - self.assertEqual(output.shape[0], img.shape[0] * 2) - - # Test overlay masks - v = Visualizer(img, self.metadata) - output = v.overlay_instances(masks=masks, boxes=boxes, labels=labels).get_image() - self.assertEqual(output.shape, img.shape) - - def test_overlay_instances_no_boxes(self): - img, boxes, labels, polygons, _ = self._random_data() - v = Visualizer(img, self.metadata) - v.overlay_instances(masks=polygons, boxes=None, labels=labels).get_image() - - def test_draw_instance_predictions(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, 
size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - inst.pred_masks = torch.from_numpy(np.asarray(masks)) - - v = Visualizer(img) - v.draw_instance_predictions(inst) - - v = Visualizer(img, self.metadata) - v.draw_instance_predictions(inst) - - def test_BWmode_nomask(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - - v = Visualizer(img, self.metadata, instance_mode=ColorMode.IMAGE_BW) - v.draw_instance_predictions(inst) - - # check that output is grayscale - inst = inst[:0] - v = Visualizer(img, self.metadata, instance_mode=ColorMode.IMAGE_BW) - output = v.draw_instance_predictions(inst).get_image() - self.assertTrue(np.allclose(output[:, :, 0], output[:, :, 1])) - self.assertTrue(np.allclose(output[:, :, 0], output[:, :, 2])) - - def test_draw_empty_mask_predictions(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - inst.pred_masks = torch.from_numpy(np.zeros_like(np.asarray(masks))) - - v = Visualizer(img, self.metadata) - v.draw_instance_predictions(inst) - - def test_correct_output_shape(self): - img = np.random.rand(928, 928, 3) * 255 - v = Visualizer(img, self.metadata) - out = v.output.get_image() - self.assertEqual(out.shape, img.shape) - - def test_overlay_rotated_instances(self): - H, W = 100, 150 - img = np.random.rand(H, W, 3) * 255 - num_boxes = 50 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-0.1 * W, 1.1 * W) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-0.1 * H, 1.1 * H) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, max(W, H)) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, max(W, H)) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - rotated_boxes = RotatedBoxes(boxes_5d) - labels = [str(i) for i in range(num_boxes)] - - v = Visualizer(img, self.metadata) - output = v.overlay_instances(boxes=rotated_boxes, labels=labels).get_image() - self.assertEqual(output.shape, img.shape) - - def test_draw_no_metadata(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - inst.pred_masks = torch.from_numpy(np.asarray(masks)) - - v = Visualizer(img, MetadataCatalog.get("asdfasdf")) - v.draw_instance_predictions(inst) - - def test_draw_binary_mask(self): - img, boxes, _, _, masks = self._random_data() - img[:, :, 0] = 0 # remove red color - mask = masks[0] - mask_with_hole = np.zeros_like(mask).astype("uint8") - mask_with_hole = cv2.rectangle(mask_with_hole, (10, 10), (50, 50), 1, 5) - - for m in [mask, mask_with_hole]: - for save in [True, False]: - v = Visualizer(img) - o = v.draw_binary_mask(m, color="red", text="test") - if save: - with tempfile.TemporaryDirectory(prefix="detectron2_viz") as d: - path = os.path.join(d, "output.png") - o.save(path) - o = cv2.imread(path)[:, :, ::-1] - else: - o = o.get_image().astype("float32") - # red color is drawn on the image - 
self.assertTrue(o[:, :, 0].sum() > 0)
-
-    def test_draw_soft_mask(self):
-        img = np.random.rand(100, 100, 3) * 255
-        img[:, :, 0] = 0  # remove red color
-        mask = np.zeros((100, 100), dtype=np.float32)
-        mask[30:50, 40:50] = 1.0
-        # keep the blurred result; otherwise the return value is discarded
-        mask = cv2.GaussianBlur(mask, (21, 21), 10)
-
-        v = Visualizer(img)
-        o = v.draw_soft_mask(mask, color="red", text="test")
-        o = o.get_image().astype("float32")
-        # red color is drawn on the image
-        self.assertTrue(o[:, :, 0].sum() > 0)
-
-        # test draw empty mask
-        v = Visualizer(img)
-        o = v.draw_soft_mask(np.zeros((100, 100), dtype=np.float32), color="red", text="test")
-        o = o.get_image().astype("float32")
-
-    def test_border_mask_with_holes(self):
-        H, W = 200, 200
-        img = np.zeros((H, W, 3))
-        img[:, :, 0] = 255.0
-        v = Visualizer(img, scale=3)
-
-        mask = np.zeros((H, W))
-        mask[:, 100:150] = 1
-        # create a hole, to trigger imshow
-        mask = cv2.rectangle(mask, (110, 110), (130, 130), 0, thickness=-1)
-        output = v.draw_binary_mask(mask, color="blue")
-        output = output.get_image()[:, :, ::-1]
-
-        first_row = {tuple(x.tolist()) for x in output[0]}
-        last_row = {tuple(x.tolist()) for x in output[-1]}
-        # Check quantization / off-by-1 error: the first and last row must have two colors
-        self.assertEqual(len(last_row), 2)
-        self.assertEqual(len(first_row), 2)
-        self.assertIn((0, 0, 255), last_row)
-        self.assertIn((0, 0, 255), first_row)
-
-    def test_border_polygons(self):
-        H, W = 200, 200
-        img = np.zeros((H, W, 3))
-        img[:, :, 0] = 255.0
-        v = Visualizer(img, scale=3)
-        mask = np.zeros((H, W))
-        mask[:, 100:150] = 1
-
-        output = v.draw_binary_mask(mask, color="blue")
-        output = output.get_image()[:, :, ::-1]
-
-        first_row = {tuple(x.tolist()) for x in output[0]}
-        last_row = {tuple(x.tolist()) for x in output[-1]}
-        # Check quantization / off-by-1 error:
-        # the first and last row must have >=2 colors, because the polygon
-        # touches both rows
-        self.assertGreaterEqual(len(last_row), 2)
-        self.assertGreaterEqual(len(first_row), 2)
-        self.assertIn((0, 0, 255), last_row)
-        self.assertIn((0, 0, 255), first_row)
-
-
-if __name__ == "__main__":
-    unittest.main()
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/README.md
deleted file mode 100755
index 0b40d531..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/README.md
+++ /dev/null
@@ -1,49 +0,0 @@
-
-This directory contains a few example scripts that demonstrate features of detectron2.
-
-
-* `train_net.py`
-
-An example training script for training the builtin models of detectron2.
-
-For usage, see [GETTING_STARTED.md](../GETTING_STARTED.md).
-
-* `plain_train_net.py`
-
-Similar to `train_net.py`, but implements a training loop instead of using `Trainer`.
-This script includes fewer features but it may be more friendly to hackers.
-
-* `benchmark.py`
-
-Benchmark the training speed, inference speed, or data loading speed of a given config.
-
-Usage:
-```
-python benchmark.py --config-file config.yaml --task train/eval/data [optional DDP flags]
-```
-
-* `analyze_model.py`
-
-Analyze FLOPs, parameters, and activations of a detectron2 model. See its `--help` for usage.
-
-* `visualize_json_results.py`
-
-Visualize the JSON instance detection/segmentation results dumped by `COCOEvaluator` or `LVISEvaluator`.
-
-Usage:
-```
-python visualize_json_results.py --input x.json --output dir/ --dataset coco_2017_val
-```
-If you're not using a builtin dataset, you'll need to write your own script or modify this one.
-
-* `visualize_data.py`
-
-Visualize ground truth raw annotations or training data (after preprocessing/augmentations).
-
-Usage:
-```
-python visualize_data.py --config-file config.yaml --source annotation/dataloader --output-dir dir/ [--show]
-```
-
-NOTE: the script does not stop by itself when using `--source dataloader` because a training
-dataloader is usually infinite.
diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/analyze_model.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/analyze_model.py
deleted file mode 100755
index 8e38f8b7..00000000
--- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/analyze_model.py
+++ /dev/null
@@ -1,159 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import logging
-import numpy as np
-from collections import Counter
-import tqdm
-from fvcore.nn import flop_count_table  # can also try flop_count_str
-
-from detectron2.checkpoint import DetectionCheckpointer
-from detectron2.config import CfgNode, LazyConfig, get_cfg, instantiate
-from detectron2.data import build_detection_test_loader
-from detectron2.engine import default_argument_parser
-from detectron2.modeling import build_model
-from detectron2.utils.analysis import (
-    FlopCountAnalysis,
-    activation_count_operators,
-    parameter_count_table,
-)
-from detectron2.utils.logger import setup_logger
-
-logger = logging.getLogger("detectron2")
-
-
-def setup(args):
-    if args.config_file.endswith(".yaml"):
-        cfg = get_cfg()
-        cfg.merge_from_file(args.config_file)
-        cfg.DATALOADER.NUM_WORKERS = 0
-        cfg.merge_from_list(args.opts)
-        cfg.freeze()
-    else:
-        cfg = LazyConfig.load(args.config_file)
-        cfg = LazyConfig.apply_overrides(cfg, args.opts)
-    setup_logger(name="fvcore")
-    setup_logger()
-    return cfg
-
-
-def do_flop(cfg):
-    if isinstance(cfg, CfgNode):
-        data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0])
-        model = build_model(cfg)
-        DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS)
-    else:
-        data_loader = instantiate(cfg.dataloader.test)
-        model = instantiate(cfg.model)
-        model.to(cfg.train.device)
-        DetectionCheckpointer(model).load(cfg.train.init_checkpoint)
-    model.eval()
-
-    counts = Counter()
-    total_flops = []
-    for idx, data in zip(tqdm.trange(args.num_inputs), data_loader):  # noqa
-        flops = FlopCountAnalysis(model, data)
-        if idx > 0:
-            flops.unsupported_ops_warnings(False).uncalled_modules_warnings(False)
-        counts += flops.by_operator()
-        total_flops.append(flops.total())
-
-    logger.info("Flops table computed from only one input sample:\n" + flop_count_table(flops))
-    logger.info(
-        "Average GFlops for each type of operators:\n"
-        + str([(k, v / (idx + 1) / 1e9) for k, v in counts.items()])
-    )
-    logger.info(
-        "Total GFlops: {:.1f}±{:.1f}".format(np.mean(total_flops) / 1e9, np.std(total_flops) / 1e9)
-    )
-
-
-def do_activation(cfg):
-    if isinstance(cfg, CfgNode):
-        data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0])
-        model = build_model(cfg)
-        DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS)
-    else:
-        data_loader = instantiate(cfg.dataloader.test)
-        model = instantiate(cfg.model)
-        model.to(cfg.train.device)
-        
DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - model.eval() - - counts = Counter() - total_activations = [] - for idx, data in zip(tqdm.trange(args.num_inputs), data_loader): # noqa - count = activation_count_operators(model, data) - counts += count - total_activations.append(sum(count.values())) - logger.info( - "(Million) Activations for Each Type of Operators:\n" - + str([(k, v / idx) for k, v in counts.items()]) - ) - logger.info( - "Total (Million) Activations: {}±{}".format( - np.mean(total_activations), np.std(total_activations) - ) - ) - - -def do_parameter(cfg): - if isinstance(cfg, CfgNode): - model = build_model(cfg) - else: - model = instantiate(cfg.model) - logger.info("Parameter Count:\n" + parameter_count_table(model, max_depth=5)) - - -def do_structure(cfg): - if isinstance(cfg, CfgNode): - model = build_model(cfg) - else: - model = instantiate(cfg.model) - logger.info("Model Structure:\n" + str(model)) - - -if __name__ == "__main__": - parser = default_argument_parser( - epilog=""" -Examples: - -To show parameters of a model: -$ ./analyze_model.py --tasks parameter \\ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml - -Flops and activations are data-dependent, therefore inputs and model weights -are needed to count them: - -$ ./analyze_model.py --num-inputs 100 --tasks flop \\ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \\ - MODEL.WEIGHTS /path/to/model.pkl -""" - ) - parser.add_argument( - "--tasks", - choices=["flop", "activation", "parameter", "structure"], - required=True, - nargs="+", - ) - parser.add_argument( - "-n", - "--num-inputs", - default=100, - type=int, - help="number of inputs used to compute statistics for flops/activations, " - "both are data dependent.", - ) - args = parser.parse_args() - assert not args.eval_only - assert args.num_gpus == 1 - - cfg = setup(args) - - for task in args.tasks: - { - "flop": do_flop, - "activation": do_activation, - "parameter": do_parameter, - "structure": do_structure, - }[task](cfg) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/benchmark.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/benchmark.py deleted file mode 100755 index aaac5640..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/benchmark.py +++ /dev/null @@ -1,197 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -A script to benchmark builtin models. - -Note: this script has an extra dependency of psutil. 
-""" - -import itertools -import logging -import psutil -import torch -import tqdm -from fvcore.common.timer import Timer -from torch.nn.parallel import DistributedDataParallel - -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import LazyConfig, get_cfg, instantiate -from detectron2.data import ( - DatasetFromList, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.data.benchmark import DataLoaderBenchmark -from detectron2.engine import AMPTrainer, SimpleTrainer, default_argument_parser, hooks, launch -from detectron2.modeling import build_model -from detectron2.solver import build_optimizer -from detectron2.utils import comm -from detectron2.utils.collect_env import collect_env_info -from detectron2.utils.events import CommonMetricPrinter -from detectron2.utils.logger import setup_logger - -logger = logging.getLogger("detectron2") - - -def setup(args): - if args.config_file.endswith(".yaml"): - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.SOLVER.BASE_LR = 0.001 # Avoid NaNs. Not useful in this script anyway. - cfg.merge_from_list(args.opts) - cfg.freeze() - else: - cfg = LazyConfig.load(args.config_file) - cfg = LazyConfig.apply_overrides(cfg, args.opts) - setup_logger(distributed_rank=comm.get_rank()) - return cfg - - -def create_data_benchmark(cfg, args): - if args.config_file.endswith(".py"): - dl_cfg = cfg.dataloader.train - dl_cfg._target_ = DataLoaderBenchmark - return instantiate(dl_cfg) - else: - kwargs = build_detection_train_loader.from_config(cfg) - kwargs.pop("aspect_ratio_grouping", None) - kwargs["_target_"] = DataLoaderBenchmark - return instantiate(kwargs) - - -def RAM_msg(): - vram = psutil.virtual_memory() - return "RAM Usage: {:.2f}/{:.2f} GB".format( - (vram.total - vram.available) / 1024 ** 3, vram.total / 1024 ** 3 - ) - - -def benchmark_data(args): - cfg = setup(args) - logger.info("After spawning " + RAM_msg()) - - benchmark = create_data_benchmark(cfg, args) - benchmark.benchmark_distributed(250, 10) - # test for a few more rounds - for k in range(10): - logger.info(f"Iteration {k} " + RAM_msg()) - benchmark.benchmark_distributed(250, 1) - - -def benchmark_data_advanced(args): - # benchmark dataloader with more details to help analyze performance bottleneck - cfg = setup(args) - benchmark = create_data_benchmark(cfg, args) - - if comm.get_rank() == 0: - benchmark.benchmark_dataset(100) - benchmark.benchmark_mapper(100) - benchmark.benchmark_workers(100, warmup=10) - benchmark.benchmark_IPC(100, warmup=10) - if comm.get_world_size() > 1: - benchmark.benchmark_distributed(100) - logger.info("Rerun ...") - benchmark.benchmark_distributed(100) - - -def benchmark_train(args): - cfg = setup(args) - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if comm.get_world_size() > 1: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False - ) - optimizer = build_optimizer(cfg, model) - checkpointer = DetectionCheckpointer(model, optimizer=optimizer) - checkpointer.load(cfg.MODEL.WEIGHTS) - - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 2 - data_loader = build_detection_train_loader(cfg) - dummy_data = list(itertools.islice(data_loader, 100)) - - def f(): - data = DatasetFromList(dummy_data, copy=False, serialize=False) - while True: - yield from data - - max_iter = 400 - trainer = (AMPTrainer if cfg.SOLVER.AMP.ENABLED else SimpleTrainer)(model, f(), optimizer) - trainer.register_hooks( - [ - hooks.IterationTimer(), - 
hooks.PeriodicWriter([CommonMetricPrinter(max_iter)]), - hooks.TorchProfiler( - lambda trainer: trainer.iter == max_iter - 1, cfg.OUTPUT_DIR, save_tensorboard=True - ), - ] - ) - trainer.train(1, max_iter) - - -@torch.no_grad() -def benchmark_eval(args): - cfg = setup(args) - if args.config_file.endswith(".yaml"): - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 0 - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - else: - model = instantiate(cfg.model) - model.to(cfg.train.device) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - - cfg.dataloader.num_workers = 0 - data_loader = instantiate(cfg.dataloader.test) - - model.eval() - logger.info("Model:\n{}".format(model)) - dummy_data = DatasetFromList(list(itertools.islice(data_loader, 100)), copy=False) - - def f(): - while True: - yield from dummy_data - - for k in range(5): # warmup - model(dummy_data[k]) - - max_iter = 300 - timer = Timer() - with tqdm.tqdm(total=max_iter) as pbar: - for idx, d in enumerate(f()): - if idx == max_iter: - break - model(d) - pbar.update() - logger.info("{} iters in {} seconds.".format(max_iter, timer.seconds())) - - -if __name__ == "__main__": - parser = default_argument_parser() - parser.add_argument("--task", choices=["train", "eval", "data", "data_advanced"], required=True) - args = parser.parse_args() - assert not args.eval_only - - logger.info("Environment info:\n" + collect_env_info()) - if "data" in args.task: - print("Initial " + RAM_msg()) - if args.task == "data": - f = benchmark_data - if args.task == "data_advanced": - f = benchmark_data_advanced - elif args.task == "train": - """ - Note: training speed may not be representative. - The training cost of a R-CNN model varies with the content of the data - and the quality of the model. - """ - f = benchmark_train - elif args.task == "eval": - f = benchmark_eval - # only benchmark single-GPU inference. - assert args.num_gpus == 1 and args.num_machines == 1 - launch(f, args.num_gpus, args.num_machines, args.machine_rank, args.dist_url, args=(args,)) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py deleted file mode 100755 index 4b827d96..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/convert-torchvision-to-d2.py +++ /dev/null @@ -1,56 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. - -import pickle as pkl -import sys -import torch - -""" -Usage: - # download one of the ResNet{18,34,50,101,152} models from torchvision: - wget https://download.pytorch.org/models/resnet50-19c8e357.pth -O r50.pth - # run the conversion - ./convert-torchvision-to-d2.py r50.pth r50.pkl - - # Then, use r50.pkl with the following changes in config: - -MODEL: - WEIGHTS: "/path/to/r50.pkl" - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.120, 57.375] - RESNETS: - DEPTH: 50 - STRIDE_IN_1X1: False -INPUT: - FORMAT: "RGB" - - These models typically produce slightly worse results than the - pre-trained ResNets we use in official configs, which are the - original ResNet models released by MSRA. -""" - -if __name__ == "__main__": - input = sys.argv[1] - - obj = torch.load(input, map_location="cpu") - - newmodel = {} - for k in list(obj.keys()): - old_k = k - if "layer" not in k: - k = "stem." 
+ k - for t in [1, 2, 3, 4]: - k = k.replace("layer{}".format(t), "res{}".format(t + 1)) - for t in [1, 2, 3]: - k = k.replace("bn{}".format(t), "conv{}.norm".format(t)) - k = k.replace("downsample.0", "shortcut") - k = k.replace("downsample.1", "shortcut.norm") - print(old_k, "->", k) - newmodel[k] = obj.pop(old_k).detach().numpy() - - res = {"model": newmodel, "__author__": "torchvision", "matching_heuristics": True} - - with open(sys.argv[2], "wb") as f: - pkl.dump(res, f) - if obj: - print("Unconverted keys:", obj.keys()) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt deleted file mode 100755 index 80dae125..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/CMakeLists.txt +++ /dev/null @@ -1,15 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# See https://pytorch.org/tutorials/advanced/cpp_frontend.html -cmake_minimum_required(VERSION 3.12 FATAL_ERROR) -project(torchscript_mask_rcnn) - -find_package(Torch REQUIRED) -find_package(OpenCV REQUIRED) -find_package(TorchVision REQUIRED) # needed by export-method=tracing/scripting - -add_executable(torchscript_mask_rcnn torchscript_mask_rcnn.cpp) -target_link_libraries( - torchscript_mask_rcnn - -Wl,--no-as-needed TorchVision::TorchVision -Wl,--as-needed - "${TORCH_LIBRARIES}" ${OpenCV_LIBS}) -set_property(TARGET torchscript_mask_rcnn PROPERTY CXX_STANDARD 14) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/README.md b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/README.md deleted file mode 100755 index e33cbeb5..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/README.md +++ /dev/null @@ -1,66 +0,0 @@ -See [deployment tutorial](https://detectron2.readthedocs.io/tutorials/deployment.html) -for some high-level background about deployment. - -This directory contains the following examples: - -1. An example script `export_model.py` - that exports a detectron2 model for deployment using different methods and formats. - -2. A C++ example that runs inference with Mask R-CNN model in TorchScript format. - -## Build -Deployment depends on libtorch and OpenCV. Some require more dependencies: - -* Running TorchScript-format models produced by `--export-method=caffe2_tracing` requires libtorch - to be built with caffe2 enabled. -* Running TorchScript-format models produced by `--export-method=tracing/scripting` requires libtorchvision (C++ library of torchvision). - -All methods are supported in one C++ file that requires all the above dependencies. -Adjust it and remove code you don't need. -As a reference, we provide a [Dockerfile](../../docker/deploy.Dockerfile) that installs all the above dependencies and builds the C++ example. - -## Use - -We show a few example commands to export and execute a Mask R-CNN model in C++. 
- -* `export-method=tracing, format=torchscript`: -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method tracing --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - MODEL.DEVICE cuda - -./build/torchscript_mask_rcnn output/model.ts input.jpg tracing -``` - -* `export-method=scripting, format=torchscript`: -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method scripting --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - -./build/torchscript_mask_rcnn output/model.ts input.jpg scripting -``` - -* `export-method=caffe2_tracing, format=torchscript`: - -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method caffe2_tracing --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - -./build/torchscript_mask_rcnn output/model.ts input.jpg caffe2_tracing -``` - - -## Notes: - -1. Tracing/Caffe2-tracing requires valid weights & sample inputs. - Therefore the above commands require pre-trained models and [COCO dataset](https://detectron2.readthedocs.io/tutorials/builtin_datasets.html). - You can modify the script to obtain sample inputs in other ways instead of from COCO. - -2. `--run-eval` is implemented only for tracing mode - to evaluate the exported model using the dataset in the config. - It's recommended to always verify the accuracy in case the conversion is not successful. - Evaluation can be slow if model is exported to CPU or dataset is too large ("coco_2017_val_100" is a small subset of COCO useful for evaluation). - `caffe2_tracing` accuracy may be slightly different (within 0.1 AP) from original model due to numerical precisions between different runtime. diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/export_model.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/export_model.py deleted file mode 100755 index bb1bcee6..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/export_model.py +++ /dev/null @@ -1,235 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. 
-import argparse -import os -from typing import Dict, List, Tuple -import torch -from torch import Tensor, nn - -import detectron2.data.transforms as T -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import get_cfg -from detectron2.data import build_detection_test_loader, detection_utils -from detectron2.evaluation import COCOEvaluator, inference_on_dataset, print_csv_format -from detectron2.export import TracingAdapter, dump_torchscript_IR, scripting_with_instances -from detectron2.modeling import GeneralizedRCNN, RetinaNet, build_model -from detectron2.modeling.postprocessing import detector_postprocess -from detectron2.projects.point_rend import add_pointrend_config -from detectron2.structures import Boxes -from detectron2.utils.env import TORCH_VERSION -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger - - -def setup_cfg(args): - cfg = get_cfg() - # cuda context is initialized before creating dataloader, so we don't fork anymore - cfg.DATALOADER.NUM_WORKERS = 0 - add_pointrend_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - return cfg - - -def export_caffe2_tracing(cfg, torch_model, inputs): - from detectron2.export import Caffe2Tracer - - tracer = Caffe2Tracer(cfg, torch_model, inputs) - if args.format == "caffe2": - caffe2_model = tracer.export_caffe2() - caffe2_model.save_protobuf(args.output) - # draw the caffe2 graph - caffe2_model.save_graph(os.path.join(args.output, "model.svg"), inputs=inputs) - return caffe2_model - elif args.format == "onnx": - import onnx - - onnx_model = tracer.export_onnx() - onnx.save(onnx_model, os.path.join(args.output, "model.onnx")) - elif args.format == "torchscript": - ts_model = tracer.export_torchscript() - with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f: - torch.jit.save(ts_model, f) - dump_torchscript_IR(ts_model, args.output) - - -# experimental. API not yet final -def export_scripting(torch_model): - assert TORCH_VERSION >= (1, 8) - fields = { - "proposal_boxes": Boxes, - "objectness_logits": Tensor, - "pred_boxes": Boxes, - "scores": Tensor, - "pred_classes": Tensor, - "pred_masks": Tensor, - "pred_keypoints": torch.Tensor, - "pred_keypoint_heatmaps": torch.Tensor, - } - assert args.format == "torchscript", "Scripting only supports torchscript format." - - class ScriptableAdapterBase(nn.Module): - # Use this adapter to workaround https://github.com/pytorch/pytorch/issues/46944 - # by not retuning instances but dicts. Otherwise the exported model is not deployable - def __init__(self): - super().__init__() - self.model = torch_model - self.eval() - - if isinstance(torch_model, GeneralizedRCNN): - - class ScriptableAdapter(ScriptableAdapterBase): - def forward(self, inputs: Tuple[Dict[str, torch.Tensor]]) -> List[Dict[str, Tensor]]: - instances = self.model.inference(inputs, do_postprocess=False) - return [i.get_fields() for i in instances] - - else: - - class ScriptableAdapter(ScriptableAdapterBase): - def forward(self, inputs: Tuple[Dict[str, torch.Tensor]]) -> List[Dict[str, Tensor]]: - instances = self.model(inputs) - return [i.get_fields() for i in instances] - - ts_model = scripting_with_instances(ScriptableAdapter(), fields) - with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f: - torch.jit.save(ts_model, f) - dump_torchscript_IR(ts_model, args.output) - # TODO inference in Python now missing postprocessing glue code - return None - - -# experimental. 
API not yet final -def export_tracing(torch_model, inputs): - assert TORCH_VERSION >= (1, 8) - image = inputs[0]["image"] - inputs = [{"image": image}] # remove other unused keys - - if isinstance(torch_model, GeneralizedRCNN): - - def inference(model, inputs): - # use do_postprocess=False so it returns ROI mask - inst = model.inference(inputs, do_postprocess=False)[0] - return [{"instances": inst}] - - else: - inference = None # assume that we just call the model directly - - traceable_model = TracingAdapter(torch_model, inputs, inference) - - if args.format == "torchscript": - ts_model = torch.jit.trace(traceable_model, (image,)) - with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f: - torch.jit.save(ts_model, f) - dump_torchscript_IR(ts_model, args.output) - elif args.format == "onnx": - with PathManager.open(os.path.join(args.output, "model.onnx"), "wb") as f: - torch.onnx.export(traceable_model, (image,), f, opset_version=11) - logger.info("Inputs schema: " + str(traceable_model.inputs_schema)) - logger.info("Outputs schema: " + str(traceable_model.outputs_schema)) - - if args.format != "torchscript": - return None - if not isinstance(torch_model, (GeneralizedRCNN, RetinaNet)): - return None - - def eval_wrapper(inputs): - """ - The exported model does not contain the final resize step, which is typically - unused in deployment but needed for evaluation. We add it manually here. - """ - input = inputs[0] - instances = traceable_model.outputs_schema(ts_model(input["image"]))[0]["instances"] - postprocessed = detector_postprocess(instances, input["height"], input["width"]) - return [{"instances": postprocessed}] - - return eval_wrapper - - -def get_sample_inputs(args): - - if args.sample_image is None: - # get a first batch from dataset - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - first_batch = next(iter(data_loader)) - return first_batch - else: - # get a sample data - original_image = detection_utils.read_image(args.sample_image, format=cfg.INPUT.FORMAT) - # Do same preprocessing as DefaultPredictor - aug = T.ResizeShortestEdge( - [cfg.INPUT.MIN_SIZE_TEST, cfg.INPUT.MIN_SIZE_TEST], cfg.INPUT.MAX_SIZE_TEST - ) - height, width = original_image.shape[:2] - image = aug.get_transform(original_image).apply_image(original_image) - image = torch.as_tensor(image.astype("float32").transpose(2, 0, 1)) - - inputs = {"image": image, "height": height, "width": width} - - # Sample ready - sample_inputs = [inputs] - return sample_inputs - - -if __name__ == "__main__": - parser = argparse.ArgumentParser(description="Export a model for deployment.") - parser.add_argument( - "--format", - choices=["caffe2", "onnx", "torchscript"], - help="output format", - default="torchscript", - ) - parser.add_argument( - "--export-method", - choices=["caffe2_tracing", "tracing", "scripting"], - help="Method to export models", - default="tracing", - ) - parser.add_argument("--config-file", default="", metavar="FILE", help="path to config file") - parser.add_argument("--sample-image", default=None, type=str, help="sample image for input") - parser.add_argument("--run-eval", action="store_true") - parser.add_argument("--output", help="output directory for the converted model") - parser.add_argument( - "opts", - help="Modify config options using the command-line", - default=None, - nargs=argparse.REMAINDER, - ) - args = parser.parse_args() - logger = setup_logger() - logger.info("Command line arguments: " + str(args)) - PathManager.mkdirs(args.output) - # Disable 
respecialization on new shapes. Otherwise --run-eval will be slow - torch._C._jit_set_bailout_depth(1) - - cfg = setup_cfg(args) - - # create a torch model - torch_model = build_model(cfg) - DetectionCheckpointer(torch_model).resume_or_load(cfg.MODEL.WEIGHTS) - torch_model.eval() - - # get sample data - sample_inputs = get_sample_inputs(args) - - # convert and save model - if args.export_method == "caffe2_tracing": - exported_model = export_caffe2_tracing(cfg, torch_model, sample_inputs) - elif args.export_method == "scripting": - exported_model = export_scripting(torch_model) - elif args.export_method == "tracing": - exported_model = export_tracing(torch_model, sample_inputs) - - # run evaluation with the converted model - if args.run_eval: - assert exported_model is not None, ( - "Python inference is not yet implemented for " - f"export_method={args.export_method}, format={args.format}." - ) - logger.info("Running evaluation ... this takes a long time if you export to CPU.") - dataset = cfg.DATASETS.TEST[0] - data_loader = build_detection_test_loader(cfg, dataset) - # NOTE: hard-coded evaluator. change to the evaluator for your dataset - evaluator = COCOEvaluator(dataset, output_dir=args.output) - metrics = inference_on_dataset(exported_model, data_loader, evaluator) - print_csv_format(metrics) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp deleted file mode 100755 index b40f13b8..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp +++ /dev/null @@ -1,187 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -// @lint-ignore-every CLANGTIDY -// This is an example code that demonstrates how to run inference -// with a torchscript format Mask R-CNN model exported by ./export_model.py -// using export method=tracing, caffe2_tracing & scripting. - -#include -#include -#include - -#include -#include -#include -#include - -// only needed for export_method=tracing -#include // @oss-only -// @fb-only: #include - -using namespace std; - -c10::IValue get_caffe2_tracing_inputs(cv::Mat& img, c10::Device device) { - const int height = img.rows; - const int width = img.cols; - // FPN models require divisibility of 32. - // Tracing mode does padding inside the graph, but caffe2_tracing does not. 
- assert(height % 32 == 0 && width % 32 == 0); - const int channels = 3; - - auto input = - torch::from_blob(img.data, {1, height, width, channels}, torch::kUInt8); - // NHWC to NCHW - input = input.to(device, torch::kFloat).permute({0, 3, 1, 2}).contiguous(); - - std::array im_info_data{height * 1.0f, width * 1.0f, 1.0f}; - auto im_info = - torch::from_blob(im_info_data.data(), {1, 3}).clone().to(device); - return std::make_tuple(input, im_info); -} - -c10::IValue get_tracing_inputs(cv::Mat& img, c10::Device device) { - const int height = img.rows; - const int width = img.cols; - const int channels = 3; - - auto input = - torch::from_blob(img.data, {height, width, channels}, torch::kUInt8); - // HWC to CHW - input = input.to(device, torch::kFloat).permute({2, 0, 1}).contiguous(); - return input; -} - -// create a Tuple[Dict[str, Tensor]] which is the input type of scripted model -c10::IValue get_scripting_inputs(cv::Mat& img, c10::Device device) { - const int height = img.rows; - const int width = img.cols; - const int channels = 3; - - auto img_tensor = - torch::from_blob(img.data, {height, width, channels}, torch::kUInt8); - // HWC to CHW - img_tensor = - img_tensor.to(device, torch::kFloat).permute({2, 0, 1}).contiguous(); - auto dic = c10::Dict(); - dic.insert("image", img_tensor); - return std::make_tuple(dic); -} - -c10::IValue -get_inputs(std::string export_method, cv::Mat& img, c10::Device device) { - // Given an image, create inputs in the format required by the model. - if (export_method == "tracing") - return get_tracing_inputs(img, device); - if (export_method == "caffe2_tracing") - return get_caffe2_tracing_inputs(img, device); - if (export_method == "scripting") - return get_scripting_inputs(img, device); - abort(); -} - -struct MaskRCNNOutputs { - at::Tensor pred_boxes, pred_classes, pred_masks, scores; - int num_instances() const { - return pred_boxes.sizes()[0]; - } -}; - -MaskRCNNOutputs get_outputs(std::string export_method, c10::IValue outputs) { - // Given outputs of the model, extract tensors from it to turn into a - // common MaskRCNNOutputs format. - if (export_method == "tracing") { - auto out_tuple = outputs.toTuple()->elements(); - // They are ordered alphabetically by their field name in Instances - return MaskRCNNOutputs{ - out_tuple[0].toTensor(), - out_tuple[1].toTensor(), - out_tuple[2].toTensor(), - out_tuple[3].toTensor()}; - } - if (export_method == "caffe2_tracing") { - auto out_tuple = outputs.toTuple()->elements(); - // A legacy order used by caffe2 models - return MaskRCNNOutputs{ - out_tuple[0].toTensor(), - out_tuple[2].toTensor(), - out_tuple[3].toTensor(), - out_tuple[1].toTensor()}; - } - if (export_method == "scripting") { - // With the ScriptableAdapter defined in export_model.py, the output is - // List[Dict[str, Any]]. - auto out_dict = outputs.toList().get(0).toGenericDict(); - return MaskRCNNOutputs{ - out_dict.at("pred_boxes").toTensor(), - out_dict.at("pred_classes").toTensor(), - out_dict.at("pred_masks").toTensor(), - out_dict.at("scores").toTensor()}; - } - abort(); -} - -int main(int argc, const char* argv[]) { - if (argc != 4) { - cerr << R"xx( -Usage: - ./torchscript_mask_rcnn model.ts input.jpg EXPORT_METHOD - - EXPORT_METHOD can be "tracing", "caffe2_tracing" or "scripting". 
-)xx"; - return 1; - } - std::string image_file = argv[2]; - std::string export_method = argv[3]; - assert( - export_method == "caffe2_tracing" || export_method == "tracing" || - export_method == "scripting"); - - torch::jit::getBailoutDepth() = 1; - torch::autograd::AutoGradMode guard(false); - auto module = torch::jit::load(argv[1]); - - assert(module.buffers().size() > 0); - // Assume that the entire model is on the same device. - // We just put input to this device. - auto device = (*begin(module.buffers())).device(); - - cv::Mat input_img = cv::imread(image_file, cv::IMREAD_COLOR); - auto inputs = get_inputs(export_method, input_img, device); - - // Run the network - auto output = module.forward({inputs}); - if (device.is_cuda()) - c10::cuda::getCurrentCUDAStream().synchronize(); - - // run 3 more times to benchmark - int N_benchmark = 3, N_warmup = 1; - auto start_time = chrono::high_resolution_clock::now(); - for (int i = 0; i < N_benchmark + N_warmup; ++i) { - if (i == N_warmup) - start_time = chrono::high_resolution_clock::now(); - output = module.forward({inputs}); - if (device.is_cuda()) - c10::cuda::getCurrentCUDAStream().synchronize(); - } - auto end_time = chrono::high_resolution_clock::now(); - auto ms = chrono::duration_cast(end_time - start_time) - .count(); - cout << "Latency (should vary with different inputs): " - << ms * 1.0 / 1e6 / N_benchmark << " seconds" << endl; - - // Parse Mask R-CNN outputs - auto rcnn_outputs = get_outputs(export_method, output); - cout << "Number of detected objects: " << rcnn_outputs.num_instances() - << endl; - - cout << "pred_boxes: " << rcnn_outputs.pred_boxes.toString() << " " - << rcnn_outputs.pred_boxes.sizes() << endl; - cout << "scores: " << rcnn_outputs.scores.toString() << " " - << rcnn_outputs.scores.sizes() << endl; - cout << "pred_classes: " << rcnn_outputs.pred_classes.toString() << " " - << rcnn_outputs.pred_classes.sizes() << endl; - cout << "pred_masks: " << rcnn_outputs.pred_masks.toString() << " " - << rcnn_outputs.pred_masks.sizes() << endl; - - cout << rcnn_outputs.pred_boxes << endl; - return 0; -} diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py deleted file mode 100755 index bb62d36c..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lazyconfig_train_net.py +++ /dev/null @@ -1,131 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Training script using the new "LazyConfig" python config files. - -This scripts reads a given python config file and runs the training or evaluation. -It can be used to train any models or dataset as long as they can be -instantiated by the recursive construction defined in the given config file. - -Besides lazy construction of models, dataloader, etc., this scripts expects a -few common configuration parameters currently defined in "configs/common/train.py". -To add more complicated training logic, you can easily add other configs -in the config file and implement a new train_net.py to handle them. 
-""" -import logging - -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import LazyConfig, instantiate -from detectron2.engine import ( - AMPTrainer, - SimpleTrainer, - default_argument_parser, - default_setup, - default_writers, - hooks, - launch, -) -from detectron2.engine.defaults import create_ddp_model -from detectron2.evaluation import inference_on_dataset, print_csv_format -from detectron2.utils import comm - -logger = logging.getLogger("detectron2") - - -def do_test(cfg, model): - if "evaluator" in cfg.dataloader: - ret = inference_on_dataset( - model, instantiate(cfg.dataloader.test), instantiate(cfg.dataloader.evaluator) - ) - print_csv_format(ret) - return ret - - -def do_train(args, cfg): - """ - Args: - cfg: an object with the following attributes: - model: instantiate to a module - dataloader.{train,test}: instantiate to dataloaders - dataloader.evaluator: instantiate to evaluator for test set - optimizer: instantaite to an optimizer - lr_multiplier: instantiate to a fvcore scheduler - train: other misc config defined in `configs/common/train.py`, including: - output_dir (str) - init_checkpoint (str) - amp.enabled (bool) - max_iter (int) - eval_period, log_period (int) - device (str) - checkpointer (dict) - ddp (dict) - """ - model = instantiate(cfg.model) - logger = logging.getLogger("detectron2") - logger.info("Model:\n{}".format(model)) - model.to(cfg.train.device) - - cfg.optimizer.params.model = model - optim = instantiate(cfg.optimizer) - - train_loader = instantiate(cfg.dataloader.train) - - model = create_ddp_model(model, **cfg.train.ddp) - trainer = (AMPTrainer if cfg.train.amp.enabled else SimpleTrainer)(model, train_loader, optim) - checkpointer = DetectionCheckpointer( - model, - cfg.train.output_dir, - trainer=trainer, - ) - trainer.register_hooks( - [ - hooks.IterationTimer(), - hooks.LRScheduler(scheduler=instantiate(cfg.lr_multiplier)), - hooks.PeriodicCheckpointer(checkpointer, **cfg.train.checkpointer) - if comm.is_main_process() - else None, - hooks.EvalHook(cfg.train.eval_period, lambda: do_test(cfg, model)), - hooks.PeriodicWriter( - default_writers(cfg.train.output_dir, cfg.train.max_iter), - period=cfg.train.log_period, - ) - if comm.is_main_process() - else None, - ] - ) - - checkpointer.resume_or_load(cfg.train.init_checkpoint, resume=args.resume) - if args.resume and checkpointer.has_checkpoint(): - # The checkpoint stores the training iteration that just finished, thus we start - # at the next iteration - start_iter = trainer.iter + 1 - else: - start_iter = 0 - trainer.train(start_iter, cfg.train.max_iter) - - -def main(args): - cfg = LazyConfig.load(args.config_file) - cfg = LazyConfig.apply_overrides(cfg, args.opts) - default_setup(cfg, args) - - if args.eval_only: - model = instantiate(cfg.model) - model.to(cfg.train.device) - model = create_ddp_model(model) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - print(do_test(cfg, model)) - else: - do_train(args, cfg) - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lightning_train_net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lightning_train_net.py deleted file mode 100755 index f6734b56..00000000 --- 
a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/lightning_train_net.py +++ /dev/null @@ -1,239 +0,0 @@ -#!/usr/bin/env python3 -# Copyright (c) Facebook, Inc. and its affiliates. -# Lightning Trainer should be considered beta at this point -# We have confirmed that training and validation run correctly and produce correct results -# Depending on how you launch the trainer, there are issues with processes terminating correctly -# This module is still dependent on D2 logging, but could be transferred to use Lightning logging - -import logging -import os -import time -import weakref -from collections import OrderedDict -from typing import Any, Dict, List - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import get_cfg -from detectron2.data import build_detection_test_loader, build_detection_train_loader -from detectron2.engine import ( - DefaultTrainer, - SimpleTrainer, - default_argument_parser, - default_setup, - default_writers, - hooks, -) -from detectron2.evaluation import print_csv_format -from detectron2.evaluation.testing import flatten_results_dict -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils.events import EventStorage -from detectron2.utils.logger import setup_logger - -import pytorch_lightning as pl # type: ignore -from pytorch_lightning import LightningDataModule, LightningModule -from train_net import build_evaluator - -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger("detectron2") - - -class TrainingModule(LightningModule): - def __init__(self, cfg): - super().__init__() - if not logger.isEnabledFor(logging.INFO): # setup_logger is not called for d2 - setup_logger() - self.cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - self.storage: EventStorage = None - self.model = build_model(self.cfg) - - self.start_iter = 0 - self.max_iter = cfg.SOLVER.MAX_ITER - - def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None: - checkpoint["iteration"] = self.storage.iter - - def on_load_checkpoint(self, checkpointed_state: Dict[str, Any]) -> None: - self.start_iter = checkpointed_state["iteration"] - self.storage.iter = self.start_iter - - def setup(self, stage: str): - if self.cfg.MODEL.WEIGHTS: - self.checkpointer = DetectionCheckpointer( - # Assume you want to save checkpoints together with logs/statistics - self.model, - self.cfg.OUTPUT_DIR, - ) - logger.info(f"Load model weights from checkpoint: {self.cfg.MODEL.WEIGHTS}.") - # Only load weights, use lightning checkpointing if you want to resume - self.checkpointer.load(self.cfg.MODEL.WEIGHTS) - - self.iteration_timer = hooks.IterationTimer() - self.iteration_timer.before_train() - self.data_start = time.perf_counter() - self.writers = None - - def training_step(self, batch, batch_idx): - data_time = time.perf_counter() - self.data_start - # Need to manually enter/exit since trainer may launch processes - # This ideally belongs in setup, but setup seems to run before processes are spawned - if self.storage is None: - self.storage = EventStorage(0) - self.storage.__enter__() - self.iteration_timer.trainer = weakref.proxy(self) - self.iteration_timer.before_step() - self.writers = ( - default_writers(self.cfg.OUTPUT_DIR, self.max_iter) - if comm.is_main_process() - else {} - ) - - loss_dict = self.model(batch) - SimpleTrainer.write_metrics(loss_dict, data_time) - - opt = self.optimizers() - self.storage.put_scalar( 
- "lr", opt.param_groups[self._best_param_group_id]["lr"], smoothing_hint=False - ) - self.iteration_timer.after_step() - self.storage.step() - # A little odd to put before step here, but it's the best way to get a proper timing - self.iteration_timer.before_step() - - if self.storage.iter % 20 == 0: - for writer in self.writers: - writer.write() - return sum(loss_dict.values()) - - def training_step_end(self, training_step_outpus): - self.data_start = time.perf_counter() - return training_step_outpus - - def training_epoch_end(self, training_step_outputs): - self.iteration_timer.after_train() - if comm.is_main_process(): - self.checkpointer.save("model_final") - for writer in self.writers: - writer.write() - writer.close() - self.storage.__exit__(None, None, None) - - def _process_dataset_evaluation_results(self) -> OrderedDict: - results = OrderedDict() - for idx, dataset_name in enumerate(self.cfg.DATASETS.TEST): - results[dataset_name] = self._evaluators[idx].evaluate() - if comm.is_main_process(): - print_csv_format(results[dataset_name]) - - if len(results) == 1: - results = list(results.values())[0] - return results - - def _reset_dataset_evaluators(self): - self._evaluators = [] - for dataset_name in self.cfg.DATASETS.TEST: - evaluator = build_evaluator(self.cfg, dataset_name) - evaluator.reset() - self._evaluators.append(evaluator) - - def on_validation_epoch_start(self, _outputs): - self._reset_dataset_evaluators() - - def validation_epoch_end(self, _outputs): - results = self._process_dataset_evaluation_results(_outputs) - - flattened_results = flatten_results_dict(results) - for k, v in flattened_results.items(): - try: - v = float(v) - except Exception as e: - raise ValueError( - "[EvalHook] eval_function should return a nested dict of float. 
" - "Got '{}: {}' instead.".format(k, v) - ) from e - self.storage.put_scalars(**flattened_results, smoothing_hint=False) - - def validation_step(self, batch, batch_idx: int, dataloader_idx: int = 0) -> None: - if not isinstance(batch, List): - batch = [batch] - outputs = self.model(batch) - self._evaluators[dataloader_idx].process(batch, outputs) - - def configure_optimizers(self): - optimizer = build_optimizer(self.cfg, self.model) - self._best_param_group_id = hooks.LRScheduler.get_best_param_group_id(optimizer) - scheduler = build_lr_scheduler(self.cfg, optimizer) - return [optimizer], [{"scheduler": scheduler, "interval": "step"}] - - -class DataModule(LightningDataModule): - def __init__(self, cfg): - super().__init__() - self.cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - - def train_dataloader(self): - return build_detection_train_loader(self.cfg) - - def val_dataloader(self): - dataloaders = [] - for dataset_name in self.cfg.DATASETS.TEST: - dataloaders.append(build_detection_test_loader(self.cfg, dataset_name)) - return dataloaders - - -def main(args): - cfg = setup(args) - train(cfg, args) - - -def train(cfg, args): - trainer_params = { - # training loop is bounded by max steps, use a large max_epochs to make - # sure max_steps is met first - "max_epochs": 10 ** 8, - "max_steps": cfg.SOLVER.MAX_ITER, - "val_check_interval": cfg.TEST.EVAL_PERIOD if cfg.TEST.EVAL_PERIOD > 0 else 10 ** 8, - "num_nodes": args.num_machines, - "gpus": args.num_gpus, - "num_sanity_val_steps": 0, - } - if cfg.SOLVER.AMP.ENABLED: - trainer_params["precision"] = 16 - - last_checkpoint = os.path.join(cfg.OUTPUT_DIR, "last.ckpt") - if args.resume: - # resume training from checkpoint - trainer_params["resume_from_checkpoint"] = last_checkpoint - logger.info(f"Resuming training from checkpoint: {last_checkpoint}.") - - trainer = pl.Trainer(**trainer_params) - logger.info(f"start to train with {args.num_machines} nodes and {args.num_gpus} GPUs") - - module = TrainingModule(cfg) - data_module = DataModule(cfg) - if args.eval_only: - logger.info("Running inference") - trainer.validate(module, data_module) - else: - logger.info("Running training") - trainer.fit(module, data_module) - - -def setup(args): - """ - Create configs and perform basic setups. - """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -if __name__ == "__main__": - parser = default_argument_parser() - args = parser.parse_args() - logger.info("Command Line Args:", args) - main(args) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/plain_train_net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/plain_train_net.py deleted file mode 100755 index 4851a839..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/plain_train_net.py +++ /dev/null @@ -1,223 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Detectron2 training script with a plain training loop. - -This script reads a given config file and runs the training or evaluation. -It is an entry point that is able to train standard models in detectron2. - -In order to let one script support training of many models, -this script contains logic that are specific to these built-in models and therefore -may not be suitable for your own project. -For example, your research project perhaps only needs a single "evaluator". 
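A concrete version of the "single evaluator" case: instead of the `get_evaluator` dispatch defined further down in this file, a project-specific script can wire one evaluator directly. This is a sketch only; `"my_dataset_val"` is a hypothetical dataset name you would register yourself, and COCO-format annotations are assumed.

```python
# Sketch: one hand-picked evaluator in place of the evaluator_type dispatch.
import os
from detectron2.data import build_detection_test_loader
from detectron2.evaluation import COCOEvaluator, inference_on_dataset

def evaluate_single(cfg, model):
    dataset_name = "my_dataset_val"  # hypothetical; register your own dataset first
    evaluator = COCOEvaluator(dataset_name, output_dir=os.path.join(cfg.OUTPUT_DIR, "inference"))
    loader = build_detection_test_loader(cfg, dataset_name)
    return inference_on_dataset(model, loader, evaluator)
```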
- -Therefore, we recommend you to use detectron2 as a library and take -this file as an example of how to use the library. -You may want to write your own script with your datasets and other customizations. - -Compared to "train_net.py", this script supports fewer default features. -It also includes fewer abstraction, therefore is easier to add custom logic. -""" - -import logging -import os -from collections import OrderedDict -import torch -from torch.nn.parallel import DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer, PeriodicCheckpointer -from detectron2.config import get_cfg -from detectron2.data import ( - MetadataCatalog, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.engine import default_argument_parser, default_setup, default_writers, launch -from detectron2.evaluation import ( - CityscapesInstanceEvaluator, - CityscapesSemSegEvaluator, - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - LVISEvaluator, - PascalVOCDetectionEvaluator, - SemSegEvaluator, - inference_on_dataset, - print_csv_format, -) -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils.events import EventStorage - -logger = logging.getLogger("detectron2") - - -def get_evaluator(cfg, dataset_name, output_folder=None): - """ - Create evaluator(s) for a given dataset. - This uses the special metadata "evaluator_type" associated with each builtin dataset. - For your own dataset, you can simply create an evaluator manually in your - script and do not have to worry about the hacky if-else logic here. - """ - if output_folder is None: - output_folder = os.path.join(cfg.OUTPUT_DIR, "inference") - evaluator_list = [] - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - if evaluator_type in ["sem_seg", "coco_panoptic_seg"]: - evaluator_list.append( - SemSegEvaluator( - dataset_name, - distributed=True, - output_dir=output_folder, - ) - ) - if evaluator_type in ["coco", "coco_panoptic_seg"]: - evaluator_list.append(COCOEvaluator(dataset_name, output_dir=output_folder)) - if evaluator_type == "coco_panoptic_seg": - evaluator_list.append(COCOPanopticEvaluator(dataset_name, output_folder)) - if evaluator_type == "cityscapes_instance": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesInstanceEvaluator(dataset_name) - if evaluator_type == "cityscapes_sem_seg": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." 
- return CityscapesSemSegEvaluator(dataset_name) - if evaluator_type == "pascal_voc": - return PascalVOCDetectionEvaluator(dataset_name) - if evaluator_type == "lvis": - return LVISEvaluator(dataset_name, cfg, True, output_folder) - if len(evaluator_list) == 0: - raise NotImplementedError( - "no Evaluator for the dataset {} with the type {}".format(dataset_name, evaluator_type) - ) - if len(evaluator_list) == 1: - return evaluator_list[0] - return DatasetEvaluators(evaluator_list) - - -def do_test(cfg, model): - results = OrderedDict() - for dataset_name in cfg.DATASETS.TEST: - data_loader = build_detection_test_loader(cfg, dataset_name) - evaluator = get_evaluator( - cfg, dataset_name, os.path.join(cfg.OUTPUT_DIR, "inference", dataset_name) - ) - results_i = inference_on_dataset(model, data_loader, evaluator) - results[dataset_name] = results_i - if comm.is_main_process(): - logger.info("Evaluation results for {} in csv format:".format(dataset_name)) - print_csv_format(results_i) - if len(results) == 1: - results = list(results.values())[0] - return results - - -def do_train(cfg, model, resume=False): - model.train() - optimizer = build_optimizer(cfg, model) - scheduler = build_lr_scheduler(cfg, optimizer) - - checkpointer = DetectionCheckpointer( - model, cfg.OUTPUT_DIR, optimizer=optimizer, scheduler=scheduler - ) - start_iter = ( - checkpointer.resume_or_load(cfg.MODEL.WEIGHTS, resume=resume).get("iteration", -1) + 1 - ) - max_iter = cfg.SOLVER.MAX_ITER - - periodic_checkpointer = PeriodicCheckpointer( - checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD, max_iter=max_iter - ) - - writers = default_writers(cfg.OUTPUT_DIR, max_iter) if comm.is_main_process() else [] - - # compared to "train_net.py", we do not support accurate timing and - # precise BN here, because they are not trivial to implement in a small training loop - data_loader = build_detection_train_loader(cfg) - logger.info("Starting training from iteration {}".format(start_iter)) - with EventStorage(start_iter) as storage: - for data, iteration in zip(data_loader, range(start_iter, max_iter)): - storage.iter = iteration - - loss_dict = model(data) - losses = sum(loss_dict.values()) - assert torch.isfinite(losses).all(), loss_dict - - loss_dict_reduced = {k: v.item() for k, v in comm.reduce_dict(loss_dict).items()} - losses_reduced = sum(loss for loss in loss_dict_reduced.values()) - if comm.is_main_process(): - storage.put_scalars(total_loss=losses_reduced, **loss_dict_reduced) - - optimizer.zero_grad() - losses.backward() - optimizer.step() - storage.put_scalar("lr", optimizer.param_groups[0]["lr"], smoothing_hint=False) - scheduler.step() - - if ( - cfg.TEST.EVAL_PERIOD > 0 - and (iteration + 1) % cfg.TEST.EVAL_PERIOD == 0 - and iteration != max_iter - 1 - ): - do_test(cfg, model) - # Compared to "train_net.py", the test results are not dumped to EventStorage - comm.synchronize() - - if iteration - start_iter > 5 and ( - (iteration + 1) % 20 == 0 or iteration == max_iter - 1 - ): - for writer in writers: - writer.write() - periodic_checkpointer.step(iteration) - - -def setup(args): - """ - Create configs and perform basic setups. 
- """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup( - cfg, args - ) # if you don't like any of the default setup, write your own setup code - return cfg - - -def main(args): - cfg = setup(args) - - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if args.eval_only: - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - return do_test(cfg, model) - - distributed = comm.get_world_size() > 1 - if distributed: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False - ) - - do_train(cfg, model, resume=args.resume) - return do_test(cfg, model) - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/train_net.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/train_net.py deleted file mode 100755 index 6ebf5f60..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/train_net.py +++ /dev/null @@ -1,170 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -A main training script. - -This scripts reads a given config file and runs the training or evaluation. -It is an entry point that is made to train standard models in detectron2. - -In order to let one script support training of many models, -this script contains logic that are specific to these built-in models and therefore -may not be suitable for your own project. -For example, your research project perhaps only needs a single "evaluator". - -Therefore, we recommend you to use detectron2 as an library and take -this file as an example of how to use the library. -You may want to write your own script with your datasets and other customizations. -""" - -import logging -import os -from collections import OrderedDict -import torch - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import get_cfg -from detectron2.data import MetadataCatalog -from detectron2.engine import DefaultTrainer, default_argument_parser, default_setup, hooks, launch -from detectron2.evaluation import ( - CityscapesInstanceEvaluator, - CityscapesSemSegEvaluator, - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - LVISEvaluator, - PascalVOCDetectionEvaluator, - SemSegEvaluator, - verify_results, -) -from detectron2.modeling import GeneralizedRCNNWithTTA - - -def build_evaluator(cfg, dataset_name, output_folder=None): - """ - Create evaluator(s) for a given dataset. - This uses the special metadata "evaluator_type" associated with each builtin dataset. - For your own dataset, you can simply create an evaluator manually in your - script and do not have to worry about the hacky if-else logic here. 
- """ - if output_folder is None: - output_folder = os.path.join(cfg.OUTPUT_DIR, "inference") - evaluator_list = [] - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - if evaluator_type in ["sem_seg", "coco_panoptic_seg"]: - evaluator_list.append( - SemSegEvaluator( - dataset_name, - distributed=True, - output_dir=output_folder, - ) - ) - if evaluator_type in ["coco", "coco_panoptic_seg"]: - evaluator_list.append(COCOEvaluator(dataset_name, output_dir=output_folder)) - if evaluator_type == "coco_panoptic_seg": - evaluator_list.append(COCOPanopticEvaluator(dataset_name, output_folder)) - if evaluator_type == "cityscapes_instance": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesInstanceEvaluator(dataset_name) - if evaluator_type == "cityscapes_sem_seg": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesSemSegEvaluator(dataset_name) - elif evaluator_type == "pascal_voc": - return PascalVOCDetectionEvaluator(dataset_name) - elif evaluator_type == "lvis": - return LVISEvaluator(dataset_name, output_dir=output_folder) - if len(evaluator_list) == 0: - raise NotImplementedError( - "no Evaluator for the dataset {} with the type {}".format(dataset_name, evaluator_type) - ) - elif len(evaluator_list) == 1: - return evaluator_list[0] - return DatasetEvaluators(evaluator_list) - - -class Trainer(DefaultTrainer): - """ - We use the "DefaultTrainer" which contains pre-defined default logic for - standard training workflow. They may not work for you, especially if you - are working on a new research project. In that case you can write your - own training loop. You can use "tools/plain_train_net.py" as an example. - """ - - @classmethod - def build_evaluator(cls, cfg, dataset_name, output_folder=None): - return build_evaluator(cfg, dataset_name, output_folder) - - @classmethod - def test_with_TTA(cls, cfg, model): - logger = logging.getLogger("detectron2.trainer") - # In the end of training, run an evaluation with TTA - # Only support some R-CNN models. - logger.info("Running inference with test-time augmentation ...") - model = GeneralizedRCNNWithTTA(cfg, model) - evaluators = [ - cls.build_evaluator( - cfg, name, output_folder=os.path.join(cfg.OUTPUT_DIR, "inference_TTA") - ) - for name in cfg.DATASETS.TEST - ] - res = cls.test(cfg, model, evaluators) - res = OrderedDict({k + "_TTA": v for k, v in res.items()}) - return res - - -def setup(args): - """ - Create configs and perform basic setups. - """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -def main(args): - cfg = setup(args) - - if args.eval_only: - model = Trainer.build_model(cfg) - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - res = Trainer.test(cfg, model) - if cfg.TEST.AUG.ENABLED: - res.update(Trainer.test_with_TTA(cfg, model)) - if comm.is_main_process(): - verify_results(cfg, res) - return res - - """ - If you'd like to do anything fancier than the standard training logic, - consider writing your own training loop (see plain_train_net.py) or - subclassing the trainer. 
- """ - trainer = Trainer(cfg) - trainer.resume_or_load(resume=args.resume) - if cfg.TEST.AUG.ENABLED: - trainer.register_hooks( - [hooks.EvalHook(0, lambda: trainer.test_with_TTA(cfg, trainer.model))] - ) - return trainer.train() - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_data.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_data.py deleted file mode 100755 index fd0ba834..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_data.py +++ /dev/null @@ -1,94 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -import argparse -import os -from itertools import chain -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data import DatasetCatalog, MetadataCatalog, build_detection_train_loader -from detectron2.data import detection_utils as utils -from detectron2.data.build import filter_images_with_few_keypoints -from detectron2.utils.logger import setup_logger -from detectron2.utils.visualizer import Visualizer - - -def setup(args): - cfg = get_cfg() - if args.config_file: - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.DATALOADER.NUM_WORKERS = 0 - cfg.freeze() - return cfg - - -def parse_args(in_args=None): - parser = argparse.ArgumentParser(description="Visualize ground-truth data") - parser.add_argument( - "--source", - choices=["annotation", "dataloader"], - required=True, - help="visualize the annotations or the data loader (with pre-processing)", - ) - parser.add_argument("--config-file", metavar="FILE", help="path to config file") - parser.add_argument("--output-dir", default="./", help="path to output directory") - parser.add_argument("--show", action="store_true", help="show output in a window") - parser.add_argument( - "opts", - help="Modify config options using the command-line", - default=None, - nargs=argparse.REMAINDER, - ) - return parser.parse_args(in_args) - - -if __name__ == "__main__": - args = parse_args() - logger = setup_logger() - logger.info("Arguments: " + str(args)) - cfg = setup(args) - - dirname = args.output_dir - os.makedirs(dirname, exist_ok=True) - metadata = MetadataCatalog.get(cfg.DATASETS.TRAIN[0]) - - def output(vis, fname): - if args.show: - print(fname) - cv2.imshow("window", vis.get_image()[:, :, ::-1]) - cv2.waitKey() - else: - filepath = os.path.join(dirname, fname) - print("Saving to {} ...".format(filepath)) - vis.save(filepath) - - scale = 1.0 - if args.source == "dataloader": - train_data_loader = build_detection_train_loader(cfg) - for batch in train_data_loader: - for per_image in batch: - # Pytorch tensor is in (C, H, W) format - img = per_image["image"].permute(1, 2, 0).cpu().detach().numpy() - img = utils.convert_image_to_rgb(img, cfg.INPUT.FORMAT) - - visualizer = Visualizer(img, metadata=metadata, scale=scale) - target_fields = per_image["instances"].get_fields() - labels = [metadata.thing_classes[i] for i in target_fields["gt_classes"]] - vis = visualizer.overlay_instances( - labels=labels, - boxes=target_fields.get("gt_boxes", None), - masks=target_fields.get("gt_masks", None), - keypoints=target_fields.get("gt_keypoints", None), - ) - output(vis, str(per_image["image_id"]) + 
".jpg") - else: - dicts = list(chain.from_iterable([DatasetCatalog.get(k) for k in cfg.DATASETS.TRAIN])) - if cfg.MODEL.KEYPOINT_ON: - dicts = filter_images_with_few_keypoints(dicts, 1) - for dic in tqdm.tqdm(dicts): - img = utils.read_image(dic["file_name"], "RGB") - visualizer = Visualizer(img, metadata=metadata, scale=scale) - vis = visualizer.draw_dataset_dict(dic) - output(vis, os.path.basename(dic["file_name"])) diff --git a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_json_results.py b/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_json_results.py deleted file mode 100755 index 472190e0..00000000 --- a/vbench/third_party/tag2Text/grit_src/third_party/CenterNet2/tools/visualize_json_results.py +++ /dev/null @@ -1,90 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. - -import argparse -import json -import numpy as np -import os -from collections import defaultdict -import cv2 -import tqdm - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.structures import Boxes, BoxMode, Instances -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger -from detectron2.utils.visualizer import Visualizer - - -def create_instances(predictions, image_size): - ret = Instances(image_size) - - score = np.asarray([x["score"] for x in predictions]) - chosen = (score > args.conf_threshold).nonzero()[0] - score = score[chosen] - bbox = np.asarray([predictions[i]["bbox"] for i in chosen]).reshape(-1, 4) - bbox = BoxMode.convert(bbox, BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) - - labels = np.asarray([dataset_id_map(predictions[i]["category_id"]) for i in chosen]) - - ret.scores = score - ret.pred_boxes = Boxes(bbox) - ret.pred_classes = labels - - try: - ret.pred_masks = [predictions[i]["segmentation"] for i in chosen] - except KeyError: - pass - return ret - - -if __name__ == "__main__": - parser = argparse.ArgumentParser( - description="A script that visualizes the json predictions from COCO or LVIS dataset." 
- ) - parser.add_argument("--input", required=True, help="JSON file produced by the model") - parser.add_argument("--output", required=True, help="output directory") - parser.add_argument("--dataset", help="name of the dataset", default="coco_2017_val") - parser.add_argument("--conf-threshold", default=0.5, type=float, help="confidence threshold") - args = parser.parse_args() - - logger = setup_logger() - - with PathManager.open(args.input, "r") as f: - predictions = json.load(f) - - pred_by_image = defaultdict(list) - for p in predictions: - pred_by_image[p["image_id"]].append(p) - - dicts = list(DatasetCatalog.get(args.dataset)) - metadata = MetadataCatalog.get(args.dataset) - if hasattr(metadata, "thing_dataset_id_to_contiguous_id"): - - def dataset_id_map(ds_id): - return metadata.thing_dataset_id_to_contiguous_id[ds_id] - - elif "lvis" in args.dataset: - # LVIS results are in the same format as COCO results, but have a different - # mapping from dataset category id to contiguous category id in [0, #categories - 1] - def dataset_id_map(ds_id): - return ds_id - 1 - - else: - raise ValueError("Unsupported dataset: {}".format(args.dataset)) - - os.makedirs(args.output, exist_ok=True) - - for dic in tqdm.tqdm(dicts): - img = cv2.imread(dic["file_name"], cv2.IMREAD_COLOR)[:, :, ::-1] - basename = os.path.basename(dic["file_name"]) - - predictions = create_instances(pred_by_image[dic["image_id"]], img.shape[:2]) - vis = Visualizer(img, metadata) - vis_pred = vis.draw_instance_predictions(predictions).get_image() - - vis = Visualizer(img, metadata) - vis_gt = vis.draw_dataset_dict(dic).get_image() - - concat = np.concatenate((vis_pred, vis_gt), axis=1) - cv2.imwrite(os.path.join(args.output, basename), concat[:, :, ::-1]) diff --git a/vbench/third_party/umt/__init__.py b/vbench/third_party/umt/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/umt/models/__init__.py b/vbench/third_party/umt/models/__init__.py index 0bab3794..881731df 100755 --- a/vbench/third_party/umt/models/__init__.py +++ b/vbench/third_party/umt/models/__init__.py @@ -1,4 +1,4 @@ from .clip import clip_b16, clip_l14, clip_l14_336 -from .modeling_finetune import vit_base_patch16_224, vit_base_patch16_384, vit_large_patch16_224, vit_large_patch16_384 +# from .modeling_finetune import vit_base_patch16_224, vit_base_patch16_384, vit_large_patch16_224, vit_large_patch16_384 from .modeling_pretrain_umt import pretrain_umt_base_patch16_224, pretrain_umt_large_patch16_224 -from .modeling_pretrain import pretrain_videomae_base_patch16_224, pretrain_videomae_large_patch16_224, pretrain_videomae_huge_patch16_224 \ No newline at end of file +from .modeling_pretrain import pretrain_videomae_base_patch16_224, pretrain_videomae_large_patch16_224, pretrain_videomae_huge_patch16_224 diff --git a/vbench/third_party/umt/models/modeling_finetune.py b/vbench/third_party/umt/models/modeling_finetune.py index 0c714073..65353688 100755 --- a/vbench/third_party/umt/models/modeling_finetune.py +++ b/vbench/third_party/umt/models/modeling_finetune.py @@ -326,42 +326,42 @@ def forward(self, x): return x -@register_model -def vit_base_patch16_224(pretrained=False, **kwargs): - model = VisionTransformer( - patch_size=16, embed_dim=768, depth=12, num_heads=12, mlp_ratio=4, qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) - model.default_cfg = _cfg() - return model - - -@register_model -def vit_base_patch16_384(pretrained=False, **kwargs): - model = VisionTransformer( - img_size=384, 
patch_size=16, embed_dim=768, depth=12, num_heads=12, mlp_ratio=4, qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) - model.default_cfg = _cfg() - return model - - -@register_model -def vit_large_patch16_224(pretrained=False, **kwargs): - kwargs.pop('pretrained_cfg', None) # added by Ziqi to accommodate timm=0.9.12 - kwargs.pop('pretrained_cfg_overlay', None) # added by Ziqi to accommodate timm=0.9.12 - model = VisionTransformer( - patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) - model.default_cfg = _cfg() - return model - - -@register_model -def vit_large_patch16_384(pretrained=False, **kwargs): - model = VisionTransformer( - img_size=384, patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) - model.default_cfg = _cfg() - return model +# @register_model +# def vit_base_patch16_224(pretrained=False, **kwargs): +# model = VisionTransformer( +# patch_size=16, embed_dim=768, depth=12, num_heads=12, mlp_ratio=4, qkv_bias=True, +# norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) +# model.default_cfg = _cfg() +# return model +# +# +# # @register_model +# def vit_base_patch16_384(pretrained=False, **kwargs): +# model = VisionTransformer( +# img_size=384, patch_size=16, embed_dim=768, depth=12, num_heads=12, mlp_ratio=4, qkv_bias=True, +# norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) +# model.default_cfg = _cfg() +# return model + + +# @register_model +# def vit_large_patch16_224(pretrained=False, **kwargs): +# kwargs.pop('pretrained_cfg', None) # added by Ziqi to accommodate timm=0.9.12 +# kwargs.pop('pretrained_cfg_overlay', None) # added by Ziqi to accommodate timm=0.9.12 +# model = VisionTransformer( +# patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, +# norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) +# model.default_cfg = _cfg() +# return model + + +# @register_model +# def vit_large_patch16_384(pretrained=False, **kwargs): +# model = VisionTransformer( +# img_size=384, patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, +# norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) +# model.default_cfg = _cfg() +# return model if __name__ == '__main__': @@ -378,11 +378,11 @@ def vit_large_patch16_384(pretrained=False, **kwargs): num_frames = 8 # model = vit_base_patch16_384(all_frames=num_frames, tubelet_size=1) - model = vit_large_patch16_384(all_frames=num_frames, tubelet_size=1) + # model = vit_large_patch16_384(all_frames=num_frames, tubelet_size=1) # print(model) flops = FlopCountAnalysis(model, torch.rand(1, 3, num_frames, 384, 384)) s = time.time() print(flop_count_table(flops, max_depth=1)) print(time.time()-s) - # print(model(torch.rand(1, 3, num_frames, 224, 224)).shape) \ No newline at end of file + # print(model(torch.rand(1, 3, num_frames, 224, 224)).shape) diff --git a/version.txt b/version.txt new file mode 100644 index 00000000..6e8bf73a --- /dev/null +++ b/version.txt @@ -0,0 +1 @@ +0.1.0 From a40df20914c5d2bbb61641f7cc2e3d2fe397b75c Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 27 Dec 2023 21:06:44 +0700 Subject: [PATCH 063/248] [add]: add static filter --- setup.py | 2 +- vbench/cli/static_filter.py | 136 ++++++++++++++++++++++++++++++++++++ 2 files changed, 137 insertions(+), 1 deletion(-) create mode 100644 vbench/cli/static_filter.py diff 
--git a/setup.py b/setup.py index 5b240aa2..86e89adb 100644 --- a/setup.py +++ b/setup.py @@ -25,7 +25,7 @@ def fetch_requirements(): 'Source': 'https://github.com/Vchitect/VBench', }, entry_points={ - 'console_scripts': ['evaluate=vbench.cli.evaluate:main'] + 'console_scripts': ['evaluate=vbench.cli.evaluate:main', 'static_filter=vbench.cli.static_filter:main'] }, install_requires=install_requires, packages=find_packages(), diff --git a/vbench/cli/static_filter.py b/vbench/cli/static_filter.py new file mode 100644 index 00000000..b35d1445 --- /dev/null +++ b/vbench/cli/static_filter.py @@ -0,0 +1,136 @@ +import argparse +import os +import cv2 +import glob +import numpy as np +import torch +from tqdm import tqdm +import json + +from vbench.third_party.RAFT.core.raft import RAFT +from vbench.third_party.RAFT.core.utils_core.utils import InputPadder + + +DEVICE = 'cuda' + + +class StaticFilter: + def __init__(self, args, device): + self.args = args + self.device = device + self.load_model() + + + def load_model(self): + self.model = torch.nn.DataParallel(RAFT(self.args)) + self.model.load_state_dict(torch.load(self.args.model)) + + self.model = self.model.module + self.model.to(self.device) + self.model.eval() + + + def get_score(self, img, flo): + img = img[0].permute(1,2,0).cpu().numpy() + flo = flo[0].permute(1,2,0).cpu().numpy() + + u = flo[:,:,0] + v = flo[:,:,1] + rad = np.sqrt(np.square(u) + np.square(v)) + + h, w = rad.shape + rad_flat = rad.flatten() + cut_index = int(h*w*0.02) + + max_rad = np.mean(abs(np.sort(-rad_flat))[:cut_index]) + + return max_rad + + + def check_static(self, score_list): + thres = self.params["thres"] + count_num = self.params["count_num"] + count = 0 + for score in score_list[:-2]: + if score > thres: + count += 1 + if count > count_num: + return False + for score in score_list[-2:]: + if score > thres*count_num*2: + return False + return True + + + def set_params(self, frame, count): + scale = min(list(frame.shape)[-2:]) + self.params = {"thres":3.0*(scale/256.0), "count_num":round(2*(count/16.0))} + + + def infer(self, path): + with torch.no_grad(): + frames = self.get_frames(path) + self.set_params(frame=frames[0], count=len(frames)) + static_score = [] + for image1, image2 in zip(frames[:-1]+[frames[0],frames[-1]], frames[1:]+[frames[-1],frames[0]]): + padder = InputPadder(image1.shape) + image1, image2 = padder.pad(image1, image2) + _, flow_up = self.model(image1, image2, iters=20, test_mode=True) + max_rad = self.get_score(image1, flow_up) + static_score.append(max_rad) + whether_static = self.check_static(static_score) + return whether_static + + + def get_frames(self, video_path): + frame_list = [] + video = cv2.VideoCapture(video_path) + while video.isOpened(): + success, frame = video.read() + if success: + frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) # convert to rgb + frame = torch.from_numpy(frame.astype(np.uint8)).permute(2, 0, 1).float() + frame = frame[None].to(DEVICE) + frame_list.append(frame) + else: + break + video.release() + assert frame_list != [] + return frame_list + + +def filter_static(args): + static_filter = StaticFilter(args, device=DEVICE) + prompt_dict = {} + with open(args.prompt_file, "r") as f: + lines = [line.strip() for line in f.readlines()] + for line in lines: + prompt_dict[line] = {"static_count":0, "static_path":[]} + + paths = sorted(glob.glob(os.path.join(args.videos_path, "*.mp4"))) + for path in tqdm(paths): + name = '-'.join(path.split('/')[-1].split('-')[:-1]) + if name in lines: + if 
prompt_dict[name]["static_count"] < 5: + if static_filter.infer(path): + prompt_dict[name]["static_count"] += 1 + prompt_dict[name]["static_path"].append(path) + os.makedirs(args.result_path, exist_ok=True) + json.dump(prompt_dict, open(os.path.join(args.result_path, args.store_name), "w")) + +def main(): + parser = argparse.ArgumentParser() + parser.add_argument('--model', type=str, default="./pretrained/raft_model/models/raft-things.pth", help="restore checkpoint") + parser.add_argument('--videos_path', default="", required=True, help="video path for filtering") + parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path') + parser.add_argument('--store_name', type=str, default="filtered_static_video.json", help='result file name') + parser.add_argument('--prompt_file', type=str, default="./prompts/prompts_per_dimension/temporal_flickering.txt", help='static_prompt') + parser.add_argument('--small', action='store_true', help='use small model') + parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision') + parser.add_argument('--alternate_corr', action='store_true', help='use efficent correlation implementation') + args = parser.parse_args() + + filter_static(args) + +if __name__ == "__main__": + main() From c88f72d8e7e8228075397487b1d5b0501dd9d43a Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 28 Dec 2023 14:44:50 +0700 Subject: [PATCH 064/248] [update]: update readme && restructure grit --- MANIFEST.in | 1 + README.md | 5 + dimension_to_folder.json | 10 +- grit_model_deprecated.py | 45 - grit_src_deprecated/configs/Base.yaml | 77 - .../configs/GRiT_B_DenseCap.yaml | 20 - .../configs/GRiT_B_DenseCap_ObjectDet.yaml | 23 - .../configs/GRiT_B_ObjectDet.yaml | 20 - .../configs/GRiT_H_ObjectDet.yaml | 21 - .../configs/GRiT_L_ObjectDet.yaml | 20 - grit_src_deprecated/grit/__init__.py | 7 - grit_src_deprecated/grit/config.py | 50 - grit_src_deprecated/grit/custom_solver.py | 88 -- .../grit/data/custom_build_augmentation.py | 44 - .../grit/data/custom_dataset_dataloader.py | 250 ---- .../grit/data/custom_dataset_mapper.py | 149 -- .../grit/data/datasets/grit_coco.py | 112 -- .../grit/data/datasets/object365.py | 111 -- grit_src_deprecated/grit/data/datasets/vg.py | 98 -- .../transforms/custom_augmentation_impl.py | 52 - .../grit/data/transforms/custom_transform.py | 115 -- grit_src_deprecated/grit/evaluation/eval.py | 156 -- .../grit/modeling/backbone/utils.py | 186 --- .../grit/modeling/backbone/vit.py | 538 ------- .../grit/modeling/meta_arch/grit.py | 67 - .../grit/modeling/roi_heads/grit_fast_rcnn.py | 126 -- .../grit/modeling/roi_heads/grit_roi_heads.py | 478 ------ grit_src_deprecated/grit/modeling/soft_nms.py | 177 --- .../grit/modeling/text/file_utils.py | 256 ---- .../grit/modeling/text/load_text_token.py | 80 - .../grit/modeling/text/modeling_bert.py | 529 ------- .../grit/modeling/text/text_decoder.py | 672 --------- grit_src_deprecated/grit/predictor.py | 91 -- grit_src_deprecated/image_dense_captions.py | 87 -- .../CenterNet2/.circleci/config.yml | 256 ---- .../third_party/CenterNet2/.clang-format | 85 -- .../third_party/CenterNet2/.flake8 | 15 - .../CenterNet2/.github/CODE_OF_CONDUCT.md | 5 - .../CenterNet2/.github/CONTRIBUTING.md | 68 - .../.github/Detectron2-Logo-Horz.svg | 1 - .../CenterNet2/.github/ISSUE_TEMPLATE.md | 5 - .../CenterNet2/.github/ISSUE_TEMPLATE/bugs.md | 38 - .../.github/ISSUE_TEMPLATE/config.yml | 17 - 
From c88f72d8e7e8228075397487b1d5b0501dd9d43a Mon Sep 17 00:00:00 2001
From: NattapolChan <66078943+NattapolChan@users.noreply.github.com>
Date: Thu, 28 Dec 2023 14:44:50 +0700
Subject: [PATCH 064/248] [update]: update readme && restructure grit

---
 MANIFEST.in | 1 + README.md | 5 + dimension_to_folder.json | 10 +- grit_model_deprecated.py | 45 - grit_src_deprecated/configs/Base.yaml | 77 - .../configs/GRiT_B_DenseCap.yaml | 20 - .../configs/GRiT_B_DenseCap_ObjectDet.yaml | 23 - .../configs/GRiT_B_ObjectDet.yaml | 20 - .../configs/GRiT_H_ObjectDet.yaml | 21 - .../configs/GRiT_L_ObjectDet.yaml | 20 - grit_src_deprecated/grit/__init__.py | 7 - grit_src_deprecated/grit/config.py | 50 - grit_src_deprecated/grit/custom_solver.py | 88 -- .../grit/data/custom_build_augmentation.py | 44 - .../grit/data/custom_dataset_dataloader.py | 250 ---- .../grit/data/custom_dataset_mapper.py | 149 -- .../grit/data/datasets/grit_coco.py | 112 -- .../grit/data/datasets/object365.py | 111 -- grit_src_deprecated/grit/data/datasets/vg.py | 98 -- .../transforms/custom_augmentation_impl.py | 52 - .../grit/data/transforms/custom_transform.py | 115 -- grit_src_deprecated/grit/evaluation/eval.py | 156 -- .../grit/modeling/backbone/utils.py | 186 --- .../grit/modeling/backbone/vit.py | 538 ------- .../grit/modeling/meta_arch/grit.py | 67 - .../grit/modeling/roi_heads/grit_fast_rcnn.py | 126 -- .../grit/modeling/roi_heads/grit_roi_heads.py | 478 ------ grit_src_deprecated/grit/modeling/soft_nms.py | 177 --- .../grit/modeling/text/file_utils.py | 256 ---- .../grit/modeling/text/load_text_token.py | 80 - .../grit/modeling/text/modeling_bert.py | 529 ------- .../grit/modeling/text/text_decoder.py | 672 --------- grit_src_deprecated/grit/predictor.py | 91 -- grit_src_deprecated/image_dense_captions.py | 87 -- .../CenterNet2/.circleci/config.yml | 256 ---- .../third_party/CenterNet2/.clang-format | 85 -- .../third_party/CenterNet2/.flake8 | 15 - .../CenterNet2/.github/CODE_OF_CONDUCT.md | 5 - .../CenterNet2/.github/CONTRIBUTING.md | 68 - .../.github/Detectron2-Logo-Horz.svg | 1 - .../CenterNet2/.github/ISSUE_TEMPLATE.md | 5 - .../CenterNet2/.github/ISSUE_TEMPLATE/bugs.md | 38 - .../.github/ISSUE_TEMPLATE/config.yml | 17 - .../.github/ISSUE_TEMPLATE/documentation.md | 14 - .../.github/ISSUE_TEMPLATE/feature-request.md | 31 - .../unexpected-problems-bugs.md | 44 - .../.github/pull_request_template.md | 10 - .../.github/workflows/check-template.yml | 86 -- .../.github/workflows/levenshtein.js | 44 - .../.github/workflows/needs-reply.yml | 98 -- .../.github/workflows/remove-needs-reply.yml | 25 - .../CenterNet2/.github/workflows/workflow.yml | 81 -- .../third_party/CenterNet2/.gitignore | 57 - .../third_party/CenterNet2/GETTING_STARTED.md | 79 - .../third_party/CenterNet2/INSTALL.md | 261 ---- .../third_party/CenterNet2/LICENSE | 202 --- .../third_party/CenterNet2/MODEL_ZOO.md | 1052 -------------- .../third_party/CenterNet2/README.md | 85 -- .../third_party/CenterNet2/README_D2.md | 62 - .../CenterNet2/configs/Base-RCNN-C4.yaml | 18 - .../configs/Base-RCNN-DilatedC5.yaml | 31 - .../CenterNet2/configs/Base-RCNN-FPN.yaml | 42 - .../CenterNet2/configs/Base-RetinaNet.yaml | 25 - .../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml | 17 - .../faster_rcnn_R_101_C4_3x.yaml | 9 - .../faster_rcnn_R_101_DC5_3x.yaml | 9 - .../faster_rcnn_R_101_FPN_3x.yaml | 9 - .../faster_rcnn_R_50_C4_1x.yaml | 6 - .../faster_rcnn_R_50_C4_3x.yaml | 9 - .../faster_rcnn_R_50_DC5_1x.yaml | 6 - .../faster_rcnn_R_50_DC5_3x.yaml | 9 - .../faster_rcnn_R_50_FPN_1x.yaml | 6 - .../faster_rcnn_R_50_FPN_3x.yaml | 9 - .../faster_rcnn_X_101_32x8d_FPN_3x.yaml | 13 - .../COCO-Detection/fcos_R_50_FPN_1x.py | 11 - .../retinanet_R_101_FPN_3x.yaml | 8 - .../COCO-Detection/retinanet_R_50_FPN_1x.py | 11 - .../COCO-Detection/retinanet_R_50_FPN_1x.yaml | 5 - .../COCO-Detection/retinanet_R_50_FPN_3x.yaml | 8 - .../COCO-Detection/rpn_R_50_C4_1x.yaml | 10 - .../COCO-Detection/rpn_R_50_FPN_1x.yaml | 9 - .../mask_rcnn_R_101_C4_3x.yaml | 9 - .../mask_rcnn_R_101_DC5_3x.yaml | 9 - .../mask_rcnn_R_101_FPN_3x.yaml | 9 - .../mask_rcnn_R_50_C4_1x.py | 8 - .../mask_rcnn_R_50_C4_1x.yaml | 6 - .../mask_rcnn_R_50_C4_3x.yaml | 9 - .../mask_rcnn_R_50_DC5_1x.yaml | 6 - .../mask_rcnn_R_50_DC5_3x.yaml | 9 - .../mask_rcnn_R_50_FPN_1x.py | 8 - .../mask_rcnn_R_50_FPN_1x.yaml | 6 - .../mask_rcnn_R_50_FPN_1x_giou.yaml | 12 - .../mask_rcnn_R_50_FPN_3x.yaml | 9 - .../mask_rcnn_X_101_32x8d_FPN_3x.yaml | 13 - .../mask_rcnn_regnetx_4gf_dds_fpn_1x.py | 34 - .../mask_rcnn_regnety_4gf_dds_fpn_1x.py | 35 - .../Base-Keypoint-RCNN-FPN.yaml | 15 - .../keypoint_rcnn_R_101_FPN_3x.yaml | 8 - .../keypoint_rcnn_R_50_FPN_1x.py | 8 - .../keypoint_rcnn_R_50_FPN_1x.yaml | 5 - .../keypoint_rcnn_R_50_FPN_3x.yaml | 8 - .../keypoint_rcnn_X_101_32x8d_FPN_3x.yaml | 12 - .../Base-Panoptic-FPN.yaml | 11 - .../panoptic_fpn_R_101_3x.yaml | 8 - .../panoptic_fpn_R_50_1x.py | 8 - .../panoptic_fpn_R_50_1x.yaml | 5 - .../panoptic_fpn_R_50_3x.yaml | 8 - .../Cityscapes/mask_rcnn_R_50_FPN.yaml | 27 - .../configs/Detectron1-Comparisons/README.md | 84 -- .../faster_rcnn_R_50_FPN_noaug_1x.yaml | 17 - .../keypoint_rcnn_R_50_FPN_1x.yaml | 27 - .../mask_rcnn_R_50_FPN_noaug_1x.yaml | 20 - .../mask_rcnn_R_101_FPN_1x.yaml | 19 - .../mask_rcnn_R_50_FPN_1x.yaml | 19 - .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 23 - .../mask_rcnn_R_101_FPN_1x.yaml | 22 - .../mask_rcnn_R_50_FPN_1x.yaml | 22 - .../mask_rcnn_X_101_32x8d_FPN_1x.yaml | 26 - .../Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml | 12 - .../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml | 15 - ...sk_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml | 36 - .../mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml | 10 - .../mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml | 8 - .../mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml | 11 -
.../Misc/mask_rcnn_R_50_FPN_3x_gn.yaml | 21 - .../Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml | 24 - .../Misc/mmdet_mask_rcnn_R_50_FPN_1x.py | 151 -- ...anoptic_fpn_R_101_dconv_cascade_gn_3x.yaml | 26 - .../scratch_mask_rcnn_R_50_FPN_3x_gn.yaml | 13 - .../scratch_mask_rcnn_R_50_FPN_9x_gn.yaml | 19 - .../scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml | 19 - .../configs/Misc/semantic_R_50_FPN_1x.yaml | 11 - .../configs/Misc/torchvision_imagenet_R_50.py | 150 -- .../faster_rcnn_R_50_C4.yaml | 18 - .../faster_rcnn_R_50_FPN.yaml | 18 - .../CenterNet2/configs/common/README.md | 6 - .../configs/common/coco_schedule.py | 47 - .../CenterNet2/configs/common/data/coco.py | 48 - .../configs/common/data/coco_keypoint.py | 13 - .../common/data/coco_panoptic_separated.py | 26 - .../configs/common/models/cascade_rcnn.py | 36 - .../CenterNet2/configs/common/models/fcos.py | 23 - .../common/models/keypoint_rcnn_fpn.py | 33 - .../configs/common/models/mask_rcnn_c4.py | 88 -- .../configs/common/models/mask_rcnn_fpn.py | 93 -- .../configs/common/models/panoptic_fpn.py | 20 - .../configs/common/models/retinanet.py | 53 - .../CenterNet2/configs/common/optim.py | 15 - .../CenterNet2/configs/common/train.py | 18 - .../mask_rcnn_R_101_FPN_100ep_LSJ.py | 9 - .../mask_rcnn_R_101_FPN_200ep_LSJ.py | 14 - .../mask_rcnn_R_101_FPN_400ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_100ep_LSJ.py | 72 - .../mask_rcnn_R_50_FPN_200ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_400ep_LSJ.py | 14 - .../mask_rcnn_R_50_FPN_50ep_LSJ.py | 14 - ...mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py | 29 - ...mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py | 14 - ...mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py | 14 - ...mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py | 30 - ...mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py | 14 - ...mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py | 14 - .../configs/quick_schedules/README.md | 8 - ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - ...scade_mask_rcnn_R_50_FPN_instant_test.yaml | 11 - ...fast_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - .../fast_rcnn_R_50_FPN_instant_test.yaml | 15 - ...oint_rcnn_R_50_FPN_inference_acc_test.yaml | 7 - .../keypoint_rcnn_R_50_FPN_instant_test.yaml | 16 - ...R_50_FPN_normalized_training_acc_test.yaml | 30 - ...point_rcnn_R_50_FPN_training_acc_test.yaml | 28 - .../mask_rcnn_R_50_C4_GCV_instant_test.yaml | 18 - .../mask_rcnn_R_50_C4_inference_acc_test.yaml | 7 - .../mask_rcnn_R_50_C4_instant_test.yaml | 14 - .../mask_rcnn_R_50_C4_training_acc_test.yaml | 22 - ...mask_rcnn_R_50_DC5_inference_acc_test.yaml | 7 - ...mask_rcnn_R_50_FPN_inference_acc_test.yaml | 10 - .../mask_rcnn_R_50_FPN_instant_test.yaml | 14 - ...R_50_FPN_pred_boxes_training_acc_test.yaml | 6 - .../mask_rcnn_R_50_FPN_training_acc_test.yaml | 21 - .../panoptic_fpn_R_50_inference_acc_test.yaml | 7 - .../panoptic_fpn_R_50_instant_test.yaml | 19 - .../panoptic_fpn_R_50_training_acc_test.yaml | 20 - ...retinanet_R_50_FPN_inference_acc_test.yaml | 7 - .../retinanet_R_50_FPN_instant_test.yaml | 13 - .../rpn_R_50_FPN_inference_acc_test.yaml | 7 - .../rpn_R_50_FPN_instant_test.yaml | 13 - .../semantic_R_50_FPN_inference_acc_test.yaml | 10 - .../semantic_R_50_FPN_instant_test.yaml | 18 - .../semantic_R_50_FPN_training_acc_test.yaml | 20 - .../third_party/CenterNet2/datasets/README.md | 140 -- .../datasets/lvis/lvis_v1_train_cat_info.json | 1 - .../datasets/prepare_ade20k_sem_seg.py | 26 - .../datasets/prepare_cocofied_lvis.py | 176 --- .../CenterNet2/datasets/prepare_for_tests.sh | 31 - .../datasets/prepare_panoptic_fpn.py | 116 -- 
.../third_party/CenterNet2/demo/README.md | 8 - .../third_party/CenterNet2/demo/demo.py | 188 --- .../third_party/CenterNet2/demo/predictor.py | 220 --- .../CenterNet2/detectron2/__init__.py | 10 - .../detectron2/checkpoint/__init__.py | 10 - .../detectron2/checkpoint/c2_model_loading.py | 407 ------ .../detectron2/checkpoint/catalog.py | 115 -- .../checkpoint/detection_checkpoint.py | 120 -- .../CenterNet2/detectron2/config/__init__.py | 24 - .../CenterNet2/detectron2/config/compat.py | 229 --- .../CenterNet2/detectron2/config/config.py | 265 ---- .../CenterNet2/detectron2/config/defaults.py | 635 -------- .../detectron2/config/instantiate.py | 82 -- .../CenterNet2/detectron2/config/lazy.py | 399 ----- .../CenterNet2/detectron2/data/__init__.py | 19 - .../CenterNet2/detectron2/data/benchmark.py | 225 --- .../CenterNet2/detectron2/data/build.py | 542 ------- .../CenterNet2/detectron2/data/catalog.py | 236 --- .../CenterNet2/detectron2/data/common.py | 241 --- .../detectron2/data/dataset_mapper.py | 191 --- .../detectron2/data/datasets/README.md | 9 - .../detectron2/data/datasets/__init__.py | 9 - .../detectron2/data/datasets/builtin.py | 259 ---- .../detectron2/data/datasets/builtin_meta.py | 350 ----- .../detectron2/data/datasets/cityscapes.py | 329 ----- .../data/datasets/cityscapes_panoptic.py | 187 --- .../detectron2/data/datasets/coco.py | 539 ------- .../detectron2/data/datasets/coco_panoptic.py | 228 --- .../detectron2/data/datasets/lvis.py | 240 --- .../data/datasets/lvis_v0_5_categories.py | 13 - .../data/datasets/lvis_v1_categories.py | 16 - .../detectron2/data/datasets/pascal_voc.py | 82 -- .../detectron2/data/datasets/register_coco.py | 3 - .../detectron2/data/detection_utils.py | 623 -------- .../detectron2/data/samplers/__init__.py | 17 - .../data/samplers/distributed_sampler.py | 278 ---- .../data/samplers/grouped_batch_sampler.py | 47 - .../detectron2/data/transforms/__init__.py | 14 - .../data/transforms/augmentation.py | 377 ----- .../data/transforms/augmentation_impl.py | 614 -------- .../detectron2/data/transforms/transform.py | 351 ----- .../CenterNet2/detectron2/engine/__init__.py | 12 - .../CenterNet2/detectron2/engine/defaults.py | 715 --------- .../CenterNet2/detectron2/engine/hooks.py | 686 --------- .../CenterNet2/detectron2/engine/launch.py | 126 -- .../detectron2/engine/train_loop.py | 417 ------ .../detectron2/evaluation/__init__.py | 12 - .../evaluation/cityscapes_evaluation.py | 194 --- .../detectron2/evaluation/coco_evaluation.py | 710 --------- .../detectron2/evaluation/evaluator.py | 224 --- .../detectron2/evaluation/fast_eval_api.py | 121 -- .../detectron2/evaluation/lvis_evaluation.py | 380 ----- .../evaluation/panoptic_evaluation.py | 199 --- .../evaluation/pascal_voc_evaluation.py | 300 ---- .../evaluation/rotated_coco_evaluation.py | 207 --- .../evaluation/sem_seg_evaluation.py | 184 --- .../detectron2/evaluation/testing.py | 85 -- .../CenterNet2/detectron2/export/README.md | 13 - .../CenterNet2/detectron2/export/__init__.py | 15 - .../CenterNet2/detectron2/export/api.py | 235 --- .../CenterNet2/detectron2/export/c10.py | 534 ------- .../detectron2/export/caffe2_export.py | 207 --- .../detectron2/export/caffe2_inference.py | 161 --- .../detectron2/export/caffe2_modeling.py | 419 ------ .../detectron2/export/caffe2_patch.py | 152 -- .../CenterNet2/detectron2/export/flatten.py | 330 ----- .../CenterNet2/detectron2/export/shared.py | 1034 ------------- .../detectron2/export/torchscript.py | 132 -- .../detectron2/export/torchscript_patch.py | 406 ------ 
.../CenterNet2/detectron2/layers/__init__.py | 24 - .../CenterNet2/detectron2/layers/aspp.py | 144 -- .../detectron2/layers/batch_norm.py | 276 ---- .../CenterNet2/detectron2/layers/blocks.py | 111 -- .../detectron2/layers/csrc/README.md | 7 - .../csrc/ROIAlignRotated/ROIAlignRotated.h | 115 -- .../ROIAlignRotated/ROIAlignRotated_cpu.cpp | 522 ------- .../ROIAlignRotated/ROIAlignRotated_cuda.cu | 443 ------ .../csrc/box_iou_rotated/box_iou_rotated.h | 35 - .../box_iou_rotated/box_iou_rotated_cpu.cpp | 39 - .../box_iou_rotated/box_iou_rotated_cuda.cu | 130 -- .../box_iou_rotated/box_iou_rotated_utils.h | 370 ----- .../layers/csrc/cocoeval/cocoeval.cpp | 507 ------- .../layers/csrc/cocoeval/cocoeval.h | 88 -- .../detectron2/layers/csrc/cuda_version.cu | 26 - .../layers/csrc/deformable/deform_conv.h | 377 ----- .../csrc/deformable/deform_conv_cuda.cu | 1223 ---------------- .../deformable/deform_conv_cuda_kernel.cu | 1288 ----------------- .../layers/csrc/nms_rotated/nms_rotated.h | 39 - .../csrc/nms_rotated/nms_rotated_cpu.cpp | 75 - .../csrc/nms_rotated/nms_rotated_cuda.cu | 145 -- .../detectron2/layers/csrc/vision.cpp | 117 -- .../detectron2/layers/deform_conv.py | 501 ------- .../CenterNet2/detectron2/layers/losses.py | 133 -- .../CenterNet2/detectron2/layers/mask_ops.py | 275 ---- .../CenterNet2/detectron2/layers/nms.py | 139 -- .../CenterNet2/detectron2/layers/roi_align.py | 74 - .../detectron2/layers/roi_align_rotated.py | 91 -- .../detectron2/layers/rotated_boxes.py | 21 - .../detectron2/layers/shape_spec.py | 20 - .../CenterNet2/detectron2/layers/wrappers.py | 132 -- .../detectron2/model_zoo/__init__.py | 10 - .../detectron2/model_zoo/model_zoo.py | 213 --- .../detectron2/modeling/__init__.py | 59 - .../detectron2/modeling/anchor_generator.py | 382 ----- .../detectron2/modeling/backbone/__init__.py | 17 - .../detectron2/modeling/backbone/backbone.py | 53 - .../detectron2/modeling/backbone/build.py | 33 - .../detectron2/modeling/backbone/fpn.py | 255 ---- .../detectron2/modeling/backbone/regnet.py | 452 ------ .../detectron2/modeling/backbone/resnet.py | 694 --------- .../detectron2/modeling/box_regression.py | 369 ----- .../CenterNet2/detectron2/modeling/matcher.py | 127 -- .../detectron2/modeling/meta_arch/__init__.py | 16 - .../detectron2/modeling/meta_arch/build.py | 25 - .../modeling/meta_arch/dense_detector.py | 282 ---- .../detectron2/modeling/meta_arch/fcos.py | 303 ---- .../modeling/meta_arch/panoptic_fpn.py | 266 ---- .../detectron2/modeling/meta_arch/rcnn.py | 327 ----- .../modeling/meta_arch/retinanet.py | 439 ------ .../modeling/meta_arch/semantic_seg.py | 260 ---- .../detectron2/modeling/mmdet_wrapper.py | 274 ---- .../CenterNet2/detectron2/modeling/poolers.py | 245 ---- .../detectron2/modeling/postprocessing.py | 101 -- .../modeling/proposal_generator/__init__.py | 5 - .../modeling/proposal_generator/build.py | 24 - .../proposal_generator/proposal_utils.py | 196 --- .../modeling/proposal_generator/rpn.py | 533 ------- .../modeling/proposal_generator/rrpn.py | 203 --- .../detectron2/modeling/roi_heads/__init__.py | 29 - .../detectron2/modeling/roi_heads/box_head.py | 118 -- .../modeling/roi_heads/cascade_rcnn.py | 299 ---- .../modeling/roi_heads/fast_rcnn.py | 462 ------ .../modeling/roi_heads/keypoint_head.py | 272 ---- .../modeling/roi_heads/mask_head.py | 292 ---- .../modeling/roi_heads/roi_heads.py | 877 ----------- .../modeling/roi_heads/rotated_fast_rcnn.py | 270 ---- .../detectron2/modeling/sampling.py | 54 - .../modeling/test_time_augmentation.py | 307 ---- 
.../CenterNet2/detectron2/projects/README.md | 2 - .../detectron2/projects/__init__.py | 31 - .../CenterNet2/detectron2/solver/__init__.py | 5 - .../CenterNet2/detectron2/solver/build.py | 285 ---- .../detectron2/solver/lr_scheduler.py | 238 --- .../detectron2/structures/__init__.py | 17 - .../CenterNet2/detectron2/structures/boxes.py | 423 ------ .../detectron2/structures/image_list.py | 110 -- .../detectron2/structures/instances.py | 192 --- .../detectron2/structures/keypoints.py | 239 --- .../CenterNet2/detectron2/structures/masks.py | 532 ------- .../detectron2/structures/rotated_boxes.py | 503 ------- .../CenterNet2/detectron2/utils/README.md | 5 - .../CenterNet2/detectron2/utils/__init__.py | 1 - .../CenterNet2/detectron2/utils/analysis.py | 188 --- .../detectron2/utils/collect_env.py | 242 ---- .../CenterNet2/detectron2/utils/colormap.py | 140 -- .../CenterNet2/detectron2/utils/comm.py | 199 --- .../CenterNet2/detectron2/utils/env.py | 170 --- .../CenterNet2/detectron2/utils/events.py | 486 ------- .../CenterNet2/detectron2/utils/file_io.py | 37 - .../CenterNet2/detectron2/utils/logger.py | 237 --- .../CenterNet2/detectron2/utils/memory.py | 84 -- .../CenterNet2/detectron2/utils/registry.py | 60 - .../CenterNet2/detectron2/utils/serialize.py | 32 - .../CenterNet2/detectron2/utils/testing.py | 137 -- .../detectron2/utils/video_visualizer.py | 252 ---- .../CenterNet2/detectron2/utils/visualizer.py | 1267 ---------------- .../third_party/CenterNet2/dev/README.md | 7 - .../third_party/CenterNet2/dev/linter.sh | 42 - .../CenterNet2/dev/packaging/README.md | 17 - .../dev/packaging/build_all_wheels.sh | 65 - .../CenterNet2/dev/packaging/build_wheel.sh | 31 - .../dev/packaging/gen_install_table.py | 63 - .../dev/packaging/gen_wheel_index.sh | 46 - .../CenterNet2/dev/packaging/pkg_helpers.bash | 76 - .../CenterNet2/dev/parse_results.sh | 45 - .../CenterNet2/dev/run_inference_tests.sh | 44 - .../CenterNet2/dev/run_instant_tests.sh | 27 - .../third_party/CenterNet2/docker/Dockerfile | 47 - .../third_party/CenterNet2/docker/README.md | 45 - .../CenterNet2/docker/deploy.Dockerfile | 32 - .../CenterNet2/docker/docker-compose.yml | 26 - .../third_party/CenterNet2/docs/.gitignore | 1 - .../third_party/CenterNet2/docs/Makefile | 19 - .../third_party/CenterNet2/docs/README.md | 15 - .../CenterNet2/docs/_static/css/custom.css | 30 - .../third_party/CenterNet2/docs/conf.py | 382 ----- .../third_party/CenterNet2/docs/index.rst | 14 - .../CenterNet2/docs/modules/checkpoint.rst | 7 - .../CenterNet2/docs/modules/config.rst | 18 - .../CenterNet2/docs/modules/data.rst | 37 - .../docs/modules/data_transforms.rst | 10 - .../CenterNet2/docs/modules/engine.rst | 26 - .../CenterNet2/docs/modules/evaluation.rst | 7 - .../CenterNet2/docs/modules/export.rst | 9 - .../CenterNet2/docs/modules/fvcore.rst | 49 - .../CenterNet2/docs/modules/index.rst | 19 - .../CenterNet2/docs/modules/layers.rst | 7 - .../CenterNet2/docs/modules/model_zoo.rst | 7 - .../CenterNet2/docs/modules/modeling.rst | 58 - .../CenterNet2/docs/modules/solver.rst | 7 - .../CenterNet2/docs/modules/structures.rst | 7 - .../CenterNet2/docs/modules/utils.rst | 80 - .../CenterNet2/docs/notes/benchmarks.md | 196 --- .../CenterNet2/docs/notes/changelog.md | 48 - .../CenterNet2/docs/notes/compatibility.md | 84 -- .../CenterNet2/docs/notes/contributing.md | 68 - .../CenterNet2/docs/notes/index.rst | 10 - .../CenterNet2/docs/requirements.txt | 21 - .../CenterNet2/docs/tutorials/README.md | 4 - .../CenterNet2/docs/tutorials/augmentation.md | 186 --- 
.../docs/tutorials/builtin_datasets.md | 140 -- .../CenterNet2/docs/tutorials/configs.md | 62 - .../CenterNet2/docs/tutorials/data_loading.md | 95 -- .../CenterNet2/docs/tutorials/datasets.md | 290 ---- .../CenterNet2/docs/tutorials/deployment.md | 137 -- .../CenterNet2/docs/tutorials/evaluation.md | 68 - .../CenterNet2/docs/tutorials/extend.md | 141 -- .../docs/tutorials/getting_started.md | 79 - .../CenterNet2/docs/tutorials/index.rst | 20 - .../CenterNet2/docs/tutorials/install.md | 261 ---- .../CenterNet2/docs/tutorials/lazyconfigs.md | 170 --- .../CenterNet2/docs/tutorials/models.md | 180 --- .../CenterNet2/docs/tutorials/training.md | 67 - .../CenterNet2/docs/tutorials/write-models.md | 90 -- .../CenterNet2/projects/CenterNet2/.gitignore | 10 - .../projects/CenterNet2/centernet/__init__.py | 14 - .../projects/CenterNet2/centernet/config.py | 87 -- .../data/custom_build_augmentation.py | 59 - .../data/custom_dataset_dataloader.py | 229 --- .../centernet/data/datasets/coco.py | 49 - .../centernet/data/datasets/nuimages.py | 37 - .../centernet/data/datasets/objects365.py | 394 ----- .../transforms/custom_augmentation_impl.py | 63 - .../data/transforms/custom_transform.py | 94 -- .../centernet/modeling/backbone/bifpn.py | 425 ------ .../centernet/modeling/backbone/bifpn_fcos.py | 469 ------ .../centernet/modeling/backbone/dla.py | 479 ------ .../centernet/modeling/backbone/dlafpn.py | 493 ------- .../centernet/modeling/backbone/fpn_p5.py | 78 - .../centernet/modeling/backbone/res2net.py | 802 ---------- .../CenterNet2/centernet/modeling/debug.py | 283 ---- .../modeling/dense_heads/centernet.py | 864 ----------- .../modeling/dense_heads/centernet_head.py | 162 --- .../centernet/modeling/dense_heads/utils.py | 38 - .../centernet/modeling/layers/deform_conv.py | 116 -- .../modeling/layers/heatmap_focal_loss.py | 92 -- .../centernet/modeling/layers/iou_loss.py | 121 -- .../centernet/modeling/layers/ml_nms.py | 31 - .../modeling/meta_arch/centernet_detector.py | 69 - .../modeling/roi_heads/custom_fast_rcnn.py | 124 -- .../modeling/roi_heads/custom_roi_heads.py | 185 --- .../centernet/modeling/roi_heads/fed_loss.py | 31 - .../CenterNet2/centernet2_docs/MODEL_ZOO.md | 73 - .../configs/Base-CenterNet-FPN.yaml | 28 - .../CenterNet2/configs/Base-CenterNet2.yaml | 56 - .../CenterNet2/configs/Base_S4_DLA.yaml | 40 - .../configs/CenterNet-FPN_R50_1x.yaml | 4 - .../configs/CenterNet-S4_DLA_8x.yaml | 5 - .../configs/CenterNet2-F_R50_1x.yaml | 4 - .../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml | 36 - .../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml | 36 - .../CenterNet2_DLA-BiFPN-P5_640_16x.yaml | 29 - .../CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml | 30 - ...enterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml | 30 - .../CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml | 32 - ...erNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml | 36 - .../configs/CenterNet2_R2-101-DCN_896_4x.yaml | 29 - .../CenterNet2/configs/CenterNet2_R50_1x.yaml | 1 - .../configs/CenterNet2_X101-DCN_2x.yaml | 22 - .../configs/LVIS_CenterNet2_R50_1x.yaml | 17 - .../configs/LVIS_CenterNet2_R50_Fed_1x.yaml | 19 - .../configs/O365_CenterNet2_R50_1x.yaml | 13 - .../nuImages_CenterNet2_DLA_640_8x.yaml | 42 - .../CenterNet2/projects/CenterNet2/demo.py | 185 --- .../projects/CenterNet2/predictor.py | 243 ---- .../projects/CenterNet2/train_net.py | 228 --- .../third_party/CenterNet2/setup.cfg | 26 - .../third_party/CenterNet2/setup.py | 206 --- .../third_party/CenterNet2/tests/README.md | 9 - .../third_party/CenterNet2/tests/__init__.py | 1 - 
.../CenterNet2/tests/config/dir1/dir1_a.py | 3 - .../CenterNet2/tests/config/dir1/dir1_b.py | 11 - .../CenterNet2/tests/config/root_cfg.py | 14 - .../tests/config/test_instantiate_config.py | 100 -- .../tests/config/test_lazy_config.py | 79 - .../tests/config/test_yacs_config.py | 270 ---- .../CenterNet2/tests/data/__init__.py | 0 .../CenterNet2/tests/data/test_coco.py | 139 -- .../tests/data/test_coco_evaluation.py | 138 -- .../CenterNet2/tests/data/test_dataset.py | 134 -- .../tests/data/test_detection_utils.py | 176 --- .../tests/data/test_rotation_transform.py | 71 - .../CenterNet2/tests/data/test_sampler.py | 111 -- .../CenterNet2/tests/data/test_transforms.py | 268 ---- .../CenterNet2/tests/layers/__init__.py | 0 .../CenterNet2/tests/layers/test_blocks.py | 51 - .../tests/layers/test_deformable.py | 175 --- .../CenterNet2/tests/layers/test_losses.py | 82 -- .../CenterNet2/tests/layers/test_mask_ops.py | 202 --- .../CenterNet2/tests/layers/test_nms.py | 33 - .../tests/layers/test_nms_rotated.py | 172 --- .../CenterNet2/tests/layers/test_roi_align.py | 210 --- .../tests/layers/test_roi_align_rotated.py | 176 --- .../CenterNet2/tests/modeling/__init__.py | 0 .../tests/modeling/test_anchor_generator.py | 120 -- .../tests/modeling/test_backbone.py | 34 - .../tests/modeling/test_box2box_transform.py | 94 -- .../tests/modeling/test_fast_rcnn.py | 171 --- .../CenterNet2/tests/modeling/test_matcher.py | 42 - .../CenterNet2/tests/modeling/test_mmdet.py | 186 --- .../tests/modeling/test_model_e2e.py | 223 --- .../tests/modeling/test_roi_heads.py | 323 ----- .../tests/modeling/test_roi_pooler.py | 165 --- .../CenterNet2/tests/modeling/test_rpn.py | 262 ---- .../CenterNet2/tests/structures/__init__.py | 0 .../CenterNet2/tests/structures/test_boxes.py | 223 --- .../tests/structures/test_imagelist.py | 75 - .../tests/structures/test_instances.py | 219 --- .../tests/structures/test_keypoints.py | 19 - .../CenterNet2/tests/structures/test_masks.py | 53 - .../tests/structures/test_rotated_boxes.py | 437 ------ .../CenterNet2/tests/test_checkpoint.py | 49 - .../CenterNet2/tests/test_engine.py | 186 --- .../CenterNet2/tests/test_events.py | 64 - .../CenterNet2/tests/test_export_caffe2.py | 52 - .../tests/test_export_torchscript.py | 296 ---- .../CenterNet2/tests/test_model_analysis.py | 80 - .../CenterNet2/tests/test_model_zoo.py | 50 - .../CenterNet2/tests/test_packaging.py | 24 - .../CenterNet2/tests/test_registry.py | 45 - .../CenterNet2/tests/test_scheduler.py | 68 - .../CenterNet2/tests/test_solver.py | 66 - .../CenterNet2/tests/test_visualizer.py | 278 ---- .../third_party/CenterNet2/tools/README.md | 49 - .../third_party/CenterNet2/tools/__init__.py | 0 .../CenterNet2/tools/analyze_model.py | 159 -- .../third_party/CenterNet2/tools/benchmark.py | 197 --- .../tools/convert-torchvision-to-d2.py | 56 - .../CenterNet2/tools/deploy/CMakeLists.txt | 15 - .../CenterNet2/tools/deploy/README.md | 66 - .../CenterNet2/tools/deploy/export_model.py | 235 --- .../tools/deploy/torchscript_mask_rcnn.cpp | 187 --- .../CenterNet2/tools/lazyconfig_train_net.py | 131 -- .../CenterNet2/tools/lightning_train_net.py | 239 --- .../CenterNet2/tools/plain_train_net.py | 223 --- .../third_party/CenterNet2/tools/train_net.py | 170 --- .../CenterNet2/tools/visualize_data.py | 94 -- .../tools/visualize_json_results.py | 90 -- 544 files changed, 11 insertions(+), 69906 deletions(-) delete mode 100755 grit_model_deprecated.py delete mode 100755 grit_src_deprecated/configs/Base.yaml delete mode 100755 
grit_src_deprecated/configs/GRiT_B_DenseCap.yaml delete mode 100755 grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml delete mode 100755 grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml delete mode 100755 grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml delete mode 100755 grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml delete mode 100755 grit_src_deprecated/grit/__init__.py delete mode 100755 grit_src_deprecated/grit/config.py delete mode 100755 grit_src_deprecated/grit/custom_solver.py delete mode 100755 grit_src_deprecated/grit/data/custom_build_augmentation.py delete mode 100755 grit_src_deprecated/grit/data/custom_dataset_dataloader.py delete mode 100755 grit_src_deprecated/grit/data/custom_dataset_mapper.py delete mode 100755 grit_src_deprecated/grit/data/datasets/grit_coco.py delete mode 100755 grit_src_deprecated/grit/data/datasets/object365.py delete mode 100755 grit_src_deprecated/grit/data/datasets/vg.py delete mode 100755 grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py delete mode 100755 grit_src_deprecated/grit/data/transforms/custom_transform.py delete mode 100755 grit_src_deprecated/grit/evaluation/eval.py delete mode 100755 grit_src_deprecated/grit/modeling/backbone/utils.py delete mode 100755 grit_src_deprecated/grit/modeling/backbone/vit.py delete mode 100755 grit_src_deprecated/grit/modeling/meta_arch/grit.py delete mode 100755 grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py delete mode 100755 grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py delete mode 100755 grit_src_deprecated/grit/modeling/soft_nms.py delete mode 100755 grit_src_deprecated/grit/modeling/text/file_utils.py delete mode 100755 grit_src_deprecated/grit/modeling/text/load_text_token.py delete mode 100755 grit_src_deprecated/grit/modeling/text/modeling_bert.py delete mode 100755 grit_src_deprecated/grit/modeling/text/text_decoder.py delete mode 100755 grit_src_deprecated/grit/predictor.py delete mode 100755 grit_src_deprecated/image_dense_captions.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.clang-format delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.flake8 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/.gitignore delete mode 100755 grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/INSTALL.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/LICENSE delete mode 100755 grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/README_D2.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/common/train.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh delete mode 100755 grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/demo/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/demo/demo.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/demo/predictor.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py delete mode 100755 
grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/linter.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_wheel.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docker/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/.gitignore
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/Makefile
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/conf.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/index.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/setup.cfg
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/setup.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms_rotated.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align_rotated.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_events.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/__init__.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/train_net.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py
 delete mode 100755 grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py

diff --git a/MANIFEST.in b/MANIFEST.in
index e258121c..7914e2c7 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,2 +1,3 @@
 include version.txt
 include requirements.txt
+recursive-include vbench/third_party *.yaml
diff --git a/README.md b/README.md
index 1f655088..fa073816 100755
--- a/README.md
+++ b/README.md
@@ -36,6 +36,11 @@ We propose **VBench**, a comprehensive benchmark suite for video generative mode
 conda activate vbench
 ```
 
+## :hammer: Installation (pip install)
+```
+python setup.py sdist && python -m pip install dist/vbench-0.1.0.tar.gz
+```
+
 ## :gem: Pre-Trained Models
 [Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder.
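The `recursive-include vbench/third_party *.yaml` rule added above is what makes the pip-installable route work: `python setup.py sdist` only bundles non-Python files that MANIFEST.in (or the setuptools defaults) selects, so without it the third-party YAML configs would be silently missing from the tarball that the new README section installs. A minimal sketch of a sanity check for the built archive (not part of the patch; it assumes the `dist/vbench-0.1.0.tar.gz` filename from the README snippet and uses only the standard-library `tarfile` module):

```python
# Hypothetical sanity check: confirm the sdist actually bundles the
# third-party YAML configs pulled in by MANIFEST.in's recursive-include rule.
import tarfile

# Archive name assumed from the README snippet above; adjust if the version differs.
with tarfile.open("dist/vbench-0.1.0.tar.gz") as sdist:
    yaml_files = [m.name for m in sdist.getmembers() if m.name.endswith(".yaml")]

# An empty list here would mean the recursive-include rule did not take effect.
print(f"{len(yaml_files)} YAML configs packaged, e.g. {yaml_files[:3]}")
```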
diff --git a/dimension_to_folder.json b/dimension_to_folder.json index 126198af..085532cd 100644 --- a/dimension_to_folder.json +++ b/dimension_to_folder.json @@ -1,18 +1,18 @@ { "subject_consistency": "subject_consistency", "background_consistency": "scene", - "aesthetic_quality": "overall_consistency", - "imaging_quality": "overall_consistency", + "aesthetic_quality": "overall_consistency", + "imaging_quality": "overall_consistency", "object_class": "object_class", "multiple_objects": "multiple_objects", "color": "color", "spatial_relationship": "spatial_relationship", "scene": "scene", "temporal_style": "temporal_style", - "overall_consistency": "overall_consistency", + "overall_consistency": "overall_consistency", "human_action": "human_action", "temporal_flickering": "temporal_flickering", - "motion_smoothness": "subject_consistency", + "motion_smoothness": "subject_consistency", "dynamic_degree": "subject_consistency", "appearance_style": "appearance_style" -} \ No newline at end of file +} diff --git a/grit_model_deprecated.py b/grit_model_deprecated.py deleted file mode 100755 index 898cd708..00000000 --- a/grit_model_deprecated.py +++ /dev/null @@ -1,45 +0,0 @@ -import os -import sys -CUR_DIR = os.path.dirname(os.path.abspath(__file__)) -sys.path.insert(0,CUR_DIR) -from grit_src.image_dense_captions import image_caption_api, init_demo, dense_pred_to_caption, dense_pred_to_caption_tuple -from detectron2.data.detection_utils import read_image - -class DenseCaptioning(): - def __init__(self, device): - self.device = device - self.demo = None - - - def initialize_model(self): - self.demo = init_demo(self.device) - - def image_dense_caption_debug(self, image_src): - dense_caption = """ - 1. the broccoli is green, [0, 0, 333, 325]; - 2. a piece of broccoli, [0, 147, 143, 324]; - 3. 
silver fork on plate, [4, 547, 252, 612]; - """ - return dense_caption - - def image_dense_caption(self, image_src): - dense_caption = image_caption_api(image_src, self.device) - print('\033[1;35m' + '*' * 100 + '\033[0m') - print("Step2, Dense Caption:\n") - print(dense_caption) - print('\033[1;35m' + '*' * 100 + '\033[0m') - return dense_caption - - def run_caption_api(self,image_src): - img = read_image(image_src, format="BGR") - print(img.shape) - predictions, visualized_output = self.demo.run_on_image(img) - new_caption = dense_pred_to_caption(predictions) - return new_caption - - def run_caption_tensor(self,img): - # img = read_image(image_src, format="BGR") - # print(img.shape) - predictions, _ = self.demo.run_on_image(img,self.device) - new_caption = dense_pred_to_caption_tuple(predictions) - return new_caption diff --git a/grit_src_deprecated/configs/Base.yaml b/grit_src_deprecated/configs/Base.yaml deleted file mode 100755 index 445690ac..00000000 --- a/grit_src_deprecated/configs/Base.yaml +++ /dev/null @@ -1,77 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GRiT" - MASK_ON: True - PROPOSAL_GENERATOR: - NAME: "CenterNet" - FPN: - IN_FEATURES: ["layer3", "layer4", "layer5"] - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - ROI_HEADS: - NAME: GRiTROIHeadsAndTextDecoder - IN_FEATURES: ["p3", "p4", "p5"] - IOU_THRESHOLDS: [0.6] - NUM_CLASSES: 1 - SCORE_THRESH_TEST: 0.02 - NMS_THRESH_TEST: 0.5 - OBJECT_FEAT_POOLER_RES: 14 - ROI_BOX_CASCADE_HEAD: - IOUS: [0.6, 0.7, 0.8] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - CLS_AGNOSTIC_BBOX_REG: True - MULT_PROPOSAL_SCORE: True - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 - CLS_AGNOSTIC_MASK: True - CENTERNET: - NUM_CLASSES: 1 - REG_WEIGHT: 1. 
- NOT_NORM_REG: True - ONLY_PROPOSAL: True - WITH_AGN_HM: True - INFERENCE_TH: 0.0001 - PRE_NMS_TOPK_TRAIN: 4000 - POST_NMS_TOPK_TRAIN: 2000 - PRE_NMS_TOPK_TEST: 1000 - POST_NMS_TOPK_TEST: 256 - NMS_TH_TRAIN: 0.9 - NMS_TH_TEST: 0.9 - POS_WEIGHT: 0.5 - NEG_WEIGHT: 0.5 - IGNORE_HIGH_FP: 0.85 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -DATALOADER: - SAMPLER_TRAIN: "MultiDatasetSampler" - DATASET_RATIO: [1] - DATASET_INPUT_SIZE: [1024] - DATASET_INPUT_SCALE: [[0.1, 2.0]] - FILTER_EMPTY_ANNOTATIONS: False - NUM_WORKERS: 8 -TEST: - DETECTIONS_PER_IMAGE: 256 -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - CHECKPOINT_PERIOD: 10000 - WARMUP_ITERS: 1000 - WARMUP_FACTOR: 0.001 - USE_CUSTOM_SOLVER: True - OPTIMIZER: "ADAMW" - MAX_ITER: 180000 - IMS_PER_BATCH: 64 - BASE_LR: 0.00008 - VIT_LAYER_DECAY: True - CLIP_GRADIENTS: - ENABLED: True -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 -USE_ACT_CHECKPOINT: True -VERSION: 2 \ No newline at end of file diff --git a/grit_src_deprecated/configs/GRiT_B_DenseCap.yaml b/grit_src_deprecated/configs/GRiT_B_DenseCap.yaml deleted file mode 100755 index 0e7d2d2c..00000000 --- a/grit_src_deprecated/configs/GRiT_B_DenseCap.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "Base.yaml" -MODEL: - TRAIN_TASK: ["DenseCap"] - TEST_TASK: "DenseCap" - MASK_ON: False - ROI_HEADS: - SOFT_NMS_ENABLED: False - BEAM_SIZE: 1 - WEIGHTS: "detectron2://ImageNetPretrained/MAE/mae_pretrain_vit_base.pth" - BACKBONE: - NAME: build_vit_fpn_backbone - VIT_LAYERS: 12 -SOLVER: - VIT_LAYER_DECAY_RATE: 0.7 -DATASETS: - TRAIN: ("vg_train",) - TEST: ("vg_test",) -DATALOADER: - DATASET_BS: 2 -OUTPUT_DIR: "./output/GRiT_B_DenseCap" \ No newline at end of file diff --git a/grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml deleted file mode 100755 index 49f3ef13..00000000 --- a/grit_src_deprecated/configs/GRiT_B_DenseCap_ObjectDet.yaml +++ /dev/null @@ -1,23 +0,0 @@ -_BASE_: "Base.yaml" -MODEL: - TRAIN_TASK: ["ObjectDet", "DenseCap"] - TEST_TASK: "DenseCap" # DenseCap or ObjectDet: Choose one for testing - MASK_ON: True - ROI_HEADS: - SOFT_NMS_ENABLED: False - BEAM_SIZE: 1 - WEIGHTS: "detectron2://ImageNetPretrained/MAE/mae_pretrain_vit_base.pth" - BACKBONE: - NAME: build_vit_fpn_backbone - VIT_LAYERS: 12 -SOLVER: - VIT_LAYER_DECAY_RATE: 0.7 -DATASETS: - TRAIN: ("GRiT_coco2017_train", "vg_train") - TEST: ("coco_2017_test-dev",) -DATALOADER: - DATASET_RATIO: [1, 1] - DATASET_BS: 2 - DATASET_INPUT_SIZE: [1024, 1024] - DATASET_INPUT_SCALE: [[0.1, 2.0], [0.1, 2.0]] -OUTPUT_DIR: "./output/GRiT_B_DenseCap_ObjectDet" \ No newline at end of file diff --git a/grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml deleted file mode 100755 index e7a75052..00000000 --- a/grit_src_deprecated/configs/GRiT_B_ObjectDet.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "Base.yaml" -MODEL: - TRAIN_TASK: ["ObjectDet"] - TEST_TASK: "ObjectDet" - MASK_ON: True - ROI_HEADS: - SOFT_NMS_ENABLED: True - BEAM_SIZE: 3 - WEIGHTS: "detectron2://ImageNetPretrained/MAE/mae_pretrain_vit_base.pth" - BACKBONE: - NAME: build_vit_fpn_backbone - VIT_LAYERS: 12 -SOLVER: - VIT_LAYER_DECAY_RATE: 0.7 -DATASETS: - TRAIN: ("GRiT_coco2017_train",) - TEST: ("coco_2017_val",) -DATALOADER: - DATASET_BS: 2 -OUTPUT_DIR: "./output/GRiT_B_ObjectDet" \ No newline at end of file diff --git a/grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml 
deleted file mode 100755 index 000a1d46..00000000 --- a/grit_src_deprecated/configs/GRiT_H_ObjectDet.yaml +++ /dev/null @@ -1,21 +0,0 @@ -_BASE_: "Base.yaml" -MODEL: - TRAIN_TASK: ["ObjectDet"] - TEST_TASK: "ObjectDet" - MASK_ON: True - ROI_HEADS: - SOFT_NMS_ENABLED: True - BEAM_SIZE: 3 - WEIGHTS: "detectron2://ImageNetPretrained/MAE/mae_pretrain_vit_huge_p14to16.pth" - BACKBONE: - NAME: build_vit_fpn_backbone_huge - VIT_LAYERS: 32 -SOLVER: - MAX_ITER: 135000 - VIT_LAYER_DECAY_RATE: 0.9 -DATASETS: - TRAIN: ("GRiT_coco2017_train",) - TEST: ("coco_2017_val",) -DATALOADER: - DATASET_BS: 1 -OUTPUT_DIR: "./output/GRiT_H_ObjectDet" \ No newline at end of file diff --git a/grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml b/grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml deleted file mode 100755 index b6e3b97f..00000000 --- a/grit_src_deprecated/configs/GRiT_L_ObjectDet.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "Base.yaml" -MODEL: - TRAIN_TASK: ["ObjectDet"] - TEST_TASK: "ObjectDet" - MASK_ON: True - ROI_HEADS: - SOFT_NMS_ENABLED: True - BEAM_SIZE: 3 - WEIGHTS: "detectron2://ImageNetPretrained/MAE/mae_pretrain_vit_large.pth" - BACKBONE: - NAME: build_vit_fpn_backbone_large - VIT_LAYERS: 24 -SOLVER: - VIT_LAYER_DECAY_RATE: 0.8 -DATASETS: - TRAIN: ("GRiT_coco2017_train",) - TEST: ("coco_2017_val",) -DATALOADER: - DATASET_BS: 1 -OUTPUT_DIR: "./output/GRiT_L_ObjectDet" \ No newline at end of file diff --git a/grit_src_deprecated/grit/__init__.py b/grit_src_deprecated/grit/__init__.py deleted file mode 100755 index 81f24566..00000000 --- a/grit_src_deprecated/grit/__init__.py +++ /dev/null @@ -1,7 +0,0 @@ -from .modeling.meta_arch import grit -from .modeling.roi_heads import grit_roi_heads -from .modeling.backbone import vit - -from .data.datasets import object365 -from .data.datasets import vg -from .data.datasets import grit_coco \ No newline at end of file diff --git a/grit_src_deprecated/grit/config.py b/grit_src_deprecated/grit/config.py deleted file mode 100755 index fabe7f0f..00000000 --- a/grit_src_deprecated/grit/config.py +++ /dev/null @@ -1,50 +0,0 @@ -from detectron2.config import CfgNode as CN - - -def add_grit_config(cfg): - _C = cfg - - _C.MODEL.BEAM_SIZE = 1 - _C.MODEL.TRAIN_TASK = ["ObjectDet", "DenseCap"] - _C.MODEL.TEST_TASK = "DenseCap" # This can be varied if the model is jointly trained on multiple tasks - - _C.MODEL.ROI_BOX_HEAD.USE_BIAS = 0.0 # >= 0: not use - _C.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE = False - - _C.MODEL.ROI_HEADS.MASK_WEIGHT = 1.0 - _C.MODEL.ROI_HEADS.OBJECT_FEAT_POOLER_RES = 14 - _C.MODEL.ROI_HEADS.SOFT_NMS_ENABLED = False - - # Backbones - _C.MODEL.VIT_LAYERS = 12 - - # Text Decoder - _C.TEXT_DECODER = CN() - _C.TEXT_DECODER.VOCAB_SIZE = 30522 - _C.TEXT_DECODER.HIDDEN_SIZE = 768 - _C.TEXT_DECODER.NUM_LAYERS = 6 - _C.TEXT_DECODER.ATTENTION_HEADS = 12 - _C.TEXT_DECODER.FEEDFORWARD_SIZE = 768 * 4 - - # Multi-dataset dataloader - _C.DATALOADER.DATASET_RATIO = [1, 1] # sample ratio - _C.DATALOADER.DATASET_BS = 1 - _C.DATALOADER.DATASET_INPUT_SIZE = [1024, 1024] - _C.DATALOADER.DATASET_INPUT_SCALE = [(0.1, 2.0), (0.1, 2.0)] - _C.DATALOADER.DATASET_MIN_SIZES = [(640, 800), (640, 800)] - _C.DATALOADER.DATASET_MAX_SIZES = [1333, 1333] - - _C.SOLVER.USE_CUSTOM_SOLVER = True - _C.SOLVER.OPTIMIZER = 'ADAMW' - _C.SOLVER.VIT_LAYER_DECAY = True - _C.SOLVER.VIT_LAYER_DECAY_RATE = 0.7 - - _C.INPUT.CUSTOM_AUG = 'EfficientDetResizeCrop' - _C.INPUT.TRAIN_SIZE = 1024 - _C.INPUT.TEST_SIZE = 1024 - _C.INPUT.SCALE_RANGE = (0.1, 2.) 
- # 'default' for fixed short / long edge - _C.INPUT.TEST_INPUT_TYPE = 'default' - - _C.FIND_UNUSED_PARAM = True - _C.USE_ACT_CHECKPOINT = True \ No newline at end of file diff --git a/grit_src_deprecated/grit/custom_solver.py b/grit_src_deprecated/grit/custom_solver.py deleted file mode 100755 index 87f7d61e..00000000 --- a/grit_src_deprecated/grit/custom_solver.py +++ /dev/null @@ -1,88 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Modified by Jialian Wu from https://github.com/facebookresearch/Detic/blob/main/detic/custom_solver.py -import itertools -from typing import Any, Callable, Dict, Iterable, List, Set, Type, Union -import torch - -from detectron2.config import CfgNode - -from detectron2.solver.build import maybe_add_gradient_clipping - - -def build_custom_optimizer(cfg: CfgNode, model: torch.nn.Module) -> torch.optim.Optimizer: - params: List[Dict[str, Any]] = [] - memo: Set[torch.nn.parameter.Parameter] = set() - optimizer_type = cfg.SOLVER.OPTIMIZER - - for key, value in model.named_parameters(recurse=True): - if not value.requires_grad: - continue - # Avoid duplicating parameters - if value in memo: - continue - memo.add(value) - lr = cfg.SOLVER.BASE_LR - weight_decay = cfg.SOLVER.WEIGHT_DECAY - - if cfg.SOLVER.VIT_LAYER_DECAY: - lr = lr * get_vit_lr_decay_rate(key, cfg.SOLVER.VIT_LAYER_DECAY_RATE, cfg.MODEL.VIT_LAYERS) - - param = {"params": [value], "lr": lr} - if optimizer_type != 'ADAMW': - param['weight_decay'] = weight_decay - params += [param] - - def maybe_add_full_model_gradient_clipping(optim): # optim: the optimizer class - # detectron2 doesn't have full model gradient clipping now - clip_norm_val = cfg.SOLVER.CLIP_GRADIENTS.CLIP_VALUE - enable = ( - cfg.SOLVER.CLIP_GRADIENTS.ENABLED - and cfg.SOLVER.CLIP_GRADIENTS.CLIP_TYPE == "full_model" - and clip_norm_val > 0.0 - ) - - class FullModelGradientClippingOptimizer(optim): - def step(self, closure=None): - all_params = itertools.chain(*[x["params"] for x in self.param_groups]) - torch.nn.utils.clip_grad_norm_(all_params, clip_norm_val) - super().step(closure=closure) - - return FullModelGradientClippingOptimizer if enable else optim - - - if optimizer_type == 'SGD': - optimizer = maybe_add_full_model_gradient_clipping(torch.optim.SGD)( - params, cfg.SOLVER.BASE_LR, momentum=cfg.SOLVER.MOMENTUM, - nesterov=cfg.SOLVER.NESTEROV - ) - elif optimizer_type == 'ADAMW': - optimizer = maybe_add_full_model_gradient_clipping(torch.optim.AdamW)( - params, cfg.SOLVER.BASE_LR, - weight_decay=cfg.SOLVER.WEIGHT_DECAY - ) - else: - raise NotImplementedError(f"no optimizer type {optimizer_type}") - if not cfg.SOLVER.CLIP_GRADIENTS.CLIP_TYPE == "full_model": - optimizer = maybe_add_gradient_clipping(cfg, optimizer) - return optimizer - - -def get_vit_lr_decay_rate(name, lr_decay_rate=1.0, num_layers=12): - """ - Calculate lr decay rate for different ViT blocks. - Args: - name (string): parameter name. - lr_decay_rate (float): base lr decay rate. - num_layers (int): number of ViT blocks. - - Returns: - lr decay rate for the given parameter. - """ - layer_id = num_layers + 1 - if name.startswith("backbone"): - if ".pos_embed" in name or ".patch_embed" in name: - layer_id = 0 - elif ".blocks." in name and ".residual." 
not in name: - layer_id = int(name[name.find(".blocks.") :].split(".")[2]) + 1 - - return lr_decay_rate ** (num_layers + 1 - layer_id) \ No newline at end of file diff --git a/grit_src_deprecated/grit/data/custom_build_augmentation.py b/grit_src_deprecated/grit/data/custom_build_augmentation.py deleted file mode 100755 index 49a52d01..00000000 --- a/grit_src_deprecated/grit/data/custom_build_augmentation.py +++ /dev/null @@ -1,44 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.data import transforms as T -from .transforms.custom_augmentation_impl import EfficientDetResizeCrop - - -def build_custom_augmentation(cfg, is_train, scale=None, size=None, \ - min_size=None, max_size=None): - """ - Create a list of default :class:`Augmentation` from config. - Now it includes resizing and flipping. - - Returns: - list[Augmentation] - """ - if cfg.INPUT.CUSTOM_AUG == 'ResizeShortestEdge': - if is_train: - min_size = cfg.INPUT.MIN_SIZE_TRAIN if min_size is None else min_size - max_size = cfg.INPUT.MAX_SIZE_TRAIN if max_size is None else max_size - sample_style = cfg.INPUT.MIN_SIZE_TRAIN_SAMPLING - else: - min_size = cfg.INPUT.MIN_SIZE_TEST - max_size = cfg.INPUT.MAX_SIZE_TEST - sample_style = "choice" - augmentation = [T.ResizeShortestEdge(min_size, max_size, sample_style)] - elif cfg.INPUT.CUSTOM_AUG == 'EfficientDetResizeCrop': - if is_train: - scale = cfg.INPUT.SCALE_RANGE if scale is None else scale - size = cfg.INPUT.TRAIN_SIZE if size is None else size - else: - scale = (1, 1) - size = cfg.INPUT.TEST_SIZE - augmentation = [EfficientDetResizeCrop(size, scale)] - else: - assert 0, cfg.INPUT.CUSTOM_AUG - - if is_train: - augmentation.append(T.RandomFlip()) - return augmentation - - -build_custom_transform_gen = build_custom_augmentation -""" -Alias for backward-compatibility. -""" \ No newline at end of file diff --git a/grit_src_deprecated/grit/data/custom_dataset_dataloader.py b/grit_src_deprecated/grit/data/custom_dataset_dataloader.py deleted file mode 100755 index ea9c4172..00000000 --- a/grit_src_deprecated/grit/data/custom_dataset_dataloader.py +++ /dev/null @@ -1,250 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-# Modified by Jialian Wu from https://github.com/facebookresearch/Detic/blob/main/detic/data/custom_dataset_dataloader.py -import operator -import torch -import torch.utils.data -from detectron2.utils.comm import get_world_size - -from detectron2.config import configurable -from torch.utils.data.sampler import BatchSampler, Sampler -from detectron2.data.common import DatasetFromList, MapDataset -from detectron2.data.dataset_mapper import DatasetMapper -from detectron2.data.build import get_detection_dataset_dicts, build_batch_data_loader -from detectron2.data.samplers import TrainingSampler -from detectron2.data.build import worker_init_reset_seed, print_instances_class_histogram -from detectron2.data.build import filter_images_with_only_crowd_annotations -from detectron2.data.build import filter_images_with_few_keypoints -from detectron2.data.build import check_metadata_consistency -from detectron2.data.catalog import MetadataCatalog, DatasetCatalog -from detectron2.utils import comm -import itertools -from typing import Optional - - -def _custom_train_loader_from_config(cfg, mapper=None, *, dataset=None, sampler=None): - sampler_name = cfg.DATALOADER.SAMPLER_TRAIN - if 'MultiDataset' in sampler_name: - dataset_dicts = get_detection_dataset_dicts_with_source( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - else: - dataset_dicts = get_detection_dataset_dicts( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - - if mapper is None: - mapper = DatasetMapper(cfg, True) - - if sampler is not None: - pass - elif sampler_name == "TrainingSampler": - sampler = TrainingSampler(len(dataset)) - elif sampler_name == "MultiDatasetSampler": - sampler = MultiDatasetSampler( - dataset_dicts, - dataset_ratio=cfg.DATALOADER.DATASET_RATIO, - ) - else: - raise ValueError("Unknown training sampler: {}".format(sampler_name)) - - return { - "dataset": dataset_dicts, - "sampler": sampler, - "mapper": mapper, - "total_batch_size": cfg.SOLVER.IMS_PER_BATCH, - "num_workers": cfg.DATALOADER.NUM_WORKERS, - 'dataset_bs': cfg.DATALOADER.DATASET_BS, - 'num_datasets': len(cfg.DATASETS.TRAIN) - } - - -@configurable(from_config=_custom_train_loader_from_config) -def build_custom_train_loader( - dataset, *, mapper, sampler, - total_batch_size=16, - num_workers=0, - num_datasets=1, - dataset_bs=1 -): - - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False) - if mapper is not None: - dataset = MapDataset(dataset, mapper) - if sampler is None: - sampler = TrainingSampler(len(dataset)) - assert isinstance(sampler, torch.utils.data.sampler.Sampler) - - return build_dataset_batch_data_loader( - dataset_bs, - dataset, - sampler, - total_batch_size, - num_datasets=num_datasets, - num_workers=num_workers, - ) - - -def build_dataset_batch_data_loader( - dataset_bs, dataset, sampler, total_batch_size, num_datasets, num_workers=0 -): - - world_size = get_world_size() - assert ( - total_batch_size > 0 and total_batch_size % world_size == 0 - ), "Total batch size ({}) must be divisible by the number of gpus ({}).".format( - total_batch_size, world_size - ) - - 
data_loader = torch.utils.data.DataLoader( - dataset, - sampler=sampler, - num_workers=num_workers, - batch_sampler=None, - collate_fn=operator.itemgetter(0), # don't batch, but yield individual elements - worker_init_fn=worker_init_reset_seed, - ) - - if num_datasets > 1: - return MultiDatasets(data_loader, dataset_bs, num_datasets) - else: - return SingleDataset(data_loader, dataset_bs) - - -def get_detection_dataset_dicts_with_source( - dataset_names, filter_empty=True, min_keypoints=0, proposal_files=None -): - assert len(dataset_names) - dataset_dicts = [DatasetCatalog.get(dataset_name) for dataset_name in dataset_names] - for dataset_name, dicts in zip(dataset_names, dataset_dicts): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - - for source_id, (dataset_name, dicts) in \ - enumerate(zip(dataset_names, dataset_dicts)): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - for d in dicts: - d['dataset_source'] = source_id - - if "annotations" in dicts[0]: - try: - class_names = MetadataCatalog.get(dataset_name).thing_classes - check_metadata_consistency("thing_classes", dataset_name) - print_instances_class_histogram(dicts, class_names) - except AttributeError: # class names are not available for this dataset - pass - - assert proposal_files is None - - dataset_dicts = list(itertools.chain.from_iterable(dataset_dicts)) - - has_instances = "annotations" in dataset_dicts[0] - if filter_empty and has_instances: - dataset_dicts = filter_images_with_only_crowd_annotations(dataset_dicts) - if min_keypoints > 0 and has_instances: - dataset_dicts = filter_images_with_few_keypoints(dataset_dicts, min_keypoints) - - return dataset_dicts - - -class MultiDatasetSampler(Sampler): - def __init__( - self, - dataset_dicts, - dataset_ratio, - seed: Optional[int] = None, - ): - sizes = [0 for _ in range(len(dataset_ratio))] - for d in dataset_dicts: - sizes[d['dataset_source']] += 1 - print('dataset sizes', sizes) - self.sizes = sizes - assert len(dataset_ratio) == len(sizes), \ - 'length of dataset ratio {} should be equal to number if dataset {}'.format( - len(dataset_ratio), len(sizes) - ) - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - self.dataset_ids = torch.tensor( - [d['dataset_source'] for d in dataset_dicts], dtype=torch.long) - self.dataset_ratio = dataset_ratio - - dataset_weight = [torch.ones(s) * max(sizes) / s * r / sum(dataset_ratio) \ - for i, (r, s) in enumerate(zip(dataset_ratio, sizes))] - dataset_weight = torch.cat(dataset_weight) - - self.weights = dataset_weight - self.sample_epoch_size = len(self.weights) - - def __iter__(self): - start = self._rank - yield from itertools.islice( - self._infinite_indices(), start, None, self._world_size) - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - if len(self.dataset_ratio) > 1: - # multiple datasets - ids = torch.multinomial( - self.weights, self.sample_epoch_size, generator=g, - replacement=True) - nums = [(self.dataset_ids[ids] == i).sum().int().item() \ - for i in range(len(self.sizes))] - yield from ids - else: - # single dataset - yield from torch.randperm(self.sizes[0], generator=g).tolist() - - -class SingleDataset(torch.utils.data.IterableDataset): - def __init__(self, dataset, batch_sizes): - self.dataset = dataset - self.batch_sizes = batch_sizes - self._buckets = [[] for _ in range(2)] - - def __iter__(self): - for d in self.dataset: 
- w, h = d["width"], d["height"] - aspect_ratio_bucket_id = 0 if w > h else 1 - bucket_id = aspect_ratio_bucket_id - bucket = self._buckets[bucket_id] - bucket.append(d) - if len(bucket) == self.batch_sizes: - yield bucket[:] - del bucket[:] - - -class MultiDatasets(torch.utils.data.IterableDataset): - def __init__(self, dataset, batch_sizes, num_datasets): - self.dataset = dataset - self.batch_sizes = batch_sizes - self._buckets = [[] for _ in range(2 * num_datasets)] - self.iter_idx = 0 - self.num_datasets = num_datasets - - def __iter__(self): - for d in self.dataset: - w, h = d["width"], d["height"] - aspect_ratio_bucket_id = 0 if w > h else 1 - bucket_id = d['dataset_source'] * 2 + aspect_ratio_bucket_id - bucket = self._buckets[bucket_id] - if len(bucket) < self.batch_sizes: - bucket.append(d) - selected_dataset = self.iter_idx % self.num_datasets - if len(bucket) == self.batch_sizes and selected_dataset == d['dataset_source']: - self.iter_idx += 1 - yield bucket[:] - del bucket[:] \ No newline at end of file diff --git a/grit_src_deprecated/grit/data/custom_dataset_mapper.py b/grit_src_deprecated/grit/data/custom_dataset_mapper.py deleted file mode 100755 index 1e21edb3..00000000 --- a/grit_src_deprecated/grit/data/custom_dataset_mapper.py +++ /dev/null @@ -1,149 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Modified by Jialian Wu from https://github.com/facebookresearch/Detic/blob/main/detic/data/custom_dataset_mapper.py -import copy -import numpy as np -import torch - -from detectron2.config import configurable - -from detectron2.data import detection_utils as utils -from detectron2.data import transforms as T -from detectron2.data.dataset_mapper import DatasetMapper -from .custom_build_augmentation import build_custom_augmentation -from itertools import compress -import logging - -__all__ = ["CustomDatasetMapper", "ObjDescription"] -logger = logging.getLogger(__name__) - - -class CustomDatasetMapper(DatasetMapper): - @configurable - def __init__(self, is_train: bool, - dataset_augs=[], - **kwargs): - if is_train: - self.dataset_augs = [T.AugmentationList(x) for x in dataset_augs] - super().__init__(is_train, **kwargs) - - @classmethod - def from_config(cls, cfg, is_train: bool = True): - ret = super().from_config(cfg, is_train) - if is_train: - if cfg.INPUT.CUSTOM_AUG == 'EfficientDetResizeCrop': - dataset_scales = cfg.DATALOADER.DATASET_INPUT_SCALE - dataset_sizes = cfg.DATALOADER.DATASET_INPUT_SIZE - ret['dataset_augs'] = [ - build_custom_augmentation(cfg, True, scale, size) \ - for scale, size in zip(dataset_scales, dataset_sizes)] - else: - assert cfg.INPUT.CUSTOM_AUG == 'ResizeShortestEdge' - min_sizes = cfg.DATALOADER.DATASET_MIN_SIZES - max_sizes = cfg.DATALOADER.DATASET_MAX_SIZES - ret['dataset_augs'] = [ - build_custom_augmentation( - cfg, True, min_size=mi, max_size=ma) \ - for mi, ma in zip(min_sizes, max_sizes)] - else: - ret['dataset_augs'] = [] - - return ret - - def __call__(self, dataset_dict): - dataset_dict_out = self.prepare_data(dataset_dict) - - # When augmented image is too small, do re-augmentation - retry = 0 - while (dataset_dict_out["image"].shape[1] < 32 or dataset_dict_out["image"].shape[2] < 32): - retry += 1 - if retry == 100: - logger.info('Retry 100 times for augmentation. 
Make sure the image size is not too small.') - logger.info('Find image information below') - logger.info(dataset_dict) - dataset_dict_out = self.prepare_data(dataset_dict) - - return dataset_dict_out - - def prepare_data(self, dataset_dict_in): - dataset_dict = copy.deepcopy(dataset_dict_in) - if 'file_name' in dataset_dict: - ori_image = utils.read_image( - dataset_dict["file_name"], format=self.image_format) - else: - ori_image, _, _ = self.tar_dataset[dataset_dict["tar_index"]] - ori_image = utils._apply_exif_orientation(ori_image) - ori_image = utils.convert_PIL_to_numpy(ori_image, self.image_format) - utils.check_image_size(dataset_dict, ori_image) - - aug_input = T.AugInput(copy.deepcopy(ori_image), sem_seg=None) - if self.is_train: - transforms = \ - self.dataset_augs[dataset_dict['dataset_source']](aug_input) - else: - transforms = self.augmentations(aug_input) - image, sem_seg_gt = aug_input.image, aug_input.sem_seg - - image_shape = image.shape[:2] - dataset_dict["image"] = torch.as_tensor( - np.ascontiguousarray(image.transpose(2, 0, 1))) - - if not self.is_train: - # USER: Modify this if you want to keep them for some reason. - dataset_dict.pop("annotations", None) - return dataset_dict - - if "annotations" in dataset_dict: - if len(dataset_dict["annotations"]) > 0: - object_descriptions = [an['object_description'] for an in dataset_dict["annotations"]] - else: - object_descriptions = [] - # USER: Modify this if you want to keep them for some reason. - for anno in dataset_dict["annotations"]: - if not self.use_instance_mask: - anno.pop("segmentation", None) - if not self.use_keypoint: - anno.pop("keypoints", None) - - all_annos = [ - (utils.transform_instance_annotations( - obj, transforms, image_shape, - keypoint_hflip_indices=self.keypoint_hflip_indices, - ), obj.get("iscrowd", 0)) - for obj in dataset_dict.pop("annotations") - ] - annos = [ann[0] for ann in all_annos if ann[1] == 0] - instances = utils.annotations_to_instances( - annos, image_shape, mask_format=self.instance_mask_format - ) - - instances.gt_object_descriptions = ObjDescription(object_descriptions) - - del all_annos - if self.recompute_boxes: - instances.gt_boxes = instances.gt_masks.get_bounding_boxes() - dataset_dict["instances"] = utils.filter_empty_instances(instances) - - return dataset_dict - - -class ObjDescription: - def __init__(self, object_descriptions): - self.data = object_descriptions - - def __getitem__(self, item): - assert type(item) == torch.Tensor - assert item.dim() == 1 - if len(item) > 0: - assert item.dtype == torch.int64 or item.dtype == torch.bool - if item.dtype == torch.int64: - return ObjDescription([self.data[x.item()] for x in item]) - elif item.dtype == torch.bool: - return ObjDescription(list(compress(self.data, item))) - - return ObjDescription(list(compress(self.data, item))) - - def __len__(self): - return len(self.data) - - def __repr__(self): - return "ObjDescription({})".format(self.data) \ No newline at end of file diff --git a/grit_src_deprecated/grit/data/datasets/grit_coco.py b/grit_src_deprecated/grit/data/datasets/grit_coco.py deleted file mode 100755 index fea81f7d..00000000 --- a/grit_src_deprecated/grit/data/datasets/grit_coco.py +++ /dev/null @@ -1,112 +0,0 @@ -import logging -import os -from fvcore.common.timer import Timer -from detectron2.structures import BoxMode -from fvcore.common.file_io import PathManager -from detectron2.data import DatasetCatalog, MetadataCatalog -from lvis import LVIS - -logger = logging.getLogger(__name__) - -__all__ = 
["load_GRiTcoco_json", "register_GRiTcoco_instances"] - - -def register_GRiTcoco_instances(name, metadata, json_file, image_root): - """ - """ - DatasetCatalog.register(name, lambda: load_GRiTcoco_json( - json_file, image_root, name)) - MetadataCatalog.get(name).set( - json_file=json_file, image_root=image_root, - evaluator_type="coco", **metadata - ) - - -def get_GRiTcoco_meta(): - categories = [{'supercategory': 'object', 'id': 1, 'name': 'object'}] - categories = sorted(categories, key=lambda x: x["id"]) - thing_classes = [k["name"] for k in categories] - meta = {"thing_classes": thing_classes} - return meta - - -def load_GRiTcoco_json(json_file, image_root, dataset_name=None): - ''' - Load COCO class name text for object description for GRiT - ''' - - json_file = PathManager.get_local_path(json_file) - - timer = Timer() - lvis_api = LVIS(json_file) - if timer.seconds() > 1: - logger.info("Loading {} takes {:.2f} seconds.".format( - json_file, timer.seconds())) - - class_names = {} - sort_cat = sorted(lvis_api.dataset['categories'], key=lambda x: x['id']) - for x in sort_cat: - class_names[x['id']] = x['name'] - - img_ids = sorted(lvis_api.imgs.keys()) - imgs = lvis_api.load_imgs(img_ids) - anns = [lvis_api.img_ann_map[img_id] for img_id in img_ids] - - ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image] - assert len(set(ann_ids)) == len(ann_ids), \ - "Annotation ids in '{}' are not unique".format(json_file) - - imgs_anns = list(zip(imgs, anns)) - logger.info("Loaded {} images in the LVIS v1 format from {}".format( - len(imgs_anns), json_file)) - - dataset_dicts = [] - - for (img_dict, anno_dict_list) in imgs_anns: - record = {} - if "file_name" in img_dict: - file_name = img_dict["file_name"] - record["file_name"] = os.path.join(image_root, file_name) - - record["height"] = int(img_dict["height"]) - record["width"] = int(img_dict["width"]) - image_id = record["image_id"] = img_dict["id"] - - objs = [] - for anno in anno_dict_list: - assert anno["image_id"] == image_id - if anno.get('iscrowd', 0) > 0: - continue - obj = {"bbox": anno["bbox"], "bbox_mode": BoxMode.XYWH_ABS} - obj["category_id"] = 0 - obj["object_description"] = class_names[anno['category_id']] - if 'segmentation' in anno: - segm = anno["segmentation"] - valid_segm = [poly for poly in segm \ - if len(poly) % 2 == 0 and len(poly) >= 6] - if not len(segm) == len(valid_segm): - print('Annotation contains an invalid polygon with < 3 points') - assert len(segm) > 0 - obj["segmentation"] = segm - objs.append(obj) - record["annotations"] = objs - if len(record["annotations"]) == 0: - continue - record["task"] = "ObjectDet" - dataset_dicts.append(record) - - return dataset_dicts - - -_CUSTOM_SPLITS_LVIS = { - "GRiT_coco2017_train": ("coco/train2017/", "coco/annotations/instances_train2017.json"), -} - - -for key, (image_root, json_file) in _CUSTOM_SPLITS_LVIS.items(): - register_GRiTcoco_instances( - key, - get_GRiTcoco_meta(), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) \ No newline at end of file diff --git a/grit_src_deprecated/grit/data/datasets/object365.py b/grit_src_deprecated/grit/data/datasets/object365.py deleted file mode 100755 index 8b8cc19d..00000000 --- a/grit_src_deprecated/grit/data/datasets/object365.py +++ /dev/null @@ -1,111 +0,0 @@ -import logging -import os -from fvcore.common.timer import Timer -from detectron2.structures import BoxMode -from fvcore.common.file_io import PathManager -from detectron2.data 
diff --git a/grit_src_deprecated/grit/data/datasets/object365.py b/grit_src_deprecated/grit/data/datasets/object365.py
deleted file mode 100755
index 8b8cc19d..00000000
--- a/grit_src_deprecated/grit/data/datasets/object365.py
+++ /dev/null
@@ -1,111 +0,0 @@
-import logging
-import os
-from fvcore.common.timer import Timer
-from detectron2.structures import BoxMode
-from fvcore.common.file_io import PathManager
-from detectron2.data import DatasetCatalog, MetadataCatalog
-from lvis import LVIS
-
-logger = logging.getLogger(__name__)
-
-__all__ = ["load_o365_json", "register_o365_instances"]
-
-
-def register_o365_instances(name, metadata, json_file, image_root):
-    DatasetCatalog.register(name, lambda: load_o365_json(
-        json_file, image_root, name))
-    MetadataCatalog.get(name).set(
-        json_file=json_file, image_root=image_root,
-        evaluator_type="lvis", **metadata
-    )
-
-
-def get_o365_meta():
-    categories = [{'supercategory': 'object', 'id': 1, 'name': 'object'}]
-    o365_categories = sorted(categories, key=lambda x: x["id"])
-    thing_classes = [k["name"] for k in o365_categories]
-    meta = {"thing_classes": thing_classes}
-    return meta
-
-
-def load_o365_json(json_file, image_root, dataset_name=None):
-    '''
-    Load Object365 class name text for object description for GRiT
-    '''
-
-    json_file = PathManager.get_local_path(json_file)
-
-    timer = Timer()
-    lvis_api = LVIS(json_file)
-    if timer.seconds() > 1:
-        logger.info("Loading {} takes {:.2f} seconds.".format(
-            json_file, timer.seconds()))
-
-    class_names = {}
-    sort_cat = sorted(lvis_api.dataset['categories'], key=lambda x: x['id'])
-    for x in sort_cat:
-        if '/' in x['name']:
-            text = ''
-            for xx in x['name'].split('/'):
-                text += xx
-                text += ' '
-            text = text[:-1]
-        else:
-            text = x['name']
-        class_names[x['id']] = text
-
-    img_ids = sorted(lvis_api.imgs.keys())
-    imgs = lvis_api.load_imgs(img_ids)
-    anns = [lvis_api.img_ann_map[img_id] for img_id in img_ids]
-
-    ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image]
-    assert len(set(ann_ids)) == len(ann_ids), \
-        "Annotation ids in '{}' are not unique".format(json_file)
-
-    imgs_anns = list(zip(imgs, anns))
-    logger.info("Loaded {} images in the LVIS v1 format from {}".format(
-        len(imgs_anns), json_file))
-
-    dataset_dicts = []
-
-    for (img_dict, anno_dict_list) in imgs_anns:
-        record = {}
-        if "file_name" in img_dict:
-            file_name = img_dict["file_name"]
-            record["file_name"] = os.path.join(image_root, file_name)
-
-        record["height"] = int(img_dict["height"])
-        record["width"] = int(img_dict["width"])
-        image_id = record["image_id"] = img_dict["id"]
-
-        objs = []
-        for anno in anno_dict_list:
-            assert anno["image_id"] == image_id
-            if anno.get('iscrowd', 0) > 0:
-                continue
-            obj = {"bbox": anno["bbox"], "bbox_mode": BoxMode.XYWH_ABS}
-            obj["category_id"] = 0
-            obj["object_description"] = class_names[anno['category_id']]
-
-            objs.append(obj)
-        record["annotations"] = objs
-        if len(record["annotations"]) == 0:
-            continue
-        record["task"] = "ObjectDet"
-        dataset_dicts.append(record)
-
-    return dataset_dicts
-
-
-_CUSTOM_SPLITS_LVIS = {
-    "object365_train": ("object365/images/train/", "object365/annotations/train_v1.json"),
-}
-
-
-for key, (image_root, json_file) in _CUSTOM_SPLITS_LVIS.items():
-    register_o365_instances(
-        key,
-        get_o365_meta(),
-        os.path.join("datasets", json_file) if "://" not in json_file else json_file,
-        os.path.join("datasets", image_root),
-    )
\ No newline at end of file
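`load_o365_json` flattens multi-part Object365 category names (e.g. a hypothetical `"sneakers/trainers"`) into space-separated text before using them as object descriptions. The character-accumulating loop in the deleted file is equivalent to a one-line split/join; a standalone check:

```
def normalize_name(name: str) -> str:
    # Equivalent to the deleted loop: replace '/' separators with spaces.
    return ' '.join(name.split('/'))

assert normalize_name("sneakers/trainers") == "sneakers trainers"
assert normalize_name("person") == "person"
```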
diff --git a/grit_src_deprecated/grit/data/datasets/vg.py b/grit_src_deprecated/grit/data/datasets/vg.py
deleted file mode 100755
index 4d47a80d..00000000
--- a/grit_src_deprecated/grit/data/datasets/vg.py
+++ /dev/null
@@ -1,98 +0,0 @@
-import logging
-import os
-from fvcore.common.timer import Timer
-from detectron2.structures import BoxMode
-from fvcore.common.file_io import PathManager
-from detectron2.data import DatasetCatalog, MetadataCatalog
-from lvis import LVIS
-
-logger = logging.getLogger(__name__)
-
-__all__ = ["load_vg_json", "register_vg_instances"]
-
-
-def register_vg_instances(name, metadata, json_file, image_root):
-    """
-    """
-    DatasetCatalog.register(name, lambda: load_vg_json(
-        json_file, image_root, name))
-    MetadataCatalog.get(name).set(
-        json_file=json_file, image_root=image_root,
-        evaluator_type="vg", **metadata
-    )
-
-
-def get_vg_meta():
-    categories = [{'supercategory': 'object', 'id': 1, 'name': 'object'}]
-    vg_categories = sorted(categories, key=lambda x: x["id"])
-    thing_classes = [k["name"] for k in vg_categories]
-    meta = {"thing_classes": thing_classes}
-    return meta
-
-
-def load_vg_json(json_file, image_root, dataset_name=None):
-
-    json_file = PathManager.get_local_path(json_file)
-
-    timer = Timer()
-    lvis_api = LVIS(json_file)
-    if timer.seconds() > 1:
-        logger.info("Loading {} takes {:.2f} seconds.".format(
-            json_file, timer.seconds()))
-
-    img_ids = sorted(lvis_api.imgs.keys())
-    imgs = lvis_api.load_imgs(img_ids)
-    anns = [lvis_api.img_ann_map[img_id] for img_id in img_ids]
-
-    ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image]
-    assert len(set(ann_ids)) == len(ann_ids), \
-        "Annotation ids in '{}' are not unique".format(json_file)
-
-    imgs_anns = list(zip(imgs, anns))
-    logger.info("Loaded {} images in the LVIS v1 format from {}".format(
-        len(imgs_anns), json_file))
-
-    dataset_dicts = []
-
-    for (img_dict, anno_dict_list) in imgs_anns:
-        record = {}
-        if "file_name" in img_dict:
-            file_name = img_dict["file_name"]
-            record["file_name"] = os.path.join(image_root, file_name)
-
-        record["height"] = int(img_dict["height"])
-        record["width"] = int(img_dict["width"])
-        image_id = record["image_id"] = img_dict["id"]
-
-        objs = []
-        for anno in anno_dict_list:
-            assert anno["image_id"] == image_id
-            if anno.get('iscrowd', 0) > 0:
-                continue
-            obj = {"bbox": anno["bbox"], "bbox_mode": BoxMode.XYWH_ABS}
-            obj["category_id"] = 0
-            obj["object_description"] = anno["caption"]
-
-            objs.append(obj)
-        record["annotations"] = objs
-        if len(record["annotations"]) == 0:
-            continue
-        record["task"] = "DenseCap"
-        dataset_dicts.append(record)
-
-    return dataset_dicts
-
-
-_CUSTOM_SPLITS_LVIS = {
-    "vg_train": ("vg/images", "vg/annotations/train.json"),
-    "vg_test": ("vg/images", "vg/annotations/test.json"),
-}
-
-
-for key, (image_root, json_file) in _CUSTOM_SPLITS_LVIS.items():
-    register_vg_instances(
-        key,
-        get_vg_meta(),
-        os.path.join("datasets", json_file) if "://" not in json_file else json_file,
-        os.path.join("datasets", image_root),
-    )
\ No newline at end of file
diff --git a/grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py b/grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py
deleted file mode 100755
index 6b9637f3..00000000
--- a/grit_src_deprecated/grit/data/transforms/custom_augmentation_impl.py
+++ /dev/null
@@ -1,52 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-# Part of the code is from https://github.com/rwightman/efficientdet-pytorch/blob/master/effdet/data/transforms.py
-# Modified by Xingyi Zhou
-# The original code is under Apache-2.0 License
-import numpy as np
-from PIL import Image
-
-from detectron2.data.transforms.augmentation import Augmentation
-from .custom_transform import EfficientDetResizeCropTransform
-
-__all__ = [
-    "EfficientDetResizeCrop",
-]
-
-
-class EfficientDetResizeCrop(Augmentation):
-    """
-    Scale the shorter edge to the given size, with a limit of `max_size` on the longer edge.
-    If `max_size` is reached, then downscale so that the longer edge does not exceed max_size.
-    """
-
-    def __init__(
-        self, size, scale, interp=Image.BILINEAR
-    ):
-        """
-        """
-        super().__init__()
-        self.target_size = (size, size)
-        self.scale = scale
-        self.interp = interp
-
-    def get_transform(self, img):
-        # Select a random scale factor.
-        scale_factor = np.random.uniform(*self.scale)
-        scaled_target_height = scale_factor * self.target_size[0]
-        scaled_target_width = scale_factor * self.target_size[1]
-        # Recompute the accurate scale_factor using rounded scaled image size.
-        width, height = img.shape[1], img.shape[0]
-        img_scale_y = scaled_target_height / height
-        img_scale_x = scaled_target_width / width
-        img_scale = min(img_scale_y, img_scale_x)
-
-        # Select non-zero random offset (x, y) if scaled image is larger than target size
-        scaled_h = int(height * img_scale)
-        scaled_w = int(width * img_scale)
-        offset_y = scaled_h - self.target_size[0]
-        offset_x = scaled_w - self.target_size[1]
-        offset_y = int(max(0.0, float(offset_y)) * np.random.uniform(0, 1))
-        offset_x = int(max(0.0, float(offset_x)) * np.random.uniform(0, 1))
-        return EfficientDetResizeCropTransform(
-            scaled_h, scaled_w, offset_y, offset_x, img_scale, self.target_size, self.interp)
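The scale-and-offset arithmetic in `get_transform` above is easier to follow with concrete numbers. A standalone sketch of the same math with the random draws pinned (the target size and input resolution below are illustrative, not values this repo uses):

```
# Reproduce get_transform()'s arithmetic for a square target of 896,
# an input of 1280x720 (WxH), and scale_factor fixed at 1.0.
target_h = target_w = 896
height, width = 720, 1280

img_scale = min(target_h / height, target_w / width)   # min(1.244, 0.7) = 0.7
scaled_h, scaled_w = int(height * img_scale), int(width * img_scale)  # 504, 896

# Offsets can only be positive along a dimension that overflows the target;
# here scaled_h < target_h and scaled_w == target_w, so both stay 0.
offset_y = max(0, scaled_h - target_h)  # 0
offset_x = max(0, scaled_w - target_w)  # 0
print(img_scale, (scaled_h, scaled_w), (offset_y, offset_x))
```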
- """ - # TODO decide on PIL vs opencv - super().__init__() - if interp is None: - interp = Image.BILINEAR - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - assert len(img.shape) <= 4 - - if img.dtype == np.uint8: - pil_image = Image.fromarray(img) - interp_method = interp if interp is not None else self.interp - pil_image = pil_image.resize((self.scaled_w, self.scaled_h), interp_method) - ret = np.asarray(pil_image) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - else: - # PIL only supports uint8 - img = torch.from_numpy(img) - shape = list(img.shape) - shape_4d = shape[:2] + [1] * (4 - len(shape)) + shape[2:] - img = img.view(shape_4d).permute(2, 3, 0, 1) # hw(c) -> nchw - _PIL_RESIZE_TO_INTERPOLATE_MODE = {Image.BILINEAR: "bilinear", Image.BICUBIC: "bicubic"} - mode = _PIL_RESIZE_TO_INTERPOLATE_MODE[self.interp] - img = F.interpolate(img, (self.scaled_h, self.scaled_w), mode=mode, align_corners=False) - shape[:2] = (self.scaled_h, self.scaled_w) - ret = img.permute(2, 3, 0, 1).view(shape).numpy() # nchw -> hw(c) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - return ret - - - def apply_coords(self, coords): - coords[:, 0] = coords[:, 0] * self.img_scale - coords[:, 1] = coords[:, 1] * self.img_scale - coords[:, 0] -= self.offset_x - coords[:, 1] -= self.offset_y - return coords - - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - - def inverse(self): - raise NotImplementedError - - - def inverse_apply_coords(self, coords): - coords[:, 0] += self.offset_x - coords[:, 1] += self.offset_y - coords[:, 0] = coords[:, 0] / self.img_scale - coords[:, 1] = coords[:, 1] / self.img_scale - return coords - - - def inverse_apply_box(self, box: np.ndarray) -> np.ndarray: - """ - """ - idxs = np.array([(0, 1), (2, 1), (0, 3), (2, 3)]).flatten() - coords = np.asarray(box).reshape(-1, 4)[:, idxs].reshape(-1, 2) - coords = self.inverse_apply_coords(coords).reshape((-1, 4, 2)) - minxy = coords.min(axis=1) - maxxy = coords.max(axis=1) - trans_boxes = np.concatenate((minxy, maxxy), axis=1) - return trans_boxes \ No newline at end of file diff --git a/grit_src_deprecated/grit/evaluation/eval.py b/grit_src_deprecated/grit/evaluation/eval.py deleted file mode 100755 index 951a0920..00000000 --- a/grit_src_deprecated/grit/evaluation/eval.py +++ /dev/null @@ -1,156 +0,0 @@ -import itertools -import json -import os -from detectron2.structures import Boxes, BoxMode, pairwise_iou -from detectron2.utils.file_io import PathManager -import numpy as np -import pycocotools.mask as mask_util -from detectron2.evaluation.coco_evaluation import COCOEvaluator -from detectron2.evaluation.coco_evaluation import _evaluate_predictions_on_coco - - -class GRiTCOCOEvaluator(COCOEvaluator): - def process(self, inputs, outputs): - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - prediction["instances"] = 
diff --git a/grit_src_deprecated/grit/evaluation/eval.py b/grit_src_deprecated/grit/evaluation/eval.py
deleted file mode 100755
index 951a0920..00000000
--- a/grit_src_deprecated/grit/evaluation/eval.py
+++ /dev/null
@@ -1,156 +0,0 @@
-import itertools
-import json
-import os
-from detectron2.structures import Boxes, BoxMode, pairwise_iou
-from detectron2.utils.file_io import PathManager
-import numpy as np
-import pycocotools.mask as mask_util
-from detectron2.evaluation.coco_evaluation import COCOEvaluator
-from detectron2.evaluation.coco_evaluation import _evaluate_predictions_on_coco
-
-
-class GRiTCOCOEvaluator(COCOEvaluator):
-    def process(self, inputs, outputs):
-        for input, output in zip(inputs, outputs):
-            prediction = {"image_id": input["image_id"]}
-
-            if "instances" in output:
-                instances = output["instances"].to(self._cpu_device)
-                prediction["instances"] = instances_to_coco_json(instances, input["image_id"])
-
-            if len(prediction) > 1:
-                self._predictions.append(prediction)
-
-    def _eval_predictions(self, predictions, img_ids=None):
-        self._logger.info("Preparing results for COCO format ...")
-        coco_results = list(itertools.chain(*[x["instances"] for x in predictions]))
-        tasks = self._tasks or self._tasks_from_predictions(coco_results)
-
-        if self._output_dir:
-            file_path = os.path.join(self._output_dir, "coco_instances_results.json")
-            self._logger.info("Saving results to {}".format(file_path))
-            with PathManager.open(file_path, "w") as f:
-                f.write(json.dumps(coco_results))
-                f.flush()
-
-        if not self._do_evaluation:
-            self._logger.info("Annotations are not available for evaluation.")
-            return
-
-        self._logger.info(
-            "Evaluating predictions with {} COCO API...".format(
-                "unofficial" if self._use_fast_impl else "official"
-            )
-        )
-
-        coco_results = self.convert_classname_to_id(coco_results)
-
-        for task in sorted(tasks):
-            assert task in {"bbox", "segm", "keypoints"}, f"Got unknown task: {task}!"
-            coco_eval = (
-                _evaluate_predictions_on_coco(
-                    self._coco_api,
-                    coco_results,
-                    task,
-                    kpt_oks_sigmas=self._kpt_oks_sigmas,
-                    use_fast_impl=self._use_fast_impl,
-                    img_ids=img_ids,
-                    max_dets_per_image=self._max_dets_per_image,
-                )
-                if len(coco_results) > 0
-                else None  # cocoapi does not handle empty results very well
-            )
-
-            res = self._derive_coco_results(
-                coco_eval, task, class_names=self._metadata.get("thing_classes")
-            )
-            self._results[task] = res
-
-    def convert_classname_to_id(self, results):
-        outputs = []
-        class_name_to_id = {}
-        categories = sorted(self._coco_api.dataset['categories'], key=lambda x: x['id'])
-
-        for cat in categories:
-            class_name_to_id[cat['name']] = cat['id']
-
-        for pred in results:
-            if pred['object_descriptions'] in class_name_to_id:
-                pred['category_id'] = class_name_to_id[pred['object_descriptions']]
-                del pred['object_descriptions']
-                outputs.append(pred)
-
-        return outputs
-
-
-class GRiTVGEvaluator(COCOEvaluator):
-    def process(self, inputs, outputs):
-        for input, output in zip(inputs, outputs):
-            assert input["image_id"] == int(input['file_name'].split('/')[-1].split('.')[0])
-            prediction = {"image_id": input["image_id"]}
-
-            if "instances" in output:
-                instances = output["instances"].to(self._cpu_device)
-                prediction["instances"] = instances_to_coco_json(instances, input["image_id"], output_logits=True)
-                h = input['height']
-                w = input['width']
-                scale = 720.0 / max(h, w)
-                scaled_inst = []
-                for inst in prediction["instances"]:
-                    inst['bbox'][0] = inst['bbox'][0] * scale
-                    inst['bbox'][1] = inst['bbox'][1] * scale
-                    inst['bbox'][2] = inst['bbox'][2] * scale
-                    inst['bbox'][3] = inst['bbox'][3] * scale
-                    scaled_inst.append(inst)
-                if len(scaled_inst) > 0:
-                    prediction["instances"] = scaled_inst
-            if len(prediction) > 1:
-                self._predictions.append(prediction)
-
-    def _eval_predictions(self, predictions, img_ids=None):
-        '''
-        This is only for saving the results to json file
-        '''
-        self._logger.info("Preparing results for COCO format ...")
-        coco_results = list(itertools.chain(*[x["instances"] for x in predictions]))
-
-        if self._output_dir:
-            file_path = os.path.join(self._output_dir, "vg_instances_results.json")
-            self._logger.info("Saving results to {}".format(file_path))
-            with PathManager.open(file_path, "w") as f:
-                f.write(json.dumps(coco_results))
-                f.flush()
-
-
-def instances_to_coco_json(instances, img_id, output_logits=False):
-    """
-    Add object_descriptions and logit (if applicable) to
-    detectron2's instances_to_coco_json
-    """
-    num_instance = len(instances)
-    if num_instance == 0:
-        return []
-
-    boxes = instances.pred_boxes.tensor.numpy()
-    boxes = BoxMode.convert(boxes, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS)
-    boxes = boxes.tolist()
-    scores = instances.scores.tolist()
-    classes = instances.pred_classes.tolist()
-    object_descriptions = instances.pred_object_descriptions.data
-    if output_logits:
-        logits = instances.logits.tolist()
-
-    results = []
-    for k in range(num_instance):
-        result = {
-            "image_id": img_id,
-            "category_id": classes[k],
-            "bbox": boxes[k],
-            "score": scores[k],
-            'object_descriptions': object_descriptions[k],
-        }
-        if output_logits:
-            result["logit"] = logits[k]
-
-        results.append(result)
-    return results
\ No newline at end of file
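Both evaluators above funnel predictions through `instances_to_coco_json`, which emits one flat COCO-style record per detection. A sketch of the record shape it produces (all field values below are made up for illustration):

```
# One result record per detection, as emitted by the deleted
# instances_to_coco_json (bbox already converted to XYWH):
result = {
    "image_id": 42,
    "category_id": 0,                   # GRiT is class-agnostic here
    "bbox": [10.0, 20.0, 50.0, 80.0],   # x, y, w, h
    "score": 0.91,
    "object_descriptions": "a red car",
    # "logit": [...]                    # only when output_logits=True
}
```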
diff --git a/grit_src_deprecated/grit/modeling/backbone/utils.py b/grit_src_deprecated/grit/modeling/backbone/utils.py
deleted file mode 100755
index e71db21f..00000000
--- a/grit_src_deprecated/grit/modeling/backbone/utils.py
+++ /dev/null
@@ -1,186 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-# This code is from https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/backbone/utils.py
-import math
-import torch
-import torch.nn as nn
-import torch.nn.functional as F
-
-__all__ = [
-    "window_partition",
-    "window_unpartition",
-    "add_decomposed_rel_pos",
-    "get_abs_pos",
-    "PatchEmbed",
-]
-
-def window_partition(x, window_size):
-    """
-    Partition into non-overlapping windows with padding if needed.
-    Args:
-        x (tensor): input tokens with [B, H, W, C].
-        window_size (int): window size.
-
-    Returns:
-        windows: windows after partition with [B * num_windows, window_size, window_size, C].
-        (Hp, Wp): padded height and width before partition
-    """
-    B, H, W, C = x.shape
-
-    pad_h = (window_size - H % window_size) % window_size
-    pad_w = (window_size - W % window_size) % window_size
-    if pad_h > 0 or pad_w > 0:
-        x = F.pad(x, (0, 0, 0, pad_w, 0, pad_h))
-    Hp, Wp = H + pad_h, W + pad_w
-
-    x = x.view(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
-    windows = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, window_size, window_size, C)
-    return windows, (Hp, Wp)
-
-
-def window_unpartition(windows, window_size, pad_hw, hw):
-    """
-    Window unpartition into original sequences and removing padding.
-    Args:
-        x (tensor): input tokens with [B * num_windows, window_size, window_size, C].
-        window_size (int): window size.
-        pad_hw (Tuple): padded height and width (Hp, Wp).
-        hw (Tuple): original height and width (H, W) before padding.
-
-    Returns:
-        x: unpartitioned sequences with [B, H, W, C].
-    """
-    Hp, Wp = pad_hw
-    H, W = hw
-    B = windows.shape[0] // (Hp * Wp // window_size // window_size)
-    x = windows.view(B, Hp // window_size, Wp // window_size, window_size, window_size, -1)
-    x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, Hp, Wp, -1)
-
-    if Hp > H or Wp > W:
-        x = x[:, :H, :W, :].contiguous()
-    return x
-
-
-def get_rel_pos(q_size, k_size, rel_pos):
-    """
-    Get relative positional embeddings according to the relative positions of
-    query and key sizes.
-    Args:
-        q_size (int): size of query q.
-        k_size (int): size of key k.
-        rel_pos (Tensor): relative position embeddings (L, C).
-
-    Returns:
-        Extracted positional embeddings according to relative positions.
-    """
-    max_rel_dist = int(2 * max(q_size, k_size) - 1)
-    # Interpolate rel pos if needed.
-    if rel_pos.shape[0] != max_rel_dist:
-        # Interpolate rel pos.
-        rel_pos_resized = F.interpolate(
-            rel_pos.reshape(1, rel_pos.shape[0], -1).permute(0, 2, 1),
-            size=max_rel_dist,
-            mode="linear",
-        )
-        rel_pos_resized = rel_pos_resized.reshape(-1, max_rel_dist).permute(1, 0)
-    else:
-        rel_pos_resized = rel_pos
-
-    # Scale the coords with short length if shapes for q and k are different.
-    q_coords = torch.arange(q_size)[:, None] * max(k_size / q_size, 1.0)
-    k_coords = torch.arange(k_size)[None, :] * max(q_size / k_size, 1.0)
-    relative_coords = (q_coords - k_coords) + (k_size - 1) * max(q_size / k_size, 1.0)
-
-    return rel_pos_resized[relative_coords.long()]
-
-
-def add_decomposed_rel_pos(attn, q, rel_pos_h, rel_pos_w, q_size, k_size):
-    """
-    Calculate decomposed Relative Positional Embeddings from :paper:`mvitv2`.
-    https://github.com/facebookresearch/mvit/blob/19786631e330df9f3622e5402b4a419a263a2c80/mvit/models/attention.py # noqa B950
-    Args:
-        attn (Tensor): attention map.
-        q (Tensor): query q in the attention layer with shape (B, q_h * q_w, C).
-        rel_pos_h (Tensor): relative position embeddings (Lh, C) for height axis.
-        rel_pos_w (Tensor): relative position embeddings (Lw, C) for width axis.
-        q_size (Tuple): spatial sequence size of query q with (q_h, q_w).
-        k_size (Tuple): spatial sequence size of key k with (k_h, k_w).
-
-    Returns:
-        attn (Tensor): attention map with added relative positional embeddings.
-    """
-    q_h, q_w = q_size
-    k_h, k_w = k_size
-    Rh = get_rel_pos(q_h, k_h, rel_pos_h)
-    Rw = get_rel_pos(q_w, k_w, rel_pos_w)
-
-    B, _, dim = q.shape
-    r_q = q.reshape(B, q_h, q_w, dim)
-    rel_h = torch.einsum("bhwc,hkc->bhwk", r_q, Rh)
-    rel_w = torch.einsum("bhwc,wkc->bhwk", r_q, Rw)
-
-    attn = (
-        attn.view(B, q_h, q_w, k_h, k_w) + rel_h[:, :, :, :, None] + rel_w[:, :, :, None, :]
-    ).view(B, q_h * q_w, k_h * k_w)
-
-    return attn
-
-
-def get_abs_pos(abs_pos, has_cls_token, hw):
-    """
-    Calculate absolute positional embeddings. If needed, resize embeddings and remove cls_token
-    dimension for the original embeddings.
-    Args:
-        abs_pos (Tensor): absolute positional embeddings with (1, num_position, C).
-        has_cls_token (bool): If true, has 1 embedding in abs_pos for cls token.
-        hw (Tuple): size of input image tokens.
-
-    Returns:
-        Absolute positional embeddings after processing with shape (1, H, W, C)
-    """
-    h, w = hw
-    if has_cls_token:
-        abs_pos = abs_pos[:, 1:]
-    xy_num = abs_pos.shape[1]
-    size = int(math.sqrt(xy_num))
-    assert size * size == xy_num
-
-    if size != h or size != w:
-        new_abs_pos = F.interpolate(
-            abs_pos.reshape(1, size, size, -1).permute(0, 3, 1, 2),
-            size=(h, w),
-            mode="bicubic",
-            align_corners=False,
-        )
-
-        return new_abs_pos.permute(0, 2, 3, 1)
-    else:
-        return abs_pos.reshape(1, h, w, -1)
-
-
-class PatchEmbed(nn.Module):
-    """
-    Image to Patch Embedding.
-    """
-
-    def __init__(
-        self, kernel_size=(16, 16), stride=(16, 16), padding=(0, 0), in_chans=3, embed_dim=768
-    ):
-        """
-        Args:
-            kernel_size (Tuple): kernel size of the projection layer.
-            stride (Tuple): stride of the projection layer.
-            padding (Tuple): padding size of the projection layer.
-            in_chans (int): Number of input image channels.
-            embed_dim (int): Patch embedding dimension.
-        """
-        super().__init__()
-
-        self.proj = nn.Conv2d(
-            in_chans, embed_dim, kernel_size=kernel_size, stride=stride, padding=padding
-        )
-
-    def forward(self, x):
-        x = self.proj(x)
-        # B C H W -> B H W C
-        x = x.permute(0, 2, 3, 1)
-        return x
- """ - - def __init__( - self, - in_channels, - out_channels, - bottleneck_channels, - norm="LN", - act_layer=nn.GELU, - ): - """ - Args: - in_channels (int): Number of input channels. - out_channels (int): Number of output channels. - bottleneck_channels (int): number of output channels for the 3x3 - "bottleneck" conv layers. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - act_layer (callable): activation for all conv layers. - """ - super().__init__(in_channels, out_channels, 1) - - self.conv1 = Conv2d(in_channels, bottleneck_channels, 1, bias=False) - self.norm1 = get_norm(norm, bottleneck_channels) - self.act1 = act_layer() - - self.conv2 = Conv2d( - bottleneck_channels, - bottleneck_channels, - 3, - padding=1, - bias=False, - ) - self.norm2 = get_norm(norm, bottleneck_channels) - self.act2 = act_layer() - - self.conv3 = Conv2d(bottleneck_channels, out_channels, 1, bias=False) - self.norm3 = get_norm(norm, out_channels) - - for layer in [self.conv1, self.conv2, self.conv3]: - weight_init.c2_msra_fill(layer) - for layer in [self.norm1, self.norm2]: - layer.weight.data.fill_(1.0) - layer.bias.data.zero_() - # zero init last norm layer. - self.norm3.weight.data.zero_() - self.norm3.bias.data.zero_() - - def forward(self, x): - out = x - for layer in self.children(): - out = layer(out) - - out = x + out - return out - - -class Block(nn.Module): - """Transformer blocks with support of window attention and residual propagation blocks""" - - def __init__( - self, - dim, - num_heads, - mlp_ratio=4.0, - qkv_bias=True, - drop_path=0.0, - norm_layer=nn.LayerNorm, - act_layer=nn.GELU, - use_rel_pos=False, - rel_pos_zero_init=True, - window_size=0, - use_residual_block=False, - input_size=None, - ): - """ - Args: - dim (int): Number of input channels. - num_heads (int): Number of attention heads in each ViT block. - mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. - qkv_bias (bool): If True, add a learnable bias to query, key, value. - drop_path (float): Stochastic depth rate. - norm_layer (nn.Module): Normalization layer. - act_layer (nn.Module): Activation layer. - use_rel_pos (bool): If True, add relative positional embeddings to the attention map. - rel_pos_zero_init (bool): If True, zero initialize relative positional parameters. - window_size (int): Window size for window attention blocks. If it equals 0, then not - use window attention. - use_residual_block (bool): If True, use a residual block after the MLP block. - input_size (int or None): Input resolution for calculating the relative positional - parameter size. 
- """ - super().__init__() - self.norm1 = norm_layer(dim) - self.attn = Attention( - dim, - num_heads=num_heads, - qkv_bias=qkv_bias, - use_rel_pos=use_rel_pos, - rel_pos_zero_init=rel_pos_zero_init, - input_size=input_size if window_size == 0 else (window_size, window_size), - ) - - self.drop_path = DropPath(drop_path) if drop_path > 0.0 else nn.Identity() - self.norm2 = norm_layer(dim) - self.mlp = Mlp(in_features=dim, hidden_features=int(dim * mlp_ratio), act_layer=act_layer) - - self.window_size = window_size - - self.use_residual_block = use_residual_block - if use_residual_block: - # Use a residual block with bottleneck channel as dim // 2 - self.residual = ResBottleneckBlock( - in_channels=dim, - out_channels=dim, - bottleneck_channels=dim // 2, - norm="LN", - act_layer=act_layer, - ) - - def forward(self, x): - shortcut = x - x = self.norm1(x) - # Window partition - if self.window_size > 0: - H, W = x.shape[1], x.shape[2] - x, pad_hw = window_partition(x, self.window_size) - - x = self.attn(x) - # Reverse window partition - if self.window_size > 0: - x = window_unpartition(x, self.window_size, pad_hw, (H, W)) - - x = shortcut + self.drop_path(x) - x = x + self.drop_path(self.mlp(self.norm2(x))) - - if self.use_residual_block: - x = self.residual(x.permute(0, 3, 1, 2)).permute(0, 2, 3, 1) - - return x - - -class ViT(Backbone): - """ - This module implements Vision Transformer (ViT) backbone in :paper:`vitdet`. - "Exploring Plain Vision Transformer Backbones for Object Detection", - https://arxiv.org/abs/2203.16527 - """ - - def __init__( - self, - img_size=1024, - patch_size=16, - in_chans=3, - embed_dim=768, - depth=12, - num_heads=12, - mlp_ratio=4.0, - qkv_bias=True, - drop_path_rate=0.0, - norm_layer=nn.LayerNorm, - act_layer=nn.GELU, - use_abs_pos=True, - use_rel_pos=False, - rel_pos_zero_init=True, - window_size=0, - window_block_indexes=(), - residual_block_indexes=(), - use_act_checkpoint=True, - pretrain_img_size=224, - pretrain_use_cls_token=True, - out_feature="last_feat", - ): - """ - Args: - img_size (int): Input image size. - patch_size (int): Patch size. - in_chans (int): Number of input image channels. - embed_dim (int): Patch embedding dimension. - depth (int): Depth of ViT. - num_heads (int): Number of attention heads in each ViT block. - mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. - qkv_bias (bool): If True, add a learnable bias to query, key, value. - drop_path_rate (float): Stochastic depth rate. - norm_layer (nn.Module): Normalization layer. - act_layer (nn.Module): Activation layer. - use_abs_pos (bool): If True, use absolute positional embeddings. - use_rel_pos (bool): If True, add relative positional embeddings to the attention map. - rel_pos_zero_init (bool): If True, zero initialize relative positional parameters. - window_size (int): Window size for window attention blocks. - window_block_indexes (list): Indexes for blocks using window attention. - residual_block_indexes (list): Indexes for blocks using conv propagation. - use_act_checkpoint (bool): If True, use activation checkpointing. - pretrain_img_size (int): input image size for pretraining models. - pretrain_use_cls_token (bool): If True, pretrainig models use class token. - out_feature (str): name of the feature from the last block. 
- """ - super().__init__() - self.pretrain_use_cls_token = pretrain_use_cls_token - self.use_act_checkpoint = use_act_checkpoint - - self.patch_embed = PatchEmbed( - kernel_size=(patch_size, patch_size), - stride=(patch_size, patch_size), - in_chans=in_chans, - embed_dim=embed_dim, - ) - - if use_abs_pos: - # Initialize absolute positional embedding with pretrain image size. - num_patches = (pretrain_img_size // patch_size) * (pretrain_img_size // patch_size) - num_positions = (num_patches + 1) if pretrain_use_cls_token else num_patches - self.pos_embed = nn.Parameter(torch.zeros(1, num_positions, embed_dim)) - else: - self.pos_embed = None - - # stochastic depth decay rule - dpr = [x.item() for x in torch.linspace(0, drop_path_rate, depth)] - - self.blocks = nn.ModuleList() - for i in range(depth): - block = Block( - dim=embed_dim, - num_heads=num_heads, - mlp_ratio=mlp_ratio, - qkv_bias=qkv_bias, - drop_path=dpr[i], - norm_layer=norm_layer, - act_layer=act_layer, - use_rel_pos=use_rel_pos, - rel_pos_zero_init=rel_pos_zero_init, - window_size=window_size if i in window_block_indexes else 0, - use_residual_block=i in residual_block_indexes, - input_size=(img_size // patch_size, img_size // patch_size), - ) - self.blocks.append(block) - - self._out_feature_channels = {out_feature: embed_dim} - self._out_feature_strides = {out_feature: patch_size} - self._out_features = [out_feature] - - if self.pos_embed is not None: - trunc_normal_(self.pos_embed, std=0.02) - - self.apply(self._init_weights) - - def _init_weights(self, m): - if isinstance(m, nn.Linear): - trunc_normal_(m.weight, std=0.02) - if isinstance(m, nn.Linear) and m.bias is not None: - nn.init.constant_(m.bias, 0) - elif isinstance(m, nn.LayerNorm): - nn.init.constant_(m.bias, 0) - nn.init.constant_(m.weight, 1.0) - - def forward(self, x): - x = self.patch_embed(x) - if self.pos_embed is not None: - x = x + get_abs_pos( - self.pos_embed, self.pretrain_use_cls_token, (x.shape[1], x.shape[2]) - ) - - for blk in self.blocks: - if self.use_act_checkpoint: - x = checkpoint.checkpoint(blk, x) - else: - x = blk(x) - - return x.permute(0, 3, 1, 2) - - -class ViT_FPN(Backbone): - def __init__(self, bottom_up=None, top_block=None, out_channels=None, strides=None, vit_out_dim=None): - super(ViT_FPN, self).__init__() - assert isinstance(bottom_up, Backbone) - self.bottom_up = bottom_up - self.top_block = top_block - - self._out_feature_strides = {"p{}".format(int(math.log2(s))): s for s in strides} - self._out_features = list(self._out_feature_strides.keys()) - self._out_feature_channels = {k: out_channels for k in self._out_features} - self._size_divisibility = strides[2] - - self.maxpool = nn.MaxPool2d(2, stride=2) - self.fpn_stride_16_8 = nn.ConvTranspose2d(vit_out_dim, vit_out_dim, 2, stride=2, bias=False) - self.fpn_stride8_conv1 = nn.Conv2d(in_channels=vit_out_dim, out_channels=out_channels, kernel_size=1, bias=False) - self.fpn_stride8_norm1 = nn.LayerNorm(out_channels) - self.fpn_stride8_conv2 = nn.Conv2d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False) - self.fpn_stride8_norm2 = nn.LayerNorm(out_channels) - - self.fpn_stride16_conv1 = nn.Conv2d(in_channels=vit_out_dim, out_channels=out_channels, kernel_size=1, bias=False) - self.fpn_stride16_norm1 = nn.LayerNorm(out_channels) - self.fpn_stride16_conv2 = nn.Conv2d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False) - self.fpn_stride16_norm2 = nn.LayerNorm(out_channels) - - 
self.fpn_stride32_conv1 = nn.Conv2d(in_channels=vit_out_dim, out_channels=out_channels, kernel_size=1, bias=False) - self.fpn_stride32_norm1 = nn.LayerNorm(out_channels) - self.fpn_stride32_conv2 = nn.Conv2d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False) - self.fpn_stride32_norm2 = nn.LayerNorm(out_channels) - - def forward(self, x): - vit_output_featuremap = self.bottom_up(x) - - stride8_feature = self.fpn_stride_16_8(vit_output_featuremap) - stride8_feature = self.fpn_stride8_norm1(self.fpn_stride8_conv1(stride8_feature) - .permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - stride8_feature = self.fpn_stride8_norm2(self.fpn_stride8_conv2(stride8_feature) - .permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - - stride32_feature = self.maxpool(vit_output_featuremap) - stride32_feature = self.fpn_stride32_norm1(self.fpn_stride32_conv1(stride32_feature) - .permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - stride32_feature = self.fpn_stride32_norm2(self.fpn_stride32_conv2(stride32_feature) - .permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - - stride16_feature = self.fpn_stride16_norm1(self.fpn_stride16_conv1(vit_output_featuremap). - permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - stride16_feature = self.fpn_stride16_norm2(self.fpn_stride16_conv2(stride16_feature) - .permute(0, 2, 3, 1)).permute(0, 3, 1, 2) - - results = [stride8_feature, stride16_feature, stride32_feature] - - results.extend(self.top_block(stride32_feature)) - - assert len(self._out_features) == len(results) - fpn_out = {f: res for f, res in zip(self._out_features, results)} - - return fpn_out - @property - def size_divisibility(self): - return self._size_divisibility - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - -@BACKBONE_REGISTRY.register() -def build_vit_fpn_backbone(cfg, input_shape: ShapeSpec): - embed_dim = 768 - vit_out_dim = embed_dim - bottom_up = ViT( # Single-scale ViT backbone - img_size=1024, - patch_size=16, - embed_dim=embed_dim, - depth=12, - num_heads=12, - drop_path_rate=0.1, - window_size=14, - mlp_ratio=4, - qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), - window_block_indexes=[ - # 2, 5, 8 11 for global attention - 0, - 1, - 3, - 4, - 6, - 7, - 9, - 10, - ], - residual_block_indexes=[], - use_act_checkpoint=cfg.USE_ACT_CHECKPOINT, - use_rel_pos=True, - out_feature="last_feat",) - - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - assert out_channels == 256 or out_channels == 768 or out_channels == 1024 - backbone = ViT_FPN(bottom_up=bottom_up, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - out_channels=out_channels, - strides=[8, 16, 32, 64, 128], - vit_out_dim=vit_out_dim) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_vit_fpn_backbone_large(cfg, input_shape: ShapeSpec): - window_block_indexes = (list(range(0, 5)) + list(range(6, 11)) + list(range(12, 17)) + list(range(18, 23))) - embed_dim = 1024 - vit_out_dim = embed_dim - bottom_up = ViT( # Single-scale ViT backbone - img_size=1024, - patch_size=16, - embed_dim=embed_dim, - depth=24, - num_heads=16, - drop_path_rate=0.4, - window_size=14, - mlp_ratio=4, - qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), - window_block_indexes=window_block_indexes, - residual_block_indexes=[], - use_act_checkpoint=cfg.USE_ACT_CHECKPOINT, - use_rel_pos=True, - out_feature="last_feat",) - - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - assert out_channels == 256 or 
out_channels == 768 or out_channels == 1024 - backbone = ViT_FPN(bottom_up=bottom_up, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - out_channels=out_channels, - strides=[8, 16, 32, 64, 128], - vit_out_dim=vit_out_dim) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_vit_fpn_backbone_huge(cfg, input_shape: ShapeSpec): - window_block_indexes = (list(range(0, 7)) + list(range(8, 15)) + list(range(16, 23)) + list(range(24, 31))) - embed_dim = 1280 - vit_out_dim = embed_dim - bottom_up = ViT( # Single-scale ViT backbone - img_size=1024, - patch_size=16, - embed_dim=embed_dim, - depth=32, - num_heads=16, - drop_path_rate=0.5, - window_size=14, - mlp_ratio=4, - qkv_bias=True, - norm_layer=partial(nn.LayerNorm, eps=1e-6), - window_block_indexes=window_block_indexes, - residual_block_indexes=[], - use_act_checkpoint=cfg.USE_ACT_CHECKPOINT, - use_rel_pos=True, - out_feature="last_feat",) - - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - assert out_channels == 256 or out_channels == 768 or out_channels == 1024 - backbone = ViT_FPN(bottom_up=bottom_up, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - out_channels=out_channels, - strides=[8, 16, 32, 64, 128], - vit_out_dim=vit_out_dim) - return backbone diff --git a/grit_src_deprecated/grit/modeling/meta_arch/grit.py b/grit_src_deprecated/grit/modeling/meta_arch/grit.py deleted file mode 100755 index 057da53a..00000000 --- a/grit_src_deprecated/grit/modeling/meta_arch/grit.py +++ /dev/null @@ -1,67 +0,0 @@ -from typing import Dict, List, Optional, Tuple -import torch -from detectron2.config import configurable -from detectron2.structures import ImageList, Instances, Boxes -from detectron2.modeling.meta_arch.build import META_ARCH_REGISTRY -from detectron2.modeling.meta_arch.rcnn import GeneralizedRCNN - - -@META_ARCH_REGISTRY.register() -class GRiT(GeneralizedRCNN): - @configurable - def __init__( - self, - **kwargs): - super().__init__(**kwargs) - assert self.proposal_generator is not None - - @classmethod - def from_config(cls, cfg): - ret = super().from_config(cfg) - return ret - - def inference( - self, - batched_inputs: Tuple[Dict[str, torch.Tensor]], - detected_instances: Optional[List[Instances]] = None, - do_postprocess: bool = True, - ): - assert not self.training - assert detected_instances is None - - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - proposals, _ = self.proposal_generator(images, features, None) - results, _ = self.roi_heads(features, proposals) - if do_postprocess: - assert not torch.jit.is_scripting(), \ - "Scripting is not supported for postprocess." 
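`Block.forward` in the ViT above relies on `window_partition`/`window_unpartition` (from the deleted `utils.py`) being an exact round trip: pad to a multiple of the window size, split into windows, then strip the padding on the way back. A standalone check, assuming those two helpers are importable:

```
import torch
# Assumption: the deleted helpers are on the path, e.g.
# from utils import window_partition, window_unpartition

x = torch.randn(2, 30, 30, 64)             # B, H, W, C; 30 not divisible by 14
windows, pad_hw = window_partition(x, 14)  # pads to 42x42 -> (2*3*3, 14, 14, 64)
assert windows.shape == (18, 14, 14, 64) and pad_hw == (42, 42)

y = window_unpartition(windows, 14, pad_hw, (30, 30))
assert torch.equal(x, y)                   # padding is stripped exactly
```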
diff --git a/grit_src_deprecated/grit/modeling/meta_arch/grit.py b/grit_src_deprecated/grit/modeling/meta_arch/grit.py
deleted file mode 100755
index 057da53a..00000000
--- a/grit_src_deprecated/grit/modeling/meta_arch/grit.py
+++ /dev/null
@@ -1,67 +0,0 @@
-from typing import Dict, List, Optional, Tuple
-import torch
-from detectron2.config import configurable
-from detectron2.structures import ImageList, Instances, Boxes
-from detectron2.modeling.meta_arch.build import META_ARCH_REGISTRY
-from detectron2.modeling.meta_arch.rcnn import GeneralizedRCNN
-
-
-@META_ARCH_REGISTRY.register()
-class GRiT(GeneralizedRCNN):
-    @configurable
-    def __init__(
-        self,
-        **kwargs):
-        super().__init__(**kwargs)
-        assert self.proposal_generator is not None
-
-    @classmethod
-    def from_config(cls, cfg):
-        ret = super().from_config(cfg)
-        return ret
-
-    def inference(
-        self,
-        batched_inputs: Tuple[Dict[str, torch.Tensor]],
-        detected_instances: Optional[List[Instances]] = None,
-        do_postprocess: bool = True,
-    ):
-        assert not self.training
-        assert detected_instances is None
-
-        images = self.preprocess_image(batched_inputs)
-        features = self.backbone(images.tensor)
-        proposals, _ = self.proposal_generator(images, features, None)
-        results, _ = self.roi_heads(features, proposals)
-        if do_postprocess:
-            assert not torch.jit.is_scripting(), \
-                "Scripting is not supported for postprocess."
-            return GRiT._postprocess(
-                results, batched_inputs, images.image_sizes)
-        else:
-            return results
-
-    def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]):
-        if not self.training:
-            return self.inference(batched_inputs)
-
-        images = self.preprocess_image(batched_inputs)
-
-        gt_instances = [x["instances"].to(self.device) for x in batched_inputs]
-        # import ipdb
-        # ipdb.set_trace()
-        targets_task = batched_inputs[0]['task']
-        for anno_per_image in batched_inputs:
-            assert targets_task == anno_per_image['task']
-
-        features = self.backbone(images.tensor)
-        proposals, proposal_losses = self.proposal_generator(
-            images, features, gt_instances)
-        proposals, roihead_textdecoder_losses = self.roi_heads(
-            features, proposals, gt_instances, targets_task=targets_task)
-
-        losses = {}
-        losses.update(roihead_textdecoder_losses)
-        losses.update(proposal_losses)
-
-        return losses
\ No newline at end of file
diff --git a/grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py b/grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py
deleted file mode 100755
index 5d03daab..00000000
--- a/grit_src_deprecated/grit/modeling/roi_heads/grit_fast_rcnn.py
+++ /dev/null
@@ -1,126 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# Modified by Jialian Wu from https://github.com/facebookresearch/Detic/blob/main/detic/modeling/roi_heads/detic_fast_rcnn.py
-import torch
-from fvcore.nn import giou_loss, smooth_l1_loss
-from torch import nn
-from torch.nn import functional as F
-import fvcore.nn.weight_init as weight_init
-from detectron2.config import configurable
-from detectron2.layers import ShapeSpec, batched_nms, cat, cross_entropy, nonzero_tuple
-from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers
-from detectron2.modeling.roi_heads.fast_rcnn import _log_classification_stats
-
-
-__all__ = ["GRiTFastRCNNOutputLayers"]
-
-
-class GRiTFastRCNNOutputLayers(FastRCNNOutputLayers):
-    @configurable
-    def __init__(
-        self,
-        input_shape: ShapeSpec,
-        **kwargs,
-    ):
-        super().__init__(
-            input_shape=input_shape,
-            **kwargs,
-        )
-
-        input_size = input_shape.channels * \
-            (input_shape.width or 1) * (input_shape.height or 1)
-
-        self.bbox_pred = nn.Sequential(
-            nn.Linear(input_size, input_size),
-            nn.ReLU(inplace=True),
-            nn.Linear(input_size, 4)
-        )
-        weight_init.c2_xavier_fill(self.bbox_pred[0])
-        nn.init.normal_(self.bbox_pred[-1].weight, std=0.001)
-        nn.init.constant_(self.bbox_pred[-1].bias, 0)
-
-    @classmethod
-    def from_config(cls, cfg, input_shape):
-        ret = super().from_config(cfg, input_shape)
-        return ret
-
-    def losses(self, predictions, proposals):
-        scores, proposal_deltas = predictions
-        gt_classes = (
-            cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0)
-        )
-        num_classes = self.num_classes
-        _log_classification_stats(scores, gt_classes)
-
-        if len(proposals):
-            proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0)  # Nx4
-            assert not proposal_boxes.requires_grad, "Proposals should not require gradients!"
-            gt_boxes = cat(
-                [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals],
-                dim=0,
-            )
-        else:
-            proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device)
-
-        loss_cls = self.softmax_cross_entropy_loss(scores, gt_classes)
-        return {
-            "loss_cls": loss_cls,
-            "loss_box_reg": self.box_reg_loss(
-                proposal_boxes, gt_boxes, proposal_deltas, gt_classes,
-                num_classes=num_classes)
-        }
-
-    def softmax_cross_entropy_loss(self, pred_class_logits, gt_classes):
-        if pred_class_logits.numel() == 0:
-            return pred_class_logits.new_zeros([1])[0]
-
-        loss = F.cross_entropy(
-            pred_class_logits, gt_classes, reduction="mean")
-        return loss
-
-    def box_reg_loss(
-        self, proposal_boxes, gt_boxes, pred_deltas, gt_classes,
-        num_classes=-1):
-        num_classes = num_classes if num_classes > 0 else self.num_classes
-        box_dim = proposal_boxes.shape[1]
-        fg_inds = nonzero_tuple((gt_classes >= 0) & (gt_classes < num_classes))[0]
-        if pred_deltas.shape[1] == box_dim:
-            fg_pred_deltas = pred_deltas[fg_inds]
-        else:
-            fg_pred_deltas = pred_deltas.view(-1, self.num_classes, box_dim)[
-                fg_inds, gt_classes[fg_inds]
-            ]
-
-        if self.box_reg_loss_type == "smooth_l1":
-            gt_pred_deltas = self.box2box_transform.get_deltas(
-                proposal_boxes[fg_inds],
-                gt_boxes[fg_inds],
-            )
-            loss_box_reg = smooth_l1_loss(
-                fg_pred_deltas, gt_pred_deltas, self.smooth_l1_beta, reduction="sum"
-            )
-        elif self.box_reg_loss_type == "giou":
-            fg_pred_boxes = self.box2box_transform.apply_deltas(
-                fg_pred_deltas, proposal_boxes[fg_inds]
-            )
-            loss_box_reg = giou_loss(fg_pred_boxes, gt_boxes[fg_inds], reduction="sum")
-        else:
-            raise ValueError(f"Invalid bbox reg loss type '{self.box_reg_loss_type}'")
-        return loss_box_reg / max(gt_classes.numel(), 1.0)
-
-    def predict_probs(self, predictions, proposals):
-        scores = predictions[0]
-        num_inst_per_image = [len(p) for p in proposals]
-        probs = F.softmax(scores, dim=-1)
-        return probs.split(num_inst_per_image, dim=0)
-
-    def forward(self, x):
-        if x.dim() > 2:
-            x = torch.flatten(x, start_dim=1)
-        scores = []
-
-        cls_scores = self.cls_score(x)
-        scores.append(cls_scores)
-        scores = torch.cat(scores, dim=1)
-
-        proposal_deltas = self.bbox_pred(x)
-        return scores, proposal_deltas
\ No newline at end of file
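Note the normalization in `box_reg_loss` above: the loss is summed over foreground proposals only, but divided by the total number of labeled proposals (`gt_classes.numel()`), so background-heavy batches still yield small, stable gradients. A pinned-down sketch of that arithmetic (the class count and tensors below are illustrative):

```
import torch
from fvcore.nn import smooth_l1_loss

# 5 proposals: 2 foreground (class < 80), 3 background (class == 80).
gt_classes = torch.tensor([0, 0, 80, 80, 80])
fg = (gt_classes >= 0) & (gt_classes < 80)

pred_deltas = torch.randn(5, 4)
gt_deltas = torch.randn(5, 4)

loss = smooth_l1_loss(pred_deltas[fg], gt_deltas[fg], beta=0.0, reduction="sum")
loss = loss / max(gt_classes.numel(), 1.0)  # divide by all 5, not just the 2 fg
```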
diff --git a/grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py b/grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py
deleted file mode 100755
index 648214d7..00000000
--- a/grit_src_deprecated/grit/modeling/roi_heads/grit_roi_heads.py
+++ /dev/null
@@ -1,478 +0,0 @@
-import math
-import torch
-from typing import Dict, List, Optional, Tuple, Union
-
-from detectron2.config import configurable
-from detectron2.structures import Boxes, Instances, pairwise_iou
-from detectron2.utils.events import get_event_storage
-
-from detectron2.modeling.box_regression import Box2BoxTransform
-from detectron2.modeling.roi_heads.roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads
-from detectron2.modeling.roi_heads.cascade_rcnn import CascadeROIHeads, _ScaleGradient
-from detectron2.modeling.poolers import ROIPooler
-from detectron2.layers import batched_nms
-from .grit_fast_rcnn import GRiTFastRCNNOutputLayers
-
-from ..text.text_decoder import TransformerDecoderTextualHead, GRiTTextDecoder, AutoRegressiveBeamSearch
-from ..text.load_text_token import LoadTextTokens
-from transformers import BertTokenizer
-from models.grit_src.grit.data.custom_dataset_mapper import ObjDescription
-from ..soft_nms import batched_soft_nms
-
-import logging
-logger = logging.getLogger(__name__)
-
-
-@ROI_HEADS_REGISTRY.register()
-class GRiTROIHeadsAndTextDecoder(CascadeROIHeads):
-    @configurable
-    def __init__(
-        self,
-        *,
-        text_decoder_transformer,
-        train_task: list,
-        test_task: str,
-        mult_proposal_score: bool = False,
-        mask_weight: float = 1.0,
-        object_feat_pooler=None,
-        soft_nms_enabled=False,
-        beam_size=1,
-        **kwargs,
-    ):
-        super().__init__(**kwargs)
-        self.mult_proposal_score = mult_proposal_score
-        self.mask_weight = mask_weight
-        self.object_feat_pooler = object_feat_pooler
-        self.soft_nms_enabled = soft_nms_enabled
-        self.test_task = test_task
-        self.beam_size = beam_size
-
-        tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True)
-        self.tokenizer = tokenizer
-
-        assert test_task in train_task, 'GRiT has not been trained on {} task, ' \
-                                        'please verify the task name or train a new ' \
-                                        'GRiT on {} task'.format(test_task, test_task)
-        task_begin_tokens = {}
-        for i, task in enumerate(train_task):
-            if i == 0:
-                task_begin_tokens[task] = tokenizer.cls_token_id
-            else:
-                task_begin_tokens[task] = 103 + i
-        self.task_begin_tokens = task_begin_tokens
-
-        beamsearch_decode = AutoRegressiveBeamSearch(
-            end_token_id=tokenizer.sep_token_id,
-            max_steps=40,
-            beam_size=beam_size,
-            objectdet=test_task == "ObjectDet",
-            per_node_beam_size=1,
-        )
-        self.text_decoder = GRiTTextDecoder(
-            text_decoder_transformer,
-            beamsearch_decode=beamsearch_decode,
-            begin_token_id=task_begin_tokens[test_task],
-            loss_type='smooth',
-            tokenizer=tokenizer,
-        )
-        self.get_target_text_tokens = LoadTextTokens(tokenizer, max_text_len=40, padding='do_not_pad')
-
-    @classmethod
-    def from_config(cls, cfg, input_shape):
-        ret = super().from_config(cfg, input_shape)
-        text_decoder_transformer = TransformerDecoderTextualHead(
-            object_feature_size=cfg.MODEL.FPN.OUT_CHANNELS,
-            vocab_size=cfg.TEXT_DECODER.VOCAB_SIZE,
-            hidden_size=cfg.TEXT_DECODER.HIDDEN_SIZE,
-            num_layers=cfg.TEXT_DECODER.NUM_LAYERS,
-            attention_heads=cfg.TEXT_DECODER.ATTENTION_HEADS,
-            feedforward_size=cfg.TEXT_DECODER.FEEDFORWARD_SIZE,
-            mask_future_positions=True,
-            padding_idx=0,
-            decoder_type='bert_en',
-            use_act_checkpoint=cfg.USE_ACT_CHECKPOINT,
-        )
-        ret.update({
-            'text_decoder_transformer': text_decoder_transformer,
-            'train_task': cfg.MODEL.TRAIN_TASK,
-            'test_task': cfg.MODEL.TEST_TASK,
-            'mult_proposal_score': cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE,
-            'mask_weight': cfg.MODEL.ROI_HEADS.MASK_WEIGHT,
-            'soft_nms_enabled': cfg.MODEL.ROI_HEADS.SOFT_NMS_ENABLED,
-            'beam_size': cfg.MODEL.BEAM_SIZE,
-        })
-        return ret
-
-    @classmethod
-    def _init_box_head(self, cfg, input_shape):
-        ret = super()._init_box_head(cfg, input_shape)
-        del ret['box_predictors']
-        cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS
-        box_predictors = []
-        for box_head, bbox_reg_weights in zip(ret['box_heads'], \
-                                              cascade_bbox_reg_weights):
-            box_predictors.append(
-                GRiTFastRCNNOutputLayers(
-                    cfg, box_head.output_shape,
-                    box2box_transform=Box2BoxTransform(weights=bbox_reg_weights)
-                ))
-        ret['box_predictors'] = box_predictors
-
-        in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES
-        pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features)
-        sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO
-        pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE
-        object_feat_pooler = ROIPooler(
-            output_size=cfg.MODEL.ROI_HEADS.OBJECT_FEAT_POOLER_RES,
-            scales=pooler_scales,
-            sampling_ratio=sampling_ratio,
-            pooler_type=pooler_type,
-        )
-        ret['object_feat_pooler'] = object_feat_pooler
-        return ret
-
-    def check_if_all_background(self, proposals, targets, stage):
-        all_background = True
-        for proposals_per_image in proposals:
-            if not (proposals_per_image.gt_classes == self.num_classes).all():
-                all_background = False
-
-        if all_background:
-            logger.info('all proposals are background at stage {}'.format(stage))
-            proposals[0].proposal_boxes.tensor[0, :] = targets[0].gt_boxes.tensor[0, :]
-            proposals[0].gt_boxes.tensor[0, :] = targets[0].gt_boxes.tensor[0, :]
-            proposals[0].objectness_logits[0] = math.log((1.0 - 1e-10) / (1 - (1.0 - 1e-10)))
-            proposals[0].gt_classes[0] = targets[0].gt_classes[0]
-            proposals[0].gt_object_descriptions.data[0] = targets[0].gt_object_descriptions.data[0]
-            if 'foreground' in proposals[0].get_fields().keys():
-                proposals[0].foreground[0] = 1
-        return proposals
-
-    def _forward_box(self, features, proposals, targets=None, task="ObjectDet"):
-        if self.training:
-            proposals = self.check_if_all_background(proposals, targets, 0)
-        if (not self.training) and self.mult_proposal_score:
-            if len(proposals) > 0 and proposals[0].has('scores'):
-                proposal_scores = [p.get('scores') for p in proposals]
-            else:
-                proposal_scores = [p.get('objectness_logits') for p in proposals]
-
-        features = [features[f] for f in self.box_in_features]
-        head_outputs = []
-        prev_pred_boxes = None
-        image_sizes = [x.image_size for x in proposals]
-
-        for k in range(self.num_cascade_stages):
-            if k > 0:
-                proposals = self._create_proposals_from_boxes(
-                    prev_pred_boxes, image_sizes,
-                    logits=[p.objectness_logits for p in proposals])
-                if self.training:
-                    proposals = self._match_and_label_boxes_GRiT(
-                        proposals, k, targets)
-                    proposals = self.check_if_all_background(proposals, targets, k)
-            predictions = self._run_stage(features, proposals, k)
-            prev_pred_boxes = self.box_predictor[k].predict_boxes(
-                (predictions[0], predictions[1]), proposals)
-            head_outputs.append((self.box_predictor[k], predictions, proposals))
-
-        if self.training:
-            object_features = self.object_feat_pooler(features, [x.proposal_boxes for x in proposals])
-            object_features = _ScaleGradient.apply(object_features, 1.0 / self.num_cascade_stages)
-            foreground = torch.cat([x.foreground for x in proposals])
-            object_features = object_features[foreground > 0]
-
-            object_descriptions = []
-            for x in proposals:
-                object_descriptions += x.gt_object_descriptions[x.foreground > 0].data
-            object_descriptions = ObjDescription(object_descriptions)
-            object_descriptions = object_descriptions.data
-
-            if len(object_descriptions) > 0:
-                begin_token = self.task_begin_tokens[task]
-                text_decoder_inputs = self.get_target_text_tokens(object_descriptions, object_features, begin_token)
-                object_features = object_features.view(
-                    object_features.shape[0], object_features.shape[1], -1).permute(0, 2, 1).contiguous()
-                text_decoder_inputs.update({'object_features': object_features})
-                text_decoder_loss = self.text_decoder(text_decoder_inputs)
-            else:
-                text_decoder_loss = head_outputs[0][1][0].new_zeros([1])[0]
-
-            losses = {}
-            storage = get_event_storage()
-            # RoI Head losses (For the proposal generator loss, please find it in grit.py)
-            for stage, (predictor, predictions, proposals) in enumerate(head_outputs):
-                with storage.name_scope("stage{}".format(stage)):
-                    stage_losses = predictor.losses(
-                        (predictions[0], predictions[1]), proposals)
-                losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()})
-            # Text Decoder loss
-            losses.update({'text_decoder_loss': text_decoder_loss})
-            return losses
-        else:
-            scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs]
-            logits_per_stage = [(h[1][0],) for h in head_outputs]
-            scores = [
-                sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages)
-                for scores_per_image in zip(*scores_per_stage)
-            ]
-            logits = [
-                sum(list(logits_per_image)) * (1.0 / self.num_cascade_stages)
-                for logits_per_image in zip(*logits_per_stage)
-            ]
-            if self.mult_proposal_score:
-                scores = [(s * ps[:, None]) ** 0.5 for s, ps in zip(scores, proposal_scores)]
-            predictor, predictions, proposals = head_outputs[-1]
-            boxes = predictor.predict_boxes(
-                (predictions[0], predictions[1]), proposals)
-            assert len(boxes) == 1
-            pred_instances, _ = self.fast_rcnn_inference_GRiT(
-                boxes,
-                scores,
-                logits,
-                image_sizes,
-                predictor.test_score_thresh,
-                predictor.test_nms_thresh,
-                predictor.test_topk_per_image,
-                self.soft_nms_enabled,
-            )
-
-            assert len(pred_instances) == 1, "Only support one image"
-            for i, pred_instance in enumerate(pred_instances):
-                if len(pred_instance.pred_boxes) > 0:
-                    object_features = self.object_feat_pooler(features, [pred_instance.pred_boxes])
-                    object_features = object_features.view(
-                        object_features.shape[0], object_features.shape[1], -1).permute(0, 2, 1).contiguous()
-                    text_decoder_output = self.text_decoder({'object_features': object_features})
-                    if self.beam_size > 1 and self.test_task == "ObjectDet":
-                        pred_boxes = []
-                        pred_scores = []
-                        pred_classes = []
-                        pred_object_descriptions = []
-
-                        for beam_id in range(self.beam_size):
-                            pred_boxes.append(pred_instance.pred_boxes.tensor)
-                            # object score = sqrt(objectness score x description score)
-                            pred_scores.append((pred_instance.scores *
-                                                torch.exp(text_decoder_output['logprobs'])[:, beam_id]) ** 0.5)
-                            pred_classes.append(pred_instance.pred_classes)
-                            for prediction in text_decoder_output['predictions'][:, beam_id, :]:
-                                # convert text tokens to words
-                                description = self.tokenizer.decode(prediction.tolist()[1:], skip_special_tokens=True)
-                                pred_object_descriptions.append(description)
-
-                        merged_instances = Instances(image_sizes[0])
-                        if torch.cat(pred_scores, dim=0).shape[0] <= predictor.test_topk_per_image:
-                            merged_instances.scores = torch.cat(pred_scores, dim=0)
-                            merged_instances.pred_boxes = Boxes(torch.cat(pred_boxes, dim=0))
-                            merged_instances.pred_classes = torch.cat(pred_classes, dim=0)
-                            merged_instances.pred_object_descriptions = ObjDescription(pred_object_descriptions)
-                        else:
-                            pred_scores, top_idx = torch.topk(
-                                torch.cat(pred_scores, dim=0), predictor.test_topk_per_image)
-                            merged_instances.scores = pred_scores
-                            merged_instances.pred_boxes = Boxes(torch.cat(pred_boxes, dim=0)[top_idx, :])
-                            merged_instances.pred_classes = torch.cat(pred_classes, dim=0)[top_idx]
-                            merged_instances.pred_object_descriptions = \
-                                ObjDescription(ObjDescription(pred_object_descriptions)[top_idx].data)
-
-                        pred_instances[i] = merged_instances
-                    else:
-                        # object score = sqrt(objectness score x description score)
-                        pred_instance.scores = (pred_instance.scores *
-                                                torch.exp(text_decoder_output['logprobs'])) ** 0.5
-
-                        pred_object_descriptions = []
-                        for prediction in text_decoder_output['predictions']:
-                            # convert text tokens to words
-                            description = self.tokenizer.decode(prediction.tolist()[1:], skip_special_tokens=True)
-                            pred_object_descriptions.append(description)
-                        pred_instance.pred_object_descriptions = ObjDescription(pred_object_descriptions)
-                else:
-                    pred_instance.pred_object_descriptions = ObjDescription([])
-
-            return pred_instances
-
-
-    def forward(self, features, proposals, targets=None, targets_task="ObjectDet"):
-        if self.training:
-            proposals = self.label_and_sample_proposals(
-                proposals, targets)
-
-            losses = self._forward_box(features, proposals, targets, task=targets_task)
-            if targets[0].has('gt_masks'):
-                mask_losses = self._forward_mask(features, proposals)
-                losses.update({k: v * self.mask_weight \
-                               for k, v in mask_losses.items()})
-            else:
-                losses.update(self._get_empty_mask_loss(device=proposals[0].objectness_logits.device))
-            return proposals, losses
-        else:
-            pred_instances = self._forward_box(features, proposals, task=self.test_task)
-            pred_instances = self.forward_with_given_boxes(features, pred_instances)
-            return pred_instances, {}
-
-    @torch.no_grad()
-    def _match_and_label_boxes_GRiT(self, proposals, stage, targets):
-        """
-        Add "gt_object_description" and "foreground" to detectron2's _match_and_label_boxes
-        """
-        num_fg_samples, num_bg_samples = [], []
-        for proposals_per_image, targets_per_image in zip(proposals, targets):
-            match_quality_matrix = pairwise_iou(
-                targets_per_image.gt_boxes, proposals_per_image.proposal_boxes
-            )
-            # proposal_labels are 0 or 1
-            matched_idxs, proposal_labels = self.proposal_matchers[stage](match_quality_matrix)
-            if len(targets_per_image) > 0:
-                gt_classes = targets_per_image.gt_classes[matched_idxs]
-                # Label unmatched proposals (0 label from matcher) as background (label=num_classes)
-                gt_classes[proposal_labels == 0] = self.num_classes
-                foreground = torch.ones_like(gt_classes)
-                foreground[proposal_labels == 0] = 0
-                gt_boxes = targets_per_image.gt_boxes[matched_idxs]
-                gt_object_descriptions = targets_per_image.gt_object_descriptions[matched_idxs]
-            else:
-                gt_classes = torch.zeros_like(matched_idxs) + self.num_classes
-                foreground = torch.zeros_like(gt_classes)
-                gt_boxes = Boxes(
-                    targets_per_image.gt_boxes.tensor.new_zeros((len(proposals_per_image), 4))
-                )
-                gt_object_descriptions = ObjDescription(['None' for i in range(len(proposals_per_image))])
-            proposals_per_image.gt_classes = gt_classes
-            proposals_per_image.gt_boxes = gt_boxes
-            proposals_per_image.gt_object_descriptions = gt_object_descriptions
-            proposals_per_image.foreground = foreground
-
-            num_fg_samples.append((proposal_labels == 1).sum().item())
-            num_bg_samples.append(proposal_labels.numel() - num_fg_samples[-1])
-
-        # Log the number of fg/bg samples in each stage
-        storage = get_event_storage()
-        storage.put_scalar(
-            "stage{}/roi_head/num_fg_samples".format(stage),
-            sum(num_fg_samples) / len(num_fg_samples),
-        )
-        storage.put_scalar(
-            "stage{}/roi_head/num_bg_samples".format(stage),
-            sum(num_bg_samples) / len(num_bg_samples),
-        )
-        return proposals
-
-    def fast_rcnn_inference_GRiT(
-            self,
-            boxes: List[torch.Tensor],
-            scores: List[torch.Tensor],
-            logits: List[torch.Tensor],
-            image_shapes: List[Tuple[int, int]],
-            score_thresh: float,
-            nms_thresh: float,
-            topk_per_image: int,
-            soft_nms_enabled: bool,
-    ):
-        result_per_image = [
-            self.fast_rcnn_inference_single_image_GRiT(
-                boxes_per_image, scores_per_image, logits_per_image, image_shape,
-                score_thresh, nms_thresh, topk_per_image, soft_nms_enabled
-            )
-            for scores_per_image, boxes_per_image, image_shape, logits_per_image \
-            in zip(scores, boxes, image_shapes, logits)
-        ]
-        return [x[0] for x in result_per_image], [x[1] for x in result_per_image]
-
-    def fast_rcnn_inference_single_image_GRiT(
-            self,
-            boxes,
-            scores,
-            logits,
-            image_shape: Tuple[int, int],
-            score_thresh: 
float, - nms_thresh: float, - topk_per_image: int, - soft_nms_enabled, - ): - """ - Add soft NMS to detectron2's fast_rcnn_inference_single_image - """ - valid_mask = torch.isfinite(boxes).all(dim=1) & torch.isfinite(scores).all(dim=1) - if not valid_mask.all(): - boxes = boxes[valid_mask] - scores = scores[valid_mask] - logits = logits[valid_mask] - - scores = scores[:, :-1] - logits = logits[:, :-1] - num_bbox_reg_classes = boxes.shape[1] // 4 - # Convert to Boxes to use the `clip` function ... - boxes = Boxes(boxes.reshape(-1, 4)) - boxes.clip(image_shape) - boxes = boxes.tensor.view(-1, num_bbox_reg_classes, 4) # R x C x 4 - - # 1. Filter results based on detection scores. It can make NMS more efficient - # by filtering out low-confidence detections. - filter_mask = scores > score_thresh # R x K - # R' x 2. First column contains indices of the R predictions; - # Second column contains indices of classes. - filter_inds = filter_mask.nonzero() - if num_bbox_reg_classes == 1: - boxes = boxes[filter_inds[:, 0], 0] - else: - boxes = boxes[filter_mask] - scores = scores[filter_mask] - logits = logits[filter_mask] - - # 2. Apply NMS for each class independently. - if not soft_nms_enabled: - keep = batched_nms(boxes, scores, filter_inds[:, 1], nms_thresh) - else: - keep, soft_nms_scores = batched_soft_nms( - boxes, - scores, - filter_inds[:, 1], - "linear", - 0.5, - nms_thresh, - 0.001, - ) - scores[keep] = soft_nms_scores - if topk_per_image >= 0: - keep = keep[:topk_per_image] - boxes, scores, filter_inds = boxes[keep], scores[keep], filter_inds[keep] - logits = logits[keep] - - result = Instances(image_shape) - result.pred_boxes = Boxes(boxes) - result.scores = scores - result.pred_classes = filter_inds[:, 1] - result.logits = logits - return result, filter_inds[:, 0] - - def _get_empty_mask_loss(self, device): - if self.mask_on: - return {'loss_mask': torch.zeros( - (1, ), device=device, dtype=torch.float32)[0]} - else: - return {} - - def _create_proposals_from_boxes(self, boxes, image_sizes, logits): - boxes = [Boxes(b.detach()) for b in boxes] - proposals = [] - for boxes_per_image, image_size, logit in zip( - boxes, image_sizes, logits): - boxes_per_image.clip(image_size) - if self.training: - inds = boxes_per_image.nonempty() - boxes_per_image = boxes_per_image[inds] - logit = logit[inds] - prop = Instances(image_size) - prop.proposal_boxes = boxes_per_image - prop.objectness_logits = logit - proposals.append(prop) - return proposals - - def _run_stage(self, features, proposals, stage): - pool_boxes = [x.proposal_boxes for x in proposals] - box_features = self.box_pooler(features, pool_boxes) - box_features = _ScaleGradient.apply(box_features, 1.0 / self.num_cascade_stages) - box_features = self.box_head[stage](box_features) - return self.box_predictor[stage](box_features) diff --git a/grit_src_deprecated/grit/modeling/soft_nms.py b/grit_src_deprecated/grit/modeling/soft_nms.py deleted file mode 100755 index 6a5aae7c..00000000 --- a/grit_src_deprecated/grit/modeling/soft_nms.py +++ /dev/null @@ -1,177 +0,0 @@ -import torch - -from detectron2.structures import Boxes, RotatedBoxes, pairwise_iou, pairwise_iou_rotated - - -def soft_nms(boxes, scores, method, gaussian_sigma, linear_threshold, prune_threshold): - """ - Performs soft non-maximum suppression algorithm on axis aligned boxes - - Args: - boxes (Tensor[N, 5]): - boxes where NMS will be performed. 
They - are expected to be in (x_ctr, y_ctr, width, height, angle_degrees) format - scores (Tensor[N]): - scores for each one of the boxes - method (str): - one of ['gaussian', 'linear', 'hard'] - see paper for details. users encouraged not to use "hard", as this is the - same nms available elsewhere in detectron2 - gaussian_sigma (float): - parameter for Gaussian penalty function - linear_threshold (float): - iou threshold for applying linear decay. Nt from the paper - re-used as threshold for standard "hard" nms - prune_threshold (float): - boxes with scores below this threshold are pruned at each iteration. - Dramatically reduces computation time. Authors use values in [10e-4, 10e-2] - - Returns: - tuple(Tensor, Tensor): - [0]: int64 tensor with the indices of the elements that have been kept - by Soft NMS, sorted in decreasing order of scores - [1]: float tensor with the re-scored scores of the elements that were kept -""" - return _soft_nms( - Boxes, - pairwise_iou, - boxes, - scores, - method, - gaussian_sigma, - linear_threshold, - prune_threshold, - ) - - -def batched_soft_nms( - boxes, scores, idxs, method, gaussian_sigma, linear_threshold, prune_threshold -): - """ - Performs soft non-maximum suppression in a batched fashion. - - Each index value correspond to a category, and NMS - will not be applied between elements of different categories. - - Args: - boxes (Tensor[N, 4]): - boxes where NMS will be performed. They - are expected to be in (x1, y1, x2, y2) format - scores (Tensor[N]): - scores for each one of the boxes - idxs (Tensor[N]): - indices of the categories for each one of the boxes. - method (str): - one of ['gaussian', 'linear', 'hard'] - see paper for details. users encouraged not to use "hard", as this is the - same nms available elsewhere in detectron2 - gaussian_sigma (float): - parameter for Gaussian penalty function - linear_threshold (float): - iou threshold for applying linear decay. Nt from the paper - re-used as threshold for standard "hard" nms - prune_threshold (float): - boxes with scores below this threshold are pruned at each iteration. - Dramatically reduces computation time. Authors use values in [10e-4, 10e-2] - Returns: - tuple(Tensor, Tensor): - [0]: int64 tensor with the indices of the elements that have been kept - by Soft NMS, sorted in decreasing order of scores - [1]: float tensor with the re-scored scores of the elements that were kept - """ - if boxes.numel() == 0: - return ( - torch.empty((0,), dtype=torch.int64, device=boxes.device), - torch.empty((0,), dtype=torch.float32, device=scores.device), - ) - # strategy: in order to perform NMS independently per class. - # we add an offset to all the boxes. The offset is dependent - # only on the class idx, and is large enough so that boxes - # from different classes do not overlap - max_coordinate = boxes.max() - offsets = idxs.to(boxes) * (max_coordinate + 1) - boxes_for_nms = boxes + offsets[:, None] - return soft_nms( - boxes_for_nms, scores, method, gaussian_sigma, linear_threshold, prune_threshold - ) - - -def _soft_nms( - box_class, - pairwise_iou_func, - boxes, - scores, - method, - gaussian_sigma, - linear_threshold, - prune_threshold, -): - """ - Soft non-max suppression algorithm. 
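The class-offset trick in `batched_soft_nms` above is worth a standalone illustration: shifting every box by a class-dependent offset larger than any coordinate guarantees that boxes of different classes can never overlap, so a single class-agnostic NMS pass behaves like per-class NMS. A minimal sketch with made-up boxes:

```
import torch

def add_class_offsets(boxes, idxs):
    """boxes: (N, 4) in (x1, y1, x2, y2); idxs: (N,) integer class ids."""
    max_coordinate = boxes.max()
    offsets = idxs.to(boxes) * (max_coordinate + 1)
    return boxes + offsets[:, None]

boxes = torch.tensor([[0.0, 0.0, 10.0, 10.0], [1.0, 1.0, 9.0, 9.0]])
idxs = torch.tensor([0, 1])
print(add_class_offsets(boxes, idxs))
# tensor([[ 0.,  0., 10., 10.],
#         [12., 12., 20., 20.]])  -- the class-1 box no longer overlaps class 0
```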
- - Implementation of [Soft-NMS -- Improving Object Detection With One Line of Codec] - (https://arxiv.org/abs/1704.04503) - - Args: - box_class (cls): one of Box, RotatedBoxes - pairwise_iou_func (func): one of pairwise_iou, pairwise_iou_rotated - boxes (Tensor[N, ?]): - boxes where NMS will be performed - if Boxes, in (x1, y1, x2, y2) format - if RotatedBoxes, in (x_ctr, y_ctr, width, height, angle_degrees) format - scores (Tensor[N]): - scores for each one of the boxes - method (str): - one of ['gaussian', 'linear', 'hard'] - see paper for details. users encouraged not to use "hard", as this is the - same nms available elsewhere in detectron2 - gaussian_sigma (float): - parameter for Gaussian penalty function - linear_threshold (float): - iou threshold for applying linear decay. Nt from the paper - re-used as threshold for standard "hard" nms - prune_threshold (float): - boxes with scores below this threshold are pruned at each iteration. - Dramatically reduces computation time. Authors use values in [10e-4, 10e-2] - - Returns: - tuple(Tensor, Tensor): - [0]: int64 tensor with the indices of the elements that have been kept - by Soft NMS, sorted in decreasing order of scores - [1]: float tensor with the re-scored scores of the elements that were kept - """ - boxes = boxes.clone() - scores = scores.clone() - idxs = torch.arange(scores.size()[0]) - - idxs_out = [] - scores_out = [] - - while scores.numel() > 0: - top_idx = torch.argmax(scores) - idxs_out.append(idxs[top_idx].item()) - scores_out.append(scores[top_idx].item()) - - top_box = boxes[top_idx] - ious = pairwise_iou_func(box_class(top_box.unsqueeze(0)), box_class(boxes))[0] - - if method == "linear": - decay = torch.ones_like(ious) - decay_mask = ious > linear_threshold - decay[decay_mask] = 1 - ious[decay_mask] - elif method == "gaussian": - decay = torch.exp(-torch.pow(ious, 2) / gaussian_sigma) - elif method == "hard": # standard NMS - decay = (ious < linear_threshold).float() - else: - raise NotImplementedError("{} soft nms method not implemented.".format(method)) - - scores *= decay - keep = scores > prune_threshold - keep[top_idx] = False - - boxes = boxes[keep] - scores = scores[keep] - idxs = idxs[keep] - - return torch.tensor(idxs_out).to(boxes.device), torch.tensor(scores_out).to(scores.device) \ No newline at end of file diff --git a/grit_src_deprecated/grit/modeling/text/file_utils.py b/grit_src_deprecated/grit/modeling/text/file_utils.py deleted file mode 100755 index 51918cf3..00000000 --- a/grit_src_deprecated/grit/modeling/text/file_utils.py +++ /dev/null @@ -1,256 +0,0 @@ -# Utilities for working with the local dataset cache. -# This file is adapted from the AllenNLP library at https://github.com/allenai/allennlp -# Copyright by the AllenNLP authors. 
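Before moving on to the cache utilities: the heart of the `_soft_nms` loop removed above is its per-iteration re-scoring of boxes that overlap the current top-scoring box. The three decay modes, extracted into a standalone function (threshold defaults here are illustrative; the GRiT call site uses "linear" with a Gaussian sigma of 0.5):

```
import torch

def soft_nms_decay(ious, method, gaussian_sigma=0.5, linear_threshold=0.3):
    """Score multipliers for boxes overlapping the current top box."""
    if method == "linear":
        decay = torch.ones_like(ious)
        mask = ious > linear_threshold
        decay[mask] = 1.0 - ious[mask]           # linear penalty above Nt
    elif method == "gaussian":
        decay = torch.exp(-(ious ** 2) / gaussian_sigma)
    elif method == "hard":                        # standard NMS as a special case
        decay = (ious < linear_threshold).float()
    else:
        raise NotImplementedError(method)
    return decay

ious = torch.tensor([0.1, 0.4, 0.9])
for m in ("linear", "gaussian", "hard"):
    print(m, soft_nms_decay(ious, m))
```

(In passing, the docstring's citation "Improving Object Detection With One Line of Codec" is a typo for "...One Line of Code", arXiv:1704.04503.)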
- -from __future__ import absolute_import, division, print_function, unicode_literals - -import sys -import json -import logging -import os -import shutil -import tempfile -import fnmatch -from functools import wraps -from hashlib import sha256 -from io import open - -import boto3 -import requests -from botocore.exceptions import ClientError -from tqdm import tqdm - -try: - from torch.hub import _get_torch_home - torch_cache_home = _get_torch_home() -except ImportError: - torch_cache_home = os.path.expanduser( - os.getenv('TORCH_HOME', os.path.join( - os.getenv('XDG_CACHE_HOME', '~/.cache'), 'torch'))) -default_cache_path = os.path.join(torch_cache_home, 'pytorch_transformers') - -try: - from urllib.parse import urlparse -except ImportError: - from urlparse import urlparse - -try: - from pathlib import Path - PYTORCH_PRETRAINED_BERT_CACHE = Path( - os.getenv('PYTORCH_PRETRAINED_BERT_CACHE', default_cache_path)) -except (AttributeError, ImportError): - PYTORCH_PRETRAINED_BERT_CACHE = os.getenv('PYTORCH_PRETRAINED_BERT_CACHE', - default_cache_path) - -logger = logging.getLogger(__name__) # pylint: disable=invalid-name - - -def url_to_filename(url, etag=None): - """ - Convert `url` into a hashed filename in a repeatable way. - If `etag` is specified, append its hash to the url's, delimited - by a period. - """ - url_bytes = url.encode('utf-8') - url_hash = sha256(url_bytes) - filename = url_hash.hexdigest() - - if etag: - etag_bytes = etag.encode('utf-8') - etag_hash = sha256(etag_bytes) - filename += '.' + etag_hash.hexdigest() - - return filename - - -def filename_to_url(filename, cache_dir=None): - """ - Return the url and etag (which may be ``None``) stored for `filename`. - Raise ``EnvironmentError`` if `filename` or its stored metadata do not exist. - """ - if cache_dir is None: - cache_dir = PYTORCH_PRETRAINED_BERT_CACHE - if sys.version_info[0] == 3 and isinstance(cache_dir, Path): - cache_dir = str(cache_dir) - - cache_path = os.path.join(cache_dir, filename) - if not os.path.exists(cache_path): - raise EnvironmentError("file {} not found".format(cache_path)) - - meta_path = cache_path + '.json' - if not os.path.exists(meta_path): - raise EnvironmentError("file {} not found".format(meta_path)) - - with open(meta_path, encoding="utf-8") as meta_file: - metadata = json.load(meta_file) - url = metadata['url'] - etag = metadata['etag'] - - return url, etag - - -def cached_path(url_or_filename, cache_dir=None): - """ - Given something that might be a URL (or might be a local path), - determine which. If it's a URL, download the file and cache it, and - return the path to the cached file. If it's already a local path, - make sure the file exists and then return the path. - """ - if cache_dir is None: - cache_dir = PYTORCH_PRETRAINED_BERT_CACHE - if sys.version_info[0] == 3 and isinstance(url_or_filename, Path): - url_or_filename = str(url_or_filename) - if sys.version_info[0] == 3 and isinstance(cache_dir, Path): - cache_dir = str(cache_dir) - - parsed = urlparse(url_or_filename) - - if parsed.scheme in ('http', 'https', 's3'): - # URL, so get it from the cache (downloading if necessary) - return get_from_cache(url_or_filename, cache_dir) - elif os.path.exists(url_or_filename): - # File, and it exists. - return url_or_filename - elif parsed.scheme == '': - # File, but it doesn't exist. 
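For context on `url_to_filename` above: cache filenames are content-addressed from the URL, with the ETag's hash appended when the server provides one, so a changed remote file gets a fresh cache entry. Standalone (the URL and ETag are illustrative):

```
from hashlib import sha256

def url_to_filename(url, etag=None):
    filename = sha256(url.encode("utf-8")).hexdigest()
    if etag:
        filename += "." + sha256(etag.encode("utf-8")).hexdigest()
    return filename

print(url_to_filename("https://example.com/model.bin", etag='"abc123"'))
```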
- raise EnvironmentError("file {} not found".format(url_or_filename)) - else: - # Something unknown - raise ValueError("unable to parse {} as a URL or as a local path".format(url_or_filename)) - - -def split_s3_path(url): - """Split a full s3 path into the bucket name and path.""" - parsed = urlparse(url) - if not parsed.netloc or not parsed.path: - raise ValueError("bad s3 path {}".format(url)) - bucket_name = parsed.netloc - s3_path = parsed.path - # Remove '/' at beginning of path. - if s3_path.startswith("/"): - s3_path = s3_path[1:] - return bucket_name, s3_path - - -def s3_request(func): - """ - Wrapper function for s3 requests in order to create more helpful error - messages. - """ - - @wraps(func) - def wrapper(url, *args, **kwargs): - try: - return func(url, *args, **kwargs) - except ClientError as exc: - if int(exc.response["Error"]["Code"]) == 404: - raise EnvironmentError("file {} not found".format(url)) - else: - raise - - return wrapper - - -@s3_request -def s3_etag(url): - """Check ETag on S3 object.""" - s3_resource = boto3.resource("s3") - bucket_name, s3_path = split_s3_path(url) - s3_object = s3_resource.Object(bucket_name, s3_path) - return s3_object.e_tag - - -@s3_request -def s3_get(url, temp_file): - """Pull a file directly from S3.""" - s3_resource = boto3.resource("s3") - bucket_name, s3_path = split_s3_path(url) - s3_resource.Bucket(bucket_name).download_fileobj(s3_path, temp_file) - - -def http_get(url, temp_file): - req = requests.get(url, stream=True) - content_length = req.headers.get('Content-Length') - total = int(content_length) if content_length is not None else None - progress = tqdm(unit="B", total=total) - for chunk in req.iter_content(chunk_size=1024): - if chunk: # filter out keep-alive new chunks - progress.update(len(chunk)) - temp_file.write(chunk) - progress.close() - - -def get_from_cache(url, cache_dir=None): - """ - Given a URL, look for the corresponding dataset in the local cache. - If it's not there, download it. Then return the path to the cached file. - """ - if cache_dir is None: - cache_dir = PYTORCH_PRETRAINED_BERT_CACHE - if sys.version_info[0] == 3 and isinstance(cache_dir, Path): - cache_dir = str(cache_dir) - if sys.version_info[0] == 2 and not isinstance(cache_dir, str): - cache_dir = str(cache_dir) - - if not os.path.exists(cache_dir): - os.makedirs(cache_dir) - - # Get eTag to add to filename, if it exists. - if url.startswith("s3://"): - etag = s3_etag(url) - else: - try: - response = requests.head(url, allow_redirects=True) - if response.status_code != 200: - etag = None - else: - etag = response.headers.get("ETag") - except EnvironmentError: - etag = None - - if sys.version_info[0] == 2 and etag is not None: - etag = etag.decode('utf-8') - filename = url_to_filename(url, etag) - - # get cache path to put the file - cache_path = os.path.join(cache_dir, filename) - - # If we don't have a connection (etag is None) and can't identify the file - # try to get the last downloaded one - if not os.path.exists(cache_path) and etag is None: - matching_files = fnmatch.filter(os.listdir(cache_dir), filename + '.*') - matching_files = list(filter(lambda s: not s.endswith('.json'), matching_files)) - if matching_files: - cache_path = os.path.join(cache_dir, matching_files[-1]) - - if not os.path.exists(cache_path): - # Download to temporary file, then copy to cache dir once finished. - # Otherwise you get corrupt cache entries if the download gets interrupted. 
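The comment above is the key design point of `get_from_cache`: download into a temporary file and copy into the cache only after the transfer finishes, so an interrupted download can never leave a truncated entry behind. A minimal sketch of that pattern; the path and the `download_fn` callback are ours:

```
import shutil
import tempfile

def atomic_cache_write(cache_path, download_fn):
    with tempfile.NamedTemporaryFile() as temp_file:
        download_fn(temp_file)   # e.g. http_get(url, temp_file) or s3_get(...)
        temp_file.flush()        # make sure everything is on disk
        temp_file.seek(0)        # copyfileobj starts at the current offset
        with open(cache_path, "wb") as cache_file:
            shutil.copyfileobj(temp_file, cache_file)

atomic_cache_write("/tmp/demo_cache_entry", lambda f: f.write(b"payload"))
print(open("/tmp/demo_cache_entry", "rb").read())  # b'payload'
```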
- with tempfile.NamedTemporaryFile() as temp_file: - logger.info("%s not found in cache, downloading to %s", url, temp_file.name) - - # GET file object - if url.startswith("s3://"): - s3_get(url, temp_file) - else: - http_get(url, temp_file) - - # we are copying the file before closing it, so flush to avoid truncation - temp_file.flush() - # shutil.copyfileobj() starts at the current position, so go to the start - temp_file.seek(0) - - logger.info("copying %s to cache at %s", temp_file.name, cache_path) - with open(cache_path, 'wb') as cache_file: - shutil.copyfileobj(temp_file, cache_file) - - logger.info("creating metadata file for %s", cache_path) - meta = {'url': url, 'etag': etag} - meta_path = cache_path + '.json' - with open(meta_path, 'w') as meta_file: - output_string = json.dumps(meta) - meta_file.write(output_string) - - logger.info("removing temp file %s", temp_file.name) - - return cache_path diff --git a/grit_src_deprecated/grit/modeling/text/load_text_token.py b/grit_src_deprecated/grit/modeling/text/load_text_token.py deleted file mode 100755 index 8491021b..00000000 --- a/grit_src_deprecated/grit/modeling/text/load_text_token.py +++ /dev/null @@ -1,80 +0,0 @@ -import torch - - -class LoadTextTokens(object): - def __init__(self, tokenizer, max_text_len=40, padding='do_not_pad'): - self.tokenizer = tokenizer - self.max_text_len = max_text_len - self.padding = padding - - def descriptions_to_text_tokens(self, target, begin_token): - target_encoding = self.tokenizer( - target, padding=self.padding, - add_special_tokens=False, - truncation=True, max_length=self.max_text_len) - - need_predict = [1] * len(target_encoding['input_ids']) - payload = target_encoding['input_ids'] - if len(payload) > self.max_text_len - 2: - payload = payload[-(self.max_text_len - 2):] - need_predict = payload[-(self.max_text_len - 2):] - - input_ids = [begin_token] + payload + [self.tokenizer.sep_token_id] - - need_predict = [0] + need_predict + [1] - data = { - 'text_tokens': torch.tensor(input_ids), - 'text_lengths': len(input_ids), - 'need_predict': torch.tensor(need_predict), - } - - return data - - def __call__(self, object_descriptions, box_features, begin_token): - text_tokens = [] - text_lengths = [] - need_predict = [] - for description in object_descriptions: - tokens = self.descriptions_to_text_tokens(description, begin_token) - text_tokens.append(tokens['text_tokens']) - text_lengths.append(tokens['text_lengths']) - need_predict.append(tokens['need_predict']) - - text_tokens = torch.cat(self.collate(text_tokens), dim=0).to(box_features.device) - text_lengths = torch.tensor(text_lengths).to(box_features.device) - need_predict = torch.cat(self.collate(need_predict), dim=0).to(box_features.device) - - assert text_tokens.dim() == 2 and need_predict.dim() == 2 - data = {'text_tokens': text_tokens, - 'text_lengths': text_lengths, - 'need_predict': need_predict} - - return data - - def collate(self, batch): - if all(isinstance(b, torch.Tensor) for b in batch) and len(batch) > 0: - if not all(b.shape == batch[0].shape for b in batch[1:]): - assert all(len(b.shape) == len(batch[0].shape) for b in batch[1:]) - shape = torch.tensor([b.shape for b in batch]) - max_shape = tuple(shape.max(dim=0)[0].tolist()) - batch2 = [] - for b in batch: - if any(c < m for c, m in zip(b.shape, max_shape)): - b2 = torch.zeros(max_shape, dtype=b.dtype, device=b.device) - if b.dim() == 1: - b2[:b.shape[0]] = b - elif b.dim() == 2: - b2[:b.shape[0], :b.shape[1]] = b - elif b.dim() == 3: - b2[:b.shape[0], :b.shape[1], 
:b.shape[2]] = b - else: - raise NotImplementedError - b = b2 - batch2.append(b[None, ...]) - else: - batch2 = [] - for b in batch: - batch2.append(b[None, ...]) - return batch2 - else: - raise NotImplementedError diff --git a/grit_src_deprecated/grit/modeling/text/modeling_bert.py b/grit_src_deprecated/grit/modeling/text/modeling_bert.py deleted file mode 100755 index 3f8bf2d5..00000000 --- a/grit_src_deprecated/grit/modeling/text/modeling_bert.py +++ /dev/null @@ -1,529 +0,0 @@ -# coding=utf-8 -# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. -# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. -"""PyTorch BERT model. """ -# Adapted from https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py - -from __future__ import absolute_import, division, print_function, unicode_literals -import copy -import os -import json -import logging -import math -import sys -from io import open -import torch -from torch import nn -import torch.utils.checkpoint as checkpoint -from .file_utils import cached_path - - -logger = logging.getLogger() - - -BERT_PRETRAINED_CONFIG_ARCHIVE_MAP = { - 'bert-base-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json", - 'bert-large-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-config.json", - 'bert-base-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json", - 'bert-large-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-config.json", - 'bert-base-multilingual-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-config.json", - 'bert-base-multilingual-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-config.json", - 'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-config.json", - 'bert-base-german-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-config.json", - 'bert-large-uncased-whole-word-masking': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-config.json", - 'bert-large-cased-whole-word-masking': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-config.json", - 'bert-large-uncased-whole-word-masking-finetuned-squad': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-config.json", - 'bert-large-cased-whole-word-masking-finetuned-squad': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-config.json", - 'bert-base-cased-finetuned-mrpc': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-config.json", -} - - -def qk2attn(query, key, attention_mask, gamma): - query = query / gamma - attention_scores = torch.matmul(query, 
key.transpose(-1, -2)) - if attention_mask is not None: - # Apply the attention mask is (precomputed for all layers in BertModel forward() function) - attention_scores = attention_scores + attention_mask - return attention_scores.softmax(dim=-1) - - -class QK2Attention(nn.Module): - def forward(self, query, key, attention_mask, gamma): - return qk2attn(query, key, attention_mask, gamma) - - -LayerNormClass = torch.nn.LayerNorm - - -class BertSelfAttention(nn.Module): - def __init__(self, config): - super(BertSelfAttention, self).__init__() - if config.hidden_size % config.num_attention_heads != 0: - raise ValueError( - "The hidden size (%d) is not a multiple of the number of attention " - "heads (%d)" % (config.hidden_size, config.num_attention_heads)) - self.output_attentions = config.output_attentions - - self.num_attention_heads = config.num_attention_heads - self.attention_head_size = int(config.hidden_size / config.num_attention_heads) - self.all_head_size = self.num_attention_heads * self.attention_head_size - - self.query = nn.Linear(config.hidden_size, self.all_head_size) - self.key = nn.Linear(config.hidden_size, self.all_head_size) - self.value = nn.Linear(config.hidden_size, self.all_head_size) - - self.dropout = nn.Dropout(config.attention_probs_dropout_prob) - self.softmax = nn.Softmax(dim=-1) - self.qk2attn = QK2Attention() - - def transpose_for_scores(self, x): - if torch._C._get_tracing_state(): - # exporter is not smart enough to detect dynamic size for some paths - x = x.view(x.shape[0], -1, self.num_attention_heads, self.attention_head_size) - else: - new_x_shape = x.size()[:-1] + (self.num_attention_heads, self.attention_head_size) - x = x.view(*new_x_shape) - return x.permute(0, 2, 1, 3) - - def forward(self, hidden_states, attention_mask, head_mask=None, - history_state=None): - if history_state is not None: - x_states = torch.cat([history_state, hidden_states], dim=1) - mixed_query_layer = self.query(hidden_states) - mixed_key_layer = self.key(x_states) - mixed_value_layer = self.value(x_states) - else: - mixed_query_layer = self.query(hidden_states) - mixed_key_layer = self.key(hidden_states) - mixed_value_layer = self.value(hidden_states) - - query_layer = self.transpose_for_scores(mixed_query_layer) - key_layer = self.transpose_for_scores(mixed_key_layer) - value_layer = self.transpose_for_scores(mixed_value_layer) - - attention_probs = self.qk2attn(query_layer, key_layer, attention_mask, math.sqrt(self.attention_head_size)) - - # This is actually dropping out entire tokens to attend to, which might - # seem a bit unusual, but is taken from the original Transformer paper. 
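`qk2attn` above is a plain scaled dot-product attention kernel with an additive mask, where `gamma` is `sqrt(head_dim)`. Reduced to a runnable snippet (shapes illustrative):

```
import math
import torch

def qk2attn(query, key, attention_mask, gamma):
    query = query / gamma
    scores = torch.matmul(query, key.transpose(-1, -2))
    if attention_mask is not None:
        scores = scores + attention_mask   # mask entries are 0 or -inf
    return scores.softmax(dim=-1)

q = torch.randn(2, 12, 5, 64)   # (batch, heads, tokens, head_dim)
k = torch.randn(2, 12, 5, 64)
probs = qk2attn(q, k, None, math.sqrt(64))
print(probs.shape, float(probs.sum(-1).mean()))  # rows sum to 1
```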
- attention_probs = self.dropout(attention_probs) - - # Mask heads if we want to - if head_mask is not None: - attention_probs = attention_probs * head_mask - - context_layer = torch.matmul(attention_probs, value_layer) - - context_layer = context_layer.permute(0, 2, 1, 3).contiguous() - new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,) - context_layer = context_layer.view(*new_context_layer_shape) - - outputs = (context_layer, attention_probs) if self.output_attentions else (context_layer,) - return outputs - - -class BertSelfOutput(nn.Module): - def __init__(self, config): - super(BertSelfOutput, self).__init__() - self.dense = nn.Linear(config.hidden_size, config.hidden_size) - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - if not self.pre_norm: - self.LayerNorm = LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - self.dropout = nn.Dropout(config.hidden_dropout_prob) - - def forward(self, hidden_states, input_tensor): - hidden_states = self.dense(hidden_states) - hidden_states = self.dropout(hidden_states) - if not self.pre_norm: - hidden_states = self.LayerNorm(hidden_states + input_tensor) - else: - hidden_states = hidden_states + input_tensor - return hidden_states - - -class BertAttention(nn.Module): - def __init__(self, config): - super(BertAttention, self).__init__() - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - if self.pre_norm: - self.LayerNorm = LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - self.self = BertSelfAttention(config) - self.output = BertSelfOutput(config) - - def forward(self, input_tensor, attention_mask, head_mask=None, - history_state=None): - if self.pre_norm: - self_outputs = self.self(self.LayerNorm(input_tensor), attention_mask, head_mask, - self.layerNorm(history_state) if history_state else history_state) - else: - self_outputs = self.self(input_tensor, attention_mask, head_mask, - history_state) - attention_output = self.output(self_outputs[0], input_tensor) - outputs = (attention_output,) + self_outputs[1:] # add attentions if we output them - return outputs - - -class BertIntermediate(nn.Module): - def __init__(self, config): - super(BertIntermediate, self).__init__() - self.dense = nn.Linear(config.hidden_size, config.intermediate_size) - assert config.hidden_act == 'gelu', 'Please implement other activation functions' - self.intermediate_act_fn = _gelu_python - - def forward(self, hidden_states): - hidden_states = self.dense(hidden_states) - hidden_states = self.intermediate_act_fn(hidden_states) - return hidden_states - - -class BertOutput(nn.Module): - def __init__(self, config): - super(BertOutput, self).__init__() - self.dense = nn.Linear(config.intermediate_size, config.hidden_size) - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - self.dropout = nn.Dropout(config.hidden_dropout_prob) - if not self.pre_norm: - self.LayerNorm = LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - - def forward(self, hidden_states, input_tensor): - hidden_states = self.dense(hidden_states) - hidden_states = self.dropout(hidden_states) - if not self.pre_norm: - hidden_states = self.LayerNorm(hidden_states + input_tensor) - else: - hidden_states = hidden_states + input_tensor - return hidden_states - - -class Mlp(nn.Module): - def __init__(self, config): - super().__init__() - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - self.intermediate = BertIntermediate(config) - if self.pre_norm: - self.LayerNorm = 
LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - self.output = BertOutput(config) - - def forward(self, attention_output): - if not self.pre_norm: - intermediate_output = self.intermediate(attention_output) - else: - intermediate_output = self.intermediate(self.LayerNorm(attention_output)) - layer_output = self.output(intermediate_output, attention_output) - return layer_output - - -class BertLayer(nn.Module): - def __init__(self, config, use_act_checkpoint=True): - super(BertLayer, self).__init__() - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - self.use_mlp_wrapper = hasattr(config, 'use_mlp_wrapper') and config.use_mlp_wrapper - self.attention = BertAttention(config) - self.use_act_checkpoint = use_act_checkpoint - if self.use_mlp_wrapper: - self.mlp = Mlp(config) - else: - self.intermediate = BertIntermediate(config) - if self.pre_norm: - self.LayerNorm = LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - self.output = BertOutput(config) - - def forward(self, hidden_states, attention_mask, head_mask=None, - history_state=None): - if self.use_act_checkpoint: - attention_outputs = checkpoint.checkpoint(self.attention, hidden_states, - attention_mask, head_mask, history_state) - else: - attention_outputs = self.attention(hidden_states, attention_mask, - head_mask, history_state) - attention_output = attention_outputs[0] - if self.use_mlp_wrapper: - layer_output = self.mlp(attention_output) - else: - if not self.pre_norm: - intermediate_output = self.intermediate(attention_output) - else: - intermediate_output = self.intermediate(self.LayerNorm(attention_output)) - layer_output = self.output(intermediate_output, attention_output) - outputs = (layer_output,) + attention_outputs[1:] # add attentions if we output them - return outputs - - -class BertEncoder(nn.Module): - def __init__(self, config, use_act_checkpoint=True): - super(BertEncoder, self).__init__() - self.output_attentions = config.output_attentions - self.output_hidden_states = config.output_hidden_states - self.layer = nn.ModuleList([BertLayer(config, use_act_checkpoint=use_act_checkpoint) for _ in range(config.num_hidden_layers)]) - self.pre_norm = hasattr(config, 'pre_norm') and config.pre_norm - if self.pre_norm: - self.LayerNorm = LayerNormClass(config.hidden_size, eps=config.layer_norm_eps) - - def forward(self, hidden_states, attention_mask, head_mask=None, - encoder_history_states=None): - all_hidden_states = () - all_attentions = () - for i, layer_module in enumerate(self.layer): - if self.output_hidden_states: - all_hidden_states = all_hidden_states + (hidden_states,) - - history_state = None if encoder_history_states is None else encoder_history_states[i] - layer_outputs = layer_module( - hidden_states, attention_mask, - (None if head_mask is None else head_mask[i]), - history_state, - ) - hidden_states = layer_outputs[0] - - if self.output_attentions: - all_attentions = all_attentions + (layer_outputs[1],) - if self.pre_norm: - hidden_states = self.LayerNorm(hidden_states) - outputs = (hidden_states,) - if self.output_hidden_states: - outputs = outputs + (all_hidden_states,) - if self.output_attentions: - outputs = outputs + (all_attentions,) - return outputs - -CONFIG_NAME = "config.json" - -class PretrainedConfig(object): - """ Base class for all configuration classes. - Handle a few common parameters and methods for loading/downloading/saving configurations. 
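Two notes on the layers above. First, the recurring `pre_norm` flag selects between post-norm (LayerNorm after the residual add, as in the original BERT) and pre-norm (LayerNorm on the sublayer input, leaving the residual path untouched). A generic sketch of the two orderings:

```
import torch
from torch import nn

class ResidualBlock(nn.Module):
    """Post-norm: norm(x + f(x)).  Pre-norm: x + f(norm(x))."""
    def __init__(self, dim, sublayer, pre_norm=False):
        super().__init__()
        self.sublayer = sublayer
        self.norm = nn.LayerNorm(dim)
        self.pre_norm = pre_norm

    def forward(self, x):
        if self.pre_norm:
            return x + self.sublayer(self.norm(x))
        return self.norm(x + self.sublayer(x))

block = ResidualBlock(16, nn.Linear(16, 16), pre_norm=True)
print(block(torch.randn(4, 16)).shape)  # torch.Size([4, 16])
```

Second, `BertAttention.forward`'s pre-norm branch writes `self.layerNorm(history_state)` although the attribute is registered as `self.LayerNorm`; that branch could never have executed successfully with a non-None `history_state`, which is worth knowing if this deprecated code is ever resurrected.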
- """ - pretrained_config_archive_map = {} - - def __init__(self, **kwargs): - self.finetuning_task = kwargs.pop('finetuning_task', None) - self.num_labels = kwargs.pop('num_labels', 2) - self.output_attentions = kwargs.pop('output_attentions', False) - self.output_hidden_states = kwargs.pop('output_hidden_states', False) - self.torchscript = kwargs.pop('torchscript', False) - - def save_pretrained(self, save_directory): - """ Save a configuration object to a directory, so that it - can be re-loaded using the `from_pretrained(save_directory)` class method. - """ - assert os.path.isdir(save_directory), "Saving path should be a directory where the model and configuration can be saved" - - # If we save using the predefined names, we can load using `from_pretrained` - output_config_file = os.path.join(save_directory, CONFIG_NAME) - - self.to_json_file(output_config_file) - - @classmethod - def from_pretrained(cls, pretrained_model_name_or_path, **kwargs): - r""" Instantiate a PretrainedConfig from a pre-trained model configuration. - - Params: - **pretrained_model_name_or_path**: either: - - a string with the `shortcut name` of a pre-trained model configuration to load from cache - or download and cache if not already stored in cache (e.g. 'bert-base-uncased'). - - a path to a `directory` containing a configuration file saved - using the `save_pretrained(save_directory)` method. - - a path or url to a saved configuration `file`. - **cache_dir**: (`optional`) string: - Path to a directory in which a downloaded pre-trained model - configuration should be cached if the standard cache should not be used. - **return_unused_kwargs**: (`optional`) bool: - - If False, then this function returns just the final configuration object. - - If True, then this functions returns a tuple `(config, unused_kwargs)` where `unused_kwargs` - is a dictionary consisting of the key/value pairs whose keys are not configuration attributes: - ie the part of kwargs which has not been used to update `config` and is otherwise ignored. - **kwargs**: (`optional`) dict: - Dictionary of key/value pairs with which to update the configuration object after loading. - - The values in kwargs of any keys which are configuration attributes will be used - to override the loaded values. - - Behavior concerning key/value pairs whose keys are *not* configuration attributes is controlled - by the `return_unused_kwargs` keyword parameter. - - Examples:: - - >>> config = BertConfig.from_pretrained('bert-base-uncased') # Download configuration from S3 and cache. - >>> config = BertConfig.from_pretrained('./test/saved_model/') # E.g. 
config (or model) was saved using `save_pretrained('./test/saved_model/')` - >>> config = BertConfig.from_pretrained('./test/saved_model/my_configuration.json') - >>> config = BertConfig.from_pretrained('bert-base-uncased', output_attention=True, foo=False) - >>> assert config.output_attention == True - >>> config, unused_kwargs = BertConfig.from_pretrained('bert-base-uncased', output_attention=True, - >>> foo=False, return_unused_kwargs=True) - >>> assert config.output_attention == True - >>> assert unused_kwargs == {'foo': False} - - """ - cache_dir = kwargs.pop('cache_dir', None) - return_unused_kwargs = kwargs.pop('return_unused_kwargs', False) - - if pretrained_model_name_or_path in cls.pretrained_config_archive_map: - config_file = cls.pretrained_config_archive_map[pretrained_model_name_or_path] - elif os.path.isdir(pretrained_model_name_or_path): - config_file = os.path.join(pretrained_model_name_or_path, CONFIG_NAME) - else: - config_file = pretrained_model_name_or_path - # redirect to the cache, if necessary - try: - resolved_config_file = cached_path(config_file, cache_dir=cache_dir) - except EnvironmentError: - if pretrained_model_name_or_path in cls.pretrained_config_archive_map: - logger.error( - "Couldn't reach server at '{}' to download pretrained model configuration file.".format( - config_file)) - else: - logger.error( - "Model name '{}' was not found in model name list ({}). " - "We assumed '{}' was a path or url but couldn't find any file " - "associated to this path or url.".format( - pretrained_model_name_or_path, - ', '.join(cls.pretrained_config_archive_map.keys()), - config_file)) - return None - if resolved_config_file == config_file: - logger.info("loading configuration file {}".format(config_file)) - else: - logger.info("loading configuration file {} from cache at {}".format( - config_file, resolved_config_file)) - - # Load config - config = cls.from_json_file(resolved_config_file) - - # Update config with kwargs if needed - to_remove = [] - for key, value in kwargs.items(): - if hasattr(config, key): - setattr(config, key, value) - to_remove.append(key) - # add img_layer_norm_eps, use_img_layernorm - if "img_layer_norm_eps" in kwargs: - setattr(config, "img_layer_norm_eps", kwargs["img_layer_norm_eps"]) - to_remove.append("img_layer_norm_eps") - if "use_img_layernorm" in kwargs: - setattr(config, "use_img_layernorm", kwargs["use_img_layernorm"]) - to_remove.append("use_img_layernorm") - for key in to_remove: - kwargs.pop(key, None) - - logger.info("Model config %s", config) - if return_unused_kwargs: - return config, kwargs - else: - return config - - @classmethod - def from_dict(cls, json_object): - """Constructs a `Config` from a Python dictionary of parameters.""" - config = cls(vocab_size_or_config_json_file=-1) - for key, value in json_object.items(): - config.__dict__[key] = value - return config - - @classmethod - def from_json_file(cls, json_file): - """Constructs a `BertConfig` from a json file of parameters.""" - with open(json_file, "r", encoding='utf-8') as reader: - text = reader.read() - return cls.from_dict(json.loads(text)) - - def __eq__(self, other): - return self.__dict__ == other.__dict__ - - def __repr__(self): - return str(self.to_json_string()) - - def to_dict(self): - """Serializes this instance to a Python dictionary.""" - output = copy.deepcopy(self.__dict__) - return output - - def to_json_string(self): - """Serializes this instance to a JSON string.""" - return json.dumps(self.to_dict(), indent=2, sort_keys=True) + "\n" - - def 
to_json_file(self, json_file_path): - """ Save this instance to a json file.""" - with open(json_file_path, "w", encoding='utf-8') as writer: - writer.write(self.to_json_string()) - - -class BertConfig(PretrainedConfig): - r""" - :class:`~pytorch_transformers.BertConfig` is the configuration class to store the configuration of a - `BertModel`. - - - Arguments: - vocab_size_or_config_json_file: Vocabulary size of `inputs_ids` in `BertModel`. - hidden_size: Size of the encoder layers and the pooler layer. - num_hidden_layers: Number of hidden layers in the Transformer encoder. - num_attention_heads: Number of attention heads for each attention layer in - the Transformer encoder. - intermediate_size: The size of the "intermediate" (i.e., feed-forward) - layer in the Transformer encoder. - hidden_act: The non-linear activation function (function or string) in the - encoder and pooler. If string, "gelu", "relu" and "swish" are supported. - hidden_dropout_prob: The dropout probabilitiy for all fully connected - layers in the embeddings, encoder, and pooler. - attention_probs_dropout_prob: The dropout ratio for the attention - probabilities. - max_position_embeddings: The maximum sequence length that this model might - ever be used with. Typically set this to something large just in case - (e.g., 512 or 1024 or 2048). - type_vocab_size: The vocabulary size of the `token_type_ids` passed into - `BertModel`. - initializer_range: The sttdev of the truncated_normal_initializer for - initializing all weight matrices. - layer_norm_eps: The epsilon used by LayerNorm. - """ - pretrained_config_archive_map = BERT_PRETRAINED_CONFIG_ARCHIVE_MAP - - def __init__(self, - vocab_size_or_config_json_file=30522, - hidden_size=768, - num_hidden_layers=12, - num_attention_heads=12, - intermediate_size=3072, - hidden_act="gelu", - hidden_dropout_prob=0.1, - attention_probs_dropout_prob=0.1, - max_position_embeddings=512, - type_vocab_size=2, - initializer_range=0.02, - layer_norm_eps=1e-12, - **kwargs): - super(BertConfig, self).__init__(**kwargs) - if isinstance(vocab_size_or_config_json_file, str): - with open(vocab_size_or_config_json_file, "r", encoding='utf-8') as reader: - json_config = json.loads(reader.read()) - for key, value in json_config.items(): - self.__dict__[key] = value - elif isinstance(vocab_size_or_config_json_file, int): - self.vocab_size = vocab_size_or_config_json_file - self.hidden_size = hidden_size - self.num_hidden_layers = num_hidden_layers - self.num_attention_heads = num_attention_heads - self.hidden_act = hidden_act - self.intermediate_size = intermediate_size - self.hidden_dropout_prob = hidden_dropout_prob - self.attention_probs_dropout_prob = attention_probs_dropout_prob - self.max_position_embeddings = max_position_embeddings - self.type_vocab_size = type_vocab_size - self.initializer_range = initializer_range - self.layer_norm_eps = layer_norm_eps - else: - raise ValueError("First argument must be either a vocabulary size (int)" - "or the path to a pretrained model config file (str)") - - -def _gelu_python(x): - - return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0))) \ No newline at end of file diff --git a/grit_src_deprecated/grit/modeling/text/text_decoder.py b/grit_src_deprecated/grit/modeling/text/text_decoder.py deleted file mode 100755 index 071baa7a..00000000 --- a/grit_src_deprecated/grit/modeling/text/text_decoder.py +++ /dev/null @@ -1,672 +0,0 @@ -# Modified by Jialian Wu from -# 
https://github.com/microsoft/GenerativeImage2Text/blob/main/generativeimage2text/layers/decoder.py -# and https://github.com/kdexd/virtex -from torch import nn -import torch -import functools -from torch.nn import functional as F -import warnings - - -class TextualHead(nn.Module): - def __init__(self, - visual_feature_size: int, vocab_size: int, hidden_size: int): - super().__init__() - self.visual_feature_size = visual_feature_size - self.vocab_size = vocab_size - self.hidden_size = hidden_size - - @property - def textual_feature_size(self): - return self.hidden_size - - -class WordAndPositionalEmbedding(nn.Module): - def __init__( - self, - vocab_size: int, - hidden_size: int, - dropout: float = 0.0, - max_caption_length: int = 30, - padding_idx: int = 0, - ): - super().__init__() - self.vocab_size = vocab_size - self.padding_idx = padding_idx - - #self.words = nn.Embedding(vocab_size, hidden_size, padding_idx=padding_idx) - self.words = nn.Embedding(vocab_size, hidden_size) - - # We provide no "padding index" for positional embeddings. We zero out - # the positional embeddings of padded positions as a post-processing. - self.positions = nn.Embedding(max_caption_length, hidden_size) - self.layer_norm = nn.LayerNorm( - hidden_size, eps=1e-8, elementwise_affine=True - ) - self.dropout = nn.Dropout(p=dropout) - - def forward(self, tokens: torch.Tensor): - position_indices = self._create_position_indices(tokens) - - # shape: (batch_size, max_caption_length, hidden_size) - word_embeddings = self.words(tokens) - position_embeddings = self.positions(position_indices) - - # shape: (batch_size, max_caption_length, hidden_size) - embeddings = self.layer_norm(word_embeddings + position_embeddings) - embeddings = self.dropout(embeddings) - - return embeddings - - @functools.lru_cache(maxsize=128) - def _create_position_indices(self, tokens: torch.Tensor): - - # Create position indices of the same size as token indices. - batch_size, max_caption_length = tokens.size() - positions = torch.arange( - max_caption_length, dtype=tokens.dtype, device=tokens.device - ) - # shape: (batch_size, max_caption_length) - positions = positions.unsqueeze(0).expand(batch_size, max_caption_length) - return positions - - -class BertEncoderAsDecoder(nn.Module): - def __init__(self, encoder): - super().__init__() - self.encoder = encoder - - def forward(self, tgt, memory, - tgt_mask=None, - tgt_key_padding_mask=None, - memory_key_padding_mask=None, - tgt_bi_valid_mask=None, - encoder_history_states=None, - ): - assert tgt_key_padding_mask is None, 'not supported' - assert tgt_mask.dim() == 2 - assert tgt_mask.shape[0] == tgt_mask.shape[1] - # tgt_mask should always be 0/negative infinity - tgt = tgt.transpose(0, 1) - memory = memory.transpose(0, 1) - - hidden_states = torch.cat((memory, tgt), dim=1) - num_tgt = tgt.shape[1] - num_memory = memory.shape[1] - device = tgt.device - dtype = tgt.dtype - top_left = torch.zeros((num_memory, num_memory), device=device, dtype=dtype) - top_right = torch.full((num_memory, num_tgt), float('-inf'), device=tgt.device, dtype=dtype,) - bottom_left = torch.zeros((num_tgt, num_memory), dtype=dtype, device=tgt_mask.device,) - left = torch.cat((top_left, bottom_left), dim=0) - right = torch.cat((top_right, tgt_mask.to(dtype)), dim=0) - - full_attention_mask = torch.cat((left, right), dim=1)[None, :] - - if memory_key_padding_mask is None: - memory_key_padding_mask = torch.full((memory.shape[0], memory.shape[1]), fill_value=False, device=device) - # if it is False, it means valid. 
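`BertEncoderAsDecoder.forward` above turns an encoder into a decoder by concatenating memory and text tokens and assembling one block attention mask: memory tokens attend only among themselves (top-left zeros), text tokens attend to all memory plus a causal prefix of themselves, and `-inf` blocks attention everywhere else. The mask construction in isolation:

```
import torch

def build_memory_plus_causal_mask(num_memory, num_tgt, dtype=torch.float32):
    causal = torch.triu(torch.ones(num_tgt, num_tgt, dtype=dtype), diagonal=1)
    causal = causal.masked_fill(causal == 1, float("-inf"))
    top_left = torch.zeros(num_memory, num_memory, dtype=dtype)
    top_right = torch.full((num_memory, num_tgt), float("-inf"), dtype=dtype)
    bottom_left = torch.zeros(num_tgt, num_memory, dtype=dtype)
    left = torch.cat((top_left, bottom_left), dim=0)
    right = torch.cat((top_right, causal), dim=0)
    return torch.cat((left, right), dim=1)   # square, side num_memory + num_tgt

print(build_memory_plus_causal_mask(2, 3))
```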
That is, it is not a padding - assert memory_key_padding_mask.dtype == torch.bool - zero_negative_infinity = torch.zeros_like(memory_key_padding_mask, dtype=tgt.dtype) - zero_negative_infinity[memory_key_padding_mask] = float('-inf') - full_attention_mask = full_attention_mask.expand((memory_key_padding_mask.shape[0], num_memory + num_tgt, num_memory + num_tgt)) - full_attention_mask = full_attention_mask.clone() - origin_left = full_attention_mask[:, :, :num_memory] - update = zero_negative_infinity[:, None, :] - full_attention_mask[:, :, :num_memory] = origin_left + update - - if tgt_bi_valid_mask is not None: - # verify the correctness - bs = full_attention_mask.shape[0] - # during inference, tgt_bi_valid_mask's length is not changed, but - # num_tgt can be increased - max_valid_target = tgt_bi_valid_mask.shape[1] - mask = tgt_bi_valid_mask[:, None, :].expand((bs, num_memory+num_tgt, max_valid_target)) - full_attention_mask[:, :, num_memory:(num_memory+max_valid_target)][mask] = 0 - - # add axis for multi-head - full_attention_mask = full_attention_mask[:, None, :, :] - - if encoder_history_states is None: - result = self.encoder( - hidden_states=hidden_states, - attention_mask=full_attention_mask, - encoder_history_states=encoder_history_states, - ) - result = list(result) - result[0] = result[0][:, num_memory:].transpose(0, 1) - if self.encoder.output_hidden_states: - return result[0], result[1] - else: - # make it back-compatible - return result[0] - else: - encoder_out = self.encoder( - hidden_states=hidden_states[:, -1:], - attention_mask=full_attention_mask[:, :, -1:], - encoder_history_states=encoder_history_states, - ) - result = encoder_out[0].transpose(0, 1) - if self.encoder.output_hidden_states: - return result, encoder_out[1] - else: - return result - - -def create_transformer(decoder_type, norm_type, - textual_feature_size, - attention_heads, - feedforward_size, - dropout, - num_layers, - output_hidden_states=False, - use_mlp_wrapper=None, - use_act_checkpoint=True, - ): - assert norm_type in ['post', 'pre'] - if decoder_type is None: - LayerClass = ( - nn.TransformerDecoderLayer - if norm_type == "post" - else PreNormTransformerDecoderLayer - ) - _layer = LayerClass( - textual_feature_size, - attention_heads, - dim_feedforward=feedforward_size, - dropout=dropout, - activation="gelu", - ) - return nn.TransformerDecoder(_layer, num_layers) - elif decoder_type == 'bert_en': - from .modeling_bert import BertConfig, BertEncoder - config = BertConfig( - vocab_size_or_config_json_file=30522, - hidden_size=textual_feature_size, - num_hidden_layers=num_layers, - num_attention_heads=attention_heads, - intermediate_size=feedforward_size, - hidden_act="gelu", - hidden_dropout_prob=0.1, - attention_probs_dropout_prob=0.1, - layer_norm_eps=1e-12, - ) - config.pre_norm = (norm_type == 'pre') - config.use_mlp_wrapper = use_mlp_wrapper - config.output_hidden_states = output_hidden_states - encoder = BertEncoder(config, use_act_checkpoint=use_act_checkpoint) - return BertEncoderAsDecoder(encoder) - - -class PreNormTransformerDecoderLayer(nn.TransformerDecoderLayer): - def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, - tgt_key_padding_mask=None, memory_key_padding_mask=None): - # fmt: off - # We use the members (modules) from super-class, just the order of - # operations is changed here. First layernorm, then attention. 
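For reference, the `decoder_type=None` branch of `create_transformer` above builds a stock PyTorch decoder; under the BERT-base sizes used elsewhere in this file, that amounts to roughly the following (dimensions are illustrative, not fixed by the factory):

```
import torch
from torch import nn

layer = nn.TransformerDecoderLayer(
    d_model=768, nhead=12, dim_feedforward=3072,
    dropout=0.1, activation="gelu",
)
decoder = nn.TransformerDecoder(layer, num_layers=6)

tgt = torch.randn(10, 2, 768)      # (tokens, batch, dim): sequence-first layout
memory = torch.randn(4, 2, 768)    # encoded object features
print(decoder(tgt, memory).shape)  # torch.Size([10, 2, 768])
```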
- tgt2 = self.norm1(tgt) - tgt2, _ = self.self_attn( - tgt2, tgt2, tgt2, attn_mask=tgt_mask, - key_padding_mask=tgt_key_padding_mask - ) - tgt = tgt + self.dropout1(tgt2) - - # Layernorm first, then decoder attention. - tgt2 = self.norm2(tgt) - tgt2, _ = self.multihead_attn( - tgt2, memory, memory, attn_mask=memory_mask, - key_padding_mask=memory_key_padding_mask - ) - tgt = tgt + self.dropout2(tgt2) - - # Layernorm first, then transformation through feedforward network. - tgt2 = self.norm3(tgt) - tgt2 = self.linear2(self.dropout(self.activation(self.linear1(tgt2)))) - tgt = tgt + self.dropout3(tgt2) - return tgt - - -class TransformerDecoderTextualHead(TextualHead): - def __init__( - self, - object_feature_size: int, - vocab_size: int, - hidden_size: int, - num_layers: int, - attention_heads: int, - feedforward_size: int, - dropout: float = 0.1, - norm_type: str = "post", - mask_future_positions: bool = True, - max_caption_length: int = 1024, - padding_idx: int = 0, - decoder_type=None, - not_tie_weight=None, - output_hidden_states=None, - use_mlp_wrapper=None, - use_act_checkpoint=True, - ): - super().__init__(object_feature_size, vocab_size, hidden_size) - self.num_layers = num_layers - self.attention_heads = attention_heads - self.feedforward_size = feedforward_size - self.dropout = dropout - assert mask_future_positions - self.padding_idx = padding_idx - - self.object_feature_projection = nn.Sequential( - nn.Linear(object_feature_size, self.textual_feature_size), - nn.LayerNorm(self.textual_feature_size)) - - self.embedding = WordAndPositionalEmbedding( - self.vocab_size, - self.textual_feature_size, - dropout=dropout, - max_caption_length=max_caption_length, - padding_idx=padding_idx, - ) - self.transformer = create_transformer( - decoder_type=decoder_type, - norm_type=norm_type, - textual_feature_size=self.textual_feature_size, - attention_heads=self.attention_heads, - feedforward_size=self.feedforward_size, - dropout=dropout, - num_layers=self.num_layers, - output_hidden_states=output_hidden_states, - use_mlp_wrapper=use_mlp_wrapper, - use_act_checkpoint=use_act_checkpoint, - ) - self.apply(self._init_weights) - - # Create an output linear layer and tie the input and output word - # embeddings to reduce parametejs. - self.output = nn.Linear(self.textual_feature_size, vocab_size) - if not not_tie_weight: - self.output.weight = self.embedding.words.weight - - @staticmethod - def _init_weights(module): - """Initialize weights like BERT - N(0.0, 0.02), bias = 0.""" - - if isinstance(module, nn.Linear): - module.weight.data.normal_(mean=0.0, std=0.02) - elif isinstance(module, nn.MultiheadAttention): - module.in_proj_weight.data.normal_(mean=0.0, std=0.02) - module.out_proj.weight.data.normal_(mean=0.0, std=0.02) - elif isinstance(module, nn.Embedding): - module.weight.data.normal_(mean=0.0, std=0.02) - if module.padding_idx is not None: - module.weight.data[module.padding_idx].zero_() - - def forward( - self, - hidden_states, - text_tokens, - ): - projected_object_features = self.object_feature_projection(hidden_states) if hidden_states is not None else None - batch_size, max_text_length = text_tokens.size() - text_embeddings = self.embedding(text_tokens) - - # An additive mask for masking the future (one direction). - uni_mask_zero_neg = self._generate_future_mask( - max_text_length, text_embeddings.dtype, text_embeddings.device - ) - - # We transpose the first two dimensions of tokens embeddings and visual - # features, as required by decoder. 
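The comment "to reduce parametejs" in `TransformerDecoderTextualHead.__init__` above is a typo for "parameters"; the technique is standard input/output embedding tying, where the softmax projection reuses the word-embedding matrix so both directions share one `(vocab, hidden)` weight. In isolation:

```
import torch
from torch import nn

vocab, hidden = 100, 32
embedding = nn.Embedding(vocab, hidden)
output = nn.Linear(hidden, vocab)
output.weight = embedding.weight   # tie: both modules share one Parameter

tokens = torch.randint(0, vocab, (2, 7))
logits = output(embedding(tokens))
print(logits.shape, output.weight is embedding.weight)  # (2, 7, 100) True
```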
- text_embeddings = text_embeddings.transpose(0, 1) - - projected_object_features = projected_object_features.transpose(0, 1) - - # if transformer here is the pytorch/decoder, there is no chance, the - # output is always tensor - trans_out = self.transformer( - text_embeddings, - projected_object_features, - tgt_mask=uni_mask_zero_neg, - ) - if isinstance(trans_out, tuple): - textual_features = trans_out[0] - else: - assert isinstance(trans_out, torch.Tensor) - textual_features = trans_out - # Undo the transpose and bring batch to dim 0. - # shape: (batch_size, max_caption_length, hidden_size) - textual_features = textual_features.transpose(0, 1) - - # shape: (batch_size, max_caption_length, vocab_size) - output_logits = self.output(textual_features) - if isinstance(trans_out, tuple): - return output_logits, trans_out[1] - else: - return output_logits - - def _generate_future_mask( - self, size: int, dtype: torch.dtype, device: torch.device - ): - # Default mask is for forward direction. Flip for backward direction. - mask = torch.triu( - torch.ones(size, size, device=device, dtype=dtype), diagonal=1 - ) - mask = mask.masked_fill(mask == 1, float("-inf")) - return mask - - -class AutoRegressiveBeamSearch(object): - def __init__( - self, - end_token_id: int, - max_steps: int = 50, - beam_size: int = 5, - objectdet=True, - per_node_beam_size: int = 2, - ): - self._eos_index = end_token_id - self.max_steps = max_steps - self.beam_size = beam_size - self.objectdet = objectdet - self.per_node_beam_size = per_node_beam_size or beam_size - - def search(self, begin_tokens, step): - if self.beam_size > 1 and self.objectdet: - only_return_best = False - else: - only_return_best = True - - batch_size = begin_tokens.size()[0] - - predictions = begin_tokens.unsqueeze(1).expand((batch_size, self.beam_size, begin_tokens.shape[-1])) - # Calculate the first timestep. This is done outside the main loop - # because we are going from a single decoder input (the output from the - # encoder) to the top `beam_size` decoder outputs. On the other hand, - # within the main loop we are going from the `beam_size` elements of the - # beam to `beam_size`^2 candidates from which we will select the top - # `beam_size` elements for the next iteration. - # shape: (batch_size, num_classes) - start_class_logits = step(begin_tokens) - - # Convert logits to logprobs. - # shape: (batch_size * beam_size, vocab_size) - start_class_logprobs = F.log_softmax(start_class_logits, dim=1) - - num_classes = start_class_logprobs.size()[1] - - # shape: (batch_size, beam_size), (batch_size, beam_size) - start_top_logprobs, start_predicted_classes = start_class_logprobs.topk( - self.beam_size - ) - - if ( - self.beam_size == 1 - and (start_predicted_classes == self._eos_index).all() - ): - warnings.warn( - "Empty object description predicted. You may want to increase beam" - "size or ensure your step function is working properly.", - RuntimeWarning, - ) - if only_return_best: - return start_predicted_classes, start_top_logprobs - else: - return start_predicted_classes.unsqueeze(-1), start_top_logprobs - - # The log probs for the last time step. - # shape: (batch_size, beam_size) - last_logprobs = start_top_logprobs - - # shape: (batch_size, beam_size, sequence_length) - predictions = torch.cat([predictions, start_predicted_classes.unsqueeze(-1)], dim=-1) - - # Log probability tensor that mandates that the end token is selected. 
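The "mandates that the end token is selected" comment above describes how finished beams are frozen: once a beam emits EOS, its next-step logits are replaced by a distribution that places all mass on EOS, so finished beams keep predicting EOS. A self-contained sketch (vocabulary size and token ids are made up):

```
import torch

EOS = 2

def eos_only_logits(num_beams, num_classes):
    logits = torch.full((num_beams, num_classes), float("-inf"))
    logits[:, EOS] = 0.0
    return logits

last_tokens = torch.tensor([EOS, 7])   # beam 0 finished, beam 1 still going
step_logits = torch.randn(2, 10)
frozen = torch.where(
    (last_tokens == EOS).unsqueeze(-1), eos_only_logits(2, 10), step_logits
)
print(frozen[0].argmax().item(), torch.equal(frozen[1], step_logits[1]))  # 2 True
```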
- # shape: (batch_size * beam_size, num_classes) - logprobs_after_end = start_class_logprobs.new_full( - (batch_size * self.beam_size, num_classes), float("-inf") - ) - logprobs_after_end[:, self._eos_index] = 0.0 - - logits_after_end = start_class_logprobs.new_full( - (batch_size * self.beam_size, num_classes), float("-inf") - ) - logits_after_end[:, self._eos_index] = 0 - - while predictions.shape[-1] < self.max_steps: - # shape: (batch_size * beam_size,) - last_predictions = predictions[:, :, -1].reshape(batch_size * self.beam_size) - - # If every predicted token from the last step is `self._eos_index`, - # then we can stop early. - if (last_predictions == self._eos_index).all(): - break - - predictions_so_far = predictions.view( - batch_size * self.beam_size, -1 - ) - # shape: (batch_size * beam_size, num_classes) - class_logits = step(predictions_so_far) - - # Set logprobs of last predicted tokens as high negative value to avoid - # repetition in description. - class_logits = class_logits.scatter(1, predictions_so_far[:, -1].view((-1, 1)), -10000) - - # shape: (batch_size * beam_size, num_classes) - last_predictions_expanded = last_predictions.unsqueeze(-1).expand( - batch_size * self.beam_size, num_classes - ) - - # Here we are finding any beams where we predicted the end token in - # the previous timestep and replacing the distribution with a - # one-hot distribution, forcing the beam to predict the end token - # this timestep as well. - class_logits = torch.where( - last_predictions_expanded == self._eos_index, - logits_after_end, - class_logits, - ) - - # Convert logits to logprobs. - # shape: (batch_size * beam_size, vocab_size) - class_logprobs = F.log_softmax(class_logits, dim=1) - - # shape (both): (batch_size * beam_size, per_node_beam_size) - top_logprobs, predicted_classes = class_logprobs.topk( - self.per_node_beam_size - ) - - # Here we expand the last log probs to `(batch_size * beam_size, - # per_node_beam_size)` so that we can add them to the current log - # probs for this timestep. This lets us maintain the log - # probability of each element on the beam. - # shape: (batch_size * beam_size, per_node_beam_size) - expanded_last_logprobs = ( - last_logprobs.unsqueeze(2) - .expand(batch_size, self.beam_size, self.per_node_beam_size) - .reshape(batch_size * self.beam_size, self.per_node_beam_size) - ) - # shape: (batch_size * beam_size, per_node_beam_size) - summed_top_logprobs = top_logprobs + expanded_last_logprobs - - # shape: (batch_size, beam_size * per_node_beam_size) - reshaped_summed = summed_top_logprobs.reshape( - batch_size, self.beam_size * self.per_node_beam_size - ) - # shape: (batch_size, beam_size * per_node_beam_size) - reshaped_predicted_classes = predicted_classes.reshape( - batch_size, self.beam_size * self.per_node_beam_size - ) - # Append the predictions to the current beam. - reshaped_beam = ( - predictions.view(batch_size * self.beam_size, 1, -1) - .repeat(1, self.per_node_beam_size, 1) - .reshape(batch_size, self.beam_size * self.per_node_beam_size, -1) - ) - # batch_size, (beam_size * per_node_beach_size), #token - reshaped_beam = torch.cat([reshaped_beam, reshaped_predicted_classes.unsqueeze(-1)], dim=-1) - - # Keep only the top `beam_size` beam indices. 
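-            # This prunes the beam_size * per_node_beam_size candidates back down to beam_size and gathers the matching token histories.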
- # shape: (batch_size, beam_size), (batch_size, beam_size) - restricted_beam_logprobs, restricted_beam_indices = reshaped_summed.topk( - self.beam_size - ) - predictions = reshaped_beam.gather( - 1, restricted_beam_indices.unsqueeze(-1).repeat(1,1,reshaped_beam.shape[-1]) - ) - - # shape: (batch_size, beam_size) - last_logprobs = restricted_beam_logprobs - - if not torch.isfinite(last_logprobs).all(): - warnings.warn( - "Infinite log probs encountered. Some final descriptions may not " - "make sense. This can happen when the beam size is larger than" - " the number of valid (non-zero probability) transitions that " - "the step function produces.", - RuntimeWarning, - ) - - # Optionally select best beam and its logprobs. - if only_return_best: - # shape: (batch_size, sequence_length) - predictions = predictions[:, 0, :] - last_logprobs = last_logprobs[:, 0] - num_valid = (predictions != self._eos_index).sum(dim=-1) - num_valid += (predictions == self._eos_index).sum(dim=-1) > 0 - num_valid = num_valid - begin_tokens.shape[1] - num_valid = num_valid.clip(min=1) - - last_logprobs = last_logprobs / num_valid - - return predictions, last_logprobs - - -class GRiTTextDecoder(nn.Module): - def __init__( - self, - transformer, - begin_token_id=101, - beamsearch_decode=None, - loss_type=None, - tokenizer=None, - ): - super().__init__() - self.textual = transformer - self.padding_idx = self.textual.padding_idx - - self.begin_token_id = begin_token_id - self.beamsearch_decode = beamsearch_decode - self.tokenizer = tokenizer - - if loss_type is None: - self.loss = nn.CrossEntropyLoss(ignore_index=self.padding_idx) - elif loss_type == 'smooth': - self.loss = SmoothLabelCrossEntropyLoss(ignore_index=self.padding_idx) - else: - raise NotImplementedError(loss_type) - - def forward(self, batch): - object_features = batch['object_features'] - - if self.training: - caption_token_input = batch["text_tokens"] - - output_logits = self.textual( - object_features, - caption_token_input, - ) - - if 'need_predict' in batch: - # in place should also be good, but we do not choose that for - # safety as we may use it in prediction results in future - target = batch["text_tokens"].clone() - target[batch['need_predict'] == 0] = self.padding_idx - else: - target = batch["text_tokens"] - - feat = output_logits[:, :-1].contiguous() - target = target[:, 1:].contiguous() - feat = feat.view(-1, self.textual.vocab_size) - target = target.view(-1) - - valid_mask = target != self.padding_idx - target = target[valid_mask] - feat = feat[valid_mask] - loss = self.loss(feat, target) - - return loss - else: - output_dict = self.infer(object_features) - return output_dict - - def infer(self, object_features): - batch_size = object_features.size(0) - begin_tokens = object_features.new_full( - (batch_size, 1), self.begin_token_id - ).long() - - decoding_step = functools.partial( - self.decoding_step, object_features - ) - - object_description_tokens, logprobs = self.beamsearch_decode.search( - begin_tokens, decoding_step - ) - - output_dict = { - 'predictions': object_description_tokens, - 'logprobs': logprobs, - } - - return output_dict - - def decoding_step(self, object_features, partial_text): - batch_size = object_features.shape[0] - beam_size = int(partial_text.size(0) / batch_size) - if beam_size > 1: - batch_size, num_token, channels = object_features.size() - object_features = object_features.unsqueeze(1).repeat(1, beam_size, 1, 1) - object_features = object_features.view( - batch_size * beam_size, num_token, channels - ) - - 
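-        # object_features now has one copy per live hypothesis, matching the flattened (batch_size * beam_size) layout of partial_text.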
text_lengths = torch.ones_like(partial_text) - if len(text_lengths.size()) != 2: - partial_text = partial_text.unsqueeze(1) - - # shape: (batch_size * beam_size, partial_caption_length, vocab_size) - logits = self.textual( - object_features, - partial_text, - ) - - return logits[:, -1, :].float() - - -class SmoothLabelCrossEntropyLoss(nn.Module): - def __init__(self, eps=0.1, log_prefix='', ignore_index=None): - super().__init__() - self.eps = eps - self.log_soft = nn.LogSoftmax(dim=1) - self.kl = nn.KLDivLoss(reduction='none') - - self.iter = 0 - self.max_loss = 0 - self.min_loss = 0 - self.log_prefix = log_prefix - self.ignore_index = ignore_index - - def forward(self, feature, target): - feature = feature.float() - if self.ignore_index is not None: - valid_mask = target != self.ignore_index - target = target[valid_mask] - feature = feature[valid_mask] - assert target.numel() > 0 - self.iter += 1 - eps = self.eps - n_class = feature.size(1) - one_hot = torch.zeros_like(feature).scatter(1, target.view(-1, 1), 1) - one_hot = one_hot * (1 - eps) + (1 - one_hot) * eps / (n_class - 1) - log_prb = self.log_soft(feature) - loss = self.kl(log_prb, one_hot) - return loss.sum(dim=1).mean() - diff --git a/grit_src_deprecated/grit/predictor.py b/grit_src_deprecated/grit/predictor.py deleted file mode 100755 index f1d44a31..00000000 --- a/grit_src_deprecated/grit/predictor.py +++ /dev/null @@ -1,91 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# Modified by Jialian Wu from https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/visualizer.py -import torch - -from detectron2.engine.defaults import DefaultPredictor -from detectron2.utils.visualizer import ColorMode, Visualizer - - -class BatchDefaultPredictor(DefaultPredictor): - def __call__(self, original_images): - """ - Args: - original_image (np.ndarray): an image of shape (H, W, C) (in BGR order). - - Returns: - predictions (dict): - the output of the model for one image only. - See :doc:`/tutorials/models` for details about the format. - """ - with torch.no_grad(): # https://github.com/sphinx-doc/sphinx/issues/4258 - # Apply pre-processing to image. 
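-            # All images in the batch are assumed to share a single (height, width); the batch tensor of shape (N, H, W, C) supplies it.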
- height, width = original_images.shape[1:3] - batch_inputs = [] - for original_image in original_images: - image = self.aug.get_transform(original_image).apply_image(original_image) - image = torch.as_tensor(image.astype("float32").transpose(2, 0, 1)) - - inputs = {"image": image, "height": height, "width": width} - batch_inputs.append(inputs) - predictions = self.model(batch_inputs)[0] - return predictions - -class Visualizer_GRiT(Visualizer): - def __init__(self, image, instance_mode=None): - super().__init__(image, instance_mode=instance_mode) - - def draw_instance_predictions(self, predictions): - boxes = predictions.pred_boxes if predictions.has("pred_boxes") else None - scores = predictions.scores if predictions.has("scores") else None - classes = predictions.pred_classes.tolist() if predictions.has("pred_classes") else None - object_description = predictions.pred_object_descriptions.data - # uncomment to output scores in visualized images - # object_description = [c + '|' + str(round(s.item(), 1)) for c, s in zip(object_description, scores)] - - if self._instance_mode == ColorMode.SEGMENTATION and self.metadata.get("thing_colors"): - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) for c in classes - ] - alpha = 0.8 - else: - colors = None - alpha = 0.5 - - if self._instance_mode == ColorMode.IMAGE_BW: - self.output.reset_image( - self._create_grayscale_image( - (predictions.pred_masks.any(dim=0) > 0).numpy() - if predictions.has("pred_masks") - else None - ) - ) - alpha = 0.3 - - self.overlay_instances( - masks=None, - boxes=boxes, - labels=object_description, - keypoints=None, - assigned_colors=colors, - alpha=alpha, - ) - return self.output - - -class VisualizationDemo(object): - def __init__(self, cfg, instance_mode=ColorMode.IMAGE): - self.cpu_device = torch.device("cpu") - self.instance_mode = instance_mode - - self.predictor = DefaultPredictor(cfg) - - def run_on_image(self, image,device): - - predictions = self.predictor(image, device) - # Convert image from OpenCV BGR format to Matplotlib RGB format. 
- # image = image[:, :, ::-1] - # visualizer = Visualizer_GRiT(image, instance_mode=self.instance_mode) - # instances = predictions["instances"].to(self.cpu_device) - # vis_output = visualizer.draw_instance_predictions(predictions=instances) - - return predictions, None \ No newline at end of file diff --git a/grit_src_deprecated/image_dense_captions.py b/grit_src_deprecated/image_dense_captions.py deleted file mode 100755 index fa130af5..00000000 --- a/grit_src_deprecated/image_dense_captions.py +++ /dev/null @@ -1,87 +0,0 @@ -import argparse -import multiprocessing as mp -import os -import time -import cv2 -import tqdm -import sys - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -sys.path.insert(0, 'models/grit_src/third_party/CenterNet2/projects/CenterNet2/') -from centernet.config import add_centernet_config -from models.grit_src.grit.config import add_grit_config - -from models.grit_src.grit.predictor import VisualizationDemo -import json - - -# constants -WINDOW_NAME = "GRiT" - - -def dense_pred_to_caption(predictions): - boxes = predictions["instances"].pred_boxes if predictions["instances"].has("pred_boxes") else None - object_description = predictions["instances"].pred_object_descriptions.data - new_caption = "" - for i in range(len(object_description)): - new_caption += (object_description[i] + ": " + str([int(a) for a in boxes[i].tensor.cpu().detach().numpy()[0]])) + "; " - return new_caption - -def dense_pred_to_caption_tuple(predictions): - boxes = predictions["instances"].pred_boxes if predictions["instances"].has("pred_boxes") else None - object_description = predictions["instances"].pred_object_descriptions.data - new_caption = [] - for i in range(len(object_description)): - new_caption += (object_description[i], [int(a) for a in boxes[i].tensor.cpu().detach().numpy()[0]]) - # new_caption += (object_description[i] + ": " + str([int(a) for a in boxes[i].tensor.cpu().detach().numpy()[0]])) + "; " - return new_caption - -def setup_cfg(args): - cfg = get_cfg() - if args["cpu"]: - cfg.MODEL.DEVICE="cpu" - add_centernet_config(cfg) - add_grit_config(cfg) - cfg.merge_from_file(args["config_file"]) - cfg.merge_from_list(args["opts"]) - # Set score_threshold for builtin models - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args["confidence_threshold"] - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args["confidence_threshold"] - if args["test_task"]: - cfg.MODEL.TEST_TASK = args["test_task"] - cfg.MODEL.BEAM_SIZE = 1 - cfg.MODEL.ROI_HEADS.SOFT_NMS_ENABLED = False - cfg.USE_ACT_CHECKPOINT = False - cfg.freeze() - return cfg - - -def get_parser(device): - arg_dict = {'config_file': "models/grit_src/configs/GRiT_B_DenseCap_ObjectDet.yaml", 'cpu': False, 'confidence_threshold': 0.5, 'test_task': 'DenseCap', 'opts': ["MODEL.WEIGHTS", "pretrained_models/grit_b_densecap_objectdet.pth"]} - if device == "cpu": - arg_dict["cpu"] = True - return arg_dict - -def image_caption_api(image_src, device): - args2 = get_parser(device) - cfg = setup_cfg(args2) - demo = VisualizationDemo(cfg) - if image_src: - img = read_image(image_src, format="BGR") - predictions, visualized_output = demo.run_on_image(img) - new_caption = dense_pred_to_caption(predictions) - return new_caption - -def init_demo(device): - args2 = get_parser(device) - cfg = setup_cfg(args2) - demo = VisualizationDemo(cfg) - return demo - -if __name__=="__main__": - import os - os.environ['CUDA_VISIBLE_DEVICES']='7' - 
print(image_caption_api("images/dancing_example_4.mp4_20230417_135359.263.jpg",'cuda')) diff --git a/grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml b/grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml deleted file mode 100755 index 097afade..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.circleci/config.yml +++ /dev/null @@ -1,256 +0,0 @@ -version: 2.1 - -# ------------------------------------------------------------------------------------- -# Environments to run the jobs in -# ------------------------------------------------------------------------------------- -cpu: &cpu - machine: - image: ubuntu-2004:202107-02 - resource_class: medium - -gpu: &gpu - machine: - # NOTE: use a cuda vesion that's supported by all our pytorch versions - image: ubuntu-1604-cuda-11.1:202012-01 - resource_class: gpu.nvidia.small - -windows-cpu: &windows_cpu - machine: - resource_class: windows.medium - image: windows-server-2019-vs2019:stable - shell: powershell.exe - -# windows-gpu: &windows_gpu -# machine: -# resource_class: windows.gpu.nvidia.medium -# image: windows-server-2019-nvidia:stable - -version_parameters: &version_parameters - parameters: - pytorch_version: - type: string - torchvision_version: - type: string - pytorch_index: - type: string - # use test wheels index to have access to RC wheels - # https://download.pytorch.org/whl/test/torch_test.html - default: "https://download.pytorch.org/whl/torch_stable.html" - python_version: # NOTE: only affect linux - type: string - default: '3.6.8' - - environment: - PYTORCH_VERSION: << parameters.pytorch_version >> - TORCHVISION_VERSION: << parameters.torchvision_version >> - PYTORCH_INDEX: << parameters.pytorch_index >> - PYTHON_VERSION: << parameters.python_version>> - # point datasets to ~/.torch so it's cached in CI - DETECTRON2_DATASETS: ~/.torch/datasets - -# ------------------------------------------------------------------------------------- -# Re-usable commands -# ------------------------------------------------------------------------------------- -# install_nvidia_driver: &install_nvidia_driver -# - run: -# name: Install nvidia driver -# working_directory: ~/ -# command: | -# wget -q 'https://s3.amazonaws.com/ossci-linux/nvidia_driver/NVIDIA-Linux-x86_64-430.40.run' -# sudo /bin/bash ./NVIDIA-Linux-x86_64-430.40.run -s --no-drm -# nvidia-smi - -add_ssh_keys: &add_ssh_keys - # https://circleci.com/docs/2.0/add-ssh-key/ - - add_ssh_keys: - fingerprints: - - "e4:13:f2:22:d4:49:e8:e4:57:5a:ac:20:2f:3f:1f:ca" - -install_python: &install_python - - run: - name: Install Python - working_directory: ~/ - command: | - # upgrade pyenv - cd /opt/circleci/.pyenv/plugins/python-build/../.. && git pull && cd - - pyenv install -s $PYTHON_VERSION - pyenv global $PYTHON_VERSION - python --version - which python - pip install --upgrade pip - -setup_venv: &setup_venv - - run: - name: Setup Virtual Env - working_directory: ~/ - command: | - python -m venv ~/venv - echo ". ~/venv/bin/activate" >> $BASH_ENV - . 
~/venv/bin/activate - python --version - which python - which pip - pip install --upgrade pip - -setup_venv_win: &setup_venv_win - - run: - name: Setup Virutal Env for Windows - command: | - pip install virtualenv - python -m virtualenv env - .\env\Scripts\activate - python --version - which python - which pip - -install_linux_dep: &install_linux_dep - - run: - name: Install Dependencies - command: | - # disable crash coredump, so unittests fail fast - sudo systemctl stop apport.service - # install from github to get latest; install iopath first since fvcore depends on it - pip install --progress-bar off -U 'git+https://github.com/facebookresearch/iopath' - pip install --progress-bar off -U 'git+https://github.com/facebookresearch/fvcore' - # Don't use pytest-xdist: cuda tests are unstable under multi-process workers. - pip install --progress-bar off ninja opencv-python-headless pytest tensorboard pycocotools - pip install --progress-bar off torch==$PYTORCH_VERSION -f $PYTORCH_INDEX - if [[ "$TORCHVISION_VERSION" == "master" ]]; then - pip install git+https://github.com/pytorch/vision.git - else - pip install --progress-bar off torchvision==$TORCHVISION_VERSION -f $PYTORCH_INDEX - fi - - python -c 'import torch; print("CUDA:", torch.cuda.is_available())' - gcc --version - -install_detectron2: &install_detectron2 - - run: - name: Install Detectron2 - command: | - # Remove first, in case it's in the CI cache - pip uninstall -y detectron2 - pip install --progress-bar off -e .[all] - python -m detectron2.utils.collect_env - ./datasets/prepare_for_tests.sh - -run_unittests: &run_unittests - - run: - name: Run Unit Tests - command: | - pytest -v --durations=15 tests # parallel causes some random failures - -# ------------------------------------------------------------------------------------- -# Jobs to run -# ------------------------------------------------------------------------------------- -jobs: - linux_cpu_tests: - <<: *cpu - <<: *version_parameters - - working_directory: ~/detectron2 - - steps: - - checkout - - # Cache the venv directory that contains python, dependencies, and checkpoints - # Refresh the key when dependencies should be updated (e.g. 
when pytorch releases) - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - <<: *install_python - - <<: *install_linux_dep - - <<: *install_detectron2 - - <<: *run_unittests - - - save_cache: - paths: - - /opt/circleci/.pyenv - - ~/.torch - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - linux_gpu_tests: - <<: *gpu - <<: *version_parameters - - working_directory: ~/detectron2 - - steps: - - checkout - - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - - <<: *install_python - - <<: *install_linux_dep - - <<: *install_detectron2 - - <<: *run_unittests - - - save_cache: - paths: - - /opt/circleci/.pyenv - - ~/.torch - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210827 - - windows_cpu_build: - <<: *windows_cpu - <<: *version_parameters - steps: - - <<: *add_ssh_keys - - checkout - - <<: *setup_venv_win - - # Cache the env directory that contains dependencies - - restore_cache: - keys: - - cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210404 - - - run: - name: Install Dependencies - command: | - pip install certifi --ignore-installed # required on windows to workaround some cert issue - pip install numpy cython # required on windows before pycocotools - pip install opencv-python-headless pytest-xdist pycocotools tensorboard - pip install -U git+https://github.com/facebookresearch/iopath - pip install -U git+https://github.com/facebookresearch/fvcore - pip install torch==$env:PYTORCH_VERSION torchvision==$env:TORCHVISION_VERSION -f $env:PYTORCH_INDEX - - - save_cache: - paths: - - env - key: cache-{{ arch }}-<< parameters.pytorch_version >>-{{ .Branch }}-20210404 - - - <<: *install_detectron2 - # TODO: unittest fails for now - -workflows: - version: 2 - regular_test: - jobs: - - linux_cpu_tests: - name: linux_cpu_tests_pytorch1.10 - pytorch_version: '1.10.0+cpu' - torchvision_version: '0.11.1+cpu' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.8 - pytorch_version: '1.8.1+cu111' - torchvision_version: '0.9.1+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.9 - pytorch_version: '1.9+cu111' - torchvision_version: '0.10+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.10 - pytorch_version: '1.10+cu111' - torchvision_version: '0.11.1+cu111' - - linux_gpu_tests: - name: linux_gpu_tests_pytorch1.10_python39 - pytorch_version: '1.10+cu111' - torchvision_version: '0.11.1+cu111' - python_version: '3.9.6' - - windows_cpu_build: - pytorch_version: '1.10+cpu' - torchvision_version: '0.11.1+cpu' diff --git a/grit_src_deprecated/third_party/CenterNet2/.clang-format b/grit_src_deprecated/third_party/CenterNet2/.clang-format deleted file mode 100755 index 39b1b3d6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.clang-format +++ /dev/null @@ -1,85 +0,0 @@ -AccessModifierOffset: -1 -AlignAfterOpenBracket: AlwaysBreak -AlignConsecutiveAssignments: false -AlignConsecutiveDeclarations: false -AlignEscapedNewlinesLeft: true -AlignOperands: false -AlignTrailingComments: false -AllowAllParametersOfDeclarationOnNextLine: false -AllowShortBlocksOnASingleLine: false -AllowShortCaseLabelsOnASingleLine: false -AllowShortFunctionsOnASingleLine: Empty -AllowShortIfStatementsOnASingleLine: false -AllowShortLoopsOnASingleLine: false -AlwaysBreakAfterReturnType: None -AlwaysBreakBeforeMultilineStrings: true -AlwaysBreakTemplateDeclarations: true -BinPackArguments: false -BinPackParameters: 
false -BraceWrapping: - AfterClass: false - AfterControlStatement: false - AfterEnum: false - AfterFunction: false - AfterNamespace: false - AfterObjCDeclaration: false - AfterStruct: false - AfterUnion: false - BeforeCatch: false - BeforeElse: false - IndentBraces: false -BreakBeforeBinaryOperators: None -BreakBeforeBraces: Attach -BreakBeforeTernaryOperators: true -BreakConstructorInitializersBeforeComma: false -BreakAfterJavaFieldAnnotations: false -BreakStringLiterals: false -ColumnLimit: 80 -CommentPragmas: '^ IWYU pragma:' -ConstructorInitializerAllOnOneLineOrOnePerLine: true -ConstructorInitializerIndentWidth: 4 -ContinuationIndentWidth: 4 -Cpp11BracedListStyle: true -DerivePointerAlignment: false -DisableFormat: false -ForEachMacros: [ FOR_EACH, FOR_EACH_R, FOR_EACH_RANGE, ] -IncludeCategories: - - Regex: '^<.*\.h(pp)?>' - Priority: 1 - - Regex: '^<.*' - Priority: 2 - - Regex: '.*' - Priority: 3 -IndentCaseLabels: true -IndentWidth: 2 -IndentWrappedFunctionNames: false -KeepEmptyLinesAtTheStartOfBlocks: false -MacroBlockBegin: '' -MacroBlockEnd: '' -MaxEmptyLinesToKeep: 1 -NamespaceIndentation: None -ObjCBlockIndentWidth: 2 -ObjCSpaceAfterProperty: false -ObjCSpaceBeforeProtocolList: false -PenaltyBreakBeforeFirstCallParameter: 1 -PenaltyBreakComment: 300 -PenaltyBreakFirstLessLess: 120 -PenaltyBreakString: 1000 -PenaltyExcessCharacter: 1000000 -PenaltyReturnTypeOnItsOwnLine: 200 -PointerAlignment: Left -ReflowComments: true -SortIncludes: true -SpaceAfterCStyleCast: false -SpaceBeforeAssignmentOperators: true -SpaceBeforeParens: ControlStatements -SpaceInEmptyParentheses: false -SpacesBeforeTrailingComments: 1 -SpacesInAngles: false -SpacesInContainerLiterals: true -SpacesInCStyleCastParentheses: false -SpacesInParentheses: false -SpacesInSquareBrackets: false -Standard: Cpp11 -TabWidth: 8 -UseTab: Never diff --git a/grit_src_deprecated/third_party/CenterNet2/.flake8 b/grit_src_deprecated/third_party/CenterNet2/.flake8 deleted file mode 100755 index ae8edda5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.flake8 +++ /dev/null @@ -1,15 +0,0 @@ -# This is an example .flake8 config, used when developing *Black* itself. -# Keep in sync with setup.cfg which is used for source packages. - -[flake8] -ignore = W503, E203, E221, C901, C408, E741, C407, B017 -max-line-length = 100 -max-complexity = 18 -select = B,C,E,F,W,T4,B9 -exclude = build -per-file-ignores = - **/__init__.py:F401,F403,E402 - **/configs/**.py:F401,E402 - configs/**.py:F401,E402 - **/tests/config/**.py:F401,E402 - tests/config/**.py:F401,E402 diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md b/grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md deleted file mode 100755 index 0f7ad8bf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/CODE_OF_CONDUCT.md +++ /dev/null @@ -1,5 +0,0 @@ -# Code of Conduct - -Facebook has adopted a Code of Conduct that we expect project participants to adhere to. -Please read the [full text](https://code.fb.com/codeofconduct/) -so that you can understand what actions will and will not be tolerated. diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md b/grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md deleted file mode 100755 index 9bab709c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/CONTRIBUTING.md +++ /dev/null @@ -1,68 +0,0 @@ -# Contributing to detectron2 - -## Issues -We use GitHub issues to track public bugs and questions. 
-Please make sure to follow one of the -[issue templates](https://github.com/facebookresearch/detectron2/issues/new/choose) -when reporting any issues. - -Facebook has a [bounty program](https://www.facebook.com/whitehat/) for the safe -disclosure of security bugs. In those cases, please go through the process -outlined on that page and do not file a public issue. - -## Pull Requests -We actively welcome pull requests. - -However, if you're adding any significant features (e.g. > 50 lines), please -make sure to discuss with maintainers about your motivation and proposals in an issue -before sending a PR. This is to save your time so you don't spend time on a PR that we'll not accept. - -We do not always accept new features, and we take the following -factors into consideration: - -1. Whether the same feature can be achieved without modifying detectron2. - Detectron2 is designed so that you can implement many extensions from the outside, e.g. - those in [projects](https://github.com/facebookresearch/detectron2/tree/master/projects). - * If some part of detectron2 is not extensible enough, you can also bring up a more general issue to - improve it. Such feature request may be useful to more users. -2. Whether the feature is potentially useful to a large audience (e.g. an impactful detection paper, a popular dataset, - a significant speedup, a widely useful utility), - or only to a small portion of users (e.g., a less-known paper, an improvement not in the object - detection field, a trick that's not very popular in the community, code to handle a non-standard type of data) - * Adoption of additional models, datasets, new task are by default not added to detectron2 before they - receive significant popularity in the community. - We sometimes accept such features in `projects/`, or as a link in `projects/README.md`. -3. Whether the proposed solution has a good design / interface. This can be discussed in the issue prior to PRs, or - in the form of a draft PR. -4. Whether the proposed solution adds extra mental/practical overhead to users who don't - need such feature. -5. Whether the proposed solution breaks existing APIs. - -To add a feature to an existing function/class `Func`, there are always two approaches: -(1) add new arguments to `Func`; (2) write a new `Func_with_new_feature`. -To meet the above criteria, we often prefer approach (2), because: - -1. It does not involve modifying or potentially breaking existing code. -2. It does not add overhead to users who do not need the new feature. -3. Adding new arguments to a function/class is not scalable w.r.t. all the possible new research ideas in the future. - -When sending a PR, please do: - -1. If a PR contains multiple orthogonal changes, split it to several PRs. -2. If you've added code that should be tested, add tests. -3. For PRs that need experiments (e.g. adding a new model or new methods), - you don't need to update model zoo, but do provide experiment results in the description of the PR. -4. If APIs are changed, update the documentation. -5. We use the [Google style docstrings](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html) in python. -6. Make sure your code lints with `./dev/linter.sh`. - - -## Contributor License Agreement ("CLA") -In order to accept your pull request, we need you to submit a CLA. You only need -to do this once to work on any of Facebook's open source projects. 
- -Complete your CLA here: - -## License -By contributing to detectron2, you agree that your contributions will be licensed -under the LICENSE file in the root directory of this source tree. diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg b/grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg deleted file mode 100755 index eb2d643d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/Detectron2-Logo-Horz.svg +++ /dev/null @@ -1 +0,0 @@ -Detectron2-Logo-Horz \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md deleted file mode 100755 index 5e8aaa2d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE.md +++ /dev/null @@ -1,5 +0,0 @@ - -Please select an issue template from -https://github.com/facebookresearch/detectron2/issues/new/choose . - -Otherwise your issue will be closed. diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md deleted file mode 100755 index d0235c70..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/bugs.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -name: "🐛 Bugs" -about: Report bugs in detectron2 -title: Please read & provide the following - ---- - -## Instructions To Reproduce the 🐛 Bug: -1. Full runnable code or full changes you made: -``` -If making changes to the project itself, please use output of the following command: -git rev-parse HEAD; git diff - - -``` -2. What exact command you run: -3. __Full logs__ or other relevant observations: -``` - -``` -4. please simplify the steps as much as possible so they do not require additional resources to - run, such as a private dataset. - -## Expected behavior: - -If there are no obvious error in "full logs" provided above, -please tell us the expected behavior. - -## Environment: - -Provide your environment information using the following command: -``` -wget -nc -q https://github.com/facebookresearch/detectron2/raw/main/detectron2/utils/collect_env.py && python collect_env.py -``` - -If your issue looks like an installation issue / environment issue, -please first try to solve it yourself with the instructions in -https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml deleted file mode 100755 index c60c2e14..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/config.yml +++ /dev/null @@ -1,17 +0,0 @@ -# require an issue template to be chosen -blank_issues_enabled: false - -contact_links: - - name: How-To / All Other Questions - url: https://github.com/facebookresearch/detectron2/discussions - about: Use "github discussions" for community support on general questions that don't belong to the above issue categories - - name: Detectron2 Documentation - url: https://detectron2.readthedocs.io/index.html - about: Check if your question is answered in tutorials or API docs - -# Unexpected behaviors & bugs are split to two templates. -# When they are one template, users think "it's not a bug" and don't choose the template. 
-# -# But the file name is still "unexpected-problems-bugs.md" so that old references -# to this issue template still works. -# It's ok since this template should be a superset of "bugs.md" (unexpected behaviors is a superset of bugs) diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md deleted file mode 100755 index 88214d62..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/documentation.md +++ /dev/null @@ -1,14 +0,0 @@ ---- -name: "\U0001F4DA Documentation Issue" -about: Report a problem about existing documentation, comments, website or tutorials. -labels: documentation - ---- - -## 📚 Documentation Issue - -This issue category is for problems about existing documentation, not for asking how-to questions. - -* Provide a link to an existing documentation/comment/tutorial: - -* How should the above documentation/comment/tutorial improve: diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md deleted file mode 100755 index 03a1e93d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/feature-request.md +++ /dev/null @@ -1,31 +0,0 @@ ---- -name: "\U0001F680Feature Request" -about: Suggest an improvement or new feature -labels: enhancement - ---- - -## 🚀 Feature -A clear and concise description of the feature proposal. - -## Motivation & Examples - -Tell us why the feature is useful. - -Describe what the feature would look like, if it is implemented. -Best demonstrated using **code examples** in addition to words. - -## Note - -We only consider adding new features if they are relevant to many users. - -If you request implementation of research papers -- we only consider papers that have enough significance and prevalance in the object detection field. - -We do not take requests for most projects in the `projects/` directory, because they are research code release that is mainly for other researchers to reproduce results. - -"Make X faster/accurate" is not a valid feature request. "Implement a concrete feature that can make X faster/accurate" can be a valid feature request. - -Instead of adding features inside detectron2, -you can implement many features by [extending detectron2](https://detectron2.readthedocs.io/tutorials/extend.html). -The [projects/](https://github.com/facebookresearch/detectron2/tree/main/projects/) directory contains many of such examples. - diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md b/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md deleted file mode 100755 index 5db8f224..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/ISSUE_TEMPLATE/unexpected-problems-bugs.md +++ /dev/null @@ -1,44 +0,0 @@ ---- -name: "😩 Unexpected behaviors" -about: Report unexpected behaviors when using detectron2 -title: Please read & provide the following - ---- - -If you do not know the root cause of the problem, please post according to this template: - -## Instructions To Reproduce the Issue: - -Check https://stackoverflow.com/help/minimal-reproducible-example for how to ask good questions. -Simplify the steps to reproduce the issue using suggestions from the above link, and provide them below: - -1. 
Full runnable code or full changes you made: -``` -If making changes to the project itself, please use output of the following command: -git rev-parse HEAD; git diff - - -``` -2. What exact command you run: -3. __Full logs__ or other relevant observations: -``` - -``` - -## Expected behavior: - -If there are no obvious crash in "full logs" provided above, -please tell us the expected behavior. - -If you expect a model to converge / work better, we do not help with such issues, unless -a model fails to reproduce the results in detectron2 model zoo, or proves existence of bugs. - -## Environment: - -Paste the output of the following command: -``` -wget -nc -nv https://github.com/facebookresearch/detectron2/raw/main/detectron2/utils/collect_env.py && python collect_env.py -``` - -If your issue looks like an installation issue / environment issue, -please first check common issues in https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md b/grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md deleted file mode 100755 index d71729ba..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/pull_request_template.md +++ /dev/null @@ -1,10 +0,0 @@ -Thanks for your contribution! - -If you're sending a large PR (e.g., >100 lines), -please open an issue first about the feature / bug, and indicate how you want to contribute. - -We do not always accept features. -See https://detectron2.readthedocs.io/notes/contributing.html#pull-requests about how we handle PRs. - -Before submitting a PR, please run `dev/linter.sh` to lint the code. - diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml deleted file mode 100755 index 3caed9df..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/check-template.yml +++ /dev/null @@ -1,86 +0,0 @@ -name: Check issue template - -on: - issues: - types: [opened] - -jobs: - check-template: - runs-on: ubuntu-latest - # comment this out when testing with https://github.com/nektos/act - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - uses: actions/checkout@v2 - - uses: actions/github-script@v3 - with: - github-token: ${{secrets.GITHUB_TOKEN}} - script: | - // Arguments available: - // - github: A pre-authenticated octokit/rest.js client - // - context: An object containing the context of the workflow run - // - core: A reference to the @actions/core package - // - io: A reference to the @actions/io package - const fs = require('fs'); - const editDistance = require(`${process.env.GITHUB_WORKSPACE}/.github/workflows/levenshtein.js`).getEditDistance - issue = await github.issues.get({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - }); - const hasLabel = issue.data.labels.length > 0; - if (hasLabel || issue.state === "closed") { - // don't require template on them - core.debug("Issue " + issue.data.title + " was skipped."); - return; - } - - sameAsTemplate = function(filename, body) { - let tmpl = fs.readFileSync(`.github/ISSUE_TEMPLATE/${filename}`, 'utf8'); - tmpl = tmpl.toLowerCase().split("---").slice(2).join("").trim(); - tmpl = tmpl.replace(/(\r\n|\n|\r)/gm, ""); - let bodyr = body.replace(/(\r\n|\n|\r)/gm, ""); - let dist = editDistance(tmpl, bodyr); - return dist < 8; - }; - - checkFail = async function(msg) { - 
core.info("Processing '" + issue.data.title + "' with message: " + msg); - await github.issues.addLabels({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - labels: ["needs-more-info"], - }); - await github.issues.createComment({ - owner: context.issue.owner, - repo: context.issue.repo, - issue_number: context.issue.number, - body: msg, - }); - }; - - const body = issue.data.body.toLowerCase().trim(); - - if (sameAsTemplate("bugs.md", body) || sameAsTemplate("unexpected-problems-bugs.md", body)) { - await checkFail(` - We found that not enough information is provided about this issue. - Please provide details following the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).`) - return; - } - - const hasInstructions = body.indexOf("reproduce") != -1; - const hasEnvironment = (body.indexOf("environment") != -1) || (body.indexOf("colab") != -1) || (body.indexOf("docker") != -1); - if (hasInstructions && hasEnvironment) { - core.debug("Issue " + issue.data.title + " follows template."); - return; - } - - let message = "You've chosen to report an unexpected problem or bug. Unless you already know the root cause of it, please include details about it by filling the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).\n"; - message += "The following information is missing: "; - if (!hasInstructions) { - message += "\"Instructions To Reproduce the Issue and __Full__ Logs\"; "; - } - if (!hasEnvironment) { - message += "\"Your Environment\"; "; - } - await checkFail(message); diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js deleted file mode 100755 index 67a5e361..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/levenshtein.js +++ /dev/null @@ -1,44 +0,0 @@ -/* -Copyright (c) 2011 Andrei Mackenzie - -Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
-*/ - -// Compute the edit distance between the two given strings -exports.getEditDistance = function(a, b){ - if(a.length == 0) return b.length; - if(b.length == 0) return a.length; - - var matrix = []; - - // increment along the first column of each row - var i; - for(i = 0; i <= b.length; i++){ - matrix[i] = [i]; - } - - // increment each column in the first row - var j; - for(j = 0; j <= a.length; j++){ - matrix[0][j] = j; - } - - // Fill in the rest of the matrix - for(i = 1; i <= b.length; i++){ - for(j = 1; j <= a.length; j++){ - if(b.charAt(i-1) == a.charAt(j-1)){ - matrix[i][j] = matrix[i-1][j-1]; - } else { - matrix[i][j] = Math.min(matrix[i-1][j-1] + 1, // substitution - Math.min(matrix[i][j-1] + 1, // insertion - matrix[i-1][j] + 1)); // deletion - } - } - } - - return matrix[b.length][a.length]; -}; diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml deleted file mode 100755 index 4affabd3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/needs-reply.yml +++ /dev/null @@ -1,98 +0,0 @@ -name: Close/Lock issues after inactivity - -on: - schedule: - - cron: "0 0 * * *" - -jobs: - close-issues-needs-more-info: - runs-on: ubuntu-latest - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - name: Close old issues that need reply - uses: actions/github-script@v3 - with: - github-token: ${{secrets.GITHUB_TOKEN}} - # Modified from https://github.com/dwieeb/needs-reply - script: | - // Arguments available: - // - github: A pre-authenticated octokit/rest.js client - // - context: An object containing the context of the workflow run - // - core: A reference to the @actions/core package - // - io: A reference to the @actions/io package - const kLabelToCheck = "needs-more-info"; - const kInvalidLabel = "invalid/unrelated"; - const kDaysBeforeClose = 7; - const kMessage = "Requested information was not provided in 7 days, so we're closing this issue.\n\nPlease open new issue if information becomes available. Otherwise, use [github discussions](https://github.com/facebookresearch/detectron2/discussions) for free-form discussions." - - issues = await github.issues.listForRepo({ - owner: context.repo.owner, - repo: context.repo.repo, - state: 'open', - labels: kLabelToCheck, - sort: 'updated', - direction: 'asc', - per_page: 30, - page: 1, - }); - issues = issues.data; - if (issues.length === 0) { - core.info('No more issues found to process. 
Exiting.'); - return; - } - for (const issue of issues) { - if (!!issue.pull_request) - continue; - core.info(`Processing issue #${issue.number}`); - - let updatedAt = new Date(issue.updated_at).getTime(); - const numComments = issue.comments; - const comments = await github.issues.listComments({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - per_page: 30, - page: Math.floor((numComments - 1) / 30) + 1, // the last page - }); - const lastComments = comments.data - .map(l => new Date(l.created_at).getTime()) - .sort(); - if (lastComments.length > 0) { - updatedAt = lastComments[lastComments.length - 1]; - } - - const now = new Date().getTime(); - const daysSinceUpdated = (now - updatedAt) / 1000 / 60 / 60 / 24; - - if (daysSinceUpdated < kDaysBeforeClose) { - core.info(`Skipping #${issue.number} because it has been updated in the last ${daysSinceUpdated} days`); - continue; - } - core.info(`Closing #${issue.number} because it has not been updated in the last ${daysSinceUpdated} days`); - await github.issues.createComment({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - body: kMessage, - }); - const newLabels = numComments <= 2 ? [kInvalidLabel, kLabelToCheck] : issue.labels; - await github.issues.update({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: issue.number, - labels: newLabels, - state: 'closed', - }); - } - - lock-issues-after-closed: - runs-on: ubuntu-latest - if: ${{ github.repository_owner == 'facebookresearch' }} - steps: - - name: Lock closed issues that have no activity for a while - uses: dessant/lock-threads@v2 - with: - github-token: ${{ github.token }} - issue-lock-inactive-days: '300' - process-only: 'issues' - issue-exclude-labels: 'enhancement,bug,documentation' diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml deleted file mode 100755 index 1f000b28..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/remove-needs-reply.yml +++ /dev/null @@ -1,25 +0,0 @@ -name: Remove needs-more-info label - -on: - issue_comment: - types: [created] - issues: - types: [edited] - -jobs: - remove-needs-more-info-label: - runs-on: ubuntu-latest - # 1. issue_comment events could include PR comment, filter them out - # 2. Only trigger action if event was produced by the original author - if: ${{ !github.event.issue.pull_request && github.event.sender.login == github.event.issue.user.login }} - steps: - - name: Remove needs-more-info label - uses: octokit/request-action@v2.x - continue-on-error: true - with: - route: DELETE /repos/:repository/issues/:issue/labels/:label - repository: ${{ github.repository }} - issue: ${{ github.event.issue.number }} - label: needs-more-info - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} diff --git a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml b/grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml deleted file mode 100755 index 6085b32a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.github/workflows/workflow.yml +++ /dev/null @@ -1,81 +0,0 @@ -name: CI -on: [push, pull_request] - -# Run linter with github actions for quick feedbacks. -# Run macos tests with github actions. 
Linux (CPU & GPU) tests currently runs on CircleCI -jobs: - linter: - runs-on: ubuntu-latest - # run on PRs, or commits to facebookresearch (not internal) - if: ${{ github.repository_owner == 'facebookresearch' || github.event_name == 'pull_request' }} - steps: - - uses: actions/checkout@v2 - - name: Set up Python 3.6 - uses: actions/setup-python@v2 - with: - python-version: 3.6 - - name: Install dependencies - # flake8-bugbear flake8-comprehensions are useful but not available internally - run: | - python -m pip install --upgrade pip - python -m pip install flake8==3.8.1 isort==4.3.21 - python -m pip install black==21.4b2 - flake8 --version - - name: Lint - run: | - echo "Running isort" - isort -c -sp . - echo "Running black" - black -l 100 --check . - echo "Running flake8" - flake8 . - - macos_tests: - runs-on: macos-latest - # run on PRs, or commits to facebookresearch (not internal) - if: ${{ github.repository_owner == 'facebookresearch' || github.event_name == 'pull_request' }} - strategy: - fail-fast: false - matrix: - torch: ["1.8", "1.9", "1.10"] - include: - - torch: "1.8" - torchvision: 0.9 - - torch: "1.9" - torchvision: "0.10" - - torch: "1.10" - torchvision: "0.11.1" - env: - # point datasets to ~/.torch so it's cached by CI - DETECTRON2_DATASETS: ~/.torch/datasets - steps: - - name: Checkout - uses: actions/checkout@v2 - - name: Set up Python 3.6 - uses: actions/setup-python@v2 - with: - python-version: 3.6 - - name: Cache dependencies - uses: actions/cache@v2 - with: - path: | - ${{ env.pythonLocation }}/lib/python3.6/site-packages - ~/.torch - key: ${{ runner.os }}-torch${{ matrix.torch }}-${{ hashFiles('setup.py') }}-20210420 - - - name: Install dependencies - run: | - python -m pip install -U pip - python -m pip install ninja opencv-python-headless onnx pytest-xdist - python -m pip install torch==${{matrix.torch}} torchvision==${{matrix.torchvision}} -f https://download.pytorch.org/whl/torch_stable.html - # install from github to get latest; install iopath first since fvcore depends on it - python -m pip install -U 'git+https://github.com/facebookresearch/iopath' - python -m pip install -U 'git+https://github.com/facebookresearch/fvcore' - - - name: Build and install - run: | - CC=clang CXX=clang++ python -m pip install -e .[all] - python -m detectron2.utils.collect_env - ./datasets/prepare_for_tests.sh - - name: Run unittests - run: python -m pytest -n 4 --durations=15 -v tests/ diff --git a/grit_src_deprecated/third_party/CenterNet2/.gitignore b/grit_src_deprecated/third_party/CenterNet2/.gitignore deleted file mode 100755 index 8ca283cb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/.gitignore +++ /dev/null @@ -1,57 +0,0 @@ -slurm* -# output dir -output -instant_test_output -inference_test_output - - -*.png -*.json -*.diff -# *.jpg -!/projects/DensePose/doc/images/*.jpg - -# compilation and distribution -__pycache__ -_ext -*.pyc -*.pyd -*.so -*.dll -*.egg-info/ -build/ -dist/ -wheels/ - -# pytorch/python/numpy formats -*.pth -*.pkl -*.npy -*.ts -model_ts*.txt - -# ipython/jupyter notebooks -*.ipynb -**/.ipynb_checkpoints/ - -# Editor temporaries -*.swn -*.swo -*.swp -*~ - -# editor settings -.idea -.vscode -_darcs - -# project dirs -/detectron2/model_zoo/configs -/datasets/* -!/datasets/*.* -!/datasets/lvis/ -/datasets/lvis/* -!/datasets/lvis/lvis_v1_train_cat_info.json -/projects/*/datasets -/models -/snippet diff --git a/grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md b/grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md deleted 
file mode 100755 index 404b0c8f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/GETTING_STARTED.md +++ /dev/null @@ -1,79 +0,0 @@ -## Getting Started with Detectron2 - -This document provides a brief intro of the usage of builtin command-line tools in detectron2. - -For a tutorial that involves actual coding with the API, -see our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -which covers how to run inference with an -existing model, and how to train a builtin model on a custom dataset. - - -### Inference Demo with Pre-trained Models - -1. Pick a model and its config file from - [model zoo](MODEL_ZOO.md), - for example, `mask_rcnn_R_50_FPN_3x.yaml`. -2. We provide `demo.py` that is able to demo builtin configs. Run it with: -``` -cd demo/ -python demo.py --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --input input1.jpg input2.jpg \ - [--other-options] - --opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl -``` -The configs are made for training, therefore we need to specify `MODEL.WEIGHTS` to a model from model zoo for evaluation. -This command will run the inference and show visualizations in an OpenCV window. - -For details of the command line arguments, see `demo.py -h` or look at its source code -to understand its behavior. Some common arguments are: -* To run __on your webcam__, replace `--input files` with `--webcam`. -* To run __on a video__, replace `--input files` with `--video-input video.mp4`. -* To run __on cpu__, add `MODEL.DEVICE cpu` after `--opts`. -* To save outputs to a directory (for images) or a file (for webcam or video), use `--output`. - - -### Training & Evaluation in Command Line - -We provide two scripts in "tools/plain_train_net.py" and "tools/train_net.py", -that are made to train all the configs provided in detectron2. You may want to -use it as a reference to write your own training script. - -Compared to "train_net.py", "plain_train_net.py" supports fewer default -features. It also includes fewer abstraction, therefore is easier to add custom -logic. - -To train a model with "train_net.py", first -setup the corresponding datasets following -[datasets/README.md](./datasets/README.md), -then run: -``` -cd tools/ -./train_net.py --num-gpus 8 \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml -``` - -The configs are made for 8-GPU training. -To train on 1 GPU, you may need to [change some parameters](https://arxiv.org/abs/1706.02677), e.g.: -``` -./train_net.py \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \ - --num-gpus 1 SOLVER.IMS_PER_BATCH 2 SOLVER.BASE_LR 0.0025 -``` - -To evaluate a model's performance, use -``` -./train_net.py \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \ - --eval-only MODEL.WEIGHTS /path/to/checkpoint_file -``` -For more options, see `./train_net.py -h`. - -### Use Detectron2 APIs in Your Code - -See our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -to learn how to use detectron2 APIs to: -1. run inference with an existing model -2. train a builtin model on a custom dataset - -See [detectron2/projects](https://github.com/facebookresearch/detectron2/tree/main/projects) -for more ways to build your project on detectron2. 
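To make the API workflow above concrete, here is a minimal, illustrative sketch (not taken from the tutorial itself) that runs inference with a model-zoo config; the image path and score threshold are placeholder choices:

```
# Minimal inference sketch; assumes detectron2 is installed and "input1.jpg" exists.
import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # discard low-confidence detections

predictor = DefaultPredictor(cfg)
image = cv2.imread("input1.jpg")   # DefaultPredictor expects a BGR image, as OpenCV reads it
outputs = predictor(image)         # a dict with an "instances" field
print(outputs["instances"].pred_classes)
print(outputs["instances"].pred_boxes)
```

The same `cfg` can then be handed to `DefaultTrainer` to train a builtin model on a custom dataset (step 2 above).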
diff --git a/grit_src_deprecated/third_party/CenterNet2/INSTALL.md b/grit_src_deprecated/third_party/CenterNet2/INSTALL.md deleted file mode 100755 index b4076891..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/INSTALL.md +++ /dev/null @@ -1,261 +0,0 @@ -## Installation - -### Requirements -- Linux or macOS with Python ≥ 3.6 -- PyTorch ≥ 1.8 and [torchvision](https://github.com/pytorch/vision/) that matches the PyTorch installation. - Install them together at [pytorch.org](https://pytorch.org) to make sure of this -- OpenCV is optional but needed by demo and visualization - - -### Build Detectron2 from Source - -gcc & g++ ≥ 5.4 are required. [ninja](https://ninja-build.org/) is optional but recommended for faster build. -After having them, run: -``` -python -m pip install 'git+https://github.com/facebookresearch/detectron2.git' -# (add --user if you don't have permission) - -# Or, to install it from a local clone: -git clone https://github.com/facebookresearch/detectron2.git -python -m pip install -e detectron2 - -# On macOS, you may need to prepend the above commands with a few environment variables: -CC=clang CXX=clang++ ARCHFLAGS="-arch x86_64" python -m pip install ... -``` - -To __rebuild__ detectron2 that's built from a local clone, use `rm -rf build/ **/*.so` to clean the -old build first. You often need to rebuild detectron2 after reinstalling PyTorch. - -### Install Pre-Built Detectron2 (Linux only) - -Choose from this table to install [v0.6 (Oct 2021)](https://github.com/facebookresearch/detectron2/releases): - -
Every entry in the table runs the same command with a different wheel index:

```
python -m pip install detectron2 -f \
  https://dl.fbaipublicfiles.com/detectron2/wheels/<cuda>/<torch>/index.html
```

where `<cuda>/<torch>` is the cell matching your CUDA and PyTorch versions:

| CUDA | torch 1.10 | torch 1.9 | torch 1.8 |
|---|---|---|---|
| 11.3 | `cu113/torch1.10` | | |
| 11.1 | `cu111/torch1.10` | `cu111/torch1.9` | `cu111/torch1.8` |
| 10.2 | `cu102/torch1.10` | `cu102/torch1.9` | `cu102/torch1.8` |
| 10.1 | | | `cu101/torch1.8` |
| cpu | `cpu/torch1.10` | `cpu/torch1.9` | `cpu/torch1.8` |
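After installing a wheel, a quick sanity check is to import the package and dump the environment report that the troubleshooting entries below refer to; a sketch (it assumes only that the install succeeded):

```python
import detectron2
from detectron2.utils.collect_env import collect_env_info

# Should print the installed version rather than raise ImportError.
print(detectron2.__version__)

# Same report as `python -m detectron2.utils.collect_env`; the
# PyTorch/CUDA/detectron2 rows here must agree with the wheel you chose.
print(collect_env_info())
```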
Note that:
1. The pre-built packages have to be used with the corresponding version of CUDA and the official package of PyTorch. Otherwise, please build detectron2 from source.
2. New packages are released every few months. Therefore, packages may not contain the latest features in the main branch and may not be compatible with the main branch of a research project that uses detectron2 (e.g. those in [projects](projects)).

### Common Installation Issues

Each common issue below is followed by its solutions:
**Undefined symbols that look like "TH..", "at::Tensor...", "torch..."**

This usually happens when detectron2 or torchvision is not compiled with the version of PyTorch you're running.

If the error comes from a pre-built torchvision, uninstall torchvision and pytorch and reinstall them following [pytorch.org](http://pytorch.org) so the versions match.

If the error comes from a pre-built detectron2, check the [release notes](https://github.com/facebookresearch/detectron2/releases), then uninstall and reinstall the correct pre-built detectron2 that matches your pytorch version.

If the error comes from detectron2 or torchvision that you built manually from source, remove the files you built (`build/`, `**/*.so`) and rebuild, so it can pick up the version of pytorch currently in your environment.

If the above instructions do not resolve this problem, please provide an environment (e.g. a dockerfile) that can reproduce the issue.
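When chasing this class of mismatch, it helps to print all the relevant versions in one place; a small sketch using only standard attributes of the three packages:

```python
import torch
import torchvision
import detectron2

# A pre-built detectron2/torchvision must have been compiled against
# exactly this torch version; compare these before reinstalling anything.
print("torch:", torch.__version__, "| built for CUDA:", torch.version.cuda)
print("torchvision:", torchvision.__version__)
print("detectron2:", detectron2.__version__)
print("CUDA available:", torch.cuda.is_available())
```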
**Missing torch dynamic libraries, OR segmentation fault immediately when using detectron2.**

This usually happens when detectron2 or torchvision is not compiled with the version of PyTorch you're running. See the previous common issue for the solution.
**Undefined C++ symbols (e.g. "GLIBCXX..") or C++ symbols not found.**

Usually this is because the library was compiled with a newer C++ compiler but is run with an old C++ runtime. This often happens with an old anaconda installation. It may help to run `conda update libgcc` to upgrade its runtime.

The fundamental solution is to avoid the mismatch, either by compiling with an older version of the C++ compiler, or by running the code with the proper C++ runtime. To run the code with a specific C++ runtime, use the environment variable `LD_PRELOAD=/path/to/libstdc++.so`.
**"nvcc not found" or "Not compiled with GPU support" or "Detectron2 CUDA Compiler: not available".**

CUDA was not found when building detectron2. Make sure that

```
python -c 'import torch; from torch.utils.cpp_extension import CUDA_HOME; print(torch.cuda.is_available(), CUDA_HOME)'
```

prints `(True, a directory with cuda)` at the time you build detectron2.

Most models can run inference (but not training) without GPU support. To use CPUs, set `MODEL.DEVICE='cpu'` in the config.
**"invalid device function" or "no kernel image is available for execution".**

Two possibilities:

* You built detectron2 with one version of CUDA but run it with a different version. To check whether this is the case, use `python -m detectron2.utils.collect_env` to find inconsistent CUDA versions. In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", and "PyTorch built with CUDA" to contain cuda libraries of the same version. When they are inconsistent, you need to either install a different build of PyTorch (or build it yourself) to match your local CUDA installation, or install a different version of CUDA to match PyTorch.

* PyTorch/torchvision/detectron2 is not built for the correct GPU SM architecture (aka compute capability). The architectures included by PyTorch/detectron2/torchvision are listed under "architecture flags" in `python -m detectron2.utils.collect_env`. They must include the architecture of your GPU, which can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus). If you're using pre-built PyTorch/detectron2/torchvision, they already include support for most popular GPUs. If your GPU is not supported, you need to build them from source. When building detectron2/torchvision from source, they detect the GPU device and build only for that device, so the compiled code may not work on a different GPU. To recompile them for the correct architecture, remove all installed/compiled files and rebuild with the `TORCH_CUDA_ARCH_LIST` environment variable set properly. For example, `export TORCH_CUDA_ARCH_LIST="6.0;7.0"` makes it compile for both P100s and V100s.
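To check the second possibility quickly, recent torch builds can report both the compute capability of the local GPU and the architectures compiled into the binary; a sketch, assuming a CUDA-enabled torch:

```python
import torch

if torch.cuda.is_available():
    # Capability of the physical GPU, e.g. (7, 0) for a V100.
    print("device capability:", torch.cuda.get_device_capability())
    # Architectures compiled into this torch binary, e.g. ['sm_60', 'sm_70'].
    print("compiled arch list:", torch.cuda.get_arch_list())
```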
**Undefined CUDA symbols; Cannot open libcudart.so.**

The version of NVCC you used to build detectron2 or torchvision does not match the version of CUDA you are running with. This often happens when using anaconda's CUDA runtime.

Use `python -m detectron2.utils.collect_env` to find inconsistent CUDA versions. In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", and "PyTorch built with CUDA" to contain cuda libraries of the same version.

When they are inconsistent, you need to either install a different build of PyTorch (or build it yourself) to match your local CUDA installation, or install a different version of CUDA to match PyTorch.
**C++ compilation errors from NVCC / NVRTC, or "Unsupported gpu architecture".**

A few possibilities:

1. The local CUDA/NVCC version has to match the CUDA version of your PyTorch. Both can be found in `python collect_env.py`. When they are inconsistent, you need to either install a different build of PyTorch (or build it yourself) to match your local CUDA installation, or install a different version of CUDA to match PyTorch.

2. The local CUDA/NVCC version must support the SM architecture (a.k.a. compute capability) of your GPU. The capability of your GPU can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus); the capabilities supported by NVCC are listed [here](https://gist.github.com/ax3l/9489132). If your NVCC version is too old, this can be worked around by setting the environment variable `TORCH_CUDA_ARCH_LIST` to a lower, supported capability.

3. The combination of NVCC and GCC you use is incompatible. You need to change one of their versions. See [here](https://gist.github.com/ax3l/9489132) for some valid combinations. Notably, CUDA<=10.1.105 doesn't support GCC>7.3. The CUDA/GCC versions used by PyTorch can be found with `print(torch.__config__.show())`.
**"ImportError: cannot import name '_C'".**

Please build and install detectron2 following the instructions above.

Or, if you are running code from detectron2's root directory, `cd` to a different one. Otherwise you may not import the code that you installed.
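A common trigger is shadowing the installed package with the source tree; printing where Python actually imports detectron2 from makes this obvious (a sketch using only standard module attributes):

```python
import detectron2

# If this points inside your source checkout rather than site-packages,
# you are importing the uncompiled sources; `cd` elsewhere or install them.
print(detectron2.__file__)
```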
**Any issue on windows.**

Detectron2 is continuously built on windows with [CircleCI](https://app.circleci.com/pipelines/github/facebookresearch/detectron2?branch=main). However, we do not provide official support for it. PRs that improve code compatibility on windows are welcome.
**ONNX conversion segfault after some "TraceWarning".**

The ONNX package was compiled with too old a compiler.

Please build and install ONNX from its source code using a compiler whose version is closer to what's used by PyTorch (available in `torch.__config__.show()`).
**"library not found for -lstdc++" on older versions of MacOS.**

See [this stackoverflow answer](https://stackoverflow.com/questions/56083725/macos-build-issues-lstdc-not-found-while-building-python-package).
      - - -### Installation inside specific environments: - -* __Colab__: see our [Colab Tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) - which has step-by-step instructions. - -* __Docker__: The official [Dockerfile](docker) installs detectron2 with a few simple commands. - diff --git a/grit_src_deprecated/third_party/CenterNet2/LICENSE b/grit_src_deprecated/third_party/CenterNet2/LICENSE deleted file mode 100755 index cd1b0706..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/LICENSE +++ /dev/null @@ -1,202 +0,0 @@ -Apache License -Version 2.0, January 2004 -http://www.apache.org/licenses/ - -TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION - -1. Definitions. - -"License" shall mean the terms and conditions for use, reproduction, -and distribution as defined by Sections 1 through 9 of this document. - -"Licensor" shall mean the copyright owner or entity authorized by -the copyright owner that is granting the License. - -"Legal Entity" shall mean the union of the acting entity and all -other entities that control, are controlled by, or are under common -control with that entity. For the purposes of this definition, -"control" means (i) the power, direct or indirect, to cause the -direction or management of such entity, whether by contract or -otherwise, or (ii) ownership of fifty percent (50%) or more of the -outstanding shares, or (iii) beneficial ownership of such entity. - -"You" (or "Your") shall mean an individual or Legal Entity -exercising permissions granted by this License. - -"Source" form shall mean the preferred form for making modifications, -including but not limited to software source code, documentation -source, and configuration files. - -"Object" form shall mean any form resulting from mechanical -transformation or translation of a Source form, including but -not limited to compiled object code, generated documentation, -and conversions to other media types. - -"Work" shall mean the work of authorship, whether in Source or -Object form, made available under the License, as indicated by a -copyright notice that is included in or attached to the work -(an example is provided in the Appendix below). - -"Derivative Works" shall mean any work, whether in Source or Object -form, that is based on (or derived from) the Work and for which the -editorial revisions, annotations, elaborations, or other modifications -represent, as a whole, an original work of authorship. For the purposes -of this License, Derivative Works shall not include works that remain -separable from, or merely link (or bind by name) to the interfaces of, -the Work and Derivative Works thereof. - -"Contribution" shall mean any work of authorship, including -the original version of the Work and any modifications or additions -to that Work or Derivative Works thereof, that is intentionally -submitted to Licensor for inclusion in the Work by the copyright owner -or by an individual or Legal Entity authorized to submit on behalf of -the copyright owner. 
For the purposes of this definition, "submitted" -means any form of electronic, verbal, or written communication sent -to the Licensor or its representatives, including but not limited to -communication on electronic mailing lists, source code control systems, -and issue tracking systems that are managed by, or on behalf of, the -Licensor for the purpose of discussing and improving the Work, but -excluding communication that is conspicuously marked or otherwise -designated in writing by the copyright owner as "Not a Contribution." - -"Contributor" shall mean Licensor and any individual or Legal Entity -on behalf of whom a Contribution has been received by Licensor and -subsequently incorporated within the Work. - -2. Grant of Copyright License. Subject to the terms and conditions of -this License, each Contributor hereby grants to You a perpetual, -worldwide, non-exclusive, no-charge, royalty-free, irrevocable -copyright license to reproduce, prepare Derivative Works of, -publicly display, publicly perform, sublicense, and distribute the -Work and such Derivative Works in Source or Object form. - -3. Grant of Patent License. Subject to the terms and conditions of -this License, each Contributor hereby grants to You a perpetual, -worldwide, non-exclusive, no-charge, royalty-free, irrevocable -(except as stated in this section) patent license to make, have made, -use, offer to sell, sell, import, and otherwise transfer the Work, -where such license applies only to those patent claims licensable -by such Contributor that are necessarily infringed by their -Contribution(s) alone or by combination of their Contribution(s) -with the Work to which such Contribution(s) was submitted. If You -institute patent litigation against any entity (including a -cross-claim or counterclaim in a lawsuit) alleging that the Work -or a Contribution incorporated within the Work constitutes direct -or contributory patent infringement, then any patent licenses -granted to You under this License for that Work shall terminate -as of the date such litigation is filed. - -4. Redistribution. You may reproduce and distribute copies of the -Work or Derivative Works thereof in any medium, with or without -modifications, and in Source or Object form, provided that You -meet the following conditions: - -(a) You must give any other recipients of the Work or -Derivative Works a copy of this License; and - -(b) You must cause any modified files to carry prominent notices -stating that You changed the files; and - -(c) You must retain, in the Source form of any Derivative Works -that You distribute, all copyright, patent, trademark, and -attribution notices from the Source form of the Work, -excluding those notices that do not pertain to any part of -the Derivative Works; and - -(d) If the Work includes a "NOTICE" text file as part of its -distribution, then any Derivative Works that You distribute must -include a readable copy of the attribution notices contained -within such NOTICE file, excluding those notices that do not -pertain to any part of the Derivative Works, in at least one -of the following places: within a NOTICE text file distributed -as part of the Derivative Works; within the Source form or -documentation, if provided along with the Derivative Works; or, -within a display generated by the Derivative Works, if and -wherever such third-party notices normally appear. The contents -of the NOTICE file are for informational purposes only and -do not modify the License. 
You may add Your own attribution -notices within Derivative Works that You distribute, alongside -or as an addendum to the NOTICE text from the Work, provided -that such additional attribution notices cannot be construed -as modifying the License. - -You may add Your own copyright statement to Your modifications and -may provide additional or different license terms and conditions -for use, reproduction, or distribution of Your modifications, or -for any such Derivative Works as a whole, provided Your use, -reproduction, and distribution of the Work otherwise complies with -the conditions stated in this License. - -5. Submission of Contributions. Unless You explicitly state otherwise, -any Contribution intentionally submitted for inclusion in the Work -by You to the Licensor shall be under the terms and conditions of -this License, without any additional terms or conditions. -Notwithstanding the above, nothing herein shall supersede or modify -the terms of any separate license agreement you may have executed -with Licensor regarding such Contributions. - -6. Trademarks. This License does not grant permission to use the trade -names, trademarks, service marks, or product names of the Licensor, -except as required for reasonable and customary use in describing the -origin of the Work and reproducing the content of the NOTICE file. - -7. Disclaimer of Warranty. Unless required by applicable law or -agreed to in writing, Licensor provides the Work (and each -Contributor provides its Contributions) on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or -implied, including, without limitation, any warranties or conditions -of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A -PARTICULAR PURPOSE. You are solely responsible for determining the -appropriateness of using or redistributing the Work and assume any -risks associated with Your exercise of permissions under this License. - -8. Limitation of Liability. In no event and under no legal theory, -whether in tort (including negligence), contract, or otherwise, -unless required by applicable law (such as deliberate and grossly -negligent acts) or agreed to in writing, shall any Contributor be -liable to You for damages, including any direct, indirect, special, -incidental, or consequential damages of any character arising as a -result of this License or out of the use or inability to use the -Work (including but not limited to damages for loss of goodwill, -work stoppage, computer failure or malfunction, or any and all -other commercial damages or losses), even if such Contributor -has been advised of the possibility of such damages. - -9. Accepting Warranty or Additional Liability. While redistributing -the Work or Derivative Works thereof, You may choose to offer, -and charge a fee for, acceptance of support, warranty, indemnity, -or other liability obligations and/or rights consistent with this -License. However, in accepting such obligations, You may act only -on Your own behalf and on Your sole responsibility, not on behalf -of any other Contributor, and only if You agree to indemnify, -defend, and hold each Contributor harmless for any liability -incurred by, or claims asserted against, such Contributor by reason -of your accepting any such warranty or additional liability. - -END OF TERMS AND CONDITIONS - -APPENDIX: How to apply the Apache License to your work. 
- -To apply the Apache License to your work, attach the following -boilerplate notice, with the fields enclosed by brackets "[]" -replaced with your own identifying information. (Don't include -the brackets!) The text should be enclosed in the appropriate -comment syntax for the file format. We also recommend that a -file or class name and description of purpose be included on the -same "printed page" as the copyright notice for easier -identification within third-party archives. - -Copyright [yyyy] [name of copyright owner] - - -Licensed under the Apache License, Version 2.0 (the "License"); -you may not use this file except in compliance with the License. -You may obtain a copy of the License at - -http://www.apache.org/licenses/LICENSE-2.0 - -Unless required by applicable law or agreed to in writing, software -distributed under the License is distributed on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -See the License for the specific language governing permissions and -limitations under the License. diff --git a/grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md b/grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md deleted file mode 100755 index 69db2728..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/MODEL_ZOO.md +++ /dev/null @@ -1,1052 +0,0 @@ -# Detectron2 Model Zoo and Baselines - -## Introduction - -This file documents a large collection of baselines trained -with detectron2 in Sep-Oct, 2019. -All numbers were obtained on [Big Basin](https://engineering.fb.com/data-center-engineering/introducing-big-basin-our-next-generation-ai-hardware/) -servers with 8 NVIDIA V100 GPUs & NVLink. The speed numbers are periodically updated with latest PyTorch/CUDA/cuDNN versions. -You can access these models from code using [detectron2.model_zoo](https://detectron2.readthedocs.io/modules/model_zoo.html) APIs. - -In addition to these official baseline models, you can find more models in [projects/](projects/). - -#### How to Read the Tables -* The "Name" column contains a link to the config file. Models can be reproduced using `tools/train_net.py` with the corresponding yaml config file, - or `tools/lazyconfig_train_net.py` for python config files. -* Training speed is averaged across the entire training. - We keep updating the speed with latest version of detectron2/pytorch/etc., - so they might be different from the `metrics` file. - Training speed for multi-machine jobs is not provided. -* Inference speed is measured by `tools/train_net.py --eval-only`, or [inference_on_dataset()](https://detectron2.readthedocs.io/modules/evaluation.html#detectron2.evaluation.inference_on_dataset), - with batch size 1 in detectron2 directly. - Measuring it with custom code may introduce other overhead. - Actual deployment in production should in general be faster than the given inference - speed due to more optimizations. -* The *model id* column is provided for ease of reference. - To check downloaded file integrity, any model on this page contains its md5 prefix in its file name. -* Training curves and other statistics can be found in `metrics` for each model. - -#### Common Settings for COCO Models -* All COCO models were trained on `train2017` and evaluated on `val2017`. -* The default settings are __not directly comparable__ with Detectron's standard settings. - For example, our default training data augmentation uses scale jittering in addition to horizontal flipping. 
- - To make fair comparisons with Detectron's settings, see - [Detectron1-Comparisons](configs/Detectron1-Comparisons/) for accuracy comparison, - and [benchmarks](https://detectron2.readthedocs.io/notes/benchmarks.html) - for speed comparison. -* For Faster/Mask R-CNN, we provide baselines based on __3 different backbone combinations__: - * __FPN__: Use a ResNet+FPN backbone with standard conv and FC heads for mask and box prediction, - respectively. It obtains the best - speed/accuracy tradeoff, but the other two are still useful for research. - * __C4__: Use a ResNet conv4 backbone with conv5 head. The original baseline in the Faster R-CNN paper. - * __DC5__ (Dilated-C5): Use a ResNet conv5 backbone with dilations in conv5, and standard conv and FC heads - for mask and box prediction, respectively. - This is used by the Deformable ConvNet paper. -* Most models are trained with the 3x schedule (~37 COCO epochs). - Although 1x models are heavily under-trained, we provide some ResNet-50 models with the 1x (~12 COCO epochs) - training schedule for comparison when doing quick research iteration. - -#### ImageNet Pretrained Models - -It's common to initialize from backbone models pre-trained on ImageNet classification tasks. The following backbone models are available: - -* [R-50.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/MSRA/R-50.pkl): converted copy of [MSRA's original ResNet-50](https://github.com/KaimingHe/deep-residual-networks) model. -* [R-101.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/MSRA/R-101.pkl): converted copy of [MSRA's original ResNet-101](https://github.com/KaimingHe/deep-residual-networks) model. -* [X-101-32x8d.pkl](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/FAIR/X-101-32x8d.pkl): ResNeXt-101-32x8d model trained with Caffe2 at FB. -* [R-50.pkl (torchvision)](https://dl.fbaipublicfiles.com/detectron2/ImageNetPretrained/torchvision/R-50.pkl): converted copy of [torchvision's ResNet-50](https://pytorch.org/docs/stable/torchvision/models.html#torchvision.models.resnet50) model. - More details can be found in [the conversion script](tools/convert-torchvision-to-d2.py). - -Note that the above models have __different__ format from those provided in Detectron: we do not fuse BatchNorm into an affine layer. -Pretrained models in Detectron's format can still be used. For example: -* [X-152-32x8d-IN5k.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/25093814/X-152-32x8d-IN5k.pkl): - ResNeXt-152-32x8d model trained on ImageNet-5k with Caffe2 at FB (see ResNeXt paper for details on ImageNet-5k). -* [R-50-GN.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/47261647/R-50-GN.pkl): - ResNet-50 with Group Normalization. -* [R-101-GN.pkl](https://dl.fbaipublicfiles.com/detectron/ImageNetPretrained/47592356/R-101-GN.pkl): - ResNet-101 with Group Normalization. - -These models require slightly different settings regarding normalization and architecture. See the model zoo configs for reference. - -#### License - -All models available for download through this document are licensed under the -[Creative Commons Attribution-ShareAlike 3.0 license](https://creativecommons.org/licenses/by-sa/3.0/). - -### COCO Object Detection Baselines - -#### Faster R-CNN: - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | model id | download |
|---|---|---|---|---|---|---|---|
| R50-C4 | 1x | 0.551 | 0.102 | 4.8 | 35.7 | 137257644 | model \| metrics |
| R50-DC5 | 1x | 0.380 | 0.068 | 5.0 | 37.3 | 137847829 | model \| metrics |
| R50-FPN | 1x | 0.210 | 0.038 | 3.0 | 37.9 | 137257794 | model \| metrics |
| R50-C4 | 3x | 0.543 | 0.104 | 4.8 | 38.4 | 137849393 | model \| metrics |
| R50-DC5 | 3x | 0.378 | 0.070 | 5.0 | 39.0 | 137849425 | model \| metrics |
| R50-FPN | 3x | 0.209 | 0.038 | 3.0 | 40.2 | 137849458 | model \| metrics |
| R101-C4 | 3x | 0.619 | 0.139 | 5.9 | 41.1 | 138204752 | model \| metrics |
| R101-DC5 | 3x | 0.452 | 0.086 | 6.1 | 40.6 | 138204841 | model \| metrics |
| R101-FPN | 3x | 0.286 | 0.051 | 4.1 | 42.0 | 137851257 | model \| metrics |
| X101-FPN | 3x | 0.638 | 0.098 | 6.7 | 43.0 | 139173657 | model \| metrics |
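The "Name" column pairs with the `detectron2.model_zoo` API mentioned above; as a sketch, any config path in these tables can be instantiated directly (`trained=True` downloads the checkpoint that the row's model id refers to):

```python
from detectron2 import model_zoo

# Builds the R50-FPN 3x Faster R-CNN from the table above and loads its
# released weights; the returned object is a regular torch.nn.Module.
model = model_zoo.get("COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml", trained=True)
model.eval()
```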
#### RetinaNet:

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | model id | download |
|---|---|---|---|---|---|---|---|
| R50 | 1x | 0.205 | 0.041 | 4.1 | 37.4 | 190397773 | model \| metrics |
| R50 | 3x | 0.205 | 0.041 | 4.1 | 38.7 | 190397829 | model \| metrics |
| R101 | 3x | 0.291 | 0.054 | 5.2 | 40.4 | 190397697 | model \| metrics |
#### RPN & Fast R-CNN:

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | prop. AR | model id | download |
|---|---|---|---|---|---|---|---|---|
| RPN R50-C4 | 1x | 0.130 | 0.034 | 1.5 | | 51.6 | 137258005 | model \| metrics |
| RPN R50-FPN | 1x | 0.186 | 0.032 | 2.7 | | 58.0 | 137258492 | model \| metrics |
| Fast R-CNN R50-FPN | 1x | 0.140 | 0.029 | 2.6 | 37.8 | | 137635226 | model \| metrics |
### COCO Instance Segmentation Baselines with Mask R-CNN

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| R50-C4 | 1x | 0.584 | 0.110 | 5.2 | 36.8 | 32.2 | 137259246 | model \| metrics |
| R50-DC5 | 1x | 0.471 | 0.076 | 6.5 | 38.3 | 34.2 | 137260150 | model \| metrics |
| R50-FPN | 1x | 0.261 | 0.043 | 3.4 | 38.6 | 35.2 | 137260431 | model \| metrics |
| R50-C4 | 3x | 0.575 | 0.111 | 5.2 | 39.8 | 34.4 | 137849525 | model \| metrics |
| R50-DC5 | 3x | 0.470 | 0.076 | 6.5 | 40.0 | 35.9 | 137849551 | model \| metrics |
| R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
| R101-C4 | 3x | 0.652 | 0.145 | 6.3 | 42.6 | 36.7 | 138363239 | model \| metrics |
| R101-DC5 | 3x | 0.545 | 0.092 | 7.6 | 41.9 | 37.3 | 138363294 | model \| metrics |
| R101-FPN | 3x | 0.340 | 0.056 | 4.6 | 42.9 | 38.6 | 138205316 | model \| metrics |
| X101-FPN | 3x | 0.690 | 0.103 | 7.2 | 44.3 | 39.5 | 139653917 | model \| metrics |
#### New baselines using Large-Scale Jitter and Longer Training Schedule

The following baselines of COCO Instance Segmentation with Mask R-CNN are generated using a longer training schedule and large-scale jitter as described in Google's [Simple Copy-Paste Data Augmentation](https://arxiv.org/pdf/2012.07177.pdf) paper. These models are trained from scratch using random initialization. These baselines exceed the previous Mask R-CNN baselines.

In the following table, one epoch consists of training on 118000 COCO images.

| Name | epochs | train time (s/im) | inference time (s/im) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|
| R50-FPN | 100 | 0.376 | 0.069 | 44.6 | 40.3 | 42047764 | model \| metrics |
| R50-FPN | 200 | 0.376 | 0.069 | 46.3 | 41.7 | 42047638 | model \| metrics |
| R50-FPN | 400 | 0.376 | 0.069 | 47.4 | 42.5 | 42019571 | model \| metrics |
| R101-FPN | 100 | 0.518 | 0.073 | 46.4 | 41.6 | 42025812 | model \| metrics |
| R101-FPN | 200 | 0.518 | 0.073 | 48.0 | 43.1 | 42131867 | model \| metrics |
| R101-FPN | 400 | 0.518 | 0.073 | 48.9 | 43.7 | 42073830 | model \| metrics |
| regnetx_4gf_dds_FPN | 100 | 0.474 | 0.071 | 46.0 | 41.3 | 42047771 | model \| metrics |
| regnetx_4gf_dds_FPN | 200 | 0.474 | 0.071 | 48.1 | 43.1 | 42132721 | model \| metrics |
| regnetx_4gf_dds_FPN | 400 | 0.474 | 0.071 | 48.6 | 43.5 | 42025447 | model \| metrics |
| regnety_4gf_dds_FPN | 100 | 0.487 | 0.073 | 46.1 | 41.6 | 42047784 | model \| metrics |
| regnety_4gf_dds_FPN | 200 | 0.487 | 0.072 | 47.8 | 43.0 | 42047642 | model \| metrics |
| regnety_4gf_dds_FPN | 400 | 0.487 | 0.072 | 48.2 | 43.3 | 42045954 | model \| metrics |
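Since schedules elsewhere in this file are given in iterations, converting these epoch counts is simple arithmetic; a sketch, where the batch size of 64 is an illustrative assumption rather than a value taken from this file:

```python
COCO_TRAIN_IMAGES = 118_000  # one epoch, as defined above

def epochs_to_iters(epochs: int, ims_per_batch: int = 64) -> int:
    """Optimizer steps needed to make `epochs` passes over COCO train2017."""
    return epochs * COCO_TRAIN_IMAGES // ims_per_batch

print(epochs_to_iters(100))  # 184375 steps at the assumed batch size
```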
### COCO Person Keypoint Detection Baselines with Keypoint R-CNN

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | kp. AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| R50-FPN | 1x | 0.315 | 0.072 | 5.0 | 53.6 | 64.0 | 137261548 | model \| metrics |
| R50-FPN | 3x | 0.316 | 0.066 | 5.0 | 55.4 | 65.5 | 137849621 | model \| metrics |
| R101-FPN | 3x | 0.390 | 0.076 | 6.1 | 56.4 | 66.1 | 138363331 | model \| metrics |
| X101-FPN | 3x | 0.738 | 0.121 | 8.7 | 57.3 | 66.0 | 139686956 | model \| metrics |
### COCO Panoptic Segmentation Baselines with Panoptic FPN

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | PQ | model id | download |
|---|---|---|---|---|---|---|---|---|---|
| R50-FPN | 1x | 0.304 | 0.053 | 4.8 | 37.6 | 34.7 | 39.4 | 139514544 | model \| metrics |
| R50-FPN | 3x | 0.302 | 0.053 | 4.8 | 40.0 | 36.5 | 41.5 | 139514569 | model \| metrics |
| R101-FPN | 3x | 0.392 | 0.066 | 6.0 | 42.4 | 38.5 | 43.0 | 139514519 | model \| metrics |
### LVIS Instance Segmentation Baselines with Mask R-CNN

Mask R-CNN baselines on the [LVIS dataset](https://lvisdataset.org), v0.5. These baselines are described in Table 3(c) of the [LVIS paper](https://arxiv.org/abs/1908.03195).

NOTE: the 1x schedule here has the same amount of __iterations__ as the COCO 1x baselines. They are roughly 24 epochs of LVISv0.5 data. The final results of these configs have large variance across different runs.

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| R50-FPN | 1x | 0.292 | 0.107 | 7.1 | 23.6 | 24.4 | 144219072 | model \| metrics |
| R101-FPN | 1x | 0.371 | 0.114 | 7.8 | 25.6 | 25.9 | 144219035 | model \| metrics |
| X101-FPN | 1x | 0.712 | 0.151 | 10.2 | 26.7 | 27.1 | 144219108 | model \| metrics |
### Cityscapes & Pascal VOC Baselines

Simple baselines for
* Mask R-CNN on Cityscapes instance segmentation (initialized from COCO pre-training, then trained on Cityscapes fine annotations only)
* Faster R-CNN on PASCAL VOC object detection (trained on VOC 2007 train+val + VOC 2012 train+val, tested on VOC 2007 using 11-point interpolated AP)

| Name | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | box AP50 | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| R50-FPN, Cityscapes | 0.240 | 0.078 | 4.4 | | | 36.5 | 142423278 | model \| metrics |
| R50-C4, VOC | 0.537 | 0.081 | 4.8 | 51.9 | 80.3 | | 142202221 | model \| metrics |
### Other Settings

Ablations for Deformable Conv and Cascade R-CNN:

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| Baseline R50-FPN | 1x | 0.261 | 0.043 | 3.4 | 38.6 | 35.2 | 137260431 | model \| metrics |
| Deformable Conv | 1x | 0.342 | 0.048 | 3.5 | 41.5 | 37.5 | 138602867 | model \| metrics |
| Cascade R-CNN | 1x | 0.317 | 0.052 | 4.0 | 42.1 | 36.4 | 138602847 | model \| metrics |
| Baseline R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
| Deformable Conv | 3x | 0.349 | 0.047 | 3.5 | 42.7 | 38.5 | 144998336 | model \| metrics |
| Cascade R-CNN | 3x | 0.328 | 0.053 | 4.0 | 44.3 | 38.5 | 144998488 | model \| metrics |
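For reference, the deformable-conv rows correspond to flipping the standard ResNet config keys on top of the FPN baseline; a sketch of the override (the exact yaml used for these runs is not reproduced here, so treat the per-stage choice as illustrative):

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml"))
# Replace 3x3 convs with deformable convs in res3-res5, leaving res2 standard.
cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE = [False, True, True, True]
cfg.MODEL.RESNETS.DEFORM_MODULATED = False  # v1-style deformable conv
```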
Ablations for normalization methods, and a few models trained from scratch following [Rethinking ImageNet Pre-training](https://arxiv.org/abs/1811.08883). (Note: The baseline uses a `2fc` head while the others use a [`4conv1fc` head](https://arxiv.org/abs/1803.08494).)

| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | model id | download |
|---|---|---|---|---|---|---|---|---|
| Baseline R50-FPN | 3x | 0.261 | 0.043 | 3.4 | 41.0 | 37.2 | 137849600 | model \| metrics |
| GN | 3x | 0.309 | 0.060 | 5.6 | 42.6 | 38.6 | 138602888 | model \| metrics |
| SyncBN | 3x | 0.345 | 0.053 | 5.5 | 41.9 | 37.8 | 169527823 | model \| metrics |
| GN (from scratch) | 3x | 0.338 | 0.061 | 7.2 | 39.9 | 36.6 | 138602908 | model \| metrics |
| GN (from scratch) | 9x | N/A | 0.061 | 7.2 | 43.7 | 39.6 | 183808979 | model \| metrics |
| SyncBN (from scratch) | 9x | N/A | 0.055 | 7.2 | 43.6 | 39.3 | 184226666 | model \| metrics |
A few very large models trained for a long time, for demo purposes. They are trained using multiple machines:

| Name | inference time (s/im) | train mem (GB) | box AP | mask AP | PQ | model id | download |
|---|---|---|---|---|---|---|---|
| Panoptic FPN R101 | 0.098 | 11.4 | 47.4 | 41.3 | 46.1 | 139797668 | model \| metrics |
| Mask R-CNN X152 | 0.234 | 15.1 | 50.2 | 44.0 | | 18131413 | model \| metrics |
| above + test-time aug. | | | 51.9 | 45.9 | | | |
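The "test-time aug." row uses detectron2's built-in TTA wrapper; a minimal sketch of enabling it, illustrated on the R50-FPN baseline rather than the X152 model in the row, with scales and thresholds left at their config defaults:

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.modeling import GeneralizedRCNNWithTTA

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.TEST.AUG.ENABLED = True  # multi-scale + horizontal flip at inference time

model = model_zoo.get("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml",
                      trained=True)
tta_model = GeneralizedRCNNWithTTA(cfg, model)  # wraps the plain model
```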
      diff --git a/grit_src_deprecated/third_party/CenterNet2/README.md b/grit_src_deprecated/third_party/CenterNet2/README.md deleted file mode 100755 index d3e1d5cf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/README.md +++ /dev/null @@ -1,85 +0,0 @@ -# Probabilistic two-stage detection -Two-stage object detectors that use class-agnostic one-stage detectors as the proposal network. - - -

      - -> [**Probabilistic two-stage detection**](http://arxiv.org/abs/2103.07461), -> Xingyi Zhou, Vladlen Koltun, Philipp Krähenbühl, -> *arXiv technical report ([arXiv 2103.07461](http://arxiv.org/abs/2103.07461))* - -Contact: [zhouxy@cs.utexas.edu](mailto:zhouxy@cs.utexas.edu). Any questions or discussions are welcomed! - -## Abstract - -We develop a probabilistic interpretation of two-stage object detection. We show that this probabilistic interpretation motivates a number of common empirical training practices. It also suggests changes to two-stage detection pipelines. Specifically, the first stage should infer proper object-vs-background likelihoods, which should then inform the overall score of the detector. A standard region proposal network (RPN) cannot infer this likelihood sufficiently well, but many one-stage detectors can. We show how to build a probabilistic two-stage detector from any state-of-the-art one-stage detector. The resulting detectors are faster and more accurate than both their one- and two-stage precursors. Our detector achieves 56.4 mAP on COCO test-dev with single-scale testing, outperforming all published results. Using a lightweight backbone, our detector achieves 49.2 mAP on COCO at 33 fps on a Titan Xp. - -## Summary - -- Two-stage CenterNet: First stage estimates object probabilities, second stage conditionally classifies objects. - -- Resulting detector is faster and more accurate than both traditional two-stage detectors (fewer proposals required), and one-stage detectors (lighter first stage head). - -- Our best model achieves 56.4 mAP on COCO test-dev. - -- This repo also includes a detectron2-based CenterNet implementation with better accuracy (42.5 mAP at 70FPS) and a new FPN version of CenterNet (40.2 mAP with Res50_1x). - -## Main results - -All models are trained with multi-scale training, and tested with a single scale. The FPS is tested on a Titan RTX GPU. -More models and details can be found in the [MODEL_ZOO](projects/CenterNet2/centernet2_docs/MODEL_ZOO.md). - -#### COCO - -| Model | COCO val mAP | FPS | -|-------------------------------------------|---------------|-------| -| CenterNet-S4_DLA_8x | 42.5 | 71 | -| CenterNet2_R50_1x | 42.9 | 24 | -| CenterNet2_X101-DCN_2x | 49.9 | 8 | -| CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST | 56.1 | 5 | -| CenterNet2_DLA-BiFPN-P5_24x_ST | 49.2 | 38 | - - -#### LVIS - -| Model | val mAP box | -| ------------------------- | ----------- | -| CenterNet2_R50_1x | 26.5 | -| CenterNet2_FedLoss_R50_1x | 28.3 | - - -#### Objects365 - -| Model | val mAP | -|-------------------------------------------|----------| -| CenterNet2_R50_1x | 22.6 | - -## Installation - -Our project is developed on [detectron2](https://github.com/facebookresearch/detectron2). Please follow the official detectron2 [installation](https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md). All our code is under `projects/CenterNet2/`. In theory, you should be able to copy-paste `projects/CenterNet2/` to the latest detectron2 release or your own detectron2 repo to run our project. There might be API changes in future detectron2 releases that make it incompatible. - -We use the default detectron2 demo script. 
To run inference on an image folder using our pre-trained model, run - -~~~ -python projects/CenterNet2/demo/demo.py --config-file projects/CenterNet2/configs/CenterNet2_R50_1x.yaml --input path/to/image/ --opts MODEL.WEIGHTS models/CenterNet2_R50_1x.pth -~~~ - -## Benchmark evaluation and training - -Please check detectron2 [GETTING_STARTED.md](https://github.com/facebookresearch/detectron2/blob/master/GETTING_STARTED.md) for running evaluation and training. Our config files are under `projects/CenterNet2/configs` and the pre-trained models are in the [MODEL_ZOO](projects/CenterNet2/centernet2_docs/MODEL_ZOO.md). - - -## License - -Our code under `projects/CenterNet2/` is under [Apache 2.0 license](projects/CenterNet2/LICENSE). `projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py` are from [AdelaiDet](https://github.com/aim-uofa/AdelaiDet), which follows the original [non-commercial license](https://github.com/aim-uofa/AdelaiDet/blob/master/LICENSE). The code from detectron2 follows the original [Apache 2.0 license](LICENSE). - -## Citation - -If you find this project useful for your research, please use the following BibTeX entry. - - @inproceedings{zhou2021probablistic, - title={Probabilistic two-stage detection}, - author={Zhou, Xingyi and Koltun, Vladlen and Kr{\"a}henb{\"u}hl, Philipp}, - booktitle={arXiv preprint arXiv:2103.07461}, - year={2021} - } diff --git a/grit_src_deprecated/third_party/CenterNet2/README_D2.md b/grit_src_deprecated/third_party/CenterNet2/README_D2.md deleted file mode 100755 index a88ad7e2..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/README_D2.md +++ /dev/null @@ -1,62 +0,0 @@ - - -Detectron2 is Facebook AI Research's next generation software system -that implements state-of-the-art object detection algorithms. -It is a ground-up rewrite of the previous version, -[Detectron](https://github.com/facebookresearch/Detectron/), -and it originates from [maskrcnn-benchmark](https://github.com/facebookresearch/maskrcnn-benchmark/). - -
      - -
      - -### What's New -* It is powered by the [PyTorch](https://pytorch.org) deep learning framework. -* Includes more features such as panoptic segmentation, Densepose, Cascade R-CNN, rotated bounding boxes, PointRend, - DeepLab, etc. -* Can be used as a library to support [different projects](projects/) on top of it. - We'll open source more research projects in this way. -* It [trains much faster](https://detectron2.readthedocs.io/notes/benchmarks.html). -* Models can be exported to TorchScript format or Caffe2 format for deployment. - -See our [blog post](https://ai.facebook.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-/) -to see more demos and learn about detectron2. - -## Installation - -See [INSTALL.md](INSTALL.md). - -## Getting Started - -Follow the [installation instructions](https://detectron2.readthedocs.io/tutorials/install.html) to -install detectron2. - -See [Getting Started with Detectron2](https://detectron2.readthedocs.io/tutorials/getting_started.html), -and the [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -to learn about basic usage. - -Learn more at our [documentation](https://detectron2.readthedocs.org). -And see [projects/](projects/) for some projects that are built on top of detectron2. - -## Model Zoo and Baselines - -We provide a large set of baseline results and trained models available for download in the [Detectron2 Model Zoo](MODEL_ZOO.md). - - -## License - -Detectron2 is released under the [Apache 2.0 license](LICENSE). - -## Citing Detectron2 - -If you use Detectron2 in your research or wish to refer to the baseline results published in the [Model Zoo](MODEL_ZOO.md), please use the following BibTeX entry. - -```BibTeX -@misc{wu2019detectron2, - author = {Yuxin Wu and Alexander Kirillov and Francisco Massa and - Wan-Yen Lo and Ross Girshick}, - title = {Detectron2}, - howpublished = {\url{https://github.com/facebookresearch/detectron2}}, - year = {2019} -} -``` diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml deleted file mode 100755 index fbf34a0e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-C4.yaml +++ /dev/null @@ -1,18 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - RPN: - PRE_NMS_TOPK_TEST: 6000 - POST_NMS_TOPK_TEST: 1000 - ROI_HEADS: - NAME: "Res5ROIHeads" -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml deleted file mode 100755 index c0d6d16b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-DilatedC5.yaml +++ /dev/null @@ -1,31 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - RESNETS: - OUT_FEATURES: ["res5"] - RES5_DILATION: 2 - RPN: - IN_FEATURES: ["res5"] - PRE_NMS_TOPK_TEST: 6000 - POST_NMS_TOPK_TEST: 1000 - ROI_HEADS: - NAME: "StandardROIHeads" - IN_FEATURES: ["res5"] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: 
(60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml deleted file mode 100755 index 3e020f2e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RCNN-FPN.yaml +++ /dev/null @@ -1,42 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - BACKBONE: - NAME: "build_resnet_fpn_backbone" - RESNETS: - OUT_FEATURES: ["res2", "res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res2", "res3", "res4", "res5"] - ANCHOR_GENERATOR: - SIZES: [[32], [64], [128], [256], [512]] # One size for each in feature map - ASPECT_RATIOS: [[0.5, 1.0, 2.0]] # Three aspect ratios (same for all in feature maps) - RPN: - IN_FEATURES: ["p2", "p3", "p4", "p5", "p6"] - PRE_NMS_TOPK_TRAIN: 2000 # Per FPN level - PRE_NMS_TOPK_TEST: 1000 # Per FPN level - # Detectron1 uses 2000 proposals per-batch, - # (See "modeling/rpn/rpn_outputs.py" for details of this legacy issue) - # which is approximately 1000 proposals per-image since the default batch size for FPN is 2. - POST_NMS_TOPK_TRAIN: 1000 - POST_NMS_TOPK_TEST: 1000 - ROI_HEADS: - NAME: "StandardROIHeads" - IN_FEATURES: ["p2", "p3", "p4", "p5"] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml deleted file mode 100755 index 8b45b982..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Base-RetinaNet.yaml +++ /dev/null @@ -1,25 +0,0 @@ -MODEL: - META_ARCHITECTURE: "RetinaNet" - BACKBONE: - NAME: "build_retinanet_resnet_fpn_backbone" - RESNETS: - OUT_FEATURES: ["res3", "res4", "res5"] - ANCHOR_GENERATOR: - SIZES: !!python/object/apply:eval ["[[x, x * 2**(1.0/3), x * 2**(2.0/3) ] for x in [32, 64, 128, 256, 512 ]]"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] - RETINANET: - IOU_THRESHOLDS: [0.4, 0.5] - IOU_LABELS: [0, -1, 1] - SMOOTH_L1_LOSS_BETA: 0.0 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.01 # Note that RetinaNet uses a different default learning rate - STEPS: (60000, 80000) - MAX_ITER: 90000 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -VERSION: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 773ac10e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,17 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - LOAD_PROPOSALS: True - RESNETS: - DEPTH: 50 - PROPOSAL_GENERATOR: - NAME: "PrecomputedProposals" -DATASETS: - TRAIN: ("coco_2017_train",) - PROPOSAL_FILES_TRAIN: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_train_box_proposals_21bc3a.pkl", ) - TEST: ("coco_2017_val",) - PROPOSAL_FILES_TEST: 
("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", ) -DATALOADER: - # proposals are part of the dataset_dicts, and take a lot of RAM - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml deleted file mode 100755 index db142cd6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml deleted file mode 100755 index bceb6b34..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml deleted file mode 100755 index 57a098f5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_101_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: False - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml deleted file mode 100755 index f9613010..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml deleted file mode 100755 index bc51bce3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml deleted file mode 100755 index 0fe96f57..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: 
False - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml deleted file mode 100755 index 33fadeb8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 3262019a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml deleted file mode 100755 index 41395182..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml deleted file mode 100755 index 9c9b5ab7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - MASK_ON: False - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py deleted file mode 100755 index 86f83c68..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/fcos_R_50_FPN_1x.py +++ /dev/null @@ -1,11 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.fcos import model -from ..common.train import train - -dataloader.train.mapper.use_instance_mask = False -optimizer.lr = 0.01 - -model.backbone.bottom_up.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml deleted file mode 100755 index 4abb1b9a..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_101_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py deleted file mode 100755 index 43057a8e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.py +++ /dev/null @@ -1,11 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.retinanet import model -from ..common.train import train - -dataloader.train.mapper.use_instance_mask = False -model.backbone.bottom_up.freeze_at = 2 -optimizer.lr = 0.01 - -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml deleted file mode 100755 index 4a24ce3a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_1x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml deleted file mode 100755 index 3b5412d4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/retinanet_R_50_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "../Base-RetinaNet.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml deleted file mode 100755 index e0482115..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_C4_1x.yaml +++ /dev/null @@ -1,10 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - META_ARCHITECTURE: "ProposalNetwork" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - RPN: - PRE_NMS_TOPK_TEST: 12000 - POST_NMS_TOPK_TEST: 2000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml deleted file mode 100755 index dc9c9520..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Detection/rpn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "ProposalNetwork" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - RPN: - POST_NMS_TOPK_TEST: 2000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml 
b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml deleted file mode 100755 index 1a94cc45..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: True - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml deleted file mode 100755 index 67b70cf4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: True - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml deleted file mode 100755 index 1935a302..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: True - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py deleted file mode 100755 index 22016be1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.py +++ /dev/null @@ -1,8 +0,0 @@ -from ..common.train import train -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_c4 import model - -model.backbone.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml deleted file mode 100755 index a9aeb4ea..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml deleted file mode 100755 index 38ed867d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - 
DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml deleted file mode 100755 index b13eefab..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml deleted file mode 100755 index d4010163..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-DilatedC5.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py deleted file mode 100755 index 40844dde..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py +++ /dev/null @@ -1,8 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_fpn import model -from ..common.train import train - -model.backbone.bottom_up.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index d50fb866..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml deleted file mode 100755 index bec680ee..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x_giou.yaml +++ /dev/null @@ -1,12 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - RPN: - BBOX_REG_LOSS_TYPE: "giou" - BBOX_REG_LOSS_WEIGHT: 2.0 - ROI_BOX_HEAD: - BBOX_REG_LOSS_TYPE: "giou" - BBOX_REG_LOSS_WEIGHT: 10.0 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml deleted file mode 100755 index be7d06b8..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml +++ /dev/null @@ -1,9 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml deleted file mode 100755 index d14c63f7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - MASK_ON: True - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py deleted file mode 100755 index d7bbdd7d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py +++ /dev/null @@ -1,34 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_fpn import model -from ..common.train import train - -from detectron2.config import LazyCall as L -from detectron2.modeling.backbone import RegNet -from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock - - -# Replace default ResNet with RegNetX-4GF from the DDS paper. 
Config source: -# https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnetx/RegNetX-4.0GF_dds_8gpu.yaml#L4-L9 # noqa -model.backbone.bottom_up = L(RegNet)( - stem_class=SimpleStem, - stem_width=32, - block_class=ResBottleneckBlock, - depth=23, - w_a=38.65, - w_0=96, - w_m=2.43, - group_width=40, - freeze_at=2, - norm="FrozenBN", - out_features=["s1", "s2", "s3", "s4"], -) -model.pixel_std = [57.375, 57.120, 58.395] - -optimizer.weight_decay = 5e-5 -train.init_checkpoint = ( - "https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906383/RegNetX-4.0GF_dds_8gpu.pyth" -) -# RegNets benefit from enabling cudnn benchmark mode -train.cudnn_benchmark = True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py deleted file mode 100755 index 72c6b7a5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py +++ /dev/null @@ -1,35 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_fpn import model -from ..common.train import train - -from detectron2.config import LazyCall as L -from detectron2.modeling.backbone import RegNet -from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock - - -# Replace default ResNet with RegNetY-4GF from the DDS paper. Config source: -# https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnety/RegNetY-4.0GF_dds_8gpu.yaml#L4-L10 # noqa -model.backbone.bottom_up = L(RegNet)( - stem_class=SimpleStem, - stem_width=32, - block_class=ResBottleneckBlock, - depth=22, - w_a=31.41, - w_0=96, - w_m=2.24, - group_width=64, - se_ratio=0.25, - freeze_at=2, - norm="FrozenBN", - out_features=["s1", "s2", "s3", "s4"], -) -model.pixel_std = [57.375, 57.120, 58.395] - -optimizer.weight_decay = 5e-5 -train.init_checkpoint = ( - "https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth" -) -# RegNets benefit from enabling cudnn benchmark mode -train.cudnn_benchmark = True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml deleted file mode 100755 index 4e03944a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/Base-Keypoint-RCNN-FPN.yaml +++ /dev/null @@ -1,15 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - KEYPOINT_ON: True - ROI_HEADS: - NUM_CLASSES: 1 - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 0.5 # Keypoint AP degrades (though box AP improves) when using plain L1 loss - RPN: - # Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2. - # 1000 proposals per-image is found to hurt box AP. - # Therefore we increase it to 1500 per-image. 
- POST_NMS_TOPK_TRAIN: 1500 -DATASETS: - TRAIN: ("keypoints_coco_2017_train",) - TEST: ("keypoints_coco_2017_val",) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml deleted file mode 100755 index 9309535c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "Base-Keypoint-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py deleted file mode 100755 index 1aad53bf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.py +++ /dev/null @@ -1,8 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco_keypoint import dataloader -from ..common.models.keypoint_rcnn_fpn import model -from ..common.train import train - -model.backbone.bottom_up.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 7bf85cf7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "Base-Keypoint-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml deleted file mode 100755 index a07f243f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "Base-Keypoint-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml deleted file mode 100755 index d4bfa20a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x.yaml +++ /dev/null @@ -1,12 +0,0 @@ -_BASE_: "Base-Keypoint-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml deleted file mode 100755 index f00d54b7..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "PanopticFPN" - MASK_ON: True - SEM_SEG_HEAD: - LOSS_WEIGHT: 0.5 -DATASETS: - TRAIN: ("coco_2017_train_panoptic_separated",) - TEST: ("coco_2017_val_panoptic_separated",) -DATALOADER: - FILTER_EMPTY_ANNOTATIONS: False diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml deleted file mode 100755 index 0e01f6fb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "Base-Panoptic-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - RESNETS: - DEPTH: 101 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py deleted file mode 100755 index 40cf1813..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.py +++ /dev/null @@ -1,8 +0,0 @@ -from ..common.optim import SGD as optimizer -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.data.coco_panoptic_separated import dataloader -from ..common.models.panoptic_fpn import model -from ..common.train import train - -model.backbone.bottom_up.freeze_at = 2 -train.init_checkpoint = "detectron2://ImageNetPretrained/MSRA/R-50.pkl" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml deleted file mode 100755 index 6afa2c1c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "Base-Panoptic-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml deleted file mode 100755 index b956b3f6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "Base-Panoptic-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml deleted file mode 100755 index 1a7aaeb9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Cityscapes/mask_rcnn_R_50_FPN.yaml +++ /dev/null @@ -1,27 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - # WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - # For better, more stable performance initialize from COCO - WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl" - MASK_ON: True - 
ROI_HEADS: - NUM_CLASSES: 8 -# This is similar to the setting used in Mask R-CNN paper, Appendix A -# But there are some differences, e.g., we did not initialize the output -# layer using the corresponding classes from COCO -INPUT: - MIN_SIZE_TRAIN: (800, 832, 864, 896, 928, 960, 992, 1024) - MIN_SIZE_TRAIN_SAMPLING: "choice" - MIN_SIZE_TEST: 1024 - MAX_SIZE_TRAIN: 2048 - MAX_SIZE_TEST: 2048 -DATASETS: - TRAIN: ("cityscapes_fine_instance_seg_train",) - TEST: ("cityscapes_fine_instance_seg_val",) -SOLVER: - BASE_LR: 0.01 - STEPS: (18000,) - MAX_ITER: 24000 - IMS_PER_BATCH: 8 -TEST: - EVAL_PERIOD: 8000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md deleted file mode 100755 index 924fd00a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/README.md +++ /dev/null @@ -1,84 +0,0 @@ - -Detectron2 model zoo's experimental settings and a few implementation details are different from Detectron. - -The differences in implementation details are shared in -[Compatibility with Other Libraries](../../docs/notes/compatibility.md). - -The differences in model zoo's experimental settings include: -* Use scale augmentation during training. This improves AP with lower training cost. -* Use L1 loss instead of smooth L1 loss for simplicity. This sometimes improves box AP but may - affect other AP (see the short sketch after the table below). -* Use `POOLER_SAMPLING_RATIO=0` instead of 2. This does not significantly affect AP. -* Use `ROIAlignV2`. This does not significantly affect AP. - -In this directory, we provide a few configs that __do not__ have the above changes. -They mimic Detectron's behavior as closely as possible, -and provide a fair comparison of accuracy and speed against Detectron. -
-| Name | lr sched | train time (s/iter) | inference time (s/im) | train mem (GB) | box AP | mask AP | kp. AP | model id | download |
-| :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
-| Faster R-CNN | 1x | 0.219 | 0.038 | 3.1 | 36.9 | | | 137781054 | model \| metrics |
-| Keypoint R-CNN | 1x | 0.313 | 0.071 | 5.0 | 53.1 | | 64.2 | 137781195 | model \| metrics |
-| Mask R-CNN | 1x | 0.273 | 0.043 | 3.4 | 37.8 | 34.9 | | 137781281 | model \| metrics |
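As the list above notes, Detectron2 defaults to plain L1 loss while these configs restore Detectron1's smooth-L1 "magic" beta values. A minimal standalone sketch of that relationship (illustrative Python, not Detectron2's actual implementation): smooth L1 is quadratic below the threshold `beta`, linear above it, and reduces to plain L1 as `beta` goes to zero.

```
def smooth_l1(x: float, beta: float) -> float:
    """Element-wise smooth L1 (Huber-style) loss with switch-over point `beta`."""
    ax = abs(x)
    if beta == 0.0:                    # Detectron2's default: plain L1
        return ax
    if ax < beta:
        return 0.5 * ax * ax / beta    # quadratic near zero
    return ax - 0.5 * beta             # linear tail, shifted for continuity

# Detectron1's RPN beta of 1/9 ~= 0.1111 down-weights small residuals:
print(smooth_l1(0.05, 0.1111))  # ~0.01125 (quadratic region)
print(smooth_l1(0.05, 0.0))     # 0.05 (plain L1)
```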
- -## Comparisons: - -* Faster R-CNN: Detectron's AP is 36.7, similar to ours. -* Keypoint R-CNN: Detectron's AP is box 53.6, keypoint 64.2. Fixing a - [bug](https://github.com/facebookresearch/Detectron/issues/459) in Detectron led to a drop in box AP, which can be - compensated for by some parameter tuning. -* Mask R-CNN: Detectron's AP is box 37.7, mask 33.9. We're 1 AP better in mask AP, due to a more correct implementation. - See [this article](https://ppwwyyxx.com/blog/2021/Where-are-Pixels/) for details. - -For speed comparison, see [benchmarks](https://detectron2.readthedocs.io/notes/benchmarks.html). diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml deleted file mode 100755 index 6ce77f13..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x.yaml +++ /dev/null @@ -1,17 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - # Detectron1 uses smooth L1 loss with some magic beta values. - # The defaults are changed to L1 loss in Detectron2. - RPN: - SMOOTH_L1_BETA: 0.1111 - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 1.0 - POOLER_SAMPLING_RATIO: 2 - POOLER_TYPE: "ROIAlign" -INPUT: - # no scale augmentation - MIN_SIZE_TRAIN: (800, ) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index aacf868b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,27 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - KEYPOINT_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 1 - ROI_KEYPOINT_HEAD: - POOLER_RESOLUTION: 14 - POOLER_SAMPLING_RATIO: 2 - POOLER_TYPE: "ROIAlign" - # Detectron1 uses smooth L1 loss with some magic beta values. - # The defaults are changed to L1 loss in Detectron2. - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 1.0 - POOLER_SAMPLING_RATIO: 2 - POOLER_TYPE: "ROIAlign" - RPN: - SMOOTH_L1_BETA: 0.1111 - # Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2 - # 1000 proposals per-image is found to hurt box AP. - # Therefore we increase it to 1500 per-image. - POST_NMS_TOPK_TRAIN: 1500 -DATASETS: - TRAIN: ("keypoints_coco_2017_train",) - TEST: ("keypoints_coco_2017_val",) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml deleted file mode 100755 index 4ea86a8d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - # Detectron1 uses smooth L1 loss with some magic beta values. - # The defaults are changed to L1 loss in Detectron2.
- RPN: - SMOOTH_L1_BETA: 0.1111 - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 1.0 - POOLER_SAMPLING_RATIO: 2 - POOLER_TYPE: "ROIAlign" - ROI_MASK_HEAD: - POOLER_SAMPLING_RATIO: 2 - POOLER_TYPE: "ROIAlign" -INPUT: - # no scale augmentation - MIN_SIZE_TRAIN: (800, ) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml deleted file mode 100755 index f0c3a1bb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: True - RESNETS: - DEPTH: 101 - ROI_HEADS: - NUM_CLASSES: 1230 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v0.5_train",) - TEST: ("lvis_v0.5_val",) -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index 64b4caa4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 1230 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v0.5_train",) - TEST: ("lvis_v0.5_val",) -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml deleted file mode 100755 index c8b822c6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml +++ /dev/null @@ -1,23 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 - ROI_HEADS: - NUM_CLASSES: 1230 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v0.5_train",) - TEST: ("lvis_v0.5_val",) -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml deleted file mode 100755 index ca4dd971..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_101_FPN_1x.yaml +++ /dev/null @@ -1,22 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" 
-MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-101.pkl" - MASK_ON: True - RESNETS: - DEPTH: 101 - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -SOLVER: - STEPS: (120000, 160000) - MAX_ITER: 180000 # 180000 * 16 / 100000 ~ 28.8 epochs -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index f313295e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,22 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -SOLVER: - STEPS: (120000, 160000) - MAX_ITER: 180000 # 180000 * 16 / 100000 ~ 28.8 epochs -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml deleted file mode 100755 index f6528f7c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/LVISv1-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x.yaml +++ /dev/null @@ -1,26 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.0001 -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -SOLVER: - STEPS: (120000, 160000) - MAX_ITER: 180000 # 180000 * 16 / 100000 ~ 28.8 epochs -TEST: - DETECTIONS_PER_IMAGE: 300 # LVIS allows up to 300 -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml deleted file mode 100755 index abb33b61..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_1x.yaml +++ /dev/null @@ -1,12 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - NAME: CascadeROIHeads - ROI_BOX_HEAD: - CLS_AGNOSTIC_BBOX_REG: True - RPN: - POST_NMS_TOPK_TRAIN: 2000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml deleted file mode 100755 index e2201ad5..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml +++ /dev/null @@ -1,15 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - NAME: CascadeROIHeads - ROI_BOX_HEAD: - CLS_AGNOSTIC_BBOX_REG: True - RPN: - POST_NMS_TOPK_TRAIN: 2000 -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml deleted file mode 100755 index fc117f6b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - MASK_ON: True - WEIGHTS: "catalog://ImageNetPretrained/FAIR/X-152-32x8d-IN5k" - RESNETS: - STRIDE_IN_1X1: False # this is a C2 model - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 152 - DEFORM_ON_PER_STAGE: [False, True, True, True] - ROI_HEADS: - NAME: "CascadeROIHeads" - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_CONV: 4 - NUM_FC: 1 - NORM: "GN" - CLS_AGNOSTIC_BBOX_REG: True - ROI_MASK_HEAD: - NUM_CONV: 8 - NORM: "GN" - RPN: - POST_NMS_TOPK_TRAIN: 2000 -SOLVER: - IMS_PER_BATCH: 128 - STEPS: (35000, 45000) - MAX_ITER: 50000 - BASE_LR: 0.16 -INPUT: - MIN_SIZE_TRAIN: (640, 864) - MIN_SIZE_TRAIN_SAMPLING: "range" - MAX_SIZE_TRAIN: 1440 - CROP: - ENABLED: True -TEST: - EVAL_PERIOD: 2500 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml deleted file mode 100755 index 4c3b767f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_cls_agnostic.yaml +++ /dev/null @@ -1,10 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - ROI_BOX_HEAD: - CLS_AGNOSTIC_BBOX_REG: True - ROI_MASK_HEAD: - CLS_AGNOSTIC_MASK: True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml deleted file mode 100755 index 04ff988d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5.yaml +++ /dev/null @@ -1,8 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - DEFORM_ON_PER_STAGE: [False, True, True, True] # on Res3,Res4,Res5 - DEFORM_MODULATED: False diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml deleted file mode 100755 index 68c0ca58..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - DEFORM_ON_PER_STAGE: [False, True, True, True] # on Res3,Res4,Res5 - DEFORM_MODULATED: False -SOLVER: - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git 
a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml deleted file mode 100755 index 74d274e5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_gn.yaml +++ /dev/null @@ -1,21 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "catalog://ImageNetPretrained/FAIR/R-50-GN" - MASK_ON: True - RESNETS: - DEPTH: 50 - NORM: "GN" - STRIDE_IN_1X1: False - FPN: - NORM: "GN" - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_CONV: 4 - NUM_FC: 1 - NORM: "GN" - ROI_MASK_HEAD: - NORM: "GN" -SOLVER: - # 3x schedule - STEPS: (210000, 250000) - MAX_ITER: 270000 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml deleted file mode 100755 index 11ebb076..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mask_rcnn_R_50_FPN_3x_syncbn.yaml +++ /dev/null @@ -1,24 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - NORM: "SyncBN" - STRIDE_IN_1X1: True - FPN: - NORM: "SyncBN" - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_CONV: 4 - NUM_FC: 1 - NORM: "SyncBN" - ROI_MASK_HEAD: - NORM: "SyncBN" -SOLVER: - # 3x schedule - STEPS: (210000, 250000) - MAX_ITER: 270000 -TEST: - PRECISE_BN: - ENABLED: True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py deleted file mode 100755 index 0f2464be..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/mmdet_mask_rcnn_R_50_FPN_1x.py +++ /dev/null @@ -1,151 +0,0 @@ -# An example config to train a mmdetection model using detectron2. 
- -from ..common.data.coco import dataloader -from ..common.coco_schedule import lr_multiplier_1x as lr_multiplier -from ..common.optim import SGD as optimizer -from ..common.train import train - -from detectron2.modeling.mmdet_wrapper import MMDetDetector -from detectron2.config import LazyCall as L - -model = L(MMDetDetector)( - detector=dict( - type="MaskRCNN", - pretrained="torchvision://resnet50", - backbone=dict( - type="ResNet", - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - ), - neck=dict(type="FPN", in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5), - rpn_head=dict( - type="RPNHead", - in_channels=256, - feat_channels=256, - anchor_generator=dict( - type="AnchorGenerator", - scales=[8], - ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - ), - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[1.0, 1.0, 1.0, 1.0], - ), - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=True, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - roi_head=dict( - type="StandardRoIHead", - bbox_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=7, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - bbox_head=dict( - type="Shared2FCBBoxHead", - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[0.1, 0.1, 0.2, 0.2], - ), - reg_class_agnostic=False, - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=False, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - mask_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=14, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - mask_head=dict( - type="FCNMaskHead", - num_convs=4, - in_channels=256, - conv_out_channels=256, - num_classes=80, - loss_mask=dict(type="CrossEntropyLoss", use_mask=True, loss_weight=1.0), - ), - ), - # model training and testing settings - train_cfg=dict( - rpn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.7, - neg_iou_thr=0.3, - min_pos_iou=0.3, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=256, - pos_fraction=0.5, - neg_pos_ub=-1, - add_gt_as_proposals=False, - ), - allowed_border=-1, - pos_weight=-1, - debug=False, - ), - rpn_proposal=dict( - nms_pre=2000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.5, - neg_iou_thr=0.5, - min_pos_iou=0.5, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True, - ), - mask_size=28, - pos_weight=-1, - debug=False, - ), - ), - test_cfg=dict( - rpn=dict( - nms_pre=1000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - score_thr=0.05, - nms=dict(type="nms", iou_threshold=0.5), - max_per_img=100, - mask_thr_binary=0.5, - ), - ), - ), - pixel_mean=[123.675, 116.280, 103.530], - pixel_std=[58.395, 57.120, 57.375], -) - -dataloader.train.mapper.image_format = "RGB" # torchvision pretrained model -train.init_checkpoint = None # pretrained model is loaded inside backbone diff 
--git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml deleted file mode 100755 index 34016cea..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x.yaml +++ /dev/null @@ -1,26 +0,0 @@ -# A large PanopticFPN for demo purposes. -# Use GN on backbone to support semantic seg. -# Use Cascade + Deform Conv to improve localization. -_BASE_: "../COCO-PanopticSegmentation/Base-Panoptic-FPN.yaml" -MODEL: - WEIGHTS: "catalog://ImageNetPretrained/FAIR/R-101-GN" - RESNETS: - DEPTH: 101 - NORM: "GN" - DEFORM_ON_PER_STAGE: [False, True, True, True] - STRIDE_IN_1X1: False - FPN: - NORM: "GN" - ROI_HEADS: - NAME: CascadeROIHeads - ROI_BOX_HEAD: - CLS_AGNOSTIC_BBOX_REG: True - ROI_MASK_HEAD: - NORM: "GN" - RPN: - POST_NMS_TOPK_TRAIN: 2000 -SOLVER: - STEPS: (105000, 125000) - MAX_ITER: 135000 - IMS_PER_BATCH: 32 - BASE_LR: 0.04 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml deleted file mode 100755 index f3400288..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_gn.yaml" -MODEL: - # Train from random initialization. - WEIGHTS: "" - # It makes sense to divide by STD when training from scratch - # But it seems to make no difference on the results and C2's models didn't do this. - # So we keep things consistent with C2. - # PIXEL_STD: [57.375, 57.12, 58.395] - MASK_ON: True - BACKBONE: - FREEZE_AT: 0 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml deleted file mode 100755 index d90c9ff0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_gn.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_gn.yaml" -MODEL: - PIXEL_STD: [57.375, 57.12, 58.395] - WEIGHTS: "" - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False - BACKBONE: - FREEZE_AT: 0 -SOLVER: - # 9x schedule - IMS_PER_BATCH: 64 # 4x the standard - STEPS: (187500, 197500) # last 60/4==15k and last 20/4==5k - MAX_ITER: 202500 # 90k * 9 / 4 - BASE_LR: 0.08 -TEST: - EVAL_PERIOD: 2500 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. 
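The solver comments in the scratch-training config above (`IMS_PER_BATCH: 64 # 4x the standard`, `MAX_ITER: 202500 # 90k * 9 / 4`, and likewise in the syncbn variant that follows) all come from one rule: scale the standard 90k-iteration, batch-16, LR-0.02 recipe linearly with schedule length and batch size. A small sketch of that arithmetic, assuming those standard-recipe numbers (the helper name is made up for this note):

```
def scaled_schedule(num_x: float, ims_per_batch: int):
    """Scale the standard 1x recipe (90k iters @ batch 16, base LR 0.02),
    keeping the LR drops 60k/20k batch-16-equivalent iterations before the end."""
    scale = ims_per_batch / 16                 # linear LR / iteration scaling
    max_iter = round(90_000 * num_x / scale)
    steps = (max_iter - round(60_000 / scale), max_iter - round(20_000 / scale))
    base_lr = 0.02 * scale
    return max_iter, steps, base_lr

# Matches the 9x configs here: (202500, (187500, 197500), 0.08)
print(scaled_schedule(9, 64))
```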
diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml deleted file mode 100755 index 60d4e423..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "mask_rcnn_R_50_FPN_3x_syncbn.yaml" -MODEL: - PIXEL_STD: [57.375, 57.12, 58.395] - WEIGHTS: "" - MASK_ON: True - RESNETS: - STRIDE_IN_1X1: False - BACKBONE: - FREEZE_AT: 0 -SOLVER: - # 9x schedule - IMS_PER_BATCH: 64 # 4x the standard - STEPS: (187500, 197500) # last 60/4==15k and last 20/4==5k - MAX_ITER: 202500 # 90k * 9 / 4 - BASE_LR: 0.08 -TEST: - EVAL_PERIOD: 2500 -# NOTE: Please refer to Rethinking ImageNet Pre-training https://arxiv.org/abs/1811.08883 -# to learn what you need for training from scratch. diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml deleted file mode 100755 index ac256e13..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/semantic_R_50_FPN_1x.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "SemanticSegmentor" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -DATASETS: - TRAIN: ("coco_2017_train_panoptic_stuffonly",) - TEST: ("coco_2017_val_panoptic_stuffonly",) -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py b/grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py deleted file mode 100755 index 0d75305b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/Misc/torchvision_imagenet_R_50.py +++ /dev/null @@ -1,150 +0,0 @@ -""" -An example config file to train an ImageNet classifier with detectron2. -Model and dataloader both come from torchvision. -This shows how to use detectron2 as a general engine for any new models and tasks. - -To run, use the following command: - -python tools/lazyconfig_train_net.py --config-file configs/Misc/torchvision_imagenet_R_50.py \ - --num-gpus 8 dataloader.train.dataset.root=/path/to/imagenet/ - -""" - - -import torch -from torch import nn -from torch.nn import functional as F -from omegaconf import OmegaConf -import torchvision -from torchvision.transforms import transforms as T -from torchvision.models.resnet import ResNet, Bottleneck -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from detectron2.solver import WarmupParamScheduler -from detectron2.solver.build import get_default_optimizer_params -from detectron2.config import LazyCall as L -from detectron2.model_zoo import get_config -from detectron2.data.samplers import TrainingSampler, InferenceSampler -from detectron2.evaluation import DatasetEvaluator -from detectron2.utils import comm - - -""" -Note: Here we put reusable code (models, evaluation, data) together with configs just as a -proof-of-concept, to easily demonstrate what's needed to train an ImageNet classifier in detectron2. -Writing code in configs offers extreme flexibility but is often not a good engineering practice. -In practice, you might want to put code in your project and import it instead.
-""" - - -def build_data_loader(dataset, batch_size, num_workers, training=True): - return torch.utils.data.DataLoader( - dataset, - sampler=(TrainingSampler if training else InferenceSampler)(len(dataset)), - batch_size=batch_size, - num_workers=num_workers, - pin_memory=True, - ) - - -class ClassificationNet(nn.Module): - def __init__(self, model: nn.Module): - super().__init__() - self.model = model - - @property - def device(self): - return list(self.model.parameters())[0].device - - def forward(self, inputs): - image, label = inputs - pred = self.model(image.to(self.device)) - if self.training: - label = label.to(self.device) - return F.cross_entropy(pred, label) - else: - return pred - - -class ClassificationAcc(DatasetEvaluator): - def reset(self): - self.corr = self.total = 0 - - def process(self, inputs, outputs): - image, label = inputs - self.corr += (outputs.argmax(dim=1).cpu() == label.cpu()).sum().item() - self.total += len(label) - - def evaluate(self): - all_corr_total = comm.all_gather([self.corr, self.total]) - corr = sum(x[0] for x in all_corr_total) - total = sum(x[1] for x in all_corr_total) - return {"accuracy": corr / total} - - -# --- End of code that could be in a project and be imported - - -dataloader = OmegaConf.create() -dataloader.train = L(build_data_loader)( - dataset=L(torchvision.datasets.ImageNet)( - root="/path/to/imagenet", - split="train", - transform=L(T.Compose)( - transforms=[ - L(T.RandomResizedCrop)(size=224), - L(T.RandomHorizontalFlip)(), - T.ToTensor(), - L(T.Normalize)(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)), - ] - ), - ), - batch_size=256 // 8, - num_workers=4, - training=True, -) - -dataloader.test = L(build_data_loader)( - dataset=L(torchvision.datasets.ImageNet)( - root="${...train.dataset.root}", - split="val", - transform=L(T.Compose)( - transforms=[ - L(T.Resize)(size=256), - L(T.CenterCrop)(size=224), - T.ToTensor(), - L(T.Normalize)(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)), - ] - ), - ), - batch_size=256 // 8, - num_workers=4, - training=False, -) - -dataloader.evaluator = L(ClassificationAcc)() - -model = L(ClassificationNet)( - model=(ResNet)(block=Bottleneck, layers=[3, 4, 6, 3], zero_init_residual=True) -) - - -optimizer = L(torch.optim.SGD)( - params=L(get_default_optimizer_params)(), - lr=0.1, - momentum=0.9, - weight_decay=1e-4, -) - -lr_multiplier = L(WarmupParamScheduler)( - scheduler=L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01, 0.001], milestones=[30, 60, 90, 100] - ), - warmup_length=1 / 100, - warmup_factor=0.1, -) - - -train = get_config("common/train.py").train -train.init_checkpoint = None -train.max_iter = 100 * 1281167 // 256 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml deleted file mode 100755 index ea2a6baa..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_C4.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 20 -INPUT: - MIN_SIZE_TRAIN: (480, 512, 544, 576, 608, 640, 672, 704, 736, 768, 800) - MIN_SIZE_TEST: 800 -DATASETS: - TRAIN: ('voc_2007_trainval', 'voc_2012_trainval') - TEST: ('voc_2007_test',) -SOLVER: - STEPS: (12000, 16000) - MAX_ITER: 18000 # 17.4 epochs - WARMUP_ITERS: 100 diff --git 
a/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml deleted file mode 100755 index e554cab1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/PascalVOC-Detection/faster_rcnn_R_50_FPN.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: False - RESNETS: - DEPTH: 50 - ROI_HEADS: - NUM_CLASSES: 20 -INPUT: - MIN_SIZE_TRAIN: (480, 512, 544, 576, 608, 640, 672, 704, 736, 768, 800) - MIN_SIZE_TEST: 800 -DATASETS: - TRAIN: ('voc_2007_trainval', 'voc_2012_trainval') - TEST: ('voc_2007_test',) -SOLVER: - STEPS: (12000, 16000) - MAX_ITER: 18000 # 17.4 epochs - WARMUP_ITERS: 100 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/common/README.md deleted file mode 100755 index 912cc299..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/README.md +++ /dev/null @@ -1,6 +0,0 @@ -This directory provides definitions for a few common models, dataloaders, schedulers, -and optimizers that are often used in training. -The definitions of these objects are provided in the form of lazy instantiation: -their arguments can be edited by users before constructing the objects. - -They can be imported, or loaded by the `model_zoo.get_config` API in users' own configs. diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py deleted file mode 100755 index 355e66a1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/coco_schedule.py +++ /dev/null @@ -1,47 +0,0 @@ -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from detectron2.config import LazyCall as L -from detectron2.solver import WarmupParamScheduler - - -def default_X_scheduler(num_X): - """ - Returns the config for a default multi-step LR scheduler such as "1x", "3x", - commonly referred to in papers, where every 1x has the total length of 1440k - training images (~12 COCO epochs). LR is decayed twice at the end of training - following the strategy defined in "Rethinking ImageNet Pretraining", Sec 4. - - Args: - num_X: a positive real number - - Returns: - DictConfig: configs that define the multiplier for LR during training - """ - # total number of iterations assuming 16 batch size, using 1440000/16=90000 - total_steps_16bs = num_X * 90000 - - if num_X <= 2: - scheduler = L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - # note that the scheduler is scale-invariant.
This is equivalent to - # milestones=[6, 8, 9] - milestones=[60000, 80000, 90000], - ) - else: - scheduler = L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - milestones=[total_steps_16bs - 60000, total_steps_16bs - 20000, total_steps_16bs], - ) - return L(WarmupParamScheduler)( - scheduler=scheduler, - warmup_length=1000 / total_steps_16bs, - warmup_method="linear", - warmup_factor=0.001, - ) - - -lr_multiplier_1x = default_X_scheduler(1) -lr_multiplier_2x = default_X_scheduler(2) -lr_multiplier_3x = default_X_scheduler(3) -lr_multiplier_6x = default_X_scheduler(6) -lr_multiplier_9x = default_X_scheduler(9) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py deleted file mode 100755 index 703c4385..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco.py +++ /dev/null @@ -1,48 +0,0 @@ -from omegaconf import OmegaConf - -import detectron2.data.transforms as T -from detectron2.config import LazyCall as L -from detectron2.data import ( - DatasetMapper, - build_detection_test_loader, - build_detection_train_loader, - get_detection_dataset_dicts, -) -from detectron2.evaluation import COCOEvaluator - -dataloader = OmegaConf.create() - -dataloader.train = L(build_detection_train_loader)( - dataset=L(get_detection_dataset_dicts)(names="coco_2017_train"), - mapper=L(DatasetMapper)( - is_train=True, - augmentations=[ - L(T.ResizeShortestEdge)( - short_edge_length=(640, 672, 704, 736, 768, 800), - sample_style="choice", - max_size=1333, - ), - L(T.RandomFlip)(horizontal=True), - ], - image_format="BGR", - use_instance_mask=True, - ), - total_batch_size=16, - num_workers=4, -) - -dataloader.test = L(build_detection_test_loader)( - dataset=L(get_detection_dataset_dicts)(names="coco_2017_val", filter_empty=False), - mapper=L(DatasetMapper)( - is_train=False, - augmentations=[ - L(T.ResizeShortestEdge)(short_edge_length=800, max_size=1333), - ], - image_format="${...train.mapper.image_format}", - ), - num_workers=4, -) - -dataloader.evaluator = L(COCOEvaluator)( - dataset_name="${..test.dataset.names}", -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py deleted file mode 100755 index b4ceb066..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_keypoint.py +++ /dev/null @@ -1,13 +0,0 @@ -from detectron2.data.detection_utils import create_keypoint_hflip_indices - -from .coco import dataloader - -dataloader.train.dataset.min_keypoints = 1 -dataloader.train.dataset.names = "keypoints_coco_2017_train" -dataloader.test.dataset.names = "keypoints_coco_2017_val" - -dataloader.train.mapper.update( - use_instance_mask=False, - use_keypoint=True, - keypoint_hflip_indices=create_keypoint_hflip_indices(dataloader.train.dataset.names), -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py deleted file mode 100755 index 5ccbc77e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/data/coco_panoptic_separated.py +++ /dev/null @@ -1,26 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.evaluation import ( - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - SemSegEvaluator, -) - -from .coco import dataloader - 
-dataloader.train.dataset.names = "coco_2017_train_panoptic_separated" -dataloader.train.dataset.filter_empty = False -dataloader.test.dataset.names = "coco_2017_val_panoptic_separated" - - -dataloader.evaluator = [ - L(COCOEvaluator)( - dataset_name="${...test.dataset.names}", - ), - L(SemSegEvaluator)( - dataset_name="${...test.dataset.names}", - ), - L(COCOPanopticEvaluator)( - dataset_name="${...test.dataset.names}", - ), -] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py deleted file mode 100755 index c7372a80..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/cascade_rcnn.py +++ /dev/null @@ -1,36 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.roi_heads import FastRCNNOutputLayers, FastRCNNConvFCHead, CascadeROIHeads - -from .mask_rcnn_fpn import model - -# arguments that don't exist for Cascade R-CNN -[model.roi_heads.pop(k) for k in ["box_head", "box_predictor", "proposal_matcher"]] - -model.roi_heads.update( - _target_=CascadeROIHeads, - box_heads=[ - L(FastRCNNConvFCHead)( - input_shape=ShapeSpec(channels=256, height=7, width=7), - conv_dims=[], - fc_dims=[1024, 1024], - ) - for k in range(3) - ], - box_predictors=[ - L(FastRCNNOutputLayers)( - input_shape=ShapeSpec(channels=1024), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(w1, w1, w2, w2)), - cls_agnostic_bbox_reg=True, - num_classes="${...num_classes}", - ) - for (w1, w2) in [(10, 5), (20, 10), (30, 15)] - ], - proposal_matchers=[ - L(Matcher)(thresholds=[th], labels=[0, 1], allow_low_quality_matches=False) - for th in [0.5, 0.6, 0.7] - ], -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py deleted file mode 100755 index 1c752029..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/fcos.py +++ /dev/null @@ -1,23 +0,0 @@ -from detectron2.modeling.meta_arch.fcos import FCOS, FCOSHead - -from .retinanet import model - -model._target_ = FCOS - -del model.anchor_generator -del model.box2box_transform -del model.anchor_matcher -del model.input_format - -# Use P5 instead of C5 to compute P6/P7 -# (Sec 2.2 of https://arxiv.org/abs/2006.09214) -model.backbone.top_block.in_feature = "p5" -model.backbone.top_block.in_channels = 256 - -# New score threshold determined based on sqrt(cls_score * centerness) -model.test_score_thresh = 0.2 -model.test_nms_thresh = 0.6 - -model.head._target_ = FCOSHead -del model.head.num_anchors -model.head.norm = "GN" diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py deleted file mode 100755 index 56b3994d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/keypoint_rcnn_fpn.py +++ /dev/null @@ -1,33 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.roi_heads import KRCNNConvDeconvUpsampleHead - -from .mask_rcnn_fpn import model - -[model.roi_heads.pop(x) for x in ["mask_in_features", "mask_pooler", "mask_head"]] - 
-model.roi_heads.update( - num_classes=1, - keypoint_in_features=["p2", "p3", "p4", "p5"], - keypoint_pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - keypoint_head=L(KRCNNConvDeconvUpsampleHead)( - input_shape=ShapeSpec(channels=256, width=14, height=14), - num_keypoints=17, - conv_dims=[512] * 8, - loss_normalizer="visible", - ), -) - -# Detectron1 uses 2000 proposals per-batch, but this option is per-image in detectron2. -# 1000 proposals per-image is found to hurt box AP. -# Therefore we increase it to 1500 per-image. -model.proposal_generator.post_nms_topk = (1500, 1000) - -# Keypoint AP degrades (though box AP improves) when using plain L1 loss -model.roi_heads.box_predictor.smooth_l1_beta = 0.5 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py deleted file mode 100755 index a3dcf8be..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_c4.py +++ /dev/null @@ -1,88 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone import BasicStem, BottleneckBlock, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.proposal_generator import RPN, StandardRPNHead -from detectron2.modeling.roi_heads import ( - FastRCNNOutputLayers, - MaskRCNNConvUpsampleHead, - Res5ROIHeads, -) - -model = L(GeneralizedRCNN)( - backbone=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res4"], - ), - proposal_generator=L(RPN)( - in_features=["res4"], - head=L(StandardRPNHead)(in_channels=1024, num_anchors=15), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[32, 64, 128, 256, 512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[16], - offset=0.0, - ), - anchor_matcher=L(Matcher)( - thresholds=[0.3, 0.7], labels=[0, -1, 1], allow_low_quality_matches=True - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - batch_size_per_image=256, - positive_fraction=0.5, - pre_nms_topk=(12000, 6000), - post_nms_topk=(2000, 1000), - nms_thresh=0.7, - ), - roi_heads=L(Res5ROIHeads)( - num_classes=80, - batch_size_per_image=512, - positive_fraction=0.25, - proposal_matcher=L(Matcher)( - thresholds=[0.5], labels=[0, 1], allow_low_quality_matches=False - ), - in_features=["res4"], - pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 16,), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - res5=L(ResNet.make_stage)( - block_class=BottleneckBlock, - num_blocks=3, - stride_per_block=[2, 1, 1], - in_channels=1024, - bottleneck_channels=512, - out_channels=2048, - norm="FrozenBN", - stride_in_1x1=True, - ), - box_predictor=L(FastRCNNOutputLayers)( - input_shape=L(ShapeSpec)(channels="${...res5.out_channels}", height=1, width=1), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(10, 10, 5, 5)), - num_classes="${..num_classes}", - ), - mask_head=L(MaskRCNNConvUpsampleHead)( - input_shape=L(ShapeSpec)( - channels="${...res5.out_channels}", 
- width="${...pooler.output_size}", - height="${...pooler.output_size}", - ), - num_classes="${..num_classes}", - conv_dims=[256], - ), - ), - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py deleted file mode 100755 index 744d5306..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/mask_rcnn_fpn.py +++ /dev/null @@ -1,93 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone.fpn import LastLevelMaxPool -from detectron2.modeling.backbone import BasicStem, FPN, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.poolers import ROIPooler -from detectron2.modeling.proposal_generator import RPN, StandardRPNHead -from detectron2.modeling.roi_heads import ( - StandardROIHeads, - FastRCNNOutputLayers, - MaskRCNNConvUpsampleHead, - FastRCNNConvFCHead, -) - -model = L(GeneralizedRCNN)( - backbone=L(FPN)( - bottom_up=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res2", "res3", "res4", "res5"], - ), - in_features="${.bottom_up.out_features}", - out_channels=256, - top_block=L(LastLevelMaxPool)(), - ), - proposal_generator=L(RPN)( - in_features=["p2", "p3", "p4", "p5", "p6"], - head=L(StandardRPNHead)(in_channels=256, num_anchors=3), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[32], [64], [128], [256], [512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - offset=0.0, - ), - anchor_matcher=L(Matcher)( - thresholds=[0.3, 0.7], labels=[0, -1, 1], allow_low_quality_matches=True - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - batch_size_per_image=256, - positive_fraction=0.5, - pre_nms_topk=(2000, 1000), - post_nms_topk=(1000, 1000), - nms_thresh=0.7, - ), - roi_heads=L(StandardROIHeads)( - num_classes=80, - batch_size_per_image=512, - positive_fraction=0.25, - proposal_matcher=L(Matcher)( - thresholds=[0.5], labels=[0, 1], allow_low_quality_matches=False - ), - box_in_features=["p2", "p3", "p4", "p5"], - box_pooler=L(ROIPooler)( - output_size=7, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - box_head=L(FastRCNNConvFCHead)( - input_shape=ShapeSpec(channels=256, height=7, width=7), - conv_dims=[], - fc_dims=[1024, 1024], - ), - box_predictor=L(FastRCNNOutputLayers)( - input_shape=ShapeSpec(channels=1024), - test_score_thresh=0.05, - box2box_transform=L(Box2BoxTransform)(weights=(10, 10, 5, 5)), - num_classes="${..num_classes}", - ), - mask_in_features=["p2", "p3", "p4", "p5"], - mask_pooler=L(ROIPooler)( - output_size=14, - scales=(1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), - sampling_ratio=0, - pooler_type="ROIAlignV2", - ), - mask_head=L(MaskRCNNConvUpsampleHead)( - input_shape=ShapeSpec(channels=256, width=14, height=14), - num_classes="${..num_classes}", - conv_dims=[256, 256, 256, 256, 256], - ), - ), - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff 
--git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py deleted file mode 100755 index 88f55d2c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/panoptic_fpn.py +++ /dev/null @@ -1,20 +0,0 @@ -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling import PanopticFPN -from detectron2.modeling.meta_arch.semantic_seg import SemSegFPNHead - -from .mask_rcnn_fpn import model - -model._target_ = PanopticFPN -model.sem_seg_head = L(SemSegFPNHead)( - input_shape={ - f: L(ShapeSpec)(stride=s, channels="${....backbone.out_channels}") - for f, s in zip(["p2", "p3", "p4", "p5"], [4, 8, 16, 32]) - }, - ignore_value=255, - num_classes=54, # COCO stuff + 1 - conv_dims=128, - common_stride=4, - loss_weight=0.5, - norm="GN", -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py deleted file mode 100755 index 83cfda4b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/models/retinanet.py +++ /dev/null @@ -1,53 +0,0 @@ -# -*- coding: utf-8 -*- - -from detectron2.config import LazyCall as L -from detectron2.layers import ShapeSpec -from detectron2.modeling.meta_arch import RetinaNet -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator -from detectron2.modeling.backbone.fpn import LastLevelP6P7 -from detectron2.modeling.backbone import BasicStem, FPN, ResNet -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.matcher import Matcher -from detectron2.modeling.meta_arch.retinanet import RetinaNetHead - -model = L(RetinaNet)( - backbone=L(FPN)( - bottom_up=L(ResNet)( - stem=L(BasicStem)(in_channels=3, out_channels=64, norm="FrozenBN"), - stages=L(ResNet.make_default_stages)( - depth=50, - stride_in_1x1=True, - norm="FrozenBN", - ), - out_features=["res3", "res4", "res5"], - ), - in_features=["res3", "res4", "res5"], - out_channels=256, - top_block=L(LastLevelP6P7)(in_channels=2048, out_channels="${..out_channels}"), - ), - head=L(RetinaNetHead)( - # Shape for each input feature map - input_shape=[ShapeSpec(channels=256)] * 5, - num_classes="${..num_classes}", - conv_dims=[256, 256, 256, 256], - prior_prob=0.01, - num_anchors=9, - ), - anchor_generator=L(DefaultAnchorGenerator)( - sizes=[[x, x * 2 ** (1.0 / 3), x * 2 ** (2.0 / 3)] for x in [32, 64, 128, 256, 512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[8, 16, 32, 64, 128], - offset=0.0, - ), - box2box_transform=L(Box2BoxTransform)(weights=[1.0, 1.0, 1.0, 1.0]), - anchor_matcher=L(Matcher)( - thresholds=[0.4, 0.5], labels=[0, -1, 1], allow_low_quality_matches=True - ), - num_classes=80, - head_in_features=["p3", "p4", "p5", "p6", "p7"], - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py deleted file mode 100755 index d39d3aaa..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/optim.py +++ /dev/null @@ -1,15 +0,0 @@ -import torch - -from detectron2.config import LazyCall as L -from detectron2.solver.build import get_default_optimizer_params - -SGD = L(torch.optim.SGD)( - 
params=L(get_default_optimizer_params)( - # params.model is meant to be set to the model object, before instantiating - # the optimizer. - weight_decay_norm=0.0 - ), - lr=0.02, - momentum=0.9, - weight_decay=1e-4, -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/common/train.py b/grit_src_deprecated/third_party/CenterNet2/configs/common/train.py deleted file mode 100755 index b6ed02bd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/common/train.py +++ /dev/null @@ -1,18 +0,0 @@ -# Common training-related configs that are designed for "tools/lazyconfig_train_net.py" -# You can use your own instead, together with your own train_net.py -train = dict( - output_dir="./output", - init_checkpoint="", - max_iter=90000, - amp=dict(enabled=False), # options for Automatic Mixed Precision - ddp=dict( # options for DistributedDataParallel - broadcast_buffers=False, - find_unused_parameters=False, - fp16_compression=False, - ), - checkpointer=dict(period=5000, max_to_keep=100), # options for PeriodicCheckpointer - eval_period=5000, - log_period=20, - device="cuda" - # ... -) diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py deleted file mode 100755 index 3740e9bb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ.py +++ /dev/null @@ -1,9 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -model.backbone.bottom_up.stages.depth = 101 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py deleted file mode 100755 index 18e5f072..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_101_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - -lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py deleted file mode 100755 index 63c54ee9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_101_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py deleted file mode 100755 index df7a2aed..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py +++ /dev/null @@ -1,72 +0,0 @@ -import detectron2.data.transforms as T -from 
detectron2.config.lazy import LazyCall as L -from detectron2.layers.batch_norm import NaiveSyncBatchNorm -from detectron2.solver import WarmupParamScheduler -from fvcore.common.param_scheduler import MultiStepParamScheduler - -from ..common.data.coco import dataloader -from ..common.models.mask_rcnn_fpn import model -from ..common.optim import SGD as optimizer -from ..common.train import train - -# train from scratch -train.init_checkpoint = "" -train.amp.enabled = True -train.ddp.fp16_compression = True -model.backbone.bottom_up.freeze_at = 0 - -# SyncBN -# fmt: off -model.backbone.bottom_up.stem.norm = \ - model.backbone.bottom_up.stages.norm = \ - model.backbone.norm = "SyncBN" - -# Using NaiveSyncBatchNorm because heads may have empty input. That is not supported by -# torch.nn.SyncBatchNorm. We can remove this after -# https://github.com/pytorch/pytorch/issues/36530 is fixed. -model.roi_heads.box_head.conv_norm = \ - model.roi_heads.mask_head.conv_norm = lambda c: NaiveSyncBatchNorm(c, - stats_mode="N") -# fmt: on - -# 2conv in RPN: -# https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/modeling/architecture/heads.py#L95-L97 # noqa: E501, B950 -model.proposal_generator.head.conv_dims = [-1, -1] - -# 4conv1fc box head -model.roi_heads.box_head.conv_dims = [256, 256, 256, 256] -model.roi_heads.box_head.fc_dims = [1024] - -# resize_and_crop_image in: -# https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/utils/input_utils.py#L127 # noqa: E501, B950 -image_size = 1024 -dataloader.train.mapper.augmentations = [ - L(T.ResizeScale)( - min_scale=0.1, max_scale=2.0, target_height=image_size, target_width=image_size - ), - L(T.FixedSizeCrop)(crop_size=(image_size, image_size)), - L(T.RandomFlip)(horizontal=True), -] - -# recompute boxes due to cropping -dataloader.train.mapper.recompute_boxes = True - -# larger batch-size. -dataloader.train.total_batch_size = 64 - -# Equivalent to 100 epochs. 
-# 100 ep = 184375 iters * 64 images/iter / 118000 images/ep -train.max_iter = 184375 - -lr_multiplier = L(WarmupParamScheduler)( - scheduler=L(MultiStepParamScheduler)( - values=[1.0, 0.1, 0.01], - milestones=[163889, 177546], - num_updates=train.max_iter, - ), - warmup_length=500 / train.max_iter, - warmup_factor=0.067, -) - -optimizer.lr = 0.1 -optimizer.weight_decay = 4e-5 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py deleted file mode 100755 index 2a7c376d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - -lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py deleted file mode 100755 index 97586b8f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py deleted file mode 100755 index 2ca1ede2..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_R_50_FPN_50ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter //= 2 # 100ep -> 50ep - -lr_multiplier.scheduler.milestones = [ - milestone // 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py deleted file mode 100755 index ef0b6d16..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py +++ /dev/null @@ -1,29 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) -from detectron2.config import LazyCall as L -from detectron2.modeling.backbone import RegNet -from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock - -# Config source: -# https://github.com/facebookresearch/detectron2/blob/main/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py # noqa -model.backbone.bottom_up = L(RegNet)( - stem_class=SimpleStem, - stem_width=32, - block_class=ResBottleneckBlock, - depth=23, - w_a=38.65, - w_0=96, - w_m=2.43, - group_width=40, 
- norm="SyncBN", - out_features=["s1", "s2", "s3", "s4"], -) -model.pixel_std = [57.375, 57.120, 58.395] - -# RegNets benefit from enabling cudnn benchmark mode -train.cudnn_benchmark = True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py deleted file mode 100755 index 731320e7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - -lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py deleted file mode 100755 index 8f369a2a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py deleted file mode 100755 index ba2c3274..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py +++ /dev/null @@ -1,30 +0,0 @@ -from .mask_rcnn_R_50_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) -from detectron2.config import LazyCall as L -from detectron2.modeling.backbone import RegNet -from detectron2.modeling.backbone.regnet import SimpleStem, ResBottleneckBlock - -# Config source: -# https://github.com/facebookresearch/detectron2/blob/main/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py # noqa -model.backbone.bottom_up = L(RegNet)( - stem_class=SimpleStem, - stem_width=32, - block_class=ResBottleneckBlock, - depth=22, - w_a=31.41, - w_0=96, - w_m=2.24, - group_width=64, - se_ratio=0.25, - norm="SyncBN", - out_features=["s1", "s2", "s3", "s4"], -) -model.pixel_std = [57.375, 57.120, 58.395] - -# RegNets benefit from enabling cudnn benchmark mode -train.cudnn_benchmark = True diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py deleted file mode 100755 index b867cc86..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 2 # 100ep -> 200ep - 
-lr_multiplier.scheduler.milestones = [ - milestone * 2 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py b/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py deleted file mode 100755 index 7b86ea8c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ.py +++ /dev/null @@ -1,14 +0,0 @@ -from .mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ import ( - dataloader, - lr_multiplier, - model, - optimizer, - train, -) - -train.max_iter *= 4 # 100ep -> 400ep - -lr_multiplier.scheduler.milestones = [ - milestone * 4 for milestone in lr_multiplier.scheduler.milestones -] -lr_multiplier.scheduler.num_updates = train.max_iter diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md deleted file mode 100755 index 4e6c82ef..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/README.md +++ /dev/null @@ -1,8 +0,0 @@ -These are quick configs for performance or accuracy regression tracking purposes. - -* `*instant_test.yaml`: can train on 2 GPUs. They are used to test whether the training can - successfully finish. They are not expected to produce reasonable training results. -* `*inference_acc_test.yaml`: They should be run using `--eval-only`. They run inference using pre-trained models and verify - the results are as expected. -* `*training_acc_test.yaml`: They should be trained on 8 GPUs. They finish in about an hour and verify the training accuracy - is within the normal range. 
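For orientation, here is a minimal sketch of how one of these quick-schedule YAMLs could be loaded and inspected through detectron2's standard YAML config API (`get_cfg` plus `merge_from_file`); the relative path is an assumption for illustration, and the printed values are the ones visible in the instant-test config deleted below:

```python
from detectron2.config import get_cfg

cfg = get_cfg()
# merge_from_file resolves the _BASE_ key against the sibling base configs.
cfg.merge_from_file("configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml")

# Instant tests run only 40 iterations on a 100-image validation split.
print(cfg.SOLVER.MAX_ITER)   # 40
print(cfg.DATASETS.TRAIN)    # ('coco_2017_val_100',)
```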
diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index fc5a4116..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml" -MODEL: - WEIGHTS: "detectron2://Misc/cascade_mask_rcnn_R_50_FPN_3x/144998488/model_final_480dd8.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 50.18, 0.02], ["segm", "AP", 43.87, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml deleted file mode 100755 index e41a0fe7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/cascade_mask_rcnn_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,11 +0,0 @@ -_BASE_: "../Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml" -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index a2f37e5e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-Detection/fast_rcnn_R_50_FPN_1x/137635226/model_final_e5f7ce.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 45.70, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml deleted file mode 100755 index 52fc0ec0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/fast_rcnn_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,15 +0,0 @@ -_BASE_: "../COCO-Detection/fast_rcnn_R_50_FPN_1x.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" -DATASETS: - TRAIN: ("coco_2017_val_100",) - PROPOSAL_FILES_TRAIN: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", ) - TEST: ("coco_2017_val_100",) - PROPOSAL_FILES_TEST: ("detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/coco_2017_val_box_proposals_ee0dad.pkl", ) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index 14cf2aa8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: 
"../COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x/137849621/model_final_a6e10b.pkl" -DATASETS: - TEST: ("keypoints_coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 52.47, 0.02], ["keypoints", "AP", 67.36, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml deleted file mode 100755 index 3dd209f6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,16 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - KEYPOINT_ON: True - ROI_HEADS: - NUM_CLASSES: 1 -DATASETS: - TRAIN: ("keypoints_coco_2017_val_100",) - TEST: ("keypoints_coco_2017_val_100",) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml deleted file mode 100755 index 4b92392f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_normalized_training_acc_test.yaml +++ /dev/null @@ -1,30 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - KEYPOINT_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - BATCH_SIZE_PER_IMAGE: 256 - NUM_CLASSES: 1 - ROI_KEYPOINT_HEAD: - POOLER_RESOLUTION: 14 - POOLER_SAMPLING_RATIO: 2 - NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS: False - LOSS_WEIGHT: 4.0 - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 1.0 # Keypoint AP degrades when using plain L1 loss - RPN: - SMOOTH_L1_BETA: 0.2 # Keypoint AP degrades when using plain L1 loss -DATASETS: - TRAIN: ("keypoints_coco_2017_val",) - TEST: ("keypoints_coco_2017_val",) -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -SOLVER: - WARMUP_FACTOR: 0.33333333 - WARMUP_ITERS: 100 - STEPS: (5500, 5800) - MAX_ITER: 6000 -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 55.35, 1.0], ["keypoints", "AP", 76.91, 1.0]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml deleted file mode 100755 index 9bd96287..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/keypoint_rcnn_R_50_FPN_training_acc_test.yaml +++ /dev/null @@ -1,28 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - KEYPOINT_ON: True - RESNETS: - DEPTH: 50 - ROI_HEADS: - BATCH_SIZE_PER_IMAGE: 256 - NUM_CLASSES: 1 - ROI_KEYPOINT_HEAD: - POOLER_RESOLUTION: 14 - POOLER_SAMPLING_RATIO: 2 - ROI_BOX_HEAD: - SMOOTH_L1_BETA: 1.0 # Keypoint AP degrades when using plain L1 loss - RPN: - SMOOTH_L1_BETA: 0.2 # Keypoint AP degrades when using plain L1 loss -DATASETS: - TRAIN: ("keypoints_coco_2017_val",) - TEST: ("keypoints_coco_2017_val",) -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -SOLVER: - WARMUP_FACTOR: 0.33333333 - WARMUP_ITERS: 100 - STEPS: (5500, 5800) - MAX_ITER: 6000 -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 53.5, 1.0], 
["keypoints", "AP", 72.4, 1.0]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml deleted file mode 100755 index ab6e6981..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_GCV_instant_test.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - BASE_LR: 0.001 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 - CLIP_GRADIENTS: - ENABLED: True - CLIP_TYPE: "value" - CLIP_VALUE: 1.0 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml deleted file mode 100755 index b2d5b7ff..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x/137849525/model_final_4ce675.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 47.37, 0.02], ["segm", "AP", 40.99, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml deleted file mode 100755 index 6c4f1214..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_instant_test.yaml +++ /dev/null @@ -1,14 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - BASE_LR: 0.001 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml deleted file mode 100755 index f68dd8f9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_C4_training_acc_test.yaml +++ /dev/null @@ -1,22 +0,0 @@ -_BASE_: "../Base-RCNN-C4.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - ROI_HEADS: - BATCH_SIZE_PER_IMAGE: 256 - MASK_ON: True -DATASETS: - TRAIN: ("coco_2017_val",) - TEST: ("coco_2017_val",) -INPUT: - MIN_SIZE_TRAIN: (600,) - MAX_SIZE_TRAIN: 1000 - MIN_SIZE_TEST: 800 - MAX_SIZE_TEST: 1000 -SOLVER: - IMS_PER_BATCH: 8 # base uses 16 - WARMUP_FACTOR: 0.33333 - WARMUP_ITERS: 100 - STEPS: (11000, 11600) - MAX_ITER: 12000 -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 41.88, 0.7], ["segm", "AP", 33.79, 0.5]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml deleted file mode 100755 index e3ce6cf9..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_DC5_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x/137849551/model_final_84107b.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 47.44, 0.02], ["segm", "AP", 42.94, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index e5454bfd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,10 +0,0 @@ -_BASE_: "../COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 47.34, 0.02], ["segm", "AP", 42.67, 0.02], ["bbox_TTA", "AP", 49.11, 0.02], ["segm_TTA", "AP", 45.04, 0.02]] - AUG: - ENABLED: True - MIN_SIZES: (700, 800) # to save some time diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml deleted file mode 100755 index 6dbfcde0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,14 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml deleted file mode 100755 index 52f78762..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_pred_boxes_training_acc_test.yaml +++ /dev/null @@ -1,6 +0,0 @@ -_BASE_: "./mask_rcnn_R_50_FPN_training_acc_test.yaml" -MODEL: - ROI_BOX_HEAD: - TRAIN_ON_PRED_BOXES: True -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 42.6, 1.0], ["segm", "AP", 35.8, 0.8]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml deleted file mode 100755 index aadae4ce..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/mask_rcnn_R_50_FPN_training_acc_test.yaml +++ /dev/null @@ -1,21 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - ROI_HEADS: - BATCH_SIZE_PER_IMAGE: 256 - MASK_ON: True -DATASETS: - TRAIN: ("coco_2017_val",) - TEST: ("coco_2017_val",) -INPUT: - MIN_SIZE_TRAIN: (600,) - MAX_SIZE_TRAIN: 1000 - MIN_SIZE_TEST: 800 - MAX_SIZE_TEST: 1000 -SOLVER: - WARMUP_FACTOR: 0.3333333 - WARMUP_ITERS: 100 - STEPS: (5500, 5800) - MAX_ITER: 6000 -TEST: - 
EXPECTED_RESULTS: [["bbox", "AP", 42.5, 1.0], ["segm", "AP", 35.8, 0.8]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml deleted file mode 100755 index 70874e3a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-PanopticSegmentation/panoptic_fpn_R_50_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-PanopticSegmentation/panoptic_fpn_R_50_3x/139514569/model_final_c10459.pkl" -DATASETS: - TEST: ("coco_2017_val_100_panoptic_separated",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 46.47, 0.02], ["segm", "AP", 43.39, 0.02], ["sem_seg", "mIoU", 42.55, 0.02], ["panoptic_seg", "PQ", 38.99, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml deleted file mode 100755 index 7cdee7bf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_instant_test.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "PanopticFPN" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - SEM_SEG_HEAD: - LOSS_WEIGHT: 0.5 -DATASETS: - TRAIN: ("coco_2017_val_100_panoptic_separated",) - TEST: ("coco_2017_val_100_panoptic_separated",) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 1 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml deleted file mode 100755 index f3bbf301..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/panoptic_fpn_R_50_training_acc_test.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "PanopticFPN" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - MASK_ON: True - RESNETS: - DEPTH: 50 - SEM_SEG_HEAD: - LOSS_WEIGHT: 0.5 -DATASETS: - TRAIN: ("coco_2017_val_panoptic_separated",) - TEST: ("coco_2017_val_panoptic_separated",) -SOLVER: - BASE_LR: 0.01 - WARMUP_FACTOR: 0.001 - WARMUP_ITERS: 500 - STEPS: (5500,) - MAX_ITER: 7000 -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 46.70, 1.1], ["segm", "AP", 39.0, 0.7], ["sem_seg", "mIoU", 64.73, 1.3], ["panoptic_seg", "PQ", 48.13, 0.8]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index cb666c1a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-Detection/retinanet_R_50_FPN_3x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-Detection/retinanet_R_50_FPN_3x/190397829/model_final_5bd44e.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["bbox", "AP", 44.45, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml 
b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml deleted file mode 100755 index 8d95c1f6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/retinanet_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "../COCO-Detection/retinanet_R_50_FPN_1x.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index c7c3f908..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,7 +0,0 @@ -_BASE_: "../COCO-Detection/rpn_R_50_FPN_1x.yaml" -MODEL: - WEIGHTS: "detectron2://COCO-Detection/rpn_R_50_FPN_1x/137258492/model_final_02ce48.pkl" -DATASETS: - TEST: ("coco_2017_val_100",) -TEST: - EXPECTED_RESULTS: [["box_proposals", "AR@1000", 58.16, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml deleted file mode 100755 index 402d4324..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/rpn_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "../COCO-Detection/rpn_R_50_FPN_1x.yaml" -MODEL: - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" -DATASETS: - TRAIN: ("coco_2017_val_100",) - TEST: ("coco_2017_val_100",) -SOLVER: - STEPS: (30,) - MAX_ITER: 40 - BASE_LR: 0.005 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml deleted file mode 100755 index bca74987..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_inference_acc_test.yaml +++ /dev/null @@ -1,10 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "SemanticSegmentor" - WEIGHTS: "detectron2://semantic_R_50_FPN_1x/111802073/model_final_c18079783c55a94968edc28b7101c5f0.pkl" - RESNETS: - DEPTH: 50 -DATASETS: - TEST: ("coco_2017_val_100_panoptic_stuffonly",) -TEST: - EXPECTED_RESULTS: [["sem_seg", "mIoU", 39.53, 0.02], ["sem_seg", "mACC", 51.50, 0.02]] diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml deleted file mode 100755 index 14ab606f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_instant_test.yaml +++ /dev/null @@ -1,18 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "SemanticSegmentor" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -DATASETS: - TRAIN: ("coco_2017_val_100_panoptic_stuffonly",) - TEST: ("coco_2017_val_100_panoptic_stuffonly",) -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -SOLVER: - BASE_LR: 0.005 - STEPS: (30,) - 
MAX_ITER: 40 - IMS_PER_BATCH: 4 -DATALOADER: - NUM_WORKERS: 2 diff --git a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml b/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml deleted file mode 100755 index 1f78d775..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/configs/quick_schedules/semantic_R_50_FPN_training_acc_test.yaml +++ /dev/null @@ -1,20 +0,0 @@ -_BASE_: "../Base-RCNN-FPN.yaml" -MODEL: - META_ARCHITECTURE: "SemanticSegmentor" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 -DATASETS: - TRAIN: ("coco_2017_val_panoptic_stuffonly",) - TEST: ("coco_2017_val_panoptic_stuffonly",) -SOLVER: - BASE_LR: 0.01 - WARMUP_FACTOR: 0.001 - WARMUP_ITERS: 300 - STEPS: (5500,) - MAX_ITER: 7000 -TEST: - EXPECTED_RESULTS: [["sem_seg", "mIoU", 76.51, 1.0], ["sem_seg", "mACC", 83.25, 1.0]] -INPUT: - # no scale augmentation - MIN_SIZE_TRAIN: (800, ) diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/README.md b/grit_src_deprecated/third_party/CenterNet2/datasets/README.md deleted file mode 100755 index 0eb44cc3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/README.md +++ /dev/null @@ -1,140 +0,0 @@ -# Use Builtin Datasets - -A dataset can be used by accessing [DatasetCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.DatasetCatalog) -for its data, or [MetadataCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.MetadataCatalog) for its metadata (class names, etc). -This document explains how to set up the builtin datasets so they can be used by the above APIs. -[Use Custom Datasets](https://detectron2.readthedocs.io/tutorials/datasets.html) gives a deeper dive on how to use `DatasetCatalog` and `MetadataCatalog`, -and how to add new datasets to them. - -Detectron2 has builtin support for a few datasets. -The datasets are assumed to exist in a directory specified by the environment variable -`DETECTRON2_DATASETS`. -Under this directory, detectron2 will look for datasets in the structure described below, if needed. -``` -$DETECTRON2_DATASETS/ - coco/ - lvis/ - cityscapes/ - VOC20{07,12}/ -``` - -You can set the location for builtin datasets by `export DETECTRON2_DATASETS=/path/to/datasets`. -If left unset, the default is `./datasets` relative to your current working directory. - -The [model zoo](https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md) -contains configs and models that use these builtin datasets. - -## Expected dataset structure for [COCO instance/keypoint detection](https://cocodataset.org/#download): - -``` -coco/ - annotations/ - instances_{train,val}2017.json - person_keypoints_{train,val}2017.json - {train,val}2017/ - # image files that are mentioned in the corresponding json -``` - -You can use the 2014 version of the dataset as well. - -Some of the builtin tests (`dev/run_*_tests.sh`) use a tiny version of the COCO dataset, -which you can download with `./datasets/prepare_for_tests.sh`. 
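Once the COCO files are laid out as above, the builtin splits can be queried through the `DatasetCatalog`/`MetadataCatalog` APIs mentioned at the top of this document; a minimal sketch, using the builtin `coco_2017_val` split that detectron2 registers:

```python
from detectron2.data import DatasetCatalog, MetadataCatalog

# A list of dicts in detectron2's standard dataset format, one per image.
dataset_dicts = DatasetCatalog.get("coco_2017_val")
print(len(dataset_dicts), dataset_dicts[0]["file_name"])

# Metadata (class names, etc.) for the same split.
metadata = MetadataCatalog.get("coco_2017_val")
print(metadata.thing_classes[:5])
```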
- -## Expected dataset structure for PanopticFPN: - -Extract panoptic annotations from [COCO website](https://cocodataset.org/#download) -into the following structure: -``` -coco/ - annotations/ - panoptic_{train,val}2017.json - panoptic_{train,val}2017/ # png annotations - panoptic_stuff_{train,val}2017/ # generated by the script mentioned below -``` - -Install panopticapi by: -``` -pip install git+https://github.com/cocodataset/panopticapi.git -``` -Then run `python datasets/prepare_panoptic_fpn.py` to extract semantic annotations from panoptic annotations. - -## Expected dataset structure for [LVIS instance segmentation](https://www.lvisdataset.org/dataset): -``` -coco/ - {train,val,test}2017/ -lvis/ - lvis_v0.5_{train,val}.json - lvis_v0.5_image_info_test.json - lvis_v1_{train,val}.json - lvis_v1_image_info_test{,_challenge}.json -``` - -Install lvis-api by: -``` -pip install git+https://github.com/lvis-dataset/lvis-api.git -``` - -To evaluate models trained on the COCO dataset using LVIS annotations, -run `python datasets/prepare_cocofied_lvis.py` to prepare "cocofied" LVIS annotations. - -## Expected dataset structure for [cityscapes](https://www.cityscapes-dataset.com/downloads/): -``` -cityscapes/ - gtFine/ - train/ - aachen/ - color.png, instanceIds.png, labelIds.png, polygons.json, - labelTrainIds.png - ... - val/ - test/ - # below are generated Cityscapes panoptic annotations - cityscapes_panoptic_train.json - cityscapes_panoptic_train/ - cityscapes_panoptic_val.json - cityscapes_panoptic_val/ - cityscapes_panoptic_test.json - cityscapes_panoptic_test/ - leftImg8bit/ - train/ - val/ - test/ -``` -Install cityscapes scripts by: -``` -pip install git+https://github.com/mcordts/cityscapesScripts.git -``` - -Note: to create labelTrainIds.png, first prepare the above structure, then run cityscapesScripts with: -``` -CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createTrainIdLabelImgs.py -``` -These files are not needed for instance segmentation. - -Note: to generate the Cityscapes panoptic dataset, run cityscapesScripts with: -``` -CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createPanopticImgs.py -``` -These files are not needed for semantic and instance segmentation. - -## Expected dataset structure for [Pascal VOC](http://host.robots.ox.ac.uk/pascal/VOC/index.html): -``` -VOC20{07,12}/ - Annotations/ - ImageSets/ - Main/ - trainval.txt - test.txt - # train.txt or val.txt, if you use these splits - JPEGImages/ -``` - -## Expected dataset structure for [ADE20k Scene Parsing](http://sceneparsing.csail.mit.edu/): -``` -ADEChallengeData2016/ - annotations/ - annotations_detectron2/ - images/ - objectInfo150.txt -``` -The directory `annotations_detectron2` is generated by running `python datasets/prepare_ade20k_sem_seg.py`. 
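The `lvis_v1_train_cat_info.json` file deleted below stores LVIS v1 per-category metadata as a single-line JSON array; each entry carries fields such as `name`, `def`, `synonyms`, `instance_count`, `image_count`, `id`, and `frequency` ("r"/"c"/"f" for rare/common/frequent), as visible in the hunk. A minimal sketch of how such a file could be inspected, assuming the path of the file being removed:

```python
import json
from collections import Counter

# Path of the category-info file as it existed before this deletion.
with open("datasets/lvis/lvis_v1_train_cat_info.json") as f:
    categories = json.load(f)

# Tally categories by LVIS frequency bucket: rare / common / frequent.
print(Counter(c["frequency"] for c in categories))

# Category seen in the most training images.
top = max(categories, key=lambda c: c["image_count"])
print(top["name"], top["image_count"])
```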
diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json b/grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json deleted file mode 100755 index 95fef092..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/lvis/lvis_v1_train_cat_info.json +++ /dev/null @@ -1 +0,0 @@ -[{"name": "aerosol_can", "instance_count": 109, "def": "a dispenser that holds a substance under pressure", "synonyms": ["aerosol_can", "spray_can"], "image_count": 64, "id": 1, "frequency": "c", "synset": "aerosol.n.02"}, {"name": "air_conditioner", "instance_count": 1081, "def": "a machine that keeps air cool and dry", "synonyms": ["air_conditioner"], "image_count": 364, "id": 2, "frequency": "f", "synset": "air_conditioner.n.01"}, {"name": "airplane", "instance_count": 3720, "def": "an aircraft that has a fixed wing and is powered by propellers or jets", "synonyms": ["airplane", "aeroplane"], "image_count": 1911, "id": 3, "frequency": "f", "synset": "airplane.n.01"}, {"name": "alarm_clock", "instance_count": 158, "def": "a clock that wakes a sleeper at some preset time", "synonyms": ["alarm_clock"], "image_count": 149, "id": 4, "frequency": "f", "synset": "alarm_clock.n.01"}, {"name": "alcohol", "instance_count": 207, "def": "a liquor or brew containing alcohol as the active agent", "synonyms": ["alcohol", "alcoholic_beverage"], "image_count": 29, "id": 5, "frequency": "c", "synset": "alcohol.n.01"}, {"name": "alligator", "instance_count": 39, "def": "amphibious reptiles related to crocodiles but with shorter broader snouts", "synonyms": ["alligator", "gator"], "image_count": 26, "id": 6, "frequency": "c", "synset": "alligator.n.02"}, {"name": "almond", "instance_count": 1700, "def": "oval-shaped edible seed of the almond tree", "synonyms": ["almond"], "image_count": 59, "id": 7, "frequency": "c", "synset": "almond.n.02"}, {"name": "ambulance", "instance_count": 25, "def": "a vehicle that takes people to and from hospitals", "synonyms": ["ambulance"], "image_count": 22, "id": 8, "frequency": "c", "synset": "ambulance.n.01"}, {"name": "amplifier", "instance_count": 16, "def": "electronic equipment that increases strength of signals", "synonyms": ["amplifier"], "image_count": 12, "id": 9, "frequency": "c", "synset": "amplifier.n.01"}, {"name": "anklet", "instance_count": 39, "def": "an ornament worn around the ankle", "synonyms": ["anklet", "ankle_bracelet"], "image_count": 28, "id": 10, "frequency": "c", "synset": "anklet.n.03"}, {"name": "antenna", "instance_count": 1018, "def": "an electrical device that sends or receives radio or television signals", "synonyms": ["antenna", "aerial", "transmitting_aerial"], "image_count": 505, "id": 11, "frequency": "f", "synset": "antenna.n.01"}, {"name": "apple", "instance_count": 17451, "def": "fruit with red or yellow or green skin and sweet to tart crisp whitish flesh", "synonyms": ["apple"], "image_count": 1207, "id": 12, "frequency": "f", "synset": "apple.n.01"}, {"name": "applesauce", "instance_count": 7, "def": "puree of stewed apples usually sweetened and spiced", "synonyms": ["applesauce"], "image_count": 4, "id": 13, "frequency": "r", "synset": "applesauce.n.01"}, {"name": "apricot", "instance_count": 62, "def": "downy yellow to rosy-colored fruit resembling a small peach", "synonyms": ["apricot"], "image_count": 10, "id": 14, "frequency": "r", "synset": "apricot.n.02"}, {"name": "apron", "instance_count": 881, "def": "a garment of cloth that is tied about the waist and worn to protect 
clothing", "synonyms": ["apron"], "image_count": 500, "id": 15, "frequency": "f", "synset": "apron.n.01"}, {"name": "aquarium", "instance_count": 36, "def": "a tank/pool/bowl filled with water for keeping live fish and underwater animals", "synonyms": ["aquarium", "fish_tank"], "image_count": 33, "id": 16, "frequency": "c", "synset": "aquarium.n.01"}, {"name": "arctic_(type_of_shoe)", "instance_count": 8, "def": "a waterproof overshoe that protects shoes from water or snow", "synonyms": ["arctic_(type_of_shoe)", "galosh", "golosh", "rubber_(type_of_shoe)", "gumshoe"], "image_count": 3, "id": 17, "frequency": "r", "synset": "arctic.n.02"}, {"name": "armband", "instance_count": 85, "def": "a band worn around the upper arm", "synonyms": ["armband"], "image_count": 44, "id": 18, "frequency": "c", "synset": "armband.n.02"}, {"name": "armchair", "instance_count": 1112, "def": "chair with a support on each side for arms", "synonyms": ["armchair"], "image_count": 561, "id": 19, "frequency": "f", "synset": "armchair.n.01"}, {"name": "armoire", "instance_count": 11, "def": "a large wardrobe or cabinet", "synonyms": ["armoire"], "image_count": 8, "id": 20, "frequency": "r", "synset": "armoire.n.01"}, {"name": "armor", "instance_count": 23, "def": "protective covering made of metal and used in combat", "synonyms": ["armor", "armour"], "image_count": 9, "id": 21, "frequency": "r", "synset": "armor.n.01"}, {"name": "artichoke", "instance_count": 293, "def": "a thistlelike flower head with edible fleshy leaves and heart", "synonyms": ["artichoke"], "image_count": 33, "id": 22, "frequency": "c", "synset": "artichoke.n.02"}, {"name": "trash_can", "instance_count": 2722, "def": "a bin that holds rubbish until it is collected", "synonyms": ["trash_can", "garbage_can", "wastebin", "dustbin", "trash_barrel", "trash_bin"], "image_count": 1883, "id": 23, "frequency": "f", "synset": "ashcan.n.01"}, {"name": "ashtray", "instance_count": 136, "def": "a receptacle for the ash from smokers' cigars or cigarettes", "synonyms": ["ashtray"], "image_count": 98, "id": 24, "frequency": "c", "synset": "ashtray.n.01"}, {"name": "asparagus", "instance_count": 969, "def": "edible young shoots of the asparagus plant", "synonyms": ["asparagus"], "image_count": 70, "id": 25, "frequency": "c", "synset": "asparagus.n.02"}, {"name": "atomizer", "instance_count": 67, "def": "a dispenser that turns a liquid (such as perfume) into a fine mist", "synonyms": ["atomizer", "atomiser", "spray", "sprayer", "nebulizer", "nebuliser"], "image_count": 46, "id": 26, "frequency": "c", "synset": "atomizer.n.01"}, {"name": "avocado", "instance_count": 1048, "def": "a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed", "synonyms": ["avocado"], "image_count": 117, "id": 27, "frequency": "f", "synset": "avocado.n.01"}, {"name": "award", "instance_count": 163, "def": "a tangible symbol signifying approval or distinction", "synonyms": ["award", "accolade"], "image_count": 41, "id": 28, "frequency": "c", "synset": "award.n.02"}, {"name": "awning", "instance_count": 4270, "def": "a canopy made of canvas to shelter people or things from rain or sun", "synonyms": ["awning"], "image_count": 1395, "id": 29, "frequency": "f", "synset": "awning.n.01"}, {"name": "ax", "instance_count": 8, "def": "an edge tool with a heavy bladed head mounted across a handle", "synonyms": ["ax", "axe"], "image_count": 7, "id": 30, "frequency": "r", "synset": "ax.n.01"}, {"name": "baboon", "instance_count": 3, "def": "large 
terrestrial monkeys having doglike muzzles", "synonyms": ["baboon"], "image_count": 1, "id": 31, "frequency": "r", "synset": "baboon.n.01"}, {"name": "baby_buggy", "instance_count": 447, "def": "a small vehicle with four wheels in which a baby or child is pushed around", "synonyms": ["baby_buggy", "baby_carriage", "perambulator", "pram", "stroller"], "image_count": 314, "id": 32, "frequency": "f", "synset": "baby_buggy.n.01"}, {"name": "basketball_backboard", "instance_count": 42, "def": "a raised vertical board with basket attached; used to play basketball", "synonyms": ["basketball_backboard"], "image_count": 31, "id": 33, "frequency": "c", "synset": "backboard.n.01"}, {"name": "backpack", "instance_count": 3907, "def": "a bag carried by a strap on your back or shoulder", "synonyms": ["backpack", "knapsack", "packsack", "rucksack", "haversack"], "image_count": 1905, "id": 34, "frequency": "f", "synset": "backpack.n.01"}, {"name": "handbag", "instance_count": 3947, "def": "a container used for carrying money and small personal items or accessories", "synonyms": ["handbag", "purse", "pocketbook"], "image_count": 1859, "id": 35, "frequency": "f", "synset": "bag.n.04"}, {"name": "suitcase", "instance_count": 8537, "def": "cases used to carry belongings when traveling", "synonyms": ["suitcase", "baggage", "luggage"], "image_count": 1623, "id": 36, "frequency": "f", "synset": "bag.n.06"}, {"name": "bagel", "instance_count": 372, "def": "glazed yeast-raised doughnut-shaped roll with hard crust", "synonyms": ["bagel", "beigel"], "image_count": 47, "id": 37, "frequency": "c", "synset": "bagel.n.01"}, {"name": "bagpipe", "instance_count": 6, "def": "a tubular wind instrument; the player blows air into a bag and squeezes it out", "synonyms": ["bagpipe"], "image_count": 3, "id": 38, "frequency": "r", "synset": "bagpipe.n.01"}, {"name": "baguet", "instance_count": 9, "def": "narrow French stick loaf", "synonyms": ["baguet", "baguette"], "image_count": 3, "id": 39, "frequency": "r", "synset": "baguet.n.01"}, {"name": "bait", "instance_count": 1, "def": "something used to lure fish or other animals into danger so they can be trapped or killed", "synonyms": ["bait", "lure"], "image_count": 1, "id": 40, "frequency": "r", "synset": "bait.n.02"}, {"name": "ball", "instance_count": 755, "def": "a spherical object used as a plaything", "synonyms": ["ball"], "image_count": 305, "id": 41, "frequency": "f", "synset": "ball.n.06"}, {"name": "ballet_skirt", "instance_count": 12, "def": "very short skirt worn by ballerinas", "synonyms": ["ballet_skirt", "tutu"], "image_count": 6, "id": 42, "frequency": "r", "synset": "ballet_skirt.n.01"}, {"name": "balloon", "instance_count": 1556, "def": "large tough nonrigid bag filled with gas or heated air", "synonyms": ["balloon"], "image_count": 210, "id": 43, "frequency": "f", "synset": "balloon.n.01"}, {"name": "bamboo", "instance_count": 243, "def": "woody tropical grass having hollow woody stems", "synonyms": ["bamboo"], "image_count": 36, "id": 44, "frequency": "c", "synset": "bamboo.n.02"}, {"name": "banana", "instance_count": 50552, "def": "elongated crescent-shaped yellow fruit with soft sweet flesh", "synonyms": ["banana"], "image_count": 1787, "id": 45, "frequency": "f", "synset": "banana.n.02"}, {"name": "Band_Aid", "instance_count": 19, "def": "trade name for an adhesive bandage to cover small cuts or blisters", "synonyms": ["Band_Aid"], "image_count": 17, "id": 46, "frequency": "c", "synset": "band_aid.n.01"}, {"name": "bandage", "instance_count": 92, "def": "a 
piece of soft material that covers and protects an injured part of the body", "synonyms": ["bandage"], "image_count": 51, "id": 47, "frequency": "c", "synset": "bandage.n.01"}, {"name": "bandanna", "instance_count": 219, "def": "large and brightly colored handkerchief; often used as a neckerchief", "synonyms": ["bandanna", "bandana"], "image_count": 138, "id": 48, "frequency": "f", "synset": "bandanna.n.01"}, {"name": "banjo", "instance_count": 3, "def": "a stringed instrument of the guitar family with a long neck and circular body", "synonyms": ["banjo"], "image_count": 3, "id": 49, "frequency": "r", "synset": "banjo.n.01"}, {"name": "banner", "instance_count": 5907, "def": "long strip of cloth or paper used for decoration or advertising", "synonyms": ["banner", "streamer"], "image_count": 1470, "id": 50, "frequency": "f", "synset": "banner.n.01"}, {"name": "barbell", "instance_count": 4, "def": "a bar to which heavy discs are attached at each end; used in weightlifting", "synonyms": ["barbell"], "image_count": 3, "id": 51, "frequency": "r", "synset": "barbell.n.01"}, {"name": "barge", "instance_count": 3, "def": "a flatbottom boat for carrying heavy loads (especially on canals)", "synonyms": ["barge"], "image_count": 2, "id": 52, "frequency": "r", "synset": "barge.n.01"}, {"name": "barrel", "instance_count": 707, "def": "a cylindrical container that holds liquids", "synonyms": ["barrel", "cask"], "image_count": 186, "id": 53, "frequency": "f", "synset": "barrel.n.02"}, {"name": "barrette", "instance_count": 119, "def": "a pin for holding women's hair in place", "synonyms": ["barrette"], "image_count": 76, "id": 54, "frequency": "c", "synset": "barrette.n.01"}, {"name": "barrow", "instance_count": 30, "def": "a cart for carrying small loads; has handles and one or more wheels", "synonyms": ["barrow", "garden_cart", "lawn_cart", "wheelbarrow"], "image_count": 26, "id": 55, "frequency": "c", "synset": "barrow.n.03"}, {"name": "baseball_base", "instance_count": 404, "def": "a place that the runner must touch before scoring", "synonyms": ["baseball_base"], "image_count": 303, "id": 56, "frequency": "f", "synset": "base.n.03"}, {"name": "baseball", "instance_count": 1013, "def": "a ball used in playing baseball", "synonyms": ["baseball"], "image_count": 738, "id": 57, "frequency": "f", "synset": "baseball.n.02"}, {"name": "baseball_bat", "instance_count": 2698, "def": "an implement used in baseball by the batter", "synonyms": ["baseball_bat"], "image_count": 1799, "id": 58, "frequency": "f", "synset": "baseball_bat.n.01"}, {"name": "baseball_cap", "instance_count": 9028, "def": "a cap with a bill", "synonyms": ["baseball_cap", "jockey_cap", "golf_cap"], "image_count": 1934, "id": 59, "frequency": "f", "synset": "baseball_cap.n.01"}, {"name": "baseball_glove", "instance_count": 2536, "def": "the handwear used by fielders in playing baseball", "synonyms": ["baseball_glove", "baseball_mitt"], "image_count": 1609, "id": 60, "frequency": "f", "synset": "baseball_glove.n.01"}, {"name": "basket", "instance_count": 3984, "def": "a container that is usually woven and has handles", "synonyms": ["basket", "handbasket"], "image_count": 1622, "id": 61, "frequency": "f", "synset": "basket.n.01"}, {"name": "basketball", "instance_count": 56, "def": "an inflated ball used in playing basketball", "synonyms": ["basketball"], "image_count": 41, "id": 62, "frequency": "c", "synset": "basketball.n.02"}, {"name": "bass_horn", "instance_count": 6, "def": "the lowest brass wind instrument", "synonyms": ["bass_horn", 
"sousaphone", "tuba"], "image_count": 4, "id": 63, "frequency": "r", "synset": "bass_horn.n.01"}, {"name": "bat_(animal)", "instance_count": 47, "def": "nocturnal mouselike mammal with forelimbs modified to form membranous wings", "synonyms": ["bat_(animal)"], "image_count": 11, "id": 64, "frequency": "c", "synset": "bat.n.01"}, {"name": "bath_mat", "instance_count": 336, "def": "a heavy towel or mat to stand on while drying yourself after a bath", "synonyms": ["bath_mat"], "image_count": 270, "id": 65, "frequency": "f", "synset": "bath_mat.n.01"}, {"name": "bath_towel", "instance_count": 1210, "def": "a large towel; to dry yourself after a bath", "synonyms": ["bath_towel"], "image_count": 349, "id": 66, "frequency": "f", "synset": "bath_towel.n.01"}, {"name": "bathrobe", "instance_count": 53, "def": "a loose-fitting robe of towelling; worn after a bath or swim", "synonyms": ["bathrobe"], "image_count": 42, "id": 67, "frequency": "c", "synset": "bathrobe.n.01"}, {"name": "bathtub", "instance_count": 868, "def": "a large open container that you fill with water and use to wash the body", "synonyms": ["bathtub", "bathing_tub"], "image_count": 823, "id": 68, "frequency": "f", "synset": "bathtub.n.01"}, {"name": "batter_(food)", "instance_count": 26, "def": "a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking", "synonyms": ["batter_(food)"], "image_count": 6, "id": 69, "frequency": "r", "synset": "batter.n.02"}, {"name": "battery", "instance_count": 155, "def": "a portable device that produces electricity", "synonyms": ["battery"], "image_count": 48, "id": 70, "frequency": "c", "synset": "battery.n.02"}, {"name": "beachball", "instance_count": 3, "def": "large and light ball; for play at the seaside", "synonyms": ["beachball"], "image_count": 3, "id": 71, "frequency": "r", "synset": "beach_ball.n.01"}, {"name": "bead", "instance_count": 1371, "def": "a small ball with a hole through the middle used for ornamentation, jewellery, etc.", "synonyms": ["bead"], "image_count": 42, "id": 72, "frequency": "c", "synset": "bead.n.01"}, {"name": "bean_curd", "instance_count": 231, "def": "cheeselike food made of curdled soybean milk", "synonyms": ["bean_curd", "tofu"], "image_count": 24, "id": 73, "frequency": "c", "synset": "bean_curd.n.01"}, {"name": "beanbag", "instance_count": 20, "def": "a bag filled with dried beans or similar items; used in games or to sit on", "synonyms": ["beanbag"], "image_count": 16, "id": 74, "frequency": "c", "synset": "beanbag.n.01"}, {"name": "beanie", "instance_count": 1907, "def": "a small skullcap; formerly worn by schoolboys and college freshmen", "synonyms": ["beanie", "beany"], "image_count": 605, "id": 75, "frequency": "f", "synset": "beanie.n.01"}, {"name": "bear", "instance_count": 1069, "def": "large carnivorous or omnivorous mammals with shaggy coats and claws", "synonyms": ["bear"], "image_count": 646, "id": 76, "frequency": "f", "synset": "bear.n.01"}, {"name": "bed", "instance_count": 2137, "def": "a piece of furniture that provides a place to sleep", "synonyms": ["bed"], "image_count": 1765, "id": 77, "frequency": "f", "synset": "bed.n.01"}, {"name": "bedpan", "instance_count": 2, "def": "a shallow vessel used by a bedridden patient for defecation and urination", "synonyms": ["bedpan"], "image_count": 2, "id": 78, "frequency": "r", "synset": "bedpan.n.01"}, {"name": "bedspread", "instance_count": 188, "def": "decorative cover for a bed", "synonyms": ["bedspread", "bedcover", "bed_covering", "counterpane", "spread"], "image_count": 
125, "id": 79, "frequency": "f", "synset": "bedspread.n.01"}, {"name": "cow", "instance_count": 8085, "def": "cattle/cow", "synonyms": ["cow"], "image_count": 1420, "id": 80, "frequency": "f", "synset": "beef.n.01"}, {"name": "beef_(food)", "instance_count": 1242, "def": "meat from an adult domestic bovine", "synonyms": ["beef_(food)", "boeuf_(food)"], "image_count": 140, "id": 81, "frequency": "f", "synset": "beef.n.02"}, {"name": "beeper", "instance_count": 4, "def": "an device that beeps when the person carrying it is being paged", "synonyms": ["beeper", "pager"], "image_count": 4, "id": 82, "frequency": "r", "synset": "beeper.n.01"}, {"name": "beer_bottle", "instance_count": 1227, "def": "a bottle that holds beer", "synonyms": ["beer_bottle"], "image_count": 322, "id": 83, "frequency": "f", "synset": "beer_bottle.n.01"}, {"name": "beer_can", "instance_count": 203, "def": "a can that holds beer", "synonyms": ["beer_can"], "image_count": 60, "id": 84, "frequency": "c", "synset": "beer_can.n.01"}, {"name": "beetle", "instance_count": 9, "def": "insect with hard wing covers", "synonyms": ["beetle"], "image_count": 2, "id": 85, "frequency": "r", "synset": "beetle.n.01"}, {"name": "bell", "instance_count": 590, "def": "a hollow device made of metal that makes a ringing sound when struck", "synonyms": ["bell"], "image_count": 231, "id": 86, "frequency": "f", "synset": "bell.n.01"}, {"name": "bell_pepper", "instance_count": 4369, "def": "large bell-shaped sweet pepper in green or red or yellow or orange or black varieties", "synonyms": ["bell_pepper", "capsicum"], "image_count": 333, "id": 87, "frequency": "f", "synset": "bell_pepper.n.02"}, {"name": "belt", "instance_count": 3683, "def": "a band to tie or buckle around the body (usually at the waist)", "synonyms": ["belt"], "image_count": 1941, "id": 88, "frequency": "f", "synset": "belt.n.02"}, {"name": "belt_buckle", "instance_count": 589, "def": "the buckle used to fasten a belt", "synonyms": ["belt_buckle"], "image_count": 367, "id": 89, "frequency": "f", "synset": "belt_buckle.n.01"}, {"name": "bench", "instance_count": 4374, "def": "a long seat for more than one person", "synonyms": ["bench"], "image_count": 1922, "id": 90, "frequency": "f", "synset": "bench.n.01"}, {"name": "beret", "instance_count": 57, "def": "a cap with no brim or bill; made of soft cloth", "synonyms": ["beret"], "image_count": 18, "id": 91, "frequency": "c", "synset": "beret.n.01"}, {"name": "bib", "instance_count": 96, "def": "a napkin tied under the chin of a child while eating", "synonyms": ["bib"], "image_count": 81, "id": 92, "frequency": "c", "synset": "bib.n.02"}, {"name": "Bible", "instance_count": 2, "def": "the sacred writings of the Christian religions", "synonyms": ["Bible"], "image_count": 1, "id": 93, "frequency": "r", "synset": "bible.n.01"}, {"name": "bicycle", "instance_count": 4566, "def": "a wheeled vehicle that has two wheels and is moved by foot pedals", "synonyms": ["bicycle", "bike_(bicycle)"], "image_count": 1852, "id": 94, "frequency": "f", "synset": "bicycle.n.01"}, {"name": "visor", "instance_count": 777, "def": "a brim that projects to the front to shade the eyes", "synonyms": ["visor", "vizor"], "image_count": 430, "id": 95, "frequency": "f", "synset": "bill.n.09"}, {"name": "billboard", "instance_count": 1025, "def": "large outdoor signboard", "synonyms": ["billboard"], "image_count": 247, "id": 96, "frequency": "f", "synset": "billboard.n.01"}, {"name": "binder", "instance_count": 311, "def": "holds loose papers or magazines", 
"synonyms": ["binder", "ring-binder"], "image_count": 94, "id": 97, "frequency": "c", "synset": "binder.n.03"}, {"name": "binoculars", "instance_count": 22, "def": "an optical instrument designed for simultaneous use by both eyes", "synonyms": ["binoculars", "field_glasses", "opera_glasses"], "image_count": 21, "id": 98, "frequency": "c", "synset": "binoculars.n.01"}, {"name": "bird", "instance_count": 11557, "def": "animal characterized by feathers and wings", "synonyms": ["bird"], "image_count": 1821, "id": 99, "frequency": "f", "synset": "bird.n.01"}, {"name": "birdfeeder", "instance_count": 16, "def": "an outdoor device that supplies food for wild birds", "synonyms": ["birdfeeder"], "image_count": 16, "id": 100, "frequency": "c", "synset": "bird_feeder.n.01"}, {"name": "birdbath", "instance_count": 12, "def": "an ornamental basin (usually in a garden) for birds to bathe in", "synonyms": ["birdbath"], "image_count": 12, "id": 101, "frequency": "c", "synset": "birdbath.n.01"}, {"name": "birdcage", "instance_count": 180, "def": "a cage in which a bird can be kept", "synonyms": ["birdcage"], "image_count": 25, "id": 102, "frequency": "c", "synset": "birdcage.n.01"}, {"name": "birdhouse", "instance_count": 60, "def": "a shelter for birds", "synonyms": ["birdhouse"], "image_count": 41, "id": 103, "frequency": "c", "synset": "birdhouse.n.01"}, {"name": "birthday_cake", "instance_count": 311, "def": "decorated cake served at a birthday party", "synonyms": ["birthday_cake"], "image_count": 244, "id": 104, "frequency": "f", "synset": "birthday_cake.n.01"}, {"name": "birthday_card", "instance_count": 23, "def": "a card expressing a birthday greeting", "synonyms": ["birthday_card"], "image_count": 7, "id": 105, "frequency": "r", "synset": "birthday_card.n.01"}, {"name": "pirate_flag", "instance_count": 1, "def": "a flag usually bearing a white skull and crossbones on a black background", "synonyms": ["pirate_flag"], "image_count": 1, "id": 106, "frequency": "r", "synset": "black_flag.n.01"}, {"name": "black_sheep", "instance_count": 214, "def": "sheep with a black coat", "synonyms": ["black_sheep"], "image_count": 40, "id": 107, "frequency": "c", "synset": "black_sheep.n.02"}, {"name": "blackberry", "instance_count": 406, "def": "large sweet black or very dark purple edible aggregate fruit", "synonyms": ["blackberry"], "image_count": 40, "id": 108, "frequency": "c", "synset": "blackberry.n.01"}, {"name": "blackboard", "instance_count": 154, "def": "sheet of slate; for writing with chalk", "synonyms": ["blackboard", "chalkboard"], "image_count": 104, "id": 109, "frequency": "f", "synset": "blackboard.n.01"}, {"name": "blanket", "instance_count": 3075, "def": "bedding that keeps a person warm in bed", "synonyms": ["blanket"], "image_count": 1671, "id": 110, "frequency": "f", "synset": "blanket.n.01"}, {"name": "blazer", "instance_count": 124, "def": "lightweight jacket; often striped in the colors of a club or school", "synonyms": ["blazer", "sport_jacket", "sport_coat", "sports_jacket", "sports_coat"], "image_count": 49, "id": 111, "frequency": "c", "synset": "blazer.n.01"}, {"name": "blender", "instance_count": 316, "def": "an electrically powered mixer that mix or chop or liquefy foods", "synonyms": ["blender", "liquidizer", "liquidiser"], "image_count": 243, "id": 112, "frequency": "f", "synset": "blender.n.01"}, {"name": "blimp", "instance_count": 3, "def": "a small nonrigid airship used for observation or as a barrage balloon", "synonyms": ["blimp"], "image_count": 2, "id": 113, "frequency": 
"r", "synset": "blimp.n.02"}, {"name": "blinker", "instance_count": 1269, "def": "a light that flashes on and off; used as a signal or to send messages", "synonyms": ["blinker", "flasher"], "image_count": 242, "id": 114, "frequency": "f", "synset": "blinker.n.01"}, {"name": "blouse", "instance_count": 623, "def": "a top worn by women", "synonyms": ["blouse"], "image_count": 271, "id": 115, "frequency": "f", "synset": "blouse.n.01"}, {"name": "blueberry", "instance_count": 2114, "def": "sweet edible dark-blue berries of blueberry plants", "synonyms": ["blueberry"], "image_count": 104, "id": 116, "frequency": "f", "synset": "blueberry.n.02"}, {"name": "gameboard", "instance_count": 17, "def": "a flat portable surface (usually rectangular) designed for board games", "synonyms": ["gameboard"], "image_count": 8, "id": 117, "frequency": "r", "synset": "board.n.09"}, {"name": "boat", "instance_count": 9981, "def": "a vessel for travel on water", "synonyms": ["boat", "ship_(boat)"], "image_count": 1758, "id": 118, "frequency": "f", "synset": "boat.n.01"}, {"name": "bob", "instance_count": 2, "def": "a small float usually made of cork; attached to a fishing line", "synonyms": ["bob", "bobber", "bobfloat"], "image_count": 1, "id": 119, "frequency": "r", "synset": "bob.n.05"}, {"name": "bobbin", "instance_count": 190, "def": "a thing around which thread/tape/film or other flexible materials can be wound", "synonyms": ["bobbin", "spool", "reel"], "image_count": 48, "id": 120, "frequency": "c", "synset": "bobbin.n.01"}, {"name": "bobby_pin", "instance_count": 43, "def": "a flat wire hairpin used to hold bobbed hair in place", "synonyms": ["bobby_pin", "hairgrip"], "image_count": 14, "id": 121, "frequency": "c", "synset": "bobby_pin.n.01"}, {"name": "boiled_egg", "instance_count": 125, "def": "egg cooked briefly in the shell in gently boiling water", "synonyms": ["boiled_egg", "coddled_egg"], "image_count": 40, "id": 122, "frequency": "c", "synset": "boiled_egg.n.01"}, {"name": "bolo_tie", "instance_count": 1, "def": "a cord fastened around the neck with an ornamental clasp and worn as a necktie", "synonyms": ["bolo_tie", "bolo", "bola_tie", "bola"], "image_count": 1, "id": 123, "frequency": "r", "synset": "bolo_tie.n.01"}, {"name": "deadbolt", "instance_count": 46, "def": "the part of a lock that is engaged or withdrawn with a key", "synonyms": ["deadbolt"], "image_count": 37, "id": 124, "frequency": "c", "synset": "bolt.n.03"}, {"name": "bolt", "instance_count": 11261, "def": "a screw that screws into a nut to form a fastener", "synonyms": ["bolt"], "image_count": 1510, "id": 125, "frequency": "f", "synset": "bolt.n.06"}, {"name": "bonnet", "instance_count": 10, "def": "a hat tied under the chin", "synonyms": ["bonnet"], "image_count": 6, "id": 126, "frequency": "r", "synset": "bonnet.n.01"}, {"name": "book", "instance_count": 33353, "def": "a written work or composition that has been published", "synonyms": ["book"], "image_count": 1903, "id": 127, "frequency": "f", "synset": "book.n.01"}, {"name": "bookcase", "instance_count": 113, "def": "a piece of furniture with shelves for storing books", "synonyms": ["bookcase"], "image_count": 70, "id": 128, "frequency": "c", "synset": "bookcase.n.01"}, {"name": "booklet", "instance_count": 439, "def": "a small book usually having a paper cover", "synonyms": ["booklet", "brochure", "leaflet", "pamphlet"], "image_count": 86, "id": 129, "frequency": "c", "synset": "booklet.n.01"}, {"name": "bookmark", "instance_count": 15, "def": "a marker (a piece of paper or 
ribbon) placed between the pages of a book", "synonyms": ["bookmark", "bookmarker"], "image_count": 7, "id": 130, "frequency": "r", "synset": "bookmark.n.01"}, {"name": "boom_microphone", "instance_count": 10, "def": "a pole carrying an overhead microphone projected over a film or tv set", "synonyms": ["boom_microphone", "microphone_boom"], "image_count": 5, "id": 131, "frequency": "r", "synset": "boom.n.04"}, {"name": "boot", "instance_count": 4194, "def": "footwear that covers the whole foot and lower leg", "synonyms": ["boot"], "image_count": 1406, "id": 132, "frequency": "f", "synset": "boot.n.01"}, {"name": "bottle", "instance_count": 7969, "def": "a glass or plastic vessel used for storing drinks or other liquids", "synonyms": ["bottle"], "image_count": 1901, "id": 133, "frequency": "f", "synset": "bottle.n.01"}, {"name": "bottle_opener", "instance_count": 15, "def": "an opener for removing caps or corks from bottles", "synonyms": ["bottle_opener"], "image_count": 15, "id": 134, "frequency": "c", "synset": "bottle_opener.n.01"}, {"name": "bouquet", "instance_count": 53, "def": "an arrangement of flowers that is usually given as a present", "synonyms": ["bouquet"], "image_count": 28, "id": 135, "frequency": "c", "synset": "bouquet.n.01"}, {"name": "bow_(weapon)", "instance_count": 6, "def": "a weapon for shooting arrows", "synonyms": ["bow_(weapon)"], "image_count": 6, "id": 136, "frequency": "r", "synset": "bow.n.04"}, {"name": "bow_(decorative_ribbons)", "instance_count": 1144, "def": "a decorative interlacing of ribbons", "synonyms": ["bow_(decorative_ribbons)"], "image_count": 494, "id": 137, "frequency": "f", "synset": "bow.n.08"}, {"name": "bow-tie", "instance_count": 359, "def": "a man's tie that ties in a bow", "synonyms": ["bow-tie", "bowtie"], "image_count": 234, "id": 138, "frequency": "f", "synset": "bow_tie.n.01"}, {"name": "bowl", "instance_count": 5308, "def": "a dish that is round and open at the top for serving foods", "synonyms": ["bowl"], "image_count": 1922, "id": 139, "frequency": "f", "synset": "bowl.n.03"}, {"name": "pipe_bowl", "instance_count": 1, "def": "a small round container that is open at the top for holding tobacco", "synonyms": ["pipe_bowl"], "image_count": 1, "id": 140, "frequency": "r", "synset": "bowl.n.08"}, {"name": "bowler_hat", "instance_count": 89, "def": "a felt hat that is round and hard with a narrow brim", "synonyms": ["bowler_hat", "bowler", "derby_hat", "derby", "plug_hat"], "image_count": 35, "id": 141, "frequency": "c", "synset": "bowler_hat.n.01"}, {"name": "bowling_ball", "instance_count": 38, "def": "a large ball with finger holes used in the sport of bowling", "synonyms": ["bowling_ball"], "image_count": 5, "id": 142, "frequency": "r", "synset": "bowling_ball.n.01"}, {"name": "box", "instance_count": 7855, "def": "a (usually rectangular) container; may have a lid", "synonyms": ["box"], "image_count": 1828, "id": 143, "frequency": "f", "synset": "box.n.01"}, {"name": "boxing_glove", "instance_count": 22, "def": "large glove coverings the fists of a fighter worn for the sport of boxing", "synonyms": ["boxing_glove"], "image_count": 8, "id": 144, "frequency": "r", "synset": "boxing_glove.n.01"}, {"name": "suspenders", "instance_count": 88, "def": "elastic straps that hold trousers up (usually used in the plural)", "synonyms": ["suspenders"], "image_count": 63, "id": 145, "frequency": "c", "synset": "brace.n.06"}, {"name": "bracelet", "instance_count": 3219, "def": "jewelry worn around the wrist for decoration", "synonyms": ["bracelet", 
"bangle"], "image_count": 1668, "id": 146, "frequency": "f", "synset": "bracelet.n.02"}, {"name": "brass_plaque", "instance_count": 4, "def": "a memorial made of brass", "synonyms": ["brass_plaque"], "image_count": 4, "id": 147, "frequency": "r", "synset": "brass.n.07"}, {"name": "brassiere", "instance_count": 118, "def": "an undergarment worn by women to support their breasts", "synonyms": ["brassiere", "bra", "bandeau"], "image_count": 95, "id": 148, "frequency": "c", "synset": "brassiere.n.01"}, {"name": "bread-bin", "instance_count": 17, "def": "a container used to keep bread or cake in", "synonyms": ["bread-bin", "breadbox"], "image_count": 17, "id": 149, "frequency": "c", "synset": "bread-bin.n.01"}, {"name": "bread", "instance_count": 6550, "def": "food made from dough of flour or meal and usually raised with yeast or baking powder and then baked", "synonyms": ["bread"], "image_count": 1567, "id": 150, "frequency": "f", "synset": "bread.n.01"}, {"name": "breechcloth", "instance_count": 3, "def": "a garment that provides covering for the loins", "synonyms": ["breechcloth", "breechclout", "loincloth"], "image_count": 2, "id": 151, "frequency": "r", "synset": "breechcloth.n.01"}, {"name": "bridal_gown", "instance_count": 118, "def": "a gown worn by the bride at a wedding", "synonyms": ["bridal_gown", "wedding_gown", "wedding_dress"], "image_count": 103, "id": 152, "frequency": "f", "synset": "bridal_gown.n.01"}, {"name": "briefcase", "instance_count": 84, "def": "a case with a handle; for carrying papers or files or books", "synonyms": ["briefcase"], "image_count": 50, "id": 153, "frequency": "c", "synset": "briefcase.n.01"}, {"name": "broccoli", "instance_count": 12166, "def": "plant with dense clusters of tight green flower buds", "synonyms": ["broccoli"], "image_count": 1309, "id": 154, "frequency": "f", "synset": "broccoli.n.01"}, {"name": "broach", "instance_count": 9, "def": "a decorative pin worn by women", "synonyms": ["broach"], "image_count": 6, "id": 155, "frequency": "r", "synset": "brooch.n.01"}, {"name": "broom", "instance_count": 144, "def": "bundle of straws or twigs attached to a long handle; used for cleaning", "synonyms": ["broom"], "image_count": 92, "id": 156, "frequency": "c", "synset": "broom.n.01"}, {"name": "brownie", "instance_count": 217, "def": "square or bar of very rich chocolate cake usually with nuts", "synonyms": ["brownie"], "image_count": 19, "id": 157, "frequency": "c", "synset": "brownie.n.03"}, {"name": "brussels_sprouts", "instance_count": 590, "def": "the small edible cabbage-like buds growing along a stalk", "synonyms": ["brussels_sprouts"], "image_count": 37, "id": 158, "frequency": "c", "synset": "brussels_sprouts.n.01"}, {"name": "bubble_gum", "instance_count": 4, "def": "a kind of chewing gum that can be blown into bubbles", "synonyms": ["bubble_gum"], "image_count": 4, "id": 159, "frequency": "r", "synset": "bubble_gum.n.01"}, {"name": "bucket", "instance_count": 1346, "def": "a roughly cylindrical vessel that is open at the top", "synonyms": ["bucket", "pail"], "image_count": 709, "id": 160, "frequency": "f", "synset": "bucket.n.01"}, {"name": "horse_buggy", "instance_count": 19, "def": "a small lightweight carriage; drawn by a single horse", "synonyms": ["horse_buggy"], "image_count": 9, "id": 161, "frequency": "r", "synset": "buggy.n.01"}, {"name": "bull", "instance_count": 230, "def": "a cow with horns", "synonyms": ["horned_cow"], "image_count": 82, "id": 162, "frequency": "c", "synset": "bull.n.11"}, {"name": "bulldog", 
"instance_count": 21, "def": "a thickset short-haired dog with a large head and strong undershot lower jaw", "synonyms": ["bulldog"], "image_count": 15, "id": 163, "frequency": "c", "synset": "bulldog.n.01"}, {"name": "bulldozer", "instance_count": 4, "def": "large powerful tractor; a large blade in front flattens areas of ground", "synonyms": ["bulldozer", "dozer"], "image_count": 3, "id": 164, "frequency": "r", "synset": "bulldozer.n.01"}, {"name": "bullet_train", "instance_count": 80, "def": "a high-speed passenger train", "synonyms": ["bullet_train"], "image_count": 61, "id": 165, "frequency": "c", "synset": "bullet_train.n.01"}, {"name": "bulletin_board", "instance_count": 76, "def": "a board that hangs on a wall; displays announcements", "synonyms": ["bulletin_board", "notice_board"], "image_count": 51, "id": 166, "frequency": "c", "synset": "bulletin_board.n.02"}, {"name": "bulletproof_vest", "instance_count": 27, "def": "a vest capable of resisting the impact of a bullet", "synonyms": ["bulletproof_vest"], "image_count": 5, "id": 167, "frequency": "r", "synset": "bulletproof_vest.n.01"}, {"name": "bullhorn", "instance_count": 15, "def": "a portable loudspeaker with built-in microphone and amplifier", "synonyms": ["bullhorn", "megaphone"], "image_count": 13, "id": 168, "frequency": "c", "synset": "bullhorn.n.01"}, {"name": "bun", "instance_count": 1780, "def": "small rounded bread either plain or sweet", "synonyms": ["bun", "roll"], "image_count": 642, "id": 169, "frequency": "f", "synset": "bun.n.01"}, {"name": "bunk_bed", "instance_count": 44, "def": "beds built one above the other", "synonyms": ["bunk_bed"], "image_count": 24, "id": 170, "frequency": "c", "synset": "bunk_bed.n.01"}, {"name": "buoy", "instance_count": 1404, "def": "a float attached by rope to the seabed to mark channels in a harbor or underwater hazards", "synonyms": ["buoy"], "image_count": 255, "id": 171, "frequency": "f", "synset": "buoy.n.01"}, {"name": "burrito", "instance_count": 14, "def": "a flour tortilla folded around a filling", "synonyms": ["burrito"], "image_count": 9, "id": 172, "frequency": "r", "synset": "burrito.n.01"}, {"name": "bus_(vehicle)", "instance_count": 3281, "def": "a vehicle carrying many passengers; used for public transport", "synonyms": ["bus_(vehicle)", "autobus", "charabanc", "double-decker", "motorbus", "motorcoach"], "image_count": 1808, "id": 173, "frequency": "f", "synset": "bus.n.01"}, {"name": "business_card", "instance_count": 84, "def": "a card on which are printed the person's name and business affiliation", "synonyms": ["business_card"], "image_count": 31, "id": 174, "frequency": "c", "synset": "business_card.n.01"}, {"name": "butter", "instance_count": 308, "def": "an edible emulsion of fat globules made by churning milk or cream; for cooking and table use", "synonyms": ["butter"], "image_count": 158, "id": 175, "frequency": "f", "synset": "butter.n.01"}, {"name": "butterfly", "instance_count": 296, "def": "insect typically having a slender body with knobbed antennae and broad colorful wings", "synonyms": ["butterfly"], "image_count": 80, "id": 176, "frequency": "c", "synset": "butterfly.n.01"}, {"name": "button", "instance_count": 7884, "def": "a round fastener sewn to shirts and coats etc to fit through buttonholes", "synonyms": ["button"], "image_count": 1884, "id": 177, "frequency": "f", "synset": "button.n.01"}, {"name": "cab_(taxi)", "instance_count": 414, "def": "a car that takes passengers where they want to go in exchange for money", "synonyms": ["cab_(taxi)", 
"taxi", "taxicab"], "image_count": 158, "id": 178, "frequency": "f", "synset": "cab.n.03"}, {"name": "cabana", "instance_count": 20, "def": "a small tent used as a dressing room beside the sea or a swimming pool", "synonyms": ["cabana"], "image_count": 2, "id": 179, "frequency": "r", "synset": "cabana.n.01"}, {"name": "cabin_car", "instance_count": 14, "def": "a car on a freight train for use of the train crew; usually the last car on the train", "synonyms": ["cabin_car", "caboose"], "image_count": 12, "id": 180, "frequency": "c", "synset": "cabin_car.n.01"}, {"name": "cabinet", "instance_count": 7371, "def": "a piece of furniture resembling a cupboard with doors and shelves and drawers", "synonyms": ["cabinet"], "image_count": 1659, "id": 181, "frequency": "f", "synset": "cabinet.n.01"}, {"name": "locker", "instance_count": 95, "def": "a storage compartment for clothes and valuables; usually it has a lock", "synonyms": ["locker", "storage_locker"], "image_count": 7, "id": 182, "frequency": "r", "synset": "cabinet.n.03"}, {"name": "cake", "instance_count": 2297, "def": "baked goods made from or based on a mixture of flour, sugar, eggs, and fat", "synonyms": ["cake"], "image_count": 834, "id": 183, "frequency": "f", "synset": "cake.n.03"}, {"name": "calculator", "instance_count": 60, "def": "a small machine that is used for mathematical calculations", "synonyms": ["calculator"], "image_count": 57, "id": 184, "frequency": "c", "synset": "calculator.n.02"}, {"name": "calendar", "instance_count": 251, "def": "a list or register of events (appointments/social events/court cases, etc)", "synonyms": ["calendar"], "image_count": 174, "id": 185, "frequency": "f", "synset": "calendar.n.02"}, {"name": "calf", "instance_count": 301, "def": "young of domestic cattle", "synonyms": ["calf"], "image_count": 95, "id": 186, "frequency": "c", "synset": "calf.n.01"}, {"name": "camcorder", "instance_count": 45, "def": "a portable television camera and videocassette recorder", "synonyms": ["camcorder"], "image_count": 27, "id": 187, "frequency": "c", "synset": "camcorder.n.01"}, {"name": "camel", "instance_count": 34, "def": "cud-chewing mammal used as a draft or saddle animal in desert regions", "synonyms": ["camel"], "image_count": 22, "id": 188, "frequency": "c", "synset": "camel.n.01"}, {"name": "camera", "instance_count": 2471, "def": "equipment for taking photographs", "synonyms": ["camera"], "image_count": 1391, "id": 189, "frequency": "f", "synset": "camera.n.01"}, {"name": "camera_lens", "instance_count": 167, "def": "a lens that focuses the image in a camera", "synonyms": ["camera_lens"], "image_count": 90, "id": 190, "frequency": "c", "synset": "camera_lens.n.01"}, {"name": "camper_(vehicle)", "instance_count": 102, "def": "a recreational vehicle equipped for camping out while traveling", "synonyms": ["camper_(vehicle)", "camping_bus", "motor_home"], "image_count": 40, "id": 191, "frequency": "c", "synset": "camper.n.02"}, {"name": "can", "instance_count": 1424, "def": "airtight sealed metal container for food or drink or paint etc.", "synonyms": ["can", "tin_can"], "image_count": 445, "id": 192, "frequency": "f", "synset": "can.n.01"}, {"name": "can_opener", "instance_count": 22, "def": "a device for cutting cans open", "synonyms": ["can_opener", "tin_opener"], "image_count": 21, "id": 193, "frequency": "c", "synset": "can_opener.n.01"}, {"name": "candle", "instance_count": 4288, "def": "stick of wax with a wick in the middle", "synonyms": ["candle", "candlestick"], "image_count": 1132, "id": 194, 
"frequency": "f", "synset": "candle.n.01"}, {"name": "candle_holder", "instance_count": 530, "def": "a holder with sockets for candles", "synonyms": ["candle_holder"], "image_count": 177, "id": 195, "frequency": "f", "synset": "candlestick.n.01"}, {"name": "candy_bar", "instance_count": 29, "def": "a candy shaped as a bar", "synonyms": ["candy_bar"], "image_count": 4, "id": 196, "frequency": "r", "synset": "candy_bar.n.01"}, {"name": "candy_cane", "instance_count": 107, "def": "a hard candy in the shape of a rod (usually with stripes)", "synonyms": ["candy_cane"], "image_count": 17, "id": 197, "frequency": "c", "synset": "candy_cane.n.01"}, {"name": "walking_cane", "instance_count": 106, "def": "a stick that people can lean on to help them walk", "synonyms": ["walking_cane"], "image_count": 84, "id": 198, "frequency": "c", "synset": "cane.n.01"}, {"name": "canister", "instance_count": 218, "def": "metal container for storing dry foods such as tea or flour", "synonyms": ["canister", "cannister"], "image_count": 55, "id": 199, "frequency": "c", "synset": "canister.n.02"}, {"name": "canoe", "instance_count": 96, "def": "small and light boat; pointed at both ends; propelled with a paddle", "synonyms": ["canoe"], "image_count": 30, "id": 200, "frequency": "c", "synset": "canoe.n.01"}, {"name": "cantaloup", "instance_count": 193, "def": "the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh", "synonyms": ["cantaloup", "cantaloupe"], "image_count": 25, "id": 201, "frequency": "c", "synset": "cantaloup.n.02"}, {"name": "canteen", "instance_count": 2, "def": "a flask for carrying water; used by soldiers or travelers", "synonyms": ["canteen"], "image_count": 2, "id": 202, "frequency": "r", "synset": "canteen.n.01"}, {"name": "cap_(headwear)", "instance_count": 636, "def": "a tight-fitting headwear", "synonyms": ["cap_(headwear)"], "image_count": 125, "id": 203, "frequency": "f", "synset": "cap.n.01"}, {"name": "bottle_cap", "instance_count": 5293, "def": "a top (as for a bottle)", "synonyms": ["bottle_cap", "cap_(container_lid)"], "image_count": 1135, "id": 204, "frequency": "f", "synset": "cap.n.02"}, {"name": "cape", "instance_count": 27, "def": "a sleeveless garment like a cloak but shorter", "synonyms": ["cape"], "image_count": 19, "id": 205, "frequency": "c", "synset": "cape.n.02"}, {"name": "cappuccino", "instance_count": 87, "def": "equal parts of espresso and steamed milk", "synonyms": ["cappuccino", "coffee_cappuccino"], "image_count": 72, "id": 206, "frequency": "c", "synset": "cappuccino.n.01"}, {"name": "car_(automobile)", "instance_count": 10528, "def": "a motor vehicle with four wheels", "synonyms": ["car_(automobile)", "auto_(automobile)", "automobile"], "image_count": 1926, "id": 207, "frequency": "f", "synset": "car.n.01"}, {"name": "railcar_(part_of_a_train)", "instance_count": 928, "def": "a wheeled vehicle adapted to the rails of railroad (mark each individual railcar separately)", "synonyms": ["railcar_(part_of_a_train)", "railway_car_(part_of_a_train)", "railroad_car_(part_of_a_train)"], "image_count": 159, "id": 208, "frequency": "f", "synset": "car.n.02"}, {"name": "elevator_car", "instance_count": 10, "def": "where passengers ride up and down", "synonyms": ["elevator_car"], "image_count": 7, "id": 209, "frequency": "r", "synset": "car.n.04"}, {"name": "car_battery", "instance_count": 1, "def": "a battery in a motor vehicle", "synonyms": ["car_battery", "automobile_battery"], "image_count": 1, "id": 210, "frequency": "r", "synset": 
"car_battery.n.01"}, {"name": "identity_card", "instance_count": 16, "def": "a card certifying the identity of the bearer", "synonyms": ["identity_card"], "image_count": 13, "id": 211, "frequency": "c", "synset": "card.n.02"}, {"name": "card", "instance_count": 122, "def": "a rectangular piece of paper used to send messages (e.g. greetings or pictures)", "synonyms": ["card"], "image_count": 35, "id": 212, "frequency": "c", "synset": "card.n.03"}, {"name": "cardigan", "instance_count": 22, "def": "knitted jacket that is fastened up the front with buttons or a zipper", "synonyms": ["cardigan"], "image_count": 18, "id": 213, "frequency": "c", "synset": "cardigan.n.01"}, {"name": "cargo_ship", "instance_count": 15, "def": "a ship designed to carry cargo", "synonyms": ["cargo_ship", "cargo_vessel"], "image_count": 8, "id": 214, "frequency": "r", "synset": "cargo_ship.n.01"}, {"name": "carnation", "instance_count": 22, "def": "plant with pink to purple-red spice-scented usually double flowers", "synonyms": ["carnation"], "image_count": 6, "id": 215, "frequency": "r", "synset": "carnation.n.01"}, {"name": "horse_carriage", "instance_count": 49, "def": "a vehicle with wheels drawn by one or more horses", "synonyms": ["horse_carriage"], "image_count": 35, "id": 216, "frequency": "c", "synset": "carriage.n.02"}, {"name": "carrot", "instance_count": 18049, "def": "deep orange edible root of the cultivated carrot plant", "synonyms": ["carrot"], "image_count": 1222, "id": 217, "frequency": "f", "synset": "carrot.n.01"}, {"name": "tote_bag", "instance_count": 231, "def": "a capacious bag or basket", "synonyms": ["tote_bag"], "image_count": 103, "id": 218, "frequency": "f", "synset": "carryall.n.01"}, {"name": "cart", "instance_count": 51, "def": "a heavy open wagon usually having two wheels and drawn by an animal", "synonyms": ["cart"], "image_count": 28, "id": 219, "frequency": "c", "synset": "cart.n.01"}, {"name": "carton", "instance_count": 206, "def": "a container made of cardboard for holding food or drink", "synonyms": ["carton"], "image_count": 63, "id": 220, "frequency": "c", "synset": "carton.n.02"}, {"name": "cash_register", "instance_count": 33, "def": "a cashbox with an adding machine to register transactions", "synonyms": ["cash_register", "register_(for_cash_transactions)"], "image_count": 28, "id": 221, "frequency": "c", "synset": "cash_register.n.01"}, {"name": "casserole", "instance_count": 12, "def": "food cooked and served in a casserole", "synonyms": ["casserole"], "image_count": 5, "id": 222, "frequency": "r", "synset": "casserole.n.01"}, {"name": "cassette", "instance_count": 74, "def": "a container that holds a magnetic tape used for recording or playing sound or video", "synonyms": ["cassette"], "image_count": 7, "id": 223, "frequency": "r", "synset": "cassette.n.01"}, {"name": "cast", "instance_count": 15, "def": "bandage consisting of a firm covering that immobilizes broken bones while they heal", "synonyms": ["cast", "plaster_cast", "plaster_bandage"], "image_count": 14, "id": 224, "frequency": "c", "synset": "cast.n.05"}, {"name": "cat", "instance_count": 2387, "def": "a domestic house cat", "synonyms": ["cat"], "image_count": 1918, "id": 225, "frequency": "f", "synset": "cat.n.01"}, {"name": "cauliflower", "instance_count": 1035, "def": "edible compact head of white undeveloped flowers", "synonyms": ["cauliflower"], "image_count": 133, "id": 226, "frequency": "f", "synset": "cauliflower.n.02"}, {"name": "cayenne_(spice)", "instance_count": 49, "def": "ground pods and seeds 
of pungent red peppers of the genus Capsicum", "synonyms": ["cayenne_(spice)", "cayenne_pepper_(spice)", "red_pepper_(spice)"], "image_count": 16, "id": 227, "frequency": "c", "synset": "cayenne.n.02"}, {"name": "CD_player", "instance_count": 37, "def": "electronic equipment for playing compact discs (CDs)", "synonyms": ["CD_player"], "image_count": 27, "id": 228, "frequency": "c", "synset": "cd_player.n.01"}, {"name": "celery", "instance_count": 911, "def": "widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked", "synonyms": ["celery"], "image_count": 110, "id": 229, "frequency": "f", "synset": "celery.n.01"}, {"name": "cellular_telephone", "instance_count": 2902, "def": "a hand-held mobile telephone", "synonyms": ["cellular_telephone", "cellular_phone", "cellphone", "mobile_phone", "smart_phone"], "image_count": 1895, "id": 230, "frequency": "f", "synset": "cellular_telephone.n.01"}, {"name": "chain_mail", "instance_count": 13, "def": "(Middle Ages) flexible armor made of interlinked metal rings", "synonyms": ["chain_mail", "ring_mail", "chain_armor", "chain_armour", "ring_armor", "ring_armour"], "image_count": 4, "id": 231, "frequency": "r", "synset": "chain_mail.n.01"}, {"name": "chair", "instance_count": 11549, "def": "a seat for one person, with a support for the back", "synonyms": ["chair"], "image_count": 1927, "id": 232, "frequency": "f", "synset": "chair.n.01"}, {"name": "chaise_longue", "instance_count": 15, "def": "a long chair; for reclining", "synonyms": ["chaise_longue", "chaise", "daybed"], "image_count": 8, "id": 233, "frequency": "r", "synset": "chaise_longue.n.01"}, {"name": "chalice", "instance_count": 1, "def": "a bowl-shaped drinking vessel; especially the Eucharistic cup", "synonyms": ["chalice"], "image_count": 1, "id": 234, "frequency": "r", "synset": "chalice.n.01"}, {"name": "chandelier", "instance_count": 392, "def": "branched lighting fixture; often ornate; hangs from the ceiling", "synonyms": ["chandelier"], "image_count": 263, "id": 235, "frequency": "f", "synset": "chandelier.n.01"}, {"name": "chap", "instance_count": 19, "def": "leather leggings without a seat; worn over trousers by cowboys to protect their legs", "synonyms": ["chap"], "image_count": 10, "id": 236, "frequency": "r", "synset": "chap.n.04"}, {"name": "checkbook", "instance_count": 2, "def": "a book issued to holders of checking accounts", "synonyms": ["checkbook", "chequebook"], "image_count": 2, "id": 237, "frequency": "r", "synset": "checkbook.n.01"}, {"name": "checkerboard", "instance_count": 3, "def": "a board having 64 squares of two alternating colors", "synonyms": ["checkerboard"], "image_count": 3, "id": 238, "frequency": "r", "synset": "checkerboard.n.01"}, {"name": "cherry", "instance_count": 903, "def": "a red fruit with a single hard stone", "synonyms": ["cherry"], "image_count": 87, "id": 239, "frequency": "c", "synset": "cherry.n.03"}, {"name": "chessboard", "instance_count": 13, "def": "a checkerboard used to play chess", "synonyms": ["chessboard"], "image_count": 9, "id": 240, "frequency": "r", "synset": "chessboard.n.01"}, {"name": "chicken_(animal)", "instance_count": 417, "def": "a domestic fowl bred for flesh or eggs", "synonyms": ["chicken_(animal)"], "image_count": 71, "id": 241, "frequency": "c", "synset": "chicken.n.02"}, {"name": "chickpea", "instance_count": 265, "def": "the seed of the chickpea plant; usually dried", "synonyms": ["chickpea", "garbanzo"], "image_count": 13, "id": 242, "frequency": "c", "synset": "chickpea.n.01"}, {"name": 
"chili_(vegetable)", "instance_count": 354, "def": "very hot and finely tapering pepper of special pungency", "synonyms": ["chili_(vegetable)", "chili_pepper_(vegetable)", "chilli_(vegetable)", "chilly_(vegetable)", "chile_(vegetable)"], "image_count": 18, "id": 243, "frequency": "c", "synset": "chili.n.02"}, {"name": "chime", "instance_count": 2, "def": "an instrument consisting of a set of bells that are struck with a hammer", "synonyms": ["chime", "gong"], "image_count": 2, "id": 244, "frequency": "r", "synset": "chime.n.01"}, {"name": "chinaware", "instance_count": 41, "def": "dishware made of high quality porcelain", "synonyms": ["chinaware"], "image_count": 5, "id": 245, "frequency": "r", "synset": "chinaware.n.01"}, {"name": "crisp_(potato_chip)", "instance_count": 541, "def": "a thin crisp slice of potato fried in deep fat", "synonyms": ["crisp_(potato_chip)", "potato_chip"], "image_count": 45, "id": 246, "frequency": "c", "synset": "chip.n.04"}, {"name": "poker_chip", "instance_count": 21, "def": "a small disk-shaped counter used to represent money when gambling", "synonyms": ["poker_chip"], "image_count": 1, "id": 247, "frequency": "r", "synset": "chip.n.06"}, {"name": "chocolate_bar", "instance_count": 179, "def": "a bar of chocolate candy", "synonyms": ["chocolate_bar"], "image_count": 23, "id": 248, "frequency": "c", "synset": "chocolate_bar.n.01"}, {"name": "chocolate_cake", "instance_count": 80, "def": "cake containing chocolate", "synonyms": ["chocolate_cake"], "image_count": 32, "id": 249, "frequency": "c", "synset": "chocolate_cake.n.01"}, {"name": "chocolate_milk", "instance_count": 7, "def": "milk flavored with chocolate syrup", "synonyms": ["chocolate_milk"], "image_count": 4, "id": 250, "frequency": "r", "synset": "chocolate_milk.n.01"}, {"name": "chocolate_mousse", "instance_count": 1, "def": "dessert mousse made with chocolate", "synonyms": ["chocolate_mousse"], "image_count": 1, "id": 251, "frequency": "r", "synset": "chocolate_mousse.n.01"}, {"name": "choker", "instance_count": 1380, "def": "shirt collar, animal collar, or tight-fitting necklace", "synonyms": ["choker", "collar", "neckband"], "image_count": 858, "id": 252, "frequency": "f", "synset": "choker.n.03"}, {"name": "chopping_board", "instance_count": 840, "def": "a wooden board where meats or vegetables can be cut", "synonyms": ["chopping_board", "cutting_board", "chopping_block"], "image_count": 661, "id": 253, "frequency": "f", "synset": "chopping_board.n.01"}, {"name": "chopstick", "instance_count": 557, "def": "one of a pair of slender sticks used as oriental tableware to eat food with", "synonyms": ["chopstick"], "image_count": 168, "id": 254, "frequency": "f", "synset": "chopstick.n.01"}, {"name": "Christmas_tree", "instance_count": 303, "def": "an ornamented evergreen used as a Christmas decoration", "synonyms": ["Christmas_tree"], "image_count": 210, "id": 255, "frequency": "f", "synset": "christmas_tree.n.05"}, {"name": "slide", "instance_count": 106, "def": "sloping channel through which things can descend", "synonyms": ["slide"], "image_count": 65, "id": 256, "frequency": "c", "synset": "chute.n.02"}, {"name": "cider", "instance_count": 38, "def": "a beverage made from juice pressed from apples", "synonyms": ["cider", "cyder"], "image_count": 4, "id": 257, "frequency": "r", "synset": "cider.n.01"}, {"name": "cigar_box", "instance_count": 3, "def": "a box for holding cigars", "synonyms": ["cigar_box"], "image_count": 2, "id": 258, "frequency": "r", "synset": "cigar_box.n.01"}, {"name": 
"cigarette", "instance_count": 269, "def": "finely ground tobacco wrapped in paper; for smoking", "synonyms": ["cigarette"], "image_count": 159, "id": 259, "frequency": "f", "synset": "cigarette.n.01"}, {"name": "cigarette_case", "instance_count": 35, "def": "a small flat case for holding cigarettes", "synonyms": ["cigarette_case", "cigarette_pack"], "image_count": 31, "id": 260, "frequency": "c", "synset": "cigarette_case.n.01"}, {"name": "cistern", "instance_count": 901, "def": "a tank that holds the water used to flush a toilet", "synonyms": ["cistern", "water_tank"], "image_count": 811, "id": 261, "frequency": "f", "synset": "cistern.n.02"}, {"name": "clarinet", "instance_count": 1, "def": "a single-reed instrument with a straight tube", "synonyms": ["clarinet"], "image_count": 1, "id": 262, "frequency": "r", "synset": "clarinet.n.01"}, {"name": "clasp", "instance_count": 197, "def": "a fastener (as a buckle or hook) that is used to hold two things together", "synonyms": ["clasp"], "image_count": 42, "id": 263, "frequency": "c", "synset": "clasp.n.01"}, {"name": "cleansing_agent", "instance_count": 63, "def": "a preparation used in cleaning something", "synonyms": ["cleansing_agent", "cleanser", "cleaner"], "image_count": 27, "id": 264, "frequency": "c", "synset": "cleansing_agent.n.01"}, {"name": "cleat_(for_securing_rope)", "instance_count": 8, "def": "a fastener (usually with two projecting horns) around which a rope can be secured", "synonyms": ["cleat_(for_securing_rope)"], "image_count": 2, "id": 265, "frequency": "r", "synset": "cleat.n.02"}, {"name": "clementine", "instance_count": 108, "def": "a variety of mandarin orange", "synonyms": ["clementine"], "image_count": 5, "id": 266, "frequency": "r", "synset": "clementine.n.01"}, {"name": "clip", "instance_count": 301, "def": "any of various small fasteners used to hold loose articles together", "synonyms": ["clip"], "image_count": 95, "id": 267, "frequency": "c", "synset": "clip.n.03"}, {"name": "clipboard", "instance_count": 36, "def": "a small writing board with a clip at the top for holding papers", "synonyms": ["clipboard"], "image_count": 32, "id": 268, "frequency": "c", "synset": "clipboard.n.01"}, {"name": "clippers_(for_plants)", "instance_count": 1, "def": "shears for cutting grass or shrubbery (often used in the plural)", "synonyms": ["clippers_(for_plants)"], "image_count": 1, "id": 269, "frequency": "r", "synset": "clipper.n.03"}, {"name": "cloak", "instance_count": 1, "def": "a loose outer garment", "synonyms": ["cloak"], "image_count": 1, "id": 270, "frequency": "r", "synset": "cloak.n.02"}, {"name": "clock", "instance_count": 2677, "def": "a timepiece that shows the time of day", "synonyms": ["clock", "timepiece", "timekeeper"], "image_count": 1844, "id": 271, "frequency": "f", "synset": "clock.n.01"}, {"name": "clock_tower", "instance_count": 932, "def": "a tower with a large clock visible high up on an outside face", "synonyms": ["clock_tower"], "image_count": 897, "id": 272, "frequency": "f", "synset": "clock_tower.n.01"}, {"name": "clothes_hamper", "instance_count": 47, "def": "a hamper that holds dirty clothes to be washed or wet clothes to be dried", "synonyms": ["clothes_hamper", "laundry_basket", "clothes_basket"], "image_count": 31, "id": 273, "frequency": "c", "synset": "clothes_hamper.n.01"}, {"name": "clothespin", "instance_count": 111, "def": "wood or plastic fastener; for holding clothes on a clothesline", "synonyms": ["clothespin", "clothes_peg"], "image_count": 23, "id": 274, "frequency": "c", 
"synset": "clothespin.n.01"}, {"name": "clutch_bag", "instance_count": 1, "def": "a woman's strapless purse that is carried in the hand", "synonyms": ["clutch_bag"], "image_count": 1, "id": 275, "frequency": "r", "synset": "clutch_bag.n.01"}, {"name": "coaster", "instance_count": 390, "def": "a covering (plate or mat) that protects the surface of a table", "synonyms": ["coaster"], "image_count": 202, "id": 276, "frequency": "f", "synset": "coaster.n.03"}, {"name": "coat", "instance_count": 4145, "def": "an outer garment that has sleeves and covers the body from shoulder down", "synonyms": ["coat"], "image_count": 746, "id": 277, "frequency": "f", "synset": "coat.n.01"}, {"name": "coat_hanger", "instance_count": 282, "def": "a hanger that is shaped like a person's shoulders", "synonyms": ["coat_hanger", "clothes_hanger", "dress_hanger"], "image_count": 44, "id": 278, "frequency": "c", "synset": "coat_hanger.n.01"}, {"name": "coatrack", "instance_count": 16, "def": "a rack with hooks for temporarily holding coats and hats", "synonyms": ["coatrack", "hatrack"], "image_count": 14, "id": 279, "frequency": "c", "synset": "coatrack.n.01"}, {"name": "cock", "instance_count": 132, "def": "adult male chicken", "synonyms": ["cock", "rooster"], "image_count": 26, "id": 280, "frequency": "c", "synset": "cock.n.04"}, {"name": "cockroach", "instance_count": 1, "def": "any of numerous chiefly nocturnal insects; some are domestic pests", "synonyms": ["cockroach"], "image_count": 1, "id": 281, "frequency": "r", "synset": "cockroach.n.01"}, {"name": "cocoa_(beverage)", "instance_count": 4, "def": "a beverage made from cocoa powder and milk and sugar; usually drunk hot", "synonyms": ["cocoa_(beverage)", "hot_chocolate_(beverage)", "drinking_chocolate"], "image_count": 2, "id": 282, "frequency": "r", "synset": "cocoa.n.01"}, {"name": "coconut", "instance_count": 273, "def": "large hard-shelled brown oval nut with a fibrous husk", "synonyms": ["coconut", "cocoanut"], "image_count": 25, "id": 283, "frequency": "c", "synset": "coconut.n.02"}, {"name": "coffee_maker", "instance_count": 271, "def": "a kitchen appliance for brewing coffee automatically", "synonyms": ["coffee_maker", "coffee_machine"], "image_count": 238, "id": 284, "frequency": "f", "synset": "coffee_maker.n.01"}, {"name": "coffee_table", "instance_count": 709, "def": "low table where magazines can be placed and coffee or cocktails are served", "synonyms": ["coffee_table", "cocktail_table"], "image_count": 592, "id": 285, "frequency": "f", "synset": "coffee_table.n.01"}, {"name": "coffeepot", "instance_count": 32, "def": "tall pot in which coffee is brewed", "synonyms": ["coffeepot"], "image_count": 26, "id": 286, "frequency": "c", "synset": "coffeepot.n.01"}, {"name": "coil", "instance_count": 7, "def": "tubing that is wound in a spiral", "synonyms": ["coil"], "image_count": 5, "id": 287, "frequency": "r", "synset": "coil.n.05"}, {"name": "coin", "instance_count": 305, "def": "a flat metal piece (usually a disc) used as money", "synonyms": ["coin"], "image_count": 42, "id": 288, "frequency": "c", "synset": "coin.n.01"}, {"name": "colander", "instance_count": 16, "def": "bowl-shaped strainer; used to wash or drain foods", "synonyms": ["colander", "cullender"], "image_count": 13, "id": 289, "frequency": "c", "synset": "colander.n.01"}, {"name": "coleslaw", "instance_count": 72, "def": "basically shredded cabbage", "synonyms": ["coleslaw", "slaw"], "image_count": 46, "id": 290, "frequency": "c", "synset": "coleslaw.n.01"}, {"name": 
"coloring_material", "instance_count": 1, "def": "any material used for its color", "synonyms": ["coloring_material", "colouring_material"], "image_count": 1, "id": 291, "frequency": "r", "synset": "coloring_material.n.01"}, {"name": "combination_lock", "instance_count": 13, "def": "lock that can be opened only by turning dials in a special sequence", "synonyms": ["combination_lock"], "image_count": 8, "id": 292, "frequency": "r", "synset": "combination_lock.n.01"}, {"name": "pacifier", "instance_count": 40, "def": "device used for an infant to suck or bite on", "synonyms": ["pacifier", "teething_ring"], "image_count": 34, "id": 293, "frequency": "c", "synset": "comforter.n.04"}, {"name": "comic_book", "instance_count": 97, "def": "a magazine devoted to comic strips", "synonyms": ["comic_book"], "image_count": 5, "id": 294, "frequency": "r", "synset": "comic_book.n.01"}, {"name": "compass", "instance_count": 1, "def": "navigational instrument for finding directions", "synonyms": ["compass"], "image_count": 1, "id": 295, "frequency": "r", "synset": "compass.n.01"}, {"name": "computer_keyboard", "instance_count": 2745, "def": "a keyboard that is a data input device for computers", "synonyms": ["computer_keyboard", "keyboard_(computer)"], "image_count": 1871, "id": 296, "frequency": "f", "synset": "computer_keyboard.n.01"}, {"name": "condiment", "instance_count": 2985, "def": "a preparation (a sauce or relish or spice) to enhance flavor or enjoyment", "synonyms": ["condiment"], "image_count": 717, "id": 297, "frequency": "f", "synset": "condiment.n.01"}, {"name": "cone", "instance_count": 4081, "def": "a cone-shaped object used to direct traffic", "synonyms": ["cone", "traffic_cone"], "image_count": 1010, "id": 298, "frequency": "f", "synset": "cone.n.01"}, {"name": "control", "instance_count": 1775, "def": "a mechanism that controls the operation of a machine", "synonyms": ["control", "controller"], "image_count": 679, "id": 299, "frequency": "f", "synset": "control.n.09"}, {"name": "convertible_(automobile)", "instance_count": 4, "def": "a car that has top that can be folded or removed", "synonyms": ["convertible_(automobile)"], "image_count": 3, "id": 300, "frequency": "r", "synset": "convertible.n.01"}, {"name": "sofa_bed", "instance_count": 5, "def": "a sofa that can be converted into a bed", "synonyms": ["sofa_bed"], "image_count": 4, "id": 301, "frequency": "r", "synset": "convertible.n.03"}, {"name": "cooker", "instance_count": 1, "def": "a utensil for cooking", "synonyms": ["cooker"], "image_count": 1, "id": 302, "frequency": "r", "synset": "cooker.n.01"}, {"name": "cookie", "instance_count": 1920, "def": "any of various small flat sweet cakes (`biscuit' is the British term)", "synonyms": ["cookie", "cooky", "biscuit_(cookie)"], "image_count": 166, "id": 303, "frequency": "f", "synset": "cookie.n.01"}, {"name": "cooking_utensil", "instance_count": 18, "def": "a kitchen utensil made of material that does not melt easily; used for cooking", "synonyms": ["cooking_utensil"], "image_count": 2, "id": 304, "frequency": "r", "synset": "cooking_utensil.n.01"}, {"name": "cooler_(for_food)", "instance_count": 499, "def": "an insulated box for storing food often with ice", "synonyms": ["cooler_(for_food)", "ice_chest"], "image_count": 266, "id": 305, "frequency": "f", "synset": "cooler.n.01"}, {"name": "cork_(bottle_plug)", "instance_count": 326, "def": "the plug in the mouth of a bottle (especially a wine bottle)", "synonyms": ["cork_(bottle_plug)", "bottle_cork"], "image_count": 101, "id": 
306, "frequency": "f", "synset": "cork.n.04"}, {"name": "corkboard", "instance_count": 7, "def": "a sheet consisting of cork granules", "synonyms": ["corkboard"], "image_count": 6, "id": 307, "frequency": "r", "synset": "corkboard.n.01"}, {"name": "corkscrew", "instance_count": 15, "def": "a bottle opener that pulls corks", "synonyms": ["corkscrew", "bottle_screw"], "image_count": 14, "id": 308, "frequency": "c", "synset": "corkscrew.n.01"}, {"name": "edible_corn", "instance_count": 1883, "def": "ears or kernels of corn that can be prepared and served for human food (only mark individual ears or kernels)", "synonyms": ["edible_corn", "corn", "maize"], "image_count": 133, "id": 309, "frequency": "f", "synset": "corn.n.03"}, {"name": "cornbread", "instance_count": 10, "def": "bread made primarily of cornmeal", "synonyms": ["cornbread"], "image_count": 2, "id": 310, "frequency": "r", "synset": "cornbread.n.01"}, {"name": "cornet", "instance_count": 65, "def": "a brass musical instrument with a narrow tube and a flared bell and many valves", "synonyms": ["cornet", "horn", "trumpet"], "image_count": 38, "id": 311, "frequency": "c", "synset": "cornet.n.01"}, {"name": "cornice", "instance_count": 149, "def": "a decorative framework to conceal curtain fixtures at the top of a window casing", "synonyms": ["cornice", "valance", "valance_board", "pelmet"], "image_count": 95, "id": 312, "frequency": "c", "synset": "cornice.n.01"}, {"name": "cornmeal", "instance_count": 1, "def": "coarsely ground corn", "synonyms": ["cornmeal"], "image_count": 1, "id": 313, "frequency": "r", "synset": "cornmeal.n.01"}, {"name": "corset", "instance_count": 12, "def": "a woman's close-fitting foundation garment", "synonyms": ["corset", "girdle"], "image_count": 12, "id": 314, "frequency": "c", "synset": "corset.n.01"}, {"name": "costume", "instance_count": 124, "def": "the attire characteristic of a country or a time or a social class", "synonyms": ["costume"], "image_count": 49, "id": 315, "frequency": "c", "synset": "costume.n.04"}, {"name": "cougar", "instance_count": 6, "def": "large American feline resembling a lion", "synonyms": ["cougar", "puma", "catamount", "mountain_lion", "panther"], "image_count": 5, "id": 316, "frequency": "r", "synset": "cougar.n.01"}, {"name": "coverall", "instance_count": 12, "def": "a loose-fitting protective garment that is worn over other clothing", "synonyms": ["coverall"], "image_count": 5, "id": 317, "frequency": "r", "synset": "coverall.n.01"}, {"name": "cowbell", "instance_count": 29, "def": "a bell hung around the neck of cow so that the cow can be easily located", "synonyms": ["cowbell"], "image_count": 16, "id": 318, "frequency": "c", "synset": "cowbell.n.01"}, {"name": "cowboy_hat", "instance_count": 535, "def": "a hat with a wide brim and a soft crown; worn by American ranch hands", "synonyms": ["cowboy_hat", "ten-gallon_hat"], "image_count": 216, "id": 319, "frequency": "f", "synset": "cowboy_hat.n.01"}, {"name": "crab_(animal)", "instance_count": 50, "def": "decapod having eyes on short stalks and a broad flattened shell and pincers", "synonyms": ["crab_(animal)"], "image_count": 12, "id": 320, "frequency": "c", "synset": "crab.n.01"}, {"name": "crabmeat", "instance_count": 5, "def": "the edible flesh of any of various crabs", "synonyms": ["crabmeat"], "image_count": 1, "id": 321, "frequency": "r", "synset": "crab.n.05"}, {"name": "cracker", "instance_count": 510, "def": "a thin crisp wafer", "synonyms": ["cracker"], "image_count": 54, "id": 322, "frequency": "c", "synset": 
"cracker.n.01"}, {"name": "crape", "instance_count": 12, "def": "small very thin pancake", "synonyms": ["crape", "crepe", "French_pancake"], "image_count": 5, "id": 323, "frequency": "r", "synset": "crape.n.01"}, {"name": "crate", "instance_count": 1832, "def": "a rugged box (usually made of wood); used for shipping", "synonyms": ["crate"], "image_count": 245, "id": 324, "frequency": "f", "synset": "crate.n.01"}, {"name": "crayon", "instance_count": 59, "def": "writing or drawing implement made of a colored stick of composition wax", "synonyms": ["crayon", "wax_crayon"], "image_count": 12, "id": 325, "frequency": "c", "synset": "crayon.n.01"}, {"name": "cream_pitcher", "instance_count": 10, "def": "a small pitcher for serving cream", "synonyms": ["cream_pitcher"], "image_count": 7, "id": 326, "frequency": "r", "synset": "cream_pitcher.n.01"}, {"name": "crescent_roll", "instance_count": 152, "def": "very rich flaky crescent-shaped roll", "synonyms": ["crescent_roll", "croissant"], "image_count": 35, "id": 327, "frequency": "c", "synset": "crescent_roll.n.01"}, {"name": "crib", "instance_count": 40, "def": "baby bed with high sides made of slats", "synonyms": ["crib", "cot"], "image_count": 36, "id": 328, "frequency": "c", "synset": "crib.n.01"}, {"name": "crock_pot", "instance_count": 128, "def": "an earthen jar (made of baked clay) or a modern electric crockpot", "synonyms": ["crock_pot", "earthenware_jar"], "image_count": 32, "id": 329, "frequency": "c", "synset": "crock.n.03"}, {"name": "crossbar", "instance_count": 6991, "def": "a horizontal bar that goes across something", "synonyms": ["crossbar"], "image_count": 1027, "id": 330, "frequency": "f", "synset": "crossbar.n.01"}, {"name": "crouton", "instance_count": 140, "def": "a small piece of toasted or fried bread; served in soup or salads", "synonyms": ["crouton"], "image_count": 10, "id": 331, "frequency": "r", "synset": "crouton.n.01"}, {"name": "crow", "instance_count": 24, "def": "black birds having a raucous call", "synonyms": ["crow"], "image_count": 12, "id": 332, "frequency": "c", "synset": "crow.n.01"}, {"name": "crowbar", "instance_count": 1, "def": "a heavy iron lever with one end forged into a wedge", "synonyms": ["crowbar", "wrecking_bar", "pry_bar"], "image_count": 1, "id": 333, "frequency": "r", "synset": "crowbar.n.01"}, {"name": "crown", "instance_count": 126, "def": "an ornamental jeweled headdress signifying sovereignty", "synonyms": ["crown"], "image_count": 67, "id": 334, "frequency": "c", "synset": "crown.n.04"}, {"name": "crucifix", "instance_count": 99, "def": "representation of the cross on which Jesus died", "synonyms": ["crucifix"], "image_count": 71, "id": 335, "frequency": "c", "synset": "crucifix.n.01"}, {"name": "cruise_ship", "instance_count": 35, "def": "a passenger ship used commercially for pleasure cruises", "synonyms": ["cruise_ship", "cruise_liner"], "image_count": 30, "id": 336, "frequency": "c", "synset": "cruise_ship.n.01"}, {"name": "police_cruiser", "instance_count": 86, "def": "a car in which policemen cruise the streets", "synonyms": ["police_cruiser", "patrol_car", "police_car", "squad_car"], "image_count": 48, "id": 337, "frequency": "c", "synset": "cruiser.n.01"}, {"name": "crumb", "instance_count": 3021, "def": "small piece of e.g. 
bread or cake", "synonyms": ["crumb"], "image_count": 249, "id": 338, "frequency": "f", "synset": "crumb.n.03"}, {"name": "crutch", "instance_count": 20, "def": "a wooden or metal staff that fits under the armpit and reaches to the ground", "synonyms": ["crutch"], "image_count": 13, "id": 339, "frequency": "c", "synset": "crutch.n.01"}, {"name": "cub_(animal)", "instance_count": 55, "def": "the young of certain carnivorous mammals such as the bear or wolf or lion", "synonyms": ["cub_(animal)"], "image_count": 29, "id": 340, "frequency": "c", "synset": "cub.n.03"}, {"name": "cube", "instance_count": 189, "def": "a block in the (approximate) shape of a cube", "synonyms": ["cube", "square_block"], "image_count": 14, "id": 341, "frequency": "c", "synset": "cube.n.05"}, {"name": "cucumber", "instance_count": 1533, "def": "cylindrical green fruit with thin green rind and white flesh eaten as a vegetable", "synonyms": ["cucumber", "cuke"], "image_count": 236, "id": 342, "frequency": "f", "synset": "cucumber.n.02"}, {"name": "cufflink", "instance_count": 17, "def": "jewelry consisting of linked buttons used to fasten the cuffs of a shirt", "synonyms": ["cufflink"], "image_count": 15, "id": 343, "frequency": "c", "synset": "cufflink.n.01"}, {"name": "cup", "instance_count": 4637, "def": "a small open container usually used for drinking; usually has a handle", "synonyms": ["cup"], "image_count": 1521, "id": 344, "frequency": "f", "synset": "cup.n.01"}, {"name": "trophy_cup", "instance_count": 80, "def": "a metal award or cup-shaped vessel with handles that is awarded as a trophy to a competition winner", "synonyms": ["trophy_cup"], "image_count": 25, "id": 345, "frequency": "c", "synset": "cup.n.08"}, {"name": "cupboard", "instance_count": 1623, "def": "a small room (or recess) or cabinet used for storage space", "synonyms": ["cupboard", "closet"], "image_count": 249, "id": 346, "frequency": "f", "synset": "cupboard.n.01"}, {"name": "cupcake", "instance_count": 1628, "def": "small cake baked in a muffin tin", "synonyms": ["cupcake"], "image_count": 139, "id": 347, "frequency": "f", "synset": "cupcake.n.01"}, {"name": "hair_curler", "instance_count": 20, "def": "a cylindrical tube around which the hair is wound to curl it", "synonyms": ["hair_curler", "hair_roller", "hair_crimper"], "image_count": 2, "id": 348, "frequency": "r", "synset": "curler.n.01"}, {"name": "curling_iron", "instance_count": 2, "def": "a cylindrical home appliance that heats hair that has been curled around it", "synonyms": ["curling_iron"], "image_count": 2, "id": 349, "frequency": "r", "synset": "curling_iron.n.01"}, {"name": "curtain", "instance_count": 4506, "def": "hanging cloth used as a blind (especially for a window)", "synonyms": ["curtain", "drapery"], "image_count": 1890, "id": 350, "frequency": "f", "synset": "curtain.n.01"}, {"name": "cushion", "instance_count": 7174, "def": "a soft bag filled with air or padding such as feathers or foam rubber", "synonyms": ["cushion"], "image_count": 1240, "id": 351, "frequency": "f", "synset": "cushion.n.03"}, {"name": "cylinder", "instance_count": 3, "def": "a cylindrical container", "synonyms": ["cylinder"], "image_count": 1, "id": 352, "frequency": "r", "synset": "cylinder.n.04"}, {"name": "cymbal", "instance_count": 24, "def": "a percussion instrument consisting of a concave brass disk", "synonyms": ["cymbal"], "image_count": 9, "id": 353, "frequency": "r", "synset": "cymbal.n.01"}, {"name": "dagger", "instance_count": 1, "def": "a short knife with a pointed blade used for 
piercing or stabbing", "synonyms": ["dagger"], "image_count": 1, "id": 354, "frequency": "r", "synset": "dagger.n.01"}, {"name": "dalmatian", "instance_count": 3, "def": "a large breed having a smooth white coat with black or brown spots", "synonyms": ["dalmatian"], "image_count": 3, "id": 355, "frequency": "r", "synset": "dalmatian.n.02"}, {"name": "dartboard", "instance_count": 11, "def": "a circular board of wood or cork used as the target in the game of darts", "synonyms": ["dartboard"], "image_count": 11, "id": 356, "frequency": "c", "synset": "dartboard.n.01"}, {"name": "date_(fruit)", "instance_count": 103, "def": "sweet edible fruit of the date palm with a single long woody seed", "synonyms": ["date_(fruit)"], "image_count": 4, "id": 357, "frequency": "r", "synset": "date.n.08"}, {"name": "deck_chair", "instance_count": 1787, "def": "a folding chair for use outdoors; a wooden frame supports a length of canvas", "synonyms": ["deck_chair", "beach_chair"], "image_count": 236, "id": 358, "frequency": "f", "synset": "deck_chair.n.01"}, {"name": "deer", "instance_count": 130, "def": "distinguished from Bovidae by the male's having solid deciduous antlers", "synonyms": ["deer", "cervid"], "image_count": 44, "id": 359, "frequency": "c", "synset": "deer.n.01"}, {"name": "dental_floss", "instance_count": 20, "def": "a soft thread for cleaning the spaces between the teeth", "synonyms": ["dental_floss", "floss"], "image_count": 19, "id": 360, "frequency": "c", "synset": "dental_floss.n.01"}, {"name": "desk", "instance_count": 1662, "def": "a piece of furniture with a writing surface and usually drawers or other compartments", "synonyms": ["desk"], "image_count": 1100, "id": 361, "frequency": "f", "synset": "desk.n.01"}, {"name": "detergent", "instance_count": 11, "def": "a surface-active chemical widely used in industry and laundering", "synonyms": ["detergent"], "image_count": 7, "id": 362, "frequency": "r", "synset": "detergent.n.01"}, {"name": "diaper", "instance_count": 89, "def": "garment consisting of a folded cloth drawn up between the legs and fastened at the waist", "synonyms": ["diaper"], "image_count": 69, "id": 363, "frequency": "c", "synset": "diaper.n.01"}, {"name": "diary", "instance_count": 2, "def": "yearly planner book", "synonyms": ["diary", "journal"], "image_count": 2, "id": 364, "frequency": "r", "synset": "diary.n.01"}, {"name": "die", "instance_count": 25, "def": "a small cube with 1 to 6 spots on the six faces; used in gambling", "synonyms": ["die", "dice"], "image_count": 8, "id": 365, "frequency": "r", "synset": "die.n.01"}, {"name": "dinghy", "instance_count": 15, "def": "a small boat of shallow draft with seats and oars with which it is propelled", "synonyms": ["dinghy", "dory", "rowboat"], "image_count": 5, "id": 366, "frequency": "r", "synset": "dinghy.n.01"}, {"name": "dining_table", "instance_count": 312, "def": "a table at which meals are served", "synonyms": ["dining_table"], "image_count": 227, "id": 367, "frequency": "f", "synset": "dining_table.n.01"}, {"name": "tux", "instance_count": 10, "def": "semiformal evening dress for men", "synonyms": ["tux", "tuxedo"], "image_count": 6, "id": 368, "frequency": "r", "synset": "dinner_jacket.n.01"}, {"name": "dish", "instance_count": 532, "def": "a piece of dishware normally used as a container for holding or serving food", "synonyms": ["dish"], "image_count": 106, "id": 369, "frequency": "f", "synset": "dish.n.01"}, {"name": "dish_antenna", "instance_count": 153, "def": "directional antenna consisting of a 
parabolic reflector", "synonyms": ["dish_antenna"], "image_count": 81, "id": 370, "frequency": "c", "synset": "dish.n.05"}, {"name": "dishrag", "instance_count": 32, "def": "a cloth for washing dishes or cleaning in general", "synonyms": ["dishrag", "dishcloth"], "image_count": 17, "id": 371, "frequency": "c", "synset": "dishrag.n.01"}, {"name": "dishtowel", "instance_count": 223, "def": "a towel for drying dishes", "synonyms": ["dishtowel", "tea_towel"], "image_count": 134, "id": 372, "frequency": "f", "synset": "dishtowel.n.01"}, {"name": "dishwasher", "instance_count": 317, "def": "a machine for washing dishes", "synonyms": ["dishwasher", "dishwashing_machine"], "image_count": 312, "id": 373, "frequency": "f", "synset": "dishwasher.n.01"}, {"name": "dishwasher_detergent", "instance_count": 9, "def": "dishsoap or dish detergent designed for use in dishwashers", "synonyms": ["dishwasher_detergent", "dishwashing_detergent", "dishwashing_liquid", "dishsoap"], "image_count": 8, "id": 374, "frequency": "r", "synset": "dishwasher_detergent.n.01"}, {"name": "dispenser", "instance_count": 610, "def": "a container so designed that the contents can be used in prescribed amounts", "synonyms": ["dispenser"], "image_count": 271, "id": 375, "frequency": "f", "synset": "dispenser.n.01"}, {"name": "diving_board", "instance_count": 2, "def": "a springboard from which swimmers can dive", "synonyms": ["diving_board"], "image_count": 2, "id": 376, "frequency": "r", "synset": "diving_board.n.01"}, {"name": "Dixie_cup", "instance_count": 352, "def": "a disposable cup made of paper; for holding drinks", "synonyms": ["Dixie_cup", "paper_cup"], "image_count": 103, "id": 377, "frequency": "f", "synset": "dixie_cup.n.01"}, {"name": "dog", "instance_count": 2684, "def": "a common domesticated dog", "synonyms": ["dog"], "image_count": 1938, "id": 378, "frequency": "f", "synset": "dog.n.01"}, {"name": "dog_collar", "instance_count": 733, "def": "a collar for a dog", "synonyms": ["dog_collar"], "image_count": 574, "id": 379, "frequency": "f", "synset": "dog_collar.n.01"}, {"name": "doll", "instance_count": 398, "def": "a toy replica of a HUMAN (NOT AN ANIMAL)", "synonyms": ["doll"], "image_count": 120, "id": 380, "frequency": "f", "synset": "doll.n.01"}, {"name": "dollar", "instance_count": 2, "def": "a piece of paper money worth one dollar", "synonyms": ["dollar", "dollar_bill", "one_dollar_bill"], "image_count": 2, "id": 381, "frequency": "r", "synset": "dollar.n.02"}, {"name": "dollhouse", "instance_count": 2, "def": "a house so small that it is likened to a child's plaything", "synonyms": ["dollhouse", "doll's_house"], "image_count": 2, "id": 382, "frequency": "r", "synset": "dollhouse.n.01"}, {"name": "dolphin", "instance_count": 38, "def": "any of various small toothed whales with a beaklike snout; larger than porpoises", "synonyms": ["dolphin"], "image_count": 13, "id": 383, "frequency": "c", "synset": "dolphin.n.02"}, {"name": "domestic_ass", "instance_count": 49, "def": "domestic beast of burden descended from the African wild ass; patient but stubborn", "synonyms": ["domestic_ass", "donkey"], "image_count": 29, "id": 384, "frequency": "c", "synset": "domestic_ass.n.01"}, {"name": "doorknob", "instance_count": 4072, "def": "a knob used to open a door (often called `doorhandle' in Great Britain)", "synonyms": ["doorknob", "doorhandle"], "image_count": 1710, "id": 385, "frequency": "f", "synset": "doorknob.n.01"}, {"name": "doormat", "instance_count": 78, "def": "a mat placed outside an exterior door for wiping 
the shoes before entering", "synonyms": ["doormat", "welcome_mat"], "image_count": 66, "id": 386, "frequency": "c", "synset": "doormat.n.02"}, {"name": "doughnut", "instance_count": 11911, "def": "a small ring-shaped friedcake", "synonyms": ["doughnut", "donut"], "image_count": 1008, "id": 387, "frequency": "f", "synset": "doughnut.n.02"}, {"name": "dove", "instance_count": 2, "def": "any of numerous small pigeons", "synonyms": ["dove"], "image_count": 1, "id": 388, "frequency": "r", "synset": "dove.n.01"}, {"name": "dragonfly", "instance_count": 8, "def": "slender-bodied non-stinging insect having iridescent wings that are outspread at rest", "synonyms": ["dragonfly"], "image_count": 3, "id": 389, "frequency": "r", "synset": "dragonfly.n.01"}, {"name": "drawer", "instance_count": 7927, "def": "a boxlike container in a piece of furniture; made so as to slide in and out", "synonyms": ["drawer"], "image_count": 1942, "id": 390, "frequency": "f", "synset": "drawer.n.01"}, {"name": "underdrawers", "instance_count": 23, "def": "underpants worn by men", "synonyms": ["underdrawers", "boxers", "boxershorts"], "image_count": 19, "id": 391, "frequency": "c", "synset": "drawers.n.01"}, {"name": "dress", "instance_count": 2842, "def": "a one-piece garment for a woman; has skirt and bodice", "synonyms": ["dress", "frock"], "image_count": 1488, "id": 392, "frequency": "f", "synset": "dress.n.01"}, {"name": "dress_hat", "instance_count": 76, "def": "a man's hat with a tall crown; usually covered with silk or with beaver fur", "synonyms": ["dress_hat", "high_hat", "opera_hat", "silk_hat", "top_hat"], "image_count": 46, "id": 393, "frequency": "c", "synset": "dress_hat.n.01"}, {"name": "dress_suit", "instance_count": 306, "def": "formalwear consisting of full evening dress for men", "synonyms": ["dress_suit"], "image_count": 106, "id": 394, "frequency": "f", "synset": "dress_suit.n.01"}, {"name": "dresser", "instance_count": 152, "def": "a cabinet with shelves", "synonyms": ["dresser"], "image_count": 115, "id": 395, "frequency": "f", "synset": "dresser.n.05"}, {"name": "drill", "instance_count": 24, "def": "a tool with a sharp rotating point for making holes in hard materials", "synonyms": ["drill"], "image_count": 19, "id": 396, "frequency": "c", "synset": "drill.n.01"}, {"name": "drone", "instance_count": 2, "def": "an aircraft without a pilot that is operated by remote control", "synonyms": ["drone"], "image_count": 2, "id": 397, "frequency": "r", "synset": "drone.n.04"}, {"name": "dropper", "instance_count": 1, "def": "pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time", "synonyms": ["dropper", "eye_dropper"], "image_count": 1, "id": 398, "frequency": "r", "synset": "dropper.n.01"}, {"name": "drum_(musical_instrument)", "instance_count": 59, "def": "a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end", "synonyms": ["drum_(musical_instrument)"], "image_count": 28, "id": 399, "frequency": "c", "synset": "drum.n.01"}, {"name": "drumstick", "instance_count": 25, "def": "a stick used for playing a drum", "synonyms": ["drumstick"], "image_count": 9, "id": 400, "frequency": "r", "synset": "drumstick.n.02"}, {"name": "duck", "instance_count": 1090, "def": "small web-footed broad-billed swimming bird", "synonyms": ["duck"], "image_count": 192, "id": 401, "frequency": "f", "synset": "duck.n.01"}, {"name": "duckling", "instance_count": 36, "def": "young duck", "synonyms": ["duckling"], 
"image_count": 12, "id": 402, "frequency": "c", "synset": "duckling.n.02"}, {"name": "duct_tape", "instance_count": 77, "def": "a wide silvery adhesive tape", "synonyms": ["duct_tape"], "image_count": 21, "id": 403, "frequency": "c", "synset": "duct_tape.n.01"}, {"name": "duffel_bag", "instance_count": 666, "def": "a large cylindrical bag of heavy cloth (does not include suitcases)", "synonyms": ["duffel_bag", "duffle_bag", "duffel", "duffle"], "image_count": 247, "id": 404, "frequency": "f", "synset": "duffel_bag.n.01"}, {"name": "dumbbell", "instance_count": 13, "def": "an exercising weight with two ball-like ends connected by a short handle", "synonyms": ["dumbbell"], "image_count": 6, "id": 405, "frequency": "r", "synset": "dumbbell.n.01"}, {"name": "dumpster", "instance_count": 95, "def": "a container designed to receive and transport and dump waste", "synonyms": ["dumpster"], "image_count": 64, "id": 406, "frequency": "c", "synset": "dumpster.n.01"}, {"name": "dustpan", "instance_count": 7, "def": "a short-handled receptacle into which dust can be swept", "synonyms": ["dustpan"], "image_count": 7, "id": 407, "frequency": "r", "synset": "dustpan.n.02"}, {"name": "eagle", "instance_count": 48, "def": "large birds of prey noted for their broad wings and strong soaring flight", "synonyms": ["eagle"], "image_count": 40, "id": 408, "frequency": "c", "synset": "eagle.n.01"}, {"name": "earphone", "instance_count": 767, "def": "device for listening to audio that is held over or inserted into the ear", "synonyms": ["earphone", "earpiece", "headphone"], "image_count": 542, "id": 409, "frequency": "f", "synset": "earphone.n.01"}, {"name": "earplug", "instance_count": 39, "def": "a soft plug that is inserted into the ear canal to block sound", "synonyms": ["earplug"], "image_count": 2, "id": 410, "frequency": "r", "synset": "earplug.n.01"}, {"name": "earring", "instance_count": 3070, "def": "jewelry to ornament the ear", "synonyms": ["earring"], "image_count": 1898, "id": 411, "frequency": "f", "synset": "earring.n.01"}, {"name": "easel", "instance_count": 43, "def": "an upright tripod for displaying something (usually an artist's canvas)", "synonyms": ["easel"], "image_count": 36, "id": 412, "frequency": "c", "synset": "easel.n.01"}, {"name": "eclair", "instance_count": 39, "def": "oblong cream puff", "synonyms": ["eclair"], "image_count": 4, "id": 413, "frequency": "r", "synset": "eclair.n.01"}, {"name": "eel", "instance_count": 1, "def": "an elongate fish with fatty flesh", "synonyms": ["eel"], "image_count": 1, "id": 414, "frequency": "r", "synset": "eel.n.01"}, {"name": "egg", "instance_count": 813, "def": "oval reproductive body of a fowl (especially a hen) used as food", "synonyms": ["egg", "eggs"], "image_count": 191, "id": 415, "frequency": "f", "synset": "egg.n.02"}, {"name": "egg_roll", "instance_count": 15, "def": "minced vegetables and meat wrapped in a pancake and fried", "synonyms": ["egg_roll", "spring_roll"], "image_count": 6, "id": 416, "frequency": "r", "synset": "egg_roll.n.01"}, {"name": "egg_yolk", "instance_count": 90, "def": "the yellow spherical part of an egg", "synonyms": ["egg_yolk", "yolk_(egg)"], "image_count": 41, "id": 417, "frequency": "c", "synset": "egg_yolk.n.01"}, {"name": "eggbeater", "instance_count": 52, "def": "a mixer for beating eggs or whipping cream", "synonyms": ["eggbeater", "eggwhisk"], "image_count": 39, "id": 418, "frequency": "c", "synset": "eggbeater.n.02"}, {"name": "eggplant", "instance_count": 337, "def": "egg-shaped vegetable having a shiny 
skin typically dark purple", "synonyms": ["eggplant", "aubergine"], "image_count": 46, "id": 419, "frequency": "c", "synset": "eggplant.n.01"}, {"name": "electric_chair", "instance_count": 1, "def": "a chair-shaped instrument of execution by electrocution", "synonyms": ["electric_chair"], "image_count": 1, "id": 420, "frequency": "r", "synset": "electric_chair.n.01"}, {"name": "refrigerator", "instance_count": 1702, "def": "a refrigerator in which the coolant is pumped around by an electric motor", "synonyms": ["refrigerator"], "image_count": 1451, "id": 421, "frequency": "f", "synset": "electric_refrigerator.n.01"}, {"name": "elephant", "instance_count": 5325, "def": "a common elephant", "synonyms": ["elephant"], "image_count": 1878, "id": 422, "frequency": "f", "synset": "elephant.n.01"}, {"name": "elk", "instance_count": 29, "def": "large northern deer with enormous flattened antlers in the male", "synonyms": ["elk", "moose"], "image_count": 11, "id": 423, "frequency": "c", "synset": "elk.n.01"}, {"name": "envelope", "instance_count": 210, "def": "a flat (usually rectangular) container for a letter, thin package, etc.", "synonyms": ["envelope"], "image_count": 82, "id": 424, "frequency": "c", "synset": "envelope.n.01"}, {"name": "eraser", "instance_count": 41, "def": "an implement used to erase something", "synonyms": ["eraser"], "image_count": 18, "id": 425, "frequency": "c", "synset": "eraser.n.01"}, {"name": "escargot", "instance_count": 5, "def": "edible snail usually served in the shell with a sauce of melted butter and garlic", "synonyms": ["escargot"], "image_count": 1, "id": 426, "frequency": "r", "synset": "escargot.n.01"}, {"name": "eyepatch", "instance_count": 9, "def": "a protective cloth covering for an injured eye", "synonyms": ["eyepatch"], "image_count": 7, "id": 427, "frequency": "r", "synset": "eyepatch.n.01"}, {"name": "falcon", "instance_count": 3, "def": "birds of prey having long pointed powerful wings adapted for swift flight", "synonyms": ["falcon"], "image_count": 3, "id": 428, "frequency": "r", "synset": "falcon.n.01"}, {"name": "fan", "instance_count": 737, "def": "a device for creating a current of air by movement of a surface or surfaces", "synonyms": ["fan"], "image_count": 575, "id": 429, "frequency": "f", "synset": "fan.n.01"}, {"name": "faucet", "instance_count": 3185, "def": "a regulator for controlling the flow of a liquid from a reservoir", "synonyms": ["faucet", "spigot", "tap"], "image_count": 1907, "id": 430, "frequency": "f", "synset": "faucet.n.01"}, {"name": "fedora", "instance_count": 14, "def": "a hat made of felt with a creased crown", "synonyms": ["fedora"], "image_count": 8, "id": 431, "frequency": "r", "synset": "fedora.n.01"}, {"name": "ferret", "instance_count": 5, "def": "domesticated albino variety of the European polecat bred for hunting rats and rabbits", "synonyms": ["ferret"], "image_count": 4, "id": 432, "frequency": "r", "synset": "ferret.n.02"}, {"name": "Ferris_wheel", "instance_count": 32, "def": "a large wheel with suspended seats that remain upright as the wheel rotates", "synonyms": ["Ferris_wheel"], "image_count": 32, "id": 433, "frequency": "c", "synset": "ferris_wheel.n.01"}, {"name": "ferry", "instance_count": 17, "def": "a boat that transports people or vehicles across a body of water and operates on a regular schedule", "synonyms": ["ferry", "ferryboat"], "image_count": 11, "id": 434, "frequency": "c", "synset": "ferry.n.01"}, {"name": "fig_(fruit)", "instance_count": 147, "def": "fleshy sweet pear-shaped yellowish or 
purple fruit eaten fresh or preserved or dried", "synonyms": ["fig_(fruit)"], "image_count": 4, "id": 435, "frequency": "r", "synset": "fig.n.04"}, {"name": "fighter_jet", "instance_count": 115, "def": "a high-speed military or naval airplane designed to destroy enemy targets", "synonyms": ["fighter_jet", "fighter_aircraft", "attack_aircraft"], "image_count": 54, "id": 436, "frequency": "c", "synset": "fighter.n.02"}, {"name": "figurine", "instance_count": 1056, "def": "a small carved or molded figure", "synonyms": ["figurine"], "image_count": 202, "id": 437, "frequency": "f", "synset": "figurine.n.01"}, {"name": "file_cabinet", "instance_count": 53, "def": "office furniture consisting of a container for keeping papers in order", "synonyms": ["file_cabinet", "filing_cabinet"], "image_count": 32, "id": 438, "frequency": "c", "synset": "file.n.03"}, {"name": "file_(tool)", "instance_count": 3, "def": "a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal", "synonyms": ["file_(tool)"], "image_count": 3, "id": 439, "frequency": "r", "synset": "file.n.04"}, {"name": "fire_alarm", "instance_count": 151, "def": "an alarm that is tripped off by fire or smoke", "synonyms": ["fire_alarm", "smoke_alarm"], "image_count": 130, "id": 440, "frequency": "f", "synset": "fire_alarm.n.02"}, {"name": "fire_engine", "instance_count": 179, "def": "large trucks that carry firefighters and equipment to the site of a fire", "synonyms": ["fire_engine", "fire_truck"], "image_count": 119, "id": 441, "frequency": "f", "synset": "fire_engine.n.01"}, {"name": "fire_extinguisher", "instance_count": 165, "def": "a manually operated device for extinguishing small fires", "synonyms": ["fire_extinguisher", "extinguisher"], "image_count": 141, "id": 442, "frequency": "f", "synset": "fire_extinguisher.n.01"}, {"name": "fire_hose", "instance_count": 67, "def": "a large hose that carries water from a fire hydrant to the site of the fire", "synonyms": ["fire_hose"], "image_count": 29, "id": 443, "frequency": "c", "synset": "fire_hose.n.01"}, {"name": "fireplace", "instance_count": 530, "def": "an open recess in a wall at the base of a chimney where a fire can be built", "synonyms": ["fireplace"], "image_count": 525, "id": 444, "frequency": "f", "synset": "fireplace.n.01"}, {"name": "fireplug", "instance_count": 1458, "def": "an upright hydrant for drawing water to use in fighting a fire", "synonyms": ["fireplug", "fire_hydrant", "hydrant"], "image_count": 1323, "id": 445, "frequency": "f", "synset": "fireplug.n.01"}, {"name": "first-aid_kit", "instance_count": 2, "def": "kit consisting of a set of bandages and medicines for giving first aid", "synonyms": ["first-aid_kit"], "image_count": 2, "id": 446, "frequency": "r", "synset": "first-aid_kit.n.01"}, {"name": "fish", "instance_count": 525, "def": "any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills", "synonyms": ["fish"], "image_count": 113, "id": 447, "frequency": "f", "synset": "fish.n.01"}, {"name": "fish_(food)", "instance_count": 96, "def": "the flesh of fish used as food", "synonyms": ["fish_(food)"], "image_count": 16, "id": 448, "frequency": "c", "synset": "fish.n.02"}, {"name": "fishbowl", "instance_count": 33, "def": "a transparent bowl in which small fish are kept", "synonyms": ["fishbowl", "goldfish_bowl"], "image_count": 7, "id": 449, "frequency": "r", "synset": "fishbowl.n.02"}, {"name": "fishing_rod", "instance_count": 84, "def": "a rod that is used in fishing 
to extend the fishing line", "synonyms": ["fishing_rod", "fishing_pole"], "image_count": 35, "id": 450, "frequency": "c", "synset": "fishing_rod.n.01"}, {"name": "flag", "instance_count": 7007, "def": "emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)", "synonyms": ["flag"], "image_count": 1908, "id": 451, "frequency": "f", "synset": "flag.n.01"}, {"name": "flagpole", "instance_count": 1082, "def": "a tall staff or pole on which a flag is raised", "synonyms": ["flagpole", "flagstaff"], "image_count": 353, "id": 452, "frequency": "f", "synset": "flagpole.n.02"}, {"name": "flamingo", "instance_count": 309, "def": "large pink web-footed bird with down-bent bill", "synonyms": ["flamingo"], "image_count": 18, "id": 453, "frequency": "c", "synset": "flamingo.n.01"}, {"name": "flannel", "instance_count": 18, "def": "a soft light woolen fabric; used for clothing", "synonyms": ["flannel"], "image_count": 14, "id": 454, "frequency": "c", "synset": "flannel.n.01"}, {"name": "flap", "instance_count": 218, "def": "any broad thin covering attached at one edge, such as a mud flap next to a wheel or a flap on an airplane wing", "synonyms": ["flap"], "image_count": 77, "id": 455, "frequency": "c", "synset": "flap.n.01"}, {"name": "flash", "instance_count": 10, "def": "a lamp for providing momentary light to take a photograph", "synonyms": ["flash", "flashbulb"], "image_count": 8, "id": 456, "frequency": "r", "synset": "flash.n.10"}, {"name": "flashlight", "instance_count": 48, "def": "a small portable battery-powered electric lamp", "synonyms": ["flashlight", "torch"], "image_count": 37, "id": 457, "frequency": "c", "synset": "flashlight.n.01"}, {"name": "fleece", "instance_count": 2, "def": "a soft bulky fabric with deep pile; used chiefly for clothing", "synonyms": ["fleece"], "image_count": 1, "id": 458, "frequency": "r", "synset": "fleece.n.03"}, {"name": "flip-flop_(sandal)", "instance_count": 1103, "def": "a backless sandal held to the foot by a thong between two toes", "synonyms": ["flip-flop_(sandal)"], "image_count": 346, "id": 459, "frequency": "f", "synset": "flip-flop.n.02"}, {"name": "flipper_(footwear)", "instance_count": 49, "def": "a shoe to aid a person in swimming", "synonyms": ["flipper_(footwear)", "fin_(footwear)"], "image_count": 19, "id": 460, "frequency": "c", "synset": "flipper.n.01"}, {"name": "flower_arrangement", "instance_count": 3960, "def": "a decorative arrangement of flowers", "synonyms": ["flower_arrangement", "floral_arrangement"], "image_count": 1779, "id": 461, "frequency": "f", "synset": "flower_arrangement.n.01"}, {"name": "flute_glass", "instance_count": 86, "def": "a tall narrow wineglass", "synonyms": ["flute_glass", "champagne_flute"], "image_count": 23, "id": 462, "frequency": "c", "synset": "flute.n.02"}, {"name": "foal", "instance_count": 30, "def": "a young horse", "synonyms": ["foal"], "image_count": 25, "id": 463, "frequency": "c", "synset": "foal.n.01"}, {"name": "folding_chair", "instance_count": 303, "def": "a chair that can be folded flat for storage", "synonyms": ["folding_chair"], "image_count": 67, "id": 464, "frequency": "c", "synset": "folding_chair.n.01"}, {"name": "food_processor", "instance_count": 22, "def": "a kitchen appliance for shredding, blending, chopping, or slicing food", "synonyms": ["food_processor"], "image_count": 19, "id": 465, "frequency": "c", "synset": "food_processor.n.01"}, {"name": "football_(American)", "instance_count": 35, "def": "the inflated oblong ball used in playing 
American football", "synonyms": ["football_(American)"], "image_count": 28, "id": 466, "frequency": "c", "synset": "football.n.02"}, {"name": "football_helmet", "instance_count": 7, "def": "a padded helmet with a face mask to protect the head of football players", "synonyms": ["football_helmet"], "image_count": 4, "id": 467, "frequency": "r", "synset": "football_helmet.n.01"}, {"name": "footstool", "instance_count": 41, "def": "a low seat or a stool to rest the feet of a seated person", "synonyms": ["footstool", "footrest"], "image_count": 27, "id": 468, "frequency": "c", "synset": "footstool.n.01"}, {"name": "fork", "instance_count": 3137, "def": "cutlery used for serving and eating food", "synonyms": ["fork"], "image_count": 1861, "id": 469, "frequency": "f", "synset": "fork.n.01"}, {"name": "forklift", "instance_count": 14, "def": "an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them", "synonyms": ["forklift"], "image_count": 11, "id": 470, "frequency": "c", "synset": "forklift.n.01"}, {"name": "freight_car", "instance_count": 121, "def": "a railway car that carries freight", "synonyms": ["freight_car"], "image_count": 13, "id": 471, "frequency": "c", "synset": "freight_car.n.01"}, {"name": "French_toast", "instance_count": 41, "def": "bread slice dipped in egg and milk and fried", "synonyms": ["French_toast"], "image_count": 13, "id": 472, "frequency": "c", "synset": "french_toast.n.01"}, {"name": "freshener", "instance_count": 39, "def": "anything that freshens air by removing or covering odor", "synonyms": ["freshener", "air_freshener"], "image_count": 32, "id": 473, "frequency": "c", "synset": "freshener.n.01"}, {"name": "frisbee", "instance_count": 2332, "def": "a light, plastic disk propelled with a flip of the wrist for recreation or competition", "synonyms": ["frisbee"], "image_count": 1767, "id": 474, "frequency": "f", "synset": "frisbee.n.01"}, {"name": "frog", "instance_count": 84, "def": "a tailless stout-bodied amphibians with long hind limbs for leaping", "synonyms": ["frog", "toad", "toad_frog"], "image_count": 42, "id": 475, "frequency": "c", "synset": "frog.n.01"}, {"name": "fruit_juice", "instance_count": 37, "def": "drink produced by squeezing or crushing fruit", "synonyms": ["fruit_juice"], "image_count": 17, "id": 476, "frequency": "c", "synset": "fruit_juice.n.01"}, {"name": "frying_pan", "instance_count": 310, "def": "a pan used for frying foods", "synonyms": ["frying_pan", "frypan", "skillet"], "image_count": 128, "id": 477, "frequency": "f", "synset": "frying_pan.n.01"}, {"name": "fudge", "instance_count": 4, "def": "soft creamy candy", "synonyms": ["fudge"], "image_count": 1, "id": 478, "frequency": "r", "synset": "fudge.n.01"}, {"name": "funnel", "instance_count": 9, "def": "a cone-shaped utensil used to channel a substance into a container with a small mouth", "synonyms": ["funnel"], "image_count": 9, "id": 479, "frequency": "r", "synset": "funnel.n.02"}, {"name": "futon", "instance_count": 11, "def": "a pad that is used for sleeping on the floor or on a raised frame", "synonyms": ["futon"], "image_count": 10, "id": 480, "frequency": "r", "synset": "futon.n.01"}, {"name": "gag", "instance_count": 4, "def": "restraint put into a person's mouth to prevent speaking or shouting", "synonyms": ["gag", "muzzle"], "image_count": 4, "id": 481, "frequency": "r", "synset": "gag.n.02"}, {"name": "garbage", "instance_count": 18, "def": "a receptacle where waste can be discarded", "synonyms": ["garbage"], 
"image_count": 9, "id": 482, "frequency": "r", "synset": "garbage.n.03"}, {"name": "garbage_truck", "instance_count": 18, "def": "a truck for collecting domestic refuse", "synonyms": ["garbage_truck"], "image_count": 18, "id": 483, "frequency": "c", "synset": "garbage_truck.n.01"}, {"name": "garden_hose", "instance_count": 50, "def": "a hose used for watering a lawn or garden", "synonyms": ["garden_hose"], "image_count": 41, "id": 484, "frequency": "c", "synset": "garden_hose.n.01"}, {"name": "gargle", "instance_count": 38, "def": "a medicated solution used for gargling and rinsing the mouth", "synonyms": ["gargle", "mouthwash"], "image_count": 28, "id": 485, "frequency": "c", "synset": "gargle.n.01"}, {"name": "gargoyle", "instance_count": 8, "def": "an ornament consisting of a grotesquely carved figure of a person or animal", "synonyms": ["gargoyle"], "image_count": 3, "id": 486, "frequency": "r", "synset": "gargoyle.n.02"}, {"name": "garlic", "instance_count": 487, "def": "aromatic bulb used as seasoning", "synonyms": ["garlic", "ail"], "image_count": 65, "id": 487, "frequency": "c", "synset": "garlic.n.02"}, {"name": "gasmask", "instance_count": 12, "def": "a protective face mask with a filter", "synonyms": ["gasmask", "respirator", "gas_helmet"], "image_count": 9, "id": 488, "frequency": "r", "synset": "gasmask.n.01"}, {"name": "gazelle", "instance_count": 82, "def": "small swift graceful antelope of Africa and Asia having lustrous eyes", "synonyms": ["gazelle"], "image_count": 23, "id": 489, "frequency": "c", "synset": "gazelle.n.01"}, {"name": "gelatin", "instance_count": 248, "def": "an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods", "synonyms": ["gelatin", "jelly"], "image_count": 24, "id": 490, "frequency": "c", "synset": "gelatin.n.02"}, {"name": "gemstone", "instance_count": 2, "def": "a crystalline rock that can be cut and polished for jewelry", "synonyms": ["gemstone"], "image_count": 1, "id": 491, "frequency": "r", "synset": "gem.n.02"}, {"name": "generator", "instance_count": 2, "def": "engine that converts mechanical energy into electrical energy by electromagnetic induction", "synonyms": ["generator"], "image_count": 2, "id": 492, "frequency": "r", "synset": "generator.n.02"}, {"name": "giant_panda", "instance_count": 112, "def": "large black-and-white herbivorous mammal of bamboo forests of China and Tibet", "synonyms": ["giant_panda", "panda", "panda_bear"], "image_count": 59, "id": 493, "frequency": "c", "synset": "giant_panda.n.01"}, {"name": "gift_wrap", "instance_count": 247, "def": "attractive wrapping paper suitable for wrapping gifts", "synonyms": ["gift_wrap"], "image_count": 48, "id": 494, "frequency": "c", "synset": "gift_wrap.n.01"}, {"name": "ginger", "instance_count": 93, "def": "the root of the common ginger plant; used fresh as a seasoning", "synonyms": ["ginger", "gingerroot"], "image_count": 17, "id": 495, "frequency": "c", "synset": "ginger.n.03"}, {"name": "giraffe", "instance_count": 3923, "def": "tall animal having a spotted coat and small horns and very long neck and legs", "synonyms": ["giraffe"], "image_count": 1877, "id": 496, "frequency": "f", "synset": "giraffe.n.01"}, {"name": "cincture", "instance_count": 56, "def": "a band of material around the waist that strengthens a skirt or trousers", "synonyms": ["cincture", "sash", "waistband", "waistcloth"], "image_count": 18, "id": 497, "frequency": "c", "synset": "girdle.n.02"}, {"name": "glass_(drink_container)", "instance_count": 6420, "def": "a 
container for holding liquids while drinking", "synonyms": ["glass_(drink_container)", "drinking_glass"], "image_count": 1920, "id": 498, "frequency": "f", "synset": "glass.n.02"}, {"name": "globe", "instance_count": 59, "def": "a sphere on which a map (especially of the earth) is represented", "synonyms": ["globe"], "image_count": 50, "id": 499, "frequency": "c", "synset": "globe.n.03"}, {"name": "glove", "instance_count": 5951, "def": "handwear covering the hand", "synonyms": ["glove"], "image_count": 1890, "id": 500, "frequency": "f", "synset": "glove.n.02"}, {"name": "goat", "instance_count": 842, "def": "a common goat", "synonyms": ["goat"], "image_count": 99, "id": 501, "frequency": "c", "synset": "goat.n.01"}, {"name": "goggles", "instance_count": 3202, "def": "tight-fitting spectacles worn to protect the eyes", "synonyms": ["goggles"], "image_count": 1530, "id": 502, "frequency": "f", "synset": "goggles.n.01"}, {"name": "goldfish", "instance_count": 11, "def": "small golden or orange-red freshwater fishes used as pond or aquarium pets", "synonyms": ["goldfish"], "image_count": 3, "id": 503, "frequency": "r", "synset": "goldfish.n.01"}, {"name": "golf_club", "instance_count": 14, "def": "golf equipment used by a golfer to hit a golf ball", "synonyms": ["golf_club", "golf-club"], "image_count": 11, "id": 504, "frequency": "c", "synset": "golf_club.n.02"}, {"name": "golfcart", "instance_count": 25, "def": "a small motor vehicle in which golfers can ride between shots", "synonyms": ["golfcart"], "image_count": 19, "id": 505, "frequency": "c", "synset": "golfcart.n.01"}, {"name": "gondola_(boat)", "instance_count": 8, "def": "long narrow flat-bottomed boat propelled by sculling; traditionally used on canals of Venice", "synonyms": ["gondola_(boat)"], "image_count": 3, "id": 506, "frequency": "r", "synset": "gondola.n.02"}, {"name": "goose", "instance_count": 413, "def": "loud, web-footed long-necked aquatic birds usually larger than ducks", "synonyms": ["goose"], "image_count": 63, "id": 507, "frequency": "c", "synset": "goose.n.01"}, {"name": "gorilla", "instance_count": 10, "def": "largest ape", "synonyms": ["gorilla"], "image_count": 5, "id": 508, "frequency": "r", "synset": "gorilla.n.01"}, {"name": "gourd", "instance_count": 101, "def": "any of numerous inedible fruits with hard rinds", "synonyms": ["gourd"], "image_count": 6, "id": 509, "frequency": "r", "synset": "gourd.n.02"}, {"name": "grape", "instance_count": 6377, "def": "any of various juicy fruit with green or purple skins; grow in clusters", "synonyms": ["grape"], "image_count": 233, "id": 510, "frequency": "f", "synset": "grape.n.01"}, {"name": "grater", "instance_count": 64, "def": "utensil with sharp perforations for shredding foods (as vegetables or cheese)", "synonyms": ["grater"], "image_count": 54, "id": 511, "frequency": "c", "synset": "grater.n.01"}, {"name": "gravestone", "instance_count": 778, "def": "a stone that is used to mark a grave", "synonyms": ["gravestone", "headstone", "tombstone"], "image_count": 36, "id": 512, "frequency": "c", "synset": "gravestone.n.01"}, {"name": "gravy_boat", "instance_count": 10, "def": "a dish (often boat-shaped) for serving gravy or sauce", "synonyms": ["gravy_boat", "gravy_holder"], "image_count": 10, "id": 513, "frequency": "r", "synset": "gravy_boat.n.01"}, {"name": "green_bean", "instance_count": 2571, "def": "a common bean plant cultivated for its slender green edible pods", "synonyms": ["green_bean"], "image_count": 124, "id": 514, "frequency": "f", "synset": 
"green_bean.n.02"}, {"name": "green_onion", "instance_count": 1618, "def": "a young onion before the bulb has enlarged", "synonyms": ["green_onion", "spring_onion", "scallion"], "image_count": 101, "id": 515, "frequency": "f", "synset": "green_onion.n.01"}, {"name": "griddle", "instance_count": 4, "def": "cooking utensil consisting of a flat heated surface on which food is cooked", "synonyms": ["griddle"], "image_count": 3, "id": 516, "frequency": "r", "synset": "griddle.n.01"}, {"name": "grill", "instance_count": 747, "def": "a framework of metal bars used as a partition or a grate", "synonyms": ["grill", "grille", "grillwork", "radiator_grille"], "image_count": 363, "id": 517, "frequency": "f", "synset": "grill.n.02"}, {"name": "grits", "instance_count": 3, "def": "coarsely ground corn boiled as a breakfast dish", "synonyms": ["grits", "hominy_grits"], "image_count": 3, "id": 518, "frequency": "r", "synset": "grits.n.01"}, {"name": "grizzly", "instance_count": 44, "def": "powerful brownish-yellow bear of the uplands of western North America", "synonyms": ["grizzly", "grizzly_bear"], "image_count": 30, "id": 519, "frequency": "c", "synset": "grizzly.n.01"}, {"name": "grocery_bag", "instance_count": 46, "def": "a sack for holding customer's groceries", "synonyms": ["grocery_bag"], "image_count": 18, "id": 520, "frequency": "c", "synset": "grocery_bag.n.01"}, {"name": "guitar", "instance_count": 315, "def": "a stringed instrument usually having six strings; played by strumming or plucking", "synonyms": ["guitar"], "image_count": 199, "id": 521, "frequency": "f", "synset": "guitar.n.01"}, {"name": "gull", "instance_count": 1398, "def": "mostly white aquatic bird having long pointed wings and short legs", "synonyms": ["gull", "seagull"], "image_count": 97, "id": 522, "frequency": "c", "synset": "gull.n.02"}, {"name": "gun", "instance_count": 68, "def": "a weapon that discharges a bullet at high velocity from a metal tube", "synonyms": ["gun"], "image_count": 32, "id": 523, "frequency": "c", "synset": "gun.n.01"}, {"name": "hairbrush", "instance_count": 165, "def": "a brush used to groom a person's hair", "synonyms": ["hairbrush"], "image_count": 121, "id": 524, "frequency": "f", "synset": "hairbrush.n.01"}, {"name": "hairnet", "instance_count": 53, "def": "a small net that someone wears over their hair to keep it in place", "synonyms": ["hairnet"], "image_count": 16, "id": 525, "frequency": "c", "synset": "hairnet.n.01"}, {"name": "hairpin", "instance_count": 20, "def": "a double pronged pin used to hold women's hair in place", "synonyms": ["hairpin"], "image_count": 12, "id": 526, "frequency": "c", "synset": "hairpin.n.01"}, {"name": "halter_top", "instance_count": 3, "def": "a woman's top that fastens behind the back and neck leaving the back and arms uncovered", "synonyms": ["halter_top"], "image_count": 2, "id": 527, "frequency": "r", "synset": "halter.n.03"}, {"name": "ham", "instance_count": 1765, "def": "meat cut from the thigh of a hog (usually smoked)", "synonyms": ["ham", "jambon", "gammon"], "image_count": 214, "id": 528, "frequency": "f", "synset": "ham.n.01"}, {"name": "hamburger", "instance_count": 126, "def": "a sandwich consisting of a patty of minced beef served on a bun", "synonyms": ["hamburger", "beefburger", "burger"], "image_count": 48, "id": 529, "frequency": "c", "synset": "hamburger.n.01"}, {"name": "hammer", "instance_count": 41, "def": "a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking", "synonyms": ["hammer"], "image_count": 
26, "id": 530, "frequency": "c", "synset": "hammer.n.02"}, {"name": "hammock", "instance_count": 15, "def": "a hanging bed of canvas or rope netting (usually suspended between two trees)", "synonyms": ["hammock"], "image_count": 13, "id": 531, "frequency": "c", "synset": "hammock.n.02"}, {"name": "hamper", "instance_count": 5, "def": "a basket usually with a cover", "synonyms": ["hamper"], "image_count": 4, "id": 532, "frequency": "r", "synset": "hamper.n.02"}, {"name": "hamster", "instance_count": 12, "def": "short-tailed burrowing rodent with large cheek pouches", "synonyms": ["hamster"], "image_count": 11, "id": 533, "frequency": "c", "synset": "hamster.n.01"}, {"name": "hair_dryer", "instance_count": 144, "def": "a hand-held electric blower that can blow warm air onto the hair", "synonyms": ["hair_dryer"], "image_count": 123, "id": 534, "frequency": "f", "synset": "hand_blower.n.01"}, {"name": "hand_glass", "instance_count": 7, "def": "a mirror intended to be held in the hand", "synonyms": ["hand_glass", "hand_mirror"], "image_count": 7, "id": 535, "frequency": "r", "synset": "hand_glass.n.01"}, {"name": "hand_towel", "instance_count": 619, "def": "a small towel used to dry the hands or face", "synonyms": ["hand_towel", "face_towel"], "image_count": 200, "id": 536, "frequency": "f", "synset": "hand_towel.n.01"}, {"name": "handcart", "instance_count": 204, "def": "wheeled vehicle that can be pushed by a person", "synonyms": ["handcart", "pushcart", "hand_truck"], "image_count": 91, "id": 537, "frequency": "c", "synset": "handcart.n.01"}, {"name": "handcuff", "instance_count": 10, "def": "shackle that consists of a metal loop that can be locked around the wrist", "synonyms": ["handcuff"], "image_count": 9, "id": 538, "frequency": "r", "synset": "handcuff.n.01"}, {"name": "handkerchief", "instance_count": 86, "def": "a square piece of cloth used for wiping the eyes or nose or as a costume accessory", "synonyms": ["handkerchief"], "image_count": 72, "id": 539, "frequency": "c", "synset": "handkerchief.n.01"}, {"name": "handle", "instance_count": 8314, "def": "the appendage to an object that is designed to be held in order to use or move it", "synonyms": ["handle", "grip", "handgrip"], "image_count": 1886, "id": 540, "frequency": "f", "synset": "handle.n.01"}, {"name": "handsaw", "instance_count": 5, "def": "a saw used with one hand for cutting wood", "synonyms": ["handsaw", "carpenter's_saw"], "image_count": 4, "id": 541, "frequency": "r", "synset": "handsaw.n.01"}, {"name": "hardback_book", "instance_count": 2, "def": "a book with cardboard or cloth or leather covers", "synonyms": ["hardback_book", "hardcover_book"], "image_count": 1, "id": 542, "frequency": "r", "synset": "hardback.n.01"}, {"name": "harmonium", "instance_count": 2, "def": "a free-reed instrument in which air is forced through the reeds by bellows", "synonyms": ["harmonium", "organ_(musical_instrument)", "reed_organ_(musical_instrument)"], "image_count": 1, "id": 543, "frequency": "r", "synset": "harmonium.n.01"}, {"name": "hat", "instance_count": 7213, "def": "headwear that protects the head from bad weather, sun, or worn for fashion", "synonyms": ["hat"], "image_count": 1932, "id": 544, "frequency": "f", "synset": "hat.n.01"}, {"name": "hatbox", "instance_count": 7, "def": "a round piece of luggage for carrying hats", "synonyms": ["hatbox"], "image_count": 4, "id": 545, "frequency": "r", "synset": "hatbox.n.01"}, {"name": "veil", "instance_count": 57, "def": "a garment that covers the head OR face", "synonyms": 
["veil"], "image_count": 56, "id": 546, "frequency": "c", "synset": "head_covering.n.01"}, {"name": "headband", "instance_count": 1114, "def": "a band worn around or over the head", "synonyms": ["headband"], "image_count": 854, "id": 547, "frequency": "f", "synset": "headband.n.01"}, {"name": "headboard", "instance_count": 850, "def": "a vertical board or panel forming the head of a bedstead", "synonyms": ["headboard"], "image_count": 755, "id": 548, "frequency": "f", "synset": "headboard.n.01"}, {"name": "headlight", "instance_count": 7326, "def": "a powerful light with reflector; attached to the front of an automobile or locomotive", "synonyms": ["headlight", "headlamp"], "image_count": 1843, "id": 549, "frequency": "f", "synset": "headlight.n.01"}, {"name": "headscarf", "instance_count": 235, "def": "a kerchief worn over the head and tied under the chin", "synonyms": ["headscarf"], "image_count": 96, "id": 550, "frequency": "c", "synset": "headscarf.n.01"}, {"name": "headset", "instance_count": 10, "def": "receiver consisting of a pair of headphones", "synonyms": ["headset"], "image_count": 7, "id": 551, "frequency": "r", "synset": "headset.n.01"}, {"name": "headstall_(for_horses)", "instance_count": 133, "def": "the band that is the part of a bridle that fits around a horse's head", "synonyms": ["headstall_(for_horses)", "headpiece_(for_horses)"], "image_count": 74, "id": 552, "frequency": "c", "synset": "headstall.n.01"}, {"name": "heart", "instance_count": 347, "def": "a muscular organ; its contractions move the blood through the body", "synonyms": ["heart"], "image_count": 66, "id": 553, "frequency": "c", "synset": "heart.n.02"}, {"name": "heater", "instance_count": 64, "def": "device that heats water or supplies warmth to a room", "synonyms": ["heater", "warmer"], "image_count": 57, "id": 554, "frequency": "c", "synset": "heater.n.01"}, {"name": "helicopter", "instance_count": 68, "def": "an aircraft without wings that obtains its lift from the rotation of overhead blades", "synonyms": ["helicopter"], "image_count": 44, "id": 555, "frequency": "c", "synset": "helicopter.n.01"}, {"name": "helmet", "instance_count": 4845, "def": "a protective headgear made of hard material to resist blows", "synonyms": ["helmet"], "image_count": 1905, "id": 556, "frequency": "f", "synset": "helmet.n.02"}, {"name": "heron", "instance_count": 6, "def": "grey or white wading bird with long neck and long legs and (usually) long bill", "synonyms": ["heron"], "image_count": 4, "id": 557, "frequency": "r", "synset": "heron.n.02"}, {"name": "highchair", "instance_count": 98, "def": "a chair for feeding a very young child", "synonyms": ["highchair", "feeding_chair"], "image_count": 90, "id": 558, "frequency": "c", "synset": "highchair.n.01"}, {"name": "hinge", "instance_count": 5283, "def": "a joint that holds two parts together so that one can swing relative to the other", "synonyms": ["hinge"], "image_count": 1635, "id": 559, "frequency": "f", "synset": "hinge.n.01"}, {"name": "hippopotamus", "instance_count": 24, "def": "massive thick-skinned animal living in or around rivers of tropical Africa", "synonyms": ["hippopotamus"], "image_count": 8, "id": 560, "frequency": "r", "synset": "hippopotamus.n.01"}, {"name": "hockey_stick", "instance_count": 15, "def": "sports implement consisting of a stick used by hockey players to move the puck", "synonyms": ["hockey_stick"], "image_count": 5, "id": 561, "frequency": "r", "synset": "hockey_stick.n.01"}, {"name": "hog", "instance_count": 73, "def": "domestic swine", 
"synonyms": ["hog", "pig"], "image_count": 50, "id": 562, "frequency": "c", "synset": "hog.n.03"}, {"name": "home_plate_(baseball)", "instance_count": 551, "def": "(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score", "synonyms": ["home_plate_(baseball)", "home_base_(baseball)"], "image_count": 545, "id": 563, "frequency": "f", "synset": "home_plate.n.01"}, {"name": "honey", "instance_count": 90, "def": "a sweet yellow liquid produced by bees", "synonyms": ["honey"], "image_count": 20, "id": 564, "frequency": "c", "synset": "honey.n.01"}, {"name": "fume_hood", "instance_count": 208, "def": "metal covering leading to a vent that exhausts smoke or fumes", "synonyms": ["fume_hood", "exhaust_hood"], "image_count": 193, "id": 565, "frequency": "f", "synset": "hood.n.06"}, {"name": "hook", "instance_count": 1157, "def": "a curved or bent implement for suspending or pulling something", "synonyms": ["hook"], "image_count": 285, "id": 566, "frequency": "f", "synset": "hook.n.05"}, {"name": "hookah", "instance_count": 3, "def": "a tobacco pipe with a long flexible tube connected to a container where the smoke is cooled by passing through water", "synonyms": ["hookah", "narghile", "nargileh", "sheesha", "shisha", "water_pipe"], "image_count": 3, "id": 567, "frequency": "r", "synset": "hookah.n.01"}, {"name": "hornet", "instance_count": 1, "def": "large stinging wasp", "synonyms": ["hornet"], "image_count": 1, "id": 568, "frequency": "r", "synset": "hornet.n.01"}, {"name": "horse", "instance_count": 4744, "def": "a common horse", "synonyms": ["horse"], "image_count": 1904, "id": 569, "frequency": "f", "synset": "horse.n.01"}, {"name": "hose", "instance_count": 610, "def": "a flexible pipe for conveying a liquid or gas", "synonyms": ["hose", "hosepipe"], "image_count": 294, "id": 570, "frequency": "f", "synset": "hose.n.03"}, {"name": "hot-air_balloon", "instance_count": 4, "def": "balloon for travel through the air in a basket suspended below a large bag of heated air", "synonyms": ["hot-air_balloon"], "image_count": 3, "id": 571, "frequency": "r", "synset": "hot-air_balloon.n.01"}, {"name": "hotplate", "instance_count": 6, "def": "a portable electric appliance for heating or cooking or keeping food warm", "synonyms": ["hotplate"], "image_count": 5, "id": 572, "frequency": "r", "synset": "hot_plate.n.01"}, {"name": "hot_sauce", "instance_count": 70, "def": "a pungent peppery sauce", "synonyms": ["hot_sauce"], "image_count": 24, "id": 573, "frequency": "c", "synset": "hot_sauce.n.01"}, {"name": "hourglass", "instance_count": 2, "def": "a sandglass timer that runs for sixty minutes", "synonyms": ["hourglass"], "image_count": 2, "id": 574, "frequency": "r", "synset": "hourglass.n.01"}, {"name": "houseboat", "instance_count": 4, "def": "a barge that is designed and equipped for use as a dwelling", "synonyms": ["houseboat"], "image_count": 2, "id": 575, "frequency": "r", "synset": "houseboat.n.01"}, {"name": "hummingbird", "instance_count": 18, "def": "tiny American bird having brilliant iridescent plumage and long slender bills", "synonyms": ["hummingbird"], "image_count": 16, "id": 576, "frequency": "c", "synset": "hummingbird.n.01"}, {"name": "hummus", "instance_count": 9, "def": "a thick spread made from mashed chickpeas", "synonyms": ["hummus", "humus", "hommos", "hoummos", "humous"], "image_count": 8, "id": 577, "frequency": "r", "synset": "hummus.n.01"}, {"name": "polar_bear", "instance_count": 196, "def": "white bear of Arctic regions", 
"synonyms": ["polar_bear"], "image_count": 154, "id": 578, "frequency": "f", "synset": "ice_bear.n.01"}, {"name": "icecream", "instance_count": 180, "def": "frozen dessert containing cream and sugar and flavoring", "synonyms": ["icecream"], "image_count": 66, "id": 579, "frequency": "c", "synset": "ice_cream.n.01"}, {"name": "popsicle", "instance_count": 1, "def": "ice cream or water ice on a small wooden stick", "synonyms": ["popsicle"], "image_count": 1, "id": 580, "frequency": "r", "synset": "ice_lolly.n.01"}, {"name": "ice_maker", "instance_count": 26, "def": "an appliance included in some electric refrigerators for making ice cubes", "synonyms": ["ice_maker"], "image_count": 24, "id": 581, "frequency": "c", "synset": "ice_maker.n.01"}, {"name": "ice_pack", "instance_count": 4, "def": "a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling", "synonyms": ["ice_pack", "ice_bag"], "image_count": 1, "id": 582, "frequency": "r", "synset": "ice_pack.n.01"}, {"name": "ice_skate", "instance_count": 14, "def": "skate consisting of a boot with a steel blade fitted to the sole", "synonyms": ["ice_skate"], "image_count": 4, "id": 583, "frequency": "r", "synset": "ice_skate.n.01"}, {"name": "igniter", "instance_count": 77, "def": "a substance or device used to start a fire", "synonyms": ["igniter", "ignitor", "lighter"], "image_count": 75, "id": 584, "frequency": "c", "synset": "igniter.n.01"}, {"name": "inhaler", "instance_count": 7, "def": "a dispenser that produces a chemical vapor to be inhaled through mouth or nose", "synonyms": ["inhaler", "inhalator"], "image_count": 6, "id": 585, "frequency": "r", "synset": "inhaler.n.01"}, {"name": "iPod", "instance_count": 172, "def": "a pocket-sized device used to play music files", "synonyms": ["iPod"], "image_count": 126, "id": 586, "frequency": "f", "synset": "ipod.n.01"}, {"name": "iron_(for_clothing)", "instance_count": 38, "def": "home appliance consisting of a flat metal base that is heated and used to smooth cloth", "synonyms": ["iron_(for_clothing)", "smoothing_iron_(for_clothing)"], "image_count": 24, "id": 587, "frequency": "c", "synset": "iron.n.04"}, {"name": "ironing_board", "instance_count": 24, "def": "narrow padded board on collapsible supports; used for ironing clothes", "synonyms": ["ironing_board"], "image_count": 22, "id": 588, "frequency": "c", "synset": "ironing_board.n.01"}, {"name": "jacket", "instance_count": 8013, "def": "a waist-length coat", "synonyms": ["jacket"], "image_count": 1872, "id": 589, "frequency": "f", "synset": "jacket.n.01"}, {"name": "jam", "instance_count": 29, "def": "preserve of crushed fruit", "synonyms": ["jam"], "image_count": 16, "id": 590, "frequency": "c", "synset": "jam.n.01"}, {"name": "jar", "instance_count": 2002, "def": "a vessel (usually cylindrical) with a wide mouth and without handles", "synonyms": ["jar"], "image_count": 423, "id": 591, "frequency": "f", "synset": "jar.n.01"}, {"name": "jean", "instance_count": 5421, "def": "(usually plural) close-fitting trousers of heavy denim for manual work or casual wear", "synonyms": ["jean", "blue_jean", "denim"], "image_count": 1927, "id": 592, "frequency": "f", "synset": "jean.n.01"}, {"name": "jeep", "instance_count": 55, "def": "a car suitable for traveling over rough terrain", "synonyms": ["jeep", "landrover"], "image_count": 38, "id": 593, "frequency": "c", "synset": "jeep.n.01"}, {"name": "jelly_bean", "instance_count": 116, "def": "sugar-glazed jellied candy", "synonyms": ["jelly_bean", 
"jelly_egg"], "image_count": 3, "id": 594, "frequency": "r", "synset": "jelly_bean.n.01"}, {"name": "jersey", "instance_count": 8117, "def": "a close-fitting pullover shirt", "synonyms": ["jersey", "T-shirt", "tee_shirt"], "image_count": 1945, "id": 595, "frequency": "f", "synset": "jersey.n.03"}, {"name": "jet_plane", "instance_count": 87, "def": "an airplane powered by one or more jet engines", "synonyms": ["jet_plane", "jet-propelled_plane"], "image_count": 35, "id": 596, "frequency": "c", "synset": "jet.n.01"}, {"name": "jewel", "instance_count": 1, "def": "a precious or semiprecious stone incorporated into a piece of jewelry", "synonyms": ["jewel", "gem", "precious_stone"], "image_count": 1, "id": 597, "frequency": "r", "synset": "jewel.n.01"}, {"name": "jewelry", "instance_count": 51, "def": "an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)", "synonyms": ["jewelry", "jewellery"], "image_count": 13, "id": 598, "frequency": "c", "synset": "jewelry.n.01"}, {"name": "joystick", "instance_count": 12, "def": "a control device for computers consisting of a vertical handle that can move freely in two directions", "synonyms": ["joystick"], "image_count": 9, "id": 599, "frequency": "r", "synset": "joystick.n.02"}, {"name": "jumpsuit", "instance_count": 21, "def": "one-piece garment fashioned after a parachutist's uniform", "synonyms": ["jumpsuit"], "image_count": 14, "id": 600, "frequency": "c", "synset": "jump_suit.n.01"}, {"name": "kayak", "instance_count": 124, "def": "a small canoe consisting of a light frame made watertight with animal skins", "synonyms": ["kayak"], "image_count": 37, "id": 601, "frequency": "c", "synset": "kayak.n.01"}, {"name": "keg", "instance_count": 6, "def": "small cask or barrel", "synonyms": ["keg"], "image_count": 3, "id": 602, "frequency": "r", "synset": "keg.n.02"}, {"name": "kennel", "instance_count": 4, "def": "outbuilding that serves as a shelter for a dog", "synonyms": ["kennel", "doghouse"], "image_count": 4, "id": 603, "frequency": "r", "synset": "kennel.n.01"}, {"name": "kettle", "instance_count": 130, "def": "a metal pot for stewing or boiling; usually has a lid", "synonyms": ["kettle", "boiler"], "image_count": 100, "id": 604, "frequency": "c", "synset": "kettle.n.01"}, {"name": "key", "instance_count": 447, "def": "metal instrument used to unlock a lock", "synonyms": ["key"], "image_count": 195, "id": 605, "frequency": "f", "synset": "key.n.01"}, {"name": "keycard", "instance_count": 1, "def": "a plastic card used to gain access typically to a door", "synonyms": ["keycard"], "image_count": 1, "id": 606, "frequency": "r", "synset": "keycard.n.01"}, {"name": "kilt", "instance_count": 19, "def": "a knee-length pleated tartan skirt worn by men as part of the traditional dress in the Highlands of northern Scotland", "synonyms": ["kilt"], "image_count": 12, "id": 607, "frequency": "c", "synset": "kilt.n.01"}, {"name": "kimono", "instance_count": 38, "def": "a loose robe; imitated from robes originally worn by Japanese", "synonyms": ["kimono"], "image_count": 24, "id": 608, "frequency": "c", "synset": "kimono.n.01"}, {"name": "kitchen_sink", "instance_count": 519, "def": "a sink in a kitchen", "synonyms": ["kitchen_sink"], "image_count": 489, "id": 609, "frequency": "f", "synset": "kitchen_sink.n.01"}, {"name": "kitchen_table", "instance_count": 11, "def": "a table in the kitchen", "synonyms": ["kitchen_table"], "image_count": 10, "id": 610, "frequency": "r", "synset": "kitchen_table.n.01"}, 
{"name": "kite", "instance_count": 11174, "def": "plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string", "synonyms": ["kite"], "image_count": 1689, "id": 611, "frequency": "f", "synset": "kite.n.03"}, {"name": "kitten", "instance_count": 60, "def": "young domestic cat", "synonyms": ["kitten", "kitty"], "image_count": 42, "id": 612, "frequency": "c", "synset": "kitten.n.01"}, {"name": "kiwi_fruit", "instance_count": 702, "def": "fuzzy brown egg-shaped fruit with slightly tart green flesh", "synonyms": ["kiwi_fruit"], "image_count": 81, "id": 613, "frequency": "c", "synset": "kiwi.n.03"}, {"name": "knee_pad", "instance_count": 1765, "def": "protective garment consisting of a pad worn by football or baseball or hockey players", "synonyms": ["knee_pad"], "image_count": 894, "id": 614, "frequency": "f", "synset": "knee_pad.n.01"}, {"name": "knife", "instance_count": 3515, "def": "tool with a blade and point used as a cutting instrument", "synonyms": ["knife"], "image_count": 1868, "id": 615, "frequency": "f", "synset": "knife.n.01"}, {"name": "knitting_needle", "instance_count": 16, "def": "needle consisting of a slender rod with pointed ends; usually used in pairs", "synonyms": ["knitting_needle"], "image_count": 7, "id": 616, "frequency": "r", "synset": "knitting_needle.n.01"}, {"name": "knob", "instance_count": 8432, "def": "a round handle often found on a door", "synonyms": ["knob"], "image_count": 1567, "id": 617, "frequency": "f", "synset": "knob.n.02"}, {"name": "knocker_(on_a_door)", "instance_count": 10, "def": "a device (usually metal and ornamental) attached by a hinge to a door", "synonyms": ["knocker_(on_a_door)", "doorknocker"], "image_count": 10, "id": 618, "frequency": "r", "synset": "knocker.n.05"}, {"name": "koala", "instance_count": 15, "def": "sluggish tailless Australian marsupial with grey furry ears and coat", "synonyms": ["koala", "koala_bear"], "image_count": 8, "id": 619, "frequency": "r", "synset": "koala.n.01"}, {"name": "lab_coat", "instance_count": 42, "def": "a light coat worn to protect clothing from substances used while working in a laboratory", "synonyms": ["lab_coat", "laboratory_coat"], "image_count": 7, "id": 620, "frequency": "r", "synset": "lab_coat.n.01"}, {"name": "ladder", "instance_count": 975, "def": "steps consisting of two parallel members connected by rungs", "synonyms": ["ladder"], "image_count": 629, "id": 621, "frequency": "f", "synset": "ladder.n.01"}, {"name": "ladle", "instance_count": 226, "def": "a spoon-shaped vessel with a long handle frequently used to transfer liquids", "synonyms": ["ladle"], "image_count": 89, "id": 622, "frequency": "c", "synset": "ladle.n.01"}, {"name": "ladybug", "instance_count": 68, "def": "small round bright-colored and spotted beetle, typically red and black", "synonyms": ["ladybug", "ladybeetle", "ladybird_beetle"], "image_count": 15, "id": 623, "frequency": "c", "synset": "ladybug.n.01"}, {"name": "lamb_(animal)", "instance_count": 618, "def": "young sheep", "synonyms": ["lamb_(animal)"], "image_count": 134, "id": 624, "frequency": "f", "synset": "lamb.n.01"}, {"name": "lamb-chop", "instance_count": 8, "def": "chop cut from a lamb", "synonyms": ["lamb-chop", "lambchop"], "image_count": 4, "id": 625, "frequency": "r", "synset": "lamb_chop.n.01"}, {"name": "lamp", "instance_count": 4139, "def": "a piece of furniture holding one or more electric light bulbs", "synonyms": ["lamp"], "image_count": 1802, "id": 626, "frequency": "f", "synset": "lamp.n.02"}, {"name": 
"lamppost", "instance_count": 2234, "def": "a metal post supporting an outdoor lamp (such as a streetlight)", "synonyms": ["lamppost"], "image_count": 595, "id": 627, "frequency": "f", "synset": "lamppost.n.01"}, {"name": "lampshade", "instance_count": 2475, "def": "a protective ornamental shade used to screen a light bulb from direct view", "synonyms": ["lampshade"], "image_count": 1210, "id": 628, "frequency": "f", "synset": "lampshade.n.01"}, {"name": "lantern", "instance_count": 364, "def": "light in a transparent protective case", "synonyms": ["lantern"], "image_count": 48, "id": 629, "frequency": "c", "synset": "lantern.n.01"}, {"name": "lanyard", "instance_count": 1065, "def": "a cord worn around the neck to hold a knife or whistle, etc.", "synonyms": ["lanyard", "laniard"], "image_count": 418, "id": 630, "frequency": "f", "synset": "lanyard.n.02"}, {"name": "laptop_computer", "instance_count": 2852, "def": "a portable computer small enough to use in your lap", "synonyms": ["laptop_computer", "notebook_computer"], "image_count": 1846, "id": 631, "frequency": "f", "synset": "laptop.n.01"}, {"name": "lasagna", "instance_count": 7, "def": "baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables", "synonyms": ["lasagna", "lasagne"], "image_count": 5, "id": 632, "frequency": "r", "synset": "lasagna.n.01"}, {"name": "latch", "instance_count": 702, "def": "a bar that can be lowered or slid into a groove to fasten a door or gate", "synonyms": ["latch"], "image_count": 221, "id": 633, "frequency": "f", "synset": "latch.n.02"}, {"name": "lawn_mower", "instance_count": 12, "def": "garden tool for mowing grass on lawns", "synonyms": ["lawn_mower"], "image_count": 10, "id": 634, "frequency": "r", "synset": "lawn_mower.n.01"}, {"name": "leather", "instance_count": 20, "def": "an animal skin made smooth and flexible by removing the hair and then tanning", "synonyms": ["leather"], "image_count": 7, "id": 635, "frequency": "r", "synset": "leather.n.01"}, {"name": "legging_(clothing)", "instance_count": 154, "def": "a garment covering the leg (usually extending from the knee to the ankle)", "synonyms": ["legging_(clothing)", "leging_(clothing)", "leg_covering"], "image_count": 76, "id": 636, "frequency": "c", "synset": "legging.n.01"}, {"name": "Lego", "instance_count": 331, "def": "a child's plastic construction set for making models from blocks", "synonyms": ["Lego", "Lego_set"], "image_count": 22, "id": 637, "frequency": "c", "synset": "lego.n.01"}, {"name": "legume", "instance_count": 333, "def": "the fruit or seed of bean or pea plants", "synonyms": ["legume"], "image_count": 10, "id": 638, "frequency": "r", "synset": "legume.n.02"}, {"name": "lemon", "instance_count": 2168, "def": "yellow oval fruit with juicy acidic flesh", "synonyms": ["lemon"], "image_count": 341, "id": 639, "frequency": "f", "synset": "lemon.n.01"}, {"name": "lemonade", "instance_count": 2, "def": "sweetened beverage of diluted lemon juice", "synonyms": ["lemonade"], "image_count": 1, "id": 640, "frequency": "r", "synset": "lemonade.n.01"}, {"name": "lettuce", "instance_count": 5500, "def": "leafy plant commonly eaten in salad or on sandwiches", "synonyms": ["lettuce"], "image_count": 705, "id": 641, "frequency": "f", "synset": "lettuce.n.02"}, {"name": "license_plate", "instance_count": 4392, "def": "a plate mounted on the front and back of car and bearing the car's registration number", "synonyms": ["license_plate", "numberplate"], "image_count": 1900, "id": 642, "frequency": "f", "synset": 
"license_plate.n.01"}, {"name": "life_buoy", "instance_count": 524, "def": "a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)", "synonyms": ["life_buoy", "lifesaver", "life_belt", "life_ring"], "image_count": 188, "id": 643, "frequency": "f", "synset": "life_buoy.n.01"}, {"name": "life_jacket", "instance_count": 689, "def": "life preserver consisting of a sleeveless jacket of buoyant or inflatable design", "synonyms": ["life_jacket", "life_vest"], "image_count": 227, "id": 644, "frequency": "f", "synset": "life_jacket.n.01"}, {"name": "lightbulb", "instance_count": 7075, "def": "lightblub/source of light", "synonyms": ["lightbulb"], "image_count": 861, "id": 645, "frequency": "f", "synset": "light_bulb.n.01"}, {"name": "lightning_rod", "instance_count": 6, "def": "a metallic conductor that is attached to a high point and leads to the ground", "synonyms": ["lightning_rod", "lightning_conductor"], "image_count": 6, "id": 646, "frequency": "r", "synset": "lightning_rod.n.02"}, {"name": "lime", "instance_count": 1134, "def": "the green acidic fruit of any of various lime trees", "synonyms": ["lime"], "image_count": 115, "id": 647, "frequency": "f", "synset": "lime.n.06"}, {"name": "limousine", "instance_count": 6, "def": "long luxurious car; usually driven by a chauffeur", "synonyms": ["limousine"], "image_count": 5, "id": 648, "frequency": "r", "synset": "limousine.n.01"}, {"name": "lion", "instance_count": 69, "def": "large gregarious predatory cat of Africa and India", "synonyms": ["lion"], "image_count": 43, "id": 649, "frequency": "c", "synset": "lion.n.01"}, {"name": "lip_balm", "instance_count": 29, "def": "a balm applied to the lips", "synonyms": ["lip_balm"], "image_count": 14, "id": 650, "frequency": "c", "synset": "lip_balm.n.01"}, {"name": "liquor", "instance_count": 66, "def": "liquor or beer", "synonyms": ["liquor", "spirits", "hard_liquor", "liqueur", "cordial"], "image_count": 6, "id": 651, "frequency": "r", "synset": "liquor.n.01"}, {"name": "lizard", "instance_count": 22, "def": "a reptile with usually two pairs of legs and a tapering tail", "synonyms": ["lizard"], "image_count": 15, "id": 652, "frequency": "c", "synset": "lizard.n.01"}, {"name": "log", "instance_count": 7363, "def": "a segment of the trunk of a tree when stripped of branches", "synonyms": ["log"], "image_count": 1167, "id": 653, "frequency": "f", "synset": "log.n.01"}, {"name": "lollipop", "instance_count": 59, "def": "hard candy on a stick", "synonyms": ["lollipop"], "image_count": 15, "id": 654, "frequency": "c", "synset": "lollipop.n.02"}, {"name": "speaker_(stero_equipment)", "instance_count": 2029, "def": "electronic device that produces sound often as part of a stereo system", "synonyms": ["speaker_(stero_equipment)"], "image_count": 994, "id": 655, "frequency": "f", "synset": "loudspeaker.n.01"}, {"name": "loveseat", "instance_count": 41, "def": "small sofa that seats two people", "synonyms": ["loveseat"], "image_count": 28, "id": 656, "frequency": "c", "synset": "love_seat.n.01"}, {"name": "machine_gun", "instance_count": 5, "def": "a rapidly firing automatic gun", "synonyms": ["machine_gun"], "image_count": 2, "id": 657, "frequency": "r", "synset": "machine_gun.n.01"}, {"name": "magazine", "instance_count": 1379, "def": "a paperback periodic publication", "synonyms": ["magazine"], "image_count": 338, "id": 658, "frequency": "f", "synset": "magazine.n.02"}, {"name": "magnet", "instance_count": 5638, "def": "a device that attracts iron and produces a magnetic field", 
"synonyms": ["magnet"], "image_count": 334, "id": 659, "frequency": "f", "synset": "magnet.n.01"}, {"name": "mail_slot", "instance_count": 16, "def": "a slot (usually in a door) through which mail can be delivered", "synonyms": ["mail_slot"], "image_count": 15, "id": 660, "frequency": "c", "synset": "mail_slot.n.01"}, {"name": "mailbox_(at_home)", "instance_count": 240, "def": "a private box for delivery of mail", "synonyms": ["mailbox_(at_home)", "letter_box_(at_home)"], "image_count": 102, "id": 661, "frequency": "f", "synset": "mailbox.n.01"}, {"name": "mallard", "instance_count": 2, "def": "wild dabbling duck from which domestic ducks are descended", "synonyms": ["mallard"], "image_count": 1, "id": 662, "frequency": "r", "synset": "mallard.n.01"}, {"name": "mallet", "instance_count": 16, "def": "a sports implement with a long handle and a hammer-like head used to hit a ball", "synonyms": ["mallet"], "image_count": 8, "id": 663, "frequency": "r", "synset": "mallet.n.01"}, {"name": "mammoth", "instance_count": 2, "def": "any of numerous extinct elephants widely distributed in the Pleistocene", "synonyms": ["mammoth"], "image_count": 1, "id": 664, "frequency": "r", "synset": "mammoth.n.01"}, {"name": "manatee", "instance_count": 1, "def": "sirenian mammal of tropical coastal waters of America", "synonyms": ["manatee"], "image_count": 1, "id": 665, "frequency": "r", "synset": "manatee.n.01"}, {"name": "mandarin_orange", "instance_count": 401, "def": "a somewhat flat reddish-orange loose skinned citrus of China", "synonyms": ["mandarin_orange"], "image_count": 28, "id": 666, "frequency": "c", "synset": "mandarin.n.05"}, {"name": "manger", "instance_count": 126, "def": "a container (usually in a barn or stable) from which cattle or horses feed", "synonyms": ["manger", "trough"], "image_count": 91, "id": 667, "frequency": "c", "synset": "manger.n.01"}, {"name": "manhole", "instance_count": 445, "def": "a hole (usually with a flush cover) through which a person can gain access to an underground structure", "synonyms": ["manhole"], "image_count": 260, "id": 668, "frequency": "f", "synset": "manhole.n.01"}, {"name": "map", "instance_count": 186, "def": "a diagrammatic representation of the earth's surface (or part of it)", "synonyms": ["map"], "image_count": 131, "id": 669, "frequency": "f", "synset": "map.n.01"}, {"name": "marker", "instance_count": 501, "def": "a writing implement for making a mark", "synonyms": ["marker"], "image_count": 128, "id": 670, "frequency": "f", "synset": "marker.n.03"}, {"name": "martini", "instance_count": 3, "def": "a cocktail made of gin (or vodka) with dry vermouth", "synonyms": ["martini"], "image_count": 3, "id": 671, "frequency": "r", "synset": "martini.n.01"}, {"name": "mascot", "instance_count": 10, "def": "a person or animal that is adopted by a team or other group as a symbolic figure", "synonyms": ["mascot"], "image_count": 10, "id": 672, "frequency": "r", "synset": "mascot.n.01"}, {"name": "mashed_potato", "instance_count": 58, "def": "potato that has been peeled and boiled and then mashed", "synonyms": ["mashed_potato"], "image_count": 39, "id": 673, "frequency": "c", "synset": "mashed_potato.n.01"}, {"name": "masher", "instance_count": 2, "def": "a kitchen utensil used for mashing (e.g. 
potatoes)", "synonyms": ["masher"], "image_count": 2, "id": 674, "frequency": "r", "synset": "masher.n.02"}, {"name": "mask", "instance_count": 1595, "def": "a protective covering worn over the face", "synonyms": ["mask", "facemask"], "image_count": 925, "id": 675, "frequency": "f", "synset": "mask.n.04"}, {"name": "mast", "instance_count": 2985, "def": "a vertical spar for supporting sails", "synonyms": ["mast"], "image_count": 354, "id": 676, "frequency": "f", "synset": "mast.n.01"}, {"name": "mat_(gym_equipment)", "instance_count": 114, "def": "sports equipment consisting of a piece of thick padding on the floor for gymnastics", "synonyms": ["mat_(gym_equipment)", "gym_mat"], "image_count": 31, "id": 677, "frequency": "c", "synset": "mat.n.03"}, {"name": "matchbox", "instance_count": 11, "def": "a box for holding matches", "synonyms": ["matchbox"], "image_count": 10, "id": 678, "frequency": "r", "synset": "matchbox.n.01"}, {"name": "mattress", "instance_count": 354, "def": "a thick pad filled with resilient material used as a bed or part of a bed", "synonyms": ["mattress"], "image_count": 215, "id": 679, "frequency": "f", "synset": "mattress.n.01"}, {"name": "measuring_cup", "instance_count": 139, "def": "graduated cup used to measure liquid or granular ingredients", "synonyms": ["measuring_cup"], "image_count": 71, "id": 680, "frequency": "c", "synset": "measuring_cup.n.01"}, {"name": "measuring_stick", "instance_count": 57, "def": "measuring instrument having a sequence of marks at regular intervals", "synonyms": ["measuring_stick", "ruler_(measuring_stick)", "measuring_rod"], "image_count": 43, "id": 681, "frequency": "c", "synset": "measuring_stick.n.01"}, {"name": "meatball", "instance_count": 174, "def": "ground meat formed into a ball and fried or simmered in broth", "synonyms": ["meatball"], "image_count": 28, "id": 682, "frequency": "c", "synset": "meatball.n.01"}, {"name": "medicine", "instance_count": 243, "def": "something that treats or prevents or alleviates the symptoms of disease", "synonyms": ["medicine"], "image_count": 34, "id": 683, "frequency": "c", "synset": "medicine.n.02"}, {"name": "melon", "instance_count": 167, "def": "fruit of the gourd family having a hard rind and sweet juicy flesh", "synonyms": ["melon"], "image_count": 16, "id": 684, "frequency": "c", "synset": "melon.n.01"}, {"name": "microphone", "instance_count": 435, "def": "device for converting sound waves into electrical energy", "synonyms": ["microphone"], "image_count": 273, "id": 685, "frequency": "f", "synset": "microphone.n.01"}, {"name": "microscope", "instance_count": 3, "def": "magnifier of the image of small objects", "synonyms": ["microscope"], "image_count": 2, "id": 686, "frequency": "r", "synset": "microscope.n.01"}, {"name": "microwave_oven", "instance_count": 1105, "def": "kitchen appliance that cooks food by passing an electromagnetic wave through it", "synonyms": ["microwave_oven"], "image_count": 999, "id": 687, "frequency": "f", "synset": "microwave.n.02"}, {"name": "milestone", "instance_count": 5, "def": "stone post at side of a road to show distances", "synonyms": ["milestone", "milepost"], "image_count": 4, "id": 688, "frequency": "r", "synset": "milestone.n.01"}, {"name": "milk", "instance_count": 227, "def": "a white nutritious liquid secreted by mammals and used as food by human beings", "synonyms": ["milk"], "image_count": 107, "id": 689, "frequency": "f", "synset": "milk.n.01"}, {"name": "milk_can", "instance_count": 8, "def": "can for transporting milk", "synonyms": 
["milk_can"], "image_count": 2, "id": 690, "frequency": "r", "synset": "milk_can.n.01"}, {"name": "milkshake", "instance_count": 1, "def": "frothy drink of milk and flavoring and sometimes fruit or ice cream", "synonyms": ["milkshake"], "image_count": 1, "id": 691, "frequency": "r", "synset": "milkshake.n.01"}, {"name": "minivan", "instance_count": 1046, "def": "a small box-shaped passenger van", "synonyms": ["minivan"], "image_count": 454, "id": 692, "frequency": "f", "synset": "minivan.n.01"}, {"name": "mint_candy", "instance_count": 27, "def": "a candy that is flavored with a mint oil", "synonyms": ["mint_candy"], "image_count": 9, "id": 693, "frequency": "r", "synset": "mint.n.05"}, {"name": "mirror", "instance_count": 3490, "def": "polished surface that forms images by reflecting light", "synonyms": ["mirror"], "image_count": 1901, "id": 694, "frequency": "f", "synset": "mirror.n.01"}, {"name": "mitten", "instance_count": 156, "def": "glove that encases the thumb separately and the other four fingers together", "synonyms": ["mitten"], "image_count": 61, "id": 695, "frequency": "c", "synset": "mitten.n.01"}, {"name": "mixer_(kitchen_tool)", "instance_count": 108, "def": "a kitchen utensil that is used for mixing foods", "synonyms": ["mixer_(kitchen_tool)", "stand_mixer"], "image_count": 91, "id": 696, "frequency": "c", "synset": "mixer.n.04"}, {"name": "money", "instance_count": 122, "def": "the official currency issued by a government or national bank", "synonyms": ["money"], "image_count": 46, "id": 697, "frequency": "c", "synset": "money.n.03"}, {"name": "monitor_(computer_equipment) computer_monitor", "instance_count": 2955, "def": "a computer monitor", "synonyms": ["monitor_(computer_equipment) computer_monitor"], "image_count": 1402, "id": 698, "frequency": "f", "synset": "monitor.n.04"}, {"name": "monkey", "instance_count": 166, "def": "any of various long-tailed primates", "synonyms": ["monkey"], "image_count": 74, "id": 699, "frequency": "c", "synset": "monkey.n.01"}, {"name": "motor", "instance_count": 985, "def": "machine that converts other forms of energy into mechanical energy and so imparts motion", "synonyms": ["motor"], "image_count": 421, "id": 700, "frequency": "f", "synset": "motor.n.01"}, {"name": "motor_scooter", "instance_count": 720, "def": "a wheeled vehicle with small wheels and a low-powered engine", "synonyms": ["motor_scooter", "scooter"], "image_count": 226, "id": 701, "frequency": "f", "synset": "motor_scooter.n.01"}, {"name": "motor_vehicle", "instance_count": 64, "def": "a self-propelled wheeled vehicle that does not run on rails", "synonyms": ["motor_vehicle", "automotive_vehicle"], "image_count": 10, "id": 702, "frequency": "r", "synset": "motor_vehicle.n.01"}, {"name": "motorcycle", "instance_count": 5247, "def": "a motor vehicle with two wheels and a strong frame", "synonyms": ["motorcycle"], "image_count": 1720, "id": 703, "frequency": "f", "synset": "motorcycle.n.01"}, {"name": "mound_(baseball)", "instance_count": 269, "def": "(baseball) the slight elevation on which the pitcher stands", "synonyms": ["mound_(baseball)", "pitcher's_mound"], "image_count": 261, "id": 704, "frequency": "f", "synset": "mound.n.01"}, {"name": "mouse_(computer_equipment)", "instance_count": 1832, "def": "a computer input device that controls an on-screen pointer (does not include trackpads / touchpads)", "synonyms": ["mouse_(computer_equipment)", "computer_mouse"], "image_count": 1337, "id": 705, "frequency": "f", "synset": "mouse.n.04"}, {"name": "mousepad", 
"instance_count": 333, "def": "a small portable pad that provides an operating surface for a computer mouse", "synonyms": ["mousepad"], "image_count": 293, "id": 706, "frequency": "f", "synset": "mousepad.n.01"}, {"name": "muffin", "instance_count": 352, "def": "a sweet quick bread baked in a cup-shaped pan", "synonyms": ["muffin"], "image_count": 62, "id": 707, "frequency": "c", "synset": "muffin.n.01"}, {"name": "mug", "instance_count": 1785, "def": "with handle and usually cylindrical", "synonyms": ["mug"], "image_count": 814, "id": 708, "frequency": "f", "synset": "mug.n.04"}, {"name": "mushroom", "instance_count": 6257, "def": "a common mushroom", "synonyms": ["mushroom"], "image_count": 407, "id": 709, "frequency": "f", "synset": "mushroom.n.02"}, {"name": "music_stool", "instance_count": 6, "def": "a stool for piano players; usually adjustable in height", "synonyms": ["music_stool", "piano_stool"], "image_count": 6, "id": 710, "frequency": "r", "synset": "music_stool.n.01"}, {"name": "musical_instrument", "instance_count": 33, "def": "any of various devices or contrivances that can be used to produce musical tones or sounds", "synonyms": ["musical_instrument", "instrument_(musical)"], "image_count": 16, "id": 711, "frequency": "c", "synset": "musical_instrument.n.01"}, {"name": "nailfile", "instance_count": 10, "def": "a small flat file for shaping the nails", "synonyms": ["nailfile"], "image_count": 7, "id": 712, "frequency": "r", "synset": "nailfile.n.01"}, {"name": "napkin", "instance_count": 3979, "def": "a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing", "synonyms": ["napkin", "table_napkin", "serviette"], "image_count": 1791, "id": 713, "frequency": "f", "synset": "napkin.n.01"}, {"name": "neckerchief", "instance_count": 4, "def": "a kerchief worn around the neck", "synonyms": ["neckerchief"], "image_count": 2, "id": 714, "frequency": "r", "synset": "neckerchief.n.01"}, {"name": "necklace", "instance_count": 2709, "def": "jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament", "synonyms": ["necklace"], "image_count": 1915, "id": 715, "frequency": "f", "synset": "necklace.n.01"}, {"name": "necktie", "instance_count": 4069, "def": "neckwear consisting of a long narrow piece of material worn under a collar and tied in knot at the front", "synonyms": ["necktie", "tie_(necktie)"], "image_count": 1940, "id": 716, "frequency": "f", "synset": "necktie.n.01"}, {"name": "needle", "instance_count": 61, "def": "a sharp pointed implement (usually metal)", "synonyms": ["needle"], "image_count": 13, "id": 717, "frequency": "c", "synset": "needle.n.03"}, {"name": "nest", "instance_count": 20, "def": "a structure in which animals lay eggs or give birth to their young", "synonyms": ["nest"], "image_count": 16, "id": 718, "frequency": "c", "synset": "nest.n.01"}, {"name": "newspaper", "instance_count": 1179, "def": "a daily or weekly publication on folded sheets containing news, articles, and advertisements", "synonyms": ["newspaper", "paper_(newspaper)"], "image_count": 448, "id": 719, "frequency": "f", "synset": "newspaper.n.01"}, {"name": "newsstand", "instance_count": 39, "def": "a stall where newspapers and other periodicals are sold", "synonyms": ["newsstand"], "image_count": 12, "id": 720, "frequency": "c", "synset": "newsstand.n.01"}, {"name": "nightshirt", "instance_count": 35, "def": "garments designed to be worn in bed", "synonyms": ["nightshirt", "nightwear", "sleepwear", 
"nightclothes"], "image_count": 18, "id": 721, "frequency": "c", "synset": "nightwear.n.01"}, {"name": "nosebag_(for_animals)", "instance_count": 4, "def": "a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head", "synonyms": ["nosebag_(for_animals)", "feedbag"], "image_count": 4, "id": 722, "frequency": "r", "synset": "nosebag.n.01"}, {"name": "noseband_(for_animals)", "instance_count": 120, "def": "a strap that is the part of a bridle that goes over the animal's nose", "synonyms": ["noseband_(for_animals)", "nosepiece_(for_animals)"], "image_count": 71, "id": 723, "frequency": "c", "synset": "noseband.n.01"}, {"name": "notebook", "instance_count": 290, "def": "a book with blank pages for recording notes or memoranda", "synonyms": ["notebook"], "image_count": 189, "id": 724, "frequency": "f", "synset": "notebook.n.01"}, {"name": "notepad", "instance_count": 187, "def": "a pad of paper for keeping notes", "synonyms": ["notepad"], "image_count": 74, "id": 725, "frequency": "c", "synset": "notepad.n.01"}, {"name": "nut", "instance_count": 790, "def": "a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt", "synonyms": ["nut"], "image_count": 103, "id": 726, "frequency": "f", "synset": "nut.n.03"}, {"name": "nutcracker", "instance_count": 7, "def": "a hand tool used to crack nuts open", "synonyms": ["nutcracker"], "image_count": 3, "id": 727, "frequency": "r", "synset": "nutcracker.n.01"}, {"name": "oar", "instance_count": 488, "def": "an implement used to propel or steer a boat", "synonyms": ["oar"], "image_count": 110, "id": 728, "frequency": "f", "synset": "oar.n.01"}, {"name": "octopus_(food)", "instance_count": 5, "def": "tentacles of octopus prepared as food", "synonyms": ["octopus_(food)"], "image_count": 5, "id": 729, "frequency": "r", "synset": "octopus.n.01"}, {"name": "octopus_(animal)", "instance_count": 17, "def": "bottom-living cephalopod having a soft oval body with eight long tentacles", "synonyms": ["octopus_(animal)"], "image_count": 9, "id": 730, "frequency": "r", "synset": "octopus.n.02"}, {"name": "oil_lamp", "instance_count": 28, "def": "a lamp that burns oil (as kerosine) for light", "synonyms": ["oil_lamp", "kerosene_lamp", "kerosine_lamp"], "image_count": 15, "id": 731, "frequency": "c", "synset": "oil_lamp.n.01"}, {"name": "olive_oil", "instance_count": 36, "def": "oil from olives", "synonyms": ["olive_oil"], "image_count": 25, "id": 732, "frequency": "c", "synset": "olive_oil.n.01"}, {"name": "omelet", "instance_count": 10, "def": "beaten eggs cooked until just set; may be folded around e.g. 
ham or cheese or jelly", "synonyms": ["omelet", "omelette"], "image_count": 7, "id": 733, "frequency": "r", "synset": "omelet.n.01"}, {"name": "onion", "instance_count": 9779, "def": "the bulb of an onion plant", "synonyms": ["onion"], "image_count": 647, "id": 734, "frequency": "f", "synset": "onion.n.01"}, {"name": "orange_(fruit)", "instance_count": 13034, "def": "orange (FRUIT of an orange tree)", "synonyms": ["orange_(fruit)"], "image_count": 824, "id": 735, "frequency": "f", "synset": "orange.n.01"}, {"name": "orange_juice", "instance_count": 223, "def": "bottled or freshly squeezed juice of oranges", "synonyms": ["orange_juice"], "image_count": 100, "id": 736, "frequency": "c", "synset": "orange_juice.n.01"}, {"name": "ostrich", "instance_count": 71, "def": "fast-running African flightless bird with two-toed feet; largest living bird", "synonyms": ["ostrich"], "image_count": 47, "id": 737, "frequency": "c", "synset": "ostrich.n.02"}, {"name": "ottoman", "instance_count": 157, "def": "a thick standalone cushion used as a seat or footrest, often next to a chair", "synonyms": ["ottoman", "pouf", "pouffe", "hassock"], "image_count": 121, "id": 738, "frequency": "f", "synset": "ottoman.n.03"}, {"name": "oven", "instance_count": 929, "def": "kitchen appliance used for baking or roasting", "synonyms": ["oven"], "image_count": 731, "id": 739, "frequency": "f", "synset": "oven.n.01"}, {"name": "overalls_(clothing)", "instance_count": 76, "def": "work clothing consisting of denim trousers usually with a bib and shoulder straps", "synonyms": ["overalls_(clothing)"], "image_count": 73, "id": 740, "frequency": "c", "synset": "overall.n.01"}, {"name": "owl", "instance_count": 73, "def": "nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes", "synonyms": ["owl"], "image_count": 49, "id": 741, "frequency": "c", "synset": "owl.n.01"}, {"name": "packet", "instance_count": 109, "def": "a small package or bundle", "synonyms": ["packet"], "image_count": 23, "id": 742, "frequency": "c", "synset": "packet.n.03"}, {"name": "inkpad", "instance_count": 12, "def": "absorbent material saturated with ink used to transfer ink evenly to a rubber stamp", "synonyms": ["inkpad", "inking_pad", "stamp_pad"], "image_count": 4, "id": 743, "frequency": "r", "synset": "pad.n.03"}, {"name": "pad", "instance_count": 264, "def": "mostly arm/knee pads labeled", "synonyms": ["pad"], "image_count": 62, "id": 744, "frequency": "c", "synset": "pad.n.04"}, {"name": "paddle", "instance_count": 306, "def": "a short light oar used without an oarlock to propel a canoe or small boat", "synonyms": ["paddle", "boat_paddle"], "image_count": 118, "id": 745, "frequency": "f", "synset": "paddle.n.04"}, {"name": "padlock", "instance_count": 184, "def": "a detachable, portable lock", "synonyms": ["padlock"], "image_count": 99, "id": 746, "frequency": "c", "synset": "padlock.n.01"}, {"name": "paintbrush", "instance_count": 91, "def": "a brush used as an applicator to apply paint", "synonyms": ["paintbrush"], "image_count": 40, "id": 747, "frequency": "c", "synset": "paintbrush.n.01"}, {"name": "painting", "instance_count": 2645, "def": "graphic art consisting of an artistic composition made by applying paints to a surface", "synonyms": ["painting"], "image_count": 1036, "id": 748, "frequency": "f", "synset": "painting.n.01"}, {"name": "pajamas", "instance_count": 163, "def": "loose-fitting nightclothes worn for sleeping or lounging", "synonyms": ["pajamas", "pyjamas"], "image_count": 105, "id": 749, 
"frequency": "f", "synset": "pajama.n.02"}, {"name": "palette", "instance_count": 68, "def": "board that provides a flat surface on which artists mix paints and the range of colors used", "synonyms": ["palette", "pallet"], "image_count": 21, "id": 750, "frequency": "c", "synset": "palette.n.02"}, {"name": "pan_(for_cooking)", "instance_count": 643, "def": "cooking utensil consisting of a wide metal vessel", "synonyms": ["pan_(for_cooking)", "cooking_pan"], "image_count": 229, "id": 751, "frequency": "f", "synset": "pan.n.01"}, {"name": "pan_(metal_container)", "instance_count": 21, "def": "shallow container made of metal", "synonyms": ["pan_(metal_container)"], "image_count": 7, "id": 752, "frequency": "r", "synset": "pan.n.03"}, {"name": "pancake", "instance_count": 295, "def": "a flat cake of thin batter fried on both sides on a griddle", "synonyms": ["pancake"], "image_count": 72, "id": 753, "frequency": "c", "synset": "pancake.n.01"}, {"name": "pantyhose", "instance_count": 11, "def": "a woman's tights consisting of underpants and stockings", "synonyms": ["pantyhose"], "image_count": 9, "id": 754, "frequency": "r", "synset": "pantyhose.n.01"}, {"name": "papaya", "instance_count": 206, "def": "large oval melon-like tropical fruit with yellowish flesh", "synonyms": ["papaya"], "image_count": 10, "id": 755, "frequency": "r", "synset": "papaya.n.02"}, {"name": "paper_plate", "instance_count": 957, "def": "a disposable plate made of cardboard", "synonyms": ["paper_plate"], "image_count": 328, "id": 756, "frequency": "f", "synset": "paper_plate.n.01"}, {"name": "paper_towel", "instance_count": 600, "def": "a disposable towel made of absorbent paper", "synonyms": ["paper_towel"], "image_count": 468, "id": 757, "frequency": "f", "synset": "paper_towel.n.01"}, {"name": "paperback_book", "instance_count": 3, "def": "a book with paper covers", "synonyms": ["paperback_book", "paper-back_book", "softback_book", "soft-cover_book"], "image_count": 1, "id": 758, "frequency": "r", "synset": "paperback_book.n.01"}, {"name": "paperweight", "instance_count": 4, "def": "a weight used to hold down a stack of papers", "synonyms": ["paperweight"], "image_count": 2, "id": 759, "frequency": "r", "synset": "paperweight.n.01"}, {"name": "parachute", "instance_count": 61, "def": "rescue equipment consisting of a device that fills with air and retards your fall", "synonyms": ["parachute"], "image_count": 24, "id": 760, "frequency": "c", "synset": "parachute.n.01"}, {"name": "parakeet", "instance_count": 46, "def": "any of numerous small slender long-tailed parrots", "synonyms": ["parakeet", "parrakeet", "parroket", "paraquet", "paroquet", "parroquet"], "image_count": 11, "id": 761, "frequency": "c", "synset": "parakeet.n.01"}, {"name": "parasail_(sports)", "instance_count": 385, "def": "parachute that will lift a person up into the air when it is towed by a motorboat or a car", "synonyms": ["parasail_(sports)"], "image_count": 72, "id": 762, "frequency": "c", "synset": "parasail.n.01"}, {"name": "parasol", "instance_count": 45, "def": "a handheld collapsible source of shade", "synonyms": ["parasol", "sunshade"], "image_count": 17, "id": 763, "frequency": "c", "synset": "parasol.n.01"}, {"name": "parchment", "instance_count": 17, "def": "a superior paper resembling sheepskin", "synonyms": ["parchment"], "image_count": 10, "id": 764, "frequency": "r", "synset": "parchment.n.01"}, {"name": "parka", "instance_count": 89, "def": "a kind of heavy jacket (`windcheater' is a British term)", "synonyms": ["parka", "anorak"], 
"image_count": 17, "id": 765, "frequency": "c", "synset": "parka.n.01"}, {"name": "parking_meter", "instance_count": 1075, "def": "a coin-operated timer located next to a parking space", "synonyms": ["parking_meter"], "image_count": 489, "id": 766, "frequency": "f", "synset": "parking_meter.n.01"}, {"name": "parrot", "instance_count": 76, "def": "usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds", "synonyms": ["parrot"], "image_count": 47, "id": 767, "frequency": "c", "synset": "parrot.n.01"}, {"name": "passenger_car_(part_of_a_train)", "instance_count": 465, "def": "a railcar where passengers ride", "synonyms": ["passenger_car_(part_of_a_train)", "coach_(part_of_a_train)"], "image_count": 93, "id": 768, "frequency": "c", "synset": "passenger_car.n.01"}, {"name": "passenger_ship", "instance_count": 1, "def": "a ship built to carry passengers", "synonyms": ["passenger_ship"], "image_count": 1, "id": 769, "frequency": "r", "synset": "passenger_ship.n.01"}, {"name": "passport", "instance_count": 12, "def": "a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home country", "synonyms": ["passport"], "image_count": 12, "id": 770, "frequency": "c", "synset": "passport.n.02"}, {"name": "pastry", "instance_count": 4972, "def": "any of various baked foods made of dough or batter", "synonyms": ["pastry"], "image_count": 228, "id": 771, "frequency": "f", "synset": "pastry.n.02"}, {"name": "patty_(food)", "instance_count": 20, "def": "small flat mass of chopped food", "synonyms": ["patty_(food)"], "image_count": 5, "id": 772, "frequency": "r", "synset": "patty.n.01"}, {"name": "pea_(food)", "instance_count": 1869, "def": "seed of a pea plant used for food", "synonyms": ["pea_(food)"], "image_count": 76, "id": 773, "frequency": "c", "synset": "pea.n.01"}, {"name": "peach", "instance_count": 1041, "def": "downy juicy fruit with sweet yellowish or whitish flesh", "synonyms": ["peach"], "image_count": 71, "id": 774, "frequency": "c", "synset": "peach.n.03"}, {"name": "peanut_butter", "instance_count": 50, "def": "a spread made from ground peanuts", "synonyms": ["peanut_butter"], "image_count": 30, "id": 775, "frequency": "c", "synset": "peanut_butter.n.01"}, {"name": "pear", "instance_count": 1069, "def": "sweet juicy gritty-textured fruit available in many varieties", "synonyms": ["pear"], "image_count": 109, "id": 776, "frequency": "f", "synset": "pear.n.01"}, {"name": "peeler_(tool_for_fruit_and_vegetables)", "instance_count": 18, "def": "a device for peeling vegetables or fruits", "synonyms": ["peeler_(tool_for_fruit_and_vegetables)"], "image_count": 14, "id": 777, "frequency": "c", "synset": "peeler.n.03"}, {"name": "wooden_leg", "instance_count": 1, "def": "a prosthesis that replaces a missing leg", "synonyms": ["wooden_leg", "pegleg"], "image_count": 1, "id": 778, "frequency": "r", "synset": "peg.n.04"}, {"name": "pegboard", "instance_count": 9, "def": "a board perforated with regularly spaced holes into which pegs can be fitted", "synonyms": ["pegboard"], "image_count": 8, "id": 779, "frequency": "r", "synset": "pegboard.n.01"}, {"name": "pelican", "instance_count": 76, "def": "large long-winged warm-water seabird having a large bill with a distensible pouch for fish", "synonyms": ["pelican"], "image_count": 26, "id": 780, "frequency": "c", "synset": "pelican.n.01"}, {"name": "pen", "instance_count": 987, "def": "a writing implement with a point from which ink flows", "synonyms": ["pen"], "image_count": 
339, "id": 781, "frequency": "f", "synset": "pen.n.01"}, {"name": "pencil", "instance_count": 543, "def": "a thin cylindrical pointed writing implement made of wood and graphite", "synonyms": ["pencil"], "image_count": 153, "id": 782, "frequency": "f", "synset": "pencil.n.01"}, {"name": "pencil_box", "instance_count": 2, "def": "a box for holding pencils", "synonyms": ["pencil_box", "pencil_case"], "image_count": 2, "id": 783, "frequency": "r", "synset": "pencil_box.n.01"}, {"name": "pencil_sharpener", "instance_count": 4, "def": "a rotary implement for sharpening the point on pencils", "synonyms": ["pencil_sharpener"], "image_count": 3, "id": 784, "frequency": "r", "synset": "pencil_sharpener.n.01"}, {"name": "pendulum", "instance_count": 18, "def": "an apparatus consisting of an object mounted so that it swings freely under the influence of gravity", "synonyms": ["pendulum"], "image_count": 8, "id": 785, "frequency": "r", "synset": "pendulum.n.01"}, {"name": "penguin", "instance_count": 229, "def": "short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers", "synonyms": ["penguin"], "image_count": 47, "id": 786, "frequency": "c", "synset": "penguin.n.01"}, {"name": "pennant", "instance_count": 235, "def": "a flag longer than it is wide (and often tapering)", "synonyms": ["pennant"], "image_count": 8, "id": 787, "frequency": "r", "synset": "pennant.n.02"}, {"name": "penny_(coin)", "instance_count": 15, "def": "a coin worth one-hundredth of the value of the basic unit", "synonyms": ["penny_(coin)"], "image_count": 6, "id": 788, "frequency": "r", "synset": "penny.n.02"}, {"name": "pepper", "instance_count": 697, "def": "pungent seasoning from the berry of the common pepper plant; whole or ground", "synonyms": ["pepper", "peppercorn"], "image_count": 116, "id": 789, "frequency": "f", "synset": "pepper.n.03"}, {"name": "pepper_mill", "instance_count": 91, "def": "a mill for grinding pepper", "synonyms": ["pepper_mill", "pepper_grinder"], "image_count": 69, "id": 790, "frequency": "c", "synset": "pepper_mill.n.01"}, {"name": "perfume", "instance_count": 28, "def": "a toiletry that emits and diffuses a fragrant odor", "synonyms": ["perfume"], "image_count": 13, "id": 791, "frequency": "c", "synset": "perfume.n.02"}, {"name": "persimmon", "instance_count": 22, "def": "orange fruit resembling a plum; edible when fully ripe", "synonyms": ["persimmon"], "image_count": 6, "id": 792, "frequency": "r", "synset": "persimmon.n.02"}, {"name": "person", "instance_count": 13439, "def": "a human being", "synonyms": ["person", "baby", "child", "boy", "girl", "man", "woman", "human"], "image_count": 1928, "id": 793, "frequency": "f", "synset": "person.n.01"}, {"name": "pet", "instance_count": 103, "def": "a domesticated animal kept for companionship or amusement", "synonyms": ["pet"], "image_count": 79, "id": 794, "frequency": "c", "synset": "pet.n.01"}, {"name": "pew_(church_bench)", "instance_count": 194, "def": "long bench with backs; used in church by the congregation", "synonyms": ["pew_(church_bench)", "church_bench"], "image_count": 14, "id": 795, "frequency": "c", "synset": "pew.n.01"}, {"name": "phonebook", "instance_count": 24, "def": "a directory containing an alphabetical list of telephone subscribers and their telephone numbers", "synonyms": ["phonebook", "telephone_book", "telephone_directory"], "image_count": 7, "id": 796, "frequency": "r", "synset": "phonebook.n.01"}, {"name": "phonograph_record", "instance_count": 138, "def": "sound recording 
consisting of a typically black disk with a continuous groove", "synonyms": ["phonograph_record", "phonograph_recording", "record_(phonograph_recording)"], "image_count": 20, "id": 797, "frequency": "c", "synset": "phonograph_record.n.01"}, {"name": "piano", "instance_count": 126, "def": "a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds", "synonyms": ["piano"], "image_count": 114, "id": 798, "frequency": "f", "synset": "piano.n.01"}, {"name": "pickle", "instance_count": 632, "def": "vegetables (especially cucumbers) preserved in brine or vinegar", "synonyms": ["pickle"], "image_count": 221, "id": 799, "frequency": "f", "synset": "pickle.n.01"}, {"name": "pickup_truck", "instance_count": 838, "def": "a light truck with an open body and low sides and a tailboard", "synonyms": ["pickup_truck"], "image_count": 502, "id": 800, "frequency": "f", "synset": "pickup.n.01"}, {"name": "pie", "instance_count": 228, "def": "dish baked in pastry-lined pan often with a pastry top", "synonyms": ["pie"], "image_count": 62, "id": 801, "frequency": "c", "synset": "pie.n.01"}, {"name": "pigeon", "instance_count": 1850, "def": "wild and domesticated birds having a heavy body and short legs", "synonyms": ["pigeon"], "image_count": 87, "id": 802, "frequency": "c", "synset": "pigeon.n.01"}, {"name": "piggy_bank", "instance_count": 5, "def": "a child's coin bank (often shaped like a pig)", "synonyms": ["piggy_bank", "penny_bank"], "image_count": 4, "id": 803, "frequency": "r", "synset": "piggy_bank.n.01"}, {"name": "pillow", "instance_count": 6115, "def": "a cushion to support the head of a sleeping person", "synonyms": ["pillow"], "image_count": 1912, "id": 804, "frequency": "f", "synset": "pillow.n.01"}, {"name": "pin_(non_jewelry)", "instance_count": 112, "def": "a small slender (often pointed) piece of wood or metal used to support or fasten or attach things", "synonyms": ["pin_(non_jewelry)"], "image_count": 7, "id": 805, "frequency": "r", "synset": "pin.n.09"}, {"name": "pineapple", "instance_count": 1636, "def": "large sweet fleshy tropical fruit with a tuft of stiff leaves", "synonyms": ["pineapple"], "image_count": 186, "id": 806, "frequency": "f", "synset": "pineapple.n.02"}, {"name": "pinecone", "instance_count": 141, "def": "the seed-producing cone of a pine tree", "synonyms": ["pinecone"], "image_count": 18, "id": 807, "frequency": "c", "synset": "pinecone.n.01"}, {"name": "ping-pong_ball", "instance_count": 4, "def": "light hollow ball used in playing table tennis", "synonyms": ["ping-pong_ball"], "image_count": 4, "id": 808, "frequency": "r", "synset": "ping-pong_ball.n.01"}, {"name": "pinwheel", "instance_count": 172, "def": "a toy consisting of vanes of colored paper or plastic that is pinned to a stick and spins when it is pointed into the wind", "synonyms": ["pinwheel"], "image_count": 3, "id": 809, "frequency": "r", "synset": "pinwheel.n.03"}, {"name": "tobacco_pipe", "instance_count": 7, "def": "a tube with a small bowl at one end; used for smoking tobacco", "synonyms": ["tobacco_pipe"], "image_count": 7, "id": 810, "frequency": "r", "synset": "pipe.n.01"}, {"name": "pipe", "instance_count": 4762, "def": "a long tube made of metal or plastic that is used to carry water or oil or gas etc.", "synonyms": ["pipe", "piping"], "image_count": 1413, "id": 811, "frequency": "f", "synset": "pipe.n.02"}, {"name": "pistol", "instance_count": 9, "def": "a firearm that is held and fired with one hand", "synonyms": ["pistol", "handgun"], 
"image_count": 7, "id": 812, "frequency": "r", "synset": "pistol.n.01"}, {"name": "pita_(bread)", "instance_count": 28, "def": "usually small round bread that can open into a pocket for filling", "synonyms": ["pita_(bread)", "pocket_bread"], "image_count": 12, "id": 813, "frequency": "c", "synset": "pita.n.01"}, {"name": "pitcher_(vessel_for_liquid)", "instance_count": 488, "def": "an open vessel with a handle and a spout for pouring", "synonyms": ["pitcher_(vessel_for_liquid)", "ewer"], "image_count": 248, "id": 814, "frequency": "f", "synset": "pitcher.n.02"}, {"name": "pitchfork", "instance_count": 4, "def": "a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay", "synonyms": ["pitchfork"], "image_count": 4, "id": 815, "frequency": "r", "synset": "pitchfork.n.01"}, {"name": "pizza", "instance_count": 4103, "def": "Italian open pie made of thin bread dough spread with a spiced mixture of e.g. tomato sauce and cheese", "synonyms": ["pizza"], "image_count": 1881, "id": 816, "frequency": "f", "synset": "pizza.n.01"}, {"name": "place_mat", "instance_count": 1123, "def": "a mat placed on a table for an individual place setting", "synonyms": ["place_mat"], "image_count": 529, "id": 817, "frequency": "f", "synset": "place_mat.n.01"}, {"name": "plate", "instance_count": 5214, "def": "dish on which food is served or from which food is eaten", "synonyms": ["plate"], "image_count": 1932, "id": 818, "frequency": "f", "synset": "plate.n.04"}, {"name": "platter", "instance_count": 148, "def": "a large shallow dish used for serving food", "synonyms": ["platter"], "image_count": 50, "id": 819, "frequency": "c", "synset": "platter.n.01"}, {"name": "playpen", "instance_count": 3, "def": "a portable enclosure in which babies may be left to play", "synonyms": ["playpen"], "image_count": 3, "id": 820, "frequency": "r", "synset": "playpen.n.01"}, {"name": "pliers", "instance_count": 49, "def": "a gripping hand tool with two hinged arms and (usually) serrated jaws", "synonyms": ["pliers", "plyers"], "image_count": 28, "id": 821, "frequency": "c", "synset": "pliers.n.01"}, {"name": "plow_(farm_equipment)", "instance_count": 12, "def": "a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing", "synonyms": ["plow_(farm_equipment)", "plough_(farm_equipment)"], "image_count": 10, "id": 822, "frequency": "r", "synset": "plow.n.01"}, {"name": "plume", "instance_count": 11, "def": "a feather or cluster of feathers worn as an ornament", "synonyms": ["plume"], "image_count": 5, "id": 823, "frequency": "r", "synset": "plume.n.02"}, {"name": "pocket_watch", "instance_count": 20, "def": "a watch that is carried in a small watch pocket", "synonyms": ["pocket_watch"], "image_count": 5, "id": 824, "frequency": "r", "synset": "pocket_watch.n.01"}, {"name": "pocketknife", "instance_count": 21, "def": "a knife with a blade that folds into the handle; suitable for carrying in the pocket", "synonyms": ["pocketknife"], "image_count": 18, "id": 825, "frequency": "c", "synset": "pocketknife.n.01"}, {"name": "poker_(fire_stirring_tool)", "instance_count": 34, "def": "fire iron consisting of a metal rod with a handle; used to stir a fire", "synonyms": ["poker_(fire_stirring_tool)", "stove_poker", "fire_hook"], "image_count": 14, "id": 826, "frequency": "c", "synset": "poker.n.01"}, {"name": "pole", "instance_count": 14276, "def": "a long (usually round) rod of wood or metal or plastic", "synonyms": ["pole", "post"], "image_count": 1890, "id": 827, "frequency": "f", 
"synset": "pole.n.01"}, {"name": "polo_shirt", "instance_count": 1695, "def": "a shirt with short sleeves designed for comfort and casual wear", "synonyms": ["polo_shirt", "sport_shirt"], "image_count": 660, "id": 828, "frequency": "f", "synset": "polo_shirt.n.01"}, {"name": "poncho", "instance_count": 14, "def": "a blanket-like cloak with a hole in the center for the head", "synonyms": ["poncho"], "image_count": 8, "id": 829, "frequency": "r", "synset": "poncho.n.01"}, {"name": "pony", "instance_count": 57, "def": "any of various breeds of small gentle horses usually less than five feet high at the shoulder", "synonyms": ["pony"], "image_count": 25, "id": 830, "frequency": "c", "synset": "pony.n.05"}, {"name": "pool_table", "instance_count": 10, "def": "game equipment consisting of a heavy table on which pool is played", "synonyms": ["pool_table", "billiard_table", "snooker_table"], "image_count": 10, "id": 831, "frequency": "r", "synset": "pool_table.n.01"}, {"name": "pop_(soda)", "instance_count": 951, "def": "a sweet drink containing carbonated water and flavoring", "synonyms": ["pop_(soda)", "soda_(pop)", "tonic", "soft_drink"], "image_count": 218, "id": 832, "frequency": "f", "synset": "pop.n.02"}, {"name": "postbox_(public)", "instance_count": 57, "def": "public box for deposit of mail", "synonyms": ["postbox_(public)", "mailbox_(public)"], "image_count": 36, "id": 833, "frequency": "c", "synset": "postbox.n.01"}, {"name": "postcard", "instance_count": 276, "def": "a card for sending messages by post without an envelope", "synonyms": ["postcard", "postal_card", "mailing-card"], "image_count": 16, "id": 834, "frequency": "c", "synset": "postcard.n.01"}, {"name": "poster", "instance_count": 3378, "def": "a sign posted in a public place as an advertisement", "synonyms": ["poster", "placard"], "image_count": 808, "id": 835, "frequency": "f", "synset": "poster.n.01"}, {"name": "pot", "instance_count": 1719, "def": "metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid", "synonyms": ["pot"], "image_count": 479, "id": 836, "frequency": "f", "synset": "pot.n.01"}, {"name": "flowerpot", "instance_count": 3902, "def": "a container in which plants are cultivated", "synonyms": ["flowerpot"], "image_count": 1404, "id": 837, "frequency": "f", "synset": "pot.n.04"}, {"name": "potato", "instance_count": 4393, "def": "an edible tuber native to South America", "synonyms": ["potato"], "image_count": 307, "id": 838, "frequency": "f", "synset": "potato.n.01"}, {"name": "potholder", "instance_count": 112, "def": "an insulated pad for holding hot pots", "synonyms": ["potholder"], "image_count": 57, "id": 839, "frequency": "c", "synset": "potholder.n.01"}, {"name": "pottery", "instance_count": 272, "def": "ceramic ware made from clay and baked in a kiln", "synonyms": ["pottery", "clayware"], "image_count": 28, "id": 840, "frequency": "c", "synset": "pottery.n.01"}, {"name": "pouch", "instance_count": 131, "def": "a small or medium size container for holding or carrying things", "synonyms": ["pouch"], "image_count": 80, "id": 841, "frequency": "c", "synset": "pouch.n.01"}, {"name": "power_shovel", "instance_count": 16, "def": "a machine for excavating", "synonyms": ["power_shovel", "excavator", "digger"], "image_count": 11, "id": 842, "frequency": "c", "synset": "power_shovel.n.01"}, {"name": "prawn", "instance_count": 779, "def": "any of various edible decapod crustaceans", "synonyms": ["prawn", "shrimp"], "image_count": 92, "id": 843, "frequency": "c", "synset": 
"prawn.n.01"}, {"name": "pretzel", "instance_count": 179, "def": "glazed and salted cracker typically in the shape of a loose knot", "synonyms": ["pretzel"], "image_count": 20, "id": 844, "frequency": "c", "synset": "pretzel.n.01"}, {"name": "printer", "instance_count": 217, "def": "a machine that prints", "synonyms": ["printer", "printing_machine"], "image_count": 194, "id": 845, "frequency": "f", "synset": "printer.n.03"}, {"name": "projectile_(weapon)", "instance_count": 64, "def": "a weapon that is forcibly thrown or projected at a targets", "synonyms": ["projectile_(weapon)", "missile"], "image_count": 23, "id": 846, "frequency": "c", "synset": "projectile.n.01"}, {"name": "projector", "instance_count": 54, "def": "an optical instrument that projects an enlarged image onto a screen", "synonyms": ["projector"], "image_count": 52, "id": 847, "frequency": "c", "synset": "projector.n.02"}, {"name": "propeller", "instance_count": 1458, "def": "a mechanical device that rotates to push against air or water", "synonyms": ["propeller", "propellor"], "image_count": 673, "id": 848, "frequency": "f", "synset": "propeller.n.01"}, {"name": "prune", "instance_count": 8, "def": "dried plum", "synonyms": ["prune"], "image_count": 2, "id": 849, "frequency": "r", "synset": "prune.n.01"}, {"name": "pudding", "instance_count": 2, "def": "any of various soft thick unsweetened baked dishes", "synonyms": ["pudding"], "image_count": 2, "id": 850, "frequency": "r", "synset": "pudding.n.01"}, {"name": "puffer_(fish)", "instance_count": 2, "def": "fishes whose elongated spiny body can inflate itself with water or air to form a globe", "synonyms": ["puffer_(fish)", "pufferfish", "blowfish", "globefish"], "image_count": 1, "id": 851, "frequency": "r", "synset": "puffer.n.02"}, {"name": "puffin", "instance_count": 4, "def": "seabirds having short necks and brightly colored compressed bills", "synonyms": ["puffin"], "image_count": 2, "id": 852, "frequency": "r", "synset": "puffin.n.01"}, {"name": "pug-dog", "instance_count": 13, "def": "small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle", "synonyms": ["pug-dog"], "image_count": 8, "id": 853, "frequency": "r", "synset": "pug.n.01"}, {"name": "pumpkin", "instance_count": 1192, "def": "usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn", "synonyms": ["pumpkin"], "image_count": 80, "id": 854, "frequency": "c", "synset": "pumpkin.n.02"}, {"name": "puncher", "instance_count": 6, "def": "a tool for making holes or indentations", "synonyms": ["puncher"], "image_count": 3, "id": 855, "frequency": "r", "synset": "punch.n.03"}, {"name": "puppet", "instance_count": 18, "def": "a small figure of a person operated from above with strings by a puppeteer", "synonyms": ["puppet", "marionette"], "image_count": 3, "id": 856, "frequency": "r", "synset": "puppet.n.01"}, {"name": "puppy", "instance_count": 57, "def": "a young dog", "synonyms": ["puppy"], "image_count": 15, "id": 857, "frequency": "c", "synset": "puppy.n.01"}, {"name": "quesadilla", "instance_count": 6, "def": "a tortilla that is filled with cheese and heated", "synonyms": ["quesadilla"], "image_count": 2, "id": 858, "frequency": "r", "synset": "quesadilla.n.01"}, {"name": "quiche", "instance_count": 33, "def": "a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)", "synonyms": ["quiche"], "image_count": 10, "id": 859, "frequency": "r", 
"synset": "quiche.n.02"}, {"name": "quilt", "instance_count": 513, "def": "bedding made of two layers of cloth filled with stuffing and stitched together", "synonyms": ["quilt", "comforter"], "image_count": 386, "id": 860, "frequency": "f", "synset": "quilt.n.01"}, {"name": "rabbit", "instance_count": 139, "def": "any of various burrowing animals of the family Leporidae having long ears and short tails", "synonyms": ["rabbit"], "image_count": 65, "id": 861, "frequency": "c", "synset": "rabbit.n.01"}, {"name": "race_car", "instance_count": 6, "def": "a fast car that competes in races", "synonyms": ["race_car", "racing_car"], "image_count": 3, "id": 862, "frequency": "r", "synset": "racer.n.02"}, {"name": "racket", "instance_count": 64, "def": "a sports implement used to strike a ball in various games", "synonyms": ["racket", "racquet"], "image_count": 35, "id": 863, "frequency": "c", "synset": "racket.n.04"}, {"name": "radar", "instance_count": 13, "def": "measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects", "synonyms": ["radar"], "image_count": 5, "id": 864, "frequency": "r", "synset": "radar.n.01"}, {"name": "radiator", "instance_count": 195, "def": "a mechanism consisting of a metal honeycomb through which hot fluids circulate", "synonyms": ["radiator"], "image_count": 180, "id": 865, "frequency": "f", "synset": "radiator.n.03"}, {"name": "radio_receiver", "instance_count": 123, "def": "an electronic receiver that detects and demodulates and amplifies transmitted radio signals", "synonyms": ["radio_receiver", "radio_set", "radio", "tuner_(radio)"], "image_count": 99, "id": 866, "frequency": "c", "synset": "radio_receiver.n.01"}, {"name": "radish", "instance_count": 519, "def": "pungent edible root of any of various cultivated radish plants", "synonyms": ["radish", "daikon"], "image_count": 49, "id": 867, "frequency": "c", "synset": "radish.n.03"}, {"name": "raft", "instance_count": 66, "def": "a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers", "synonyms": ["raft"], "image_count": 28, "id": 868, "frequency": "c", "synset": "raft.n.01"}, {"name": "rag_doll", "instance_count": 3, "def": "a cloth doll that is stuffed and (usually) painted", "synonyms": ["rag_doll"], "image_count": 1, "id": 869, "frequency": "r", "synset": "rag_doll.n.01"}, {"name": "raincoat", "instance_count": 303, "def": "a water-resistant coat", "synonyms": ["raincoat", "waterproof_jacket"], "image_count": 52, "id": 870, "frequency": "c", "synset": "raincoat.n.01"}, {"name": "ram_(animal)", "instance_count": 132, "def": "uncastrated adult male sheep", "synonyms": ["ram_(animal)"], "image_count": 36, "id": 871, "frequency": "c", "synset": "ram.n.05"}, {"name": "raspberry", "instance_count": 778, "def": "red or black edible aggregate berries usually smaller than the related blackberries", "synonyms": ["raspberry"], "image_count": 70, "id": 872, "frequency": "c", "synset": "raspberry.n.02"}, {"name": "rat", "instance_count": 6, "def": "any of various long-tailed rodents similar to but larger than a mouse", "synonyms": ["rat"], "image_count": 6, "id": 873, "frequency": "r", "synset": "rat.n.01"}, {"name": "razorblade", "instance_count": 35, "def": "a blade that has very sharp edge", "synonyms": ["razorblade"], "image_count": 29, "id": 874, "frequency": "c", "synset": "razorblade.n.01"}, {"name": "reamer_(juicer)", "instance_count": 26, "def": "a squeezer with a conical ridged center that is used for 
squeezing juice from citrus fruit", "synonyms": ["reamer_(juicer)", "juicer", "juice_reamer"], "image_count": 24, "id": 875, "frequency": "c", "synset": "reamer.n.01"}, {"name": "rearview_mirror", "instance_count": 3650, "def": "vehicle mirror (side or rearview)", "synonyms": ["rearview_mirror"], "image_count": 1115, "id": 876, "frequency": "f", "synset": "rearview_mirror.n.01"}, {"name": "receipt", "instance_count": 89, "def": "an acknowledgment (usually tangible) that payment has been made", "synonyms": ["receipt"], "image_count": 61, "id": 877, "frequency": "c", "synset": "receipt.n.02"}, {"name": "recliner", "instance_count": 28, "def": "an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it", "synonyms": ["recliner", "reclining_chair", "lounger_(chair)"], "image_count": 18, "id": 878, "frequency": "c", "synset": "recliner.n.01"}, {"name": "record_player", "instance_count": 22, "def": "machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically", "synonyms": ["record_player", "phonograph_(record_player)", "turntable"], "image_count": 18, "id": 879, "frequency": "c", "synset": "record_player.n.01"}, {"name": "reflector", "instance_count": 3426, "def": "device that reflects light, radiation, etc.", "synonyms": ["reflector"], "image_count": 665, "id": 880, "frequency": "f", "synset": "reflector.n.01"}, {"name": "remote_control", "instance_count": 2467, "def": "a device that can be used to control a machine or apparatus from a distance", "synonyms": ["remote_control"], "image_count": 1096, "id": 881, "frequency": "f", "synset": "remote_control.n.01"}, {"name": "rhinoceros", "instance_count": 50, "def": "massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the snout", "synonyms": ["rhinoceros"], "image_count": 29, "id": 882, "frequency": "c", "synset": "rhinoceros.n.01"}, {"name": "rib_(food)", "instance_count": 32, "def": "cut of meat including one or more ribs", "synonyms": ["rib_(food)"], "image_count": 8, "id": 883, "frequency": "r", "synset": "rib.n.03"}, {"name": "rifle", "instance_count": 37, "def": "a shoulder firearm with a long barrel", "synonyms": ["rifle"], "image_count": 14, "id": 884, "frequency": "c", "synset": "rifle.n.01"}, {"name": "ring", "instance_count": 2314, "def": "jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger", "synonyms": ["ring"], "image_count": 1622, "id": 885, "frequency": "f", "synset": "ring.n.08"}, {"name": "river_boat", "instance_count": 3, "def": "a boat used on rivers or to ply a river", "synonyms": ["river_boat"], "image_count": 2, "id": 886, "frequency": "r", "synset": "river_boat.n.01"}, {"name": "road_map", "instance_count": 3, "def": "(NOT A ROAD) a MAP showing roads (for automobile travel)", "synonyms": ["road_map"], "image_count": 3, "id": 887, "frequency": "r", "synset": "road_map.n.02"}, {"name": "robe", "instance_count": 77, "def": "any loose flowing garment", "synonyms": ["robe"], "image_count": 32, "id": 888, "frequency": "c", "synset": "robe.n.01"}, {"name": "rocking_chair", "instance_count": 70, "def": "a chair mounted on rockers", "synonyms": ["rocking_chair"], "image_count": 55, "id": 889, "frequency": "c", "synset": "rocking_chair.n.01"}, {"name": "rodent", "instance_count": 2, "def": "relatively small placental mammals having a single pair of constantly growing incisor teeth specialized for gnawing", "synonyms": 
["rodent"], "image_count": 1, "id": 890, "frequency": "r", "synset": "rodent.n.01"}, {"name": "roller_skate", "instance_count": 35, "def": "a shoe with pairs of rollers (small hard wheels) fixed to the sole", "synonyms": ["roller_skate"], "image_count": 10, "id": 891, "frequency": "r", "synset": "roller_skate.n.01"}, {"name": "Rollerblade", "instance_count": 31, "def": "an in-line variant of a roller skate", "synonyms": ["Rollerblade"], "image_count": 10, "id": 892, "frequency": "r", "synset": "rollerblade.n.01"}, {"name": "rolling_pin", "instance_count": 52, "def": "utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough", "synonyms": ["rolling_pin"], "image_count": 47, "id": 893, "frequency": "c", "synset": "rolling_pin.n.01"}, {"name": "root_beer", "instance_count": 3, "def": "carbonated drink containing extracts of roots and herbs", "synonyms": ["root_beer"], "image_count": 3, "id": 894, "frequency": "r", "synset": "root_beer.n.01"}, {"name": "router_(computer_equipment)", "instance_count": 41, "def": "a device that forwards data packets between computer networks", "synonyms": ["router_(computer_equipment)"], "image_count": 29, "id": 895, "frequency": "c", "synset": "router.n.02"}, {"name": "rubber_band", "instance_count": 574, "def": "a narrow band of elastic rubber used to hold things (such as papers) together", "synonyms": ["rubber_band", "elastic_band"], "image_count": 342, "id": 896, "frequency": "f", "synset": "rubber_band.n.01"}, {"name": "runner_(carpet)", "instance_count": 32, "def": "a long narrow carpet", "synonyms": ["runner_(carpet)"], "image_count": 25, "id": 897, "frequency": "c", "synset": "runner.n.08"}, {"name": "plastic_bag", "instance_count": 3631, "def": "a bag made of paper or plastic for holding customer's purchases", "synonyms": ["plastic_bag", "paper_bag"], "image_count": 1469, "id": 898, "frequency": "f", "synset": "sack.n.01"}, {"name": "saddle_(on_an_animal)", "instance_count": 955, "def": "a seat for the rider of a horse or camel", "synonyms": ["saddle_(on_an_animal)"], "image_count": 521, "id": 899, "frequency": "f", "synset": "saddle.n.01"}, {"name": "saddle_blanket", "instance_count": 648, "def": "stable gear consisting of a blanket placed under the saddle", "synonyms": ["saddle_blanket", "saddlecloth", "horse_blanket"], "image_count": 347, "id": 900, "frequency": "f", "synset": "saddle_blanket.n.01"}, {"name": "saddlebag", "instance_count": 56, "def": "a large bag (or pair of bags) hung over a saddle", "synonyms": ["saddlebag"], "image_count": 35, "id": 901, "frequency": "c", "synset": "saddlebag.n.01"}, {"name": "safety_pin", "instance_count": 15, "def": "a pin in the form of a clasp; has a guard so the point of the pin will not stick the user", "synonyms": ["safety_pin"], "image_count": 7, "id": 902, "frequency": "r", "synset": "safety_pin.n.01"}, {"name": "sail", "instance_count": 863, "def": "a large piece of fabric by means of which wind is used to propel a sailing vessel", "synonyms": ["sail"], "image_count": 207, "id": 903, "frequency": "f", "synset": "sail.n.01"}, {"name": "salad", "instance_count": 171, "def": "food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens", "synonyms": ["salad"], "image_count": 108, "id": 904, "frequency": "f", "synset": "salad.n.01"}, {"name": "salad_plate", "instance_count": 6, "def": "a plate or bowl for individual servings of salad", "synonyms": ["salad_plate", "salad_bowl"], "image_count": 2, "id": 
905, "frequency": "r", "synset": "salad_plate.n.01"}, {"name": "salami", "instance_count": 290, "def": "highly seasoned fatty sausage of pork and beef usually dried", "synonyms": ["salami"], "image_count": 34, "id": 906, "frequency": "c", "synset": "salami.n.01"}, {"name": "salmon_(fish)", "instance_count": 27, "def": "any of various large food and game fishes of northern waters", "synonyms": ["salmon_(fish)"], "image_count": 12, "id": 907, "frequency": "c", "synset": "salmon.n.01"}, {"name": "salmon_(food)", "instance_count": 14, "def": "flesh of any of various marine or freshwater fish of the family Salmonidae", "synonyms": ["salmon_(food)"], "image_count": 10, "id": 908, "frequency": "r", "synset": "salmon.n.03"}, {"name": "salsa", "instance_count": 22, "def": "spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods", "synonyms": ["salsa"], "image_count": 13, "id": 909, "frequency": "c", "synset": "salsa.n.01"}, {"name": "saltshaker", "instance_count": 543, "def": "a shaker with a perforated top for sprinkling salt", "synonyms": ["saltshaker"], "image_count": 361, "id": 910, "frequency": "f", "synset": "saltshaker.n.01"}, {"name": "sandal_(type_of_shoe)", "instance_count": 3145, "def": "a shoe consisting of a sole fastened by straps to the foot", "synonyms": ["sandal_(type_of_shoe)"], "image_count": 1023, "id": 911, "frequency": "f", "synset": "sandal.n.01"}, {"name": "sandwich", "instance_count": 2315, "def": "two (or more) slices of bread with a filling between them", "synonyms": ["sandwich"], "image_count": 782, "id": 912, "frequency": "f", "synset": "sandwich.n.01"}, {"name": "satchel", "instance_count": 3, "def": "luggage consisting of a small case with a flat bottom and (usually) a shoulder strap", "synonyms": ["satchel"], "image_count": 2, "id": 913, "frequency": "r", "synset": "satchel.n.01"}, {"name": "saucepan", "instance_count": 26, "def": "a deep pan with a handle; used for stewing or boiling", "synonyms": ["saucepan"], "image_count": 5, "id": 914, "frequency": "r", "synset": "saucepan.n.01"}, {"name": "saucer", "instance_count": 555, "def": "a small shallow dish for holding a cup at the table", "synonyms": ["saucer"], "image_count": 247, "id": 915, "frequency": "f", "synset": "saucer.n.02"}, {"name": "sausage", "instance_count": 2704, "def": "highly seasoned minced meat stuffed in casings", "synonyms": ["sausage"], "image_count": 221, "id": 916, "frequency": "f", "synset": "sausage.n.01"}, {"name": "sawhorse", "instance_count": 5, "def": "a framework for holding wood that is being sawed", "synonyms": ["sawhorse", "sawbuck"], "image_count": 4, "id": 917, "frequency": "r", "synset": "sawhorse.n.01"}, {"name": "saxophone", "instance_count": 13, "def": "a wind instrument with a `J'-shaped form typically made of brass", "synonyms": ["saxophone"], "image_count": 8, "id": 918, "frequency": "r", "synset": "sax.n.02"}, {"name": "scale_(measuring_instrument)", "instance_count": 178, "def": "a measuring instrument for weighing; shows amount of mass", "synonyms": ["scale_(measuring_instrument)"], "image_count": 158, "id": 919, "frequency": "f", "synset": "scale.n.07"}, {"name": "scarecrow", "instance_count": 4, "def": "an effigy in the shape of a man to frighten birds away from seeds", "synonyms": ["scarecrow", "strawman"], "image_count": 3, "id": 920, "frequency": "r", "synset": "scarecrow.n.01"}, {"name": "scarf", "instance_count": 1310, "def": "a garment worn around the head or neck or shoulders for warmth or decoration", "synonyms": ["scarf"], "image_count": 
752, "id": 921, "frequency": "f", "synset": "scarf.n.01"}, {"name": "school_bus", "instance_count": 142, "def": "a bus used to transport children to or from school", "synonyms": ["school_bus"], "image_count": 64, "id": 922, "frequency": "c", "synset": "school_bus.n.01"}, {"name": "scissors", "instance_count": 1376, "def": "a tool having two crossed pivoting blades with looped handles", "synonyms": ["scissors"], "image_count": 707, "id": 923, "frequency": "f", "synset": "scissors.n.01"}, {"name": "scoreboard", "instance_count": 161, "def": "a large board for displaying the score of a contest (and some other information)", "synonyms": ["scoreboard"], "image_count": 143, "id": 924, "frequency": "f", "synset": "scoreboard.n.01"}, {"name": "scraper", "instance_count": 1, "def": "any of various hand tools for scraping", "synonyms": ["scraper"], "image_count": 1, "id": 925, "frequency": "r", "synset": "scraper.n.01"}, {"name": "screwdriver", "instance_count": 88, "def": "a hand tool for driving screws; has a tip that fits into the head of a screw", "synonyms": ["screwdriver"], "image_count": 49, "id": 926, "frequency": "c", "synset": "screwdriver.n.01"}, {"name": "scrubbing_brush", "instance_count": 141, "def": "a brush with short stiff bristles for heavy cleaning", "synonyms": ["scrubbing_brush"], "image_count": 126, "id": 927, "frequency": "f", "synset": "scrub_brush.n.01"}, {"name": "sculpture", "instance_count": 202, "def": "a three-dimensional work of art", "synonyms": ["sculpture"], "image_count": 76, "id": 928, "frequency": "c", "synset": "sculpture.n.01"}, {"name": "seabird", "instance_count": 126, "def": "a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.", "synonyms": ["seabird", "seafowl"], "image_count": 11, "id": 929, "frequency": "c", "synset": "seabird.n.01"}, {"name": "seahorse", "instance_count": 23, "def": "small fish with horse-like heads bent sharply downward and curled tails", "synonyms": ["seahorse"], "image_count": 11, "id": 930, "frequency": "c", "synset": "seahorse.n.02"}, {"name": "seaplane", "instance_count": 4, "def": "an airplane that can land on or take off from water", "synonyms": ["seaplane", "hydroplane"], "image_count": 4, "id": 931, "frequency": "r", "synset": "seaplane.n.01"}, {"name": "seashell", "instance_count": 451, "def": "the shell of a marine organism", "synonyms": ["seashell"], "image_count": 39, "id": 932, "frequency": "c", "synset": "seashell.n.01"}, {"name": "sewing_machine", "instance_count": 11, "def": "a textile machine used as a home appliance for sewing", "synonyms": ["sewing_machine"], "image_count": 11, "id": 933, "frequency": "c", "synset": "sewing_machine.n.01"}, {"name": "shaker", "instance_count": 24, "def": "a container in which something can be shaken", "synonyms": ["shaker"], "image_count": 13, "id": 934, "frequency": "c", "synset": "shaker.n.03"}, {"name": "shampoo", "instance_count": 254, "def": "cleansing agent consisting of soaps or detergents used for washing the hair", "synonyms": ["shampoo"], "image_count": 91, "id": 935, "frequency": "c", "synset": "shampoo.n.01"}, {"name": "shark", "instance_count": 20, "def": "typically large carnivorous fishes with sharpe teeth", "synonyms": ["shark"], "image_count": 14, "id": 936, "frequency": "c", "synset": "shark.n.01"}, {"name": "sharpener", "instance_count": 7, "def": "any implement that is used to make something (an edge or a point) sharper", "synonyms": ["sharpener"], "image_count": 5, "id": 937, "frequency": "r", 
"synset": "sharpener.n.01"}, {"name": "Sharpie", "instance_count": 5, "def": "a pen with indelible ink that will write on any surface", "synonyms": ["Sharpie"], "image_count": 3, "id": 938, "frequency": "r", "synset": "sharpie.n.03"}, {"name": "shaver_(electric)", "instance_count": 12, "def": "a razor powered by an electric motor", "synonyms": ["shaver_(electric)", "electric_shaver", "electric_razor"], "image_count": 10, "id": 939, "frequency": "r", "synset": "shaver.n.03"}, {"name": "shaving_cream", "instance_count": 33, "def": "toiletry consisting that forms a rich lather for softening the beard before shaving", "synonyms": ["shaving_cream", "shaving_soap"], "image_count": 18, "id": 940, "frequency": "c", "synset": "shaving_cream.n.01"}, {"name": "shawl", "instance_count": 9, "def": "cloak consisting of an oblong piece of cloth used to cover the head and shoulders", "synonyms": ["shawl"], "image_count": 9, "id": 941, "frequency": "r", "synset": "shawl.n.01"}, {"name": "shears", "instance_count": 38, "def": "large scissors with strong blades", "synonyms": ["shears"], "image_count": 6, "id": 942, "frequency": "r", "synset": "shears.n.01"}, {"name": "sheep", "instance_count": 13304, "def": "woolly usually horned ruminant mammal related to the goat", "synonyms": ["sheep"], "image_count": 951, "id": 943, "frequency": "f", "synset": "sheep.n.01"}, {"name": "shepherd_dog", "instance_count": 2, "def": "any of various usually long-haired breeds of dog reared to herd and guard sheep", "synonyms": ["shepherd_dog", "sheepdog"], "image_count": 2, "id": 944, "frequency": "r", "synset": "shepherd_dog.n.01"}, {"name": "sherbert", "instance_count": 2, "def": "a frozen dessert made primarily of fruit juice and sugar", "synonyms": ["sherbert", "sherbet"], "image_count": 1, "id": 945, "frequency": "r", "synset": "sherbert.n.01"}, {"name": "shield", "instance_count": 41, "def": "armor carried on the arm to intercept blows", "synonyms": ["shield"], "image_count": 19, "id": 946, "frequency": "c", "synset": "shield.n.02"}, {"name": "shirt", "instance_count": 10177, "def": "a garment worn on the upper half of the body", "synonyms": ["shirt"], "image_count": 1942, "id": 947, "frequency": "f", "synset": "shirt.n.01"}, {"name": "shoe", "instance_count": 9374, "def": "common footwear covering the foot", "synonyms": ["shoe", "sneaker_(type_of_shoe)", "tennis_shoe"], "image_count": 1916, "id": 948, "frequency": "f", "synset": "shoe.n.01"}, {"name": "shopping_bag", "instance_count": 377, "def": "a bag made of plastic or strong paper (often with handles); used to transport goods after shopping", "synonyms": ["shopping_bag"], "image_count": 139, "id": 949, "frequency": "f", "synset": "shopping_bag.n.01"}, {"name": "shopping_cart", "instance_count": 90, "def": "a handcart that holds groceries or other goods while shopping", "synonyms": ["shopping_cart"], "image_count": 43, "id": 950, "frequency": "c", "synset": "shopping_cart.n.01"}, {"name": "short_pants", "instance_count": 5305, "def": "trousers that end at or above the knee", "synonyms": ["short_pants", "shorts_(clothing)", "trunks_(clothing)"], "image_count": 1969, "id": 951, "frequency": "f", "synset": "short_pants.n.01"}, {"name": "shot_glass", "instance_count": 24, "def": "a small glass adequate to hold a single swallow of whiskey", "synonyms": ["shot_glass"], "image_count": 5, "id": 952, "frequency": "r", "synset": "shot_glass.n.01"}, {"name": "shoulder_bag", "instance_count": 331, "def": "a large handbag that can be carried by a strap looped over the shoulder", 
"synonyms": ["shoulder_bag"], "image_count": 134, "id": 953, "frequency": "f", "synset": "shoulder_bag.n.01"}, {"name": "shovel", "instance_count": 110, "def": "a hand tool for lifting loose material such as snow, dirt, etc.", "synonyms": ["shovel"], "image_count": 74, "id": 954, "frequency": "c", "synset": "shovel.n.01"}, {"name": "shower_head", "instance_count": 450, "def": "a plumbing fixture that sprays water over you", "synonyms": ["shower_head"], "image_count": 381, "id": 955, "frequency": "f", "synset": "shower.n.01"}, {"name": "shower_cap", "instance_count": 1, "def": "a tight cap worn to keep hair dry while showering", "synonyms": ["shower_cap"], "image_count": 1, "id": 956, "frequency": "r", "synset": "shower_cap.n.01"}, {"name": "shower_curtain", "instance_count": 479, "def": "a curtain that keeps water from splashing out of the shower area", "synonyms": ["shower_curtain"], "image_count": 381, "id": 957, "frequency": "f", "synset": "shower_curtain.n.01"}, {"name": "shredder_(for_paper)", "instance_count": 6, "def": "a device that shreds documents", "synonyms": ["shredder_(for_paper)"], "image_count": 6, "id": 958, "frequency": "r", "synset": "shredder.n.01"}, {"name": "signboard", "instance_count": 8091, "def": "structure displaying a board on which advertisements can be posted", "synonyms": ["signboard"], "image_count": 1826, "id": 959, "frequency": "f", "synset": "signboard.n.01"}, {"name": "silo", "instance_count": 95, "def": "a cylindrical tower used for storing goods", "synonyms": ["silo"], "image_count": 28, "id": 960, "frequency": "c", "synset": "silo.n.01"}, {"name": "sink", "instance_count": 2182, "def": "plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe", "synonyms": ["sink"], "image_count": 1635, "id": 961, "frequency": "f", "synset": "sink.n.01"}, {"name": "skateboard", "instance_count": 3597, "def": "a board with wheels that is ridden in a standing or crouching position and propelled by foot", "synonyms": ["skateboard"], "image_count": 1967, "id": 962, "frequency": "f", "synset": "skateboard.n.01"}, {"name": "skewer", "instance_count": 81, "def": "a long pin for holding meat in position while it is being roasted", "synonyms": ["skewer"], "image_count": 16, "id": 963, "frequency": "c", "synset": "skewer.n.01"}, {"name": "ski", "instance_count": 8496, "def": "sports equipment for skiing on snow", "synonyms": ["ski"], "image_count": 1926, "id": 964, "frequency": "f", "synset": "ski.n.01"}, {"name": "ski_boot", "instance_count": 8124, "def": "a stiff boot that is fastened to a ski with a ski binding", "synonyms": ["ski_boot"], "image_count": 1789, "id": 965, "frequency": "f", "synset": "ski_boot.n.01"}, {"name": "ski_parka", "instance_count": 1727, "def": "a parka to be worn while skiing", "synonyms": ["ski_parka", "ski_jacket"], "image_count": 401, "id": 966, "frequency": "f", "synset": "ski_parka.n.01"}, {"name": "ski_pole", "instance_count": 8263, "def": "a pole with metal points used as an aid in skiing", "synonyms": ["ski_pole"], "image_count": 1968, "id": 967, "frequency": "f", "synset": "ski_pole.n.01"}, {"name": "skirt", "instance_count": 1784, "def": "a garment hanging from the waist; worn mainly by girls and women", "synonyms": ["skirt"], "image_count": 1167, "id": 968, "frequency": "f", "synset": "skirt.n.02"}, {"name": "skullcap", "instance_count": 1, "def": "rounded brimless cap fitting the crown of the head", "synonyms": ["skullcap"], "image_count": 1, "id": 969, "frequency": "r", "synset": "skullcap.n.01"}, 
{"name": "sled", "instance_count": 102, "def": "a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.", "synonyms": ["sled", "sledge", "sleigh"], "image_count": 56, "id": 970, "frequency": "c", "synset": "sled.n.01"}, {"name": "sleeping_bag", "instance_count": 33, "def": "large padded bag designed to be slept in outdoors", "synonyms": ["sleeping_bag"], "image_count": 17, "id": 971, "frequency": "c", "synset": "sleeping_bag.n.01"}, {"name": "sling_(bandage)", "instance_count": 1, "def": "bandage to support an injured forearm; slung over the shoulder or neck", "synonyms": ["sling_(bandage)", "triangular_bandage"], "image_count": 1, "id": 972, "frequency": "r", "synset": "sling.n.05"}, {"name": "slipper_(footwear)", "instance_count": 121, "def": "low footwear that can be slipped on and off easily; usually worn indoors", "synonyms": ["slipper_(footwear)", "carpet_slipper_(footwear)"], "image_count": 58, "id": 973, "frequency": "c", "synset": "slipper.n.01"}, {"name": "smoothie", "instance_count": 53, "def": "a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk", "synonyms": ["smoothie"], "image_count": 9, "id": 974, "frequency": "r", "synset": "smoothie.n.02"}, {"name": "snake", "instance_count": 16, "def": "limbless scaly elongate reptile; some are venomous", "synonyms": ["snake", "serpent"], "image_count": 8, "id": 975, "frequency": "r", "synset": "snake.n.01"}, {"name": "snowboard", "instance_count": 2119, "def": "a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes", "synonyms": ["snowboard"], "image_count": 1124, "id": 976, "frequency": "f", "synset": "snowboard.n.01"}, {"name": "snowman", "instance_count": 61, "def": "a figure of a person made of packed snow", "synonyms": ["snowman"], "image_count": 31, "id": 977, "frequency": "c", "synset": "snowman.n.01"}, {"name": "snowmobile", "instance_count": 23, "def": "tracked vehicle for travel on snow having skis in front", "synonyms": ["snowmobile"], "image_count": 16, "id": 978, "frequency": "c", "synset": "snowmobile.n.01"}, {"name": "soap", "instance_count": 895, "def": "a cleansing agent made from the salts of vegetable or animal fats", "synonyms": ["soap"], "image_count": 491, "id": 979, "frequency": "f", "synset": "soap.n.01"}, {"name": "soccer_ball", "instance_count": 670, "def": "an inflated ball used in playing soccer (called `football' outside of the United States)", "synonyms": ["soccer_ball"], "image_count": 432, "id": 980, "frequency": "f", "synset": "soccer_ball.n.01"}, {"name": "sock", "instance_count": 6866, "def": "cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee", "synonyms": ["sock"], "image_count": 1945, "id": 981, "frequency": "f", "synset": "sock.n.01"}, {"name": "sofa", "instance_count": 2408, "def": "an upholstered seat for more than one person", "synonyms": ["sofa", "couch", "lounge"], "image_count": 1899, "id": 982, "frequency": "f", "synset": "sofa.n.01"}, {"name": "softball", "instance_count": 5, "def": "ball used in playing softball", "synonyms": ["softball"], "image_count": 5, "id": 983, "frequency": "r", "synset": "softball.n.01"}, {"name": "solar_array", "instance_count": 52, "def": "electrical device consisting of a large array of connected solar cells", "synonyms": ["solar_array", "solar_battery", "solar_panel"], "image_count": 28, "id": 984, "frequency": "c", "synset": "solar_array.n.01"}, {"name": "sombrero", "instance_count": 22, 
"def": "a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico", "synonyms": ["sombrero"], "image_count": 7, "id": 985, "frequency": "r", "synset": "sombrero.n.02"}, {"name": "soup", "instance_count": 193, "def": "liquid food especially of meat or fish or vegetable stock often containing pieces of solid food", "synonyms": ["soup"], "image_count": 146, "id": 986, "frequency": "f", "synset": "soup.n.01"}, {"name": "soup_bowl", "instance_count": 2, "def": "a bowl for serving soup", "synonyms": ["soup_bowl"], "image_count": 1, "id": 987, "frequency": "r", "synset": "soup_bowl.n.01"}, {"name": "soupspoon", "instance_count": 44, "def": "a spoon with a rounded bowl for eating soup", "synonyms": ["soupspoon"], "image_count": 25, "id": 988, "frequency": "c", "synset": "soupspoon.n.01"}, {"name": "sour_cream", "instance_count": 49, "def": "soured light cream", "synonyms": ["sour_cream", "soured_cream"], "image_count": 22, "id": 989, "frequency": "c", "synset": "sour_cream.n.01"}, {"name": "soya_milk", "instance_count": 2, "def": "a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu", "synonyms": ["soya_milk", "soybean_milk", "soymilk"], "image_count": 1, "id": 990, "frequency": "r", "synset": "soya_milk.n.01"}, {"name": "space_shuttle", "instance_count": 10, "def": "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", "synonyms": ["space_shuttle"], "image_count": 10, "id": 991, "frequency": "r", "synset": "space_shuttle.n.01"}, {"name": "sparkler_(fireworks)", "instance_count": 12, "def": "a firework that burns slowly and throws out a shower of sparks", "synonyms": ["sparkler_(fireworks)"], "image_count": 9, "id": 992, "frequency": "r", "synset": "sparkler.n.02"}, {"name": "spatula", "instance_count": 508, "def": "a hand tool with a thin flexible blade used to mix or spread soft substances", "synonyms": ["spatula"], "image_count": 308, "id": 993, "frequency": "f", "synset": "spatula.n.02"}, {"name": "spear", "instance_count": 9, "def": "a long pointed rod used as a tool or weapon", "synonyms": ["spear", "lance"], "image_count": 4, "id": 994, "frequency": "r", "synset": "spear.n.01"}, {"name": "spectacles", "instance_count": 3040, "def": "optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision", "synonyms": ["spectacles", "specs", "eyeglasses", "glasses"], "image_count": 1969, "id": 995, "frequency": "f", "synset": "spectacles.n.01"}, {"name": "spice_rack", "instance_count": 54, "def": "a rack for displaying containers filled with spices", "synonyms": ["spice_rack"], "image_count": 45, "id": 996, "frequency": "c", "synset": "spice_rack.n.01"}, {"name": "spider", "instance_count": 19, "def": "predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body", "synonyms": ["spider"], "image_count": 12, "id": 997, "frequency": "c", "synset": "spider.n.01"}, {"name": "crawfish", "instance_count": 5, "def": "large edible marine crustacean having a spiny carapace but lacking the large pincers of true lobsters", "synonyms": ["crawfish", "crayfish"], "image_count": 1, "id": 998, "frequency": "r", "synset": "spiny_lobster.n.02"}, {"name": "sponge", "instance_count": 116, "def": "a porous mass usable to absorb water typically used for cleaning", "synonyms": ["sponge"], "image_count": 85, "id": 999, "frequency": "c", "synset": "sponge.n.01"}, {"name": "spoon", "instance_count": 2111, 
"def": "a piece of cutlery with a shallow bowl-shaped container and a handle", "synonyms": ["spoon"], "image_count": 1127, "id": 1000, "frequency": "f", "synset": "spoon.n.01"}, {"name": "sportswear", "instance_count": 85, "def": "attire worn for sport or for casual wear", "synonyms": ["sportswear", "athletic_wear", "activewear"], "image_count": 11, "id": 1001, "frequency": "c", "synset": "sportswear.n.01"}, {"name": "spotlight", "instance_count": 403, "def": "a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer", "synonyms": ["spotlight"], "image_count": 60, "id": 1002, "frequency": "c", "synset": "spotlight.n.02"}, {"name": "squid_(food)", "instance_count": 6, "def": "(Italian cuisine) squid prepared as food", "synonyms": ["squid_(food)", "calamari", "calamary"], "image_count": 1, "id": 1003, "frequency": "r", "synset": "squid.n.01"}, {"name": "squirrel", "instance_count": 19, "def": "a kind of arboreal rodent having a long bushy tail", "synonyms": ["squirrel"], "image_count": 16, "id": 1004, "frequency": "c", "synset": "squirrel.n.01"}, {"name": "stagecoach", "instance_count": 1, "def": "a large coach-and-four formerly used to carry passengers and mail on regular routes between towns", "synonyms": ["stagecoach"], "image_count": 1, "id": 1005, "frequency": "r", "synset": "stagecoach.n.01"}, {"name": "stapler_(stapling_machine)", "instance_count": 68, "def": "a machine that inserts staples into sheets of paper in order to fasten them together", "synonyms": ["stapler_(stapling_machine)"], "image_count": 65, "id": 1006, "frequency": "c", "synset": "stapler.n.01"}, {"name": "starfish", "instance_count": 28, "def": "echinoderms characterized by five arms extending from a central disk", "synonyms": ["starfish", "sea_star"], "image_count": 13, "id": 1007, "frequency": "c", "synset": "starfish.n.01"}, {"name": "statue_(sculpture)", "instance_count": 1934, "def": "a sculpture representing a human or animal", "synonyms": ["statue_(sculpture)"], "image_count": 655, "id": 1008, "frequency": "f", "synset": "statue.n.01"}, {"name": "steak_(food)", "instance_count": 139, "def": "a slice of meat cut from the fleshy part of an animal or large fish", "synonyms": ["steak_(food)"], "image_count": 51, "id": 1009, "frequency": "c", "synset": "steak.n.01"}, {"name": "steak_knife", "instance_count": 1, "def": "a sharp table knife used in eating steak", "synonyms": ["steak_knife"], "image_count": 1, "id": 1010, "frequency": "r", "synset": "steak_knife.n.01"}, {"name": "steering_wheel", "instance_count": 901, "def": "a handwheel that is used for steering", "synonyms": ["steering_wheel"], "image_count": 673, "id": 1011, "frequency": "f", "synset": "steering_wheel.n.01"}, {"name": "stepladder", "instance_count": 5, "def": "a folding portable ladder hinged at the top", "synonyms": ["stepladder"], "image_count": 5, "id": 1012, "frequency": "r", "synset": "step_ladder.n.01"}, {"name": "step_stool", "instance_count": 43, "def": "a stool that has one or two steps that fold under the seat", "synonyms": ["step_stool"], "image_count": 36, "id": 1013, "frequency": "c", "synset": "step_stool.n.01"}, {"name": "stereo_(sound_system)", "instance_count": 77, "def": "electronic device for playing audio", "synonyms": ["stereo_(sound_system)"], "image_count": 54, "id": 1014, "frequency": "c", "synset": "stereo.n.01"}, {"name": "stew", "instance_count": 7, "def": "food prepared by stewing especially meat or fish with vegetables", "synonyms": ["stew"], 
"image_count": 5, "id": 1015, "frequency": "r", "synset": "stew.n.02"}, {"name": "stirrer", "instance_count": 18, "def": "an implement used for stirring", "synonyms": ["stirrer"], "image_count": 8, "id": 1016, "frequency": "r", "synset": "stirrer.n.02"}, {"name": "stirrup", "instance_count": 625, "def": "support consisting of metal loops into which rider's feet go", "synonyms": ["stirrup"], "image_count": 305, "id": 1017, "frequency": "f", "synset": "stirrup.n.01"}, {"name": "stool", "instance_count": 583, "def": "a simple seat without a back or arms", "synonyms": ["stool"], "image_count": 297, "id": 1018, "frequency": "f", "synset": "stool.n.01"}, {"name": "stop_sign", "instance_count": 1349, "def": "a traffic sign to notify drivers that they must come to a complete stop", "synonyms": ["stop_sign"], "image_count": 1053, "id": 1019, "frequency": "f", "synset": "stop_sign.n.01"}, {"name": "brake_light", "instance_count": 1334, "def": "a red light on the rear of a motor vehicle that signals when the brakes are applied", "synonyms": ["brake_light"], "image_count": 223, "id": 1020, "frequency": "f", "synset": "stoplight.n.01"}, {"name": "stove", "instance_count": 1133, "def": "a kitchen appliance used for cooking food", "synonyms": ["stove", "kitchen_stove", "range_(kitchen_appliance)", "kitchen_range", "cooking_stove"], "image_count": 1037, "id": 1021, "frequency": "f", "synset": "stove.n.01"}, {"name": "strainer", "instance_count": 99, "def": "a filter to retain larger pieces while smaller pieces and liquids pass through", "synonyms": ["strainer"], "image_count": 63, "id": 1022, "frequency": "c", "synset": "strainer.n.01"}, {"name": "strap", "instance_count": 7435, "def": "an elongated strip of material for binding things together or holding", "synonyms": ["strap"], "image_count": 1881, "id": 1023, "frequency": "f", "synset": "strap.n.01"}, {"name": "straw_(for_drinking)", "instance_count": 1154, "def": "a thin paper or plastic tube used to suck liquids into the mouth", "synonyms": ["straw_(for_drinking)", "drinking_straw"], "image_count": 507, "id": 1024, "frequency": "f", "synset": "straw.n.04"}, {"name": "strawberry", "instance_count": 4386, "def": "sweet fleshy red fruit", "synonyms": ["strawberry"], "image_count": 333, "id": 1025, "frequency": "f", "synset": "strawberry.n.01"}, {"name": "street_sign", "instance_count": 8350, "def": "a sign visible from the street", "synonyms": ["street_sign"], "image_count": 1911, "id": 1026, "frequency": "f", "synset": "street_sign.n.01"}, {"name": "streetlight", "instance_count": 7381, "def": "a lamp supported on a lamppost; for illuminating a street", "synonyms": ["streetlight", "street_lamp"], "image_count": 1765, "id": 1027, "frequency": "f", "synset": "streetlight.n.01"}, {"name": "string_cheese", "instance_count": 1, "def": "cheese formed in long strings twisted together", "synonyms": ["string_cheese"], "image_count": 1, "id": 1028, "frequency": "r", "synset": "string_cheese.n.01"}, {"name": "stylus", "instance_count": 11, "def": "a pointed tool for writing or drawing or engraving, including pens", "synonyms": ["stylus"], "image_count": 5, "id": 1029, "frequency": "r", "synset": "stylus.n.02"}, {"name": "subwoofer", "instance_count": 1, "def": "a loudspeaker that is designed to reproduce very low bass frequencies", "synonyms": ["subwoofer"], "image_count": 1, "id": 1030, "frequency": "r", "synset": "subwoofer.n.01"}, {"name": "sugar_bowl", "instance_count": 10, "def": "a dish in which sugar is served", "synonyms": ["sugar_bowl"], "image_count": 
9, "id": 1031, "frequency": "r", "synset": "sugar_bowl.n.01"}, {"name": "sugarcane_(plant)", "instance_count": 31, "def": "juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice", "synonyms": ["sugarcane_(plant)"], "image_count": 2, "id": 1032, "frequency": "r", "synset": "sugarcane.n.01"}, {"name": "suit_(clothing)", "instance_count": 461, "def": "a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color", "synonyms": ["suit_(clothing)"], "image_count": 151, "id": 1033, "frequency": "f", "synset": "suit.n.01"}, {"name": "sunflower", "instance_count": 618, "def": "any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays", "synonyms": ["sunflower"], "image_count": 82, "id": 1034, "frequency": "c", "synset": "sunflower.n.01"}, {"name": "sunglasses", "instance_count": 5603, "def": "spectacles that are darkened or polarized to protect the eyes from the glare of the sun", "synonyms": ["sunglasses"], "image_count": 1931, "id": 1035, "frequency": "f", "synset": "sunglasses.n.01"}, {"name": "sunhat", "instance_count": 170, "def": "a hat with a broad brim that protects the face from direct exposure to the sun", "synonyms": ["sunhat"], "image_count": 41, "id": 1036, "frequency": "c", "synset": "sunhat.n.01"}, {"name": "surfboard", "instance_count": 3835, "def": "a narrow buoyant board for riding surf", "synonyms": ["surfboard"], "image_count": 1895, "id": 1037, "frequency": "f", "synset": "surfboard.n.01"}, {"name": "sushi", "instance_count": 337, "def": "rice (with raw fish) wrapped in seaweed", "synonyms": ["sushi"], "image_count": 24, "id": 1038, "frequency": "c", "synset": "sushi.n.01"}, {"name": "mop", "instance_count": 22, "def": "cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors", "synonyms": ["mop"], "image_count": 22, "id": 1039, "frequency": "c", "synset": "swab.n.02"}, {"name": "sweat_pants", "instance_count": 56, "def": "loose-fitting trousers with elastic cuffs; worn by athletes", "synonyms": ["sweat_pants"], "image_count": 35, "id": 1040, "frequency": "c", "synset": "sweat_pants.n.01"}, {"name": "sweatband", "instance_count": 145, "def": "a band of material tied around the forehead or wrist to absorb sweat", "synonyms": ["sweatband"], "image_count": 69, "id": 1041, "frequency": "c", "synset": "sweatband.n.02"}, {"name": "sweater", "instance_count": 1894, "def": "a crocheted or knitted garment covering the upper part of the body", "synonyms": ["sweater"], "image_count": 962, "id": 1042, "frequency": "f", "synset": "sweater.n.01"}, {"name": "sweatshirt", "instance_count": 1482, "def": "cotton knit pullover with long sleeves worn during athletic activity", "synonyms": ["sweatshirt"], "image_count": 588, "id": 1043, "frequency": "f", "synset": "sweatshirt.n.01"}, {"name": "sweet_potato", "instance_count": 137, "def": "the edible tuberous root of the sweet potato vine", "synonyms": ["sweet_potato"], "image_count": 21, "id": 1044, "frequency": "c", "synset": "sweet_potato.n.02"}, {"name": "swimsuit", "instance_count": 3141, "def": "garment worn for swimming", "synonyms": ["swimsuit", "swimwear", "bathing_suit", "swimming_costume", "bathing_costume", "swimming_trunks", "bathing_trunks"], "image_count": 825, "id": 1045, "frequency": "f", "synset": "swimsuit.n.01"}, {"name": "sword", "instance_count": 72, "def": "a cutting or thrusting weapon that has a long metal blade", "synonyms": 
["sword"], "image_count": 52, "id": 1046, "frequency": "c", "synset": "sword.n.01"}, {"name": "syringe", "instance_count": 14, "def": "a medical instrument used to inject or withdraw fluids", "synonyms": ["syringe"], "image_count": 5, "id": 1047, "frequency": "r", "synset": "syringe.n.01"}, {"name": "Tabasco_sauce", "instance_count": 5, "def": "very spicy sauce (trade name Tabasco) made from fully-aged red peppers", "synonyms": ["Tabasco_sauce"], "image_count": 5, "id": 1048, "frequency": "r", "synset": "tabasco.n.02"}, {"name": "table-tennis_table", "instance_count": 5, "def": "a table used for playing table tennis", "synonyms": ["table-tennis_table", "ping-pong_table"], "image_count": 5, "id": 1049, "frequency": "r", "synset": "table-tennis_table.n.01"}, {"name": "table", "instance_count": 2804, "def": "a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs", "synonyms": ["table"], "image_count": 1860, "id": 1050, "frequency": "f", "synset": "table.n.02"}, {"name": "table_lamp", "instance_count": 81, "def": "a lamp that sits on a table", "synonyms": ["table_lamp"], "image_count": 56, "id": 1051, "frequency": "c", "synset": "table_lamp.n.01"}, {"name": "tablecloth", "instance_count": 2496, "def": "a covering spread over a dining table", "synonyms": ["tablecloth"], "image_count": 1582, "id": 1052, "frequency": "f", "synset": "tablecloth.n.01"}, {"name": "tachometer", "instance_count": 10, "def": "measuring instrument for indicating speed of rotation", "synonyms": ["tachometer"], "image_count": 7, "id": 1053, "frequency": "r", "synset": "tachometer.n.01"}, {"name": "taco", "instance_count": 21, "def": "a small tortilla cupped around a filling", "synonyms": ["taco"], "image_count": 2, "id": 1054, "frequency": "r", "synset": "taco.n.02"}, {"name": "tag", "instance_count": 7550, "def": "a label associated with something for the purpose of identification or information", "synonyms": ["tag"], "image_count": 1562, "id": 1055, "frequency": "f", "synset": "tag.n.02"}, {"name": "taillight", "instance_count": 9222, "def": "lamp (usually red) mounted at the rear of a motor vehicle", "synonyms": ["taillight", "rear_light"], "image_count": 1885, "id": 1056, "frequency": "f", "synset": "taillight.n.01"}, {"name": "tambourine", "instance_count": 1, "def": "a shallow drum with a single drumhead and with metallic disks in the sides", "synonyms": ["tambourine"], "image_count": 1, "id": 1057, "frequency": "r", "synset": "tambourine.n.01"}, {"name": "army_tank", "instance_count": 7, "def": "an enclosed armored military vehicle; has a cannon and moves on caterpillar treads", "synonyms": ["army_tank", "armored_combat_vehicle", "armoured_combat_vehicle"], "image_count": 5, "id": 1058, "frequency": "r", "synset": "tank.n.01"}, {"name": "tank_(storage_vessel)", "instance_count": 304, "def": "a large (usually metallic) vessel for holding gases or liquids", "synonyms": ["tank_(storage_vessel)", "storage_tank"], "image_count": 137, "id": 1059, "frequency": "f", "synset": "tank.n.02"}, {"name": "tank_top_(clothing)", "instance_count": 1799, "def": "a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening", "synonyms": ["tank_top_(clothing)"], "image_count": 1094, "id": 1060, "frequency": "f", "synset": "tank_top.n.01"}, {"name": "tape_(sticky_cloth_or_paper)", "instance_count": 560, "def": "a long thin piece of cloth or paper as used for binding or fastening", "synonyms": ["tape_(sticky_cloth_or_paper)"], "image_count": 134, "id": 1061, 
"frequency": "f", "synset": "tape.n.01"}, {"name": "tape_measure", "instance_count": 35, "def": "measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths", "synonyms": ["tape_measure", "measuring_tape"], "image_count": 29, "id": 1062, "frequency": "c", "synset": "tape.n.04"}, {"name": "tapestry", "instance_count": 29, "def": "a heavy textile with a woven design; used for curtains and upholstery", "synonyms": ["tapestry"], "image_count": 22, "id": 1063, "frequency": "c", "synset": "tapestry.n.02"}, {"name": "tarp", "instance_count": 1315, "def": "waterproofed canvas", "synonyms": ["tarp"], "image_count": 522, "id": 1064, "frequency": "f", "synset": "tarpaulin.n.01"}, {"name": "tartan", "instance_count": 68, "def": "a cloth having a crisscross design", "synonyms": ["tartan", "plaid"], "image_count": 50, "id": 1065, "frequency": "c", "synset": "tartan.n.01"}, {"name": "tassel", "instance_count": 276, "def": "adornment consisting of a bunch of cords fastened at one end", "synonyms": ["tassel"], "image_count": 68, "id": 1066, "frequency": "c", "synset": "tassel.n.01"}, {"name": "tea_bag", "instance_count": 42, "def": "a measured amount of tea in a bag for an individual serving of tea", "synonyms": ["tea_bag"], "image_count": 16, "id": 1067, "frequency": "c", "synset": "tea_bag.n.01"}, {"name": "teacup", "instance_count": 152, "def": "a cup from which tea is drunk", "synonyms": ["teacup"], "image_count": 40, "id": 1068, "frequency": "c", "synset": "teacup.n.02"}, {"name": "teakettle", "instance_count": 40, "def": "kettle for boiling water to make tea", "synonyms": ["teakettle"], "image_count": 35, "id": 1069, "frequency": "c", "synset": "teakettle.n.01"}, {"name": "teapot", "instance_count": 209, "def": "pot for brewing tea; usually has a spout and handle", "synonyms": ["teapot"], "image_count": 135, "id": 1070, "frequency": "f", "synset": "teapot.n.01"}, {"name": "teddy_bear", "instance_count": 4886, "def": "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", "synonyms": ["teddy_bear"], "image_count": 1413, "id": 1071, "frequency": "f", "synset": "teddy.n.01"}, {"name": "telephone", "instance_count": 945, "def": "electronic device for communicating by voice over long distances (includes wired and wireless/cell phones)", "synonyms": ["telephone", "phone", "telephone_set"], "image_count": 772, "id": 1072, "frequency": "f", "synset": "telephone.n.01"}, {"name": "telephone_booth", "instance_count": 62, "def": "booth for using a telephone", "synonyms": ["telephone_booth", "phone_booth", "call_box", "telephone_box", "telephone_kiosk"], "image_count": 50, "id": 1073, "frequency": "c", "synset": "telephone_booth.n.01"}, {"name": "telephone_pole", "instance_count": 3725, "def": "tall pole supporting telephone wires", "synonyms": ["telephone_pole", "telegraph_pole", "telegraph_post"], "image_count": 1015, "id": 1074, "frequency": "f", "synset": "telephone_pole.n.01"}, {"name": "telephoto_lens", "instance_count": 1, "def": "a camera lens that magnifies the image", "synonyms": ["telephoto_lens", "zoom_lens"], "image_count": 1, "id": 1075, "frequency": "r", "synset": "telephoto_lens.n.01"}, {"name": "television_camera", "instance_count": 117, "def": "television equipment for capturing and recording video", "synonyms": ["television_camera", "tv_camera"], "image_count": 65, "id": 1076, "frequency": "c", "synset": "television_camera.n.01"}, {"name": "television_set", "instance_count": 2205, "def": 
"an electronic device that receives television signals and displays them on a screen", "synonyms": ["television_set", "tv", "tv_set"], "image_count": 1900, "id": 1077, "frequency": "f", "synset": "television_receiver.n.01"}, {"name": "tennis_ball", "instance_count": 2835, "def": "ball about the size of a fist used in playing tennis", "synonyms": ["tennis_ball"], "image_count": 1302, "id": 1078, "frequency": "f", "synset": "tennis_ball.n.01"}, {"name": "tennis_racket", "instance_count": 3035, "def": "a racket used to play tennis", "synonyms": ["tennis_racket"], "image_count": 1977, "id": 1079, "frequency": "f", "synset": "tennis_racket.n.01"}, {"name": "tequila", "instance_count": 2, "def": "Mexican liquor made from fermented juices of an agave plant", "synonyms": ["tequila"], "image_count": 2, "id": 1080, "frequency": "r", "synset": "tequila.n.01"}, {"name": "thermometer", "instance_count": 33, "def": "measuring instrument for measuring temperature", "synonyms": ["thermometer"], "image_count": 29, "id": 1081, "frequency": "c", "synset": "thermometer.n.01"}, {"name": "thermos_bottle", "instance_count": 49, "def": "vacuum flask that preserves temperature of hot or cold drinks", "synonyms": ["thermos_bottle"], "image_count": 36, "id": 1082, "frequency": "c", "synset": "thermos.n.01"}, {"name": "thermostat", "instance_count": 153, "def": "a regulator for automatically regulating temperature by starting or stopping the supply of heat", "synonyms": ["thermostat"], "image_count": 138, "id": 1083, "frequency": "f", "synset": "thermostat.n.01"}, {"name": "thimble", "instance_count": 6, "def": "a small metal cap to protect the finger while sewing; can be used as a small container", "synonyms": ["thimble"], "image_count": 4, "id": 1084, "frequency": "r", "synset": "thimble.n.02"}, {"name": "thread", "instance_count": 320, "def": "a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) 
used in sewing and weaving", "synonyms": ["thread", "yarn"], "image_count": 67, "id": 1085, "frequency": "c", "synset": "thread.n.01"}, {"name": "thumbtack", "instance_count": 224, "def": "a tack for attaching papers to a bulletin board or drawing board", "synonyms": ["thumbtack", "drawing_pin", "pushpin"], "image_count": 26, "id": 1086, "frequency": "c", "synset": "thumbtack.n.01"}, {"name": "tiara", "instance_count": 31, "def": "a jeweled headdress worn by women on formal occasions", "synonyms": ["tiara"], "image_count": 25, "id": 1087, "frequency": "c", "synset": "tiara.n.01"}, {"name": "tiger", "instance_count": 67, "def": "large feline of forests in most of Asia having a tawny coat with black stripes", "synonyms": ["tiger"], "image_count": 33, "id": 1088, "frequency": "c", "synset": "tiger.n.02"}, {"name": "tights_(clothing)", "instance_count": 45, "def": "skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls", "synonyms": ["tights_(clothing)", "leotards"], "image_count": 37, "id": 1089, "frequency": "c", "synset": "tights.n.01"}, {"name": "timer", "instance_count": 62, "def": "a timepiece that measures a time interval and signals its end", "synonyms": ["timer", "stopwatch"], "image_count": 50, "id": 1090, "frequency": "c", "synset": "timer.n.01"}, {"name": "tinfoil", "instance_count": 421, "def": "foil made of tin or an alloy of tin and lead", "synonyms": ["tinfoil"], "image_count": 270, "id": 1091, "frequency": "f", "synset": "tinfoil.n.01"}, {"name": "tinsel", "instance_count": 70, "def": "a showy decoration that is basically valueless", "synonyms": ["tinsel"], "image_count": 12, "id": 1092, "frequency": "c", "synset": "tinsel.n.01"}, {"name": "tissue_paper", "instance_count": 587, "def": "a soft thin (usually translucent) paper", "synonyms": ["tissue_paper"], "image_count": 316, "id": 1093, "frequency": "f", "synset": "tissue.n.02"}, {"name": "toast_(food)", "instance_count": 125, "def": "slice of bread that has been toasted", "synonyms": ["toast_(food)"], "image_count": 41, "id": 1094, "frequency": "c", "synset": "toast.n.01"}, {"name": "toaster", "instance_count": 240, "def": "a kitchen appliance (usually electric) for toasting bread", "synonyms": ["toaster"], "image_count": 224, "id": 1095, "frequency": "f", "synset": "toaster.n.02"}, {"name": "toaster_oven", "instance_count": 114, "def": "kitchen appliance consisting of a small electric oven for toasting or warming food", "synonyms": ["toaster_oven"], "image_count": 105, "id": 1096, "frequency": "f", "synset": "toaster_oven.n.01"}, {"name": "toilet", "instance_count": 2295, "def": "a plumbing fixture for defecation and urination", "synonyms": ["toilet"], "image_count": 1925, "id": 1097, "frequency": "f", "synset": "toilet.n.02"}, {"name": "toilet_tissue", "instance_count": 1683, "def": "a soft thin absorbent paper for use in toilets", "synonyms": ["toilet_tissue", "toilet_paper", "bathroom_tissue"], "image_count": 1021, "id": 1098, "frequency": "f", "synset": "toilet_tissue.n.01"}, {"name": "tomato", "instance_count": 12338, "def": "mildly acid red or yellow pulpy fruit eaten as a vegetable", "synonyms": ["tomato"], "image_count": 1213, "id": 1099, "frequency": "f", "synset": "tomato.n.01"}, {"name": "tongs", "instance_count": 294, "def": "any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below", "synonyms": ["tongs"], "image_count": 172, "id": 1100, "frequency": "f", "synset": 
"tongs.n.01"}, {"name": "toolbox", "instance_count": 39, "def": "a box or chest or cabinet for holding hand tools", "synonyms": ["toolbox"], "image_count": 28, "id": 1101, "frequency": "c", "synset": "toolbox.n.01"}, {"name": "toothbrush", "instance_count": 1683, "def": "small brush; has long handle; used to clean teeth", "synonyms": ["toothbrush"], "image_count": 745, "id": 1102, "frequency": "f", "synset": "toothbrush.n.01"}, {"name": "toothpaste", "instance_count": 326, "def": "a dentifrice in the form of a paste", "synonyms": ["toothpaste"], "image_count": 187, "id": 1103, "frequency": "f", "synset": "toothpaste.n.01"}, {"name": "toothpick", "instance_count": 423, "def": "pick consisting of a small strip of wood or plastic; used to pick food from between the teeth", "synonyms": ["toothpick"], "image_count": 147, "id": 1104, "frequency": "f", "synset": "toothpick.n.01"}, {"name": "cover", "instance_count": 306, "def": "covering for a hole (especially a hole in the top of a container)", "synonyms": ["cover"], "image_count": 136, "id": 1105, "frequency": "f", "synset": "top.n.09"}, {"name": "tortilla", "instance_count": 135, "def": "thin unleavened pancake made from cornmeal or wheat flour", "synonyms": ["tortilla"], "image_count": 34, "id": 1106, "frequency": "c", "synset": "tortilla.n.01"}, {"name": "tow_truck", "instance_count": 45, "def": "a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)", "synonyms": ["tow_truck"], "image_count": 41, "id": 1107, "frequency": "c", "synset": "tow_truck.n.01"}, {"name": "towel", "instance_count": 2212, "def": "a rectangular piece of absorbent cloth (or paper) for drying or wiping", "synonyms": ["towel"], "image_count": 636, "id": 1108, "frequency": "f", "synset": "towel.n.01"}, {"name": "towel_rack", "instance_count": 987, "def": "a rack consisting of one or more bars on which towels can be hung", "synonyms": ["towel_rack", "towel_rail", "towel_bar"], "image_count": 570, "id": 1109, "frequency": "f", "synset": "towel_rack.n.01"}, {"name": "toy", "instance_count": 6756, "def": "a device regarded as providing amusement", "synonyms": ["toy"], "image_count": 1149, "id": 1110, "frequency": "f", "synset": "toy.n.03"}, {"name": "tractor_(farm_equipment)", "instance_count": 80, "def": "a wheeled vehicle with large wheels; used in farming and other applications", "synonyms": ["tractor_(farm_equipment)"], "image_count": 61, "id": 1111, "frequency": "c", "synset": "tractor.n.01"}, {"name": "traffic_light", "instance_count": 7298, "def": "a device to control vehicle traffic often consisting of three or more lights", "synonyms": ["traffic_light"], "image_count": 1890, "id": 1112, "frequency": "f", "synset": "traffic_light.n.01"}, {"name": "dirt_bike", "instance_count": 47, "def": "a lightweight motorcycle equipped with rugged tires and suspension for off-road use", "synonyms": ["dirt_bike"], "image_count": 18, "id": 1113, "frequency": "c", "synset": "trail_bike.n.01"}, {"name": "trailer_truck", "instance_count": 297, "def": "a truck consisting of a tractor and trailer together", "synonyms": ["trailer_truck", "tractor_trailer", "trucking_rig", "articulated_lorry", "semi_truck"], "image_count": 143, "id": 1114, "frequency": "f", "synset": "trailer_truck.n.01"}, {"name": "train_(railroad_vehicle)", "instance_count": 2192, "def": "public or private transport provided by a line of railway cars coupled together and drawn by a locomotive", "synonyms": ["train_(railroad_vehicle)", "railroad_train"], "image_count": 1517, "id": 1115, 
"frequency": "f", "synset": "train.n.01"}, {"name": "trampoline", "instance_count": 7, "def": "gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame", "synonyms": ["trampoline"], "image_count": 7, "id": 1116, "frequency": "r", "synset": "trampoline.n.01"}, {"name": "tray", "instance_count": 2397, "def": "an open receptacle for holding or displaying or serving articles or food", "synonyms": ["tray"], "image_count": 943, "id": 1117, "frequency": "f", "synset": "tray.n.01"}, {"name": "trench_coat", "instance_count": 16, "def": "a military style raincoat; belted with deep pockets", "synonyms": ["trench_coat"], "image_count": 6, "id": 1118, "frequency": "r", "synset": "trench_coat.n.01"}, {"name": "triangle_(musical_instrument)", "instance_count": 1, "def": "a percussion instrument consisting of a metal bar bent in the shape of an open triangle", "synonyms": ["triangle_(musical_instrument)"], "image_count": 1, "id": 1119, "frequency": "r", "synset": "triangle.n.05"}, {"name": "tricycle", "instance_count": 15, "def": "a vehicle with three wheels that is moved by foot pedals", "synonyms": ["tricycle"], "image_count": 11, "id": 1120, "frequency": "c", "synset": "tricycle.n.01"}, {"name": "tripod", "instance_count": 132, "def": "a three-legged rack used for support", "synonyms": ["tripod"], "image_count": 101, "id": 1121, "frequency": "f", "synset": "tripod.n.01"}, {"name": "trousers", "instance_count": 7806, "def": "a garment extending from the waist to the knee or ankle, covering each leg separately", "synonyms": ["trousers", "pants_(clothing)"], "image_count": 1909, "id": 1122, "frequency": "f", "synset": "trouser.n.01"}, {"name": "truck", "instance_count": 1797, "def": "an automotive vehicle suitable for hauling", "synonyms": ["truck"], "image_count": 800, "id": 1123, "frequency": "f", "synset": "truck.n.01"}, {"name": "truffle_(chocolate)", "instance_count": 4, "def": "creamy chocolate candy", "synonyms": ["truffle_(chocolate)", "chocolate_truffle"], "image_count": 1, "id": 1124, "frequency": "r", "synset": "truffle.n.03"}, {"name": "trunk", "instance_count": 334, "def": "luggage consisting of a large strong case used when traveling or for storage", "synonyms": ["trunk"], "image_count": 44, "id": 1125, "frequency": "c", "synset": "trunk.n.02"}, {"name": "vat", "instance_count": 15, "def": "a large vessel for holding or storing liquids", "synonyms": ["vat"], "image_count": 3, "id": 1126, "frequency": "r", "synset": "tub.n.02"}, {"name": "turban", "instance_count": 124, "def": "a traditional headdress consisting of a long scarf wrapped around the head", "synonyms": ["turban"], "image_count": 44, "id": 1127, "frequency": "c", "synset": "turban.n.01"}, {"name": "turkey_(food)", "instance_count": 120, "def": "flesh of large domesticated fowl usually roasted", "synonyms": ["turkey_(food)"], "image_count": 31, "id": 1128, "frequency": "c", "synset": "turkey.n.04"}, {"name": "turnip", "instance_count": 109, "def": "widely cultivated plant having a large fleshy edible white or yellow root", "synonyms": ["turnip"], "image_count": 7, "id": 1129, "frequency": "r", "synset": "turnip.n.01"}, {"name": "turtle", "instance_count": 31, "def": "any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming", "synonyms": ["turtle"], "image_count": 20, "id": 1130, "frequency": "c", "synset": "turtle.n.02"}, {"name": "turtleneck_(clothing)", "instance_count": 13, "def": "a sweater or jersey with a high close-fitting collar", "synonyms": 
["turtleneck_(clothing)", "polo-neck"], "image_count": 11, "id": 1131, "frequency": "c", "synset": "turtleneck.n.01"}, {"name": "typewriter", "instance_count": 14, "def": "hand-operated character printer for printing written messages one character at a time", "synonyms": ["typewriter"], "image_count": 13, "id": 1132, "frequency": "c", "synset": "typewriter.n.01"}, {"name": "umbrella", "instance_count": 9161, "def": "a lightweight handheld collapsible canopy", "synonyms": ["umbrella"], "image_count": 1924, "id": 1133, "frequency": "f", "synset": "umbrella.n.01"}, {"name": "underwear", "instance_count": 164, "def": "undergarment worn next to the skin and under the outer garments", "synonyms": ["underwear", "underclothes", "underclothing", "underpants"], "image_count": 113, "id": 1134, "frequency": "f", "synset": "underwear.n.01"}, {"name": "unicycle", "instance_count": 2, "def": "a vehicle with a single wheel that is driven by pedals", "synonyms": ["unicycle"], "image_count": 2, "id": 1135, "frequency": "r", "synset": "unicycle.n.01"}, {"name": "urinal", "instance_count": 381, "def": "a plumbing fixture (usually attached to the wall) used by men to urinate", "synonyms": ["urinal"], "image_count": 139, "id": 1136, "frequency": "f", "synset": "urinal.n.01"}, {"name": "urn", "instance_count": 81, "def": "a large vase that usually has a pedestal or feet", "synonyms": ["urn"], "image_count": 12, "id": 1137, "frequency": "c", "synset": "urn.n.01"}, {"name": "vacuum_cleaner", "instance_count": 38, "def": "an electrical home appliance that cleans by suction", "synonyms": ["vacuum_cleaner"], "image_count": 37, "id": 1138, "frequency": "c", "synset": "vacuum.n.04"}, {"name": "vase", "instance_count": 4971, "def": "an open jar of glass or porcelain used as an ornament or to hold flowers", "synonyms": ["vase"], "image_count": 1866, "id": 1139, "frequency": "f", "synset": "vase.n.01"}, {"name": "vending_machine", "instance_count": 65, "def": "a slot machine for selling goods", "synonyms": ["vending_machine"], "image_count": 47, "id": 1140, "frequency": "c", "synset": "vending_machine.n.01"}, {"name": "vent", "instance_count": 3370, "def": "a hole for the escape of gas or air", "synonyms": ["vent", "blowhole", "air_vent"], "image_count": 1468, "id": 1141, "frequency": "f", "synset": "vent.n.01"}, {"name": "vest", "instance_count": 1313, "def": "a man's sleeveless garment worn underneath a coat", "synonyms": ["vest", "waistcoat"], "image_count": 729, "id": 1142, "frequency": "f", "synset": "vest.n.01"}, {"name": "videotape", "instance_count": 228, "def": "a video recording made on magnetic tape", "synonyms": ["videotape"], "image_count": 24, "id": 1143, "frequency": "c", "synset": "videotape.n.01"}, {"name": "vinegar", "instance_count": 1, "def": "sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative", "synonyms": ["vinegar"], "image_count": 1, "id": 1144, "frequency": "r", "synset": "vinegar.n.01"}, {"name": "violin", "instance_count": 10, "def": "bowed stringed instrument that is the highest member of the violin family", "synonyms": ["violin", "fiddle"], "image_count": 10, "id": 1145, "frequency": "r", "synset": "violin.n.01"}, {"name": "vodka", "instance_count": 3, "def": "unaged colorless liquor originating in Russia", "synonyms": ["vodka"], "image_count": 3, "id": 1146, "frequency": "r", "synset": "vodka.n.01"}, {"name": "volleyball", "instance_count": 33, "def": "an inflated ball used in playing volleyball", "synonyms": 
["volleyball"], "image_count": 14, "id": 1147, "frequency": "c", "synset": "volleyball.n.02"}, {"name": "vulture", "instance_count": 16, "def": "any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion", "synonyms": ["vulture"], "image_count": 4, "id": 1148, "frequency": "r", "synset": "vulture.n.01"}, {"name": "waffle", "instance_count": 61, "def": "pancake batter baked in a waffle iron", "synonyms": ["waffle"], "image_count": 29, "id": 1149, "frequency": "c", "synset": "waffle.n.01"}, {"name": "waffle_iron", "instance_count": 4, "def": "a kitchen appliance for baking waffles", "synonyms": ["waffle_iron"], "image_count": 4, "id": 1150, "frequency": "r", "synset": "waffle_iron.n.01"}, {"name": "wagon", "instance_count": 121, "def": "any of various kinds of wheeled vehicles drawn by an animal or a tractor", "synonyms": ["wagon"], "image_count": 70, "id": 1151, "frequency": "c", "synset": "wagon.n.01"}, {"name": "wagon_wheel", "instance_count": 209, "def": "a wheel of a wagon", "synonyms": ["wagon_wheel"], "image_count": 46, "id": 1152, "frequency": "c", "synset": "wagon_wheel.n.01"}, {"name": "walking_stick", "instance_count": 21, "def": "a stick carried in the hand for support in walking", "synonyms": ["walking_stick"], "image_count": 14, "id": 1153, "frequency": "c", "synset": "walking_stick.n.01"}, {"name": "wall_clock", "instance_count": 100, "def": "a clock mounted on a wall", "synonyms": ["wall_clock"], "image_count": 48, "id": 1154, "frequency": "c", "synset": "wall_clock.n.01"}, {"name": "wall_socket", "instance_count": 3069, "def": "receptacle providing a place in a wiring system where current can be taken to run electrical devices", "synonyms": ["wall_socket", "wall_plug", "electric_outlet", "electrical_outlet", "outlet", "electric_receptacle"], "image_count": 1855, "id": 1155, "frequency": "f", "synset": "wall_socket.n.01"}, {"name": "wallet", "instance_count": 123, "def": "a pocket-size case for holding papers and paper money", "synonyms": ["wallet", "billfold"], "image_count": 113, "id": 1156, "frequency": "f", "synset": "wallet.n.01"}, {"name": "walrus", "instance_count": 1, "def": "either of two large northern marine mammals having ivory tusks and tough hide over thick blubber", "synonyms": ["walrus"], "image_count": 1, "id": 1157, "frequency": "r", "synset": "walrus.n.01"}, {"name": "wardrobe", "instance_count": 1, "def": "a tall piece of furniture that provides storage space for clothes; has a door and rails or hooks for hanging clothes", "synonyms": ["wardrobe"], "image_count": 1, "id": 1158, "frequency": "r", "synset": "wardrobe.n.01"}, {"name": "washbasin", "instance_count": 15, "def": "a bathroom sink that is permanently installed and connected to a water supply and drainpipe; where you can wash your hands and face", "synonyms": ["washbasin", "basin_(for_washing)", "washbowl", "washstand", "handbasin"], "image_count": 10, "id": 1159, "frequency": "r", "synset": "washbasin.n.01"}, {"name": "automatic_washer", "instance_count": 68, "def": "a home appliance for washing clothes and linens automatically", "synonyms": ["automatic_washer", "washing_machine"], "image_count": 54, "id": 1160, "frequency": "c", "synset": "washer.n.03"}, {"name": "watch", "instance_count": 2703, "def": "a small, portable timepiece", "synonyms": ["watch", "wristwatch"], "image_count": 1923, "id": 1161, "frequency": "f", "synset": "watch.n.01"}, {"name": "water_bottle", "instance_count": 1449, "def": "a bottle for holding water", "synonyms": 
["water_bottle"], "image_count": 630, "id": 1162, "frequency": "f", "synset": "water_bottle.n.01"}, {"name": "water_cooler", "instance_count": 39, "def": "a device for cooling and dispensing drinking water", "synonyms": ["water_cooler"], "image_count": 31, "id": 1163, "frequency": "c", "synset": "water_cooler.n.01"}, {"name": "water_faucet", "instance_count": 109, "def": "a faucet for drawing water from a pipe or cask", "synonyms": ["water_faucet", "water_tap", "tap_(water_faucet)"], "image_count": 69, "id": 1164, "frequency": "c", "synset": "water_faucet.n.01"}, {"name": "water_heater", "instance_count": 7, "def": "a heater and storage tank to supply heated water", "synonyms": ["water_heater", "hot-water_heater"], "image_count": 7, "id": 1165, "frequency": "r", "synset": "water_heater.n.01"}, {"name": "water_jug", "instance_count": 23, "def": "a jug that holds water", "synonyms": ["water_jug"], "image_count": 11, "id": 1166, "frequency": "c", "synset": "water_jug.n.01"}, {"name": "water_gun", "instance_count": 1, "def": "plaything consisting of a toy pistol that squirts water", "synonyms": ["water_gun", "squirt_gun"], "image_count": 1, "id": 1167, "frequency": "r", "synset": "water_pistol.n.01"}, {"name": "water_scooter", "instance_count": 54, "def": "a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)", "synonyms": ["water_scooter", "sea_scooter", "jet_ski"], "image_count": 30, "id": 1168, "frequency": "c", "synset": "water_scooter.n.01"}, {"name": "water_ski", "instance_count": 98, "def": "broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)", "synonyms": ["water_ski"], "image_count": 50, "id": 1169, "frequency": "c", "synset": "water_ski.n.01"}, {"name": "water_tower", "instance_count": 60, "def": "a large reservoir for water", "synonyms": ["water_tower"], "image_count": 45, "id": 1170, "frequency": "c", "synset": "water_tower.n.01"}, {"name": "watering_can", "instance_count": 44, "def": "a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants", "synonyms": ["watering_can"], "image_count": 28, "id": 1171, "frequency": "c", "synset": "watering_can.n.01"}, {"name": "watermelon", "instance_count": 814, "def": "large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp", "synonyms": ["watermelon"], "image_count": 114, "id": 1172, "frequency": "f", "synset": "watermelon.n.02"}, {"name": "weathervane", "instance_count": 237, "def": "mechanical device attached to an elevated structure; rotates freely to show the direction of the wind", "synonyms": ["weathervane", "vane_(weathervane)", "wind_vane"], "image_count": 193, "id": 1173, "frequency": "f", "synset": "weathervane.n.01"}, {"name": "webcam", "instance_count": 27, "def": "a digital camera designed to take digital photographs and transmit them over the internet", "synonyms": ["webcam"], "image_count": 21, "id": 1174, "frequency": "c", "synset": "webcam.n.01"}, {"name": "wedding_cake", "instance_count": 140, "def": "a rich cake with two or more tiers and covered with frosting and decorations; served at a wedding reception", "synonyms": ["wedding_cake", "bridecake"], "image_count": 91, "id": 1175, "frequency": "c", "synset": "wedding_cake.n.01"}, {"name": "wedding_ring", "instance_count": 49, "def": "a ring given to the bride and/or groom at the wedding", "synonyms": ["wedding_ring", "wedding_band"], "image_count": 31, "id": 1176, "frequency": "c", "synset": "wedding_ring.n.01"}, {"name": "wet_suit", 
"instance_count": 2907, "def": "a close-fitting garment made of a permeable material; worn in cold water to retain body heat", "synonyms": ["wet_suit"], "image_count": 1469, "id": 1177, "frequency": "f", "synset": "wet_suit.n.01"}, {"name": "wheel", "instance_count": 11272, "def": "a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle", "synonyms": ["wheel"], "image_count": 1924, "id": 1178, "frequency": "f", "synset": "wheel.n.01"}, {"name": "wheelchair", "instance_count": 107, "def": "a movable chair mounted on large wheels", "synonyms": ["wheelchair"], "image_count": 87, "id": 1179, "frequency": "c", "synset": "wheelchair.n.01"}, {"name": "whipped_cream", "instance_count": 201, "def": "cream that has been beaten until light and fluffy", "synonyms": ["whipped_cream"], "image_count": 77, "id": 1180, "frequency": "c", "synset": "whipped_cream.n.01"}, {"name": "whistle", "instance_count": 13, "def": "a small wind instrument that produces a whistling sound by blowing into it", "synonyms": ["whistle"], "image_count": 11, "id": 1181, "frequency": "c", "synset": "whistle.n.03"}, {"name": "wig", "instance_count": 69, "def": "hairpiece covering the head and made of real or synthetic hair", "synonyms": ["wig"], "image_count": 47, "id": 1182, "frequency": "c", "synset": "wig.n.01"}, {"name": "wind_chime", "instance_count": 28, "def": "a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle", "synonyms": ["wind_chime"], "image_count": 21, "id": 1183, "frequency": "c", "synset": "wind_chime.n.01"}, {"name": "windmill", "instance_count": 202, "def": "A mill or turbine that is powered by wind", "synonyms": ["windmill"], "image_count": 47, "id": 1184, "frequency": "c", "synset": "windmill.n.01"}, {"name": "window_box_(for_plants)", "instance_count": 253, "def": "a container for growing plants on a windowsill", "synonyms": ["window_box_(for_plants)"], "image_count": 70, "id": 1185, "frequency": "c", "synset": "window_box.n.01"}, {"name": "windshield_wiper", "instance_count": 4793, "def": "a mechanical device that cleans the windshield", "synonyms": ["windshield_wiper", "windscreen_wiper", "wiper_(for_windshield/screen)"], "image_count": 1838, "id": 1186, "frequency": "f", "synset": "windshield_wiper.n.01"}, {"name": "windsock", "instance_count": 26, "def": "a truncated cloth cone mounted on a mast/pole; shows wind direction", "synonyms": ["windsock", "air_sock", "air-sleeve", "wind_sleeve", "wind_cone"], "image_count": 19, "id": 1187, "frequency": "c", "synset": "windsock.n.01"}, {"name": "wine_bottle", "instance_count": 4449, "def": "a bottle for holding wine", "synonyms": ["wine_bottle"], "image_count": 531, "id": 1188, "frequency": "f", "synset": "wine_bottle.n.01"}, {"name": "wine_bucket", "instance_count": 21, "def": "a bucket of ice used to chill a bottle of wine", "synonyms": ["wine_bucket", "wine_cooler"], "image_count": 11, "id": 1189, "frequency": "c", "synset": "wine_bucket.n.01"}, {"name": "wineglass", "instance_count": 4259, "def": "a glass that has a stem and in which wine is served", "synonyms": ["wineglass"], "image_count": 941, "id": 1190, "frequency": "f", "synset": "wineglass.n.01"}, {"name": "blinder_(for_horses)", "instance_count": 271, "def": "blinds that prevent a horse from seeing something on either side", "synonyms": ["blinder_(for_horses)"], "image_count": 113, "id": 1191, "frequency": "f", "synset": "winker.n.02"}, {"name": "wok", "instance_count": 60, "def": "pan with a convex 
bottom; used for frying in Chinese cooking", "synonyms": ["wok"], "image_count": 26, "id": 1192, "frequency": "c", "synset": "wok.n.01"}, {"name": "wolf", "instance_count": 16, "def": "a wild carnivorous mammal of the dog family, living and hunting in packs", "synonyms": ["wolf"], "image_count": 5, "id": 1193, "frequency": "r", "synset": "wolf.n.01"}, {"name": "wooden_spoon", "instance_count": 123, "def": "a spoon made of wood", "synonyms": ["wooden_spoon"], "image_count": 56, "id": 1194, "frequency": "c", "synset": "wooden_spoon.n.02"}, {"name": "wreath", "instance_count": 119, "def": "an arrangement of flowers, leaves, or stems fastened in a ring", "synonyms": ["wreath"], "image_count": 73, "id": 1195, "frequency": "c", "synset": "wreath.n.01"}, {"name": "wrench", "instance_count": 80, "def": "a hand tool that is used to hold or twist a nut or bolt", "synonyms": ["wrench", "spanner"], "image_count": 32, "id": 1196, "frequency": "c", "synset": "wrench.n.03"}, {"name": "wristband", "instance_count": 268, "def": "band consisting of a part of a sleeve that covers the wrist", "synonyms": ["wristband"], "image_count": 128, "id": 1197, "frequency": "f", "synset": "wristband.n.01"}, {"name": "wristlet", "instance_count": 1330, "def": "a band or bracelet worn around the wrist", "synonyms": ["wristlet", "wrist_band"], "image_count": 623, "id": 1198, "frequency": "f", "synset": "wristlet.n.01"}, {"name": "yacht", "instance_count": 50, "def": "an expensive vessel propelled by sail or power and used for cruising or racing", "synonyms": ["yacht"], "image_count": 12, "id": 1199, "frequency": "c", "synset": "yacht.n.01"}, {"name": "yogurt", "instance_count": 116, "def": "a custard-like food made from curdled milk", "synonyms": ["yogurt", "yoghurt", "yoghourt"], "image_count": 52, "id": 1200, "frequency": "c", "synset": "yogurt.n.01"}, {"name": "yoke_(animal_equipment)", "instance_count": 20, "def": "gear joining two animals at the neck; NOT egg yolk", "synonyms": ["yoke_(animal_equipment)"], "image_count": 11, "id": 1201, "frequency": "c", "synset": "yoke.n.07"}, {"name": "zebra", "instance_count": 5443, "def": "any of several fleet black-and-white striped African equines", "synonyms": ["zebra"], "image_count": 1674, "id": 1202, "frequency": "f", "synset": "zebra.n.01"}, {"name": "zucchini", "instance_count": 798, "def": "small cucumber-shaped vegetable marrow; typically dark green", "synonyms": ["zucchini", "courgette"], "image_count": 81, "id": 1203, "frequency": "c", "synset": "zucchini.n.02"}] \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py deleted file mode 100755 index 8b4a58d8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_ade20k_sem_seg.py +++ /dev/null @@ -1,26 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import os -from pathlib import Path -import tqdm -from PIL import Image - - -def convert(input, output): - img = np.asarray(Image.open(input)) - assert img.dtype == np.uint8 - img = img - 1 # 0 (ignore) becomes 255. 
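The `img = img - 1` step above leans on `uint8` wraparound: label 0 (the ADE20K "ignore" id) underflows to 255, while every other id shifts down by one, giving the contiguous-from-0 layout the training code expects. A minimal sketch of that behavior (the label values are illustrative, not taken from the dataset):

```python
import numpy as np

# ADE20K-style semantic labels: 0 = ignore, 1..150 = real classes.
labels = np.array([0, 1, 2, 150], dtype=np.uint8)

# uint8 arithmetic wraps modulo 256, so 0 - 1 becomes 255.
shifted = labels - 1
print(shifted)  # [255   0   1 149]: classes now start at 0, ignore is 255
```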
others are shifted by 1 - Image.fromarray(img).save(output) - - -if __name__ == "__main__": - dataset_dir = Path(os.getenv("DETECTRON2_DATASETS", "datasets")) / "ADEChallengeData2016" - for name in ["training", "validation"]: - annotation_dir = dataset_dir / "annotations" / name - output_dir = dataset_dir / "annotations_detectron2" / name - output_dir.mkdir(parents=True, exist_ok=True) - for file in tqdm.tqdm(list(annotation_dir.iterdir())): - output_file = output_dir / file.name - convert(file, output_file) diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py deleted file mode 100755 index 245c8848..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_cocofied_lvis.py +++ /dev/null @@ -1,176 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import copy -import json -import os -from collections import defaultdict - -# This mapping is extracted from the official LVIS mapping: -# https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json -COCO_SYNSET_CATEGORIES = [ - {"synset": "person.n.01", "coco_cat_id": 1}, - {"synset": "bicycle.n.01", "coco_cat_id": 2}, - {"synset": "car.n.01", "coco_cat_id": 3}, - {"synset": "motorcycle.n.01", "coco_cat_id": 4}, - {"synset": "airplane.n.01", "coco_cat_id": 5}, - {"synset": "bus.n.01", "coco_cat_id": 6}, - {"synset": "train.n.01", "coco_cat_id": 7}, - {"synset": "truck.n.01", "coco_cat_id": 8}, - {"synset": "boat.n.01", "coco_cat_id": 9}, - {"synset": "traffic_light.n.01", "coco_cat_id": 10}, - {"synset": "fireplug.n.01", "coco_cat_id": 11}, - {"synset": "stop_sign.n.01", "coco_cat_id": 13}, - {"synset": "parking_meter.n.01", "coco_cat_id": 14}, - {"synset": "bench.n.01", "coco_cat_id": 15}, - {"synset": "bird.n.01", "coco_cat_id": 16}, - {"synset": "cat.n.01", "coco_cat_id": 17}, - {"synset": "dog.n.01", "coco_cat_id": 18}, - {"synset": "horse.n.01", "coco_cat_id": 19}, - {"synset": "sheep.n.01", "coco_cat_id": 20}, - {"synset": "beef.n.01", "coco_cat_id": 21}, - {"synset": "elephant.n.01", "coco_cat_id": 22}, - {"synset": "bear.n.01", "coco_cat_id": 23}, - {"synset": "zebra.n.01", "coco_cat_id": 24}, - {"synset": "giraffe.n.01", "coco_cat_id": 25}, - {"synset": "backpack.n.01", "coco_cat_id": 27}, - {"synset": "umbrella.n.01", "coco_cat_id": 28}, - {"synset": "bag.n.04", "coco_cat_id": 31}, - {"synset": "necktie.n.01", "coco_cat_id": 32}, - {"synset": "bag.n.06", "coco_cat_id": 33}, - {"synset": "frisbee.n.01", "coco_cat_id": 34}, - {"synset": "ski.n.01", "coco_cat_id": 35}, - {"synset": "snowboard.n.01", "coco_cat_id": 36}, - {"synset": "ball.n.06", "coco_cat_id": 37}, - {"synset": "kite.n.03", "coco_cat_id": 38}, - {"synset": "baseball_bat.n.01", "coco_cat_id": 39}, - {"synset": "baseball_glove.n.01", "coco_cat_id": 40}, - {"synset": "skateboard.n.01", "coco_cat_id": 41}, - {"synset": "surfboard.n.01", "coco_cat_id": 42}, - {"synset": "tennis_racket.n.01", "coco_cat_id": 43}, - {"synset": "bottle.n.01", "coco_cat_id": 44}, - {"synset": "wineglass.n.01", "coco_cat_id": 46}, - {"synset": "cup.n.01", "coco_cat_id": 47}, - {"synset": "fork.n.01", "coco_cat_id": 48}, - {"synset": "knife.n.01", "coco_cat_id": 49}, - {"synset": "spoon.n.01", "coco_cat_id": 50}, - {"synset": "bowl.n.03", "coco_cat_id": 51}, - {"synset": "banana.n.02", "coco_cat_id": 52}, - {"synset": "apple.n.01", "coco_cat_id": 53}, - {"synset": "sandwich.n.01", 
"coco_cat_id": 54}, - {"synset": "orange.n.01", "coco_cat_id": 55}, - {"synset": "broccoli.n.01", "coco_cat_id": 56}, - {"synset": "carrot.n.01", "coco_cat_id": 57}, - {"synset": "frank.n.02", "coco_cat_id": 58}, - {"synset": "pizza.n.01", "coco_cat_id": 59}, - {"synset": "doughnut.n.02", "coco_cat_id": 60}, - {"synset": "cake.n.03", "coco_cat_id": 61}, - {"synset": "chair.n.01", "coco_cat_id": 62}, - {"synset": "sofa.n.01", "coco_cat_id": 63}, - {"synset": "pot.n.04", "coco_cat_id": 64}, - {"synset": "bed.n.01", "coco_cat_id": 65}, - {"synset": "dining_table.n.01", "coco_cat_id": 67}, - {"synset": "toilet.n.02", "coco_cat_id": 70}, - {"synset": "television_receiver.n.01", "coco_cat_id": 72}, - {"synset": "laptop.n.01", "coco_cat_id": 73}, - {"synset": "mouse.n.04", "coco_cat_id": 74}, - {"synset": "remote_control.n.01", "coco_cat_id": 75}, - {"synset": "computer_keyboard.n.01", "coco_cat_id": 76}, - {"synset": "cellular_telephone.n.01", "coco_cat_id": 77}, - {"synset": "microwave.n.02", "coco_cat_id": 78}, - {"synset": "oven.n.01", "coco_cat_id": 79}, - {"synset": "toaster.n.02", "coco_cat_id": 80}, - {"synset": "sink.n.01", "coco_cat_id": 81}, - {"synset": "electric_refrigerator.n.01", "coco_cat_id": 82}, - {"synset": "book.n.01", "coco_cat_id": 84}, - {"synset": "clock.n.01", "coco_cat_id": 85}, - {"synset": "vase.n.01", "coco_cat_id": 86}, - {"synset": "scissors.n.01", "coco_cat_id": 87}, - {"synset": "teddy.n.01", "coco_cat_id": 88}, - {"synset": "hand_blower.n.01", "coco_cat_id": 89}, - {"synset": "toothbrush.n.01", "coco_cat_id": 90}, -] - - -def cocofy_lvis(input_filename, output_filename): - """ - Filter LVIS instance segmentation annotations to remove all categories that are not included in - COCO. The new json files can be used to evaluate COCO AP using `lvis-api`. The category ids in - the output json are the incontiguous COCO dataset ids. - - Args: - input_filename (str): path to the LVIS json file. - output_filename (str): path to the COCOfied json file. 
- """ - - with open(input_filename, "r") as f: - lvis_json = json.load(f) - - lvis_annos = lvis_json.pop("annotations") - cocofied_lvis = copy.deepcopy(lvis_json) - lvis_json["annotations"] = lvis_annos - - # Mapping from lvis cat id to coco cat id via synset - lvis_cat_id_to_synset = {cat["id"]: cat["synset"] for cat in lvis_json["categories"]} - synset_to_coco_cat_id = {x["synset"]: x["coco_cat_id"] for x in COCO_SYNSET_CATEGORIES} - # Synsets that we will keep in the dataset - synsets_to_keep = set(synset_to_coco_cat_id.keys()) - coco_cat_id_with_instances = defaultdict(int) - - new_annos = [] - ann_id = 1 - for ann in lvis_annos: - lvis_cat_id = ann["category_id"] - synset = lvis_cat_id_to_synset[lvis_cat_id] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - new_ann = copy.deepcopy(ann) - new_ann["category_id"] = coco_cat_id - new_ann["id"] = ann_id - ann_id += 1 - new_annos.append(new_ann) - coco_cat_id_with_instances[coco_cat_id] += 1 - cocofied_lvis["annotations"] = new_annos - - for image in cocofied_lvis["images"]: - for key in ["not_exhaustive_category_ids", "neg_category_ids"]: - new_category_list = [] - for lvis_cat_id in image[key]: - synset = lvis_cat_id_to_synset[lvis_cat_id] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - new_category_list.append(coco_cat_id) - coco_cat_id_with_instances[coco_cat_id] += 1 - image[key] = new_category_list - - coco_cat_id_with_instances = set(coco_cat_id_with_instances.keys()) - - new_categories = [] - for cat in lvis_json["categories"]: - synset = cat["synset"] - if synset not in synsets_to_keep: - continue - coco_cat_id = synset_to_coco_cat_id[synset] - if coco_cat_id not in coco_cat_id_with_instances: - continue - new_cat = copy.deepcopy(cat) - new_cat["id"] = coco_cat_id - new_categories.append(new_cat) - cocofied_lvis["categories"] = new_categories - - with open(output_filename, "w") as f: - json.dump(cocofied_lvis, f) - print("{} is COCOfied and stored in {}.".format(input_filename, output_filename)) - - -if __name__ == "__main__": - dataset_dir = os.path.join(os.getenv("DETECTRON2_DATASETS", "datasets"), "lvis") - for s in ["lvis_v0.5_train", "lvis_v0.5_val"]: - print("Start COCOfing {}.".format(s)) - cocofy_lvis( - os.path.join(dataset_dir, "{}.json".format(s)), - os.path.join(dataset_dir, "{}_cocofied.json".format(s)), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh deleted file mode 100755 index 67e875a4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_for_tests.sh +++ /dev/null @@ -1,31 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -# Download the mini dataset (coco val2017_100, with only 100 images) -# to be used in unittests & integration tests. - -cd "${0%/*}" - -BASE=https://dl.fbaipublicfiles.com/detectron2 -ROOT=${DETECTRON2_DATASETS:-./} -ROOT=${ROOT/#\~/$HOME} # expand ~ to HOME -mkdir -p $ROOT/coco/annotations - -for anno in instances_val2017_100 \ - person_keypoints_val2017_100 ; do - - dest=$ROOT/coco/annotations/$anno.json - [[ -s $dest ]] && { - echo "$dest exists. Skipping ..." - } || { - wget $BASE/annotations/coco/$anno.json -O $dest - } -done - -dest=$ROOT/coco/val2017_100.tgz -[[ -d $ROOT/coco/val2017 ]] && { - echo "$ROOT/coco/val2017 exists. Skipping ..." 
-} || { - wget $BASE/annotations/coco/val2017_100.tgz -O $dest - tar xzf $dest -C $ROOT/coco/ && rm -f $dest -} diff --git a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py deleted file mode 100755 index 597d791a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/datasets/prepare_panoptic_fpn.py +++ /dev/null @@ -1,116 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import json -import multiprocessing as mp -import numpy as np -import os -import time -from fvcore.common.download import download -from panopticapi.utils import rgb2id -from PIL import Image - -from detectron2.data.datasets.builtin_meta import COCO_CATEGORIES - - -def _process_panoptic_to_semantic(input_panoptic, output_semantic, segments, id_map): - panoptic = np.asarray(Image.open(input_panoptic), dtype=np.uint32) - panoptic = rgb2id(panoptic) - output = np.zeros_like(panoptic, dtype=np.uint8) + 255 - for seg in segments: - cat_id = seg["category_id"] - new_cat_id = id_map[cat_id] - output[panoptic == seg["id"]] = new_cat_id - Image.fromarray(output).save(output_semantic) - - -def separate_coco_semantic_from_panoptic(panoptic_json, panoptic_root, sem_seg_root, categories): - """ - Create semantic segmentation annotations from panoptic segmentation - annotations, to be used by PanopticFPN. - - It maps all thing categories to class 0, and maps all unlabeled pixels to class 255. - It maps all stuff categories to contiguous ids starting from 1. - - Args: - panoptic_json (str): path to the panoptic json file, in COCO's format. - panoptic_root (str): a directory with panoptic annotation files, in COCO's format. - sem_seg_root (str): a directory to output semantic annotation files - categories (list[dict]): category metadata. Each dict needs to have: - "id": corresponds to the "category_id" in the json annotations - "isthing": 0 or 1 - """ - os.makedirs(sem_seg_root, exist_ok=True) - - stuff_ids = [k["id"] for k in categories if k["isthing"] == 0] - thing_ids = [k["id"] for k in categories if k["isthing"] == 1] - id_map = {} # map from category id to id in the output semantic annotation - assert len(stuff_ids) <= 254 - for i, stuff_id in enumerate(stuff_ids): - id_map[stuff_id] = i + 1 - for thing_id in thing_ids: - id_map[thing_id] = 0 - id_map[0] = 255 - - with open(panoptic_json) as f: - obj = json.load(f) - - pool = mp.Pool(processes=max(mp.cpu_count() // 2, 4)) - - def iter_annotations(): - for anno in obj["annotations"]: - file_name = anno["file_name"] - segments = anno["segments_info"] - input = os.path.join(panoptic_root, file_name) - output = os.path.join(sem_seg_root, file_name) - yield input, output, segments - - print("Start writing to {} ...".format(sem_seg_root)) - start = time.time() - pool.starmap( - functools.partial(_process_panoptic_to_semantic, id_map=id_map), - iter_annotations(), - chunksize=100, - ) - print("Finished. 
time: {:.2f}s".format(time.time() - start)) - - -if __name__ == "__main__": - dataset_dir = os.path.join(os.getenv("DETECTRON2_DATASETS", "datasets"), "coco") - for s in ["val2017", "train2017"]: - separate_coco_semantic_from_panoptic( - os.path.join(dataset_dir, "annotations/panoptic_{}.json".format(s)), - os.path.join(dataset_dir, "panoptic_{}".format(s)), - os.path.join(dataset_dir, "panoptic_stuff_{}".format(s)), - COCO_CATEGORIES, - ) - - # Prepare val2017_100 for quick testing: - - dest_dir = os.path.join(dataset_dir, "annotations/") - URL_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - download(URL_PREFIX + "annotations/coco/panoptic_val2017_100.json", dest_dir) - with open(os.path.join(dest_dir, "panoptic_val2017_100.json")) as f: - obj = json.load(f) - - def link_val100(dir_full, dir_100): - print("Creating " + dir_100 + " ...") - os.makedirs(dir_100, exist_ok=True) - for img in obj["images"]: - basename = os.path.splitext(img["file_name"])[0] - src = os.path.join(dir_full, basename + ".png") - dst = os.path.join(dir_100, basename + ".png") - src = os.path.relpath(src, start=dir_100) - os.symlink(src, dst) - - link_val100( - os.path.join(dataset_dir, "panoptic_val2017"), - os.path.join(dataset_dir, "panoptic_val2017_100"), - ) - - link_val100( - os.path.join(dataset_dir, "panoptic_stuff_val2017"), - os.path.join(dataset_dir, "panoptic_stuff_val2017_100"), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/demo/README.md b/grit_src_deprecated/third_party/CenterNet2/demo/README.md deleted file mode 100755 index 133d8d38..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/demo/README.md +++ /dev/null @@ -1,8 +0,0 @@ - -## Detectron2 Demo - -We provide a command line tool to run a simple demo of builtin configs. -The usage is explained in [GETTING_STARTED.md](../GETTING_STARTED.md). - -See our [blog post](https://ai.facebook.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-) -for a high-quality demo generated with this tool. diff --git a/grit_src_deprecated/third_party/CenterNet2/demo/demo.py b/grit_src_deprecated/third_party/CenterNet2/demo/demo.py deleted file mode 100755 index 4baa8767..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/demo/demo.py +++ /dev/null @@ -1,188 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import argparse -import glob -import multiprocessing as mp -import numpy as np -import os -import tempfile -import time -import warnings -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -from predictor import VisualizationDemo - -# constants -WINDOW_NAME = "COCO detections" - - -def setup_cfg(args): - # load config from file and command-line arguments - cfg = get_cfg() - # To use demo for Panoptic-DeepLab, please uncomment the following two lines. 
- # from detectron2.projects.panoptic_deeplab import add_panoptic_deeplab_config # noqa - # add_panoptic_deeplab_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - # Set score_threshold for builtin models - cfg.MODEL.RETINANET.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args.confidence_threshold - cfg.freeze() - return cfg - - -def get_parser(): - parser = argparse.ArgumentParser(description="Detectron2 demo for builtin configs") - parser.add_argument( - "--config-file", - default="configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml", - metavar="FILE", - help="path to config file", - ) - parser.add_argument("--webcam", action="store_true", help="Take inputs from webcam.") - parser.add_argument("--video-input", help="Path to video file.") - parser.add_argument( - "--input", - nargs="+", - help="A list of space separated input images; " - "or a single glob pattern such as 'directory/*.jpg'", - ) - parser.add_argument( - "--output", - help="A file or directory to save output visualizations. " - "If not given, will show output in an OpenCV window.", - ) - - parser.add_argument( - "--confidence-threshold", - type=float, - default=0.5, - help="Minimum score for instance predictions to be shown", - ) - parser.add_argument( - "--opts", - help="Modify config options using the command-line 'KEY VALUE' pairs", - default=[], - nargs=argparse.REMAINDER, - ) - return parser - - -def test_opencv_video_format(codec, file_ext): - with tempfile.TemporaryDirectory(prefix="video_format_test") as dir: - filename = os.path.join(dir, "test_file" + file_ext) - writer = cv2.VideoWriter( - filename=filename, - fourcc=cv2.VideoWriter_fourcc(*codec), - fps=float(30), - frameSize=(10, 10), - isColor=True, - ) - [writer.write(np.zeros((10, 10, 3), np.uint8)) for _ in range(30)] - writer.release() - if os.path.isfile(filename): - return True - return False - - -if __name__ == "__main__": - mp.set_start_method("spawn", force=True) - args = get_parser().parse_args() - setup_logger(name="fvcore") - logger = setup_logger() - logger.info("Arguments: " + str(args)) - - cfg = setup_cfg(args) - - demo = VisualizationDemo(cfg) - - if args.input: - if len(args.input) == 1: - args.input = glob.glob(os.path.expanduser(args.input[0])) - assert args.input, "The input path(s) was not found" - for path in tqdm.tqdm(args.input, disable=not args.output): - # use PIL, to be consistent with evaluation - img = read_image(path, format="BGR") - start_time = time.time() - predictions, visualized_output = demo.run_on_image(img) - logger.info( - "{}: {} in {:.2f}s".format( - path, - "detected {} instances".format(len(predictions["instances"])) - if "instances" in predictions - else "finished", - time.time() - start_time, - ) - ) - - if args.output: - if os.path.isdir(args.output): - assert os.path.isdir(args.output), args.output - out_filename = os.path.join(args.output, os.path.basename(path)) - else: - assert len(args.input) == 1, "Please specify a directory with args.output" - out_filename = args.output - visualized_output.save(out_filename) - else: - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, visualized_output.get_image()[:, :, ::-1]) - if cv2.waitKey(0) == 27: - break # esc to quit - elif args.webcam: - assert args.input is None, "Cannot have both --input and --webcam!" 
- assert args.output is None, "output not yet supported with --webcam!" - cam = cv2.VideoCapture(0) - for vis in tqdm.tqdm(demo.run_on_video(cam)): - cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, vis) - if cv2.waitKey(1) == 27: - break # esc to quit - cam.release() - cv2.destroyAllWindows() - elif args.video_input: - video = cv2.VideoCapture(args.video_input) - width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH)) - height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT)) - frames_per_second = video.get(cv2.CAP_PROP_FPS) - num_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT)) - basename = os.path.basename(args.video_input) - codec, file_ext = ( - ("x264", ".mkv") if test_opencv_video_format("x264", ".mkv") else ("mp4v", ".mp4") - ) - if codec == "mp4v": - warnings.warn("x264 codec not available, switching to mp4v") - if args.output: - if os.path.isdir(args.output): - output_fname = os.path.join(args.output, basename) - output_fname = os.path.splitext(output_fname)[0] + file_ext - else: - output_fname = args.output - assert not os.path.isfile(output_fname), output_fname - output_file = cv2.VideoWriter( - filename=output_fname, - # some installations of opencv may not support x264 (due to its license), - # you can try other formats (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*codec), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - assert os.path.isfile(args.video_input) - for vis_frame in tqdm.tqdm(demo.run_on_video(video), total=num_frames): - if args.output: - output_file.write(vis_frame) - else: - cv2.namedWindow(basename, cv2.WINDOW_NORMAL) - cv2.imshow(basename, vis_frame) - if cv2.waitKey(1) == 27: - break # esc to quit - video.release() - if args.output: - output_file.release() - else: - cv2.destroyAllWindows() diff --git a/grit_src_deprecated/third_party/CenterNet2/demo/predictor.py b/grit_src_deprecated/third_party/CenterNet2/demo/predictor.py deleted file mode 100755 index 7b7ebd3f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/demo/predictor.py +++ /dev/null @@ -1,220 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import atexit -import bisect -import multiprocessing as mp -from collections import deque -import cv2 -import torch - -from detectron2.data import MetadataCatalog -from detectron2.engine.defaults import DefaultPredictor -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer - - -class VisualizationDemo(object): - def __init__(self, cfg, instance_mode=ColorMode.IMAGE, parallel=False): - """ - Args: - cfg (CfgNode): - instance_mode (ColorMode): - parallel (bool): whether to run the model in different processes from visualization. - Useful since the visualization logic can be slow. - """ - self.metadata = MetadataCatalog.get( - cfg.DATASETS.TEST[0] if len(cfg.DATASETS.TEST) else "__unused" - ) - self.cpu_device = torch.device("cpu") - self.instance_mode = instance_mode - - self.parallel = parallel - if parallel: - num_gpu = torch.cuda.device_count() - self.predictor = AsyncPredictor(cfg, num_gpus=num_gpu) - else: - self.predictor = DefaultPredictor(cfg) - - def run_on_image(self, image): - """ - Args: - image (np.ndarray): an image of shape (H, W, C) (in BGR order). - This is the format used by OpenCV. - - Returns: - predictions (dict): the output of the model. - vis_output (VisImage): the visualized image output.
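`run_on_image`, whose signature is documented above, is the simplest entry point into this demo class. A hypothetical invocation (the config and image paths are placeholders; `VisualizationDemo` is the class defined in this file):

```python
import cv2
from detectron2.config import get_cfg

cfg = get_cfg()
# Placeholder config; any detection config with trained weights would do.
cfg.merge_from_file("configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml")
cfg.freeze()

demo = VisualizationDemo(cfg)
image = cv2.imread("input.jpg")          # BGR, as run_on_image expects
predictions, vis_output = demo.run_on_image(image)
# VisImage.get_image() returns RGB; flip back to BGR for OpenCV.
cv2.imwrite("output.jpg", vis_output.get_image()[:, :, ::-1])
```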
- """ - vis_output = None - predictions = self.predictor(image) - # Convert image from OpenCV BGR format to Matplotlib RGB format. - image = image[:, :, ::-1] - visualizer = Visualizer(image, self.metadata, instance_mode=self.instance_mode) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_output = visualizer.draw_panoptic_seg_predictions( - panoptic_seg.to(self.cpu_device), segments_info - ) - else: - if "sem_seg" in predictions: - vis_output = visualizer.draw_sem_seg( - predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - if "instances" in predictions: - instances = predictions["instances"].to(self.cpu_device) - vis_output = visualizer.draw_instance_predictions(predictions=instances) - - return predictions, vis_output - - def _frame_from_video(self, video): - while video.isOpened(): - success, frame = video.read() - if success: - yield frame - else: - break - - def run_on_video(self, video): - """ - Visualizes predictions on frames of the input video. - - Args: - video (cv2.VideoCapture): a :class:`VideoCapture` object, whose source can be - either a webcam or a video file. - - Yields: - ndarray: BGR visualizations of each video frame. - """ - video_visualizer = VideoVisualizer(self.metadata, self.instance_mode) - - def process_predictions(frame, predictions): - frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_frame = video_visualizer.draw_panoptic_seg_predictions( - frame, panoptic_seg.to(self.cpu_device), segments_info - ) - elif "instances" in predictions: - predictions = predictions["instances"].to(self.cpu_device) - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - elif "sem_seg" in predictions: - vis_frame = video_visualizer.draw_sem_seg( - frame, predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - - # Converts Matplotlib RGB format to OpenCV BGR format - vis_frame = cv2.cvtColor(vis_frame.get_image(), cv2.COLOR_RGB2BGR) - return vis_frame - - frame_gen = self._frame_from_video(video) - if self.parallel: - buffer_size = self.predictor.default_buffer_size - - frame_data = deque() - - for cnt, frame in enumerate(frame_gen): - frame_data.append(frame) - self.predictor.put(frame) - - if cnt >= buffer_size: - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - - while len(frame_data): - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - else: - for frame in frame_gen: - yield process_predictions(frame, self.predictor(frame)) - - -class AsyncPredictor: - """ - A predictor that runs the model asynchronously, possibly on >1 GPUs. - Because rendering the visualization takes considerably amount of time, - this helps improve throughput a little bit when rendering videos. 
- """ - - class _StopToken: - pass - - class _PredictWorker(mp.Process): - def __init__(self, cfg, task_queue, result_queue): - self.cfg = cfg - self.task_queue = task_queue - self.result_queue = result_queue - super().__init__() - - def run(self): - predictor = DefaultPredictor(self.cfg) - - while True: - task = self.task_queue.get() - if isinstance(task, AsyncPredictor._StopToken): - break - idx, data = task - result = predictor(data) - self.result_queue.put((idx, result)) - - def __init__(self, cfg, num_gpus: int = 1): - """ - Args: - cfg (CfgNode): - num_gpus (int): if 0, will run on CPU - """ - num_workers = max(num_gpus, 1) - self.task_queue = mp.Queue(maxsize=num_workers * 3) - self.result_queue = mp.Queue(maxsize=num_workers * 3) - self.procs = [] - for gpuid in range(max(num_gpus, 1)): - cfg = cfg.clone() - cfg.defrost() - cfg.MODEL.DEVICE = "cuda:{}".format(gpuid) if num_gpus > 0 else "cpu" - self.procs.append( - AsyncPredictor._PredictWorker(cfg, self.task_queue, self.result_queue) - ) - - self.put_idx = 0 - self.get_idx = 0 - self.result_rank = [] - self.result_data = [] - - for p in self.procs: - p.start() - atexit.register(self.shutdown) - - def put(self, image): - self.put_idx += 1 - self.task_queue.put((self.put_idx, image)) - - def get(self): - self.get_idx += 1 # the index needed for this request - if len(self.result_rank) and self.result_rank[0] == self.get_idx: - res = self.result_data[0] - del self.result_data[0], self.result_rank[0] - return res - - while True: - # make sure the results are returned in the correct order - idx, res = self.result_queue.get() - if idx == self.get_idx: - return res - insert = bisect.bisect(self.result_rank, idx) - self.result_rank.insert(insert, idx) - self.result_data.insert(insert, res) - - def __len__(self): - return self.put_idx - self.get_idx - - def __call__(self, image): - self.put(image) - return self.get() - - def shutdown(self): - for _ in self.procs: - self.task_queue.put(AsyncPredictor._StopToken()) - - @property - def default_buffer_size(self): - return len(self.procs) * 5 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py deleted file mode 100755 index bdd994b4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from .utils.env import setup_environment - -setup_environment() - - -# This line will be programatically read/write by setup.py. -# Leave them at the bottom of this file and don't touch them. -__version__ = "0.6" diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py deleted file mode 100755 index 99da0469..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -# File: - - -from . 
import catalog as _UNUSED # register the handler -from .detection_checkpoint import DetectionCheckpointer -from fvcore.common.checkpoint import Checkpointer, PeriodicCheckpointer - -__all__ = ["Checkpointer", "PeriodicCheckpointer", "DetectionCheckpointer"] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py deleted file mode 100755 index 8c8d181b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/c2_model_loading.py +++ /dev/null @@ -1,407 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import re -from typing import Dict, List -import torch -from tabulate import tabulate - - -def convert_basic_c2_names(original_keys): - """ - Apply some basic name conversion to names in C2 weights. - It only deals with typical backbone models. - - Args: - original_keys (list[str]): - Returns: - list[str]: The same number of strings matching those in original_keys. - """ - layer_keys = copy.deepcopy(original_keys) - layer_keys = [ - {"pred_b": "linear_b", "pred_w": "linear_w"}.get(k, k) for k in layer_keys - ] # some hard-coded mappings - - layer_keys = [k.replace("_", ".") for k in layer_keys] - layer_keys = [re.sub("\\.b$", ".bias", k) for k in layer_keys] - layer_keys = [re.sub("\\.w$", ".weight", k) for k in layer_keys] - # Uniform both bn and gn names to "norm" - layer_keys = [re.sub("bn\\.s$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.bias$", "norm.bias", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.rm", "norm.running_mean", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.running.mean$", "norm.running_mean", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.riv$", "norm.running_var", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.running.var$", "norm.running_var", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.gamma$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("bn\\.beta$", "norm.bias", k) for k in layer_keys] - layer_keys = [re.sub("gn\\.s$", "norm.weight", k) for k in layer_keys] - layer_keys = [re.sub("gn\\.bias$", "norm.bias", k) for k in layer_keys] - - # stem - layer_keys = [re.sub("^res\\.conv1\\.norm\\.", "conv1.norm.", k) for k in layer_keys] - # to avoid mis-matching with "conv1" in other components (e.g. 
detection head) - layer_keys = [re.sub("^conv1\\.", "stem.conv1.", k) for k in layer_keys] - - # layer1-4 is used by torchvision, however we follow the C2 naming strategy (res2-5) - # layer_keys = [re.sub("^res2.", "layer1.", k) for k in layer_keys] - # layer_keys = [re.sub("^res3.", "layer2.", k) for k in layer_keys] - # layer_keys = [re.sub("^res4.", "layer3.", k) for k in layer_keys] - # layer_keys = [re.sub("^res5.", "layer4.", k) for k in layer_keys] - - # blocks - layer_keys = [k.replace(".branch1.", ".shortcut.") for k in layer_keys] - layer_keys = [k.replace(".branch2a.", ".conv1.") for k in layer_keys] - layer_keys = [k.replace(".branch2b.", ".conv2.") for k in layer_keys] - layer_keys = [k.replace(".branch2c.", ".conv3.") for k in layer_keys] - - # DensePose substitutions - layer_keys = [re.sub("^body.conv.fcn", "body_conv_fcn", k) for k in layer_keys] - layer_keys = [k.replace("AnnIndex.lowres", "ann_index_lowres") for k in layer_keys] - layer_keys = [k.replace("Index.UV.lowres", "index_uv_lowres") for k in layer_keys] - layer_keys = [k.replace("U.lowres", "u_lowres") for k in layer_keys] - layer_keys = [k.replace("V.lowres", "v_lowres") for k in layer_keys] - return layer_keys - - -def convert_c2_detectron_names(weights): - """ - Map Caffe2 Detectron weight names to Detectron2 names. - - Args: - weights (dict): name -> tensor - - Returns: - dict: detectron2 names -> tensor - dict: detectron2 names -> C2 names - """ - logger = logging.getLogger(__name__) - logger.info("Renaming Caffe2 weights ......") - original_keys = sorted(weights.keys()) - layer_keys = copy.deepcopy(original_keys) - - layer_keys = convert_basic_c2_names(layer_keys) - - # -------------------------------------------------------------------------- - # RPN hidden representation conv - # -------------------------------------------------------------------------- - # FPN case - # In the C2 model, the RPN hidden layer conv is defined for FPN level 2 and then - # shared for all other levels, hence the appearance of "fpn2" - layer_keys = [ - k.replace("conv.rpn.fpn2", "proposal_generator.rpn_head.conv") for k in layer_keys - ] - # Non-FPN case - layer_keys = [k.replace("conv.rpn", "proposal_generator.rpn_head.conv") for k in layer_keys] - - # -------------------------------------------------------------------------- - # RPN box transformation conv - # -------------------------------------------------------------------------- - # FPN case (see note above about "fpn2") - layer_keys = [ - k.replace("rpn.bbox.pred.fpn2", "proposal_generator.rpn_head.anchor_deltas") - for k in layer_keys - ] - layer_keys = [ - k.replace("rpn.cls.logits.fpn2", "proposal_generator.rpn_head.objectness_logits") - for k in layer_keys - ] - # Non-FPN case - layer_keys = [ - k.replace("rpn.bbox.pred", "proposal_generator.rpn_head.anchor_deltas") for k in layer_keys - ] - layer_keys = [ - k.replace("rpn.cls.logits", "proposal_generator.rpn_head.objectness_logits") - for k in layer_keys - ] - - # -------------------------------------------------------------------------- - # Fast R-CNN box head - # -------------------------------------------------------------------------- - layer_keys = [re.sub("^bbox\\.pred", "bbox_pred", k) for k in layer_keys] - layer_keys = [re.sub("^cls\\.score", "cls_score", k) for k in layer_keys] - layer_keys = [re.sub("^fc6\\.", "box_head.fc1.", k) for k in layer_keys] - layer_keys = [re.sub("^fc7\\.", "box_head.fc2.", k) for k in layer_keys] - # 4conv1fc head tensor names: head_conv1_w, head_conv1_gn_s - layer_keys = 
[re.sub("^head\\.conv", "box_head.conv", k) for k in layer_keys] - - # -------------------------------------------------------------------------- - # FPN lateral and output convolutions - # -------------------------------------------------------------------------- - def fpn_map(name): - """ - Look for keys with the following patterns: - 1) Starts with "fpn.inner." - Example: "fpn.inner.res2.2.sum.lateral.weight" - Meaning: These are lateral pathway convolutions - 2) Starts with "fpn.res" - Example: "fpn.res2.2.sum.weight" - Meaning: These are FPN output convolutions - """ - splits = name.split(".") - norm = ".norm" if "norm" in splits else "" - if name.startswith("fpn.inner."): - # splits example: ['fpn', 'inner', 'res2', '2', 'sum', 'lateral', 'weight'] - stage = int(splits[2][len("res") :]) - return "fpn_lateral{}{}.{}".format(stage, norm, splits[-1]) - elif name.startswith("fpn.res"): - # splits example: ['fpn', 'res2', '2', 'sum', 'weight'] - stage = int(splits[1][len("res") :]) - return "fpn_output{}{}.{}".format(stage, norm, splits[-1]) - return name - - layer_keys = [fpn_map(k) for k in layer_keys] - - # -------------------------------------------------------------------------- - # Mask R-CNN mask head - # -------------------------------------------------------------------------- - # roi_heads.StandardROIHeads case - layer_keys = [k.replace(".[mask].fcn", "mask_head.mask_fcn") for k in layer_keys] - layer_keys = [re.sub("^\\.mask\\.fcn", "mask_head.mask_fcn", k) for k in layer_keys] - layer_keys = [k.replace("mask.fcn.logits", "mask_head.predictor") for k in layer_keys] - # roi_heads.Res5ROIHeads case - layer_keys = [k.replace("conv5.mask", "mask_head.deconv") for k in layer_keys] - - # -------------------------------------------------------------------------- - # Keypoint R-CNN head - # -------------------------------------------------------------------------- - # interestingly, the keypoint head convs have blob names that are simply "conv_fcnX" - layer_keys = [k.replace("conv.fcn", "roi_heads.keypoint_head.conv_fcn") for k in layer_keys] - layer_keys = [ - k.replace("kps.score.lowres", "roi_heads.keypoint_head.score_lowres") for k in layer_keys - ] - layer_keys = [k.replace("kps.score.", "roi_heads.keypoint_head.score.") for k in layer_keys] - - # -------------------------------------------------------------------------- - # Done with replacements - # -------------------------------------------------------------------------- - assert len(set(layer_keys)) == len(layer_keys) - assert len(original_keys) == len(layer_keys) - - new_weights = {} - new_keys_to_original_keys = {} - for orig, renamed in zip(original_keys, layer_keys): - new_keys_to_original_keys[renamed] = orig - if renamed.startswith("bbox_pred.") or renamed.startswith("mask_head.predictor."): - # remove the meaningless prediction weight for background class - new_start_idx = 4 if renamed.startswith("bbox_pred.") else 1 - new_weights[renamed] = weights[orig][new_start_idx:] - logger.info( - "Remove prediction weight for background class in {}. 
The shape changes from " - "{} to {}.".format( - renamed, tuple(weights[orig].shape), tuple(new_weights[renamed].shape) - ) - ) - elif renamed.startswith("cls_score."): - # move weights of bg class from original index 0 to last index - logger.info( - "Move classification weights for background class in {} from index 0 to " - "index {}.".format(renamed, weights[orig].shape[0] - 1) - ) - new_weights[renamed] = torch.cat([weights[orig][1:], weights[orig][:1]]) - else: - new_weights[renamed] = weights[orig] - - return new_weights, new_keys_to_original_keys - - -# Note the current matching is not symmetric. -# it assumes model_state_dict will have longer names. -def align_and_update_state_dicts(model_state_dict, ckpt_state_dict, c2_conversion=True): - """ - Match names between the two state-dict, and returns a new chkpt_state_dict with names - converted to match model_state_dict with heuristics. The returned dict can be later - loaded with fvcore checkpointer. - If `c2_conversion==True`, `ckpt_state_dict` is assumed to be a Caffe2 - model and will be renamed at first. - - Strategy: suppose that the models that we will create will have prefixes appended - to each of its keys, for example due to an extra level of nesting that the original - pre-trained weights from ImageNet won't contain. For example, model.state_dict() - might return backbone[0].body.res2.conv1.weight, while the pre-trained model contains - res2.conv1.weight. We thus want to match both parameters together. - For that, we look for each model weight, look among all loaded keys if there is one - that is a suffix of the current weight name, and use it if that's the case. - If multiple matches exist, take the one with longest size - of the corresponding name. For example, for the same model as before, the pretrained - weight file can contain both res2.conv1.weight, as well as conv1.weight. In this case, - we want to match backbone[0].body.conv1.weight to conv1.weight, and - backbone[0].body.res2.conv1.weight to res2.conv1.weight. - """ - model_keys = sorted(model_state_dict.keys()) - if c2_conversion: - ckpt_state_dict, original_keys = convert_c2_detectron_names(ckpt_state_dict) - # original_keys: the name in the original dict (before renaming) - else: - original_keys = {x: x for x in ckpt_state_dict.keys()} - ckpt_keys = sorted(ckpt_state_dict.keys()) - - def match(a, b): - # Matched ckpt_key should be a complete (starts with '.') suffix. - # For example, roi_heads.mesh_head.whatever_conv1 does not match conv1, - # but matches whatever_conv1 or mesh_head.whatever_conv1. - return a == b or a.endswith("." 
+ b) - - # get a matrix of string matches, where each (i, j) entry correspond to the size of the - # ckpt_key string, if it matches - match_matrix = [len(j) if match(i, j) else 0 for i in model_keys for j in ckpt_keys] - match_matrix = torch.as_tensor(match_matrix).view(len(model_keys), len(ckpt_keys)) - # use the matched one with longest size in case of multiple matches - max_match_size, idxs = match_matrix.max(1) - # remove indices that correspond to no-match - idxs[max_match_size == 0] = -1 - - logger = logging.getLogger(__name__) - # matched_pairs (matched checkpoint key --> matched model key) - matched_keys = {} - result_state_dict = {} - for idx_model, idx_ckpt in enumerate(idxs.tolist()): - if idx_ckpt == -1: - continue - key_model = model_keys[idx_model] - key_ckpt = ckpt_keys[idx_ckpt] - value_ckpt = ckpt_state_dict[key_ckpt] - shape_in_model = model_state_dict[key_model].shape - - if shape_in_model != value_ckpt.shape: - logger.warning( - "Shape of {} in checkpoint is {}, while shape of {} in model is {}.".format( - key_ckpt, value_ckpt.shape, key_model, shape_in_model - ) - ) - logger.warning( - "{} will not be loaded. Please double check and see if this is desired.".format( - key_ckpt - ) - ) - continue - - assert key_model not in result_state_dict - result_state_dict[key_model] = value_ckpt - if key_ckpt in matched_keys: # already added to matched_keys - logger.error( - "Ambiguity found for {} in checkpoint!" - "It matches at least two keys in the model ({} and {}).".format( - key_ckpt, key_model, matched_keys[key_ckpt] - ) - ) - raise ValueError("Cannot match one checkpoint key to multiple keys in the model.") - - matched_keys[key_ckpt] = key_model - - # logging: - matched_model_keys = sorted(matched_keys.values()) - if len(matched_model_keys) == 0: - logger.warning("No weights in checkpoint matched with model.") - return ckpt_state_dict - common_prefix = _longest_common_prefix(matched_model_keys) - rev_matched_keys = {v: k for k, v in matched_keys.items()} - original_keys = {k: original_keys[rev_matched_keys[k]] for k in matched_model_keys} - - model_key_groups = _group_keys_by_module(matched_model_keys, original_keys) - table = [] - memo = set() - for key_model in matched_model_keys: - if key_model in memo: - continue - if key_model in model_key_groups: - group = model_key_groups[key_model] - memo |= set(group) - shapes = [tuple(model_state_dict[k].shape) for k in group] - table.append( - ( - _longest_common_prefix([k[len(common_prefix) :] for k in group]) + "*", - _group_str([original_keys[k] for k in group]), - " ".join([str(x).replace(" ", "") for x in shapes]), - ) - ) - else: - key_checkpoint = original_keys[key_model] - shape = str(tuple(model_state_dict[key_model].shape)) - table.append((key_model[len(common_prefix) :], key_checkpoint, shape)) - table_str = tabulate( - table, tablefmt="pipe", headers=["Names in Model", "Names in Checkpoint", "Shapes"] - ) - logger.info( - "Following weights matched with " - + (f"submodule {common_prefix[:-1]}" if common_prefix else "model") - + ":\n" - + table_str - ) - - unmatched_ckpt_keys = [k for k in ckpt_keys if k not in set(matched_keys.keys())] - for k in unmatched_ckpt_keys: - result_state_dict[k] = ckpt_state_dict[k] - return result_state_dict - - -def _group_keys_by_module(keys: List[str], original_names: Dict[str, str]): - """ - Params in the same submodule are grouped together. 
- - Args: - keys: names of all parameters - original_names: mapping from parameter name to their name in the checkpoint - - Returns: - dict[name -> all other names in the same group] - """ - - def _submodule_name(key): - pos = key.rfind(".") - if pos < 0: - return None - prefix = key[: pos + 1] - return prefix - - all_submodules = [_submodule_name(k) for k in keys] - all_submodules = [x for x in all_submodules if x] - all_submodules = sorted(all_submodules, key=len) - - ret = {} - for prefix in all_submodules: - group = [k for k in keys if k.startswith(prefix)] - if len(group) <= 1: - continue - original_name_lcp = _longest_common_prefix_str([original_names[k] for k in group]) - if len(original_name_lcp) == 0: - # don't group weights if original names don't share prefix - continue - - for k in group: - if k in ret: - continue - ret[k] = group - return ret - - -def _longest_common_prefix(names: List[str]) -> str: - """ - ["abc.zfg", "abc.zef"] -> "abc." - """ - names = [n.split(".") for n in names] - m1, m2 = min(names), max(names) - ret = [a for a, b in zip(m1, m2) if a == b] - ret = ".".join(ret) + "." if len(ret) else "" - return ret - - -def _longest_common_prefix_str(names: List[str]) -> str: - m1, m2 = min(names), max(names) - lcp = [a for a, b in zip(m1, m2) if a == b] - lcp = "".join(lcp) - return lcp - - -def _group_str(names: List[str]) -> str: - """ - Turn "common1", "common2", "common3" into "common{1,2,3}" - """ - lcp = _longest_common_prefix_str(names) - rest = [x[len(lcp) :] for x in names] - rest = "{" + ",".join(rest) + "}" - ret = lcp + rest - - # add some simplification for BN specifically - ret = ret.replace("bn_{beta,running_mean,running_var,gamma}", "bn_*") - ret = ret.replace("bn_beta,bn_running_mean,bn_running_var,bn_gamma", "bn_*") - return ret diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py deleted file mode 100755 index 9a857367..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/catalog.py +++ /dev/null @@ -1,115 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging - -from detectron2.utils.file_io import PathHandler, PathManager - - -class ModelCatalog(object): - """ - Store mappings from names to third-party models. - """ - - S3_C2_DETECTRON_PREFIX = "https://dl.fbaipublicfiles.com/detectron" - - # MSRA models have STRIDE_IN_1X1=True. False otherwise. - # NOTE: all BN models here have fused BN into an affine layer. - # As a result, you should only load them to a model with "FrozenBN". - # Loading them to a model with regular BN or SyncBN is wrong. - # Even when loaded to FrozenBN, it is still different from affine by an epsilon, - # which should be negligible for training. - # NOTE: all models here uses PIXEL_STD=[1,1,1] - # NOTE: Most of the BN models here are no longer used. We use the - # re-converted pre-trained models under detectron2 model zoo instead. 
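The suffix-matching and key-grouping helpers above are compact but easy to misread, so here is a minimal, self-contained sketch of the two rules that drive `align_and_update_state_dicts`: a checkpoint key matches a model key only as a complete `.`-separated suffix, and when several checkpoint keys match, the longest one wins. Toy keys only; this is not the deleted implementation.

```
# Re-sketch of the suffix-matching heuristic (illustrative, toy data only).

def match(model_key: str, ckpt_key: str) -> bool:
    # A checkpoint key must equal the model key or be a complete ".xxx"
    # suffix: "backbone.res2.conv1.weight" matches "res2.conv1.weight"
    # and "conv1.weight", but never a partial piece like "1.weight".
    return model_key == ckpt_key or model_key.endswith("." + ckpt_key)

def best_match(model_key: str, ckpt_keys):
    # Among all matches, prefer the longest checkpoint key, so the more
    # specific "res2.conv1.weight" beats the generic "conv1.weight".
    candidates = [k for k in ckpt_keys if match(model_key, k)]
    return max(candidates, key=len) if candidates else None

ckpt_keys = ["conv1.weight", "res2.conv1.weight"]
assert best_match("backbone.body.res2.conv1.weight", ckpt_keys) == "res2.conv1.weight"
assert best_match("backbone.body.conv1.weight", ckpt_keys) == "conv1.weight"
```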
- C2_IMAGENET_MODELS = { - "MSRA/R-50": "ImageNetPretrained/MSRA/R-50.pkl", - "MSRA/R-101": "ImageNetPretrained/MSRA/R-101.pkl", - "FAIR/R-50-GN": "ImageNetPretrained/47261647/R-50-GN.pkl", - "FAIR/R-101-GN": "ImageNetPretrained/47592356/R-101-GN.pkl", - "FAIR/X-101-32x8d": "ImageNetPretrained/20171220/X-101-32x8d.pkl", - "FAIR/X-101-64x4d": "ImageNetPretrained/FBResNeXt/X-101-64x4d.pkl", - "FAIR/X-152-32x8d-IN5k": "ImageNetPretrained/25093814/X-152-32x8d-IN5k.pkl", - } - - C2_DETECTRON_PATH_FORMAT = ( - "{prefix}/{url}/output/train/{dataset}/{type}/model_final.pkl" # noqa B950 - ) - - C2_DATASET_COCO = "coco_2014_train%3Acoco_2014_valminusminival" - C2_DATASET_COCO_KEYPOINTS = "keypoints_coco_2014_train%3Akeypoints_coco_2014_valminusminival" - - # format: {model_name} -> part of the url - C2_DETECTRON_MODELS = { - "35857197/e2e_faster_rcnn_R-50-C4_1x": "35857197/12_2017_baselines/e2e_faster_rcnn_R-50-C4_1x.yaml.01_33_49.iAX0mXvW", # noqa B950 - "35857345/e2e_faster_rcnn_R-50-FPN_1x": "35857345/12_2017_baselines/e2e_faster_rcnn_R-50-FPN_1x.yaml.01_36_30.cUF7QR7I", # noqa B950 - "35857890/e2e_faster_rcnn_R-101-FPN_1x": "35857890/12_2017_baselines/e2e_faster_rcnn_R-101-FPN_1x.yaml.01_38_50.sNxI7sX7", # noqa B950 - "36761737/e2e_faster_rcnn_X-101-32x8d-FPN_1x": "36761737/12_2017_baselines/e2e_faster_rcnn_X-101-32x8d-FPN_1x.yaml.06_31_39.5MIHi1fZ", # noqa B950 - "35858791/e2e_mask_rcnn_R-50-C4_1x": "35858791/12_2017_baselines/e2e_mask_rcnn_R-50-C4_1x.yaml.01_45_57.ZgkA7hPB", # noqa B950 - "35858933/e2e_mask_rcnn_R-50-FPN_1x": "35858933/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml.01_48_14.DzEQe4wC", # noqa B950 - "35861795/e2e_mask_rcnn_R-101-FPN_1x": "35861795/12_2017_baselines/e2e_mask_rcnn_R-101-FPN_1x.yaml.02_31_37.KqyEK4tT", # noqa B950 - "36761843/e2e_mask_rcnn_X-101-32x8d-FPN_1x": "36761843/12_2017_baselines/e2e_mask_rcnn_X-101-32x8d-FPN_1x.yaml.06_35_59.RZotkLKI", # noqa B950 - "48616381/e2e_mask_rcnn_R-50-FPN_2x_gn": "GN/48616381/04_2018_gn_baselines/e2e_mask_rcnn_R-50-FPN_2x_gn_0416.13_23_38.bTlTI97Q", # noqa B950 - "37697547/e2e_keypoint_rcnn_R-50-FPN_1x": "37697547/12_2017_baselines/e2e_keypoint_rcnn_R-50-FPN_1x.yaml.08_42_54.kdzV35ao", # noqa B950 - "35998355/rpn_R-50-C4_1x": "35998355/12_2017_baselines/rpn_R-50-C4_1x.yaml.08_00_43.njH5oD9L", # noqa B950 - "35998814/rpn_R-50-FPN_1x": "35998814/12_2017_baselines/rpn_R-50-FPN_1x.yaml.08_06_03.Axg0r179", # noqa B950 - "36225147/fast_R-50-FPN_1x": "36225147/12_2017_baselines/fast_rcnn_R-50-FPN_1x.yaml.08_39_09.L3obSdQ2", # noqa B950 - } - - @staticmethod - def get(name): - if name.startswith("Caffe2Detectron/COCO"): - return ModelCatalog._get_c2_detectron_baseline(name) - if name.startswith("ImageNetPretrained/"): - return ModelCatalog._get_c2_imagenet_pretrained(name) - raise RuntimeError("model not present in the catalog: {}".format(name)) - - @staticmethod - def _get_c2_imagenet_pretrained(name): - prefix = ModelCatalog.S3_C2_DETECTRON_PREFIX - name = name[len("ImageNetPretrained/") :] - name = ModelCatalog.C2_IMAGENET_MODELS[name] - url = "/".join([prefix, name]) - return url - - @staticmethod - def _get_c2_detectron_baseline(name): - name = name[len("Caffe2Detectron/COCO/") :] - url = ModelCatalog.C2_DETECTRON_MODELS[name] - if "keypoint_rcnn" in name: - dataset = ModelCatalog.C2_DATASET_COCO_KEYPOINTS - else: - dataset = ModelCatalog.C2_DATASET_COCO - - if "35998355/rpn_R-50-C4_1x" in name: - # this one model is somehow different from others .. 
- type = "rpn" - else: - type = "generalized_rcnn" - - # Detectron C2 models are stored in the structure defined in `C2_DETECTRON_PATH_FORMAT`. - url = ModelCatalog.C2_DETECTRON_PATH_FORMAT.format( - prefix=ModelCatalog.S3_C2_DETECTRON_PREFIX, url=url, type=type, dataset=dataset - ) - return url - - -class ModelCatalogHandler(PathHandler): - """ - Resolve URL like catalog://. - """ - - PREFIX = "catalog://" - - def _get_supported_prefixes(self): - return [self.PREFIX] - - def _get_local_path(self, path, **kwargs): - logger = logging.getLogger(__name__) - catalog_path = ModelCatalog.get(path[len(self.PREFIX) :]) - logger.info("Catalog entry {} points to {}".format(path, catalog_path)) - return PathManager.get_local_path(catalog_path, **kwargs) - - def _open(self, path, mode="r", **kwargs): - return PathManager.open(self._get_local_path(path), mode, **kwargs) - - -PathManager.register_handler(ModelCatalogHandler()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py deleted file mode 100755 index 82fd3b2d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/checkpoint/detection_checkpoint.py +++ /dev/null @@ -1,120 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import os -import pickle -import torch -from fvcore.common.checkpoint import Checkpointer -from torch.nn.parallel import DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.utils.file_io import PathManager - -from .c2_model_loading import align_and_update_state_dicts - - -class DetectionCheckpointer(Checkpointer): - """ - Same as :class:`Checkpointer`, but is able to: - 1. handle models in detectron & detectron2 model zoo, and apply conversions for legacy models. - 2. correctly load checkpoints that are only available on the master worker - """ - - def __init__(self, model, save_dir="", *, save_to_disk=None, **checkpointables): - is_main_process = comm.is_main_process() - super().__init__( - model, - save_dir, - save_to_disk=is_main_process if save_to_disk is None else save_to_disk, - **checkpointables, - ) - self.path_manager = PathManager - - def load(self, path, *args, **kwargs): - need_sync = False - - if path and isinstance(self.model, DistributedDataParallel): - logger = logging.getLogger(__name__) - path = self.path_manager.get_local_path(path) - has_file = os.path.isfile(path) - all_has_file = comm.all_gather(has_file) - if not all_has_file[0]: - raise OSError(f"File {path} not found on main worker.") - if not all(all_has_file): - logger.warning( - f"Not all workers can read checkpoint {path}. " - "Training may fail to fully resume." - ) - # TODO: broadcast the checkpoint file contents from main - # worker, and load from it instead. 
- need_sync = True - if not has_file: - path = None # don't load if not readable - ret = super().load(path, *args, **kwargs) - - if need_sync: - logger.info("Broadcasting model states from main worker ...") - self.model._sync_params_and_buffers() - return ret - - def _load_file(self, filename): - if filename.endswith(".pkl"): - with PathManager.open(filename, "rb") as f: - data = pickle.load(f, encoding="latin1") - if "model" in data and "__author__" in data: - # file is in Detectron2 model zoo format - self.logger.info("Reading a file from '{}'".format(data["__author__"])) - return data - else: - # assume file is from Caffe2 / Detectron1 model zoo - if "blobs" in data: - # Detection models have "blobs", but ImageNet models don't - data = data["blobs"] - data = {k: v for k, v in data.items() if not k.endswith("_momentum")} - return {"model": data, "__author__": "Caffe2", "matching_heuristics": True} - elif filename.endswith(".pyth"): - # assume file is from pycls; no one else seems to use the ".pyth" extension - with PathManager.open(filename, "rb") as f: - data = torch.load(f) - assert ( - "model_state" in data - ), f"Cannot load .pyth file {filename}; pycls checkpoints must contain 'model_state'." - model_state = { - k: v - for k, v in data["model_state"].items() - if not k.endswith("num_batches_tracked") - } - return {"model": model_state, "__author__": "pycls", "matching_heuristics": True} - - loaded = super()._load_file(filename) # load native pth checkpoint - if "model" not in loaded: - loaded = {"model": loaded} - return loaded - - def _load_model(self, checkpoint): - if checkpoint.get("matching_heuristics", False): - self._convert_ndarray_to_tensor(checkpoint["model"]) - # convert weights by name-matching heuristics - checkpoint["model"] = align_and_update_state_dicts( - self.model.state_dict(), - checkpoint["model"], - c2_conversion=checkpoint.get("__author__", None) == "Caffe2", - ) - # for non-caffe2 models, use standard ways to load it - incompatible = super()._load_model(checkpoint) - - model_buffers = dict(self.model.named_buffers(recurse=False)) - for k in ["pixel_mean", "pixel_std"]: - # Ignore missing key message about pixel_mean/std. - # Though they may be missing in old checkpoints, they will be correctly - # initialized from config anyway. - if k in model_buffers: - try: - incompatible.missing_keys.remove(k) - except ValueError: - pass - for k in incompatible.unexpected_keys[:]: - # Ignore unexpected keys about cell anchors. They exist in old checkpoints - # but now they are non-persistent buffers and will not be in new checkpoints. - if "anchor_generator.cell_anchors" in k: - incompatible.unexpected_keys.remove(k) - return incompatible diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py deleted file mode 100755 index 4e648e63..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/__init__.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
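For context on how this checkpointer was typically driven, a hedged usage sketch follows. It assumes a working detectron2 installation; the `catalog://` path is one of the `ImageNetPretrained` entries registered in the catalog above, and the empty default config is only a placeholder for a real one.

```
# Sketch: loading a legacy Caffe2 .pkl through the catalog:// handler.
from detectron2.config import get_cfg
from detectron2.modeling import build_model
from detectron2.checkpoint import DetectionCheckpointer

cfg = get_cfg()
model = build_model(cfg)  # meta-architecture from cfg.MODEL.META_ARCHITECTURE

# ModelCatalogHandler resolves catalog:// to the S3 URL, _load_file detects
# the .pkl format, and align_and_update_state_dicts renames Caffe2 weights.
DetectionCheckpointer(model).load("catalog://ImageNetPretrained/MSRA/R-50")
```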
-from .compat import downgrade_config, upgrade_config -from .config import CfgNode, get_cfg, global_cfg, set_global_cfg, configurable -from .instantiate import instantiate -from .lazy import LazyCall, LazyConfig - -__all__ = [ - "CfgNode", - "get_cfg", - "global_cfg", - "set_global_cfg", - "downgrade_config", - "upgrade_config", - "configurable", - "instantiate", - "LazyCall", - "LazyConfig", -] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py deleted file mode 100755 index 11a08c43..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/compat.py +++ /dev/null @@ -1,229 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Backward compatibility of configs. - -Instructions to bump version: -+ It's not needed to bump version if new keys are added. - It's only needed when backward-incompatible changes happen - (i.e., some existing keys disappear, or the meaning of a key changes) -+ To bump version, do the following: - 1. Increment _C.VERSION in defaults.py - 2. Add a converter in this file. - - Each ConverterVX has a function "upgrade" which in-place upgrades config from X-1 to X, - and a function "downgrade" which in-place downgrades config from X to X-1 - - In each function, VERSION is left unchanged. - - Each converter assumes that its input has the relevant keys - (i.e., the input is not a partial config). - 3. Run the tests (test_config.py) to make sure the upgrade & downgrade - functions are consistent. -""" - -import logging -from typing import List, Optional, Tuple - -from .config import CfgNode as CN -from .defaults import _C - -__all__ = ["upgrade_config", "downgrade_config"] - - -def upgrade_config(cfg: CN, to_version: Optional[int] = None) -> CN: - """ - Upgrade a config from its current version to a newer version. - - Args: - cfg (CfgNode): - to_version (int): defaults to the latest version. - """ - cfg = cfg.clone() - if to_version is None: - to_version = _C.VERSION - - assert cfg.VERSION <= to_version, "Cannot upgrade from v{} to v{}!".format( - cfg.VERSION, to_version - ) - for k in range(cfg.VERSION, to_version): - converter = globals()["ConverterV" + str(k + 1)] - converter.upgrade(cfg) - cfg.VERSION = k + 1 - return cfg - - -def downgrade_config(cfg: CN, to_version: int) -> CN: - """ - Downgrade a config from its current version to an older version. - - Args: - cfg (CfgNode): - to_version (int): - - Note: - A general downgrade of arbitrary configs is not always possible due to the - different functionalities in different versions. - The purpose of downgrade is only to recover the defaults in old versions, - allowing it to load an old partial yaml config. - Therefore, the implementation only needs to fill in the default values - in the old version when a general downgrade is not possible. - """ - cfg = cfg.clone() - assert cfg.VERSION >= to_version, "Cannot downgrade from v{} to v{}!".format( - cfg.VERSION, to_version - ) - for k in range(cfg.VERSION, to_version, -1): - converter = globals()["ConverterV" + str(k)] - converter.downgrade(cfg) - cfg.VERSION = k - 1 - return cfg - - -def guess_version(cfg: CN, filename: str) -> int: - """ - Guess the version of a partial config where the VERSION field is not specified. - Returns the version, or the latest if cannot make a guess. 
- - This makes it easier for users to migrate. - """ - logger = logging.getLogger(__name__) - - def _has(name: str) -> bool: - cur = cfg - for n in name.split("."): - if n not in cur: - return False - cur = cur[n] - return True - - # Most users' partial configs have "MODEL.WEIGHT", so guess on it - ret = None - if _has("MODEL.WEIGHT") or _has("TEST.AUG_ON"): - ret = 1 - - if ret is not None: - logger.warning("Config '{}' has no VERSION. Assuming it to be v{}.".format(filename, ret)) - else: - ret = _C.VERSION - logger.warning( - "Config '{}' has no VERSION. Assuming it to be compatible with latest v{}.".format( - filename, ret - ) - ) - return ret - - -def _rename(cfg: CN, old: str, new: str) -> None: - old_keys = old.split(".") - new_keys = new.split(".") - - def _set(key_seq: List[str], val: str) -> None: - cur = cfg - for k in key_seq[:-1]: - if k not in cur: - cur[k] = CN() - cur = cur[k] - cur[key_seq[-1]] = val - - def _get(key_seq: List[str]) -> CN: - cur = cfg - for k in key_seq: - cur = cur[k] - return cur - - def _del(key_seq: List[str]) -> None: - cur = cfg - for k in key_seq[:-1]: - cur = cur[k] - del cur[key_seq[-1]] - if len(cur) == 0 and len(key_seq) > 1: - _del(key_seq[:-1]) - - _set(new_keys, _get(old_keys)) - _del(old_keys) - - -class _RenameConverter: - """ - A converter that handles simple rename. - """ - - RENAME: List[Tuple[str, str]] = [] # list of tuples of (old name, new name) - - @classmethod - def upgrade(cls, cfg: CN) -> None: - for old, new in cls.RENAME: - _rename(cfg, old, new) - - @classmethod - def downgrade(cls, cfg: CN) -> None: - for old, new in cls.RENAME[::-1]: - _rename(cfg, new, old) - - -class ConverterV1(_RenameConverter): - RENAME = [("MODEL.RPN_HEAD.NAME", "MODEL.RPN.HEAD_NAME")] - - -class ConverterV2(_RenameConverter): - """ - A large bulk of rename, before public release. 
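To make the converter mechanics concrete, here is a small, self-contained sketch of the dotted-key rename that `_rename` performs, using the ConverterV1 mapping as the example. Plain nested dicts stand in for `CfgNode`, and unlike the real helper this toy version does not prune emptied parent nodes.

```
# Toy illustration of a V1-style rename:
# MODEL.RPN_HEAD.NAME -> MODEL.RPN.HEAD_NAME

def rename(cfg: dict, old: str, new: str) -> None:
    old_keys, new_keys = old.split("."), new.split(".")
    # pop the old leaf value
    node = cfg
    for k in old_keys[:-1]:
        node = node[k]
    value = node.pop(old_keys[-1])
    # create the new path and set the value
    node = cfg
    for k in new_keys[:-1]:
        node = node.setdefault(k, {})
    node[new_keys[-1]] = value

cfg = {"MODEL": {"RPN_HEAD": {"NAME": "StandardRPNHead"}}}
rename(cfg, "MODEL.RPN_HEAD.NAME", "MODEL.RPN.HEAD_NAME")
assert cfg["MODEL"]["RPN"]["HEAD_NAME"] == "StandardRPNHead"
```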
- """ - - RENAME = [ - ("MODEL.WEIGHT", "MODEL.WEIGHTS"), - ("MODEL.PANOPTIC_FPN.SEMANTIC_LOSS_SCALE", "MODEL.SEM_SEG_HEAD.LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.RPN_LOSS_SCALE", "MODEL.RPN.LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.INSTANCE_LOSS_SCALE", "MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT"), - ("MODEL.PANOPTIC_FPN.COMBINE_ON", "MODEL.PANOPTIC_FPN.COMBINE.ENABLED"), - ( - "MODEL.PANOPTIC_FPN.COMBINE_OVERLAP_THRESHOLD", - "MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH", - ), - ( - "MODEL.PANOPTIC_FPN.COMBINE_STUFF_AREA_LIMIT", - "MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT", - ), - ( - "MODEL.PANOPTIC_FPN.COMBINE_INSTANCES_CONFIDENCE_THRESHOLD", - "MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH", - ), - ("MODEL.ROI_HEADS.SCORE_THRESH", "MODEL.ROI_HEADS.SCORE_THRESH_TEST"), - ("MODEL.ROI_HEADS.NMS", "MODEL.ROI_HEADS.NMS_THRESH_TEST"), - ("MODEL.RETINANET.INFERENCE_SCORE_THRESHOLD", "MODEL.RETINANET.SCORE_THRESH_TEST"), - ("MODEL.RETINANET.INFERENCE_TOPK_CANDIDATES", "MODEL.RETINANET.TOPK_CANDIDATES_TEST"), - ("MODEL.RETINANET.INFERENCE_NMS_THRESHOLD", "MODEL.RETINANET.NMS_THRESH_TEST"), - ("TEST.DETECTIONS_PER_IMG", "TEST.DETECTIONS_PER_IMAGE"), - ("TEST.AUG_ON", "TEST.AUG.ENABLED"), - ("TEST.AUG_MIN_SIZES", "TEST.AUG.MIN_SIZES"), - ("TEST.AUG_MAX_SIZE", "TEST.AUG.MAX_SIZE"), - ("TEST.AUG_FLIP", "TEST.AUG.FLIP"), - ] - - @classmethod - def upgrade(cls, cfg: CN) -> None: - super().upgrade(cfg) - - if cfg.MODEL.META_ARCHITECTURE == "RetinaNet": - _rename( - cfg, "MODEL.RETINANET.ANCHOR_ASPECT_RATIOS", "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS" - ) - _rename(cfg, "MODEL.RETINANET.ANCHOR_SIZES", "MODEL.ANCHOR_GENERATOR.SIZES") - del cfg["MODEL"]["RPN"]["ANCHOR_SIZES"] - del cfg["MODEL"]["RPN"]["ANCHOR_ASPECT_RATIOS"] - else: - _rename(cfg, "MODEL.RPN.ANCHOR_ASPECT_RATIOS", "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS") - _rename(cfg, "MODEL.RPN.ANCHOR_SIZES", "MODEL.ANCHOR_GENERATOR.SIZES") - del cfg["MODEL"]["RETINANET"]["ANCHOR_SIZES"] - del cfg["MODEL"]["RETINANET"]["ANCHOR_ASPECT_RATIOS"] - del cfg["MODEL"]["RETINANET"]["ANCHOR_STRIDES"] - - @classmethod - def downgrade(cls, cfg: CN) -> None: - super().downgrade(cfg) - - _rename(cfg, "MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS", "MODEL.RPN.ANCHOR_ASPECT_RATIOS") - _rename(cfg, "MODEL.ANCHOR_GENERATOR.SIZES", "MODEL.RPN.ANCHOR_SIZES") - cfg.MODEL.RETINANET.ANCHOR_ASPECT_RATIOS = cfg.MODEL.RPN.ANCHOR_ASPECT_RATIOS - cfg.MODEL.RETINANET.ANCHOR_SIZES = cfg.MODEL.RPN.ANCHOR_SIZES - cfg.MODEL.RETINANET.ANCHOR_STRIDES = [] # this is not used anywhere in any version diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py deleted file mode 100755 index 49a55b1b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/config.py +++ /dev/null @@ -1,265 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import inspect -import logging -from fvcore.common.config import CfgNode as _CfgNode - -from detectron2.utils.file_io import PathManager - - -class CfgNode(_CfgNode): - """ - The same as `fvcore.common.config.CfgNode`, but different in: - - 1. Use unsafe yaml loading by default. - Note that this may lead to arbitrary code execution: you must not - load a config file from untrusted sources before manually inspecting - the content of the file. - 2. Support config versioning. - When attempting to merge an old config, it will convert the old config automatically. - - .. automethod:: clone - .. 
automethod:: freeze - .. automethod:: defrost - .. automethod:: is_frozen - .. automethod:: load_yaml_with_base - .. automethod:: merge_from_list - .. automethod:: merge_from_other_cfg - """ - - @classmethod - def _open_cfg(cls, filename): - return PathManager.open(filename, "r") - - # Note that the default value of allow_unsafe is changed to True - def merge_from_file(self, cfg_filename: str, allow_unsafe: bool = True) -> None: - """ - Load content from the given config file and merge it into self. - - Args: - cfg_filename: config filename - allow_unsafe: allow unsafe yaml syntax - """ - assert PathManager.isfile(cfg_filename), f"Config file '{cfg_filename}' does not exist!" - loaded_cfg = self.load_yaml_with_base(cfg_filename, allow_unsafe=allow_unsafe) - loaded_cfg = type(self)(loaded_cfg) - - # defaults.py needs to import CfgNode - from .defaults import _C - - latest_ver = _C.VERSION - assert ( - latest_ver == self.VERSION - ), "CfgNode.merge_from_file is only allowed on a config object of latest version!" - - logger = logging.getLogger(__name__) - - loaded_ver = loaded_cfg.get("VERSION", None) - if loaded_ver is None: - from .compat import guess_version - - loaded_ver = guess_version(loaded_cfg, cfg_filename) - assert loaded_ver <= self.VERSION, "Cannot merge a v{} config into a v{} config.".format( - loaded_ver, self.VERSION - ) - - if loaded_ver == self.VERSION: - self.merge_from_other_cfg(loaded_cfg) - else: - # compat.py needs to import CfgNode - from .compat import upgrade_config, downgrade_config - - logger.warning( - "Loading an old v{} config file '{}' by automatically upgrading to v{}. " - "See docs/CHANGELOG.md for instructions to update your files.".format( - loaded_ver, cfg_filename, self.VERSION - ) - ) - # To convert, first obtain a full config at an old version - old_self = downgrade_config(self, to_version=loaded_ver) - old_self.merge_from_other_cfg(loaded_cfg) - new_config = upgrade_config(old_self) - self.clear() - self.update(new_config) - - def dump(self, *args, **kwargs): - """ - Returns: - str: a yaml string representation of the config - """ - # to make it show up in docs - return super().dump(*args, **kwargs) - - -global_cfg = CfgNode() - - -def get_cfg() -> CfgNode: - """ - Get a copy of the default config. - - Returns: - a detectron2 CfgNode instance. - """ - from .defaults import _C - - return _C.clone() - - -def set_global_cfg(cfg: CfgNode) -> None: - """ - Let the global config point to the given cfg. - - Assume that the given "cfg" has the key "KEY", after calling - `set_global_cfg(cfg)`, the key can be accessed by: - :: - from detectron2.config import global_cfg - print(global_cfg.KEY) - - By using a hacky global config, you can access these configs anywhere, - without having to pass the config object or the values deep into the code. - This is a hacky feature introduced for quick prototyping / research exploration. - """ - global global_cfg - global_cfg.clear() - global_cfg.update(cfg) - - -def configurable(init_func=None, *, from_config=None): - """ - Decorate a function or a class's __init__ method so that it can be called - with a :class:`CfgNode` object using a :func:`from_config` function that translates - :class:`CfgNode` to arguments. 
- - Examples: - :: - # Usage 1: Decorator on __init__: - class A: - @configurable - def __init__(self, a, b=2, c=3): - pass - - @classmethod - def from_config(cls, cfg): # 'cfg' must be the first argument - # Returns kwargs to be passed to __init__ - return {"a": cfg.A, "b": cfg.B} - - a1 = A(a=1, b=2) # regular construction - a2 = A(cfg) # construct with a cfg - a3 = A(cfg, b=3, c=4) # construct with extra overwrite - - # Usage 2: Decorator on any function. Needs an extra from_config argument: - @configurable(from_config=lambda cfg: {"a: cfg.A, "b": cfg.B}) - def a_func(a, b=2, c=3): - pass - - a1 = a_func(a=1, b=2) # regular call - a2 = a_func(cfg) # call with a cfg - a3 = a_func(cfg, b=3, c=4) # call with extra overwrite - - Args: - init_func (callable): a class's ``__init__`` method in usage 1. The - class must have a ``from_config`` classmethod which takes `cfg` as - the first argument. - from_config (callable): the from_config function in usage 2. It must take `cfg` - as its first argument. - """ - - if init_func is not None: - assert ( - inspect.isfunction(init_func) - and from_config is None - and init_func.__name__ == "__init__" - ), "Incorrect use of @configurable. Check API documentation for examples." - - @functools.wraps(init_func) - def wrapped(self, *args, **kwargs): - try: - from_config_func = type(self).from_config - except AttributeError as e: - raise AttributeError( - "Class with @configurable must have a 'from_config' classmethod." - ) from e - if not inspect.ismethod(from_config_func): - raise TypeError("Class with @configurable must have a 'from_config' classmethod.") - - if _called_with_cfg(*args, **kwargs): - explicit_args = _get_args_from_config(from_config_func, *args, **kwargs) - init_func(self, **explicit_args) - else: - init_func(self, *args, **kwargs) - - return wrapped - - else: - if from_config is None: - return configurable # @configurable() is made equivalent to @configurable - assert inspect.isfunction( - from_config - ), "from_config argument of configurable must be a function!" - - def wrapper(orig_func): - @functools.wraps(orig_func) - def wrapped(*args, **kwargs): - if _called_with_cfg(*args, **kwargs): - explicit_args = _get_args_from_config(from_config, *args, **kwargs) - return orig_func(**explicit_args) - else: - return orig_func(*args, **kwargs) - - wrapped.from_config = from_config - return wrapped - - return wrapper - - -def _get_args_from_config(from_config_func, *args, **kwargs): - """ - Use `from_config` to obtain explicit arguments. 
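One subtlety the docstring examples above do not show is how extra keyword arguments bypass `from_config`: anything `from_config` does not accept is forwarded straight to `__init__`. A hedged sketch, assuming detectron2 is importable; `Head` is a toy class and `cfg.MODEL.MASK_ON` is just an arbitrary default key used to fill an argument.

```
# Extra kwargs bypass from_config and go directly to __init__.
from detectron2.config import configurable, get_cfg

class Head:
    @configurable
    def __init__(self, a, b=2, c=3):
        self.a, self.b, self.c = a, b, c

    @classmethod
    def from_config(cls, cfg):  # only knows how to fill 'a' and 'b'
        return {"a": cfg.MODEL.MASK_ON, "b": 2}

cfg = get_cfg()
h = Head(cfg, c=7)  # a, b come from from_config; c is forwarded to __init__
assert h.c == 7
```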
- - Returns: - dict: arguments to be used for cls.__init__ - """ - signature = inspect.signature(from_config_func) - if list(signature.parameters.keys())[0] != "cfg": - if inspect.isfunction(from_config_func): - name = from_config_func.__name__ - else: - name = f"{from_config_func.__self__}.from_config" - raise TypeError(f"{name} must take 'cfg' as the first argument!") - support_var_arg = any( - param.kind in [param.VAR_POSITIONAL, param.VAR_KEYWORD] - for param in signature.parameters.values() - ) - if support_var_arg: # forward all arguments to from_config, if from_config accepts them - ret = from_config_func(*args, **kwargs) - else: - # forward supported arguments to from_config - supported_arg_names = set(signature.parameters.keys()) - extra_kwargs = {} - for name in list(kwargs.keys()): - if name not in supported_arg_names: - extra_kwargs[name] = kwargs.pop(name) - ret = from_config_func(*args, **kwargs) - # forward the other arguments to __init__ - ret.update(extra_kwargs) - return ret - - -def _called_with_cfg(*args, **kwargs): - """ - Returns: - bool: whether the arguments contain CfgNode and should be considered - forwarded to from_config. - """ - from omegaconf import DictConfig - - if len(args) and isinstance(args[0], (_CfgNode, DictConfig)): - return True - if isinstance(kwargs.pop("cfg", None), (_CfgNode, DictConfig)): - return True - # `from_config`'s first argument is forced to be "cfg". - # So the above check covers all cases. - return False diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py deleted file mode 100755 index 848486df..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/defaults.py +++ /dev/null @@ -1,635 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .config import CfgNode as CN - -# NOTE: given the new config system -# (https://detectron2.readthedocs.io/en/latest/tutorials/lazyconfigs.html), -# we will stop adding new functionalities to default CfgNode. - -# ----------------------------------------------------------------------------- -# Convention about Training / Test specific parameters -# ----------------------------------------------------------------------------- -# Whenever an argument can be either used for training or for testing, the -# corresponding name will be post-fixed by a _TRAIN for a training parameter, -# or _TEST for a test-specific parameter. -# For example, the number of images during training will be -# IMAGES_PER_BATCH_TRAIN, while the number of images for testing will be -# IMAGES_PER_BATCH_TEST - -# ----------------------------------------------------------------------------- -# Config definition -# ----------------------------------------------------------------------------- - -_C = CN() - -# The version number, to upgrade from old configs to new ones if any -# changes happen. It's recommended to keep a VERSION in your config file. -_C.VERSION = 2 - -_C.MODEL = CN() -_C.MODEL.LOAD_PROPOSALS = False -_C.MODEL.MASK_ON = False -_C.MODEL.KEYPOINT_ON = False -_C.MODEL.DEVICE = "cuda" -_C.MODEL.META_ARCHITECTURE = "GeneralizedRCNN" - -# Path (a file path, or URL like detectron2://.., https://..) to a checkpoint file -# to be loaded to the model. You can find available models in the model zoo. -_C.MODEL.WEIGHTS = "" - -# Values to be used for image normalization (BGR order, since INPUT.FORMAT defaults to BGR). 
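A brief usage sketch of the `_TRAIN`/`_TEST` convention just described, using the real `detectron2.config` API (assuming detectron2 is installed); the size values are arbitrary.

```
# Paired _TRAIN/_TEST keys configure the same behaviour independently
# for training and for inference.
from detectron2.config import get_cfg

cfg = get_cfg()                        # a copy of the defaults defined here
cfg.INPUT.MIN_SIZE_TRAIN = (640, 800)  # sampled per MIN_SIZE_TRAIN_SAMPLING
cfg.INPUT.MIN_SIZE_TEST = 800          # fixed shortest edge at test time
print(cfg.VERSION)                     # 2, the version these defaults declare
```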
-# To train on images of different number of channels, just set different mean & std. -# Default values are the mean pixel value from ImageNet: [103.53, 116.28, 123.675] -_C.MODEL.PIXEL_MEAN = [103.530, 116.280, 123.675] -# When using pre-trained models in Detectron1 or any MSRA models, -# std has been absorbed into its conv1 weights, so the std needs to be set 1. -# Otherwise, you can use [57.375, 57.120, 58.395] (ImageNet std) -_C.MODEL.PIXEL_STD = [1.0, 1.0, 1.0] - - -# ----------------------------------------------------------------------------- -# INPUT -# ----------------------------------------------------------------------------- -_C.INPUT = CN() -# By default, {MIN,MAX}_SIZE options are used in transforms.ResizeShortestEdge. -# Please refer to ResizeShortestEdge for detailed definition. -# Size of the smallest side of the image during training -_C.INPUT.MIN_SIZE_TRAIN = (800,) -# Sample size of smallest side by choice or random selection from range give by -# INPUT.MIN_SIZE_TRAIN -_C.INPUT.MIN_SIZE_TRAIN_SAMPLING = "choice" -# Maximum size of the side of the image during training -_C.INPUT.MAX_SIZE_TRAIN = 1333 -# Size of the smallest side of the image during testing. Set to zero to disable resize in testing. -_C.INPUT.MIN_SIZE_TEST = 800 -# Maximum size of the side of the image during testing -_C.INPUT.MAX_SIZE_TEST = 1333 -# Mode for flipping images used in data augmentation during training -# choose one of ["horizontal, "vertical", "none"] -_C.INPUT.RANDOM_FLIP = "horizontal" - -# `True` if cropping is used for data augmentation during training -_C.INPUT.CROP = CN({"ENABLED": False}) -# Cropping type. See documentation of `detectron2.data.transforms.RandomCrop` for explanation. -_C.INPUT.CROP.TYPE = "relative_range" -# Size of crop in range (0, 1] if CROP.TYPE is "relative" or "relative_range" and in number of -# pixels if CROP.TYPE is "absolute" -_C.INPUT.CROP.SIZE = [0.9, 0.9] - - -# Whether the model needs RGB, YUV, HSV etc. -# Should be one of the modes defined here, as we use PIL to read the image: -# https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes -# with BGR being the one exception. One can set image format to BGR, we will -# internally use RGB for conversion and flip the channels over -_C.INPUT.FORMAT = "BGR" -# The ground truth mask format that the model will use. -# Mask R-CNN supports either "polygon" or "bitmask" as ground truth. -_C.INPUT.MASK_FORMAT = "polygon" # alternative: "bitmask" - - -# ----------------------------------------------------------------------------- -# Dataset -# ----------------------------------------------------------------------------- -_C.DATASETS = CN() -# List of the dataset names for training. Must be registered in DatasetCatalog -# Samples from these datasets will be merged and used as one dataset. -_C.DATASETS.TRAIN = () -# List of the pre-computed proposal files for training, which must be consistent -# with datasets listed in DATASETS.TRAIN. -_C.DATASETS.PROPOSAL_FILES_TRAIN = () -# Number of top scoring precomputed proposals to keep for training -_C.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TRAIN = 2000 -# List of the dataset names for testing. Must be registered in DatasetCatalog -_C.DATASETS.TEST = () -# List of the pre-computed proposal files for test, which must be consistent -# with datasets listed in DATASETS.TEST. 
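Since the interaction of `PIXEL_MEAN`, `PIXEL_STD`, and the BGR default is a recurring source of confusion, here is a small numpy sketch of the per-channel normalization these defaults imply. It mirrors the values above; it is illustrative, not the deleted preprocessing code.

```
# Images arrive in BGR (INPUT.FORMAT default) and are normalized per
# channel as (pixel - mean) / std before entering the backbone.
import numpy as np

PIXEL_MEAN = np.array([103.530, 116.280, 123.675])  # BGR ImageNet means
PIXEL_STD = np.array([1.0, 1.0, 1.0])  # std absorbed into conv1 for MSRA weights

image_bgr = np.random.randint(0, 256, size=(480, 640, 3)).astype(np.float32)
normalized = (image_bgr - PIXEL_MEAN) / PIXEL_STD
print(normalized.mean(axis=(0, 1)))  # roughly zero-centered per channel
```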
-_C.DATASETS.PROPOSAL_FILES_TEST = () -# Number of top scoring precomputed proposals to keep for test -_C.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TEST = 1000 - -# ----------------------------------------------------------------------------- -# DataLoader -# ----------------------------------------------------------------------------- -_C.DATALOADER = CN() -# Number of data loading threads -_C.DATALOADER.NUM_WORKERS = 4 -# If True, each batch should contain only images for which the aspect ratio -# is compatible. This groups portrait images together, and landscape images -# are not batched with portrait images. -_C.DATALOADER.ASPECT_RATIO_GROUPING = True -# Options: TrainingSampler, RepeatFactorTrainingSampler -_C.DATALOADER.SAMPLER_TRAIN = "TrainingSampler" -# Repeat threshold for RepeatFactorTrainingSampler -_C.DATALOADER.REPEAT_THRESHOLD = 0.0 -# Tf True, when working on datasets that have instance annotations, the -# training dataloader will filter out images without associated annotations -_C.DATALOADER.FILTER_EMPTY_ANNOTATIONS = True - -# ---------------------------------------------------------------------------- # -# Backbone options -# ---------------------------------------------------------------------------- # -_C.MODEL.BACKBONE = CN() - -_C.MODEL.BACKBONE.NAME = "build_resnet_backbone" -# Freeze the first several stages so they are not trained. -# There are 5 stages in ResNet. The first is a convolution, and the following -# stages are each group of residual blocks. -_C.MODEL.BACKBONE.FREEZE_AT = 2 - - -# ---------------------------------------------------------------------------- # -# FPN options -# ---------------------------------------------------------------------------- # -_C.MODEL.FPN = CN() -# Names of the input feature maps to be used by FPN -# They must have contiguous power of 2 strides -# e.g., ["res2", "res3", "res4", "res5"] -_C.MODEL.FPN.IN_FEATURES = [] -_C.MODEL.FPN.OUT_CHANNELS = 256 - -# Options: "" (no norm), "GN" -_C.MODEL.FPN.NORM = "" - -# Types for fusing the FPN top-down and lateral features. Can be either "sum" or "avg" -_C.MODEL.FPN.FUSE_TYPE = "sum" - - -# ---------------------------------------------------------------------------- # -# Proposal generator options -# ---------------------------------------------------------------------------- # -_C.MODEL.PROPOSAL_GENERATOR = CN() -# Current proposal generators include "RPN", "RRPN" and "PrecomputedProposals" -_C.MODEL.PROPOSAL_GENERATOR.NAME = "RPN" -# Proposal height and width both need to be greater than MIN_SIZE -# (a the scale used during training or inference) -_C.MODEL.PROPOSAL_GENERATOR.MIN_SIZE = 0 - - -# ---------------------------------------------------------------------------- # -# Anchor generator options -# ---------------------------------------------------------------------------- # -_C.MODEL.ANCHOR_GENERATOR = CN() -# The generator can be any name in the ANCHOR_GENERATOR registry -_C.MODEL.ANCHOR_GENERATOR.NAME = "DefaultAnchorGenerator" -# Anchor sizes (i.e. sqrt of area) in absolute pixels w.r.t. the network input. -# Format: list[list[float]]. SIZES[i] specifies the list of sizes to use for -# IN_FEATURES[i]; len(SIZES) must be equal to len(IN_FEATURES) or 1. -# When len(SIZES) == 1, SIZES[0] is used for all IN_FEATURES. -_C.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64, 128, 256, 512]] -# Anchor aspect ratios. For each area given in `SIZES`, anchors with different aspect -# ratios are generated by an anchor generator. -# Format: list[list[float]]. 
ASPECT_RATIOS[i] specifies the list of aspect ratios (H/W) -# to use for IN_FEATURES[i]; len(ASPECT_RATIOS) == len(IN_FEATURES) must be true, -# or len(ASPECT_RATIOS) == 1 is true and aspect ratio list ASPECT_RATIOS[0] is used -# for all IN_FEATURES. -_C.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.5, 1.0, 2.0]] -# Anchor angles. -# list[list[float]], the angle in degrees, for each input feature map. -# ANGLES[i] specifies the list of angles for IN_FEATURES[i]. -_C.MODEL.ANCHOR_GENERATOR.ANGLES = [[-90, 0, 90]] -# Relative offset between the center of the first anchor and the top-left corner of the image -# Value has to be in [0, 1). Recommend to use 0.5, which means half stride. -# The value is not expected to affect model accuracy. -_C.MODEL.ANCHOR_GENERATOR.OFFSET = 0.0 - -# ---------------------------------------------------------------------------- # -# RPN options -# ---------------------------------------------------------------------------- # -_C.MODEL.RPN = CN() -_C.MODEL.RPN.HEAD_NAME = "StandardRPNHead" # used by RPN_HEAD_REGISTRY - -# Names of the input feature maps to be used by RPN -# e.g., ["p2", "p3", "p4", "p5", "p6"] for FPN -_C.MODEL.RPN.IN_FEATURES = ["res4"] -# Remove RPN anchors that go outside the image by BOUNDARY_THRESH pixels -# Set to -1 or a large value, e.g. 100000, to disable pruning anchors -_C.MODEL.RPN.BOUNDARY_THRESH = -1 -# IOU overlap ratios [BG_IOU_THRESHOLD, FG_IOU_THRESHOLD] -# Minimum overlap required between an anchor and ground-truth box for the -# (anchor, gt box) pair to be a positive example (IoU >= FG_IOU_THRESHOLD -# ==> positive RPN example: 1) -# Maximum overlap allowed between an anchor and ground-truth box for the -# (anchor, gt box) pair to be a negative examples (IoU < BG_IOU_THRESHOLD -# ==> negative RPN example: 0) -# Anchors with overlap in between (BG_IOU_THRESHOLD <= IoU < FG_IOU_THRESHOLD) -# are ignored (-1) -_C.MODEL.RPN.IOU_THRESHOLDS = [0.3, 0.7] -_C.MODEL.RPN.IOU_LABELS = [0, -1, 1] -# Number of regions per image used to train RPN -_C.MODEL.RPN.BATCH_SIZE_PER_IMAGE = 256 -# Target fraction of foreground (positive) examples per RPN minibatch -_C.MODEL.RPN.POSITIVE_FRACTION = 0.5 -# Options are: "smooth_l1", "giou", "diou", "ciou" -_C.MODEL.RPN.BBOX_REG_LOSS_TYPE = "smooth_l1" -_C.MODEL.RPN.BBOX_REG_LOSS_WEIGHT = 1.0 -# Weights on (dx, dy, dw, dh) for normalizing RPN anchor regression targets -_C.MODEL.RPN.BBOX_REG_WEIGHTS = (1.0, 1.0, 1.0, 1.0) -# The transition point from L1 to L2 loss. Set to 0.0 to make the loss simply L1. -_C.MODEL.RPN.SMOOTH_L1_BETA = 0.0 -_C.MODEL.RPN.LOSS_WEIGHT = 1.0 -# Number of top scoring RPN proposals to keep before applying NMS -# When FPN is used, this is *per FPN level* (not total) -_C.MODEL.RPN.PRE_NMS_TOPK_TRAIN = 12000 -_C.MODEL.RPN.PRE_NMS_TOPK_TEST = 6000 -# Number of top scoring RPN proposals to keep after applying NMS -# When FPN is used, this limit is applied per level and then again to the union -# of proposals from all levels -# NOTE: When FPN is used, the meaning of this config is different from Detectron1. -# It means per-batch topk in Detectron1, but per-image topk here. -# See the "find_top_rpn_proposals" function for details. -_C.MODEL.RPN.POST_NMS_TOPK_TRAIN = 2000 -_C.MODEL.RPN.POST_NMS_TOPK_TEST = 1000 -# NMS threshold used on RPN proposals -_C.MODEL.RPN.NMS_THRESH = 0.7 -# Set this to -1 to use the same number of output channels as input channels. 
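To unpack how `SIZES` and `ASPECT_RATIOS` combine, a worked sketch of the anchor geometry follows. It assumes the usual detectron2 convention that `size` is the square root of the anchor area and `ratio` is H/W; treat it as illustrative rather than the generator's exact code.

```
# One anchor shape per (size, aspect_ratio) pair at each location:
# area = size**2 and h / w = ratio, so w = sqrt(area / ratio), h = ratio * w.
import math

def anchor_wh(size: float, ratio: float):
    area = size * size
    w = math.sqrt(area / ratio)
    h = ratio * w
    return w, h

for size in [32, 64, 128, 256, 512]:
    for ratio in [0.5, 1.0, 2.0]:
        w, h = anchor_wh(size, ratio)
        # 5 sizes x 3 ratios = 15 anchor shapes per spatial location
        print(f"size={size:3d} ratio={ratio}: w={w:6.1f} h={h:6.1f}")
```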
-_C.MODEL.RPN.CONV_DIMS = [-1] - -# ---------------------------------------------------------------------------- # -# ROI HEADS options -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_HEADS = CN() -_C.MODEL.ROI_HEADS.NAME = "Res5ROIHeads" -# Number of foreground classes -_C.MODEL.ROI_HEADS.NUM_CLASSES = 80 -# Names of the input feature maps to be used by ROI heads -# Currently all heads (box, mask, ...) use the same input feature map list -# e.g., ["p2", "p3", "p4", "p5"] is commonly used for FPN -_C.MODEL.ROI_HEADS.IN_FEATURES = ["res4"] -# IOU overlap ratios [IOU_THRESHOLD] -# Overlap threshold for an RoI to be considered background (if < IOU_THRESHOLD) -# Overlap threshold for an RoI to be considered foreground (if >= IOU_THRESHOLD) -_C.MODEL.ROI_HEADS.IOU_THRESHOLDS = [0.5] -_C.MODEL.ROI_HEADS.IOU_LABELS = [0, 1] -# RoI minibatch size *per image* (number of regions of interest [ROIs]) during training -# Total number of RoIs per training minibatch = -# ROI_HEADS.BATCH_SIZE_PER_IMAGE * SOLVER.IMS_PER_BATCH -# E.g., a common configuration is: 512 * 16 = 8192 -_C.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 512 -# Target fraction of RoI minibatch that is labeled foreground (i.e. class > 0) -_C.MODEL.ROI_HEADS.POSITIVE_FRACTION = 0.25 - -# Only used on test mode - -# Minimum score threshold (assuming scores in a [0, 1] range); a value chosen to -# balance obtaining high recall with not having too many low precision -# detections that will slow down inference post processing steps (like NMS) -# A default threshold of 0.0 increases AP by ~0.2-0.3 but significantly slows down -# inference. -_C.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.05 -# Overlap threshold used for non-maximum suppression (suppress boxes with -# IoU >= this threshold) -_C.MODEL.ROI_HEADS.NMS_THRESH_TEST = 0.5 -# If True, augment proposals with ground-truth boxes before sampling proposals to -# train ROI heads. -_C.MODEL.ROI_HEADS.PROPOSAL_APPEND_GT = True - -# ---------------------------------------------------------------------------- # -# Box Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_BOX_HEAD = CN() -# C4 don't use head name option -# Options for non-C4 models: FastRCNNConvFCHead, -_C.MODEL.ROI_BOX_HEAD.NAME = "" -# Options are: "smooth_l1", "giou", "diou", "ciou" -_C.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_TYPE = "smooth_l1" -# The final scaling coefficient on the box regression loss, used to balance the magnitude of its -# gradients with other losses in the model. See also `MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT`. -_C.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_WEIGHT = 1.0 -# Default weights on (dx, dy, dw, dh) for normalizing bbox regression targets -# These are empirically chosen to approximately lead to unit variance targets -_C.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10.0, 10.0, 5.0, 5.0) -# The transition point from L1 to L2 loss. Set to 0.0 to make the loss simply L1. -_C.MODEL.ROI_BOX_HEAD.SMOOTH_L1_BETA = 0.0 -_C.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION = 14 -_C.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO = 0 -# Type of pooling operation applied to the incoming feature map for each RoI -_C.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2" - -_C.MODEL.ROI_BOX_HEAD.NUM_FC = 0 -# Hidden layer dimension for FC layers in the RoI box head -_C.MODEL.ROI_BOX_HEAD.FC_DIM = 1024 -_C.MODEL.ROI_BOX_HEAD.NUM_CONV = 0 -# Channel dimension for Conv layers in the RoI box head -_C.MODEL.ROI_BOX_HEAD.CONV_DIM = 256 -# Normalization method for the convolution layers. 
-# Options: "" (no norm), "GN", "SyncBN". -_C.MODEL.ROI_BOX_HEAD.NORM = "" -# Whether to use class agnostic for bbox regression -_C.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG = False -# If true, RoI heads use bounding boxes predicted by the box head rather than proposal boxes. -_C.MODEL.ROI_BOX_HEAD.TRAIN_ON_PRED_BOXES = False - -# ---------------------------------------------------------------------------- # -# Cascaded Box Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_BOX_CASCADE_HEAD = CN() -# The number of cascade stages is implicitly defined by the length of the following two configs. -_C.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS = ( - (10.0, 10.0, 5.0, 5.0), - (20.0, 20.0, 10.0, 10.0), - (30.0, 30.0, 15.0, 15.0), -) -_C.MODEL.ROI_BOX_CASCADE_HEAD.IOUS = (0.5, 0.6, 0.7) - - -# ---------------------------------------------------------------------------- # -# Mask Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_MASK_HEAD = CN() -_C.MODEL.ROI_MASK_HEAD.NAME = "MaskRCNNConvUpsampleHead" -_C.MODEL.ROI_MASK_HEAD.POOLER_RESOLUTION = 14 -_C.MODEL.ROI_MASK_HEAD.POOLER_SAMPLING_RATIO = 0 -_C.MODEL.ROI_MASK_HEAD.NUM_CONV = 0 # The number of convs in the mask head -_C.MODEL.ROI_MASK_HEAD.CONV_DIM = 256 -# Normalization method for the convolution layers. -# Options: "" (no norm), "GN", "SyncBN". -_C.MODEL.ROI_MASK_HEAD.NORM = "" -# Whether to use class agnostic for mask prediction -_C.MODEL.ROI_MASK_HEAD.CLS_AGNOSTIC_MASK = False -# Type of pooling operation applied to the incoming feature map for each RoI -_C.MODEL.ROI_MASK_HEAD.POOLER_TYPE = "ROIAlignV2" - - -# ---------------------------------------------------------------------------- # -# Keypoint Head -# ---------------------------------------------------------------------------- # -_C.MODEL.ROI_KEYPOINT_HEAD = CN() -_C.MODEL.ROI_KEYPOINT_HEAD.NAME = "KRCNNConvDeconvUpsampleHead" -_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_RESOLUTION = 14 -_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_SAMPLING_RATIO = 0 -_C.MODEL.ROI_KEYPOINT_HEAD.CONV_DIMS = tuple(512 for _ in range(8)) -_C.MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS = 17 # 17 is the number of keypoints in COCO. - -# Images with too few (or no) keypoints are excluded from training. -_C.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE = 1 -# Normalize by the total number of visible keypoints in the minibatch if True. -# Otherwise, normalize by the total number of keypoints that could ever exist -# in the minibatch. -# The keypoint softmax loss is only calculated on visible keypoints. -# Since the number of visible keypoints can vary significantly between -# minibatches, this has the effect of up-weighting the importance of -# minibatches with few visible keypoints. (Imagine the extreme case of -# only one visible keypoint versus N: in the case of N, each one -# contributes 1/N to the gradient compared to the single keypoint -# determining the gradient direction). Instead, we can normalize the -# loss by the total number of keypoints, if it were the case that all -# keypoints were visible in a full minibatch. (Returning to the example, -# this means that the one visible keypoint contributes as much as each -# of the N keypoints.) 
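The comment above weighs normalizing the keypoint loss by visible keypoints against normalizing by all possible keypoints; a toy calculation (illustration only, numbers hypothetical) makes the up-weighting concrete:

```python
# Illustration only (not detectron2 code): per-keypoint gradient weight
# under the two normalization schemes, for a minibatch of 2 images with
# 17 possible keypoints each but only 1 visible keypoint in total.
num_visible = 1
num_possible = 2 * 17

weight_by_visible = 1.0 / num_visible    # 1.0   -> the lone keypoint drives the step
weight_by_possible = 1.0 / num_possible  # ~0.03 -> same weight as in a full batch
print(weight_by_visible, weight_by_possible)
```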
-_C.MODEL.ROI_KEYPOINT_HEAD.NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS = True
-# Multi-task loss weight to use for keypoints
-# Recommended values:
-#   - use 1.0 if NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS is True
-#   - use 4.0 if NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS is False
-_C.MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT = 1.0
-# Type of pooling operation applied to the incoming feature map for each RoI
-_C.MODEL.ROI_KEYPOINT_HEAD.POOLER_TYPE = "ROIAlignV2"
-
-# ---------------------------------------------------------------------------- #
-# Semantic Segmentation Head
-# ---------------------------------------------------------------------------- #
-_C.MODEL.SEM_SEG_HEAD = CN()
-_C.MODEL.SEM_SEG_HEAD.NAME = "SemSegFPNHead"
-_C.MODEL.SEM_SEG_HEAD.IN_FEATURES = ["p2", "p3", "p4", "p5"]
-# Label in the semantic segmentation ground truth that is ignored, i.e., no loss is calculated for
-# the corresponding pixel.
-_C.MODEL.SEM_SEG_HEAD.IGNORE_VALUE = 255
-# Number of classes in the semantic segmentation head
-_C.MODEL.SEM_SEG_HEAD.NUM_CLASSES = 54
-# Number of channels in the 3x3 convs inside semantic-FPN heads.
-_C.MODEL.SEM_SEG_HEAD.CONVS_DIM = 128
-# Outputs from semantic-FPN heads are up-scaled to the COMMON_STRIDE stride.
-_C.MODEL.SEM_SEG_HEAD.COMMON_STRIDE = 4
-# Normalization method for the convolution layers. Options: "" (no norm), "GN".
-_C.MODEL.SEM_SEG_HEAD.NORM = "GN"
-_C.MODEL.SEM_SEG_HEAD.LOSS_WEIGHT = 1.0
-
-_C.MODEL.PANOPTIC_FPN = CN()
-# Scaling of all losses from instance detection / segmentation head.
-_C.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT = 1.0
-
-# options when combining instance & semantic segmentation outputs
-_C.MODEL.PANOPTIC_FPN.COMBINE = CN({"ENABLED": True})  # "COMBINE.ENABLED" is deprecated & not used
-_C.MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH = 0.5
-_C.MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT = 4096
-_C.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = 0.5
-
-
-# ---------------------------------------------------------------------------- #
-# RetinaNet Head
-# ---------------------------------------------------------------------------- #
-_C.MODEL.RETINANET = CN()
-
-# This is the number of foreground classes.
-_C.MODEL.RETINANET.NUM_CLASSES = 80
-
-_C.MODEL.RETINANET.IN_FEATURES = ["p3", "p4", "p5", "p6", "p7"]
-
-# Convolutions to use in the cls and bbox tower
-# NOTE: this doesn't include the last conv for logits
-_C.MODEL.RETINANET.NUM_CONVS = 4
-
-# IoU overlap ratio [bg, fg] for labeling anchors.
-# Anchors with < bg are labeled negative (0)
-# Anchors with >= bg and < fg are ignored (-1)
-# Anchors with >= fg are labeled positive (1)
-_C.MODEL.RETINANET.IOU_THRESHOLDS = [0.4, 0.5]
-_C.MODEL.RETINANET.IOU_LABELS = [0, -1, 1]
-
-# Prior prob for rare case (i.e. foreground) at the beginning of training.
-# This is used to set the bias for the logits layer of the classifier subnet.
-# This improves training stability in the case of heavy class imbalance.
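For context, that bias is conventionally derived from the prior probability as in the focal-loss initialization (a minimal sketch, not the deleted code itself):

```python
# Sketch of the standard focal-loss bias initialization implied by
# PRIOR_PROB: choose b so that sigmoid(b) == prior_prob at step 0.
import math

prior_prob = 0.01
bias_init = -math.log((1 - prior_prob) / prior_prob)
print(bias_init)  # ~-4.595, so every anchor starts with ~1% foreground score
```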
-_C.MODEL.RETINANET.PRIOR_PROB = 0.01
-
-# Inference cls score threshold, only anchors with score > INFERENCE_TH are
-# considered for inference (to improve speed)
-_C.MODEL.RETINANET.SCORE_THRESH_TEST = 0.05
-# Select topk candidates before NMS
-_C.MODEL.RETINANET.TOPK_CANDIDATES_TEST = 1000
-_C.MODEL.RETINANET.NMS_THRESH_TEST = 0.5
-
-# Weights on (dx, dy, dw, dh) for normalizing Retinanet anchor regression targets
-_C.MODEL.RETINANET.BBOX_REG_WEIGHTS = (1.0, 1.0, 1.0, 1.0)
-
-# Loss parameters
-_C.MODEL.RETINANET.FOCAL_LOSS_GAMMA = 2.0
-_C.MODEL.RETINANET.FOCAL_LOSS_ALPHA = 0.25
-_C.MODEL.RETINANET.SMOOTH_L1_LOSS_BETA = 0.1
-# Options are: "smooth_l1", "giou", "diou", "ciou"
-_C.MODEL.RETINANET.BBOX_REG_LOSS_TYPE = "smooth_l1"
-
-# One of BN, SyncBN, FrozenBN, GN
-# Only supports GN until unshared norm is implemented
-_C.MODEL.RETINANET.NORM = ""
-
-
-# ---------------------------------------------------------------------------- #
-# ResNe[X]t options (ResNets = {ResNet, ResNeXt})
-# Note that parts of a resnet may be used for both the backbone and the head
-# These options apply to both
-# ---------------------------------------------------------------------------- #
-_C.MODEL.RESNETS = CN()
-
-_C.MODEL.RESNETS.DEPTH = 50
-_C.MODEL.RESNETS.OUT_FEATURES = ["res4"]  # res4 for C4 backbone, res2..5 for FPN backbone
-
-# Number of groups to use; 1 ==> ResNet; > 1 ==> ResNeXt
-_C.MODEL.RESNETS.NUM_GROUPS = 1
-
-# Options: FrozenBN, GN, "SyncBN", "BN"
-_C.MODEL.RESNETS.NORM = "FrozenBN"
-
-# Baseline width of each group.
-# Scaling this parameter will scale the width of all bottleneck layers.
-_C.MODEL.RESNETS.WIDTH_PER_GROUP = 64
-
-# Place the stride 2 conv on the 1x1 filter
-# Use True only for the original MSRA ResNet; use False for C2 and Torch models
-_C.MODEL.RESNETS.STRIDE_IN_1X1 = True
-
-# Apply dilation in stage "res5"
-_C.MODEL.RESNETS.RES5_DILATION = 1
-
-# Output width of res2. Scaling this parameter will scale the width of all 1x1 convs in ResNet
-# For R18 and R34, this needs to be set to 64
-_C.MODEL.RESNETS.RES2_OUT_CHANNELS = 256
-_C.MODEL.RESNETS.STEM_OUT_CHANNELS = 64
-
-# Apply Deformable Convolution in stages
-# Specify whether to apply deform_conv on Res2, Res3, Res4, Res5
-_C.MODEL.RESNETS.DEFORM_ON_PER_STAGE = [False, False, False, False]
-# Use True to use modulated deform_conv (DeformableV2, https://arxiv.org/abs/1811.11168);
-# Use False for DeformableV1.
-_C.MODEL.RESNETS.DEFORM_MODULATED = False
-# Number of groups in deformable conv.
-_C.MODEL.RESNETS.DEFORM_NUM_GROUPS = 1
-
-
-# ---------------------------------------------------------------------------- #
-# Solver
-# ---------------------------------------------------------------------------- #
-_C.SOLVER = CN()
-
-# Options: WarmupMultiStepLR, WarmupCosineLR.
-# See detectron2/solver/build.py for definition.
-_C.SOLVER.LR_SCHEDULER_NAME = "WarmupMultiStepLR"
-
-_C.SOLVER.MAX_ITER = 40000
-
-_C.SOLVER.BASE_LR = 0.001
-
-_C.SOLVER.MOMENTUM = 0.9
-
-_C.SOLVER.NESTEROV = False
-
-_C.SOLVER.WEIGHT_DECAY = 0.0001
-# The weight decay that's applied to parameters of normalization layers
-# (typically the affine transformation)
-_C.SOLVER.WEIGHT_DECAY_NORM = 0.0
-
-_C.SOLVER.GAMMA = 0.1
-# The iteration number to decrease learning rate by GAMMA.
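Together with the warmup options just below, BASE_LR, GAMMA, and STEPS define a schedule that behaves roughly like this sketch (a simplification for orientation, not the actual WarmupMultiStepLR implementation):

```python
# Simplified model of detectron2's default schedule: linear warmup from
# BASE_LR * WARMUP_FACTOR up to BASE_LR, then multiply by GAMMA at each
# entry of STEPS. Defaults below mirror the config values in this hunk.
def lr_at(it, base_lr=0.001, gamma=0.1, steps=(30000,),
          warmup_iters=1000, warmup_factor=1.0 / 1000):
    if it < warmup_iters:
        alpha = it / warmup_iters
        return base_lr * (warmup_factor * (1 - alpha) + alpha)
    return base_lr * gamma ** sum(it >= s for s in steps)

print(lr_at(500), lr_at(20000), lr_at(35000))  # warmup, plateau, decayed
```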
-_C.SOLVER.STEPS = (30000,) - -_C.SOLVER.WARMUP_FACTOR = 1.0 / 1000 -_C.SOLVER.WARMUP_ITERS = 1000 -_C.SOLVER.WARMUP_METHOD = "linear" - -# Save a checkpoint after every this number of iterations -_C.SOLVER.CHECKPOINT_PERIOD = 5000 - -# Number of images per batch across all machines. This is also the number -# of training images per step (i.e. per iteration). If we use 16 GPUs -# and IMS_PER_BATCH = 32, each GPU will see 2 images per batch. -# May be adjusted automatically if REFERENCE_WORLD_SIZE is set. -_C.SOLVER.IMS_PER_BATCH = 16 - -# The reference number of workers (GPUs) this config is meant to train with. -# It takes no effect when set to 0. -# With a non-zero value, it will be used by DefaultTrainer to compute a desired -# per-worker batch size, and then scale the other related configs (total batch size, -# learning rate, etc) to match the per-worker batch size. -# See documentation of `DefaultTrainer.auto_scale_workers` for details: -_C.SOLVER.REFERENCE_WORLD_SIZE = 0 - -# Detectron v1 (and previous detection code) used a 2x higher LR and 0 WD for -# biases. This is not useful (at least for recent models). You should avoid -# changing these and they exist only to reproduce Detectron v1 training if -# desired. -_C.SOLVER.BIAS_LR_FACTOR = 1.0 -_C.SOLVER.WEIGHT_DECAY_BIAS = None # None means following WEIGHT_DECAY - -# Gradient clipping -_C.SOLVER.CLIP_GRADIENTS = CN({"ENABLED": False}) -# Type of gradient clipping, currently 2 values are supported: -# - "value": the absolute values of elements of each gradients are clipped -# - "norm": the norm of the gradient for each parameter is clipped thus -# affecting all elements in the parameter -_C.SOLVER.CLIP_GRADIENTS.CLIP_TYPE = "value" -# Maximum absolute value used for clipping gradients -_C.SOLVER.CLIP_GRADIENTS.CLIP_VALUE = 1.0 -# Floating point number p for L-p norm to be used with the "norm" -# gradient clipping type; for L-inf, please specify .inf -_C.SOLVER.CLIP_GRADIENTS.NORM_TYPE = 2.0 - -# Enable automatic mixed precision for training -# Note that this does not change model's inference behavior. -# To use AMP in inference, run inference under autocast() -_C.SOLVER.AMP = CN({"ENABLED": False}) - -# ---------------------------------------------------------------------------- # -# Specific test options -# ---------------------------------------------------------------------------- # -_C.TEST = CN() -# For end-to-end tests to verify the expected accuracy. -# Each item is [task, metric, value, tolerance] -# e.g.: [['bbox', 'AP', 38.5, 0.2]] -_C.TEST.EXPECTED_RESULTS = [] -# The period (in terms of steps) to evaluate the model during training. -# Set to 0 to disable. -_C.TEST.EVAL_PERIOD = 0 -# The sigmas used to calculate keypoint OKS. See http://cocodataset.org/#keypoints-eval -# When empty, it will use the defaults in COCO. -# Otherwise it should be a list[float] with the same length as ROI_KEYPOINT_HEAD.NUM_KEYPOINTS. -_C.TEST.KEYPOINT_OKS_SIGMAS = [] -# Maximum number of detections to return per image during inference (100 is -# based on the limit established for the COCO dataset). 
-_C.TEST.DETECTIONS_PER_IMAGE = 100 - -_C.TEST.AUG = CN({"ENABLED": False}) -_C.TEST.AUG.MIN_SIZES = (400, 500, 600, 700, 800, 900, 1000, 1100, 1200) -_C.TEST.AUG.MAX_SIZE = 4000 -_C.TEST.AUG.FLIP = True - -_C.TEST.PRECISE_BN = CN({"ENABLED": False}) -_C.TEST.PRECISE_BN.NUM_ITER = 200 - -# ---------------------------------------------------------------------------- # -# Misc options -# ---------------------------------------------------------------------------- # -# Directory where output files are written -_C.OUTPUT_DIR = "./output" -# Set seed to negative to fully randomize everything. -# Set seed to positive to use a fixed seed. Note that a fixed seed increases -# reproducibility but does not guarantee fully deterministic behavior. -# Disabling all parallelism further increases reproducibility. -_C.SEED = -1 -# Benchmark different cudnn algorithms. -# If input images have very different sizes, this option will have large overhead -# for about 10k iterations. It usually hurts total time, but can benefit for certain models. -# If input images have the same or similar sizes, benchmark is often helpful. -_C.CUDNN_BENCHMARK = False -# The period (in terms of steps) for minibatch visualization at train time. -# Set to 0 to disable. -_C.VIS_PERIOD = 0 - -# global config is for quick hack purposes. -# You can set them in command line or config files, -# and access it with: -# -# from detectron2.config import global_cfg -# print(global_cfg.HACK) -# -# Do not commit any configs into it. -_C.GLOBAL = CN() -_C.GLOBAL.HACK = 1.0 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py deleted file mode 100755 index cbb32e19..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/instantiate.py +++ /dev/null @@ -1,82 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import dataclasses -import logging -from collections import abc -from typing import Any - -from detectron2.utils.registry import _convert_target_to_string, locate - -__all__ = ["dump_dataclass", "instantiate"] - - -def dump_dataclass(obj: Any): - """ - Dump a dataclass recursively into a dict that can be later instantiated. - - Args: - obj: a dataclass object - - Returns: - dict - """ - assert dataclasses.is_dataclass(obj) and not isinstance( - obj, type - ), "dump_dataclass() requires an instance of a dataclass." - ret = {"_target_": _convert_target_to_string(type(obj))} - for f in dataclasses.fields(obj): - v = getattr(obj, f.name) - if dataclasses.is_dataclass(v): - v = dump_dataclass(v) - if isinstance(v, (list, tuple)): - v = [dump_dataclass(x) if dataclasses.is_dataclass(x) else x for x in v] - ret[f.name] = v - return ret - - -def instantiate(cfg): - """ - Recursively instantiate objects defined in dictionaries by - "_target_" and arguments. 
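The instantiate()/LazyCall pair deleted in this hunk composes as in the sketch below (mirroring the LazyCall docstring further down; assumes detectron2 and torch are installed):

```python
# Usage sketch: build a config describing a call, edit it, then execute it.
import torch.nn as nn
from detectron2.config import LazyCall, instantiate

layer_cfg = LazyCall(nn.Conv2d)(in_channels=32, out_channels=32, kernel_size=3)
layer_cfg.out_channels = 64     # still plain config; editable before use
layer = instantiate(layer_cfg)  # only now is nn.Conv2d actually called
print(type(layer))              # <class 'torch.nn.modules.conv.Conv2d'>
```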
- - Args: - cfg: a dict-like object with "_target_" that defines the caller, and - other keys that define the arguments - - Returns: - object instantiated by cfg - """ - from omegaconf import ListConfig - - if isinstance(cfg, ListConfig): - lst = [instantiate(x) for x in cfg] - return ListConfig(lst, flags={"allow_objects": True}) - if isinstance(cfg, list): - # Specialize for list, because many classes take - # list[objects] as arguments, such as ResNet, DatasetMapper - return [instantiate(x) for x in cfg] - - if isinstance(cfg, abc.Mapping) and "_target_" in cfg: - # conceptually equivalent to hydra.utils.instantiate(cfg) with _convert_=all, - # but faster: https://github.com/facebookresearch/hydra/issues/1200 - cfg = {k: instantiate(v) for k, v in cfg.items()} - cls = cfg.pop("_target_") - cls = instantiate(cls) - - if isinstance(cls, str): - cls_name = cls - cls = locate(cls_name) - assert cls is not None, cls_name - else: - try: - cls_name = cls.__module__ + "." + cls.__qualname__ - except Exception: - # target could be anything, so the above could fail - cls_name = str(cls) - assert callable(cls), f"_target_ {cls} does not define a callable object" - try: - return cls(**cfg) - except TypeError: - logger = logging.getLogger(__name__) - logger.error(f"Error when instantiating {cls_name}!") - raise - return cfg # return as-is if don't know what to do diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py deleted file mode 100755 index fa5d86b4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/config/lazy.py +++ /dev/null @@ -1,399 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import ast -import builtins -import importlib -import inspect -import logging -import os -import uuid -from collections import abc -from contextlib import contextmanager -from copy import deepcopy -from dataclasses import is_dataclass -from typing import List, Tuple, Union -import cloudpickle -import yaml -from omegaconf import DictConfig, ListConfig, OmegaConf - -from detectron2.utils.file_io import PathManager -from detectron2.utils.registry import _convert_target_to_string - -__all__ = ["LazyCall", "LazyConfig"] - - -class LazyCall: - """ - Wrap a callable so that when it's called, the call will not be executed, - but returns a dict that describes the call. - - LazyCall object has to be called with only keyword arguments. Positional - arguments are not yet supported. - - Examples: - :: - from detectron2.config import instantiate, LazyCall - - layer_cfg = LazyCall(nn.Conv2d)(in_channels=32, out_channels=32) - layer_cfg.out_channels = 64 # can edit it afterwards - layer = instantiate(layer_cfg) - """ - - def __init__(self, target): - if not (callable(target) or isinstance(target, (str, abc.Mapping))): - raise TypeError( - f"target of LazyCall must be a callable or defines a callable! Got {target}" - ) - self._target = target - - def __call__(self, **kwargs): - if is_dataclass(self._target): - # omegaconf object cannot hold dataclass type - # https://github.com/omry/omegaconf/issues/784 - target = _convert_target_to_string(self._target) - else: - target = self._target - kwargs["_target_"] = target - - return DictConfig(content=kwargs, flags={"allow_objects": True}) - - -def _visit_dict_config(cfg, func): - """ - Apply func recursively to all DictConfig in cfg. 
- """ - if isinstance(cfg, DictConfig): - func(cfg) - for v in cfg.values(): - _visit_dict_config(v, func) - elif isinstance(cfg, ListConfig): - for v in cfg: - _visit_dict_config(v, func) - - -def _validate_py_syntax(filename): - # see also https://github.com/open-mmlab/mmcv/blob/master/mmcv/utils/config.py - with PathManager.open(filename, "r") as f: - content = f.read() - try: - ast.parse(content) - except SyntaxError as e: - raise SyntaxError(f"Config file {filename} has syntax error!") from e - - -def _cast_to_config(obj): - # if given a dict, return DictConfig instead - if isinstance(obj, dict): - return DictConfig(obj, flags={"allow_objects": True}) - return obj - - -_CFG_PACKAGE_NAME = "detectron2._cfg_loader" -""" -A namespace to put all imported config into. -""" - - -def _random_package_name(filename): - # generate a random package name when loading config files - return _CFG_PACKAGE_NAME + str(uuid.uuid4())[:4] + "." + os.path.basename(filename) - - -@contextmanager -def _patch_import(): - """ - Enhance relative import statements in config files, so that they: - 1. locate files purely based on relative location, regardless of packages. - e.g. you can import file without having __init__ - 2. do not cache modules globally; modifications of module states has no side effect - 3. support other storage system through PathManager - 4. imported dict are turned into omegaconf.DictConfig automatically - """ - old_import = builtins.__import__ - - def find_relative_file(original_file, relative_import_path, level): - cur_file = os.path.dirname(original_file) - for _ in range(level - 1): - cur_file = os.path.dirname(cur_file) - cur_name = relative_import_path.lstrip(".") - for part in cur_name.split("."): - cur_file = os.path.join(cur_file, part) - # NOTE: directory import is not handled. Because then it's unclear - # if such import should produce python module or DictConfig. This can - # be discussed further if needed. - if not cur_file.endswith(".py"): - cur_file += ".py" - if not PathManager.isfile(cur_file): - raise ImportError( - f"Cannot import name {relative_import_path} from " - f"{original_file}: {cur_file} has to exist." - ) - return cur_file - - def new_import(name, globals=None, locals=None, fromlist=(), level=0): - if ( - # Only deal with relative imports inside config files - level != 0 - and globals is not None - and (globals.get("__package__", "") or "").startswith(_CFG_PACKAGE_NAME) - ): - cur_file = find_relative_file(globals["__file__"], name, level) - _validate_py_syntax(cur_file) - spec = importlib.machinery.ModuleSpec( - _random_package_name(cur_file), None, origin=cur_file - ) - module = importlib.util.module_from_spec(spec) - module.__file__ = cur_file - with PathManager.open(cur_file) as f: - content = f.read() - exec(compile(content, cur_file, "exec"), module.__dict__) - for name in fromlist: # turn imported dict into DictConfig automatically - val = _cast_to_config(module.__dict__[name]) - module.__dict__[name] = val - return module - return old_import(name, globals, locals, fromlist=fromlist, level=level) - - builtins.__import__ = new_import - yield new_import - builtins.__import__ = old_import - - -class LazyConfig: - """ - Provide methods to save, load, and overrides an omegaconf config object - which may contain definition of lazily-constructed objects. - """ - - @staticmethod - def load_rel(filename: str, keys: Union[None, str, Tuple[str, ...]] = None): - """ - Similar to :meth:`load()`, but load path relative to the caller's - source file. 
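The typical load, override, instantiate flow for the LazyConfig API removed here looks like the following sketch (the config path and override key are hypothetical):

```python
# Hedged sketch of LazyConfig usage; "configs/my_experiment.py" and the
# "model.num_classes" key are hypothetical placeholders.
from detectron2.config import LazyConfig, instantiate

cfg = LazyConfig.load("configs/my_experiment.py")
cfg = LazyConfig.apply_overrides(cfg, ["model.num_classes=10"])
model = instantiate(cfg.model)  # assumes the config file defines `model`
```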
- - This has the same functionality as a relative import, except that this method - accepts filename as a string, so more characters are allowed in the filename. - """ - caller_frame = inspect.stack()[1] - caller_fname = caller_frame[0].f_code.co_filename - assert caller_fname != "", "load_rel Unable to find caller" - caller_dir = os.path.dirname(caller_fname) - filename = os.path.join(caller_dir, filename) - return LazyConfig.load(filename, keys) - - @staticmethod - def load(filename: str, keys: Union[None, str, Tuple[str, ...]] = None): - """ - Load a config file. - - Args: - filename: absolute path or relative path w.r.t. the current working directory - keys: keys to load and return. If not given, return all keys - (whose values are config objects) in a dict. - """ - has_keys = keys is not None - filename = filename.replace("/./", "/") # redundant - if os.path.splitext(filename)[1] not in [".py", ".yaml", ".yml"]: - raise ValueError(f"Config file {filename} has to be a python or yaml file.") - if filename.endswith(".py"): - _validate_py_syntax(filename) - - with _patch_import(): - # Record the filename - module_namespace = { - "__file__": filename, - "__package__": _random_package_name(filename), - } - with PathManager.open(filename) as f: - content = f.read() - # Compile first with filename to: - # 1. make filename appears in stacktrace - # 2. make load_rel able to find its parent's (possibly remote) location - exec(compile(content, filename, "exec"), module_namespace) - - ret = module_namespace - else: - with PathManager.open(filename) as f: - obj = yaml.unsafe_load(f) - ret = OmegaConf.create(obj, flags={"allow_objects": True}) - - if has_keys: - if isinstance(keys, str): - return _cast_to_config(ret[keys]) - else: - return tuple(_cast_to_config(ret[a]) for a in keys) - else: - if filename.endswith(".py"): - # when not specified, only load those that are config objects - ret = DictConfig( - { - name: _cast_to_config(value) - for name, value in ret.items() - if isinstance(value, (DictConfig, ListConfig, dict)) - and not name.startswith("_") - }, - flags={"allow_objects": True}, - ) - return ret - - @staticmethod - def save(cfg, filename: str): - """ - Save a config object to a yaml file. - Note that when the config dictionary contains complex objects (e.g. lambda), - it can't be saved to yaml. In that case we will print an error and - attempt to save to a pkl file instead. - - Args: - cfg: an omegaconf config object - filename: yaml file name to save the config file - """ - logger = logging.getLogger(__name__) - try: - cfg = deepcopy(cfg) - except Exception: - pass - else: - # if it's deep-copyable, then... - def _replace_type_by_name(x): - if "_target_" in x and callable(x._target_): - try: - x._target_ = _convert_target_to_string(x._target_) - except AttributeError: - pass - - # not necessary, but makes yaml looks nicer - _visit_dict_config(cfg, _replace_type_by_name) - - save_pkl = False - try: - dict = OmegaConf.to_container(cfg, resolve=False) - dumped = yaml.dump(dict, default_flow_style=None, allow_unicode=True, width=9999) - with PathManager.open(filename, "w") as f: - f.write(dumped) - - try: - _ = yaml.unsafe_load(dumped) # test that it is loadable - except Exception: - logger.warning( - "The config contains objects that cannot serialize to a valid yaml. " - f"{filename} is human-readable but cannot be loaded." - ) - save_pkl = True - except Exception: - logger.exception("Unable to serialize the config to yaml. 
Error:") - save_pkl = True - - if save_pkl: - new_filename = filename + ".pkl" - try: - # retry by pickle - with PathManager.open(new_filename, "wb") as f: - cloudpickle.dump(cfg, f) - logger.warning(f"Config is saved using cloudpickle at {new_filename}.") - except Exception: - pass - - @staticmethod - def apply_overrides(cfg, overrides: List[str]): - """ - In-place override contents of cfg. - - Args: - cfg: an omegaconf config object - overrides: list of strings in the format of "a=b" to override configs. - See https://hydra.cc/docs/next/advanced/override_grammar/basic/ - for syntax. - - Returns: - the cfg object - """ - - def safe_update(cfg, key, value): - parts = key.split(".") - for idx in range(1, len(parts)): - prefix = ".".join(parts[:idx]) - v = OmegaConf.select(cfg, prefix, default=None) - if v is None: - break - if not OmegaConf.is_config(v): - raise KeyError( - f"Trying to update key {key}, but {prefix} " - f"is not a config, but has type {type(v)}." - ) - OmegaConf.update(cfg, key, value, merge=True) - - from hydra.core.override_parser.overrides_parser import OverridesParser - - parser = OverridesParser.create() - overrides = parser.parse_overrides(overrides) - for o in overrides: - key = o.key_or_group - value = o.value() - if o.is_delete(): - # TODO support this - raise NotImplementedError("deletion is not yet a supported override") - safe_update(cfg, key, value) - return cfg - - @staticmethod - def to_py(cfg, prefix: str = "cfg."): - """ - Try to convert a config object into Python-like psuedo code. - - Note that perfect conversion is not always possible. So the returned - results are mainly meant to be human-readable, and not meant to be executed. - - Args: - cfg: an omegaconf config object - prefix: root name for the resulting code (default: "cfg.") - - - Returns: - str of formatted Python code - """ - import black - - cfg = OmegaConf.to_container(cfg, resolve=True) - - def _to_str(obj, prefix=None, inside_call=False): - if prefix is None: - prefix = [] - if isinstance(obj, abc.Mapping) and "_target_" in obj: - # Dict representing a function call - target = _convert_target_to_string(obj.pop("_target_")) - args = [] - for k, v in sorted(obj.items()): - args.append(f"{k}={_to_str(v, inside_call=True)}") - args = ", ".join(args) - call = f"{target}({args})" - return "".join(prefix) + call - elif isinstance(obj, abc.Mapping) and not inside_call: - # Dict that is not inside a call is a list of top-level config objects that we - # render as one object per line with dot separated prefixes - key_list = [] - for k, v in sorted(obj.items()): - if isinstance(v, abc.Mapping) and "_target_" not in v: - key_list.append(_to_str(v, prefix=prefix + [k + "."])) - else: - key = "".join(prefix) + k - key_list.append(f"{key}={_to_str(v)}") - return "\n".join(key_list) - elif isinstance(obj, abc.Mapping): - # Dict that is inside a call is rendered as a regular dict - return ( - "{" - + ",".join( - f"{repr(k)}: {_to_str(v, inside_call=inside_call)}" - for k, v in sorted(obj.items()) - ) - + "}" - ) - elif isinstance(obj, list): - return "[" + ",".join(_to_str(x, inside_call=inside_call) for x in obj) + "]" - else: - return repr(obj) - - py_str = _to_str(cfg, prefix=[prefix]) - try: - return black.format_str(py_str, mode=black.Mode()) - except black.InvalidInput: - return py_str diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py deleted file mode 100755 index 259f669b..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/__init__.py +++ /dev/null @@ -1,19 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from . import transforms # isort:skip - -from .build import ( - build_batch_data_loader, - build_detection_test_loader, - build_detection_train_loader, - get_detection_dataset_dicts, - load_proposals_into_dataset, - print_instances_class_histogram, -) -from .catalog import DatasetCatalog, MetadataCatalog, Metadata -from .common import DatasetFromList, MapDataset, ToIterableDataset -from .dataset_mapper import DatasetMapper - -# ensure the builtin datasets are registered -from . import datasets, samplers # isort:skip - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py deleted file mode 100755 index ac2f372a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/benchmark.py +++ /dev/null @@ -1,225 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -from itertools import count -from typing import List, Tuple -import torch -import tqdm -from fvcore.common.timer import Timer - -from detectron2.utils import comm - -from .build import build_batch_data_loader -from .common import DatasetFromList, MapDataset -from .samplers import TrainingSampler - -logger = logging.getLogger(__name__) - - -class _EmptyMapDataset(torch.utils.data.Dataset): - """ - Map anything to emptiness. - """ - - def __init__(self, dataset): - self.ds = dataset - - def __len__(self): - return len(self.ds) - - def __getitem__(self, idx): - _ = self.ds[idx] - return [0] - - -def iter_benchmark( - iterator, num_iter: int, warmup: int = 5, max_time_seconds: float = 60 -) -> Tuple[float, List[float]]: - """ - Benchmark an iterator/iterable for `num_iter` iterations with an extra - `warmup` iterations of warmup. - End early if `max_time_seconds` time is spent on iterations. - - Returns: - float: average time (seconds) per iteration - list[float]: time spent on each iteration. Sometimes useful for further analysis. - """ - num_iter, warmup = int(num_iter), int(warmup) - - iterator = iter(iterator) - for _ in range(warmup): - next(iterator) - timer = Timer() - all_times = [] - for curr_iter in tqdm.trange(num_iter): - start = timer.seconds() - if start > max_time_seconds: - num_iter = curr_iter - break - next(iterator) - all_times.append(timer.seconds() - start) - avg = timer.seconds() / num_iter - return avg, all_times - - -class DataLoaderBenchmark: - """ - Some common benchmarks that help understand perf bottleneck of a standard dataloader - made of dataset, mapper and sampler. 
-    """
-
-    def __init__(
-        self,
-        dataset,
-        *,
-        mapper,
-        sampler=None,
-        total_batch_size,
-        num_workers=0,
-        max_time_seconds: int = 90,
-    ):
-        """
-        Args:
-            max_time_seconds (int): maximum time to spend on each benchmark
-            other args: same as in `build.py:build_detection_train_loader`
-        """
-        if isinstance(dataset, list):
-            dataset = DatasetFromList(dataset, copy=False, serialize=True)
-        if sampler is None:
-            sampler = TrainingSampler(len(dataset))
-
-        self.dataset = dataset
-        self.mapper = mapper
-        self.sampler = sampler
-        self.total_batch_size = total_batch_size
-        self.num_workers = num_workers
-        self.per_gpu_batch_size = self.total_batch_size // comm.get_world_size()
-
-        self.max_time_seconds = max_time_seconds
-
-    def _benchmark(self, iterator, num_iter, warmup, msg=None):
-        avg, all_times = iter_benchmark(iterator, num_iter, warmup, self.max_time_seconds)
-        if msg is not None:
-            self._log_time(msg, avg, all_times)
-        return avg, all_times
-
-    def _log_time(self, msg, avg, all_times, distributed=False):
-        percentiles = [np.percentile(all_times, k, interpolation="nearest") for k in [1, 5, 95, 99]]
-        if not distributed:
-            logger.info(
-                f"{msg}: avg={1.0/avg:.1f} it/s, "
-                f"p1={percentiles[0]:.2g}s, p5={percentiles[1]:.2g}s, "
-                f"p95={percentiles[2]:.2g}s, p99={percentiles[3]:.2g}s."
-            )
-            return
-        avg_per_gpu = comm.all_gather(avg)
-        percentiles_per_gpu = comm.all_gather(percentiles)
-        if comm.get_rank() > 0:
-            return
-        for idx, avg, percentiles in zip(count(), avg_per_gpu, percentiles_per_gpu):
-            logger.info(
-                f"GPU{idx} {msg}: avg={1.0/avg:.1f} it/s, "
-                f"p1={percentiles[0]:.2g}s, p5={percentiles[1]:.2g}s, "
-                f"p95={percentiles[2]:.2g}s, p99={percentiles[3]:.2g}s."
-            )
-
-    def benchmark_dataset(self, num_iter, warmup=5):
-        """
-        Benchmark the speed of taking raw samples from the dataset.
-        """
-
-        def loader():
-            while True:
-                for k in self.sampler:
-                    yield self.dataset[k]
-
-        self._benchmark(loader(), num_iter, warmup, "Dataset Alone")
-
-    def benchmark_mapper(self, num_iter, warmup=5):
-        """
-        Benchmark the speed of taking raw samples from the dataset and map
-        them in a single process.
-        """
-
-        def loader():
-            while True:
-                for k in self.sampler:
-                    yield self.mapper(self.dataset[k])
-
-        self._benchmark(loader(), num_iter, warmup, "Single Process Mapper (sec/sample)")
-
-    def benchmark_workers(self, num_iter, warmup=10):
-        """
-        Benchmark the dataloader by tuning num_workers to [0, 1, self.num_workers].
-        """
-        candidates = [0, 1]
-        if self.num_workers not in candidates:
-            candidates.append(self.num_workers)
-
-        dataset = MapDataset(self.dataset, self.mapper)
-        for n in candidates:
-            loader = build_batch_data_loader(
-                dataset,
-                self.sampler,
-                self.total_batch_size,
-                num_workers=n,
-            )
-            self._benchmark(
-                iter(loader),
-                num_iter * max(n, 1),
-                warmup * max(n, 1),
-                f"DataLoader ({n} workers, bs={self.per_gpu_batch_size})",
-            )
-            del loader
-
-    def benchmark_IPC(self, num_iter, warmup=10):
-        """
-        Benchmark the dataloader where each worker outputs nothing. This
-        eliminates the IPC overhead compared to the regular dataloader.
-
-        PyTorch multiprocessing's IPC only optimizes for torch tensors.
-        Large numpy arrays or other data structures may incur large IPC overhead.
- """ - n = self.num_workers - dataset = _EmptyMapDataset(MapDataset(self.dataset, self.mapper)) - loader = build_batch_data_loader( - dataset, self.sampler, self.total_batch_size, num_workers=n - ) - self._benchmark( - iter(loader), - num_iter * max(n, 1), - warmup * max(n, 1), - f"DataLoader ({n} workers, bs={self.per_gpu_batch_size}) w/o comm", - ) - - def benchmark_distributed(self, num_iter, warmup=10): - """ - Benchmark the dataloader in each distributed worker, and log results of - all workers. This helps understand the final performance as well as - the variances among workers. - - It also prints startup time (first iter) of the dataloader. - """ - gpu = comm.get_world_size() - dataset = MapDataset(self.dataset, self.mapper) - n = self.num_workers - loader = build_batch_data_loader( - dataset, self.sampler, self.total_batch_size, num_workers=n - ) - - timer = Timer() - loader = iter(loader) - next(loader) - startup_time = timer.seconds() - logger.info("Dataloader startup time: {:.2f} seconds".format(startup_time)) - - comm.synchronize() - - avg, all_times = self._benchmark(loader, num_iter * max(n, 1), warmup * max(n, 1)) - del loader - self._log_time( - f"DataLoader ({gpu} GPUs x {n} workers, total bs={self.total_batch_size})", - avg, - all_times, - True, - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py deleted file mode 100755 index a31369d1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/build.py +++ /dev/null @@ -1,542 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import logging -import numpy as np -import operator -import pickle -from typing import Any, Callable, Dict, List, Optional, Union -import torch -import torch.utils.data as torchdata -from tabulate import tabulate -from termcolor import colored - -from detectron2.config import configurable -from detectron2.structures import BoxMode -from detectron2.utils.comm import get_world_size -from detectron2.utils.env import seed_all_rng -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import _log_api_usage, log_first_n - -from .catalog import DatasetCatalog, MetadataCatalog -from .common import AspectRatioGroupedDataset, DatasetFromList, MapDataset, ToIterableDataset -from .dataset_mapper import DatasetMapper -from .detection_utils import check_metadata_consistency -from .samplers import ( - InferenceSampler, - RandomSubsetTrainingSampler, - RepeatFactorTrainingSampler, - TrainingSampler, -) - -""" -This file contains the default logic to build a dataloader for training or testing. -""" - -__all__ = [ - "build_batch_data_loader", - "build_detection_train_loader", - "build_detection_test_loader", - "get_detection_dataset_dicts", - "load_proposals_into_dataset", - "print_instances_class_histogram", -] - - -def filter_images_with_only_crowd_annotations(dataset_dicts): - """ - Filter out images with none annotations or only crowd annotations - (i.e., images without non-crowd annotations). - A common training-time preprocessing on COCO dataset. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - - Returns: - list[dict]: the same format, but filtered. 
- """ - num_before = len(dataset_dicts) - - def valid(anns): - for ann in anns: - if ann.get("iscrowd", 0) == 0: - return True - return False - - dataset_dicts = [x for x in dataset_dicts if valid(x["annotations"])] - num_after = len(dataset_dicts) - logger = logging.getLogger(__name__) - logger.info( - "Removed {} images with no usable annotations. {} images left.".format( - num_before - num_after, num_after - ) - ) - return dataset_dicts - - -def filter_images_with_few_keypoints(dataset_dicts, min_keypoints_per_image): - """ - Filter out images with too few number of keypoints. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - - Returns: - list[dict]: the same format as dataset_dicts, but filtered. - """ - num_before = len(dataset_dicts) - - def visible_keypoints_in_image(dic): - # Each keypoints field has the format [x1, y1, v1, ...], where v is visibility - annotations = dic["annotations"] - return sum( - (np.array(ann["keypoints"][2::3]) > 0).sum() - for ann in annotations - if "keypoints" in ann - ) - - dataset_dicts = [ - x for x in dataset_dicts if visible_keypoints_in_image(x) >= min_keypoints_per_image - ] - num_after = len(dataset_dicts) - logger = logging.getLogger(__name__) - logger.info( - "Removed {} images with fewer than {} keypoints.".format( - num_before - num_after, min_keypoints_per_image - ) - ) - return dataset_dicts - - -def load_proposals_into_dataset(dataset_dicts, proposal_file): - """ - Load precomputed object proposals into the dataset. - - The proposal file should be a pickled dict with the following keys: - - - "ids": list[int] or list[str], the image ids - - "boxes": list[np.ndarray], each is an Nx4 array of boxes corresponding to the image id - - "objectness_logits": list[np.ndarray], each is an N sized array of objectness scores - corresponding to the boxes. - - "bbox_mode": the BoxMode of the boxes array. Defaults to ``BoxMode.XYXY_ABS``. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 Dataset format. - proposal_file (str): file path of pre-computed proposals, in pkl format. - - Returns: - list[dict]: the same format as dataset_dicts, but added proposal field. - """ - logger = logging.getLogger(__name__) - logger.info("Loading proposals from: {}".format(proposal_file)) - - with PathManager.open(proposal_file, "rb") as f: - proposals = pickle.load(f, encoding="latin1") - - # Rename the key names in D1 proposal files - rename_keys = {"indexes": "ids", "scores": "objectness_logits"} - for key in rename_keys: - if key in proposals: - proposals[rename_keys[key]] = proposals.pop(key) - - # Fetch the indexes of all proposals that are in the dataset - # Convert image_id to str since they could be int. 
- img_ids = set({str(record["image_id"]) for record in dataset_dicts}) - id_to_index = {str(id): i for i, id in enumerate(proposals["ids"]) if str(id) in img_ids} - - # Assuming default bbox_mode of precomputed proposals are 'XYXY_ABS' - bbox_mode = BoxMode(proposals["bbox_mode"]) if "bbox_mode" in proposals else BoxMode.XYXY_ABS - - for record in dataset_dicts: - # Get the index of the proposal - i = id_to_index[str(record["image_id"])] - - boxes = proposals["boxes"][i] - objectness_logits = proposals["objectness_logits"][i] - # Sort the proposals in descending order of the scores - inds = objectness_logits.argsort()[::-1] - record["proposal_boxes"] = boxes[inds] - record["proposal_objectness_logits"] = objectness_logits[inds] - record["proposal_bbox_mode"] = bbox_mode - - return dataset_dicts - - -def print_instances_class_histogram(dataset_dicts, class_names): - """ - Args: - dataset_dicts (list[dict]): list of dataset dicts. - class_names (list[str]): list of class names (zero-indexed). - """ - num_classes = len(class_names) - hist_bins = np.arange(num_classes + 1) - histogram = np.zeros((num_classes,), dtype=np.int) - for entry in dataset_dicts: - annos = entry["annotations"] - classes = np.asarray( - [x["category_id"] for x in annos if not x.get("iscrowd", 0)], dtype=np.int - ) - if len(classes): - assert classes.min() >= 0, f"Got an invalid category_id={classes.min()}" - assert ( - classes.max() < num_classes - ), f"Got an invalid category_id={classes.max()} for a dataset of {num_classes} classes" - histogram += np.histogram(classes, bins=hist_bins)[0] - - N_COLS = min(6, len(class_names) * 2) - - def short_name(x): - # make long class names shorter. useful for lvis - if len(x) > 13: - return x[:11] + ".." - return x - - data = list( - itertools.chain(*[[short_name(class_names[i]), int(v)] for i, v in enumerate(histogram)]) - ) - total_num_instances = sum(data[1::2]) - data.extend([None] * (N_COLS - (len(data) % N_COLS))) - if num_classes > 1: - data.extend(["total", total_num_instances]) - data = itertools.zip_longest(*[data[i::N_COLS] for i in range(N_COLS)]) - table = tabulate( - data, - headers=["category", "#instances"] * (N_COLS // 2), - tablefmt="pipe", - numalign="left", - stralign="center", - ) - log_first_n( - logging.INFO, - "Distribution of instances among all {} categories:\n".format(num_classes) - + colored(table, "cyan"), - key="message", - ) - - -def get_detection_dataset_dicts( - names, - filter_empty=True, - min_keypoints=0, - proposal_files=None, - check_consistency=True, -): - """ - Load and prepare dataset dicts for instance detection/segmentation and semantic segmentation. - - Args: - names (str or list[str]): a dataset name or a list of dataset names - filter_empty (bool): whether to filter out images without instance annotations - min_keypoints (int): filter out images with fewer keypoints than - `min_keypoints`. Set to 0 to do nothing. - proposal_files (list[str]): if given, a list of object proposal files - that match each dataset in `names`. - check_consistency (bool): whether to check if datasets have consistent metadata. - - Returns: - list[dict]: a list of dicts following the standard dataset dict format. 
- """ - if isinstance(names, str): - names = [names] - assert len(names), names - dataset_dicts = [DatasetCatalog.get(dataset_name) for dataset_name in names] - for dataset_name, dicts in zip(names, dataset_dicts): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - - if proposal_files is not None: - assert len(names) == len(proposal_files) - # load precomputed proposals from proposal files - dataset_dicts = [ - load_proposals_into_dataset(dataset_i_dicts, proposal_file) - for dataset_i_dicts, proposal_file in zip(dataset_dicts, proposal_files) - ] - - if isinstance(dataset_dicts[0], torchdata.Dataset): - return torchdata.ConcatDataset(dataset_dicts) - - dataset_dicts = list(itertools.chain.from_iterable(dataset_dicts)) - - has_instances = "annotations" in dataset_dicts[0] - if filter_empty and has_instances: - dataset_dicts = filter_images_with_only_crowd_annotations(dataset_dicts) - if min_keypoints > 0 and has_instances: - dataset_dicts = filter_images_with_few_keypoints(dataset_dicts, min_keypoints) - - if check_consistency and has_instances: - try: - class_names = MetadataCatalog.get(names[0]).thing_classes - check_metadata_consistency("thing_classes", names) - print_instances_class_histogram(dataset_dicts, class_names) - except AttributeError: # class names are not available for this dataset - pass - - assert len(dataset_dicts), "No valid data found in {}.".format(",".join(names)) - return dataset_dicts - - -def build_batch_data_loader( - dataset, - sampler, - total_batch_size, - *, - aspect_ratio_grouping=False, - num_workers=0, - collate_fn=None, -): - """ - Build a batched dataloader. The main differences from `torch.utils.data.DataLoader` are: - 1. support aspect ratio grouping options - 2. use no "batch collation", because this is common for detection training - - Args: - dataset (torch.utils.data.Dataset): a pytorch map-style or iterable dataset. - sampler (torch.utils.data.sampler.Sampler or None): a sampler that produces indices. - Must be provided iff. ``dataset`` is a map-style dataset. - total_batch_size, aspect_ratio_grouping, num_workers, collate_fn: see - :func:`build_detection_train_loader`. - - Returns: - iterable[list]. Length of each list is the batch size of the current - GPU. Each element in the list comes from the dataset. 
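Wiring these pieces together by hand looks roughly like this sketch (the dataset name is hypothetical and must be registered beforehand; no mapper is applied, so elements stay raw dataset dicts):

```python
# Hedged sketch of a hand-built training loader using the functions above.
from detectron2.data import (DatasetFromList, build_batch_data_loader,
                             get_detection_dataset_dicts)
from detectron2.data.samplers import TrainingSampler

dicts = get_detection_dataset_dicts("my_dataset_train")  # hypothetical name
dataset = DatasetFromList(dicts, copy=False)
sampler = TrainingSampler(len(dataset))
loader = build_batch_data_loader(dataset, sampler, total_batch_size=16)
batch = next(iter(loader))  # list of 16 // world_size raw dataset dicts
```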
- """ - world_size = get_world_size() - assert ( - total_batch_size > 0 and total_batch_size % world_size == 0 - ), "Total batch size ({}) must be divisible by the number of gpus ({}).".format( - total_batch_size, world_size - ) - batch_size = total_batch_size // world_size - - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - dataset = ToIterableDataset(dataset, sampler) - - if aspect_ratio_grouping: - data_loader = torchdata.DataLoader( - dataset, - num_workers=num_workers, - collate_fn=operator.itemgetter(0), # don't batch, but yield individual elements - worker_init_fn=worker_init_reset_seed, - ) # yield individual mapped dict - data_loader = AspectRatioGroupedDataset(data_loader, batch_size) - if collate_fn is None: - return data_loader - return MapDataset(data_loader, collate_fn) - else: - return torchdata.DataLoader( - dataset, - batch_size=batch_size, - drop_last=True, - num_workers=num_workers, - collate_fn=trivial_batch_collator if collate_fn is None else collate_fn, - worker_init_fn=worker_init_reset_seed, - ) - - -def _train_loader_from_config(cfg, mapper=None, *, dataset=None, sampler=None): - if dataset is None: - dataset = get_detection_dataset_dicts( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - _log_api_usage("dataset." + cfg.DATASETS.TRAIN[0]) - - if mapper is None: - mapper = DatasetMapper(cfg, True) - - if sampler is None: - sampler_name = cfg.DATALOADER.SAMPLER_TRAIN - logger = logging.getLogger(__name__) - logger.info("Using training sampler {}".format(sampler_name)) - if sampler_name == "TrainingSampler": - sampler = TrainingSampler(len(dataset)) - elif sampler_name == "RepeatFactorTrainingSampler": - repeat_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency( - dataset, cfg.DATALOADER.REPEAT_THRESHOLD - ) - sampler = RepeatFactorTrainingSampler(repeat_factors) - elif sampler_name == "RandomSubsetTrainingSampler": - sampler = RandomSubsetTrainingSampler(len(dataset), cfg.DATALOADER.RANDOM_SUBSET_RATIO) - else: - raise ValueError("Unknown training sampler: {}".format(sampler_name)) - - return { - "dataset": dataset, - "sampler": sampler, - "mapper": mapper, - "total_batch_size": cfg.SOLVER.IMS_PER_BATCH, - "aspect_ratio_grouping": cfg.DATALOADER.ASPECT_RATIO_GROUPING, - "num_workers": cfg.DATALOADER.NUM_WORKERS, - } - - -@configurable(from_config=_train_loader_from_config) -def build_detection_train_loader( - dataset, - *, - mapper, - sampler=None, - total_batch_size, - aspect_ratio_grouping=True, - num_workers=0, - collate_fn=None, -): - """ - Build a dataloader for object detection with some default features. - - Args: - dataset (list or torch.utils.data.Dataset): a list of dataset dicts, - or a pytorch dataset (either map-style or iterable). It can be obtained - by using :func:`DatasetCatalog.get` or :func:`get_detection_dataset_dicts`. - mapper (callable): a callable which takes a sample (dict) from dataset and - returns the format to be consumed by the model. - When using cfg, the default choice is ``DatasetMapper(cfg, is_train=True)``. - sampler (torch.utils.data.sampler.Sampler or None): a sampler that produces - indices to be applied on ``dataset``. 
- If ``dataset`` is map-style, the default sampler is a :class:`TrainingSampler`, - which coordinates an infinite random shuffle sequence across all workers. - Sampler must be None if ``dataset`` is iterable. - total_batch_size (int): total batch size across all workers. - aspect_ratio_grouping (bool): whether to group images with similar - aspect ratio for efficiency. When enabled, it requires each - element in dataset be a dict with keys "width" and "height". - num_workers (int): number of parallel data loading workers - collate_fn: a function that determines how to do batching, same as the argument of - `torch.utils.data.DataLoader`. Defaults to do no collation and return a list of - data. No collation is OK for small batch size and simple data structures. - If your batch size is large and each sample contains too many small tensors, - it's more efficient to collate them in data loader. - - Returns: - torch.utils.data.DataLoader: - a dataloader. Each output from it is a ``list[mapped_element]`` of length - ``total_batch_size / num_workers``, where ``mapped_element`` is produced - by the ``mapper``. - """ - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False) - if mapper is not None: - dataset = MapDataset(dataset, mapper) - - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - if sampler is None: - sampler = TrainingSampler(len(dataset)) - assert isinstance(sampler, torchdata.Sampler), f"Expect a Sampler but got {type(sampler)}" - return build_batch_data_loader( - dataset, - sampler, - total_batch_size, - aspect_ratio_grouping=aspect_ratio_grouping, - num_workers=num_workers, - collate_fn=collate_fn, - ) - - -def _test_loader_from_config(cfg, dataset_name, mapper=None): - """ - Uses the given `dataset_name` argument (instead of the names in cfg), because the - standard practice is to evaluate each test set individually (not combining them). - """ - if isinstance(dataset_name, str): - dataset_name = [dataset_name] - - dataset = get_detection_dataset_dicts( - dataset_name, - filter_empty=False, - proposal_files=[ - cfg.DATASETS.PROPOSAL_FILES_TEST[list(cfg.DATASETS.TEST).index(x)] for x in dataset_name - ] - if cfg.MODEL.LOAD_PROPOSALS - else None, - ) - if mapper is None: - mapper = DatasetMapper(cfg, False) - return { - "dataset": dataset, - "mapper": mapper, - "num_workers": cfg.DATALOADER.NUM_WORKERS, - "sampler": InferenceSampler(len(dataset)), - } - - -@configurable(from_config=_test_loader_from_config) -def build_detection_test_loader( - dataset: Union[List[Any], torchdata.Dataset], - *, - mapper: Callable[[Dict[str, Any]], Any], - sampler: Optional[torchdata.Sampler] = None, - batch_size: int = 1, - num_workers: int = 0, - collate_fn: Optional[Callable[[List[Any]], Any]] = None, -) -> torchdata.DataLoader: - """ - Similar to `build_detection_train_loader`, with default batch size = 1, - and sampler = :class:`InferenceSampler`. This sampler coordinates all workers - to produce the exact set of all samples. - - Args: - dataset: a list of dataset dicts, - or a pytorch dataset (either map-style or iterable). They can be obtained - by using :func:`DatasetCatalog.get` or :func:`get_detection_dataset_dicts`. - mapper: a callable which takes a sample (dict) from dataset - and returns the format to be consumed by the model. - When using cfg, the default choice is ``DatasetMapper(cfg, is_train=False)``. - sampler: a sampler that produces - indices to be applied on ``dataset``. 
Default to :class:`InferenceSampler`, - which splits the dataset across all workers. Sampler must be None - if `dataset` is iterable. - batch_size: the batch size of the data loader to be created. - Default to 1 image per worker since this is the standard when reporting - inference time in papers. - num_workers: number of parallel data loading workers - collate_fn: same as the argument of `torch.utils.data.DataLoader`. - Defaults to do no collation and return a list of data. - - Returns: - DataLoader: a torch DataLoader, that loads the given detection - dataset, with test-time transformation and batching. - - Examples: - :: - data_loader = build_detection_test_loader( - DatasetRegistry.get("my_test"), - mapper=DatasetMapper(...)) - - # or, instantiate with a CfgNode: - data_loader = build_detection_test_loader(cfg, "my_test") - """ - if isinstance(dataset, list): - dataset = DatasetFromList(dataset, copy=False) - if mapper is not None: - dataset = MapDataset(dataset, mapper) - if isinstance(dataset, torchdata.IterableDataset): - assert sampler is None, "sampler must be None if dataset is IterableDataset" - else: - if sampler is None: - sampler = InferenceSampler(len(dataset)) - return torchdata.DataLoader( - dataset, - batch_size=batch_size, - sampler=sampler, - drop_last=False, - num_workers=num_workers, - collate_fn=trivial_batch_collator if collate_fn is None else collate_fn, - ) - - -def trivial_batch_collator(batch): - """ - A batch collator that does nothing. - """ - return batch - - -def worker_init_reset_seed(worker_id): - initial_seed = torch.initial_seed() % 2 ** 31 - seed_all_rng(initial_seed + worker_id) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py deleted file mode 100755 index 45c110c1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/catalog.py +++ /dev/null @@ -1,236 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import types -from collections import UserDict -from typing import List - -from detectron2.utils.logger import log_first_n - -__all__ = ["DatasetCatalog", "MetadataCatalog", "Metadata"] - - -class _DatasetCatalog(UserDict): - """ - A global dictionary that stores information about the datasets and how to obtain them. - - It contains a mapping from strings - (which are names that identify a dataset, e.g. "coco_2014_train") - to a function which parses the dataset and returns the samples in the - format of `list[dict]`. - - The returned dicts should be in Detectron2 Dataset format (See DATASETS.md for details) - if used with the data loader functionalities in `data/build.py,data/detection_transform.py`. - - The purpose of having this catalog is to make it easy to choose - different datasets, by just using the strings in the config. - """ - - def register(self, name, func): - """ - Args: - name (str): the name that identifies a dataset, e.g. "coco_2014_train". - func (callable): a callable which takes no arguments and returns a list of dicts. - It must return the same results if called multiple times. - """ - assert callable(func), "You must register a function with `DatasetCatalog.register`!" - assert name not in self, "Dataset '{}' is already registered!".format(name) - self[name] = func - - def get(self, name): - """ - Call the registered function and return its results. - - Args: - name (str): the name that identifies a dataset, e.g. "coco_2014_train". 
- - Returns: - list[dict]: dataset annotations. - """ - try: - f = self[name] - except KeyError as e: - raise KeyError( - "Dataset '{}' is not registered! Available datasets are: {}".format( - name, ", ".join(list(self.keys())) - ) - ) from e - return f() - - def list(self) -> List[str]: - """ - List all registered datasets. - - Returns: - list[str] - """ - return list(self.keys()) - - def remove(self, name): - """ - Alias of ``pop``. - """ - self.pop(name) - - def __str__(self): - return "DatasetCatalog(registered datasets: {})".format(", ".join(self.keys())) - - __repr__ = __str__ - - -DatasetCatalog = _DatasetCatalog() -DatasetCatalog.__doc__ = ( - _DatasetCatalog.__doc__ - + """ - .. automethod:: detectron2.data.catalog.DatasetCatalog.register - .. automethod:: detectron2.data.catalog.DatasetCatalog.get -""" -) - - -class Metadata(types.SimpleNamespace): - """ - A class that supports simple attribute setter/getter. - It is intended for storing metadata of a dataset and make it accessible globally. - - Examples: - :: - # somewhere when you load the data: - MetadataCatalog.get("mydataset").thing_classes = ["person", "dog"] - - # somewhere when you print statistics or visualize: - classes = MetadataCatalog.get("mydataset").thing_classes - """ - - # the name of the dataset - # set default to N/A so that `self.name` in the errors will not trigger getattr again - name: str = "N/A" - - _RENAMED = { - "class_names": "thing_classes", - "dataset_id_to_contiguous_id": "thing_dataset_id_to_contiguous_id", - "stuff_class_names": "stuff_classes", - } - - def __getattr__(self, key): - if key in self._RENAMED: - log_first_n( - logging.WARNING, - "Metadata '{}' was renamed to '{}'!".format(key, self._RENAMED[key]), - n=10, - ) - return getattr(self, self._RENAMED[key]) - - # "name" exists in every metadata - if len(self.__dict__) > 1: - raise AttributeError( - "Attribute '{}' does not exist in the metadata of dataset '{}'. Available " - "keys are {}.".format(key, self.name, str(self.__dict__.keys())) - ) - else: - raise AttributeError( - f"Attribute '{key}' does not exist in the metadata of dataset '{self.name}': " - "metadata is empty." - ) - - def __setattr__(self, key, val): - if key in self._RENAMED: - log_first_n( - logging.WARNING, - "Metadata '{}' was renamed to '{}'!".format(key, self._RENAMED[key]), - n=10, - ) - setattr(self, self._RENAMED[key], val) - - # Ensure that metadata of the same name stays consistent - try: - oldval = getattr(self, key) - assert oldval == val, ( - "Attribute '{}' in the metadata of '{}' cannot be set " - "to a different value!\n{} != {}".format(key, self.name, oldval, val) - ) - except AttributeError: - super().__setattr__(key, val) - - def as_dict(self): - """ - Returns all the metadata as a dict. - Note that modifications to the returned dict will not reflect on the Metadata object. - """ - return copy.copy(self.__dict__) - - def set(self, **kwargs): - """ - Set multiple metadata with kwargs. - """ - for k, v in kwargs.items(): - setattr(self, k, v) - return self - - def get(self, key, default=None): - """ - Access an attribute and return its value if exists. - Otherwise return default. - """ - try: - return getattr(self, key) - except AttributeError: - return default - - -class _MetadataCatalog(UserDict): - """ - MetadataCatalog is a global dictionary that provides access to - :class:`Metadata` of a given dataset. 
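A short sketch of the setter/getter contract just defined: metadata keys are effectively write-once, and re-assigning a different value trips the consistency assertion in `__setattr__`. The dataset name below is hypothetical:

```python
from detectron2.data import MetadataCatalog

meta = MetadataCatalog.get("my_val")      # created empty on first access
meta.thing_classes = ["person", "dog"]    # first write is stored
meta.thing_classes = ["person", "dog"]    # an identical re-write is a no-op
try:
    meta.thing_classes = ["cat"]          # a different value raises
except AssertionError as e:
    print(e)
```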
- - The metadata associated with a certain name is a singleton: once created, the - metadata will stay alive and will be returned by future calls to ``get(name)``. - - It's like global variables, so don't abuse it. - It's meant for storing knowledge that's constant and shared across the execution - of the program, e.g.: the class names in COCO. - """ - - def get(self, name): - """ - Args: - name (str): name of a dataset (e.g. coco_2014_train). - - Returns: - Metadata: The :class:`Metadata` instance associated with this name, - or create an empty one if none is available. - """ - assert len(name) - r = super().get(name, None) - if r is None: - r = self[name] = Metadata(name=name) - return r - - def list(self): - """ - List all registered metadata. - - Returns: - list[str]: keys (names of datasets) of all registered metadata - """ - return list(self.keys()) - - def remove(self, name): - """ - Alias of ``pop``. - """ - self.pop(name) - - def __str__(self): - return "MetadataCatalog(registered metadata: {})".format(", ".join(self.keys())) - - __repr__ = __str__ - - -MetadataCatalog = _MetadataCatalog() -MetadataCatalog.__doc__ = ( - _MetadataCatalog.__doc__ - + """ - .. automethod:: detectron2.data.catalog.MetadataCatalog.get -""" -) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py deleted file mode 100755 index d6b87424..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/common.py +++ /dev/null @@ -1,241 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import logging -import numpy as np -import pickle -import random -import torch.utils.data as data -from torch.utils.data.sampler import Sampler - -from detectron2.utils.serialize import PicklableWrapper - -__all__ = ["MapDataset", "DatasetFromList", "AspectRatioGroupedDataset", "ToIterableDataset"] - - -def _shard_iterator_dataloader_worker(iterable): - # Shard the iterable if we're currently inside pytorch dataloader worker. - worker_info = data.get_worker_info() - if worker_info is None or worker_info.num_workers == 1: - # do nothing - yield from iterable - else: - yield from itertools.islice(iterable, worker_info.id, None, worker_info.num_workers) - - -class _MapIterableDataset(data.IterableDataset): - """ - Map a function over elements in an IterableDataset. - - Similar to pytorch's MapIterDataPipe, but support filtering when map_func - returns None. - - This class is not public-facing. Will be called by `MapDataset`. - """ - - def __init__(self, dataset, map_func): - self._dataset = dataset - self._map_func = PicklableWrapper(map_func) # wrap so that a lambda will work - - def __len__(self): - return len(self._dataset) - - def __iter__(self): - for x in map(self._map_func, self._dataset): - if x is not None: - yield x - - -class MapDataset(data.Dataset): - """ - Map a function over the elements in a dataset. - """ - - def __init__(self, dataset, map_func): - """ - Args: - dataset: a dataset where map function is applied. Can be either - map-style or iterable dataset. When given an iterable dataset, - the returned object will also be an iterable dataset. - map_func: a callable which maps the element in dataset. map_func can - return None to skip the data (e.g. in case of errors). - How None is handled depends on the style of `dataset`. - If `dataset` is map-style, it randomly tries other elements. - If `dataset` is iterable, it skips the data and tries the next. 
- """ - self._dataset = dataset - self._map_func = PicklableWrapper(map_func) # wrap so that a lambda will work - - self._rng = random.Random(42) - self._fallback_candidates = set(range(len(dataset))) - - def __new__(cls, dataset, map_func): - is_iterable = isinstance(dataset, data.IterableDataset) - if is_iterable: - return _MapIterableDataset(dataset, map_func) - else: - return super().__new__(cls) - - def __getnewargs__(self): - return self._dataset, self._map_func - - def __len__(self): - return len(self._dataset) - - def __getitem__(self, idx): - retry_count = 0 - cur_idx = int(idx) - - while True: - data = self._map_func(self._dataset[cur_idx]) - if data is not None: - self._fallback_candidates.add(cur_idx) - return data - - # _map_func fails for this idx, use a random new index from the pool - retry_count += 1 - self._fallback_candidates.discard(cur_idx) - cur_idx = self._rng.sample(self._fallback_candidates, k=1)[0] - - if retry_count >= 3: - logger = logging.getLogger(__name__) - logger.warning( - "Failed to apply `_map_func` for idx: {}, retry count: {}".format( - idx, retry_count - ) - ) - - -class DatasetFromList(data.Dataset): - """ - Wrap a list to a torch Dataset. It produces elements of the list as data. - """ - - def __init__(self, lst: list, copy: bool = True, serialize: bool = True): - """ - Args: - lst (list): a list which contains elements to produce. - copy (bool): whether to deepcopy the element when producing it, - so that the result can be modified in place without affecting the - source in the list. - serialize (bool): whether to hold memory using serialized objects, when - enabled, data loader workers can use shared RAM from master - process instead of making a copy. - """ - self._lst = lst - self._copy = copy - self._serialize = serialize - - def _serialize(data): - buffer = pickle.dumps(data, protocol=-1) - return np.frombuffer(buffer, dtype=np.uint8) - - if self._serialize: - logger = logging.getLogger(__name__) - logger.info( - "Serializing {} elements to byte tensors and concatenating them all ...".format( - len(self._lst) - ) - ) - self._lst = [_serialize(x) for x in self._lst] - self._addr = np.asarray([len(x) for x in self._lst], dtype=np.int64) - self._addr = np.cumsum(self._addr) - self._lst = np.concatenate(self._lst) - logger.info("Serialized dataset takes {:.2f} MiB".format(len(self._lst) / 1024 ** 2)) - - def __len__(self): - if self._serialize: - return len(self._addr) - else: - return len(self._lst) - - def __getitem__(self, idx): - if self._serialize: - start_addr = 0 if idx == 0 else self._addr[idx - 1].item() - end_addr = self._addr[idx].item() - bytes = memoryview(self._lst[start_addr:end_addr]) - return pickle.loads(bytes) - elif self._copy: - return copy.deepcopy(self._lst[idx]) - else: - return self._lst[idx] - - -class ToIterableDataset(data.IterableDataset): - """ - Convert an old indices-based (also called map-style) dataset - to an iterable-style dataset. - """ - - def __init__(self, dataset: data.Dataset, sampler: Sampler, shard_sampler: bool = True): - """ - Args: - dataset: an old-style dataset with ``__getitem__`` - sampler: a cheap iterable that produces indices to be applied on ``dataset``. - shard_sampler: whether to shard the sampler based on the current pytorch data loader - worker id. When an IterableDataset is forked by pytorch's DataLoader into multiple - workers, it is responsible for sharding its data based on worker id so that workers - don't produce identical data. 
- - Most samplers (like our TrainingSampler) do not shard based on dataloader worker id - and this argument should be set to True. But certain samplers may be already - sharded, in that case this argument should be set to False. - """ - assert not isinstance(dataset, data.IterableDataset), dataset - assert isinstance(sampler, Sampler), sampler - self.dataset = dataset - self.sampler = sampler - self.shard_sampler = shard_sampler - - def __iter__(self): - if not self.shard_sampler: - sampler = self.sampler - else: - # With map-style dataset, `DataLoader(dataset, sampler)` runs the - # sampler in main process only. But `DataLoader(ToIterableDataset(dataset, sampler))` - # will run sampler in every of the N worker. So we should only keep 1/N of the ids on - # each worker. The assumption is that sampler is cheap to iterate so it's fine to - # discard ids in workers. - sampler = _shard_iterator_dataloader_worker(self.sampler) - for idx in sampler: - yield self.dataset[idx] - - def __len__(self): - return len(self.sampler) - - -class AspectRatioGroupedDataset(data.IterableDataset): - """ - Batch data that have similar aspect ratio together. - In this implementation, images whose aspect ratio < (or >) 1 will - be batched together. - This improves training speed because the images then need less padding - to form a batch. - - It assumes the underlying dataset produces dicts with "width" and "height" keys. - It will then produce a list of original dicts with length = batch_size, - all with similar aspect ratios. - """ - - def __init__(self, dataset, batch_size): - """ - Args: - dataset: an iterable. Each element must be a dict with keys - "width" and "height", which will be used to batch data. - batch_size (int): - """ - self.dataset = dataset - self.batch_size = batch_size - self._buckets = [[] for _ in range(2)] - # Hard-coded two aspect ratio groups: w > h and w < h. - # Can add support for more aspect ratio groups, but doesn't seem useful - - def __iter__(self): - for d in self.dataset: - w, h = d["width"], d["height"] - bucket_id = 0 if w > h else 1 - bucket = self._buckets[bucket_id] - bucket.append(d) - if len(bucket) == self.batch_size: - yield bucket[:] - del bucket[:] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py deleted file mode 100755 index a8714f79..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/dataset_mapper.py +++ /dev/null @@ -1,191 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import numpy as np -from typing import List, Optional, Union -import torch - -from detectron2.config import configurable - -from . import detection_utils as utils -from . import transforms as T - -""" -This file contains the default mapping that's applied to "dataset dicts". -""" - -__all__ = ["DatasetMapper"] - - -class DatasetMapper: - """ - A callable which takes a dataset dict in Detectron2 Dataset format, - and map it into a format used by the model. - - This is the default callable to be used to map your dataset dict into training data. - You may need to follow it to implement your own one for customized logic, - such as a different way to read or transform images. - See :doc:`/tutorials/data_loading` for details. - - The callable currently does the following: - - 1. Read the image from "file_name" - 2. Applies cropping/geometric transforms to the image and annotations - 3. 
Prepare data and annotations to Tensor and :class:`Instances` - """ - - @configurable - def __init__( - self, - is_train: bool, - *, - augmentations: List[Union[T.Augmentation, T.Transform]], - image_format: str, - use_instance_mask: bool = False, - use_keypoint: bool = False, - instance_mask_format: str = "polygon", - keypoint_hflip_indices: Optional[np.ndarray] = None, - precomputed_proposal_topk: Optional[int] = None, - recompute_boxes: bool = False, - ): - """ - NOTE: this interface is experimental. - - Args: - is_train: whether it's used in training or inference - augmentations: a list of augmentations or deterministic transforms to apply - image_format: an image format supported by :func:`detection_utils.read_image`. - use_instance_mask: whether to process instance segmentation annotations, if available - use_keypoint: whether to process keypoint annotations if available - instance_mask_format: one of "polygon" or "bitmask". Process instance segmentation - masks into this format. - keypoint_hflip_indices: see :func:`detection_utils.create_keypoint_hflip_indices` - precomputed_proposal_topk: if given, will load pre-computed - proposals from dataset_dict and keep the top k proposals for each image. - recompute_boxes: whether to overwrite bounding box annotations - by computing tight bounding boxes from instance mask annotations. - """ - if recompute_boxes: - assert use_instance_mask, "recompute_boxes requires instance masks" - # fmt: off - self.is_train = is_train - self.augmentations = T.AugmentationList(augmentations) - self.image_format = image_format - self.use_instance_mask = use_instance_mask - self.instance_mask_format = instance_mask_format - self.use_keypoint = use_keypoint - self.keypoint_hflip_indices = keypoint_hflip_indices - self.proposal_topk = precomputed_proposal_topk - self.recompute_boxes = recompute_boxes - # fmt: on - logger = logging.getLogger(__name__) - mode = "training" if is_train else "inference" - logger.info(f"[DatasetMapper] Augmentations used in {mode}: {augmentations}") - - @classmethod - def from_config(cls, cfg, is_train: bool = True): - augs = utils.build_augmentation(cfg, is_train) - if cfg.INPUT.CROP.ENABLED and is_train: - augs.insert(0, T.RandomCrop(cfg.INPUT.CROP.TYPE, cfg.INPUT.CROP.SIZE)) - recompute_boxes = cfg.MODEL.MASK_ON - else: - recompute_boxes = False - - ret = { - "is_train": is_train, - "augmentations": augs, - "image_format": cfg.INPUT.FORMAT, - "use_instance_mask": cfg.MODEL.MASK_ON, - "instance_mask_format": cfg.INPUT.MASK_FORMAT, - "use_keypoint": cfg.MODEL.KEYPOINT_ON, - "recompute_boxes": recompute_boxes, - } - - if cfg.MODEL.KEYPOINT_ON: - ret["keypoint_hflip_indices"] = utils.create_keypoint_hflip_indices(cfg.DATASETS.TRAIN) - - if cfg.MODEL.LOAD_PROPOSALS: - ret["precomputed_proposal_topk"] = ( - cfg.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TRAIN - if is_train - else cfg.DATASETS.PRECOMPUTED_PROPOSAL_TOPK_TEST - ) - return ret - - def _transform_annotations(self, dataset_dict, transforms, image_shape): - # USER: Modify this if you want to keep them for some reason. 
- for anno in dataset_dict["annotations"]: - if not self.use_instance_mask: - anno.pop("segmentation", None) - if not self.use_keypoint: - anno.pop("keypoints", None) - - # USER: Implement additional transformations if you have other types of data - annos = [ - utils.transform_instance_annotations( - obj, transforms, image_shape, keypoint_hflip_indices=self.keypoint_hflip_indices - ) - for obj in dataset_dict.pop("annotations") - if obj.get("iscrowd", 0) == 0 - ] - instances = utils.annotations_to_instances( - annos, image_shape, mask_format=self.instance_mask_format - ) - - # After transforms such as cropping are applied, the bounding box may no longer - # tightly bound the object. As an example, imagine a triangle object - # [(0,0), (2,0), (0,2)] cropped by a box [(1,0),(2,2)] (XYXY format). The tight - # bounding box of the cropped triangle should be [(1,0),(2,1)], which is not equal to - # the intersection of original bounding box and the cropping box. - if self.recompute_boxes: - instances.gt_boxes = instances.gt_masks.get_bounding_boxes() - dataset_dict["instances"] = utils.filter_empty_instances(instances) - - def __call__(self, dataset_dict): - """ - Args: - dataset_dict (dict): Metadata of one image, in Detectron2 Dataset format. - - Returns: - dict: a format that builtin models in detectron2 accept - """ - dataset_dict = copy.deepcopy(dataset_dict) # it will be modified by code below - # USER: Write your own image loading if it's not from a file - image = utils.read_image(dataset_dict["file_name"], format=self.image_format) - utils.check_image_size(dataset_dict, image) - - # USER: Remove if you don't do semantic/panoptic segmentation. - if "sem_seg_file_name" in dataset_dict: - sem_seg_gt = utils.read_image(dataset_dict.pop("sem_seg_file_name"), "L").squeeze(2) - else: - sem_seg_gt = None - - aug_input = T.AugInput(image, sem_seg=sem_seg_gt) - transforms = self.augmentations(aug_input) - image, sem_seg_gt = aug_input.image, aug_input.sem_seg - - image_shape = image.shape[:2] # h, w - # Pytorch's dataloader is efficient on torch.Tensor due to shared-memory, - # but not efficient on large generic data structures due to the use of pickle & mp.Queue. - # Therefore it's important to use torch.Tensor. - dataset_dict["image"] = torch.as_tensor(np.ascontiguousarray(image.transpose(2, 0, 1))) - if sem_seg_gt is not None: - dataset_dict["sem_seg"] = torch.as_tensor(sem_seg_gt.astype("long")) - - # USER: Remove if you don't use pre-computed proposals. - # Most users would not need this feature. - if self.proposal_topk is not None: - utils.transform_proposals( - dataset_dict, image_shape, transforms, proposal_topk=self.proposal_topk - ) - - if not self.is_train: - # USER: Modify this if you want to keep them for some reason. - dataset_dict.pop("annotations", None) - dataset_dict.pop("sem_seg_file_name", None) - return dataset_dict - - if "annotations" in dataset_dict: - self._transform_annotations(dataset_dict, transforms, image_shape) - - return dataset_dict diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md deleted file mode 100755 index 9fb3e4f7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/README.md +++ /dev/null @@ -1,9 +0,0 @@ - - -### Common Datasets - -The dataset implemented here do not need to load the data into the final format. 
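The README fragment above states the division of labor: dataset loaders return lightweight dicts, and it is the mapper (the `DatasetMapper` deleted just before it) that actually reads pixels. A minimal custom mapper in that spirit, following the three documented steps; the resize parameters are arbitrary and the input dict is assumed to carry a valid `"file_name"`:

```python
import copy
import numpy as np
import torch
from detectron2.data import detection_utils as utils
from detectron2.data import transforms as T

def minimal_mapper(dataset_dict):
    dataset_dict = copy.deepcopy(dataset_dict)  # keep the loader's dict untouched
    image = utils.read_image(dataset_dict["file_name"], format="BGR")   # 1. read
    augs = T.AugmentationList([T.ResizeShortestEdge(short_edge_length=800, max_size=1333)])
    aug_input = T.AugInput(image)
    augs(aug_input)                             # 2. geometric transforms, in place
    image = aug_input.image
    dataset_dict["image"] = torch.as_tensor(    # 3. tensorize, HWC -> CHW
        np.ascontiguousarray(image.transpose(2, 0, 1))
    )
    return dataset_dict
```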
-It should provide the minimal data structure needed to use the dataset, so it can be very efficient. - -For example, for an image dataset, just provide the file names and labels, but don't read the images. -Let the downstream decide how to read. diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py deleted file mode 100755 index a44bedc1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/__init__.py +++ /dev/null @@ -1,9 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .coco import load_coco_json, load_sem_seg, register_coco_instances, convert_to_coco_json -from .coco_panoptic import register_coco_panoptic, register_coco_panoptic_separated -from .lvis import load_lvis_json, register_lvis_instances, get_lvis_instances_meta -from .pascal_voc import load_voc_instances, register_pascal_voc -from . import builtin as _builtin # ensure the builtin datasets are registered - - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py deleted file mode 100755 index c3a68aa8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin.py +++ /dev/null @@ -1,259 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - - -""" -This file registers pre-defined datasets at hard-coded paths, and their metadata. - -We hard-code metadata for common datasets. This will enable: -1. Consistency check when loading the datasets -2. Use models on these standard datasets directly and run demos, - without having to download the dataset annotations - -We hard-code some paths to the dataset that's assumed to -exist in "./datasets/". - -Users SHOULD NOT use this file to create new dataset / metadata for new dataset. -To add new dataset, refer to the tutorial "docs/DATASETS.md". 
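Per the warning in this docstring, new datasets belong in user code, not in `builtin.py`. The usual pattern uses the `register_coco_instances` helper re-exported by the `__init__.py` above; the name and paths below are hypothetical:

```python
from detectron2.data.datasets import register_coco_instances

register_coco_instances(
    "my_coco_train",              # hypothetical dataset name
    {},                           # extra metadata; may be empty
    "path/to/annotations.json",   # hypothetical COCO-format json
    "path/to/images",             # hypothetical image root
)
```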
-""" - -import os - -from detectron2.data import DatasetCatalog, MetadataCatalog - -from .builtin_meta import ADE20K_SEM_SEG_CATEGORIES, _get_builtin_metadata -from .cityscapes import load_cityscapes_instances, load_cityscapes_semantic -from .cityscapes_panoptic import register_all_cityscapes_panoptic -from .coco import load_sem_seg, register_coco_instances -from .coco_panoptic import register_coco_panoptic, register_coco_panoptic_separated -from .lvis import get_lvis_instances_meta, register_lvis_instances -from .pascal_voc import register_pascal_voc - -# ==== Predefined datasets and splits for COCO ========== - -_PREDEFINED_SPLITS_COCO = {} -_PREDEFINED_SPLITS_COCO["coco"] = { - "coco_2014_train": ("coco/train2014", "coco/annotations/instances_train2014.json"), - "coco_2014_val": ("coco/val2014", "coco/annotations/instances_val2014.json"), - "coco_2014_minival": ("coco/val2014", "coco/annotations/instances_minival2014.json"), - "coco_2014_valminusminival": ( - "coco/val2014", - "coco/annotations/instances_valminusminival2014.json", - ), - "coco_2017_train": ("coco/train2017", "coco/annotations/instances_train2017.json"), - "coco_2017_val": ("coco/val2017", "coco/annotations/instances_val2017.json"), - "coco_2017_test": ("coco/test2017", "coco/annotations/image_info_test2017.json"), - "coco_2017_test-dev": ("coco/test2017", "coco/annotations/image_info_test-dev2017.json"), - "coco_2017_val_100": ("coco/val2017", "coco/annotations/instances_val2017_100.json"), -} - -_PREDEFINED_SPLITS_COCO["coco_person"] = { - "keypoints_coco_2014_train": ( - "coco/train2014", - "coco/annotations/person_keypoints_train2014.json", - ), - "keypoints_coco_2014_val": ("coco/val2014", "coco/annotations/person_keypoints_val2014.json"), - "keypoints_coco_2014_minival": ( - "coco/val2014", - "coco/annotations/person_keypoints_minival2014.json", - ), - "keypoints_coco_2014_valminusminival": ( - "coco/val2014", - "coco/annotations/person_keypoints_valminusminival2014.json", - ), - "keypoints_coco_2017_train": ( - "coco/train2017", - "coco/annotations/person_keypoints_train2017.json", - ), - "keypoints_coco_2017_val": ("coco/val2017", "coco/annotations/person_keypoints_val2017.json"), - "keypoints_coco_2017_val_100": ( - "coco/val2017", - "coco/annotations/person_keypoints_val2017_100.json", - ), -} - - -_PREDEFINED_SPLITS_COCO_PANOPTIC = { - "coco_2017_train_panoptic": ( - # This is the original panoptic annotation directory - "coco/panoptic_train2017", - "coco/annotations/panoptic_train2017.json", - # This directory contains semantic annotations that are - # converted from panoptic annotations. - # It is used by PanopticFPN. - # You can use the script at detectron2/datasets/prepare_panoptic_fpn.py - # to create these directories. - "coco/panoptic_stuff_train2017", - ), - "coco_2017_val_panoptic": ( - "coco/panoptic_val2017", - "coco/annotations/panoptic_val2017.json", - "coco/panoptic_stuff_val2017", - ), - "coco_2017_val_100_panoptic": ( - "coco/panoptic_val2017_100", - "coco/annotations/panoptic_val2017_100.json", - "coco/panoptic_stuff_val2017_100", - ), -} - - -def register_all_coco(root): - for dataset_name, splits_per_dataset in _PREDEFINED_SPLITS_COCO.items(): - for key, (image_root, json_file) in splits_per_dataset.items(): - # Assume pre-defined datasets live in `./datasets`. 
- register_coco_instances( - key, - _get_builtin_metadata(dataset_name), - os.path.join(root, json_file) if "://" not in json_file else json_file, - os.path.join(root, image_root), - ) - - for ( - prefix, - (panoptic_root, panoptic_json, semantic_root), - ) in _PREDEFINED_SPLITS_COCO_PANOPTIC.items(): - prefix_instances = prefix[: -len("_panoptic")] - instances_meta = MetadataCatalog.get(prefix_instances) - image_root, instances_json = instances_meta.image_root, instances_meta.json_file - # The "separated" version of COCO panoptic segmentation dataset, - # e.g. used by Panoptic FPN - register_coco_panoptic_separated( - prefix, - _get_builtin_metadata("coco_panoptic_separated"), - image_root, - os.path.join(root, panoptic_root), - os.path.join(root, panoptic_json), - os.path.join(root, semantic_root), - instances_json, - ) - # The "standard" version of COCO panoptic segmentation dataset, - # e.g. used by Panoptic-DeepLab - register_coco_panoptic( - prefix, - _get_builtin_metadata("coco_panoptic_standard"), - image_root, - os.path.join(root, panoptic_root), - os.path.join(root, panoptic_json), - instances_json, - ) - - -# ==== Predefined datasets and splits for LVIS ========== - - -_PREDEFINED_SPLITS_LVIS = { - "lvis_v1": { - "lvis_v1_train": ("coco/", "lvis/lvis_v1_train.json"), - "lvis_v1_val": ("coco/", "lvis/lvis_v1_val.json"), - "lvis_v1_test_dev": ("coco/", "lvis/lvis_v1_image_info_test_dev.json"), - "lvis_v1_test_challenge": ("coco/", "lvis/lvis_v1_image_info_test_challenge.json"), - }, - "lvis_v0.5": { - "lvis_v0.5_train": ("coco/", "lvis/lvis_v0.5_train.json"), - "lvis_v0.5_val": ("coco/", "lvis/lvis_v0.5_val.json"), - "lvis_v0.5_val_rand_100": ("coco/", "lvis/lvis_v0.5_val_rand_100.json"), - "lvis_v0.5_test": ("coco/", "lvis/lvis_v0.5_image_info_test.json"), - }, - "lvis_v0.5_cocofied": { - "lvis_v0.5_train_cocofied": ("coco/", "lvis/lvis_v0.5_train_cocofied.json"), - "lvis_v0.5_val_cocofied": ("coco/", "lvis/lvis_v0.5_val_cocofied.json"), - }, -} - - -def register_all_lvis(root): - for dataset_name, splits_per_dataset in _PREDEFINED_SPLITS_LVIS.items(): - for key, (image_root, json_file) in splits_per_dataset.items(): - register_lvis_instances( - key, - get_lvis_instances_meta(dataset_name), - os.path.join(root, json_file) if "://" not in json_file else json_file, - os.path.join(root, image_root), - ) - - -# ==== Predefined splits for raw cityscapes images =========== -_RAW_CITYSCAPES_SPLITS = { - "cityscapes_fine_{task}_train": ("cityscapes/leftImg8bit/train/", "cityscapes/gtFine/train/"), - "cityscapes_fine_{task}_val": ("cityscapes/leftImg8bit/val/", "cityscapes/gtFine/val/"), - "cityscapes_fine_{task}_test": ("cityscapes/leftImg8bit/test/", "cityscapes/gtFine/test/"), -} - - -def register_all_cityscapes(root): - for key, (image_dir, gt_dir) in _RAW_CITYSCAPES_SPLITS.items(): - meta = _get_builtin_metadata("cityscapes") - image_dir = os.path.join(root, image_dir) - gt_dir = os.path.join(root, gt_dir) - - inst_key = key.format(task="instance_seg") - DatasetCatalog.register( - inst_key, - lambda x=image_dir, y=gt_dir: load_cityscapes_instances( - x, y, from_json=True, to_polygons=True - ), - ) - MetadataCatalog.get(inst_key).set( - image_dir=image_dir, gt_dir=gt_dir, evaluator_type="cityscapes_instance", **meta - ) - - sem_key = key.format(task="sem_seg") - DatasetCatalog.register( - sem_key, lambda x=image_dir, y=gt_dir: load_cityscapes_semantic(x, y) - ) - MetadataCatalog.get(sem_key).set( - image_dir=image_dir, - gt_dir=gt_dir, - evaluator_type="cityscapes_sem_seg", - 
ignore_label=255, - **meta, - ) - - -# ==== Predefined splits for PASCAL VOC =========== -def register_all_pascal_voc(root): - SPLITS = [ - ("voc_2007_trainval", "VOC2007", "trainval"), - ("voc_2007_train", "VOC2007", "train"), - ("voc_2007_val", "VOC2007", "val"), - ("voc_2007_test", "VOC2007", "test"), - ("voc_2012_trainval", "VOC2012", "trainval"), - ("voc_2012_train", "VOC2012", "train"), - ("voc_2012_val", "VOC2012", "val"), - ] - for name, dirname, split in SPLITS: - year = 2007 if "2007" in name else 2012 - register_pascal_voc(name, os.path.join(root, dirname), split, year) - MetadataCatalog.get(name).evaluator_type = "pascal_voc" - - -def register_all_ade20k(root): - root = os.path.join(root, "ADEChallengeData2016") - for name, dirname in [("train", "training"), ("val", "validation")]: - image_dir = os.path.join(root, "images", dirname) - gt_dir = os.path.join(root, "annotations_detectron2", dirname) - name = f"ade20k_sem_seg_{name}" - DatasetCatalog.register( - name, lambda x=image_dir, y=gt_dir: load_sem_seg(y, x, gt_ext="png", image_ext="jpg") - ) - MetadataCatalog.get(name).set( - stuff_classes=ADE20K_SEM_SEG_CATEGORIES[:], - image_root=image_dir, - sem_seg_root=gt_dir, - evaluator_type="sem_seg", - ignore_label=255, - ) - - -# True for open source; -# Internally at fb, we register them elsewhere -if __name__.endswith(".builtin"): - # Assume pre-defined datasets live in `./datasets`. - _root = os.path.expanduser(os.getenv("DETECTRON2_DATASETS", "datasets")) - register_all_coco(_root) - register_all_lvis(_root) - register_all_cityscapes(_root) - register_all_cityscapes_panoptic(_root) - register_all_pascal_voc(_root) - register_all_ade20k(_root) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py deleted file mode 100755 index 63c7a1a3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/builtin_meta.py +++ /dev/null @@ -1,350 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -Note: -For your custom dataset, there is no need to hard-code metadata anywhere in the code. -For example, for COCO-format dataset, metadata will be obtained automatically -when calling `load_coco_json`. For other dataset, metadata may also be obtained in other ways -during loading. - -However, we hard-coded metadata for a few common dataset here. -The only goal is to allow users who don't have these dataset to use pre-trained models. -Users don't have to download a COCO json (which contains metadata), in order to visualize a -COCO model (with correct class names and colors). 
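Because the file registers everything at import time (the `__name__.endswith(".builtin")` guard shown above), the dataset root has to be chosen via the environment before the first `detectron2.data` import. A sketch, with a hypothetical path:

```python
import os

# Must be set before the first `import detectron2.data`, since builtin.py
# runs its register_all_*() calls at import time.
os.environ["DETECTRON2_DATASETS"] = "/data/d2_datasets"  # hypothetical root

from detectron2.data import DatasetCatalog
print("coco_2017_val" in DatasetCatalog.list())  # True once the builtins registered
```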
-""" - - -# All coco categories, together with their nice-looking visualization colors -# It's from https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json -COCO_CATEGORIES = [ - {"color": [220, 20, 60], "isthing": 1, "id": 1, "name": "person"}, - {"color": [119, 11, 32], "isthing": 1, "id": 2, "name": "bicycle"}, - {"color": [0, 0, 142], "isthing": 1, "id": 3, "name": "car"}, - {"color": [0, 0, 230], "isthing": 1, "id": 4, "name": "motorcycle"}, - {"color": [106, 0, 228], "isthing": 1, "id": 5, "name": "airplane"}, - {"color": [0, 60, 100], "isthing": 1, "id": 6, "name": "bus"}, - {"color": [0, 80, 100], "isthing": 1, "id": 7, "name": "train"}, - {"color": [0, 0, 70], "isthing": 1, "id": 8, "name": "truck"}, - {"color": [0, 0, 192], "isthing": 1, "id": 9, "name": "boat"}, - {"color": [250, 170, 30], "isthing": 1, "id": 10, "name": "traffic light"}, - {"color": [100, 170, 30], "isthing": 1, "id": 11, "name": "fire hydrant"}, - {"color": [220, 220, 0], "isthing": 1, "id": 13, "name": "stop sign"}, - {"color": [175, 116, 175], "isthing": 1, "id": 14, "name": "parking meter"}, - {"color": [250, 0, 30], "isthing": 1, "id": 15, "name": "bench"}, - {"color": [165, 42, 42], "isthing": 1, "id": 16, "name": "bird"}, - {"color": [255, 77, 255], "isthing": 1, "id": 17, "name": "cat"}, - {"color": [0, 226, 252], "isthing": 1, "id": 18, "name": "dog"}, - {"color": [182, 182, 255], "isthing": 1, "id": 19, "name": "horse"}, - {"color": [0, 82, 0], "isthing": 1, "id": 20, "name": "sheep"}, - {"color": [120, 166, 157], "isthing": 1, "id": 21, "name": "cow"}, - {"color": [110, 76, 0], "isthing": 1, "id": 22, "name": "elephant"}, - {"color": [174, 57, 255], "isthing": 1, "id": 23, "name": "bear"}, - {"color": [199, 100, 0], "isthing": 1, "id": 24, "name": "zebra"}, - {"color": [72, 0, 118], "isthing": 1, "id": 25, "name": "giraffe"}, - {"color": [255, 179, 240], "isthing": 1, "id": 27, "name": "backpack"}, - {"color": [0, 125, 92], "isthing": 1, "id": 28, "name": "umbrella"}, - {"color": [209, 0, 151], "isthing": 1, "id": 31, "name": "handbag"}, - {"color": [188, 208, 182], "isthing": 1, "id": 32, "name": "tie"}, - {"color": [0, 220, 176], "isthing": 1, "id": 33, "name": "suitcase"}, - {"color": [255, 99, 164], "isthing": 1, "id": 34, "name": "frisbee"}, - {"color": [92, 0, 73], "isthing": 1, "id": 35, "name": "skis"}, - {"color": [133, 129, 255], "isthing": 1, "id": 36, "name": "snowboard"}, - {"color": [78, 180, 255], "isthing": 1, "id": 37, "name": "sports ball"}, - {"color": [0, 228, 0], "isthing": 1, "id": 38, "name": "kite"}, - {"color": [174, 255, 243], "isthing": 1, "id": 39, "name": "baseball bat"}, - {"color": [45, 89, 255], "isthing": 1, "id": 40, "name": "baseball glove"}, - {"color": [134, 134, 103], "isthing": 1, "id": 41, "name": "skateboard"}, - {"color": [145, 148, 174], "isthing": 1, "id": 42, "name": "surfboard"}, - {"color": [255, 208, 186], "isthing": 1, "id": 43, "name": "tennis racket"}, - {"color": [197, 226, 255], "isthing": 1, "id": 44, "name": "bottle"}, - {"color": [171, 134, 1], "isthing": 1, "id": 46, "name": "wine glass"}, - {"color": [109, 63, 54], "isthing": 1, "id": 47, "name": "cup"}, - {"color": [207, 138, 255], "isthing": 1, "id": 48, "name": "fork"}, - {"color": [151, 0, 95], "isthing": 1, "id": 49, "name": "knife"}, - {"color": [9, 80, 61], "isthing": 1, "id": 50, "name": "spoon"}, - {"color": [84, 105, 51], "isthing": 1, "id": 51, "name": "bowl"}, - {"color": [74, 65, 105], "isthing": 1, "id": 52, "name": "banana"}, - {"color": [166, 196, 
102], "isthing": 1, "id": 53, "name": "apple"}, - {"color": [208, 195, 210], "isthing": 1, "id": 54, "name": "sandwich"}, - {"color": [255, 109, 65], "isthing": 1, "id": 55, "name": "orange"}, - {"color": [0, 143, 149], "isthing": 1, "id": 56, "name": "broccoli"}, - {"color": [179, 0, 194], "isthing": 1, "id": 57, "name": "carrot"}, - {"color": [209, 99, 106], "isthing": 1, "id": 58, "name": "hot dog"}, - {"color": [5, 121, 0], "isthing": 1, "id": 59, "name": "pizza"}, - {"color": [227, 255, 205], "isthing": 1, "id": 60, "name": "donut"}, - {"color": [147, 186, 208], "isthing": 1, "id": 61, "name": "cake"}, - {"color": [153, 69, 1], "isthing": 1, "id": 62, "name": "chair"}, - {"color": [3, 95, 161], "isthing": 1, "id": 63, "name": "couch"}, - {"color": [163, 255, 0], "isthing": 1, "id": 64, "name": "potted plant"}, - {"color": [119, 0, 170], "isthing": 1, "id": 65, "name": "bed"}, - {"color": [0, 182, 199], "isthing": 1, "id": 67, "name": "dining table"}, - {"color": [0, 165, 120], "isthing": 1, "id": 70, "name": "toilet"}, - {"color": [183, 130, 88], "isthing": 1, "id": 72, "name": "tv"}, - {"color": [95, 32, 0], "isthing": 1, "id": 73, "name": "laptop"}, - {"color": [130, 114, 135], "isthing": 1, "id": 74, "name": "mouse"}, - {"color": [110, 129, 133], "isthing": 1, "id": 75, "name": "remote"}, - {"color": [166, 74, 118], "isthing": 1, "id": 76, "name": "keyboard"}, - {"color": [219, 142, 185], "isthing": 1, "id": 77, "name": "cell phone"}, - {"color": [79, 210, 114], "isthing": 1, "id": 78, "name": "microwave"}, - {"color": [178, 90, 62], "isthing": 1, "id": 79, "name": "oven"}, - {"color": [65, 70, 15], "isthing": 1, "id": 80, "name": "toaster"}, - {"color": [127, 167, 115], "isthing": 1, "id": 81, "name": "sink"}, - {"color": [59, 105, 106], "isthing": 1, "id": 82, "name": "refrigerator"}, - {"color": [142, 108, 45], "isthing": 1, "id": 84, "name": "book"}, - {"color": [196, 172, 0], "isthing": 1, "id": 85, "name": "clock"}, - {"color": [95, 54, 80], "isthing": 1, "id": 86, "name": "vase"}, - {"color": [128, 76, 255], "isthing": 1, "id": 87, "name": "scissors"}, - {"color": [201, 57, 1], "isthing": 1, "id": 88, "name": "teddy bear"}, - {"color": [246, 0, 122], "isthing": 1, "id": 89, "name": "hair drier"}, - {"color": [191, 162, 208], "isthing": 1, "id": 90, "name": "toothbrush"}, - {"color": [255, 255, 128], "isthing": 0, "id": 92, "name": "banner"}, - {"color": [147, 211, 203], "isthing": 0, "id": 93, "name": "blanket"}, - {"color": [150, 100, 100], "isthing": 0, "id": 95, "name": "bridge"}, - {"color": [168, 171, 172], "isthing": 0, "id": 100, "name": "cardboard"}, - {"color": [146, 112, 198], "isthing": 0, "id": 107, "name": "counter"}, - {"color": [210, 170, 100], "isthing": 0, "id": 109, "name": "curtain"}, - {"color": [92, 136, 89], "isthing": 0, "id": 112, "name": "door-stuff"}, - {"color": [218, 88, 184], "isthing": 0, "id": 118, "name": "floor-wood"}, - {"color": [241, 129, 0], "isthing": 0, "id": 119, "name": "flower"}, - {"color": [217, 17, 255], "isthing": 0, "id": 122, "name": "fruit"}, - {"color": [124, 74, 181], "isthing": 0, "id": 125, "name": "gravel"}, - {"color": [70, 70, 70], "isthing": 0, "id": 128, "name": "house"}, - {"color": [255, 228, 255], "isthing": 0, "id": 130, "name": "light"}, - {"color": [154, 208, 0], "isthing": 0, "id": 133, "name": "mirror-stuff"}, - {"color": [193, 0, 92], "isthing": 0, "id": 138, "name": "net"}, - {"color": [76, 91, 113], "isthing": 0, "id": 141, "name": "pillow"}, - {"color": [255, 180, 195], "isthing": 0, "id": 144, "name": 
"platform"}, - {"color": [106, 154, 176], "isthing": 0, "id": 145, "name": "playingfield"}, - {"color": [230, 150, 140], "isthing": 0, "id": 147, "name": "railroad"}, - {"color": [60, 143, 255], "isthing": 0, "id": 148, "name": "river"}, - {"color": [128, 64, 128], "isthing": 0, "id": 149, "name": "road"}, - {"color": [92, 82, 55], "isthing": 0, "id": 151, "name": "roof"}, - {"color": [254, 212, 124], "isthing": 0, "id": 154, "name": "sand"}, - {"color": [73, 77, 174], "isthing": 0, "id": 155, "name": "sea"}, - {"color": [255, 160, 98], "isthing": 0, "id": 156, "name": "shelf"}, - {"color": [255, 255, 255], "isthing": 0, "id": 159, "name": "snow"}, - {"color": [104, 84, 109], "isthing": 0, "id": 161, "name": "stairs"}, - {"color": [169, 164, 131], "isthing": 0, "id": 166, "name": "tent"}, - {"color": [225, 199, 255], "isthing": 0, "id": 168, "name": "towel"}, - {"color": [137, 54, 74], "isthing": 0, "id": 171, "name": "wall-brick"}, - {"color": [135, 158, 223], "isthing": 0, "id": 175, "name": "wall-stone"}, - {"color": [7, 246, 231], "isthing": 0, "id": 176, "name": "wall-tile"}, - {"color": [107, 255, 200], "isthing": 0, "id": 177, "name": "wall-wood"}, - {"color": [58, 41, 149], "isthing": 0, "id": 178, "name": "water-other"}, - {"color": [183, 121, 142], "isthing": 0, "id": 180, "name": "window-blind"}, - {"color": [255, 73, 97], "isthing": 0, "id": 181, "name": "window-other"}, - {"color": [107, 142, 35], "isthing": 0, "id": 184, "name": "tree-merged"}, - {"color": [190, 153, 153], "isthing": 0, "id": 185, "name": "fence-merged"}, - {"color": [146, 139, 141], "isthing": 0, "id": 186, "name": "ceiling-merged"}, - {"color": [70, 130, 180], "isthing": 0, "id": 187, "name": "sky-other-merged"}, - {"color": [134, 199, 156], "isthing": 0, "id": 188, "name": "cabinet-merged"}, - {"color": [209, 226, 140], "isthing": 0, "id": 189, "name": "table-merged"}, - {"color": [96, 36, 108], "isthing": 0, "id": 190, "name": "floor-other-merged"}, - {"color": [96, 96, 96], "isthing": 0, "id": 191, "name": "pavement-merged"}, - {"color": [64, 170, 64], "isthing": 0, "id": 192, "name": "mountain-merged"}, - {"color": [152, 251, 152], "isthing": 0, "id": 193, "name": "grass-merged"}, - {"color": [208, 229, 228], "isthing": 0, "id": 194, "name": "dirt-merged"}, - {"color": [206, 186, 171], "isthing": 0, "id": 195, "name": "paper-merged"}, - {"color": [152, 161, 64], "isthing": 0, "id": 196, "name": "food-other-merged"}, - {"color": [116, 112, 0], "isthing": 0, "id": 197, "name": "building-other-merged"}, - {"color": [0, 114, 143], "isthing": 0, "id": 198, "name": "rock-merged"}, - {"color": [102, 102, 156], "isthing": 0, "id": 199, "name": "wall-other-merged"}, - {"color": [250, 141, 255], "isthing": 0, "id": 200, "name": "rug-merged"}, -] - -# fmt: off -COCO_PERSON_KEYPOINT_NAMES = ( - "nose", - "left_eye", "right_eye", - "left_ear", "right_ear", - "left_shoulder", "right_shoulder", - "left_elbow", "right_elbow", - "left_wrist", "right_wrist", - "left_hip", "right_hip", - "left_knee", "right_knee", - "left_ankle", "right_ankle", -) -# fmt: on - -# Pairs of keypoints that should be exchanged under horizontal flipping -COCO_PERSON_KEYPOINT_FLIP_MAP = ( - ("left_eye", "right_eye"), - ("left_ear", "right_ear"), - ("left_shoulder", "right_shoulder"), - ("left_elbow", "right_elbow"), - ("left_wrist", "right_wrist"), - ("left_hip", "right_hip"), - ("left_knee", "right_knee"), - ("left_ankle", "right_ankle"), -) - -# rules for pairs of keypoints to draw a line between, and the line color to use. 
-KEYPOINT_CONNECTION_RULES = [ - # face - ("left_ear", "left_eye", (102, 204, 255)), - ("right_ear", "right_eye", (51, 153, 255)), - ("left_eye", "nose", (102, 0, 204)), - ("nose", "right_eye", (51, 102, 255)), - # upper-body - ("left_shoulder", "right_shoulder", (255, 128, 0)), - ("left_shoulder", "left_elbow", (153, 255, 204)), - ("right_shoulder", "right_elbow", (128, 229, 255)), - ("left_elbow", "left_wrist", (153, 255, 153)), - ("right_elbow", "right_wrist", (102, 255, 224)), - # lower-body - ("left_hip", "right_hip", (255, 102, 0)), - ("left_hip", "left_knee", (255, 255, 77)), - ("right_hip", "right_knee", (153, 255, 204)), - ("left_knee", "left_ankle", (191, 255, 128)), - ("right_knee", "right_ankle", (255, 195, 77)), -] - -# All Cityscapes categories, together with their nice-looking visualization colors -# It's from https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/helpers/labels.py # noqa -CITYSCAPES_CATEGORIES = [ - {"color": (128, 64, 128), "isthing": 0, "id": 7, "trainId": 0, "name": "road"}, - {"color": (244, 35, 232), "isthing": 0, "id": 8, "trainId": 1, "name": "sidewalk"}, - {"color": (70, 70, 70), "isthing": 0, "id": 11, "trainId": 2, "name": "building"}, - {"color": (102, 102, 156), "isthing": 0, "id": 12, "trainId": 3, "name": "wall"}, - {"color": (190, 153, 153), "isthing": 0, "id": 13, "trainId": 4, "name": "fence"}, - {"color": (153, 153, 153), "isthing": 0, "id": 17, "trainId": 5, "name": "pole"}, - {"color": (250, 170, 30), "isthing": 0, "id": 19, "trainId": 6, "name": "traffic light"}, - {"color": (220, 220, 0), "isthing": 0, "id": 20, "trainId": 7, "name": "traffic sign"}, - {"color": (107, 142, 35), "isthing": 0, "id": 21, "trainId": 8, "name": "vegetation"}, - {"color": (152, 251, 152), "isthing": 0, "id": 22, "trainId": 9, "name": "terrain"}, - {"color": (70, 130, 180), "isthing": 0, "id": 23, "trainId": 10, "name": "sky"}, - {"color": (220, 20, 60), "isthing": 1, "id": 24, "trainId": 11, "name": "person"}, - {"color": (255, 0, 0), "isthing": 1, "id": 25, "trainId": 12, "name": "rider"}, - {"color": (0, 0, 142), "isthing": 1, "id": 26, "trainId": 13, "name": "car"}, - {"color": (0, 0, 70), "isthing": 1, "id": 27, "trainId": 14, "name": "truck"}, - {"color": (0, 60, 100), "isthing": 1, "id": 28, "trainId": 15, "name": "bus"}, - {"color": (0, 80, 100), "isthing": 1, "id": 31, "trainId": 16, "name": "train"}, - {"color": (0, 0, 230), "isthing": 1, "id": 32, "trainId": 17, "name": "motorcycle"}, - {"color": (119, 11, 32), "isthing": 1, "id": 33, "trainId": 18, "name": "bicycle"}, -] - -# fmt: off -ADE20K_SEM_SEG_CATEGORIES = [ - "wall", "building", "sky", "floor", "tree", "ceiling", "road, route", "bed", "window ", "grass", "cabinet", "sidewalk, pavement", "person", "earth, ground", "door", "table", "mountain, mount", "plant", "curtain", "chair", "car", "water", "painting, picture", "sofa", "shelf", "house", "sea", "mirror", "rug", "field", "armchair", "seat", "fence", "desk", "rock, stone", "wardrobe, closet, press", "lamp", "tub", "rail", "cushion", "base, pedestal, stand", "box", "column, pillar", "signboard, sign", "chest of drawers, chest, bureau, dresser", "counter", "sand", "sink", "skyscraper", "fireplace", "refrigerator, icebox", "grandstand, covered stand", "path", "stairs", "runway", "case, display case, showcase, vitrine", "pool table, billiard table, snooker table", "pillow", "screen door, screen", "stairway, staircase", "river", "bridge, span", "bookcase", "blind, screen", "coffee table", "toilet, can, commode, crapper, pot, 
potty, stool, throne", "flower", "book", "hill", "bench", "countertop", "stove", "palm, palm tree", "kitchen island", "computer", "swivel chair", "boat", "bar", "arcade machine", "hovel, hut, hutch, shack, shanty", "bus", "towel", "light", "truck", "tower", "chandelier", "awning, sunshade, sunblind", "street lamp", "booth", "tv", "plane", "dirt track", "clothes", "pole", "land, ground, soil", "bannister, banister, balustrade, balusters, handrail", "escalator, moving staircase, moving stairway", "ottoman, pouf, pouffe, puff, hassock", "bottle", "buffet, counter, sideboard", "poster, posting, placard, notice, bill, card", "stage", "van", "ship", "fountain", "conveyer belt, conveyor belt, conveyer, conveyor, transporter", "canopy", "washer, automatic washer, washing machine", "plaything, toy", "pool", "stool", "barrel, cask", "basket, handbasket", "falls", "tent", "bag", "minibike, motorbike", "cradle", "oven", "ball", "food, solid food", "step, stair", "tank, storage tank", "trade name", "microwave", "pot", "animal", "bicycle", "lake", "dishwasher", "screen", "blanket, cover", "sculpture", "hood, exhaust hood", "sconce", "vase", "traffic light", "tray", "trash can", "fan", "pier", "crt screen", "plate", "monitor", "bulletin board", "shower", "radiator", "glass, drinking glass", "clock", "flag", # noqa -] -# After processed by `prepare_ade20k_sem_seg.py`, id 255 means ignore -# fmt: on - - -def _get_coco_instances_meta(): - thing_ids = [k["id"] for k in COCO_CATEGORIES if k["isthing"] == 1] - thing_colors = [k["color"] for k in COCO_CATEGORIES if k["isthing"] == 1] - assert len(thing_ids) == 80, len(thing_ids) - # Mapping from the incontiguous COCO category id to an id in [0, 79] - thing_dataset_id_to_contiguous_id = {k: i for i, k in enumerate(thing_ids)} - thing_classes = [k["name"] for k in COCO_CATEGORIES if k["isthing"] == 1] - ret = { - "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id, - "thing_classes": thing_classes, - "thing_colors": thing_colors, - } - return ret - - -def _get_coco_panoptic_separated_meta(): - """ - Returns metadata for "separated" version of the panoptic segmentation dataset. - """ - stuff_ids = [k["id"] for k in COCO_CATEGORIES if k["isthing"] == 0] - assert len(stuff_ids) == 53, len(stuff_ids) - - # For semantic segmentation, this mapping maps from contiguous stuff id - # (in [0, 53], used in models) to ids in the dataset (used for processing results) - # The id 0 is mapped to an extra category "thing". 
- stuff_dataset_id_to_contiguous_id = {k: i + 1 for i, k in enumerate(stuff_ids)} - # When converting COCO panoptic annotations to semantic annotations - # We label the "thing" category to 0 - stuff_dataset_id_to_contiguous_id[0] = 0 - - # 54 names for COCO stuff categories (including "things") - stuff_classes = ["things"] + [ - k["name"].replace("-other", "").replace("-merged", "") - for k in COCO_CATEGORIES - if k["isthing"] == 0 - ] - - # NOTE: I randomly picked a color for things - stuff_colors = [[82, 18, 128]] + [k["color"] for k in COCO_CATEGORIES if k["isthing"] == 0] - ret = { - "stuff_dataset_id_to_contiguous_id": stuff_dataset_id_to_contiguous_id, - "stuff_classes": stuff_classes, - "stuff_colors": stuff_colors, - } - ret.update(_get_coco_instances_meta()) - return ret - - -def _get_builtin_metadata(dataset_name): - if dataset_name == "coco": - return _get_coco_instances_meta() - if dataset_name == "coco_panoptic_separated": - return _get_coco_panoptic_separated_meta() - elif dataset_name == "coco_panoptic_standard": - meta = {} - # The following metadata maps contiguous id from [0, #thing categories + - # #stuff categories) to their names and colors. We have two replicas of the - # same name and color under "thing_*" and "stuff_*" because the current - # visualization function in D2 handles thing and stuff classes differently - # due to some heuristic used in Panoptic FPN. We keep the same naming to - # enable reusing existing visualization functions. - thing_classes = [k["name"] for k in COCO_CATEGORIES] - thing_colors = [k["color"] for k in COCO_CATEGORIES] - stuff_classes = [k["name"] for k in COCO_CATEGORIES] - stuff_colors = [k["color"] for k in COCO_CATEGORIES] - - meta["thing_classes"] = thing_classes - meta["thing_colors"] = thing_colors - meta["stuff_classes"] = stuff_classes - meta["stuff_colors"] = stuff_colors - - # Convert category id for training: - # category id: like semantic segmentation, it is the class id for each - # pixel. Since there are some classes not used in evaluation, the category - # id is not always contiguous and thus we have two sets of category ids: - # - original category id: category id in the original dataset, mainly - # used for evaluation. - # - contiguous category id: [0, #classes), in order to train the linear - # softmax classifier.
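A toy illustration of the two id spaces this comment distinguishes; the ids below are made up, not real COCO category ids:

```python
dataset_ids = [1, 3, 4, 9]  # non-contiguous ids as they appear in the json
to_contiguous = {k: i for i, k in enumerate(dataset_ids)}  # for training
to_dataset = {i: k for k, i in to_contiguous.items()}      # back again, for evaluation
assert to_dataset[to_contiguous[9]] == 9
print(to_contiguous)  # {1: 0, 3: 1, 4: 2, 9: 3}
```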
- thing_dataset_id_to_contiguous_id = {} - stuff_dataset_id_to_contiguous_id = {} - - for i, cat in enumerate(COCO_CATEGORIES): - if cat["isthing"]: - thing_dataset_id_to_contiguous_id[cat["id"]] = i - else: - stuff_dataset_id_to_contiguous_id[cat["id"]] = i - - meta["thing_dataset_id_to_contiguous_id"] = thing_dataset_id_to_contiguous_id - meta["stuff_dataset_id_to_contiguous_id"] = stuff_dataset_id_to_contiguous_id - - return meta - elif dataset_name == "coco_person": - return { - "thing_classes": ["person"], - "keypoint_names": COCO_PERSON_KEYPOINT_NAMES, - "keypoint_flip_map": COCO_PERSON_KEYPOINT_FLIP_MAP, - "keypoint_connection_rules": KEYPOINT_CONNECTION_RULES, - } - elif dataset_name == "cityscapes": - # fmt: off - CITYSCAPES_THING_CLASSES = [ - "person", "rider", "car", "truck", - "bus", "train", "motorcycle", "bicycle", - ] - CITYSCAPES_STUFF_CLASSES = [ - "road", "sidewalk", "building", "wall", "fence", "pole", "traffic light", - "traffic sign", "vegetation", "terrain", "sky", "person", "rider", "car", - "truck", "bus", "train", "motorcycle", "bicycle", - ] - # fmt: on - return { - "thing_classes": CITYSCAPES_THING_CLASSES, - "stuff_classes": CITYSCAPES_STUFF_CLASSES, - } - raise KeyError("No built-in metadata for dataset {}".format(dataset_name)) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py deleted file mode 100755 index 1e84a5bd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes.py +++ /dev/null @@ -1,329 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import functools -import json -import logging -import multiprocessing as mp -import numpy as np -import os -from itertools import chain -import pycocotools.mask as mask_util -from PIL import Image - -from detectron2.structures import BoxMode -from detectron2.utils.comm import get_world_size -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - - -logger = logging.getLogger(__name__) - - -def _get_cityscapes_files(image_dir, gt_dir): - files = [] - # scan through the directory - cities = PathManager.ls(image_dir) - logger.info(f"{len(cities)} cities found in '{image_dir}'.") - for city in cities: - city_img_dir = os.path.join(image_dir, city) - city_gt_dir = os.path.join(gt_dir, city) - for basename in PathManager.ls(city_img_dir): - image_file = os.path.join(city_img_dir, basename) - - suffix = "leftImg8bit.png" - assert basename.endswith(suffix), basename - basename = basename[: -len(suffix)] - - instance_file = os.path.join(city_gt_dir, basename + "gtFine_instanceIds.png") - label_file = os.path.join(city_gt_dir, basename + "gtFine_labelIds.png") - json_file = os.path.join(city_gt_dir, basename + "gtFine_polygons.json") - - files.append((image_file, instance_file, label_file, json_file)) - assert len(files), "No images found in {}".format(image_dir) - for f in files[0]: - assert PathManager.isfile(f), f - return files - - -def load_cityscapes_instances(image_dir, gt_dir, from_json=True, to_polygons=True): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., "~/cityscapes/gtFine/train". - from_json (bool): whether to read annotations from the raw json file or the png files. 
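The cityscapes branch of `_get_builtin_metadata` above is easy to sanity-check; a sketch assuming the module is importable at its path in this tree, with the expected counts taken from the hard-coded class lists:

```python
from detectron2.data.datasets.builtin_meta import _get_builtin_metadata

meta = _get_builtin_metadata("cityscapes")
print(len(meta["thing_classes"]))  # 8 instance classes (person ... bicycle)
print(len(meta["stuff_classes"]))  # 19 semantic classes, things included
```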
- to_polygons (bool): whether to represent the segmentation as polygons - (COCO's format) instead of masks (cityscapes's format). - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets `_ ) - """ - if from_json: - assert to_polygons, ( - "Cityscapes's json annotations are in polygon format. " - "Converting to mask format is not supported now." - ) - files = _get_cityscapes_files(image_dir, gt_dir) - - logger.info("Preprocessing cityscapes annotations ...") - # This is still not fast: all workers will execute duplicate works and will - # take up to 10m on a 8GPU server. - pool = mp.Pool(processes=max(mp.cpu_count() // get_world_size() // 2, 4)) - - ret = pool.map( - functools.partial(_cityscapes_files_to_dict, from_json=from_json, to_polygons=to_polygons), - files, - ) - logger.info("Loaded {} images from {}".format(len(ret), image_dir)) - - # Map cityscape ids to contiguous ids - from cityscapesscripts.helpers.labels import labels - - labels = [l for l in labels if l.hasInstances and not l.ignoreInEval] - dataset_id_to_contiguous_id = {l.id: idx for idx, l in enumerate(labels)} - for dict_per_image in ret: - for anno in dict_per_image["annotations"]: - anno["category_id"] = dataset_id_to_contiguous_id[anno["category_id"]] - return ret - - -def load_cityscapes_semantic(image_dir, gt_dir): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., "~/cityscapes/gtFine/train". - - Returns: - list[dict]: a list of dict, each has "file_name" and - "sem_seg_file_name". - """ - ret = [] - # gt_dir is small and contain many small files. make sense to fetch to local first - gt_dir = PathManager.get_local_path(gt_dir) - for image_file, _, label_file, json_file in _get_cityscapes_files(image_dir, gt_dir): - label_file = label_file.replace("labelIds", "labelTrainIds") - - with PathManager.open(json_file, "r") as f: - jsonobj = json.load(f) - ret.append( - { - "file_name": image_file, - "sem_seg_file_name": label_file, - "height": jsonobj["imgHeight"], - "width": jsonobj["imgWidth"], - } - ) - assert len(ret), f"No images found in {image_dir}!" - assert PathManager.isfile( - ret[0]["sem_seg_file_name"] - ), "Please generate labelTrainIds.png with cityscapesscripts/preparation/createTrainIdLabelImgs.py" # noqa - return ret - - -def _cityscapes_files_to_dict(files, from_json, to_polygons): - """ - Parse cityscapes annotation files to a instance segmentation dataset dict. - - Args: - files (tuple): consists of (image_file, instance_id_file, label_id_file, json_file) - from_json (bool): whether to read annotations from the raw json file or the png files. - to_polygons (bool): whether to represent the segmentation as polygons - (COCO's format) instead of masks (cityscapes's format). - - Returns: - A dict in Detectron2 Dataset format. - """ - from cityscapesscripts.helpers.labels import id2label, name2label - - image_file, instance_id_file, _, json_file = files - - annos = [] - - if from_json: - from shapely.geometry import MultiPolygon, Polygon - - with PathManager.open(json_file, "r") as f: - jsonobj = json.load(f) - ret = { - "file_name": image_file, - "image_id": os.path.basename(image_file), - "height": jsonobj["imgHeight"], - "width": jsonobj["imgWidth"], - } - - # `polygons_union` contains the union of all valid polygons. - polygons_union = Polygon() - - # CityscapesScripts draw the polygons in sequential order - # and each polygon *overwrites* existing ones. 
-        # (https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py)  # noqa
-        # We use reverse order, and each polygon *avoids* early ones.
-        # This will resolve the polygon overlaps in the same way as CityscapesScripts.
-        for obj in jsonobj["objects"][::-1]:
-            if "deleted" in obj:  # cityscapes data format specific
-                continue
-            label_name = obj["label"]
-
-            try:
-                label = name2label[label_name]
-            except KeyError:
-                if label_name.endswith("group"):  # crowd area
-                    label = name2label[label_name[: -len("group")]]
-                else:
-                    raise
-            if label.id < 0:  # cityscapes data format
-                continue
-
-            # Cityscapes's raw annotations use integer coordinates
-            # Therefore +0.5 here
-            poly_coord = np.asarray(obj["polygon"], dtype="f4") + 0.5
-            # CityscapesScripts uses PIL.ImageDraw.polygon to rasterize
-            # polygons for evaluation. This function operates in integer space
-            # and draws each pixel whose center falls into the polygon.
-            # Therefore it draws a polygon which is 0.5 "fatter" in expectation.
-            # We therefore dilate the input polygon by 0.5 as our input.
-            poly = Polygon(poly_coord).buffer(0.5, resolution=4)
-
-            if not label.hasInstances or label.ignoreInEval:
-                # even if we won't store the polygon it still contributes to overlaps resolution
-                polygons_union = polygons_union.union(poly)
-                continue
-
-            # Take non-overlapping part of the polygon
-            poly_wo_overlaps = poly.difference(polygons_union)
-            if poly_wo_overlaps.is_empty:
-                continue
-            polygons_union = polygons_union.union(poly)
-
-            anno = {}
-            anno["iscrowd"] = label_name.endswith("group")
-            anno["category_id"] = label.id
-
-            if isinstance(poly_wo_overlaps, Polygon):
-                poly_list = [poly_wo_overlaps]
-            elif isinstance(poly_wo_overlaps, MultiPolygon):
-                poly_list = poly_wo_overlaps.geoms
-            else:
-                raise NotImplementedError("Unknown geometric structure {}".format(poly_wo_overlaps))
-
-            poly_coord = []
-            for poly_el in poly_list:
-                # COCO API can work only with exterior boundaries now, hence we store only them.
-                # TODO: store both exterior and interior boundaries once other parts of the
-                # codebase support holes in polygons.
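The reverse-order scheme described in the comments above is easy to verify in isolation. A minimal standalone sketch (not part of the deleted file; the two squares and their coordinates are made up for illustration):

```python
# Later-drawn polygons win: iterate in reverse draw order and subtract the
# running union, so each object keeps only its still-visible region.
from shapely.geometry import Polygon

drawn_first = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])  # area 16, partly covered later
drawn_last = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)])   # area 16, drawn on top

union = Polygon()  # union of all polygons handled so far
visible = []
for poly in [drawn_last, drawn_first]:  # reverse draw order
    visible.append(poly.difference(union))  # drop parts covered by later polygons
    union = union.union(poly)

assert visible[0].area == 16.0  # the top polygon is fully visible
assert visible[1].area == 12.0  # the bottom polygon loses the 2x2 overlap
```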
-                poly_coord.append(list(chain(*poly_el.exterior.coords)))
-            anno["segmentation"] = poly_coord
-            (xmin, ymin, xmax, ymax) = poly_wo_overlaps.bounds
-
-            anno["bbox"] = (xmin, ymin, xmax, ymax)
-            anno["bbox_mode"] = BoxMode.XYXY_ABS
-
-            annos.append(anno)
-    else:
-        # See also the official annotation parsing scripts at
-        # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py  # noqa
-        with PathManager.open(instance_id_file, "rb") as f:
-            inst_image = np.asarray(Image.open(f), order="F")
-        # ids < 24 are stuff labels (filtering them first is about 5% faster)
-        flattened_ids = np.unique(inst_image[inst_image >= 24])
-
-        ret = {
-            "file_name": image_file,
-            "image_id": os.path.basename(image_file),
-            "height": inst_image.shape[0],
-            "width": inst_image.shape[1],
-        }
-
-        for instance_id in flattened_ids:
-            # For non-crowd annotations, instance_id // 1000 is the label_id
-            # Crowd annotations have <1000 instance ids
-            label_id = instance_id // 1000 if instance_id >= 1000 else instance_id
-            label = id2label[label_id]
-            if not label.hasInstances or label.ignoreInEval:
-                continue
-
-            anno = {}
-            anno["iscrowd"] = instance_id < 1000
-            anno["category_id"] = label.id
-
-            mask = np.asarray(inst_image == instance_id, dtype=np.uint8, order="F")
-
-            inds = np.nonzero(mask)
-            ymin, ymax = inds[0].min(), inds[0].max()
-            xmin, xmax = inds[1].min(), inds[1].max()
-            anno["bbox"] = (xmin, ymin, xmax, ymax)
-            if xmax <= xmin or ymax <= ymin:
-                continue
-            anno["bbox_mode"] = BoxMode.XYXY_ABS
-            if to_polygons:
-                # This conversion comes from D4809743 and D5171122,
-                # when Mask-RCNN was first developed.
-                contours = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)[
-                    -2
-                ]
-                polygons = [c.reshape(-1).tolist() for c in contours if len(c) >= 3]
-                # OpenCV can produce invalid polygons
-                if len(polygons) == 0:
-                    continue
-                anno["segmentation"] = polygons
-            else:
-                anno["segmentation"] = mask_util.encode(mask[:, :, None])[0]
-            annos.append(anno)
-    ret["annotations"] = annos
-    return ret
-
-
-if __name__ == "__main__":
-    """
-    Test the cityscapes dataset loader.
-
-    Usage:
-        python -m detectron2.data.datasets.cityscapes \
-            cityscapes/leftImg8bit/train cityscapes/gtFine/train
-    """
-    import argparse
-
-    parser = argparse.ArgumentParser()
-    parser.add_argument("image_dir")
-    parser.add_argument("gt_dir")
-    parser.add_argument("--type", choices=["instance", "semantic"], default="instance")
-    args = parser.parse_args()
-    from detectron2.data.catalog import Metadata
-    from detectron2.utils.visualizer import Visualizer
-    from cityscapesscripts.helpers.labels import labels
-
-    logger = setup_logger(name=__name__)
-
-    dirname = "cityscapes-data-vis"
-    os.makedirs(dirname, exist_ok=True)
-
-    if args.type == "instance":
-        dicts = load_cityscapes_instances(
-            args.image_dir, args.gt_dir, from_json=True, to_polygons=True
-        )
-        logger.info("Done loading {} samples.".format(len(dicts)))
-
-        thing_classes = [k.name for k in labels if k.hasInstances and not k.ignoreInEval]
-        meta = Metadata().set(thing_classes=thing_classes)
-
-    else:
-        dicts = load_cityscapes_semantic(args.image_dir, args.gt_dir)
-        logger.info("Done loading {} samples.".format(len(dicts)))
-
-        stuff_classes = [k.name for k in labels if k.trainId != 255]
-        stuff_colors = [k.color for k in labels if k.trainId != 255]
-        meta = Metadata().set(stuff_classes=stuff_classes, stuff_colors=stuff_colors)
-
-    for d in dicts:
-        img = np.array(Image.open(PathManager.open(d["file_name"], "rb")))
-        visualizer = Visualizer(img, metadata=meta)
-        vis = visualizer.draw_dataset_dict(d)
-        # cv2.imshow("a", vis.get_image()[:, :, ::-1])
-        # cv2.waitKey()
-        fpath = os.path.join(dirname, os.path.basename(d["file_name"]))
-        vis.save(fpath)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py
deleted file mode 100755
index 48c136f1..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/cityscapes_panoptic.py
+++ /dev/null
@@ -1,187 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import json
-import logging
-import os
-
-from detectron2.data import DatasetCatalog, MetadataCatalog
-from detectron2.data.datasets.builtin_meta import CITYSCAPES_CATEGORIES
-from detectron2.utils.file_io import PathManager
-
-"""
-This file contains functions to register the Cityscapes panoptic dataset to the DatasetCatalog.
-""" - - -logger = logging.getLogger(__name__) - - -def get_cityscapes_panoptic_files(image_dir, gt_dir, json_info): - files = [] - # scan through the directory - cities = PathManager.ls(image_dir) - logger.info(f"{len(cities)} cities found in '{image_dir}'.") - image_dict = {} - for city in cities: - city_img_dir = os.path.join(image_dir, city) - for basename in PathManager.ls(city_img_dir): - image_file = os.path.join(city_img_dir, basename) - - suffix = "_leftImg8bit.png" - assert basename.endswith(suffix), basename - basename = os.path.basename(basename)[: -len(suffix)] - - image_dict[basename] = image_file - - for ann in json_info["annotations"]: - image_file = image_dict.get(ann["image_id"], None) - assert image_file is not None, "No image {} found for annotation {}".format( - ann["image_id"], ann["file_name"] - ) - label_file = os.path.join(gt_dir, ann["file_name"]) - segments_info = ann["segments_info"] - - files.append((image_file, label_file, segments_info)) - - assert len(files), "No images found in {}".format(image_dir) - assert PathManager.isfile(files[0][0]), files[0][0] - assert PathManager.isfile(files[0][1]), files[0][1] - return files - - -def load_cityscapes_panoptic(image_dir, gt_dir, gt_json, meta): - """ - Args: - image_dir (str): path to the raw dataset. e.g., "~/cityscapes/leftImg8bit/train". - gt_dir (str): path to the raw annotations. e.g., - "~/cityscapes/gtFine/cityscapes_panoptic_train". - gt_json (str): path to the json file. e.g., - "~/cityscapes/gtFine/cityscapes_panoptic_train.json". - meta (dict): dictionary containing "thing_dataset_id_to_contiguous_id" - and "stuff_dataset_id_to_contiguous_id" to map category ids to - contiguous ids for training. - - Returns: - list[dict]: a list of dicts in Detectron2 standard format. (See - `Using Custom Datasets `_ ) - """ - - def _convert_category_id(segment_info, meta): - if segment_info["category_id"] in meta["thing_dataset_id_to_contiguous_id"]: - segment_info["category_id"] = meta["thing_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - else: - segment_info["category_id"] = meta["stuff_dataset_id_to_contiguous_id"][ - segment_info["category_id"] - ] - return segment_info - - assert os.path.exists( - gt_json - ), "Please run `python cityscapesscripts/preparation/createPanopticImgs.py` to generate label files." # noqa - with open(gt_json) as f: - json_info = json.load(f) - files = get_cityscapes_panoptic_files(image_dir, gt_dir, json_info) - ret = [] - for image_file, label_file, segments_info in files: - sem_label_file = ( - image_file.replace("leftImg8bit", "gtFine").split(".")[0] + "_labelTrainIds.png" - ) - segments_info = [_convert_category_id(x, meta) for x in segments_info] - ret.append( - { - "file_name": image_file, - "image_id": "_".join( - os.path.splitext(os.path.basename(image_file))[0].split("_")[:3] - ), - "sem_seg_file_name": sem_label_file, - "pan_seg_file_name": label_file, - "segments_info": segments_info, - } - ) - assert len(ret), f"No images found in {image_dir}!" 
-    assert PathManager.isfile(
-        ret[0]["sem_seg_file_name"]
-    ), "Please generate labelTrainIds.png with cityscapesscripts/preparation/createTrainIdLabelImgs.py"  # noqa
-    assert PathManager.isfile(
-        ret[0]["pan_seg_file_name"]
-    ), "Please generate panoptic annotation with python cityscapesscripts/preparation/createPanopticImgs.py"  # noqa
-    return ret
-
-
-_RAW_CITYSCAPES_PANOPTIC_SPLITS = {
-    "cityscapes_fine_panoptic_train": (
-        "cityscapes/leftImg8bit/train",
-        "cityscapes/gtFine/cityscapes_panoptic_train",
-        "cityscapes/gtFine/cityscapes_panoptic_train.json",
-    ),
-    "cityscapes_fine_panoptic_val": (
-        "cityscapes/leftImg8bit/val",
-        "cityscapes/gtFine/cityscapes_panoptic_val",
-        "cityscapes/gtFine/cityscapes_panoptic_val.json",
-    ),
-    # "cityscapes_fine_panoptic_test": not supported yet
-}
-
-
-def register_all_cityscapes_panoptic(root):
-    meta = {}
-    # The following metadata maps contiguous id from [0, #thing categories +
-    # #stuff categories) to their names and colors. We have to replicate the
-    # same name and color under "thing_*" and "stuff_*" because the current
-    # visualization function in D2 handles thing and stuff classes differently
-    # due to some heuristic used in Panoptic FPN. We keep the same naming to
-    # enable reusing existing visualization functions.
-    thing_classes = [k["name"] for k in CITYSCAPES_CATEGORIES]
-    thing_colors = [k["color"] for k in CITYSCAPES_CATEGORIES]
-    stuff_classes = [k["name"] for k in CITYSCAPES_CATEGORIES]
-    stuff_colors = [k["color"] for k in CITYSCAPES_CATEGORIES]
-
-    meta["thing_classes"] = thing_classes
-    meta["thing_colors"] = thing_colors
-    meta["stuff_classes"] = stuff_classes
-    meta["stuff_colors"] = stuff_colors
-
-    # There are three types of ids in cityscapes panoptic segmentation:
-    # (1) category id: like semantic segmentation, it is the class id for each
-    # pixel. Since there are some classes not used in evaluation, the category
-    # id is not always contiguous and thus we have two sets of category ids:
-    #     - original category id: category id in the original dataset, mainly
-    #       used for evaluation.
-    #     - contiguous category id: [0, #classes), in order to train the classifier
-    # (2) instance id: this id is used to differentiate different instances from
-    # the same category. For "stuff" classes, the instance id is always 0; for
-    # "thing" classes, the instance id starts from 1 and 0 is reserved for
-    # ignored instances (e.g. crowd annotation).
-    # (3) panoptic id: this is the compact id that encodes both category and
-    # instance id by: category_id * 1000 + instance_id.
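Since the comment above fully specifies the encoding, a tiny self-contained check (added for illustration; not part of the deleted file) makes the convention concrete:

```python
# panoptic_id = category_id * 1000 + instance_id, per the comment above.
def decode_panoptic_id(panoptic_id: int) -> tuple:
    """Split a compact panoptic id into (category_id, instance_id)."""
    return panoptic_id // 1000, panoptic_id % 1000

assert decode_panoptic_id(26001) == (26, 1)  # "thing" pixel: category 26, instance 1
assert decode_panoptic_id(7000) == (7, 0)    # "stuff" pixel: instance id is always 0
```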
-    thing_dataset_id_to_contiguous_id = {}
-    stuff_dataset_id_to_contiguous_id = {}
-
-    for k in CITYSCAPES_CATEGORIES:
-        if k["isthing"] == 1:
-            thing_dataset_id_to_contiguous_id[k["id"]] = k["trainId"]
-        else:
-            stuff_dataset_id_to_contiguous_id[k["id"]] = k["trainId"]
-
-    meta["thing_dataset_id_to_contiguous_id"] = thing_dataset_id_to_contiguous_id
-    meta["stuff_dataset_id_to_contiguous_id"] = stuff_dataset_id_to_contiguous_id
-
-    for key, (image_dir, gt_dir, gt_json) in _RAW_CITYSCAPES_PANOPTIC_SPLITS.items():
-        image_dir = os.path.join(root, image_dir)
-        gt_dir = os.path.join(root, gt_dir)
-        gt_json = os.path.join(root, gt_json)
-
-        DatasetCatalog.register(
-            key, lambda x=image_dir, y=gt_dir, z=gt_json: load_cityscapes_panoptic(x, y, z, meta)
-        )
-        MetadataCatalog.get(key).set(
-            panoptic_root=gt_dir,
-            image_root=image_dir,
-            panoptic_json=gt_json,
-            gt_dir=gt_dir.replace("cityscapes_panoptic_", ""),
-            evaluator_type="cityscapes_panoptic_seg",
-            ignore_label=255,
-            label_divisor=1000,
-            **meta,
-        )
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py
deleted file mode 100755
index ed4f7ccb..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco.py
+++ /dev/null
@@ -1,539 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import contextlib
-import datetime
-import io
-import json
-import logging
-import numpy as np
-import os
-import shutil
-import pycocotools.mask as mask_util
-from fvcore.common.timer import Timer
-from iopath.common.file_io import file_lock
-from PIL import Image
-
-from detectron2.structures import Boxes, BoxMode, PolygonMasks, RotatedBoxes
-from detectron2.utils.file_io import PathManager
-
-from .. import DatasetCatalog, MetadataCatalog
-
-"""
-This file contains functions to parse COCO-format annotations into dicts in "Detectron2 format".
-"""
-
-
-logger = logging.getLogger(__name__)
-
-__all__ = ["load_coco_json", "load_sem_seg", "convert_to_coco_json", "register_coco_instances"]
-
-
-def load_coco_json(json_file, image_root, dataset_name=None, extra_annotation_keys=None):
-    """
-    Load a json file with COCO's instances annotation format.
-    Currently supports instance detection, instance segmentation,
-    and person keypoints annotations.
-
-    Args:
-        json_file (str): full path to the json file in COCO instances annotation format.
-        image_root (str or path-like): the directory where the images in this json file exist.
-        dataset_name (str or None): the name of the dataset (e.g., coco_2017_train).
-            When provided, this function will also do the following:
-
-            * Put "thing_classes" into the metadata associated with this dataset.
-            * Map the category ids into a contiguous range (needed by standard dataset format),
-              and add "thing_dataset_id_to_contiguous_id" to the metadata associated
-              with this dataset.
-
-            This option should usually be provided, unless users need to load
-            the original json content and apply more processing manually.
-        extra_annotation_keys (list[str]): list of per-annotation keys that should also be
-            loaded into the dataset dict (besides "iscrowd", "bbox", "keypoints",
-            "category_id", "segmentation"). The values for these keys will be returned as-is.
-            For example, the densepose annotations are loaded in this way.
-
-    Returns:
-        list[dict]: a list of dicts in Detectron2 standard dataset dicts format (See
-        `Using Custom Datasets <https://detectron2.readthedocs.io/tutorials/datasets.html>`_ ) when `dataset_name` is not None.
-        If `dataset_name` is None, the returned `category_ids` may be
-        incontiguous and may not conform to the Detectron2 standard format.
-
-    Notes:
-        1. This function does not read the image files.
-           The results do not have the "image" field.
-    """
-    from pycocotools.coco import COCO
-
-    timer = Timer()
-    json_file = PathManager.get_local_path(json_file)
-    with contextlib.redirect_stdout(io.StringIO()):
-        coco_api = COCO(json_file)
-    if timer.seconds() > 1:
-        logger.info("Loading {} takes {:.2f} seconds.".format(json_file, timer.seconds()))
-
-    id_map = None
-    if dataset_name is not None:
-        meta = MetadataCatalog.get(dataset_name)
-        cat_ids = sorted(coco_api.getCatIds())
-        cats = coco_api.loadCats(cat_ids)
-        # The categories in a custom json file may not be sorted.
-        thing_classes = [c["name"] for c in sorted(cats, key=lambda x: x["id"])]
-        meta.thing_classes = thing_classes
-
-        # In COCO, certain category ids are artificially removed,
-        # and by convention they are always ignored.
-        # We deal with COCO's id issue and translate
-        # the category ids to contiguous ids in [0, 80).
-
-        # It works by looking at the "categories" field in the json, therefore
-        # if users' own json also has incontiguous ids, we'll
-        # apply this mapping as well but print a warning.
-        if not (min(cat_ids) == 1 and max(cat_ids) == len(cat_ids)):
-            if "coco" not in dataset_name:
-                logger.warning(
-                    """
-Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.
-"""
-                )
-        id_map = {v: i for i, v in enumerate(cat_ids)}
-        meta.thing_dataset_id_to_contiguous_id = id_map
-
-    # sort indices for reproducible results
-    img_ids = sorted(coco_api.imgs.keys())
-    # imgs is a list of dicts, each looks something like:
-    # {'license': 4,
-    #  'url': 'http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg',
-    #  'file_name': 'COCO_val2014_000000001268.jpg',
-    #  'height': 427,
-    #  'width': 640,
-    #  'date_captured': '2013-11-17 05:57:24',
-    #  'id': 1268}
-    imgs = coco_api.loadImgs(img_ids)
-    # anns is a list[list[dict]], where each dict is an annotation
-    # record for an object. The inner list enumerates the objects in an image
-    # and the outer list enumerates over images. Example of anns[0]:
-    # [{'segmentation': [[192.81,
-    #     247.09,
-    #     ...
-    #     219.03,
-    #     249.06]],
-    #   'area': 1035.749,
-    #   'iscrowd': 0,
-    #   'image_id': 1268,
-    #   'bbox': [192.81, 224.8, 74.73, 33.43],
-    #   'category_id': 16,
-    #   'id': 42986},
-    #  ...]
-    anns = [coco_api.imgToAnns[img_id] for img_id in img_ids]
-    total_num_valid_anns = sum([len(x) for x in anns])
-    total_num_anns = len(coco_api.anns)
-    if total_num_valid_anns < total_num_anns:
-        logger.warning(
-            f"{json_file} contains {total_num_anns} annotations, but only "
-            f"{total_num_valid_anns} of them match to images in the file."
-        )
-
-    if "minival" not in json_file:
-        # The popular valminusminival & minival annotations for COCO2014 contain this bug.
-        # However the ratio of buggy annotations there is tiny and does not affect accuracy.
-        # Therefore we explicitly white-list them.
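For reference, the contiguous remapping described above is just an enumeration over the sorted category ids. A made-up standalone example (not from the deleted file):

```python
# COCO category ids can have holes (e.g. id 12 is unused), so they are
# re-indexed into a contiguous [0, #categories) range for training.
cat_ids = [1, 2, 3, 11, 13]                     # made-up, non-contiguous ids
id_map = {v: i for i, v in enumerate(cat_ids)}  # same expression as in the loader
assert id_map == {1: 0, 2: 1, 3: 2, 11: 3, 13: 4}
```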
- ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image] - assert len(set(ann_ids)) == len(ann_ids), "Annotation ids in '{}' are not unique!".format( - json_file - ) - - imgs_anns = list(zip(imgs, anns)) - logger.info("Loaded {} images in COCO format from {}".format(len(imgs_anns), json_file)) - - dataset_dicts = [] - - ann_keys = ["iscrowd", "bbox", "keypoints", "category_id"] + (extra_annotation_keys or []) - - num_instances_without_valid_segmentation = 0 - - for (img_dict, anno_dict_list) in imgs_anns: - record = {} - record["file_name"] = os.path.join(image_root, img_dict["file_name"]) - record["height"] = img_dict["height"] - record["width"] = img_dict["width"] - image_id = record["image_id"] = img_dict["id"] - - objs = [] - for anno in anno_dict_list: - # Check that the image_id in this annotation is the same as - # the image_id we're looking at. - # This fails only when the data parsing logic or the annotation file is buggy. - - # The original COCO valminusminival2014 & minival2014 annotation files - # actually contains bugs that, together with certain ways of using COCO API, - # can trigger this assertion. - assert anno["image_id"] == image_id - - assert anno.get("ignore", 0) == 0, '"ignore" in COCO json file is not supported.' - - obj = {key: anno[key] for key in ann_keys if key in anno} - if "bbox" in obj and len(obj["bbox"]) == 0: - raise ValueError( - f"One annotation of image {image_id} contains empty 'bbox' value! " - "This json does not have valid COCO format." - ) - - segm = anno.get("segmentation", None) - if segm: # either list[list[float]] or dict(RLE) - if isinstance(segm, dict): - if isinstance(segm["counts"], list): - # convert to compressed RLE - segm = mask_util.frPyObjects(segm, *segm["size"]) - else: - # filter out invalid polygons (< 3 points) - segm = [poly for poly in segm if len(poly) % 2 == 0 and len(poly) >= 6] - if len(segm) == 0: - num_instances_without_valid_segmentation += 1 - continue # ignore this instance - obj["segmentation"] = segm - - keypts = anno.get("keypoints", None) - if keypts: # list[int] - for idx, v in enumerate(keypts): - if idx % 3 != 2: - # COCO's segmentation coordinates are floating points in [0, H or W], - # but keypoint coordinates are integers in [0, H-1 or W-1] - # Therefore we assume the coordinates are "pixel indices" and - # add 0.5 to convert to floating point coordinates. - keypts[idx] = v + 0.5 - obj["keypoints"] = keypts - - obj["bbox_mode"] = BoxMode.XYWH_ABS - if id_map: - annotation_category_id = obj["category_id"] - try: - obj["category_id"] = id_map[annotation_category_id] - except KeyError as e: - raise KeyError( - f"Encountered category_id={annotation_category_id} " - "but this id does not exist in 'categories' of the json file." - ) from e - objs.append(obj) - record["annotations"] = objs - dataset_dicts.append(record) - - if num_instances_without_valid_segmentation > 0: - logger.warning( - "Filtered out {} instances without valid segmentation. ".format( - num_instances_without_valid_segmentation - ) - + "There might be issues in your dataset generation process. Please " - "check https://detectron2.readthedocs.io/en/latest/tutorials/datasets.html carefully" - ) - return dataset_dicts - - -def load_sem_seg(gt_root, image_root, gt_ext="png", image_ext="jpg"): - """ - Load semantic segmentation datasets. All files under "gt_root" with "gt_ext" extension are - treated as ground truth annotations and all files under "image_root" with "image_ext" extension - as input images. 
-    "gt_root" and "image_root" respectively without taking into account file extensions.
-    This works for COCO as well as some other datasets.
-
-    Args:
-        gt_root (str): full path to ground truth semantic segmentation files. Semantic segmentation
-            annotations are stored as images with integer values in pixels that represent
-            corresponding semantic labels.
-        image_root (str): the directory where the input images are.
-        gt_ext (str): file extension for ground truth annotations.
-        image_ext (str): file extension for input images.
-
-    Returns:
-        list[dict]:
-            a list of dicts in detectron2 standard format without instance-level
-            annotation.
-
-    Notes:
-        1. This function does not read the image and ground truth files.
-           The results do not have the "image" and "sem_seg" fields.
-    """
-
-    # We match input images with ground truth based on their relative filepaths (without file
-    # extensions) starting from 'image_root' and 'gt_root' respectively.
-    def file2id(folder_path, file_path):
-        # extract relative path starting from `folder_path`
-        image_id = os.path.normpath(os.path.relpath(file_path, start=folder_path))
-        # remove file extension
-        image_id = os.path.splitext(image_id)[0]
-        return image_id
-
-    input_files = sorted(
-        (os.path.join(image_root, f) for f in PathManager.ls(image_root) if f.endswith(image_ext)),
-        key=lambda file_path: file2id(image_root, file_path),
-    )
-    gt_files = sorted(
-        (os.path.join(gt_root, f) for f in PathManager.ls(gt_root) if f.endswith(gt_ext)),
-        key=lambda file_path: file2id(gt_root, file_path),
-    )
-
-    assert len(gt_files) > 0, "No annotations found in {}.".format(gt_root)
-
-    # Use the intersection, so that val2017_100 annotations can run smoothly with val2017 images
-    if len(input_files) != len(gt_files):
-        logger.warn(
-            "Directories {} and {} have {} and {} files, respectively.".format(
-                image_root, gt_root, len(input_files), len(gt_files)
-            )
-        )
-        input_basenames = [os.path.basename(f)[: -len(image_ext)] for f in input_files]
-        gt_basenames = [os.path.basename(f)[: -len(gt_ext)] for f in gt_files]
-        intersect = list(set(input_basenames) & set(gt_basenames))
-        # sort, otherwise each worker may obtain a list[dict] in different order
-        intersect = sorted(intersect)
-        logger.warn("Will use their intersection of {} files.".format(len(intersect)))
-        input_files = [os.path.join(image_root, f + image_ext) for f in intersect]
-        gt_files = [os.path.join(gt_root, f + gt_ext) for f in intersect]
-
-    logger.info(
-        "Loaded {} images with semantic segmentation from {}".format(len(input_files), image_root)
-    )
-
-    dataset_dicts = []
-    for (img_path, gt_path) in zip(input_files, gt_files):
-        record = {}
-        record["file_name"] = img_path
-        record["sem_seg_file_name"] = gt_path
-        dataset_dicts.append(record)
-
-    return dataset_dicts
-
-
-def convert_to_coco_dict(dataset_name):
-    """
-    Convert an instance detection/segmentation or keypoint detection dataset
-    in detectron2's standard format into COCO json format.
-
-    Generic dataset description can be found here:
-    https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset
-
-    COCO data format description can be found here:
-    http://cocodataset.org/#format-data
-
-    Args:
-        dataset_name (str):
-            name of the source dataset
-            Must be registered in DatasetCatalog and in detectron2's standard format.
-            Must have corresponding metadata "thing_classes"
-    Returns:
-        coco_dict: serializable dict in COCO json format
-    """
-
-    dataset_dicts = DatasetCatalog.get(dataset_name)
-    metadata = MetadataCatalog.get(dataset_name)
-
-    # unmap the category mapping ids for COCO
-    if hasattr(metadata, "thing_dataset_id_to_contiguous_id"):
-        reverse_id_mapping = {v: k for k, v in metadata.thing_dataset_id_to_contiguous_id.items()}
-        reverse_id_mapper = lambda contiguous_id: reverse_id_mapping[contiguous_id]  # noqa
-    else:
-        reverse_id_mapper = lambda contiguous_id: contiguous_id  # noqa
-
-    categories = [
-        {"id": reverse_id_mapper(id), "name": name}
-        for id, name in enumerate(metadata.thing_classes)
-    ]
-
-    logger.info("Converting dataset dicts into COCO format")
-    coco_images = []
-    coco_annotations = []
-
-    for image_id, image_dict in enumerate(dataset_dicts):
-        coco_image = {
-            "id": image_dict.get("image_id", image_id),
-            "width": int(image_dict["width"]),
-            "height": int(image_dict["height"]),
-            "file_name": str(image_dict["file_name"]),
-        }
-        coco_images.append(coco_image)
-
-        anns_per_image = image_dict.get("annotations", [])
-        for annotation in anns_per_image:
-            # create a new dict with only COCO fields
-            coco_annotation = {}
-
-            # COCO requirement: XYWH box format for axis-align and XYWHA for rotated
-            bbox = annotation["bbox"]
-            if isinstance(bbox, np.ndarray):
-                if bbox.ndim != 1:
-                    raise ValueError(f"bbox has to be 1-dimensional. Got shape={bbox.shape}.")
-                bbox = bbox.tolist()
-            if len(bbox) not in [4, 5]:
-                raise ValueError(f"bbox has to have length 4 or 5. Got {bbox}.")
-            from_bbox_mode = annotation["bbox_mode"]
-            to_bbox_mode = BoxMode.XYWH_ABS if len(bbox) == 4 else BoxMode.XYWHA_ABS
-            bbox = BoxMode.convert(bbox, from_bbox_mode, to_bbox_mode)
-
-            # COCO requirement: instance area
-            if "segmentation" in annotation:
-                # Computing areas for instances by counting the pixels
-                segmentation = annotation["segmentation"]
-                # TODO: check segmentation type: RLE, BinaryMask or Polygon
-                if isinstance(segmentation, list):
-                    polygons = PolygonMasks([segmentation])
-                    area = polygons.area()[0].item()
-                elif isinstance(segmentation, dict):  # RLE
-                    area = mask_util.area(segmentation).item()
-                else:
-                    raise TypeError(f"Unknown segmentation type {type(segmentation)}!")
-            else:
-                # Computing areas using bounding boxes
-                if to_bbox_mode == BoxMode.XYWH_ABS:
-                    bbox_xy = BoxMode.convert(bbox, to_bbox_mode, BoxMode.XYXY_ABS)
-                    area = Boxes([bbox_xy]).area()[0].item()
-                else:
-                    area = RotatedBoxes([bbox]).area()[0].item()
-
-            if "keypoints" in annotation:
-                keypoints = annotation["keypoints"]  # list[int]
-                for idx, v in enumerate(keypoints):
-                    if idx % 3 != 2:
-                        # COCO's segmentation coordinates are floating points in [0, H or W],
-                        # but keypoint coordinates are integers in [0, H-1 or W-1]
-                        # For COCO format consistency we subtract 0.5
-                        # https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163
-                        keypoints[idx] = v - 0.5
-                if "num_keypoints" in annotation:
-                    num_keypoints = annotation["num_keypoints"]
-                else:
-                    num_keypoints = sum(kp > 0 for kp in keypoints[2::3])
-
-            # COCO requirement:
-            #   linking annotations to images
-            #   "id" field must start with 1
-            coco_annotation["id"] = len(coco_annotations) + 1
-            coco_annotation["image_id"] = coco_image["id"]
-            coco_annotation["bbox"] = [round(float(x), 3) for x in bbox]
-            coco_annotation["area"] = float(area)
-            coco_annotation["iscrowd"] = int(annotation.get("iscrowd", 0))
-            coco_annotation["category_id"] = int(reverse_id_mapper(annotation["category_id"]))
-
-            # Add optional fields
-            if "keypoints" in annotation:
-                coco_annotation["keypoints"] = keypoints
-                coco_annotation["num_keypoints"] = num_keypoints
-
-            if "segmentation" in annotation:
-                seg = coco_annotation["segmentation"] = annotation["segmentation"]
-                if isinstance(seg, dict):  # RLE
-                    counts = seg["counts"]
-                    if not isinstance(counts, str):
-                        # make it json-serializable
-                        seg["counts"] = counts.decode("ascii")
-
-            coco_annotations.append(coco_annotation)
-
-    logger.info(
-        "Conversion finished, "
-        f"#images: {len(coco_images)}, #annotations: {len(coco_annotations)}"
-    )
-
-    info = {
-        "date_created": str(datetime.datetime.now()),
-        "description": "Automatically generated COCO json file for Detectron2.",
-    }
-    coco_dict = {"info": info, "images": coco_images, "categories": categories, "licenses": None}
-    if len(coco_annotations) > 0:
-        coco_dict["annotations"] = coco_annotations
-    return coco_dict
-
-
-def convert_to_coco_json(dataset_name, output_file, allow_cached=True):
-    """
-    Converts dataset into COCO format and saves it to a json file.
-    dataset_name must be registered in DatasetCatalog and in detectron2's standard format.
-
-    Args:
-        dataset_name:
-            reference from the config file to the catalogs
-            must be registered in DatasetCatalog and in detectron2's standard format
-        output_file: path of json file that will be saved to
-        allow_cached: if json file is already present then skip conversion
-    """
-
-    # TODO: The dataset or the conversion script *may* change,
-    # a checksum would be useful for validating the cached data
-
-    PathManager.mkdirs(os.path.dirname(output_file))
-    with file_lock(output_file):
-        if PathManager.exists(output_file) and allow_cached:
-            logger.warning(
-                f"Using previously cached COCO format annotations at '{output_file}'. "
-                "You need to clear the cache file if your dataset has been modified."
-            )
-        else:
-            logger.info(f"Converting annotations of dataset '{dataset_name}' to COCO format ...")
-            coco_dict = convert_to_coco_dict(dataset_name)
-
-            logger.info(f"Caching COCO format annotations at '{output_file}' ...")
-            tmp_file = output_file + ".tmp"
-            with PathManager.open(tmp_file, "w") as f:
-                json.dump(coco_dict, f)
-            shutil.move(tmp_file, output_file)
-
-
-def register_coco_instances(name, metadata, json_file, image_root):
-    """
-    Register a dataset in COCO's json annotation format for
-    instance detection, instance segmentation and keypoint detection.
-    (i.e., Type 1 and 2 in http://cocodataset.org/#format-data.
-    `instances*.json` and `person_keypoints*.json` in the dataset).
-
-    This is an example of how to register a new dataset.
-    You can do something similar to this function, to register new datasets.
-
-    Args:
-        name (str): the name that identifies a dataset, e.g. "coco_2014_train".
-        metadata (dict): extra metadata associated with this dataset. You can
-            leave it as an empty dict.
-        json_file (str): path to the json instance annotation file.
-        image_root (str or path-like): directory which contains all the images.
-    """
-    assert isinstance(name, str), name
-    assert isinstance(json_file, (str, os.PathLike)), json_file
-    assert isinstance(image_root, (str, os.PathLike)), image_root
-    # 1. register a function which returns dicts
-    DatasetCatalog.register(name, lambda: load_coco_json(json_file, image_root, name))
-
-    # 2. Optionally, add metadata about this dataset,
-    #    since they might be useful in evaluation, visualization or logging
-    MetadataCatalog.get(name).set(
-        json_file=json_file, image_root=image_root, evaluator_type="coco", **metadata
-    )
-
-
-if __name__ == "__main__":
-    """
-    Test the COCO json dataset loader.
-
-    Usage:
-        python -m detectron2.data.datasets.coco \
-            path/to/json path/to/image_root dataset_name
-
-        "dataset_name" can be "coco_2014_minival_100", or other
-        pre-registered ones
-    """
-    from detectron2.utils.logger import setup_logger
-    from detectron2.utils.visualizer import Visualizer
-    import detectron2.data.datasets  # noqa # add pre-defined metadata
-    import sys
-
-    logger = setup_logger(name=__name__)
-    assert sys.argv[3] in DatasetCatalog.list()
-    meta = MetadataCatalog.get(sys.argv[3])
-
-    dicts = load_coco_json(sys.argv[1], sys.argv[2], sys.argv[3])
-    logger.info("Done loading {} samples.".format(len(dicts)))
-
-    dirname = "coco-data-vis"
-    os.makedirs(dirname, exist_ok=True)
-    for d in dicts:
-        img = np.array(Image.open(d["file_name"]))
-        visualizer = Visualizer(img, metadata=meta)
-        vis = visualizer.draw_dataset_dict(d)
-        fpath = os.path.join(dirname, os.path.basename(d["file_name"]))
-        vis.save(fpath)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py
deleted file mode 100755
index b8dae443..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/coco_panoptic.py
+++ /dev/null
@@ -1,228 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import copy
-import json
-import os
-
-from detectron2.data import DatasetCatalog, MetadataCatalog
-from detectron2.utils.file_io import PathManager
-
-from .coco import load_coco_json, load_sem_seg
-
-__all__ = ["register_coco_panoptic", "register_coco_panoptic_separated"]
-
-
-def load_coco_panoptic_json(json_file, image_dir, gt_dir, meta):
-    """
-    Args:
-        image_dir (str): path to the raw dataset. e.g., "~/coco/train2017".
-        gt_dir (str): path to the raw annotations. e.g., "~/coco/panoptic_train2017".
-        json_file (str): path to the json file. e.g., "~/coco/annotations/panoptic_train2017.json".
-
-    Returns:
-        list[dict]: a list of dicts in Detectron2 standard format. (See
-        `Using Custom Datasets <https://detectron2.readthedocs.io/tutorials/datasets.html>`_ )
-    """
-
-    def _convert_category_id(segment_info, meta):
-        if segment_info["category_id"] in meta["thing_dataset_id_to_contiguous_id"]:
-            segment_info["category_id"] = meta["thing_dataset_id_to_contiguous_id"][
-                segment_info["category_id"]
-            ]
-            segment_info["isthing"] = True
-        else:
-            segment_info["category_id"] = meta["stuff_dataset_id_to_contiguous_id"][
-                segment_info["category_id"]
-            ]
-            segment_info["isthing"] = False
-        return segment_info
-
-    with PathManager.open(json_file) as f:
-        json_info = json.load(f)
-
-    ret = []
-    for ann in json_info["annotations"]:
-        image_id = int(ann["image_id"])
-        # TODO: currently we assume image and label have the same filename but
-        # different extension, and images have extension ".jpg" for COCO. Need
-        # to make image extension a user-provided argument if we extend this
-        # function to support other COCO-like datasets.
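To make the filename convention in the comment above concrete, here is a made-up COCO panoptic annotation record of the shape this loader consumes (field names follow the COCO panoptic format; all values are illustrative):

```python
# One entry of json_info["annotations"]: the panoptic PNG shares its basename
# with the image, so "000000000139.png" pairs with "000000000139.jpg".
ann = {
    "image_id": 139,
    "file_name": "000000000139.png",  # label file under gt_dir
    "segments_info": [
        {"id": 3226956, "category_id": 1, "iscrowd": 0, "bbox": [413, 158, 53, 138], "area": 2840},
    ],
}
```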
-        image_file = os.path.join(image_dir, os.path.splitext(ann["file_name"])[0] + ".jpg")
-        label_file = os.path.join(gt_dir, ann["file_name"])
-        segments_info = [_convert_category_id(x, meta) for x in ann["segments_info"]]
-        ret.append(
-            {
-                "file_name": image_file,
-                "image_id": image_id,
-                "pan_seg_file_name": label_file,
-                "segments_info": segments_info,
-            }
-        )
-    assert len(ret), f"No images found in {image_dir}!"
-    assert PathManager.isfile(ret[0]["file_name"]), ret[0]["file_name"]
-    assert PathManager.isfile(ret[0]["pan_seg_file_name"]), ret[0]["pan_seg_file_name"]
-    return ret
-
-
-def register_coco_panoptic(
-    name, metadata, image_root, panoptic_root, panoptic_json, instances_json=None
-):
-    """
-    Register a "standard" version of COCO panoptic segmentation dataset named `name`.
-    The dictionaries in this registered dataset follow detectron2's standard format.
-    Hence it's called "standard".
-
-    Args:
-        name (str): the name that identifies a dataset,
-            e.g. "coco_2017_train_panoptic"
-        metadata (dict): extra metadata associated with this dataset.
-        image_root (str): directory which contains all the images
-        panoptic_root (str): directory which contains panoptic annotation images in COCO format
-        panoptic_json (str): path to the json panoptic annotation file in COCO format
-        sem_seg_root (none): not used, to be consistent with
-            `register_coco_panoptic_separated`.
-        instances_json (str): path to the json instance annotation file
-    """
-    panoptic_name = name
-    DatasetCatalog.register(
-        panoptic_name,
-        lambda: load_coco_panoptic_json(panoptic_json, image_root, panoptic_root, metadata),
-    )
-    MetadataCatalog.get(panoptic_name).set(
-        panoptic_root=panoptic_root,
-        image_root=image_root,
-        panoptic_json=panoptic_json,
-        json_file=instances_json,
-        evaluator_type="coco_panoptic_seg",
-        ignore_label=255,
-        label_divisor=1000,
-        **metadata,
-    )
-
-
-def register_coco_panoptic_separated(
-    name, metadata, image_root, panoptic_root, panoptic_json, sem_seg_root, instances_json
-):
-    """
-    Register a "separated" version of COCO panoptic segmentation dataset named `name`.
-    The annotations in this registered dataset will contain both instance annotations and
-    semantic annotations, each with its own contiguous ids. Hence it's called "separated".
-
-    It follows the setting used by the PanopticFPN paper:
-
-    1. The instance annotations directly come from polygons in the COCO
-       instances annotation task, rather than from the masks in the COCO panoptic annotations.
-
-       The two formats have small differences:
-       Polygons in the instance annotations may have overlaps.
-       The mask annotations are produced by labeling the overlapped polygons
-       with depth ordering.
-
-    2. The semantic annotations are converted from panoptic annotations, where
-       all "things" are assigned a semantic id of 0.
-       All semantic categories will therefore have ids in contiguous
-       range [1, #stuff_categories].
-
-    This function will also register a pure semantic segmentation dataset
-    named ``name + '_stuffonly'``.
-
-    Args:
-        name (str): the name that identifies a dataset,
-            e.g. "coco_2017_train_panoptic"
-        metadata (dict): extra metadata associated with this dataset.
-        image_root (str): directory which contains all the images
-        panoptic_root (str): directory which contains panoptic annotation images
-        panoptic_json (str): path to the json panoptic annotation file
-        sem_seg_root (str): directory which contains all the ground truth segmentation annotations.
-        instances_json (str): path to the json instance annotation file
-    """
-    panoptic_name = name + "_separated"
-    DatasetCatalog.register(
-        panoptic_name,
-        lambda: merge_to_panoptic(
-            load_coco_json(instances_json, image_root, panoptic_name),
-            load_sem_seg(sem_seg_root, image_root),
-        ),
-    )
-    MetadataCatalog.get(panoptic_name).set(
-        panoptic_root=panoptic_root,
-        image_root=image_root,
-        panoptic_json=panoptic_json,
-        sem_seg_root=sem_seg_root,
-        json_file=instances_json,  # TODO rename
-        evaluator_type="coco_panoptic_seg",
-        ignore_label=255,
-        **metadata,
-    )
-
-    semantic_name = name + "_stuffonly"
-    DatasetCatalog.register(semantic_name, lambda: load_sem_seg(sem_seg_root, image_root))
-    MetadataCatalog.get(semantic_name).set(
-        sem_seg_root=sem_seg_root,
-        image_root=image_root,
-        evaluator_type="sem_seg",
-        ignore_label=255,
-        **metadata,
-    )
-
-
-def merge_to_panoptic(detection_dicts, sem_seg_dicts):
-    """
-    Create dataset dicts for panoptic segmentation, by
-    merging two dicts using "file_name" field to match their entries.
-
-    Args:
-        detection_dicts (list[dict]): lists of dicts for object detection or instance segmentation.
-        sem_seg_dicts (list[dict]): lists of dicts for semantic segmentation.
-
-    Returns:
-        list[dict] (one per input image): Each dict contains all (key, value) pairs from dicts in
-            both detection_dicts and sem_seg_dicts that correspond to the same image.
-            The function assumes that the same key in different dicts has the same value.
-    """
-    results = []
-    sem_seg_file_to_entry = {x["file_name"]: x for x in sem_seg_dicts}
-    assert len(sem_seg_file_to_entry) > 0
-
-    for det_dict in detection_dicts:
-        dic = copy.copy(det_dict)
-        dic.update(sem_seg_file_to_entry[dic["file_name"]])
-        results.append(dic)
-    return results
-
-
-if __name__ == "__main__":
-    """
-    Test the COCO panoptic dataset loader.
-
-    Usage:
-        python -m detectron2.data.datasets.coco_panoptic \
-            path/to/image_root path/to/panoptic_root path/to/panoptic_json dataset_name 10
-
-        "dataset_name" can be "coco_2017_train_panoptic", or other
-        pre-registered ones
-    """
-    from detectron2.utils.logger import setup_logger
-    from detectron2.utils.visualizer import Visualizer
-    import detectron2.data.datasets  # noqa # add pre-defined metadata
-    import sys
-    from PIL import Image
-    import numpy as np
-
-    logger = setup_logger(name=__name__)
-    assert sys.argv[4] in DatasetCatalog.list()
-    meta = MetadataCatalog.get(sys.argv[4])
-
-    dicts = load_coco_panoptic_json(sys.argv[3], sys.argv[1], sys.argv[2], meta.as_dict())
-    logger.info("Done loading {} samples.".format(len(dicts)))
-
-    dirname = "coco-data-vis"
-    os.makedirs(dirname, exist_ok=True)
-    num_imgs_to_vis = int(sys.argv[5])
-    for i, d in enumerate(dicts):
-        img = np.array(Image.open(d["file_name"]))
-        visualizer = Visualizer(img, metadata=meta)
-        vis = visualizer.draw_dataset_dict(d)
-        fpath = os.path.join(dirname, os.path.basename(d["file_name"]))
-        vis.save(fpath)
-        if i + 1 >= num_imgs_to_vis:
-            break
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py
deleted file mode 100755
index 78b39653..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis.py
+++ /dev/null
@@ -1,240 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import logging
-import os
-from fvcore.common.timer import Timer
-
-from detectron2.data import DatasetCatalog, MetadataCatalog
-from detectron2.structures import BoxMode
-from detectron2.utils.file_io import PathManager
-
-from .builtin_meta import _get_coco_instances_meta
-from .lvis_v0_5_categories import LVIS_CATEGORIES as LVIS_V0_5_CATEGORIES
-from .lvis_v1_categories import LVIS_CATEGORIES as LVIS_V1_CATEGORIES
-
-"""
-This file contains functions to parse LVIS-format annotations into dicts in the
-"Detectron2 format".
-"""
-
-logger = logging.getLogger(__name__)
-
-__all__ = ["load_lvis_json", "register_lvis_instances", "get_lvis_instances_meta"]
-
-
-def register_lvis_instances(name, metadata, json_file, image_root):
-    """
-    Register a dataset in LVIS's json annotation format for instance detection and segmentation.
-
-    Args:
-        name (str): a name that identifies the dataset, e.g. "lvis_v0.5_train".
-        metadata (dict): extra metadata associated with this dataset. It can be an empty dict.
-        json_file (str): path to the json instance annotation file.
-        image_root (str or path-like): directory which contains all the images.
-    """
-    DatasetCatalog.register(name, lambda: load_lvis_json(json_file, image_root, name))
-    MetadataCatalog.get(name).set(
-        json_file=json_file, image_root=image_root, evaluator_type="lvis", **metadata
-    )
-
-
-def load_lvis_json(json_file, image_root, dataset_name=None, extra_annotation_keys=None):
-    """
-    Load a json file in LVIS's annotation format.
-
-    Args:
-        json_file (str): full path to the LVIS json annotation file.
-        image_root (str): the directory where the images in this json file exist.
-        dataset_name (str): the name of the dataset (e.g., "lvis_v0.5_train").
-            If provided, this function will put "thing_classes" into the metadata
-            associated with this dataset.
-        extra_annotation_keys (list[str]): list of per-annotation keys that should also be
-            loaded into the dataset dict (besides "bbox", "bbox_mode", "category_id",
-            "segmentation"). The values for these keys will be returned as-is.
-
-    Returns:
-        list[dict]: a list of dicts in Detectron2 standard format. (See
-        `Using Custom Datasets <https://detectron2.readthedocs.io/tutorials/datasets.html>`_ )
-
-    Notes:
-        1. This function does not read the image files.
-           The results do not have the "image" field.
-    """
-    from lvis import LVIS
-
-    json_file = PathManager.get_local_path(json_file)
-
-    timer = Timer()
-    lvis_api = LVIS(json_file)
-    if timer.seconds() > 1:
-        logger.info("Loading {} takes {:.2f} seconds.".format(json_file, timer.seconds()))
-
-    if dataset_name is not None:
-        meta = get_lvis_instances_meta(dataset_name)
-        MetadataCatalog.get(dataset_name).set(**meta)
-
-    # sort indices for reproducible results
-    img_ids = sorted(lvis_api.imgs.keys())
-    # imgs is a list of dicts, each looks something like:
-    # {'license': 4,
-    #  'url': 'http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg',
-    #  'file_name': 'COCO_val2014_000000001268.jpg',
-    #  'height': 427,
-    #  'width': 640,
-    #  'date_captured': '2013-11-17 05:57:24',
-    #  'id': 1268}
-    imgs = lvis_api.load_imgs(img_ids)
-    # anns is a list[list[dict]], where each dict is an annotation
-    # record for an object. The inner list enumerates the objects in an image
-    # and the outer list enumerates over images. Example of anns[0]:
-    # [{'segmentation': [[192.81,
-    #     247.09,
-    #     ...
-    #     219.03,
-    #     249.06]],
-    #   'area': 1035.749,
-    #   'image_id': 1268,
-    #   'bbox': [192.81, 224.8, 74.73, 33.43],
-    #   'category_id': 16,
-    #   'id': 42986},
-    #  ...]
-    anns = [lvis_api.img_ann_map[img_id] for img_id in img_ids]
-
-    # Sanity check that each annotation has a unique id
-    ann_ids = [ann["id"] for anns_per_image in anns for ann in anns_per_image]
-    assert len(set(ann_ids)) == len(ann_ids), "Annotation ids in '{}' are not unique".format(
-        json_file
-    )
-
-    imgs_anns = list(zip(imgs, anns))
-
-    logger.info("Loaded {} images in the LVIS format from {}".format(len(imgs_anns), json_file))
-
-    if extra_annotation_keys:
-        logger.info(
-            "The following extra annotation keys will be loaded: {} ".format(extra_annotation_keys)
-        )
-    else:
-        extra_annotation_keys = []
-
-    def get_file_name(img_root, img_dict):
-        # Determine the path including the split folder ("train2017", "val2017", "test2017") from
-        # the coco_url field. Example:
-        #   'coco_url': 'http://images.cocodataset.org/train2017/000000155379.jpg'
-        split_folder, file_name = img_dict["coco_url"].split("/")[-2:]
-        return os.path.join(img_root + split_folder, file_name)
-
-    dataset_dicts = []
-
-    for (img_dict, anno_dict_list) in imgs_anns:
-        record = {}
-        record["file_name"] = get_file_name(image_root, img_dict)
-        record["height"] = img_dict["height"]
-        record["width"] = img_dict["width"]
-        record["not_exhaustive_category_ids"] = img_dict.get("not_exhaustive_category_ids", [])
-        record["neg_category_ids"] = img_dict.get("neg_category_ids", [])
-        image_id = record["image_id"] = img_dict["id"]
-
-        objs = []
-        for anno in anno_dict_list:
-            # Check that the image_id in this annotation is the same as
-            # the image_id we're looking at.
-            # This fails only when the data parsing logic or the annotation file is buggy.
-            assert anno["image_id"] == image_id
-            obj = {"bbox": anno["bbox"], "bbox_mode": BoxMode.XYWH_ABS}
-            # LVIS data loader can be used to load COCO dataset categories. In this case `meta`
-            # variable will have a field with COCO-specific category mapping.
-            if dataset_name is not None and "thing_dataset_id_to_contiguous_id" in meta:
-                obj["category_id"] = meta["thing_dataset_id_to_contiguous_id"][anno["category_id"]]
-            else:
-                obj["category_id"] = anno["category_id"] - 1  # Convert 1-indexed to 0-indexed
-            segm = anno["segmentation"]  # list[list[float]]
-            # filter out invalid polygons (< 3 points)
-            valid_segm = [poly for poly in segm if len(poly) % 2 == 0 and len(poly) >= 6]
-            assert len(segm) == len(
-                valid_segm
-            ), "Annotation contains an invalid polygon with < 3 points"
-            assert len(segm) > 0
-            obj["segmentation"] = segm
-            for extra_ann_key in extra_annotation_keys:
-                obj[extra_ann_key] = anno[extra_ann_key]
-            objs.append(obj)
-        record["annotations"] = objs
-        dataset_dicts.append(record)
-
-    return dataset_dicts
-
-
-def get_lvis_instances_meta(dataset_name):
-    """
-    Load LVIS metadata.
-
-    Args:
-        dataset_name (str): LVIS dataset name without the split name (e.g., "lvis_v0.5").
-
-    Returns:
-        dict: LVIS metadata with keys: thing_classes
-    """
-    if "cocofied" in dataset_name:
-        return _get_coco_instances_meta()
-    if "v0.5" in dataset_name:
-        return _get_lvis_instances_meta_v0_5()
-    elif "v1" in dataset_name:
-        return _get_lvis_instances_meta_v1()
-    raise ValueError("No built-in metadata for dataset {}".format(dataset_name))
-
-
-def _get_lvis_instances_meta_v0_5():
-    assert len(LVIS_V0_5_CATEGORIES) == 1230
-    cat_ids = [k["id"] for k in LVIS_V0_5_CATEGORIES]
-    assert min(cat_ids) == 1 and max(cat_ids) == len(
-        cat_ids
-    ), "Category ids are not in [1, #categories], as expected"
-    # Ensure that the category list is sorted by id
-    lvis_categories = sorted(LVIS_V0_5_CATEGORIES, key=lambda x: x["id"])
-    thing_classes = [k["synonyms"][0] for k in lvis_categories]
-    meta = {"thing_classes": thing_classes}
-    return meta
-
-
-def _get_lvis_instances_meta_v1():
-    assert len(LVIS_V1_CATEGORIES) == 1203
-    cat_ids = [k["id"] for k in LVIS_V1_CATEGORIES]
-    assert min(cat_ids) == 1 and max(cat_ids) == len(
-        cat_ids
-    ), "Category ids are not in [1, #categories], as expected"
-    # Ensure that the category list is sorted by id
-    lvis_categories = sorted(LVIS_V1_CATEGORIES, key=lambda x: x["id"])
-    thing_classes = [k["synonyms"][0] for k in lvis_categories]
-    meta = {"thing_classes": thing_classes}
-    return meta
-
-
-if __name__ == "__main__":
-    """
-    Test the LVIS json dataset loader.
-
-    Usage:
-        python -m detectron2.data.datasets.lvis \
-            path/to/json path/to/image_root dataset_name vis_limit
-    """
-    import sys
-    import numpy as np
-    from detectron2.utils.logger import setup_logger
-    from PIL import Image
-    import detectron2.data.datasets  # noqa # add pre-defined metadata
-    from detectron2.utils.visualizer import Visualizer
-
-    logger = setup_logger(name=__name__)
-    meta = MetadataCatalog.get(sys.argv[3])
-
-    dicts = load_lvis_json(sys.argv[1], sys.argv[2], sys.argv[3])
-    logger.info("Done loading {} samples.".format(len(dicts)))
-
-    dirname = "lvis-data-vis"
-    os.makedirs(dirname, exist_ok=True)
-    for d in dicts[: int(sys.argv[4])]:
-        img = np.array(Image.open(d["file_name"]))
-        visualizer = Visualizer(img, metadata=meta)
-        vis = visualizer.draw_dataset_dict(d)
-        fpath = os.path.join(dirname, os.path.basename(d["file_name"]))
-        vis.save(fpath)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py
deleted file mode 100755
index d3dab619..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v0_5_categories.py
+++ /dev/null
@@ -1,13 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# Autogen with -# with open("lvis_v0.5_val.json", "r") as f: -# a = json.load(f) -# c = a["categories"] -# for x in c: -# del x["image_count"] -# del x["instance_count"] -# LVIS_CATEGORIES = repr(c) + " # noqa" - -# fmt: off -LVIS_CATEGORIES = [{'frequency': 'r', 'id': 1, 'synset': 'acorn.n.01', 'synonyms': ['acorn'], 'def': 'nut from an oak tree', 'name': 'acorn'}, {'frequency': 'c', 'id': 2, 'synset': 'aerosol.n.02', 'synonyms': ['aerosol_can', 'spray_can'], 'def': 'a dispenser that holds a substance under pressure', 'name': 'aerosol_can'}, {'frequency': 'f', 'id': 3, 'synset': 'air_conditioner.n.01', 'synonyms': ['air_conditioner'], 'def': 'a machine that keeps air cool and dry', 'name': 'air_conditioner'}, {'frequency': 'f', 'id': 4, 'synset': 'airplane.n.01', 'synonyms': ['airplane', 'aeroplane'], 'def': 'an aircraft that has a fixed wing and is powered by propellers or jets', 'name': 'airplane'}, {'frequency': 'c', 'id': 5, 'synset': 'alarm_clock.n.01', 'synonyms': ['alarm_clock'], 'def': 'a clock that wakes a sleeper at some preset time', 'name': 'alarm_clock'}, {'frequency': 'c', 'id': 6, 'synset': 'alcohol.n.01', 'synonyms': ['alcohol', 'alcoholic_beverage'], 'def': 'a liquor or brew containing alcohol as the active agent', 'name': 'alcohol'}, {'frequency': 'r', 'id': 7, 'synset': 'alligator.n.02', 'synonyms': ['alligator', 'gator'], 'def': 'amphibious reptiles related to crocodiles but with shorter broader snouts', 'name': 'alligator'}, {'frequency': 'c', 'id': 8, 'synset': 'almond.n.02', 'synonyms': ['almond'], 'def': 'oval-shaped edible seed of the almond tree', 'name': 'almond'}, {'frequency': 'c', 'id': 9, 'synset': 'ambulance.n.01', 'synonyms': ['ambulance'], 'def': 'a vehicle that takes people to and from hospitals', 'name': 'ambulance'}, {'frequency': 'r', 'id': 10, 'synset': 'amplifier.n.01', 'synonyms': ['amplifier'], 'def': 'electronic equipment that increases strength of signals', 'name': 'amplifier'}, {'frequency': 'c', 'id': 11, 'synset': 'anklet.n.03', 'synonyms': ['anklet', 'ankle_bracelet'], 'def': 'an ornament worn around the ankle', 'name': 'anklet'}, {'frequency': 'f', 'id': 12, 'synset': 'antenna.n.01', 'synonyms': ['antenna', 'aerial', 'transmitting_aerial'], 'def': 'an electrical device that sends or receives radio or television signals', 'name': 'antenna'}, {'frequency': 'f', 'id': 13, 'synset': 'apple.n.01', 'synonyms': ['apple'], 'def': 'fruit with red or yellow or green skin and sweet to tart crisp whitish flesh', 'name': 'apple'}, {'frequency': 'r', 'id': 14, 'synset': 'apple_juice.n.01', 'synonyms': ['apple_juice'], 'def': 'the juice of apples', 'name': 'apple_juice'}, {'frequency': 'r', 'id': 15, 'synset': 'applesauce.n.01', 'synonyms': ['applesauce'], 'def': 'puree of stewed apples usually sweetened and spiced', 'name': 'applesauce'}, {'frequency': 'r', 'id': 16, 'synset': 'apricot.n.02', 'synonyms': ['apricot'], 'def': 'downy yellow to rosy-colored fruit resembling a small peach', 'name': 'apricot'}, {'frequency': 'f', 'id': 17, 'synset': 'apron.n.01', 'synonyms': ['apron'], 'def': 'a garment of cloth that is tied about the waist and worn to protect clothing', 'name': 'apron'}, {'frequency': 'c', 'id': 18, 'synset': 'aquarium.n.01', 'synonyms': ['aquarium', 'fish_tank'], 'def': 'a tank/pool/bowl filled with water for keeping live fish and underwater animals', 'name': 'aquarium'}, {'frequency': 'c', 'id': 19, 'synset': 'armband.n.02', 'synonyms': ['armband'], 'def': 'a band worn around the upper arm', 'name': 'armband'}, {'frequency': 'f', 'id': 20, 
'synset': 'armchair.n.01', 'synonyms': ['armchair'], 'def': 'chair with a support on each side for arms', 'name': 'armchair'}, {'frequency': 'r', 'id': 21, 'synset': 'armoire.n.01', 'synonyms': ['armoire'], 'def': 'a large wardrobe or cabinet', 'name': 'armoire'}, {'frequency': 'r', 'id': 22, 'synset': 'armor.n.01', 'synonyms': ['armor', 'armour'], 'def': 'protective covering made of metal and used in combat', 'name': 'armor'}, {'frequency': 'c', 'id': 23, 'synset': 'artichoke.n.02', 'synonyms': ['artichoke'], 'def': 'a thistlelike flower head with edible fleshy leaves and heart', 'name': 'artichoke'}, {'frequency': 'f', 'id': 24, 'synset': 'ashcan.n.01', 'synonyms': ['trash_can', 'garbage_can', 'wastebin', 'dustbin', 'trash_barrel', 'trash_bin'], 'def': 'a bin that holds rubbish until it is collected', 'name': 'trash_can'}, {'frequency': 'c', 'id': 25, 'synset': 'ashtray.n.01', 'synonyms': ['ashtray'], 'def': "a receptacle for the ash from smokers' cigars or cigarettes", 'name': 'ashtray'}, {'frequency': 'c', 'id': 26, 'synset': 'asparagus.n.02', 'synonyms': ['asparagus'], 'def': 'edible young shoots of the asparagus plant', 'name': 'asparagus'}, {'frequency': 'c', 'id': 27, 'synset': 'atomizer.n.01', 'synonyms': ['atomizer', 'atomiser', 'spray', 'sprayer', 'nebulizer', 'nebuliser'], 'def': 'a dispenser that turns a liquid (such as perfume) into a fine mist', 'name': 'atomizer'}, {'frequency': 'c', 'id': 28, 'synset': 'avocado.n.01', 'synonyms': ['avocado'], 'def': 'a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed', 'name': 'avocado'}, {'frequency': 'c', 'id': 29, 'synset': 'award.n.02', 'synonyms': ['award', 'accolade'], 'def': 'a tangible symbol signifying approval or distinction', 'name': 'award'}, {'frequency': 'f', 'id': 30, 'synset': 'awning.n.01', 'synonyms': ['awning'], 'def': 'a canopy made of canvas to shelter people or things from rain or sun', 'name': 'awning'}, {'frequency': 'r', 'id': 31, 'synset': 'ax.n.01', 'synonyms': ['ax', 'axe'], 'def': 'an edge tool with a heavy bladed head mounted across a handle', 'name': 'ax'}, {'frequency': 'f', 'id': 32, 'synset': 'baby_buggy.n.01', 'synonyms': ['baby_buggy', 'baby_carriage', 'perambulator', 'pram', 'stroller'], 'def': 'a small vehicle with four wheels in which a baby or child is pushed around', 'name': 'baby_buggy'}, {'frequency': 'c', 'id': 33, 'synset': 'backboard.n.01', 'synonyms': ['basketball_backboard'], 'def': 'a raised vertical board with basket attached; used to play basketball', 'name': 'basketball_backboard'}, {'frequency': 'f', 'id': 34, 'synset': 'backpack.n.01', 'synonyms': ['backpack', 'knapsack', 'packsack', 'rucksack', 'haversack'], 'def': 'a bag carried by a strap on your back or shoulder', 'name': 'backpack'}, {'frequency': 'f', 'id': 35, 'synset': 'bag.n.04', 'synonyms': ['handbag', 'purse', 'pocketbook'], 'def': 'a container used for carrying money and small personal items or accessories', 'name': 'handbag'}, {'frequency': 'f', 'id': 36, 'synset': 'bag.n.06', 'synonyms': ['suitcase', 'baggage', 'luggage'], 'def': 'cases used to carry belongings when traveling', 'name': 'suitcase'}, {'frequency': 'c', 'id': 37, 'synset': 'bagel.n.01', 'synonyms': ['bagel', 'beigel'], 'def': 'glazed yeast-raised doughnut-shaped roll with hard crust', 'name': 'bagel'}, {'frequency': 'r', 'id': 38, 'synset': 'bagpipe.n.01', 'synonyms': ['bagpipe'], 'def': 'a tubular wind instrument; the player blows air into a bag and squeezes it out', 'name': 'bagpipe'}, {'frequency': 'r', 
'id': 39, 'synset': 'baguet.n.01', 'synonyms': ['baguet', 'baguette'], 'def': 'narrow French stick loaf', 'name': 'baguet'}, {'frequency': 'r', 'id': 40, 'synset': 'bait.n.02', 'synonyms': ['bait', 'lure'], 'def': 'something used to lure fish or other animals into danger so they can be trapped or killed', 'name': 'bait'}, {'frequency': 'f', 'id': 41, 'synset': 'ball.n.06', 'synonyms': ['ball'], 'def': 'a spherical object used as a plaything', 'name': 'ball'}, {'frequency': 'r', 'id': 42, 'synset': 'ballet_skirt.n.01', 'synonyms': ['ballet_skirt', 'tutu'], 'def': 'very short skirt worn by ballerinas', 'name': 'ballet_skirt'}, {'frequency': 'f', 'id': 43, 'synset': 'balloon.n.01', 'synonyms': ['balloon'], 'def': 'large tough nonrigid bag filled with gas or heated air', 'name': 'balloon'}, {'frequency': 'c', 'id': 44, 'synset': 'bamboo.n.02', 'synonyms': ['bamboo'], 'def': 'woody tropical grass having hollow woody stems', 'name': 'bamboo'}, {'frequency': 'f', 'id': 45, 'synset': 'banana.n.02', 'synonyms': ['banana'], 'def': 'elongated crescent-shaped yellow fruit with soft sweet flesh', 'name': 'banana'}, {'frequency': 'r', 'id': 46, 'synset': 'band_aid.n.01', 'synonyms': ['Band_Aid'], 'def': 'trade name for an adhesive bandage to cover small cuts or blisters', 'name': 'Band_Aid'}, {'frequency': 'c', 'id': 47, 'synset': 'bandage.n.01', 'synonyms': ['bandage'], 'def': 'a piece of soft material that covers and protects an injured part of the body', 'name': 'bandage'}, {'frequency': 'c', 'id': 48, 'synset': 'bandanna.n.01', 'synonyms': ['bandanna', 'bandana'], 'def': 'large and brightly colored handkerchief; often used as a neckerchief', 'name': 'bandanna'}, {'frequency': 'r', 'id': 49, 'synset': 'banjo.n.01', 'synonyms': ['banjo'], 'def': 'a stringed instrument of the guitar family with a long neck and circular body', 'name': 'banjo'}, {'frequency': 'f', 'id': 50, 'synset': 'banner.n.01', 'synonyms': ['banner', 'streamer'], 'def': 'long strip of cloth or paper used for decoration or advertising', 'name': 'banner'}, {'frequency': 'r', 'id': 51, 'synset': 'barbell.n.01', 'synonyms': ['barbell'], 'def': 'a bar to which heavy discs are attached at each end; used in weightlifting', 'name': 'barbell'}, {'frequency': 'r', 'id': 52, 'synset': 'barge.n.01', 'synonyms': ['barge'], 'def': 'a flatbottom boat for carrying heavy loads (especially on canals)', 'name': 'barge'}, {'frequency': 'f', 'id': 53, 'synset': 'barrel.n.02', 'synonyms': ['barrel', 'cask'], 'def': 'a cylindrical container that holds liquids', 'name': 'barrel'}, {'frequency': 'c', 'id': 54, 'synset': 'barrette.n.01', 'synonyms': ['barrette'], 'def': "a pin for holding women's hair in place", 'name': 'barrette'}, {'frequency': 'c', 'id': 55, 'synset': 'barrow.n.03', 'synonyms': ['barrow', 'garden_cart', 'lawn_cart', 'wheelbarrow'], 'def': 'a cart for carrying small loads; has handles and one or more wheels', 'name': 'barrow'}, {'frequency': 'f', 'id': 56, 'synset': 'base.n.03', 'synonyms': ['baseball_base'], 'def': 'a place that the runner must touch before scoring', 'name': 'baseball_base'}, {'frequency': 'f', 'id': 57, 'synset': 'baseball.n.02', 'synonyms': ['baseball'], 'def': 'a ball used in playing baseball', 'name': 'baseball'}, {'frequency': 'f', 'id': 58, 'synset': 'baseball_bat.n.01', 'synonyms': ['baseball_bat'], 'def': 'an implement used in baseball by the batter', 'name': 'baseball_bat'}, {'frequency': 'f', 'id': 59, 'synset': 'baseball_cap.n.01', 'synonyms': ['baseball_cap', 'jockey_cap', 'golf_cap'], 'def': 'a cap with a 
bill', 'name': 'baseball_cap'}, {'frequency': 'f', 'id': 60, 'synset': 'baseball_glove.n.01', 'synonyms': ['baseball_glove', 'baseball_mitt'], 'def': 'the handwear used by fielders in playing baseball', 'name': 'baseball_glove'}, {'frequency': 'f', 'id': 61, 'synset': 'basket.n.01', 'synonyms': ['basket', 'handbasket'], 'def': 'a container that is usually woven and has handles', 'name': 'basket'}, {'frequency': 'c', 'id': 62, 'synset': 'basket.n.03', 'synonyms': ['basketball_hoop'], 'def': 'metal hoop supporting a net through which players try to throw the basketball', 'name': 'basketball_hoop'}, {'frequency': 'c', 'id': 63, 'synset': 'basketball.n.02', 'synonyms': ['basketball'], 'def': 'an inflated ball used in playing basketball', 'name': 'basketball'}, {'frequency': 'r', 'id': 64, 'synset': 'bass_horn.n.01', 'synonyms': ['bass_horn', 'sousaphone', 'tuba'], 'def': 'the lowest brass wind instrument', 'name': 'bass_horn'}, {'frequency': 'r', 'id': 65, 'synset': 'bat.n.01', 'synonyms': ['bat_(animal)'], 'def': 'nocturnal mouselike mammal with forelimbs modified to form membranous wings', 'name': 'bat_(animal)'}, {'frequency': 'f', 'id': 66, 'synset': 'bath_mat.n.01', 'synonyms': ['bath_mat'], 'def': 'a heavy towel or mat to stand on while drying yourself after a bath', 'name': 'bath_mat'}, {'frequency': 'f', 'id': 67, 'synset': 'bath_towel.n.01', 'synonyms': ['bath_towel'], 'def': 'a large towel; to dry yourself after a bath', 'name': 'bath_towel'}, {'frequency': 'c', 'id': 68, 'synset': 'bathrobe.n.01', 'synonyms': ['bathrobe'], 'def': 'a loose-fitting robe of towelling; worn after a bath or swim', 'name': 'bathrobe'}, {'frequency': 'f', 'id': 69, 'synset': 'bathtub.n.01', 'synonyms': ['bathtub', 'bathing_tub'], 'def': 'a large open container that you fill with water and use to wash the body', 'name': 'bathtub'}, {'frequency': 'r', 'id': 70, 'synset': 'batter.n.02', 'synonyms': ['batter_(food)'], 'def': 'a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking', 'name': 'batter_(food)'}, {'frequency': 'c', 'id': 71, 'synset': 'battery.n.02', 'synonyms': ['battery'], 'def': 'a portable device that produces electricity', 'name': 'battery'}, {'frequency': 'r', 'id': 72, 'synset': 'beach_ball.n.01', 'synonyms': ['beachball'], 'def': 'large and light ball; for play at the seaside', 'name': 'beachball'}, {'frequency': 'c', 'id': 73, 'synset': 'bead.n.01', 'synonyms': ['bead'], 'def': 'a small ball with a hole through the middle used for ornamentation, jewellery, etc.', 'name': 'bead'}, {'frequency': 'r', 'id': 74, 'synset': 'beaker.n.01', 'synonyms': ['beaker'], 'def': 'a flatbottomed jar made of glass or plastic; used for chemistry', 'name': 'beaker'}, {'frequency': 'c', 'id': 75, 'synset': 'bean_curd.n.01', 'synonyms': ['bean_curd', 'tofu'], 'def': 'cheeselike food made of curdled soybean milk', 'name': 'bean_curd'}, {'frequency': 'c', 'id': 76, 'synset': 'beanbag.n.01', 'synonyms': ['beanbag'], 'def': 'a bag filled with dried beans or similar items; used in games or to sit on', 'name': 'beanbag'}, {'frequency': 'f', 'id': 77, 'synset': 'beanie.n.01', 'synonyms': ['beanie', 'beany'], 'def': 'a small skullcap; formerly worn by schoolboys and college freshmen', 'name': 'beanie'}, {'frequency': 'f', 'id': 78, 'synset': 'bear.n.01', 'synonyms': ['bear'], 'def': 'large carnivorous or omnivorous mammals with shaggy coats and claws', 'name': 'bear'}, {'frequency': 'f', 'id': 79, 'synset': 'bed.n.01', 'synonyms': ['bed'], 'def': 'a piece of furniture that provides a place to 
sleep', 'name': 'bed'}, {'frequency': 'c', 'id': 80, 'synset': 'bedspread.n.01', 'synonyms': ['bedspread', 'bedcover', 'bed_covering', 'counterpane', 'spread'], 'def': 'decorative cover for a bed', 'name': 'bedspread'}, {'frequency': 'f', 'id': 81, 'synset': 'beef.n.01', 'synonyms': ['cow'], 'def': 'cattle that are reared for their meat', 'name': 'cow'}, {'frequency': 'c', 'id': 82, 'synset': 'beef.n.02', 'synonyms': ['beef_(food)', 'boeuf_(food)'], 'def': 'meat from an adult domestic bovine', 'name': 'beef_(food)'}, {'frequency': 'r', 'id': 83, 'synset': 'beeper.n.01', 'synonyms': ['beeper', 'pager'], 'def': 'an device that beeps when the person carrying it is being paged', 'name': 'beeper'}, {'frequency': 'f', 'id': 84, 'synset': 'beer_bottle.n.01', 'synonyms': ['beer_bottle'], 'def': 'a bottle that holds beer', 'name': 'beer_bottle'}, {'frequency': 'c', 'id': 85, 'synset': 'beer_can.n.01', 'synonyms': ['beer_can'], 'def': 'a can that holds beer', 'name': 'beer_can'}, {'frequency': 'r', 'id': 86, 'synset': 'beetle.n.01', 'synonyms': ['beetle'], 'def': 'insect with hard wing covers', 'name': 'beetle'}, {'frequency': 'f', 'id': 87, 'synset': 'bell.n.01', 'synonyms': ['bell'], 'def': 'a hollow device made of metal that makes a ringing sound when struck', 'name': 'bell'}, {'frequency': 'f', 'id': 88, 'synset': 'bell_pepper.n.02', 'synonyms': ['bell_pepper', 'capsicum'], 'def': 'large bell-shaped sweet pepper in green or red or yellow or orange or black varieties', 'name': 'bell_pepper'}, {'frequency': 'f', 'id': 89, 'synset': 'belt.n.02', 'synonyms': ['belt'], 'def': 'a band to tie or buckle around the body (usually at the waist)', 'name': 'belt'}, {'frequency': 'f', 'id': 90, 'synset': 'belt_buckle.n.01', 'synonyms': ['belt_buckle'], 'def': 'the buckle used to fasten a belt', 'name': 'belt_buckle'}, {'frequency': 'f', 'id': 91, 'synset': 'bench.n.01', 'synonyms': ['bench'], 'def': 'a long seat for more than one person', 'name': 'bench'}, {'frequency': 'c', 'id': 92, 'synset': 'beret.n.01', 'synonyms': ['beret'], 'def': 'a cap with no brim or bill; made of soft cloth', 'name': 'beret'}, {'frequency': 'c', 'id': 93, 'synset': 'bib.n.02', 'synonyms': ['bib'], 'def': 'a napkin tied under the chin of a child while eating', 'name': 'bib'}, {'frequency': 'r', 'id': 94, 'synset': 'bible.n.01', 'synonyms': ['Bible'], 'def': 'the sacred writings of the Christian religions', 'name': 'Bible'}, {'frequency': 'f', 'id': 95, 'synset': 'bicycle.n.01', 'synonyms': ['bicycle', 'bike_(bicycle)'], 'def': 'a wheeled vehicle that has two wheels and is moved by foot pedals', 'name': 'bicycle'}, {'frequency': 'f', 'id': 96, 'synset': 'bill.n.09', 'synonyms': ['visor', 'vizor'], 'def': 'a brim that projects to the front to shade the eyes', 'name': 'visor'}, {'frequency': 'c', 'id': 97, 'synset': 'binder.n.03', 'synonyms': ['binder', 'ring-binder'], 'def': 'holds loose papers or magazines', 'name': 'binder'}, {'frequency': 'c', 'id': 98, 'synset': 'binoculars.n.01', 'synonyms': ['binoculars', 'field_glasses', 'opera_glasses'], 'def': 'an optical instrument designed for simultaneous use by both eyes', 'name': 'binoculars'}, {'frequency': 'f', 'id': 99, 'synset': 'bird.n.01', 'synonyms': ['bird'], 'def': 'animal characterized by feathers and wings', 'name': 'bird'}, {'frequency': 'r', 'id': 100, 'synset': 'bird_feeder.n.01', 'synonyms': ['birdfeeder'], 'def': 'an outdoor device that supplies food for wild birds', 'name': 'birdfeeder'}, {'frequency': 'r', 'id': 101, 'synset': 'birdbath.n.01', 'synonyms': ['birdbath'], 
'def': 'an ornamental basin (usually in a garden) for birds to bathe in', 'name': 'birdbath'}, {'frequency': 'c', 'id': 102, 'synset': 'birdcage.n.01', 'synonyms': ['birdcage'], 'def': 'a cage in which a bird can be kept', 'name': 'birdcage'}, {'frequency': 'c', 'id': 103, 'synset': 'birdhouse.n.01', 'synonyms': ['birdhouse'], 'def': 'a shelter for birds', 'name': 'birdhouse'}, {'frequency': 'f', 'id': 104, 'synset': 'birthday_cake.n.01', 'synonyms': ['birthday_cake'], 'def': 'decorated cake served at a birthday party', 'name': 'birthday_cake'}, {'frequency': 'r', 'id': 105, 'synset': 'birthday_card.n.01', 'synonyms': ['birthday_card'], 'def': 'a card expressing a birthday greeting', 'name': 'birthday_card'}, {'frequency': 'r', 'id': 106, 'synset': 'biscuit.n.01', 'synonyms': ['biscuit_(bread)'], 'def': 'small round bread leavened with baking-powder or soda', 'name': 'biscuit_(bread)'}, {'frequency': 'r', 'id': 107, 'synset': 'black_flag.n.01', 'synonyms': ['pirate_flag'], 'def': 'a flag usually bearing a white skull and crossbones on a black background', 'name': 'pirate_flag'}, {'frequency': 'c', 'id': 108, 'synset': 'black_sheep.n.02', 'synonyms': ['black_sheep'], 'def': 'sheep with a black coat', 'name': 'black_sheep'}, {'frequency': 'c', 'id': 109, 'synset': 'blackboard.n.01', 'synonyms': ['blackboard', 'chalkboard'], 'def': 'sheet of slate; for writing with chalk', 'name': 'blackboard'}, {'frequency': 'f', 'id': 110, 'synset': 'blanket.n.01', 'synonyms': ['blanket'], 'def': 'bedding that keeps a person warm in bed', 'name': 'blanket'}, {'frequency': 'c', 'id': 111, 'synset': 'blazer.n.01', 'synonyms': ['blazer', 'sport_jacket', 'sport_coat', 'sports_jacket', 'sports_coat'], 'def': 'lightweight jacket; often striped in the colors of a club or school', 'name': 'blazer'}, {'frequency': 'f', 'id': 112, 'synset': 'blender.n.01', 'synonyms': ['blender', 'liquidizer', 'liquidiser'], 'def': 'an electrically powered mixer that mix or chop or liquefy foods', 'name': 'blender'}, {'frequency': 'r', 'id': 113, 'synset': 'blimp.n.02', 'synonyms': ['blimp'], 'def': 'a small nonrigid airship used for observation or as a barrage balloon', 'name': 'blimp'}, {'frequency': 'c', 'id': 114, 'synset': 'blinker.n.01', 'synonyms': ['blinker', 'flasher'], 'def': 'a light that flashes on and off; used as a signal or to send messages', 'name': 'blinker'}, {'frequency': 'c', 'id': 115, 'synset': 'blueberry.n.02', 'synonyms': ['blueberry'], 'def': 'sweet edible dark-blue berries of blueberry plants', 'name': 'blueberry'}, {'frequency': 'r', 'id': 116, 'synset': 'boar.n.02', 'synonyms': ['boar'], 'def': 'an uncastrated male hog', 'name': 'boar'}, {'frequency': 'r', 'id': 117, 'synset': 'board.n.09', 'synonyms': ['gameboard'], 'def': 'a flat portable surface (usually rectangular) designed for board games', 'name': 'gameboard'}, {'frequency': 'f', 'id': 118, 'synset': 'boat.n.01', 'synonyms': ['boat', 'ship_(boat)'], 'def': 'a vessel for travel on water', 'name': 'boat'}, {'frequency': 'c', 'id': 119, 'synset': 'bobbin.n.01', 'synonyms': ['bobbin', 'spool', 'reel'], 'def': 'a thing around which thread/tape/film or other flexible materials can be wound', 'name': 'bobbin'}, {'frequency': 'r', 'id': 120, 'synset': 'bobby_pin.n.01', 'synonyms': ['bobby_pin', 'hairgrip'], 'def': 'a flat wire hairpin used to hold bobbed hair in place', 'name': 'bobby_pin'}, {'frequency': 'c', 'id': 121, 'synset': 'boiled_egg.n.01', 'synonyms': ['boiled_egg', 'coddled_egg'], 'def': 'egg cooked briefly in the shell in gently boiling water', 
'name': 'boiled_egg'}, {'frequency': 'r', 'id': 122, 'synset': 'bolo_tie.n.01', 'synonyms': ['bolo_tie', 'bolo', 'bola_tie', 'bola'], 'def': 'a cord fastened around the neck with an ornamental clasp and worn as a necktie', 'name': 'bolo_tie'}, {'frequency': 'c', 'id': 123, 'synset': 'bolt.n.03', 'synonyms': ['deadbolt'], 'def': 'the part of a lock that is engaged or withdrawn with a key', 'name': 'deadbolt'}, {'frequency': 'f', 'id': 124, 'synset': 'bolt.n.06', 'synonyms': ['bolt'], 'def': 'a screw that screws into a nut to form a fastener', 'name': 'bolt'}, {'frequency': 'r', 'id': 125, 'synset': 'bonnet.n.01', 'synonyms': ['bonnet'], 'def': 'a hat tied under the chin', 'name': 'bonnet'}, {'frequency': 'f', 'id': 126, 'synset': 'book.n.01', 'synonyms': ['book'], 'def': 'a written work or composition that has been published', 'name': 'book'}, {'frequency': 'r', 'id': 127, 'synset': 'book_bag.n.01', 'synonyms': ['book_bag'], 'def': 'a bag in which students carry their books', 'name': 'book_bag'}, {'frequency': 'c', 'id': 128, 'synset': 'bookcase.n.01', 'synonyms': ['bookcase'], 'def': 'a piece of furniture with shelves for storing books', 'name': 'bookcase'}, {'frequency': 'c', 'id': 129, 'synset': 'booklet.n.01', 'synonyms': ['booklet', 'brochure', 'leaflet', 'pamphlet'], 'def': 'a small book usually having a paper cover', 'name': 'booklet'}, {'frequency': 'r', 'id': 130, 'synset': 'bookmark.n.01', 'synonyms': ['bookmark', 'bookmarker'], 'def': 'a marker (a piece of paper or ribbon) placed between the pages of a book', 'name': 'bookmark'}, {'frequency': 'r', 'id': 131, 'synset': 'boom.n.04', 'synonyms': ['boom_microphone', 'microphone_boom'], 'def': 'a pole carrying an overhead microphone projected over a film or tv set', 'name': 'boom_microphone'}, {'frequency': 'f', 'id': 132, 'synset': 'boot.n.01', 'synonyms': ['boot'], 'def': 'footwear that covers the whole foot and lower leg', 'name': 'boot'}, {'frequency': 'f', 'id': 133, 'synset': 'bottle.n.01', 'synonyms': ['bottle'], 'def': 'a glass or plastic vessel used for storing drinks or other liquids', 'name': 'bottle'}, {'frequency': 'c', 'id': 134, 'synset': 'bottle_opener.n.01', 'synonyms': ['bottle_opener'], 'def': 'an opener for removing caps or corks from bottles', 'name': 'bottle_opener'}, {'frequency': 'c', 'id': 135, 'synset': 'bouquet.n.01', 'synonyms': ['bouquet'], 'def': 'an arrangement of flowers that is usually given as a present', 'name': 'bouquet'}, {'frequency': 'r', 'id': 136, 'synset': 'bow.n.04', 'synonyms': ['bow_(weapon)'], 'def': 'a weapon for shooting arrows', 'name': 'bow_(weapon)'}, {'frequency': 'f', 'id': 137, 'synset': 'bow.n.08', 'synonyms': ['bow_(decorative_ribbons)'], 'def': 'a decorative interlacing of ribbons', 'name': 'bow_(decorative_ribbons)'}, {'frequency': 'f', 'id': 138, 'synset': 'bow_tie.n.01', 'synonyms': ['bow-tie', 'bowtie'], 'def': "a man's tie that ties in a bow", 'name': 'bow-tie'}, {'frequency': 'f', 'id': 139, 'synset': 'bowl.n.03', 'synonyms': ['bowl'], 'def': 'a dish that is round and open at the top for serving foods', 'name': 'bowl'}, {'frequency': 'r', 'id': 140, 'synset': 'bowl.n.08', 'synonyms': ['pipe_bowl'], 'def': 'a small round container that is open at the top for holding tobacco', 'name': 'pipe_bowl'}, {'frequency': 'c', 'id': 141, 'synset': 'bowler_hat.n.01', 'synonyms': ['bowler_hat', 'bowler', 'derby_hat', 'derby', 'plug_hat'], 'def': 'a felt hat that is round and hard with a narrow brim', 'name': 'bowler_hat'}, {'frequency': 'r', 'id': 142, 'synset': 'bowling_ball.n.01', 
'synonyms': ['bowling_ball'], 'def': 'a large ball with finger holes used in the sport of bowling', 'name': 'bowling_ball'}, {'frequency': 'r', 'id': 143, 'synset': 'bowling_pin.n.01', 'synonyms': ['bowling_pin'], 'def': 'a club-shaped wooden object used in bowling', 'name': 'bowling_pin'}, {'frequency': 'r', 'id': 144, 'synset': 'boxing_glove.n.01', 'synonyms': ['boxing_glove'], 'def': 'large glove coverings the fists of a fighter worn for the sport of boxing', 'name': 'boxing_glove'}, {'frequency': 'c', 'id': 145, 'synset': 'brace.n.06', 'synonyms': ['suspenders'], 'def': 'elastic straps that hold trousers up (usually used in the plural)', 'name': 'suspenders'}, {'frequency': 'f', 'id': 146, 'synset': 'bracelet.n.02', 'synonyms': ['bracelet', 'bangle'], 'def': 'jewelry worn around the wrist for decoration', 'name': 'bracelet'}, {'frequency': 'r', 'id': 147, 'synset': 'brass.n.07', 'synonyms': ['brass_plaque'], 'def': 'a memorial made of brass', 'name': 'brass_plaque'}, {'frequency': 'c', 'id': 148, 'synset': 'brassiere.n.01', 'synonyms': ['brassiere', 'bra', 'bandeau'], 'def': 'an undergarment worn by women to support their breasts', 'name': 'brassiere'}, {'frequency': 'c', 'id': 149, 'synset': 'bread-bin.n.01', 'synonyms': ['bread-bin', 'breadbox'], 'def': 'a container used to keep bread or cake in', 'name': 'bread-bin'}, {'frequency': 'r', 'id': 150, 'synset': 'breechcloth.n.01', 'synonyms': ['breechcloth', 'breechclout', 'loincloth'], 'def': 'a garment that provides covering for the loins', 'name': 'breechcloth'}, {'frequency': 'c', 'id': 151, 'synset': 'bridal_gown.n.01', 'synonyms': ['bridal_gown', 'wedding_gown', 'wedding_dress'], 'def': 'a gown worn by the bride at a wedding', 'name': 'bridal_gown'}, {'frequency': 'c', 'id': 152, 'synset': 'briefcase.n.01', 'synonyms': ['briefcase'], 'def': 'a case with a handle; for carrying papers or files or books', 'name': 'briefcase'}, {'frequency': 'c', 'id': 153, 'synset': 'bristle_brush.n.01', 'synonyms': ['bristle_brush'], 'def': 'a brush that is made with the short stiff hairs of an animal or plant', 'name': 'bristle_brush'}, {'frequency': 'f', 'id': 154, 'synset': 'broccoli.n.01', 'synonyms': ['broccoli'], 'def': 'plant with dense clusters of tight green flower buds', 'name': 'broccoli'}, {'frequency': 'r', 'id': 155, 'synset': 'brooch.n.01', 'synonyms': ['broach'], 'def': 'a decorative pin worn by women', 'name': 'broach'}, {'frequency': 'c', 'id': 156, 'synset': 'broom.n.01', 'synonyms': ['broom'], 'def': 'bundle of straws or twigs attached to a long handle; used for cleaning', 'name': 'broom'}, {'frequency': 'c', 'id': 157, 'synset': 'brownie.n.03', 'synonyms': ['brownie'], 'def': 'square or bar of very rich chocolate cake usually with nuts', 'name': 'brownie'}, {'frequency': 'c', 'id': 158, 'synset': 'brussels_sprouts.n.01', 'synonyms': ['brussels_sprouts'], 'def': 'the small edible cabbage-like buds growing along a stalk', 'name': 'brussels_sprouts'}, {'frequency': 'r', 'id': 159, 'synset': 'bubble_gum.n.01', 'synonyms': ['bubble_gum'], 'def': 'a kind of chewing gum that can be blown into bubbles', 'name': 'bubble_gum'}, {'frequency': 'f', 'id': 160, 'synset': 'bucket.n.01', 'synonyms': ['bucket', 'pail'], 'def': 'a roughly cylindrical vessel that is open at the top', 'name': 'bucket'}, {'frequency': 'r', 'id': 161, 'synset': 'buggy.n.01', 'synonyms': ['horse_buggy'], 'def': 'a small lightweight carriage; drawn by a single horse', 'name': 'horse_buggy'}, {'frequency': 'c', 'id': 162, 'synset': 'bull.n.11', 'synonyms': ['bull'], 
'def': 'mature male cow', 'name': 'bull'}, {'frequency': 'r', 'id': 163, 'synset': 'bulldog.n.01', 'synonyms': ['bulldog'], 'def': 'a thickset short-haired dog with a large head and strong undershot lower jaw', 'name': 'bulldog'}, {'frequency': 'r', 'id': 164, 'synset': 'bulldozer.n.01', 'synonyms': ['bulldozer', 'dozer'], 'def': 'large powerful tractor; a large blade in front flattens areas of ground', 'name': 'bulldozer'}, {'frequency': 'c', 'id': 165, 'synset': 'bullet_train.n.01', 'synonyms': ['bullet_train'], 'def': 'a high-speed passenger train', 'name': 'bullet_train'}, {'frequency': 'c', 'id': 166, 'synset': 'bulletin_board.n.02', 'synonyms': ['bulletin_board', 'notice_board'], 'def': 'a board that hangs on a wall; displays announcements', 'name': 'bulletin_board'}, {'frequency': 'r', 'id': 167, 'synset': 'bulletproof_vest.n.01', 'synonyms': ['bulletproof_vest'], 'def': 'a vest capable of resisting the impact of a bullet', 'name': 'bulletproof_vest'}, {'frequency': 'c', 'id': 168, 'synset': 'bullhorn.n.01', 'synonyms': ['bullhorn', 'megaphone'], 'def': 'a portable loudspeaker with built-in microphone and amplifier', 'name': 'bullhorn'}, {'frequency': 'r', 'id': 169, 'synset': 'bully_beef.n.01', 'synonyms': ['corned_beef', 'corn_beef'], 'def': 'beef cured or pickled in brine', 'name': 'corned_beef'}, {'frequency': 'f', 'id': 170, 'synset': 'bun.n.01', 'synonyms': ['bun', 'roll'], 'def': 'small rounded bread either plain or sweet', 'name': 'bun'}, {'frequency': 'c', 'id': 171, 'synset': 'bunk_bed.n.01', 'synonyms': ['bunk_bed'], 'def': 'beds built one above the other', 'name': 'bunk_bed'}, {'frequency': 'f', 'id': 172, 'synset': 'buoy.n.01', 'synonyms': ['buoy'], 'def': 'a float attached by rope to the seabed to mark channels in a harbor or underwater hazards', 'name': 'buoy'}, {'frequency': 'r', 'id': 173, 'synset': 'burrito.n.01', 'synonyms': ['burrito'], 'def': 'a flour tortilla folded around a filling', 'name': 'burrito'}, {'frequency': 'f', 'id': 174, 'synset': 'bus.n.01', 'synonyms': ['bus_(vehicle)', 'autobus', 'charabanc', 'double-decker', 'motorbus', 'motorcoach'], 'def': 'a vehicle carrying many passengers; used for public transport', 'name': 'bus_(vehicle)'}, {'frequency': 'c', 'id': 175, 'synset': 'business_card.n.01', 'synonyms': ['business_card'], 'def': "a card on which are printed the person's name and business affiliation", 'name': 'business_card'}, {'frequency': 'c', 'id': 176, 'synset': 'butcher_knife.n.01', 'synonyms': ['butcher_knife'], 'def': 'a large sharp knife for cutting or trimming meat', 'name': 'butcher_knife'}, {'frequency': 'c', 'id': 177, 'synset': 'butter.n.01', 'synonyms': ['butter'], 'def': 'an edible emulsion of fat globules made by churning milk or cream; for cooking and table use', 'name': 'butter'}, {'frequency': 'c', 'id': 178, 'synset': 'butterfly.n.01', 'synonyms': ['butterfly'], 'def': 'insect typically having a slender body with knobbed antennae and broad colorful wings', 'name': 'butterfly'}, {'frequency': 'f', 'id': 179, 'synset': 'button.n.01', 'synonyms': ['button'], 'def': 'a round fastener sewn to shirts and coats etc to fit through buttonholes', 'name': 'button'}, {'frequency': 'f', 'id': 180, 'synset': 'cab.n.03', 'synonyms': ['cab_(taxi)', 'taxi', 'taxicab'], 'def': 'a car that takes passengers where they want to go in exchange for money', 'name': 'cab_(taxi)'}, {'frequency': 'r', 'id': 181, 'synset': 'cabana.n.01', 'synonyms': ['cabana'], 'def': 'a small tent used as a dressing room beside the sea or a swimming pool', 'name': 
'cabana'}, {'frequency': 'r', 'id': 182, 'synset': 'cabin_car.n.01', 'synonyms': ['cabin_car', 'caboose'], 'def': 'a car on a freight train for use of the train crew; usually the last car on the train', 'name': 'cabin_car'}, {'frequency': 'f', 'id': 183, 'synset': 'cabinet.n.01', 'synonyms': ['cabinet'], 'def': 'a piece of furniture resembling a cupboard with doors and shelves and drawers', 'name': 'cabinet'}, {'frequency': 'r', 'id': 184, 'synset': 'cabinet.n.03', 'synonyms': ['locker', 'storage_locker'], 'def': 'a storage compartment for clothes and valuables; usually it has a lock', 'name': 'locker'}, {'frequency': 'f', 'id': 185, 'synset': 'cake.n.03', 'synonyms': ['cake'], 'def': 'baked goods made from or based on a mixture of flour, sugar, eggs, and fat', 'name': 'cake'}, {'frequency': 'c', 'id': 186, 'synset': 'calculator.n.02', 'synonyms': ['calculator'], 'def': 'a small machine that is used for mathematical calculations', 'name': 'calculator'}, {'frequency': 'f', 'id': 187, 'synset': 'calendar.n.02', 'synonyms': ['calendar'], 'def': 'a list or register of events (appointments/social events/court cases, etc)', 'name': 'calendar'}, {'frequency': 'c', 'id': 188, 'synset': 'calf.n.01', 'synonyms': ['calf'], 'def': 'young of domestic cattle', 'name': 'calf'}, {'frequency': 'c', 'id': 189, 'synset': 'camcorder.n.01', 'synonyms': ['camcorder'], 'def': 'a portable television camera and videocassette recorder', 'name': 'camcorder'}, {'frequency': 'c', 'id': 190, 'synset': 'camel.n.01', 'synonyms': ['camel'], 'def': 'cud-chewing mammal used as a draft or saddle animal in desert regions', 'name': 'camel'}, {'frequency': 'f', 'id': 191, 'synset': 'camera.n.01', 'synonyms': ['camera'], 'def': 'equipment for taking photographs', 'name': 'camera'}, {'frequency': 'c', 'id': 192, 'synset': 'camera_lens.n.01', 'synonyms': ['camera_lens'], 'def': 'a lens that focuses the image in a camera', 'name': 'camera_lens'}, {'frequency': 'c', 'id': 193, 'synset': 'camper.n.02', 'synonyms': ['camper_(vehicle)', 'camping_bus', 'motor_home'], 'def': 'a recreational vehicle equipped for camping out while traveling', 'name': 'camper_(vehicle)'}, {'frequency': 'f', 'id': 194, 'synset': 'can.n.01', 'synonyms': ['can', 'tin_can'], 'def': 'airtight sealed metal container for food or drink or paint etc.', 'name': 'can'}, {'frequency': 'c', 'id': 195, 'synset': 'can_opener.n.01', 'synonyms': ['can_opener', 'tin_opener'], 'def': 'a device for cutting cans open', 'name': 'can_opener'}, {'frequency': 'r', 'id': 196, 'synset': 'candelabrum.n.01', 'synonyms': ['candelabrum', 'candelabra'], 'def': 'branched candlestick; ornamental; has several lights', 'name': 'candelabrum'}, {'frequency': 'f', 'id': 197, 'synset': 'candle.n.01', 'synonyms': ['candle', 'candlestick'], 'def': 'stick of wax with a wick in the middle', 'name': 'candle'}, {'frequency': 'f', 'id': 198, 'synset': 'candlestick.n.01', 'synonyms': ['candle_holder'], 'def': 'a holder with sockets for candles', 'name': 'candle_holder'}, {'frequency': 'r', 'id': 199, 'synset': 'candy_bar.n.01', 'synonyms': ['candy_bar'], 'def': 'a candy shaped as a bar', 'name': 'candy_bar'}, {'frequency': 'c', 'id': 200, 'synset': 'candy_cane.n.01', 'synonyms': ['candy_cane'], 'def': 'a hard candy in the shape of a rod (usually with stripes)', 'name': 'candy_cane'}, {'frequency': 'c', 'id': 201, 'synset': 'cane.n.01', 'synonyms': ['walking_cane'], 'def': 'a stick that people can lean on to help them walk', 'name': 'walking_cane'}, {'frequency': 'c', 'id': 202, 'synset': 'canister.n.02', 
'synonyms': ['canister', 'cannister'], 'def': 'metal container for storing dry foods such as tea or flour', 'name': 'canister'}, {'frequency': 'r', 'id': 203, 'synset': 'cannon.n.02', 'synonyms': ['cannon'], 'def': 'heavy gun fired from a tank', 'name': 'cannon'}, {'frequency': 'c', 'id': 204, 'synset': 'canoe.n.01', 'synonyms': ['canoe'], 'def': 'small and light boat; pointed at both ends; propelled with a paddle', 'name': 'canoe'}, {'frequency': 'r', 'id': 205, 'synset': 'cantaloup.n.02', 'synonyms': ['cantaloup', 'cantaloupe'], 'def': 'the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh', 'name': 'cantaloup'}, {'frequency': 'r', 'id': 206, 'synset': 'canteen.n.01', 'synonyms': ['canteen'], 'def': 'a flask for carrying water; used by soldiers or travelers', 'name': 'canteen'}, {'frequency': 'c', 'id': 207, 'synset': 'cap.n.01', 'synonyms': ['cap_(headwear)'], 'def': 'a tight-fitting headwear', 'name': 'cap_(headwear)'}, {'frequency': 'f', 'id': 208, 'synset': 'cap.n.02', 'synonyms': ['bottle_cap', 'cap_(container_lid)'], 'def': 'a top (as for a bottle)', 'name': 'bottle_cap'}, {'frequency': 'r', 'id': 209, 'synset': 'cape.n.02', 'synonyms': ['cape'], 'def': 'a sleeveless garment like a cloak but shorter', 'name': 'cape'}, {'frequency': 'c', 'id': 210, 'synset': 'cappuccino.n.01', 'synonyms': ['cappuccino', 'coffee_cappuccino'], 'def': 'equal parts of espresso and steamed milk', 'name': 'cappuccino'}, {'frequency': 'f', 'id': 211, 'synset': 'car.n.01', 'synonyms': ['car_(automobile)', 'auto_(automobile)', 'automobile'], 'def': 'a motor vehicle with four wheels', 'name': 'car_(automobile)'}, {'frequency': 'f', 'id': 212, 'synset': 'car.n.02', 'synonyms': ['railcar_(part_of_a_train)', 'railway_car_(part_of_a_train)', 'railroad_car_(part_of_a_train)'], 'def': 'a wheeled vehicle adapted to the rails of railroad', 'name': 'railcar_(part_of_a_train)'}, {'frequency': 'r', 'id': 213, 'synset': 'car.n.04', 'synonyms': ['elevator_car'], 'def': 'where passengers ride up and down', 'name': 'elevator_car'}, {'frequency': 'r', 'id': 214, 'synset': 'car_battery.n.01', 'synonyms': ['car_battery', 'automobile_battery'], 'def': 'a battery in a motor vehicle', 'name': 'car_battery'}, {'frequency': 'c', 'id': 215, 'synset': 'card.n.02', 'synonyms': ['identity_card'], 'def': 'a card certifying the identity of the bearer', 'name': 'identity_card'}, {'frequency': 'c', 'id': 216, 'synset': 'card.n.03', 'synonyms': ['card'], 'def': 'a rectangular piece of paper used to send messages (e.g. 
greetings or pictures)', 'name': 'card'}, {'frequency': 'r', 'id': 217, 'synset': 'cardigan.n.01', 'synonyms': ['cardigan'], 'def': 'knitted jacket that is fastened up the front with buttons or a zipper', 'name': 'cardigan'}, {'frequency': 'r', 'id': 218, 'synset': 'cargo_ship.n.01', 'synonyms': ['cargo_ship', 'cargo_vessel'], 'def': 'a ship designed to carry cargo', 'name': 'cargo_ship'}, {'frequency': 'r', 'id': 219, 'synset': 'carnation.n.01', 'synonyms': ['carnation'], 'def': 'plant with pink to purple-red spice-scented usually double flowers', 'name': 'carnation'}, {'frequency': 'c', 'id': 220, 'synset': 'carriage.n.02', 'synonyms': ['horse_carriage'], 'def': 'a vehicle with wheels drawn by one or more horses', 'name': 'horse_carriage'}, {'frequency': 'f', 'id': 221, 'synset': 'carrot.n.01', 'synonyms': ['carrot'], 'def': 'deep orange edible root of the cultivated carrot plant', 'name': 'carrot'}, {'frequency': 'c', 'id': 222, 'synset': 'carryall.n.01', 'synonyms': ['tote_bag'], 'def': 'a capacious bag or basket', 'name': 'tote_bag'}, {'frequency': 'c', 'id': 223, 'synset': 'cart.n.01', 'synonyms': ['cart'], 'def': 'a heavy open wagon usually having two wheels and drawn by an animal', 'name': 'cart'}, {'frequency': 'c', 'id': 224, 'synset': 'carton.n.02', 'synonyms': ['carton'], 'def': 'a box made of cardboard; opens by flaps on top', 'name': 'carton'}, {'frequency': 'c', 'id': 225, 'synset': 'cash_register.n.01', 'synonyms': ['cash_register', 'register_(for_cash_transactions)'], 'def': 'a cashbox with an adding machine to register transactions', 'name': 'cash_register'}, {'frequency': 'r', 'id': 226, 'synset': 'casserole.n.01', 'synonyms': ['casserole'], 'def': 'food cooked and served in a casserole', 'name': 'casserole'}, {'frequency': 'r', 'id': 227, 'synset': 'cassette.n.01', 'synonyms': ['cassette'], 'def': 'a container that holds a magnetic tape used for recording or playing sound or video', 'name': 'cassette'}, {'frequency': 'c', 'id': 228, 'synset': 'cast.n.05', 'synonyms': ['cast', 'plaster_cast', 'plaster_bandage'], 'def': 'bandage consisting of a firm covering that immobilizes broken bones while they heal', 'name': 'cast'}, {'frequency': 'f', 'id': 229, 'synset': 'cat.n.01', 'synonyms': ['cat'], 'def': 'a domestic house cat', 'name': 'cat'}, {'frequency': 'c', 'id': 230, 'synset': 'cauliflower.n.02', 'synonyms': ['cauliflower'], 'def': 'edible compact head of white undeveloped flowers', 'name': 'cauliflower'}, {'frequency': 'r', 'id': 231, 'synset': 'caviar.n.01', 'synonyms': ['caviar', 'caviare'], 'def': "salted roe of sturgeon or other large fish; usually served as an hors d'oeuvre", 'name': 'caviar'}, {'frequency': 'c', 'id': 232, 'synset': 'cayenne.n.02', 'synonyms': ['cayenne_(spice)', 'cayenne_pepper_(spice)', 'red_pepper_(spice)'], 'def': 'ground pods and seeds of pungent red peppers of the genus Capsicum', 'name': 'cayenne_(spice)'}, {'frequency': 'c', 'id': 233, 'synset': 'cd_player.n.01', 'synonyms': ['CD_player'], 'def': 'electronic equipment for playing compact discs (CDs)', 'name': 'CD_player'}, {'frequency': 'c', 'id': 234, 'synset': 'celery.n.01', 'synonyms': ['celery'], 'def': 'widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked', 'name': 'celery'}, {'frequency': 'f', 'id': 235, 'synset': 'cellular_telephone.n.01', 'synonyms': ['cellular_telephone', 'cellular_phone', 'cellphone', 'mobile_phone', 'smart_phone'], 'def': 'a hand-held mobile telephone', 'name': 'cellular_telephone'}, {'frequency': 'r', 'id': 236, 'synset': 
'chain_mail.n.01', 'synonyms': ['chain_mail', 'ring_mail', 'chain_armor', 'chain_armour', 'ring_armor', 'ring_armour'], 'def': '(Middle Ages) flexible armor made of interlinked metal rings', 'name': 'chain_mail'}, {'frequency': 'f', 'id': 237, 'synset': 'chair.n.01', 'synonyms': ['chair'], 'def': 'a seat for one person, with a support for the back', 'name': 'chair'}, {'frequency': 'r', 'id': 238, 'synset': 'chaise_longue.n.01', 'synonyms': ['chaise_longue', 'chaise', 'daybed'], 'def': 'a long chair; for reclining', 'name': 'chaise_longue'}, {'frequency': 'r', 'id': 239, 'synset': 'champagne.n.01', 'synonyms': ['champagne'], 'def': 'a white sparkling wine produced in Champagne or resembling that produced there', 'name': 'champagne'}, {'frequency': 'f', 'id': 240, 'synset': 'chandelier.n.01', 'synonyms': ['chandelier'], 'def': 'branched lighting fixture; often ornate; hangs from the ceiling', 'name': 'chandelier'}, {'frequency': 'r', 'id': 241, 'synset': 'chap.n.04', 'synonyms': ['chap'], 'def': 'leather leggings without a seat; worn over trousers by cowboys to protect their legs', 'name': 'chap'}, {'frequency': 'r', 'id': 242, 'synset': 'checkbook.n.01', 'synonyms': ['checkbook', 'chequebook'], 'def': 'a book issued to holders of checking accounts', 'name': 'checkbook'}, {'frequency': 'r', 'id': 243, 'synset': 'checkerboard.n.01', 'synonyms': ['checkerboard'], 'def': 'a board having 64 squares of two alternating colors', 'name': 'checkerboard'}, {'frequency': 'c', 'id': 244, 'synset': 'cherry.n.03', 'synonyms': ['cherry'], 'def': 'a red fruit with a single hard stone', 'name': 'cherry'}, {'frequency': 'r', 'id': 245, 'synset': 'chessboard.n.01', 'synonyms': ['chessboard'], 'def': 'a checkerboard used to play chess', 'name': 'chessboard'}, {'frequency': 'r', 'id': 246, 'synset': 'chest_of_drawers.n.01', 'synonyms': ['chest_of_drawers_(furniture)', 'bureau_(furniture)', 'chest_(furniture)'], 'def': 'furniture with drawers for keeping clothes', 'name': 'chest_of_drawers_(furniture)'}, {'frequency': 'c', 'id': 247, 'synset': 'chicken.n.02', 'synonyms': ['chicken_(animal)'], 'def': 'a domestic fowl bred for flesh or eggs', 'name': 'chicken_(animal)'}, {'frequency': 'c', 'id': 248, 'synset': 'chicken_wire.n.01', 'synonyms': ['chicken_wire'], 'def': 'a galvanized wire network with a hexagonal mesh; used to build fences', 'name': 'chicken_wire'}, {'frequency': 'r', 'id': 249, 'synset': 'chickpea.n.01', 'synonyms': ['chickpea', 'garbanzo'], 'def': 'the seed of the chickpea plant; usually dried', 'name': 'chickpea'}, {'frequency': 'r', 'id': 250, 'synset': 'chihuahua.n.03', 'synonyms': ['Chihuahua'], 'def': 'an old breed of tiny short-haired dog with protruding eyes from Mexico', 'name': 'Chihuahua'}, {'frequency': 'r', 'id': 251, 'synset': 'chili.n.02', 'synonyms': ['chili_(vegetable)', 'chili_pepper_(vegetable)', 'chilli_(vegetable)', 'chilly_(vegetable)', 'chile_(vegetable)'], 'def': 'very hot and finely tapering pepper of special pungency', 'name': 'chili_(vegetable)'}, {'frequency': 'r', 'id': 252, 'synset': 'chime.n.01', 'synonyms': ['chime', 'gong'], 'def': 'an instrument consisting of a set of bells that are struck with a hammer', 'name': 'chime'}, {'frequency': 'r', 'id': 253, 'synset': 'chinaware.n.01', 'synonyms': ['chinaware'], 'def': 'dishware made of high quality porcelain', 'name': 'chinaware'}, {'frequency': 'c', 'id': 254, 'synset': 'chip.n.04', 'synonyms': ['crisp_(potato_chip)', 'potato_chip'], 'def': 'a thin crisp slice of potato fried in deep fat', 'name': 'crisp_(potato_chip)'}, 
{'frequency': 'r', 'id': 255, 'synset': 'chip.n.06', 'synonyms': ['poker_chip'], 'def': 'a small disk-shaped counter used to represent money when gambling', 'name': 'poker_chip'}, {'frequency': 'c', 'id': 256, 'synset': 'chocolate_bar.n.01', 'synonyms': ['chocolate_bar'], 'def': 'a bar of chocolate candy', 'name': 'chocolate_bar'}, {'frequency': 'c', 'id': 257, 'synset': 'chocolate_cake.n.01', 'synonyms': ['chocolate_cake'], 'def': 'cake containing chocolate', 'name': 'chocolate_cake'}, {'frequency': 'r', 'id': 258, 'synset': 'chocolate_milk.n.01', 'synonyms': ['chocolate_milk'], 'def': 'milk flavored with chocolate syrup', 'name': 'chocolate_milk'}, {'frequency': 'r', 'id': 259, 'synset': 'chocolate_mousse.n.01', 'synonyms': ['chocolate_mousse'], 'def': 'dessert mousse made with chocolate', 'name': 'chocolate_mousse'}, {'frequency': 'f', 'id': 260, 'synset': 'choker.n.03', 'synonyms': ['choker', 'collar', 'neckband'], 'def': 'necklace that fits tightly around the neck', 'name': 'choker'}, {'frequency': 'f', 'id': 261, 'synset': 'chopping_board.n.01', 'synonyms': ['chopping_board', 'cutting_board', 'chopping_block'], 'def': 'a wooden board where meats or vegetables can be cut', 'name': 'chopping_board'}, {'frequency': 'c', 'id': 262, 'synset': 'chopstick.n.01', 'synonyms': ['chopstick'], 'def': 'one of a pair of slender sticks used as oriental tableware to eat food with', 'name': 'chopstick'}, {'frequency': 'f', 'id': 263, 'synset': 'christmas_tree.n.05', 'synonyms': ['Christmas_tree'], 'def': 'an ornamented evergreen used as a Christmas decoration', 'name': 'Christmas_tree'}, {'frequency': 'c', 'id': 264, 'synset': 'chute.n.02', 'synonyms': ['slide'], 'def': 'sloping channel through which things can descend', 'name': 'slide'}, {'frequency': 'r', 'id': 265, 'synset': 'cider.n.01', 'synonyms': ['cider', 'cyder'], 'def': 'a beverage made from juice pressed from apples', 'name': 'cider'}, {'frequency': 'r', 'id': 266, 'synset': 'cigar_box.n.01', 'synonyms': ['cigar_box'], 'def': 'a box for holding cigars', 'name': 'cigar_box'}, {'frequency': 'c', 'id': 267, 'synset': 'cigarette.n.01', 'synonyms': ['cigarette'], 'def': 'finely ground tobacco wrapped in paper; for smoking', 'name': 'cigarette'}, {'frequency': 'c', 'id': 268, 'synset': 'cigarette_case.n.01', 'synonyms': ['cigarette_case', 'cigarette_pack'], 'def': 'a small flat case for holding cigarettes', 'name': 'cigarette_case'}, {'frequency': 'f', 'id': 269, 'synset': 'cistern.n.02', 'synonyms': ['cistern', 'water_tank'], 'def': 'a tank that holds the water used to flush a toilet', 'name': 'cistern'}, {'frequency': 'r', 'id': 270, 'synset': 'clarinet.n.01', 'synonyms': ['clarinet'], 'def': 'a single-reed instrument with a straight tube', 'name': 'clarinet'}, {'frequency': 'r', 'id': 271, 'synset': 'clasp.n.01', 'synonyms': ['clasp'], 'def': 'a fastener (as a buckle or hook) that is used to hold two things together', 'name': 'clasp'}, {'frequency': 'c', 'id': 272, 'synset': 'cleansing_agent.n.01', 'synonyms': ['cleansing_agent', 'cleanser', 'cleaner'], 'def': 'a preparation used in cleaning something', 'name': 'cleansing_agent'}, {'frequency': 'r', 'id': 273, 'synset': 'clementine.n.01', 'synonyms': ['clementine'], 'def': 'a variety of mandarin orange', 'name': 'clementine'}, {'frequency': 'c', 'id': 274, 'synset': 'clip.n.03', 'synonyms': ['clip'], 'def': 'any of various small fasteners used to hold loose articles together', 'name': 'clip'}, {'frequency': 'c', 'id': 275, 'synset': 'clipboard.n.01', 'synonyms': ['clipboard'], 'def': 'a small 
writing board with a clip at the top for holding papers', 'name': 'clipboard'}, {'frequency': 'f', 'id': 276, 'synset': 'clock.n.01', 'synonyms': ['clock', 'timepiece', 'timekeeper'], 'def': 'a timepiece that shows the time of day', 'name': 'clock'}, {'frequency': 'f', 'id': 277, 'synset': 'clock_tower.n.01', 'synonyms': ['clock_tower'], 'def': 'a tower with a large clock visible high up on an outside face', 'name': 'clock_tower'}, {'frequency': 'c', 'id': 278, 'synset': 'clothes_hamper.n.01', 'synonyms': ['clothes_hamper', 'laundry_basket', 'clothes_basket'], 'def': 'a hamper that holds dirty clothes to be washed or wet clothes to be dried', 'name': 'clothes_hamper'}, {'frequency': 'c', 'id': 279, 'synset': 'clothespin.n.01', 'synonyms': ['clothespin', 'clothes_peg'], 'def': 'wood or plastic fastener; for holding clothes on a clothesline', 'name': 'clothespin'}, {'frequency': 'r', 'id': 280, 'synset': 'clutch_bag.n.01', 'synonyms': ['clutch_bag'], 'def': "a woman's strapless purse that is carried in the hand", 'name': 'clutch_bag'}, {'frequency': 'f', 'id': 281, 'synset': 'coaster.n.03', 'synonyms': ['coaster'], 'def': 'a covering (plate or mat) that protects the surface of a table', 'name': 'coaster'}, {'frequency': 'f', 'id': 282, 'synset': 'coat.n.01', 'synonyms': ['coat'], 'def': 'an outer garment that has sleeves and covers the body from shoulder down', 'name': 'coat'}, {'frequency': 'c', 'id': 283, 'synset': 'coat_hanger.n.01', 'synonyms': ['coat_hanger', 'clothes_hanger', 'dress_hanger'], 'def': "a hanger that is shaped like a person's shoulders", 'name': 'coat_hanger'}, {'frequency': 'r', 'id': 284, 'synset': 'coatrack.n.01', 'synonyms': ['coatrack', 'hatrack'], 'def': 'a rack with hooks for temporarily holding coats and hats', 'name': 'coatrack'}, {'frequency': 'c', 'id': 285, 'synset': 'cock.n.04', 'synonyms': ['cock', 'rooster'], 'def': 'adult male chicken', 'name': 'cock'}, {'frequency': 'c', 'id': 286, 'synset': 'coconut.n.02', 'synonyms': ['coconut', 'cocoanut'], 'def': 'large hard-shelled brown oval nut with a fibrous husk', 'name': 'coconut'}, {'frequency': 'r', 'id': 287, 'synset': 'coffee_filter.n.01', 'synonyms': ['coffee_filter'], 'def': 'filter (usually of paper) that passes the coffee and retains the coffee grounds', 'name': 'coffee_filter'}, {'frequency': 'f', 'id': 288, 'synset': 'coffee_maker.n.01', 'synonyms': ['coffee_maker', 'coffee_machine'], 'def': 'a kitchen appliance for brewing coffee automatically', 'name': 'coffee_maker'}, {'frequency': 'f', 'id': 289, 'synset': 'coffee_table.n.01', 'synonyms': ['coffee_table', 'cocktail_table'], 'def': 'low table where magazines can be placed and coffee or cocktails are served', 'name': 'coffee_table'}, {'frequency': 'c', 'id': 290, 'synset': 'coffeepot.n.01', 'synonyms': ['coffeepot'], 'def': 'tall pot in which coffee is brewed', 'name': 'coffeepot'}, {'frequency': 'r', 'id': 291, 'synset': 'coil.n.05', 'synonyms': ['coil'], 'def': 'tubing that is wound in a spiral', 'name': 'coil'}, {'frequency': 'c', 'id': 292, 'synset': 'coin.n.01', 'synonyms': ['coin'], 'def': 'a flat metal piece (usually a disc) used as money', 'name': 'coin'}, {'frequency': 'r', 'id': 293, 'synset': 'colander.n.01', 'synonyms': ['colander', 'cullender'], 'def': 'bowl-shaped strainer; used to wash or drain foods', 'name': 'colander'}, {'frequency': 'c', 'id': 294, 'synset': 'coleslaw.n.01', 'synonyms': ['coleslaw', 'slaw'], 'def': 'basically shredded cabbage', 'name': 'coleslaw'}, {'frequency': 'r', 'id': 295, 'synset': 'coloring_material.n.01', 
'synonyms': ['coloring_material', 'colouring_material'], 'def': 'any material used for its color', 'name': 'coloring_material'}, {'frequency': 'r', 'id': 296, 'synset': 'combination_lock.n.01', 'synonyms': ['combination_lock'], 'def': 'lock that can be opened only by turning dials in a special sequence', 'name': 'combination_lock'}, {'frequency': 'c', 'id': 297, 'synset': 'comforter.n.04', 'synonyms': ['pacifier', 'teething_ring'], 'def': 'device used for an infant to suck or bite on', 'name': 'pacifier'}, {'frequency': 'r', 'id': 298, 'synset': 'comic_book.n.01', 'synonyms': ['comic_book'], 'def': 'a magazine devoted to comic strips', 'name': 'comic_book'}, {'frequency': 'f', 'id': 299, 'synset': 'computer_keyboard.n.01', 'synonyms': ['computer_keyboard', 'keyboard_(computer)'], 'def': 'a keyboard that is a data input device for computers', 'name': 'computer_keyboard'}, {'frequency': 'r', 'id': 300, 'synset': 'concrete_mixer.n.01', 'synonyms': ['concrete_mixer', 'cement_mixer'], 'def': 'a machine with a large revolving drum in which cement/concrete is mixed', 'name': 'concrete_mixer'}, {'frequency': 'f', 'id': 301, 'synset': 'cone.n.01', 'synonyms': ['cone', 'traffic_cone'], 'def': 'a cone-shaped object used to direct traffic', 'name': 'cone'}, {'frequency': 'f', 'id': 302, 'synset': 'control.n.09', 'synonyms': ['control', 'controller'], 'def': 'a mechanism that controls the operation of a machine', 'name': 'control'}, {'frequency': 'r', 'id': 303, 'synset': 'convertible.n.01', 'synonyms': ['convertible_(automobile)'], 'def': 'a car that has top that can be folded or removed', 'name': 'convertible_(automobile)'}, {'frequency': 'r', 'id': 304, 'synset': 'convertible.n.03', 'synonyms': ['sofa_bed'], 'def': 'a sofa that can be converted into a bed', 'name': 'sofa_bed'}, {'frequency': 'c', 'id': 305, 'synset': 'cookie.n.01', 'synonyms': ['cookie', 'cooky', 'biscuit_(cookie)'], 'def': "any of various small flat sweet cakes (`biscuit' is the British term)", 'name': 'cookie'}, {'frequency': 'r', 'id': 306, 'synset': 'cookie_jar.n.01', 'synonyms': ['cookie_jar', 'cooky_jar'], 'def': 'a jar in which cookies are kept (and sometimes money is hidden)', 'name': 'cookie_jar'}, {'frequency': 'r', 'id': 307, 'synset': 'cooking_utensil.n.01', 'synonyms': ['cooking_utensil'], 'def': 'a kitchen utensil made of material that does not melt easily; used for cooking', 'name': 'cooking_utensil'}, {'frequency': 'f', 'id': 308, 'synset': 'cooler.n.01', 'synonyms': ['cooler_(for_food)', 'ice_chest'], 'def': 'an insulated box for storing food often with ice', 'name': 'cooler_(for_food)'}, {'frequency': 'c', 'id': 309, 'synset': 'cork.n.04', 'synonyms': ['cork_(bottle_plug)', 'bottle_cork'], 'def': 'the plug in the mouth of a bottle (especially a wine bottle)', 'name': 'cork_(bottle_plug)'}, {'frequency': 'r', 'id': 310, 'synset': 'corkboard.n.01', 'synonyms': ['corkboard'], 'def': 'a sheet consisting of cork granules', 'name': 'corkboard'}, {'frequency': 'r', 'id': 311, 'synset': 'corkscrew.n.01', 'synonyms': ['corkscrew', 'bottle_screw'], 'def': 'a bottle opener that pulls corks', 'name': 'corkscrew'}, {'frequency': 'c', 'id': 312, 'synset': 'corn.n.03', 'synonyms': ['edible_corn', 'corn', 'maize'], 'def': 'ears of corn that can be prepared and served for human food', 'name': 'edible_corn'}, {'frequency': 'r', 'id': 313, 'synset': 'cornbread.n.01', 'synonyms': ['cornbread'], 'def': 'bread made primarily of cornmeal', 'name': 'cornbread'}, {'frequency': 'c', 'id': 314, 'synset': 'cornet.n.01', 'synonyms': ['cornet', 
'horn', 'trumpet'], 'def': 'a brass musical instrument with a narrow tube and a flared bell and many valves', 'name': 'cornet'}, {'frequency': 'c', 'id': 315, 'synset': 'cornice.n.01', 'synonyms': ['cornice', 'valance', 'valance_board', 'pelmet'], 'def': 'a decorative framework to conceal curtain fixtures at the top of a window casing', 'name': 'cornice'}, {'frequency': 'r', 'id': 316, 'synset': 'cornmeal.n.01', 'synonyms': ['cornmeal'], 'def': 'coarsely ground corn', 'name': 'cornmeal'}, {'frequency': 'r', 'id': 317, 'synset': 'corset.n.01', 'synonyms': ['corset', 'girdle'], 'def': "a woman's close-fitting foundation garment", 'name': 'corset'}, {'frequency': 'r', 'id': 318, 'synset': 'cos.n.02', 'synonyms': ['romaine_lettuce'], 'def': 'lettuce with long dark-green leaves in a loosely packed elongated head', 'name': 'romaine_lettuce'}, {'frequency': 'c', 'id': 319, 'synset': 'costume.n.04', 'synonyms': ['costume'], 'def': 'the attire characteristic of a country or a time or a social class', 'name': 'costume'}, {'frequency': 'r', 'id': 320, 'synset': 'cougar.n.01', 'synonyms': ['cougar', 'puma', 'catamount', 'mountain_lion', 'panther'], 'def': 'large American feline resembling a lion', 'name': 'cougar'}, {'frequency': 'r', 'id': 321, 'synset': 'coverall.n.01', 'synonyms': ['coverall'], 'def': 'a loose-fitting protective garment that is worn over other clothing', 'name': 'coverall'}, {'frequency': 'r', 'id': 322, 'synset': 'cowbell.n.01', 'synonyms': ['cowbell'], 'def': 'a bell hung around the neck of cow so that the cow can be easily located', 'name': 'cowbell'}, {'frequency': 'f', 'id': 323, 'synset': 'cowboy_hat.n.01', 'synonyms': ['cowboy_hat', 'ten-gallon_hat'], 'def': 'a hat with a wide brim and a soft crown; worn by American ranch hands', 'name': 'cowboy_hat'}, {'frequency': 'r', 'id': 324, 'synset': 'crab.n.01', 'synonyms': ['crab_(animal)'], 'def': 'decapod having eyes on short stalks and a broad flattened shell and pincers', 'name': 'crab_(animal)'}, {'frequency': 'c', 'id': 325, 'synset': 'cracker.n.01', 'synonyms': ['cracker'], 'def': 'a thin crisp wafer', 'name': 'cracker'}, {'frequency': 'r', 'id': 326, 'synset': 'crape.n.01', 'synonyms': ['crape', 'crepe', 'French_pancake'], 'def': 'small very thin pancake', 'name': 'crape'}, {'frequency': 'f', 'id': 327, 'synset': 'crate.n.01', 'synonyms': ['crate'], 'def': 'a rugged box (usually made of wood); used for shipping', 'name': 'crate'}, {'frequency': 'r', 'id': 328, 'synset': 'crayon.n.01', 'synonyms': ['crayon', 'wax_crayon'], 'def': 'writing or drawing implement made of a colored stick of composition wax', 'name': 'crayon'}, {'frequency': 'r', 'id': 329, 'synset': 'cream_pitcher.n.01', 'synonyms': ['cream_pitcher'], 'def': 'a small pitcher for serving cream', 'name': 'cream_pitcher'}, {'frequency': 'r', 'id': 330, 'synset': 'credit_card.n.01', 'synonyms': ['credit_card', 'charge_card', 'debit_card'], 'def': 'a card, usually plastic, used to pay for goods and services', 'name': 'credit_card'}, {'frequency': 'c', 'id': 331, 'synset': 'crescent_roll.n.01', 'synonyms': ['crescent_roll', 'croissant'], 'def': 'very rich flaky crescent-shaped roll', 'name': 'crescent_roll'}, {'frequency': 'c', 'id': 332, 'synset': 'crib.n.01', 'synonyms': ['crib', 'cot'], 'def': 'baby bed with high sides made of slats', 'name': 'crib'}, {'frequency': 'c', 'id': 333, 'synset': 'crock.n.03', 'synonyms': ['crock_pot', 'earthenware_jar'], 'def': 'an earthen jar (made of baked clay)', 'name': 'crock_pot'}, {'frequency': 'f', 'id': 334, 'synset': 
'crossbar.n.01', 'synonyms': ['crossbar'], 'def': 'a horizontal bar that goes across something', 'name': 'crossbar'}, {'frequency': 'r', 'id': 335, 'synset': 'crouton.n.01', 'synonyms': ['crouton'], 'def': 'a small piece of toasted or fried bread; served in soup or salads', 'name': 'crouton'}, {'frequency': 'r', 'id': 336, 'synset': 'crow.n.01', 'synonyms': ['crow'], 'def': 'black birds having a raucous call', 'name': 'crow'}, {'frequency': 'c', 'id': 337, 'synset': 'crown.n.04', 'synonyms': ['crown'], 'def': 'an ornamental jeweled headdress signifying sovereignty', 'name': 'crown'}, {'frequency': 'c', 'id': 338, 'synset': 'crucifix.n.01', 'synonyms': ['crucifix'], 'def': 'representation of the cross on which Jesus died', 'name': 'crucifix'}, {'frequency': 'c', 'id': 339, 'synset': 'cruise_ship.n.01', 'synonyms': ['cruise_ship', 'cruise_liner'], 'def': 'a passenger ship used commercially for pleasure cruises', 'name': 'cruise_ship'}, {'frequency': 'c', 'id': 340, 'synset': 'cruiser.n.01', 'synonyms': ['police_cruiser', 'patrol_car', 'police_car', 'squad_car'], 'def': 'a car in which policemen cruise the streets', 'name': 'police_cruiser'}, {'frequency': 'c', 'id': 341, 'synset': 'crumb.n.03', 'synonyms': ['crumb'], 'def': 'small piece of e.g. bread or cake', 'name': 'crumb'}, {'frequency': 'r', 'id': 342, 'synset': 'crutch.n.01', 'synonyms': ['crutch'], 'def': 'a wooden or metal staff that fits under the armpit and reaches to the ground', 'name': 'crutch'}, {'frequency': 'c', 'id': 343, 'synset': 'cub.n.03', 'synonyms': ['cub_(animal)'], 'def': 'the young of certain carnivorous mammals such as the bear or wolf or lion', 'name': 'cub_(animal)'}, {'frequency': 'r', 'id': 344, 'synset': 'cube.n.05', 'synonyms': ['cube', 'square_block'], 'def': 'a block in the (approximate) shape of a cube', 'name': 'cube'}, {'frequency': 'f', 'id': 345, 'synset': 'cucumber.n.02', 'synonyms': ['cucumber', 'cuke'], 'def': 'cylindrical green fruit with thin green rind and white flesh eaten as a vegetable', 'name': 'cucumber'}, {'frequency': 'c', 'id': 346, 'synset': 'cufflink.n.01', 'synonyms': ['cufflink'], 'def': 'jewelry consisting of linked buttons used to fasten the cuffs of a shirt', 'name': 'cufflink'}, {'frequency': 'f', 'id': 347, 'synset': 'cup.n.01', 'synonyms': ['cup'], 'def': 'a small open container usually used for drinking; usually has a handle', 'name': 'cup'}, {'frequency': 'c', 'id': 348, 'synset': 'cup.n.08', 'synonyms': ['trophy_cup'], 'def': 'a metal vessel with handles that is awarded as a trophy to a competition winner', 'name': 'trophy_cup'}, {'frequency': 'c', 'id': 349, 'synset': 'cupcake.n.01', 'synonyms': ['cupcake'], 'def': 'small cake baked in a muffin tin', 'name': 'cupcake'}, {'frequency': 'r', 'id': 350, 'synset': 'curler.n.01', 'synonyms': ['hair_curler', 'hair_roller', 'hair_crimper'], 'def': 'a cylindrical tube around which the hair is wound to curl it', 'name': 'hair_curler'}, {'frequency': 'r', 'id': 351, 'synset': 'curling_iron.n.01', 'synonyms': ['curling_iron'], 'def': 'a cylindrical home appliance that heats hair that has been curled around it', 'name': 'curling_iron'}, {'frequency': 'f', 'id': 352, 'synset': 'curtain.n.01', 'synonyms': ['curtain', 'drapery'], 'def': 'hanging cloth used as a blind (especially for a window)', 'name': 'curtain'}, {'frequency': 'f', 'id': 353, 'synset': 'cushion.n.03', 'synonyms': ['cushion'], 'def': 'a soft bag filled with air or padding such as feathers or foam rubber', 'name': 'cushion'}, {'frequency': 'r', 'id': 354, 'synset': 
'custard.n.01', 'synonyms': ['custard'], 'def': 'sweetened mixture of milk and eggs baked or boiled or frozen', 'name': 'custard'}, {'frequency': 'c', 'id': 355, 'synset': 'cutter.n.06', 'synonyms': ['cutting_tool'], 'def': 'a cutting implement; a tool for cutting', 'name': 'cutting_tool'}, {'frequency': 'r', 'id': 356, 'synset': 'cylinder.n.04', 'synonyms': ['cylinder'], 'def': 'a cylindrical container', 'name': 'cylinder'}, {'frequency': 'r', 'id': 357, 'synset': 'cymbal.n.01', 'synonyms': ['cymbal'], 'def': 'a percussion instrument consisting of a concave brass disk', 'name': 'cymbal'}, {'frequency': 'r', 'id': 358, 'synset': 'dachshund.n.01', 'synonyms': ['dachshund', 'dachsie', 'badger_dog'], 'def': 'small long-bodied short-legged breed of dog having a short sleek coat and long drooping ears', 'name': 'dachshund'}, {'frequency': 'r', 'id': 359, 'synset': 'dagger.n.01', 'synonyms': ['dagger'], 'def': 'a short knife with a pointed blade used for piercing or stabbing', 'name': 'dagger'}, {'frequency': 'r', 'id': 360, 'synset': 'dartboard.n.01', 'synonyms': ['dartboard'], 'def': 'a circular board of wood or cork used as the target in the game of darts', 'name': 'dartboard'}, {'frequency': 'r', 'id': 361, 'synset': 'date.n.08', 'synonyms': ['date_(fruit)'], 'def': 'sweet edible fruit of the date palm with a single long woody seed', 'name': 'date_(fruit)'}, {'frequency': 'f', 'id': 362, 'synset': 'deck_chair.n.01', 'synonyms': ['deck_chair', 'beach_chair'], 'def': 'a folding chair for use outdoors; a wooden frame supports a length of canvas', 'name': 'deck_chair'}, {'frequency': 'c', 'id': 363, 'synset': 'deer.n.01', 'synonyms': ['deer', 'cervid'], 'def': "distinguished from Bovidae by the male's having solid deciduous antlers", 'name': 'deer'}, {'frequency': 'c', 'id': 364, 'synset': 'dental_floss.n.01', 'synonyms': ['dental_floss', 'floss'], 'def': 'a soft thread for cleaning the spaces between the teeth', 'name': 'dental_floss'}, {'frequency': 'f', 'id': 365, 'synset': 'desk.n.01', 'synonyms': ['desk'], 'def': 'a piece of furniture with a writing surface and usually drawers or other compartments', 'name': 'desk'}, {'frequency': 'r', 'id': 366, 'synset': 'detergent.n.01', 'synonyms': ['detergent'], 'def': 'a surface-active chemical widely used in industry and laundering', 'name': 'detergent'}, {'frequency': 'c', 'id': 367, 'synset': 'diaper.n.01', 'synonyms': ['diaper'], 'def': 'garment consisting of a folded cloth drawn up between the legs and fastened at the waist', 'name': 'diaper'}, {'frequency': 'r', 'id': 368, 'synset': 'diary.n.01', 'synonyms': ['diary', 'journal'], 'def': 'a daily written record of (usually personal) experiences and observations', 'name': 'diary'}, {'frequency': 'r', 'id': 369, 'synset': 'die.n.01', 'synonyms': ['die', 'dice'], 'def': 'a small cube with 1 to 6 spots on the six faces; used in gambling', 'name': 'die'}, {'frequency': 'r', 'id': 370, 'synset': 'dinghy.n.01', 'synonyms': ['dinghy', 'dory', 'rowboat'], 'def': 'a small boat of shallow draft with seats and oars with which it is propelled', 'name': 'dinghy'}, {'frequency': 'f', 'id': 371, 'synset': 'dining_table.n.01', 'synonyms': ['dining_table'], 'def': 'a table at which meals are served', 'name': 'dining_table'}, {'frequency': 'r', 'id': 372, 'synset': 'dinner_jacket.n.01', 'synonyms': ['tux', 'tuxedo'], 'def': 'semiformal evening dress for men', 'name': 'tux'}, {'frequency': 'c', 'id': 373, 'synset': 'dish.n.01', 'synonyms': ['dish'], 'def': 'a piece of dishware normally used as a container for 
holding or serving food', 'name': 'dish'}, {'frequency': 'c', 'id': 374, 'synset': 'dish.n.05', 'synonyms': ['dish_antenna'], 'def': 'directional antenna consisting of a parabolic reflector', 'name': 'dish_antenna'}, {'frequency': 'c', 'id': 375, 'synset': 'dishrag.n.01', 'synonyms': ['dishrag', 'dishcloth'], 'def': 'a cloth for washing dishes', 'name': 'dishrag'}, {'frequency': 'c', 'id': 376, 'synset': 'dishtowel.n.01', 'synonyms': ['dishtowel', 'tea_towel'], 'def': 'a towel for drying dishes', 'name': 'dishtowel'}, {'frequency': 'f', 'id': 377, 'synset': 'dishwasher.n.01', 'synonyms': ['dishwasher', 'dishwashing_machine'], 'def': 'a machine for washing dishes', 'name': 'dishwasher'}, {'frequency': 'r', 'id': 378, 'synset': 'dishwasher_detergent.n.01', 'synonyms': ['dishwasher_detergent', 'dishwashing_detergent', 'dishwashing_liquid'], 'def': 'a low-sudsing detergent designed for use in dishwashers', 'name': 'dishwasher_detergent'}, {'frequency': 'r', 'id': 379, 'synset': 'diskette.n.01', 'synonyms': ['diskette', 'floppy', 'floppy_disk'], 'def': 'a small plastic magnetic disk enclosed in a stiff envelope used to store data', 'name': 'diskette'}, {'frequency': 'c', 'id': 380, 'synset': 'dispenser.n.01', 'synonyms': ['dispenser'], 'def': 'a container so designed that the contents can be used in prescribed amounts', 'name': 'dispenser'}, {'frequency': 'c', 'id': 381, 'synset': 'dixie_cup.n.01', 'synonyms': ['Dixie_cup', 'paper_cup'], 'def': 'a disposable cup made of paper; for holding drinks', 'name': 'Dixie_cup'}, {'frequency': 'f', 'id': 382, 'synset': 'dog.n.01', 'synonyms': ['dog'], 'def': 'a common domesticated dog', 'name': 'dog'}, {'frequency': 'f', 'id': 383, 'synset': 'dog_collar.n.01', 'synonyms': ['dog_collar'], 'def': 'a collar for a dog', 'name': 'dog_collar'}, {'frequency': 'c', 'id': 384, 'synset': 'doll.n.01', 'synonyms': ['doll'], 'def': 'a toy replica of a HUMAN (NOT AN ANIMAL)', 'name': 'doll'}, {'frequency': 'r', 'id': 385, 'synset': 'dollar.n.02', 'synonyms': ['dollar', 'dollar_bill', 'one_dollar_bill'], 'def': 'a piece of paper money worth one dollar', 'name': 'dollar'}, {'frequency': 'r', 'id': 386, 'synset': 'dolphin.n.02', 'synonyms': ['dolphin'], 'def': 'any of various small toothed whales with a beaklike snout; larger than porpoises', 'name': 'dolphin'}, {'frequency': 'c', 'id': 387, 'synset': 'domestic_ass.n.01', 'synonyms': ['domestic_ass', 'donkey'], 'def': 'domestic beast of burden descended from the African wild ass; patient but stubborn', 'name': 'domestic_ass'}, {'frequency': 'r', 'id': 388, 'synset': 'domino.n.03', 'synonyms': ['eye_mask'], 'def': 'a mask covering the upper part of the face but with holes for the eyes', 'name': 'eye_mask'}, {'frequency': 'r', 'id': 389, 'synset': 'doorbell.n.01', 'synonyms': ['doorbell', 'buzzer'], 'def': 'a button at an outer door that gives a ringing or buzzing signal when pushed', 'name': 'doorbell'}, {'frequency': 'f', 'id': 390, 'synset': 'doorknob.n.01', 'synonyms': ['doorknob', 'doorhandle'], 'def': "a knob used to open a door (often called `doorhandle' in Great Britain)", 'name': 'doorknob'}, {'frequency': 'c', 'id': 391, 'synset': 'doormat.n.02', 'synonyms': ['doormat', 'welcome_mat'], 'def': 'a mat placed outside an exterior door for wiping the shoes before entering', 'name': 'doormat'}, {'frequency': 'f', 'id': 392, 'synset': 'doughnut.n.02', 'synonyms': ['doughnut', 'donut'], 'def': 'a small ring-shaped friedcake', 'name': 'doughnut'}, {'frequency': 'r', 'id': 393, 'synset': 'dove.n.01', 'synonyms': ['dove'], 
'def': 'any of numerous small pigeons', 'name': 'dove'}, {'frequency': 'r', 'id': 394, 'synset': 'dragonfly.n.01', 'synonyms': ['dragonfly'], 'def': 'slender-bodied non-stinging insect having iridescent wings that are outspread at rest', 'name': 'dragonfly'}, {'frequency': 'f', 'id': 395, 'synset': 'drawer.n.01', 'synonyms': ['drawer'], 'def': 'a boxlike container in a piece of furniture; made so as to slide in and out', 'name': 'drawer'}, {'frequency': 'c', 'id': 396, 'synset': 'drawers.n.01', 'synonyms': ['underdrawers', 'boxers', 'boxershorts'], 'def': 'underpants worn by men', 'name': 'underdrawers'}, {'frequency': 'f', 'id': 397, 'synset': 'dress.n.01', 'synonyms': ['dress', 'frock'], 'def': 'a one-piece garment for a woman; has skirt and bodice', 'name': 'dress'}, {'frequency': 'c', 'id': 398, 'synset': 'dress_hat.n.01', 'synonyms': ['dress_hat', 'high_hat', 'opera_hat', 'silk_hat', 'top_hat'], 'def': "a man's hat with a tall crown; usually covered with silk or with beaver fur", 'name': 'dress_hat'}, {'frequency': 'c', 'id': 399, 'synset': 'dress_suit.n.01', 'synonyms': ['dress_suit'], 'def': 'formalwear consisting of full evening dress for men', 'name': 'dress_suit'}, {'frequency': 'c', 'id': 400, 'synset': 'dresser.n.05', 'synonyms': ['dresser'], 'def': 'a cabinet with shelves', 'name': 'dresser'}, {'frequency': 'c', 'id': 401, 'synset': 'drill.n.01', 'synonyms': ['drill'], 'def': 'a tool with a sharp rotating point for making holes in hard materials', 'name': 'drill'}, {'frequency': 'r', 'id': 402, 'synset': 'drinking_fountain.n.01', 'synonyms': ['drinking_fountain'], 'def': 'a public fountain to provide a jet of drinking water', 'name': 'drinking_fountain'}, {'frequency': 'r', 'id': 403, 'synset': 'drone.n.04', 'synonyms': ['drone'], 'def': 'an aircraft without a pilot that is operated by remote control', 'name': 'drone'}, {'frequency': 'r', 'id': 404, 'synset': 'dropper.n.01', 'synonyms': ['dropper', 'eye_dropper'], 'def': 'pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time', 'name': 'dropper'}, {'frequency': 'c', 'id': 405, 'synset': 'drum.n.01', 'synonyms': ['drum_(musical_instrument)'], 'def': 'a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end', 'name': 'drum_(musical_instrument)'}, {'frequency': 'r', 'id': 406, 'synset': 'drumstick.n.02', 'synonyms': ['drumstick'], 'def': 'a stick used for playing a drum', 'name': 'drumstick'}, {'frequency': 'f', 'id': 407, 'synset': 'duck.n.01', 'synonyms': ['duck'], 'def': 'small web-footed broad-billed swimming bird', 'name': 'duck'}, {'frequency': 'r', 'id': 408, 'synset': 'duckling.n.02', 'synonyms': ['duckling'], 'def': 'young duck', 'name': 'duckling'}, {'frequency': 'c', 'id': 409, 'synset': 'duct_tape.n.01', 'synonyms': ['duct_tape'], 'def': 'a wide silvery adhesive tape', 'name': 'duct_tape'}, {'frequency': 'f', 'id': 410, 'synset': 'duffel_bag.n.01', 'synonyms': ['duffel_bag', 'duffle_bag', 'duffel', 'duffle'], 'def': 'a large cylindrical bag of heavy cloth', 'name': 'duffel_bag'}, {'frequency': 'r', 'id': 411, 'synset': 'dumbbell.n.01', 'synonyms': ['dumbbell'], 'def': 'an exercising weight with two ball-like ends connected by a short handle', 'name': 'dumbbell'}, {'frequency': 'c', 'id': 412, 'synset': 'dumpster.n.01', 'synonyms': ['dumpster'], 'def': 'a container designed to receive and transport and dump waste', 'name': 'dumpster'}, {'frequency': 'r', 'id': 413, 'synset': 'dustpan.n.02', 
'synonyms': ['dustpan'], 'def': 'a short-handled receptacle into which dust can be swept', 'name': 'dustpan'}, {'frequency': 'r', 'id': 414, 'synset': 'dutch_oven.n.02', 'synonyms': ['Dutch_oven'], 'def': 'iron or earthenware cooking pot; used for stews', 'name': 'Dutch_oven'}, {'frequency': 'c', 'id': 415, 'synset': 'eagle.n.01', 'synonyms': ['eagle'], 'def': 'large birds of prey noted for their broad wings and strong soaring flight', 'name': 'eagle'}, {'frequency': 'f', 'id': 416, 'synset': 'earphone.n.01', 'synonyms': ['earphone', 'earpiece', 'headphone'], 'def': 'device for listening to audio that is held over or inserted into the ear', 'name': 'earphone'}, {'frequency': 'r', 'id': 417, 'synset': 'earplug.n.01', 'synonyms': ['earplug'], 'def': 'a soft plug that is inserted into the ear canal to block sound', 'name': 'earplug'}, {'frequency': 'f', 'id': 418, 'synset': 'earring.n.01', 'synonyms': ['earring'], 'def': 'jewelry to ornament the ear', 'name': 'earring'}, {'frequency': 'c', 'id': 419, 'synset': 'easel.n.01', 'synonyms': ['easel'], 'def': "an upright tripod for displaying something (usually an artist's canvas)", 'name': 'easel'}, {'frequency': 'r', 'id': 420, 'synset': 'eclair.n.01', 'synonyms': ['eclair'], 'def': 'oblong cream puff', 'name': 'eclair'}, {'frequency': 'r', 'id': 421, 'synset': 'eel.n.01', 'synonyms': ['eel'], 'def': 'an elongate fish with fatty flesh', 'name': 'eel'}, {'frequency': 'f', 'id': 422, 'synset': 'egg.n.02', 'synonyms': ['egg', 'eggs'], 'def': 'oval reproductive body of a fowl (especially a hen) used as food', 'name': 'egg'}, {'frequency': 'r', 'id': 423, 'synset': 'egg_roll.n.01', 'synonyms': ['egg_roll', 'spring_roll'], 'def': 'minced vegetables and meat wrapped in a pancake and fried', 'name': 'egg_roll'}, {'frequency': 'c', 'id': 424, 'synset': 'egg_yolk.n.01', 'synonyms': ['egg_yolk', 'yolk_(egg)'], 'def': 'the yellow spherical part of an egg', 'name': 'egg_yolk'}, {'frequency': 'c', 'id': 425, 'synset': 'eggbeater.n.02', 'synonyms': ['eggbeater', 'eggwhisk'], 'def': 'a mixer for beating eggs or whipping cream', 'name': 'eggbeater'}, {'frequency': 'c', 'id': 426, 'synset': 'eggplant.n.01', 'synonyms': ['eggplant', 'aubergine'], 'def': 'egg-shaped vegetable having a shiny skin typically dark purple', 'name': 'eggplant'}, {'frequency': 'r', 'id': 427, 'synset': 'electric_chair.n.01', 'synonyms': ['electric_chair'], 'def': 'a chair-shaped instrument of execution by electrocution', 'name': 'electric_chair'}, {'frequency': 'f', 'id': 428, 'synset': 'electric_refrigerator.n.01', 'synonyms': ['refrigerator'], 'def': 'a refrigerator in which the coolant is pumped around by an electric motor', 'name': 'refrigerator'}, {'frequency': 'f', 'id': 429, 'synset': 'elephant.n.01', 'synonyms': ['elephant'], 'def': 'a common elephant', 'name': 'elephant'}, {'frequency': 'r', 'id': 430, 'synset': 'elk.n.01', 'synonyms': ['elk', 'moose'], 'def': 'large northern deer with enormous flattened antlers in the male', 'name': 'elk'}, {'frequency': 'c', 'id': 431, 'synset': 'envelope.n.01', 'synonyms': ['envelope'], 'def': 'a flat (usually rectangular) container for a letter, thin package, etc.', 'name': 'envelope'}, {'frequency': 'c', 'id': 432, 'synset': 'eraser.n.01', 'synonyms': ['eraser'], 'def': 'an implement used to erase something', 'name': 'eraser'}, {'frequency': 'r', 'id': 433, 'synset': 'escargot.n.01', 'synonyms': ['escargot'], 'def': 'edible snail usually served in the shell with a sauce of melted butter and garlic', 'name': 'escargot'}, {'frequency': 'r', 
'id': 434, 'synset': 'eyepatch.n.01', 'synonyms': ['eyepatch'], 'def': 'a protective cloth covering for an injured eye', 'name': 'eyepatch'}, {'frequency': 'r', 'id': 435, 'synset': 'falcon.n.01', 'synonyms': ['falcon'], 'def': 'birds of prey having long pointed powerful wings adapted for swift flight', 'name': 'falcon'}, {'frequency': 'f', 'id': 436, 'synset': 'fan.n.01', 'synonyms': ['fan'], 'def': 'a device for creating a current of air by movement of a surface or surfaces', 'name': 'fan'}, {'frequency': 'f', 'id': 437, 'synset': 'faucet.n.01', 'synonyms': ['faucet', 'spigot', 'tap'], 'def': 'a regulator for controlling the flow of a liquid from a reservoir', 'name': 'faucet'}, {'frequency': 'r', 'id': 438, 'synset': 'fedora.n.01', 'synonyms': ['fedora'], 'def': 'a hat made of felt with a creased crown', 'name': 'fedora'}, {'frequency': 'r', 'id': 439, 'synset': 'ferret.n.02', 'synonyms': ['ferret'], 'def': 'domesticated albino variety of the European polecat bred for hunting rats and rabbits', 'name': 'ferret'}, {'frequency': 'c', 'id': 440, 'synset': 'ferris_wheel.n.01', 'synonyms': ['Ferris_wheel'], 'def': 'a large wheel with suspended seats that remain upright as the wheel rotates', 'name': 'Ferris_wheel'}, {'frequency': 'r', 'id': 441, 'synset': 'ferry.n.01', 'synonyms': ['ferry', 'ferryboat'], 'def': 'a boat that transports people or vehicles across a body of water and operates on a regular schedule', 'name': 'ferry'}, {'frequency': 'r', 'id': 442, 'synset': 'fig.n.04', 'synonyms': ['fig_(fruit)'], 'def': 'fleshy sweet pear-shaped yellowish or purple fruit eaten fresh or preserved or dried', 'name': 'fig_(fruit)'}, {'frequency': 'c', 'id': 443, 'synset': 'fighter.n.02', 'synonyms': ['fighter_jet', 'fighter_aircraft', 'attack_aircraft'], 'def': 'a high-speed military or naval airplane designed to destroy enemy targets', 'name': 'fighter_jet'}, {'frequency': 'f', 'id': 444, 'synset': 'figurine.n.01', 'synonyms': ['figurine'], 'def': 'a small carved or molded figure', 'name': 'figurine'}, {'frequency': 'c', 'id': 445, 'synset': 'file.n.03', 'synonyms': ['file_cabinet', 'filing_cabinet'], 'def': 'office furniture consisting of a container for keeping papers in order', 'name': 'file_cabinet'}, {'frequency': 'r', 'id': 446, 'synset': 'file.n.04', 'synonyms': ['file_(tool)'], 'def': 'a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal', 'name': 'file_(tool)'}, {'frequency': 'f', 'id': 447, 'synset': 'fire_alarm.n.02', 'synonyms': ['fire_alarm', 'smoke_alarm'], 'def': 'an alarm that is tripped off by fire or smoke', 'name': 'fire_alarm'}, {'frequency': 'c', 'id': 448, 'synset': 'fire_engine.n.01', 'synonyms': ['fire_engine', 'fire_truck'], 'def': 'large trucks that carry firefighters and equipment to the site of a fire', 'name': 'fire_engine'}, {'frequency': 'c', 'id': 449, 'synset': 'fire_extinguisher.n.01', 'synonyms': ['fire_extinguisher', 'extinguisher'], 'def': 'a manually operated device for extinguishing small fires', 'name': 'fire_extinguisher'}, {'frequency': 'c', 'id': 450, 'synset': 'fire_hose.n.01', 'synonyms': ['fire_hose'], 'def': 'a large hose that carries water from a fire hydrant to the site of the fire', 'name': 'fire_hose'}, {'frequency': 'f', 'id': 451, 'synset': 'fireplace.n.01', 'synonyms': ['fireplace'], 'def': 'an open recess in a wall at the base of a chimney where a fire can be built', 'name': 'fireplace'}, {'frequency': 'f', 'id': 452, 'synset': 'fireplug.n.01', 'synonyms': ['fireplug', 'fire_hydrant', 
'hydrant'], 'def': 'an upright hydrant for drawing water to use in fighting a fire', 'name': 'fireplug'}, {'frequency': 'c', 'id': 453, 'synset': 'fish.n.01', 'synonyms': ['fish'], 'def': 'any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills', 'name': 'fish'}, {'frequency': 'r', 'id': 454, 'synset': 'fish.n.02', 'synonyms': ['fish_(food)'], 'def': 'the flesh of fish used as food', 'name': 'fish_(food)'}, {'frequency': 'r', 'id': 455, 'synset': 'fishbowl.n.02', 'synonyms': ['fishbowl', 'goldfish_bowl'], 'def': 'a transparent bowl in which small fish are kept', 'name': 'fishbowl'}, {'frequency': 'r', 'id': 456, 'synset': 'fishing_boat.n.01', 'synonyms': ['fishing_boat', 'fishing_vessel'], 'def': 'a vessel for fishing', 'name': 'fishing_boat'}, {'frequency': 'c', 'id': 457, 'synset': 'fishing_rod.n.01', 'synonyms': ['fishing_rod', 'fishing_pole'], 'def': 'a rod that is used in fishing to extend the fishing line', 'name': 'fishing_rod'}, {'frequency': 'f', 'id': 458, 'synset': 'flag.n.01', 'synonyms': ['flag'], 'def': 'emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)', 'name': 'flag'}, {'frequency': 'f', 'id': 459, 'synset': 'flagpole.n.02', 'synonyms': ['flagpole', 'flagstaff'], 'def': 'a tall staff or pole on which a flag is raised', 'name': 'flagpole'}, {'frequency': 'c', 'id': 460, 'synset': 'flamingo.n.01', 'synonyms': ['flamingo'], 'def': 'large pink web-footed bird with down-bent bill', 'name': 'flamingo'}, {'frequency': 'c', 'id': 461, 'synset': 'flannel.n.01', 'synonyms': ['flannel'], 'def': 'a soft light woolen fabric; used for clothing', 'name': 'flannel'}, {'frequency': 'r', 'id': 462, 'synset': 'flash.n.10', 'synonyms': ['flash', 'flashbulb'], 'def': 'a lamp for providing momentary light to take a photograph', 'name': 'flash'}, {'frequency': 'c', 'id': 463, 'synset': 'flashlight.n.01', 'synonyms': ['flashlight', 'torch'], 'def': 'a small portable battery-powered electric lamp', 'name': 'flashlight'}, {'frequency': 'r', 'id': 464, 'synset': 'fleece.n.03', 'synonyms': ['fleece'], 'def': 'a soft bulky fabric with deep pile; used chiefly for clothing', 'name': 'fleece'}, {'frequency': 'f', 'id': 465, 'synset': 'flip-flop.n.02', 'synonyms': ['flip-flop_(sandal)'], 'def': 'a backless sandal held to the foot by a thong between two toes', 'name': 'flip-flop_(sandal)'}, {'frequency': 'c', 'id': 466, 'synset': 'flipper.n.01', 'synonyms': ['flipper_(footwear)', 'fin_(footwear)'], 'def': 'a shoe to aid a person in swimming', 'name': 'flipper_(footwear)'}, {'frequency': 'f', 'id': 467, 'synset': 'flower_arrangement.n.01', 'synonyms': ['flower_arrangement', 'floral_arrangement'], 'def': 'a decorative arrangement of flowers', 'name': 'flower_arrangement'}, {'frequency': 'c', 'id': 468, 'synset': 'flute.n.02', 'synonyms': ['flute_glass', 'champagne_flute'], 'def': 'a tall narrow wineglass', 'name': 'flute_glass'}, {'frequency': 'r', 'id': 469, 'synset': 'foal.n.01', 'synonyms': ['foal'], 'def': 'a young horse', 'name': 'foal'}, {'frequency': 'c', 'id': 470, 'synset': 'folding_chair.n.01', 'synonyms': ['folding_chair'], 'def': 'a chair that can be folded flat for storage', 'name': 'folding_chair'}, {'frequency': 'c', 'id': 471, 'synset': 'food_processor.n.01', 'synonyms': ['food_processor'], 'def': 'a kitchen appliance for shredding, blending, chopping, or slicing food', 'name': 'food_processor'}, {'frequency': 'c', 'id': 472, 'synset': 'football.n.02', 'synonyms': ['football_(American)'], 
'def': 'the inflated oblong ball used in playing American football', 'name': 'football_(American)'}, {'frequency': 'r', 'id': 473, 'synset': 'football_helmet.n.01', 'synonyms': ['football_helmet'], 'def': 'a padded helmet with a face mask to protect the head of football players', 'name': 'football_helmet'}, {'frequency': 'c', 'id': 474, 'synset': 'footstool.n.01', 'synonyms': ['footstool', 'footrest'], 'def': 'a low seat or a stool to rest the feet of a seated person', 'name': 'footstool'}, {'frequency': 'f', 'id': 475, 'synset': 'fork.n.01', 'synonyms': ['fork'], 'def': 'cutlery used for serving and eating food', 'name': 'fork'}, {'frequency': 'r', 'id': 476, 'synset': 'forklift.n.01', 'synonyms': ['forklift'], 'def': 'an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them', 'name': 'forklift'}, {'frequency': 'r', 'id': 477, 'synset': 'freight_car.n.01', 'synonyms': ['freight_car'], 'def': 'a railway car that carries freight', 'name': 'freight_car'}, {'frequency': 'r', 'id': 478, 'synset': 'french_toast.n.01', 'synonyms': ['French_toast'], 'def': 'bread slice dipped in egg and milk and fried', 'name': 'French_toast'}, {'frequency': 'c', 'id': 479, 'synset': 'freshener.n.01', 'synonyms': ['freshener', 'air_freshener'], 'def': 'anything that freshens', 'name': 'freshener'}, {'frequency': 'f', 'id': 480, 'synset': 'frisbee.n.01', 'synonyms': ['frisbee'], 'def': 'a light, plastic disk propelled with a flip of the wrist for recreation or competition', 'name': 'frisbee'}, {'frequency': 'c', 'id': 481, 'synset': 'frog.n.01', 'synonyms': ['frog', 'toad', 'toad_frog'], 'def': 'a tailless stout-bodied amphibian with long hind limbs for leaping', 'name': 'frog'}, {'frequency': 'c', 'id': 482, 'synset': 'fruit_juice.n.01', 'synonyms': ['fruit_juice'], 'def': 'drink produced by squeezing or crushing fruit', 'name': 'fruit_juice'}, {'frequency': 'r', 'id': 483, 'synset': 'fruit_salad.n.01', 'synonyms': ['fruit_salad'], 'def': 'salad composed of fruits', 'name': 'fruit_salad'}, {'frequency': 'c', 'id': 484, 'synset': 'frying_pan.n.01', 'synonyms': ['frying_pan', 'frypan', 'skillet'], 'def': 'a pan used for frying foods', 'name': 'frying_pan'}, {'frequency': 'r', 'id': 485, 'synset': 'fudge.n.01', 'synonyms': ['fudge'], 'def': 'soft creamy candy', 'name': 'fudge'}, {'frequency': 'r', 'id': 486, 'synset': 'funnel.n.02', 'synonyms': ['funnel'], 'def': 'a cone-shaped utensil used to channel a substance into a container with a small mouth', 'name': 'funnel'}, {'frequency': 'c', 'id': 487, 'synset': 'futon.n.01', 'synonyms': ['futon'], 'def': 'a pad that is used for sleeping on the floor or on a raised frame', 'name': 'futon'}, {'frequency': 'r', 'id': 488, 'synset': 'gag.n.02', 'synonyms': ['gag', 'muzzle'], 'def': "restraint put into a person's mouth to prevent speaking or shouting", 'name': 'gag'}, {'frequency': 'r', 'id': 489, 'synset': 'garbage.n.03', 'synonyms': ['garbage'], 'def': 'a receptacle where waste can be discarded', 'name': 'garbage'}, {'frequency': 'c', 'id': 490, 'synset': 'garbage_truck.n.01', 'synonyms': ['garbage_truck'], 'def': 'a truck for collecting domestic refuse', 'name': 'garbage_truck'}, {'frequency': 'c', 'id': 491, 'synset': 'garden_hose.n.01', 'synonyms': ['garden_hose'], 'def': 'a hose used for watering a lawn or garden', 'name': 'garden_hose'}, {'frequency': 'c', 'id': 492, 'synset': 'gargle.n.01', 'synonyms': ['gargle', 'mouthwash'], 'def': 'a medicated solution used for gargling and rinsing the mouth', 'name': 
'gargle'}, {'frequency': 'r', 'id': 493, 'synset': 'gargoyle.n.02', 'synonyms': ['gargoyle'], 'def': 'an ornament consisting of a grotesquely carved figure of a person or animal', 'name': 'gargoyle'}, {'frequency': 'c', 'id': 494, 'synset': 'garlic.n.02', 'synonyms': ['garlic', 'ail'], 'def': 'aromatic bulb used as seasoning', 'name': 'garlic'}, {'frequency': 'r', 'id': 495, 'synset': 'gasmask.n.01', 'synonyms': ['gasmask', 'respirator', 'gas_helmet'], 'def': 'a protective face mask with a filter', 'name': 'gasmask'}, {'frequency': 'r', 'id': 496, 'synset': 'gazelle.n.01', 'synonyms': ['gazelle'], 'def': 'small swift graceful antelope of Africa and Asia having lustrous eyes', 'name': 'gazelle'}, {'frequency': 'c', 'id': 497, 'synset': 'gelatin.n.02', 'synonyms': ['gelatin', 'jelly'], 'def': 'an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods', 'name': 'gelatin'}, {'frequency': 'r', 'id': 498, 'synset': 'gem.n.02', 'synonyms': ['gemstone'], 'def': 'a crystalline rock that can be cut and polished for jewelry', 'name': 'gemstone'}, {'frequency': 'c', 'id': 499, 'synset': 'giant_panda.n.01', 'synonyms': ['giant_panda', 'panda', 'panda_bear'], 'def': 'large black-and-white herbivorous mammal of bamboo forests of China and Tibet', 'name': 'giant_panda'}, {'frequency': 'c', 'id': 500, 'synset': 'gift_wrap.n.01', 'synonyms': ['gift_wrap'], 'def': 'attractive wrapping paper suitable for wrapping gifts', 'name': 'gift_wrap'}, {'frequency': 'c', 'id': 501, 'synset': 'ginger.n.03', 'synonyms': ['ginger', 'gingerroot'], 'def': 'the root of the common ginger plant; used fresh as a seasoning', 'name': 'ginger'}, {'frequency': 'f', 'id': 502, 'synset': 'giraffe.n.01', 'synonyms': ['giraffe'], 'def': 'tall animal having a spotted coat and small horns and very long neck and legs', 'name': 'giraffe'}, {'frequency': 'c', 'id': 503, 'synset': 'girdle.n.02', 'synonyms': ['cincture', 'sash', 'waistband', 'waistcloth'], 'def': 'a band of material around the waist that strengthens a skirt or trousers', 'name': 'cincture'}, {'frequency': 'f', 'id': 504, 'synset': 'glass.n.02', 'synonyms': ['glass_(drink_container)', 'drinking_glass'], 'def': 'a container for holding liquids while drinking', 'name': 'glass_(drink_container)'}, {'frequency': 'c', 'id': 505, 'synset': 'globe.n.03', 'synonyms': ['globe'], 'def': 'a sphere on which a map (especially of the earth) is represented', 'name': 'globe'}, {'frequency': 'f', 'id': 506, 'synset': 'glove.n.02', 'synonyms': ['glove'], 'def': 'handwear covering the hand', 'name': 'glove'}, {'frequency': 'c', 'id': 507, 'synset': 'goat.n.01', 'synonyms': ['goat'], 'def': 'a common goat', 'name': 'goat'}, {'frequency': 'f', 'id': 508, 'synset': 'goggles.n.01', 'synonyms': ['goggles'], 'def': 'tight-fitting spectacles worn to protect the eyes', 'name': 'goggles'}, {'frequency': 'r', 'id': 509, 'synset': 'goldfish.n.01', 'synonyms': ['goldfish'], 'def': 'small golden or orange-red freshwater fishes used as pond or aquarium pets', 'name': 'goldfish'}, {'frequency': 'r', 'id': 510, 'synset': 'golf_club.n.02', 'synonyms': ['golf_club', 'golf-club'], 'def': 'golf equipment used by a golfer to hit a golf ball', 'name': 'golf_club'}, {'frequency': 'c', 'id': 511, 'synset': 'golfcart.n.01', 'synonyms': ['golfcart'], 'def': 'a small motor vehicle in which golfers can ride between shots', 'name': 'golfcart'}, {'frequency': 'r', 'id': 512, 'synset': 'gondola.n.02', 'synonyms': ['gondola_(boat)'], 'def': 'long narrow flat-bottomed boat propelled by 
sculling; traditionally used on canals of Venice', 'name': 'gondola_(boat)'}, {'frequency': 'c', 'id': 513, 'synset': 'goose.n.01', 'synonyms': ['goose'], 'def': 'loud, web-footed long-necked aquatic birds usually larger than ducks', 'name': 'goose'}, {'frequency': 'r', 'id': 514, 'synset': 'gorilla.n.01', 'synonyms': ['gorilla'], 'def': 'largest ape', 'name': 'gorilla'}, {'frequency': 'r', 'id': 515, 'synset': 'gourd.n.02', 'synonyms': ['gourd'], 'def': 'any of numerous inedible fruits with hard rinds', 'name': 'gourd'}, {'frequency': 'r', 'id': 516, 'synset': 'gown.n.04', 'synonyms': ['surgical_gown', 'scrubs_(surgical_clothing)'], 'def': 'protective garment worn by surgeons during operations', 'name': 'surgical_gown'}, {'frequency': 'f', 'id': 517, 'synset': 'grape.n.01', 'synonyms': ['grape'], 'def': 'any of various juicy fruit with green or purple skins; grow in clusters', 'name': 'grape'}, {'frequency': 'r', 'id': 518, 'synset': 'grasshopper.n.01', 'synonyms': ['grasshopper'], 'def': 'plant-eating insect with hind legs adapted for leaping', 'name': 'grasshopper'}, {'frequency': 'c', 'id': 519, 'synset': 'grater.n.01', 'synonyms': ['grater'], 'def': 'utensil with sharp perforations for shredding foods (as vegetables or cheese)', 'name': 'grater'}, {'frequency': 'c', 'id': 520, 'synset': 'gravestone.n.01', 'synonyms': ['gravestone', 'headstone', 'tombstone'], 'def': 'a stone that is used to mark a grave', 'name': 'gravestone'}, {'frequency': 'r', 'id': 521, 'synset': 'gravy_boat.n.01', 'synonyms': ['gravy_boat', 'gravy_holder'], 'def': 'a dish (often boat-shaped) for serving gravy or sauce', 'name': 'gravy_boat'}, {'frequency': 'c', 'id': 522, 'synset': 'green_bean.n.02', 'synonyms': ['green_bean'], 'def': 'a common bean plant cultivated for its slender green edible pods', 'name': 'green_bean'}, {'frequency': 'c', 'id': 523, 'synset': 'green_onion.n.01', 'synonyms': ['green_onion', 'spring_onion', 'scallion'], 'def': 'a young onion before the bulb has enlarged', 'name': 'green_onion'}, {'frequency': 'r', 'id': 524, 'synset': 'griddle.n.01', 'synonyms': ['griddle'], 'def': 'cooking utensil consisting of a flat heated surface on which food is cooked', 'name': 'griddle'}, {'frequency': 'r', 'id': 525, 'synset': 'grillroom.n.01', 'synonyms': ['grillroom', 'grill_(restaurant)'], 'def': 'a restaurant where food is cooked on a grill', 'name': 'grillroom'}, {'frequency': 'r', 'id': 526, 'synset': 'grinder.n.04', 'synonyms': ['grinder_(tool)'], 'def': 'a machine tool that polishes metal', 'name': 'grinder_(tool)'}, {'frequency': 'r', 'id': 527, 'synset': 'grits.n.01', 'synonyms': ['grits', 'hominy_grits'], 'def': 'coarsely ground corn boiled as a breakfast dish', 'name': 'grits'}, {'frequency': 'c', 'id': 528, 'synset': 'grizzly.n.01', 'synonyms': ['grizzly', 'grizzly_bear'], 'def': 'powerful brownish-yellow bear of the uplands of western North America', 'name': 'grizzly'}, {'frequency': 'c', 'id': 529, 'synset': 'grocery_bag.n.01', 'synonyms': ['grocery_bag'], 'def': "a sack for holding customer's groceries", 'name': 'grocery_bag'}, {'frequency': 'r', 'id': 530, 'synset': 'guacamole.n.01', 'synonyms': ['guacamole'], 'def': 'a dip made of mashed avocado mixed with chopped onions and other seasonings', 'name': 'guacamole'}, {'frequency': 'f', 'id': 531, 'synset': 'guitar.n.01', 'synonyms': ['guitar'], 'def': 'a stringed instrument usually having six strings; played by strumming or plucking', 'name': 'guitar'}, {'frequency': 'c', 'id': 532, 'synset': 'gull.n.02', 'synonyms': ['gull', 'seagull'], 
'def': 'mostly white aquatic bird having long pointed wings and short legs', 'name': 'gull'}, {'frequency': 'c', 'id': 533, 'synset': 'gun.n.01', 'synonyms': ['gun'], 'def': 'a weapon that discharges a bullet at high velocity from a metal tube', 'name': 'gun'}, {'frequency': 'r', 'id': 534, 'synset': 'hair_spray.n.01', 'synonyms': ['hair_spray'], 'def': 'substance sprayed on the hair to hold it in place', 'name': 'hair_spray'}, {'frequency': 'c', 'id': 535, 'synset': 'hairbrush.n.01', 'synonyms': ['hairbrush'], 'def': "a brush used to groom a person's hair", 'name': 'hairbrush'}, {'frequency': 'c', 'id': 536, 'synset': 'hairnet.n.01', 'synonyms': ['hairnet'], 'def': 'a small net that someone wears over their hair to keep it in place', 'name': 'hairnet'}, {'frequency': 'c', 'id': 537, 'synset': 'hairpin.n.01', 'synonyms': ['hairpin'], 'def': "a double pronged pin used to hold women's hair in place", 'name': 'hairpin'}, {'frequency': 'f', 'id': 538, 'synset': 'ham.n.01', 'synonyms': ['ham', 'jambon', 'gammon'], 'def': 'meat cut from the thigh of a hog (usually smoked)', 'name': 'ham'}, {'frequency': 'c', 'id': 539, 'synset': 'hamburger.n.01', 'synonyms': ['hamburger', 'beefburger', 'burger'], 'def': 'a sandwich consisting of a patty of minced beef served on a bun', 'name': 'hamburger'}, {'frequency': 'c', 'id': 540, 'synset': 'hammer.n.02', 'synonyms': ['hammer'], 'def': 'a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking', 'name': 'hammer'}, {'frequency': 'r', 'id': 541, 'synset': 'hammock.n.02', 'synonyms': ['hammock'], 'def': 'a hanging bed of canvas or rope netting (usually suspended between two trees)', 'name': 'hammock'}, {'frequency': 'r', 'id': 542, 'synset': 'hamper.n.02', 'synonyms': ['hamper'], 'def': 'a basket usually with a cover', 'name': 'hamper'}, {'frequency': 'r', 'id': 543, 'synset': 'hamster.n.01', 'synonyms': ['hamster'], 'def': 'short-tailed burrowing rodent with large cheek pouches', 'name': 'hamster'}, {'frequency': 'c', 'id': 544, 'synset': 'hand_blower.n.01', 'synonyms': ['hair_dryer'], 'def': 'a hand-held electric blower that can blow warm air onto the hair', 'name': 'hair_dryer'}, {'frequency': 'r', 'id': 545, 'synset': 'hand_glass.n.01', 'synonyms': ['hand_glass', 'hand_mirror'], 'def': 'a mirror intended to be held in the hand', 'name': 'hand_glass'}, {'frequency': 'f', 'id': 546, 'synset': 'hand_towel.n.01', 'synonyms': ['hand_towel', 'face_towel'], 'def': 'a small towel used to dry the hands or face', 'name': 'hand_towel'}, {'frequency': 'c', 'id': 547, 'synset': 'handcart.n.01', 'synonyms': ['handcart', 'pushcart', 'hand_truck'], 'def': 'wheeled vehicle that can be pushed by a person', 'name': 'handcart'}, {'frequency': 'r', 'id': 548, 'synset': 'handcuff.n.01', 'synonyms': ['handcuff'], 'def': 'shackle that consists of a metal loop that can be locked around the wrist', 'name': 'handcuff'}, {'frequency': 'c', 'id': 549, 'synset': 'handkerchief.n.01', 'synonyms': ['handkerchief'], 'def': 'a square piece of cloth used for wiping the eyes or nose or as a costume accessory', 'name': 'handkerchief'}, {'frequency': 'f', 'id': 550, 'synset': 'handle.n.01', 'synonyms': ['handle', 'grip', 'handgrip'], 'def': 'the appendage to an object that is designed to be held in order to use or move it', 'name': 'handle'}, {'frequency': 'r', 'id': 551, 'synset': 'handsaw.n.01', 'synonyms': ['handsaw', "carpenter's_saw"], 'def': 'a saw used with one hand for cutting wood', 'name': 'handsaw'}, {'frequency': 'r', 'id': 552, 'synset': 
'hardback.n.01', 'synonyms': ['hardback_book', 'hardcover_book'], 'def': 'a book with cardboard or cloth or leather covers', 'name': 'hardback_book'}, {'frequency': 'r', 'id': 553, 'synset': 'harmonium.n.01', 'synonyms': ['harmonium', 'organ_(musical_instrument)', 'reed_organ_(musical_instrument)'], 'def': 'a free-reed instrument in which air is forced through the reeds by bellows', 'name': 'harmonium'}, {'frequency': 'f', 'id': 554, 'synset': 'hat.n.01', 'synonyms': ['hat'], 'def': 'headwear that protects the head from bad weather, sun, or worn for fashion', 'name': 'hat'}, {'frequency': 'r', 'id': 555, 'synset': 'hatbox.n.01', 'synonyms': ['hatbox'], 'def': 'a round piece of luggage for carrying hats', 'name': 'hatbox'}, {'frequency': 'r', 'id': 556, 'synset': 'hatch.n.03', 'synonyms': ['hatch'], 'def': 'a movable barrier covering a hatchway', 'name': 'hatch'}, {'frequency': 'c', 'id': 557, 'synset': 'head_covering.n.01', 'synonyms': ['veil'], 'def': 'a garment that covers the head and face', 'name': 'veil'}, {'frequency': 'f', 'id': 558, 'synset': 'headband.n.01', 'synonyms': ['headband'], 'def': 'a band worn around or over the head', 'name': 'headband'}, {'frequency': 'f', 'id': 559, 'synset': 'headboard.n.01', 'synonyms': ['headboard'], 'def': 'a vertical board or panel forming the head of a bedstead', 'name': 'headboard'}, {'frequency': 'f', 'id': 560, 'synset': 'headlight.n.01', 'synonyms': ['headlight', 'headlamp'], 'def': 'a powerful light with reflector; attached to the front of an automobile or locomotive', 'name': 'headlight'}, {'frequency': 'c', 'id': 561, 'synset': 'headscarf.n.01', 'synonyms': ['headscarf'], 'def': 'a kerchief worn over the head and tied under the chin', 'name': 'headscarf'}, {'frequency': 'r', 'id': 562, 'synset': 'headset.n.01', 'synonyms': ['headset'], 'def': 'receiver consisting of a pair of headphones', 'name': 'headset'}, {'frequency': 'c', 'id': 563, 'synset': 'headstall.n.01', 'synonyms': ['headstall_(for_horses)', 'headpiece_(for_horses)'], 'def': "the band that is the part of a bridle that fits around a horse's head", 'name': 'headstall_(for_horses)'}, {'frequency': 'r', 'id': 564, 'synset': 'hearing_aid.n.02', 'synonyms': ['hearing_aid'], 'def': 'an acoustic device used to direct sound to the ear of a hearing-impaired person', 'name': 'hearing_aid'}, {'frequency': 'c', 'id': 565, 'synset': 'heart.n.02', 'synonyms': ['heart'], 'def': 'a muscular organ; its contractions move the blood through the body', 'name': 'heart'}, {'frequency': 'c', 'id': 566, 'synset': 'heater.n.01', 'synonyms': ['heater', 'warmer'], 'def': 'device that heats water or supplies warmth to a room', 'name': 'heater'}, {'frequency': 'c', 'id': 567, 'synset': 'helicopter.n.01', 'synonyms': ['helicopter'], 'def': 'an aircraft without wings that obtains its lift from the rotation of overhead blades', 'name': 'helicopter'}, {'frequency': 'f', 'id': 568, 'synset': 'helmet.n.02', 'synonyms': ['helmet'], 'def': 'a protective headgear made of hard material to resist blows', 'name': 'helmet'}, {'frequency': 'r', 'id': 569, 'synset': 'heron.n.02', 'synonyms': ['heron'], 'def': 'grey or white wading bird with long neck and long legs and (usually) long bill', 'name': 'heron'}, {'frequency': 'c', 'id': 570, 'synset': 'highchair.n.01', 'synonyms': ['highchair', 'feeding_chair'], 'def': 'a chair for feeding a very young child', 'name': 'highchair'}, {'frequency': 'f', 'id': 571, 'synset': 'hinge.n.01', 'synonyms': ['hinge'], 'def': 'a joint that holds two parts together so that one can swing 
relative to the other', 'name': 'hinge'}, {'frequency': 'r', 'id': 572, 'synset': 'hippopotamus.n.01', 'synonyms': ['hippopotamus'], 'def': 'massive thick-skinned animal living in or around rivers of tropical Africa', 'name': 'hippopotamus'}, {'frequency': 'r', 'id': 573, 'synset': 'hockey_stick.n.01', 'synonyms': ['hockey_stick'], 'def': 'sports implement consisting of a stick used by hockey players to move the puck', 'name': 'hockey_stick'}, {'frequency': 'c', 'id': 574, 'synset': 'hog.n.03', 'synonyms': ['hog', 'pig'], 'def': 'domestic swine', 'name': 'hog'}, {'frequency': 'f', 'id': 575, 'synset': 'home_plate.n.01', 'synonyms': ['home_plate_(baseball)', 'home_base_(baseball)'], 'def': '(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score', 'name': 'home_plate_(baseball)'}, {'frequency': 'c', 'id': 576, 'synset': 'honey.n.01', 'synonyms': ['honey'], 'def': 'a sweet yellow liquid produced by bees', 'name': 'honey'}, {'frequency': 'f', 'id': 577, 'synset': 'hood.n.06', 'synonyms': ['fume_hood', 'exhaust_hood'], 'def': 'metal covering leading to a vent that exhausts smoke or fumes', 'name': 'fume_hood'}, {'frequency': 'f', 'id': 578, 'synset': 'hook.n.05', 'synonyms': ['hook'], 'def': 'a curved or bent implement for suspending or pulling something', 'name': 'hook'}, {'frequency': 'f', 'id': 579, 'synset': 'horse.n.01', 'synonyms': ['horse'], 'def': 'a common horse', 'name': 'horse'}, {'frequency': 'f', 'id': 580, 'synset': 'hose.n.03', 'synonyms': ['hose', 'hosepipe'], 'def': 'a flexible pipe for conveying a liquid or gas', 'name': 'hose'}, {'frequency': 'r', 'id': 581, 'synset': 'hot-air_balloon.n.01', 'synonyms': ['hot-air_balloon'], 'def': 'balloon for travel through the air in a basket suspended below a large bag of heated air', 'name': 'hot-air_balloon'}, {'frequency': 'r', 'id': 582, 'synset': 'hot_plate.n.01', 'synonyms': ['hotplate'], 'def': 'a portable electric appliance for heating or cooking or keeping food warm', 'name': 'hotplate'}, {'frequency': 'c', 'id': 583, 'synset': 'hot_sauce.n.01', 'synonyms': ['hot_sauce'], 'def': 'a pungent peppery sauce', 'name': 'hot_sauce'}, {'frequency': 'r', 'id': 584, 'synset': 'hourglass.n.01', 'synonyms': ['hourglass'], 'def': 'a sandglass timer that runs for sixty minutes', 'name': 'hourglass'}, {'frequency': 'r', 'id': 585, 'synset': 'houseboat.n.01', 'synonyms': ['houseboat'], 'def': 'a barge that is designed and equipped for use as a dwelling', 'name': 'houseboat'}, {'frequency': 'r', 'id': 586, 'synset': 'hummingbird.n.01', 'synonyms': ['hummingbird'], 'def': 'tiny American bird having brilliant iridescent plumage and long slender bills', 'name': 'hummingbird'}, {'frequency': 'r', 'id': 587, 'synset': 'hummus.n.01', 'synonyms': ['hummus', 'humus', 'hommos', 'hoummos', 'humous'], 'def': 'a thick spread made from mashed chickpeas', 'name': 'hummus'}, {'frequency': 'c', 'id': 588, 'synset': 'ice_bear.n.01', 'synonyms': ['polar_bear'], 'def': 'white bear of Arctic regions', 'name': 'polar_bear'}, {'frequency': 'c', 'id': 589, 'synset': 'ice_cream.n.01', 'synonyms': ['icecream'], 'def': 'frozen dessert containing cream and sugar and flavoring', 'name': 'icecream'}, {'frequency': 'r', 'id': 590, 'synset': 'ice_lolly.n.01', 'synonyms': ['popsicle'], 'def': 'ice cream or water ice on a small wooden stick', 'name': 'popsicle'}, {'frequency': 'c', 'id': 591, 'synset': 'ice_maker.n.01', 'synonyms': ['ice_maker'], 'def': 'an appliance included in some electric refrigerators for making ice cubes', 
'name': 'ice_maker'}, {'frequency': 'r', 'id': 592, 'synset': 'ice_pack.n.01', 'synonyms': ['ice_pack', 'ice_bag'], 'def': 'a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling', 'name': 'ice_pack'}, {'frequency': 'r', 'id': 593, 'synset': 'ice_skate.n.01', 'synonyms': ['ice_skate'], 'def': 'skate consisting of a boot with a steel blade fitted to the sole', 'name': 'ice_skate'}, {'frequency': 'r', 'id': 594, 'synset': 'ice_tea.n.01', 'synonyms': ['ice_tea', 'iced_tea'], 'def': 'strong tea served over ice', 'name': 'ice_tea'}, {'frequency': 'c', 'id': 595, 'synset': 'igniter.n.01', 'synonyms': ['igniter', 'ignitor', 'lighter'], 'def': 'a substance or device used to start a fire', 'name': 'igniter'}, {'frequency': 'r', 'id': 596, 'synset': 'incense.n.01', 'synonyms': ['incense'], 'def': 'a substance that produces a fragrant odor when burned', 'name': 'incense'}, {'frequency': 'r', 'id': 597, 'synset': 'inhaler.n.01', 'synonyms': ['inhaler', 'inhalator'], 'def': 'a dispenser that produces a chemical vapor to be inhaled through mouth or nose', 'name': 'inhaler'}, {'frequency': 'c', 'id': 598, 'synset': 'ipod.n.01', 'synonyms': ['iPod'], 'def': 'a pocket-sized device used to play music files', 'name': 'iPod'}, {'frequency': 'c', 'id': 599, 'synset': 'iron.n.04', 'synonyms': ['iron_(for_clothing)', 'smoothing_iron_(for_clothing)'], 'def': 'home appliance consisting of a flat metal base that is heated and used to smooth cloth', 'name': 'iron_(for_clothing)'}, {'frequency': 'r', 'id': 600, 'synset': 'ironing_board.n.01', 'synonyms': ['ironing_board'], 'def': 'narrow padded board on collapsible supports; used for ironing clothes', 'name': 'ironing_board'}, {'frequency': 'f', 'id': 601, 'synset': 'jacket.n.01', 'synonyms': ['jacket'], 'def': 'a waist-length coat', 'name': 'jacket'}, {'frequency': 'r', 'id': 602, 'synset': 'jam.n.01', 'synonyms': ['jam'], 'def': 'preserve of crushed fruit', 'name': 'jam'}, {'frequency': 'f', 'id': 603, 'synset': 'jean.n.01', 'synonyms': ['jean', 'blue_jean', 'denim'], 'def': '(usually plural) close-fitting trousers of heavy denim for manual work or casual wear', 'name': 'jean'}, {'frequency': 'c', 'id': 604, 'synset': 'jeep.n.01', 'synonyms': ['jeep', 'landrover'], 'def': 'a car suitable for traveling over rough terrain', 'name': 'jeep'}, {'frequency': 'r', 'id': 605, 'synset': 'jelly_bean.n.01', 'synonyms': ['jelly_bean', 'jelly_egg'], 'def': 'sugar-glazed jellied candy', 'name': 'jelly_bean'}, {'frequency': 'f', 'id': 606, 'synset': 'jersey.n.03', 'synonyms': ['jersey', 'T-shirt', 'tee_shirt'], 'def': 'a close-fitting pullover shirt', 'name': 'jersey'}, {'frequency': 'c', 'id': 607, 'synset': 'jet.n.01', 'synonyms': ['jet_plane', 'jet-propelled_plane'], 'def': 'an airplane powered by one or more jet engines', 'name': 'jet_plane'}, {'frequency': 'c', 'id': 608, 'synset': 'jewelry.n.01', 'synonyms': ['jewelry', 'jewellery'], 'def': 'an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)', 'name': 'jewelry'}, {'frequency': 'r', 'id': 609, 'synset': 'joystick.n.02', 'synonyms': ['joystick'], 'def': 'a control device for computers consisting of a vertical handle that can move freely in two directions', 'name': 'joystick'}, {'frequency': 'r', 'id': 610, 'synset': 'jump_suit.n.01', 'synonyms': ['jumpsuit'], 'def': "one-piece garment fashioned after a parachutist's uniform", 'name': 'jumpsuit'}, {'frequency': 'c', 'id': 611, 'synset': 'kayak.n.01', 'synonyms': 
['kayak'], 'def': 'a small canoe consisting of a light frame made watertight with animal skins', 'name': 'kayak'}, {'frequency': 'r', 'id': 612, 'synset': 'keg.n.02', 'synonyms': ['keg'], 'def': 'small cask or barrel', 'name': 'keg'}, {'frequency': 'r', 'id': 613, 'synset': 'kennel.n.01', 'synonyms': ['kennel', 'doghouse'], 'def': 'outbuilding that serves as a shelter for a dog', 'name': 'kennel'}, {'frequency': 'c', 'id': 614, 'synset': 'kettle.n.01', 'synonyms': ['kettle', 'boiler'], 'def': 'a metal pot for stewing or boiling; usually has a lid', 'name': 'kettle'}, {'frequency': 'f', 'id': 615, 'synset': 'key.n.01', 'synonyms': ['key'], 'def': 'metal instrument used to unlock a lock', 'name': 'key'}, {'frequency': 'r', 'id': 616, 'synset': 'keycard.n.01', 'synonyms': ['keycard'], 'def': 'a plastic card used to gain access typically to a door', 'name': 'keycard'}, {'frequency': 'r', 'id': 617, 'synset': 'kilt.n.01', 'synonyms': ['kilt'], 'def': 'a knee-length pleated tartan skirt worn by men as part of the traditional dress in the Highlands of northern Scotland', 'name': 'kilt'}, {'frequency': 'c', 'id': 618, 'synset': 'kimono.n.01', 'synonyms': ['kimono'], 'def': 'a loose robe; imitated from robes originally worn by Japanese', 'name': 'kimono'}, {'frequency': 'f', 'id': 619, 'synset': 'kitchen_sink.n.01', 'synonyms': ['kitchen_sink'], 'def': 'a sink in a kitchen', 'name': 'kitchen_sink'}, {'frequency': 'c', 'id': 620, 'synset': 'kitchen_table.n.01', 'synonyms': ['kitchen_table'], 'def': 'a table in the kitchen', 'name': 'kitchen_table'}, {'frequency': 'f', 'id': 621, 'synset': 'kite.n.03', 'synonyms': ['kite'], 'def': 'plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string', 'name': 'kite'}, {'frequency': 'c', 'id': 622, 'synset': 'kitten.n.01', 'synonyms': ['kitten', 'kitty'], 'def': 'young domestic cat', 'name': 'kitten'}, {'frequency': 'c', 'id': 623, 'synset': 'kiwi.n.03', 'synonyms': ['kiwi_fruit'], 'def': 'fuzzy brown egg-shaped fruit with slightly tart green flesh', 'name': 'kiwi_fruit'}, {'frequency': 'f', 'id': 624, 'synset': 'knee_pad.n.01', 'synonyms': ['knee_pad'], 'def': 'protective garment consisting of a pad worn by football or baseball or hockey players', 'name': 'knee_pad'}, {'frequency': 'f', 'id': 625, 'synset': 'knife.n.01', 'synonyms': ['knife'], 'def': 'tool with a blade and point used as a cutting instrument', 'name': 'knife'}, {'frequency': 'r', 'id': 626, 'synset': 'knight.n.02', 'synonyms': ['knight_(chess_piece)', 'horse_(chess_piece)'], 'def': 'a chess game piece shaped to resemble the head of a horse', 'name': 'knight_(chess_piece)'}, {'frequency': 'r', 'id': 627, 'synset': 'knitting_needle.n.01', 'synonyms': ['knitting_needle'], 'def': 'needle consisting of a slender rod with pointed ends; usually used in pairs', 'name': 'knitting_needle'}, {'frequency': 'f', 'id': 628, 'synset': 'knob.n.02', 'synonyms': ['knob'], 'def': 'a round handle often found on a door', 'name': 'knob'}, {'frequency': 'r', 'id': 629, 'synset': 'knocker.n.05', 'synonyms': ['knocker_(on_a_door)', 'doorknocker'], 'def': 'a device (usually metal and ornamental) attached by a hinge to a door', 'name': 'knocker_(on_a_door)'}, {'frequency': 'r', 'id': 630, 'synset': 'koala.n.01', 'synonyms': ['koala', 'koala_bear'], 'def': 'sluggish tailless Australian marsupial with grey furry ears and coat', 'name': 'koala'}, {'frequency': 'r', 'id': 631, 'synset': 'lab_coat.n.01', 'synonyms': ['lab_coat', 'laboratory_coat'], 'def': 'a light coat worn to protect 
clothing from substances used while working in a laboratory', 'name': 'lab_coat'}, {'frequency': 'f', 'id': 632, 'synset': 'ladder.n.01', 'synonyms': ['ladder'], 'def': 'steps consisting of two parallel members connected by rungs', 'name': 'ladder'}, {'frequency': 'c', 'id': 633, 'synset': 'ladle.n.01', 'synonyms': ['ladle'], 'def': 'a spoon-shaped vessel with a long handle frequently used to transfer liquids', 'name': 'ladle'}, {'frequency': 'r', 'id': 634, 'synset': 'ladybug.n.01', 'synonyms': ['ladybug', 'ladybeetle', 'ladybird_beetle'], 'def': 'small round bright-colored and spotted beetle, typically red and black', 'name': 'ladybug'}, {'frequency': 'c', 'id': 635, 'synset': 'lamb.n.01', 'synonyms': ['lamb_(animal)'], 'def': 'young sheep', 'name': 'lamb_(animal)'}, {'frequency': 'r', 'id': 636, 'synset': 'lamb_chop.n.01', 'synonyms': ['lamb-chop', 'lambchop'], 'def': 'chop cut from a lamb', 'name': 'lamb-chop'}, {'frequency': 'f', 'id': 637, 'synset': 'lamp.n.02', 'synonyms': ['lamp'], 'def': 'a piece of furniture holding one or more electric light bulbs', 'name': 'lamp'}, {'frequency': 'f', 'id': 638, 'synset': 'lamppost.n.01', 'synonyms': ['lamppost'], 'def': 'a metal post supporting an outdoor lamp (such as a streetlight)', 'name': 'lamppost'}, {'frequency': 'f', 'id': 639, 'synset': 'lampshade.n.01', 'synonyms': ['lampshade'], 'def': 'a protective ornamental shade used to screen a light bulb from direct view', 'name': 'lampshade'}, {'frequency': 'c', 'id': 640, 'synset': 'lantern.n.01', 'synonyms': ['lantern'], 'def': 'light in a transparent protective case', 'name': 'lantern'}, {'frequency': 'f', 'id': 641, 'synset': 'lanyard.n.02', 'synonyms': ['lanyard', 'laniard'], 'def': 'a cord worn around the neck to hold a knife or whistle, etc.', 'name': 'lanyard'}, {'frequency': 'f', 'id': 642, 'synset': 'laptop.n.01', 'synonyms': ['laptop_computer', 'notebook_computer'], 'def': 'a portable computer small enough to use in your lap', 'name': 'laptop_computer'}, {'frequency': 'r', 'id': 643, 'synset': 'lasagna.n.01', 'synonyms': ['lasagna', 'lasagne'], 'def': 'baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables', 'name': 'lasagna'}, {'frequency': 'c', 'id': 644, 'synset': 'latch.n.02', 'synonyms': ['latch'], 'def': 'a bar that can be lowered or slid into a groove to fasten a door or gate', 'name': 'latch'}, {'frequency': 'r', 'id': 645, 'synset': 'lawn_mower.n.01', 'synonyms': ['lawn_mower'], 'def': 'garden tool for mowing grass on lawns', 'name': 'lawn_mower'}, {'frequency': 'r', 'id': 646, 'synset': 'leather.n.01', 'synonyms': ['leather'], 'def': 'an animal skin made smooth and flexible by removing the hair and then tanning', 'name': 'leather'}, {'frequency': 'c', 'id': 647, 'synset': 'legging.n.01', 'synonyms': ['legging_(clothing)', 'leging_(clothing)', 'leg_covering'], 'def': 'a garment covering the leg (usually extending from the knee to the ankle)', 'name': 'legging_(clothing)'}, {'frequency': 'c', 'id': 648, 'synset': 'lego.n.01', 'synonyms': ['Lego', 'Lego_set'], 'def': "a child's plastic construction set for making models from blocks", 'name': 'Lego'}, {'frequency': 'f', 'id': 649, 'synset': 'lemon.n.01', 'synonyms': ['lemon'], 'def': 'yellow oval fruit with juicy acidic flesh', 'name': 'lemon'}, {'frequency': 'r', 'id': 650, 'synset': 'lemonade.n.01', 'synonyms': ['lemonade'], 'def': 'sweetened beverage of diluted lemon juice', 'name': 'lemonade'}, {'frequency': 'f', 'id': 651, 'synset': 'lettuce.n.02', 'synonyms': ['lettuce'], 'def': 'leafy plant 
commonly eaten in salad or on sandwiches', 'name': 'lettuce'}, {'frequency': 'f', 'id': 652, 'synset': 'license_plate.n.01', 'synonyms': ['license_plate', 'numberplate'], 'def': "a plate mounted on the front and back of car and bearing the car's registration number", 'name': 'license_plate'}, {'frequency': 'f', 'id': 653, 'synset': 'life_buoy.n.01', 'synonyms': ['life_buoy', 'lifesaver', 'life_belt', 'life_ring'], 'def': 'a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)', 'name': 'life_buoy'}, {'frequency': 'f', 'id': 654, 'synset': 'life_jacket.n.01', 'synonyms': ['life_jacket', 'life_vest'], 'def': 'life preserver consisting of a sleeveless jacket of buoyant or inflatable design', 'name': 'life_jacket'}, {'frequency': 'f', 'id': 655, 'synset': 'light_bulb.n.01', 'synonyms': ['lightbulb'], 'def': 'glass bulb or tube shaped electric device that emits light (DO NOT MARK LAMPS AS A WHOLE)', 'name': 'lightbulb'}, {'frequency': 'r', 'id': 656, 'synset': 'lightning_rod.n.02', 'synonyms': ['lightning_rod', 'lightning_conductor'], 'def': 'a metallic conductor that is attached to a high point and leads to the ground', 'name': 'lightning_rod'}, {'frequency': 'c', 'id': 657, 'synset': 'lime.n.06', 'synonyms': ['lime'], 'def': 'the green acidic fruit of any of various lime trees', 'name': 'lime'}, {'frequency': 'r', 'id': 658, 'synset': 'limousine.n.01', 'synonyms': ['limousine'], 'def': 'long luxurious car; usually driven by a chauffeur', 'name': 'limousine'}, {'frequency': 'r', 'id': 659, 'synset': 'linen.n.02', 'synonyms': ['linen_paper'], 'def': 'a high-quality paper made of linen fibers or with a linen finish', 'name': 'linen_paper'}, {'frequency': 'c', 'id': 660, 'synset': 'lion.n.01', 'synonyms': ['lion'], 'def': 'large gregarious predatory cat of Africa and India', 'name': 'lion'}, {'frequency': 'c', 'id': 661, 'synset': 'lip_balm.n.01', 'synonyms': ['lip_balm'], 'def': 'a balm applied to the lips', 'name': 'lip_balm'}, {'frequency': 'c', 'id': 662, 'synset': 'lipstick.n.01', 'synonyms': ['lipstick', 'lip_rouge'], 'def': 'makeup that is used to color the lips', 'name': 'lipstick'}, {'frequency': 'r', 'id': 663, 'synset': 'liquor.n.01', 'synonyms': ['liquor', 'spirits', 'hard_liquor', 'liqueur', 'cordial'], 'def': 'an alcoholic beverage that is distilled rather than fermented', 'name': 'liquor'}, {'frequency': 'r', 'id': 664, 'synset': 'lizard.n.01', 'synonyms': ['lizard'], 'def': 'a reptile with usually two pairs of legs and a tapering tail', 'name': 'lizard'}, {'frequency': 'r', 'id': 665, 'synset': 'loafer.n.02', 'synonyms': ['Loafer_(type_of_shoe)'], 'def': 'a low leather step-in shoe', 'name': 'Loafer_(type_of_shoe)'}, {'frequency': 'f', 'id': 666, 'synset': 'log.n.01', 'synonyms': ['log'], 'def': 'a segment of the trunk of a tree when stripped of branches', 'name': 'log'}, {'frequency': 'c', 'id': 667, 'synset': 'lollipop.n.02', 'synonyms': ['lollipop'], 'def': 'hard candy on a stick', 'name': 'lollipop'}, {'frequency': 'c', 'id': 668, 'synset': 'lotion.n.01', 'synonyms': ['lotion'], 'def': 'any of various cosmetic preparations that are applied to the skin', 'name': 'lotion'}, {'frequency': 'f', 'id': 669, 'synset': 'loudspeaker.n.01', 'synonyms': ['speaker_(stero_equipment)'], 'def': 'electronic device that produces sound often as part of a stereo system', 'name': 'speaker_(stero_equipment)'}, {'frequency': 'c', 'id': 670, 'synset': 'love_seat.n.01', 'synonyms': ['loveseat'], 'def': 'small sofa that seats two people', 'name': 'loveseat'}, {'frequency': 
'r', 'id': 671, 'synset': 'machine_gun.n.01', 'synonyms': ['machine_gun'], 'def': 'a rapidly firing automatic gun', 'name': 'machine_gun'}, {'frequency': 'f', 'id': 672, 'synset': 'magazine.n.02', 'synonyms': ['magazine'], 'def': 'a paperback periodic publication', 'name': 'magazine'}, {'frequency': 'f', 'id': 673, 'synset': 'magnet.n.01', 'synonyms': ['magnet'], 'def': 'a device that attracts iron and produces a magnetic field', 'name': 'magnet'}, {'frequency': 'r', 'id': 674, 'synset': 'mail_slot.n.01', 'synonyms': ['mail_slot'], 'def': 'a slot (usually in a door) through which mail can be delivered', 'name': 'mail_slot'}, {'frequency': 'c', 'id': 675, 'synset': 'mailbox.n.01', 'synonyms': ['mailbox_(at_home)', 'letter_box_(at_home)'], 'def': 'a private box for delivery of mail', 'name': 'mailbox_(at_home)'}, {'frequency': 'r', 'id': 676, 'synset': 'mallet.n.01', 'synonyms': ['mallet'], 'def': 'a sports implement with a long handle and a hammer-like head used to hit a ball', 'name': 'mallet'}, {'frequency': 'r', 'id': 677, 'synset': 'mammoth.n.01', 'synonyms': ['mammoth'], 'def': 'any of numerous extinct elephants widely distributed in the Pleistocene', 'name': 'mammoth'}, {'frequency': 'c', 'id': 678, 'synset': 'mandarin.n.05', 'synonyms': ['mandarin_orange'], 'def': 'a somewhat flat reddish-orange loose skinned citrus of China', 'name': 'mandarin_orange'}, {'frequency': 'c', 'id': 679, 'synset': 'manger.n.01', 'synonyms': ['manger', 'trough'], 'def': 'a container (usually in a barn or stable) from which cattle or horses feed', 'name': 'manger'}, {'frequency': 'f', 'id': 680, 'synset': 'manhole.n.01', 'synonyms': ['manhole'], 'def': 'a hole (usually with a flush cover) through which a person can gain access to an underground structure', 'name': 'manhole'}, {'frequency': 'c', 'id': 681, 'synset': 'map.n.01', 'synonyms': ['map'], 'def': "a diagrammatic representation of the earth's surface (or part of it)", 'name': 'map'}, {'frequency': 'c', 'id': 682, 'synset': 'marker.n.03', 'synonyms': ['marker'], 'def': 'a writing implement for making a mark', 'name': 'marker'}, {'frequency': 'r', 'id': 683, 'synset': 'martini.n.01', 'synonyms': ['martini'], 'def': 'a cocktail made of gin (or vodka) with dry vermouth', 'name': 'martini'}, {'frequency': 'r', 'id': 684, 'synset': 'mascot.n.01', 'synonyms': ['mascot'], 'def': 'a person or animal that is adopted by a team or other group as a symbolic figure', 'name': 'mascot'}, {'frequency': 'c', 'id': 685, 'synset': 'mashed_potato.n.01', 'synonyms': ['mashed_potato'], 'def': 'potato that has been peeled and boiled and then mashed', 'name': 'mashed_potato'}, {'frequency': 'r', 'id': 686, 'synset': 'masher.n.02', 'synonyms': ['masher'], 'def': 'a kitchen utensil used for mashing (e.g. 
potatoes)', 'name': 'masher'}, {'frequency': 'f', 'id': 687, 'synset': 'mask.n.04', 'synonyms': ['mask', 'facemask'], 'def': 'a protective covering worn over the face', 'name': 'mask'}, {'frequency': 'f', 'id': 688, 'synset': 'mast.n.01', 'synonyms': ['mast'], 'def': 'a vertical spar for supporting sails', 'name': 'mast'}, {'frequency': 'c', 'id': 689, 'synset': 'mat.n.03', 'synonyms': ['mat_(gym_equipment)', 'gym_mat'], 'def': 'sports equipment consisting of a piece of thick padding on the floor for gymnastics', 'name': 'mat_(gym_equipment)'}, {'frequency': 'r', 'id': 690, 'synset': 'matchbox.n.01', 'synonyms': ['matchbox'], 'def': 'a box for holding matches', 'name': 'matchbox'}, {'frequency': 'f', 'id': 691, 'synset': 'mattress.n.01', 'synonyms': ['mattress'], 'def': 'a thick pad filled with resilient material used as a bed or part of a bed', 'name': 'mattress'}, {'frequency': 'c', 'id': 692, 'synset': 'measuring_cup.n.01', 'synonyms': ['measuring_cup'], 'def': 'graduated cup used to measure liquid or granular ingredients', 'name': 'measuring_cup'}, {'frequency': 'c', 'id': 693, 'synset': 'measuring_stick.n.01', 'synonyms': ['measuring_stick', 'ruler_(measuring_stick)', 'measuring_rod'], 'def': 'measuring instrument having a sequence of marks at regular intervals', 'name': 'measuring_stick'}, {'frequency': 'c', 'id': 694, 'synset': 'meatball.n.01', 'synonyms': ['meatball'], 'def': 'ground meat formed into a ball and fried or simmered in broth', 'name': 'meatball'}, {'frequency': 'c', 'id': 695, 'synset': 'medicine.n.02', 'synonyms': ['medicine'], 'def': 'something that treats or prevents or alleviates the symptoms of disease', 'name': 'medicine'}, {'frequency': 'r', 'id': 696, 'synset': 'melon.n.01', 'synonyms': ['melon'], 'def': 'fruit of the gourd family having a hard rind and sweet juicy flesh', 'name': 'melon'}, {'frequency': 'f', 'id': 697, 'synset': 'microphone.n.01', 'synonyms': ['microphone'], 'def': 'device for converting sound waves into electrical energy', 'name': 'microphone'}, {'frequency': 'r', 'id': 698, 'synset': 'microscope.n.01', 'synonyms': ['microscope'], 'def': 'magnifier of the image of small objects', 'name': 'microscope'}, {'frequency': 'f', 'id': 699, 'synset': 'microwave.n.02', 'synonyms': ['microwave_oven'], 'def': 'kitchen appliance that cooks food by passing an electromagnetic wave through it', 'name': 'microwave_oven'}, {'frequency': 'r', 'id': 700, 'synset': 'milestone.n.01', 'synonyms': ['milestone', 'milepost'], 'def': 'stone post at side of a road to show distances', 'name': 'milestone'}, {'frequency': 'c', 'id': 701, 'synset': 'milk.n.01', 'synonyms': ['milk'], 'def': 'a white nutritious liquid secreted by mammals and used as food by human beings', 'name': 'milk'}, {'frequency': 'f', 'id': 702, 'synset': 'minivan.n.01', 'synonyms': ['minivan'], 'def': 'a small box-shaped passenger van', 'name': 'minivan'}, {'frequency': 'r', 'id': 703, 'synset': 'mint.n.05', 'synonyms': ['mint_candy'], 'def': 'a candy that is flavored with a mint oil', 'name': 'mint_candy'}, {'frequency': 'f', 'id': 704, 'synset': 'mirror.n.01', 'synonyms': ['mirror'], 'def': 'polished surface that forms images by reflecting light', 'name': 'mirror'}, {'frequency': 'c', 'id': 705, 'synset': 'mitten.n.01', 'synonyms': ['mitten'], 'def': 'glove that encases the thumb separately and the other four fingers together', 'name': 'mitten'}, {'frequency': 'c', 'id': 706, 'synset': 'mixer.n.04', 'synonyms': ['mixer_(kitchen_tool)', 'stand_mixer'], 'def': 'a kitchen utensil that is used for mixing 
foods', 'name': 'mixer_(kitchen_tool)'}, {'frequency': 'c', 'id': 707, 'synset': 'money.n.03', 'synonyms': ['money'], 'def': 'the official currency issued by a government or national bank', 'name': 'money'}, {'frequency': 'f', 'id': 708, 'synset': 'monitor.n.04', 'synonyms': ['monitor_(computer_equipment) computer_monitor'], 'def': 'a computer monitor', 'name': 'monitor_(computer_equipment) computer_monitor'}, {'frequency': 'c', 'id': 709, 'synset': 'monkey.n.01', 'synonyms': ['monkey'], 'def': 'any of various long-tailed primates', 'name': 'monkey'}, {'frequency': 'f', 'id': 710, 'synset': 'motor.n.01', 'synonyms': ['motor'], 'def': 'machine that converts other forms of energy into mechanical energy and so imparts motion', 'name': 'motor'}, {'frequency': 'f', 'id': 711, 'synset': 'motor_scooter.n.01', 'synonyms': ['motor_scooter', 'scooter'], 'def': 'a wheeled vehicle with small wheels and a low-powered engine', 'name': 'motor_scooter'}, {'frequency': 'r', 'id': 712, 'synset': 'motor_vehicle.n.01', 'synonyms': ['motor_vehicle', 'automotive_vehicle'], 'def': 'a self-propelled wheeled vehicle that does not run on rails', 'name': 'motor_vehicle'}, {'frequency': 'r', 'id': 713, 'synset': 'motorboat.n.01', 'synonyms': ['motorboat', 'powerboat'], 'def': 'a boat propelled by an internal-combustion engine', 'name': 'motorboat'}, {'frequency': 'f', 'id': 714, 'synset': 'motorcycle.n.01', 'synonyms': ['motorcycle'], 'def': 'a motor vehicle with two wheels and a strong frame', 'name': 'motorcycle'}, {'frequency': 'f', 'id': 715, 'synset': 'mound.n.01', 'synonyms': ['mound_(baseball)', "pitcher's_mound"], 'def': '(baseball) the slight elevation on which the pitcher stands', 'name': 'mound_(baseball)'}, {'frequency': 'r', 'id': 716, 'synset': 'mouse.n.01', 'synonyms': ['mouse_(animal_rodent)'], 'def': 'a small rodent with pointed snouts and small ears on elongated bodies with slender usually hairless tails', 'name': 'mouse_(animal_rodent)'}, {'frequency': 'f', 'id': 717, 'synset': 'mouse.n.04', 'synonyms': ['mouse_(computer_equipment)', 'computer_mouse'], 'def': 'a computer input device that controls an on-screen pointer', 'name': 'mouse_(computer_equipment)'}, {'frequency': 'f', 'id': 718, 'synset': 'mousepad.n.01', 'synonyms': ['mousepad'], 'def': 'a small portable pad that provides an operating surface for a computer mouse', 'name': 'mousepad'}, {'frequency': 'c', 'id': 719, 'synset': 'muffin.n.01', 'synonyms': ['muffin'], 'def': 'a sweet quick bread baked in a cup-shaped pan', 'name': 'muffin'}, {'frequency': 'f', 'id': 720, 'synset': 'mug.n.04', 'synonyms': ['mug'], 'def': 'with handle and usually cylindrical', 'name': 'mug'}, {'frequency': 'f', 'id': 721, 'synset': 'mushroom.n.02', 'synonyms': ['mushroom'], 'def': 'a common mushroom', 'name': 'mushroom'}, {'frequency': 'r', 'id': 722, 'synset': 'music_stool.n.01', 'synonyms': ['music_stool', 'piano_stool'], 'def': 'a stool for piano players; usually adjustable in height', 'name': 'music_stool'}, {'frequency': 'r', 'id': 723, 'synset': 'musical_instrument.n.01', 'synonyms': ['musical_instrument', 'instrument_(musical)'], 'def': 'any of various devices or contrivances that can be used to produce musical tones or sounds', 'name': 'musical_instrument'}, {'frequency': 'r', 'id': 724, 'synset': 'nailfile.n.01', 'synonyms': ['nailfile'], 'def': 'a small flat file for shaping the nails', 'name': 'nailfile'}, {'frequency': 'r', 'id': 725, 'synset': 'nameplate.n.01', 'synonyms': ['nameplate'], 'def': 'a plate bearing a name', 'name': 'nameplate'}, 
{'frequency': 'f', 'id': 726, 'synset': 'napkin.n.01', 'synonyms': ['napkin', 'table_napkin', 'serviette'], 'def': 'a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing', 'name': 'napkin'}, {'frequency': 'r', 'id': 727, 'synset': 'neckerchief.n.01', 'synonyms': ['neckerchief'], 'def': 'a kerchief worn around the neck', 'name': 'neckerchief'}, {'frequency': 'f', 'id': 728, 'synset': 'necklace.n.01', 'synonyms': ['necklace'], 'def': 'jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament', 'name': 'necklace'}, {'frequency': 'f', 'id': 729, 'synset': 'necktie.n.01', 'synonyms': ['necktie', 'tie_(necktie)'], 'def': 'neckwear consisting of a long narrow piece of material worn under a collar and tied in a knot at the front', 'name': 'necktie'}, {'frequency': 'r', 'id': 730, 'synset': 'needle.n.03', 'synonyms': ['needle'], 'def': 'a sharp pointed implement (usually metal)', 'name': 'needle'}, {'frequency': 'c', 'id': 731, 'synset': 'nest.n.01', 'synonyms': ['nest'], 'def': 'a structure in which animals lay eggs or give birth to their young', 'name': 'nest'}, {'frequency': 'r', 'id': 732, 'synset': 'newsstand.n.01', 'synonyms': ['newsstand'], 'def': 'a stall where newspapers and other periodicals are sold', 'name': 'newsstand'}, {'frequency': 'c', 'id': 733, 'synset': 'nightwear.n.01', 'synonyms': ['nightshirt', 'nightwear', 'sleepwear', 'nightclothes'], 'def': 'garments designed to be worn in bed', 'name': 'nightshirt'}, {'frequency': 'r', 'id': 734, 'synset': 'nosebag.n.01', 'synonyms': ['nosebag_(for_animals)', 'feedbag'], 'def': 'a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head', 'name': 'nosebag_(for_animals)'}, {'frequency': 'r', 'id': 735, 'synset': 'noseband.n.01', 'synonyms': ['noseband_(for_animals)', 'nosepiece_(for_animals)'], 'def': "a strap that is the part of a bridle that goes over the animal's nose", 'name': 'noseband_(for_animals)'}, {'frequency': 'f', 'id': 736, 'synset': 'notebook.n.01', 'synonyms': ['notebook'], 'def': 'a book with blank pages for recording notes or memoranda', 'name': 'notebook'}, {'frequency': 'c', 'id': 737, 'synset': 'notepad.n.01', 'synonyms': ['notepad'], 'def': 'a pad of paper for keeping notes', 'name': 'notepad'}, {'frequency': 'c', 'id': 738, 'synset': 'nut.n.03', 'synonyms': ['nut'], 'def': 'a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt', 'name': 'nut'}, {'frequency': 'r', 'id': 739, 'synset': 'nutcracker.n.01', 'synonyms': ['nutcracker'], 'def': 'a hand tool used to crack nuts open', 'name': 'nutcracker'}, {'frequency': 'c', 'id': 740, 'synset': 'oar.n.01', 'synonyms': ['oar'], 'def': 'an implement used to propel or steer a boat', 'name': 'oar'}, {'frequency': 'r', 'id': 741, 'synset': 'octopus.n.01', 'synonyms': ['octopus_(food)'], 'def': 'tentacles of octopus prepared as food', 'name': 'octopus_(food)'}, {'frequency': 'r', 'id': 742, 'synset': 'octopus.n.02', 'synonyms': ['octopus_(animal)'], 'def': 'bottom-living cephalopod having a soft oval body with eight long tentacles', 'name': 'octopus_(animal)'}, {'frequency': 'c', 'id': 743, 'synset': 'oil_lamp.n.01', 'synonyms': ['oil_lamp', 'kerosene_lamp', 'kerosine_lamp'], 'def': 'a lamp that burns oil (as kerosine) for light', 'name': 'oil_lamp'}, {'frequency': 'c', 'id': 744, 'synset': 'olive_oil.n.01', 'synonyms': ['olive_oil'], 'def': 'oil from olives', 'name': 'olive_oil'}, 
{'frequency': 'r', 'id': 745, 'synset': 'omelet.n.01', 'synonyms': ['omelet', 'omelette'], 'def': 'beaten eggs cooked until just set; may be folded around e.g. ham or cheese or jelly', 'name': 'omelet'}, {'frequency': 'f', 'id': 746, 'synset': 'onion.n.01', 'synonyms': ['onion'], 'def': 'the bulb of an onion plant', 'name': 'onion'}, {'frequency': 'f', 'id': 747, 'synset': 'orange.n.01', 'synonyms': ['orange_(fruit)'], 'def': 'orange (FRUIT of an orange tree)', 'name': 'orange_(fruit)'}, {'frequency': 'c', 'id': 748, 'synset': 'orange_juice.n.01', 'synonyms': ['orange_juice'], 'def': 'bottled or freshly squeezed juice of oranges', 'name': 'orange_juice'}, {'frequency': 'r', 'id': 749, 'synset': 'oregano.n.01', 'synonyms': ['oregano', 'marjoram'], 'def': 'aromatic Eurasian perennial herb used in cooking and baking', 'name': 'oregano'}, {'frequency': 'c', 'id': 750, 'synset': 'ostrich.n.02', 'synonyms': ['ostrich'], 'def': 'fast-running African flightless bird with two-toed feet; largest living bird', 'name': 'ostrich'}, {'frequency': 'c', 'id': 751, 'synset': 'ottoman.n.03', 'synonyms': ['ottoman', 'pouf', 'pouffe', 'hassock'], 'def': 'thick cushion used as a seat', 'name': 'ottoman'}, {'frequency': 'c', 'id': 752, 'synset': 'overall.n.01', 'synonyms': ['overalls_(clothing)'], 'def': 'work clothing consisting of denim trousers usually with a bib and shoulder straps', 'name': 'overalls_(clothing)'}, {'frequency': 'c', 'id': 753, 'synset': 'owl.n.01', 'synonyms': ['owl'], 'def': 'nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes', 'name': 'owl'}, {'frequency': 'c', 'id': 754, 'synset': 'packet.n.03', 'synonyms': ['packet'], 'def': 'a small package or bundle', 'name': 'packet'}, {'frequency': 'r', 'id': 755, 'synset': 'pad.n.03', 'synonyms': ['inkpad', 'inking_pad', 'stamp_pad'], 'def': 'absorbent material saturated with ink used to transfer ink evenly to a rubber stamp', 'name': 'inkpad'}, {'frequency': 'c', 'id': 756, 'synset': 'pad.n.04', 'synonyms': ['pad'], 'def': 'a flat mass of soft material used for protection, stuffing, or comfort', 'name': 'pad'}, {'frequency': 'c', 'id': 757, 'synset': 'paddle.n.04', 'synonyms': ['paddle', 'boat_paddle'], 'def': 'a short light oar used without an oarlock to propel a canoe or small boat', 'name': 'paddle'}, {'frequency': 'c', 'id': 758, 'synset': 'padlock.n.01', 'synonyms': ['padlock'], 'def': 'a detachable, portable lock', 'name': 'padlock'}, {'frequency': 'r', 'id': 759, 'synset': 'paintbox.n.01', 'synonyms': ['paintbox'], 'def': "a box containing a collection of cubes or tubes of artists' paint", 'name': 'paintbox'}, {'frequency': 'c', 'id': 760, 'synset': 'paintbrush.n.01', 'synonyms': ['paintbrush'], 'def': 'a brush used as an applicator to apply paint', 'name': 'paintbrush'}, {'frequency': 'f', 'id': 761, 'synset': 'painting.n.01', 'synonyms': ['painting'], 'def': 'graphic art consisting of an artistic composition made by applying paints to a surface', 'name': 'painting'}, {'frequency': 'c', 'id': 762, 'synset': 'pajama.n.02', 'synonyms': ['pajamas', 'pyjamas'], 'def': 'loose-fitting nightclothes worn for sleeping or lounging', 'name': 'pajamas'}, {'frequency': 'c', 'id': 763, 'synset': 'palette.n.02', 'synonyms': ['palette', 'pallet'], 'def': 'board that provides a flat surface on which artists mix paints and the range of colors used', 'name': 'palette'}, {'frequency': 'f', 'id': 764, 'synset': 'pan.n.01', 'synonyms': ['pan_(for_cooking)', 'cooking_pan'], 'def': 'cooking utensil consisting of a wide 
metal vessel', 'name': 'pan_(for_cooking)'}, {'frequency': 'r', 'id': 765, 'synset': 'pan.n.03', 'synonyms': ['pan_(metal_container)'], 'def': 'shallow container made of metal', 'name': 'pan_(metal_container)'}, {'frequency': 'c', 'id': 766, 'synset': 'pancake.n.01', 'synonyms': ['pancake'], 'def': 'a flat cake of thin batter fried on both sides on a griddle', 'name': 'pancake'}, {'frequency': 'r', 'id': 767, 'synset': 'pantyhose.n.01', 'synonyms': ['pantyhose'], 'def': "a woman's tights consisting of underpants and stockings", 'name': 'pantyhose'}, {'frequency': 'r', 'id': 768, 'synset': 'papaya.n.02', 'synonyms': ['papaya'], 'def': 'large oval melon-like tropical fruit with yellowish flesh', 'name': 'papaya'}, {'frequency': 'r', 'id': 769, 'synset': 'paper_clip.n.01', 'synonyms': ['paperclip'], 'def': 'a wire or plastic clip for holding sheets of paper together', 'name': 'paperclip'}, {'frequency': 'f', 'id': 770, 'synset': 'paper_plate.n.01', 'synonyms': ['paper_plate'], 'def': 'a disposable plate made of cardboard', 'name': 'paper_plate'}, {'frequency': 'f', 'id': 771, 'synset': 'paper_towel.n.01', 'synonyms': ['paper_towel'], 'def': 'a disposable towel made of absorbent paper', 'name': 'paper_towel'}, {'frequency': 'r', 'id': 772, 'synset': 'paperback_book.n.01', 'synonyms': ['paperback_book', 'paper-back_book', 'softback_book', 'soft-cover_book'], 'def': 'a book with paper covers', 'name': 'paperback_book'}, {'frequency': 'r', 'id': 773, 'synset': 'paperweight.n.01', 'synonyms': ['paperweight'], 'def': 'a weight used to hold down a stack of papers', 'name': 'paperweight'}, {'frequency': 'c', 'id': 774, 'synset': 'parachute.n.01', 'synonyms': ['parachute'], 'def': 'rescue equipment consisting of a device that fills with air and retards your fall', 'name': 'parachute'}, {'frequency': 'r', 'id': 775, 'synset': 'parakeet.n.01', 'synonyms': ['parakeet', 'parrakeet', 'parroket', 'paraquet', 'paroquet', 'parroquet'], 'def': 'any of numerous small slender long-tailed parrots', 'name': 'parakeet'}, {'frequency': 'c', 'id': 776, 'synset': 'parasail.n.01', 'synonyms': ['parasail_(sports)'], 'def': 'parachute that will lift a person up into the air when it is towed by a motorboat or a car', 'name': 'parasail_(sports)'}, {'frequency': 'r', 'id': 777, 'synset': 'parchment.n.01', 'synonyms': ['parchment'], 'def': 'a superior paper resembling sheepskin', 'name': 'parchment'}, {'frequency': 'r', 'id': 778, 'synset': 'parka.n.01', 'synonyms': ['parka', 'anorak'], 'def': "a kind of heavy jacket (`windcheater' is a British term)", 'name': 'parka'}, {'frequency': 'f', 'id': 779, 'synset': 'parking_meter.n.01', 'synonyms': ['parking_meter'], 'def': 'a coin-operated timer located next to a parking space', 'name': 'parking_meter'}, {'frequency': 'c', 'id': 780, 'synset': 'parrot.n.01', 'synonyms': ['parrot'], 'def': 'usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds', 'name': 'parrot'}, {'frequency': 'c', 'id': 781, 'synset': 'passenger_car.n.01', 'synonyms': ['passenger_car_(part_of_a_train)', 'coach_(part_of_a_train)'], 'def': 'a railcar where passengers ride', 'name': 'passenger_car_(part_of_a_train)'}, {'frequency': 'r', 'id': 782, 'synset': 'passenger_ship.n.01', 'synonyms': ['passenger_ship'], 'def': 'a ship built to carry passengers', 'name': 'passenger_ship'}, {'frequency': 'r', 'id': 783, 'synset': 'passport.n.02', 'synonyms': ['passport'], 'def': 'a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home 
country', 'name': 'passport'}, {'frequency': 'f', 'id': 784, 'synset': 'pastry.n.02', 'synonyms': ['pastry'], 'def': 'any of various baked foods made of dough or batter', 'name': 'pastry'}, {'frequency': 'r', 'id': 785, 'synset': 'patty.n.01', 'synonyms': ['patty_(food)'], 'def': 'small flat mass of chopped food', 'name': 'patty_(food)'}, {'frequency': 'c', 'id': 786, 'synset': 'pea.n.01', 'synonyms': ['pea_(food)'], 'def': 'seed of a pea plant used for food', 'name': 'pea_(food)'}, {'frequency': 'c', 'id': 787, 'synset': 'peach.n.03', 'synonyms': ['peach'], 'def': 'downy juicy fruit with sweet yellowish or whitish flesh', 'name': 'peach'}, {'frequency': 'c', 'id': 788, 'synset': 'peanut_butter.n.01', 'synonyms': ['peanut_butter'], 'def': 'a spread made from ground peanuts', 'name': 'peanut_butter'}, {'frequency': 'c', 'id': 789, 'synset': 'pear.n.01', 'synonyms': ['pear'], 'def': 'sweet juicy gritty-textured fruit available in many varieties', 'name': 'pear'}, {'frequency': 'r', 'id': 790, 'synset': 'peeler.n.03', 'synonyms': ['peeler_(tool_for_fruit_and_vegetables)'], 'def': 'a device for peeling vegetables or fruits', 'name': 'peeler_(tool_for_fruit_and_vegetables)'}, {'frequency': 'r', 'id': 791, 'synset': 'pegboard.n.01', 'synonyms': ['pegboard'], 'def': 'a board perforated with regularly spaced holes into which pegs can be fitted', 'name': 'pegboard'}, {'frequency': 'c', 'id': 792, 'synset': 'pelican.n.01', 'synonyms': ['pelican'], 'def': 'large long-winged warm-water seabird having a large bill with a distensible pouch for fish', 'name': 'pelican'}, {'frequency': 'f', 'id': 793, 'synset': 'pen.n.01', 'synonyms': ['pen'], 'def': 'a writing implement with a point from which ink flows', 'name': 'pen'}, {'frequency': 'c', 'id': 794, 'synset': 'pencil.n.01', 'synonyms': ['pencil'], 'def': 'a thin cylindrical pointed writing implement made of wood and graphite', 'name': 'pencil'}, {'frequency': 'r', 'id': 795, 'synset': 'pencil_box.n.01', 'synonyms': ['pencil_box', 'pencil_case'], 'def': 'a box for holding pencils', 'name': 'pencil_box'}, {'frequency': 'r', 'id': 796, 'synset': 'pencil_sharpener.n.01', 'synonyms': ['pencil_sharpener'], 'def': 'a rotary implement for sharpening the point on pencils', 'name': 'pencil_sharpener'}, {'frequency': 'r', 'id': 797, 'synset': 'pendulum.n.01', 'synonyms': ['pendulum'], 'def': 'an apparatus consisting of an object mounted so that it swings freely under the influence of gravity', 'name': 'pendulum'}, {'frequency': 'c', 'id': 798, 'synset': 'penguin.n.01', 'synonyms': ['penguin'], 'def': 'short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers', 'name': 'penguin'}, {'frequency': 'r', 'id': 799, 'synset': 'pennant.n.02', 'synonyms': ['pennant'], 'def': 'a flag longer than it is wide (and often tapering)', 'name': 'pennant'}, {'frequency': 'r', 'id': 800, 'synset': 'penny.n.02', 'synonyms': ['penny_(coin)'], 'def': 'a coin worth one-hundredth of the value of the basic unit', 'name': 'penny_(coin)'}, {'frequency': 'c', 'id': 801, 'synset': 'pepper.n.03', 'synonyms': ['pepper', 'peppercorn'], 'def': 'pungent seasoning from the berry of the common pepper plant; whole or ground', 'name': 'pepper'}, {'frequency': 'c', 'id': 802, 'synset': 'pepper_mill.n.01', 'synonyms': ['pepper_mill', 'pepper_grinder'], 'def': 'a mill for grinding pepper', 'name': 'pepper_mill'}, {'frequency': 'c', 'id': 803, 'synset': 'perfume.n.02', 'synonyms': ['perfume'], 'def': 'a toiletry that emits and diffuses a fragrant odor', 
'name': 'perfume'}, {'frequency': 'r', 'id': 804, 'synset': 'persimmon.n.02', 'synonyms': ['persimmon'], 'def': 'orange fruit resembling a plum; edible when fully ripe', 'name': 'persimmon'}, {'frequency': 'f', 'id': 805, 'synset': 'person.n.01', 'synonyms': ['baby', 'child', 'boy', 'girl', 'man', 'woman', 'person', 'human'], 'def': 'a human being', 'name': 'baby'}, {'frequency': 'r', 'id': 806, 'synset': 'pet.n.01', 'synonyms': ['pet'], 'def': 'a domesticated animal kept for companionship or amusement', 'name': 'pet'}, {'frequency': 'r', 'id': 807, 'synset': 'petfood.n.01', 'synonyms': ['petfood', 'pet-food'], 'def': 'food prepared for animal pets', 'name': 'petfood'}, {'frequency': 'r', 'id': 808, 'synset': 'pew.n.01', 'synonyms': ['pew_(church_bench)', 'church_bench'], 'def': 'long bench with backs; used in church by the congregation', 'name': 'pew_(church_bench)'}, {'frequency': 'r', 'id': 809, 'synset': 'phonebook.n.01', 'synonyms': ['phonebook', 'telephone_book', 'telephone_directory'], 'def': 'a directory containing an alphabetical list of telephone subscribers and their telephone numbers', 'name': 'phonebook'}, {'frequency': 'c', 'id': 810, 'synset': 'phonograph_record.n.01', 'synonyms': ['phonograph_record', 'phonograph_recording', 'record_(phonograph_recording)'], 'def': 'sound recording consisting of a typically black disk with a continuous groove', 'name': 'phonograph_record'}, {'frequency': 'c', 'id': 811, 'synset': 'piano.n.01', 'synonyms': ['piano'], 'def': 'a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds', 'name': 'piano'}, {'frequency': 'f', 'id': 812, 'synset': 'pickle.n.01', 'synonyms': ['pickle'], 'def': 'vegetables (especially cucumbers) preserved in brine or vinegar', 'name': 'pickle'}, {'frequency': 'f', 'id': 813, 'synset': 'pickup.n.01', 'synonyms': ['pickup_truck'], 'def': 'a light truck with an open body and low sides and a tailboard', 'name': 'pickup_truck'}, {'frequency': 'c', 'id': 814, 'synset': 'pie.n.01', 'synonyms': ['pie'], 'def': 'dish baked in pastry-lined pan often with a pastry top', 'name': 'pie'}, {'frequency': 'c', 'id': 815, 'synset': 'pigeon.n.01', 'synonyms': ['pigeon'], 'def': 'wild and domesticated birds having a heavy body and short legs', 'name': 'pigeon'}, {'frequency': 'r', 'id': 816, 'synset': 'piggy_bank.n.01', 'synonyms': ['piggy_bank', 'penny_bank'], 'def': "a child's coin bank (often shaped like a pig)", 'name': 'piggy_bank'}, {'frequency': 'f', 'id': 817, 'synset': 'pillow.n.01', 'synonyms': ['pillow'], 'def': 'a cushion to support the head of a sleeping person', 'name': 'pillow'}, {'frequency': 'r', 'id': 818, 'synset': 'pin.n.09', 'synonyms': ['pin_(non_jewelry)'], 'def': 'a small slender (often pointed) piece of wood or metal used to support or fasten or attach things', 'name': 'pin_(non_jewelry)'}, {'frequency': 'f', 'id': 819, 'synset': 'pineapple.n.02', 'synonyms': ['pineapple'], 'def': 'large sweet fleshy tropical fruit with a tuft of stiff leaves', 'name': 'pineapple'}, {'frequency': 'c', 'id': 820, 'synset': 'pinecone.n.01', 'synonyms': ['pinecone'], 'def': 'the seed-producing cone of a pine tree', 'name': 'pinecone'}, {'frequency': 'r', 'id': 821, 'synset': 'ping-pong_ball.n.01', 'synonyms': ['ping-pong_ball'], 'def': 'light hollow ball used in playing table tennis', 'name': 'ping-pong_ball'}, {'frequency': 'r', 'id': 822, 'synset': 'pinwheel.n.03', 'synonyms': ['pinwheel'], 'def': 'a toy consisting of vanes of colored paper or plastic that is pinned to a 
stick and spins when it is pointed into the wind', 'name': 'pinwheel'}, {'frequency': 'r', 'id': 823, 'synset': 'pipe.n.01', 'synonyms': ['tobacco_pipe'], 'def': 'a tube with a small bowl at one end; used for smoking tobacco', 'name': 'tobacco_pipe'}, {'frequency': 'f', 'id': 824, 'synset': 'pipe.n.02', 'synonyms': ['pipe', 'piping'], 'def': 'a long tube made of metal or plastic that is used to carry water or oil or gas etc.', 'name': 'pipe'}, {'frequency': 'r', 'id': 825, 'synset': 'pistol.n.01', 'synonyms': ['pistol', 'handgun'], 'def': 'a firearm that is held and fired with one hand', 'name': 'pistol'}, {'frequency': 'r', 'id': 826, 'synset': 'pita.n.01', 'synonyms': ['pita_(bread)', 'pocket_bread'], 'def': 'usually small round bread that can open into a pocket for filling', 'name': 'pita_(bread)'}, {'frequency': 'f', 'id': 827, 'synset': 'pitcher.n.02', 'synonyms': ['pitcher_(vessel_for_liquid)', 'ewer'], 'def': 'an open vessel with a handle and a spout for pouring', 'name': 'pitcher_(vessel_for_liquid)'}, {'frequency': 'r', 'id': 828, 'synset': 'pitchfork.n.01', 'synonyms': ['pitchfork'], 'def': 'a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay', 'name': 'pitchfork'}, {'frequency': 'f', 'id': 829, 'synset': 'pizza.n.01', 'synonyms': ['pizza'], 'def': 'Italian open pie made of thin bread dough spread with a spiced mixture of e.g. tomato sauce and cheese', 'name': 'pizza'}, {'frequency': 'f', 'id': 830, 'synset': 'place_mat.n.01', 'synonyms': ['place_mat'], 'def': 'a mat placed on a table for an individual place setting', 'name': 'place_mat'}, {'frequency': 'f', 'id': 831, 'synset': 'plate.n.04', 'synonyms': ['plate'], 'def': 'dish on which food is served or from which food is eaten', 'name': 'plate'}, {'frequency': 'c', 'id': 832, 'synset': 'platter.n.01', 'synonyms': ['platter'], 'def': 'a large shallow dish used for serving food', 'name': 'platter'}, {'frequency': 'r', 'id': 833, 'synset': 'playing_card.n.01', 'synonyms': ['playing_card'], 'def': 'one of a pack of cards that are used to play card games', 'name': 'playing_card'}, {'frequency': 'r', 'id': 834, 'synset': 'playpen.n.01', 'synonyms': ['playpen'], 'def': 'a portable enclosure in which babies may be left to play', 'name': 'playpen'}, {'frequency': 'c', 'id': 835, 'synset': 'pliers.n.01', 'synonyms': ['pliers', 'plyers'], 'def': 'a gripping hand tool with two hinged arms and (usually) serrated jaws', 'name': 'pliers'}, {'frequency': 'r', 'id': 836, 'synset': 'plow.n.01', 'synonyms': ['plow_(farm_equipment)', 'plough_(farm_equipment)'], 'def': 'a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing', 'name': 'plow_(farm_equipment)'}, {'frequency': 'r', 'id': 837, 'synset': 'pocket_watch.n.01', 'synonyms': ['pocket_watch'], 'def': 'a watch that is carried in a small watch pocket', 'name': 'pocket_watch'}, {'frequency': 'c', 'id': 838, 'synset': 'pocketknife.n.01', 'synonyms': ['pocketknife'], 'def': 'a knife with a blade that folds into the handle; suitable for carrying in the pocket', 'name': 'pocketknife'}, {'frequency': 'c', 'id': 839, 'synset': 'poker.n.01', 'synonyms': ['poker_(fire_stirring_tool)', 'stove_poker', 'fire_hook'], 'def': 'fire iron consisting of a metal rod with a handle; used to stir a fire', 'name': 'poker_(fire_stirring_tool)'}, {'frequency': 'f', 'id': 840, 'synset': 'pole.n.01', 'synonyms': ['pole', 'post'], 'def': 'a long (usually round) rod of wood or metal or plastic', 'name': 'pole'}, {'frequency': 'r', 'id': 841, 'synset': 
'police_van.n.01', 'synonyms': ['police_van', 'police_wagon', 'paddy_wagon', 'patrol_wagon'], 'def': 'van used by police to transport prisoners', 'name': 'police_van'}, {'frequency': 'f', 'id': 842, 'synset': 'polo_shirt.n.01', 'synonyms': ['polo_shirt', 'sport_shirt'], 'def': 'a shirt with short sleeves designed for comfort and casual wear', 'name': 'polo_shirt'}, {'frequency': 'r', 'id': 843, 'synset': 'poncho.n.01', 'synonyms': ['poncho'], 'def': 'a blanket-like cloak with a hole in the center for the head', 'name': 'poncho'}, {'frequency': 'c', 'id': 844, 'synset': 'pony.n.05', 'synonyms': ['pony'], 'def': 'any of various breeds of small gentle horses usually less than five feet high at the shoulder', 'name': 'pony'}, {'frequency': 'r', 'id': 845, 'synset': 'pool_table.n.01', 'synonyms': ['pool_table', 'billiard_table', 'snooker_table'], 'def': 'game equipment consisting of a heavy table on which pool is played', 'name': 'pool_table'}, {'frequency': 'f', 'id': 846, 'synset': 'pop.n.02', 'synonyms': ['pop_(soda)', 'soda_(pop)', 'tonic', 'soft_drink'], 'def': 'a sweet drink containing carbonated water and flavoring', 'name': 'pop_(soda)'}, {'frequency': 'r', 'id': 847, 'synset': 'portrait.n.02', 'synonyms': ['portrait', 'portrayal'], 'def': 'any likeness of a person, in any medium', 'name': 'portrait'}, {'frequency': 'c', 'id': 848, 'synset': 'postbox.n.01', 'synonyms': ['postbox_(public)', 'mailbox_(public)'], 'def': 'public box for deposit of mail', 'name': 'postbox_(public)'}, {'frequency': 'c', 'id': 849, 'synset': 'postcard.n.01', 'synonyms': ['postcard', 'postal_card', 'mailing-card'], 'def': 'a card for sending messages by post without an envelope', 'name': 'postcard'}, {'frequency': 'f', 'id': 850, 'synset': 'poster.n.01', 'synonyms': ['poster', 'placard'], 'def': 'a sign posted in a public place as an advertisement', 'name': 'poster'}, {'frequency': 'f', 'id': 851, 'synset': 'pot.n.01', 'synonyms': ['pot'], 'def': 'metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid', 'name': 'pot'}, {'frequency': 'f', 'id': 852, 'synset': 'pot.n.04', 'synonyms': ['flowerpot'], 'def': 'a container in which plants are cultivated', 'name': 'flowerpot'}, {'frequency': 'f', 'id': 853, 'synset': 'potato.n.01', 'synonyms': ['potato'], 'def': 'an edible tuber native to South America', 'name': 'potato'}, {'frequency': 'c', 'id': 854, 'synset': 'potholder.n.01', 'synonyms': ['potholder'], 'def': 'an insulated pad for holding hot pots', 'name': 'potholder'}, {'frequency': 'c', 'id': 855, 'synset': 'pottery.n.01', 'synonyms': ['pottery', 'clayware'], 'def': 'ceramic ware made from clay and baked in a kiln', 'name': 'pottery'}, {'frequency': 'c', 'id': 856, 'synset': 'pouch.n.01', 'synonyms': ['pouch'], 'def': 'a small or medium size container for holding or carrying things', 'name': 'pouch'}, {'frequency': 'r', 'id': 857, 'synset': 'power_shovel.n.01', 'synonyms': ['power_shovel', 'excavator', 'digger'], 'def': 'a machine for excavating', 'name': 'power_shovel'}, {'frequency': 'c', 'id': 858, 'synset': 'prawn.n.01', 'synonyms': ['prawn', 'shrimp'], 'def': 'any of various edible decapod crustaceans', 'name': 'prawn'}, {'frequency': 'f', 'id': 859, 'synset': 'printer.n.03', 'synonyms': ['printer', 'printing_machine'], 'def': 'a machine that prints', 'name': 'printer'}, {'frequency': 'c', 'id': 860, 'synset': 'projectile.n.01', 'synonyms': ['projectile_(weapon)', 'missile'], 'def': 'a weapon that is forcibly thrown or projected at a target', 'name': 
'projectile_(weapon)'}, {'frequency': 'c', 'id': 861, 'synset': 'projector.n.02', 'synonyms': ['projector'], 'def': 'an optical instrument that projects an enlarged image onto a screen', 'name': 'projector'}, {'frequency': 'f', 'id': 862, 'synset': 'propeller.n.01', 'synonyms': ['propeller', 'propellor'], 'def': 'a mechanical device that rotates to push against air or water', 'name': 'propeller'}, {'frequency': 'r', 'id': 863, 'synset': 'prune.n.01', 'synonyms': ['prune'], 'def': 'dried plum', 'name': 'prune'}, {'frequency': 'r', 'id': 864, 'synset': 'pudding.n.01', 'synonyms': ['pudding'], 'def': 'any of various soft thick unsweetened baked dishes', 'name': 'pudding'}, {'frequency': 'r', 'id': 865, 'synset': 'puffer.n.02', 'synonyms': ['puffer_(fish)', 'pufferfish', 'blowfish', 'globefish'], 'def': 'fishes whose elongated spiny body can inflate itself with water or air to form a globe', 'name': 'puffer_(fish)'}, {'frequency': 'r', 'id': 866, 'synset': 'puffin.n.01', 'synonyms': ['puffin'], 'def': 'seabirds having short necks and brightly colored compressed bills', 'name': 'puffin'}, {'frequency': 'r', 'id': 867, 'synset': 'pug.n.01', 'synonyms': ['pug-dog'], 'def': 'small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle', 'name': 'pug-dog'}, {'frequency': 'c', 'id': 868, 'synset': 'pumpkin.n.02', 'synonyms': ['pumpkin'], 'def': 'usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn', 'name': 'pumpkin'}, {'frequency': 'r', 'id': 869, 'synset': 'punch.n.03', 'synonyms': ['puncher'], 'def': 'a tool for making holes or indentations', 'name': 'puncher'}, {'frequency': 'r', 'id': 870, 'synset': 'puppet.n.01', 'synonyms': ['puppet', 'marionette'], 'def': 'a small figure of a person operated from above with strings by a puppeteer', 'name': 'puppet'}, {'frequency': 'r', 'id': 871, 'synset': 'puppy.n.01', 'synonyms': ['puppy'], 'def': 'a young dog', 'name': 'puppy'}, {'frequency': 'r', 'id': 872, 'synset': 'quesadilla.n.01', 'synonyms': ['quesadilla'], 'def': 'a tortilla that is filled with cheese and heated', 'name': 'quesadilla'}, {'frequency': 'r', 'id': 873, 'synset': 'quiche.n.02', 'synonyms': ['quiche'], 'def': 'a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)', 'name': 'quiche'}, {'frequency': 'f', 'id': 874, 'synset': 'quilt.n.01', 'synonyms': ['quilt', 'comforter'], 'def': 'bedding made of two layers of cloth filled with stuffing and stitched together', 'name': 'quilt'}, {'frequency': 'c', 'id': 875, 'synset': 'rabbit.n.01', 'synonyms': ['rabbit'], 'def': 'any of various burrowing animals of the family Leporidae having long ears and short tails', 'name': 'rabbit'}, {'frequency': 'r', 'id': 876, 'synset': 'racer.n.02', 'synonyms': ['race_car', 'racing_car'], 'def': 'a fast car that competes in races', 'name': 'race_car'}, {'frequency': 'c', 'id': 877, 'synset': 'racket.n.04', 'synonyms': ['racket', 'racquet'], 'def': 'a sports implement used to strike a ball in various games', 'name': 'racket'}, {'frequency': 'r', 'id': 878, 'synset': 'radar.n.01', 'synonyms': ['radar'], 'def': 'measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects', 'name': 'radar'}, {'frequency': 'c', 'id': 879, 'synset': 'radiator.n.03', 'synonyms': ['radiator'], 'def': 'a mechanism consisting of a metal honeycomb through which hot fluids circulate', 'name': 'radiator'}, 
{'frequency': 'c', 'id': 880, 'synset': 'radio_receiver.n.01', 'synonyms': ['radio_receiver', 'radio_set', 'radio', 'tuner_(radio)'], 'def': 'an electronic receiver that detects and demodulates and amplifies transmitted radio signals', 'name': 'radio_receiver'}, {'frequency': 'c', 'id': 881, 'synset': 'radish.n.03', 'synonyms': ['radish', 'daikon'], 'def': 'pungent edible root of any of various cultivated radish plants', 'name': 'radish'}, {'frequency': 'c', 'id': 882, 'synset': 'raft.n.01', 'synonyms': ['raft'], 'def': 'a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers', 'name': 'raft'}, {'frequency': 'r', 'id': 883, 'synset': 'rag_doll.n.01', 'synonyms': ['rag_doll'], 'def': 'a cloth doll that is stuffed and (usually) painted', 'name': 'rag_doll'}, {'frequency': 'c', 'id': 884, 'synset': 'raincoat.n.01', 'synonyms': ['raincoat', 'waterproof_jacket'], 'def': 'a water-resistant coat', 'name': 'raincoat'}, {'frequency': 'c', 'id': 885, 'synset': 'ram.n.05', 'synonyms': ['ram_(animal)'], 'def': 'uncastrated adult male sheep', 'name': 'ram_(animal)'}, {'frequency': 'c', 'id': 886, 'synset': 'raspberry.n.02', 'synonyms': ['raspberry'], 'def': 'red or black edible aggregate berries usually smaller than the related blackberries', 'name': 'raspberry'}, {'frequency': 'r', 'id': 887, 'synset': 'rat.n.01', 'synonyms': ['rat'], 'def': 'any of various long-tailed rodents similar to but larger than a mouse', 'name': 'rat'}, {'frequency': 'c', 'id': 888, 'synset': 'razorblade.n.01', 'synonyms': ['razorblade'], 'def': 'a blade that has a very sharp edge', 'name': 'razorblade'}, {'frequency': 'c', 'id': 889, 'synset': 'reamer.n.01', 'synonyms': ['reamer_(juicer)', 'juicer', 'juice_reamer'], 'def': 'a squeezer with a conical ridged center that is used for squeezing juice from citrus fruit', 'name': 'reamer_(juicer)'}, {'frequency': 'f', 'id': 890, 'synset': 'rearview_mirror.n.01', 'synonyms': ['rearview_mirror'], 'def': 'car mirror that reflects the view out of the rear window', 'name': 'rearview_mirror'}, {'frequency': 'c', 'id': 891, 'synset': 'receipt.n.02', 'synonyms': ['receipt'], 'def': 'an acknowledgment (usually tangible) that payment has been made', 'name': 'receipt'}, {'frequency': 'c', 'id': 892, 'synset': 'recliner.n.01', 'synonyms': ['recliner', 'reclining_chair', 'lounger_(chair)'], 'def': 'an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it', 'name': 'recliner'}, {'frequency': 'r', 'id': 893, 'synset': 'record_player.n.01', 'synonyms': ['record_player', 'phonograph_(record_player)', 'turntable'], 'def': 'machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically', 'name': 'record_player'}, {'frequency': 'r', 'id': 894, 'synset': 'red_cabbage.n.02', 'synonyms': ['red_cabbage'], 'def': 'compact head of purplish-red leaves', 'name': 'red_cabbage'}, {'frequency': 'f', 'id': 895, 'synset': 'reflector.n.01', 'synonyms': ['reflector'], 'def': 'device that reflects light, radiation, etc.', 'name': 'reflector'}, {'frequency': 'f', 'id': 896, 'synset': 'remote_control.n.01', 'synonyms': ['remote_control'], 'def': 'a device that can be used to control a machine or apparatus from a distance', 'name': 'remote_control'}, {'frequency': 'c', 'id': 897, 'synset': 'rhinoceros.n.01', 'synonyms': ['rhinoceros'], 'def': 'massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the 
snout', 'name': 'rhinoceros'}, {'frequency': 'r', 'id': 898, 'synset': 'rib.n.03', 'synonyms': ['rib_(food)'], 'def': 'cut of meat including one or more ribs', 'name': 'rib_(food)'}, {'frequency': 'r', 'id': 899, 'synset': 'rifle.n.01', 'synonyms': ['rifle'], 'def': 'a shoulder firearm with a long barrel', 'name': 'rifle'}, {'frequency': 'f', 'id': 900, 'synset': 'ring.n.08', 'synonyms': ['ring'], 'def': 'jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger', 'name': 'ring'}, {'frequency': 'r', 'id': 901, 'synset': 'river_boat.n.01', 'synonyms': ['river_boat'], 'def': 'a boat used on rivers or to ply a river', 'name': 'river_boat'}, {'frequency': 'r', 'id': 902, 'synset': 'road_map.n.02', 'synonyms': ['road_map'], 'def': '(NOT A ROAD) a MAP showing roads (for automobile travel)', 'name': 'road_map'}, {'frequency': 'c', 'id': 903, 'synset': 'robe.n.01', 'synonyms': ['robe'], 'def': 'any loose flowing garment', 'name': 'robe'}, {'frequency': 'c', 'id': 904, 'synset': 'rocking_chair.n.01', 'synonyms': ['rocking_chair'], 'def': 'a chair mounted on rockers', 'name': 'rocking_chair'}, {'frequency': 'r', 'id': 905, 'synset': 'roller_skate.n.01', 'synonyms': ['roller_skate'], 'def': 'a shoe with pairs of rollers (small hard wheels) fixed to the sole', 'name': 'roller_skate'}, {'frequency': 'r', 'id': 906, 'synset': 'rollerblade.n.01', 'synonyms': ['Rollerblade'], 'def': 'an in-line variant of a roller skate', 'name': 'Rollerblade'}, {'frequency': 'c', 'id': 907, 'synset': 'rolling_pin.n.01', 'synonyms': ['rolling_pin'], 'def': 'utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough', 'name': 'rolling_pin'}, {'frequency': 'r', 'id': 908, 'synset': 'root_beer.n.01', 'synonyms': ['root_beer'], 'def': 'carbonated drink containing extracts of roots and herbs', 'name': 'root_beer'}, {'frequency': 'c', 'id': 909, 'synset': 'router.n.02', 'synonyms': ['router_(computer_equipment)'], 'def': 'a device that forwards data packets between computer networks', 'name': 'router_(computer_equipment)'}, {'frequency': 'f', 'id': 910, 'synset': 'rubber_band.n.01', 'synonyms': ['rubber_band', 'elastic_band'], 'def': 'a narrow band of elastic rubber used to hold things (such as papers) together', 'name': 'rubber_band'}, {'frequency': 'c', 'id': 911, 'synset': 'runner.n.08', 'synonyms': ['runner_(carpet)'], 'def': 'a long narrow carpet', 'name': 'runner_(carpet)'}, {'frequency': 'f', 'id': 912, 'synset': 'sack.n.01', 'synonyms': ['plastic_bag', 'paper_bag'], 'def': "a bag made of paper or plastic for holding customer's purchases", 'name': 'plastic_bag'}, {'frequency': 'f', 'id': 913, 'synset': 'saddle.n.01', 'synonyms': ['saddle_(on_an_animal)'], 'def': 'a seat for the rider of a horse or camel', 'name': 'saddle_(on_an_animal)'}, {'frequency': 'f', 'id': 914, 'synset': 'saddle_blanket.n.01', 'synonyms': ['saddle_blanket', 'saddlecloth', 'horse_blanket'], 'def': 'stable gear consisting of a blanket placed under the saddle', 'name': 'saddle_blanket'}, {'frequency': 'c', 'id': 915, 'synset': 'saddlebag.n.01', 'synonyms': ['saddlebag'], 'def': 'a large bag (or pair of bags) hung over a saddle', 'name': 'saddlebag'}, {'frequency': 'r', 'id': 916, 'synset': 'safety_pin.n.01', 'synonyms': ['safety_pin'], 'def': 'a pin in the form of a clasp; has a guard so the point of the pin will not stick the user', 'name': 'safety_pin'}, {'frequency': 'c', 'id': 917, 'synset': 'sail.n.01', 'synonyms': ['sail'], 'def': 'a large piece of fabric by means of 
which wind is used to propel a sailing vessel', 'name': 'sail'}, {'frequency': 'c', 'id': 918, 'synset': 'salad.n.01', 'synonyms': ['salad'], 'def': 'food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens', 'name': 'salad'}, {'frequency': 'r', 'id': 919, 'synset': 'salad_plate.n.01', 'synonyms': ['salad_plate', 'salad_bowl'], 'def': 'a plate or bowl for individual servings of salad', 'name': 'salad_plate'}, {'frequency': 'r', 'id': 920, 'synset': 'salami.n.01', 'synonyms': ['salami'], 'def': 'highly seasoned fatty sausage of pork and beef usually dried', 'name': 'salami'}, {'frequency': 'r', 'id': 921, 'synset': 'salmon.n.01', 'synonyms': ['salmon_(fish)'], 'def': 'any of various large food and game fishes of northern waters', 'name': 'salmon_(fish)'}, {'frequency': 'r', 'id': 922, 'synset': 'salmon.n.03', 'synonyms': ['salmon_(food)'], 'def': 'flesh of any of various marine or freshwater fish of the family Salmonidae', 'name': 'salmon_(food)'}, {'frequency': 'r', 'id': 923, 'synset': 'salsa.n.01', 'synonyms': ['salsa'], 'def': 'spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods', 'name': 'salsa'}, {'frequency': 'f', 'id': 924, 'synset': 'saltshaker.n.01', 'synonyms': ['saltshaker'], 'def': 'a shaker with a perforated top for sprinkling salt', 'name': 'saltshaker'}, {'frequency': 'f', 'id': 925, 'synset': 'sandal.n.01', 'synonyms': ['sandal_(type_of_shoe)'], 'def': 'a shoe consisting of a sole fastened by straps to the foot', 'name': 'sandal_(type_of_shoe)'}, {'frequency': 'f', 'id': 926, 'synset': 'sandwich.n.01', 'synonyms': ['sandwich'], 'def': 'two (or more) slices of bread with a filling between them', 'name': 'sandwich'}, {'frequency': 'r', 'id': 927, 'synset': 'satchel.n.01', 'synonyms': ['satchel'], 'def': 'luggage consisting of a small case with a flat bottom and (usually) a shoulder strap', 'name': 'satchel'}, {'frequency': 'r', 'id': 928, 'synset': 'saucepan.n.01', 'synonyms': ['saucepan'], 'def': 'a deep pan with a handle; used for stewing or boiling', 'name': 'saucepan'}, {'frequency': 'f', 'id': 929, 'synset': 'saucer.n.02', 'synonyms': ['saucer'], 'def': 'a small shallow dish for holding a cup at the table', 'name': 'saucer'}, {'frequency': 'f', 'id': 930, 'synset': 'sausage.n.01', 'synonyms': ['sausage'], 'def': 'highly seasoned minced meat stuffed in casings', 'name': 'sausage'}, {'frequency': 'r', 'id': 931, 'synset': 'sawhorse.n.01', 'synonyms': ['sawhorse', 'sawbuck'], 'def': 'a framework for holding wood that is being sawed', 'name': 'sawhorse'}, {'frequency': 'r', 'id': 932, 'synset': 'sax.n.02', 'synonyms': ['saxophone'], 'def': "a wind instrument with a `J'-shaped form typically made of brass", 'name': 'saxophone'}, {'frequency': 'f', 'id': 933, 'synset': 'scale.n.07', 'synonyms': ['scale_(measuring_instrument)'], 'def': 'a measuring instrument for weighing; shows amount of mass', 'name': 'scale_(measuring_instrument)'}, {'frequency': 'r', 'id': 934, 'synset': 'scarecrow.n.01', 'synonyms': ['scarecrow', 'strawman'], 'def': 'an effigy in the shape of a man to frighten birds away from seeds', 'name': 'scarecrow'}, {'frequency': 'f', 'id': 935, 'synset': 'scarf.n.01', 'synonyms': ['scarf'], 'def': 'a garment worn around the head or neck or shoulders for warmth or decoration', 'name': 'scarf'}, {'frequency': 'c', 'id': 936, 'synset': 'school_bus.n.01', 'synonyms': ['school_bus'], 'def': 'a bus used to transport children to or from school', 'name': 'school_bus'}, 
{'frequency': 'f', 'id': 937, 'synset': 'scissors.n.01', 'synonyms': ['scissors'], 'def': 'a tool having two crossed pivoting blades with looped handles', 'name': 'scissors'}, {'frequency': 'c', 'id': 938, 'synset': 'scoreboard.n.01', 'synonyms': ['scoreboard'], 'def': 'a large board for displaying the score of a contest (and some other information)', 'name': 'scoreboard'}, {'frequency': 'c', 'id': 939, 'synset': 'scrambled_eggs.n.01', 'synonyms': ['scrambled_eggs'], 'def': 'eggs beaten and cooked to a soft firm consistency while stirring', 'name': 'scrambled_eggs'}, {'frequency': 'r', 'id': 940, 'synset': 'scraper.n.01', 'synonyms': ['scraper'], 'def': 'any of various hand tools for scraping', 'name': 'scraper'}, {'frequency': 'r', 'id': 941, 'synset': 'scratcher.n.03', 'synonyms': ['scratcher'], 'def': 'a device used for scratching', 'name': 'scratcher'}, {'frequency': 'c', 'id': 942, 'synset': 'screwdriver.n.01', 'synonyms': ['screwdriver'], 'def': 'a hand tool for driving screws; has a tip that fits into the head of a screw', 'name': 'screwdriver'}, {'frequency': 'c', 'id': 943, 'synset': 'scrub_brush.n.01', 'synonyms': ['scrubbing_brush'], 'def': 'a brush with short stiff bristles for heavy cleaning', 'name': 'scrubbing_brush'}, {'frequency': 'c', 'id': 944, 'synset': 'sculpture.n.01', 'synonyms': ['sculpture'], 'def': 'a three-dimensional work of art', 'name': 'sculpture'}, {'frequency': 'r', 'id': 945, 'synset': 'seabird.n.01', 'synonyms': ['seabird', 'seafowl'], 'def': 'a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.', 'name': 'seabird'}, {'frequency': 'r', 'id': 946, 'synset': 'seahorse.n.02', 'synonyms': ['seahorse'], 'def': 'small fish with horse-like heads bent sharply downward and curled tails', 'name': 'seahorse'}, {'frequency': 'r', 'id': 947, 'synset': 'seaplane.n.01', 'synonyms': ['seaplane', 'hydroplane'], 'def': 'an airplane that can land on or take off from water', 'name': 'seaplane'}, {'frequency': 'c', 'id': 948, 'synset': 'seashell.n.01', 'synonyms': ['seashell'], 'def': 'the shell of a marine organism', 'name': 'seashell'}, {'frequency': 'r', 'id': 949, 'synset': 'seedling.n.01', 'synonyms': ['seedling'], 'def': 'young plant or tree grown from a seed', 'name': 'seedling'}, {'frequency': 'c', 'id': 950, 'synset': 'serving_dish.n.01', 'synonyms': ['serving_dish'], 'def': 'a dish used for serving food', 'name': 'serving_dish'}, {'frequency': 'r', 'id': 951, 'synset': 'sewing_machine.n.01', 'synonyms': ['sewing_machine'], 'def': 'a textile machine used as a home appliance for sewing', 'name': 'sewing_machine'}, {'frequency': 'r', 'id': 952, 'synset': 'shaker.n.03', 'synonyms': ['shaker'], 'def': 'a container in which something can be shaken', 'name': 'shaker'}, {'frequency': 'c', 'id': 953, 'synset': 'shampoo.n.01', 'synonyms': ['shampoo'], 'def': 'cleansing agent consisting of soaps or detergents used for washing the hair', 'name': 'shampoo'}, {'frequency': 'r', 'id': 954, 'synset': 'shark.n.01', 'synonyms': ['shark'], 'def': 'typically large carnivorous fishes with sharp teeth', 'name': 'shark'}, {'frequency': 'r', 'id': 955, 'synset': 'sharpener.n.01', 'synonyms': ['sharpener'], 'def': 'any implement that is used to make something (an edge or a point) sharper', 'name': 'sharpener'}, {'frequency': 'r', 'id': 956, 'synset': 'sharpie.n.03', 'synonyms': ['Sharpie'], 'def': 'a pen with indelible ink that will write on any surface', 'name': 'Sharpie'}, {'frequency': 'r', 'id': 957, 'synset': 
'shaver.n.03', 'synonyms': ['shaver_(electric)', 'electric_shaver', 'electric_razor'], 'def': 'a razor powered by an electric motor', 'name': 'shaver_(electric)'}, {'frequency': 'c', 'id': 958, 'synset': 'shaving_cream.n.01', 'synonyms': ['shaving_cream', 'shaving_soap'], 'def': 'toiletry that forms a rich lather for softening the beard before shaving', 'name': 'shaving_cream'}, {'frequency': 'r', 'id': 959, 'synset': 'shawl.n.01', 'synonyms': ['shawl'], 'def': 'cloak consisting of an oblong piece of cloth used to cover the head and shoulders', 'name': 'shawl'}, {'frequency': 'r', 'id': 960, 'synset': 'shears.n.01', 'synonyms': ['shears'], 'def': 'large scissors with strong blades', 'name': 'shears'}, {'frequency': 'f', 'id': 961, 'synset': 'sheep.n.01', 'synonyms': ['sheep'], 'def': 'woolly usually horned ruminant mammal related to the goat', 'name': 'sheep'}, {'frequency': 'r', 'id': 962, 'synset': 'shepherd_dog.n.01', 'synonyms': ['shepherd_dog', 'sheepdog'], 'def': 'any of various usually long-haired breeds of dog reared to herd and guard sheep', 'name': 'shepherd_dog'}, {'frequency': 'r', 'id': 963, 'synset': 'sherbert.n.01', 'synonyms': ['sherbert', 'sherbet'], 'def': 'a frozen dessert made primarily of fruit juice and sugar', 'name': 'sherbert'}, {'frequency': 'r', 'id': 964, 'synset': 'shield.n.02', 'synonyms': ['shield'], 'def': 'armor carried on the arm to intercept blows', 'name': 'shield'}, {'frequency': 'f', 'id': 965, 'synset': 'shirt.n.01', 'synonyms': ['shirt'], 'def': 'a garment worn on the upper half of the body', 'name': 'shirt'}, {'frequency': 'f', 'id': 966, 'synset': 'shoe.n.01', 'synonyms': ['shoe', 'sneaker_(type_of_shoe)', 'tennis_shoe'], 'def': 'common footwear covering the foot', 'name': 'shoe'}, {'frequency': 'c', 'id': 967, 'synset': 'shopping_bag.n.01', 'synonyms': ['shopping_bag'], 'def': 'a bag made of plastic or strong paper (often with handles); used to transport goods after shopping', 'name': 'shopping_bag'}, {'frequency': 'c', 'id': 968, 'synset': 'shopping_cart.n.01', 'synonyms': ['shopping_cart'], 'def': 'a handcart that holds groceries or other goods while shopping', 'name': 'shopping_cart'}, {'frequency': 'f', 'id': 969, 'synset': 'short_pants.n.01', 'synonyms': ['short_pants', 'shorts_(clothing)', 'trunks_(clothing)'], 'def': 'trousers that end at or above the knee', 'name': 'short_pants'}, {'frequency': 'r', 'id': 970, 'synset': 'shot_glass.n.01', 'synonyms': ['shot_glass'], 'def': 'a small glass adequate to hold a single swallow of whiskey', 'name': 'shot_glass'}, {'frequency': 'c', 'id': 971, 'synset': 'shoulder_bag.n.01', 'synonyms': ['shoulder_bag'], 'def': 'a large handbag that can be carried by a strap looped over the shoulder', 'name': 'shoulder_bag'}, {'frequency': 'c', 'id': 972, 'synset': 'shovel.n.01', 'synonyms': ['shovel'], 'def': 'a hand tool for lifting loose material such as snow, dirt, etc.', 'name': 'shovel'}, {'frequency': 'f', 'id': 973, 'synset': 'shower.n.01', 'synonyms': ['shower_head'], 'def': 'a plumbing fixture that sprays water over you', 'name': 'shower_head'}, {'frequency': 'f', 'id': 974, 'synset': 'shower_curtain.n.01', 'synonyms': ['shower_curtain'], 'def': 'a curtain that keeps water from splashing out of the shower area', 'name': 'shower_curtain'}, {'frequency': 'r', 'id': 975, 'synset': 'shredder.n.01', 'synonyms': ['shredder_(for_paper)'], 'def': 'a device that shreds documents', 'name': 'shredder_(for_paper)'}, {'frequency': 'r', 'id': 976, 'synset': 'sieve.n.01', 'synonyms': ['sieve', 
'screen_(sieve)'], 'def': 'a strainer for separating lumps from powdered material or grading particles', 'name': 'sieve'}, {'frequency': 'f', 'id': 977, 'synset': 'signboard.n.01', 'synonyms': ['signboard'], 'def': 'structure displaying a board on which advertisements can be posted', 'name': 'signboard'}, {'frequency': 'c', 'id': 978, 'synset': 'silo.n.01', 'synonyms': ['silo'], 'def': 'a cylindrical tower used for storing goods', 'name': 'silo'}, {'frequency': 'f', 'id': 979, 'synset': 'sink.n.01', 'synonyms': ['sink'], 'def': 'plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe', 'name': 'sink'}, {'frequency': 'f', 'id': 980, 'synset': 'skateboard.n.01', 'synonyms': ['skateboard'], 'def': 'a board with wheels that is ridden in a standing or crouching position and propelled by foot', 'name': 'skateboard'}, {'frequency': 'c', 'id': 981, 'synset': 'skewer.n.01', 'synonyms': ['skewer'], 'def': 'a long pin for holding meat in position while it is being roasted', 'name': 'skewer'}, {'frequency': 'f', 'id': 982, 'synset': 'ski.n.01', 'synonyms': ['ski'], 'def': 'sports equipment for skiing on snow', 'name': 'ski'}, {'frequency': 'f', 'id': 983, 'synset': 'ski_boot.n.01', 'synonyms': ['ski_boot'], 'def': 'a stiff boot that is fastened to a ski with a ski binding', 'name': 'ski_boot'}, {'frequency': 'f', 'id': 984, 'synset': 'ski_parka.n.01', 'synonyms': ['ski_parka', 'ski_jacket'], 'def': 'a parka to be worn while skiing', 'name': 'ski_parka'}, {'frequency': 'f', 'id': 985, 'synset': 'ski_pole.n.01', 'synonyms': ['ski_pole'], 'def': 'a pole with metal points used as an aid in skiing', 'name': 'ski_pole'}, {'frequency': 'f', 'id': 986, 'synset': 'skirt.n.02', 'synonyms': ['skirt'], 'def': 'a garment hanging from the waist; worn mainly by girls and women', 'name': 'skirt'}, {'frequency': 'c', 'id': 987, 'synset': 'sled.n.01', 'synonyms': ['sled', 'sledge', 'sleigh'], 'def': 'a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.', 'name': 'sled'}, {'frequency': 'c', 'id': 988, 'synset': 'sleeping_bag.n.01', 'synonyms': ['sleeping_bag'], 'def': 'large padded bag designed to be slept in outdoors', 'name': 'sleeping_bag'}, {'frequency': 'r', 'id': 989, 'synset': 'sling.n.05', 'synonyms': ['sling_(bandage)', 'triangular_bandage'], 'def': 'bandage to support an injured forearm; slung over the shoulder or neck', 'name': 'sling_(bandage)'}, {'frequency': 'c', 'id': 990, 'synset': 'slipper.n.01', 'synonyms': ['slipper_(footwear)', 'carpet_slipper_(footwear)'], 'def': 'low footwear that can be slipped on and off easily; usually worn indoors', 'name': 'slipper_(footwear)'}, {'frequency': 'r', 'id': 991, 'synset': 'smoothie.n.02', 'synonyms': ['smoothie'], 'def': 'a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk', 'name': 'smoothie'}, {'frequency': 'r', 'id': 992, 'synset': 'snake.n.01', 'synonyms': ['snake', 'serpent'], 'def': 'limbless scaly elongate reptile; some are venomous', 'name': 'snake'}, {'frequency': 'f', 'id': 993, 'synset': 'snowboard.n.01', 'synonyms': ['snowboard'], 'def': 'a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes', 'name': 'snowboard'}, {'frequency': 'c', 'id': 994, 'synset': 'snowman.n.01', 'synonyms': ['snowman'], 'def': 'a figure of a person made of packed snow', 'name': 'snowman'}, {'frequency': 'c', 'id': 995, 'synset': 'snowmobile.n.01', 'synonyms': ['snowmobile'], 'def': 'tracked vehicle for 
travel on snow having skis in front', 'name': 'snowmobile'}, {'frequency': 'f', 'id': 996, 'synset': 'soap.n.01', 'synonyms': ['soap'], 'def': 'a cleansing agent made from the salts of vegetable or animal fats', 'name': 'soap'}, {'frequency': 'f', 'id': 997, 'synset': 'soccer_ball.n.01', 'synonyms': ['soccer_ball'], 'def': "an inflated ball used in playing soccer (called `football' outside of the United States)", 'name': 'soccer_ball'}, {'frequency': 'f', 'id': 998, 'synset': 'sock.n.01', 'synonyms': ['sock'], 'def': 'cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee', 'name': 'sock'}, {'frequency': 'r', 'id': 999, 'synset': 'soda_fountain.n.02', 'synonyms': ['soda_fountain'], 'def': 'an apparatus for dispensing soda water', 'name': 'soda_fountain'}, {'frequency': 'r', 'id': 1000, 'synset': 'soda_water.n.01', 'synonyms': ['carbonated_water', 'club_soda', 'seltzer', 'sparkling_water'], 'def': 'effervescent beverage artificially charged with carbon dioxide', 'name': 'carbonated_water'}, {'frequency': 'f', 'id': 1001, 'synset': 'sofa.n.01', 'synonyms': ['sofa', 'couch', 'lounge'], 'def': 'an upholstered seat for more than one person', 'name': 'sofa'}, {'frequency': 'r', 'id': 1002, 'synset': 'softball.n.01', 'synonyms': ['softball'], 'def': 'ball used in playing softball', 'name': 'softball'}, {'frequency': 'c', 'id': 1003, 'synset': 'solar_array.n.01', 'synonyms': ['solar_array', 'solar_battery', 'solar_panel'], 'def': 'electrical device consisting of a large array of connected solar cells', 'name': 'solar_array'}, {'frequency': 'r', 'id': 1004, 'synset': 'sombrero.n.02', 'synonyms': ['sombrero'], 'def': 'a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico', 'name': 'sombrero'}, {'frequency': 'c', 'id': 1005, 'synset': 'soup.n.01', 'synonyms': ['soup'], 'def': 'liquid food especially of meat or fish or vegetable stock often containing pieces of solid food', 'name': 'soup'}, {'frequency': 'r', 'id': 1006, 'synset': 'soup_bowl.n.01', 'synonyms': ['soup_bowl'], 'def': 'a bowl for serving soup', 'name': 'soup_bowl'}, {'frequency': 'c', 'id': 1007, 'synset': 'soupspoon.n.01', 'synonyms': ['soupspoon'], 'def': 'a spoon with a rounded bowl for eating soup', 'name': 'soupspoon'}, {'frequency': 'c', 'id': 1008, 'synset': 'sour_cream.n.01', 'synonyms': ['sour_cream', 'soured_cream'], 'def': 'soured light cream', 'name': 'sour_cream'}, {'frequency': 'r', 'id': 1009, 'synset': 'soya_milk.n.01', 'synonyms': ['soya_milk', 'soybean_milk', 'soymilk'], 'def': 'a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu', 'name': 'soya_milk'}, {'frequency': 'r', 'id': 1010, 'synset': 'space_shuttle.n.01', 'synonyms': ['space_shuttle'], 'def': "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", 'name': 'space_shuttle'}, {'frequency': 'r', 'id': 1011, 'synset': 'sparkler.n.02', 'synonyms': ['sparkler_(fireworks)'], 'def': 'a firework that burns slowly and throws out a shower of sparks', 'name': 'sparkler_(fireworks)'}, {'frequency': 'f', 'id': 1012, 'synset': 'spatula.n.02', 'synonyms': ['spatula'], 'def': 'a hand tool with a thin flexible blade used to mix or spread soft substances', 'name': 'spatula'}, {'frequency': 'r', 'id': 1013, 'synset': 'spear.n.01', 'synonyms': ['spear', 'lance'], 'def': 'a long pointed rod used as a tool or weapon', 'name': 'spear'}, {'frequency': 'f', 'id': 1014, 'synset': 'spectacles.n.01', 'synonyms': ['spectacles', 'specs', 
'eyeglasses', 'glasses'], 'def': 'optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision', 'name': 'spectacles'}, {'frequency': 'c', 'id': 1015, 'synset': 'spice_rack.n.01', 'synonyms': ['spice_rack'], 'def': 'a rack for displaying containers filled with spices', 'name': 'spice_rack'}, {'frequency': 'r', 'id': 1016, 'synset': 'spider.n.01', 'synonyms': ['spider'], 'def': 'predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body', 'name': 'spider'}, {'frequency': 'c', 'id': 1017, 'synset': 'sponge.n.01', 'synonyms': ['sponge'], 'def': 'a porous mass usable to absorb water typically used for cleaning', 'name': 'sponge'}, {'frequency': 'f', 'id': 1018, 'synset': 'spoon.n.01', 'synonyms': ['spoon'], 'def': 'a piece of cutlery with a shallow bowl-shaped container and a handle', 'name': 'spoon'}, {'frequency': 'c', 'id': 1019, 'synset': 'sportswear.n.01', 'synonyms': ['sportswear', 'athletic_wear', 'activewear'], 'def': 'attire worn for sport or for casual wear', 'name': 'sportswear'}, {'frequency': 'c', 'id': 1020, 'synset': 'spotlight.n.02', 'synonyms': ['spotlight'], 'def': 'a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer', 'name': 'spotlight'}, {'frequency': 'r', 'id': 1021, 'synset': 'squirrel.n.01', 'synonyms': ['squirrel'], 'def': 'a kind of arboreal rodent having a long bushy tail', 'name': 'squirrel'}, {'frequency': 'c', 'id': 1022, 'synset': 'stapler.n.01', 'synonyms': ['stapler_(stapling_machine)'], 'def': 'a machine that inserts staples into sheets of paper in order to fasten them together', 'name': 'stapler_(stapling_machine)'}, {'frequency': 'r', 'id': 1023, 'synset': 'starfish.n.01', 'synonyms': ['starfish', 'sea_star'], 'def': 'echinoderms characterized by five arms extending from a central disk', 'name': 'starfish'}, {'frequency': 'f', 'id': 1024, 'synset': 'statue.n.01', 'synonyms': ['statue_(sculpture)'], 'def': 'a sculpture representing a human or animal', 'name': 'statue_(sculpture)'}, {'frequency': 'c', 'id': 1025, 'synset': 'steak.n.01', 'synonyms': ['steak_(food)'], 'def': 'a slice of meat cut from the fleshy part of an animal or large fish', 'name': 'steak_(food)'}, {'frequency': 'r', 'id': 1026, 'synset': 'steak_knife.n.01', 'synonyms': ['steak_knife'], 'def': 'a sharp table knife used in eating steak', 'name': 'steak_knife'}, {'frequency': 'r', 'id': 1027, 'synset': 'steamer.n.02', 'synonyms': ['steamer_(kitchen_appliance)'], 'def': 'a cooking utensil that can be used to cook food by steaming it', 'name': 'steamer_(kitchen_appliance)'}, {'frequency': 'f', 'id': 1028, 'synset': 'steering_wheel.n.01', 'synonyms': ['steering_wheel'], 'def': 'a handwheel that is used for steering', 'name': 'steering_wheel'}, {'frequency': 'r', 'id': 1029, 'synset': 'stencil.n.01', 'synonyms': ['stencil'], 'def': 'a sheet of material (metal, plastic, etc.) 
that has been perforated with a pattern; ink or paint can pass through the perforations to create the printed pattern on the surface below', 'name': 'stencil'}, {'frequency': 'r', 'id': 1030, 'synset': 'step_ladder.n.01', 'synonyms': ['stepladder'], 'def': 'a folding portable ladder hinged at the top', 'name': 'stepladder'}, {'frequency': 'c', 'id': 1031, 'synset': 'step_stool.n.01', 'synonyms': ['step_stool'], 'def': 'a stool that has one or two steps that fold under the seat', 'name': 'step_stool'}, {'frequency': 'c', 'id': 1032, 'synset': 'stereo.n.01', 'synonyms': ['stereo_(sound_system)'], 'def': 'electronic device for playing audio', 'name': 'stereo_(sound_system)'}, {'frequency': 'r', 'id': 1033, 'synset': 'stew.n.02', 'synonyms': ['stew'], 'def': 'food prepared by stewing especially meat or fish with vegetables', 'name': 'stew'}, {'frequency': 'r', 'id': 1034, 'synset': 'stirrer.n.02', 'synonyms': ['stirrer'], 'def': 'an implement used for stirring', 'name': 'stirrer'}, {'frequency': 'f', 'id': 1035, 'synset': 'stirrup.n.01', 'synonyms': ['stirrup'], 'def': "support consisting of metal loops into which rider's feet go", 'name': 'stirrup'}, {'frequency': 'c', 'id': 1036, 'synset': 'stocking.n.01', 'synonyms': ['stockings_(leg_wear)'], 'def': 'close-fitting hosiery to cover the foot and leg; come in matched pairs', 'name': 'stockings_(leg_wear)'}, {'frequency': 'f', 'id': 1037, 'synset': 'stool.n.01', 'synonyms': ['stool'], 'def': 'a simple seat without a back or arms', 'name': 'stool'}, {'frequency': 'f', 'id': 1038, 'synset': 'stop_sign.n.01', 'synonyms': ['stop_sign'], 'def': 'a traffic sign to notify drivers that they must come to a complete stop', 'name': 'stop_sign'}, {'frequency': 'f', 'id': 1039, 'synset': 'stoplight.n.01', 'synonyms': ['brake_light'], 'def': 'a red light on the rear of a motor vehicle that signals when the brakes are applied', 'name': 'brake_light'}, {'frequency': 'f', 'id': 1040, 'synset': 'stove.n.01', 'synonyms': ['stove', 'kitchen_stove', 'range_(kitchen_appliance)', 'kitchen_range', 'cooking_stove'], 'def': 'a kitchen appliance used for cooking food', 'name': 'stove'}, {'frequency': 'c', 'id': 1041, 'synset': 'strainer.n.01', 'synonyms': ['strainer'], 'def': 'a filter to retain larger pieces while smaller pieces and liquids pass through', 'name': 'strainer'}, {'frequency': 'f', 'id': 1042, 'synset': 'strap.n.01', 'synonyms': ['strap'], 'def': 'an elongated strip of material for binding things together or holding', 'name': 'strap'}, {'frequency': 'f', 'id': 1043, 'synset': 'straw.n.04', 'synonyms': ['straw_(for_drinking)', 'drinking_straw'], 'def': 'a thin paper or plastic tube used to suck liquids into the mouth', 'name': 'straw_(for_drinking)'}, {'frequency': 'f', 'id': 1044, 'synset': 'strawberry.n.01', 'synonyms': ['strawberry'], 'def': 'sweet fleshy red fruit', 'name': 'strawberry'}, {'frequency': 'f', 'id': 1045, 'synset': 'street_sign.n.01', 'synonyms': ['street_sign'], 'def': 'a sign visible from the street', 'name': 'street_sign'}, {'frequency': 'f', 'id': 1046, 'synset': 'streetlight.n.01', 'synonyms': ['streetlight', 'street_lamp'], 'def': 'a lamp supported on a lamppost; for illuminating a street', 'name': 'streetlight'}, {'frequency': 'r', 'id': 1047, 'synset': 'string_cheese.n.01', 'synonyms': ['string_cheese'], 'def': 'cheese formed in long strings twisted together', 'name': 'string_cheese'}, {'frequency': 'r', 'id': 1048, 'synset': 'stylus.n.02', 'synonyms': ['stylus'], 'def': 'a pointed tool for writing or drawing or engraving', 'name': 
'stylus'}, {'frequency': 'r', 'id': 1049, 'synset': 'subwoofer.n.01', 'synonyms': ['subwoofer'], 'def': 'a loudspeaker that is designed to reproduce very low bass frequencies', 'name': 'subwoofer'}, {'frequency': 'r', 'id': 1050, 'synset': 'sugar_bowl.n.01', 'synonyms': ['sugar_bowl'], 'def': 'a dish in which sugar is served', 'name': 'sugar_bowl'}, {'frequency': 'r', 'id': 1051, 'synset': 'sugarcane.n.01', 'synonyms': ['sugarcane_(plant)'], 'def': 'juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice', 'name': 'sugarcane_(plant)'}, {'frequency': 'c', 'id': 1052, 'synset': 'suit.n.01', 'synonyms': ['suit_(clothing)'], 'def': 'a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color', 'name': 'suit_(clothing)'}, {'frequency': 'c', 'id': 1053, 'synset': 'sunflower.n.01', 'synonyms': ['sunflower'], 'def': 'any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays', 'name': 'sunflower'}, {'frequency': 'f', 'id': 1054, 'synset': 'sunglasses.n.01', 'synonyms': ['sunglasses'], 'def': 'spectacles that are darkened or polarized to protect the eyes from the glare of the sun', 'name': 'sunglasses'}, {'frequency': 'c', 'id': 1055, 'synset': 'sunhat.n.01', 'synonyms': ['sunhat'], 'def': 'a hat with a broad brim that protects the face from direct exposure to the sun', 'name': 'sunhat'}, {'frequency': 'r', 'id': 1056, 'synset': 'sunscreen.n.01', 'synonyms': ['sunscreen', 'sunblock'], 'def': 'a cream spread on the skin; contains a chemical to filter out ultraviolet light and so protect from sunburn', 'name': 'sunscreen'}, {'frequency': 'f', 'id': 1057, 'synset': 'surfboard.n.01', 'synonyms': ['surfboard'], 'def': 'a narrow buoyant board for riding surf', 'name': 'surfboard'}, {'frequency': 'c', 'id': 1058, 'synset': 'sushi.n.01', 'synonyms': ['sushi'], 'def': 'rice (with raw fish) wrapped in seaweed', 'name': 'sushi'}, {'frequency': 'c', 'id': 1059, 'synset': 'swab.n.02', 'synonyms': ['mop'], 'def': 'cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors', 'name': 'mop'}, {'frequency': 'c', 'id': 1060, 'synset': 'sweat_pants.n.01', 'synonyms': ['sweat_pants'], 'def': 'loose-fitting trousers with elastic cuffs; worn by athletes', 'name': 'sweat_pants'}, {'frequency': 'c', 'id': 1061, 'synset': 'sweatband.n.02', 'synonyms': ['sweatband'], 'def': 'a band of material tied around the forehead or wrist to absorb sweat', 'name': 'sweatband'}, {'frequency': 'f', 'id': 1062, 'synset': 'sweater.n.01', 'synonyms': ['sweater'], 'def': 'a crocheted or knitted garment covering the upper part of the body', 'name': 'sweater'}, {'frequency': 'f', 'id': 1063, 'synset': 'sweatshirt.n.01', 'synonyms': ['sweatshirt'], 'def': 'cotton knit pullover with long sleeves worn during athletic activity', 'name': 'sweatshirt'}, {'frequency': 'c', 'id': 1064, 'synset': 'sweet_potato.n.02', 'synonyms': ['sweet_potato'], 'def': 'the edible tuberous root of the sweet potato vine', 'name': 'sweet_potato'}, {'frequency': 'f', 'id': 1065, 'synset': 'swimsuit.n.01', 'synonyms': ['swimsuit', 'swimwear', 'bathing_suit', 'swimming_costume', 'bathing_costume', 'swimming_trunks', 'bathing_trunks'], 'def': 'garment worn for swimming', 'name': 'swimsuit'}, {'frequency': 'c', 'id': 1066, 'synset': 'sword.n.01', 'synonyms': ['sword'], 'def': 'a cutting or thrusting weapon that has a long metal blade', 'name': 'sword'}, {'frequency': 'r', 'id': 1067, 
'synset': 'syringe.n.01', 'synonyms': ['syringe'], 'def': 'a medical instrument used to inject or withdraw fluids', 'name': 'syringe'}, {'frequency': 'r', 'id': 1068, 'synset': 'tabasco.n.02', 'synonyms': ['Tabasco_sauce'], 'def': 'very spicy sauce (trade name Tabasco) made from fully-aged red peppers', 'name': 'Tabasco_sauce'}, {'frequency': 'r', 'id': 1069, 'synset': 'table-tennis_table.n.01', 'synonyms': ['table-tennis_table', 'ping-pong_table'], 'def': 'a table used for playing table tennis', 'name': 'table-tennis_table'}, {'frequency': 'f', 'id': 1070, 'synset': 'table.n.02', 'synonyms': ['table'], 'def': 'a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs', 'name': 'table'}, {'frequency': 'c', 'id': 1071, 'synset': 'table_lamp.n.01', 'synonyms': ['table_lamp'], 'def': 'a lamp that sits on a table', 'name': 'table_lamp'}, {'frequency': 'f', 'id': 1072, 'synset': 'tablecloth.n.01', 'synonyms': ['tablecloth'], 'def': 'a covering spread over a dining table', 'name': 'tablecloth'}, {'frequency': 'r', 'id': 1073, 'synset': 'tachometer.n.01', 'synonyms': ['tachometer'], 'def': 'measuring instrument for indicating speed of rotation', 'name': 'tachometer'}, {'frequency': 'r', 'id': 1074, 'synset': 'taco.n.02', 'synonyms': ['taco'], 'def': 'a small tortilla cupped around a filling', 'name': 'taco'}, {'frequency': 'f', 'id': 1075, 'synset': 'tag.n.02', 'synonyms': ['tag'], 'def': 'a label associated with something for the purpose of identification or information', 'name': 'tag'}, {'frequency': 'f', 'id': 1076, 'synset': 'taillight.n.01', 'synonyms': ['taillight', 'rear_light'], 'def': 'lamp (usually red) mounted at the rear of a motor vehicle', 'name': 'taillight'}, {'frequency': 'r', 'id': 1077, 'synset': 'tambourine.n.01', 'synonyms': ['tambourine'], 'def': 'a shallow drum with a single drumhead and with metallic disks in the sides', 'name': 'tambourine'}, {'frequency': 'r', 'id': 1078, 'synset': 'tank.n.01', 'synonyms': ['army_tank', 'armored_combat_vehicle', 'armoured_combat_vehicle'], 'def': 'an enclosed armored military vehicle; has a cannon and moves on caterpillar treads', 'name': 'army_tank'}, {'frequency': 'c', 'id': 1079, 'synset': 'tank.n.02', 'synonyms': ['tank_(storage_vessel)', 'storage_tank'], 'def': 'a large (usually metallic) vessel for holding gases or liquids', 'name': 'tank_(storage_vessel)'}, {'frequency': 'f', 'id': 1080, 'synset': 'tank_top.n.01', 'synonyms': ['tank_top_(clothing)'], 'def': 'a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening', 'name': 'tank_top_(clothing)'}, {'frequency': 'c', 'id': 1081, 'synset': 'tape.n.01', 'synonyms': ['tape_(sticky_cloth_or_paper)'], 'def': 'a long thin piece of cloth or paper as used for binding or fastening', 'name': 'tape_(sticky_cloth_or_paper)'}, {'frequency': 'c', 'id': 1082, 'synset': 'tape.n.04', 'synonyms': ['tape_measure', 'measuring_tape'], 'def': 'measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths', 'name': 'tape_measure'}, {'frequency': 'c', 'id': 1083, 'synset': 'tapestry.n.02', 'synonyms': ['tapestry'], 'def': 'a heavy textile with a woven design; used for curtains and upholstery', 'name': 'tapestry'}, {'frequency': 'f', 'id': 1084, 'synset': 'tarpaulin.n.01', 'synonyms': ['tarp'], 'def': 'waterproofed canvas', 'name': 'tarp'}, {'frequency': 'c', 'id': 1085, 'synset': 'tartan.n.01', 'synonyms': ['tartan', 'plaid'], 'def': 'a cloth having a 
crisscross design', 'name': 'tartan'}, {'frequency': 'c', 'id': 1086, 'synset': 'tassel.n.01', 'synonyms': ['tassel'], 'def': 'adornment consisting of a bunch of cords fastened at one end', 'name': 'tassel'}, {'frequency': 'r', 'id': 1087, 'synset': 'tea_bag.n.01', 'synonyms': ['tea_bag'], 'def': 'a measured amount of tea in a bag for an individual serving of tea', 'name': 'tea_bag'}, {'frequency': 'c', 'id': 1088, 'synset': 'teacup.n.02', 'synonyms': ['teacup'], 'def': 'a cup from which tea is drunk', 'name': 'teacup'}, {'frequency': 'c', 'id': 1089, 'synset': 'teakettle.n.01', 'synonyms': ['teakettle'], 'def': 'kettle for boiling water to make tea', 'name': 'teakettle'}, {'frequency': 'c', 'id': 1090, 'synset': 'teapot.n.01', 'synonyms': ['teapot'], 'def': 'pot for brewing tea; usually has a spout and handle', 'name': 'teapot'}, {'frequency': 'f', 'id': 1091, 'synset': 'teddy.n.01', 'synonyms': ['teddy_bear'], 'def': "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", 'name': 'teddy_bear'}, {'frequency': 'f', 'id': 1092, 'synset': 'telephone.n.01', 'synonyms': ['telephone', 'phone', 'telephone_set'], 'def': 'electronic device for communicating by voice over long distances', 'name': 'telephone'}, {'frequency': 'c', 'id': 1093, 'synset': 'telephone_booth.n.01', 'synonyms': ['telephone_booth', 'phone_booth', 'call_box', 'telephone_box', 'telephone_kiosk'], 'def': 'booth for using a telephone', 'name': 'telephone_booth'}, {'frequency': 'f', 'id': 1094, 'synset': 'telephone_pole.n.01', 'synonyms': ['telephone_pole', 'telegraph_pole', 'telegraph_post'], 'def': 'tall pole supporting telephone wires', 'name': 'telephone_pole'}, {'frequency': 'r', 'id': 1095, 'synset': 'telephoto_lens.n.01', 'synonyms': ['telephoto_lens', 'zoom_lens'], 'def': 'a camera lens that magnifies the image', 'name': 'telephoto_lens'}, {'frequency': 'c', 'id': 1096, 'synset': 'television_camera.n.01', 'synonyms': ['television_camera', 'tv_camera'], 'def': 'television equipment for capturing and recording video', 'name': 'television_camera'}, {'frequency': 'f', 'id': 1097, 'synset': 'television_receiver.n.01', 'synonyms': ['television_set', 'tv', 'tv_set'], 'def': 'an electronic device that receives television signals and displays them on a screen', 'name': 'television_set'}, {'frequency': 'f', 'id': 1098, 'synset': 'tennis_ball.n.01', 'synonyms': ['tennis_ball'], 'def': 'ball about the size of a fist used in playing tennis', 'name': 'tennis_ball'}, {'frequency': 'f', 'id': 1099, 'synset': 'tennis_racket.n.01', 'synonyms': ['tennis_racket'], 'def': 'a racket used to play tennis', 'name': 'tennis_racket'}, {'frequency': 'r', 'id': 1100, 'synset': 'tequila.n.01', 'synonyms': ['tequila'], 'def': 'Mexican liquor made from fermented juices of an agave plant', 'name': 'tequila'}, {'frequency': 'c', 'id': 1101, 'synset': 'thermometer.n.01', 'synonyms': ['thermometer'], 'def': 'measuring instrument for measuring temperature', 'name': 'thermometer'}, {'frequency': 'c', 'id': 1102, 'synset': 'thermos.n.01', 'synonyms': ['thermos_bottle'], 'def': 'vacuum flask that preserves temperature of hot or cold drinks', 'name': 'thermos_bottle'}, {'frequency': 'c', 'id': 1103, 'synset': 'thermostat.n.01', 'synonyms': ['thermostat'], 'def': 'a regulator for automatically regulating temperature by starting or stopping the supply of heat', 'name': 'thermostat'}, {'frequency': 'r', 'id': 1104, 'synset': 'thimble.n.02', 'synonyms': ['thimble'], 'def': 'a small metal cap to protect the finger while sewing; 
can be used as a small container', 'name': 'thimble'}, {'frequency': 'c', 'id': 1105, 'synset': 'thread.n.01', 'synonyms': ['thread', 'yarn'], 'def': 'a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) used in sewing and weaving', 'name': 'thread'}, {'frequency': 'c', 'id': 1106, 'synset': 'thumbtack.n.01', 'synonyms': ['thumbtack', 'drawing_pin', 'pushpin'], 'def': 'a tack for attaching papers to a bulletin board or drawing board', 'name': 'thumbtack'}, {'frequency': 'c', 'id': 1107, 'synset': 'tiara.n.01', 'synonyms': ['tiara'], 'def': 'a jeweled headdress worn by women on formal occasions', 'name': 'tiara'}, {'frequency': 'c', 'id': 1108, 'synset': 'tiger.n.02', 'synonyms': ['tiger'], 'def': 'large feline of forests in most of Asia having a tawny coat with black stripes', 'name': 'tiger'}, {'frequency': 'c', 'id': 1109, 'synset': 'tights.n.01', 'synonyms': ['tights_(clothing)', 'leotards'], 'def': 'skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls', 'name': 'tights_(clothing)'}, {'frequency': 'c', 'id': 1110, 'synset': 'timer.n.01', 'synonyms': ['timer', 'stopwatch'], 'def': 'a timepiece that measures a time interval and signals its end', 'name': 'timer'}, {'frequency': 'f', 'id': 1111, 'synset': 'tinfoil.n.01', 'synonyms': ['tinfoil'], 'def': 'foil made of tin or an alloy of tin and lead', 'name': 'tinfoil'}, {'frequency': 'r', 'id': 1112, 'synset': 'tinsel.n.01', 'synonyms': ['tinsel'], 'def': 'a showy decoration that is basically valueless', 'name': 'tinsel'}, {'frequency': 'f', 'id': 1113, 'synset': 'tissue.n.02', 'synonyms': ['tissue_paper'], 'def': 'a soft thin (usually translucent) paper', 'name': 'tissue_paper'}, {'frequency': 'c', 'id': 1114, 'synset': 'toast.n.01', 'synonyms': ['toast_(food)'], 'def': 'slice of bread that has been toasted', 'name': 'toast_(food)'}, {'frequency': 'f', 'id': 1115, 'synset': 'toaster.n.02', 'synonyms': ['toaster'], 'def': 'a kitchen appliance (usually electric) for toasting bread', 'name': 'toaster'}, {'frequency': 'c', 'id': 1116, 'synset': 'toaster_oven.n.01', 'synonyms': ['toaster_oven'], 'def': 'kitchen appliance consisting of a small electric oven for toasting or warming food', 'name': 'toaster_oven'}, {'frequency': 'f', 'id': 1117, 'synset': 'toilet.n.02', 'synonyms': ['toilet'], 'def': 'a plumbing fixture for defecation and urination', 'name': 'toilet'}, {'frequency': 'f', 'id': 1118, 'synset': 'toilet_tissue.n.01', 'synonyms': ['toilet_tissue', 'toilet_paper', 'bathroom_tissue'], 'def': 'a soft thin absorbent paper for use in toilets', 'name': 'toilet_tissue'}, {'frequency': 'f', 'id': 1119, 'synset': 'tomato.n.01', 'synonyms': ['tomato'], 'def': 'mildly acid red or yellow pulpy fruit eaten as a vegetable', 'name': 'tomato'}, {'frequency': 'c', 'id': 1120, 'synset': 'tongs.n.01', 'synonyms': ['tongs'], 'def': 'any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below', 'name': 'tongs'}, {'frequency': 'c', 'id': 1121, 'synset': 'toolbox.n.01', 'synonyms': ['toolbox'], 'def': 'a box or chest or cabinet for holding hand tools', 'name': 'toolbox'}, {'frequency': 'f', 'id': 1122, 'synset': 'toothbrush.n.01', 'synonyms': ['toothbrush'], 'def': 'small brush; has long handle; used to clean teeth', 'name': 'toothbrush'}, {'frequency': 'f', 'id': 1123, 'synset': 'toothpaste.n.01', 'synonyms': ['toothpaste'], 'def': 'a dentifrice in the form of a paste', 'name': 'toothpaste'}, 
{'frequency': 'c', 'id': 1124, 'synset': 'toothpick.n.01', 'synonyms': ['toothpick'], 'def': 'pick consisting of a small strip of wood or plastic; used to pick food from between the teeth', 'name': 'toothpick'}, {'frequency': 'c', 'id': 1125, 'synset': 'top.n.09', 'synonyms': ['cover'], 'def': 'covering for a hole (especially a hole in the top of a container)', 'name': 'cover'}, {'frequency': 'c', 'id': 1126, 'synset': 'tortilla.n.01', 'synonyms': ['tortilla'], 'def': 'thin unleavened pancake made from cornmeal or wheat flour', 'name': 'tortilla'}, {'frequency': 'c', 'id': 1127, 'synset': 'tow_truck.n.01', 'synonyms': ['tow_truck'], 'def': 'a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)', 'name': 'tow_truck'}, {'frequency': 'f', 'id': 1128, 'synset': 'towel.n.01', 'synonyms': ['towel'], 'def': 'a rectangular piece of absorbent cloth (or paper) for drying or wiping', 'name': 'towel'}, {'frequency': 'f', 'id': 1129, 'synset': 'towel_rack.n.01', 'synonyms': ['towel_rack', 'towel_rail', 'towel_bar'], 'def': 'a rack consisting of one or more bars on which towels can be hung', 'name': 'towel_rack'}, {'frequency': 'f', 'id': 1130, 'synset': 'toy.n.03', 'synonyms': ['toy'], 'def': 'a device regarded as providing amusement', 'name': 'toy'}, {'frequency': 'c', 'id': 1131, 'synset': 'tractor.n.01', 'synonyms': ['tractor_(farm_equipment)'], 'def': 'a wheeled vehicle with large wheels; used in farming and other applications', 'name': 'tractor_(farm_equipment)'}, {'frequency': 'f', 'id': 1132, 'synset': 'traffic_light.n.01', 'synonyms': ['traffic_light'], 'def': 'a device to control vehicle traffic often consisting of three or more lights', 'name': 'traffic_light'}, {'frequency': 'r', 'id': 1133, 'synset': 'trail_bike.n.01', 'synonyms': ['dirt_bike'], 'def': 'a lightweight motorcycle equipped with rugged tires and suspension for off-road use', 'name': 'dirt_bike'}, {'frequency': 'c', 'id': 1134, 'synset': 'trailer_truck.n.01', 'synonyms': ['trailer_truck', 'tractor_trailer', 'trucking_rig', 'articulated_lorry', 'semi_truck'], 'def': 'a truck consisting of a tractor and trailer together', 'name': 'trailer_truck'}, {'frequency': 'f', 'id': 1135, 'synset': 'train.n.01', 'synonyms': ['train_(railroad_vehicle)', 'railroad_train'], 'def': 'public or private transport provided by a line of railway cars coupled together and drawn by a locomotive', 'name': 'train_(railroad_vehicle)'}, {'frequency': 'r', 'id': 1136, 'synset': 'trampoline.n.01', 'synonyms': ['trampoline'], 'def': 'gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame', 'name': 'trampoline'}, {'frequency': 'f', 'id': 1137, 'synset': 'tray.n.01', 'synonyms': ['tray'], 'def': 'an open receptacle for holding or displaying or serving articles or food', 'name': 'tray'}, {'frequency': 'r', 'id': 1138, 'synset': 'tree_house.n.01', 'synonyms': ['tree_house'], 'def': '(NOT A TREE) a PLAYHOUSE built in the branches of a tree', 'name': 'tree_house'}, {'frequency': 'r', 'id': 1139, 'synset': 'trench_coat.n.01', 'synonyms': ['trench_coat'], 'def': 'a military style raincoat; belted with deep pockets', 'name': 'trench_coat'}, {'frequency': 'r', 'id': 1140, 'synset': 'triangle.n.05', 'synonyms': ['triangle_(musical_instrument)'], 'def': 'a percussion instrument consisting of a metal bar bent in the shape of an open triangle', 'name': 'triangle_(musical_instrument)'}, {'frequency': 'r', 'id': 1141, 'synset': 'tricycle.n.01', 'synonyms': ['tricycle'], 'def': 'a vehicle with three 
wheels that is moved by foot pedals', 'name': 'tricycle'}, {'frequency': 'c', 'id': 1142, 'synset': 'tripod.n.01', 'synonyms': ['tripod'], 'def': 'a three-legged rack used for support', 'name': 'tripod'}, {'frequency': 'f', 'id': 1143, 'synset': 'trouser.n.01', 'synonyms': ['trousers', 'pants_(clothing)'], 'def': 'a garment extending from the waist to the knee or ankle, covering each leg separately', 'name': 'trousers'}, {'frequency': 'f', 'id': 1144, 'synset': 'truck.n.01', 'synonyms': ['truck'], 'def': 'an automotive vehicle suitable for hauling', 'name': 'truck'}, {'frequency': 'r', 'id': 1145, 'synset': 'truffle.n.03', 'synonyms': ['truffle_(chocolate)', 'chocolate_truffle'], 'def': 'creamy chocolate candy', 'name': 'truffle_(chocolate)'}, {'frequency': 'c', 'id': 1146, 'synset': 'trunk.n.02', 'synonyms': ['trunk'], 'def': 'luggage consisting of a large strong case used when traveling or for storage', 'name': 'trunk'}, {'frequency': 'r', 'id': 1147, 'synset': 'tub.n.02', 'synonyms': ['vat'], 'def': 'a large open vessel for holding or storing liquids', 'name': 'vat'}, {'frequency': 'c', 'id': 1148, 'synset': 'turban.n.01', 'synonyms': ['turban'], 'def': 'a traditional headdress consisting of a long scarf wrapped around the head', 'name': 'turban'}, {'frequency': 'r', 'id': 1149, 'synset': 'turkey.n.01', 'synonyms': ['turkey_(bird)'], 'def': 'large gallinaceous bird with fan-shaped tail; widely domesticated for food', 'name': 'turkey_(bird)'}, {'frequency': 'c', 'id': 1150, 'synset': 'turkey.n.04', 'synonyms': ['turkey_(food)'], 'def': 'flesh of large domesticated fowl usually roasted', 'name': 'turkey_(food)'}, {'frequency': 'r', 'id': 1151, 'synset': 'turnip.n.01', 'synonyms': ['turnip'], 'def': 'widely cultivated plant having a large fleshy edible white or yellow root', 'name': 'turnip'}, {'frequency': 'c', 'id': 1152, 'synset': 'turtle.n.02', 'synonyms': ['turtle'], 'def': 'any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming', 'name': 'turtle'}, {'frequency': 'r', 'id': 1153, 'synset': 'turtleneck.n.01', 'synonyms': ['turtleneck_(clothing)', 'polo-neck'], 'def': 'a sweater or jersey with a high close-fitting collar', 'name': 'turtleneck_(clothing)'}, {'frequency': 'r', 'id': 1154, 'synset': 'typewriter.n.01', 'synonyms': ['typewriter'], 'def': 'hand-operated character printer for printing written messages one character at a time', 'name': 'typewriter'}, {'frequency': 'f', 'id': 1155, 'synset': 'umbrella.n.01', 'synonyms': ['umbrella'], 'def': 'a lightweight handheld collapsible canopy', 'name': 'umbrella'}, {'frequency': 'c', 'id': 1156, 'synset': 'underwear.n.01', 'synonyms': ['underwear', 'underclothes', 'underclothing', 'underpants'], 'def': 'undergarment worn next to the skin and under the outer garments', 'name': 'underwear'}, {'frequency': 'r', 'id': 1157, 'synset': 'unicycle.n.01', 'synonyms': ['unicycle'], 'def': 'a vehicle with a single wheel that is driven by pedals', 'name': 'unicycle'}, {'frequency': 'c', 'id': 1158, 'synset': 'urinal.n.01', 'synonyms': ['urinal'], 'def': 'a plumbing fixture (usually attached to the wall) used by men to urinate', 'name': 'urinal'}, {'frequency': 'r', 'id': 1159, 'synset': 'urn.n.01', 'synonyms': ['urn'], 'def': 'a large vase that usually has a pedestal or feet', 'name': 'urn'}, {'frequency': 'c', 'id': 1160, 'synset': 'vacuum.n.04', 'synonyms': ['vacuum_cleaner'], 'def': 'an electrical home appliance that cleans by suction', 'name': 'vacuum_cleaner'}, {'frequency': 'c', 'id': 1161, 'synset': 
'valve.n.03', 'synonyms': ['valve'], 'def': 'control consisting of a mechanical device for controlling the flow of a fluid', 'name': 'valve'}, {'frequency': 'f', 'id': 1162, 'synset': 'vase.n.01', 'synonyms': ['vase'], 'def': 'an open jar of glass or porcelain used as an ornament or to hold flowers', 'name': 'vase'}, {'frequency': 'c', 'id': 1163, 'synset': 'vending_machine.n.01', 'synonyms': ['vending_machine'], 'def': 'a slot machine for selling goods', 'name': 'vending_machine'}, {'frequency': 'f', 'id': 1164, 'synset': 'vent.n.01', 'synonyms': ['vent', 'blowhole', 'air_vent'], 'def': 'a hole for the escape of gas or air', 'name': 'vent'}, {'frequency': 'c', 'id': 1165, 'synset': 'videotape.n.01', 'synonyms': ['videotape'], 'def': 'a video recording made on magnetic tape', 'name': 'videotape'}, {'frequency': 'r', 'id': 1166, 'synset': 'vinegar.n.01', 'synonyms': ['vinegar'], 'def': 'sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative', 'name': 'vinegar'}, {'frequency': 'r', 'id': 1167, 'synset': 'violin.n.01', 'synonyms': ['violin', 'fiddle'], 'def': 'bowed stringed instrument that is the highest member of the violin family', 'name': 'violin'}, {'frequency': 'r', 'id': 1168, 'synset': 'vodka.n.01', 'synonyms': ['vodka'], 'def': 'unaged colorless liquor originating in Russia', 'name': 'vodka'}, {'frequency': 'r', 'id': 1169, 'synset': 'volleyball.n.02', 'synonyms': ['volleyball'], 'def': 'an inflated ball used in playing volleyball', 'name': 'volleyball'}, {'frequency': 'r', 'id': 1170, 'synset': 'vulture.n.01', 'synonyms': ['vulture'], 'def': 'any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion', 'name': 'vulture'}, {'frequency': 'c', 'id': 1171, 'synset': 'waffle.n.01', 'synonyms': ['waffle'], 'def': 'pancake batter baked in a waffle iron', 'name': 'waffle'}, {'frequency': 'r', 'id': 1172, 'synset': 'waffle_iron.n.01', 'synonyms': ['waffle_iron'], 'def': 'a kitchen appliance for baking waffles', 'name': 'waffle_iron'}, {'frequency': 'c', 'id': 1173, 'synset': 'wagon.n.01', 'synonyms': ['wagon'], 'def': 'any of various kinds of wheeled vehicles drawn by an animal or a tractor', 'name': 'wagon'}, {'frequency': 'c', 'id': 1174, 'synset': 'wagon_wheel.n.01', 'synonyms': ['wagon_wheel'], 'def': 'a wheel of a wagon', 'name': 'wagon_wheel'}, {'frequency': 'c', 'id': 1175, 'synset': 'walking_stick.n.01', 'synonyms': ['walking_stick'], 'def': 'a stick carried in the hand for support in walking', 'name': 'walking_stick'}, {'frequency': 'c', 'id': 1176, 'synset': 'wall_clock.n.01', 'synonyms': ['wall_clock'], 'def': 'a clock mounted on a wall', 'name': 'wall_clock'}, {'frequency': 'f', 'id': 1177, 'synset': 'wall_socket.n.01', 'synonyms': ['wall_socket', 'wall_plug', 'electric_outlet', 'electrical_outlet', 'outlet', 'electric_receptacle'], 'def': 'receptacle providing a place in a wiring system where current can be taken to run electrical devices', 'name': 'wall_socket'}, {'frequency': 'c', 'id': 1178, 'synset': 'wallet.n.01', 'synonyms': ['wallet', 'billfold'], 'def': 'a pocket-size case for holding papers and paper money', 'name': 'wallet'}, {'frequency': 'r', 'id': 1179, 'synset': 'walrus.n.01', 'synonyms': ['walrus'], 'def': 'either of two large northern marine mammals having ivory tusks and tough hide over thick blubber', 'name': 'walrus'}, {'frequency': 'r', 'id': 1180, 'synset': 'wardrobe.n.01', 'synonyms': ['wardrobe'], 'def': 'a tall piece of furniture that provides 
storage space for clothes; has a door and rails or hooks for hanging clothes', 'name': 'wardrobe'}, {'frequency': 'r', 'id': 1181, 'synset': 'wasabi.n.02', 'synonyms': ['wasabi'], 'def': 'the thick green root of the wasabi plant that the Japanese use in cooking and that tastes like strong horseradish', 'name': 'wasabi'}, {'frequency': 'c', 'id': 1182, 'synset': 'washer.n.03', 'synonyms': ['automatic_washer', 'washing_machine'], 'def': 'a home appliance for washing clothes and linens automatically', 'name': 'automatic_washer'}, {'frequency': 'f', 'id': 1183, 'synset': 'watch.n.01', 'synonyms': ['watch', 'wristwatch'], 'def': 'a small, portable timepiece', 'name': 'watch'}, {'frequency': 'f', 'id': 1184, 'synset': 'water_bottle.n.01', 'synonyms': ['water_bottle'], 'def': 'a bottle for holding water', 'name': 'water_bottle'}, {'frequency': 'c', 'id': 1185, 'synset': 'water_cooler.n.01', 'synonyms': ['water_cooler'], 'def': 'a device for cooling and dispensing drinking water', 'name': 'water_cooler'}, {'frequency': 'c', 'id': 1186, 'synset': 'water_faucet.n.01', 'synonyms': ['water_faucet', 'water_tap', 'tap_(water_faucet)'], 'def': 'a faucet for drawing water from a pipe or cask', 'name': 'water_faucet'}, {'frequency': 'r', 'id': 1187, 'synset': 'water_filter.n.01', 'synonyms': ['water_filter'], 'def': 'a filter to remove impurities from the water supply', 'name': 'water_filter'}, {'frequency': 'r', 'id': 1188, 'synset': 'water_heater.n.01', 'synonyms': ['water_heater', 'hot-water_heater'], 'def': 'a heater and storage tank to supply heated water', 'name': 'water_heater'}, {'frequency': 'r', 'id': 1189, 'synset': 'water_jug.n.01', 'synonyms': ['water_jug'], 'def': 'a jug that holds water', 'name': 'water_jug'}, {'frequency': 'r', 'id': 1190, 'synset': 'water_pistol.n.01', 'synonyms': ['water_gun', 'squirt_gun'], 'def': 'plaything consisting of a toy pistol that squirts water', 'name': 'water_gun'}, {'frequency': 'c', 'id': 1191, 'synset': 'water_scooter.n.01', 'synonyms': ['water_scooter', 'sea_scooter', 'jet_ski'], 'def': 'a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)', 'name': 'water_scooter'}, {'frequency': 'c', 'id': 1192, 'synset': 'water_ski.n.01', 'synonyms': ['water_ski'], 'def': 'broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)', 'name': 'water_ski'}, {'frequency': 'c', 'id': 1193, 'synset': 'water_tower.n.01', 'synonyms': ['water_tower'], 'def': 'a large reservoir for water', 'name': 'water_tower'}, {'frequency': 'c', 'id': 1194, 'synset': 'watering_can.n.01', 'synonyms': ['watering_can'], 'def': 'a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants', 'name': 'watering_can'}, {'frequency': 'c', 'id': 1195, 'synset': 'watermelon.n.02', 'synonyms': ['watermelon'], 'def': 'large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp', 'name': 'watermelon'}, {'frequency': 'f', 'id': 1196, 'synset': 'weathervane.n.01', 'synonyms': ['weathervane', 'vane_(weathervane)', 'wind_vane'], 'def': 'mechanical device attached to an elevated structure; rotates freely to show the direction of the wind', 'name': 'weathervane'}, {'frequency': 'c', 'id': 1197, 'synset': 'webcam.n.01', 'synonyms': ['webcam'], 'def': 'a digital camera designed to take digital photographs and transmit them over the internet', 'name': 'webcam'}, {'frequency': 'c', 'id': 1198, 'synset': 'wedding_cake.n.01', 'synonyms': ['wedding_cake', 'bridecake'], 'def': 'a rich cake with two or more 
tiers and covered with frosting and decorations; served at a wedding reception', 'name': 'wedding_cake'}, {'frequency': 'c', 'id': 1199, 'synset': 'wedding_ring.n.01', 'synonyms': ['wedding_ring', 'wedding_band'], 'def': 'a ring given to the bride and/or groom at the wedding', 'name': 'wedding_ring'}, {'frequency': 'f', 'id': 1200, 'synset': 'wet_suit.n.01', 'synonyms': ['wet_suit'], 'def': 'a close-fitting garment made of a permeable material; worn in cold water to retain body heat', 'name': 'wet_suit'}, {'frequency': 'f', 'id': 1201, 'synset': 'wheel.n.01', 'synonyms': ['wheel'], 'def': 'a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle', 'name': 'wheel'}, {'frequency': 'c', 'id': 1202, 'synset': 'wheelchair.n.01', 'synonyms': ['wheelchair'], 'def': 'a movable chair mounted on large wheels', 'name': 'wheelchair'}, {'frequency': 'c', 'id': 1203, 'synset': 'whipped_cream.n.01', 'synonyms': ['whipped_cream'], 'def': 'cream that has been beaten until light and fluffy', 'name': 'whipped_cream'}, {'frequency': 'r', 'id': 1204, 'synset': 'whiskey.n.01', 'synonyms': ['whiskey'], 'def': 'a liquor made from fermented mash of grain', 'name': 'whiskey'}, {'frequency': 'r', 'id': 1205, 'synset': 'whistle.n.03', 'synonyms': ['whistle'], 'def': 'a small wind instrument that produces a whistling sound by blowing into it', 'name': 'whistle'}, {'frequency': 'r', 'id': 1206, 'synset': 'wick.n.02', 'synonyms': ['wick'], 'def': 'a loosely woven cord in a candle or oil lamp that is lit on fire', 'name': 'wick'}, {'frequency': 'c', 'id': 1207, 'synset': 'wig.n.01', 'synonyms': ['wig'], 'def': 'hairpiece covering the head and made of real or synthetic hair', 'name': 'wig'}, {'frequency': 'c', 'id': 1208, 'synset': 'wind_chime.n.01', 'synonyms': ['wind_chime'], 'def': 'a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle', 'name': 'wind_chime'}, {'frequency': 'c', 'id': 1209, 'synset': 'windmill.n.01', 'synonyms': ['windmill'], 'def': 'a mill that is powered by the wind', 'name': 'windmill'}, {'frequency': 'c', 'id': 1210, 'synset': 'window_box.n.01', 'synonyms': ['window_box_(for_plants)'], 'def': 'a container for growing plants on a windowsill', 'name': 'window_box_(for_plants)'}, {'frequency': 'f', 'id': 1211, 'synset': 'windshield_wiper.n.01', 'synonyms': ['windshield_wiper', 'windscreen_wiper', 'wiper_(for_windshield/screen)'], 'def': 'a mechanical device that cleans the windshield', 'name': 'windshield_wiper'}, {'frequency': 'c', 'id': 1212, 'synset': 'windsock.n.01', 'synonyms': ['windsock', 'air_sock', 'air-sleeve', 'wind_sleeve', 'wind_cone'], 'def': 'a truncated cloth cone mounted on a mast/pole; shows wind direction', 'name': 'windsock'}, {'frequency': 'f', 'id': 1213, 'synset': 'wine_bottle.n.01', 'synonyms': ['wine_bottle'], 'def': 'a bottle for holding wine', 'name': 'wine_bottle'}, {'frequency': 'r', 'id': 1214, 'synset': 'wine_bucket.n.01', 'synonyms': ['wine_bucket', 'wine_cooler'], 'def': 'a bucket of ice used to chill a bottle of wine', 'name': 'wine_bucket'}, {'frequency': 'f', 'id': 1215, 'synset': 'wineglass.n.01', 'synonyms': ['wineglass'], 'def': 'a glass that has a stem and in which wine is served', 'name': 'wineglass'}, {'frequency': 'r', 'id': 1216, 'synset': 'wing_chair.n.01', 'synonyms': ['wing_chair'], 'def': 'easy chair having wings on each side of a high back', 'name': 'wing_chair'}, {'frequency': 'c', 'id': 1217, 'synset': 'winker.n.02', 'synonyms': ['blinder_(for_horses)'], 
'def': 'blinds that prevent a horse from seeing something on either side', 'name': 'blinder_(for_horses)'}, {'frequency': 'c', 'id': 1218, 'synset': 'wok.n.01', 'synonyms': ['wok'], 'def': 'pan with a convex bottom; used for frying in Chinese cooking', 'name': 'wok'}, {'frequency': 'r', 'id': 1219, 'synset': 'wolf.n.01', 'synonyms': ['wolf'], 'def': 'a wild carnivorous mammal of the dog family, living and hunting in packs', 'name': 'wolf'}, {'frequency': 'c', 'id': 1220, 'synset': 'wooden_spoon.n.02', 'synonyms': ['wooden_spoon'], 'def': 'a spoon made of wood', 'name': 'wooden_spoon'}, {'frequency': 'c', 'id': 1221, 'synset': 'wreath.n.01', 'synonyms': ['wreath'], 'def': 'an arrangement of flowers, leaves, or stems fastened in a ring', 'name': 'wreath'}, {'frequency': 'c', 'id': 1222, 'synset': 'wrench.n.03', 'synonyms': ['wrench', 'spanner'], 'def': 'a hand tool that is used to hold or twist a nut or bolt', 'name': 'wrench'}, {'frequency': 'c', 'id': 1223, 'synset': 'wristband.n.01', 'synonyms': ['wristband'], 'def': 'band consisting of a part of a sleeve that covers the wrist', 'name': 'wristband'}, {'frequency': 'f', 'id': 1224, 'synset': 'wristlet.n.01', 'synonyms': ['wristlet', 'wrist_band'], 'def': 'a band or bracelet worn around the wrist', 'name': 'wristlet'}, {'frequency': 'r', 'id': 1225, 'synset': 'yacht.n.01', 'synonyms': ['yacht'], 'def': 'an expensive vessel propelled by sail or power and used for cruising or racing', 'name': 'yacht'}, {'frequency': 'r', 'id': 1226, 'synset': 'yak.n.02', 'synonyms': ['yak'], 'def': 'large long-haired wild ox of Tibet often domesticated', 'name': 'yak'}, {'frequency': 'c', 'id': 1227, 'synset': 'yogurt.n.01', 'synonyms': ['yogurt', 'yoghurt', 'yoghourt'], 'def': 'a custard-like food made from curdled milk', 'name': 'yogurt'}, {'frequency': 'r', 'id': 1228, 'synset': 'yoke.n.07', 'synonyms': ['yoke_(animal_equipment)'], 'def': 'gear joining two animals at the neck; NOT egg yolk', 'name': 'yoke_(animal_equipment)'}, {'frequency': 'f', 'id': 1229, 'synset': 'zebra.n.01', 'synonyms': ['zebra'], 'def': 'any of several fleet black-and-white striped African equines', 'name': 'zebra'}, {'frequency': 'c', 'id': 1230, 'synset': 'zucchini.n.02', 'synonyms': ['zucchini', 'courgette'], 'def': 'small cucumber-shaped vegetable marrow; typically dark green', 'name': 'zucchini'}] # noqa
-# fmt: on
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
deleted file mode 100755
index 7374e696..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/lvis_v1_categories.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# Autogen with
-# with open("lvis_v1_val.json", "r") as f:
-#     a = json.load(f)
-#     c = a["categories"]
-#     for x in c:
-#         del x["image_count"]
-#         del x["instance_count"]
-# LVIS_CATEGORIES = repr(c) + " # noqa"
-# with open("/tmp/lvis_categories.py", "wt") as f:
-#     f.write(f"LVIS_CATEGORIES = {LVIS_CATEGORIES}")
-# Then paste the contents of that file below
-
-# fmt: off
-LVIS_CATEGORIES = [{'frequency': 'c', 'synset': 'aerosol.n.02', 'synonyms': ['aerosol_can', 'spray_can'], 'id': 1, 'def': 'a dispenser that holds a substance under pressure', 'name': 'aerosol_can'}, {'frequency': 'f', 'synset': 'air_conditioner.n.01', 'synonyms': ['air_conditioner'], 'id': 2, 'def': 'a machine that keeps air cool and dry', 'name': 'air_conditioner'}, {'frequency': 'f', 'synset': 'airplane.n.01', 'synonyms': ['airplane', 'aeroplane'], 'id': 3, 'def': 'an aircraft that has a fixed wing and is powered by propellers or jets', 'name': 'airplane'}, {'frequency': 'f', 'synset': 'alarm_clock.n.01', 'synonyms': ['alarm_clock'], 'id': 4, 'def': 'a clock that wakes a sleeper at some preset time', 'name': 'alarm_clock'}, {'frequency': 'c', 'synset': 'alcohol.n.01', 'synonyms': ['alcohol', 'alcoholic_beverage'], 'id': 5, 'def': 'a liquor or brew containing alcohol as the active agent', 'name': 'alcohol'}, {'frequency': 'c', 'synset': 'alligator.n.02', 'synonyms': ['alligator', 'gator'], 'id': 6, 'def': 'amphibious reptiles related to crocodiles but with shorter broader snouts', 'name': 'alligator'}, {'frequency': 'c', 'synset': 'almond.n.02', 'synonyms': ['almond'], 'id': 7, 'def': 'oval-shaped edible seed of the almond tree', 'name': 'almond'}, {'frequency': 'c', 'synset': 'ambulance.n.01', 'synonyms': ['ambulance'], 'id': 8, 'def': 'a vehicle that takes people to and from hospitals', 'name': 'ambulance'}, {'frequency': 'c', 'synset': 'amplifier.n.01', 'synonyms': ['amplifier'], 'id': 9, 'def': 'electronic equipment that increases strength of signals', 'name': 'amplifier'}, {'frequency': 'c', 'synset': 'anklet.n.03', 'synonyms': ['anklet', 'ankle_bracelet'], 'id': 10, 'def': 'an ornament worn around the ankle', 'name': 'anklet'}, {'frequency': 'f', 'synset': 'antenna.n.01', 'synonyms': ['antenna', 'aerial', 'transmitting_aerial'], 'id': 11, 'def': 'an electrical device that sends or receives radio or television signals', 'name': 'antenna'}, {'frequency': 'f', 'synset': 'apple.n.01', 'synonyms': ['apple'], 'id': 12, 'def': 'fruit with red or yellow or green skin and sweet to tart crisp whitish flesh', 'name': 'apple'}, {'frequency': 'r', 'synset': 'applesauce.n.01', 'synonyms': ['applesauce'], 'id': 13, 'def': 'puree of stewed apples usually sweetened and spiced', 'name': 'applesauce'}, {'frequency': 'r', 'synset': 'apricot.n.02', 'synonyms': ['apricot'], 'id': 14, 'def': 'downy yellow to rosy-colored fruit resembling a small peach', 'name': 'apricot'}, {'frequency': 'f', 'synset': 'apron.n.01', 'synonyms': ['apron'], 'id': 15, 'def': 'a garment of cloth that is tied about the waist and worn to protect clothing', 'name': 'apron'}, {'frequency': 'c', 'synset': 'aquarium.n.01', 'synonyms': ['aquarium', 'fish_tank'], 'id': 16, 'def': 'a tank/pool/bowl filled with water for keeping live fish and underwater animals', 'name': 'aquarium'}, {'frequency': 'r', 'synset': 'arctic.n.02', 'synonyms': ['arctic_(type_of_shoe)', 'galosh', 'golosh', 'rubber_(type_of_shoe)', 'gumshoe'], 'id': 17, 'def': 'a waterproof overshoe that protects shoes from water or snow', 'name': 'arctic_(type_of_shoe)'}, {'frequency': 'c', 'synset':
'armband.n.02', 'synonyms': ['armband'], 'id': 18, 'def': 'a band worn around the upper arm', 'name': 'armband'}, {'frequency': 'f', 'synset': 'armchair.n.01', 'synonyms': ['armchair'], 'id': 19, 'def': 'chair with a support on each side for arms', 'name': 'armchair'}, {'frequency': 'r', 'synset': 'armoire.n.01', 'synonyms': ['armoire'], 'id': 20, 'def': 'a large wardrobe or cabinet', 'name': 'armoire'}, {'frequency': 'r', 'synset': 'armor.n.01', 'synonyms': ['armor', 'armour'], 'id': 21, 'def': 'protective covering made of metal and used in combat', 'name': 'armor'}, {'frequency': 'c', 'synset': 'artichoke.n.02', 'synonyms': ['artichoke'], 'id': 22, 'def': 'a thistlelike flower head with edible fleshy leaves and heart', 'name': 'artichoke'}, {'frequency': 'f', 'synset': 'ashcan.n.01', 'synonyms': ['trash_can', 'garbage_can', 'wastebin', 'dustbin', 'trash_barrel', 'trash_bin'], 'id': 23, 'def': 'a bin that holds rubbish until it is collected', 'name': 'trash_can'}, {'frequency': 'c', 'synset': 'ashtray.n.01', 'synonyms': ['ashtray'], 'id': 24, 'def': "a receptacle for the ash from smokers' cigars or cigarettes", 'name': 'ashtray'}, {'frequency': 'c', 'synset': 'asparagus.n.02', 'synonyms': ['asparagus'], 'id': 25, 'def': 'edible young shoots of the asparagus plant', 'name': 'asparagus'}, {'frequency': 'c', 'synset': 'atomizer.n.01', 'synonyms': ['atomizer', 'atomiser', 'spray', 'sprayer', 'nebulizer', 'nebuliser'], 'id': 26, 'def': 'a dispenser that turns a liquid (such as perfume) into a fine mist', 'name': 'atomizer'}, {'frequency': 'f', 'synset': 'avocado.n.01', 'synonyms': ['avocado'], 'id': 27, 'def': 'a pear-shaped fruit with green or blackish skin and rich yellowish pulp enclosing a single large seed', 'name': 'avocado'}, {'frequency': 'c', 'synset': 'award.n.02', 'synonyms': ['award', 'accolade'], 'id': 28, 'def': 'a tangible symbol signifying approval or distinction', 'name': 'award'}, {'frequency': 'f', 'synset': 'awning.n.01', 'synonyms': ['awning'], 'id': 29, 'def': 'a canopy made of canvas to shelter people or things from rain or sun', 'name': 'awning'}, {'frequency': 'r', 'synset': 'ax.n.01', 'synonyms': ['ax', 'axe'], 'id': 30, 'def': 'an edge tool with a heavy bladed head mounted across a handle', 'name': 'ax'}, {'frequency': 'r', 'synset': 'baboon.n.01', 'synonyms': ['baboon'], 'id': 31, 'def': 'large terrestrial monkeys having doglike muzzles', 'name': 'baboon'}, {'frequency': 'f', 'synset': 'baby_buggy.n.01', 'synonyms': ['baby_buggy', 'baby_carriage', 'perambulator', 'pram', 'stroller'], 'id': 32, 'def': 'a small vehicle with four wheels in which a baby or child is pushed around', 'name': 'baby_buggy'}, {'frequency': 'c', 'synset': 'backboard.n.01', 'synonyms': ['basketball_backboard'], 'id': 33, 'def': 'a raised vertical board with basket attached; used to play basketball', 'name': 'basketball_backboard'}, {'frequency': 'f', 'synset': 'backpack.n.01', 'synonyms': ['backpack', 'knapsack', 'packsack', 'rucksack', 'haversack'], 'id': 34, 'def': 'a bag carried by a strap on your back or shoulder', 'name': 'backpack'}, {'frequency': 'f', 'synset': 'bag.n.04', 'synonyms': ['handbag', 'purse', 'pocketbook'], 'id': 35, 'def': 'a container used for carrying money and small personal items or accessories', 'name': 'handbag'}, {'frequency': 'f', 'synset': 'bag.n.06', 'synonyms': ['suitcase', 'baggage', 'luggage'], 'id': 36, 'def': 'cases used to carry belongings when traveling', 'name': 'suitcase'}, {'frequency': 'c', 'synset': 'bagel.n.01', 'synonyms': ['bagel', 'beigel'], 'id': 
37, 'def': 'glazed yeast-raised doughnut-shaped roll with hard crust', 'name': 'bagel'}, {'frequency': 'r', 'synset': 'bagpipe.n.01', 'synonyms': ['bagpipe'], 'id': 38, 'def': 'a tubular wind instrument; the player blows air into a bag and squeezes it out', 'name': 'bagpipe'}, {'frequency': 'r', 'synset': 'baguet.n.01', 'synonyms': ['baguet', 'baguette'], 'id': 39, 'def': 'narrow French stick loaf', 'name': 'baguet'}, {'frequency': 'r', 'synset': 'bait.n.02', 'synonyms': ['bait', 'lure'], 'id': 40, 'def': 'something used to lure fish or other animals into danger so they can be trapped or killed', 'name': 'bait'}, {'frequency': 'f', 'synset': 'ball.n.06', 'synonyms': ['ball'], 'id': 41, 'def': 'a spherical object used as a plaything', 'name': 'ball'}, {'frequency': 'r', 'synset': 'ballet_skirt.n.01', 'synonyms': ['ballet_skirt', 'tutu'], 'id': 42, 'def': 'very short skirt worn by ballerinas', 'name': 'ballet_skirt'}, {'frequency': 'f', 'synset': 'balloon.n.01', 'synonyms': ['balloon'], 'id': 43, 'def': 'large tough nonrigid bag filled with gas or heated air', 'name': 'balloon'}, {'frequency': 'c', 'synset': 'bamboo.n.02', 'synonyms': ['bamboo'], 'id': 44, 'def': 'woody tropical grass having hollow woody stems', 'name': 'bamboo'}, {'frequency': 'f', 'synset': 'banana.n.02', 'synonyms': ['banana'], 'id': 45, 'def': 'elongated crescent-shaped yellow fruit with soft sweet flesh', 'name': 'banana'}, {'frequency': 'c', 'synset': 'band_aid.n.01', 'synonyms': ['Band_Aid'], 'id': 46, 'def': 'trade name for an adhesive bandage to cover small cuts or blisters', 'name': 'Band_Aid'}, {'frequency': 'c', 'synset': 'bandage.n.01', 'synonyms': ['bandage'], 'id': 47, 'def': 'a piece of soft material that covers and protects an injured part of the body', 'name': 'bandage'}, {'frequency': 'f', 'synset': 'bandanna.n.01', 'synonyms': ['bandanna', 'bandana'], 'id': 48, 'def': 'large and brightly colored handkerchief; often used as a neckerchief', 'name': 'bandanna'}, {'frequency': 'r', 'synset': 'banjo.n.01', 'synonyms': ['banjo'], 'id': 49, 'def': 'a stringed instrument of the guitar family with a long neck and circular body', 'name': 'banjo'}, {'frequency': 'f', 'synset': 'banner.n.01', 'synonyms': ['banner', 'streamer'], 'id': 50, 'def': 'long strip of cloth or paper used for decoration or advertising', 'name': 'banner'}, {'frequency': 'r', 'synset': 'barbell.n.01', 'synonyms': ['barbell'], 'id': 51, 'def': 'a bar to which heavy discs are attached at each end; used in weightlifting', 'name': 'barbell'}, {'frequency': 'r', 'synset': 'barge.n.01', 'synonyms': ['barge'], 'id': 52, 'def': 'a flatbottom boat for carrying heavy loads (especially on canals)', 'name': 'barge'}, {'frequency': 'f', 'synset': 'barrel.n.02', 'synonyms': ['barrel', 'cask'], 'id': 53, 'def': 'a cylindrical container that holds liquids', 'name': 'barrel'}, {'frequency': 'c', 'synset': 'barrette.n.01', 'synonyms': ['barrette'], 'id': 54, 'def': "a pin for holding women's hair in place", 'name': 'barrette'}, {'frequency': 'c', 'synset': 'barrow.n.03', 'synonyms': ['barrow', 'garden_cart', 'lawn_cart', 'wheelbarrow'], 'id': 55, 'def': 'a cart for carrying small loads; has handles and one or more wheels', 'name': 'barrow'}, {'frequency': 'f', 'synset': 'base.n.03', 'synonyms': ['baseball_base'], 'id': 56, 'def': 'a place that the runner must touch before scoring', 'name': 'baseball_base'}, {'frequency': 'f', 'synset': 'baseball.n.02', 'synonyms': ['baseball'], 'id': 57, 'def': 'a ball used in playing baseball', 'name': 'baseball'}, {'frequency': 
'f', 'synset': 'baseball_bat.n.01', 'synonyms': ['baseball_bat'], 'id': 58, 'def': 'an implement used in baseball by the batter', 'name': 'baseball_bat'}, {'frequency': 'f', 'synset': 'baseball_cap.n.01', 'synonyms': ['baseball_cap', 'jockey_cap', 'golf_cap'], 'id': 59, 'def': 'a cap with a bill', 'name': 'baseball_cap'}, {'frequency': 'f', 'synset': 'baseball_glove.n.01', 'synonyms': ['baseball_glove', 'baseball_mitt'], 'id': 60, 'def': 'the handwear used by fielders in playing baseball', 'name': 'baseball_glove'}, {'frequency': 'f', 'synset': 'basket.n.01', 'synonyms': ['basket', 'handbasket'], 'id': 61, 'def': 'a container that is usually woven and has handles', 'name': 'basket'}, {'frequency': 'c', 'synset': 'basketball.n.02', 'synonyms': ['basketball'], 'id': 62, 'def': 'an inflated ball used in playing basketball', 'name': 'basketball'}, {'frequency': 'r', 'synset': 'bass_horn.n.01', 'synonyms': ['bass_horn', 'sousaphone', 'tuba'], 'id': 63, 'def': 'the lowest brass wind instrument', 'name': 'bass_horn'}, {'frequency': 'c', 'synset': 'bat.n.01', 'synonyms': ['bat_(animal)'], 'id': 64, 'def': 'nocturnal mouselike mammal with forelimbs modified to form membranous wings', 'name': 'bat_(animal)'}, {'frequency': 'f', 'synset': 'bath_mat.n.01', 'synonyms': ['bath_mat'], 'id': 65, 'def': 'a heavy towel or mat to stand on while drying yourself after a bath', 'name': 'bath_mat'}, {'frequency': 'f', 'synset': 'bath_towel.n.01', 'synonyms': ['bath_towel'], 'id': 66, 'def': 'a large towel; to dry yourself after a bath', 'name': 'bath_towel'}, {'frequency': 'c', 'synset': 'bathrobe.n.01', 'synonyms': ['bathrobe'], 'id': 67, 'def': 'a loose-fitting robe of towelling; worn after a bath or swim', 'name': 'bathrobe'}, {'frequency': 'f', 'synset': 'bathtub.n.01', 'synonyms': ['bathtub', 'bathing_tub'], 'id': 68, 'def': 'a large open container that you fill with water and use to wash the body', 'name': 'bathtub'}, {'frequency': 'r', 'synset': 'batter.n.02', 'synonyms': ['batter_(food)'], 'id': 69, 'def': 'a liquid or semiliquid mixture, as of flour, eggs, and milk, used in cooking', 'name': 'batter_(food)'}, {'frequency': 'c', 'synset': 'battery.n.02', 'synonyms': ['battery'], 'id': 70, 'def': 'a portable device that produces electricity', 'name': 'battery'}, {'frequency': 'r', 'synset': 'beach_ball.n.01', 'synonyms': ['beachball'], 'id': 71, 'def': 'large and light ball; for play at the seaside', 'name': 'beachball'}, {'frequency': 'c', 'synset': 'bead.n.01', 'synonyms': ['bead'], 'id': 72, 'def': 'a small ball with a hole through the middle used for ornamentation, jewellery, etc.', 'name': 'bead'}, {'frequency': 'c', 'synset': 'bean_curd.n.01', 'synonyms': ['bean_curd', 'tofu'], 'id': 73, 'def': 'cheeselike food made of curdled soybean milk', 'name': 'bean_curd'}, {'frequency': 'c', 'synset': 'beanbag.n.01', 'synonyms': ['beanbag'], 'id': 74, 'def': 'a bag filled with dried beans or similar items; used in games or to sit on', 'name': 'beanbag'}, {'frequency': 'f', 'synset': 'beanie.n.01', 'synonyms': ['beanie', 'beany'], 'id': 75, 'def': 'a small skullcap; formerly worn by schoolboys and college freshmen', 'name': 'beanie'}, {'frequency': 'f', 'synset': 'bear.n.01', 'synonyms': ['bear'], 'id': 76, 'def': 'large carnivorous or omnivorous mammals with shaggy coats and claws', 'name': 'bear'}, {'frequency': 'f', 'synset': 'bed.n.01', 'synonyms': ['bed'], 'id': 77, 'def': 'a piece of furniture that provides a place to sleep', 'name': 'bed'}, {'frequency': 'r', 'synset': 'bedpan.n.01', 'synonyms': 
['bedpan'], 'id': 78, 'def': 'a shallow vessel used by a bedridden patient for defecation and urination', 'name': 'bedpan'}, {'frequency': 'f', 'synset': 'bedspread.n.01', 'synonyms': ['bedspread', 'bedcover', 'bed_covering', 'counterpane', 'spread'], 'id': 79, 'def': 'decorative cover for a bed', 'name': 'bedspread'}, {'frequency': 'f', 'synset': 'beef.n.01', 'synonyms': ['cow'], 'id': 80, 'def': 'cattle/cow', 'name': 'cow'}, {'frequency': 'f', 'synset': 'beef.n.02', 'synonyms': ['beef_(food)', 'boeuf_(food)'], 'id': 81, 'def': 'meat from an adult domestic bovine', 'name': 'beef_(food)'}, {'frequency': 'r', 'synset': 'beeper.n.01', 'synonyms': ['beeper', 'pager'], 'id': 82, 'def': 'an device that beeps when the person carrying it is being paged', 'name': 'beeper'}, {'frequency': 'f', 'synset': 'beer_bottle.n.01', 'synonyms': ['beer_bottle'], 'id': 83, 'def': 'a bottle that holds beer', 'name': 'beer_bottle'}, {'frequency': 'c', 'synset': 'beer_can.n.01', 'synonyms': ['beer_can'], 'id': 84, 'def': 'a can that holds beer', 'name': 'beer_can'}, {'frequency': 'r', 'synset': 'beetle.n.01', 'synonyms': ['beetle'], 'id': 85, 'def': 'insect with hard wing covers', 'name': 'beetle'}, {'frequency': 'f', 'synset': 'bell.n.01', 'synonyms': ['bell'], 'id': 86, 'def': 'a hollow device made of metal that makes a ringing sound when struck', 'name': 'bell'}, {'frequency': 'f', 'synset': 'bell_pepper.n.02', 'synonyms': ['bell_pepper', 'capsicum'], 'id': 87, 'def': 'large bell-shaped sweet pepper in green or red or yellow or orange or black varieties', 'name': 'bell_pepper'}, {'frequency': 'f', 'synset': 'belt.n.02', 'synonyms': ['belt'], 'id': 88, 'def': 'a band to tie or buckle around the body (usually at the waist)', 'name': 'belt'}, {'frequency': 'f', 'synset': 'belt_buckle.n.01', 'synonyms': ['belt_buckle'], 'id': 89, 'def': 'the buckle used to fasten a belt', 'name': 'belt_buckle'}, {'frequency': 'f', 'synset': 'bench.n.01', 'synonyms': ['bench'], 'id': 90, 'def': 'a long seat for more than one person', 'name': 'bench'}, {'frequency': 'c', 'synset': 'beret.n.01', 'synonyms': ['beret'], 'id': 91, 'def': 'a cap with no brim or bill; made of soft cloth', 'name': 'beret'}, {'frequency': 'c', 'synset': 'bib.n.02', 'synonyms': ['bib'], 'id': 92, 'def': 'a napkin tied under the chin of a child while eating', 'name': 'bib'}, {'frequency': 'r', 'synset': 'bible.n.01', 'synonyms': ['Bible'], 'id': 93, 'def': 'the sacred writings of the Christian religions', 'name': 'Bible'}, {'frequency': 'f', 'synset': 'bicycle.n.01', 'synonyms': ['bicycle', 'bike_(bicycle)'], 'id': 94, 'def': 'a wheeled vehicle that has two wheels and is moved by foot pedals', 'name': 'bicycle'}, {'frequency': 'f', 'synset': 'bill.n.09', 'synonyms': ['visor', 'vizor'], 'id': 95, 'def': 'a brim that projects to the front to shade the eyes', 'name': 'visor'}, {'frequency': 'f', 'synset': 'billboard.n.01', 'synonyms': ['billboard'], 'id': 96, 'def': 'large outdoor signboard', 'name': 'billboard'}, {'frequency': 'c', 'synset': 'binder.n.03', 'synonyms': ['binder', 'ring-binder'], 'id': 97, 'def': 'holds loose papers or magazines', 'name': 'binder'}, {'frequency': 'c', 'synset': 'binoculars.n.01', 'synonyms': ['binoculars', 'field_glasses', 'opera_glasses'], 'id': 98, 'def': 'an optical instrument designed for simultaneous use by both eyes', 'name': 'binoculars'}, {'frequency': 'f', 'synset': 'bird.n.01', 'synonyms': ['bird'], 'id': 99, 'def': 'animal characterized by feathers and wings', 'name': 'bird'}, {'frequency': 'c', 'synset': 
'bird_feeder.n.01', 'synonyms': ['birdfeeder'], 'id': 100, 'def': 'an outdoor device that supplies food for wild birds', 'name': 'birdfeeder'}, {'frequency': 'c', 'synset': 'birdbath.n.01', 'synonyms': ['birdbath'], 'id': 101, 'def': 'an ornamental basin (usually in a garden) for birds to bathe in', 'name': 'birdbath'}, {'frequency': 'c', 'synset': 'birdcage.n.01', 'synonyms': ['birdcage'], 'id': 102, 'def': 'a cage in which a bird can be kept', 'name': 'birdcage'}, {'frequency': 'c', 'synset': 'birdhouse.n.01', 'synonyms': ['birdhouse'], 'id': 103, 'def': 'a shelter for birds', 'name': 'birdhouse'}, {'frequency': 'f', 'synset': 'birthday_cake.n.01', 'synonyms': ['birthday_cake'], 'id': 104, 'def': 'decorated cake served at a birthday party', 'name': 'birthday_cake'}, {'frequency': 'r', 'synset': 'birthday_card.n.01', 'synonyms': ['birthday_card'], 'id': 105, 'def': 'a card expressing a birthday greeting', 'name': 'birthday_card'}, {'frequency': 'r', 'synset': 'black_flag.n.01', 'synonyms': ['pirate_flag'], 'id': 106, 'def': 'a flag usually bearing a white skull and crossbones on a black background', 'name': 'pirate_flag'}, {'frequency': 'c', 'synset': 'black_sheep.n.02', 'synonyms': ['black_sheep'], 'id': 107, 'def': 'sheep with a black coat', 'name': 'black_sheep'}, {'frequency': 'c', 'synset': 'blackberry.n.01', 'synonyms': ['blackberry'], 'id': 108, 'def': 'large sweet black or very dark purple edible aggregate fruit', 'name': 'blackberry'}, {'frequency': 'f', 'synset': 'blackboard.n.01', 'synonyms': ['blackboard', 'chalkboard'], 'id': 109, 'def': 'sheet of slate; for writing with chalk', 'name': 'blackboard'}, {'frequency': 'f', 'synset': 'blanket.n.01', 'synonyms': ['blanket'], 'id': 110, 'def': 'bedding that keeps a person warm in bed', 'name': 'blanket'}, {'frequency': 'c', 'synset': 'blazer.n.01', 'synonyms': ['blazer', 'sport_jacket', 'sport_coat', 'sports_jacket', 'sports_coat'], 'id': 111, 'def': 'lightweight jacket; often striped in the colors of a club or school', 'name': 'blazer'}, {'frequency': 'f', 'synset': 'blender.n.01', 'synonyms': ['blender', 'liquidizer', 'liquidiser'], 'id': 112, 'def': 'an electrically powered mixer that mix or chop or liquefy foods', 'name': 'blender'}, {'frequency': 'r', 'synset': 'blimp.n.02', 'synonyms': ['blimp'], 'id': 113, 'def': 'a small nonrigid airship used for observation or as a barrage balloon', 'name': 'blimp'}, {'frequency': 'f', 'synset': 'blinker.n.01', 'synonyms': ['blinker', 'flasher'], 'id': 114, 'def': 'a light that flashes on and off; used as a signal or to send messages', 'name': 'blinker'}, {'frequency': 'f', 'synset': 'blouse.n.01', 'synonyms': ['blouse'], 'id': 115, 'def': 'a top worn by women', 'name': 'blouse'}, {'frequency': 'f', 'synset': 'blueberry.n.02', 'synonyms': ['blueberry'], 'id': 116, 'def': 'sweet edible dark-blue berries of blueberry plants', 'name': 'blueberry'}, {'frequency': 'r', 'synset': 'board.n.09', 'synonyms': ['gameboard'], 'id': 117, 'def': 'a flat portable surface (usually rectangular) designed for board games', 'name': 'gameboard'}, {'frequency': 'f', 'synset': 'boat.n.01', 'synonyms': ['boat', 'ship_(boat)'], 'id': 118, 'def': 'a vessel for travel on water', 'name': 'boat'}, {'frequency': 'r', 'synset': 'bob.n.05', 'synonyms': ['bob', 'bobber', 'bobfloat'], 'id': 119, 'def': 'a small float usually made of cork; attached to a fishing line', 'name': 'bob'}, {'frequency': 'c', 'synset': 'bobbin.n.01', 'synonyms': ['bobbin', 'spool', 'reel'], 'id': 120, 'def': 'a thing around which thread/tape/film 
or other flexible materials can be wound', 'name': 'bobbin'}, {'frequency': 'c', 'synset': 'bobby_pin.n.01', 'synonyms': ['bobby_pin', 'hairgrip'], 'id': 121, 'def': 'a flat wire hairpin used to hold bobbed hair in place', 'name': 'bobby_pin'}, {'frequency': 'c', 'synset': 'boiled_egg.n.01', 'synonyms': ['boiled_egg', 'coddled_egg'], 'id': 122, 'def': 'egg cooked briefly in the shell in gently boiling water', 'name': 'boiled_egg'}, {'frequency': 'r', 'synset': 'bolo_tie.n.01', 'synonyms': ['bolo_tie', 'bolo', 'bola_tie', 'bola'], 'id': 123, 'def': 'a cord fastened around the neck with an ornamental clasp and worn as a necktie', 'name': 'bolo_tie'}, {'frequency': 'c', 'synset': 'bolt.n.03', 'synonyms': ['deadbolt'], 'id': 124, 'def': 'the part of a lock that is engaged or withdrawn with a key', 'name': 'deadbolt'}, {'frequency': 'f', 'synset': 'bolt.n.06', 'synonyms': ['bolt'], 'id': 125, 'def': 'a screw that screws into a nut to form a fastener', 'name': 'bolt'}, {'frequency': 'r', 'synset': 'bonnet.n.01', 'synonyms': ['bonnet'], 'id': 126, 'def': 'a hat tied under the chin', 'name': 'bonnet'}, {'frequency': 'f', 'synset': 'book.n.01', 'synonyms': ['book'], 'id': 127, 'def': 'a written work or composition that has been published', 'name': 'book'}, {'frequency': 'c', 'synset': 'bookcase.n.01', 'synonyms': ['bookcase'], 'id': 128, 'def': 'a piece of furniture with shelves for storing books', 'name': 'bookcase'}, {'frequency': 'c', 'synset': 'booklet.n.01', 'synonyms': ['booklet', 'brochure', 'leaflet', 'pamphlet'], 'id': 129, 'def': 'a small book usually having a paper cover', 'name': 'booklet'}, {'frequency': 'r', 'synset': 'bookmark.n.01', 'synonyms': ['bookmark', 'bookmarker'], 'id': 130, 'def': 'a marker (a piece of paper or ribbon) placed between the pages of a book', 'name': 'bookmark'}, {'frequency': 'r', 'synset': 'boom.n.04', 'synonyms': ['boom_microphone', 'microphone_boom'], 'id': 131, 'def': 'a pole carrying an overhead microphone projected over a film or tv set', 'name': 'boom_microphone'}, {'frequency': 'f', 'synset': 'boot.n.01', 'synonyms': ['boot'], 'id': 132, 'def': 'footwear that covers the whole foot and lower leg', 'name': 'boot'}, {'frequency': 'f', 'synset': 'bottle.n.01', 'synonyms': ['bottle'], 'id': 133, 'def': 'a glass or plastic vessel used for storing drinks or other liquids', 'name': 'bottle'}, {'frequency': 'c', 'synset': 'bottle_opener.n.01', 'synonyms': ['bottle_opener'], 'id': 134, 'def': 'an opener for removing caps or corks from bottles', 'name': 'bottle_opener'}, {'frequency': 'c', 'synset': 'bouquet.n.01', 'synonyms': ['bouquet'], 'id': 135, 'def': 'an arrangement of flowers that is usually given as a present', 'name': 'bouquet'}, {'frequency': 'r', 'synset': 'bow.n.04', 'synonyms': ['bow_(weapon)'], 'id': 136, 'def': 'a weapon for shooting arrows', 'name': 'bow_(weapon)'}, {'frequency': 'f', 'synset': 'bow.n.08', 'synonyms': ['bow_(decorative_ribbons)'], 'id': 137, 'def': 'a decorative interlacing of ribbons', 'name': 'bow_(decorative_ribbons)'}, {'frequency': 'f', 'synset': 'bow_tie.n.01', 'synonyms': ['bow-tie', 'bowtie'], 'id': 138, 'def': "a man's tie that ties in a bow", 'name': 'bow-tie'}, {'frequency': 'f', 'synset': 'bowl.n.03', 'synonyms': ['bowl'], 'id': 139, 'def': 'a dish that is round and open at the top for serving foods', 'name': 'bowl'}, {'frequency': 'r', 'synset': 'bowl.n.08', 'synonyms': ['pipe_bowl'], 'id': 140, 'def': 'a small round container that is open at the top for holding tobacco', 'name': 'pipe_bowl'}, {'frequency': 'c', 
'synset': 'bowler_hat.n.01', 'synonyms': ['bowler_hat', 'bowler', 'derby_hat', 'derby', 'plug_hat'], 'id': 141, 'def': 'a felt hat that is round and hard with a narrow brim', 'name': 'bowler_hat'}, {'frequency': 'r', 'synset': 'bowling_ball.n.01', 'synonyms': ['bowling_ball'], 'id': 142, 'def': 'a large ball with finger holes used in the sport of bowling', 'name': 'bowling_ball'}, {'frequency': 'f', 'synset': 'box.n.01', 'synonyms': ['box'], 'id': 143, 'def': 'a (usually rectangular) container; may have a lid', 'name': 'box'}, {'frequency': 'r', 'synset': 'boxing_glove.n.01', 'synonyms': ['boxing_glove'], 'id': 144, 'def': 'large gloves covering the fists of a fighter, worn for the sport of boxing', 'name': 'boxing_glove'}, {'frequency': 'c', 'synset': 'brace.n.06', 'synonyms': ['suspenders'], 'id': 145, 'def': 'elastic straps that hold trousers up (usually used in the plural)', 'name': 'suspenders'}, {'frequency': 'f', 'synset': 'bracelet.n.02', 'synonyms': ['bracelet', 'bangle'], 'id': 146, 'def': 'jewelry worn around the wrist for decoration', 'name': 'bracelet'}, {'frequency': 'r', 'synset': 'brass.n.07', 'synonyms': ['brass_plaque'], 'id': 147, 'def': 'a memorial made of brass', 'name': 'brass_plaque'}, {'frequency': 'c', 'synset': 'brassiere.n.01', 'synonyms': ['brassiere', 'bra', 'bandeau'], 'id': 148, 'def': 'an undergarment worn by women to support their breasts', 'name': 'brassiere'}, {'frequency': 'c', 'synset': 'bread-bin.n.01', 'synonyms': ['bread-bin', 'breadbox'], 'id': 149, 'def': 'a container used to keep bread or cake in', 'name': 'bread-bin'}, {'frequency': 'f', 'synset': 'bread.n.01', 'synonyms': ['bread'], 'id': 150, 'def': 'food made from dough of flour or meal and usually raised with yeast or baking powder and then baked', 'name': 'bread'}, {'frequency': 'r', 'synset': 'breechcloth.n.01', 'synonyms': ['breechcloth', 'breechclout', 'loincloth'], 'id': 151, 'def': 'a garment that provides covering for the loins', 'name': 'breechcloth'}, {'frequency': 'f', 'synset': 'bridal_gown.n.01', 'synonyms': ['bridal_gown', 'wedding_gown', 'wedding_dress'], 'id': 152, 'def': 'a gown worn by the bride at a wedding', 'name': 'bridal_gown'}, {'frequency': 'c', 'synset': 'briefcase.n.01', 'synonyms': ['briefcase'], 'id': 153, 'def': 'a case with a handle; for carrying papers or files or books', 'name': 'briefcase'}, {'frequency': 'f', 'synset': 'broccoli.n.01', 'synonyms': ['broccoli'], 'id': 154, 'def': 'plant with dense clusters of tight green flower buds', 'name': 'broccoli'}, {'frequency': 'r', 'synset': 'brooch.n.01', 'synonyms': ['broach'], 'id': 155, 'def': 'a decorative pin worn by women', 'name': 'broach'}, {'frequency': 'c', 'synset': 'broom.n.01', 'synonyms': ['broom'], 'id': 156, 'def': 'bundle of straws or twigs attached to a long handle; used for cleaning', 'name': 'broom'}, {'frequency': 'c', 'synset': 'brownie.n.03', 'synonyms': ['brownie'], 'id': 157, 'def': 'square or bar of very rich chocolate cake usually with nuts', 'name': 'brownie'}, {'frequency': 'c', 'synset': 'brussels_sprouts.n.01', 'synonyms': ['brussels_sprouts'], 'id': 158, 'def': 'the small edible cabbage-like buds growing along a stalk', 'name': 'brussels_sprouts'}, {'frequency': 'r', 'synset': 'bubble_gum.n.01', 'synonyms': ['bubble_gum'], 'id': 159, 'def': 'a kind of chewing gum that can be blown into bubbles', 'name': 'bubble_gum'}, {'frequency': 'f', 'synset': 'bucket.n.01', 'synonyms': ['bucket', 'pail'], 'id': 160, 'def': 'a roughly cylindrical vessel that is open at the top', 'name': 'bucket'}, 
{'frequency': 'r', 'synset': 'buggy.n.01', 'synonyms': ['horse_buggy'], 'id': 161, 'def': 'a small lightweight carriage; drawn by a single horse', 'name': 'horse_buggy'}, {'frequency': 'c', 'synset': 'bull.n.11', 'synonyms': ['horned_cow'], 'id': 162, 'def': 'a cow with horns', 'name': 'bull'}, {'frequency': 'c', 'synset': 'bulldog.n.01', 'synonyms': ['bulldog'], 'id': 163, 'def': 'a thickset short-haired dog with a large head and strong undershot lower jaw', 'name': 'bulldog'}, {'frequency': 'r', 'synset': 'bulldozer.n.01', 'synonyms': ['bulldozer', 'dozer'], 'id': 164, 'def': 'large powerful tractor; a large blade in front flattens areas of ground', 'name': 'bulldozer'}, {'frequency': 'c', 'synset': 'bullet_train.n.01', 'synonyms': ['bullet_train'], 'id': 165, 'def': 'a high-speed passenger train', 'name': 'bullet_train'}, {'frequency': 'c', 'synset': 'bulletin_board.n.02', 'synonyms': ['bulletin_board', 'notice_board'], 'id': 166, 'def': 'a board that hangs on a wall; displays announcements', 'name': 'bulletin_board'}, {'frequency': 'r', 'synset': 'bulletproof_vest.n.01', 'synonyms': ['bulletproof_vest'], 'id': 167, 'def': 'a vest capable of resisting the impact of a bullet', 'name': 'bulletproof_vest'}, {'frequency': 'c', 'synset': 'bullhorn.n.01', 'synonyms': ['bullhorn', 'megaphone'], 'id': 168, 'def': 'a portable loudspeaker with built-in microphone and amplifier', 'name': 'bullhorn'}, {'frequency': 'f', 'synset': 'bun.n.01', 'synonyms': ['bun', 'roll'], 'id': 169, 'def': 'small rounded bread either plain or sweet', 'name': 'bun'}, {'frequency': 'c', 'synset': 'bunk_bed.n.01', 'synonyms': ['bunk_bed'], 'id': 170, 'def': 'beds built one above the other', 'name': 'bunk_bed'}, {'frequency': 'f', 'synset': 'buoy.n.01', 'synonyms': ['buoy'], 'id': 171, 'def': 'a float attached by rope to the seabed to mark channels in a harbor or underwater hazards', 'name': 'buoy'}, {'frequency': 'r', 'synset': 'burrito.n.01', 'synonyms': ['burrito'], 'id': 172, 'def': 'a flour tortilla folded around a filling', 'name': 'burrito'}, {'frequency': 'f', 'synset': 'bus.n.01', 'synonyms': ['bus_(vehicle)', 'autobus', 'charabanc', 'double-decker', 'motorbus', 'motorcoach'], 'id': 173, 'def': 'a vehicle carrying many passengers; used for public transport', 'name': 'bus_(vehicle)'}, {'frequency': 'c', 'synset': 'business_card.n.01', 'synonyms': ['business_card'], 'id': 174, 'def': "a card on which are printed the person's name and business affiliation", 'name': 'business_card'}, {'frequency': 'f', 'synset': 'butter.n.01', 'synonyms': ['butter'], 'id': 175, 'def': 'an edible emulsion of fat globules made by churning milk or cream; for cooking and table use', 'name': 'butter'}, {'frequency': 'c', 'synset': 'butterfly.n.01', 'synonyms': ['butterfly'], 'id': 176, 'def': 'insect typically having a slender body with knobbed antennae and broad colorful wings', 'name': 'butterfly'}, {'frequency': 'f', 'synset': 'button.n.01', 'synonyms': ['button'], 'id': 177, 'def': 'a round fastener sewn to shirts and coats etc to fit through buttonholes', 'name': 'button'}, {'frequency': 'f', 'synset': 'cab.n.03', 'synonyms': ['cab_(taxi)', 'taxi', 'taxicab'], 'id': 178, 'def': 'a car that takes passengers where they want to go in exchange for money', 'name': 'cab_(taxi)'}, {'frequency': 'r', 'synset': 'cabana.n.01', 'synonyms': ['cabana'], 'id': 179, 'def': 'a small tent used as a dressing room beside the sea or a swimming pool', 'name': 'cabana'}, {'frequency': 'c', 'synset': 'cabin_car.n.01', 'synonyms': ['cabin_car', 'caboose'], 
'id': 180, 'def': 'a car on a freight train for use of the train crew; usually the last car on the train', 'name': 'cabin_car'}, {'frequency': 'f', 'synset': 'cabinet.n.01', 'synonyms': ['cabinet'], 'id': 181, 'def': 'a piece of furniture resembling a cupboard with doors and shelves and drawers', 'name': 'cabinet'}, {'frequency': 'r', 'synset': 'cabinet.n.03', 'synonyms': ['locker', 'storage_locker'], 'id': 182, 'def': 'a storage compartment for clothes and valuables; usually it has a lock', 'name': 'locker'}, {'frequency': 'f', 'synset': 'cake.n.03', 'synonyms': ['cake'], 'id': 183, 'def': 'baked goods made from or based on a mixture of flour, sugar, eggs, and fat', 'name': 'cake'}, {'frequency': 'c', 'synset': 'calculator.n.02', 'synonyms': ['calculator'], 'id': 184, 'def': 'a small machine that is used for mathematical calculations', 'name': 'calculator'}, {'frequency': 'f', 'synset': 'calendar.n.02', 'synonyms': ['calendar'], 'id': 185, 'def': 'a list or register of events (appointments/social events/court cases, etc)', 'name': 'calendar'}, {'frequency': 'c', 'synset': 'calf.n.01', 'synonyms': ['calf'], 'id': 186, 'def': 'young of domestic cattle', 'name': 'calf'}, {'frequency': 'c', 'synset': 'camcorder.n.01', 'synonyms': ['camcorder'], 'id': 187, 'def': 'a portable television camera and videocassette recorder', 'name': 'camcorder'}, {'frequency': 'c', 'synset': 'camel.n.01', 'synonyms': ['camel'], 'id': 188, 'def': 'cud-chewing mammal used as a draft or saddle animal in desert regions', 'name': 'camel'}, {'frequency': 'f', 'synset': 'camera.n.01', 'synonyms': ['camera'], 'id': 189, 'def': 'equipment for taking photographs', 'name': 'camera'}, {'frequency': 'c', 'synset': 'camera_lens.n.01', 'synonyms': ['camera_lens'], 'id': 190, 'def': 'a lens that focuses the image in a camera', 'name': 'camera_lens'}, {'frequency': 'c', 'synset': 'camper.n.02', 'synonyms': ['camper_(vehicle)', 'camping_bus', 'motor_home'], 'id': 191, 'def': 'a recreational vehicle equipped for camping out while traveling', 'name': 'camper_(vehicle)'}, {'frequency': 'f', 'synset': 'can.n.01', 'synonyms': ['can', 'tin_can'], 'id': 192, 'def': 'airtight sealed metal container for food or drink or paint etc.', 'name': 'can'}, {'frequency': 'c', 'synset': 'can_opener.n.01', 'synonyms': ['can_opener', 'tin_opener'], 'id': 193, 'def': 'a device for cutting cans open', 'name': 'can_opener'}, {'frequency': 'f', 'synset': 'candle.n.01', 'synonyms': ['candle', 'candlestick'], 'id': 194, 'def': 'stick of wax with a wick in the middle', 'name': 'candle'}, {'frequency': 'f', 'synset': 'candlestick.n.01', 'synonyms': ['candle_holder'], 'id': 195, 'def': 'a holder with sockets for candles', 'name': 'candle_holder'}, {'frequency': 'r', 'synset': 'candy_bar.n.01', 'synonyms': ['candy_bar'], 'id': 196, 'def': 'a candy shaped as a bar', 'name': 'candy_bar'}, {'frequency': 'c', 'synset': 'candy_cane.n.01', 'synonyms': ['candy_cane'], 'id': 197, 'def': 'a hard candy in the shape of a rod (usually with stripes)', 'name': 'candy_cane'}, {'frequency': 'c', 'synset': 'cane.n.01', 'synonyms': ['walking_cane'], 'id': 198, 'def': 'a stick that people can lean on to help them walk', 'name': 'walking_cane'}, {'frequency': 'c', 'synset': 'canister.n.02', 'synonyms': ['canister', 'cannister'], 'id': 199, 'def': 'metal container for storing dry foods such as tea or flour', 'name': 'canister'}, {'frequency': 'c', 'synset': 'canoe.n.01', 'synonyms': ['canoe'], 'id': 200, 'def': 'small and light boat; pointed at both ends; propelled with a paddle', 
'name': 'canoe'}, {'frequency': 'c', 'synset': 'cantaloup.n.02', 'synonyms': ['cantaloup', 'cantaloupe'], 'id': 201, 'def': 'the fruit of a cantaloup vine; small to medium-sized melon with yellowish flesh', 'name': 'cantaloup'}, {'frequency': 'r', 'synset': 'canteen.n.01', 'synonyms': ['canteen'], 'id': 202, 'def': 'a flask for carrying water; used by soldiers or travelers', 'name': 'canteen'}, {'frequency': 'f', 'synset': 'cap.n.01', 'synonyms': ['cap_(headwear)'], 'id': 203, 'def': 'tight-fitting headwear', 'name': 'cap_(headwear)'}, {'frequency': 'f', 'synset': 'cap.n.02', 'synonyms': ['bottle_cap', 'cap_(container_lid)'], 'id': 204, 'def': 'a top (as for a bottle)', 'name': 'bottle_cap'}, {'frequency': 'c', 'synset': 'cape.n.02', 'synonyms': ['cape'], 'id': 205, 'def': 'a sleeveless garment like a cloak but shorter', 'name': 'cape'}, {'frequency': 'c', 'synset': 'cappuccino.n.01', 'synonyms': ['cappuccino', 'coffee_cappuccino'], 'id': 206, 'def': 'equal parts of espresso and steamed milk', 'name': 'cappuccino'}, {'frequency': 'f', 'synset': 'car.n.01', 'synonyms': ['car_(automobile)', 'auto_(automobile)', 'automobile'], 'id': 207, 'def': 'a motor vehicle with four wheels', 'name': 'car_(automobile)'}, {'frequency': 'f', 'synset': 'car.n.02', 'synonyms': ['railcar_(part_of_a_train)', 'railway_car_(part_of_a_train)', 'railroad_car_(part_of_a_train)'], 'id': 208, 'def': 'a wheeled vehicle adapted to the rails of railroad (mark each individual railcar separately)', 'name': 'railcar_(part_of_a_train)'}, {'frequency': 'r', 'synset': 'car.n.04', 'synonyms': ['elevator_car'], 'id': 209, 'def': 'the compartment of an elevator in which passengers ride up and down', 'name': 'elevator_car'}, {'frequency': 'r', 'synset': 'car_battery.n.01', 'synonyms': ['car_battery', 'automobile_battery'], 'id': 210, 'def': 'a battery in a motor vehicle', 'name': 'car_battery'}, {'frequency': 'c', 'synset': 'card.n.02', 'synonyms': ['identity_card'], 'id': 211, 'def': 'a card certifying the identity of the bearer', 'name': 'identity_card'}, {'frequency': 'c', 'synset': 'card.n.03', 'synonyms': ['card'], 'id': 212, 'def': 'a rectangular piece of paper used to send messages (e.g. 
greetings or pictures)', 'name': 'card'}, {'frequency': 'c', 'synset': 'cardigan.n.01', 'synonyms': ['cardigan'], 'id': 213, 'def': 'knitted jacket that is fastened up the front with buttons or a zipper', 'name': 'cardigan'}, {'frequency': 'r', 'synset': 'cargo_ship.n.01', 'synonyms': ['cargo_ship', 'cargo_vessel'], 'id': 214, 'def': 'a ship designed to carry cargo', 'name': 'cargo_ship'}, {'frequency': 'r', 'synset': 'carnation.n.01', 'synonyms': ['carnation'], 'id': 215, 'def': 'plant with pink to purple-red spice-scented usually double flowers', 'name': 'carnation'}, {'frequency': 'c', 'synset': 'carriage.n.02', 'synonyms': ['horse_carriage'], 'id': 216, 'def': 'a vehicle with wheels drawn by one or more horses', 'name': 'horse_carriage'}, {'frequency': 'f', 'synset': 'carrot.n.01', 'synonyms': ['carrot'], 'id': 217, 'def': 'deep orange edible root of the cultivated carrot plant', 'name': 'carrot'}, {'frequency': 'f', 'synset': 'carryall.n.01', 'synonyms': ['tote_bag'], 'id': 218, 'def': 'a capacious bag or basket', 'name': 'tote_bag'}, {'frequency': 'c', 'synset': 'cart.n.01', 'synonyms': ['cart'], 'id': 219, 'def': 'a heavy open wagon usually having two wheels and drawn by an animal', 'name': 'cart'}, {'frequency': 'c', 'synset': 'carton.n.02', 'synonyms': ['carton'], 'id': 220, 'def': 'a container made of cardboard for holding food or drink', 'name': 'carton'}, {'frequency': 'c', 'synset': 'cash_register.n.01', 'synonyms': ['cash_register', 'register_(for_cash_transactions)'], 'id': 221, 'def': 'a cashbox with an adding machine to register transactions', 'name': 'cash_register'}, {'frequency': 'r', 'synset': 'casserole.n.01', 'synonyms': ['casserole'], 'id': 222, 'def': 'food cooked and served in a casserole', 'name': 'casserole'}, {'frequency': 'r', 'synset': 'cassette.n.01', 'synonyms': ['cassette'], 'id': 223, 'def': 'a container that holds a magnetic tape used for recording or playing sound or video', 'name': 'cassette'}, {'frequency': 'c', 'synset': 'cast.n.05', 'synonyms': ['cast', 'plaster_cast', 'plaster_bandage'], 'id': 224, 'def': 'bandage consisting of a firm covering that immobilizes broken bones while they heal', 'name': 'cast'}, {'frequency': 'f', 'synset': 'cat.n.01', 'synonyms': ['cat'], 'id': 225, 'def': 'a domestic house cat', 'name': 'cat'}, {'frequency': 'f', 'synset': 'cauliflower.n.02', 'synonyms': ['cauliflower'], 'id': 226, 'def': 'edible compact head of white undeveloped flowers', 'name': 'cauliflower'}, {'frequency': 'c', 'synset': 'cayenne.n.02', 'synonyms': ['cayenne_(spice)', 'cayenne_pepper_(spice)', 'red_pepper_(spice)'], 'id': 227, 'def': 'ground pods and seeds of pungent red peppers of the genus Capsicum', 'name': 'cayenne_(spice)'}, {'frequency': 'c', 'synset': 'cd_player.n.01', 'synonyms': ['CD_player'], 'id': 228, 'def': 'electronic equipment for playing compact discs (CDs)', 'name': 'CD_player'}, {'frequency': 'f', 'synset': 'celery.n.01', 'synonyms': ['celery'], 'id': 229, 'def': 'widely cultivated herb with aromatic leaf stalks that are eaten raw or cooked', 'name': 'celery'}, {'frequency': 'f', 'synset': 'cellular_telephone.n.01', 'synonyms': ['cellular_telephone', 'cellular_phone', 'cellphone', 'mobile_phone', 'smart_phone'], 'id': 230, 'def': 'a hand-held mobile telephone', 'name': 'cellular_telephone'}, {'frequency': 'r', 'synset': 'chain_mail.n.01', 'synonyms': ['chain_mail', 'ring_mail', 'chain_armor', 'chain_armour', 'ring_armor', 'ring_armour'], 'id': 231, 'def': '(Middle Ages) flexible armor made of interlinked metal rings', 'name': 
'chain_mail'}, {'frequency': 'f', 'synset': 'chair.n.01', 'synonyms': ['chair'], 'id': 232, 'def': 'a seat for one person, with a support for the back', 'name': 'chair'}, {'frequency': 'r', 'synset': 'chaise_longue.n.01', 'synonyms': ['chaise_longue', 'chaise', 'daybed'], 'id': 233, 'def': 'a long chair; for reclining', 'name': 'chaise_longue'}, {'frequency': 'r', 'synset': 'chalice.n.01', 'synonyms': ['chalice'], 'id': 234, 'def': 'a bowl-shaped drinking vessel; especially the Eucharistic cup', 'name': 'chalice'}, {'frequency': 'f', 'synset': 'chandelier.n.01', 'synonyms': ['chandelier'], 'id': 235, 'def': 'branched lighting fixture; often ornate; hangs from the ceiling', 'name': 'chandelier'}, {'frequency': 'r', 'synset': 'chap.n.04', 'synonyms': ['chap'], 'id': 236, 'def': 'leather leggings without a seat; worn over trousers by cowboys to protect their legs', 'name': 'chap'}, {'frequency': 'r', 'synset': 'checkbook.n.01', 'synonyms': ['checkbook', 'chequebook'], 'id': 237, 'def': 'a book issued to holders of checking accounts', 'name': 'checkbook'}, {'frequency': 'r', 'synset': 'checkerboard.n.01', 'synonyms': ['checkerboard'], 'id': 238, 'def': 'a board having 64 squares of two alternating colors', 'name': 'checkerboard'}, {'frequency': 'c', 'synset': 'cherry.n.03', 'synonyms': ['cherry'], 'id': 239, 'def': 'a red fruit with a single hard stone', 'name': 'cherry'}, {'frequency': 'r', 'synset': 'chessboard.n.01', 'synonyms': ['chessboard'], 'id': 240, 'def': 'a checkerboard used to play chess', 'name': 'chessboard'}, {'frequency': 'c', 'synset': 'chicken.n.02', 'synonyms': ['chicken_(animal)'], 'id': 241, 'def': 'a domestic fowl bred for flesh or eggs', 'name': 'chicken_(animal)'}, {'frequency': 'c', 'synset': 'chickpea.n.01', 'synonyms': ['chickpea', 'garbanzo'], 'id': 242, 'def': 'the seed of the chickpea plant; usually dried', 'name': 'chickpea'}, {'frequency': 'c', 'synset': 'chili.n.02', 'synonyms': ['chili_(vegetable)', 'chili_pepper_(vegetable)', 'chilli_(vegetable)', 'chilly_(vegetable)', 'chile_(vegetable)'], 'id': 243, 'def': 'very hot and finely tapering pepper of special pungency', 'name': 'chili_(vegetable)'}, {'frequency': 'r', 'synset': 'chime.n.01', 'synonyms': ['chime', 'gong'], 'id': 244, 'def': 'an instrument consisting of a set of bells that are struck with a hammer', 'name': 'chime'}, {'frequency': 'r', 'synset': 'chinaware.n.01', 'synonyms': ['chinaware'], 'id': 245, 'def': 'dishware made of high quality porcelain', 'name': 'chinaware'}, {'frequency': 'c', 'synset': 'chip.n.04', 'synonyms': ['crisp_(potato_chip)', 'potato_chip'], 'id': 246, 'def': 'a thin crisp slice of potato fried in deep fat', 'name': 'crisp_(potato_chip)'}, {'frequency': 'r', 'synset': 'chip.n.06', 'synonyms': ['poker_chip'], 'id': 247, 'def': 'a small disk-shaped counter used to represent money when gambling', 'name': 'poker_chip'}, {'frequency': 'c', 'synset': 'chocolate_bar.n.01', 'synonyms': ['chocolate_bar'], 'id': 248, 'def': 'a bar of chocolate candy', 'name': 'chocolate_bar'}, {'frequency': 'c', 'synset': 'chocolate_cake.n.01', 'synonyms': ['chocolate_cake'], 'id': 249, 'def': 'cake containing chocolate', 'name': 'chocolate_cake'}, {'frequency': 'r', 'synset': 'chocolate_milk.n.01', 'synonyms': ['chocolate_milk'], 'id': 250, 'def': 'milk flavored with chocolate syrup', 'name': 'chocolate_milk'}, {'frequency': 'r', 'synset': 'chocolate_mousse.n.01', 'synonyms': ['chocolate_mousse'], 'id': 251, 'def': 'dessert mousse made with chocolate', 'name': 'chocolate_mousse'}, {'frequency': 'f', 
'synset': 'choker.n.03', 'synonyms': ['choker', 'collar', 'neckband'], 'id': 252, 'def': 'shirt collar, animal collar, or tight-fitting necklace', 'name': 'choker'}, {'frequency': 'f', 'synset': 'chopping_board.n.01', 'synonyms': ['chopping_board', 'cutting_board', 'chopping_block'], 'id': 253, 'def': 'a wooden board where meats or vegetables can be cut', 'name': 'chopping_board'}, {'frequency': 'f', 'synset': 'chopstick.n.01', 'synonyms': ['chopstick'], 'id': 254, 'def': 'one of a pair of slender sticks used as oriental tableware to eat food with', 'name': 'chopstick'}, {'frequency': 'f', 'synset': 'christmas_tree.n.05', 'synonyms': ['Christmas_tree'], 'id': 255, 'def': 'an ornamented evergreen used as a Christmas decoration', 'name': 'Christmas_tree'}, {'frequency': 'c', 'synset': 'chute.n.02', 'synonyms': ['slide'], 'id': 256, 'def': 'sloping channel through which things can descend', 'name': 'slide'}, {'frequency': 'r', 'synset': 'cider.n.01', 'synonyms': ['cider', 'cyder'], 'id': 257, 'def': 'a beverage made from juice pressed from apples', 'name': 'cider'}, {'frequency': 'r', 'synset': 'cigar_box.n.01', 'synonyms': ['cigar_box'], 'id': 258, 'def': 'a box for holding cigars', 'name': 'cigar_box'}, {'frequency': 'f', 'synset': 'cigarette.n.01', 'synonyms': ['cigarette'], 'id': 259, 'def': 'finely ground tobacco wrapped in paper; for smoking', 'name': 'cigarette'}, {'frequency': 'c', 'synset': 'cigarette_case.n.01', 'synonyms': ['cigarette_case', 'cigarette_pack'], 'id': 260, 'def': 'a small flat case for holding cigarettes', 'name': 'cigarette_case'}, {'frequency': 'f', 'synset': 'cistern.n.02', 'synonyms': ['cistern', 'water_tank'], 'id': 261, 'def': 'a tank that holds the water used to flush a toilet', 'name': 'cistern'}, {'frequency': 'r', 'synset': 'clarinet.n.01', 'synonyms': ['clarinet'], 'id': 262, 'def': 'a single-reed instrument with a straight tube', 'name': 'clarinet'}, {'frequency': 'c', 'synset': 'clasp.n.01', 'synonyms': ['clasp'], 'id': 263, 'def': 'a fastener (as a buckle or hook) that is used to hold two things together', 'name': 'clasp'}, {'frequency': 'c', 'synset': 'cleansing_agent.n.01', 'synonyms': ['cleansing_agent', 'cleanser', 'cleaner'], 'id': 264, 'def': 'a preparation used in cleaning something', 'name': 'cleansing_agent'}, {'frequency': 'r', 'synset': 'cleat.n.02', 'synonyms': ['cleat_(for_securing_rope)'], 'id': 265, 'def': 'a fastener (usually with two projecting horns) around which a rope can be secured', 'name': 'cleat_(for_securing_rope)'}, {'frequency': 'r', 'synset': 'clementine.n.01', 'synonyms': ['clementine'], 'id': 266, 'def': 'a variety of mandarin orange', 'name': 'clementine'}, {'frequency': 'c', 'synset': 'clip.n.03', 'synonyms': ['clip'], 'id': 267, 'def': 'any of various small fasteners used to hold loose articles together', 'name': 'clip'}, {'frequency': 'c', 'synset': 'clipboard.n.01', 'synonyms': ['clipboard'], 'id': 268, 'def': 'a small writing board with a clip at the top for holding papers', 'name': 'clipboard'}, {'frequency': 'r', 'synset': 'clipper.n.03', 'synonyms': ['clippers_(for_plants)'], 'id': 269, 'def': 'shears for cutting grass or shrubbery (often used in the plural)', 'name': 'clippers_(for_plants)'}, {'frequency': 'r', 'synset': 'cloak.n.02', 'synonyms': ['cloak'], 'id': 270, 'def': 'a loose outer garment', 'name': 'cloak'}, {'frequency': 'f', 'synset': 'clock.n.01', 'synonyms': ['clock', 'timepiece', 'timekeeper'], 'id': 271, 'def': 'a timepiece that shows the time of day', 'name': 'clock'}, {'frequency': 'f', 'synset': 
'clock_tower.n.01', 'synonyms': ['clock_tower'], 'id': 272, 'def': 'a tower with a large clock visible high up on an outside face', 'name': 'clock_tower'}, {'frequency': 'c', 'synset': 'clothes_hamper.n.01', 'synonyms': ['clothes_hamper', 'laundry_basket', 'clothes_basket'], 'id': 273, 'def': 'a hamper that holds dirty clothes to be washed or wet clothes to be dried', 'name': 'clothes_hamper'}, {'frequency': 'c', 'synset': 'clothespin.n.01', 'synonyms': ['clothespin', 'clothes_peg'], 'id': 274, 'def': 'wood or plastic fastener; for holding clothes on a clothesline', 'name': 'clothespin'}, {'frequency': 'r', 'synset': 'clutch_bag.n.01', 'synonyms': ['clutch_bag'], 'id': 275, 'def': "a woman's strapless purse that is carried in the hand", 'name': 'clutch_bag'}, {'frequency': 'f', 'synset': 'coaster.n.03', 'synonyms': ['coaster'], 'id': 276, 'def': 'a covering (plate or mat) that protects the surface of a table', 'name': 'coaster'}, {'frequency': 'f', 'synset': 'coat.n.01', 'synonyms': ['coat'], 'id': 277, 'def': 'an outer garment that has sleeves and covers the body from shoulder down', 'name': 'coat'}, {'frequency': 'c', 'synset': 'coat_hanger.n.01', 'synonyms': ['coat_hanger', 'clothes_hanger', 'dress_hanger'], 'id': 278, 'def': "a hanger that is shaped like a person's shoulders", 'name': 'coat_hanger'}, {'frequency': 'c', 'synset': 'coatrack.n.01', 'synonyms': ['coatrack', 'hatrack'], 'id': 279, 'def': 'a rack with hooks for temporarily holding coats and hats', 'name': 'coatrack'}, {'frequency': 'c', 'synset': 'cock.n.04', 'synonyms': ['cock', 'rooster'], 'id': 280, 'def': 'adult male chicken', 'name': 'cock'}, {'frequency': 'r', 'synset': 'cockroach.n.01', 'synonyms': ['cockroach'], 'id': 281, 'def': 'any of numerous chiefly nocturnal insects; some are domestic pests', 'name': 'cockroach'}, {'frequency': 'r', 'synset': 'cocoa.n.01', 'synonyms': ['cocoa_(beverage)', 'hot_chocolate_(beverage)', 'drinking_chocolate'], 'id': 282, 'def': 'a beverage made from cocoa powder and milk and sugar; usually drunk hot', 'name': 'cocoa_(beverage)'}, {'frequency': 'c', 'synset': 'coconut.n.02', 'synonyms': ['coconut', 'cocoanut'], 'id': 283, 'def': 'large hard-shelled brown oval nut with a fibrous husk', 'name': 'coconut'}, {'frequency': 'f', 'synset': 'coffee_maker.n.01', 'synonyms': ['coffee_maker', 'coffee_machine'], 'id': 284, 'def': 'a kitchen appliance for brewing coffee automatically', 'name': 'coffee_maker'}, {'frequency': 'f', 'synset': 'coffee_table.n.01', 'synonyms': ['coffee_table', 'cocktail_table'], 'id': 285, 'def': 'low table where magazines can be placed and coffee or cocktails are served', 'name': 'coffee_table'}, {'frequency': 'c', 'synset': 'coffeepot.n.01', 'synonyms': ['coffeepot'], 'id': 286, 'def': 'tall pot in which coffee is brewed', 'name': 'coffeepot'}, {'frequency': 'r', 'synset': 'coil.n.05', 'synonyms': ['coil'], 'id': 287, 'def': 'tubing that is wound in a spiral', 'name': 'coil'}, {'frequency': 'c', 'synset': 'coin.n.01', 'synonyms': ['coin'], 'id': 288, 'def': 'a flat metal piece (usually a disc) used as money', 'name': 'coin'}, {'frequency': 'c', 'synset': 'colander.n.01', 'synonyms': ['colander', 'cullender'], 'id': 289, 'def': 'bowl-shaped strainer; used to wash or drain foods', 'name': 'colander'}, {'frequency': 'c', 'synset': 'coleslaw.n.01', 'synonyms': ['coleslaw', 'slaw'], 'id': 290, 'def': 'basically shredded cabbage', 'name': 'coleslaw'}, {'frequency': 'r', 'synset': 'coloring_material.n.01', 'synonyms': ['coloring_material', 'colouring_material'], 'id': 291, 
'def': 'any material used for its color', 'name': 'coloring_material'}, {'frequency': 'r', 'synset': 'combination_lock.n.01', 'synonyms': ['combination_lock'], 'id': 292, 'def': 'lock that can be opened only by turning dials in a special sequence', 'name': 'combination_lock'}, {'frequency': 'c', 'synset': 'comforter.n.04', 'synonyms': ['pacifier', 'teething_ring'], 'id': 293, 'def': 'device used for an infant to suck or bite on', 'name': 'pacifier'}, {'frequency': 'r', 'synset': 'comic_book.n.01', 'synonyms': ['comic_book'], 'id': 294, 'def': 'a magazine devoted to comic strips', 'name': 'comic_book'}, {'frequency': 'r', 'synset': 'compass.n.01', 'synonyms': ['compass'], 'id': 295, 'def': 'navigational instrument for finding directions', 'name': 'compass'}, {'frequency': 'f', 'synset': 'computer_keyboard.n.01', 'synonyms': ['computer_keyboard', 'keyboard_(computer)'], 'id': 296, 'def': 'a keyboard that is a data input device for computers', 'name': 'computer_keyboard'}, {'frequency': 'f', 'synset': 'condiment.n.01', 'synonyms': ['condiment'], 'id': 297, 'def': 'a preparation (a sauce or relish or spice) to enhance flavor or enjoyment', 'name': 'condiment'}, {'frequency': 'f', 'synset': 'cone.n.01', 'synonyms': ['cone', 'traffic_cone'], 'id': 298, 'def': 'a cone-shaped object used to direct traffic', 'name': 'cone'}, {'frequency': 'f', 'synset': 'control.n.09', 'synonyms': ['control', 'controller'], 'id': 299, 'def': 'a mechanism that controls the operation of a machine', 'name': 'control'}, {'frequency': 'r', 'synset': 'convertible.n.01', 'synonyms': ['convertible_(automobile)'], 'id': 300, 'def': 'a car that has a top that can be folded or removed', 'name': 'convertible_(automobile)'}, {'frequency': 'r', 'synset': 'convertible.n.03', 'synonyms': ['sofa_bed'], 'id': 301, 'def': 'a sofa that can be converted into a bed', 'name': 'sofa_bed'}, {'frequency': 'r', 'synset': 'cooker.n.01', 'synonyms': ['cooker'], 'id': 302, 'def': 'a utensil for cooking', 'name': 'cooker'}, {'frequency': 'f', 'synset': 'cookie.n.01', 'synonyms': ['cookie', 'cooky', 'biscuit_(cookie)'], 'id': 303, 'def': "any of various small flat sweet cakes (`biscuit' is the British term)", 'name': 'cookie'}, {'frequency': 'r', 'synset': 'cooking_utensil.n.01', 'synonyms': ['cooking_utensil'], 'id': 304, 'def': 'a kitchen utensil made of material that does not melt easily; used for cooking', 'name': 'cooking_utensil'}, {'frequency': 'f', 'synset': 'cooler.n.01', 'synonyms': ['cooler_(for_food)', 'ice_chest'], 'id': 305, 'def': 'an insulated box for storing food often with ice', 'name': 'cooler_(for_food)'}, {'frequency': 'f', 'synset': 'cork.n.04', 'synonyms': ['cork_(bottle_plug)', 'bottle_cork'], 'id': 306, 'def': 'the plug in the mouth of a bottle (especially a wine bottle)', 'name': 'cork_(bottle_plug)'}, {'frequency': 'r', 'synset': 'corkboard.n.01', 'synonyms': ['corkboard'], 'id': 307, 'def': 'a sheet consisting of cork granules', 'name': 'corkboard'}, {'frequency': 'c', 'synset': 'corkscrew.n.01', 'synonyms': ['corkscrew', 'bottle_screw'], 'id': 308, 'def': 'a bottle opener that pulls corks', 'name': 'corkscrew'}, {'frequency': 'f', 'synset': 'corn.n.03', 'synonyms': ['edible_corn', 'corn', 'maize'], 'id': 309, 'def': 'ears or kernels of corn that can be prepared and served for human food (only mark individual ears or kernels)', 'name': 'edible_corn'}, {'frequency': 'r', 'synset': 'cornbread.n.01', 'synonyms': ['cornbread'], 'id': 310, 'def': 'bread made primarily of cornmeal', 'name': 'cornbread'}, {'frequency': 'c', 
'synset': 'cornet.n.01', 'synonyms': ['cornet', 'horn', 'trumpet'], 'id': 311, 'def': 'a brass musical instrument with a narrow tube and a flared bell and many valves', 'name': 'cornet'}, {'frequency': 'c', 'synset': 'cornice.n.01', 'synonyms': ['cornice', 'valance', 'valance_board', 'pelmet'], 'id': 312, 'def': 'a decorative framework to conceal curtain fixtures at the top of a window casing', 'name': 'cornice'}, {'frequency': 'r', 'synset': 'cornmeal.n.01', 'synonyms': ['cornmeal'], 'id': 313, 'def': 'coarsely ground corn', 'name': 'cornmeal'}, {'frequency': 'c', 'synset': 'corset.n.01', 'synonyms': ['corset', 'girdle'], 'id': 314, 'def': "a woman's close-fitting foundation garment", 'name': 'corset'}, {'frequency': 'c', 'synset': 'costume.n.04', 'synonyms': ['costume'], 'id': 315, 'def': 'the attire characteristic of a country or a time or a social class', 'name': 'costume'}, {'frequency': 'r', 'synset': 'cougar.n.01', 'synonyms': ['cougar', 'puma', 'catamount', 'mountain_lion', 'panther'], 'id': 316, 'def': 'large American feline resembling a lion', 'name': 'cougar'}, {'frequency': 'r', 'synset': 'coverall.n.01', 'synonyms': ['coverall'], 'id': 317, 'def': 'a loose-fitting protective garment that is worn over other clothing', 'name': 'coverall'}, {'frequency': 'c', 'synset': 'cowbell.n.01', 'synonyms': ['cowbell'], 'id': 318, 'def': 'a bell hung around the neck of a cow so that the cow can be easily located', 'name': 'cowbell'}, {'frequency': 'f', 'synset': 'cowboy_hat.n.01', 'synonyms': ['cowboy_hat', 'ten-gallon_hat'], 'id': 319, 'def': 'a hat with a wide brim and a soft crown; worn by American ranch hands', 'name': 'cowboy_hat'}, {'frequency': 'c', 'synset': 'crab.n.01', 'synonyms': ['crab_(animal)'], 'id': 320, 'def': 'decapod having eyes on short stalks and a broad flattened shell and pincers', 'name': 'crab_(animal)'}, {'frequency': 'r', 'synset': 'crab.n.05', 'synonyms': ['crabmeat'], 'id': 321, 'def': 'the edible flesh of any of various crabs', 'name': 'crabmeat'}, {'frequency': 'c', 'synset': 'cracker.n.01', 'synonyms': ['cracker'], 'id': 322, 'def': 'a thin crisp wafer', 'name': 'cracker'}, {'frequency': 'r', 'synset': 'crape.n.01', 'synonyms': ['crape', 'crepe', 'French_pancake'], 'id': 323, 'def': 'small very thin pancake', 'name': 'crape'}, {'frequency': 'f', 'synset': 'crate.n.01', 'synonyms': ['crate'], 'id': 324, 'def': 'a rugged box (usually made of wood); used for shipping', 'name': 'crate'}, {'frequency': 'c', 'synset': 'crayon.n.01', 'synonyms': ['crayon', 'wax_crayon'], 'id': 325, 'def': 'writing or drawing implement made of a colored stick of composition wax', 'name': 'crayon'}, {'frequency': 'r', 'synset': 'cream_pitcher.n.01', 'synonyms': ['cream_pitcher'], 'id': 326, 'def': 'a small pitcher for serving cream', 'name': 'cream_pitcher'}, {'frequency': 'c', 'synset': 'crescent_roll.n.01', 'synonyms': ['crescent_roll', 'croissant'], 'id': 327, 'def': 'very rich flaky crescent-shaped roll', 'name': 'crescent_roll'}, {'frequency': 'c', 'synset': 'crib.n.01', 'synonyms': ['crib', 'cot'], 'id': 328, 'def': 'baby bed with high sides made of slats', 'name': 'crib'}, {'frequency': 'c', 'synset': 'crock.n.03', 'synonyms': ['crock_pot', 'earthenware_jar'], 'id': 329, 'def': 'an earthen jar (made of baked clay) or a modern electric crockpot', 'name': 'crock_pot'}, {'frequency': 'f', 'synset': 'crossbar.n.01', 'synonyms': ['crossbar'], 'id': 330, 'def': 'a horizontal bar that goes across something', 'name': 'crossbar'}, {'frequency': 'r', 'synset': 'crouton.n.01', 'synonyms': 
['crouton'], 'id': 331, 'def': 'a small piece of toasted or fried bread; served in soup or salads', 'name': 'crouton'}, {'frequency': 'c', 'synset': 'crow.n.01', 'synonyms': ['crow'], 'id': 332, 'def': 'black birds having a raucous call', 'name': 'crow'}, {'frequency': 'r', 'synset': 'crowbar.n.01', 'synonyms': ['crowbar', 'wrecking_bar', 'pry_bar'], 'id': 333, 'def': 'a heavy iron lever with one end forged into a wedge', 'name': 'crowbar'}, {'frequency': 'c', 'synset': 'crown.n.04', 'synonyms': ['crown'], 'id': 334, 'def': 'an ornamental jeweled headdress signifying sovereignty', 'name': 'crown'}, {'frequency': 'c', 'synset': 'crucifix.n.01', 'synonyms': ['crucifix'], 'id': 335, 'def': 'representation of the cross on which Jesus died', 'name': 'crucifix'}, {'frequency': 'c', 'synset': 'cruise_ship.n.01', 'synonyms': ['cruise_ship', 'cruise_liner'], 'id': 336, 'def': 'a passenger ship used commercially for pleasure cruises', 'name': 'cruise_ship'}, {'frequency': 'c', 'synset': 'cruiser.n.01', 'synonyms': ['police_cruiser', 'patrol_car', 'police_car', 'squad_car'], 'id': 337, 'def': 'a car in which policemen cruise the streets', 'name': 'police_cruiser'}, {'frequency': 'f', 'synset': 'crumb.n.03', 'synonyms': ['crumb'], 'id': 338, 'def': 'small piece of e.g. bread or cake', 'name': 'crumb'}, {'frequency': 'c', 'synset': 'crutch.n.01', 'synonyms': ['crutch'], 'id': 339, 'def': 'a wooden or metal staff that fits under the armpit and reaches to the ground', 'name': 'crutch'}, {'frequency': 'c', 'synset': 'cub.n.03', 'synonyms': ['cub_(animal)'], 'id': 340, 'def': 'the young of certain carnivorous mammals such as the bear or wolf or lion', 'name': 'cub_(animal)'}, {'frequency': 'c', 'synset': 'cube.n.05', 'synonyms': ['cube', 'square_block'], 'id': 341, 'def': 'a block in the (approximate) shape of a cube', 'name': 'cube'}, {'frequency': 'f', 'synset': 'cucumber.n.02', 'synonyms': ['cucumber', 'cuke'], 'id': 342, 'def': 'cylindrical green fruit with thin green rind and white flesh eaten as a vegetable', 'name': 'cucumber'}, {'frequency': 'c', 'synset': 'cufflink.n.01', 'synonyms': ['cufflink'], 'id': 343, 'def': 'jewelry consisting of linked buttons used to fasten the cuffs of a shirt', 'name': 'cufflink'}, {'frequency': 'f', 'synset': 'cup.n.01', 'synonyms': ['cup'], 'id': 344, 'def': 'a small open container usually used for drinking; usually has a handle', 'name': 'cup'}, {'frequency': 'c', 'synset': 'cup.n.08', 'synonyms': ['trophy_cup'], 'id': 345, 'def': 'a metal award or cup-shaped vessel with handles that is awarded as a trophy to a competition winner', 'name': 'trophy_cup'}, {'frequency': 'f', 'synset': 'cupboard.n.01', 'synonyms': ['cupboard', 'closet'], 'id': 346, 'def': 'a small room (or recess) or cabinet used for storage space', 'name': 'cupboard'}, {'frequency': 'f', 'synset': 'cupcake.n.01', 'synonyms': ['cupcake'], 'id': 347, 'def': 'small cake baked in a muffin tin', 'name': 'cupcake'}, {'frequency': 'r', 'synset': 'curler.n.01', 'synonyms': ['hair_curler', 'hair_roller', 'hair_crimper'], 'id': 348, 'def': 'a cylindrical tube around which the hair is wound to curl it', 'name': 'hair_curler'}, {'frequency': 'r', 'synset': 'curling_iron.n.01', 'synonyms': ['curling_iron'], 'id': 349, 'def': 'a cylindrical home appliance that heats hair that has been curled around it', 'name': 'curling_iron'}, {'frequency': 'f', 'synset': 'curtain.n.01', 'synonyms': ['curtain', 'drapery'], 'id': 350, 'def': 'hanging cloth used as a blind (especially for a window)', 'name': 'curtain'}, 
{'frequency': 'f', 'synset': 'cushion.n.03', 'synonyms': ['cushion'], 'id': 351, 'def': 'a soft bag filled with air or padding such as feathers or foam rubber', 'name': 'cushion'}, {'frequency': 'r', 'synset': 'cylinder.n.04', 'synonyms': ['cylinder'], 'id': 352, 'def': 'a cylindrical container', 'name': 'cylinder'}, {'frequency': 'r', 'synset': 'cymbal.n.01', 'synonyms': ['cymbal'], 'id': 353, 'def': 'a percussion instrument consisting of a concave brass disk', 'name': 'cymbal'}, {'frequency': 'r', 'synset': 'dagger.n.01', 'synonyms': ['dagger'], 'id': 354, 'def': 'a short knife with a pointed blade used for piercing or stabbing', 'name': 'dagger'}, {'frequency': 'r', 'synset': 'dalmatian.n.02', 'synonyms': ['dalmatian'], 'id': 355, 'def': 'a large breed having a smooth white coat with black or brown spots', 'name': 'dalmatian'}, {'frequency': 'c', 'synset': 'dartboard.n.01', 'synonyms': ['dartboard'], 'id': 356, 'def': 'a circular board of wood or cork used as the target in the game of darts', 'name': 'dartboard'}, {'frequency': 'r', 'synset': 'date.n.08', 'synonyms': ['date_(fruit)'], 'id': 357, 'def': 'sweet edible fruit of the date palm with a single long woody seed', 'name': 'date_(fruit)'}, {'frequency': 'f', 'synset': 'deck_chair.n.01', 'synonyms': ['deck_chair', 'beach_chair'], 'id': 358, 'def': 'a folding chair for use outdoors; a wooden frame supports a length of canvas', 'name': 'deck_chair'}, {'frequency': 'c', 'synset': 'deer.n.01', 'synonyms': ['deer', 'cervid'], 'id': 359, 'def': "distinguished from Bovidae by the male's having solid deciduous antlers", 'name': 'deer'}, {'frequency': 'c', 'synset': 'dental_floss.n.01', 'synonyms': ['dental_floss', 'floss'], 'id': 360, 'def': 'a soft thread for cleaning the spaces between the teeth', 'name': 'dental_floss'}, {'frequency': 'f', 'synset': 'desk.n.01', 'synonyms': ['desk'], 'id': 361, 'def': 'a piece of furniture with a writing surface and usually drawers or other compartments', 'name': 'desk'}, {'frequency': 'r', 'synset': 'detergent.n.01', 'synonyms': ['detergent'], 'id': 362, 'def': 'a surface-active chemical widely used in industry and laundering', 'name': 'detergent'}, {'frequency': 'c', 'synset': 'diaper.n.01', 'synonyms': ['diaper'], 'id': 363, 'def': 'garment consisting of a folded cloth drawn up between the legs and fastened at the waist', 'name': 'diaper'}, {'frequency': 'r', 'synset': 'diary.n.01', 'synonyms': ['diary', 'journal'], 'id': 364, 'def': 'yearly planner book', 'name': 'diary'}, {'frequency': 'r', 'synset': 'die.n.01', 'synonyms': ['die', 'dice'], 'id': 365, 'def': 'a small cube with 1 to 6 spots on the six faces; used in gambling', 'name': 'die'}, {'frequency': 'r', 'synset': 'dinghy.n.01', 'synonyms': ['dinghy', 'dory', 'rowboat'], 'id': 366, 'def': 'a small boat of shallow draft with seats and oars with which it is propelled', 'name': 'dinghy'}, {'frequency': 'f', 'synset': 'dining_table.n.01', 'synonyms': ['dining_table'], 'id': 367, 'def': 'a table at which meals are served', 'name': 'dining_table'}, {'frequency': 'r', 'synset': 'dinner_jacket.n.01', 'synonyms': ['tux', 'tuxedo'], 'id': 368, 'def': 'semiformal evening dress for men', 'name': 'tux'}, {'frequency': 'f', 'synset': 'dish.n.01', 'synonyms': ['dish'], 'id': 369, 'def': 'a piece of dishware normally used as a container for holding or serving food', 'name': 'dish'}, {'frequency': 'c', 'synset': 'dish.n.05', 'synonyms': ['dish_antenna'], 'id': 370, 'def': 'directional antenna consisting of a parabolic reflector', 'name': 'dish_antenna'}, 
{'frequency': 'c', 'synset': 'dishrag.n.01', 'synonyms': ['dishrag', 'dishcloth'], 'id': 371, 'def': 'a cloth for washing dishes or cleaning in general', 'name': 'dishrag'}, {'frequency': 'f', 'synset': 'dishtowel.n.01', 'synonyms': ['dishtowel', 'tea_towel'], 'id': 372, 'def': 'a towel for drying dishes', 'name': 'dishtowel'}, {'frequency': 'f', 'synset': 'dishwasher.n.01', 'synonyms': ['dishwasher', 'dishwashing_machine'], 'id': 373, 'def': 'a machine for washing dishes', 'name': 'dishwasher'}, {'frequency': 'r', 'synset': 'dishwasher_detergent.n.01', 'synonyms': ['dishwasher_detergent', 'dishwashing_detergent', 'dishwashing_liquid', 'dishsoap'], 'id': 374, 'def': 'dishsoap or dish detergent designed for use in dishwashers', 'name': 'dishwasher_detergent'}, {'frequency': 'f', 'synset': 'dispenser.n.01', 'synonyms': ['dispenser'], 'id': 375, 'def': 'a container so designed that the contents can be used in prescribed amounts', 'name': 'dispenser'}, {'frequency': 'r', 'synset': 'diving_board.n.01', 'synonyms': ['diving_board'], 'id': 376, 'def': 'a springboard from which swimmers can dive', 'name': 'diving_board'}, {'frequency': 'f', 'synset': 'dixie_cup.n.01', 'synonyms': ['Dixie_cup', 'paper_cup'], 'id': 377, 'def': 'a disposable cup made of paper; for holding drinks', 'name': 'Dixie_cup'}, {'frequency': 'f', 'synset': 'dog.n.01', 'synonyms': ['dog'], 'id': 378, 'def': 'a common domesticated dog', 'name': 'dog'}, {'frequency': 'f', 'synset': 'dog_collar.n.01', 'synonyms': ['dog_collar'], 'id': 379, 'def': 'a collar for a dog', 'name': 'dog_collar'}, {'frequency': 'f', 'synset': 'doll.n.01', 'synonyms': ['doll'], 'id': 380, 'def': 'a toy replica of a HUMAN (NOT AN ANIMAL)', 'name': 'doll'}, {'frequency': 'r', 'synset': 'dollar.n.02', 'synonyms': ['dollar', 'dollar_bill', 'one_dollar_bill'], 'id': 381, 'def': 'a piece of paper money worth one dollar', 'name': 'dollar'}, {'frequency': 'r', 'synset': 'dollhouse.n.01', 'synonyms': ['dollhouse', "doll's_house"], 'id': 382, 'def': "a house so small that it is likened to a child's plaything", 'name': 'dollhouse'}, {'frequency': 'c', 'synset': 'dolphin.n.02', 'synonyms': ['dolphin'], 'id': 383, 'def': 'any of various small toothed whales with a beaklike snout; larger than porpoises', 'name': 'dolphin'}, {'frequency': 'c', 'synset': 'domestic_ass.n.01', 'synonyms': ['domestic_ass', 'donkey'], 'id': 384, 'def': 'domestic beast of burden descended from the African wild ass; patient but stubborn', 'name': 'domestic_ass'}, {'frequency': 'f', 'synset': 'doorknob.n.01', 'synonyms': ['doorknob', 'doorhandle'], 'id': 385, 'def': "a knob used to open a door (often called `doorhandle' in Great Britain)", 'name': 'doorknob'}, {'frequency': 'c', 'synset': 'doormat.n.02', 'synonyms': ['doormat', 'welcome_mat'], 'id': 386, 'def': 'a mat placed outside an exterior door for wiping the shoes before entering', 'name': 'doormat'}, {'frequency': 'f', 'synset': 'doughnut.n.02', 'synonyms': ['doughnut', 'donut'], 'id': 387, 'def': 'a small ring-shaped friedcake', 'name': 'doughnut'}, {'frequency': 'r', 'synset': 'dove.n.01', 'synonyms': ['dove'], 'id': 388, 'def': 'any of numerous small pigeons', 'name': 'dove'}, {'frequency': 'r', 'synset': 'dragonfly.n.01', 'synonyms': ['dragonfly'], 'id': 389, 'def': 'slender-bodied non-stinging insect having iridescent wings that are outspread at rest', 'name': 'dragonfly'}, {'frequency': 'f', 'synset': 'drawer.n.01', 'synonyms': ['drawer'], 'id': 390, 'def': 'a boxlike container in a piece of furniture; made so as to slide in and 
out', 'name': 'drawer'}, {'frequency': 'c', 'synset': 'drawers.n.01', 'synonyms': ['underdrawers', 'boxers', 'boxershorts'], 'id': 391, 'def': 'underpants worn by men', 'name': 'underdrawers'}, {'frequency': 'f', 'synset': 'dress.n.01', 'synonyms': ['dress', 'frock'], 'id': 392, 'def': 'a one-piece garment for a woman; has skirt and bodice', 'name': 'dress'}, {'frequency': 'c', 'synset': 'dress_hat.n.01', 'synonyms': ['dress_hat', 'high_hat', 'opera_hat', 'silk_hat', 'top_hat'], 'id': 393, 'def': "a man's hat with a tall crown; usually covered with silk or with beaver fur", 'name': 'dress_hat'}, {'frequency': 'f', 'synset': 'dress_suit.n.01', 'synonyms': ['dress_suit'], 'id': 394, 'def': 'formalwear consisting of full evening dress for men', 'name': 'dress_suit'}, {'frequency': 'f', 'synset': 'dresser.n.05', 'synonyms': ['dresser'], 'id': 395, 'def': 'a cabinet with shelves', 'name': 'dresser'}, {'frequency': 'c', 'synset': 'drill.n.01', 'synonyms': ['drill'], 'id': 396, 'def': 'a tool with a sharp rotating point for making holes in hard materials', 'name': 'drill'}, {'frequency': 'r', 'synset': 'drone.n.04', 'synonyms': ['drone'], 'id': 397, 'def': 'an aircraft without a pilot that is operated by remote control', 'name': 'drone'}, {'frequency': 'r', 'synset': 'dropper.n.01', 'synonyms': ['dropper', 'eye_dropper'], 'id': 398, 'def': 'pipet consisting of a small tube with a vacuum bulb at one end for drawing liquid in and releasing it a drop at a time', 'name': 'dropper'}, {'frequency': 'c', 'synset': 'drum.n.01', 'synonyms': ['drum_(musical_instrument)'], 'id': 399, 'def': 'a musical percussion instrument; usually consists of a hollow cylinder with a membrane stretched across each end', 'name': 'drum_(musical_instrument)'}, {'frequency': 'r', 'synset': 'drumstick.n.02', 'synonyms': ['drumstick'], 'id': 400, 'def': 'a stick used for playing a drum', 'name': 'drumstick'}, {'frequency': 'f', 'synset': 'duck.n.01', 'synonyms': ['duck'], 'id': 401, 'def': 'small web-footed broad-billed swimming bird', 'name': 'duck'}, {'frequency': 'c', 'synset': 'duckling.n.02', 'synonyms': ['duckling'], 'id': 402, 'def': 'young duck', 'name': 'duckling'}, {'frequency': 'c', 'synset': 'duct_tape.n.01', 'synonyms': ['duct_tape'], 'id': 403, 'def': 'a wide silvery adhesive tape', 'name': 'duct_tape'}, {'frequency': 'f', 'synset': 'duffel_bag.n.01', 'synonyms': ['duffel_bag', 'duffle_bag', 'duffel', 'duffle'], 'id': 404, 'def': 'a large cylindrical bag of heavy cloth (does not include suitcases)', 'name': 'duffel_bag'}, {'frequency': 'r', 'synset': 'dumbbell.n.01', 'synonyms': ['dumbbell'], 'id': 405, 'def': 'an exercising weight with two ball-like ends connected by a short handle', 'name': 'dumbbell'}, {'frequency': 'c', 'synset': 'dumpster.n.01', 'synonyms': ['dumpster'], 'id': 406, 'def': 'a container designed to receive and transport and dump waste', 'name': 'dumpster'}, {'frequency': 'r', 'synset': 'dustpan.n.02', 'synonyms': ['dustpan'], 'id': 407, 'def': 'a short-handled receptacle into which dust can be swept', 'name': 'dustpan'}, {'frequency': 'c', 'synset': 'eagle.n.01', 'synonyms': ['eagle'], 'id': 408, 'def': 'large birds of prey noted for their broad wings and strong soaring flight', 'name': 'eagle'}, {'frequency': 'f', 'synset': 'earphone.n.01', 'synonyms': ['earphone', 'earpiece', 'headphone'], 'id': 409, 'def': 'device for listening to audio that is held over or inserted into the ear', 'name': 'earphone'}, {'frequency': 'r', 'synset': 'earplug.n.01', 'synonyms': ['earplug'], 'id': 410, 'def': 'a 
soft plug that is inserted into the ear canal to block sound', 'name': 'earplug'}, {'frequency': 'f', 'synset': 'earring.n.01', 'synonyms': ['earring'], 'id': 411, 'def': 'jewelry to ornament the ear', 'name': 'earring'}, {'frequency': 'c', 'synset': 'easel.n.01', 'synonyms': ['easel'], 'id': 412, 'def': "an upright tripod for displaying something (usually an artist's canvas)", 'name': 'easel'}, {'frequency': 'r', 'synset': 'eclair.n.01', 'synonyms': ['eclair'], 'id': 413, 'def': 'oblong cream puff', 'name': 'eclair'}, {'frequency': 'r', 'synset': 'eel.n.01', 'synonyms': ['eel'], 'id': 414, 'def': 'an elongate fish with fatty flesh', 'name': 'eel'}, {'frequency': 'f', 'synset': 'egg.n.02', 'synonyms': ['egg', 'eggs'], 'id': 415, 'def': 'oval reproductive body of a fowl (especially a hen) used as food', 'name': 'egg'}, {'frequency': 'r', 'synset': 'egg_roll.n.01', 'synonyms': ['egg_roll', 'spring_roll'], 'id': 416, 'def': 'minced vegetables and meat wrapped in a pancake and fried', 'name': 'egg_roll'}, {'frequency': 'c', 'synset': 'egg_yolk.n.01', 'synonyms': ['egg_yolk', 'yolk_(egg)'], 'id': 417, 'def': 'the yellow spherical part of an egg', 'name': 'egg_yolk'}, {'frequency': 'c', 'synset': 'eggbeater.n.02', 'synonyms': ['eggbeater', 'eggwhisk'], 'id': 418, 'def': 'a mixer for beating eggs or whipping cream', 'name': 'eggbeater'}, {'frequency': 'c', 'synset': 'eggplant.n.01', 'synonyms': ['eggplant', 'aubergine'], 'id': 419, 'def': 'egg-shaped vegetable having a shiny skin typically dark purple', 'name': 'eggplant'}, {'frequency': 'r', 'synset': 'electric_chair.n.01', 'synonyms': ['electric_chair'], 'id': 420, 'def': 'a chair-shaped instrument of execution by electrocution', 'name': 'electric_chair'}, {'frequency': 'f', 'synset': 'electric_refrigerator.n.01', 'synonyms': ['refrigerator'], 'id': 421, 'def': 'a refrigerator in which the coolant is pumped around by an electric motor', 'name': 'refrigerator'}, {'frequency': 'f', 'synset': 'elephant.n.01', 'synonyms': ['elephant'], 'id': 422, 'def': 'a common elephant', 'name': 'elephant'}, {'frequency': 'c', 'synset': 'elk.n.01', 'synonyms': ['elk', 'moose'], 'id': 423, 'def': 'large northern deer with enormous flattened antlers in the male', 'name': 'elk'}, {'frequency': 'c', 'synset': 'envelope.n.01', 'synonyms': ['envelope'], 'id': 424, 'def': 'a flat (usually rectangular) container for a letter, thin package, etc.', 'name': 'envelope'}, {'frequency': 'c', 'synset': 'eraser.n.01', 'synonyms': ['eraser'], 'id': 425, 'def': 'an implement used to erase something', 'name': 'eraser'}, {'frequency': 'r', 'synset': 'escargot.n.01', 'synonyms': ['escargot'], 'id': 426, 'def': 'edible snail usually served in the shell with a sauce of melted butter and garlic', 'name': 'escargot'}, {'frequency': 'r', 'synset': 'eyepatch.n.01', 'synonyms': ['eyepatch'], 'id': 427, 'def': 'a protective cloth covering for an injured eye', 'name': 'eyepatch'}, {'frequency': 'r', 'synset': 'falcon.n.01', 'synonyms': ['falcon'], 'id': 428, 'def': 'birds of prey having long pointed powerful wings adapted for swift flight', 'name': 'falcon'}, {'frequency': 'f', 'synset': 'fan.n.01', 'synonyms': ['fan'], 'id': 429, 'def': 'a device for creating a current of air by movement of a surface or surfaces', 'name': 'fan'}, {'frequency': 'f', 'synset': 'faucet.n.01', 'synonyms': ['faucet', 'spigot', 'tap'], 'id': 430, 'def': 'a regulator for controlling the flow of a liquid from a reservoir', 'name': 'faucet'}, {'frequency': 'r', 'synset': 'fedora.n.01', 'synonyms': ['fedora'], 'id': 
431, 'def': 'a hat made of felt with a creased crown', 'name': 'fedora'}, {'frequency': 'r', 'synset': 'ferret.n.02', 'synonyms': ['ferret'], 'id': 432, 'def': 'domesticated albino variety of the European polecat bred for hunting rats and rabbits', 'name': 'ferret'}, {'frequency': 'c', 'synset': 'ferris_wheel.n.01', 'synonyms': ['Ferris_wheel'], 'id': 433, 'def': 'a large wheel with suspended seats that remain upright as the wheel rotates', 'name': 'Ferris_wheel'}, {'frequency': 'c', 'synset': 'ferry.n.01', 'synonyms': ['ferry', 'ferryboat'], 'id': 434, 'def': 'a boat that transports people or vehicles across a body of water and operates on a regular schedule', 'name': 'ferry'}, {'frequency': 'r', 'synset': 'fig.n.04', 'synonyms': ['fig_(fruit)'], 'id': 435, 'def': 'fleshy sweet pear-shaped yellowish or purple fruit eaten fresh or preserved or dried', 'name': 'fig_(fruit)'}, {'frequency': 'c', 'synset': 'fighter.n.02', 'synonyms': ['fighter_jet', 'fighter_aircraft', 'attack_aircraft'], 'id': 436, 'def': 'a high-speed military or naval airplane designed to destroy enemy targets', 'name': 'fighter_jet'}, {'frequency': 'f', 'synset': 'figurine.n.01', 'synonyms': ['figurine'], 'id': 437, 'def': 'a small carved or molded figure', 'name': 'figurine'}, {'frequency': 'c', 'synset': 'file.n.03', 'synonyms': ['file_cabinet', 'filing_cabinet'], 'id': 438, 'def': 'office furniture consisting of a container for keeping papers in order', 'name': 'file_cabinet'}, {'frequency': 'r', 'synset': 'file.n.04', 'synonyms': ['file_(tool)'], 'id': 439, 'def': 'a steel hand tool with small sharp teeth on some or all of its surfaces; used for smoothing wood or metal', 'name': 'file_(tool)'}, {'frequency': 'f', 'synset': 'fire_alarm.n.02', 'synonyms': ['fire_alarm', 'smoke_alarm'], 'id': 440, 'def': 'an alarm that is tripped off by fire or smoke', 'name': 'fire_alarm'}, {'frequency': 'f', 'synset': 'fire_engine.n.01', 'synonyms': ['fire_engine', 'fire_truck'], 'id': 441, 'def': 'large trucks that carry firefighters and equipment to the site of a fire', 'name': 'fire_engine'}, {'frequency': 'f', 'synset': 'fire_extinguisher.n.01', 'synonyms': ['fire_extinguisher', 'extinguisher'], 'id': 442, 'def': 'a manually operated device for extinguishing small fires', 'name': 'fire_extinguisher'}, {'frequency': 'c', 'synset': 'fire_hose.n.01', 'synonyms': ['fire_hose'], 'id': 443, 'def': 'a large hose that carries water from a fire hydrant to the site of the fire', 'name': 'fire_hose'}, {'frequency': 'f', 'synset': 'fireplace.n.01', 'synonyms': ['fireplace'], 'id': 444, 'def': 'an open recess in a wall at the base of a chimney where a fire can be built', 'name': 'fireplace'}, {'frequency': 'f', 'synset': 'fireplug.n.01', 'synonyms': ['fireplug', 'fire_hydrant', 'hydrant'], 'id': 445, 'def': 'an upright hydrant for drawing water to use in fighting a fire', 'name': 'fireplug'}, {'frequency': 'r', 'synset': 'first-aid_kit.n.01', 'synonyms': ['first-aid_kit'], 'id': 446, 'def': 'kit consisting of a set of bandages and medicines for giving first aid', 'name': 'first-aid_kit'}, {'frequency': 'f', 'synset': 'fish.n.01', 'synonyms': ['fish'], 'id': 447, 'def': 'any of various mostly cold-blooded aquatic vertebrates usually having scales and breathing through gills', 'name': 'fish'}, {'frequency': 'c', 'synset': 'fish.n.02', 'synonyms': ['fish_(food)'], 'id': 448, 'def': 'the flesh of fish used as food', 'name': 'fish_(food)'}, {'frequency': 'r', 'synset': 'fishbowl.n.02', 'synonyms': ['fishbowl', 'goldfish_bowl'], 'id': 449, 'def': 'a 
transparent bowl in which small fish are kept', 'name': 'fishbowl'}, {'frequency': 'c', 'synset': 'fishing_rod.n.01', 'synonyms': ['fishing_rod', 'fishing_pole'], 'id': 450, 'def': 'a rod that is used in fishing to extend the fishing line', 'name': 'fishing_rod'}, {'frequency': 'f', 'synset': 'flag.n.01', 'synonyms': ['flag'], 'id': 451, 'def': 'emblem usually consisting of a rectangular piece of cloth of distinctive design (do not include pole)', 'name': 'flag'}, {'frequency': 'f', 'synset': 'flagpole.n.02', 'synonyms': ['flagpole', 'flagstaff'], 'id': 452, 'def': 'a tall staff or pole on which a flag is raised', 'name': 'flagpole'}, {'frequency': 'c', 'synset': 'flamingo.n.01', 'synonyms': ['flamingo'], 'id': 453, 'def': 'large pink web-footed bird with down-bent bill', 'name': 'flamingo'}, {'frequency': 'c', 'synset': 'flannel.n.01', 'synonyms': ['flannel'], 'id': 454, 'def': 'a soft light woolen fabric; used for clothing', 'name': 'flannel'}, {'frequency': 'c', 'synset': 'flap.n.01', 'synonyms': ['flap'], 'id': 455, 'def': 'any broad thin covering attached at one edge, such as a mud flap next to a wheel or a flap on an airplane wing', 'name': 'flap'}, {'frequency': 'r', 'synset': 'flash.n.10', 'synonyms': ['flash', 'flashbulb'], 'id': 456, 'def': 'a lamp for providing momentary light to take a photograph', 'name': 'flash'}, {'frequency': 'c', 'synset': 'flashlight.n.01', 'synonyms': ['flashlight', 'torch'], 'id': 457, 'def': 'a small portable battery-powered electric lamp', 'name': 'flashlight'}, {'frequency': 'r', 'synset': 'fleece.n.03', 'synonyms': ['fleece'], 'id': 458, 'def': 'a soft bulky fabric with deep pile; used chiefly for clothing', 'name': 'fleece'}, {'frequency': 'f', 'synset': 'flip-flop.n.02', 'synonyms': ['flip-flop_(sandal)'], 'id': 459, 'def': 'a backless sandal held to the foot by a thong between two toes', 'name': 'flip-flop_(sandal)'}, {'frequency': 'c', 'synset': 'flipper.n.01', 'synonyms': ['flipper_(footwear)', 'fin_(footwear)'], 'id': 460, 'def': 'a shoe to aid a person in swimming', 'name': 'flipper_(footwear)'}, {'frequency': 'f', 'synset': 'flower_arrangement.n.01', 'synonyms': ['flower_arrangement', 'floral_arrangement'], 'id': 461, 'def': 'a decorative arrangement of flowers', 'name': 'flower_arrangement'}, {'frequency': 'c', 'synset': 'flute.n.02', 'synonyms': ['flute_glass', 'champagne_flute'], 'id': 462, 'def': 'a tall narrow wineglass', 'name': 'flute_glass'}, {'frequency': 'c', 'synset': 'foal.n.01', 'synonyms': ['foal'], 'id': 463, 'def': 'a young horse', 'name': 'foal'}, {'frequency': 'c', 'synset': 'folding_chair.n.01', 'synonyms': ['folding_chair'], 'id': 464, 'def': 'a chair that can be folded flat for storage', 'name': 'folding_chair'}, {'frequency': 'c', 'synset': 'food_processor.n.01', 'synonyms': ['food_processor'], 'id': 465, 'def': 'a kitchen appliance for shredding, blending, chopping, or slicing food', 'name': 'food_processor'}, {'frequency': 'c', 'synset': 'football.n.02', 'synonyms': ['football_(American)'], 'id': 466, 'def': 'the inflated oblong ball used in playing American football', 'name': 'football_(American)'}, {'frequency': 'r', 'synset': 'football_helmet.n.01', 'synonyms': ['football_helmet'], 'id': 467, 'def': 'a padded helmet with a face mask to protect the head of football players', 'name': 'football_helmet'}, {'frequency': 'c', 'synset': 'footstool.n.01', 'synonyms': ['footstool', 'footrest'], 'id': 468, 'def': 'a low seat or a stool to rest the feet of a seated person', 'name': 'footstool'}, {'frequency': 'f', 'synset': 
'fork.n.01', 'synonyms': ['fork'], 'id': 469, 'def': 'cutlery used for serving and eating food', 'name': 'fork'}, {'frequency': 'c', 'synset': 'forklift.n.01', 'synonyms': ['forklift'], 'id': 470, 'def': 'an industrial vehicle with a power operated fork in front that can be inserted under loads to lift and move them', 'name': 'forklift'}, {'frequency': 'c', 'synset': 'freight_car.n.01', 'synonyms': ['freight_car'], 'id': 471, 'def': 'a railway car that carries freight', 'name': 'freight_car'}, {'frequency': 'c', 'synset': 'french_toast.n.01', 'synonyms': ['French_toast'], 'id': 472, 'def': 'bread slice dipped in egg and milk and fried', 'name': 'French_toast'}, {'frequency': 'c', 'synset': 'freshener.n.01', 'synonyms': ['freshener', 'air_freshener'], 'id': 473, 'def': 'anything that freshens air by removing or covering odor', 'name': 'freshener'}, {'frequency': 'f', 'synset': 'frisbee.n.01', 'synonyms': ['frisbee'], 'id': 474, 'def': 'a light, plastic disk propelled with a flip of the wrist for recreation or competition', 'name': 'frisbee'}, {'frequency': 'c', 'synset': 'frog.n.01', 'synonyms': ['frog', 'toad', 'toad_frog'], 'id': 475, 'def': 'a tailless stout-bodied amphibian with long hind limbs for leaping', 'name': 'frog'}, {'frequency': 'c', 'synset': 'fruit_juice.n.01', 'synonyms': ['fruit_juice'], 'id': 476, 'def': 'drink produced by squeezing or crushing fruit', 'name': 'fruit_juice'}, {'frequency': 'f', 'synset': 'frying_pan.n.01', 'synonyms': ['frying_pan', 'frypan', 'skillet'], 'id': 477, 'def': 'a pan used for frying foods', 'name': 'frying_pan'}, {'frequency': 'r', 'synset': 'fudge.n.01', 'synonyms': ['fudge'], 'id': 478, 'def': 'soft creamy candy', 'name': 'fudge'}, {'frequency': 'r', 'synset': 'funnel.n.02', 'synonyms': ['funnel'], 'id': 479, 'def': 'a cone-shaped utensil used to channel a substance into a container with a small mouth', 'name': 'funnel'}, {'frequency': 'r', 'synset': 'futon.n.01', 'synonyms': ['futon'], 'id': 480, 'def': 'a pad that is used for sleeping on the floor or on a raised frame', 'name': 'futon'}, {'frequency': 'r', 'synset': 'gag.n.02', 'synonyms': ['gag', 'muzzle'], 'id': 481, 'def': "restraint put into a person's mouth to prevent speaking or shouting", 'name': 'gag'}, {'frequency': 'r', 'synset': 'garbage.n.03', 'synonyms': ['garbage'], 'id': 482, 'def': 'a receptacle where waste can be discarded', 'name': 'garbage'}, {'frequency': 'c', 'synset': 'garbage_truck.n.01', 'synonyms': ['garbage_truck'], 'id': 483, 'def': 'a truck for collecting domestic refuse', 'name': 'garbage_truck'}, {'frequency': 'c', 'synset': 'garden_hose.n.01', 'synonyms': ['garden_hose'], 'id': 484, 'def': 'a hose used for watering a lawn or garden', 'name': 'garden_hose'}, {'frequency': 'c', 'synset': 'gargle.n.01', 'synonyms': ['gargle', 'mouthwash'], 'id': 485, 'def': 'a medicated solution used for gargling and rinsing the mouth', 'name': 'gargle'}, {'frequency': 'r', 'synset': 'gargoyle.n.02', 'synonyms': ['gargoyle'], 'id': 486, 'def': 'an ornament consisting of a grotesquely carved figure of a person or animal', 'name': 'gargoyle'}, {'frequency': 'c', 'synset': 'garlic.n.02', 'synonyms': ['garlic', 'ail'], 'id': 487, 'def': 'aromatic bulb used as seasoning', 'name': 'garlic'}, {'frequency': 'r', 'synset': 'gasmask.n.01', 'synonyms': ['gasmask', 'respirator', 'gas_helmet'], 'id': 488, 'def': 'a protective face mask with a filter', 'name': 'gasmask'}, {'frequency': 'c', 'synset': 'gazelle.n.01', 'synonyms': ['gazelle'], 'id': 489, 'def': 'small swift graceful antelope of 
Africa and Asia having lustrous eyes', 'name': 'gazelle'}, {'frequency': 'c', 'synset': 'gelatin.n.02', 'synonyms': ['gelatin', 'jelly'], 'id': 490, 'def': 'an edible jelly made with gelatin and used as a dessert or salad base or a coating for foods', 'name': 'gelatin'}, {'frequency': 'r', 'synset': 'gem.n.02', 'synonyms': ['gemstone'], 'id': 491, 'def': 'a crystalline rock that can be cut and polished for jewelry', 'name': 'gemstone'}, {'frequency': 'r', 'synset': 'generator.n.02', 'synonyms': ['generator'], 'id': 492, 'def': 'engine that converts mechanical energy into electrical energy by electromagnetic induction', 'name': 'generator'}, {'frequency': 'c', 'synset': 'giant_panda.n.01', 'synonyms': ['giant_panda', 'panda', 'panda_bear'], 'id': 493, 'def': 'large black-and-white herbivorous mammal of bamboo forests of China and Tibet', 'name': 'giant_panda'}, {'frequency': 'c', 'synset': 'gift_wrap.n.01', 'synonyms': ['gift_wrap'], 'id': 494, 'def': 'attractive wrapping paper suitable for wrapping gifts', 'name': 'gift_wrap'}, {'frequency': 'c', 'synset': 'ginger.n.03', 'synonyms': ['ginger', 'gingerroot'], 'id': 495, 'def': 'the root of the common ginger plant; used fresh as a seasoning', 'name': 'ginger'}, {'frequency': 'f', 'synset': 'giraffe.n.01', 'synonyms': ['giraffe'], 'id': 496, 'def': 'tall animal having a spotted coat and small horns and very long neck and legs', 'name': 'giraffe'}, {'frequency': 'c', 'synset': 'girdle.n.02', 'synonyms': ['cincture', 'sash', 'waistband', 'waistcloth'], 'id': 497, 'def': 'a band of material around the waist that strengthens a skirt or trousers', 'name': 'cincture'}, {'frequency': 'f', 'synset': 'glass.n.02', 'synonyms': ['glass_(drink_container)', 'drinking_glass'], 'id': 498, 'def': 'a container for holding liquids while drinking', 'name': 'glass_(drink_container)'}, {'frequency': 'c', 'synset': 'globe.n.03', 'synonyms': ['globe'], 'id': 499, 'def': 'a sphere on which a map (especially of the earth) is represented', 'name': 'globe'}, {'frequency': 'f', 'synset': 'glove.n.02', 'synonyms': ['glove'], 'id': 500, 'def': 'handwear covering the hand', 'name': 'glove'}, {'frequency': 'c', 'synset': 'goat.n.01', 'synonyms': ['goat'], 'id': 501, 'def': 'a common goat', 'name': 'goat'}, {'frequency': 'f', 'synset': 'goggles.n.01', 'synonyms': ['goggles'], 'id': 502, 'def': 'tight-fitting spectacles worn to protect the eyes', 'name': 'goggles'}, {'frequency': 'r', 'synset': 'goldfish.n.01', 'synonyms': ['goldfish'], 'id': 503, 'def': 'small golden or orange-red freshwater fishes used as pond or aquarium pets', 'name': 'goldfish'}, {'frequency': 'c', 'synset': 'golf_club.n.02', 'synonyms': ['golf_club', 'golf-club'], 'id': 504, 'def': 'golf equipment used by a golfer to hit a golf ball', 'name': 'golf_club'}, {'frequency': 'c', 'synset': 'golfcart.n.01', 'synonyms': ['golfcart'], 'id': 505, 'def': 'a small motor vehicle in which golfers can ride between shots', 'name': 'golfcart'}, {'frequency': 'r', 'synset': 'gondola.n.02', 'synonyms': ['gondola_(boat)'], 'id': 506, 'def': 'long narrow flat-bottomed boat propelled by sculling; traditionally used on canals of Venice', 'name': 'gondola_(boat)'}, {'frequency': 'c', 'synset': 'goose.n.01', 'synonyms': ['goose'], 'id': 507, 'def': 'loud, web-footed long-necked aquatic birds usually larger than ducks', 'name': 'goose'}, {'frequency': 'r', 'synset': 'gorilla.n.01', 'synonyms': ['gorilla'], 'id': 508, 'def': 'largest ape', 'name': 'gorilla'}, {'frequency': 'r', 'synset': 'gourd.n.02', 'synonyms': ['gourd'], 
'id': 509, 'def': 'any of numerous inedible fruits with hard rinds', 'name': 'gourd'}, {'frequency': 'f', 'synset': 'grape.n.01', 'synonyms': ['grape'], 'id': 510, 'def': 'any of various juicy fruit with green or purple skins; grow in clusters', 'name': 'grape'}, {'frequency': 'c', 'synset': 'grater.n.01', 'synonyms': ['grater'], 'id': 511, 'def': 'utensil with sharp perforations for shredding foods (as vegetables or cheese)', 'name': 'grater'}, {'frequency': 'c', 'synset': 'gravestone.n.01', 'synonyms': ['gravestone', 'headstone', 'tombstone'], 'id': 512, 'def': 'a stone that is used to mark a grave', 'name': 'gravestone'}, {'frequency': 'r', 'synset': 'gravy_boat.n.01', 'synonyms': ['gravy_boat', 'gravy_holder'], 'id': 513, 'def': 'a dish (often boat-shaped) for serving gravy or sauce', 'name': 'gravy_boat'}, {'frequency': 'f', 'synset': 'green_bean.n.02', 'synonyms': ['green_bean'], 'id': 514, 'def': 'a common bean plant cultivated for its slender green edible pods', 'name': 'green_bean'}, {'frequency': 'f', 'synset': 'green_onion.n.01', 'synonyms': ['green_onion', 'spring_onion', 'scallion'], 'id': 515, 'def': 'a young onion before the bulb has enlarged', 'name': 'green_onion'}, {'frequency': 'r', 'synset': 'griddle.n.01', 'synonyms': ['griddle'], 'id': 516, 'def': 'cooking utensil consisting of a flat heated surface on which food is cooked', 'name': 'griddle'}, {'frequency': 'f', 'synset': 'grill.n.02', 'synonyms': ['grill', 'grille', 'grillwork', 'radiator_grille'], 'id': 517, 'def': 'a framework of metal bars used as a partition or a grate', 'name': 'grill'}, {'frequency': 'r', 'synset': 'grits.n.01', 'synonyms': ['grits', 'hominy_grits'], 'id': 518, 'def': 'coarsely ground corn boiled as a breakfast dish', 'name': 'grits'}, {'frequency': 'c', 'synset': 'grizzly.n.01', 'synonyms': ['grizzly', 'grizzly_bear'], 'id': 519, 'def': 'powerful brownish-yellow bear of the uplands of western North America', 'name': 'grizzly'}, {'frequency': 'c', 'synset': 'grocery_bag.n.01', 'synonyms': ['grocery_bag'], 'id': 520, 'def': "a sack for holding a customer's groceries", 'name': 'grocery_bag'}, {'frequency': 'f', 'synset': 'guitar.n.01', 'synonyms': ['guitar'], 'id': 521, 'def': 'a stringed instrument usually having six strings; played by strumming or plucking', 'name': 'guitar'}, {'frequency': 'c', 'synset': 'gull.n.02', 'synonyms': ['gull', 'seagull'], 'id': 522, 'def': 'mostly white aquatic bird having long pointed wings and short legs', 'name': 'gull'}, {'frequency': 'c', 'synset': 'gun.n.01', 'synonyms': ['gun'], 'id': 523, 'def': 'a weapon that discharges a bullet at high velocity from a metal tube', 'name': 'gun'}, {'frequency': 'f', 'synset': 'hairbrush.n.01', 'synonyms': ['hairbrush'], 'id': 524, 'def': "a brush used to groom a person's hair", 'name': 'hairbrush'}, {'frequency': 'c', 'synset': 'hairnet.n.01', 'synonyms': ['hairnet'], 'id': 525, 'def': 'a small net that someone wears over their hair to keep it in place', 'name': 'hairnet'}, {'frequency': 'c', 'synset': 'hairpin.n.01', 'synonyms': ['hairpin'], 'id': 526, 'def': "a double pronged pin used to hold women's hair in place", 'name': 'hairpin'}, {'frequency': 'r', 'synset': 'halter.n.03', 'synonyms': ['halter_top'], 'id': 527, 'def': "a woman's top that fastens behind the back and neck leaving the back and arms uncovered", 'name': 'halter_top'}, {'frequency': 'f', 'synset': 'ham.n.01', 'synonyms': ['ham', 'jambon', 'gammon'], 'id': 528, 'def': 'meat cut from the thigh of a hog (usually smoked)', 'name': 'ham'}, {'frequency': 'c', 
'synset': 'hamburger.n.01', 'synonyms': ['hamburger', 'beefburger', 'burger'], 'id': 529, 'def': 'a sandwich consisting of a patty of minced beef served on a bun', 'name': 'hamburger'}, {'frequency': 'c', 'synset': 'hammer.n.02', 'synonyms': ['hammer'], 'id': 530, 'def': 'a hand tool with a heavy head and a handle; used to deliver an impulsive force by striking', 'name': 'hammer'}, {'frequency': 'c', 'synset': 'hammock.n.02', 'synonyms': ['hammock'], 'id': 531, 'def': 'a hanging bed of canvas or rope netting (usually suspended between two trees)', 'name': 'hammock'}, {'frequency': 'r', 'synset': 'hamper.n.02', 'synonyms': ['hamper'], 'id': 532, 'def': 'a basket usually with a cover', 'name': 'hamper'}, {'frequency': 'c', 'synset': 'hamster.n.01', 'synonyms': ['hamster'], 'id': 533, 'def': 'short-tailed burrowing rodent with large cheek pouches', 'name': 'hamster'}, {'frequency': 'f', 'synset': 'hand_blower.n.01', 'synonyms': ['hair_dryer'], 'id': 534, 'def': 'a hand-held electric blower that can blow warm air onto the hair', 'name': 'hair_dryer'}, {'frequency': 'r', 'synset': 'hand_glass.n.01', 'synonyms': ['hand_glass', 'hand_mirror'], 'id': 535, 'def': 'a mirror intended to be held in the hand', 'name': 'hand_glass'}, {'frequency': 'f', 'synset': 'hand_towel.n.01', 'synonyms': ['hand_towel', 'face_towel'], 'id': 536, 'def': 'a small towel used to dry the hands or face', 'name': 'hand_towel'}, {'frequency': 'c', 'synset': 'handcart.n.01', 'synonyms': ['handcart', 'pushcart', 'hand_truck'], 'id': 537, 'def': 'wheeled vehicle that can be pushed by a person', 'name': 'handcart'}, {'frequency': 'r', 'synset': 'handcuff.n.01', 'synonyms': ['handcuff'], 'id': 538, 'def': 'shackle that consists of a metal loop that can be locked around the wrist', 'name': 'handcuff'}, {'frequency': 'c', 'synset': 'handkerchief.n.01', 'synonyms': ['handkerchief'], 'id': 539, 'def': 'a square piece of cloth used for wiping the eyes or nose or as a costume accessory', 'name': 'handkerchief'}, {'frequency': 'f', 'synset': 'handle.n.01', 'synonyms': ['handle', 'grip', 'handgrip'], 'id': 540, 'def': 'the appendage to an object that is designed to be held in order to use or move it', 'name': 'handle'}, {'frequency': 'r', 'synset': 'handsaw.n.01', 'synonyms': ['handsaw', "carpenter's_saw"], 'id': 541, 'def': 'a saw used with one hand for cutting wood', 'name': 'handsaw'}, {'frequency': 'r', 'synset': 'hardback.n.01', 'synonyms': ['hardback_book', 'hardcover_book'], 'id': 542, 'def': 'a book with cardboard or cloth or leather covers', 'name': 'hardback_book'}, {'frequency': 'r', 'synset': 'harmonium.n.01', 'synonyms': ['harmonium', 'organ_(musical_instrument)', 'reed_organ_(musical_instrument)'], 'id': 543, 'def': 'a free-reed instrument in which air is forced through the reeds by bellows', 'name': 'harmonium'}, {'frequency': 'f', 'synset': 'hat.n.01', 'synonyms': ['hat'], 'id': 544, 'def': 'headwear that protects the head from bad weather or sun, or is worn for fashion', 'name': 'hat'}, {'frequency': 'r', 'synset': 'hatbox.n.01', 'synonyms': ['hatbox'], 'id': 545, 'def': 'a round piece of luggage for carrying hats', 'name': 'hatbox'}, {'frequency': 'c', 'synset': 'head_covering.n.01', 'synonyms': ['veil'], 'id': 546, 'def': 'a garment that covers the head OR face', 'name': 'veil'}, {'frequency': 'f', 'synset': 'headband.n.01', 'synonyms': ['headband'], 'id': 547, 'def': 'a band worn around or over the head', 'name': 'headband'}, {'frequency': 'f', 'synset': 'headboard.n.01', 'synonyms': ['headboard'], 'id': 548, 'def': 'a 
vertical board or panel forming the head of a bedstead', 'name': 'headboard'}, {'frequency': 'f', 'synset': 'headlight.n.01', 'synonyms': ['headlight', 'headlamp'], 'id': 549, 'def': 'a powerful light with reflector; attached to the front of an automobile or locomotive', 'name': 'headlight'}, {'frequency': 'c', 'synset': 'headscarf.n.01', 'synonyms': ['headscarf'], 'id': 550, 'def': 'a kerchief worn over the head and tied under the chin', 'name': 'headscarf'}, {'frequency': 'r', 'synset': 'headset.n.01', 'synonyms': ['headset'], 'id': 551, 'def': 'receiver consisting of a pair of headphones', 'name': 'headset'}, {'frequency': 'c', 'synset': 'headstall.n.01', 'synonyms': ['headstall_(for_horses)', 'headpiece_(for_horses)'], 'id': 552, 'def': "the band that is the part of a bridle that fits around a horse's head", 'name': 'headstall_(for_horses)'}, {'frequency': 'c', 'synset': 'heart.n.02', 'synonyms': ['heart'], 'id': 553, 'def': 'a muscular organ; its contractions move the blood through the body', 'name': 'heart'}, {'frequency': 'c', 'synset': 'heater.n.01', 'synonyms': ['heater', 'warmer'], 'id': 554, 'def': 'device that heats water or supplies warmth to a room', 'name': 'heater'}, {'frequency': 'c', 'synset': 'helicopter.n.01', 'synonyms': ['helicopter'], 'id': 555, 'def': 'an aircraft without wings that obtains its lift from the rotation of overhead blades', 'name': 'helicopter'}, {'frequency': 'f', 'synset': 'helmet.n.02', 'synonyms': ['helmet'], 'id': 556, 'def': 'protective headgear made of hard material to resist blows', 'name': 'helmet'}, {'frequency': 'r', 'synset': 'heron.n.02', 'synonyms': ['heron'], 'id': 557, 'def': 'grey or white wading bird with long neck and long legs and (usually) long bill', 'name': 'heron'}, {'frequency': 'c', 'synset': 'highchair.n.01', 'synonyms': ['highchair', 'feeding_chair'], 'id': 558, 'def': 'a chair for feeding a very young child', 'name': 'highchair'}, {'frequency': 'f', 'synset': 'hinge.n.01', 'synonyms': ['hinge'], 'id': 559, 'def': 'a joint that holds two parts together so that one can swing relative to the other', 'name': 'hinge'}, {'frequency': 'r', 'synset': 'hippopotamus.n.01', 'synonyms': ['hippopotamus'], 'id': 560, 'def': 'massive thick-skinned animal living in or around rivers of tropical Africa', 'name': 'hippopotamus'}, {'frequency': 'r', 'synset': 'hockey_stick.n.01', 'synonyms': ['hockey_stick'], 'id': 561, 'def': 'sports implement consisting of a stick used by hockey players to move the puck', 'name': 'hockey_stick'}, {'frequency': 'c', 'synset': 'hog.n.03', 'synonyms': ['hog', 'pig'], 'id': 562, 'def': 'domestic swine', 'name': 'hog'}, {'frequency': 'f', 'synset': 'home_plate.n.01', 'synonyms': ['home_plate_(baseball)', 'home_base_(baseball)'], 'id': 563, 'def': '(baseball) a rubber slab where the batter stands; it must be touched by a base runner in order to score', 'name': 'home_plate_(baseball)'}, {'frequency': 'c', 'synset': 'honey.n.01', 'synonyms': ['honey'], 'id': 564, 'def': 'a sweet yellow liquid produced by bees', 'name': 'honey'}, {'frequency': 'f', 'synset': 'hood.n.06', 'synonyms': ['fume_hood', 'exhaust_hood'], 'id': 565, 'def': 'metal covering leading to a vent that exhausts smoke or fumes', 'name': 'fume_hood'}, {'frequency': 'f', 'synset': 'hook.n.05', 'synonyms': ['hook'], 'id': 566, 'def': 'a curved or bent implement for suspending or pulling something', 'name': 'hook'}, {'frequency': 'r', 'synset': 'hookah.n.01', 'synonyms': ['hookah', 'narghile', 'nargileh', 'sheesha', 'shisha', 'water_pipe'], 'id': 567, 
'def': 'a tobacco pipe with a long flexible tube connected to a container where the smoke is cooled by passing through water', 'name': 'hookah'}, {'frequency': 'r', 'synset': 'hornet.n.01', 'synonyms': ['hornet'], 'id': 568, 'def': 'large stinging wasp', 'name': 'hornet'}, {'frequency': 'f', 'synset': 'horse.n.01', 'synonyms': ['horse'], 'id': 569, 'def': 'a common horse', 'name': 'horse'}, {'frequency': 'f', 'synset': 'hose.n.03', 'synonyms': ['hose', 'hosepipe'], 'id': 570, 'def': 'a flexible pipe for conveying a liquid or gas', 'name': 'hose'}, {'frequency': 'r', 'synset': 'hot-air_balloon.n.01', 'synonyms': ['hot-air_balloon'], 'id': 571, 'def': 'balloon for travel through the air in a basket suspended below a large bag of heated air', 'name': 'hot-air_balloon'}, {'frequency': 'r', 'synset': 'hot_plate.n.01', 'synonyms': ['hotplate'], 'id': 572, 'def': 'a portable electric appliance for heating or cooking or keeping food warm', 'name': 'hotplate'}, {'frequency': 'c', 'synset': 'hot_sauce.n.01', 'synonyms': ['hot_sauce'], 'id': 573, 'def': 'a pungent peppery sauce', 'name': 'hot_sauce'}, {'frequency': 'r', 'synset': 'hourglass.n.01', 'synonyms': ['hourglass'], 'id': 574, 'def': 'a sandglass timer that runs for sixty minutes', 'name': 'hourglass'}, {'frequency': 'r', 'synset': 'houseboat.n.01', 'synonyms': ['houseboat'], 'id': 575, 'def': 'a barge that is designed and equipped for use as a dwelling', 'name': 'houseboat'}, {'frequency': 'c', 'synset': 'hummingbird.n.01', 'synonyms': ['hummingbird'], 'id': 576, 'def': 'tiny American bird having brilliant iridescent plumage and long slender bills', 'name': 'hummingbird'}, {'frequency': 'r', 'synset': 'hummus.n.01', 'synonyms': ['hummus', 'humus', 'hommos', 'hoummos', 'humous'], 'id': 577, 'def': 'a thick spread made from mashed chickpeas', 'name': 'hummus'}, {'frequency': 'f', 'synset': 'ice_bear.n.01', 'synonyms': ['polar_bear'], 'id': 578, 'def': 'white bear of Arctic regions', 'name': 'polar_bear'}, {'frequency': 'c', 'synset': 'ice_cream.n.01', 'synonyms': ['icecream'], 'id': 579, 'def': 'frozen dessert containing cream and sugar and flavoring', 'name': 'icecream'}, {'frequency': 'r', 'synset': 'ice_lolly.n.01', 'synonyms': ['popsicle'], 'id': 580, 'def': 'ice cream or water ice on a small wooden stick', 'name': 'popsicle'}, {'frequency': 'c', 'synset': 'ice_maker.n.01', 'synonyms': ['ice_maker'], 'id': 581, 'def': 'an appliance included in some electric refrigerators for making ice cubes', 'name': 'ice_maker'}, {'frequency': 'r', 'synset': 'ice_pack.n.01', 'synonyms': ['ice_pack', 'ice_bag'], 'id': 582, 'def': 'a waterproof bag filled with ice: applied to the body (especially the head) to cool or reduce swelling', 'name': 'ice_pack'}, {'frequency': 'r', 'synset': 'ice_skate.n.01', 'synonyms': ['ice_skate'], 'id': 583, 'def': 'skate consisting of a boot with a steel blade fitted to the sole', 'name': 'ice_skate'}, {'frequency': 'c', 'synset': 'igniter.n.01', 'synonyms': ['igniter', 'ignitor', 'lighter'], 'id': 584, 'def': 'a substance or device used to start a fire', 'name': 'igniter'}, {'frequency': 'r', 'synset': 'inhaler.n.01', 'synonyms': ['inhaler', 'inhalator'], 'id': 585, 'def': 'a dispenser that produces a chemical vapor to be inhaled through mouth or nose', 'name': 'inhaler'}, {'frequency': 'f', 'synset': 'ipod.n.01', 'synonyms': ['iPod'], 'id': 586, 'def': 'a pocket-sized device used to play music files', 'name': 'iPod'}, {'frequency': 'c', 'synset': 'iron.n.04', 'synonyms': ['iron_(for_clothing)', 
'smoothing_iron_(for_clothing)'], 'id': 587, 'def': 'home appliance consisting of a flat metal base that is heated and used to smooth cloth', 'name': 'iron_(for_clothing)'}, {'frequency': 'c', 'synset': 'ironing_board.n.01', 'synonyms': ['ironing_board'], 'id': 588, 'def': 'narrow padded board on collapsible supports; used for ironing clothes', 'name': 'ironing_board'}, {'frequency': 'f', 'synset': 'jacket.n.01', 'synonyms': ['jacket'], 'id': 589, 'def': 'a waist-length coat', 'name': 'jacket'}, {'frequency': 'c', 'synset': 'jam.n.01', 'synonyms': ['jam'], 'id': 590, 'def': 'preserve of crushed fruit', 'name': 'jam'}, {'frequency': 'f', 'synset': 'jar.n.01', 'synonyms': ['jar'], 'id': 591, 'def': 'a vessel (usually cylindrical) with a wide mouth and without handles', 'name': 'jar'}, {'frequency': 'f', 'synset': 'jean.n.01', 'synonyms': ['jean', 'blue_jean', 'denim'], 'id': 592, 'def': '(usually plural) close-fitting trousers of heavy denim for manual work or casual wear', 'name': 'jean'}, {'frequency': 'c', 'synset': 'jeep.n.01', 'synonyms': ['jeep', 'landrover'], 'id': 593, 'def': 'a car suitable for traveling over rough terrain', 'name': 'jeep'}, {'frequency': 'r', 'synset': 'jelly_bean.n.01', 'synonyms': ['jelly_bean', 'jelly_egg'], 'id': 594, 'def': 'sugar-glazed jellied candy', 'name': 'jelly_bean'}, {'frequency': 'f', 'synset': 'jersey.n.03', 'synonyms': ['jersey', 'T-shirt', 'tee_shirt'], 'id': 595, 'def': 'a close-fitting pullover shirt', 'name': 'jersey'}, {'frequency': 'c', 'synset': 'jet.n.01', 'synonyms': ['jet_plane', 'jet-propelled_plane'], 'id': 596, 'def': 'an airplane powered by one or more jet engines', 'name': 'jet_plane'}, {'frequency': 'r', 'synset': 'jewel.n.01', 'synonyms': ['jewel', 'gem', 'precious_stone'], 'id': 597, 'def': 'a precious or semiprecious stone incorporated into a piece of jewelry', 'name': 'jewel'}, {'frequency': 'c', 'synset': 'jewelry.n.01', 'synonyms': ['jewelry', 'jewellery'], 'id': 598, 'def': 'an adornment (as a bracelet or ring or necklace) made of precious metals and set with gems (or imitation gems)', 'name': 'jewelry'}, {'frequency': 'r', 'synset': 'joystick.n.02', 'synonyms': ['joystick'], 'id': 599, 'def': 'a control device for computers consisting of a vertical handle that can move freely in two directions', 'name': 'joystick'}, {'frequency': 'c', 'synset': 'jump_suit.n.01', 'synonyms': ['jumpsuit'], 'id': 600, 'def': "one-piece garment fashioned after a parachutist's uniform", 'name': 'jumpsuit'}, {'frequency': 'c', 'synset': 'kayak.n.01', 'synonyms': ['kayak'], 'id': 601, 'def': 'a small canoe consisting of a light frame made watertight with animal skins', 'name': 'kayak'}, {'frequency': 'r', 'synset': 'keg.n.02', 'synonyms': ['keg'], 'id': 602, 'def': 'small cask or barrel', 'name': 'keg'}, {'frequency': 'r', 'synset': 'kennel.n.01', 'synonyms': ['kennel', 'doghouse'], 'id': 603, 'def': 'outbuilding that serves as a shelter for a dog', 'name': 'kennel'}, {'frequency': 'c', 'synset': 'kettle.n.01', 'synonyms': ['kettle', 'boiler'], 'id': 604, 'def': 'a metal pot for stewing or boiling; usually has a lid', 'name': 'kettle'}, {'frequency': 'f', 'synset': 'key.n.01', 'synonyms': ['key'], 'id': 605, 'def': 'metal instrument used to unlock a lock', 'name': 'key'}, {'frequency': 'r', 'synset': 'keycard.n.01', 'synonyms': ['keycard'], 'id': 606, 'def': 'a plastic card used to gain access typically to a door', 'name': 'keycard'}, {'frequency': 'c', 'synset': 'kilt.n.01', 'synonyms': ['kilt'], 'id': 607, 'def': 'a knee-length pleated tartan 
skirt worn by men as part of the traditional dress in the Highlands of northern Scotland', 'name': 'kilt'}, {'frequency': 'c', 'synset': 'kimono.n.01', 'synonyms': ['kimono'], 'id': 608, 'def': 'a loose robe; imitated from robes originally worn by Japanese', 'name': 'kimono'}, {'frequency': 'f', 'synset': 'kitchen_sink.n.01', 'synonyms': ['kitchen_sink'], 'id': 609, 'def': 'a sink in a kitchen', 'name': 'kitchen_sink'}, {'frequency': 'r', 'synset': 'kitchen_table.n.01', 'synonyms': ['kitchen_table'], 'id': 610, 'def': 'a table in the kitchen', 'name': 'kitchen_table'}, {'frequency': 'f', 'synset': 'kite.n.03', 'synonyms': ['kite'], 'id': 611, 'def': 'plaything consisting of a light frame covered with tissue paper; flown in wind at end of a string', 'name': 'kite'}, {'frequency': 'c', 'synset': 'kitten.n.01', 'synonyms': ['kitten', 'kitty'], 'id': 612, 'def': 'young domestic cat', 'name': 'kitten'}, {'frequency': 'c', 'synset': 'kiwi.n.03', 'synonyms': ['kiwi_fruit'], 'id': 613, 'def': 'fuzzy brown egg-shaped fruit with slightly tart green flesh', 'name': 'kiwi_fruit'}, {'frequency': 'f', 'synset': 'knee_pad.n.01', 'synonyms': ['knee_pad'], 'id': 614, 'def': 'protective garment consisting of a pad worn by football or baseball or hockey players', 'name': 'knee_pad'}, {'frequency': 'f', 'synset': 'knife.n.01', 'synonyms': ['knife'], 'id': 615, 'def': 'tool with a blade and point used as a cutting instrument', 'name': 'knife'}, {'frequency': 'r', 'synset': 'knitting_needle.n.01', 'synonyms': ['knitting_needle'], 'id': 616, 'def': 'needle consisting of a slender rod with pointed ends; usually used in pairs', 'name': 'knitting_needle'}, {'frequency': 'f', 'synset': 'knob.n.02', 'synonyms': ['knob'], 'id': 617, 'def': 'a round handle often found on a door', 'name': 'knob'}, {'frequency': 'r', 'synset': 'knocker.n.05', 'synonyms': ['knocker_(on_a_door)', 'doorknocker'], 'id': 618, 'def': 'a device (usually metal and ornamental) attached by a hinge to a door', 'name': 'knocker_(on_a_door)'}, {'frequency': 'r', 'synset': 'koala.n.01', 'synonyms': ['koala', 'koala_bear'], 'id': 619, 'def': 'sluggish tailless Australian marsupial with grey furry ears and coat', 'name': 'koala'}, {'frequency': 'r', 'synset': 'lab_coat.n.01', 'synonyms': ['lab_coat', 'laboratory_coat'], 'id': 620, 'def': 'a light coat worn to protect clothing from substances used while working in a laboratory', 'name': 'lab_coat'}, {'frequency': 'f', 'synset': 'ladder.n.01', 'synonyms': ['ladder'], 'id': 621, 'def': 'steps consisting of two parallel members connected by rungs', 'name': 'ladder'}, {'frequency': 'c', 'synset': 'ladle.n.01', 'synonyms': ['ladle'], 'id': 622, 'def': 'a spoon-shaped vessel with a long handle frequently used to transfer liquids', 'name': 'ladle'}, {'frequency': 'c', 'synset': 'ladybug.n.01', 'synonyms': ['ladybug', 'ladybeetle', 'ladybird_beetle'], 'id': 623, 'def': 'small round bright-colored and spotted beetle, typically red and black', 'name': 'ladybug'}, {'frequency': 'f', 'synset': 'lamb.n.01', 'synonyms': ['lamb_(animal)'], 'id': 624, 'def': 'young sheep', 'name': 'lamb_(animal)'}, {'frequency': 'r', 'synset': 'lamb_chop.n.01', 'synonyms': ['lamb-chop', 'lambchop'], 'id': 625, 'def': 'chop cut from a lamb', 'name': 'lamb-chop'}, {'frequency': 'f', 'synset': 'lamp.n.02', 'synonyms': ['lamp'], 'id': 626, 'def': 'a piece of furniture holding one or more electric light bulbs', 'name': 'lamp'}, {'frequency': 'f', 'synset': 'lamppost.n.01', 'synonyms': ['lamppost'], 'id': 627, 'def': 'a metal post supporting 
an outdoor lamp (such as a streetlight)', 'name': 'lamppost'}, {'frequency': 'f', 'synset': 'lampshade.n.01', 'synonyms': ['lampshade'], 'id': 628, 'def': 'a protective ornamental shade used to screen a light bulb from direct view', 'name': 'lampshade'}, {'frequency': 'c', 'synset': 'lantern.n.01', 'synonyms': ['lantern'], 'id': 629, 'def': 'light in a transparent protective case', 'name': 'lantern'}, {'frequency': 'f', 'synset': 'lanyard.n.02', 'synonyms': ['lanyard', 'laniard'], 'id': 630, 'def': 'a cord worn around the neck to hold a knife or whistle, etc.', 'name': 'lanyard'}, {'frequency': 'f', 'synset': 'laptop.n.01', 'synonyms': ['laptop_computer', 'notebook_computer'], 'id': 631, 'def': 'a portable computer small enough to use in your lap', 'name': 'laptop_computer'}, {'frequency': 'r', 'synset': 'lasagna.n.01', 'synonyms': ['lasagna', 'lasagne'], 'id': 632, 'def': 'baked dish of layers of lasagna pasta with sauce and cheese and meat or vegetables', 'name': 'lasagna'}, {'frequency': 'f', 'synset': 'latch.n.02', 'synonyms': ['latch'], 'id': 633, 'def': 'a bar that can be lowered or slid into a groove to fasten a door or gate', 'name': 'latch'}, {'frequency': 'r', 'synset': 'lawn_mower.n.01', 'synonyms': ['lawn_mower'], 'id': 634, 'def': 'garden tool for mowing grass on lawns', 'name': 'lawn_mower'}, {'frequency': 'r', 'synset': 'leather.n.01', 'synonyms': ['leather'], 'id': 635, 'def': 'an animal skin made smooth and flexible by removing the hair and then tanning', 'name': 'leather'}, {'frequency': 'c', 'synset': 'legging.n.01', 'synonyms': ['legging_(clothing)', 'leging_(clothing)', 'leg_covering'], 'id': 636, 'def': 'a garment covering the leg (usually extending from the knee to the ankle)', 'name': 'legging_(clothing)'}, {'frequency': 'c', 'synset': 'lego.n.01', 'synonyms': ['Lego', 'Lego_set'], 'id': 637, 'def': "a child's plastic construction set for making models from blocks", 'name': 'Lego'}, {'frequency': 'r', 'synset': 'legume.n.02', 'synonyms': ['legume'], 'id': 638, 'def': 'the fruit or seed of bean or pea plants', 'name': 'legume'}, {'frequency': 'f', 'synset': 'lemon.n.01', 'synonyms': ['lemon'], 'id': 639, 'def': 'yellow oval fruit with juicy acidic flesh', 'name': 'lemon'}, {'frequency': 'r', 'synset': 'lemonade.n.01', 'synonyms': ['lemonade'], 'id': 640, 'def': 'sweetened beverage of diluted lemon juice', 'name': 'lemonade'}, {'frequency': 'f', 'synset': 'lettuce.n.02', 'synonyms': ['lettuce'], 'id': 641, 'def': 'leafy plant commonly eaten in salad or on sandwiches', 'name': 'lettuce'}, {'frequency': 'f', 'synset': 'license_plate.n.01', 'synonyms': ['license_plate', 'numberplate'], 'id': 642, 'def': "a plate mounted on the front and back of a car and bearing the car's registration number", 'name': 'license_plate'}, {'frequency': 'f', 'synset': 'life_buoy.n.01', 'synonyms': ['life_buoy', 'lifesaver', 'life_belt', 'life_ring'], 'id': 643, 'def': 'a ring-shaped life preserver used to prevent drowning (NOT a life-jacket or vest)', 'name': 'life_buoy'}, {'frequency': 'f', 'synset': 'life_jacket.n.01', 'synonyms': ['life_jacket', 'life_vest'], 'id': 644, 'def': 'life preserver consisting of a sleeveless jacket of buoyant or inflatable design', 'name': 'life_jacket'}, {'frequency': 'f', 'synset': 'light_bulb.n.01', 'synonyms': ['lightbulb'], 'id': 645, 'def': 'lightbulb/source of light', 'name': 'lightbulb'}, {'frequency': 'r', 'synset': 'lightning_rod.n.02', 'synonyms': ['lightning_rod', 'lightning_conductor'], 'id': 646, 'def': 'a metallic conductor that is attached to a 
high point and leads to the ground', 'name': 'lightning_rod'}, {'frequency': 'f', 'synset': 'lime.n.06', 'synonyms': ['lime'], 'id': 647, 'def': 'the green acidic fruit of any of various lime trees', 'name': 'lime'}, {'frequency': 'r', 'synset': 'limousine.n.01', 'synonyms': ['limousine'], 'id': 648, 'def': 'long luxurious car; usually driven by a chauffeur', 'name': 'limousine'}, {'frequency': 'c', 'synset': 'lion.n.01', 'synonyms': ['lion'], 'id': 649, 'def': 'large gregarious predatory cat of Africa and India', 'name': 'lion'}, {'frequency': 'c', 'synset': 'lip_balm.n.01', 'synonyms': ['lip_balm'], 'id': 650, 'def': 'a balm applied to the lips', 'name': 'lip_balm'}, {'frequency': 'r', 'synset': 'liquor.n.01', 'synonyms': ['liquor', 'spirits', 'hard_liquor', 'liqueur', 'cordial'], 'id': 651, 'def': 'liquor or beer', 'name': 'liquor'}, {'frequency': 'c', 'synset': 'lizard.n.01', 'synonyms': ['lizard'], 'id': 652, 'def': 'a reptile with usually two pairs of legs and a tapering tail', 'name': 'lizard'}, {'frequency': 'f', 'synset': 'log.n.01', 'synonyms': ['log'], 'id': 653, 'def': 'a segment of the trunk of a tree when stripped of branches', 'name': 'log'}, {'frequency': 'c', 'synset': 'lollipop.n.02', 'synonyms': ['lollipop'], 'id': 654, 'def': 'hard candy on a stick', 'name': 'lollipop'}, {'frequency': 'f', 'synset': 'loudspeaker.n.01', 'synonyms': ['speaker_(stero_equipment)'], 'id': 655, 'def': 'electronic device that produces sound often as part of a stereo system', 'name': 'speaker_(stero_equipment)'}, {'frequency': 'c', 'synset': 'love_seat.n.01', 'synonyms': ['loveseat'], 'id': 656, 'def': 'small sofa that seats two people', 'name': 'loveseat'}, {'frequency': 'r', 'synset': 'machine_gun.n.01', 'synonyms': ['machine_gun'], 'id': 657, 'def': 'a rapidly firing automatic gun', 'name': 'machine_gun'}, {'frequency': 'f', 'synset': 'magazine.n.02', 'synonyms': ['magazine'], 'id': 658, 'def': 'a paperback periodic publication', 'name': 'magazine'}, {'frequency': 'f', 'synset': 'magnet.n.01', 'synonyms': ['magnet'], 'id': 659, 'def': 'a device that attracts iron and produces a magnetic field', 'name': 'magnet'}, {'frequency': 'c', 'synset': 'mail_slot.n.01', 'synonyms': ['mail_slot'], 'id': 660, 'def': 'a slot (usually in a door) through which mail can be delivered', 'name': 'mail_slot'}, {'frequency': 'f', 'synset': 'mailbox.n.01', 'synonyms': ['mailbox_(at_home)', 'letter_box_(at_home)'], 'id': 661, 'def': 'a private box for delivery of mail', 'name': 'mailbox_(at_home)'}, {'frequency': 'r', 'synset': 'mallard.n.01', 'synonyms': ['mallard'], 'id': 662, 'def': 'wild dabbling duck from which domestic ducks are descended', 'name': 'mallard'}, {'frequency': 'r', 'synset': 'mallet.n.01', 'synonyms': ['mallet'], 'id': 663, 'def': 'a sports implement with a long handle and a hammer-like head used to hit a ball', 'name': 'mallet'}, {'frequency': 'r', 'synset': 'mammoth.n.01', 'synonyms': ['mammoth'], 'id': 664, 'def': 'any of numerous extinct elephants widely distributed in the Pleistocene', 'name': 'mammoth'}, {'frequency': 'r', 'synset': 'manatee.n.01', 'synonyms': ['manatee'], 'id': 665, 'def': 'sirenian mammal of tropical coastal waters of America', 'name': 'manatee'}, {'frequency': 'c', 'synset': 'mandarin.n.05', 'synonyms': ['mandarin_orange'], 'id': 666, 'def': 'a somewhat flat reddish-orange loose skinned citrus of China', 'name': 'mandarin_orange'}, {'frequency': 'c', 'synset': 'manger.n.01', 'synonyms': ['manger', 'trough'], 'id': 667, 'def': 'a container (usually in a barn or stable) 
from which cattle or horses feed', 'name': 'manger'}, {'frequency': 'f', 'synset': 'manhole.n.01', 'synonyms': ['manhole'], 'id': 668, 'def': 'a hole (usually with a flush cover) through which a person can gain access to an underground structure', 'name': 'manhole'}, {'frequency': 'f', 'synset': 'map.n.01', 'synonyms': ['map'], 'id': 669, 'def': "a diagrammatic representation of the earth's surface (or part of it)", 'name': 'map'}, {'frequency': 'f', 'synset': 'marker.n.03', 'synonyms': ['marker'], 'id': 670, 'def': 'a writing implement for making a mark', 'name': 'marker'}, {'frequency': 'r', 'synset': 'martini.n.01', 'synonyms': ['martini'], 'id': 671, 'def': 'a cocktail made of gin (or vodka) with dry vermouth', 'name': 'martini'}, {'frequency': 'r', 'synset': 'mascot.n.01', 'synonyms': ['mascot'], 'id': 672, 'def': 'a person or animal that is adopted by a team or other group as a symbolic figure', 'name': 'mascot'}, {'frequency': 'c', 'synset': 'mashed_potato.n.01', 'synonyms': ['mashed_potato'], 'id': 673, 'def': 'potato that has been peeled and boiled and then mashed', 'name': 'mashed_potato'}, {'frequency': 'r', 'synset': 'masher.n.02', 'synonyms': ['masher'], 'id': 674, 'def': 'a kitchen utensil used for mashing (e.g. potatoes)', 'name': 'masher'}, {'frequency': 'f', 'synset': 'mask.n.04', 'synonyms': ['mask', 'facemask'], 'id': 675, 'def': 'a protective covering worn over the face', 'name': 'mask'}, {'frequency': 'f', 'synset': 'mast.n.01', 'synonyms': ['mast'], 'id': 676, 'def': 'a vertical spar for supporting sails', 'name': 'mast'}, {'frequency': 'c', 'synset': 'mat.n.03', 'synonyms': ['mat_(gym_equipment)', 'gym_mat'], 'id': 677, 'def': 'sports equipment consisting of a piece of thick padding on the floor for gymnastics', 'name': 'mat_(gym_equipment)'}, {'frequency': 'r', 'synset': 'matchbox.n.01', 'synonyms': ['matchbox'], 'id': 678, 'def': 'a box for holding matches', 'name': 'matchbox'}, {'frequency': 'f', 'synset': 'mattress.n.01', 'synonyms': ['mattress'], 'id': 679, 'def': 'a thick pad filled with resilient material used as a bed or part of a bed', 'name': 'mattress'}, {'frequency': 'c', 'synset': 'measuring_cup.n.01', 'synonyms': ['measuring_cup'], 'id': 680, 'def': 'graduated cup used to measure liquid or granular ingredients', 'name': 'measuring_cup'}, {'frequency': 'c', 'synset': 'measuring_stick.n.01', 'synonyms': ['measuring_stick', 'ruler_(measuring_stick)', 'measuring_rod'], 'id': 681, 'def': 'measuring instrument having a sequence of marks at regular intervals', 'name': 'measuring_stick'}, {'frequency': 'c', 'synset': 'meatball.n.01', 'synonyms': ['meatball'], 'id': 682, 'def': 'ground meat formed into a ball and fried or simmered in broth', 'name': 'meatball'}, {'frequency': 'c', 'synset': 'medicine.n.02', 'synonyms': ['medicine'], 'id': 683, 'def': 'something that treats or prevents or alleviates the symptoms of disease', 'name': 'medicine'}, {'frequency': 'c', 'synset': 'melon.n.01', 'synonyms': ['melon'], 'id': 684, 'def': 'fruit of the gourd family having a hard rind and sweet juicy flesh', 'name': 'melon'}, {'frequency': 'f', 'synset': 'microphone.n.01', 'synonyms': ['microphone'], 'id': 685, 'def': 'device for converting sound waves into electrical energy', 'name': 'microphone'}, {'frequency': 'r', 'synset': 'microscope.n.01', 'synonyms': ['microscope'], 'id': 686, 'def': 'magnifier of the image of small objects', 'name': 'microscope'}, {'frequency': 'f', 'synset': 'microwave.n.02', 'synonyms': ['microwave_oven'], 'id': 687, 'def': 'kitchen appliance that 
cooks food by passing an electromagnetic wave through it', 'name': 'microwave_oven'}, {'frequency': 'r', 'synset': 'milestone.n.01', 'synonyms': ['milestone', 'milepost'], 'id': 688, 'def': 'stone post at side of a road to show distances', 'name': 'milestone'}, {'frequency': 'f', 'synset': 'milk.n.01', 'synonyms': ['milk'], 'id': 689, 'def': 'a white nutritious liquid secreted by mammals and used as food by human beings', 'name': 'milk'}, {'frequency': 'r', 'synset': 'milk_can.n.01', 'synonyms': ['milk_can'], 'id': 690, 'def': 'can for transporting milk', 'name': 'milk_can'}, {'frequency': 'r', 'synset': 'milkshake.n.01', 'synonyms': ['milkshake'], 'id': 691, 'def': 'frothy drink of milk and flavoring and sometimes fruit or ice cream', 'name': 'milkshake'}, {'frequency': 'f', 'synset': 'minivan.n.01', 'synonyms': ['minivan'], 'id': 692, 'def': 'a small box-shaped passenger van', 'name': 'minivan'}, {'frequency': 'r', 'synset': 'mint.n.05', 'synonyms': ['mint_candy'], 'id': 693, 'def': 'a candy that is flavored with a mint oil', 'name': 'mint_candy'}, {'frequency': 'f', 'synset': 'mirror.n.01', 'synonyms': ['mirror'], 'id': 694, 'def': 'polished surface that forms images by reflecting light', 'name': 'mirror'}, {'frequency': 'c', 'synset': 'mitten.n.01', 'synonyms': ['mitten'], 'id': 695, 'def': 'glove that encases the thumb separately and the other four fingers together', 'name': 'mitten'}, {'frequency': 'c', 'synset': 'mixer.n.04', 'synonyms': ['mixer_(kitchen_tool)', 'stand_mixer'], 'id': 696, 'def': 'a kitchen utensil that is used for mixing foods', 'name': 'mixer_(kitchen_tool)'}, {'frequency': 'c', 'synset': 'money.n.03', 'synonyms': ['money'], 'id': 697, 'def': 'the official currency issued by a government or national bank', 'name': 'money'}, {'frequency': 'f', 'synset': 'monitor.n.04', 'synonyms': ['monitor_(computer_equipment) computer_monitor'], 'id': 698, 'def': 'a computer monitor', 'name': 'monitor_(computer_equipment) computer_monitor'}, {'frequency': 'c', 'synset': 'monkey.n.01', 'synonyms': ['monkey'], 'id': 699, 'def': 'any of various long-tailed primates', 'name': 'monkey'}, {'frequency': 'f', 'synset': 'motor.n.01', 'synonyms': ['motor'], 'id': 700, 'def': 'machine that converts other forms of energy into mechanical energy and so imparts motion', 'name': 'motor'}, {'frequency': 'f', 'synset': 'motor_scooter.n.01', 'synonyms': ['motor_scooter', 'scooter'], 'id': 701, 'def': 'a wheeled vehicle with small wheels and a low-powered engine', 'name': 'motor_scooter'}, {'frequency': 'r', 'synset': 'motor_vehicle.n.01', 'synonyms': ['motor_vehicle', 'automotive_vehicle'], 'id': 702, 'def': 'a self-propelled wheeled vehicle that does not run on rails', 'name': 'motor_vehicle'}, {'frequency': 'f', 'synset': 'motorcycle.n.01', 'synonyms': ['motorcycle'], 'id': 703, 'def': 'a motor vehicle with two wheels and a strong frame', 'name': 'motorcycle'}, {'frequency': 'f', 'synset': 'mound.n.01', 'synonyms': ['mound_(baseball)', "pitcher's_mound"], 'id': 704, 'def': '(baseball) the slight elevation on which the pitcher stands', 'name': 'mound_(baseball)'}, {'frequency': 'f', 'synset': 'mouse.n.04', 'synonyms': ['mouse_(computer_equipment)', 'computer_mouse'], 'id': 705, 'def': 'a computer input device that controls an on-screen pointer (does not include trackpads / touchpads)', 'name': 'mouse_(computer_equipment)'}, {'frequency': 'f', 'synset': 'mousepad.n.01', 'synonyms': ['mousepad'], 'id': 706, 'def': 'a small portable pad that provides an operating surface for a computer mouse', 'name': 
'mousepad'}, {'frequency': 'c', 'synset': 'muffin.n.01', 'synonyms': ['muffin'], 'id': 707, 'def': 'a sweet quick bread baked in a cup-shaped pan', 'name': 'muffin'}, {'frequency': 'f', 'synset': 'mug.n.04', 'synonyms': ['mug'], 'id': 708, 'def': 'a drinking vessel with a handle, usually cylindrical', 'name': 'mug'}, {'frequency': 'f', 'synset': 'mushroom.n.02', 'synonyms': ['mushroom'], 'id': 709, 'def': 'a common mushroom', 'name': 'mushroom'}, {'frequency': 'r', 'synset': 'music_stool.n.01', 'synonyms': ['music_stool', 'piano_stool'], 'id': 710, 'def': 'a stool for piano players; usually adjustable in height', 'name': 'music_stool'}, {'frequency': 'c', 'synset': 'musical_instrument.n.01', 'synonyms': ['musical_instrument', 'instrument_(musical)'], 'id': 711, 'def': 'any of various devices or contrivances that can be used to produce musical tones or sounds', 'name': 'musical_instrument'}, {'frequency': 'r', 'synset': 'nailfile.n.01', 'synonyms': ['nailfile'], 'id': 712, 'def': 'a small flat file for shaping the nails', 'name': 'nailfile'}, {'frequency': 'f', 'synset': 'napkin.n.01', 'synonyms': ['napkin', 'table_napkin', 'serviette'], 'id': 713, 'def': 'a small piece of table linen or paper that is used to wipe the mouth and to cover the lap in order to protect clothing', 'name': 'napkin'}, {'frequency': 'r', 'synset': 'neckerchief.n.01', 'synonyms': ['neckerchief'], 'id': 714, 'def': 'a kerchief worn around the neck', 'name': 'neckerchief'}, {'frequency': 'f', 'synset': 'necklace.n.01', 'synonyms': ['necklace'], 'id': 715, 'def': 'jewelry consisting of a cord or chain (often bearing gems) worn about the neck as an ornament', 'name': 'necklace'}, {'frequency': 'f', 'synset': 'necktie.n.01', 'synonyms': ['necktie', 'tie_(necktie)'], 'id': 716, 'def': 'neckwear consisting of a long narrow piece of material worn under a collar and tied in a knot at the front', 'name': 'necktie'}, {'frequency': 'c', 'synset': 'needle.n.03', 'synonyms': ['needle'], 'id': 717, 'def': 'a sharp pointed implement (usually metal)', 'name': 'needle'}, {'frequency': 'c', 'synset': 'nest.n.01', 'synonyms': ['nest'], 'id': 718, 'def': 'a structure in which animals lay eggs or give birth to their young', 'name': 'nest'}, {'frequency': 'f', 'synset': 'newspaper.n.01', 'synonyms': ['newspaper', 'paper_(newspaper)'], 'id': 719, 'def': 'a daily or weekly publication on folded sheets containing news, articles, and advertisements', 'name': 'newspaper'}, {'frequency': 'c', 'synset': 'newsstand.n.01', 'synonyms': ['newsstand'], 'id': 720, 'def': 'a stall where newspapers and other periodicals are sold', 'name': 'newsstand'}, {'frequency': 'c', 'synset': 'nightwear.n.01', 'synonyms': ['nightshirt', 'nightwear', 'sleepwear', 'nightclothes'], 'id': 721, 'def': 'garments designed to be worn in bed', 'name': 'nightshirt'}, {'frequency': 'r', 'synset': 'nosebag.n.01', 'synonyms': ['nosebag_(for_animals)', 'feedbag'], 'id': 722, 'def': 'a canvas bag that is used to feed an animal (such as a horse); covers the muzzle and fastens at the top of the head', 'name': 'nosebag_(for_animals)'}, {'frequency': 'c', 'synset': 'noseband.n.01', 'synonyms': ['noseband_(for_animals)', 'nosepiece_(for_animals)'], 'id': 723, 'def': "a strap that is the part of a bridle that goes over the animal's nose", 'name': 'noseband_(for_animals)'}, {'frequency': 'f', 'synset': 'notebook.n.01', 'synonyms': ['notebook'], 'id': 724, 'def': 'a book with blank pages for recording notes or memoranda', 'name': 'notebook'}, {'frequency': 'c', 'synset': 'notepad.n.01', 'synonyms': 
['notepad'], 'id': 725, 'def': 'a pad of paper for keeping notes', 'name': 'notepad'}, {'frequency': 'f', 'synset': 'nut.n.03', 'synonyms': ['nut'], 'id': 726, 'def': 'a small metal block (usually square or hexagonal) with internal screw thread to be fitted onto a bolt', 'name': 'nut'}, {'frequency': 'r', 'synset': 'nutcracker.n.01', 'synonyms': ['nutcracker'], 'id': 727, 'def': 'a hand tool used to crack nuts open', 'name': 'nutcracker'}, {'frequency': 'f', 'synset': 'oar.n.01', 'synonyms': ['oar'], 'id': 728, 'def': 'an implement used to propel or steer a boat', 'name': 'oar'}, {'frequency': 'r', 'synset': 'octopus.n.01', 'synonyms': ['octopus_(food)'], 'id': 729, 'def': 'tentacles of octopus prepared as food', 'name': 'octopus_(food)'}, {'frequency': 'r', 'synset': 'octopus.n.02', 'synonyms': ['octopus_(animal)'], 'id': 730, 'def': 'bottom-living cephalopod having a soft oval body with eight long tentacles', 'name': 'octopus_(animal)'}, {'frequency': 'c', 'synset': 'oil_lamp.n.01', 'synonyms': ['oil_lamp', 'kerosene_lamp', 'kerosine_lamp'], 'id': 731, 'def': 'a lamp that burns oil (as kerosine) for light', 'name': 'oil_lamp'}, {'frequency': 'c', 'synset': 'olive_oil.n.01', 'synonyms': ['olive_oil'], 'id': 732, 'def': 'oil from olives', 'name': 'olive_oil'}, {'frequency': 'r', 'synset': 'omelet.n.01', 'synonyms': ['omelet', 'omelette'], 'id': 733, 'def': 'beaten eggs cooked until just set; may be folded around e.g. ham or cheese or jelly', 'name': 'omelet'}, {'frequency': 'f', 'synset': 'onion.n.01', 'synonyms': ['onion'], 'id': 734, 'def': 'the bulb of an onion plant', 'name': 'onion'}, {'frequency': 'f', 'synset': 'orange.n.01', 'synonyms': ['orange_(fruit)'], 'id': 735, 'def': 'orange (FRUIT of an orange tree)', 'name': 'orange_(fruit)'}, {'frequency': 'c', 'synset': 'orange_juice.n.01', 'synonyms': ['orange_juice'], 'id': 736, 'def': 'bottled or freshly squeezed juice of oranges', 'name': 'orange_juice'}, {'frequency': 'c', 'synset': 'ostrich.n.02', 'synonyms': ['ostrich'], 'id': 737, 'def': 'fast-running African flightless bird with two-toed feet; largest living bird', 'name': 'ostrich'}, {'frequency': 'f', 'synset': 'ottoman.n.03', 'synonyms': ['ottoman', 'pouf', 'pouffe', 'hassock'], 'id': 738, 'def': 'a thick standalone cushion used as a seat or footrest, often next to a chair', 'name': 'ottoman'}, {'frequency': 'f', 'synset': 'oven.n.01', 'synonyms': ['oven'], 'id': 739, 'def': 'kitchen appliance used for baking or roasting', 'name': 'oven'}, {'frequency': 'c', 'synset': 'overall.n.01', 'synonyms': ['overalls_(clothing)'], 'id': 740, 'def': 'work clothing consisting of denim trousers usually with a bib and shoulder straps', 'name': 'overalls_(clothing)'}, {'frequency': 'c', 'synset': 'owl.n.01', 'synonyms': ['owl'], 'id': 741, 'def': 'nocturnal bird of prey with hawk-like beak and claws and large head with front-facing eyes', 'name': 'owl'}, {'frequency': 'c', 'synset': 'packet.n.03', 'synonyms': ['packet'], 'id': 742, 'def': 'a small package or bundle', 'name': 'packet'}, {'frequency': 'r', 'synset': 'pad.n.03', 'synonyms': ['inkpad', 'inking_pad', 'stamp_pad'], 'id': 743, 'def': 'absorbent material saturated with ink used to transfer ink evenly to a rubber stamp', 'name': 'inkpad'}, {'frequency': 'c', 'synset': 'pad.n.04', 'synonyms': ['pad'], 'id': 744, 'def': 'mostly arm/knee pads labeled', 'name': 'pad'}, {'frequency': 'f', 'synset': 'paddle.n.04', 'synonyms': ['paddle', 'boat_paddle'], 'id': 745, 'def': 'a short light oar used without an oarlock to propel a canoe or small 
boat', 'name': 'paddle'}, {'frequency': 'c', 'synset': 'padlock.n.01', 'synonyms': ['padlock'], 'id': 746, 'def': 'a detachable, portable lock', 'name': 'padlock'}, {'frequency': 'c', 'synset': 'paintbrush.n.01', 'synonyms': ['paintbrush'], 'id': 747, 'def': 'a brush used as an applicator to apply paint', 'name': 'paintbrush'}, {'frequency': 'f', 'synset': 'painting.n.01', 'synonyms': ['painting'], 'id': 748, 'def': 'graphic art consisting of an artistic composition made by applying paints to a surface', 'name': 'painting'}, {'frequency': 'f', 'synset': 'pajama.n.02', 'synonyms': ['pajamas', 'pyjamas'], 'id': 749, 'def': 'loose-fitting nightclothes worn for sleeping or lounging', 'name': 'pajamas'}, {'frequency': 'c', 'synset': 'palette.n.02', 'synonyms': ['palette', 'pallet'], 'id': 750, 'def': 'board that provides a flat surface on which artists mix paints and the range of colors used', 'name': 'palette'}, {'frequency': 'f', 'synset': 'pan.n.01', 'synonyms': ['pan_(for_cooking)', 'cooking_pan'], 'id': 751, 'def': 'cooking utensil consisting of a wide metal vessel', 'name': 'pan_(for_cooking)'}, {'frequency': 'r', 'synset': 'pan.n.03', 'synonyms': ['pan_(metal_container)'], 'id': 752, 'def': 'shallow container made of metal', 'name': 'pan_(metal_container)'}, {'frequency': 'c', 'synset': 'pancake.n.01', 'synonyms': ['pancake'], 'id': 753, 'def': 'a flat cake of thin batter fried on both sides on a griddle', 'name': 'pancake'}, {'frequency': 'r', 'synset': 'pantyhose.n.01', 'synonyms': ['pantyhose'], 'id': 754, 'def': "a woman's tights consisting of underpants and stockings", 'name': 'pantyhose'}, {'frequency': 'r', 'synset': 'papaya.n.02', 'synonyms': ['papaya'], 'id': 755, 'def': 'large oval melon-like tropical fruit with yellowish flesh', 'name': 'papaya'}, {'frequency': 'f', 'synset': 'paper_plate.n.01', 'synonyms': ['paper_plate'], 'id': 756, 'def': 'a disposable plate made of cardboard', 'name': 'paper_plate'}, {'frequency': 'f', 'synset': 'paper_towel.n.01', 'synonyms': ['paper_towel'], 'id': 757, 'def': 'a disposable towel made of absorbent paper', 'name': 'paper_towel'}, {'frequency': 'r', 'synset': 'paperback_book.n.01', 'synonyms': ['paperback_book', 'paper-back_book', 'softback_book', 'soft-cover_book'], 'id': 758, 'def': 'a book with paper covers', 'name': 'paperback_book'}, {'frequency': 'r', 'synset': 'paperweight.n.01', 'synonyms': ['paperweight'], 'id': 759, 'def': 'a weight used to hold down a stack of papers', 'name': 'paperweight'}, {'frequency': 'c', 'synset': 'parachute.n.01', 'synonyms': ['parachute'], 'id': 760, 'def': 'rescue equipment consisting of a device that fills with air and retards your fall', 'name': 'parachute'}, {'frequency': 'c', 'synset': 'parakeet.n.01', 'synonyms': ['parakeet', 'parrakeet', 'parroket', 'paraquet', 'paroquet', 'parroquet'], 'id': 761, 'def': 'any of numerous small slender long-tailed parrots', 'name': 'parakeet'}, {'frequency': 'c', 'synset': 'parasail.n.01', 'synonyms': ['parasail_(sports)'], 'id': 762, 'def': 'parachute that will lift a person up into the air when it is towed by a motorboat or a car', 'name': 'parasail_(sports)'}, {'frequency': 'c', 'synset': 'parasol.n.01', 'synonyms': ['parasol', 'sunshade'], 'id': 763, 'def': 'a handheld collapsible source of shade', 'name': 'parasol'}, {'frequency': 'r', 'synset': 'parchment.n.01', 'synonyms': ['parchment'], 'id': 764, 'def': 'a superior paper resembling sheepskin', 'name': 'parchment'}, {'frequency': 'c', 'synset': 'parka.n.01', 'synonyms': ['parka', 'anorak'], 'id': 765, 
'def': "a kind of heavy jacket (`windcheater' is a British term)", 'name': 'parka'}, {'frequency': 'f', 'synset': 'parking_meter.n.01', 'synonyms': ['parking_meter'], 'id': 766, 'def': 'a coin-operated timer located next to a parking space', 'name': 'parking_meter'}, {'frequency': 'c', 'synset': 'parrot.n.01', 'synonyms': ['parrot'], 'id': 767, 'def': 'usually brightly colored tropical birds with short hooked beaks and the ability to mimic sounds', 'name': 'parrot'}, {'frequency': 'c', 'synset': 'passenger_car.n.01', 'synonyms': ['passenger_car_(part_of_a_train)', 'coach_(part_of_a_train)'], 'id': 768, 'def': 'a railcar where passengers ride', 'name': 'passenger_car_(part_of_a_train)'}, {'frequency': 'r', 'synset': 'passenger_ship.n.01', 'synonyms': ['passenger_ship'], 'id': 769, 'def': 'a ship built to carry passengers', 'name': 'passenger_ship'}, {'frequency': 'c', 'synset': 'passport.n.02', 'synonyms': ['passport'], 'id': 770, 'def': 'a document issued by a country to a citizen allowing that person to travel abroad and re-enter the home country', 'name': 'passport'}, {'frequency': 'f', 'synset': 'pastry.n.02', 'synonyms': ['pastry'], 'id': 771, 'def': 'any of various baked foods made of dough or batter', 'name': 'pastry'}, {'frequency': 'r', 'synset': 'patty.n.01', 'synonyms': ['patty_(food)'], 'id': 772, 'def': 'small flat mass of chopped food', 'name': 'patty_(food)'}, {'frequency': 'c', 'synset': 'pea.n.01', 'synonyms': ['pea_(food)'], 'id': 773, 'def': 'seed of a pea plant used for food', 'name': 'pea_(food)'}, {'frequency': 'c', 'synset': 'peach.n.03', 'synonyms': ['peach'], 'id': 774, 'def': 'downy juicy fruit with sweet yellowish or whitish flesh', 'name': 'peach'}, {'frequency': 'c', 'synset': 'peanut_butter.n.01', 'synonyms': ['peanut_butter'], 'id': 775, 'def': 'a spread made from ground peanuts', 'name': 'peanut_butter'}, {'frequency': 'f', 'synset': 'pear.n.01', 'synonyms': ['pear'], 'id': 776, 'def': 'sweet juicy gritty-textured fruit available in many varieties', 'name': 'pear'}, {'frequency': 'c', 'synset': 'peeler.n.03', 'synonyms': ['peeler_(tool_for_fruit_and_vegetables)'], 'id': 777, 'def': 'a device for peeling vegetables or fruits', 'name': 'peeler_(tool_for_fruit_and_vegetables)'}, {'frequency': 'r', 'synset': 'peg.n.04', 'synonyms': ['wooden_leg', 'pegleg'], 'id': 778, 'def': 'a prosthesis that replaces a missing leg', 'name': 'wooden_leg'}, {'frequency': 'r', 'synset': 'pegboard.n.01', 'synonyms': ['pegboard'], 'id': 779, 'def': 'a board perforated with regularly spaced holes into which pegs can be fitted', 'name': 'pegboard'}, {'frequency': 'c', 'synset': 'pelican.n.01', 'synonyms': ['pelican'], 'id': 780, 'def': 'large long-winged warm-water seabird having a large bill with a distensible pouch for fish', 'name': 'pelican'}, {'frequency': 'f', 'synset': 'pen.n.01', 'synonyms': ['pen'], 'id': 781, 'def': 'a writing implement with a point from which ink flows', 'name': 'pen'}, {'frequency': 'f', 'synset': 'pencil.n.01', 'synonyms': ['pencil'], 'id': 782, 'def': 'a thin cylindrical pointed writing implement made of wood and graphite', 'name': 'pencil'}, {'frequency': 'r', 'synset': 'pencil_box.n.01', 'synonyms': ['pencil_box', 'pencil_case'], 'id': 783, 'def': 'a box for holding pencils', 'name': 'pencil_box'}, {'frequency': 'r', 'synset': 'pencil_sharpener.n.01', 'synonyms': ['pencil_sharpener'], 'id': 784, 'def': 'a rotary implement for sharpening the point on pencils', 'name': 'pencil_sharpener'}, {'frequency': 'r', 'synset': 'pendulum.n.01', 'synonyms': 
['pendulum'], 'id': 785, 'def': 'an apparatus consisting of an object mounted so that it swings freely under the influence of gravity', 'name': 'pendulum'}, {'frequency': 'c', 'synset': 'penguin.n.01', 'synonyms': ['penguin'], 'id': 786, 'def': 'short-legged flightless birds of cold southern regions having webbed feet and wings modified as flippers', 'name': 'penguin'}, {'frequency': 'r', 'synset': 'pennant.n.02', 'synonyms': ['pennant'], 'id': 787, 'def': 'a flag longer than it is wide (and often tapering)', 'name': 'pennant'}, {'frequency': 'r', 'synset': 'penny.n.02', 'synonyms': ['penny_(coin)'], 'id': 788, 'def': 'a coin worth one-hundredth of the value of the basic unit', 'name': 'penny_(coin)'}, {'frequency': 'f', 'synset': 'pepper.n.03', 'synonyms': ['pepper', 'peppercorn'], 'id': 789, 'def': 'pungent seasoning from the berry of the common pepper plant; whole or ground', 'name': 'pepper'}, {'frequency': 'c', 'synset': 'pepper_mill.n.01', 'synonyms': ['pepper_mill', 'pepper_grinder'], 'id': 790, 'def': 'a mill for grinding pepper', 'name': 'pepper_mill'}, {'frequency': 'c', 'synset': 'perfume.n.02', 'synonyms': ['perfume'], 'id': 791, 'def': 'a toiletry that emits and diffuses a fragrant odor', 'name': 'perfume'}, {'frequency': 'r', 'synset': 'persimmon.n.02', 'synonyms': ['persimmon'], 'id': 792, 'def': 'orange fruit resembling a plum; edible when fully ripe', 'name': 'persimmon'}, {'frequency': 'f', 'synset': 'person.n.01', 'synonyms': ['person', 'baby', 'child', 'boy', 'girl', 'man', 'woman', 'human'], 'id': 793, 'def': 'a human being', 'name': 'person'}, {'frequency': 'c', 'synset': 'pet.n.01', 'synonyms': ['pet'], 'id': 794, 'def': 'a domesticated animal kept for companionship or amusement', 'name': 'pet'}, {'frequency': 'c', 'synset': 'pew.n.01', 'synonyms': ['pew_(church_bench)', 'church_bench'], 'id': 795, 'def': 'long bench with backs; used in church by the congregation', 'name': 'pew_(church_bench)'}, {'frequency': 'r', 'synset': 'phonebook.n.01', 'synonyms': ['phonebook', 'telephone_book', 'telephone_directory'], 'id': 796, 'def': 'a directory containing an alphabetical list of telephone subscribers and their telephone numbers', 'name': 'phonebook'}, {'frequency': 'c', 'synset': 'phonograph_record.n.01', 'synonyms': ['phonograph_record', 'phonograph_recording', 'record_(phonograph_recording)'], 'id': 797, 'def': 'sound recording consisting of a typically black disk with a continuous groove', 'name': 'phonograph_record'}, {'frequency': 'f', 'synset': 'piano.n.01', 'synonyms': ['piano'], 'id': 798, 'def': 'a keyboard instrument that is played by depressing keys that cause hammers to strike tuned strings and produce sounds', 'name': 'piano'}, {'frequency': 'f', 'synset': 'pickle.n.01', 'synonyms': ['pickle'], 'id': 799, 'def': 'vegetables (especially cucumbers) preserved in brine or vinegar', 'name': 'pickle'}, {'frequency': 'f', 'synset': 'pickup.n.01', 'synonyms': ['pickup_truck'], 'id': 800, 'def': 'a light truck with an open body and low sides and a tailboard', 'name': 'pickup_truck'}, {'frequency': 'c', 'synset': 'pie.n.01', 'synonyms': ['pie'], 'id': 801, 'def': 'dish baked in pastry-lined pan often with a pastry top', 'name': 'pie'}, {'frequency': 'c', 'synset': 'pigeon.n.01', 'synonyms': ['pigeon'], 'id': 802, 'def': 'wild and domesticated birds having a heavy body and short legs', 'name': 'pigeon'}, {'frequency': 'r', 'synset': 'piggy_bank.n.01', 'synonyms': ['piggy_bank', 'penny_bank'], 'id': 803, 'def': "a child's coin bank (often shaped like a pig)", 'name': 
'piggy_bank'}, {'frequency': 'f', 'synset': 'pillow.n.01', 'synonyms': ['pillow'], 'id': 804, 'def': 'a cushion to support the head of a sleeping person', 'name': 'pillow'}, {'frequency': 'r', 'synset': 'pin.n.09', 'synonyms': ['pin_(non_jewelry)'], 'id': 805, 'def': 'a small slender (often pointed) piece of wood or metal used to support or fasten or attach things', 'name': 'pin_(non_jewelry)'}, {'frequency': 'f', 'synset': 'pineapple.n.02', 'synonyms': ['pineapple'], 'id': 806, 'def': 'large sweet fleshy tropical fruit with a tuft of stiff leaves', 'name': 'pineapple'}, {'frequency': 'c', 'synset': 'pinecone.n.01', 'synonyms': ['pinecone'], 'id': 807, 'def': 'the seed-producing cone of a pine tree', 'name': 'pinecone'}, {'frequency': 'r', 'synset': 'ping-pong_ball.n.01', 'synonyms': ['ping-pong_ball'], 'id': 808, 'def': 'light hollow ball used in playing table tennis', 'name': 'ping-pong_ball'}, {'frequency': 'r', 'synset': 'pinwheel.n.03', 'synonyms': ['pinwheel'], 'id': 809, 'def': 'a toy consisting of vanes of colored paper or plastic that is pinned to a stick and spins when it is pointed into the wind', 'name': 'pinwheel'}, {'frequency': 'r', 'synset': 'pipe.n.01', 'synonyms': ['tobacco_pipe'], 'id': 810, 'def': 'a tube with a small bowl at one end; used for smoking tobacco', 'name': 'tobacco_pipe'}, {'frequency': 'f', 'synset': 'pipe.n.02', 'synonyms': ['pipe', 'piping'], 'id': 811, 'def': 'a long tube made of metal or plastic that is used to carry water or oil or gas etc.', 'name': 'pipe'}, {'frequency': 'r', 'synset': 'pistol.n.01', 'synonyms': ['pistol', 'handgun'], 'id': 812, 'def': 'a firearm that is held and fired with one hand', 'name': 'pistol'}, {'frequency': 'c', 'synset': 'pita.n.01', 'synonyms': ['pita_(bread)', 'pocket_bread'], 'id': 813, 'def': 'usually small round bread that can open into a pocket for filling', 'name': 'pita_(bread)'}, {'frequency': 'f', 'synset': 'pitcher.n.02', 'synonyms': ['pitcher_(vessel_for_liquid)', 'ewer'], 'id': 814, 'def': 'an open vessel with a handle and a spout for pouring', 'name': 'pitcher_(vessel_for_liquid)'}, {'frequency': 'r', 'synset': 'pitchfork.n.01', 'synonyms': ['pitchfork'], 'id': 815, 'def': 'a long-handled hand tool with sharp widely spaced prongs for lifting and pitching hay', 'name': 'pitchfork'}, {'frequency': 'f', 'synset': 'pizza.n.01', 'synonyms': ['pizza'], 'id': 816, 'def': 'Italian open pie made of thin bread dough spread with a spiced mixture of e.g. 
tomato sauce and cheese', 'name': 'pizza'}, {'frequency': 'f', 'synset': 'place_mat.n.01', 'synonyms': ['place_mat'], 'id': 817, 'def': 'a mat placed on a table for an individual place setting', 'name': 'place_mat'}, {'frequency': 'f', 'synset': 'plate.n.04', 'synonyms': ['plate'], 'id': 818, 'def': 'dish on which food is served or from which food is eaten', 'name': 'plate'}, {'frequency': 'c', 'synset': 'platter.n.01', 'synonyms': ['platter'], 'id': 819, 'def': 'a large shallow dish used for serving food', 'name': 'platter'}, {'frequency': 'r', 'synset': 'playpen.n.01', 'synonyms': ['playpen'], 'id': 820, 'def': 'a portable enclosure in which babies may be left to play', 'name': 'playpen'}, {'frequency': 'c', 'synset': 'pliers.n.01', 'synonyms': ['pliers', 'plyers'], 'id': 821, 'def': 'a gripping hand tool with two hinged arms and (usually) serrated jaws', 'name': 'pliers'}, {'frequency': 'r', 'synset': 'plow.n.01', 'synonyms': ['plow_(farm_equipment)', 'plough_(farm_equipment)'], 'id': 822, 'def': 'a farm tool having one or more heavy blades to break the soil and cut a furrow prior to sowing', 'name': 'plow_(farm_equipment)'}, {'frequency': 'r', 'synset': 'plume.n.02', 'synonyms': ['plume'], 'id': 823, 'def': 'a feather or cluster of feathers worn as an ornament', 'name': 'plume'}, {'frequency': 'r', 'synset': 'pocket_watch.n.01', 'synonyms': ['pocket_watch'], 'id': 824, 'def': 'a watch that is carried in a small watch pocket', 'name': 'pocket_watch'}, {'frequency': 'c', 'synset': 'pocketknife.n.01', 'synonyms': ['pocketknife'], 'id': 825, 'def': 'a knife with a blade that folds into the handle; suitable for carrying in the pocket', 'name': 'pocketknife'}, {'frequency': 'c', 'synset': 'poker.n.01', 'synonyms': ['poker_(fire_stirring_tool)', 'stove_poker', 'fire_hook'], 'id': 826, 'def': 'fire iron consisting of a metal rod with a handle; used to stir a fire', 'name': 'poker_(fire_stirring_tool)'}, {'frequency': 'f', 'synset': 'pole.n.01', 'synonyms': ['pole', 'post'], 'id': 827, 'def': 'a long (usually round) rod of wood or metal or plastic', 'name': 'pole'}, {'frequency': 'f', 'synset': 'polo_shirt.n.01', 'synonyms': ['polo_shirt', 'sport_shirt'], 'id': 828, 'def': 'a shirt with short sleeves designed for comfort and casual wear', 'name': 'polo_shirt'}, {'frequency': 'r', 'synset': 'poncho.n.01', 'synonyms': ['poncho'], 'id': 829, 'def': 'a blanket-like cloak with a hole in the center for the head', 'name': 'poncho'}, {'frequency': 'c', 'synset': 'pony.n.05', 'synonyms': ['pony'], 'id': 830, 'def': 'any of various breeds of small gentle horses usually less than five feet high at the shoulder', 'name': 'pony'}, {'frequency': 'r', 'synset': 'pool_table.n.01', 'synonyms': ['pool_table', 'billiard_table', 'snooker_table'], 'id': 831, 'def': 'game equipment consisting of a heavy table on which pool is played', 'name': 'pool_table'}, {'frequency': 'f', 'synset': 'pop.n.02', 'synonyms': ['pop_(soda)', 'soda_(pop)', 'tonic', 'soft_drink'], 'id': 832, 'def': 'a sweet drink containing carbonated water and flavoring', 'name': 'pop_(soda)'}, {'frequency': 'c', 'synset': 'postbox.n.01', 'synonyms': ['postbox_(public)', 'mailbox_(public)'], 'id': 833, 'def': 'public box for deposit of mail', 'name': 'postbox_(public)'}, {'frequency': 'c', 'synset': 'postcard.n.01', 'synonyms': ['postcard', 'postal_card', 'mailing-card'], 'id': 834, 'def': 'a card for sending messages by post without an envelope', 'name': 'postcard'}, {'frequency': 'f', 'synset': 'poster.n.01', 'synonyms': ['poster', 'placard'], 'id': 
835, 'def': 'a sign posted in a public place as an advertisement', 'name': 'poster'}, {'frequency': 'f', 'synset': 'pot.n.01', 'synonyms': ['pot'], 'id': 836, 'def': 'metal or earthenware cooking vessel that is usually round and deep; often has a handle and lid', 'name': 'pot'}, {'frequency': 'f', 'synset': 'pot.n.04', 'synonyms': ['flowerpot'], 'id': 837, 'def': 'a container in which plants are cultivated', 'name': 'flowerpot'}, {'frequency': 'f', 'synset': 'potato.n.01', 'synonyms': ['potato'], 'id': 838, 'def': 'an edible tuber native to South America', 'name': 'potato'}, {'frequency': 'c', 'synset': 'potholder.n.01', 'synonyms': ['potholder'], 'id': 839, 'def': 'an insulated pad for holding hot pots', 'name': 'potholder'}, {'frequency': 'c', 'synset': 'pottery.n.01', 'synonyms': ['pottery', 'clayware'], 'id': 840, 'def': 'ceramic ware made from clay and baked in a kiln', 'name': 'pottery'}, {'frequency': 'c', 'synset': 'pouch.n.01', 'synonyms': ['pouch'], 'id': 841, 'def': 'a small or medium size container for holding or carrying things', 'name': 'pouch'}, {'frequency': 'c', 'synset': 'power_shovel.n.01', 'synonyms': ['power_shovel', 'excavator', 'digger'], 'id': 842, 'def': 'a machine for excavating', 'name': 'power_shovel'}, {'frequency': 'c', 'synset': 'prawn.n.01', 'synonyms': ['prawn', 'shrimp'], 'id': 843, 'def': 'any of various edible decapod crustaceans', 'name': 'prawn'}, {'frequency': 'c', 'synset': 'pretzel.n.01', 'synonyms': ['pretzel'], 'id': 844, 'def': 'glazed and salted cracker typically in the shape of a loose knot', 'name': 'pretzel'}, {'frequency': 'f', 'synset': 'printer.n.03', 'synonyms': ['printer', 'printing_machine'], 'id': 845, 'def': 'a machine that prints', 'name': 'printer'}, {'frequency': 'c', 'synset': 'projectile.n.01', 'synonyms': ['projectile_(weapon)', 'missile'], 'id': 846, 'def': 'a weapon that is forcibly thrown or projected at a target', 'name': 'projectile_(weapon)'}, {'frequency': 'c', 'synset': 'projector.n.02', 'synonyms': ['projector'], 'id': 847, 'def': 'an optical instrument that projects an enlarged image onto a screen', 'name': 'projector'}, {'frequency': 'f', 'synset': 'propeller.n.01', 'synonyms': ['propeller', 'propellor'], 'id': 848, 'def': 'a mechanical device that rotates to push against air or water', 'name': 'propeller'}, {'frequency': 'r', 'synset': 'prune.n.01', 'synonyms': ['prune'], 'id': 849, 'def': 'dried plum', 'name': 'prune'}, {'frequency': 'r', 'synset': 'pudding.n.01', 'synonyms': ['pudding'], 'id': 850, 'def': 'any of various soft thick unsweetened baked dishes', 'name': 'pudding'}, {'frequency': 'r', 'synset': 'puffer.n.02', 'synonyms': ['puffer_(fish)', 'pufferfish', 'blowfish', 'globefish'], 'id': 851, 'def': 'fishes whose elongated spiny body can inflate itself with water or air to form a globe', 'name': 'puffer_(fish)'}, {'frequency': 'r', 'synset': 'puffin.n.01', 'synonyms': ['puffin'], 'id': 852, 'def': 'seabirds having short necks and brightly colored compressed bills', 'name': 'puffin'}, {'frequency': 'r', 'synset': 'pug.n.01', 'synonyms': ['pug-dog'], 'id': 853, 'def': 'small compact smooth-coated breed of Asiatic origin having a tightly curled tail and broad flat wrinkled muzzle', 'name': 'pug-dog'}, {'frequency': 'c', 'synset': 'pumpkin.n.02', 'synonyms': ['pumpkin'], 'id': 854, 'def': 'usually large pulpy deep-yellow round fruit of the squash family maturing in late summer or early autumn', 'name': 'pumpkin'}, {'frequency': 'r', 'synset': 'punch.n.03', 'synonyms': ['puncher'], 'id': 855, 'def': 'a tool for
making holes or indentations', 'name': 'puncher'}, {'frequency': 'r', 'synset': 'puppet.n.01', 'synonyms': ['puppet', 'marionette'], 'id': 856, 'def': 'a small figure of a person operated from above with strings by a puppeteer', 'name': 'puppet'}, {'frequency': 'c', 'synset': 'puppy.n.01', 'synonyms': ['puppy'], 'id': 857, 'def': 'a young dog', 'name': 'puppy'}, {'frequency': 'r', 'synset': 'quesadilla.n.01', 'synonyms': ['quesadilla'], 'id': 858, 'def': 'a tortilla that is filled with cheese and heated', 'name': 'quesadilla'}, {'frequency': 'r', 'synset': 'quiche.n.02', 'synonyms': ['quiche'], 'id': 859, 'def': 'a tart filled with rich unsweetened custard; often contains other ingredients (as cheese or ham or seafood or vegetables)', 'name': 'quiche'}, {'frequency': 'f', 'synset': 'quilt.n.01', 'synonyms': ['quilt', 'comforter'], 'id': 860, 'def': 'bedding made of two layers of cloth filled with stuffing and stitched together', 'name': 'quilt'}, {'frequency': 'c', 'synset': 'rabbit.n.01', 'synonyms': ['rabbit'], 'id': 861, 'def': 'any of various burrowing animals of the family Leporidae having long ears and short tails', 'name': 'rabbit'}, {'frequency': 'r', 'synset': 'racer.n.02', 'synonyms': ['race_car', 'racing_car'], 'id': 862, 'def': 'a fast car that competes in races', 'name': 'race_car'}, {'frequency': 'c', 'synset': 'racket.n.04', 'synonyms': ['racket', 'racquet'], 'id': 863, 'def': 'a sports implement used to strike a ball in various games', 'name': 'racket'}, {'frequency': 'r', 'synset': 'radar.n.01', 'synonyms': ['radar'], 'id': 864, 'def': 'measuring instrument in which the echo of a pulse of microwave radiation is used to detect and locate distant objects', 'name': 'radar'}, {'frequency': 'f', 'synset': 'radiator.n.03', 'synonyms': ['radiator'], 'id': 865, 'def': 'a mechanism consisting of a metal honeycomb through which hot fluids circulate', 'name': 'radiator'}, {'frequency': 'c', 'synset': 'radio_receiver.n.01', 'synonyms': ['radio_receiver', 'radio_set', 'radio', 'tuner_(radio)'], 'id': 866, 'def': 'an electronic receiver that detects and demodulates and amplifies transmitted radio signals', 'name': 'radio_receiver'}, {'frequency': 'c', 'synset': 'radish.n.03', 'synonyms': ['radish', 'daikon'], 'id': 867, 'def': 'pungent edible root of any of various cultivated radish plants', 'name': 'radish'}, {'frequency': 'c', 'synset': 'raft.n.01', 'synonyms': ['raft'], 'id': 868, 'def': 'a flat float (usually made of logs or planks) that can be used for transport or as a platform for swimmers', 'name': 'raft'}, {'frequency': 'r', 'synset': 'rag_doll.n.01', 'synonyms': ['rag_doll'], 'id': 869, 'def': 'a cloth doll that is stuffed and (usually) painted', 'name': 'rag_doll'}, {'frequency': 'c', 'synset': 'raincoat.n.01', 'synonyms': ['raincoat', 'waterproof_jacket'], 'id': 870, 'def': 'a water-resistant coat', 'name': 'raincoat'}, {'frequency': 'c', 'synset': 'ram.n.05', 'synonyms': ['ram_(animal)'], 'id': 871, 'def': 'uncastrated adult male sheep', 'name': 'ram_(animal)'}, {'frequency': 'c', 'synset': 'raspberry.n.02', 'synonyms': ['raspberry'], 'id': 872, 'def': 'red or black edible aggregate berries usually smaller than the related blackberries', 'name': 'raspberry'}, {'frequency': 'r', 'synset': 'rat.n.01', 'synonyms': ['rat'], 'id': 873, 'def': 'any of various long-tailed rodents similar to but larger than a mouse', 'name': 'rat'}, {'frequency': 'c', 'synset': 'razorblade.n.01', 'synonyms': ['razorblade'], 'id': 874, 'def': 'a blade that has a very sharp edge', 'name': 
'razorblade'}, {'frequency': 'c', 'synset': 'reamer.n.01', 'synonyms': ['reamer_(juicer)', 'juicer', 'juice_reamer'], 'id': 875, 'def': 'a squeezer with a conical ridged center that is used for squeezing juice from citrus fruit', 'name': 'reamer_(juicer)'}, {'frequency': 'f', 'synset': 'rearview_mirror.n.01', 'synonyms': ['rearview_mirror'], 'id': 876, 'def': 'vehicle mirror (side or rearview)', 'name': 'rearview_mirror'}, {'frequency': 'c', 'synset': 'receipt.n.02', 'synonyms': ['receipt'], 'id': 877, 'def': 'an acknowledgment (usually tangible) that payment has been made', 'name': 'receipt'}, {'frequency': 'c', 'synset': 'recliner.n.01', 'synonyms': ['recliner', 'reclining_chair', 'lounger_(chair)'], 'id': 878, 'def': 'an armchair whose back can be lowered and foot can be raised to allow the sitter to recline in it', 'name': 'recliner'}, {'frequency': 'c', 'synset': 'record_player.n.01', 'synonyms': ['record_player', 'phonograph_(record_player)', 'turntable'], 'id': 879, 'def': 'machine in which rotating records cause a stylus to vibrate and the vibrations are amplified acoustically or electronically', 'name': 'record_player'}, {'frequency': 'f', 'synset': 'reflector.n.01', 'synonyms': ['reflector'], 'id': 880, 'def': 'device that reflects light, radiation, etc.', 'name': 'reflector'}, {'frequency': 'f', 'synset': 'remote_control.n.01', 'synonyms': ['remote_control'], 'id': 881, 'def': 'a device that can be used to control a machine or apparatus from a distance', 'name': 'remote_control'}, {'frequency': 'c', 'synset': 'rhinoceros.n.01', 'synonyms': ['rhinoceros'], 'id': 882, 'def': 'massive powerful herbivorous odd-toed ungulate of southeast Asia and Africa having very thick skin and one or two horns on the snout', 'name': 'rhinoceros'}, {'frequency': 'r', 'synset': 'rib.n.03', 'synonyms': ['rib_(food)'], 'id': 883, 'def': 'cut of meat including one or more ribs', 'name': 'rib_(food)'}, {'frequency': 'c', 'synset': 'rifle.n.01', 'synonyms': ['rifle'], 'id': 884, 'def': 'a shoulder firearm with a long barrel', 'name': 'rifle'}, {'frequency': 'f', 'synset': 'ring.n.08', 'synonyms': ['ring'], 'id': 885, 'def': 'jewelry consisting of a circlet of precious metal (often set with jewels) worn on the finger', 'name': 'ring'}, {'frequency': 'r', 'synset': 'river_boat.n.01', 'synonyms': ['river_boat'], 'id': 886, 'def': 'a boat used on rivers or to ply a river', 'name': 'river_boat'}, {'frequency': 'r', 'synset': 'road_map.n.02', 'synonyms': ['road_map'], 'id': 887, 'def': '(NOT A ROAD) a MAP showing roads (for automobile travel)', 'name': 'road_map'}, {'frequency': 'c', 'synset': 'robe.n.01', 'synonyms': ['robe'], 'id': 888, 'def': 'any loose flowing garment', 'name': 'robe'}, {'frequency': 'c', 'synset': 'rocking_chair.n.01', 'synonyms': ['rocking_chair'], 'id': 889, 'def': 'a chair mounted on rockers', 'name': 'rocking_chair'}, {'frequency': 'r', 'synset': 'rodent.n.01', 'synonyms': ['rodent'], 'id': 890, 'def': 'relatively small placental mammals having a single pair of constantly growing incisor teeth specialized for gnawing', 'name': 'rodent'}, {'frequency': 'r', 'synset': 'roller_skate.n.01', 'synonyms': ['roller_skate'], 'id': 891, 'def': 'a shoe with pairs of rollers (small hard wheels) fixed to the sole', 'name': 'roller_skate'}, {'frequency': 'r', 'synset': 'rollerblade.n.01', 'synonyms': ['Rollerblade'], 'id': 892, 'def': 'an in-line variant of a roller skate', 'name': 'Rollerblade'}, {'frequency': 'c', 'synset': 'rolling_pin.n.01', 'synonyms': ['rolling_pin'], 'id': 893, 'def': 
'utensil consisting of a cylinder (usually of wood) with a handle at each end; used to roll out dough', 'name': 'rolling_pin'}, {'frequency': 'r', 'synset': 'root_beer.n.01', 'synonyms': ['root_beer'], 'id': 894, 'def': 'carbonated drink containing extracts of roots and herbs', 'name': 'root_beer'}, {'frequency': 'c', 'synset': 'router.n.02', 'synonyms': ['router_(computer_equipment)'], 'id': 895, 'def': 'a device that forwards data packets between computer networks', 'name': 'router_(computer_equipment)'}, {'frequency': 'f', 'synset': 'rubber_band.n.01', 'synonyms': ['rubber_band', 'elastic_band'], 'id': 896, 'def': 'a narrow band of elastic rubber used to hold things (such as papers) together', 'name': 'rubber_band'}, {'frequency': 'c', 'synset': 'runner.n.08', 'synonyms': ['runner_(carpet)'], 'id': 897, 'def': 'a long narrow carpet', 'name': 'runner_(carpet)'}, {'frequency': 'f', 'synset': 'sack.n.01', 'synonyms': ['plastic_bag', 'paper_bag'], 'id': 898, 'def': "a bag made of paper or plastic for holding customer's purchases", 'name': 'plastic_bag'}, {'frequency': 'f', 'synset': 'saddle.n.01', 'synonyms': ['saddle_(on_an_animal)'], 'id': 899, 'def': 'a seat for the rider of a horse or camel', 'name': 'saddle_(on_an_animal)'}, {'frequency': 'f', 'synset': 'saddle_blanket.n.01', 'synonyms': ['saddle_blanket', 'saddlecloth', 'horse_blanket'], 'id': 900, 'def': 'stable gear consisting of a blanket placed under the saddle', 'name': 'saddle_blanket'}, {'frequency': 'c', 'synset': 'saddlebag.n.01', 'synonyms': ['saddlebag'], 'id': 901, 'def': 'a large bag (or pair of bags) hung over a saddle', 'name': 'saddlebag'}, {'frequency': 'r', 'synset': 'safety_pin.n.01', 'synonyms': ['safety_pin'], 'id': 902, 'def': 'a pin in the form of a clasp; has a guard so the point of the pin will not stick the user', 'name': 'safety_pin'}, {'frequency': 'f', 'synset': 'sail.n.01', 'synonyms': ['sail'], 'id': 903, 'def': 'a large piece of fabric by means of which wind is used to propel a sailing vessel', 'name': 'sail'}, {'frequency': 'f', 'synset': 'salad.n.01', 'synonyms': ['salad'], 'id': 904, 'def': 'food mixtures either arranged on a plate or tossed and served with a moist dressing; usually consisting of or including greens', 'name': 'salad'}, {'frequency': 'r', 'synset': 'salad_plate.n.01', 'synonyms': ['salad_plate', 'salad_bowl'], 'id': 905, 'def': 'a plate or bowl for individual servings of salad', 'name': 'salad_plate'}, {'frequency': 'c', 'synset': 'salami.n.01', 'synonyms': ['salami'], 'id': 906, 'def': 'highly seasoned fatty sausage of pork and beef usually dried', 'name': 'salami'}, {'frequency': 'c', 'synset': 'salmon.n.01', 'synonyms': ['salmon_(fish)'], 'id': 907, 'def': 'any of various large food and game fishes of northern waters', 'name': 'salmon_(fish)'}, {'frequency': 'r', 'synset': 'salmon.n.03', 'synonyms': ['salmon_(food)'], 'id': 908, 'def': 'flesh of any of various marine or freshwater fish of the family Salmonidae', 'name': 'salmon_(food)'}, {'frequency': 'c', 'synset': 'salsa.n.01', 'synonyms': ['salsa'], 'id': 909, 'def': 'spicy sauce of tomatoes and onions and chili peppers to accompany Mexican foods', 'name': 'salsa'}, {'frequency': 'f', 'synset': 'saltshaker.n.01', 'synonyms': ['saltshaker'], 'id': 910, 'def': 'a shaker with a perforated top for sprinkling salt', 'name': 'saltshaker'}, {'frequency': 'f', 'synset': 'sandal.n.01', 'synonyms': ['sandal_(type_of_shoe)'], 'id': 911, 'def': 'a shoe consisting of a sole fastened by straps to the foot', 'name': 'sandal_(type_of_shoe)'}, 
{'frequency': 'f', 'synset': 'sandwich.n.01', 'synonyms': ['sandwich'], 'id': 912, 'def': 'two (or more) slices of bread with a filling between them', 'name': 'sandwich'}, {'frequency': 'r', 'synset': 'satchel.n.01', 'synonyms': ['satchel'], 'id': 913, 'def': 'luggage consisting of a small case with a flat bottom and (usually) a shoulder strap', 'name': 'satchel'}, {'frequency': 'r', 'synset': 'saucepan.n.01', 'synonyms': ['saucepan'], 'id': 914, 'def': 'a deep pan with a handle; used for stewing or boiling', 'name': 'saucepan'}, {'frequency': 'f', 'synset': 'saucer.n.02', 'synonyms': ['saucer'], 'id': 915, 'def': 'a small shallow dish for holding a cup at the table', 'name': 'saucer'}, {'frequency': 'f', 'synset': 'sausage.n.01', 'synonyms': ['sausage'], 'id': 916, 'def': 'highly seasoned minced meat stuffed in casings', 'name': 'sausage'}, {'frequency': 'r', 'synset': 'sawhorse.n.01', 'synonyms': ['sawhorse', 'sawbuck'], 'id': 917, 'def': 'a framework for holding wood that is being sawed', 'name': 'sawhorse'}, {'frequency': 'r', 'synset': 'sax.n.02', 'synonyms': ['saxophone'], 'id': 918, 'def': "a wind instrument with a `J'-shaped form typically made of brass", 'name': 'saxophone'}, {'frequency': 'f', 'synset': 'scale.n.07', 'synonyms': ['scale_(measuring_instrument)'], 'id': 919, 'def': 'a measuring instrument for weighing; shows amount of mass', 'name': 'scale_(measuring_instrument)'}, {'frequency': 'r', 'synset': 'scarecrow.n.01', 'synonyms': ['scarecrow', 'strawman'], 'id': 920, 'def': 'an effigy in the shape of a man to frighten birds away from seeds', 'name': 'scarecrow'}, {'frequency': 'f', 'synset': 'scarf.n.01', 'synonyms': ['scarf'], 'id': 921, 'def': 'a garment worn around the head or neck or shoulders for warmth or decoration', 'name': 'scarf'}, {'frequency': 'c', 'synset': 'school_bus.n.01', 'synonyms': ['school_bus'], 'id': 922, 'def': 'a bus used to transport children to or from school', 'name': 'school_bus'}, {'frequency': 'f', 'synset': 'scissors.n.01', 'synonyms': ['scissors'], 'id': 923, 'def': 'a tool having two crossed pivoting blades with looped handles', 'name': 'scissors'}, {'frequency': 'f', 'synset': 'scoreboard.n.01', 'synonyms': ['scoreboard'], 'id': 924, 'def': 'a large board for displaying the score of a contest (and some other information)', 'name': 'scoreboard'}, {'frequency': 'r', 'synset': 'scraper.n.01', 'synonyms': ['scraper'], 'id': 925, 'def': 'any of various hand tools for scraping', 'name': 'scraper'}, {'frequency': 'c', 'synset': 'screwdriver.n.01', 'synonyms': ['screwdriver'], 'id': 926, 'def': 'a hand tool for driving screws; has a tip that fits into the head of a screw', 'name': 'screwdriver'}, {'frequency': 'f', 'synset': 'scrub_brush.n.01', 'synonyms': ['scrubbing_brush'], 'id': 927, 'def': 'a brush with short stiff bristles for heavy cleaning', 'name': 'scrubbing_brush'}, {'frequency': 'c', 'synset': 'sculpture.n.01', 'synonyms': ['sculpture'], 'id': 928, 'def': 'a three-dimensional work of art', 'name': 'sculpture'}, {'frequency': 'c', 'synset': 'seabird.n.01', 'synonyms': ['seabird', 'seafowl'], 'id': 929, 'def': 'a bird that frequents coastal waters and the open ocean: gulls; pelicans; gannets; cormorants; albatrosses; petrels; etc.', 'name': 'seabird'}, {'frequency': 'c', 'synset': 'seahorse.n.02', 'synonyms': ['seahorse'], 'id': 930, 'def': 'small fish with horse-like heads bent sharply downward and curled tails', 'name': 'seahorse'}, {'frequency': 'r', 'synset': 'seaplane.n.01', 'synonyms': ['seaplane', 'hydroplane'], 'id': 931, 'def': 
'an airplane that can land on or take off from water', 'name': 'seaplane'}, {'frequency': 'c', 'synset': 'seashell.n.01', 'synonyms': ['seashell'], 'id': 932, 'def': 'the shell of a marine organism', 'name': 'seashell'}, {'frequency': 'c', 'synset': 'sewing_machine.n.01', 'synonyms': ['sewing_machine'], 'id': 933, 'def': 'a textile machine used as a home appliance for sewing', 'name': 'sewing_machine'}, {'frequency': 'c', 'synset': 'shaker.n.03', 'synonyms': ['shaker'], 'id': 934, 'def': 'a container in which something can be shaken', 'name': 'shaker'}, {'frequency': 'c', 'synset': 'shampoo.n.01', 'synonyms': ['shampoo'], 'id': 935, 'def': 'cleansing agent consisting of soaps or detergents used for washing the hair', 'name': 'shampoo'}, {'frequency': 'c', 'synset': 'shark.n.01', 'synonyms': ['shark'], 'id': 936, 'def': 'typically large carnivorous fishes with sharp teeth', 'name': 'shark'}, {'frequency': 'r', 'synset': 'sharpener.n.01', 'synonyms': ['sharpener'], 'id': 937, 'def': 'any implement that is used to make something (an edge or a point) sharper', 'name': 'sharpener'}, {'frequency': 'r', 'synset': 'sharpie.n.03', 'synonyms': ['Sharpie'], 'id': 938, 'def': 'a pen with indelible ink that will write on any surface', 'name': 'Sharpie'}, {'frequency': 'r', 'synset': 'shaver.n.03', 'synonyms': ['shaver_(electric)', 'electric_shaver', 'electric_razor'], 'id': 939, 'def': 'a razor powered by an electric motor', 'name': 'shaver_(electric)'}, {'frequency': 'c', 'synset': 'shaving_cream.n.01', 'synonyms': ['shaving_cream', 'shaving_soap'], 'id': 940, 'def': 'toiletry that forms a rich lather for softening the beard before shaving', 'name': 'shaving_cream'}, {'frequency': 'r', 'synset': 'shawl.n.01', 'synonyms': ['shawl'], 'id': 941, 'def': 'cloak consisting of an oblong piece of cloth used to cover the head and shoulders', 'name': 'shawl'}, {'frequency': 'r', 'synset': 'shears.n.01', 'synonyms': ['shears'], 'id': 942, 'def': 'large scissors with strong blades', 'name': 'shears'}, {'frequency': 'f', 'synset': 'sheep.n.01', 'synonyms': ['sheep'], 'id': 943, 'def': 'woolly usually horned ruminant mammal related to the goat', 'name': 'sheep'}, {'frequency': 'r', 'synset': 'shepherd_dog.n.01', 'synonyms': ['shepherd_dog', 'sheepdog'], 'id': 944, 'def': 'any of various usually long-haired breeds of dog reared to herd and guard sheep', 'name': 'shepherd_dog'}, {'frequency': 'r', 'synset': 'sherbert.n.01', 'synonyms': ['sherbert', 'sherbet'], 'id': 945, 'def': 'a frozen dessert made primarily of fruit juice and sugar', 'name': 'sherbert'}, {'frequency': 'c', 'synset': 'shield.n.02', 'synonyms': ['shield'], 'id': 946, 'def': 'armor carried on the arm to intercept blows', 'name': 'shield'}, {'frequency': 'f', 'synset': 'shirt.n.01', 'synonyms': ['shirt'], 'id': 947, 'def': 'a garment worn on the upper half of the body', 'name': 'shirt'}, {'frequency': 'f', 'synset': 'shoe.n.01', 'synonyms': ['shoe', 'sneaker_(type_of_shoe)', 'tennis_shoe'], 'id': 948, 'def': 'common footwear covering the foot', 'name': 'shoe'}, {'frequency': 'f', 'synset': 'shopping_bag.n.01', 'synonyms': ['shopping_bag'], 'id': 949, 'def': 'a bag made of plastic or strong paper (often with handles); used to transport goods after shopping', 'name': 'shopping_bag'}, {'frequency': 'c', 'synset': 'shopping_cart.n.01', 'synonyms': ['shopping_cart'], 'id': 950, 'def': 'a handcart that holds groceries or other goods while shopping', 'name': 'shopping_cart'}, {'frequency': 'f', 'synset': 'short_pants.n.01', 'synonyms': 
['short_pants', 'shorts_(clothing)', 'trunks_(clothing)'], 'id': 951, 'def': 'trousers that end at or above the knee', 'name': 'short_pants'}, {'frequency': 'r', 'synset': 'shot_glass.n.01', 'synonyms': ['shot_glass'], 'id': 952, 'def': 'a small glass adequate to hold a single swallow of whiskey', 'name': 'shot_glass'}, {'frequency': 'f', 'synset': 'shoulder_bag.n.01', 'synonyms': ['shoulder_bag'], 'id': 953, 'def': 'a large handbag that can be carried by a strap looped over the shoulder', 'name': 'shoulder_bag'}, {'frequency': 'c', 'synset': 'shovel.n.01', 'synonyms': ['shovel'], 'id': 954, 'def': 'a hand tool for lifting loose material such as snow, dirt, etc.', 'name': 'shovel'}, {'frequency': 'f', 'synset': 'shower.n.01', 'synonyms': ['shower_head'], 'id': 955, 'def': 'a plumbing fixture that sprays water over you', 'name': 'shower_head'}, {'frequency': 'r', 'synset': 'shower_cap.n.01', 'synonyms': ['shower_cap'], 'id': 956, 'def': 'a tight cap worn to keep hair dry while showering', 'name': 'shower_cap'}, {'frequency': 'f', 'synset': 'shower_curtain.n.01', 'synonyms': ['shower_curtain'], 'id': 957, 'def': 'a curtain that keeps water from splashing out of the shower area', 'name': 'shower_curtain'}, {'frequency': 'r', 'synset': 'shredder.n.01', 'synonyms': ['shredder_(for_paper)'], 'id': 958, 'def': 'a device that shreds documents', 'name': 'shredder_(for_paper)'}, {'frequency': 'f', 'synset': 'signboard.n.01', 'synonyms': ['signboard'], 'id': 959, 'def': 'structure displaying a board on which advertisements can be posted', 'name': 'signboard'}, {'frequency': 'c', 'synset': 'silo.n.01', 'synonyms': ['silo'], 'id': 960, 'def': 'a cylindrical tower used for storing goods', 'name': 'silo'}, {'frequency': 'f', 'synset': 'sink.n.01', 'synonyms': ['sink'], 'id': 961, 'def': 'plumbing fixture consisting of a water basin fixed to a wall or floor and having a drainpipe', 'name': 'sink'}, {'frequency': 'f', 'synset': 'skateboard.n.01', 'synonyms': ['skateboard'], 'id': 962, 'def': 'a board with wheels that is ridden in a standing or crouching position and propelled by foot', 'name': 'skateboard'}, {'frequency': 'c', 'synset': 'skewer.n.01', 'synonyms': ['skewer'], 'id': 963, 'def': 'a long pin for holding meat in position while it is being roasted', 'name': 'skewer'}, {'frequency': 'f', 'synset': 'ski.n.01', 'synonyms': ['ski'], 'id': 964, 'def': 'sports equipment for skiing on snow', 'name': 'ski'}, {'frequency': 'f', 'synset': 'ski_boot.n.01', 'synonyms': ['ski_boot'], 'id': 965, 'def': 'a stiff boot that is fastened to a ski with a ski binding', 'name': 'ski_boot'}, {'frequency': 'f', 'synset': 'ski_parka.n.01', 'synonyms': ['ski_parka', 'ski_jacket'], 'id': 966, 'def': 'a parka to be worn while skiing', 'name': 'ski_parka'}, {'frequency': 'f', 'synset': 'ski_pole.n.01', 'synonyms': ['ski_pole'], 'id': 967, 'def': 'a pole with metal points used as an aid in skiing', 'name': 'ski_pole'}, {'frequency': 'f', 'synset': 'skirt.n.02', 'synonyms': ['skirt'], 'id': 968, 'def': 'a garment hanging from the waist; worn mainly by girls and women', 'name': 'skirt'}, {'frequency': 'r', 'synset': 'skullcap.n.01', 'synonyms': ['skullcap'], 'id': 969, 'def': 'rounded brimless cap fitting the crown of the head', 'name': 'skullcap'}, {'frequency': 'c', 'synset': 'sled.n.01', 'synonyms': ['sled', 'sledge', 'sleigh'], 'id': 970, 'def': 'a vehicle or flat object for transportation over snow by sliding or pulled by dogs, etc.', 'name': 'sled'}, {'frequency': 'c', 'synset': 'sleeping_bag.n.01', 'synonyms': 
['sleeping_bag'], 'id': 971, 'def': 'large padded bag designed to be slept in outdoors', 'name': 'sleeping_bag'}, {'frequency': 'r', 'synset': 'sling.n.05', 'synonyms': ['sling_(bandage)', 'triangular_bandage'], 'id': 972, 'def': 'bandage to support an injured forearm; slung over the shoulder or neck', 'name': 'sling_(bandage)'}, {'frequency': 'c', 'synset': 'slipper.n.01', 'synonyms': ['slipper_(footwear)', 'carpet_slipper_(footwear)'], 'id': 973, 'def': 'low footwear that can be slipped on and off easily; usually worn indoors', 'name': 'slipper_(footwear)'}, {'frequency': 'r', 'synset': 'smoothie.n.02', 'synonyms': ['smoothie'], 'id': 974, 'def': 'a thick smooth drink consisting of fresh fruit pureed with ice cream or yoghurt or milk', 'name': 'smoothie'}, {'frequency': 'r', 'synset': 'snake.n.01', 'synonyms': ['snake', 'serpent'], 'id': 975, 'def': 'limbless scaly elongate reptile; some are venomous', 'name': 'snake'}, {'frequency': 'f', 'synset': 'snowboard.n.01', 'synonyms': ['snowboard'], 'id': 976, 'def': 'a board that resembles a broad ski or a small surfboard; used in a standing position to slide down snow-covered slopes', 'name': 'snowboard'}, {'frequency': 'c', 'synset': 'snowman.n.01', 'synonyms': ['snowman'], 'id': 977, 'def': 'a figure of a person made of packed snow', 'name': 'snowman'}, {'frequency': 'c', 'synset': 'snowmobile.n.01', 'synonyms': ['snowmobile'], 'id': 978, 'def': 'tracked vehicle for travel on snow having skis in front', 'name': 'snowmobile'}, {'frequency': 'f', 'synset': 'soap.n.01', 'synonyms': ['soap'], 'id': 979, 'def': 'a cleansing agent made from the salts of vegetable or animal fats', 'name': 'soap'}, {'frequency': 'f', 'synset': 'soccer_ball.n.01', 'synonyms': ['soccer_ball'], 'id': 980, 'def': "an inflated ball used in playing soccer (called `football' outside of the United States)", 'name': 'soccer_ball'}, {'frequency': 'f', 'synset': 'sock.n.01', 'synonyms': ['sock'], 'id': 981, 'def': 'cloth covering for the foot; worn inside the shoe; reaches to between the ankle and the knee', 'name': 'sock'}, {'frequency': 'f', 'synset': 'sofa.n.01', 'synonyms': ['sofa', 'couch', 'lounge'], 'id': 982, 'def': 'an upholstered seat for more than one person', 'name': 'sofa'}, {'frequency': 'r', 'synset': 'softball.n.01', 'synonyms': ['softball'], 'id': 983, 'def': 'ball used in playing softball', 'name': 'softball'}, {'frequency': 'c', 'synset': 'solar_array.n.01', 'synonyms': ['solar_array', 'solar_battery', 'solar_panel'], 'id': 984, 'def': 'electrical device consisting of a large array of connected solar cells', 'name': 'solar_array'}, {'frequency': 'r', 'synset': 'sombrero.n.02', 'synonyms': ['sombrero'], 'id': 985, 'def': 'a straw hat with a tall crown and broad brim; worn in American southwest and in Mexico', 'name': 'sombrero'}, {'frequency': 'f', 'synset': 'soup.n.01', 'synonyms': ['soup'], 'id': 986, 'def': 'liquid food especially of meat or fish or vegetable stock often containing pieces of solid food', 'name': 'soup'}, {'frequency': 'r', 'synset': 'soup_bowl.n.01', 'synonyms': ['soup_bowl'], 'id': 987, 'def': 'a bowl for serving soup', 'name': 'soup_bowl'}, {'frequency': 'c', 'synset': 'soupspoon.n.01', 'synonyms': ['soupspoon'], 'id': 988, 'def': 'a spoon with a rounded bowl for eating soup', 'name': 'soupspoon'}, {'frequency': 'c', 'synset': 'sour_cream.n.01', 'synonyms': ['sour_cream', 'soured_cream'], 'id': 989, 'def': 'soured light cream', 'name': 'sour_cream'}, {'frequency': 'r', 'synset': 'soya_milk.n.01', 'synonyms': ['soya_milk', 
'soybean_milk', 'soymilk'], 'id': 990, 'def': 'a milk substitute containing soybean flour and water; used in some infant formulas and in making tofu', 'name': 'soya_milk'}, {'frequency': 'r', 'synset': 'space_shuttle.n.01', 'synonyms': ['space_shuttle'], 'id': 991, 'def': "a reusable spacecraft with wings for a controlled descent through the Earth's atmosphere", 'name': 'space_shuttle'}, {'frequency': 'r', 'synset': 'sparkler.n.02', 'synonyms': ['sparkler_(fireworks)'], 'id': 992, 'def': 'a firework that burns slowly and throws out a shower of sparks', 'name': 'sparkler_(fireworks)'}, {'frequency': 'f', 'synset': 'spatula.n.02', 'synonyms': ['spatula'], 'id': 993, 'def': 'a hand tool with a thin flexible blade used to mix or spread soft substances', 'name': 'spatula'}, {'frequency': 'r', 'synset': 'spear.n.01', 'synonyms': ['spear', 'lance'], 'id': 994, 'def': 'a long pointed rod used as a tool or weapon', 'name': 'spear'}, {'frequency': 'f', 'synset': 'spectacles.n.01', 'synonyms': ['spectacles', 'specs', 'eyeglasses', 'glasses'], 'id': 995, 'def': 'optical instrument consisting of a frame that holds a pair of lenses for correcting defective vision', 'name': 'spectacles'}, {'frequency': 'c', 'synset': 'spice_rack.n.01', 'synonyms': ['spice_rack'], 'id': 996, 'def': 'a rack for displaying containers filled with spices', 'name': 'spice_rack'}, {'frequency': 'c', 'synset': 'spider.n.01', 'synonyms': ['spider'], 'id': 997, 'def': 'predatory arachnid with eight legs, two poison fangs, two feelers, and usually two silk-spinning organs at the back end of the body', 'name': 'spider'}, {'frequency': 'r', 'synset': 'spiny_lobster.n.02', 'synonyms': ['crawfish', 'crayfish'], 'id': 998, 'def': 'large edible marine crustacean having a spiny carapace but lacking the large pincers of true lobsters', 'name': 'crawfish'}, {'frequency': 'c', 'synset': 'sponge.n.01', 'synonyms': ['sponge'], 'id': 999, 'def': 'a porous mass usable to absorb water typically used for cleaning', 'name': 'sponge'}, {'frequency': 'f', 'synset': 'spoon.n.01', 'synonyms': ['spoon'], 'id': 1000, 'def': 'a piece of cutlery with a shallow bowl-shaped container and a handle', 'name': 'spoon'}, {'frequency': 'c', 'synset': 'sportswear.n.01', 'synonyms': ['sportswear', 'athletic_wear', 'activewear'], 'id': 1001, 'def': 'attire worn for sport or for casual wear', 'name': 'sportswear'}, {'frequency': 'c', 'synset': 'spotlight.n.02', 'synonyms': ['spotlight'], 'id': 1002, 'def': 'a lamp that produces a strong beam of light to illuminate a restricted area; used to focus attention of a stage performer', 'name': 'spotlight'}, {'frequency': 'r', 'synset': 'squid.n.01', 'synonyms': ['squid_(food)', 'calamari', 'calamary'], 'id': 1003, 'def': '(Italian cuisine) squid prepared as food', 'name': 'squid_(food)'}, {'frequency': 'c', 'synset': 'squirrel.n.01', 'synonyms': ['squirrel'], 'id': 1004, 'def': 'a kind of arboreal rodent having a long bushy tail', 'name': 'squirrel'}, {'frequency': 'r', 'synset': 'stagecoach.n.01', 'synonyms': ['stagecoach'], 'id': 1005, 'def': 'a large coach-and-four formerly used to carry passengers and mail on regular routes between towns', 'name': 'stagecoach'}, {'frequency': 'c', 'synset': 'stapler.n.01', 'synonyms': ['stapler_(stapling_machine)'], 'id': 1006, 'def': 'a machine that inserts staples into sheets of paper in order to fasten them together', 'name': 'stapler_(stapling_machine)'}, {'frequency': 'c', 'synset': 'starfish.n.01', 'synonyms': ['starfish', 'sea_star'], 'id': 1007, 'def': 'echinoderms characterized 
by five arms extending from a central disk', 'name': 'starfish'}, {'frequency': 'f', 'synset': 'statue.n.01', 'synonyms': ['statue_(sculpture)'], 'id': 1008, 'def': 'a sculpture representing a human or animal', 'name': 'statue_(sculpture)'}, {'frequency': 'c', 'synset': 'steak.n.01', 'synonyms': ['steak_(food)'], 'id': 1009, 'def': 'a slice of meat cut from the fleshy part of an animal or large fish', 'name': 'steak_(food)'}, {'frequency': 'r', 'synset': 'steak_knife.n.01', 'synonyms': ['steak_knife'], 'id': 1010, 'def': 'a sharp table knife used in eating steak', 'name': 'steak_knife'}, {'frequency': 'f', 'synset': 'steering_wheel.n.01', 'synonyms': ['steering_wheel'], 'id': 1011, 'def': 'a handwheel that is used for steering', 'name': 'steering_wheel'}, {'frequency': 'r', 'synset': 'step_ladder.n.01', 'synonyms': ['stepladder'], 'id': 1012, 'def': 'a folding portable ladder hinged at the top', 'name': 'stepladder'}, {'frequency': 'c', 'synset': 'step_stool.n.01', 'synonyms': ['step_stool'], 'id': 1013, 'def': 'a stool that has one or two steps that fold under the seat', 'name': 'step_stool'}, {'frequency': 'c', 'synset': 'stereo.n.01', 'synonyms': ['stereo_(sound_system)'], 'id': 1014, 'def': 'electronic device for playing audio', 'name': 'stereo_(sound_system)'}, {'frequency': 'r', 'synset': 'stew.n.02', 'synonyms': ['stew'], 'id': 1015, 'def': 'food prepared by stewing especially meat or fish with vegetables', 'name': 'stew'}, {'frequency': 'r', 'synset': 'stirrer.n.02', 'synonyms': ['stirrer'], 'id': 1016, 'def': 'an implement used for stirring', 'name': 'stirrer'}, {'frequency': 'f', 'synset': 'stirrup.n.01', 'synonyms': ['stirrup'], 'id': 1017, 'def': "support consisting of metal loops into which rider's feet go", 'name': 'stirrup'}, {'frequency': 'f', 'synset': 'stool.n.01', 'synonyms': ['stool'], 'id': 1018, 'def': 'a simple seat without a back or arms', 'name': 'stool'}, {'frequency': 'f', 'synset': 'stop_sign.n.01', 'synonyms': ['stop_sign'], 'id': 1019, 'def': 'a traffic sign to notify drivers that they must come to a complete stop', 'name': 'stop_sign'}, {'frequency': 'f', 'synset': 'stoplight.n.01', 'synonyms': ['brake_light'], 'id': 1020, 'def': 'a red light on the rear of a motor vehicle that signals when the brakes are applied', 'name': 'brake_light'}, {'frequency': 'f', 'synset': 'stove.n.01', 'synonyms': ['stove', 'kitchen_stove', 'range_(kitchen_appliance)', 'kitchen_range', 'cooking_stove'], 'id': 1021, 'def': 'a kitchen appliance used for cooking food', 'name': 'stove'}, {'frequency': 'c', 'synset': 'strainer.n.01', 'synonyms': ['strainer'], 'id': 1022, 'def': 'a filter to retain larger pieces while smaller pieces and liquids pass through', 'name': 'strainer'}, {'frequency': 'f', 'synset': 'strap.n.01', 'synonyms': ['strap'], 'id': 1023, 'def': 'an elongated strip of material for binding things together or holding', 'name': 'strap'}, {'frequency': 'f', 'synset': 'straw.n.04', 'synonyms': ['straw_(for_drinking)', 'drinking_straw'], 'id': 1024, 'def': 'a thin paper or plastic tube used to suck liquids into the mouth', 'name': 'straw_(for_drinking)'}, {'frequency': 'f', 'synset': 'strawberry.n.01', 'synonyms': ['strawberry'], 'id': 1025, 'def': 'sweet fleshy red fruit', 'name': 'strawberry'}, {'frequency': 'f', 'synset': 'street_sign.n.01', 'synonyms': ['street_sign'], 'id': 1026, 'def': 'a sign visible from the street', 'name': 'street_sign'}, {'frequency': 'f', 'synset': 'streetlight.n.01', 'synonyms': ['streetlight', 'street_lamp'], 'id': 1027, 'def': 'a lamp 
supported on a lamppost; for illuminating a street', 'name': 'streetlight'}, {'frequency': 'r', 'synset': 'string_cheese.n.01', 'synonyms': ['string_cheese'], 'id': 1028, 'def': 'cheese formed in long strings twisted together', 'name': 'string_cheese'}, {'frequency': 'r', 'synset': 'stylus.n.02', 'synonyms': ['stylus'], 'id': 1029, 'def': 'a pointed tool for writing or drawing or engraving, including pens', 'name': 'stylus'}, {'frequency': 'r', 'synset': 'subwoofer.n.01', 'synonyms': ['subwoofer'], 'id': 1030, 'def': 'a loudspeaker that is designed to reproduce very low bass frequencies', 'name': 'subwoofer'}, {'frequency': 'r', 'synset': 'sugar_bowl.n.01', 'synonyms': ['sugar_bowl'], 'id': 1031, 'def': 'a dish in which sugar is served', 'name': 'sugar_bowl'}, {'frequency': 'r', 'synset': 'sugarcane.n.01', 'synonyms': ['sugarcane_(plant)'], 'id': 1032, 'def': 'juicy canes whose sap is a source of molasses and commercial sugar; fresh canes are sometimes chewed for the juice', 'name': 'sugarcane_(plant)'}, {'frequency': 'f', 'synset': 'suit.n.01', 'synonyms': ['suit_(clothing)'], 'id': 1033, 'def': 'a set of garments (usually including a jacket and trousers or skirt) for outerwear all of the same fabric and color', 'name': 'suit_(clothing)'}, {'frequency': 'c', 'synset': 'sunflower.n.01', 'synonyms': ['sunflower'], 'id': 1034, 'def': 'any plant of the genus Helianthus having large flower heads with dark disk florets and showy yellow rays', 'name': 'sunflower'}, {'frequency': 'f', 'synset': 'sunglasses.n.01', 'synonyms': ['sunglasses'], 'id': 1035, 'def': 'spectacles that are darkened or polarized to protect the eyes from the glare of the sun', 'name': 'sunglasses'}, {'frequency': 'c', 'synset': 'sunhat.n.01', 'synonyms': ['sunhat'], 'id': 1036, 'def': 'a hat with a broad brim that protects the face from direct exposure to the sun', 'name': 'sunhat'}, {'frequency': 'f', 'synset': 'surfboard.n.01', 'synonyms': ['surfboard'], 'id': 1037, 'def': 'a narrow buoyant board for riding surf', 'name': 'surfboard'}, {'frequency': 'c', 'synset': 'sushi.n.01', 'synonyms': ['sushi'], 'id': 1038, 'def': 'rice (with raw fish) wrapped in seaweed', 'name': 'sushi'}, {'frequency': 'c', 'synset': 'swab.n.02', 'synonyms': ['mop'], 'id': 1039, 'def': 'cleaning implement consisting of absorbent material fastened to a handle; for cleaning floors', 'name': 'mop'}, {'frequency': 'c', 'synset': 'sweat_pants.n.01', 'synonyms': ['sweat_pants'], 'id': 1040, 'def': 'loose-fitting trousers with elastic cuffs; worn by athletes', 'name': 'sweat_pants'}, {'frequency': 'c', 'synset': 'sweatband.n.02', 'synonyms': ['sweatband'], 'id': 1041, 'def': 'a band of material tied around the forehead or wrist to absorb sweat', 'name': 'sweatband'}, {'frequency': 'f', 'synset': 'sweater.n.01', 'synonyms': ['sweater'], 'id': 1042, 'def': 'a crocheted or knitted garment covering the upper part of the body', 'name': 'sweater'}, {'frequency': 'f', 'synset': 'sweatshirt.n.01', 'synonyms': ['sweatshirt'], 'id': 1043, 'def': 'cotton knit pullover with long sleeves worn during athletic activity', 'name': 'sweatshirt'}, {'frequency': 'c', 'synset': 'sweet_potato.n.02', 'synonyms': ['sweet_potato'], 'id': 1044, 'def': 'the edible tuberous root of the sweet potato vine', 'name': 'sweet_potato'}, {'frequency': 'f', 'synset': 'swimsuit.n.01', 'synonyms': ['swimsuit', 'swimwear', 'bathing_suit', 'swimming_costume', 'bathing_costume', 'swimming_trunks', 'bathing_trunks'], 'id': 1045, 'def': 'garment worn for swimming', 'name': 'swimsuit'}, {'frequency': 
'c', 'synset': 'sword.n.01', 'synonyms': ['sword'], 'id': 1046, 'def': 'a cutting or thrusting weapon that has a long metal blade', 'name': 'sword'}, {'frequency': 'r', 'synset': 'syringe.n.01', 'synonyms': ['syringe'], 'id': 1047, 'def': 'a medical instrument used to inject or withdraw fluids', 'name': 'syringe'}, {'frequency': 'r', 'synset': 'tabasco.n.02', 'synonyms': ['Tabasco_sauce'], 'id': 1048, 'def': 'very spicy sauce (trade name Tabasco) made from fully-aged red peppers', 'name': 'Tabasco_sauce'}, {'frequency': 'r', 'synset': 'table-tennis_table.n.01', 'synonyms': ['table-tennis_table', 'ping-pong_table'], 'id': 1049, 'def': 'a table used for playing table tennis', 'name': 'table-tennis_table'}, {'frequency': 'f', 'synset': 'table.n.02', 'synonyms': ['table'], 'id': 1050, 'def': 'a piece of furniture having a smooth flat top that is usually supported by one or more vertical legs', 'name': 'table'}, {'frequency': 'c', 'synset': 'table_lamp.n.01', 'synonyms': ['table_lamp'], 'id': 1051, 'def': 'a lamp that sits on a table', 'name': 'table_lamp'}, {'frequency': 'f', 'synset': 'tablecloth.n.01', 'synonyms': ['tablecloth'], 'id': 1052, 'def': 'a covering spread over a dining table', 'name': 'tablecloth'}, {'frequency': 'r', 'synset': 'tachometer.n.01', 'synonyms': ['tachometer'], 'id': 1053, 'def': 'measuring instrument for indicating speed of rotation', 'name': 'tachometer'}, {'frequency': 'r', 'synset': 'taco.n.02', 'synonyms': ['taco'], 'id': 1054, 'def': 'a small tortilla cupped around a filling', 'name': 'taco'}, {'frequency': 'f', 'synset': 'tag.n.02', 'synonyms': ['tag'], 'id': 1055, 'def': 'a label associated with something for the purpose of identification or information', 'name': 'tag'}, {'frequency': 'f', 'synset': 'taillight.n.01', 'synonyms': ['taillight', 'rear_light'], 'id': 1056, 'def': 'lamp (usually red) mounted at the rear of a motor vehicle', 'name': 'taillight'}, {'frequency': 'r', 'synset': 'tambourine.n.01', 'synonyms': ['tambourine'], 'id': 1057, 'def': 'a shallow drum with a single drumhead and with metallic disks in the sides', 'name': 'tambourine'}, {'frequency': 'r', 'synset': 'tank.n.01', 'synonyms': ['army_tank', 'armored_combat_vehicle', 'armoured_combat_vehicle'], 'id': 1058, 'def': 'an enclosed armored military vehicle; has a cannon and moves on caterpillar treads', 'name': 'army_tank'}, {'frequency': 'f', 'synset': 'tank.n.02', 'synonyms': ['tank_(storage_vessel)', 'storage_tank'], 'id': 1059, 'def': 'a large (usually metallic) vessel for holding gases or liquids', 'name': 'tank_(storage_vessel)'}, {'frequency': 'f', 'synset': 'tank_top.n.01', 'synonyms': ['tank_top_(clothing)'], 'id': 1060, 'def': 'a tight-fitting sleeveless shirt with wide shoulder straps and low neck and no front opening', 'name': 'tank_top_(clothing)'}, {'frequency': 'f', 'synset': 'tape.n.01', 'synonyms': ['tape_(sticky_cloth_or_paper)'], 'id': 1061, 'def': 'a long thin piece of cloth or paper as used for binding or fastening', 'name': 'tape_(sticky_cloth_or_paper)'}, {'frequency': 'c', 'synset': 'tape.n.04', 'synonyms': ['tape_measure', 'measuring_tape'], 'id': 1062, 'def': 'measuring instrument consisting of a narrow strip (cloth or metal) marked in inches or centimeters and used for measuring lengths', 'name': 'tape_measure'}, {'frequency': 'c', 'synset': 'tapestry.n.02', 'synonyms': ['tapestry'], 'id': 1063, 'def': 'a heavy textile with a woven design; used for curtains and upholstery', 'name': 'tapestry'}, {'frequency': 'f', 'synset': 'tarpaulin.n.01', 'synonyms': ['tarp'], 
'id': 1064, 'def': 'waterproofed canvas', 'name': 'tarp'}, {'frequency': 'c', 'synset': 'tartan.n.01', 'synonyms': ['tartan', 'plaid'], 'id': 1065, 'def': 'a cloth having a crisscross design', 'name': 'tartan'}, {'frequency': 'c', 'synset': 'tassel.n.01', 'synonyms': ['tassel'], 'id': 1066, 'def': 'adornment consisting of a bunch of cords fastened at one end', 'name': 'tassel'}, {'frequency': 'c', 'synset': 'tea_bag.n.01', 'synonyms': ['tea_bag'], 'id': 1067, 'def': 'a measured amount of tea in a bag for an individual serving of tea', 'name': 'tea_bag'}, {'frequency': 'c', 'synset': 'teacup.n.02', 'synonyms': ['teacup'], 'id': 1068, 'def': 'a cup from which tea is drunk', 'name': 'teacup'}, {'frequency': 'c', 'synset': 'teakettle.n.01', 'synonyms': ['teakettle'], 'id': 1069, 'def': 'kettle for boiling water to make tea', 'name': 'teakettle'}, {'frequency': 'f', 'synset': 'teapot.n.01', 'synonyms': ['teapot'], 'id': 1070, 'def': 'pot for brewing tea; usually has a spout and handle', 'name': 'teapot'}, {'frequency': 'f', 'synset': 'teddy.n.01', 'synonyms': ['teddy_bear'], 'id': 1071, 'def': "plaything consisting of a child's toy bear (usually plush and stuffed with soft materials)", 'name': 'teddy_bear'}, {'frequency': 'f', 'synset': 'telephone.n.01', 'synonyms': ['telephone', 'phone', 'telephone_set'], 'id': 1072, 'def': 'electronic device for communicating by voice over long distances (includes wired and wireless/cell phones)', 'name': 'telephone'}, {'frequency': 'c', 'synset': 'telephone_booth.n.01', 'synonyms': ['telephone_booth', 'phone_booth', 'call_box', 'telephone_box', 'telephone_kiosk'], 'id': 1073, 'def': 'booth for using a telephone', 'name': 'telephone_booth'}, {'frequency': 'f', 'synset': 'telephone_pole.n.01', 'synonyms': ['telephone_pole', 'telegraph_pole', 'telegraph_post'], 'id': 1074, 'def': 'tall pole supporting telephone wires', 'name': 'telephone_pole'}, {'frequency': 'r', 'synset': 'telephoto_lens.n.01', 'synonyms': ['telephoto_lens', 'zoom_lens'], 'id': 1075, 'def': 'a camera lens that magnifies the image', 'name': 'telephoto_lens'}, {'frequency': 'c', 'synset': 'television_camera.n.01', 'synonyms': ['television_camera', 'tv_camera'], 'id': 1076, 'def': 'television equipment for capturing and recording video', 'name': 'television_camera'}, {'frequency': 'f', 'synset': 'television_receiver.n.01', 'synonyms': ['television_set', 'tv', 'tv_set'], 'id': 1077, 'def': 'an electronic device that receives television signals and displays them on a screen', 'name': 'television_set'}, {'frequency': 'f', 'synset': 'tennis_ball.n.01', 'synonyms': ['tennis_ball'], 'id': 1078, 'def': 'ball about the size of a fist used in playing tennis', 'name': 'tennis_ball'}, {'frequency': 'f', 'synset': 'tennis_racket.n.01', 'synonyms': ['tennis_racket'], 'id': 1079, 'def': 'a racket used to play tennis', 'name': 'tennis_racket'}, {'frequency': 'r', 'synset': 'tequila.n.01', 'synonyms': ['tequila'], 'id': 1080, 'def': 'Mexican liquor made from fermented juices of an agave plant', 'name': 'tequila'}, {'frequency': 'c', 'synset': 'thermometer.n.01', 'synonyms': ['thermometer'], 'id': 1081, 'def': 'measuring instrument for measuring temperature', 'name': 'thermometer'}, {'frequency': 'c', 'synset': 'thermos.n.01', 'synonyms': ['thermos_bottle'], 'id': 1082, 'def': 'vacuum flask that preserves temperature of hot or cold drinks', 'name': 'thermos_bottle'}, {'frequency': 'f', 'synset': 'thermostat.n.01', 'synonyms': ['thermostat'], 'id': 1083, 'def': 'a regulator for automatically regulating 
temperature by starting or stopping the supply of heat', 'name': 'thermostat'}, {'frequency': 'r', 'synset': 'thimble.n.02', 'synonyms': ['thimble'], 'id': 1084, 'def': 'a small metal cap to protect the finger while sewing; can be used as a small container', 'name': 'thimble'}, {'frequency': 'c', 'synset': 'thread.n.01', 'synonyms': ['thread', 'yarn'], 'id': 1085, 'def': 'a fine cord of twisted fibers (of cotton or silk or wool or nylon etc.) used in sewing and weaving', 'name': 'thread'}, {'frequency': 'c', 'synset': 'thumbtack.n.01', 'synonyms': ['thumbtack', 'drawing_pin', 'pushpin'], 'id': 1086, 'def': 'a tack for attaching papers to a bulletin board or drawing board', 'name': 'thumbtack'}, {'frequency': 'c', 'synset': 'tiara.n.01', 'synonyms': ['tiara'], 'id': 1087, 'def': 'a jeweled headdress worn by women on formal occasions', 'name': 'tiara'}, {'frequency': 'c', 'synset': 'tiger.n.02', 'synonyms': ['tiger'], 'id': 1088, 'def': 'large feline of forests in most of Asia having a tawny coat with black stripes', 'name': 'tiger'}, {'frequency': 'c', 'synset': 'tights.n.01', 'synonyms': ['tights_(clothing)', 'leotards'], 'id': 1089, 'def': 'skintight knit hose covering the body from the waist to the feet worn by acrobats and dancers and as stockings by women and girls', 'name': 'tights_(clothing)'}, {'frequency': 'c', 'synset': 'timer.n.01', 'synonyms': ['timer', 'stopwatch'], 'id': 1090, 'def': 'a timepiece that measures a time interval and signals its end', 'name': 'timer'}, {'frequency': 'f', 'synset': 'tinfoil.n.01', 'synonyms': ['tinfoil'], 'id': 1091, 'def': 'foil made of tin or an alloy of tin and lead', 'name': 'tinfoil'}, {'frequency': 'c', 'synset': 'tinsel.n.01', 'synonyms': ['tinsel'], 'id': 1092, 'def': 'a showy decoration that is basically valueless', 'name': 'tinsel'}, {'frequency': 'f', 'synset': 'tissue.n.02', 'synonyms': ['tissue_paper'], 'id': 1093, 'def': 'a soft thin (usually translucent) paper', 'name': 'tissue_paper'}, {'frequency': 'c', 'synset': 'toast.n.01', 'synonyms': ['toast_(food)'], 'id': 1094, 'def': 'slice of bread that has been toasted', 'name': 'toast_(food)'}, {'frequency': 'f', 'synset': 'toaster.n.02', 'synonyms': ['toaster'], 'id': 1095, 'def': 'a kitchen appliance (usually electric) for toasting bread', 'name': 'toaster'}, {'frequency': 'f', 'synset': 'toaster_oven.n.01', 'synonyms': ['toaster_oven'], 'id': 1096, 'def': 'kitchen appliance consisting of a small electric oven for toasting or warming food', 'name': 'toaster_oven'}, {'frequency': 'f', 'synset': 'toilet.n.02', 'synonyms': ['toilet'], 'id': 1097, 'def': 'a plumbing fixture for defecation and urination', 'name': 'toilet'}, {'frequency': 'f', 'synset': 'toilet_tissue.n.01', 'synonyms': ['toilet_tissue', 'toilet_paper', 'bathroom_tissue'], 'id': 1098, 'def': 'a soft thin absorbent paper for use in toilets', 'name': 'toilet_tissue'}, {'frequency': 'f', 'synset': 'tomato.n.01', 'synonyms': ['tomato'], 'id': 1099, 'def': 'mildly acid red or yellow pulpy fruit eaten as a vegetable', 'name': 'tomato'}, {'frequency': 'f', 'synset': 'tongs.n.01', 'synonyms': ['tongs'], 'id': 1100, 'def': 'any of various devices for taking hold of objects; usually have two hinged legs with handles above and pointed hooks below', 'name': 'tongs'}, {'frequency': 'c', 'synset': 'toolbox.n.01', 'synonyms': ['toolbox'], 'id': 1101, 'def': 'a box or chest or cabinet for holding hand tools', 'name': 'toolbox'}, {'frequency': 'f', 'synset': 'toothbrush.n.01', 'synonyms': ['toothbrush'], 'id': 1102, 'def': 'small brush; has 
long handle; used to clean teeth', 'name': 'toothbrush'}, {'frequency': 'f', 'synset': 'toothpaste.n.01', 'synonyms': ['toothpaste'], 'id': 1103, 'def': 'a dentifrice in the form of a paste', 'name': 'toothpaste'}, {'frequency': 'f', 'synset': 'toothpick.n.01', 'synonyms': ['toothpick'], 'id': 1104, 'def': 'pick consisting of a small strip of wood or plastic; used to pick food from between the teeth', 'name': 'toothpick'}, {'frequency': 'f', 'synset': 'top.n.09', 'synonyms': ['cover'], 'id': 1105, 'def': 'covering for a hole (especially a hole in the top of a container)', 'name': 'cover'}, {'frequency': 'c', 'synset': 'tortilla.n.01', 'synonyms': ['tortilla'], 'id': 1106, 'def': 'thin unleavened pancake made from cornmeal or wheat flour', 'name': 'tortilla'}, {'frequency': 'c', 'synset': 'tow_truck.n.01', 'synonyms': ['tow_truck'], 'id': 1107, 'def': 'a truck equipped to hoist and pull wrecked cars (or to remove cars from no-parking zones)', 'name': 'tow_truck'}, {'frequency': 'f', 'synset': 'towel.n.01', 'synonyms': ['towel'], 'id': 1108, 'def': 'a rectangular piece of absorbent cloth (or paper) for drying or wiping', 'name': 'towel'}, {'frequency': 'f', 'synset': 'towel_rack.n.01', 'synonyms': ['towel_rack', 'towel_rail', 'towel_bar'], 'id': 1109, 'def': 'a rack consisting of one or more bars on which towels can be hung', 'name': 'towel_rack'}, {'frequency': 'f', 'synset': 'toy.n.03', 'synonyms': ['toy'], 'id': 1110, 'def': 'a device regarded as providing amusement', 'name': 'toy'}, {'frequency': 'c', 'synset': 'tractor.n.01', 'synonyms': ['tractor_(farm_equipment)'], 'id': 1111, 'def': 'a wheeled vehicle with large wheels; used in farming and other applications', 'name': 'tractor_(farm_equipment)'}, {'frequency': 'f', 'synset': 'traffic_light.n.01', 'synonyms': ['traffic_light'], 'id': 1112, 'def': 'a device to control vehicle traffic often consisting of three or more lights', 'name': 'traffic_light'}, {'frequency': 'c', 'synset': 'trail_bike.n.01', 'synonyms': ['dirt_bike'], 'id': 1113, 'def': 'a lightweight motorcycle equipped with rugged tires and suspension for off-road use', 'name': 'dirt_bike'}, {'frequency': 'f', 'synset': 'trailer_truck.n.01', 'synonyms': ['trailer_truck', 'tractor_trailer', 'trucking_rig', 'articulated_lorry', 'semi_truck'], 'id': 1114, 'def': 'a truck consisting of a tractor and trailer together', 'name': 'trailer_truck'}, {'frequency': 'f', 'synset': 'train.n.01', 'synonyms': ['train_(railroad_vehicle)', 'railroad_train'], 'id': 1115, 'def': 'public or private transport provided by a line of railway cars coupled together and drawn by a locomotive', 'name': 'train_(railroad_vehicle)'}, {'frequency': 'r', 'synset': 'trampoline.n.01', 'synonyms': ['trampoline'], 'id': 1116, 'def': 'gymnastic apparatus consisting of a strong canvas sheet attached with springs to a metal frame', 'name': 'trampoline'}, {'frequency': 'f', 'synset': 'tray.n.01', 'synonyms': ['tray'], 'id': 1117, 'def': 'an open receptacle for holding or displaying or serving articles or food', 'name': 'tray'}, {'frequency': 'r', 'synset': 'trench_coat.n.01', 'synonyms': ['trench_coat'], 'id': 1118, 'def': 'a military style raincoat; belted with deep pockets', 'name': 'trench_coat'}, {'frequency': 'r', 'synset': 'triangle.n.05', 'synonyms': ['triangle_(musical_instrument)'], 'id': 1119, 'def': 'a percussion instrument consisting of a metal bar bent in the shape of an open triangle', 'name': 'triangle_(musical_instrument)'}, {'frequency': 'c', 'synset': 'tricycle.n.01', 'synonyms': ['tricycle'], 'id': 
1120, 'def': 'a vehicle with three wheels that is moved by foot pedals', 'name': 'tricycle'}, {'frequency': 'f', 'synset': 'tripod.n.01', 'synonyms': ['tripod'], 'id': 1121, 'def': 'a three-legged rack used for support', 'name': 'tripod'}, {'frequency': 'f', 'synset': 'trouser.n.01', 'synonyms': ['trousers', 'pants_(clothing)'], 'id': 1122, 'def': 'a garment extending from the waist to the knee or ankle, covering each leg separately', 'name': 'trousers'}, {'frequency': 'f', 'synset': 'truck.n.01', 'synonyms': ['truck'], 'id': 1123, 'def': 'an automotive vehicle suitable for hauling', 'name': 'truck'}, {'frequency': 'r', 'synset': 'truffle.n.03', 'synonyms': ['truffle_(chocolate)', 'chocolate_truffle'], 'id': 1124, 'def': 'creamy chocolate candy', 'name': 'truffle_(chocolate)'}, {'frequency': 'c', 'synset': 'trunk.n.02', 'synonyms': ['trunk'], 'id': 1125, 'def': 'luggage consisting of a large strong case used when traveling or for storage', 'name': 'trunk'}, {'frequency': 'r', 'synset': 'tub.n.02', 'synonyms': ['vat'], 'id': 1126, 'def': 'a large vessel for holding or storing liquids', 'name': 'vat'}, {'frequency': 'c', 'synset': 'turban.n.01', 'synonyms': ['turban'], 'id': 1127, 'def': 'a traditional headdress consisting of a long scarf wrapped around the head', 'name': 'turban'}, {'frequency': 'c', 'synset': 'turkey.n.04', 'synonyms': ['turkey_(food)'], 'id': 1128, 'def': 'flesh of large domesticated fowl usually roasted', 'name': 'turkey_(food)'}, {'frequency': 'r', 'synset': 'turnip.n.01', 'synonyms': ['turnip'], 'id': 1129, 'def': 'widely cultivated plant having a large fleshy edible white or yellow root', 'name': 'turnip'}, {'frequency': 'c', 'synset': 'turtle.n.02', 'synonyms': ['turtle'], 'id': 1130, 'def': 'any of various aquatic and land reptiles having a bony shell and flipper-like limbs for swimming', 'name': 'turtle'}, {'frequency': 'c', 'synset': 'turtleneck.n.01', 'synonyms': ['turtleneck_(clothing)', 'polo-neck'], 'id': 1131, 'def': 'a sweater or jersey with a high close-fitting collar', 'name': 'turtleneck_(clothing)'}, {'frequency': 'c', 'synset': 'typewriter.n.01', 'synonyms': ['typewriter'], 'id': 1132, 'def': 'hand-operated character printer for printing written messages one character at a time', 'name': 'typewriter'}, {'frequency': 'f', 'synset': 'umbrella.n.01', 'synonyms': ['umbrella'], 'id': 1133, 'def': 'a lightweight handheld collapsible canopy', 'name': 'umbrella'}, {'frequency': 'f', 'synset': 'underwear.n.01', 'synonyms': ['underwear', 'underclothes', 'underclothing', 'underpants'], 'id': 1134, 'def': 'undergarment worn next to the skin and under the outer garments', 'name': 'underwear'}, {'frequency': 'r', 'synset': 'unicycle.n.01', 'synonyms': ['unicycle'], 'id': 1135, 'def': 'a vehicle with a single wheel that is driven by pedals', 'name': 'unicycle'}, {'frequency': 'f', 'synset': 'urinal.n.01', 'synonyms': ['urinal'], 'id': 1136, 'def': 'a plumbing fixture (usually attached to the wall) used by men to urinate', 'name': 'urinal'}, {'frequency': 'c', 'synset': 'urn.n.01', 'synonyms': ['urn'], 'id': 1137, 'def': 'a large vase that usually has a pedestal or feet', 'name': 'urn'}, {'frequency': 'c', 'synset': 'vacuum.n.04', 'synonyms': ['vacuum_cleaner'], 'id': 1138, 'def': 'an electrical home appliance that cleans by suction', 'name': 'vacuum_cleaner'}, {'frequency': 'f', 'synset': 'vase.n.01', 'synonyms': ['vase'], 'id': 1139, 'def': 'an open jar of glass or porcelain used as an ornament or to hold flowers', 'name': 'vase'}, {'frequency': 'c', 'synset': 
'vending_machine.n.01', 'synonyms': ['vending_machine'], 'id': 1140, 'def': 'a slot machine for selling goods', 'name': 'vending_machine'}, {'frequency': 'f', 'synset': 'vent.n.01', 'synonyms': ['vent', 'blowhole', 'air_vent'], 'id': 1141, 'def': 'a hole for the escape of gas or air', 'name': 'vent'}, {'frequency': 'f', 'synset': 'vest.n.01', 'synonyms': ['vest', 'waistcoat'], 'id': 1142, 'def': "a man's sleeveless garment worn underneath a coat", 'name': 'vest'}, {'frequency': 'c', 'synset': 'videotape.n.01', 'synonyms': ['videotape'], 'id': 1143, 'def': 'a video recording made on magnetic tape', 'name': 'videotape'}, {'frequency': 'r', 'synset': 'vinegar.n.01', 'synonyms': ['vinegar'], 'id': 1144, 'def': 'sour-tasting liquid produced usually by oxidation of the alcohol in wine or cider and used as a condiment or food preservative', 'name': 'vinegar'}, {'frequency': 'r', 'synset': 'violin.n.01', 'synonyms': ['violin', 'fiddle'], 'id': 1145, 'def': 'bowed stringed instrument that is the highest member of the violin family', 'name': 'violin'}, {'frequency': 'r', 'synset': 'vodka.n.01', 'synonyms': ['vodka'], 'id': 1146, 'def': 'unaged colorless liquor originating in Russia', 'name': 'vodka'}, {'frequency': 'c', 'synset': 'volleyball.n.02', 'synonyms': ['volleyball'], 'id': 1147, 'def': 'an inflated ball used in playing volleyball', 'name': 'volleyball'}, {'frequency': 'r', 'synset': 'vulture.n.01', 'synonyms': ['vulture'], 'id': 1148, 'def': 'any of various large birds of prey having naked heads and weak claws and feeding chiefly on carrion', 'name': 'vulture'}, {'frequency': 'c', 'synset': 'waffle.n.01', 'synonyms': ['waffle'], 'id': 1149, 'def': 'pancake batter baked in a waffle iron', 'name': 'waffle'}, {'frequency': 'r', 'synset': 'waffle_iron.n.01', 'synonyms': ['waffle_iron'], 'id': 1150, 'def': 'a kitchen appliance for baking waffles', 'name': 'waffle_iron'}, {'frequency': 'c', 'synset': 'wagon.n.01', 'synonyms': ['wagon'], 'id': 1151, 'def': 'any of various kinds of wheeled vehicles drawn by an animal or a tractor', 'name': 'wagon'}, {'frequency': 'c', 'synset': 'wagon_wheel.n.01', 'synonyms': ['wagon_wheel'], 'id': 1152, 'def': 'a wheel of a wagon', 'name': 'wagon_wheel'}, {'frequency': 'c', 'synset': 'walking_stick.n.01', 'synonyms': ['walking_stick'], 'id': 1153, 'def': 'a stick carried in the hand for support in walking', 'name': 'walking_stick'}, {'frequency': 'c', 'synset': 'wall_clock.n.01', 'synonyms': ['wall_clock'], 'id': 1154, 'def': 'a clock mounted on a wall', 'name': 'wall_clock'}, {'frequency': 'f', 'synset': 'wall_socket.n.01', 'synonyms': ['wall_socket', 'wall_plug', 'electric_outlet', 'electrical_outlet', 'outlet', 'electric_receptacle'], 'id': 1155, 'def': 'receptacle providing a place in a wiring system where current can be taken to run electrical devices', 'name': 'wall_socket'}, {'frequency': 'f', 'synset': 'wallet.n.01', 'synonyms': ['wallet', 'billfold'], 'id': 1156, 'def': 'a pocket-size case for holding papers and paper money', 'name': 'wallet'}, {'frequency': 'r', 'synset': 'walrus.n.01', 'synonyms': ['walrus'], 'id': 1157, 'def': 'either of two large northern marine mammals having ivory tusks and tough hide over thick blubber', 'name': 'walrus'}, {'frequency': 'r', 'synset': 'wardrobe.n.01', 'synonyms': ['wardrobe'], 'id': 1158, 'def': 'a tall piece of furniture that provides storage space for clothes; has a door and rails or hooks for hanging clothes', 'name': 'wardrobe'}, {'frequency': 'r', 'synset': 'washbasin.n.01', 'synonyms': ['washbasin', 
'basin_(for_washing)', 'washbowl', 'washstand', 'handbasin'], 'id': 1159, 'def': 'a bathroom sink that is permanently installed and connected to a water supply and drainpipe; where you can wash your hands and face', 'name': 'washbasin'}, {'frequency': 'c', 'synset': 'washer.n.03', 'synonyms': ['automatic_washer', 'washing_machine'], 'id': 1160, 'def': 'a home appliance for washing clothes and linens automatically', 'name': 'automatic_washer'}, {'frequency': 'f', 'synset': 'watch.n.01', 'synonyms': ['watch', 'wristwatch'], 'id': 1161, 'def': 'a small, portable timepiece', 'name': 'watch'}, {'frequency': 'f', 'synset': 'water_bottle.n.01', 'synonyms': ['water_bottle'], 'id': 1162, 'def': 'a bottle for holding water', 'name': 'water_bottle'}, {'frequency': 'c', 'synset': 'water_cooler.n.01', 'synonyms': ['water_cooler'], 'id': 1163, 'def': 'a device for cooling and dispensing drinking water', 'name': 'water_cooler'}, {'frequency': 'c', 'synset': 'water_faucet.n.01', 'synonyms': ['water_faucet', 'water_tap', 'tap_(water_faucet)'], 'id': 1164, 'def': 'a faucet for drawing water from a pipe or cask', 'name': 'water_faucet'}, {'frequency': 'r', 'synset': 'water_heater.n.01', 'synonyms': ['water_heater', 'hot-water_heater'], 'id': 1165, 'def': 'a heater and storage tank to supply heated water', 'name': 'water_heater'}, {'frequency': 'c', 'synset': 'water_jug.n.01', 'synonyms': ['water_jug'], 'id': 1166, 'def': 'a jug that holds water', 'name': 'water_jug'}, {'frequency': 'r', 'synset': 'water_pistol.n.01', 'synonyms': ['water_gun', 'squirt_gun'], 'id': 1167, 'def': 'plaything consisting of a toy pistol that squirts water', 'name': 'water_gun'}, {'frequency': 'c', 'synset': 'water_scooter.n.01', 'synonyms': ['water_scooter', 'sea_scooter', 'jet_ski'], 'id': 1168, 'def': 'a motorboat resembling a motor scooter (NOT A SURFBOARD OR WATER SKI)', 'name': 'water_scooter'}, {'frequency': 'c', 'synset': 'water_ski.n.01', 'synonyms': ['water_ski'], 'id': 1169, 'def': 'broad ski for skimming over water towed by a speedboat (DO NOT MARK WATER)', 'name': 'water_ski'}, {'frequency': 'c', 'synset': 'water_tower.n.01', 'synonyms': ['water_tower'], 'id': 1170, 'def': 'a large reservoir for water', 'name': 'water_tower'}, {'frequency': 'c', 'synset': 'watering_can.n.01', 'synonyms': ['watering_can'], 'id': 1171, 'def': 'a container with a handle and a spout with a perforated nozzle; used to sprinkle water over plants', 'name': 'watering_can'}, {'frequency': 'f', 'synset': 'watermelon.n.02', 'synonyms': ['watermelon'], 'id': 1172, 'def': 'large oblong or roundish melon with a hard green rind and sweet watery red or occasionally yellowish pulp', 'name': 'watermelon'}, {'frequency': 'f', 'synset': 'weathervane.n.01', 'synonyms': ['weathervane', 'vane_(weathervane)', 'wind_vane'], 'id': 1173, 'def': 'mechanical device attached to an elevated structure; rotates freely to show the direction of the wind', 'name': 'weathervane'}, {'frequency': 'c', 'synset': 'webcam.n.01', 'synonyms': ['webcam'], 'id': 1174, 'def': 'a digital camera designed to take digital photographs and transmit them over the internet', 'name': 'webcam'}, {'frequency': 'c', 'synset': 'wedding_cake.n.01', 'synonyms': ['wedding_cake', 'bridecake'], 'id': 1175, 'def': 'a rich cake with two or more tiers and covered with frosting and decorations; served at a wedding reception', 'name': 'wedding_cake'}, {'frequency': 'c', 'synset': 'wedding_ring.n.01', 'synonyms': ['wedding_ring', 'wedding_band'], 'id': 1176, 'def': 'a ring given to the bride and/or groom at 
the wedding', 'name': 'wedding_ring'}, {'frequency': 'f', 'synset': 'wet_suit.n.01', 'synonyms': ['wet_suit'], 'id': 1177, 'def': 'a close-fitting garment made of a permeable material; worn in cold water to retain body heat', 'name': 'wet_suit'}, {'frequency': 'f', 'synset': 'wheel.n.01', 'synonyms': ['wheel'], 'id': 1178, 'def': 'a circular frame with spokes (or a solid disc) that can rotate on a shaft or axle', 'name': 'wheel'}, {'frequency': 'c', 'synset': 'wheelchair.n.01', 'synonyms': ['wheelchair'], 'id': 1179, 'def': 'a movable chair mounted on large wheels', 'name': 'wheelchair'}, {'frequency': 'c', 'synset': 'whipped_cream.n.01', 'synonyms': ['whipped_cream'], 'id': 1180, 'def': 'cream that has been beaten until light and fluffy', 'name': 'whipped_cream'}, {'frequency': 'c', 'synset': 'whistle.n.03', 'synonyms': ['whistle'], 'id': 1181, 'def': 'a small wind instrument that produces a whistling sound by blowing into it', 'name': 'whistle'}, {'frequency': 'c', 'synset': 'wig.n.01', 'synonyms': ['wig'], 'id': 1182, 'def': 'hairpiece covering the head and made of real or synthetic hair', 'name': 'wig'}, {'frequency': 'c', 'synset': 'wind_chime.n.01', 'synonyms': ['wind_chime'], 'id': 1183, 'def': 'a decorative arrangement of pieces of metal or glass or pottery that hang together loosely so the wind can cause them to tinkle', 'name': 'wind_chime'}, {'frequency': 'c', 'synset': 'windmill.n.01', 'synonyms': ['windmill'], 'id': 1184, 'def': 'A mill or turbine that is powered by wind', 'name': 'windmill'}, {'frequency': 'c', 'synset': 'window_box.n.01', 'synonyms': ['window_box_(for_plants)'], 'id': 1185, 'def': 'a container for growing plants on a windowsill', 'name': 'window_box_(for_plants)'}, {'frequency': 'f', 'synset': 'windshield_wiper.n.01', 'synonyms': ['windshield_wiper', 'windscreen_wiper', 'wiper_(for_windshield/screen)'], 'id': 1186, 'def': 'a mechanical device that cleans the windshield', 'name': 'windshield_wiper'}, {'frequency': 'c', 'synset': 'windsock.n.01', 'synonyms': ['windsock', 'air_sock', 'air-sleeve', 'wind_sleeve', 'wind_cone'], 'id': 1187, 'def': 'a truncated cloth cone mounted on a mast/pole; shows wind direction', 'name': 'windsock'}, {'frequency': 'f', 'synset': 'wine_bottle.n.01', 'synonyms': ['wine_bottle'], 'id': 1188, 'def': 'a bottle for holding wine', 'name': 'wine_bottle'}, {'frequency': 'c', 'synset': 'wine_bucket.n.01', 'synonyms': ['wine_bucket', 'wine_cooler'], 'id': 1189, 'def': 'a bucket of ice used to chill a bottle of wine', 'name': 'wine_bucket'}, {'frequency': 'f', 'synset': 'wineglass.n.01', 'synonyms': ['wineglass'], 'id': 1190, 'def': 'a glass that has a stem and in which wine is served', 'name': 'wineglass'}, {'frequency': 'f', 'synset': 'winker.n.02', 'synonyms': ['blinder_(for_horses)'], 'id': 1191, 'def': 'blinds that prevent a horse from seeing something on either side', 'name': 'blinder_(for_horses)'}, {'frequency': 'c', 'synset': 'wok.n.01', 'synonyms': ['wok'], 'id': 1192, 'def': 'pan with a convex bottom; used for frying in Chinese cooking', 'name': 'wok'}, {'frequency': 'r', 'synset': 'wolf.n.01', 'synonyms': ['wolf'], 'id': 1193, 'def': 'a wild carnivorous mammal of the dog family, living and hunting in packs', 'name': 'wolf'}, {'frequency': 'c', 'synset': 'wooden_spoon.n.02', 'synonyms': ['wooden_spoon'], 'id': 1194, 'def': 'a spoon made of wood', 'name': 'wooden_spoon'}, {'frequency': 'c', 'synset': 'wreath.n.01', 'synonyms': ['wreath'], 'id': 1195, 'def': 'an arrangement of flowers, leaves, or stems fastened in a ring', 
'name': 'wreath'}, {'frequency': 'c', 'synset': 'wrench.n.03', 'synonyms': ['wrench', 'spanner'], 'id': 1196, 'def': 'a hand tool that is used to hold or twist a nut or bolt', 'name': 'wrench'}, {'frequency': 'f', 'synset': 'wristband.n.01', 'synonyms': ['wristband'], 'id': 1197, 'def': 'band consisting of a part of a sleeve that covers the wrist', 'name': 'wristband'}, {'frequency': 'f', 'synset': 'wristlet.n.01', 'synonyms': ['wristlet', 'wrist_band'], 'id': 1198, 'def': 'a band or bracelet worn around the wrist', 'name': 'wristlet'}, {'frequency': 'c', 'synset': 'yacht.n.01', 'synonyms': ['yacht'], 'id': 1199, 'def': 'an expensive vessel propelled by sail or power and used for cruising or racing', 'name': 'yacht'}, {'frequency': 'c', 'synset': 'yogurt.n.01', 'synonyms': ['yogurt', 'yoghurt', 'yoghourt'], 'id': 1200, 'def': 'a custard-like food made from curdled milk', 'name': 'yogurt'}, {'frequency': 'c', 'synset': 'yoke.n.07', 'synonyms': ['yoke_(animal_equipment)'], 'id': 1201, 'def': 'gear joining two animals at the neck; NOT egg yolk', 'name': 'yoke_(animal_equipment)'}, {'frequency': 'f', 'synset': 'zebra.n.01', 'synonyms': ['zebra'], 'id': 1202, 'def': 'any of several fleet black-and-white striped African equines', 'name': 'zebra'}, {'frequency': 'c', 'synset': 'zucchini.n.02', 'synonyms': ['zucchini', 'courgette'], 'id': 1203, 'def': 'small cucumber-shaped vegetable marrow; typically dark green', 'name': 'zucchini'}] # noqa -# fmt: on diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py deleted file mode 100755 index dbbf82cb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/pascal_voc.py +++ /dev/null @@ -1,82 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import numpy as np -import os -import xml.etree.ElementTree as ET -from typing import List, Tuple, Union - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.structures import BoxMode -from detectron2.utils.file_io import PathManager - -__all__ = ["load_voc_instances", "register_pascal_voc"] - - -# fmt: off -CLASS_NAMES = ( - "aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car", "cat", - "chair", "cow", "diningtable", "dog", "horse", "motorbike", "person", - "pottedplant", "sheep", "sofa", "train", "tvmonitor" -) -# fmt: on - - -def load_voc_instances(dirname: str, split: str, class_names: Union[List[str], Tuple[str, ...]]): - """ - Load Pascal VOC detection annotations to Detectron2 format. - - Args: - dirname: Contain "Annotations", "ImageSets", "JPEGImages" - split (str): one of "train", "test", "val", "trainval" - class_names: list or tuple of class names - """ - with PathManager.open(os.path.join(dirname, "ImageSets", "Main", split + ".txt")) as f: - fileids = np.loadtxt(f, dtype=np.str) - - # Needs to read many small annotation files. 
Makes sense at local - annotation_dirname = PathManager.get_local_path(os.path.join(dirname, "Annotations/")) - dicts = [] - for fileid in fileids: - anno_file = os.path.join(annotation_dirname, fileid + ".xml") - jpeg_file = os.path.join(dirname, "JPEGImages", fileid + ".jpg") - - with PathManager.open(anno_file) as f: - tree = ET.parse(f) - - r = { - "file_name": jpeg_file, - "image_id": fileid, - "height": int(tree.findall("./size/height")[0].text), - "width": int(tree.findall("./size/width")[0].text), - } - instances = [] - - for obj in tree.findall("object"): - cls = obj.find("name").text - # We include "difficult" samples in training. - # Based on limited experiments, they don't hurt accuracy. - # difficult = int(obj.find("difficult").text) - # if difficult == 1: - # continue - bbox = obj.find("bndbox") - bbox = [float(bbox.find(x).text) for x in ["xmin", "ymin", "xmax", "ymax"]] - # Original annotations are integers in the range [1, W or H] - # Assuming they mean 1-based pixel indices (inclusive), - # a box with annotation (xmin=1, xmax=W) covers the whole image. - # In coordinate space this is represented by (xmin=0, xmax=W) - bbox[0] -= 1.0 - bbox[1] -= 1.0 - instances.append( - {"category_id": class_names.index(cls), "bbox": bbox, "bbox_mode": BoxMode.XYXY_ABS} - ) - r["annotations"] = instances - dicts.append(r) - return dicts - - -def register_pascal_voc(name, dirname, split, year, class_names=CLASS_NAMES): - DatasetCatalog.register(name, lambda: load_voc_instances(dirname, split, class_names)) - MetadataCatalog.get(name).set( - thing_classes=list(class_names), dirname=dirname, year=year, split=split - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py deleted file mode 100755 index e564438d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/datasets/register_coco.py +++ /dev/null @@ -1,3 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .coco import register_coco_instances # noqa -from .coco_panoptic import register_coco_panoptic_separated # noqa diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py deleted file mode 100755 index 2707eb43..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/detection_utils.py +++ /dev/null @@ -1,623 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -Common data processing utilities that are used in a -typical object detection data pipeline. -""" -import logging -import numpy as np -from typing import List, Union -import pycocotools.mask as mask_util -import torch -from PIL import Image - -from detectron2.structures import ( - BitMasks, - Boxes, - BoxMode, - Instances, - Keypoints, - PolygonMasks, - RotatedBoxes, - polygons_to_bitmask, -) -from detectron2.utils.file_io import PathManager - -from . 
import transforms as T -from .catalog import MetadataCatalog - -__all__ = [ - "SizeMismatchError", - "convert_image_to_rgb", - "check_image_size", - "transform_proposals", - "transform_instance_annotations", - "annotations_to_instances", - "annotations_to_instances_rotated", - "build_augmentation", - "build_transform_gen", - "create_keypoint_hflip_indices", - "filter_empty_instances", - "read_image", -] - - -class SizeMismatchError(ValueError): - """ - When loaded image has difference width/height compared with annotation. - """ - - -# https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 -_M_RGB2YUV = [[0.299, 0.587, 0.114], [-0.14713, -0.28886, 0.436], [0.615, -0.51499, -0.10001]] -_M_YUV2RGB = [[1.0, 0.0, 1.13983], [1.0, -0.39465, -0.58060], [1.0, 2.03211, 0.0]] - -# https://www.exiv2.org/tags.html -_EXIF_ORIENT = 274 # exif 'Orientation' tag - - -def convert_PIL_to_numpy(image, format): - """ - Convert PIL image to numpy array of target format. - - Args: - image (PIL.Image): a PIL image - format (str): the format of output image - - Returns: - (np.ndarray): also see `read_image` - """ - if format is not None: - # PIL only supports RGB, so convert to RGB and flip channels over below - conversion_format = format - if format in ["BGR", "YUV-BT.601"]: - conversion_format = "RGB" - image = image.convert(conversion_format) - image = np.asarray(image) - # PIL squeezes out the channel dimension for "L", so make it HWC - if format == "L": - image = np.expand_dims(image, -1) - - # handle formats not supported by PIL - elif format == "BGR": - # flip channels if needed - image = image[:, :, ::-1] - elif format == "YUV-BT.601": - image = image / 255.0 - image = np.dot(image, np.array(_M_RGB2YUV).T) - - return image - - -def convert_image_to_rgb(image, format): - """ - Convert an image from given format to RGB. - - Args: - image (np.ndarray or Tensor): an HWC image - format (str): the format of input image, also see `read_image` - - Returns: - (np.ndarray): (H,W,3) RGB image in 0-255 range, can be either float or uint8 - """ - if isinstance(image, torch.Tensor): - image = image.cpu().numpy() - if format == "BGR": - image = image[:, :, [2, 1, 0]] - elif format == "YUV-BT.601": - image = np.dot(image, np.array(_M_YUV2RGB).T) - image = image * 255.0 - else: - if format == "L": - image = image[:, :, 0] - image = image.astype(np.uint8) - image = np.asarray(Image.fromarray(image, mode=format).convert("RGB")) - return image - - -def _apply_exif_orientation(image): - """ - Applies the exif orientation correctly. - - This code exists per the bug: - https://github.com/python-pillow/Pillow/issues/3973 - with the function `ImageOps.exif_transpose`. 
The Pillow source raises errors with - various methods, especially `tobytes` - - Function based on: - https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 - https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 - - Args: - image (PIL.Image): a PIL image - - Returns: - (PIL.Image): the PIL image with exif orientation applied, if applicable - """ - if not hasattr(image, "getexif"): - return image - - try: - exif = image.getexif() - except Exception: # https://github.com/facebookresearch/detectron2/issues/1885 - exif = None - - if exif is None: - return image - - orientation = exif.get(_EXIF_ORIENT) - - method = { - 2: Image.FLIP_LEFT_RIGHT, - 3: Image.ROTATE_180, - 4: Image.FLIP_TOP_BOTTOM, - 5: Image.TRANSPOSE, - 6: Image.ROTATE_270, - 7: Image.TRANSVERSE, - 8: Image.ROTATE_90, - }.get(orientation) - - if method is not None: - return image.transpose(method) - return image - - -def read_image(file_name, format=None): - """ - Read an image into the given format. - Will apply rotation and flipping if the image has such exif information. - - Args: - file_name (str): image file path - format (str): one of the supported image modes in PIL, or "BGR" or "YUV-BT.601". - - Returns: - image (np.ndarray): - an HWC image in the given format, which is 0-255, uint8 for - supported image modes in PIL or "BGR"; float (0-1 for Y) for YUV-BT.601. - """ - with PathManager.open(file_name, "rb") as f: - image = Image.open(f) - - # work around this bug: https://github.com/python-pillow/Pillow/issues/3973 - image = _apply_exif_orientation(image) - return convert_PIL_to_numpy(image, format) - - -def check_image_size(dataset_dict, image): - """ - Raise an error if the image does not match the size specified in the dict. - """ - if "width" in dataset_dict or "height" in dataset_dict: - image_wh = (image.shape[1], image.shape[0]) - expected_wh = (dataset_dict["width"], dataset_dict["height"]) - if not image_wh == expected_wh: - raise SizeMismatchError( - "Mismatched image shape{}, got {}, expect {}.".format( - " for image " + dataset_dict["file_name"] - if "file_name" in dataset_dict - else "", - image_wh, - expected_wh, - ) - + " Please check the width/height in your annotation." - ) - - # To ensure bbox always remap to original image size - if "width" not in dataset_dict: - dataset_dict["width"] = image.shape[1] - if "height" not in dataset_dict: - dataset_dict["height"] = image.shape[0] - - -def transform_proposals(dataset_dict, image_shape, transforms, *, proposal_topk, min_box_size=0): - """ - Apply transformations to the proposals in dataset_dict, if any. - - Args: - dataset_dict (dict): a dict read from the dataset, possibly - contains fields "proposal_boxes", "proposal_objectness_logits", "proposal_bbox_mode" - image_shape (tuple): height, width - transforms (TransformList): - proposal_topk (int): only keep top-K scoring proposals - min_box_size (int): proposals with either side smaller than this - threshold are removed - - The input dict is modified in-place, with abovementioned keys removed. A new - key "proposals" will be added. Its value is an `Instances` - object which contains the transformed proposals in its field - "proposal_boxes" and "objectness_logits". 
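The proposal handling described here reduces to: clip boxes, drop near-empty ones, keep the top-k by objectness. A minimal torch-only sketch with made-up boxes and a hypothetical min_box_size of 1.0 (not detectron2's actual Boxes class):

import torch

boxes = torch.tensor([[-5., 10., 50., 60.], [20., 20., 20.5, 20.5]])  # XYXY_ABS
logits = torch.tensor([0.9, 2.0])
boxes = boxes.clamp(min=0)                      # clip (lower bound only, for brevity)
wh = boxes[:, 2:] - boxes[:, :2]
keep = (wh > 1.0).all(dim=1)                    # min_box_size = 1.0 drops the tiny box
boxes, logits = boxes[keep], logits[keep]
order = logits.argsort(descending=True)[:1]     # proposal_topk = 1
print(boxes[order], logits[order])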
- """ - if "proposal_boxes" in dataset_dict: - # Transform proposal boxes - boxes = transforms.apply_box( - BoxMode.convert( - dataset_dict.pop("proposal_boxes"), - dataset_dict.pop("proposal_bbox_mode"), - BoxMode.XYXY_ABS, - ) - ) - boxes = Boxes(boxes) - objectness_logits = torch.as_tensor( - dataset_dict.pop("proposal_objectness_logits").astype("float32") - ) - - boxes.clip(image_shape) - keep = boxes.nonempty(threshold=min_box_size) - boxes = boxes[keep] - objectness_logits = objectness_logits[keep] - - proposals = Instances(image_shape) - proposals.proposal_boxes = boxes[:proposal_topk] - proposals.objectness_logits = objectness_logits[:proposal_topk] - dataset_dict["proposals"] = proposals - - -def transform_instance_annotations( - annotation, transforms, image_size, *, keypoint_hflip_indices=None -): - """ - Apply transforms to box, segmentation and keypoints annotations of a single instance. - - It will use `transforms.apply_box` for the box, and - `transforms.apply_coords` for segmentation polygons & keypoints. - If you need anything more specially designed for each data structure, - you'll need to implement your own version of this function or the transforms. - - Args: - annotation (dict): dict of instance annotations for a single instance. - It will be modified in-place. - transforms (TransformList or list[Transform]): - image_size (tuple): the height, width of the transformed image - keypoint_hflip_indices (ndarray[int]): see `create_keypoint_hflip_indices`. - - Returns: - dict: - the same input dict with fields "bbox", "segmentation", "keypoints" - transformed according to `transforms`. - The "bbox_mode" field will be set to XYXY_ABS. - """ - if isinstance(transforms, (tuple, list)): - transforms = T.TransformList(transforms) - # bbox is 1d (per-instance bounding box) - bbox = BoxMode.convert(annotation["bbox"], annotation["bbox_mode"], BoxMode.XYXY_ABS) - # clip transformed bbox to image size - bbox = transforms.apply_box(np.array([bbox]))[0].clip(min=0) - annotation["bbox"] = np.minimum(bbox, list(image_size + image_size)[::-1]) - annotation["bbox_mode"] = BoxMode.XYXY_ABS - - if "segmentation" in annotation: - # each instance contains 1 or more polygons - segm = annotation["segmentation"] - if isinstance(segm, list): - # polygons - polygons = [np.asarray(p).reshape(-1, 2) for p in segm] - annotation["segmentation"] = [ - p.reshape(-1) for p in transforms.apply_polygons(polygons) - ] - elif isinstance(segm, dict): - # RLE - mask = mask_util.decode(segm) - mask = transforms.apply_segmentation(mask) - assert tuple(mask.shape[:2]) == image_size - annotation["segmentation"] = mask - else: - raise ValueError( - "Cannot transform segmentation of type '{}'!" - "Supported types are: polygons as list[list[float] or ndarray]," - " COCO-style RLE as a dict.".format(type(segm)) - ) - - if "keypoints" in annotation: - keypoints = transform_keypoint_annotations( - annotation["keypoints"], transforms, image_size, keypoint_hflip_indices - ) - annotation["keypoints"] = keypoints - - return annotation - - -def transform_keypoint_annotations(keypoints, transforms, image_size, keypoint_hflip_indices=None): - """ - Transform keypoint annotations of an image. - If a keypoint is transformed out of image boundary, it will be marked "unlabeled" (visibility=0) - - Args: - keypoints (list[float]): Nx3 float in Detectron2's Dataset format. - Each point is represented by (x, y, visibility). 
- transforms (TransformList): - image_size (tuple): the height, width of the transformed image - keypoint_hflip_indices (ndarray[int]): see `create_keypoint_hflip_indices`. - When `transforms` includes horizontal flip, will use the index - mapping to flip keypoints. - """ - # (N*3,) -> (N, 3) - keypoints = np.asarray(keypoints, dtype="float64").reshape(-1, 3) - keypoints_xy = transforms.apply_coords(keypoints[:, :2]) - - # Set all out-of-boundary points to "unlabeled" - inside = (keypoints_xy >= np.array([0, 0])) & (keypoints_xy <= np.array(image_size[::-1])) - inside = inside.all(axis=1) - keypoints[:, :2] = keypoints_xy - keypoints[:, 2][~inside] = 0 - - # This assumes that HorizFlipTransform is the only one that does flip - do_hflip = sum(isinstance(t, T.HFlipTransform) for t in transforms.transforms) % 2 == 1 - - # Alternative way: check if probe points was horizontally flipped. - # probe = np.asarray([[0.0, 0.0], [image_width, 0.0]]) - # probe_aug = transforms.apply_coords(probe.copy()) - # do_hflip = np.sign(probe[1][0] - probe[0][0]) != np.sign(probe_aug[1][0] - probe_aug[0][0]) # noqa - - # If flipped, swap each keypoint with its opposite-handed equivalent - if do_hflip: - if keypoint_hflip_indices is None: - raise ValueError("Cannot flip keypoints without providing flip indices!") - if len(keypoints) != len(keypoint_hflip_indices): - raise ValueError( - "Keypoint data has {} points, but metadata " - "contains {} points!".format(len(keypoints), len(keypoint_hflip_indices)) - ) - keypoints = keypoints[np.asarray(keypoint_hflip_indices, dtype=np.int32), :] - - # Maintain COCO convention that if visibility == 0 (unlabeled), then x, y = 0 - keypoints[keypoints[:, 2] == 0] = 0 - return keypoints - - -def annotations_to_instances(annos, image_size, mask_format="polygon"): - """ - Create an :class:`Instances` object used by the models, - from instance annotations in the dataset dict. - - Args: - annos (list[dict]): a list of instance annotations in one image, each - element for one instance. - image_size (tuple): height, width - - Returns: - Instances: - It will contain fields "gt_boxes", "gt_classes", - "gt_masks", "gt_keypoints", if they can be obtained from `annos`. - This is the format that builtin models expect. - """ - boxes = ( - np.stack( - [BoxMode.convert(obj["bbox"], obj["bbox_mode"], BoxMode.XYXY_ABS) for obj in annos] - ) - if len(annos) - else np.zeros((0, 4)) - ) - target = Instances(image_size) - target.gt_boxes = Boxes(boxes) - - classes = [int(obj["category_id"]) for obj in annos] - classes = torch.tensor(classes, dtype=torch.int64) - target.gt_classes = classes - - if len(annos) and "segmentation" in annos[0]: - segms = [obj["segmentation"] for obj in annos] - if mask_format == "polygon": - try: - masks = PolygonMasks(segms) - except ValueError as e: - raise ValueError( - "Failed to use mask_format=='polygon' from the given annotations!" - ) from e - else: - assert mask_format == "bitmask", mask_format - masks = [] - for segm in segms: - if isinstance(segm, list): - # polygon - masks.append(polygons_to_bitmask(segm, *image_size)) - elif isinstance(segm, dict): - # COCO RLE - masks.append(mask_util.decode(segm)) - elif isinstance(segm, np.ndarray): - assert segm.ndim == 2, "Expect segmentation of 2 dimensions, got {}.".format( - segm.ndim - ) - # mask array - masks.append(segm) - else: - raise ValueError( - "Cannot convert segmentation of type '{}' to BitMasks!" 
- "Supported types are: polygons as list[list[float] or ndarray]," - " COCO-style RLE as a dict, or a binary segmentation mask " - " in a 2D numpy array of shape HxW.".format(type(segm)) - ) - # torch.from_numpy does not support array with negative stride. - masks = BitMasks( - torch.stack([torch.from_numpy(np.ascontiguousarray(x)) for x in masks]) - ) - target.gt_masks = masks - - if len(annos) and "keypoints" in annos[0]: - kpts = [obj.get("keypoints", []) for obj in annos] - target.gt_keypoints = Keypoints(kpts) - - return target - - -def annotations_to_instances_rotated(annos, image_size): - """ - Create an :class:`Instances` object used by the models, - from instance annotations in the dataset dict. - Compared to `annotations_to_instances`, this function is for rotated boxes only - - Args: - annos (list[dict]): a list of instance annotations in one image, each - element for one instance. - image_size (tuple): height, width - - Returns: - Instances: - Containing fields "gt_boxes", "gt_classes", - if they can be obtained from `annos`. - This is the format that builtin models expect. - """ - boxes = [obj["bbox"] for obj in annos] - target = Instances(image_size) - boxes = target.gt_boxes = RotatedBoxes(boxes) - boxes.clip(image_size) - - classes = [obj["category_id"] for obj in annos] - classes = torch.tensor(classes, dtype=torch.int64) - target.gt_classes = classes - - return target - - -def filter_empty_instances( - instances, by_box=True, by_mask=True, box_threshold=1e-5, return_mask=False -): - """ - Filter out empty instances in an `Instances` object. - - Args: - instances (Instances): - by_box (bool): whether to filter out instances with empty boxes - by_mask (bool): whether to filter out instances with empty masks - box_threshold (float): minimum width and height to be considered non-empty - return_mask (bool): whether to return boolean mask of filtered instances - - Returns: - Instances: the filtered instances. - tensor[bool], optional: boolean mask of filtered instances - """ - assert by_box or by_mask - r = [] - if by_box: - r.append(instances.gt_boxes.nonempty(threshold=box_threshold)) - if instances.has("gt_masks") and by_mask: - r.append(instances.gt_masks.nonempty()) - - # TODO: can also filter visible keypoints - - if not r: - return instances - m = r[0] - for x in r[1:]: - m = m & x - if return_mask: - return instances[m], m - return instances[m] - - -def create_keypoint_hflip_indices(dataset_names: Union[str, List[str]]) -> List[int]: - """ - Args: - dataset_names: list of dataset names - - Returns: - list[int]: a list of size=#keypoints, storing the - horizontally-flipped keypoint indices. - """ - if isinstance(dataset_names, str): - dataset_names = [dataset_names] - - check_metadata_consistency("keypoint_names", dataset_names) - check_metadata_consistency("keypoint_flip_map", dataset_names) - - meta = MetadataCatalog.get(dataset_names[0]) - names = meta.keypoint_names - # TODO flip -> hflip - flip_map = dict(meta.keypoint_flip_map) - flip_map.update({v: k for k, v in flip_map.items()}) - flipped_names = [i if i not in flip_map else flip_map[i] for i in names] - flip_indices = [names.index(i) for i in flipped_names] - return flip_indices - - -def gen_crop_transform_with_instance(crop_size, image_size, instance): - """ - Generate a CropTransform so that the cropping region contains - the center of the given instance. 
- - Args: - crop_size (tuple): h, w in pixels - image_size (tuple): h, w - instance (dict): an annotation dict of one instance, in Detectron2's - dataset format. - """ - crop_size = np.asarray(crop_size, dtype=np.int32) - bbox = BoxMode.convert(instance["bbox"], instance["bbox_mode"], BoxMode.XYXY_ABS) - center_yx = (bbox[1] + bbox[3]) * 0.5, (bbox[0] + bbox[2]) * 0.5 - assert ( - image_size[0] >= center_yx[0] and image_size[1] >= center_yx[1] - ), "The annotation bounding box is outside of the image!" - assert ( - image_size[0] >= crop_size[0] and image_size[1] >= crop_size[1] - ), "Crop size is larger than image size!" - - min_yx = np.maximum(np.floor(center_yx).astype(np.int32) - crop_size, 0) - max_yx = np.maximum(np.asarray(image_size, dtype=np.int32) - crop_size, 0) - max_yx = np.minimum(max_yx, np.ceil(center_yx).astype(np.int32)) - - y0 = np.random.randint(min_yx[0], max_yx[0] + 1) - x0 = np.random.randint(min_yx[1], max_yx[1] + 1) - return T.CropTransform(x0, y0, crop_size[1], crop_size[0]) - - -def check_metadata_consistency(key, dataset_names): - """ - Check that the datasets have consistent metadata. - - Args: - key (str): a metadata key - dataset_names (list[str]): a list of dataset names - - Raises: - AttributeError: if the key does not exist in the metadata - ValueError: if the given datasets do not have the same metadata values defined by key - """ - if len(dataset_names) == 0: - return - logger = logging.getLogger(__name__) - entries_per_dataset = [getattr(MetadataCatalog.get(d), key) for d in dataset_names] - for idx, entry in enumerate(entries_per_dataset): - if entry != entries_per_dataset[0]: - logger.error( - "Metadata '{}' for dataset '{}' is '{}'".format(key, dataset_names[idx], str(entry)) - ) - logger.error( - "Metadata '{}' for dataset '{}' is '{}'".format( - key, dataset_names[0], str(entries_per_dataset[0]) - ) - ) - raise ValueError("Datasets have different metadata '{}'!".format(key)) - - -def build_augmentation(cfg, is_train): - """ - Create a list of default :class:`Augmentation` from config. - Now it includes resizing and flipping. - - Returns: - list[Augmentation] - """ - if is_train: - min_size = cfg.INPUT.MIN_SIZE_TRAIN - max_size = cfg.INPUT.MAX_SIZE_TRAIN - sample_style = cfg.INPUT.MIN_SIZE_TRAIN_SAMPLING - else: - min_size = cfg.INPUT.MIN_SIZE_TEST - max_size = cfg.INPUT.MAX_SIZE_TEST - sample_style = "choice" - augmentation = [T.ResizeShortestEdge(min_size, max_size, sample_style)] - if is_train and cfg.INPUT.RANDOM_FLIP != "none": - augmentation.append( - T.RandomFlip( - horizontal=cfg.INPUT.RANDOM_FLIP == "horizontal", - vertical=cfg.INPUT.RANDOM_FLIP == "vertical", - ) - ) - return augmentation - - -build_transform_gen = build_augmentation -""" -Alias for backward-compatibility. -""" diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py deleted file mode 100755 index 85c9f1a9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from .distributed_sampler import ( - InferenceSampler, - RandomSubsetTrainingSampler, - RepeatFactorTrainingSampler, - TrainingSampler, -) - -from .grouped_batch_sampler import GroupedBatchSampler - -__all__ = [ - "GroupedBatchSampler", - "TrainingSampler", - "RandomSubsetTrainingSampler", - "InferenceSampler", - "RepeatFactorTrainingSampler", -] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py deleted file mode 100755 index a098e6ac..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/distributed_sampler.py +++ /dev/null @@ -1,278 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import logging -import math -from collections import defaultdict -from typing import Optional -import torch -from torch.utils.data.sampler import Sampler - -from detectron2.utils import comm - -logger = logging.getLogger(__name__) - - -class TrainingSampler(Sampler): - """ - In training, we only care about the "infinite stream" of training data. - So this sampler produces an infinite stream of indices and - all workers cooperate to correctly shuffle the indices and sample different indices. - - The samplers in each worker effectively produces `indices[worker_id::num_workers]` - where `indices` is an infinite stream of indices consisting of - `shuffle(range(size)) + shuffle(range(size)) + ...` (if shuffle is True) - or `range(size) + range(size) + ...` (if shuffle is False) - - Note that this sampler does not shard based on pytorch DataLoader worker id. - A sampler passed to pytorch DataLoader is used only with map-style dataset - and will not be executed inside workers. - But if this sampler is used in a way that it gets execute inside a dataloader - worker, then extra work needs to be done to shard its outputs based on worker id. - This is required so that workers don't produce identical data. - :class:`ToIterableDataset` implements this logic. - This note is true for all samplers in detectron2. - """ - - def __init__(self, size: int, shuffle: bool = True, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - shuffle (bool): whether to shuffle the indices or not - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - if not isinstance(size, int): - raise TypeError(f"TrainingSampler(size=) expects an int. Got type {type(size)}.") - if size <= 0: - raise ValueError(f"TrainingSampler(size=) expects a positive int. Got {size}.") - self._size = size - self._shuffle = shuffle - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - def __iter__(self): - start = self._rank - yield from itertools.islice(self._infinite_indices(), start, None, self._world_size) - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - if self._shuffle: - yield from torch.randperm(self._size, generator=g).tolist() - else: - yield from torch.arange(self._size).tolist() - - -class RandomSubsetTrainingSampler(TrainingSampler): - """ - Similar to TrainingSampler, but only sample a random subset of indices. 
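The rank sharding in TrainingSampler.__iter__ means every worker walks the same infinite shuffled stream and keeps every world_size-th index, starting at its own rank. A runnable sketch with a hypothetical process layout:

import itertools

stream = itertools.cycle(range(10))     # stand-in for _infinite_indices()
rank, world_size = 1, 4                 # hypothetical values
shard = itertools.islice(stream, rank, None, world_size)
print([next(shard) for _ in range(5)])  # [1, 5, 9, 3, 7]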
- This is useful when you want to estimate the accuracy vs data-number curves by - training the model with different subset_ratio. - """ - - def __init__( - self, - size: int, - subset_ratio: float, - shuffle: bool = True, - seed_shuffle: Optional[int] = None, - seed_subset: Optional[int] = None, - ): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - subset_ratio (float): the ratio of subset data to sample from the underlying dataset - shuffle (bool): whether to shuffle the indices or not - seed_shuffle (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - seed_subset (int): the seed to randomize the subset to be sampled. - Must be the same across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - super().__init__(size=size, shuffle=shuffle, seed=seed_shuffle) - - assert 0.0 < subset_ratio <= 1.0 - self._size_subset = int(size * subset_ratio) - assert self._size_subset > 0 - if seed_subset is None: - seed_subset = comm.shared_random_seed() - self._seed_subset = int(seed_subset) - - # randomly generate the subset indexes to be sampled from - g = torch.Generator() - g.manual_seed(self._seed_subset) - indexes_randperm = torch.randperm(self._size, generator=g) - self._indexes_subset = indexes_randperm[: self._size_subset] - - logger.info("Using RandomSubsetTrainingSampler......") - logger.info(f"Randomly sample {self._size_subset} data from the original {self._size} data") - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) # self._seed equals seed_shuffle from __init__() - while True: - if self._shuffle: - # generate a random permutation to shuffle self._indexes_subset - randperm = torch.randperm(self._size_subset, generator=g) - yield from self._indexes_subset[randperm].tolist() - else: - yield from self._indexes_subset.tolist() - - -class RepeatFactorTrainingSampler(Sampler): - """ - Similar to TrainingSampler, but a sample may appear more times than others based - on its "repeat factor". This is suitable for training on class imbalanced datasets like LVIS. - """ - - def __init__(self, repeat_factors, *, shuffle=True, seed=None): - """ - Args: - repeat_factors (Tensor): a float vector, the repeat factor for each indice. When it's - full of ones, it is equivalent to ``TrainingSampler(len(repeat_factors), ...)``. - shuffle (bool): whether to shuffle the indices or not - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - self._shuffle = shuffle - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - # Split into whole number (_int_part) and fractional (_frac_part) parts. - self._int_part = torch.trunc(repeat_factors) - self._frac_part = repeat_factors - self._int_part - - @staticmethod - def repeat_factors_from_category_frequency(dataset_dicts, repeat_thresh): - """ - Compute (fractional) per-image repeat factors based on category frequency. - The repeat factor for an image is a function of the frequency of the rarest - category labeled in that image. 
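Per the LVIS recipe referenced below, the category-level factor is r(c) = max(1, sqrt(t / f(c))). Plugging in hypothetical numbers shows how it favors rare categories:

import math

t = 0.001         # repeat_thresh
f = 0.0001        # category appears in 1 of 10000 training images
print(max(1.0, math.sqrt(t / f)))     # ~3.16: such images repeat ~3x in expectation
print(max(1.0, math.sqrt(t / 0.5)))   # 1.0: frequent categories are never repeated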
The "frequency of category c" in [0, 1] is defined - as the fraction of images in the training set (without repeats) in which category c - appears. - See :paper:`lvis` (>= v2) Appendix B.2. - - Args: - dataset_dicts (list[dict]): annotations in Detectron2 dataset format. - repeat_thresh (float): frequency threshold below which data is repeated. - If the frequency is half of `repeat_thresh`, the image will be - repeated twice. - - Returns: - torch.Tensor: - the i-th element is the repeat factor for the dataset image at index i. - """ - # 1. For each category c, compute the fraction of images that contain it: f(c) - category_freq = defaultdict(int) - for dataset_dict in dataset_dicts: # For each image (without repeats) - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - for cat_id in cat_ids: - category_freq[cat_id] += 1 - num_images = len(dataset_dicts) - for k, v in category_freq.items(): - category_freq[k] = v / num_images - - # 2. For each category c, compute the category-level repeat factor: - # r(c) = max(1, sqrt(t / f(c))) - category_rep = { - cat_id: max(1.0, math.sqrt(repeat_thresh / cat_freq)) - for cat_id, cat_freq in category_freq.items() - } - - # 3. For each image I, compute the image-level repeat factor: - # r(I) = max_{c in I} r(c) - rep_factors = [] - for dataset_dict in dataset_dicts: - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - rep_factor = max({category_rep[cat_id] for cat_id in cat_ids}, default=1.0) - rep_factors.append(rep_factor) - - return torch.tensor(rep_factors, dtype=torch.float32) - - def _get_epoch_indices(self, generator): - """ - Create a list of dataset indices (with repeats) to use for one epoch. - - Args: - generator (torch.Generator): pseudo random number generator used for - stochastic rounding. - - Returns: - torch.Tensor: list of dataset indices to use in one epoch. Each index - is repeated based on its calculated repeat factor. - """ - # Since repeat factors are fractional, we use stochastic rounding so - # that the target repeat factor is achieved in expectation over the - # course of training - rands = torch.rand(len(self._frac_part), generator=generator) - rep_factors = self._int_part + (rands < self._frac_part).float() - # Construct a list of indices in which we repeat images as specified - indices = [] - for dataset_index, rep_factor in enumerate(rep_factors): - indices.extend([dataset_index] * int(rep_factor.item())) - return torch.tensor(indices, dtype=torch.int64) - - def __iter__(self): - start = self._rank - yield from itertools.islice(self._infinite_indices(), start, None, self._world_size) - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - # Sample indices with repeats determined by stochastic rounding; each - # "epoch" may have a slightly different size due to the rounding. - indices = self._get_epoch_indices(g) - if self._shuffle: - randperm = torch.randperm(len(indices), generator=g) - yield from indices[randperm].tolist() - else: - yield from indices.tolist() - - -class InferenceSampler(Sampler): - """ - Produce indices for inference across all workers. - Inference needs to run on the __exact__ set of samples, - therefore when the total number of samples is not divisible by the number of workers, - this sampler produces different number of samples on different workers. 
- """ - - def __init__(self, size: int): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - """ - self._size = size - assert size > 0 - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - self._local_indices = self._get_local_indices(size, self._world_size, self._rank) - - @staticmethod - def _get_local_indices(total_size, world_size, rank): - shard_size = total_size // world_size - left = total_size % world_size - shard_sizes = [shard_size + int(r < left) for r in range(world_size)] - - begin = sum(shard_sizes[:rank]) - end = min(sum(shard_sizes[: rank + 1]), total_size) - return range(begin, end) - - def __iter__(self): - yield from self._local_indices - - def __len__(self): - return len(self._local_indices) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py deleted file mode 100755 index 5b247730..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/samplers/grouped_batch_sampler.py +++ /dev/null @@ -1,47 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from torch.utils.data.sampler import BatchSampler, Sampler - - -class GroupedBatchSampler(BatchSampler): - """ - Wraps another sampler to yield a mini-batch of indices. - It enforces that the batch only contain elements from the same group. - It also tries to provide mini-batches which follows an ordering which is - as close as possible to the ordering from the original sampler. - """ - - def __init__(self, sampler, group_ids, batch_size): - """ - Args: - sampler (Sampler): Base sampler. - group_ids (list[int]): If the sampler produces indices in range [0, N), - `group_ids` must be a list of `N` ints which contains the group id of each sample. - The group ids must be a set of integers in the range [0, num_groups). - batch_size (int): Size of mini-batch. - """ - if not isinstance(sampler, Sampler): - raise ValueError( - "sampler should be an instance of " - "torch.utils.data.Sampler, but got sampler={}".format(sampler) - ) - self.sampler = sampler - self.group_ids = np.asarray(group_ids) - assert self.group_ids.ndim == 1 - self.batch_size = batch_size - groups = np.unique(self.group_ids).tolist() - - # buffer the indices of each group until batch size is reached - self.buffer_per_group = {k: [] for k in groups} - - def __iter__(self): - for idx in self.sampler: - group_id = self.group_ids[idx] - group_buffer = self.buffer_per_group[group_id] - group_buffer.append(idx) - if len(group_buffer) == self.batch_size: - yield group_buffer[:] # yield a copy of the list - del group_buffer[:] - - def __len__(self): - raise NotImplementedError("len() of GroupedBatchSampler is not well-defined.") diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py deleted file mode 100755 index ab3c63b5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from fvcore.transforms.transform import Transform, TransformList # order them first -from fvcore.transforms.transform import * -from .transform import * -from .augmentation import * -from .augmentation_impl import * - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py deleted file mode 100755 index 48be5b1b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation.py +++ /dev/null @@ -1,377 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import inspect -import numpy as np -import pprint -from typing import Any, List, Optional, Tuple, Union -from fvcore.transforms.transform import Transform, TransformList - -""" -See "Data Augmentation" tutorial for an overview of the system: -https://detectron2.readthedocs.io/tutorials/augmentation.html -""" - - -__all__ = [ - "Augmentation", - "AugmentationList", - "AugInput", - "TransformGen", - "apply_transform_gens", - "StandardAugInput", - "apply_augmentations", -] - - -def _check_img_dtype(img): - assert isinstance(img, np.ndarray), "[Augmentation] Needs an numpy array, but got a {}!".format( - type(img) - ) - assert not isinstance(img.dtype, np.integer) or ( - img.dtype == np.uint8 - ), "[Augmentation] Got image of type {}, use uint8 or floating points instead!".format( - img.dtype - ) - assert img.ndim in [2, 3], img.ndim - - -def _get_aug_input_args(aug, aug_input) -> List[Any]: - """ - Get the arguments to be passed to ``aug.get_transform`` from the input ``aug_input``. - """ - if aug.input_args is None: - # Decide what attributes are needed automatically - prms = list(inspect.signature(aug.get_transform).parameters.items()) - # The default behavior is: if there is one parameter, then its "image" - # (work automatically for majority of use cases, and also avoid BC breaking), - # Otherwise, use the argument names. - if len(prms) == 1: - names = ("image",) - else: - names = [] - for name, prm in prms: - if prm.kind in (inspect.Parameter.VAR_POSITIONAL, inspect.Parameter.VAR_KEYWORD): - raise TypeError( - f""" \ -The default implementation of `{type(aug)}.__call__` does not allow \ -`{type(aug)}.get_transform` to use variable-length arguments (*args, **kwargs)! \ -If arguments are unknown, reimplement `__call__` instead. \ -""" - ) - names.append(name) - aug.input_args = tuple(names) - - args = [] - for f in aug.input_args: - try: - args.append(getattr(aug_input, f)) - except AttributeError as e: - raise AttributeError( - f"{type(aug)}.get_transform needs input attribute '{f}', " - f"but it is not an attribute of {type(aug_input)}!" - ) from e - return args - - -class Augmentation: - """ - Augmentation defines (often random) policies/strategies to generate :class:`Transform` - from data. It is often used for pre-processing of input data. - - A "policy" that generates a :class:`Transform` may, in the most general case, - need arbitrary information from input data in order to determine what transforms - to apply. Therefore, each :class:`Augmentation` instance defines the arguments - needed by its :meth:`get_transform` method. When called with the positional arguments, - the :meth:`get_transform` method executes the policy. 
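The argument-name introspection performed by _get_aug_input_args above can be reproduced with the standard library: one declared parameter defaults to "image", otherwise the declared names are looked up on the AugInput object. A sketch with a hypothetical policy signature:

import inspect

def get_transform(image, sem_seg):
    pass

prms = list(inspect.signature(get_transform).parameters)
names = ("image",) if len(prms) == 1 else tuple(prms)
print(names)   # ('image', 'sem_seg')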
- - Note that :class:`Augmentation` defines the policies to create a :class:`Transform`, - but not how to execute the actual transform operations to those data. - Its :meth:`__call__` method will use :meth:`AugInput.transform` to execute the transform. - - The returned `Transform` object is meant to describe deterministic transformation, which means - it can be re-applied on associated data, e.g. the geometry of an image and its segmentation - masks need to be transformed together. - (If such re-application is not needed, then determinism is not a crucial requirement.) - """ - - input_args: Optional[Tuple[str]] = None - """ - Stores the attribute names needed by :meth:`get_transform`, e.g. ``("image", "sem_seg")``. - By default, it is just a tuple of argument names in :meth:`self.get_transform`, which often only - contain "image". As long as the argument name convention is followed, there is no need for - users to touch this attribute. - """ - - def _init(self, params=None): - if params: - for k, v in params.items(): - if k != "self" and not k.startswith("_"): - setattr(self, k, v) - - def get_transform(self, *args) -> Transform: - """ - Execute the policy based on input data, and decide what transform to apply to inputs. - - Args: - args: Any fixed-length positional arguments. By default, the name of the arguments - should exist in the :class:`AugInput` to be used. - - Returns: - Transform: Returns the deterministic transform to apply to the input. - - Examples: - :: - class MyAug: - # if a policy needs to know both image and semantic segmentation - def get_transform(image, sem_seg) -> T.Transform: - pass - tfm: Transform = MyAug().get_transform(image, sem_seg) - new_image = tfm.apply_image(image) - - Notes: - Users can freely use arbitrary new argument names in custom - :meth:`get_transform` method, as long as they are available in the - input data. In detectron2 we use the following convention: - - * image: (H,W) or (H,W,C) ndarray of type uint8 in range [0, 255], or - floating point in range [0, 1] or [0, 255]. - * boxes: (N,4) ndarray of float32. It represents the instance bounding boxes - of N instances. Each is in XYXY format in unit of absolute coordinates. - * sem_seg: (H,W) ndarray of type uint8. Each element is an integer label of pixel. - - We do not specify convention for other types and do not include builtin - :class:`Augmentation` that uses other types in detectron2. - """ - raise NotImplementedError - - def __call__(self, aug_input) -> Transform: - """ - Augment the given `aug_input` **in-place**, and return the transform that's used. - - This method will be called to apply the augmentation. In most augmentation, it - is enough to use the default implementation, which calls :meth:`get_transform` - using the inputs. But a subclass can overwrite it to have more complicated logic. - - Args: - aug_input (AugInput): an object that has attributes needed by this augmentation - (defined by ``self.get_transform``). Its ``transform`` method will be called - to in-place transform it. - - Returns: - Transform: the transform that is applied on the input. - """ - args = _get_aug_input_args(self, aug_input) - tfm = self.get_transform(*args) - assert isinstance(tfm, (Transform, TransformList)), ( - f"{type(self)}.get_transform must return an instance of Transform! " - f"Got {type(tfm)} instead." - ) - aug_input.transform(tfm) - return tfm - - def _rand_range(self, low=1.0, high=None, size=None): - """ - Uniform float random number between low and high. 
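The one- and two-argument conventions above can be seen with any subclass; a small sketch using `RandomFlip` (defined in `augmentation_impl.py` below) as the host:

```
from detectron2.data.transforms import RandomFlip

aug = RandomFlip(prob=0.5)
p = aug._rand_range(0.5)          # uniform in [0.0, 0.5)
q = aug._rand_range(0.2, 0.8)     # uniform in [0.2, 0.8)
v = aug._rand_range(0.0, 1.0, 4)  # ndarray of shape (4,)
```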
- """ - if high is None: - low, high = 0, low - if size is None: - size = [] - return np.random.uniform(low, high, size) - - def __repr__(self): - """ - Produce something like: - "MyAugmentation(field1={self.field1}, field2={self.field2})" - """ - try: - sig = inspect.signature(self.__init__) - classname = type(self).__name__ - argstr = [] - for name, param in sig.parameters.items(): - assert ( - param.kind != param.VAR_POSITIONAL and param.kind != param.VAR_KEYWORD - ), "The default __repr__ doesn't support *args or **kwargs" - assert hasattr(self, name), ( - "Attribute {} not found! " - "Default __repr__ only works if attributes match the constructor.".format(name) - ) - attr = getattr(self, name) - default = param.default - if default is attr: - continue - attr_str = pprint.pformat(attr) - if "\n" in attr_str: - # don't show it if pformat decides to use >1 lines - attr_str = "..." - argstr.append("{}={}".format(name, attr_str)) - return "{}({})".format(classname, ", ".join(argstr)) - except AssertionError: - return super().__repr__() - - __str__ = __repr__ - - -def _transform_to_aug(tfm_or_aug): - """ - Wrap Transform into Augmentation. - Private, used internally to implement augmentations. - """ - assert isinstance(tfm_or_aug, (Transform, Augmentation)), tfm_or_aug - if isinstance(tfm_or_aug, Augmentation): - return tfm_or_aug - else: - - class _TransformToAug(Augmentation): - def __init__(self, tfm: Transform): - self.tfm = tfm - - def get_transform(self, *args): - return self.tfm - - def __repr__(self): - return repr(self.tfm) - - __str__ = __repr__ - - return _TransformToAug(tfm_or_aug) - - -class AugmentationList(Augmentation): - """ - Apply a sequence of augmentations. - - It has ``__call__`` method to apply the augmentations. - - Note that :meth:`get_transform` method is impossible (will throw error if called) - for :class:`AugmentationList`, because in order to apply a sequence of augmentations, - the kth augmentation must be applied first, to provide inputs needed by the (k+1)th - augmentation. - """ - - def __init__(self, augs): - """ - Args: - augs (list[Augmentation or Transform]): - """ - super().__init__() - self.augs = [_transform_to_aug(x) for x in augs] - - def __call__(self, aug_input) -> Transform: - tfms = [] - for x in self.augs: - tfm = x(aug_input) - tfms.append(tfm) - return TransformList(tfms) - - def __repr__(self): - msgs = [str(x) for x in self.augs] - return "AugmentationList[{}]".format(", ".join(msgs)) - - __str__ = __repr__ - - -class AugInput: - """ - Input that can be used with :meth:`Augmentation.__call__`. - This is a standard implementation for the majority of use cases. - This class provides the standard attributes **"image", "boxes", "sem_seg"** - defined in :meth:`__init__` and they may be needed by different augmentations. - Most augmentation policies do not need attributes beyond these three. - - After applying augmentations to these attributes (using :meth:`AugInput.transform`), - the returned transforms can then be used to transform other data structures that users have. - - Examples: - :: - input = AugInput(image, boxes=boxes) - tfms = augmentation(input) - transformed_image = input.image - transformed_boxes = input.boxes - transformed_other_data = tfms.apply_other(other_data) - - An extended project that works with new data types may implement augmentation policies - that need other inputs. An algorithm may need to transform inputs in a way different - from the standard approach defined in this class. 
In those rare situations, users can - implement a class similar to this class, that satify the following condition: - - * The input must provide access to these data in the form of attribute access - (``getattr``). For example, if an :class:`Augmentation` to be applied needs "image" - and "sem_seg" arguments, its input must have the attribute "image" and "sem_seg". - * The input must have a ``transform(tfm: Transform) -> None`` method which - in-place transforms all its attributes. - """ - - # TODO maybe should support more builtin data types here - def __init__( - self, - image: np.ndarray, - *, - boxes: Optional[np.ndarray] = None, - sem_seg: Optional[np.ndarray] = None, - ): - """ - Args: - image (ndarray): (H,W) or (H,W,C) ndarray of type uint8 in range [0, 255], or - floating point in range [0, 1] or [0, 255]. The meaning of C is up - to users. - boxes (ndarray or None): Nx4 float32 boxes in XYXY_ABS mode - sem_seg (ndarray or None): HxW uint8 semantic segmentation mask. Each element - is an integer label of pixel. - """ - _check_img_dtype(image) - self.image = image - self.boxes = boxes - self.sem_seg = sem_seg - - def transform(self, tfm: Transform) -> None: - """ - In-place transform all attributes of this class. - - By "in-place", it means after calling this method, accessing an attribute such - as ``self.image`` will return transformed data. - """ - self.image = tfm.apply_image(self.image) - if self.boxes is not None: - self.boxes = tfm.apply_box(self.boxes) - if self.sem_seg is not None: - self.sem_seg = tfm.apply_segmentation(self.sem_seg) - - def apply_augmentations( - self, augmentations: List[Union[Augmentation, Transform]] - ) -> TransformList: - """ - Equivalent of ``AugmentationList(augmentations)(self)`` - """ - return AugmentationList(augmentations)(self) - - -def apply_augmentations(augmentations: List[Union[Transform, Augmentation]], inputs): - """ - Use ``T.AugmentationList(augmentations)(inputs)`` instead. - """ - if isinstance(inputs, np.ndarray): - # handle the common case of image-only Augmentation, also for backward compatibility - image_only = True - inputs = AugInput(inputs) - else: - image_only = False - tfms = inputs.apply_augmentations(augmentations) - return inputs.image if image_only else inputs, tfms - - -apply_transform_gens = apply_augmentations -""" -Alias for backward-compatibility. -""" - -TransformGen = Augmentation -""" -Alias for Augmentation, since it is something that generates :class:`Transform`s -""" - -StandardAugInput = AugInput -""" -Alias for compatibility. It's not worth the complexity to have two classes. -""" diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py deleted file mode 100755 index 652a34a9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/augmentation_impl.py +++ /dev/null @@ -1,614 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Implement many useful :class:`Augmentation`. 
-""" -import numpy as np -import sys -from typing import Tuple -import torch -from fvcore.transforms.transform import ( - BlendTransform, - CropTransform, - HFlipTransform, - NoOpTransform, - PadTransform, - Transform, - TransformList, - VFlipTransform, -) -from PIL import Image - -from .augmentation import Augmentation, _transform_to_aug -from .transform import ExtentTransform, ResizeTransform, RotationTransform - -__all__ = [ - "FixedSizeCrop", - "RandomApply", - "RandomBrightness", - "RandomContrast", - "RandomCrop", - "RandomExtent", - "RandomFlip", - "RandomSaturation", - "RandomLighting", - "RandomRotation", - "Resize", - "ResizeScale", - "ResizeShortestEdge", - "RandomCrop_CategoryAreaConstraint", -] - - -class RandomApply(Augmentation): - """ - Randomly apply an augmentation with a given probability. - """ - - def __init__(self, tfm_or_aug, prob=0.5): - """ - Args: - tfm_or_aug (Transform, Augmentation): the transform or augmentation - to be applied. It can either be a `Transform` or `Augmentation` - instance. - prob (float): probability between 0.0 and 1.0 that - the wrapper transformation is applied - """ - super().__init__() - self.aug = _transform_to_aug(tfm_or_aug) - assert 0.0 <= prob <= 1.0, f"Probablity must be between 0.0 and 1.0 (given: {prob})" - self.prob = prob - - def get_transform(self, *args): - do = self._rand_range() < self.prob - if do: - return self.aug.get_transform(*args) - else: - return NoOpTransform() - - def __call__(self, aug_input): - do = self._rand_range() < self.prob - if do: - return self.aug(aug_input) - else: - return NoOpTransform() - - -class RandomFlip(Augmentation): - """ - Flip the image horizontally or vertically with the given probability. - """ - - def __init__(self, prob=0.5, *, horizontal=True, vertical=False): - """ - Args: - prob (float): probability of flip. - horizontal (boolean): whether to apply horizontal flipping - vertical (boolean): whether to apply vertical flipping - """ - super().__init__() - - if horizontal and vertical: - raise ValueError("Cannot do both horiz and vert. Please use two Flip instead.") - if not horizontal and not vertical: - raise ValueError("At least one of horiz or vert has to be True!") - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - do = self._rand_range() < self.prob - if do: - if self.horizontal: - return HFlipTransform(w) - elif self.vertical: - return VFlipTransform(h) - else: - return NoOpTransform() - - -class Resize(Augmentation): - """Resize image to a fixed target size""" - - def __init__(self, shape, interp=Image.BILINEAR): - """ - Args: - shape: (h, w) tuple or a int - interp: PIL interpolation method - """ - if isinstance(shape, int): - shape = (shape, shape) - shape = tuple(shape) - self._init(locals()) - - def get_transform(self, image): - return ResizeTransform( - image.shape[0], image.shape[1], self.shape[0], self.shape[1], self.interp - ) - - -class ResizeShortestEdge(Augmentation): - """ - Resize the image while keeping the aspect ratio unchanged. - It attempts to scale the shorter edge to the given `short_edge_length`, - as long as the longer edge does not exceed `max_size`. - If `max_size` is reached, then downscale so that the longer edge does not exceed max_size. 
- """ - - @torch.jit.unused - def __init__( - self, short_edge_length, max_size=sys.maxsize, sample_style="range", interp=Image.BILINEAR - ): - """ - Args: - short_edge_length (list[int]): If ``sample_style=="range"``, - a [min, max] interval from which to sample the shortest edge length. - If ``sample_style=="choice"``, a list of shortest edge lengths to sample from. - max_size (int): maximum allowed longest edge length. - sample_style (str): either "range" or "choice". - """ - super().__init__() - assert sample_style in ["range", "choice"], sample_style - - self.is_range = sample_style == "range" - if isinstance(short_edge_length, int): - short_edge_length = (short_edge_length, short_edge_length) - if self.is_range: - assert len(short_edge_length) == 2, ( - "short_edge_length must be two values using 'range' sample style." - f" Got {short_edge_length}!" - ) - self._init(locals()) - - @torch.jit.unused - def get_transform(self, image): - h, w = image.shape[:2] - if self.is_range: - size = np.random.randint(self.short_edge_length[0], self.short_edge_length[1] + 1) - else: - size = np.random.choice(self.short_edge_length) - if size == 0: - return NoOpTransform() - - newh, neww = ResizeShortestEdge.get_output_shape(h, w, size, self.max_size) - return ResizeTransform(h, w, newh, neww, self.interp) - - @staticmethod - def get_output_shape( - oldh: int, oldw: int, short_edge_length: int, max_size: int - ) -> Tuple[int, int]: - """ - Compute the output size given input size and target short edge length. - """ - h, w = oldh, oldw - size = short_edge_length * 1.0 - scale = size / min(h, w) - if h < w: - newh, neww = size, scale * w - else: - newh, neww = scale * h, size - if max(newh, neww) > max_size: - scale = max_size * 1.0 / max(newh, neww) - newh = newh * scale - neww = neww * scale - neww = int(neww + 0.5) - newh = int(newh + 0.5) - return (newh, neww) - - -class ResizeScale(Augmentation): - """ - Takes target size as input and randomly scales the given target size between `min_scale` - and `max_scale`. It then scales the input image such that it fits inside the scaled target - box, keeping the aspect ratio constant. - This implements the resize part of the Google's 'resize_and_crop' data augmentation: - https://github.com/tensorflow/tpu/blob/master/models/official/detection/utils/input_utils.py#L127 - """ - - def __init__( - self, - min_scale: float, - max_scale: float, - target_height: int, - target_width: int, - interp: int = Image.BILINEAR, - ): - """ - Args: - min_scale: minimum image scale range. - max_scale: maximum image scale range. - target_height: target image height. - target_width: target image width. - interp: image interpolation method. - """ - super().__init__() - self._init(locals()) - - def _get_resize(self, image: np.ndarray, scale: float) -> Transform: - input_size = image.shape[:2] - - # Compute new target size given a scale. - target_size = (self.target_height, self.target_width) - target_scale_size = np.multiply(target_size, scale) - - # Compute actual rescaling applied to input image and output size. 
- output_scale = np.minimum( - target_scale_size[0] / input_size[0], target_scale_size[1] / input_size[1] - ) - output_size = np.round(np.multiply(input_size, output_scale)).astype(int) - - return ResizeTransform( - input_size[0], input_size[1], output_size[0], output_size[1], self.interp - ) - - def get_transform(self, image: np.ndarray) -> Transform: - random_scale = np.random.uniform(self.min_scale, self.max_scale) - return self._get_resize(image, random_scale) - - -class RandomRotation(Augmentation): - """ - This method returns a copy of this image, rotated the given - number of degrees counter clockwise around the given center. - """ - - def __init__(self, angle, expand=True, center=None, sample_style="range", interp=None): - """ - Args: - angle (list[float]): If ``sample_style=="range"``, - a [min, max] interval from which to sample the angle (in degrees). - If ``sample_style=="choice"``, a list of angles to sample from - expand (bool): choose if the image should be resized to fit the whole - rotated image (default), or simply cropped - center (list[[float, float]]): If ``sample_style=="range"``, - a [[minx, miny], [maxx, maxy]] relative interval from which to sample the center, - [0, 0] being the top left of the image and [1, 1] the bottom right. - If ``sample_style=="choice"``, a list of centers to sample from - Default: None, which means that the center of rotation is the center of the image - center has no effect if expand=True because it only affects shifting - """ - super().__init__() - assert sample_style in ["range", "choice"], sample_style - self.is_range = sample_style == "range" - if isinstance(angle, (float, int)): - angle = (angle, angle) - if center is not None and isinstance(center[0], (float, int)): - center = (center, center) - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - center = None - if self.is_range: - angle = np.random.uniform(self.angle[0], self.angle[1]) - if self.center is not None: - center = ( - np.random.uniform(self.center[0][0], self.center[1][0]), - np.random.uniform(self.center[0][1], self.center[1][1]), - ) - else: - angle = np.random.choice(self.angle) - if self.center is not None: - center = np.random.choice(self.center) - - if center is not None: - center = (w * center[0], h * center[1]) # Convert to absolute coordinates - - if angle % 360 == 0: - return NoOpTransform() - - return RotationTransform(h, w, angle, expand=self.expand, center=center, interp=self.interp) - - -class FixedSizeCrop(Augmentation): - """ - If `crop_size` is smaller than the input image size, then it uses a random crop of - the crop size. If `crop_size` is larger than the input image size, then it pads - the right and the bottom of the image to the crop size if `pad` is True, otherwise - it returns the smaller image. - """ - - def __init__(self, crop_size: Tuple[int], pad: bool = True, pad_value: float = 128.0): - """ - Args: - crop_size: target image (height, width). - pad: if True, will pad images smaller than `crop_size` up to `crop_size` - pad_value: the padding value. - """ - super().__init__() - self._init(locals()) - - def _get_crop(self, image: np.ndarray) -> Transform: - # Compute the image scale and scaled size. - input_size = image.shape[:2] - output_size = self.crop_size - - # Add random crop if the image is scaled up. 
- max_offset = np.subtract(input_size, output_size) - max_offset = np.maximum(max_offset, 0) - offset = np.multiply(max_offset, np.random.uniform(0.0, 1.0)) - offset = np.round(offset).astype(int) - return CropTransform( - offset[1], offset[0], output_size[1], output_size[0], input_size[1], input_size[0] - ) - - def _get_pad(self, image: np.ndarray) -> Transform: - # Compute the image scale and scaled size. - input_size = image.shape[:2] - output_size = self.crop_size - - # Add padding if the image is scaled down. - pad_size = np.subtract(output_size, input_size) - pad_size = np.maximum(pad_size, 0) - original_size = np.minimum(input_size, output_size) - return PadTransform( - 0, 0, pad_size[1], pad_size[0], original_size[1], original_size[0], self.pad_value - ) - - def get_transform(self, image: np.ndarray) -> TransformList: - transforms = [self._get_crop(image)] - if self.pad: - transforms.append(self._get_pad(image)) - return TransformList(transforms) - - -class RandomCrop(Augmentation): - """ - Randomly crop a rectangle region out of an image. - """ - - def __init__(self, crop_type: str, crop_size): - """ - Args: - crop_type (str): one of "relative_range", "relative", "absolute", "absolute_range". - crop_size (tuple[float, float]): two floats, explained below. - - - "relative": crop a (H * crop_size[0], W * crop_size[1]) region from an input image of - size (H, W). crop size should be in (0, 1] - - "relative_range": uniformly sample two values from [crop_size[0], 1] - and [crop_size[1]], 1], and use them as in "relative" crop type. - - "absolute" crop a (crop_size[0], crop_size[1]) region from input image. - crop_size must be smaller than the input image size. - - "absolute_range", for an input of size (H, W), uniformly sample H_crop in - [crop_size[0], min(H, crop_size[1])] and W_crop in [crop_size[0], min(W, crop_size[1])]. - Then crop a region (H_crop, W_crop). 
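A sketch of the four conventions listed above, using the `get_crop_size` helper defined just below (input and crop sizes here are assumed examples):

```
import numpy as np
from detectron2.data.transforms import RandomCrop

np.random.seed(0)
H, W = 480, 640
print(RandomCrop("relative", (0.5, 0.5)).get_crop_size((H, W)))        # (240, 320)
print(RandomCrop("absolute", (512, 512)).get_crop_size((H, W)))        # (480, 512), clamped to the input
print(RandomCrop("relative_range", (0.5, 0.5)).get_crop_size((H, W)))  # sampled in [240, 480] x [320, 640]
print(RandomCrop("absolute_range", (256, 512)).get_crop_size((H, W)))  # sampled in [256, 480] x [256, 512]
```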
- """ - # TODO style of relative_range and absolute_range are not consistent: - # one takes (h, w) but another takes (min, max) - super().__init__() - assert crop_type in ["relative_range", "relative", "absolute", "absolute_range"] - self._init(locals()) - - def get_transform(self, image): - h, w = image.shape[:2] - croph, cropw = self.get_crop_size((h, w)) - assert h >= croph and w >= cropw, "Shape computation in {} has bugs.".format(self) - h0 = np.random.randint(h - croph + 1) - w0 = np.random.randint(w - cropw + 1) - return CropTransform(w0, h0, cropw, croph) - - def get_crop_size(self, image_size): - """ - Args: - image_size (tuple): height, width - - Returns: - crop_size (tuple): height, width in absolute pixels - """ - h, w = image_size - if self.crop_type == "relative": - ch, cw = self.crop_size - return int(h * ch + 0.5), int(w * cw + 0.5) - elif self.crop_type == "relative_range": - crop_size = np.asarray(self.crop_size, dtype=np.float32) - ch, cw = crop_size + np.random.rand(2) * (1 - crop_size) - return int(h * ch + 0.5), int(w * cw + 0.5) - elif self.crop_type == "absolute": - return (min(self.crop_size[0], h), min(self.crop_size[1], w)) - elif self.crop_type == "absolute_range": - assert self.crop_size[0] <= self.crop_size[1] - ch = np.random.randint(min(h, self.crop_size[0]), min(h, self.crop_size[1]) + 1) - cw = np.random.randint(min(w, self.crop_size[0]), min(w, self.crop_size[1]) + 1) - return ch, cw - else: - raise NotImplementedError("Unknown crop type {}".format(self.crop_type)) - - -class RandomCrop_CategoryAreaConstraint(Augmentation): - """ - Similar to :class:`RandomCrop`, but find a cropping window such that no single category - occupies a ratio of more than `single_category_max_area` in semantic segmentation ground - truth, which can cause unstability in training. The function attempts to find such a valid - cropping window for at most 10 times. - """ - - def __init__( - self, - crop_type: str, - crop_size, - single_category_max_area: float = 1.0, - ignored_category: int = None, - ): - """ - Args: - crop_type, crop_size: same as in :class:`RandomCrop` - single_category_max_area: the maximum allowed area ratio of a - category. Set to 1.0 to disable - ignored_category: allow this category in the semantic segmentation - ground truth to exceed the area ratio. Usually set to the category - that's ignored in training. - """ - self.crop_aug = RandomCrop(crop_type, crop_size) - self._init(locals()) - - def get_transform(self, image, sem_seg): - if self.single_category_max_area >= 1.0: - return self.crop_aug.get_transform(image) - else: - h, w = sem_seg.shape - for _ in range(10): - crop_size = self.crop_aug.get_crop_size((h, w)) - y0 = np.random.randint(h - crop_size[0] + 1) - x0 = np.random.randint(w - crop_size[1] + 1) - sem_seg_temp = sem_seg[y0 : y0 + crop_size[0], x0 : x0 + crop_size[1]] - labels, cnt = np.unique(sem_seg_temp, return_counts=True) - if self.ignored_category is not None: - cnt = cnt[labels != self.ignored_category] - if len(cnt) > 1 and np.max(cnt) < np.sum(cnt) * self.single_category_max_area: - break - crop_tfm = CropTransform(x0, y0, crop_size[1], crop_size[0]) - return crop_tfm - - -class RandomExtent(Augmentation): - """ - Outputs an image by cropping a random "subrect" of the source image. - - The subrect can be parameterized to include pixels outside the source image, - in which case they will be set to zeros (i.e. black). The size of the output - image will vary with the size of the random subrect. 
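A minimal usage sketch of the behavior described above; the parameter values are assumptions, and running it relies on a Pillow version compatible with this deprecated code:

```
import numpy as np
from detectron2.data.transforms import RandomExtent

aug = RandomExtent(scale_range=(0.8, 1.2), shift_range=(0.2, 0.2))
image = np.zeros((480, 640, 3), dtype=np.uint8)
tfm = aug.get_transform(image)  # an ExtentTransform over a random subrect
out = tfm.apply_image(image)    # output size varies with the sampled subrect
```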
- """ - - def __init__(self, scale_range, shift_range): - """ - Args: - output_size (h, w): Dimensions of output image - scale_range (l, h): Range of input-to-output size scaling factor - shift_range (x, y): Range of shifts of the cropped subrect. The rect - is shifted by [w / 2 * Uniform(-x, x), h / 2 * Uniform(-y, y)], - where (w, h) is the (width, height) of the input image. Set each - component to zero to crop at the image's center. - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - img_h, img_w = image.shape[:2] - - # Initialize src_rect to fit the input image. - src_rect = np.array([-0.5 * img_w, -0.5 * img_h, 0.5 * img_w, 0.5 * img_h]) - - # Apply a random scaling to the src_rect. - src_rect *= np.random.uniform(self.scale_range[0], self.scale_range[1]) - - # Apply a random shift to the coordinates origin. - src_rect[0::2] += self.shift_range[0] * img_w * (np.random.rand() - 0.5) - src_rect[1::2] += self.shift_range[1] * img_h * (np.random.rand() - 0.5) - - # Map src_rect coordinates into image coordinates (center at corner). - src_rect[0::2] += 0.5 * img_w - src_rect[1::2] += 0.5 * img_h - - return ExtentTransform( - src_rect=(src_rect[0], src_rect[1], src_rect[2], src_rect[3]), - output_size=(int(src_rect[3] - src_rect[1]), int(src_rect[2] - src_rect[0])), - ) - - -class RandomContrast(Augmentation): - """ - Randomly transforms image contrast. - - Contrast intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce contrast - - intensity = 1 will preserve the input image - - intensity > 1 will increase contrast - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation - intensity_max (float): Maximum augmentation - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - w = np.random.uniform(self.intensity_min, self.intensity_max) - return BlendTransform(src_image=image.mean(), src_weight=1 - w, dst_weight=w) - - -class RandomBrightness(Augmentation): - """ - Randomly transforms image brightness. - - Brightness intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce brightness - - intensity = 1 will preserve the input image - - intensity > 1 will increase brightness - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation - intensity_max (float): Maximum augmentation - """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - w = np.random.uniform(self.intensity_min, self.intensity_max) - return BlendTransform(src_image=0, src_weight=1 - w, dst_weight=w) - - -class RandomSaturation(Augmentation): - """ - Randomly transforms saturation of an RGB image. - Input images are assumed to have 'RGB' channel order. - - Saturation intensity is uniformly sampled in (intensity_min, intensity_max). - - intensity < 1 will reduce saturation (make the image more grayscale) - - intensity = 1 will preserve the input image - - intensity > 1 will increase saturation - - See: https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html - """ - - def __init__(self, intensity_min, intensity_max): - """ - Args: - intensity_min (float): Minimum augmentation (1 preserves input). - intensity_max (float): Maximum augmentation (1 preserves input). 
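`RandomContrast`, `RandomBrightness`, and `RandomSaturation` all reduce to the same blend rule, `output = (1 - w) * src + w * input`, differing only in the blend source; a self-contained numpy sketch of the saturation case with an assumed `w`:

```
import numpy as np

img = np.random.randint(0, 256, (4, 4, 3)).astype(np.float32)
w = 1.3  # w > 1 increases saturation; w < 1 desaturates
gray = img.dot([0.299, 0.587, 0.114])[:, :, np.newaxis]  # Rec.601 luma weights
out = (1 - w) * gray + w * img  # what the returned BlendTransform computes
```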
- """ - super().__init__() - self._init(locals()) - - def get_transform(self, image): - assert image.shape[-1] == 3, "RandomSaturation only works on RGB images" - w = np.random.uniform(self.intensity_min, self.intensity_max) - grayscale = image.dot([0.299, 0.587, 0.114])[:, :, np.newaxis] - return BlendTransform(src_image=grayscale, src_weight=1 - w, dst_weight=w) - - -class RandomLighting(Augmentation): - """ - The "lighting" augmentation described in AlexNet, using fixed PCA over ImageNet. - Input images are assumed to have 'RGB' channel order. - - The degree of color jittering is randomly sampled via a normal distribution, - with standard deviation given by the scale parameter. - """ - - def __init__(self, scale): - """ - Args: - scale (float): Standard deviation of principal component weighting. - """ - super().__init__() - self._init(locals()) - self.eigen_vecs = np.array( - [[-0.5675, 0.7192, 0.4009], [-0.5808, -0.0045, -0.8140], [-0.5836, -0.6948, 0.4203]] - ) - self.eigen_vals = np.array([0.2175, 0.0188, 0.0045]) - - def get_transform(self, image): - assert image.shape[-1] == 3, "RandomLighting only works on RGB images" - weights = np.random.normal(scale=self.scale, size=3) - return BlendTransform( - src_image=self.eigen_vecs.dot(weights * self.eigen_vals), src_weight=1.0, dst_weight=1.0 - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py deleted file mode 100755 index de44b991..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/data/transforms/transform.py +++ /dev/null @@ -1,351 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -See "Data Augmentation" tutorial for an overview of the system: -https://detectron2.readthedocs.io/tutorials/augmentation.html -""" - -import numpy as np -import torch -import torch.nn.functional as F -from fvcore.transforms.transform import ( - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - TransformList, -) -from PIL import Image - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - -__all__ = [ - "ExtentTransform", - "ResizeTransform", - "RotationTransform", - "ColorTransform", - "PILColorTransform", -] - - -class ExtentTransform(Transform): - """ - Extracts a subregion from the source image and scales it to the output size. - - The fill color is used to map pixels from the source rect that fall outside - the source image. 
- - See: https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform - """ - - def __init__(self, src_rect, output_size, interp=Image.LINEAR, fill=0): - """ - Args: - src_rect (x0, y0, x1, y1): src coordinates - output_size (h, w): dst image size - interp: PIL interpolation methods - fill: Fill color used when src_rect extends outside image - """ - super().__init__() - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - h, w = self.output_size - if len(img.shape) > 2 and img.shape[2] == 1: - pil_image = Image.fromarray(img[:, :, 0], mode="L") - else: - pil_image = Image.fromarray(img) - pil_image = pil_image.transform( - size=(w, h), - method=Image.EXTENT, - data=self.src_rect, - resample=interp if interp else self.interp, - fill=self.fill, - ) - ret = np.asarray(pil_image) - if len(img.shape) > 2 and img.shape[2] == 1: - ret = np.expand_dims(ret, -1) - return ret - - def apply_coords(self, coords): - # Transform image center from source coordinates into output coordinates - # and then map the new origin to the corner of the output image. - h, w = self.output_size - x0, y0, x1, y1 = self.src_rect - new_coords = coords.astype(np.float32) - new_coords[:, 0] -= 0.5 * (x0 + x1) - new_coords[:, 1] -= 0.5 * (y0 + y1) - new_coords[:, 0] *= w / (x1 - x0) - new_coords[:, 1] *= h / (y1 - y0) - new_coords[:, 0] += 0.5 * w - new_coords[:, 1] += 0.5 * h - return new_coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - -class ResizeTransform(Transform): - """ - Resize the image to a target size. - """ - - def __init__(self, h, w, new_h, new_w, interp=None): - """ - Args: - h, w (int): original image size - new_h, new_w (int): new image size - interp: PIL interpolation methods, defaults to bilinear. 
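For reference, a small sketch of the deterministic resize this class produces; the same transform object resizes pixels and coordinates consistently and can be inverted (sizes are assumed examples):

```
import numpy as np
from detectron2.data.transforms import ResizeTransform

tfm = ResizeTransform(h=480, w=640, new_h=240, new_w=320)
img = np.zeros((480, 640, 3), dtype=np.uint8)
small = tfm.apply_image(img)                        # shape (240, 320, 3)
pts = tfm.apply_coords(np.array([[640.0, 480.0]]))  # -> [[320., 240.]]
back = tfm.inverse().apply_coords(pts)              # round-trips to [[640., 480.]]
```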
- """ - # TODO decide on PIL vs opencv - super().__init__() - if interp is None: - interp = Image.BILINEAR - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - assert img.shape[:2] == (self.h, self.w) - assert len(img.shape) <= 4 - interp_method = interp if interp is not None else self.interp - - if img.dtype == np.uint8: - if len(img.shape) > 2 and img.shape[2] == 1: - pil_image = Image.fromarray(img[:, :, 0], mode="L") - else: - pil_image = Image.fromarray(img) - pil_image = pil_image.resize((self.new_w, self.new_h), interp_method) - ret = np.asarray(pil_image) - if len(img.shape) > 2 and img.shape[2] == 1: - ret = np.expand_dims(ret, -1) - else: - # PIL only supports uint8 - if any(x < 0 for x in img.strides): - img = np.ascontiguousarray(img) - img = torch.from_numpy(img) - shape = list(img.shape) - shape_4d = shape[:2] + [1] * (4 - len(shape)) + shape[2:] - img = img.view(shape_4d).permute(2, 3, 0, 1) # hw(c) -> nchw - _PIL_RESIZE_TO_INTERPOLATE_MODE = { - Image.NEAREST: "nearest", - Image.BILINEAR: "bilinear", - Image.BICUBIC: "bicubic", - } - mode = _PIL_RESIZE_TO_INTERPOLATE_MODE[interp_method] - align_corners = None if mode == "nearest" else False - img = F.interpolate( - img, (self.new_h, self.new_w), mode=mode, align_corners=align_corners - ) - shape[:2] = (self.new_h, self.new_w) - ret = img.permute(2, 3, 0, 1).view(shape).numpy() # nchw -> hw(c) - - return ret - - def apply_coords(self, coords): - coords[:, 0] = coords[:, 0] * (self.new_w * 1.0 / self.w) - coords[:, 1] = coords[:, 1] * (self.new_h * 1.0 / self.h) - return coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - def inverse(self): - return ResizeTransform(self.new_h, self.new_w, self.h, self.w, self.interp) - - -class RotationTransform(Transform): - """ - This method returns a copy of this image, rotated the given - number of degrees counter clockwise around its center. 
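A usage sketch of the expand behavior described above (requires OpenCV; the sizes and angle are assumed examples):

```
import numpy as np
from detectron2.data.transforms import RotationTransform

tfm = RotationTransform(h=480, w=640, angle=30, expand=True)
img = np.zeros((480, 640, 3), dtype=np.uint8)
rot = tfm.apply_image(img)  # canvas grows to fit the rotated image
print(rot.shape[:2])        # (736, 794)
inv = tfm.inverse()         # rotate back, then crop to the original (480, 640)
```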
- """ - - def __init__(self, h, w, angle, expand=True, center=None, interp=None): - """ - Args: - h, w (int): original image size - angle (float): degrees for rotation - expand (bool): choose if the image should be resized to fit the whole - rotated image (default), or simply cropped - center (tuple (width, height)): coordinates of the rotation center - if left to None, the center will be fit to the center of each image - center has no effect if expand=True because it only affects shifting - interp: cv2 interpolation method, default cv2.INTER_LINEAR - """ - super().__init__() - image_center = np.array((w / 2, h / 2)) - if center is None: - center = image_center - if interp is None: - interp = cv2.INTER_LINEAR - abs_cos, abs_sin = (abs(np.cos(np.deg2rad(angle))), abs(np.sin(np.deg2rad(angle)))) - if expand: - # find the new width and height bounds - bound_w, bound_h = np.rint( - [h * abs_sin + w * abs_cos, h * abs_cos + w * abs_sin] - ).astype(int) - else: - bound_w, bound_h = w, h - - self._set_attributes(locals()) - self.rm_coords = self.create_rotation_matrix() - # Needed because of this problem https://github.com/opencv/opencv/issues/11784 - self.rm_image = self.create_rotation_matrix(offset=-0.5) - - def apply_image(self, img, interp=None): - """ - img should be a numpy array, formatted as Height * Width * Nchannels - """ - if len(img) == 0 or self.angle % 360 == 0: - return img - assert img.shape[:2] == (self.h, self.w) - interp = interp if interp is not None else self.interp - return cv2.warpAffine(img, self.rm_image, (self.bound_w, self.bound_h), flags=interp) - - def apply_coords(self, coords): - """ - coords should be a N * 2 array-like, containing N couples of (x, y) points - """ - coords = np.asarray(coords, dtype=float) - if len(coords) == 0 or self.angle % 360 == 0: - return coords - return cv2.transform(coords[:, np.newaxis, :], self.rm_coords)[:, 0, :] - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=cv2.INTER_NEAREST) - return segmentation - - def create_rotation_matrix(self, offset=0): - center = (self.center[0] + offset, self.center[1] + offset) - rm = cv2.getRotationMatrix2D(tuple(center), self.angle, 1) - if self.expand: - # Find the coordinates of the center of rotation in the new image - # The only point for which we know the future coordinates is the center of the image - rot_im_center = cv2.transform(self.image_center[None, None, :] + offset, rm)[0, 0, :] - new_center = np.array([self.bound_w / 2, self.bound_h / 2]) + offset - rot_im_center - # shift the rotation center to the new coordinates - rm[:, 2] += new_center - return rm - - def inverse(self): - """ - The inverse is to rotate it back with expand, and crop to get the original shape. - """ - if not self.expand: # Not possible to inverse if a part of the image is lost - raise NotImplementedError() - rotation = RotationTransform( - self.bound_h, self.bound_w, -self.angle, True, None, self.interp - ) - crop = CropTransform( - (rotation.bound_w - self.w) // 2, (rotation.bound_h - self.h) // 2, self.w, self.h - ) - return TransformList([rotation, crop]) - - -class ColorTransform(Transform): - """ - Generic wrapper for any photometric transforms. - These transformations should only affect the color space and - not the coordinate space of the image (e.g. 
annotation - coordinates such as bounding boxes should not be changed) - """ - - def __init__(self, op): - """ - Args: - op (Callable): operation to be applied to the image, - which takes in an ndarray and returns an ndarray. - """ - if not callable(op): - raise ValueError("op parameter should be callable") - super().__init__() - self._set_attributes(locals()) - - def apply_image(self, img): - return self.op(img) - - def apply_coords(self, coords): - return coords - - def inverse(self): - return NoOpTransform() - - def apply_segmentation(self, segmentation): - return segmentation - - -class PILColorTransform(ColorTransform): - """ - Generic wrapper for PIL Photometric image transforms, - which affect the color space and not the coordinate - space of the image - """ - - def __init__(self, op): - """ - Args: - op (Callable): operation to be applied to the image, - which takes in a PIL Image and returns a transformed - PIL Image. - For reference on possible operations see: - - https://pillow.readthedocs.io/en/stable/ - """ - if not callable(op): - raise ValueError("op parameter should be callable") - super().__init__(op) - - def apply_image(self, img): - img = Image.fromarray(img) - return np.asarray(super().apply_image(img)) - - -def HFlip_rotated_box(transform, rotated_boxes): - """ - Apply the horizontal flip transform on rotated boxes. - - Args: - rotated_boxes (ndarray): Nx5 floating point array of - (x_center, y_center, width, height, angle_degrees) format - in absolute coordinates. - """ - # Transform x_center - rotated_boxes[:, 0] = transform.width - rotated_boxes[:, 0] - # Transform angle - rotated_boxes[:, 4] = -rotated_boxes[:, 4] - return rotated_boxes - - -def Resize_rotated_box(transform, rotated_boxes): - """ - Apply the resizing transform on rotated boxes. For details of how these (approximation) - formulas are derived, please refer to :meth:`RotatedBoxes.scale`. - - Args: - rotated_boxes (ndarray): Nx5 floating point array of - (x_center, y_center, width, height, angle_degrees) format - in absolute coordinates. - """ - scale_factor_x = transform.new_w * 1.0 / transform.w - scale_factor_y = transform.new_h * 1.0 / transform.h - rotated_boxes[:, 0] *= scale_factor_x - rotated_boxes[:, 1] *= scale_factor_y - theta = rotated_boxes[:, 4] * np.pi / 180.0 - c = np.cos(theta) - s = np.sin(theta) - rotated_boxes[:, 2] *= np.sqrt(np.square(scale_factor_x * c) + np.square(scale_factor_y * s)) - rotated_boxes[:, 3] *= np.sqrt(np.square(scale_factor_x * s) + np.square(scale_factor_y * c)) - rotated_boxes[:, 4] = np.arctan2(scale_factor_x * s, scale_factor_y * c) * 180 / np.pi - - return rotated_boxes - - -HFlipTransform.register_type("rotated_box", HFlip_rotated_box) -ResizeTransform.register_type("rotated_box", Resize_rotated_box) - -# not necessary any more with latest fvcore -NoOpTransform.register_type("rotated_box", lambda t, x: x) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py deleted file mode 100755 index 08a61572..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -from .launch import * -from .train_loop import * - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -# prefer to let hooks and defaults live in separate namespaces (therefore not in __all__) -# but still make them available here -from .hooks import * -from .defaults import * diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py deleted file mode 100755 index cc3faa15..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/defaults.py +++ /dev/null @@ -1,715 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -This file contains components with some default boilerplate logic user may need -in training / testing. They will not work for everyone, but many users may find them useful. - -The behavior of functions/classes in this file is subject to change, -since they are meant to represent the "common default behavior" people need in their projects. -""" - -import argparse -import logging -import os -import sys -import weakref -from collections import OrderedDict -from typing import Optional -import torch -from fvcore.nn.precise_bn import get_bn_modules -from omegaconf import OmegaConf -from torch.nn.parallel import DistributedDataParallel - -import detectron2.data.transforms as T -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import CfgNode, LazyConfig -from detectron2.data import ( - MetadataCatalog, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.evaluation import ( - DatasetEvaluator, - inference_on_dataset, - print_csv_format, - verify_results, -) -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils import comm -from detectron2.utils.collect_env import collect_env_info -from detectron2.utils.env import seed_all_rng -from detectron2.utils.events import CommonMetricPrinter, JSONWriter, TensorboardXWriter -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger - -from . import hooks -from .train_loop import AMPTrainer, SimpleTrainer, TrainerBase - -__all__ = [ - "create_ddp_model", - "default_argument_parser", - "default_setup", - "default_writers", - "DefaultPredictor", - "DefaultTrainer", -] - - -def create_ddp_model(model, *, fp16_compression=False, **kwargs): - """ - Create a DistributedDataParallel model if there are >1 processes. - - Args: - model: a torch.nn.Module - fp16_compression: add fp16 compression hooks to the ddp object. - See more at https://pytorch.org/docs/stable/ddp_comm_hooks.html#torch.distributed.algorithms.ddp_comm_hooks.default_hooks.fp16_compress_hook - kwargs: other arguments of :module:`torch.nn.parallel.DistributedDataParallel`. - """ # noqa - if comm.get_world_size() == 1: - return model - if "device_ids" not in kwargs: - kwargs["device_ids"] = [comm.get_local_rank()] - ddp = DistributedDataParallel(model, **kwargs) - if fp16_compression: - from torch.distributed.algorithms.ddp_comm_hooks import default as comm_hooks - - ddp.register_comm_hook(state=None, hook=comm_hooks.fp16_compress_hook) - return ddp - - -def default_argument_parser(epilog=None): - """ - Create a parser with some common arguments used by detectron2 users. - - Args: - epilog (str): epilog passed to ArgumentParser describing the usage. 
- - Returns: - argparse.ArgumentParser: - """ - parser = argparse.ArgumentParser( - epilog=epilog - or f""" -Examples: - -Run on single machine: - $ {sys.argv[0]} --num-gpus 8 --config-file cfg.yaml - -Change some config options: - $ {sys.argv[0]} --config-file cfg.yaml MODEL.WEIGHTS /path/to/weight.pth SOLVER.BASE_LR 0.001 - -Run on multiple machines: - (machine0)$ {sys.argv[0]} --machine-rank 0 --num-machines 2 --dist-url [--other-flags] - (machine1)$ {sys.argv[0]} --machine-rank 1 --num-machines 2 --dist-url [--other-flags] -""", - formatter_class=argparse.RawDescriptionHelpFormatter, - ) - parser.add_argument("--config-file", default="", metavar="FILE", help="path to config file") - parser.add_argument( - "--resume", - action="store_true", - help="Whether to attempt to resume from the checkpoint directory. " - "See documentation of `DefaultTrainer.resume_or_load()` for what it means.", - ) - parser.add_argument("--eval-only", action="store_true", help="perform evaluation only") - parser.add_argument("--num-gpus", type=int, default=1, help="number of gpus *per machine*") - parser.add_argument("--num-machines", type=int, default=1, help="total number of machines") - parser.add_argument( - "--machine-rank", type=int, default=0, help="the rank of this machine (unique per machine)" - ) - - # PyTorch still may leave orphan processes in multi-gpu training. - # Therefore we use a deterministic way to obtain port, - # so that users are aware of orphan processes by seeing the port occupied. - port = 2 ** 15 + 2 ** 14 + hash(os.getuid() if sys.platform != "win32" else 1) % 2 ** 14 - parser.add_argument( - "--dist-url", - default="tcp://127.0.0.1:{}".format(port), - help="initialization URL for pytorch distributed backend. See " - "https://pytorch.org/docs/stable/distributed.html for details.", - ) - parser.add_argument( - "opts", - help=""" -Modify config options at the end of the command. For Yacs configs, use -space-separated "PATH.KEY VALUE" pairs. -For python-based LazyConfig, use "path.key=value". - """.strip(), - default=None, - nargs=argparse.REMAINDER, - ) - return parser - - -def _try_get_key(cfg, *keys, default=None): - """ - Try select keys from cfg until the first key that exists. Otherwise return default. - """ - if isinstance(cfg, CfgNode): - cfg = OmegaConf.create(cfg.dump()) - for k in keys: - none = object() - p = OmegaConf.select(cfg, k, default=none) - if p is not none: - return p - return default - - -def _highlight(code, filename): - try: - import pygments - except ImportError: - return code - - from pygments.lexers import Python3Lexer, YamlLexer - from pygments.formatters import Terminal256Formatter - - lexer = Python3Lexer() if filename.endswith(".py") else YamlLexer() - code = pygments.highlight(code, lexer, Terminal256Formatter(style="monokai")) - return code - - -def default_setup(cfg, args): - """ - Perform some basic common setups at the beginning of a job, including: - - 1. Set up the detectron2 logger - 2. Log basic information about environment, cmdline arguments, and config - 3. 
Backup the config to the output directory - - Args: - cfg (CfgNode or omegaconf.DictConfig): the full config to be used - args (argparse.NameSpace): the command line arguments to be logged - """ - output_dir = _try_get_key(cfg, "OUTPUT_DIR", "output_dir", "train.output_dir") - if comm.is_main_process() and output_dir: - PathManager.mkdirs(output_dir) - - rank = comm.get_rank() - setup_logger(output_dir, distributed_rank=rank, name="fvcore") - logger = setup_logger(output_dir, distributed_rank=rank) - - logger.info("Rank of current process: {}. World size: {}".format(rank, comm.get_world_size())) - logger.info("Environment info:\n" + collect_env_info()) - - logger.info("Command line arguments: " + str(args)) - if hasattr(args, "config_file") and args.config_file != "": - logger.info( - "Contents of args.config_file={}:\n{}".format( - args.config_file, - _highlight(PathManager.open(args.config_file, "r").read(), args.config_file), - ) - ) - - if comm.is_main_process() and output_dir: - # Note: some of our scripts may expect the existence of - # config.yaml in output directory - path = os.path.join(output_dir, "config.yaml") - if isinstance(cfg, CfgNode): - logger.info("Running with full config:\n{}".format(_highlight(cfg.dump(), ".yaml"))) - with PathManager.open(path, "w") as f: - f.write(cfg.dump()) - else: - LazyConfig.save(cfg, path) - logger.info("Full config saved to {}".format(path)) - - # make sure each worker has a different, yet deterministic seed if specified - seed = _try_get_key(cfg, "SEED", "train.seed", default=-1) - seed_all_rng(None if seed < 0 else seed + rank) - - # cudnn benchmark has large overhead. It shouldn't be used considering the small size of - # typical validation set. - if not (hasattr(args, "eval_only") and args.eval_only): - torch.backends.cudnn.benchmark = _try_get_key( - cfg, "CUDNN_BENCHMARK", "train.cudnn_benchmark", default=False - ) - - -def default_writers(output_dir: str, max_iter: Optional[int] = None): - """ - Build a list of :class:`EventWriter` to be used. - It now consists of a :class:`CommonMetricPrinter`, - :class:`TensorboardXWriter` and :class:`JSONWriter`. - - Args: - output_dir: directory to store JSON metrics and tensorboard events - max_iter: the total number of iterations - - Returns: - list[EventWriter]: a list of :class:`EventWriter` objects. - """ - PathManager.mkdirs(output_dir) - return [ - # It may not always print what you want to see, since it prints "common" metrics only. - CommonMetricPrinter(max_iter), - JSONWriter(os.path.join(output_dir, "metrics.json")), - TensorboardXWriter(output_dir), - ] - - -class DefaultPredictor: - """ - Create a simple end-to-end predictor with the given config that runs on - single device for a single input image. - - Compared to using the model directly, this class does the following additions: - - 1. Load checkpoint from `cfg.MODEL.WEIGHTS`. - 2. Always take BGR image as the input and apply conversion defined by `cfg.INPUT.FORMAT`. - 3. Apply resizing defined by `cfg.INPUT.{MIN,MAX}_SIZE_TEST`. - 4. Take one input image and produce a single output, instead of a batch. - - This is meant for simple demo purposes, so it does the above steps automatically. - This is not meant for benchmarks or running complicated inference logic. - If you'd like to do anything more complicated, please refer to its source code as - examples to build and use the model manually. - - Attributes: - metadata (Metadata): the metadata of the underlying dataset, obtained from - cfg.DATASETS.TEST. 
- - Examples: - :: - pred = DefaultPredictor(cfg) - inputs = cv2.imread("input.jpg") - outputs = pred(inputs) - """ - - def __init__(self, cfg): - self.cfg = cfg.clone() # cfg can be modified by model - self.model = build_model(self.cfg) - self.model.eval() - if len(cfg.DATASETS.TEST): - self.metadata = MetadataCatalog.get(cfg.DATASETS.TEST[0]) - - checkpointer = DetectionCheckpointer(self.model) - checkpointer.load(cfg.MODEL.WEIGHTS) - - self.aug = T.ResizeShortestEdge( - [cfg.INPUT.MIN_SIZE_TEST, cfg.INPUT.MIN_SIZE_TEST], cfg.INPUT.MAX_SIZE_TEST - ) - - self.input_format = cfg.INPUT.FORMAT - assert self.input_format in ["RGB", "BGR"], self.input_format - - def __call__(self, original_image): - """ - Args: - original_image (np.ndarray): an image of shape (H, W, C) (in BGR order). - - Returns: - predictions (dict): - the output of the model for one image only. - See :doc:`/tutorials/models` for details about the format. - """ - with torch.no_grad(): # https://github.com/sphinx-doc/sphinx/issues/4258 - # Apply pre-processing to image. - if self.input_format == "RGB": - # whether the model expects BGR inputs or RGB - original_image = original_image[:, :, ::-1] - height, width = original_image.shape[:2] - image = self.aug.get_transform(original_image).apply_image(original_image) - image = torch.as_tensor(image.astype("float32").transpose(2, 0, 1)) - - inputs = {"image": image, "height": height, "width": width} - predictions = self.model([inputs])[0] - return predictions - - -class DefaultTrainer(TrainerBase): - """ - A trainer with default training logic. It does the following: - - 1. Create a :class:`SimpleTrainer` using model, optimizer, dataloader - defined by the given config. Create a LR scheduler defined by the config. - 2. Load the last checkpoint or `cfg.MODEL.WEIGHTS`, if exists, when - `resume_or_load` is called. - 3. Register a few common hooks defined by the config. - - It is created to simplify the **standard model training workflow** and reduce code boilerplate - for users who only need the standard training workflow, with standard features. - It means this class makes *many assumptions* about your training logic that - may easily become invalid in a new research. In fact, any assumptions beyond those made in the - :class:`SimpleTrainer` are too much for research. - - The code of this class has been annotated about restrictive assumptions it makes. - When they do not work for you, you're encouraged to: - - 1. Overwrite methods of this class, OR: - 2. Use :class:`SimpleTrainer`, which only does minimal SGD training and - nothing else. You can then add your own hooks if needed. OR: - 3. Write your own training loop similar to `tools/plain_train_net.py`. - - See the :doc:`/tutorials/training` tutorials for more details. - - Note that the behavior of this class, like other functions/classes in - this file, is not stable, since it is meant to represent the "common default behavior". - It is only guaranteed to work well with the standard models and training workflow in detectron2. - To obtain more stable behavior, write your own training logic with other public APIs. 
- - Examples: - :: - trainer = DefaultTrainer(cfg) - trainer.resume_or_load() # load last checkpoint or MODEL.WEIGHTS - trainer.train() - - Attributes: - scheduler: - checkpointer (DetectionCheckpointer): - cfg (CfgNode): - """ - - def __init__(self, cfg): - """ - Args: - cfg (CfgNode): - """ - super().__init__() - logger = logging.getLogger("detectron2") - if not logger.isEnabledFor(logging.INFO): # setup_logger is not called for d2 - setup_logger() - cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - - # Assume these objects must be constructed in this order. - model = self.build_model(cfg) - optimizer = self.build_optimizer(cfg, model) - data_loader = self.build_train_loader(cfg) - - model = create_ddp_model(model, broadcast_buffers=False) - self._trainer = (AMPTrainer if cfg.SOLVER.AMP.ENABLED else SimpleTrainer)( - model, data_loader, optimizer - ) - - self.scheduler = self.build_lr_scheduler(cfg, optimizer) - self.checkpointer = DetectionCheckpointer( - # Assume you want to save checkpoints together with logs/statistics - model, - cfg.OUTPUT_DIR, - trainer=weakref.proxy(self), - ) - self.start_iter = 0 - self.max_iter = cfg.SOLVER.MAX_ITER - self.cfg = cfg - - self.register_hooks(self.build_hooks()) - - def resume_or_load(self, resume=True): - """ - If `resume==True` and `cfg.OUTPUT_DIR` contains the last checkpoint (defined by - a `last_checkpoint` file), resume from the file. Resuming means loading all - available states (eg. optimizer and scheduler) and update iteration counter - from the checkpoint. ``cfg.MODEL.WEIGHTS`` will not be used. - - Otherwise, this is considered as an independent training. The method will load model - weights from the file `cfg.MODEL.WEIGHTS` (but will not load other states) and start - from iteration 0. - - Args: - resume (bool): whether to do resume or not - """ - self.checkpointer.resume_or_load(self.cfg.MODEL.WEIGHTS, resume=resume) - if resume and self.checkpointer.has_checkpoint(): - # The checkpoint stores the training iteration that just finished, thus we start - # at the next iteration - self.start_iter = self.iter + 1 - - def build_hooks(self): - """ - Build a list of default hooks, including timing, evaluation, - checkpointing, lr scheduling, precise BN, writing events. - - Returns: - list[HookBase]: - """ - cfg = self.cfg.clone() - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 0 # save some memory and time for PreciseBN - - ret = [ - hooks.IterationTimer(), - hooks.LRScheduler(), - hooks.PreciseBN( - # Run at the same freq as (but before) evaluation. - cfg.TEST.EVAL_PERIOD, - self.model, - # Build a new data loader to not affect training - self.build_train_loader(cfg), - cfg.TEST.PRECISE_BN.NUM_ITER, - ) - if cfg.TEST.PRECISE_BN.ENABLED and get_bn_modules(self.model) - else None, - ] - - # Do PreciseBN before checkpointer, because it updates the model and need to - # be saved by checkpointer. - # This is not always the best: if checkpointing has a different frequency, - # some checkpoints may have more precise statistics than others. - if comm.is_main_process(): - ret.append(hooks.PeriodicCheckpointer(self.checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD)) - - def test_and_save_results(): - self._last_eval_results = self.test(self.cfg, self.model) - return self._last_eval_results - - # Do evaluation after checkpointer, because then if it fails, - # we can use the saved checkpoint to debug. 
- ret.append(hooks.EvalHook(cfg.TEST.EVAL_PERIOD, test_and_save_results)) - - if comm.is_main_process(): - # Here the default print/log frequency of each writer is used. - # run writers in the end, so that evaluation metrics are written - ret.append(hooks.PeriodicWriter(self.build_writers(), period=20)) - return ret - - def build_writers(self): - """ - Build a list of writers to be used using :func:`default_writers()`. - If you'd like a different list of writers, you can overwrite it in - your trainer. - - Returns: - list[EventWriter]: a list of :class:`EventWriter` objects. - """ - return default_writers(self.cfg.OUTPUT_DIR, self.max_iter) - - def train(self): - """ - Run training. - - Returns: - OrderedDict of results, if evaluation is enabled. Otherwise None. - """ - super().train(self.start_iter, self.max_iter) - if len(self.cfg.TEST.EXPECTED_RESULTS) and comm.is_main_process(): - assert hasattr( - self, "_last_eval_results" - ), "No evaluation results obtained during training!" - verify_results(self.cfg, self._last_eval_results) - return self._last_eval_results - - def run_step(self): - self._trainer.iter = self.iter - self._trainer.run_step() - - def state_dict(self): - ret = super().state_dict() - ret["_trainer"] = self._trainer.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self._trainer.load_state_dict(state_dict["_trainer"]) - - @classmethod - def build_model(cls, cfg): - """ - Returns: - torch.nn.Module: - - It now calls :func:`detectron2.modeling.build_model`. - Overwrite it if you'd like a different model. - """ - model = build_model(cfg) - logger = logging.getLogger(__name__) - logger.info("Model:\n{}".format(model)) - return model - - @classmethod - def build_optimizer(cls, cfg, model): - """ - Returns: - torch.optim.Optimizer: - - It now calls :func:`detectron2.solver.build_optimizer`. - Overwrite it if you'd like a different optimizer. - """ - return build_optimizer(cfg, model) - - @classmethod - def build_lr_scheduler(cls, cfg, optimizer): - """ - It now calls :func:`detectron2.solver.build_lr_scheduler`. - Overwrite it if you'd like a different scheduler. - """ - return build_lr_scheduler(cfg, optimizer) - - @classmethod - def build_train_loader(cls, cfg): - """ - Returns: - iterable - - It now calls :func:`detectron2.data.build_detection_train_loader`. - Overwrite it if you'd like a different data loader. - """ - return build_detection_train_loader(cfg) - - @classmethod - def build_test_loader(cls, cfg, dataset_name): - """ - Returns: - iterable - - It now calls :func:`detectron2.data.build_detection_test_loader`. - Overwrite it if you'd like a different data loader. - """ - return build_detection_test_loader(cfg, dataset_name) - - @classmethod - def build_evaluator(cls, cfg, dataset_name): - """ - Returns: - DatasetEvaluator or None - - It is not implemented by default. - """ - raise NotImplementedError( - """ -If you want DefaultTrainer to automatically run evaluation, -please implement `build_evaluator()` in subclasses (see train_net.py for example). -Alternatively, you can call evaluation functions yourself (see Colab balloon tutorial for example). -""" - ) - - @classmethod - def test(cls, cfg, model, evaluators=None): - """ - Evaluate the given model. The given model is expected to already contain - weights to evaluate. - - Args: - cfg (CfgNode): - model (nn.Module): - evaluators (list[DatasetEvaluator] or None): if None, will call - :meth:`build_evaluator`. 
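The `NotImplementedError` message above points users at subclassing; a hedged sketch of that pattern, assuming a COCO-format dataset and a `cfg` prepared elsewhere:

```python
from detectron2.engine import DefaultTrainer
from detectron2.evaluation import COCOEvaluator

class Trainer(DefaultTrainer):
    @classmethod
    def build_evaluator(cls, cfg, dataset_name):
        # Assumes the dataset is in (or convertible to) COCO format.
        return COCOEvaluator(dataset_name, output_dir=cfg.OUTPUT_DIR)

trainer = Trainer(cfg)                # `cfg` is an assumed, prepared CfgNode
trainer.resume_or_load(resume=False)  # load cfg.MODEL.WEIGHTS, start at iter 0
trainer.train()
```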
Otherwise, must have the same length as - ``cfg.DATASETS.TEST``. - - Returns: - dict: a dict of result metrics - """ - logger = logging.getLogger(__name__) - if isinstance(evaluators, DatasetEvaluator): - evaluators = [evaluators] - if evaluators is not None: - assert len(cfg.DATASETS.TEST) == len(evaluators), "{} != {}".format( - len(cfg.DATASETS.TEST), len(evaluators) - ) - - results = OrderedDict() - for idx, dataset_name in enumerate(cfg.DATASETS.TEST): - data_loader = cls.build_test_loader(cfg, dataset_name) - # When evaluators are passed in as arguments, - # implicitly assume that evaluators can be created before data_loader. - if evaluators is not None: - evaluator = evaluators[idx] - else: - try: - evaluator = cls.build_evaluator(cfg, dataset_name) - except NotImplementedError: - logger.warn( - "No evaluator found. Use `DefaultTrainer.test(evaluators=)`, " - "or implement its `build_evaluator` method." - ) - results[dataset_name] = {} - continue - results_i = inference_on_dataset(model, data_loader, evaluator) - results[dataset_name] = results_i - if comm.is_main_process(): - assert isinstance( - results_i, dict - ), "Evaluator must return a dict on the main process. Got {} instead.".format( - results_i - ) - logger.info("Evaluation results for {} in csv format:".format(dataset_name)) - print_csv_format(results_i) - - if len(results) == 1: - results = list(results.values())[0] - return results - - @staticmethod - def auto_scale_workers(cfg, num_workers: int): - """ - When the config is defined for certain number of workers (according to - ``cfg.SOLVER.REFERENCE_WORLD_SIZE``) that's different from the number of - workers currently in use, returns a new cfg where the total batch size - is scaled so that the per-GPU batch size stays the same as the - original ``IMS_PER_BATCH // REFERENCE_WORLD_SIZE``. - - Other config options are also scaled accordingly: - * training steps and warmup steps are scaled inverse proportionally. - * learning rate are scaled proportionally, following :paper:`ImageNet in 1h`. - - For example, with the original config like the following: - - .. code-block:: yaml - - IMS_PER_BATCH: 16 - BASE_LR: 0.1 - REFERENCE_WORLD_SIZE: 8 - MAX_ITER: 5000 - STEPS: (4000,) - CHECKPOINT_PERIOD: 1000 - - When this config is used on 16 GPUs instead of the reference number 8, - calling this method will return a new config with: - - .. code-block:: yaml - - IMS_PER_BATCH: 32 - BASE_LR: 0.2 - REFERENCE_WORLD_SIZE: 16 - MAX_ITER: 2500 - STEPS: (2000,) - CHECKPOINT_PERIOD: 500 - - Note that both the original config and this new config can be trained on 16 GPUs. - It's up to user whether to enable this feature (by setting ``REFERENCE_WORLD_SIZE``). - - Returns: - CfgNode: a new config. Same as original if ``cfg.SOLVER.REFERENCE_WORLD_SIZE==0``. - """ - old_world_size = cfg.SOLVER.REFERENCE_WORLD_SIZE - if old_world_size == 0 or old_world_size == num_workers: - return cfg - cfg = cfg.clone() - frozen = cfg.is_frozen() - cfg.defrost() - - assert ( - cfg.SOLVER.IMS_PER_BATCH % old_world_size == 0 - ), "Invalid REFERENCE_WORLD_SIZE in config!" 
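As a quick check of the docstring's 8-GPU-to-16-GPU example, the scaling rules work out as follows (`scale = 16 / 8 = 2`, so the per-GPU batch size stays at `16 // 8 == 32 // 16 == 2`):

```python
scale = 16 / 8  # num_workers / REFERENCE_WORLD_SIZE

assert int(round(16 * scale)) == 32      # IMS_PER_BATCH: 16 -> 32
assert 0.1 * scale == 0.2                # BASE_LR: 0.1 -> 0.2 (linear scaling)
assert int(round(5000 / scale)) == 2500  # MAX_ITER: 5000 -> 2500
assert int(round(4000 / scale)) == 2000  # STEPS: (4000,) -> (2000,)
assert int(round(1000 / scale)) == 500   # CHECKPOINT_PERIOD: 1000 -> 500
```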
- scale = num_workers / old_world_size - bs = cfg.SOLVER.IMS_PER_BATCH = int(round(cfg.SOLVER.IMS_PER_BATCH * scale)) - lr = cfg.SOLVER.BASE_LR = cfg.SOLVER.BASE_LR * scale - max_iter = cfg.SOLVER.MAX_ITER = int(round(cfg.SOLVER.MAX_ITER / scale)) - warmup_iter = cfg.SOLVER.WARMUP_ITERS = int(round(cfg.SOLVER.WARMUP_ITERS / scale)) - cfg.SOLVER.STEPS = tuple(int(round(s / scale)) for s in cfg.SOLVER.STEPS) - cfg.TEST.EVAL_PERIOD = int(round(cfg.TEST.EVAL_PERIOD / scale)) - cfg.SOLVER.CHECKPOINT_PERIOD = int(round(cfg.SOLVER.CHECKPOINT_PERIOD / scale)) - cfg.SOLVER.REFERENCE_WORLD_SIZE = num_workers # maintain invariant - logger = logging.getLogger(__name__) - logger.info( - f"Auto-scaling the config to batch_size={bs}, learning_rate={lr}, " - f"max_iter={max_iter}, warmup={warmup_iter}." - ) - - if frozen: - cfg.freeze() - return cfg - - -# Access basic attributes from the underlying trainer -for _attr in ["model", "data_loader", "optimizer"]: - setattr( - DefaultTrainer, - _attr, - property( - # getter - lambda self, x=_attr: getattr(self._trainer, x), - # setter - lambda self, value, x=_attr: setattr(self._trainer, x, value), - ), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py deleted file mode 100755 index 52c321f9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/hooks.py +++ /dev/null @@ -1,686 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import datetime -import itertools -import logging -import math -import operator -import os -import tempfile -import time -import warnings -from collections import Counter -import torch -from fvcore.common.checkpoint import Checkpointer -from fvcore.common.checkpoint import PeriodicCheckpointer as _PeriodicCheckpointer -from fvcore.common.param_scheduler import ParamScheduler -from fvcore.common.timer import Timer -from fvcore.nn.precise_bn import get_bn_modules, update_bn_stats - -import detectron2.utils.comm as comm -from detectron2.evaluation.testing import flatten_results_dict -from detectron2.solver import LRMultiplier -from detectron2.utils.events import EventStorage, EventWriter -from detectron2.utils.file_io import PathManager - -from .train_loop import HookBase - -__all__ = [ - "CallbackHook", - "IterationTimer", - "PeriodicWriter", - "PeriodicCheckpointer", - "BestCheckpointer", - "LRScheduler", - "AutogradProfiler", - "EvalHook", - "PreciseBN", - "TorchProfiler", - "TorchMemoryStats", -] - - -""" -Implement some common hooks. -""" - - -class CallbackHook(HookBase): - """ - Create a hook using callback functions provided by the user. - """ - - def __init__(self, *, before_train=None, after_train=None, before_step=None, after_step=None): - """ - Each argument is a function that takes one argument: the trainer. - """ - self._before_train = before_train - self._before_step = before_step - self._after_step = after_step - self._after_train = after_train - - def before_train(self): - if self._before_train: - self._before_train(self.trainer) - - def after_train(self): - if self._after_train: - self._after_train(self.trainer) - # The functions may be closures that hold reference to the trainer - # Therefore, delete them to avoid circular reference. 
- del self._before_train, self._after_train - del self._before_step, self._after_step - - def before_step(self): - if self._before_step: - self._before_step(self.trainer) - - def after_step(self): - if self._after_step: - self._after_step(self.trainer) - - -class IterationTimer(HookBase): - """ - Track the time spent for each iteration (each run_step call in the trainer). - Print a summary in the end of training. - - This hook uses the time between the call to its :meth:`before_step` - and :meth:`after_step` methods. - Under the convention that :meth:`before_step` of all hooks should only - take negligible amount of time, the :class:`IterationTimer` hook should be - placed at the beginning of the list of hooks to obtain accurate timing. - """ - - def __init__(self, warmup_iter=3): - """ - Args: - warmup_iter (int): the number of iterations at the beginning to exclude - from timing. - """ - self._warmup_iter = warmup_iter - self._step_timer = Timer() - self._start_time = time.perf_counter() - self._total_timer = Timer() - - def before_train(self): - self._start_time = time.perf_counter() - self._total_timer.reset() - self._total_timer.pause() - - def after_train(self): - logger = logging.getLogger(__name__) - total_time = time.perf_counter() - self._start_time - total_time_minus_hooks = self._total_timer.seconds() - hook_time = total_time - total_time_minus_hooks - - num_iter = self.trainer.storage.iter + 1 - self.trainer.start_iter - self._warmup_iter - - if num_iter > 0 and total_time_minus_hooks > 0: - # Speed is meaningful only after warmup - # NOTE this format is parsed by grep in some scripts - logger.info( - "Overall training speed: {} iterations in {} ({:.4f} s / it)".format( - num_iter, - str(datetime.timedelta(seconds=int(total_time_minus_hooks))), - total_time_minus_hooks / num_iter, - ) - ) - - logger.info( - "Total training time: {} ({} on hooks)".format( - str(datetime.timedelta(seconds=int(total_time))), - str(datetime.timedelta(seconds=int(hook_time))), - ) - ) - - def before_step(self): - self._step_timer.reset() - self._total_timer.resume() - - def after_step(self): - # +1 because we're in after_step, the current step is done - # but not yet counted - iter_done = self.trainer.storage.iter - self.trainer.start_iter + 1 - if iter_done >= self._warmup_iter: - sec = self._step_timer.seconds() - self.trainer.storage.put_scalars(time=sec) - else: - self._start_time = time.perf_counter() - self._total_timer.reset() - - self._total_timer.pause() - - -class PeriodicWriter(HookBase): - """ - Write events to EventStorage (by calling ``writer.write()``) periodically. - - It is executed every ``period`` iterations and after the last iteration. - Note that ``period`` does not affect how data is smoothed by each writer. - """ - - def __init__(self, writers, period=20): - """ - Args: - writers (list[EventWriter]): a list of EventWriter objects - period (int): - """ - self._writers = writers - for w in writers: - assert isinstance(w, EventWriter), w - self._period = period - - def after_step(self): - if (self.trainer.iter + 1) % self._period == 0 or ( - self.trainer.iter == self.trainer.max_iter - 1 - ): - for writer in self._writers: - writer.write() - - def after_train(self): - for writer in self._writers: - # If any new data is found (e.g. produced by other after_train), - # write them before closing - writer.write() - writer.close() - - -class PeriodicCheckpointer(_PeriodicCheckpointer, HookBase): - """ - Same as :class:`detectron2.checkpoint.PeriodicCheckpointer`, but as a hook. 
- - Note that when used as a hook, - it is unable to save additional data other than what's defined - by the given `checkpointer`. - - It is executed every ``period`` iterations and after the last iteration. - """ - - def before_train(self): - self.max_iter = self.trainer.max_iter - - def after_step(self): - # No way to use **kwargs - self.step(self.trainer.iter) - - -class BestCheckpointer(HookBase): - """ - Checkpoints best weights based off given metric. - - This hook should be used in conjunction to and executed after the hook - that produces the metric, e.g. `EvalHook`. - """ - - def __init__( - self, - eval_period: int, - checkpointer: Checkpointer, - val_metric: str, - mode: str = "max", - file_prefix: str = "model_best", - ) -> None: - """ - Args: - eval_period (int): the period `EvalHook` is set to run. - checkpointer: the checkpointer object used to save checkpoints. - val_metric (str): validation metric to track for best checkpoint, e.g. "bbox/AP50" - mode (str): one of {'max', 'min'}. controls whether the chosen val metric should be - maximized or minimized, e.g. for "bbox/AP50" it should be "max" - file_prefix (str): the prefix of checkpoint's filename, defaults to "model_best" - """ - self._logger = logging.getLogger(__name__) - self._period = eval_period - self._val_metric = val_metric - assert mode in [ - "max", - "min", - ], f'Mode "{mode}" to `BestCheckpointer` is unknown. It should be one of {"max", "min"}.' - if mode == "max": - self._compare = operator.gt - else: - self._compare = operator.lt - self._checkpointer = checkpointer - self._file_prefix = file_prefix - self.best_metric = None - self.best_iter = None - - def _update_best(self, val, iteration): - if math.isnan(val) or math.isinf(val): - return False - self.best_metric = val - self.best_iter = iteration - return True - - def _best_checking(self): - metric_tuple = self.trainer.storage.latest().get(self._val_metric) - if metric_tuple is None: - self._logger.warning( - f"Given val metric {self._val_metric} does not seem to be computed/stored." - "Will not be checkpointing based on it." - ) - return - else: - latest_metric, metric_iter = metric_tuple - - if self.best_metric is None: - if self._update_best(latest_metric, metric_iter): - additional_state = {"iteration": metric_iter} - self._checkpointer.save(f"{self._file_prefix}", **additional_state) - self._logger.info( - f"Saved first model at {self.best_metric:0.5f} @ {self.best_iter} steps" - ) - elif self._compare(latest_metric, self.best_metric): - additional_state = {"iteration": metric_iter} - self._checkpointer.save(f"{self._file_prefix}", **additional_state) - self._logger.info( - f"Saved best model as latest eval score for {self._val_metric} is " - f"{latest_metric:0.5f}, better than last best score " - f"{self.best_metric:0.5f} @ iteration {self.best_iter}." - ) - self._update_best(latest_metric, metric_iter) - else: - self._logger.info( - f"Not saving as latest eval score for {self._val_metric} is {latest_metric:0.5f}, " - f"not better than best score {self.best_metric:0.5f} @ iteration {self.best_iter}." 
- ) - - def after_step(self): - # same conditions as `EvalHook` - next_iter = self.trainer.iter + 1 - if ( - self._period > 0 - and next_iter % self._period == 0 - and next_iter != self.trainer.max_iter - ): - self._best_checking() - - def after_train(self): - # same conditions as `EvalHook` - if self.trainer.iter + 1 >= self.trainer.max_iter: - self._best_checking() - - -class LRScheduler(HookBase): - """ - A hook which executes a torch builtin LR scheduler and summarizes the LR. - It is executed after every iteration. - """ - - def __init__(self, optimizer=None, scheduler=None): - """ - Args: - optimizer (torch.optim.Optimizer): - scheduler (torch.optim.LRScheduler or fvcore.common.param_scheduler.ParamScheduler): - if a :class:`ParamScheduler` object, it defines the multiplier over the base LR - in the optimizer. - - If any argument is not given, will try to obtain it from the trainer. - """ - self._optimizer = optimizer - self._scheduler = scheduler - - def before_train(self): - self._optimizer = self._optimizer or self.trainer.optimizer - if isinstance(self.scheduler, ParamScheduler): - self._scheduler = LRMultiplier( - self._optimizer, - self.scheduler, - self.trainer.max_iter, - last_iter=self.trainer.iter - 1, - ) - self._best_param_group_id = LRScheduler.get_best_param_group_id(self._optimizer) - - @staticmethod - def get_best_param_group_id(optimizer): - # NOTE: some heuristics on what LR to summarize - # summarize the param group with most parameters - largest_group = max(len(g["params"]) for g in optimizer.param_groups) - - if largest_group == 1: - # If all groups have one parameter, - # then find the most common initial LR, and use it for summary - lr_count = Counter([g["lr"] for g in optimizer.param_groups]) - lr = lr_count.most_common()[0][0] - for i, g in enumerate(optimizer.param_groups): - if g["lr"] == lr: - return i - else: - for i, g in enumerate(optimizer.param_groups): - if len(g["params"]) == largest_group: - return i - - def after_step(self): - lr = self._optimizer.param_groups[self._best_param_group_id]["lr"] - self.trainer.storage.put_scalar("lr", lr, smoothing_hint=False) - self.scheduler.step() - - @property - def scheduler(self): - return self._scheduler or self.trainer.scheduler - - def state_dict(self): - if isinstance(self.scheduler, torch.optim.lr_scheduler._LRScheduler): - return self.scheduler.state_dict() - return {} - - def load_state_dict(self, state_dict): - if isinstance(self.scheduler, torch.optim.lr_scheduler._LRScheduler): - logger = logging.getLogger(__name__) - logger.info("Loading scheduler from state_dict ...") - self.scheduler.load_state_dict(state_dict) - - -class TorchProfiler(HookBase): - """ - A hook which runs `torch.profiler.profile`. - - Examples: - :: - hooks.TorchProfiler( - lambda trainer: 10 < trainer.iter < 20, self.cfg.OUTPUT_DIR - ) - - The above example will run the profiler for iteration 10~20 and dump - results to ``OUTPUT_DIR``. We did not profile the first few iterations - because they are typically slower than the rest. - The result files can be loaded in the ``chrome://tracing`` page in chrome browser, - and the tensorboard visualizations can be visualized using - ``tensorboard --logdir OUTPUT_DIR/log`` - """ - - def __init__(self, enable_predicate, output_dir, *, activities=None, save_tensorboard=True): - """ - Args: - enable_predicate (callable[trainer -> bool]): a function which takes a trainer, - and returns whether to enable the profiler. 
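As the `BestCheckpointer` docstring above notes, it must run after the hook that produces its metric; a hedged wiring sketch, with `trainer` and `cfg` assumed to exist and the metric key `"bbox/AP50"` taken from the docstring:

```python
from detectron2.engine import hooks

trainer.register_hooks([
    # EvalHook stores flattened metrics such as "bbox/AP50" into storage ...
    hooks.EvalHook(cfg.TEST.EVAL_PERIOD, lambda: trainer.test(cfg, trainer.model)),
    # ... and BestCheckpointer, registered after it, tracks that key.
    hooks.BestCheckpointer(
        cfg.TEST.EVAL_PERIOD, trainer.checkpointer, "bbox/AP50", mode="max"
    ),
])
```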
- It will be called once every step, and can be used to select which steps to profile. - output_dir (str): the output directory to dump tracing files. - activities (iterable): same as in `torch.profiler.profile`. - save_tensorboard (bool): whether to save tensorboard visualizations at (output_dir)/log/ - """ - self._enable_predicate = enable_predicate - self._activities = activities - self._output_dir = output_dir - self._save_tensorboard = save_tensorboard - - def before_step(self): - if self._enable_predicate(self.trainer): - if self._save_tensorboard: - on_trace_ready = torch.profiler.tensorboard_trace_handler( - os.path.join( - self._output_dir, - "log", - "profiler-tensorboard-iter{}".format(self.trainer.iter), - ), - f"worker{comm.get_rank()}", - ) - else: - on_trace_ready = None - self._profiler = torch.profiler.profile( - activities=self._activities, - on_trace_ready=on_trace_ready, - record_shapes=True, - profile_memory=True, - with_stack=True, - with_flops=True, - ) - self._profiler.__enter__() - else: - self._profiler = None - - def after_step(self): - if self._profiler is None: - return - self._profiler.__exit__(None, None, None) - if not self._save_tensorboard: - PathManager.mkdirs(self._output_dir) - out_file = os.path.join( - self._output_dir, "profiler-trace-iter{}.json".format(self.trainer.iter) - ) - if "://" not in out_file: - self._profiler.export_chrome_trace(out_file) - else: - # Support non-posix filesystems - with tempfile.TemporaryDirectory(prefix="detectron2_profiler") as d: - tmp_file = os.path.join(d, "tmp.json") - self._profiler.export_chrome_trace(tmp_file) - with open(tmp_file) as f: - content = f.read() - with PathManager.open(out_file, "w") as f: - f.write(content) - - -class AutogradProfiler(TorchProfiler): - """ - A hook which runs `torch.autograd.profiler.profile`. - - Examples: - :: - hooks.AutogradProfiler( - lambda trainer: 10 < trainer.iter < 20, self.cfg.OUTPUT_DIR - ) - - The above example will run the profiler for iteration 10~20 and dump - results to ``OUTPUT_DIR``. We did not profile the first few iterations - because they are typically slower than the rest. - The result files can be loaded in the ``chrome://tracing`` page in chrome browser. - - Note: - When used together with NCCL on older version of GPUs, - autograd profiler may cause deadlock because it unnecessarily allocates - memory on every device it sees. The memory management calls, if - interleaved with NCCL calls, lead to deadlock on GPUs that do not - support ``cudaLaunchCooperativeKernelMultiDevice``. - """ - - def __init__(self, enable_predicate, output_dir, *, use_cuda=True): - """ - Args: - enable_predicate (callable[trainer -> bool]): a function which takes a trainer, - and returns whether to enable the profiler. - It will be called once every step, and can be used to select which steps to profile. - output_dir (str): the output directory to dump tracing files. - use_cuda (bool): same as in `torch.autograd.profiler.profile`. - """ - warnings.warn("AutogradProfiler has been deprecated in favor of TorchProfiler.") - self._enable_predicate = enable_predicate - self._use_cuda = use_cuda - self._output_dir = output_dir - - def before_step(self): - if self._enable_predicate(self.trainer): - self._profiler = torch.autograd.profiler.profile(use_cuda=self._use_cuda) - self._profiler.__enter__() - else: - self._profiler = None - - -class EvalHook(HookBase): - """ - Run an evaluation function periodically, and at the end of training. 
- - It is executed every ``eval_period`` iterations and after the last iteration. - """ - - def __init__(self, eval_period, eval_function): - """ - Args: - eval_period (int): the period to run `eval_function`. Set to 0 to - not evaluate periodically (but still after the last iteration). - eval_function (callable): a function which takes no arguments, and - returns a nested dict of evaluation metrics. - - Note: - This hook must be enabled in all or none workers. - If you would like only certain workers to perform evaluation, - give other workers a no-op function (`eval_function=lambda: None`). - """ - self._period = eval_period - self._func = eval_function - - def _do_eval(self): - results = self._func() - - if results: - assert isinstance( - results, dict - ), "Eval function must return a dict. Got {} instead.".format(results) - - flattened_results = flatten_results_dict(results) - for k, v in flattened_results.items(): - try: - v = float(v) - except Exception as e: - raise ValueError( - "[EvalHook] eval_function should return a nested dict of float. " - "Got '{}: {}' instead.".format(k, v) - ) from e - self.trainer.storage.put_scalars(**flattened_results, smoothing_hint=False) - - # Evaluation may take different time among workers. - # A barrier make them start the next iteration together. - comm.synchronize() - - def after_step(self): - next_iter = self.trainer.iter + 1 - if self._period > 0 and next_iter % self._period == 0: - # do the last eval in after_train - if next_iter != self.trainer.max_iter: - self._do_eval() - - def after_train(self): - # This condition is to prevent the eval from running after a failed training - if self.trainer.iter + 1 >= self.trainer.max_iter: - self._do_eval() - # func is likely a closure that holds reference to the trainer - # therefore we clean it to avoid circular reference in the end - del self._func - - -class PreciseBN(HookBase): - """ - The standard implementation of BatchNorm uses EMA in inference, which is - sometimes suboptimal. - This class computes the true average of statistics rather than the moving average, - and put true averages to every BN layer in the given model. - - It is executed every ``period`` iterations and after the last iteration. - """ - - def __init__(self, period, model, data_loader, num_iter): - """ - Args: - period (int): the period this hook is run, or 0 to not run during training. - The hook will always run in the end of training. - model (nn.Module): a module whose all BN layers in training mode will be - updated by precise BN. - Note that user is responsible for ensuring the BN layers to be - updated are in training mode when this hook is triggered. - data_loader (iterable): it will produce data to be run by `model(data)`. - num_iter (int): number of iterations used to compute the precise - statistics. - """ - self._logger = logging.getLogger(__name__) - if len(get_bn_modules(model)) == 0: - self._logger.info( - "PreciseBN is disabled because model does not contain BN layers in training mode." - ) - self._disabled = True - return - - self._model = model - self._data_loader = data_loader - self._num_iter = num_iter - self._period = period - self._disabled = False - - self._data_iter = None - - def after_step(self): - next_iter = self.trainer.iter + 1 - is_final = next_iter == self.trainer.max_iter - if is_final or (self._period > 0 and next_iter % self._period == 0): - self.update_stats() - - def update_stats(self): - """ - Update the model with precise statistics. Users can manually call this method. 
- """ - if self._disabled: - return - - if self._data_iter is None: - self._data_iter = iter(self._data_loader) - - def data_loader(): - for num_iter in itertools.count(1): - if num_iter % 100 == 0: - self._logger.info( - "Running precise-BN ... {}/{} iterations.".format(num_iter, self._num_iter) - ) - # This way we can reuse the same iterator - yield next(self._data_iter) - - with EventStorage(): # capture events in a new storage to discard them - self._logger.info( - "Running precise-BN for {} iterations... ".format(self._num_iter) - + "Note that this could produce different statistics every time." - ) - update_bn_stats(self._model, data_loader(), self._num_iter) - - -class TorchMemoryStats(HookBase): - """ - Writes pytorch's cuda memory statistics periodically. - """ - - def __init__(self, period=20, max_runs=10): - """ - Args: - period (int): Output stats each 'period' iterations - max_runs (int): Stop the logging after 'max_runs' - """ - - self._logger = logging.getLogger(__name__) - self._period = period - self._max_runs = max_runs - self._runs = 0 - - def after_step(self): - if self._runs > self._max_runs: - return - - if (self.trainer.iter + 1) % self._period == 0 or ( - self.trainer.iter == self.trainer.max_iter - 1 - ): - if torch.cuda.is_available(): - max_reserved_mb = torch.cuda.max_memory_reserved() / 1024.0 / 1024.0 - reserved_mb = torch.cuda.memory_reserved() / 1024.0 / 1024.0 - max_allocated_mb = torch.cuda.max_memory_allocated() / 1024.0 / 1024.0 - allocated_mb = torch.cuda.memory_allocated() / 1024.0 / 1024.0 - - self._logger.info( - ( - " iter: {} " - " max_reserved_mem: {:.0f}MB " - " reserved_mem: {:.0f}MB " - " max_allocated_mem: {:.0f}MB " - " allocated_mem: {:.0f}MB " - ).format( - self.trainer.iter, - max_reserved_mb, - reserved_mb, - max_allocated_mb, - allocated_mb, - ) - ) - - self._runs += 1 - if self._runs == self._max_runs: - mem_summary = torch.cuda.memory_summary() - self._logger.info("\n" + mem_summary) - - torch.cuda.reset_peak_memory_stats() diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py deleted file mode 100755 index 46f98691..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/launch.py +++ /dev/null @@ -1,126 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -from datetime import timedelta -import torch -import torch.distributed as dist -import torch.multiprocessing as mp - -from detectron2.utils import comm - -__all__ = ["DEFAULT_TIMEOUT", "launch"] - -DEFAULT_TIMEOUT = timedelta(minutes=30) - - -def _find_free_port(): - import socket - - sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) - # Binding to port 0 will cause the OS to find an available port for us - sock.bind(("", 0)) - port = sock.getsockname()[1] - sock.close() - # NOTE: there is still a chance the port could be taken by other processes. - return port - - -def launch( - main_func, - num_gpus_per_machine, - num_machines=1, - machine_rank=0, - dist_url=None, - args=(), - timeout=DEFAULT_TIMEOUT, -): - """ - Launch multi-gpu or distributed training. - This function must be called on all machines involved in the training. - It will spawn child processes (defined by ``num_gpus_per_machine``) on each machine. 
- - Args: - main_func: a function that will be called by `main_func(*args)` - num_gpus_per_machine (int): number of GPUs per machine - num_machines (int): the total number of machines - machine_rank (int): the rank of this machine - dist_url (str): url to connect to for distributed jobs, including protocol - e.g. "tcp://127.0.0.1:8686". - Can be set to "auto" to automatically select a free port on localhost - timeout (timedelta): timeout of the distributed workers - args (tuple): arguments passed to main_func - """ - world_size = num_machines * num_gpus_per_machine - if world_size > 1: - # https://github.com/pytorch/pytorch/pull/14391 - # TODO prctl in spawned processes - - if dist_url == "auto": - assert num_machines == 1, "dist_url=auto not supported in multi-machine jobs." - port = _find_free_port() - dist_url = f"tcp://127.0.0.1:{port}" - if num_machines > 1 and dist_url.startswith("file://"): - logger = logging.getLogger(__name__) - logger.warning( - "file:// is not a reliable init_method in multi-machine jobs. Prefer tcp://" - ) - - mp.spawn( - _distributed_worker, - nprocs=num_gpus_per_machine, - args=( - main_func, - world_size, - num_gpus_per_machine, - machine_rank, - dist_url, - args, - timeout, - ), - daemon=False, - ) - else: - main_func(*args) - - -def _distributed_worker( - local_rank, - main_func, - world_size, - num_gpus_per_machine, - machine_rank, - dist_url, - args, - timeout=DEFAULT_TIMEOUT, -): - assert torch.cuda.is_available(), "cuda is not available. Please check your installation." - global_rank = machine_rank * num_gpus_per_machine + local_rank - try: - dist.init_process_group( - backend="NCCL", - init_method=dist_url, - world_size=world_size, - rank=global_rank, - timeout=timeout, - ) - except Exception as e: - logger = logging.getLogger(__name__) - logger.error("Process group URL: {}".format(dist_url)) - raise e - - # Setup the local process group (which contains ranks within the same machine) - assert comm._LOCAL_PROCESS_GROUP is None - num_machines = world_size // num_gpus_per_machine - for i in range(num_machines): - ranks_on_i = list(range(i * num_gpus_per_machine, (i + 1) * num_gpus_per_machine)) - pg = dist.new_group(ranks_on_i) - if i == machine_rank: - comm._LOCAL_PROCESS_GROUP = pg - - assert num_gpus_per_machine <= torch.cuda.device_count() - torch.cuda.set_device(local_rank) - - # synchronize is needed here to prevent a possible timeout after calling init_process_group - # See: https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 - comm.synchronize() - - main_func(*args) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py deleted file mode 100755 index c4a86b52..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/engine/train_loop.py +++ /dev/null @@ -1,417 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -import time -import weakref -from typing import List, Mapping, Optional -import torch -from torch.nn.parallel import DataParallel, DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.utils.events import EventStorage, get_event_storage -from detectron2.utils.logger import _log_api_usage - -__all__ = ["HookBase", "TrainerBase", "SimpleTrainer", "AMPTrainer"] - - -class HookBase: - """ - Base class for hooks that can be registered with :class:`TrainerBase`. - - Each hook can implement 4 methods. 
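To round out the `launch()` helper deleted above, a minimal single-machine sketch; the signature matches the deleted function, while the GPU count and argument values are assumptions:

```python
from detectron2.engine import launch

def main(args):
    # Build cfg / trainer and run training here.
    ...

args = None  # placeholder for parsed CLI arguments

launch(
    main,
    num_gpus_per_machine=8,  # spawns 8 worker processes
    num_machines=1,
    machine_rank=0,
    dist_url="auto",         # picks a free localhost port, as handled above
    args=(args,),
)
```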
The way they are called is demonstrated - in the following snippet: - :: - hook.before_train() - for iter in range(start_iter, max_iter): - hook.before_step() - trainer.run_step() - hook.after_step() - iter += 1 - hook.after_train() - - Notes: - 1. In the hook method, users can access ``self.trainer`` to access more - properties about the context (e.g., model, current iteration, or config - if using :class:`DefaultTrainer`). - - 2. A hook that does something in :meth:`before_step` can often be - implemented equivalently in :meth:`after_step`. - If the hook takes non-trivial time, it is strongly recommended to - implement the hook in :meth:`after_step` instead of :meth:`before_step`. - The convention is that :meth:`before_step` should only take negligible time. - - Following this convention will allow hooks that do care about the difference - between :meth:`before_step` and :meth:`after_step` (e.g., timer) to - function properly. - - """ - - trainer: "TrainerBase" = None - """ - A weak reference to the trainer object. Set by the trainer when the hook is registered. - """ - - def before_train(self): - """ - Called before the first iteration. - """ - pass - - def after_train(self): - """ - Called after the last iteration. - """ - pass - - def before_step(self): - """ - Called before each iteration. - """ - pass - - def after_step(self): - """ - Called after each iteration. - """ - pass - - def state_dict(self): - """ - Hooks are stateless by default, but can be made checkpointable by - implementing `state_dict` and `load_state_dict`. - """ - return {} - - -class TrainerBase: - """ - Base class for iterative trainer with hooks. - - The only assumption we made here is: the training runs in a loop. - A subclass can implement what the loop is. - We made no assumptions about the existence of dataloader, optimizer, model, etc. - - Attributes: - iter(int): the current iteration. - - start_iter(int): The iteration to start with. - By convention the minimum possible value is 0. - - max_iter(int): The iteration to end training. - - storage(EventStorage): An EventStorage that's opened during the course of training. - """ - - def __init__(self) -> None: - self._hooks: List[HookBase] = [] - self.iter: int = 0 - self.start_iter: int = 0 - self.max_iter: int - self.storage: EventStorage - _log_api_usage("trainer." + self.__class__.__name__) - - def register_hooks(self, hooks: List[Optional[HookBase]]) -> None: - """ - Register hooks to the trainer. The hooks are executed in the order - they are registered. - - Args: - hooks (list[Optional[HookBase]]): list of hooks - """ - hooks = [h for h in hooks if h is not None] - for h in hooks: - assert isinstance(h, HookBase) - # To avoid circular reference, hooks and trainer cannot own each other. 
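Given the `HookBase` lifecycle above, a minimal custom hook might look like the sketch below; the metric name `"total_loss"` is the one `SimpleTrainer` writes, and the logging period is an arbitrary choice:

```python
from detectron2.engine import HookBase

class LossLogger(HookBase):
    """Print the latest total_loss every 100 iterations."""

    def after_step(self):
        if (self.trainer.iter + 1) % 100 == 0:
            latest = self.trainer.storage.latest()  # name -> (value, iter)
            if "total_loss" in latest:
                value, it = latest["total_loss"]
                print(f"iter {it}: total_loss={value:.4f}")
```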
- # This normally does not matter, but will cause memory leak if the - # involved objects contain __del__: - # See http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ - h.trainer = weakref.proxy(self) - self._hooks.extend(hooks) - - def train(self, start_iter: int, max_iter: int): - """ - Args: - start_iter, max_iter (int): See docs above - """ - logger = logging.getLogger(__name__) - logger.info("Starting training from iteration {}".format(start_iter)) - - self.iter = self.start_iter = start_iter - self.max_iter = max_iter - - with EventStorage(start_iter) as self.storage: - try: - self.before_train() - for self.iter in range(start_iter, max_iter): - self.before_step() - self.run_step() - self.after_step() - # self.iter == max_iter can be used by `after_train` to - # tell whether the training successfully finished or failed - # due to exceptions. - self.iter += 1 - except Exception: - logger.exception("Exception during training:") - raise - finally: - self.after_train() - - def before_train(self): - for h in self._hooks: - h.before_train() - - def after_train(self): - self.storage.iter = self.iter - for h in self._hooks: - h.after_train() - - def before_step(self): - # Maintain the invariant that storage.iter == trainer.iter - # for the entire execution of each step - self.storage.iter = self.iter - - for h in self._hooks: - h.before_step() - - def after_step(self): - for h in self._hooks: - h.after_step() - - def run_step(self): - raise NotImplementedError - - def state_dict(self): - ret = {"iteration": self.iter} - hooks_state = {} - for h in self._hooks: - sd = h.state_dict() - if sd: - name = type(h).__qualname__ - if name in hooks_state: - # TODO handle repetitive stateful hooks - continue - hooks_state[name] = sd - if hooks_state: - ret["hooks"] = hooks_state - return ret - - def load_state_dict(self, state_dict): - logger = logging.getLogger(__name__) - self.iter = state_dict["iteration"] - for key, value in state_dict.get("hooks", {}).items(): - for h in self._hooks: - try: - name = type(h).__qualname__ - except AttributeError: - continue - if name == key: - h.load_state_dict(value) - break - else: - logger.warning(f"Cannot find the hook '{key}', its state_dict is ignored.") - - -class SimpleTrainer(TrainerBase): - """ - A simple trainer for the most common type of task: - single-cost single-optimizer single-data-source iterative optimization, - optionally using data-parallelism. - It assumes that every step, you: - - 1. Compute the loss with a data from the data_loader. - 2. Compute the gradients with the above loss. - 3. Update the model with the optimizer. - - All other tasks during training (checkpointing, logging, evaluation, LR schedule) - are maintained by hooks, which can be registered by :meth:`TrainerBase.register_hooks`. - - If you want to do anything fancier than this, - either subclass TrainerBase and implement your own `run_step`, - or write your own training loop. - """ - - def __init__(self, model, data_loader, optimizer): - """ - Args: - model: a torch Module. Takes a data from data_loader and returns a - dict of losses. - data_loader: an iterable. Contains data to be used to call model. - optimizer: a torch optimizer. - """ - super().__init__() - - """ - We set the model to training mode in the trainer. - However it's valid to train a model that's in eval mode. - If you want your model (or a submodule of it) to behave - like evaluation during training, you can overwrite its train() method. 
- """ - model.train() - - self.model = model - self.data_loader = data_loader - self._data_loader_iter = iter(data_loader) - self.optimizer = optimizer - - def run_step(self): - """ - Implement the standard training logic described above. - """ - assert self.model.training, "[SimpleTrainer] model was changed to eval mode!" - start = time.perf_counter() - """ - If you want to do something with the data, you can wrap the dataloader. - """ - data = next(self._data_loader_iter) - data_time = time.perf_counter() - start - - """ - If you want to do something with the losses, you can wrap the model. - """ - loss_dict = self.model(data) - if isinstance(loss_dict, torch.Tensor): - losses = loss_dict - loss_dict = {"total_loss": loss_dict} - else: - losses = sum(loss_dict.values()) - - """ - If you need to accumulate gradients or do something similar, you can - wrap the optimizer with your custom `zero_grad()` method. - """ - self.optimizer.zero_grad() - losses.backward() - - self._write_metrics(loss_dict, data_time) - - """ - If you need gradient clipping/scaling or other processing, you can - wrap the optimizer with your custom `step()` method. But it is - suboptimal as explained in https://arxiv.org/abs/2006.15704 Sec 3.2.4 - """ - self.optimizer.step() - - def _write_metrics( - self, - loss_dict: Mapping[str, torch.Tensor], - data_time: float, - prefix: str = "", - ) -> None: - SimpleTrainer.write_metrics(loss_dict, data_time, prefix) - - @staticmethod - def write_metrics( - loss_dict: Mapping[str, torch.Tensor], - data_time: float, - prefix: str = "", - ) -> None: - """ - Args: - loss_dict (dict): dict of scalar losses - data_time (float): time taken by the dataloader iteration - prefix (str): prefix for logging keys - """ - metrics_dict = {k: v.detach().cpu().item() for k, v in loss_dict.items()} - metrics_dict["data_time"] = data_time - - # Gather metrics among all workers for logging - # This assumes we do DDP-style training, which is currently the only - # supported method in detectron2. - all_metrics_dict = comm.gather(metrics_dict) - - if comm.is_main_process(): - storage = get_event_storage() - - # data_time among workers can have high variance. The actual latency - # caused by data_time is the maximum among workers. - data_time = np.max([x.pop("data_time") for x in all_metrics_dict]) - storage.put_scalar("data_time", data_time) - - # average the rest metrics - metrics_dict = { - k: np.mean([x[k] for x in all_metrics_dict]) for k in all_metrics_dict[0].keys() - } - total_losses_reduced = sum(metrics_dict.values()) - if not np.isfinite(total_losses_reduced): - raise FloatingPointError( - f"Loss became infinite or NaN at iteration={storage.iter}!\n" - f"loss_dict = {metrics_dict}" - ) - - storage.put_scalar("{}total_loss".format(prefix), total_losses_reduced) - if len(metrics_dict) > 1: - storage.put_scalars(**metrics_dict) - - def state_dict(self): - ret = super().state_dict() - ret["optimizer"] = self.optimizer.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self.optimizer.load_state_dict(state_dict["optimizer"]) - - -class AMPTrainer(SimpleTrainer): - """ - Like :class:`SimpleTrainer`, but uses PyTorch's native automatic mixed precision - in the training loop. - """ - - def __init__(self, model, data_loader, optimizer, grad_scaler=None): - """ - Args: - model, data_loader, optimizer: same as in :class:`SimpleTrainer`. - grad_scaler: torch GradScaler to automatically scale gradients. 
- """ - unsupported = "AMPTrainer does not support single-process multi-device training!" - if isinstance(model, DistributedDataParallel): - assert not (model.device_ids and len(model.device_ids) > 1), unsupported - assert not isinstance(model, DataParallel), unsupported - - super().__init__(model, data_loader, optimizer) - - if grad_scaler is None: - from torch.cuda.amp import GradScaler - - grad_scaler = GradScaler() - self.grad_scaler = grad_scaler - - def run_step(self): - """ - Implement the AMP training logic. - """ - assert self.model.training, "[AMPTrainer] model was changed to eval mode!" - assert torch.cuda.is_available(), "[AMPTrainer] CUDA is required for AMP training!" - from torch.cuda.amp import autocast - - start = time.perf_counter() - data = next(self._data_loader_iter) - data_time = time.perf_counter() - start - - with autocast(): - loss_dict = self.model(data) - if isinstance(loss_dict, torch.Tensor): - losses = loss_dict - loss_dict = {"total_loss": loss_dict} - else: - losses = sum(loss_dict.values()) - - self.optimizer.zero_grad() - self.grad_scaler.scale(losses).backward() - - self._write_metrics(loss_dict, data_time) - - self.grad_scaler.step(self.optimizer) - self.grad_scaler.update() - - def state_dict(self): - ret = super().state_dict() - ret["grad_scaler"] = self.grad_scaler.state_dict() - return ret - - def load_state_dict(self, state_dict): - super().load_state_dict(state_dict) - self.grad_scaler.load_state_dict(state_dict["grad_scaler"]) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py deleted file mode 100755 index d96609e8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .cityscapes_evaluation import CityscapesInstanceEvaluator, CityscapesSemSegEvaluator -from .coco_evaluation import COCOEvaluator -from .rotated_coco_evaluation import RotatedCOCOEvaluator -from .evaluator import DatasetEvaluator, DatasetEvaluators, inference_context, inference_on_dataset -from .lvis_evaluation import LVISEvaluator -from .panoptic_evaluation import COCOPanopticEvaluator -from .pascal_voc_evaluation import PascalVOCDetectionEvaluator -from .sem_seg_evaluation import SemSegEvaluator -from .testing import print_csv_format, verify_results - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py deleted file mode 100755 index 3fb6c4cd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/cityscapes_evaluation.py +++ /dev/null @@ -1,194 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import glob -import logging -import numpy as np -import os -import tempfile -from collections import OrderedDict -import torch -from PIL import Image - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class CityscapesEvaluator(DatasetEvaluator): - """ - Base class for evaluation using cityscapes API. - """ - - def __init__(self, dataset_name): - """ - Args: - dataset_name (str): the name of the dataset. 
- It must have the following metadata associated with it: - "thing_classes", "gt_dir". - """ - self._metadata = MetadataCatalog.get(dataset_name) - self._cpu_device = torch.device("cpu") - self._logger = logging.getLogger(__name__) - - def reset(self): - self._working_dir = tempfile.TemporaryDirectory(prefix="cityscapes_eval_") - self._temp_dir = self._working_dir.name - # All workers will write to the same results directory - # TODO this does not work in distributed training - self._temp_dir = comm.all_gather(self._temp_dir)[0] - if self._temp_dir != self._working_dir.name: - self._working_dir.cleanup() - self._logger.info( - "Writing cityscapes results to temporary directory {} ...".format(self._temp_dir) - ) - - -class CityscapesInstanceEvaluator(CityscapesEvaluator): - """ - Evaluate instance segmentation results on cityscapes dataset using cityscapes API. - - Note: - * It does not work in multi-machine distributed training. - * It contains a synchronization, therefore has to be used on all ranks. - * Only the main process runs evaluation. - """ - - def process(self, inputs, outputs): - from cityscapesscripts.helpers.labels import name2label - - for input, output in zip(inputs, outputs): - file_name = input["file_name"] - basename = os.path.splitext(os.path.basename(file_name))[0] - pred_txt = os.path.join(self._temp_dir, basename + "_pred.txt") - - if "instances" in output: - output = output["instances"].to(self._cpu_device) - num_instances = len(output) - with open(pred_txt, "w") as fout: - for i in range(num_instances): - pred_class = output.pred_classes[i] - classes = self._metadata.thing_classes[pred_class] - class_id = name2label[classes].id - score = output.scores[i] - mask = output.pred_masks[i].numpy().astype("uint8") - png_filename = os.path.join( - self._temp_dir, basename + "_{}_{}.png".format(i, classes) - ) - - Image.fromarray(mask * 255).save(png_filename) - fout.write( - "{} {} {}\n".format(os.path.basename(png_filename), class_id, score) - ) - else: - # Cityscapes requires a prediction file for every ground truth image. - with open(pred_txt, "w") as fout: - pass - - def evaluate(self): - """ - Returns: - dict: has a key "segm", whose value is a dict of "AP" and "AP50". - """ - comm.synchronize() - if comm.get_rank() > 0: - return - import cityscapesscripts.evaluation.evalInstanceLevelSemanticLabeling as cityscapes_eval - - self._logger.info("Evaluating results under {} ...".format(self._temp_dir)) - - # set some global states in cityscapes evaluation API, before evaluating - cityscapes_eval.args.predictionPath = os.path.abspath(self._temp_dir) - cityscapes_eval.args.predictionWalk = None - cityscapes_eval.args.JSONOutput = False - cityscapes_eval.args.colorized = False - cityscapes_eval.args.gtInstancesFile = os.path.join(self._temp_dir, "gtInstances.json") - - # These lines are adopted from - # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py # noqa - gt_dir = PathManager.get_local_path(self._metadata.gt_dir) - groundTruthImgList = glob.glob(os.path.join(gt_dir, "*", "*_gtFine_instanceIds.png")) - assert len( - groundTruthImgList - ), "Cannot find any ground truth images to use for evaluation. 
Searched for: {}".format( - cityscapes_eval.args.groundTruthSearch - ) - predictionImgList = [] - for gt in groundTruthImgList: - predictionImgList.append(cityscapes_eval.getPrediction(gt, cityscapes_eval.args)) - results = cityscapes_eval.evaluateImgLists( - predictionImgList, groundTruthImgList, cityscapes_eval.args - )["averages"] - - ret = OrderedDict() - ret["segm"] = {"AP": results["allAp"] * 100, "AP50": results["allAp50%"] * 100} - self._working_dir.cleanup() - return ret - - -class CityscapesSemSegEvaluator(CityscapesEvaluator): - """ - Evaluate semantic segmentation results on cityscapes dataset using cityscapes API. - - Note: - * It does not work in multi-machine distributed training. - * It contains a synchronization, therefore has to be used on all ranks. - * Only the main process runs evaluation. - """ - - def process(self, inputs, outputs): - from cityscapesscripts.helpers.labels import trainId2label - - for input, output in zip(inputs, outputs): - file_name = input["file_name"] - basename = os.path.splitext(os.path.basename(file_name))[0] - pred_filename = os.path.join(self._temp_dir, basename + "_pred.png") - - output = output["sem_seg"].argmax(dim=0).to(self._cpu_device).numpy() - pred = 255 * np.ones(output.shape, dtype=np.uint8) - for train_id, label in trainId2label.items(): - if label.ignoreInEval: - continue - pred[output == train_id] = label.id - Image.fromarray(pred).save(pred_filename) - - def evaluate(self): - comm.synchronize() - if comm.get_rank() > 0: - return - # Load the Cityscapes eval script *after* setting the required env var, - # since the script reads CITYSCAPES_DATASET into global variables at load time. - import cityscapesscripts.evaluation.evalPixelLevelSemanticLabeling as cityscapes_eval - - self._logger.info("Evaluating results under {} ...".format(self._temp_dir)) - - # set some global states in cityscapes evaluation API, before evaluating - cityscapes_eval.args.predictionPath = os.path.abspath(self._temp_dir) - cityscapes_eval.args.predictionWalk = None - cityscapes_eval.args.JSONOutput = False - cityscapes_eval.args.colorized = False - - # These lines are adopted from - # https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py # noqa - gt_dir = PathManager.get_local_path(self._metadata.gt_dir) - groundTruthImgList = glob.glob(os.path.join(gt_dir, "*", "*_gtFine_labelIds.png")) - assert len( - groundTruthImgList - ), "Cannot find any ground truth images to use for evaluation. Searched for: {}".format( - cityscapes_eval.args.groundTruthSearch - ) - predictionImgList = [] - for gt in groundTruthImgList: - predictionImgList.append(cityscapes_eval.getPrediction(cityscapes_eval.args, gt)) - results = cityscapes_eval.evaluateImgLists( - predictionImgList, groundTruthImgList, cityscapes_eval.args - ) - ret = OrderedDict() - ret["sem_seg"] = { - "IoU": 100.0 * results["averageScoreClasses"], - "iIoU": 100.0 * results["averageScoreInstClasses"], - "IoU_sup": 100.0 * results["averageScoreCategories"], - "iIoU_sup": 100.0 * results["averageScoreInstCategories"], - } - self._working_dir.cleanup() - return ret diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py deleted file mode 100755 index aad7f5a6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/coco_evaluation.py +++ /dev/null @@ -1,710 +0,0 @@ -# Copyright (c) Facebook, Inc. 
and its affiliates. -import contextlib -import copy -import io -import itertools -import json -import logging -import numpy as np -import os -import pickle -from collections import OrderedDict -import pycocotools.mask as mask_util -import torch -from pycocotools.coco import COCO -from pycocotools.cocoeval import COCOeval -from tabulate import tabulate - -import detectron2.utils.comm as comm -from detectron2.config import CfgNode -from detectron2.data import MetadataCatalog -from detectron2.data.datasets.coco import convert_to_coco_json -from detectron2.evaluation.fast_eval_api import COCOeval_opt -from detectron2.structures import Boxes, BoxMode, pairwise_iou -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import create_small_table - -from .evaluator import DatasetEvaluator - - -class COCOEvaluator(DatasetEvaluator): - """ - Evaluate AR for object proposals, AP for instance detection/segmentation, AP - for keypoint detection outputs using COCO's metrics. - See http://cocodataset.org/#detection-eval and - http://cocodataset.org/#keypoints-eval to understand its metrics. - The metrics range from 0 to 100 (instead of 0 to 1), where a -1 or NaN means - the metric cannot be computed (e.g. due to no predictions made). - - In addition to COCO, this evaluator is able to support any bounding box detection, - instance segmentation, or keypoint detection dataset. - """ - - def __init__( - self, - dataset_name, - tasks=None, - distributed=True, - output_dir=None, - *, - max_dets_per_image=None, - use_fast_impl=True, - kpt_oks_sigmas=(), - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - It must have either the following corresponding metadata: - - "json_file": the path to the COCO format annotation - - Or it must be in detectron2's standard dataset format - so it can be converted to COCO format automatically. - tasks (tuple[str]): tasks that can be evaluated under the given - configuration. A task is one of "bbox", "segm", "keypoints". - By default, will infer this automatically from predictions. - distributed (True): if True, will collect results from all ranks and run evaluation - in the main process. - Otherwise, will only evaluate the results in the current process. - output_dir (str): optional, an output directory to dump all - results predicted on the dataset. The dump contains two files: - - 1. "instances_predictions.pth" a file that can be loaded with `torch.load` and - contains all the results in the format they are produced by the model. - 2. "coco_instances_results.json" a json file in COCO's result format. - max_dets_per_image (int): limit on the maximum number of detections per image. - By default in COCO, this limit is to 100, but this can be customized - to be greater, as is needed in evaluation metrics AP fixed and AP pool - (see https://arxiv.org/pdf/2102.01066.pdf) - This doesn't affect keypoint evaluation. - use_fast_impl (bool): use a fast but **unofficial** implementation to compute AP. - Although the results should be very close to the official implementation in COCO - API, it is still recommended to compute results with the official API for use in - papers. The faster implementation also uses more RAM. - kpt_oks_sigmas (list[float]): The sigmas used to calculate keypoint OKS. - See http://cocodataset.org/#keypoints-eval - When empty, it will use the defaults in COCO. - Otherwise it should be the same length as ROI_KEYPOINT_HEAD.NUM_KEYPOINTS. 
- """ - self._logger = logging.getLogger(__name__) - self._distributed = distributed - self._output_dir = output_dir - self._use_fast_impl = use_fast_impl - - # COCOeval requires the limit on the number of detections per image (maxDets) to be a list - # with at least 3 elements. The default maxDets in COCOeval is [1, 10, 100], in which the - # 3rd element (100) is used as the limit on the number of detections per image when - # evaluating AP. COCOEvaluator expects an integer for max_dets_per_image, so for COCOeval, - # we reformat max_dets_per_image into [1, 10, max_dets_per_image], based on the defaults. - if max_dets_per_image is None: - max_dets_per_image = [1, 10, 100] - else: - max_dets_per_image = [1, 10, max_dets_per_image] - self._max_dets_per_image = max_dets_per_image - - if tasks is not None and isinstance(tasks, CfgNode): - kpt_oks_sigmas = ( - tasks.TEST.KEYPOINT_OKS_SIGMAS if not kpt_oks_sigmas else kpt_oks_sigmas - ) - self._logger.warn( - "COCO Evaluator instantiated using config, this is deprecated behavior." - " Please pass in explicit arguments instead." - ) - self._tasks = None # Infering it from predictions should be better - else: - self._tasks = tasks - - self._cpu_device = torch.device("cpu") - - self._metadata = MetadataCatalog.get(dataset_name) - if not hasattr(self._metadata, "json_file"): - if output_dir is None: - raise ValueError( - "output_dir must be provided to COCOEvaluator " - "for datasets not in COCO format." - ) - self._logger.info(f"Trying to convert '{dataset_name}' to COCO format ...") - - cache_path = os.path.join(output_dir, f"{dataset_name}_coco_format.json") - self._metadata.json_file = cache_path - convert_to_coco_json(dataset_name, cache_path) - - json_file = PathManager.get_local_path(self._metadata.json_file) - with contextlib.redirect_stdout(io.StringIO()): - self._coco_api = COCO(json_file) - - # Test set json files do not contain annotations (evaluation must be - # performed using the COCO evaluation server). - self._do_evaluation = "annotations" in self._coco_api.dataset - if self._do_evaluation: - self._kpt_oks_sigmas = kpt_oks_sigmas - - def reset(self): - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a COCO model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a COCO model. It is a list of dicts with key - "instances" that contains :class:`Instances`. - """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - prediction["instances"] = instances_to_coco_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - if len(prediction) > 1: - self._predictions.append(prediction) - - def evaluate(self, img_ids=None): - """ - Args: - img_ids: a list of image IDs to evaluate on. 
Default to None for the whole dataset - """ - if self._distributed: - comm.synchronize() - predictions = comm.gather(self._predictions, dst=0) - predictions = list(itertools.chain(*predictions)) - - if not comm.is_main_process(): - return {} - else: - predictions = self._predictions - - if len(predictions) == 0: - self._logger.warning("[COCOEvaluator] Did not receive valid predictions.") - return {} - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "instances_predictions.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(predictions, f) - - self._results = OrderedDict() - if "proposals" in predictions[0]: - self._eval_box_proposals(predictions) - if "instances" in predictions[0]: - self._eval_predictions(predictions, img_ids=img_ids) - # Copy so the caller can do whatever with results - return copy.deepcopy(self._results) - - def _tasks_from_predictions(self, predictions): - """ - Get COCO API "tasks" (i.e. iou_type) from COCO-format predictions. - """ - tasks = {"bbox"} - for pred in predictions: - if "segmentation" in pred: - tasks.add("segm") - if "keypoints" in pred: - tasks.add("keypoints") - return sorted(tasks) - - def _eval_predictions(self, predictions, img_ids=None): - """ - Evaluate predictions. Fill self._results with the metrics of the tasks. - """ - self._logger.info("Preparing results for COCO format ...") - coco_results = list(itertools.chain(*[x["instances"] for x in predictions])) - tasks = self._tasks or self._tasks_from_predictions(coco_results) - - # unmap the category ids for COCO - if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - dataset_id_to_contiguous_id = self._metadata.thing_dataset_id_to_contiguous_id - all_contiguous_ids = list(dataset_id_to_contiguous_id.values()) - num_classes = len(all_contiguous_ids) - assert min(all_contiguous_ids) == 0 and max(all_contiguous_ids) == num_classes - 1 - - reverse_id_mapping = {v: k for k, v in dataset_id_to_contiguous_id.items()} - for result in coco_results: - category_id = result["category_id"] - assert category_id < num_classes, ( - f"A prediction has class={category_id}, " - f"but the dataset only has {num_classes} classes and " - f"predicted class id should be in [0, {num_classes - 1}]." - ) - result["category_id"] = reverse_id_mapping[category_id] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "coco_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(coco_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info( - "Evaluating predictions with {} COCO API...".format( - "unofficial" if self._use_fast_impl else "official" - ) - ) - for task in sorted(tasks): - assert task in {"bbox", "segm", "keypoints"}, f"Got unknown task: {task}!" - coco_eval = ( - _evaluate_predictions_on_coco( - self._coco_api, - coco_results, - task, - kpt_oks_sigmas=self._kpt_oks_sigmas, - use_fast_impl=self._use_fast_impl, - img_ids=img_ids, - max_dets_per_image=self._max_dets_per_image, - ) - if len(coco_results) > 0 - else None # cocoapi does not handle empty results very well - ) - - res = self._derive_coco_results( - coco_eval, task, class_names=self._metadata.get("thing_classes") - ) - self._results[task] = res - - def _eval_box_proposals(self, predictions): - """ - Evaluate the box proposals in predictions. 
- Fill self._results with the metrics for "box_proposals" task. - """ - if self._output_dir: - # Saving generated box proposals to file. - # Predicted box_proposals are in XYXY_ABS mode. - bbox_mode = BoxMode.XYXY_ABS.value - ids, boxes, objectness_logits = [], [], [] - for prediction in predictions: - ids.append(prediction["image_id"]) - boxes.append(prediction["proposals"].proposal_boxes.tensor.numpy()) - objectness_logits.append(prediction["proposals"].objectness_logits.numpy()) - - proposal_data = { - "boxes": boxes, - "objectness_logits": objectness_logits, - "ids": ids, - "bbox_mode": bbox_mode, - } - with PathManager.open(os.path.join(self._output_dir, "box_proposals.pkl"), "wb") as f: - pickle.dump(proposal_data, f) - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating bbox proposals ...") - res = {} - areas = {"all": "", "small": "s", "medium": "m", "large": "l"} - for limit in [100, 1000]: - for area, suffix in areas.items(): - stats = _evaluate_box_proposals(predictions, self._coco_api, area=area, limit=limit) - key = "AR{}@{:d}".format(suffix, limit) - res[key] = float(stats["ar"].item() * 100) - self._logger.info("Proposal metrics: \n" + create_small_table(res)) - self._results["box_proposals"] = res - - def _derive_coco_results(self, coco_eval, iou_type, class_names=None): - """ - Derive the desired score numbers from summarized COCOeval. - - Args: - coco_eval (None or COCOEval): None represents no predictions from model. - iou_type (str): - class_names (None or list[str]): if provided, will use it to predict - per-category AP. - - Returns: - a dict of {metric name: score} - """ - - metrics = { - "bbox": ["AP", "AP50", "AP75", "APs", "APm", "APl"], - "segm": ["AP", "AP50", "AP75", "APs", "APm", "APl"], - "keypoints": ["AP", "AP50", "AP75", "APm", "APl"], - }[iou_type] - - if coco_eval is None: - self._logger.warn("No predictions from the model!") - return {metric: float("nan") for metric in metrics} - - # the standard metrics - results = { - metric: float(coco_eval.stats[idx] * 100 if coco_eval.stats[idx] >= 0 else "nan") - for idx, metric in enumerate(metrics) - } - self._logger.info( - "Evaluation results for {}: \n".format(iou_type) + create_small_table(results) - ) - if not np.isfinite(sum(results.values())): - self._logger.info("Some metrics cannot be computed and is shown as NaN.") - - if class_names is None or len(class_names) <= 1: - return results - # Compute per-category AP - # from https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 # noqa - precisions = coco_eval.eval["precision"] - # precision has dims (iou, recall, cls, area range, max dets) - assert len(class_names) == precisions.shape[2] - - results_per_category = [] - for idx, name in enumerate(class_names): - # area range index 0: all area ranges - # max dets index -1: typically 100 per image - precision = precisions[:, :, idx, 0, -1] - precision = precision[precision > -1] - ap = np.mean(precision) if precision.size else float("nan") - results_per_category.append(("{}".format(name), float(ap * 100))) - - # tabulate it - N_COLS = min(6, len(results_per_category) * 2) - results_flatten = list(itertools.chain(*results_per_category)) - results_2d = itertools.zip_longest(*[results_flatten[i::N_COLS] for i in range(N_COLS)]) - table = tabulate( - results_2d, - tablefmt="pipe", - floatfmt=".3f", - headers=["category", "AP"] * 
(N_COLS // 2), - numalign="left", - ) - self._logger.info("Per-category {} AP: \n".format(iou_type) + table) - - results.update({"AP-" + name: ap for name, ap in results_per_category}) - return results - - -def instances_to_coco_json(instances, img_id): - """ - Dump an "Instances" object to a COCO-format json that's used for evaluation. - - Args: - instances (Instances): - img_id (int): the image id - - Returns: - list[dict]: list of json annotations in COCO format. - """ - num_instance = len(instances) - if num_instance == 0: - return [] - - boxes = instances.pred_boxes.tensor.numpy() - boxes = BoxMode.convert(boxes, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - boxes = boxes.tolist() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - - has_mask = instances.has("pred_masks") - if has_mask: - # use RLE to encode the masks, because they are too large and takes memory - # since this evaluator stores outputs of the entire dataset - rles = [ - mask_util.encode(np.array(mask[:, :, None], order="F", dtype="uint8"))[0] - for mask in instances.pred_masks - ] - for rle in rles: - # "counts" is an array encoded by mask_util as a byte-stream. Python3's - # json writer which always produces strings cannot serialize a bytestream - # unless you decode it. Thankfully, utf-8 works out (which is also what - # the pycocotools/_mask.pyx does). - rle["counts"] = rle["counts"].decode("utf-8") - - has_keypoints = instances.has("pred_keypoints") - if has_keypoints: - keypoints = instances.pred_keypoints - - results = [] - for k in range(num_instance): - result = { - "image_id": img_id, - "category_id": classes[k], - "bbox": boxes[k], - "score": scores[k], - } - if has_mask: - result["segmentation"] = rles[k] - if has_keypoints: - # In COCO annotations, - # keypoints coordinates are pixel indices. - # However our predictions are floating point coordinates. - # Therefore we subtract 0.5 to be consistent with the annotation format. - # This is the inverse of data loading logic in `datasets/coco.py`. - keypoints[k][:, :2] -= 0.5 - result["keypoints"] = keypoints[k].flatten().tolist() - results.append(result) - return results - - -# inspired from Detectron: -# https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 # noqa -def _evaluate_box_proposals(dataset_predictions, coco_api, thresholds=None, area="all", limit=None): - """ - Evaluate detection proposal recall metrics. This function is a much - faster alternative to the official COCO API recall evaluation code. However, - it produces slightly different results. 
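- Recall is computed by greedily matching each ground-truth box to the
- proposal that covers it best (highest IoU); AR is the mean recall over
- IoU thresholds 0.5:0.05:0.95.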
- """ - # Record max overlap value for each gt box - # Return vector of overlap values - areas = { - "all": 0, - "small": 1, - "medium": 2, - "large": 3, - "96-128": 4, - "128-256": 5, - "256-512": 6, - "512-inf": 7, - } - area_ranges = [ - [0 ** 2, 1e5 ** 2], # all - [0 ** 2, 32 ** 2], # small - [32 ** 2, 96 ** 2], # medium - [96 ** 2, 1e5 ** 2], # large - [96 ** 2, 128 ** 2], # 96-128 - [128 ** 2, 256 ** 2], # 128-256 - [256 ** 2, 512 ** 2], # 256-512 - [512 ** 2, 1e5 ** 2], - ] # 512-inf - assert area in areas, "Unknown area range: {}".format(area) - area_range = area_ranges[areas[area]] - gt_overlaps = [] - num_pos = 0 - - for prediction_dict in dataset_predictions: - predictions = prediction_dict["proposals"] - - # sort predictions in descending order - # TODO maybe remove this and make it explicit in the documentation - inds = predictions.objectness_logits.sort(descending=True)[1] - predictions = predictions[inds] - - ann_ids = coco_api.getAnnIds(imgIds=prediction_dict["image_id"]) - anno = coco_api.loadAnns(ann_ids) - gt_boxes = [ - BoxMode.convert(obj["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) - for obj in anno - if obj["iscrowd"] == 0 - ] - gt_boxes = torch.as_tensor(gt_boxes).reshape(-1, 4) # guard against no boxes - gt_boxes = Boxes(gt_boxes) - gt_areas = torch.as_tensor([obj["area"] for obj in anno if obj["iscrowd"] == 0]) - - if len(gt_boxes) == 0 or len(predictions) == 0: - continue - - valid_gt_inds = (gt_areas >= area_range[0]) & (gt_areas <= area_range[1]) - gt_boxes = gt_boxes[valid_gt_inds] - - num_pos += len(gt_boxes) - - if len(gt_boxes) == 0: - continue - - if limit is not None and len(predictions) > limit: - predictions = predictions[:limit] - - overlaps = pairwise_iou(predictions.proposal_boxes, gt_boxes) - - _gt_overlaps = torch.zeros(len(gt_boxes)) - for j in range(min(len(predictions), len(gt_boxes))): - # find which proposal box maximally covers each gt box - # and get the iou amount of coverage for each gt box - max_overlaps, argmax_overlaps = overlaps.max(dim=0) - - # find which gt box is 'best' covered (i.e. 'best' = most iou) - gt_ovr, gt_ind = max_overlaps.max(dim=0) - assert gt_ovr >= 0 - # find the proposal box that covers the best covered gt box - box_ind = argmax_overlaps[gt_ind] - # record the iou coverage of this gt box - _gt_overlaps[j] = overlaps[box_ind, gt_ind] - assert _gt_overlaps[j] == gt_ovr - # mark the proposal box and the gt box as used - overlaps[box_ind, :] = -1 - overlaps[:, gt_ind] = -1 - - # append recorded iou coverage level - gt_overlaps.append(_gt_overlaps) - gt_overlaps = ( - torch.cat(gt_overlaps, dim=0) if len(gt_overlaps) else torch.zeros(0, dtype=torch.float32) - ) - gt_overlaps, _ = torch.sort(gt_overlaps) - - if thresholds is None: - step = 0.05 - thresholds = torch.arange(0.5, 0.95 + 1e-5, step, dtype=torch.float32) - recalls = torch.zeros_like(thresholds) - # compute recall for each iou threshold - for i, t in enumerate(thresholds): - recalls[i] = (gt_overlaps >= t).float().sum() / float(num_pos) - # ar = 2 * np.trapz(recalls, thresholds) - ar = recalls.mean() - return { - "ar": ar, - "recalls": recalls, - "thresholds": thresholds, - "gt_overlaps": gt_overlaps, - "num_pos": num_pos, - } - - -def _evaluate_predictions_on_coco( - coco_gt, - coco_results, - iou_type, - kpt_oks_sigmas=None, - use_fast_impl=True, - img_ids=None, - max_dets_per_image=None, -): - """ - Evaluate the coco results using COCOEval API. 
- """ - assert len(coco_results) > 0 - - if iou_type == "segm": - coco_results = copy.deepcopy(coco_results) - # When evaluating mask AP, if the results contain bbox, cocoapi will - # use the box area as the area of the instance, instead of the mask area. - # This leads to a different definition of small/medium/large. - # We remove the bbox field to let mask AP use mask area. - for c in coco_results: - c.pop("bbox", None) - - coco_dt = coco_gt.loadRes(coco_results) - coco_eval = (COCOeval_opt if use_fast_impl else COCOeval)(coco_gt, coco_dt, iou_type) - # For COCO, the default max_dets_per_image is [1, 10, 100]. - if max_dets_per_image is None: - max_dets_per_image = [1, 10, 100] # Default from COCOEval - else: - assert ( - len(max_dets_per_image) >= 3 - ), "COCOeval requires maxDets (and max_dets_per_image) to have length at least 3" - # In the case that user supplies a custom input for max_dets_per_image, - # apply COCOevalMaxDets to evaluate AP with the custom input. - if max_dets_per_image[2] != 100: - coco_eval = COCOevalMaxDets(coco_gt, coco_dt, iou_type) - if iou_type != "keypoints": - coco_eval.params.maxDets = max_dets_per_image - - if img_ids is not None: - coco_eval.params.imgIds = img_ids - - if iou_type == "keypoints": - # Use the COCO default keypoint OKS sigmas unless overrides are specified - if kpt_oks_sigmas: - assert hasattr(coco_eval.params, "kpt_oks_sigmas"), "pycocotools is too old!" - coco_eval.params.kpt_oks_sigmas = np.array(kpt_oks_sigmas) - # COCOAPI requires every detection and every gt to have keypoints, so - # we just take the first entry from both - num_keypoints_dt = len(coco_results[0]["keypoints"]) // 3 - num_keypoints_gt = len(next(iter(coco_gt.anns.values()))["keypoints"]) // 3 - num_keypoints_oks = len(coco_eval.params.kpt_oks_sigmas) - assert num_keypoints_oks == num_keypoints_dt == num_keypoints_gt, ( - f"[COCOEvaluator] Prediction contain {num_keypoints_dt} keypoints. " - f"Ground truth contains {num_keypoints_gt} keypoints. " - f"The length of cfg.TEST.KEYPOINT_OKS_SIGMAS is {num_keypoints_oks}. " - "They have to agree with each other. For meaning of OKS, please refer to " - "http://cocodataset.org/#keypoints-eval." 
- ) - - coco_eval.evaluate() - coco_eval.accumulate() - coco_eval.summarize() - - return coco_eval - - -class COCOevalMaxDets(COCOeval): - """ - Modified version of COCOeval for evaluating AP with a custom - maxDets (by default for COCO, maxDets is 100) - """ - - def summarize(self): - """ - Compute and display summary metrics for evaluation results given - a custom value for max_dets_per_image - """ - - def _summarize(ap=1, iouThr=None, areaRng="all", maxDets=100): - p = self.params - iStr = " {:<18} {} @[ IoU={:<9} | area={:>6s} | maxDets={:>3d} ] = {:0.3f}" - titleStr = "Average Precision" if ap == 1 else "Average Recall" - typeStr = "(AP)" if ap == 1 else "(AR)" - iouStr = ( - "{:0.2f}:{:0.2f}".format(p.iouThrs[0], p.iouThrs[-1]) - if iouThr is None - else "{:0.2f}".format(iouThr) - ) - - aind = [i for i, aRng in enumerate(p.areaRngLbl) if aRng == areaRng] - mind = [i for i, mDet in enumerate(p.maxDets) if mDet == maxDets] - if ap == 1: - # dimension of precision: [TxRxKxAxM] - s = self.eval["precision"] - # IoU - if iouThr is not None: - t = np.where(iouThr == p.iouThrs)[0] - s = s[t] - s = s[:, :, :, aind, mind] - else: - # dimension of recall: [TxKxAxM] - s = self.eval["recall"] - if iouThr is not None: - t = np.where(iouThr == p.iouThrs)[0] - s = s[t] - s = s[:, :, aind, mind] - if len(s[s > -1]) == 0: - mean_s = -1 - else: - mean_s = np.mean(s[s > -1]) - print(iStr.format(titleStr, typeStr, iouStr, areaRng, maxDets, mean_s)) - return mean_s - - def _summarizeDets(): - stats = np.zeros((12,)) - # Evaluate AP using the custom limit on maximum detections per image - stats[0] = _summarize(1, maxDets=self.params.maxDets[2]) - stats[1] = _summarize(1, iouThr=0.5, maxDets=self.params.maxDets[2]) - stats[2] = _summarize(1, iouThr=0.75, maxDets=self.params.maxDets[2]) - stats[3] = _summarize(1, areaRng="small", maxDets=self.params.maxDets[2]) - stats[4] = _summarize(1, areaRng="medium", maxDets=self.params.maxDets[2]) - stats[5] = _summarize(1, areaRng="large", maxDets=self.params.maxDets[2]) - stats[6] = _summarize(0, maxDets=self.params.maxDets[0]) - stats[7] = _summarize(0, maxDets=self.params.maxDets[1]) - stats[8] = _summarize(0, maxDets=self.params.maxDets[2]) - stats[9] = _summarize(0, areaRng="small", maxDets=self.params.maxDets[2]) - stats[10] = _summarize(0, areaRng="medium", maxDets=self.params.maxDets[2]) - stats[11] = _summarize(0, areaRng="large", maxDets=self.params.maxDets[2]) - return stats - - def _summarizeKps(): - stats = np.zeros((10,)) - stats[0] = _summarize(1, maxDets=20) - stats[1] = _summarize(1, maxDets=20, iouThr=0.5) - stats[2] = _summarize(1, maxDets=20, iouThr=0.75) - stats[3] = _summarize(1, maxDets=20, areaRng="medium") - stats[4] = _summarize(1, maxDets=20, areaRng="large") - stats[5] = _summarize(0, maxDets=20) - stats[6] = _summarize(0, maxDets=20, iouThr=0.5) - stats[7] = _summarize(0, maxDets=20, iouThr=0.75) - stats[8] = _summarize(0, maxDets=20, areaRng="medium") - stats[9] = _summarize(0, maxDets=20, areaRng="large") - return stats - - if not self.eval: - raise Exception("Please run accumulate() first") - iouType = self.params.iouType - if iouType == "segm" or iouType == "bbox": - summarize = _summarizeDets - elif iouType == "keypoints": - summarize = _summarizeKps - self.stats = summarize() - - def __str__(self): - self.summarize() diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py deleted file mode 100755 index baf99600..00000000 
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/evaluator.py +++ /dev/null @@ -1,224 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import datetime -import logging -import time -from collections import OrderedDict, abc -from contextlib import ExitStack, contextmanager -from typing import List, Union -import torch -from torch import nn - -from detectron2.utils.comm import get_world_size, is_main_process -from detectron2.utils.logger import log_every_n_seconds - - -class DatasetEvaluator: - """ - Base class for a dataset evaluator. - - The function :func:`inference_on_dataset` runs the model over - all samples in the dataset, and have a DatasetEvaluator to process the inputs/outputs. - - This class will accumulate information of the inputs/outputs (by :meth:`process`), - and produce evaluation results in the end (by :meth:`evaluate`). - """ - - def reset(self): - """ - Preparation for a new round of evaluation. - Should be called before starting a round of evaluation. - """ - pass - - def process(self, inputs, outputs): - """ - Process the pair of inputs and outputs. - If they contain batches, the pairs can be consumed one-by-one using `zip`: - - .. code-block:: python - - for input_, output in zip(inputs, outputs): - # do evaluation on single input/output pair - ... - - Args: - inputs (list): the inputs that's used to call the model. - outputs (list): the return value of `model(inputs)` - """ - pass - - def evaluate(self): - """ - Evaluate/summarize the performance, after processing all input/output pairs. - - Returns: - dict: - A new evaluator class can return a dict of arbitrary format - as long as the user can process the results. - In our train_net.py, we expect the following format: - - * key: the name of the task (e.g., bbox) - * value: a dict of {metric name: score}, e.g.: {"AP50": 80} - """ - pass - - -class DatasetEvaluators(DatasetEvaluator): - """ - Wrapper class to combine multiple :class:`DatasetEvaluator` instances. - - This class dispatches every evaluation call to - all of its :class:`DatasetEvaluator`. - """ - - def __init__(self, evaluators): - """ - Args: - evaluators (list): the evaluators to combine. - """ - super().__init__() - self._evaluators = evaluators - - def reset(self): - for evaluator in self._evaluators: - evaluator.reset() - - def process(self, inputs, outputs): - for evaluator in self._evaluators: - evaluator.process(inputs, outputs) - - def evaluate(self): - results = OrderedDict() - for evaluator in self._evaluators: - result = evaluator.evaluate() - if is_main_process() and result is not None: - for k, v in result.items(): - assert ( - k not in results - ), "Different evaluators produce results with the same key {}".format(k) - results[k] = v - return results - - -def inference_on_dataset( - model, data_loader, evaluator: Union[DatasetEvaluator, List[DatasetEvaluator], None] -): - """ - Run model on the data_loader and evaluate the metrics with evaluator. - Also benchmark the inference speed of `model.__call__` accurately. - The model will be used in eval mode. - - Args: - model (callable): a callable which takes an object from - `data_loader` and returns some outputs. - - If it's an nn.Module, it will be temporarily set to `eval` mode. - If you wish to evaluate a model in `training` mode instead, you can - wrap the given model and override its behavior of `.eval()` and `.train()`. - data_loader: an iterable object with a length. - The elements it generates will be the inputs to the model. 
- evaluator: the evaluator(s) to run. Use `None` if you only want to benchmark, - but don't want to do any evaluation. - - Returns: - The return value of `evaluator.evaluate()` - """ - num_devices = get_world_size() - logger = logging.getLogger(__name__) - logger.info("Start inference on {} batches".format(len(data_loader))) - - total = len(data_loader) # inference data loader must have a fixed length - if evaluator is None: - # create a no-op evaluator - evaluator = DatasetEvaluators([]) - if isinstance(evaluator, abc.MutableSequence): - evaluator = DatasetEvaluators(evaluator) - evaluator.reset() - - num_warmup = min(5, total - 1) - start_time = time.perf_counter() - total_data_time = 0 - total_compute_time = 0 - total_eval_time = 0 - with ExitStack() as stack: - if isinstance(model, nn.Module): - stack.enter_context(inference_context(model)) - stack.enter_context(torch.no_grad()) - - start_data_time = time.perf_counter() - for idx, inputs in enumerate(data_loader): - total_data_time += time.perf_counter() - start_data_time - if idx == num_warmup: - start_time = time.perf_counter() - total_data_time = 0 - total_compute_time = 0 - total_eval_time = 0 - - start_compute_time = time.perf_counter() - outputs = model(inputs) - if torch.cuda.is_available(): - torch.cuda.synchronize() - total_compute_time += time.perf_counter() - start_compute_time - - start_eval_time = time.perf_counter() - evaluator.process(inputs, outputs) - total_eval_time += time.perf_counter() - start_eval_time - - iters_after_start = idx + 1 - num_warmup * int(idx >= num_warmup) - data_seconds_per_iter = total_data_time / iters_after_start - compute_seconds_per_iter = total_compute_time / iters_after_start - eval_seconds_per_iter = total_eval_time / iters_after_start - total_seconds_per_iter = (time.perf_counter() - start_time) / iters_after_start - if idx >= num_warmup * 2 or compute_seconds_per_iter > 5: - eta = datetime.timedelta(seconds=int(total_seconds_per_iter * (total - idx - 1))) - log_every_n_seconds( - logging.INFO, - ( - f"Inference done {idx + 1}/{total}. " - f"Dataloading: {data_seconds_per_iter:.4f} s/iter. " - f"Inference: {compute_seconds_per_iter:.4f} s/iter. " - f"Eval: {eval_seconds_per_iter:.4f} s/iter. " - f"Total: {total_seconds_per_iter:.4f} s/iter. " - f"ETA={eta}" - ), - n=5, - ) - start_data_time = time.perf_counter() - - # Measure the time only for this worker (before the synchronization barrier) - total_time = time.perf_counter() - start_time - total_time_str = str(datetime.timedelta(seconds=total_time)) - # NOTE this format is parsed by grep - logger.info( - "Total inference time: {} ({:.6f} s / iter per device, on {} devices)".format( - total_time_str, total_time / (total - num_warmup), num_devices - ) - ) - total_compute_time_str = str(datetime.timedelta(seconds=int(total_compute_time))) - logger.info( - "Total inference pure compute time: {} ({:.6f} s / iter per device, on {} devices)".format( - total_compute_time_str, total_compute_time / (total - num_warmup), num_devices - ) - ) - - results = evaluator.evaluate() - # An evaluator may return None when not in main process. - # Replace it by an empty dict instead to make it easier for downstream code to handle - if results is None: - results = {} - return results - - -@contextmanager -def inference_context(model): - """ - A context where the model is temporarily changed to eval mode, - and restored to previous mode afterwards. 
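- Restoring the previous mode (instead of unconditionally calling `.train()`)
- leaves models that were already in eval mode unchanged on exit.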
- - Args: - model: a torch Module - """ - training_mode = model.training - model.eval() - yield - model.train(training_mode) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py deleted file mode 100755 index 2eb202bd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/fast_eval_api.py +++ /dev/null @@ -1,121 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import logging -import numpy as np -import time -from pycocotools.cocoeval import COCOeval - -from detectron2 import _C - -logger = logging.getLogger(__name__) - - -class COCOeval_opt(COCOeval): - """ - This is a slightly modified version of the original COCO API, where the functions evaluateImg() - and accumulate() are implemented in C++ to speedup evaluation - """ - - def evaluate(self): - """ - Run per image evaluation on given images and store results in self.evalImgs_cpp, a - datastructure that isn't readable from Python but is used by a c++ implementation of - accumulate(). Unlike the original COCO PythonAPI, we don't populate the datastructure - self.evalImgs because this datastructure is a computational bottleneck. - :return: None - """ - tic = time.time() - - p = self.params - # add backward compatibility if useSegm is specified in params - if p.useSegm is not None: - p.iouType = "segm" if p.useSegm == 1 else "bbox" - logger.info("Evaluate annotation type *{}*".format(p.iouType)) - p.imgIds = list(np.unique(p.imgIds)) - if p.useCats: - p.catIds = list(np.unique(p.catIds)) - p.maxDets = sorted(p.maxDets) - self.params = p - - self._prepare() # bottleneck - - # loop through images, area range, max detection number - catIds = p.catIds if p.useCats else [-1] - - if p.iouType == "segm" or p.iouType == "bbox": - computeIoU = self.computeIoU - elif p.iouType == "keypoints": - computeIoU = self.computeOks - self.ious = { - (imgId, catId): computeIoU(imgId, catId) for imgId in p.imgIds for catId in catIds - } # bottleneck - - maxDet = p.maxDets[-1] - - # <<<< Beginning of code differences with original COCO API - def convert_instances_to_cpp(instances, is_det=False): - # Convert annotations for a list of instances in an image to a format that's fast - # to access in C++ - instances_cpp = [] - for instance in instances: - instance_cpp = _C.InstanceAnnotation( - int(instance["id"]), - instance["score"] if is_det else instance.get("score", 0.0), - instance["area"], - bool(instance.get("iscrowd", 0)), - bool(instance.get("ignore", 0)), - ) - instances_cpp.append(instance_cpp) - return instances_cpp - - # Convert GT annotations, detections, and IOUs to a format that's fast to access in C++ - ground_truth_instances = [ - [convert_instances_to_cpp(self._gts[imgId, catId]) for catId in p.catIds] - for imgId in p.imgIds - ] - detected_instances = [ - [convert_instances_to_cpp(self._dts[imgId, catId], is_det=True) for catId in p.catIds] - for imgId in p.imgIds - ] - ious = [[self.ious[imgId, catId] for catId in catIds] for imgId in p.imgIds] - - if not p.useCats: - # For each image, flatten per-category lists into a single list - ground_truth_instances = [[[o for c in i for o in c]] for i in ground_truth_instances] - detected_instances = [[[o for c in i for o in c]] for i in detected_instances] - - # Call C++ implementation of self.evaluateImgs() - self._evalImgs_cpp = _C.COCOevalEvaluateImages( - p.areaRng, maxDet, p.iouThrs, ious, ground_truth_instances, 
detected_instances - ) - self._evalImgs = None - - self._paramsEval = copy.deepcopy(self.params) - toc = time.time() - logger.info("COCOeval_opt.evaluate() finished in {:0.2f} seconds.".format(toc - tic)) - # >>>> End of code differences with original COCO API - - def accumulate(self): - """ - Accumulate per image evaluation results and store the result in self.eval. Does not - support changing parameter settings from those used by self.evaluate() - """ - logger.info("Accumulating evaluation results...") - tic = time.time() - assert hasattr( - self, "_evalImgs_cpp" - ), "evaluate() must be called before accmulate() is called." - - self.eval = _C.COCOevalAccumulate(self._paramsEval, self._evalImgs_cpp) - - # recall is num_iou_thresholds X num_categories X num_area_ranges X num_max_detections - self.eval["recall"] = np.array(self.eval["recall"]).reshape( - self.eval["counts"][:1] + self.eval["counts"][2:] - ) - - # precision and scores are num_iou_thresholds X num_recall_thresholds X num_categories X - # num_area_ranges X num_max_detections - self.eval["precision"] = np.array(self.eval["precision"]).reshape(self.eval["counts"]) - self.eval["scores"] = np.array(self.eval["scores"]).reshape(self.eval["counts"]) - toc = time.time() - logger.info("COCOeval_opt.accumulate() finished in {:0.2f} seconds.".format(toc - tic)) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py deleted file mode 100755 index 0604feaa..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/lvis_evaluation.py +++ /dev/null @@ -1,380 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import json -import logging -import os -import pickle -from collections import OrderedDict -import torch - -import detectron2.utils.comm as comm -from detectron2.config import CfgNode -from detectron2.data import MetadataCatalog -from detectron2.structures import Boxes, BoxMode, pairwise_iou -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import create_small_table - -from .coco_evaluation import instances_to_coco_json -from .evaluator import DatasetEvaluator - - -class LVISEvaluator(DatasetEvaluator): - """ - Evaluate object proposal and instance detection/segmentation outputs using - LVIS's metrics and evaluation API. - """ - - def __init__( - self, - dataset_name, - tasks=None, - distributed=True, - output_dir=None, - *, - max_dets_per_image=None, - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - It must have the following corresponding metadata: - "json_file": the path to the LVIS format annotation - tasks (tuple[str]): tasks that can be evaluated under the given - configuration. A task is one of "bbox", "segm". - By default, will infer this automatically from predictions. - distributed (True): if True, will collect results from all ranks for evaluation. - Otherwise, will evaluate the results in the current process. - output_dir (str): optional, an output directory to dump results. - max_dets_per_image (None or int): limit on maximum detections per image in evaluating AP - This limit, by default of the LVIS dataset, is 300. - """ - from lvis import LVIS - - self._logger = logging.getLogger(__name__) - - if tasks is not None and isinstance(tasks, CfgNode): - self._logger.warn( - "COCO Evaluator instantiated using config, this is deprecated behavior." 
- " Please pass in explicit arguments instead." - ) - self._tasks = None # Infering it from predictions should be better - else: - self._tasks = tasks - - self._distributed = distributed - self._output_dir = output_dir - self._max_dets_per_image = max_dets_per_image - - self._cpu_device = torch.device("cpu") - - self._metadata = MetadataCatalog.get(dataset_name) - json_file = PathManager.get_local_path(self._metadata.json_file) - self._lvis_api = LVIS(json_file) - # Test set json files do not contain annotations (evaluation must be - # performed using the LVIS evaluation server). - self._do_evaluation = len(self._lvis_api.get_ann_ids()) > 0 - - def reset(self): - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a LVIS model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a LVIS model. It is a list of dicts with key - "instances" that contains :class:`Instances`. - """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - prediction["instances"] = instances_to_coco_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - self._predictions.append(prediction) - - def evaluate(self): - if self._distributed: - comm.synchronize() - predictions = comm.gather(self._predictions, dst=0) - predictions = list(itertools.chain(*predictions)) - - if not comm.is_main_process(): - return - else: - predictions = self._predictions - - if len(predictions) == 0: - self._logger.warning("[LVISEvaluator] Did not receive valid predictions.") - return {} - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "instances_predictions.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(predictions, f) - - self._results = OrderedDict() - if "proposals" in predictions[0]: - self._eval_box_proposals(predictions) - if "instances" in predictions[0]: - self._eval_predictions(predictions) - # Copy so the caller can do whatever with results - return copy.deepcopy(self._results) - - def _tasks_from_predictions(self, predictions): - for pred in predictions: - if "segmentation" in pred: - return ("bbox", "segm") - return ("bbox",) - - def _eval_predictions(self, predictions): - """ - Evaluate predictions. Fill self._results with the metrics of the tasks. - - Args: - predictions (list[dict]): list of outputs from the model - """ - self._logger.info("Preparing results in the LVIS format ...") - lvis_results = list(itertools.chain(*[x["instances"] for x in predictions])) - tasks = self._tasks or self._tasks_from_predictions(lvis_results) - - # LVIS evaluator can be used to evaluate results for COCO dataset categories. - # In this case `_metadata` variable will have a field with COCO-specific category mapping. 
- if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - reverse_id_mapping = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - for result in lvis_results: - result["category_id"] = reverse_id_mapping[result["category_id"]] - else: - # unmap the category ids for LVIS (from 0-indexed to 1-indexed) - for result in lvis_results: - result["category_id"] += 1 - - if self._output_dir: - file_path = os.path.join(self._output_dir, "lvis_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(lvis_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating predictions ...") - for task in sorted(tasks): - res = _evaluate_predictions_on_lvis( - self._lvis_api, - lvis_results, - task, - max_dets_per_image=self._max_dets_per_image, - class_names=self._metadata.get("thing_classes"), - ) - self._results[task] = res - - def _eval_box_proposals(self, predictions): - """ - Evaluate the box proposals in predictions. - Fill self._results with the metrics for "box_proposals" task. - """ - if self._output_dir: - # Saving generated box proposals to file. - # Predicted box_proposals are in XYXY_ABS mode. - bbox_mode = BoxMode.XYXY_ABS.value - ids, boxes, objectness_logits = [], [], [] - for prediction in predictions: - ids.append(prediction["image_id"]) - boxes.append(prediction["proposals"].proposal_boxes.tensor.numpy()) - objectness_logits.append(prediction["proposals"].objectness_logits.numpy()) - - proposal_data = { - "boxes": boxes, - "objectness_logits": objectness_logits, - "ids": ids, - "bbox_mode": bbox_mode, - } - with PathManager.open(os.path.join(self._output_dir, "box_proposals.pkl"), "wb") as f: - pickle.dump(proposal_data, f) - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating bbox proposals ...") - res = {} - areas = {"all": "", "small": "s", "medium": "m", "large": "l"} - for limit in [100, 1000]: - for area, suffix in areas.items(): - stats = _evaluate_box_proposals(predictions, self._lvis_api, area=area, limit=limit) - key = "AR{}@{:d}".format(suffix, limit) - res[key] = float(stats["ar"].item() * 100) - self._logger.info("Proposal metrics: \n" + create_small_table(res)) - self._results["box_proposals"] = res - - -# inspired from Detectron: -# https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 # noqa -def _evaluate_box_proposals(dataset_predictions, lvis_api, thresholds=None, area="all", limit=None): - """ - Evaluate detection proposal recall metrics. This function is a much - faster alternative to the official LVIS API recall evaluation code. However, - it produces slightly different results. 
- """ - # Record max overlap value for each gt box - # Return vector of overlap values - areas = { - "all": 0, - "small": 1, - "medium": 2, - "large": 3, - "96-128": 4, - "128-256": 5, - "256-512": 6, - "512-inf": 7, - } - area_ranges = [ - [0 ** 2, 1e5 ** 2], # all - [0 ** 2, 32 ** 2], # small - [32 ** 2, 96 ** 2], # medium - [96 ** 2, 1e5 ** 2], # large - [96 ** 2, 128 ** 2], # 96-128 - [128 ** 2, 256 ** 2], # 128-256 - [256 ** 2, 512 ** 2], # 256-512 - [512 ** 2, 1e5 ** 2], - ] # 512-inf - assert area in areas, "Unknown area range: {}".format(area) - area_range = area_ranges[areas[area]] - gt_overlaps = [] - num_pos = 0 - - for prediction_dict in dataset_predictions: - predictions = prediction_dict["proposals"] - - # sort predictions in descending order - # TODO maybe remove this and make it explicit in the documentation - inds = predictions.objectness_logits.sort(descending=True)[1] - predictions = predictions[inds] - - ann_ids = lvis_api.get_ann_ids(img_ids=[prediction_dict["image_id"]]) - anno = lvis_api.load_anns(ann_ids) - gt_boxes = [ - BoxMode.convert(obj["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) for obj in anno - ] - gt_boxes = torch.as_tensor(gt_boxes).reshape(-1, 4) # guard against no boxes - gt_boxes = Boxes(gt_boxes) - gt_areas = torch.as_tensor([obj["area"] for obj in anno]) - - if len(gt_boxes) == 0 or len(predictions) == 0: - continue - - valid_gt_inds = (gt_areas >= area_range[0]) & (gt_areas <= area_range[1]) - gt_boxes = gt_boxes[valid_gt_inds] - - num_pos += len(gt_boxes) - - if len(gt_boxes) == 0: - continue - - if limit is not None and len(predictions) > limit: - predictions = predictions[:limit] - - overlaps = pairwise_iou(predictions.proposal_boxes, gt_boxes) - - _gt_overlaps = torch.zeros(len(gt_boxes)) - for j in range(min(len(predictions), len(gt_boxes))): - # find which proposal box maximally covers each gt box - # and get the iou amount of coverage for each gt box - max_overlaps, argmax_overlaps = overlaps.max(dim=0) - - # find which gt box is 'best' covered (i.e. 'best' = most iou) - gt_ovr, gt_ind = max_overlaps.max(dim=0) - assert gt_ovr >= 0 - # find the proposal box that covers the best covered gt box - box_ind = argmax_overlaps[gt_ind] - # record the iou coverage of this gt box - _gt_overlaps[j] = overlaps[box_ind, gt_ind] - assert _gt_overlaps[j] == gt_ovr - # mark the proposal box and the gt box as used - overlaps[box_ind, :] = -1 - overlaps[:, gt_ind] = -1 - - # append recorded iou coverage level - gt_overlaps.append(_gt_overlaps) - gt_overlaps = ( - torch.cat(gt_overlaps, dim=0) if len(gt_overlaps) else torch.zeros(0, dtype=torch.float32) - ) - gt_overlaps, _ = torch.sort(gt_overlaps) - - if thresholds is None: - step = 0.05 - thresholds = torch.arange(0.5, 0.95 + 1e-5, step, dtype=torch.float32) - recalls = torch.zeros_like(thresholds) - # compute recall for each iou threshold - for i, t in enumerate(thresholds): - recalls[i] = (gt_overlaps >= t).float().sum() / float(num_pos) - # ar = 2 * np.trapz(recalls, thresholds) - ar = recalls.mean() - return { - "ar": ar, - "recalls": recalls, - "thresholds": thresholds, - "gt_overlaps": gt_overlaps, - "num_pos": num_pos, - } - - -def _evaluate_predictions_on_lvis( - lvis_gt, lvis_results, iou_type, max_dets_per_image=None, class_names=None -): - """ - Args: - iou_type (str): - max_dets_per_image (None or int): limit on maximum detections per image in evaluating AP - This limit, by default of the LVIS dataset, is 300. 
- class_names (None or list[str]): if provided, will use it to predict - per-category AP. - - Returns: - a dict of {metric name: score} - """ - metrics = { - "bbox": ["AP", "AP50", "AP75", "APs", "APm", "APl", "APr", "APc", "APf"], - "segm": ["AP", "AP50", "AP75", "APs", "APm", "APl", "APr", "APc", "APf"], - }[iou_type] - - logger = logging.getLogger(__name__) - - if len(lvis_results) == 0: # TODO: check if needed - logger.warn("No predictions from the model!") - return {metric: float("nan") for metric in metrics} - - if iou_type == "segm": - lvis_results = copy.deepcopy(lvis_results) - # When evaluating mask AP, if the results contain bbox, LVIS API will - # use the box area as the area of the instance, instead of the mask area. - # This leads to a different definition of small/medium/large. - # We remove the bbox field to let mask AP use mask area. - for c in lvis_results: - c.pop("bbox", None) - - if max_dets_per_image is None: - max_dets_per_image = 300 # Default for LVIS dataset - - from lvis import LVISEval, LVISResults - - logger.info(f"Evaluating with max detections per image = {max_dets_per_image}") - lvis_results = LVISResults(lvis_gt, lvis_results, max_dets=max_dets_per_image) - lvis_eval = LVISEval(lvis_gt, lvis_results, iou_type) - lvis_eval.run() - lvis_eval.print_results() - - # Pull the standard metrics from the LVIS results - results = lvis_eval.get_results() - results = {metric: float(results[metric] * 100) for metric in metrics} - logger.info("Evaluation results for {}: \n".format(iou_type) + create_small_table(results)) - return results diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py deleted file mode 100755 index 9fb3462b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/panoptic_evaluation.py +++ /dev/null @@ -1,199 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import io -import itertools -import json -import logging -import numpy as np -import os -import tempfile -from collections import OrderedDict -from typing import Optional -from PIL import Image -from tabulate import tabulate - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - -logger = logging.getLogger(__name__) - - -class COCOPanopticEvaluator(DatasetEvaluator): - """ - Evaluate Panoptic Quality metrics on COCO using PanopticAPI. - It saves panoptic segmentation prediction in `output_dir` - - It contains a synchronize call and has to be called from all workers. - """ - - def __init__(self, dataset_name: str, output_dir: Optional[str] = None): - """ - Args: - dataset_name: name of the dataset - output_dir: output directory to save results for evaluation. - """ - self._metadata = MetadataCatalog.get(dataset_name) - self._thing_contiguous_id_to_dataset_id = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - self._stuff_contiguous_id_to_dataset_id = { - v: k for k, v in self._metadata.stuff_dataset_id_to_contiguous_id.items() - } - - self._output_dir = output_dir - if self._output_dir is not None: - PathManager.mkdirs(self._output_dir) - - def reset(self): - self._predictions = [] - - def _convert_category_id(self, segment_info): - isthing = segment_info.pop("isthing", None) - if isthing is None: - # the model produces panoptic category id directly. 
No more conversion needed - return segment_info - if isthing is True: - segment_info["category_id"] = self._thing_contiguous_id_to_dataset_id[ - segment_info["category_id"] - ] - else: - segment_info["category_id"] = self._stuff_contiguous_id_to_dataset_id[ - segment_info["category_id"] - ] - return segment_info - - def process(self, inputs, outputs): - from panopticapi.utils import id2rgb - - for input, output in zip(inputs, outputs): - panoptic_img, segments_info = output["panoptic_seg"] - panoptic_img = panoptic_img.cpu().numpy() - if segments_info is None: - # If "segments_info" is None, we assume "panoptic_img" is a - # H*W int32 image storing the panoptic_id in the format of - # category_id * label_divisor + instance_id. We reserve -1 for - # VOID label, and add 1 to panoptic_img since the official - # evaluation script uses 0 for VOID label. - label_divisor = self._metadata.label_divisor - segments_info = [] - for panoptic_label in np.unique(panoptic_img): - if panoptic_label == -1: - # VOID region. - continue - pred_class = panoptic_label // label_divisor - isthing = ( - pred_class in self._metadata.thing_dataset_id_to_contiguous_id.values() - ) - segments_info.append( - { - "id": int(panoptic_label) + 1, - "category_id": int(pred_class), - "isthing": bool(isthing), - } - ) - # Official evaluation script uses 0 for VOID label. - panoptic_img += 1 - - file_name = os.path.basename(input["file_name"]) - file_name_png = os.path.splitext(file_name)[0] + ".png" - with io.BytesIO() as out: - Image.fromarray(id2rgb(panoptic_img)).save(out, format="PNG") - segments_info = [self._convert_category_id(x) for x in segments_info] - self._predictions.append( - { - "image_id": input["image_id"], - "file_name": file_name_png, - "png_string": out.getvalue(), - "segments_info": segments_info, - } - ) - - def evaluate(self): - comm.synchronize() - - self._predictions = comm.gather(self._predictions) - self._predictions = list(itertools.chain(*self._predictions)) - if not comm.is_main_process(): - return - - # PanopticApi requires local files - gt_json = PathManager.get_local_path(self._metadata.panoptic_json) - gt_folder = PathManager.get_local_path(self._metadata.panoptic_root) - - with tempfile.TemporaryDirectory(prefix="panoptic_eval") as pred_dir: - logger.info("Writing all panoptic predictions to {} ...".format(pred_dir)) - for p in self._predictions: - with open(os.path.join(pred_dir, p["file_name"]), "wb") as f: - f.write(p.pop("png_string")) - - with open(gt_json, "r") as f: - json_data = json.load(f) - json_data["annotations"] = self._predictions - - output_dir = self._output_dir or pred_dir - predictions_json = os.path.join(output_dir, "predictions.json") - with PathManager.open(predictions_json, "w") as f: - f.write(json.dumps(json_data)) - - from panopticapi.evaluation import pq_compute - - with contextlib.redirect_stdout(io.StringIO()): - pq_res = pq_compute( - gt_json, - PathManager.get_local_path(predictions_json), - gt_folder=gt_folder, - pred_folder=pred_dir, - ) - - res = {} - res["PQ"] = 100 * pq_res["All"]["pq"] - res["SQ"] = 100 * pq_res["All"]["sq"] - res["RQ"] = 100 * pq_res["All"]["rq"] - res["PQ_th"] = 100 * pq_res["Things"]["pq"] - res["SQ_th"] = 100 * pq_res["Things"]["sq"] - res["RQ_th"] = 100 * pq_res["Things"]["rq"] - res["PQ_st"] = 100 * pq_res["Stuff"]["pq"] - res["SQ_st"] = 100 * pq_res["Stuff"]["sq"] - res["RQ_st"] = 100 * pq_res["Stuff"]["rq"] - - results = OrderedDict({"panoptic_seg": res}) - _print_panoptic_results(pq_res) - - return results - - -def 
_print_panoptic_results(pq_res): - headers = ["", "PQ", "SQ", "RQ", "#categories"] - data = [] - for name in ["All", "Things", "Stuff"]: - row = [name] + [pq_res[name][k] * 100 for k in ["pq", "sq", "rq"]] + [pq_res[name]["n"]] - data.append(row) - table = tabulate( - data, headers=headers, tablefmt="pipe", floatfmt=".3f", stralign="center", numalign="center" - ) - logger.info("Panoptic Evaluation Results:\n" + table) - - -if __name__ == "__main__": - from detectron2.utils.logger import setup_logger - - logger = setup_logger() - import argparse - - parser = argparse.ArgumentParser() - parser.add_argument("--gt-json") - parser.add_argument("--gt-dir") - parser.add_argument("--pred-json") - parser.add_argument("--pred-dir") - args = parser.parse_args() - - from panopticapi.evaluation import pq_compute - - with contextlib.redirect_stdout(io.StringIO()): - pq_res = pq_compute( - args.gt_json, args.pred_json, gt_folder=args.gt_dir, pred_folder=args.pred_dir - ) - _print_panoptic_results(pq_res) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py deleted file mode 100755 index 1d1abcde..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/pascal_voc_evaluation.py +++ /dev/null @@ -1,300 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -import os -import tempfile -import xml.etree.ElementTree as ET -from collections import OrderedDict, defaultdict -from functools import lru_cache -import torch - -from detectron2.data import MetadataCatalog -from detectron2.utils import comm -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class PascalVOCDetectionEvaluator(DatasetEvaluator): - """ - Evaluate Pascal VOC style AP for Pascal VOC dataset. - It contains a synchronization, therefore has to be called from all ranks. - - Note that the concept of AP can be implemented in different ways and may not - produce identical results. This class mimics the implementation of the official - Pascal VOC Matlab API, and should produce similar but not identical results to the - official API. - """ - - def __init__(self, dataset_name): - """ - Args: - dataset_name (str): name of the dataset, e.g., "voc_2007_test" - """ - self._dataset_name = dataset_name - meta = MetadataCatalog.get(dataset_name) - - # Too many tiny files, download all to local for speed. 
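- # The per-image XMLs are later parsed by parse_rec(), which is lru_cache'd
- # so each annotation file is read at most once per process.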
- annotation_dir_local = PathManager.get_local_path( - os.path.join(meta.dirname, "Annotations/") - ) - self._anno_file_template = os.path.join(annotation_dir_local, "{}.xml") - self._image_set_path = os.path.join(meta.dirname, "ImageSets", "Main", meta.split + ".txt") - self._class_names = meta.thing_classes - assert meta.year in [2007, 2012], meta.year - self._is_2007 = meta.year == 2007 - self._cpu_device = torch.device("cpu") - self._logger = logging.getLogger(__name__) - - def reset(self): - self._predictions = defaultdict(list) # class name -> list of prediction strings - - def process(self, inputs, outputs): - for input, output in zip(inputs, outputs): - image_id = input["image_id"] - instances = output["instances"].to(self._cpu_device) - boxes = instances.pred_boxes.tensor.numpy() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - for box, score, cls in zip(boxes, scores, classes): - xmin, ymin, xmax, ymax = box - # The inverse of data loading logic in `datasets/pascal_voc.py` - xmin += 1 - ymin += 1 - self._predictions[cls].append( - f"{image_id} {score:.3f} {xmin:.1f} {ymin:.1f} {xmax:.1f} {ymax:.1f}" - ) - - def evaluate(self): - """ - Returns: - dict: has a key "segm", whose value is a dict of "AP", "AP50", and "AP75". - """ - all_predictions = comm.gather(self._predictions, dst=0) - if not comm.is_main_process(): - return - predictions = defaultdict(list) - for predictions_per_rank in all_predictions: - for clsid, lines in predictions_per_rank.items(): - predictions[clsid].extend(lines) - del all_predictions - - self._logger.info( - "Evaluating {} using {} metric. " - "Note that results do not use the official Matlab API.".format( - self._dataset_name, 2007 if self._is_2007 else 2012 - ) - ) - - with tempfile.TemporaryDirectory(prefix="pascal_voc_eval_") as dirname: - res_file_template = os.path.join(dirname, "{}.txt") - - aps = defaultdict(list) # iou -> ap per class - for cls_id, cls_name in enumerate(self._class_names): - lines = predictions.get(cls_id, [""]) - - with open(res_file_template.format(cls_name), "w") as f: - f.write("\n".join(lines)) - - for thresh in range(50, 100, 5): - rec, prec, ap = voc_eval( - res_file_template, - self._anno_file_template, - self._image_set_path, - cls_name, - ovthresh=thresh / 100.0, - use_07_metric=self._is_2007, - ) - aps[thresh].append(ap * 100) - - ret = OrderedDict() - mAP = {iou: np.mean(x) for iou, x in aps.items()} - ret["bbox"] = {"AP": np.mean(list(mAP.values())), "AP50": mAP[50], "AP75": mAP[75]} - return ret - - -############################################################################## -# -# Below code is modified from -# https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py -# -------------------------------------------------------- -# Fast/er R-CNN -# Licensed under The MIT License [see LICENSE for details] -# Written by Bharath Hariharan -# -------------------------------------------------------- - -"""Python implementation of the PASCAL VOC devkit's AP evaluation code.""" - - -@lru_cache(maxsize=None) -def parse_rec(filename): - """Parse a PASCAL VOC xml file.""" - with PathManager.open(filename) as f: - tree = ET.parse(f) - objects = [] - for obj in tree.findall("object"): - obj_struct = {} - obj_struct["name"] = obj.find("name").text - obj_struct["pose"] = obj.find("pose").text - obj_struct["truncated"] = int(obj.find("truncated").text) - obj_struct["difficult"] = int(obj.find("difficult").text) - bbox = obj.find("bndbox") - obj_struct["bbox"] = [ - 
int(bbox.find("xmin").text), - int(bbox.find("ymin").text), - int(bbox.find("xmax").text), - int(bbox.find("ymax").text), - ] - objects.append(obj_struct) - - return objects - - -def voc_ap(rec, prec, use_07_metric=False): - """Compute VOC AP given precision and recall. If use_07_metric is true, uses - the VOC 07 11-point method (default:False). - """ - if use_07_metric: - # 11 point metric - ap = 0.0 - for t in np.arange(0.0, 1.1, 0.1): - if np.sum(rec >= t) == 0: - p = 0 - else: - p = np.max(prec[rec >= t]) - ap = ap + p / 11.0 - else: - # correct AP calculation - # first append sentinel values at the end - mrec = np.concatenate(([0.0], rec, [1.0])) - mpre = np.concatenate(([0.0], prec, [0.0])) - - # compute the precision envelope - for i in range(mpre.size - 1, 0, -1): - mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i]) - - # to calculate area under PR curve, look for points - # where X axis (recall) changes value - i = np.where(mrec[1:] != mrec[:-1])[0] - - # and sum (\Delta recall) * prec - ap = np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1]) - return ap - - -def voc_eval(detpath, annopath, imagesetfile, classname, ovthresh=0.5, use_07_metric=False): - """rec, prec, ap = voc_eval(detpath, - annopath, - imagesetfile, - classname, - [ovthresh], - [use_07_metric]) - - Top level function that does the PASCAL VOC evaluation. - - detpath: Path to detections - detpath.format(classname) should produce the detection results file. - annopath: Path to annotations - annopath.format(imagename) should be the xml annotations file. - imagesetfile: Text file containing the list of images, one image per line. - classname: Category name (duh) - [ovthresh]: Overlap threshold (default = 0.5) - [use_07_metric]: Whether to use VOC07's 11 point AP computation - (default False) - """ - # assumes detections are in detpath.format(classname) - # assumes annotations are in annopath.format(imagename) - # assumes imagesetfile is a text file with each line an image name - - # first load gt - # read list of images - with PathManager.open(imagesetfile, "r") as f: - lines = f.readlines() - imagenames = [x.strip() for x in lines] - - # load annots - recs = {} - for imagename in imagenames: - recs[imagename] = parse_rec(annopath.format(imagename)) - - # extract gt objects for this class - class_recs = {} - npos = 0 - for imagename in imagenames: - R = [obj for obj in recs[imagename] if obj["name"] == classname] - bbox = np.array([x["bbox"] for x in R]) - difficult = np.array([x["difficult"] for x in R]).astype(np.bool) - # difficult = np.array([False for x in R]).astype(np.bool) # treat all "difficult" as GT - det = [False] * len(R) - npos = npos + sum(~difficult) - class_recs[imagename] = {"bbox": bbox, "difficult": difficult, "det": det} - - # read dets - detfile = detpath.format(classname) - with open(detfile, "r") as f: - lines = f.readlines() - - splitlines = [x.strip().split(" ") for x in lines] - image_ids = [x[0] for x in splitlines] - confidence = np.array([float(x[1]) for x in splitlines]) - BB = np.array([[float(z) for z in x[2:]] for x in splitlines]).reshape(-1, 4) - - # sort by confidence - sorted_ind = np.argsort(-confidence) - BB = BB[sorted_ind, :] - image_ids = [image_ids[x] for x in sorted_ind] - - # go down dets and mark TPs and FPs - nd = len(image_ids) - tp = np.zeros(nd) - fp = np.zeros(nd) - for d in range(nd): - R = class_recs[image_ids[d]] - bb = BB[d, :].astype(float) - ovmax = -np.inf - BBGT = R["bbox"].astype(float) - - if BBGT.size > 0: - # compute overlaps - # intersection - ixmin = 
np.maximum(BBGT[:, 0], bb[0]) - iymin = np.maximum(BBGT[:, 1], bb[1]) - ixmax = np.minimum(BBGT[:, 2], bb[2]) - iymax = np.minimum(BBGT[:, 3], bb[3]) - iw = np.maximum(ixmax - ixmin + 1.0, 0.0) - ih = np.maximum(iymax - iymin + 1.0, 0.0) - inters = iw * ih - - # union - uni = ( - (bb[2] - bb[0] + 1.0) * (bb[3] - bb[1] + 1.0) - + (BBGT[:, 2] - BBGT[:, 0] + 1.0) * (BBGT[:, 3] - BBGT[:, 1] + 1.0) - - inters - ) - - overlaps = inters / uni - ovmax = np.max(overlaps) - jmax = np.argmax(overlaps) - - if ovmax > ovthresh: - if not R["difficult"][jmax]: - if not R["det"][jmax]: - tp[d] = 1.0 - R["det"][jmax] = 1 - else: - fp[d] = 1.0 - else: - fp[d] = 1.0 - - # compute precision recall - fp = np.cumsum(fp) - tp = np.cumsum(tp) - rec = tp / float(npos) - # avoid divide by zero in case the first detection matches a difficult - # ground truth - prec = tp / np.maximum(tp + fp, np.finfo(np.float64).eps) - ap = voc_ap(rec, prec, use_07_metric) - - return rec, prec, ap diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py deleted file mode 100755 index ea6d1b38..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/rotated_coco_evaluation.py +++ /dev/null @@ -1,207 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import json -import numpy as np -import os -import torch -from pycocotools.cocoeval import COCOeval, maskUtils - -from detectron2.structures import BoxMode, RotatedBoxes, pairwise_iou_rotated -from detectron2.utils.file_io import PathManager - -from .coco_evaluation import COCOEvaluator - - -class RotatedCOCOeval(COCOeval): - @staticmethod - def is_rotated(box_list): - if type(box_list) == np.ndarray: - return box_list.shape[1] == 5 - elif type(box_list) == list: - if box_list == []: # cannot decide the box_dim - return False - return np.all( - np.array( - [ - (len(obj) == 5) and ((type(obj) == list) or (type(obj) == np.ndarray)) - for obj in box_list - ] - ) - ) - return False - - @staticmethod - def boxlist_to_tensor(boxlist, output_box_dim): - if type(boxlist) == np.ndarray: - box_tensor = torch.from_numpy(boxlist) - elif type(boxlist) == list: - if boxlist == []: - return torch.zeros((0, output_box_dim), dtype=torch.float32) - else: - box_tensor = torch.FloatTensor(boxlist) - else: - raise Exception("Unrecognized boxlist type") - - input_box_dim = box_tensor.shape[1] - if input_box_dim != output_box_dim: - if input_box_dim == 4 and output_box_dim == 5: - box_tensor = BoxMode.convert(box_tensor, BoxMode.XYWH_ABS, BoxMode.XYWHA_ABS) - else: - raise Exception( - "Unable to convert from {}-dim box to {}-dim box".format( - input_box_dim, output_box_dim - ) - ) - return box_tensor - - def compute_iou_dt_gt(self, dt, gt, is_crowd): - if self.is_rotated(dt) or self.is_rotated(gt): - # TODO: take is_crowd into consideration - assert all(c == 0 for c in is_crowd) - dt = RotatedBoxes(self.boxlist_to_tensor(dt, output_box_dim=5)) - gt = RotatedBoxes(self.boxlist_to_tensor(gt, output_box_dim=5)) - return pairwise_iou_rotated(dt, gt) - else: - # This is the same as the classical COCO evaluation - return maskUtils.iou(dt, gt, is_crowd) - - def computeIoU(self, imgId, catId): - p = self.params - if p.useCats: - gt = self._gts[imgId, catId] - dt = self._dts[imgId, catId] - else: - gt = [_ for cId in p.catIds for _ in self._gts[imgId, cId]] - dt = [_ for cId in p.catIds for _ in self._dts[imgId, cId]] - if 
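# --- Editorial sketch (illustrative, not part of the original patch) ---
# Two numeric conventions from the PASCAL VOC evaluation code earlier in this
# diff: VOC treats box coordinates as inclusive pixel indices, hence the
# "+ 1.0" in widths and heights, and the non-VOC07 AP is the area under the
# monotone precision envelope. All numbers below are toy values.
import numpy as np

def voc_iou(bb, BBGT):
    """IoU of one detection bb (4,) against GT boxes BBGT (N, 4), VOC-style."""
    ixmin = np.maximum(BBGT[:, 0], bb[0])
    iymin = np.maximum(BBGT[:, 1], bb[1])
    ixmax = np.minimum(BBGT[:, 2], bb[2])
    iymax = np.minimum(BBGT[:, 3], bb[3])
    iw = np.maximum(ixmax - ixmin + 1.0, 0.0)
    ih = np.maximum(iymax - iymin + 1.0, 0.0)
    inters = iw * ih
    uni = (
        (bb[2] - bb[0] + 1.0) * (bb[3] - bb[1] + 1.0)
        + (BBGT[:, 2] - BBGT[:, 0] + 1.0) * (BBGT[:, 3] - BBGT[:, 1] + 1.0)
        - inters
    )
    return inters / uni

def voc_area_ap(rec, prec):
    """The 'correct AP' branch of voc_ap: area under the precision envelope."""
    mrec = np.concatenate(([0.0], rec, [1.0]))
    mpre = np.concatenate(([0.0], prec, [0.0]))
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i])
    i = np.where(mrec[1:] != mrec[:-1])[0]
    return np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1])

print(voc_iou(np.array([0, 0, 9, 9]), np.array([[5, 5, 14, 14]])))  # [0.1428...]
print(voc_area_ap(np.array([0.2, 0.4, 0.4, 0.8]),
                  np.array([1.0, 0.8, 0.6, 0.5])))                  # 0.56
# --- end editorial sketch ---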
len(gt) == 0 and len(dt) == 0: - return [] - inds = np.argsort([-d["score"] for d in dt], kind="mergesort") - dt = [dt[i] for i in inds] - if len(dt) > p.maxDets[-1]: - dt = dt[0 : p.maxDets[-1]] - - assert p.iouType == "bbox", "unsupported iouType for iou computation" - - g = [g["bbox"] for g in gt] - d = [d["bbox"] for d in dt] - - # compute iou between each dt and gt region - iscrowd = [int(o["iscrowd"]) for o in gt] - - # Note: this function is copied from cocoeval.py in cocoapi - # and the major difference is here. - ious = self.compute_iou_dt_gt(d, g, iscrowd) - return ious - - -class RotatedCOCOEvaluator(COCOEvaluator): - """ - Evaluate object proposal/instance detection outputs using COCO-like metrics and APIs, - with rotated boxes support. - Note: this uses IOU only and does not consider angle differences. - """ - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a COCO model (e.g., GeneralizedRCNN). - It is a list of dict. Each dict corresponds to an image and - contains keys like "height", "width", "file_name", "image_id". - outputs: the outputs of a COCO model. It is a list of dicts with key - "instances" that contains :class:`Instances`. - """ - for input, output in zip(inputs, outputs): - prediction = {"image_id": input["image_id"]} - - if "instances" in output: - instances = output["instances"].to(self._cpu_device) - - prediction["instances"] = self.instances_to_json(instances, input["image_id"]) - if "proposals" in output: - prediction["proposals"] = output["proposals"].to(self._cpu_device) - self._predictions.append(prediction) - - def instances_to_json(self, instances, img_id): - num_instance = len(instances) - if num_instance == 0: - return [] - - boxes = instances.pred_boxes.tensor.numpy() - if boxes.shape[1] == 4: - boxes = BoxMode.convert(boxes, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - boxes = boxes.tolist() - scores = instances.scores.tolist() - classes = instances.pred_classes.tolist() - - results = [] - for k in range(num_instance): - result = { - "image_id": img_id, - "category_id": classes[k], - "bbox": boxes[k], - "score": scores[k], - } - - results.append(result) - return results - - def _eval_predictions(self, predictions, img_ids=None): # img_ids: unused - """ - Evaluate predictions on the given tasks. - Fill self._results with the metrics of the tasks. 
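# --- Editorial sketch (illustrative; assumes a detectron2 install) ---
# The rotated-IoU path taken by compute_iou_dt_gt above. Rotated boxes are
# (x_ctr, y_ctr, w, h, angle in degrees); a tilted copy of the same box has
# IoU strictly below 1 with itself.
import torch
from detectron2.structures import RotatedBoxes, pairwise_iou_rotated

dt = RotatedBoxes(torch.tensor([[50.0, 50.0, 20.0, 10.0, 0.0]]))
gt = RotatedBoxes(torch.tensor([[50.0, 50.0, 20.0, 10.0, 45.0]]))
print(pairwise_iou_rotated(dt, gt))  # 1x1 IoU matrix, well below 1.0
# --- end editorial sketch ---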
- """ - self._logger.info("Preparing results for COCO format ...") - coco_results = list(itertools.chain(*[x["instances"] for x in predictions])) - - # unmap the category ids for COCO - if hasattr(self._metadata, "thing_dataset_id_to_contiguous_id"): - reverse_id_mapping = { - v: k for k, v in self._metadata.thing_dataset_id_to_contiguous_id.items() - } - for result in coco_results: - result["category_id"] = reverse_id_mapping[result["category_id"]] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "coco_instances_results.json") - self._logger.info("Saving results to {}".format(file_path)) - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(coco_results)) - f.flush() - - if not self._do_evaluation: - self._logger.info("Annotations are not available for evaluation.") - return - - self._logger.info("Evaluating predictions ...") - - assert self._tasks is None or set(self._tasks) == { - "bbox" - }, "[RotatedCOCOEvaluator] Only bbox evaluation is supported" - coco_eval = ( - self._evaluate_predictions_on_coco(self._coco_api, coco_results) - if len(coco_results) > 0 - else None # cocoapi does not handle empty results very well - ) - - task = "bbox" - res = self._derive_coco_results( - coco_eval, task, class_names=self._metadata.get("thing_classes") - ) - self._results[task] = res - - def _evaluate_predictions_on_coco(self, coco_gt, coco_results): - """ - Evaluate the coco results using COCOEval API. - """ - assert len(coco_results) > 0 - - coco_dt = coco_gt.loadRes(coco_results) - - # Only bbox is supported for now - coco_eval = RotatedCOCOeval(coco_gt, coco_dt, iouType="bbox") - - coco_eval.evaluate() - coco_eval.accumulate() - coco_eval.summarize() - - return coco_eval diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py deleted file mode 100755 index 7a19db71..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/sem_seg_evaluation.py +++ /dev/null @@ -1,184 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import json -import logging -import numpy as np -import os -from collections import OrderedDict -import PIL.Image as Image -import pycocotools.mask as mask_util -import torch - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.utils.comm import all_gather, is_main_process, synchronize -from detectron2.utils.file_io import PathManager - -from .evaluator import DatasetEvaluator - - -class SemSegEvaluator(DatasetEvaluator): - """ - Evaluate semantic segmentation metrics. - """ - - def __init__( - self, - dataset_name, - distributed=True, - output_dir=None, - *, - num_classes=None, - ignore_label=None, - ): - """ - Args: - dataset_name (str): name of the dataset to be evaluated. - distributed (bool): if True, will collect results from all ranks for evaluation. - Otherwise, will evaluate the results in the current process. - output_dir (str): an output directory to dump results. - num_classes, ignore_label: deprecated argument - """ - self._logger = logging.getLogger(__name__) - if num_classes is not None: - self._logger.warn( - "SemSegEvaluator(num_classes) is deprecated! It should be obtained from metadata." - ) - if ignore_label is not None: - self._logger.warn( - "SemSegEvaluator(ignore_label) is deprecated! It should be obtained from metadata." 
- ) - self._dataset_name = dataset_name - self._distributed = distributed - self._output_dir = output_dir - - self._cpu_device = torch.device("cpu") - - self.input_file_to_gt_file = { - dataset_record["file_name"]: dataset_record["sem_seg_file_name"] - for dataset_record in DatasetCatalog.get(dataset_name) - } - - meta = MetadataCatalog.get(dataset_name) - # Dict that maps contiguous training ids to COCO category ids - try: - c2d = meta.stuff_dataset_id_to_contiguous_id - self._contiguous_id_to_dataset_id = {v: k for k, v in c2d.items()} - except AttributeError: - self._contiguous_id_to_dataset_id = None - self._class_names = meta.stuff_classes - self._num_classes = len(meta.stuff_classes) - if num_classes is not None: - assert self._num_classes == num_classes, f"{self._num_classes} != {num_classes}" - self._ignore_label = ignore_label if ignore_label is not None else meta.ignore_label - - def reset(self): - self._conf_matrix = np.zeros((self._num_classes + 1, self._num_classes + 1), dtype=np.int64) - self._predictions = [] - - def process(self, inputs, outputs): - """ - Args: - inputs: the inputs to a model. - It is a list of dicts. Each dict corresponds to an image and - contains keys like "height", "width", "file_name". - outputs: the outputs of a model. It is either list of semantic segmentation predictions - (Tensor [H, W]) or list of dicts with key "sem_seg" that contains semantic - segmentation prediction in the same format. - """ - for input, output in zip(inputs, outputs): - output = output["sem_seg"].argmax(dim=0).to(self._cpu_device) - pred = np.array(output, dtype=np.int) - with PathManager.open(self.input_file_to_gt_file[input["file_name"]], "rb") as f: - gt = np.array(Image.open(f), dtype=np.int) - - gt[gt == self._ignore_label] = self._num_classes - - self._conf_matrix += np.bincount( - (self._num_classes + 1) * pred.reshape(-1) + gt.reshape(-1), - minlength=self._conf_matrix.size, - ).reshape(self._conf_matrix.shape) - - self._predictions.extend(self.encode_json_sem_seg(pred, input["file_name"])) - - def evaluate(self): - """ - Evaluates standard semantic segmentation metrics (http://cocodataset.org/#stuff-eval): - - * Mean intersection-over-union averaged across classes (mIoU) - * Frequency Weighted IoU (fwIoU) - * Mean pixel accuracy averaged across classes (mACC) - * Pixel Accuracy (pACC) - """ - if self._distributed: - synchronize() - conf_matrix_list = all_gather(self._conf_matrix) - self._predictions = all_gather(self._predictions) - self._predictions = list(itertools.chain(*self._predictions)) - if not is_main_process(): - return - - self._conf_matrix = np.zeros_like(self._conf_matrix) - for conf_matrix in conf_matrix_list: - self._conf_matrix += conf_matrix - - if self._output_dir: - PathManager.mkdirs(self._output_dir) - file_path = os.path.join(self._output_dir, "sem_seg_predictions.json") - with PathManager.open(file_path, "w") as f: - f.write(json.dumps(self._predictions)) - - acc = np.full(self._num_classes, np.nan, dtype=np.float) - iou = np.full(self._num_classes, np.nan, dtype=np.float) - tp = self._conf_matrix.diagonal()[:-1].astype(np.float) - pos_gt = np.sum(self._conf_matrix[:-1, :-1], axis=0).astype(np.float) - class_weights = pos_gt / np.sum(pos_gt) - pos_pred = np.sum(self._conf_matrix[:-1, :-1], axis=1).astype(np.float) - acc_valid = pos_gt > 0 - acc[acc_valid] = tp[acc_valid] / pos_gt[acc_valid] - iou_valid = (pos_gt + pos_pred) > 0 - union = pos_gt + pos_pred - tp - iou[acc_valid] = tp[acc_valid] / union[acc_valid] - macc = np.sum(acc[acc_valid]) / 
np.sum(acc_valid) - miou = np.sum(iou[acc_valid]) / np.sum(iou_valid) - fiou = np.sum(iou[acc_valid] * class_weights[acc_valid]) - pacc = np.sum(tp) / np.sum(pos_gt) - - res = {} - res["mIoU"] = 100 * miou - res["fwIoU"] = 100 * fiou - for i, name in enumerate(self._class_names): - res["IoU-{}".format(name)] = 100 * iou[i] - res["mACC"] = 100 * macc - res["pACC"] = 100 * pacc - for i, name in enumerate(self._class_names): - res["ACC-{}".format(name)] = 100 * acc[i] - - if self._output_dir: - file_path = os.path.join(self._output_dir, "sem_seg_evaluation.pth") - with PathManager.open(file_path, "wb") as f: - torch.save(res, f) - results = OrderedDict({"sem_seg": res}) - self._logger.info(results) - return results - - def encode_json_sem_seg(self, sem_seg, input_file_name): - """ - Convert semantic segmentation to COCO stuff format with segments encoded as RLEs. - See http://cocodataset.org/#format-results - """ - json_list = [] - for label in np.unique(sem_seg): - if self._contiguous_id_to_dataset_id is not None: - assert ( - label in self._contiguous_id_to_dataset_id - ), "Label {} is not in the metadata info for {}".format(label, self._dataset_name) - dataset_id = self._contiguous_id_to_dataset_id[label] - else: - dataset_id = int(label) - mask = (sem_seg == label).astype(np.uint8) - mask_rle = mask_util.encode(np.array(mask[:, :, None], order="F"))[0] - mask_rle["counts"] = mask_rle["counts"].decode("utf-8") - json_list.append( - {"file_name": input_file_name, "category_id": dataset_id, "segmentation": mask_rle} - ) - return json_list diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py deleted file mode 100755 index 9e5ae625..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/evaluation/testing.py +++ /dev/null @@ -1,85 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -import pprint -import sys -from collections.abc import Mapping - - -def print_csv_format(results): - """ - Print main metrics in a format similar to Detectron, - so that they are easy to copypaste into a spreadsheet. - - Args: - results (OrderedDict[dict]): task_name -> {metric -> score} - unordered dict can also be printed, but in arbitrary order - """ - assert isinstance(results, Mapping) or not len(results), results - logger = logging.getLogger(__name__) - for task, res in results.items(): - if isinstance(res, Mapping): - # Don't print "AP-category" metrics since they are usually not tracked. 
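# --- Editorial sketch (illustrative, not part of the original patch) ---
# The confusion-matrix trick in the SemSegEvaluator code earlier in this diff:
# one bincount over (K+1)*pred + gt builds a (K+1)x(K+1) matrix (row =
# predicted class, column = GT class; the extra slot holds ignored pixels),
# and mIoU/pACC follow from its diagonal and marginals. Toy 2-class example.
import numpy as np

K = 2                                    # num_classes
pred = np.array([0, 0, 1, 1, 1])
gt = np.array([0, 1, 1, 1, 2])           # gt == K marks ignored pixels
conf = np.bincount((K + 1) * pred + gt,
                   minlength=(K + 1) ** 2).reshape(K + 1, K + 1)
tp = conf.diagonal()[:-1].astype(float)
pos_gt = conf[:-1, :-1].sum(axis=0).astype(float)    # GT pixels per class
pos_pred = conf[:-1, :-1].sum(axis=1).astype(float)  # predicted pixels per class
iou = tp / np.maximum(pos_gt + pos_pred - tp, 1.0)
print("IoU per class:", iou)                          # [0.5, 0.6667]
print("pACC:", tp.sum() / pos_gt.sum())               # 0.75
# --- end editorial sketch ---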
- important_res = [(k, v) for k, v in res.items() if "-" not in k] - logger.info("copypaste: Task: {}".format(task)) - logger.info("copypaste: " + ",".join([k[0] for k in important_res])) - logger.info("copypaste: " + ",".join(["{0:.4f}".format(k[1]) for k in important_res])) - else: - logger.info(f"copypaste: {task}={res}") - - -def verify_results(cfg, results): - """ - Args: - results (OrderedDict[dict]): task_name -> {metric -> score} - - Returns: - bool: whether the verification succeeds or not - """ - expected_results = cfg.TEST.EXPECTED_RESULTS - if not len(expected_results): - return True - - ok = True - for task, metric, expected, tolerance in expected_results: - actual = results[task].get(metric, None) - if actual is None: - ok = False - continue - if not np.isfinite(actual): - ok = False - continue - diff = abs(actual - expected) - if diff > tolerance: - ok = False - - logger = logging.getLogger(__name__) - if not ok: - logger.error("Result verification failed!") - logger.error("Expected Results: " + str(expected_results)) - logger.error("Actual Results: " + pprint.pformat(results)) - - sys.exit(1) - else: - logger.info("Results verification passed.") - return ok - - -def flatten_results_dict(results): - """ - Expand a hierarchical dict of scalars into a flat dict of scalars. - If results[k1][k2][k3] = v, the returned dict will have the entry - {"k1/k2/k3": v}. - - Args: - results (dict): - """ - r = {} - for k, v in results.items(): - if isinstance(v, Mapping): - v = flatten_results_dict(v) - for kk, vv in v.items(): - r[k + "/" + kk] = vv - else: - r[k] = v - return r diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md deleted file mode 100755 index 9fcd3351..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/README.md +++ /dev/null @@ -1,13 +0,0 @@ - -This directory contains code to prepare a detectron2 model for deployment. -Currently it supports exporting a detectron2 model to Caffe2 format through ONNX. - -Please see [documentation](https://detectron2.readthedocs.io/tutorials/deployment.html) for its usage. - - -### Acknowledgements - -Thanks to Mobile Vision team at Facebook for developing the Caffe2 conversion tools. - -Thanks to Computing Platform Department - PAI team at Alibaba Group (@bddpqq, @chenbohua3) who -help export Detectron2 models to TorchScript. diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py deleted file mode 100755 index 25e5c946..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/__init__.py +++ /dev/null @@ -1,15 +0,0 @@ -# -*- coding: utf-8 -*- - -try: - from caffe2.proto import caffe2_pb2 as _tmp - - # caffe2 is optional -except ImportError: - pass -else: - from .api import * - -from .flatten import TracingAdapter -from .torchscript import scripting_with_instances, dump_torchscript_IR - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py deleted file mode 100755 index ad427218..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/api.py +++ /dev/null @@ -1,235 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
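# --- Editorial sketch (illustrative, not part of the original patch) ---
# A simplified version of flatten_results_dict above (plain dicts instead of
# collections.abc.Mapping): nested scalar dicts become "a/b/c" keys.
def flatten_results_dict(results):
    r = {}
    for k, v in results.items():
        if isinstance(v, dict):
            for kk, vv in flatten_results_dict(v).items():
                r[k + "/" + kk] = vv
        else:
            r[k] = v
    return r

print(flatten_results_dict({"bbox": {"AP": 40.1, "AP50": 60.2}, "iter": 90000}))
# -> {'bbox/AP': 40.1, 'bbox/AP50': 60.2, 'iter': 90000}
# --- end editorial sketch ---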
-import copy -import logging -import os -import torch -from caffe2.proto import caffe2_pb2 -from torch import nn - -from detectron2.config import CfgNode -from detectron2.utils.file_io import PathManager - -from .caffe2_inference import ProtobufDetectionModel -from .caffe2_modeling import META_ARCH_CAFFE2_EXPORT_TYPE_MAP, convert_batched_inputs_to_c2_format -from .shared import get_pb_arg_vali, get_pb_arg_vals, save_graph - -__all__ = [ - "add_export_config", - "Caffe2Model", - "Caffe2Tracer", -] - - -def add_export_config(cfg): - return cfg - - -class Caffe2Tracer: - """ - Make a detectron2 model traceable with Caffe2 operators. - This class creates a traceable version of a detectron2 model which: - - 1. Rewrite parts of the model using ops in Caffe2. Note that some ops do - not have GPU implementation in Caffe2. - 2. Remove post-processing and only produce raw layer outputs - - After making a traceable model, the class provide methods to export such a - model to different deployment formats. - Exported graph produced by this class take two input tensors: - - 1. (1, C, H, W) float "data" which is an image (usually in [0, 255]). - (H, W) often has to be padded to multiple of 32 (depend on the model - architecture). - 2. 1x3 float "im_info", each row of which is (height, width, 1.0). - Height and width are true image shapes before padding. - - The class currently only supports models using builtin meta architectures. - Batch inference is not supported, and contributions are welcome. - """ - - def __init__(self, cfg: CfgNode, model: nn.Module, inputs): - """ - Args: - cfg (CfgNode): a detectron2 config used to construct caffe2-compatible model. - model (nn.Module): An original pytorch model. Must be among a few official models - in detectron2 that can be converted to become caffe2-compatible automatically. - Weights have to be already loaded to this model. - inputs: sample inputs that the given model takes for inference. - Will be used to trace the model. For most models, random inputs with - no detected objects will not work as they lead to wrong traces. - """ - assert isinstance(cfg, CfgNode), cfg - assert isinstance(model, torch.nn.Module), type(model) - - # TODO make it support custom models, by passing in c2 model directly - C2MetaArch = META_ARCH_CAFFE2_EXPORT_TYPE_MAP[cfg.MODEL.META_ARCHITECTURE] - self.traceable_model = C2MetaArch(cfg, copy.deepcopy(model)) - self.inputs = inputs - self.traceable_inputs = self.traceable_model.get_caffe2_inputs(inputs) - - def export_caffe2(self): - """ - Export the model to Caffe2's protobuf format. - The returned object can be saved with its :meth:`.save_protobuf()` method. - The result can be loaded and executed using Caffe2 runtime. - - Returns: - :class:`Caffe2Model` - """ - from .caffe2_export import export_caffe2_detection_model - - predict_net, init_net = export_caffe2_detection_model( - self.traceable_model, self.traceable_inputs - ) - return Caffe2Model(predict_net, init_net) - - def export_onnx(self): - """ - Export the model to ONNX format. - Note that the exported model contains custom ops only available in caffe2, therefore it - cannot be directly executed by other runtime (such as onnxruntime or TensorRT). - Post-processing or transformation passes may be applied on the model to accommodate - different runtimes, but we currently do not provide support for them. - - Returns: - onnx.ModelProto: an onnx model. 
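# --- Editorial sketch (illustrative; assumes a caffe2-era detectron2 build
# where detectron2.export exposes Caffe2Tracer; cfg/torch_model/sample_inputs
# are caller-supplied placeholders) of the workflow described above. ---
def export_to_caffe2(cfg, torch_model, sample_inputs, output_dir):
    """Trace a weights-loaded detectron2 model and save Caffe2 protobufs."""
    from detectron2.export import Caffe2Tracer

    tracer = Caffe2Tracer(cfg, torch_model, sample_inputs)
    c2_model = tracer.export_caffe2()    # -> Caffe2Model
    c2_model.save_protobuf(output_dir)   # model.pb / model_init.pb / model.pbtxt
    return c2_model
# --- end editorial sketch ---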
- """ - from .caffe2_export import export_onnx_model as export_onnx_model_impl - - return export_onnx_model_impl(self.traceable_model, (self.traceable_inputs,)) - - def export_torchscript(self): - """ - Export the model to a ``torch.jit.TracedModule`` by tracing. - The returned object can be saved to a file by ``.save()``. - - Returns: - torch.jit.TracedModule: a torch TracedModule - """ - logger = logging.getLogger(__name__) - logger.info("Tracing the model with torch.jit.trace ...") - with torch.no_grad(): - return torch.jit.trace(self.traceable_model, (self.traceable_inputs,)) - - -class Caffe2Model(nn.Module): - """ - A wrapper around the traced model in Caffe2's protobuf format. - The exported graph has different inputs/outputs from the original Pytorch - model, as explained in :class:`Caffe2Tracer`. This class wraps around the - exported graph to simulate the same interface as the original Pytorch model. - It also provides functions to save/load models in Caffe2's format.' - - Examples: - :: - c2_model = Caffe2Tracer(cfg, torch_model, inputs).export_caffe2() - inputs = [{"image": img_tensor_CHW}] - outputs = c2_model(inputs) - orig_outputs = torch_model(inputs) - """ - - def __init__(self, predict_net, init_net): - super().__init__() - self.eval() # always in eval mode - self._predict_net = predict_net - self._init_net = init_net - self._predictor = None - - __init__.__HIDE_SPHINX_DOC__ = True - - @property - def predict_net(self): - """ - caffe2.core.Net: the underlying caffe2 predict net - """ - return self._predict_net - - @property - def init_net(self): - """ - caffe2.core.Net: the underlying caffe2 init net - """ - return self._init_net - - def save_protobuf(self, output_dir): - """ - Save the model as caffe2's protobuf format. - It saves the following files: - - * "model.pb": definition of the graph. Can be visualized with - tools like `netron `_. - * "model_init.pb": model parameters - * "model.pbtxt": human-readable definition of the graph. Not - needed for deployment. - - Args: - output_dir (str): the output directory to save protobuf files. - """ - logger = logging.getLogger(__name__) - logger.info("Saving model to {} ...".format(output_dir)) - if not PathManager.exists(output_dir): - PathManager.mkdirs(output_dir) - - with PathManager.open(os.path.join(output_dir, "model.pb"), "wb") as f: - f.write(self._predict_net.SerializeToString()) - with PathManager.open(os.path.join(output_dir, "model.pbtxt"), "w") as f: - f.write(str(self._predict_net)) - with PathManager.open(os.path.join(output_dir, "model_init.pb"), "wb") as f: - f.write(self._init_net.SerializeToString()) - - def save_graph(self, output_file, inputs=None): - """ - Save the graph as SVG format. - - Args: - output_file (str): a SVG file - inputs: optional inputs given to the model. - If given, the inputs will be used to run the graph to record - shape of every tensor. The shape information will be - saved together with the graph. 
- """ - from .caffe2_export import run_and_save_graph - - if inputs is None: - save_graph(self._predict_net, output_file, op_only=False) - else: - size_divisibility = get_pb_arg_vali(self._predict_net, "size_divisibility", 0) - device = get_pb_arg_vals(self._predict_net, "device", b"cpu").decode("ascii") - inputs = convert_batched_inputs_to_c2_format(inputs, size_divisibility, device) - inputs = [x.cpu().numpy() for x in inputs] - run_and_save_graph(self._predict_net, self._init_net, inputs, output_file) - - @staticmethod - def load_protobuf(dir): - """ - Args: - dir (str): a directory used to save Caffe2Model with - :meth:`save_protobuf`. - The files "model.pb" and "model_init.pb" are needed. - - Returns: - Caffe2Model: the caffe2 model loaded from this directory. - """ - predict_net = caffe2_pb2.NetDef() - with PathManager.open(os.path.join(dir, "model.pb"), "rb") as f: - predict_net.ParseFromString(f.read()) - - init_net = caffe2_pb2.NetDef() - with PathManager.open(os.path.join(dir, "model_init.pb"), "rb") as f: - init_net.ParseFromString(f.read()) - - return Caffe2Model(predict_net, init_net) - - def __call__(self, inputs): - """ - An interface that wraps around a Caffe2 model and mimics detectron2's models' - input/output format. See details about the format at :doc:`/tutorials/models`. - This is used to compare the outputs of caffe2 model with its original torch model. - - Due to the extra conversion between Pytorch/Caffe2, this method is not meant for - benchmark. Because of the conversion, this method also has dependency - on detectron2 in order to convert to detectron2's output format. - """ - if self._predictor is None: - self._predictor = ProtobufDetectionModel(self._predict_net, self._init_net) - return self._predictor(inputs) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py deleted file mode 100755 index 25ee2300..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/c10.py +++ /dev/null @@ -1,534 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import math -import torch -import torch.nn.functional as F - -from detectron2.layers import cat -from detectron2.layers.roi_align_rotated import ROIAlignRotated -from detectron2.modeling import poolers -from detectron2.modeling.proposal_generator import rpn -from detectron2.modeling.roi_heads.mask_head import mask_rcnn_inference -from detectron2.structures import Boxes, ImageList, Instances, Keypoints - -from .shared import alias, to_device - - -""" -This file contains caffe2-compatible implementation of several detectron2 components. -""" - - -class Caffe2Boxes(Boxes): - """ - Representing a list of detectron2.structures.Boxes from minibatch, each box - is represented by a 5d vector (batch index + 4 coordinates), or a 6d vector - (batch index + 5 coordinates) for RotatedBoxes. - """ - - def __init__(self, tensor): - assert isinstance(tensor, torch.Tensor) - assert tensor.dim() == 2 and tensor.size(-1) in [4, 5, 6], tensor.size() - # TODO: make tensor immutable when dim is Nx5 for Boxes, - # and Nx6 for RotatedBoxes? - self.tensor = tensor - - -# TODO clean up this class, maybe just extend Instances -class InstancesList(object): - """ - Tensor representation of a list of Instances object for a batch of images. 
- - When dealing with a batch of images with Caffe2 ops, a list of bboxes - (instances) are usually represented by single Tensor with size - (sigma(Ni), 5) or (sigma(Ni), 4) plus a batch split Tensor. This class is - for providing common functions to convert between these two representations. - """ - - def __init__(self, im_info, indices, extra_fields=None): - # [N, 3] -> (H, W, Scale) - self.im_info = im_info - # [N,] -> indice of batch to which the instance belongs - self.indices = indices - # [N, ...] - self.batch_extra_fields = extra_fields or {} - - self.image_size = self.im_info - - def get_fields(self): - """like `get_fields` in the Instances object, - but return each field in tensor representations""" - ret = {} - for k, v in self.batch_extra_fields.items(): - # if isinstance(v, torch.Tensor): - # tensor_rep = v - # elif isinstance(v, (Boxes, Keypoints)): - # tensor_rep = v.tensor - # else: - # raise ValueError("Can't find tensor representation for: {}".format()) - ret[k] = v - return ret - - def has(self, name): - return name in self.batch_extra_fields - - def set(self, name, value): - data_len = len(value) - if len(self.batch_extra_fields): - assert ( - len(self) == data_len - ), "Adding a field of length {} to a Instances of length {}".format(data_len, len(self)) - self.batch_extra_fields[name] = value - - def __setattr__(self, name, val): - if name in ["im_info", "indices", "batch_extra_fields", "image_size"]: - super().__setattr__(name, val) - else: - self.set(name, val) - - def __getattr__(self, name): - if name not in self.batch_extra_fields: - raise AttributeError("Cannot find field '{}' in the given Instances!".format(name)) - return self.batch_extra_fields[name] - - def __len__(self): - return len(self.indices) - - def flatten(self): - ret = [] - for _, v in self.batch_extra_fields.items(): - if isinstance(v, (Boxes, Keypoints)): - ret.append(v.tensor) - else: - ret.append(v) - return ret - - @staticmethod - def to_d2_instances_list(instances_list): - """ - Convert InstancesList to List[Instances]. The input `instances_list` can - also be a List[Instances], in this case this method is a non-op. - """ - if not isinstance(instances_list, InstancesList): - assert all(isinstance(x, Instances) for x in instances_list) - return instances_list - - ret = [] - for i, info in enumerate(instances_list.im_info): - instances = Instances(torch.Size([int(info[0].item()), int(info[1].item())])) - - ids = instances_list.indices == i - for k, v in instances_list.batch_extra_fields.items(): - if isinstance(v, torch.Tensor): - instances.set(k, v[ids]) - continue - elif isinstance(v, Boxes): - instances.set(k, v[ids, -4:]) - continue - - target_type, tensor_source = v - assert isinstance(tensor_source, torch.Tensor) - assert tensor_source.shape[0] == instances_list.indices.shape[0] - tensor_source = tensor_source[ids] - - if issubclass(target_type, Boxes): - instances.set(k, Boxes(tensor_source[:, -4:])) - elif issubclass(target_type, Keypoints): - instances.set(k, Keypoints(tensor_source)) - elif issubclass(target_type, torch.Tensor): - instances.set(k, tensor_source) - else: - raise ValueError("Can't handle targe type: {}".format(target_type)) - - ret.append(instances) - return ret - - -class Caffe2Compatible(object): - """ - A model can inherit this class to indicate that it can be traced and deployed with caffe2. 
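# --- Editorial sketch (illustrative, not part of the original patch) ---
# The flattened batch representation InstancesList wraps above: boxes from all
# images are stacked into one (sum_i Ni, 5) tensor whose first column is the
# image index within the batch, so per-image boxes are recovered by masking.
import torch

boxes_img0 = torch.tensor([[10.0, 10.0, 50.0, 50.0]])            # 1 box
boxes_img1 = torch.tensor([[0.0, 0.0, 20.0, 20.0],
                           [5.0, 5.0, 30.0, 40.0]])              # 2 boxes
batch_ids = torch.tensor([[0.0], [1.0], [1.0]])
flat = torch.cat([batch_ids, torch.cat([boxes_img0, boxes_img1])], dim=1)
print(flat.shape)                 # torch.Size([3, 5])
print(flat[flat[:, 0] == 1, 1:])  # recover image 1's boxes
# --- end editorial sketch ---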
- """ - - def _get_tensor_mode(self): - return self._tensor_mode - - def _set_tensor_mode(self, v): - self._tensor_mode = v - - tensor_mode = property(_get_tensor_mode, _set_tensor_mode) - """ - If true, the model expects C2-style tensor only inputs/outputs format. - """ - - -class Caffe2RPN(Caffe2Compatible, rpn.RPN): - def _generate_proposals( - self, images, objectness_logits_pred, anchor_deltas_pred, gt_instances=None - ): - assert isinstance(images, ImageList) - if self.tensor_mode: - im_info = images.image_sizes - else: - im_info = torch.tensor([[im_sz[0], im_sz[1], 1.0] for im_sz in images.image_sizes]).to( - images.tensor.device - ) - assert isinstance(im_info, torch.Tensor) - - rpn_rois_list = [] - rpn_roi_probs_list = [] - for scores, bbox_deltas, cell_anchors_tensor, feat_stride in zip( - objectness_logits_pred, - anchor_deltas_pred, - iter(self.anchor_generator.cell_anchors), - self.anchor_generator.strides, - ): - scores = scores.detach() - bbox_deltas = bbox_deltas.detach() - - rpn_rois, rpn_roi_probs = torch.ops._caffe2.GenerateProposals( - scores, - bbox_deltas, - im_info, - cell_anchors_tensor, - spatial_scale=1.0 / feat_stride, - pre_nms_topN=self.pre_nms_topk[self.training], - post_nms_topN=self.post_nms_topk[self.training], - nms_thresh=self.nms_thresh, - min_size=self.min_box_size, - # correct_transform_coords=True, # deprecated argument - angle_bound_on=True, # Default - angle_bound_lo=-180, - angle_bound_hi=180, - clip_angle_thresh=1.0, # Default - legacy_plus_one=False, - ) - rpn_rois_list.append(rpn_rois) - rpn_roi_probs_list.append(rpn_roi_probs) - - # For FPN in D2, in RPN all proposals from different levels are concated - # together, ranked and picked by top post_nms_topk. Then in ROIPooler - # it calculates level_assignments and calls the RoIAlign from - # the corresponding level. - - if len(objectness_logits_pred) == 1: - rpn_rois = rpn_rois_list[0] - rpn_roi_probs = rpn_roi_probs_list[0] - else: - assert len(rpn_rois_list) == len(rpn_roi_probs_list) - rpn_post_nms_topN = self.post_nms_topk[self.training] - - device = rpn_rois_list[0].device - input_list = [to_device(x, "cpu") for x in (rpn_rois_list + rpn_roi_probs_list)] - - # TODO remove this after confirming rpn_max_level/rpn_min_level - # is not needed in CollectRpnProposals. - feature_strides = list(self.anchor_generator.strides) - rpn_min_level = int(math.log2(feature_strides[0])) - rpn_max_level = int(math.log2(feature_strides[-1])) - assert (rpn_max_level - rpn_min_level + 1) == len( - rpn_rois_list - ), "CollectRpnProposals requires continuous levels" - - rpn_rois = torch.ops._caffe2.CollectRpnProposals( - input_list, - # NOTE: in current implementation, rpn_max_level and rpn_min_level - # are not needed, only the subtraction of two matters and it - # can be infer from the number of inputs. Keep them now for - # consistency. 
- rpn_max_level=2 + len(rpn_rois_list) - 1, - rpn_min_level=2, - rpn_post_nms_topN=rpn_post_nms_topN, - ) - rpn_rois = to_device(rpn_rois, device) - rpn_roi_probs = [] - - proposals = self.c2_postprocess(im_info, rpn_rois, rpn_roi_probs, self.tensor_mode) - return proposals, {} - - def forward(self, images, features, gt_instances=None): - assert not self.training - features = [features[f] for f in self.in_features] - objectness_logits_pred, anchor_deltas_pred = self.rpn_head(features) - return self._generate_proposals( - images, - objectness_logits_pred, - anchor_deltas_pred, - gt_instances, - ) - - @staticmethod - def c2_postprocess(im_info, rpn_rois, rpn_roi_probs, tensor_mode): - proposals = InstancesList( - im_info=im_info, - indices=rpn_rois[:, 0], - extra_fields={ - "proposal_boxes": Caffe2Boxes(rpn_rois), - "objectness_logits": (torch.Tensor, rpn_roi_probs), - }, - ) - if not tensor_mode: - proposals = InstancesList.to_d2_instances_list(proposals) - else: - proposals = [proposals] - return proposals - - -class Caffe2ROIPooler(Caffe2Compatible, poolers.ROIPooler): - @staticmethod - def c2_preprocess(box_lists): - assert all(isinstance(x, Boxes) for x in box_lists) - if all(isinstance(x, Caffe2Boxes) for x in box_lists): - # input is pure-tensor based - assert len(box_lists) == 1 - pooler_fmt_boxes = box_lists[0].tensor - else: - pooler_fmt_boxes = poolers.convert_boxes_to_pooler_format(box_lists) - return pooler_fmt_boxes - - def forward(self, x, box_lists): - assert not self.training - - pooler_fmt_boxes = self.c2_preprocess(box_lists) - num_level_assignments = len(self.level_poolers) - - if num_level_assignments == 1: - if isinstance(self.level_poolers[0], ROIAlignRotated): - c2_roi_align = torch.ops._caffe2.RoIAlignRotated - aligned = True - else: - c2_roi_align = torch.ops._caffe2.RoIAlign - aligned = self.level_poolers[0].aligned - - x0 = x[0] - if x0.is_quantized: - x0 = x0.dequantize() - - out = c2_roi_align( - x0, - pooler_fmt_boxes, - order="NCHW", - spatial_scale=float(self.level_poolers[0].spatial_scale), - pooled_h=int(self.output_size[0]), - pooled_w=int(self.output_size[1]), - sampling_ratio=int(self.level_poolers[0].sampling_ratio), - aligned=aligned, - ) - return out - - device = pooler_fmt_boxes.device - assert ( - self.max_level - self.min_level + 1 == 4 - ), "Currently DistributeFpnProposals only support 4 levels" - fpn_outputs = torch.ops._caffe2.DistributeFpnProposals( - to_device(pooler_fmt_boxes, "cpu"), - roi_canonical_scale=self.canonical_box_size, - roi_canonical_level=self.canonical_level, - roi_max_level=self.max_level, - roi_min_level=self.min_level, - legacy_plus_one=False, - ) - fpn_outputs = [to_device(x, device) for x in fpn_outputs] - - rois_fpn_list = fpn_outputs[:-1] - rois_idx_restore_int32 = fpn_outputs[-1] - - roi_feat_fpn_list = [] - for roi_fpn, x_level, pooler in zip(rois_fpn_list, x, self.level_poolers): - if isinstance(pooler, ROIAlignRotated): - c2_roi_align = torch.ops._caffe2.RoIAlignRotated - aligned = True - else: - c2_roi_align = torch.ops._caffe2.RoIAlign - aligned = bool(pooler.aligned) - - if x_level.is_quantized: - x_level = x_level.dequantize() - - roi_feat_fpn = c2_roi_align( - x_level, - roi_fpn, - order="NCHW", - spatial_scale=float(pooler.spatial_scale), - pooled_h=int(self.output_size[0]), - pooled_w=int(self.output_size[1]), - sampling_ratio=int(pooler.sampling_ratio), - aligned=aligned, - ) - roi_feat_fpn_list.append(roi_feat_fpn) - - roi_feat_shuffled = cat(roi_feat_fpn_list, dim=0) - assert roi_feat_shuffled.numel() > 0 
and rois_idx_restore_int32.numel() > 0, ( - "Caffe2 export requires tracing with a model checkpoint + input that can produce valid" - " detections. But no detections were obtained with the given checkpoint and input!" - ) - roi_feat = torch.ops._caffe2.BatchPermutation(roi_feat_shuffled, rois_idx_restore_int32) - return roi_feat - - -class Caffe2FastRCNNOutputsInference: - def __init__(self, tensor_mode): - self.tensor_mode = tensor_mode # whether the output is caffe2 tensor mode - - def __call__(self, box_predictor, predictions, proposals): - """equivalent to FastRCNNOutputLayers.inference""" - num_classes = box_predictor.num_classes - score_thresh = box_predictor.test_score_thresh - nms_thresh = box_predictor.test_nms_thresh - topk_per_image = box_predictor.test_topk_per_image - is_rotated = len(box_predictor.box2box_transform.weights) == 5 - - if is_rotated: - box_dim = 5 - assert box_predictor.box2box_transform.weights[4] == 1, ( - "The weights for Rotated BBoxTransform in C2 have only 4 dimensions," - + " thus enforcing the angle weight to be 1 for now" - ) - box2box_transform_weights = box_predictor.box2box_transform.weights[:4] - else: - box_dim = 4 - box2box_transform_weights = box_predictor.box2box_transform.weights - - class_logits, box_regression = predictions - if num_classes + 1 == class_logits.shape[1]: - class_prob = F.softmax(class_logits, -1) - else: - assert num_classes == class_logits.shape[1] - class_prob = F.sigmoid(class_logits) - # BoxWithNMSLimit will infer num_classes from the shape of the class_prob - # So append a zero column as placeholder for the background class - class_prob = torch.cat((class_prob, torch.zeros(class_prob.shape[0], 1)), dim=1) - - assert box_regression.shape[1] % box_dim == 0 - cls_agnostic_bbox_reg = box_regression.shape[1] // box_dim == 1 - - input_tensor_mode = proposals[0].proposal_boxes.tensor.shape[1] == box_dim + 1 - - rois = type(proposals[0].proposal_boxes).cat([p.proposal_boxes for p in proposals]) - device, dtype = rois.tensor.device, rois.tensor.dtype - if input_tensor_mode: - im_info = proposals[0].image_size - rois = rois.tensor - else: - im_info = torch.tensor( - [[sz[0], sz[1], 1.0] for sz in [x.image_size for x in proposals]] - ) - batch_ids = cat( - [ - torch.full((b, 1), i, dtype=dtype, device=device) - for i, b in enumerate(len(p) for p in proposals) - ], - dim=0, - ) - rois = torch.cat([batch_ids, rois.tensor], dim=1) - - roi_pred_bbox, roi_batch_splits = torch.ops._caffe2.BBoxTransform( - to_device(rois, "cpu"), - to_device(box_regression, "cpu"), - to_device(im_info, "cpu"), - weights=box2box_transform_weights, - apply_scale=True, - rotated=is_rotated, - angle_bound_on=True, - angle_bound_lo=-180, - angle_bound_hi=180, - clip_angle_thresh=1.0, - legacy_plus_one=False, - ) - roi_pred_bbox = to_device(roi_pred_bbox, device) - roi_batch_splits = to_device(roi_batch_splits, device) - - nms_outputs = torch.ops._caffe2.BoxWithNMSLimit( - to_device(class_prob, "cpu"), - to_device(roi_pred_bbox, "cpu"), - to_device(roi_batch_splits, "cpu"), - score_thresh=float(score_thresh), - nms=float(nms_thresh), - detections_per_im=int(topk_per_image), - soft_nms_enabled=False, - soft_nms_method="linear", - soft_nms_sigma=0.5, - soft_nms_min_score_thres=0.001, - rotated=is_rotated, - cls_agnostic_bbox_reg=cls_agnostic_bbox_reg, - input_boxes_include_bg_cls=False, - output_classes_include_bg_cls=False, - legacy_plus_one=False, - ) - roi_score_nms = to_device(nms_outputs[0], device) - roi_bbox_nms = to_device(nms_outputs[1], device) - 
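# --- Editorial sketch (illustrative, not part of the original patch) ---
# The class-probability handling above: when the classifier has no background
# logit (the sigmoid case), a zero background column is appended so
# BoxWithNMSLimit can still infer the class count from the tensor width.
import torch

num_classes = 3
class_logits = torch.randn(5, num_classes)           # no background column
class_prob = torch.sigmoid(class_logits)
class_prob = torch.cat((class_prob, torch.zeros(class_prob.shape[0], 1)), dim=1)
print(class_prob.shape)                              # torch.Size([5, 4])
# --- end editorial sketch ---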
roi_class_nms = to_device(nms_outputs[2], device) - roi_batch_splits_nms = to_device(nms_outputs[3], device) - roi_keeps_nms = to_device(nms_outputs[4], device) - roi_keeps_size_nms = to_device(nms_outputs[5], device) - if not self.tensor_mode: - roi_class_nms = roi_class_nms.to(torch.int64) - - roi_batch_ids = cat( - [ - torch.full((b, 1), i, dtype=dtype, device=device) - for i, b in enumerate(int(x.item()) for x in roi_batch_splits_nms) - ], - dim=0, - ) - - roi_class_nms = alias(roi_class_nms, "class_nms") - roi_score_nms = alias(roi_score_nms, "score_nms") - roi_bbox_nms = alias(roi_bbox_nms, "bbox_nms") - roi_batch_splits_nms = alias(roi_batch_splits_nms, "batch_splits_nms") - roi_keeps_nms = alias(roi_keeps_nms, "keeps_nms") - roi_keeps_size_nms = alias(roi_keeps_size_nms, "keeps_size_nms") - - results = InstancesList( - im_info=im_info, - indices=roi_batch_ids[:, 0], - extra_fields={ - "pred_boxes": Caffe2Boxes(roi_bbox_nms), - "scores": roi_score_nms, - "pred_classes": roi_class_nms, - }, - ) - - if not self.tensor_mode: - results = InstancesList.to_d2_instances_list(results) - batch_splits = roi_batch_splits_nms.int().tolist() - kept_indices = list(roi_keeps_nms.to(torch.int64).split(batch_splits)) - else: - results = [results] - kept_indices = [roi_keeps_nms] - - return results, kept_indices - - -class Caffe2MaskRCNNInference: - def __call__(self, pred_mask_logits, pred_instances): - """equivalent to mask_head.mask_rcnn_inference""" - if all(isinstance(x, InstancesList) for x in pred_instances): - assert len(pred_instances) == 1 - mask_probs_pred = pred_mask_logits.sigmoid() - mask_probs_pred = alias(mask_probs_pred, "mask_fcn_probs") - pred_instances[0].pred_masks = mask_probs_pred - else: - mask_rcnn_inference(pred_mask_logits, pred_instances) - - -class Caffe2KeypointRCNNInference: - def __init__(self, use_heatmap_max_keypoint): - self.use_heatmap_max_keypoint = use_heatmap_max_keypoint - - def __call__(self, pred_keypoint_logits, pred_instances): - # just return the keypoint heatmap for now, - # there will be option to call HeatmapMaxKeypointOp - output = alias(pred_keypoint_logits, "kps_score") - if all(isinstance(x, InstancesList) for x in pred_instances): - assert len(pred_instances) == 1 - if self.use_heatmap_max_keypoint: - device = output.device - output = torch.ops._caffe2.HeatmapMaxKeypoint( - to_device(output, "cpu"), - pred_instances[0].pred_boxes.tensor, - should_output_softmax=True, # worth make it configerable? - ) - output = to_device(output, device) - output = alias(output, "keypoints_out") - pred_instances[0].pred_keypoints = output - return pred_keypoint_logits diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py deleted file mode 100755 index 74ac123a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_export.py +++ /dev/null @@ -1,207 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
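# --- Editorial sketch (illustrative, not part of the original patch) ---
# The batch-splitting step above: flat NMS outputs are cut back into per-image
# chunks using the batch_splits counts, which is what tensor.split(list) does.
import torch

kept = torch.arange(7)                  # flat kept indices across the batch
batch_splits = [3, 4]                   # image 0 kept 3 dets, image 1 kept 4
per_image = kept.split(batch_splits)
print([t.tolist() for t in per_image])  # [[0, 1, 2], [3, 4, 5, 6]]
# --- end editorial sketch ---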
- -import copy -import io -import logging -import numpy as np -from typing import List -import onnx -import torch -from caffe2.proto import caffe2_pb2 -from caffe2.python import core -from caffe2.python.onnx.backend import Caffe2Backend -from tabulate import tabulate -from termcolor import colored -from torch.onnx import OperatorExportTypes - -from .shared import ( - ScopedWS, - construct_init_net_from_params, - fuse_alias_placeholder, - fuse_copy_between_cpu_and_gpu, - get_params_from_init_net, - group_norm_replace_aten_with_caffe2, - infer_device_type, - remove_dead_end_ops, - remove_reshape_for_fc, - save_graph, -) - -logger = logging.getLogger(__name__) - - -def export_onnx_model(model, inputs): - """ - Trace and export a model to onnx format. - - Args: - model (nn.Module): - inputs (tuple[args]): the model will be called by `model(*inputs)` - - Returns: - an onnx model - """ - assert isinstance(model, torch.nn.Module) - - # make sure all modules are in eval mode, onnx may change the training state - # of the module if the states are not consistent - def _check_eval(module): - assert not module.training - - model.apply(_check_eval) - - # Export the model to ONNX - with torch.no_grad(): - with io.BytesIO() as f: - torch.onnx.export( - model, - inputs, - f, - operator_export_type=OperatorExportTypes.ONNX_ATEN_FALLBACK, - # verbose=True, # NOTE: uncomment this for debugging - # export_params=True, - ) - onnx_model = onnx.load_from_string(f.getvalue()) - - # Apply ONNX's Optimization - all_passes = onnx.optimizer.get_available_passes() - passes = ["fuse_bn_into_conv"] - assert all(p in all_passes for p in passes) - onnx_model = onnx.optimizer.optimize(onnx_model, passes) - return onnx_model - - -def _op_stats(net_def): - type_count = {} - for t in [op.type for op in net_def.op]: - type_count[t] = type_count.get(t, 0) + 1 - type_count_list = sorted(type_count.items(), key=lambda kv: kv[0]) # alphabet - type_count_list = sorted(type_count_list, key=lambda kv: -kv[1]) # count - return "\n".join("{:>4}x {}".format(count, name) for name, count in type_count_list) - - -def _assign_device_option( - predict_net: caffe2_pb2.NetDef, init_net: caffe2_pb2.NetDef, tensor_inputs: List[torch.Tensor] -): - """ - ONNX exported network doesn't have concept of device, assign necessary - device option for each op in order to make it runable on GPU runtime. 
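# --- Editorial sketch (illustrative, not part of the original patch) ---
# A simplified version of the op-frequency summary _op_stats above produces,
# using collections.Counter on a toy op-type list instead of a caffe2 NetDef.
from collections import Counter

op_types = ["Conv", "Relu", "Conv", "FC", "Relu", "Conv"]
for name, count in Counter(op_types).most_common():
    print(f"{count:>4}x {name}")
# prints "   3x Conv", "   2x Relu", "   1x FC"
# --- end editorial sketch ---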
- """ - - def _get_device_type(torch_tensor): - assert torch_tensor.device.type in ["cpu", "cuda"] - assert torch_tensor.device.index == 0 - return torch_tensor.device.type - - def _assign_op_device_option(net_proto, net_ssa, blob_device_types): - for op, ssa_i in zip(net_proto.op, net_ssa): - if op.type in ["CopyCPUToGPU", "CopyGPUToCPU"]: - op.device_option.CopyFrom(core.DeviceOption(caffe2_pb2.CUDA, 0)) - else: - devices = [blob_device_types[b] for b in ssa_i[0] + ssa_i[1]] - assert all(d == devices[0] for d in devices) - if devices[0] == "cuda": - op.device_option.CopyFrom(core.DeviceOption(caffe2_pb2.CUDA, 0)) - - # update ops in predict_net - predict_net_input_device_types = { - (name, 0): _get_device_type(tensor) - for name, tensor in zip(predict_net.external_input, tensor_inputs) - } - predict_net_device_types = infer_device_type( - predict_net, known_status=predict_net_input_device_types, device_name_style="pytorch" - ) - predict_net_ssa, _ = core.get_ssa(predict_net) - _assign_op_device_option(predict_net, predict_net_ssa, predict_net_device_types) - - # update ops in init_net - init_net_ssa, versions = core.get_ssa(init_net) - init_net_output_device_types = { - (name, versions[name]): predict_net_device_types[(name, 0)] - for name in init_net.external_output - } - init_net_device_types = infer_device_type( - init_net, known_status=init_net_output_device_types, device_name_style="pytorch" - ) - _assign_op_device_option(init_net, init_net_ssa, init_net_device_types) - - -def export_caffe2_detection_model(model: torch.nn.Module, tensor_inputs: List[torch.Tensor]): - """ - Export a caffe2-compatible Detectron2 model to caffe2 format via ONNX. - - Arg: - model: a caffe2-compatible version of detectron2 model, defined in caffe2_modeling.py - tensor_inputs: a list of tensors that caffe2 model takes as input. - """ - model = copy.deepcopy(model) - assert isinstance(model, torch.nn.Module) - assert hasattr(model, "encode_additional_info") - - # Export via ONNX - logger.info( - "Exporting a {} model via ONNX ...".format(type(model).__name__) - + " Some warnings from ONNX are expected and are usually not to worry about." - ) - onnx_model = export_onnx_model(model, (tensor_inputs,)) - # Convert ONNX model to Caffe2 protobuf - init_net, predict_net = Caffe2Backend.onnx_graph_to_caffe2_net(onnx_model) - ops_table = [[op.type, op.input, op.output] for op in predict_net.op] - table = tabulate(ops_table, headers=["type", "input", "output"], tablefmt="pipe") - logger.info( - "ONNX export Done. Exported predict_net (before optimizations):\n" + colored(table, "cyan") - ) - - # Apply protobuf optimization - fuse_alias_placeholder(predict_net, init_net) - if any(t.device.type != "cpu" for t in tensor_inputs): - fuse_copy_between_cpu_and_gpu(predict_net) - remove_dead_end_ops(init_net) - _assign_device_option(predict_net, init_net, tensor_inputs) - params, device_options = get_params_from_init_net(init_net) - predict_net, params = remove_reshape_for_fc(predict_net, params) - init_net = construct_init_net_from_params(params, device_options) - group_norm_replace_aten_with_caffe2(predict_net) - - # Record necessary information for running the pb model in Detectron2 system. 
- model.encode_additional_info(predict_net, init_net) - - logger.info("Operators used in predict_net: \n{}".format(_op_stats(predict_net))) - logger.info("Operators used in init_net: \n{}".format(_op_stats(init_net))) - - return predict_net, init_net - - -def run_and_save_graph(predict_net, init_net, tensor_inputs, graph_save_path): - """ - Run the caffe2 model on given inputs, recording the shape and draw the graph. - - predict_net/init_net: caffe2 model. - tensor_inputs: a list of tensors that caffe2 model takes as input. - graph_save_path: path for saving graph of exported model. - """ - - logger.info("Saving graph of ONNX exported model to {} ...".format(graph_save_path)) - save_graph(predict_net, graph_save_path, op_only=False) - - # Run the exported Caffe2 net - logger.info("Running ONNX exported model ...") - with ScopedWS("__ws_tmp__", True) as ws: - ws.RunNetOnce(init_net) - initialized_blobs = set(ws.Blobs()) - uninitialized = [inp for inp in predict_net.external_input if inp not in initialized_blobs] - for name, blob in zip(uninitialized, tensor_inputs): - ws.FeedBlob(name, blob) - - try: - ws.RunNetOnce(predict_net) - except RuntimeError as e: - logger.warning("Encountered RuntimeError: \n{}".format(str(e))) - - ws_blobs = {b: ws.FetchBlob(b) for b in ws.Blobs()} - blob_sizes = {b: ws_blobs[b].shape for b in ws_blobs if isinstance(ws_blobs[b], np.ndarray)} - - logger.info("Saving graph with blob shapes to {} ...".format(graph_save_path)) - save_graph(predict_net, graph_save_path, op_only=False, blob_sizes=blob_sizes) - - return ws_blobs diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py deleted file mode 100755 index deb886c0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_inference.py +++ /dev/null @@ -1,161 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -from itertools import count -import torch -from caffe2.proto import caffe2_pb2 -from caffe2.python import core - -from .caffe2_modeling import META_ARCH_CAFFE2_EXPORT_TYPE_MAP, convert_batched_inputs_to_c2_format -from .shared import ScopedWS, get_pb_arg_vali, get_pb_arg_vals, infer_device_type - -logger = logging.getLogger(__name__) - - -# ===== ref: mobile-vision predictor's 'Caffe2Wrapper' class ====== -class ProtobufModel(torch.nn.Module): - """ - Wrapper of a caffe2's protobuf model. - It works just like nn.Module, but running caffe2 under the hood. - Input/Output are tuple[tensor] that match the caffe2 net's external_input/output. 
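# --- Editorial sketch (illustrative; requires a caffe2 build — SwitchWorkspace,
# RunNetOnce, FeedBlob and FetchBlob are real caffe2.python.workspace calls) ---
# The feed-run-fetch pattern that run_and_save_graph above performs via ScopedWS.
def run_caffe2_once(predict_net, init_net, input_names, input_arrays):
    from caffe2.python import workspace

    workspace.SwitchWorkspace("__ws_tmp__", True)   # fresh, isolated workspace
    workspace.RunNetOnce(init_net)                  # fill the parameters
    for name, arr in zip(input_names, input_arrays):
        workspace.FeedBlob(name, arr)
    workspace.RunNetOnce(predict_net)
    return {b: workspace.FetchBlob(b) for b in workspace.Blobs()}
# --- end editorial sketch ---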
- """ - - _ids = count(0) - - def __init__(self, predict_net, init_net): - logger.info(f"Initializing ProtobufModel for: {predict_net.name} ...") - super().__init__() - assert isinstance(predict_net, caffe2_pb2.NetDef) - assert isinstance(init_net, caffe2_pb2.NetDef) - # create unique temporary workspace for each instance - self.ws_name = "__tmp_ProtobufModel_{}__".format(next(self._ids)) - self.net = core.Net(predict_net) - - logger.info("Running init_net once to fill the parameters ...") - with ScopedWS(self.ws_name, is_reset=True, is_cleanup=False) as ws: - ws.RunNetOnce(init_net) - uninitialized_external_input = [] - for blob in self.net.Proto().external_input: - if blob not in ws.Blobs(): - uninitialized_external_input.append(blob) - ws.CreateBlob(blob) - ws.CreateNet(self.net) - - self._error_msgs = set() - self._input_blobs = uninitialized_external_input - - def _infer_output_devices(self, inputs): - """ - Returns: - list[str]: list of device for each external output - """ - - def _get_device_type(torch_tensor): - assert torch_tensor.device.type in ["cpu", "cuda"] - assert torch_tensor.device.index == 0 - return torch_tensor.device.type - - predict_net = self.net.Proto() - input_device_types = { - (name, 0): _get_device_type(tensor) for name, tensor in zip(self._input_blobs, inputs) - } - device_type_map = infer_device_type( - predict_net, known_status=input_device_types, device_name_style="pytorch" - ) - ssa, versions = core.get_ssa(predict_net) - versioned_outputs = [(name, versions[name]) for name in predict_net.external_output] - output_devices = [device_type_map[outp] for outp in versioned_outputs] - return output_devices - - def forward(self, inputs): - """ - Args: - inputs (tuple[torch.Tensor]) - - Returns: - tuple[torch.Tensor] - """ - assert len(inputs) == len(self._input_blobs), ( - f"Length of inputs ({len(inputs)}) " - f"doesn't match the required input blobs: {self._input_blobs}" - ) - - with ScopedWS(self.ws_name, is_reset=False, is_cleanup=False) as ws: - for b, tensor in zip(self._input_blobs, inputs): - ws.FeedBlob(b, tensor) - - try: - ws.RunNet(self.net.Proto().name) - except RuntimeError as e: - if not str(e) in self._error_msgs: - self._error_msgs.add(str(e)) - logger.warning("Encountered new RuntimeError: \n{}".format(str(e))) - logger.warning("Catch the error and use partial results.") - - c2_outputs = [ws.FetchBlob(b) for b in self.net.Proto().external_output] - # Remove outputs of current run, this is necessary in order to - # prevent fetching the result from previous run if the model fails - # in the middle. - for b in self.net.Proto().external_output: - # Needs to create uninitialized blob to make the net runable. - # This is "equivalent" to: ws.RemoveBlob(b) then ws.CreateBlob(b), - # but there'no such API. 
- ws.FeedBlob(b, f"{b}, a C++ native class of type nullptr (uninitialized).") - - # Cast output to torch.Tensor on the desired device - output_devices = ( - self._infer_output_devices(inputs) - if any(t.device.type != "cpu" for t in inputs) - else ["cpu" for _ in self.net.Proto().external_output] - ) - - outputs = [] - for name, c2_output, device in zip( - self.net.Proto().external_output, c2_outputs, output_devices - ): - if not isinstance(c2_output, np.ndarray): - raise RuntimeError( - "Invalid output for blob {}, received: {}".format(name, c2_output) - ) - outputs.append(torch.tensor(c2_output).to(device=device)) - return tuple(outputs) - - -class ProtobufDetectionModel(torch.nn.Module): - """ - A class works just like a pytorch meta arch in terms of inference, but running - caffe2 model under the hood. - """ - - def __init__(self, predict_net, init_net, *, convert_outputs=None): - """ - Args: - predict_net, init_net (core.Net): caffe2 nets - convert_outptus (callable): a function that converts caffe2 - outputs to the same format of the original pytorch model. - By default, use the one defined in the caffe2 meta_arch. - """ - super().__init__() - self.protobuf_model = ProtobufModel(predict_net, init_net) - self.size_divisibility = get_pb_arg_vali(predict_net, "size_divisibility", 0) - self.device = get_pb_arg_vals(predict_net, "device", b"cpu").decode("ascii") - - if convert_outputs is None: - meta_arch = get_pb_arg_vals(predict_net, "meta_architecture", b"GeneralizedRCNN") - meta_arch = META_ARCH_CAFFE2_EXPORT_TYPE_MAP[meta_arch.decode("ascii")] - self._convert_outputs = meta_arch.get_outputs_converter(predict_net, init_net) - else: - self._convert_outputs = convert_outputs - - def _convert_inputs(self, batched_inputs): - # currently all models convert inputs in the same way - return convert_batched_inputs_to_c2_format( - batched_inputs, self.size_divisibility, self.device - ) - - def forward(self, batched_inputs): - c2_inputs = self._convert_inputs(batched_inputs) - c2_results = self.protobuf_model(c2_inputs) - c2_results = dict(zip(self.protobuf_model.net.Proto().external_output, c2_results)) - return self._convert_outputs(batched_inputs, c2_inputs, c2_results) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py deleted file mode 100755 index e00de4ad..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_modeling.py +++ /dev/null @@ -1,419 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import functools -import io -import struct -import types -import torch - -from detectron2.modeling import meta_arch -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.roi_heads import keypoint_head -from detectron2.structures import Boxes, ImageList, Instances, RotatedBoxes - -from .c10 import Caffe2Compatible -from .caffe2_patch import ROIHeadsPatcher, patch_generalized_rcnn -from .shared import ( - alias, - check_set_pb_arg, - get_pb_arg_floats, - get_pb_arg_valf, - get_pb_arg_vali, - get_pb_arg_vals, - mock_torch_nn_functional_interpolate, -) - - -def assemble_rcnn_outputs_by_name(image_sizes, tensor_outputs, force_mask_on=False): - """ - A function to assemble caffe2 model's outputs (i.e. Dict[str, Tensor]) - to detectron2's format (i.e. list of Instances instance). - This only works when the model follows the Caffe2 detectron's naming convention. 
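In use, `ProtobufDetectionModel` lets the exported protobuf pair stand in for the original pytorch meta arch. A hedged sketch (requires caffe2; the `.pb` file names are placeholders):

```
# Hypothetical: load the exported nets and run them like a detectron2 model.
import torch
from caffe2.proto import caffe2_pb2

def load_net(path):
    net = caffe2_pb2.NetDef()
    with open(path, "rb") as f:
        net.ParseFromString(f.read())
    return net

predict_net = load_net("model.pb")
init_net = load_net("model_init.pb")

model = ProtobufDetectionModel(predict_net, init_net)
batched_inputs = [{"image": torch.zeros(3, 480, 640), "height": 480, "width": 640}]
outputs = model(batched_inputs)  # same output format as the pytorch meta arch
```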
-
-    Args:
-        image_sizes (List[List[int, int]]): [H, W] of every image.
-        tensor_outputs (Dict[str, Tensor]): external_output to its tensor.
-
-        force_mask_on (Bool): if true, make sure there will be pred_masks even
-            if the mask is not found in tensor_outputs (usually due to model crash)
-    """
-
-    results = [Instances(image_size) for image_size in image_sizes]
-
-    batch_splits = tensor_outputs.get("batch_splits", None)
-    if batch_splits:
-        raise NotImplementedError()
-    assert len(image_sizes) == 1
-    result = results[0]
-
-    bbox_nms = tensor_outputs["bbox_nms"]
-    score_nms = tensor_outputs["score_nms"]
-    class_nms = tensor_outputs["class_nms"]
-    # Detection will always succeed because Conv supports 0-batch
-    assert bbox_nms is not None
-    assert score_nms is not None
-    assert class_nms is not None
-    if bbox_nms.shape[1] == 5:
-        result.pred_boxes = RotatedBoxes(bbox_nms)
-    else:
-        result.pred_boxes = Boxes(bbox_nms)
-    result.scores = score_nms
-    result.pred_classes = class_nms.to(torch.int64)
-
-    mask_fcn_probs = tensor_outputs.get("mask_fcn_probs", None)
-    if mask_fcn_probs is not None:
-        # finish the mask pred
-        mask_probs_pred = mask_fcn_probs
-        num_masks = mask_probs_pred.shape[0]
-        class_pred = result.pred_classes
-        indices = torch.arange(num_masks, device=class_pred.device)
-        mask_probs_pred = mask_probs_pred[indices, class_pred][:, None]
-        result.pred_masks = mask_probs_pred
-    elif force_mask_on:
-        # NOTE: there's no way to know the height/width of mask here, it won't be
-        # used anyway when batch size is 0, so just set them to 0.
-        result.pred_masks = torch.zeros([0, 1, 0, 0], dtype=torch.uint8)
-
-    keypoints_out = tensor_outputs.get("keypoints_out", None)
-    kps_score = tensor_outputs.get("kps_score", None)
-    if keypoints_out is not None:
-        # keypoints_out: [N, 4, #keypoints], where 4 is in order of (x, y, score, prob)
-        keypoints_tensor = keypoints_out
-        # NOTE: it's possible that prob is not calculated if "should_output_softmax"
-        # is set to False in HeatmapMaxKeypoint, so just use the raw score; it
-        # doesn't seem to affect mAP. TODO: check more carefully.
-        keypoint_xyp = keypoints_tensor.transpose(1, 2)[:, :, [0, 1, 2]]
-        result.pred_keypoints = keypoint_xyp
-    elif kps_score is not None:
-        # keypoint heatmap to sparse data structure
-        pred_keypoint_logits = kps_score
-        keypoint_head.keypoint_rcnn_inference(pred_keypoint_logits, [result])
-
-    return results
-
-
-def _cast_to_f32(f64):
-    return struct.unpack("f", struct.pack("f", f64))[0]
-
-
-def set_caffe2_compatible_tensor_mode(model, enable=True):
-    def _fn(m):
-        if isinstance(m, Caffe2Compatible):
-            m.tensor_mode = enable
-
-    model.apply(_fn)
-
-
-def convert_batched_inputs_to_c2_format(batched_inputs, size_divisibility, device):
-    """
-    See get_caffe2_inputs() below.
-    """
-    assert all(isinstance(x, dict) for x in batched_inputs)
-    assert all(x["image"].dim() == 3 for x in batched_inputs)
-
-    images = [x["image"] for x in batched_inputs]
-    images = ImageList.from_tensors(images, size_divisibility)
-
-    im_info = []
-    for input_per_image, image_size in zip(batched_inputs, images.image_sizes):
-        target_height = input_per_image.get("height", image_size[0])
-        target_width = input_per_image.get("width", image_size[1])  # noqa
-        # NOTE: The scale inside im_info is kept as convention and for providing
-        # post-processing information if further processing is needed. For
-        # current Caffe2 model definitions that don't include post-processing inside
-        # the model, this number is not used.
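A standalone approximation of this conversion, using plain torch in place of detectron2's `ImageList`, shows the `(data, im_info)` contract: a padded NCHW batch plus one `(height, width, scale)` row per image:

```
# Torch-only sketch of the (data, im_info) contract; the real helper
# delegates the padding/batching to detectron2's ImageList.
import math
import torch

def to_c2_inputs(batched_inputs, size_divisibility=32):
    sizes = [(x["image"].shape[1], x["image"].shape[2]) for x in batched_inputs]
    # pad spatial dims up to a multiple of size_divisibility, as ImageList does
    pad_h = int(math.ceil(max(s[0] for s in sizes) / size_divisibility) * size_divisibility)
    pad_w = int(math.ceil(max(s[1] for s in sizes) / size_divisibility) * size_divisibility)
    data = torch.zeros(len(batched_inputs), 3, pad_h, pad_w)
    im_info = []
    for i, (x, (h, w)) in enumerate(zip(batched_inputs, sizes)):
        data[i, :, :h, :w] = x["image"]
        scale = x.get("height", h) / h  # post-processing hint, unused by the net
        im_info.append([h, w, scale])
    return data, torch.tensor(im_info)

data, im_info = to_c2_inputs([{"image": torch.rand(3, 480, 640), "height": 480}])
print(data.shape, im_info)  # torch.Size([1, 3, 480, 640]) tensor([[480., 640., 1.]])
```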
- # NOTE: There can be a slight difference between width and height - # scales, using a single number can results in numerical difference - # compared with D2's post-processing. - scale = target_height / image_size[0] - im_info.append([image_size[0], image_size[1], scale]) - im_info = torch.Tensor(im_info) - - return images.tensor.to(device), im_info.to(device) - - -class Caffe2MetaArch(Caffe2Compatible, torch.nn.Module): - """ - Base class for caffe2-compatible implementation of a meta architecture. - The forward is traceable and its traced graph can be converted to caffe2 - graph through ONNX. - """ - - def __init__(self, cfg, torch_model): - """ - Args: - cfg (CfgNode): - torch_model (nn.Module): the detectron2 model (meta_arch) to be - converted. - """ - super().__init__() - self._wrapped_model = torch_model - self.eval() - set_caffe2_compatible_tensor_mode(self, True) - - def get_caffe2_inputs(self, batched_inputs): - """ - Convert pytorch-style structured inputs to caffe2-style inputs that - are tuples of tensors. - - Args: - batched_inputs (list[dict]): inputs to a detectron2 model - in its standard format. Each dict has "image" (CHW tensor), and optionally - "height" and "width". - - Returns: - tuple[Tensor]: - tuple of tensors that will be the inputs to the - :meth:`forward` method. For existing models, the first - is an NCHW tensor (padded and batched); the second is - a im_info Nx3 tensor, where the rows are - (height, width, unused legacy parameter) - """ - return convert_batched_inputs_to_c2_format( - batched_inputs, - self._wrapped_model.backbone.size_divisibility, - self._wrapped_model.device, - ) - - def encode_additional_info(self, predict_net, init_net): - """ - Save extra metadata that will be used by inference in the output protobuf. - """ - pass - - def forward(self, inputs): - """ - Run the forward in caffe2-style. It has to use caffe2-compatible ops - and the method will be used for tracing. - - Args: - inputs (tuple[Tensor]): inputs defined by :meth:`get_caffe2_input`. - They will be the inputs of the converted caffe2 graph. - - Returns: - tuple[Tensor]: output tensors. They will be the outputs of the - converted caffe2 graph. - """ - raise NotImplementedError - - def _caffe2_preprocess_image(self, inputs): - """ - Caffe2 implementation of preprocess_image, which is called inside each MetaArch's forward. - It normalizes the input images, and the final caffe2 graph assumes the - inputs have been batched already. - """ - data, im_info = inputs - data = alias(data, "data") - im_info = alias(im_info, "im_info") - mean, std = self._wrapped_model.pixel_mean, self._wrapped_model.pixel_std - normalized_data = (data - mean) / std - normalized_data = alias(normalized_data, "normalized_data") - - # Pack (data, im_info) into ImageList which is recognized by self.inference. - images = ImageList(tensor=normalized_data, image_sizes=im_info) - return images - - @staticmethod - def get_outputs_converter(predict_net, init_net): - """ - Creates a function that converts outputs of the caffe2 model to - detectron2's standard format. - The function uses information in `predict_net` and `init_net` that are - available at inferene time. Therefore the function logic can be used in inference. - - The returned function has the following signature: - - def convert(batched_inputs, c2_inputs, c2_results) -> detectron2_outputs - - Where - - * batched_inputs (list[dict]): the original input format of the meta arch - * c2_inputs (tuple[Tensor]): the caffe2 inputs. 
- * c2_results (dict[str, Tensor]): the caffe2 output format, - corresponding to the outputs of the :meth:`forward` function. - * detectron2_outputs: the original output format of the meta arch. - - This function can be used to compare the outputs of the original meta arch and - the converted caffe2 graph. - - Returns: - callable: a callable of the above signature. - """ - raise NotImplementedError - - -class Caffe2GeneralizedRCNN(Caffe2MetaArch): - def __init__(self, cfg, torch_model): - assert isinstance(torch_model, meta_arch.GeneralizedRCNN) - torch_model = patch_generalized_rcnn(torch_model) - super().__init__(cfg, torch_model) - - try: - use_heatmap_max_keypoint = cfg.EXPORT_CAFFE2.USE_HEATMAP_MAX_KEYPOINT - except AttributeError: - use_heatmap_max_keypoint = False - self.roi_heads_patcher = ROIHeadsPatcher( - self._wrapped_model.roi_heads, use_heatmap_max_keypoint - ) - - def encode_additional_info(self, predict_net, init_net): - size_divisibility = self._wrapped_model.backbone.size_divisibility - check_set_pb_arg(predict_net, "size_divisibility", "i", size_divisibility) - check_set_pb_arg( - predict_net, "device", "s", str.encode(str(self._wrapped_model.device), "ascii") - ) - check_set_pb_arg(predict_net, "meta_architecture", "s", b"GeneralizedRCNN") - - @mock_torch_nn_functional_interpolate() - def forward(self, inputs): - if not self.tensor_mode: - return self._wrapped_model.inference(inputs) - images = self._caffe2_preprocess_image(inputs) - features = self._wrapped_model.backbone(images.tensor) - proposals, _ = self._wrapped_model.proposal_generator(images, features) - with self.roi_heads_patcher.mock_roi_heads(): - detector_results, _ = self._wrapped_model.roi_heads(images, features, proposals) - return tuple(detector_results[0].flatten()) - - @staticmethod - def get_outputs_converter(predict_net, init_net): - def f(batched_inputs, c2_inputs, c2_results): - _, im_info = c2_inputs - image_sizes = [[int(im[0]), int(im[1])] for im in im_info] - results = assemble_rcnn_outputs_by_name(image_sizes, c2_results) - return meta_arch.GeneralizedRCNN._postprocess(results, batched_inputs, image_sizes) - - return f - - -class Caffe2RetinaNet(Caffe2MetaArch): - def __init__(self, cfg, torch_model): - assert isinstance(torch_model, meta_arch.RetinaNet) - super().__init__(cfg, torch_model) - - @mock_torch_nn_functional_interpolate() - def forward(self, inputs): - assert self.tensor_mode - images = self._caffe2_preprocess_image(inputs) - - # explicitly return the images sizes to avoid removing "im_info" by ONNX - # since it's not used in the forward path - return_tensors = [images.image_sizes] - - features = self._wrapped_model.backbone(images.tensor) - features = [features[f] for f in self._wrapped_model.head_in_features] - for i, feature_i in enumerate(features): - features[i] = alias(feature_i, "feature_{}".format(i), is_backward=True) - return_tensors.append(features[i]) - - pred_logits, pred_anchor_deltas = self._wrapped_model.head(features) - for i, (box_cls_i, box_delta_i) in enumerate(zip(pred_logits, pred_anchor_deltas)): - return_tensors.append(alias(box_cls_i, "box_cls_{}".format(i))) - return_tensors.append(alias(box_delta_i, "box_delta_{}".format(i))) - - return tuple(return_tensors) - - def encode_additional_info(self, predict_net, init_net): - size_divisibility = self._wrapped_model.backbone.size_divisibility - check_set_pb_arg(predict_net, "size_divisibility", "i", size_divisibility) - check_set_pb_arg( - predict_net, "device", "s", str.encode(str(self._wrapped_model.device), 
"ascii") - ) - check_set_pb_arg(predict_net, "meta_architecture", "s", b"RetinaNet") - - # Inference parameters: - check_set_pb_arg( - predict_net, "score_threshold", "f", _cast_to_f32(self._wrapped_model.test_score_thresh) - ) - check_set_pb_arg( - predict_net, "topk_candidates", "i", self._wrapped_model.test_topk_candidates - ) - check_set_pb_arg( - predict_net, "nms_threshold", "f", _cast_to_f32(self._wrapped_model.test_nms_thresh) - ) - check_set_pb_arg( - predict_net, - "max_detections_per_image", - "i", - self._wrapped_model.max_detections_per_image, - ) - - check_set_pb_arg( - predict_net, - "bbox_reg_weights", - "floats", - [_cast_to_f32(w) for w in self._wrapped_model.box2box_transform.weights], - ) - self._encode_anchor_generator_cfg(predict_net) - - def _encode_anchor_generator_cfg(self, predict_net): - # serialize anchor_generator for future use - serialized_anchor_generator = io.BytesIO() - torch.save(self._wrapped_model.anchor_generator, serialized_anchor_generator) - # Ideally we can put anchor generating inside the model, then we don't - # need to store this information. - bytes = serialized_anchor_generator.getvalue() - check_set_pb_arg(predict_net, "serialized_anchor_generator", "s", bytes) - - @staticmethod - def get_outputs_converter(predict_net, init_net): - self = types.SimpleNamespace() - serialized_anchor_generator = io.BytesIO( - get_pb_arg_vals(predict_net, "serialized_anchor_generator", None) - ) - self.anchor_generator = torch.load(serialized_anchor_generator) - bbox_reg_weights = get_pb_arg_floats(predict_net, "bbox_reg_weights", None) - self.box2box_transform = Box2BoxTransform(weights=tuple(bbox_reg_weights)) - self.test_score_thresh = get_pb_arg_valf(predict_net, "score_threshold", None) - self.test_topk_candidates = get_pb_arg_vali(predict_net, "topk_candidates", None) - self.test_nms_thresh = get_pb_arg_valf(predict_net, "nms_threshold", None) - self.max_detections_per_image = get_pb_arg_vali( - predict_net, "max_detections_per_image", None - ) - - # hack to reuse inference code from RetinaNet - for meth in [ - "forward_inference", - "inference_single_image", - "_transpose_dense_predictions", - "_decode_multi_level_predictions", - "_decode_per_level_predictions", - ]: - setattr(self, meth, functools.partial(getattr(meta_arch.RetinaNet, meth), self)) - - def f(batched_inputs, c2_inputs, c2_results): - _, im_info = c2_inputs - image_sizes = [[int(im[0]), int(im[1])] for im in im_info] - dummy_images = ImageList( - torch.randn( - ( - len(im_info), - 3, - ) - + tuple(image_sizes[0]) - ), - image_sizes, - ) - - num_features = len([x for x in c2_results.keys() if x.startswith("box_cls_")]) - pred_logits = [c2_results["box_cls_{}".format(i)] for i in range(num_features)] - pred_anchor_deltas = [c2_results["box_delta_{}".format(i)] for i in range(num_features)] - - # For each feature level, feature should have the same batch size and - # spatial dimension as the box_cls and box_delta. 
- dummy_features = [x.clone()[:, 0:0, :, :] for x in pred_logits] - # self.num_classess can be inferred - self.num_classes = pred_logits[0].shape[1] // (pred_anchor_deltas[0].shape[1] // 4) - - results = self.forward_inference( - dummy_images, dummy_features, [pred_logits, pred_anchor_deltas] - ) - return meta_arch.GeneralizedRCNN._postprocess(results, batched_inputs, image_sizes) - - return f - - -META_ARCH_CAFFE2_EXPORT_TYPE_MAP = { - "GeneralizedRCNN": Caffe2GeneralizedRCNN, - "RetinaNet": Caffe2RetinaNet, -} diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py deleted file mode 100755 index c9eee594..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/caffe2_patch.py +++ /dev/null @@ -1,152 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import contextlib -from unittest import mock -import torch - -from detectron2.modeling import poolers -from detectron2.modeling.proposal_generator import rpn -from detectron2.modeling.roi_heads import keypoint_head, mask_head -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers - -from .c10 import ( - Caffe2Compatible, - Caffe2FastRCNNOutputsInference, - Caffe2KeypointRCNNInference, - Caffe2MaskRCNNInference, - Caffe2ROIPooler, - Caffe2RPN, -) - - -class GenericMixin(object): - pass - - -class Caffe2CompatibleConverter(object): - """ - A GenericUpdater which implements the `create_from` interface, by modifying - module object and assign it with another class replaceCls. - """ - - def __init__(self, replaceCls): - self.replaceCls = replaceCls - - def create_from(self, module): - # update module's class to the new class - assert isinstance(module, torch.nn.Module) - if issubclass(self.replaceCls, GenericMixin): - # replaceCls should act as mixin, create a new class on-the-fly - new_class = type( - "{}MixedWith{}".format(self.replaceCls.__name__, module.__class__.__name__), - (self.replaceCls, module.__class__), - {}, # {"new_method": lambda self: ...}, - ) - module.__class__ = new_class - else: - # replaceCls is complete class, this allow arbitrary class swap - module.__class__ = self.replaceCls - - # initialize Caffe2Compatible - if isinstance(module, Caffe2Compatible): - module.tensor_mode = False - - return module - - -def patch(model, target, updater, *args, **kwargs): - """ - recursively (post-order) update all modules with the target type and its - subclasses, make a initialization/composition/inheritance/... via the - updater.create_from. 
- """ - for name, module in model.named_children(): - model._modules[name] = patch(module, target, updater, *args, **kwargs) - if isinstance(model, target): - return updater.create_from(model, *args, **kwargs) - return model - - -def patch_generalized_rcnn(model): - ccc = Caffe2CompatibleConverter - model = patch(model, rpn.RPN, ccc(Caffe2RPN)) - model = patch(model, poolers.ROIPooler, ccc(Caffe2ROIPooler)) - - return model - - -@contextlib.contextmanager -def mock_fastrcnn_outputs_inference( - tensor_mode, check=True, box_predictor_type=FastRCNNOutputLayers -): - with mock.patch.object( - box_predictor_type, - "inference", - autospec=True, - side_effect=Caffe2FastRCNNOutputsInference(tensor_mode), - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -@contextlib.contextmanager -def mock_mask_rcnn_inference(tensor_mode, patched_module, check=True): - with mock.patch( - "{}.mask_rcnn_inference".format(patched_module), side_effect=Caffe2MaskRCNNInference() - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -@contextlib.contextmanager -def mock_keypoint_rcnn_inference(tensor_mode, patched_module, use_heatmap_max_keypoint, check=True): - with mock.patch( - "{}.keypoint_rcnn_inference".format(patched_module), - side_effect=Caffe2KeypointRCNNInference(use_heatmap_max_keypoint), - ) as mocked_func: - yield - if check: - assert mocked_func.call_count > 0 - - -class ROIHeadsPatcher: - def __init__(self, heads, use_heatmap_max_keypoint): - self.heads = heads - self.use_heatmap_max_keypoint = use_heatmap_max_keypoint - - @contextlib.contextmanager - def mock_roi_heads(self, tensor_mode=True): - """ - Patching several inference functions inside ROIHeads and its subclasses - - Args: - tensor_mode (bool): whether the inputs/outputs are caffe2's tensor - format or not. Default to True. - """ - # NOTE: this requries the `keypoint_rcnn_inference` and `mask_rcnn_inference` - # are called inside the same file as BaseXxxHead due to using mock.patch. - kpt_heads_mod = keypoint_head.BaseKeypointRCNNHead.__module__ - mask_head_mod = mask_head.BaseMaskRCNNHead.__module__ - - mock_ctx_managers = [ - mock_fastrcnn_outputs_inference( - tensor_mode=tensor_mode, - check=True, - box_predictor_type=type(self.heads.box_predictor), - ) - ] - if getattr(self.heads, "keypoint_on", False): - mock_ctx_managers += [ - mock_keypoint_rcnn_inference( - tensor_mode, kpt_heads_mod, self.use_heatmap_max_keypoint - ) - ] - if getattr(self.heads, "mask_on", False): - mock_ctx_managers += [mock_mask_rcnn_inference(tensor_mode, mask_head_mod)] - - with contextlib.ExitStack() as stack: # python 3.3+ - for mgr in mock_ctx_managers: - stack.enter_context(mgr) - yield diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py deleted file mode 100755 index f5ba4297..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/flatten.py +++ /dev/null @@ -1,330 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import collections -from dataclasses import dataclass -from typing import Callable, List, Optional, Tuple -import torch -from torch import nn - -from detectron2.structures import Boxes, Instances, ROIMasks -from detectron2.utils.registry import _convert_target_to_string, locate - -from .torchscript_patch import patch_builtin_len - - -@dataclass -class Schema: - """ - A Schema defines how to flatten a possibly hierarchical object into tuple of - primitive objects, so it can be used as inputs/outputs of PyTorch's tracing. - - PyTorch does not support tracing a function that produces rich output - structures (e.g. dict, Instances, Boxes). To trace such a function, we - flatten the rich object into tuple of tensors, and return this tuple of tensors - instead. Meanwhile, we also need to know how to "rebuild" the original object - from the flattened results, so we can evaluate the flattened results. - A Schema defines how to flatten an object, and while flattening it, it records - necessary schemas so that the object can be rebuilt using the flattened outputs. - - The flattened object and the schema object is returned by ``.flatten`` classmethod. - Then the original object can be rebuilt with the ``__call__`` method of schema. - - A Schema is a dataclass that can be serialized easily. - """ - - # inspired by FetchMapper in tensorflow/python/client/session.py - - @classmethod - def flatten(cls, obj): - raise NotImplementedError - - def __call__(self, values): - raise NotImplementedError - - @staticmethod - def _concat(values): - ret = () - sizes = [] - for v in values: - assert isinstance(v, tuple), "Flattened results must be a tuple" - ret = ret + v - sizes.append(len(v)) - return ret, sizes - - @staticmethod - def _split(values, sizes): - if len(sizes): - expected_len = sum(sizes) - assert ( - len(values) == expected_len - ), f"Values has length {len(values)} but expect length {expected_len}." - ret = [] - for k in range(len(sizes)): - begin, end = sum(sizes[:k]), sum(sizes[: k + 1]) - ret.append(values[begin:end]) - return ret - - -@dataclass -class ListSchema(Schema): - schemas: List[Schema] # the schemas that define how to flatten each element in the list - sizes: List[int] # the flattened length of each element - - def __call__(self, values): - values = self._split(values, self.sizes) - if len(values) != len(self.schemas): - raise ValueError( - f"Values has length {len(values)} but schemas " f"has length {len(self.schemas)}!" 
- ) - values = [m(v) for m, v in zip(self.schemas, values)] - return list(values) - - @classmethod - def flatten(cls, obj): - res = [flatten_to_tuple(k) for k in obj] - values, sizes = cls._concat([k[0] for k in res]) - return values, cls([k[1] for k in res], sizes) - - -@dataclass -class TupleSchema(ListSchema): - def __call__(self, values): - return tuple(super().__call__(values)) - - -@dataclass -class IdentitySchema(Schema): - def __call__(self, values): - return values[0] - - @classmethod - def flatten(cls, obj): - return (obj,), cls() - - -@dataclass -class DictSchema(ListSchema): - keys: List[str] - - def __call__(self, values): - values = super().__call__(values) - return dict(zip(self.keys, values)) - - @classmethod - def flatten(cls, obj): - for k in obj.keys(): - if not isinstance(k, str): - raise KeyError("Only support flattening dictionaries if keys are str.") - keys = sorted(obj.keys()) - values = [obj[k] for k in keys] - ret, schema = ListSchema.flatten(values) - return ret, cls(schema.schemas, schema.sizes, keys) - - -@dataclass -class InstancesSchema(DictSchema): - def __call__(self, values): - image_size, fields = values[-1], values[:-1] - fields = super().__call__(fields) - return Instances(image_size, **fields) - - @classmethod - def flatten(cls, obj): - ret, schema = super().flatten(obj.get_fields()) - size = obj.image_size - if not isinstance(size, torch.Tensor): - size = torch.tensor(size) - return ret + (size,), schema - - -@dataclass -class TensorWrapSchema(Schema): - """ - For classes that are simple wrapper of tensors, e.g. - Boxes, RotatedBoxes, BitMasks - """ - - class_name: str - - def __call__(self, values): - return locate(self.class_name)(values[0]) - - @classmethod - def flatten(cls, obj): - return (obj.tensor,), cls(_convert_target_to_string(type(obj))) - - -# if more custom structures needed in the future, can allow -# passing in extra schemas for custom types -def flatten_to_tuple(obj): - """ - Flatten an object so it can be used for PyTorch tracing. - Also returns how to rebuild the original object from the flattened outputs. - - Returns: - res (tuple): the flattened results that can be used as tracing outputs - schema: an object with a ``__call__`` method such that ``schema(res) == obj``. - It is a pure dataclass that can be serialized. - """ - schemas = [ - ((str, bytes), IdentitySchema), - (list, ListSchema), - (tuple, TupleSchema), - (collections.abc.Mapping, DictSchema), - (Instances, InstancesSchema), - ((Boxes, ROIMasks), TensorWrapSchema), - ] - for klass, schema in schemas: - if isinstance(obj, klass): - F = schema - break - else: - F = IdentitySchema - - return F.flatten(obj) - - -class TracingAdapter(nn.Module): - """ - A model may take rich input/output format (e.g. dict or custom classes), - but `torch.jit.trace` requires tuple of tensors as input/output. - This adapter flattens input/output format of a model so it becomes traceable. - - It also records the necessary schema to rebuild model's inputs/outputs from flattened - inputs/outputs. 
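The essence of the Schema machinery above: flatten a nested structure into a tuple of tensors while keeping just enough bookkeeping to invert the flattening. A compact torch-only sketch of that round trip (toy code; the real implementation also handles Instances, Boxes, etc. and uses serializable dataclass schemas instead of closures):

```
# Toy flatten/rebuild for dict/list/tensor, mirroring flatten_to_tuple's
# contract: schema(flat) reconstructs the original object.
import torch

def flatten(obj):
    if isinstance(obj, torch.Tensor):
        return (obj,), lambda vals: vals[0]
    if isinstance(obj, dict):
        keys = sorted(obj)
        flat, schema = flatten([obj[k] for k in keys])
        return flat, lambda vals, s=schema, ks=keys: dict(zip(ks, s(vals)))
    if isinstance(obj, (list, tuple)):
        parts = [flatten(x) for x in obj]
        sizes = [len(f) for f, _ in parts]
        flat = tuple(v for f, _ in parts for v in f)
        def rebuild(vals, parts=parts, sizes=sizes):
            out, i = [], 0
            for (_, schema), n in zip(parts, sizes):
                out.append(schema(vals[i : i + n]))
                i += n
            return out  # note: tuples come back as lists in this toy
        return flat, rebuild
    raise TypeError(type(obj))

inputs = {"image": torch.rand(3, 4), "boxes": [torch.rand(2, 4), torch.rand(1, 4)]}
flat, schema = flatten(inputs)          # flat is a tuple of 3 tensors
rebuilt = schema(flat)                  # same nested structure again
assert torch.equal(rebuilt["image"], inputs["image"])
```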
- - Example: - :: - outputs = model(inputs) # inputs/outputs may be rich structure - adapter = TracingAdapter(model, inputs) - - # can now trace the model, with adapter.flattened_inputs, or another - # tuple of tensors with the same length and meaning - traced = torch.jit.trace(adapter, adapter.flattened_inputs) - - # traced model can only produce flattened outputs (tuple of tensors) - flattened_outputs = traced(*adapter.flattened_inputs) - # adapter knows the schema to convert it back (new_outputs == outputs) - new_outputs = adapter.outputs_schema(flattened_outputs) - """ - - flattened_inputs: Tuple[torch.Tensor] = None - """ - Flattened version of inputs given to this class's constructor. - """ - - inputs_schema: Schema = None - """ - Schema of the inputs given to this class's constructor. - """ - - outputs_schema: Schema = None - """ - Schema of the output produced by calling the given model with inputs. - """ - - def __init__( - self, - model: nn.Module, - inputs, - inference_func: Optional[Callable] = None, - allow_non_tensor: bool = False, - ): - """ - Args: - model: an nn.Module - inputs: An input argument or a tuple of input arguments used to call model. - After flattening, it has to only consist of tensors. - inference_func: a callable that takes (model, *inputs), calls the - model with inputs, and return outputs. By default it - is ``lambda model, *inputs: model(*inputs)``. Can be override - if you need to call the model differently. - allow_non_tensor: allow inputs/outputs to contain non-tensor objects. - This option will filter out non-tensor objects to make the - model traceable, but ``inputs_schema``/``outputs_schema`` cannot be - used anymore because inputs/outputs cannot be rebuilt from pure tensors. - This is useful when you're only interested in the single trace of - execution (e.g. for flop count), but not interested in - generalizing the traced graph to new inputs. - """ - super().__init__() - if isinstance(model, (nn.parallel.distributed.DistributedDataParallel, nn.DataParallel)): - model = model.module - self.model = model - if not isinstance(inputs, tuple): - inputs = (inputs,) - self.inputs = inputs - self.allow_non_tensor = allow_non_tensor - - if inference_func is None: - inference_func = lambda model, *inputs: model(*inputs) # noqa - self.inference_func = inference_func - - self.flattened_inputs, self.inputs_schema = flatten_to_tuple(inputs) - - if all(isinstance(x, torch.Tensor) for x in self.flattened_inputs): - return - if self.allow_non_tensor: - self.flattened_inputs = tuple( - [x for x in self.flattened_inputs if isinstance(x, torch.Tensor)] - ) - self.inputs_schema = None - else: - for input in self.flattened_inputs: - if not isinstance(input, torch.Tensor): - raise ValueError( - "Inputs for tracing must only contain tensors. " - f"Got a {type(input)} instead." - ) - - def forward(self, *args: torch.Tensor): - with torch.no_grad(), patch_builtin_len(): - if self.inputs_schema is not None: - inputs_orig_format = self.inputs_schema(args) - else: - if len(args) != len(self.flattened_inputs) or any( - x is not y for x, y in zip(args, self.flattened_inputs) - ): - raise ValueError( - "TracingAdapter does not contain valid inputs_schema." - " So it cannot generalize to other inputs and must be" - " traced with `.flattened_inputs`." 
- ) - inputs_orig_format = self.inputs - - outputs = self.inference_func(self.model, *inputs_orig_format) - flattened_outputs, schema = flatten_to_tuple(outputs) - - flattened_output_tensors = tuple( - [x for x in flattened_outputs if isinstance(x, torch.Tensor)] - ) - if len(flattened_output_tensors) < len(flattened_outputs): - if self.allow_non_tensor: - flattened_outputs = flattened_output_tensors - self.outputs_schema = None - else: - raise ValueError( - "Model cannot be traced because some model outputs " - "cannot flatten to tensors." - ) - else: # schema is valid - if self.outputs_schema is None: - self.outputs_schema = schema - else: - assert self.outputs_schema == schema, ( - "Model should always return outputs with the same " - "structure so it can be traced!" - ) - return flattened_outputs - - def _create_wrapper(self, traced_model): - """ - Return a function that has an input/output interface the same as the - original model, but it calls the given traced model under the hood. - """ - - def forward(*args): - flattened_inputs, _ = flatten_to_tuple(args) - flattened_outputs = traced_model(*flattened_inputs) - return self.outputs_schema(flattened_outputs) - - return forward diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py deleted file mode 100755 index 2d0f7bf3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/shared.py +++ /dev/null @@ -1,1034 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import collections -import contextlib -import copy -import functools -import logging -import numpy as np -import os -from typing import Any, Callable, Dict, List, Optional, Tuple, Union -from unittest import mock -import caffe2.python.utils as putils -import torch -import torch.nn.functional as F -from caffe2.proto import caffe2_pb2 -from caffe2.python import core, net_drawer, workspace -from torch.nn.functional import interpolate as interp - -logger = logging.getLogger(__name__) - - -# ==== torch/utils_toffee/cast.py ======================================= - - -def to_device(t, device_str): - """ - This function is a replacement of .to(another_device) such that it allows the - casting to be traced properly by explicitly calling the underlying copy ops. - It also avoids introducing unncessary op when casting to the same device. 
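Behaviorally, `to_device` is a `.to()` replacement that either returns the tensor untouched or emits exactly one explicit, named copy op, which is what makes the cast visible to the tracer. A short sketch of the dispatch (assumes the `to_device` helper defined below; the `torch.ops._caffe2.*` copy ops exist only with a caffe2 build, so those calls are left as comments):

```
import torch

x_cpu = torch.rand(2, 2)
assert to_device(x_cpu, "cpu") is x_cpu   # same device: returns the tensor as-is
# to_device(x_cpu, "cuda")  ->  torch.ops._caffe2.CopyCPUToGPU(x_cpu)
# to_device(x_gpu, "cpu")   ->  torch.ops._caffe2.CopyGPUToCPU(x_gpu)
# any other cast (e.g. between two GPUs) raises RuntimeError
```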
- """ - src = t.device - dst = torch.device(device_str) - - if src == dst: - return t - elif src.type == "cuda" and dst.type == "cpu": - return torch.ops._caffe2.CopyGPUToCPU(t) - elif src.type == "cpu" and dst.type == "cuda": - return torch.ops._caffe2.CopyCPUToGPU(t) - else: - raise RuntimeError("Can't cast tensor from device {} to device {}".format(src, dst)) - - -# ==== torch/utils_toffee/interpolate.py ======================================= - - -# Note: borrowed from vision/detection/fair/detectron/detectron/modeling/detector.py -def BilinearInterpolation(tensor_in, up_scale): - assert up_scale % 2 == 0, "Scale should be even" - - def upsample_filt(size): - factor = (size + 1) // 2 - if size % 2 == 1: - center = factor - 1 - else: - center = factor - 0.5 - - og = np.ogrid[:size, :size] - return (1 - abs(og[0] - center) / factor) * (1 - abs(og[1] - center) / factor) - - kernel_size = int(up_scale) * 2 - bil_filt = upsample_filt(kernel_size) - - dim = int(tensor_in.shape[1]) - kernel = np.zeros((dim, dim, kernel_size, kernel_size), dtype=np.float32) - kernel[range(dim), range(dim), :, :] = bil_filt - - tensor_out = F.conv_transpose2d( - tensor_in, - weight=to_device(torch.Tensor(kernel), tensor_in.device), - bias=None, - stride=int(up_scale), - padding=int(up_scale / 2), - ) - - return tensor_out - - -# NOTE: ONNX is incompatible with traced torch.nn.functional.interpolate if -# using dynamic `scale_factor` rather than static `size`. (T43166860) -# NOTE: Caffe2 Int8 conversion might not be able to quantize `size` properly. -def onnx_compatibale_interpolate( - input, size=None, scale_factor=None, mode="nearest", align_corners=None -): - # NOTE: The input dimensions are interpreted in the form: - # `mini-batch x channels x [optional depth] x [optional height] x width`. - if size is None and scale_factor is not None: - if input.dim() == 4: - if isinstance(scale_factor, (int, float)): - height_scale, width_scale = (scale_factor, scale_factor) - else: - assert isinstance(scale_factor, (tuple, list)) - assert len(scale_factor) == 2 - height_scale, width_scale = scale_factor - - assert not align_corners, "No matching C2 op for align_corners == True" - if mode == "nearest": - return torch.ops._caffe2.ResizeNearest( - input, order="NCHW", width_scale=width_scale, height_scale=height_scale - ) - elif mode == "bilinear": - logger.warning( - "Use F.conv_transpose2d for bilinear interpolate" - " because there's no such C2 op, this may cause significant" - " slowdown and the boundary pixels won't be as same as" - " using F.interpolate due to padding." 
- ) - assert height_scale == width_scale - return BilinearInterpolation(input, up_scale=height_scale) - logger.warning("Output size is not static, it might cause ONNX conversion issue") - - return interp(input, size, scale_factor, mode, align_corners) - - -@contextlib.contextmanager -def mock_torch_nn_functional_interpolate(): - if torch.onnx.is_in_onnx_export(): - with mock.patch( - "torch.nn.functional.interpolate", side_effect=onnx_compatibale_interpolate - ): - yield - else: - yield - - -# ==== torch/utils_caffe2/ws_utils.py ========================================== - - -class ScopedWS(object): - def __init__(self, ws_name, is_reset, is_cleanup=False): - self.ws_name = ws_name - self.is_reset = is_reset - self.is_cleanup = is_cleanup - self.org_ws = "" - - def __enter__(self): - self.org_ws = workspace.CurrentWorkspace() - if self.ws_name is not None: - workspace.SwitchWorkspace(self.ws_name, True) - if self.is_reset: - workspace.ResetWorkspace() - - return workspace - - def __exit__(self, *args): - if self.is_cleanup: - workspace.ResetWorkspace() - if self.ws_name is not None: - workspace.SwitchWorkspace(self.org_ws) - - -def fetch_any_blob(name): - bb = None - try: - bb = workspace.FetchBlob(name) - except TypeError: - bb = workspace.FetchInt8Blob(name) - except Exception as e: - logger.error("Get blob {} error: {}".format(name, e)) - - return bb - - -# ==== torch/utils_caffe2/protobuf.py ========================================== - - -def get_pb_arg(pb, arg_name): - for x in pb.arg: - if x.name == arg_name: - return x - return None - - -def get_pb_arg_valf(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.f if arg is not None else default_val - - -def get_pb_arg_floats(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(map(float, arg.floats)) if arg is not None else default_val - - -def get_pb_arg_ints(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(map(int, arg.ints)) if arg is not None else default_val - - -def get_pb_arg_vali(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.i if arg is not None else default_val - - -def get_pb_arg_vals(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return arg.s if arg is not None else default_val - - -def get_pb_arg_valstrings(pb, arg_name, default_val): - arg = get_pb_arg(pb, arg_name) - return list(arg.strings) if arg is not None else default_val - - -def check_set_pb_arg(pb, arg_name, arg_attr, arg_value, allow_override=False): - arg = get_pb_arg(pb, arg_name) - if arg is None: - arg = putils.MakeArgument(arg_name, arg_value) - assert hasattr(arg, arg_attr) - pb.arg.extend([arg]) - if allow_override and getattr(arg, arg_attr) != arg_value: - logger.warning( - "Override argument {}: {} -> {}".format(arg_name, getattr(arg, arg_attr), arg_value) - ) - setattr(arg, arg_attr, arg_value) - else: - assert arg is not None - assert getattr(arg, arg_attr) == arg_value, "Existing value {}, new value {}".format( - getattr(arg, arg_attr), arg_value - ) - - -def _create_const_fill_op_from_numpy(name, tensor, device_option=None): - assert type(tensor) == np.ndarray - kTypeNameMapper = { - np.dtype("float32"): "GivenTensorFill", - np.dtype("int32"): "GivenTensorIntFill", - np.dtype("int64"): "GivenTensorInt64Fill", - np.dtype("uint8"): "GivenTensorStringFill", - } - - args_dict = {} - if tensor.dtype == np.dtype("uint8"): - args_dict.update({"values": [str(tensor.data)], "shape": [1]}) - else: - args_dict.update({"values": tensor, "shape": 
tensor.shape}) - - if device_option is not None: - args_dict["device_option"] = device_option - - return core.CreateOperator(kTypeNameMapper[tensor.dtype], [], [name], **args_dict) - - -def _create_const_fill_op_from_c2_int8_tensor(name, int8_tensor): - assert type(int8_tensor) == workspace.Int8Tensor - kTypeNameMapper = { - np.dtype("int32"): "Int8GivenIntTensorFill", - np.dtype("uint8"): "Int8GivenTensorFill", - } - - tensor = int8_tensor.data - assert tensor.dtype in [np.dtype("uint8"), np.dtype("int32")] - values = tensor.tobytes() if tensor.dtype == np.dtype("uint8") else tensor - - return core.CreateOperator( - kTypeNameMapper[tensor.dtype], - [], - [name], - values=values, - shape=tensor.shape, - Y_scale=int8_tensor.scale, - Y_zero_point=int8_tensor.zero_point, - ) - - -def create_const_fill_op( - name: str, - blob: Union[np.ndarray, workspace.Int8Tensor], - device_option: Optional[caffe2_pb2.DeviceOption] = None, -) -> caffe2_pb2.OperatorDef: - """ - Given a blob object, return the Caffe2 operator that creates this blob - as constant. Currently support NumPy tensor and Caffe2 Int8Tensor. - """ - - tensor_type = type(blob) - assert tensor_type in [ - np.ndarray, - workspace.Int8Tensor, - ], 'Error when creating const fill op for "{}", unsupported blob type: {}'.format( - name, type(blob) - ) - - if tensor_type == np.ndarray: - return _create_const_fill_op_from_numpy(name, blob, device_option) - elif tensor_type == workspace.Int8Tensor: - assert device_option is None - return _create_const_fill_op_from_c2_int8_tensor(name, blob) - - -def construct_init_net_from_params( - params: Dict[str, Any], device_options: Optional[Dict[str, caffe2_pb2.DeviceOption]] = None -) -> caffe2_pb2.NetDef: - """ - Construct the init_net from params dictionary - """ - init_net = caffe2_pb2.NetDef() - device_options = device_options or {} - for name, blob in params.items(): - if isinstance(blob, str): - logger.warning( - ( - "Blob {} with type {} is not supported in generating init net," - " skipped.".format(name, type(blob)) - ) - ) - continue - init_net.op.extend( - [create_const_fill_op(name, blob, device_option=device_options.get(name, None))] - ) - init_net.external_output.append(name) - return init_net - - -def get_producer_map(ssa): - """ - Return dict from versioned blob to (i, j), - where i is index of producer op, j is the index of output of that op. - """ - producer_map = {} - for i in range(len(ssa)): - outputs = ssa[i][1] - for j, outp in enumerate(outputs): - producer_map[outp] = (i, j) - return producer_map - - -def get_consumer_map(ssa): - """ - Return dict from versioned blob to list of (i, j), - where i is index of consumer op, j is the index of input of that op. - """ - consumer_map = collections.defaultdict(list) - for i in range(len(ssa)): - inputs = ssa[i][0] - for j, inp in enumerate(inputs): - consumer_map[inp].append((i, j)) - return consumer_map - - -def get_params_from_init_net( - init_net: caffe2_pb2.NetDef, -) -> [Dict[str, Any], Dict[str, caffe2_pb2.DeviceOption]]: - """ - Take the output blobs from init_net by running it. - Outputs: - params: dict from blob name to numpy array - device_options: dict from blob name to the device option of its creating op - """ - # NOTE: this assumes that the params is determined by producer op with the - # only exception be CopyGPUToCPU which is CUDA op but returns CPU tensor. 
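`get_producer_map` and `get_consumer_map` below are small but load-bearing: most of the graph transforms in this file navigate the net through them. Their behavior on a hand-written SSA (pure Python, runnable as-is):

```
import collections

# SSA here is a list of (inputs, outputs) per op, each blob as (name, version).
ssa = [
    ([("x", 0)],           [("y", 0)]),   # op 0
    ([("y", 0)],           [("y", 1)]),   # op 1: in-place update of y
    ([("y", 1), ("x", 0)], [("z", 0)]),   # op 2
]

producer = {}
for i, (_, outs) in enumerate(ssa):
    for j, outp in enumerate(outs):
        producer[outp] = (i, j)

consumer = collections.defaultdict(list)
for i, (ins, _) in enumerate(ssa):
    for j, inp in enumerate(ins):
        consumer[inp].append((i, j))

print(producer[("y", 1)])   # (1, 0): op 1 produced version 1 of y
print(consumer[("x", 0)])   # [(0, 0), (2, 1)]: x is read twice
```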
- def _get_device_option(producer_op): - if producer_op.type == "CopyGPUToCPU": - return caffe2_pb2.DeviceOption() - else: - return producer_op.device_option - - with ScopedWS("__get_params_from_init_net__", is_reset=True, is_cleanup=True) as ws: - ws.RunNetOnce(init_net) - params = {b: fetch_any_blob(b) for b in init_net.external_output} - ssa, versions = core.get_ssa(init_net) - producer_map = get_producer_map(ssa) - device_options = { - b: _get_device_option(init_net.op[producer_map[(b, versions[b])][0]]) - for b in init_net.external_output - } - return params, device_options - - -def _updater_raise(op, input_types, output_types): - raise RuntimeError( - "Failed to apply updater for op {} given input_types {} and" - " output_types {}".format(op, input_types, output_types) - ) - - -def _generic_status_identifier( - predict_net: caffe2_pb2.NetDef, - status_updater: Callable, - known_status: Dict[Tuple[str, int], Any], -) -> Dict[Tuple[str, int], Any]: - """ - Statically infer the status of each blob, the status can be such as device type - (CPU/GPU), layout (NCHW/NHWC), data type (float32/int8), etc. "Blob" here - is versioned blob (Tuple[str, int]) in the format compatible with ssa. - Inputs: - predict_net: the caffe2 network - status_updater: a callable, given an op and the status of its input/output, - it returns the updated status of input/output. `None` is used for - representing unknown status. - known_status: a dict containing known status, used as initialization. - Outputs: - A dict mapping from versioned blob to its status - """ - ssa, versions = core.get_ssa(predict_net) - versioned_ext_input = [(b, 0) for b in predict_net.external_input] - versioned_ext_output = [(b, versions[b]) for b in predict_net.external_output] - all_versioned_blobs = set().union(*[set(x[0] + x[1]) for x in ssa]) - - allowed_vbs = all_versioned_blobs.union(versioned_ext_input).union(versioned_ext_output) - assert all(k in allowed_vbs for k in known_status) - assert all(v is not None for v in known_status.values()) - _known_status = copy.deepcopy(known_status) - - def _check_and_update(key, value): - assert value is not None - if key in _known_status: - if not _known_status[key] == value: - raise RuntimeError( - "Confilict status for {}, existing status {}, new status {}".format( - key, _known_status[key], value - ) - ) - _known_status[key] = value - - def _update_i(op, ssa_i): - versioned_inputs = ssa_i[0] - versioned_outputs = ssa_i[1] - - inputs_status = [_known_status.get(b, None) for b in versioned_inputs] - outputs_status = [_known_status.get(b, None) for b in versioned_outputs] - - new_inputs_status, new_outputs_status = status_updater(op, inputs_status, outputs_status) - - for versioned_blob, status in zip( - versioned_inputs + versioned_outputs, new_inputs_status + new_outputs_status - ): - if status is not None: - _check_and_update(versioned_blob, status) - - for op, ssa_i in zip(predict_net.op, ssa): - _update_i(op, ssa_i) - for op, ssa_i in zip(reversed(predict_net.op), reversed(ssa)): - _update_i(op, ssa_i) - - # NOTE: This strictly checks all the blob from predict_net must be assgined - # a known status. However sometimes it's impossible (eg. having deadend op), - # we may relax this constraint if - for k in all_versioned_blobs: - if k not in _known_status: - raise NotImplementedError( - "Can not infer the status for {}. 
Currently only support the case where" - " a single forward and backward pass can identify status for all blobs.".format(k) - ) - - return _known_status - - -def infer_device_type( - predict_net: caffe2_pb2.NetDef, - known_status: Dict[Tuple[str, int], Any], - device_name_style: str = "caffe2", -) -> Dict[Tuple[str, int], str]: - """Return the device type ("cpu" or "gpu"/"cuda") of each (versioned) blob""" - - assert device_name_style in ["caffe2", "pytorch"] - _CPU_STR = "cpu" - _GPU_STR = "gpu" if device_name_style == "caffe2" else "cuda" - - def _copy_cpu_to_gpu_updater(op, input_types, output_types): - if input_types[0] == _GPU_STR or output_types[0] == _CPU_STR: - _updater_raise(op, input_types, output_types) - return ([_CPU_STR], [_GPU_STR]) - - def _copy_gpu_to_cpu_updater(op, input_types, output_types): - if input_types[0] == _CPU_STR or output_types[0] == _GPU_STR: - _updater_raise(op, input_types, output_types) - return ([_GPU_STR], [_CPU_STR]) - - def _other_ops_updater(op, input_types, output_types): - non_none_types = [x for x in input_types + output_types if x is not None] - if len(non_none_types) > 0: - the_type = non_none_types[0] - if not all(x == the_type for x in non_none_types): - _updater_raise(op, input_types, output_types) - else: - the_type = None - return ([the_type for _ in op.input], [the_type for _ in op.output]) - - def _device_updater(op, *args, **kwargs): - return { - "CopyCPUToGPU": _copy_cpu_to_gpu_updater, - "CopyGPUToCPU": _copy_gpu_to_cpu_updater, - }.get(op.type, _other_ops_updater)(op, *args, **kwargs) - - return _generic_status_identifier(predict_net, _device_updater, known_status) - - -# ==== torch/utils_caffe2/vis.py =============================================== - - -def _modify_blob_names(ops, blob_rename_f): - ret = [] - - def _replace_list(blob_list, replaced_list): - del blob_list[:] - blob_list.extend(replaced_list) - - for x in ops: - cur = copy.deepcopy(x) - _replace_list(cur.input, list(map(blob_rename_f, cur.input))) - _replace_list(cur.output, list(map(blob_rename_f, cur.output))) - ret.append(cur) - - return ret - - -def _rename_blob(name, blob_sizes, blob_ranges): - def _list_to_str(bsize): - ret = ", ".join([str(x) for x in bsize]) - ret = "[" + ret + "]" - return ret - - ret = name - if blob_sizes is not None and name in blob_sizes: - ret += "\n" + _list_to_str(blob_sizes[name]) - if blob_ranges is not None and name in blob_ranges: - ret += "\n" + _list_to_str(blob_ranges[name]) - - return ret - - -# graph_name could not contain word 'graph' -def save_graph(net, file_name, graph_name="net", op_only=True, blob_sizes=None, blob_ranges=None): - blob_rename_f = functools.partial(_rename_blob, blob_sizes=blob_sizes, blob_ranges=blob_ranges) - return save_graph_base(net, file_name, graph_name, op_only, blob_rename_f) - - -def save_graph_base(net, file_name, graph_name="net", op_only=True, blob_rename_func=None): - graph = None - ops = net.op - if blob_rename_func is not None: - ops = _modify_blob_names(ops, blob_rename_func) - if not op_only: - graph = net_drawer.GetPydotGraph(ops, graph_name, rankdir="TB") - else: - graph = net_drawer.GetPydotGraphMinimal( - ops, graph_name, rankdir="TB", minimal_dependency=True - ) - - try: - par_dir = os.path.dirname(file_name) - if not os.path.exists(par_dir): - os.makedirs(par_dir) - - format = os.path.splitext(os.path.basename(file_name))[-1] - if format == ".png": - graph.write_png(file_name) - elif format == ".pdf": - graph.write_pdf(file_name) - elif format == ".svg": - graph.write_svg(file_name) 
- else: - print("Incorrect format {}".format(format)) - except Exception as e: - print("Error when writing graph to image {}".format(e)) - - return graph - - -# ==== torch/utils_toffee/aten_to_caffe2.py ==================================== - - -def group_norm_replace_aten_with_caffe2(predict_net: caffe2_pb2.NetDef): - """ - For ONNX exported model, GroupNorm will be represented as ATen op, - this can be a drop in replacement from ATen to GroupNorm - """ - count = 0 - for op in predict_net.op: - if op.type == "ATen": - op_name = get_pb_arg_vals(op, "operator", None) # return byte in py3 - if op_name and op_name.decode() == "group_norm": - op.arg.remove(get_pb_arg(op, "operator")) - - if get_pb_arg_vali(op, "cudnn_enabled", None): - op.arg.remove(get_pb_arg(op, "cudnn_enabled")) - - num_groups = get_pb_arg_vali(op, "num_groups", None) - if num_groups is not None: - op.arg.remove(get_pb_arg(op, "num_groups")) - check_set_pb_arg(op, "group", "i", num_groups) - - op.type = "GroupNorm" - count += 1 - if count > 1: - logger.info("Replaced {} ATen operator to GroupNormOp".format(count)) - - -# ==== torch/utils_toffee/alias.py ============================================= - - -def alias(x, name, is_backward=False): - if not torch.onnx.is_in_onnx_export(): - return x - assert isinstance(x, torch.Tensor) - return torch.ops._caffe2.AliasWithName(x, name, is_backward=is_backward) - - -def fuse_alias_placeholder(predict_net, init_net): - """Remove AliasWithName placeholder and rename the input/output of it""" - # First we finish all the re-naming - for i, op in enumerate(predict_net.op): - if op.type == "AliasWithName": - assert len(op.input) == 1 - assert len(op.output) == 1 - name = get_pb_arg_vals(op, "name", None).decode() - is_backward = bool(get_pb_arg_vali(op, "is_backward", 0)) - rename_op_input(predict_net, init_net, i, 0, name, from_producer=is_backward) - rename_op_output(predict_net, i, 0, name) - - # Remove AliasWithName, should be very safe since it's a non-op - new_ops = [] - for op in predict_net.op: - if op.type != "AliasWithName": - new_ops.append(op) - else: - # safety check - assert op.input == op.output - assert op.input[0] == op.arg[0].s.decode() - del predict_net.op[:] - predict_net.op.extend(new_ops) - - -# ==== torch/utils_caffe2/graph_transform.py =================================== - - -class IllegalGraphTransformError(ValueError): - """When a graph transform function call can't be executed.""" - - -def _rename_versioned_blob_in_proto( - proto: caffe2_pb2.NetDef, - old_name: str, - new_name: str, - version: int, - ssa: List[Tuple[List[Tuple[str, int]], List[Tuple[str, int]]]], - start_versions: Dict[str, int], - end_versions: Dict[str, int], -): - """In given proto, rename all blobs with matched version""" - # Operater list - for op, i_th_ssa in zip(proto.op, ssa): - versioned_inputs, versioned_outputs = i_th_ssa - for i in range(len(op.input)): - if versioned_inputs[i] == (old_name, version): - op.input[i] = new_name - for i in range(len(op.output)): - if versioned_outputs[i] == (old_name, version): - op.output[i] = new_name - # external_input - if start_versions.get(old_name, 0) == version: - for i in range(len(proto.external_input)): - if proto.external_input[i] == old_name: - proto.external_input[i] = new_name - # external_output - if end_versions.get(old_name, 0) == version: - for i in range(len(proto.external_output)): - if proto.external_output[i] == old_name: - proto.external_output[i] = new_name - - -def rename_op_input( - predict_net: caffe2_pb2.NetDef, - init_net: 
caffe2_pb2.NetDef, - op_id: int, - input_id: int, - new_name: str, - from_producer: bool = False, -): - """ - Rename the op_id-th operator in predict_net, change it's input_id-th input's - name to the new_name. It also does automatic re-route and change - external_input and init_net if necessary. - - It requires the input is only consumed by this op. - - This function modifies predict_net and init_net in-place. - - When from_producer is enable, this also updates other operators that consumes - the same input. Be cautious because may trigger unintended behavior. - """ - assert isinstance(predict_net, caffe2_pb2.NetDef) - assert isinstance(init_net, caffe2_pb2.NetDef) - - init_net_ssa, init_net_versions = core.get_ssa(init_net) - predict_net_ssa, predict_net_versions = core.get_ssa( - predict_net, copy.deepcopy(init_net_versions) - ) - - versioned_inputs, versioned_outputs = predict_net_ssa[op_id] - old_name, version = versioned_inputs[input_id] - - if from_producer: - producer_map = get_producer_map(predict_net_ssa) - if not (old_name, version) in producer_map: - raise NotImplementedError( - "Can't find producer, the input {} is probably from" - " init_net, this is not supported yet.".format(old_name) - ) - producer = producer_map[(old_name, version)] - rename_op_output(predict_net, producer[0], producer[1], new_name) - return - - def contain_targets(op_ssa): - return (old_name, version) in op_ssa[0] - - is_consumer = [contain_targets(op_ssa) for op_ssa in predict_net_ssa] - if sum(is_consumer) > 1: - raise IllegalGraphTransformError( - ( - "Input '{}' of operator(#{}) are consumed by other ops, please use" - + " rename_op_output on the producer instead. Offending op: \n{}" - ).format(old_name, op_id, predict_net.op[op_id]) - ) - - # update init_net - _rename_versioned_blob_in_proto( - init_net, old_name, new_name, version, init_net_ssa, {}, init_net_versions - ) - # update predict_net - _rename_versioned_blob_in_proto( - predict_net, - old_name, - new_name, - version, - predict_net_ssa, - init_net_versions, - predict_net_versions, - ) - - -def rename_op_output(predict_net: caffe2_pb2.NetDef, op_id: int, output_id: int, new_name: str): - """ - Rename the op_id-th operator in predict_net, change it's output_id-th input's - name to the new_name. It also does automatic re-route and change - external_output and if necessary. - - It allows multiple consumers of its output. - - This function modifies predict_net in-place, doesn't need init_net. - """ - assert isinstance(predict_net, caffe2_pb2.NetDef) - - ssa, blob_versions = core.get_ssa(predict_net) - - versioned_inputs, versioned_outputs = ssa[op_id] - old_name, version = versioned_outputs[output_id] - - # update predict_net - _rename_versioned_blob_in_proto( - predict_net, old_name, new_name, version, ssa, {}, blob_versions - ) - - -def get_sub_graph_external_input_output( - predict_net: caffe2_pb2.NetDef, sub_graph_op_indices: List[int] -) -> Tuple[List[Tuple[str, int]], List[Tuple[str, int]]]: - """ - Return the list of external input/output of sub-graph, - each element is tuple of the name and corresponding version in predict_net. - - external input/output is defined the same way as caffe2 NetDef. 
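That rule can be checked by hand on a tiny SSA: blobs read by the sub-graph but not produced inside it are external inputs; blobs produced inside and consumed outside are external outputs. A toy version (which, for brevity, ignores the `predict_net.external_output` term):

```
# Toy check of get_sub_graph_external_input_output's definition.
ssa = [
    ([("data", 0)], [("a", 0)]),    # op 0
    ([("a", 0)],    [("b", 0)]),    # op 1  } sub-graph
    ([("b", 0)],    [("c", 0)]),    # op 2  }
    ([("c", 0)],    [("out", 0)]),  # op 3
]
sub = [1, 2]

ins, outs = [], []
for i in sub:
    ins += [b for b in ssa[i][0] if b not in ins]
    outs += list(ssa[i][1])

ext_inputs = [b for b in ins if b not in outs]
other_reads = [b for i, (inp, _) in enumerate(ssa) if i not in sub for b in inp]
ext_outputs = [b for b in outs if b in set(other_reads)]
print(ext_inputs, ext_outputs)   # [('a', 0)] [('c', 0)]
```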
- """ - ssa, versions = core.get_ssa(predict_net) - - all_inputs = [] - all_outputs = [] - for op_id in sub_graph_op_indices: - all_inputs += [inp for inp in ssa[op_id][0] if inp not in all_inputs] - all_outputs += list(ssa[op_id][1]) # ssa output won't repeat - - # for versioned blobs, external inputs are just those blob in all_inputs - # but not in all_outputs - ext_inputs = [inp for inp in all_inputs if inp not in all_outputs] - - # external outputs are essentially outputs of this subgraph that are used - # outside of this sub-graph (including predict_net.external_output) - all_other_inputs = sum( - (ssa[i][0] for i in range(len(ssa)) if i not in sub_graph_op_indices), - [(outp, versions[outp]) for outp in predict_net.external_output], - ) - ext_outputs = [outp for outp in all_outputs if outp in set(all_other_inputs)] - - return ext_inputs, ext_outputs - - -class DiGraph: - """A DAG representation of caffe2 graph, each vertice is a versioned blob.""" - - def __init__(self): - self.vertices = set() - self.graph = collections.defaultdict(list) - - def add_edge(self, u, v): - self.graph[u].append(v) - self.vertices.add(u) - self.vertices.add(v) - - # grab from https://www.geeksforgeeks.org/find-paths-given-source-destination/ - def get_all_paths(self, s, d): - visited = {k: False for k in self.vertices} - path = [] - all_paths = [] - - def _get_all_paths_util(graph, u, d, visited, path): - visited[u] = True - path.append(u) - if u == d: - all_paths.append(copy.deepcopy(path)) - else: - for i in graph[u]: - if not visited[i]: - _get_all_paths_util(graph, i, d, visited, path) - path.pop() - visited[u] = False - - _get_all_paths_util(self.graph, s, d, visited, path) - return all_paths - - @staticmethod - def from_ssa(ssa): - graph = DiGraph() - for op_id in range(len(ssa)): - for inp in ssa[op_id][0]: - for outp in ssa[op_id][1]: - graph.add_edge(inp, outp) - return graph - - -def _get_dependency_chain(ssa, versioned_target, versioned_source): - """ - Return the index list of relevant operator to produce target blob from source blob, - if there's no dependency, return empty list. - """ - - # finding all paths between nodes can be O(N!), thus we can only search - # in the subgraph using the op starting from the first consumer of source blob - # to the producer of the target blob. - consumer_map = get_consumer_map(ssa) - producer_map = get_producer_map(ssa) - start_op = min(x[0] for x in consumer_map[versioned_source]) - 15 - end_op = ( - producer_map[versioned_target][0] + 15 if versioned_target in producer_map else start_op - ) - sub_graph_ssa = ssa[start_op : end_op + 1] - if len(sub_graph_ssa) > 30: - logger.warning( - "Subgraph bebetween {} and {} is large (from op#{} to op#{}), it" - " might take non-trival time to find all paths between them.".format( - versioned_source, versioned_target, start_op, end_op - ) - ) - - dag = DiGraph.from_ssa(sub_graph_ssa) - paths = dag.get_all_paths(versioned_source, versioned_target) # include two ends - ops_in_paths = [[producer_map[blob][0] for blob in path[1:]] for path in paths] - return sorted(set().union(*[set(ops) for ops in ops_in_paths])) - - -def identify_reshape_sub_graph(predict_net: caffe2_pb2.NetDef) -> List[List[int]]: - """ - Idenfity the reshape sub-graph in a protobuf. - The reshape sub-graph is defined as matching the following pattern: - - (input_blob) -> Op_1 -> ... 
-> Op_N -> (new_shape) -─┐ - └-------------------------------------------> Reshape -> (output_blob) - - Return: - List of sub-graphs, each sub-graph is represented as a list of indices - of the relavent ops, [Op_1, Op_2, ..., Op_N, Reshape] - """ - - ssa, _ = core.get_ssa(predict_net) - - ret = [] - for i, op in enumerate(predict_net.op): - if op.type == "Reshape": - assert len(op.input) == 2 - input_ssa = ssa[i][0] - data_source = input_ssa[0] - shape_source = input_ssa[1] - op_indices = _get_dependency_chain(ssa, shape_source, data_source) - ret.append(op_indices + [i]) - return ret - - -def remove_reshape_for_fc(predict_net, params): - """ - In PyTorch nn.Linear has to take 2D tensor, this often leads to reshape - a 4D tensor to 2D by calling .view(). However this (dynamic) reshaping - doesn't work well with ONNX and Int8 tools, and cause using extra - ops (eg. ExpandDims) that might not be available on mobile. - Luckily Caffe2 supports 4D tensor for FC, so we can remove those reshape - after exporting ONNX model. - """ - from caffe2.python import core - - # find all reshape sub-graph that can be removed, which is now all Reshape - # sub-graph whose output is only consumed by FC. - # TODO: to make it safer, we may need the actually value to better determine - # if a Reshape before FC is removable. - reshape_sub_graphs = identify_reshape_sub_graph(predict_net) - sub_graphs_to_remove = [] - for reshape_sub_graph in reshape_sub_graphs: - reshape_op_id = reshape_sub_graph[-1] - assert predict_net.op[reshape_op_id].type == "Reshape" - ssa, _ = core.get_ssa(predict_net) - reshape_output = ssa[reshape_op_id][1][0] - consumers = [i for i in range(len(ssa)) if reshape_output in ssa[i][0]] - if all(predict_net.op[consumer].type == "FC" for consumer in consumers): - # safety check if the sub-graph is isolated, for this reshape sub-graph, - # it means it has one non-param external input and one external output. - ext_inputs, ext_outputs = get_sub_graph_external_input_output( - predict_net, reshape_sub_graph - ) - non_params_ext_inputs = [inp for inp in ext_inputs if inp[1] != 0] - if len(non_params_ext_inputs) == 1 and len(ext_outputs) == 1: - sub_graphs_to_remove.append(reshape_sub_graph) - - # perform removing subgraph by: - # 1: rename the Reshape's output to its input, then the graph can be - # seen as in-place itentify, meaning whose external input/output are the same. - # 2: simply remove those ops. 
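-    #
-    # Illustrative sketch of the transform (blob names are hypothetical):
-    #   before:  X -> [Shape -> ... -> (new_shape)] -> Reshape(X, new_shape) -> Y -> FC
-    #   step 1:  rename the Reshape output Y back to X, so the sub-graph maps X -> X
-    #   step 2:  drop the sub-graph ops, leaving FC to consume the 4D blob X directly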
-    remove_op_ids = []
-    params_to_remove = []
-    for sub_graph in sub_graphs_to_remove:
-        logger.info(
-            "Remove Reshape sub-graph:\n{}".format(
-                "".join(["(#{:>4})\n{}".format(i, predict_net.op[i]) for i in sub_graph])
-            )
-        )
-        reshape_op_id = sub_graph[-1]
-        new_reshape_output = predict_net.op[reshape_op_id].input[0]
-        rename_op_output(predict_net, reshape_op_id, 0, new_reshape_output)
-        ext_inputs, ext_outputs = get_sub_graph_external_input_output(predict_net, sub_graph)
-        non_params_ext_inputs = [inp for inp in ext_inputs if inp[1] != 0]
-        params_ext_inputs = [inp for inp in ext_inputs if inp[1] == 0]
-        assert len(non_params_ext_inputs) == 1 and len(ext_outputs) == 1
-        assert ext_outputs[0][0] == non_params_ext_inputs[0][0]
-        assert ext_outputs[0][1] == non_params_ext_inputs[0][1] + 1
-        remove_op_ids.extend(sub_graph)
-        params_to_remove.extend(params_ext_inputs)
-
-    predict_net = copy.deepcopy(predict_net)
-    new_ops = [op for i, op in enumerate(predict_net.op) if i not in remove_op_ids]
-    del predict_net.op[:]
-    predict_net.op.extend(new_ops)
-    for versioned_params in params_to_remove:
-        name = versioned_params[0]
-        logger.info("Remove params: {} from init_net and predict_net.external_input".format(name))
-        del params[name]
-        predict_net.external_input.remove(name)
-
-    return predict_net, params
-
-
-def fuse_copy_between_cpu_and_gpu(predict_net: caffe2_pb2.NetDef):
-    """
-    In-place fuse extra copy ops between cpu/gpu for the following case:
-        a -CopyAToB-> b -CopyBToA-> c1 -NextOp1-> d1
-                        -CopyBToA-> c2 -NextOp2-> d2
-    The fused network will look like:
-        a -NextOp1-> d1
-          -NextOp2-> d2
-    """
-
-    _COPY_OPS = ["CopyCPUToGPU", "CopyGPUToCPU"]
-
-    def _fuse_once(predict_net):
-        ssa, blob_versions = core.get_ssa(predict_net)
-        consumer_map = get_consumer_map(ssa)
-        versioned_external_output = [
-            (name, blob_versions[name]) for name in predict_net.external_output
-        ]
-
-        for op_id, op in enumerate(predict_net.op):
-            if op.type in _COPY_OPS:
-                fw_copy_versioned_output = ssa[op_id][1][0]
-                consumer_ids = [x[0] for x in consumer_map[fw_copy_versioned_output]]
-                reverse_op_type = _COPY_OPS[1 - _COPY_OPS.index(op.type)]
-
-                is_fusable = (
-                    len(consumer_ids) > 0
-                    and fw_copy_versioned_output not in versioned_external_output
-                    and all(
-                        predict_net.op[_op_id].type == reverse_op_type
-                        and ssa[_op_id][1][0] not in versioned_external_output
-                        for _op_id in consumer_ids
-                    )
-                )
-
-                if is_fusable:
-                    for rv_copy_op_id in consumer_ids:
-                        # make each NextOp use "a" directly, then remove the Copy ops
-                        rs_copy_versioned_output = ssa[rv_copy_op_id][1][0]
-                        next_op_id, inp_id = consumer_map[rs_copy_versioned_output][0]
-                        predict_net.op[next_op_id].input[inp_id] = op.input[0]
-                    # remove CopyOps
-                    new_ops = [
-                        op
-                        for i, op in enumerate(predict_net.op)
-                        if i != op_id and i not in consumer_ids
-                    ]
-                    del predict_net.op[:]
-                    predict_net.op.extend(new_ops)
-                    return True
-
-        return False
-
-    # _fuse_once returns False if nothing can be fused
-    while _fuse_once(predict_net):
-        pass
-
-
-def remove_dead_end_ops(net_def: caffe2_pb2.NetDef):
-    """remove ops whose outputs are not used and not in external_output"""
-    ssa, versions = core.get_ssa(net_def)
-    versioned_external_output = [(name, versions[name]) for name in net_def.external_output]
-    consumer_map = get_consumer_map(ssa)
-    removed_op_ids = set()
-
-    def _is_dead_end(versioned_blob):
-        return not (
-            versioned_blob in versioned_external_output
-            or (
-                len(consumer_map[versioned_blob]) > 0
-                and all(x[0] not in removed_op_ids for x in consumer_map[versioned_blob])
-            )
-        )
-
-    for i, ssa_i in reversed(list(enumerate(ssa))):
-        versioned_outputs = ssa_i[1]
-        if all(_is_dead_end(outp) for outp in versioned_outputs):
-            removed_op_ids.add(i)
-
-    # simply removing those dead-end ops should have no effect on external_output
-    new_ops = [op for i, op in enumerate(net_def.op) if i not in removed_op_ids]
-    del net_def.op[:]
-    net_def.op.extend(new_ops)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py
deleted file mode 100755
index 24fe59bd..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript.py
+++ /dev/null
@@ -1,132 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import os
-import torch
-
-from detectron2.utils.file_io import PathManager
-
-from .torchscript_patch import freeze_training_mode, patch_instances
-
-__all__ = ["scripting_with_instances", "dump_torchscript_IR"]
-
-
-def scripting_with_instances(model, fields):
-    """
-    Run :func:`torch.jit.script` on a model that uses the :class:`Instances` class. Since
-    attributes of :class:`Instances` are "dynamically" added in eager mode, it is difficult
-    for scripting to support it out of the box. This function is made to support scripting
-    a model that uses :class:`Instances`. It does the following:
-
-    1. Create a scriptable ``new_Instances`` class which behaves similarly to ``Instances``,
-       but with all attributes being "static".
-       The attributes need to be statically declared in the ``fields`` argument.
-    2. Register ``new_Instances``, and force the scripting compiler to
-       use it when trying to compile ``Instances``.
-
-    After this function, the process will be reverted. The user should be able to script
-    another model using different fields.
-
-    Example:
-        Assume that ``Instances`` in the model consists of two attributes named
-        ``proposal_boxes`` and ``objectness_logits`` with type :class:`Boxes` and
-        :class:`Tensor` respectively during inference. You can call this function like:
-        ::
-            fields = {"proposal_boxes": Boxes, "objectness_logits": torch.Tensor}
-            torchscript_model = scripting_with_instances(model, fields)
-
-    Note:
-        It only supports models in evaluation mode.
-
-    Args:
-        model (nn.Module): The input model to be exported by scripting.
-        fields (Dict[str, type]): Attribute names and corresponding types that
-            ``Instances`` will use in the model. Note that all attributes used in ``Instances``
-            need to be added, regardless of whether they are inputs/outputs of the model.
-            Data types not defined in detectron2 are not supported for now.
-
-    Returns:
-        torch.jit.ScriptModule: the model in torchscript format
-    """
-    assert (
-        not model.training
-    ), "Currently we only support exporting models in evaluation mode to torchscript"
-
-    with freeze_training_mode(model), patch_instances(fields):
-        scripted_model = torch.jit.script(model)
-        return scripted_model
-
-
-# alias for old name
-export_torchscript_with_instances = scripting_with_instances
-
-
-def dump_torchscript_IR(model, dir):
-    """
-    Dump IR of a TracedModule/ScriptModule/Function in various formats (code, graph,
-    inlined graph). Useful for debugging.
-
-    Args:
-        model (TracedModule/ScriptModule/ScriptFunction): traced or scripted module
-        dir (str): output directory to dump files.
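-
-    Example (an illustrative sketch; the output path is hypothetical)::
-        scripted = torch.jit.script(model)
-        dump_torchscript_IR(scripted, "./ts_dump")
-        # writes model_ts_code.txt, model_ts_IR.txt, model_ts_IR_inlined.txt
-        # and model.txt under the given directory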
- """ - dir = os.path.expanduser(dir) - PathManager.mkdirs(dir) - - def _get_script_mod(mod): - if isinstance(mod, torch.jit.TracedModule): - return mod._actual_script_module - return mod - - # Dump pretty-printed code: https://pytorch.org/docs/stable/jit.html#inspecting-code - with PathManager.open(os.path.join(dir, "model_ts_code.txt"), "w") as f: - - def get_code(mod): - # Try a few ways to get code using private attributes. - try: - # This contains more information than just `mod.code` - return _get_script_mod(mod)._c.code - except AttributeError: - pass - try: - return mod.code - except AttributeError: - return None - - def dump_code(prefix, mod): - code = get_code(mod) - name = prefix or "root model" - if code is None: - f.write(f"Could not found code for {name} (type={mod.original_name})\n") - f.write("\n") - else: - f.write(f"\nCode for {name}, type={mod.original_name}:\n") - f.write(code) - f.write("\n") - f.write("-" * 80) - - for name, m in mod.named_children(): - dump_code(prefix + "." + name, m) - - if isinstance(model, torch.jit.ScriptFunction): - f.write(get_code(model)) - else: - dump_code("", model) - - def _get_graph(model): - try: - # Recursively dump IR of all modules - return _get_script_mod(model)._c.dump_to_str(True, False, False) - except AttributeError: - return model.graph.str() - - with PathManager.open(os.path.join(dir, "model_ts_IR.txt"), "w") as f: - f.write(_get_graph(model)) - - # Dump IR of the entire graph (all submodules inlined) - with PathManager.open(os.path.join(dir, "model_ts_IR_inlined.txt"), "w") as f: - f.write(str(model.inlined_graph)) - - if not isinstance(model, torch.jit.ScriptFunction): - # Dump the model structure in pytorch style - with PathManager.open(os.path.join(dir, "model.txt"), "w") as f: - f.write(str(model)) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py deleted file mode 100755 index da9b324f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/export/torchscript_patch.py +++ /dev/null @@ -1,406 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import os -import sys -import tempfile -from contextlib import ExitStack, contextmanager -from copy import deepcopy -from unittest import mock -import torch -from torch import nn - -# need some explicit imports due to https://github.com/pytorch/pytorch/issues/38964 -import detectron2 # noqa F401 -from detectron2.structures import Boxes, Instances -from detectron2.utils.env import _import_file - -_counter = 0 - - -def _clear_jit_cache(): - from torch.jit._recursive import concrete_type_store - from torch.jit._state import _jit_caching_layer - - concrete_type_store.type_store.clear() # for modules - _jit_caching_layer.clear() # for free functions - - -def _add_instances_conversion_methods(newInstances): - """ - Add from_instances methods to the scripted Instances class. 
- """ - cls_name = newInstances.__name__ - - @torch.jit.unused - def from_instances(instances: Instances): - """ - Create scripted Instances from original Instances - """ - fields = instances.get_fields() - image_size = instances.image_size - ret = newInstances(image_size) - for name, val in fields.items(): - assert hasattr(ret, f"_{name}"), f"No attribute named {name} in {cls_name}" - setattr(ret, name, deepcopy(val)) - return ret - - newInstances.from_instances = from_instances - - -@contextmanager -def patch_instances(fields): - """ - A contextmanager, under which the Instances class in detectron2 is replaced - by a statically-typed scriptable class, defined by `fields`. - See more in `scripting_with_instances`. - """ - - with tempfile.TemporaryDirectory(prefix="detectron2") as dir, tempfile.NamedTemporaryFile( - mode="w", encoding="utf-8", suffix=".py", dir=dir, delete=False - ) as f: - try: - # Objects that use Instances should not reuse previously-compiled - # results in cache, because `Instances` could be a new class each time. - _clear_jit_cache() - - cls_name, s = _gen_instance_module(fields) - f.write(s) - f.flush() - f.close() - - module = _import(f.name) - new_instances = getattr(module, cls_name) - _ = torch.jit.script(new_instances) - # let torchscript think Instances was scripted already - Instances.__torch_script_class__ = True - # let torchscript find new_instances when looking for the jit type of Instances - Instances._jit_override_qualname = torch._jit_internal._qualified_name(new_instances) - - _add_instances_conversion_methods(new_instances) - yield new_instances - finally: - try: - del Instances.__torch_script_class__ - del Instances._jit_override_qualname - except AttributeError: - pass - sys.modules.pop(module.__name__) - - -def _gen_instance_class(fields): - """ - Args: - fields (dict[name: type]) - """ - - class _FieldType: - def __init__(self, name, type_): - assert isinstance(name, str), f"Field name must be str, got {name}" - self.name = name - self.type_ = type_ - self.annotation = f"{type_.__module__}.{type_.__name__}" - - fields = [_FieldType(k, v) for k, v in fields.items()] - - def indent(level, s): - return " " * 4 * level + s - - lines = [] - - global _counter - _counter += 1 - - cls_name = "ScriptedInstances{}".format(_counter) - - field_names = tuple(x.name for x in fields) - extra_args = ", ".join([f"{f.name}: Optional[{f.annotation}] = None" for f in fields]) - lines.append( - f""" -class {cls_name}: - def __init__(self, image_size: Tuple[int, int], {extra_args}): - self.image_size = image_size - self._field_names = {field_names} -""" - ) - - for f in fields: - lines.append( - indent(2, f"self._{f.name} = torch.jit.annotate(Optional[{f.annotation}], {f.name})") - ) - - for f in fields: - lines.append( - f""" - @property - def {f.name}(self) -> {f.annotation}: - # has to use a local for type refinement - # https://pytorch.org/docs/stable/jit_language_reference.html#optional-type-refinement - t = self._{f.name} - assert t is not None, "{f.name} is None and cannot be accessed!" 
- return t - - @{f.name}.setter - def {f.name}(self, value: {f.annotation}) -> None: - self._{f.name} = value -""" - ) - - # support method `__len__` - lines.append( - """ - def __len__(self) -> int: -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - return len(t) -""" - ) - lines.append( - """ - raise NotImplementedError("Empty Instances does not support __len__!") -""" - ) - - # support method `has` - lines.append( - """ - def has(self, name: str) -> bool: -""" - ) - for f in fields: - lines.append( - f""" - if name == "{f.name}": - return self._{f.name} is not None -""" - ) - lines.append( - """ - return False -""" - ) - - # support method `to` - none_args = ", None" * len(fields) - lines.append( - f""" - def to(self, device: torch.device) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - if hasattr(f.type_, "to"): - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret._{f.name} = t.to(device) -""" - ) - else: - # For now, ignore fields that cannot be moved to devices. - # Maybe can support other tensor-like classes (e.g. __torch_function__) - pass - lines.append( - """ - return ret -""" - ) - - # support method `getitem` - none_args = ", None" * len(fields) - lines.append( - f""" - def __getitem__(self, item) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret._{f.name} = t[item] -""" - ) - lines.append( - """ - return ret -""" - ) - - # support method `cat` - # this version does not contain checks that all instances have same size and fields - none_args = ", None" * len(fields) - lines.append( - f""" - def cat(self, instances: List["{cls_name}"]) -> "{cls_name}": - ret = {cls_name}(self.image_size{none_args}) -""" - ) - for f in fields: - lines.append( - f""" - t = self._{f.name} - if t is not None: - values: List[{f.annotation}] = [x.{f.name} for x in instances] - if torch.jit.isinstance(t, torch.Tensor): - ret._{f.name} = torch.cat(values, dim=0) - else: - ret._{f.name} = t.cat(values) -""" - ) - lines.append( - """ - return ret""" - ) - - # support method `get_fields()` - lines.append( - """ - def get_fields(self) -> Dict[str, Tensor]: - ret = {} - """ - ) - for f in fields: - if f.type_ == Boxes: - stmt = "t.tensor" - elif f.type_ == torch.Tensor: - stmt = "t" - else: - stmt = f'assert False, "unsupported type {str(f.type_)}"' - lines.append( - f""" - t = self._{f.name} - if t is not None: - ret["{f.name}"] = {stmt} - """ - ) - lines.append( - """ - return ret""" - ) - return cls_name, os.linesep.join(lines) - - -def _gen_instance_module(fields): - # TODO: find a more automatic way to enable import of other classes - s = """ -from copy import deepcopy -import torch -from torch import Tensor -import typing -from typing import * - -import detectron2 -from detectron2.structures import Boxes, Instances - -""" - - cls_name, cls_def = _gen_instance_class(fields) - s += cls_def - return cls_name, s - - -def _import(path): - return _import_file( - "{}{}".format(sys.modules[__name__].__name__, _counter), path, make_importable=True - ) - - -@contextmanager -def patch_builtin_len(modules=()): - """ - Patch the builtin len() function of a few detectron2 modules - to use __len__ instead, because __len__ does not convert values to - integers and therefore is friendly to tracing. 
-
-    Args:
-        modules (list[str]): names of extra modules to patch len(), in
-            addition to those in detectron2.
-    """
-
-    def _new_len(obj):
-        return obj.__len__()
-
-    with ExitStack() as stack:
-        MODULES = [
-            "detectron2.modeling.roi_heads.fast_rcnn",
-            "detectron2.modeling.roi_heads.mask_head",
-            "detectron2.modeling.roi_heads.keypoint_head",
-        ] + list(modules)
-        ctxs = [stack.enter_context(mock.patch(mod + ".len")) for mod in MODULES]
-        for m in ctxs:
-            m.side_effect = _new_len
-        yield
-
-
-def patch_nonscriptable_classes():
-    """
-    Apply patches on a few nonscriptable detectron2 classes.
-    Should not have side-effects on eager usage.
-    """
-    # __prepare_scriptable__ can also be added to models for easier maintenance.
-    # But it complicates the clean model code.
-
-    from detectron2.modeling.backbone import ResNet, FPN
-
-    # Due to https://github.com/pytorch/pytorch/issues/36061,
-    # we change backbone to use ModuleList for scripting.
-    # (note: this changes param names in state_dict)
-
-    def prepare_resnet(self):
-        ret = deepcopy(self)
-        ret.stages = nn.ModuleList(ret.stages)
-        for k in self.stage_names:
-            delattr(ret, k)
-        return ret
-
-    ResNet.__prepare_scriptable__ = prepare_resnet
-
-    def prepare_fpn(self):
-        ret = deepcopy(self)
-        ret.lateral_convs = nn.ModuleList(ret.lateral_convs)
-        ret.output_convs = nn.ModuleList(ret.output_convs)
-        for name, _ in self.named_children():
-            if name.startswith("fpn_"):
-                delattr(ret, name)
-        return ret
-
-    FPN.__prepare_scriptable__ = prepare_fpn
-
-    # Annotate some attributes to be constants for the purpose of scripting,
-    # even though they are not constants in eager mode.
-    from detectron2.modeling.roi_heads import StandardROIHeads
-
-    if hasattr(StandardROIHeads, "__annotations__"):
-        # copy first to avoid editing annotations of base class
-        StandardROIHeads.__annotations__ = deepcopy(StandardROIHeads.__annotations__)
-        StandardROIHeads.__annotations__["mask_on"] = torch.jit.Final[bool]
-        StandardROIHeads.__annotations__["keypoint_on"] = torch.jit.Final[bool]
-
-
-# These patches are not supposed to have side-effects.
-patch_nonscriptable_classes()
-
-
-@contextmanager
-def freeze_training_mode(model):
-    """
-    A context manager that annotates the "training" attribute of every submodule
-    as constant, so that the training codepath in these modules can be
-    meta-compiled away. Upon exiting, the annotations are reverted.
-    """
-    classes = {type(x) for x in model.modules()}
-    # __constants__ is the old way to annotate constants and is not compatible
-    # with __annotations__ .
-    classes = {x for x in classes if not hasattr(x, "__constants__")}
-    for cls in classes:
-        cls.__annotations__["training"] = torch.jit.Final[bool]
-    yield
-    for cls in classes:
-        cls.__annotations__["training"] = bool
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py
deleted file mode 100755
index 3d015c53..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/__init__.py
+++ /dev/null
@@ -1,24 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from .batch_norm import FrozenBatchNorm2d, get_norm, NaiveSyncBatchNorm, CycleBatchNormList -from .deform_conv import DeformConv, ModulatedDeformConv -from .mask_ops import paste_masks_in_image -from .nms import batched_nms, batched_nms_rotated, nms, nms_rotated -from .roi_align import ROIAlign, roi_align -from .roi_align_rotated import ROIAlignRotated, roi_align_rotated -from .shape_spec import ShapeSpec -from .wrappers import ( - BatchNorm2d, - Conv2d, - ConvTranspose2d, - cat, - interpolate, - Linear, - nonzero_tuple, - cross_entropy, - shapes_to_tensor, -) -from .blocks import CNNBlockBase, DepthwiseSeparableConv2d -from .aspp import ASPP -from .losses import ciou_loss, diou_loss - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py deleted file mode 100755 index 14861aa9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/aspp.py +++ /dev/null @@ -1,144 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from copy import deepcopy -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn -from torch.nn import functional as F - -from .batch_norm import get_norm -from .blocks import DepthwiseSeparableConv2d -from .wrappers import Conv2d - - -class ASPP(nn.Module): - """ - Atrous Spatial Pyramid Pooling (ASPP). - """ - - def __init__( - self, - in_channels, - out_channels, - dilations, - *, - norm, - activation, - pool_kernel_size=None, - dropout: float = 0.0, - use_depthwise_separable_conv=False, - ): - """ - Args: - in_channels (int): number of input channels for ASPP. - out_channels (int): number of output channels. - dilations (list): a list of 3 dilations in ASPP. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. norm is - applied to all conv layers except the conv following - global average pooling. - activation (callable): activation function. - pool_kernel_size (tuple, list): the average pooling size (kh, kw) - for image pooling layer in ASPP. If set to None, it always - performs global average pooling. If not None, it must be - divisible by the shape of inputs in forward(). It is recommended - to use a fixed input feature size in training, and set this - option to match this size, so that it performs global average - pooling in training, and the size of the pooling window stays - consistent in inference. - dropout (float): apply dropout on the output of ASPP. It is used in - the official DeepLab implementation with a rate of 0.1: - https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 # noqa - use_depthwise_separable_conv (bool): use DepthwiseSeparableConv2d - for 3x3 convs in ASPP, proposed in :paper:`DeepLabV3+`. 
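-
-        Example (an illustrative sketch; the sizes are hypothetical)::
-            aspp = ASPP(256, 256, [6, 12, 18], norm="BN", activation=F.relu)
-            y = aspp(x)  # x: (N, 256, H, W) -> y: (N, 256, H, W)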
- """ - super(ASPP, self).__init__() - assert len(dilations) == 3, "ASPP expects 3 dilations, got {}".format(len(dilations)) - self.pool_kernel_size = pool_kernel_size - self.dropout = dropout - use_bias = norm == "" - self.convs = nn.ModuleList() - # conv 1x1 - self.convs.append( - Conv2d( - in_channels, - out_channels, - kernel_size=1, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - ) - weight_init.c2_xavier_fill(self.convs[-1]) - # atrous convs - for dilation in dilations: - if use_depthwise_separable_conv: - self.convs.append( - DepthwiseSeparableConv2d( - in_channels, - out_channels, - kernel_size=3, - padding=dilation, - dilation=dilation, - norm1=norm, - activation1=deepcopy(activation), - norm2=norm, - activation2=deepcopy(activation), - ) - ) - else: - self.convs.append( - Conv2d( - in_channels, - out_channels, - kernel_size=3, - padding=dilation, - dilation=dilation, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - ) - weight_init.c2_xavier_fill(self.convs[-1]) - # image pooling - # We do not add BatchNorm because the spatial resolution is 1x1, - # the original TF implementation has BatchNorm. - if pool_kernel_size is None: - image_pooling = nn.Sequential( - nn.AdaptiveAvgPool2d(1), - Conv2d(in_channels, out_channels, 1, bias=True, activation=deepcopy(activation)), - ) - else: - image_pooling = nn.Sequential( - nn.AvgPool2d(kernel_size=pool_kernel_size, stride=1), - Conv2d(in_channels, out_channels, 1, bias=True, activation=deepcopy(activation)), - ) - weight_init.c2_xavier_fill(image_pooling[1]) - self.convs.append(image_pooling) - - self.project = Conv2d( - 5 * out_channels, - out_channels, - kernel_size=1, - bias=use_bias, - norm=get_norm(norm, out_channels), - activation=deepcopy(activation), - ) - weight_init.c2_xavier_fill(self.project) - - def forward(self, x): - size = x.shape[-2:] - if self.pool_kernel_size is not None: - if size[0] % self.pool_kernel_size[0] or size[1] % self.pool_kernel_size[1]: - raise ValueError( - "`pool_kernel_size` must be divisible by the shape of inputs. " - "Input size: {} `pool_kernel_size`: {}".format(size, self.pool_kernel_size) - ) - res = [] - for conv in self.convs: - res.append(conv(x)) - res[-1] = F.interpolate(res[-1], size=size, mode="bilinear", align_corners=False) - res = torch.cat(res, dim=1) - res = self.project(res) - res = F.dropout(res, self.dropout, training=self.training) if self.dropout > 0 else res - return res diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py deleted file mode 100755 index 09a6c66c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/batch_norm.py +++ /dev/null @@ -1,276 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -import torch.distributed as dist -from fvcore.nn.distributed import differentiable_all_reduce -from torch import nn -from torch.nn import functional as F - -from detectron2.utils import comm, env - -from .wrappers import BatchNorm2d - - -class FrozenBatchNorm2d(nn.Module): - """ - BatchNorm2d where the batch statistics and the affine parameters are fixed. - - It contains non-trainable buffers called - "weight" and "bias", "running_mean", "running_var", - initialized to perform identity transformation. 
-
-    The pre-trained backbone models from Caffe2 only contain "weight" and "bias",
-    which are computed from the original four parameters of BN.
-    The affine transform `x * weight + bias` will perform the equivalent
-    computation of `(x - running_mean) / sqrt(running_var) * weight + bias`.
-    When loading a backbone model from Caffe2, "running_mean" and "running_var"
-    will be left unchanged as identity transformation.
-
-    Other pre-trained backbone models may contain all 4 parameters.
-
-    The forward is implemented by `F.batch_norm(..., training=False)`.
-    """
-
-    _version = 3
-
-    def __init__(self, num_features, eps=1e-5):
-        super().__init__()
-        self.num_features = num_features
-        self.eps = eps
-        self.register_buffer("weight", torch.ones(num_features))
-        self.register_buffer("bias", torch.zeros(num_features))
-        self.register_buffer("running_mean", torch.zeros(num_features))
-        self.register_buffer("running_var", torch.ones(num_features) - eps)
-
-    def forward(self, x):
-        if x.requires_grad:
-            # When gradients are needed, F.batch_norm will use extra memory
-            # because its backward op computes gradients for weight/bias as well.
-            scale = self.weight * (self.running_var + self.eps).rsqrt()
-            bias = self.bias - self.running_mean * scale
-            scale = scale.reshape(1, -1, 1, 1)
-            bias = bias.reshape(1, -1, 1, 1)
-            out_dtype = x.dtype  # may be half
-            return x * scale.to(out_dtype) + bias.to(out_dtype)
-        else:
-            # When gradients are not needed, F.batch_norm is a single fused op
-            # and provides more optimization opportunities.
-            return F.batch_norm(
-                x,
-                self.running_mean,
-                self.running_var,
-                self.weight,
-                self.bias,
-                training=False,
-                eps=self.eps,
-            )
-
-    def _load_from_state_dict(
-        self, state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs
-    ):
-        version = local_metadata.get("version", None)
-
-        if version is None or version < 2:
-            # No running_mean/var in early versions
-            # This will silence the warnings
-            if prefix + "running_mean" not in state_dict:
-                state_dict[prefix + "running_mean"] = torch.zeros_like(self.running_mean)
-            if prefix + "running_var" not in state_dict:
-                state_dict[prefix + "running_var"] = torch.ones_like(self.running_var)
-
-        super()._load_from_state_dict(
-            state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs
-        )
-
-    def __repr__(self):
-        return "FrozenBatchNorm2d(num_features={}, eps={})".format(self.num_features, self.eps)
-
-    @classmethod
-    def convert_frozen_batchnorm(cls, module):
-        """
-        Convert all BatchNorm/SyncBatchNorm in module into FrozenBatchNorm.
-
-        Args:
-            module (torch.nn.Module):
-
-        Returns:
-            If module is BatchNorm/SyncBatchNorm, returns a new module.
-            Otherwise, converts the module in-place and returns it.
-
-        Similar to convert_sync_batchnorm in
-        https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py
-        """
-        bn_module = nn.modules.batchnorm
-        bn_module = (bn_module.BatchNorm2d, bn_module.SyncBatchNorm)
-        res = module
-        if isinstance(module, bn_module):
-            res = cls(module.num_features)
-            if module.affine:
-                res.weight.data = module.weight.data.clone().detach()
-                res.bias.data = module.bias.data.clone().detach()
-            res.running_mean.data = module.running_mean.data
-            res.running_var.data = module.running_var.data
-            res.eps = module.eps
-        else:
-            for name, child in module.named_children():
-                new_child = cls.convert_frozen_batchnorm(child)
-                if new_child is not child:
-                    res.add_module(name, new_child)
-        return res
-
-
-def get_norm(norm, out_channels):
-    """
-    Args:
-        norm (str or callable): either one of BN, SyncBN, FrozenBN, GN;
-            or a callable that takes a channel number and returns
-            the normalization layer as a nn.Module.
-
-    Returns:
-        nn.Module or None: the normalization layer
-    """
-    if norm is None:
-        return None
-    if isinstance(norm, str):
-        if len(norm) == 0:
-            return None
-        norm = {
-            "BN": BatchNorm2d,
-            # Fixed in https://github.com/pytorch/pytorch/pull/36382
-            "SyncBN": NaiveSyncBatchNorm if env.TORCH_VERSION <= (1, 5) else nn.SyncBatchNorm,
-            "FrozenBN": FrozenBatchNorm2d,
-            "GN": lambda channels: nn.GroupNorm(32, channels),
-            # for debugging:
-            "nnSyncBN": nn.SyncBatchNorm,
-            "naiveSyncBN": NaiveSyncBatchNorm,
-            # expose stats_mode N as an option to caller, required for zero-len inputs
-            "naiveSyncBN_N": lambda channels: NaiveSyncBatchNorm(channels, stats_mode="N"),
-        }[norm]
-    return norm(out_channels)
-
-
-class NaiveSyncBatchNorm(BatchNorm2d):
-    """
-    In PyTorch<=1.5, ``nn.SyncBatchNorm`` has incorrect gradient
-    when the batch size on each worker is different.
-    (e.g., when scale augmentation is used, or when it is applied to mask head).
-
-    This is a slower but correct alternative to `nn.SyncBatchNorm`.
-
-    Note:
-        There isn't a single definition of Sync BatchNorm.
-
-        When ``stats_mode==""``, this module computes overall statistics by using
-        statistics of each worker with equal weight. The result is true statistics
-        of all samples (as if they are all on one worker) only when all workers
-        have the same (N, H, W). This mode does not support inputs with zero batch size.
-
-        When ``stats_mode=="N"``, this module computes overall statistics by weighting
-        the statistics of each worker by their ``N``. The result is true statistics
-        of all samples (as if they are all on one worker) only when all workers
-        have the same (H, W). It is slower than ``stats_mode==""``.
-
-        Even though the result of this module may not be the true statistics of all samples,
-        it may still be reasonable because it might be preferable to assign equal weights
-        to all workers, regardless of their (H, W) dimension, instead of putting larger weight
-        on larger images. From preliminary experiments, little difference is found between such
-        a simplified implementation and an accurate computation of overall mean & variance.
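-
-    Example (an illustrative sketch; num_features is hypothetical)::
-        bn = NaiveSyncBatchNorm(64, stats_mode="N")
-        # behaves like BatchNorm2d on a single worker; with multiple workers,
-        # per-batch statistics are all-reduced across workers during training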
- """ - - def __init__(self, *args, stats_mode="", **kwargs): - super().__init__(*args, **kwargs) - assert stats_mode in ["", "N"] - self._stats_mode = stats_mode - - def forward(self, input): - if comm.get_world_size() == 1 or not self.training: - return super().forward(input) - - B, C = input.shape[0], input.shape[1] - - half_input = input.dtype == torch.float16 - if half_input: - # fp16 does not have good enough numerics for the reduction here - input = input.float() - mean = torch.mean(input, dim=[0, 2, 3]) - meansqr = torch.mean(input * input, dim=[0, 2, 3]) - - if self._stats_mode == "": - assert B > 0, 'SyncBatchNorm(stats_mode="") does not support zero batch size.' - vec = torch.cat([mean, meansqr], dim=0) - vec = differentiable_all_reduce(vec) * (1.0 / dist.get_world_size()) - mean, meansqr = torch.split(vec, C) - momentum = self.momentum - else: - if B == 0: - vec = torch.zeros([2 * C + 1], device=mean.device, dtype=mean.dtype) - vec = vec + input.sum() # make sure there is gradient w.r.t input - else: - vec = torch.cat( - [mean, meansqr, torch.ones([1], device=mean.device, dtype=mean.dtype)], dim=0 - ) - vec = differentiable_all_reduce(vec * B) - - total_batch = vec[-1].detach() - momentum = total_batch.clamp(max=1) * self.momentum # no update if total_batch is 0 - mean, meansqr, _ = torch.split(vec / total_batch.clamp(min=1), C) # avoid div-by-zero - - var = meansqr - mean * mean - invstd = torch.rsqrt(var + self.eps) - scale = self.weight * invstd - bias = self.bias - mean * scale - scale = scale.reshape(1, -1, 1, 1) - bias = bias.reshape(1, -1, 1, 1) - - self.running_mean += momentum * (mean.detach() - self.running_mean) - self.running_var += momentum * (var.detach() - self.running_var) - ret = input * scale + bias - if half_input: - ret = ret.half() - return ret - - -class CycleBatchNormList(nn.ModuleList): - """ - Implement domain-specific BatchNorm by cycling. - - When a BatchNorm layer is used for multiple input domains or input - features, it might need to maintain a separate test-time statistics - for each domain. See Sec 5.2 in :paper:`rethinking-batchnorm`. - - This module implements it by using N separate BN layers - and it cycles through them every time a forward() is called. - - NOTE: The caller of this module MUST guarantee to always call - this module by multiple of N times. Otherwise its test-time statistics - will be incorrect. - """ - - def __init__(self, length: int, bn_class=nn.BatchNorm2d, **kwargs): - """ - Args: - length: number of BatchNorm layers to cycle. - bn_class: the BatchNorm class to use - kwargs: arguments of the BatchNorm class, such as num_features. 
- """ - self._affine = kwargs.pop("affine", True) - super().__init__([bn_class(**kwargs, affine=False) for k in range(length)]) - if self._affine: - # shared affine, domain-specific BN - channels = self[0].num_features - self.weight = nn.Parameter(torch.ones(channels)) - self.bias = nn.Parameter(torch.zeros(channels)) - self._pos = 0 - - def forward(self, x): - ret = self[self._pos](x) - self._pos = (self._pos + 1) % len(self) - - if self._affine: - w = self.weight.reshape(1, -1, 1, 1) - b = self.bias.reshape(1, -1, 1, 1) - return ret * w + b - else: - return ret - - def extra_repr(self): - return f"affine={self._affine}" diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py deleted file mode 100755 index 1995a4bf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/blocks.py +++ /dev/null @@ -1,111 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import fvcore.nn.weight_init as weight_init -from torch import nn - -from .batch_norm import FrozenBatchNorm2d, get_norm -from .wrappers import Conv2d - - -""" -CNN building blocks. -""" - - -class CNNBlockBase(nn.Module): - """ - A CNN block is assumed to have input channels, output channels and a stride. - The input and output of `forward()` method must be NCHW tensors. - The method can perform arbitrary computation but must match the given - channels and stride specification. - - Attribute: - in_channels (int): - out_channels (int): - stride (int): - """ - - def __init__(self, in_channels, out_channels, stride): - """ - The `__init__` method of any subclass should also contain these arguments. - - Args: - in_channels (int): - out_channels (int): - stride (int): - """ - super().__init__() - self.in_channels = in_channels - self.out_channels = out_channels - self.stride = stride - - def freeze(self): - """ - Make this block not trainable. - This method sets all parameters to `requires_grad=False`, - and convert all BatchNorm layers to FrozenBatchNorm - - Returns: - the block itself - """ - for p in self.parameters(): - p.requires_grad = False - FrozenBatchNorm2d.convert_frozen_batchnorm(self) - return self - - -class DepthwiseSeparableConv2d(nn.Module): - """ - A kxk depthwise convolution + a 1x1 convolution. - - In :paper:`xception`, norm & activation are applied on the second conv. - :paper:`mobilenet` uses norm & activation on both convs. - """ - - def __init__( - self, - in_channels, - out_channels, - kernel_size=3, - padding=1, - dilation=1, - *, - norm1=None, - activation1=None, - norm2=None, - activation2=None, - ): - """ - Args: - norm1, norm2 (str or callable): normalization for the two conv layers. - activation1, activation2 (callable(Tensor) -> Tensor): activation - function for the two conv layers. 
- """ - super().__init__() - self.depthwise = Conv2d( - in_channels, - in_channels, - kernel_size=kernel_size, - padding=padding, - dilation=dilation, - groups=in_channels, - bias=not norm1, - norm=get_norm(norm1, in_channels), - activation=activation1, - ) - self.pointwise = Conv2d( - in_channels, - out_channels, - kernel_size=1, - bias=not norm2, - norm=get_norm(norm2, out_channels), - activation=activation2, - ) - - # default initialization - weight_init.c2_msra_fill(self.depthwise) - weight_init.c2_msra_fill(self.pointwise) - - def forward(self, x): - return self.pointwise(self.depthwise(x)) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md deleted file mode 100755 index 778ed3da..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/README.md +++ /dev/null @@ -1,7 +0,0 @@ - - -To add a new Op: - -1. Create a new directory -2. Implement new ops there -3. Delcare its Python interface in `vision.cpp`. diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h deleted file mode 100755 index 03f42110..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated.h +++ /dev/null @@ -1,115 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -at::Tensor ROIAlignRotated_forward_cpu( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio); - -at::Tensor ROIAlignRotated_backward_cpu( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio); - -#if defined(WITH_CUDA) || defined(WITH_HIP) -at::Tensor ROIAlignRotated_forward_cuda( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio); - -at::Tensor ROIAlignRotated_backward_cuda( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio); -#endif - -// Interface for Python -inline at::Tensor ROIAlignRotated_forward( - const at::Tensor& input, - const at::Tensor& rois, - const double spatial_scale, - const int64_t pooled_height, - const int64_t pooled_width, - const int64_t sampling_ratio) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return ROIAlignRotated_forward_cuda( - input, - rois, - spatial_scale, - pooled_height, - pooled_width, - sampling_ratio); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - return ROIAlignRotated_forward_cpu( - input, rois, spatial_scale, pooled_height, pooled_width, sampling_ratio); -} - -inline at::Tensor ROIAlignRotated_backward( - const at::Tensor& grad, - const at::Tensor& rois, - const double spatial_scale, - const int64_t pooled_height, - const int64_t pooled_width, - const int64_t batch_size, - const int64_t channels, - const int64_t height, - const int64_t width, - 
const int64_t sampling_ratio) { - if (grad.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return ROIAlignRotated_backward_cuda( - grad, - rois, - spatial_scale, - pooled_height, - pooled_width, - batch_size, - channels, - height, - width, - sampling_ratio); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - return ROIAlignRotated_backward_cpu( - grad, - rois, - spatial_scale, - pooled_height, - pooled_width, - batch_size, - channels, - height, - width, - sampling_ratio); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp deleted file mode 100755 index 2a3d3056..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cpu.cpp +++ /dev/null @@ -1,522 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include "ROIAlignRotated.h" - -// Note: this implementation originates from the Caffe2 ROIAlignRotated Op -// and PyTorch ROIAlign (non-rotated) Op implementations. -// The key difference between this implementation and those ones is -// we don't do "legacy offset" in this version, as there aren't many previous -// works, if any, using the "legacy" ROIAlignRotated Op. -// This would make the interface a bit cleaner. - -namespace detectron2 { - -namespace { -template -struct PreCalc { - int pos1; - int pos2; - int pos3; - int pos4; - T w1; - T w2; - T w3; - T w4; -}; - -template -void pre_calc_for_bilinear_interpolate( - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int iy_upper, - const int ix_upper, - T roi_start_h, - T roi_start_w, - T bin_size_h, - T bin_size_w, - int roi_bin_grid_h, - int roi_bin_grid_w, - T roi_center_h, - T roi_center_w, - T cos_theta, - T sin_theta, - std::vector>& pre_calc) { - int pre_calc_index = 0; - for (int ph = 0; ph < pooled_height; ph++) { - for (int pw = 0; pw < pooled_width; pw++) { - for (int iy = 0; iy < iy_upper; iy++) { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < ix_upper; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - // In image space, (y, x) is the order for Right Handed System, - // and this is essentially multiplying the point by a rotation matrix - // to rotate it counterclockwise through angle theta. 
- T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - // deal with: inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - PreCalc pc; - pc.pos1 = 0; - pc.pos2 = 0; - pc.pos3 = 0; - pc.pos4 = 0; - pc.w1 = 0; - pc.w2 = 0; - pc.w3 = 0; - pc.w4 = 0; - pre_calc[pre_calc_index] = pc; - pre_calc_index += 1; - continue; - } - - if (y < 0) { - y = 0; - } - if (x < 0) { - x = 0; - } - - int y_low = (int)y; - int x_low = (int)x; - int y_high; - int x_high; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. - lx; - T w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - // save weights and indices - PreCalc pc; - pc.pos1 = y_low * width + x_low; - pc.pos2 = y_low * width + x_high; - pc.pos3 = y_high * width + x_low; - pc.pos4 = y_high * width + x_high; - pc.w1 = w1; - pc.w2 = w2; - pc.w3 = w3; - pc.w4 = w4; - pre_calc[pre_calc_index] = pc; - - pre_calc_index += 1; - } - } - } - } -} - -template -void bilinear_interpolate_gradient( - const int height, - const int width, - T y, - T x, - T& w1, - T& w2, - T& w3, - T& w4, - int& x_low, - int& x_high, - int& y_low, - int& y_high) { - // deal with cases that inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - w1 = w2 = w3 = w4 = 0.; - x_low = x_high = y_low = y_high = -1; - return; - } - - if (y < 0) { - y = 0; - } - - if (x < 0) { - x = 0; - } - - y_low = (int)y; - x_low = (int)x; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. 
- lx; - - // reference in forward - // T v1 = input[y_low * width + x_low]; - // T v2 = input[y_low * width + x_high]; - // T v3 = input[y_high * width + x_low]; - // T v4 = input[y_high * width + x_high]; - // T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - - w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - return; -} - -template -inline void add(T* address, const T& val) { - *address += val; -} - -} // namespace - -template -void ROIAlignRotatedForward( - const int nthreads, - const T* input, - const T& spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - const T* rois, - T* output) { - int n_rois = nthreads / channels / pooled_width / pooled_height; - // (n, c, ph, pw) is an element in the pooled output - // can be parallelized using omp - // #pragma omp parallel for num_threads(32) - for (int n = 0; n < n_rois; n++) { - int index_n = n * channels * pooled_width * pooled_height; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - AT_ASSERTM( - roi_width >= 0 && roi_height >= 0, - "ROIs in ROIAlignRotated do not have non-negative size!"); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // We do average (integral) pooling inside a bin - const T count = std::max(roi_bin_grid_h * roi_bin_grid_w, 1); // e.g. = 4 - - // we want to precalculate indices and weights shared by all channels, - // this is the key point of optimization - std::vector> pre_calc( - roi_bin_grid_h * roi_bin_grid_w * pooled_width * pooled_height); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. 
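-    // Illustrative note: each sampling point (yy, xx) below is defined relative
-    // to the RoI center; pre_calc_for_bilinear_interpolate rotates it into
-    // feature-map coordinates via
-    //   y = yy * cos_theta - xx * sin_theta + roi_center_h
-    //   x = yy * sin_theta + xx * cos_theta + roi_center_w
-    // before computing the bilinear interpolation weights.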
- T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - pre_calc_for_bilinear_interpolate( - height, - width, - pooled_height, - pooled_width, - roi_bin_grid_h, - roi_bin_grid_w, - roi_start_h, - roi_start_w, - bin_size_h, - bin_size_w, - roi_bin_grid_h, - roi_bin_grid_w, - roi_center_h, - roi_center_w, - cos_theta, - sin_theta, - pre_calc); - - for (int c = 0; c < channels; c++) { - int index_n_c = index_n + c * pooled_width * pooled_height; - const T* offset_input = - input + (roi_batch_ind * channels + c) * height * width; - int pre_calc_index = 0; - - for (int ph = 0; ph < pooled_height; ph++) { - for (int pw = 0; pw < pooled_width; pw++) { - int index = index_n_c + ph * pooled_width + pw; - - T output_val = 0.; - for (int iy = 0; iy < roi_bin_grid_h; iy++) { - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - PreCalc pc = pre_calc[pre_calc_index]; - output_val += pc.w1 * offset_input[pc.pos1] + - pc.w2 * offset_input[pc.pos2] + - pc.w3 * offset_input[pc.pos3] + pc.w4 * offset_input[pc.pos4]; - - pre_calc_index += 1; - } - } - output_val /= count; - - output[index] = output_val; - } // for pw - } // for ph - } // for c - } // for n -} - -template -void ROIAlignRotatedBackward( - const int nthreads, - // may not be contiguous. should index using n_stride, etc - const T* grad_output, - const T& spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - T* grad_input, - const T* rois, - const int n_stride, - const int c_stride, - const int h_stride, - const int w_stride) { - for (int index = 0; index < nthreads; index++) { - // (n, c, ph, pw) is an element in the pooled output - int pw = index % pooled_width; - int ph = (index / pooled_width) % pooled_height; - int c = (index / pooled_width / pooled_height) % channels; - int n = index / pooled_width / pooled_height / channels; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - AT_ASSERTM( - roi_width >= 0 && roi_height >= 0, - "ROIs in ROIAlignRotated do not have non-negative size!"); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - T* offset_grad_input = - grad_input + ((roi_batch_ind * channels + c) * height * width); - - int output_offset = n * n_stride + c * c_stride; - const T* offset_grad_output = grad_output + output_offset; - const T grad_output_this_bin = - offset_grad_output[ph * h_stride + pw * w_stride]; - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. 
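-    // Illustrative note: the forward pass averages count = roi_bin_grid_h *
-    // roi_bin_grid_w samples per output bin, so the incoming gradient of a bin
-    // is split below as g_i = grad_output_this_bin * w_i / count over the four
-    // bilinear neighbors of each sampling point.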
- T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - // We do average (integral) pooling inside a bin - const T count = roi_bin_grid_h * roi_bin_grid_w; // e.g. = 4 - - for (int iy = 0; iy < roi_bin_grid_h; iy++) { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T w1, w2, w3, w4; - int x_low, x_high, y_low, y_high; - - bilinear_interpolate_gradient( - height, width, y, x, w1, w2, w3, w4, x_low, x_high, y_low, y_high); - - T g1 = grad_output_this_bin * w1 / count; - T g2 = grad_output_this_bin * w2 / count; - T g3 = grad_output_this_bin * w3 / count; - T g4 = grad_output_this_bin * w4 / count; - - if (x_low >= 0 && x_high >= 0 && y_low >= 0 && y_high >= 0) { - // atomic add is not needed for now since it is single threaded - add(offset_grad_input + y_low * width + x_low, static_cast(g1)); - add(offset_grad_input + y_low * width + x_high, static_cast(g2)); - add(offset_grad_input + y_high * width + x_low, static_cast(g3)); - add(offset_grad_input + y_high * width + x_high, static_cast(g4)); - } // if - } // ix - } // iy - } // for -} // ROIAlignRotatedBackward - -at::Tensor ROIAlignRotated_forward_cpu( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio) { - AT_ASSERTM(input.device().is_cpu(), "input must be a CPU tensor"); - AT_ASSERTM(rois.device().is_cpu(), "rois must be a CPU tensor"); - - at::TensorArg input_t{input, "input", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlign_forward_cpu"; - at::checkAllSameType(c, {input_t, rois_t}); - - auto num_rois = rois.size(0); - auto channels = input.size(1); - auto height = input.size(2); - auto width = input.size(3); - - at::Tensor output = at::zeros( - {num_rois, channels, pooled_height, pooled_width}, input.options()); - - auto output_size = num_rois * pooled_height * pooled_width * channels; - - if (output.numel() == 0) { - return output; - } - - auto input_ = input.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - input.scalar_type(), "ROIAlignRotated_forward", [&] { - ROIAlignRotatedForward( - output_size, - input_.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - rois_.data_ptr(), - output.data_ptr()); - }); - return output; -} - -at::Tensor ROIAlignRotated_backward_cpu( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio) { - AT_ASSERTM(grad.device().is_cpu(), "grad must be a CPU tensor"); - AT_ASSERTM(rois.device().is_cpu(), "rois must be a CPU tensor"); - - at::TensorArg grad_t{grad, "grad", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlignRotated_backward_cpu"; - at::checkAllSameType(c, {grad_t, rois_t}); - - at::Tensor grad_input = - at::zeros({batch_size, channels, height, width}, grad.options()); - - // handle possibly empty gradients - if (grad.numel() == 0) { - return 
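Both the forward and backward paths map each bin-grid sample from RoI-local coordinates into feature-map coordinates with the same transform: rotate by theta about the RoI centre, then translate. Isolated as a sketch (hypothetical helper name):

```
// (yy, xx) are offsets relative to the RoI centre; (y, x) land on the map.
void rotate_sample(float yy, float xx, float cos_theta, float sin_theta,
                   float roi_center_h, float roi_center_w,
                   float& y, float& x) {
  y = yy * cos_theta - xx * sin_theta + roi_center_h;
  x = yy * sin_theta + xx * cos_theta + roi_center_w;
}
```

Note the backward pass above scatters gradients with a plain (non-atomic) `add`, which is safe only because the CPU implementation is single-threaded.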
grad_input; - } - - // get stride values to ensure indexing into gradients is correct. - int n_stride = grad.stride(0); - int c_stride = grad.stride(1); - int h_stride = grad.stride(2); - int w_stride = grad.stride(3); - - auto rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - grad.scalar_type(), "ROIAlignRotated_forward", [&] { - ROIAlignRotatedBackward( - grad.numel(), - grad.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - grad_input.data_ptr(), - rois_.data_ptr(), - n_stride, - c_stride, - h_stride, - w_stride); - }); - return grad_input; -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu deleted file mode 100755 index fca18651..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/ROIAlignRotated/ROIAlignRotated_cuda.cu +++ /dev/null @@ -1,443 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include - -// TODO make it in a common file -#define CUDA_1D_KERNEL_LOOP(i, n) \ - for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n; \ - i += blockDim.x * gridDim.x) - -// Note: this implementation originates from the Caffe2 ROIAlignRotated Op -// and PyTorch ROIAlign (non-rotated) Op implementations. -// The key difference between this implementation and those ones is -// we don't do "legacy offset" in this version, as there aren't many previous -// works, if any, using the "legacy" ROIAlignRotated Op. -// This would make the interface a bit cleaner. - -namespace detectron2 { - -namespace { - -template -__device__ T bilinear_interpolate( - const T* input, - const int height, - const int width, - T y, - T x) { - // deal with cases that inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - return 0; - } - - if (y < 0) { - y = 0; - } - - if (x < 0) { - x = 0; - } - - int y_low = (int)y; - int x_low = (int)x; - int y_high; - int x_high; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. 
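The `CUDA_1D_KERNEL_LOOP` macro defined above implements the standard grid-stride loop. A minimal standalone kernel in the same style (illustrative example, not from the deleted sources):

```
// Each thread starts at its global index and advances by the total number
// of launched threads, so any grid size covers all n elements.
__global__ void scale_kernel(float* data, float alpha, int n) {
  for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
       i += blockDim.x * gridDim.x) {
    data[i] *= alpha;
  }
}
```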
- lx; - // do bilinear interpolation - T v1 = input[y_low * width + x_low]; - T v2 = input[y_low * width + x_high]; - T v3 = input[y_high * width + x_low]; - T v4 = input[y_high * width + x_high]; - T w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - - return val; -} - -template -__device__ void bilinear_interpolate_gradient( - const int height, - const int width, - T y, - T x, - T& w1, - T& w2, - T& w3, - T& w4, - int& x_low, - int& x_high, - int& y_low, - int& y_high) { - // deal with cases that inverse elements are out of feature map boundary - if (y < -1.0 || y > height || x < -1.0 || x > width) { - // empty - w1 = w2 = w3 = w4 = 0.; - x_low = x_high = y_low = y_high = -1; - return; - } - - if (y < 0) { - y = 0; - } - - if (x < 0) { - x = 0; - } - - y_low = (int)y; - x_low = (int)x; - - if (y_low >= height - 1) { - y_high = y_low = height - 1; - y = (T)y_low; - } else { - y_high = y_low + 1; - } - - if (x_low >= width - 1) { - x_high = x_low = width - 1; - x = (T)x_low; - } else { - x_high = x_low + 1; - } - - T ly = y - y_low; - T lx = x - x_low; - T hy = 1. - ly, hx = 1. - lx; - - // reference in forward - // T v1 = input[y_low * width + x_low]; - // T v2 = input[y_low * width + x_high]; - // T v3 = input[y_high * width + x_low]; - // T v4 = input[y_high * width + x_high]; - // T val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - - w1 = hy * hx, w2 = hy * lx, w3 = ly * hx, w4 = ly * lx; - - return; -} - -} // namespace - -template -__global__ void RoIAlignRotatedForward( - const int nthreads, - const T* input, - const T spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - const T* rois, - T* top_data) { - CUDA_1D_KERNEL_LOOP(index, nthreads) { - // (n, c, ph, pw) is an element in the pooled output - int pw = index % pooled_width; - int ph = (index / pooled_width) % pooled_height; - int c = (index / pooled_width / pooled_height) % channels; - int n = index / pooled_width / pooled_height / channels; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - const T* offset_input = - input + (roi_batch_ind * channels + c) * height * width; - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. - T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - // We do average (inte gral) pooling inside a bin - const T count = max(roi_bin_grid_h * roi_bin_grid_w, 1); // e.g. 
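Each CUDA thread decodes its 4-D output coordinate `(n, c, ph, pw)` from a flat index, with the innermost dimension varying fastest. The decode used at the top of both kernels above, as a sketch (hypothetical helper name):

```
__device__ void decode_index(int index, int pooled_width, int pooled_height,
                             int channels, int& pw, int& ph, int& c, int& n) {
  pw = index % pooled_width;
  ph = (index / pooled_width) % pooled_height;
  c  = (index / pooled_width / pooled_height) % channels;
  n  = index / pooled_width / pooled_height / channels;
}
```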
= 4 - - T output_val = 0.; - for (int iy = 0; iy < roi_bin_grid_h; iy++) // e.g., iy = 0, 1 - { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T val = bilinear_interpolate(offset_input, height, width, y, x); - output_val += val; - } - } - output_val /= count; - - top_data[index] = output_val; - } -} - -template -__global__ void RoIAlignRotatedBackwardFeature( - const int nthreads, - const T* top_diff, - const int num_rois, - const T spatial_scale, - const int channels, - const int height, - const int width, - const int pooled_height, - const int pooled_width, - const int sampling_ratio, - T* bottom_diff, - const T* rois) { - CUDA_1D_KERNEL_LOOP(index, nthreads) { - // (n, c, ph, pw) is an element in the pooled output - int pw = index % pooled_width; - int ph = (index / pooled_width) % pooled_height; - int c = (index / pooled_width / pooled_height) % channels; - int n = index / pooled_width / pooled_height / channels; - - const T* current_roi = rois + n * 6; - int roi_batch_ind = current_roi[0]; - - // Do not use rounding; this implementation detail is critical - // ROIAlignRotated supports align == true, i.e., continuous coordinate - // by default, thus the 0.5 offset - T offset = (T)0.5; - T roi_center_w = current_roi[1] * spatial_scale - offset; - T roi_center_h = current_roi[2] * spatial_scale - offset; - T roi_width = current_roi[3] * spatial_scale; - T roi_height = current_roi[4] * spatial_scale; - T theta = current_roi[5] * M_PI / 180.0; - T cos_theta = cos(theta); - T sin_theta = sin(theta); - - T bin_size_h = static_cast(roi_height) / static_cast(pooled_height); - T bin_size_w = static_cast(roi_width) / static_cast(pooled_width); - - T* offset_bottom_diff = - bottom_diff + (roi_batch_ind * channels + c) * height * width; - - int top_offset = (n * channels + c) * pooled_height * pooled_width; - const T* offset_top_diff = top_diff + top_offset; - const T top_diff_this_bin = offset_top_diff[ph * pooled_width + pw]; - - // We use roi_bin_grid to sample the grid and mimic integral - int roi_bin_grid_h = (sampling_ratio > 0) - ? sampling_ratio - : ceil(roi_height / pooled_height); // e.g., = 2 - int roi_bin_grid_w = - (sampling_ratio > 0) ? sampling_ratio : ceil(roi_width / pooled_width); - - // roi_start_h and roi_start_w are computed wrt the center of RoI (x, y). - // Appropriate translation needs to be applied after. - T roi_start_h = -roi_height / 2.0; - T roi_start_w = -roi_width / 2.0; - - // We do average (integral) pooling inside a bin - const T count = roi_bin_grid_h * roi_bin_grid_w; // e.g. 
= 4 - - for (int iy = 0; iy < roi_bin_grid_h; iy++) // e.g., iy = 0, 1 - { - const T yy = roi_start_h + ph * bin_size_h + - static_cast(iy + .5f) * bin_size_h / - static_cast(roi_bin_grid_h); // e.g., 0.5, 1.5 - for (int ix = 0; ix < roi_bin_grid_w; ix++) { - const T xx = roi_start_w + pw * bin_size_w + - static_cast(ix + .5f) * bin_size_w / - static_cast(roi_bin_grid_w); - - // Rotate by theta around the center and translate - T y = yy * cos_theta - xx * sin_theta + roi_center_h; - T x = yy * sin_theta + xx * cos_theta + roi_center_w; - - T w1, w2, w3, w4; - int x_low, x_high, y_low, y_high; - - bilinear_interpolate_gradient( - height, width, y, x, w1, w2, w3, w4, x_low, x_high, y_low, y_high); - - T g1 = top_diff_this_bin * w1 / count; - T g2 = top_diff_this_bin * w2 / count; - T g3 = top_diff_this_bin * w3 / count; - T g4 = top_diff_this_bin * w4 / count; - - if (x_low >= 0 && x_high >= 0 && y_low >= 0 && y_high >= 0) { - atomicAdd( - offset_bottom_diff + y_low * width + x_low, static_cast(g1)); - atomicAdd( - offset_bottom_diff + y_low * width + x_high, static_cast(g2)); - atomicAdd( - offset_bottom_diff + y_high * width + x_low, static_cast(g3)); - atomicAdd( - offset_bottom_diff + y_high * width + x_high, static_cast(g4)); - } // if - } // ix - } // iy - } // CUDA_1D_KERNEL_LOOP -} // RoIAlignRotatedBackward - -at::Tensor ROIAlignRotated_forward_cuda( - const at::Tensor& input, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int sampling_ratio) { - AT_ASSERTM(input.device().is_cuda(), "input must be a CUDA tensor"); - AT_ASSERTM(rois.device().is_cuda(), "rois must be a CUDA tensor"); - at::TensorArg input_t{input, "input", 1}, rois_t{rois, "rois", 2}; - - at::CheckedFrom c = "ROIAlignRotated_forward_cuda"; - at::checkAllSameGPU(c, {input_t, rois_t}); - at::checkAllSameType(c, {input_t, rois_t}); - at::cuda::CUDAGuard device_guard(input.device()); - - auto num_rois = rois.size(0); - auto channels = input.size(1); - auto height = input.size(2); - auto width = input.size(3); - - auto output = at::empty( - {num_rois, channels, pooled_height, pooled_width}, input.options()); - auto output_size = num_rois * pooled_height * pooled_width * channels; - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - dim3 grid(std::min( - at::cuda::ATenCeilDiv( - static_cast(output_size), static_cast(512)), - static_cast(4096))); - dim3 block(512); - - if (output.numel() == 0) { - AT_CUDA_CHECK(cudaGetLastError()); - return output; - } - - auto input_ = input.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES( - input.scalar_type(), "ROIAlignRotated_forward", [&] { - RoIAlignRotatedForward<<>>( - output_size, - input_.data_ptr(), - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - rois_.data_ptr(), - output.data_ptr()); - }); - cudaDeviceSynchronize(); - AT_CUDA_CHECK(cudaGetLastError()); - return output; -} - -// TODO remove the dependency on input and use instead its sizes -> save memory -at::Tensor ROIAlignRotated_backward_cuda( - const at::Tensor& grad, - const at::Tensor& rois, - const float spatial_scale, - const int pooled_height, - const int pooled_width, - const int batch_size, - const int channels, - const int height, - const int width, - const int sampling_ratio) { - AT_ASSERTM(grad.device().is_cuda(), "grad must be a CUDA tensor"); - AT_ASSERTM(rois.device().is_cuda(), "rois must be a CUDA tensor"); - - at::TensorArg grad_t{grad, "grad", 1}, 
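Unlike the single-threaded CPU backward, the CUDA backward must use `atomicAdd`: different output bins, handled by concurrent threads, can map to the same input pixel. The scatter step above, isolated (hypothetical helper name):

```
__device__ void scatter_grad(float* grad_input, int width,
                             int y_low, int y_high, int x_low, int x_high,
                             float g1, float g2, float g3, float g4) {
  // Four corners of the bilinear footprint receive weighted gradient shares.
  atomicAdd(grad_input + y_low * width + x_low, g1);
  atomicAdd(grad_input + y_low * width + x_high, g2);
  atomicAdd(grad_input + y_high * width + x_low, g3);
  atomicAdd(grad_input + y_high * width + x_high, g4);
}
```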
rois_t{rois, "rois", 2}; - at::CheckedFrom c = "ROIAlign_backward_cuda"; - at::checkAllSameGPU(c, {grad_t, rois_t}); - at::checkAllSameType(c, {grad_t, rois_t}); - at::cuda::CUDAGuard device_guard(grad.device()); - - auto num_rois = rois.size(0); - auto grad_input = - at::zeros({batch_size, channels, height, width}, grad.options()); - - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - dim3 grid(std::min( - at::cuda::ATenCeilDiv( - static_cast(grad.numel()), static_cast(512)), - static_cast(4096))); - dim3 block(512); - - // handle possibly empty gradients - if (grad.numel() == 0) { - AT_CUDA_CHECK(cudaGetLastError()); - return grad_input; - } - - auto grad_ = grad.contiguous(), rois_ = rois.contiguous(); - AT_DISPATCH_FLOATING_TYPES( - grad.scalar_type(), "ROIAlignRotated_backward", [&] { - RoIAlignRotatedBackwardFeature<<>>( - grad.numel(), - grad_.data_ptr(), - num_rois, - spatial_scale, - channels, - height, - width, - pooled_height, - pooled_width, - sampling_ratio, - grad_input.data_ptr(), - rois_.data_ptr()); - }); - AT_CUDA_CHECK(cudaGetLastError()); - return grad_input; -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h deleted file mode 100755 index 3bf383b8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h +++ /dev/null @@ -1,35 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -at::Tensor box_iou_rotated_cpu( - const at::Tensor& boxes1, - const at::Tensor& boxes2); - -#if defined(WITH_CUDA) || defined(WITH_HIP) -at::Tensor box_iou_rotated_cuda( - const at::Tensor& boxes1, - const at::Tensor& boxes2); -#endif - -// Interface for Python -// inline is needed to prevent multiple function definitions when this header is -// included by different cpps -inline at::Tensor box_iou_rotated( - const at::Tensor& boxes1, - const at::Tensor& boxes2) { - assert(boxes1.device().is_cuda() == boxes2.device().is_cuda()); - if (boxes1.device().is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - return box_iou_rotated_cuda(boxes1.contiguous(), boxes2.contiguous()); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - - return box_iou_rotated_cpu(boxes1.contiguous(), boxes2.contiguous()); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp deleted file mode 100755 index c843487b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp +++ /dev/null @@ -1,39 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. 
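The launch configuration above uses 512 threads per block and caps the grid at 4096 blocks; any remaining elements are absorbed by the grid-stride loop. A sketch of that sizing logic, assuming the same constants (illustrative helper, not from the deleted sources):

```
#include <algorithm>
#include <cuda_runtime.h>

dim3 launch_grid(long total_elements) {
  long blocks = (total_elements + 511) / 512;  // ceil-divide, like ATenCeilDiv
  return dim3(static_cast<unsigned int>(std::min(blocks, 4096L)));
}
```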
-#include "box_iou_rotated.h" -#include "box_iou_rotated_utils.h" - -namespace detectron2 { - -template -void box_iou_rotated_cpu_kernel( - const at::Tensor& boxes1, - const at::Tensor& boxes2, - at::Tensor& ious) { - auto num_boxes1 = boxes1.size(0); - auto num_boxes2 = boxes2.size(0); - - for (int i = 0; i < num_boxes1; i++) { - for (int j = 0; j < num_boxes2; j++) { - ious[i * num_boxes2 + j] = single_box_iou_rotated( - boxes1[i].data_ptr(), boxes2[j].data_ptr()); - } - } -} - -at::Tensor box_iou_rotated_cpu( - // input must be contiguous: - const at::Tensor& boxes1, - const at::Tensor& boxes2) { - auto num_boxes1 = boxes1.size(0); - auto num_boxes2 = boxes2.size(0); - at::Tensor ious = - at::empty({num_boxes1 * num_boxes2}, boxes1.options().dtype(at::kFloat)); - - box_iou_rotated_cpu_kernel(boxes1, boxes2, ious); - - // reshape from 1d array to 2d array - auto shape = std::vector{num_boxes1, num_boxes2}; - return ious.reshape(shape); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu deleted file mode 100755 index 952710e5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu +++ /dev/null @@ -1,130 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include -#include "box_iou_rotated_utils.h" - -namespace detectron2 { - -// 2D block with 32 * 16 = 512 threads per block -const int BLOCK_DIM_X = 32; -const int BLOCK_DIM_Y = 16; - -template -__global__ void box_iou_rotated_cuda_kernel( - const int n_boxes1, - const int n_boxes2, - const T* dev_boxes1, - const T* dev_boxes2, - T* dev_ious) { - const int row_start = blockIdx.x * blockDim.x; - const int col_start = blockIdx.y * blockDim.y; - - const int row_size = min(n_boxes1 - row_start, blockDim.x); - const int col_size = min(n_boxes2 - col_start, blockDim.y); - - __shared__ float block_boxes1[BLOCK_DIM_X * 5]; - __shared__ float block_boxes2[BLOCK_DIM_Y * 5]; - - // It's safe to copy using threadIdx.x since BLOCK_DIM_X >= BLOCK_DIM_Y - if (threadIdx.x < row_size && threadIdx.y == 0) { - block_boxes1[threadIdx.x * 5 + 0] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 0]; - block_boxes1[threadIdx.x * 5 + 1] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 1]; - block_boxes1[threadIdx.x * 5 + 2] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 2]; - block_boxes1[threadIdx.x * 5 + 3] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 3]; - block_boxes1[threadIdx.x * 5 + 4] = - dev_boxes1[(row_start + threadIdx.x) * 5 + 4]; - } - - if (threadIdx.x < col_size && threadIdx.y == 0) { - block_boxes2[threadIdx.x * 5 + 0] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 0]; - block_boxes2[threadIdx.x * 5 + 1] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 1]; - block_boxes2[threadIdx.x * 5 + 2] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 2]; - block_boxes2[threadIdx.x * 5 + 3] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 3]; - block_boxes2[threadIdx.x * 5 + 4] = - dev_boxes2[(col_start + threadIdx.x) * 5 + 4]; - } - __syncthreads(); - - if (threadIdx.x < row_size && threadIdx.y < col_size) { - int offset = (row_start + threadIdx.x) * n_boxes2 + col_start + threadIdx.y; - dev_ious[offset] = single_box_iou_rotated( - block_boxes1 + threadIdx.x * 5, block_boxes2 + threadIdx.y * 5); - } -} - -at::Tensor box_iou_rotated_cuda( - // input must be 
contiguous - const at::Tensor& boxes1, - const at::Tensor& boxes2) { - using scalar_t = float; - AT_ASSERTM( - boxes1.scalar_type() == at::kFloat, "boxes1 must be a float tensor"); - AT_ASSERTM( - boxes2.scalar_type() == at::kFloat, "boxes2 must be a float tensor"); - AT_ASSERTM(boxes1.is_cuda(), "boxes1 must be a CUDA tensor"); - AT_ASSERTM(boxes2.is_cuda(), "boxes2 must be a CUDA tensor"); - at::cuda::CUDAGuard device_guard(boxes1.device()); - - auto num_boxes1 = boxes1.size(0); - auto num_boxes2 = boxes2.size(0); - - at::Tensor ious = - at::empty({num_boxes1 * num_boxes2}, boxes1.options().dtype(at::kFloat)); - - bool transpose = false; - if (num_boxes1 > 0 && num_boxes2 > 0) { - scalar_t *data1 = boxes1.data_ptr(), - *data2 = boxes2.data_ptr(); - - if (num_boxes2 > 65535 * BLOCK_DIM_Y) { - AT_ASSERTM( - num_boxes1 <= 65535 * BLOCK_DIM_Y, - "Too many boxes for box_iou_rotated_cuda!"); - // x dim is allowed to be large, but y dim cannot, - // so we transpose the two to avoid "invalid configuration argument" - // error. We assume one of them is small. Otherwise the result is hard to - // fit in memory anyway. - std::swap(num_boxes1, num_boxes2); - std::swap(data1, data2); - transpose = true; - } - - const int blocks_x = - at::cuda::ATenCeilDiv(static_cast(num_boxes1), BLOCK_DIM_X); - const int blocks_y = - at::cuda::ATenCeilDiv(static_cast(num_boxes2), BLOCK_DIM_Y); - - dim3 blocks(blocks_x, blocks_y); - dim3 threads(BLOCK_DIM_X, BLOCK_DIM_Y); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - box_iou_rotated_cuda_kernel<<>>( - num_boxes1, - num_boxes2, - data1, - data2, - (scalar_t*)ious.data_ptr()); - - AT_CUDA_CHECK(cudaGetLastError()); - } - - // reshape from 1d array to 2d array - auto shape = std::vector{num_boxes1, num_boxes2}; - if (transpose) { - return ious.view(shape).t(); - } else { - return ious.view(shape); - } -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h deleted file mode 100755 index b54a5dde..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h +++ /dev/null @@ -1,370 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once - -#include -#include - -#if defined(__CUDACC__) || __HCC__ == 1 || __HIP__ == 1 -// Designates functions callable from the host (CPU) and the device (GPU) -#define HOST_DEVICE __host__ __device__ -#define HOST_DEVICE_INLINE HOST_DEVICE __forceinline__ -#else -#include -#define HOST_DEVICE -#define HOST_DEVICE_INLINE HOST_DEVICE inline -#endif - -namespace detectron2 { - -namespace { - -template -struct RotatedBox { - T x_ctr, y_ctr, w, h, a; -}; - -template -struct Point { - T x, y; - HOST_DEVICE_INLINE Point(const T& px = 0, const T& py = 0) : x(px), y(py) {} - HOST_DEVICE_INLINE Point operator+(const Point& p) const { - return Point(x + p.x, y + p.y); - } - HOST_DEVICE_INLINE Point& operator+=(const Point& p) { - x += p.x; - y += p.y; - return *this; - } - HOST_DEVICE_INLINE Point operator-(const Point& p) const { - return Point(x - p.x, y - p.y); - } - HOST_DEVICE_INLINE Point operator*(const T coeff) const { - return Point(x * coeff, y * coeff); - } -}; - -template -HOST_DEVICE_INLINE T dot_2d(const Point& A, const Point& B) { - return A.x * B.x + A.y * B.y; -} - -// R: result type. 
can be different from input type -template -HOST_DEVICE_INLINE R cross_2d(const Point& A, const Point& B) { - return static_cast(A.x) * static_cast(B.y) - - static_cast(B.x) * static_cast(A.y); -} - -template -HOST_DEVICE_INLINE void get_rotated_vertices( - const RotatedBox& box, - Point (&pts)[4]) { - // M_PI / 180. == 0.01745329251 - double theta = box.a * 0.01745329251; - T cosTheta2 = (T)cos(theta) * 0.5f; - T sinTheta2 = (T)sin(theta) * 0.5f; - - // y: top --> down; x: left --> right - pts[0].x = box.x_ctr + sinTheta2 * box.h + cosTheta2 * box.w; - pts[0].y = box.y_ctr + cosTheta2 * box.h - sinTheta2 * box.w; - pts[1].x = box.x_ctr - sinTheta2 * box.h + cosTheta2 * box.w; - pts[1].y = box.y_ctr - cosTheta2 * box.h - sinTheta2 * box.w; - pts[2].x = 2 * box.x_ctr - pts[0].x; - pts[2].y = 2 * box.y_ctr - pts[0].y; - pts[3].x = 2 * box.x_ctr - pts[1].x; - pts[3].y = 2 * box.y_ctr - pts[1].y; -} - -template -HOST_DEVICE_INLINE int get_intersection_points( - const Point (&pts1)[4], - const Point (&pts2)[4], - Point (&intersections)[24]) { - // Line vector - // A line from p1 to p2 is: p1 + (p2-p1)*t, t=[0,1] - Point vec1[4], vec2[4]; - for (int i = 0; i < 4; i++) { - vec1[i] = pts1[(i + 1) % 4] - pts1[i]; - vec2[i] = pts2[(i + 1) % 4] - pts2[i]; - } - - // When computing the intersection area, it doesn't hurt if we have - // more (duplicated/approximate) intersections/vertices than needed, - // while it can cause drastic difference if we miss an intersection/vertex. - // Therefore, we add an epsilon to relax the comparisons between - // the float point numbers that decide the intersection points. - double EPS = 1e-5; - - // Line test - test all line combos for intersection - int num = 0; // number of intersections - for (int i = 0; i < 4; i++) { - for (int j = 0; j < 4; j++) { - // Solve for 2x2 Ax=b - T det = cross_2d(vec2[j], vec1[i]); - - // This takes care of parallel lines - if (fabs(det) <= 1e-14) { - continue; - } - - auto vec12 = pts2[j] - pts1[i]; - - T t1 = cross_2d(vec2[j], vec12) / det; - T t2 = cross_2d(vec1[i], vec12) / det; - - if (t1 > -EPS && t1 < 1.0f + EPS && t2 > -EPS && t2 < 1.0f + EPS) { - intersections[num++] = pts1[i] + vec1[i] * t1; - } - } - } - - // Check for vertices of rect1 inside rect2 - { - const auto& AB = vec2[0]; - const auto& DA = vec2[3]; - auto ABdotAB = dot_2d(AB, AB); - auto ADdotAD = dot_2d(DA, DA); - for (int i = 0; i < 4; i++) { - // assume ABCD is the rectangle, and P is the point to be judged - // P is inside ABCD iff. 
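`get_rotated_vertices` above derives the four corners from the `(center, width, height, angle)` parametrization by rotating the half-extents and exploiting point symmetry about the centre. A standalone recreation (illustrative names; image-style axes with y pointing down, angle in degrees, as in the deleted utils):

```
#include <cmath>

struct Pt { float x, y; };

void rotated_corners(float cx, float cy, float w, float h, float angle_deg,
                     Pt (&pts)[4]) {
  const double kPi = 3.14159265358979323846;
  double theta = angle_deg * kPi / 180.0;
  float c2 = (float)std::cos(theta) * 0.5f;  // cos(theta) / 2
  float s2 = (float)std::sin(theta) * 0.5f;  // sin(theta) / 2
  pts[0] = {cx + s2 * h + c2 * w, cy + c2 * h - s2 * w};
  pts[1] = {cx - s2 * h + c2 * w, cy - c2 * h - s2 * w};
  pts[2] = {2 * cx - pts[0].x, 2 * cy - pts[0].y};  // opposite corner of pts[0]
  pts[3] = {2 * cx - pts[1].x, 2 * cy - pts[1].y};  // opposite corner of pts[1]
}
```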
P's projection on AB lies within AB - // and P's projection on AD lies within AD - - auto AP = pts1[i] - pts2[0]; - - auto APdotAB = dot_2d(AP, AB); - auto APdotAD = -dot_2d(AP, DA); - - if ((APdotAB > -EPS) && (APdotAD > -EPS) && (APdotAB < ABdotAB + EPS) && - (APdotAD < ADdotAD + EPS)) { - intersections[num++] = pts1[i]; - } - } - } - - // Reverse the check - check for vertices of rect2 inside rect1 - { - const auto& AB = vec1[0]; - const auto& DA = vec1[3]; - auto ABdotAB = dot_2d(AB, AB); - auto ADdotAD = dot_2d(DA, DA); - for (int i = 0; i < 4; i++) { - auto AP = pts2[i] - pts1[0]; - - auto APdotAB = dot_2d(AP, AB); - auto APdotAD = -dot_2d(AP, DA); - - if ((APdotAB > -EPS) && (APdotAD > -EPS) && (APdotAB < ABdotAB + EPS) && - (APdotAD < ADdotAD + EPS)) { - intersections[num++] = pts2[i]; - } - } - } - - return num; -} - -template -HOST_DEVICE_INLINE int convex_hull_graham( - const Point (&p)[24], - const int& num_in, - Point (&q)[24], - bool shift_to_zero = false) { - assert(num_in >= 2); - - // Step 1: - // Find point with minimum y - // if more than 1 points have the same minimum y, - // pick the one with the minimum x. - int t = 0; - for (int i = 1; i < num_in; i++) { - if (p[i].y < p[t].y || (p[i].y == p[t].y && p[i].x < p[t].x)) { - t = i; - } - } - auto& start = p[t]; // starting point - - // Step 2: - // Subtract starting point from every points (for sorting in the next step) - for (int i = 0; i < num_in; i++) { - q[i] = p[i] - start; - } - - // Swap the starting point to position 0 - auto tmp = q[0]; - q[0] = q[t]; - q[t] = tmp; - - // Step 3: - // Sort point 1 ~ num_in according to their relative cross-product values - // (essentially sorting according to angles) - // If the angles are the same, sort according to their distance to origin - T dist[24]; -#if defined(__CUDACC__) || __HCC__ == 1 || __HIP__ == 1 - // compute distance to origin before sort, and sort them together with the - // points - for (int i = 0; i < num_in; i++) { - dist[i] = dot_2d(q[i], q[i]); - } - - // CUDA version - // In the future, we can potentially use thrust - // for sorting here to improve speed (though not guaranteed) - for (int i = 1; i < num_in - 1; i++) { - for (int j = i + 1; j < num_in; j++) { - T crossProduct = cross_2d(q[i], q[j]); - if ((crossProduct < -1e-6) || - (fabs(crossProduct) < 1e-6 && dist[i] > dist[j])) { - auto q_tmp = q[i]; - q[i] = q[j]; - q[j] = q_tmp; - auto dist_tmp = dist[i]; - dist[i] = dist[j]; - dist[j] = dist_tmp; - } - } - } -#else - // CPU version - std::sort( - q + 1, q + num_in, [](const Point& A, const Point& B) -> bool { - T temp = cross_2d(A, B); - if (fabs(temp) < 1e-6) { - return dot_2d(A, A) < dot_2d(B, B); - } else { - return temp > 0; - } - }); - // compute distance to origin after sort, since the points are now different. - for (int i = 0; i < num_in; i++) { - dist[i] = dot_2d(q[i], q[i]); - } -#endif - - // Step 4: - // Make sure there are at least 2 points (that don't overlap with each other) - // in the stack - int k; // index of the non-overlapped second point - for (k = 1; k < num_in; k++) { - if (dist[k] > 1e-8) { - break; - } - } - if (k == num_in) { - // We reach the end, which means the convex hull is just one point - q[0] = p[t]; - return 1; - } - q[1] = q[k]; - int m = 2; // 2 points in the stack - // Step 5: - // Finally we can start the scanning process. 
- // When a non-convex relationship between the 3 points is found - // (either concave shape or duplicated points), - // we pop the previous point from the stack - // until the 3-point relationship is convex again, or - // until the stack only contains two points - for (int i = k + 1; i < num_in; i++) { - while (m > 1) { - auto q1 = q[i] - q[m - 2], q2 = q[m - 1] - q[m - 2]; - // cross_2d() uses FMA and therefore computes round(round(q1.x*q2.y) - - // q2.x*q1.y) So it may not return 0 even when q1==q2. Therefore we - // compare round(q1.x*q2.y) and round(q2.x*q1.y) directly. (round means - // round to nearest floating point). - if (q1.x * q2.y >= q2.x * q1.y) - m--; - else - break; - } - // Using double also helps, but float can solve the issue for now. - // while (m > 1 && cross_2d(q[i] - q[m - 2], q[m - 1] - q[m - 2]) - // >= 0) { - // m--; - // } - q[m++] = q[i]; - } - - // Step 6 (Optional): - // In general sense we need the original coordinates, so we - // need to shift the points back (reverting Step 2) - // But if we're only interested in getting the area/perimeter of the shape - // We can simply return. - if (!shift_to_zero) { - for (int i = 0; i < m; i++) { - q[i] += start; - } - } - - return m; -} - -template -HOST_DEVICE_INLINE T polygon_area(const Point (&q)[24], const int& m) { - if (m <= 2) { - return 0; - } - - T area = 0; - for (int i = 1; i < m - 1; i++) { - area += fabs(cross_2d(q[i] - q[0], q[i + 1] - q[0])); - } - - return area / 2.0; -} - -template -HOST_DEVICE_INLINE T rotated_boxes_intersection( - const RotatedBox& box1, - const RotatedBox& box2) { - // There are up to 4 x 4 + 4 + 4 = 24 intersections (including dups) returned - // from rotated_rect_intersection_pts - Point intersectPts[24], orderedPts[24]; - - Point pts1[4]; - Point pts2[4]; - get_rotated_vertices(box1, pts1); - get_rotated_vertices(box2, pts2); - - int num = get_intersection_points(pts1, pts2, intersectPts); - - if (num <= 2) { - return 0.0; - } - - // Convex Hull to order the intersection points in clockwise order and find - // the contour area. - int num_convex = convex_hull_graham(intersectPts, num, orderedPts, true); - return polygon_area(orderedPts, num_convex); -} - -} // namespace - -template -HOST_DEVICE_INLINE T -single_box_iou_rotated(T const* const box1_raw, T const* const box2_raw) { - // shift center to the middle point to achieve higher precision in result - RotatedBox box1, box2; - auto center_shift_x = (box1_raw[0] + box2_raw[0]) / 2.0; - auto center_shift_y = (box1_raw[1] + box2_raw[1]) / 2.0; - box1.x_ctr = box1_raw[0] - center_shift_x; - box1.y_ctr = box1_raw[1] - center_shift_y; - box1.w = box1_raw[2]; - box1.h = box1_raw[3]; - box1.a = box1_raw[4]; - box2.x_ctr = box2_raw[0] - center_shift_x; - box2.y_ctr = box2_raw[1] - center_shift_y; - box2.w = box2_raw[2]; - box2.h = box2_raw[3]; - box2.a = box2_raw[4]; - - T area1 = box1.w * box1.h; - T area2 = box2.w * box2.h; - if (area1 < 1e-14 || area2 < 1e-14) { - return 0.f; - } - - T intersection = rotated_boxes_intersection(box1, box2); - T iou = intersection / (area1 + area2 - intersection); - return iou; -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp deleted file mode 100755 index 0a5b7b90..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.cpp +++ /dev/null @@ -1,507 +0,0 @@ -// Copyright (c) Facebook, Inc. 
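After the intersection polygon's area is known, `single_box_iou_rotated` above reduces to the standard IoU formula (with the centre-shift trick only improving numerical precision of the intersection step). A sketch of the final reduction, including the degenerate-box guard from the deleted code:

```
float iou_from_areas(float area1, float area2, float intersection) {
  if (area1 < 1e-14f || area2 < 1e-14f) return 0.0f;  // degenerate boxes
  return intersection / (area1 + area2 - intersection);
}
```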
and its affiliates. -#include "cocoeval.h" -#include -#include -#include -#include - -using namespace pybind11::literals; - -namespace detectron2 { - -namespace COCOeval { - -// Sort detections from highest score to lowest, such that -// detection_instances[detection_sorted_indices[t]] >= -// detection_instances[detection_sorted_indices[t+1]]. Use stable_sort to match -// original COCO API -void SortInstancesByDetectionScore( - const std::vector& detection_instances, - std::vector* detection_sorted_indices) { - detection_sorted_indices->resize(detection_instances.size()); - std::iota( - detection_sorted_indices->begin(), detection_sorted_indices->end(), 0); - std::stable_sort( - detection_sorted_indices->begin(), - detection_sorted_indices->end(), - [&detection_instances](size_t j1, size_t j2) { - return detection_instances[j1].score > detection_instances[j2].score; - }); -} - -// Partition the ground truth objects based on whether or not to ignore them -// based on area -void SortInstancesByIgnore( - const std::array& area_range, - const std::vector& ground_truth_instances, - std::vector* ground_truth_sorted_indices, - std::vector* ignores) { - ignores->clear(); - ignores->reserve(ground_truth_instances.size()); - for (auto o : ground_truth_instances) { - ignores->push_back( - o.ignore || o.area < area_range[0] || o.area > area_range[1]); - } - - ground_truth_sorted_indices->resize(ground_truth_instances.size()); - std::iota( - ground_truth_sorted_indices->begin(), - ground_truth_sorted_indices->end(), - 0); - std::stable_sort( - ground_truth_sorted_indices->begin(), - ground_truth_sorted_indices->end(), - [&ignores](size_t j1, size_t j2) { - return (int)(*ignores)[j1] < (int)(*ignores)[j2]; - }); -} - -// For each IOU threshold, greedily match each detected instance to a ground -// truth instance (if possible) and store the results -void MatchDetectionsToGroundTruth( - const std::vector& detection_instances, - const std::vector& detection_sorted_indices, - const std::vector& ground_truth_instances, - const std::vector& ground_truth_sorted_indices, - const std::vector& ignores, - const std::vector>& ious, - const std::vector& iou_thresholds, - const std::array& area_range, - ImageEvaluation* results) { - // Initialize memory to store return data matches and ignore - const int num_iou_thresholds = iou_thresholds.size(); - const int num_ground_truth = ground_truth_sorted_indices.size(); - const int num_detections = detection_sorted_indices.size(); - std::vector ground_truth_matches( - num_iou_thresholds * num_ground_truth, 0); - std::vector& detection_matches = results->detection_matches; - std::vector& detection_ignores = results->detection_ignores; - std::vector& ground_truth_ignores = results->ground_truth_ignores; - detection_matches.resize(num_iou_thresholds * num_detections, 0); - detection_ignores.resize(num_iou_thresholds * num_detections, false); - ground_truth_ignores.resize(num_ground_truth); - for (auto g = 0; g < num_ground_truth; ++g) { - ground_truth_ignores[g] = ignores[ground_truth_sorted_indices[g]]; - } - - for (auto t = 0; t < num_iou_thresholds; ++t) { - for (auto d = 0; d < num_detections; ++d) { - // information about best match so far (match=-1 -> unmatched) - double best_iou = std::min(iou_thresholds[t], 1 - 1e-10); - int match = -1; - for (auto g = 0; g < num_ground_truth; ++g) { - // if this ground truth instance is already matched and not a - // crowd, it cannot be matched to another detection - if (ground_truth_matches[t * num_ground_truth + g] > 0 && - 
!ground_truth_instances[ground_truth_sorted_indices[g]].is_crowd) { - continue; - } - - // if detected instance matched to a regular ground truth - // instance, we can break on the first ground truth instance - // tagged as ignore (because they are sorted by the ignore tag) - if (match >= 0 && !ground_truth_ignores[match] && - ground_truth_ignores[g]) { - break; - } - - // if IOU overlap is the best so far, store the match appropriately - if (ious[d][ground_truth_sorted_indices[g]] >= best_iou) { - best_iou = ious[d][ground_truth_sorted_indices[g]]; - match = g; - } - } - // if match was made, store id of match for both detection and - // ground truth - if (match >= 0) { - detection_ignores[t * num_detections + d] = ground_truth_ignores[match]; - detection_matches[t * num_detections + d] = - ground_truth_instances[ground_truth_sorted_indices[match]].id; - ground_truth_matches[t * num_ground_truth + match] = - detection_instances[detection_sorted_indices[d]].id; - } - - // set unmatched detections outside of area range to ignore - const InstanceAnnotation& detection = - detection_instances[detection_sorted_indices[d]]; - detection_ignores[t * num_detections + d] = - detection_ignores[t * num_detections + d] || - (detection_matches[t * num_detections + d] == 0 && - (detection.area < area_range[0] || detection.area > area_range[1])); - } - } - - // store detection score results - results->detection_scores.resize(detection_sorted_indices.size()); - for (size_t d = 0; d < detection_sorted_indices.size(); ++d) { - results->detection_scores[d] = - detection_instances[detection_sorted_indices[d]].score; - } -} - -std::vector EvaluateImages( - const std::vector>& area_ranges, - int max_detections, - const std::vector& iou_thresholds, - const ImageCategoryInstances>& image_category_ious, - const ImageCategoryInstances& - image_category_ground_truth_instances, - const ImageCategoryInstances& - image_category_detection_instances) { - const int num_area_ranges = area_ranges.size(); - const int num_images = image_category_ground_truth_instances.size(); - const int num_categories = - image_category_ious.size() > 0 ? image_category_ious[0].size() : 0; - std::vector detection_sorted_indices; - std::vector ground_truth_sorted_indices; - std::vector ignores; - std::vector results_all( - num_images * num_area_ranges * num_categories); - - // Store results for each image, category, and area range combination. 
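The matching loop above is a greedy assignment: detections, visited in descending score order, each claim the still-unmatched ground truth with the highest IoU at or above the threshold. A simplified sketch with the crowd/ignore handling stripped out (illustrative, not the full deleted logic):

```
#include <vector>

// ious[d][g]: IoU of detection d (pre-sorted by descending score) and GT g.
std::vector<int> greedy_match(const std::vector<std::vector<double>>& ious,
                              double iou_threshold) {
  const int num_det = (int)ious.size();
  const int num_gt = num_det > 0 ? (int)ious[0].size() : 0;
  std::vector<int> match(num_det, -1);
  std::vector<bool> taken(num_gt, false);
  for (int d = 0; d < num_det; ++d) {
    double best = iou_threshold;
    for (int g = 0; g < num_gt; ++g) {
      if (taken[g] || ious[d][g] < best) continue;
      best = ious[d][g];  // ties resolved toward the later candidate,
      match[d] = g;       // matching the >= comparison in the original
    }
    if (match[d] >= 0) taken[match[d]] = true;
  }
  return match;
}
```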
Results - // for each IOU threshold are packed into the same ImageEvaluation object - for (auto i = 0; i < num_images; ++i) { - for (auto c = 0; c < num_categories; ++c) { - const std::vector& ground_truth_instances = - image_category_ground_truth_instances[i][c]; - const std::vector& detection_instances = - image_category_detection_instances[i][c]; - - SortInstancesByDetectionScore( - detection_instances, &detection_sorted_indices); - if ((int)detection_sorted_indices.size() > max_detections) { - detection_sorted_indices.resize(max_detections); - } - - for (size_t a = 0; a < area_ranges.size(); ++a) { - SortInstancesByIgnore( - area_ranges[a], - ground_truth_instances, - &ground_truth_sorted_indices, - &ignores); - - MatchDetectionsToGroundTruth( - detection_instances, - detection_sorted_indices, - ground_truth_instances, - ground_truth_sorted_indices, - ignores, - image_category_ious[i][c], - iou_thresholds, - area_ranges[a], - &results_all - [c * num_area_ranges * num_images + a * num_images + i]); - } - } - } - - return results_all; -} - -// Convert a python list to a vector -template -std::vector list_to_vec(const py::list& l) { - std::vector v(py::len(l)); - for (int i = 0; i < (int)py::len(l); ++i) { - v[i] = l[i].cast(); - } - return v; -} - -// Helper function to Accumulate() -// Considers the evaluation results applicable to a particular category, area -// range, and max_detections parameter setting, which begin at -// evaluations[evaluation_index]. Extracts a sorted list of length n of all -// applicable detection instances concatenated across all images in the dataset, -// which are represented by the outputs evaluation_indices, detection_scores, -// image_detection_indices, and detection_sorted_indices--all of which are -// length n. evaluation_indices[i] stores the applicable index into -// evaluations[] for instance i, which has detection score detection_score[i], -// and is the image_detection_indices[i]'th of the list of detections -// for the image containing i. 
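The result vector produced by `EvaluateImages` holds one `ImageEvaluation` per (category, area range, image) triple, with categories outermost and images innermost, as the indexing expression above shows. Isolated as a sketch (hypothetical helper name):

```
inline int evaluation_index(int c, int a, int i,
                            int num_area_ranges, int num_images) {
  return c * num_area_ranges * num_images + a * num_images + i;
}
```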
detection_sorted_indices[] defines a sorted -// permutation of the 3 other outputs -int BuildSortedDetectionList( - const std::vector& evaluations, - const int64_t evaluation_index, - const int64_t num_images, - const int max_detections, - std::vector* evaluation_indices, - std::vector* detection_scores, - std::vector* detection_sorted_indices, - std::vector* image_detection_indices) { - assert(evaluations.size() >= evaluation_index + num_images); - - // Extract a list of object instances of the applicable category, area - // range, and max detections requirements such that they can be sorted - image_detection_indices->clear(); - evaluation_indices->clear(); - detection_scores->clear(); - image_detection_indices->reserve(num_images * max_detections); - evaluation_indices->reserve(num_images * max_detections); - detection_scores->reserve(num_images * max_detections); - int num_valid_ground_truth = 0; - for (auto i = 0; i < num_images; ++i) { - const ImageEvaluation& evaluation = evaluations[evaluation_index + i]; - - for (int d = 0; - d < (int)evaluation.detection_scores.size() && d < max_detections; - ++d) { // detected instances - evaluation_indices->push_back(evaluation_index + i); - image_detection_indices->push_back(d); - detection_scores->push_back(evaluation.detection_scores[d]); - } - for (auto ground_truth_ignore : evaluation.ground_truth_ignores) { - if (!ground_truth_ignore) { - ++num_valid_ground_truth; - } - } - } - - // Sort detections by decreasing score, using stable sort to match - // python implementation - detection_sorted_indices->resize(detection_scores->size()); - std::iota( - detection_sorted_indices->begin(), detection_sorted_indices->end(), 0); - std::stable_sort( - detection_sorted_indices->begin(), - detection_sorted_indices->end(), - [&detection_scores](size_t j1, size_t j2) { - return (*detection_scores)[j1] > (*detection_scores)[j2]; - }); - - return num_valid_ground_truth; -} - -// Helper function to Accumulate() -// Compute a precision recall curve given a sorted list of detected instances -// encoded in evaluations, evaluation_indices, detection_scores, -// detection_sorted_indices, image_detection_indices (see -// BuildSortedDetectionList()). Using vectors precisions and recalls -// and temporary storage, output the results into precisions_out, recalls_out, -// and scores_out, which are large buffers containing many precion/recall curves -// for all possible parameter settings, with precisions_out_index and -// recalls_out_index defining the applicable indices to store results. 
-void ComputePrecisionRecallCurve( - const int64_t precisions_out_index, - const int64_t precisions_out_stride, - const int64_t recalls_out_index, - const std::vector& recall_thresholds, - const int iou_threshold_index, - const int num_iou_thresholds, - const int num_valid_ground_truth, - const std::vector& evaluations, - const std::vector& evaluation_indices, - const std::vector& detection_scores, - const std::vector& detection_sorted_indices, - const std::vector& image_detection_indices, - std::vector* precisions, - std::vector* recalls, - std::vector* precisions_out, - std::vector* scores_out, - std::vector* recalls_out) { - assert(recalls_out->size() > recalls_out_index); - - // Compute precision/recall for each instance in the sorted list of detections - int64_t true_positives_sum = 0, false_positives_sum = 0; - precisions->clear(); - recalls->clear(); - precisions->reserve(detection_sorted_indices.size()); - recalls->reserve(detection_sorted_indices.size()); - assert(!evaluations.empty() || detection_sorted_indices.empty()); - for (auto detection_sorted_index : detection_sorted_indices) { - const ImageEvaluation& evaluation = - evaluations[evaluation_indices[detection_sorted_index]]; - const auto num_detections = - evaluation.detection_matches.size() / num_iou_thresholds; - const auto detection_index = iou_threshold_index * num_detections + - image_detection_indices[detection_sorted_index]; - assert(evaluation.detection_matches.size() > detection_index); - assert(evaluation.detection_ignores.size() > detection_index); - const int64_t detection_match = - evaluation.detection_matches[detection_index]; - const bool detection_ignores = - evaluation.detection_ignores[detection_index]; - const auto true_positive = detection_match > 0 && !detection_ignores; - const auto false_positive = detection_match == 0 && !detection_ignores; - if (true_positive) { - ++true_positives_sum; - } - if (false_positive) { - ++false_positives_sum; - } - - const double recall = - static_cast(true_positives_sum) / num_valid_ground_truth; - recalls->push_back(recall); - const int64_t num_valid_detections = - true_positives_sum + false_positives_sum; - const double precision = num_valid_detections > 0 - ? static_cast(true_positives_sum) / num_valid_detections - : 0.0; - precisions->push_back(precision); - } - - (*recalls_out)[recalls_out_index] = !recalls->empty() ? 
recalls->back() : 0; - - for (int64_t i = static_cast(precisions->size()) - 1; i > 0; --i) { - if ((*precisions)[i] > (*precisions)[i - 1]) { - (*precisions)[i - 1] = (*precisions)[i]; - } - } - - // Sample the per instance precision/recall list at each recall threshold - for (size_t r = 0; r < recall_thresholds.size(); ++r) { - // first index in recalls >= recall_thresholds[r] - std::vector::iterator low = std::lower_bound( - recalls->begin(), recalls->end(), recall_thresholds[r]); - size_t precisions_index = low - recalls->begin(); - - const auto results_ind = precisions_out_index + r * precisions_out_stride; - assert(results_ind < precisions_out->size()); - assert(results_ind < scores_out->size()); - if (precisions_index < precisions->size()) { - (*precisions_out)[results_ind] = (*precisions)[precisions_index]; - (*scores_out)[results_ind] = - detection_scores[detection_sorted_indices[precisions_index]]; - } else { - (*precisions_out)[results_ind] = 0; - (*scores_out)[results_ind] = 0; - } - } -} -py::dict Accumulate( - const py::object& params, - const std::vector& evaluations) { - const std::vector recall_thresholds = - list_to_vec(params.attr("recThrs")); - const std::vector max_detections = - list_to_vec(params.attr("maxDets")); - const int num_iou_thresholds = py::len(params.attr("iouThrs")); - const int num_recall_thresholds = py::len(params.attr("recThrs")); - const int num_categories = params.attr("useCats").cast() == 1 - ? py::len(params.attr("catIds")) - : 1; - const int num_area_ranges = py::len(params.attr("areaRng")); - const int num_max_detections = py::len(params.attr("maxDets")); - const int num_images = py::len(params.attr("imgIds")); - - std::vector precisions_out( - num_iou_thresholds * num_recall_thresholds * num_categories * - num_area_ranges * num_max_detections, - -1); - std::vector recalls_out( - num_iou_thresholds * num_categories * num_area_ranges * - num_max_detections, - -1); - std::vector scores_out( - num_iou_thresholds * num_recall_thresholds * num_categories * - num_area_ranges * num_max_detections, - -1); - - // Consider the list of all detected instances in the entire dataset in one - // large list. evaluation_indices, detection_scores, - // image_detection_indices, and detection_sorted_indices all have the same - // length as this list, such that each entry corresponds to one detected - // instance - std::vector evaluation_indices; // indices into evaluations[] - std::vector detection_scores; // detection scores of each instance - std::vector detection_sorted_indices; // sorted indices of all - // instances in the dataset - std::vector - image_detection_indices; // indices into the list of detected instances in - // the same image as each instance - std::vector precisions, recalls; - - for (auto c = 0; c < num_categories; ++c) { - for (auto a = 0; a < num_area_ranges; ++a) { - for (auto m = 0; m < num_max_detections; ++m) { - // The COCO PythonAPI assumes evaluations[] (the return value of - // COCOeval::EvaluateImages() is one long list storing results for each - // combination of category, area range, and image id, with categories in - // the outermost loop and images in the innermost loop. 
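The two post-processing steps above, in isolation: a right-to-left running max makes the precision curve monotonically non-increasing, and the curve is then sampled at fixed recall thresholds via `lower_bound`. A self-contained sketch (illustrative function name):

```
#include <algorithm>
#include <vector>

std::vector<double> envelope_and_sample(std::vector<double> precisions,
                                        const std::vector<double>& recalls,
                                        const std::vector<double>& thresholds) {
  // (1) precision envelope: right-to-left running maximum
  for (int i = (int)precisions.size() - 1; i > 0; --i)
    precisions[i - 1] = std::max(precisions[i - 1], precisions[i]);
  // (2) sample at each recall threshold; 0 past the end of the curve
  std::vector<double> sampled(thresholds.size(), 0.0);
  for (size_t r = 0; r < thresholds.size(); ++r) {
    auto low = std::lower_bound(recalls.begin(), recalls.end(), thresholds[r]);
    size_t idx = (size_t)(low - recalls.begin());
    if (idx < precisions.size()) sampled[r] = precisions[idx];
  }
  return sampled;
}
```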
- const int64_t evaluations_index = - c * num_area_ranges * num_images + a * num_images; - int num_valid_ground_truth = BuildSortedDetectionList( - evaluations, - evaluations_index, - num_images, - max_detections[m], - &evaluation_indices, - &detection_scores, - &detection_sorted_indices, - &image_detection_indices); - - if (num_valid_ground_truth == 0) { - continue; - } - - for (auto t = 0; t < num_iou_thresholds; ++t) { - // recalls_out is a flattened vectors representing a - // num_iou_thresholds X num_categories X num_area_ranges X - // num_max_detections matrix - const int64_t recalls_out_index = - t * num_categories * num_area_ranges * num_max_detections + - c * num_area_ranges * num_max_detections + - a * num_max_detections + m; - - // precisions_out and scores_out are flattened vectors - // representing a num_iou_thresholds X num_recall_thresholds X - // num_categories X num_area_ranges X num_max_detections matrix - const int64_t precisions_out_stride = - num_categories * num_area_ranges * num_max_detections; - const int64_t precisions_out_index = t * num_recall_thresholds * - num_categories * num_area_ranges * num_max_detections + - c * num_area_ranges * num_max_detections + - a * num_max_detections + m; - - ComputePrecisionRecallCurve( - precisions_out_index, - precisions_out_stride, - recalls_out_index, - recall_thresholds, - t, - num_iou_thresholds, - num_valid_ground_truth, - evaluations, - evaluation_indices, - detection_scores, - detection_sorted_indices, - image_detection_indices, - &precisions, - &recalls, - &precisions_out, - &scores_out, - &recalls_out); - } - } - } - } - - time_t rawtime; - struct tm local_time; - std::array buffer; - time(&rawtime); -#ifdef _WIN32 - localtime_s(&local_time, &rawtime); -#else - localtime_r(&rawtime, &local_time); -#endif - strftime( - buffer.data(), 200, "%Y-%m-%d %H:%num_max_detections:%S", &local_time); - return py::dict( - "params"_a = params, - "counts"_a = std::vector( - {num_iou_thresholds, - num_recall_thresholds, - num_categories, - num_area_ranges, - num_max_detections}), - "date"_a = buffer, - "precision"_a = precisions_out, - "recall"_a = recalls_out, - "scores"_a = scores_out); -} - -} // namespace COCOeval - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h deleted file mode 100755 index db246e49..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cocoeval/cocoeval.h +++ /dev/null @@ -1,88 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once - -#include -#include -#include -#include -#include - -namespace py = pybind11; - -namespace detectron2 { - -namespace COCOeval { - -// Annotation data for a single object instance in an image -struct InstanceAnnotation { - InstanceAnnotation( - uint64_t id, - double score, - double area, - bool is_crowd, - bool ignore) - : id{id}, score{score}, area{area}, is_crowd{is_crowd}, ignore{ignore} {} - uint64_t id; - double score = 0.; - double area = 0.; - bool is_crowd = false; - bool ignore = false; -}; - -// Stores intermediate results for evaluating detection results for a single -// image that has D detected instances and G ground truth instances. 
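The timestamp format string in the deleted `Accumulate()` above, `"%Y-%m-%d %H:%num_max_detections:%S"`, appears to be a rename artifact: the minutes specifier `%M` was evidently caught by a variable rename to `num_max_detections`. A corrected sketch, assuming a standard timestamp was intended:

```
#include <array>
#include <ctime>
#include <string>

std::string timestamp_now() {
  std::array<char, 200> buffer{};
  std::time_t rawtime = std::time(nullptr);
  std::tm local_time{};
  localtime_r(&rawtime, &local_time);  // POSIX; the deleted code used localtime_s on _WIN32
  std::strftime(buffer.data(), buffer.size(), "%Y-%m-%d %H:%M:%S", &local_time);
  return std::string(buffer.data());
}
```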
This stores -// matches between detected and ground truth instances -struct ImageEvaluation { - // For each of the D detected instances, the id of the matched ground truth - // instance, or 0 if unmatched - std::vector detection_matches; - - // The detection score of each of the D detected instances - std::vector detection_scores; - - // Marks whether or not each of G instances was ignored from evaluation (e.g., - // because it's outside area_range) - std::vector ground_truth_ignores; - - // Marks whether or not each of D instances was ignored from evaluation (e.g., - // because it's outside aRng) - std::vector detection_ignores; -}; - -template -using ImageCategoryInstances = std::vector>>; - -// C++ implementation of COCO API cocoeval.py::COCOeval.evaluateImg(). For each -// combination of image, category, area range settings, and IOU thresholds to -// evaluate, it matches detected instances to ground truth instances and stores -// the results into a vector of ImageEvaluation results, which will be -// interpreted by the COCOeval::Accumulate() function to produce precion-recall -// curves. The parameters of nested vectors have the following semantics: -// image_category_ious[i][c][d][g] is the intersection over union of the d'th -// detected instance and g'th ground truth instance of -// category category_ids[c] in image image_ids[i] -// image_category_ground_truth_instances[i][c] is a vector of ground truth -// instances in image image_ids[i] of category category_ids[c] -// image_category_detection_instances[i][c] is a vector of detected -// instances in image image_ids[i] of category category_ids[c] -std::vector EvaluateImages( - const std::vector>& area_ranges, // vector of 2-tuples - int max_detections, - const std::vector& iou_thresholds, - const ImageCategoryInstances>& image_category_ious, - const ImageCategoryInstances& - image_category_ground_truth_instances, - const ImageCategoryInstances& - image_category_detection_instances); - -// C++ implementation of COCOeval.accumulate(), which generates precision -// recall curves for each set of category, IOU threshold, detection area range, -// and max number of detections parameters. It is assumed that the parameter -// evaluations is the return value of the functon COCOeval::EvaluateImages(), -// which was called with the same parameter settings params -py::dict Accumulate( - const py::object& params, - const std::vector& evalutations); - -} // namespace COCOeval -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu deleted file mode 100755 index 6dfe1b90..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/cuda_version.cu +++ /dev/null @@ -1,26 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -#include - -namespace detectron2 { -int get_cudart_version() { -// Not a ROCM platform: Either HIP is not used, or -// it is used, but platform is not ROCM (i.e. it is CUDA) -#if !defined(__HIP_PLATFORM_HCC__) - return CUDART_VERSION; -#else - int version = 0; - -#if HIP_VERSION_MAJOR != 0 - // Create a convention similar to that of CUDA, as assumed by other - // parts of the code. 
- - version = HIP_VERSION_MINOR; - version += (HIP_VERSION_MAJOR * 100); -#else - hipRuntimeGetVersion(&version); -#endif - return version; -#endif -} -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h deleted file mode 100755 index 965c1bfd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv.h +++ /dev/null @@ -1,377 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#pragma once -#include - -namespace detectron2 { - -#if defined(WITH_CUDA) || defined(WITH_HIP) -int deform_conv_forward_cuda( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step); - -int deform_conv_backward_input_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step); - -int deform_conv_backward_parameters_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step); - -void modulated_deform_conv_cuda_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias); - -void modulated_deform_conv_cuda_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, - int dilation_w, - int group, - int deformable_group, - const bool with_bias); - -#endif - -inline int deform_conv_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_forward_cuda( - input, - weight, - offset, - output, - columns, - ones, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not 
implemented on CPU"); -} - -inline int deform_conv_backward_input( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - if (gradOutput.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_backward_input_cuda( - input, - offset, - gradOutput, - gradInput, - gradOffset, - weight, - columns, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline int deform_conv_backward_filter( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step) { - if (gradOutput.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return deform_conv_backward_parameters_cuda( - input, - offset, - gradOutput, - gradWeight, - columns, - ones, - kW, - kH, - dW, - dH, - padW, - padH, - dilationW, - dilationH, - group, - deformable_group, - scale, - im2col_step); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline void modulated_deform_conv_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias) { - if (input.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(bias.is_cuda(), "bias tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return modulated_deform_conv_cuda_forward( - input, - weight, - bias, - ones, - offset, - mask, - output, - columns, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group, - with_bias); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -inline void modulated_deform_conv_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, - int dilation_w, - int group, - int 
deformable_group, - const bool with_bias) { - if (grad_output.is_cuda()) { -#if defined(WITH_CUDA) || defined(WITH_HIP) - TORCH_CHECK(input.is_cuda(), "input tensor is not on GPU!"); - TORCH_CHECK(weight.is_cuda(), "weight tensor is not on GPU!"); - TORCH_CHECK(bias.is_cuda(), "bias tensor is not on GPU!"); - TORCH_CHECK(offset.is_cuda(), "offset tensor is not on GPU!"); - return modulated_deform_conv_cuda_backward( - input, - weight, - bias, - ones, - offset, - mask, - columns, - grad_input, - grad_weight, - grad_bias, - grad_offset, - grad_mask, - grad_output, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group, - with_bias); -#else - AT_ERROR("Detectron2 is not compiled with GPU support!"); -#endif - } - AT_ERROR("This operator is not implemented on CPU"); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu deleted file mode 100755 index 2072bb85..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu +++ /dev/null @@ -1,1223 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -// modified from -// https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp -// Original license: Apache 2.0 - -// modify from -// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c -// Original license: Apache 2.0 - -#include - -#include "deform_conv.h" - -#include -#include - -namespace detectron2 { - -void deformable_im2col( - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor data_col); - -void deformable_col2im( - const at::Tensor data_col, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_im); - -void deformable_col2im_coord( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_offset); - -void modulated_deformable_im2col_cuda( - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor data_col); - -void modulated_deformable_col2im_cuda( - const at::Tensor data_col, - const 
at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_im); - -void modulated_deformable_col2im_coord_cuda( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_offset, - at::Tensor grad_mask); - -void shape_check( - at::Tensor input, - at::Tensor offset, - at::Tensor* gradOutput, - at::Tensor weight, - int kH, - int kW, - int dH, - int dW, - int padH, - int padW, - int dilationH, - int dilationW, - int group, - int deformable_group) { - TORCH_CHECK( - weight.ndimension() == 4, - "4D weight tensor (nOutputPlane,nInputPlane,kH,kW) expected, " - "but got: %s", - weight.ndimension()); - - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - TORCH_CHECK( - kW > 0 && kH > 0, - "kernel size should be greater than zero, but got kH: %d kW: %d", - kH, - kW); - - TORCH_CHECK( - (weight.size(2) == kH && weight.size(3) == kW), - "kernel size should be consistent with weight, ", - "but got kH: %d kW: %d weight.size(2): %d, weight.size(3): %d", - kH, - kW, - weight.size(2), - weight.size(3)); - - TORCH_CHECK( - dW > 0 && dH > 0, - "stride should be greater than zero, but got dH: %d dW: %d", - dH, - dW); - - TORCH_CHECK( - dilationW > 0 && dilationH > 0, - "dilation should be greater than 0, but got dilationH: %d dilationW: %d", - dilationH, - dilationW); - - int ndim = input.ndimension(); - int dimf = 0; - int dimh = 1; - int dimw = 2; - - if (ndim == 4) { - dimf++; - dimh++; - dimw++; - } - - TORCH_CHECK( - ndim == 3 || ndim == 4, - "3D or 4D input tensor expected but got: %s", - ndim); - - long nInputPlane = weight.size(1) * group; - long inputHeight = input.size(dimh); - long inputWidth = input.size(dimw); - long nOutputPlane = weight.size(0); - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - - TORCH_CHECK( - nInputPlane % deformable_group == 0, - "input channels must divide deformable group size"); - - if (outputWidth < 1 || outputHeight < 1) - AT_ERROR( - "Given input size: (%ld x %ld x %ld). " - "Calculated output size: (%ld x %ld x %ld). 
Output size is too small", - nInputPlane, - inputHeight, - inputWidth, - nOutputPlane, - outputHeight, - outputWidth); - - TORCH_CHECK( - input.size(1) == nInputPlane, - "invalid number of input planes, expected: %d, but got: %d", - nInputPlane, - input.size(1)); - - TORCH_CHECK( - (inputHeight + 2 * padH >= kH && inputWidth + 2 * padW >= kW), - "input image is smaller than kernel"); - - TORCH_CHECK( - (offset.size(2) == outputHeight && offset.size(3) == outputWidth), - "invalid spatial size of offset, expected height: %d width: %d, but " - "got height: %d width: %d", - outputHeight, - outputWidth, - offset.size(2), - offset.size(3)); - - TORCH_CHECK( - (offset.size(1) == deformable_group * 2 * kH * kW), - "invalid number of channels of offset"); - - if (gradOutput != NULL) { - TORCH_CHECK( - gradOutput->size(dimf) == nOutputPlane, - "invalid number of gradOutput planes, expected: %d, but got: %d", - nOutputPlane, - gradOutput->size(dimf)); - - TORCH_CHECK( - (gradOutput->size(dimh) == outputHeight && - gradOutput->size(dimw) == outputWidth), - "invalid size of gradOutput, expected height: %d width: %d , but " - "got height: %d width: %d", - outputHeight, - outputWidth, - gradOutput->size(dimh), - gradOutput->size(dimw)); - } -} - -int deform_conv_forward_cuda( - at::Tensor input, - at::Tensor weight, - at::Tensor offset, - at::Tensor output, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - // todo: resize columns to include im2col: done - // todo: add im2col_step as input - // todo: add new output buffer and transpose it to output (or directly - // transpose output) todo: possibly change data indexing because of - // parallel_imgs - - shape_check( - input, - offset, - NULL, - weight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - weight = weight.contiguous(); - - int batch = 1; - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input.unsqueeze_(0); - offset.unsqueeze_(0); - } - - // todo: assert batchsize dividable by im2col_step - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = weight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), "invalid batch size of offset"); - - output = output.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - if (ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < outputHeight * outputWidth) { - ones = at::ones({outputHeight, outputWidth}, input.options()); - } - - input = input.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - at::Tensor output_buffer = at::zeros( - {batchSize / im2col_step, - nOutputPlane, - im2col_step * outputHeight, - outputWidth}, - output.options()); - - output_buffer = output_buffer.view( - {output_buffer.size(0), - group, - 
output_buffer.size(1) / group, - output_buffer.size(2), - output_buffer.size(3)}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - deformable_im2col( - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - columns); - - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - - for (int g = 0; g < group; g++) { - output_buffer[elt][g] = output_buffer[elt][g] - .flatten(1) - .addmm_(weight[g].flatten(1), columns[g]) - .view_as(output_buffer[elt][g]); - } - } - - output_buffer = output_buffer.view( - {output_buffer.size(0), - output_buffer.size(1) * output_buffer.size(2), - output_buffer.size(3), - output_buffer.size(4)}); - - output_buffer = output_buffer.view( - {batchSize / im2col_step, - nOutputPlane, - im2col_step, - outputHeight, - outputWidth}); - output_buffer.transpose_(1, 2); - output.copy_(output_buffer); - output = output.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - output = output.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - offset = offset.view({offset.size(1), offset.size(2), offset.size(3)}); - } - - return 1; -} - -int deform_conv_backward_input_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradInput, - at::Tensor gradOffset, - at::Tensor weight, - at::Tensor columns, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - int im2col_step) { - shape_check( - input, - offset, - &gradOutput, - weight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - gradOutput = gradOutput.contiguous(); - weight = weight.contiguous(); - - int batch = 1; - - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input = input.view({1, input.size(0), input.size(1), input.size(2)}); - offset = offset.view({1, offset.size(0), offset.size(1), offset.size(2)}); - gradOutput = gradOutput.view( - {1, gradOutput.size(0), gradOutput.size(1), gradOutput.size(2)}); - } - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = weight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), 3, "invalid batch size of offset"); - gradInput = gradInput.view({batchSize, nInputPlane, inputHeight, inputWidth}); - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - // change order of grad output - gradOutput = gradOutput.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - gradOutput.transpose_(1, 2); - - gradInput = gradInput.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - input = input.view( - 
{batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - gradOffset = gradOffset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - // divide into groups - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - gradOutput = gradOutput.view( - {gradOutput.size(0), - group, - gradOutput.size(1) / group, - gradOutput.size(2), - gradOutput.size(3), - gradOutput.size(4)}); - - for (int g = 0; g < group; g++) { - columns[g] = columns[g].addmm_( - weight[g].flatten(1).transpose(0, 1), - gradOutput[elt][g].flatten(1), - 0.0f, - 1.0f); - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - gradOutput = gradOutput.view( - {gradOutput.size(0), - gradOutput.size(1) * gradOutput.size(2), - gradOutput.size(3), - gradOutput.size(4), - gradOutput.size(5)}); - - deformable_col2im_coord( - columns, - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - gradOffset[elt]); - - deformable_col2im( - columns, - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - gradInput[elt]); - } - - gradOutput.transpose_(1, 2); - gradOutput = - gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - gradInput = gradInput.view({batchSize, nInputPlane, inputHeight, inputWidth}); - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - gradOffset = gradOffset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - gradInput = gradInput.view({nInputPlane, inputHeight, inputWidth}); - offset = offset.view({offset.size(1), offset.size(2), offset.size(3)}); - gradOffset = - gradOffset.view({offset.size(1), offset.size(2), offset.size(3)}); - } - - return 1; -} - -int deform_conv_backward_parameters_cuda( - at::Tensor input, - at::Tensor offset, - at::Tensor gradOutput, - at::Tensor gradWeight, // at::Tensor gradBias, - at::Tensor columns, - at::Tensor ones, - int kW, - int kH, - int dW, - int dH, - int padW, - int padH, - int dilationW, - int dilationH, - int group, - int deformable_group, - float scale, - int im2col_step) { - // todo: transpose and reshape outGrad - // todo: reshape columns - // todo: add im2col_step as input - - shape_check( - input, - offset, - &gradOutput, - gradWeight, - kH, - kW, - dH, - dW, - padH, - padW, - dilationH, - dilationW, - group, - deformable_group); - - input = input.contiguous(); - offset = offset.contiguous(); - gradOutput = gradOutput.contiguous(); - - int batch = 1; - - if (input.ndimension() == 3) { - // Force batch - batch = 0; - input = input.view( - at::IntList({1, input.size(0), input.size(1), input.size(2)})); - gradOutput = gradOutput.view( - {1, gradOutput.size(0), gradOutput.size(1), 
gradOutput.size(2)}); - } - - long batchSize = input.size(0); - long nInputPlane = input.size(1); - long inputHeight = input.size(2); - long inputWidth = input.size(3); - - long nOutputPlane = gradWeight.size(0); - - long outputWidth = - (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1; - long outputHeight = - (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1; - - TORCH_CHECK((offset.size(0) == batchSize), "invalid batch size of offset"); - - columns = at::zeros( - {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth}, - input.options()); - - gradOutput = gradOutput.view( - {batchSize / im2col_step, - im2col_step, - nOutputPlane, - outputHeight, - outputWidth}); - gradOutput.transpose_(1, 2); - - at::Tensor gradOutputBuffer = at::zeros_like(gradOutput); - gradOutputBuffer = gradOutputBuffer.view( - {batchSize / im2col_step, - nOutputPlane, - im2col_step, - outputHeight, - outputWidth}); - gradOutputBuffer.copy_(gradOutput); - // gradOutput is not contiguous, so we do reshape (instead of view) next - gradOutputBuffer = gradOutputBuffer.reshape( - {batchSize / im2col_step, - nOutputPlane, - im2col_step * outputHeight, - outputWidth}); - - gradOutput.transpose_(1, 2); - gradOutput = - gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth}); - - input = input.view( - {batchSize / im2col_step, - im2col_step, - nInputPlane, - inputHeight, - inputWidth}); - offset = offset.view( - {batchSize / im2col_step, - im2col_step, - deformable_group * 2 * kH * kW, - outputHeight, - outputWidth}); - - for (int elt = 0; elt < batchSize / im2col_step; elt++) { - deformable_im2col( - input[elt], - offset[elt], - nInputPlane, - inputHeight, - inputWidth, - kH, - kW, - padH, - padW, - dH, - dW, - dilationH, - dilationW, - im2col_step, - deformable_group, - columns); - - // divide into group - gradOutputBuffer = gradOutputBuffer.view( - {gradOutputBuffer.size(0), - group, - gradOutputBuffer.size(1) / group, - gradOutputBuffer.size(2), - gradOutputBuffer.size(3)}); - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - gradWeight = gradWeight.view( - {group, - gradWeight.size(0) / group, - gradWeight.size(1), - gradWeight.size(2), - gradWeight.size(3)}); - - for (int g = 0; g < group; g++) { - gradWeight[g] = gradWeight[g] - .flatten(1) - .addmm_( - gradOutputBuffer[elt][g].flatten(1), - columns[g].transpose(1, 0), - 1.0, - scale) - .view_as(gradWeight[g]); - } - gradOutputBuffer = gradOutputBuffer.view( - {gradOutputBuffer.size(0), - gradOutputBuffer.size(1) * gradOutputBuffer.size(2), - gradOutputBuffer.size(3), - gradOutputBuffer.size(4)}); - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - gradWeight = gradWeight.view( - {gradWeight.size(0) * gradWeight.size(1), - gradWeight.size(2), - gradWeight.size(3), - gradWeight.size(4)}); - } - - input = input.view({batchSize, nInputPlane, inputHeight, inputWidth}); - offset = offset.view( - {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth}); - - if (batch == 0) { - gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth}); - input = input.view({nInputPlane, inputHeight, inputWidth}); - } - - return 1; -} - -void modulated_deform_conv_cuda_forward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor output, - at::Tensor columns, - int kernel_h, - int kernel_w, - const int stride_h, - const int stride_w, - const int pad_h, - const int pad_w, - const 
int dilation_h, - const int dilation_w, - const int group, - const int deformable_group, - const bool with_bias) { - shape_check( - input, - offset, - NULL, - weight, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group); - - TORCH_CHECK(input.is_contiguous(), "input tensor has to be contiguous"); - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - const int batch = input.size(0); - const int channels = input.size(1); - const int height = input.size(2); - const int width = input.size(3); - - const int channels_out = weight.size(0); - const int channels_kernel = weight.size(1); - const int kernel_h_ = weight.size(2); - const int kernel_w_ = weight.size(3); - - if (kernel_h_ != kernel_h || kernel_w_ != kernel_w) - AT_ERROR( - "Input shape and kernel shape wont match: (%d x %d vs %d x %d).", - kernel_h_, - kernel_w, - kernel_h_, - kernel_w_); - if (channels != channels_kernel * group) - AT_ERROR( - "Input shape and kernel channels wont match: (%d vs %d).", - channels, - channels_kernel * group); - - const int height_out = - (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1; - const int width_out = - (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1; - - // mask shape check - TORCH_CHECK( - (mask.size(2) == height_out && mask.size(3) == width_out), - "invalid spatial size of mask, expected height: %d width: %d, but " - "got height: %d width: %d", - height_out, - width_out, - mask.size(2), - mask.size(3)); - - TORCH_CHECK( - (mask.size(1) == deformable_group * kernel_h * kernel_w), - "invalid number of channels of mask"); - - if (ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < height_out * width_out) { - // Resize plane and fill with ones... 
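The `height_out`/`width_out` arithmetic just above (and in the earlier shape checks in this file) is the standard dilated-convolution output-size formula, where the effective kernel extent is `dilation * (k - 1) + 1`. A self-contained sketch; the function name is illustrative, not from the file:

```cpp
#include <cassert>

// Output size of a (dilated) convolution along one spatial axis.
int conv_out_size(int in, int k, int pad, int stride, int dilation) {
  const int eff_k = dilation * (k - 1) + 1; // effective kernel extent
  return (in + 2 * pad - eff_k) / stride + 1;
}

int main() {
  // A 3x3 kernel with stride 1, pad 1, dilation 1 preserves spatial size.
  assert(conv_out_size(56, 3, 1, 1, 1) == 56);
  // Dilation 2 widens the effective kernel to 5; pad 2 preserves size again.
  assert(conv_out_size(56, 3, 2, 1, 2) == 56);
  return 0;
}
```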
- ones = at::ones({height_out, width_out}, input.options()); - } - - // resize output - output = output.view({batch, channels_out, height_out, width_out}).zero_(); - // resize temporary columns - columns = at::zeros( - {channels * kernel_h * kernel_w, 1 * height_out * width_out}, - input.options()); - - output = output.view( - {output.size(0), - group, - output.size(1) / group, - output.size(2), - output.size(3)}); - - for (int b = 0; b < batch; b++) { - modulated_deformable_im2col_cuda( - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - columns); - - // divide into group - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - - for (int g = 0; g < group; g++) { - output[b][g] = output[b][g] - .flatten(1) - .addmm_(weight[g].flatten(1), columns[g]) - .view_as(output[b][g]); - } - - weight = weight.view( - {weight.size(0) * weight.size(1), - weight.size(2), - weight.size(3), - weight.size(4)}); - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - } - - output = output.view( - {output.size(0), - output.size(1) * output.size(2), - output.size(3), - output.size(4)}); - - if (with_bias) { - output += bias.view({1, bias.size(0), 1, 1}); - } -} - -void modulated_deform_conv_cuda_backward( - at::Tensor input, - at::Tensor weight, - at::Tensor bias, - at::Tensor ones, - at::Tensor offset, - at::Tensor mask, - at::Tensor columns, - at::Tensor grad_input, - at::Tensor grad_weight, - at::Tensor grad_bias, - at::Tensor grad_offset, - at::Tensor grad_mask, - at::Tensor grad_output, - int kernel_h, - int kernel_w, - int stride_h, - int stride_w, - int pad_h, - int pad_w, - int dilation_h, - int dilation_w, - int group, - int deformable_group, - const bool with_bias) { - shape_check( - input, - offset, - &grad_output, - weight, - kernel_h, - kernel_w, - stride_h, - stride_w, - pad_h, - pad_w, - dilation_h, - dilation_w, - group, - deformable_group); - - TORCH_CHECK(input.is_contiguous(), "input tensor has to be contiguous"); - TORCH_CHECK(weight.is_contiguous(), "weight tensor has to be contiguous"); - - const int batch = input.size(0); - const int channels = input.size(1); - const int height = input.size(2); - const int width = input.size(3); - - const int channels_kernel = weight.size(1); - const int kernel_h_ = weight.size(2); - const int kernel_w_ = weight.size(3); - if (kernel_h_ != kernel_h || kernel_w_ != kernel_w) - AT_ERROR( - "Input shape and kernel shape wont match: (%d x %d vs %d x %d).", - kernel_h_, - kernel_w, - kernel_h_, - kernel_w_); - if (channels != channels_kernel * group) - AT_ERROR( - "Input shape and kernel channels wont match: (%d vs %d).", - channels, - channels_kernel * group); - - const int height_out = - (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1; - const int width_out = - (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1; - - // mask shape check - TORCH_CHECK( - (mask.size(2) == height_out && mask.size(3) == width_out), - "invalid spatial size of mask, expected height: %d width: %d, but " - "got height: %d width: %d", - height_out, - width_out, - mask.size(2), - mask.size(3)); - - TORCH_CHECK( - (mask.size(1) == deformable_group * kernel_h * kernel_w), - "invalid number of channels of mask"); - - if 
(ones.ndimension() != 2 || - ones.size(0) * ones.size(1) < height_out * width_out) { - // Resize plane and fill with ones... - ones = at::ones({height_out, width_out}, input.options()); - } - - grad_input = grad_input.view({batch, channels, height, width}); - columns = at::zeros( - {channels * kernel_h * kernel_w, height_out * width_out}, - input.options()); - - grad_output = grad_output.view( - {grad_output.size(0), - group, - grad_output.size(1) / group, - grad_output.size(2), - grad_output.size(3)}); - - for (int b = 0; b < batch; b++) { - // divide int group - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - weight = weight.view( - {group, - weight.size(0) / group, - weight.size(1), - weight.size(2), - weight.size(3)}); - - for (int g = 0; g < group; g++) { - columns[g].addmm_( - weight[g].flatten(1).transpose(0, 1), - grad_output[b][g].flatten(1), - 0.0f, - 1.0f); - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - weight = weight.view( - {weight.size(0) * weight.size(1), - weight.size(2), - weight.size(3), - weight.size(4)}); - - // gradient w.r.t. input coordinate data - modulated_deformable_col2im_coord_cuda( - columns, - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - grad_offset[b], - grad_mask[b]); - // gradient w.r.t. input data - modulated_deformable_col2im_cuda( - columns, - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - grad_input[b]); - - // gradient w.r.t. 
weight, dWeight should accumulate across the batch and - // group - modulated_deformable_im2col_cuda( - input[b], - offset[b], - mask[b], - 1, - channels, - height, - width, - height_out, - width_out, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - deformable_group, - columns); - - columns = columns.view({group, columns.size(0) / group, columns.size(1)}); - grad_weight = grad_weight.view( - {group, - grad_weight.size(0) / group, - grad_weight.size(1), - grad_weight.size(2), - grad_weight.size(3)}); - if (with_bias) - grad_bias = grad_bias.view({group, grad_bias.size(0) / group}); - - for (int g = 0; g < group; g++) { - grad_weight[g] = - grad_weight[g] - .flatten(1) - .addmm_(grad_output[b][g].flatten(1), columns[g].transpose(0, 1)) - .view_as(grad_weight[g]); - if (with_bias) { - grad_bias[g] = - grad_bias[g] - .view({-1, 1}) - .addmm_(grad_output[b][g].flatten(1), ones.view({-1, 1})) - .view(-1); - } - } - - columns = - columns.view({columns.size(0) * columns.size(1), columns.size(2)}); - grad_weight = grad_weight.view( - {grad_weight.size(0) * grad_weight.size(1), - grad_weight.size(2), - grad_weight.size(3), - grad_weight.size(4)}); - if (with_bias) - grad_bias = grad_bias.view({grad_bias.size(0) * grad_bias.size(1)}); - } - grad_output = grad_output.view( - {grad_output.size(0) * grad_output.size(1), - grad_output.size(2), - grad_output.size(3), - grad_output.size(4)}); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu deleted file mode 100755 index f299c7ad..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu +++ /dev/null @@ -1,1288 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -// modified from -// https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu -// Original license: Apache 2.0 -// clang-format off - -// modify from -// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu - -/*! - ******************* BEGIN Caffe Copyright Notice and Disclaimer ***************** - * - * COPYRIGHT - * - * All contributions by the University of California: - * Copyright (c) 2014-2017 The Regents of the University of California (Regents) - * All rights reserved. - * - * All other contributions: - * Copyright (c) 2014-2017, the respective contributors - * All rights reserved. - * - * Caffe uses a shared copyright model: each contributor holds copyright over - * their contributions to Caffe. The project versioning records all such - * contribution and copyright details. If a contributor wants to further mark - * their specific copyright on a particular contribution, they should indicate - * their copyright solely in the commit message of the change when it is - * committed. - * - * LICENSE - * - * Redistribution and use in source and binary forms, with or without - * modification, are permitted provided that the following conditions are met: - * - * 1. Redistributions of source code must retain the above copyright notice, this - * list of conditions and the following disclaimer. - * 2. 
Redistributions in binary form must reproduce the above copyright notice, - * this list of conditions and the following disclaimer in the documentation - * and/or other materials provided with the distribution. - * - * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" - *AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE - *IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE - * DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE - *FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL - *DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR - *SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER - *CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, - *OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE - *OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - * - * CONTRIBUTION AGREEMENT - * - * By contributing to the BVLC/caffe repository through pull-request, comment, - * or otherwise, the contributor releases their content to the - * license and copyright terms herein. - * - ***************** END Caffe Copyright Notice and Disclaimer ********************* - * - * Copyright (c) 2018 Microsoft - * Licensed under The MIT License [see LICENSE for details] - * \file modulated_deformable_im2col.cuh - * \brief Function definitions of converting an image to - * column matrix based on kernel, padding, dilation, and offset. - * These functions are mainly used in deformable convolution operators. - * \ref: https://arxiv.org/abs/1703.06211 - * \author Yuwen Xiong, Haozhi Qi, Jifeng Dai, Xizhou Zhu, Han Hu, Dazhi Cheng - */ - -#include -#include -#include -#include -#include -#include - -using namespace at; - -#define CUDA_KERNEL_LOOP(i, n) \ - for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < (n); \ - i += blockDim.x * gridDim.x) - - -namespace { - -const int CUDA_NUM_THREADS = 1024; -const int kMaxGridNum = 65535; - -inline int GET_BLOCKS(const int N) { - return std::min(kMaxGridNum, (N + CUDA_NUM_THREADS - 1) / CUDA_NUM_THREADS); -} - -} - -template -__device__ scalar_t deformable_im2col_bilinear( - const scalar_t* bottom_data, - const int data_width, - const int height, - const int width, - scalar_t h, - scalar_t w) { - int h_low = floor(h); - int w_low = floor(w); - int h_high = h_low + 1; - int w_high = w_low + 1; - - scalar_t lh = h - h_low; - scalar_t lw = w - w_low; - scalar_t hh = 1 - lh, hw = 1 - lw; - - scalar_t v1 = 0; - if (h_low >= 0 && w_low >= 0) - v1 = bottom_data[h_low * data_width + w_low]; - scalar_t v2 = 0; - if (h_low >= 0 && w_high <= width - 1) - v2 = bottom_data[h_low * data_width + w_high]; - scalar_t v3 = 0; - if (h_high <= height - 1 && w_low >= 0) - v3 = bottom_data[h_high * data_width + w_low]; - scalar_t v4 = 0; - if (h_high <= height - 1 && w_high <= width - 1) - v4 = bottom_data[h_high * data_width + w_high]; - - scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw; - - scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - return val; -} - -template -__device__ scalar_t get_gradient_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int h, - const int w, - const int height, - const int width) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = 
argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - if (h == argmax_h_low && w == argmax_w_low) - weight = (h + 1 - argmax_h) * (w + 1 - argmax_w); - if (h == argmax_h_low && w == argmax_w_high) - weight = (h + 1 - argmax_h) * (argmax_w + 1 - w); - if (h == argmax_h_high && w == argmax_w_low) - weight = (argmax_h + 1 - h) * (w + 1 - argmax_w); - if (h == argmax_h_high && w == argmax_w_high) - weight = (argmax_h + 1 - h) * (argmax_w + 1 - w); - return weight; -} - -template -__device__ scalar_t get_coordinate_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int height, - const int width, - const scalar_t* im_data, - const int data_width, - const int bp_dir) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - - if (bp_dir == 0) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += -1 * (argmax_w - argmax_w_low) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_w - argmax_w_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } else if (bp_dir == 1) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += -1 * (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } - - return weight; -} - -template -__global__ void deformable_im2col_gpu_kernel( - const int n, - const scalar_t* data_im, - const scalar_t* data_offset, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int num_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* data_col) { - CUDA_KERNEL_LOOP(index, n) { - // index index of output matrix - const int w_col = index % width_col; - const int h_col = (index / width_col) % height_col; - const int b_col = (index / width_col / height_col) % batch_size; - const int c_im = (index / width_col / height_col) / batch_size; - const int c_col = c_im * kernel_h * kernel_w; - - // compute deformable group index - const int deformable_group_index = c_im / channel_per_deformable_group; - - const int h_in = h_col * stride_h - pad_h; - const int w_in = w_col * stride_w - pad_w; - scalar_t* data_col_ptr = data_col + - ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col; - // const scalar_t* data_im_ptr = data_im + 
((b_col * num_channels + c_im) * - // height + h_in) * width + w_in; - const scalar_t* data_im_ptr = - data_im + (b_col * num_channels + c_im) * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - for (int i = 0; i < kernel_h; ++i) { - for (int j = 0; j < kernel_w; ++j) { - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + - w_col; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - scalar_t val = static_cast(0); - const scalar_t h_im = h_in + i * dilation_h + offset_h; - const scalar_t w_im = w_in + j * dilation_w + offset_w; - if (h_im > -1 && w_im > -1 && h_im < height && w_im < width) { - // const scalar_t map_h = i * dilation_h + offset_h; - // const scalar_t map_w = j * dilation_w + offset_w; - // const int cur_height = height - h_in; - // const int cur_width = width - w_in; - // val = deformable_im2col_bilinear(data_im_ptr, width, cur_height, - // cur_width, map_h, map_w); - val = deformable_im2col_bilinear( - data_im_ptr, width, height, width, h_im, w_im); - } - *data_col_ptr = val; - data_col_ptr += batch_size * height_col * width_col; - } - } - } -} - - -template -__global__ void deformable_col2im_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_offset, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_im) { - CUDA_KERNEL_LOOP(index, n) { - const int j = (index / width_col / height_col / batch_size) % kernel_w; - const int i = - (index / width_col / height_col / batch_size / kernel_w) % kernel_h; - const int c = - index / width_col / height_col / batch_size / kernel_w / kernel_h; - // compute the start and end of the output - - const int deformable_group_index = c / channel_per_deformable_group; - - int w_out = index % width_col; - int h_out = (index / width_col) % height_col; - int b = (index / width_col / height_col) % batch_size; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h; - const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w; - - const scalar_t cur_top_grad = data_col[index]; - const int cur_h = (int)cur_inv_h_data; - const int cur_w = (int)cur_inv_w_data; - for (int dy = -2; dy <= 2; dy++) { - for (int dx = -2; dx <= 2; dx++) { - if (cur_h + dy >= 0 && cur_h + dy < height && cur_w + dx >= 0 && - cur_w + dx < width && abs(cur_inv_h_data - (cur_h + dy)) < 1 && - 
abs(cur_inv_w_data - (cur_w + dx)) < 1) { - int cur_bottom_grad_pos = - ((b * channels + c) * height + cur_h + dy) * width + cur_w + dx; - scalar_t weight = get_gradient_weight( - cur_inv_h_data, - cur_inv_w_data, - cur_h + dy, - cur_w + dx, - height, - width); - atomicAdd(grad_im + cur_bottom_grad_pos, weight * cur_top_grad); - } - } - } - } -} - - -template -__global__ void deformable_col2im_coord_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_im, - const scalar_t* data_offset, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int offset_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_offset) { - CUDA_KERNEL_LOOP(index, n) { - scalar_t val = 0; - int w = index % width_col; - int h = (index / width_col) % height_col; - int c = (index / width_col / height_col) % offset_channels; - int b = (index / width_col / height_col) / offset_channels; - // compute the start and end of the output - - const int deformable_group_index = c / (2 * kernel_h * kernel_w); - const int col_step = kernel_h * kernel_w; - int cnt = 0; - const scalar_t* data_col_ptr = data_col + - deformable_group_index * channel_per_deformable_group * batch_size * - width_col * height_col; - const scalar_t* data_im_ptr = data_im + - (b * deformable_group + deformable_group_index) * - channel_per_deformable_group / kernel_h / kernel_w * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w; - - for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; - col_c += col_step) { - const int col_pos = - (((col_c * batch_size + b) * height_col) + h) * width_col + w; - const int bp_dir = offset_c % 2; - - int j = (col_pos / width_col / height_col / batch_size) % kernel_w; - int i = - (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h; - int w_out = col_pos % width_col; - int h_out = (col_pos / width_col) % height_col; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - const int data_offset_h_ptr = - (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out); - const int data_offset_w_ptr = - (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + - w_out); - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - scalar_t inv_h = h_in + i * dilation_h + offset_h; - scalar_t inv_w = w_in + j * dilation_w + offset_w; - if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width) { - inv_h = inv_w = -2; - } - const scalar_t weight = get_coordinate_weight( - inv_h, - inv_w, - height, - width, - data_im_ptr + cnt * height * width, - width, - bp_dir); - val += weight * data_col_ptr[col_pos]; - cnt += 1; - } - - grad_offset[index] = val; - } -} - - -namespace detectron2 { - -void deformable_im2col( - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int 
dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor data_col) { - // num_axes should be smaller than block size - // todo: check parallel_imgs is correctly passed in - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = channels * height_col * width_col * parallel_imgs; - int channel_per_deformable_group = channels / deformable_group; - - at::cuda::CUDAGuard device_guard(data_im.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_im.scalar_type(), "deformable_im2col_gpu", ([&] { - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* data_col_ = data_col.data_ptr(); - - deformable_im2col_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_im_, - data_offset_, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - channels, - deformable_group, - height_col, - width_col, - data_col_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf("error in deformable_im2col: %s\n", cudaGetErrorString(err)); - } -} - - -void deformable_col2im( - const at::Tensor data_col, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - at::Tensor grad_im) { - // todo: make sure parallel_imgs is passed in correctly - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = - channels * ksize_h * ksize_w * height_col * width_col * parallel_imgs; - int channel_per_deformable_group = channels / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "deformable_col2im_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* grad_im_ = grad_im.data_ptr(); - - deformable_col2im_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_offset_, - channels, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - deformable_group, - height_col, - width_col, - grad_im_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf("error in deformable_col2im: %s\n", cudaGetErrorString(err)); - } -} - - -void deformable_col2im_coord( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const int channels, - const int height, - const int width, - const int ksize_h, - const int ksize_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int parallel_imgs, - const int deformable_group, - 
at::Tensor grad_offset) { - int height_col = - (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1; - int width_col = - (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1; - int num_kernels = height_col * width_col * 2 * ksize_h * ksize_w * - deformable_group * parallel_imgs; - int channel_per_deformable_group = - channels * ksize_h * ksize_w / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "deformable_col2im_coord_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - scalar_t* grad_offset_ = grad_offset.data_ptr(); - - deformable_col2im_coord_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_im_, - data_offset_, - channels, - height, - width, - ksize_h, - ksize_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - parallel_imgs, - 2 * ksize_h * ksize_w * deformable_group, - deformable_group, - height_col, - width_col, - grad_offset_); - })); -} - -} // namespace detectron2 - - -template -__device__ scalar_t dmcn_im2col_bilinear( - const scalar_t* bottom_data, - const int data_width, - const int height, - const int width, - scalar_t h, - scalar_t w) { - int h_low = floor(h); - int w_low = floor(w); - int h_high = h_low + 1; - int w_high = w_low + 1; - - scalar_t lh = h - h_low; - scalar_t lw = w - w_low; - scalar_t hh = 1 - lh, hw = 1 - lw; - - scalar_t v1 = 0; - if (h_low >= 0 && w_low >= 0) - v1 = bottom_data[h_low * data_width + w_low]; - scalar_t v2 = 0; - if (h_low >= 0 && w_high <= width - 1) - v2 = bottom_data[h_low * data_width + w_high]; - scalar_t v3 = 0; - if (h_high <= height - 1 && w_low >= 0) - v3 = bottom_data[h_high * data_width + w_low]; - scalar_t v4 = 0; - if (h_high <= height - 1 && w_high <= width - 1) - v4 = bottom_data[h_high * data_width + w_high]; - - scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw; - - scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4); - return val; -} - -template -__device__ scalar_t dmcn_get_gradient_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int h, - const int w, - const int height, - const int width) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - if (h == argmax_h_low && w == argmax_w_low) - weight = (h + 1 - argmax_h) * (w + 1 - argmax_w); - if (h == argmax_h_low && w == argmax_w_high) - weight = (h + 1 - argmax_h) * (argmax_w + 1 - w); - if (h == argmax_h_high && w == argmax_w_low) - weight = (argmax_h + 1 - h) * (w + 1 - argmax_w); - if (h == argmax_h_high && w == argmax_w_high) - weight = (argmax_h + 1 - h) * (argmax_w + 1 - w); - return weight; -} - -template -__device__ scalar_t dmcn_get_coordinate_weight( - scalar_t argmax_h, - scalar_t argmax_w, - const int height, - const int width, - const scalar_t* im_data, - const int data_width, - const int bp_dir) { - if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || - argmax_w >= width) { - // empty - return 0; - } - - int argmax_h_low = floor(argmax_h); - int argmax_w_low = 
floor(argmax_w); - int argmax_h_high = argmax_h_low + 1; - int argmax_w_high = argmax_w_low + 1; - - scalar_t weight = 0; - - if (bp_dir == 0) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += -1 * (argmax_w - argmax_w_low) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += (argmax_w_low + 1 - argmax_w) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_w - argmax_w_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } else if (bp_dir == 1) { - if (argmax_h_low >= 0 && argmax_w_low >= 0) - weight += -1 * (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_low]; - if (argmax_h_low >= 0 && argmax_w_high <= width - 1) - weight += (argmax_h_low + 1 - argmax_h) * - im_data[argmax_h_low * data_width + argmax_w_high]; - if (argmax_h_high <= height - 1 && argmax_w_low >= 0) - weight += -1 * (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_low]; - if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1) - weight += (argmax_h - argmax_h_low) * - im_data[argmax_h_high * data_width + argmax_w_high]; - } - - return weight; -} - -template -__global__ void modulated_deformable_im2col_gpu_kernel( - const int n, - const scalar_t* data_im, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int num_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* data_col) { - CUDA_KERNEL_LOOP(index, n) { - // index index of output matrix - const int w_col = index % width_col; - const int h_col = (index / width_col) % height_col; - const int b_col = (index / width_col / height_col) % batch_size; - const int c_im = (index / width_col / height_col) / batch_size; - const int c_col = c_im * kernel_h * kernel_w; - - // compute deformable group index - const int deformable_group_index = c_im / channel_per_deformable_group; - - const int h_in = h_col * stride_h - pad_h; - const int w_in = w_col * stride_w - pad_w; - - scalar_t* data_col_ptr = data_col + - ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col; - // const float* data_im_ptr = data_im + ((b_col * num_channels + c_im) * - // height + h_in) * width + w_in; - const scalar_t* data_im_ptr = - data_im + (b_col * num_channels + c_im) * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - - const scalar_t* data_mask_ptr = data_mask + - (b_col * deformable_group + deformable_group_index) * kernel_h * - kernel_w * height_col * width_col; - - for (int i = 0; i < kernel_h; ++i) { - for (int j = 0; j < kernel_w; ++j) { - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + - w_col; - const int data_mask_hw_ptr = - ((i * kernel_w + j) * height_col + h_col) * 
width_col + w_col; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - scalar_t val = static_cast(0); - const scalar_t h_im = h_in + i * dilation_h + offset_h; - const scalar_t w_im = w_in + j * dilation_w + offset_w; - // if (h_im >= 0 && w_im >= 0 && h_im < height && w_im < width) { - if (h_im > -1 && w_im > -1 && h_im < height && w_im < width) { - // const float map_h = i * dilation_h + offset_h; - // const float map_w = j * dilation_w + offset_w; - // const int cur_height = height - h_in; - // const int cur_width = width - w_in; - // val = dmcn_im2col_bilinear(data_im_ptr, width, cur_height, - // cur_width, map_h, map_w); - val = dmcn_im2col_bilinear( - data_im_ptr, width, height, width, h_im, w_im); - } - *data_col_ptr = val * mask; - data_col_ptr += batch_size * height_col * width_col; - // data_col_ptr += height_col * width_col; - } - } - } -} - -template -__global__ void modulated_deformable_col2im_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_im) { - CUDA_KERNEL_LOOP(index, n) { - const int j = (index / width_col / height_col / batch_size) % kernel_w; - const int i = - (index / width_col / height_col / batch_size / kernel_w) % kernel_h; - const int c = - index / width_col / height_col / batch_size / kernel_w / kernel_h; - // compute the start and end of the output - - const int deformable_group_index = c / channel_per_deformable_group; - - int w_out = index % width_col; - int h_out = (index / width_col) % height_col; - int b = (index / width_col / height_col) % batch_size; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const scalar_t* data_mask_ptr = data_mask + - (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * - height_col * width_col; - const int data_offset_h_ptr = - ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out; - const int data_offset_w_ptr = - ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out; - const int data_mask_hw_ptr = - ((i * kernel_w + j) * height_col + h_out) * width_col + w_out; - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h; - const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w; - - const scalar_t cur_top_grad = data_col[index] * mask; - const int cur_h = (int)cur_inv_h_data; - const int cur_w = (int)cur_inv_w_data; - for (int dy = -2; dy <= 2; dy++) { - for (int dx = -2; dx <= 2; dx++) { - if (cur_h + dy >= 0 && cur_h + dy < height && cur_w + dx >= 0 && - cur_w + dx < width && abs(cur_inv_h_data - (cur_h + dy)) < 1 && - abs(cur_inv_w_data - (cur_w + dx)) < 1) { - int cur_bottom_grad_pos = - ((b * channels + c) 
* height + cur_h + dy) * width + cur_w + dx; - scalar_t weight = dmcn_get_gradient_weight( - cur_inv_h_data, - cur_inv_w_data, - cur_h + dy, - cur_w + dx, - height, - width); - atomicAdd(grad_im + cur_bottom_grad_pos, weight * cur_top_grad); - } - } - } - } -} - -template -__global__ void modulated_deformable_col2im_coord_gpu_kernel( - const int n, - const scalar_t* data_col, - const scalar_t* data_im, - const scalar_t* data_offset, - const scalar_t* data_mask, - const int channels, - const int height, - const int width, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int channel_per_deformable_group, - const int batch_size, - const int offset_channels, - const int deformable_group, - const int height_col, - const int width_col, - scalar_t* grad_offset, - scalar_t* grad_mask) { - CUDA_KERNEL_LOOP(index, n) { - scalar_t val = 0, mval = 0; - int w = index % width_col; - int h = (index / width_col) % height_col; - int c = (index / width_col / height_col) % offset_channels; - int b = (index / width_col / height_col) / offset_channels; - // compute the start and end of the output - - const int deformable_group_index = c / (2 * kernel_h * kernel_w); - const int col_step = kernel_h * kernel_w; - int cnt = 0; - const scalar_t* data_col_ptr = data_col + - deformable_group_index * channel_per_deformable_group * batch_size * - width_col * height_col; - const scalar_t* data_im_ptr = data_im + - (b * deformable_group + deformable_group_index) * - channel_per_deformable_group / kernel_h / kernel_w * height * width; - const scalar_t* data_offset_ptr = data_offset + - (b * deformable_group + deformable_group_index) * 2 * kernel_h * - kernel_w * height_col * width_col; - const scalar_t* data_mask_ptr = data_mask + - (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * - height_col * width_col; - - const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w; - - for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; - col_c += col_step) { - const int col_pos = - (((col_c * batch_size + b) * height_col) + h) * width_col + w; - const int bp_dir = offset_c % 2; - - int j = (col_pos / width_col / height_col / batch_size) % kernel_w; - int i = - (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h; - int w_out = col_pos % width_col; - int h_out = (col_pos / width_col) % height_col; - int w_in = w_out * stride_w - pad_w; - int h_in = h_out * stride_h - pad_h; - const int data_offset_h_ptr = - (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out); - const int data_offset_w_ptr = - (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + - w_out); - const int data_mask_hw_ptr = - (((i * kernel_w + j) * height_col + h_out) * width_col + w_out); - const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr]; - const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr]; - const scalar_t mask = data_mask_ptr[data_mask_hw_ptr]; - scalar_t inv_h = h_in + i * dilation_h + offset_h; - scalar_t inv_w = w_in + j * dilation_w + offset_w; - if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width) { - inv_h = inv_w = -2; - } else { - mval += data_col_ptr[col_pos] * - dmcn_im2col_bilinear( - data_im_ptr + cnt * height * width, - width, - height, - width, - inv_h, - inv_w); - } - const scalar_t weight = dmcn_get_coordinate_weight( - inv_h, - inv_w, - height, - width, - data_im_ptr + cnt * 
height * width, - width, - bp_dir); - val += weight * data_col_ptr[col_pos] * mask; - cnt += 1; - } - // KERNEL_ASSIGN(grad_offset[index], offset_req, val); - grad_offset[index] = val; - if (offset_c % 2 == 0) - // KERNEL_ASSIGN(grad_mask[(((b * deformable_group + - // deformable_group_index) * kernel_h * kernel_w + offset_c / 2) * - // height_col + h) * width_col + w], mask_req, mval); - grad_mask - [(((b * deformable_group + deformable_group_index) * kernel_h * - kernel_w + - offset_c / 2) * - height_col + - h) * - width_col + - w] = mval; - } -} - - -namespace detectron2 { - -void modulated_deformable_im2col_cuda( - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kenerl_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor data_col) { - // num_axes should be smaller than block size - const int channel_per_deformable_group = channels / deformable_group; - const int num_kernels = channels * batch_size * height_col * width_col; - - at::cuda::CUDAGuard device_guard(data_im.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_im.scalar_type(), "modulated_deformable_im2col_gpu", ([&] { - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* data_col_ = data_col.data_ptr(); - - modulated_deformable_im2col_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_im_, - data_offset_, - data_mask_, - height_im, - width_im, - kernel_h, - kenerl_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - channels, - deformable_group, - height_col, - width_col, - data_col_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_im2col_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -void modulated_deformable_col2im_cuda( - const at::Tensor data_col, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_im) { - const int channel_per_deformable_group = channels / deformable_group; - const int num_kernels = - channels * kernel_h * kernel_w * batch_size * height_col * width_col; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "modulated_deformable_col2im_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* grad_im_ = grad_im.data_ptr(); - - modulated_deformable_col2im_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_offset_, - data_mask_, - 
channels, - height_im, - width_im, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - deformable_group, - height_col, - width_col, - grad_im_); - })); - - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_col2im_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -void modulated_deformable_col2im_coord_cuda( - const at::Tensor data_col, - const at::Tensor data_im, - const at::Tensor data_offset, - const at::Tensor data_mask, - const int batch_size, - const int channels, - const int height_im, - const int width_im, - const int height_col, - const int width_col, - const int kernel_h, - const int kernel_w, - const int pad_h, - const int pad_w, - const int stride_h, - const int stride_w, - const int dilation_h, - const int dilation_w, - const int deformable_group, - at::Tensor grad_offset, - at::Tensor grad_mask) { - const int num_kernels = batch_size * height_col * width_col * 2 * kernel_h * - kernel_w * deformable_group; - const int channel_per_deformable_group = - channels * kernel_h * kernel_w / deformable_group; - - at::cuda::CUDAGuard device_guard(data_col.device()); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES_AND_HALF( - data_col.scalar_type(), "modulated_deformable_col2im_coord_gpu", ([&] { - const scalar_t* data_col_ = data_col.data_ptr(); - const scalar_t* data_im_ = data_im.data_ptr(); - const scalar_t* data_offset_ = data_offset.data_ptr(); - const scalar_t* data_mask_ = data_mask.data_ptr(); - scalar_t* grad_offset_ = grad_offset.data_ptr(); - scalar_t* grad_mask_ = grad_mask.data_ptr(); - - modulated_deformable_col2im_coord_gpu_kernel<<< - GET_BLOCKS(num_kernels), - CUDA_NUM_THREADS, - 0, - stream>>>( - num_kernels, - data_col_, - data_im_, - data_offset_, - data_mask_, - channels, - height_im, - width_im, - kernel_h, - kernel_w, - pad_h, - pad_w, - stride_h, - stride_w, - dilation_h, - dilation_w, - channel_per_deformable_group, - batch_size, - 2 * kernel_h * kernel_w * deformable_group, - deformable_group, - height_col, - width_col, - grad_offset_, - grad_mask_); - })); - cudaError_t err = cudaGetLastError(); - if (err != cudaSuccess) { - printf( - "error in modulated_deformable_col2im_coord_cuda: %s\n", - cudaGetErrorString(err)); - } -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h deleted file mode 100755 index 12aca388..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated.h +++ /dev/null @@ -1,39 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. 
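For reference, the modulated deformable convolution kernels removed above predate torchvision's built-in operator, so the same computation remains available without this custom extension. A minimal sketch, assuming torchvision >= 0.10 (where `deform_conv2d` accepts a `mask` argument); shapes are illustrative:

```
import torch
from torchvision.ops import deform_conv2d

# Toy shapes: batch 2, 4 input channels, 8x8 images, 3x3 kernel, one
# deformable group (offset: 2*kh*kw channels, mask: kh*kw channels).
x = torch.randn(2, 4, 8, 8)
weight = torch.randn(6, 4, 3, 3)           # 6 output channels
offset = torch.zeros(2, 2 * 3 * 3, 8, 8)   # zero offsets = regular sampling grid
mask = torch.ones(2, 3 * 3, 8, 8)          # all-ones mask = no modulation
out = deform_conv2d(x, offset, weight, mask=mask, padding=1)
print(out.shape)  # torch.Size([2, 6, 8, 8])
```

With zero offsets and an all-ones mask the result matches an ordinary 3x3 convolution, which is a convenient sanity check against the im2col/col2im kernels above.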
-#pragma once
-#include <torch/types.h>
-
-namespace detectron2 {
-
-at::Tensor nms_rotated_cpu(
-    const at::Tensor& dets,
-    const at::Tensor& scores,
-    const double iou_threshold);
-
-#if defined(WITH_CUDA) || defined(WITH_HIP)
-at::Tensor nms_rotated_cuda(
-    const at::Tensor& dets,
-    const at::Tensor& scores,
-    const double iou_threshold);
-#endif
-
-// Interface for Python
-// inline is needed to prevent multiple function definitions when this header is
-// included by different cpps
-inline at::Tensor nms_rotated(
-    const at::Tensor& dets,
-    const at::Tensor& scores,
-    const double iou_threshold) {
-  assert(dets.device().is_cuda() == scores.device().is_cuda());
-  if (dets.device().is_cuda()) {
-#if defined(WITH_CUDA) || defined(WITH_HIP)
-    return nms_rotated_cuda(
-        dets.contiguous(), scores.contiguous(), iou_threshold);
-#else
-    AT_ERROR("Detectron2 is not compiled with GPU support!");
-#endif
-  }
-
-  return nms_rotated_cpu(dets.contiguous(), scores.contiguous(), iou_threshold);
-}
-
-} // namespace detectron2
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp
deleted file mode 100755
index d7556e64..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp
+++ /dev/null
@@ -1,75 +0,0 @@
-// Copyright (c) Facebook, Inc. and its affiliates.
-#include "../box_iou_rotated/box_iou_rotated_utils.h"
-#include "nms_rotated.h"
-
-namespace detectron2 {
-
-template <typename scalar_t>
-at::Tensor nms_rotated_cpu_kernel(
-    const at::Tensor& dets,
-    const at::Tensor& scores,
-    const double iou_threshold) {
-  // nms_rotated_cpu_kernel is modified from torchvision's nms_cpu_kernel,
-  // however, the code in this function is much shorter because
-  // we delegate the IoU computation for rotated boxes to
-  // the single_box_iou_rotated function in box_iou_rotated_utils.h
-  AT_ASSERTM(dets.device().is_cpu(), "dets must be a CPU tensor");
-  AT_ASSERTM(scores.device().is_cpu(), "scores must be a CPU tensor");
-  AT_ASSERTM(
-      dets.scalar_type() == scores.scalar_type(),
-      "dets should have the same type as scores");
-
-  if (dets.numel() == 0) {
-    return at::empty({0}, dets.options().dtype(at::kLong));
-  }
-
-  auto order_t = std::get<1>(scores.sort(0, /* descending=*/true));
-
-  auto ndets = dets.size(0);
-  at::Tensor suppressed_t = at::zeros({ndets}, dets.options().dtype(at::kByte));
-  at::Tensor keep_t = at::zeros({ndets}, dets.options().dtype(at::kLong));
-
-  auto suppressed = suppressed_t.data_ptr<uint8_t>();
-  auto keep = keep_t.data_ptr<int64_t>();
-  auto order = order_t.data_ptr<int64_t>();
-
-  int64_t num_to_keep = 0;
-
-  for (int64_t _i = 0; _i < ndets; _i++) {
-    auto i = order[_i];
-    if (suppressed[i] == 1) {
-      continue;
-    }
-
-    keep[num_to_keep++] = i;
-
-    for (int64_t _j = _i + 1; _j < ndets; _j++) {
-      auto j = order[_j];
-      if (suppressed[j] == 1) {
-        continue;
-      }
-
-      auto ovr = single_box_iou_rotated<scalar_t>(
-          dets[i].data_ptr<scalar_t>(), dets[j].data_ptr<scalar_t>());
-      if (ovr >= iou_threshold) {
-        suppressed[j] = 1;
-      }
-    }
-  }
-  return keep_t.narrow(/*dim=*/0, /*start=*/0, /*length=*/num_to_keep);
-}
-
-at::Tensor nms_rotated_cpu(
-    // input must be contiguous
-    const at::Tensor& dets,
-    const at::Tensor& scores,
-    const double iou_threshold) {
-  auto result = at::empty({0}, dets.options());
-
-  AT_DISPATCH_FLOATING_TYPES(dets.scalar_type(), "nms_rotated", [&] {
-    result = nms_rotated_cpu_kernel<scalar_t>(dets, scores, iou_threshold);
-  });
-  return result;
-}
-
-} //
namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu deleted file mode 100755 index 2a3db5c6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu +++ /dev/null @@ -1,145 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. -#include -#include -#include -#include -#ifdef WITH_CUDA -#include "../box_iou_rotated/box_iou_rotated_utils.h" -#endif -// TODO avoid this when pytorch supports "same directory" hipification -#ifdef WITH_HIP -#include "box_iou_rotated/box_iou_rotated_utils.h" -#endif - -using namespace detectron2; - -namespace { -int const threadsPerBlock = sizeof(unsigned long long) * 8; -} - -template -__global__ void nms_rotated_cuda_kernel( - const int n_boxes, - const double iou_threshold, - const T* dev_boxes, - unsigned long long* dev_mask) { - // nms_rotated_cuda_kernel is modified from torchvision's nms_cuda_kernel - - const int row_start = blockIdx.y; - const int col_start = blockIdx.x; - - // if (row_start > col_start) return; - - const int row_size = - min(n_boxes - row_start * threadsPerBlock, threadsPerBlock); - const int col_size = - min(n_boxes - col_start * threadsPerBlock, threadsPerBlock); - - // Compared to nms_cuda_kernel, where each box is represented with 4 values - // (x1, y1, x2, y2), each rotated box is represented with 5 values - // (x_center, y_center, width, height, angle_degrees) here. - __shared__ T block_boxes[threadsPerBlock * 5]; - if (threadIdx.x < col_size) { - block_boxes[threadIdx.x * 5 + 0] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 0]; - block_boxes[threadIdx.x * 5 + 1] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 1]; - block_boxes[threadIdx.x * 5 + 2] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 2]; - block_boxes[threadIdx.x * 5 + 3] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 3]; - block_boxes[threadIdx.x * 5 + 4] = - dev_boxes[(threadsPerBlock * col_start + threadIdx.x) * 5 + 4]; - } - __syncthreads(); - - if (threadIdx.x < row_size) { - const int cur_box_idx = threadsPerBlock * row_start + threadIdx.x; - const T* cur_box = dev_boxes + cur_box_idx * 5; - int i = 0; - unsigned long long t = 0; - int start = 0; - if (row_start == col_start) { - start = threadIdx.x + 1; - } - for (i = start; i < col_size; i++) { - // Instead of devIoU used by original horizontal nms, here - // we use the single_box_iou_rotated function from box_iou_rotated_utils.h - if (single_box_iou_rotated(cur_box, block_boxes + i * 5) > - iou_threshold) { - t |= 1ULL << i; - } - } - const int col_blocks = at::cuda::ATenCeilDiv(n_boxes, threadsPerBlock); - dev_mask[cur_box_idx * col_blocks + col_start] = t; - } -} - -namespace detectron2 { - -at::Tensor nms_rotated_cuda( - // input must be contiguous - const at::Tensor& dets, - const at::Tensor& scores, - double iou_threshold) { - // using scalar_t = float; - AT_ASSERTM(dets.is_cuda(), "dets must be a CUDA tensor"); - AT_ASSERTM(scores.is_cuda(), "scores must be a CUDA tensor"); - at::cuda::CUDAGuard device_guard(dets.device()); - - auto order_t = std::get<1>(scores.sort(0, /* descending=*/true)); - auto dets_sorted = dets.index_select(0, order_t); - - auto dets_num = dets.size(0); - - const int col_blocks = - at::cuda::ATenCeilDiv(static_cast(dets_num), threadsPerBlock); - - at::Tensor mask = - 
at::empty({dets_num * col_blocks}, dets.options().dtype(at::kLong)); - - dim3 blocks(col_blocks, col_blocks); - dim3 threads(threadsPerBlock); - cudaStream_t stream = at::cuda::getCurrentCUDAStream(); - - AT_DISPATCH_FLOATING_TYPES( - dets_sorted.scalar_type(), "nms_rotated_kernel_cuda", [&] { - nms_rotated_cuda_kernel<<>>( - dets_num, - iou_threshold, - dets_sorted.data_ptr(), - (unsigned long long*)mask.data_ptr()); - }); - - at::Tensor mask_cpu = mask.to(at::kCPU); - unsigned long long* mask_host = - (unsigned long long*)mask_cpu.data_ptr(); - - std::vector remv(col_blocks); - memset(&remv[0], 0, sizeof(unsigned long long) * col_blocks); - - at::Tensor keep = - at::empty({dets_num}, dets.options().dtype(at::kLong).device(at::kCPU)); - int64_t* keep_out = keep.data_ptr(); - - int num_to_keep = 0; - for (int i = 0; i < dets_num; i++) { - int nblock = i / threadsPerBlock; - int inblock = i % threadsPerBlock; - - if (!(remv[nblock] & (1ULL << inblock))) { - keep_out[num_to_keep++] = i; - unsigned long long* p = mask_host + i * col_blocks; - for (int j = nblock; j < col_blocks; j++) { - remv[j] |= p[j]; - } - } - } - - AT_CUDA_CHECK(cudaGetLastError()); - return order_t.index( - {keep.narrow(/*dim=*/0, /*start=*/0, /*length=*/num_to_keep) - .to(order_t.device(), keep.scalar_type())}); -} - -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp deleted file mode 100755 index c9a2cd4f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/csrc/vision.cpp +++ /dev/null @@ -1,117 +0,0 @@ -// Copyright (c) Facebook, Inc. and its affiliates. - -#include -#include "ROIAlignRotated/ROIAlignRotated.h" -#include "box_iou_rotated/box_iou_rotated.h" -#include "cocoeval/cocoeval.h" -#include "deformable/deform_conv.h" -#include "nms_rotated/nms_rotated.h" - -namespace detectron2 { - -#if defined(WITH_CUDA) || defined(WITH_HIP) -extern int get_cudart_version(); -#endif - -std::string get_cuda_version() { -#if defined(WITH_CUDA) || defined(WITH_HIP) - std::ostringstream oss; - -#if defined(WITH_CUDA) - oss << "CUDA "; -#else - oss << "HIP "; -#endif - - // copied from - // https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 - auto printCudaStyleVersion = [&](int v) { - oss << (v / 1000) << "." << (v / 10 % 100); - if (v % 10 != 0) { - oss << "." << (v % 10); - } - }; - printCudaStyleVersion(get_cudart_version()); - return oss.str(); -#else // neither CUDA nor HIP - return std::string("not available"); -#endif -} - -bool has_cuda() { -#if defined(WITH_CUDA) - return true; -#else - return false; -#endif -} - -// similar to -// https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp -std::string get_compiler_version() { - std::ostringstream ss; -#if defined(__GNUC__) -#ifndef __clang__ - -#if ((__GNUC__ <= 4) && (__GNUC_MINOR__ <= 8)) -#error "GCC >= 4.9 is required!" -#endif - - { ss << "GCC " << __GNUC__ << "." << __GNUC_MINOR__; } -#endif -#endif - -#if defined(__clang_major__) - { - ss << "clang " << __clang_major__ << "." << __clang_minor__ << "." 
- << __clang_patchlevel__; - } -#endif - -#if defined(_MSC_VER) - { ss << "MSVC " << _MSC_FULL_VER; } -#endif - return ss.str(); -} - -PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) { - m.def("get_compiler_version", &get_compiler_version, "get_compiler_version"); - m.def("get_cuda_version", &get_cuda_version, "get_cuda_version"); - m.def("has_cuda", &has_cuda, "has_cuda"); - - m.def("deform_conv_forward", &deform_conv_forward, "deform_conv_forward"); - m.def( - "deform_conv_backward_input", - &deform_conv_backward_input, - "deform_conv_backward_input"); - m.def( - "deform_conv_backward_filter", - &deform_conv_backward_filter, - "deform_conv_backward_filter"); - m.def( - "modulated_deform_conv_forward", - &modulated_deform_conv_forward, - "modulated_deform_conv_forward"); - m.def( - "modulated_deform_conv_backward", - &modulated_deform_conv_backward, - "modulated_deform_conv_backward"); - - m.def("COCOevalAccumulate", &COCOeval::Accumulate, "COCOeval::Accumulate"); - m.def( - "COCOevalEvaluateImages", - &COCOeval::EvaluateImages, - "COCOeval::EvaluateImages"); - pybind11::class_(m, "InstanceAnnotation") - .def(pybind11::init()); - pybind11::class_(m, "ImageEvaluation") - .def(pybind11::init<>()); -} - -TORCH_LIBRARY(detectron2, m) { - m.def("nms_rotated", &nms_rotated); - m.def("box_iou_rotated", &box_iou_rotated); - m.def("roi_align_rotated_forward", &ROIAlignRotated_forward); - m.def("roi_align_rotated_backward", &ROIAlignRotated_backward); -} -} // namespace detectron2 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py deleted file mode 100755 index eca070f5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/deform_conv.py +++ /dev/null @@ -1,501 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from functools import lru_cache -import torch -from torch import nn -from torch.autograd import Function -from torch.autograd.function import once_differentiable -from torch.nn.modules.utils import _pair -from torchvision.ops import deform_conv2d - -from detectron2 import _C - -from .wrappers import _NewEmptyTensorOp - - -class _DeformConv(Function): - @staticmethod - def forward( - ctx, - input, - offset, - weight, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - im2col_step=64, - ): - if input is not None and input.dim() != 4: - raise ValueError( - "Expected 4D tensor as input, got {}D tensor instead.".format(input.dim()) - ) - ctx.stride = _pair(stride) - ctx.padding = _pair(padding) - ctx.dilation = _pair(dilation) - ctx.groups = groups - ctx.deformable_groups = deformable_groups - ctx.im2col_step = im2col_step - - ctx.save_for_backward(input, offset, weight) - - output = input.new_empty( - _DeformConv._output_size(input, weight, ctx.padding, ctx.dilation, ctx.stride) - ) - - ctx.bufs_ = [input.new_empty(0), input.new_empty(0)] # columns, ones - - if not input.is_cuda: - if deformable_groups != 1: - raise NotImplementedError( - "Deformable Conv with deformable_groups != 1 is not supported on CPUs!" 
- ) - return deform_conv2d( - input, offset, weight, stride=stride, padding=padding, dilation=dilation - ) - else: - cur_im2col_step = _DeformConv._cal_im2col_step(input.shape[0], ctx.im2col_step) - assert (input.shape[0] % cur_im2col_step) == 0, "im2col step must divide batchsize" - - _C.deform_conv_forward( - input, - weight, - offset, - output, - ctx.bufs_[0], - ctx.bufs_[1], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - cur_im2col_step, - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - input, offset, weight = ctx.saved_tensors - - grad_input = grad_offset = grad_weight = None - - if not grad_output.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - else: - cur_im2col_step = _DeformConv._cal_im2col_step(input.shape[0], ctx.im2col_step) - assert (input.shape[0] % cur_im2col_step) == 0, "im2col step must divide batchsize" - - if ctx.needs_input_grad[0] or ctx.needs_input_grad[1]: - grad_input = torch.zeros_like(input) - grad_offset = torch.zeros_like(offset) - _C.deform_conv_backward_input( - input, - offset, - grad_output, - grad_input, - grad_offset, - weight, - ctx.bufs_[0], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - cur_im2col_step, - ) - - if ctx.needs_input_grad[2]: - grad_weight = torch.zeros_like(weight) - _C.deform_conv_backward_filter( - input, - offset, - grad_output, - grad_weight, - ctx.bufs_[0], - ctx.bufs_[1], - weight.size(3), - weight.size(2), - ctx.stride[1], - ctx.stride[0], - ctx.padding[1], - ctx.padding[0], - ctx.dilation[1], - ctx.dilation[0], - ctx.groups, - ctx.deformable_groups, - 1, - cur_im2col_step, - ) - - return grad_input, grad_offset, grad_weight, None, None, None, None, None, None - - @staticmethod - def _output_size(input, weight, padding, dilation, stride): - channels = weight.size(0) - output_size = (input.size(0), channels) - for d in range(input.dim() - 2): - in_size = input.size(d + 2) - pad = padding[d] - kernel = dilation[d] * (weight.size(d + 2) - 1) + 1 - stride_ = stride[d] - output_size += ((in_size + (2 * pad) - kernel) // stride_ + 1,) - if not all(map(lambda s: s > 0, output_size)): - raise ValueError( - "convolution input is too small (output would be {})".format( - "x".join(map(str, output_size)) - ) - ) - return output_size - - @staticmethod - @lru_cache(maxsize=128) - def _cal_im2col_step(input_size, default_size): - """ - Calculate proper im2col step size, which should be divisible by input_size and not larger - than prefer_size. Meanwhile the step size should be as large as possible to be more - efficient. So we choose the largest one among all divisors of input_size which are smaller - than prefer_size. - :param input_size: input batch size . - :param default_size: default preferred im2col step size. - :return: the largest proper step size. 
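The step-selection rule documented above (the largest divisor of the batch size that does not exceed the preferred step) can be checked with a standalone brute-force sketch; this is not the deleted implementation, which searches divisors only up to the square root for efficiency, but it computes the same answers:

```
def largest_divisor_step(input_size, prefer_size):
    # Largest divisor of input_size that is <= prefer_size.
    if input_size <= prefer_size:
        return input_size
    for step in range(prefer_size, 0, -1):
        if input_size % step == 0:
            return step

assert largest_divisor_step(10, 64) == 10   # small batch: one step
assert largest_divisor_step(100, 64) == 50  # 50 divides 100 and 50 <= 64
assert largest_divisor_step(97, 64) == 1    # prime batch falls back to 1
```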
- """ - if input_size <= default_size: - return input_size - best_step = 1 - for step in range(2, min(int(math.sqrt(input_size)) + 1, default_size)): - if input_size % step == 0: - if input_size // step <= default_size: - return input_size // step - best_step = step - - return best_step - - -class _ModulatedDeformConv(Function): - @staticmethod - def forward( - ctx, - input, - offset, - mask, - weight, - bias=None, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - ): - ctx.stride = stride - ctx.padding = padding - ctx.dilation = dilation - ctx.groups = groups - ctx.deformable_groups = deformable_groups - ctx.with_bias = bias is not None - if not ctx.with_bias: - bias = input.new_empty(1) # fake tensor - if not input.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - if ( - weight.requires_grad - or mask.requires_grad - or offset.requires_grad - or input.requires_grad - ): - ctx.save_for_backward(input, offset, mask, weight, bias) - output = input.new_empty(_ModulatedDeformConv._infer_shape(ctx, input, weight)) - ctx._bufs = [input.new_empty(0), input.new_empty(0)] - _C.modulated_deform_conv_forward( - input, - weight, - bias, - ctx._bufs[0], - offset, - mask, - output, - ctx._bufs[1], - weight.shape[2], - weight.shape[3], - ctx.stride, - ctx.stride, - ctx.padding, - ctx.padding, - ctx.dilation, - ctx.dilation, - ctx.groups, - ctx.deformable_groups, - ctx.with_bias, - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - if not grad_output.is_cuda: - raise NotImplementedError("Deformable Conv is not supported on CPUs!") - input, offset, mask, weight, bias = ctx.saved_tensors - grad_input = torch.zeros_like(input) - grad_offset = torch.zeros_like(offset) - grad_mask = torch.zeros_like(mask) - grad_weight = torch.zeros_like(weight) - grad_bias = torch.zeros_like(bias) - _C.modulated_deform_conv_backward( - input, - weight, - bias, - ctx._bufs[0], - offset, - mask, - ctx._bufs[1], - grad_input, - grad_weight, - grad_bias, - grad_offset, - grad_mask, - grad_output, - weight.shape[2], - weight.shape[3], - ctx.stride, - ctx.stride, - ctx.padding, - ctx.padding, - ctx.dilation, - ctx.dilation, - ctx.groups, - ctx.deformable_groups, - ctx.with_bias, - ) - if not ctx.with_bias: - grad_bias = None - - return ( - grad_input, - grad_offset, - grad_mask, - grad_weight, - grad_bias, - None, - None, - None, - None, - None, - ) - - @staticmethod - def _infer_shape(ctx, input, weight): - n = input.size(0) - channels_out = weight.size(0) - height, width = input.shape[2:4] - kernel_h, kernel_w = weight.shape[2:4] - height_out = ( - height + 2 * ctx.padding - (ctx.dilation * (kernel_h - 1) + 1) - ) // ctx.stride + 1 - width_out = ( - width + 2 * ctx.padding - (ctx.dilation * (kernel_w - 1) + 1) - ) // ctx.stride + 1 - return n, channels_out, height_out, width_out - - -deform_conv = _DeformConv.apply -modulated_deform_conv = _ModulatedDeformConv.apply - - -class DeformConv(nn.Module): - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - bias=False, - norm=None, - activation=None, - ): - """ - Deformable convolution from :paper:`deformconv`. - - Arguments are similar to :class:`Conv2D`. Extra arguments: - - Args: - deformable_groups (int): number of groups used in deformable convolution. 
- norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - """ - super(DeformConv, self).__init__() - - assert not bias - assert in_channels % groups == 0, "in_channels {} cannot be divisible by groups {}".format( - in_channels, groups - ) - assert ( - out_channels % groups == 0 - ), "out_channels {} cannot be divisible by groups {}".format(out_channels, groups) - - self.in_channels = in_channels - self.out_channels = out_channels - self.kernel_size = _pair(kernel_size) - self.stride = _pair(stride) - self.padding = _pair(padding) - self.dilation = _pair(dilation) - self.groups = groups - self.deformable_groups = deformable_groups - self.norm = norm - self.activation = activation - - self.weight = nn.Parameter( - torch.Tensor(out_channels, in_channels // self.groups, *self.kernel_size) - ) - self.bias = None - - nn.init.kaiming_uniform_(self.weight, nonlinearity="relu") - - def forward(self, x, offset): - if x.numel() == 0: - # When input is empty, we want to return a empty tensor with "correct" shape, - # So that the following operations will not panic - # if they check for the shape of the tensor. - # This computes the height and width of the output tensor - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // s + 1 - for i, p, di, k, s in zip( - x.shape[-2:], self.padding, self.dilation, self.kernel_size, self.stride - ) - ] - output_shape = [x.shape[0], self.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) - - x = deform_conv( - x, - offset, - self.weight, - self.stride, - self.padding, - self.dilation, - self.groups, - self.deformable_groups, - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - def extra_repr(self): - tmpstr = "in_channels=" + str(self.in_channels) - tmpstr += ", out_channels=" + str(self.out_channels) - tmpstr += ", kernel_size=" + str(self.kernel_size) - tmpstr += ", stride=" + str(self.stride) - tmpstr += ", padding=" + str(self.padding) - tmpstr += ", dilation=" + str(self.dilation) - tmpstr += ", groups=" + str(self.groups) - tmpstr += ", deformable_groups=" + str(self.deformable_groups) - tmpstr += ", bias=False" - return tmpstr - - -class ModulatedDeformConv(nn.Module): - def __init__( - self, - in_channels, - out_channels, - kernel_size, - stride=1, - padding=0, - dilation=1, - groups=1, - deformable_groups=1, - bias=True, - norm=None, - activation=None, - ): - """ - Modulated deformable convolution from :paper:`deformconv2`. - - Arguments are similar to :class:`Conv2D`. Extra arguments: - - Args: - deformable_groups (int): number of groups used in deformable convolution. 
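A typical way to drive the `DeformConv` module above is to predict its offsets with an ordinary convolution. The sketch below uses torchvision's `DeformConv2d` as a stand-in (the deleted module requires the compiled `_C` extension and a GPU); the block structure and zero-initialization are our own illustration, not part of the deleted file:

```
import torch
from torch import nn
from torchvision.ops import DeformConv2d  # stand-in for the deleted module

class DeformBlock(nn.Module):
    def __init__(self, cin, cout, k=3):
        super().__init__()
        # One deformable group: the offset conv emits 2*k*k channels
        # (dy, dx per kernel tap), matching the convention above.
        self.offset = nn.Conv2d(cin, 2 * k * k, k, padding=k // 2)
        nn.init.zeros_(self.offset.weight)  # start as a regular conv
        nn.init.zeros_(self.offset.bias)
        self.conv = DeformConv2d(cin, cout, k, padding=k // 2)

    def forward(self, x):
        return self.conv(x, self.offset(x))

y = DeformBlock(4, 6)(torch.randn(2, 4, 8, 8))
print(y.shape)  # torch.Size([2, 6, 8, 8])
```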
- norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - """ - super(ModulatedDeformConv, self).__init__() - self.in_channels = in_channels - self.out_channels = out_channels - self.kernel_size = _pair(kernel_size) - self.stride = stride - self.padding = padding - self.dilation = dilation - self.groups = groups - self.deformable_groups = deformable_groups - self.with_bias = bias - self.norm = norm - self.activation = activation - - self.weight = nn.Parameter( - torch.Tensor(out_channels, in_channels // groups, *self.kernel_size) - ) - if bias: - self.bias = nn.Parameter(torch.Tensor(out_channels)) - else: - self.bias = None - - nn.init.kaiming_uniform_(self.weight, nonlinearity="relu") - if self.bias is not None: - nn.init.constant_(self.bias, 0) - - def forward(self, x, offset, mask): - if x.numel() == 0: - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // s + 1 - for i, p, di, k, s in zip( - x.shape[-2:], self.padding, self.dilation, self.kernel_size, self.stride - ) - ] - output_shape = [x.shape[0], self.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) - - x = modulated_deform_conv( - x, - offset, - mask, - self.weight, - self.bias, - self.stride, - self.padding, - self.dilation, - self.groups, - self.deformable_groups, - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - def extra_repr(self): - tmpstr = "in_channels=" + str(self.in_channels) - tmpstr += ", out_channels=" + str(self.out_channels) - tmpstr += ", kernel_size=" + str(self.kernel_size) - tmpstr += ", stride=" + str(self.stride) - tmpstr += ", padding=" + str(self.padding) - tmpstr += ", dilation=" + str(self.dilation) - tmpstr += ", groups=" + str(self.groups) - tmpstr += ", deformable_groups=" + str(self.deformable_groups) - tmpstr += ", bias=" + str(self.with_bias) - return tmpstr diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py deleted file mode 100755 index cf4d5e9b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/losses.py +++ /dev/null @@ -1,133 +0,0 @@ -import math -import torch - - -def diou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Distance Intersection over Union Loss (Zhaohui Zheng et. al) - https://arxiv.org/abs/1911.08287 - Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. 
- eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - # TODO: use torch._assert_async() when pytorch 1.8 support is dropped - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsct = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsct[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - union = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsct + eps - iou = intsct / union - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - diag_len = ((xc2 - xc1) ** 2) + ((yc2 - yc1) ** 2) + eps - - # centers of boxes - x_p = (x2 + x1) / 2 - y_p = (y2 + y1) / 2 - x_g = (x1g + x2g) / 2 - y_g = (y1g + y2g) / 2 - distance = ((x_p - x_g) ** 2) + ((y_p - y_g) ** 2) - - # Eqn. (7) - loss = 1 - iou + (distance / diag_len) - if reduction == "mean": - loss = loss.mean() if loss.numel() > 0 else 0.0 * loss.sum() - elif reduction == "sum": - loss = loss.sum() - - return loss - - -def ciou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Complete Intersection over Union Loss (Zhaohui Zheng et. al) - https://arxiv.org/abs/1911.08287 - Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. - eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - # TODO: use torch._assert_async() when pytorch 1.8 support is dropped - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsct = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsct[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - union = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsct + eps - iou = intsct / union - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - diag_len = ((xc2 - xc1) ** 2) + ((yc2 - yc1) ** 2) + eps - - # centers of boxes - x_p = (x2 + x1) / 2 - y_p = (y2 + y1) / 2 - x_g = (x1g + x2g) / 2 - y_g = (y1g + y2g) / 2 - distance = ((x_p - x_g) ** 2) + ((y_p - y_g) ** 2) - - # width and height of boxes - w_pred = x2 - x1 - h_pred = y2 - y1 - w_gt = x2g - x1g - h_gt = y2g - y1g - v = (4 / (math.pi ** 2)) * torch.pow((torch.atan(w_gt / h_gt) - torch.atan(w_pred / h_pred)), 2) - with torch.no_grad(): - alpha = v / (1 - iou + v + eps) - - # Eqn. 
(10) - loss = 1 - iou + (distance / diag_len) + alpha * v - if reduction == "mean": - loss = loss.mean() if loss.numel() > 0 else 0.0 * loss.sum() - elif reduction == "sum": - loss = loss.sum() - - return loss diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py deleted file mode 100755 index e7a9f3a3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/mask_ops.py +++ /dev/null @@ -1,275 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import Tuple -import torch -from PIL import Image -from torch.nn import functional as F - -__all__ = ["paste_masks_in_image"] - - -BYTES_PER_FLOAT = 4 -# TODO: This memory limit may be too much or too little. It would be better to -# determine it based on available resources. -GPU_MEM_LIMIT = 1024 ** 3 # 1 GB memory limit - - -def _do_paste_mask(masks, boxes, img_h: int, img_w: int, skip_empty: bool = True): - """ - Args: - masks: N, 1, H, W - boxes: N, 4 - img_h, img_w (int): - skip_empty (bool): only paste masks within the region that - tightly bound all boxes, and returns the results this region only. - An important optimization for CPU. - - Returns: - if skip_empty == False, a mask of shape (N, img_h, img_w) - if skip_empty == True, a mask of shape (N, h', w'), and the slice - object for the corresponding region. - """ - # On GPU, paste all masks together (up to chunk size) - # by using the entire image to sample the masks - # Compared to pasting them one by one, - # this has more operations but is faster on COCO-scale dataset. - device = masks.device - - if skip_empty and not torch.jit.is_scripting(): - x0_int, y0_int = torch.clamp(boxes.min(dim=0).values.floor()[:2] - 1, min=0).to( - dtype=torch.int32 - ) - x1_int = torch.clamp(boxes[:, 2].max().ceil() + 1, max=img_w).to(dtype=torch.int32) - y1_int = torch.clamp(boxes[:, 3].max().ceil() + 1, max=img_h).to(dtype=torch.int32) - else: - x0_int, y0_int = 0, 0 - x1_int, y1_int = img_w, img_h - x0, y0, x1, y1 = torch.split(boxes, 1, dim=1) # each is Nx1 - - N = masks.shape[0] - - img_y = torch.arange(y0_int, y1_int, device=device, dtype=torch.float32) + 0.5 - img_x = torch.arange(x0_int, x1_int, device=device, dtype=torch.float32) + 0.5 - img_y = (img_y - y0) / (y1 - y0) * 2 - 1 - img_x = (img_x - x0) / (x1 - x0) * 2 - 1 - # img_x, img_y have shapes (N, w), (N, h) - - gx = img_x[:, None, :].expand(N, img_y.size(1), img_x.size(1)) - gy = img_y[:, :, None].expand(N, img_y.size(1), img_x.size(1)) - grid = torch.stack([gx, gy], dim=3) - - if not torch.jit.is_scripting(): - if not masks.dtype.is_floating_point: - masks = masks.float() - img_masks = F.grid_sample(masks, grid.to(masks.dtype), align_corners=False) - - if skip_empty and not torch.jit.is_scripting(): - return img_masks[:, 0], (slice(y0_int, y1_int), slice(x0_int, x1_int)) - else: - return img_masks[:, 0], () - - -# Annotate boxes as Tensor (but not Boxes) in order to use scripting -@torch.jit.script_if_tracing -def paste_masks_in_image( - masks: torch.Tensor, boxes: torch.Tensor, image_shape: Tuple[int, int], threshold: float = 0.5 -): - """ - Paste a set of masks that are of a fixed resolution (e.g., 28 x 28) into an image. - The location, height, and width for pasting each mask is determined by their - corresponding bounding boxes in boxes. - - Note: - This is a complicated but more accurate implementation. 
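A quick hand check of the DIoU loss defined above (Eqn. 7), using two disjoint unit squares in XYXY format; all numbers below are computed by hand for illustration:

```
# Boxes: b1 = [0, 0, 1, 1], b2 = [2, 0, 3, 1] -- no overlap, so iou = 0.
iou = 0.0
diag_sq = (3 - 0) ** 2 + (1 - 0) ** 2          # enclosing box [0, 0, 3, 1]
dist_sq = (2.5 - 0.5) ** 2 + (0.5 - 0.5) ** 2  # squared center distance
loss = 1 - iou + dist_sq / diag_sq
print(loss)  # 1.4
```

Unlike a plain IoU loss, which saturates at 1.0 for every pair of disjoint boxes, the distance term keeps shrinking as the boxes approach each other, which is what gives DIoU (and CIoU) a useful gradient in this regime.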
In actual deployment, it is - often enough to use a faster but less accurate implementation. - See :func:`paste_mask_in_image_old` in this file for an alternative implementation. - - Args: - masks (tensor): Tensor of shape (Bimg, Hmask, Wmask), where Bimg is the number of - detected object instances in the image and Hmask, Wmask are the mask width and mask - height of the predicted mask (e.g., Hmask = Wmask = 28). Values are in [0, 1]. - boxes (Boxes or Tensor): A Boxes of length Bimg or Tensor of shape (Bimg, 4). - boxes[i] and masks[i] correspond to the same object instance. - image_shape (tuple): height, width - threshold (float): A threshold in [0, 1] for converting the (soft) masks to - binary masks. - - Returns: - img_masks (Tensor): A tensor of shape (Bimg, Himage, Wimage), where Bimg is the - number of detected object instances and Himage, Wimage are the image width - and height. img_masks[i] is a binary mask for object instance i. - """ - - assert masks.shape[-1] == masks.shape[-2], "Only square mask predictions are supported" - N = len(masks) - if N == 0: - return masks.new_empty((0,) + image_shape, dtype=torch.uint8) - if not isinstance(boxes, torch.Tensor): - boxes = boxes.tensor - device = boxes.device - assert len(boxes) == N, boxes.shape - - img_h, img_w = image_shape - - # The actual implementation split the input into chunks, - # and paste them chunk by chunk. - if device.type == "cpu" or torch.jit.is_scripting(): - # CPU is most efficient when they are pasted one by one with skip_empty=True - # so that it performs minimal number of operations. - num_chunks = N - else: - # GPU benefits from parallelism for larger chunks, but may have memory issue - # int(img_h) because shape may be tensors in tracing - num_chunks = int(np.ceil(N * int(img_h) * int(img_w) * BYTES_PER_FLOAT / GPU_MEM_LIMIT)) - assert ( - num_chunks <= N - ), "Default GPU_MEM_LIMIT in mask_ops.py is too small; try increasing it" - chunks = torch.chunk(torch.arange(N, device=device), num_chunks) - - img_masks = torch.zeros( - N, img_h, img_w, device=device, dtype=torch.bool if threshold >= 0 else torch.uint8 - ) - for inds in chunks: - masks_chunk, spatial_inds = _do_paste_mask( - masks[inds, None, :, :], boxes[inds], img_h, img_w, skip_empty=device.type == "cpu" - ) - - if threshold >= 0: - masks_chunk = (masks_chunk >= threshold).to(dtype=torch.bool) - else: - # for visualization and debugging - masks_chunk = (masks_chunk * 255).to(dtype=torch.uint8) - - if torch.jit.is_scripting(): # Scripting does not use the optimized codepath - img_masks[inds] = masks_chunk - else: - img_masks[(inds,) + spatial_inds] = masks_chunk - return img_masks - - -# The below are the original paste function (from Detectron1) which has -# larger quantization error. -# It is faster on CPU, while the aligned one is faster on GPU thanks to grid_sample. - - -def paste_mask_in_image_old(mask, box, img_h, img_w, threshold): - """ - Paste a single mask in an image. - This is a per-box implementation of :func:`paste_masks_in_image`. - This function has larger quantization error due to incorrect pixel - modeling and is not used any more. - - Args: - mask (Tensor): A tensor of shape (Hmask, Wmask) storing the mask of a single - object instance. Values are in [0, 1]. - box (Tensor): A tensor of shape (4, ) storing the x0, y0, x1, y1 box corners - of the object instance. - img_h, img_w (int): Image height and width. - threshold (float): Mask binarization threshold in [0, 1]. 
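The chunking arithmetic in `paste_masks_in_image` above is worth spelling out: the number of chunks is chosen so that one batch of pasted full-image float masks stays under the 1 GB limit. A standalone sketch of that calculation:

```
import math

BYTES_PER_FLOAT = 4
GPU_MEM_LIMIT = 1024 ** 3  # 1 GB, as in the deleted module

def num_paste_chunks(n_instances, img_h, img_w):
    # ceil(total bytes of N full-image float masks / memory limit)
    return int(math.ceil(n_instances * img_h * img_w * BYTES_PER_FLOAT / GPU_MEM_LIMIT))

# 500 instances on a 1333x800 image is about 2 GB of float masks -> 2 chunks.
print(num_paste_chunks(500, 800, 1333))  # 2
```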
- - Returns: - im_mask (Tensor): - The resized and binarized object mask pasted into the original - image plane (a tensor of shape (img_h, img_w)). - """ - # Conversion from continuous box coordinates to discrete pixel coordinates - # via truncation (cast to int32). This determines which pixels to paste the - # mask onto. - box = box.to(dtype=torch.int32) # Continuous to discrete coordinate conversion - # An example (1D) box with continuous coordinates (x0=0.7, x1=4.3) will map to - # a discrete coordinates (x0=0, x1=4). Note that box is mapped to 5 = x1 - x0 + 1 - # pixels (not x1 - x0 pixels). - samples_w = box[2] - box[0] + 1 # Number of pixel samples, *not* geometric width - samples_h = box[3] - box[1] + 1 # Number of pixel samples, *not* geometric height - - # Resample the mask from it's original grid to the new samples_w x samples_h grid - mask = Image.fromarray(mask.cpu().numpy()) - mask = mask.resize((samples_w, samples_h), resample=Image.BILINEAR) - mask = np.array(mask, copy=False) - - if threshold >= 0: - mask = np.array(mask > threshold, dtype=np.uint8) - mask = torch.from_numpy(mask) - else: - # for visualization and debugging, we also - # allow it to return an unmodified mask - mask = torch.from_numpy(mask * 255).to(torch.uint8) - - im_mask = torch.zeros((img_h, img_w), dtype=torch.uint8) - x_0 = max(box[0], 0) - x_1 = min(box[2] + 1, img_w) - y_0 = max(box[1], 0) - y_1 = min(box[3] + 1, img_h) - - im_mask[y_0:y_1, x_0:x_1] = mask[ - (y_0 - box[1]) : (y_1 - box[1]), (x_0 - box[0]) : (x_1 - box[0]) - ] - return im_mask - - -# Our pixel modeling requires extrapolation for any continuous -# coordinate < 0.5 or > length - 0.5. When sampling pixels on the masks, -# we would like this extrapolation to be an interpolation between boundary values and zero, -# instead of using absolute zero or boundary values. -# Therefore `paste_mask_in_image_old` is often used with zero padding around the masks like this: -# masks, scale = pad_masks(masks[:, 0, :, :], 1) -# boxes = scale_boxes(boxes.tensor, scale) - - -def pad_masks(masks, padding): - """ - Args: - masks (tensor): A tensor of shape (B, M, M) representing B masks. - padding (int): Number of cells to pad on all sides. - - Returns: - The padded masks and the scale factor of the padding size / original size. - """ - B = masks.shape[0] - M = masks.shape[-1] - pad2 = 2 * padding - scale = float(M + pad2) / M - padded_masks = masks.new_zeros((B, M + pad2, M + pad2)) - padded_masks[:, padding:-padding, padding:-padding] = masks - return padded_masks, scale - - -def scale_boxes(boxes, scale): - """ - Args: - boxes (tensor): A tensor of shape (B, 4) representing B boxes with 4 - coords representing the corners x0, y0, x1, y1, - scale (float): The box scaling factor. - - Returns: - Scaled boxes. - """ - w_half = (boxes[:, 2] - boxes[:, 0]) * 0.5 - h_half = (boxes[:, 3] - boxes[:, 1]) * 0.5 - x_c = (boxes[:, 2] + boxes[:, 0]) * 0.5 - y_c = (boxes[:, 3] + boxes[:, 1]) * 0.5 - - w_half *= scale - h_half *= scale - - scaled_boxes = torch.zeros_like(boxes) - scaled_boxes[:, 0] = x_c - w_half - scaled_boxes[:, 2] = x_c + w_half - scaled_boxes[:, 1] = y_c - h_half - scaled_boxes[:, 3] = y_c + h_half - return scaled_boxes - - -@torch.jit.script_if_tracing -def _paste_masks_tensor_shape( - masks: torch.Tensor, - boxes: torch.Tensor, - image_shape: Tuple[torch.Tensor, torch.Tensor], - threshold: float = 0.5, -): - """ - A wrapper of paste_masks_in_image where image_shape is Tensor. - During tracing, shapes might be tensors instead of ints. 
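The comment above pairs `pad_masks` with `scale_boxes`: padding each M x M mask with one ring of zeros and scaling its box by the matching factor (M + 2·padding) / M leaves the mask-to-box geometry unchanged, while sampling just outside the original mask now interpolates toward zero instead of clamping to the border value. A short sketch of the pairing:

```
import torch

masks = torch.rand(2, 28, 28)
padding = 1
M = masks.shape[-1]
scale = float(M + 2 * padding) / M  # 30 / 28

padded = masks.new_zeros((masks.shape[0], M + 2 * padding, M + 2 * padding))
padded[:, padding:-padding, padding:-padding] = masks

box = torch.tensor([[10.0, 10.0, 38.0, 38.0]])  # 28 pixels wide
w_half = (box[:, 2] - box[:, 0]) * 0.5 * scale
x_c = (box[:, 2] + box[:, 0]) * 0.5
print(float(x_c - w_half), float(x_c + w_half))  # 9.0 39.0: widened 28 -> 30 px
```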
The Tensor->int - conversion should be scripted rather than traced. - """ - return paste_masks_in_image(masks, boxes, (int(image_shape[0]), int(image_shape[1])), threshold) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py deleted file mode 100755 index 6b6be71c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/nms.py +++ /dev/null @@ -1,139 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import torch -from torchvision.ops import boxes as box_ops -from torchvision.ops import nms # noqa . for compatibility - - -def batched_nms( - boxes: torch.Tensor, scores: torch.Tensor, idxs: torch.Tensor, iou_threshold: float -): - """ - Same as torchvision.ops.boxes.batched_nms, but with float(). - """ - assert boxes.shape[-1] == 4 - # Note: Torchvision already has a strategy (https://github.com/pytorch/vision/issues/1311) - # to decide whether to use coordinate trick or for loop to implement batched_nms. So we - # just call it directly. - # Fp16 does not have enough range for batched NMS, so adding float(). - return box_ops.batched_nms(boxes.float(), scores, idxs, iou_threshold) - - -# Note: this function (nms_rotated) might be moved into -# torchvision/ops/boxes.py in the future -def nms_rotated(boxes, scores, iou_threshold): - """ - Performs non-maximum suppression (NMS) on the rotated boxes according - to their intersection-over-union (IoU). - - Rotated NMS iteratively removes lower scoring rotated boxes which have an - IoU greater than iou_threshold with another (higher scoring) rotated box. - - Note that RotatedBox (5, 3, 4, 2, -90) covers exactly the same region as - RotatedBox (5, 3, 4, 2, 90) does, and their IoU will be 1. However, they - can be representing completely different objects in certain tasks, e.g., OCR. - - As for the question of whether rotated-NMS should treat them as faraway boxes - even though their IOU is 1, it depends on the application and/or ground truth annotation. - - As an extreme example, consider a single character v and the square box around it. - - If the angle is 0 degree, the object (text) would be read as 'v'; - - If the angle is 90 degrees, the object (text) would become '>'; - - If the angle is 180 degrees, the object (text) would become '^'; - - If the angle is 270/-90 degrees, the object (text) would become '<' - - All of these cases have IoU of 1 to each other, and rotated NMS that only - uses IoU as criterion would only keep one of them with the highest score - - which, practically, still makes sense in most cases because typically - only one of theses orientations is the correct one. Also, it does not matter - as much if the box is only used to classify the object (instead of transcribing - them with a sequential OCR recognition model) later. - - On the other hand, when we use IoU to filter proposals that are close to the - ground truth during training, we should definitely take the angle into account if - we know the ground truth is labeled with the strictly correct orientation (as in, - upside-down words are annotated with -180 degrees even though they can be covered - with a 0/90/-90 degree box, etc.) - - The way the original dataset is annotated also matters. 
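The per-class strategy that `batched_nms` above delegates to torchvision can be written out directly: shift each class's boxes by a class-dependent offset larger than any coordinate span, so boxes of different classes can never overlap, then run a single NMS pass. A sketch with hand-picked toy values:

```
import torch
from torchvision.ops import nms

boxes = torch.tensor([[0.0, 0.0, 10.0, 10.0],
                      [1.0, 1.0, 11.0, 11.0]])  # heavy overlap (IoU ~ 0.68)
scores = torch.tensor([0.9, 0.8])
idxs = torch.tensor([0, 1])                     # but different classes

offsets = idxs.to(boxes) * (boxes.max() + 1)    # class 1 shifted by 12
keep = nms(boxes + offsets[:, None], scores, iou_threshold=0.5)
print(keep)  # tensor([0, 1]) -- both survive despite IoU > 0.5
```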
For example, if the dataset - is a 4-point polygon dataset that does not enforce ordering of vertices/orientation, - we can estimate a minimum rotated bounding box to this polygon, but there's no way - we can tell the correct angle with 100% confidence (as shown above, there could be 4 different - rotated boxes, with angles differed by 90 degrees to each other, covering the exactly - same region). In that case we have to just use IoU to determine the box - proximity (as many detection benchmarks (even for text) do) unless there're other - assumptions we can make (like width is always larger than height, or the object is not - rotated by more than 90 degrees CCW/CW, etc.) - - In summary, not considering angles in rotated NMS seems to be a good option for now, - but we should be aware of its implications. - - Args: - boxes (Tensor[N, 5]): Rotated boxes to perform NMS on. They are expected to be in - (x_center, y_center, width, height, angle_degrees) format. - scores (Tensor[N]): Scores for each one of the rotated boxes - iou_threshold (float): Discards all overlapping rotated boxes with IoU < iou_threshold - - Returns: - keep (Tensor): int64 tensor with the indices of the elements that have been kept - by Rotated NMS, sorted in decreasing order of scores - """ - return torch.ops.detectron2.nms_rotated(boxes, scores, iou_threshold) - - -# Note: this function (batched_nms_rotated) might be moved into -# torchvision/ops/boxes.py in the future -def batched_nms_rotated(boxes, scores, idxs, iou_threshold): - """ - Performs non-maximum suppression in a batched fashion. - - Each index value correspond to a category, and NMS - will not be applied between elements of different categories. - - Args: - boxes (Tensor[N, 5]): - boxes where NMS will be performed. They - are expected to be in (x_ctr, y_ctr, width, height, angle_degrees) format - scores (Tensor[N]): - scores for each one of the boxes - idxs (Tensor[N]): - indices of the categories for each one of the boxes. - iou_threshold (float): - discards all overlapping boxes - with IoU < iou_threshold - - Returns: - Tensor: - int64 tensor with the indices of the elements that have been kept - by NMS, sorted in decreasing order of scores - """ - assert boxes.shape[-1] == 5 - - if boxes.numel() == 0: - return torch.empty((0,), dtype=torch.int64, device=boxes.device) - boxes = boxes.float() # fp16 does not have enough range for batched NMS - # Strategy: in order to perform NMS independently per class, - # we add an offset to all the boxes. The offset is dependent - # only on the class idx, and is large enough so that boxes - # from different classes do not overlap - - # Note that batched_nms in torchvision/ops/boxes.py only uses max_coordinate, - # which won't handle negative coordinates correctly. - # Here by using min_coordinate we can make sure the negative coordinates are - # correctly handled. 
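The per-class offset strategy described in the comments above is easiest to see with plain axis-aligned boxes. A standalone sketch with made-up boxes and threshold: shifting each class into a disjoint coordinate band lets a single NMS call perform per-class suppression.

```
import torch
from torchvision.ops import nms

boxes = torch.tensor([[0.0, 0.0, 10.0, 10.0],
                      [1.0, 1.0, 11.0, 11.0],
                      [0.0, 0.0, 10.0, 10.0]])
scores = torch.tensor([0.9, 0.8, 0.7])
idxs = torch.tensor([0, 0, 1])                # category id of each box

offsets = idxs.to(boxes) * (boxes.max() + 1)  # one disjoint band per class
keep = nms(boxes + offsets[:, None], scores, iou_threshold=0.5)
# keep == tensor([0, 2]): box 1 is suppressed by box 0 (same class), while
# the identical box 2 survives because it belongs to a different class.
```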
- max_coordinate = ( - torch.max(boxes[:, 0], boxes[:, 1]) + torch.max(boxes[:, 2], boxes[:, 3]) / 2 - ).max() - min_coordinate = ( - torch.min(boxes[:, 0], boxes[:, 1]) - torch.max(boxes[:, 2], boxes[:, 3]) / 2 - ).min() - offsets = idxs.to(boxes) * (max_coordinate - min_coordinate + 1) - boxes_for_nms = boxes.clone() # avoid modifying the original values in boxes - boxes_for_nms[:, :2] += offsets[:, None] - keep = nms_rotated(boxes_for_nms, scores, iou_threshold) - return keep diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py deleted file mode 100755 index 163462e1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align.py +++ /dev/null @@ -1,74 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from torch import nn -from torchvision.ops import roi_align - - -# NOTE: torchvision's RoIAlign has a different default aligned=False -class ROIAlign(nn.Module): - def __init__(self, output_size, spatial_scale, sampling_ratio, aligned=True): - """ - Args: - output_size (tuple): h, w - spatial_scale (float): scale the input boxes by this number - sampling_ratio (int): number of inputs samples to take for each output - sample. 0 to take samples densely. - aligned (bool): if False, use the legacy implementation in - Detectron. If True, align the results more perfectly. - - Note: - The meaning of aligned=True: - - Given a continuous coordinate c, its two neighboring pixel indices (in our - pixel model) are computed by floor(c - 0.5) and ceil(c - 0.5). For example, - c=1.3 has pixel neighbors with discrete indices [0] and [1] (which are sampled - from the underlying signal at continuous coordinates 0.5 and 1.5). But the original - roi_align (aligned=False) does not subtract the 0.5 when computing neighboring - pixel indices and therefore it uses pixels with a slightly incorrect alignment - (relative to our pixel model) when performing bilinear interpolation. - - With `aligned=True`, - we first appropriately scale the ROI and then shift it by -0.5 - prior to calling roi_align. This produces the correct neighbors; see - detectron2/tests/test_roi_align.py for verification. - - The difference does not make a difference to the model's performance if - ROIAlign is used together with conv layers. - """ - super().__init__() - self.output_size = output_size - self.spatial_scale = spatial_scale - self.sampling_ratio = sampling_ratio - self.aligned = aligned - - from torchvision import __version__ - - version = tuple(int(x) for x in __version__.split(".")[:2]) - # https://github.com/pytorch/vision/pull/2438 - assert version >= (0, 7), "Require torchvision >= 0.7" - - def forward(self, input, rois): - """ - Args: - input: NCHW images - rois: Bx5 boxes. First column is the index into N. The other 4 columns are xyxy. 
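The half-pixel alignment issue explained in the class note above can be observed directly with torchvision's `roi_align` (dummy feature map; torchvision >= 0.7 for the `aligned` flag):

```
import torch
from torchvision.ops import roi_align

feat = torch.arange(16.0).reshape(1, 1, 4, 4)     # 1 x 1 x 4 x 4 feature map
rois = torch.tensor([[0.0, 0.0, 0.0, 2.0, 2.0]])  # (batch_idx, x0, y0, x1, y1)

out_aligned = roi_align(feat, rois, output_size=(2, 2), spatial_scale=1.0,
                        sampling_ratio=1, aligned=True)
out_legacy = roi_align(feat, rois, output_size=(2, 2), spatial_scale=1.0,
                       sampling_ratio=1, aligned=False)
# The two results differ by exactly the half-pixel shift described above.
```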
- """ - assert rois.dim() == 2 and rois.size(1) == 5 - if input.is_quantized: - input = input.dequantize() - return roi_align( - input, - rois.to(dtype=input.dtype), - self.output_size, - self.spatial_scale, - self.sampling_ratio, - self.aligned, - ) - - def __repr__(self): - tmpstr = self.__class__.__name__ + "(" - tmpstr += "output_size=" + str(self.output_size) - tmpstr += ", spatial_scale=" + str(self.spatial_scale) - tmpstr += ", sampling_ratio=" + str(self.sampling_ratio) - tmpstr += ", aligned=" + str(self.aligned) - tmpstr += ")" - return tmpstr diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py deleted file mode 100755 index d097326c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/roi_align_rotated.py +++ /dev/null @@ -1,91 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -from torch import nn -from torch.autograd import Function -from torch.autograd.function import once_differentiable -from torch.nn.modules.utils import _pair - - -class _ROIAlignRotated(Function): - @staticmethod - def forward(ctx, input, roi, output_size, spatial_scale, sampling_ratio): - ctx.save_for_backward(roi) - ctx.output_size = _pair(output_size) - ctx.spatial_scale = spatial_scale - ctx.sampling_ratio = sampling_ratio - ctx.input_shape = input.size() - output = torch.ops.detectron2.roi_align_rotated_forward( - input, roi, spatial_scale, output_size[0], output_size[1], sampling_ratio - ) - return output - - @staticmethod - @once_differentiable - def backward(ctx, grad_output): - (rois,) = ctx.saved_tensors - output_size = ctx.output_size - spatial_scale = ctx.spatial_scale - sampling_ratio = ctx.sampling_ratio - bs, ch, h, w = ctx.input_shape - grad_input = torch.ops.detectron2.roi_align_rotated_backward( - grad_output, - rois, - spatial_scale, - output_size[0], - output_size[1], - bs, - ch, - h, - w, - sampling_ratio, - ) - return grad_input, None, None, None, None, None - - -roi_align_rotated = _ROIAlignRotated.apply - - -class ROIAlignRotated(nn.Module): - def __init__(self, output_size, spatial_scale, sampling_ratio): - """ - Args: - output_size (tuple): h, w - spatial_scale (float): scale the input boxes by this number - sampling_ratio (int): number of inputs samples to take for each output - sample. 0 to take samples densely. - - Note: - ROIAlignRotated supports continuous coordinate by default: - Given a continuous coordinate c, its two neighboring pixel indices (in our - pixel model) are computed by floor(c - 0.5) and ceil(c - 0.5). For example, - c=1.3 has pixel neighbors with discrete indices [0] and [1] (which are sampled - from the underlying signal at continuous coordinates 0.5 and 1.5). - """ - super(ROIAlignRotated, self).__init__() - self.output_size = output_size - self.spatial_scale = spatial_scale - self.sampling_ratio = sampling_ratio - - def forward(self, input, rois): - """ - Args: - input: NCHW images - rois: Bx6 boxes. First column is the index into N. - The other 5 columns are (x_ctr, y_ctr, width, height, angle_degrees). 
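A hypothetical call to this module (values are invented; actually running it requires detectron2's compiled `roi_align_rotated` op):

```
import torch

# One rotated ROI on image 0 of the batch: centered at (32, 24),
# 40 x 20 pixels, rotated 30 degrees counter-clockwise.
rois = torch.tensor([[0.0, 32.0, 24.0, 40.0, 20.0, 30.0]])
feat = torch.rand(1, 256, 64, 64)

pooler = ROIAlignRotated(output_size=(7, 7), spatial_scale=1.0, sampling_ratio=2)
crop = pooler(feat, rois)  # -> shape (1, 256, 7, 7)
```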
- """ - assert rois.dim() == 2 and rois.size(1) == 6 - orig_dtype = input.dtype - if orig_dtype == torch.float16: - input = input.float() - rois = rois.float() - return roi_align_rotated( - input, rois, self.output_size, self.spatial_scale, self.sampling_ratio - ).to(dtype=orig_dtype) - - def __repr__(self): - tmpstr = self.__class__.__name__ + "(" - tmpstr += "output_size=" + str(self.output_size) - tmpstr += ", spatial_scale=" + str(self.spatial_scale) - tmpstr += ", sampling_ratio=" + str(self.sampling_ratio) - tmpstr += ")" - return tmpstr diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py deleted file mode 100755 index 03f73b3b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/rotated_boxes.py +++ /dev/null @@ -1,21 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import absolute_import, division, print_function, unicode_literals -import torch - - -def pairwise_iou_rotated(boxes1, boxes2): - """ - Return intersection-over-union (Jaccard index) of boxes. - - Both sets of boxes are expected to be in - (x_center, y_center, width, height, angle) format. - - Arguments: - boxes1 (Tensor[N, 5]) - boxes2 (Tensor[M, 5]) - - Returns: - iou (Tensor[N, M]): the NxM matrix containing the pairwise - IoU values for every element in boxes1 and boxes2 - """ - return torch.ops.detectron2.box_iou_rotated(boxes1, boxes2) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py deleted file mode 100755 index fe7e8e26..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/shape_spec.py +++ /dev/null @@ -1,20 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. -from collections import namedtuple - - -class ShapeSpec(namedtuple("_ShapeSpec", ["channels", "height", "width", "stride"])): - """ - A simple structure that contains basic shape specification about a tensor. - It is often used as the auxiliary inputs/outputs of models, - to complement the lack of shape inference ability among pytorch modules. - - Attributes: - channels: - height: - width: - stride: - """ - - def __new__(cls, channels=None, height=None, width=None, stride=None): - return super().__new__(cls, channels, height, width, stride) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py deleted file mode 100755 index 29d0ef91..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/layers/wrappers.py +++ /dev/null @@ -1,132 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Wrappers around on some nn functions, mainly to support empty tensors. - -Ideally, add support directly in PyTorch to empty tensors in those functions. - -These can be removed once https://github.com/pytorch/pytorch/issues/12013 -is implemented -""" - -from typing import List, Optional -import torch -from torch.nn import functional as F - - -def shapes_to_tensor(x: List[int], device: Optional[torch.device] = None) -> torch.Tensor: - """ - Turn a list of integer scalars or integer Tensor scalars into a vector, - in a way that's both traceable and scriptable. - - In tracing, `x` should be a list of scalar Tensor, so the output can trace to the inputs. - In scripting or eager, `x` should be a list of int. 
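A tiny illustration of the `ShapeSpec` structure deleted just above: because it is a namedtuple whose fields default to `None`, partial shape information is perfectly valid.

```
spec = ShapeSpec(channels=256, stride=4)
assert spec.channels == 256 and spec.height is None

# As a namedtuple it also unpacks positionally:
channels, height, width, stride = spec
```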
- """ - if torch.jit.is_scripting(): - return torch.as_tensor(x, device=device) - if torch.jit.is_tracing(): - assert all( - [isinstance(t, torch.Tensor) for t in x] - ), "Shape should be tensor during tracing!" - # as_tensor should not be used in tracing because it records a constant - ret = torch.stack(x) - if ret.device != device: # avoid recording a hard-coded device if not necessary - ret = ret.to(device=device) - return ret - return torch.as_tensor(x, device=device) - - -def cat(tensors: List[torch.Tensor], dim: int = 0): - """ - Efficient version of torch.cat that avoids a copy if there is only a single element in a list - """ - assert isinstance(tensors, (list, tuple)) - if len(tensors) == 1: - return tensors[0] - return torch.cat(tensors, dim) - - -def cross_entropy(input, target, *, reduction="mean", **kwargs): - """ - Same as `torch.nn.functional.cross_entropy`, but returns 0 (instead of nan) - for empty inputs. - """ - if target.numel() == 0 and reduction == "mean": - return input.sum() * 0.0 # connect the gradient - return F.cross_entropy(input, target, reduction=reduction, **kwargs) - - -class _NewEmptyTensorOp(torch.autograd.Function): - @staticmethod - def forward(ctx, x, new_shape): - ctx.shape = x.shape - return x.new_empty(new_shape) - - @staticmethod - def backward(ctx, grad): - shape = ctx.shape - return _NewEmptyTensorOp.apply(grad, shape), None - - -class Conv2d(torch.nn.Conv2d): - """ - A wrapper around :class:`torch.nn.Conv2d` to support empty inputs and more features. - """ - - def __init__(self, *args, **kwargs): - """ - Extra keyword arguments supported in addition to those in `torch.nn.Conv2d`: - - Args: - norm (nn.Module, optional): a normalization layer - activation (callable(Tensor) -> Tensor): a callable activation function - - It assumes that norm layer is used before activation. - """ - norm = kwargs.pop("norm", None) - activation = kwargs.pop("activation", None) - super().__init__(*args, **kwargs) - - self.norm = norm - self.activation = activation - - def forward(self, x): - # torchscript does not support SyncBatchNorm yet - # https://github.com/pytorch/pytorch/issues/40507 - # and we skip these codes in torchscript since: - # 1. currently we only support torchscript in evaluation mode - # 2. features needed by exporting module to torchscript are added in PyTorch 1.6 or - # later version, `Conv2d` in these PyTorch versions has already supported empty inputs. - if not torch.jit.is_scripting(): - if x.numel() == 0 and self.training: - # https://github.com/pytorch/pytorch/issues/12013 - assert not isinstance( - self.norm, torch.nn.SyncBatchNorm - ), "SyncBatchNorm does not support empty inputs!" - - x = F.conv2d( - x, self.weight, self.bias, self.stride, self.padding, self.dilation, self.groups - ) - if self.norm is not None: - x = self.norm(x) - if self.activation is not None: - x = self.activation(x) - return x - - -ConvTranspose2d = torch.nn.ConvTranspose2d -BatchNorm2d = torch.nn.BatchNorm2d -interpolate = F.interpolate -Linear = torch.nn.Linear - - -def nonzero_tuple(x): - """ - A 'as_tuple=True' version of torch.nonzero to support torchscript. 
- because of https://github.com/pytorch/pytorch/issues/38718 - """ - if torch.jit.is_scripting(): - if x.dim() == 0: - return x.unsqueeze(0).nonzero().unbind(1) - return x.nonzero().unbind(1) - else: - return x.nonzero(as_tuple=True) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py deleted file mode 100755 index 62042081..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Model Zoo API for Detectron2: a collection of functions to create common model architectures -listed in `MODEL_ZOO.md `_, -and optionally load their pre-trained weights. -""" - -from .model_zoo import get, get_config_file, get_checkpoint_url, get_config - -__all__ = ["get_checkpoint_url", "get", "get_config_file", "get_config"] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py deleted file mode 100755 index 5b90bc9a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/model_zoo/model_zoo.py +++ /dev/null @@ -1,213 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import os -from typing import Optional -import pkg_resources -import torch - -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import CfgNode, LazyConfig, get_cfg, instantiate -from detectron2.modeling import build_model - - -class _ModelZooUrls(object): - """ - Mapping from names to officially released Detectron2 pre-trained models. - """ - - S3_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - - # format: {config_path.yaml} -> model_id/model_final_{commit}.pkl - CONFIG_PATH_TO_URL_SUFFIX = { - # COCO Detection with Faster R-CNN - "COCO-Detection/faster_rcnn_R_50_C4_1x": "137257644/model_final_721ade.pkl", - "COCO-Detection/faster_rcnn_R_50_DC5_1x": "137847829/model_final_51d356.pkl", - "COCO-Detection/faster_rcnn_R_50_FPN_1x": "137257794/model_final_b275ba.pkl", - "COCO-Detection/faster_rcnn_R_50_C4_3x": "137849393/model_final_f97cb7.pkl", - "COCO-Detection/faster_rcnn_R_50_DC5_3x": "137849425/model_final_68d202.pkl", - "COCO-Detection/faster_rcnn_R_50_FPN_3x": "137849458/model_final_280758.pkl", - "COCO-Detection/faster_rcnn_R_101_C4_3x": "138204752/model_final_298dad.pkl", - "COCO-Detection/faster_rcnn_R_101_DC5_3x": "138204841/model_final_3e0943.pkl", - "COCO-Detection/faster_rcnn_R_101_FPN_3x": "137851257/model_final_f6e8b1.pkl", - "COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x": "139173657/model_final_68b088.pkl", - # COCO Detection with RetinaNet - "COCO-Detection/retinanet_R_50_FPN_1x": "190397773/model_final_bfca0b.pkl", - "COCO-Detection/retinanet_R_50_FPN_3x": "190397829/model_final_5bd44e.pkl", - "COCO-Detection/retinanet_R_101_FPN_3x": "190397697/model_final_971ab9.pkl", - # COCO Detection with RPN and Fast R-CNN - "COCO-Detection/rpn_R_50_C4_1x": "137258005/model_final_450694.pkl", - "COCO-Detection/rpn_R_50_FPN_1x": "137258492/model_final_02ce48.pkl", - "COCO-Detection/fast_rcnn_R_50_FPN_1x": "137635226/model_final_e5f7ce.pkl", - # COCO Instance Segmentation Baselines with Mask R-CNN - "COCO-InstanceSegmentation/mask_rcnn_R_50_C4_1x": "137259246/model_final_9243eb.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_1x": "137260150/model_final_4f86c3.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x": 
"137260431/model_final_a54504.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x": "137849525/model_final_4ce675.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_DC5_3x": "137849551/model_final_84107b.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x": "137849600/model_final_f10217.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_C4_3x": "138363239/model_final_a2914c.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_DC5_3x": "138363294/model_final_0464b7.pkl", - "COCO-InstanceSegmentation/mask_rcnn_R_101_FPN_3x": "138205316/model_final_a3ec72.pkl", - "COCO-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_3x": "139653917/model_final_2d9806.pkl", # noqa - # New baselines using Large-Scale Jitter and Longer Training Schedule - "new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ": "42047764/model_final_bb69de.pkl", - "new_baselines/mask_rcnn_R_50_FPN_200ep_LSJ": "42047638/model_final_89a8d3.pkl", - "new_baselines/mask_rcnn_R_50_FPN_400ep_LSJ": "42019571/model_final_14d201.pkl", - "new_baselines/mask_rcnn_R_101_FPN_100ep_LSJ": "42025812/model_final_4f7b58.pkl", - "new_baselines/mask_rcnn_R_101_FPN_200ep_LSJ": "42131867/model_final_0bb7ae.pkl", - "new_baselines/mask_rcnn_R_101_FPN_400ep_LSJ": "42073830/model_final_f96b26.pkl", - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ": "42047771/model_final_b7fbab.pkl", # noqa - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_200ep_LSJ": "42132721/model_final_5d87c1.pkl", # noqa - "new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_400ep_LSJ": "42025447/model_final_f1362d.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ": "42047784/model_final_6ba57e.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_200ep_LSJ": "42047642/model_final_27b9c1.pkl", # noqa - "new_baselines/mask_rcnn_regnety_4gf_dds_FPN_400ep_LSJ": "42045954/model_final_ef3a80.pkl", # noqa - # COCO Person Keypoint Detection Baselines with Keypoint R-CNN - "COCO-Keypoints/keypoint_rcnn_R_50_FPN_1x": "137261548/model_final_04e291.pkl", - "COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x": "137849621/model_final_a6e10b.pkl", - "COCO-Keypoints/keypoint_rcnn_R_101_FPN_3x": "138363331/model_final_997cc7.pkl", - "COCO-Keypoints/keypoint_rcnn_X_101_32x8d_FPN_3x": "139686956/model_final_5ad38f.pkl", - # COCO Panoptic Segmentation Baselines with Panoptic FPN - "COCO-PanopticSegmentation/panoptic_fpn_R_50_1x": "139514544/model_final_dbfeb4.pkl", - "COCO-PanopticSegmentation/panoptic_fpn_R_50_3x": "139514569/model_final_c10459.pkl", - "COCO-PanopticSegmentation/panoptic_fpn_R_101_3x": "139514519/model_final_cafdb1.pkl", - # LVIS Instance Segmentation Baselines with Mask R-CNN - "LVISv0.5-InstanceSegmentation/mask_rcnn_R_50_FPN_1x": "144219072/model_final_571f7c.pkl", # noqa - "LVISv0.5-InstanceSegmentation/mask_rcnn_R_101_FPN_1x": "144219035/model_final_824ab5.pkl", # noqa - "LVISv0.5-InstanceSegmentation/mask_rcnn_X_101_32x8d_FPN_1x": "144219108/model_final_5e3439.pkl", # noqa - # Cityscapes & Pascal VOC Baselines - "Cityscapes/mask_rcnn_R_50_FPN": "142423278/model_final_af9cf5.pkl", - "PascalVOC-Detection/faster_rcnn_R_50_C4": "142202221/model_final_b1acc2.pkl", - # Other Settings - "Misc/mask_rcnn_R_50_FPN_1x_dconv_c3-c5": "138602867/model_final_65c703.pkl", - "Misc/mask_rcnn_R_50_FPN_3x_dconv_c3-c5": "144998336/model_final_821d0b.pkl", - "Misc/cascade_mask_rcnn_R_50_FPN_1x": "138602847/model_final_e9d89b.pkl", - "Misc/cascade_mask_rcnn_R_50_FPN_3x": "144998488/model_final_480dd8.pkl", - "Misc/mask_rcnn_R_50_FPN_3x_syncbn": "169527823/model_final_3b3c51.pkl", - 
"Misc/mask_rcnn_R_50_FPN_3x_gn": "138602888/model_final_dc5d9e.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_3x_gn": "138602908/model_final_01ca85.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_9x_gn": "183808979/model_final_da7b4c.pkl", - "Misc/scratch_mask_rcnn_R_50_FPN_9x_syncbn": "184226666/model_final_5ce33e.pkl", - "Misc/panoptic_fpn_R_101_dconv_cascade_gn_3x": "139797668/model_final_be35db.pkl", - "Misc/cascade_mask_rcnn_X_152_32x8d_FPN_IN5k_gn_dconv": "18131413/model_0039999_e76410.pkl", # noqa - # D1 Comparisons - "Detectron1-Comparisons/faster_rcnn_R_50_FPN_noaug_1x": "137781054/model_final_7ab50c.pkl", # noqa - "Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x": "137781281/model_final_62ca52.pkl", # noqa - "Detectron1-Comparisons/keypoint_rcnn_R_50_FPN_1x": "137781195/model_final_cce136.pkl", - } - - @staticmethod - def query(config_path: str) -> Optional[str]: - """ - Args: - config_path: relative config filename - """ - name = config_path.replace(".yaml", "").replace(".py", "") - if name in _ModelZooUrls.CONFIG_PATH_TO_URL_SUFFIX: - suffix = _ModelZooUrls.CONFIG_PATH_TO_URL_SUFFIX[name] - return _ModelZooUrls.S3_PREFIX + name + "/" + suffix - return None - - -def get_checkpoint_url(config_path): - """ - Returns the URL to the model trained using the given config - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - Returns: - str: a URL to the model - """ - url = _ModelZooUrls.query(config_path) - if url is None: - raise RuntimeError("Pretrained model for {} is not available!".format(config_path)) - return url - - -def get_config_file(config_path): - """ - Returns path to a builtin config file. - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - Returns: - str: the real path to the config file. - """ - cfg_file = pkg_resources.resource_filename( - "detectron2.model_zoo", os.path.join("configs", config_path) - ) - if not os.path.exists(cfg_file): - raise RuntimeError("{} not available in Model Zoo!".format(config_path)) - return cfg_file - - -def get_config(config_path, trained: bool = False): - """ - Returns a config object for a model in model zoo. - - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - trained (bool): If True, will set ``MODEL.WEIGHTS`` to trained model zoo weights. - If False, the checkpoint specified in the config file's ``MODEL.WEIGHTS`` is used - instead; this will typically (though not always) initialize a subset of weights using - an ImageNet pre-trained model, while randomly initializing the other weights. - - Returns: - CfgNode or omegaconf.DictConfig: a config object - """ - cfg_file = get_config_file(config_path) - if cfg_file.endswith(".yaml"): - cfg = get_cfg() - cfg.merge_from_file(cfg_file) - if trained: - cfg.MODEL.WEIGHTS = get_checkpoint_url(config_path) - return cfg - elif cfg_file.endswith(".py"): - cfg = LazyConfig.load(cfg_file) - if trained: - url = get_checkpoint_url(config_path) - if "train" in cfg and "init_checkpoint" in cfg.train: - cfg.train.init_checkpoint = url - else: - raise NotImplementedError - return cfg - - -def get(config_path, trained: bool = False, device: Optional[str] = None): - """ - Get a model specified by relative path under Detectron2's official ``configs/`` directory. 
- - Args: - config_path (str): config file name relative to detectron2's "configs/" - directory, e.g., "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - trained (bool): see :func:`get_config`. - device (str or None): overwrite the device in config, if given. - - Returns: - nn.Module: a detectron2 model. Will be in training mode. - - Example: - :: - from detectron2 import model_zoo - model = model_zoo.get("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml", trained=True) - """ - cfg = get_config(config_path, trained) - if device is None and not torch.cuda.is_available(): - device = "cpu" - if device is not None and isinstance(cfg, CfgNode): - cfg.MODEL.DEVICE = device - - if isinstance(cfg, CfgNode): - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - else: - model = instantiate(cfg.model) - if device is not None: - model = model.to(device) - if "train" in cfg and "init_checkpoint" in cfg.train: - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - return model diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py deleted file mode 100755 index 576493de..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/__init__.py +++ /dev/null @@ -1,59 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.layers import ShapeSpec - -from .anchor_generator import build_anchor_generator, ANCHOR_GENERATOR_REGISTRY -from .backbone import ( - BACKBONE_REGISTRY, - FPN, - Backbone, - ResNet, - ResNetBlockBase, - build_backbone, - build_resnet_backbone, - make_stage, -) -from .meta_arch import ( - META_ARCH_REGISTRY, - SEM_SEG_HEADS_REGISTRY, - GeneralizedRCNN, - PanopticFPN, - ProposalNetwork, - RetinaNet, - SemanticSegmentor, - build_model, - build_sem_seg_head, - FCOS, -) -from .postprocessing import detector_postprocess -from .proposal_generator import ( - PROPOSAL_GENERATOR_REGISTRY, - build_proposal_generator, - RPN_HEAD_REGISTRY, - build_rpn_head, -) -from .roi_heads import ( - ROI_BOX_HEAD_REGISTRY, - ROI_HEADS_REGISTRY, - ROI_KEYPOINT_HEAD_REGISTRY, - ROI_MASK_HEAD_REGISTRY, - ROIHeads, - StandardROIHeads, - BaseMaskRCNNHead, - BaseKeypointRCNNHead, - FastRCNNOutputLayers, - build_box_head, - build_keypoint_head, - build_mask_head, - build_roi_heads, -) -from .test_time_augmentation import DatasetMapperTTA, GeneralizedRCNNWithTTA -from .mmdet_wrapper import MMDetBackbone, MMDetDetector - -_EXCLUDE = {"ShapeSpec"} -__all__ = [k for k in globals().keys() if k not in _EXCLUDE and not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py deleted file mode 100755 index ee4b9881..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/anchor_generator.py +++ /dev/null @@ -1,382 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
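Hypothetical usage of the model zoo helpers deleted above; it assumes a regular detectron2 installation with its bundled configs rather than this deprecated vendored copy.

```
from detectron2 import model_zoo

cfg = model_zoo.get_config("COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml",
                           trained=True)
model = model_zoo.get("COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml",
                      trained=True, device="cpu")
model.eval()  # get() returns the model in training mode
```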
-import collections -import math -from typing import List -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, RotatedBoxes -from detectron2.utils.registry import Registry - -ANCHOR_GENERATOR_REGISTRY = Registry("ANCHOR_GENERATOR") -ANCHOR_GENERATOR_REGISTRY.__doc__ = """ -Registry for modules that creates object detection anchors for feature maps. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -class BufferList(nn.Module): - """ - Similar to nn.ParameterList, but for buffers - """ - - def __init__(self, buffers): - super().__init__() - for i, buffer in enumerate(buffers): - # Use non-persistent buffer so the values are not saved in checkpoint - self.register_buffer(str(i), buffer, persistent=False) - - def __len__(self): - return len(self._buffers) - - def __iter__(self): - return iter(self._buffers.values()) - - -def _create_grid_offsets(size: List[int], stride: int, offset: float, device: torch.device): - grid_height, grid_width = size - shifts_x = torch.arange( - offset * stride, grid_width * stride, step=stride, dtype=torch.float32, device=device - ) - shifts_y = torch.arange( - offset * stride, grid_height * stride, step=stride, dtype=torch.float32, device=device - ) - - shift_y, shift_x = torch.meshgrid(shifts_y, shifts_x) - shift_x = shift_x.reshape(-1) - shift_y = shift_y.reshape(-1) - return shift_x, shift_y - - -def _broadcast_params(params, num_features, name): - """ - If one size (or aspect ratio) is specified and there are multiple feature - maps, we "broadcast" anchors of that single size (or aspect ratio) - over all feature maps. - - If params is list[float], or list[list[float]] with len(params) == 1, repeat - it num_features time. - - Returns: - list[list[float]]: param for each feature - """ - assert isinstance( - params, collections.abc.Sequence - ), f"{name} in anchor generator has to be a list! Got {params}." - assert len(params), f"{name} in anchor generator cannot be empty!" - if not isinstance(params[0], collections.abc.Sequence): # params is list[float] - return [params] * num_features - if len(params) == 1: - return list(params) * num_features - assert len(params) == num_features, ( - f"Got {name} of length {len(params)} in anchor generator, " - f"but the number of input features is {num_features}!" - ) - return params - - -@ANCHOR_GENERATOR_REGISTRY.register() -class DefaultAnchorGenerator(nn.Module): - """ - Compute anchors in the standard ways described in - "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks". - """ - - box_dim: torch.jit.Final[int] = 4 - """ - the dimension of each anchor box. - """ - - @configurable - def __init__(self, *, sizes, aspect_ratios, strides, offset=0.5): - """ - This interface is experimental. - - Args: - sizes (list[list[float]] or list[float]): - If ``sizes`` is list[list[float]], ``sizes[i]`` is the list of anchor sizes - (i.e. sqrt of anchor area) to use for the i-th feature map. - If ``sizes`` is list[float], ``sizes`` is used for all feature maps. - Anchor sizes are given in absolute lengths in units of - the input image; they do not dynamically scale if the input image size changes. - aspect_ratios (list[list[float]] or list[float]): list of aspect ratios - (i.e. height / width) to use for anchors. Same "broadcast" rule for `sizes` applies. - strides (list[int]): stride of each input feature. 
- offset (float): Relative offset between the center of the first anchor and the top-left - corner of the image. Value has to be in [0, 1). - Recommend to use 0.5, which means half stride. - """ - super().__init__() - - self.strides = strides - self.num_features = len(self.strides) - sizes = _broadcast_params(sizes, self.num_features, "sizes") - aspect_ratios = _broadcast_params(aspect_ratios, self.num_features, "aspect_ratios") - self.cell_anchors = self._calculate_anchors(sizes, aspect_ratios) - - self.offset = offset - assert 0.0 <= self.offset < 1.0, self.offset - - @classmethod - def from_config(cls, cfg, input_shape: List[ShapeSpec]): - return { - "sizes": cfg.MODEL.ANCHOR_GENERATOR.SIZES, - "aspect_ratios": cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS, - "strides": [x.stride for x in input_shape], - "offset": cfg.MODEL.ANCHOR_GENERATOR.OFFSET, - } - - def _calculate_anchors(self, sizes, aspect_ratios): - cell_anchors = [ - self.generate_cell_anchors(s, a).float() for s, a in zip(sizes, aspect_ratios) - ] - return BufferList(cell_anchors) - - @property - @torch.jit.unused - def num_cell_anchors(self): - """ - Alias of `num_anchors`. - """ - return self.num_anchors - - @property - @torch.jit.unused - def num_anchors(self): - """ - Returns: - list[int]: Each int is the number of anchors at every pixel - location, on that feature map. - For example, if at every pixel we use anchors of 3 aspect - ratios and 5 sizes, the number of anchors is 15. - (See also ANCHOR_GENERATOR.SIZES and ANCHOR_GENERATOR.ASPECT_RATIOS in config) - - In standard RPN models, `num_anchors` on every feature map is the same. - """ - return [len(cell_anchors) for cell_anchors in self.cell_anchors] - - def _grid_anchors(self, grid_sizes: List[List[int]]): - """ - Returns: - list[Tensor]: #featuremap tensors, each is (#locations x #cell_anchors) x 4 - """ - anchors = [] - # buffers() not supported by torchscript. use named_buffers() instead - buffers: List[torch.Tensor] = [x[1] for x in self.cell_anchors.named_buffers()] - for size, stride, base_anchors in zip(grid_sizes, self.strides, buffers): - shift_x, shift_y = _create_grid_offsets(size, stride, self.offset, base_anchors.device) - shifts = torch.stack((shift_x, shift_y, shift_x, shift_y), dim=1) - - anchors.append((shifts.view(-1, 1, 4) + base_anchors.view(1, -1, 4)).reshape(-1, 4)) - - return anchors - - def generate_cell_anchors(self, sizes=(32, 64, 128, 256, 512), aspect_ratios=(0.5, 1, 2)): - """ - Generate a tensor storing canonical anchor boxes, which are all anchor - boxes of different sizes and aspect_ratios centered at (0, 0). - We can later build the set of anchors for a full feature map by - shifting and tiling these tensors (see `meth:_grid_anchors`). - - Args: - sizes (tuple[float]): - aspect_ratios (tuple[float]]): - - Returns: - Tensor of shape (len(sizes) * len(aspect_ratios), 4) storing anchor boxes - in XYXY format. - """ - - # This is different from the anchor generator defined in the original Faster R-CNN - # code or Detectron. They yield the same AP, however the old version defines cell - # anchors in a less natural way with a shift relative to the feature grid and - # quantization that results in slightly different sizes for different aspect ratios. - # See also https://github.com/facebookresearch/Detectron/issues/227 - - anchors = [] - for size in sizes: - area = size ** 2.0 - for aspect_ratio in aspect_ratios: - # s * s = w * h - # a = h / w - # ... some algebra ... 
- # w = sqrt(s * s / a) - # h = a * w - w = math.sqrt(area / aspect_ratio) - h = aspect_ratio * w - x0, y0, x1, y1 = -w / 2.0, -h / 2.0, w / 2.0, h / 2.0 - anchors.append([x0, y0, x1, y1]) - return torch.tensor(anchors) - - def forward(self, features: List[torch.Tensor]): - """ - Args: - features (list[Tensor]): list of backbone feature maps on which to generate anchors. - - Returns: - list[Boxes]: a list of Boxes containing all the anchors for each feature map - (i.e. the cell anchors repeated over all locations in the feature map). - The number of anchors of each feature map is Hi x Wi x num_cell_anchors, - where Hi, Wi are resolution of the feature map divided by anchor stride. - """ - grid_sizes = [feature_map.shape[-2:] for feature_map in features] - anchors_over_all_feature_maps = self._grid_anchors(grid_sizes) - return [Boxes(x) for x in anchors_over_all_feature_maps] - - -@ANCHOR_GENERATOR_REGISTRY.register() -class RotatedAnchorGenerator(nn.Module): - """ - Compute rotated anchors used by Rotated RPN (RRPN), described in - "Arbitrary-Oriented Scene Text Detection via Rotation Proposals". - """ - - box_dim: int = 5 - """ - the dimension of each anchor box. - """ - - @configurable - def __init__(self, *, sizes, aspect_ratios, strides, angles, offset=0.5): - """ - This interface is experimental. - - Args: - sizes (list[list[float]] or list[float]): - If sizes is list[list[float]], sizes[i] is the list of anchor sizes - (i.e. sqrt of anchor area) to use for the i-th feature map. - If sizes is list[float], the sizes are used for all feature maps. - Anchor sizes are given in absolute lengths in units of - the input image; they do not dynamically scale if the input image size changes. - aspect_ratios (list[list[float]] or list[float]): list of aspect ratios - (i.e. height / width) to use for anchors. Same "broadcast" rule for `sizes` applies. - strides (list[int]): stride of each input feature. - angles (list[list[float]] or list[float]): list of angles (in degrees CCW) - to use for anchors. Same "broadcast" rule for `sizes` applies. - offset (float): Relative offset between the center of the first anchor and the top-left - corner of the image. Value has to be in [0, 1). - Recommend to use 0.5, which means half stride. - """ - super().__init__() - - self.strides = strides - self.num_features = len(self.strides) - sizes = _broadcast_params(sizes, self.num_features, "sizes") - aspect_ratios = _broadcast_params(aspect_ratios, self.num_features, "aspect_ratios") - angles = _broadcast_params(angles, self.num_features, "angles") - self.cell_anchors = self._calculate_anchors(sizes, aspect_ratios, angles) - - self.offset = offset - assert 0.0 <= self.offset < 1.0, self.offset - - @classmethod - def from_config(cls, cfg, input_shape: List[ShapeSpec]): - return { - "sizes": cfg.MODEL.ANCHOR_GENERATOR.SIZES, - "aspect_ratios": cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS, - "strides": [x.stride for x in input_shape], - "offset": cfg.MODEL.ANCHOR_GENERATOR.OFFSET, - "angles": cfg.MODEL.ANCHOR_GENERATOR.ANGLES, - } - - def _calculate_anchors(self, sizes, aspect_ratios, angles): - cell_anchors = [ - self.generate_cell_anchors(size, aspect_ratio, angle).float() - for size, aspect_ratio, angle in zip(sizes, aspect_ratios, angles) - ] - return BufferList(cell_anchors) - - @property - def num_cell_anchors(self): - """ - Alias of `num_anchors`. 
- """ - return self.num_anchors - - @property - def num_anchors(self): - """ - Returns: - list[int]: Each int is the number of anchors at every pixel - location, on that feature map. - For example, if at every pixel we use anchors of 3 aspect - ratios, 2 sizes and 5 angles, the number of anchors is 30. - (See also ANCHOR_GENERATOR.SIZES, ANCHOR_GENERATOR.ASPECT_RATIOS - and ANCHOR_GENERATOR.ANGLES in config) - - In standard RRPN models, `num_anchors` on every feature map is the same. - """ - return [len(cell_anchors) for cell_anchors in self.cell_anchors] - - def _grid_anchors(self, grid_sizes): - anchors = [] - for size, stride, base_anchors in zip(grid_sizes, self.strides, self.cell_anchors): - shift_x, shift_y = _create_grid_offsets(size, stride, self.offset, base_anchors.device) - zeros = torch.zeros_like(shift_x) - shifts = torch.stack((shift_x, shift_y, zeros, zeros, zeros), dim=1) - - anchors.append((shifts.view(-1, 1, 5) + base_anchors.view(1, -1, 5)).reshape(-1, 5)) - - return anchors - - def generate_cell_anchors( - self, - sizes=(32, 64, 128, 256, 512), - aspect_ratios=(0.5, 1, 2), - angles=(-90, -60, -30, 0, 30, 60, 90), - ): - """ - Generate a tensor storing canonical anchor boxes, which are all anchor - boxes of different sizes, aspect_ratios, angles centered at (0, 0). - We can later build the set of anchors for a full feature map by - shifting and tiling these tensors (see `meth:_grid_anchors`). - - Args: - sizes (tuple[float]): - aspect_ratios (tuple[float]]): - angles (tuple[float]]): - - Returns: - Tensor of shape (len(sizes) * len(aspect_ratios) * len(angles), 5) - storing anchor boxes in (x_ctr, y_ctr, w, h, angle) format. - """ - anchors = [] - for size in sizes: - area = size ** 2.0 - for aspect_ratio in aspect_ratios: - # s * s = w * h - # a = h / w - # ... some algebra ... - # w = sqrt(s * s / a) - # h = a * w - w = math.sqrt(area / aspect_ratio) - h = aspect_ratio * w - anchors.extend([0, 0, w, h, a] for a in angles) - - return torch.tensor(anchors) - - def forward(self, features): - """ - Args: - features (list[Tensor]): list of backbone feature maps on which to generate anchors. - - Returns: - list[RotatedBoxes]: a list of Boxes containing all the anchors for each feature map - (i.e. the cell anchors repeated over all locations in the feature map). - The number of anchors of each feature map is Hi x Wi x num_cell_anchors, - where Hi, Wi are resolution of the feature map divided by anchor stride. - """ - grid_sizes = [feature_map.shape[-2:] for feature_map in features] - anchors_over_all_feature_maps = self._grid_anchors(grid_sizes) - return [RotatedBoxes(x) for x in anchors_over_all_feature_maps] - - -def build_anchor_generator(cfg, input_shape): - """ - Built an anchor generator from `cfg.MODEL.ANCHOR_GENERATOR.NAME`. - """ - anchor_generator = cfg.MODEL.ANCHOR_GENERATOR.NAME - return ANCHOR_GENERATOR_REGISTRY.get(anchor_generator)(cfg, input_shape) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py deleted file mode 100755 index 55b265d5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
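A quick numerical check of the cell-anchor algebra used in `generate_cell_anchors` above: from area = w * h and aspect_ratio = h / w it follows that w = sqrt(area / aspect_ratio) and h = aspect_ratio * w.

```
import math

size, aspect_ratio = 64.0, 0.5
area = size ** 2
w = math.sqrt(area / aspect_ratio)  # ~90.51
h = aspect_ratio * w                # ~45.25
assert abs(w * h - area) < 1e-6     # the anchor keeps the requested area
x0, y0, x1, y1 = -w / 2, -h / 2, w / 2, h / 2  # XYXY box centered at (0, 0)
```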
-from .build import build_backbone, BACKBONE_REGISTRY # noqa F401 isort:skip - -from .backbone import Backbone -from .fpn import FPN -from .regnet import RegNet -from .resnet import ( - BasicStem, - ResNet, - ResNetBlockBase, - build_resnet_backbone, - make_stage, - BottleneckBlock, -) - -__all__ = [k for k in globals().keys() if not k.startswith("_")] -# TODO can expose more resnet blocks after careful consideration diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py deleted file mode 100755 index 369fb884..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/backbone.py +++ /dev/null @@ -1,53 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from abc import ABCMeta, abstractmethod -import torch.nn as nn - -from detectron2.layers import ShapeSpec - -__all__ = ["Backbone"] - - -class Backbone(nn.Module, metaclass=ABCMeta): - """ - Abstract base class for network backbones. - """ - - def __init__(self): - """ - The `__init__` method of any subclass can specify its own set of arguments. - """ - super().__init__() - - @abstractmethod - def forward(self): - """ - Subclasses must override this method, but adhere to the same return type. - - Returns: - dict[str->Tensor]: mapping from feature name (e.g., "res2") to tensor - """ - pass - - @property - def size_divisibility(self) -> int: - """ - Some backbones require the input height and width to be divisible by a - specific integer. This is typically true for encoder / decoder type networks - with lateral connection (e.g., FPN) for which feature maps need to match - dimension in the "bottom up" and "top down" paths. Set to 0 if no specific - input size divisibility is required. - """ - return 0 - - def output_shape(self): - """ - Returns: - dict[str->ShapeSpec] - """ - # this is a backward-compatible default - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py deleted file mode 100755 index af021411..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/build.py +++ /dev/null @@ -1,33 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.layers import ShapeSpec -from detectron2.utils.registry import Registry - -from .backbone import Backbone - -BACKBONE_REGISTRY = Registry("BACKBONE") -BACKBONE_REGISTRY.__doc__ = """ -Registry for backbones, which extract feature maps from images - -The registered object must be a callable that accepts two arguments: - -1. A :class:`detectron2.config.CfgNode` -2. A :class:`detectron2.layers.ShapeSpec`, which contains the input shape specification. - -Registered object must return instance of :class:`Backbone`. -""" - - -def build_backbone(cfg, input_shape=None): - """ - Build a backbone from `cfg.MODEL.BACKBONE.NAME`. 
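To make the registry contract above concrete, here is a sketch of registering a toy backbone under an invented name (`ToyBackbone` is purely illustrative; `BACKBONE_REGISTRY` and `Backbone` are the objects defined above):

```
import torch.nn as nn

@BACKBONE_REGISTRY.register()
class ToyBackbone(Backbone):
    def __init__(self, cfg, input_shape):
        super().__init__()
        self.conv = nn.Conv2d(input_shape.channels, 64, 3, stride=4, padding=1)
        # Attributes consumed by the default output_shape() implementation:
        self._out_features = ["toy"]
        self._out_feature_channels = {"toy": 64}
        self._out_feature_strides = {"toy": 4}

    def forward(self, x):
        return {"toy": self.conv(x)}

# With cfg.MODEL.BACKBONE.NAME = "ToyBackbone", build_backbone(cfg)
# (continued below) would construct and return this module.
```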
- - Returns: - an instance of :class:`Backbone` - """ - if input_shape is None: - input_shape = ShapeSpec(channels=len(cfg.MODEL.PIXEL_MEAN)) - - backbone_name = cfg.MODEL.BACKBONE.NAME - backbone = BACKBONE_REGISTRY.get(backbone_name)(cfg, input_shape) - assert isinstance(backbone, Backbone) - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py deleted file mode 100755 index d0bdfc9d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/fpn.py +++ /dev/null @@ -1,255 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import Conv2d, ShapeSpec, get_norm - -from .backbone import Backbone -from .build import BACKBONE_REGISTRY -from .resnet import build_resnet_backbone - -__all__ = ["build_resnet_fpn_backbone", "build_retinanet_resnet_fpn_backbone", "FPN"] - - -class FPN(Backbone): - """ - This module implements :paper:`FPN`. - It creates pyramid features built on top of some input feature maps. - """ - - _fuse_type: torch.jit.Final[str] - - def __init__( - self, bottom_up, in_features, out_channels, norm="", top_block=None, fuse_type="sum" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. - norm (str): the normalization to use. - top_block (nn.Module or None): if provided, an extra operation will - be performed on the output of the last (smallest resolution) - FPN output, and the result will extend the result list. The top_block - further downsamples the feature map. It must have an attribute - "num_levels", meaning the number of extra FPN levels added by - this block, and "in_feature", which is a string representing - its input feature (e.g., p5). - fuse_type (str): types for fusing the top down features and the lateral - ones. It can be "sum" (default), which sums up element-wise; or "avg", - which takes the element-wise mean of the two. - """ - super(FPN, self).__init__() - assert isinstance(bottom_up, Backbone) - assert in_features, in_features - - # Feature map strides and channels from the bottom up network (e.g. 
ResNet) - input_shapes = bottom_up.output_shape() - strides = [input_shapes[f].stride for f in in_features] - in_channels_per_feature = [input_shapes[f].channels for f in in_features] - - _assert_strides_are_log2_contiguous(strides) - lateral_convs = [] - output_convs = [] - - use_bias = norm == "" - for idx, in_channels in enumerate(in_channels_per_feature): - lateral_norm = get_norm(norm, out_channels) - output_norm = get_norm(norm, out_channels) - - lateral_conv = Conv2d( - in_channels, out_channels, kernel_size=1, bias=use_bias, norm=lateral_norm - ) - output_conv = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=use_bias, - norm=output_norm, - ) - weight_init.c2_xavier_fill(lateral_conv) - weight_init.c2_xavier_fill(output_conv) - stage = int(math.log2(strides[idx])) - self.add_module("fpn_lateral{}".format(stage), lateral_conv) - self.add_module("fpn_output{}".format(stage), output_conv) - - lateral_convs.append(lateral_conv) - output_convs.append(output_conv) - # Place convs into top-down order (from low to high resolution) - # to make the top-down computation in forward clearer. - self.lateral_convs = lateral_convs[::-1] - self.output_convs = output_convs[::-1] - self.top_block = top_block - self.in_features = tuple(in_features) - self.bottom_up = bottom_up - # Return feature names are "p", like ["p2", "p3", ..., "p6"] - self._out_feature_strides = {"p{}".format(int(math.log2(s))): s for s in strides} - # top block output feature maps. - if self.top_block is not None: - for s in range(stage, stage + self.top_block.num_levels): - self._out_feature_strides["p{}".format(s + 1)] = 2 ** (s + 1) - - self._out_features = list(self._out_feature_strides.keys()) - self._out_feature_channels = {k: out_channels for k in self._out_features} - self._size_divisibility = strides[-1] - assert fuse_type in {"avg", "sum"} - self._fuse_type = fuse_type - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "res5") to - feature map tensor for each feature level in high to low resolution order. - - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["p2", "p3", ..., "p6"]. 
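The core of the top-down pass implemented in `forward` just below is a 2x nearest-neighbor upsample added to a 1x1 lateral projection; a standalone sketch with dummy tensors:

```
import torch
import torch.nn.functional as F

top = torch.rand(1, 256, 16, 16)      # coarser pyramid level
lateral = torch.rand(1, 256, 32, 32)  # lateral features at the finer level

top_down = F.interpolate(top, scale_factor=2.0, mode="nearest")
fused_sum = lateral + top_down        # fuse_type="sum" (default)
fused_avg = (lateral + top_down) / 2  # fuse_type="avg"
```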
- """ - bottom_up_features = self.bottom_up(x) - results = [] - prev_features = self.lateral_convs[0](bottom_up_features[self.in_features[-1]]) - results.append(self.output_convs[0](prev_features)) - - # Reverse feature maps into top-down order (from low to high resolution) - for idx, (lateral_conv, output_conv) in enumerate( - zip(self.lateral_convs, self.output_convs) - ): - # Slicing of ModuleList is not supported https://github.com/pytorch/pytorch/issues/47336 - # Therefore we loop over all modules but skip the first one - if idx > 0: - features = self.in_features[-idx - 1] - features = bottom_up_features[features] - top_down_features = F.interpolate(prev_features, scale_factor=2.0, mode="nearest") - lateral_features = lateral_conv(features) - prev_features = lateral_features + top_down_features - if self._fuse_type == "avg": - prev_features /= 2 - results.insert(0, output_conv(prev_features)) - - if self.top_block is not None: - if self.top_block.in_feature in bottom_up_features: - top_block_in_feature = bottom_up_features[self.top_block.in_feature] - else: - top_block_in_feature = results[self._out_features.index(self.top_block.in_feature)] - results.extend(self.top_block(top_block_in_feature)) - assert len(self._out_features) == len(results) - return {f: res for f, res in zip(self._out_features, results)} - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - -def _assert_strides_are_log2_contiguous(strides): - """ - Assert that each stride is 2x times its preceding stride, i.e. "contiguous in log2". - """ - for i, stride in enumerate(strides[1:], 1): - assert stride == 2 * strides[i - 1], "Strides {} {} are not log2 contiguous".format( - stride, strides[i - 1] - ) - - -class LastLevelMaxPool(nn.Module): - """ - This module is used in the original FPN to generate a downsampled - P6 feature from P5. - """ - - def __init__(self): - super().__init__() - self.num_levels = 1 - self.in_feature = "p5" - - def forward(self, x): - return [F.max_pool2d(x, kernel_size=1, stride=2, padding=0)] - - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels, in_feature="res5"): - super().__init__() - self.num_levels = 2 - self.in_feature = in_feature - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - - -@BACKBONE_REGISTRY.register() -def build_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelMaxPool(), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_retinanet_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
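A shape check for the extra-level blocks defined above (dummy tensors and sizes; the classes come from this deleted file and also need fvcore for the weight initialization):

```
import torch

c5 = torch.rand(1, 2048, 32, 32)
p6, p7 = LastLevelP6P7(2048, 256)(c5)  # strides 64 and 128 relative to input
assert p6.shape[-2:] == (16, 16) and p7.shape[-2:] == (8, 8)

p5 = torch.rand(1, 256, 16, 16)
(p6_pool,) = LastLevelMaxPool()(p5)    # kernel 1, stride 2 downsample
assert p6_pool.shape[-2:] == (8, 8)
```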
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_p6p7 = bottom_up.output_shape()["res5"].channels - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_p6p7, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py deleted file mode 100755 index 3533d633..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/regnet.py +++ /dev/null @@ -1,452 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -""" -Implementation of RegNet models from :paper:`dds` and :paper:`scaling`. - -This code is adapted from https://github.com/facebookresearch/pycls with minimal modifications. -Some code duplication exists between RegNet and ResNets (e.g., ResStem) in order to simplify -model loading. -""" - -import numpy as np -from torch import nn - -from detectron2.layers import CNNBlockBase, ShapeSpec, get_norm - -from .backbone import Backbone - -__all__ = [ - "AnyNet", - "RegNet", - "ResStem", - "SimpleStem", - "VanillaBlock", - "ResBasicBlock", - "ResBottleneckBlock", -] - - -def conv2d(w_in, w_out, k, *, stride=1, groups=1, bias=False): - """Helper for building a conv2d layer.""" - assert k % 2 == 1, "Only odd size kernels supported to avoid padding issues." - s, p, g, b = stride, (k - 1) // 2, groups, bias - return nn.Conv2d(w_in, w_out, k, stride=s, padding=p, groups=g, bias=b) - - -def gap2d(): - """Helper for building a global average pooling layer.""" - return nn.AdaptiveAvgPool2d((1, 1)) - - -def pool2d(k, *, stride=1): - """Helper for building a pool2d layer.""" - assert k % 2 == 1, "Only odd size kernels supported to avoid padding issues." 
- return nn.MaxPool2d(k, stride=stride, padding=(k - 1) // 2) - - -def init_weights(m): - """Performs ResNet-style weight initialization.""" - if isinstance(m, nn.Conv2d): - # Note that there is no bias due to BN - fan_out = m.kernel_size[0] * m.kernel_size[1] * m.out_channels - m.weight.data.normal_(mean=0.0, std=np.sqrt(2.0 / fan_out)) - elif isinstance(m, nn.BatchNorm2d): - m.weight.data.fill_(1.0) - m.bias.data.zero_() - elif isinstance(m, nn.Linear): - m.weight.data.normal_(mean=0.0, std=0.01) - m.bias.data.zero_() - - -class ResStem(CNNBlockBase): - """ResNet stem for ImageNet: 7x7, BN, AF, MaxPool.""" - - def __init__(self, w_in, w_out, norm, activation_class): - super().__init__(w_in, w_out, 4) - self.conv = conv2d(w_in, w_out, 7, stride=2) - self.bn = get_norm(norm, w_out) - self.af = activation_class() - self.pool = pool2d(3, stride=2) - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class SimpleStem(CNNBlockBase): - """Simple stem for ImageNet: 3x3, BN, AF.""" - - def __init__(self, w_in, w_out, norm, activation_class): - super().__init__(w_in, w_out, 2) - self.conv = conv2d(w_in, w_out, 3, stride=2) - self.bn = get_norm(norm, w_out) - self.af = activation_class() - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class SE(nn.Module): - """Squeeze-and-Excitation (SE) block: AvgPool, FC, Act, FC, Sigmoid.""" - - def __init__(self, w_in, w_se, activation_class): - super().__init__() - self.avg_pool = gap2d() - self.f_ex = nn.Sequential( - conv2d(w_in, w_se, 1, bias=True), - activation_class(), - conv2d(w_se, w_in, 1, bias=True), - nn.Sigmoid(), - ) - - def forward(self, x): - return x * self.f_ex(self.avg_pool(x)) - - -class VanillaBlock(CNNBlockBase): - """Vanilla block: [3x3 conv, BN, Relu] x2.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, _params): - super().__init__(w_in, w_out, stride) - self.a = conv2d(w_in, w_out, 3, stride=stride) - self.a_bn = get_norm(norm, w_out) - self.a_af = activation_class() - self.b = conv2d(w_out, w_out, 3) - self.b_bn = get_norm(norm, w_out) - self.b_af = activation_class() - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class BasicTransform(nn.Module): - """Basic transformation: [3x3 conv, BN, Relu] x2.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, _params): - super().__init__() - self.a = conv2d(w_in, w_out, 3, stride=stride) - self.a_bn = get_norm(norm, w_out) - self.a_af = activation_class() - self.b = conv2d(w_out, w_out, 3) - self.b_bn = get_norm(norm, w_out) - self.b_bn.final_bn = True - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class ResBasicBlock(CNNBlockBase): - """Residual basic block: x + f(x), f = basic transform.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__(w_in, w_out, stride) - self.proj, self.bn = None, None - if (w_in != w_out) or (stride != 1): - self.proj = conv2d(w_in, w_out, 1, stride=stride) - self.bn = get_norm(norm, w_out) - self.f = BasicTransform(w_in, w_out, stride, norm, activation_class, params) - self.af = activation_class() - - def forward(self, x): - x_p = self.bn(self.proj(x)) if self.proj else x - return self.af(x_p + self.f(x)) - - -class BottleneckTransform(nn.Module): - """Bottleneck transformation: 1x1, 3x3 [+SE], 1x1.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__() - w_b = int(round(w_out * 
params["bot_mul"])) - w_se = int(round(w_in * params["se_r"])) - groups = w_b // params["group_w"] - self.a = conv2d(w_in, w_b, 1) - self.a_bn = get_norm(norm, w_b) - self.a_af = activation_class() - self.b = conv2d(w_b, w_b, 3, stride=stride, groups=groups) - self.b_bn = get_norm(norm, w_b) - self.b_af = activation_class() - self.se = SE(w_b, w_se, activation_class) if w_se else None - self.c = conv2d(w_b, w_out, 1) - self.c_bn = get_norm(norm, w_out) - self.c_bn.final_bn = True - - def forward(self, x): - for layer in self.children(): - x = layer(x) - return x - - -class ResBottleneckBlock(CNNBlockBase): - """Residual bottleneck block: x + f(x), f = bottleneck transform.""" - - def __init__(self, w_in, w_out, stride, norm, activation_class, params): - super().__init__(w_in, w_out, stride) - self.proj, self.bn = None, None - if (w_in != w_out) or (stride != 1): - self.proj = conv2d(w_in, w_out, 1, stride=stride) - self.bn = get_norm(norm, w_out) - self.f = BottleneckTransform(w_in, w_out, stride, norm, activation_class, params) - self.af = activation_class() - - def forward(self, x): - x_p = self.bn(self.proj(x)) if self.proj else x - return self.af(x_p + self.f(x)) - - -class AnyStage(nn.Module): - """AnyNet stage (sequence of blocks w/ the same output shape).""" - - def __init__(self, w_in, w_out, stride, d, block_class, norm, activation_class, params): - super().__init__() - for i in range(d): - block = block_class(w_in, w_out, stride, norm, activation_class, params) - self.add_module("b{}".format(i + 1), block) - stride, w_in = 1, w_out - - def forward(self, x): - for block in self.children(): - x = block(x) - return x - - -class AnyNet(Backbone): - """AnyNet model. See :paper:`dds`.""" - - def __init__( - self, - *, - stem_class, - stem_width, - block_class, - depths, - widths, - group_widths, - strides, - bottleneck_ratios, - se_ratio, - activation_class, - freeze_at=0, - norm="BN", - out_features=None, - ): - """ - Args: - stem_class (callable): A callable taking 4 arguments (channels in, channels out, - normalization, callable returning an activation function) that returns another - callable implementing the stem module. - stem_width (int): The number of output channels that the stem produces. - block_class (callable): A callable taking 6 arguments (channels in, channels out, - stride, normalization, callable returning an activation function, a dict of - block-specific parameters) that returns another callable implementing the repeated - block module. - depths (list[int]): Number of blocks in each stage. - widths (list[int]): For each stage, the number of output channels of each block. - group_widths (list[int]): For each stage, the number of channels per group in group - convolution, if the block uses group convolution. - strides (list[int]): The stride that each network stage applies to its input. - bottleneck_ratios (list[float]): For each stage, the ratio of the number of bottleneck - channels to the number of block input channels (or, equivalently, output channels), - if the block uses a bottleneck. - se_ratio (float): The ratio of the number of channels used inside the squeeze-excitation - (SE) module to it number of input channels, if SE the block uses SE. - activation_class (callable): A callable taking no arguments that returns another - callable implementing an activation function. - freeze_at (int): The number of stages at the beginning to freeze. - see :meth:`freeze` for detailed explanation. - norm (str or callable): normalization for all conv layers. 
- See :func:`layers.get_norm` for supported format.
- out_features (list[str]): name of the layers whose outputs should
- be returned in forward. RegNets use "stem" and "s1", "s2", etc. for the stages after
- the stem. If None, will return the output of the last layer.
- """
- super().__init__()
- self.stem = stem_class(3, stem_width, norm, activation_class)
-
- current_stride = self.stem.stride
- self._out_feature_strides = {"stem": current_stride}
- self._out_feature_channels = {"stem": self.stem.out_channels}
- self.stages_and_names = []
- prev_w = stem_width
-
- for i, (d, w, s, b, g) in enumerate(
- zip(depths, widths, strides, bottleneck_ratios, group_widths)
- ):
- params = {"bot_mul": b, "group_w": g, "se_r": se_ratio}
- stage = AnyStage(prev_w, w, s, d, block_class, norm, activation_class, params)
- name = "s{}".format(i + 1)
- self.add_module(name, stage)
- self.stages_and_names.append((stage, name))
- self._out_feature_strides[name] = current_stride = int(
- current_stride * np.prod([k.stride for k in stage.children()])
- )
- self._out_feature_channels[name] = list(stage.children())[-1].out_channels
- prev_w = w
-
- self.apply(init_weights)
-
- if out_features is None:
- out_features = [name]
- self._out_features = out_features
- assert len(self._out_features)
- children = [x[0] for x in self.named_children()]
- for out_feature in self._out_features:
- assert out_feature in children, "Available children: {} does not include {}".format(
- ", ".join(children), out_feature
- )
- self.freeze(freeze_at)
-
- def forward(self, x):
- """
- Args:
- x: Tensor of shape (N,C,H,W). H, W must be a multiple of ``self.size_divisibility``.
-
- Returns:
- dict[str->Tensor]: names and the corresponding features
- """
- assert x.dim() == 4, f"Model takes an input of shape (N, C, H, W). Got {x.shape} instead!"
- outputs = {}
- x = self.stem(x)
- if "stem" in self._out_features:
- outputs["stem"] = x
- for stage, name in self.stages_and_names:
- x = stage(x)
- if name in self._out_features:
- outputs[name] = x
- return outputs
-
- def output_shape(self):
- return {
- name: ShapeSpec(
- channels=self._out_feature_channels[name], stride=self._out_feature_strides[name]
- )
- for name in self._out_features
- }
-
- def freeze(self, freeze_at=0):
- """
- Freeze the first several stages of the model. Commonly used in fine-tuning.
-
- Layers that produce the same feature map spatial size are defined as one
- "stage" by :paper:`FPN`.
-
- Args:
- freeze_at (int): number of stages to freeze.
- `1` means freezing the stem. `2` means freezing the stem and
- one residual stage, etc.
- - Returns: - nn.Module: this model itself - """ - if freeze_at >= 1: - self.stem.freeze() - for idx, (stage, _) in enumerate(self.stages_and_names, start=2): - if freeze_at >= idx: - for block in stage.children(): - block.freeze() - return self - - -def adjust_block_compatibility(ws, bs, gs): - """Adjusts the compatibility of widths, bottlenecks, and groups.""" - assert len(ws) == len(bs) == len(gs) - assert all(w > 0 and b > 0 and g > 0 for w, b, g in zip(ws, bs, gs)) - vs = [int(max(1, w * b)) for w, b in zip(ws, bs)] - gs = [int(min(g, v)) for g, v in zip(gs, vs)] - ms = [np.lcm(g, b) if b > 1 else g for g, b in zip(gs, bs)] - vs = [max(m, int(round(v / m) * m)) for v, m in zip(vs, ms)] - ws = [int(v / b) for v, b in zip(vs, bs)] - assert all(w * b % g == 0 for w, b, g in zip(ws, bs, gs)) - return ws, bs, gs - - -def generate_regnet_parameters(w_a, w_0, w_m, d, q=8): - """Generates per stage widths and depths from RegNet parameters.""" - assert w_a >= 0 and w_0 > 0 and w_m > 1 and w_0 % q == 0 - # Generate continuous per-block ws - ws_cont = np.arange(d) * w_a + w_0 - # Generate quantized per-block ws - ks = np.round(np.log(ws_cont / w_0) / np.log(w_m)) - ws_all = w_0 * np.power(w_m, ks) - ws_all = np.round(np.divide(ws_all, q)).astype(int) * q - # Generate per stage ws and ds (assumes ws_all are sorted) - ws, ds = np.unique(ws_all, return_counts=True) - # Compute number of actual stages and total possible stages - num_stages, total_stages = len(ws), ks.max() + 1 - # Convert numpy arrays to lists and return - ws, ds, ws_all, ws_cont = (x.tolist() for x in (ws, ds, ws_all, ws_cont)) - return ws, ds, num_stages, total_stages, ws_all, ws_cont - - -class RegNet(AnyNet): - """RegNet model. See :paper:`dds`.""" - - def __init__( - self, - *, - stem_class, - stem_width, - block_class, - depth, - w_a, - w_0, - w_m, - group_width, - stride=2, - bottleneck_ratio=1.0, - se_ratio=0.0, - activation_class=None, - freeze_at=0, - norm="BN", - out_features=None, - ): - """ - Build a RegNet from the parameterization described in :paper:`dds` Section 3.3. - - Args: - See :class:`AnyNet` for arguments that are not listed here. - depth (int): Total number of blocks in the RegNet. - w_a (float): Factor by which block width would increase prior to quantizing block widths - by stage. See :paper:`dds` Section 3.3. - w_0 (int): Initial block width. See :paper:`dds` Section 3.3. - w_m (float): Parameter controlling block width quantization. - See :paper:`dds` Section 3.3. - group_width (int): Number of channels per group in group convolution, if the block uses - group convolution. - bottleneck_ratio (float): The ratio of the number of bottleneck channels to the number - of block input channels (or, equivalently, output channels), if the block uses a - bottleneck. - stride (int): The stride that each network stage applies to its input. 
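For intuition, the two helpers above can be exercised directly; the parameter values below are made up, and the comments describe the general behaviour rather than exact outputs:

```
# Illustrative use of the RegNet parameter helpers (made-up values).
ws, ds, num_stages, _, _, _ = generate_regnet_parameters(w_a=36.44, w_0=48, w_m=2.49, d=13)
# ws: one width per stage (quantized to multiples of q=8); ds: blocks per stage.
ws, bs, gs = adjust_block_compatibility(ws, [1.0] * len(ws), [8] * len(ws))
# After adjustment every stage satisfies w * b % g == 0.
```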
- """ - ws, ds = generate_regnet_parameters(w_a, w_0, w_m, depth)[0:2] - ss = [stride for _ in ws] - bs = [bottleneck_ratio for _ in ws] - gs = [group_width for _ in ws] - ws, bs, gs = adjust_block_compatibility(ws, bs, gs) - - def default_activation_class(): - return nn.ReLU(inplace=True) - - super().__init__( - stem_class=stem_class, - stem_width=stem_width, - block_class=block_class, - depths=ds, - widths=ws, - strides=ss, - group_widths=gs, - bottleneck_ratios=bs, - se_ratio=se_ratio, - activation_class=default_activation_class - if activation_class is None - else activation_class, - freeze_at=freeze_at, - norm=norm, - out_features=out_features, - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py deleted file mode 100755 index 5b8e842c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/backbone/resnet.py +++ /dev/null @@ -1,694 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import ( - CNNBlockBase, - Conv2d, - DeformConv, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from .backbone import Backbone -from .build import BACKBONE_REGISTRY - -__all__ = [ - "ResNetBlockBase", - "BasicBlock", - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", - "ResNet", - "make_stage", - "build_resnet_backbone", -] - - -class BasicBlock(CNNBlockBase): - """ - The basic residual block for ResNet-18 and ResNet-34 defined in :paper:`ResNet`, - with two 3x3 conv layers and a projection shortcut if needed. - """ - - def __init__(self, in_channels, out_channels, *, stride=1, norm="BN"): - """ - Args: - in_channels (int): Number of input channels. - out_channels (int): Number of output channels. - stride (int): Stride for the first conv. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=stride, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - self.conv2 = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - out = self.conv2(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BottleneckBlock(CNNBlockBase): - """ - The standard bottleneck residual block used by ResNet-50, 101 and 152 - defined in :paper:`ResNet`. It contains 3 conv layers with kernels - 1x1, 3x3, 1x1, and a projection shortcut if needed. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - ): - """ - Args: - bottleneck_channels (int): number of output channels for the 3x3 - "bottleneck" conv layers. - num_groups (int): number of groups for the 3x3 conv layer. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - stride_in_1x1 (bool): when stride>1, whether to put stride in the - first 1x1 convolution or the bottleneck 3x3 convolution. - dilation (int): the dilation rate of the 3x3 conv layer. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - # The original MSRA ResNet models have stride in the first 1x1 conv - # The subsequent fb.torch.resnet and Caffe2 ResNe[X]t implementations have - # stride in the 3x3 conv - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv2 = Conv2d( - bottleneck_channels, - bottleneck_channels, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - # Zero-initialize the last normalization in each residual branch, - # so that at the beginning, the residual branch starts with zeros, - # and each residual block behaves like an identity. - # See Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "For BN layers, the learnable scaling coefficient γ is initialized - # to be 1, except for each residual block's last BN - # where γ is initialized to be 0." - - # nn.init.constant_(self.conv3.norm.weight, 0) - # TODO this somehow hurts performance when training GN models from scratch. - # Add it as an option when we need to use this code to train a backbone. - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - out = self.conv2(out) - out = F.relu_(out) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class DeformBottleneckBlock(CNNBlockBase): - """ - Similar to :class:`BottleneckBlock`, but with :paper:`deformable conv ` - in the 3x3 convolution. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - deform_modulated=False, - deform_num_groups=1, - ): - super().__init__(in_channels, out_channels, stride) - self.deform_modulated = deform_modulated - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - if deform_modulated: - deform_conv_op = ModulatedDeformConv - # offset channels are 2 or 3 (if with modulated) * kernel_size * kernel_size - offset_channels = 27 - else: - deform_conv_op = DeformConv - offset_channels = 18 - - self.conv2_offset = Conv2d( - bottleneck_channels, - offset_channels * deform_num_groups, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - dilation=dilation, - ) - self.conv2 = deform_conv_op( - bottleneck_channels, - bottleneck_channels, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - deformable_groups=deform_num_groups, - norm=get_norm(norm, bottleneck_channels), - ) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - nn.init.constant_(self.conv2_offset.weight, 0) - nn.init.constant_(self.conv2_offset.bias, 0) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - if self.deform_modulated: - offset_mask = self.conv2_offset(out) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - out = self.conv2(out, offset, mask) - else: - offset = self.conv2_offset(out) - out = self.conv2(out, offset) - out = F.relu_(out) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BasicStem(CNNBlockBase): - """ - The standard ResNet stem (layers before the first residual block), - with a conv, relu and max_pool. - """ - - def __init__(self, in_channels=3, out_channels=64, norm="BN"): - """ - Args: - norm (str or callable): norm after the first conv layer. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, 4) - self.in_channels = in_channels - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=7, - stride=2, - padding=3, - bias=False, - norm=get_norm(norm, out_channels), - ) - weight_init.c2_msra_fill(self.conv1) - - def forward(self, x): - x = self.conv1(x) - x = F.relu_(x) - x = F.max_pool2d(x, kernel_size=3, stride=2, padding=1) - return x - - -class ResNet(Backbone): - """ - Implement :paper:`ResNet`. - """ - - def __init__(self, stem, stages, num_classes=None, out_features=None, freeze_at=0): - """ - Args: - stem (nn.Module): a stem module - stages (list[list[CNNBlockBase]]): several (typically 4) stages, - each contains multiple :class:`CNNBlockBase`. 
- num_classes (None or int): if None, will not perform classification. - Otherwise, will create a linear layer. - out_features (list[str]): name of the layers whose outputs should - be returned in forward. Can be anything in "stem", "linear", or "res2" ... - If None, will return the output of the last layer. - freeze_at (int): The number of stages at the beginning to freeze. - see :meth:`freeze` for detailed explanation. - """ - super().__init__() - self.stem = stem - self.num_classes = num_classes - - current_stride = self.stem.stride - self._out_feature_strides = {"stem": current_stride} - self._out_feature_channels = {"stem": self.stem.out_channels} - - self.stage_names, self.stages = [], [] - - if out_features is not None: - # Avoid keeping unused layers in this module. They consume extra memory - # and may cause allreduce to fail - num_stages = max( - [{"res2": 1, "res3": 2, "res4": 3, "res5": 4}.get(f, 0) for f in out_features] - ) - stages = stages[:num_stages] - for i, blocks in enumerate(stages): - assert len(blocks) > 0, len(blocks) - for block in blocks: - assert isinstance(block, CNNBlockBase), block - - name = "res" + str(i + 2) - stage = nn.Sequential(*blocks) - - self.add_module(name, stage) - self.stage_names.append(name) - self.stages.append(stage) - - self._out_feature_strides[name] = current_stride = int( - current_stride * np.prod([k.stride for k in blocks]) - ) - self._out_feature_channels[name] = curr_channels = blocks[-1].out_channels - self.stage_names = tuple(self.stage_names) # Make it static for scripting - - if num_classes is not None: - self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) - self.linear = nn.Linear(curr_channels, num_classes) - - # Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "The 1000-way fully-connected layer is initialized by - # drawing weights from a zero-mean Gaussian with standard deviation of 0.01." - nn.init.normal_(self.linear.weight, std=0.01) - name = "linear" - - if out_features is None: - out_features = [name] - self._out_features = out_features - assert len(self._out_features) - children = [x[0] for x in self.named_children()] - for out_feature in self._out_features: - assert out_feature in children, "Available children: {}".format(", ".join(children)) - self.freeze(freeze_at) - - def forward(self, x): - """ - Args: - x: Tensor of shape (N,C,H,W). H, W must be a multiple of ``self.size_divisibility``. - - Returns: - dict[str->Tensor]: names and the corresponding features - """ - assert x.dim() == 4, f"ResNet takes an input of shape (N, C, H, W). Got {x.shape} instead!" - outputs = {} - x = self.stem(x) - if "stem" in self._out_features: - outputs["stem"] = x - for name, stage in zip(self.stage_names, self.stages): - x = stage(x) - if name in self._out_features: - outputs[name] = x - if self.num_classes is not None: - x = self.avgpool(x) - x = torch.flatten(x, 1) - x = self.linear(x) - if "linear" in self._out_features: - outputs["linear"] = x - return outputs - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - def freeze(self, freeze_at=0): - """ - Freeze the first several stages of the ResNet. Commonly used in - fine-tuning. - - Layers that produce the same feature map spatial size are defined as one - "stage" by :paper:`FPN`. - - Args: - freeze_at (int): number of stages to freeze. - `1` means freezing the stem. `2` means freezing the stem and - one residual stage, etc. 
-
- Returns:
- nn.Module: this ResNet itself
- """
- if freeze_at >= 1:
- self.stem.freeze()
- for idx, stage in enumerate(self.stages, start=2):
- if freeze_at >= idx:
- for block in stage.children():
- block.freeze()
- return self
-
- @staticmethod
- def make_stage(block_class, num_blocks, *, in_channels, out_channels, **kwargs):
- """
- Create a list of blocks of the same type that forms one ResNet stage.
-
- Args:
- block_class (type): a subclass of CNNBlockBase that's used to create all blocks in this
- stage. A module of this type must not change spatial resolution of inputs unless its
- stride != 1.
- num_blocks (int): number of blocks in this stage
- in_channels (int): input channels of the entire stage.
- out_channels (int): output channels of **every block** in the stage.
- kwargs: other arguments passed to the constructor of
- `block_class`. If the argument name is "xx_per_block", the
- argument is a list of values to be passed to each block in the
- stage. Otherwise, the same argument is passed to every block
- in the stage.
-
- Returns:
- list[CNNBlockBase]: a list of block modules.
-
- Examples:
- ::
- stage = ResNet.make_stage(
- BottleneckBlock, 3, in_channels=16, out_channels=64,
- bottleneck_channels=16, num_groups=1,
- stride_per_block=[2, 1, 1],
- dilations_per_block=[1, 1, 2]
- )
-
- Usually, layers that produce the same feature map spatial size are defined as one
- "stage" (in :paper:`FPN`). Under such definition, ``stride_per_block[1:]`` should
- all be 1.
- """
- blocks = []
- for i in range(num_blocks):
- curr_kwargs = {}
- for k, v in kwargs.items():
- if k.endswith("_per_block"):
- assert len(v) == num_blocks, (
- f"Argument '{k}' of make_stage should have the "
- f"same length as num_blocks={num_blocks}."
- )
- newk = k[: -len("_per_block")]
- assert newk not in kwargs, f"Cannot call make_stage with both {k} and {newk}!"
- curr_kwargs[newk] = v[i]
- else:
- curr_kwargs[k] = v
-
- blocks.append(
- block_class(in_channels=in_channels, out_channels=out_channels, **curr_kwargs)
- )
- in_channels = out_channels
- return blocks
-
- @staticmethod
- def make_default_stages(depth, block_class=None, **kwargs):
- """
- Create a list of ResNet stages from a pre-defined depth (one of 18, 34, 50, 101, 152).
- If it doesn't create the ResNet variant you need, please use :meth:`make_stage`
- instead for fine-grained customization.
-
- Args:
- depth (int): depth of ResNet
- block_class (type): the CNN block class. Has to accept
- `bottleneck_channels` argument for depth >= 50.
- By default it is BasicBlock or BottleneckBlock, based on the
- depth.
- kwargs:
- other arguments to pass to `make_stage`. Should not contain
- stride and channels, as they are predefined for each depth.
-
- Returns:
- list[list[CNNBlockBase]]: modules in all stages; see arguments of
- :class:`ResNet.__init__`.
- """ - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - if block_class is None: - block_class = BasicBlock if depth < 50 else BottleneckBlock - if depth < 50: - in_channels = [64, 64, 128, 256] - out_channels = [64, 128, 256, 512] - else: - in_channels = [64, 256, 512, 1024] - out_channels = [256, 512, 1024, 2048] - ret = [] - for (n, s, i, o) in zip(num_blocks_per_stage, [1, 2, 2, 2], in_channels, out_channels): - if depth >= 50: - kwargs["bottleneck_channels"] = o // 4 - ret.append( - ResNet.make_stage( - block_class=block_class, - num_blocks=n, - stride_per_block=[s] + [1] * (n - 1), - in_channels=i, - out_channels=o, - **kwargs, - ) - ) - return ret - - -ResNetBlockBase = CNNBlockBase -""" -Alias for backward compatibiltiy. -""" - - -def make_stage(*args, **kwargs): - """ - Deprecated alias for backward compatibiltiy. - """ - return ResNet.make_stage(*args, **kwargs) - - -@BACKBONE_REGISTRY.register() -def build_resnet_backbone(cfg, input_shape): - """ - Create a ResNet instance from config. - - Returns: - ResNet: a :class:`ResNet` instance. - """ - # need registration of new blocks/stems? - norm = cfg.MODEL.RESNETS.NORM - stem = BasicStem( - in_channels=input_shape.channels, - out_channels=cfg.MODEL.RESNETS.STEM_OUT_CHANNELS, - norm=norm, - ) - - # fmt: off - freeze_at = cfg.MODEL.BACKBONE.FREEZE_AT - out_features = cfg.MODEL.RESNETS.OUT_FEATURES - depth = cfg.MODEL.RESNETS.DEPTH - num_groups = cfg.MODEL.RESNETS.NUM_GROUPS - width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP - bottleneck_channels = num_groups * width_per_group - in_channels = cfg.MODEL.RESNETS.STEM_OUT_CHANNELS - out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS - stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 - res5_dilation = cfg.MODEL.RESNETS.RES5_DILATION - deform_on_per_stage = cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE - deform_modulated = cfg.MODEL.RESNETS.DEFORM_MODULATED - deform_num_groups = cfg.MODEL.RESNETS.DEFORM_NUM_GROUPS - # fmt: on - assert res5_dilation in {1, 2}, "res5_dilation cannot be {}.".format(res5_dilation) - - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - - if depth in [18, 34]: - assert out_channels == 64, "Must set MODEL.RESNETS.RES2_OUT_CHANNELS = 64 for R18/R34" - assert not any( - deform_on_per_stage - ), "MODEL.RESNETS.DEFORM_ON_PER_STAGE unsupported for R18/R34" - assert res5_dilation == 1, "Must set MODEL.RESNETS.RES5_DILATION = 1 for R18/R34" - assert num_groups == 1, "Must set MODEL.RESNETS.NUM_GROUPS = 1 for R18/R34" - - stages = [] - - for idx, stage_idx in enumerate(range(2, 6)): - # res5_dilation is used this way as a convention in R-FCN & Deformable Conv paper - dilation = res5_dilation if stage_idx == 5 else 1 - first_stride = 1 if idx == 0 or (stage_idx == 5 and dilation == 2) else 2 - stage_kargs = { - "num_blocks": num_blocks_per_stage[idx], - "stride_per_block": [first_stride] + [1] * (num_blocks_per_stage[idx] - 1), - "in_channels": in_channels, - "out_channels": out_channels, - "norm": norm, - } - # Use BasicBlock for R18 and R34. 
- if depth in [18, 34]: - stage_kargs["block_class"] = BasicBlock - else: - stage_kargs["bottleneck_channels"] = bottleneck_channels - stage_kargs["stride_in_1x1"] = stride_in_1x1 - stage_kargs["dilation"] = dilation - stage_kargs["num_groups"] = num_groups - if deform_on_per_stage[idx]: - stage_kargs["block_class"] = DeformBottleneckBlock - stage_kargs["deform_modulated"] = deform_modulated - stage_kargs["deform_num_groups"] = deform_num_groups - else: - stage_kargs["block_class"] = BottleneckBlock - blocks = ResNet.make_stage(**stage_kargs) - in_channels = out_channels - out_channels *= 2 - bottleneck_channels *= 2 - stages.append(blocks) - return ResNet(stem, stages, out_features=out_features, freeze_at=freeze_at) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py deleted file mode 100755 index b24c123f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/box_regression.py +++ /dev/null @@ -1,369 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from typing import List, Tuple, Union -import torch -from fvcore.nn import giou_loss, smooth_l1_loss -from torch.nn import functional as F - -from detectron2.layers import cat, ciou_loss, diou_loss -from detectron2.structures import Boxes - -# Value for clamping large dw and dh predictions. The heuristic is that we clamp -# such that dw and dh are no larger than what would transform a 16px box into a -# 1000px box (based on a small anchor, 16px, and a typical image size, 1000px). -_DEFAULT_SCALE_CLAMP = math.log(1000.0 / 16) - - -__all__ = ["Box2BoxTransform", "Box2BoxTransformRotated", "Box2BoxTransformLinear"] - - -@torch.jit.script -class Box2BoxTransform(object): - """ - The box-to-box transform defined in R-CNN. The transformation is parameterized - by 4 deltas: (dx, dy, dw, dh). The transformation scales the box's width and height - by exp(dw), exp(dh) and shifts a box's center by the offset (dx * width, dy * height). - """ - - def __init__( - self, weights: Tuple[float, float, float, float], scale_clamp: float = _DEFAULT_SCALE_CLAMP - ): - """ - Args: - weights (4-element tuple): Scaling factors that are applied to the - (dx, dy, dw, dh) deltas. In Fast R-CNN, these were originally set - such that the deltas have unit variance; now they are treated as - hyperparameters of the system. - scale_clamp (float): When predicting deltas, the predicted box scaling - factors (dw and dh) are clamped such that they are <= scale_clamp. - """ - self.weights = weights - self.scale_clamp = scale_clamp - - def get_deltas(self, src_boxes, target_boxes): - """ - Get box regression transformation deltas (dx, dy, dw, dh) that can be used - to transform the `src_boxes` into the `target_boxes`. That is, the relation - ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true (unless - any delta is too large and is clamped). - - Args: - src_boxes (Tensor): source boxes, e.g., object proposals - target_boxes (Tensor): target of the transformation, e.g., ground-truth - boxes. 
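A round-trip sanity sketch for this parameterization (`apply_deltas` is defined just below; the box coordinates are made up):

```
import torch

t = Box2BoxTransform(weights=(1.0, 1.0, 1.0, 1.0))
src = torch.tensor([[0.0, 0.0, 16.0, 16.0]])  # a 16x16 proposal
tgt = torch.tensor([[4.0, 2.0, 28.0, 30.0]])  # ground-truth box
deltas = t.get_deltas(src, tgt)               # (dx, dy, dw, dh)
assert torch.allclose(t.apply_deltas(deltas, src), tgt, atol=1e-4)
```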
- """ - assert isinstance(src_boxes, torch.Tensor), type(src_boxes) - assert isinstance(target_boxes, torch.Tensor), type(target_boxes) - - src_widths = src_boxes[:, 2] - src_boxes[:, 0] - src_heights = src_boxes[:, 3] - src_boxes[:, 1] - src_ctr_x = src_boxes[:, 0] + 0.5 * src_widths - src_ctr_y = src_boxes[:, 1] + 0.5 * src_heights - - target_widths = target_boxes[:, 2] - target_boxes[:, 0] - target_heights = target_boxes[:, 3] - target_boxes[:, 1] - target_ctr_x = target_boxes[:, 0] + 0.5 * target_widths - target_ctr_y = target_boxes[:, 1] + 0.5 * target_heights - - wx, wy, ww, wh = self.weights - dx = wx * (target_ctr_x - src_ctr_x) / src_widths - dy = wy * (target_ctr_y - src_ctr_y) / src_heights - dw = ww * torch.log(target_widths / src_widths) - dh = wh * torch.log(target_heights / src_heights) - - deltas = torch.stack((dx, dy, dw, dh), dim=1) - assert (src_widths > 0).all().item(), "Input boxes to Box2BoxTransform are not valid!" - return deltas - - def apply_deltas(self, deltas, boxes): - """ - Apply transformation `deltas` (dx, dy, dw, dh) to `boxes`. - - Args: - deltas (Tensor): transformation deltas of shape (N, k*4), where k >= 1. - deltas[i] represents k potentially different class-specific - box transformations for the single box boxes[i]. - boxes (Tensor): boxes to transform, of shape (N, 4) - """ - deltas = deltas.float() # ensure fp32 for decoding precision - boxes = boxes.to(deltas.dtype) - - widths = boxes[:, 2] - boxes[:, 0] - heights = boxes[:, 3] - boxes[:, 1] - ctr_x = boxes[:, 0] + 0.5 * widths - ctr_y = boxes[:, 1] + 0.5 * heights - - wx, wy, ww, wh = self.weights - dx = deltas[:, 0::4] / wx - dy = deltas[:, 1::4] / wy - dw = deltas[:, 2::4] / ww - dh = deltas[:, 3::4] / wh - - # Prevent sending too large values into torch.exp() - dw = torch.clamp(dw, max=self.scale_clamp) - dh = torch.clamp(dh, max=self.scale_clamp) - - pred_ctr_x = dx * widths[:, None] + ctr_x[:, None] - pred_ctr_y = dy * heights[:, None] + ctr_y[:, None] - pred_w = torch.exp(dw) * widths[:, None] - pred_h = torch.exp(dh) * heights[:, None] - - x1 = pred_ctr_x - 0.5 * pred_w - y1 = pred_ctr_y - 0.5 * pred_h - x2 = pred_ctr_x + 0.5 * pred_w - y2 = pred_ctr_y + 0.5 * pred_h - pred_boxes = torch.stack((x1, y1, x2, y2), dim=-1) - return pred_boxes.reshape(deltas.shape) - - -@torch.jit.script -class Box2BoxTransformRotated(object): - """ - The box-to-box transform defined in Rotated R-CNN. The transformation is parameterized - by 5 deltas: (dx, dy, dw, dh, da). The transformation scales the box's width and height - by exp(dw), exp(dh), shifts a box's center by the offset (dx * width, dy * height), - and rotate a box's angle by da (radians). - Note: angles of deltas are in radians while angles of boxes are in degrees. - """ - - def __init__( - self, - weights: Tuple[float, float, float, float, float], - scale_clamp: float = _DEFAULT_SCALE_CLAMP, - ): - """ - Args: - weights (5-element tuple): Scaling factors that are applied to the - (dx, dy, dw, dh, da) deltas. These are treated as - hyperparameters of the system. - scale_clamp (float): When predicting deltas, the predicted box scaling - factors (dw and dh) are clamped such that they are <= scale_clamp. - """ - self.weights = weights - self.scale_clamp = scale_clamp - - def get_deltas(self, src_boxes, target_boxes): - """ - Get box regression transformation deltas (dx, dy, dw, dh, da) that can be used - to transform the `src_boxes` into the `target_boxes`. 
That is, the relation
- ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true (unless
- any delta is too large and is clamped).
-
- Args:
- src_boxes (Tensor): Nx5 source boxes, e.g., object proposals
- target_boxes (Tensor): Nx5 target of the transformation, e.g., ground-truth
- boxes.
- """
- assert isinstance(src_boxes, torch.Tensor), type(src_boxes)
- assert isinstance(target_boxes, torch.Tensor), type(target_boxes)
-
- src_ctr_x, src_ctr_y, src_widths, src_heights, src_angles = torch.unbind(src_boxes, dim=1)
-
- target_ctr_x, target_ctr_y, target_widths, target_heights, target_angles = torch.unbind(
- target_boxes, dim=1
- )
-
- wx, wy, ww, wh, wa = self.weights
- dx = wx * (target_ctr_x - src_ctr_x) / src_widths
- dy = wy * (target_ctr_y - src_ctr_y) / src_heights
- dw = ww * torch.log(target_widths / src_widths)
- dh = wh * torch.log(target_heights / src_heights)
- # Angles of deltas are in radians while angles of boxes are in degrees.
- # the conversion to radians serves as a way to normalize the values
- da = target_angles - src_angles
- da = (da + 180.0) % 360.0 - 180.0 # make it in [-180, 180)
- da *= wa * math.pi / 180.0
-
- deltas = torch.stack((dx, dy, dw, dh, da), dim=1)
- assert (
- (src_widths > 0).all().item()
- ), "Input boxes to Box2BoxTransformRotated are not valid!"
- return deltas
-
- def apply_deltas(self, deltas, boxes):
- """
- Apply transformation `deltas` (dx, dy, dw, dh, da) to `boxes`.
-
- Args:
- deltas (Tensor): transformation deltas of shape (N, k*5).
- deltas[i] represents box transformation for the single box boxes[i].
- boxes (Tensor): boxes to transform, of shape (N, 5)
- """
- assert deltas.shape[1] % 5 == 0 and boxes.shape[1] == 5
-
- boxes = boxes.to(deltas.dtype).unsqueeze(2)
-
- ctr_x = boxes[:, 0]
- ctr_y = boxes[:, 1]
- widths = boxes[:, 2]
- heights = boxes[:, 3]
- angles = boxes[:, 4]
-
- wx, wy, ww, wh, wa = self.weights
-
- dx = deltas[:, 0::5] / wx
- dy = deltas[:, 1::5] / wy
- dw = deltas[:, 2::5] / ww
- dh = deltas[:, 3::5] / wh
- da = deltas[:, 4::5] / wa
-
- # Prevent sending too large values into torch.exp()
- dw = torch.clamp(dw, max=self.scale_clamp)
- dh = torch.clamp(dh, max=self.scale_clamp)
-
- pred_boxes = torch.zeros_like(deltas)
- pred_boxes[:, 0::5] = dx * widths + ctr_x # x_ctr
- pred_boxes[:, 1::5] = dy * heights + ctr_y # y_ctr
- pred_boxes[:, 2::5] = torch.exp(dw) * widths # width
- pred_boxes[:, 3::5] = torch.exp(dh) * heights # height
-
- # Following original RRPN implementation,
- # angles of deltas are in radians while angles of boxes are in degrees.
- pred_angle = da * 180.0 / math.pi + angles
- pred_angle = (pred_angle + 180.0) % 360.0 - 180.0 # make it in [-180, 180)
-
- pred_boxes[:, 4::5] = pred_angle
-
- return pred_boxes
-
-
-class Box2BoxTransformLinear(object):
- """
- The linear box-to-box transform defined in FCOS. The transformation is parameterized
- by the distance from the center of (square) src box to 4 edges of the target box.
- """
-
- def __init__(self, normalize_by_size=True):
- """
- Args:
- normalize_by_size: normalize deltas by the size of src (anchor) boxes.
- """
- self.normalize_by_size = normalize_by_size
-
- def get_deltas(self, src_boxes, target_boxes):
- """
- Get box regression transformation deltas (dx1, dy1, dx2, dy2) that can be used
- to transform the `src_boxes` into the `target_boxes`. That is, the relation
- ``target_boxes == self.apply_deltas(deltas, src_boxes)`` is true.
- The center of src must be inside target boxes.
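Numerically, the deltas here are just the distances from the anchor center to the four edges of the target box. A small sketch using the methods defined in the body that follows (the coordinates are made up):

```
import torch

t = Box2BoxTransformLinear(normalize_by_size=False)
src = torch.tensor([[8.0, 8.0, 24.0, 24.0]])   # anchor, center (16, 16)
tgt = torch.tensor([[10.0, 4.0, 30.0, 26.0]])
deltas = t.get_deltas(src, tgt)                # [[6., 12., 14., 10.]] = (l, t, r, b)
assert torch.allclose(t.apply_deltas(deltas, src), tgt)
```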
- - Args: - src_boxes (Tensor): square source boxes, e.g., anchors - target_boxes (Tensor): target of the transformation, e.g., ground-truth - boxes. - """ - assert isinstance(src_boxes, torch.Tensor), type(src_boxes) - assert isinstance(target_boxes, torch.Tensor), type(target_boxes) - - src_ctr_x = 0.5 * (src_boxes[:, 0] + src_boxes[:, 2]) - src_ctr_y = 0.5 * (src_boxes[:, 1] + src_boxes[:, 3]) - - target_l = src_ctr_x - target_boxes[:, 0] - target_t = src_ctr_y - target_boxes[:, 1] - target_r = target_boxes[:, 2] - src_ctr_x - target_b = target_boxes[:, 3] - src_ctr_y - - deltas = torch.stack((target_l, target_t, target_r, target_b), dim=1) - if self.normalize_by_size: - stride_w = src_boxes[:, 2] - src_boxes[:, 0] - stride_h = src_boxes[:, 3] - src_boxes[:, 1] - strides = torch.stack([stride_w, stride_h, stride_w, stride_h], axis=1) - deltas = deltas / strides - - return deltas - - def apply_deltas(self, deltas, boxes): - """ - Apply transformation `deltas` (dx1, dy1, dx2, dy2) to `boxes`. - - Args: - deltas (Tensor): transformation deltas of shape (N, k*4), where k >= 1. - deltas[i] represents k potentially different class-specific - box transformations for the single box boxes[i]. - boxes (Tensor): boxes to transform, of shape (N, 4) - """ - # Ensure the output is a valid box. See Sec 2.1 of https://arxiv.org/abs/2006.09214 - deltas = F.relu(deltas) - boxes = boxes.to(deltas.dtype) - - ctr_x = 0.5 * (boxes[:, 0] + boxes[:, 2]) - ctr_y = 0.5 * (boxes[:, 1] + boxes[:, 3]) - if self.normalize_by_size: - stride_w = boxes[:, 2] - boxes[:, 0] - stride_h = boxes[:, 3] - boxes[:, 1] - strides = torch.stack([stride_w, stride_h, stride_w, stride_h], axis=1) - deltas = deltas * strides - - l = deltas[:, 0::4] - t = deltas[:, 1::4] - r = deltas[:, 2::4] - b = deltas[:, 3::4] - - pred_boxes = torch.zeros_like(deltas) - pred_boxes[:, 0::4] = ctr_x[:, None] - l # x1 - pred_boxes[:, 1::4] = ctr_y[:, None] - t # y1 - pred_boxes[:, 2::4] = ctr_x[:, None] + r # x2 - pred_boxes[:, 3::4] = ctr_y[:, None] + b # y2 - return pred_boxes - - -def _dense_box_regression_loss( - anchors: List[Union[Boxes, torch.Tensor]], - box2box_transform: Box2BoxTransform, - pred_anchor_deltas: List[torch.Tensor], - gt_boxes: List[torch.Tensor], - fg_mask: torch.Tensor, - box_reg_loss_type="smooth_l1", - smooth_l1_beta=0.0, -): - """ - Compute loss for dense multi-level box regression. - Loss is accumulated over ``fg_mask``. - - Args: - anchors: #lvl anchor boxes, each is (HixWixA, 4) - pred_anchor_deltas: #lvl predictions, each is (N, HixWixA, 4) - gt_boxes: N ground truth boxes, each has shape (R, 4) (R = sum(Hi * Wi * A)) - fg_mask: the foreground boolean mask of shape (N, R) to compute loss on - box_reg_loss_type (str): Loss type to use. Supported losses: "smooth_l1", "giou", - "diou", "ciou". - smooth_l1_beta (float): beta parameter for the smooth L1 regression loss. Default to - use L1 loss. 
Only used when `box_reg_loss_type` is "smooth_l1"
- """
- if isinstance(anchors[0], Boxes):
- anchors = type(anchors[0]).cat(anchors).tensor # (R, 4)
- else:
- anchors = cat(anchors)
- if box_reg_loss_type == "smooth_l1":
- gt_anchor_deltas = [box2box_transform.get_deltas(anchors, k) for k in gt_boxes]
- gt_anchor_deltas = torch.stack(gt_anchor_deltas) # (N, R, 4)
- loss_box_reg = smooth_l1_loss(
- cat(pred_anchor_deltas, dim=1)[fg_mask],
- gt_anchor_deltas[fg_mask],
- beta=smooth_l1_beta,
- reduction="sum",
- )
- elif box_reg_loss_type == "giou":
- pred_boxes = [
- box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1)
- ]
- loss_box_reg = giou_loss(
- torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum"
- )
- elif box_reg_loss_type == "diou":
- pred_boxes = [
- box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1)
- ]
- loss_box_reg = diou_loss(
- torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum"
- )
- elif box_reg_loss_type == "ciou":
- pred_boxes = [
- box2box_transform.apply_deltas(k, anchors) for k in cat(pred_anchor_deltas, dim=1)
- ]
- loss_box_reg = ciou_loss(
- torch.stack(pred_boxes)[fg_mask], torch.stack(gt_boxes)[fg_mask], reduction="sum"
- )
- else:
- raise ValueError(f"Invalid dense box regression loss type '{box_reg_loss_type}'")
- return loss_box_reg
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py
deleted file mode 100755
index c7597cab..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/matcher.py
+++ /dev/null
@@ -1,127 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from typing import List
-import torch
-
-from detectron2.layers import nonzero_tuple
-
-
-# TODO: the name is too general
-class Matcher(object):
- """
- This class assigns to each predicted "element" (e.g., a box) a ground-truth
- element. Each predicted element will have exactly zero or one matches; each
- ground-truth element may be matched to zero or more predicted elements.
-
- The matching is determined by the MxN match_quality_matrix, which characterizes
- how well each (ground-truth, prediction) pair matches. For example,
- if the elements are boxes, this matrix may contain box intersection-over-union
- overlap values.
-
- The matcher returns (a) a vector of length N containing the index of the
- ground-truth element m in [0, M) that matches prediction n in [0, N), and
- (b) a vector of length N containing the labels for each prediction.
- """
-
- def __init__(
- self, thresholds: List[float], labels: List[int], allow_low_quality_matches: bool = False
- ):
- """
- Args:
- thresholds (list): a list of thresholds used to stratify predictions
- into levels.
- labels (list): a list of values to label predictions belonging to
- each level. A label can be one of {-1, 0, 1} signifying
- {ignore, negative class, positive class}, respectively.
- allow_low_quality_matches (bool): if True, produce additional matches
- for predictions with maximum match quality lower than high_threshold.
- See set_low_quality_matches_ for more details.
-
- For example,
- thresholds = [0.3, 0.5]
- labels = [0, -1, 1]
- All predictions with iou < 0.3 will be marked with 0 and
- thus will be considered as false positives while training.
- All predictions with 0.3 <= iou < 0.5 will be marked with -1 and
- thus will be ignored.
- All predictions with 0.5 <= iou will be marked with 1 and
- thus will be considered as true positives.
- """
- # Add -inf and +inf to first and last position in thresholds
- thresholds = thresholds[:]
- assert thresholds[0] > 0
- thresholds.insert(0, -float("inf"))
- thresholds.append(float("inf"))
- # Currently torchscript does not support all + generator
- assert all([low <= high for (low, high) in zip(thresholds[:-1], thresholds[1:])])
- assert all([l in [-1, 0, 1] for l in labels])
- assert len(labels) == len(thresholds) - 1
- self.thresholds = thresholds
- self.labels = labels
- self.allow_low_quality_matches = allow_low_quality_matches
-
- def __call__(self, match_quality_matrix):
- """
- Args:
- match_quality_matrix (Tensor[float]): an MxN tensor, containing the
- pairwise quality between M ground-truth elements and N predicted
- elements. All elements must be >= 0 (due to the use of `torch.nonzero`
- for selecting indices in :meth:`set_low_quality_matches_`).
-
- Returns:
- matches (Tensor[int64]): a vector of length N, where matches[i] is a matched
- ground-truth index in [0, M)
- match_labels (Tensor[int8]): a vector of length N, where pred_labels[i] indicates
- whether a prediction is a true or false positive or ignored
- """
- assert match_quality_matrix.dim() == 2
- if match_quality_matrix.numel() == 0:
- default_matches = match_quality_matrix.new_full(
- (match_quality_matrix.size(1),), 0, dtype=torch.int64
- )
- # When no gt boxes exist, we define IOU = 0 and therefore set labels
- # to `self.labels[0]`, which usually defaults to background class 0
- # To choose to ignore instead, can make labels=[-1,0,-1,1] + set appropriate thresholds
- default_match_labels = match_quality_matrix.new_full(
- (match_quality_matrix.size(1),), self.labels[0], dtype=torch.int8
- )
- return default_matches, default_match_labels
-
- assert torch.all(match_quality_matrix >= 0)
-
- # match_quality_matrix is M (gt) x N (predicted)
- # Max over gt elements (dim 0) to find best gt candidate for each prediction
- matched_vals, matches = match_quality_matrix.max(dim=0)
-
- match_labels = matches.new_full(matches.size(), 1, dtype=torch.int8)
-
- for (l, low, high) in zip(self.labels, self.thresholds[:-1], self.thresholds[1:]):
- low_high = (matched_vals >= low) & (matched_vals < high)
- match_labels[low_high] = l
-
- if self.allow_low_quality_matches:
- self.set_low_quality_matches_(match_labels, match_quality_matrix)
-
- return matches, match_labels
-
- def set_low_quality_matches_(self, match_labels, match_quality_matrix):
- """
- Produce additional matches for predictions that have only low-quality matches.
- Specifically, for each ground-truth G find the set of predictions that have
- maximum overlap with it (including ties); for each prediction in that set, if
- it is unmatched, then match it to the ground-truth G.
-
- This function implements the RPN assignment case (i) in Sec. 3.1.2 of
- :paper:`Faster R-CNN`.
- """
- # For each gt, find the prediction with which it has highest quality
- highest_quality_foreach_gt, _ = match_quality_matrix.max(dim=1)
- # Find the highest quality match available, even if it is low, including ties.
- # Note that the match qualities must be positive due to the use of
- # `torch.nonzero`.
- _, pred_inds_with_highest_quality = nonzero_tuple(
- match_quality_matrix == highest_quality_foreach_gt[:, None]
- )
- # If an anchor was labeled positive only due to a low-quality match
- # with gt_A, but it has larger overlap with gt_B, its matched index will still be gt_B.
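A tiny worked example of the thresholds/labels scheme documented above (the IoU matrix is made up; rows are ground truths, columns are predictions):

```
import torch

matcher = Matcher(thresholds=[0.3, 0.5], labels=[0, -1, 1])
iou = torch.tensor([[0.9, 0.4, 0.1],
                    [0.2, 0.35, 0.05]])
matches, labels = matcher(iou)
# matches -> tensor([0, 0, 0]): best ground-truth index per prediction
# labels  -> tensor([1, -1, 0]): positive / ignored / negative
```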
- # This follows the implementation in Detectron, and is found to have no significant impact. - match_labels[pred_inds_with_highest_quality] = 1 diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py deleted file mode 100755 index 6b066815..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/__init__.py +++ /dev/null @@ -1,16 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -from .build import META_ARCH_REGISTRY, build_model # isort:skip - -from .panoptic_fpn import PanopticFPN - -# import all the meta_arch, so they will be registered -from .rcnn import GeneralizedRCNN, ProposalNetwork -from .dense_detector import DenseDetector -from .retinanet import RetinaNet -from .fcos import FCOS -from .semantic_seg import SEM_SEG_HEADS_REGISTRY, SemanticSegmentor, build_sem_seg_head - - -__all__ = list(globals().keys()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py deleted file mode 100755 index 34272157..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/build.py +++ /dev/null @@ -1,25 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch - -from detectron2.utils.logger import _log_api_usage -from detectron2.utils.registry import Registry - -META_ARCH_REGISTRY = Registry("META_ARCH") # noqa F401 isort:skip -META_ARCH_REGISTRY.__doc__ = """ -Registry for meta-architectures, i.e. the whole model. - -The registered object will be called with `obj(cfg)` -and expected to return a `nn.Module` object. -""" - - -def build_model(cfg): - """ - Build the whole model architecture, defined by ``cfg.MODEL.META_ARCHITECTURE``. - Note that it does not load any weights from ``cfg``. - """ - meta_arch = cfg.MODEL.META_ARCHITECTURE - model = META_ARCH_REGISTRY.get(meta_arch)(cfg) - model.to(torch.device(cfg.MODEL.DEVICE)) - _log_api_usage("modeling.meta_arch." + meta_arch) - return model diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py deleted file mode 100755 index 382eab97..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/dense_detector.py +++ /dev/null @@ -1,282 +0,0 @@ -import numpy as np -from typing import Dict, List, Optional, Tuple -import torch -from torch import Tensor, nn - -from detectron2.data.detection_utils import convert_image_to_rgb -from detectron2.modeling import Backbone -from detectron2.structures import Boxes, ImageList, Instances -from detectron2.utils.events import get_event_storage - -from ..postprocessing import detector_postprocess - - -def permute_to_N_HWA_K(tensor, K: int): - """ - Transpose/reshape a tensor from (N, (Ai x K), H, W) to (N, (HxWxAi), K) - """ - assert tensor.dim() == 4, tensor.shape - N, _, H, W = tensor.shape - tensor = tensor.view(N, -1, K, H, W) - tensor = tensor.permute(0, 3, 4, 1, 2) - tensor = tensor.reshape(N, -1, K) # Size=(N,HWA,K) - return tensor - - -class DenseDetector(nn.Module): - """ - Base class for dense detector. We define a dense detector as a fully-convolutional model that - makes per-pixel (i.e. dense) predictions. 
- """ - - def __init__( - self, - backbone: Backbone, - head: nn.Module, - head_in_features: Optional[List[str]] = None, - *, - pixel_mean, - pixel_std, - ): - """ - Args: - backbone: backbone module - head: head module - head_in_features: backbone features to use in head. Default to all backbone features. - pixel_mean (Tuple[float]): - Values to be used for image normalization (BGR order). - To train on images of different number of channels, set different mean & std. - Default values are the mean pixel value from ImageNet: [103.53, 116.28, 123.675] - pixel_std (Tuple[float]): - When using pre-trained models in Detectron1 or any MSRA models, - std has been absorbed into its conv1 weights, so the std needs to be set 1. - Otherwise, you can use [57.375, 57.120, 58.395] (ImageNet std) - """ - super().__init__() - - self.backbone = backbone - self.head = head - if head_in_features is None: - shapes = self.backbone.output_shape() - self.head_in_features = sorted(shapes.keys(), key=lambda x: shapes[x].stride) - else: - self.head_in_features = head_in_features - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - - @property - def device(self): - return self.pixel_mean.device - - def forward(self, batched_inputs: List[Dict[str, Tensor]]): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper` . - Each item in the list contains the inputs for one image. - For now, each item in the list is a dict that contains: - - * image: Tensor, image in (C, H, W) format. - * instances: Instances - - Other information that's included in the original dicts, such as: - - * "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - In training, dict[str, Tensor]: mapping from a named loss to a tensor storing the - loss. Used during training only. In inference, the standard output format, described - in :doc:`/tutorials/models`. - """ - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - features = [features[f] for f in self.head_in_features] - predictions = self.head(features) - - if self.training: - assert not torch.jit.is_scripting(), "Not supported" - assert "instances" in batched_inputs[0], "Instance annotations are missing in training!" - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - return self.forward_training(images, features, predictions, gt_instances) - else: - results = self.forward_inference(images, features, predictions) - if torch.jit.is_scripting(): - return results - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - results, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - return processed_results - - def forward_training(self, images, features, predictions, gt_instances): - raise NotImplementedError() - - def preprocess_image(self, batched_inputs: List[Dict[str, Tensor]]): - """ - Normalize, pad and batch the input images. 
-        """
-        images = [x["image"].to(self.device) for x in batched_inputs]
-        images = [(x - self.pixel_mean) / self.pixel_std for x in images]
-        images = ImageList.from_tensors(images, self.backbone.size_divisibility)
-        return images
-
-    def _transpose_dense_predictions(
-        self, predictions: List[List[Tensor]], dims_per_anchor: List[int]
-    ) -> List[List[Tensor]]:
-        """
-        Transpose the dense per-level predictions.
-
-        Args:
-            predictions: a list of outputs, each is a list of per-level
-                predictions with shape (N, Ai x K, Hi, Wi), where N is the
-                number of images, Ai is the number of anchors per location on
-                level i, K is the dimension of predictions per anchor.
-            dims_per_anchor: the value of K for each prediction, e.g. 4 for
-                box prediction, #classes for classification prediction.
-
-        Returns:
-            List[List[Tensor]]: each prediction is transposed to (N, Hi x Wi x Ai, K).
-        """
-        assert len(predictions) == len(dims_per_anchor)
-        res: List[List[Tensor]] = []
-        for pred, dim_per_anchor in zip(predictions, dims_per_anchor):
-            pred = [permute_to_N_HWA_K(x, dim_per_anchor) for x in pred]
-            res.append(pred)
-        return res
-
-    def _ema_update(self, name: str, value: float, initial_value: float, momentum: float = 0.9):
-        """
-        Apply EMA update to `self.name` using `value`.
-
-        This is mainly used for loss normalizer. In Detectron1, loss is normalized by number
-        of foreground samples in the batch. When batch size is 1 per GPU, #foreground has a
-        large variance and using it leads to lower performance. Therefore we maintain an EMA of
-        #foreground to stabilize the normalizer.
-
-        Args:
-            name: name of the normalizer
-            value: the new value to update
-            initial_value: the initial value to start with
-            momentum: momentum of EMA
-
-        Returns:
-            float: the updated EMA value
-        """
-        if hasattr(self, name):
-            old = getattr(self, name)
-        else:
-            old = initial_value
-        new = old * momentum + value * (1 - momentum)
-        setattr(self, name, new)
-        return new
-
-    def _decode_per_level_predictions(
-        self,
-        anchors: Boxes,
-        pred_scores: Tensor,
-        pred_deltas: Tensor,
-        score_thresh: float,
-        topk_candidates: int,
-        image_size: Tuple[int, int],
-    ) -> Instances:
-        """
-        Decode boxes and classification predictions of one feature level, by
-        the following steps:
-        1. filter the predictions based on score threshold and top K scores.
-        2. transform the box regression outputs
-        3. return the predicted scores, classes and boxes
-
-        Args:
-            anchors: Boxes, anchor for this feature level
-            pred_scores: HxWxA,K
-            pred_deltas: HxWxA,4
-
-        Returns:
-            Instances: with field "scores", "pred_boxes", "pred_classes".
-        """
-        # Apply two filtering steps to make NMS faster.
-        # 1. Keep boxes with confidence score higher than threshold
-        keep_idxs = pred_scores > score_thresh
-        pred_scores = pred_scores[keep_idxs]
-        topk_idxs = torch.nonzero(keep_idxs)  # Kx2
-
-        # 2. Keep only the top k scoring boxes
-        num_topk = min(topk_candidates, topk_idxs.size(0))
-        pred_scores, idxs = pred_scores.topk(num_topk)
-        topk_idxs = topk_idxs[idxs]
-
-        anchor_idxs, classes_idxs = topk_idxs.unbind(dim=1)
-
-        pred_boxes = self.box2box_transform.apply_deltas(
-            pred_deltas[anchor_idxs], anchors.tensor[anchor_idxs]
-        )
-        return Instances(
-            image_size, pred_boxes=Boxes(pred_boxes), scores=pred_scores, pred_classes=classes_idxs
-        )
-
-    def _decode_multi_level_predictions(
-        self,
-        anchors: List[Boxes],
-        pred_scores: List[Tensor],
-        pred_deltas: List[Tensor],
-        score_thresh: float,
-        topk_candidates: int,
-        image_size: Tuple[int, int],
-    ) -> Instances:
-        """
-        Run `_decode_per_level_predictions` for all feature levels and concat the results.
-        """
-        predictions = [
-            self._decode_per_level_predictions(
-                anchors_i,
-                box_cls_i,
-                box_reg_i,
-                self.test_score_thresh,
-                self.test_topk_candidates,
-                image_size,
-            )
-            # Iterate over every feature level
-            for box_cls_i, box_reg_i, anchors_i in zip(pred_scores, pred_deltas, anchors)
-        ]
-        return predictions[0].cat(predictions)  # 'Instances.cat' is not scriptable but this is
-
-    def visualize_training(self, batched_inputs, results):
-        """
-        A function used to visualize ground truth images and final network predictions.
-        It shows ground truth bounding boxes on the original image and up to 20
-        predicted object bounding boxes on the original image.
-
-        Args:
-            batched_inputs (list): a list that contains input to the model.
-            results (List[Instances]): a list of #images elements returned by forward_inference().
-        """
-        from detectron2.utils.visualizer import Visualizer
-
-        assert len(batched_inputs) == len(
-            results
-        ), "Cannot visualize inputs and results of different sizes"
-        storage = get_event_storage()
-        max_boxes = 20
-
-        image_index = 0  # only visualize a single image
-        img = batched_inputs[image_index]["image"]
-        img = convert_image_to_rgb(img.permute(1, 2, 0), self.input_format)
-        v_gt = Visualizer(img, None)
-        v_gt = v_gt.overlay_instances(boxes=batched_inputs[image_index]["instances"].gt_boxes)
-        anno_img = v_gt.get_image()
-        processed_results = detector_postprocess(results[image_index], img.shape[0], img.shape[1])
-        predicted_boxes = processed_results.pred_boxes.tensor.detach().cpu().numpy()
-
-        v_pred = Visualizer(img, None)
-        v_pred = v_pred.overlay_instances(boxes=predicted_boxes[0:max_boxes])
-        prop_img = v_pred.get_image()
-        vis_img = np.vstack((anno_img, prop_img))
-        vis_img = vis_img.transpose(2, 0, 1)
-        vis_name = f"Top: GT bounding boxes; Bottom: {max_boxes} Highest Scoring Results"
-        storage.put_image(vis_name, vis_img)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py
deleted file mode 100755
index 55cdb76e..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/fcos.py
+++ /dev/null
@@ -1,303 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
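[Editor's note] The two-step candidate filtering deleted above (`DenseDetector._decode_per_level_predictions`) is the part most worth keeping in mind when porting this code: threshold first, then top-k, so NMS downstream only ever sees a bounded number of boxes per level. Below is a minimal, self-contained PyTorch sketch of that logic; the function name `filter_candidates` and the toy sizes are ours, not detectron2's API.

```python
import torch

def filter_candidates(pred_scores: torch.Tensor, score_thresh: float, topk: int):
    # 1. Keep only predictions whose confidence exceeds the threshold.
    keep_idxs = pred_scores > score_thresh
    scores = pred_scores[keep_idxs]       # flattened surviving scores
    topk_idxs = torch.nonzero(keep_idxs)  # (num_kept, 2): (anchor, class) indices
    # 2. Of the survivors, keep only the top-k scoring ones.
    num_topk = min(topk, topk_idxs.size(0))
    scores, idxs = scores.topk(num_topk)
    topk_idxs = topk_idxs[idxs]
    anchor_idxs, class_idxs = topk_idxs.unbind(dim=1)
    return scores, anchor_idxs, class_idxs

# Toy usage: scores for H*W*A = 100 anchors over K = 80 classes (assumed sizes).
scores, anchor_idxs, class_idxs = filter_candidates(torch.rand(100, 80), 0.05, 10)
```

Thresholding before `topk` matters: it keeps the `topk` call cheap and avoids selecting sub-threshold boxes when fewer than k candidates survive.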
- -import logging -from typing import List, Optional, Tuple -import torch -from fvcore.nn import sigmoid_focal_loss_jit -from torch import Tensor, nn -from torch.nn import functional as F - -from detectron2.layers import ShapeSpec, batched_nms -from detectron2.structures import Boxes, ImageList, Instances, pairwise_point_box_distance -from detectron2.utils.events import get_event_storage - -from ..anchor_generator import DefaultAnchorGenerator -from ..backbone import Backbone -from ..box_regression import Box2BoxTransformLinear, _dense_box_regression_loss -from .dense_detector import DenseDetector -from .retinanet import RetinaNetHead - -__all__ = ["FCOS"] - - -logger = logging.getLogger(__name__) - - -class FCOS(DenseDetector): - """ - Implement FCOS in :paper:`fcos`. - """ - - def __init__( - self, - *, - backbone: Backbone, - head: nn.Module, - head_in_features: Optional[List[str]] = None, - box2box_transform=None, - num_classes, - center_sampling_radius: float = 1.5, - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - test_score_thresh=0.2, - test_topk_candidates=1000, - test_nms_thresh=0.6, - max_detections_per_image=100, - pixel_mean, - pixel_std, - ): - """ - Args: - center_sampling_radius: radius of the "center" of a groundtruth box, - within which all anchor points are labeled positive. - Other arguments mean the same as in :class:`RetinaNet`. - """ - super().__init__( - backbone, head, head_in_features, pixel_mean=pixel_mean, pixel_std=pixel_std - ) - - self.num_classes = num_classes - - # FCOS uses one anchor point per location. - # We represent the anchor point by a box whose size equals the anchor stride. - feature_shapes = backbone.output_shape() - fpn_strides = [feature_shapes[k].stride for k in self.head_in_features] - self.anchor_generator = DefaultAnchorGenerator( - sizes=[[k] for k in fpn_strides], aspect_ratios=[1.0], strides=fpn_strides - ) - - # FCOS parameterizes box regression by a linear transform, - # where predictions are normalized by anchor stride (equal to anchor size). - if box2box_transform is None: - box2box_transform = Box2BoxTransformLinear(normalize_by_size=True) - self.box2box_transform = box2box_transform - - self.center_sampling_radius = float(center_sampling_radius) - - # Loss parameters: - self.focal_loss_alpha = focal_loss_alpha - self.focal_loss_gamma = focal_loss_gamma - - # Inference parameters: - self.test_score_thresh = test_score_thresh - self.test_topk_candidates = test_topk_candidates - self.test_nms_thresh = test_nms_thresh - self.max_detections_per_image = max_detections_per_image - - def forward_training(self, images, features, predictions, gt_instances): - # Transpose the Hi*Wi*A dimension to the middle: - pred_logits, pred_anchor_deltas, pred_centerness = self._transpose_dense_predictions( - predictions, [self.num_classes, 4, 1] - ) - anchors = self.anchor_generator(features) - gt_labels, gt_boxes = self.label_anchors(anchors, gt_instances) - return self.losses( - anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes, pred_centerness - ) - - @torch.no_grad() - def match_anchors(self, anchors: List[Boxes], gt_instances: List[Instances]): - """ - Match anchors with ground truth boxes. - - Args: - anchors: #level boxes, from the highest resolution to lower resolution - gt_instances: ground truth instances per image - - Returns: - List[Tensor]: - #image tensors, each is a vector of matched gt - indices (or -1 for unmatched anchors) for all anchors. 
- """ - num_anchors_per_level = [len(x) for x in anchors] - anchors = Boxes.cat(anchors) # Rx4 - anchor_centers = anchors.get_centers() # Rx2 - anchor_sizes = anchors.tensor[:, 2] - anchors.tensor[:, 0] # R - - lower_bound = anchor_sizes * 4 - lower_bound[: num_anchors_per_level[0]] = 0 - upper_bound = anchor_sizes * 8 - upper_bound[-num_anchors_per_level[-1] :] = float("inf") - - matched_indices = [] - for gt_per_image in gt_instances: - gt_centers = gt_per_image.gt_boxes.get_centers() # Nx2 - # FCOS with center sampling: anchor point must be close enough to gt center. - pairwise_match = (anchor_centers[:, None, :] - gt_centers[None, :, :]).abs_().max( - dim=2 - ).values < self.center_sampling_radius * anchor_sizes[:, None] - pairwise_dist = pairwise_point_box_distance(anchor_centers, gt_per_image.gt_boxes) - - # The original FCOS anchor matching rule: anchor point must be inside gt - pairwise_match &= pairwise_dist.min(dim=2).values > 0 - - # Multilevel anchor matching in FCOS: each anchor is only responsible - # for certain scale range. - pairwise_dist = pairwise_dist.max(dim=2).values - pairwise_match &= (pairwise_dist > lower_bound[:, None]) & ( - pairwise_dist < upper_bound[:, None] - ) - - # Match the GT box with minimum area, if there are multiple GT matches - gt_areas = gt_per_image.gt_boxes.area() # N - pairwise_match = pairwise_match.to(torch.float32) * (1e8 - gt_areas[None, :]) - min_values, matched_idx = pairwise_match.max(dim=1) # R, per-anchor match - matched_idx[min_values < 1e-5] = -1 # Unmatched anchors are assigned -1 - - matched_indices.append(matched_idx) - return matched_indices - - @torch.no_grad() - def label_anchors(self, anchors, gt_instances): - """ - Same interface as :meth:`RetinaNet.label_anchors`, but implemented with FCOS - anchor matching rule. - - Unlike RetinaNet, there are no ignored anchors. - """ - matched_indices = self.match_anchors(anchors, gt_instances) - - matched_labels, matched_boxes = [], [] - for gt_index, gt_per_image in zip(matched_indices, gt_instances): - label = gt_per_image.gt_classes[gt_index.clip(min=0)] - label[gt_index < 0] = self.num_classes # background - - matched_gt_boxes = gt_per_image.gt_boxes[gt_index.clip(min=0)] - - matched_labels.append(label) - matched_boxes.append(matched_gt_boxes) - return matched_labels, matched_boxes - - def losses( - self, anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes, pred_centerness - ): - """ - This method is almost identical to :meth:`RetinaNet.losses`, with an extra - "loss_centerness" in the returned dict. 
-        """
-        num_images = len(gt_labels)
-        gt_labels = torch.stack(gt_labels)  # (N, R)
-
-        pos_mask = (gt_labels >= 0) & (gt_labels != self.num_classes)
-        num_pos_anchors = pos_mask.sum().item()
-        get_event_storage().put_scalar("num_pos_anchors", num_pos_anchors / num_images)
-        normalizer = self._ema_update("loss_normalizer", max(num_pos_anchors, 1), 300)
-
-        # classification and regression loss
-        gt_labels_target = F.one_hot(gt_labels, num_classes=self.num_classes + 1)[
-            :, :, :-1
-        ]  # no loss for the last (background) class
-        loss_cls = sigmoid_focal_loss_jit(
-            torch.cat(pred_logits, dim=1),
-            gt_labels_target.to(pred_logits[0].dtype),
-            alpha=self.focal_loss_alpha,
-            gamma=self.focal_loss_gamma,
-            reduction="sum",
-        )
-
-        loss_box_reg = _dense_box_regression_loss(
-            anchors,
-            self.box2box_transform,
-            pred_anchor_deltas,
-            [x.tensor for x in gt_boxes],
-            pos_mask,
-            box_reg_loss_type="giou",
-        )
-
-        ctrness_targets = self.compute_ctrness_targets(anchors, gt_boxes)  # NxR
-        pred_centerness = torch.cat(pred_centerness, dim=1).squeeze(dim=2)  # NxR
-        ctrness_loss = F.binary_cross_entropy_with_logits(
-            pred_centerness[pos_mask], ctrness_targets[pos_mask], reduction="sum"
-        )
-        return {
-            "loss_fcos_cls": loss_cls / normalizer,
-            "loss_fcos_loc": loss_box_reg / normalizer,
-            "loss_fcos_ctr": ctrness_loss / normalizer,
-        }
-
-    def compute_ctrness_targets(self, anchors, gt_boxes):  # NxR
-        anchors = Boxes.cat(anchors).tensor  # Rx4
-        reg_targets = [self.box2box_transform.get_deltas(anchors, m.tensor) for m in gt_boxes]
-        reg_targets = torch.stack(reg_targets, dim=0)  # NxRx4
-        if len(reg_targets) == 0:
-            return reg_targets.new_zeros(len(reg_targets))
-        left_right = reg_targets[:, :, [0, 2]]
-        top_bottom = reg_targets[:, :, [1, 3]]
-        ctrness = (left_right.min(dim=-1)[0] / left_right.max(dim=-1)[0]) * (
-            top_bottom.min(dim=-1)[0] / top_bottom.max(dim=-1)[0]
-        )
-        return torch.sqrt(ctrness)
-
-    def forward_inference(
-        self, images: ImageList, features: List[Tensor], predictions: List[List[Tensor]]
-    ):
-        pred_logits, pred_anchor_deltas, pred_centerness = self._transpose_dense_predictions(
-            predictions, [self.num_classes, 4, 1]
-        )
-        anchors = self.anchor_generator(features)
-
-        results: List[Instances] = []
-        for img_idx, image_size in enumerate(images.image_sizes):
-            scores_per_image = [
-                # Multiply and sqrt centerness & classification scores
-                # (See eqn. 4 in https://arxiv.org/abs/2006.09214)
-                torch.sqrt(x[img_idx].sigmoid_() * y[img_idx].sigmoid_())
-                for x, y in zip(pred_logits, pred_centerness)
-            ]
-            deltas_per_image = [x[img_idx] for x in pred_anchor_deltas]
-            results_per_image = self.inference_single_image(
-                anchors, scores_per_image, deltas_per_image, image_size
-            )
-            results.append(results_per_image)
-        return results
-
-    def inference_single_image(
-        self,
-        anchors: List[Boxes],
-        box_cls: List[Tensor],
-        box_delta: List[Tensor],
-        image_size: Tuple[int, int],
-    ):
-        """
-        Identical to :meth:`RetinaNet.inference_single_image`.
-        """
-        pred = self._decode_multi_level_predictions(
-            anchors,
-            box_cls,
-            box_delta,
-            self.test_score_thresh,
-            self.test_topk_candidates,
-            image_size,
-        )
-        keep = batched_nms(
-            pred.pred_boxes.tensor, pred.scores, pred.pred_classes, self.test_nms_thresh
-        )
-        return pred[keep[: self.max_detections_per_image]]
-
-
-class FCOSHead(RetinaNetHead):
-    """
-    The head used in :paper:`fcos`. It adds an additional centerness
-    prediction branch on top of :class:`RetinaNetHead`.
-    """
-
-    def __init__(self, *, input_shape: List[ShapeSpec], conv_dims: List[int], **kwargs):
-        super().__init__(input_shape=input_shape, conv_dims=conv_dims, num_anchors=1, **kwargs)
-        # Unlike original FCOS, we do not add an additional learnable scale layer
-        # because it's found to have no benefits after normalizing regression targets by stride.
-        self._num_features = len(input_shape)
-        self.ctrness = nn.Conv2d(conv_dims[-1], 1, kernel_size=3, stride=1, padding=1)
-        torch.nn.init.normal_(self.ctrness.weight, std=0.01)
-        torch.nn.init.constant_(self.ctrness.bias, 0)
-
-    def forward(self, features):
-        assert len(features) == self._num_features
-        logits = []
-        bbox_reg = []
-        ctrness = []
-        for feature in features:
-            logits.append(self.cls_score(self.cls_subnet(feature)))
-            bbox_feature = self.bbox_subnet(feature)
-            bbox_reg.append(self.bbox_pred(bbox_feature))
-            ctrness.append(self.ctrness(bbox_feature))
-        return logits, bbox_reg, ctrness
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py
deleted file mode 100755
index 13aeabce..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/panoptic_fpn.py
+++ /dev/null
@@ -1,266 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import logging
-from typing import Dict, List
-import torch
-from torch import nn
-
-from detectron2.config import configurable
-from detectron2.structures import ImageList
-
-from ..postprocessing import detector_postprocess, sem_seg_postprocess
-from .build import META_ARCH_REGISTRY
-from .rcnn import GeneralizedRCNN
-from .semantic_seg import build_sem_seg_head
-
-__all__ = ["PanopticFPN"]
-
-
-@META_ARCH_REGISTRY.register()
-class PanopticFPN(GeneralizedRCNN):
-    """
-    Implement the paper :paper:`PanopticFPN`.
-    """
-
-    @configurable
-    def __init__(
-        self,
-        *,
-        sem_seg_head: nn.Module,
-        combine_overlap_thresh: float = 0.5,
-        combine_stuff_area_thresh: float = 4096,
-        combine_instances_score_thresh: float = 0.5,
-        **kwargs,
-    ):
-        """
-        NOTE: this interface is experimental.
-
-        Args:
-            sem_seg_head: a module for the semantic segmentation head.
-            combine_overlap_thresh: combine masks into one instance if
-                they have enough overlap
-            combine_stuff_area_thresh: ignore stuff areas smaller than this threshold
-            combine_instances_score_thresh: ignore instances whose score is
-                smaller than this threshold
-
-        Other arguments are the same as :class:`GeneralizedRCNN`.
-        """
-        super().__init__(**kwargs)
-        self.sem_seg_head = sem_seg_head
-        # options when combining instance & semantic outputs
-        self.combine_overlap_thresh = combine_overlap_thresh
-        self.combine_stuff_area_thresh = combine_stuff_area_thresh
-        self.combine_instances_score_thresh = combine_instances_score_thresh
-
-    @classmethod
-    def from_config(cls, cfg):
-        ret = super().from_config(cfg)
-        ret.update(
-            {
-                "combine_overlap_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.OVERLAP_THRESH,
-                "combine_stuff_area_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.STUFF_AREA_LIMIT,
-                "combine_instances_score_thresh": cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH,  # noqa
-            }
-        )
-        ret["sem_seg_head"] = build_sem_seg_head(cfg, ret["backbone"].output_shape())
-        logger = logging.getLogger(__name__)
-        if not cfg.MODEL.PANOPTIC_FPN.COMBINE.ENABLED:
-            logger.warning(
-                "PANOPTIC_FPN.COMBINE.ENABLED is no longer used. 
" - " model.inference(do_postprocess=) should be used to toggle postprocessing." - ) - if cfg.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT != 1.0: - w = cfg.MODEL.PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT - logger.warning( - "PANOPTIC_FPN.INSTANCE_LOSS_WEIGHT should be replaced by weights on each ROI head." - ) - - def update_weight(x): - if isinstance(x, dict): - return {k: v * w for k, v in x.items()} - else: - return x * w - - roi_heads = ret["roi_heads"] - roi_heads.box_predictor.loss_weight = update_weight(roi_heads.box_predictor.loss_weight) - roi_heads.mask_head.loss_weight = update_weight(roi_heads.mask_head.loss_weight) - return ret - - def forward(self, batched_inputs): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper`. - Each item in the list contains the inputs for one image. - - For now, each item in the list is a dict that contains: - - * "image": Tensor, image in (C, H, W) format. - * "instances": Instances - * "sem_seg": semantic segmentation ground truth. - * Other information that's included in the original dicts, such as: - "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - list[dict]: - each dict has the results for one image. The dict contains the following keys: - - * "instances": see :meth:`GeneralizedRCNN.forward` for its format. - * "sem_seg": see :meth:`SemanticSegmentor.forward` for its format. - * "panoptic_seg": See the return value of - :func:`combine_semantic_and_instance_outputs` for its format. - """ - if not self.training: - return self.inference(batched_inputs) - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - - assert "sem_seg" in batched_inputs[0] - gt_sem_seg = [x["sem_seg"].to(self.device) for x in batched_inputs] - gt_sem_seg = ImageList.from_tensors( - gt_sem_seg, self.backbone.size_divisibility, self.sem_seg_head.ignore_value - ).tensor - sem_seg_results, sem_seg_losses = self.sem_seg_head(features, gt_sem_seg) - - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - detector_results, detector_losses = self.roi_heads( - images, features, proposals, gt_instances - ) - - losses = sem_seg_losses - losses.update(proposal_losses) - losses.update(detector_losses) - return losses - - def inference(self, batched_inputs: List[Dict[str, torch.Tensor]], do_postprocess: bool = True): - """ - Run inference on the given inputs. - - Args: - batched_inputs (list[dict]): same as in :meth:`forward` - do_postprocess (bool): whether to apply post-processing on the outputs. - - Returns: - When do_postprocess=True, see docs in :meth:`forward`. - Otherwise, returns a (list[Instances], list[Tensor]) that contains - the raw detector outputs, and raw semantic segmentation outputs. 
- """ - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - sem_seg_results, sem_seg_losses = self.sem_seg_head(features, None) - proposals, _ = self.proposal_generator(images, features, None) - detector_results, _ = self.roi_heads(images, features, proposals, None) - - if do_postprocess: - processed_results = [] - for sem_seg_result, detector_result, input_per_image, image_size in zip( - sem_seg_results, detector_results, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - sem_seg_r = sem_seg_postprocess(sem_seg_result, image_size, height, width) - detector_r = detector_postprocess(detector_result, height, width) - - processed_results.append({"sem_seg": sem_seg_r, "instances": detector_r}) - - panoptic_r = combine_semantic_and_instance_outputs( - detector_r, - sem_seg_r.argmax(dim=0), - self.combine_overlap_thresh, - self.combine_stuff_area_thresh, - self.combine_instances_score_thresh, - ) - processed_results[-1]["panoptic_seg"] = panoptic_r - return processed_results - else: - return detector_results, sem_seg_results - - -def combine_semantic_and_instance_outputs( - instance_results, - semantic_results, - overlap_threshold, - stuff_area_thresh, - instances_score_thresh, -): - """ - Implement a simple combining logic following - "combine_semantic_and_instance_predictions.py" in panopticapi - to produce panoptic segmentation outputs. - - Args: - instance_results: output of :func:`detector_postprocess`. - semantic_results: an (H, W) tensor, each element is the contiguous semantic - category id - - Returns: - panoptic_seg (Tensor): of shape (height, width) where the values are ids for each segment. - segments_info (list[dict]): Describe each segment in `panoptic_seg`. - Each dict contains keys "id", "category_id", "isthing". 
-    """
-    panoptic_seg = torch.zeros_like(semantic_results, dtype=torch.int32)
-
-    # sort instance outputs by scores
-    sorted_inds = torch.argsort(-instance_results.scores)
-
-    current_segment_id = 0
-    segments_info = []
-
-    instance_masks = instance_results.pred_masks.to(dtype=torch.bool, device=panoptic_seg.device)
-
-    # Add instances one-by-one, check for overlaps with existing ones
-    for inst_id in sorted_inds:
-        score = instance_results.scores[inst_id].item()
-        if score < instances_score_thresh:
-            break
-        mask = instance_masks[inst_id]  # H,W
-        mask_area = mask.sum().item()
-
-        if mask_area == 0:
-            continue
-
-        intersect = (mask > 0) & (panoptic_seg > 0)
-        intersect_area = intersect.sum().item()
-
-        if intersect_area * 1.0 / mask_area > overlap_threshold:
-            continue
-
-        if intersect_area > 0:
-            mask = mask & (panoptic_seg == 0)
-
-        current_segment_id += 1
-        panoptic_seg[mask] = current_segment_id
-        segments_info.append(
-            {
-                "id": current_segment_id,
-                "isthing": True,
-                "score": score,
-                "category_id": instance_results.pred_classes[inst_id].item(),
-                "instance_id": inst_id.item(),
-            }
-        )
-
-    # Add semantic results to remaining empty areas
-    semantic_labels = torch.unique(semantic_results).cpu().tolist()
-    for semantic_label in semantic_labels:
-        if semantic_label == 0:  # 0 is a special "thing" class
-            continue
-        mask = (semantic_results == semantic_label) & (panoptic_seg == 0)
-        mask_area = mask.sum().item()
-        if mask_area < stuff_area_thresh:
-            continue
-
-        current_segment_id += 1
-        panoptic_seg[mask] = current_segment_id
-        segments_info.append(
-            {
-                "id": current_segment_id,
-                "isthing": False,
-                "category_id": semantic_label,
-                "area": mask_area,
-            }
-        )
-
-    return panoptic_seg, segments_info
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py
deleted file mode 100755
index 7b45363e..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/rcnn.py
+++ /dev/null
@@ -1,327 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import logging
-import numpy as np
-from typing import Dict, List, Optional, Tuple
-import torch
-from torch import nn
-
-from detectron2.config import configurable
-from detectron2.data.detection_utils import convert_image_to_rgb
-from detectron2.structures import ImageList, Instances
-from detectron2.utils.events import get_event_storage
-from detectron2.utils.logger import log_first_n
-
-from ..backbone import Backbone, build_backbone
-from ..postprocessing import detector_postprocess
-from ..proposal_generator import build_proposal_generator
-from ..roi_heads import build_roi_heads
-from .build import META_ARCH_REGISTRY
-
-__all__ = ["GeneralizedRCNN", "ProposalNetwork"]
-
-
-@META_ARCH_REGISTRY.register()
-class GeneralizedRCNN(nn.Module):
-    """
-    Generalized R-CNN. Any model that contains the following three components:
-    1. Per-image feature extraction (aka backbone)
-    2. Region proposal generation
-    3. 
Per-region feature extraction and prediction - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - proposal_generator: nn.Module, - roi_heads: nn.Module, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - input_format: Optional[str] = None, - vis_period: int = 0, - ): - """ - Args: - backbone: a backbone module, must follow detectron2's backbone interface - proposal_generator: a module that generates proposals using backbone features - roi_heads: a ROI head that performs per-region computation - pixel_mean, pixel_std: list or tuple with #channels element, representing - the per-channel mean and std to be used to normalize the input image - input_format: describe the meaning of channels of input. Needed by visualization - vis_period: the period to run visualization. Set to 0 to disable. - """ - super().__init__() - self.backbone = backbone - self.proposal_generator = proposal_generator - self.roi_heads = roi_heads - - self.input_format = input_format - self.vis_period = vis_period - if vis_period > 0: - assert input_format is not None, "input_format is required for visualization!" - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - assert ( - self.pixel_mean.shape == self.pixel_std.shape - ), f"{self.pixel_mean} and {self.pixel_std} have different shapes!" - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - return { - "backbone": backbone, - "proposal_generator": build_proposal_generator(cfg, backbone.output_shape()), - "roi_heads": build_roi_heads(cfg, backbone.output_shape()), - "input_format": cfg.INPUT.FORMAT, - "vis_period": cfg.VIS_PERIOD, - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - } - - @property - def device(self): - return self.pixel_mean.device - - def visualize_training(self, batched_inputs, proposals): - """ - A function used to visualize images and proposals. It shows ground truth - bounding boxes on the original image and up to 20 top-scoring predicted - object proposals on the original image. Users can implement different - visualization functions for different models. - - Args: - batched_inputs (list): a list that contains input to the model. - proposals (list): a list that contains predicted proposals. Both - batched_inputs and proposals should have the same length. - """ - from detectron2.utils.visualizer import Visualizer - - storage = get_event_storage() - max_vis_prop = 20 - - for input, prop in zip(batched_inputs, proposals): - img = input["image"] - img = convert_image_to_rgb(img.permute(1, 2, 0), self.input_format) - v_gt = Visualizer(img, None) - v_gt = v_gt.overlay_instances(boxes=input["instances"].gt_boxes) - anno_img = v_gt.get_image() - box_size = min(len(prop.proposal_boxes), max_vis_prop) - v_pred = Visualizer(img, None) - v_pred = v_pred.overlay_instances( - boxes=prop.proposal_boxes[0:box_size].tensor.cpu().numpy() - ) - prop_img = v_pred.get_image() - vis_img = np.concatenate((anno_img, prop_img), axis=1) - vis_img = vis_img.transpose(2, 0, 1) - vis_name = "Left: GT bounding boxes; Right: Predicted proposals" - storage.put_image(vis_name, vis_img) - break # only visualize one image in a batch - - def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]): - """ - Args: - batched_inputs: a list, batched outputs of :class:`DatasetMapper` . - Each item in the list contains the inputs for one image. 
- For now, each item in the list is a dict that contains: - - * image: Tensor, image in (C, H, W) format. - * instances (optional): groundtruth :class:`Instances` - * proposals (optional): :class:`Instances`, precomputed proposals. - - Other information that's included in the original dicts, such as: - - * "height", "width" (int): the output resolution of the model, used in inference. - See :meth:`postprocess` for details. - - Returns: - list[dict]: - Each dict is the output for one input image. - The dict contains one key "instances" whose value is a :class:`Instances`. - The :class:`Instances` object has the following keys: - "pred_boxes", "pred_classes", "scores", "pred_masks", "pred_keypoints" - """ - if not self.training: - return self.inference(batched_inputs) - - images = self.preprocess_image(batched_inputs) - if "instances" in batched_inputs[0]: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - else: - gt_instances = None - - features = self.backbone(images.tensor) - - if self.proposal_generator is not None: - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - else: - assert "proposals" in batched_inputs[0] - proposals = [x["proposals"].to(self.device) for x in batched_inputs] - proposal_losses = {} - - _, detector_losses = self.roi_heads(images, features, proposals, gt_instances) - if self.vis_period > 0: - storage = get_event_storage() - if storage.iter % self.vis_period == 0: - self.visualize_training(batched_inputs, proposals) - - losses = {} - losses.update(detector_losses) - losses.update(proposal_losses) - return losses - - def inference( - self, - batched_inputs: List[Dict[str, torch.Tensor]], - detected_instances: Optional[List[Instances]] = None, - do_postprocess: bool = True, - ): - """ - Run inference on the given inputs. - - Args: - batched_inputs (list[dict]): same as in :meth:`forward` - detected_instances (None or list[Instances]): if not None, it - contains an `Instances` object per image. The `Instances` - object contains "pred_boxes" and "pred_classes" which are - known boxes in the image. - The inference will then skip the detection of bounding boxes, - and only predict other per-ROI outputs. - do_postprocess (bool): whether to apply post-processing on the outputs. - - Returns: - When do_postprocess=True, same as in :meth:`forward`. - Otherwise, a list[Instances] containing raw network outputs. - """ - assert not self.training - - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - - if detected_instances is None: - if self.proposal_generator is not None: - proposals, _ = self.proposal_generator(images, features, None) - else: - assert "proposals" in batched_inputs[0] - proposals = [x["proposals"].to(self.device) for x in batched_inputs] - - results, _ = self.roi_heads(images, features, proposals, None) - else: - detected_instances = [x.to(self.device) for x in detected_instances] - results = self.roi_heads.forward_with_given_boxes(features, detected_instances) - - if do_postprocess: - assert not torch.jit.is_scripting(), "Scripting is not supported for postprocess." - return GeneralizedRCNN._postprocess(results, batched_inputs, images.image_sizes) - else: - return results - - def preprocess_image(self, batched_inputs: List[Dict[str, torch.Tensor]]): - """ - Normalize, pad and batch the input images. 
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - return images - - @staticmethod - def _postprocess(instances, batched_inputs: List[Dict[str, torch.Tensor]], image_sizes): - """ - Rescale the output instances to the target size. - """ - # note: private function; subject to changes - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - instances, batched_inputs, image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - return processed_results - - -@META_ARCH_REGISTRY.register() -class ProposalNetwork(nn.Module): - """ - A meta architecture that only predicts object proposals. - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - proposal_generator: nn.Module, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - ): - """ - Args: - backbone: a backbone module, must follow detectron2's backbone interface - proposal_generator: a module that generates proposals using backbone features - pixel_mean, pixel_std: list or tuple with #channels element, representing - the per-channel mean and std to be used to normalize the input image - """ - super().__init__() - self.backbone = backbone - self.proposal_generator = proposal_generator - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - return { - "backbone": backbone, - "proposal_generator": build_proposal_generator(cfg, backbone.output_shape()), - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - } - - @property - def device(self): - return self.pixel_mean.device - - def forward(self, batched_inputs): - """ - Args: - Same as in :class:`GeneralizedRCNN.forward` - - Returns: - list[dict]: - Each dict is the output for one input image. - The dict contains one key "proposals" whose value is a - :class:`Instances` with keys "proposal_boxes" and "objectness_logits". - """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - features = self.backbone(images.tensor) - - if "instances" in batched_inputs[0]: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - elif "targets" in batched_inputs[0]: - log_first_n( - logging.WARN, "'targets' in the model inputs is now renamed to 'instances'!", n=10 - ) - gt_instances = [x["targets"].to(self.device) for x in batched_inputs] - else: - gt_instances = None - proposals, proposal_losses = self.proposal_generator(images, features, gt_instances) - # In training, the proposals are not useful at all but we generate them anyway. - # This makes RPN-only models about 5% slower. 
- if self.training: - return proposal_losses - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - proposals, batched_inputs, images.image_sizes - ): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"proposals": r}) - return processed_results diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py deleted file mode 100755 index 3ea88f61..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/retinanet.py +++ /dev/null @@ -1,439 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from typing import List, Tuple -import torch -from fvcore.nn import sigmoid_focal_loss_jit -from torch import Tensor, nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import CycleBatchNormList, ShapeSpec, batched_nms, cat, get_norm -from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from ..anchor_generator import build_anchor_generator -from ..backbone import Backbone, build_backbone -from ..box_regression import Box2BoxTransform, _dense_box_regression_loss -from ..matcher import Matcher -from .build import META_ARCH_REGISTRY -from .dense_detector import DenseDetector, permute_to_N_HWA_K # noqa - -__all__ = ["RetinaNet"] - - -logger = logging.getLogger(__name__) - - -@META_ARCH_REGISTRY.register() -class RetinaNet(DenseDetector): - """ - Implement RetinaNet in :paper:`RetinaNet`. - """ - - @configurable - def __init__( - self, - *, - backbone: Backbone, - head: nn.Module, - head_in_features, - anchor_generator, - box2box_transform, - anchor_matcher, - num_classes, - focal_loss_alpha=0.25, - focal_loss_gamma=2.0, - smooth_l1_beta=0.0, - box_reg_loss_type="smooth_l1", - test_score_thresh=0.05, - test_topk_candidates=1000, - test_nms_thresh=0.5, - max_detections_per_image=100, - pixel_mean, - pixel_std, - vis_period=0, - input_format="BGR", - ): - """ - NOTE: this interface is experimental. - - Args: - backbone: a backbone module, must follow detectron2's backbone interface - head (nn.Module): a module that predicts logits and regression deltas - for each level from a list of per-level features - head_in_features (Tuple[str]): Names of the input feature maps to be used in head - anchor_generator (nn.Module): a module that creates anchors from a - list of features. Usually an instance of :class:`AnchorGenerator` - box2box_transform (Box2BoxTransform): defines the transform from anchors boxes to - instance boxes - anchor_matcher (Matcher): label the anchors by matching them with ground truth. - num_classes (int): number of classes. Used to label background proposals. 
- - # Loss parameters: - focal_loss_alpha (float): focal_loss_alpha - focal_loss_gamma (float): focal_loss_gamma - smooth_l1_beta (float): smooth_l1_beta - box_reg_loss_type (str): Options are "smooth_l1", "giou", "diou", "ciou" - - # Inference parameters: - test_score_thresh (float): Inference cls score threshold, only anchors with - score > INFERENCE_TH are considered for inference (to improve speed) - test_topk_candidates (int): Select topk candidates before NMS - test_nms_thresh (float): Overlap threshold used for non-maximum suppression - (suppress boxes with IoU >= this threshold) - max_detections_per_image (int): - Maximum number of detections to return per image during inference - (100 is based on the limit established for the COCO dataset). - - pixel_mean, pixel_std: see :class:`DenseDetector`. - """ - super().__init__( - backbone, head, head_in_features, pixel_mean=pixel_mean, pixel_std=pixel_std - ) - self.num_classes = num_classes - - # Anchors - self.anchor_generator = anchor_generator - self.box2box_transform = box2box_transform - self.anchor_matcher = anchor_matcher - - # Loss parameters: - self.focal_loss_alpha = focal_loss_alpha - self.focal_loss_gamma = focal_loss_gamma - self.smooth_l1_beta = smooth_l1_beta - self.box_reg_loss_type = box_reg_loss_type - # Inference parameters: - self.test_score_thresh = test_score_thresh - self.test_topk_candidates = test_topk_candidates - self.test_nms_thresh = test_nms_thresh - self.max_detections_per_image = max_detections_per_image - # Vis parameters - self.vis_period = vis_period - self.input_format = input_format - - @classmethod - def from_config(cls, cfg): - backbone = build_backbone(cfg) - backbone_shape = backbone.output_shape() - feature_shapes = [backbone_shape[f] for f in cfg.MODEL.RETINANET.IN_FEATURES] - head = RetinaNetHead(cfg, feature_shapes) - anchor_generator = build_anchor_generator(cfg, feature_shapes) - return { - "backbone": backbone, - "head": head, - "anchor_generator": anchor_generator, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.RETINANET.BBOX_REG_WEIGHTS), - "anchor_matcher": Matcher( - cfg.MODEL.RETINANET.IOU_THRESHOLDS, - cfg.MODEL.RETINANET.IOU_LABELS, - allow_low_quality_matches=True, - ), - "pixel_mean": cfg.MODEL.PIXEL_MEAN, - "pixel_std": cfg.MODEL.PIXEL_STD, - "num_classes": cfg.MODEL.RETINANET.NUM_CLASSES, - "head_in_features": cfg.MODEL.RETINANET.IN_FEATURES, - # Loss parameters: - "focal_loss_alpha": cfg.MODEL.RETINANET.FOCAL_LOSS_ALPHA, - "focal_loss_gamma": cfg.MODEL.RETINANET.FOCAL_LOSS_GAMMA, - "smooth_l1_beta": cfg.MODEL.RETINANET.SMOOTH_L1_LOSS_BETA, - "box_reg_loss_type": cfg.MODEL.RETINANET.BBOX_REG_LOSS_TYPE, - # Inference parameters: - "test_score_thresh": cfg.MODEL.RETINANET.SCORE_THRESH_TEST, - "test_topk_candidates": cfg.MODEL.RETINANET.TOPK_CANDIDATES_TEST, - "test_nms_thresh": cfg.MODEL.RETINANET.NMS_THRESH_TEST, - "max_detections_per_image": cfg.TEST.DETECTIONS_PER_IMAGE, - # Vis parameters - "vis_period": cfg.VIS_PERIOD, - "input_format": cfg.INPUT.FORMAT, - } - - def forward_training(self, images, features, predictions, gt_instances): - # Transpose the Hi*Wi*A dimension to the middle: - pred_logits, pred_anchor_deltas = self._transpose_dense_predictions( - predictions, [self.num_classes, 4] - ) - anchors = self.anchor_generator(features) - gt_labels, gt_boxes = self.label_anchors(anchors, gt_instances) - return self.losses(anchors, pred_logits, gt_labels, pred_anchor_deltas, gt_boxes) - - def losses(self, anchors, pred_logits, gt_labels, pred_anchor_deltas, 
gt_boxes): - """ - Args: - anchors (list[Boxes]): a list of #feature level Boxes - gt_labels, gt_boxes: see output of :meth:`RetinaNet.label_anchors`. - Their shapes are (N, R) and (N, R, 4), respectively, where R is - the total number of anchors across levels, i.e. sum(Hi x Wi x Ai) - pred_logits, pred_anchor_deltas: both are list[Tensor]. Each element in the - list corresponds to one level and has shape (N, Hi * Wi * Ai, K or 4). - Where K is the number of classes used in `pred_logits`. - - Returns: - dict[str, Tensor]: - mapping from a named loss to a scalar tensor storing the loss. - Used during training only. The dict keys are: "loss_cls" and "loss_box_reg" - """ - num_images = len(gt_labels) - gt_labels = torch.stack(gt_labels) # (N, R) - - valid_mask = gt_labels >= 0 - pos_mask = (gt_labels >= 0) & (gt_labels != self.num_classes) - num_pos_anchors = pos_mask.sum().item() - get_event_storage().put_scalar("num_pos_anchors", num_pos_anchors / num_images) - normalizer = self._ema_update("loss_normalizer", max(num_pos_anchors, 1), 100) - - # classification and regression loss - gt_labels_target = F.one_hot(gt_labels[valid_mask], num_classes=self.num_classes + 1)[ - :, :-1 - ] # no loss for the last (background) class - loss_cls = sigmoid_focal_loss_jit( - cat(pred_logits, dim=1)[valid_mask], - gt_labels_target.to(pred_logits[0].dtype), - alpha=self.focal_loss_alpha, - gamma=self.focal_loss_gamma, - reduction="sum", - ) - - loss_box_reg = _dense_box_regression_loss( - anchors, - self.box2box_transform, - pred_anchor_deltas, - gt_boxes, - pos_mask, - box_reg_loss_type=self.box_reg_loss_type, - smooth_l1_beta=self.smooth_l1_beta, - ) - - return { - "loss_cls": loss_cls / normalizer, - "loss_box_reg": loss_box_reg / normalizer, - } - - @torch.no_grad() - def label_anchors(self, anchors, gt_instances): - """ - Args: - anchors (list[Boxes]): A list of #feature level Boxes. - The Boxes contains anchors of this image on the specific feature level. - gt_instances (list[Instances]): a list of N `Instances`s. The i-th - `Instances` contains the ground-truth per-instance annotations - for the i-th input image. - - Returns: - list[Tensor]: List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across all feature maps (sum(Hi * Wi * A)). - Label values are in {-1, 0, ..., K}, with -1 means ignore, and K means background. - - list[Tensor]: i-th element is a Rx4 tensor, where R is the total number of anchors - across feature maps. The values are the matched gt boxes for each anchor. - Values are undefined for those anchors not labeled as foreground. - """ - anchors = Boxes.cat(anchors) # Rx4 - - gt_labels = [] - matched_gt_boxes = [] - for gt_per_image in gt_instances: - match_quality_matrix = pairwise_iou(gt_per_image.gt_boxes, anchors) - matched_idxs, anchor_labels = self.anchor_matcher(match_quality_matrix) - del match_quality_matrix - - if len(gt_per_image) > 0: - matched_gt_boxes_i = gt_per_image.gt_boxes.tensor[matched_idxs] - - gt_labels_i = gt_per_image.gt_classes[matched_idxs] - # Anchors with label 0 are treated as background. - gt_labels_i[anchor_labels == 0] = self.num_classes - # Anchors with label -1 are ignored. 
- gt_labels_i[anchor_labels == -1] = -1 - else: - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - gt_labels_i = torch.zeros_like(matched_idxs) + self.num_classes - - gt_labels.append(gt_labels_i) - matched_gt_boxes.append(matched_gt_boxes_i) - - return gt_labels, matched_gt_boxes - - def forward_inference( - self, images: ImageList, features: List[Tensor], predictions: List[List[Tensor]] - ): - pred_logits, pred_anchor_deltas = self._transpose_dense_predictions( - predictions, [self.num_classes, 4] - ) - anchors = self.anchor_generator(features) - - results: List[Instances] = [] - for img_idx, image_size in enumerate(images.image_sizes): - scores_per_image = [x[img_idx].sigmoid_() for x in pred_logits] - deltas_per_image = [x[img_idx] for x in pred_anchor_deltas] - results_per_image = self.inference_single_image( - anchors, scores_per_image, deltas_per_image, image_size - ) - results.append(results_per_image) - return results - - def inference_single_image( - self, - anchors: List[Boxes], - box_cls: List[Tensor], - box_delta: List[Tensor], - image_size: Tuple[int, int], - ): - """ - Single-image inference. Return bounding-box detection results by thresholding - on scores and applying non-maximum suppression (NMS). - - Arguments: - anchors (list[Boxes]): list of #feature levels. Each entry contains - a Boxes object, which contains all the anchors in that feature level. - box_cls (list[Tensor]): list of #feature levels. Each entry contains - tensor of size (H x W x A, K) - box_delta (list[Tensor]): Same shape as 'box_cls' except that K becomes 4. - image_size (tuple(H, W)): a tuple of the image height and width. - - Returns: - Same as `inference`, but for only one image. - """ - pred = self._decode_multi_level_predictions( - anchors, - box_cls, - box_delta, - self.test_score_thresh, - self.test_topk_candidates, - image_size, - ) - keep = batched_nms( # per-class NMS - pred.pred_boxes.tensor, pred.scores, pred.pred_classes, self.test_nms_thresh - ) - return pred[keep[: self.max_detections_per_image]] - - -class RetinaNetHead(nn.Module): - """ - The head used in RetinaNet for object classification and box regression. - It has two subnets for the two tasks, with a common structure but separate parameters. - """ - - @configurable - def __init__( - self, - *, - input_shape: List[ShapeSpec], - num_classes, - num_anchors, - conv_dims: List[int], - norm="", - prior_prob=0.01, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (List[ShapeSpec]): input shape - num_classes (int): number of classes. Used to label background proposals. - num_anchors (int): number of generated anchors - conv_dims (List[int]): dimensions for each convolution layer - norm (str or callable): - Normalization for conv layers except for the two output layers. - See :func:`detectron2.layers.get_norm` for supported types. - prior_prob (float): Prior weight for computing bias - """ - super().__init__() - - self._num_features = len(input_shape) - if norm == "BN" or norm == "SyncBN": - logger.info( - f"Using domain-specific {norm} in RetinaNetHead with len={self._num_features}." - ) - bn_class = nn.BatchNorm2d if norm == "BN" else nn.SyncBatchNorm - - def norm(c): - return CycleBatchNormList( - length=self._num_features, bn_class=bn_class, num_features=c - ) - - else: - norm_name = str(type(get_norm(norm, 1))) - if "BN" in norm_name: - logger.warning( - f"Shared BatchNorm (type={norm_name}) may not work well in RetinaNetHead." 
-            )
-
-        cls_subnet = []
-        bbox_subnet = []
-        for in_channels, out_channels in zip(
-            [input_shape[0].channels] + list(conv_dims), conv_dims
-        ):
-            cls_subnet.append(
-                nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1)
-            )
-            if norm:
-                cls_subnet.append(get_norm(norm, out_channels))
-            cls_subnet.append(nn.ReLU())
-            bbox_subnet.append(
-                nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1)
-            )
-            if norm:
-                bbox_subnet.append(get_norm(norm, out_channels))
-            bbox_subnet.append(nn.ReLU())
-
-        self.cls_subnet = nn.Sequential(*cls_subnet)
-        self.bbox_subnet = nn.Sequential(*bbox_subnet)
-        self.cls_score = nn.Conv2d(
-            conv_dims[-1], num_anchors * num_classes, kernel_size=3, stride=1, padding=1
-        )
-        self.bbox_pred = nn.Conv2d(
-            conv_dims[-1], num_anchors * 4, kernel_size=3, stride=1, padding=1
-        )
-
-        # Initialization
-        for modules in [self.cls_subnet, self.bbox_subnet, self.cls_score, self.bbox_pred]:
-            for layer in modules.modules():
-                if isinstance(layer, nn.Conv2d):
-                    torch.nn.init.normal_(layer.weight, mean=0, std=0.01)
-                    torch.nn.init.constant_(layer.bias, 0)
-
-        # Use prior in model initialization to improve stability
-        bias_value = -(math.log((1 - prior_prob) / prior_prob))
-        torch.nn.init.constant_(self.cls_score.bias, bias_value)
-
-    @classmethod
-    def from_config(cls, cfg, input_shape: List[ShapeSpec]):
-        num_anchors = build_anchor_generator(cfg, input_shape).num_cell_anchors
-        assert (
-            len(set(num_anchors)) == 1
-        ), "Using different number of anchors between levels is not currently supported!"
-        num_anchors = num_anchors[0]
-
-        return {
-            "input_shape": input_shape,
-            "num_classes": cfg.MODEL.RETINANET.NUM_CLASSES,
-            "conv_dims": [input_shape[0].channels] * cfg.MODEL.RETINANET.NUM_CONVS,
-            "prior_prob": cfg.MODEL.RETINANET.PRIOR_PROB,
-            "norm": cfg.MODEL.RETINANET.NORM,
-            "num_anchors": num_anchors,
-        }
-
-    def forward(self, features: List[Tensor]):
-        """
-        Arguments:
-            features (list[Tensor]): FPN feature map tensors in high to low resolution.
-                Each tensor in the list corresponds to a different feature level.
-
-        Returns:
-            logits (list[Tensor]): #lvl tensors, each has shape (N, AxK, Hi, Wi).
-                The tensor predicts the classification probability
-                at each spatial position for each of the A anchors and K object
-                classes.
-            bbox_reg (list[Tensor]): #lvl tensors, each has shape (N, Ax4, Hi, Wi).
-                The tensor predicts 4-vector (dx,dy,dw,dh) box
-                regression values for every anchor. These values are the
-                relative offset between the anchor and the ground truth box.
-        """
-        assert len(features) == self._num_features
-        logits = []
-        bbox_reg = []
-        for feature in features:
-            logits.append(self.cls_score(self.cls_subnet(feature)))
-            bbox_reg.append(self.bbox_pred(self.bbox_subnet(feature)))
-        return logits, bbox_reg
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py
deleted file mode 100755
index 6dd3dc23..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/meta_arch/semantic_seg.py
+++ /dev/null
@@ -1,260 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
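[Editor's note] The `prior_prob` bias initialization in the RetinaNetHead listing above is easy to gloss over. Setting the classification bias to -log((1 - p) / p) makes sigmoid(bias) = p, so every anchor starts out predicting foreground with probability p (0.01 by default), which keeps the focal loss stable over the overwhelmingly negative anchors early in training. A minimal sketch of just that step; the layer sizes A=9 anchors and K=80 classes are illustrative assumptions, not values taken from this config:

```python
import math
import torch
from torch import nn

prior_prob = 0.01                    # cfg.MODEL.RETINANET.PRIOR_PROB default
num_anchors, num_classes = 9, 80     # assumed A and K for the sketch
cls_score = nn.Conv2d(256, num_anchors * num_classes, kernel_size=3, padding=1)

# Choose the bias so that sigmoid(bias) == prior_prob: initial predictions
# are ~1% foreground everywhere instead of ~50%.
bias_value = -math.log((1 - prior_prob) / prior_prob)
nn.init.constant_(cls_score.bias, bias_value)
print(torch.sigmoid(cls_score.bias[0]))  # tensor(0.0100)
```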
-import numpy as np
-from typing import Callable, Dict, Optional, Tuple, Union
-import fvcore.nn.weight_init as weight_init
-import torch
-from torch import nn
-from torch.nn import functional as F
-
-from detectron2.config import configurable
-from detectron2.layers import Conv2d, ShapeSpec, get_norm
-from detectron2.structures import ImageList
-from detectron2.utils.registry import Registry
-
-from ..backbone import Backbone, build_backbone
-from ..postprocessing import sem_seg_postprocess
-from .build import META_ARCH_REGISTRY
-
-__all__ = [
-    "SemanticSegmentor",
-    "SEM_SEG_HEADS_REGISTRY",
-    "SemSegFPNHead",
-    "build_sem_seg_head",
-]
-
-
-SEM_SEG_HEADS_REGISTRY = Registry("SEM_SEG_HEADS")
-SEM_SEG_HEADS_REGISTRY.__doc__ = """
-Registry for semantic segmentation heads, which make semantic segmentation predictions
-from feature maps.
-"""
-
-
-@META_ARCH_REGISTRY.register()
-class SemanticSegmentor(nn.Module):
-    """
-    Main class for semantic segmentation architectures.
-    """
-
-    @configurable
-    def __init__(
-        self,
-        *,
-        backbone: Backbone,
-        sem_seg_head: nn.Module,
-        pixel_mean: Tuple[float],
-        pixel_std: Tuple[float],
-    ):
-        """
-        Args:
-            backbone: a backbone module, must follow detectron2's backbone interface
-            sem_seg_head: a module that predicts semantic segmentation from backbone features
-            pixel_mean, pixel_std: list or tuple with #channels element, representing
-                the per-channel mean and std to be used to normalize the input image
-        """
-        super().__init__()
-        self.backbone = backbone
-        self.sem_seg_head = sem_seg_head
-        self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False)
-        self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False)
-
-    @classmethod
-    def from_config(cls, cfg):
-        backbone = build_backbone(cfg)
-        sem_seg_head = build_sem_seg_head(cfg, backbone.output_shape())
-        return {
-            "backbone": backbone,
-            "sem_seg_head": sem_seg_head,
-            "pixel_mean": cfg.MODEL.PIXEL_MEAN,
-            "pixel_std": cfg.MODEL.PIXEL_STD,
-        }
-
-    @property
-    def device(self):
-        return self.pixel_mean.device
-
-    def forward(self, batched_inputs):
-        """
-        Args:
-            batched_inputs: a list, batched outputs of :class:`DatasetMapper`.
-                Each item in the list contains the inputs for one image.
-
-                For now, each item in the list is a dict that contains:
-
-                * "image": Tensor, image in (C, H, W) format.
-                * "sem_seg": semantic segmentation ground truth
-                * Other information that's included in the original dicts, such as:
-                  "height", "width" (int): the output resolution of the model (may be different
-                  from input resolution), used in inference.
-
-
-        Returns:
-            list[dict]:
-              Each dict is the output for one input image.
-              The dict contains one key "sem_seg" whose value is a
-              Tensor that represents the
-              per-pixel segmentation predicted by the head.
-              The prediction has shape KxHxW that represents the logits of
-              each class for each pixel.
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - - features = self.backbone(images.tensor) - - if "sem_seg" in batched_inputs[0]: - targets = [x["sem_seg"].to(self.device) for x in batched_inputs] - targets = ImageList.from_tensors( - targets, self.backbone.size_divisibility, self.sem_seg_head.ignore_value - ).tensor - else: - targets = None - results, losses = self.sem_seg_head(features, targets) - - if self.training: - return losses - - processed_results = [] - for result, input_per_image, image_size in zip(results, batched_inputs, images.image_sizes): - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = sem_seg_postprocess(result, image_size, height, width) - processed_results.append({"sem_seg": r}) - return processed_results - - -def build_sem_seg_head(cfg, input_shape): - """ - Build a semantic segmentation head from `cfg.MODEL.SEM_SEG_HEAD.NAME`. - """ - name = cfg.MODEL.SEM_SEG_HEAD.NAME - return SEM_SEG_HEADS_REGISTRY.get(name)(cfg, input_shape) - - -@SEM_SEG_HEADS_REGISTRY.register() -class SemSegFPNHead(nn.Module): - """ - A semantic segmentation head described in :paper:`PanopticFPN`. - It takes a list of FPN features as input, and applies a sequence of - 3x3 convs and upsampling to scale all of them to the stride defined by - ``common_stride``. Then these features are added and used to make final - predictions by another 1x1 conv layer. - """ - - @configurable - def __init__( - self, - input_shape: Dict[str, ShapeSpec], - *, - num_classes: int, - conv_dims: int, - common_stride: int, - loss_weight: float = 1.0, - norm: Optional[Union[str, Callable]] = None, - ignore_value: int = -1, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape: shapes (channels and stride) of the input features - num_classes: number of classes to predict - conv_dims: number of output channels for the intermediate conv layers. - common_stride: the common stride that all features will be upscaled to - loss_weight: loss weight - norm (str or callable): normalization for all conv layers - ignore_value: category id to be ignored during training. 
- """ - super().__init__() - input_shape = sorted(input_shape.items(), key=lambda x: x[1].stride) - if not len(input_shape): - raise ValueError("SemSegFPNHead(input_shape=) cannot be empty!") - self.in_features = [k for k, v in input_shape] - feature_strides = [v.stride for k, v in input_shape] - feature_channels = [v.channels for k, v in input_shape] - - self.ignore_value = ignore_value - self.common_stride = common_stride - self.loss_weight = loss_weight - - self.scale_heads = [] - for in_feature, stride, channels in zip( - self.in_features, feature_strides, feature_channels - ): - head_ops = [] - head_length = max(1, int(np.log2(stride) - np.log2(self.common_stride))) - for k in range(head_length): - norm_module = get_norm(norm, conv_dims) - conv = Conv2d( - channels if k == 0 else conv_dims, - conv_dims, - kernel_size=3, - stride=1, - padding=1, - bias=not norm, - norm=norm_module, - activation=F.relu, - ) - weight_init.c2_msra_fill(conv) - head_ops.append(conv) - if stride != self.common_stride: - head_ops.append( - nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False) - ) - self.scale_heads.append(nn.Sequential(*head_ops)) - self.add_module(in_feature, self.scale_heads[-1]) - self.predictor = Conv2d(conv_dims, num_classes, kernel_size=1, stride=1, padding=0) - weight_init.c2_msra_fill(self.predictor) - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - return { - "input_shape": { - k: v for k, v in input_shape.items() if k in cfg.MODEL.SEM_SEG_HEAD.IN_FEATURES - }, - "ignore_value": cfg.MODEL.SEM_SEG_HEAD.IGNORE_VALUE, - "num_classes": cfg.MODEL.SEM_SEG_HEAD.NUM_CLASSES, - "conv_dims": cfg.MODEL.SEM_SEG_HEAD.CONVS_DIM, - "common_stride": cfg.MODEL.SEM_SEG_HEAD.COMMON_STRIDE, - "norm": cfg.MODEL.SEM_SEG_HEAD.NORM, - "loss_weight": cfg.MODEL.SEM_SEG_HEAD.LOSS_WEIGHT, - } - - def forward(self, features, targets=None): - """ - Returns: - In training, returns (None, dict of losses) - In inference, returns (CxHxW logits, {}) - """ - x = self.layers(features) - if self.training: - return None, self.losses(x, targets) - else: - x = F.interpolate( - x, scale_factor=self.common_stride, mode="bilinear", align_corners=False - ) - return x, {} - - def layers(self, features): - for i, f in enumerate(self.in_features): - if i == 0: - x = self.scale_heads[i](features[f]) - else: - x = x + self.scale_heads[i](features[f]) - x = self.predictor(x) - return x - - def losses(self, predictions, targets): - predictions = predictions.float() # https://github.com/pytorch/pytorch/issues/48163 - predictions = F.interpolate( - predictions, - scale_factor=self.common_stride, - mode="bilinear", - align_corners=False, - ) - loss = F.cross_entropy( - predictions, targets, reduction="mean", ignore_index=self.ignore_value - ) - losses = {"loss_sem_seg": loss * self.loss_weight} - return losses diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py deleted file mode 100755 index 386e9296..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/mmdet_wrapper.py +++ /dev/null @@ -1,274 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import itertools -import logging -import numpy as np -from collections import OrderedDict -from collections.abc import Mapping -from typing import Dict, List, Optional, Tuple, Union -import torch -from omegaconf import DictConfig, OmegaConf -from torch import Tensor, nn - -from detectron2.layers import ShapeSpec -from detectron2.structures import BitMasks, Boxes, ImageList, Instances -from detectron2.utils.events import get_event_storage - -from .backbone import Backbone - -logger = logging.getLogger(__name__) - - -def _to_container(cfg): - """ - mmdet will assert the type of dict/list. - So convert omegaconf objects to dict/list. - """ - if isinstance(cfg, DictConfig): - cfg = OmegaConf.to_container(cfg, resolve=True) - from mmcv.utils import ConfigDict - - return ConfigDict(cfg) - - -class MMDetBackbone(Backbone): - """ - Wrapper of mmdetection backbones to use in detectron2. - - mmdet backbones produce list/tuple of tensors, while detectron2 backbones - produce a dict of tensors. This class wraps the given backbone to produce - output in detectron2's convention, so it can be used in place of detectron2 - backbones. - """ - - def __init__( - self, - backbone: Union[nn.Module, Mapping], - neck: Union[nn.Module, Mapping, None] = None, - *, - output_shapes: List[ShapeSpec], - output_names: Optional[List[str]] = None, - ): - """ - Args: - backbone: either a backbone module or a mmdet config dict that defines a - backbone. The backbone takes a 4D image tensor and returns a - sequence of tensors. - neck: either a backbone module or a mmdet config dict that defines a - neck. The neck takes outputs of backbone and returns a - sequence of tensors. If None, no neck is used. - pretrained_backbone: defines the backbone weights that can be loaded by - mmdet, such as "torchvision://resnet50". - output_shapes: shape for every output of the backbone (or neck, if given). - stride and channels are often needed. - output_names: names for every output of the backbone (or neck, if given). - By default, will use "out0", "out1", ... - """ - super().__init__() - if isinstance(backbone, Mapping): - from mmdet.models import build_backbone - - backbone = build_backbone(_to_container(backbone)) - self.backbone = backbone - - if isinstance(neck, Mapping): - from mmdet.models import build_neck - - neck = build_neck(_to_container(neck)) - self.neck = neck - - # "Neck" weights, if any, are part of neck itself. This is the interface - # of mmdet so we follow it. Reference: - # https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/two_stage.py - logger.info("Initializing mmdet backbone weights...") - self.backbone.init_weights() - # train() in mmdet modules is non-trivial, and has to be explicitly - # called. Reference: - # https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/backbones/resnet.py - self.backbone.train() - if self.neck is not None: - logger.info("Initializing mmdet neck weights ...") - if isinstance(self.neck, nn.Sequential): - for m in self.neck: - m.init_weights() - else: - self.neck.init_weights() - self.neck.train() - - self._output_shapes = output_shapes - if not output_names: - output_names = [f"out{i}" for i in range(len(output_shapes))] - self._output_names = output_names - - def forward(self, x) -> Dict[str, Tensor]: - outs = self.backbone(x) - if self.neck is not None: - outs = self.neck(outs) - assert isinstance( - outs, (list, tuple) - ), "mmdet backbone should return a list/tuple of tensors!" 
- if len(outs) != len(self._output_shapes): - raise ValueError( - "Length of output_shapes does not match outputs from the mmdet backbone: " - f"{len(outs)} != {len(self._output_shapes)}" - ) - return {k: v for k, v in zip(self._output_names, outs)} - - def output_shape(self) -> Dict[str, ShapeSpec]: - return {k: v for k, v in zip(self._output_names, self._output_shapes)} - - -class MMDetDetector(nn.Module): - """ - Wrapper of a mmdetection detector model, for detection and instance segmentation. - Input/output formats of this class follow detectron2's convention, so a - mmdetection model can be trained and evaluated in detectron2. - """ - - def __init__( - self, - detector: Union[nn.Module, Mapping], - *, - # Default is 32 regardless of model: - # https://github.com/open-mmlab/mmdetection/tree/master/configs/_base_/datasets - size_divisibility=32, - pixel_mean: Tuple[float], - pixel_std: Tuple[float], - ): - """ - Args: - detector: a mmdet detector, or a mmdet config dict that defines a detector. - size_divisibility: pad input images to multiple of this number - pixel_mean: per-channel mean to normalize input image - pixel_std: per-channel stddev to normalize input image - """ - super().__init__() - if isinstance(detector, Mapping): - from mmdet.models import build_detector - - detector = build_detector(_to_container(detector)) - self.detector = detector - self.size_divisibility = size_divisibility - - self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(-1, 1, 1), False) - self.register_buffer("pixel_std", torch.tensor(pixel_std).view(-1, 1, 1), False) - assert ( - self.pixel_mean.shape == self.pixel_std.shape - ), f"{self.pixel_mean} and {self.pixel_std} have different shapes!" - - def forward(self, batched_inputs: List[Dict[str, torch.Tensor]]): - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, size_divisibility=self.size_divisibility).tensor - metas = [] - rescale = {"height" in x for x in batched_inputs} - if len(rescale) != 1: - raise ValueError("Some inputs have original height/width, but some don't!") - rescale = list(rescale)[0] - output_shapes = [] - for input in batched_inputs: - meta = {} - c, h, w = input["image"].shape - meta["img_shape"] = meta["ori_shape"] = (h, w, c) - if rescale: - scale_factor = np.array( - [w / input["width"], h / input["height"]] * 2, dtype="float32" - ) - ori_shape = (input["height"], input["width"]) - output_shapes.append(ori_shape) - meta["ori_shape"] = ori_shape + (c,) - else: - scale_factor = 1.0 - output_shapes.append((h, w)) - meta["scale_factor"] = scale_factor - meta["flip"] = False - padh, padw = images.shape[-2:] - meta["pad_shape"] = (padh, padw, c) - metas.append(meta) - - if self.training: - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - if gt_instances[0].has("gt_masks"): - from mmdet.core import PolygonMasks as mm_PolygonMasks, BitmapMasks as mm_BitMasks - - def convert_mask(m, shape): - # mmdet mask format - if isinstance(m, BitMasks): - return mm_BitMasks(m.tensor.cpu().numpy(), shape[0], shape[1]) - else: - return mm_PolygonMasks(m.polygons, shape[0], shape[1]) - - gt_masks = [convert_mask(x.gt_masks, x.image_size) for x in gt_instances] - losses_and_metrics = self.detector.forward_train( - images, - metas, - [x.gt_boxes.tensor for x in gt_instances], - [x.gt_classes for x in gt_instances], - gt_masks=gt_masks, - ) - else: - losses_and_metrics = 
self.detector.forward_train( - images, - metas, - [x.gt_boxes.tensor for x in gt_instances], - [x.gt_classes for x in gt_instances], - ) - return _parse_losses(losses_and_metrics) - else: - results = self.detector.simple_test(images, metas, rescale=rescale) - results = [ - {"instances": _convert_mmdet_result(r, shape)} - for r, shape in zip(results, output_shapes) - ] - return results - - @property - def device(self): - return self.pixel_mean.device - - -# Reference: show_result() in -# https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/base.py -def _convert_mmdet_result(result, shape: Tuple[int, int]) -> Instances: - if isinstance(result, tuple): - bbox_result, segm_result = result - if isinstance(segm_result, tuple): - segm_result = segm_result[0] - else: - bbox_result, segm_result = result, None - - bboxes = torch.from_numpy(np.vstack(bbox_result)) # Nx5 - bboxes, scores = bboxes[:, :4], bboxes[:, -1] - labels = [ - torch.full((bbox.shape[0],), i, dtype=torch.int32) for i, bbox in enumerate(bbox_result) - ] - labels = torch.cat(labels) - inst = Instances(shape) - inst.pred_boxes = Boxes(bboxes) - inst.scores = scores - inst.pred_classes = labels - - if segm_result is not None and len(labels) > 0: - segm_result = list(itertools.chain(*segm_result)) - segm_result = [torch.from_numpy(x) if isinstance(x, np.ndarray) else x for x in segm_result] - segm_result = torch.stack(segm_result, dim=0) - inst.pred_masks = segm_result - return inst - - -# reference: https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/base.py -def _parse_losses(losses: Dict[str, Tensor]) -> Dict[str, Tensor]: - log_vars = OrderedDict() - for loss_name, loss_value in losses.items(): - if isinstance(loss_value, torch.Tensor): - log_vars[loss_name] = loss_value.mean() - elif isinstance(loss_value, list): - log_vars[loss_name] = sum(_loss.mean() for _loss in loss_value) - else: - raise TypeError(f"{loss_name} is not a tensor or list of tensors") - - if "loss" not in loss_name: - # put metrics to storage; don't return them - storage = get_event_storage() - value = log_vars.pop(loss_name).cpu().item() - storage.put_scalar(loss_name, value) - return log_vars diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py deleted file mode 100755 index 6bea77af..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/poolers.py +++ /dev/null @@ -1,245 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -from typing import List -import torch -from torch import nn -from torchvision.ops import RoIPool - -from detectron2.layers import ROIAlign, ROIAlignRotated, cat, nonzero_tuple, shapes_to_tensor -from detectron2.structures import Boxes - -""" -To export ROIPooler to torchscript, in this file, variables that should be annotated with -`Union[List[Boxes], List[RotatedBoxes]]` are only annotated with `List[Boxes]`. - -TODO: Correct these annotations when torchscript support `Union`. -https://github.com/pytorch/pytorch/issues/41412 -""" - -__all__ = ["ROIPooler"] - - -def assign_boxes_to_levels( - box_lists: List[Boxes], - min_level: int, - max_level: int, - canonical_box_size: int, - canonical_level: int, -): - """ - Map each box in `box_lists` to a feature map level index and return the assignment - vector. 
- - Args: - box_lists (list[Boxes] | list[RotatedBoxes]): A list of N Boxes or N RotatedBoxes, - where N is the number of images in the batch. - min_level (int): Smallest feature map level index. The input is considered index 0, - the output of stage 1 is index 1, and so on. - max_level (int): Largest feature map level index. - canonical_box_size (int): A canonical box size in pixels (sqrt(box area)). - canonical_level (int): The feature map level index on which a canonically-sized box - should be placed. - - Returns: - A tensor of length M, where M is the total number of boxes aggregated over all - N batch images. The memory layout corresponds to the concatenation of boxes - from all images. Each element is the feature map index, as an offset from - `self.min_level`, for the corresponding box (so value i means the box is at - `self.min_level + i`). - """ - box_sizes = torch.sqrt(cat([boxes.area() for boxes in box_lists])) - # Eqn.(1) in FPN paper - level_assignments = torch.floor( - canonical_level + torch.log2(box_sizes / canonical_box_size + 1e-8) - ) - # clamp level to (min, max), in case the box size is too large or too small - # for the available feature maps - level_assignments = torch.clamp(level_assignments, min=min_level, max=max_level) - return level_assignments.to(torch.int64) - min_level - - -def convert_boxes_to_pooler_format(box_lists: List[Boxes]): - """ - Convert all boxes in `box_lists` to the low-level format used by ROI pooling ops - (see description under Returns). - - Args: - box_lists (list[Boxes] | list[RotatedBoxes]): - A list of N Boxes or N RotatedBoxes, where N is the number of images in the batch. - - Returns: - When input is list[Boxes]: - A tensor of shape (M, 5), where M is the total number of boxes aggregated over all - N batch images. - The 5 columns are (batch index, x0, y0, x1, y1), where batch index - is the index in [0, N) identifying which batch image the box with corners at - (x0, y0, x1, y1) comes from. - When input is list[RotatedBoxes]: - A tensor of shape (M, 6), where M is the total number of boxes aggregated over all - N batch images. - The 6 columns are (batch index, x_ctr, y_ctr, width, height, angle_degrees), - where batch index is the index in [0, N) identifying which batch image the - rotated box (x_ctr, y_ctr, width, height, angle_degrees) comes from. - """ - boxes = torch.cat([x.tensor for x in box_lists], dim=0) - # __len__ returns Tensor in tracing. - sizes = shapes_to_tensor([x.__len__() for x in box_lists], device=boxes.device) - indices = torch.repeat_interleave( - torch.arange(len(box_lists), dtype=boxes.dtype, device=boxes.device), sizes - ) - return cat([indices[:, None], boxes], dim=1) - - -class ROIPooler(nn.Module): - """ - Region of interest feature map pooler that supports pooling from one or more - feature maps. - """ - - def __init__( - self, - output_size, - scales, - sampling_ratio, - pooler_type, - canonical_box_size=224, - canonical_level=4, - ): - """ - Args: - output_size (int, tuple[int] or list[int]): output size of the pooled region, - e.g., 14 x 14. If tuple or list is given, the length must be 2. - scales (list[float]): The scale for each low-level pooling op relative to - the input image. For a feature map with stride s relative to the input - image, scale is defined as 1/s. The stride must be a power of 2. - When there are multiple scales, they must form a pyramid, i.e. they must be - a monotonically decreasing geometric sequence with a factor of 1/2.
- sampling_ratio (int): The `sampling_ratio` parameter for the ROIAlign op. - pooler_type (string): Name of the type of pooling operation that should be applied. - For instance, "ROIPool" or "ROIAlignV2". - canonical_box_size (int): A canonical box size in pixels (sqrt(box area)). The default - is heuristically defined as 224 pixels in the FPN paper (based on ImageNet - pre-training). - canonical_level (int): The feature map level index from which a canonically-sized box - should be placed. The default is defined as level 4 (stride=16) in the FPN paper, - i.e., a box of size 224x224 will be placed on the feature with stride=16. - The box placement for all boxes will be determined from their sizes w.r.t - canonical_box_size. For example, a box whose area is 4x that of a canonical box - should be used to pool features from feature level ``canonical_level+1``. - - Note that the actual input feature maps given to this module may not have - sufficiently many levels for the input boxes. If the boxes are too large or too - small for the input feature maps, the closest level will be used. - """ - super().__init__() - - if isinstance(output_size, int): - output_size = (output_size, output_size) - assert len(output_size) == 2 - assert isinstance(output_size[0], int) and isinstance(output_size[1], int) - self.output_size = output_size - - if pooler_type == "ROIAlign": - self.level_poolers = nn.ModuleList( - ROIAlign( - output_size, spatial_scale=scale, sampling_ratio=sampling_ratio, aligned=False - ) - for scale in scales - ) - elif pooler_type == "ROIAlignV2": - self.level_poolers = nn.ModuleList( - ROIAlign( - output_size, spatial_scale=scale, sampling_ratio=sampling_ratio, aligned=True - ) - for scale in scales - ) - elif pooler_type == "ROIPool": - self.level_poolers = nn.ModuleList( - RoIPool(output_size, spatial_scale=scale) for scale in scales - ) - elif pooler_type == "ROIAlignRotated": - self.level_poolers = nn.ModuleList( - ROIAlignRotated(output_size, spatial_scale=scale, sampling_ratio=sampling_ratio) - for scale in scales - ) - else: - raise ValueError("Unknown pooler type: {}".format(pooler_type)) - - # Map scale (defined as 1 / stride) to its feature map level under the - # assumption that stride is a power of 2. - min_level = -(math.log2(scales[0])) - max_level = -(math.log2(scales[-1])) - assert math.isclose(min_level, int(min_level)) and math.isclose( - max_level, int(max_level) - ), "Featuremap stride is not power of 2!" - self.min_level = int(min_level) - self.max_level = int(max_level) - assert ( - len(scales) == self.max_level - self.min_level + 1 - ), "[ROIPooler] Sizes of input featuremaps do not form a pyramid!" - assert 0 <= self.min_level and self.min_level <= self.max_level - self.canonical_level = canonical_level - assert canonical_box_size > 0 - self.canonical_box_size = canonical_box_size - - def forward(self, x: List[torch.Tensor], box_lists: List[Boxes]): - """ - Args: - x (list[Tensor]): A list of feature maps of NCHW shape, with scales matching those - used to construct this module. - box_lists (list[Boxes] | list[RotatedBoxes]): - A list of N Boxes or N RotatedBoxes, where N is the number of images in the batch. - The box coordinates are defined on the original image and - will be scaled by the `scales` argument of :class:`ROIPooler`. - - Returns: - Tensor: - A tensor of shape (M, C, output_size, output_size) where M is the total number of - boxes aggregated over all N batch images and C is the number of channels in `x`. 
- """ - num_level_assignments = len(self.level_poolers) - - assert isinstance(x, list) and isinstance( - box_lists, list - ), "Arguments to pooler must be lists" - assert ( - len(x) == num_level_assignments - ), "unequal value, num_level_assignments={}, but x is list of {} Tensors".format( - num_level_assignments, len(x) - ) - - assert len(box_lists) == x[0].size( - 0 - ), "unequal value, x[0] batch dim 0 is {}, but box_list has length {}".format( - x[0].size(0), len(box_lists) - ) - if len(box_lists) == 0: - return torch.zeros( - (0, x[0].shape[1]) + self.output_size, device=x[0].device, dtype=x[0].dtype - ) - - pooler_fmt_boxes = convert_boxes_to_pooler_format(box_lists) - - if num_level_assignments == 1: - return self.level_poolers[0](x[0], pooler_fmt_boxes) - - level_assignments = assign_boxes_to_levels( - box_lists, self.min_level, self.max_level, self.canonical_box_size, self.canonical_level - ) - - num_boxes = pooler_fmt_boxes.size(0) - num_channels = x[0].shape[1] - output_size = self.output_size[0] - - dtype, device = x[0].dtype, x[0].device - output = torch.zeros( - (num_boxes, num_channels, output_size, output_size), dtype=dtype, device=device - ) - - for level, pooler in enumerate(self.level_poolers): - inds = nonzero_tuple(level_assignments == level)[0] - pooler_fmt_boxes_level = pooler_fmt_boxes[inds] - # Use index_put_ instead of advance indexing, to avoid pytorch/issues/49852 - output.index_put_((inds,), pooler(x[level], pooler_fmt_boxes_level)) - - return output diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py deleted file mode 100755 index 52f273bb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/postprocessing.py +++ /dev/null @@ -1,101 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import torch -from torch.nn import functional as F - -from detectron2.structures import Instances, ROIMasks - - -# perhaps should rename to "resize_instance" -def detector_postprocess( - results: Instances, output_height: int, output_width: int, mask_threshold: float = 0.5 -): - """ - Resize the output instances. - The input images are often resized when entering an object detector. - As a result, we often need the outputs of the detector in a different - resolution from its inputs. - - This function will resize the raw outputs of an R-CNN detector - to produce outputs according to the desired output resolution. - - Args: - results (Instances): the raw outputs from the detector. - `results.image_size` contains the input image resolution the detector sees. - This object might be modified in-place. - output_height, output_width: the desired output resolution. - - Returns: - Instances: the resized output from the model, based on the output resolution - """ - if isinstance(output_width, torch.Tensor): - # This shape might (but not necessarily) be tensors during tracing. - # Converts integer tensors to float temporaries to ensure true - # division is performed when computing scale_x and scale_y. 
- output_width_tmp = output_width.float() - output_height_tmp = output_height.float() - new_size = torch.stack([output_height, output_width]) - else: - new_size = (output_height, output_width) - output_width_tmp = output_width - output_height_tmp = output_height - - scale_x, scale_y = ( - output_width_tmp / results.image_size[1], - output_height_tmp / results.image_size[0], - ) - results = Instances(new_size, **results.get_fields()) - - if results.has("pred_boxes"): - output_boxes = results.pred_boxes - elif results.has("proposal_boxes"): - output_boxes = results.proposal_boxes - else: - output_boxes = None - assert output_boxes is not None, "Predictions must contain boxes!" - - output_boxes.scale(scale_x, scale_y) - output_boxes.clip(results.image_size) - - results = results[output_boxes.nonempty()] - - if results.has("pred_masks"): - if isinstance(results.pred_masks, ROIMasks): - roi_masks = results.pred_masks - else: - # pred_masks is a tensor of shape (N, 1, M, M) - roi_masks = ROIMasks(results.pred_masks[:, 0, :, :]) - results.pred_masks = roi_masks.to_bitmasks( - results.pred_boxes, output_height, output_width, mask_threshold - ).tensor # TODO return ROIMasks/BitMask object in the future - - if results.has("pred_keypoints"): - results.pred_keypoints[:, :, 0] *= scale_x - results.pred_keypoints[:, :, 1] *= scale_y - - return results - - -def sem_seg_postprocess(result, img_size, output_height, output_width): - """ - Return semantic segmentation predictions in the original resolution. - - The input images are often resized when entering the semantic segmentor. Moreover, in some - cases, they are also padded inside the segmentor to be divisible by the maximum network stride. - As a result, we often need the predictions of the segmentor in a different - resolution from its inputs. - - Args: - result (Tensor): semantic segmentation prediction logits. A tensor of shape (C, H, W), - where C is the number of classes, and H, W are the height and width of the prediction. - img_size (tuple): image size that the segmentor takes as input. - output_height, output_width: the desired output resolution. - - Returns: - semantic segmentation prediction (Tensor): A tensor of shape - (C, output_height, output_width) that contains per-pixel soft predictions. - """ - result = result[:, : img_size[0], : img_size[1]].expand(1, -1, -1, -1) - result = F.interpolate( - result, size=(output_height, output_width), mode="bilinear", align_corners=False - )[0] - return result diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py deleted file mode 100755 index 3f4e4df7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .build import PROPOSAL_GENERATOR_REGISTRY, build_proposal_generator -from .rpn import RPN_HEAD_REGISTRY, build_rpn_head, RPN, StandardRPNHead - -__all__ = list(globals().keys()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py deleted file mode 100755 index 34eb12d0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/build.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates.
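
(A quick illustration of the postprocessing contract deleted above: sem_seg_postprocess first crops away the padding that made the input stride-divisible, then bilinearly resizes the logits to the requested output resolution. The snippet below is a condensed, hedged restatement under assumed shapes, not the deleted implementation; it uses unsqueeze(0) where the original uses expand(1, -1, -1, -1).)

```python
import torch
from torch.nn import functional as F

def sem_seg_postprocess_sketch(result, img_size, output_height, output_width):
    """Crop padded logits to the real image area, then resize to the target size."""
    # result: (C, H_pad, W_pad) logits; img_size: (h, w) occupied by the actual image
    result = result[:, : img_size[0], : img_size[1]].unsqueeze(0)  # add a batch dim
    result = F.interpolate(
        result, size=(output_height, output_width), mode="bilinear", align_corners=False
    )
    return result[0]  # (C, output_height, output_width)

# hypothetical shapes: 19 classes, padded to 512x512, real image 480x500, output 960x1000
logits = torch.randn(19, 512, 512)
print(sem_seg_postprocess_sketch(logits, (480, 500), 960, 1000).shape)
# torch.Size([19, 960, 1000])
```
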
-from detectron2.utils.registry import Registry - -PROPOSAL_GENERATOR_REGISTRY = Registry("PROPOSAL_GENERATOR") -PROPOSAL_GENERATOR_REGISTRY.__doc__ = """ -Registry for proposal generator, which produces object proposals from feature maps. - -The registered object will be called with `obj(cfg, input_shape)`. -The call should return a `nn.Module` object. -""" - -from . import rpn, rrpn # noqa F401 isort:skip - - -def build_proposal_generator(cfg, input_shape): - """ - Build a proposal generator from `cfg.MODEL.PROPOSAL_GENERATOR.NAME`. - The name can be "PrecomputedProposals" to use no proposal generator. - """ - name = cfg.MODEL.PROPOSAL_GENERATOR.NAME - if name == "PrecomputedProposals": - return None - - return PROPOSAL_GENERATOR_REGISTRY.get(name)(cfg, input_shape) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py deleted file mode 100755 index 47032198..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/proposal_utils.py +++ /dev/null @@ -1,196 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from typing import List, Tuple, Union -import torch - -from detectron2.layers import batched_nms, cat -from detectron2.structures import Boxes, Instances - -logger = logging.getLogger(__name__) - - -def _is_tracing(): - # (fixed in TORCH_VERSION >= 1.9) - if torch.jit.is_scripting(): - # https://github.com/pytorch/pytorch/issues/47379 - return False - else: - return torch.jit.is_tracing() - - -def find_top_rpn_proposals( - proposals: List[torch.Tensor], - pred_objectness_logits: List[torch.Tensor], - image_sizes: List[Tuple[int, int]], - nms_thresh: float, - pre_nms_topk: int, - post_nms_topk: int, - min_box_size: float, - training: bool, -): - """ - For each feature map, select the `pre_nms_topk` highest scoring proposals, - apply NMS, clip proposals, and remove small boxes. Return the `post_nms_topk` - highest scoring proposals among all the feature maps for each image. - - Args: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A, 4). - All proposal predictions on the feature maps. - pred_objectness_logits (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A). - image_sizes (list[tuple]): sizes (h, w) for each image - nms_thresh (float): IoU threshold to use for NMS - pre_nms_topk (int): number of top k scoring proposals to keep before applying NMS. - When RPN is run on multiple feature maps (as in FPN) this number is per - feature map. - post_nms_topk (int): number of top k scoring proposals to keep after applying NMS. - When RPN is run on multiple feature maps (as in FPN) this number is total, - over all feature maps. - min_box_size (float): minimum proposal box side length in pixels (absolute units - wrt input images). - training (bool): True if proposals are to be used in training, otherwise False. - This arg exists only to support a legacy bug; look for the "NB: Legacy bug ..." - comment. - - Returns: - list[Instances]: list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i, sorted by their - objectness score in descending order. - """ - num_images = len(image_sizes) - device = proposals[0].device - - # 1. 
Select top-k anchor for every level and every image - topk_scores = [] # #lvl Tensor, each of shape N x topk - topk_proposals = [] - level_ids = [] # #lvl Tensor, each of shape (topk,) - batch_idx = torch.arange(num_images, device=device) - for level_id, (proposals_i, logits_i) in enumerate(zip(proposals, pred_objectness_logits)): - Hi_Wi_A = logits_i.shape[1] - if isinstance(Hi_Wi_A, torch.Tensor): # it's a tensor in tracing - num_proposals_i = torch.clamp(Hi_Wi_A, max=pre_nms_topk) - else: - num_proposals_i = min(Hi_Wi_A, pre_nms_topk) - - topk_scores_i, topk_idx = logits_i.topk(num_proposals_i, dim=1) - - # each is N x topk - topk_proposals_i = proposals_i[batch_idx[:, None], topk_idx] # N x topk x 4 - - topk_proposals.append(topk_proposals_i) - topk_scores.append(topk_scores_i) - level_ids.append(torch.full((num_proposals_i,), level_id, dtype=torch.int64, device=device)) - - # 2. Concat all levels together - topk_scores = cat(topk_scores, dim=1) - topk_proposals = cat(topk_proposals, dim=1) - level_ids = cat(level_ids, dim=0) - - # 3. For each image, run a per-level NMS, and choose topk results. - results: List[Instances] = [] - for n, image_size in enumerate(image_sizes): - boxes = Boxes(topk_proposals[n]) - scores_per_img = topk_scores[n] - lvl = level_ids - - valid_mask = torch.isfinite(boxes.tensor).all(dim=1) & torch.isfinite(scores_per_img) - if not valid_mask.all(): - if training: - raise FloatingPointError( - "Predicted boxes or scores contain Inf/NaN. Training has diverged." - ) - boxes = boxes[valid_mask] - scores_per_img = scores_per_img[valid_mask] - lvl = lvl[valid_mask] - boxes.clip(image_size) - - # filter empty boxes - keep = boxes.nonempty(threshold=min_box_size) - if _is_tracing() or keep.sum().item() != len(boxes): - boxes, scores_per_img, lvl = boxes[keep], scores_per_img[keep], lvl[keep] - - keep = batched_nms(boxes.tensor, scores_per_img, lvl, nms_thresh) - # In Detectron1, there was different behavior during training vs. testing. - # (https://github.com/facebookresearch/Detectron/issues/459) - # During training, topk is over the proposals from *all* images in the training batch. - # During testing, it is over the proposals for each image separately. - # As a result, the training behavior becomes batch-dependent, - # and the configuration "POST_NMS_TOPK_TRAIN" ends up relying on the batch size. - # This bug is addressed in Detectron2 to make the behavior independent of batch size. - keep = keep[:post_nms_topk] # keep is already sorted - - res = Instances(image_size) - res.proposal_boxes = boxes[keep] - res.objectness_logits = scores_per_img[keep] - results.append(res) - return results - - -def add_ground_truth_to_proposals( - gt: Union[List[Instances], List[Boxes]], proposals: List[Instances] -) -> List[Instances]: - """ - Call `add_ground_truth_to_proposals_single_image` for all images. - - Args: - gt (Union[List[Instances], List[Boxes]]): list of N elements. Element i is an Instances - representing the ground-truth for image i. - proposals (list[Instances]): list of N elements. Element i is an Instances - representing the proposals for image i. - - Returns: - list[Instances]: list of N Instances. Each is the proposals for the image, - with fields "proposal_boxes" and "objectness_logits".
- """ - assert gt is not None - - if len(proposals) != len(gt): - raise ValueError("proposals and gt should have the same length as the number of images!") - if len(proposals) == 0: - return proposals - - return [ - add_ground_truth_to_proposals_single_image(gt_i, proposals_i) - for gt_i, proposals_i in zip(gt, proposals) - ] - - -def add_ground_truth_to_proposals_single_image( - gt: Union[Instances, Boxes], proposals: Instances -) -> Instances: - """ - Augment `proposals` with `gt`. - - Args: - Same as `add_ground_truth_to_proposals`, but with gt and proposals - per image. - - Returns: - Same as `add_ground_truth_to_proposals`, but for only one image. - """ - if isinstance(gt, Boxes): - # convert Boxes to Instances - gt = Instances(proposals.image_size, gt_boxes=gt) - - gt_boxes = gt.gt_boxes - device = proposals.objectness_logits.device - # Assign all ground-truth boxes an objectness logit corresponding to - # P(object) = sigmoid(logit) =~ 1. - gt_logit_value = math.log((1.0 - 1e-10) / (1 - (1.0 - 1e-10))) - gt_logits = gt_logit_value * torch.ones(len(gt_boxes), device=device) - - # Concatenating gt_boxes with proposals requires them to have the same fields - gt_proposal = Instances(proposals.image_size, **gt.get_fields()) - gt_proposal.proposal_boxes = gt_boxes - gt_proposal.objectness_logits = gt_logits - - for key in proposals.get_fields().keys(): - assert gt_proposal.has( - key - ), "The attribute '{}' in `proposals` does not exist in `gt`".format(key) - - # NOTE: Instances.cat only use fields from the first item. Extra fields in latter items - # will be thrown away. - new_proposals = Instances.cat([proposals, gt_proposal]) - - return new_proposals diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py deleted file mode 100755 index 99cd536d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rpn.py +++ /dev/null @@ -1,533 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from typing import Dict, List, Optional, Tuple, Union -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ShapeSpec, cat -from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage -from detectron2.utils.memory import retry_if_cuda_oom -from detectron2.utils.registry import Registry - -from ..anchor_generator import build_anchor_generator -from ..box_regression import Box2BoxTransform, _dense_box_regression_loss -from ..matcher import Matcher -from ..sampling import subsample_labels -from .build import PROPOSAL_GENERATOR_REGISTRY -from .proposal_utils import find_top_rpn_proposals - -RPN_HEAD_REGISTRY = Registry("RPN_HEAD") -RPN_HEAD_REGISTRY.__doc__ = """ -Registry for RPN heads, which take feature maps and perform -objectness classification and bounding box regression for anchors. - -The registered object will be called with `obj(cfg, input_shape)`. -The call should return a `nn.Module` object. 
-""" - - -""" -Shape shorthand in this module: - - N: number of images in the minibatch - L: number of feature maps per image on which RPN is run - A: number of cell anchors (must be the same for all feature maps) - Hi, Wi: height and width of the i-th feature map - B: size of the box parameterization - -Naming convention: - - objectness: refers to the binary classification of an anchor as object vs. not object. - - deltas: refers to the 4-d (dx, dy, dw, dh) deltas that parameterize the box2box - transform (see :class:`box_regression.Box2BoxTransform`), or 5d for rotated boxes. - - pred_objectness_logits: predicted objectness scores in [-inf, +inf]; use - sigmoid(pred_objectness_logits) to estimate P(object). - - gt_labels: ground-truth binary classification labels for objectness - - pred_anchor_deltas: predicted box2box transform deltas - - gt_anchor_deltas: ground-truth box2box transform deltas -""" - - -def build_rpn_head(cfg, input_shape): - """ - Build an RPN head defined by `cfg.MODEL.RPN.HEAD_NAME`. - """ - name = cfg.MODEL.RPN.HEAD_NAME - return RPN_HEAD_REGISTRY.get(name)(cfg, input_shape) - - -@RPN_HEAD_REGISTRY.register() -class StandardRPNHead(nn.Module): - """ - Standard RPN classification and regression heads described in :paper:`Faster R-CNN`. - Uses a 3x3 conv to produce a shared hidden state from which one 1x1 conv predicts - objectness logits for each anchor and a second 1x1 conv predicts bounding-box deltas - specifying how to deform each anchor into an object proposal. - """ - - @configurable - def __init__( - self, *, in_channels: int, num_anchors: int, box_dim: int = 4, conv_dims: List[int] = (-1,) - ): - """ - NOTE: this interface is experimental. - - Args: - in_channels (int): number of input feature channels. When using multiple - input features, they must have the same number of channels. - num_anchors (int): number of anchors to predict for *each spatial position* - on the feature map. The total number of anchors for each - feature map will be `num_anchors * H * W`. - box_dim (int): dimension of a box, which is also the number of box regression - predictions to make for each anchor. An axis aligned box has - box_dim=4, while a rotated box has box_dim=5. - conv_dims (list[int]): a list of integers representing the output channels - of N conv layers. Set it to -1 to use the same number of output channels - as input channels. - """ - super().__init__() - cur_channels = in_channels - # Keeping the old variable names and structure for backwards compatiblity. - # Otherwise the old checkpoints will fail to load. - if len(conv_dims) == 1: - out_channels = cur_channels if conv_dims[0] == -1 else conv_dims[0] - # 3x3 conv for the hidden representation - self.conv = self._get_rpn_conv(cur_channels, out_channels) - cur_channels = out_channels - else: - self.conv = nn.Sequential() - for k, conv_dim in enumerate(conv_dims): - out_channels = cur_channels if conv_dim == -1 else conv_dim - if out_channels <= 0: - raise ValueError( - f"Conv output channels should be greater than 0. 
Got {out_channels}" - ) - conv = self._get_rpn_conv(cur_channels, out_channels) - self.conv.add_module(f"conv{k}", conv) - cur_channels = out_channels - # 1x1 conv for predicting objectness logits - self.objectness_logits = nn.Conv2d(cur_channels, num_anchors, kernel_size=1, stride=1) - # 1x1 conv for predicting box2box transform deltas - self.anchor_deltas = nn.Conv2d(cur_channels, num_anchors * box_dim, kernel_size=1, stride=1) - - # Keeping the order of weights initialization same for backwards compatiblility. - for layer in self.modules(): - if isinstance(layer, nn.Conv2d): - nn.init.normal_(layer.weight, std=0.01) - nn.init.constant_(layer.bias, 0) - - def _get_rpn_conv(self, in_channels, out_channels): - return Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - activation=nn.ReLU(), - ) - - @classmethod - def from_config(cls, cfg, input_shape): - # Standard RPN is shared across levels: - in_channels = [s.channels for s in input_shape] - assert len(set(in_channels)) == 1, "Each level must have the same channel!" - in_channels = in_channels[0] - - # RPNHead should take the same input as anchor generator - # NOTE: it assumes that creating an anchor generator does not have unwanted side effect. - anchor_generator = build_anchor_generator(cfg, input_shape) - num_anchors = anchor_generator.num_anchors - box_dim = anchor_generator.box_dim - assert ( - len(set(num_anchors)) == 1 - ), "Each level must have the same number of anchors per spatial position" - return { - "in_channels": in_channels, - "num_anchors": num_anchors[0], - "box_dim": box_dim, - "conv_dims": cfg.MODEL.RPN.CONV_DIMS, - } - - def forward(self, features: List[torch.Tensor]): - """ - Args: - features (list[Tensor]): list of feature maps - - Returns: - list[Tensor]: A list of L elements. - Element i is a tensor of shape (N, A, Hi, Wi) representing - the predicted objectness logits for all anchors. A is the number of cell anchors. - list[Tensor]: A list of L elements. Element i is a tensor of shape - (N, A*box_dim, Hi, Wi) representing the predicted "deltas" used to transform anchors - to proposals. - """ - pred_objectness_logits = [] - pred_anchor_deltas = [] - for x in features: - t = self.conv(x) - pred_objectness_logits.append(self.objectness_logits(t)) - pred_anchor_deltas.append(self.anchor_deltas(t)) - return pred_objectness_logits, pred_anchor_deltas - - -@PROPOSAL_GENERATOR_REGISTRY.register() -class RPN(nn.Module): - """ - Region Proposal Network, introduced by :paper:`Faster R-CNN`. - """ - - @configurable - def __init__( - self, - *, - in_features: List[str], - head: nn.Module, - anchor_generator: nn.Module, - anchor_matcher: Matcher, - box2box_transform: Box2BoxTransform, - batch_size_per_image: int, - positive_fraction: float, - pre_nms_topk: Tuple[float, float], - post_nms_topk: Tuple[float, float], - nms_thresh: float = 0.7, - min_box_size: float = 0.0, - anchor_boundary_thresh: float = -1.0, - loss_weight: Union[float, Dict[str, float]] = 1.0, - box_reg_loss_type: str = "smooth_l1", - smooth_l1_beta: float = 0.0, - ): - """ - NOTE: this interface is experimental. - - Args: - in_features (list[str]): list of names of input features to use - head (nn.Module): a module that predicts logits and regression deltas - for each level from a list of per-level features - anchor_generator (nn.Module): a module that creates anchors from a - list of features. Usually an instance of :class:`AnchorGenerator` - anchor_matcher (Matcher): label the anchors by matching them with ground truth. 
- box2box_transform (Box2BoxTransform): defines the transform from anchors boxes to - instance boxes - batch_size_per_image (int): number of anchors per image to sample for training - positive_fraction (float): fraction of foreground anchors to sample for training - pre_nms_topk (tuple[float]): (train, test) that represents the - number of top k proposals to select before NMS, in - training and testing. - post_nms_topk (tuple[float]): (train, test) that represents the - number of top k proposals to select after NMS, in - training and testing. - nms_thresh (float): NMS threshold used to de-duplicate the predicted proposals - min_box_size (float): remove proposal boxes with any side smaller than this threshold, - in the unit of input image pixels - anchor_boundary_thresh (float): legacy option - loss_weight (float|dict): weights to use for losses. Can be single float for weighting - all rpn losses together, or a dict of individual weightings. Valid dict keys are: - "loss_rpn_cls" - applied to classification loss - "loss_rpn_loc" - applied to box regression loss - box_reg_loss_type (str): Loss type to use. Supported losses: "smooth_l1", "giou". - smooth_l1_beta (float): beta parameter for the smooth L1 regression loss. Default to - use L1 loss. Only used when `box_reg_loss_type` is "smooth_l1" - """ - super().__init__() - self.in_features = in_features - self.rpn_head = head - self.anchor_generator = anchor_generator - self.anchor_matcher = anchor_matcher - self.box2box_transform = box2box_transform - self.batch_size_per_image = batch_size_per_image - self.positive_fraction = positive_fraction - # Map from self.training state to train/test settings - self.pre_nms_topk = {True: pre_nms_topk[0], False: pre_nms_topk[1]} - self.post_nms_topk = {True: post_nms_topk[0], False: post_nms_topk[1]} - self.nms_thresh = nms_thresh - self.min_box_size = float(min_box_size) - self.anchor_boundary_thresh = anchor_boundary_thresh - if isinstance(loss_weight, float): - loss_weight = {"loss_rpn_cls": loss_weight, "loss_rpn_loc": loss_weight} - self.loss_weight = loss_weight - self.box_reg_loss_type = box_reg_loss_type - self.smooth_l1_beta = smooth_l1_beta - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - in_features = cfg.MODEL.RPN.IN_FEATURES - ret = { - "in_features": in_features, - "min_box_size": cfg.MODEL.PROPOSAL_GENERATOR.MIN_SIZE, - "nms_thresh": cfg.MODEL.RPN.NMS_THRESH, - "batch_size_per_image": cfg.MODEL.RPN.BATCH_SIZE_PER_IMAGE, - "positive_fraction": cfg.MODEL.RPN.POSITIVE_FRACTION, - "loss_weight": { - "loss_rpn_cls": cfg.MODEL.RPN.LOSS_WEIGHT, - "loss_rpn_loc": cfg.MODEL.RPN.BBOX_REG_LOSS_WEIGHT * cfg.MODEL.RPN.LOSS_WEIGHT, - }, - "anchor_boundary_thresh": cfg.MODEL.RPN.BOUNDARY_THRESH, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.RPN.BBOX_REG_WEIGHTS), - "box_reg_loss_type": cfg.MODEL.RPN.BBOX_REG_LOSS_TYPE, - "smooth_l1_beta": cfg.MODEL.RPN.SMOOTH_L1_BETA, - } - - ret["pre_nms_topk"] = (cfg.MODEL.RPN.PRE_NMS_TOPK_TRAIN, cfg.MODEL.RPN.PRE_NMS_TOPK_TEST) - ret["post_nms_topk"] = (cfg.MODEL.RPN.POST_NMS_TOPK_TRAIN, cfg.MODEL.RPN.POST_NMS_TOPK_TEST) - - ret["anchor_generator"] = build_anchor_generator(cfg, [input_shape[f] for f in in_features]) - ret["anchor_matcher"] = Matcher( - cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS, allow_low_quality_matches=True - ) - ret["head"] = build_rpn_head(cfg, [input_shape[f] for f in in_features]) - return ret - - def _subsample_labels(self, label): - """ - Randomly sample a subset of positive and 
negative examples, and overwrite - the label vector to the ignore value (-1) for all elements that are not - included in the sample. - - Args: - labels (Tensor): a vector of -1, 0, 1. Will be modified in-place and returned. - """ - pos_idx, neg_idx = subsample_labels( - label, self.batch_size_per_image, self.positive_fraction, 0 - ) - # Fill with the ignore label (-1), then set positive and negative labels - label.fill_(-1) - label.scatter_(0, pos_idx, 1) - label.scatter_(0, neg_idx, 0) - return label - - @torch.jit.unused - @torch.no_grad() - def label_and_sample_anchors( - self, anchors: List[Boxes], gt_instances: List[Instances] - ) -> Tuple[List[torch.Tensor], List[torch.Tensor]]: - """ - Args: - anchors (list[Boxes]): anchors for each feature map. - gt_instances: the ground-truth instances for each image. - - Returns: - list[Tensor]: - List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across all feature maps R = sum(Hi * Wi * A). - Label values are in {-1, 0, 1}, with meanings: -1 = ignore; 0 = negative - class; 1 = positive class. - list[Tensor]: - i-th element is a Rx4 tensor. The values are the matched gt boxes for each - anchor. Values are undefined for those anchors not labeled as 1. - """ - anchors = Boxes.cat(anchors) - - gt_boxes = [x.gt_boxes for x in gt_instances] - image_sizes = [x.image_size for x in gt_instances] - del gt_instances - - gt_labels = [] - matched_gt_boxes = [] - for image_size_i, gt_boxes_i in zip(image_sizes, gt_boxes): - """ - image_size_i: (h, w) for the i-th image - gt_boxes_i: ground-truth boxes for i-th image - """ - - match_quality_matrix = retry_if_cuda_oom(pairwise_iou)(gt_boxes_i, anchors) - matched_idxs, gt_labels_i = retry_if_cuda_oom(self.anchor_matcher)(match_quality_matrix) - # Matching is memory-expensive and may result in CPU tensors. But the result is small - gt_labels_i = gt_labels_i.to(device=gt_boxes_i.device) - del match_quality_matrix - - if self.anchor_boundary_thresh >= 0: - # Discard anchors that go out of the boundaries of the image - # NOTE: This is legacy functionality that is turned off by default in Detectron2 - anchors_inside_image = anchors.inside_box(image_size_i, self.anchor_boundary_thresh) - gt_labels_i[~anchors_inside_image] = -1 - - # A vector of labels (-1, 0, 1) for each anchor - gt_labels_i = self._subsample_labels(gt_labels_i) - - if len(gt_boxes_i) == 0: - # These values won't be used anyway since the anchor is labeled as background - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - else: - # TODO wasted indexing computation for ignored boxes - matched_gt_boxes_i = gt_boxes_i[matched_idxs].tensor - - gt_labels.append(gt_labels_i) # N,AHW - matched_gt_boxes.append(matched_gt_boxes_i) - return gt_labels, matched_gt_boxes - - @torch.jit.unused - def losses( - self, - anchors: List[Boxes], - pred_objectness_logits: List[torch.Tensor], - gt_labels: List[torch.Tensor], - pred_anchor_deltas: List[torch.Tensor], - gt_boxes: List[torch.Tensor], - ) -> Dict[str, torch.Tensor]: - """ - Return the losses from a set of RPN predictions and their associated ground-truth. - - Args: - anchors (list[Boxes or RotatedBoxes]): anchors for each feature map, each - has shape (Hi*Wi*A, B), where B is box dimension (4 or 5). - pred_objectness_logits (list[Tensor]): A list of L elements. - Element i is a tensor of shape (N, Hi*Wi*A) representing - the predicted objectness logits for all anchors. - gt_labels (list[Tensor]): Output of :meth:`label_and_sample_anchors`. 
- pred_anchor_deltas (list[Tensor]): A list of L elements. Element i is a tensor of shape - (N, Hi*Wi*A, 4 or 5) representing the predicted "deltas" used to transform anchors - to proposals. - gt_boxes (list[Tensor]): Output of :meth:`label_and_sample_anchors`. - - Returns: - dict[loss name -> loss value]: A dict mapping from loss name to loss value. - Loss names are: `loss_rpn_cls` for objectness classification and - `loss_rpn_loc` for proposal localization. - """ - num_images = len(gt_labels) - gt_labels = torch.stack(gt_labels) # (N, sum(Hi*Wi*Ai)) - - # Log the number of positive/negative anchors per-image that's used in training - pos_mask = gt_labels == 1 - num_pos_anchors = pos_mask.sum().item() - num_neg_anchors = (gt_labels == 0).sum().item() - storage = get_event_storage() - storage.put_scalar("rpn/num_pos_anchors", num_pos_anchors / num_images) - storage.put_scalar("rpn/num_neg_anchors", num_neg_anchors / num_images) - - localization_loss = _dense_box_regression_loss( - anchors, - self.box2box_transform, - pred_anchor_deltas, - gt_boxes, - pos_mask, - box_reg_loss_type=self.box_reg_loss_type, - smooth_l1_beta=self.smooth_l1_beta, - ) - - valid_mask = gt_labels >= 0 - objectness_loss = F.binary_cross_entropy_with_logits( - cat(pred_objectness_logits, dim=1)[valid_mask], - gt_labels[valid_mask].to(torch.float32), - reduction="sum", - ) - normalizer = self.batch_size_per_image * num_images - losses = { - "loss_rpn_cls": objectness_loss / normalizer, - # The original Faster R-CNN paper uses a slightly different normalizer - # for loc loss. But it doesn't matter in practice - "loss_rpn_loc": localization_loss / normalizer, - } - losses = {k: v * self.loss_weight.get(k, 1.0) for k, v in losses.items()} - return losses - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - gt_instances: Optional[List[Instances]] = None, - ): - """ - Args: - images (ImageList): input images of length `N` - features (dict[str, Tensor]): input data as a mapping from feature - map name to tensor. Axis 0 represents the number of images `N` in - the input data; axes 1-3 are channels, height, and width, which may - vary between feature maps (e.g., if a feature pyramid is used). - gt_instances (list[Instances], optional): a length `N` list of `Instances`s. - Each `Instances` stores ground-truth instances for the corresponding image. - - Returns: - proposals: list[Instances]: contains fields "proposal_boxes", "objectness_logits" - loss: dict[Tensor] or None - """ - features = [features[f] for f in self.in_features] - anchors = self.anchor_generator(features) - - pred_objectness_logits, pred_anchor_deltas = self.rpn_head(features) - # Transpose the Hi*Wi*A dimension to the middle: - pred_objectness_logits = [ - # (N, A, Hi, Wi) -> (N, Hi, Wi, A) -> (N, Hi*Wi*A) - score.permute(0, 2, 3, 1).flatten(1) - for score in pred_objectness_logits - ] - pred_anchor_deltas = [ - # (N, A*B, Hi, Wi) -> (N, A, B, Hi, Wi) -> (N, Hi, Wi, A, B) -> (N, Hi*Wi*A, B) - x.view(x.shape[0], -1, self.anchor_generator.box_dim, x.shape[-2], x.shape[-1]) - .permute(0, 3, 4, 1, 2) - .flatten(1, -2) - for x in pred_anchor_deltas - ] - - if self.training: - assert gt_instances is not None, "RPN requires gt_instances in training!" 
- gt_labels, gt_boxes = self.label_and_sample_anchors(anchors, gt_instances) - losses = self.losses( - anchors, pred_objectness_logits, gt_labels, pred_anchor_deltas, gt_boxes - ) - else: - losses = {} - proposals = self.predict_proposals( - anchors, pred_objectness_logits, pred_anchor_deltas, images.image_sizes - ) - return proposals, losses - - def predict_proposals( - self, - anchors: List[Boxes], - pred_objectness_logits: List[torch.Tensor], - pred_anchor_deltas: List[torch.Tensor], - image_sizes: List[Tuple[int, int]], - ): - """ - Decode all the predicted box regression deltas to proposals. Find the top proposals - by applying NMS and removing boxes that are too small. - - Returns: - proposals (list[Instances]): list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i, sorted by their - objectness score in descending order. - """ - # The proposals are treated as fixed for joint training with roi heads. - # This approach ignores the derivative w.r.t. the proposal boxes’ coordinates that - # are also network responses. - with torch.no_grad(): - pred_proposals = self._decode_proposals(anchors, pred_anchor_deltas) - return find_top_rpn_proposals( - pred_proposals, - pred_objectness_logits, - image_sizes, - self.nms_thresh, - self.pre_nms_topk[self.training], - self.post_nms_topk[self.training], - self.min_box_size, - self.training, - ) - - def _decode_proposals(self, anchors: List[Boxes], pred_anchor_deltas: List[torch.Tensor]): - """ - Transform anchors into proposals by applying the predicted anchor deltas. - - Returns: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape - (N, Hi*Wi*A, B) - """ - N = pred_anchor_deltas[0].shape[0] - proposals = [] - # For each feature map - for anchors_i, pred_anchor_deltas_i in zip(anchors, pred_anchor_deltas): - B = anchors_i.tensor.size(1) - pred_anchor_deltas_i = pred_anchor_deltas_i.reshape(-1, B) - # Expand anchors to shape (N*Hi*Wi*A, B) - anchors_i = anchors_i.tensor.unsqueeze(0).expand(N, -1, -1).reshape(-1, B) - proposals_i = self.box2box_transform.apply_deltas(pred_anchor_deltas_i, anchors_i) - # Append feature map proposals with shape (N, Hi*Wi*A, B) - proposals.append(proposals_i.view(N, -1, B)) - return proposals diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py deleted file mode 100755 index d51b92b7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/proposal_generator/rrpn.py +++ /dev/null @@ -1,203 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -import logging -from typing import Dict, List -import torch - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, batched_nms_rotated, cat -from detectron2.structures import Instances, RotatedBoxes, pairwise_iou_rotated -from detectron2.utils.memory import retry_if_cuda_oom - -from ..box_regression import Box2BoxTransformRotated -from .build import PROPOSAL_GENERATOR_REGISTRY -from .proposal_utils import _is_tracing -from .rpn import RPN - -logger = logging.getLogger(__name__) - - -def find_top_rrpn_proposals( - proposals, - pred_objectness_logits, - image_sizes, - nms_thresh, - pre_nms_topk, - post_nms_topk, - min_box_size, - training, -): - """ - For each feature map, select the `pre_nms_topk` highest scoring proposals, - apply NMS, clip proposals, and remove small boxes. 
Return the `post_nms_topk` - highest scoring proposals among all the feature maps if `training` is True, - otherwise, returns the highest `post_nms_topk` scoring proposals for each - feature map. - - Args: - proposals (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A, 5). - All proposal predictions on the feature maps. - pred_objectness_logits (list[Tensor]): A list of L tensors. Tensor i has shape (N, Hi*Wi*A). - image_sizes (list[tuple]): sizes (h, w) for each image - nms_thresh (float): IoU threshold to use for NMS - pre_nms_topk (int): number of top k scoring proposals to keep before applying NMS. - When RRPN is run on multiple feature maps (as in FPN) this number is per - feature map. - post_nms_topk (int): number of top k scoring proposals to keep after applying NMS. - When RRPN is run on multiple feature maps (as in FPN) this number is total, - over all feature maps. - min_box_size(float): minimum proposal box side length in pixels (absolute units wrt - input images). - training (bool): True if proposals are to be used in training, otherwise False. - This arg exists only to support a legacy bug; look for the "NB: Legacy bug ..." - comment. - - Returns: - proposals (list[Instances]): list of N Instances. The i-th Instances - stores post_nms_topk object proposals for image i. - """ - num_images = len(image_sizes) - device = proposals[0].device - - # 1. Select top-k anchor for every level and every image - topk_scores = [] # #lvl Tensor, each of shape N x topk - topk_proposals = [] - level_ids = [] # #lvl Tensor, each of shape (topk,) - batch_idx = torch.arange(num_images, device=device) - for level_id, proposals_i, logits_i in zip( - itertools.count(), proposals, pred_objectness_logits - ): - Hi_Wi_A = logits_i.shape[1] - if isinstance(Hi_Wi_A, torch.Tensor): # it's a tensor in tracing - num_proposals_i = torch.clamp(Hi_Wi_A, max=pre_nms_topk) - else: - num_proposals_i = min(Hi_Wi_A, pre_nms_topk) - - topk_scores_i, topk_idx = logits_i.topk(num_proposals_i, dim=1) - - # each is N x topk - topk_proposals_i = proposals_i[batch_idx[:, None], topk_idx] # N x topk x 5 - - topk_proposals.append(topk_proposals_i) - topk_scores.append(topk_scores_i) - level_ids.append(torch.full((num_proposals_i,), level_id, dtype=torch.int64, device=device)) - - # 2. Concat all levels together - topk_scores = cat(topk_scores, dim=1) - topk_proposals = cat(topk_proposals, dim=1) - level_ids = cat(level_ids, dim=0) - - # 3. For each image, run a per-level NMS, and choose topk results. - results = [] - for n, image_size in enumerate(image_sizes): - boxes = RotatedBoxes(topk_proposals[n]) - scores_per_img = topk_scores[n] - valid_mask = torch.isfinite(boxes.tensor).all(dim=1) & torch.isfinite(scores_per_img) - if not valid_mask.all(): - boxes = boxes[valid_mask] - scores_per_img = scores_per_img[valid_mask] - boxes.clip(image_size) - - # filter empty boxes - keep = boxes.nonempty(threshold=min_box_size) - lvl = level_ids - if _is_tracing() or keep.sum().item() != len(boxes): - boxes, scores_per_img, lvl = (boxes[keep], scores_per_img[keep], level_ids[keep]) - - keep = batched_nms_rotated(boxes.tensor, scores_per_img, lvl, nms_thresh) - # In Detectron1, there was different behavior during training vs. testing. - # (https://github.com/facebookresearch/Detectron/issues/459) - # During training, topk is over the proposals from *all* images in the training batch. - # During testing, it is over the proposals for each image separately. 
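Step 1 above (per-level top-k) plus the `level_ids` bookkeeping is the core of the function: the level id is later handed to `batched_nms_rotated` as a category index, so boxes from different pyramid levels never suppress each other. A toy sketch of that selection, with two made-up levels of 10 and 6 anchors:

```
import torch

pre_nms_topk = 4
# Toy objectness logits for two feature levels (N=1 image).
logits_per_level = [torch.randn(1, 10), torch.randn(1, 6)]

topk_scores, level_ids = [], []
for level_id, logits in enumerate(logits_per_level):
    k = min(logits.shape[1], pre_nms_topk)   # a level may have < topk anchors
    scores, idx = logits.topk(k, dim=1)      # N x k, sorted descending
    topk_scores.append(scores)
    level_ids.append(torch.full((k,), level_id, dtype=torch.int64))

topk_scores = torch.cat(topk_scores, dim=1)  # N x (k0 + k1): candidates pooled
level_ids = torch.cat(level_ids, dim=0)      # category index for per-level NMS
assert topk_scores.shape == (1, 8)
assert level_ids.tolist() == [0, 0, 0, 0, 1, 1, 1, 1]
```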
- # As a result, the training behavior becomes batch-dependent, - # and the configuration "POST_NMS_TOPK_TRAIN" end up relying on the batch size. - # This bug is addressed in Detectron2 to make the behavior independent of batch size. - keep = keep[:post_nms_topk] - - res = Instances(image_size) - res.proposal_boxes = boxes[keep] - res.objectness_logits = scores_per_img[keep] - results.append(res) - return results - - -@PROPOSAL_GENERATOR_REGISTRY.register() -class RRPN(RPN): - """ - Rotated Region Proposal Network described in :paper:`RRPN`. - """ - - @configurable - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - if self.anchor_boundary_thresh >= 0: - raise NotImplementedError( - "anchor_boundary_thresh is a legacy option not implemented for RRPN." - ) - - @classmethod - def from_config(cls, cfg, input_shape: Dict[str, ShapeSpec]): - ret = super().from_config(cfg, input_shape) - ret["box2box_transform"] = Box2BoxTransformRotated(weights=cfg.MODEL.RPN.BBOX_REG_WEIGHTS) - return ret - - @torch.no_grad() - def label_and_sample_anchors(self, anchors: List[RotatedBoxes], gt_instances: List[Instances]): - """ - Args: - anchors (list[RotatedBoxes]): anchors for each feature map. - gt_instances: the ground-truth instances for each image. - - Returns: - list[Tensor]: - List of #img tensors. i-th element is a vector of labels whose length is - the total number of anchors across feature maps. Label values are in {-1, 0, 1}, - with meanings: -1 = ignore; 0 = negative class; 1 = positive class. - list[Tensor]: - i-th element is a Nx5 tensor, where N is the total number of anchors across - feature maps. The values are the matched gt boxes for each anchor. - Values are undefined for those anchors not labeled as 1. - """ - anchors = RotatedBoxes.cat(anchors) - - gt_boxes = [x.gt_boxes for x in gt_instances] - del gt_instances - - gt_labels = [] - matched_gt_boxes = [] - for gt_boxes_i in gt_boxes: - """ - gt_boxes_i: ground-truth boxes for i-th image - """ - match_quality_matrix = retry_if_cuda_oom(pairwise_iou_rotated)(gt_boxes_i, anchors) - matched_idxs, gt_labels_i = retry_if_cuda_oom(self.anchor_matcher)(match_quality_matrix) - # Matching is memory-expensive and may result in CPU tensors. 
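The matcher call above turns an IoU matrix into per-anchor labels. A stripped-down sketch of that thresholding, with hypothetical thresholds 0.3/0.7 standing in for the configured ones (the real `Matcher` additionally force-matches each gt's best anchor when `allow_low_quality_matches` is on, which is omitted here):

```
import torch

def label_anchors(iou_matrix, neg_thresh=0.3, pos_thresh=0.7):
    """iou_matrix: (num_gt, num_anchors). Returns the matched gt index and a
    label per anchor: 1 = positive, 0 = negative, -1 = ignored."""
    matched_vals, matched_idxs = iou_matrix.max(dim=0)  # best gt per anchor
    labels = torch.full_like(matched_vals, -1, dtype=torch.int8)
    labels[matched_vals < neg_thresh] = 0
    labels[matched_vals >= pos_thresh] = 1
    return matched_idxs, labels

iou = torch.tensor([[0.8, 0.1, 0.5],
                    [0.2, 0.6, 0.1]])  # 2 gt boxes x 3 anchors
idxs, labels = label_anchors(iou)
print(idxs.tolist(), labels.tolist())  # [0, 1, 0] [1, -1, -1]
```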
But the result is small - gt_labels_i = gt_labels_i.to(device=gt_boxes_i.device) - - # A vector of labels (-1, 0, 1) for each anchor - gt_labels_i = self._subsample_labels(gt_labels_i) - - if len(gt_boxes_i) == 0: - # These values won't be used anyway since the anchor is labeled as background - matched_gt_boxes_i = torch.zeros_like(anchors.tensor) - else: - # TODO wasted indexing computation for ignored boxes - matched_gt_boxes_i = gt_boxes_i[matched_idxs].tensor - - gt_labels.append(gt_labels_i) # N,AHW - matched_gt_boxes.append(matched_gt_boxes_i) - return gt_labels, matched_gt_boxes - - @torch.no_grad() - def predict_proposals(self, anchors, pred_objectness_logits, pred_anchor_deltas, image_sizes): - pred_proposals = self._decode_proposals(anchors, pred_anchor_deltas) - return find_top_rrpn_proposals( - pred_proposals, - pred_objectness_logits, - image_sizes, - self.nms_thresh, - self.pre_nms_topk[self.training], - self.post_nms_topk[self.training], - self.min_box_size, - self.training, - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py deleted file mode 100755 index d13e9c57..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/__init__.py +++ /dev/null @@ -1,29 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .box_head import ROI_BOX_HEAD_REGISTRY, build_box_head, FastRCNNConvFCHead -from .keypoint_head import ( - ROI_KEYPOINT_HEAD_REGISTRY, - build_keypoint_head, - BaseKeypointRCNNHead, - KRCNNConvDeconvUpsampleHead, -) -from .mask_head import ( - ROI_MASK_HEAD_REGISTRY, - build_mask_head, - BaseMaskRCNNHead, - MaskRCNNConvUpsampleHead, -) -from .roi_heads import ( - ROI_HEADS_REGISTRY, - ROIHeads, - Res5ROIHeads, - StandardROIHeads, - build_roi_heads, - select_foreground_proposals, -) -from .cascade_rcnn import CascadeROIHeads -from .rotated_fast_rcnn import RROIHeads -from .fast_rcnn import FastRCNNOutputLayers - -from . import cascade_rcnn # isort:skip - -__all__ = list(globals().keys()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py deleted file mode 100755 index 5d0370b0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/box_head.py +++ /dev/null @@ -1,118 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import List -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ShapeSpec, get_norm -from detectron2.utils.registry import Registry - -__all__ = ["FastRCNNConvFCHead", "build_box_head", "ROI_BOX_HEAD_REGISTRY"] - -ROI_BOX_HEAD_REGISTRY = Registry("ROI_BOX_HEAD") -ROI_BOX_HEAD_REGISTRY.__doc__ = """ -Registry for box heads, which make box predictions from per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). -@ROI_BOX_HEAD_REGISTRY.register() -class FastRCNNConvFCHead(nn.Sequential): - """ - A head with several 3x3 conv layers (each followed by norm & relu) and then - several fc layers (each followed by relu). 
- """ - - @configurable - def __init__( - self, input_shape: ShapeSpec, *, conv_dims: List[int], fc_dims: List[int], conv_norm="" - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature. - conv_dims (list[int]): the output dimensions of the conv layers - fc_dims (list[int]): the output dimensions of the fc layers - conv_norm (str or callable): normalization for the conv layers. - See :func:`detectron2.layers.get_norm` for supported types. - """ - super().__init__() - assert len(conv_dims) + len(fc_dims) > 0 - - self._output_size = (input_shape.channels, input_shape.height, input_shape.width) - - self.conv_norm_relus = [] - for k, conv_dim in enumerate(conv_dims): - conv = Conv2d( - self._output_size[0], - conv_dim, - kernel_size=3, - padding=1, - bias=not conv_norm, - norm=get_norm(conv_norm, conv_dim), - activation=nn.ReLU(), - ) - self.add_module("conv{}".format(k + 1), conv) - self.conv_norm_relus.append(conv) - self._output_size = (conv_dim, self._output_size[1], self._output_size[2]) - - self.fcs = [] - for k, fc_dim in enumerate(fc_dims): - if k == 0: - self.add_module("flatten", nn.Flatten()) - fc = nn.Linear(int(np.prod(self._output_size)), fc_dim) - self.add_module("fc{}".format(k + 1), fc) - self.add_module("fc_relu{}".format(k + 1), nn.ReLU()) - self.fcs.append(fc) - self._output_size = fc_dim - - for layer in self.conv_norm_relus: - weight_init.c2_msra_fill(layer) - for layer in self.fcs: - weight_init.c2_xavier_fill(layer) - - @classmethod - def from_config(cls, cfg, input_shape): - num_conv = cfg.MODEL.ROI_BOX_HEAD.NUM_CONV - conv_dim = cfg.MODEL.ROI_BOX_HEAD.CONV_DIM - num_fc = cfg.MODEL.ROI_BOX_HEAD.NUM_FC - fc_dim = cfg.MODEL.ROI_BOX_HEAD.FC_DIM - return { - "input_shape": input_shape, - "conv_dims": [conv_dim] * num_conv, - "fc_dims": [fc_dim] * num_fc, - "conv_norm": cfg.MODEL.ROI_BOX_HEAD.NORM, - } - - def forward(self, x): - for layer in self: - x = layer(x) - return x - - @property - @torch.jit.unused - def output_shape(self): - """ - Returns: - ShapeSpec: the output feature shape - """ - o = self._output_size - if isinstance(o, int): - return ShapeSpec(channels=o) - else: - return ShapeSpec(channels=o[0], height=o[1], width=o[2]) - - -def build_box_head(cfg, input_shape): - """ - Build a box head defined by `cfg.MODEL.ROI_BOX_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_BOX_HEAD.NAME - return ROI_BOX_HEAD_REGISTRY.get(name)(cfg, input_shape) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py deleted file mode 100755 index a0ca70fe..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/cascade_rcnn.py +++ /dev/null @@ -1,299 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-from typing import List -import torch -from torch import nn -from torch.autograd.function import Function - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from ..box_regression import Box2BoxTransform -from ..matcher import Matcher -from ..poolers import ROIPooler -from .box_head import build_box_head -from .fast_rcnn import FastRCNNOutputLayers, fast_rcnn_inference -from .roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads - - -class _ScaleGradient(Function): - @staticmethod - def forward(ctx, input, scale): - ctx.scale = scale - return input - - @staticmethod - def backward(ctx, grad_output): - return grad_output * ctx.scale, None - - -@ROI_HEADS_REGISTRY.register() -class CascadeROIHeads(StandardROIHeads): - """ - The ROI heads that implement :paper:`Cascade R-CNN`. - """ - - @configurable - def __init__( - self, - *, - box_in_features: List[str], - box_pooler: ROIPooler, - box_heads: List[nn.Module], - box_predictors: List[nn.Module], - proposal_matchers: List[Matcher], - **kwargs, - ): - """ - NOTE: this interface is experimental. - - Args: - box_pooler (ROIPooler): pooler that extracts region features from given boxes - box_heads (list[nn.Module]): box head for each cascade stage - box_predictors (list[nn.Module]): box predictor for each cascade stage - proposal_matchers (list[Matcher]): matcher with different IoU thresholds to - match boxes with ground truth for each stage. The first matcher matches - RPN proposals with ground truth, the other matchers use boxes predicted - by the previous stage as proposals and match them with ground truth. - """ - assert "proposal_matcher" not in kwargs, ( - "CascadeROIHeads takes 'proposal_matchers=' for each stage instead " - "of one 'proposal_matcher='." - ) - # The first matcher matches RPN proposals with ground truth, done in the base class - kwargs["proposal_matcher"] = proposal_matchers[0] - num_stages = self.num_cascade_stages = len(box_heads) - box_heads = nn.ModuleList(box_heads) - box_predictors = nn.ModuleList(box_predictors) - assert len(box_predictors) == num_stages, f"{len(box_predictors)} != {num_stages}!" - assert len(proposal_matchers) == num_stages, f"{len(proposal_matchers)} != {num_stages}!" - super().__init__( - box_in_features=box_in_features, - box_pooler=box_pooler, - box_head=box_heads, - box_predictor=box_predictors, - **kwargs, - ) - self.proposal_matchers = proposal_matchers - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - ret.pop("proposal_matcher") - return ret - - @classmethod - def _init_box_head(cls, cfg, input_shape): - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS - cascade_ious = cfg.MODEL.ROI_BOX_CASCADE_HEAD.IOUS - assert len(cascade_bbox_reg_weights) == len(cascade_ious) - assert cfg.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG, \ - "CascadeROIHeads only support class-agnostic regression now!" 
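`_ScaleGradient` is an identity in the forward pass and only rescales the gradient on the way back, which is how the cascade divides feature gradients by the number of stages. A self-contained check of that behavior:

```
import torch
from torch.autograd.function import Function

class ScaleGradient(Function):
    @staticmethod
    def forward(ctx, input, scale):
        ctx.scale = scale
        return input            # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output * ctx.scale, None  # scale only the gradient

x = torch.ones(3, requires_grad=True)
y = ScaleGradient.apply(x, 1.0 / 3)
y.sum().backward()
print(x.grad)  # tensor([0.3333, 0.3333, 0.3333])
```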
- assert cascade_ious[0] == cfg.MODEL.ROI_HEADS.IOU_THRESHOLDS[0] - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features] - # Check all channel counts are equal - assert len(set(in_channels)) == 1, in_channels - in_channels = in_channels[0] - - box_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - pooled_shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - - box_heads, box_predictors, proposal_matchers = [], [], [] - for match_iou, bbox_reg_weights in zip(cascade_ious, cascade_bbox_reg_weights): - box_head = build_box_head(cfg, pooled_shape) - box_heads.append(box_head) - box_predictors.append( - FastRCNNOutputLayers( - cfg, - box_head.output_shape, - box2box_transform=Box2BoxTransform(weights=bbox_reg_weights), - ) - ) - proposal_matchers.append(Matcher([match_iou], [0, 1], allow_low_quality_matches=False)) - return { - "box_in_features": in_features, - "box_pooler": box_pooler, - "box_heads": box_heads, - "box_predictors": box_predictors, - "proposal_matchers": proposal_matchers, - } - - def forward(self, images, features, proposals, targets=None): - del images - if self.training: - proposals = self.label_and_sample_proposals(proposals, targets) - - if self.training: - # Need targets to box head - losses = self._forward_box(features, proposals, targets) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - return pred_instances, {} - - def _forward_box(self, features, proposals, targets=None): - """ - Args: - features, targets: the same as in - Same as in :meth:`ROIHeads.forward`. - proposals (list[Instances]): the per-image object proposals with - their matching ground truth. - Each has fields "proposal_boxes", and "objectness_logits", - "gt_classes", "gt_boxes". - """ - features = [features[f] for f in self.box_in_features] - head_outputs = [] # (predictor, predictions, proposals) - prev_pred_boxes = None - image_sizes = [x.image_size for x in proposals] - for k in range(self.num_cascade_stages): - if k > 0: - # The output boxes of the previous stage are used to create the input - # proposals of the next stage. - proposals = self._create_proposals_from_boxes(prev_pred_boxes, image_sizes) - if self.training: - proposals = self._match_and_label_boxes(proposals, k, targets) - predictions = self._run_stage(features, proposals, k) - prev_pred_boxes = self.box_predictor[k].predict_boxes(predictions, proposals) - head_outputs.append((self.box_predictor[k], predictions, proposals)) - - if self.training: - losses = {} - storage = get_event_storage() - for stage, (predictor, predictions, proposals) in enumerate(head_outputs): - with storage.name_scope("stage{}".format(stage)): - stage_losses = predictor.losses(predictions, proposals) - losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()}) - return losses - else: - # Each is a list[Tensor] of length #image. 
Each tensor is Ri x (K+1) - scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs] - - # Average the scores across heads - scores = [ - sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages) - for scores_per_image in zip(*scores_per_stage) - ] - # Use the boxes of the last head - predictor, predictions, proposals = head_outputs[-1] - boxes = predictor.predict_boxes(predictions, proposals) - pred_instances, _ = fast_rcnn_inference( - boxes, - scores, - image_sizes, - predictor.test_score_thresh, - predictor.test_nms_thresh, - predictor.test_topk_per_image, - ) - return pred_instances - - @torch.no_grad() - def _match_and_label_boxes(self, proposals, stage, targets): - """ - Match proposals with groundtruth using the matcher at the given stage. - Label the proposals as foreground or background based on the match. - - Args: - proposals (list[Instances]): One Instances for each image, with - the field "proposal_boxes". - stage (int): the current stage - targets (list[Instances]): the ground truth instances - - Returns: - list[Instances]: the same proposals, but with fields "gt_classes" and "gt_boxes" - """ - num_fg_samples, num_bg_samples = [], [] - for proposals_per_image, targets_per_image in zip(proposals, targets): - match_quality_matrix = pairwise_iou( - targets_per_image.gt_boxes, proposals_per_image.proposal_boxes - ) - # proposal_labels are 0 or 1 - matched_idxs, proposal_labels = self.proposal_matchers[stage](match_quality_matrix) - if len(targets_per_image) > 0: - gt_classes = targets_per_image.gt_classes[matched_idxs] - # Label unmatched proposals (0 label from matcher) as background (label=num_classes) - gt_classes[proposal_labels == 0] = self.num_classes - gt_boxes = targets_per_image.gt_boxes[matched_idxs] - else: - gt_classes = torch.zeros_like(matched_idxs) + self.num_classes - gt_boxes = Boxes( - targets_per_image.gt_boxes.tensor.new_zeros((len(proposals_per_image), 4)) - ) - proposals_per_image.gt_classes = gt_classes - proposals_per_image.gt_boxes = gt_boxes - - num_fg_samples.append((proposal_labels == 1).sum().item()) - num_bg_samples.append(proposal_labels.numel() - num_fg_samples[-1]) - - # Log the number of fg/bg samples in each stage - storage = get_event_storage() - storage.put_scalar( - "stage{}/roi_head/num_fg_samples".format(stage), - sum(num_fg_samples) / len(num_fg_samples), - ) - storage.put_scalar( - "stage{}/roi_head/num_bg_samples".format(stage), - sum(num_bg_samples) / len(num_bg_samples), - ) - return proposals - - def _run_stage(self, features, proposals, stage): - """ - Args: - features (list[Tensor]): #lvl input features to ROIHeads - proposals (list[Instances]): #image Instances, with the field "proposal_boxes" - stage (int): the current stage - - Returns: - Same output as `FastRCNNOutputLayers.forward()`. - """ - box_features = self.box_pooler(features, [x.proposal_boxes for x in proposals]) - # The original implementation averages the losses among heads, - # but scale up the parameter gradients of the heads. - # This is equivalent to adding the losses among heads, - # but scale down the gradients on features. 
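At inference the cascade averages class probabilities over stages but keeps only the last stage's boxes. The `zip(*scores_per_stage)` regrouping used above is compact enough to deserve a toy illustration (the stage/image counts and the Ri x (K+1) shapes below are made up):

```
import torch

num_stages, num_images = 3, 2
# scores_per_stage[s][i]: (R_i, K+1) class probabilities from stage s, image i.
scores_per_stage = [
    [torch.rand(4, 5), torch.rand(6, 5)] for _ in range(num_stages)
]

# zip(*...) regroups by image; each image's scores are averaged over stages.
avg_scores = [
    sum(per_image) * (1.0 / num_stages)
    for per_image in zip(*scores_per_stage)
]
assert [s.shape for s in avg_scores] == [(4, 5), (6, 5)]
```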
- if self.training: - box_features = _ScaleGradient.apply(box_features, 1.0 / self.num_cascade_stages) - box_features = self.box_head[stage](box_features) - return self.box_predictor[stage](box_features) - - def _create_proposals_from_boxes(self, boxes, image_sizes): - """ - Args: - boxes (list[Tensor]): per-image predicted boxes, each of shape Ri x 4 - image_sizes (list[tuple]): list of image shapes in (h, w) - - Returns: - list[Instances]: per-image proposals with the given boxes. - """ - # Just like RPN, the proposals should not have gradients - boxes = [Boxes(b.detach()) for b in boxes] - proposals = [] - for boxes_per_image, image_size in zip(boxes, image_sizes): - boxes_per_image.clip(image_size) - if self.training: - # do not filter empty boxes at inference time, - # because the scores from each stage need to be aligned and added later - boxes_per_image = boxes_per_image[boxes_per_image.nonempty()] - prop = Instances(image_size) - prop.proposal_boxes = boxes_per_image - proposals.append(prop) - return proposals diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py deleted file mode 100755 index 42eba210..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/fast_rcnn.py +++ /dev/null @@ -1,462 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -from typing import Dict, List, Tuple, Union -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, batched_nms, cat, cross_entropy, nonzero_tuple -from detectron2.modeling.box_regression import Box2BoxTransform, _dense_box_regression_loss -from detectron2.structures import Boxes, Instances -from detectron2.utils.events import get_event_storage - -__all__ = ["fast_rcnn_inference", "FastRCNNOutputLayers"] - - -logger = logging.getLogger(__name__) - -""" -Shape shorthand in this module: - - N: number of images in the minibatch - R: number of ROIs, combined over all images, in the minibatch - Ri: number of ROIs in image i - K: number of foreground classes. E.g.,there are 80 foreground classes in COCO. - -Naming convention: - - deltas: refers to the 4-d (dx, dy, dw, dh) deltas that parameterize the box2box - transform (see :class:`box_regression.Box2BoxTransform`). - - pred_class_logits: predicted class scores in [-inf, +inf]; use - softmax(pred_class_logits) to estimate P(class). - - gt_classes: ground-truth classification labels in [0, K], where [0, K) represent - foreground object classes and K represents the background class. - - pred_proposal_deltas: predicted box2box transform deltas for transforming proposals - to detection box predictions. - - gt_proposal_deltas: ground-truth box2box transform deltas -""" - - -def fast_rcnn_inference( - boxes: List[torch.Tensor], - scores: List[torch.Tensor], - image_shapes: List[Tuple[int, int]], - score_thresh: float, - nms_thresh: float, - topk_per_image: int, -): - """ - Call `fast_rcnn_inference_single_image` for all images. - - Args: - boxes (list[Tensor]): A list of Tensors of predicted class-specific or class-agnostic - boxes for each image. Element i has shape (Ri, K * 4) if doing - class-specific regression, or (Ri, 4) if doing class-agnostic - regression, where Ri is the number of predicted objects for image i. - This is compatible with the output of :meth:`FastRCNNOutputLayers.predict_boxes`. 
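The "deltas" named in the shape shorthand above come from the encode direction of the box2box transform. A rough sketch of that encoding for axis-aligned boxes, again without the learned per-coordinate weights of the real `Box2BoxTransform` (the helper name `get_deltas` here just mirrors the convention; the numbers are invented):

```
import torch

def get_deltas(src_boxes, target_boxes):
    # The (dx, dy, dw, dh) that map each src box onto its target box.
    src_w = src_boxes[:, 2] - src_boxes[:, 0]
    src_h = src_boxes[:, 3] - src_boxes[:, 1]
    src_cx = src_boxes[:, 0] + 0.5 * src_w
    src_cy = src_boxes[:, 1] + 0.5 * src_h

    tgt_w = target_boxes[:, 2] - target_boxes[:, 0]
    tgt_h = target_boxes[:, 3] - target_boxes[:, 1]
    tgt_cx = target_boxes[:, 0] + 0.5 * tgt_w
    tgt_cy = target_boxes[:, 1] + 0.5 * tgt_h

    return torch.stack(
        [
            (tgt_cx - src_cx) / src_w,
            (tgt_cy - src_cy) / src_h,
            torch.log(tgt_w / src_w),
            torch.log(tgt_h / src_h),
        ],
        dim=1,
    )

src = torch.tensor([[0.0, 0.0, 10.0, 10.0]])
tgt = torch.tensor([[5.0, 5.0, 15.0, 15.0]])
print(get_deltas(src, tgt))  # dx = dy = 0.5, dw = dh = 0
```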
- scores (list[Tensor]): A list of Tensors of predicted class scores for each image. - Element i has shape (Ri, K + 1), where Ri is the number of predicted objects - for image i. Compatible with the output of :meth:`FastRCNNOutputLayers.predict_probs`. - image_shapes (list[tuple]): A list of (width, height) tuples for each image in the batch. - score_thresh (float): Only return detections with a confidence score exceeding this - threshold. - nms_thresh (float): The threshold to use for box non-maximum suppression. Value in [0, 1]. - topk_per_image (int): The number of top scoring detections to return. Set < 0 to return - all detections. - - Returns: - instances: (list[Instances]): A list of N instances, one for each image in the batch, - that stores the topk most confidence detections. - kept_indices: (list[Tensor]): A list of 1D tensor of length of N, each element indicates - the corresponding boxes/scores index in [0, Ri) from the input, for image i. - """ - result_per_image = [ - fast_rcnn_inference_single_image( - boxes_per_image, scores_per_image, image_shape, score_thresh, nms_thresh, topk_per_image - ) - for scores_per_image, boxes_per_image, image_shape in zip(scores, boxes, image_shapes) - ] - return [x[0] for x in result_per_image], [x[1] for x in result_per_image] - - -def _log_classification_stats(pred_logits, gt_classes, prefix="fast_rcnn"): - """ - Log the classification metrics to EventStorage. - - Args: - pred_logits: Rx(K+1) logits. The last column is for background class. - gt_classes: R labels - """ - num_instances = gt_classes.numel() - if num_instances == 0: - return - pred_classes = pred_logits.argmax(dim=1) - bg_class_ind = pred_logits.shape[1] - 1 - - fg_inds = (gt_classes >= 0) & (gt_classes < bg_class_ind) - num_fg = fg_inds.nonzero().numel() - fg_gt_classes = gt_classes[fg_inds] - fg_pred_classes = pred_classes[fg_inds] - - num_false_negative = (fg_pred_classes == bg_class_ind).nonzero().numel() - num_accurate = (pred_classes == gt_classes).nonzero().numel() - fg_num_accurate = (fg_pred_classes == fg_gt_classes).nonzero().numel() - - storage = get_event_storage() - storage.put_scalar(f"{prefix}/cls_accuracy", num_accurate / num_instances) - if num_fg > 0: - storage.put_scalar(f"{prefix}/fg_cls_accuracy", fg_num_accurate / num_fg) - storage.put_scalar(f"{prefix}/false_negative", num_false_negative / num_fg) - - -def fast_rcnn_inference_single_image( - boxes, - scores, - image_shape: Tuple[int, int], - score_thresh: float, - nms_thresh: float, - topk_per_image: int, -): - """ - Single-image inference. Return bounding-box detection results by thresholding - on scores and applying non-maximum suppression (NMS). - - Args: - Same as `fast_rcnn_inference`, but with boxes, scores, and image shapes - per image. - - Returns: - Same as `fast_rcnn_inference`, but for only one image. - """ - valid_mask = torch.isfinite(boxes).all(dim=1) & torch.isfinite(scores).all(dim=1) - if not valid_mask.all(): - boxes = boxes[valid_mask] - scores = scores[valid_mask] - - scores = scores[:, :-1] - num_bbox_reg_classes = boxes.shape[1] // 4 - # Convert to Boxes to use the `clip` function ... - boxes = Boxes(boxes.reshape(-1, 4)) - boxes.clip(image_shape) - boxes = boxes.tensor.view(-1, num_bbox_reg_classes, 4) # R x C x 4 - - # 1. Filter results based on detection scores. It can make NMS more efficient - # by filtering out low-confidence detections. - filter_mask = scores > score_thresh # R x K - # R' x 2. 
First column contains indices of the R predictions; - # Second column contains indices of classes. - filter_inds = filter_mask.nonzero() - if num_bbox_reg_classes == 1: - boxes = boxes[filter_inds[:, 0], 0] - else: - boxes = boxes[filter_mask] - scores = scores[filter_mask] - - # 2. Apply NMS for each class independently. - keep = batched_nms(boxes, scores, filter_inds[:, 1], nms_thresh) - if topk_per_image >= 0: - keep = keep[:topk_per_image] - boxes, scores, filter_inds = boxes[keep], scores[keep], filter_inds[keep] - - result = Instances(image_shape) - result.pred_boxes = Boxes(boxes) - result.scores = scores - result.pred_classes = filter_inds[:, 1] - return result, filter_inds[:, 0] - - -class FastRCNNOutputLayers(nn.Module): - """ - Two linear layers for predicting Fast R-CNN outputs: - - 1. proposal-to-detection box regression deltas - 2. classification scores - """ - - @configurable - def __init__( - self, - input_shape: ShapeSpec, - *, - box2box_transform, - num_classes: int, - test_score_thresh: float = 0.0, - test_nms_thresh: float = 0.5, - test_topk_per_image: int = 100, - cls_agnostic_bbox_reg: bool = False, - smooth_l1_beta: float = 0.0, - box_reg_loss_type: str = "smooth_l1", - loss_weight: Union[float, Dict[str, float]] = 1.0, - ): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature to this module - box2box_transform (Box2BoxTransform or Box2BoxTransformRotated): - num_classes (int): number of foreground classes - test_score_thresh (float): threshold to filter predictions results. - test_nms_thresh (float): NMS threshold for prediction results. - test_topk_per_image (int): number of top predictions to produce per image. - cls_agnostic_bbox_reg (bool): whether to use class agnostic for bbox regression - smooth_l1_beta (float): transition point from L1 to L2 loss. Only used if - `box_reg_loss_type` is "smooth_l1" - box_reg_loss_type (str): Box regression loss type. One of: "smooth_l1", "giou", - "diou", "ciou" - loss_weight (float|dict): weights to use for losses. Can be single float for weighting - all losses, or a dict of individual weightings. 
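The filter-then-NMS dance in `fast_rcnn_inference_single_image` is easiest to see on concrete numbers. The sketch below mirrors it with `torchvision.ops.batched_nms` (detectron2's `batched_nms` implements the same idea); the boxes and scores are made up:

```
import torch
from torchvision.ops import batched_nms

# R = 3 proposals, K = 2 foreground classes (background column already dropped).
scores = torch.tensor([[0.9, 0.05],
                       [0.8, 0.60],
                       [0.1, 0.70]])
boxes = torch.tensor([[0.0, 0.0, 10.0, 10.0],
                      [1.0, 1.0, 11.0, 11.0],
                      [50.0, 50.0, 60.0, 60.0]])

score_thresh, nms_thresh = 0.5, 0.5
filter_mask = scores > score_thresh          # R x K
filter_inds = filter_mask.nonzero()          # rows: (proposal idx, class idx)
kept_boxes = boxes[filter_inds[:, 0]]
kept_scores = scores[filter_mask]            # row-major, matches nonzero order

# batched_nms treats the class index as a category, so NMS runs per class:
# box 1 is suppressed by box 0 for class 0, but survives as a class-1 hit.
keep = batched_nms(kept_boxes, kept_scores, filter_inds[:, 1], nms_thresh)
print(filter_inds[keep])   # surviving (proposal, class) pairs
```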
Valid dict keys are: - * "loss_cls": applied to classification loss - * "loss_box_reg": applied to box regression loss - """ - super().__init__() - if isinstance(input_shape, int): # some backward compatibility - input_shape = ShapeSpec(channels=input_shape) - self.num_classes = num_classes - input_size = input_shape.channels * (input_shape.width or 1) * (input_shape.height or 1) - # prediction layer for num_classes foreground classes and one background class (hence + 1) - self.cls_score = nn.Linear(input_size, num_classes + 1) - num_bbox_reg_classes = 1 if cls_agnostic_bbox_reg else num_classes - box_dim = len(box2box_transform.weights) - self.bbox_pred = nn.Linear(input_size, num_bbox_reg_classes * box_dim) - - nn.init.normal_(self.cls_score.weight, std=0.01) - nn.init.normal_(self.bbox_pred.weight, std=0.001) - for l in [self.cls_score, self.bbox_pred]: - nn.init.constant_(l.bias, 0) - - self.box2box_transform = box2box_transform - self.smooth_l1_beta = smooth_l1_beta - self.test_score_thresh = test_score_thresh - self.test_nms_thresh = test_nms_thresh - self.test_topk_per_image = test_topk_per_image - self.box_reg_loss_type = box_reg_loss_type - if isinstance(loss_weight, float): - loss_weight = {"loss_cls": loss_weight, "loss_box_reg": loss_weight} - self.loss_weight = loss_weight - - @classmethod - def from_config(cls, cfg, input_shape): - return { - "input_shape": input_shape, - "box2box_transform": Box2BoxTransform(weights=cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS), - # fmt: off - "num_classes" : cfg.MODEL.ROI_HEADS.NUM_CLASSES, - "cls_agnostic_bbox_reg" : cfg.MODEL.ROI_BOX_HEAD.CLS_AGNOSTIC_BBOX_REG, - "smooth_l1_beta" : cfg.MODEL.ROI_BOX_HEAD.SMOOTH_L1_BETA, - "test_score_thresh" : cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST, - "test_nms_thresh" : cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST, - "test_topk_per_image" : cfg.TEST.DETECTIONS_PER_IMAGE, - "box_reg_loss_type" : cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_TYPE, - "loss_weight" : {"loss_box_reg": cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_LOSS_WEIGHT}, - # fmt: on - } - - def forward(self, x): - """ - Args: - x: per-region features of shape (N, ...) for N bounding boxes to predict. - - Returns: - (Tensor, Tensor): - First tensor: shape (N,K+1), scores for each of the N box. Each row contains the - scores for K object categories and 1 background class. - - Second tensor: bounding box regression deltas for each box. Shape is shape (N,Kx4), - or (N,4) for class-agnostic regression. - """ - if x.dim() > 2: - x = torch.flatten(x, start_dim=1) - scores = self.cls_score(x) - proposal_deltas = self.bbox_pred(x) - return scores, proposal_deltas - - def losses(self, predictions, proposals): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were used - to compute predictions. The fields ``proposal_boxes``, ``gt_boxes``, - ``gt_classes`` are expected. - - Returns: - Dict[str, Tensor]: dict of losses - """ - scores, proposal_deltas = predictions - - # parse classification outputs - gt_classes = ( - cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0) - ) - _log_classification_stats(scores, gt_classes) - - # parse box regression outputs - if len(proposals): - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) # Nx4 - assert not proposal_boxes.requires_grad, "Proposals should not require gradients!" - # If "gt_boxes" does not exist, the proposals must be all negative and - # should not be included in regression loss computation. 
- # Here we just use proposal_boxes as an arbitrary placeholder because its - # value won't be used in self.box_reg_loss(). - gt_boxes = cat( - [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals], - dim=0, - ) - else: - proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device) - - losses = { - "loss_cls": cross_entropy(scores, gt_classes, reduction="mean"), - "loss_box_reg": self.box_reg_loss( - proposal_boxes, gt_boxes, proposal_deltas, gt_classes - ), - } - return {k: v * self.loss_weight.get(k, 1.0) for k, v in losses.items()} - - def box_reg_loss(self, proposal_boxes, gt_boxes, pred_deltas, gt_classes): - """ - Args: - proposal_boxes/gt_boxes are tensors with the same shape (R, 4 or 5). - pred_deltas has shape (R, 4 or 5), or (R, num_classes * (4 or 5)). - gt_classes is a long tensor of shape R, the gt class label of each proposal. - R shall be the number of proposals. - """ - box_dim = proposal_boxes.shape[1] # 4 or 5 - # Regression loss is only computed for foreground proposals (those matched to a GT) - fg_inds = nonzero_tuple((gt_classes >= 0) & (gt_classes < self.num_classes))[0] - if pred_deltas.shape[1] == box_dim: # cls-agnostic regression - fg_pred_deltas = pred_deltas[fg_inds] - else: - fg_pred_deltas = pred_deltas.view(-1, self.num_classes, box_dim)[ - fg_inds, gt_classes[fg_inds] - ] - - loss_box_reg = _dense_box_regression_loss( - [proposal_boxes[fg_inds]], - self.box2box_transform, - [fg_pred_deltas.unsqueeze(0)], - [gt_boxes[fg_inds]], - ..., - self.box_reg_loss_type, - self.smooth_l1_beta, - ) - - # The reg loss is normalized using the total number of regions (R), not the number - # of foreground regions even though the box regression loss is only defined on - # foreground regions. Why? Because doing so gives equal training influence to - # each foreground example. To see how, consider two different minibatches: - # (1) Contains a single foreground region - # (2) Contains 100 foreground regions - # If we normalize by the number of foreground regions, the single example in - # minibatch (1) will be given 100 times as much influence as each foreground - # example in minibatch (2). Normalizing by the total number of regions, R, - # means that the single example in minibatch (1) and each of the 100 examples - # in minibatch (2) are given equal influence. - return loss_box_reg / max(gt_classes.numel(), 1.0) # return 0 if empty - - def inference(self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances]): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. The ``proposal_boxes`` field is expected. - - Returns: - list[Instances]: same as `fast_rcnn_inference`. - list[Tensor]: same as `fast_rcnn_inference`. - """ - boxes = self.predict_boxes(predictions, proposals) - scores = self.predict_probs(predictions, proposals) - image_shapes = [x.image_size for x in proposals] - return fast_rcnn_inference( - boxes, - scores, - image_shapes, - self.test_score_thresh, - self.test_nms_thresh, - self.test_topk_per_image, - ) - - def predict_boxes_for_gt_classes(self, predictions, proposals): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were used - to compute predictions. The fields ``proposal_boxes``, ``gt_classes`` are expected. 
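The normalization rationale above is worth pinning down on a tiny example: the smooth-L1 sum runs over foreground proposals only, then is divided by the total number of proposals R. The deltas below are random placeholders (real targets come from `box2box_transform.get_deltas`), and `beta=0.0` reduces smooth L1 to plain L1, matching this class's default:

```
import torch
from torch.nn import functional as F

num_classes = 3                               # K foreground classes
gt_classes = torch.tensor([0, 2, 3, 3, 1])    # label K (= 3) is background
pred_deltas = torch.randn(5, 4)               # class-agnostic predictions
gt_deltas = torch.randn(5, 4)                 # placeholder regression targets

# The loss is summed over foreground proposals only...
fg = (gt_classes >= 0) & (gt_classes < num_classes)
loss = F.smooth_l1_loss(pred_deltas[fg], gt_deltas[fg], reduction="sum", beta=0.0)

# ...but normalized by the TOTAL number of proposals R, so each foreground
# example carries the same weight no matter how many its minibatch contains.
loss = loss / max(gt_classes.numel(), 1.0)
```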
- - Returns: - list[Tensor]: - A list of Tensors of predicted boxes for GT classes in case of - class-specific box head. Element i of the list has shape (Ri, B), where Ri is - the number of proposals for image i and B is the box dimension (4 or 5) - """ - if not len(proposals): - return [] - scores, proposal_deltas = predictions - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) - N, B = proposal_boxes.shape - predict_boxes = self.box2box_transform.apply_deltas( - proposal_deltas, proposal_boxes - ) # Nx(KxB) - - K = predict_boxes.shape[1] // B - if K > 1: - gt_classes = torch.cat([p.gt_classes for p in proposals], dim=0) - # Some proposals are ignored or have a background class. Their gt_classes - # cannot be used as index. - gt_classes = gt_classes.clamp_(0, K - 1) - - predict_boxes = predict_boxes.view(N, K, B)[ - torch.arange(N, dtype=torch.long, device=predict_boxes.device), gt_classes - ] - num_prop_per_image = [len(p) for p in proposals] - return predict_boxes.split(num_prop_per_image) - - def predict_boxes( - self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances] - ): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. The ``proposal_boxes`` field is expected. - - Returns: - list[Tensor]: - A list of Tensors of predicted class-specific or class-agnostic boxes - for each image. Element i has shape (Ri, K * B) or (Ri, B), where Ri is - the number of proposals for image i and B is the box dimension (4 or 5) - """ - if not len(proposals): - return [] - _, proposal_deltas = predictions - num_prop_per_image = [len(p) for p in proposals] - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) - predict_boxes = self.box2box_transform.apply_deltas( - proposal_deltas, - proposal_boxes, - ) # Nx(KxB) - return predict_boxes.split(num_prop_per_image) - - def predict_probs( - self, predictions: Tuple[torch.Tensor, torch.Tensor], proposals: List[Instances] - ): - """ - Args: - predictions: return values of :meth:`forward()`. - proposals (list[Instances]): proposals that match the features that were - used to compute predictions. - - Returns: - list[Tensor]: - A list of Tensors of predicted class probabilities for each image. - Element i has shape (Ri, K + 1), where Ri is the number of proposals for image i. - """ - scores, _ = predictions - num_inst_per_image = [len(p) for p in proposals] - probs = F.softmax(scores, dim=-1) - return probs.split(num_inst_per_image, dim=0) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py deleted file mode 100755 index e0acc138..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/keypoint_head.py +++ /dev/null @@ -1,272 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
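Back in `FastRCNNOutputLayers`, the `view(N, K, B)` plus advanced indexing in `predict_boxes_for_gt_classes`, and the `split(num_prop_per_image)` used by all the `predict_*` methods, look like this on toy data (all sizes invented):

```
import torch

N, K, B = 4, 3, 4                      # proposals, classes, box dimension
predict_boxes = torch.randn(N, K * B)  # one decoded box per class per proposal
gt_classes = torch.tensor([0, 2, 1, 2])

# Pick, for each proposal, the box belonging to its (clamped) gt class.
selected = predict_boxes.view(N, K, B)[torch.arange(N), gt_classes]
assert selected.shape == (N, B)

# Split the flat per-batch tensor back into per-image chunks.
num_prop_per_image = [1, 3]            # image 0 had 1 proposal, image 1 had 3
per_image = selected.split(num_prop_per_image)
assert [t.shape[0] for t in per_image] == num_prop_per_image
```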
-from typing import List -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ConvTranspose2d, cat, interpolate -from detectron2.structures import Instances, heatmaps_to_keypoints -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -_TOTAL_SKIPPED = 0 - - -__all__ = [ - "ROI_KEYPOINT_HEAD_REGISTRY", - "build_keypoint_head", - "BaseKeypointRCNNHead", - "KRCNNConvDeconvUpsampleHead", -] - - -ROI_KEYPOINT_HEAD_REGISTRY = Registry("ROI_KEYPOINT_HEAD") -ROI_KEYPOINT_HEAD_REGISTRY.__doc__ = """ -Registry for keypoint heads, which make keypoint predictions from per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -def build_keypoint_head(cfg, input_shape): - """ - Build a keypoint head from `cfg.MODEL.ROI_KEYPOINT_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_KEYPOINT_HEAD.NAME - return ROI_KEYPOINT_HEAD_REGISTRY.get(name)(cfg, input_shape) - - -def keypoint_rcnn_loss(pred_keypoint_logits, instances, normalizer): - """ - Arguments: - pred_keypoint_logits (Tensor): A tensor of shape (N, K, S, S) where N is the total number - of instances in the batch, K is the number of keypoints, and S is the side length - of the keypoint heatmap. The values are spatial logits. - instances (list[Instances]): A list of M Instances, where M is the batch size. - These instances are predictions from the model - that are in 1:1 correspondence with pred_keypoint_logits. - Each Instances should contain a `gt_keypoints` field containing a `structures.Keypoint` - instance. - normalizer (float): Normalize the loss by this amount. - If not specified, we normalize by the number of visible keypoints in the minibatch. - - Returns a scalar tensor containing the loss. - """ - heatmaps = [] - valid = [] - - keypoint_side_len = pred_keypoint_logits.shape[2] - for instances_per_image in instances: - if len(instances_per_image) == 0: - continue - keypoints = instances_per_image.gt_keypoints - heatmaps_per_image, valid_per_image = keypoints.to_heatmap( - instances_per_image.proposal_boxes.tensor, keypoint_side_len - ) - heatmaps.append(heatmaps_per_image.view(-1)) - valid.append(valid_per_image.view(-1)) - - if len(heatmaps): - keypoint_targets = cat(heatmaps, dim=0) - valid = cat(valid, dim=0).to(dtype=torch.uint8) - valid = torch.nonzero(valid).squeeze(1) - - # torch.mean (in binary_cross_entropy_with_logits) doesn't - # accept empty tensors, so handle it separately - if len(heatmaps) == 0 or valid.numel() == 0: - global _TOTAL_SKIPPED - _TOTAL_SKIPPED += 1 - storage = get_event_storage() - storage.put_scalar("kpts_num_skipped_batches", _TOTAL_SKIPPED, smoothing_hint=False) - return pred_keypoint_logits.sum() * 0 - - N, K, H, W = pred_keypoint_logits.shape - pred_keypoint_logits = pred_keypoint_logits.view(N * K, H * W) - - keypoint_loss = F.cross_entropy( - pred_keypoint_logits[valid], keypoint_targets[valid], reduction="sum" - ) - - # If a normalizer isn't specified, normalize by the number of visible keypoints in the minibatch - if normalizer is None: - normalizer = valid.numel() - keypoint_loss /= normalizer - - return keypoint_loss - - -def keypoint_rcnn_inference(pred_keypoint_logits: torch.Tensor, pred_instances: List[Instances]): - """ - Post process each predicted keypoint heatmap in `pred_keypoint_logits` into (x, y, score) - and add it to the `pred_instances` as a `pred_keypoints` field. 
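`keypoint_rcnn_loss` treats each keypoint as an (S*S)-way classification over heatmap positions, summed over visible keypoints only. A toy version, with random indices and a synthetic visibility pattern standing in for the `keypoints.to_heatmap(...)` output:

```
import torch
from torch.nn import functional as F

N, K, S = 2, 17, 56                 # instances, keypoints, heatmap side length
pred_logits = torch.randn(N, K, S, S)

# Stand-ins for to_heatmap(...): a flat target pixel index per keypoint,
# and a visibility flag selecting which keypoints contribute to the loss.
keypoint_targets = torch.randint(0, S * S, (N * K,))
visible = torch.arange(N * K) % 3 != 0          # pretend 1 in 3 is invisible
valid = visible.nonzero().squeeze(1)

# Each keypoint is an (S*S)-way classification over heatmap positions.
logits = pred_logits.view(N * K, S * S)
loss = F.cross_entropy(logits[valid], keypoint_targets[valid], reduction="sum")
loss = loss / max(valid.numel(), 1)             # normalize by #visible keypoints
print(loss.item())
```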
- - Args: - pred_keypoint_logits (Tensor): A tensor of shape (R, K, S, S) where R is the total number - of instances in the batch, K is the number of keypoints, and S is the side length of - the keypoint heatmap. The values are spatial logits. - pred_instances (list[Instances]): A list of N Instances, where N is the number of images. - - Returns: - None. Each element in pred_instances will contain extra "pred_keypoints" and - "pred_keypoint_heatmaps" fields. "pred_keypoints" is a tensor of shape - (#instance, K, 3) where the last dimension corresponds to (x, y, score). - The scores are larger than 0. "pred_keypoint_heatmaps" contains the raw - keypoint logits as passed to this function. - """ - # flatten all bboxes from all images together (list[Boxes] -> Rx4 tensor) - bboxes_flat = cat([b.pred_boxes.tensor for b in pred_instances], dim=0) - - pred_keypoint_logits = pred_keypoint_logits.detach() - keypoint_results = heatmaps_to_keypoints(pred_keypoint_logits, bboxes_flat.detach()) - num_instances_per_image = [len(i) for i in pred_instances] - keypoint_results = keypoint_results[:, :, [0, 1, 3]].split(num_instances_per_image, dim=0) - heatmap_results = pred_keypoint_logits.split(num_instances_per_image, dim=0) - - for keypoint_results_per_image, heatmap_results_per_image, instances_per_image in zip( - keypoint_results, heatmap_results, pred_instances - ): - # keypoint_results_per_image is (num instances)x(num keypoints)x(x, y, score) - # heatmap_results_per_image is (num instances)x(num keypoints)x(side)x(side) - instances_per_image.pred_keypoints = keypoint_results_per_image - instances_per_image.pred_keypoint_heatmaps = heatmap_results_per_image - - -class BaseKeypointRCNNHead(nn.Module): - """ - Implement the basic Keypoint R-CNN losses and inference logic described in - Sec. 5 of :paper:`Mask R-CNN`. - """ - - @configurable - def __init__(self, *, num_keypoints, loss_weight=1.0, loss_normalizer=1.0): - """ - NOTE: this interface is experimental. - - Args: - num_keypoints (int): number of keypoints to predict - loss_weight (float): weight to multiple on the keypoint loss - loss_normalizer (float or str): - If float, divide the loss by `loss_normalizer * #images`. - If 'visible', the loss is normalized by the total number of - visible keypoints across images. - """ - super().__init__() - self.num_keypoints = num_keypoints - self.loss_weight = loss_weight - assert loss_normalizer == "visible" or isinstance(loss_normalizer, float), loss_normalizer - self.loss_normalizer = loss_normalizer - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - "loss_weight": cfg.MODEL.ROI_KEYPOINT_HEAD.LOSS_WEIGHT, - "num_keypoints": cfg.MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS, - } - normalize_by_visible = ( - cfg.MODEL.ROI_KEYPOINT_HEAD.NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS - ) # noqa - if not normalize_by_visible: - batch_size_per_image = cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE - positive_sample_fraction = cfg.MODEL.ROI_HEADS.POSITIVE_FRACTION - ret["loss_normalizer"] = ( - ret["num_keypoints"] * batch_size_per_image * positive_sample_fraction - ) - else: - ret["loss_normalizer"] = "visible" - return ret - - def forward(self, x, instances: List[Instances]): - """ - Args: - x: input 4D region feature(s) provided by :class:`ROIHeads`. - instances (list[Instances]): contains the boxes & labels corresponding - to the input features. - Exact format is up to its caller to decide. - Typically, this is the foreground instances in training, with - "proposal_boxes" field and other gt annotations. 
- In inference, it contains boxes that are already predicted. - - Returns: - A dict of losses if in training. The predicted "instances" if in inference. - """ - x = self.layers(x) - if self.training: - num_images = len(instances) - normalizer = ( - None if self.loss_normalizer == "visible" else num_images * self.loss_normalizer - ) - return { - "loss_keypoint": keypoint_rcnn_loss(x, instances, normalizer=normalizer) - * self.loss_weight - } - else: - keypoint_rcnn_inference(x, instances) - return instances - - def layers(self, x): - """ - Neural network layers that makes predictions from regional input features. - """ - raise NotImplementedError - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). -@ROI_KEYPOINT_HEAD_REGISTRY.register() -class KRCNNConvDeconvUpsampleHead(BaseKeypointRCNNHead, nn.Sequential): - """ - A standard keypoint head containing a series of 3x3 convs, followed by - a transpose convolution and bilinear interpolation for upsampling. - It is described in Sec. 5 of :paper:`Mask R-CNN`. - """ - - @configurable - def __init__(self, input_shape, *, num_keypoints, conv_dims, **kwargs): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature - conv_dims: an iterable of output channel counts for each conv in the head - e.g. (512, 512, 512) for three convs outputting 512 channels. - """ - super().__init__(num_keypoints=num_keypoints, **kwargs) - - # default up_scale to 2.0 (this can be made an option) - up_scale = 2.0 - in_channels = input_shape.channels - - for idx, layer_channels in enumerate(conv_dims, 1): - module = Conv2d(in_channels, layer_channels, 3, stride=1, padding=1) - self.add_module("conv_fcn{}".format(idx), module) - self.add_module("conv_fcn_relu{}".format(idx), nn.ReLU()) - in_channels = layer_channels - - deconv_kernel = 4 - self.score_lowres = ConvTranspose2d( - in_channels, num_keypoints, deconv_kernel, stride=2, padding=deconv_kernel // 2 - 1 - ) - self.up_scale = up_scale - - for name, param in self.named_parameters(): - if "bias" in name: - nn.init.constant_(param, 0) - elif "weight" in name: - # Caffe2 implementation uses MSRAFill, which in fact - # corresponds to kaiming_normal_ in PyTorch - nn.init.kaiming_normal_(param, mode="fan_out", nonlinearity="relu") - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - ret["input_shape"] = input_shape - ret["conv_dims"] = cfg.MODEL.ROI_KEYPOINT_HEAD.CONV_DIMS - return ret - - def layers(self, x): - for layer in self: - x = layer(x) - x = interpolate(x, scale_factor=self.up_scale, mode="bilinear", align_corners=False) - return x diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py deleted file mode 100755 index 5ac5c4b9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/mask_head.py +++ /dev/null @@ -1,292 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
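In `KRCNNConvDeconvUpsampleHead`, the 3x3 convs keep the spatial size, the 4x4 stride-2 deconv (padding = kernel // 2 - 1 = 1) doubles it, and the bilinear `interpolate` with `up_scale = 2.0` doubles it again, for 4x overall. A shape check with hypothetical sizes (14x14 input, 17 keypoints, 512 conv channels):

```
import torch
from torch import nn
from torch.nn import functional as F

in_channels, num_keypoints, side = 256, 17, 14
conv = nn.Conv2d(in_channels, 512, 3, stride=1, padding=1)               # keeps H x W
deconv = nn.ConvTranspose2d(512, num_keypoints, 4, stride=2, padding=1)  # 2x upsample

x = torch.randn(1, in_channels, side, side)
x = F.relu(conv(x))
x = deconv(x)                                           # side -> 2 * side
x = F.interpolate(x, scale_factor=2.0, mode="bilinear", align_corners=False)
assert x.shape == (1, num_keypoints, 4 * side, 4 * side)  # 14 -> 56
```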
-from typing import List -import fvcore.nn.weight_init as weight_init -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Conv2d, ConvTranspose2d, ShapeSpec, cat, get_norm -from detectron2.structures import Instances -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -__all__ = [ - "BaseMaskRCNNHead", - "MaskRCNNConvUpsampleHead", - "build_mask_head", - "ROI_MASK_HEAD_REGISTRY", -] - - -ROI_MASK_HEAD_REGISTRY = Registry("ROI_MASK_HEAD") -ROI_MASK_HEAD_REGISTRY.__doc__ = """ -Registry for mask heads, which predicts instance masks given -per-region features. - -The registered object will be called with `obj(cfg, input_shape)`. -""" - - -@torch.jit.unused -def mask_rcnn_loss(pred_mask_logits: torch.Tensor, instances: List[Instances], vis_period: int = 0): - """ - Compute the mask prediction loss defined in the Mask R-CNN paper. - - Args: - pred_mask_logits (Tensor): A tensor of shape (B, C, Hmask, Wmask) or (B, 1, Hmask, Wmask) - for class-specific or class-agnostic, where B is the total number of predicted masks - in all images, C is the number of foreground classes, and Hmask, Wmask are the height - and width of the mask predictions. The values are logits. - instances (list[Instances]): A list of N Instances, where N is the number of images - in the batch. These instances are in 1:1 - correspondence with the pred_mask_logits. The ground-truth labels (class, box, mask, - ...) associated with each instance are stored in fields. - vis_period (int): the period (in steps) to dump visualization. - - Returns: - mask_loss (Tensor): A scalar tensor containing the loss. - """ - cls_agnostic_mask = pred_mask_logits.size(1) == 1 - total_num_masks = pred_mask_logits.size(0) - mask_side_len = pred_mask_logits.size(2) - assert pred_mask_logits.size(2) == pred_mask_logits.size(3), "Mask prediction must be square!" 
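For class-specific heads, the loss keeps one (M x M) logit map per instance, selected by its gt class, and scores it with per-pixel binary cross-entropy. A toy sketch of that selection (all sizes invented):

```
import torch
from torch.nn import functional as F

B, C, M = 3, 5, 7                       # masks, classes, mask side length
pred_mask_logits = torch.randn(B, C, M, M)
gt_classes = torch.tensor([1, 4, 0])
gt_masks = (torch.rand(B, M, M) > 0.5).float()

# Pick each instance's logits for its own gt class: (B, C, M, M) -> (B, M, M).
logits = pred_mask_logits[torch.arange(B), gt_classes]

# Per-pixel binary cross-entropy, as in the Mask R-CNN mask loss.
mask_loss = F.binary_cross_entropy_with_logits(logits, gt_masks, reduction="mean")
print(mask_loss.item())
```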
- - gt_classes = [] - gt_masks = [] - for instances_per_image in instances: - if len(instances_per_image) == 0: - continue - if not cls_agnostic_mask: - gt_classes_per_image = instances_per_image.gt_classes.to(dtype=torch.int64) - gt_classes.append(gt_classes_per_image) - - gt_masks_per_image = instances_per_image.gt_masks.crop_and_resize( - instances_per_image.proposal_boxes.tensor, mask_side_len - ).to(device=pred_mask_logits.device) - # A tensor of shape (N, M, M), N=#instances in the image; M=mask_side_len - gt_masks.append(gt_masks_per_image) - - if len(gt_masks) == 0: - return pred_mask_logits.sum() * 0 - - gt_masks = cat(gt_masks, dim=0) - - if cls_agnostic_mask: - pred_mask_logits = pred_mask_logits[:, 0] - else: - indices = torch.arange(total_num_masks) - gt_classes = cat(gt_classes, dim=0) - pred_mask_logits = pred_mask_logits[indices, gt_classes] - - if gt_masks.dtype == torch.bool: - gt_masks_bool = gt_masks - else: - # Here we allow gt_masks to be float as well (depend on the implementation of rasterize()) - gt_masks_bool = gt_masks > 0.5 - gt_masks = gt_masks.to(dtype=torch.float32) - - # Log the training accuracy (using gt classes and 0.5 threshold) - mask_incorrect = (pred_mask_logits > 0.0) != gt_masks_bool - mask_accuracy = 1 - (mask_incorrect.sum().item() / max(mask_incorrect.numel(), 1.0)) - num_positive = gt_masks_bool.sum().item() - false_positive = (mask_incorrect & ~gt_masks_bool).sum().item() / max( - gt_masks_bool.numel() - num_positive, 1.0 - ) - false_negative = (mask_incorrect & gt_masks_bool).sum().item() / max(num_positive, 1.0) - - storage = get_event_storage() - storage.put_scalar("mask_rcnn/accuracy", mask_accuracy) - storage.put_scalar("mask_rcnn/false_positive", false_positive) - storage.put_scalar("mask_rcnn/false_negative", false_negative) - if vis_period > 0 and storage.iter % vis_period == 0: - pred_masks = pred_mask_logits.sigmoid() - vis_masks = torch.cat([pred_masks, gt_masks], axis=2) - name = "Left: mask prediction; Right: mask GT" - for idx, vis_mask in enumerate(vis_masks): - vis_mask = torch.stack([vis_mask] * 3, axis=0) - storage.put_image(name + f" ({idx})", vis_mask) - - mask_loss = F.binary_cross_entropy_with_logits(pred_mask_logits, gt_masks, reduction="mean") - return mask_loss - - -def mask_rcnn_inference(pred_mask_logits: torch.Tensor, pred_instances: List[Instances]): - """ - Convert pred_mask_logits to estimated foreground probability masks while also - extracting only the masks for the predicted classes in pred_instances. For each - predicted box, the mask of the same class is attached to the instance by adding a - new "pred_masks" field to pred_instances. - - Args: - pred_mask_logits (Tensor): A tensor of shape (B, C, Hmask, Wmask) or (B, 1, Hmask, Wmask) - for class-specific or class-agnostic, where B is the total number of predicted masks - in all images, C is the number of foreground classes, and Hmask, Wmask are the height - and width of the mask predictions. The values are logits. - pred_instances (list[Instances]): A list of N Instances, where N is the number of images - in the batch. Each Instances must have field "pred_classes". - - Returns: - None. pred_instances will contain an extra "pred_masks" field storing a mask of size (Hmask, - Wmask) for predicted class. Note that the masks are returned as a soft (non-quantized) - masks the resolution predicted by the network; post-processing steps, such as resizing - the predicted masks to the original image resolution and/or binarizing them, is left - to the caller. 
- """ - cls_agnostic_mask = pred_mask_logits.size(1) == 1 - - if cls_agnostic_mask: - mask_probs_pred = pred_mask_logits.sigmoid() - else: - # Select masks corresponding to the predicted classes - num_masks = pred_mask_logits.shape[0] - class_pred = cat([i.pred_classes for i in pred_instances]) - indices = torch.arange(num_masks, device=class_pred.device) - mask_probs_pred = pred_mask_logits[indices, class_pred][:, None].sigmoid() - # mask_probs_pred.shape: (B, 1, Hmask, Wmask) - - num_boxes_per_image = [len(i) for i in pred_instances] - mask_probs_pred = mask_probs_pred.split(num_boxes_per_image, dim=0) - - for prob, instances in zip(mask_probs_pred, pred_instances): - instances.pred_masks = prob # (1, Hmask, Wmask) - - -class BaseMaskRCNNHead(nn.Module): - """ - Implement the basic Mask R-CNN losses and inference logic described in :paper:`Mask R-CNN` - """ - - @configurable - def __init__(self, *, loss_weight: float = 1.0, vis_period: int = 0): - """ - NOTE: this interface is experimental. - - Args: - loss_weight (float): multiplier of the loss - vis_period (int): visualization period - """ - super().__init__() - self.vis_period = vis_period - self.loss_weight = loss_weight - - @classmethod - def from_config(cls, cfg, input_shape): - return {"vis_period": cfg.VIS_PERIOD} - - def forward(self, x, instances: List[Instances]): - """ - Args: - x: input region feature(s) provided by :class:`ROIHeads`. - instances (list[Instances]): contains the boxes & labels corresponding - to the input features. - Exact format is up to its caller to decide. - Typically, this is the foreground instances in training, with - "proposal_boxes" field and other gt annotations. - In inference, it contains boxes that are already predicted. - - Returns: - A dict of losses in training. The predicted "instances" in inference. - """ - x = self.layers(x) - if self.training: - return {"loss_mask": mask_rcnn_loss(x, instances, self.vis_period) * self.loss_weight} - else: - mask_rcnn_inference(x, instances) - return instances - - def layers(self, x): - """ - Neural network layers that makes predictions from input features. - """ - raise NotImplementedError - - -# To get torchscript support, we make the head a subclass of `nn.Sequential`. -# Therefore, to add new layers in this head class, please make sure they are -# added in the order they will be used in forward(). -@ROI_MASK_HEAD_REGISTRY.register() -class MaskRCNNConvUpsampleHead(BaseMaskRCNNHead, nn.Sequential): - """ - A mask head with several conv layers, plus an upsample layer (with `ConvTranspose2d`). - Predictions are made with a final 1x1 conv layer. - """ - - @configurable - def __init__(self, input_shape: ShapeSpec, *, num_classes, conv_dims, conv_norm="", **kwargs): - """ - NOTE: this interface is experimental. - - Args: - input_shape (ShapeSpec): shape of the input feature - num_classes (int): the number of foreground classes (i.e. background is not - included). 1 if using class agnostic prediction. - conv_dims (list[int]): a list of N>0 integers representing the output dimensions - of N-1 conv layers and the last upsample layer. - conv_norm (str or callable): normalization for the conv layers. - See :func:`detectron2.layers.get_norm` for supported types. - """ - super().__init__(**kwargs) - assert len(conv_dims) >= 1, "conv_dims have to be non-empty!" 
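For reference, a hypothetical standalone construction of this head. The dimensions below (four 256-channel convs plus the upsample dim, on 256x14x14 region features) are common Mask R-CNN settings assumed for illustration, not values read from this diff:

```
from detectron2.layers import ShapeSpec

head = MaskRCNNConvUpsampleHead(
    input_shape=ShapeSpec(channels=256, height=14, width=14),
    num_classes=80,
    conv_dims=[256, 256, 256, 256, 256],  # N-1 = 4 convs; last entry is the deconv dim
    conv_norm="",
)
```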
- - self.conv_norm_relus = [] - - cur_channels = input_shape.channels - for k, conv_dim in enumerate(conv_dims[:-1]): - conv = Conv2d( - cur_channels, - conv_dim, - kernel_size=3, - stride=1, - padding=1, - bias=not conv_norm, - norm=get_norm(conv_norm, conv_dim), - activation=nn.ReLU(), - ) - self.add_module("mask_fcn{}".format(k + 1), conv) - self.conv_norm_relus.append(conv) - cur_channels = conv_dim - - self.deconv = ConvTranspose2d( - cur_channels, conv_dims[-1], kernel_size=2, stride=2, padding=0 - ) - self.add_module("deconv_relu", nn.ReLU()) - cur_channels = conv_dims[-1] - - self.predictor = Conv2d(cur_channels, num_classes, kernel_size=1, stride=1, padding=0) - - for layer in self.conv_norm_relus + [self.deconv]: - weight_init.c2_msra_fill(layer) - # use normal distribution initialization for mask prediction layer - nn.init.normal_(self.predictor.weight, std=0.001) - if self.predictor.bias is not None: - nn.init.constant_(self.predictor.bias, 0) - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg, input_shape) - conv_dim = cfg.MODEL.ROI_MASK_HEAD.CONV_DIM - num_conv = cfg.MODEL.ROI_MASK_HEAD.NUM_CONV - ret.update( - conv_dims=[conv_dim] * (num_conv + 1), # +1 for ConvTranspose - conv_norm=cfg.MODEL.ROI_MASK_HEAD.NORM, - input_shape=input_shape, - ) - if cfg.MODEL.ROI_MASK_HEAD.CLS_AGNOSTIC_MASK: - ret["num_classes"] = 1 - else: - ret["num_classes"] = cfg.MODEL.ROI_HEADS.NUM_CLASSES - return ret - - def layers(self, x): - for layer in self: - x = layer(x) - return x - - -def build_mask_head(cfg, input_shape): - """ - Build a mask head defined by `cfg.MODEL.ROI_MASK_HEAD.NAME`. - """ - name = cfg.MODEL.ROI_MASK_HEAD.NAME - return ROI_MASK_HEAD_REGISTRY.get(name)(cfg, input_shape) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py deleted file mode 100755 index 13dd57a0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/roi_heads.py +++ /dev/null @@ -1,877 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import inspect -import logging -import numpy as np -from typing import Dict, List, Optional, Tuple -import torch -from torch import nn - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, nonzero_tuple -from detectron2.structures import Boxes, ImageList, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage -from detectron2.utils.registry import Registry - -from ..backbone.resnet import BottleneckBlock, ResNet -from ..matcher import Matcher -from ..poolers import ROIPooler -from ..proposal_generator.proposal_utils import add_ground_truth_to_proposals -from ..sampling import subsample_labels -from .box_head import build_box_head -from .fast_rcnn import FastRCNNOutputLayers -from .keypoint_head import build_keypoint_head -from .mask_head import build_mask_head - -ROI_HEADS_REGISTRY = Registry("ROI_HEADS") -ROI_HEADS_REGISTRY.__doc__ = """ -Registry for ROI heads in a generalized R-CNN model. -ROIHeads take feature maps and region proposals, and -perform per-region computation. - -The registered object will be called with `obj(cfg, input_shape)`. -The call is expected to return an :class:`ROIHeads`. -""" - -logger = logging.getLogger(__name__) - - -def build_roi_heads(cfg, input_shape): - """ - Build ROIHeads defined by `cfg.MODEL.ROI_HEADS.NAME`. 
- """ - name = cfg.MODEL.ROI_HEADS.NAME - return ROI_HEADS_REGISTRY.get(name)(cfg, input_shape) - - -def select_foreground_proposals( - proposals: List[Instances], bg_label: int -) -> Tuple[List[Instances], List[torch.Tensor]]: - """ - Given a list of N Instances (for N images), each containing a `gt_classes` field, - return a list of Instances that contain only instances with `gt_classes != -1 && - gt_classes != bg_label`. - - Args: - proposals (list[Instances]): A list of N Instances, where N is the number of - images in the batch. - bg_label: label index of background class. - - Returns: - list[Instances]: N Instances, each contains only the selected foreground instances. - list[Tensor]: N boolean vector, correspond to the selection mask of - each Instances object. True for selected instances. - """ - assert isinstance(proposals, (list, tuple)) - assert isinstance(proposals[0], Instances) - assert proposals[0].has("gt_classes") - fg_proposals = [] - fg_selection_masks = [] - for proposals_per_image in proposals: - gt_classes = proposals_per_image.gt_classes - fg_selection_mask = (gt_classes != -1) & (gt_classes != bg_label) - fg_idxs = fg_selection_mask.nonzero().squeeze(1) - fg_proposals.append(proposals_per_image[fg_idxs]) - fg_selection_masks.append(fg_selection_mask) - return fg_proposals, fg_selection_masks - - -def select_proposals_with_visible_keypoints(proposals: List[Instances]) -> List[Instances]: - """ - Args: - proposals (list[Instances]): a list of N Instances, where N is the - number of images. - - Returns: - proposals: only contains proposals with at least one visible keypoint. - - Note that this is still slightly different from Detectron. - In Detectron, proposals for training keypoint head are re-sampled from - all the proposals with IOU>threshold & >=1 visible keypoint. - - Here, the proposals are first sampled from all proposals with - IOU>threshold, then proposals with no visible keypoint are filtered out. - This strategy seems to make no difference on Detectron and is easier to implement. - """ - ret = [] - all_num_fg = [] - for proposals_per_image in proposals: - # If empty/unannotated image (hard negatives), skip filtering for train - if len(proposals_per_image) == 0: - ret.append(proposals_per_image) - continue - gt_keypoints = proposals_per_image.gt_keypoints.tensor - # #fg x K x 3 - vis_mask = gt_keypoints[:, :, 2] >= 1 - xs, ys = gt_keypoints[:, :, 0], gt_keypoints[:, :, 1] - proposal_boxes = proposals_per_image.proposal_boxes.tensor.unsqueeze(dim=1) # #fg x 1 x 4 - kp_in_box = ( - (xs >= proposal_boxes[:, :, 0]) - & (xs <= proposal_boxes[:, :, 2]) - & (ys >= proposal_boxes[:, :, 1]) - & (ys <= proposal_boxes[:, :, 3]) - ) - selection = (kp_in_box & vis_mask).any(dim=1) - selection_idxs = nonzero_tuple(selection)[0] - all_num_fg.append(selection_idxs.numel()) - ret.append(proposals_per_image[selection_idxs]) - - storage = get_event_storage() - storage.put_scalar("keypoint_head/num_fg_samples", np.mean(all_num_fg)) - return ret - - -class ROIHeads(torch.nn.Module): - """ - ROIHeads perform all per-region computation in an R-CNN. - - It typically contains logic to - - 1. (in training only) match proposals with ground truth and sample them - 2. crop the regions and extract per-region features using proposals - 3. make per-region predictions with different heads - - It can have many variants, implemented as subclasses of this class. - This base class contains the logic to match/sample proposals. 
- But it is not necessary to inherit this class if the sampling logic is not needed. - """ - - @configurable - def __init__( - self, - *, - num_classes, - batch_size_per_image, - positive_fraction, - proposal_matcher, - proposal_append_gt=True, - ): - """ - NOTE: this interface is experimental. - - Args: - num_classes (int): number of foreground classes (i.e. background is not included) - batch_size_per_image (int): number of proposals to sample for training - positive_fraction (float): fraction of positive (foreground) proposals - to sample for training. - proposal_matcher (Matcher): matcher that matches proposals and ground truth - proposal_append_gt (bool): whether to include ground truth as proposals as well - """ - super().__init__() - self.batch_size_per_image = batch_size_per_image - self.positive_fraction = positive_fraction - self.num_classes = num_classes - self.proposal_matcher = proposal_matcher - self.proposal_append_gt = proposal_append_gt - - @classmethod - def from_config(cls, cfg): - return { - "batch_size_per_image": cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE, - "positive_fraction": cfg.MODEL.ROI_HEADS.POSITIVE_FRACTION, - "num_classes": cfg.MODEL.ROI_HEADS.NUM_CLASSES, - "proposal_append_gt": cfg.MODEL.ROI_HEADS.PROPOSAL_APPEND_GT, - # Matcher to assign box proposals to gt boxes - "proposal_matcher": Matcher( - cfg.MODEL.ROI_HEADS.IOU_THRESHOLDS, - cfg.MODEL.ROI_HEADS.IOU_LABELS, - allow_low_quality_matches=False, - ), - } - - def _sample_proposals( - self, matched_idxs: torch.Tensor, matched_labels: torch.Tensor, gt_classes: torch.Tensor - ) -> Tuple[torch.Tensor, torch.Tensor]: - """ - Based on the matching between N proposals and M groundtruth, - sample the proposals and set their classification labels. - - Args: - matched_idxs (Tensor): a vector of length N, each is the best-matched - gt index in [0, M) for each proposal. - matched_labels (Tensor): a vector of length N, the matcher's label - (one of cfg.MODEL.ROI_HEADS.IOU_LABELS) for each proposal. - gt_classes (Tensor): a vector of length M. - - Returns: - Tensor: a vector of indices of sampled proposals. Each is in [0, N). - Tensor: a vector of the same length, the classification label for - each sampled proposal. Each sample is labeled as either a category in - [0, num_classes) or the background (num_classes). - """ - has_gt = gt_classes.numel() > 0 - # Get the corresponding GT for each proposal - if has_gt: - gt_classes = gt_classes[matched_idxs] - # Label unmatched proposals (0 label from matcher) as background (label=num_classes) - gt_classes[matched_labels == 0] = self.num_classes - # Label ignore proposals (-1 label) - gt_classes[matched_labels == -1] = -1 - else: - gt_classes = torch.zeros_like(matched_idxs) + self.num_classes - - sampled_fg_idxs, sampled_bg_idxs = subsample_labels( - gt_classes, self.batch_size_per_image, self.positive_fraction, self.num_classes - ) - - sampled_idxs = torch.cat([sampled_fg_idxs, sampled_bg_idxs], dim=0) - return sampled_idxs, gt_classes[sampled_idxs] - - @torch.no_grad() - def label_and_sample_proposals( - self, proposals: List[Instances], targets: List[Instances] - ) -> List[Instances]: - """ - Prepare some proposals to be used to train the ROI heads. - It performs box matching between `proposals` and `targets`, and assigns - training labels to the proposals. - It returns ``self.batch_size_per_image`` random samples from proposals and groundtruth - boxes, with a fraction of positives that is no larger than - ``self.positive_fraction``. 
-
-        Args:
-            See :meth:`ROIHeads.forward`
-
-        Returns:
-            list[Instances]:
-                length `N` list of `Instances`s containing the proposals
-                sampled for training. Each `Instances` has the following fields:
-
-                - proposal_boxes: the proposal boxes
-                - gt_boxes: the ground-truth box that the proposal is assigned to
-                  (this is only meaningful if the proposal has a label > 0; if label = 0
-                  then the ground-truth box is random)
-
-                Other fields such as "gt_classes" and "gt_masks" that are included in `targets`.
-        """
-        # Augment proposals with ground-truth boxes.
-        # In the case of learned proposals (e.g., RPN), when training starts
-        # the proposals will be low quality due to random initialization.
-        # It's possible that none of these initial
-        # proposals have high enough overlap with the gt objects to be used
-        # as positive examples for the second stage components (box head,
-        # cls head, mask head). Adding the gt boxes to the set of proposals
-        # ensures that the second stage components will have some positive
-        # examples from the start of training. For RPN, this augmentation improves
-        # convergence and empirically improves box AP on COCO by about 0.5
-        # points (under one tested configuration).
-        if self.proposal_append_gt:
-            proposals = add_ground_truth_to_proposals(targets, proposals)
-
-        proposals_with_gt = []
-
-        num_fg_samples = []
-        num_bg_samples = []
-        for proposals_per_image, targets_per_image in zip(proposals, targets):
-            has_gt = len(targets_per_image) > 0
-            match_quality_matrix = pairwise_iou(
-                targets_per_image.gt_boxes, proposals_per_image.proposal_boxes
-            )
-            matched_idxs, matched_labels = self.proposal_matcher(match_quality_matrix)
-            sampled_idxs, gt_classes = self._sample_proposals(
-                matched_idxs, matched_labels, targets_per_image.gt_classes
-            )
-
-            # Set target attributes of the sampled proposals:
-            proposals_per_image = proposals_per_image[sampled_idxs]
-            proposals_per_image.gt_classes = gt_classes
-
-            if has_gt:
-                sampled_targets = matched_idxs[sampled_idxs]
-                # We index all the attributes of targets that start with "gt_"
-                # and have not been added to proposals yet (="gt_classes").
-                # NOTE: here the indexing wastes some compute, because heads
-                # like masks, keypoints, etc, will filter the proposals again,
-                # (by foreground/background, or number of keypoints in the image, etc)
-                # so we essentially index the data twice.
-                for (trg_name, trg_value) in targets_per_image.get_fields().items():
-                    if trg_name.startswith("gt_") and not proposals_per_image.has(trg_name):
-                        proposals_per_image.set(trg_name, trg_value[sampled_targets])
-            # If no GT is given in the image, we don't know what a dummy gt value can be.
-            # Therefore the returned proposals won't have any gt_* fields, except for a
-            # gt_classes full of background label.
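For intuition about the sampling budget used above, a back-of-envelope sketch; the numbers are the usual detectron2 defaults, assumed here rather than read from this config:

```
batch_size_per_image = 512  # cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE (typical default)
positive_fraction = 0.25    # cfg.MODEL.ROI_HEADS.POSITIVE_FRACTION (typical default)
max_fg = int(batch_size_per_image * positive_fraction)  # at most 128 foreground proposals
max_bg = batch_size_per_image - max_fg                  # remaining slots filled with background
print(max_fg, max_bg)  # 128 384
```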
-
-            num_bg_samples.append((gt_classes == self.num_classes).sum().item())
-            num_fg_samples.append(gt_classes.numel() - num_bg_samples[-1])
-            proposals_with_gt.append(proposals_per_image)
-
-        # Log the number of fg/bg samples that are selected for training ROI heads
-        storage = get_event_storage()
-        storage.put_scalar("roi_head/num_fg_samples", np.mean(num_fg_samples))
-        storage.put_scalar("roi_head/num_bg_samples", np.mean(num_bg_samples))
-
-        return proposals_with_gt
-
-    def forward(
-        self,
-        images: ImageList,
-        features: Dict[str, torch.Tensor],
-        proposals: List[Instances],
-        targets: Optional[List[Instances]] = None,
-    ) -> Tuple[List[Instances], Dict[str, torch.Tensor]]:
-        """
-        Args:
-            images (ImageList):
-            features (dict[str,Tensor]): input data as a mapping from feature
-                map name to tensor. Axis 0 represents the number of images `N` in
-                the input data; axes 1-3 are channels, height, and width, which may
-                vary between feature maps (e.g., if a feature pyramid is used).
-            proposals (list[Instances]): length `N` list of `Instances`. The i-th
-                `Instances` contains object proposals for the i-th input image,
-                with fields "proposal_boxes" and "objectness_logits".
-            targets (list[Instances], optional): length `N` list of `Instances`. The i-th
-                `Instances` contains the ground-truth per-instance annotations
-                for the i-th input image. Specify `targets` during training only.
-                It may have the following fields:
-
-                - gt_boxes: the bounding box of each instance.
-                - gt_classes: the label for each instance with a category ranging in [0, #class].
-                - gt_masks: PolygonMasks or BitMasks, the ground-truth masks of each instance.
-                - gt_keypoints: NxKx3, the ground-truth keypoints for each instance.
-
-        Returns:
-            list[Instances]: length `N` list of `Instances` containing the
-            detected instances. Returned during inference only; may be [] during training.
-
-            dict[str->Tensor]:
-            mapping from a named loss to a tensor storing the loss. Used during training only.
-        """
-        raise NotImplementedError()
-
-
-@ROI_HEADS_REGISTRY.register()
-class Res5ROIHeads(ROIHeads):
-    """
-    The ROIHeads in a typical "C4" R-CNN model, where
-    the box and mask head share the cropping and
-    the per-region feature computation by a Res5 block.
-    See :paper:`ResNet` Appendix A.
-    """
-
-    @configurable
-    def __init__(
-        self,
-        *,
-        in_features: List[str],
-        pooler: ROIPooler,
-        res5: nn.Module,
-        box_predictor: nn.Module,
-        mask_head: Optional[nn.Module] = None,
-        **kwargs,
-    ):
-        """
-        NOTE: this interface is experimental.
-
-        Args:
-            in_features (list[str]): list of backbone feature map names to use for
-                feature extraction
-            pooler (ROIPooler): pooler to extract region features from backbone
-            res5 (nn.Sequential): a CNN to compute per-region features, to be used by
-                ``box_predictor`` and ``mask_head``. Typically this is a "res5"
-                block from a ResNet.
-            box_predictor (nn.Module): make box predictions from the feature.
-                Should have the same interface as :class:`FastRCNNOutputLayers`.
-            mask_head (nn.Module): transform features to make mask predictions
-        """
-        super().__init__(**kwargs)
-        self.in_features = in_features
-        self.pooler = pooler
-        if isinstance(res5, (list, tuple)):
-            res5 = nn.Sequential(*res5)
-        self.res5 = res5
-        self.box_predictor = box_predictor
-        self.mask_on = mask_head is not None
-        if self.mask_on:
-            self.mask_head = mask_head
-
-    @classmethod
-    def from_config(cls, cfg, input_shape):
-        # fmt: off
-        ret = super().from_config(cfg)
-        in_features = ret["in_features"] = cfg.MODEL.ROI_HEADS.IN_FEATURES
-        pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION
-        pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE
-        pooler_scales = (1.0 / input_shape[in_features[0]].stride, )
-        sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO
-        mask_on = cfg.MODEL.MASK_ON
-        # fmt: on
-        assert not cfg.MODEL.KEYPOINT_ON
-        assert len(in_features) == 1
-
-        ret["pooler"] = ROIPooler(
-            output_size=pooler_resolution,
-            scales=pooler_scales,
-            sampling_ratio=sampling_ratio,
-            pooler_type=pooler_type,
-        )
-
-        # Compatibility with old moco code. Might be useful.
-        # See notes in StandardROIHeads.from_config
-        if not inspect.ismethod(cls._build_res5_block):
-            logger.warning(
-                "The behavior of _build_res5_block may change. "
-                "Please do not depend on private methods."
-            )
-            cls._build_res5_block = classmethod(cls._build_res5_block)
-
-        ret["res5"], out_channels = cls._build_res5_block(cfg)
-        ret["box_predictor"] = FastRCNNOutputLayers(
-            cfg, ShapeSpec(channels=out_channels, height=1, width=1)
-        )
-
-        if mask_on:
-            ret["mask_head"] = build_mask_head(
-                cfg,
-                ShapeSpec(channels=out_channels, width=pooler_resolution, height=pooler_resolution),
-            )
-        return ret
-
-    @classmethod
-    def _build_res5_block(cls, cfg):
-        # fmt: off
-        stage_channel_factor = 2 ** 3  # res5 is 8x res2
-        num_groups = cfg.MODEL.RESNETS.NUM_GROUPS
-        width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP
-        bottleneck_channels = num_groups * width_per_group * stage_channel_factor
-        out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS * stage_channel_factor
-        stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1
-        norm = cfg.MODEL.RESNETS.NORM
-        assert not cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE[-1], \
-            "Deformable conv is not yet supported in res5 head."
-        # fmt: on
-
-        blocks = ResNet.make_stage(
-            BottleneckBlock,
-            3,
-            stride_per_block=[2, 1, 1],
-            in_channels=out_channels // 2,
-            bottleneck_channels=bottleneck_channels,
-            out_channels=out_channels,
-            num_groups=num_groups,
-            norm=norm,
-            stride_in_1x1=stride_in_1x1,
-        )
-        return nn.Sequential(*blocks), out_channels
-
-    def _shared_roi_transform(self, features: List[torch.Tensor], boxes: List[Boxes]):
-        x = self.pooler(features, boxes)
-        return self.res5(x)
-
-    def forward(
-        self,
-        images: ImageList,
-        features: Dict[str, torch.Tensor],
-        proposals: List[Instances],
-        targets: Optional[List[Instances]] = None,
-    ):
-        """
-        See :meth:`ROIHeads.forward`.
-        """
-        del images
-
-        if self.training:
-            assert targets
-            proposals = self.label_and_sample_proposals(proposals, targets)
-        del targets
-
-        proposal_boxes = [x.proposal_boxes for x in proposals]
-        box_features = self._shared_roi_transform(
-            [features[f] for f in self.in_features], proposal_boxes
-        )
-        predictions = self.box_predictor(box_features.mean(dim=[2, 3]))
-
-        if self.training:
-            del features
-            losses = self.box_predictor.losses(predictions, proposals)
-            if self.mask_on:
-                proposals, fg_selection_masks = select_foreground_proposals(
-                    proposals, self.num_classes
-                )
-                # Since the ROI feature transform is shared between boxes and masks,
-                # we don't need to recompute features. The mask loss is only defined
-                # on foreground proposals, so we need to select out the foreground
-                # features.
-                mask_features = box_features[torch.cat(fg_selection_masks, dim=0)]
-                del box_features
-                losses.update(self.mask_head(mask_features, proposals))
-            return [], losses
-        else:
-            pred_instances, _ = self.box_predictor.inference(predictions, proposals)
-            pred_instances = self.forward_with_given_boxes(features, pred_instances)
-            return pred_instances, {}
-
-    def forward_with_given_boxes(
-        self, features: Dict[str, torch.Tensor], instances: List[Instances]
-    ) -> List[Instances]:
-        """
-        Use the given boxes in `instances` to produce other (non-box) per-ROI outputs.
-
-        Args:
-            features: same as in `forward()`
-            instances (list[Instances]): instances to predict other outputs. Expect the keys
-                "pred_boxes" and "pred_classes" to exist.
-
-        Returns:
-            instances (Instances):
-                the same `Instances` object, with extra
-                fields such as `pred_masks` or `pred_keypoints`.
-        """
-        assert not self.training
-        assert instances[0].has("pred_boxes") and instances[0].has("pred_classes")
-
-        if self.mask_on:
-            feature_list = [features[f] for f in self.in_features]
-            x = self._shared_roi_transform(feature_list, [x.pred_boxes for x in instances])
-            return self.mask_head(x, instances)
-        else:
-            return instances
-
-
-@ROI_HEADS_REGISTRY.register()
-class StandardROIHeads(ROIHeads):
-    """
-    It's "standard" in the sense that there is no ROI transform sharing
-    or feature sharing between tasks.
-    Each head independently processes the input features with its own
-    pooler and head.
-
-    This class is used by most models, such as FPN and C5.
-    To implement more models, you can subclass it and implement a different
-    :meth:`forward()` or a head.
-    """
-
-    @configurable
-    def __init__(
-        self,
-        *,
-        box_in_features: List[str],
-        box_pooler: ROIPooler,
-        box_head: nn.Module,
-        box_predictor: nn.Module,
-        mask_in_features: Optional[List[str]] = None,
-        mask_pooler: Optional[ROIPooler] = None,
-        mask_head: Optional[nn.Module] = None,
-        keypoint_in_features: Optional[List[str]] = None,
-        keypoint_pooler: Optional[ROIPooler] = None,
-        keypoint_head: Optional[nn.Module] = None,
-        train_on_pred_boxes: bool = False,
-        **kwargs,
-    ):
-        """
-        NOTE: this interface is experimental.
-
-        Args:
-            box_in_features (list[str]): list of feature names to use for the box head.
-            box_pooler (ROIPooler): pooler to extract region features for the box head
-            box_head (nn.Module): transform features to make box predictions
-            box_predictor (nn.Module): make box predictions from the feature.
-                Should have the same interface as :class:`FastRCNNOutputLayers`.
-            mask_in_features (list[str]): list of feature names to use for the mask
-                pooler or mask head. None if not using mask head.
- mask_pooler (ROIPooler): pooler to extract region features from image features. - The mask head will then take region features to make predictions. - If None, the mask head will directly take the dict of image features - defined by `mask_in_features` - mask_head (nn.Module): transform features to make mask predictions - keypoint_in_features, keypoint_pooler, keypoint_head: similar to ``mask_*``. - train_on_pred_boxes (bool): whether to use proposal boxes or - predicted boxes from the box head to train other heads. - """ - super().__init__(**kwargs) - # keep self.in_features for backward compatibility - self.in_features = self.box_in_features = box_in_features - self.box_pooler = box_pooler - self.box_head = box_head - self.box_predictor = box_predictor - - self.mask_on = mask_in_features is not None - if self.mask_on: - self.mask_in_features = mask_in_features - self.mask_pooler = mask_pooler - self.mask_head = mask_head - - self.keypoint_on = keypoint_in_features is not None - if self.keypoint_on: - self.keypoint_in_features = keypoint_in_features - self.keypoint_pooler = keypoint_pooler - self.keypoint_head = keypoint_head - - self.train_on_pred_boxes = train_on_pred_boxes - - @classmethod - def from_config(cls, cfg, input_shape): - ret = super().from_config(cfg) - ret["train_on_pred_boxes"] = cfg.MODEL.ROI_BOX_HEAD.TRAIN_ON_PRED_BOXES - # Subclasses that have not been updated to use from_config style construction - # may have overridden _init_*_head methods. In this case, those overridden methods - # will not be classmethods and we need to avoid trying to call them here. - # We test for this with ismethod which only returns True for bound methods of cls. - # Such subclasses will need to handle calling their overridden _init_*_head methods. - if inspect.ismethod(cls._init_box_head): - ret.update(cls._init_box_head(cfg, input_shape)) - if inspect.ismethod(cls._init_mask_head): - ret.update(cls._init_mask_head(cfg, input_shape)) - if inspect.ismethod(cls._init_keypoint_head): - ret.update(cls._init_keypoint_head(cfg, input_shape)) - return ret - - @classmethod - def _init_box_head(cls, cfg, input_shape): - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - # fmt: on - - # If StandardROIHeads is applied on multiple feature maps (as in FPN), - # then we share the same predictors and therefore the channel counts must be the same - in_channels = [input_shape[f].channels for f in in_features] - # Check all channel counts are equal - assert len(set(in_channels)) == 1, in_channels - in_channels = in_channels[0] - - box_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - # Here we split "box head" and "box predictor", which is mainly due to historical reasons. - # They are used together so the "box predictor" layers should be part of the "box head". - # New subclasses of ROIHeads do not need "box predictor"s. 
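A quick illustration of the `pooler_scales` computed in `_init_box_head` above; the strides are the standard FPN values, assumed for the example rather than taken from this config:

```
# Mirrors `tuple(1.0 / input_shape[k].stride for k in in_features)` for an FPN backbone.
input_strides = {"p2": 4, "p3": 8, "p4": 16, "p5": 32}
in_features = ["p2", "p3", "p4", "p5"]
pooler_scales = tuple(1.0 / input_strides[k] for k in in_features)
print(pooler_scales)  # (0.25, 0.125, 0.0625, 0.03125)
```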
- box_head = build_box_head( - cfg, ShapeSpec(channels=in_channels, height=pooler_resolution, width=pooler_resolution) - ) - box_predictor = FastRCNNOutputLayers(cfg, box_head.output_shape) - return { - "box_in_features": in_features, - "box_pooler": box_pooler, - "box_head": box_head, - "box_predictor": box_predictor, - } - - @classmethod - def _init_mask_head(cls, cfg, input_shape): - if not cfg.MODEL.MASK_ON: - return {} - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_MASK_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_MASK_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_MASK_HEAD.POOLER_TYPE - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features][0] - - ret = {"mask_in_features": in_features} - ret["mask_pooler"] = ( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - if pooler_type - else None - ) - if pooler_type: - shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - else: - shape = {f: input_shape[f] for f in in_features} - ret["mask_head"] = build_mask_head(cfg, shape) - return ret - - @classmethod - def _init_keypoint_head(cls, cfg, input_shape): - if not cfg.MODEL.KEYPOINT_ON: - return {} - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) # noqa - sampling_ratio = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_KEYPOINT_HEAD.POOLER_TYPE - # fmt: on - - in_channels = [input_shape[f].channels for f in in_features][0] - - ret = {"keypoint_in_features": in_features} - ret["keypoint_pooler"] = ( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - if pooler_type - else None - ) - if pooler_type: - shape = ShapeSpec( - channels=in_channels, width=pooler_resolution, height=pooler_resolution - ) - else: - shape = {f: input_shape[f] for f in in_features} - ret["keypoint_head"] = build_keypoint_head(cfg, shape) - return ret - - def forward( - self, - images: ImageList, - features: Dict[str, torch.Tensor], - proposals: List[Instances], - targets: Optional[List[Instances]] = None, - ) -> Tuple[List[Instances], Dict[str, torch.Tensor]]: - """ - See :class:`ROIHeads.forward`. - """ - del images - if self.training: - assert targets, "'targets' argument is required during training" - proposals = self.label_and_sample_proposals(proposals, targets) - del targets - - if self.training: - losses = self._forward_box(features, proposals) - # Usually the original proposals used by the box head are used by the mask, keypoint - # heads. But when `self.train_on_pred_boxes is True`, proposals will contain boxes - # predicted by the box head. - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - # During inference cascaded prediction is used: the mask and keypoints heads are only - # applied to the top scoring box detections. 
-            pred_instances = self.forward_with_given_boxes(features, pred_instances)
-            return pred_instances, {}
-
-    def forward_with_given_boxes(
-        self, features: Dict[str, torch.Tensor], instances: List[Instances]
-    ) -> List[Instances]:
-        """
-        Use the given boxes in `instances` to produce other (non-box) per-ROI outputs.
-
-        This is useful for downstream tasks where a box is known, but one needs to obtain
-        other attributes (outputs of other heads).
-        Test-time augmentation also uses this.
-
-        Args:
-            features: same as in `forward()`
-            instances (list[Instances]): instances to predict other outputs. Expect the keys
-                "pred_boxes" and "pred_classes" to exist.
-
-        Returns:
-            list[Instances]:
-                the same `Instances` objects, with extra
-                fields such as `pred_masks` or `pred_keypoints`.
-        """
-        assert not self.training
-        assert instances[0].has("pred_boxes") and instances[0].has("pred_classes")
-
-        instances = self._forward_mask(features, instances)
-        instances = self._forward_keypoint(features, instances)
-        return instances
-
-    def _forward_box(self, features: Dict[str, torch.Tensor], proposals: List[Instances]):
-        """
-        Forward logic of the box prediction branch. If `self.train_on_pred_boxes is True`,
-        the function puts predicted boxes in the `proposal_boxes` field of `proposals` argument.
-
-        Args:
-            features (dict[str, Tensor]): mapping from feature map names to tensor.
-                Same as in :meth:`ROIHeads.forward`.
-            proposals (list[Instances]): the per-image object proposals with
-                their matching ground truth.
-                Each has fields "proposal_boxes", and "objectness_logits",
-                "gt_classes", "gt_boxes".
-
-        Returns:
-            In training, a dict of losses.
-            In inference, a list of `Instances`, the predicted instances.
-        """
-        features = [features[f] for f in self.box_in_features]
-        box_features = self.box_pooler(features, [x.proposal_boxes for x in proposals])
-        box_features = self.box_head(box_features)
-        predictions = self.box_predictor(box_features)
-        del box_features
-
-        if self.training:
-            losses = self.box_predictor.losses(predictions, proposals)
-            # proposals is modified in-place below, so losses must be computed first.
-            if self.train_on_pred_boxes:
-                with torch.no_grad():
-                    pred_boxes = self.box_predictor.predict_boxes_for_gt_classes(
-                        predictions, proposals
-                    )
-                    for proposals_per_image, pred_boxes_per_image in zip(proposals, pred_boxes):
-                        proposals_per_image.proposal_boxes = Boxes(pred_boxes_per_image)
-            return losses
-        else:
-            pred_instances, _ = self.box_predictor.inference(predictions, proposals)
-            return pred_instances
-
-    def _forward_mask(self, features: Dict[str, torch.Tensor], instances: List[Instances]):
-        """
-        Forward logic of the mask prediction branch.
-
-        Args:
-            features (dict[str, Tensor]): mapping from feature map names to tensor.
-                Same as in :meth:`ROIHeads.forward`.
-            instances (list[Instances]): the per-image instances to train/predict masks.
-                In training, they can be the proposals.
-                In inference, they can be the boxes predicted by R-CNN box head.
-
-        Returns:
-            In training, a dict of losses.
-            In inference, update `instances` with new fields "pred_masks" and return it.
-        """
-        if not self.mask_on:
-            return {} if self.training else instances
-
-        if self.training:
-            # head is only trained on positive proposals.
- instances, _ = select_foreground_proposals(instances, self.num_classes) - - if self.mask_pooler is not None: - features = [features[f] for f in self.mask_in_features] - boxes = [x.proposal_boxes if self.training else x.pred_boxes for x in instances] - features = self.mask_pooler(features, boxes) - else: - features = {f: features[f] for f in self.mask_in_features} - return self.mask_head(features, instances) - - def _forward_keypoint(self, features: Dict[str, torch.Tensor], instances: List[Instances]): - """ - Forward logic of the keypoint prediction branch. - - Args: - features (dict[str, Tensor]): mapping from feature map names to tensor. - Same as in :meth:`ROIHeads.forward`. - instances (list[Instances]): the per-image instances to train/predict keypoints. - In training, they can be the proposals. - In inference, they can be the boxes predicted by R-CNN box head. - - Returns: - In training, a dict of losses. - In inference, update `instances` with new fields "pred_keypoints" and return it. - """ - if not self.keypoint_on: - return {} if self.training else instances - - if self.training: - # head is only trained on positive proposals with >=1 visible keypoints. - instances, _ = select_foreground_proposals(instances, self.num_classes) - instances = select_proposals_with_visible_keypoints(instances) - - if self.keypoint_pooler is not None: - features = [features[f] for f in self.keypoint_in_features] - boxes = [x.proposal_boxes if self.training else x.pred_boxes for x in instances] - features = self.keypoint_pooler(features, boxes) - else: - features = {f: features[f] for f in self.keypoint_in_features} - return self.keypoint_head(features, instances) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py deleted file mode 100755 index b1eedeeb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/roi_heads/rotated_fast_rcnn.py +++ /dev/null @@ -1,270 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import numpy as np -import torch - -from detectron2.config import configurable -from detectron2.layers import ShapeSpec, batched_nms_rotated -from detectron2.structures import Instances, RotatedBoxes, pairwise_iou_rotated -from detectron2.utils.events import get_event_storage - -from ..box_regression import Box2BoxTransformRotated -from ..poolers import ROIPooler -from ..proposal_generator.proposal_utils import add_ground_truth_to_proposals -from .box_head import build_box_head -from .fast_rcnn import FastRCNNOutputLayers -from .roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads - -logger = logging.getLogger(__name__) - -""" -Shape shorthand in this module: - - N: number of images in the minibatch - R: number of ROIs, combined over all images, in the minibatch - Ri: number of ROIs in image i - K: number of foreground classes. E.g.,there are 80 foreground classes in COCO. - -Naming convention: - - deltas: refers to the 5-d (dx, dy, dw, dh, da) deltas that parameterize the box2box - transform (see :class:`box_regression.Box2BoxTransformRotated`). - - pred_class_logits: predicted class scores in [-inf, +inf]; use - softmax(pred_class_logits) to estimate P(class). - - gt_classes: ground-truth classification labels in [0, K], where [0, K) represent - foreground object classes and K represents the background class. 
-
-    pred_proposal_deltas: predicted rotated box2box transform deltas for transforming proposals
-        to detection box predictions.
-
-    gt_proposal_deltas: ground-truth rotated box2box transform deltas
-"""
-
-
-def fast_rcnn_inference_rotated(
-    boxes, scores, image_shapes, score_thresh, nms_thresh, topk_per_image
-):
-    """
-    Call `fast_rcnn_inference_single_image_rotated` for all images.
-
-    Args:
-        boxes (list[Tensor]): A list of Tensors of predicted class-specific or class-agnostic
-            boxes for each image. Element i has shape (Ri, K * 5) if doing
-            class-specific regression, or (Ri, 5) if doing class-agnostic
-            regression, where Ri is the number of predicted objects for image i.
-            This is compatible with the output of :meth:`FastRCNNOutputLayers.predict_boxes`.
-        scores (list[Tensor]): A list of Tensors of predicted class scores for each image.
-            Element i has shape (Ri, K + 1), where Ri is the number of predicted objects
-            for image i. Compatible with the output of :meth:`FastRCNNOutputLayers.predict_probs`.
-        image_shapes (list[tuple]): A list of (width, height) tuples for each image in the batch.
-        score_thresh (float): Only return detections with a confidence score exceeding this
-            threshold.
-        nms_thresh (float): The threshold to use for box non-maximum suppression. Value in [0, 1].
-        topk_per_image (int): The number of top scoring detections to return. Set < 0 to return
-            all detections.
-
-    Returns:
-        instances: (list[Instances]): A list of N instances, one for each image in the batch,
-            that stores the top-k most confident detections.
-        kept_indices: (list[Tensor]): A list of N 1D tensors; element i indicates the
-            corresponding boxes/scores indices in [0, Ri) from the input, for image i.
-    """
-    result_per_image = [
-        fast_rcnn_inference_single_image_rotated(
-            boxes_per_image, scores_per_image, image_shape, score_thresh, nms_thresh, topk_per_image
-        )
-        for scores_per_image, boxes_per_image, image_shape in zip(scores, boxes, image_shapes)
-    ]
-    return [x[0] for x in result_per_image], [x[1] for x in result_per_image]
-
-
-def fast_rcnn_inference_single_image_rotated(
-    boxes, scores, image_shape, score_thresh, nms_thresh, topk_per_image
-):
-    """
-    Single-image inference. Return rotated bounding-box detection results by thresholding
-    on scores and applying rotated non-maximum suppression (Rotated NMS).
-
-    Args:
-        Same as `fast_rcnn_inference_rotated`, but with rotated boxes, scores, and image shapes
-        per image.
-
-    Returns:
-        Same as `fast_rcnn_inference_rotated`, but for only one image.
-    """
-    valid_mask = torch.isfinite(boxes).all(dim=1) & torch.isfinite(scores).all(dim=1)
-    if not valid_mask.all():
-        boxes = boxes[valid_mask]
-        scores = scores[valid_mask]
-
-    B = 5  # box dimension
-    scores = scores[:, :-1]
-    num_bbox_reg_classes = boxes.shape[1] // B
-    # Convert to Boxes to use the `clip` function ...
-    boxes = RotatedBoxes(boxes.reshape(-1, B))
-    boxes.clip(image_shape)
-    boxes = boxes.tensor.view(-1, num_bbox_reg_classes, B)  # R x C x B
-    # Filter results based on detection scores
-    filter_mask = scores > score_thresh  # R x K
-    # R' x 2. First column contains indices of the R predictions;
-    # Second column contains indices of classes.
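A toy illustration (made-up scores) of the thresholding and `nonzero()` bookkeeping that follows:

```
import torch

scores = torch.tensor([[0.90, 0.02],
                       [0.10, 0.60]])  # R=2 boxes, K=2 classes
filter_mask = scores > 0.5             # R x K boolean mask
filter_inds = filter_mask.nonzero()    # each row is (box index, class index)
print(filter_inds)  # tensor([[0, 0], [1, 1]])
```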
- filter_inds = filter_mask.nonzero() - if num_bbox_reg_classes == 1: - boxes = boxes[filter_inds[:, 0], 0] - else: - boxes = boxes[filter_mask] - scores = scores[filter_mask] - - # Apply per-class Rotated NMS - keep = batched_nms_rotated(boxes, scores, filter_inds[:, 1], nms_thresh) - if topk_per_image >= 0: - keep = keep[:topk_per_image] - boxes, scores, filter_inds = boxes[keep], scores[keep], filter_inds[keep] - - result = Instances(image_shape) - result.pred_boxes = RotatedBoxes(boxes) - result.scores = scores - result.pred_classes = filter_inds[:, 1] - - return result, filter_inds[:, 0] - - -class RotatedFastRCNNOutputLayers(FastRCNNOutputLayers): - """ - Two linear layers for predicting Rotated Fast R-CNN outputs. - """ - - @classmethod - def from_config(cls, cfg, input_shape): - args = super().from_config(cfg, input_shape) - args["box2box_transform"] = Box2BoxTransformRotated( - weights=cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS - ) - return args - - def inference(self, predictions, proposals): - """ - Returns: - list[Instances]: same as `fast_rcnn_inference_rotated`. - list[Tensor]: same as `fast_rcnn_inference_rotated`. - """ - boxes = self.predict_boxes(predictions, proposals) - scores = self.predict_probs(predictions, proposals) - image_shapes = [x.image_size for x in proposals] - - return fast_rcnn_inference_rotated( - boxes, - scores, - image_shapes, - self.test_score_thresh, - self.test_nms_thresh, - self.test_topk_per_image, - ) - - -@ROI_HEADS_REGISTRY.register() -class RROIHeads(StandardROIHeads): - """ - This class is used by Rotated Fast R-CNN to detect rotated boxes. - For now, it only supports box predictions but not mask or keypoints. - """ - - @configurable - def __init__(self, **kwargs): - """ - NOTE: this interface is experimental. - """ - super().__init__(**kwargs) - assert ( - not self.mask_on and not self.keypoint_on - ), "Mask/Keypoints not supported in Rotated ROIHeads." - assert not self.train_on_pred_boxes, "train_on_pred_boxes not implemented for RROIHeads!" - - @classmethod - def _init_box_head(cls, cfg, input_shape): - # fmt: off - in_features = cfg.MODEL.ROI_HEADS.IN_FEATURES - pooler_resolution = cfg.MODEL.ROI_BOX_HEAD.POOLER_RESOLUTION - pooler_scales = tuple(1.0 / input_shape[k].stride for k in in_features) - sampling_ratio = cfg.MODEL.ROI_BOX_HEAD.POOLER_SAMPLING_RATIO - pooler_type = cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE - # fmt: on - assert pooler_type in ["ROIAlignRotated"], pooler_type - # assume all channel counts are equal - in_channels = [input_shape[f].channels for f in in_features][0] - - box_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type=pooler_type, - ) - box_head = build_box_head( - cfg, ShapeSpec(channels=in_channels, height=pooler_resolution, width=pooler_resolution) - ) - # This line is the only difference v.s. StandardROIHeads - box_predictor = RotatedFastRCNNOutputLayers(cfg, box_head.output_shape) - return { - "box_in_features": in_features, - "box_pooler": box_pooler, - "box_head": box_head, - "box_predictor": box_predictor, - } - - @torch.no_grad() - def label_and_sample_proposals(self, proposals, targets): - """ - Prepare some proposals to be used to train the RROI heads. - It performs box matching between `proposals` and `targets`, and assigns - training labels to the proposals. - It returns `self.batch_size_per_image` random samples from proposals and groundtruth boxes, - with a fraction of positives that is no larger than `self.positive_sample_fraction. 
-
-        Args:
-            See :meth:`StandardROIHeads.forward`
-
-        Returns:
-            list[Instances]: length `N` list of `Instances`s containing the proposals
-                sampled for training. Each `Instances` has the following fields:
-                - proposal_boxes: the rotated proposal boxes
-                - gt_boxes: the ground-truth rotated boxes that the proposal is assigned to
-                  (this is only meaningful if the proposal has a label > 0; if label = 0
-                  then the ground-truth box is random)
-                - gt_classes: the ground-truth classification label for each proposal
-        """
-        if self.proposal_append_gt:
-            proposals = add_ground_truth_to_proposals(targets, proposals)
-
-        proposals_with_gt = []
-
-        num_fg_samples = []
-        num_bg_samples = []
-        for proposals_per_image, targets_per_image in zip(proposals, targets):
-            has_gt = len(targets_per_image) > 0
-            match_quality_matrix = pairwise_iou_rotated(
-                targets_per_image.gt_boxes, proposals_per_image.proposal_boxes
-            )
-            matched_idxs, matched_labels = self.proposal_matcher(match_quality_matrix)
-            sampled_idxs, gt_classes = self._sample_proposals(
-                matched_idxs, matched_labels, targets_per_image.gt_classes
-            )
-
-            proposals_per_image = proposals_per_image[sampled_idxs]
-            proposals_per_image.gt_classes = gt_classes
-
-            if has_gt:
-                sampled_targets = matched_idxs[sampled_idxs]
-                proposals_per_image.gt_boxes = targets_per_image.gt_boxes[sampled_targets]
-
-            num_bg_samples.append((gt_classes == self.num_classes).sum().item())
-            num_fg_samples.append(gt_classes.numel() - num_bg_samples[-1])
-            proposals_with_gt.append(proposals_per_image)
-
-        # Log the number of fg/bg samples that are selected for training ROI heads
-        storage = get_event_storage()
-        storage.put_scalar("roi_head/num_fg_samples", np.mean(num_fg_samples))
-        storage.put_scalar("roi_head/num_bg_samples", np.mean(num_bg_samples))
-
-        return proposals_with_gt
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py
deleted file mode 100755
index a2d0f664..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/sampling.py
+++ /dev/null
@@ -1,54 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import torch
-
-from detectron2.layers import nonzero_tuple
-
-__all__ = ["subsample_labels"]
-
-
-def subsample_labels(
-    labels: torch.Tensor, num_samples: int, positive_fraction: float, bg_label: int
-):
-    """
-    Return `num_samples` (or fewer, if not enough found)
-    random samples from `labels` which is a mixture of positives & negatives.
-    It will try to return as many positives as possible without
-    exceeding `positive_fraction * num_samples`, and then try to
-    fill the remaining slots with negatives.
-
-    Args:
-        labels (Tensor): (N, ) label vector with values:
-            * -1: ignore
-            * bg_label: background ("negative") class
-            * otherwise: one or more foreground ("positive") classes
-        num_samples (int): The total number of labels with value >= 0 to return.
-            Values that are not sampled will be filled with -1 (ignore).
-        positive_fraction (float): The number of subsampled labels with values > 0
-            is `min(num_positives, int(positive_fraction * num_samples))`. The number
-            of negatives sampled is `min(num_negatives, num_samples - num_positives_sampled)`.
-            In other words, if there are not enough positives, the sample is filled with
-            negatives. If there are also not enough negatives, then as many elements are
-            sampled as is possible.
-        bg_label (int): label index of background ("negative") class.
- - Returns: - pos_idx, neg_idx (Tensor): - 1D vector of indices. The total length of both is `num_samples` or fewer. - """ - positive = nonzero_tuple((labels != -1) & (labels != bg_label))[0] - negative = nonzero_tuple(labels == bg_label)[0] - - num_pos = int(num_samples * positive_fraction) - # protect against not enough positive examples - num_pos = min(positive.numel(), num_pos) - num_neg = num_samples - num_pos - # protect against not enough negative examples - num_neg = min(negative.numel(), num_neg) - - # randomly select positive and negative examples - perm1 = torch.randperm(positive.numel(), device=positive.device)[:num_pos] - perm2 = torch.randperm(negative.numel(), device=negative.device)[:num_neg] - - pos_idx = positive[perm1] - neg_idx = negative[perm2] - return pos_idx, neg_idx diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py deleted file mode 100755 index 373e6bf0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/modeling/test_time_augmentation.py +++ /dev/null @@ -1,307 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import numpy as np -from contextlib import contextmanager -from itertools import count -from typing import List -import torch -from fvcore.transforms import HFlipTransform, NoOpTransform -from torch import nn -from torch.nn.parallel import DistributedDataParallel - -from detectron2.config import configurable -from detectron2.data.detection_utils import read_image -from detectron2.data.transforms import ( - RandomFlip, - ResizeShortestEdge, - ResizeTransform, - apply_augmentations, -) -from detectron2.structures import Boxes, Instances - -from .meta_arch import GeneralizedRCNN -from .postprocessing import detector_postprocess -from .roi_heads.fast_rcnn import fast_rcnn_inference_single_image - -__all__ = ["DatasetMapperTTA", "GeneralizedRCNNWithTTA"] - - -class DatasetMapperTTA: - """ - Implement test-time augmentation for detection data. - It is a callable which takes a dataset dict from a detection dataset, - and returns a list of dataset dicts where the images - are augmented from the input image by the transformations defined in the config. - This is used for test-time augmentation. - """ - - @configurable - def __init__(self, min_sizes: List[int], max_size: int, flip: bool): - """ - Args: - min_sizes: list of short-edge size to resize the image to - max_size: maximum height or width of resized images - flip: whether to apply flipping augmentation - """ - self.min_sizes = min_sizes - self.max_size = max_size - self.flip = flip - - @classmethod - def from_config(cls, cfg): - return { - "min_sizes": cfg.TEST.AUG.MIN_SIZES, - "max_size": cfg.TEST.AUG.MAX_SIZE, - "flip": cfg.TEST.AUG.FLIP, - } - - def __call__(self, dataset_dict): - """ - Args: - dict: a dict in standard model input format. See tutorials for details. - - Returns: - list[dict]: - a list of dicts, which contain augmented version of the input image. - The total number of dicts is ``len(min_sizes) * (2 if flip else 1)``. - Each dict has field "transforms" which is a TransformList, - containing the transforms that are used to generate this image. 
- """ - numpy_image = dataset_dict["image"].permute(1, 2, 0).numpy() - shape = numpy_image.shape - orig_shape = (dataset_dict["height"], dataset_dict["width"]) - if shape[:2] != orig_shape: - # It transforms the "original" image in the dataset to the input image - pre_tfm = ResizeTransform(orig_shape[0], orig_shape[1], shape[0], shape[1]) - else: - pre_tfm = NoOpTransform() - - # Create all combinations of augmentations to use - aug_candidates = [] # each element is a list[Augmentation] - for min_size in self.min_sizes: - resize = ResizeShortestEdge(min_size, self.max_size) - aug_candidates.append([resize]) # resize only - if self.flip: - flip = RandomFlip(prob=1.0) - aug_candidates.append([resize, flip]) # resize + flip - - # Apply all the augmentations - ret = [] - for aug in aug_candidates: - new_image, tfms = apply_augmentations(aug, np.copy(numpy_image)) - torch_image = torch.from_numpy(np.ascontiguousarray(new_image.transpose(2, 0, 1))) - - dic = copy.deepcopy(dataset_dict) - dic["transforms"] = pre_tfm + tfms - dic["image"] = torch_image - ret.append(dic) - return ret - - -class GeneralizedRCNNWithTTA(nn.Module): - """ - A GeneralizedRCNN with test-time augmentation enabled. - Its :meth:`__call__` method has the same interface as :meth:`GeneralizedRCNN.forward`. - """ - - def __init__(self, cfg, model, tta_mapper=None, batch_size=3): - """ - Args: - cfg (CfgNode): - model (GeneralizedRCNN): a GeneralizedRCNN to apply TTA on. - tta_mapper (callable): takes a dataset dict and returns a list of - augmented versions of the dataset dict. Defaults to - `DatasetMapperTTA(cfg)`. - batch_size (int): batch the augmented images into this batch size for inference. - """ - super().__init__() - if isinstance(model, DistributedDataParallel): - model = model.module - assert isinstance( - model, GeneralizedRCNN - ), "TTA is only supported on GeneralizedRCNN. Got a model of type {}".format(type(model)) - self.cfg = cfg.clone() - assert not self.cfg.MODEL.KEYPOINT_ON, "TTA for keypoint is not supported yet" - assert ( - not self.cfg.MODEL.LOAD_PROPOSALS - ), "TTA for pre-computed proposals is not supported yet" - - self.model = model - - if tta_mapper is None: - tta_mapper = DatasetMapperTTA(cfg) - self.tta_mapper = tta_mapper - self.batch_size = batch_size - - @contextmanager - def _turn_off_roi_heads(self, attrs): - """ - Open a context where some heads in `model.roi_heads` are temporarily turned off. - Args: - attr (list[str]): the attribute in `model.roi_heads` which can be used - to turn off a specific head, e.g., "mask_on", "keypoint_on". - """ - roi_heads = self.model.roi_heads - old = {} - for attr in attrs: - try: - old[attr] = getattr(roi_heads, attr) - except AttributeError: - # The head may not be implemented in certain ROIHeads - pass - - if len(old.keys()) == 0: - yield - else: - for attr in old.keys(): - setattr(roi_heads, attr, False) - yield - for attr in old.keys(): - setattr(roi_heads, attr, old[attr]) - - def _batch_inference(self, batched_inputs, detected_instances=None): - """ - Execute inference on a list of inputs, - using batch size = self.batch_size, instead of the length of the list. 
- - Inputs & outputs have the same format as :meth:`GeneralizedRCNN.inference` - """ - if detected_instances is None: - detected_instances = [None] * len(batched_inputs) - - outputs = [] - inputs, instances = [], [] - for idx, input, instance in zip(count(), batched_inputs, detected_instances): - inputs.append(input) - instances.append(instance) - if len(inputs) == self.batch_size or idx == len(batched_inputs) - 1: - outputs.extend( - self.model.inference( - inputs, - instances if instances[0] is not None else None, - do_postprocess=False, - ) - ) - inputs, instances = [], [] - return outputs - - def __call__(self, batched_inputs): - """ - Same input/output format as :meth:`GeneralizedRCNN.forward` - """ - - def _maybe_read_image(dataset_dict): - ret = copy.copy(dataset_dict) - if "image" not in ret: - image = read_image(ret.pop("file_name"), self.model.input_format) - image = torch.from_numpy(np.ascontiguousarray(image.transpose(2, 0, 1))) # CHW - ret["image"] = image - if "height" not in ret and "width" not in ret: - ret["height"] = image.shape[1] - ret["width"] = image.shape[2] - return ret - - return [self._inference_one_image(_maybe_read_image(x)) for x in batched_inputs] - - def _inference_one_image(self, input): - """ - Args: - input (dict): one dataset dict with "image" field being a CHW tensor - - Returns: - dict: one output dict - """ - orig_shape = (input["height"], input["width"]) - augmented_inputs, tfms = self._get_augmented_inputs(input) - # Detect boxes from all augmented versions - with self._turn_off_roi_heads(["mask_on", "keypoint_on"]): - # temporarily disable roi heads - all_boxes, all_scores, all_classes = self._get_augmented_boxes(augmented_inputs, tfms) - # merge all detected boxes to obtain final predictions for boxes - merged_instances = self._merge_detections(all_boxes, all_scores, all_classes, orig_shape) - - if self.cfg.MODEL.MASK_ON: - # Use the detected boxes to obtain masks - augmented_instances = self._rescale_detected_boxes( - augmented_inputs, merged_instances, tfms - ) - # run forward on the detected boxes - outputs = self._batch_inference(augmented_inputs, augmented_instances) - # Delete now useless variables to avoid being out of memory - del augmented_inputs, augmented_instances - # average the predictions - merged_instances.pred_masks = self._reduce_pred_masks(outputs, tfms) - merged_instances = detector_postprocess(merged_instances, *orig_shape) - return {"instances": merged_instances} - else: - return {"instances": merged_instances} - - def _get_augmented_inputs(self, input): - augmented_inputs = self.tta_mapper(input) - tfms = [x.pop("transforms") for x in augmented_inputs] - return augmented_inputs, tfms - - def _get_augmented_boxes(self, augmented_inputs, tfms): - # 1: forward with all augmented images - outputs = self._batch_inference(augmented_inputs) - # 2: union the results - all_boxes = [] - all_scores = [] - all_classes = [] - for output, tfm in zip(outputs, tfms): - # Need to inverse the transforms on boxes, to obtain results on original image - pred_boxes = output.pred_boxes.tensor - original_pred_boxes = tfm.inverse().apply_box(pred_boxes.cpu().numpy()) - all_boxes.append(torch.from_numpy(original_pred_boxes).to(pred_boxes.device)) - - all_scores.extend(output.scores) - all_classes.extend(output.pred_classes) - all_boxes = torch.cat(all_boxes, dim=0) - return all_boxes, all_scores, all_classes - - def _merge_detections(self, all_boxes, all_scores, all_classes, shape_hw): - # select from the union of all results - num_boxes = 
len(all_boxes) - num_classes = self.cfg.MODEL.ROI_HEADS.NUM_CLASSES - # +1 because fast_rcnn_inference expects background scores as well - all_scores_2d = torch.zeros(num_boxes, num_classes + 1, device=all_boxes.device) - for idx, cls, score in zip(count(), all_classes, all_scores): - all_scores_2d[idx, cls] = score - - merged_instances, _ = fast_rcnn_inference_single_image( - all_boxes, - all_scores_2d, - shape_hw, - 1e-8, - self.cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST, - self.cfg.TEST.DETECTIONS_PER_IMAGE, - ) - - return merged_instances - - def _rescale_detected_boxes(self, augmented_inputs, merged_instances, tfms): - augmented_instances = [] - for input, tfm in zip(augmented_inputs, tfms): - # Transform the target box to the augmented image's coordinate space - pred_boxes = merged_instances.pred_boxes.tensor.cpu().numpy() - pred_boxes = torch.from_numpy(tfm.apply_box(pred_boxes)) - - aug_instances = Instances( - image_size=input["image"].shape[1:3], - pred_boxes=Boxes(pred_boxes), - pred_classes=merged_instances.pred_classes, - scores=merged_instances.scores, - ) - augmented_instances.append(aug_instances) - return augmented_instances - - def _reduce_pred_masks(self, outputs, tfms): - # Should apply inverse transforms on masks. - # We assume only resize & flip are used. pred_masks is a scale-invariant - # representation, so we handle flip specially - for output, tfm in zip(outputs, tfms): - if any(isinstance(t, HFlipTransform) for t in tfm.transforms): - output.pred_masks = output.pred_masks.flip(dims=[3]) - all_pred_masks = torch.stack([o.pred_masks for o in outputs], dim=0) - avg_pred_masks = torch.mean(all_pred_masks, dim=0) - return avg_pred_masks diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md deleted file mode 100755 index 95afe7ff..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/README.md +++ /dev/null @@ -1,2 +0,0 @@ - -Projects live in the [`projects` directory](../../projects) under the root of this repository, but not here. diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py deleted file mode 100755 index a68207db..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/projects/__init__.py +++ /dev/null @@ -1,31 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
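`_reduce_pred_masks` above undoes the horizontal flip before averaging mask logits across augmentations. A small sketch of that align-then-average step, assuming `(N, 1, H, W)` mask tensors and a per-augmentation flip flag (both names below are made up for the example):

```python
import torch

def reduce_masks(masks_per_aug, was_flipped):
    # masks_per_aug: list of (N, 1, Hm, Wm) mask logits, one per augmentation;
    # was_flipped: whether that augmentation used a horizontal flip.
    aligned = [m.flip(dims=[3]) if f else m  # undo the flip on the W axis
               for m, f in zip(masks_per_aug, was_flipped)]
    return torch.stack(aligned, dim=0).mean(dim=0)

masks = [torch.rand(2, 1, 28, 28) for _ in range(3)]
avg = reduce_masks(masks, [False, True, False])
assert avg.shape == (2, 1, 28, 28)
```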
-import importlib -from pathlib import Path - -_PROJECTS = { - "point_rend": "PointRend", - "deeplab": "DeepLab", - "panoptic_deeplab": "Panoptic-DeepLab", -} -_PROJECT_ROOT = Path(__file__).resolve().parent.parent.parent / "projects" - -if _PROJECT_ROOT.is_dir(): - # This is true only for in-place installation (pip install -e, setup.py develop), - # where setup(package_dir=) does not work: https://github.com/pypa/setuptools/issues/230 - - class _D2ProjectsFinder(importlib.abc.MetaPathFinder): - def find_spec(self, name, path, target=None): - if not name.startswith("detectron2.projects."): - return - project_name = name.split(".")[-1] - project_dir = _PROJECTS.get(project_name) - if not project_dir: - return - target_file = _PROJECT_ROOT / f"{project_dir}/{project_name}/__init__.py" - if not target_file.is_file(): - return - return importlib.util.spec_from_file_location(name, target_file) - - import sys - - sys.meta_path.append(_D2ProjectsFinder()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py deleted file mode 100755 index 9a2dbd35..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .build import build_lr_scheduler, build_optimizer, get_default_optimizer_params -from .lr_scheduler import WarmupCosineLR, WarmupMultiStepLR, LRMultiplier, WarmupParamScheduler - -__all__ = [k for k in globals().keys() if not k.startswith("_")] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py deleted file mode 100755 index 1989dfcd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/build.py +++ /dev/null @@ -1,285 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import logging -from collections import defaultdict -from enum import Enum -from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Type, Union -import torch -from fvcore.common.param_scheduler import CosineParamScheduler, MultiStepParamScheduler - -from detectron2.config import CfgNode - -from .lr_scheduler import LRMultiplier, WarmupParamScheduler - -_GradientClipperInput = Union[torch.Tensor, Iterable[torch.Tensor]] -_GradientClipper = Callable[[_GradientClipperInput], None] - - -class GradientClipType(Enum): - VALUE = "value" - NORM = "norm" - - -def _create_gradient_clipper(cfg: CfgNode) -> _GradientClipper: - """ - Creates gradient clipping closure to clip by value or by norm, - according to the provided config. 
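The deleted `projects/__init__.py` above registers a `MetaPathFinder` that maps an import name onto a file outside the package tree. A minimal runnable sketch of the same trick, using a throwaway temp-file module and a hypothetical top-level name `demo_virtual`; note the explicit `import importlib.abc`, which the deleted file leaned on implicitly:

```python
import importlib.abc
import importlib.util
import pathlib
import sys
import tempfile

# A throwaway module on disk for the finder to resolve.
root = pathlib.Path(tempfile.mkdtemp())
(root / "demo_mod.py").write_text("ANSWER = 42\n")

class AliasFinder(importlib.abc.MetaPathFinder):
    def find_spec(self, name, path, target=None):
        if name != "demo_virtual":
            return None  # let the regular finders handle everything else
        return importlib.util.spec_from_file_location(
            name, str(root / "demo_mod.py"))

sys.meta_path.append(AliasFinder())
mod = importlib.import_module("demo_virtual")
assert mod.ANSWER == 42
```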
- """ - cfg = copy.deepcopy(cfg) - - def clip_grad_norm(p: _GradientClipperInput): - torch.nn.utils.clip_grad_norm_(p, cfg.CLIP_VALUE, cfg.NORM_TYPE) - - def clip_grad_value(p: _GradientClipperInput): - torch.nn.utils.clip_grad_value_(p, cfg.CLIP_VALUE) - - _GRADIENT_CLIP_TYPE_TO_CLIPPER = { - GradientClipType.VALUE: clip_grad_value, - GradientClipType.NORM: clip_grad_norm, - } - return _GRADIENT_CLIP_TYPE_TO_CLIPPER[GradientClipType(cfg.CLIP_TYPE)] - - -def _generate_optimizer_class_with_gradient_clipping( - optimizer: Type[torch.optim.Optimizer], - *, - per_param_clipper: Optional[_GradientClipper] = None, - global_clipper: Optional[_GradientClipper] = None, -) -> Type[torch.optim.Optimizer]: - """ - Dynamically creates a new type that inherits the type of a given instance - and overrides the `step` method to add gradient clipping - """ - assert ( - per_param_clipper is None or global_clipper is None - ), "Not allowed to use both per-parameter clipping and global clipping" - - def optimizer_wgc_step(self, closure=None): - if per_param_clipper is not None: - for group in self.param_groups: - for p in group["params"]: - per_param_clipper(p) - else: - # global clipper for future use with detr - # (https://github.com/facebookresearch/detr/pull/287) - all_params = itertools.chain(*[g["params"] for g in self.param_groups]) - global_clipper(all_params) - super(type(self), self).step(closure) - - OptimizerWithGradientClip = type( - optimizer.__name__ + "WithGradientClip", - (optimizer,), - {"step": optimizer_wgc_step}, - ) - return OptimizerWithGradientClip - - -def maybe_add_gradient_clipping( - cfg: CfgNode, optimizer: Type[torch.optim.Optimizer] -) -> Type[torch.optim.Optimizer]: - """ - If gradient clipping is enabled through config options, wraps the existing - optimizer type to become a new dynamically created class OptimizerWithGradientClip - that inherits the given optimizer and overrides the `step` method to - include gradient clipping. - - Args: - cfg: CfgNode, configuration options - optimizer: type. A subclass of torch.optim.Optimizer - - Return: - type: either the input `optimizer` (if gradient clipping is disabled), or - a subclass of it with gradient clipping included in the `step` method. - """ - if not cfg.SOLVER.CLIP_GRADIENTS.ENABLED: - return optimizer - if isinstance(optimizer, torch.optim.Optimizer): - optimizer_type = type(optimizer) - else: - assert issubclass(optimizer, torch.optim.Optimizer), optimizer - optimizer_type = optimizer - - grad_clipper = _create_gradient_clipper(cfg.SOLVER.CLIP_GRADIENTS) - OptimizerWithGradientClip = _generate_optimizer_class_with_gradient_clipping( - optimizer_type, per_param_clipper=grad_clipper - ) - if isinstance(optimizer, torch.optim.Optimizer): - optimizer.__class__ = OptimizerWithGradientClip # a bit hacky, not recommended - return optimizer - else: - return OptimizerWithGradientClip - - -def build_optimizer(cfg: CfgNode, model: torch.nn.Module) -> torch.optim.Optimizer: - """ - Build an optimizer from config. 
- """ - params = get_default_optimizer_params( - model, - base_lr=cfg.SOLVER.BASE_LR, - weight_decay_norm=cfg.SOLVER.WEIGHT_DECAY_NORM, - bias_lr_factor=cfg.SOLVER.BIAS_LR_FACTOR, - weight_decay_bias=cfg.SOLVER.WEIGHT_DECAY_BIAS, - ) - return maybe_add_gradient_clipping(cfg, torch.optim.SGD)( - params, - lr=cfg.SOLVER.BASE_LR, - momentum=cfg.SOLVER.MOMENTUM, - nesterov=cfg.SOLVER.NESTEROV, - weight_decay=cfg.SOLVER.WEIGHT_DECAY, - ) - - -def get_default_optimizer_params( - model: torch.nn.Module, - base_lr: Optional[float] = None, - weight_decay: Optional[float] = None, - weight_decay_norm: Optional[float] = None, - bias_lr_factor: Optional[float] = 1.0, - weight_decay_bias: Optional[float] = None, - overrides: Optional[Dict[str, Dict[str, float]]] = None, -) -> List[Dict[str, Any]]: - """ - Get default param list for optimizer, with support for a few types of - overrides. If no overrides needed, this is equivalent to `model.parameters()`. - - Args: - base_lr: lr for every group by default. Can be omitted to use the one in optimizer. - weight_decay: weight decay for every group by default. Can be omitted to use the one - in optimizer. - weight_decay_norm: override weight decay for params in normalization layers - bias_lr_factor: multiplier of lr for bias parameters. - weight_decay_bias: override weight decay for bias parameters - overrides: if not `None`, provides values for optimizer hyperparameters - (LR, weight decay) for module parameters with a given name; e.g. - ``{"embedding": {"lr": 0.01, "weight_decay": 0.1}}`` will set the LR and - weight decay values for all module parameters named `embedding`. - - For common detection models, ``weight_decay_norm`` is the only option - needed to be set. ``bias_lr_factor,weight_decay_bias`` are legacy settings - from Detectron1 that are not found useful. - - Example: - :: - torch.optim.SGD(get_default_optimizer_params(model, weight_decay_norm=0), - lr=0.01, weight_decay=1e-4, momentum=0.9) - """ - if overrides is None: - overrides = {} - defaults = {} - if base_lr is not None: - defaults["lr"] = base_lr - if weight_decay is not None: - defaults["weight_decay"] = weight_decay - bias_overrides = {} - if bias_lr_factor is not None and bias_lr_factor != 1.0: - # NOTE: unlike Detectron v1, we now by default make bias hyperparameters - # exactly the same as regular weights. 
- if base_lr is None: - raise ValueError("bias_lr_factor requires base_lr") - bias_overrides["lr"] = base_lr * bias_lr_factor - if weight_decay_bias is not None: - bias_overrides["weight_decay"] = weight_decay_bias - if len(bias_overrides): - if "bias" in overrides: - raise ValueError("Conflicting overrides for 'bias'") - overrides["bias"] = bias_overrides - - norm_module_types = ( - torch.nn.BatchNorm1d, - torch.nn.BatchNorm2d, - torch.nn.BatchNorm3d, - torch.nn.SyncBatchNorm, - # NaiveSyncBatchNorm inherits from BatchNorm2d - torch.nn.GroupNorm, - torch.nn.InstanceNorm1d, - torch.nn.InstanceNorm2d, - torch.nn.InstanceNorm3d, - torch.nn.LayerNorm, - torch.nn.LocalResponseNorm, - ) - params: List[Dict[str, Any]] = [] - memo: Set[torch.nn.parameter.Parameter] = set() - for module in model.modules(): - for module_param_name, value in module.named_parameters(recurse=False): - if not value.requires_grad: - continue - # Avoid duplicating parameters - if value in memo: - continue - memo.add(value) - - hyperparams = copy.copy(defaults) - if isinstance(module, norm_module_types) and weight_decay_norm is not None: - hyperparams["weight_decay"] = weight_decay_norm - hyperparams.update(overrides.get(module_param_name, {})) - params.append({"params": [value], **hyperparams}) - return reduce_param_groups(params) - - -def _expand_param_groups(params: List[Dict[str, Any]]) -> List[Dict[str, Any]]: - # Transform parameter groups into per-parameter structure. - # Later items in `params` can overwrite parameters set in previous items. - ret = defaultdict(dict) - for item in params: - assert "params" in item - cur_params = {x: y for x, y in item.items() if x != "params"} - for param in item["params"]: - ret[param].update({"params": [param], **cur_params}) - return list(ret.values()) - - -def reduce_param_groups(params: List[Dict[str, Any]]) -> List[Dict[str, Any]]: - # Reorganize the parameter groups and merge duplicated groups. - # The number of parameter groups needs to be as small as possible in order - # to efficiently use the PyTorch multi-tensor optimizer. Therefore instead - # of using a parameter_group per single parameter, we reorganize the - # parameter groups and merge duplicated groups. This approach speeds - # up multi-tensor optimizer significantly. - params = _expand_param_groups(params) - groups = defaultdict(list) # re-group all parameter groups by their hyperparams - for item in params: - cur_params = tuple((x, y) for x, y in item.items() if x != "params") - groups[cur_params].extend(item["params"]) - ret = [] - for param_keys, param_values in groups.items(): - cur = {kv[0]: kv[1] for kv in param_keys} - cur["params"] = param_values - ret.append(cur) - return ret - - -def build_lr_scheduler( - cfg: CfgNode, optimizer: torch.optim.Optimizer -) -> torch.optim.lr_scheduler._LRScheduler: - """ - Build a LR scheduler from config. - """ - name = cfg.SOLVER.LR_SCHEDULER_NAME - - if name == "WarmupMultiStepLR": - steps = [x for x in cfg.SOLVER.STEPS if x <= cfg.SOLVER.MAX_ITER] - if len(steps) != len(cfg.SOLVER.STEPS): - logger = logging.getLogger(__name__) - logger.warning( - "SOLVER.STEPS contains values larger than SOLVER.MAX_ITER. " - "These values will be ignored." 
- ) - sched = MultiStepParamScheduler( - values=[cfg.SOLVER.GAMMA ** k for k in range(len(steps) + 1)], - milestones=steps, - num_updates=cfg.SOLVER.MAX_ITER, - ) - elif name == "WarmupCosineLR": - sched = CosineParamScheduler(1, 0) - else: - raise ValueError("Unknown LR scheduler: {}".format(name)) - - sched = WarmupParamScheduler( - sched, - cfg.SOLVER.WARMUP_FACTOR, - min(cfg.SOLVER.WARMUP_ITERS / cfg.SOLVER.MAX_ITER, 1.0), - cfg.SOLVER.WARMUP_METHOD, - ) - return LRMultiplier(optimizer, multiplier=sched, max_iter=cfg.SOLVER.MAX_ITER) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py deleted file mode 100755 index 8803e87b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/solver/lr_scheduler.py +++ /dev/null @@ -1,238 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import math -from bisect import bisect_right -from typing import List -import torch -from fvcore.common.param_scheduler import ( - CompositeParamScheduler, - ConstantParamScheduler, - LinearParamScheduler, - ParamScheduler, -) - -logger = logging.getLogger(__name__) - - -class WarmupParamScheduler(CompositeParamScheduler): - """ - Add an initial warmup stage to another scheduler. - """ - - def __init__( - self, - scheduler: ParamScheduler, - warmup_factor: float, - warmup_length: float, - warmup_method: str = "linear", - ): - """ - Args: - scheduler: warmup will be added at the beginning of this scheduler - warmup_factor: the factor w.r.t the initial value of ``scheduler``, e.g. 0.001 - warmup_length: the relative length (in [0, 1]) of warmup steps w.r.t the entire - training, e.g. 0.01 - warmup_method: one of "linear" or "constant" - """ - end_value = scheduler(warmup_length) # the value to reach when warmup ends - start_value = warmup_factor * scheduler(0.0) - if warmup_method == "constant": - warmup = ConstantParamScheduler(start_value) - elif warmup_method == "linear": - warmup = LinearParamScheduler(start_value, end_value) - else: - raise ValueError("Unknown warmup method: {}".format(warmup_method)) - super().__init__( - [warmup, scheduler], - interval_scaling=["rescaled", "fixed"], - lengths=[warmup_length, 1 - warmup_length], - ) - - -class LRMultiplier(torch.optim.lr_scheduler._LRScheduler): - """ - A LRScheduler which uses fvcore :class:`ParamScheduler` to multiply the - learning rate of each param in the optimizer. - Every step, the learning rate of each parameter becomes its initial value - multiplied by the output of the given :class:`ParamScheduler`. - - The absolute learning rate value of each parameter can be different. - This scheduler can be used as long as the relative scale among them do - not change during training. - - Examples: - :: - LRMultiplier( - opt, - WarmupParamScheduler( - MultiStepParamScheduler( - [1, 0.1, 0.01], - milestones=[60000, 80000], - num_updates=90000, - ), 0.001, 100 / 90000 - ), - max_iter=90000 - ) - """ - - # NOTES: in the most general case, every LR can use its own scheduler. - # Supporting this requires interaction with the optimizer when its parameter - # group is initialized. For example, classyvision implements its own optimizer - # that allows different schedulers for every parameter group. - # To avoid this complexity, we use this class to support the most common cases - # where the relative scale among all LRs stay unchanged during training. 
In this - # case we only need a total of one scheduler that defines the relative LR multiplier. - - def __init__( - self, - optimizer: torch.optim.Optimizer, - multiplier: ParamScheduler, - max_iter: int, - last_iter: int = -1, - ): - """ - Args: - optimizer, last_iter: See ``torch.optim.lr_scheduler._LRScheduler``. - ``last_iter`` is the same as ``last_epoch``. - multiplier: a fvcore ParamScheduler that defines the multiplier on - every LR of the optimizer - max_iter: the total number of training iterations - """ - if not isinstance(multiplier, ParamScheduler): - raise ValueError( - "_LRMultiplier(multiplier=) must be an instance of fvcore " - f"ParamScheduler. Got {multiplier} instead." - ) - self._multiplier = multiplier - self._max_iter = max_iter - super().__init__(optimizer, last_epoch=last_iter) - - def state_dict(self): - # fvcore schedulers are stateless. Only keep pytorch scheduler states - return {"base_lrs": self.base_lrs, "last_epoch": self.last_epoch} - - def get_lr(self) -> List[float]: - multiplier = self._multiplier(self.last_epoch / self._max_iter) - return [base_lr * multiplier for base_lr in self.base_lrs] - - -""" -Content below is no longer needed! -""" - -# NOTE: PyTorch's LR scheduler interface uses names that assume the LR changes -# only on epoch boundaries. We typically use iteration based schedules instead. -# As a result, "epoch" (e.g., as in self.last_epoch) should be understood to mean -# "iteration" instead. - -# FIXME: ideally this would be achieved with a CombinedLRScheduler, separating -# MultiStepLR with WarmupLR but the current LRScheduler design doesn't allow it. - - -class WarmupMultiStepLR(torch.optim.lr_scheduler._LRScheduler): - def __init__( - self, - optimizer: torch.optim.Optimizer, - milestones: List[int], - gamma: float = 0.1, - warmup_factor: float = 0.001, - warmup_iters: int = 1000, - warmup_method: str = "linear", - last_epoch: int = -1, - ): - logger.warning( - "WarmupMultiStepLR is deprecated! Use LRMultipilier with fvcore ParamScheduler instead!" - ) - if not list(milestones) == sorted(milestones): - raise ValueError( - "Milestones should be a list of" " increasing integers. Got {}", milestones - ) - self.milestones = milestones - self.gamma = gamma - self.warmup_factor = warmup_factor - self.warmup_iters = warmup_iters - self.warmup_method = warmup_method - super().__init__(optimizer, last_epoch) - - def get_lr(self) -> List[float]: - warmup_factor = _get_warmup_factor_at_iter( - self.warmup_method, self.last_epoch, self.warmup_iters, self.warmup_factor - ) - return [ - base_lr * warmup_factor * self.gamma ** bisect_right(self.milestones, self.last_epoch) - for base_lr in self.base_lrs - ] - - def _compute_values(self) -> List[float]: - # The new interface - return self.get_lr() - - -class WarmupCosineLR(torch.optim.lr_scheduler._LRScheduler): - def __init__( - self, - optimizer: torch.optim.Optimizer, - max_iters: int, - warmup_factor: float = 0.001, - warmup_iters: int = 1000, - warmup_method: str = "linear", - last_epoch: int = -1, - ): - logger.warning( - "WarmupCosineLR is deprecated! Use LRMultipilier with fvcore ParamScheduler instead!" 
- ) - self.max_iters = max_iters - self.warmup_factor = warmup_factor - self.warmup_iters = warmup_iters - self.warmup_method = warmup_method - super().__init__(optimizer, last_epoch) - - def get_lr(self) -> List[float]: - warmup_factor = _get_warmup_factor_at_iter( - self.warmup_method, self.last_epoch, self.warmup_iters, self.warmup_factor - ) - # Different definitions of half-cosine with warmup are possible. For - # simplicity we multiply the standard half-cosine schedule by the warmup - # factor. An alternative is to start the period of the cosine at warmup_iters - # instead of at 0. In the case that warmup_iters << max_iters the two are - # very close to each other. - return [ - base_lr - * warmup_factor - * 0.5 - * (1.0 + math.cos(math.pi * self.last_epoch / self.max_iters)) - for base_lr in self.base_lrs - ] - - def _compute_values(self) -> List[float]: - # The new interface - return self.get_lr() - - -def _get_warmup_factor_at_iter( - method: str, iter: int, warmup_iters: int, warmup_factor: float -) -> float: - """ - Return the learning rate warmup factor at a specific iteration. - See :paper:`ImageNet in 1h` for more details. - - Args: - method (str): warmup method; either "constant" or "linear". - iter (int): iteration at which to calculate the warmup factor. - warmup_iters (int): the number of warmup iterations. - warmup_factor (float): the base warmup factor (the meaning changes according - to the method used). - - Returns: - float: the effective warmup factor at the given iteration. - """ - if iter >= warmup_iters: - return 1.0 - - if method == "constant": - return warmup_factor - elif method == "linear": - alpha = iter / warmup_iters - return warmup_factor * (1 - alpha) + alpha - else: - raise ValueError("Unknown warmup method: {}".format(method)) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py deleted file mode 100755 index f3ee6057..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from .boxes import Boxes, BoxMode, pairwise_iou, pairwise_ioa, pairwise_point_box_distance -from .image_list import ImageList - -from .instances import Instances -from .keypoints import Keypoints, heatmaps_to_keypoints -from .masks import BitMasks, PolygonMasks, polygons_to_bitmask, ROIMasks -from .rotated_boxes import RotatedBoxes -from .rotated_boxes import pairwise_iou as pairwise_iou_rotated - -__all__ = [k for k in globals().keys() if not k.startswith("_")] - - -from detectron2.utils.env import fixup_module_metadata - -fixup_module_metadata(__name__, globals(), __all__) -del fixup_module_metadata diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py deleted file mode 100755 index ae543c61..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/boxes.py +++ /dev/null @@ -1,423 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import math -import numpy as np -from enum import IntEnum, unique -from typing import List, Tuple, Union -import torch -from torch import device - -_RawBoxType = Union[List[float], Tuple[float, ...], torch.Tensor, np.ndarray] - - -@unique -class BoxMode(IntEnum): - """ - Enum of different ways to represent a box. 
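Before the box structures below, a quick numeric sanity check of the schedule shape the deleted `lr_scheduler.py` composes: a linear warmup blended into the standard half-cosine decay. This re-derives the multiplier from `_get_warmup_factor_at_iter` and `WarmupCosineLR` above (the function name here is ad hoc):

```python
import math

def warmup_cosine_multiplier(it, max_iter, warmup_iters=1000, warmup_factor=0.001):
    # Linear warmup (as in _get_warmup_factor_at_iter) multiplied into the
    # half-cosine decay used by WarmupCosineLR.
    if it >= warmup_iters:
        warmup = 1.0
    else:
        alpha = it / warmup_iters
        warmup = warmup_factor * (1 - alpha) + alpha
    return warmup * 0.5 * (1.0 + math.cos(math.pi * it / max_iter))

assert abs(warmup_cosine_multiplier(0, 90000) - 0.001) < 1e-9
assert abs(warmup_cosine_multiplier(45000, 90000) - 0.5) < 1e-6
assert warmup_cosine_multiplier(90000, 90000) == 0.0
```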
- """ - - XYXY_ABS = 0 - """ - (x0, y0, x1, y1) in absolute floating points coordinates. - The coordinates in range [0, width or height]. - """ - XYWH_ABS = 1 - """ - (x0, y0, w, h) in absolute floating points coordinates. - """ - XYXY_REL = 2 - """ - Not yet supported! - (x0, y0, x1, y1) in range [0, 1]. They are relative to the size of the image. - """ - XYWH_REL = 3 - """ - Not yet supported! - (x0, y0, w, h) in range [0, 1]. They are relative to the size of the image. - """ - XYWHA_ABS = 4 - """ - (xc, yc, w, h, a) in absolute floating points coordinates. - (xc, yc) is the center of the rotated box, and the angle a is in degrees ccw. - """ - - @staticmethod - def convert(box: _RawBoxType, from_mode: "BoxMode", to_mode: "BoxMode") -> _RawBoxType: - """ - Args: - box: can be a k-tuple, k-list or an Nxk array/tensor, where k = 4 or 5 - from_mode, to_mode (BoxMode) - - Returns: - The converted box of the same type. - """ - if from_mode == to_mode: - return box - - original_type = type(box) - is_numpy = isinstance(box, np.ndarray) - single_box = isinstance(box, (list, tuple)) - if single_box: - assert len(box) == 4 or len(box) == 5, ( - "BoxMode.convert takes either a k-tuple/list or an Nxk array/tensor," - " where k == 4 or 5" - ) - arr = torch.tensor(box)[None, :] - else: - # avoid modifying the input box - if is_numpy: - arr = torch.from_numpy(np.asarray(box)).clone() - else: - arr = box.clone() - - assert to_mode not in [BoxMode.XYXY_REL, BoxMode.XYWH_REL] and from_mode not in [ - BoxMode.XYXY_REL, - BoxMode.XYWH_REL, - ], "Relative mode not yet supported!" - - if from_mode == BoxMode.XYWHA_ABS and to_mode == BoxMode.XYXY_ABS: - assert ( - arr.shape[-1] == 5 - ), "The last dimension of input shape must be 5 for XYWHA format" - original_dtype = arr.dtype - arr = arr.double() - - w = arr[:, 2] - h = arr[:, 3] - a = arr[:, 4] - c = torch.abs(torch.cos(a * math.pi / 180.0)) - s = torch.abs(torch.sin(a * math.pi / 180.0)) - # This basically computes the horizontal bounding rectangle of the rotated box - new_w = c * w + s * h - new_h = c * h + s * w - - # convert center to top-left corner - arr[:, 0] -= new_w / 2.0 - arr[:, 1] -= new_h / 2.0 - # bottom-right corner - arr[:, 2] = arr[:, 0] + new_w - arr[:, 3] = arr[:, 1] + new_h - - arr = arr[:, :4].to(dtype=original_dtype) - elif from_mode == BoxMode.XYWH_ABS and to_mode == BoxMode.XYWHA_ABS: - original_dtype = arr.dtype - arr = arr.double() - arr[:, 0] += arr[:, 2] / 2.0 - arr[:, 1] += arr[:, 3] / 2.0 - angles = torch.zeros((arr.shape[0], 1), dtype=arr.dtype) - arr = torch.cat((arr, angles), axis=1).to(dtype=original_dtype) - else: - if to_mode == BoxMode.XYXY_ABS and from_mode == BoxMode.XYWH_ABS: - arr[:, 2] += arr[:, 0] - arr[:, 3] += arr[:, 1] - elif from_mode == BoxMode.XYXY_ABS and to_mode == BoxMode.XYWH_ABS: - arr[:, 2] -= arr[:, 0] - arr[:, 3] -= arr[:, 1] - else: - raise NotImplementedError( - "Conversion from BoxMode {} to {} is not supported yet".format( - from_mode, to_mode - ) - ) - - if single_box: - return original_type(arr.flatten().tolist()) - if is_numpy: - return arr.numpy() - else: - return arr - - -class Boxes: - """ - This structure stores a list of boxes as a Nx4 torch.Tensor. - It supports some common methods about boxes - (`area`, `clip`, `nonempty`, etc), - and also behaves like a Tensor - (support indexing, `to(device)`, `.device`, and iteration over all boxes) - - Attributes: - tensor (torch.Tensor): float matrix of Nx4. Each row is (x1, y1, x2, y2). 
- """ - - def __init__(self, tensor: torch.Tensor): - """ - Args: - tensor (Tensor[float]): a Nx4 matrix. Each row is (x1, y1, x2, y2). - """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.float32, device=device) - if tensor.numel() == 0: - # Use reshape, so we don't end up creating a new tensor that does not depend on - # the inputs (and consequently confuses jit) - tensor = tensor.reshape((-1, 4)).to(dtype=torch.float32, device=device) - assert tensor.dim() == 2 and tensor.size(-1) == 4, tensor.size() - - self.tensor = tensor - - def clone(self) -> "Boxes": - """ - Clone the Boxes. - - Returns: - Boxes - """ - return Boxes(self.tensor.clone()) - - def to(self, device: torch.device): - # Boxes are assumed float32 and does not support to(dtype) - return Boxes(self.tensor.to(device=device)) - - def area(self) -> torch.Tensor: - """ - Computes the area of all the boxes. - - Returns: - torch.Tensor: a vector with areas of each box. - """ - box = self.tensor - area = (box[:, 2] - box[:, 0]) * (box[:, 3] - box[:, 1]) - return area - - def clip(self, box_size: Tuple[int, int]) -> None: - """ - Clip (in place) the boxes by limiting x coordinates to the range [0, width] - and y coordinates to the range [0, height]. - - Args: - box_size (height, width): The clipping box's size. - """ - assert torch.isfinite(self.tensor).all(), "Box tensor contains infinite or NaN!" - h, w = box_size - x1 = self.tensor[:, 0].clamp(min=0, max=w) - y1 = self.tensor[:, 1].clamp(min=0, max=h) - x2 = self.tensor[:, 2].clamp(min=0, max=w) - y2 = self.tensor[:, 3].clamp(min=0, max=h) - self.tensor = torch.stack((x1, y1, x2, y2), dim=-1) - - def nonempty(self, threshold: float = 0.0) -> torch.Tensor: - """ - Find boxes that are non-empty. - A box is considered empty, if either of its side is no larger than threshold. - - Returns: - Tensor: - a binary vector which represents whether each box is empty - (False) or non-empty (True). - """ - box = self.tensor - widths = box[:, 2] - box[:, 0] - heights = box[:, 3] - box[:, 1] - keep = (widths > threshold) & (heights > threshold) - return keep - - def __getitem__(self, item) -> "Boxes": - """ - Args: - item: int, slice, or a BoolTensor - - Returns: - Boxes: Create a new :class:`Boxes` by indexing. - - The following usage are allowed: - - 1. `new_boxes = boxes[3]`: return a `Boxes` which contains only one box. - 2. `new_boxes = boxes[2:10]`: return a slice of boxes. - 3. `new_boxes = boxes[vector]`, where vector is a torch.BoolTensor - with `length = len(boxes)`. Nonzero elements in the vector will be selected. - - Note that the returned Boxes might share storage with this Boxes, - subject to Pytorch's indexing semantics. - """ - if isinstance(item, int): - return Boxes(self.tensor[item].view(1, -1)) - b = self.tensor[item] - assert b.dim() == 2, "Indexing on Boxes with {} failed to return a matrix!".format(item) - return Boxes(b) - - def __len__(self) -> int: - return self.tensor.shape[0] - - def __repr__(self) -> str: - return "Boxes(" + str(self.tensor) + ")" - - def inside_box(self, box_size: Tuple[int, int], boundary_threshold: int = 0) -> torch.Tensor: - """ - Args: - box_size (height, width): Size of the reference box. - boundary_threshold (int): Boxes that extend beyond the reference box - boundary by more than boundary_threshold are considered "outside". - - Returns: - a binary vector, indicating whether each box is inside the reference box. 
- """ - height, width = box_size - inds_inside = ( - (self.tensor[..., 0] >= -boundary_threshold) - & (self.tensor[..., 1] >= -boundary_threshold) - & (self.tensor[..., 2] < width + boundary_threshold) - & (self.tensor[..., 3] < height + boundary_threshold) - ) - return inds_inside - - def get_centers(self) -> torch.Tensor: - """ - Returns: - The box centers in a Nx2 array of (x, y). - """ - return (self.tensor[:, :2] + self.tensor[:, 2:]) / 2 - - def scale(self, scale_x: float, scale_y: float) -> None: - """ - Scale the box with horizontal and vertical scaling factors - """ - self.tensor[:, 0::2] *= scale_x - self.tensor[:, 1::2] *= scale_y - - @classmethod - def cat(cls, boxes_list: List["Boxes"]) -> "Boxes": - """ - Concatenates a list of Boxes into a single Boxes - - Arguments: - boxes_list (list[Boxes]) - - Returns: - Boxes: the concatenated Boxes - """ - assert isinstance(boxes_list, (list, tuple)) - if len(boxes_list) == 0: - return cls(torch.empty(0)) - assert all([isinstance(box, Boxes) for box in boxes_list]) - - # use torch.cat (v.s. layers.cat) so the returned boxes never share storage with input - cat_boxes = cls(torch.cat([b.tensor for b in boxes_list], dim=0)) - return cat_boxes - - @property - def device(self) -> device: - return self.tensor.device - - # type "Iterator[torch.Tensor]", yield, and iter() not supported by torchscript - # https://github.com/pytorch/pytorch/issues/18627 - @torch.jit.unused - def __iter__(self): - """ - Yield a box as a Tensor of shape (4,) at a time. - """ - yield from self.tensor - - -def pairwise_intersection(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Given two lists of boxes of size N and M, - compute the intersection area between __all__ N x M pairs of boxes. - The box order must be (xmin, ymin, xmax, ymax) - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: intersection, sized [N,M]. - """ - boxes1, boxes2 = boxes1.tensor, boxes2.tensor - width_height = torch.min(boxes1[:, None, 2:], boxes2[:, 2:]) - torch.max( - boxes1[:, None, :2], boxes2[:, :2] - ) # [N,M,2] - - width_height.clamp_(min=0) # [N,M,2] - intersection = width_height.prod(dim=2) # [N,M] - return intersection - - -# implementation from https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py -# with slight modifications -def pairwise_iou(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Given two lists of boxes of size N and M, compute the IoU - (intersection over union) between **all** N x M pairs of boxes. - The box order must be (xmin, ymin, xmax, ymax). - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: IoU, sized [N,M]. - """ - area1 = boxes1.area() # [N] - area2 = boxes2.area() # [M] - inter = pairwise_intersection(boxes1, boxes2) - - # handle empty boxes - iou = torch.where( - inter > 0, - inter / (area1[:, None] + area2 - inter), - torch.zeros(1, dtype=inter.dtype, device=inter.device), - ) - return iou - - -def pairwise_ioa(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Similar to :func:`pariwise_iou` but compute the IoA (intersection over boxes2 area). - - Args: - boxes1,boxes2 (Boxes): two `Boxes`. Contains N & M boxes, respectively. - - Returns: - Tensor: IoA, sized [N,M]. 
- """ - area2 = boxes2.area() # [M] - inter = pairwise_intersection(boxes1, boxes2) - - # handle empty boxes - ioa = torch.where( - inter > 0, inter / area2, torch.zeros(1, dtype=inter.dtype, device=inter.device) - ) - return ioa - - -def pairwise_point_box_distance(points: torch.Tensor, boxes: Boxes): - """ - Pairwise distance between N points and M boxes. The distance between a - point and a box is represented by the distance from the point to 4 edges - of the box. Distances are all positive when the point is inside the box. - - Args: - points: Nx2 coordinates. Each row is (x, y) - boxes: M boxes - - Returns: - Tensor: distances of size (N, M, 4). The 4 values are distances from - the point to the left, top, right, bottom of the box. - """ - x, y = points.unsqueeze(dim=2).unbind(dim=1) # (N, 1) - x0, y0, x1, y1 = boxes.tensor.unsqueeze(dim=0).unbind(dim=2) # (1, M) - return torch.stack([x - x0, y - y0, x1 - x, y1 - y], dim=2) - - -def matched_pairwise_iou(boxes1: Boxes, boxes2: Boxes) -> torch.Tensor: - """ - Compute pairwise intersection over union (IOU) of two sets of matched - boxes that have the same number of boxes. - Similar to :func:`pairwise_iou`, but computes only diagonal elements of the matrix. - - Args: - boxes1 (Boxes): bounding boxes, sized [N,4]. - boxes2 (Boxes): same length as boxes1 - Returns: - Tensor: iou, sized [N]. - """ - assert len(boxes1) == len( - boxes2 - ), "boxlists should have the same" "number of entries, got {}, {}".format( - len(boxes1), len(boxes2) - ) - area1 = boxes1.area() # [N] - area2 = boxes2.area() # [N] - box1, box2 = boxes1.tensor, boxes2.tensor - lt = torch.max(box1[:, :2], box2[:, :2]) # [N,2] - rb = torch.min(box1[:, 2:], box2[:, 2:]) # [N,2] - wh = (rb - lt).clamp(min=0) # [N,2] - inter = wh[:, 0] * wh[:, 1] # [N] - iou = inter / (area1 + area2 - inter) # [N] - return iou diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py deleted file mode 100755 index b31b2d39..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/image_list.py +++ /dev/null @@ -1,110 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from __future__ import division -from typing import Any, List, Tuple -import torch -from torch import device -from torch.nn import functional as F - -from detectron2.layers.wrappers import shapes_to_tensor - - -class ImageList(object): - """ - Structure that holds a list of images (of possibly - varying sizes) as a single tensor. - This works by padding the images to the same size. - The original sizes of each image is stored in `image_sizes`. - - Attributes: - image_sizes (list[tuple[int, int]]): each tuple is (h, w). - During tracing, it becomes list[Tensor] instead. - """ - - def __init__(self, tensor: torch.Tensor, image_sizes: List[Tuple[int, int]]): - """ - Arguments: - tensor (Tensor): of shape (N, H, W) or (N, C_1, ..., C_K, H, W) where K >= 1 - image_sizes (list[tuple[int, int]]): Each tuple is (h, w). It can - be smaller than (H, W) due to padding. - """ - self.tensor = tensor - self.image_sizes = image_sizes - - def __len__(self) -> int: - return len(self.image_sizes) - - def __getitem__(self, idx) -> torch.Tensor: - """ - Access the individual image in its original size. 
- - Args: - idx: int or slice - - Returns: - Tensor: an image of shape (H, W) or (C_1, ..., C_K, H, W) where K >= 1 - """ - size = self.image_sizes[idx] - return self.tensor[idx, ..., : size[0], : size[1]] - - @torch.jit.unused - def to(self, *args: Any, **kwargs: Any) -> "ImageList": - cast_tensor = self.tensor.to(*args, **kwargs) - return ImageList(cast_tensor, self.image_sizes) - - @property - def device(self) -> device: - return self.tensor.device - - @staticmethod - def from_tensors( - tensors: List[torch.Tensor], size_divisibility: int = 0, pad_value: float = 0.0 - ) -> "ImageList": - """ - Args: - tensors: a tuple or list of `torch.Tensor`, each of shape (Hi, Wi) or - (C_1, ..., C_K, Hi, Wi) where K >= 1. The Tensors will be padded - to the same shape with `pad_value`. - size_divisibility (int): If `size_divisibility > 0`, add padding to ensure - the common height and width is divisible by `size_divisibility`. - This depends on the model and many models need a divisibility of 32. - pad_value (float): value to pad - - Returns: - an `ImageList`. - """ - assert len(tensors) > 0 - assert isinstance(tensors, (tuple, list)) - for t in tensors: - assert isinstance(t, torch.Tensor), type(t) - assert t.shape[:-2] == tensors[0].shape[:-2], t.shape - - image_sizes = [(im.shape[-2], im.shape[-1]) for im in tensors] - image_sizes_tensor = [shapes_to_tensor(x) for x in image_sizes] - max_size = torch.stack(image_sizes_tensor).max(0).values - - if size_divisibility > 1: - stride = size_divisibility - # the last two dims are H,W, both subject to divisibility requirement - max_size = (max_size + (stride - 1)).div(stride, rounding_mode="floor") * stride - - # handle weirdness of scripting and tracing ... - if torch.jit.is_scripting(): - max_size: List[int] = max_size.to(dtype=torch.long).tolist() - else: - if torch.jit.is_tracing(): - image_sizes = image_sizes_tensor - - if len(tensors) == 1: - # This seems slightly (2%) faster. - # TODO: check whether it's faster for multiple images as well - image_size = image_sizes[0] - padding_size = [0, max_size[-1] - image_size[1], 0, max_size[-2] - image_size[0]] - batched_imgs = F.pad(tensors[0], padding_size, value=pad_value).unsqueeze_(0) - else: - # max_size can be a tensor in tracing mode, therefore convert to list - batch_shape = [len(tensors)] + list(tensors[0].shape[:-2]) + list(max_size) - batched_imgs = tensors[0].new_full(batch_shape, pad_value) - for img, pad_img in zip(tensors, batched_imgs): - pad_img[..., : img.shape[-2], : img.shape[-1]].copy_(img) - - return ImageList(batched_imgs.contiguous(), image_sizes) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py deleted file mode 100755 index 612e66f5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/instances.py +++ /dev/null @@ -1,192 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import itertools -from typing import Any, Dict, List, Tuple, Union -import torch - - -class Instances: - """ - This class represents a list of instances in an image. - It stores the attributes of instances (e.g., boxes, masks, labels, scores) as "fields". - All fields must have the same ``__len__`` which is the number of instances. - - All other (non-field) attributes of this class are considered private: - they must start with '_' and are not modifiable by a user. - - Some basic usage: - - 1. Set/get/check a field: - - .. 
code-block:: python - - instances.gt_boxes = Boxes(...) - print(instances.pred_masks) # a tensor of shape (N, H, W) - print('gt_masks' in instances) - - 2. ``len(instances)`` returns the number of instances - 3. Indexing: ``instances[indices]`` will apply the indexing on all the fields - and returns a new :class:`Instances`. - Typically, ``indices`` is a integer vector of indices, - or a binary mask of length ``num_instances`` - - .. code-block:: python - - category_3_detections = instances[instances.pred_classes == 3] - confident_detections = instances[instances.scores > 0.9] - """ - - def __init__(self, image_size: Tuple[int, int], **kwargs: Any): - """ - Args: - image_size (height, width): the spatial size of the image. - kwargs: fields to add to this `Instances`. - """ - self._image_size = image_size - self._fields: Dict[str, Any] = {} - for k, v in kwargs.items(): - self.set(k, v) - - @property - def image_size(self) -> Tuple[int, int]: - """ - Returns: - tuple: height, width - """ - return self._image_size - - def __setattr__(self, name: str, val: Any) -> None: - if name.startswith("_"): - super().__setattr__(name, val) - else: - self.set(name, val) - - def __getattr__(self, name: str) -> Any: - if name == "_fields" or name not in self._fields: - raise AttributeError("Cannot find field '{}' in the given Instances!".format(name)) - return self._fields[name] - - def set(self, name: str, value: Any) -> None: - """ - Set the field named `name` to `value`. - The length of `value` must be the number of instances, - and must agree with other existing fields in this object. - """ - data_len = len(value) - if len(self._fields): - assert ( - len(self) == data_len - ), "Adding a field of length {} to a Instances of length {}".format(data_len, len(self)) - self._fields[name] = value - - def has(self, name: str) -> bool: - """ - Returns: - bool: whether the field called `name` exists. - """ - return name in self._fields - - def remove(self, name: str) -> None: - """ - Remove the field called `name`. - """ - del self._fields[name] - - def get(self, name: str) -> Any: - """ - Returns the field called `name`. - """ - return self._fields[name] - - def get_fields(self) -> Dict[str, Any]: - """ - Returns: - dict: a dict which maps names (str) to data of the fields - - Modifying the returned dict will modify this instance. - """ - return self._fields - - # Tensor-like methods - def to(self, *args: Any, **kwargs: Any) -> "Instances": - """ - Returns: - Instances: all fields are called with a `to(device)`, if the field has this method. - """ - ret = Instances(self._image_size) - for k, v in self._fields.items(): - if hasattr(v, "to"): - v = v.to(*args, **kwargs) - ret.set(k, v) - return ret - - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "Instances": - """ - Args: - item: an index-like object and will be used to index all the fields. - - Returns: - If `item` is a string, return the data in the corresponding field. - Otherwise, returns an `Instances` where all fields are indexed by `item`. 
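The indexing described above is what makes one-liners like `instances[instances.scores > 0.9]` work: every field is indexed by the same mask. The same effect on a plain dict of parallel tensors:

```python
import torch

fields = {  # parallel per-instance fields, as Instances stores them
    "scores": torch.tensor([0.95, 0.40, 0.88]),
    "pred_classes": torch.tensor([3, 1, 3]),
}
keep = fields["scores"] > 0.9
confident = {k: v[keep] for k, v in fields.items()}
assert confident["pred_classes"].tolist() == [3]
```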
- """ - if type(item) == int: - if item >= len(self) or item < -len(self): - raise IndexError("Instances index out of range!") - else: - item = slice(item, None, len(self)) - - ret = Instances(self._image_size) - for k, v in self._fields.items(): - ret.set(k, v[item]) - return ret - - def __len__(self) -> int: - for v in self._fields.values(): - # use __len__ because len() has to be int and is not friendly to tracing - return v.__len__() - raise NotImplementedError("Empty Instances does not support __len__!") - - def __iter__(self): - raise NotImplementedError("`Instances` object is not iterable!") - - @staticmethod - def cat(instance_lists: List["Instances"]) -> "Instances": - """ - Args: - instance_lists (list[Instances]) - - Returns: - Instances - """ - assert all(isinstance(i, Instances) for i in instance_lists) - assert len(instance_lists) > 0 - if len(instance_lists) == 1: - return instance_lists[0] - - image_size = instance_lists[0].image_size - if not isinstance(image_size, torch.Tensor): # could be a tensor in tracing - for i in instance_lists[1:]: - assert i.image_size == image_size - ret = Instances(image_size) - for k in instance_lists[0]._fields.keys(): - values = [i.get(k) for i in instance_lists] - v0 = values[0] - if isinstance(v0, torch.Tensor): - values = torch.cat(values, dim=0) - elif isinstance(v0, list): - values = list(itertools.chain(*values)) - elif hasattr(type(v0), "cat"): - values = type(v0).cat(values) - else: - raise ValueError("Unsupported type {} for concatenation".format(type(v0))) - ret.set(k, values) - return ret - - def __str__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={}, ".format(len(self)) - s += "image_height={}, ".format(self._image_size[0]) - s += "image_width={}, ".format(self._image_size[1]) - s += "fields=[{}])".format(", ".join((f"{k}: {v}" for k, v in self._fields.items()))) - return s - - __repr__ = __str__ diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py deleted file mode 100755 index d0ee8724..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/keypoints.py +++ /dev/null @@ -1,239 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -from typing import Any, List, Tuple, Union -import torch -from torch.nn import functional as F - - -class Keypoints: - """ - Stores keypoint **annotation** data. GT Instances have a `gt_keypoints` property - containing the x,y location and visibility flag of each keypoint. This tensor has shape - (N, K, 3) where N is the number of instances and K is the number of keypoints per instance. - - The visibility flag follows the COCO format and must be one of three integers: - - * v=0: not labeled (in which case x=y=0) - * v=1: labeled but not visible - * v=2: labeled and visible - """ - - def __init__(self, keypoints: Union[torch.Tensor, np.ndarray, List[List[float]]]): - """ - Arguments: - keypoints: A Tensor, numpy array, or list of the x, y, and visibility of each keypoint. - The shape should be (N, K, 3) where N is the number of - instances, and K is the number of keypoints per instance. 
- """ - device = keypoints.device if isinstance(keypoints, torch.Tensor) else torch.device("cpu") - keypoints = torch.as_tensor(keypoints, dtype=torch.float32, device=device) - assert keypoints.dim() == 3 and keypoints.shape[2] == 3, keypoints.shape - self.tensor = keypoints - - def __len__(self) -> int: - return self.tensor.size(0) - - def to(self, *args: Any, **kwargs: Any) -> "Keypoints": - return type(self)(self.tensor.to(*args, **kwargs)) - - @property - def device(self) -> torch.device: - return self.tensor.device - - def to_heatmap(self, boxes: torch.Tensor, heatmap_size: int) -> torch.Tensor: - """ - Convert keypoint annotations to a heatmap of one-hot labels for training, - as described in :paper:`Mask R-CNN`. - - Arguments: - boxes: Nx4 tensor, the boxes to draw the keypoints to - - Returns: - heatmaps: - A tensor of shape (N, K), each element is integer spatial label - in the range [0, heatmap_size**2 - 1] for each keypoint in the input. - valid: - A tensor of shape (N, K) containing whether each keypoint is in the roi or not. - """ - return _keypoints_to_heatmap(self.tensor, boxes, heatmap_size) - - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "Keypoints": - """ - Create a new `Keypoints` by indexing on this `Keypoints`. - - The following usage are allowed: - - 1. `new_kpts = kpts[3]`: return a `Keypoints` which contains only one instance. - 2. `new_kpts = kpts[2:10]`: return a slice of key points. - 3. `new_kpts = kpts[vector]`, where vector is a torch.ByteTensor - with `length = len(kpts)`. Nonzero elements in the vector will be selected. - - Note that the returned Keypoints might share storage with this Keypoints, - subject to Pytorch's indexing semantics. - """ - if isinstance(item, int): - return Keypoints([self.tensor[item]]) - return Keypoints(self.tensor[item]) - - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.tensor)) - return s - - @staticmethod - def cat(keypoints_list: List["Keypoints"]) -> "Keypoints": - """ - Concatenates a list of Keypoints into a single Keypoints - - Arguments: - keypoints_list (list[Keypoints]) - - Returns: - Keypoints: the concatenated Keypoints - """ - assert isinstance(keypoints_list, (list, tuple)) - assert len(keypoints_list) > 0 - assert all(isinstance(keypoints, Keypoints) for keypoints in keypoints_list) - - cat_kpts = type(keypoints_list[0])( - torch.cat([kpts.tensor for kpts in keypoints_list], dim=0) - ) - return cat_kpts - - -# TODO make this nicer, this is a direct translation from C2 (but removing the inner loop) -def _keypoints_to_heatmap( - keypoints: torch.Tensor, rois: torch.Tensor, heatmap_size: int -) -> Tuple[torch.Tensor, torch.Tensor]: - """ - Encode keypoint locations into a target heatmap for use in SoftmaxWithLoss across space. - - Maps keypoints from the half-open interval [x1, x2) on continuous image coordinates to the - closed interval [0, heatmap_size - 1] on discrete image coordinates. We use the - continuous-discrete conversion from Heckbert 1990 ("What is the coordinate of a pixel?"): - d = floor(c) and c = d + 0.5, where d is a discrete coordinate and c is a continuous coordinate. - - Arguments: - keypoints: tensor of keypoint locations in of shape (N, K, 3). - rois: Nx4 tensor of rois in xyxy format - heatmap_size: integer side length of square heatmap. - - Returns: - heatmaps: A tensor of shape (N, K) containing an integer spatial label - in the range [0, heatmap_size**2 - 1] for each keypoint in the input. 
- valid: A tensor of shape (N, K) containing whether each keypoint is in - the roi or not. - """ - - if rois.numel() == 0: - return rois.new().long(), rois.new().long() - offset_x = rois[:, 0] - offset_y = rois[:, 1] - scale_x = heatmap_size / (rois[:, 2] - rois[:, 0]) - scale_y = heatmap_size / (rois[:, 3] - rois[:, 1]) - - offset_x = offset_x[:, None] - offset_y = offset_y[:, None] - scale_x = scale_x[:, None] - scale_y = scale_y[:, None] - - x = keypoints[..., 0] - y = keypoints[..., 1] - - x_boundary_inds = x == rois[:, 2][:, None] - y_boundary_inds = y == rois[:, 3][:, None] - - x = (x - offset_x) * scale_x - x = x.floor().long() - y = (y - offset_y) * scale_y - y = y.floor().long() - - x[x_boundary_inds] = heatmap_size - 1 - y[y_boundary_inds] = heatmap_size - 1 - - valid_loc = (x >= 0) & (y >= 0) & (x < heatmap_size) & (y < heatmap_size) - vis = keypoints[..., 2] > 0 - valid = (valid_loc & vis).long() - - lin_ind = y * heatmap_size + x - heatmaps = lin_ind * valid - - return heatmaps, valid - - -@torch.jit.script_if_tracing -def heatmaps_to_keypoints(maps: torch.Tensor, rois: torch.Tensor) -> torch.Tensor: - """ - Extract predicted keypoint locations from heatmaps. - - Args: - maps (Tensor): (#ROIs, #keypoints, POOL_H, POOL_W). The predicted heatmap of logits for - each ROI and each keypoint. - rois (Tensor): (#ROIs, 4). The box of each ROI. - - Returns: - Tensor of shape (#ROIs, #keypoints, 4) with the last dimension corresponding to - (x, y, logit, score) for each keypoint. - - When converting discrete pixel indices in an NxN image to a continuous keypoint coordinate, - we maintain consistency with :meth:`Keypoints.to_heatmap` by using the conversion from - Heckbert 1990: c = d + 0.5, where d is a discrete coordinate and c is a continuous coordinate. - """ - # The decorator use of torch.no_grad() was not supported by torchscript. 
- # https://github.com/pytorch/pytorch/issues/44768 - maps = maps.detach() - rois = rois.detach() - - offset_x = rois[:, 0] - offset_y = rois[:, 1] - - widths = (rois[:, 2] - rois[:, 0]).clamp(min=1) - heights = (rois[:, 3] - rois[:, 1]).clamp(min=1) - widths_ceil = widths.ceil() - heights_ceil = heights.ceil() - - num_rois, num_keypoints = maps.shape[:2] - xy_preds = maps.new_zeros(rois.shape[0], num_keypoints, 4) - - width_corrections = widths / widths_ceil - height_corrections = heights / heights_ceil - - keypoints_idx = torch.arange(num_keypoints, device=maps.device) - - for i in range(num_rois): - outsize = (int(heights_ceil[i]), int(widths_ceil[i])) - roi_map = F.interpolate( - maps[[i]], size=outsize, mode="bicubic", align_corners=False - ).squeeze( - 0 - ) # #keypoints x H x W - - # softmax over the spatial region - max_score, _ = roi_map.view(num_keypoints, -1).max(1) - max_score = max_score.view(num_keypoints, 1, 1) - tmp_full_resolution = (roi_map - max_score).exp_() - tmp_pool_resolution = (maps[i] - max_score).exp_() - # Produce scores over the region H x W, but normalize with POOL_H x POOL_W, - # so that the scores of objects of different absolute sizes will be more comparable - roi_map_scores = tmp_full_resolution / tmp_pool_resolution.sum((1, 2), keepdim=True) - - w = roi_map.shape[2] - pos = roi_map.view(num_keypoints, -1).argmax(1) - - x_int = pos % w - y_int = (pos - x_int) // w - - assert ( - roi_map_scores[keypoints_idx, y_int, x_int] - == roi_map_scores.view(num_keypoints, -1).max(1)[0] - ).all() - - x = (x_int.float() + 0.5) * width_corrections[i] - y = (y_int.float() + 0.5) * height_corrections[i] - - xy_preds[i, :, 0] = x + offset_x[i] - xy_preds[i, :, 1] = y + offset_y[i] - xy_preds[i, :, 2] = roi_map[keypoints_idx, y_int, x_int] - xy_preds[i, :, 3] = roi_map_scores[keypoints_idx, y_int, x_int] - - return xy_preds diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py deleted file mode 100755 index 8f8e72dd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/masks.py +++ /dev/null @@ -1,532 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import copy -import itertools -import numpy as np -from typing import Any, Iterator, List, Union -import pycocotools.mask as mask_util -import torch -from torch import device - -from detectron2.layers.roi_align import ROIAlign -from detectron2.utils.memory import retry_if_cuda_oom - -from .boxes import Boxes - - -def polygon_area(x, y): - # Using the shoelace formula - # https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates - return 0.5 * np.abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1))) - - -def polygons_to_bitmask(polygons: List[np.ndarray], height: int, width: int) -> np.ndarray: - """ - Args: - polygons (list[ndarray]): each array has shape (Nx2,) - height, width (int) - - Returns: - ndarray: a bool mask of shape (height, width) - """ - if len(polygons) == 0: - # COCOAPI does not support empty polygons - return np.zeros((height, width)).astype(np.bool) - rles = mask_util.frPyObjects(polygons, height, width) - rle = mask_util.merge(rles) - return mask_util.decode(rle).astype(np.bool) - - -def rasterize_polygons_within_box( - polygons: List[np.ndarray], box: np.ndarray, mask_size: int -) -> torch.Tensor: - """ - Rasterize the polygons into a mask image and - crop the mask content in the given box. 
- The cropped mask is resized to (mask_size, mask_size). - - This function is used when generating training targets for mask head in Mask R-CNN. - Given original ground-truth masks for an image, new ground-truth mask - training targets in the size of `mask_size x mask_size` - must be provided for each predicted box. This function will be called to - produce such targets. - - Args: - polygons (list[ndarray[float]]): a list of polygons, which represents an instance. - box: 4-element numpy array - mask_size (int): - - Returns: - Tensor: BoolTensor of shape (mask_size, mask_size) - """ - # 1. Shift the polygons w.r.t the boxes - w, h = box[2] - box[0], box[3] - box[1] - - polygons = copy.deepcopy(polygons) - for p in polygons: - p[0::2] = p[0::2] - box[0] - p[1::2] = p[1::2] - box[1] - - # 2. Rescale the polygons to the new box size - # max() to avoid division by small number - ratio_h = mask_size / max(h, 0.1) - ratio_w = mask_size / max(w, 0.1) - - if ratio_h == ratio_w: - for p in polygons: - p *= ratio_h - else: - for p in polygons: - p[0::2] *= ratio_w - p[1::2] *= ratio_h - - # 3. Rasterize the polygons with coco api - mask = polygons_to_bitmask(polygons, mask_size, mask_size) - mask = torch.from_numpy(mask) - return mask - - -class BitMasks: - """ - This class stores the segmentation masks for all objects in one image, in - the form of bitmaps. - - Attributes: - tensor: bool Tensor of N,H,W, representing N instances in the image. - """ - - def __init__(self, tensor: Union[torch.Tensor, np.ndarray]): - """ - Args: - tensor: bool Tensor of N,H,W, representing N instances in the image. - """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.bool, device=device) - assert tensor.dim() == 3, tensor.size() - self.image_size = tensor.shape[1:] - self.tensor = tensor - - @torch.jit.unused - def to(self, *args: Any, **kwargs: Any) -> "BitMasks": - return BitMasks(self.tensor.to(*args, **kwargs)) - - @property - def device(self) -> torch.device: - return self.tensor.device - - @torch.jit.unused - def __getitem__(self, item: Union[int, slice, torch.BoolTensor]) -> "BitMasks": - """ - Returns: - BitMasks: Create a new :class:`BitMasks` by indexing. - - The following usage are allowed: - - 1. `new_masks = masks[3]`: return a `BitMasks` which contains only one mask. - 2. `new_masks = masks[2:10]`: return a slice of masks. - 3. `new_masks = masks[vector]`, where vector is a torch.BoolTensor - with `length = len(masks)`. Nonzero elements in the vector will be selected. - - Note that the returned object might share storage with this object, - subject to Pytorch's indexing semantics. - """ - if isinstance(item, int): - return BitMasks(self.tensor[item].unsqueeze(0)) - m = self.tensor[item] - assert m.dim() == 3, "Indexing on BitMasks with {} returns a tensor with shape {}!".format( - item, m.shape - ) - return BitMasks(m) - - @torch.jit.unused - def __iter__(self) -> torch.Tensor: - yield from self.tensor - - @torch.jit.unused - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.tensor)) - return s - - def __len__(self) -> int: - return self.tensor.shape[0] - - def nonempty(self) -> torch.Tensor: - """ - Find masks that are non-empty. - - Returns: - Tensor: a BoolTensor which represents - whether each mask is empty (False) or non-empty (True). 
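`rasterize_polygons_within_box` above first shifts polygon coordinates to the box origin and rescales them into the mask grid before handing them to the COCO rasterizer. That pre-rasterization arithmetic in isolation (the function name is made up for the sketch):

```python
import numpy as np

def polygon_into_box(polygon, box, mask_size):
    # Shift to the box origin, then scale into a mask_size x mask_size grid;
    # max(..., 0.1) avoids dividing by a degenerate box side.
    p = np.asarray(polygon, dtype=np.float64).copy()
    w, h = box[2] - box[0], box[3] - box[1]
    p[0::2] = (p[0::2] - box[0]) * (mask_size / max(w, 0.1))
    p[1::2] = (p[1::2] - box[1]) * (mask_size / max(h, 0.1))
    return p

poly = [4.0, 4.0, 12.0, 4.0, 12.0, 12.0]      # x0, y0, x1, y1, x2, y2
print(polygon_into_box(poly, [4.0, 4.0, 12.0, 12.0], 28))
# -> [ 0.  0. 28.  0. 28. 28.]
```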
- """ - return self.tensor.flatten(1).any(dim=1) - - @staticmethod - def from_polygon_masks( - polygon_masks: Union["PolygonMasks", List[List[np.ndarray]]], height: int, width: int - ) -> "BitMasks": - """ - Args: - polygon_masks (list[list[ndarray]] or PolygonMasks) - height, width (int) - """ - if isinstance(polygon_masks, PolygonMasks): - polygon_masks = polygon_masks.polygons - masks = [polygons_to_bitmask(p, height, width) for p in polygon_masks] - if len(masks): - return BitMasks(torch.stack([torch.from_numpy(x) for x in masks])) - else: - return BitMasks(torch.empty(0, height, width, dtype=torch.bool)) - - @staticmethod - def from_roi_masks(roi_masks: "ROIMasks", height: int, width: int) -> "BitMasks": - """ - Args: - roi_masks: - height, width (int): - """ - return roi_masks.to_bitmasks(height, width) - - def crop_and_resize(self, boxes: torch.Tensor, mask_size: int) -> torch.Tensor: - """ - Crop each bitmask by the given box, and resize results to (mask_size, mask_size). - This can be used to prepare training targets for Mask R-CNN. - It has less reconstruction error compared to rasterization with polygons. - However we observe no difference in accuracy, - but BitMasks requires more memory to store all the masks. - - Args: - boxes (Tensor): Nx4 tensor storing the boxes for each mask - mask_size (int): the size of the rasterized mask. - - Returns: - Tensor: - A bool tensor of shape (N, mask_size, mask_size), where - N is the number of predicted boxes for this image. - """ - assert len(boxes) == len(self), "{} != {}".format(len(boxes), len(self)) - device = self.tensor.device - - batch_inds = torch.arange(len(boxes), device=device).to(dtype=boxes.dtype)[:, None] - rois = torch.cat([batch_inds, boxes], dim=1) # Nx5 - - bit_masks = self.tensor.to(dtype=torch.float32) - rois = rois.to(device=device) - output = ( - ROIAlign((mask_size, mask_size), 1.0, 0, aligned=True) - .forward(bit_masks[:, None, :, :], rois) - .squeeze(1) - ) - output = output >= 0.5 - return output - - def get_bounding_boxes(self) -> Boxes: - """ - Returns: - Boxes: tight bounding boxes around bitmasks. - If a mask is empty, it's bounding box will be all zero. - """ - boxes = torch.zeros(self.tensor.shape[0], 4, dtype=torch.float32) - x_any = torch.any(self.tensor, dim=1) - y_any = torch.any(self.tensor, dim=2) - for idx in range(self.tensor.shape[0]): - x = torch.where(x_any[idx, :])[0] - y = torch.where(y_any[idx, :])[0] - if len(x) > 0 and len(y) > 0: - boxes[idx, :] = torch.as_tensor( - [x[0], y[0], x[-1] + 1, y[-1] + 1], dtype=torch.float32 - ) - return Boxes(boxes) - - @staticmethod - def cat(bitmasks_list: List["BitMasks"]) -> "BitMasks": - """ - Concatenates a list of BitMasks into a single BitMasks - - Arguments: - bitmasks_list (list[BitMasks]) - - Returns: - BitMasks: the concatenated BitMasks - """ - assert isinstance(bitmasks_list, (list, tuple)) - assert len(bitmasks_list) > 0 - assert all(isinstance(bitmask, BitMasks) for bitmask in bitmasks_list) - - cat_bitmasks = type(bitmasks_list[0])(torch.cat([bm.tensor for bm in bitmasks_list], dim=0)) - return cat_bitmasks - - -class PolygonMasks: - """ - This class stores the segmentation masks for all objects in one image, in the form of polygons. - - Attributes: - polygons: list[list[ndarray]]. Each ndarray is a float64 vector representing a polygon. 
- """ - - def __init__(self, polygons: List[List[Union[torch.Tensor, np.ndarray]]]): - """ - Arguments: - polygons (list[list[np.ndarray]]): The first - level of the list correspond to individual instances, - the second level to all the polygons that compose the - instance, and the third level to the polygon coordinates. - The third level array should have the format of - [x0, y0, x1, y1, ..., xn, yn] (n >= 3). - """ - if not isinstance(polygons, list): - raise ValueError( - "Cannot create PolygonMasks: Expect a list of list of polygons per image. " - "Got '{}' instead.".format(type(polygons)) - ) - - def _make_array(t: Union[torch.Tensor, np.ndarray]) -> np.ndarray: - # Use float64 for higher precision, because why not? - # Always put polygons on CPU (self.to is a no-op) since they - # are supposed to be small tensors. - # May need to change this assumption if GPU placement becomes useful - if isinstance(t, torch.Tensor): - t = t.cpu().numpy() - return np.asarray(t).astype("float64") - - def process_polygons( - polygons_per_instance: List[Union[torch.Tensor, np.ndarray]] - ) -> List[np.ndarray]: - if not isinstance(polygons_per_instance, list): - raise ValueError( - "Cannot create polygons: Expect a list of polygons per instance. " - "Got '{}' instead.".format(type(polygons_per_instance)) - ) - # transform each polygon to a numpy array - polygons_per_instance = [_make_array(p) for p in polygons_per_instance] - for polygon in polygons_per_instance: - if len(polygon) % 2 != 0 or len(polygon) < 6: - raise ValueError(f"Cannot create a polygon from {len(polygon)} coordinates.") - return polygons_per_instance - - self.polygons: List[List[np.ndarray]] = [ - process_polygons(polygons_per_instance) for polygons_per_instance in polygons - ] - - def to(self, *args: Any, **kwargs: Any) -> "PolygonMasks": - return self - - @property - def device(self) -> torch.device: - return torch.device("cpu") - - def get_bounding_boxes(self) -> Boxes: - """ - Returns: - Boxes: tight bounding boxes around polygon masks. - """ - boxes = torch.zeros(len(self.polygons), 4, dtype=torch.float32) - for idx, polygons_per_instance in enumerate(self.polygons): - minxy = torch.as_tensor([float("inf"), float("inf")], dtype=torch.float32) - maxxy = torch.zeros(2, dtype=torch.float32) - for polygon in polygons_per_instance: - coords = torch.from_numpy(polygon).view(-1, 2).to(dtype=torch.float32) - minxy = torch.min(minxy, torch.min(coords, dim=0).values) - maxxy = torch.max(maxxy, torch.max(coords, dim=0).values) - boxes[idx, :2] = minxy - boxes[idx, 2:] = maxxy - return Boxes(boxes) - - def nonempty(self) -> torch.Tensor: - """ - Find masks that are non-empty. - - Returns: - Tensor: - a BoolTensor which represents whether each mask is empty (False) or not (True). - """ - keep = [1 if len(polygon) > 0 else 0 for polygon in self.polygons] - return torch.from_numpy(np.asarray(keep, dtype=np.bool)) - - def __getitem__(self, item: Union[int, slice, List[int], torch.BoolTensor]) -> "PolygonMasks": - """ - Support indexing over the instances and return a `PolygonMasks` object. - `item` can be: - - 1. An integer. It will return an object with only one instance. - 2. A slice. It will return an object with the selected instances. - 3. A list[int]. It will return an object with the selected instances, - correpsonding to the indices in the list. - 4. A vector mask of type BoolTensor, whose length is num_instances. - It will return an object with the instances whose mask is nonzero. 
- """ - if isinstance(item, int): - selected_polygons = [self.polygons[item]] - elif isinstance(item, slice): - selected_polygons = self.polygons[item] - elif isinstance(item, list): - selected_polygons = [self.polygons[i] for i in item] - elif isinstance(item, torch.Tensor): - # Polygons is a list, so we have to move the indices back to CPU. - if item.dtype == torch.bool: - assert item.dim() == 1, item.shape - item = item.nonzero().squeeze(1).cpu().numpy().tolist() - elif item.dtype in [torch.int32, torch.int64]: - item = item.cpu().numpy().tolist() - else: - raise ValueError("Unsupported tensor dtype={} for indexing!".format(item.dtype)) - selected_polygons = [self.polygons[i] for i in item] - return PolygonMasks(selected_polygons) - - def __iter__(self) -> Iterator[List[np.ndarray]]: - """ - Yields: - list[ndarray]: the polygons for one instance. - Each Tensor is a float64 vector representing a polygon. - """ - return iter(self.polygons) - - def __repr__(self) -> str: - s = self.__class__.__name__ + "(" - s += "num_instances={})".format(len(self.polygons)) - return s - - def __len__(self) -> int: - return len(self.polygons) - - def crop_and_resize(self, boxes: torch.Tensor, mask_size: int) -> torch.Tensor: - """ - Crop each mask by the given box, and resize results to (mask_size, mask_size). - This can be used to prepare training targets for Mask R-CNN. - - Args: - boxes (Tensor): Nx4 tensor storing the boxes for each mask - mask_size (int): the size of the rasterized mask. - - Returns: - Tensor: A bool tensor of shape (N, mask_size, mask_size), where - N is the number of predicted boxes for this image. - """ - assert len(boxes) == len(self), "{} != {}".format(len(boxes), len(self)) - - device = boxes.device - # Put boxes on the CPU, as the polygon representation is not efficient GPU-wise - # (several small tensors for representing a single instance mask) - boxes = boxes.to(torch.device("cpu")) - - results = [ - rasterize_polygons_within_box(poly, box.numpy(), mask_size) - for poly, box in zip(self.polygons, boxes) - ] - """ - poly: list[list[float]], the polygons for one instance - box: a tensor of shape (4,) - """ - if len(results) == 0: - return torch.empty(0, mask_size, mask_size, dtype=torch.bool, device=device) - return torch.stack(results, dim=0).to(device=device) - - def area(self): - """ - Computes area of the mask. - Only works with Polygons, using the shoelace formula: - https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates - - Returns: - Tensor: a vector, area for each instance - """ - - area = [] - for polygons_per_instance in self.polygons: - area_per_instance = 0 - for p in polygons_per_instance: - area_per_instance += polygon_area(p[0::2], p[1::2]) - area.append(area_per_instance) - - return torch.tensor(area) - - @staticmethod - def cat(polymasks_list: List["PolygonMasks"]) -> "PolygonMasks": - """ - Concatenates a list of PolygonMasks into a single PolygonMasks - - Arguments: - polymasks_list (list[PolygonMasks]) - - Returns: - PolygonMasks: the concatenated PolygonMasks - """ - assert isinstance(polymasks_list, (list, tuple)) - assert len(polymasks_list) > 0 - assert all(isinstance(polymask, PolygonMasks) for polymask in polymasks_list) - - cat_polymasks = type(polymasks_list[0])( - list(itertools.chain.from_iterable(pm.polygons for pm in polymasks_list)) - ) - return cat_polymasks - - -class ROIMasks: - """ - Represent masks by N smaller masks defined in some ROIs. 
Once ROI boxes are given,
- full-image bitmask can be obtained by "pasting" the mask on the region defined
- by the corresponding ROI box.
- """
-
- def __init__(self, tensor: torch.Tensor):
- """
- Args:
- tensor: (N, M, M) mask tensor that defines the mask within each ROI.
- """
- if tensor.dim() != 3:
- raise ValueError("ROIMasks must take a mask tensor of 3 dimensions.")
- self.tensor = tensor
-
- def to(self, device: torch.device) -> "ROIMasks":
- return ROIMasks(self.tensor.to(device))
-
- @property
- def device(self) -> device:
- return self.tensor.device
-
- def __len__(self):
- return self.tensor.shape[0]
-
- def __getitem__(self, item) -> "ROIMasks":
- """
- Returns:
- ROIMasks: Create a new :class:`ROIMasks` by indexing.
-
- The following usages are allowed:
-
- 1. `new_masks = masks[2:10]`: return a slice of masks.
- 2. `new_masks = masks[vector]`, where vector is a torch.BoolTensor
- with `length = len(masks)`. Nonzero elements in the vector will be selected.
-
- Note that the returned object might share storage with this object,
- subject to Pytorch's indexing semantics.
- """
- t = self.tensor[item]
- if t.dim() != 3:
- raise ValueError(
- f"Indexing on ROIMasks with {item} returns a tensor with shape {t.shape}!"
- )
- return ROIMasks(t)
-
- @torch.jit.unused
- def __repr__(self) -> str:
- s = self.__class__.__name__ + "("
- s += "num_instances={})".format(len(self.tensor))
- return s
-
- @torch.jit.unused
- def to_bitmasks(self, boxes: torch.Tensor, height, width, threshold=0.5):
- """
- Args: see documentation of :func:`paste_masks_in_image`.
- """
- from detectron2.layers.mask_ops import paste_masks_in_image, _paste_masks_tensor_shape
-
- if torch.jit.is_tracing():
- if isinstance(height, torch.Tensor):
- paste_func = _paste_masks_tensor_shape
- else:
- paste_func = paste_masks_in_image
- else:
- paste_func = retry_if_cuda_oom(paste_masks_in_image)
- bitmasks = paste_func(self.tensor, boxes.tensor, (height, width), threshold=threshold)
- return BitMasks(bitmasks)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py
deleted file mode 100755
index 4ec8e4c7..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/structures/rotated_boxes.py
+++ /dev/null
@@ -1,503 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import math
-from typing import List, Tuple
-import torch
-
-from detectron2.layers.rotated_boxes import pairwise_iou_rotated
-
-from .boxes import Boxes
-
-
-class RotatedBoxes(Boxes):
- """
- This structure stores a list of rotated boxes as a Nx5 torch.Tensor.
- It supports some common methods about boxes
- (`area`, `clip`, `nonempty`, etc),
- and also behaves like a Tensor
- (supports indexing, `to(device)`, `.device`, and iteration over all boxes)
- """
-
- def __init__(self, tensor: torch.Tensor):
- """
- Args:
- tensor (Tensor[float]): a Nx5 matrix. Each row is
- (x_center, y_center, width, height, angle),
- in which angle is represented in degrees.
- While there's no strict range restriction for it,
- the recommended principal range is between [-180, 180) degrees.
-
- Assume we have a horizontal box B = (x_center, y_center, width, height),
- where width is along the x-axis and height is along the y-axis.
- The rotated box B_rot (x_center, y_center, width, height, angle)
- can be seen as:
-
- 1. When angle == 0:
- B_rot == B
- 2.
When angle > 0: - B_rot is obtained by rotating B w.r.t its center by :math:`|angle|` degrees CCW; - 3. When angle < 0: - B_rot is obtained by rotating B w.r.t its center by :math:`|angle|` degrees CW. - - Mathematically, since the right-handed coordinate system for image space - is (y, x), where y is top->down and x is left->right, the 4 vertices of the - rotated rectangle :math:`(yr_i, xr_i)` (i = 1, 2, 3, 4) can be obtained from - the vertices of the horizontal rectangle :math:`(y_i, x_i)` (i = 1, 2, 3, 4) - in the following way (:math:`\\theta = angle*\\pi/180` is the angle in radians, - :math:`(y_c, x_c)` is the center of the rectangle): - - .. math:: - - yr_i = \\cos(\\theta) (y_i - y_c) - \\sin(\\theta) (x_i - x_c) + y_c, - - xr_i = \\sin(\\theta) (y_i - y_c) + \\cos(\\theta) (x_i - x_c) + x_c, - - which is the standard rigid-body rotation transformation. - - Intuitively, the angle is - (1) the rotation angle from y-axis in image space - to the height vector (top->down in the box's local coordinate system) - of the box in CCW, and - (2) the rotation angle from x-axis in image space - to the width vector (left->right in the box's local coordinate system) - of the box in CCW. - - More intuitively, consider the following horizontal box ABCD represented - in (x1, y1, x2, y2): (3, 2, 7, 4), - covering the [3, 7] x [2, 4] region of the continuous coordinate system - which looks like this: - - .. code:: none - - O--------> x - | - | A---B - | | | - | D---C - | - v y - - Note that each capital letter represents one 0-dimensional geometric point - instead of a 'square pixel' here. - - In the example above, using (x, y) to represent a point we have: - - .. math:: - - O = (0, 0), A = (3, 2), B = (7, 2), C = (7, 4), D = (3, 4) - - We name vector AB = vector DC as the width vector in box's local coordinate system, and - vector AD = vector BC as the height vector in box's local coordinate system. Initially, - when angle = 0 degree, they're aligned with the positive directions of x-axis and y-axis - in the image space, respectively. - - For better illustration, we denote the center of the box as E, - - .. code:: none - - O--------> x - | - | A---B - | | E | - | D---C - | - v y - - where the center E = ((3+7)/2, (2+4)/2) = (5, 3). - - Also, - - .. math:: - - width = |AB| = |CD| = 7 - 3 = 4, - height = |AD| = |BC| = 4 - 2 = 2. - - Therefore, the corresponding representation for the same shape in rotated box in - (x_center, y_center, width, height, angle) format is: - - (5, 3, 4, 2, 0), - - Now, let's consider (5, 3, 4, 2, 90), which is rotated by 90 degrees - CCW (counter-clockwise) by definition. It looks like this: - - .. code:: none - - O--------> x - | B-C - | | | - | |E| - | | | - | A-D - v y - - The center E is still located at the same point (5, 3), while the vertices - ABCD are rotated by 90 degrees CCW with regard to E: - A = (4, 5), B = (4, 1), C = (6, 1), D = (6, 5) - - Here, 90 degrees can be seen as the CCW angle to rotate from y-axis to - vector AD or vector BC (the top->down height vector in box's local coordinate system), - or the CCW angle to rotate from x-axis to vector AB or vector DC (the left->right - width vector in box's local coordinate system). - - .. math:: - - width = |AB| = |CD| = 5 - 1 = 4, - height = |AD| = |BC| = 6 - 4 = 2. - - Next, how about (5, 3, 4, 2, -90), which is rotated by 90 degrees CW (clockwise) - by definition? It looks like this: - - .. 
code:: none - - O--------> x - | D-A - | | | - | |E| - | | | - | C-B - v y - - The center E is still located at the same point (5, 3), while the vertices - ABCD are rotated by 90 degrees CW with regard to E: - A = (6, 1), B = (6, 5), C = (4, 5), D = (4, 1) - - .. math:: - - width = |AB| = |CD| = 5 - 1 = 4, - height = |AD| = |BC| = 6 - 4 = 2. - - This covers exactly the same region as (5, 3, 4, 2, 90) does, and their IoU - will be 1. However, these two will generate different RoI Pooling results and - should not be treated as an identical box. - - On the other hand, it's easy to see that (X, Y, W, H, A) is identical to - (X, Y, W, H, A+360N), for any integer N. For example (5, 3, 4, 2, 270) would be - identical to (5, 3, 4, 2, -90), because rotating the shape 270 degrees CCW is - equivalent to rotating the same shape 90 degrees CW. - - We could rotate further to get (5, 3, 4, 2, 180), or (5, 3, 4, 2, -180): - - .. code:: none - - O--------> x - | - | C---D - | | E | - | B---A - | - v y - - .. math:: - - A = (7, 4), B = (3, 4), C = (3, 2), D = (7, 2), - - width = |AB| = |CD| = 7 - 3 = 4, - height = |AD| = |BC| = 4 - 2 = 2. - - Finally, this is a very inaccurate (heavily quantized) illustration of - how (5, 3, 4, 2, 60) looks like in case anyone wonders: - - .. code:: none - - O--------> x - | B\ - | / C - | /E / - | A / - | `D - v y - - It's still a rectangle with center of (5, 3), width of 4 and height of 2, - but its angle (and thus orientation) is somewhere between - (5, 3, 4, 2, 0) and (5, 3, 4, 2, 90). - """ - device = tensor.device if isinstance(tensor, torch.Tensor) else torch.device("cpu") - tensor = torch.as_tensor(tensor, dtype=torch.float32, device=device) - if tensor.numel() == 0: - # Use reshape, so we don't end up creating a new tensor that does not depend on - # the inputs (and consequently confuses jit) - tensor = tensor.reshape((0, 5)).to(dtype=torch.float32, device=device) - assert tensor.dim() == 2 and tensor.size(-1) == 5, tensor.size() - - self.tensor = tensor - - def clone(self) -> "RotatedBoxes": - """ - Clone the RotatedBoxes. - - Returns: - RotatedBoxes - """ - return RotatedBoxes(self.tensor.clone()) - - def to(self, device: torch.device): - # Boxes are assumed float32 and does not support to(dtype) - return RotatedBoxes(self.tensor.to(device=device)) - - def area(self) -> torch.Tensor: - """ - Computes the area of all the boxes. - - Returns: - torch.Tensor: a vector with areas of each box. - """ - box = self.tensor - area = box[:, 2] * box[:, 3] - return area - - def normalize_angles(self) -> None: - """ - Restrict angles to the range of [-180, 180) degrees - """ - self.tensor[:, 4] = (self.tensor[:, 4] + 180.0) % 360.0 - 180.0 - - def clip(self, box_size: Tuple[int, int], clip_angle_threshold: float = 1.0) -> None: - """ - Clip (in place) the boxes by limiting x coordinates to the range [0, width] - and y coordinates to the range [0, height]. - - For RRPN: - Only clip boxes that are almost horizontal with a tolerance of - clip_angle_threshold to maintain backward compatibility. - - Rotated boxes beyond this threshold are not clipped for two reasons: - - 1. There are potentially multiple ways to clip a rotated box to make it - fit within the image. - 2. It's tricky to make the entire rectangular box fit within the image - and still be able to not leave out pixels of interest. - - Therefore we rely on ops like RoIAlignRotated to safely handle this. - - Args: - box_size (height, width): The clipping box's size. - clip_angle_threshold: - Iff. 
abs(normalized(angle)) <= clip_angle_threshold (in degrees),
- we do the clipping as horizontal boxes.
- """
- h, w = box_size
-
- # normalize angles to be within [-180, 180) degrees
- self.normalize_angles()
-
- idx = torch.where(torch.abs(self.tensor[:, 4]) <= clip_angle_threshold)[0]
-
- # convert to (x1, y1, x2, y2)
- x1 = self.tensor[idx, 0] - self.tensor[idx, 2] / 2.0
- y1 = self.tensor[idx, 1] - self.tensor[idx, 3] / 2.0
- x2 = self.tensor[idx, 0] + self.tensor[idx, 2] / 2.0
- y2 = self.tensor[idx, 1] + self.tensor[idx, 3] / 2.0
-
- # clip
- x1.clamp_(min=0, max=w)
- y1.clamp_(min=0, max=h)
- x2.clamp_(min=0, max=w)
- y2.clamp_(min=0, max=h)
-
- # convert back to (xc, yc, w, h)
- self.tensor[idx, 0] = (x1 + x2) / 2.0
- self.tensor[idx, 1] = (y1 + y2) / 2.0
- # make sure widths and heights do not increase due to numerical errors
- self.tensor[idx, 2] = torch.min(self.tensor[idx, 2], x2 - x1)
- self.tensor[idx, 3] = torch.min(self.tensor[idx, 3], y2 - y1)
-
- def nonempty(self, threshold: float = 0.0) -> torch.Tensor:
- """
- Find boxes that are non-empty.
- A box is considered empty if either of its sides is no larger than threshold.
-
- Returns:
- Tensor: a binary vector which represents
- whether each box is empty (False) or non-empty (True).
- """
- box = self.tensor
- widths = box[:, 2]
- heights = box[:, 3]
- keep = (widths > threshold) & (heights > threshold)
- return keep
-
- def __getitem__(self, item) -> "RotatedBoxes":
- """
- Returns:
- RotatedBoxes: Create a new :class:`RotatedBoxes` by indexing.
-
- The following usages are allowed:
-
- 1. `new_boxes = boxes[3]`: return a `RotatedBoxes` which contains only one box.
- 2. `new_boxes = boxes[2:10]`: return a slice of boxes.
- 3. `new_boxes = boxes[vector]`, where vector is a torch.ByteTensor
- with `length = len(boxes)`. Nonzero elements in the vector will be selected.
-
- Note that the returned RotatedBoxes might share storage with this RotatedBoxes,
- subject to Pytorch's indexing semantics.
- """
- if isinstance(item, int):
- return RotatedBoxes(self.tensor[item].view(1, -1))
- b = self.tensor[item]
- assert b.dim() == 2, "Indexing on RotatedBoxes with {} failed to return a matrix!".format(
- item
- )
- return RotatedBoxes(b)
-
- def __len__(self) -> int:
- return self.tensor.shape[0]
-
- def __repr__(self) -> str:
- return "RotatedBoxes(" + str(self.tensor) + ")"
-
- def inside_box(self, box_size: Tuple[int, int], boundary_threshold: int = 0) -> torch.Tensor:
- """
- Args:
- box_size (height, width): Size of the reference box covering
- [0, width] x [0, height]
- boundary_threshold (int): Boxes that extend beyond the reference box
- boundary by more than boundary_threshold are considered "outside".
-
- For RRPN, it might not be necessary to call this function since it's common
- for rotated box to extend to outside of the image boundaries
- (the clip function only clips the near-horizontal boxes)
-
- Returns:
- a binary vector, indicating whether each box is inside the reference box.
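-
- A hand-checked toy example (values invented for illustration)::
-
- boxes = RotatedBoxes(torch.tensor([[5.0, 5.0, 4.0, 2.0, 30.0]]))
- boxes.inside_box((10, 10)) # tensor([True]); the box fits inside a 10x10 image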
- """ - height, width = box_size - - cnt_x = self.tensor[..., 0] - cnt_y = self.tensor[..., 1] - half_w = self.tensor[..., 2] / 2.0 - half_h = self.tensor[..., 3] / 2.0 - a = self.tensor[..., 4] - c = torch.abs(torch.cos(a * math.pi / 180.0)) - s = torch.abs(torch.sin(a * math.pi / 180.0)) - # This basically computes the horizontal bounding rectangle of the rotated box - max_rect_dx = c * half_w + s * half_h - max_rect_dy = c * half_h + s * half_w - - inds_inside = ( - (cnt_x - max_rect_dx >= -boundary_threshold) - & (cnt_y - max_rect_dy >= -boundary_threshold) - & (cnt_x + max_rect_dx < width + boundary_threshold) - & (cnt_y + max_rect_dy < height + boundary_threshold) - ) - - return inds_inside - - def get_centers(self) -> torch.Tensor: - """ - Returns: - The box centers in a Nx2 array of (x, y). - """ - return self.tensor[:, :2] - - def scale(self, scale_x: float, scale_y: float) -> None: - """ - Scale the rotated box with horizontal and vertical scaling factors - Note: when scale_factor_x != scale_factor_y, - the rotated box does not preserve the rectangular shape when the angle - is not a multiple of 90 degrees under resize transformation. - Instead, the shape is a parallelogram (that has skew) - Here we make an approximation by fitting a rotated rectangle to the parallelogram. - """ - self.tensor[:, 0] *= scale_x - self.tensor[:, 1] *= scale_y - theta = self.tensor[:, 4] * math.pi / 180.0 - c = torch.cos(theta) - s = torch.sin(theta) - - # In image space, y is top->down and x is left->right - # Consider the local coordintate system for the rotated box, - # where the box center is located at (0, 0), and the four vertices ABCD are - # A(-w / 2, -h / 2), B(w / 2, -h / 2), C(w / 2, h / 2), D(-w / 2, h / 2) - # the midpoint of the left edge AD of the rotated box E is: - # E = (A+D)/2 = (-w / 2, 0) - # the midpoint of the top edge AB of the rotated box F is: - # F(0, -h / 2) - # To get the old coordinates in the global system, apply the rotation transformation - # (Note: the right-handed coordinate system for image space is yOx): - # (old_x, old_y) = (s * y + c * x, c * y - s * x) - # E(old) = (s * 0 + c * (-w/2), c * 0 - s * (-w/2)) = (-c * w / 2, s * w / 2) - # F(old) = (s * (-h / 2) + c * 0, c * (-h / 2) - s * 0) = (-s * h / 2, -c * h / 2) - # After applying the scaling factor (sfx, sfy): - # E(new) = (-sfx * c * w / 2, sfy * s * w / 2) - # F(new) = (-sfx * s * h / 2, -sfy * c * h / 2) - # The new width after scaling tranformation becomes: - - # w(new) = |E(new) - O| * 2 - # = sqrt[(sfx * c * w / 2)^2 + (sfy * s * w / 2)^2] * 2 - # = sqrt[(sfx * c)^2 + (sfy * s)^2] * w - # i.e., scale_factor_w = sqrt[(sfx * c)^2 + (sfy * s)^2] - # - # For example, - # when angle = 0 or 180, |c| = 1, s = 0, scale_factor_w == scale_factor_x; - # when |angle| = 90, c = 0, |s| = 1, scale_factor_w == scale_factor_y - self.tensor[:, 2] *= torch.sqrt((scale_x * c) ** 2 + (scale_y * s) ** 2) - - # h(new) = |F(new) - O| * 2 - # = sqrt[(sfx * s * h / 2)^2 + (sfy * c * h / 2)^2] * 2 - # = sqrt[(sfx * s)^2 + (sfy * c)^2] * h - # i.e., scale_factor_h = sqrt[(sfx * s)^2 + (sfy * c)^2] - # - # For example, - # when angle = 0 or 180, |c| = 1, s = 0, scale_factor_h == scale_factor_y; - # when |angle| = 90, c = 0, |s| = 1, scale_factor_h == scale_factor_x - self.tensor[:, 3] *= torch.sqrt((scale_x * s) ** 2 + (scale_y * c) ** 2) - - # The angle is the rotation angle from y-axis in image space to the height - # vector (top->down in the box's local coordinate system) of the box in CCW. 
- #
- # angle(new) = angle_yOx(O - F(new))
- # = angle_yOx( (sfx * s * h / 2, sfy * c * h / 2) )
- # = atan2(sfx * s * h / 2, sfy * c * h / 2)
- # = atan2(sfx * s, sfy * c)
- #
- # For example,
- # when sfx == sfy, angle(new) == atan2(s, c) == angle(old)
- self.tensor[:, 4] = torch.atan2(scale_x * s, scale_y * c) * 180 / math.pi
-
- @classmethod
- def cat(cls, boxes_list: List["RotatedBoxes"]) -> "RotatedBoxes":
- """
- Concatenates a list of RotatedBoxes into a single RotatedBoxes
-
- Arguments:
- boxes_list (list[RotatedBoxes])
-
- Returns:
- RotatedBoxes: the concatenated RotatedBoxes
- """
- assert isinstance(boxes_list, (list, tuple))
- if len(boxes_list) == 0:
- return cls(torch.empty(0))
- assert all([isinstance(box, RotatedBoxes) for box in boxes_list])
-
- # use torch.cat (v.s. layers.cat) so the returned boxes never share storage with input
- cat_boxes = cls(torch.cat([b.tensor for b in boxes_list], dim=0))
- return cat_boxes
-
- @property
- def device(self) -> torch.device:
- return self.tensor.device
-
- @torch.jit.unused
- def __iter__(self):
- """
- Yield a box as a Tensor of shape (5,) at a time.
- """
- yield from self.tensor
-
-
-def pairwise_iou(boxes1: RotatedBoxes, boxes2: RotatedBoxes) -> torch.Tensor:
- """
- Given two lists of rotated boxes of size N and M,
- compute the IoU (intersection over union)
- between **all** N x M pairs of boxes.
- The box order must be (x_center, y_center, width, height, angle).
-
- Args:
- boxes1, boxes2 (RotatedBoxes):
- two `RotatedBoxes`. Contains N & M rotated boxes, respectively.
-
- Returns:
- Tensor: IoU, sized [N,M].
- """
-
- return pairwise_iou_rotated(boxes1.tensor, boxes2.tensor)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md
deleted file mode 100755
index 9765b24a..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/README.md
+++ /dev/null
@@ -1,5 +0,0 @@
-# Utility functions
-
-This folder contains utility functions that are not used in the
-core library, but are useful for building models or training
-code using the config system.
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py
deleted file mode 100755
index 9020c2df..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py
deleted file mode 100755
index 178da796..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/analysis.py
+++ /dev/null
@@ -1,188 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-# -*- coding: utf-8 -*-
-
-import typing
-from typing import Any, List
-import fvcore
-from fvcore.nn import activation_count, flop_count, parameter_count, parameter_count_table
-from torch import nn
-
-from detectron2.export import TracingAdapter
-
-__all__ = [
- "activation_count_operators",
- "flop_count_operators",
- "parameter_count_table",
- "parameter_count",
- "FlopCountAnalysis",
-]
-
-FLOPS_MODE = "flops"
-ACTIVATIONS_MODE = "activations"
-
-
-# Some extra ops to ignore from counting, including elementwise and reduction ops
-_IGNORED_OPS = {
- "aten::add",
- "aten::add_",
- "aten::argmax",
- "aten::argsort",
- "aten::batch_norm",
- "aten::constant_pad_nd",
- "aten::div",
- "aten::div_",
- "aten::exp",
- "aten::log2",
- "aten::max_pool2d",
- "aten::meshgrid",
- "aten::mul",
- "aten::mul_",
- "aten::neg",
- "aten::nonzero_numpy",
- "aten::reciprocal",
- "aten::repeat_interleave",
- "aten::rsub",
- "aten::sigmoid",
- "aten::sigmoid_",
- "aten::softmax",
- "aten::sort",
- "aten::sqrt",
- "aten::sub",
- "torchvision::nms", # TODO estimate flop for nms
-}
-
-
-class FlopCountAnalysis(fvcore.nn.FlopCountAnalysis):
- """
- Same as :class:`fvcore.nn.FlopCountAnalysis`, but supports detectron2 models.
- """
-
- def __init__(self, model, inputs):
- """
- Args:
- model (nn.Module):
- inputs (Any): inputs of the given model. Does not have to be tuple of tensors.
- """
- wrapper = TracingAdapter(model, inputs, allow_non_tensor=True)
- super().__init__(wrapper, wrapper.flattened_inputs)
- self.set_op_handle(**{k: None for k in _IGNORED_OPS})
-
-
-def flop_count_operators(model: nn.Module, inputs: list) -> typing.DefaultDict[str, float]:
- """
- Implement operator-level flops counting using jit.
- This is a wrapper of :func:`fvcore.nn.flop_count` and adds support for standard
- detection models in detectron2.
- Please use :class:`FlopCountAnalysis` for more advanced functionalities.
-
- Note:
- The function runs the input through the model to compute flops.
- The flops of a detection model are often input-dependent, for example,
- the flops of box & mask head depends on the number of proposals &
- the number of detected objects.
- Therefore, the flops counting using a single input may not accurately
- reflect the computation cost of a model. It's recommended to average
- across a number of inputs.
-
- Args:
- model: a detectron2 model that takes `list[dict]` as input.
- inputs (list[dict]): inputs to model, in detectron2's standard format.
- Only "image" key will be used.
- supported_ops (dict[str, Handle]): see documentation of :func:`fvcore.nn.flop_count`
-
- Returns:
- Counter: Gflop count per operator
- """
- old_train = model.training
- model.eval()
- ret = FlopCountAnalysis(model, inputs).by_operator()
- model.train(old_train)
- return {k: v / 1e9 for k, v in ret.items()}
-
-
-def activation_count_operators(
- model: nn.Module, inputs: list, **kwargs
-) -> typing.DefaultDict[str, float]:
- """
- Implement operator-level activations counting using jit.
- This is a wrapper of fvcore.nn.activation_count that supports standard detection models
- in detectron2.
-
- Note:
- The function runs the input through the model to compute activations.
- The activations of a detection model are often input-dependent, for example,
- the activations of box & mask head depends on the number of proposals &
- the number of detected objects.
-
- Args:
- model: a detectron2 model that takes `list[dict]` as input.
- inputs (list[dict]): inputs to model, in detectron2's standard format.
- Only "image" key will be used. - - Returns: - Counter: activation count per operator - """ - return _wrapper_count_operators(model=model, inputs=inputs, mode=ACTIVATIONS_MODE, **kwargs) - - -def _wrapper_count_operators( - model: nn.Module, inputs: list, mode: str, **kwargs -) -> typing.DefaultDict[str, float]: - # ignore some ops - supported_ops = {k: lambda *args, **kwargs: {} for k in _IGNORED_OPS} - supported_ops.update(kwargs.pop("supported_ops", {})) - kwargs["supported_ops"] = supported_ops - - assert len(inputs) == 1, "Please use batch size=1" - tensor_input = inputs[0]["image"] - inputs = [{"image": tensor_input}] # remove other keys, in case there are any - - old_train = model.training - if isinstance(model, (nn.parallel.distributed.DistributedDataParallel, nn.DataParallel)): - model = model.module - wrapper = TracingAdapter(model, inputs) - wrapper.eval() - if mode == FLOPS_MODE: - ret = flop_count(wrapper, (tensor_input,), **kwargs) - elif mode == ACTIVATIONS_MODE: - ret = activation_count(wrapper, (tensor_input,), **kwargs) - else: - raise NotImplementedError("Count for mode {} is not supported yet.".format(mode)) - # compatible with change in fvcore - if isinstance(ret, tuple): - ret = ret[0] - model.train(old_train) - return ret - - -def find_unused_parameters(model: nn.Module, inputs: Any) -> List[str]: - """ - Given a model, find parameters that do not contribute - to the loss. - - Args: - model: a model in training mode that returns losses - inputs: argument or a tuple of arguments. Inputs of the model - - Returns: - list[str]: the name of unused parameters - """ - assert model.training - for _, prm in model.named_parameters(): - prm.grad = None - - if isinstance(inputs, tuple): - losses = model(*inputs) - else: - losses = model(inputs) - - if isinstance(losses, dict): - losses = sum(losses.values()) - losses.backward() - - unused: List[str] = [] - for name, prm in model.named_parameters(): - if prm.grad is None: - unused.append(name) - prm.grad = None - return unused diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py deleted file mode 100755 index 807b6c7e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/collect_env.py +++ /dev/null @@ -1,242 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import importlib -import numpy as np -import os -import re -import subprocess -import sys -from collections import defaultdict -import PIL -import torch -import torchvision -from tabulate import tabulate - -__all__ = ["collect_env_info"] - - -def collect_torch_env(): - try: - import torch.__config__ - - return torch.__config__.show() - except ImportError: - # compatible with older versions of pytorch - from torch.utils.collect_env import get_pretty_env_info - - return get_pretty_env_info() - - -def get_env_module(): - var_name = "DETECTRON2_ENV_MODULE" - return var_name, os.environ.get(var_name, "") - - -def detect_compute_compatibility(CUDA_HOME, so_file): - try: - cuobjdump = os.path.join(CUDA_HOME, "bin", "cuobjdump") - if os.path.isfile(cuobjdump): - output = subprocess.check_output( - "'{}' --list-elf '{}'".format(cuobjdump, so_file), shell=True - ) - output = output.decode("utf-8").strip().split("\n") - arch = [] - for line in output: - line = re.findall(r"\.sm_([0-9]*)\.", line)[0] - arch.append(".".join(line)) - arch = sorted(set(arch)) - return ", ".join(arch) - else: - return so_file + "; cannot find cuobjdump" - except Exception: - # unhandled failure - return so_file - - -def collect_env_info(): - has_gpu = torch.cuda.is_available() # true for both CUDA & ROCM - torch_version = torch.__version__ - - # NOTE that CUDA_HOME/ROCM_HOME could be None even when CUDA runtime libs are functional - from torch.utils.cpp_extension import CUDA_HOME, ROCM_HOME - - has_rocm = False - if (getattr(torch.version, "hip", None) is not None) and (ROCM_HOME is not None): - has_rocm = True - has_cuda = has_gpu and (not has_rocm) - - data = [] - data.append(("sys.platform", sys.platform)) # check-template.yml depends on it - data.append(("Python", sys.version.replace("\n", ""))) - data.append(("numpy", np.__version__)) - - try: - import detectron2 # noqa - - data.append( - ("detectron2", detectron2.__version__ + " @" + os.path.dirname(detectron2.__file__)) - ) - except ImportError: - data.append(("detectron2", "failed to import")) - except AttributeError: - data.append(("detectron2", "imported a wrong installation")) - - try: - import detectron2._C as _C - except ImportError as e: - data.append(("detectron2._C", f"not built correctly: {e}")) - - # print system compilers when extension fails to build - if sys.platform != "win32": # don't know what to do for windows - try: - # this is how torch/utils/cpp_extensions.py choose compiler - cxx = os.environ.get("CXX", "c++") - cxx = subprocess.check_output("'{}' --version".format(cxx), shell=True) - cxx = cxx.decode("utf-8").strip().split("\n")[0] - except subprocess.SubprocessError: - cxx = "Not found" - data.append(("Compiler ($CXX)", cxx)) - - if has_cuda and CUDA_HOME is not None: - try: - nvcc = os.path.join(CUDA_HOME, "bin", "nvcc") - nvcc = subprocess.check_output("'{}' -V".format(nvcc), shell=True) - nvcc = nvcc.decode("utf-8").strip().split("\n")[-1] - except subprocess.SubprocessError: - nvcc = "Not found" - data.append(("CUDA compiler", nvcc)) - if has_cuda and sys.platform != "win32": - try: - so_file = importlib.util.find_spec("detectron2._C").origin - except (ImportError, AttributeError): - pass - else: - data.append( - ("detectron2 arch flags", detect_compute_compatibility(CUDA_HOME, so_file)) - ) - else: - # print compilers that are used to build extension - data.append(("Compiler", _C.get_compiler_version())) - data.append(("CUDA compiler", _C.get_cuda_version())) # cuda or hip - if has_cuda and getattr(_C, "has_cuda", lambda: True)(): - 
data.append( - ("detectron2 arch flags", detect_compute_compatibility(CUDA_HOME, _C.__file__)) - ) - - data.append(get_env_module()) - data.append(("PyTorch", torch_version + " @" + os.path.dirname(torch.__file__))) - data.append(("PyTorch debug build", torch.version.debug)) - - if not has_gpu: - has_gpu_text = "No: torch.cuda.is_available() == False" - else: - has_gpu_text = "Yes" - data.append(("GPU available", has_gpu_text)) - if has_gpu: - devices = defaultdict(list) - for k in range(torch.cuda.device_count()): - cap = ".".join((str(x) for x in torch.cuda.get_device_capability(k))) - name = torch.cuda.get_device_name(k) + f" (arch={cap})" - devices[name].append(str(k)) - for name, devids in devices.items(): - data.append(("GPU " + ",".join(devids), name)) - - if has_rocm: - msg = " - invalid!" if not (ROCM_HOME and os.path.isdir(ROCM_HOME)) else "" - data.append(("ROCM_HOME", str(ROCM_HOME) + msg)) - else: - try: - from torch.utils.collect_env import get_nvidia_driver_version, run as _run - - data.append(("Driver version", get_nvidia_driver_version(_run))) - except Exception: - pass - msg = " - invalid!" if not (CUDA_HOME and os.path.isdir(CUDA_HOME)) else "" - data.append(("CUDA_HOME", str(CUDA_HOME) + msg)) - - cuda_arch_list = os.environ.get("TORCH_CUDA_ARCH_LIST", None) - if cuda_arch_list: - data.append(("TORCH_CUDA_ARCH_LIST", cuda_arch_list)) - data.append(("Pillow", PIL.__version__)) - - try: - data.append( - ( - "torchvision", - str(torchvision.__version__) + " @" + os.path.dirname(torchvision.__file__), - ) - ) - if has_cuda: - try: - torchvision_C = importlib.util.find_spec("torchvision._C").origin - msg = detect_compute_compatibility(CUDA_HOME, torchvision_C) - data.append(("torchvision arch flags", msg)) - except (ImportError, AttributeError): - data.append(("torchvision._C", "Not found")) - except AttributeError: - data.append(("torchvision", "unknown")) - - try: - import fvcore - - data.append(("fvcore", fvcore.__version__)) - except (ImportError, AttributeError): - pass - - try: - import iopath - - data.append(("iopath", iopath.__version__)) - except (ImportError, AttributeError): - pass - - try: - import cv2 - - data.append(("cv2", cv2.__version__)) - except (ImportError, AttributeError): - data.append(("cv2", "Not found")) - env_str = tabulate(data) + "\n" - env_str += collect_torch_env() - return env_str - - -def test_nccl_ops(): - num_gpu = torch.cuda.device_count() - if os.access("/tmp", os.W_OK): - import torch.multiprocessing as mp - - dist_url = "file:///tmp/nccl_tmp_file" - print("Testing NCCL connectivity ... this should not hang.") - mp.spawn(_test_nccl_worker, nprocs=num_gpu, args=(num_gpu, dist_url), daemon=False) - print("NCCL succeeded.") - - -def _test_nccl_worker(rank, num_gpu, dist_url): - import torch.distributed as dist - - dist.init_process_group(backend="NCCL", init_method=dist_url, rank=rank, world_size=num_gpu) - dist.barrier(device_ids=[rank]) - - -if __name__ == "__main__": - try: - from detectron2.utils.collect_env import collect_env_info as f - - print(f()) - except ImportError: - print(collect_env_info()) - - if torch.cuda.is_available(): - num_gpu = torch.cuda.device_count() - for k in range(num_gpu): - device = f"cuda:{k}" - try: - x = torch.tensor([1, 2.0], dtype=torch.float32) - x = x.to(device) - except Exception as e: - print( - f"Unable to copy tensor to device={device}: {e}. " - "Your CUDA environment is broken." 
- ) - if num_gpu > 1: - test_nccl_ops() diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py deleted file mode 100755 index 150ccc37..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/colormap.py +++ /dev/null @@ -1,140 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -""" -An awesome colormap for really neat visualizations. -Copied from Detectron, and removed gray colors. -""" - -import numpy as np - -__all__ = ["colormap", "random_color"] - -# fmt: off -# RGB: -_COLORS = np.array( - [ - 0.000, 0.447, 0.741, - 0.850, 0.325, 0.098, - 0.929, 0.694, 0.125, - 0.494, 0.184, 0.556, - 0.466, 0.674, 0.188, - 0.301, 0.745, 0.933, - 0.635, 0.078, 0.184, - 0.300, 0.300, 0.300, - 0.600, 0.600, 0.600, - 1.000, 0.000, 0.000, - 1.000, 0.500, 0.000, - 0.749, 0.749, 0.000, - 0.000, 1.000, 0.000, - 0.000, 0.000, 1.000, - 0.667, 0.000, 1.000, - 0.333, 0.333, 0.000, - 0.333, 0.667, 0.000, - 0.333, 1.000, 0.000, - 0.667, 0.333, 0.000, - 0.667, 0.667, 0.000, - 0.667, 1.000, 0.000, - 1.000, 0.333, 0.000, - 1.000, 0.667, 0.000, - 1.000, 1.000, 0.000, - 0.000, 0.333, 0.500, - 0.000, 0.667, 0.500, - 0.000, 1.000, 0.500, - 0.333, 0.000, 0.500, - 0.333, 0.333, 0.500, - 0.333, 0.667, 0.500, - 0.333, 1.000, 0.500, - 0.667, 0.000, 0.500, - 0.667, 0.333, 0.500, - 0.667, 0.667, 0.500, - 0.667, 1.000, 0.500, - 1.000, 0.000, 0.500, - 1.000, 0.333, 0.500, - 1.000, 0.667, 0.500, - 1.000, 1.000, 0.500, - 0.000, 0.333, 1.000, - 0.000, 0.667, 1.000, - 0.000, 1.000, 1.000, - 0.333, 0.000, 1.000, - 0.333, 0.333, 1.000, - 0.333, 0.667, 1.000, - 0.333, 1.000, 1.000, - 0.667, 0.000, 1.000, - 0.667, 0.333, 1.000, - 0.667, 0.667, 1.000, - 0.667, 1.000, 1.000, - 1.000, 0.000, 1.000, - 1.000, 0.333, 1.000, - 1.000, 0.667, 1.000, - 0.333, 0.000, 0.000, - 0.500, 0.000, 0.000, - 0.667, 0.000, 0.000, - 0.833, 0.000, 0.000, - 1.000, 0.000, 0.000, - 0.000, 0.167, 0.000, - 0.000, 0.333, 0.000, - 0.000, 0.500, 0.000, - 0.000, 0.667, 0.000, - 0.000, 0.833, 0.000, - 0.000, 1.000, 0.000, - 0.000, 0.000, 0.167, - 0.000, 0.000, 0.333, - 0.000, 0.000, 0.500, - 0.000, 0.000, 0.667, - 0.000, 0.000, 0.833, - 0.000, 0.000, 1.000, - 0.000, 0.000, 0.000, - 0.143, 0.143, 0.143, - 0.857, 0.857, 0.857, - 1.000, 1.000, 1.000 - ] -).astype(np.float32).reshape(-1, 3) -# fmt: on - - -def colormap(rgb=False, maximum=255): - """ - Args: - rgb (bool): whether to return RGB colors or BGR colors. - maximum (int): either 255 or 1 - - Returns: - ndarray: a float32 array of Nx3 colors, in range [0, 255] or [0, 1] - """ - assert maximum in [255, 1], maximum - c = _COLORS * maximum - if not rgb: - c = c[:, ::-1] - return c - - -def random_color(rgb=False, maximum=255): - """ - Args: - rgb (bool): whether to return RGB colors or BGR colors. 
- maximum (int): either 255 or 1
-
- Returns:
- ndarray: a vector of 3 numbers
- """
- idx = np.random.randint(0, len(_COLORS))
- ret = _COLORS[idx] * maximum
- if not rgb:
- ret = ret[::-1]
- return ret
-
-
-if __name__ == "__main__":
- import cv2
-
- size = 100
- H, W = 10, 10
- canvas = np.random.rand(H * size, W * size, 3).astype("float32")
- for h in range(H):
- for w in range(W):
- idx = h * W + w
- if idx >= len(_COLORS):
- break
- canvas[h * size : (h + 1) * size, w * size : (w + 1) * size] = _COLORS[idx]
- cv2.imshow("a", canvas)
- cv2.waitKey(0)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py
deleted file mode 100755
index 7e2a0c44..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/comm.py
+++ /dev/null
@@ -1,199 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-"""
-This file contains primitives for multi-gpu communication.
-This is useful when doing distributed training.
-"""
-
-import functools
-import numpy as np
-import torch
-import torch.distributed as dist
-
-_LOCAL_PROCESS_GROUP = None
-"""
-A torch process group which only includes processes that are on the same machine as the current process.
-This variable is set when processes are spawned by `launch()` in "engine/launch.py".
-"""
-
-
-def get_world_size() -> int:
- if not dist.is_available():
- return 1
- if not dist.is_initialized():
- return 1
- return dist.get_world_size()
-
-
-def get_rank() -> int:
- if not dist.is_available():
- return 0
- if not dist.is_initialized():
- return 0
- return dist.get_rank()
-
-
-def get_local_rank() -> int:
- """
- Returns:
- The rank of the current process within the local (per-machine) process group.
- """
- if not dist.is_available():
- return 0
- if not dist.is_initialized():
- return 0
- assert (
- _LOCAL_PROCESS_GROUP is not None
- ), "Local process group is not created! Please use launch() to spawn processes!"
- return dist.get_rank(group=_LOCAL_PROCESS_GROUP)
-
-
-def get_local_size() -> int:
- """
- Returns:
- The size of the per-machine process group,
- i.e. the number of processes per machine.
- """
- if not dist.is_available():
- return 1
- if not dist.is_initialized():
- return 1
- return dist.get_world_size(group=_LOCAL_PROCESS_GROUP)
-
-
-def is_main_process() -> bool:
- return get_rank() == 0
-
-
-def synchronize():
- """
- Helper function to synchronize (barrier) among all processes when
- using distributed training
- """
- if not dist.is_available():
- return
- if not dist.is_initialized():
- return
- world_size = dist.get_world_size()
- if world_size == 1:
- return
- if dist.get_backend() == dist.Backend.NCCL:
- # This argument is needed to avoid warnings.
- # It's valid only for NCCL backend.
- dist.barrier(device_ids=[torch.cuda.current_device()])
- else:
- dist.barrier()
-
-
-@functools.lru_cache()
-def _get_global_gloo_group():
- """
- Return a process group based on gloo backend, containing all the ranks
- The result is cached.
- """
- if dist.get_backend() == "nccl":
- return dist.new_group(backend="gloo")
- else:
- return dist.group.WORLD
-
-
-def all_gather(data, group=None):
- """
- Run all_gather on arbitrary picklable data (not necessarily tensors).
-
- Args:
- data: any picklable object
- group: a torch process group. By default, will use a group which
- contains all ranks on gloo backend.
- - Returns: - list[data]: list of data gathered from each rank - """ - if get_world_size() == 1: - return [data] - if group is None: - group = _get_global_gloo_group() # use CPU group by default, to reduce GPU RAM usage. - world_size = dist.get_world_size(group) - if world_size == 1: - return [data] - - output = [None for _ in range(world_size)] - dist.all_gather_object(output, data, group=group) - return output - - -def gather(data, dst=0, group=None): - """ - Run gather on arbitrary picklable data (not necessarily tensors). - - Args: - data: any picklable object - dst (int): destination rank - group: a torch process group. By default, will use a group which - contains all ranks on gloo backend. - - Returns: - list[data]: on dst, a list of data gathered from each rank. Otherwise, - an empty list. - """ - if get_world_size() == 1: - return [data] - if group is None: - group = _get_global_gloo_group() - world_size = dist.get_world_size(group=group) - if world_size == 1: - return [data] - rank = dist.get_rank(group=group) - - if rank == dst: - output = [None for _ in range(world_size)] - dist.gather_object(data, output, dst=dst, group=group) - return output - else: - dist.gather_object(data, None, dst=dst, group=group) - return [] - - -def shared_random_seed(): - """ - Returns: - int: a random number that is the same across all workers. - If workers need a shared RNG, they can use this shared seed to - create one. - - All workers must call this function, otherwise it will deadlock. - """ - ints = np.random.randint(2 ** 31) - all_ints = all_gather(ints) - return all_ints[0] - - -def reduce_dict(input_dict, average=True): - """ - Reduce the values in the dictionary from all processes so that process with rank - 0 has the reduced results. - - Args: - input_dict (dict): inputs to be reduced. All the values must be scalar CUDA Tensor. - average (bool): whether to do average or sum - - Returns: - a dict with the same keys as input_dict, after reduction. - """ - world_size = get_world_size() - if world_size < 2: - return input_dict - with torch.no_grad(): - names = [] - values = [] - # sort the keys so that they are consistent across processes - for k in sorted(input_dict.keys()): - names.append(k) - values.append(input_dict[k]) - values = torch.stack(values, dim=0) - dist.reduce(values, dst=0) - if dist.get_rank() == 0 and average: - # only main process gets accumulated, so only divide by - # world_size in this case - values /= world_size - reduced_dict = {k: v for k, v in zip(names, values)} - return reduced_dict diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py deleted file mode 100755 index 40634c17..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/env.py +++ /dev/null @@ -1,170 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import importlib -import importlib.util -import logging -import numpy as np -import os -import random -import sys -from datetime import datetime -import torch - -__all__ = ["seed_all_rng"] - - -TORCH_VERSION = tuple(int(x) for x in torch.__version__.split(".")[:2]) -""" -PyTorch version as a tuple of 2 ints. Useful for comparison. -""" - - -DOC_BUILDING = os.getenv("_DOC_BUILDING", False) # set in docs/conf.py -""" -Whether we're building documentation. -""" - - -def seed_all_rng(seed=None): - """ - Set the random seed for the RNG in torch, numpy and python. - - Args: - seed (int): if None, will use a strong random seed. 
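-
- Illustrative usage (the seed value below is arbitrary)::
-
- seed_all_rng(42) # fully reproducible runs
- seed_all_rng() # entropy-based seed; the chosen value is logged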
- """ - if seed is None: - seed = ( - os.getpid() - + int(datetime.now().strftime("%S%f")) - + int.from_bytes(os.urandom(2), "big") - ) - logger = logging.getLogger(__name__) - logger.info("Using a generated random seed {}".format(seed)) - np.random.seed(seed) - torch.manual_seed(seed) - random.seed(seed) - os.environ["PYTHONHASHSEED"] = str(seed) - - -# from https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path -def _import_file(module_name, file_path, make_importable=False): - spec = importlib.util.spec_from_file_location(module_name, file_path) - module = importlib.util.module_from_spec(spec) - spec.loader.exec_module(module) - if make_importable: - sys.modules[module_name] = module - return module - - -def _configure_libraries(): - """ - Configurations for some libraries. - """ - # An environment option to disable `import cv2` globally, - # in case it leads to negative performance impact - disable_cv2 = int(os.environ.get("DETECTRON2_DISABLE_CV2", False)) - if disable_cv2: - sys.modules["cv2"] = None - else: - # Disable opencl in opencv since its interaction with cuda often has negative effects - # This envvar is supported after OpenCV 3.4.0 - os.environ["OPENCV_OPENCL_RUNTIME"] = "disabled" - try: - import cv2 - - if int(cv2.__version__.split(".")[0]) >= 3: - cv2.ocl.setUseOpenCL(False) - except ModuleNotFoundError: - # Other types of ImportError, if happened, should not be ignored. - # Because a failed opencv import could mess up address space - # https://github.com/skvark/opencv-python/issues/381 - pass - - def get_version(module, digit=2): - return tuple(map(int, module.__version__.split(".")[:digit])) - - # fmt: off - assert get_version(torch) >= (1, 4), "Requires torch>=1.4" - import fvcore - assert get_version(fvcore, 3) >= (0, 1, 2), "Requires fvcore>=0.1.2" - import yaml - assert get_version(yaml) >= (5, 1), "Requires pyyaml>=5.1" - # fmt: on - - -_ENV_SETUP_DONE = False - - -def setup_environment(): - """Perform environment setup work. The default setup is a no-op, but this - function allows the user to specify a Python source file or a module in - the $DETECTRON2_ENV_MODULE environment variable, that performs - custom setup work that may be necessary to their computing environment. - """ - global _ENV_SETUP_DONE - if _ENV_SETUP_DONE: - return - _ENV_SETUP_DONE = True - - _configure_libraries() - - custom_module_path = os.environ.get("DETECTRON2_ENV_MODULE") - - if custom_module_path: - setup_custom_environment(custom_module_path) - else: - # The default setup is a no-op - pass - - -def setup_custom_environment(custom_module): - """ - Load custom environment setup by importing a Python source file or a - module, and run the setup function. - """ - if custom_module.endswith(".py"): - module = _import_file("detectron2.utils.env.custom_module", custom_module) - else: - module = importlib.import_module(custom_module) - assert hasattr(module, "setup_environment") and callable(module.setup_environment), ( - "Custom environment module defined in {} does not have the " - "required callable attribute 'setup_environment'." - ).format(custom_module) - module.setup_environment() - - -def fixup_module_metadata(module_name, namespace, keys=None): - """ - Fix the __qualname__ of module members to be their exported api name, so - when they are referenced in docs, sphinx can find them. 
Reference:
- https://github.com/python-trio/trio/blob/6754c74eacfad9cc5c92d5c24727a2f3b620624e/trio/_util.py#L216-L241
- """
- if not DOC_BUILDING:
- return
- seen_ids = set()
-
- def fix_one(qualname, name, obj):
- # avoid infinite recursion (relevant when using
- # typing.Generic, for example)
- if id(obj) in seen_ids:
- return
- seen_ids.add(id(obj))
-
- mod = getattr(obj, "__module__", None)
- if mod is not None and (mod.startswith(module_name) or mod.startswith("fvcore.")):
- obj.__module__ = module_name
- # Modules, unlike everything else in Python, put fully-qualified
- # names into their __name__ attribute. We check for "." to avoid
- # rewriting these.
- if hasattr(obj, "__name__") and "." not in obj.__name__:
- obj.__name__ = name
- obj.__qualname__ = qualname
- if isinstance(obj, type):
- for attr_name, attr_value in obj.__dict__.items():
- fix_one(objname + "." + attr_name, attr_name, attr_value)
-
- if keys is None:
- keys = namespace.keys()
- for objname in keys:
- if not objname.startswith("_"):
- obj = namespace[objname]
- fix_one(objname, objname, obj)
diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py
deleted file mode 100755
index 5dee954b..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/events.py
+++ /dev/null
@@ -1,486 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import datetime
-import json
-import logging
-import os
-import time
-from collections import defaultdict
-from contextlib import contextmanager
-from typing import Optional
-import torch
-from fvcore.common.history_buffer import HistoryBuffer
-
-from detectron2.utils.file_io import PathManager
-
-__all__ = [
- "get_event_storage",
- "JSONWriter",
- "TensorboardXWriter",
- "CommonMetricPrinter",
- "EventStorage",
-]
-
-_CURRENT_STORAGE_STACK = []
-
-
-def get_event_storage():
- """
- Returns:
- The :class:`EventStorage` object that's currently being used.
- Throws an error if no :class:`EventStorage` is currently enabled.
- """
- assert len(
- _CURRENT_STORAGE_STACK
- ), "get_event_storage() has to be called inside a 'with EventStorage(...)' context!"
- return _CURRENT_STORAGE_STACK[-1]
-
-
-class EventWriter:
- """
- Base class for writers that obtain events from :class:`EventStorage` and process them.
- """
-
- def write(self):
- raise NotImplementedError
-
- def close(self):
- pass
-
-
-class JSONWriter(EventWriter):
- """
- Write scalars to a json file.
-
- It saves scalars as one json per line (instead of a big json) for easy parsing.
-
- Examples parsing such a json file:
- ::
- $ cat metrics.json | jq -s '.[0:2]'
- [
- {
- "data_time": 0.008433341979980469,
- "iteration": 19,
- "loss": 1.9228371381759644,
- "loss_box_reg": 0.050025828182697296,
- "loss_classifier": 0.5316952466964722,
- "loss_mask": 0.7236229181289673,
- "loss_rpn_box": 0.0856662318110466,
- "loss_rpn_cls": 0.48198649287223816,
- "lr": 0.007173333333333333,
- "time": 0.25401854515075684
- },
- {
- "data_time": 0.007216215133666992,
- "iteration": 39,
- "loss": 1.282649278640747,
- "loss_box_reg": 0.06222952902317047,
- "loss_classifier": 0.30682939291000366,
- "loss_mask": 0.6970193982124329,
- "loss_rpn_box": 0.038663312792778015,
- "loss_rpn_cls": 0.1471673548221588,
- "lr": 0.007706666666666667,
- "time": 0.2490077018737793
- }
- ]
-
- $ cat metrics.json | jq '.loss_mask'
- 0.7126231789588928
- 0.689423680305481
- 0.6776131987571716
- ...
- - """ - - def __init__(self, json_file, window_size=20): - """ - Args: - json_file (str): path to the json file. New data will be appended if the file exists. - window_size (int): the window size of median smoothing for the scalars whose - `smoothing_hint` are True. - """ - self._file_handle = PathManager.open(json_file, "a") - self._window_size = window_size - self._last_write = -1 - - def write(self): - storage = get_event_storage() - to_save = defaultdict(dict) - - for k, (v, iter) in storage.latest_with_smoothing_hint(self._window_size).items(): - # keep scalars that have not been written - if iter <= self._last_write: - continue - to_save[iter][k] = v - if len(to_save): - all_iters = sorted(to_save.keys()) - self._last_write = max(all_iters) - - for itr, scalars_per_iter in to_save.items(): - scalars_per_iter["iteration"] = itr - self._file_handle.write(json.dumps(scalars_per_iter, sort_keys=True) + "\n") - self._file_handle.flush() - try: - os.fsync(self._file_handle.fileno()) - except AttributeError: - pass - - def close(self): - self._file_handle.close() - - -class TensorboardXWriter(EventWriter): - """ - Write all scalars to a tensorboard file. - """ - - def __init__(self, log_dir: str, window_size: int = 20, **kwargs): - """ - Args: - log_dir (str): the directory to save the output events - window_size (int): the scalars will be median-smoothed by this window size - - kwargs: other arguments passed to `torch.utils.tensorboard.SummaryWriter(...)` - """ - self._window_size = window_size - from torch.utils.tensorboard import SummaryWriter - - self._writer = SummaryWriter(log_dir, **kwargs) - self._last_write = -1 - - def write(self): - storage = get_event_storage() - new_last_write = self._last_write - for k, (v, iter) in storage.latest_with_smoothing_hint(self._window_size).items(): - if iter > self._last_write: - self._writer.add_scalar(k, v, iter) - new_last_write = max(new_last_write, iter) - self._last_write = new_last_write - - # storage.put_{image,histogram} is only meant to be used by - # tensorboard writer. So we access its internal fields directly from here. - if len(storage._vis_data) >= 1: - for img_name, img, step_num in storage._vis_data: - self._writer.add_image(img_name, img, step_num) - # Storage stores all image data and rely on this writer to clear them. - # As a result it assumes only one writer will use its image data. - # An alternative design is to let storage store limited recent - # data (e.g. only the most recent image) that all writers can access. - # In that case a writer may not see all image data if its period is long. - storage.clear_images() - - if len(storage._histograms) >= 1: - for params in storage._histograms: - self._writer.add_histogram_raw(**params) - storage.clear_histograms() - - def close(self): - if hasattr(self, "_writer"): # doesn't exist when the code fails at import - self._writer.close() - - -class CommonMetricPrinter(EventWriter): - """ - Print **common** metrics to the terminal, including - iteration time, ETA, memory, all losses, and the learning rate. - It also applies smoothing using a window of 20 elements. - - It's meant to print common metrics in common ways. - To print something in more customized ways, please implement a similar printer by yourself. - """ - - def __init__(self, max_iter: Optional[int] = None, window_size: int = 20): - """ - Args: - max_iter: the maximum number of iterations to train. - Used to compute ETA. If not given, ETA will not be printed. 
- window_size (int): the losses will be median-smoothed by this window size - """ - self.logger = logging.getLogger(__name__) - self._max_iter = max_iter - self._window_size = window_size - self._last_write = None # (step, time) of last call to write(). Used to compute ETA - - def _get_eta(self, storage) -> Optional[str]: - if self._max_iter is None: - return "" - iteration = storage.iter - try: - eta_seconds = storage.history("time").median(1000) * (self._max_iter - iteration - 1) - storage.put_scalar("eta_seconds", eta_seconds, smoothing_hint=False) - return str(datetime.timedelta(seconds=int(eta_seconds))) - except KeyError: - # estimate eta on our own - more noisy - eta_string = None - if self._last_write is not None: - estimate_iter_time = (time.perf_counter() - self._last_write[1]) / ( - iteration - self._last_write[0] - ) - eta_seconds = estimate_iter_time * (self._max_iter - iteration - 1) - eta_string = str(datetime.timedelta(seconds=int(eta_seconds))) - self._last_write = (iteration, time.perf_counter()) - return eta_string - - def write(self): - storage = get_event_storage() - iteration = storage.iter - if iteration == self._max_iter: - # This hook only reports training progress (loss, ETA, etc) but not other data, - # therefore do not write anything after training succeeds, even if this method - # is called. - return - - try: - data_time = storage.history("data_time").avg(20) - except KeyError: - # they may not exist in the first few iterations (due to warmup) - # or when SimpleTrainer is not used - data_time = None - try: - iter_time = storage.history("time").global_avg() - except KeyError: - iter_time = None - try: - lr = "{:.5g}".format(storage.history("lr").latest()) - except KeyError: - lr = "N/A" - - eta_string = self._get_eta(storage) - - if torch.cuda.is_available(): - max_mem_mb = torch.cuda.max_memory_allocated() / 1024.0 / 1024.0 - else: - max_mem_mb = None - - # NOTE: max_mem is parsed by grep in "dev/parse_results.sh" - self.logger.info( - " {eta}iter: {iter} {losses} {time}{data_time}lr: {lr} {memory}".format( - eta=f"eta: {eta_string} " if eta_string else "", - iter=iteration, - losses=" ".join( - [ - "{}: {:.4g}".format(k, v.median(self._window_size)) - for k, v in storage.histories().items() - if "loss" in k - ] - ), - time="time: {:.4f} ".format(iter_time) if iter_time is not None else "", - data_time="data_time: {:.4f} ".format(data_time) if data_time is not None else "", - lr=lr, - memory="max_mem: {:.0f}M".format(max_mem_mb) if max_mem_mb is not None else "", - ) - ) - - -class EventStorage: - """ - The user-facing class that provides metric storage functionalities. - - In the future we may add support for storing / logging other types of data if needed. - """ - - def __init__(self, start_iter=0): - """ - Args: - start_iter (int): the iteration number to start with - """ - self._history = defaultdict(HistoryBuffer) - self._smoothing_hints = {} - self._latest_scalars = {} - self._iter = start_iter - self._current_prefix = "" - self._vis_data = [] - self._histograms = [] - - def put_image(self, img_name, img_tensor): - """ - Add an `img_tensor` associated with `img_name`, to be shown on - tensorboard. - - Args: - img_name (str): The name of the image to put into tensorboard. - img_tensor (torch.Tensor or numpy.array): An `uint8` or `float` - Tensor of shape `[channel, height, width]` where `channel` is - 3. The image format should be RGB. The elements in img_tensor - can either have values in [0, 1] (float32) or [0, 255] (uint8). 
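`CommonMetricPrinter` reads everything from the ambient storage: the `"time"` history drives the ETA estimate, and losses are median-smoothed over its window. A self-contained sketch with stand-in values:

```
import logging
from detectron2.utils.events import CommonMetricPrinter, EventStorage

logging.basicConfig(level=logging.INFO)  # make printer.write() output visible

printer = CommonMetricPrinter(max_iter=100)
with EventStorage(start_iter=0) as storage:
    for it in range(100):
        storage.iter = it
        # "time" feeds the ETA; "loss" is what gets median-smoothed.
        storage.put_scalars(loss=1.0 / (it + 1), time=0.25, data_time=0.01)
        if (it + 1) % 20 == 0:
            printer.write()
```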
- The `img_tensor` will be visualized in tensorboard. - """ - self._vis_data.append((img_name, img_tensor, self._iter)) - - def put_scalar(self, name, value, smoothing_hint=True): - """ - Add a scalar `value` to the `HistoryBuffer` associated with `name`. - - Args: - smoothing_hint (bool): a 'hint' on whether this scalar is noisy and should be - smoothed when logged. The hint will be accessible through - :meth:`EventStorage.smoothing_hints`. A writer may ignore the hint - and apply custom smoothing rule. - - It defaults to True because most scalars we save need to be smoothed to - provide any useful signal. - """ - name = self._current_prefix + name - history = self._history[name] - value = float(value) - history.update(value, self._iter) - self._latest_scalars[name] = (value, self._iter) - - existing_hint = self._smoothing_hints.get(name) - if existing_hint is not None: - assert ( - existing_hint == smoothing_hint - ), "Scalar {} was put with a different smoothing_hint!".format(name) - else: - self._smoothing_hints[name] = smoothing_hint - - def put_scalars(self, *, smoothing_hint=True, **kwargs): - """ - Put multiple scalars from keyword arguments. - - Examples: - - storage.put_scalars(loss=my_loss, accuracy=my_accuracy, smoothing_hint=True) - """ - for k, v in kwargs.items(): - self.put_scalar(k, v, smoothing_hint=smoothing_hint) - - def put_histogram(self, hist_name, hist_tensor, bins=1000): - """ - Create a histogram from a tensor. - - Args: - hist_name (str): The name of the histogram to put into tensorboard. - hist_tensor (torch.Tensor): A Tensor of arbitrary shape to be converted - into a histogram. - bins (int): Number of histogram bins. - """ - ht_min, ht_max = hist_tensor.min().item(), hist_tensor.max().item() - - # Create a histogram with PyTorch - hist_counts = torch.histc(hist_tensor, bins=bins) - hist_edges = torch.linspace(start=ht_min, end=ht_max, steps=bins + 1, dtype=torch.float32) - - # Parameter for the add_histogram_raw function of SummaryWriter - hist_params = dict( - tag=hist_name, - min=ht_min, - max=ht_max, - num=len(hist_tensor), - sum=float(hist_tensor.sum()), - sum_squares=float(torch.sum(hist_tensor ** 2)), - bucket_limits=hist_edges[1:].tolist(), - bucket_counts=hist_counts.tolist(), - global_step=self._iter, - ) - self._histograms.append(hist_params) - - def history(self, name): - """ - Returns: - HistoryBuffer: the scalar history for name - """ - ret = self._history.get(name, None) - if ret is None: - raise KeyError("No history metric available for {}!".format(name)) - return ret - - def histories(self): - """ - Returns: - dict[name -> HistoryBuffer]: the HistoryBuffer for all scalars - """ - return self._history - - def latest(self): - """ - Returns: - dict[str -> (float, int)]: mapping from the name of each scalar to the most - recent value and the iteration number its added. - """ - return self._latest_scalars - - def latest_with_smoothing_hint(self, window_size=20): - """ - Similar to :meth:`latest`, but the returned values - are either the un-smoothed original latest value, - or a median of the given window_size, - depend on whether the smoothing_hint is True. - - This provides a default behavior that other writers can use. 
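`put_histogram` bins the tensor with `torch.histc` and stores the raw buckets in exactly the keyword format `SummaryWriter.add_histogram_raw` expects. A short sketch; it peeks at the private `_histograms` buffer purely for illustration:

```
import torch
from detectron2.utils.events import EventStorage

with EventStorage(start_iter=0) as storage:
    storage.put_histogram("weights", torch.randn(10_000), bins=100)
    params = storage._histograms[0]  # internal buffer, normally read by writers
    print(params["tag"], len(params["bucket_counts"]))  # weights 100
```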
- """ - result = {} - for k, (v, itr) in self._latest_scalars.items(): - result[k] = ( - self._history[k].median(window_size) if self._smoothing_hints[k] else v, - itr, - ) - return result - - def smoothing_hints(self): - """ - Returns: - dict[name -> bool]: the user-provided hint on whether the scalar - is noisy and needs smoothing. - """ - return self._smoothing_hints - - def step(self): - """ - User should either: (1) Call this function to increment storage.iter when needed. Or - (2) Set `storage.iter` to the correct iteration number before each iteration. - - The storage will then be able to associate the new data with an iteration number. - """ - self._iter += 1 - - @property - def iter(self): - """ - Returns: - int: The current iteration number. When used together with a trainer, - this is ensured to be the same as trainer.iter. - """ - return self._iter - - @iter.setter - def iter(self, val): - self._iter = int(val) - - @property - def iteration(self): - # for backward compatibility - return self._iter - - def __enter__(self): - _CURRENT_STORAGE_STACK.append(self) - return self - - def __exit__(self, exc_type, exc_val, exc_tb): - assert _CURRENT_STORAGE_STACK[-1] == self - _CURRENT_STORAGE_STACK.pop() - - @contextmanager - def name_scope(self, name): - """ - Yields: - A context within which all the events added to this storage - will be prefixed by the name scope. - """ - old_prefix = self._current_prefix - self._current_prefix = name.rstrip("/") + "/" - yield - self._current_prefix = old_prefix - - def clear_images(self): - """ - Delete all the stored images for visualization. This should be called - after images are written to tensorboard. - """ - self._vis_data = [] - - def clear_histograms(self): - """ - Delete all the stored histograms for visualization. - This should be called after histograms are written to tensorboard. - """ - self._histograms = [] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py deleted file mode 100755 index 46ee4ec3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/file_io.py +++ /dev/null @@ -1,37 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from iopath.common.file_io import HTTPURLHandler, OneDrivePathHandler, PathHandler -from iopath.common.file_io import PathManager as PathManagerBase - -__all__ = ["PathManager", "PathHandler"] - - -PathManager = PathManagerBase() -""" -This is a detectron2 project-specific PathManager. -We try to stay away from global PathManager in fvcore as it -introduces potential conflicts among other libraries. -""" - - -class Detectron2Handler(PathHandler): - """ - Resolve anything that's hosted under detectron2's namespace. 
- """ - - PREFIX = "detectron2://" - S3_DETECTRON2_PREFIX = "https://dl.fbaipublicfiles.com/detectron2/" - - def _get_supported_prefixes(self): - return [self.PREFIX] - - def _get_local_path(self, path, **kwargs): - name = path[len(self.PREFIX) :] - return PathManager.get_local_path(self.S3_DETECTRON2_PREFIX + name, **kwargs) - - def _open(self, path, mode="r", **kwargs): - return PathManager.open(self._get_local_path(path), mode, **kwargs) - - -PathManager.register_handler(HTTPURLHandler()) -PathManager.register_handler(OneDrivePathHandler()) -PathManager.register_handler(Detectron2Handler()) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py deleted file mode 100755 index 7c7890f8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/logger.py +++ /dev/null @@ -1,237 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import atexit -import functools -import logging -import os -import sys -import time -from collections import Counter -import torch -from tabulate import tabulate -from termcolor import colored - -from detectron2.utils.file_io import PathManager - -__all__ = ["setup_logger", "log_first_n", "log_every_n", "log_every_n_seconds"] - - -class _ColorfulFormatter(logging.Formatter): - def __init__(self, *args, **kwargs): - self._root_name = kwargs.pop("root_name") + "." - self._abbrev_name = kwargs.pop("abbrev_name", "") - if len(self._abbrev_name): - self._abbrev_name = self._abbrev_name + "." - super(_ColorfulFormatter, self).__init__(*args, **kwargs) - - def formatMessage(self, record): - record.name = record.name.replace(self._root_name, self._abbrev_name) - log = super(_ColorfulFormatter, self).formatMessage(record) - if record.levelno == logging.WARNING: - prefix = colored("WARNING", "red", attrs=["blink"]) - elif record.levelno == logging.ERROR or record.levelno == logging.CRITICAL: - prefix = colored("ERROR", "red", attrs=["blink", "underline"]) - else: - return log - return prefix + " " + log - - -@functools.lru_cache() # so that calling setup_logger multiple times won't add many handlers -def setup_logger( - output=None, distributed_rank=0, *, color=True, name="detectron2", abbrev_name=None -): - """ - Initialize the detectron2 logger and set its verbosity level to "DEBUG". - - Args: - output (str): a file name or a directory to save log. If None, will not save log file. - If ends with ".txt" or ".log", assumed to be a file name. - Otherwise, logs will be saved to `output/log.txt`. - name (str): the root module name of this logger - abbrev_name (str): an abbreviation of the module, to avoid long names in logs. - Set to "" to not log the root module in logs. - By default, will abbreviate "detectron2" to "d2" and leave other - modules unchanged. 
- - Returns: - logging.Logger: a logger - """ - logger = logging.getLogger(name) - logger.setLevel(logging.DEBUG) - logger.propagate = False - - if abbrev_name is None: - abbrev_name = "d2" if name == "detectron2" else name - - plain_formatter = logging.Formatter( - "[%(asctime)s] %(name)s %(levelname)s: %(message)s", datefmt="%m/%d %H:%M:%S" - ) - # stdout logging: master only - if distributed_rank == 0: - ch = logging.StreamHandler(stream=sys.stdout) - ch.setLevel(logging.DEBUG) - if color: - formatter = _ColorfulFormatter( - colored("[%(asctime)s %(name)s]: ", "green") + "%(message)s", - datefmt="%m/%d %H:%M:%S", - root_name=name, - abbrev_name=str(abbrev_name), - ) - else: - formatter = plain_formatter - ch.setFormatter(formatter) - logger.addHandler(ch) - - # file logging: all workers - if output is not None: - if output.endswith(".txt") or output.endswith(".log"): - filename = output - else: - filename = os.path.join(output, "log.txt") - if distributed_rank > 0: - filename = filename + ".rank{}".format(distributed_rank) - PathManager.mkdirs(os.path.dirname(filename)) - - fh = logging.StreamHandler(_cached_log_stream(filename)) - fh.setLevel(logging.DEBUG) - fh.setFormatter(plain_formatter) - logger.addHandler(fh) - - return logger - - -# cache the opened file object, so that different calls to `setup_logger` -# with the same file name can safely write to the same file. -@functools.lru_cache(maxsize=None) -def _cached_log_stream(filename): - # use 1K buffer if writing to cloud storage - io = PathManager.open(filename, "a", buffering=1024 if "://" in filename else -1) - atexit.register(io.close) - return io - - -""" -Below are some other convenient logging methods. -They are mainly adopted from -https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py -""" - - -def _find_caller(): - """ - Returns: - str: module name of the caller - tuple: a hashable key to be used to identify different callers - """ - frame = sys._getframe(2) - while frame: - code = frame.f_code - if os.path.join("utils", "logger.") not in code.co_filename: - mod_name = frame.f_globals["__name__"] - if mod_name == "__main__": - mod_name = "detectron2" - return mod_name, (code.co_filename, frame.f_lineno, code.co_name) - frame = frame.f_back - - -_LOG_COUNTER = Counter() -_LOG_TIMER = {} - - -def log_first_n(lvl, msg, n=1, *, name=None, key="caller"): - """ - Log only for the first n times. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - key (str or tuple[str]): the string(s) can be one of "caller" or - "message", which defines how to identify duplicated logs. - For example, if called with `n=1, key="caller"`, this function - will only log the first call from the same caller, regardless of - the message content. - If called with `n=1, key="message"`, this function will log the - same content only once, even if they are called from different places. - If called with `n=1, key=("caller", "message")`, this function - will not log only if the same caller has logged the same message before. 
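Thanks to the `functools.lru_cache` decorator above, `setup_logger` is effectively idempotent, so every entry point can call it defensively. A short sketch:

```
from detectron2.utils.logger import setup_logger

logger = setup_logger(output="./output", distributed_rank=0, name="detectron2")
logger.info("visible on rank 0 stdout and in ./output/log.txt")
# Non-zero ranks skip the stdout handler and write to per-rank files:
# ./output/log.txt.rank1, ./output/log.txt.rank2, ...
```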
- """ - if isinstance(key, str): - key = (key,) - assert len(key) > 0 - - caller_module, caller_key = _find_caller() - hash_key = () - if "caller" in key: - hash_key = hash_key + caller_key - if "message" in key: - hash_key = hash_key + (msg,) - - _LOG_COUNTER[hash_key] += 1 - if _LOG_COUNTER[hash_key] <= n: - logging.getLogger(name or caller_module).log(lvl, msg) - - -def log_every_n(lvl, msg, n=1, *, name=None): - """ - Log once per n times. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - """ - caller_module, key = _find_caller() - _LOG_COUNTER[key] += 1 - if n == 1 or _LOG_COUNTER[key] % n == 1: - logging.getLogger(name or caller_module).log(lvl, msg) - - -def log_every_n_seconds(lvl, msg, n=1, *, name=None): - """ - Log no more than once per n seconds. - - Args: - lvl (int): the logging level - msg (str): - n (int): - name (str): name of the logger to use. Will use the caller's module by default. - """ - caller_module, key = _find_caller() - last_logged = _LOG_TIMER.get(key, None) - current_time = time.time() - if last_logged is None or current_time - last_logged >= n: - logging.getLogger(name or caller_module).log(lvl, msg) - _LOG_TIMER[key] = current_time - - -def create_small_table(small_dict): - """ - Create a small table using the keys of small_dict as headers. This is only - suitable for small dictionaries. - - Args: - small_dict (dict): a result dictionary of only a few items. - - Returns: - str: the table as a string. - """ - keys, values = tuple(zip(*small_dict.items())) - table = tabulate( - [values], - headers=keys, - tablefmt="pipe", - floatfmt=".3f", - stralign="center", - numalign="center", - ) - return table - - -def _log_api_usage(identifier: str): - """ - Internal function used to log the usage of different detectron2 components - inside facebook's infra. - """ - torch._C._log_api_usage_once("detectron2." + identifier) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py deleted file mode 100755 index bd494780..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/memory.py +++ /dev/null @@ -1,84 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -from contextlib import contextmanager -from functools import wraps -import torch - -__all__ = ["retry_if_cuda_oom"] - - -@contextmanager -def _ignore_torch_cuda_oom(): - """ - A context which ignores CUDA OOM exception from pytorch. - """ - try: - yield - except RuntimeError as e: - # NOTE: the string may change? - if "CUDA out of memory. " in str(e): - pass - else: - raise - - -def retry_if_cuda_oom(func): - """ - Makes a function retry itself after encountering - pytorch's CUDA OOM error. - It will first retry after calling `torch.cuda.empty_cache()`. - - If that still fails, it will then retry by trying to convert inputs to CPUs. - In this case, it expects the function to dispatch to CPU implementation. - The return values may become CPU tensors as well and it's user's - responsibility to convert it back to CUDA tensor if needed. - - Args: - func: a stateless callable that takes tensor-like objects as arguments - - Returns: - a callable which retries `func` if OOM is encountered. - - Examples: - :: - output = retry_if_cuda_oom(some_torch_function)(input1, input2) - # output may be on CPU even if inputs are on GPU - - Note: - 1. 
When converting inputs to CPU, it will only look at each argument and check - if it has `.device` and `.to` for conversion. Nested structures of tensors - are not supported. - - 2. Since the function might be called more than once, it has to be - stateless. - """ - - def maybe_to_cpu(x): - try: - like_gpu_tensor = x.device.type == "cuda" and hasattr(x, "to") - except AttributeError: - like_gpu_tensor = False - if like_gpu_tensor: - return x.to(device="cpu") - else: - return x - - @wraps(func) - def wrapped(*args, **kwargs): - with _ignore_torch_cuda_oom(): - return func(*args, **kwargs) - - # Clear cache and retry - torch.cuda.empty_cache() - with _ignore_torch_cuda_oom(): - return func(*args, **kwargs) - - # Try on CPU. This slows down the code significantly, therefore print a notice. - logger = logging.getLogger(__name__) - logger.info("Attempting to copy inputs of {} to CPU due to CUDA OOM".format(str(func))) - new_args = (maybe_to_cpu(x) for x in args) - new_kwargs = {k: maybe_to_cpu(v) for k, v in kwargs.items()} - return func(*new_args, **new_kwargs) - - return wrapped diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py deleted file mode 100755 index 4b01e900..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/registry.py +++ /dev/null @@ -1,60 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -from typing import Any -import pydoc -from fvcore.common.registry import Registry # for backward compatibility. - -""" -``Registry`` and `locate` provide ways to map a string (typically found -in config files) to callable objects. -""" - -__all__ = ["Registry", "locate"] - - -def _convert_target_to_string(t: Any) -> str: - """ - Inverse of ``locate()``. - - Args: - t: any object with ``__module__`` and ``__qualname__`` - """ - module, qualname = t.__module__, t.__qualname__ - - # Compress the path to this object, e.g. ``module.submodule._impl.class`` - # may become ``module.submodule.class``, if the later also resolves to the same - # object. This simplifies the string, and also is less affected by moving the - # class implementation. - module_parts = module.split(".") - for k in range(1, len(module_parts)): - prefix = ".".join(module_parts[:k]) - candidate = f"{prefix}.{qualname}" - try: - if locate(candidate) is t: - return candidate - except ImportError: - pass - return f"{module}.{qualname}" - - -def locate(name: str) -> Any: - """ - Locate and return an object ``x`` using an input string ``{x.__module__}.{x.__qualname__}``, - such as "module.submodule.class_name". - - Raise Exception if it cannot be found. - """ - obj = pydoc.locate(name) - - # Some cases (e.g. torch.optim.sgd.SGD) not handled correctly - # by pydoc.locate. Try a private function from hydra. - if obj is None: - try: - # from hydra.utils import get_method - will print many errors - from hydra.utils import _locate - except ImportError as e: - raise ImportError(f"Cannot dynamically locate object {name}!") from e - else: - obj = _locate(name) # it raises if fails - - return obj diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py deleted file mode 100755 index 0b388628..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/serialize.py +++ /dev/null @@ -1,32 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
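`retry_if_cuda_oom` only helps for stateless, tensor-in/tensor-out callables, since it may run the function up to three times and may move the inputs to CPU on the last attempt. A sketch with `torch.cdist` as the wrapped function:

```
import torch
from detectron2.utils.memory import retry_if_cuda_oom

def pairwise_dist(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Stateless and tensor-in/tensor-out: exactly the expected contract.
    return torch.cdist(a, b)

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(2048, 128, device=device)
b = torch.randn(2048, 128, device=device)
d = retry_if_cuda_oom(pairwise_dist)(a, b)
print(d.shape, d.device)  # may be a CPU tensor if both GPU attempts OOM'd
```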
-import cloudpickle - - -class PicklableWrapper(object): - """ - Wrap an object to make it more picklable, note that it uses - heavy weight serialization libraries that are slower than pickle. - It's best to use it only on closures (which are usually not picklable). - - This is a simplified version of - https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py - """ - - def __init__(self, obj): - while isinstance(obj, PicklableWrapper): - # Wrapping an object twice is no-op - obj = obj._obj - self._obj = obj - - def __reduce__(self): - s = cloudpickle.dumps(self._obj) - return cloudpickle.loads, (s,) - - def __call__(self, *args, **kwargs): - return self._obj(*args, **kwargs) - - def __getattr__(self, attr): - # Ensure that the wrapped object can be used seamlessly as the previous object. - if attr not in ["_obj"]: - return getattr(self._obj, attr) - return getattr(self, attr) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py deleted file mode 100755 index 161fa6b8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/testing.py +++ /dev/null @@ -1,137 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import io -import numpy as np -import torch - -from detectron2 import model_zoo -from detectron2.data import DatasetCatalog -from detectron2.data.detection_utils import read_image -from detectron2.modeling import build_model -from detectron2.structures import Boxes, Instances, ROIMasks -from detectron2.utils.file_io import PathManager - - -""" -Internal utilities for tests. Don't use except for writing tests. -""" - - -def get_model_no_weights(config_path): - """ - Like model_zoo.get, but do not load any weights (even pretrained) - """ - cfg = model_zoo.get_config(config_path) - if not torch.cuda.is_available(): - cfg.MODEL.DEVICE = "cpu" - return build_model(cfg) - - -def random_boxes(num_boxes, max_coord=100, device="cpu"): - """ - Create a random Nx4 boxes tensor, with coordinates < max_coord. - """ - boxes = torch.rand(num_boxes, 4, device=device) * (max_coord * 0.5) - boxes.clamp_(min=1.0) # tiny boxes cause numerical instability in box regression - # Note: the implementation of this function in torchvision is: - # boxes[:, 2:] += torch.rand(N, 2) * 100 - # but it does not guarantee non-negative widths/heights constraints: - # boxes[:, 2] >= boxes[:, 0] and boxes[:, 3] >= boxes[:, 1]: - boxes[:, 2:] += boxes[:, :2] - return boxes - - -def get_sample_coco_image(tensor=True): - """ - Args: - tensor (bool): if True, returns 3xHxW tensor. - else, returns a HxWx3 numpy array. - - Returns: - an image, in BGR color. - """ - try: - file_name = DatasetCatalog.get("coco_2017_val_100")[0]["file_name"] - if not PathManager.exists(file_name): - raise FileNotFoundError() - except IOError: - # for public CI to run - file_name = PathManager.get_local_path( - "http://images.cocodataset.org/train2017/000000000009.jpg" - ) - ret = read_image(file_name, format="BGR") - if tensor: - ret = torch.from_numpy(np.ascontiguousarray(ret.transpose(2, 0, 1))) - return ret - - -def convert_scripted_instances(instances): - """ - Convert a scripted Instances object to a regular :class:`Instances` object - """ - assert hasattr( - instances, "image_size" - ), f"Expect an Instances object, but got {type(instances)}!" 
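A quick demonstration of the wrapper: stdlib `pickle` rejects lambdas outright, but the `__reduce__` above reroutes serialization through cloudpickle. Note that unpickling yields the raw wrapped callable, not another `PicklableWrapper`:

```
import pickle
from detectron2.utils.serialize import PicklableWrapper

scale = 2.0
double = lambda x: x * scale  # pickle.dumps(double) would raise PicklingError

wrapped = PicklableWrapper(double)
restored = pickle.loads(pickle.dumps(wrapped))  # the raw callable comes back
assert restored(21) == 42.0
```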
- ret = Instances(instances.image_size) - for name in instances._field_names: - val = getattr(instances, "_" + name, None) - if val is not None: - ret.set(name, val) - return ret - - -def assert_instances_allclose(input, other, *, rtol=1e-5, msg="", size_as_tensor=False): - """ - Args: - input, other (Instances): - size_as_tensor: compare image_size of the Instances as tensors (instead of tuples). - Useful for comparing outputs of tracing. - """ - if not isinstance(input, Instances): - input = convert_scripted_instances(input) - if not isinstance(other, Instances): - other = convert_scripted_instances(other) - - if not msg: - msg = "Two Instances are different! " - else: - msg = msg.rstrip() + " " - - size_error_msg = msg + f"image_size is {input.image_size} vs. {other.image_size}!" - if size_as_tensor: - assert torch.equal( - torch.tensor(input.image_size), torch.tensor(other.image_size) - ), size_error_msg - else: - assert input.image_size == other.image_size, size_error_msg - fields = sorted(input.get_fields().keys()) - fields_other = sorted(other.get_fields().keys()) - assert fields == fields_other, msg + f"Fields are {fields} vs {fields_other}!" - - for f in fields: - val1, val2 = input.get(f), other.get(f) - if isinstance(val1, (Boxes, ROIMasks)): - # boxes in the range of O(100) and can have a larger tolerance - assert torch.allclose(val1.tensor, val2.tensor, atol=100 * rtol), ( - msg + f"Field {f} differs too much!" - ) - elif isinstance(val1, torch.Tensor): - if val1.dtype.is_floating_point: - mag = torch.abs(val1).max().cpu().item() - assert torch.allclose(val1, val2, atol=mag * rtol), ( - msg + f"Field {f} differs too much!" - ) - else: - assert torch.equal(val1, val2), msg + f"Field {f} is different!" - else: - raise ValueError(f"Don't know how to compare type {type(val1)}") - - -def reload_script_model(module): - """ - Save a jit module and load it back. - Similar to the `getExportImportCopy` function in torch/testing/ - """ - buffer = io.BytesIO() - torch.jit.save(module, buffer) - buffer.seek(0) - return torch.jit.load(buffer) diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py deleted file mode 100755 index 9d8a366d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/video_visualizer.py +++ /dev/null @@ -1,252 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import numpy as np -import pycocotools.mask as mask_util - -from detectron2.utils.visualizer import ( - ColorMode, - Visualizer, - _create_text_labels, - _PanopticPrediction, -) - -from .colormap import random_color - - -class _DetectedInstance: - """ - Used to store data about detected objects in video frame, - in order to transfer color to objects in the future frames. - - Attributes: - label (int): - bbox (tuple[float]): - mask_rle (dict): - color (tuple[float]): RGB colors in range (0, 1) - ttl (int): time-to-live for the instance. For example, if ttl=2, - the instance color can be transferred to objects in the next two frames. - """ - - __slots__ = ["label", "bbox", "mask_rle", "color", "ttl"] - - def __init__(self, label, bbox, mask_rle, color, ttl): - self.label = label - self.bbox = bbox - self.mask_rle = mask_rle - self.color = color - self.ttl = ttl - - -class VideoVisualizer: - def __init__(self, metadata, instance_mode=ColorMode.IMAGE): - """ - Args: - metadata (MetadataCatalog): image metadata. 
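As the tolerance logic above notes, box fields are compared with `atol = 100 * rtol` because they live in O(100) coordinates. A minimal sketch pairing `random_boxes` with `assert_instances_allclose`:

```
import torch
from detectron2.structures import Boxes, Instances
from detectron2.utils.testing import assert_instances_allclose, random_boxes

a = Instances((480, 640))
a.pred_boxes = Boxes(random_boxes(8, max_coord=100))
a.scores = torch.rand(8)

b = Instances((480, 640))
b.pred_boxes = Boxes(a.pred_boxes.tensor + 1e-5)  # float noise only
b.scores = a.scores.clone()

assert_instances_allclose(a, b, rtol=1e-4)  # passes: 1e-5 << 100 * rtol
```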
- """ - self.metadata = metadata - self._old_instances = [] - assert instance_mode in [ - ColorMode.IMAGE, - ColorMode.IMAGE_BW, - ], "Other mode not supported yet." - self._instance_mode = instance_mode - - def draw_instance_predictions(self, frame, predictions): - """ - Draw instance-level prediction results on an image. - - Args: - frame (ndarray): an RGB image of shape (H, W, C), in the range [0, 255]. - predictions (Instances): the output of an instance detection/segmentation - model. Following fields will be used to draw: - "pred_boxes", "pred_classes", "scores", "pred_masks" (or "pred_masks_rle"). - - Returns: - output (VisImage): image object with visualizations. - """ - frame_visualizer = Visualizer(frame, self.metadata) - num_instances = len(predictions) - if num_instances == 0: - return frame_visualizer.output - - boxes = predictions.pred_boxes.tensor.numpy() if predictions.has("pred_boxes") else None - scores = predictions.scores if predictions.has("scores") else None - classes = predictions.pred_classes.numpy() if predictions.has("pred_classes") else None - keypoints = predictions.pred_keypoints if predictions.has("pred_keypoints") else None - colors = predictions.COLOR if predictions.has("COLOR") else [None] * len(predictions) - durations = predictions.ID_duration if predictions.has("ID_duration") else None - duration_threshold = self.metadata.get("duration_threshold", 0) - visibilities = None if durations is None else [x > duration_threshold for x in durations] - - if predictions.has("pred_masks"): - masks = predictions.pred_masks - # mask IOU is not yet enabled - # masks_rles = mask_util.encode(np.asarray(masks.permute(1, 2, 0), order="F")) - # assert len(masks_rles) == num_instances - else: - masks = None - - detected = [ - _DetectedInstance(classes[i], boxes[i], mask_rle=None, color=colors[i], ttl=8) - for i in range(num_instances) - ] - if not predictions.has("COLOR"): - colors = self._assign_colors(detected) - - labels = _create_text_labels(classes, scores, self.metadata.get("thing_classes", None)) - - if self._instance_mode == ColorMode.IMAGE_BW: - # any() returns uint8 tensor - frame_visualizer.output.reset_image( - frame_visualizer._create_grayscale_image( - (masks.any(dim=0) > 0).numpy() if masks is not None else None - ) - ) - alpha = 0.3 - else: - alpha = 0.5 - - labels = ( - None - if labels is None - else [y[0] for y in filter(lambda x: x[1], zip(labels, visibilities))] - ) # noqa - assigned_colors = ( - None - if colors is None - else [y[0] for y in filter(lambda x: x[1], zip(colors, visibilities))] - ) # noqa - frame_visualizer.overlay_instances( - boxes=None if masks is not None else boxes[visibilities], # boxes are a bit distracting - masks=None if masks is None else masks[visibilities], - labels=labels, - keypoints=None if keypoints is None else keypoints[visibilities], - assigned_colors=assigned_colors, - alpha=alpha, - ) - - return frame_visualizer.output - - def draw_sem_seg(self, frame, sem_seg, area_threshold=None): - """ - Args: - sem_seg (ndarray or Tensor): semantic segmentation of shape (H, W), - each value is the integer label. 
- area_threshold (Optional[int]): only draw segmentations larger than the threshold - """ - # don't need to do anything special - frame_visualizer = Visualizer(frame, self.metadata) - frame_visualizer.draw_sem_seg(sem_seg, area_threshold=None) - return frame_visualizer.output - - def draw_panoptic_seg_predictions( - self, frame, panoptic_seg, segments_info, area_threshold=None, alpha=0.5 - ): - frame_visualizer = Visualizer(frame, self.metadata) - pred = _PanopticPrediction(panoptic_seg, segments_info, self.metadata) - - if self._instance_mode == ColorMode.IMAGE_BW: - frame_visualizer.output.reset_image( - frame_visualizer._create_grayscale_image(pred.non_empty_mask()) - ) - - # draw mask for all semantic segments first i.e. "stuff" - for mask, sinfo in pred.semantic_masks(): - category_idx = sinfo["category_id"] - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[category_idx]] - except AttributeError: - mask_color = None - - frame_visualizer.draw_binary_mask( - mask, - color=mask_color, - text=self.metadata.stuff_classes[category_idx], - alpha=alpha, - area_threshold=area_threshold, - ) - - all_instances = list(pred.instance_masks()) - if len(all_instances) == 0: - return frame_visualizer.output - # draw mask for all instances second - masks, sinfo = list(zip(*all_instances)) - num_instances = len(masks) - masks_rles = mask_util.encode( - np.asarray(np.asarray(masks).transpose(1, 2, 0), dtype=np.uint8, order="F") - ) - assert len(masks_rles) == num_instances - - category_ids = [x["category_id"] for x in sinfo] - detected = [ - _DetectedInstance(category_ids[i], bbox=None, mask_rle=masks_rles[i], color=None, ttl=8) - for i in range(num_instances) - ] - colors = self._assign_colors(detected) - labels = [self.metadata.thing_classes[k] for k in category_ids] - - frame_visualizer.overlay_instances( - boxes=None, - masks=masks, - labels=labels, - keypoints=None, - assigned_colors=colors, - alpha=alpha, - ) - return frame_visualizer.output - - def _assign_colors(self, instances): - """ - Naive tracking heuristics to assign same color to the same instance, - will update the internal state of tracked instances. - - Returns: - list[tuple[float]]: list of colors. 
- """ - - # Compute iou with either boxes or masks: - is_crowd = np.zeros((len(instances),), dtype=np.bool) - if instances[0].bbox is None: - assert instances[0].mask_rle is not None - # use mask iou only when box iou is None - # because box seems good enough - rles_old = [x.mask_rle for x in self._old_instances] - rles_new = [x.mask_rle for x in instances] - ious = mask_util.iou(rles_old, rles_new, is_crowd) - threshold = 0.5 - else: - boxes_old = [x.bbox for x in self._old_instances] - boxes_new = [x.bbox for x in instances] - ious = mask_util.iou(boxes_old, boxes_new, is_crowd) - threshold = 0.6 - if len(ious) == 0: - ious = np.zeros((len(self._old_instances), len(instances)), dtype="float32") - - # Only allow matching instances of the same label: - for old_idx, old in enumerate(self._old_instances): - for new_idx, new in enumerate(instances): - if old.label != new.label: - ious[old_idx, new_idx] = 0 - - matched_new_per_old = np.asarray(ious).argmax(axis=1) - max_iou_per_old = np.asarray(ious).max(axis=1) - - # Try to find match for each old instance: - extra_instances = [] - for idx, inst in enumerate(self._old_instances): - if max_iou_per_old[idx] > threshold: - newidx = matched_new_per_old[idx] - if instances[newidx].color is None: - instances[newidx].color = inst.color - continue - # If an old instance does not match any new instances, - # keep it for the next frame in case it is just missed by the detector - inst.ttl -= 1 - if inst.ttl > 0: - extra_instances.append(inst) - - # Assign random color to newly-detected instances: - for inst in instances: - if inst.color is None: - inst.color = random_color(rgb=True, maximum=1) - self._old_instances = instances[:] + extra_instances - return [d.color for d in instances] diff --git a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py b/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py deleted file mode 100755 index 8e145181..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/detectron2/utils/visualizer.py +++ /dev/null @@ -1,1267 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import colorsys -import logging -import math -import numpy as np -from enum import Enum, unique -import cv2 -import matplotlib as mpl -import matplotlib.colors as mplc -import matplotlib.figure as mplfigure -import pycocotools.mask as mask_util -import torch -from matplotlib.backends.backend_agg import FigureCanvasAgg -from PIL import Image - -from detectron2.data import MetadataCatalog -from detectron2.structures import BitMasks, Boxes, BoxMode, Keypoints, PolygonMasks, RotatedBoxes -from detectron2.utils.file_io import PathManager - -from .colormap import random_color - -logger = logging.getLogger(__name__) - -__all__ = ["ColorMode", "VisImage", "Visualizer"] - - -_SMALL_OBJECT_AREA_THRESH = 1000 -_LARGE_MASK_AREA_THRESH = 120000 -_OFF_WHITE = (1.0, 1.0, 240.0 / 255) -_BLACK = (0, 0, 0) -_RED = (1.0, 0, 0) - -_KEYPOINT_THRESHOLD = 0.05 - - -@unique -class ColorMode(Enum): - """ - Enum of different color modes to use for instance visualizations. - """ - - IMAGE = 0 - """ - Picks a random color for every instance and overlay segmentations with low opacity. - """ - SEGMENTATION = 1 - """ - Let instances of the same category have similar colors - (from metadata.thing_colors), and overlay them with - high opacity. This provides more attention on the quality of segmentation. - """ - IMAGE_BW = 2 - """ - Same as IMAGE, but convert all areas without masks to gray-scale. 
- Only available for drawing per-instance mask predictions. - """ - - -class GenericMask: - """ - Attribute: - polygons (list[ndarray]): list[ndarray]: polygons for this mask. - Each ndarray has format [x, y, x, y, ...] - mask (ndarray): a binary mask - """ - - def __init__(self, mask_or_polygons, height, width): - self._mask = self._polygons = self._has_holes = None - self.height = height - self.width = width - - m = mask_or_polygons - if isinstance(m, dict): - # RLEs - assert "counts" in m and "size" in m - if isinstance(m["counts"], list): # uncompressed RLEs - h, w = m["size"] - assert h == height and w == width - m = mask_util.frPyObjects(m, h, w) - self._mask = mask_util.decode(m)[:, :] - return - - if isinstance(m, list): # list[ndarray] - self._polygons = [np.asarray(x).reshape(-1) for x in m] - return - - if isinstance(m, np.ndarray): # assumed to be a binary mask - assert m.shape[1] != 2, m.shape - assert m.shape == ( - height, - width, - ), f"mask shape: {m.shape}, target dims: {height}, {width}" - self._mask = m.astype("uint8") - return - - raise ValueError("GenericMask cannot handle object {} of type '{}'".format(m, type(m))) - - @property - def mask(self): - if self._mask is None: - self._mask = self.polygons_to_mask(self._polygons) - return self._mask - - @property - def polygons(self): - if self._polygons is None: - self._polygons, self._has_holes = self.mask_to_polygons(self._mask) - return self._polygons - - @property - def has_holes(self): - if self._has_holes is None: - if self._mask is not None: - self._polygons, self._has_holes = self.mask_to_polygons(self._mask) - else: - self._has_holes = False # if original format is polygon, does not have holes - return self._has_holes - - def mask_to_polygons(self, mask): - # cv2.RETR_CCOMP flag retrieves all the contours and arranges them to a 2-level - # hierarchy. External contours (boundary) of the object are placed in hierarchy-1. - # Internal contours (holes) are placed in hierarchy-2. - # cv2.CHAIN_APPROX_NONE flag gets vertices of polygons from contours. - mask = np.ascontiguousarray(mask) # some versions of cv2 does not support incontiguous arr - res = cv2.findContours(mask.astype("uint8"), cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE) - hierarchy = res[-1] - if hierarchy is None: # empty mask - return [], False - has_holes = (hierarchy.reshape(-1, 4)[:, 3] >= 0).sum() > 0 - res = res[-2] - res = [x.flatten() for x in res] - # These coordinates from OpenCV are integers in range [0, W-1 or H-1]. - # We add 0.5 to turn them into real-value coordinate space. A better solution - # would be to first +0.5 and then dilate the returned polygon by 0.5. - res = [x + 0.5 for x in res if len(x) >= 6] - return res, has_holes - - def polygons_to_mask(self, polygons): - rle = mask_util.frPyObjects(polygons, self.height, self.width) - rle = mask_util.merge(rle) - return mask_util.decode(rle)[:, :] - - def area(self): - return self.mask.sum() - - def bbox(self): - p = mask_util.frPyObjects(self.polygons, self.height, self.width) - p = mask_util.merge(p) - bbox = mask_util.toBbox(p) - bbox[2] += bbox[0] - bbox[3] += bbox[1] - return bbox - - -class _PanopticPrediction: - """ - Unify different panoptic annotation/prediction formats - """ - - def __init__(self, panoptic_seg, segments_info, metadata=None): - if segments_info is None: - assert metadata is not None - # If "segments_info" is None, we assume "panoptic_img" is a - # H*W int32 image storing the panoptic_id in the format of - # category_id * label_divisor + instance_id. 
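`GenericMask` converts lazily between the three mask representations. A small sketch starting from a binary mask (requires OpenCV and pycocotools, both imported at the top of this file):

```
import numpy as np
from detectron2.utils.visualizer import GenericMask

mask = np.zeros((64, 64), dtype=np.uint8)
mask[16:48, 16:48] = 1

gm = GenericMask(mask, 64, 64)
print(gm.area())         # 1024 pixels
print(gm.bbox())         # roughly [16, 16, 48, 48] in XYXY
print(len(gm.polygons))  # 1 external contour
print(gm.has_holes)      # False
```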
We reserve -1 for - # VOID label. - label_divisor = metadata.label_divisor - segments_info = [] - for panoptic_label in np.unique(panoptic_seg.numpy()): - if panoptic_label == -1: - # VOID region. - continue - pred_class = panoptic_label // label_divisor - isthing = pred_class in metadata.thing_dataset_id_to_contiguous_id.values() - segments_info.append( - { - "id": int(panoptic_label), - "category_id": int(pred_class), - "isthing": bool(isthing), - } - ) - del metadata - - self._seg = panoptic_seg - - self._sinfo = {s["id"]: s for s in segments_info} # seg id -> seg info - segment_ids, areas = torch.unique(panoptic_seg, sorted=True, return_counts=True) - areas = areas.numpy() - sorted_idxs = np.argsort(-areas) - self._seg_ids, self._seg_areas = segment_ids[sorted_idxs], areas[sorted_idxs] - self._seg_ids = self._seg_ids.tolist() - for sid, area in zip(self._seg_ids, self._seg_areas): - if sid in self._sinfo: - self._sinfo[sid]["area"] = float(area) - - def non_empty_mask(self): - """ - Returns: - (H, W) array, a mask for all pixels that have a prediction - """ - empty_ids = [] - for id in self._seg_ids: - if id not in self._sinfo: - empty_ids.append(id) - if len(empty_ids) == 0: - return np.zeros(self._seg.shape, dtype=np.uint8) - assert ( - len(empty_ids) == 1 - ), ">1 ids corresponds to no labels. This is currently not supported" - return (self._seg != empty_ids[0]).numpy().astype(np.bool) - - def semantic_masks(self): - for sid in self._seg_ids: - sinfo = self._sinfo.get(sid) - if sinfo is None or sinfo["isthing"]: - # Some pixels (e.g. id 0 in PanopticFPN) have no instance or semantic predictions. - continue - yield (self._seg == sid).numpy().astype(np.bool), sinfo - - def instance_masks(self): - for sid in self._seg_ids: - sinfo = self._sinfo.get(sid) - if sinfo is None or not sinfo["isthing"]: - continue - mask = (self._seg == sid).numpy().astype(np.bool) - if mask.sum() > 0: - yield mask, sinfo - - -def _create_text_labels(classes, scores, class_names, is_crowd=None): - """ - Args: - classes (list[int] or None): - scores (list[float] or None): - class_names (list[str] or None): - is_crowd (list[bool] or None): - - Returns: - list[str] or None - """ - labels = None - if classes is not None: - if class_names is not None and len(class_names) > 0: - labels = [class_names[i] for i in classes] - else: - labels = [str(i) for i in classes] - if scores is not None: - if labels is None: - labels = ["{:.0f}%".format(s * 100) for s in scores] - else: - labels = ["{} {:.0f}%".format(l, s * 100) for l, s in zip(labels, scores)] - if labels is not None and is_crowd is not None: - labels = [l + ("|crowd" if crowd else "") for l, crowd in zip(labels, is_crowd)] - return labels - - -class VisImage: - def __init__(self, img, scale=1.0): - """ - Args: - img (ndarray): an RGB image of shape (H, W, 3) in range [0, 255]. - scale (float): scale the input image - """ - self.img = img - self.scale = scale - self.width, self.height = img.shape[1], img.shape[0] - self._setup_figure(img) - - def _setup_figure(self, img): - """ - Args: - Same as in :meth:`__init__()`. - - Returns: - fig (matplotlib.pyplot.figure): top level container for all the image plot elements. - ax (matplotlib.pyplot.Axes): contains figure elements and sets the coordinate system. 
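`_create_text_labels` is private, but seeing its output format makes the later drawing code easier to follow:

```
from detectron2.utils.visualizer import _create_text_labels

labels = _create_text_labels(
    classes=[0, 2],
    scores=[0.97, 0.58],
    class_names=["person", "bicycle", "car"],
    is_crowd=[False, True],
)
print(labels)  # ['person 97%', 'car 58%|crowd']
```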
- """ - fig = mplfigure.Figure(frameon=False) - self.dpi = fig.get_dpi() - # add a small 1e-2 to avoid precision lost due to matplotlib's truncation - # (https://github.com/matplotlib/matplotlib/issues/15363) - fig.set_size_inches( - (self.width * self.scale + 1e-2) / self.dpi, - (self.height * self.scale + 1e-2) / self.dpi, - ) - self.canvas = FigureCanvasAgg(fig) - # self.canvas = mpl.backends.backend_cairo.FigureCanvasCairo(fig) - ax = fig.add_axes([0.0, 0.0, 1.0, 1.0]) - ax.axis("off") - self.fig = fig - self.ax = ax - self.reset_image(img) - - def reset_image(self, img): - """ - Args: - img: same as in __init__ - """ - img = img.astype("uint8") - self.ax.imshow(img, extent=(0, self.width, self.height, 0), interpolation="nearest") - - def save(self, filepath): - """ - Args: - filepath (str): a string that contains the absolute path, including the file name, where - the visualized image will be saved. - """ - self.fig.savefig(filepath) - - def get_image(self): - """ - Returns: - ndarray: - the visualized image of shape (H, W, 3) (RGB) in uint8 type. - The shape is scaled w.r.t the input image using the given `scale` argument. - """ - canvas = self.canvas - s, (width, height) = canvas.print_to_buffer() - # buf = io.BytesIO() # works for cairo backend - # canvas.print_rgba(buf) - # width, height = self.width, self.height - # s = buf.getvalue() - - buffer = np.frombuffer(s, dtype="uint8") - - img_rgba = buffer.reshape(height, width, 4) - rgb, alpha = np.split(img_rgba, [3], axis=2) - return rgb.astype("uint8") - - -class Visualizer: - """ - Visualizer that draws data about detection/segmentation on images. - - It contains methods like `draw_{text,box,circle,line,binary_mask,polygon}` - that draw primitive objects to images, as well as high-level wrappers like - `draw_{instance_predictions,sem_seg,panoptic_seg_predictions,dataset_dict}` - that draw composite data in some pre-defined style. - - Note that the exact visualization style for the high-level wrappers are subject to change. - Style such as color, opacity, label contents, visibility of labels, or even the visibility - of objects themselves (e.g. when the object is too small) may change according - to different heuristics, as long as the results still look visually reasonable. - - To obtain a consistent style, you can implement custom drawing functions with the - abovementioned primitive methods instead. If you need more customized visualization - styles, you can process the data yourself following their format documented in - tutorials (:doc:`/tutorials/models`, :doc:`/tutorials/datasets`). This class does not - intend to satisfy everyone's preference on drawing styles. - - This visualizer focuses on high rendering quality rather than performance. It is not - designed to be used for real-time applications. - """ - - # TODO implement a fast, rasterized version using OpenCV - - def __init__(self, img_rgb, metadata=None, scale=1.0, instance_mode=ColorMode.IMAGE): - """ - Args: - img_rgb: a numpy array of shape (H, W, C), where H and W correspond to - the height and width of the image respectively. C is the number of - color channels. The image is required to be in RGB format since that - is a requirement of the Matplotlib library. The image is also expected - to be in the range [0, 255]. - metadata (Metadata): dataset metadata (e.g. class names and colors) - instance_mode (ColorMode): defines one of the pre-defined style for drawing - instances on an image. 
- """ - self.img = np.asarray(img_rgb).clip(0, 255).astype(np.uint8) - if metadata is None: - metadata = MetadataCatalog.get("__nonexist__") - self.metadata = metadata - self.output = VisImage(self.img, scale=scale) - self.cpu_device = torch.device("cpu") - - # too small texts are useless, therefore clamp to 9 - self._default_font_size = max( - np.sqrt(self.output.height * self.output.width) // 90, 10 // scale - ) - self._instance_mode = instance_mode - self.keypoint_threshold = _KEYPOINT_THRESHOLD - - def draw_instance_predictions(self, predictions): - """ - Draw instance-level prediction results on an image. - - Args: - predictions (Instances): the output of an instance detection/segmentation - model. Following fields will be used to draw: - "pred_boxes", "pred_classes", "scores", "pred_masks" (or "pred_masks_rle"). - - Returns: - output (VisImage): image object with visualizations. - """ - boxes = predictions.pred_boxes if predictions.has("pred_boxes") else None - scores = predictions.scores if predictions.has("scores") else None - classes = predictions.pred_classes.tolist() if predictions.has("pred_classes") else None - labels = _create_text_labels(classes, scores, self.metadata.get("thing_classes", None)) - keypoints = predictions.pred_keypoints if predictions.has("pred_keypoints") else None - - if predictions.has("pred_masks"): - masks = np.asarray(predictions.pred_masks) - masks = [GenericMask(x, self.output.height, self.output.width) for x in masks] - else: - masks = None - - if self._instance_mode == ColorMode.SEGMENTATION and self.metadata.get("thing_colors"): - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) for c in classes - ] - alpha = 0.8 - else: - colors = None - alpha = 0.5 - - if self._instance_mode == ColorMode.IMAGE_BW: - self.output.reset_image( - self._create_grayscale_image( - (predictions.pred_masks.any(dim=0) > 0).numpy() - if predictions.has("pred_masks") - else None - ) - ) - alpha = 0.3 - - self.overlay_instances( - masks=masks, - boxes=boxes, - labels=labels, - keypoints=keypoints, - assigned_colors=colors, - alpha=alpha, - ) - return self.output - - def draw_sem_seg(self, sem_seg, area_threshold=None, alpha=0.8): - """ - Draw semantic segmentation predictions/labels. - - Args: - sem_seg (Tensor or ndarray): the segmentation of shape (H, W). - Each value is the integer label of the pixel. - area_threshold (int): segments with less than `area_threshold` are not drawn. - alpha (float): the larger it is, the more opaque the segmentations are. - - Returns: - output (VisImage): image object with visualizations. - """ - if isinstance(sem_seg, torch.Tensor): - sem_seg = sem_seg.numpy() - labels, areas = np.unique(sem_seg, return_counts=True) - sorted_idxs = np.argsort(-areas).tolist() - labels = labels[sorted_idxs] - for label in filter(lambda l: l < len(self.metadata.stuff_classes), labels): - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[label]] - except (AttributeError, IndexError): - mask_color = None - - binary_mask = (sem_seg == label).astype(np.uint8) - text = self.metadata.stuff_classes[label] - self.draw_binary_mask( - binary_mask, - color=mask_color, - edge_color=_OFF_WHITE, - text=text, - alpha=alpha, - area_threshold=area_threshold, - ) - return self.output - - def draw_panoptic_seg(self, panoptic_seg, segments_info, area_threshold=None, alpha=0.7): - """ - Draw panoptic prediction annotations or results. - - Args: - panoptic_seg (Tensor): of shape (height, width) where the values are ids for each - segment. 
- segments_info (list[dict] or None): Describe each segment in `panoptic_seg`. - If it is a ``list[dict]``, each dict contains keys "id", "category_id". - If None, category id of each pixel is computed by - ``pixel // metadata.label_divisor``. - area_threshold (int): stuff segments with less than `area_threshold` are not drawn. - - Returns: - output (VisImage): image object with visualizations. - """ - pred = _PanopticPrediction(panoptic_seg, segments_info, self.metadata) - - if self._instance_mode == ColorMode.IMAGE_BW: - self.output.reset_image(self._create_grayscale_image(pred.non_empty_mask())) - - # draw mask for all semantic segments first i.e. "stuff" - for mask, sinfo in pred.semantic_masks(): - category_idx = sinfo["category_id"] - try: - mask_color = [x / 255 for x in self.metadata.stuff_colors[category_idx]] - except AttributeError: - mask_color = None - - text = self.metadata.stuff_classes[category_idx] - self.draw_binary_mask( - mask, - color=mask_color, - edge_color=_OFF_WHITE, - text=text, - alpha=alpha, - area_threshold=area_threshold, - ) - - # draw mask for all instances second - all_instances = list(pred.instance_masks()) - if len(all_instances) == 0: - return self.output - masks, sinfo = list(zip(*all_instances)) - category_ids = [x["category_id"] for x in sinfo] - - try: - scores = [x["score"] for x in sinfo] - except KeyError: - scores = None - labels = _create_text_labels( - category_ids, scores, self.metadata.thing_classes, [x.get("iscrowd", 0) for x in sinfo] - ) - - try: - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) for c in category_ids - ] - except AttributeError: - colors = None - self.overlay_instances(masks=masks, labels=labels, assigned_colors=colors, alpha=alpha) - - return self.output - - draw_panoptic_seg_predictions = draw_panoptic_seg # backward compatibility - - def draw_dataset_dict(self, dic): - """ - Draw annotations/segmentaions in Detectron2 Dataset format. - - Args: - dic (dict): annotation/segmentation data of one image, in Detectron2 Dataset format. - - Returns: - output (VisImage): image object with visualizations. 
- """ - annos = dic.get("annotations", None) - if annos: - if "segmentation" in annos[0]: - masks = [x["segmentation"] for x in annos] - else: - masks = None - if "keypoints" in annos[0]: - keypts = [x["keypoints"] for x in annos] - keypts = np.array(keypts).reshape(len(annos), -1, 3) - else: - keypts = None - - boxes = [ - BoxMode.convert(x["bbox"], x["bbox_mode"], BoxMode.XYXY_ABS) - if len(x["bbox"]) == 4 - else x["bbox"] - for x in annos - ] - - colors = None - category_ids = [x["category_id"] for x in annos] - if self._instance_mode == ColorMode.SEGMENTATION and self.metadata.get("thing_colors"): - colors = [ - self._jitter([x / 255 for x in self.metadata.thing_colors[c]]) - for c in category_ids - ] - names = self.metadata.get("thing_classes", None) - labels = _create_text_labels( - category_ids, - scores=None, - class_names=names, - is_crowd=[x.get("iscrowd", 0) for x in annos], - ) - self.overlay_instances( - labels=labels, boxes=boxes, masks=masks, keypoints=keypts, assigned_colors=colors - ) - - sem_seg = dic.get("sem_seg", None) - if sem_seg is None and "sem_seg_file_name" in dic: - with PathManager.open(dic["sem_seg_file_name"], "rb") as f: - sem_seg = Image.open(f) - sem_seg = np.asarray(sem_seg, dtype="uint8") - if sem_seg is not None: - self.draw_sem_seg(sem_seg, area_threshold=0, alpha=0.5) - - pan_seg = dic.get("pan_seg", None) - if pan_seg is None and "pan_seg_file_name" in dic: - with PathManager.open(dic["pan_seg_file_name"], "rb") as f: - pan_seg = Image.open(f) - pan_seg = np.asarray(pan_seg) - from panopticapi.utils import rgb2id - - pan_seg = rgb2id(pan_seg) - if pan_seg is not None: - segments_info = dic["segments_info"] - pan_seg = torch.tensor(pan_seg) - self.draw_panoptic_seg(pan_seg, segments_info, area_threshold=0, alpha=0.5) - return self.output - - def overlay_instances( - self, - *, - boxes=None, - labels=None, - masks=None, - keypoints=None, - assigned_colors=None, - alpha=0.5, - ): - """ - Args: - boxes (Boxes, RotatedBoxes or ndarray): either a :class:`Boxes`, - or an Nx4 numpy array of XYXY_ABS format for the N objects in a single image, - or a :class:`RotatedBoxes`, - or an Nx5 numpy array of (x_center, y_center, width, height, angle_degrees) format - for the N objects in a single image, - labels (list[str]): the text to be displayed for each instance. - masks (masks-like object): Supported types are: - - * :class:`detectron2.structures.PolygonMasks`, - :class:`detectron2.structures.BitMasks`. - * list[list[ndarray]]: contains the segmentation masks for all objects in one image. - The first level of the list corresponds to individual instances. The second - level to all the polygon that compose the instance, and the third level - to the polygon coordinates. The third level should have the format of - [x0, y0, x1, y1, ..., xn, yn] (n >= 3). - * list[ndarray]: each ndarray is a binary mask of shape (H, W). - * list[dict]: each dict is a COCO-style RLE. - keypoints (Keypoint or array like): an array-like object of shape (N, K, 3), - where the N is the number of instances and K is the number of keypoints. - The last dimension corresponds to (x, y, visibility or score). - assigned_colors (list[matplotlib.colors]): a list of colors, where each color - corresponds to each mask or box in the image. Refer to 'matplotlib.colors' - for full list of formats that the colors are accepted in. - Returns: - output (VisImage): image object with visualizations. 
- """ - num_instances = 0 - if boxes is not None: - boxes = self._convert_boxes(boxes) - num_instances = len(boxes) - if masks is not None: - masks = self._convert_masks(masks) - if num_instances: - assert len(masks) == num_instances - else: - num_instances = len(masks) - if keypoints is not None: - if num_instances: - assert len(keypoints) == num_instances - else: - num_instances = len(keypoints) - keypoints = self._convert_keypoints(keypoints) - if labels is not None: - assert len(labels) == num_instances - if assigned_colors is None: - assigned_colors = [random_color(rgb=True, maximum=1) for _ in range(num_instances)] - if num_instances == 0: - return self.output - if boxes is not None and boxes.shape[1] == 5: - return self.overlay_rotated_instances( - boxes=boxes, labels=labels, assigned_colors=assigned_colors - ) - - # Display in largest to smallest order to reduce occlusion. - areas = None - if boxes is not None: - areas = np.prod(boxes[:, 2:] - boxes[:, :2], axis=1) - elif masks is not None: - areas = np.asarray([x.area() for x in masks]) - - if areas is not None: - sorted_idxs = np.argsort(-areas).tolist() - # Re-order overlapped instances in descending order. - boxes = boxes[sorted_idxs] if boxes is not None else None - labels = [labels[k] for k in sorted_idxs] if labels is not None else None - masks = [masks[idx] for idx in sorted_idxs] if masks is not None else None - assigned_colors = [assigned_colors[idx] for idx in sorted_idxs] - keypoints = keypoints[sorted_idxs] if keypoints is not None else None - - for i in range(num_instances): - color = assigned_colors[i] - if boxes is not None: - self.draw_box(boxes[i], edge_color=color) - - if masks is not None: - for segment in masks[i].polygons: - self.draw_polygon(segment.reshape(-1, 2), color, alpha=alpha) - - if labels is not None: - # first get a box - if boxes is not None: - x0, y0, x1, y1 = boxes[i] - text_pos = (x0, y0) # if drawing boxes, put text on the box corner. - horiz_align = "left" - elif masks is not None: - # skip small mask without polygon - if len(masks[i].polygons) == 0: - continue - - x0, y0, x1, y1 = masks[i].bbox() - - # draw text in the center (defined by median) when box is not drawn - # median is less sensitive to outliers. - text_pos = np.median(masks[i].mask.nonzero(), axis=1)[::-1] - horiz_align = "center" - else: - continue # drawing the box confidence for keypoints isn't very useful. - # for small objects, draw text at the side to avoid occlusion - instance_area = (y1 - y0) * (x1 - x0) - if ( - instance_area < _SMALL_OBJECT_AREA_THRESH * self.output.scale - or y1 - y0 < 40 * self.output.scale - ): - if y1 >= self.output.height - 5: - text_pos = (x1, y0) - else: - text_pos = (x0, y1) - - height_ratio = (y1 - y0) / np.sqrt(self.output.height * self.output.width) - lighter_color = self._change_color_brightness(color, brightness_factor=0.7) - font_size = ( - np.clip((height_ratio - 0.02) / 0.08 + 1, 1.2, 2) - * 0.5 - * self._default_font_size - ) - self.draw_text( - labels[i], - text_pos, - color=lighter_color, - horizontal_alignment=horiz_align, - font_size=font_size, - ) - - # draw keypoints - if keypoints is not None: - for keypoints_per_instance in keypoints: - self.draw_and_connect_keypoints(keypoints_per_instance) - - return self.output - - def overlay_rotated_instances(self, boxes=None, labels=None, assigned_colors=None): - """ - Args: - boxes (ndarray): an Nx5 numpy array of - (x_center, y_center, width, height, angle_degrees) format - for the N objects in a single image. 
- labels (list[str]): the text to be displayed for each instance. - assigned_colors (list[matplotlib.colors]): a list of colors, where each color - corresponds to each mask or box in the image. Refer to 'matplotlib.colors' - for full list of formats that the colors are accepted in. - - Returns: - output (VisImage): image object with visualizations. - """ - num_instances = len(boxes) - - if assigned_colors is None: - assigned_colors = [random_color(rgb=True, maximum=1) for _ in range(num_instances)] - if num_instances == 0: - return self.output - - # Display in largest to smallest order to reduce occlusion. - if boxes is not None: - areas = boxes[:, 2] * boxes[:, 3] - - sorted_idxs = np.argsort(-areas).tolist() - # Re-order overlapped instances in descending order. - boxes = boxes[sorted_idxs] - labels = [labels[k] for k in sorted_idxs] if labels is not None else None - colors = [assigned_colors[idx] for idx in sorted_idxs] - - for i in range(num_instances): - self.draw_rotated_box_with_label( - boxes[i], edge_color=colors[i], label=labels[i] if labels is not None else None - ) - - return self.output - - def draw_and_connect_keypoints(self, keypoints): - """ - Draws keypoints of an instance and follows the rules for keypoint connections - to draw lines between appropriate keypoints. This follows color heuristics for - line color. - - Args: - keypoints (Tensor): a tensor of shape (K, 3), where K is the number of keypoints - and the last dimension corresponds to (x, y, probability). - - Returns: - output (VisImage): image object with visualizations. - """ - visible = {} - keypoint_names = self.metadata.get("keypoint_names") - for idx, keypoint in enumerate(keypoints): - - # draw keypoint - x, y, prob = keypoint - if prob > self.keypoint_threshold: - self.draw_circle((x, y), color=_RED) - if keypoint_names: - keypoint_name = keypoint_names[idx] - visible[keypoint_name] = (x, y) - - if self.metadata.get("keypoint_connection_rules"): - for kp0, kp1, color in self.metadata.keypoint_connection_rules: - if kp0 in visible and kp1 in visible: - x0, y0 = visible[kp0] - x1, y1 = visible[kp1] - color = tuple(x / 255.0 for x in color) - self.draw_line([x0, x1], [y0, y1], color=color) - - # draw lines from nose to mid-shoulder and mid-shoulder to mid-hip - # Note that this strategy is specific to person keypoints. - # For other keypoints, it should just do nothing - try: - ls_x, ls_y = visible["left_shoulder"] - rs_x, rs_y = visible["right_shoulder"] - mid_shoulder_x, mid_shoulder_y = (ls_x + rs_x) / 2, (ls_y + rs_y) / 2 - except KeyError: - pass - else: - # draw line from nose to mid-shoulder - nose_x, nose_y = visible.get("nose", (None, None)) - if nose_x is not None: - self.draw_line([nose_x, mid_shoulder_x], [nose_y, mid_shoulder_y], color=_RED) - - try: - # draw line from mid-shoulder to mid-hip - lh_x, lh_y = visible["left_hip"] - rh_x, rh_y = visible["right_hip"] - except KeyError: - pass - else: - mid_hip_x, mid_hip_y = (lh_x + rh_x) / 2, (lh_y + rh_y) / 2 - self.draw_line([mid_hip_x, mid_shoulder_x], [mid_hip_y, mid_shoulder_y], color=_RED) - return self.output - - """ - Primitive drawing functions: - """ - - def draw_text( - self, - text, - position, - *, - font_size=None, - color="g", - horizontal_alignment="center", - rotation=0, - ): - """ - Args: - text (str): class label - position (tuple): a tuple of the x and y coordinates to place text on image. - font_size (int, optional): font of the text. If not provided, a font size - proportional to the image width is calculated and used. 
- color: color of the text. Refer to `matplotlib.colors` for full list - of formats that are accepted. - horizontal_alignment (str): see `matplotlib.text.Text` - rotation: rotation angle in degrees CCW - - Returns: - output (VisImage): image object with text drawn. - """ - if not font_size: - font_size = self._default_font_size - - # since the text background is dark, we don't want the text to be dark - color = np.maximum(list(mplc.to_rgb(color)), 0.2) - color[np.argmax(color)] = max(0.8, np.max(color)) - - x, y = position - self.output.ax.text( - x, - y, - text, - size=font_size * self.output.scale, - family="sans-serif", - bbox={"facecolor": "black", "alpha": 0.8, "pad": 0.7, "edgecolor": "none"}, - verticalalignment="top", - horizontalalignment=horizontal_alignment, - color=color, - zorder=10, - rotation=rotation, - ) - return self.output - - def draw_box(self, box_coord, alpha=0.5, edge_color="g", line_style="-"): - """ - Args: - box_coord (tuple): a tuple containing x0, y0, x1, y1 coordinates, where x0 and y0 - are the coordinates of the image's top left corner. x1 and y1 are the - coordinates of the image's bottom right corner. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - edge_color: color of the outline of the box. Refer to `matplotlib.colors` - for full list of formats that are accepted. - line_style (string): the string to use to create the outline of the boxes. - - Returns: - output (VisImage): image object with box drawn. - """ - x0, y0, x1, y1 = box_coord - width = x1 - x0 - height = y1 - y0 - - linewidth = max(self._default_font_size / 4, 1) - - self.output.ax.add_patch( - mpl.patches.Rectangle( - (x0, y0), - width, - height, - fill=False, - edgecolor=edge_color, - linewidth=linewidth * self.output.scale, - alpha=alpha, - linestyle=line_style, - ) - ) - return self.output - - def draw_rotated_box_with_label( - self, rotated_box, alpha=0.5, edge_color="g", line_style="-", label=None - ): - """ - Draw a rotated box with label on its top-left corner. - - Args: - rotated_box (tuple): a tuple containing (cnt_x, cnt_y, w, h, angle), - where cnt_x and cnt_y are the center coordinates of the box. - w and h are the width and height of the box. angle represents how - many degrees the box is rotated CCW with regard to the 0-degree box. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - edge_color: color of the outline of the box. Refer to `matplotlib.colors` - for full list of formats that are accepted. - line_style (string): the string to use to create the outline of the boxes. - label (string): label for rotated box. It will not be rendered when set to None. - - Returns: - output (VisImage): image object with box drawn. 
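
The color clamp inside `draw_text` above, as a standalone sketch: every channel is kept at 0.2 or above and the dominant channel is pushed to at least 0.8, so labels stay readable on the dark text background.

```
import matplotlib.colors as mplc
import numpy as np

def readable_text_color(color):
    # Same math as draw_text: lift all channels to >= 0.2,
    # then force the dominant channel to >= 0.8.
    rgb = np.maximum(list(mplc.to_rgb(color)), 0.2)
    rgb[np.argmax(rgb)] = max(0.8, np.max(rgb))
    return rgb

print(readable_text_color("navy"))  # dark blue becomes [0.2, 0.2, 0.8]
```
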
- """ - cnt_x, cnt_y, w, h, angle = rotated_box - area = w * h - # use thinner lines when the box is small - linewidth = self._default_font_size / ( - 6 if area < _SMALL_OBJECT_AREA_THRESH * self.output.scale else 3 - ) - - theta = angle * math.pi / 180.0 - c = math.cos(theta) - s = math.sin(theta) - rect = [(-w / 2, h / 2), (-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2)] - # x: left->right ; y: top->down - rotated_rect = [(s * yy + c * xx + cnt_x, c * yy - s * xx + cnt_y) for (xx, yy) in rect] - for k in range(4): - j = (k + 1) % 4 - self.draw_line( - [rotated_rect[k][0], rotated_rect[j][0]], - [rotated_rect[k][1], rotated_rect[j][1]], - color=edge_color, - linestyle="--" if k == 1 else line_style, - linewidth=linewidth, - ) - - if label is not None: - text_pos = rotated_rect[1] # topleft corner - - height_ratio = h / np.sqrt(self.output.height * self.output.width) - label_color = self._change_color_brightness(edge_color, brightness_factor=0.7) - font_size = ( - np.clip((height_ratio - 0.02) / 0.08 + 1, 1.2, 2) * 0.5 * self._default_font_size - ) - self.draw_text(label, text_pos, color=label_color, font_size=font_size, rotation=angle) - - return self.output - - def draw_circle(self, circle_coord, color, radius=3): - """ - Args: - circle_coord (list(int) or tuple(int)): contains the x and y coordinates - of the center of the circle. - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - radius (int): radius of the circle. - - Returns: - output (VisImage): image object with box drawn. - """ - x, y = circle_coord - self.output.ax.add_patch( - mpl.patches.Circle(circle_coord, radius=radius, fill=True, color=color) - ) - return self.output - - def draw_line(self, x_data, y_data, color, linestyle="-", linewidth=None): - """ - Args: - x_data (list[int]): a list containing x values of all the points being drawn. - Length of list should match the length of y_data. - y_data (list[int]): a list containing y values of all the points being drawn. - Length of list should match the length of x_data. - color: color of the line. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - linestyle: style of the line. Refer to `matplotlib.lines.Line2D` - for a full list of formats that are accepted. - linewidth (float or None): width of the line. When it's None, - a default value will be computed and used. - - Returns: - output (VisImage): image object with line drawn. - """ - if linewidth is None: - linewidth = self._default_font_size / 3 - linewidth = max(linewidth, 1) - self.output.ax.add_line( - mpl.lines.Line2D( - x_data, - y_data, - linewidth=linewidth * self.output.scale, - color=color, - linestyle=linestyle, - ) - ) - return self.output - - def draw_binary_mask( - self, binary_mask, color=None, *, edge_color=None, text=None, alpha=0.5, area_threshold=10 - ): - """ - Args: - binary_mask (ndarray): numpy array of shape (H, W), where H is the image height and - W is the image width. Each value in the array is either a 0 or 1 value of uint8 - type. - color: color of the mask. Refer to `matplotlib.colors` for a full list of - formats that are accepted. If None, will pick a random color. - edge_color: color of the polygon edges. Refer to `matplotlib.colors` for a - full list of formats that are accepted. - text (str): if None, will be drawn on the object - alpha (float): blending efficient. Smaller values lead to more transparent masks. - area_threshold (float): a connected component smaller than this area will not be shown. 
- - Returns: - output (VisImage): image object with mask drawn. - """ - if color is None: - color = random_color(rgb=True, maximum=1) - color = mplc.to_rgb(color) - - has_valid_segment = False - binary_mask = binary_mask.astype("uint8") # opencv needs uint8 - mask = GenericMask(binary_mask, self.output.height, self.output.width) - shape2d = (binary_mask.shape[0], binary_mask.shape[1]) - - if not mask.has_holes: - # draw polygons for regular masks - for segment in mask.polygons: - area = mask_util.area(mask_util.frPyObjects([segment], shape2d[0], shape2d[1])) - if area < (area_threshold or 0): - continue - has_valid_segment = True - segment = segment.reshape(-1, 2) - self.draw_polygon(segment, color=color, edge_color=edge_color, alpha=alpha) - else: - # TODO: Use Path/PathPatch to draw vector graphics: - # https://stackoverflow.com/questions/8919719/how-to-plot-a-complex-polygon - rgba = np.zeros(shape2d + (4,), dtype="float32") - rgba[:, :, :3] = color - rgba[:, :, 3] = (mask.mask == 1).astype("float32") * alpha - has_valid_segment = True - self.output.ax.imshow(rgba, extent=(0, self.output.width, self.output.height, 0)) - - if text is not None and has_valid_segment: - lighter_color = self._change_color_brightness(color, brightness_factor=0.7) - self._draw_text_in_mask(binary_mask, text, lighter_color) - return self.output - - def draw_soft_mask(self, soft_mask, color=None, *, text=None, alpha=0.5): - """ - Args: - soft_mask (ndarray): float array of shape (H, W), each value in [0, 1]. - color: color of the mask. Refer to `matplotlib.colors` for a full list of - formats that are accepted. If None, will pick a random color. - text (str): if None, will be drawn on the object - alpha (float): blending efficient. Smaller values lead to more transparent masks. - - Returns: - output (VisImage): image object with mask drawn. - """ - if color is None: - color = random_color(rgb=True, maximum=1) - color = mplc.to_rgb(color) - - shape2d = (soft_mask.shape[0], soft_mask.shape[1]) - rgba = np.zeros(shape2d + (4,), dtype="float32") - rgba[:, :, :3] = color - rgba[:, :, 3] = soft_mask * alpha - self.output.ax.imshow(rgba, extent=(0, self.output.width, self.output.height, 0)) - - if text is not None: - lighter_color = self._change_color_brightness(color, brightness_factor=0.7) - binary_mask = (soft_mask > 0.5).astype("uint8") - self._draw_text_in_mask(binary_mask, text, lighter_color) - return self.output - - def draw_polygon(self, segment, color, edge_color=None, alpha=0.5): - """ - Args: - segment: numpy array of shape Nx2, containing all the points in the polygon. - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - edge_color: color of the polygon edges. Refer to `matplotlib.colors` for a - full list of formats that are accepted. If not provided, a darker shade - of the polygon color will be used instead. - alpha (float): blending efficient. Smaller values lead to more transparent masks. - - Returns: - output (VisImage): image object with polygon drawn. 
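
The pycocotools area check used by `draw_binary_mask` above, standalone (a 10x10 square polygon on a 32x32 canvas; toy values):

```
import pycocotools.mask as mask_util

h, w = 32, 32
polygon = [5, 5, 15, 5, 15, 15, 5, 15]   # x0, y0, x1, y1, ... in pixels
rle = mask_util.frPyObjects([polygon], h, w)
print(mask_util.area(rle))               # pixel area of the 10x10 square
```
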
- """ - if edge_color is None: - # make edge color darker than the polygon color - if alpha > 0.8: - edge_color = self._change_color_brightness(color, brightness_factor=-0.7) - else: - edge_color = color - edge_color = mplc.to_rgb(edge_color) + (1,) - - polygon = mpl.patches.Polygon( - segment, - fill=True, - facecolor=mplc.to_rgb(color) + (alpha,), - edgecolor=edge_color, - linewidth=max(self._default_font_size // 15 * self.output.scale, 1), - ) - self.output.ax.add_patch(polygon) - return self.output - - """ - Internal methods: - """ - - def _jitter(self, color): - """ - Randomly modifies given color to produce a slightly different color than the color given. - - Args: - color (tuple[double]): a tuple of 3 elements, containing the RGB values of the color - picked. The values in the list are in the [0.0, 1.0] range. - - Returns: - jittered_color (tuple[double]): a tuple of 3 elements, containing the RGB values of the - color after being jittered. The values in the list are in the [0.0, 1.0] range. - """ - color = mplc.to_rgb(color) - vec = np.random.rand(3) - # better to do it in another color space - vec = vec / np.linalg.norm(vec) * 0.5 - res = np.clip(vec + color, 0, 1) - return tuple(res) - - def _create_grayscale_image(self, mask=None): - """ - Create a grayscale version of the original image. - The colors in masked area, if given, will be kept. - """ - img_bw = self.img.astype("f4").mean(axis=2) - img_bw = np.stack([img_bw] * 3, axis=2) - if mask is not None: - img_bw[mask] = self.img[mask] - return img_bw - - def _change_color_brightness(self, color, brightness_factor): - """ - Depending on the brightness_factor, gives a lighter or darker color i.e. a color with - less or more saturation than the original color. - - Args: - color: color of the polygon. Refer to `matplotlib.colors` for a full list of - formats that are accepted. - brightness_factor (float): a value in [-1.0, 1.0] range. A lightness factor of - 0 will correspond to no change, a factor in [-1.0, 0) range will result in - a darker color and a factor in (0, 1.0] range will result in a lighter color. - - Returns: - modified_color (tuple[double]): a tuple containing the RGB values of the - modified color. Each value in the tuple is in the [0.0, 1.0] range. - """ - assert brightness_factor >= -1.0 and brightness_factor <= 1.0 - color = mplc.to_rgb(color) - polygon_color = colorsys.rgb_to_hls(*mplc.to_rgb(color)) - modified_lightness = polygon_color[1] + (brightness_factor * polygon_color[1]) - modified_lightness = 0.0 if modified_lightness < 0.0 else modified_lightness - modified_lightness = 1.0 if modified_lightness > 1.0 else modified_lightness - modified_color = colorsys.hls_to_rgb(polygon_color[0], modified_lightness, polygon_color[2]) - return modified_color - - def _convert_boxes(self, boxes): - """ - Convert different format of boxes to an NxB array, where B = 4 or 5 is the box dimension. - """ - if isinstance(boxes, Boxes) or isinstance(boxes, RotatedBoxes): - return boxes.tensor.detach().numpy() - else: - return np.asarray(boxes) - - def _convert_masks(self, masks_or_polygons): - """ - Convert different format of masks or polygons to a tuple of masks and polygons. 
- - Returns: - list[GenericMask]: - """ - - m = masks_or_polygons - if isinstance(m, PolygonMasks): - m = m.polygons - if isinstance(m, BitMasks): - m = m.tensor.numpy() - if isinstance(m, torch.Tensor): - m = m.numpy() - ret = [] - for x in m: - if isinstance(x, GenericMask): - ret.append(x) - else: - ret.append(GenericMask(x, self.output.height, self.output.width)) - return ret - - def _draw_text_in_mask(self, binary_mask, text, color): - """ - Find proper places to draw text given a binary mask. - """ - # TODO sometimes drawn on wrong objects. the heuristics here can improve. - _num_cc, cc_labels, stats, centroids = cv2.connectedComponentsWithStats(binary_mask, 8) - if stats[1:, -1].size == 0: - return - largest_component_id = np.argmax(stats[1:, -1]) + 1 - - # draw text on the largest component, as well as other very large components. - for cid in range(1, _num_cc): - if cid == largest_component_id or stats[cid, -1] > _LARGE_MASK_AREA_THRESH: - # median is more stable than centroid - # center = centroids[largest_component_id] - center = np.median((cc_labels == cid).nonzero(), axis=1)[::-1] - self.draw_text(text, center, color=color) - - def _convert_keypoints(self, keypoints): - if isinstance(keypoints, Keypoints): - keypoints = keypoints.tensor - keypoints = np.asarray(keypoints) - return keypoints - - def get_output(self): - """ - Returns: - output (VisImage): the image output containing the visualizations added - to the image. - """ - return self.output diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/README.md b/grit_src_deprecated/third_party/CenterNet2/dev/README.md deleted file mode 100755 index bec811ad..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/README.md +++ /dev/null @@ -1,7 +0,0 @@ - -## Some scripts for developers to use, include: - -- `linter.sh`: lint the codebase before commit. -- `run_{inference,instant}_tests.sh`: run inference/training for a few iterations. - Note that these tests require 2 GPUs. -- `parse_results.sh`: parse results from a log file. diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/linter.sh b/grit_src_deprecated/third_party/CenterNet2/dev/linter.sh deleted file mode 100755 index e873186f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/linter.sh +++ /dev/null @@ -1,42 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -# cd to detectron2 project root -cd "$(dirname "${BASH_SOURCE[0]}")/.." - -{ - black --version | grep -E "21\." > /dev/null -} || { - echo "Linter requires 'black==21.*' !" - exit 1 -} - -ISORT_VERSION=$(isort --version-number) -if [[ "$ISORT_VERSION" != 4.3* ]]; then - echo "Linter requires isort==4.3.21 !" - exit 1 -fi - -set -v - -echo "Running isort ..." -isort -y -sp . --atomic - -echo "Running black ..." -black -l 100 . - -echo "Running flake8 ..." -if [ -x "$(command -v flake8-3)" ]; then - flake8-3 . -else - python3 -m flake8 . -fi - -# echo "Running mypy ..." -# Pytorch does not have enough type annotations -# mypy detectron2/solver detectron2/structures detectron2/config - -echo "Running clang-format ..." -find . 
-regex ".*\.\(cpp\|c\|cc\|cu\|cxx\|h\|hh\|hpp\|hxx\|tcc\|mm\|m\)" -print0 | xargs -0 clang-format -i - -command -v arc > /dev/null && arc lint diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md deleted file mode 100755 index 0174b7dd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/README.md +++ /dev/null @@ -1,17 +0,0 @@ - -## To build a cu101 wheel for release: - -``` -$ nvidia-docker run -it --storage-opt "size=20GB" --name pt pytorch/manylinux-cuda101 -# inside the container: -# git clone https://github.com/facebookresearch/detectron2/ -# cd detectron2 -# export CU_VERSION=cu101 D2_VERSION_SUFFIX= PYTHON_VERSION=3.7 PYTORCH_VERSION=1.8 -# ./dev/packaging/build_wheel.sh -``` - -## To build all wheels for combinations of CUDA and Python -``` -./dev/packaging/build_all_wheels.sh -./dev/packaging/gen_wheel_index.sh /path/to/wheels -``` diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh deleted file mode 100755 index 98b5e444..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/build_all_wheels.sh +++ /dev/null @@ -1,65 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -[[ -d "dev/packaging" ]] || { - echo "Please run this script at detectron2 root!" - exit 1 -} - -build_one() { - cu=$1 - pytorch_ver=$2 - - case "$cu" in - cu*) - container_name=manylinux-cuda${cu/cu/} - ;; - cpu) - container_name=manylinux-cuda101 - ;; - *) - echo "Unrecognized cu=$cu" - exit 1 - ;; - esac - - echo "Launching container $container_name ..." - container_id="$container_name"_"$cu"_"$pytorch_ver" - - py_versions=(3.6 3.7 3.8 3.9) - - for py in "${py_versions[@]}"; do - docker run -itd \ - --name "$container_id" \ - --mount type=bind,source="$(pwd)",target=/detectron2 \ - pytorch/$container_name - - cat </dev/null 2>&1 && pwd )" -. "$script_dir/pkg_helpers.bash" - -echo "Build Settings:" -echo "CU_VERSION: $CU_VERSION" # e.g. cu101 -echo "D2_VERSION_SUFFIX: $D2_VERSION_SUFFIX" # e.g. +cu101 or "" -echo "PYTHON_VERSION: $PYTHON_VERSION" # e.g. 3.6 -echo "PYTORCH_VERSION: $PYTORCH_VERSION" # e.g. 1.4 - -setup_cuda -setup_wheel_python - -yum install ninja-build -y -ln -sv /usr/bin/ninja-build /usr/bin/ninja || true - -pip_install pip numpy -U -pip_install "torch==$PYTORCH_VERSION" \ - -f https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html - -# use separate directories to allow parallel build -BASE_BUILD_DIR=build/$CU_VERSION-py$PYTHON_VERSION-pt$PYTORCH_VERSION -python setup.py \ - build -b "$BASE_BUILD_DIR" \ - bdist_wheel -b "$BASE_BUILD_DIR/build_dist" -d "wheels/$CU_VERSION/torch$PYTORCH_VERSION" -rm -rf "$BASE_BUILD_DIR" diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py deleted file mode 100755 index b4c852dc..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_install_table.py +++ /dev/null @@ -1,63 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -# -*- coding: utf-8 -*- - -import argparse - -template = """
<details><summary> install </summary><pre><code>\
-python -m pip install detectron2{d2_version} -f \\
-  https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html
-</code></pre> </details>"""
-CUDA_SUFFIX = {
-    "11.3": "cu113",
-    "11.1": "cu111",
-    "11.0": "cu110",
-    "10.2": "cu102",
-    "10.1": "cu101",
-    "10.0": "cu100",
-    "9.2": "cu92",
-    "cpu": "cpu",
-}
-
-
-def gen_header(torch_versions):
-    return '<table class="docutils"><tbody><th width="80"> CUDA </th>' + "".join(
-        [
-            '<th valign="bottom" align="left" width="100">torch {}</th>'.format(t)
-            for t in torch_versions
-        ]
-    )
-
-
-if __name__ == "__main__":
-    parser = argparse.ArgumentParser()
-    parser.add_argument("--d2-version", help="detectron2 version number, default to empty")
-    args = parser.parse_args()
-    d2_version = f"=={args.d2_version}" if args.d2_version else ""
-
-    all_versions = (
-        [("1.8", k) for k in ["11.1", "10.2", "10.1", "cpu"]]
-        + [("1.9", k) for k in ["11.1", "10.2", "cpu"]]
-        + [("1.10", k) for k in ["11.3", "11.1", "10.2", "cpu"]]
-    )
-
-    torch_versions = sorted(
-        {k[0] for k in all_versions}, key=lambda x: int(x.split(".")[1]), reverse=True
-    )
-    cuda_versions = sorted(
-        {k[1] for k in all_versions}, key=lambda x: float(x) if x != "cpu" else 0, reverse=True
-    )
-
-    table = gen_header(torch_versions)
-    for cu in cuda_versions:
-        table += f"""<tr><td align="left"><b> {cu} </b></td>"""
-        cu_suffix = CUDA_SUFFIX[cu]
-        for torch in torch_versions:
-            if (torch, cu) in all_versions:
-                cell = template.format(d2_version=d2_version, cuda=cu_suffix, torch=torch)
-            else:
-                cell = ""
-            table += f"""<td align="left">{cell} </td> """
-        table += "</tr>"
-    table += "</tbody></table>"
-    print(table)
diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
deleted file mode 100755
index ec96a27d..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/gen_wheel_index.sh
+++ /dev/null
@@ -1,46 +0,0 @@
-#!/bin/bash -e
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-
-root=$(readlink -f $1)
-if [[ -z "$root" ]]; then
-  echo "Usage: ./gen_wheel_index.sh /absolute/path/to/wheels"
-  exit
-fi
-
-export LC_ALL=C  # reproducible sort
-# NOTE: all sort in this script might not work when xx.10 is released
-
-index=$root/index.html
-
-cd "$root"
-for cu in cpu cu92 cu100 cu101 cu102 cu110 cu111 cu113; do
-  mkdir -p "$root/$cu"
-  cd "$root/$cu"
-  echo "Creating $PWD/index.html ..."
-  # First sort by torch version, then stable sort by d2 version with unique.
-  # As a result, the latest torch version for each d2 version is kept.
-  for whl in $(find -type f -name '*.whl' -printf '%P\n' \
-    | sort -k 1 -r | sort -t '/' -k 2 --stable -r --unique); do
-    echo "<a href=\"${whl/+/%2B}\">$whl</a><br>"
-  done > index.html
      " - done > index.html - - - for torch in torch*; do - cd "$root/$cu/$torch" - - # list all whl for each cuda,torch version - echo "Creating $PWD/index.html ..." - for whl in $(find . -type f -name '*.whl' -printf '%P\n' | sort -r); do - echo "$whl
      " - done > index.html - done -done - -cd "$root" -# Just list everything: -echo "Creating $index ..." -for whl in $(find . -type f -name '*.whl' -printf '%P\n' | sort -r); do - echo "$whl
      " -done > "$index" - diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash b/grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash deleted file mode 100755 index ed9acb00..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/packaging/pkg_helpers.bash +++ /dev/null @@ -1,76 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -# Function to retry functions that sometimes timeout or have flaky failures -retry () { - $* || (sleep 1 && $*) || (sleep 2 && $*) || (sleep 4 && $*) || (sleep 8 && $*) -} -# Install with pip a bit more robustly than the default -pip_install() { - retry pip install --progress-bar off "$@" -} - - -setup_cuda() { - # Now work out the CUDA settings - # Like other torch domain libraries, we choose common GPU architectures only. - # See https://github.com/pytorch/pytorch/blob/master/torch/utils/cpp_extension.py - # and https://github.com/pytorch/vision/blob/main/packaging/pkg_helpers.bash for reference. - export FORCE_CUDA=1 - case "$CU_VERSION" in - cu113) - export CUDA_HOME=/usr/local/cuda-11.3/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX" - ;; - cu112) - export CUDA_HOME=/usr/local/cuda-11.2/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX" - ;; - cu111) - export CUDA_HOME=/usr/local/cuda-11.1/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0;8.6+PTX" - ;; - cu110) - export CUDA_HOME=/usr/local/cuda-11.0/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX;8.0+PTX" - ;; - cu102) - export CUDA_HOME=/usr/local/cuda-10.2/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX" - ;; - cu101) - export CUDA_HOME=/usr/local/cuda-10.1/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX" - ;; - cu100) - export CUDA_HOME=/usr/local/cuda-10.0/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0;7.5+PTX" - ;; - cu92) - export CUDA_HOME=/usr/local/cuda-9.2/ - export TORCH_CUDA_ARCH_LIST="3.7;5.0;5.2;6.0;6.1+PTX;7.0+PTX" - ;; - cpu) - unset FORCE_CUDA - export CUDA_VISIBLE_DEVICES= - ;; - *) - echo "Unrecognized CU_VERSION=$CU_VERSION" - exit 1 - ;; - esac -} - -setup_wheel_python() { - case "$PYTHON_VERSION" in - 3.6) python_abi=cp36-cp36m ;; - 3.7) python_abi=cp37-cp37m ;; - 3.8) python_abi=cp38-cp38 ;; - 3.9) python_abi=cp39-cp39 ;; - *) - echo "Unrecognized PYTHON_VERSION=$PYTHON_VERSION" - exit 1 - ;; - esac - export PATH="/opt/python/$python_abi/bin:$PATH" -} diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh b/grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh deleted file mode 100755 index 80768a40..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/parse_results.sh +++ /dev/null @@ -1,45 +0,0 @@ -#!/bin/bash -# Copyright (c) Facebook, Inc. and its affiliates. - -# A shell script that parses metrics from the log file. -# Make it easier for developers to track performance of models. - -LOG="$1" - -if [[ -z "$LOG" ]]; then - echo "Usage: $0 /path/to/log/file" - exit 1 -fi - -# [12/15 11:47:32] trainer INFO: Total training time: 12:15:04.446477 (0.4900 s / it) -# [12/15 11:49:03] inference INFO: Total inference time: 0:01:25.326167 (0.13652186737060548 s / img per device, on 8 devices) -# [12/15 11:49:03] inference INFO: Total inference pure compute time: ..... 
- -# training time -trainspeed=$(grep -o 'Overall training.*' "$LOG" | grep -Eo '\(.*\)' | grep -o '[0-9\.]*') -echo "Training speed: $trainspeed s/it" - -# inference time: there could be multiple inference during training -inferencespeed=$(grep -o 'Total inference pure.*' "$LOG" | tail -n1 | grep -Eo '\(.*\)' | grep -o '[0-9\.]*' | head -n1) -echo "Inference speed: $inferencespeed s/it" - -# [12/15 11:47:18] trainer INFO: eta: 0:00:00 iter: 90000 loss: 0.5407 (0.7256) loss_classifier: 0.1744 (0.2446) loss_box_reg: 0.0838 (0.1160) loss_mask: 0.2159 (0.2722) loss_objectness: 0.0244 (0.0429) loss_rpn_box_reg: 0.0279 (0.0500) time: 0.4487 (0.4899) data: 0.0076 (0.0975) lr: 0.000200 max mem: 4161 -memory=$(grep -o 'max[_ ]mem: [0-9]*' "$LOG" | tail -n1 | grep -o '[0-9]*') -echo "Training memory: $memory MB" - -echo "Easy to copypaste:" -echo "$trainspeed","$inferencespeed","$memory" - -echo "------------------------------" - -# [12/26 17:26:32] engine.coco_evaluation: copypaste: Task: bbox -# [12/26 17:26:32] engine.coco_evaluation: copypaste: AP,AP50,AP75,APs,APm,APl -# [12/26 17:26:32] engine.coco_evaluation: copypaste: 0.0017,0.0024,0.0017,0.0005,0.0019,0.0011 -# [12/26 17:26:32] engine.coco_evaluation: copypaste: Task: segm -# [12/26 17:26:32] engine.coco_evaluation: copypaste: AP,AP50,AP75,APs,APm,APl -# [12/26 17:26:32] engine.coco_evaluation: copypaste: 0.0014,0.0021,0.0016,0.0005,0.0016,0.0011 - -echo "COCO Results:" -num_tasks=$(grep -o 'copypaste:.*Task.*' "$LOG" | sort -u | wc -l) -# each task has 3 lines -grep -o 'copypaste:.*' "$LOG" | cut -d ' ' -f 2- | tail -n $((num_tasks * 3)) diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh b/grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh deleted file mode 100755 index bc9dcc56..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/run_inference_tests.sh +++ /dev/null @@ -1,44 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -BIN="python tools/train_net.py" -OUTPUT="inference_test_output" -NUM_GPUS=2 - -CFG_LIST=( "${@:1}" ) - -if [ ${#CFG_LIST[@]} -eq 0 ]; then - CFG_LIST=( ./configs/quick_schedules/*inference_acc_test.yaml ) -fi - -echo "========================================================================" -echo "Configs to run:" -echo "${CFG_LIST[@]}" -echo "========================================================================" - - -for cfg in "${CFG_LIST[@]}"; do - echo "========================================================================" - echo "Running $cfg ..." - echo "========================================================================" - $BIN \ - --eval-only \ - --num-gpus $NUM_GPUS \ - --config-file "$cfg" \ - OUTPUT_DIR $OUTPUT - rm -rf $OUTPUT -done - - -echo "========================================================================" -echo "Running demo.py ..." 
-echo "========================================================================" -DEMO_BIN="python demo/demo.py" -COCO_DIR=datasets/coco/val2014 -mkdir -pv $OUTPUT - -set -v - -$DEMO_BIN --config-file ./configs/quick_schedules/panoptic_fpn_R_50_inference_acc_test.yaml \ - --input $COCO_DIR/COCO_val2014_0000001933* --output $OUTPUT -rm -rf $OUTPUT diff --git a/grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh b/grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh deleted file mode 100755 index 9fd9ba0c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/dev/run_instant_tests.sh +++ /dev/null @@ -1,27 +0,0 @@ -#!/bin/bash -e -# Copyright (c) Facebook, Inc. and its affiliates. - -BIN="python tools/train_net.py" -OUTPUT="instant_test_output" -NUM_GPUS=2 - -CFG_LIST=( "${@:1}" ) -if [ ${#CFG_LIST[@]} -eq 0 ]; then - CFG_LIST=( ./configs/quick_schedules/*instant_test.yaml ) -fi - -echo "========================================================================" -echo "Configs to run:" -echo "${CFG_LIST[@]}" -echo "========================================================================" - -for cfg in "${CFG_LIST[@]}"; do - echo "========================================================================" - echo "Running $cfg ..." - echo "========================================================================" - $BIN --num-gpus $NUM_GPUS --config-file "$cfg" \ - SOLVER.IMS_PER_BATCH $(($NUM_GPUS * 2)) \ - OUTPUT_DIR "$OUTPUT" - rm -rf "$OUTPUT" -done - diff --git a/grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile b/grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile deleted file mode 100755 index 4eec16dd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docker/Dockerfile +++ /dev/null @@ -1,47 +0,0 @@ -FROM nvidia/cuda:11.1.1-cudnn8-devel-ubuntu18.04 -# use an older system (18.04) to avoid opencv incompatibility (issue#3524) - -ENV DEBIAN_FRONTEND noninteractive -RUN apt-get update && apt-get install -y \ - python3-opencv ca-certificates python3-dev git wget sudo ninja-build -RUN ln -sv /usr/bin/python3 /usr/bin/python - -# create a non-root user -ARG USER_ID=1000 -RUN useradd -m --no-log-init --system --uid ${USER_ID} appuser -g sudo -RUN echo '%sudo ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers -USER appuser -WORKDIR /home/appuser - -ENV PATH="/home/appuser/.local/bin:${PATH}" -RUN wget https://bootstrap.pypa.io/get-pip.py && \ - python3 get-pip.py --user && \ - rm get-pip.py - -# install dependencies -# See https://pytorch.org/ for other options if you use a different version of CUDA -RUN pip install --user tensorboard cmake # cmake from apt-get is too old -RUN pip install --user torch==1.10 torchvision==0.11.1 -f https://download.pytorch.org/whl/cu111/torch_stable.html - -RUN pip install --user 'git+https://github.com/facebookresearch/fvcore' -# install detectron2 -RUN git clone https://github.com/facebookresearch/detectron2 detectron2_repo -# set FORCE_CUDA because during `docker build` cuda is not accessible -ENV FORCE_CUDA="1" -# This will by default build detectron2 for all common cuda architectures and take a lot more time, -# because inside `docker build`, there is no way to tell which architecture will be used. -ARG TORCH_CUDA_ARCH_LIST="Kepler;Kepler+Tesla;Maxwell;Maxwell+Tegra;Pascal;Volta;Turing" -ENV TORCH_CUDA_ARCH_LIST="${TORCH_CUDA_ARCH_LIST}" - -RUN pip install --user -e detectron2_repo - -# Set a fixed model cache directory. 
-ENV FVCORE_CACHE="/tmp" -WORKDIR /home/appuser/detectron2_repo - -# run detectron2 under user "appuser": -# wget http://images.cocodataset.org/val2017/000000439715.jpg -O input.jpg -# python3 demo/demo.py \ - #--config-file configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - #--input input.jpg --output outputs/ \ - #--opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl diff --git a/grit_src_deprecated/third_party/CenterNet2/docker/README.md b/grit_src_deprecated/third_party/CenterNet2/docker/README.md deleted file mode 100755 index ea709f33..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docker/README.md +++ /dev/null @@ -1,45 +0,0 @@ - -## Use the container (with docker ≥ 19.03) - -``` -cd docker/ -# Build: -docker build --build-arg USER_ID=$UID -t detectron2:v0 . -# Launch (require GPUs): -docker run --gpus all -it \ - --shm-size=8gb --env="DISPLAY" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" \ - --name=detectron2 detectron2:v0 - -# Grant docker access to host X server to show images -xhost +local:`docker inspect --format='{{ .Config.Hostname }}' detectron2` -``` - -## Use the container (with docker-compose ≥ 1.28.0) - -Install docker-compose and nvidia-docker-toolkit, then run: -``` -cd docker && USER_ID=$UID docker-compose run detectron2 -``` - -## Use the deployment container (to test C++ examples) -After building the base detectron2 container as above, do: -``` -# Build: -docker build -t detectron2-deploy:v0 -f deploy.Dockerfile . -# Launch: -docker run --gpus all -it detectron2-deploy:v0 -``` - -#### Using a persistent cache directory - -You can prevent models from being re-downloaded on every run, -by storing them in a cache directory. - -To do this, add `--volume=$HOME/.torch/fvcore_cache:/tmp:rw` in the run command. - -## Install new dependencies -Add the following to `Dockerfile` to make persistent changes. -``` -RUN sudo apt-get update && sudo apt-get install -y vim -``` -Or run them in the container to make temporary changes. diff --git a/grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile b/grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile deleted file mode 100755 index 30b4ed77..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docker/deploy.Dockerfile +++ /dev/null @@ -1,32 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# This file defines a container that compiles the C++ examples of detectron2. -# See docker/README.md for usage. - -# Depends on the image produced by "./Dockerfile" -FROM detectron2:v0 - -USER appuser -ENV HOME=/home/appuser -WORKDIR $HOME - -# Let torchvision find libtorch -ENV CMAKE_PREFIX_PATH=$HOME/.local/lib/python3.6/site-packages/torch/ - -RUN sudo apt-get update && sudo apt-get install libopencv-dev --yes - -# install libtorchvision -RUN git clone --branch v0.11.1 https://github.com/pytorch/vision/ -RUN mkdir vision/build && cd vision/build && \ - cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/.local -DCMAKE_BUILD_TYPE=Release -DWITH_CUDA=on -DTORCH_CUDA_ARCH_LIST=$TORCH_CUDA_ARCH_LIST && \ - make -j && make install - -# make our installation take effect -ENV CPATH=$HOME/.local/include \ - LIBRARY_PATH=$HOME/.local/lib \ - LD_LIBRARY_PATH=$HOME/.local/lib - - -# build C++ examples of detectron2 -RUN cd detectron2_repo/tools/deploy && mkdir build && cd build && \ - cmake -DTORCH_CUDA_ARCH_LIST=$TORCH_CUDA_ARCH_LIST .. 
&& make -# binaries will be available under tools/deploy/build diff --git a/grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml b/grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml deleted file mode 100755 index 6665ab4c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docker/docker-compose.yml +++ /dev/null @@ -1,26 +0,0 @@ -version: "2.3" -services: - detectron2: - build: - context: . - dockerfile: Dockerfile - args: - USER_ID: ${USER_ID:-1000} - deploy: - resources: - reservations: - devices: - - capabilities: - - gpu - shm_size: "8gb" - ulimits: - memlock: -1 - stack: 67108864 - volumes: - - /tmp/.X11-unix:/tmp/.X11-unix:ro - environment: - - DISPLAY=$DISPLAY - - NVIDIA_VISIBLE_DEVICES=all - # Uncomment with proper source to access webcam from docker - # devices: - # - /dev/video0:/dev/video0 diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/.gitignore b/grit_src_deprecated/third_party/CenterNet2/docs/.gitignore deleted file mode 100755 index e35d8850..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/.gitignore +++ /dev/null @@ -1 +0,0 @@ -_build diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/Makefile b/grit_src_deprecated/third_party/CenterNet2/docs/Makefile deleted file mode 100755 index 718eddce..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/Makefile +++ /dev/null @@ -1,19 +0,0 @@ -# Minimal makefile for Sphinx documentation -# Copyright (c) Facebook, Inc. and its affiliates. - -# You can set these variables from the command line. -SPHINXOPTS = -SPHINXBUILD = sphinx-build -SOURCEDIR = . -BUILDDIR = _build - -# Put it first so that "make" without argument is like "make help". -help: - @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) - -.PHONY: help Makefile - -# Catch-all target: route all unknown targets to Sphinx using the new -# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). -%: Makefile - @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/README.md b/grit_src_deprecated/third_party/CenterNet2/docs/README.md deleted file mode 100755 index 8531cafd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/README.md +++ /dev/null @@ -1,15 +0,0 @@ -# Read the docs: - -The latest documentation built from this directory is available at [detectron2.readthedocs.io](https://detectron2.readthedocs.io/). -Documents in this directory are not meant to be read on github. - -# Build the docs: - -1. Install detectron2 according to [INSTALL.md](../INSTALL.md). -2. Install additional libraries required to build docs: - - docutils==0.16 - - Sphinx==3.2.0 - - recommonmark==0.6.0 - - sphinx_rtd_theme - -3. Run `make html` from this directory. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css b/grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css deleted file mode 100755 index 6c511764..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/_static/css/custom.css +++ /dev/null @@ -1,30 +0,0 @@ -/* - * Copyright (c) Facebook, Inc. and its affiliates. 
- * some extra css to make markdown look similar between github/sphinx - */ - -/* - * Below is for install.md: - */ -.rst-content code { - white-space: pre; - border: 0px; -} - -.rst-content th { - border: 1px solid #e1e4e5; -} - -.rst-content th p { - /* otherwise will be default 24px for regular paragraph */ - margin-bottom: 0px; -} - -.rst-content .line-block { - /* otherwise will be 24px */ - margin-bottom: 0px; -} - -div.section > details { - padding-bottom: 1em; -} diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/conf.py b/grit_src_deprecated/third_party/CenterNet2/docs/conf.py deleted file mode 100755 index c7232f41..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/conf.py +++ /dev/null @@ -1,382 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -# flake8: noqa - -# Configuration file for the Sphinx documentation builder. -# -# This file does only contain a selection of the most common options. For a -# full list see the documentation: -# http://www.sphinx-doc.org/en/master/config - -# -- Path setup -------------------------------------------------------------- - -# If extensions (or modules to document with autodoc) are in another directory, -# add these directories to sys.path here. If the directory is relative to the -# documentation root, use os.path.abspath to make it absolute, like shown here. -# -import os -import sys -from unittest import mock -from sphinx.domains import Domain -from typing import Dict, List, Tuple - -# The theme to use for HTML and HTML Help pages. See the documentation for -# a list of builtin themes. -# -import sphinx_rtd_theme - - -class GithubURLDomain(Domain): - """ - Resolve certain links in markdown files to github source. - """ - - name = "githuburl" - ROOT = "https://github.com/facebookresearch/detectron2/blob/main/" - LINKED_DOC = ["tutorials/install", "tutorials/getting_started"] - - def resolve_any_xref(self, env, fromdocname, builder, target, node, contnode): - github_url = None - if not target.endswith("html") and target.startswith("../../"): - url = target.replace("../", "") - github_url = url - if fromdocname in self.LINKED_DOC: - # unresolved links in these docs are all github links - github_url = target - - if github_url is not None: - if github_url.endswith("MODEL_ZOO") or github_url.endswith("README"): - # bug of recommonmark. 
- # https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 - github_url += ".md" - print("Ref {} resolved to github:{}".format(target, github_url)) - contnode["refuri"] = self.ROOT + github_url - return [("githuburl:any", contnode)] - else: - return [] - - -# to support markdown -from recommonmark.parser import CommonMarkParser - -sys.path.insert(0, os.path.abspath("../")) -os.environ["_DOC_BUILDING"] = "True" -DEPLOY = os.environ.get("READTHEDOCS") == "True" - - -# -- Project information ----------------------------------------------------- - -# fmt: off -try: - import torch # noqa -except ImportError: - for m in [ - "torch", "torchvision", "torch.nn", "torch.nn.parallel", "torch.distributed", "torch.multiprocessing", "torch.autograd", - "torch.autograd.function", "torch.nn.modules", "torch.nn.modules.utils", "torch.utils", "torch.utils.data", "torch.onnx", - "torchvision", "torchvision.ops", - ]: - sys.modules[m] = mock.Mock(name=m) - sys.modules['torch'].__version__ = "1.7" # fake version - HAS_TORCH = False -else: - try: - torch.ops.detectron2 = mock.Mock(name="torch.ops.detectron2") - except: - pass - HAS_TORCH = True - -for m in [ - "cv2", "scipy", "portalocker", "detectron2._C", - "pycocotools", "pycocotools.mask", "pycocotools.coco", "pycocotools.cocoeval", - "google", "google.protobuf", "google.protobuf.internal", "onnx", - "caffe2", "caffe2.proto", "caffe2.python", "caffe2.python.utils", "caffe2.python.onnx", "caffe2.python.onnx.backend", -]: - sys.modules[m] = mock.Mock(name=m) -# fmt: on -sys.modules["cv2"].__version__ = "3.4" - -import detectron2 # isort: skip - -if HAS_TORCH: - from detectron2.utils.env import fixup_module_metadata - - fixup_module_metadata("torch.nn", torch.nn.__dict__) - fixup_module_metadata("torch.utils.data", torch.utils.data.__dict__) - - -project = "detectron2" -copyright = "2019-2020, detectron2 contributors" -author = "detectron2 contributors" - -# The short X.Y version -version = detectron2.__version__ -# The full version, including alpha/beta/rc tags -release = version - - -# -- General configuration --------------------------------------------------- - -# If your documentation needs a minimal Sphinx version, state it here. -# -needs_sphinx = "3.0" - -# Add any Sphinx extension module names here, as strings. They can be -# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom -# ones. -extensions = [ - "recommonmark", - "sphinx.ext.autodoc", - "sphinx.ext.napoleon", - "sphinx.ext.intersphinx", - "sphinx.ext.todo", - "sphinx.ext.coverage", - "sphinx.ext.mathjax", - "sphinx.ext.viewcode", - "sphinx.ext.githubpages", -] - -# -- Configurations for plugins ------------ -napoleon_google_docstring = True -napoleon_include_init_with_doc = True -napoleon_include_special_with_doc = True -napoleon_numpy_docstring = False -napoleon_use_rtype = False -autodoc_inherit_docstrings = False -autodoc_member_order = "bysource" - -if DEPLOY: - intersphinx_timeout = 10 -else: - # skip this when building locally - intersphinx_timeout = 0.5 -intersphinx_mapping = { - "python": ("https://docs.python.org/3.6", None), - "numpy": ("https://docs.scipy.org/doc/numpy/", None), - "torch": ("https://pytorch.org/docs/master/", None), -} -# ------------------------- - - -# Add any paths that contain templates here, relative to this directory. -templates_path = ["_templates"] - -source_suffix = [".rst", ".md"] - -# The master toctree document. 
-master_doc = "index" - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# -# This is also used if you do content translation via gettext catalogs. -# Usually you set "language" from the command line for these cases. -language = None - -# List of patterns, relative to source directory, that match files and -# directories to ignore when looking for source files. -# This pattern also affects html_static_path and html_extra_path. -exclude_patterns = ["_build", "Thumbs.db", ".DS_Store", "build", "README.md", "tutorials/README.md"] - -# The name of the Pygments (syntax highlighting) style to use. -pygments_style = "sphinx" - - -# -- Options for HTML output ------------------------------------------------- - -html_theme = "sphinx_rtd_theme" -html_theme_path = [sphinx_rtd_theme.get_html_theme_path()] - -# Theme options are theme-specific and customize the look and feel of a theme -# further. For a list of options available for each theme, see the -# documentation. -# -# html_theme_options = {} - -# Add any paths that contain custom static files (such as style sheets) here, -# relative to this directory. They are copied after the builtin static files, -# so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = ["_static"] -html_css_files = ["css/custom.css"] - -# Custom sidebar templates, must be a dictionary that maps document names -# to template names. -# -# The default sidebars (for documents that don't match any pattern) are -# defined by theme itself. Builtin themes are using these templates by -# default: ``['localtoc.html', 'relations.html', 'sourcelink.html', -# 'searchbox.html']``. -# -# html_sidebars = {} - - -# -- Options for HTMLHelp output --------------------------------------------- - -# Output file base name for HTML help builder. -htmlhelp_basename = "detectron2doc" - - -# -- Options for LaTeX output ------------------------------------------------ - -latex_elements = { - # The paper size ('letterpaper' or 'a4paper'). - # - # 'papersize': 'letterpaper', - # The font size ('10pt', '11pt' or '12pt'). - # - # 'pointsize': '10pt', - # Additional stuff for the LaTeX preamble. - # - # 'preamble': '', - # Latex figure (float) alignment - # - # 'figure_align': 'htbp', -} - -# Grouping the document tree into LaTeX files. List of tuples -# (source start file, target name, title, -# author, documentclass [howto, manual, or own class]). -latex_documents = [ - (master_doc, "detectron2.tex", "detectron2 Documentation", "detectron2 contributors", "manual") -] - - -# -- Options for manual page output ------------------------------------------ - -# One entry per manual page. List of tuples -# (source start file, name, description, authors, manual section). -man_pages = [(master_doc, "detectron2", "detectron2 Documentation", [author], 1)] - - -# -- Options for Texinfo output ---------------------------------------------- - -# Grouping the document tree into Texinfo files. List of tuples -# (source start file, target name, title, author, -# dir menu entry, description, category) -texinfo_documents = [ - ( - master_doc, - "detectron2", - "detectron2 Documentation", - author, - "detectron2", - "One line description of project.", - "Miscellaneous", - ) -] - - -# -- Options for todo extension ---------------------------------------------- - -# If true, `todo` and `todoList` produce output, else they produce nothing. 
-todo_include_todos = True - - -def autodoc_skip_member(app, what, name, obj, skip, options): - # we hide something deliberately - if getattr(obj, "__HIDE_SPHINX_DOC__", False): - return True - - # Hide some that are deprecated or not intended to be used - HIDDEN = { - "ResNetBlockBase", - "GroupedBatchSampler", - "build_transform_gen", - "apply_transform_gens", - "TransformGen", - "apply_augmentations", - "StandardAugInput", - "build_batch_data_loader", - "draw_panoptic_seg_predictions", - "WarmupCosineLR", - "WarmupMultiStepLR", - "downgrade_config", - "upgrade_config", - "add_export_config", - } - try: - if name in HIDDEN or ( - hasattr(obj, "__doc__") and obj.__doc__.lower().strip().startswith("deprecated") - ): - print("Skipping deprecated object: {}".format(name)) - return True - except: - pass - return skip - - -_PAPER_DATA = { - "resnet": ("1512.03385", "Deep Residual Learning for Image Recognition"), - "fpn": ("1612.03144", "Feature Pyramid Networks for Object Detection"), - "mask r-cnn": ("1703.06870", "Mask R-CNN"), - "faster r-cnn": ( - "1506.01497", - "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks", - ), - "deformconv": ("1703.06211", "Deformable Convolutional Networks"), - "deformconv2": ("1811.11168", "Deformable ConvNets v2: More Deformable, Better Results"), - "panopticfpn": ("1901.02446", "Panoptic Feature Pyramid Networks"), - "retinanet": ("1708.02002", "Focal Loss for Dense Object Detection"), - "cascade r-cnn": ("1712.00726", "Cascade R-CNN: Delving into High Quality Object Detection"), - "lvis": ("1908.03195", "LVIS: A Dataset for Large Vocabulary Instance Segmentation"), - "rrpn": ("1703.01086", "Arbitrary-Oriented Scene Text Detection via Rotation Proposals"), - "imagenet in 1h": ("1706.02677", "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour"), - "xception": ("1610.02357", "Xception: Deep Learning with Depthwise Separable Convolutions"), - "mobilenet": ( - "1704.04861", - "MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications", - ), - "deeplabv3+": ( - "1802.02611", - "Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation", - ), - "dds": ("2003.13678", "Designing Network Design Spaces"), - "scaling": ("2103.06877", "Fast and Accurate Model Scaling"), - "fcos": ("2006.09214", "FCOS: A Simple and Strong Anchor-free Object Detector"), - "rethinking-batchnorm": ("2105.07576", 'Rethinking "Batch" in BatchNorm'), -} - - -def paper_ref_role( - typ: str, - rawtext: str, - text: str, - lineno: int, - inliner, - options: Dict = {}, - content: List[str] = [], -): - """ - Parse :paper:`xxx`. Similar to the "extlinks" sphinx extension. 
- """ - from docutils import nodes, utils - from sphinx.util.nodes import split_explicit_title - - text = utils.unescape(text) - has_explicit_title, title, link = split_explicit_title(text) - link = link.lower() - if link not in _PAPER_DATA: - inliner.reporter.warning("Cannot find paper " + link) - paper_url, paper_title = "#", link - else: - paper_url, paper_title = _PAPER_DATA[link] - if "/" not in paper_url: - paper_url = "https://arxiv.org/abs/" + paper_url - if not has_explicit_title: - title = paper_title - pnode = nodes.reference(title, title, internal=False, refuri=paper_url) - return [pnode], [] - - -def setup(app): - from recommonmark.transform import AutoStructify - - app.add_domain(GithubURLDomain) - app.connect("autodoc-skip-member", autodoc_skip_member) - app.add_role("paper", paper_ref_role) - app.add_config_value( - "recommonmark_config", - {"enable_math": True, "enable_inline_math": True, "enable_eval_rst": True}, - True, - ) - app.add_transform(AutoStructify) diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/index.rst deleted file mode 100755 index 8634b7b1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/index.rst +++ /dev/null @@ -1,14 +0,0 @@ -.. detectron2 documentation master file, created by - sphinx-quickstart on Sat Sep 21 13:46:45 2019. - You can adapt this file completely to your liking, but it should at least - contain the root `toctree` directive. - -Welcome to detectron2's documentation! -====================================== - -.. toctree:: - :maxdepth: 2 - - tutorials/index - notes/index - modules/index diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst deleted file mode 100755 index 449caaff..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/checkpoint.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.checkpoint -============================= - -.. automodule:: detectron2.checkpoint - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst deleted file mode 100755 index c76913d8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/config.rst +++ /dev/null @@ -1,18 +0,0 @@ -detectron2.config -========================= - -Related tutorials: :doc:`../tutorials/configs`, :doc:`../tutorials/extend`. - -.. automodule:: detectron2.config - :members: - :undoc-members: - :show-inheritance: - - -Yaml Config References ------------------ - -.. literalinclude:: ../../detectron2/config/defaults.py - :language: python - :linenos: - :lines: 7- diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst deleted file mode 100755 index 0d5bd891..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/data.rst +++ /dev/null @@ -1,37 +0,0 @@ -detectron2.data -======================= - -.. autodata:: detectron2.data.DatasetCatalog(dict) - :annotation: - -.. autodata:: detectron2.data.MetadataCatalog(dict) - :annotation: - -.. automodule:: detectron2.data - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.detection\_utils module ---------------------------------------- - -.. 
automodule:: detectron2.data.detection_utils - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.datasets module ---------------------------------------- - -.. automodule:: detectron2.data.datasets - :members: - :undoc-members: - :show-inheritance: - -detectron2.data.samplers module ---------------------------------------- - -.. automodule:: detectron2.data.samplers - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst deleted file mode 100755 index 1533a434..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/data_transforms.rst +++ /dev/null @@ -1,10 +0,0 @@ -detectron2.data.transforms -==================================== - -Related tutorial: :doc:`../tutorials/augmentation`. - -.. automodule:: detectron2.data.transforms - :members: - :undoc-members: - :show-inheritance: - :imported-members: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst deleted file mode 100755 index 7e0d2b07..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/engine.rst +++ /dev/null @@ -1,26 +0,0 @@ -detectron2.engine -========================= - -Related tutorial: :doc:`../tutorials/training`. - -.. automodule:: detectron2.engine - :members: - :undoc-members: - :show-inheritance: - - -detectron2.engine.defaults module ---------------------------------- - -.. automodule:: detectron2.engine.defaults - :members: - :undoc-members: - :show-inheritance: - -detectron2.engine.hooks module ---------------------------------- - -.. automodule:: detectron2.engine.hooks - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst deleted file mode 100755 index 69bfc4b9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/evaluation.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.evaluation -============================= - -.. automodule:: detectron2.evaluation - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst deleted file mode 100755 index dcee14f8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/export.rst +++ /dev/null @@ -1,9 +0,0 @@ -detectron2.export -========================= - -Related tutorial: :doc:`../tutorials/deployment`. - -.. automodule:: detectron2.export - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst deleted file mode 100755 index c8bf9f58..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/fvcore.rst +++ /dev/null @@ -1,49 +0,0 @@ -fvcore documentation -==================== - -Detectron2 depends on utilities in -`fvcore `_. -We include part of fvcore documentation here for easier reference. - -fvcore.nn ------------------ - -.. automodule:: fvcore.nn - :members: - :inherited-members: - :undoc-members: - :show-inheritance: - -fvcore.common ---------------------- - -.. automodule:: fvcore.common.checkpoint - :members: - :undoc-members: - :show-inheritance: - -.. 
automodule:: fvcore.common.config
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-.. automodule:: fvcore.common.history_buffer
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-.. automodule:: fvcore.common.param_scheduler
-   :members:
-   :inherited-members:
-   :undoc-members:
-   :show-inheritance:
-
-.. automodule:: fvcore.common.registry
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-.. automodule:: fvcore.common.timer
-   :members:
-   :undoc-members:
-   :show-inheritance:
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst
deleted file mode 100755
index 14b75439..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/index.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-API Documentation
-==================
-
-.. toctree::
-
-   checkpoint
-   config
-   data
-   data_transforms
-   engine
-   evaluation
-   layers
-   model_zoo
-   modeling
-   solver
-   structures
-   utils
-   export
-   fvcore
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst
deleted file mode 100755
index b43b42a7..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/layers.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-detectron2.layers
-=========================
-
-.. automodule:: detectron2.layers
-   :members:
-   :undoc-members:
-   :show-inheritance:
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst
deleted file mode 100755
index 5abbad1f..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/model_zoo.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-detectron2.model_zoo
-============================
-
-.. automodule:: detectron2.model_zoo
-   :members:
-   :undoc-members:
-   :show-inheritance:
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst
deleted file mode 100755
index a22c7ed3..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/modeling.rst
+++ /dev/null
@@ -1,58 +0,0 @@
-detectron2.modeling
-===========================
-
-.. automodule:: detectron2.modeling
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.modeling.poolers module
---------------------------------------
-
-.. automodule:: detectron2.modeling.poolers
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.modeling.sampling module
------------------------------------
-
-.. automodule:: detectron2.modeling.sampling
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-detectron2.modeling.box_regression module
------------------------------------------
-
-.. automodule:: detectron2.modeling.box_regression
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-Model Registries
------------------
-
-These are different registries provided in modeling.
-Each registry provides you the ability to replace a component with your own customized one,
-without having to modify detectron2's code.
-
-Note that it is impossible to allow users to customize any line of code directly.
-Even just to add one line at some place,
-you'll likely need to find out the smallest registry which contains that line,
-and register your component to that registry.
-
-
-.. autodata:: detectron2.modeling.META_ARCH_REGISTRY
-.. autodata:: detectron2.modeling.BACKBONE_REGISTRY
-.. autodata:: detectron2.modeling.PROPOSAL_GENERATOR_REGISTRY
-.. 
autodata:: detectron2.modeling.RPN_HEAD_REGISTRY -.. autodata:: detectron2.modeling.ANCHOR_GENERATOR_REGISTRY -.. autodata:: detectron2.modeling.ROI_HEADS_REGISTRY -.. autodata:: detectron2.modeling.ROI_BOX_HEAD_REGISTRY -.. autodata:: detectron2.modeling.ROI_MASK_HEAD_REGISTRY -.. autodata:: detectron2.modeling.ROI_KEYPOINT_HEAD_REGISTRY diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst deleted file mode 100755 index 59d98c72..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/solver.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.solver -========================= - -.. automodule:: detectron2.solver - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst deleted file mode 100755 index 1369dc08..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/structures.rst +++ /dev/null @@ -1,7 +0,0 @@ -detectron2.structures -============================= - -.. automodule:: detectron2.structures - :members: - :undoc-members: - :show-inheritance: diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst b/grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst deleted file mode 100755 index ab58f2ca..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/modules/utils.rst +++ /dev/null @@ -1,80 +0,0 @@ -detectron2.utils -======================== - -detectron2.utils.colormap module --------------------------------- - -.. automodule:: detectron2.utils.colormap - :members: - :undoc-members: - :show-inheritance: - -detectron2.utils.comm module ----------------------------- - -.. automodule:: detectron2.utils.comm - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.events module ------------------------------- - -.. automodule:: detectron2.utils.events - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.logger module ------------------------------- - -.. automodule:: detectron2.utils.logger - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.registry module --------------------------------- - -.. automodule:: detectron2.utils.registry - :members: - :undoc-members: - :show-inheritance: - -detectron2.utils.memory module ----------------------------------- - -.. automodule:: detectron2.utils.memory - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.analysis module ----------------------------------- - -.. automodule:: detectron2.utils.analysis - :members: - :undoc-members: - :show-inheritance: - - -detectron2.utils.visualizer module ----------------------------------- - -.. automodule:: detectron2.utils.visualizer - :members: - :undoc-members: - :show-inheritance: - -detectron2.utils.video\_visualizer module ------------------------------------------ - -.. automodule:: detectron2.utils.video_visualizer - :members: - :undoc-members: - :show-inheritance: - diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md deleted file mode 100755 index b41588da..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/notes/benchmarks.md +++ /dev/null @@ -1,196 +0,0 @@ - -# Benchmarks - -Here we benchmark the training speed of a Mask R-CNN in detectron2, -with some other popular open source Mask R-CNN implementations. 
- - -### Settings - -* Hardware: 8 NVIDIA V100s with NVLink. -* Software: Python 3.7, CUDA 10.1, cuDNN 7.6.5, PyTorch 1.5, - TensorFlow 1.15.0rc2, Keras 2.2.5, MxNet 1.6.0b20190820. -* Model: an end-to-end R-50-FPN Mask-RCNN model, using the same hyperparameter as the - [Detectron baseline config](https://github.com/facebookresearch/Detectron/blob/master/configs/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml) - (it does not have scale augmentation). -* Metrics: We use the average throughput in iterations 100-500 to skip GPU warmup time. - Note that for R-CNN-style models, the throughput of a model typically changes during training, because - it depends on the predictions of the model. Therefore this metric is not directly comparable with - "train speed" in model zoo, which is the average speed of the entire training run. - - -### Main Results - -```eval_rst -+-------------------------------+--------------------+ -| Implementation | Throughput (img/s) | -+===============================+====================+ -| |D2| |PT| | 62 | -+-------------------------------+--------------------+ -| mmdetection_ |PT| | 53 | -+-------------------------------+--------------------+ -| maskrcnn-benchmark_ |PT| | 53 | -+-------------------------------+--------------------+ -| tensorpack_ |TF| | 50 | -+-------------------------------+--------------------+ -| simpledet_ |mxnet| | 39 | -+-------------------------------+--------------------+ -| Detectron_ |C2| | 19 | -+-------------------------------+--------------------+ -| `matterport/Mask_RCNN`__ |TF| | 14 | -+-------------------------------+--------------------+ - -.. _maskrcnn-benchmark: https://github.com/facebookresearch/maskrcnn-benchmark/ -.. _tensorpack: https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN -.. _mmdetection: https://github.com/open-mmlab/mmdetection/ -.. _simpledet: https://github.com/TuSimple/simpledet/ -.. _Detectron: https://github.com/facebookresearch/Detectron -__ https://github.com/matterport/Mask_RCNN/ - -.. |D2| image:: https://github.com/facebookresearch/detectron2/raw/main/.github/Detectron2-Logo-Horz.svg?sanitize=true - :height: 15pt - :target: https://github.com/facebookresearch/detectron2/ -.. |PT| image:: https://pytorch.org/assets/images/logo-icon.svg - :width: 15pt - :height: 15pt - :target: https://pytorch.org -.. |TF| image:: https://static.nvidiagrid.net/ngc/containers/tensorflow.png - :width: 15pt - :height: 15pt - :target: https://tensorflow.org -.. |mxnet| image:: https://github.com/dmlc/web-data/raw/master/mxnet/image/mxnet_favicon.png - :width: 15pt - :height: 15pt - :target: https://mxnet.apache.org/ -.. |C2| image:: https://caffe2.ai/static/logo.svg - :width: 15pt - :height: 15pt - :target: https://caffe2.ai -``` - - -Details for each implementation: - -* __Detectron2__: with release v0.1.2, run: - ``` - python tools/train_net.py --config-file configs/Detectron1-Comparisons/mask_rcnn_R_50_FPN_noaug_1x.yaml --num-gpus 8 - ``` - -* __mmdetection__: at commit `b0d845f`, run - ``` - ./tools/dist_train.sh configs/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco.py 8 - ``` - -* __maskrcnn-benchmark__: use commit `0ce8f6f` with `sed -i 's/torch.uint8/torch.bool/g' **/*.py; sed -i 's/AT_CHECK/TORCH_CHECK/g' **/*.cu` - to make it compatible with PyTorch 1.5. 
Then, run training with - ``` - python -m torch.distributed.launch --nproc_per_node=8 tools/train_net.py --config-file configs/e2e_mask_rcnn_R_50_FPN_1x.yaml - ``` - The speed we observed is faster than its model zoo, likely due to different software versions. - -* __tensorpack__: at commit `caafda`, `export TF_CUDNN_USE_AUTOTUNE=0`, then run - ``` - mpirun -np 8 ./train.py --config DATA.BASEDIR=/data/coco TRAINER=horovod BACKBONE.STRIDE_1X1=True TRAIN.STEPS_PER_EPOCH=50 --load ImageNet-R50-AlignPadding.npz - ``` - -* __SimpleDet__: at commit `9187a1`, run - ``` - python detection_train.py --config config/mask_r50v1_fpn_1x.py - ``` - -* __Detectron__: run - ``` - python tools/train_net.py --cfg configs/12_2017_baselines/e2e_mask_rcnn_R-50-FPN_1x.yaml - ``` - Note that many of its ops run on CPUs, therefore the performance is limited. - -* __matterport/Mask_RCNN__: at commit `3deaec`, apply the following diff, `export TF_CUDNN_USE_AUTOTUNE=0`, then run - ``` - python coco.py train --dataset=/data/coco/ --model=imagenet - ``` - Note that many small details in this implementation might be different - from Detectron's standards. - -
      - - (diff to make it use the same hyperparameters - click to expand) - - - ```diff - diff --git i/mrcnn/model.py w/mrcnn/model.py - index 62cb2b0..61d7779 100644 - --- i/mrcnn/model.py - +++ w/mrcnn/model.py - @@ -2367,8 +2367,8 @@ class MaskRCNN(): - epochs=epochs, - steps_per_epoch=self.config.STEPS_PER_EPOCH, - callbacks=callbacks, - - validation_data=val_generator, - - validation_steps=self.config.VALIDATION_STEPS, - + #validation_data=val_generator, - + #validation_steps=self.config.VALIDATION_STEPS, - max_queue_size=100, - workers=workers, - use_multiprocessing=True, - diff --git i/mrcnn/parallel_model.py w/mrcnn/parallel_model.py - index d2bf53b..060172a 100644 - --- i/mrcnn/parallel_model.py - +++ w/mrcnn/parallel_model.py - @@ -32,6 +32,7 @@ class ParallelModel(KM.Model): - keras_model: The Keras model to parallelize - gpu_count: Number of GPUs. Must be > 1 - """ - + super().__init__() - self.inner_model = keras_model - self.gpu_count = gpu_count - merged_outputs = self.make_parallel() - diff --git i/samples/coco/coco.py w/samples/coco/coco.py - index 5d172b5..239ed75 100644 - --- i/samples/coco/coco.py - +++ w/samples/coco/coco.py - @@ -81,7 +81,10 @@ class CocoConfig(Config): - IMAGES_PER_GPU = 2 - - # Uncomment to train on 8 GPUs (default is 1) - - # GPU_COUNT = 8 - + GPU_COUNT = 8 - + BACKBONE = "resnet50" - + STEPS_PER_EPOCH = 50 - + TRAIN_ROIS_PER_IMAGE = 512 - - # Number of classes (including background) - NUM_CLASSES = 1 + 80 # COCO has 80 classes - @@ -496,29 +499,10 @@ if __name__ == '__main__': - # *** This training schedule is an example. Update to your needs *** - - # Training - Stage 1 - - print("Training network heads") - model.train(dataset_train, dataset_val, - learning_rate=config.LEARNING_RATE, - epochs=40, - - layers='heads', - - augmentation=augmentation) - - - - # Training - Stage 2 - - # Finetune layers from ResNet stage 4 and up - - print("Fine tune Resnet stage 4 and up") - - model.train(dataset_train, dataset_val, - - learning_rate=config.LEARNING_RATE, - - epochs=120, - - layers='4+', - - augmentation=augmentation) - - - - # Training - Stage 3 - - # Fine tune all layers - - print("Fine tune all layers") - - model.train(dataset_train, dataset_val, - - learning_rate=config.LEARNING_RATE / 10, - - epochs=160, - - layers='all', - + layers='3+', - augmentation=augmentation) - - elif args.command == "evaluate": - ``` - -
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md
deleted file mode 100755
index 000e9f88..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/notes/changelog.md
+++ /dev/null
@@ -1,48 +0,0 @@
-# Change Log and Backward Compatibility
-
-### Releases
-See release logs at
-[https://github.com/facebookresearch/detectron2/releases](https://github.com/facebookresearch/detectron2/releases)
-for new updates.
-
-### Backward Compatibility
-
-Due to the research nature of what the library does, there might be backward incompatible changes.
-But we try to reduce users' disruption in the following ways:
-* APIs listed in [API documentation](https://detectron2.readthedocs.io/modules/index.html), including
-  function/class names, their arguments, and documented class attributes, are considered *stable* unless
-  otherwise noted in the documentation.
-  They are less likely to be broken, but if needed, will trigger a deprecation warning for a reasonable period
-  before getting broken, and will be documented in release logs.
-* Other functions/classes/attributes are considered internal, and are more likely to change.
-  However, we're aware that some of them may already be used by other projects, and in particular we may
-  use them for convenience among projects under `detectron2/projects`.
-  For such APIs, we may treat them as stable APIs and also apply the above strategies.
-  They may be promoted to stable when we're ready.
-* Projects under "detectron2/projects" or imported with "detectron2.projects" are research projects
-  and are all considered experimental.
-* Classes/functions that contain the word "default" or are explicitly documented to produce
-  "default behavior" may change their behaviors when new features are added.
-
-Despite the possible breakage, if a third-party project would like to keep up with the latest updates
-in detectron2, using it as a library will still be less disruptive than forking, because
-the frequency and scope of API changes will be much smaller than code changes.
-
-To see such changes, search for "incompatible changes" in [release logs](https://github.com/facebookresearch/detectron2/releases).
-
-### Config Version Change Log
-
-Detectron2's config version has not been changed since open source.
-There is no need for an open source user to worry about this.
-
-* v1: Rename `RPN_HEAD.NAME` to `RPN.HEAD_NAME`.
-* v2: A batch of renames of many configurations before release.
-
-### Silent Regressions in Historical Versions
-
-We list a few silent regressions, since they may silently produce incorrect results and will be hard to debug.
-
-* 04/01/2020 - 05/11/2020: Bad accuracy if `TRAIN_ON_PRED_BOXES` is set to True.
-* 03/30/2020 - 04/01/2020: ResNets are not correctly built.
-* 12/19/2019 - 12/26/2019: Using aspect ratio grouping causes a drop in accuracy.
-* - 11/9/2019: Test time augmentation does not predict the last category.
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md
deleted file mode 100755
index 83d93f51..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/notes/compatibility.md
+++ /dev/null
@@ -1,84 +0,0 @@
-# Compatibility with Other Libraries
-
-## Compatibility with Detectron (and maskrcnn-benchmark)
-
-Detectron2 addresses some legacy issues left in Detectron. As a result, their models
-are not compatible:
-running inference with the same model weights will produce different results in the two code bases.
-
-The major differences regarding inference are:
-
-- The height and width of a box with corners (x1, y1) and (x2, y2) are now computed more naturally as
-  width = x2 - x1 and height = y2 - y1;
-  in Detectron, a "+ 1" was added to both height and width.
-
-  Note that the relevant ops in Caffe2 have [adopted this change of convention](https://github.com/pytorch/pytorch/pull/20550)
-  with an extra option.
-  So it is still possible to run inference with a Detectron2-trained model in Caffe2.
-
-  The change in height/width calculations most notably changes:
-  - encoding/decoding in bounding box regression.
-  - non-maximum suppression. The effect here is very negligible, though.
-
-- RPN now uses simpler anchors with fewer quantization artifacts.
-
-  In Detectron, the anchors were quantized and
-  [do not have accurate areas](https://github.com/facebookresearch/Detectron/issues/227).
-  In Detectron2, the anchors are center-aligned to feature grid points and not quantized.
-
-- Classification layers have a different ordering of class labels.
-
-  This involves any trainable parameter with shape (..., num_categories + 1, ...).
-  In Detectron2, integer labels [0, K-1] correspond to the K = num_categories object categories
-  and the label "K" corresponds to the special "background" category.
-  In Detectron, label "0" means background, and labels [1, K] correspond to the K categories.
-
-- ROIAlign is implemented differently. The new implementation is [available in Caffe2](https://github.com/pytorch/pytorch/pull/23706).
-
-  1. All the ROIs are shifted by half a pixel compared to Detectron in order to create better image-feature-map alignment.
-     See `layers/roi_align.py` for details.
-     To enable the old behavior, use `ROIAlign(aligned=False)`, or `POOLER_TYPE=ROIAlign` instead of
-     `ROIAlignV2` (the default).
-
-  1. The ROIs are not required to have a minimum size of 1.
-     This will lead to tiny differences in the output, but should be negligible.
-
-- The mask inference function is different.
-
-  In Detectron2, the "paste_mask" function is different and should be more accurate than in Detectron. This change
-  can improve mask AP on COCO by ~0.5% absolute.
-
-There are some other differences in training as well, but they won't affect
-model-level compatibility. The major ones are:
-
-- We fixed a [bug](https://github.com/facebookresearch/Detectron/issues/459) in
-  Detectron by making `RPN.POST_NMS_TOPK_TRAIN` per-image, rather than per-batch.
-  The fix may lead to a small accuracy drop for a few models (e.g. keypoint
-  detection) and will require some parameter tuning to match the Detectron results.
-- For simplicity, we change the default loss in bounding box regression to L1 loss, instead of smooth L1 loss.
-  We have observed that this tends to slightly decrease box AP50 while improving box AP for higher
-  overlap thresholds (and leading to a slight overall improvement in box AP).
-- We interpret the coordinates in COCO bounding box and segmentation annotations
-  as coordinates in range `[0, width]` or `[0, height]`. The coordinates in
-  COCO keypoint annotations are interpreted as pixel indices in range `[0, width - 1]` or `[0, height - 1]`.
-  Note that this affects how flip augmentation is implemented.
-
-[This article](https://ppwwyyxx.com/blog/2021/Where-are-Pixels/)
-explains more details on the above-mentioned issues
-about pixels, coordinates, and "+1"s.
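-
-To make the box-size difference concrete, here is a minimal sketch in plain Python
-(the coordinates are hypothetical, chosen for illustration; this is not code from either codebase):
-
-```python
-# A box with corners (x1, y1) = (10, 20) and (x2, y2) = (30, 50).
-x1, y1, x2, y2 = 10.0, 20.0, 30.0, 50.0
-
-# Detectron2 convention: no "+ 1".
-w_d2, h_d2 = x2 - x1, y2 - y1          # 20.0, 30.0
-
-# Legacy Detectron convention: "+ 1" added to both width and height.
-w_d1, h_d1 = x2 - x1 + 1, y2 - y1 + 1  # 21.0, 31.0
-
-# The one-pixel difference propagates into box regression targets and IoU/NMS
-# computations, which is why the same weights give different results in the two code bases.
-```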
-
-
-## Compatibility with Caffe2
-
-As mentioned above, despite the incompatibilities with Detectron, the relevant
-ops have been implemented in Caffe2.
-Therefore, models trained with detectron2 can be converted to Caffe2.
-See [Deployment](../tutorials/deployment.md) for the tutorial.
-
-## Compatibility with TensorFlow
-
-Most ops are available in TensorFlow, although some tiny differences in
-the implementation of resize / ROIAlign / padding need to be addressed.
-A working conversion script is provided by [tensorpack Faster R-CNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN/convert_d2)
-to run a standard detectron2 model in TensorFlow.
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md b/grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md
deleted file mode 100755
index 9bab709c..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/notes/contributing.md
+++ /dev/null
@@ -1,68 +0,0 @@
-# Contributing to detectron2
-
-## Issues
-We use GitHub issues to track public bugs and questions.
-Please make sure to follow one of the
-[issue templates](https://github.com/facebookresearch/detectron2/issues/new/choose)
-when reporting any issues.
-
-Facebook has a [bounty program](https://www.facebook.com/whitehat/) for the safe
-disclosure of security bugs. In those cases, please go through the process
-outlined on that page and do not file a public issue.
-
-## Pull Requests
-We actively welcome pull requests.
-
-However, if you're adding any significant features (e.g. > 50 lines), please
-make sure to discuss your motivation and proposals with maintainers in an issue
-before sending a PR. This is to save your time, so you don't spend it on a PR that we won't accept.
-
-We do not always accept new features, and we take the following
-factors into consideration:
-
-1. Whether the same feature can be achieved without modifying detectron2.
-   Detectron2 is designed so that you can implement many extensions from the outside, e.g.
-   those in [projects](https://github.com/facebookresearch/detectron2/tree/master/projects).
-   * If some part of detectron2 is not extensible enough, you can also bring up a more general issue to
-     improve it. Such a feature request may be useful to more users.
-2. Whether the feature is potentially useful to a large audience (e.g. an impactful detection paper, a popular dataset,
-   a significant speedup, a widely useful utility),
-   or only to a small portion of users (e.g., a less-known paper, an improvement not in the object
-   detection field, a trick that's not very popular in the community, code to handle a non-standard type of data)
-   * Additional models, datasets, and new tasks are by default not added to detectron2 before they
-     receive significant popularity in the community.
-     We sometimes accept such features in `projects/`, or as a link in `projects/README.md`.
-3. Whether the proposed solution has a good design / interface. This can be discussed in the issue prior to PRs, or
-   in the form of a draft PR.
-4. Whether the proposed solution adds extra mental/practical overhead to users who don't
-   need such a feature.
-5. Whether the proposed solution breaks existing APIs.
-
-To add a feature to an existing function/class `Func`, there are always two approaches:
-(1) add new arguments to `Func`; (2) write a new `Func_with_new_feature`.
-To meet the above criteria, we often prefer approach (2), because:
-
-1. 
It does not involve modifying or potentially breaking existing code. -2. It does not add overhead to users who do not need the new feature. -3. Adding new arguments to a function/class is not scalable w.r.t. all the possible new research ideas in the future. - -When sending a PR, please do: - -1. If a PR contains multiple orthogonal changes, split it to several PRs. -2. If you've added code that should be tested, add tests. -3. For PRs that need experiments (e.g. adding a new model or new methods), - you don't need to update model zoo, but do provide experiment results in the description of the PR. -4. If APIs are changed, update the documentation. -5. We use the [Google style docstrings](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html) in python. -6. Make sure your code lints with `./dev/linter.sh`. - - -## Contributor License Agreement ("CLA") -In order to accept your pull request, we need you to submit a CLA. You only need -to do this once to work on any of Facebook's open source projects. - -Complete your CLA here: - -## License -By contributing to detectron2, you agree that your contributions will be licensed -under the LICENSE file in the root directory of this source tree. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst deleted file mode 100755 index 63cf907b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/notes/index.rst +++ /dev/null @@ -1,10 +0,0 @@ -Notes -====================================== - -.. toctree:: - :maxdepth: 2 - - benchmarks - compatibility - contributing - changelog diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt b/grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt deleted file mode 100755 index 58d3c2a5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/requirements.txt +++ /dev/null @@ -1,21 +0,0 @@ -docutils==0.16 -# https://github.com/sphinx-doc/sphinx/commit/7acd3ada3f38076af7b2b5c9f3b60bb9c2587a3d -sphinx==3.2.0 -recommonmark==0.6.0 -sphinx_rtd_theme -# Dependencies here are only those required by import -termcolor -numpy -tqdm -matplotlib -termcolor -yacs -tabulate -cloudpickle -Pillow -future -git+git://github.com/facebookresearch/fvcore.git -https://download.pytorch.org/whl/cpu/torch-1.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl -https://download.pytorch.org/whl/cpu/torchvision-0.9.1%2Bcpu-cp37-cp37m-linux_x86_64.whl -omegaconf>=2.1.0.dev24 -hydra-core>=1.1.0.dev5 diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md deleted file mode 100755 index 1ca9c94d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/README.md +++ /dev/null @@ -1,4 +0,0 @@ -# Read the docs: - -The latest documentation built from this directory is available at [detectron2.readthedocs.io](https://detectron2.readthedocs.io/). -Documents in this directory are not meant to be read on github. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md deleted file mode 100755 index 7601a082..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/augmentation.md +++ /dev/null @@ -1,186 +0,0 @@ - -# Data Augmentation - -Augmentation is an important part of training. -Detectron2's data augmentation system aims at addressing the following goals: - -1. 
Allow augmenting multiple data types together
-   (e.g., images together with their bounding boxes and masks)
-2. Allow applying a sequence of statically-declared augmentations
-3. Allow adding custom new data types to augment (rotated bounding boxes, video clips, etc.)
-4. Process and manipulate the __operations__ that are applied by augmentations
-
-The first two features cover most of the common use cases, and are also
-available in other libraries such as [albumentations](https://medium.com/pytorch/multi-target-in-albumentations-16a777e9006e).
-Supporting other features adds some overhead to detectron2's augmentation API,
-which we'll explain in this tutorial.
-
-This tutorial focuses on how to use augmentations when writing new data loaders,
-and how to write new augmentations.
-If you use the default data loader in detectron2, it already supports taking a user-provided list of custom augmentations,
-as explained in the [Dataloader tutorial](data_loading).
-
-## Basic Usage
-
-The basic usage of features (1) and (2) is as follows:
-```python
-from detectron2.data import transforms as T
-# Define a sequence of augmentations:
-augs = T.AugmentationList([
-    T.RandomBrightness(0.9, 1.1),
-    T.RandomFlip(prob=0.5),
-    T.RandomCrop("absolute", (640, 640))
-])  # type: T.Augmentation
-
-# Define the augmentation input ("image" required, others optional):
-input = T.AugInput(image, boxes=boxes, sem_seg=sem_seg)
-# Apply the augmentation:
-transform = augs(input)  # type: T.Transform
-image_transformed = input.image  # new image
-sem_seg_transformed = input.sem_seg  # new semantic segmentation
-
-# For any extra data that needs to be augmented together, use transform, e.g.:
-image2_transformed = transform.apply_image(image2)
-polygons_transformed = transform.apply_polygons(polygons)
-```
-
-Three basic concepts are involved here. They are:
-* [T.Augmentation](../modules/data_transforms.html#detectron2.data.transforms.Augmentation) defines the __"policy"__ to modify inputs.
-  * its `__call__(AugInput) -> Transform` method augments the inputs in-place, and returns the operation that is applied
-* [T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform)
-  implements the actual __operations__ to transform data
-  * it has methods such as `apply_image`, `apply_coords` that define how to transform each data type
-* [T.AugInput](../modules/data_transforms.html#detectron2.data.transforms.AugInput)
-  stores inputs needed by `T.Augmentation` and how they should be transformed.
-  This concept is needed for some advanced usage.
-  Using this class directly should be sufficient for all common use cases,
-  since extra data not in `T.AugInput` can be augmented using the returned
-  `transform`, as shown in the above example.
-
-## Write New Augmentations
-
-Most 2D augmentations only need to know about the input image.
Such augmentation can be implemented easily like this: - -```python -class MyColorAugmentation(T.Augmentation): - def get_transform(self, image): - r = np.random.rand(2) - return T.ColorTransform(lambda x: x * r[0] + r[1] * 10) - -class MyCustomResize(T.Augmentation): - def get_transform(self, image): - old_h, old_w = image.shape[:2] - new_h, new_w = int(old_h * np.random.rand()), int(old_w * 1.5) - return T.ResizeTransform(old_h, old_w, new_h, new_w) - -augs = MyCustomResize() -transform = augs(input) -``` - -In addition to image, any attributes of the given `AugInput` can be used as long -as they are part of the function signature, e.g.: - -```python -class MyCustomCrop(T.Augmentation): - def get_transform(self, image, sem_seg): - # decide where to crop using both image and sem_seg - return T.CropTransform(...) - -augs = MyCustomCrop() -assert hasattr(input, "image") and hasattr(input, "sem_seg") -transform = augs(input) -``` - -New transform operation can also be added by subclassing -[T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform). - -## Advanced Usage - -We give a few examples of advanced usages that -are enabled by our system. -These options can be interesting to new research, -although changing them is often not needed -for standard use cases. - -### Custom transform strategy - -Instead of only returning the augmented data, detectron2's `Augmentation` returns the __operations__ as `T.Transform`. -This allows users to apply custom transform strategy on their data. -We use keypoints data as an example. - -Keypoints are (x, y) coordinates, but they are not so trivial to augment due to the semantic meaning they carry. -Such meaning is only known to the users, therefore users may want to augment them manually -by looking at the returned `transform`. -For example, when an image is horizontally flipped, we'd like to swap the keypoint annotations for "left eye" and "right eye". -This can be done like this (included by default in detectron2's default data loader): -```python -# augs, input are defined as in previous examples -transform = augs(input) # type: T.Transform -keypoints_xy = transform.apply_coords(keypoints_xy) # transform the coordinates - -# get a list of all transforms that were applied -transforms = T.TransformList([transform]).transforms -# check if it is flipped for odd number of times -do_hflip = sum(isinstance(t, T.HFlipTransform) for t in transforms) % 2 == 1 -if do_hflip: - keypoints_xy = keypoints_xy[flip_indices_mapping] -``` - -As another example, keypoints annotations often have a "visibility" field. -A sequence of augmentations might augment a visible keypoint out of the image boundary (e.g. with cropping), -but then bring it back within the boundary afterwards (e.g. with image padding). -If users decide to label such keypoints "invisible", -then the visibility check has to happen after every transform step. 
-This can be achieved by:
-
-```python
-transform = augs(input)  # type: T.TransformList
-assert isinstance(transform, T.TransformList)
-for t in transform.transforms:
-    keypoints_xy = t.apply_coords(keypoints_xy)
-    visibility &= ((keypoints_xy >= [0, 0]) & (keypoints_xy <= [W, H])).all(axis=1)
-
-# btw, detectron2's `transform_keypoint_annotations` function chooses to label such keypoints "visible":
-# keypoints_xy = transform.apply_coords(keypoints_xy)
-# visibility &= ((keypoints_xy >= [0, 0]) & (keypoints_xy <= [W, H])).all(axis=1)
-```
-
-
-### Geometrically invert the transform
-If images are pre-processed by augmentations before inference, the predicted results
-such as segmentation masks are localized on the augmented image.
-We'd like to invert the applied augmentation with the [inverse()](../modules/data_transforms.html#detectron2.data.transforms.Transform.inverse)
-API, to obtain results on the original image:
-```python
-transform = augs(input)
-pred_mask = make_prediction(input.image)
-inv_transform = transform.inverse()
-pred_mask_orig = inv_transform.apply_segmentation(pred_mask)
-```
-
-### Add new data types
-
-[T.Transform](../modules/data_transforms.html#detectron2.data.transforms.Transform)
-supports a few common data types to transform, including images, coordinates, masks, boxes, polygons.
-It allows registering new data types, e.g.:
-```python
-@T.HFlipTransform.register_type("rotated_boxes")
-def func(flip_transform: T.HFlipTransform, rotated_boxes: Any):
-    # do the work
-    return flipped_rotated_boxes
-
-t = HFlipTransform(width=800)
-transformed_rotated_boxes = t.apply_rotated_boxes(rotated_boxes)  # func will be called
-```
-
-### Extend T.AugInput
-
-An augmentation can only access attributes available in the given input.
-[T.AugInput](../modules/data_transforms.html#detectron2.data.transforms.StandardAugInput) defines "image", "boxes", "sem_seg",
-which are sufficient for common augmentation strategies to decide how to augment.
-If not, a custom implementation is needed.
-
-By re-implementing the "transform()" method in AugInput, it is also possible to
-augment different fields in ways that are dependent on each other.
-Such use cases are uncommon (e.g. post-processing bounding boxes based on augmented masks), but allowed by the system.
-
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
deleted file mode 100755
index 0eb44cc3..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/builtin_datasets.md
+++ /dev/null
@@ -1,140 +0,0 @@
-# Use Builtin Datasets
-
-A dataset can be used by accessing [DatasetCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.DatasetCatalog)
-for its data, or [MetadataCatalog](https://detectron2.readthedocs.io/modules/data.html#detectron2.data.MetadataCatalog) for its metadata (class names, etc).
-This document explains how to set up the builtin datasets so they can be used by the above APIs.
-[Use Custom Datasets](https://detectron2.readthedocs.io/tutorials/datasets.html) gives a deeper dive on how to use `DatasetCatalog` and `MetadataCatalog`,
-and how to add new datasets to them.
-
-Detectron2 has builtin support for a few datasets.
-The datasets are assumed to exist in a directory specified by the environment variable
-`DETECTRON2_DATASETS`.
-Under this directory, detectron2 will look for datasets in the structure described below, if needed.
-``` -$DETECTRON2_DATASETS/ - coco/ - lvis/ - cityscapes/ - VOC20{07,12}/ -``` - -You can set the location for builtin datasets by `export DETECTRON2_DATASETS=/path/to/datasets`. -If left unset, the default is `./datasets` relative to your current working directory. - -The [model zoo](https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md) -contains configs and models that use these builtin datasets. - -## Expected dataset structure for [COCO instance/keypoint detection](https://cocodataset.org/#download): - -``` -coco/ - annotations/ - instances_{train,val}2017.json - person_keypoints_{train,val}2017.json - {train,val}2017/ - # image files that are mentioned in the corresponding json -``` - -You can use the 2014 version of the dataset as well. - -Some of the builtin tests (`dev/run_*_tests.sh`) uses a tiny version of the COCO dataset, -which you can download with `./datasets/prepare_for_tests.sh`. - -## Expected dataset structure for PanopticFPN: - -Extract panoptic annotations from [COCO website](https://cocodataset.org/#download) -into the following structure: -``` -coco/ - annotations/ - panoptic_{train,val}2017.json - panoptic_{train,val}2017/ # png annotations - panoptic_stuff_{train,val}2017/ # generated by the script mentioned below -``` - -Install panopticapi by: -``` -pip install git+https://github.com/cocodataset/panopticapi.git -``` -Then, run `python datasets/prepare_panoptic_fpn.py`, to extract semantic annotations from panoptic annotations. - -## Expected dataset structure for [LVIS instance segmentation](https://www.lvisdataset.org/dataset): -``` -coco/ - {train,val,test}2017/ -lvis/ - lvis_v0.5_{train,val}.json - lvis_v0.5_image_info_test.json - lvis_v1_{train,val}.json - lvis_v1_image_info_test{,_challenge}.json -``` - -Install lvis-api by: -``` -pip install git+https://github.com/lvis-dataset/lvis-api.git -``` - -To evaluate models trained on the COCO dataset using LVIS annotations, -run `python datasets/prepare_cocofied_lvis.py` to prepare "cocofied" LVIS annotations. - -## Expected dataset structure for [cityscapes](https://www.cityscapes-dataset.com/downloads/): -``` -cityscapes/ - gtFine/ - train/ - aachen/ - color.png, instanceIds.png, labelIds.png, polygons.json, - labelTrainIds.png - ... - val/ - test/ - # below are generated Cityscapes panoptic annotation - cityscapes_panoptic_train.json - cityscapes_panoptic_train/ - cityscapes_panoptic_val.json - cityscapes_panoptic_val/ - cityscapes_panoptic_test.json - cityscapes_panoptic_test/ - leftImg8bit/ - train/ - val/ - test/ -``` -Install cityscapes scripts by: -``` -pip install git+https://github.com/mcordts/cityscapesScripts.git -``` - -Note: to create labelTrainIds.png, first prepare the above structure, then run cityscapesescript with: -``` -CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createTrainIdLabelImgs.py -``` -These files are not needed for instance segmentation. - -Note: to generate Cityscapes panoptic dataset, run cityscapesescript with: -``` -CITYSCAPES_DATASET=/path/to/abovementioned/cityscapes python cityscapesscripts/preparation/createPanopticImgs.py -``` -These files are not needed for semantic and instance segmentation. 
- -## Expected dataset structure for [Pascal VOC](http://host.robots.ox.ac.uk/pascal/VOC/index.html): -``` -VOC20{07,12}/ - Annotations/ - ImageSets/ - Main/ - trainval.txt - test.txt - # train.txt or val.txt, if you use these splits - JPEGImages/ -``` - -## Expected dataset structure for [ADE20k Scene Parsing](http://sceneparsing.csail.mit.edu/): -``` -ADEChallengeData2016/ - annotations/ - annotations_detectron2/ - images/ - objectInfo150.txt -``` -The directory `annotations_detectron2` is generated by running `python datasets/prepare_ade20k_sem_seg.py`. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md deleted file mode 100755 index 751e4eb6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/configs.md +++ /dev/null @@ -1,62 +0,0 @@ -# Yacs Configs - -Detectron2 provides a key-value based config system that can be -used to obtain standard, common behaviors. - -This system uses YAML and [yacs](https://github.com/rbgirshick/yacs). -Yaml is a very limited language, -so we do not expect all features in detectron2 to be available through configs. -If you need something that's not available in the config space, -please write code using detectron2's API. - -With the introduction of a more powerful [LazyConfig system](lazyconfigs.md), -we no longer add functionality / new keys to the Yacs/Yaml-based config system. - -### Basic Usage - -Some basic usage of the `CfgNode` object is shown here. See more in [documentation](../modules/config.html#detectron2.config.CfgNode). -```python -from detectron2.config import get_cfg -cfg = get_cfg() # obtain detectron2's default config -cfg.xxx = yyy # add new configs for your own custom components -cfg.merge_from_file("my_cfg.yaml") # load values from a file - -cfg.merge_from_list(["MODEL.WEIGHTS", "weights.pth"]) # can also load values from a list of str -print(cfg.dump()) # print formatted configs -with open("output.yaml", "w") as f: - f.write(cfg.dump()) # save config to file -``` - -In addition to the basic Yaml syntax, the config file can -define a `_BASE_: base.yaml` field, which will load a base config file first. -Values in the base config will be overwritten in sub-configs, if there are any conflicts. -We provided several base configs for standard model architectures. - -Many builtin tools in detectron2 accept command line config overwrite: -Key-value pairs provided in the command line will overwrite the existing values in the config file. -For example, [demo.py](../../demo/demo.py) can be used with -``` -./demo.py --config-file config.yaml [--other-options] \ - --opts MODEL.WEIGHTS /path/to/weights INPUT.MIN_SIZE_TEST 1000 -``` - -To see a list of available configs in detectron2 and what they mean, -check [Config References](../modules/config.html#config-references) - -### Configs in Projects - -A project that lives outside the detectron2 library may define its own configs, which will need to be added -for the project to be functional, e.g.: -```python -from detectron2.projects.point_rend import add_pointrend_config -cfg = get_cfg() # obtain detectron2's default config -add_pointrend_config(cfg) # add pointrend's default config -# ... ... -``` - -### Best Practice with Configs - -1. Treat the configs you write as "code": avoid copying them or duplicating them; use `_BASE_` - to share common parts between configs. - -2. Keep the configs you write simple: don't include keys that do not affect the experimental setting. 
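-
-As a small illustration of the `_BASE_` mechanism described above, here is a
-self-contained sketch (the file names and values are hypothetical; the keys used
-are standard detectron2 defaults):
-
-```python
-from detectron2.config import get_cfg
-
-# Write a hypothetical base config, and a child config that inherits from it.
-with open("base.yaml", "w") as f:
-    f.write("SOLVER:\n  BASE_LR: 0.02\n  MAX_ITER: 90000\n")
-with open("my_cfg.yaml", "w") as f:
-    f.write("_BASE_: base.yaml\nSOLVER:\n  BASE_LR: 0.01\n")
-
-cfg = get_cfg()                     # obtain detectron2's default config
-cfg.merge_from_file("my_cfg.yaml")  # loads base.yaml first, then applies overrides
-print(cfg.SOLVER.BASE_LR)           # 0.01  -- overridden by my_cfg.yaml
-print(cfg.SOLVER.MAX_ITER)          # 90000 -- inherited from base.yaml
-```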
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md deleted file mode 100755 index 1d2769fc..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/data_loading.md +++ /dev/null @@ -1,95 +0,0 @@ - -# Dataloader - -Dataloader is the component that provides data to models. -A dataloader usually (but not necessarily) takes raw information from [datasets](./datasets.md), -and process them into a format needed by the model. - -## How the Existing Dataloader Works - -Detectron2 contains a builtin data loading pipeline. -It's good to understand how it works, in case you need to write a custom one. - -Detectron2 provides two functions -[build_detection_{train,test}_loader](../modules/data.html#detectron2.data.build_detection_train_loader) -that create a default data loader from a given config. -Here is how `build_detection_{train,test}_loader` work: - -1. It takes the name of a registered dataset (e.g., "coco_2017_train") and loads a `list[dict]` representing the dataset items - in a lightweight format. These dataset items are not yet ready to be used by the model (e.g., images are - not loaded into memory, random augmentations have not been applied, etc.). - Details about the dataset format and dataset registration can be found in - [datasets](./datasets.md). -2. Each dict in this list is mapped by a function ("mapper"): - * Users can customize this mapping function by specifying the "mapper" argument in - `build_detection_{train,test}_loader`. The default mapper is [DatasetMapper](../modules/data.html#detectron2.data.DatasetMapper). - * The output format of the mapper can be arbitrary, as long as it is accepted by the consumer of this data loader (usually the model). - The outputs of the default mapper, after batching, follow the default model input format documented in - [Use Models](./models.html#model-input-format). - * The role of the mapper is to transform the lightweight representation of a dataset item into a format - that is ready for the model to consume (including, e.g., read images, perform random data augmentation and convert to torch Tensors). - If you would like to perform custom transformations to data, you often want a custom mapper. -3. The outputs of the mapper are batched (simply into a list). -4. This batched data is the output of the data loader. Typically, it's also the input of - `model.forward()`. - - -## Write a Custom Dataloader - -Using a different "mapper" with `build_detection_{train,test}_loader(mapper=)` works for most use cases -of custom data loading. 
-For example, if you want to resize all images to a fixed size for training, use: - -```python -import detectron2.data.transforms as T -from detectron2.data import DatasetMapper # the default mapper -dataloader = build_detection_train_loader(cfg, - mapper=DatasetMapper(cfg, is_train=True, augmentations=[ - T.Resize((800, 800)) - ])) -# use this dataloader instead of the default -``` -If the arguments of the default [DatasetMapper](../modules/data.html#detectron2.data.DatasetMapper) -does not provide what you need, you may write a custom mapper function and use it instead, e.g.: - -```python -from detectron2.data import detection_utils as utils - # Show how to implement a minimal mapper, similar to the default DatasetMapper -def mapper(dataset_dict): - dataset_dict = copy.deepcopy(dataset_dict) # it will be modified by code below - # can use other ways to read image - image = utils.read_image(dataset_dict["file_name"], format="BGR") - # See "Data Augmentation" tutorial for details usage - auginput = T.AugInput(image) - transform = T.Resize((800, 800))(auginput) - image = torch.from_numpy(auginput.image.transpose(2, 0, 1)) - annos = [ - utils.transform_instance_annotations(annotation, [transform], image.shape[1:]) - for annotation in dataset_dict.pop("annotations") - ] - return { - # create the format that the model expects - "image": image, - "instances": utils.annotations_to_instances(annos, image.shape[1:]) - } -dataloader = build_detection_train_loader(cfg, mapper=mapper) -``` - -If you want to change not only the mapper (e.g., in order to implement different sampling or batching logic), -`build_detection_train_loader` won't work and you will need to write a different data loader. -The data loader is simply a -python iterator that produces [the format](./models.md) that the model accepts. -You can implement it using any tools you like. - -No matter what to implement, it's recommended to -check out [API documentation of detectron2.data](../modules/data) to learn more about the APIs of -these functions. - -## Use a Custom Dataloader - -If you use [DefaultTrainer](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer), -you can overwrite its `build_{train,test}_loader` method to use your own dataloader. -See the [deeplab dataloader](../../projects/DeepLab/train_net.py) -for an example. - -If you write your own training loop, you can plug in your data loader easily. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md deleted file mode 100755 index 91103f64..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/datasets.md +++ /dev/null @@ -1,290 +0,0 @@ -# Use Custom Datasets - -This document explains how the dataset APIs -([DatasetCatalog](../modules/data.html#detectron2.data.DatasetCatalog), [MetadataCatalog](../modules/data.html#detectron2.data.MetadataCatalog)) -work, and how to use them to add custom datasets. - -Datasets that have builtin support in detectron2 are listed in [builtin datasets](builtin_datasets.md). -If you want to use a custom dataset while also reusing detectron2's data loaders, -you will need to: - -1. __Register__ your dataset (i.e., tell detectron2 how to obtain your dataset). -2. Optionally, __register metadata__ for your dataset. - -Next, we explain the above two concepts in detail. 
- -The [Colab tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -has a live example of how to register and train on a dataset of custom formats. - -### Register a Dataset - -To let detectron2 know how to obtain a dataset named "my_dataset", users need to implement -a function that returns the items in your dataset and then tell detectron2 about this -function: -```python -def my_dataset_function(): - ... - return list[dict] in the following format - -from detectron2.data import DatasetCatalog -DatasetCatalog.register("my_dataset", my_dataset_function) -# later, to access the data: -data: List[Dict] = DatasetCatalog.get("my_dataset") -``` - -Here, the snippet associates a dataset named "my_dataset" with a function that returns the data. -The function must return the same data (with same order) if called multiple times. -The registration stays effective until the process exits. - -The function can do arbitrary things and should return the data in `list[dict]`, each dict in either -of the following formats: -1. Detectron2's standard dataset dict, described below. This will make it work with many other builtin - features in detectron2, so it's recommended to use it when it's sufficient. -2. Any custom format. You can also return arbitrary dicts in your own format, - such as adding extra keys for new tasks. - Then you will need to handle them properly downstream as well. - See below for more details. - -#### Standard Dataset Dicts - -For standard tasks -(instance detection, instance/semantic/panoptic segmentation, keypoint detection), -we load the original dataset into `list[dict]` with a specification similar to COCO's annotations. -This is our standard representation for a dataset. - -Each dict contains information about one image. -The dict may have the following fields, -and the required fields vary based on what the dataloader or the task needs (see more below). - -```eval_rst -.. list-table:: - :header-rows: 1 - - * - Task - - Fields - * - Common - - file_name, height, width, image_id - - * - Instance detection/segmentation - - annotations - - * - Semantic segmentation - - sem_seg_file_name - - * - Panoptic segmentation - - pan_seg_file_name, segments_info -``` - -+ `file_name`: the full path to the image file. -+ `height`, `width`: integer. The shape of the image. -+ `image_id` (str or int): a unique id that identifies this image. Required by many - evaluators to identify the images, but a dataset may use it for different purposes. -+ `annotations` (list[dict]): Required by __instance detection/segmentation or keypoint detection__ tasks. - Each dict corresponds to annotations of one instance in this image, and - may contain the following keys: - + `bbox` (list[float], required): list of 4 numbers representing the bounding box of the instance. - + `bbox_mode` (int, required): the format of bbox. It must be a member of - [structures.BoxMode](../modules/structures.html#detectron2.structures.BoxMode). - Currently supports: `BoxMode.XYXY_ABS`, `BoxMode.XYWH_ABS`. - + `category_id` (int, required): an integer in the range [0, num_categories-1] representing the category label. - The value num_categories is reserved to represent the "background" category, if applicable. - + `segmentation` (list[list[float]] or dict): the segmentation mask of the instance. - + If `list[list[float]]`, it represents a list of polygons, one for each connected component - of the object. Each `list[float]` is one simple polygon in the format of `[x1, y1, ..., xn, yn]` (n≥3). 
- The Xs and Ys are absolute coordinates in unit of pixels. - + If `dict`, it represents the per-pixel segmentation mask in COCO's compressed RLE format. - The dict should have keys "size" and "counts". You can convert a uint8 segmentation mask of 0s and - 1s into such dict by `pycocotools.mask.encode(np.asarray(mask, order="F"))`. - `cfg.INPUT.MASK_FORMAT` must be set to `bitmask` if using the default data loader with such format. - + `keypoints` (list[float]): in the format of [x1, y1, v1,..., xn, yn, vn]. - v[i] means the [visibility](http://cocodataset.org/#format-data) of this keypoint. - `n` must be equal to the number of keypoint categories. - The Xs and Ys are absolute real-value coordinates in range [0, W or H]. - - (Note that the keypoint coordinates in COCO format are integers in range [0, W-1 or H-1], which is different - from our standard format. Detectron2 adds 0.5 to COCO keypoint coordinates to convert them from discrete - pixel indices to floating point coordinates.) - + `iscrowd`: 0 (default) or 1. Whether this instance is labeled as COCO's "crowd - region". Don't include this field if you don't know what it means. - - If `annotations` is an empty list, it means the image is labeled to have no objects. - Such images will by default be removed from training, - but can be included using `DATALOADER.FILTER_EMPTY_ANNOTATIONS`. - -+ `sem_seg_file_name` (str): - The full path to the semantic segmentation ground truth file. - It should be a grayscale image whose pixel values are integer labels. -+ `pan_seg_file_name` (str): - The full path to panoptic segmentation ground truth file. - It should be an RGB image whose pixel values are integer ids encoded using the - [panopticapi.utils.id2rgb](https://github.com/cocodataset/panopticapi/) function. - The ids are defined by `segments_info`. - If an id does not appear in `segments_info`, the pixel is considered unlabeled - and is usually ignored in training & evaluation. -+ `segments_info` (list[dict]): defines the meaning of each id in panoptic segmentation ground truth. - Each dict has the following keys: - + `id` (int): integer that appears in the ground truth image. - + `category_id` (int): an integer in the range [0, num_categories-1] representing the category label. - + `iscrowd`: 0 (default) or 1. Whether this instance is labeled as COCO's "crowd region". - - -```eval_rst - -.. note:: - - The PanopticFPN model does not use the panoptic segmentation - format defined here, but a combination of both instance segmentation and semantic segmentation data - format. See :doc:`builtin_datasets` for instructions on COCO. - -``` - -Fast R-CNN (with pre-computed proposals) models are rarely used today. -To train a Fast R-CNN, the following extra keys are needed: - -+ `proposal_boxes` (array): 2D numpy array with shape (K, 4) representing K precomputed proposal boxes for this image. -+ `proposal_objectness_logits` (array): numpy array with shape (K, ), which corresponds to the objectness - logits of proposals in 'proposal_boxes'. -+ `proposal_bbox_mode` (int): the format of the precomputed proposal bbox. - It must be a member of - [structures.BoxMode](../modules/structures.html#detectron2.structures.BoxMode). - Default is `BoxMode.XYXY_ABS`. - - - -#### Custom Dataset Dicts for New Tasks - -In the `list[dict]` that your dataset function returns, the dictionary can also have __arbitrary custom data__. -This will be useful for a new task that needs extra information not covered -by the standard dataset dicts. 
In this case, you need to make sure the downstream code can handle your data -correctly. Usually this requires writing a new `mapper` for the dataloader (see [Use Custom Dataloaders](./data_loading.md)). - -When designing a custom format, note that all dicts are stored in memory -(sometimes serialized and with multiple copies). -To save memory, each dict is meant to contain __small__ but sufficient information -about each sample, such as file names and annotations. -Loading full samples typically happens in the data loader. - -For attributes shared among the entire dataset, use `Metadata` (see below). -To avoid extra memory, do not save such information inside each sample. - -### "Metadata" for Datasets - -Each dataset is associated with some metadata, accessible through -`MetadataCatalog.get(dataset_name).some_metadata`. -Metadata is a key-value mapping that contains information that's shared among -the entire dataset, and usually is used to interpret what's in the dataset, e.g., -names of classes, colors of classes, root of files, etc. -This information will be useful for augmentation, evaluation, visualization, logging, etc. -The structure of metadata depends on what is needed from the corresponding downstream code. - -If you register a new dataset through `DatasetCatalog.register`, -you may also want to add its corresponding metadata through -`MetadataCatalog.get(dataset_name).some_key = some_value`, to enable any features that need the metadata. -You can do it like this (using the metadata key "thing_classes" as an example): - -```python -from detectron2.data import MetadataCatalog -MetadataCatalog.get("my_dataset").thing_classes = ["person", "dog"] -``` - -Here is a list of metadata keys that are used by builtin features in detectron2. -If you add your own dataset without these metadata, some features may be -unavailable to you: - -* `thing_classes` (list[str]): Used by all instance detection/segmentation tasks. - A list of names for each instance/thing category. - If you load a COCO format dataset, it will be automatically set by the function `load_coco_json`. - -* `thing_colors` (list[tuple(r, g, b)]): Pre-defined color (in [0, 255]) for each thing category. - Used for visualization. If not given, random colors will be used. - -* `stuff_classes` (list[str]): Used by semantic and panoptic segmentation tasks. - A list of names for each stuff category. - -* `stuff_colors` (list[tuple(r, g, b)]): Pre-defined color (in [0, 255]) for each stuff category. - Used for visualization. If not given, random colors are used. - -* `ignore_label` (int): Used by semantic and panoptic segmentation tasks. Pixels in ground-truth - annotations with this category label should be ignored in evaluation. Typically these are "unlabeled" - pixels. - -* `keypoint_names` (list[str]): Used by keypoint detection. A list of names for each keypoint. - -* `keypoint_flip_map` (list[tuple[str]]): Used by keypoint detection. A list of pairs of names, - where each pair are the two keypoints that should be flipped if the image is - flipped horizontally during augmentation. -* `keypoint_connection_rules`: list[tuple(str, str, (r, g, b))]. Each tuple specifies a pair of keypoints - that are connected and the color (in [0, 255]) to use for the line between them when visualized. - -Some additional metadata that are specific to the evaluation of certain datasets (e.g. COCO): - -* `thing_dataset_id_to_contiguous_id` (dict[int->int]): Used by all instance detection/segmentation tasks in the COCO format. 
- A mapping from instance class ids in the dataset to contiguous ids in range [0, #class). - Will be automatically set by the function `load_coco_json`. - -* `stuff_dataset_id_to_contiguous_id` (dict[int->int]): Used when generating prediction json files for - semantic/panoptic segmentation. - A mapping from semantic segmentation class ids in the dataset - to contiguous ids in [0, num_categories). It is useful for evaluation only. - -* `json_file`: The COCO annotation json file. Used by COCO evaluation for COCO-format datasets. -* `panoptic_root`, `panoptic_json`: Used by COCO-format panoptic evaluation. -* `evaluator_type`: Used by the builtin main training script to select - evaluator. Don't use it in a new training script. - You can just provide the [DatasetEvaluator](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluator) - for your dataset directly in your main script. - -```eval_rst -.. note:: - - In recognition, sometimes we use the term "thing" for instance-level tasks, - and "stuff" for semantic segmentation tasks. - Both are used in panoptic segmentation tasks. - For background on the concept of "thing" and "stuff", see - `On Seeing Stuff: The Perception of Materials by Humans and Machines - `_. -``` - -### Register a COCO Format Dataset - -If your instance-level (detection, segmentation, keypoint) dataset is already a json file in the COCO format, -the dataset and its associated metadata can be registered easily with: -```python -from detectron2.data.datasets import register_coco_instances -register_coco_instances("my_dataset", {}, "json_annotation.json", "path/to/image/dir") -``` - -If your dataset is in COCO format but need to be further processed, or has extra custom per-instance annotations, -the [load_coco_json](../modules/data.html#detectron2.data.datasets.load_coco_json) -function might be useful. - -### Update the Config for New Datasets - -Once you've registered the dataset, you can use the name of the dataset (e.g., "my_dataset" in -example above) in `cfg.DATASETS.{TRAIN,TEST}`. -There are other configs you might want to change to train or evaluate on new datasets: - -* `MODEL.ROI_HEADS.NUM_CLASSES` and `MODEL.RETINANET.NUM_CLASSES` are the number of thing classes - for R-CNN and RetinaNet models, respectively. -* `MODEL.ROI_KEYPOINT_HEAD.NUM_KEYPOINTS` sets the number of keypoints for Keypoint R-CNN. - You'll also need to set [Keypoint OKS](http://cocodataset.org/#keypoints-eval) - with `TEST.KEYPOINT_OKS_SIGMAS` for evaluation. -* `MODEL.SEM_SEG_HEAD.NUM_CLASSES` sets the number of stuff classes for Semantic FPN & Panoptic FPN. -* `TEST.DETECTIONS_PER_IMAGE` controls the maximum number of objects to be detected. - Set it to a larger number if test images may contain >100 objects. -* If you're training Fast R-CNN (with precomputed proposals), `DATASETS.PROPOSAL_FILES_{TRAIN,TEST}` - need to match the datasets. The format of proposal files are documented - [here](../modules/data.html#detectron2.data.load_proposals_into_dataset). - -New models -(e.g. [TensorMask](../../projects/TensorMask), -[PointRend](../../projects/PointRend)) -often have similar configs of their own that need to be changed as well. - -```eval_rst -.. tip:: - - After changing the number of classes, certain layers in a pre-trained model will become incompatible - and therefore cannot be loaded to the new model. - This is expected, and loading such pre-trained models will produce warnings about such layers. 
-``` diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md deleted file mode 100755 index 173b9a0e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/deployment.md +++ /dev/null @@ -1,137 +0,0 @@ -# Deployment - -Models written in Python need to go through an export process to become a deployable artifact. -A few basic concepts about this process: - -__"Export method"__ is how a Python model is fully serialized to a deployable format. -We support the following export methods: - -* `tracing`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) to learn about it -* `scripting`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) to learn about it -* `caffe2_tracing`: replace parts of the model by caffe2 operators, then use tracing. - -__"Format"__ is how a serialized model is described in a file, e.g. -TorchScript, Caffe2 protobuf, ONNX format. -__"Runtime"__ is an engine that loads a serialized model and executes it, -e.g., PyTorch, Caffe2, TensorFlow, onnxruntime, TensorRT, etc. -A runtime is often tied to a specific format -(e.g. PyTorch needs TorchScript format, Caffe2 needs protobuf format). -We currently support the following combination and each has some limitations: - -```eval_rst -+----------------------------+-------------+-------------+-----------------------------+ -| Export Method | tracing | scripting | caffe2_tracing | -+============================+=============+=============+=============================+ -| **Formats** | TorchScript | TorchScript | Caffe2, TorchScript, ONNX | -+----------------------------+-------------+-------------+-----------------------------+ -| **Runtime** | PyTorch | PyTorch | Caffe2, PyTorch | -+----------------------------+-------------+-------------+-----------------------------+ -| C++/Python inference | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| Dynamic resolution | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| Batch size requirement | Constant | Dynamic | Batch inference unsupported | -+----------------------------+-------------+-------------+-----------------------------+ -| Extra runtime deps | torchvision | torchvision | Caffe2 ops (usually already | -| | | | | -| | | | included in PyTorch) | -+----------------------------+-------------+-------------+-----------------------------+ -| Faster/Mask/Keypoint R-CNN | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| RetinaNet | ✅ | ✅ | ✅ | -+----------------------------+-------------+-------------+-----------------------------+ -| PointRend R-CNN | ✅ | ❌ | ❌ | -+----------------------------+-------------+-------------+-----------------------------+ -| Cascade R-CNN | ✅ | ❌ | ❌ | -+----------------------------+-------------+-------------+-----------------------------+ - -``` - -`caffe2_tracing` is going to be deprecated. -We don't plan to work on additional support for other formats/runtime, but contributions are welcome. - - -## Deployment with Tracing or Scripting - -Models can be exported to TorchScript format, by either -[tracing or scripting](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html). 
-The output model file can be loaded without detectron2 dependency in either Python or C++. -The exported model often requires torchvision (or its C++ library) dependency for some custom ops. - -This feature requires PyTorch ≥ 1.8. - -### Coverage -Most official models under the meta architectures `GeneralizedRCNN` and `RetinaNet` -are supported in both tracing and scripting mode. -Cascade R-CNN and PointRend are currently supported in tracing. -Users' custom extensions are supported if they are also scriptable or traceable. - -For models exported with tracing, dynamic input resolution is allowed, but batch size -(number of input images) must be fixed. -Scripting can support dynamic batch size. - -### Usage - -The main export APIs for tracing and scripting are [TracingAdapter](../modules/export.html#detectron2.export.TracingAdapter) -and [scripting_with_instances](../modules/export.html#detectron2.export.scripting_with_instances). -Their usage is currently demonstrated in [test_export_torchscript.py](../../tests/test_export_torchscript.py) -(see `TestScripting` and `TestTracing`) -as well as the [deployment example](../../tools/deploy). -Please check that these examples can run, and then modify for your use cases. -The usage now requires some user effort and necessary knowledge for each model to workaround the limitation of scripting and tracing. -In the future we plan to wrap these under simpler APIs to lower the bar to use them. - -## Deployment with Caffe2-tracing -We provide [Caffe2Tracer](../modules/export.html#detectron2.export.Caffe2Tracer) -that performs the export logic. -It replaces parts of the model with Caffe2 operators, -and then export the model into Caffe2, TorchScript or ONNX format. - -The converted model is able to run in either Python or C++ without detectron2/torchvision dependency, on CPU or GPUs. -It has a runtime optimized for CPU & mobile inference, but not optimized for GPU inference. - -This feature requires 1.9 > ONNX ≥ 1.6. - -### Coverage - -Most official models under these 3 common meta architectures: `GeneralizedRCNN`, `RetinaNet`, `PanopticFPN` -are supported. Cascade R-CNN is not supported. Batch inference is not supported. - -Users' custom extensions under these architectures (added through registration) are supported -as long as they do not contain control flow or operators not available in Caffe2 (e.g. deformable convolution). -For example, custom backbones and heads are often supported out of the box. - -### Usage - -The APIs are listed at [the API documentation](../modules/export). -We provide [export_model.py](../../tools/deploy/) as an example that uses -these APIs to convert a standard model. For custom models/datasets, you can add them to this script. - -### Use the model in C++/Python - -The model can be loaded in C++ and deployed with -either Caffe2 or Pytorch runtime.. [C++ examples](../../tools/deploy/) for Mask R-CNN -are given as a reference. Note that: - -* Models exported with `caffe2_tracing` method take a special input format - described in [documentation](../modules/export.html#detectron2.export.Caffe2Tracer). - This was taken care of in the C++ example. - -* The converted models do not contain post-processing operations that - transform raw layer outputs into formatted predictions. - For example, the C++ examples only produce raw outputs (28x28 masks) from the final - layers that are not post-processed, because in actual deployment, an application often needs - its custom lightweight post-processing, so this step is left for users. 
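-
-For reference, a minimal Python-side sketch of this export path, assuming `cfg`, `model`,
-and one batch of standard-format `inputs` were built as in the earlier tutorials (the
-output directory name is an arbitrary example):
-
-```python
-from detectron2.export import Caffe2Tracer
-
-tracer = Caffe2Tracer(cfg, model, inputs)   # replaces some ops with caffe2 operators, then traces
-c2_model = tracer.export_caffe2()           # a Caffe2Model, runnable without detectron2
-c2_model.save_protobuf("./caffe2_model")    # serialize the protobufs for C++ deployment
-outputs = c2_model(inputs)                  # the python wrapper described below
-```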
- -To help use the Caffe2-format model in python, -we provide a python wrapper around the converted model, in the -[Caffe2Model.\_\_call\_\_](../modules/export.html#detectron2.export.Caffe2Model.__call__) method. -This method has an interface that's identical to the [pytorch versions of models](./models.md), -and it internally applies pre/post-processing code to match the formats. -This wrapper can serve as a reference for how to use Caffe2's python API, -or for how to implement pre/post-processing in actual deployment. - -## Conversion to TensorFlow -[tensorpack Faster R-CNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN/convert_d2) -provides scripts to convert a few standard detectron2 R-CNN models to TensorFlow's pb format. -It works by translating configs and weights, therefore only support a few models. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md deleted file mode 100755 index bd924a3b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/evaluation.md +++ /dev/null @@ -1,68 +0,0 @@ - -# Evaluation - -Evaluation is a process that takes a number of inputs/outputs pairs and aggregate them. -You can always [use the model](./models.md) directly and just parse its inputs/outputs manually to perform -evaluation. -Alternatively, evaluation is implemented in detectron2 using the [DatasetEvaluator](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluator) -interface. - -Detectron2 includes a few `DatasetEvaluator` that computes metrics using standard dataset-specific -APIs (e.g., COCO, LVIS). -You can also implement your own `DatasetEvaluator` that performs some other jobs -using the inputs/outputs pairs. -For example, to count how many instances are detected on the validation set: - -``` -class Counter(DatasetEvaluator): - def reset(self): - self.count = 0 - def process(self, inputs, outputs): - for output in outputs: - self.count += len(output["instances"]) - def evaluate(self): - # save self.count somewhere, or print it, or return it. - return {"count": self.count} -``` - -## Use evaluators - -To evaluate using the methods of evaluators manually: -``` -def get_all_inputs_outputs(): - for data in data_loader: - yield data, model(data) - -evaluator.reset() -for inputs, outputs in get_all_inputs_outputs(): - evaluator.process(inputs, outputs) -eval_results = evaluator.evaluate() -``` - -Evaluators can also be used with [inference_on_dataset](../modules/evaluation.html#detectron2.evaluation.inference_on_dataset). -For example, - -```python -eval_results = inference_on_dataset( - model, - data_loader, - DatasetEvaluators([COCOEvaluator(...), Counter()])) -``` -This will execute `model` on all inputs from `data_loader`, and call evaluator to process them. - -Compared to running the evaluation manually using the model, the benefit of this function is that -evaluators can be merged together using [DatasetEvaluators](../modules/evaluation.html#detectron2.evaluation.DatasetEvaluators), -and all the evaluation can finish in one forward pass over the dataset. -This function also provides accurate speed benchmarks for the given model and dataset. - -## Evaluators for custom dataset - -Many evaluators in detectron2 are made for specific datasets, -in order to obtain scores using each dataset's official API. 
-In addition to that, two evaluators are able to evaluate any generic dataset -that follows detectron2's [standard dataset format](./datasets.md), so they -can be used to evaluate custom datasets: - -* [COCOEvaluator](../modules/evaluation.html#detectron2.evaluation.COCOEvaluator) is able to evaluate AP (Average Precision) for box detection, - instance segmentation, keypoint detection on any custom dataset. -* [SemSegEvaluator](../modules/evaluation.html#detectron2.evaluation.SemSegEvaluator) is able to evaluate semantic segmentation metrics on any custom dataset. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md deleted file mode 100755 index a6af550f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/extend.md +++ /dev/null @@ -1,141 +0,0 @@ -# Extend Detectron2's Defaults - -__Research is about doing things in new ways__. -This brings a tension in how to create abstractions in code, -which is a challenge for any research engineering project of a significant size: - -1. On one hand, it needs to have very thin abstractions to allow for the possibility of doing - everything in new ways. It should be reasonably easy to break existing - abstractions and replace them with new ones. - -2. On the other hand, such a project also needs reasonably high-level - abstractions, so that users can easily do things in standard ways, - without worrying too much about the details that only certain researchers care about. - -In detectron2, there are two types of interfaces that address this tension together: - -1. Functions and classes that take a config (`cfg`) argument - created from a yaml file - (sometimes with few extra arguments). - - Such functions and classes implement - the "standard default" behavior: it will read what it needs from a given - config and do the "standard" thing. - Users only need to load an expert-made config and pass it around, without having to worry about - which arguments are used and what they all mean. - - See [Yacs Configs](configs.md) for a detailed tutorial. - -2. Functions and classes that have well-defined explicit arguments. - - Each of these is a small building block of the entire system. - They require users' expertise to understand what each argument should be, - and require more effort to stitch together to a larger system. - But they can be stitched together in more flexible ways. - - When you need to implement something not supported by the "standard defaults" - included in detectron2, these well-defined components can be reused. - - The [LazyConfig system](lazyconfigs.md) relies on such functions and classes. - -3. A few functions and classes are implemented with the - [@configurable](../modules/config.html#detectron2.config.configurable) - decorator - they can be called with either a config, or with explicit arguments, or a mixture of both. - Their explicit argument interfaces are currently experimental. - - As an example, a Mask R-CNN model can be built in the following ways: - - 1. Config-only: - ```python - # load proper yaml config file, then - model = build_model(cfg) - ``` - - 2. Mixture of config and additional argument overrides: - ```python - model = GeneralizedRCNN( - cfg, - roi_heads=StandardROIHeads(cfg, batch_size_per_image=666), - pixel_std=[57.0, 57.0, 57.0]) - ``` - - 3. Full explicit arguments: -
      - - (click to expand) - - - ```python - model = GeneralizedRCNN( - backbone=FPN( - ResNet( - BasicStem(3, 64, norm="FrozenBN"), - ResNet.make_default_stages(50, stride_in_1x1=True, norm="FrozenBN"), - out_features=["res2", "res3", "res4", "res5"], - ).freeze(2), - ["res2", "res3", "res4", "res5"], - 256, - top_block=LastLevelMaxPool(), - ), - proposal_generator=RPN( - in_features=["p2", "p3", "p4", "p5", "p6"], - head=StandardRPNHead(in_channels=256, num_anchors=3), - anchor_generator=DefaultAnchorGenerator( - sizes=[[32], [64], [128], [256], [512]], - aspect_ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - offset=0.0, - ), - anchor_matcher=Matcher([0.3, 0.7], [0, -1, 1], allow_low_quality_matches=True), - box2box_transform=Box2BoxTransform([1.0, 1.0, 1.0, 1.0]), - batch_size_per_image=256, - positive_fraction=0.5, - pre_nms_topk=(2000, 1000), - post_nms_topk=(1000, 1000), - nms_thresh=0.7, - ), - roi_heads=StandardROIHeads( - num_classes=80, - batch_size_per_image=512, - positive_fraction=0.25, - proposal_matcher=Matcher([0.5], [0, 1], allow_low_quality_matches=False), - box_in_features=["p2", "p3", "p4", "p5"], - box_pooler=ROIPooler(7, (1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), 0, "ROIAlignV2"), - box_head=FastRCNNConvFCHead( - ShapeSpec(channels=256, height=7, width=7), conv_dims=[], fc_dims=[1024, 1024] - ), - box_predictor=FastRCNNOutputLayers( - ShapeSpec(channels=1024), - test_score_thresh=0.05, - box2box_transform=Box2BoxTransform((10, 10, 5, 5)), - num_classes=80, - ), - mask_in_features=["p2", "p3", "p4", "p5"], - mask_pooler=ROIPooler(14, (1.0 / 4, 1.0 / 8, 1.0 / 16, 1.0 / 32), 0, "ROIAlignV2"), - mask_head=MaskRCNNConvUpsampleHead( - ShapeSpec(channels=256, width=14, height=14), - num_classes=80, - conv_dims=[256, 256, 256, 256, 256], - ), - ), - pixel_mean=[103.530, 116.280, 123.675], - pixel_std=[1.0, 1.0, 1.0], - input_format="BGR", - ) - ``` - -
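-For your own classes, the same dual interface can be obtained by decorating `__init__`
-with `@configurable` and providing a `from_config` classmethod that translates a config
-into explicit arguments. A minimal sketch, where `MyHead` and its arguments are
-hypothetical examples:
-
-```python
-import torch
-from detectron2.config import configurable
-
-class MyHead(torch.nn.Module):
-    @configurable
-    def __init__(self, *, num_classes: int, hidden_dim: int = 256):
-        super().__init__()
-        self.predictor = torch.nn.Linear(hidden_dim, num_classes)
-
-    @classmethod
-    def from_config(cls, cfg):
-        # map the yacs config to the explicit arguments above
-        return {"num_classes": cfg.MODEL.ROI_HEADS.NUM_CLASSES}
-
-# head = MyHead(cfg)              # config-only style
-# head = MyHead(num_classes=80)   # fully explicit style
-```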
      - - -If you only need the standard behavior, the [Beginner's Tutorial](./getting_started.md) -should suffice. If you need to extend detectron2 to your own needs, -see the following tutorials for more details: - -* Detectron2 includes a few standard datasets. To use custom ones, see - [Use Custom Datasets](./datasets.md). -* Detectron2 contains the standard logic that creates a data loader for training/testing from a - dataset, but you can write your own as well. See [Use Custom Data Loaders](./data_loading.md). -* Detectron2 implements many standard detection models, and provide ways for you - to overwrite their behaviors. See [Use Models](./models.md) and [Write Models](./write-models.md). -* Detectron2 provides a default training loop that is good for common training tasks. - You can customize it with hooks, or write your own loop instead. See [training](./training.md). diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md deleted file mode 100755 index 404b0c8f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/getting_started.md +++ /dev/null @@ -1,79 +0,0 @@ -## Getting Started with Detectron2 - -This document provides a brief intro of the usage of builtin command-line tools in detectron2. - -For a tutorial that involves actual coding with the API, -see our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -which covers how to run inference with an -existing model, and how to train a builtin model on a custom dataset. - - -### Inference Demo with Pre-trained Models - -1. Pick a model and its config file from - [model zoo](MODEL_ZOO.md), - for example, `mask_rcnn_R_50_FPN_3x.yaml`. -2. We provide `demo.py` that is able to demo builtin configs. Run it with: -``` -cd demo/ -python demo.py --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --input input1.jpg input2.jpg \ - [--other-options] - --opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl -``` -The configs are made for training, therefore we need to specify `MODEL.WEIGHTS` to a model from model zoo for evaluation. -This command will run the inference and show visualizations in an OpenCV window. - -For details of the command line arguments, see `demo.py -h` or look at its source code -to understand its behavior. Some common arguments are: -* To run __on your webcam__, replace `--input files` with `--webcam`. -* To run __on a video__, replace `--input files` with `--video-input video.mp4`. -* To run __on cpu__, add `MODEL.DEVICE cpu` after `--opts`. -* To save outputs to a directory (for images) or a file (for webcam or video), use `--output`. - - -### Training & Evaluation in Command Line - -We provide two scripts in "tools/plain_train_net.py" and "tools/train_net.py", -that are made to train all the configs provided in detectron2. You may want to -use it as a reference to write your own training script. - -Compared to "train_net.py", "plain_train_net.py" supports fewer default -features. It also includes fewer abstraction, therefore is easier to add custom -logic. 
- -To train a model with "train_net.py", first -setup the corresponding datasets following -[datasets/README.md](./datasets/README.md), -then run: -``` -cd tools/ -./train_net.py --num-gpus 8 \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml -``` - -The configs are made for 8-GPU training. -To train on 1 GPU, you may need to [change some parameters](https://arxiv.org/abs/1706.02677), e.g.: -``` -./train_net.py \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \ - --num-gpus 1 SOLVER.IMS_PER_BATCH 2 SOLVER.BASE_LR 0.0025 -``` - -To evaluate a model's performance, use -``` -./train_net.py \ - --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \ - --eval-only MODEL.WEIGHTS /path/to/checkpoint_file -``` -For more options, see `./train_net.py -h`. - -### Use Detectron2 APIs in Your Code - -See our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) -to learn how to use detectron2 APIs to: -1. run inference with an existing model -2. train a builtin model on a custom dataset - -See [detectron2/projects](https://github.com/facebookresearch/detectron2/tree/main/projects) -for more ways to build your project on detectron2. diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst deleted file mode 100755 index 850b95cf..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/index.rst +++ /dev/null @@ -1,20 +0,0 @@ -Tutorials -====================================== - -.. toctree:: - :maxdepth: 2 - - install - getting_started - builtin_datasets - extend - datasets - data_loading - augmentation - models - write-models - training - evaluation - configs - lazyconfigs - deployment diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md deleted file mode 100755 index b4076891..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/install.md +++ /dev/null @@ -1,261 +0,0 @@ -## Installation - -### Requirements -- Linux or macOS with Python ≥ 3.6 -- PyTorch ≥ 1.8 and [torchvision](https://github.com/pytorch/vision/) that matches the PyTorch installation. - Install them together at [pytorch.org](https://pytorch.org) to make sure of this -- OpenCV is optional but needed by demo and visualization - - -### Build Detectron2 from Source - -gcc & g++ ≥ 5.4 are required. [ninja](https://ninja-build.org/) is optional but recommended for faster build. -After having them, run: -``` -python -m pip install 'git+https://github.com/facebookresearch/detectron2.git' -# (add --user if you don't have permission) - -# Or, to install it from a local clone: -git clone https://github.com/facebookresearch/detectron2.git -python -m pip install -e detectron2 - -# On macOS, you may need to prepend the above commands with a few environment variables: -CC=clang CXX=clang++ ARCHFLAGS="-arch x86_64" python -m pip install ... -``` - -To __rebuild__ detectron2 that's built from a local clone, use `rm -rf build/ **/*.so` to clean the -old build first. You often need to rebuild detectron2 after reinstalling PyTorch. - -### Install Pre-Built Detectron2 (Linux only) - -Choose from this table to install [v0.6 (Oct 2021)](https://github.com/facebookresearch/detectron2/releases): - -
-The install command is `python -m pip install detectron2 -f <wheel index URL>`. Pick the
-wheel index URL below that matches your CUDA and torch versions ("n/a" means no pre-built
-wheel is provided for that combination):
-
-| CUDA | torch 1.10 | torch 1.9 | torch 1.8 |
-| ---- | ---------- | --------- | --------- |
-| 11.3 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu113/torch1.10/index.html | n/a | n/a |
-| 11.1 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.8/index.html |
-| 10.2 | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.8/index.html |
-| 10.1 | n/a | n/a | https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.8/index.html |
-| cpu  | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.10/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.9/index.html | https://dl.fbaipublicfiles.com/detectron2/wheels/cpu/torch1.8/index.html |
-
      - -Note that: -1. The pre-built packages have to be used with corresponding version of CUDA and the official package of PyTorch. - Otherwise, please build detectron2 from source. -2. New packages are released every few months. Therefore, packages may not contain latest features in the main - branch and may not be compatible with the main branch of a research project that uses detectron2 - (e.g. those in [projects](projects)). - -### Common Installation Issues - -Click each issue for its solutions: - -
-
-Undefined symbols that look like "TH..", "at::Tensor...", "torch..."
-
      - -This usually happens when detectron2 or torchvision is not -compiled with the version of PyTorch you're running. - -If the error comes from a pre-built torchvision, uninstall torchvision and pytorch and reinstall them -following [pytorch.org](http://pytorch.org). So the versions will match. - -If the error comes from a pre-built detectron2, check [release notes](https://github.com/facebookresearch/detectron2/releases), -uninstall and reinstall the correct pre-built detectron2 that matches pytorch version. - -If the error comes from detectron2 or torchvision that you built manually from source, -remove files you built (`build/`, `**/*.so`) and rebuild it so it can pick up the version of pytorch currently in your environment. - -If the above instructions do not resolve this problem, please provide an environment (e.g. a dockerfile) that can reproduce the issue. -
      - -
      - -Missing torch dynamic libraries, OR segmentation fault immediately when using detectron2. - -This usually happens when detectron2 or torchvision is not -compiled with the version of PyTorch you're running. See the previous common issue for the solution. -
      - -
      - -Undefined C++ symbols (e.g. "GLIBCXX..") or C++ symbols not found. - -
      -Usually it's because the library is compiled with a newer C++ compiler but run with an old C++ runtime. - -This often happens with old anaconda. -It may help to run `conda update libgcc` to upgrade its runtime. - -The fundamental solution is to avoid the mismatch, either by compiling using older version of C++ -compiler, or run the code with proper C++ runtime. -To run the code with a specific C++ runtime, you can use environment variable `LD_PRELOAD=/path/to/libstdc++.so`. - -
      - -
      - -"nvcc not found" or "Not compiled with GPU support" or "Detectron2 CUDA Compiler: not available". - -
-CUDA is not found when building detectron2.
-You should make sure that
-
-```
-python -c 'import torch; from torch.utils.cpp_extension import CUDA_HOME; print(torch.cuda.is_available(), CUDA_HOME)'
-```
-
-prints `(True, a directory with cuda)` at the time you build detectron2.
-
-Most models can run inference (but not training) without GPU support. To use CPUs, set `MODEL.DEVICE='cpu'` in the config.
-
      - -
      - -"invalid device function" or "no kernel image is available for execution". - -
      -Two possibilities: - -* You build detectron2 with one version of CUDA but run it with a different version. - - To check whether it is the case, - use `python -m detectron2.utils.collect_env` to find out inconsistent CUDA versions. - In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", "PyTorch built with - CUDA" - to contain cuda libraries of the same version. - - When they are inconsistent, - you need to either install a different build of PyTorch (or build by yourself) - to match your local CUDA installation, or install a different version of CUDA to match PyTorch. - -* PyTorch/torchvision/Detectron2 is not built for the correct GPU SM architecture (aka. compute capability). - - The architecture included by PyTorch/detectron2/torchvision is available in the "architecture flags" in - `python -m detectron2.utils.collect_env`. It must include - the architecture of your GPU, which can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus). - - If you're using pre-built PyTorch/detectron2/torchvision, they have included support for most popular GPUs already. - If not supported, you need to build them from source. - - When building detectron2/torchvision from source, they detect the GPU device and build for only the device. - This means the compiled code may not work on a different GPU device. - To recompile them for the correct architecture, remove all installed/compiled files, - and rebuild them with the `TORCH_CUDA_ARCH_LIST` environment variable set properly. - For example, `export TORCH_CUDA_ARCH_LIST="6.0;7.0"` makes it compile for both P100s and V100s. -
      - -
      - -Undefined CUDA symbols; Cannot open libcudart.so - -
      -The version of NVCC you use to build detectron2 or torchvision does -not match the version of CUDA you are running with. -This often happens when using anaconda's CUDA runtime. - -Use `python -m detectron2.utils.collect_env` to find out inconsistent CUDA versions. -In the output of this command, you should expect "Detectron2 CUDA Compiler", "CUDA_HOME", "PyTorch built with - CUDA" -to contain cuda libraries of the same version. - -When they are inconsistent, -you need to either install a different build of PyTorch (or build by yourself) -to match your local CUDA installation, or install a different version of CUDA to match PyTorch. -
      - - -
      - -C++ compilation errors from NVCC / NVRTC, or "Unsupported gpu architecture" - -
-A few possibilities:
-
-1. Local CUDA/NVCC version has to match the CUDA version of your PyTorch. Both can be found in `python collect_env.py`.
-   When they are inconsistent, you need to either install a different build of PyTorch (or build by yourself)
-   to match your local CUDA installation, or install a different version of CUDA to match PyTorch.
-
-2. Local CUDA/NVCC version must support the SM architecture (a.k.a. compute capability) of your GPU.
-   The capability of your GPU can be found at [developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus).
-   The capability supported by NVCC is listed [here](https://gist.github.com/ax3l/9489132).
-   If your NVCC version is too old, this can be worked around by setting the environment variable
-   `TORCH_CUDA_ARCH_LIST` to a lower, supported capability.
-
-3. The combination of NVCC and GCC you use is incompatible. You need to change one of their versions.
-   See [here](https://gist.github.com/ax3l/9489132) for some valid combinations.
-   Notably, CUDA<=10.1.105 doesn't support GCC>7.3.
-
-   The CUDA/GCC version used by PyTorch can be found by `print(torch.__config__.show())`.
-
      - - -
      - -"ImportError: cannot import name '_C'". - -
      -Please build and install detectron2 following the instructions above. - -Or, if you are running code from detectron2's root directory, `cd` to a different one. -Otherwise you may not import the code that you installed. -
      - - -
      - -Any issue on windows. - -
      - -Detectron2 is continuously built on windows with [CircleCI](https://app.circleci.com/pipelines/github/facebookresearch/detectron2?branch=main). -However we do not provide official support for it. -PRs that improves code compatibility on windows are welcome. -
      - -
      - -ONNX conversion segfault after some "TraceWarning". - -
-The ONNX package was compiled with a compiler that is too old.
-
-Please build and install ONNX from its source code using a compiler
-whose version is closer to what's used by PyTorch (available in `torch.__config__.show()`).
-
      - - -
      - -"library not found for -lstdc++" on older version of MacOS - -
      -See -[this stackoverflow answer](https://stackoverflow.com/questions/56083725/macos-build-issues-lstdc-not-found-while-building-python-package). - -
      - - -### Installation inside specific environments: - -* __Colab__: see our [Colab Tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5) - which has step-by-step instructions. - -* __Docker__: The official [Dockerfile](docker) installs detectron2 with a few simple commands. - diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md deleted file mode 100755 index ca9de305..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/lazyconfigs.md +++ /dev/null @@ -1,170 +0,0 @@ -# Lazy Configs - -The traditional yacs-based config system provides basic, standard functionalities. -However, it does not offer enough flexibility for many new projects. -We develop an alternative, non-intrusive config system that can be used with -detectron2 or potentially any other complex projects. - -## Python Syntax - -Our config objects are still dictionaries. Instead of using Yaml to define dictionaries, -we create dictionaries in Python directly. This gives users the following power that -doesn't exist in Yaml: - -* Easily manipulate the dictionary (addition & deletion) using Python. -* Write simple arithmetics or call simple functions. -* Use more data types / objects. -* Import / compose other config files, using the familiar Python import syntax. - -A Python config file can be loaded like this: -```python -# config.py: -a = dict(x=1, y=2, z=dict(xx=1)) -b = dict(x=3, y=4) - -# my_code.py: -from detectron2.config import LazyConfig -cfg = LazyConfig.load("path/to/config.py") # an omegaconf dictionary -assert cfg.a.z.xx == 1 -``` - -After [LazyConfig.load](../modules/config.html#detectron2.config.LazyConfig.load), `cfg` will be a dictionary that contains all dictionaries -defined in the global scope of the config file. Note that: -* All dictionaries are turned to an [omegaconf](https://omegaconf.readthedocs.io/) - config object during loading. This enables access to omegaconf features, - such as its [access syntax](https://omegaconf.readthedocs.io/en/2.1_branch/usage.html#access-and-manipulation) - and [interpolation](https://omegaconf.readthedocs.io/en/2.1_branch/usage.html#variable-interpolation). -* Absolute imports in `config.py` works the same as in regular Python. -* Relative imports can only import dictionaries from config files. - They are simply a syntax sugar for [LazyConfig.load_rel](../modules/config.html#detectron2.config.LazyConfig.load_rel). - They can load Python files at relative path without requiring `__init__.py`. - -[LazyConfig.save](../modules/config.html#detectron2.config.LazyConfig.save) can save a config object to yaml. -Note that this is not always successful if non-serializable objects appear in the config file (e.g. lambdas). -It is up to users whether to sacrifice the ability to save in exchange for flexibility. - -## Recursive Instantiation - -The LazyConfig system heavily uses recursive instantiation, which is a pattern that -uses a dictionary to describe a -call to a function/class. The dictionary consists of: - -1. A "\_target\_" key which contains path to the callable, such as "module.submodule.class_name". -2. Other keys that represent arguments to pass to the callable. Arguments themselves can be defined - using recursive instantiation. - -We provide a helper function [LazyCall](../modules/config.html#detectron2.config.LazyCall) that helps create such dictionaries. 
-The following code using `LazyCall` -```python -from detectron2.config import LazyCall as L -from my_app import Trainer, Optimizer -cfg = L(Trainer)( - optimizer=L(Optimizer)( - lr=0.01, - algo="SGD" - ) -) -``` -creates a dictionary like this: -``` -cfg = { - "_target_": "my_app.Trainer", - "optimizer": { - "_target_": "my_app.Optimizer", - "lr": 0.01, "algo": "SGD" - } -} -``` - -By representing objects using such dictionaries, a general -[instantiate](../modules/config.html#detectron2.config.instantiate) -function can turn them into actual objects, i.e.: -```python -from detectron2.config import instantiate -trainer = instantiate(cfg) -# equivalent to: -# from my_app import Trainer, Optimizer -# trainer = Trainer(optimizer=Optimizer(lr=0.01, algo="SGD")) -``` - -This pattern is powerful enough to describe very complex objects, e.g.: - -
      - -A Full Mask R-CNN described in recursive instantiation (click to expand) - - -```eval_rst -.. literalinclude:: ../../configs/common/models/mask_rcnn_fpn.py - :language: python - :linenos: -``` - -
      - -There are also objects or logic that cannot be described simply by a dictionary, -such as reused objects or method calls. They may require some refactoring -to work with recursive instantiation. - -## Using Model Zoo LazyConfigs - -We provide some configs in the model zoo using the LazyConfig system, for example: - -* [common baselines](../../configs/common/). -* [new Mask R-CNN baselines](../../configs/new_baselines/) - -After installing detectron2, they can be loaded by the model zoo API -[model_zoo.get_config](../modules/model_zoo.html#detectron2.model_zoo.get_config). - -Using these as references, you're free to define custom config structure / fields for your own -project, as long as your training script can understand them. -Despite of this, our model zoo configs still follow some simple conventions for consistency, e.g. -`cfg.model` defines a model object, `cfg.dataloader.{train,test}` defines dataloader objects, -and `cfg.train` contains training options in key-value form. -In addition to `print()`, a better way to view the structure of a config is like this: -``` -from detectron2.model_zoo import get_config -from detectron2.config import LazyConfig -print(LazyConfig.to_py(get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py"))) -``` -From the output it's easier to find relevant options to change, e.g. -`dataloader.train.total_batch_size` for the batch size, or `optimizer.lr` for base learning rate. - -We provide a reference training script -[tools/lazyconfig_train_net.py](../../tools/lazyconfig_train_net.py), -that can train/eval our model zoo configs. -It also shows how to support command line value overrides. - -To demonstrate the power and flexibility of the new system, we show that -[a simple config file](../../configs/Misc/torchvision_imagenet_R_50.py) -can let detectron2 train an ImageNet classification model from torchvision, even though -detectron2 contains no features about ImageNet classification. -This can serve as a reference for using detectron2 in other deep learning tasks. - -## Summary - -By using recursive instantiation to create objects, -we avoid passing a giant config to many places, because `cfg` is only passed to `instantiate`. -This has the following benefits: - -* It's __non-intrusive__: objects to be constructed are config-agnostic, regular Python - functions/classes. - They can even live in other libraries. For example, - `{"_target_": "torch.nn.Conv2d", "in_channels": 10, "out_channels": 10, "kernel_size": 1}` - defines a conv layer. -* __Clarity__ of what function/classes will be called, and what arguments they use. -* `cfg` doesn't need pre-defined keys and structures. It's valid as long as it translates to valid - code. This gives a lot more __flexibility__. -* You can still pass huge dictionaries as arguments, just like the old way. - -Recursive instantiation and Python syntax are orthogonal: you can use one without the other. -But by putting them together, the config file looks a lot like the code that will be executed: - -![img](./lazyconfig.jpg) - -However, the config file just defines dictionaries, which can be easily manipulated further -by composition or overrides. -The corresponding code will only be executed -later when `instantiate` is called. In some way, -in config files we're writing "editable code" that will be "lazily executed" later when needed. -That's why we call this system "LazyConfig". 
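-
-Putting the pieces together, an end-to-end flow might look like the sketch below. It
-assumes a config file that defines a top-level `trainer = L(Trainer)(...)` as in the
-earlier hypothetical example; `LazyConfig.apply_overrides` accepts command-line style
-`key=value` strings:
-
-```python
-from detectron2.config import LazyConfig, instantiate
-
-cfg = LazyConfig.load("path/to/config.py")                            # dictionaries only
-cfg = LazyConfig.apply_overrides(cfg, ["trainer.optimizer.lr=0.02"])  # still dictionaries
-trainer = instantiate(cfg.trainer)                                    # objects are created only here
-```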
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md deleted file mode 100755 index 3cf918e7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/models.md +++ /dev/null @@ -1,180 +0,0 @@ -# Use Models - -## Build Models from Yacs Config -From a yacs config object, -models (and their sub-models) can be built by -functions such as `build_model`, `build_backbone`, `build_roi_heads`: -```python -from detectron2.modeling import build_model -model = build_model(cfg) # returns a torch.nn.Module -``` - -`build_model` only builds the model structure and fills it with random parameters. -See below for how to load an existing checkpoint to the model and how to use the `model` object. - -### Load/Save a Checkpoint -```python -from detectron2.checkpoint import DetectionCheckpointer -DetectionCheckpointer(model).load(file_path_or_url) # load a file, usually from cfg.MODEL.WEIGHTS - -checkpointer = DetectionCheckpointer(model, save_dir="output") -checkpointer.save("model_999") # save to output/model_999.pth -``` - -Detectron2's checkpointer recognizes models in pytorch's `.pth` format, as well as the `.pkl` files -in our model zoo. -See [API doc](../modules/checkpoint.html#detectron2.checkpoint.DetectionCheckpointer) -for more details about its usage. - -The model files can be arbitrarily manipulated using `torch.{load,save}` for `.pth` files or -`pickle.{dump,load}` for `.pkl` files. - -### Use a Model - -A model can be called by `outputs = model(inputs)`, where `inputs` is a `list[dict]`. -Each dict corresponds to one image and the required keys -depend on the type of model, and whether the model is in training or evaluation mode. -For example, in order to do inference, -all existing models expect the "image" key, and optionally "height" and "width". -The detailed format of inputs and outputs of existing models are explained below. - -__Training__: When in training mode, all models are required to be used under an `EventStorage`. -The training statistics will be put into the storage: -```python -from detectron2.utils.events import EventStorage -with EventStorage() as storage: - losses = model(inputs) -``` - -__Inference__: If you only want to do simple inference using an existing model, -[DefaultPredictor](../modules/engine.html#detectron2.engine.defaults.DefaultPredictor) -is a wrapper around model that provides such basic functionality. -It includes default behavior including model loading, preprocessing, -and operates on single image rather than batches. See its documentation for usage. - -You can also run inference directly like this: -``` -model.eval() -with torch.no_grad(): - outputs = model(inputs) -``` - -### Model Input Format - -Users can implement custom models that support any arbitrary input format. -Here we describe the standard input format that all builtin models support in detectron2. -They all take a `list[dict]` as the inputs. Each dict -corresponds to information about one image. - -The dict may contain the following keys: - -* "image": `Tensor` in (C, H, W) format. The meaning of channels are defined by `cfg.INPUT.FORMAT`. - Image normalization, if any, will be performed inside the model using - `cfg.MODEL.PIXEL_{MEAN,STD}`. -* "height", "width": the **desired** output height and width **in inference**, which is not necessarily the same - as the height or width of the `image` field. 
-  For example, the `image` field contains the resized image, if resize is used as a preprocessing step.
-  But you may want the outputs to be in the **original** resolution.
-  If provided, the model will produce output in this resolution,
-  rather than in the resolution of the `image` as input into the model. This is more efficient and accurate.
-* "instances": an [Instances](../modules/structures.html#detectron2.structures.Instances)
-  object for training, with the following fields:
-  + "gt_boxes": a [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing N boxes, one for each instance.
-  + "gt_classes": `Tensor` of long type, a vector of N labels, in range [0, num_categories).
-  + "gt_masks": a [PolygonMasks](../modules/structures.html#detectron2.structures.PolygonMasks)
-    or [BitMasks](../modules/structures.html#detectron2.structures.BitMasks) object storing N masks, one for each instance.
-  + "gt_keypoints": a [Keypoints](../modules/structures.html#detectron2.structures.Keypoints)
-    object storing N keypoint sets, one for each instance.
-* "sem_seg": `Tensor[int]` in (H, W) format. The semantic segmentation ground truth for training.
-  Values represent category labels starting from 0.
-* "proposals": an [Instances](../modules/structures.html#detectron2.structures.Instances)
-  object used only in Fast R-CNN style models, with the following fields:
-  + "proposal_boxes": a [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing P proposal boxes.
-  + "objectness_logits": `Tensor`, a vector of P scores, one for each proposal.
-
-For inference with builtin models, only the "image" key is required, and "height"/"width" are optional.
-
-We currently don't define a standard input format for panoptic segmentation training,
-because models now use custom formats produced by custom data loaders.
-
-#### How it connects to the data loader
-
-The output of the default [DatasetMapper](../modules/data.html#detectron2.data.DatasetMapper) is a dict
-that follows the above format.
-After the data loader performs batching, it becomes a `list[dict]`, which the builtin models support.
-
-
-### Model Output Format
-
-When in training mode, the builtin models output a `dict[str->ScalarTensor]` with all the losses.
-
-When in inference mode, the builtin models output a `list[dict]`, one dict for each image.
-Based on the tasks the model is doing, each dict may contain the following fields:
-
-* "instances": [Instances](../modules/structures.html#detectron2.structures.Instances)
  object with the following fields:
-  * "pred_boxes": [Boxes](../modules/structures.html#detectron2.structures.Boxes) object storing N boxes, one for each detected instance.
-  * "scores": `Tensor`, a vector of N confidence scores.
-  * "pred_classes": `Tensor`, a vector of N labels in range [0, num_categories).
-  * "pred_masks": a `Tensor` of shape (N, H, W), masks for each detected instance.
-  * "pred_keypoints": a `Tensor` of shape (N, num_keypoint, 3).
-    Each row in the last dimension is (x, y, score). Confidence scores are larger than 0.
-* "sem_seg": `Tensor` of (num_categories, H, W), the semantic segmentation prediction.
-* "proposals": [Instances](../modules/structures.html#detectron2.structures.Instances)
-  object with the following fields:
-  * "proposal_boxes": [Boxes](../modules/structures.html#detectron2.structures.Boxes)
-    object storing N boxes.
-  * "objectness_logits": a `Tensor`, a vector of N confidence scores.
-* "panoptic_seg": A tuple of `(pred: Tensor, segments_info: Optional[list[dict]])`.
-  The `pred` tensor has shape (H, W), containing the segment id of each pixel.
-
-  * If `segments_info` exists, each dict describes one segment id in `pred` and has the following fields:
-
-    * "id": the segment id.
-    * "isthing": whether the segment is a thing or stuff.
-    * "category_id": the category id of this segment.
-
-    If a pixel's id does not exist in `segments_info`, it is considered to be the void label
-    defined in [Panoptic Segmentation](https://arxiv.org/abs/1801.00868).
-
-  * If `segments_info` is None, all pixel values in `pred` must be ≥ -1.
-    Pixels with value -1 are assigned void labels.
-    Otherwise, the category id of each pixel is obtained by
-    `category_id = pixel // metadata.label_divisor`.
-
-
-### Partially execute a model
-
-Sometimes you may want to obtain an intermediate tensor inside a model,
-such as the input of a certain layer or the output before post-processing.
-Since there are typically hundreds of intermediate tensors, there isn't an API that provides
-the intermediate result you need.
-You have the following options:
-
-1. Write a (sub)model. Following the [tutorial](./write-models.md), you can
-   rewrite a model component (e.g. a head of a model), such that it
-   does the same thing as the existing component, but returns the output
-   you need.
-2. Partially execute a model. You can create the model as usual,
-   but use custom code to execute it instead of its `forward()`. For example,
-   the following code obtains mask features before the mask head.
-
-   ```python
-   from detectron2.modeling import build_model
-   from detectron2.structures import ImageList
-
-   images = ImageList.from_tensors(...)  # preprocessed input tensor
-   model = build_model(cfg)
-   model.eval()
-   features = model.backbone(images.tensor)
-   proposals, _ = model.proposal_generator(images, features)
-   instances, _ = model.roi_heads(images, features, proposals)
-   mask_features = [features[f] for f in model.roi_heads.in_features]
-   mask_features = model.roi_heads.mask_pooler(mask_features, [x.pred_boxes for x in instances])
-   ```
-
-3. Use [forward hooks](https://pytorch.org/tutorials/beginner/former_torchies/nnft_tutorial.html#forward-and-backward-function-hooks).
-   Forward hooks can help you obtain inputs or outputs of a certain module.
-   If they are not exactly what you want, they can at least be used together with partial execution
-   to obtain other tensors.
-
-All options require you to read the documentation, and sometimes the code, of the existing models
-to understand their internal logic, in order to write code that obtains the internal tensors.
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md
deleted file mode 100755
index 7e2987e4..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/training.md
+++ /dev/null
@@ -1,67 +0,0 @@
-# Training
-
-From the previous tutorials, you may now have a custom model and a data loader.
-To run training, users typically have a preference for one of the following two styles:
-
-### Custom Training Loop
-
-With a model and a data loader ready, everything else needed to write a training loop can
-be found in PyTorch, and you are free to write the training loop yourself.
-This style allows researchers to manage the entire training logic more clearly and have full control.
-One such example is provided in [tools/plain_train_net.py](../../tools/plain_train_net.py).
-
-Any customization on the training logic is then easily controlled by the user.
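-
-To make the custom-loop style concrete, here is a minimal sketch of such a
-loop (`model` and `data_loader` are assumed to come from the previous
-tutorials; the optimizer settings are illustrative placeholders, not
-detectron2 defaults):
-
-```python
-import torch
-from detectron2.utils.events import EventStorage
-
-# Assumes `model` was built with build_model(cfg) and `data_loader`
-# with build_detection_train_loader(cfg), as in the previous tutorials.
-optimizer = torch.optim.SGD(model.parameters(), lr=0.02, momentum=0.9)
-
-model.train()
-with EventStorage() as storage:  # models log their statistics into the storage
-    for iteration, inputs in enumerate(data_loader):
-        storage.iter = iteration          # keep the storage's counter in sync
-        loss_dict = model(inputs)         # dict[str -> scalar loss tensor]
-        losses = sum(loss_dict.values())
-        optimizer.zero_grad()
-        losses.backward()
-        optimizer.step()
-        if iteration >= 1000:             # stopping condition for this sketch
-            break
-```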
-
-### Trainer Abstraction
-
-We also provide a standardized "trainer" abstraction with a
-hook system that helps simplify the standard training behavior.
-It includes the following two instantiations:
-
-* [SimpleTrainer](../modules/engine.html#detectron2.engine.SimpleTrainer)
-  provides a minimal training loop for single-cost single-optimizer single-data-source training, with nothing else.
-  Other tasks (checkpointing, logging, etc.) can be implemented using
-  [the hook system](../modules/engine.html#detectron2.engine.HookBase).
-* [DefaultTrainer](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer) is a `SimpleTrainer` initialized from a
-  yacs config, used by
-  [tools/train_net.py](../../tools/train_net.py) and many scripts.
-  It includes more standard default behaviors that one might want to opt into,
-  including default configurations for the optimizer, learning rate schedule,
-  logging, evaluation, checkpointing, etc.
-
-To customize a `DefaultTrainer`:
-
-1. For simple customizations (e.g. changing the optimizer, evaluator, LR scheduler, or data loader), override [its methods](../modules/engine.html#detectron2.engine.defaults.DefaultTrainer) in a subclass, just like [tools/train_net.py](../../tools/train_net.py).
-2. For extra tasks during training, check the
-   [hook system](../modules/engine.html#detectron2.engine.HookBase) to see if it's supported.
-
-   As an example, to print hello during training:
-   ```python
-   from detectron2.engine import HookBase
-
-   class HelloHook(HookBase):
-       def after_step(self):
-           if self.trainer.iter % 100 == 0:
-               print(f"Hello at iteration {self.trainer.iter}!")
-   ```
-3. Using a trainer+hook system means there will always be some non-standard behaviors that cannot be supported, especially in research.
-   For this reason, we intentionally keep the trainer & hook system minimal, rather than powerful.
-   If anything cannot be achieved by such a system, it's easier to start from [tools/plain_train_net.py](../../tools/plain_train_net.py) to implement custom training logic manually.
-
-### Logging of Metrics
-
-During training, detectron2 models and the trainer put metrics into a centralized [EventStorage](../modules/utils.html#detectron2.utils.events.EventStorage).
-You can use the following code to access it and log metrics to it:
-```python
-from detectron2.utils.events import get_event_storage
-
-# inside the model:
-if self.training:
-    value = ...  # compute the value from inputs
-    storage = get_event_storage()
-    storage.put_scalar("some_accuracy", value)
-```
-
-Refer to its documentation for more details.
-
-Metrics are then written to various destinations with [EventWriter](../modules/utils.html#module-detectron2.utils.events).
-DefaultTrainer enables a few `EventWriter`s with default configurations.
-See above for how to customize them.
diff --git a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md b/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md
deleted file mode 100755
index 967d1265..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/docs/tutorials/write-models.md
+++ /dev/null
@@ -1,90 +0,0 @@
-# Write Models
-
-If you are trying to do something completely new, you may wish to implement
-a model entirely from scratch. However, in many situations you may
-be interested in modifying or extending some components of an existing model.
-Therefore, we also provide mechanisms that let users override the
-behavior of certain internal components of standard models.
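-
-Before introducing those mechanisms, here is a minimal end-to-end sketch that
-ties together the `DefaultTrainer` and hook pieces from the training tutorial
-above (the `IterationPrinter` hook is an illustrative placeholder, not a
-builtin; `cfg` is a yacs config built as in the earlier tutorials):
-
-```python
-from detectron2.engine import DefaultTrainer, HookBase
-
-class IterationPrinter(HookBase):  # illustrative custom hook
-    def after_step(self):
-        if self.trainer.iter % 100 == 0:
-            print(f"reached iteration {self.trainer.iter}")
-
-trainer = DefaultTrainer(cfg)                 # cfg: a yacs config
-trainer.register_hooks([IterationPrinter()])  # attach extra behavior
-trainer.resume_or_load(resume=False)          # load cfg.MODEL.WEIGHTS if set
-trainer.train()
-```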
-
-
-## Register New Components
-
-For common concepts that users often want to customize, such as the "backbone feature extractor" or the "box head",
-we provide a registration mechanism for users to inject custom implementations that
-will be immediately available to use in config files.
-
-For example, to add a new backbone, import this code somewhere in your program:
-```python
-from torch import nn
-
-from detectron2.modeling import BACKBONE_REGISTRY, Backbone, ShapeSpec
-
-@BACKBONE_REGISTRY.register()
-class ToyBackbone(Backbone):
-    def __init__(self, cfg, input_shape):
-        super().__init__()
-        # create your own backbone
-        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=16, padding=3)
-
-    def forward(self, image):
-        return {"conv1": self.conv1(image)}
-
-    def output_shape(self):
-        return {"conv1": ShapeSpec(channels=64, stride=16)}
-```
-
-In this code, we implement a new backbone following the interface of the
-[Backbone](../modules/modeling.html#detectron2.modeling.Backbone) class,
-and register it into the [BACKBONE_REGISTRY](../modules/modeling.html#detectron2.modeling.BACKBONE_REGISTRY),
-which requires subclasses of `Backbone`.
-After importing this code, detectron2 can link the name of the class to its implementation. Therefore you can write the following code:
-
-```python
-cfg = ...   # read a config
-cfg.MODEL.BACKBONE.NAME = 'ToyBackbone'   # or set it in the config file
-model = build_model(cfg)  # it will find `ToyBackbone` defined above
-```
-
-As another example, to add new abilities to the ROI heads in the Generalized R-CNN meta-architecture,
-you can implement a new
-[ROIHeads](../modules/modeling.html#detectron2.modeling.ROIHeads) subclass and put it in the `ROI_HEADS_REGISTRY`.
-[DensePose](../../projects/DensePose)
-and [MeshRCNN](https://github.com/facebookresearch/meshrcnn)
-are two examples that implement new ROIHeads to perform new tasks,
-and [projects/](../../projects/)
-contains more examples that implement different architectures.
-
-A complete list of registries can be found in the [API documentation](../modules/modeling.html#model-registries).
-You can register components in these registries to customize different parts of a model, or the
-entire model.
-
-## Construct Models with Explicit Arguments
-
-A registry is a bridge that connects names in config files to the actual code.
-Registries are meant to cover a few main components that users frequently need to replace.
-However, the capability of a text-based config file is sometimes limited, and
-some deeper customization may be available only through writing code.
-
-Most model components in detectron2 have a clear `__init__` interface that documents
-the input arguments they need. Calling them with custom arguments will give you a custom variant
-of the model.
-
-As an example, to use a __custom loss function__ in the box head of a Faster R-CNN, we can do the following:
-
-1. Losses are currently computed in [FastRCNNOutputLayers](../modules/modeling.html#detectron2.modeling.FastRCNNOutputLayers).
-   We need to implement a variant or a subclass of it, with custom loss functions, named `MyRCNNOutput`.
-2. Call `StandardROIHeads` with the `box_predictor=MyRCNNOutput()` argument instead of the builtin `FastRCNNOutputLayers`.
-   If all other arguments should stay unchanged, this can be easily achieved by using the [configurable `__init__`](../modules/config.html#detectron2.config.configurable) mechanism:
-
-   ```python
-   roi_heads = StandardROIHeads(
-       cfg, backbone.output_shape(),
-       box_predictor=MyRCNNOutput(...)
-   )
-   ```
-3.
(optional) If we want to enable this new model from a config file, registration is needed: - ```python - @ROI_HEADS_REGISTRY.register() - class MyStandardROIHeads(StandardROIHeads): - def __init__(self, cfg, input_shape): - super().__init__(cfg, input_shape, - box_predictor=MyRCNNOutput(...)) - ``` diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore deleted file mode 100755 index 51c17688..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/.gitignore +++ /dev/null @@ -1,10 +0,0 @@ -# compilation and distribution -__pycache__ -_ext -*.pyc -*.pyd -*.so -centernet.egg-info/ -build/ -dist/ -wheels/ diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py deleted file mode 100755 index e17db317..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -from .modeling.meta_arch.centernet_detector import CenterNetDetector -from .modeling.dense_heads.centernet import CenterNet -from .modeling.roi_heads.custom_roi_heads import CustomROIHeads, CustomCascadeROIHeads - -from .modeling.backbone.fpn_p5 import build_p67_resnet_fpn_backbone -from .modeling.backbone.dla import build_dla_backbone -from .modeling.backbone.dlafpn import build_dla_fpn3_backbone -from .modeling.backbone.bifpn import build_resnet_bifpn_backbone -from .modeling.backbone.bifpn_fcos import build_fcos_resnet_bifpn_backbone -from .modeling.backbone.res2net import build_p67_res2net_fpn_backbone - -from .data.datasets.objects365 import categories_v1 -from .data.datasets.coco import _PREDEFINED_SPLITS_COCO -from .data.datasets import nuimages diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py deleted file mode 100755 index 36d0d250..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/config.py +++ /dev/null @@ -1,87 +0,0 @@ -from detectron2.config import CfgNode as CN - -def add_centernet_config(cfg): - _C = cfg - - _C.MODEL.CENTERNET = CN() - _C.MODEL.CENTERNET.NUM_CLASSES = 80 - _C.MODEL.CENTERNET.IN_FEATURES = ["p3", "p4", "p5", "p6", "p7"] - _C.MODEL.CENTERNET.FPN_STRIDES = [8, 16, 32, 64, 128] - _C.MODEL.CENTERNET.PRIOR_PROB = 0.01 - _C.MODEL.CENTERNET.INFERENCE_TH = 0.05 - _C.MODEL.CENTERNET.CENTER_NMS = False - _C.MODEL.CENTERNET.NMS_TH_TRAIN = 0.6 - _C.MODEL.CENTERNET.NMS_TH_TEST = 0.6 - _C.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN = 1000 - _C.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN = 100 - _C.MODEL.CENTERNET.PRE_NMS_TOPK_TEST = 1000 - _C.MODEL.CENTERNET.POST_NMS_TOPK_TEST = 100 - _C.MODEL.CENTERNET.NORM = "GN" - _C.MODEL.CENTERNET.USE_DEFORMABLE = False - _C.MODEL.CENTERNET.NUM_CLS_CONVS = 4 - _C.MODEL.CENTERNET.NUM_BOX_CONVS = 4 - _C.MODEL.CENTERNET.NUM_SHARE_CONVS = 0 - _C.MODEL.CENTERNET.LOC_LOSS_TYPE = 'giou' - _C.MODEL.CENTERNET.SIGMOID_CLAMP = 1e-4 - _C.MODEL.CENTERNET.HM_MIN_OVERLAP = 0.8 - _C.MODEL.CENTERNET.MIN_RADIUS = 4 - _C.MODEL.CENTERNET.SOI = [[0, 80], [64, 160], [128, 320], [256, 640], [512, 10000000]] - _C.MODEL.CENTERNET.POS_WEIGHT = 1. - _C.MODEL.CENTERNET.NEG_WEIGHT = 1. - _C.MODEL.CENTERNET.REG_WEIGHT = 2. 
- _C.MODEL.CENTERNET.HM_FOCAL_BETA = 4 - _C.MODEL.CENTERNET.HM_FOCAL_ALPHA = 0.25 - _C.MODEL.CENTERNET.LOSS_GAMMA = 2.0 - _C.MODEL.CENTERNET.WITH_AGN_HM = False - _C.MODEL.CENTERNET.ONLY_PROPOSAL = False - _C.MODEL.CENTERNET.AS_PROPOSAL = False - _C.MODEL.CENTERNET.IGNORE_HIGH_FP = -1. - _C.MODEL.CENTERNET.MORE_POS = False - _C.MODEL.CENTERNET.MORE_POS_THRESH = 0.2 - _C.MODEL.CENTERNET.MORE_POS_TOPK = 9 - _C.MODEL.CENTERNET.NOT_NORM_REG = True - _C.MODEL.CENTERNET.NOT_NMS = False - _C.MODEL.CENTERNET.NO_REDUCE = False - - _C.MODEL.ROI_BOX_HEAD.USE_SIGMOID_CE = False - _C.MODEL.ROI_BOX_HEAD.PRIOR_PROB = 0.01 - _C.MODEL.ROI_BOX_HEAD.USE_EQL_LOSS = False - _C.MODEL.ROI_BOX_HEAD.CAT_FREQ_PATH = \ - 'datasets/lvis/lvis_v1_train_cat_info.json' - _C.MODEL.ROI_BOX_HEAD.EQL_FREQ_CAT = 200 - _C.MODEL.ROI_BOX_HEAD.USE_FED_LOSS = False - _C.MODEL.ROI_BOX_HEAD.FED_LOSS_NUM_CAT = 50 - _C.MODEL.ROI_BOX_HEAD.FED_LOSS_FREQ_WEIGHT = 0.5 - _C.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE = False - - _C.MODEL.BIFPN = CN() - _C.MODEL.BIFPN.NUM_LEVELS = 5 - _C.MODEL.BIFPN.NUM_BIFPN = 6 - _C.MODEL.BIFPN.NORM = 'GN' - _C.MODEL.BIFPN.OUT_CHANNELS = 160 - _C.MODEL.BIFPN.SEPARABLE_CONV = False - - _C.MODEL.DLA = CN() - _C.MODEL.DLA.OUT_FEATURES = ['dla2'] - _C.MODEL.DLA.USE_DLA_UP = True - _C.MODEL.DLA.NUM_LAYERS = 34 - _C.MODEL.DLA.MS_OUTPUT = False - _C.MODEL.DLA.NORM = 'BN' - _C.MODEL.DLA.DLAUP_IN_FEATURES = ['dla3', 'dla4', 'dla5'] - _C.MODEL.DLA.DLAUP_NODE = 'conv' - - _C.SOLVER.RESET_ITER = False - _C.SOLVER.TRAIN_ITER = -1 - - _C.INPUT.CUSTOM_AUG = '' - _C.INPUT.TRAIN_SIZE = 640 - _C.INPUT.TEST_SIZE = 640 - _C.INPUT.SCALE_RANGE = (0.1, 2.) - # 'default' for fixed short/ long edge, 'square' for max size=INPUT.SIZE - _C.INPUT.TEST_INPUT_TYPE = 'default' - - _C.DEBUG = False - _C.SAVE_DEBUG = False - _C.SAVE_PTH = False - _C.VIS_THRESH = 0.3 - _C.DEBUG_SHOW_NAME = False diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py deleted file mode 100755 index 7d91f21e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_build_augmentation.py +++ /dev/null @@ -1,59 +0,0 @@ -import logging -import numpy as np -import pycocotools.mask as mask_util -import torch -from fvcore.common.file_io import PathManager -from PIL import Image - -from detectron2.structures import ( - BitMasks, - Boxes, - BoxMode, - Instances, - Keypoints, - PolygonMasks, - RotatedBoxes, - polygons_to_bitmask, -) - -from detectron2.data import transforms as T -from .transforms.custom_augmentation_impl import EfficientDetResizeCrop - -def build_custom_augmentation(cfg, is_train): - """ - Create a list of default :class:`Augmentation` from config. - Now it includes resizing and flipping. 
- - Returns: - list[Augmentation] - """ - if cfg.INPUT.CUSTOM_AUG == 'ResizeShortestEdge': - if is_train: - min_size = cfg.INPUT.MIN_SIZE_TRAIN - max_size = cfg.INPUT.MAX_SIZE_TRAIN - sample_style = cfg.INPUT.MIN_SIZE_TRAIN_SAMPLING - else: - min_size = cfg.INPUT.MIN_SIZE_TEST - max_size = cfg.INPUT.MAX_SIZE_TEST - sample_style = "choice" - augmentation = [T.ResizeShortestEdge(min_size, max_size, sample_style)] - elif cfg.INPUT.CUSTOM_AUG == 'EfficientDetResizeCrop': - if is_train: - scale = cfg.INPUT.SCALE_RANGE - size = cfg.INPUT.TRAIN_SIZE - else: - scale = (1, 1) - size = cfg.INPUT.TEST_SIZE - augmentation = [EfficientDetResizeCrop(size, scale)] - else: - assert 0, cfg.INPUT.CUSTOM_AUG - - if is_train: - augmentation.append(T.RandomFlip()) - return augmentation - - -build_custom_transform_gen = build_custom_augmentation -""" -Alias for backward-compatibility. -""" \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py deleted file mode 100755 index 4e9844c9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/custom_dataset_dataloader.py +++ /dev/null @@ -1,229 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import copy -import logging -import numpy as np -import operator -import torch -import torch.utils.data -import json -from detectron2.utils.comm import get_world_size - -from detectron2.data import samplers -from torch.utils.data.sampler import BatchSampler, Sampler -from detectron2.data.common import DatasetFromList, MapDataset -from detectron2.data.dataset_mapper import DatasetMapper -from detectron2.data.build import get_detection_dataset_dicts, build_batch_data_loader -from detectron2.data.samplers import TrainingSampler, RepeatFactorTrainingSampler -from detectron2.data.build import worker_init_reset_seed, print_instances_class_histogram -from detectron2.data.build import filter_images_with_only_crowd_annotations -from detectron2.data.build import filter_images_with_few_keypoints -from detectron2.data.build import check_metadata_consistency -from detectron2.data.catalog import MetadataCatalog, DatasetCatalog -from detectron2.utils import comm -import itertools -import math -from collections import defaultdict -from typing import Optional - -# from .custom_build_augmentation import build_custom_augmentation - -def build_custom_train_loader(cfg, mapper=None): - """ - Modified from detectron2.data.build.build_custom_train_loader, but supports - different samplers - """ - source_aware = cfg.DATALOADER.SOURCE_AWARE - if source_aware: - dataset_dicts = get_detection_dataset_dicts_with_source( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if cfg.MODEL.LOAD_PROPOSALS else None, - ) - sizes = [0 for _ in range(len(cfg.DATASETS.TRAIN))] - for d in dataset_dicts: - sizes[d['dataset_source']] += 1 - print('dataset sizes', sizes) - else: - dataset_dicts = get_detection_dataset_dicts( - cfg.DATASETS.TRAIN, - filter_empty=cfg.DATALOADER.FILTER_EMPTY_ANNOTATIONS, - min_keypoints=cfg.MODEL.ROI_KEYPOINT_HEAD.MIN_KEYPOINTS_PER_IMAGE - if cfg.MODEL.KEYPOINT_ON - else 0, - proposal_files=cfg.DATASETS.PROPOSAL_FILES_TRAIN if 
cfg.MODEL.LOAD_PROPOSALS else None, - ) - dataset = DatasetFromList(dataset_dicts, copy=False) - - if mapper is None: - assert 0 - # mapper = DatasetMapper(cfg, True) - dataset = MapDataset(dataset, mapper) - - sampler_name = cfg.DATALOADER.SAMPLER_TRAIN - logger = logging.getLogger(__name__) - logger.info("Using training sampler {}".format(sampler_name)) - # TODO avoid if-else? - if sampler_name == "TrainingSampler": - sampler = TrainingSampler(len(dataset)) - elif sampler_name == "MultiDatasetSampler": - assert source_aware - sampler = MultiDatasetSampler(cfg, sizes, dataset_dicts) - elif sampler_name == "RepeatFactorTrainingSampler": - repeat_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency( - dataset_dicts, cfg.DATALOADER.REPEAT_THRESHOLD - ) - sampler = RepeatFactorTrainingSampler(repeat_factors) - elif sampler_name == "ClassAwareSampler": - sampler = ClassAwareSampler(dataset_dicts) - else: - raise ValueError("Unknown training sampler: {}".format(sampler_name)) - - return build_batch_data_loader( - dataset, - sampler, - cfg.SOLVER.IMS_PER_BATCH, - aspect_ratio_grouping=cfg.DATALOADER.ASPECT_RATIO_GROUPING, - num_workers=cfg.DATALOADER.NUM_WORKERS, - ) - - -class ClassAwareSampler(Sampler): - def __init__(self, dataset_dicts, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). - """ - self._size = len(dataset_dicts) - assert self._size > 0 - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - self.weights = self._get_class_balance_factor(dataset_dicts) - - - def __iter__(self): - start = self._rank - yield from itertools.islice( - self._infinite_indices(), start, None, self._world_size) - - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - ids = torch.multinomial( - self.weights, self._size, generator=g, - replacement=True) - yield from ids - - - def _get_class_balance_factor(self, dataset_dicts, l=1.): - # 1. For each category c, compute the fraction of images that contain it: f(c) - ret = [] - category_freq = defaultdict(int) - for dataset_dict in dataset_dicts: # For each image (without repeats) - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - for cat_id in cat_ids: - category_freq[cat_id] += 1 - for i, dataset_dict in enumerate(dataset_dicts): - cat_ids = {ann["category_id"] for ann in dataset_dict["annotations"]} - ret.append(sum( - [1. 
/ (category_freq[cat_id] ** l) for cat_id in cat_ids])) - return torch.tensor(ret).float() - - -def get_detection_dataset_dicts_with_source( - dataset_names, filter_empty=True, min_keypoints=0, proposal_files=None -): - assert len(dataset_names) - dataset_dicts = [DatasetCatalog.get(dataset_name) for dataset_name in dataset_names] - for dataset_name, dicts in zip(dataset_names, dataset_dicts): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - - for source_id, (dataset_name, dicts) in \ - enumerate(zip(dataset_names, dataset_dicts)): - assert len(dicts), "Dataset '{}' is empty!".format(dataset_name) - for d in dicts: - d['dataset_source'] = source_id - - if "annotations" in dicts[0]: - try: - class_names = MetadataCatalog.get(dataset_name).thing_classes - check_metadata_consistency("thing_classes", dataset_name) - print_instances_class_histogram(dicts, class_names) - except AttributeError: # class names are not available for this dataset - pass - - assert proposal_files is None - - dataset_dicts = list(itertools.chain.from_iterable(dataset_dicts)) - - has_instances = "annotations" in dataset_dicts[0] - if filter_empty and has_instances: - dataset_dicts = filter_images_with_only_crowd_annotations(dataset_dicts) - if min_keypoints > 0 and has_instances: - dataset_dicts = filter_images_with_few_keypoints(dataset_dicts, min_keypoints) - - return dataset_dicts - -class MultiDatasetSampler(Sampler): - def __init__(self, cfg, sizes, dataset_dicts, seed: Optional[int] = None): - """ - Args: - size (int): the total number of data of the underlying dataset to sample from - seed (int): the initial seed of the shuffle. Must be the same - across all workers. If None, will use a random seed shared - among workers (require synchronization among all workers). 
- """ - self.sizes = sizes - dataset_ratio = cfg.DATALOADER.DATASET_RATIO - self._batch_size = cfg.SOLVER.IMS_PER_BATCH - assert len(dataset_ratio) == len(sizes), \ - 'length of dataset ratio {} should be equal to number if dataset {}'.format( - len(dataset_ratio), len(sizes) - ) - if seed is None: - seed = comm.shared_random_seed() - self._seed = int(seed) - self._rank = comm.get_rank() - self._world_size = comm.get_world_size() - - self._ims_per_gpu = self._batch_size // self._world_size - self.dataset_ids = torch.tensor( - [d['dataset_source'] for d in dataset_dicts], dtype=torch.long) - - dataset_weight = [torch.ones(s) * max(sizes) / s * r / sum(dataset_ratio) \ - for i, (r, s) in enumerate(zip(dataset_ratio, sizes))] - dataset_weight = torch.cat(dataset_weight) - self.weights = dataset_weight - self.sample_epoch_size = len(self.weights) - - def __iter__(self): - start = self._rank - yield from itertools.islice( - self._infinite_indices(), start, None, self._world_size) - - - def _infinite_indices(self): - g = torch.Generator() - g.manual_seed(self._seed) - while True: - ids = torch.multinomial( - self.weights, self.sample_epoch_size, generator=g, - replacement=True) - nums = [(self.dataset_ids[ids] == i).sum().int().item() \ - for i in range(len(self.sizes))] - print('_rank, len, nums', self._rank, len(ids), nums, flush=True) - # print('_rank, len, nums, self.dataset_ids[ids[:10]], ', - # self._rank, len(ids), nums, self.dataset_ids[ids[:10]], - # flush=True) - yield from ids \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py deleted file mode 100755 index f8496aac..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/coco.py +++ /dev/null @@ -1,49 +0,0 @@ -import os - -from detectron2.data.datasets.register_coco import register_coco_instances -from detectron2.data.datasets.coco import load_coco_json -from detectron2.data.datasets.builtin_meta import _get_builtin_metadata -from detectron2.data import DatasetCatalog, MetadataCatalog - - -def register_distill_coco_instances(name, metadata, json_file, image_root): - """ - add extra_annotation_keys - """ - assert isinstance(name, str), name - assert isinstance(json_file, (str, os.PathLike)), json_file - assert isinstance(image_root, (str, os.PathLike)), image_root - # 1. register a function which returns dicts - DatasetCatalog.register(name, lambda: load_coco_json( - json_file, image_root, name, extra_annotation_keys=['score'])) - - # 2. 
Optionally, add metadata about this dataset, - # since they might be useful in evaluation, visualization or logging - MetadataCatalog.get(name).set( - json_file=json_file, image_root=image_root, evaluator_type="coco", **metadata - ) - - -_PREDEFINED_SPLITS_COCO = { - "coco_2017_unlabeled": ("coco/unlabeled2017", "coco/annotations/image_info_unlabeled2017.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS_COCO.items(): - register_coco_instances( - key, - _get_builtin_metadata('coco'), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) - -_PREDEFINED_SPLITS_DISTILL_COCO = { - "coco_un_yolov4_55_0.5": ("coco/unlabeled2017", "coco/annotations/yolov4_cocounlabeled_55_ann0.5.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS_DISTILL_COCO.items(): - register_distill_coco_instances( - key, - _get_builtin_metadata('coco'), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py deleted file mode 100755 index 52736e33..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/nuimages.py +++ /dev/null @@ -1,37 +0,0 @@ -from detectron2.data.datasets.register_coco import register_coco_instances -import os - -categories = [ - {'id': 0, 'name': 'car'}, - {'id': 1, 'name': 'truck'}, - {'id': 2, 'name': 'trailer'}, - {'id': 3, 'name': 'bus'}, - {'id': 4, 'name': 'construction_vehicle'}, - {'id': 5, 'name': 'bicycle'}, - {'id': 6, 'name': 'motorcycle'}, - {'id': 7, 'name': 'pedestrian'}, - {'id': 8, 'name': 'traffic_cone'}, - {'id': 9, 'name': 'barrier'}, -] - -def _get_builtin_metadata(): - id_to_name = {x['id']: x['name'] for x in categories} - thing_dataset_id_to_contiguous_id = {i: i for i in range(len(categories))} - thing_classes = [id_to_name[k] for k in sorted(id_to_name)] - return { - "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id, - "thing_classes": thing_classes} - -_PREDEFINED_SPLITS = { - "nuimages_train": ("nuimages", "nuimages/annotations/nuimages_v1.0-train.json"), - "nuimages_val": ("nuimages", "nuimages/annotations/nuimages_v1.0-val.json"), - "nuimages_mini": ("nuimages", "nuimages/annotations/nuimages_v1.0-mini.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS.items(): - register_coco_instances( - key, - _get_builtin_metadata(), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py deleted file mode 100755 index 41395bdd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/datasets/objects365.py +++ /dev/null @@ -1,394 +0,0 @@ -from detectron2.data.datasets.register_coco import register_coco_instances -import os - -categories_v1 = [ -{'id': 164, 'name': 'cutting/chopping board'} , -{'id': 49, 'name': 'tie'} , -{'id': 306, 'name': 'crosswalk sign'} , -{'id': 145, 'name': 'gun'} , -{'id': 14, 'name': 'street lights'} , -{'id': 223, 'name': 'bar soap'} , -{'id': 74, 'name': 
'wild bird'} , -{'id': 219, 'name': 'ice cream'} , -{'id': 37, 'name': 'stool'} , -{'id': 25, 'name': 'storage box'} , -{'id': 153, 'name': 'giraffe'} , -{'id': 52, 'name': 'pen/pencil'} , -{'id': 61, 'name': 'high heels'} , -{'id': 340, 'name': 'mangosteen'} , -{'id': 22, 'name': 'bracelet'} , -{'id': 155, 'name': 'piano'} , -{'id': 162, 'name': 'vent'} , -{'id': 75, 'name': 'laptop'} , -{'id': 236, 'name': 'toaster'} , -{'id': 231, 'name': 'fire truck'} , -{'id': 42, 'name': 'basket'} , -{'id': 150, 'name': 'zebra'} , -{'id': 124, 'name': 'head phone'} , -{'id': 90, 'name': 'sheep'} , -{'id': 322, 'name': 'steak'} , -{'id': 39, 'name': 'couch'} , -{'id': 209, 'name': 'toothbrush'} , -{'id': 59, 'name': 'bicycle'} , -{'id': 336, 'name': 'red cabbage'} , -{'id': 228, 'name': 'golf ball'} , -{'id': 120, 'name': 'tomato'} , -{'id': 132, 'name': 'computer box'} , -{'id': 8, 'name': 'cup'} , -{'id': 183, 'name': 'basketball'} , -{'id': 298, 'name': 'butterfly'} , -{'id': 250, 'name': 'garlic'} , -{'id': 12, 'name': 'desk'} , -{'id': 141, 'name': 'microwave'} , -{'id': 171, 'name': 'strawberry'} , -{'id': 200, 'name': 'kettle'} , -{'id': 63, 'name': 'van'} , -{'id': 300, 'name': 'cheese'} , -{'id': 215, 'name': 'marker'} , -{'id': 100, 'name': 'blackboard/whiteboard'} , -{'id': 186, 'name': 'printer'} , -{'id': 333, 'name': 'bread/bun'} , -{'id': 243, 'name': 'penguin'} , -{'id': 364, 'name': 'iron'} , -{'id': 180, 'name': 'ladder'} , -{'id': 34, 'name': 'flag'} , -{'id': 78, 'name': 'cell phone'} , -{'id': 97, 'name': 'fan'} , -{'id': 224, 'name': 'scale'} , -{'id': 151, 'name': 'duck'} , -{'id': 319, 'name': 'flute'} , -{'id': 156, 'name': 'stop sign'} , -{'id': 290, 'name': 'rickshaw'} , -{'id': 128, 'name': 'sailboat'} , -{'id': 165, 'name': 'tennis racket'} , -{'id': 241, 'name': 'cigar'} , -{'id': 101, 'name': 'balloon'} , -{'id': 308, 'name': 'hair drier'} , -{'id': 167, 'name': 'skating and skiing shoes'} , -{'id': 237, 'name': 'helicopter'} , -{'id': 65, 'name': 'sink'} , -{'id': 129, 'name': 'tangerine'} , -{'id': 330, 'name': 'crab'} , -{'id': 320, 'name': 'measuring cup'} , -{'id': 260, 'name': 'fishing rod'} , -{'id': 346, 'name': 'saw'} , -{'id': 216, 'name': 'ship'} , -{'id': 46, 'name': 'coffee table'} , -{'id': 194, 'name': 'facial mask'} , -{'id': 281, 'name': 'stapler'} , -{'id': 118, 'name': 'refrigerator'} , -{'id': 40, 'name': 'belt'} , -{'id': 349, 'name': 'starfish'} , -{'id': 87, 'name': 'hanger'} , -{'id': 116, 'name': 'baseball glove'} , -{'id': 261, 'name': 'cherry'} , -{'id': 334, 'name': 'baozi'} , -{'id': 267, 'name': 'screwdriver'} , -{'id': 158, 'name': 'converter'} , -{'id': 335, 'name': 'lion'} , -{'id': 170, 'name': 'baseball'} , -{'id': 111, 'name': 'skis'} , -{'id': 136, 'name': 'broccoli'} , -{'id': 342, 'name': 'eraser'} , -{'id': 337, 'name': 'polar bear'} , -{'id': 139, 'name': 'shovel'} , -{'id': 193, 'name': 'extension cord'} , -{'id': 284, 'name': 'goldfish'} , -{'id': 174, 'name': 'pepper'} , -{'id': 138, 'name': 'stroller'} , -{'id': 328, 'name': 'yak'} , -{'id': 83, 'name': 'clock'} , -{'id': 235, 'name': 'tricycle'} , -{'id': 248, 'name': 'parking meter'} , -{'id': 274, 'name': 'trophy'} , -{'id': 324, 'name': 'binoculars'} , -{'id': 51, 'name': 'traffic light'} , -{'id': 314, 'name': 'donkey'} , -{'id': 45, 'name': 'barrel/bucket'} , -{'id': 292, 'name': 'pomegranate'} , -{'id': 13, 'name': 'handbag'} , -{'id': 262, 'name': 'tablet'} , -{'id': 68, 'name': 'apple'} , -{'id': 226, 'name': 'cabbage'} , -{'id': 23, 'name': 'flower'} , -{'id': 58, 
'name': 'faucet'} , -{'id': 206, 'name': 'tong'} , -{'id': 291, 'name': 'trombone'} , -{'id': 160, 'name': 'carrot'} , -{'id': 172, 'name': 'bow tie'} , -{'id': 122, 'name': 'tent'} , -{'id': 163, 'name': 'cookies'} , -{'id': 115, 'name': 'remote'} , -{'id': 175, 'name': 'coffee machine'} , -{'id': 238, 'name': 'green beans'} , -{'id': 233, 'name': 'cello'} , -{'id': 28, 'name': 'wine glass'} , -{'id': 295, 'name': 'mushroom'} , -{'id': 344, 'name': 'scallop'} , -{'id': 125, 'name': 'lantern'} , -{'id': 123, 'name': 'shampoo/shower gel'} , -{'id': 285, 'name': 'meat balls'} , -{'id': 266, 'name': 'key'} , -{'id': 296, 'name': 'calculator'} , -{'id': 168, 'name': 'scissors'} , -{'id': 103, 'name': 'cymbal'} , -{'id': 6, 'name': 'bottle'} , -{'id': 264, 'name': 'nuts'} , -{'id': 234, 'name': 'notepaper'} , -{'id': 211, 'name': 'mango'} , -{'id': 287, 'name': 'toothpaste'} , -{'id': 196, 'name': 'chopsticks'} , -{'id': 140, 'name': 'baseball bat'} , -{'id': 244, 'name': 'hurdle'} , -{'id': 195, 'name': 'tennis ball'} , -{'id': 144, 'name': 'surveillance camera'} , -{'id': 271, 'name': 'volleyball'} , -{'id': 94, 'name': 'keyboard'} , -{'id': 339, 'name': 'seal'} , -{'id': 11, 'name': 'picture/frame'} , -{'id': 348, 'name': 'okra'} , -{'id': 191, 'name': 'sausage'} , -{'id': 166, 'name': 'candy'} , -{'id': 62, 'name': 'ring'} , -{'id': 311, 'name': 'dolphin'} , -{'id': 273, 'name': 'eggplant'} , -{'id': 84, 'name': 'drum'} , -{'id': 143, 'name': 'surfboard'} , -{'id': 288, 'name': 'antelope'} , -{'id': 204, 'name': 'clutch'} , -{'id': 207, 'name': 'slide'} , -{'id': 43, 'name': 'towel/napkin'} , -{'id': 352, 'name': 'durian'} , -{'id': 276, 'name': 'board eraser'} , -{'id': 315, 'name': 'electric drill'} , -{'id': 312, 'name': 'sushi'} , -{'id': 198, 'name': 'pie'} , -{'id': 106, 'name': 'pickup truck'} , -{'id': 176, 'name': 'bathtub'} , -{'id': 26, 'name': 'vase'} , -{'id': 133, 'name': 'elephant'} , -{'id': 256, 'name': 'sandwich'} , -{'id': 327, 'name': 'noodles'} , -{'id': 10, 'name': 'glasses'} , -{'id': 109, 'name': 'airplane'} , -{'id': 95, 'name': 'tripod'} , -{'id': 247, 'name': 'CD'} , -{'id': 121, 'name': 'machinery vehicle'} , -{'id': 365, 'name': 'flashlight'} , -{'id': 53, 'name': 'microphone'} , -{'id': 270, 'name': 'pliers'} , -{'id': 362, 'name': 'chainsaw'} , -{'id': 259, 'name': 'bear'} , -{'id': 197, 'name': 'electronic stove and gas stove'} , -{'id': 89, 'name': 'pot/pan'} , -{'id': 220, 'name': 'tape'} , -{'id': 338, 'name': 'lighter'} , -{'id': 177, 'name': 'snowboard'} , -{'id': 214, 'name': 'violin'} , -{'id': 217, 'name': 'chicken'} , -{'id': 2, 'name': 'sneakers'} , -{'id': 161, 'name': 'washing machine'} , -{'id': 131, 'name': 'kite'} , -{'id': 354, 'name': 'rabbit'} , -{'id': 86, 'name': 'bus'} , -{'id': 275, 'name': 'dates'} , -{'id': 282, 'name': 'camel'} , -{'id': 88, 'name': 'nightstand'} , -{'id': 179, 'name': 'grapes'} , -{'id': 229, 'name': 'pine apple'} , -{'id': 56, 'name': 'necklace'} , -{'id': 18, 'name': 'leather shoes'} , -{'id': 358, 'name': 'hoverboard'} , -{'id': 345, 'name': 'pencil case'} , -{'id': 359, 'name': 'pasta'} , -{'id': 157, 'name': 'radiator'} , -{'id': 201, 'name': 'hamburger'} , -{'id': 268, 'name': 'globe'} , -{'id': 332, 'name': 'barbell'} , -{'id': 329, 'name': 'mop'} , -{'id': 252, 'name': 'horn'} , -{'id': 350, 'name': 'eagle'} , -{'id': 169, 'name': 'folder'} , -{'id': 137, 'name': 'toilet'} , -{'id': 5, 'name': 'lamp'} , -{'id': 27, 'name': 'bench'} , -{'id': 249, 'name': 'swan'} , -{'id': 76, 'name': 'knife'} , -{'id': 341, 
'name': 'comb'} , -{'id': 64, 'name': 'watch'} , -{'id': 105, 'name': 'telephone'} , -{'id': 3, 'name': 'chair'} , -{'id': 33, 'name': 'boat'} , -{'id': 107, 'name': 'orange'} , -{'id': 60, 'name': 'bread'} , -{'id': 147, 'name': 'cat'} , -{'id': 135, 'name': 'gas stove'} , -{'id': 307, 'name': 'papaya'} , -{'id': 227, 'name': 'router/modem'} , -{'id': 357, 'name': 'asparagus'} , -{'id': 73, 'name': 'motorcycle'} , -{'id': 77, 'name': 'traffic sign'} , -{'id': 67, 'name': 'fish'} , -{'id': 326, 'name': 'radish'} , -{'id': 213, 'name': 'egg'} , -{'id': 203, 'name': 'cucumber'} , -{'id': 17, 'name': 'helmet'} , -{'id': 110, 'name': 'luggage'} , -{'id': 80, 'name': 'truck'} , -{'id': 199, 'name': 'frisbee'} , -{'id': 232, 'name': 'peach'} , -{'id': 1, 'name': 'person'} , -{'id': 29, 'name': 'boots'} , -{'id': 310, 'name': 'chips'} , -{'id': 142, 'name': 'skateboard'} , -{'id': 44, 'name': 'slippers'} , -{'id': 4, 'name': 'hat'} , -{'id': 178, 'name': 'suitcase'} , -{'id': 24, 'name': 'tv'} , -{'id': 119, 'name': 'train'} , -{'id': 82, 'name': 'power outlet'} , -{'id': 245, 'name': 'swing'} , -{'id': 15, 'name': 'book'} , -{'id': 294, 'name': 'jellyfish'} , -{'id': 192, 'name': 'fire extinguisher'} , -{'id': 212, 'name': 'deer'} , -{'id': 181, 'name': 'pear'} , -{'id': 347, 'name': 'table tennis paddle'} , -{'id': 113, 'name': 'trolley'} , -{'id': 91, 'name': 'guitar'} , -{'id': 202, 'name': 'golf club'} , -{'id': 221, 'name': 'wheelchair'} , -{'id': 254, 'name': 'saxophone'} , -{'id': 117, 'name': 'paper towel'} , -{'id': 303, 'name': 'race car'} , -{'id': 240, 'name': 'carriage'} , -{'id': 246, 'name': 'radio'} , -{'id': 318, 'name': 'parrot'} , -{'id': 251, 'name': 'french fries'} , -{'id': 98, 'name': 'dog'} , -{'id': 112, 'name': 'soccer'} , -{'id': 355, 'name': 'french horn'} , -{'id': 79, 'name': 'paddle'} , -{'id': 283, 'name': 'lettuce'} , -{'id': 9, 'name': 'car'} , -{'id': 258, 'name': 'kiwi fruit'} , -{'id': 325, 'name': 'llama'} , -{'id': 187, 'name': 'billiards'} , -{'id': 210, 'name': 'facial cleanser'} , -{'id': 81, 'name': 'cow'} , -{'id': 331, 'name': 'microscope'} , -{'id': 148, 'name': 'lemon'} , -{'id': 302, 'name': 'pomelo'} , -{'id': 85, 'name': 'fork'} , -{'id': 154, 'name': 'pumpkin'} , -{'id': 289, 'name': 'shrimp'} , -{'id': 71, 'name': 'teddy bear'} , -{'id': 184, 'name': 'potato'} , -{'id': 102, 'name': 'air conditioner'} , -{'id': 208, 'name': 'hot dog'} , -{'id': 222, 'name': 'plum'} , -{'id': 316, 'name': 'spring rolls'} , -{'id': 230, 'name': 'crane'} , -{'id': 149, 'name': 'liquid soap'} , -{'id': 55, 'name': 'canned'} , -{'id': 35, 'name': 'speaker'} , -{'id': 108, 'name': 'banana'} , -{'id': 297, 'name': 'treadmill'} , -{'id': 99, 'name': 'spoon'} , -{'id': 104, 'name': 'mouse'} , -{'id': 182, 'name': 'american football'} , -{'id': 299, 'name': 'egg tart'} , -{'id': 127, 'name': 'cleaning products'} , -{'id': 313, 'name': 'urinal'} , -{'id': 286, 'name': 'medal'} , -{'id': 239, 'name': 'brush'} , -{'id': 96, 'name': 'hockey'} , -{'id': 279, 'name': 'dumbbell'} , -{'id': 32, 'name': 'umbrella'} , -{'id': 272, 'name': 'hammer'} , -{'id': 16, 'name': 'plate'} , -{'id': 21, 'name': 'potted plant'} , -{'id': 242, 'name': 'earphone'} , -{'id': 70, 'name': 'candle'} , -{'id': 185, 'name': 'paint brush'} , -{'id': 48, 'name': 'toy'} , -{'id': 130, 'name': 'pizza'} , -{'id': 255, 'name': 'trumpet'} , -{'id': 361, 'name': 'hotair balloon'} , -{'id': 188, 'name': 'fire hydrant'} , -{'id': 50, 'name': 'bed'} , -{'id': 253, 'name': 'avocado'} , -{'id': 293, 'name': 
'coconut'} , -{'id': 257, 'name': 'cue'} , -{'id': 280, 'name': 'hamimelon'} , -{'id': 66, 'name': 'horse'} , -{'id': 173, 'name': 'pigeon'} , -{'id': 190, 'name': 'projector'} , -{'id': 69, 'name': 'camera'} , -{'id': 30, 'name': 'bowl'} , -{'id': 269, 'name': 'broom'} , -{'id': 343, 'name': 'pitaya'} , -{'id': 305, 'name': 'tuba'} , -{'id': 309, 'name': 'green onion'} , -{'id': 363, 'name': 'lobster'} , -{'id': 225, 'name': 'watermelon'} , -{'id': 47, 'name': 'suv'} , -{'id': 31, 'name': 'dining table'} , -{'id': 54, 'name': 'sandals'} , -{'id': 351, 'name': 'monkey'} , -{'id': 218, 'name': 'onion'} , -{'id': 36, 'name': 'trash bin/can'} , -{'id': 20, 'name': 'glove'} , -{'id': 277, 'name': 'rice'} , -{'id': 152, 'name': 'sports car'} , -{'id': 360, 'name': 'target'} , -{'id': 205, 'name': 'blender'} , -{'id': 19, 'name': 'pillow'} , -{'id': 72, 'name': 'cake'} , -{'id': 93, 'name': 'tea pot'} , -{'id': 353, 'name': 'game board'} , -{'id': 38, 'name': 'backpack'} , -{'id': 356, 'name': 'ambulance'} , -{'id': 146, 'name': 'life saver'} , -{'id': 189, 'name': 'goose'} , -{'id': 278, 'name': 'tape measure/ruler'} , -{'id': 92, 'name': 'traffic cone'} , -{'id': 134, 'name': 'toiletries'} , -{'id': 114, 'name': 'oven'} , -{'id': 317, 'name': 'tortoise/turtle'} , -{'id': 265, 'name': 'corn'} , -{'id': 126, 'name': 'donut'} , -{'id': 57, 'name': 'mirror'} , -{'id': 7, 'name': 'cabinet/shelf'} , -{'id': 263, 'name': 'green vegetables'} , -{'id': 159, 'name': 'tissue '} , -{'id': 321, 'name': 'shark'} , -{'id': 301, 'name': 'pig'} , -{'id': 41, 'name': 'carpet'} , -{'id': 304, 'name': 'rice cooker'} , -{'id': 323, 'name': 'poker card'} , -] - -def _get_builtin_metadata(version): - if version == 'v1': - id_to_name = {x['id']: x['name'] for x in categories_v1} - else: - assert 0, version - thing_dataset_id_to_contiguous_id = {i + 1: i for i in range(365)} - thing_classes = [id_to_name[k] for k in sorted(id_to_name)] - return { - "thing_dataset_id_to_contiguous_id": thing_dataset_id_to_contiguous_id, - "thing_classes": thing_classes} - -_PREDEFINED_SPLITS_OBJECTS365 = { - "objects365_train": ("objects365/train", "objects365/annotations/objects365_train.json"), - "objects365_val": ("objects365/val", "objects365/annotations/objects365_val.json"), -} - -for key, (image_root, json_file) in _PREDEFINED_SPLITS_OBJECTS365.items(): - register_coco_instances( - key, - _get_builtin_metadata('v1'), - os.path.join("datasets", json_file) if "://" not in json_file else json_file, - os.path.join("datasets", image_root), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py deleted file mode 100755 index 5a69e178..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_augmentation_impl.py +++ /dev/null @@ -1,63 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Modified by Xingyi Zhou -""" -Implement many useful :class:`Augmentation`. 
-""" -import numpy as np -import sys -from fvcore.transforms.transform import ( - BlendTransform, - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - VFlipTransform, -) -from PIL import Image - -from detectron2.data.transforms.augmentation import Augmentation -from .custom_transform import EfficientDetResizeCropTransform - -__all__ = [ - "EfficientDetResizeCrop", -] - - -class EfficientDetResizeCrop(Augmentation): - """ - Scale the shorter edge to the given size, with a limit of `max_size` on the longer edge. - If `max_size` is reached, then downscale so that the longer edge does not exceed max_size. - """ - - def __init__( - self, size, scale, interp=Image.BILINEAR - ): - """ - Args: - """ - super().__init__() - self.target_size = (size, size) - self.scale = scale - self.interp = interp - - def get_transform(self, img): - # Select a random scale factor. - scale_factor = np.random.uniform(*self.scale) - scaled_target_height = scale_factor * self.target_size[0] - scaled_target_width = scale_factor * self.target_size[1] - # Recompute the accurate scale_factor using rounded scaled image size. - width, height = img.shape[1], img.shape[0] - img_scale_y = scaled_target_height / height - img_scale_x = scaled_target_width / width - img_scale = min(img_scale_y, img_scale_x) - - # Select non-zero random offset (x, y) if scaled image is larger than target size - scaled_h = int(height * img_scale) - scaled_w = int(width * img_scale) - offset_y = scaled_h - self.target_size[0] - offset_x = scaled_w - self.target_size[1] - offset_y = int(max(0.0, float(offset_y)) * np.random.uniform(0, 1)) - offset_x = int(max(0.0, float(offset_x)) * np.random.uniform(0, 1)) - return EfficientDetResizeCropTransform( - scaled_h, scaled_w, offset_y, offset_x, img_scale, self.target_size, self.interp) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py deleted file mode 100755 index 654d65d9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/data/transforms/custom_transform.py +++ /dev/null @@ -1,94 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Modified by Xingyi Zhou -# File: transform.py - -import numpy as np -import torch -import torch.nn.functional as F -from fvcore.transforms.transform import ( - CropTransform, - HFlipTransform, - NoOpTransform, - Transform, - TransformList, -) -from PIL import Image - -try: - import cv2 # noqa -except ImportError: - # OpenCV is an optional dependency at the moment - pass - -__all__ = [ - "EfficientDetResizeCropTransform", -] - - -class EfficientDetResizeCropTransform(Transform): - """ - """ - - def __init__(self, scaled_h, scaled_w, offset_y, offset_x, img_scale, target_size, interp=None): - """ - Args: - h, w (int): original image size - new_h, new_w (int): new image size - interp: PIL interpolation methods, defaults to bilinear. 
- """ - # TODO decide on PIL vs opencv - super().__init__() - if interp is None: - interp = Image.BILINEAR - self._set_attributes(locals()) - - def apply_image(self, img, interp=None): - # assert img.shape[:2] == (self.h, self.w) - assert len(img.shape) <= 4 - - if img.dtype == np.uint8: - pil_image = Image.fromarray(img) - interp_method = interp if interp is not None else self.interp - pil_image = pil_image.resize((self.scaled_w, self.scaled_h), interp_method) - ret = np.asarray(pil_image) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - # img = img.crop((self.offset_x, self.offset_y, right, lower)) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - else: - # PIL only supports uint8 - img = torch.from_numpy(img) - shape = list(img.shape) - shape_4d = shape[:2] + [1] * (4 - len(shape)) + shape[2:] - img = img.view(shape_4d).permute(2, 3, 0, 1) # hw(c) -> nchw - _PIL_RESIZE_TO_INTERPOLATE_MODE = {Image.BILINEAR: "bilinear", Image.BICUBIC: "bicubic"} - mode = _PIL_RESIZE_TO_INTERPOLATE_MODE[self.interp] - img = F.interpolate(img, (self.scaled_h, self.scaled_w), mode=mode, align_corners=False) - shape[:2] = (self.scaled_h, self.scaled_w) - ret = img.permute(2, 3, 0, 1).view(shape).numpy() # nchw -> hw(c) - right = min(self.scaled_w, self.offset_x + self.target_size[1]) - lower = min(self.scaled_h, self.offset_y + self.target_size[0]) - if len(ret.shape) <= 3: - ret = ret[self.offset_y: lower, self.offset_x: right] - else: - ret = ret[..., self.offset_y: lower, self.offset_x: right, :] - return ret - - def apply_coords(self, coords): - coords[:, 0] = coords[:, 0] * self.img_scale - coords[:, 1] = coords[:, 1] * self.img_scale - coords[:, 0] -= self.offset_x - coords[:, 1] -= self.offset_y - return coords - - def apply_segmentation(self, segmentation): - segmentation = self.apply_image(segmentation, interp=Image.NEAREST) - return segmentation - - def inverse(self): - raise NotImplementedError - # return ResizeTransform(self.new_h, self.new_w, self.h, self.w, self.interp) \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py deleted file mode 100755 index 565e2940..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py +++ /dev/null @@ -1,425 +0,0 @@ -# Modified from https://github.com/rwightman/efficientdet-pytorch/blob/master/effdet/efficientdet.py -# The original file is under Apache-2.0 License -import math -from os.path import join -import numpy as np -from collections import OrderedDict -from typing import List - -import torch -from torch import nn -import torch.utils.model_zoo as model_zoo -import torch.nn.functional as F -import fvcore.nn.weight_init as weight_init - -from detectron2.layers import ShapeSpec, Conv2d -from detectron2.modeling.backbone.resnet import build_resnet_backbone -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.layers.batch_norm import get_norm -from detectron2.modeling.backbone import Backbone -from .dlafpn import dla34 - -def get_fpn_config(base_reduction=8): - """BiFPN config with sum.""" - p = { - 'nodes': [ - {'reduction': base_reduction << 3, 'inputs_offsets': [3, 4]}, - {'reduction': 
base_reduction << 2, 'inputs_offsets': [2, 5]}, - {'reduction': base_reduction << 1, 'inputs_offsets': [1, 6]}, - {'reduction': base_reduction, 'inputs_offsets': [0, 7]}, - {'reduction': base_reduction << 1, 'inputs_offsets': [1, 7, 8]}, - {'reduction': base_reduction << 2, 'inputs_offsets': [2, 6, 9]}, - {'reduction': base_reduction << 3, 'inputs_offsets': [3, 5, 10]}, - {'reduction': base_reduction << 4, 'inputs_offsets': [4, 11]}, - ], - 'weight_method': 'fastattn', - } - return p - - -def swish(x, inplace: bool = False): - """Swish - Described in: https://arxiv.org/abs/1710.05941 - """ - return x.mul_(x.sigmoid()) if inplace else x.mul(x.sigmoid()) - - -class Swish(nn.Module): - def __init__(self, inplace: bool = False): - super(Swish, self).__init__() - self.inplace = inplace - - def forward(self, x): - return swish(x, self.inplace) - - -class SequentialAppend(nn.Sequential): - def __init__(self, *args): - super(SequentialAppend, self).__init__(*args) - - def forward(self, x): - for module in self: - x.append(module(x)) - return x - - -class SequentialAppendLast(nn.Sequential): - def __init__(self, *args): - super(SequentialAppendLast, self).__init__(*args) - - # def forward(self, x: List[torch.Tensor]): - def forward(self, x): - for module in self: - x.append(module(x[-1])) - return x - - -class ConvBnAct2d(nn.Module): - def __init__(self, in_channels, out_channels, kernel_size, stride=1, dilation=1, padding='', bias=False, - norm='', act_layer=Swish): - super(ConvBnAct2d, self).__init__() - # self.conv = create_conv2d( - # in_channels, out_channels, kernel_size, stride=stride, dilation=dilation, padding=padding, bias=bias) - self.conv = Conv2d( - in_channels, out_channels, kernel_size=kernel_size, stride=stride, - padding=kernel_size // 2, bias=(norm == '')) - self.bn = get_norm(norm, out_channels) - self.act = None if act_layer is None else act_layer(inplace=True) - - def forward(self, x): - x = self.conv(x) - if self.bn is not None: - x = self.bn(x) - if self.act is not None: - x = self.act(x) - return x - - -class SeparableConv2d(nn.Module): - """ Separable Conv - """ - def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, dilation=1, padding='', bias=False, - channel_multiplier=1.0, pw_kernel_size=1, act_layer=Swish, - norm=''): - super(SeparableConv2d, self).__init__() - - # self.conv_dw = create_conv2d( - # in_channels, int(in_channels * channel_multiplier), kernel_size, - # stride=stride, dilation=dilation, padding=padding, depthwise=True) - - self.conv_dw = Conv2d( - in_channels, int(in_channels * channel_multiplier), - kernel_size=kernel_size, stride=stride, padding=kernel_size // 2, bias=bias, - groups=out_channels) - # print('conv_dw', kernel_size, stride) - # self.conv_pw = create_conv2d( - # int(in_channels * channel_multiplier), out_channels, pw_kernel_size, padding=padding, bias=bias) - - self.conv_pw = Conv2d( - int(in_channels * channel_multiplier), out_channels, - kernel_size=pw_kernel_size, padding=pw_kernel_size // 2, bias=(norm=='')) - # print('conv_pw', pw_kernel_size) - - self.bn = get_norm(norm, out_channels) - self.act = None if act_layer is None else act_layer(inplace=True) - - def forward(self, x): - x = self.conv_dw(x) - x = self.conv_pw(x) - if self.bn is not None: - x = self.bn(x) - if self.act is not None: - x = self.act(x) - return x - - -class ResampleFeatureMap(nn.Sequential): - def __init__(self, in_channels, out_channels, reduction_ratio=1., pad_type='', pooling_type='max', - norm='', apply_bn=False, conv_after_downsample=False, - 
redundant_bias=False): - super(ResampleFeatureMap, self).__init__() - pooling_type = pooling_type or 'max' - self.in_channels = in_channels - self.out_channels = out_channels - self.reduction_ratio = reduction_ratio - self.conv_after_downsample = conv_after_downsample - - conv = None - if in_channels != out_channels: - conv = ConvBnAct2d( - in_channels, out_channels, kernel_size=1, padding=pad_type, - norm=norm if apply_bn else '', - bias=not apply_bn or redundant_bias, act_layer=None) - - if reduction_ratio > 1: - stride_size = int(reduction_ratio) - if conv is not None and not self.conv_after_downsample: - self.add_module('conv', conv) - self.add_module( - 'downsample', - # create_pool2d( - # pooling_type, kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) - # nn.MaxPool2d(kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) - nn.MaxPool2d(kernel_size=stride_size, stride=stride_size) - ) - if conv is not None and self.conv_after_downsample: - self.add_module('conv', conv) - else: - if conv is not None: - self.add_module('conv', conv) - if reduction_ratio < 1: - scale = int(1 // reduction_ratio) - self.add_module('upsample', nn.UpsamplingNearest2d(scale_factor=scale)) - - -class FpnCombine(nn.Module): - def __init__(self, feature_info, fpn_config, fpn_channels, inputs_offsets, target_reduction, pad_type='', - pooling_type='max', norm='', apply_bn_for_resampling=False, - conv_after_downsample=False, redundant_bias=False, weight_method='attn'): - super(FpnCombine, self).__init__() - self.inputs_offsets = inputs_offsets - self.weight_method = weight_method - - self.resample = nn.ModuleDict() - for idx, offset in enumerate(inputs_offsets): - in_channels = fpn_channels - if offset < len(feature_info): - in_channels = feature_info[offset]['num_chs'] - input_reduction = feature_info[offset]['reduction'] - else: - node_idx = offset - len(feature_info) - # print('node_idx, len', node_idx, len(fpn_config['nodes'])) - input_reduction = fpn_config['nodes'][node_idx]['reduction'] - reduction_ratio = target_reduction / input_reduction - self.resample[str(offset)] = ResampleFeatureMap( - in_channels, fpn_channels, reduction_ratio=reduction_ratio, pad_type=pad_type, - pooling_type=pooling_type, norm=norm, - apply_bn=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, - redundant_bias=redundant_bias) - - if weight_method == 'attn' or weight_method == 'fastattn': - # WSM - self.edge_weights = nn.Parameter(torch.ones(len(inputs_offsets)), requires_grad=True) - else: - self.edge_weights = None - - def forward(self, x): - dtype = x[0].dtype - nodes = [] - for offset in self.inputs_offsets: - input_node = x[offset] - input_node = self.resample[str(offset)](input_node) - nodes.append(input_node) - - if self.weight_method == 'attn': - normalized_weights = torch.softmax(self.edge_weights.type(dtype), dim=0) - x = torch.stack(nodes, dim=-1) * normalized_weights - elif self.weight_method == 'fastattn': - edge_weights = nn.functional.relu(self.edge_weights.type(dtype)) - weights_sum = torch.sum(edge_weights) - x = torch.stack( - [(nodes[i] * edge_weights[i]) / (weights_sum + 0.0001) for i in range(len(nodes))], dim=-1) - elif self.weight_method == 'sum': - x = torch.stack(nodes, dim=-1) - else: - raise ValueError('unknown weight_method {}'.format(self.weight_method)) - x = torch.sum(x, dim=-1) - return x - - -class BiFpnLayer(nn.Module): - def __init__(self, feature_info, fpn_config, fpn_channels, num_levels=5, pad_type='', - pooling_type='max', norm='', 
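# ---------------------------------------------------------------------------
# FpnCombine above fuses its input edges with EfficientDet-style "fast
# normalized attention": the learned per-edge weights pass through ReLU and
# are normalized by their sum plus a small epsilon. A minimal standalone
# sketch, assuming same-shape inputs and eps = 1e-4 (both illustrative):
import torch
import torch.nn as nn
import torch.nn.functional as F

class FastAttnFuse(nn.Module):
    def __init__(self, num_inputs, eps=1e-4):
        super().__init__()
        self.edge_weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, inputs):
        # Non-negative weights, normalized so the blend stays well-scaled.
        w = F.relu(self.edge_weights)
        w = w / (w.sum() + self.eps)
        return sum(wi * x for wi, x in zip(w, inputs))

# fuse = FastAttnFuse(2); y = fuse([torch.randn(1, 64, 32, 32)] * 2)
# ---------------------------------------------------------------------------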
act_layer=Swish, - apply_bn_for_resampling=False, conv_after_downsample=True, conv_bn_relu_pattern=False, - separable_conv=True, redundant_bias=False): - super(BiFpnLayer, self).__init__() - self.fpn_config = fpn_config - self.num_levels = num_levels - self.conv_bn_relu_pattern = False - - self.feature_info = [] - self.fnode = SequentialAppend() - for i, fnode_cfg in enumerate(fpn_config['nodes']): - # logging.debug('fnode {} : {}'.format(i, fnode_cfg)) - # print('fnode {} : {}'.format(i, fnode_cfg)) - fnode_layers = OrderedDict() - - # combine features - reduction = fnode_cfg['reduction'] - fnode_layers['combine'] = FpnCombine( - feature_info, fpn_config, fpn_channels, fnode_cfg['inputs_offsets'], target_reduction=reduction, - pad_type=pad_type, pooling_type=pooling_type, norm=norm, - apply_bn_for_resampling=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, - redundant_bias=redundant_bias, weight_method=fpn_config['weight_method']) - self.feature_info.append(dict(num_chs=fpn_channels, reduction=reduction)) - - # after combine ops - after_combine = OrderedDict() - if not conv_bn_relu_pattern: - after_combine['act'] = act_layer(inplace=True) - conv_bias = redundant_bias - conv_act = None - else: - conv_bias = False - conv_act = act_layer - conv_kwargs = dict( - in_channels=fpn_channels, out_channels=fpn_channels, kernel_size=3, padding=pad_type, - bias=conv_bias, norm=norm, act_layer=conv_act) - after_combine['conv'] = SeparableConv2d(**conv_kwargs) if separable_conv else ConvBnAct2d(**conv_kwargs) - fnode_layers['after_combine'] = nn.Sequential(after_combine) - - self.fnode.add_module(str(i), nn.Sequential(fnode_layers)) - - self.feature_info = self.feature_info[-num_levels::] - - def forward(self, x): - x = self.fnode(x) - return x[-self.num_levels::] - - -class BiFPN(Backbone): - def __init__( - self, cfg, bottom_up, in_features, out_channels, norm='', - num_levels=5, num_bifpn=4, separable_conv=False, - ): - super(BiFPN, self).__init__() - assert isinstance(bottom_up, Backbone) - - # Feature map strides and channels from the bottom up network (e.g. 
ResNet) - input_shapes = bottom_up.output_shape() - in_strides = [input_shapes[f].stride for f in in_features] - in_channels = [input_shapes[f].channels for f in in_features] - - self.num_levels = num_levels - self.num_bifpn = num_bifpn - self.bottom_up = bottom_up - self.in_features = in_features - self._size_divisibility = 128 - levels = [int(math.log2(s)) for s in in_strides] - self._out_feature_strides = { - "p{}".format(int(math.log2(s))): s for s in in_strides} - if len(in_features) < num_levels: - for l in range(num_levels - len(in_features)): - s = l + levels[-1] - self._out_feature_strides["p{}".format(s + 1)] = 2 ** (s + 1) - self._out_features = list(sorted(self._out_feature_strides.keys())) - self._out_feature_channels = {k: out_channels for k in self._out_features} - - # print('self._out_feature_strides', self._out_feature_strides) - # print('self._out_feature_channels', self._out_feature_channels) - - feature_info = [ - {'num_chs': in_channels[level], 'reduction': in_strides[level]} \ - for level in range(len(self.in_features)) - ] - # self.config = config - fpn_config = get_fpn_config() - self.resample = SequentialAppendLast() - for level in range(num_levels): - if level < len(feature_info): - in_chs = in_channels[level] # feature_info[level]['num_chs'] - reduction = in_strides[level] # feature_info[level]['reduction'] - else: - # Adds a coarser level by downsampling the last feature map - reduction_ratio = 2 - self.resample.add_module(str(level), ResampleFeatureMap( - in_channels=in_chs, - out_channels=out_channels, - pad_type='same', - pooling_type=None, - norm=norm, - reduction_ratio=reduction_ratio, - apply_bn=True, - conv_after_downsample=False, - redundant_bias=False, - )) - in_chs = out_channels - reduction = int(reduction * reduction_ratio) - feature_info.append(dict(num_chs=in_chs, reduction=reduction)) - - self.cell = nn.Sequential() - for rep in range(self.num_bifpn): - # logging.debug('building cell {}'.format(rep)) - # print('building cell {}'.format(rep)) - fpn_layer = BiFpnLayer( - feature_info=feature_info, - fpn_config=fpn_config, - fpn_channels=out_channels, - num_levels=self.num_levels, - pad_type='same', - pooling_type=None, - norm=norm, - act_layer=Swish, - separable_conv=separable_conv, - apply_bn_for_resampling=True, - conv_after_downsample=False, - conv_bn_relu_pattern=False, - redundant_bias=False, - ) - self.cell.add_module(str(rep), fpn_layer) - feature_info = fpn_layer.feature_info - # import pdb; pdb.set_trace() - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - # print('input shapes', x.shape) - bottom_up_features = self.bottom_up(x) - x = [bottom_up_features[f] for f in self.in_features] - assert len(self.resample) == self.num_levels - len(x) - x = self.resample(x) - shapes = [xx.shape for xx in x] - # print('resample shapes', shapes) - x = self.cell(x) - out = {f: xx for f, xx in zip(self._out_features, x)} - # import pdb; pdb.set_trace() - return out - - -@BACKBONE_REGISTRY.register() -def build_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
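# ---------------------------------------------------------------------------
# When the backbone yields fewer maps than num_levels, the BiFPN above
# synthesizes coarser levels by stride-2 resampling of the previous coarsest
# map, doubling the stride each time. A small sketch of the resulting stride
# table (the example strides are illustrative):
def bifpn_strides(in_strides, num_levels):
    strides = list(in_strides)
    while len(strides) < num_levels:
        strides.append(strides[-1] * 2)  # one extra stride-2 downsample
    return strides

# bifpn_strides([8, 16, 32], 5) -> [8, 16, 32, 64, 128], i.e. p3 .. p7
# ---------------------------------------------------------------------------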
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone - -@BACKBONE_REGISTRY.register() -def build_p37_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 - - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py deleted file mode 100755 index 17f2904c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py +++ /dev/null @@ -1,469 +0,0 @@ -# This file is modified from https://github.com/aim-uofa/AdelaiDet/blob/master/adet/modeling/backbone/bifpn.py -# The original file is under 2-clause BSD License for academic use, and *non-commercial use*. -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import Conv2d, ShapeSpec, get_norm - -from detectron2.modeling.backbone import Backbone, build_resnet_backbone -from detectron2.modeling import BACKBONE_REGISTRY -from .dlafpn import dla34 - -__all__ = [] - - -def swish(x): - return x * x.sigmoid() - - -def split_name(name): - for i, c in enumerate(name): - if not c.isalpha(): - return name[:i], int(name[i:]) - raise ValueError() - - -class FeatureMapResampler(nn.Module): - def __init__(self, in_channels, out_channels, stride, norm=""): - super(FeatureMapResampler, self).__init__() - if in_channels != out_channels: - self.reduction = Conv2d( - in_channels, out_channels, kernel_size=1, - bias=(norm == ""), - norm=get_norm(norm, out_channels), - activation=None - ) - else: - self.reduction = None - - assert stride <= 2 - self.stride = stride - - def forward(self, x): - if self.reduction is not None: - x = self.reduction(x) - - if self.stride == 2: - x = F.max_pool2d( - x, kernel_size=self.stride + 1, - stride=self.stride, padding=1 - ) - elif self.stride == 1: - pass - else: - raise NotImplementedError() - return x - - -class BackboneWithTopLevels(Backbone): - def __init__(self, backbone, out_channels, num_top_levels, norm=""): - super(BackboneWithTopLevels, self).__init__() - self.backbone = backbone - backbone_output_shape = backbone.output_shape() - - self._out_feature_channels = {name: shape.channels for name, shape in backbone_output_shape.items()} - self._out_feature_strides = {name: shape.stride for name, shape in backbone_output_shape.items()} - self._out_features = list(self._out_feature_strides.keys()) - - last_feature_name = max(self._out_feature_strides.keys(), key=lambda x: split_name(x)[1]) - self.last_feature_name = last_feature_name - self.num_top_levels = num_top_levels - - 
last_channels = self._out_feature_channels[last_feature_name] - last_stride = self._out_feature_strides[last_feature_name] - - prefix, suffix = split_name(last_feature_name) - prev_channels = last_channels - for i in range(num_top_levels): - name = prefix + str(suffix + i + 1) - self.add_module(name, FeatureMapResampler( - prev_channels, out_channels, 2, norm - )) - prev_channels = out_channels - - self._out_feature_channels[name] = out_channels - self._out_feature_strides[name] = last_stride * 2 ** (i + 1) - self._out_features.append(name) - - def forward(self, x): - outputs = self.backbone(x) - last_features = outputs[self.last_feature_name] - prefix, suffix = split_name(self.last_feature_name) - - x = last_features - for i in range(self.num_top_levels): - name = prefix + str(suffix + i + 1) - x = self.__getattr__(name)(x) - outputs[name] = x - - return outputs - - -class SingleBiFPN(Backbone): - """ - This module implements Feature Pyramid Network. - It creates pyramid features built on top of some input feature maps. - """ - - def __init__( - self, in_channels_list, out_channels, norm="" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. - norm (str): the normalization to use. - """ - super(SingleBiFPN, self).__init__() - - self.out_channels = out_channels - # build 5-levels bifpn - if len(in_channels_list) == 5: - self.nodes = [ - {'feat_level': 3, 'inputs_offsets': [3, 4]}, - {'feat_level': 2, 'inputs_offsets': [2, 5]}, - {'feat_level': 1, 'inputs_offsets': [1, 6]}, - {'feat_level': 0, 'inputs_offsets': [0, 7]}, - {'feat_level': 1, 'inputs_offsets': [1, 7, 8]}, - {'feat_level': 2, 'inputs_offsets': [2, 6, 9]}, - {'feat_level': 3, 'inputs_offsets': [3, 5, 10]}, - {'feat_level': 4, 'inputs_offsets': [4, 11]}, - ] - elif len(in_channels_list) == 3: - self.nodes = [ - {'feat_level': 1, 'inputs_offsets': [1, 2]}, - {'feat_level': 0, 'inputs_offsets': [0, 3]}, - {'feat_level': 1, 'inputs_offsets': [1, 3, 4]}, - {'feat_level': 2, 'inputs_offsets': [2, 5]}, - ] - else: - raise NotImplementedError - - node_info = [_ for _ in in_channels_list] - - num_output_connections = [0 for _ in in_channels_list] - for fnode in self.nodes: - feat_level = fnode["feat_level"] - inputs_offsets = fnode["inputs_offsets"] - inputs_offsets_str = "_".join(map(str, inputs_offsets)) - for input_offset in inputs_offsets: - num_output_connections[input_offset] += 1 - - in_channels = node_info[input_offset] - if in_channels != out_channels: - lateral_conv = Conv2d( - in_channels, - out_channels, - kernel_size=1, - norm=get_norm(norm, out_channels) - ) - self.add_module( - "lateral_{}_f{}".format(input_offset, feat_level), lateral_conv - ) - node_info.append(out_channels) - num_output_connections.append(0) - - # generate attention weights - name = "weights_f{}_{}".format(feat_level, inputs_offsets_str) - self.__setattr__(name, nn.Parameter( - torch.ones(len(inputs_offsets), dtype=torch.float32), - requires_grad=True - )) - - # generate convolutions after combination - name = 
"outputs_f{}_{}".format(feat_level, inputs_offsets_str) - self.add_module(name, Conv2d( - out_channels, - out_channels, - kernel_size=3, - padding=1, - norm=get_norm(norm, out_channels), - bias=(norm == "") - )) - - def forward(self, feats): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "p5") to - feature map tensor for each feature level in high to low resolution order. - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["n2", "n3", ..., "n6"]. - """ - feats = [_ for _ in feats] - num_levels = len(feats) - num_output_connections = [0 for _ in feats] - for fnode in self.nodes: - feat_level = fnode["feat_level"] - inputs_offsets = fnode["inputs_offsets"] - inputs_offsets_str = "_".join(map(str, inputs_offsets)) - input_nodes = [] - _, _, target_h, target_w = feats[feat_level].size() - for input_offset in inputs_offsets: - num_output_connections[input_offset] += 1 - input_node = feats[input_offset] - - # reduction - if input_node.size(1) != self.out_channels: - name = "lateral_{}_f{}".format(input_offset, feat_level) - input_node = self.__getattr__(name)(input_node) - - # maybe downsample - _, _, h, w = input_node.size() - if h > target_h and w > target_w: - height_stride_size = int((h - 1) // target_h + 1) - width_stride_size = int((w - 1) // target_w + 1) - assert height_stride_size == width_stride_size == 2 - input_node = F.max_pool2d( - input_node, kernel_size=(height_stride_size + 1, width_stride_size + 1), - stride=(height_stride_size, width_stride_size), padding=1 - ) - elif h <= target_h and w <= target_w: - if h < target_h or w < target_w: - input_node = F.interpolate( - input_node, - size=(target_h, target_w), - mode="nearest" - ) - else: - raise NotImplementedError() - input_nodes.append(input_node) - - # attention - name = "weights_f{}_{}".format(feat_level, inputs_offsets_str) - weights = F.relu(self.__getattr__(name)) - norm_weights = weights / (weights.sum() + 0.0001) - - new_node = torch.stack(input_nodes, dim=-1) - new_node = (norm_weights * new_node).sum(dim=-1) - new_node = swish(new_node) - - name = "outputs_f{}_{}".format(feat_level, inputs_offsets_str) - feats.append(self.__getattr__(name)(new_node)) - - num_output_connections.append(0) - - output_feats = [] - for idx in range(num_levels): - for i, fnode in enumerate(reversed(self.nodes)): - if fnode['feat_level'] == idx: - output_feats.append(feats[-1 - i]) - break - else: - raise ValueError() - return output_feats - - -class BiFPN(Backbone): - """ - This module implements Feature Pyramid Network. - It creates pyramid features built on top of some input feature maps. - """ - - def __init__( - self, bottom_up, in_features, out_channels, num_top_levels, num_repeats, norm="" - ): - """ - Args: - bottom_up (Backbone): module representing the bottom up subnetwork. - Must be a subclass of :class:`Backbone`. The multi-scale feature - maps generated by the bottom up network, and listed in `in_features`, - are used to generate FPN levels. - in_features (list[str]): names of the input feature maps coming - from the backbone to which FPN is attached. For example, if the - backbone produces ["res2", "res3", "res4"], any *contiguous* sublist - of these may be used; order must be from high to low resolution. - out_channels (int): number of channels in the output feature maps. 
- num_top_levels (int): the number of the top levels (p6 or p7). - num_repeats (int): the number of repeats of BiFPN. - norm (str): the normalization to use. - """ - super(BiFPN, self).__init__() - assert isinstance(bottom_up, Backbone) - - # add extra feature levels (i.e., 6 and 7) - self.bottom_up = BackboneWithTopLevels( - bottom_up, out_channels, - num_top_levels, norm - ) - bottom_up_output_shapes = self.bottom_up.output_shape() - - in_features = sorted(in_features, key=lambda x: split_name(x)[1]) - self._size_divisibility = 128 #bottom_up_output_shapes[in_features[-1]].stride - self.out_channels = out_channels - self.min_level = split_name(in_features[0])[1] - - # add the names for top blocks - prefix, last_suffix = split_name(in_features[-1]) - for i in range(num_top_levels): - in_features.append(prefix + str(last_suffix + i + 1)) - self.in_features = in_features - - # generate output features - self._out_features = ["p{}".format(split_name(name)[1]) for name in in_features] - self._out_feature_strides = { - out_name: bottom_up_output_shapes[in_name].stride - for out_name, in_name in zip(self._out_features, in_features) - } - self._out_feature_channels = {k: out_channels for k in self._out_features} - - # build bifpn - self.repeated_bifpn = nn.ModuleList() - for i in range(num_repeats): - if i == 0: - in_channels_list = [ - bottom_up_output_shapes[name].channels for name in in_features - ] - else: - in_channels_list = [ - self._out_feature_channels[name] for name in self._out_features - ] - self.repeated_bifpn.append(SingleBiFPN( - in_channels_list, out_channels, norm - )) - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - """ - Args: - input (dict[str->Tensor]): mapping feature map name (e.g., "p5") to - feature map tensor for each feature level in high to low resolution order. - Returns: - dict[str->Tensor]: - mapping from feature map name to FPN feature map tensor - in high to low resolution order. Returned feature names follow the FPN - paper convention: "p", where stage has stride = 2 ** stage e.g., - ["n2", "n3", ..., "n6"]. - """ - bottom_up_features = self.bottom_up(x) - feats = [bottom_up_features[f] for f in self.in_features] - - for bifpn in self.repeated_bifpn: - feats = bifpn(feats) - - return dict(zip(self._out_features, feats)) - - -def _assert_strides_are_log2_contiguous(strides): - """ - Assert that each stride is 2x times its preceding stride, i.e. "contiguous in log2". - """ - for i, stride in enumerate(strides[1:], 1): - assert stride == 2 * strides[i - 1], "Strides {} {} are not log2 contiguous".format( - stride, strides[i - 1] - ) - - -@BACKBONE_REGISTRY.register() -def build_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 2 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - - - -@BACKBONE_REGISTRY.register() -def build_p35_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 0 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_p35_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - top_levels = 0 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone - -@BACKBONE_REGISTRY.register() -def build_p37_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = dla34(cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS - num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN - assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 - top_levels = 2 - - backbone = BiFPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - num_top_levels=top_levels, - num_repeats=num_repeats, - norm=cfg.MODEL.BIFPN.NORM - ) - return backbone \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py deleted file mode 100755 index 9f15f840..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dla.py +++ /dev/null @@ -1,479 +0,0 @@ -import numpy as np -import math -from os.path import join -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn -import torch.utils.model_zoo as model_zoo - -from detectron2.modeling.backbone.resnet import ( - BasicStem, BottleneckBlock, DeformBottleneckBlock) -from detectron2.layers import ( - Conv2d, - DeformConv, - FrozenBatchNorm2d, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from detectron2.modeling.backbone.backbone import Backbone -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.modeling.backbone.fpn import FPN - -__all__ = [ - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", -] - -DCNV1 = False - -HASH = { - 34: 'ba72cf86', - 60: '24839fc4', -} - -def get_model_url(data, name, hash): - return join('http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) - -class BasicBlock(nn.Module): - def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): - super(BasicBlock, self).__init__() - self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn1 = get_norm(norm, planes) - self.relu = nn.ReLU(inplace=True) - self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, - stride=1, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(norm, planes) - self.stride = stride - - def 
forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - - out += residual - out = self.relu(out) - - return out - -class Bottleneck(nn.Module): - expansion = 2 - - def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): - super(Bottleneck, self).__init__() - expansion = Bottleneck.expansion - bottle_planes = planes // expansion - self.conv1 = nn.Conv2d(inplanes, bottle_planes, - kernel_size=1, bias=False) - self.bn1 = get_norm(norm, bottle_planes) - self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(norm, bottle_planes) - self.conv3 = nn.Conv2d(bottle_planes, planes, - kernel_size=1, bias=False) - self.bn3 = get_norm(norm, planes) - self.relu = nn.ReLU(inplace=True) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - out = self.relu(out) - - out = self.conv3(out) - out = self.bn3(out) - - out += residual - out = self.relu(out) - - return out - -class Root(nn.Module): - def __init__(self, in_channels, out_channels, kernel_size, residual, norm='BN'): - super(Root, self).__init__() - self.conv = nn.Conv2d( - in_channels, out_channels, 1, - stride=1, bias=False, padding=(kernel_size - 1) // 2) - self.bn = get_norm(norm, out_channels) - self.relu = nn.ReLU(inplace=True) - self.residual = residual - - def forward(self, *x): - children = x - x = self.conv(torch.cat(x, 1)) - x = self.bn(x) - if self.residual: - x += children[0] - x = self.relu(x) - - return x - - -class Tree(nn.Module): - def __init__(self, levels, block, in_channels, out_channels, stride=1, - level_root=False, root_dim=0, root_kernel_size=1, - dilation=1, root_residual=False, norm='BN'): - super(Tree, self).__init__() - if root_dim == 0: - root_dim = 2 * out_channels - if level_root: - root_dim += in_channels - if levels == 1: - self.tree1 = block(in_channels, out_channels, stride, - dilation=dilation, norm=norm) - self.tree2 = block(out_channels, out_channels, 1, - dilation=dilation, norm=norm) - else: - self.tree1 = Tree(levels - 1, block, in_channels, out_channels, - stride, root_dim=0, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual, - norm=norm) - self.tree2 = Tree(levels - 1, block, out_channels, out_channels, - root_dim=root_dim + out_channels, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual, - norm=norm) - if levels == 1: - self.root = Root(root_dim, out_channels, root_kernel_size, - root_residual, norm=norm) - self.level_root = level_root - self.root_dim = root_dim - self.downsample = None - self.project = None - self.levels = levels - if stride > 1: - self.downsample = nn.MaxPool2d(stride, stride=stride) - if in_channels != out_channels: - self.project = nn.Sequential( - nn.Conv2d(in_channels, out_channels, - kernel_size=1, stride=1, bias=False), - get_norm(norm, out_channels) - ) - - def forward(self, x, residual=None, children=None): - children = [] if children is None else children - bottom = self.downsample(x) if self.downsample else x - residual = self.project(bottom) if self.project else bottom - if self.level_root: - children.append(bottom) - x1 = self.tree1(x, residual) - if self.levels == 1: - x2 = self.tree2(x1) - x = 
self.root(x2, x1, *children) - else: - children.append(x1) - x = self.tree2(x1, children=children) - return x - -class DLA(nn.Module): - def __init__(self, num_layers, levels, channels, - block=BasicBlock, residual_root=False, norm='BN'): - """ - Args: - """ - super(DLA, self).__init__() - self.norm = norm - self.channels = channels - self.base_layer = nn.Sequential( - nn.Conv2d(3, channels[0], kernel_size=7, stride=1, - padding=3, bias=False), - get_norm(self.norm, channels[0]), - nn.ReLU(inplace=True)) - self.level0 = self._make_conv_level( - channels[0], channels[0], levels[0]) - self.level1 = self._make_conv_level( - channels[0], channels[1], levels[1], stride=2) - self.level2 = Tree(levels[2], block, channels[1], channels[2], 2, - level_root=False, - root_residual=residual_root, norm=norm) - self.level3 = Tree(levels[3], block, channels[2], channels[3], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.level4 = Tree(levels[4], block, channels[3], channels[4], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.level5 = Tree(levels[5], block, channels[4], channels[5], 2, - level_root=True, root_residual=residual_root, - norm=norm) - self.load_pretrained_model( - data='imagenet', name='dla{}'.format(num_layers), - hash=HASH[num_layers]) - - def load_pretrained_model(self, data, name, hash): - model_url = get_model_url(data, name, hash) - model_weights = model_zoo.load_url(model_url) - num_classes = len(model_weights[list(model_weights.keys())[-1]]) - self.fc = nn.Conv2d( - self.channels[-1], num_classes, - kernel_size=1, stride=1, padding=0, bias=True) - print('Loading pretrained') - self.load_state_dict(model_weights, strict=False) - - def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): - modules = [] - for i in range(convs): - modules.extend([ - nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride if i == 0 else 1, - padding=dilation, bias=False, dilation=dilation), - get_norm(self.norm, planes), - nn.ReLU(inplace=True)]) - inplanes = planes - return nn.Sequential(*modules) - - def forward(self, x): - y = [] - x = self.base_layer(x) - for i in range(6): - x = getattr(self, 'level{}'.format(i))(x) - y.append(x) - return y - - -def fill_up_weights(up): - w = up.weight.data - f = math.ceil(w.size(2) / 2) - c = (2 * f - 1 - f % 2) / (2. 
* f) - for i in range(w.size(2)): - for j in range(w.size(3)): - w[0, 0, i, j] = \ - (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) - for c in range(1, w.size(0)): - w[c, 0, :, :] = w[0, 0, :, :] - - -class _DeformConv(nn.Module): - def __init__(self, chi, cho, norm='BN'): - super(_DeformConv, self).__init__() - self.actf = nn.Sequential( - get_norm(norm, cho), - nn.ReLU(inplace=True) - ) - if DCNV1: - self.offset = Conv2d( - chi, 18, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = DeformConv( - chi, cho, kernel_size=(3,3), stride=1, padding=1, - dilation=1, deformable_groups=1) - else: - self.offset = Conv2d( - chi, 27, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = ModulatedDeformConv( - chi, cho, kernel_size=3, stride=1, padding=1, - dilation=1, deformable_groups=1) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - - def forward(self, x): - if DCNV1: - offset = self.offset(x) - x = self.conv(x, offset) - else: - offset_mask = self.offset(x) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - x = self.conv(x, offset, mask) - x = self.actf(x) - return x - - -class IDAUp(nn.Module): - def __init__(self, o, channels, up_f, norm='BN'): - super(IDAUp, self).__init__() - for i in range(1, len(channels)): - c = channels[i] - f = int(up_f[i]) - proj = _DeformConv(c, o, norm=norm) - node = _DeformConv(o, o, norm=norm) - - up = nn.ConvTranspose2d(o, o, f * 2, stride=f, - padding=f // 2, output_padding=0, - groups=o, bias=False) - fill_up_weights(up) - - setattr(self, 'proj_' + str(i), proj) - setattr(self, 'up_' + str(i), up) - setattr(self, 'node_' + str(i), node) - - - def forward(self, layers, startp, endp): - for i in range(startp + 1, endp): - upsample = getattr(self, 'up_' + str(i - startp)) - project = getattr(self, 'proj_' + str(i - startp)) - layers[i] = upsample(project(layers[i])) - node = getattr(self, 'node_' + str(i - startp)) - layers[i] = node(layers[i] + layers[i - 1]) - - -class DLAUp(nn.Module): - def __init__(self, startp, channels, scales, in_channels=None, norm='BN'): - super(DLAUp, self).__init__() - self.startp = startp - if in_channels is None: - in_channels = channels - self.channels = channels - channels = list(channels) - scales = np.array(scales, dtype=int) - for i in range(len(channels) - 1): - j = -i - 2 - setattr(self, 'ida_{}'.format(i), - IDAUp(channels[j], in_channels[j:], - scales[j:] // scales[j], norm=norm)) - scales[j + 1:] = scales[j] - in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] - - def forward(self, layers): - out = [layers[-1]] # start with 32 - for i in range(len(layers) - self.startp - 1): - ida = getattr(self, 'ida_{}'.format(i)) - ida(layers, len(layers) -i - 2, len(layers)) - out.insert(0, layers[-1]) - return out - -DLA_CONFIGS = { - 34: ([1, 1, 1, 2, 2, 1], [16, 32, 64, 128, 256, 512], BasicBlock), - 60: ([1, 1, 1, 2, 3, 1], [16, 32, 128, 256, 512, 1024], Bottleneck) -} - - -class DLASeg(Backbone): - def __init__(self, num_layers, out_features, use_dla_up=True, - ms_output=False, norm='BN'): - super(DLASeg, self).__init__() - # depth = 34 - levels, channels, Block = DLA_CONFIGS[num_layers] - self.base = DLA(num_layers=num_layers, - levels=levels, channels=channels, block=Block, norm=norm) - down_ratio = 4 - self.first_level = int(np.log2(down_ratio)) - self.ms_output = ms_output - self.last_level = 5 if not self.ms_output else 6 - channels = self.base.channels - 
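# ---------------------------------------------------------------------------
# fill_up_weights above initializes a grouped ConvTranspose2d so that, at
# the start of training, it performs exact bilinear upsampling (every channel
# gets the same separable triangular kernel). A check of that property for
# factor 2 / kernel 4 (those sizes are assumptions for the demo):
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def bilinear_kernel(k):
    # Same formula as fill_up_weights, for a single k x k kernel.
    f = math.ceil(k / 2)
    c = (2 * f - 1 - f % 2) / (2.0 * f)
    w = 1 - (torch.arange(k, dtype=torch.float32) / f - c).abs()
    return w[:, None] * w[None, :]

up = nn.ConvTranspose2d(1, 1, kernel_size=4, stride=2, padding=1, bias=False)
up.weight.data[0, 0] = bilinear_kernel(4)

x = torch.randn(1, 1, 8, 8)
ref = F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
# Interior pixels match exactly; borders differ (zero padding vs. clamping).
assert torch.allclose(up(x)[..., 1:-1, 1:-1], ref[..., 1:-1, 1:-1], atol=1e-5)
# ---------------------------------------------------------------------------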
scales = [2 ** i for i in range(len(channels[self.first_level:]))] - self.use_dla_up = use_dla_up - if self.use_dla_up: - self.dla_up = DLAUp( - self.first_level, channels[self.first_level:], scales, - norm=norm) - out_channel = channels[self.first_level] - if not self.ms_output: # stride 4 DLA - self.ida_up = IDAUp( - out_channel, channels[self.first_level:self.last_level], - [2 ** i for i in range(self.last_level - self.first_level)], - norm=norm) - self._out_features = out_features - self._out_feature_channels = { - 'dla{}'.format(i): channels[i] for i in range(6)} - self._out_feature_strides = { - 'dla{}'.format(i): 2 ** i for i in range(6)} - self._size_divisibility = 32 - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - x = self.base(x) - if self.use_dla_up: - x = self.dla_up(x) - if not self.ms_output: # stride 4 dla - y = [] - for i in range(self.last_level - self.first_level): - y.append(x[i].clone()) - self.ida_up(y, 0, len(y)) - ret = {} - for i in range(self.last_level - self.first_level): - out_feature = 'dla{}'.format(i) - if out_feature in self._out_features: - ret[out_feature] = y[i] - else: - ret = {} - st = self.first_level if self.use_dla_up else 0 - for i in range(self.last_level - st): - out_feature = 'dla{}'.format(i + st) - if out_feature in self._out_features: - ret[out_feature] = x[i] - - return ret - - -@BACKBONE_REGISTRY.register() -def build_dla_backbone(cfg, input_shape): - """ - Create a ResNet instance from config. - - Returns: - ResNet: a :class:`ResNet` instance. - """ - return DLASeg( - out_features=cfg.MODEL.DLA.OUT_FEATURES, - num_layers=cfg.MODEL.DLA.NUM_LAYERS, - use_dla_up=cfg.MODEL.DLA.USE_DLA_UP, - ms_output=cfg.MODEL.DLA.MS_OUTPUT, - norm=cfg.MODEL.DLA.NORM) - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels): - super().__init__() - self.num_levels = 2 - self.in_feature = "dla5" - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - -@BACKBONE_REGISTRY.register() -def build_retinanet_dla_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_dla_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_p6p7 = bottom_up.output_shape()['dla5'].channels - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_p6p7, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py deleted file mode 100755 index 2a33c66b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py +++ /dev/null @@ -1,493 +0,0 @@ -#!/usr/bin/env python -# -*- coding: utf-8 -*- - -# this file is from https://github.com/ucbdrive/dla/blob/master/dla.py. 
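# ---------------------------------------------------------------------------
# LastLevelP6P7 above grows two extra pyramid levels from the coarsest map
# with stride-2 3x3 convolutions and a ReLU in between, the usual RetinaNet
# recipe. A shape walk-through (channel counts and input size are
# assumptions):
import torch
import torch.nn as nn
import torch.nn.functional as F

p6 = nn.Conv2d(512, 256, 3, 2, 1)
p7 = nn.Conv2d(256, 256, 3, 2, 1)
c5 = torch.randn(1, 512, 16, 16)  # e.g. the stride-32 map of a 512px image
x6 = p6(c5)                        # (1, 256, 8, 8)  -> stride 64
x7 = p7(F.relu(x6))                # (1, 256, 4, 4)  -> stride 128
# ---------------------------------------------------------------------------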
- -import math -from os.path import join -import numpy as np - -import torch -from torch import nn -import torch.utils.model_zoo as model_zoo -import torch.nn.functional as F -import fvcore.nn.weight_init as weight_init - -from detectron2.modeling.backbone import FPN -from detectron2.layers import ShapeSpec, ModulatedDeformConv, Conv2d -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.layers.batch_norm import get_norm -from detectron2.modeling.backbone import Backbone - -WEB_ROOT = 'http://dl.yf.io/dla/models' - - -def get_model_url(data, name, hash): - return join( - 'http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) - - -def conv3x3(in_planes, out_planes, stride=1): - "3x3 convolution with padding" - return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, - padding=1, bias=False) - - -class BasicBlock(nn.Module): - def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): - super(BasicBlock, self).__init__() - self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn1 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.relu = nn.ReLU(inplace=True) - self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, - stride=1, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - - out += residual - out = self.relu(out) - - return out - - -class Bottleneck(nn.Module): - expansion = 2 - - def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): - super(Bottleneck, self).__init__() - expansion = Bottleneck.expansion - bottle_planes = planes // expansion - self.conv1 = nn.Conv2d(inplanes, bottle_planes, - kernel_size=1, bias=False) - self.bn1 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) - self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, - stride=stride, padding=dilation, - bias=False, dilation=dilation) - self.bn2 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) - self.conv3 = nn.Conv2d(bottle_planes, planes, - kernel_size=1, bias=False) - self.bn3 = get_norm(cfg.MODEL.DLA.NORM, planes) - self.relu = nn.ReLU(inplace=True) - self.stride = stride - - def forward(self, x, residual=None): - if residual is None: - residual = x - - out = self.conv1(x) - out = self.bn1(out) - out = self.relu(out) - - out = self.conv2(out) - out = self.bn2(out) - out = self.relu(out) - - out = self.conv3(out) - out = self.bn3(out) - - out += residual - out = self.relu(out) - - return out - - -class Root(nn.Module): - def __init__(self, cfg, in_channels, out_channels, kernel_size, residual): - super(Root, self).__init__() - self.conv = nn.Conv2d( - in_channels, out_channels, kernel_size, - stride=1, bias=False, padding=(kernel_size - 1) // 2) - self.bn = get_norm(cfg.MODEL.DLA.NORM, out_channels) - self.relu = nn.ReLU(inplace=True) - self.residual = residual - - def forward(self, *x): - children = x - x = self.conv(torch.cat(x, 1)) - x = self.bn(x) - if self.residual: - x += children[0] - x = self.relu(x) - - return x - - -class Tree(nn.Module): - def __init__(self, cfg, levels, block, in_channels, out_channels, stride=1, - level_root=False, root_dim=0, root_kernel_size=1, - dilation=1, root_residual=False): - super(Tree, self).__init__() - if root_dim == 0: - root_dim = 2 * out_channels - if 
level_root: - root_dim += in_channels - if levels == 1: - self.tree1 = block(cfg, in_channels, out_channels, stride, - dilation=dilation) - self.tree2 = block(cfg, out_channels, out_channels, 1, - dilation=dilation) - else: - self.tree1 = Tree(cfg, levels - 1, block, in_channels, out_channels, - stride, root_dim=0, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual) - self.tree2 = Tree(cfg, levels - 1, block, out_channels, out_channels, - root_dim=root_dim + out_channels, - root_kernel_size=root_kernel_size, - dilation=dilation, root_residual=root_residual) - if levels == 1: - self.root = Root(cfg, root_dim, out_channels, root_kernel_size, - root_residual) - self.level_root = level_root - self.root_dim = root_dim - self.downsample = None - self.project = None - self.levels = levels - if stride > 1: - self.downsample = nn.MaxPool2d(stride, stride=stride) - if in_channels != out_channels: - self.project = nn.Sequential( - nn.Conv2d(in_channels, out_channels, - kernel_size=1, stride=1, bias=False), - get_norm(cfg.MODEL.DLA.NORM, out_channels) - ) - - def forward(self, x, residual=None, children=None): - if self.training and residual is not None: - x = x + residual.sum() * 0.0 - children = [] if children is None else children - bottom = self.downsample(x) if self.downsample else x - residual = self.project(bottom) if self.project else bottom - if self.level_root: - children.append(bottom) - x1 = self.tree1(x, residual) - if self.levels == 1: - x2 = self.tree2(x1) - x = self.root(x2, x1, *children) - else: - children.append(x1) - x = self.tree2(x1, children=children) - return x - - -class DLA(Backbone): - def __init__(self, cfg, levels, channels, block=BasicBlock, residual_root=False): - super(DLA, self).__init__() - self.cfg = cfg - self.channels = channels - - self._out_features = ["dla{}".format(i) for i in range(6)] - self._out_feature_channels = {k: channels[i] for i, k in enumerate(self._out_features)} - self._out_feature_strides = {k: 2 ** i for i, k in enumerate(self._out_features)} - - self.base_layer = nn.Sequential( - nn.Conv2d(3, channels[0], kernel_size=7, stride=1, - padding=3, bias=False), - get_norm(cfg.MODEL.DLA.NORM, channels[0]), - nn.ReLU(inplace=True)) - self.level0 = self._make_conv_level( - channels[0], channels[0], levels[0]) - self.level1 = self._make_conv_level( - channels[0], channels[1], levels[1], stride=2) - self.level2 = Tree(cfg, levels[2], block, channels[1], channels[2], 2, - level_root=False, - root_residual=residual_root) - self.level3 = Tree(cfg, levels[3], block, channels[2], channels[3], 2, - level_root=True, root_residual=residual_root) - self.level4 = Tree(cfg, levels[4], block, channels[3], channels[4], 2, - level_root=True, root_residual=residual_root) - self.level5 = Tree(cfg, levels[5], block, channels[4], channels[5], 2, - level_root=True, root_residual=residual_root) - - for m in self.modules(): - if isinstance(m, nn.Conv2d): - n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels - m.weight.data.normal_(0, math.sqrt(2. 
/ n)) - - self.load_pretrained_model( - data='imagenet', name='dla34', hash='ba72cf86') - - def load_pretrained_model(self, data, name, hash): - model_url = get_model_url(data, name, hash) - model_weights = model_zoo.load_url(model_url) - del model_weights['fc.weight'] - del model_weights['fc.bias'] - print('Loading pretrained DLA!') - self.load_state_dict(model_weights, strict=True) - - def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): - modules = [] - for i in range(convs): - modules.extend([ - nn.Conv2d(inplanes, planes, kernel_size=3, - stride=stride if i == 0 else 1, - padding=dilation, bias=False, dilation=dilation), - get_norm(self.cfg.MODEL.DLA.NORM, planes), - nn.ReLU(inplace=True)]) - inplanes = planes - return nn.Sequential(*modules) - - def forward(self, x): - y = {} - x = self.base_layer(x) - for i in range(6): - name = 'level{}'.format(i) - x = getattr(self, name)(x) - y['dla{}'.format(i)] = x - return y - - -def fill_up_weights(up): - w = up.weight.data - f = math.ceil(w.size(2) / 2) - c = (2 * f - 1 - f % 2) / (2. * f) - for i in range(w.size(2)): - for j in range(w.size(3)): - w[0, 0, i, j] = \ - (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) - for c in range(1, w.size(0)): - w[c, 0, :, :] = w[0, 0, :, :] - - -class Conv(nn.Module): - def __init__(self, chi, cho, norm): - super(Conv, self).__init__() - self.conv = nn.Sequential( - nn.Conv2d(chi, cho, kernel_size=1, stride=1, bias=False), - get_norm(norm, cho), - nn.ReLU(inplace=True)) - - def forward(self, x): - return self.conv(x) - - -class DeformConv(nn.Module): - def __init__(self, chi, cho, norm): - super(DeformConv, self).__init__() - self.actf = nn.Sequential( - get_norm(norm, cho), - nn.ReLU(inplace=True) - ) - self.offset = Conv2d( - chi, 27, kernel_size=3, stride=1, - padding=1, dilation=1) - self.conv = ModulatedDeformConv( - chi, cho, kernel_size=3, stride=1, padding=1, - dilation=1, deformable_groups=1) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - - def forward(self, x): - offset_mask = self.offset(x) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - x = self.conv(x, offset, mask) - x = self.actf(x) - return x - - -class IDAUp(nn.Module): - def __init__(self, o, channels, up_f, norm='FrozenBN', node_type=Conv): - super(IDAUp, self).__init__() - for i in range(1, len(channels)): - c = channels[i] - f = int(up_f[i]) - proj = node_type(c, o, norm) - node = node_type(o, o, norm) - - up = nn.ConvTranspose2d(o, o, f * 2, stride=f, - padding=f // 2, output_padding=0, - groups=o, bias=False) - fill_up_weights(up) - - setattr(self, 'proj_' + str(i), proj) - setattr(self, 'up_' + str(i), up) - setattr(self, 'node_' + str(i), node) - - - def forward(self, layers, startp, endp): - for i in range(startp + 1, endp): - upsample = getattr(self, 'up_' + str(i - startp)) - project = getattr(self, 'proj_' + str(i - startp)) - layers[i] = upsample(project(layers[i])) - node = getattr(self, 'node_' + str(i - startp)) - layers[i] = node(layers[i] + layers[i - 1]) - - -DLAUP_NODE_MAP = { - 'conv': Conv, - 'dcn': DeformConv, -} - -class DLAUP(Backbone): - def __init__(self, bottom_up, in_features, norm, dlaup_node='conv'): - super(DLAUP, self).__init__() - assert isinstance(bottom_up, Backbone) - self.bottom_up = bottom_up - input_shapes = bottom_up.output_shape() - in_strides = [input_shapes[f].stride for f in in_features] - in_channels = [input_shapes[f].channels for f 
in in_features] - in_levels = [int(math.log2(input_shapes[f].stride)) for f in in_features] - self.in_features = in_features - out_features = ['dlaup{}'.format(l) for l in in_levels] - self._out_features = out_features - self._out_feature_channels = { - 'dlaup{}'.format(l): in_channels[i] for i, l in enumerate(in_levels)} - self._out_feature_strides = { - 'dlaup{}'.format(l): 2 ** l for l in in_levels} - - print('self._out_features', self._out_features) - print('self._out_feature_channels', self._out_feature_channels) - print('self._out_feature_strides', self._out_feature_strides) - self._size_divisibility = 32 - - node_type = DLAUP_NODE_MAP[dlaup_node] - - self.startp = int(math.log2(in_strides[0])) - self.channels = in_channels - channels = list(in_channels) - scales = np.array([2 ** i for i in range(len(out_features))], dtype=int) - for i in range(len(channels) - 1): - j = -i - 2 - setattr(self, 'ida_{}'.format(i), - IDAUp(channels[j], in_channels[j:], - scales[j:] // scales[j], - norm=norm, - node_type=node_type)) - scales[j + 1:] = scales[j] - in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] - - @property - def size_divisibility(self): - return self._size_divisibility - - def forward(self, x): - bottom_up_features = self.bottom_up(x) - layers = [bottom_up_features[f] for f in self.in_features] - out = [layers[-1]] # start with 32 - for i in range(len(layers) - 1): - ida = getattr(self, 'ida_{}'.format(i)) - ida(layers, len(layers) - i - 2, len(layers)) - out.insert(0, layers[-1]) - ret = {} - for k, v in zip(self._out_features, out): - ret[k] = v - # import pdb; pdb.set_trace() - return ret - - -def dla34(cfg, pretrained=None): # DLA-34 - model = DLA(cfg, [1, 1, 1, 2, 2, 1], - [16, 32, 64, 128, 256, 512], - block=BasicBlock) - return model - - -class LastLevelP6P7(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels): - super().__init__() - self.num_levels = 2 - self.in_feature = "dla5" - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - - -@BACKBONE_REGISTRY.register() -def build_dla_fpn3_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=None, - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - - return backbone - -@BACKBONE_REGISTRY.register() -def build_dla_fpn5_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - in_channels_top = bottom_up.output_shape()['dla5'].channels - - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7(in_channels_top, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - - return backbone - - -@BACKBONE_REGISTRY.register() -def build_dlaup_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - - depth_to_creator = {"dla34": dla34} - bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) - - backbone = DLAUP( - bottom_up=bottom_up, - in_features=cfg.MODEL.DLA.DLAUP_IN_FEATURES, - norm=cfg.MODEL.DLA.NORM, - dlaup_node=cfg.MODEL.DLA.DLAUP_NODE, - ) - - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py deleted file mode 100755 index cc4e7a49..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/fpn_p5.py +++ /dev/null @@ -1,78 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import math -import fvcore.nn.weight_init as weight_init -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import Conv2d, ShapeSpec, get_norm - -from detectron2.modeling.backbone import Backbone -from detectron2.modeling.backbone.fpn import FPN -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from detectron2.modeling.backbone.resnet import build_resnet_backbone - - -class LastLevelP6P7_P5(nn.Module): - """ - This module is used in RetinaNet to generate extra layers, P6 and P7 from - C5 feature. - """ - - def __init__(self, in_channels, out_channels): - super().__init__() - self.num_levels = 2 - self.in_feature = "p5" - self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) - self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) - for module in [self.p6, self.p7]: - weight_init.c2_xavier_fill(module) - - def forward(self, c5): - p6 = self.p6(c5) - p7 = self.p7(F.relu(p6)) - return [p6, p7] - - -@BACKBONE_REGISTRY.register() -def build_p67_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone - -@BACKBONE_REGISTRY.register() -def build_p35_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - bottom_up = build_resnet_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=None, - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py deleted file mode 100755 index 1d0d40ad..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py +++ /dev/null @@ -1,802 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# This file is modified from https://github.com/Res2Net/Res2Net-detectron2/blob/master/detectron2/modeling/backbone/resnet.py -# The original file is under Apache-2.0 License -import numpy as np -import fvcore.nn.weight_init as weight_init -import torch -import torch.nn.functional as F -from torch import nn - -from detectron2.layers import ( - CNNBlockBase, - Conv2d, - DeformConv, - ModulatedDeformConv, - ShapeSpec, - get_norm, -) - -from detectron2.modeling.backbone import Backbone -from detectron2.modeling.backbone.fpn import FPN -from detectron2.modeling.backbone.build import BACKBONE_REGISTRY -from .fpn_p5 import LastLevelP6P7_P5 -from .bifpn import BiFPN - -__all__ = [ - "ResNetBlockBase", - "BasicBlock", - "BottleneckBlock", - "DeformBottleneckBlock", - "BasicStem", - "ResNet", - "make_stage", - "build_res2net_backbone", -] - - -ResNetBlockBase = CNNBlockBase -""" -Alias for backward compatibiltiy. -""" - - -class BasicBlock(CNNBlockBase): - """ - The basic residual block for ResNet-18 and ResNet-34, with two 3x3 conv layers - and a projection shortcut if needed. - """ - - def __init__(self, in_channels, out_channels, *, stride=1, norm="BN"): - """ - Args: - in_channels (int): Number of input channels. - out_channels (int): Number of output channels. - stride (int): Stride for the first conv. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=stride, - bias=False, - norm=get_norm(norm, out_channels), - ) - else: - self.shortcut = None - - self.conv1 = Conv2d( - in_channels, - out_channels, - kernel_size=3, - stride=stride, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - self.conv2 = Conv2d( - out_channels, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - - for layer in [self.conv1, self.conv2, self.shortcut]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - out = self.conv2(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class BottleneckBlock(CNNBlockBase): - """ - The standard bottle2neck residual block used by Res2Net-50, 101 and 152. 
- """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - basewidth=26, - scale=4, - ): - """ - Args: - bottleneck_channels (int): number of output channels for the 3x3 - "bottleneck" conv layers. - num_groups (int): number of groups for the 3x3 conv layer. - norm (str or callable): normalization for all conv layers. - See :func:`layers.get_norm` for supported format. - stride_in_1x1 (bool): when stride>1, whether to put stride in the - first 1x1 convolution or the bottleneck 3x3 convolution. - dilation (int): the dilation rate of the 3x3 conv layer. - """ - super().__init__(in_channels, out_channels, stride) - - if in_channels != out_channels: - self.shortcut = nn.Sequential( - nn.AvgPool2d(kernel_size=stride, stride=stride, - ceil_mode=True, count_include_pad=False), - Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - ) - else: - self.shortcut = None - - # The original MSRA ResNet models have stride in the first 1x1 conv - # The subsequent fb.torch.resnet and Caffe2 ResNe[X]t implementations have - # stride in the 3x3 conv - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - width = bottleneck_channels//scale - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - if scale == 1: - self.nums = 1 - else: - self.nums = scale -1 - if self.in_channels!=self.out_channels and stride_3x3!=2: - self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) - - convs = [] - bns = [] - for i in range(self.nums): - convs.append(nn.Conv2d( - width, - width, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - )) - bns.append(get_norm(norm, width)) - self.convs = nn.ModuleList(convs) - self.bns = nn.ModuleList(bns) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - self.scale = scale - self.width = width - self.in_channels = in_channels - self.out_channels = out_channels - self.stride_3x3 = stride_3x3 - for layer in [self.conv1, self.conv3]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - if self.shortcut is not None: - for layer in self.shortcut.modules(): - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - for layer in self.convs: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - # Zero-initialize the last normalization in each residual branch, - # so that at the beginning, the residual branch starts with zeros, - # and each residual block behaves like an identity. - # See Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "For BN layers, the learnable scaling coefficient γ is initialized - # to be 1, except for each residual block's last BN - # where γ is initialized to be 0." - - # nn.init.constant_(self.conv3.norm.weight, 0) - # TODO this somehow hurts performance when training GN models from scratch. - # Add it as an option when we need to use this code to train a backbone. 
- - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - spx = torch.split(out, self.width, 1) - for i in range(self.nums): - if i==0 or self.in_channels!=self.out_channels: - sp = spx[i] - else: - sp = sp + spx[i] - sp = self.convs[i](sp) - sp = F.relu_(self.bns[i](sp)) - if i==0: - out = sp - else: - out = torch.cat((out, sp), 1) - if self.scale!=1 and self.stride_3x3==1: - out = torch.cat((out, spx[self.nums]), 1) - elif self.scale != 1 and self.stride_3x3==2: - out = torch.cat((out, self.pool(spx[self.nums])), 1) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -class DeformBottleneckBlock(ResNetBlockBase): - """ - Not implemented for res2net yet. - Similar to :class:`BottleneckBlock`, but with deformable conv in the 3x3 convolution. - """ - - def __init__( - self, - in_channels, - out_channels, - *, - bottleneck_channels, - stride=1, - num_groups=1, - norm="BN", - stride_in_1x1=False, - dilation=1, - deform_modulated=False, - deform_num_groups=1, - basewidth=26, - scale=4, - ): - super().__init__(in_channels, out_channels, stride) - self.deform_modulated = deform_modulated - - if in_channels != out_channels: - # self.shortcut = Conv2d( - # in_channels, - # out_channels, - # kernel_size=1, - # stride=stride, - # bias=False, - # norm=get_norm(norm, out_channels), - # ) - self.shortcut = nn.Sequential( - nn.AvgPool2d(kernel_size=stride, stride=stride, - ceil_mode=True, count_include_pad=False), - Conv2d( - in_channels, - out_channels, - kernel_size=1, - stride=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - ) - else: - self.shortcut = None - - stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) - width = bottleneck_channels//scale - - self.conv1 = Conv2d( - in_channels, - bottleneck_channels, - kernel_size=1, - stride=stride_1x1, - bias=False, - norm=get_norm(norm, bottleneck_channels), - ) - - if scale == 1: - self.nums = 1 - else: - self.nums = scale -1 - if self.in_channels!=self.out_channels and stride_3x3!=2: - self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) - - if deform_modulated: - deform_conv_op = ModulatedDeformConv - # offset channels are 2 or 3 (if with modulated) * kernel_size * kernel_size - offset_channels = 27 - else: - deform_conv_op = DeformConv - offset_channels = 18 - - # self.conv2_offset = Conv2d( - # bottleneck_channels, - # offset_channels * deform_num_groups, - # kernel_size=3, - # stride=stride_3x3, - # padding=1 * dilation, - # dilation=dilation, - # ) - # self.conv2 = deform_conv_op( - # bottleneck_channels, - # bottleneck_channels, - # kernel_size=3, - # stride=stride_3x3, - # padding=1 * dilation, - # bias=False, - # groups=num_groups, - # dilation=dilation, - # deformable_groups=deform_num_groups, - # norm=get_norm(norm, bottleneck_channels), - # ) - - conv2_offsets = [] - convs = [] - bns = [] - for i in range(self.nums): - conv2_offsets.append(Conv2d( - width, - offset_channels * deform_num_groups, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - )) - convs.append(deform_conv_op( - width, - width, - kernel_size=3, - stride=stride_3x3, - padding=1 * dilation, - bias=False, - groups=num_groups, - dilation=dilation, - deformable_groups=deform_num_groups, - )) - bns.append(get_norm(norm, width)) - self.conv2_offsets = nn.ModuleList(conv2_offsets) - self.convs = nn.ModuleList(convs) - self.bns = 
nn.ModuleList(bns) - - self.conv3 = Conv2d( - bottleneck_channels, - out_channels, - kernel_size=1, - bias=False, - norm=get_norm(norm, out_channels), - ) - self.scale = scale - self.width = width - self.in_channels = in_channels - self.out_channels = out_channels - self.stride_3x3 = stride_3x3 - # for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: - # if layer is not None: # shortcut can be None - # weight_init.c2_msra_fill(layer) - - # nn.init.constant_(self.conv2_offset.weight, 0) - # nn.init.constant_(self.conv2_offset.bias, 0) - for layer in [self.conv1, self.conv3]: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - if self.shortcut is not None: - for layer in self.shortcut.modules(): - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - for layer in self.convs: - if layer is not None: # shortcut can be None - weight_init.c2_msra_fill(layer) - - for layer in self.conv2_offsets: - if layer.weight is not None: - nn.init.constant_(layer.weight, 0) - if layer.bias is not None: - nn.init.constant_(layer.bias, 0) - - def forward(self, x): - out = self.conv1(x) - out = F.relu_(out) - - # if self.deform_modulated: - # offset_mask = self.conv2_offset(out) - # offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - # offset = torch.cat((offset_x, offset_y), dim=1) - # mask = mask.sigmoid() - # out = self.conv2(out, offset, mask) - # else: - # offset = self.conv2_offset(out) - # out = self.conv2(out, offset) - # out = F.relu_(out) - - spx = torch.split(out, self.width, 1) - for i in range(self.nums): - if i==0 or self.in_channels!=self.out_channels: - sp = spx[i].contiguous() - else: - sp = sp + spx[i].contiguous() - - # sp = self.convs[i](sp) - if self.deform_modulated: - offset_mask = self.conv2_offsets[i](sp) - offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) - offset = torch.cat((offset_x, offset_y), dim=1) - mask = mask.sigmoid() - sp = self.convs[i](sp, offset, mask) - else: - offset = self.conv2_offsets[i](sp) - sp = self.convs[i](sp, offset) - sp = F.relu_(self.bns[i](sp)) - if i==0: - out = sp - else: - out = torch.cat((out, sp), 1) - if self.scale!=1 and self.stride_3x3==1: - out = torch.cat((out, spx[self.nums]), 1) - elif self.scale != 1 and self.stride_3x3==2: - out = torch.cat((out, self.pool(spx[self.nums])), 1) - - out = self.conv3(out) - - if self.shortcut is not None: - shortcut = self.shortcut(x) - else: - shortcut = x - - out += shortcut - out = F.relu_(out) - return out - - -def make_stage(block_class, num_blocks, first_stride, *, in_channels, out_channels, **kwargs): - """ - Create a list of blocks just like those in a ResNet stage. - Args: - block_class (type): a subclass of ResNetBlockBase - num_blocks (int): - first_stride (int): the stride of the first block. The other blocks will have stride=1. - in_channels (int): input channels of the entire stage. - out_channels (int): output channels of **every block** in the stage. - kwargs: other arguments passed to the constructor of every block. - Returns: - list[nn.Module]: a list of block module. - """ - assert "stride" not in kwargs, "Stride of blocks in make_stage cannot be changed." - blocks = [] - for i in range(num_blocks): - blocks.append( - block_class( - in_channels=in_channels, - out_channels=out_channels, - stride=first_stride if i == 0 else 1, - **kwargs, - ) - ) - in_channels = out_channels - return blocks - - -class BasicStem(CNNBlockBase): - """ - The standard ResNet stem (layers before the first residual block). 
- """ - - def __init__(self, in_channels=3, out_channels=64, norm="BN"): - """ - Args: - norm (str or callable): norm after the first conv layer. - See :func:`layers.get_norm` for supported format. - """ - super().__init__(in_channels, out_channels, 4) - self.in_channels = in_channels - self.conv1 = nn.Sequential( - Conv2d( - in_channels, - 32, - kernel_size=3, - stride=2, - padding=1, - bias=False, - ), - get_norm(norm, 32), - nn.ReLU(inplace=True), - Conv2d( - 32, - 32, - kernel_size=3, - stride=1, - padding=1, - bias=False, - ), - get_norm(norm, 32), - nn.ReLU(inplace=True), - Conv2d( - 32, - out_channels, - kernel_size=3, - stride=1, - padding=1, - bias=False, - ), - ) - self.bn1 = get_norm(norm, out_channels) - - for layer in self.conv1: - if isinstance(layer, Conv2d): - weight_init.c2_msra_fill(layer) - - def forward(self, x): - x = self.conv1(x) - x = self.bn1(x) - x = F.relu_(x) - x = F.max_pool2d(x, kernel_size=3, stride=2, padding=1) - return x - - -class ResNet(Backbone): - def __init__(self, stem, stages, num_classes=None, out_features=None): - """ - Args: - stem (nn.Module): a stem module - stages (list[list[CNNBlockBase]]): several (typically 4) stages, - each contains multiple :class:`CNNBlockBase`. - num_classes (None or int): if None, will not perform classification. - Otherwise, will create a linear layer. - out_features (list[str]): name of the layers whose outputs should - be returned in forward. Can be anything in "stem", "linear", or "res2" ... - If None, will return the output of the last layer. - """ - super(ResNet, self).__init__() - self.stem = stem - self.num_classes = num_classes - - current_stride = self.stem.stride - self._out_feature_strides = {"stem": current_stride} - self._out_feature_channels = {"stem": self.stem.out_channels} - - self.stages_and_names = [] - for i, blocks in enumerate(stages): - assert len(blocks) > 0, len(blocks) - for block in blocks: - assert isinstance(block, CNNBlockBase), block - - name = "res" + str(i + 2) - stage = nn.Sequential(*blocks) - - self.add_module(name, stage) - self.stages_and_names.append((stage, name)) - - self._out_feature_strides[name] = current_stride = int( - current_stride * np.prod([k.stride for k in blocks]) - ) - self._out_feature_channels[name] = curr_channels = blocks[-1].out_channels - - if num_classes is not None: - self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) - self.linear = nn.Linear(curr_channels, num_classes) - - # Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": - # "The 1000-way fully-connected layer is initialized by - # drawing weights from a zero-mean Gaussian with standard deviation of 0.01." 
- nn.init.normal_(self.linear.weight, std=0.01) - name = "linear" - - if out_features is None: - out_features = [name] - self._out_features = out_features - assert len(self._out_features) - children = [x[0] for x in self.named_children()] - for out_feature in self._out_features: - assert out_feature in children, "Available children: {}".format(", ".join(children)) - - def forward(self, x): - outputs = {} - x = self.stem(x) - if "stem" in self._out_features: - outputs["stem"] = x - for stage, name in self.stages_and_names: - x = stage(x) - if name in self._out_features: - outputs[name] = x - if self.num_classes is not None: - x = self.avgpool(x) - x = torch.flatten(x, 1) - x = self.linear(x) - if "linear" in self._out_features: - outputs["linear"] = x - return outputs - - def output_shape(self): - return { - name: ShapeSpec( - channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] - ) - for name in self._out_features - } - - def freeze(self, freeze_at=0): - """ - Freeze the first several stages of the ResNet. Commonly used in - fine-tuning. - Args: - freeze_at (int): number of stem and stages to freeze. - `1` means freezing the stem. `2` means freezing the stem and - the first stage, etc. - Returns: - nn.Module: this ResNet itself - """ - if freeze_at >= 1: - self.stem.freeze() - for idx, (stage, _) in enumerate(self.stages_and_names, start=2): - if freeze_at >= idx: - for block in stage.children(): - block.freeze() - return self - - -@BACKBONE_REGISTRY.register() -def build_res2net_backbone(cfg, input_shape): - """ - Create a Res2Net instance from config. - Returns: - ResNet: a :class:`ResNet` instance. - """ - # need registration of new blocks/stems? - norm = cfg.MODEL.RESNETS.NORM - stem = BasicStem( - in_channels=input_shape.channels, - out_channels=cfg.MODEL.RESNETS.STEM_OUT_CHANNELS, - norm=norm, - ) - - # fmt: off - freeze_at = cfg.MODEL.BACKBONE.FREEZE_AT - out_features = cfg.MODEL.RESNETS.OUT_FEATURES - depth = cfg.MODEL.RESNETS.DEPTH - num_groups = cfg.MODEL.RESNETS.NUM_GROUPS - width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP - scale = 4 - bottleneck_channels = num_groups * width_per_group * scale - in_channels = cfg.MODEL.RESNETS.STEM_OUT_CHANNELS - out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS - stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 - res5_dilation = cfg.MODEL.RESNETS.RES5_DILATION - deform_on_per_stage = cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE - deform_modulated = cfg.MODEL.RESNETS.DEFORM_MODULATED - deform_num_groups = cfg.MODEL.RESNETS.DEFORM_NUM_GROUPS - # fmt: on - assert res5_dilation in {1, 2}, "res5_dilation cannot be {}.".format(res5_dilation) - - num_blocks_per_stage = { - 18: [2, 2, 2, 2], - 34: [3, 4, 6, 3], - 50: [3, 4, 6, 3], - 101: [3, 4, 23, 3], - 152: [3, 8, 36, 3], - }[depth] - - if depth in [18, 34]: - assert out_channels == 64, "Must set MODEL.RESNETS.RES2_OUT_CHANNELS = 64 for R18/R34" - assert not any( - deform_on_per_stage - ), "MODEL.RESNETS.DEFORM_ON_PER_STAGE unsupported for R18/R34" - assert res5_dilation == 1, "Must set MODEL.RESNETS.RES5_DILATION = 1 for R18/R34" - assert num_groups == 1, "Must set MODEL.RESNETS.NUM_GROUPS = 1 for R18/R34" - - stages = [] - - # Avoid creating variables without gradients - # It consumes extra memory and may cause allreduce to fail - out_stage_idx = [{"res2": 2, "res3": 3, "res4": 4, "res5": 5}[f] for f in out_features] - max_stage_idx = max(out_stage_idx) - for idx, stage_idx in enumerate(range(2, max_stage_idx + 1)): - dilation = res5_dilation if stage_idx == 5 else 1 - 
first_stride = 1 if idx == 0 or (stage_idx == 5 and dilation == 2) else 2 - stage_kargs = { - "num_blocks": num_blocks_per_stage[idx], - "first_stride": first_stride, - "in_channels": in_channels, - "out_channels": out_channels, - "norm": norm, - } - # Use BasicBlock for R18 and R34. - if depth in [18, 34]: - stage_kargs["block_class"] = BasicBlock - else: - stage_kargs["bottleneck_channels"] = bottleneck_channels - stage_kargs["stride_in_1x1"] = stride_in_1x1 - stage_kargs["dilation"] = dilation - stage_kargs["num_groups"] = num_groups - stage_kargs["scale"] = scale - - if deform_on_per_stage[idx]: - stage_kargs["block_class"] = DeformBottleneckBlock - stage_kargs["deform_modulated"] = deform_modulated - stage_kargs["deform_num_groups"] = deform_num_groups - else: - stage_kargs["block_class"] = BottleneckBlock - blocks = make_stage(**stage_kargs) - in_channels = out_channels - out_channels *= 2 - bottleneck_channels *= 2 - stages.append(blocks) - return ResNet(stem, stages, out_features=out_features).freeze(freeze_at) - - -@BACKBONE_REGISTRY.register() -def build_p67_res2net_fpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. - """ - bottom_up = build_res2net_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - out_channels = cfg.MODEL.FPN.OUT_CHANNELS - backbone = FPN( - bottom_up=bottom_up, - in_features=in_features, - out_channels=out_channels, - norm=cfg.MODEL.FPN.NORM, - top_block=LastLevelP6P7_P5(out_channels, out_channels), - fuse_type=cfg.MODEL.FPN.FUSE_TYPE, - ) - return backbone - - -@BACKBONE_REGISTRY.register() -def build_res2net_bifpn_backbone(cfg, input_shape: ShapeSpec): - """ - Args: - cfg: a detectron2 CfgNode - - Returns: - backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
- """ - bottom_up = build_res2net_backbone(cfg, input_shape) - in_features = cfg.MODEL.FPN.IN_FEATURES - backbone = BiFPN( - cfg=cfg, - bottom_up=bottom_up, - in_features=in_features, - out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, - norm=cfg.MODEL.BIFPN.NORM, - num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, - num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, - separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, - ) - return backbone \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py deleted file mode 100755 index 0a4437fb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/debug.py +++ /dev/null @@ -1,283 +0,0 @@ -import cv2 -import numpy as np -import torch -import torch.nn.functional as F - -COLORS = ((np.random.rand(1300, 3) * 0.4 + 0.6) * 255).astype( - np.uint8).reshape(1300, 1, 1, 3) - -def _get_color_image(heatmap): - heatmap = heatmap.reshape( - heatmap.shape[0], heatmap.shape[1], heatmap.shape[2], 1) - if heatmap.shape[0] == 1: - color_map = (heatmap * np.ones((1, 1, 1, 3), np.uint8) * 255).max( - axis=0).astype(np.uint8) # H, W, 3 - else: - color_map = (heatmap * COLORS[:heatmap.shape[0]]).max(axis=0).astype(np.uint8) # H, W, 3 - - return color_map - -def _blend_image(image, color_map, a=0.7): - color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) - ret = np.clip(image * (1 - a) + color_map * a, 0, 255).astype(np.uint8) - return ret - -def _blend_image_heatmaps(image, color_maps, a=0.7): - merges = np.zeros((image.shape[0], image.shape[1], 3), np.float32) - for color_map in color_maps: - color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) - merges = np.maximum(merges, color_map) - ret = np.clip(image * (1 - a) + merges * a, 0, 255).astype(np.uint8) - return ret - -def _decompose_level(x, shapes_per_level, N): - ''' - x: LNHiWi x C - ''' - x = x.view(x.shape[0], -1) - ret = [] - st = 0 - for l in range(len(shapes_per_level)): - ret.append([]) - h = shapes_per_level[l][0].int().item() - w = shapes_per_level[l][1].int().item() - for i in range(N): - ret[l].append(x[st + h * w * i:st + h * w * (i + 1)].view( - h, w, -1).permute(2, 0, 1)) - st += h * w * N - return ret - -def _imagelist_to_tensor(images): - images = [x for x in images] - image_sizes = [x.shape[-2:] for x in images] - h = max([size[0] for size in image_sizes]) - w = max([size[1] for size in image_sizes]) - S = 32 - h, w = ((h - 1) // S + 1) * S, ((w - 1) // S + 1) * S - images = [F.pad(x, (0, w - x.shape[2], 0, h - x.shape[1], 0, 0)) \ - for x in images] - images = torch.stack(images) - return images - - -def _ind2il(ind, shapes_per_level, N): - r = ind - l = 0 - S = 0 - while r - S >= N * shapes_per_level[l][0] * shapes_per_level[l][1]: - S += N * shapes_per_level[l][0] * shapes_per_level[l][1] - l += 1 - i = (r - S) // (shapes_per_level[l][0] * shapes_per_level[l][1]) - return i, l - -def debug_train( - images, gt_instances, flattened_hms, reg_targets, labels, pos_inds, - shapes_per_level, locations, strides): - ''' - images: N x 3 x H x W - flattened_hms: LNHiWi x C - shapes_per_level: L x 2 [(H_i, W_i)] - locations: LNHiWi x 2 - ''' - reg_inds = torch.nonzero( - reg_targets.max(dim=1)[0] > 0).squeeze(1) - N = len(images) - images = _imagelist_to_tensor(images) - repeated_locations = [torch.cat([loc] * N, dim=0) \ - for loc in locations] - locations = torch.cat(repeated_locations, dim=0) - gt_hms 
= _decompose_level(flattened_hms, shapes_per_level, N) - masks = flattened_hms.new_zeros((flattened_hms.shape[0], 1)) - masks[pos_inds] = 1 - masks = _decompose_level(masks, shapes_per_level, N) - for i in range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0) - color_maps = [] - for l in range(len(gt_hms)): - color_map = _get_color_image( - gt_hms[l][i].detach().cpu().numpy()) - color_maps.append(color_map) - cv2.imshow('gthm_{}'.format(l), color_map) - blend = _blend_image_heatmaps(image.copy(), color_maps) - if gt_instances is not None: - bboxes = gt_instances[i].gt_boxes.tensor - for j in range(len(bboxes)): - bbox = bboxes[j] - cv2.rectangle( - blend, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (0, 0, 255), 3, cv2.LINE_AA) - - for j in range(len(pos_inds)): - image_id, l = _ind2il(pos_inds[j], shapes_per_level, N) - if image_id != i: - continue - loc = locations[pos_inds[j]] - cv2.drawMarker( - blend, (int(loc[0]), int(loc[1])), (0, 255, 255), - markerSize=(l + 1) * 16) - - for j in range(len(reg_inds)): - image_id, l = _ind2il(reg_inds[j], shapes_per_level, N) - if image_id != i: - continue - ltrb = reg_targets[reg_inds[j]] - ltrb *= strides[l] - loc = locations[reg_inds[j]] - bbox = [(loc[0] - ltrb[0]), (loc[1] - ltrb[1]), - (loc[0] + ltrb[2]), (loc[1] + ltrb[3])] - cv2.rectangle( - blend, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (255, 0, 0), 1, cv2.LINE_AA) - cv2.circle(blend, (int(loc[0]), int(loc[1])), 2, (255, 0, 0), -1) - - cv2.imshow('blend', blend) - cv2.waitKey() - - -def debug_test( - images, logits_pred, reg_pred, agn_hm_pred=[], preds=[], - vis_thresh=0.3, debug_show_name=False, mult_agn=False): - ''' - images: N x 3 x H x W - class_target: LNHiWi x C - cat_agn_heatmap: LNHiWi - shapes_per_level: L x 2 [(H_i, W_i)] - ''' - N = len(images) - for i in range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0) - result = image.copy().astype(np.uint8) - pred_image = image.copy().astype(np.uint8) - color_maps = [] - L = len(logits_pred) - for l in range(L): - if logits_pred[0] is not None: - stride = min(image.shape[0], image.shape[1]) / min( - logits_pred[l][i].shape[1], logits_pred[l][i].shape[2]) - else: - stride = min(image.shape[0], image.shape[1]) / min( - agn_hm_pred[l][i].shape[1], agn_hm_pred[l][i].shape[2]) - stride = stride if stride < 60 else 64 if stride < 100 else 128 - if logits_pred[0] is not None: - if mult_agn: - logits_pred[l][i] = logits_pred[l][i] * agn_hm_pred[l][i] - color_map = _get_color_image( - logits_pred[l][i].detach().cpu().numpy()) - color_maps.append(color_map) - cv2.imshow('predhm_{}'.format(l), color_map) - - if debug_show_name: - from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES - cat2name = [x['name'] for x in LVIS_CATEGORIES] - for j in range(len(preds[i].scores) if preds is not None else 0): - if preds[i].scores[j] > vis_thresh: - bbox = preds[i].proposal_boxes[j] \ - if preds[i].has('proposal_boxes') else \ - preds[i].pred_boxes[j] - bbox = bbox.tensor[0].detach().cpu().numpy().astype(np.int32) - cat = int(preds[i].pred_classes[j]) \ - if preds[i].has('pred_classes') else 0 - cl = COLORS[cat, 0, 0] - cv2.rectangle( - pred_image, (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - (int(cl[0]), int(cl[1]), int(cl[2])), 2, cv2.LINE_AA) - if debug_show_name: - txt = '{}{:.1f}'.format( - cat2name[cat] if cat > 0 else '', - preds[i].scores[j]) - font = cv2.FONT_HERSHEY_SIMPLEX - cat_size = cv2.getTextSize(txt, font, 
0.5, 2)[0] - cv2.rectangle( - pred_image, - (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), - (int(bbox[0] + cat_size[0]), int(bbox[1] - 2)), - (int(cl[0]), int(cl[1]), int(cl[2])), -1) - cv2.putText( - pred_image, txt, (int(bbox[0]), int(bbox[1] - 2)), - font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) - - - if agn_hm_pred[l] is not None: - agn_hm_ = agn_hm_pred[l][i, 0, :, :, None].detach().cpu().numpy() - agn_hm_ = (agn_hm_ * np.array([255, 255, 255]).reshape( - 1, 1, 3)).astype(np.uint8) - cv2.imshow('agn_hm_{}'.format(l), agn_hm_) - blend = _blend_image_heatmaps(image.copy(), color_maps) - cv2.imshow('blend', blend) - cv2.imshow('preds', pred_image) - cv2.waitKey() - -global cnt -cnt = 0 - -def debug_second_stage(images, instances, proposals=None, vis_thresh=0.3, - save_debug=False, debug_show_name=False): - images = _imagelist_to_tensor(images) - if debug_show_name: - from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES - cat2name = [x['name'] for x in LVIS_CATEGORIES] - for i in range(len(images)): - image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() - if instances[i].has('gt_boxes'): - bboxes = instances[i].gt_boxes.tensor.cpu().numpy() - scores = np.ones(bboxes.shape[0]) - cats = instances[i].gt_classes.cpu().numpy() - else: - bboxes = instances[i].pred_boxes.tensor.cpu().numpy() - scores = instances[i].scores.cpu().numpy() - cats = instances[i].pred_classes.cpu().numpy() - for j in range(len(bboxes)): - if scores[j] > vis_thresh: - bbox = bboxes[j] - cl = COLORS[cats[j], 0, 0] - cl = (int(cl[0]), int(cl[1]), int(cl[2])) - cv2.rectangle( - image, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - cl, 2, cv2.LINE_AA) - if debug_show_name: - cat = cats[j] - txt = '{}{:.1f}'.format( - cat2name[cat] if cat > 0 else '', - scores[j]) - font = cv2.FONT_HERSHEY_SIMPLEX - cat_size = cv2.getTextSize(txt, font, 0.5, 2)[0] - cv2.rectangle( - image, - (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), - (int(bbox[0] + cat_size[0]), int(bbox[1] - 2)), - (int(cl[0]), int(cl[1]), int(cl[2])), -1) - cv2.putText( - image, txt, (int(bbox[0]), int(bbox[1] - 2)), - font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) - if proposals is not None: - proposal_image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() - bboxes = proposals[i].proposal_boxes.tensor.cpu().numpy() - if proposals[i].has('scores'): - scores = proposals[i].scores.cpu().numpy() - else: - scores = proposals[i].objectness_logits.sigmoid().cpu().numpy() - for j in range(len(bboxes)): - if scores[j] > vis_thresh: - bbox = bboxes[j] - cl = (209, 159, 83) - cv2.rectangle( - proposal_image, - (int(bbox[0]), int(bbox[1])), - (int(bbox[2]), int(bbox[3])), - cl, 2, cv2.LINE_AA) - - cv2.imshow('image', image) - if proposals is not None: - cv2.imshow('proposals', proposal_image) - if save_debug: - global cnt - cnt += 1 - cv2.imwrite('output/save_debug/{}.jpg'.format(cnt), proposal_image) - cv2.waitKey() \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py deleted file mode 100755 index feb7a822..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet.py +++ /dev/null @@ -1,864 +0,0 @@ - -import math -import json -import copy -from typing import List, Dict -import numpy as np 
-import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.modeling.proposal_generator.build import PROPOSAL_GENERATOR_REGISTRY -from detectron2.layers import ShapeSpec, cat -from detectron2.structures import Instances, Boxes -from detectron2.modeling import detector_postprocess -from detectron2.utils.comm import get_world_size -from detectron2.config import configurable - -from ..layers.heatmap_focal_loss import heatmap_focal_loss_jit -from ..layers.heatmap_focal_loss import binary_heatmap_focal_loss -from ..layers.iou_loss import IOULoss -from ..layers.ml_nms import ml_nms -from ..debug import debug_train, debug_test -from .utils import reduce_sum, _transpose -from .centernet_head import CenterNetHead - -__all__ = ["CenterNet"] - -INF = 100000000 - -@PROPOSAL_GENERATOR_REGISTRY.register() -class CenterNet(nn.Module): - @configurable - def __init__(self, - # input_shape: Dict[str, ShapeSpec], - in_channels=256, - *, - num_classes=80, - in_features=("p3", "p4", "p5", "p6", "p7"), - strides=(8, 16, 32, 64, 128), - score_thresh=0.05, - hm_min_overlap=0.8, - loc_loss_type='giou', - min_radius=4, - hm_focal_alpha=0.25, - hm_focal_beta=4, - loss_gamma=2.0, - reg_weight=2.0, - not_norm_reg=True, - with_agn_hm=False, - only_proposal=False, - as_proposal=False, - not_nms=False, - pos_weight=1., - neg_weight=1., - sigmoid_clamp=1e-4, - ignore_high_fp=-1., - center_nms=False, - sizes_of_interest=[[0,80],[64,160],[128,320],[256,640],[512,10000000]], - more_pos=False, - more_pos_thresh=0.2, - more_pos_topk=9, - pre_nms_topk_train=1000, - pre_nms_topk_test=1000, - post_nms_topk_train=100, - post_nms_topk_test=100, - nms_thresh_train=0.6, - nms_thresh_test=0.6, - no_reduce=False, - debug=False, - vis_thresh=0.5, - pixel_mean=[103.530,116.280,123.675], - pixel_std=[1.0,1.0,1.0], - device='cuda', - centernet_head=None, - ): - super().__init__() - self.num_classes = num_classes - self.in_features = in_features - self.strides = strides - self.score_thresh = score_thresh - self.min_radius = min_radius - self.hm_focal_alpha = hm_focal_alpha - self.hm_focal_beta = hm_focal_beta - self.loss_gamma = loss_gamma - self.reg_weight = reg_weight - self.not_norm_reg = not_norm_reg - self.with_agn_hm = with_agn_hm - self.only_proposal = only_proposal - self.as_proposal = as_proposal - self.not_nms = not_nms - self.pos_weight = pos_weight - self.neg_weight = neg_weight - self.sigmoid_clamp = sigmoid_clamp - self.ignore_high_fp = ignore_high_fp - self.center_nms = center_nms - self.sizes_of_interest = sizes_of_interest - self.more_pos = more_pos - self.more_pos_thresh = more_pos_thresh - self.more_pos_topk = more_pos_topk - self.pre_nms_topk_train = pre_nms_topk_train - self.pre_nms_topk_test = pre_nms_topk_test - self.post_nms_topk_train = post_nms_topk_train - self.post_nms_topk_test = post_nms_topk_test - self.nms_thresh_train = nms_thresh_train - self.nms_thresh_test = nms_thresh_test - self.no_reduce = no_reduce - self.debug = debug - self.vis_thresh = vis_thresh - if self.center_nms: - self.not_nms = True - self.iou_loss = IOULoss(loc_loss_type) - assert (not self.only_proposal) or self.with_agn_hm - # delta for rendering heatmap - self.delta = (1 - hm_min_overlap) / (1 + hm_min_overlap) - if centernet_head is None: - self.centernet_head = CenterNetHead( - in_channels=in_channels, - num_levels=len(in_features), - with_agn_hm=with_agn_hm, - only_proposal=only_proposal) - else: - self.centernet_head = centernet_head - if self.debug: - pixel_mean = torch.Tensor(pixel_mean).to( - 
torch.device(device)).view(3, 1, 1) - pixel_std = torch.Tensor(pixel_std).to( - torch.device(device)).view(3, 1, 1) - self.denormalizer = lambda x: x * pixel_std + pixel_mean - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - # 'input_shape': input_shape, - 'in_channels': input_shape[ - cfg.MODEL.CENTERNET.IN_FEATURES[0]].channels, - 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, - 'in_features': cfg.MODEL.CENTERNET.IN_FEATURES, - 'strides': cfg.MODEL.CENTERNET.FPN_STRIDES, - 'score_thresh': cfg.MODEL.CENTERNET.INFERENCE_TH, - 'loc_loss_type': cfg.MODEL.CENTERNET.LOC_LOSS_TYPE, - 'hm_min_overlap': cfg.MODEL.CENTERNET.HM_MIN_OVERLAP, - 'min_radius': cfg.MODEL.CENTERNET.MIN_RADIUS, - 'hm_focal_alpha': cfg.MODEL.CENTERNET.HM_FOCAL_ALPHA, - 'hm_focal_beta': cfg.MODEL.CENTERNET.HM_FOCAL_BETA, - 'loss_gamma': cfg.MODEL.CENTERNET.LOSS_GAMMA, - 'reg_weight': cfg.MODEL.CENTERNET.REG_WEIGHT, - 'not_norm_reg': cfg.MODEL.CENTERNET.NOT_NORM_REG, - 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, - 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, - 'as_proposal': cfg.MODEL.CENTERNET.AS_PROPOSAL, - 'not_nms': cfg.MODEL.CENTERNET.NOT_NMS, - 'pos_weight': cfg.MODEL.CENTERNET.POS_WEIGHT, - 'neg_weight': cfg.MODEL.CENTERNET.NEG_WEIGHT, - 'sigmoid_clamp': cfg.MODEL.CENTERNET.SIGMOID_CLAMP, - 'ignore_high_fp': cfg.MODEL.CENTERNET.IGNORE_HIGH_FP, - 'center_nms': cfg.MODEL.CENTERNET.CENTER_NMS, - 'sizes_of_interest': cfg.MODEL.CENTERNET.SOI, - 'more_pos': cfg.MODEL.CENTERNET.MORE_POS, - 'more_pos_thresh': cfg.MODEL.CENTERNET.MORE_POS_THRESH, - 'more_pos_topk': cfg.MODEL.CENTERNET.MORE_POS_TOPK, - 'pre_nms_topk_train': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN, - 'pre_nms_topk_test': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TEST, - 'post_nms_topk_train': cfg.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN, - 'post_nms_topk_test': cfg.MODEL.CENTERNET.POST_NMS_TOPK_TEST, - 'nms_thresh_train': cfg.MODEL.CENTERNET.NMS_TH_TRAIN, - 'nms_thresh_test': cfg.MODEL.CENTERNET.NMS_TH_TEST, - 'no_reduce': cfg.MODEL.CENTERNET.NO_REDUCE, - 'debug': cfg.DEBUG, - 'vis_thresh': cfg.VIS_THRESH, - 'pixel_mean': cfg.MODEL.PIXEL_MEAN, - 'pixel_std': cfg.MODEL.PIXEL_STD, - 'device': cfg.MODEL.DEVICE, - 'centernet_head': CenterNetHead( - cfg, [input_shape[f] for f in cfg.MODEL.CENTERNET.IN_FEATURES]), - } - return ret - - - def forward(self, images, features_dict, gt_instances): - features = [features_dict[f] for f in self.in_features] - clss_per_level, reg_pred_per_level, agn_hm_pred_per_level = \ - self.centernet_head(features) - grids = self.compute_grids(features) - shapes_per_level = grids[0].new_tensor( - [(x.shape[2], x.shape[3]) for x in reg_pred_per_level]) - - if not self.training: - return self.inference( - images, clss_per_level, reg_pred_per_level, - agn_hm_pred_per_level, grids) - else: - pos_inds, labels, reg_targets, flattened_hms = \ - self._get_ground_truth( - grids, shapes_per_level, gt_instances) - # logits_pred: M x F, reg_pred: M x 4, agn_hm_pred: M - logits_pred, reg_pred, agn_hm_pred = self._flatten_outputs( - clss_per_level, reg_pred_per_level, agn_hm_pred_per_level) - - if self.more_pos: - # add more pixels as positive if \ - # 1. they are within the center3x3 region of an object - # 2. 
their regression losses are small (<self.more_pos_thresh)
-                pos_inds, labels = self._add_more_pos(
-                    reg_pred, gt_instances, shapes_per_level)
-
-            losses = self.losses(
-                pos_inds, labels, reg_targets, flattened_hms,
-                logits_pred, reg_pred, agn_hm_pred)
-
-            proposals = None
-            if self.only_proposal:
-                agn_hm_pred_per_level = [x.sigmoid() for x in agn_hm_pred_per_level]
-                proposals = self.predict_instances(
-                    grids, agn_hm_pred_per_level, reg_pred_per_level,
-                    images.image_sizes, [None for _ in agn_hm_pred_per_level])
-            elif self.as_proposal: # category specific bbox as agnostic proposals
-                clss_per_level = [x.sigmoid() for x in clss_per_level]
-                proposals = self.predict_instances(
-                    grids, clss_per_level, reg_pred_per_level,
-                    images.image_sizes, agn_hm_pred_per_level)
-            if self.only_proposal or self.as_proposal:
-                for p in range(len(proposals)):
-                    proposals[p].proposal_boxes = proposals[p].get('pred_boxes')
-                    proposals[p].objectness_logits = proposals[p].get('scores')
-                    proposals[p].remove('pred_boxes')
-                    proposals[p].remove('scores')
-                    proposals[p].remove('pred_classes')
-
-            if self.debug:
-                debug_train(
-                    [self.denormalizer(x) for x in images],
-                    gt_instances, flattened_hms, reg_targets,
-                    labels, pos_inds, shapes_per_level, grids, self.strides)
-            return proposals, losses
-
-
-    def losses(
-        self, pos_inds, labels, reg_targets, flattened_hms,
-        logits_pred, reg_pred, agn_hm_pred):
-        '''
-        Inputs:
-            pos_inds: N
-            labels: N
-            reg_targets: M x 4
-            flattened_hms: M x C
-            logits_pred: M x C
-            reg_pred: M x 4
-            agn_hm_pred: M x 1 or None
-            N: number of positive locations in all images
-            M: number of pixels from all FPN levels
-            C: number of classes
-        '''
-        assert (torch.isfinite(reg_pred).all().item())
-        num_pos_local = pos_inds.numel()
-        num_gpus = get_world_size()
-        if self.no_reduce:
-            total_num_pos = num_pos_local * num_gpus
-        else:
-            total_num_pos = reduce_sum(
-                pos_inds.new_tensor([num_pos_local])).item()
-        num_pos_avg = max(1.0, total_num_pos / num_gpus)
-
-        losses = {}
-        if not self.only_proposal:
-            pos_loss, neg_loss = heatmap_focal_loss_jit(
-                logits_pred, flattened_hms, pos_inds, labels,
-                alpha=self.hm_focal_alpha,
-                beta=self.hm_focal_beta,
-                gamma=self.loss_gamma,
-                reduction='sum',
-                sigmoid_clamp=self.sigmoid_clamp,
-                ignore_high_fp=self.ignore_high_fp,
-            )
-            pos_loss = self.pos_weight * pos_loss / num_pos_avg
-            neg_loss = self.neg_weight * neg_loss / num_pos_avg
-            losses['loss_centernet_pos'] = pos_loss
-            losses['loss_centernet_neg'] = neg_loss
-
-        reg_inds = torch.nonzero(
-            reg_targets.max(dim=1)[0] >= 0).squeeze(1)
-        reg_pred = reg_pred[reg_inds]
-        reg_targets_pos = reg_targets[reg_inds]
-        reg_weight_map = flattened_hms.max(dim=1)[0]
-        reg_weight_map = reg_weight_map[reg_inds]
-        reg_weight_map = reg_weight_map * 0 + 1 \
-            if self.not_norm_reg else reg_weight_map
-        if self.no_reduce:
-            reg_norm = max(reg_weight_map.sum(), 1)
-        else:
-            reg_norm = max(reduce_sum(reg_weight_map.sum()).item() / num_gpus, 1)
-
-        reg_loss = self.reg_weight * self.iou_loss(
-            reg_pred, reg_targets_pos, reg_weight_map,
-            reduction='sum') / reg_norm
-        losses['loss_centernet_loc'] = reg_loss
-
-        if self.with_agn_hm:
-            cat_agn_heatmap = flattened_hms.max(dim=1)[0] # M
-            agn_pos_loss, agn_neg_loss = binary_heatmap_focal_loss(
-                agn_hm_pred, cat_agn_heatmap, pos_inds,
-                alpha=self.hm_focal_alpha,
-                beta=self.hm_focal_beta,
-                gamma=self.loss_gamma,
-                sigmoid_clamp=self.sigmoid_clamp,
-                ignore_high_fp=self.ignore_high_fp,
-            )
-            agn_pos_loss = self.pos_weight * agn_pos_loss / num_pos_avg
-            agn_neg_loss = self.neg_weight * agn_neg_loss / num_pos_avg
-            losses['loss_centernet_agn_pos'] = agn_pos_loss
-            losses['loss_centernet_agn_neg'] = agn_neg_loss
-
-        if self.debug:
-            print('losses', losses)
-            print('total_num_pos', total_num_pos)
-        return losses
-
-
-    def compute_grids(self, features):
-        grids = []
-        for level, feature in enumerate(features):
-            h, w = feature.size()[-2:]
-            shifts_x = torch.arange(
-                0, w * self.strides[level],
-                step=self.strides[level],
-                dtype=torch.float32, device=feature.device)
-            shifts_y = torch.arange(
-                0, h * self.strides[level],
-                step=self.strides[level],
-                dtype=torch.float32, device=feature.device)
-            shift_y, shift_x = torch.meshgrid(shifts_y, shifts_x)
-            shift_x = shift_x.reshape(-1)
-            shift_y = shift_y.reshape(-1)
-            grids_per_level = torch.stack((shift_x, shift_y), dim=1) + \
-                self.strides[level] // 2
-            grids.append(grids_per_level)
-        return grids
-
-
-    def _get_ground_truth(self, grids, shapes_per_level, gt_instances):
-        '''
-        Input:
-            grids: list of tensors [(hl x wl, 2)]_l
-            shapes_per_level: list of tuples L x 2
-            gt_instances: gt instances
-        Return:
-            pos_inds: N
-            labels: N
-            reg_targets: M x 4
-            flattened_hms: M x C or M x 1
-            N: number of objects in all images
-            M: number of pixels from all FPN levels
-        '''
-
-        # get positive pixel index
-        if not self.more_pos:
-            pos_inds, labels = self._get_label_inds(
-                gt_instances, shapes_per_level)
-        else:
-            pos_inds, labels = None, None
-        heatmap_channels = self.num_classes
-        L = len(grids)
-        num_loc_list = [len(loc) for loc in grids]
-        strides = torch.cat([
-            shapes_per_level.new_ones(num_loc_list[l]) * self.strides[l] \
-            for l in range(L)]).float() # M
-        reg_size_ranges = torch.cat([
-            shapes_per_level.new_tensor(self.sizes_of_interest[l]).float().view(
-                1, 2).expand(num_loc_list[l], 2) for l in range(L)]) # M x 2
-        grids = torch.cat(grids, dim=0) # M x 2
-        M = grids.shape[0]
-
-        reg_targets = []
-        flattened_hms = []
-        for i in range(len(gt_instances)): # images
-            boxes = gt_instances[i].gt_boxes.tensor # N x 4
-            area = gt_instances[i].gt_boxes.area() # N
-            gt_classes = gt_instances[i].gt_classes # N in [0, self.num_classes]
-
-            N = boxes.shape[0]
-            if N == 0:
-                reg_targets.append(grids.new_zeros((M, 4)) - INF)
-                flattened_hms.append(
-                    grids.new_zeros((
-                        M, 1 if self.only_proposal else heatmap_channels)))
-                continue
-
-            l = grids[:, 0].view(M, 1) - boxes[:, 0].view(1, N) # M x N
-            t = grids[:, 1].view(M, 1) - boxes[:, 1].view(1, N) # M x N
-            r = boxes[:, 2].view(1, N) - grids[:, 0].view(M, 1) # M
x N - b = boxes[:, 3].view(1, N) - grids[:, 1].view(M, 1) # M x N - reg_target = torch.stack([l, t, r, b], dim=2) # M x N x 4 - - centers = ((boxes[:, [0, 1]] + boxes[:, [2, 3]]) / 2) # N x 2 - centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2 - strides_expanded = strides.view(M, 1, 1).expand(M, N, 2) - centers_discret = ((centers_expanded / strides_expanded).int() * \ - strides_expanded).float() + strides_expanded / 2 # M x N x 2 - - is_peak = (((grids.view(M, 1, 2).expand(M, N, 2) - \ - centers_discret) ** 2).sum(dim=2) == 0) # M x N - is_in_boxes = reg_target.min(dim=2)[0] > 0 # M x N - is_center3x3 = self.get_center3x3( - grids, centers, strides) & is_in_boxes # M x N - is_cared_in_the_level = self.assign_reg_fpn( - reg_target, reg_size_ranges) # M x N - reg_mask = is_center3x3 & is_cared_in_the_level # M x N - - dist2 = ((grids.view(M, 1, 2).expand(M, N, 2) - \ - centers_expanded) ** 2).sum(dim=2) # M x N - dist2[is_peak] = 0 - radius2 = self.delta ** 2 * 2 * area # N - radius2 = torch.clamp( - radius2, min=self.min_radius ** 2) - weighted_dist2 = dist2 / radius2.view(1, N).expand(M, N) # M x N - reg_target = self._get_reg_targets( - reg_target, weighted_dist2.clone(), reg_mask, area) # M x 4 - - if self.only_proposal: - flattened_hm = self._create_agn_heatmaps_from_dist( - weighted_dist2.clone()) # M x 1 - else: - flattened_hm = self._create_heatmaps_from_dist( - weighted_dist2.clone(), gt_classes, - channels=heatmap_channels) # M x C - - reg_targets.append(reg_target) - flattened_hms.append(flattened_hm) - - # transpose im first training_targets to level first ones - reg_targets = _transpose(reg_targets, num_loc_list) - flattened_hms = _transpose(flattened_hms, num_loc_list) - for l in range(len(reg_targets)): - reg_targets[l] = reg_targets[l] / float(self.strides[l]) - reg_targets = cat([x for x in reg_targets], dim=0) # MB x 4 - flattened_hms = cat([x for x in flattened_hms], dim=0) # MB x C - - return pos_inds, labels, reg_targets, flattened_hms - - - def _get_label_inds(self, gt_instances, shapes_per_level): - ''' - Inputs: - gt_instances: [n_i], sum n_i = N - shapes_per_level: L x 2 [(h_l, w_l)]_L - Returns: - pos_inds: N' - labels: N' - ''' - pos_inds = [] - labels = [] - L = len(self.strides) - B = len(gt_instances) - shapes_per_level = shapes_per_level.long() - loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L - level_bases = [] - s = 0 - for l in range(L): - level_bases.append(s) - s = s + B * loc_per_level[l] - level_bases = shapes_per_level.new_tensor(level_bases).long() # L - strides_default = shapes_per_level.new_tensor(self.strides).float() # L - for im_i in range(B): - targets_per_im = gt_instances[im_i] - bboxes = targets_per_im.gt_boxes.tensor # n x 4 - n = bboxes.shape[0] - centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 - centers = centers.view(n, 1, 2).expand(n, L, 2) - strides = strides_default.view(1, L, 1).expand(n, L, 2) - centers_inds = (centers / strides).long() # n x L x 2 - Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) - pos_ind = level_bases.view(1, L).expand(n, L) + \ - im_i * loc_per_level.view(1, L).expand(n, L) + \ - centers_inds[:, :, 1] * Ws + \ - centers_inds[:, :, 0] # n x L - is_cared_in_the_level = self.assign_fpn_level(bboxes) - pos_ind = pos_ind[is_cared_in_the_level].view(-1) - label = targets_per_im.gt_classes.view( - n, 1).expand(n, L)[is_cared_in_the_level].view(-1) - - pos_inds.append(pos_ind) # n' - labels.append(label) # n' - pos_inds = torch.cat(pos_inds, 
dim=0).long() - labels = torch.cat(labels, dim=0) - return pos_inds, labels # N, N - - - def assign_fpn_level(self, boxes): - ''' - Inputs: - boxes: n x 4 - size_ranges: L x 2 - Return: - is_cared_in_the_level: n x L - ''' - size_ranges = boxes.new_tensor( - self.sizes_of_interest).view(len(self.sizes_of_interest), 2) # L x 2 - crit = ((boxes[:, 2:] - boxes[:, :2]) **2).sum(dim=1) ** 0.5 / 2 # n - n, L = crit.shape[0], size_ranges.shape[0] - crit = crit.view(n, 1).expand(n, L) - size_ranges_expand = size_ranges.view(1, L, 2).expand(n, L, 2) - is_cared_in_the_level = (crit >= size_ranges_expand[:, :, 0]) & \ - (crit <= size_ranges_expand[:, :, 1]) - return is_cared_in_the_level - - - def assign_reg_fpn(self, reg_targets_per_im, size_ranges): - ''' - TODO (Xingyi): merge it with assign_fpn_level - Inputs: - reg_targets_per_im: M x N x 4 - size_ranges: M x 2 - ''' - crit = ((reg_targets_per_im[:, :, :2] + \ - reg_targets_per_im[:, :, 2:])**2).sum(dim=2) ** 0.5 / 2 # M x N - is_cared_in_the_level = (crit >= size_ranges[:, [0]]) & \ - (crit <= size_ranges[:, [1]]) - return is_cared_in_the_level - - - def _get_reg_targets(self, reg_targets, dist, mask, area): - ''' - reg_targets (M x N x 4): long tensor - dist (M x N) - is_*: M x N - ''' - dist[mask == 0] = INF * 1.0 - min_dist, min_inds = dist.min(dim=1) # M - reg_targets_per_im = reg_targets[ - range(len(reg_targets)), min_inds] # M x N x 4 --> M x 4 - reg_targets_per_im[min_dist == INF] = - INF - return reg_targets_per_im - - - def _create_heatmaps_from_dist(self, dist, labels, channels): - ''' - dist: M x N - labels: N - return: - heatmaps: M x C - ''' - heatmaps = dist.new_zeros((dist.shape[0], channels)) - for c in range(channels): - inds = (labels == c) # N - if inds.int().sum() == 0: - continue - heatmaps[:, c] = torch.exp(-dist[:, inds].min(dim=1)[0]) - zeros = heatmaps[:, c] < 1e-4 - heatmaps[zeros, c] = 0 - return heatmaps - - - def _create_agn_heatmaps_from_dist(self, dist): - ''' - TODO (Xingyi): merge it with _create_heatmaps_from_dist - dist: M x N - return: - heatmaps: M x 1 - ''' - heatmaps = dist.new_zeros((dist.shape[0], 1)) - heatmaps[:, 0] = torch.exp(-dist.min(dim=1)[0]) - zeros = heatmaps < 1e-4 - heatmaps[zeros] = 0 - return heatmaps - - - def _flatten_outputs(self, clss, reg_pred, agn_hm_pred): - # Reshape: (N, F, Hl, Wl) -> (N, Hl, Wl, F) -> (sum_l N*Hl*Wl, F) - clss = cat([x.permute(0, 2, 3, 1).reshape(-1, x.shape[1]) \ - for x in clss], dim=0) if clss[0] is not None else None - reg_pred = cat( - [x.permute(0, 2, 3, 1).reshape(-1, 4) for x in reg_pred], dim=0) - agn_hm_pred = cat([x.permute(0, 2, 3, 1).reshape(-1) \ - for x in agn_hm_pred], dim=0) if self.with_agn_hm else None - return clss, reg_pred, agn_hm_pred - - - def get_center3x3(self, locations, centers, strides): - ''' - Inputs: - locations: M x 2 - centers: N x 2 - strides: M - ''' - M, N = locations.shape[0], centers.shape[0] - locations_expanded = locations.view(M, 1, 2).expand(M, N, 2) # M x N x 2 - centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2 - strides_expanded = strides.view(M, 1, 1).expand(M, N, 2) # M x N - centers_discret = ((centers_expanded / strides_expanded).int() * \ - strides_expanded).float() + strides_expanded / 2 # M x N x 2 - dist_x = (locations_expanded[:, :, 0] - centers_discret[:, :, 0]).abs() - dist_y = (locations_expanded[:, :, 1] - centers_discret[:, :, 1]).abs() - return (dist_x <= strides_expanded[:, :, 0]) & \ - (dist_y <= strides_expanded[:, :, 0]) - - - def inference(self, images, clss_per_level, 
reg_pred_per_level, - agn_hm_pred_per_level, grids): - logits_pred = [x.sigmoid() if x is not None else None \ - for x in clss_per_level] - agn_hm_pred_per_level = [x.sigmoid() if x is not None else None \ - for x in agn_hm_pred_per_level] - - if self.only_proposal: - proposals = self.predict_instances( - grids, agn_hm_pred_per_level, reg_pred_per_level, - images.image_sizes, [None for _ in agn_hm_pred_per_level]) - else: - proposals = self.predict_instances( - grids, logits_pred, reg_pred_per_level, - images.image_sizes, agn_hm_pred_per_level) - if self.as_proposal or self.only_proposal: - for p in range(len(proposals)): - proposals[p].proposal_boxes = proposals[p].get('pred_boxes') - proposals[p].objectness_logits = proposals[p].get('scores') - proposals[p].remove('pred_boxes') - - if self.debug: - debug_test( - [self.denormalizer(x) for x in images], - logits_pred, reg_pred_per_level, - agn_hm_pred_per_level, preds=proposals, - vis_thresh=self.vis_thresh, - debug_show_name=False) - return proposals, {} - - - def predict_instances( - self, grids, logits_pred, reg_pred, image_sizes, agn_hm_pred, - is_proposal=False): - sampled_boxes = [] - for l in range(len(grids)): - sampled_boxes.append(self.predict_single_level( - grids[l], logits_pred[l], reg_pred[l] * self.strides[l], - image_sizes, agn_hm_pred[l], l, is_proposal=is_proposal)) - boxlists = list(zip(*sampled_boxes)) - boxlists = [Instances.cat(boxlist) for boxlist in boxlists] - boxlists = self.nms_and_topK( - boxlists, nms=not self.not_nms) - return boxlists - - - def predict_single_level( - self, grids, heatmap, reg_pred, image_sizes, agn_hm, level, - is_proposal=False): - N, C, H, W = heatmap.shape - # put in the same format as grids - if self.center_nms: - heatmap_nms = nn.functional.max_pool2d( - heatmap, (3, 3), stride=1, padding=1) - heatmap = heatmap * (heatmap_nms == heatmap).float() - heatmap = heatmap.permute(0, 2, 3, 1) # N x H x W x C - heatmap = heatmap.reshape(N, -1, C) # N x HW x C - box_regression = reg_pred.view(N, 4, H, W).permute(0, 2, 3, 1) # N x H x W x 4 - box_regression = box_regression.reshape(N, -1, 4) - - candidate_inds = heatmap > self.score_thresh # 0.05 - pre_nms_top_n = candidate_inds.view(N, -1).sum(1) # N - pre_nms_topk = self.pre_nms_topk_train if self.training else self.pre_nms_topk_test - pre_nms_top_n = pre_nms_top_n.clamp(max=pre_nms_topk) # N - - if agn_hm is not None: - agn_hm = agn_hm.view(N, 1, H, W).permute(0, 2, 3, 1) - agn_hm = agn_hm.reshape(N, -1) - heatmap = heatmap * agn_hm[:, :, None] - - results = [] - for i in range(N): - per_box_cls = heatmap[i] # HW x C - per_candidate_inds = candidate_inds[i] # n - per_box_cls = per_box_cls[per_candidate_inds] # n - - per_candidate_nonzeros = per_candidate_inds.nonzero() # n - per_box_loc = per_candidate_nonzeros[:, 0] # n - per_class = per_candidate_nonzeros[:, 1] # n - - per_box_regression = box_regression[i] # HW x 4 - per_box_regression = per_box_regression[per_box_loc] # n x 4 - per_grids = grids[per_box_loc] # n x 2 - - per_pre_nms_top_n = pre_nms_top_n[i] # 1 - - if per_candidate_inds.sum().item() > per_pre_nms_top_n.item(): - per_box_cls, top_k_indices = \ - per_box_cls.topk(per_pre_nms_top_n, sorted=False) - per_class = per_class[top_k_indices] - per_box_regression = per_box_regression[top_k_indices] - per_grids = per_grids[top_k_indices] - - detections = torch.stack([ - per_grids[:, 0] - per_box_regression[:, 0], - per_grids[:, 1] - per_box_regression[:, 1], - per_grids[:, 0] + per_box_regression[:, 2], - per_grids[:, 1] + 
per_box_regression[:, 3], - ], dim=1) # n x 4 - - # avoid invalid boxes in RoI heads - detections[:, 2] = torch.max(detections[:, 2], detections[:, 0] + 0.01) - detections[:, 3] = torch.max(detections[:, 3], detections[:, 1] + 0.01) - boxlist = Instances(image_sizes[i]) - boxlist.scores = torch.sqrt(per_box_cls) \ - if self.with_agn_hm else per_box_cls # n - # import pdb; pdb.set_trace() - boxlist.pred_boxes = Boxes(detections) - boxlist.pred_classes = per_class - results.append(boxlist) - return results - - - def nms_and_topK(self, boxlists, nms=True): - num_images = len(boxlists) - results = [] - for i in range(num_images): - nms_thresh = self.nms_thresh_train if self.training else \ - self.nms_thresh_test - result = ml_nms(boxlists[i], nms_thresh) if nms else boxlists[i] - if self.debug: - print('#proposals before nms', len(boxlists[i])) - print('#proposals after nms', len(result)) - num_dets = len(result) - post_nms_topk = self.post_nms_topk_train if self.training else \ - self.post_nms_topk_test - if num_dets > post_nms_topk: - cls_scores = result.scores - image_thresh, _ = torch.kthvalue( - cls_scores.float().cpu(), - num_dets - post_nms_topk + 1 - ) - keep = cls_scores >= image_thresh.item() - keep = torch.nonzero(keep).squeeze(1) - result = result[keep] - if self.debug: - print('#proposals after filter', len(result)) - results.append(result) - return results - - - def _add_more_pos(self, reg_pred, gt_instances, shapes_per_level): - labels, level_masks, c33_inds, c33_masks, c33_regs = \ - self._get_c33_inds(gt_instances, shapes_per_level) - N, L, K = labels.shape[0], len(self.strides), 9 - c33_inds[c33_masks == 0] = 0 - reg_pred_c33 = reg_pred[c33_inds].detach() # N x L x K - invalid_reg = c33_masks == 0 - c33_regs_expand = c33_regs.view(N * L * K, 4).clamp(min=0) - if N > 0: - with torch.no_grad(): - c33_reg_loss = self.iou_loss( - reg_pred_c33.view(N * L * K, 4), - c33_regs_expand, None, - reduction='none').view(N, L, K).detach() # N x L x K - else: - c33_reg_loss = reg_pred_c33.new_zeros((N, L, K)).detach() - c33_reg_loss[invalid_reg] = INF # N x L x K - c33_reg_loss.view(N * L, K)[level_masks.view(N * L), 4] = 0 # real center - c33_reg_loss = c33_reg_loss.view(N, L * K) - if N == 0: - loss_thresh = c33_reg_loss.new_ones((N)).float() - else: - loss_thresh = torch.kthvalue( - c33_reg_loss, self.more_pos_topk, dim=1)[0] # N - loss_thresh[loss_thresh > self.more_pos_thresh] = self.more_pos_thresh # N - new_pos = c33_reg_loss.view(N, L, K) < \ - loss_thresh.view(N, 1, 1).expand(N, L, K) - pos_inds = c33_inds[new_pos].view(-1) # P - labels = labels.view(N, 1, 1).expand(N, L, K)[new_pos].view(-1) - return pos_inds, labels - - - def _get_c33_inds(self, gt_instances, shapes_per_level): - ''' - TODO (Xingyi): The current implementation is ugly. Refactor. 
- Get the center (and the 3x3 region near center) locations of each objects - Inputs: - gt_instances: [n_i], sum n_i = N - shapes_per_level: L x 2 [(h_l, w_l)]_L - ''' - labels = [] - level_masks = [] - c33_inds = [] - c33_masks = [] - c33_regs = [] - L = len(self.strides) - B = len(gt_instances) - shapes_per_level = shapes_per_level.long() - loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L - level_bases = [] - s = 0 - for l in range(L): - level_bases.append(s) - s = s + B * loc_per_level[l] - level_bases = shapes_per_level.new_tensor(level_bases).long() # L - strides_default = shapes_per_level.new_tensor(self.strides).float() # L - K = 9 - dx = shapes_per_level.new_tensor([-1, 0, 1, -1, 0, 1, -1, 0, 1]).long() - dy = shapes_per_level.new_tensor([-1, -1, -1, 0, 0, 0, 1, 1, 1]).long() - for im_i in range(B): - targets_per_im = gt_instances[im_i] - bboxes = targets_per_im.gt_boxes.tensor # n x 4 - n = bboxes.shape[0] - if n == 0: - continue - centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 - centers = centers.view(n, 1, 2).expand(n, L, 2) - - strides = strides_default.view(1, L, 1).expand(n, L, 2) # - centers_inds = (centers / strides).long() # n x L x 2 - center_grids = centers_inds * strides + strides // 2# n x L x 2 - l = center_grids[:, :, 0] - bboxes[:, 0].view(n, 1).expand(n, L) - t = center_grids[:, :, 1] - bboxes[:, 1].view(n, 1).expand(n, L) - r = bboxes[:, 2].view(n, 1).expand(n, L) - center_grids[:, :, 0] - b = bboxes[:, 3].view(n, 1).expand(n, L) - center_grids[:, :, 1] # n x L - reg = torch.stack([l, t, r, b], dim=2) # n x L x 4 - reg = reg / strides_default.view(1, L, 1).expand(n, L, 4).float() - - Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) - Hs = shapes_per_level[:, 0].view(1, L).expand(n, L) - expand_Ws = Ws.view(n, L, 1).expand(n, L, K) - expand_Hs = Hs.view(n, L, 1).expand(n, L, K) - label = targets_per_im.gt_classes.view(n).clone() - mask = reg.min(dim=2)[0] >= 0 # n x L - mask = mask & self.assign_fpn_level(bboxes) - labels.append(label) # n - level_masks.append(mask) # n x L - - Dy = dy.view(1, 1, K).expand(n, L, K) - Dx = dx.view(1, 1, K).expand(n, L, K) - c33_ind = level_bases.view(1, L, 1).expand(n, L, K) + \ - im_i * loc_per_level.view(1, L, 1).expand(n, L, K) + \ - (centers_inds[:, :, 1:2].expand(n, L, K) + Dy) * expand_Ws + \ - (centers_inds[:, :, 0:1].expand(n, L, K) + Dx) # n x L x K - - c33_mask = \ - ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) < expand_Hs) & \ - ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) >= 0) & \ - ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) < expand_Ws) & \ - ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) >= 0) - # TODO (Xingyi): think about better way to implement this - # Currently it hard codes the 3x3 region - c33_reg = reg.view(n, L, 1, 4).expand(n, L, K, 4).clone() - c33_reg[:, :, [0, 3, 6], 0] -= 1 - c33_reg[:, :, [0, 3, 6], 2] += 1 - c33_reg[:, :, [2, 5, 8], 0] += 1 - c33_reg[:, :, [2, 5, 8], 2] -= 1 - c33_reg[:, :, [0, 1, 2], 1] -= 1 - c33_reg[:, :, [0, 1, 2], 3] += 1 - c33_reg[:, :, [6, 7, 8], 1] += 1 - c33_reg[:, :, [6, 7, 8], 3] -= 1 - c33_mask = c33_mask & (c33_reg.min(dim=3)[0] >= 0) # n x L x K - c33_inds.append(c33_ind) - c33_masks.append(c33_mask) - c33_regs.append(c33_reg) - - if len(level_masks) > 0: - labels = torch.cat(labels, dim=0) - level_masks = torch.cat(level_masks, dim=0) - c33_inds = torch.cat(c33_inds, dim=0).long() - c33_regs = torch.cat(c33_regs, dim=0) - c33_masks = torch.cat(c33_masks, dim=0) - else: - labels = 
shapes_per_level.new_zeros((0)).long() - level_masks = shapes_per_level.new_zeros((0, L)).bool() - c33_inds = shapes_per_level.new_zeros((0, L, K)).long() - c33_regs = shapes_per_level.new_zeros((0, L, K, 4)).float() - c33_masks = shapes_per_level.new_zeros((0, L, K)).bool() - return labels, level_masks, c33_inds, c33_masks, c33_regs # N x L, N x L x K \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py deleted file mode 100755 index 57e0960a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/centernet_head.py +++ /dev/null @@ -1,162 +0,0 @@ -import math -from typing import List -import torch -from torch import nn -from torch.nn import functional as F - -from detectron2.layers import ShapeSpec, get_norm -from detectron2.config import configurable -from ..layers.deform_conv import DFConv2d - -__all__ = ["CenterNetHead"] - -class Scale(nn.Module): - def __init__(self, init_value=1.0): - super(Scale, self).__init__() - self.scale = nn.Parameter(torch.FloatTensor([init_value])) - - def forward(self, input): - return input * self.scale - -class CenterNetHead(nn.Module): - @configurable - def __init__(self, - # input_shape: List[ShapeSpec], - in_channels, - num_levels, - *, - num_classes=80, - with_agn_hm=False, - only_proposal=False, - norm='GN', - num_cls_convs=4, - num_box_convs=4, - num_share_convs=0, - use_deformable=False, - prior_prob=0.01): - super().__init__() - self.num_classes = num_classes - self.with_agn_hm = with_agn_hm - self.only_proposal = only_proposal - self.out_kernel = 3 - - head_configs = { - "cls": (num_cls_convs if not self.only_proposal else 0, \ - use_deformable), - "bbox": (num_box_convs, use_deformable), - "share": (num_share_convs, use_deformable)} - - # in_channels = [s.channels for s in input_shape] - # assert len(set(in_channels)) == 1, \ - # "Each level must have the same channel!" - # in_channels = in_channels[0] - channels = { - 'cls': in_channels, - 'bbox': in_channels, - 'share': in_channels, - } - for head in head_configs: - tower = [] - num_convs, use_deformable = head_configs[head] - channel = channels[head] - for i in range(num_convs): - if use_deformable and i == num_convs - 1: - conv_func = DFConv2d - else: - conv_func = nn.Conv2d - tower.append(conv_func( - in_channels if i == 0 else channel, - channel, - kernel_size=3, stride=1, - padding=1, bias=True - )) - if norm == 'GN' and channel % 32 != 0: - tower.append(nn.GroupNorm(25, channel)) - elif norm != '': - tower.append(get_norm(norm, channel)) - tower.append(nn.ReLU()) - self.add_module('{}_tower'.format(head), - nn.Sequential(*tower)) - - self.bbox_pred = nn.Conv2d( - in_channels, 4, kernel_size=self.out_kernel, - stride=1, padding=self.out_kernel // 2 - ) - - self.scales = nn.ModuleList( - [Scale(init_value=1.0) for _ in range(num_levels)]) - - for modules in [ - self.cls_tower, self.bbox_tower, - self.share_tower, - self.bbox_pred, - ]: - for l in modules.modules(): - if isinstance(l, nn.Conv2d): - torch.nn.init.normal_(l.weight, std=0.01) - torch.nn.init.constant_(l.bias, 0) - - torch.nn.init.constant_(self.bbox_pred.bias, 8.) 
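
A side note on the two head initializations here: the constant bias of 8 just above starts the regressed l/t/r/b distances away from zero, and the `bias_value` computed next implements the standard focal-loss prior trick, choosing the bias so the head's initial sigmoid output equals `prior_prob` (0.01 by default in the signature above), which keeps negative examples from swamping the loss early in training. A quick standalone check of that arithmetic (a sketch, not part of the deleted file):

import math

p = 0.01                                   # prior_prob default from the signature
b = -math.log((1 - p) / p)                 # the bias_value formula used below
assert abs(1.0 / (1.0 + math.exp(-b)) - p) < 1e-9  # sigmoid(b) == p
print(round(b, 2))                         # -4.6
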
- prior_prob = prior_prob - bias_value = -math.log((1 - prior_prob) / prior_prob) - - if self.with_agn_hm: - self.agn_hm = nn.Conv2d( - in_channels, 1, kernel_size=self.out_kernel, - stride=1, padding=self.out_kernel // 2 - ) - torch.nn.init.constant_(self.agn_hm.bias, bias_value) - torch.nn.init.normal_(self.agn_hm.weight, std=0.01) - - if not self.only_proposal: - cls_kernel_size = self.out_kernel - self.cls_logits = nn.Conv2d( - in_channels, self.num_classes, - kernel_size=cls_kernel_size, - stride=1, - padding=cls_kernel_size // 2, - ) - - torch.nn.init.constant_(self.cls_logits.bias, bias_value) - torch.nn.init.normal_(self.cls_logits.weight, std=0.01) - - @classmethod - def from_config(cls, cfg, input_shape): - ret = { - # 'input_shape': input_shape, - 'in_channels': [s.channels for s in input_shape][0], - 'num_levels': len(input_shape), - 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, - 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, - 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, - 'norm': cfg.MODEL.CENTERNET.NORM, - 'num_cls_convs': cfg.MODEL.CENTERNET.NUM_CLS_CONVS, - 'num_box_convs': cfg.MODEL.CENTERNET.NUM_BOX_CONVS, - 'num_share_convs': cfg.MODEL.CENTERNET.NUM_SHARE_CONVS, - 'use_deformable': cfg.MODEL.CENTERNET.USE_DEFORMABLE, - 'prior_prob': cfg.MODEL.CENTERNET.PRIOR_PROB, - } - return ret - - def forward(self, x): - clss = [] - bbox_reg = [] - agn_hms = [] - for l, feature in enumerate(x): - feature = self.share_tower(feature) - cls_tower = self.cls_tower(feature) - bbox_tower = self.bbox_tower(feature) - if not self.only_proposal: - clss.append(self.cls_logits(cls_tower)) - else: - clss.append(None) - - if self.with_agn_hm: - agn_hms.append(self.agn_hm(bbox_tower)) - else: - agn_hms.append(None) - reg = self.bbox_pred(bbox_tower) - reg = self.scales[l](reg) - bbox_reg.append(F.relu(reg)) - - return clss, bbox_reg, agn_hms \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py deleted file mode 100755 index c9efa287..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/dense_heads/utils.py +++ /dev/null @@ -1,38 +0,0 @@ -import cv2 -import torch -from torch import nn -from detectron2.utils.comm import get_world_size -from detectron2.structures import pairwise_iou, Boxes -# from .data import CenterNetCrop -import torch.nn.functional as F -import numpy as np -from detectron2.structures import Boxes, ImageList, Instances - -__all__ = ['reduce_sum', '_transpose'] - -INF = 1000000000 - -def _transpose(training_targets, num_loc_list): - ''' - This function is used to transpose image first training targets to - level first ones - :return: level first training targets - ''' - for im_i in range(len(training_targets)): - training_targets[im_i] = torch.split( - training_targets[im_i], num_loc_list, dim=0) - - targets_level_first = [] - for targets_per_level in zip(*training_targets): - targets_level_first.append( - torch.cat(targets_per_level, dim=0)) - return targets_level_first - - -def reduce_sum(tensor): - world_size = get_world_size() - if world_size < 2: - return tensor - tensor = tensor.clone() - torch.distributed.all_reduce(tensor, op=torch.distributed.ReduceOp.SUM) - return tensor \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py 
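[Editor's note] The bias initialization above is the standard focal-loss prior trick: choosing the bias b = -log((1 - p) / p) makes the initial sigmoid output equal the prior p, so training is not swamped by the overwhelming number of negative locations at the start. A quick numeric check (a sketch, not part of the deleted file):

```python
import math

prior_prob = 0.01
bias_value = -math.log((1 - prior_prob) / prior_prob)

# sigmoid(-log((1-p)/p)) = 1 / (1 + (1-p)/p) = p, so the check recovers the prior
sigmoid = 1.0 / (1.0 + math.exp(-bias_value))
print(bias_value)  # ~ -4.595
print(sigmoid)     # ~ 0.01
```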
b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py deleted file mode 100755 index e5650c40..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/deform_conv.py +++ /dev/null @@ -1,116 +0,0 @@ -import torch -from torch import nn - -from detectron2.layers import Conv2d - - -class _NewEmptyTensorOp(torch.autograd.Function): - @staticmethod - def forward(ctx, x, new_shape): - ctx.shape = x.shape - return x.new_empty(new_shape) - - @staticmethod - def backward(ctx, grad): - shape = ctx.shape - return _NewEmptyTensorOp.apply(grad, shape), None - - -class DFConv2d(nn.Module): - """Deformable convolutional layer""" - def __init__( - self, - in_channels, - out_channels, - with_modulated_dcn=True, - kernel_size=3, - stride=1, - groups=1, - dilation=1, - deformable_groups=1, - bias=False, - padding=None - ): - super(DFConv2d, self).__init__() - if isinstance(kernel_size, (list, tuple)): - assert isinstance(stride, (list, tuple)) - assert isinstance(dilation, (list, tuple)) - assert len(kernel_size) == 2 - assert len(stride) == 2 - assert len(dilation) == 2 - padding = ( - dilation[0] * (kernel_size[0] - 1) // 2, - dilation[1] * (kernel_size[1] - 1) // 2 - ) - offset_base_channels = kernel_size[0] * kernel_size[1] - else: - padding = dilation * (kernel_size - 1) // 2 - offset_base_channels = kernel_size * kernel_size - if with_modulated_dcn: - from detectron2.layers.deform_conv import ModulatedDeformConv - offset_channels = offset_base_channels * 3 # default: 27 - conv_block = ModulatedDeformConv - else: - from detectron2.layers.deform_conv import DeformConv - offset_channels = offset_base_channels * 2 # default: 18 - conv_block = DeformConv - self.offset = Conv2d( - in_channels, - deformable_groups * offset_channels, - kernel_size=kernel_size, - stride=stride, - padding=padding, - groups=1, - dilation=dilation - ) - nn.init.constant_(self.offset.weight, 0) - nn.init.constant_(self.offset.bias, 0) - ''' - for l in [self.offset, ]: - nn.init.kaiming_uniform_(l.weight, a=1) - torch.nn.init.constant_(l.bias, 0.) 
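[Editor's note] To make the channel bookkeeping in `DFConv2d` concrete: for a 3x3 kernel, the offset branch predicts two numbers (dx, dy) per sampling point for plain DCN (18 channels), plus one modulation scalar per point for modulated DCN v2 (27 channels), which the forward pass splits off and passes through a sigmoid. A minimal sketch of that split, with illustrative shapes:

```python
import torch

kernel_size, deformable_groups = 3, 1
offset_base = kernel_size * kernel_size             # 9 sampling points
offset_split = offset_base * deformable_groups * 2  # 18 offset channels (dx, dy)

# Modulated DCN (v2): 27 channels = 18 offsets + 9 modulation scalars.
offset_mask = torch.randn(1, offset_base * 3, 8, 8)
offset = offset_mask[:, :offset_split]              # (1, 18, 8, 8)
mask = offset_mask[:, offset_split:].sigmoid()      # (1, 9, 8, 8), values in [0, 1]
print(offset.shape, mask.shape)
```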
- ''' - self.conv = conv_block( - in_channels, - out_channels, - kernel_size=kernel_size, - stride=stride, - padding=padding, - dilation=dilation, - groups=groups, - deformable_groups=deformable_groups, - bias=bias - ) - self.with_modulated_dcn = with_modulated_dcn - self.kernel_size = kernel_size - self.stride = stride - self.padding = padding - self.dilation = dilation - self.offset_split = offset_base_channels * deformable_groups * 2 - - def forward(self, x, return_offset=False): - if x.numel() > 0: - if not self.with_modulated_dcn: - offset_mask = self.offset(x) - x = self.conv(x, offset_mask) - else: - offset_mask = self.offset(x) - offset = offset_mask[:, :self.offset_split, :, :] - mask = offset_mask[:, self.offset_split:, :, :].sigmoid() - x = self.conv(x, offset, mask) - if return_offset: - return x, offset_mask - return x - # get output shape - output_shape = [ - (i + 2 * p - (di * (k - 1) + 1)) // d + 1 - for i, p, di, k, d in zip( - x.shape[-2:], - self.padding, - self.dilation, - self.kernel_size, - self.stride - ) - ] - output_shape = [x.shape[0], self.conv.weight.shape[0]] + output_shape - return _NewEmptyTensorOp.apply(x, output_shape) \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py deleted file mode 100755 index d4693b21..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py +++ /dev/null @@ -1,92 +0,0 @@ -import torch -from torch.nn import functional as F - -# TODO: merge these two function -def heatmap_focal_loss( - inputs, - targets, - pos_inds, - labels, - alpha: float = -1, - beta: float = 4, - gamma: float = 2, - reduction: str = 'sum', - sigmoid_clamp: float = 1e-4, - ignore_high_fp: float = -1., -): - """ - Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002. - Args: - inputs: (sum_l N*Hl*Wl, C) - targets: (sum_l N*Hl*Wl, C) - pos_inds: N - labels: N - Returns: - Loss tensor with the reduction option applied. - """ - pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) - neg_weights = torch.pow(1 - targets, beta) - pos_pred_pix = pred[pos_inds] # N x C - pos_pred = pos_pred_pix.gather(1, labels.unsqueeze(1)) - pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) - neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights - - if ignore_high_fp > 0: - not_high_fp = (pred < ignore_high_fp).float() - neg_loss = not_high_fp * neg_loss - - if reduction == "sum": - pos_loss = pos_loss.sum() - neg_loss = neg_loss.sum() - - if alpha >= 0: - pos_loss = alpha * pos_loss - neg_loss = (1 - alpha) * neg_loss - - return - pos_loss, - neg_loss - -heatmap_focal_loss_jit = torch.jit.script(heatmap_focal_loss) -# heatmap_focal_loss_jit = heatmap_focal_loss - -def binary_heatmap_focal_loss( - inputs, - targets, - pos_inds, - alpha: float = -1, - beta: float = 4, - gamma: float = 2, - sigmoid_clamp: float = 1e-4, - ignore_high_fp: float = -1., -): - """ - Args: - inputs: (sum_l N*Hl*Wl,) - targets: (sum_l N*Hl*Wl,) - pos_inds: N - Returns: - Loss tensor with the reduction option applied. 
- """ - pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) - neg_weights = torch.pow(1 - targets, beta) - for i, ind in enumerate(pos_inds): - if ind >= pred.shape[0]: - print('%'*100) - print(pred.shape, ind, pos_inds) - pos_inds[i] = pred.shape[0] - 1 - pos_pred = pred[pos_inds] # N - pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) - neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights - if ignore_high_fp > 0: - not_high_fp = (pred < ignore_high_fp).float() - neg_loss = not_high_fp * neg_loss - - pos_loss = - pos_loss.sum() - neg_loss = - neg_loss.sum() - - if alpha >= 0: - pos_loss = alpha * pos_loss - neg_loss = (1 - alpha) * neg_loss - - return pos_loss, neg_loss - -# binary_heatmap_focal_loss_jit = torch.jit.script(binary_heatmap_focal_loss) \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py deleted file mode 100755 index 6a024646..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py +++ /dev/null @@ -1,121 +0,0 @@ -import torch -from torch import nn - - -class IOULoss(nn.Module): - def __init__(self, loc_loss_type='iou'): - super(IOULoss, self).__init__() - self.loc_loss_type = loc_loss_type - - def forward(self, pred, target, weight=None, reduction='sum'): - pred_left = pred[:, 0] - pred_top = pred[:, 1] - pred_right = pred[:, 2] - pred_bottom = pred[:, 3] - - target_left = target[:, 0] - target_top = target[:, 1] - target_right = target[:, 2] - target_bottom = target[:, 3] - - target_aera = (target_left + target_right) * \ - (target_top + target_bottom) - pred_aera = (pred_left + pred_right) * \ - (pred_top + pred_bottom) - - w_intersect = torch.min(pred_left, target_left) + \ - torch.min(pred_right, target_right) - h_intersect = torch.min(pred_bottom, target_bottom) + \ - torch.min(pred_top, target_top) - - g_w_intersect = torch.max(pred_left, target_left) + \ - torch.max(pred_right, target_right) - g_h_intersect = torch.max(pred_bottom, target_bottom) + \ - torch.max(pred_top, target_top) - ac_uion = g_w_intersect * g_h_intersect - - area_intersect = w_intersect * h_intersect - area_union = target_aera + pred_aera - area_intersect - - ious = (area_intersect + 1.0) / (area_union + 1.0) - gious = ious - (ac_uion - area_union) / ac_uion - if self.loc_loss_type == 'iou': - losses = -torch.log(ious) - elif self.loc_loss_type == 'linear_iou': - losses = 1 - ious - elif self.loc_loss_type == 'giou': - losses = 1 - gious - else: - raise NotImplementedError - - if weight is not None: - losses = losses * weight - else: - losses = losses - - if reduction == 'sum': - return losses.sum() - elif reduction == 'batch': - return losses.sum(dim=[1]) - elif reduction == 'none': - return losses - else: - raise NotImplementedError - - -def giou_loss( - boxes1: torch.Tensor, - boxes2: torch.Tensor, - reduction: str = "none", - eps: float = 1e-7, -) -> torch.Tensor: - """ - Generalized Intersection over Union Loss (Hamid Rezatofighi et. al) - https://arxiv.org/abs/1902.09630 - Gradient-friendly IoU loss with an additional penalty that is non-zero when the - boxes do not overlap and scales with the size of their smallest enclosing box. - This loss is symmetric, so the boxes1 and boxes2 arguments are interchangeable. 
- Args: - boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). - reduction: 'none' | 'mean' | 'sum' - 'none': No reduction will be applied to the output. - 'mean': The output will be averaged. - 'sum': The output will be summed. - eps (float): small number to prevent division by zero - """ - - x1, y1, x2, y2 = boxes1.unbind(dim=-1) - x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) - - assert (x2 >= x1).all(), "bad box: x1 larger than x2" - assert (y2 >= y1).all(), "bad box: y1 larger than y2" - - # Intersection keypoints - xkis1 = torch.max(x1, x1g) - ykis1 = torch.max(y1, y1g) - xkis2 = torch.min(x2, x2g) - ykis2 = torch.min(y2, y2g) - - intsctk = torch.zeros_like(x1) - mask = (ykis2 > ykis1) & (xkis2 > xkis1) - intsctk[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) - unionk = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsctk - iouk = intsctk / (unionk + eps) - - # smallest enclosing box - xc1 = torch.min(x1, x1g) - yc1 = torch.min(y1, y1g) - xc2 = torch.max(x2, x2g) - yc2 = torch.max(y2, y2g) - - area_c = (xc2 - xc1) * (yc2 - yc1) - miouk = iouk - ((area_c - unionk) / (area_c + eps)) - - loss = 1 - miouk - - if reduction == "mean": - loss = loss.mean() - elif reduction == "sum": - loss = loss.sum() - - return loss \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py deleted file mode 100755 index 325d709a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/layers/ml_nms.py +++ /dev/null @@ -1,31 +0,0 @@ -from detectron2.layers import batched_nms - - -def ml_nms(boxlist, nms_thresh, max_proposals=-1, - score_field="scores", label_field="labels"): - """ - Performs non-maximum suppression on a boxlist, with scores specified - in a boxlist field via score_field. 
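[Editor's note] As a quick sanity check of the `giou_loss` deleted above: identical boxes give a loss of (approximately) 0, while disjoint boxes have IoU 0 but a non-zero enclosing-box penalty, so the loss exceeds 1. A tiny single-pair re-derivation in plain Python (a sketch for intuition, not the deleted implementation):

```python
def giou(box1, box2, eps=1e-7):
    # boxes in XYXY format; returns generalized IoU for one pair
    x1, y1, x2, y2 = box1
    x1g, y1g, x2g, y2g = box2
    iw = max(0.0, min(x2, x2g) - max(x1, x1g))
    ih = max(0.0, min(y2, y2g) - max(y1, y1g))
    inter = iw * ih
    union = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - inter
    iou = inter / (union + eps)
    # smallest enclosing box penalty
    area_c = (max(x2, x2g) - min(x1, x1g)) * (max(y2, y2g) - min(y1, y1g))
    return iou - (area_c - union) / (area_c + eps)

a = [0.0, 0.0, 1.0, 1.0]
b = [2.0, 0.0, 3.0, 1.0]   # disjoint, with a 1-unit gap
print(1 - giou(a, a))      # ~0.0: identical boxes
print(1 - giou(a, b))      # ~1.333: IoU 0, enclosing box 3x1, penalty 1/3
```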
- Arguments: - boxlist(BoxList) - nms_thresh (float) - max_proposals (int): if > 0, then only the top max_proposals are kept - after non-maximum suppression - score_field (str) - """ - if nms_thresh <= 0: - return boxlist - if boxlist.has('pred_boxes'): - boxes = boxlist.pred_boxes.tensor - labels = boxlist.pred_classes - else: - boxes = boxlist.proposal_boxes.tensor - labels = boxlist.proposal_boxes.tensor.new_zeros( - len(boxlist.proposal_boxes.tensor)) - scores = boxlist.scores - - keep = batched_nms(boxes, scores, labels, nms_thresh) - if max_proposals > 0: - keep = keep[: max_proposals] - boxlist = boxlist[keep] - return boxlist diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py deleted file mode 100755 index b7525c7b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/meta_arch/centernet_detector.py +++ /dev/null @@ -1,69 +0,0 @@ -import math -import json -import numpy as np -import torch -from torch import nn - -from detectron2.modeling.meta_arch.build import META_ARCH_REGISTRY -from detectron2.modeling import build_backbone, build_proposal_generator -from detectron2.modeling import detector_postprocess -from detectron2.structures import ImageList - -@META_ARCH_REGISTRY.register() -class CenterNetDetector(nn.Module): - def __init__(self, cfg): - super().__init__() - self.mean, self.std = cfg.MODEL.PIXEL_MEAN, cfg.MODEL.PIXEL_STD - self.register_buffer("pixel_mean", torch.Tensor(cfg.MODEL.PIXEL_MEAN).view(-1, 1, 1)) - self.register_buffer("pixel_std", torch.Tensor(cfg.MODEL.PIXEL_STD).view(-1, 1, 1)) - - self.backbone = build_backbone(cfg) - self.proposal_generator = build_proposal_generator( - cfg, self.backbone.output_shape()) # TODO: change to a more precise name - - - def forward(self, batched_inputs): - if not self.training: - return self.inference(batched_inputs) - images = self.preprocess_image(batched_inputs) - features = self.backbone(images.tensor) - gt_instances = [x["instances"].to(self.device) for x in batched_inputs] - - _, proposal_losses = self.proposal_generator( - images, features, gt_instances) - return proposal_losses - - - @property - def device(self): - return self.pixel_mean.device - - - @torch.no_grad() - def inference(self, batched_inputs, do_postprocess=True): - images = self.preprocess_image(batched_inputs) - inp = images.tensor - features = self.backbone(inp) - proposals, _ = self.proposal_generator(images, features, None) - - processed_results = [] - for results_per_image, input_per_image, image_size in zip( - proposals, batched_inputs, images.image_sizes): - if do_postprocess: - height = input_per_image.get("height", image_size[0]) - width = input_per_image.get("width", image_size[1]) - r = detector_postprocess(results_per_image, height, width) - processed_results.append({"instances": r}) - else: - r = results_per_image - processed_results.append(r) - return processed_results - - def preprocess_image(self, batched_inputs): - """ - Normalize, pad and batch the input images. 
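[Editor's note] `ml_nms` above delegates to detectron2's `batched_nms`, which runs per-class NMS in a single call via the standard trick of offsetting each class's boxes by a large class-dependent constant, so boxes of different classes can never overlap. A minimal sketch of that idea using only `torchvision.ops.nms` (illustrative, not the detectron2 implementation):

```python
import torch
from torchvision.ops import nms

def batched_nms_sketch(boxes, scores, labels, iou_thresh):
    # Offset boxes per class so cross-class pairs have IoU 0; a single
    # class-agnostic NMS then behaves exactly like per-class NMS.
    if boxes.numel() == 0:
        return torch.empty(0, dtype=torch.long)
    max_coord = boxes.max()
    offsets = labels.to(boxes) * (max_coord + 1)
    return nms(boxes + offsets[:, None], scores, iou_thresh)

boxes = torch.tensor([[0., 0., 10., 10.], [1., 1., 10., 10.], [0., 0., 10., 10.]])
scores = torch.tensor([0.9, 0.8, 0.7])
labels = torch.tensor([0, 0, 1])  # box 2 overlaps box 0 but has another class
print(batched_nms_sketch(boxes, scores, labels, 0.5))  # tensor([0, 2])
```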
- """ - images = [x["image"].to(self.device) for x in batched_inputs] - images = [(x - self.pixel_mean) / self.pixel_std for x in images] - images = ImageList.from_tensors(images, self.backbone.size_divisibility) - return images diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py deleted file mode 100755 index b6d95690..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py +++ /dev/null @@ -1,124 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -# Part of the code is from https://github.com/tztztztztz/eql.detectron2/blob/master/projects/EQL/eql/fast_rcnn.py -import logging -import math -import json -from typing import Dict, Union -import torch -from fvcore.nn import giou_loss, smooth_l1_loss -from torch import nn -from torch.nn import functional as F - -from detectron2.config import configurable -from detectron2.layers import Linear, ShapeSpec, batched_nms, cat, nonzero_tuple -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.structures import Boxes, Instances -from detectron2.utils.events import get_event_storage -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers -from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference -from detectron2.modeling.roi_heads.fast_rcnn import _log_classification_stats -from detectron2.utils.comm import get_world_size -from .fed_loss import load_class_freq, get_fed_loss_inds - -__all__ = ["CustomFastRCNNOutputLayers"] - -class CustomFastRCNNOutputLayers(FastRCNNOutputLayers): - def __init__( - self, - cfg, - input_shape: ShapeSpec, - **kwargs - ): - super().__init__(cfg, input_shape, **kwargs) - - self.cfg = cfg - - def losses(self, predictions, proposals): - """ - enable advanced loss - """ - scores, proposal_deltas = predictions - gt_classes = ( - cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0) - ) - num_classes = self.num_classes - _log_classification_stats(scores, gt_classes) - - if len(proposals): - proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) # Nx4 - assert not proposal_boxes.requires_grad, "Proposals should not require gradients!" - gt_boxes = cat( - [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals], - dim=0, - ) - else: - proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device) - - loss_cls = self.softmax_cross_entropy_loss(scores, gt_classes) - return { - "loss_cls": loss_cls, - "loss_box_reg": self.box_reg_loss( - proposal_boxes, gt_boxes, proposal_deltas, gt_classes) - } - - - def sigmoid_cross_entropy_loss(self, pred_class_logits, gt_classes): - if pred_class_logits.numel() == 0: - return pred_class_logits.new_zeros([1])[0] # This is more robust than .sum() * 0. 
- - B = pred_class_logits.shape[0] - C = pred_class_logits.shape[1] - 1 - - target = pred_class_logits.new_zeros(B, C + 1) - target[range(len(gt_classes)), gt_classes] = 1 # B x (C + 1) - target = target[:, :C] # B x C - - weight = 1 - - cls_loss = F.binary_cross_entropy_with_logits( - pred_class_logits[:, :-1], target, reduction='none') # B x C - loss = torch.sum(cls_loss * weight) / B - return loss - - - def softmax_cross_entropy_loss(self, pred_class_logits, gt_classes): - """ - change _no_instance handling - """ - if pred_class_logits.numel() == 0: - return pred_class_logits.new_zeros([1])[0] - - loss = F.cross_entropy( - pred_class_logits, gt_classes, reduction="mean") - return loss - - - def inference(self, predictions, proposals): - """ - enable use proposal boxes - """ - boxes = self.predict_boxes(predictions, proposals) - scores = self.predict_probs(predictions, proposals) - if self.cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE: - proposal_scores = [p.get('objectness_logits') for p in proposals] - scores = [(s * ps[:, None]) ** 0.5 \ - for s, ps in zip(scores, proposal_scores)] - image_shapes = [x.image_size for x in proposals] - return fast_rcnn_inference( - boxes, - scores, - image_shapes, - self.test_score_thresh, - self.test_nms_thresh, - self.test_topk_per_image, - ) - - - def predict_probs(self, predictions, proposals): - """ - support sigmoid - """ - scores, _ = predictions - num_inst_per_image = [len(p) for p in proposals] - probs = F.softmax(scores, dim=-1) - return probs.split(num_inst_per_image, dim=0) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py deleted file mode 100755 index 90fadf1a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_roi_heads.py +++ /dev/null @@ -1,185 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
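[Editor's note] When `MULT_PROPOSAL_SCORE` is enabled, the `inference` method above fuses the second-stage classification score with the first-stage objectness via a geometric mean, `(s * ps) ** 0.5`, which stays in [0, 1] and lets either stage veto a detection. In isolation (hypothetical values):

```python
import torch

cls_scores = torch.tensor([[0.9, 0.1], [0.4, 0.6]])  # per-proposal class probs
objectness = torch.tensor([0.8, 0.05])               # first-stage proposal scores

fused = (cls_scores * objectness[:, None]) ** 0.5    # geometric mean
print(fused)
# The second proposal has a confident class score (0.6) but near-zero
# objectness, so its fused score collapses to sqrt(0.6 * 0.05) ~= 0.17.
```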
All Rights Reserved -import numpy as np -import json -import math -import torch -from torch import nn -from torch.autograd.function import Function -from typing import Dict, List, Optional, Tuple, Union - -from detectron2.layers import ShapeSpec -from detectron2.structures import Boxes, Instances, pairwise_iou -from detectron2.utils.events import get_event_storage - -from detectron2.modeling.box_regression import Box2BoxTransform -from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference -from detectron2.modeling.roi_heads.roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads -from detectron2.modeling.roi_heads.cascade_rcnn import CascadeROIHeads -from detectron2.modeling.roi_heads.box_head import build_box_head -from .custom_fast_rcnn import CustomFastRCNNOutputLayers - - -@ROI_HEADS_REGISTRY.register() -class CustomROIHeads(StandardROIHeads): - @classmethod - def _init_box_head(self, cfg, input_shape): - ret = super()._init_box_head(cfg, input_shape) - del ret['box_predictor'] - ret['box_predictor'] = CustomFastRCNNOutputLayers( - cfg, ret['box_head'].output_shape) - self.debug = cfg.DEBUG - if self.debug: - self.debug_show_name = cfg.DEBUG_SHOW_NAME - self.save_debug = cfg.SAVE_DEBUG - self.vis_thresh = cfg.VIS_THRESH - self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - return ret - - def forward(self, images, features, proposals, targets=None): - """ - enable debug - """ - if not self.debug: - del images - if self.training: - assert targets - proposals = self.label_and_sample_proposals(proposals, targets) - del targets - - if self.training: - losses = self._forward_box(features, proposals) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - if self.debug: - from ..debug import debug_second_stage - denormalizer = lambda x: x * self.pixel_std + self.pixel_mean - debug_second_stage( - [denormalizer(images[0].clone())], - pred_instances, proposals=proposals, - debug_show_name=self.debug_show_name) - return pred_instances, {} - - -@ROI_HEADS_REGISTRY.register() -class CustomCascadeROIHeads(CascadeROIHeads): - @classmethod - def _init_box_head(self, cfg, input_shape): - self.mult_proposal_score = cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE - ret = super()._init_box_head(cfg, input_shape) - del ret['box_predictors'] - cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS - box_predictors = [] - for box_head, bbox_reg_weights in zip(ret['box_heads'], cascade_bbox_reg_weights): - box_predictors.append( - CustomFastRCNNOutputLayers( - cfg, box_head.output_shape, - box2box_transform=Box2BoxTransform(weights=bbox_reg_weights) - )) - ret['box_predictors'] = box_predictors - self.debug = cfg.DEBUG - if self.debug: - self.debug_show_name = cfg.DEBUG_SHOW_NAME - self.save_debug = cfg.SAVE_DEBUG - self.vis_thresh = cfg.VIS_THRESH - self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( - torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) - return ret - - - def _forward_box(self, features, proposals, targets=None): - """ - Add mult proposal scores at testing - """ - if (not 
self.training) and self.mult_proposal_score: - if len(proposals) > 0 and proposals[0].has('scores'): - proposal_scores = [ - p.get('scores') for p in proposals] - else: - proposal_scores = [ - p.get('objectness_logits') for p in proposals] - - features = [features[f] for f in self.box_in_features] - head_outputs = [] # (predictor, predictions, proposals) - prev_pred_boxes = None - image_sizes = [x.image_size for x in proposals] - for k in range(self.num_cascade_stages): - if k > 0: - proposals = self._create_proposals_from_boxes(prev_pred_boxes, image_sizes) - if self.training: - proposals = self._match_and_label_boxes(proposals, k, targets) - predictions = self._run_stage(features, proposals, k) - prev_pred_boxes = self.box_predictor[k].predict_boxes(predictions, proposals) - head_outputs.append((self.box_predictor[k], predictions, proposals)) - - if self.training: - losses = {} - storage = get_event_storage() - for stage, (predictor, predictions, proposals) in enumerate(head_outputs): - with storage.name_scope("stage{}".format(stage)): - stage_losses = predictor.losses(predictions, proposals) - losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()}) - return losses - else: - # Each is a list[Tensor] of length #image. Each tensor is Ri x (K+1) - scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs] - scores = [ - sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages) - for scores_per_image in zip(*scores_per_stage) - ] - - if self.mult_proposal_score: - scores = [(s * ps[:, None]) ** 0.5 \ - for s, ps in zip(scores, proposal_scores)] - - predictor, predictions, proposals = head_outputs[-1] - boxes = predictor.predict_boxes(predictions, proposals) - pred_instances, _ = fast_rcnn_inference( - boxes, - scores, - image_sizes, - predictor.test_score_thresh, - predictor.test_nms_thresh, - predictor.test_topk_per_image, - ) - - return pred_instances - - def forward(self, images, features, proposals, targets=None): - ''' - enable debug - ''' - if not self.debug: - del images - if self.training: - proposals = self.label_and_sample_proposals(proposals, targets) - - if self.training: - losses = self._forward_box(features, proposals, targets) - losses.update(self._forward_mask(features, proposals)) - losses.update(self._forward_keypoint(features, proposals)) - return proposals, losses - else: - # import pdb; pdb.set_trace() - pred_instances = self._forward_box(features, proposals) - pred_instances = self.forward_with_given_boxes(features, pred_instances) - if self.debug: - from ..debug import debug_second_stage - denormalizer = lambda x: x * self.pixel_std + self.pixel_mean - debug_second_stage( - [denormalizer(x.clone()) for x in images], - pred_instances, proposals=proposals, - save_debug=self.save_debug, - debug_show_name=self.debug_show_name, - vis_thresh=self.vis_thresh) - return pred_instances, {} - - diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py deleted file mode 100755 index 290f0f07..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet/modeling/roi_heads/fed_loss.py +++ /dev/null @@ -1,31 +0,0 @@ -import torch -import json -import numpy as np -from torch.nn import functional as F - -def load_class_freq( - path='datasets/lvis/lvis_v1_train_cat_info.json', - freq_weight=0.5): - cat_info = json.load(open(path, 'r')) - cat_info = 
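[Editor's note] At test time the cascade head above averages the per-stage class probabilities rather than keeping only the last stage, the standard Cascade R-CNN ensembling step (`sum(...) * (1.0 / num_cascade_stages)`). Schematically, with hypothetical tensors:

```python
import torch

num_stages = 3
# One (num_proposals x num_classes+1) probability tensor per cascade stage.
scores_per_stage = [torch.softmax(torch.randn(5, 81), dim=-1)
                    for _ in range(num_stages)]

avg_scores = sum(scores_per_stage) * (1.0 / num_stages)
print(avg_scores.shape)         # torch.Size([5, 81])
print(avg_scores.sum(dim=-1))   # an average of distributions still sums to 1
```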
torch.tensor(
-        [c['image_count'] for c in sorted(cat_info, key=lambda x: x['id'])])
-    freq_weight = cat_info.float() ** freq_weight
-    return freq_weight
-
-def get_fed_loss_inds(
-    gt_classes, num_sample_cats=50, C=1203, \
-    weight=None, fed_cls_inds=-1):
-    appeared = torch.unique(gt_classes) # C'
-    prob = appeared.new_ones(C + 1).float()
-    prob[-1] = 0
-    if len(appeared) < num_sample_cats:
-        if weight is not None:
-            prob[:C] = weight.float().clone()
-        prob[appeared] = 0
-        if fed_cls_inds > 0:
-            prob[fed_cls_inds:] = 0
-        more_appeared = torch.multinomial(
-            prob, num_sample_cats - len(appeared),
-            replacement=False)
-        appeared = torch.cat([appeared, more_appeared])
-    return appeared
\ No newline at end of file
diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
deleted file mode 100755
index 7a2a92b6..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/centernet2_docs/MODEL_ZOO.md
+++ /dev/null
@@ -1,73 +0,0 @@
-# MODEL_ZOO
-
-### Common settings and notes
-
-- Multiscale training is used by default in all models. All results are reported with single-scale testing.
-- We report runtime on our local workstation with a Titan Xp GPU and a Titan RTX GPU.
-- All models are trained on 8-GPU servers by default. The 1280 models are trained on 24G GPUs. Reducing the batch size with the linear learning rate rule should be fine.
-- All models can be downloaded directly from [Google drive](https://drive.google.com/drive/folders/1eae1cTX8tvIaCeof36sBgxrXEXALYlf-?usp=sharing).
-
-
-## COCO
-
-### CenterNet
-
-| Model | val mAP | FPS (Titan Xp / Titan RTX) | links |
-|-------------------------------------------|---------|---------|-----------|
-| CenterNet-S4_DLA_8x | 42.5 | 50 / 71 |[config](../configs/CenterNet-S4_DLA_8x.yaml)/[model](https://drive.google.com/file/d/1lNBhVHnZAEBRD66MFaHjm5Ij6Z4KYrJq/view?usp=sharing)|
-| CenterNet-FPN_R50_1x | 40.2 | 20 / 24 |[config](../configs/CenterNet-FPN_R50_1x.yaml)/[model](https://drive.google.com/file/d/1rVG1YTthMXvutC6jr9KoE2DthT5-jhGj/view?usp=sharing)|
-
-#### Note
-
-- `CenterNet-S4_DLA_8x` is a re-implemented version of the original CenterNet (stride 4) with several changes, including
-  - Using top-left-right-bottom box encoding and GIoU loss; adding regression loss to the center 3x3 region.
-  - Adding more positive pixels for the heatmap loss whose regression loss is small and within the center 3x3 region.
-  - Using heavier crop augmentation (EfficientDet-style crop ratio 0.1-2), and removing color augmentations.
-  - Using standard NMS instead of max pooling.
-  - Using a RetinaNet-style optimizer (SGD), learning rate rule (0.01 for each batch size of 16), and schedule (8x12 epochs).
-- `CenterNet-FPN_R50_1x` is a (new) FPN version of CenterNet. It includes the changes above, and assigns objects to FPN levels based on a fixed size range. The model is trained with standard short-edge 640-800 multi-scale training for 12 epochs (1x).
-
-
-### CenterNet2
-
-| Model | val mAP | FPS (Titan Xp / Titan RTX) | links |
-|-------------------------------------------|---------|---------|-----------|
-| CenterNet2-F_R50_1x | 41.7 | 22 / 27 |[config](../configs/CenterNet2-F_R50_1x.yaml)/[model](X)|
-| CenterNet2_R50_1x | 42.9 | 18 / 24 |[config](../configs/CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1Osu1J_sskt_1FaGdfJKa4vd2N71TWS9W/view?usp=sharing)|
-| CenterNet2_X101-DCN_2x | 49.9 | 6 / 8 |[config](../configs/CenterNet2_X101-DCN_2x.yaml)/[model](https://drive.google.com/file/d/1IHgpUHVJWpvMuFUUetgKWsw27pRNN2oK/view?usp=sharing)|
-| CenterNet2_DLA-BiFPN-P3_4x | 43.8 | 40 / 50 |[config](../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml)/[model](https://drive.google.com/file/d/12GUNlDW9RmOs40UEMSiiUsk5QK_lpGsE/view?usp=sharing)|
-| CenterNet2_DLA-BiFPN-P3_24x | 45.6 | 40 / 50 |[config](../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml)/[model](https://drive.google.com/file/d/15ZES1ySxubDPzKsHPA7pYg8o_Vwmf-Mb/view?usp=sharing)|
-| CenterNet2_R2-101-DCN_896_4x | 51.2 | 9 / 13 |[config](../configs/CenterNet2_R2-101-DCN_896_4x.yaml)/[model](https://drive.google.com/file/d/1S7_GE8ZDQBWuLEfKHkxzeF3KBsxsbABg/view?usp=sharing)|
-| CenterNet2_R2-101-DCN-BiFPN_1280_4x | 52.9 | 6 / 8 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml)/[model](https://drive.google.com/file/d/14EBHNMagBCNTQjOXcHoZwLYIi2lFIm7F/view?usp=sharing)|
-| CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST | 56.1 | 3 / 5 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml)/[model](https://drive.google.com/file/d/11ww9VlOi_nhpdsU_vBAecSxBU0dR_JzW/view?usp=sharing)|
-| CenterNet2_DLA-BiFPN-P5_640_24x_ST | 49.2 | 33 / 38 |[config](../configs/CenterNet2_DLA-BiFPN-P5_640_24x_ST.yaml)/[model](https://drive.google.com/file/d/1qsHp2HrM1u8WrtBzF5S0oCoLMz-B40wk/view?usp=sharing)|
-
-#### Note
-
-- `CenterNet2-F_R50_1x` uses Faster RCNN as the second stage. All other CenterNet2 models use Cascade RCNN as the second stage.
-- `CenterNet2_DLA-BiFPN-P3_4x` follows the same training setting as [realtime-FCOS](https://github.com/aim-uofa/AdelaiDet/blob/master/configs/FCOS-Detection/README.md).
-- `CenterNet2_DLA-BiFPN-P3_24x` is trained by repeating the `4x` schedule (starting from learning rate 0.01) 6 times.
-- R2 means the [Res2Net](https://github.com/Res2Net/Res2Net-detectron2) backbone. To train Res2Net models, you need to download the ImageNet pre-trained weights [here](https://github.com/Res2Net/Res2Net-detectron2) and place them in `output/r2_101.pkl`.
-- The last 4 models in the table are trained with the EfficientDet-style resize-and-crop augmentation, instead of the default random short-edge resizing in detectron2. We found this trains faster (per iteration) and gives better performance under a long schedule.
-- `_ST` means [self-training](https://arxiv.org/abs/2006.06882) with pseudo-labels produced by [Scaled-YOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4) on COCO unlabeled images, with a hard score threshold of 0.5. Our processed pseudo-labels can be downloaded [here](https://drive.google.com/file/d/1LMBjtHhLp6dYf6MjwEQmzCLWQLkmWPpw/view?usp=sharing).
-- `CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST` finetunes from `CenterNet2_R2-101-DCN-BiFPN_1280_4x` for an additional `4x` schedule with the self-training data. It is trained at `1280x1280` but tested at `1560x1560`.
- -## LVIS v1 - -| Model | val mAP box | links | -|-------------------------------------------|--------------|-----------| -| LVIS_CenterNet2_R50_1x | 26.5 |[config](../configs/LVIS_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1gT9e-tNw8uzEBaCadQuoOOP2TEYa4kKP/view?usp=sharing)| -| LVIS_CenterNet2_R50_Fed_1x | 28.3 |[config](../configs/LVIS_CenterNet2_R50_Fed_1x.yaml)/[model](https://drive.google.com/file/d/1a9UjheMCKax0qAKEwPVpq2ZHN6vpqJv8/view?usp=sharing)| - -- The models are trained with repeat-factor sampling. -- `LVIS_CenterNet2_R50_Fed_1x` is CenterNet2 with our federated loss. Check our Appendix D of our [paper](https://arxiv.org/abs/2103.07461) or our [technical report at LVIS challenge](https://www.lvisdataset.org/assets/challenge_reports/2020/CenterNet2.pdf) for references. - -## Objects365 - -| Model | val mAP| links | -|-------------------------------------------|---------|-----------| -| O365_CenterNet2_R50_1x | 22.6 |[config](../configs/O365_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/18fG6xGchAlpNp5sx8RAtwadGkS-gdIBU/view?usp=sharing)| - -#### Note -- Objects365 dataset can be downloaded [here](https://www.objects365.org/overview.html). -- The model is trained with class-aware sampling. diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml deleted file mode 100755 index bef3dc10..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet-FPN.yaml +++ /dev/null @@ -1,28 +0,0 @@ -MODEL: - META_ARCHITECTURE: "CenterNetDetector" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - BACKBONE: - NAME: "build_p67_resnet_fpn_backbone" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 - OUT_FEATURES: ["res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.01 - STEPS: (60000, 80000) - MAX_ITER: 90000 - CHECKPOINT_PERIOD: 1000000000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml deleted file mode 100755 index 68937231..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base-CenterNet2.yaml +++ /dev/null @@ -1,56 +0,0 @@ -MODEL: - META_ARCHITECTURE: "GeneralizedRCNN" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - BACKBONE: - NAME: "build_p67_resnet_fpn_backbone" - WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" - RESNETS: - DEPTH: 50 - OUT_FEATURES: ["res3", "res4", "res5"] - FPN: - IN_FEATURES: ["res3", "res4", "res5"] - ROI_HEADS: - NAME: CustomCascadeROIHeads - IN_FEATURES: ["p3", "p4", "p5", "p6", "p7"] - IOU_THRESHOLDS: [0.6] - NMS_THRESH_TEST: 0.7 - ROI_BOX_CASCADE_HEAD: - IOUS: [0.6, 0.7, 0.8] - ROI_BOX_HEAD: - NAME: "FastRCNNConvFCHead" - NUM_FC: 2 - POOLER_RESOLUTION: 7 - CLS_AGNOSTIC_BBOX_REG: True - MULT_PROPOSAL_SCORE: True - CENTERNET: - REG_WEIGHT: 1. 
- NOT_NORM_REG: True - ONLY_PROPOSAL: True - WITH_AGN_HM: True - INFERENCE_TH: 0.0001 - PRE_NMS_TOPK_TRAIN: 4000 - POST_NMS_TOPK_TRAIN: 2000 - PRE_NMS_TOPK_TEST: 1000 - POST_NMS_TOPK_TEST: 256 - NMS_TH_TRAIN: 0.9 - NMS_TH_TEST: 0.9 - POS_WEIGHT: 0.5 - NEG_WEIGHT: 0.5 - IGNORE_HIGH_FP: 0.85 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (60000, 80000) - MAX_ITER: 90000 - CHECKPOINT_PERIOD: 1000000000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) -OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml deleted file mode 100755 index 7e01be7e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/Base_S4_DLA.yaml +++ /dev/null @@ -1,40 +0,0 @@ -MODEL: - META_ARCHITECTURE: "CenterNetDetector" - PROPOSAL_GENERATOR: - NAME: "CenterNet" - PIXEL_STD: [57.375, 57.120, 58.395] - BACKBONE: - NAME: "build_dla_backbone" - DLA: - NORM: "BN" - CENTERNET: - IN_FEATURES: ["dla2"] - FPN_STRIDES: [4] - SOI: [[0, 1000000]] - NUM_CLS_CONVS: 1 - NUM_BOX_CONVS: 1 - REG_WEIGHT: 1. - MORE_POS: True - HM_FOCAL_ALPHA: 0.25 -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 90000 - BASE_LR: 0.04 - IMS_PER_BATCH: 64 - WEIGHT_DECAY: 0.0001 - CHECKPOINT_PERIOD: 1000000 - CLIP_GRADIENTS: - ENABLED: True -INPUT: - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -TEST: - EVAL_PERIOD: 7500 -DATALOADER: - NUM_WORKERS: 8 -OUTPUT_DIR: "output/CenterNet2/auto" diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml deleted file mode 100755 index 6ea7d9b7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-FPN_R50_1x.yaml +++ /dev/null @@ -1,4 +0,0 @@ -_BASE_: "Base-CenterNet-FPN.yaml" -MODEL: - CENTERNET: - MORE_POS: True \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml deleted file mode 100755 index b3d88be9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet-S4_DLA_8x.yaml +++ /dev/null @@ -1,5 +0,0 @@ -_BASE_: "Base_S4_DLA.yaml" -SOLVER: - MAX_ITER: 90000 - BASE_LR: 0.08 - IMS_PER_BATCH: 128 \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml deleted file mode 100755 index c40eecc1..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2-F_R50_1x.yaml +++ /dev/null @@ -1,4 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NAME: CustomROIHeads \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml 
b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml deleted file mode 100755 index d7491447..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p35_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 3 - NUM_BIFPN: 4 - DLA: - NUM_LAYERS: 34 - NORM: "SyncBN" - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] - ROI_HEADS: - IN_FEATURES: ["p3", "p4", "p5"] - CENTERNET: - POST_NMS_TOPK_TEST: 128 - FPN_STRIDES: [8, 16, 32] - IN_FEATURES: ['p3', 'p4', 'p5'] - SOI: [[0, 64], [48, 192], [128, 1000000]] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (300000, 340000) - MAX_ITER: 360000 - CHECKPOINT_PERIOD: 100000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 -INPUT: - MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) - MAX_SIZE_TRAIN: 900 - MAX_SIZE_TEST: 736 - MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml deleted file mode 100755 index d7491447..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p35_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 3 - NUM_BIFPN: 4 - DLA: - NUM_LAYERS: 34 - NORM: "SyncBN" - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] - ROI_HEADS: - IN_FEATURES: ["p3", "p4", "p5"] - CENTERNET: - POST_NMS_TOPK_TEST: 128 - FPN_STRIDES: [8, 16, 32] - IN_FEATURES: ['p3', 'p4', 'p5'] - SOI: [[0, 64], [48, 192], [128, 1000000]] -DATASETS: - TRAIN: ("coco_2017_train",) - TEST: ("coco_2017_val",) -SOLVER: - IMS_PER_BATCH: 16 - BASE_LR: 0.02 - STEPS: (300000, 340000) - MAX_ITER: 360000 - CHECKPOINT_PERIOD: 100000 - WARMUP_ITERS: 4000 - WARMUP_FACTOR: 0.00025 -INPUT: - MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) - MAX_SIZE_TRAIN: 900 - MAX_SIZE_TEST: 736 - MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml deleted file mode 100755 index 80413a62..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml +++ /dev/null @@ -1,29 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 - CHECKPOINT_PERIOD: 90000 -TEST: - EVAL_PERIOD: 7500 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml 
b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml deleted file mode 100755 index 8813b39c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml +++ /dev/null @@ -1,30 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -TEST: - EVAL_PERIOD: 7500 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -DATASETS: - TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml deleted file mode 100755 index f94f1358..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml +++ /dev/null @@ -1,30 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p37_fcos_dla_bifpn_backbone" - BIFPN: - OUT_CHANNELS: 160 - NUM_LEVELS: 5 - NUM_BIFPN: 3 - CENTERNET: - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - FPN: - IN_FEATURES: ["dla3", "dla4", "dla5"] -TEST: - EVAL_PERIOD: 7500 -SOLVER: - LR_SCHEDULER_NAME: "WarmupCosineLR" - MAX_ITER: 360000 - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 -DATASETS: - TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml deleted file mode 100755 index e07574b3..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml +++ /dev/null @@ -1,32 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_res2net_bifpn_backbone" - BIFPN: - NUM_BIFPN: 7 - OUT_CHANNELS: 288 - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -INPUT: - FORMAT: RGB -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 60000 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -INPUT: - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 1280 diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml deleted file mode 100755 index 81fcab09..00000000 --- 
a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml +++ /dev/null @@ -1,36 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_res2net_bifpn_backbone" - BIFPN: - NUM_BIFPN: 7 - OUT_CHANNELS: 288 - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 7500 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -DATASETS: - TRAIN: "('coco_2017_train', 'coco_un_yolov4_55_0.5')" -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 1280 - TEST_SIZE: 1560 - TEST_INPUT_TYPE: 'square' - \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml deleted file mode 100755 index fd6c49ee..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml +++ /dev/null @@ -1,29 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - BACKBONE: - NAME: "build_p67_res2net_fpn_backbone" - WEIGHTS: "output/r2_101.pkl" - RESNETS: - DEPTH: 101 - WIDTH_PER_GROUP: 26 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] - CENTERNET: - USE_DEFORMABLE: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -INPUT: - FORMAT: RGB -TEST: - EVAL_PERIOD: 7500 -SOLVER: - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 600000 - LR_SCHEDULER_NAME: "WarmupCosineLR" - BASE_LR: 0.04 - IMS_PER_BATCH: 32 -INPUT: - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 896 \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml deleted file mode 100755 index 9dcdf5b8..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_R50_1x.yaml +++ /dev/null @@ -1 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml deleted file mode 100755 index 009c6808..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/CenterNet2_X101-DCN_2x.yaml +++ /dev/null @@ -1,22 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - CENTERNET: - USE_DEFORMABLE: True - WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" - PIXEL_STD: [57.375, 57.120, 58.395] - RESNETS: - STRIDE_IN_1X1: False - NUM_GROUPS: 32 - WIDTH_PER_GROUP: 8 - DEPTH: 101 - DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 - DEFORM_MODULATED: True - ROI_HEADS: - IN_FEATURES: ["p3", "p4"] -SOLVER: - STEPS: (120000, 160000) - MAX_ITER: 180000 - CHECKPOINT_PERIOD: 40000 -INPUT: - MIN_SIZE_TRAIN: (480, 960) - MIN_SIZE_TRAIN_SAMPLING: "range" diff --git 
a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml deleted file mode 100755 index 912e8925..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_1x.yaml +++ /dev/null @@ -1,17 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.02 - NMS_THRESH_TEST: 0.5 - CENTERNET: - NUM_CLASSES: 1203 - -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 -TEST: - DETECTIONS_PER_IMAGE: 300 diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml deleted file mode 100755 index d6b6c823..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml +++ /dev/null @@ -1,19 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 1203 - SCORE_THRESH_TEST: 0.02 - NMS_THRESH_TEST: 0.5 - CENTERNET: - NUM_CLASSES: 1203 - ROI_BOX_HEAD: - USE_SIGMOID_CE: True - USE_FED_LOSS: True -DATASETS: - TRAIN: ("lvis_v1_train",) - TEST: ("lvis_v1_val",) -DATALOADER: - SAMPLER_TRAIN: "RepeatFactorTrainingSampler" - REPEAT_THRESHOLD: 0.001 -TEST: - DETECTIONS_PER_IMAGE: 300 diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml deleted file mode 100755 index 514e52cd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/O365_CenterNet2_R50_1x.yaml +++ /dev/null @@ -1,13 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - ROI_HEADS: - NUM_CLASSES: 365 - CENTERNET: - NUM_CLASSES: 365 -DATASETS: - TRAIN: ("objects365_train",) - TEST: ("objects365_val",) -DATALOADER: - SAMPLER_TRAIN: "ClassAwareSampler" -TEST: - DETECTIONS_PER_IMAGE: 300 \ No newline at end of file diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml deleted file mode 100755 index c400e92c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml +++ /dev/null @@ -1,42 +0,0 @@ -_BASE_: "Base-CenterNet2.yaml" -MODEL: - MASK_ON: True - ROI_MASK_HEAD: - NAME: "MaskRCNNConvUpsampleHead" - NUM_CONV: 4 - POOLER_RESOLUTION: 14 - ROI_HEADS: - NUM_CLASSES: 10 - IN_FEATURES: ["dla2"] - BACKBONE: - NAME: "build_dla_backbone" - DLA: - NORM: "BN" - CENTERNET: - IN_FEATURES: ["dla2"] - FPN_STRIDES: [4] - SOI: [[0, 1000000]] - NUM_CLS_CONVS: 1 - NUM_BOX_CONVS: 1 - REG_WEIGHT: 1. 
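[Editor's note] The solver settings across these configs follow the linear learning-rate scaling rule stated in the model zoo notes ("0.01 for each batch size of 16"): `BASE_LR` grows proportionally with `IMS_PER_BATCH`. A helper sketch of the rule (the function is hypothetical, not part of the configs; the two-stage CenterNet2 configs appear to start from a 2x higher per-16 base):

```python
def scaled_base_lr(ims_per_batch, lr_per_16, reference_batch=16):
    """Linear LR scaling: BASE_LR grows proportionally with IMS_PER_BATCH."""
    return lr_per_16 * ims_per_batch / reference_batch

# One-stage CenterNet configs use 0.01 per 16 images:
print(scaled_base_lr(64, 0.01))   # 0.04 (Base_S4_DLA)
print(scaled_base_lr(128, 0.01))  # 0.08 (CenterNet-S4_DLA_8x)
# The two-stage CenterNet2 configs start from 0.02 per 16 images:
print(scaled_base_lr(32, 0.02))   # 0.04 (the R2-101 configs)
```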
- MORE_POS: True - HM_FOCAL_ALPHA: 0.25 - POST_NMS_TOPK_TEST: 128 - WEIGHTS: '' - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.12, 57.375] -SOLVER: - MAX_ITER: 180000 - STEPS: (120000, 160000) - BASE_LR: 0.08 - IMS_PER_BATCH: 64 -INPUT: - FORMAT: RGB - CUSTOM_AUG: EfficientDetResizeCrop - TRAIN_SIZE: 640 - MIN_SIZE_TEST: 608 - MAX_SIZE_TEST: 900 - MASK_FORMAT: bitmask -DATASETS: - TRAIN: ("nuimages_train",) - TEST: ("nuimages_val",) diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py deleted file mode 100755 index 5213faf4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/demo.py +++ /dev/null @@ -1,185 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved -import argparse -import glob -import multiprocessing as mp -import os -import time -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data.detection_utils import read_image -from detectron2.utils.logger import setup_logger - -from predictor import VisualizationDemo -from centernet.config import add_centernet_config -# constants -WINDOW_NAME = "CenterNet2 detections" - -from detectron2.utils.video_visualizer import VideoVisualizer -from detectron2.utils.visualizer import ColorMode, Visualizer -from detectron2.data import MetadataCatalog - -def setup_cfg(args): - # load config from file and command-line arguments - cfg = get_cfg() - add_centernet_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - # Set score_threshold for builtin models - cfg.MODEL.RETINANET.SCORE_THRESH_TEST = args.confidence_threshold - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = args.confidence_threshold - if cfg.MODEL.META_ARCHITECTURE in ['ProposalNetwork', 'CenterNetDetector']: - cfg.MODEL.CENTERNET.INFERENCE_TH = args.confidence_threshold - cfg.MODEL.CENTERNET.NMS_TH = cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST - cfg.MODEL.PANOPTIC_FPN.COMBINE.INSTANCES_CONFIDENCE_THRESH = args.confidence_threshold - cfg.freeze() - return cfg - - -def get_parser(): - parser = argparse.ArgumentParser(description="Detectron2 demo for builtin models") - parser.add_argument( - "--config-file", - default="configs/quick_schedules/mask_rcnn_R_50_FPN_inference_acc_test.yaml", - metavar="FILE", - help="path to config file", - ) - parser.add_argument("--webcam", action="store_true", help="Take inputs from webcam.") - parser.add_argument("--video-input", help="Path to video file.") - parser.add_argument("--input", nargs="+", help="A list of space separated input images") - parser.add_argument( - "--output", - help="A file or directory to save output visualizations. 
" - "If not given, will show output in an OpenCV window.", - ) - - parser.add_argument( - "--confidence-threshold", - type=float, - default=0.3, - help="Minimum score for instance predictions to be shown", - ) - parser.add_argument( - "--opts", - help="Modify config options using the command-line 'KEY VALUE' pairs", - default=[], - nargs=argparse.REMAINDER, - ) - return parser - - -if __name__ == "__main__": - mp.set_start_method("spawn", force=True) - args = get_parser().parse_args() - logger = setup_logger() - logger.info("Arguments: " + str(args)) - - cfg = setup_cfg(args) - - demo = VisualizationDemo(cfg) - output_file = None - if args.input: - if len(args.input) == 1: - args.input = glob.glob(os.path.expanduser(args.input[0])) - files = os.listdir(args.input[0]) - args.input = [args.input[0] + x for x in files] - assert args.input, "The input path(s) was not found" - visualizer = VideoVisualizer( - MetadataCatalog.get( - cfg.DATASETS.TEST[0] if len(cfg.DATASETS.TEST) else "__unused" - ), - instance_mode=ColorMode.IMAGE) - for path in tqdm.tqdm(args.input, disable=not args.output): - # use PIL, to be consistent with evaluation - img = read_image(path, format="BGR") - start_time = time.time() - predictions, visualized_output = demo.run_on_image( - img, visualizer=visualizer) - if 'instances' in predictions: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["instances"]), time.time() - start_time - ) - ) - else: - logger.info( - "{}: detected {} instances in {:.2f}s".format( - path, len(predictions["proposals"]), time.time() - start_time - ) - ) - - if args.output: - if os.path.isdir(args.output): - assert os.path.isdir(args.output), args.output - out_filename = os.path.join(args.output, os.path.basename(path)) - visualized_output.save(out_filename) - else: - # assert len(args.input) == 1, "Please specify a directory with args.output" - # out_filename = args.output - if output_file is None: - width = visualized_output.get_image().shape[1] - height = visualized_output.get_image().shape[0] - frames_per_second = 15 - output_file = cv2.VideoWriter( - filename=args.output, - # some installation of opencv may not support x264 (due to its license), - # you can try other format (e.g. MPEG) - fourcc=cv2.VideoWriter_fourcc(*"x264"), - fps=float(frames_per_second), - frameSize=(width, height), - isColor=True, - ) - output_file.write(visualized_output.get_image()[:, :, ::-1]) - else: - # cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL) - cv2.imshow(WINDOW_NAME, visualized_output.get_image()[:, :, ::-1]) - if cv2.waitKey(1 ) == 27: - break # esc to quit - elif args.webcam: - assert args.input is None, "Cannot have both --input and --webcam!" 
-        cam = cv2.VideoCapture(0)
-        for vis in tqdm.tqdm(demo.run_on_video(cam)):
-            cv2.namedWindow(WINDOW_NAME, cv2.WINDOW_NORMAL)
-            cv2.imshow(WINDOW_NAME, vis)
-            if cv2.waitKey(1) == 27:
-                break  # esc to quit
-        cv2.destroyAllWindows()
-    elif args.video_input:
-        video = cv2.VideoCapture(args.video_input)
-        width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH))
-        height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT))
-        frames_per_second = 15 # video.get(cv2.CAP_PROP_FPS)
-        num_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT))
-        basename = os.path.basename(args.video_input)
-
-        if args.output:
-            if os.path.isdir(args.output):
-                output_fname = os.path.join(args.output, basename)
-                output_fname = os.path.splitext(output_fname)[0] + ".mkv"
-            else:
-                output_fname = args.output
-            # assert not os.path.isfile(output_fname), output_fname
-            output_file = cv2.VideoWriter(
-                filename=output_fname,
-                # some installations of opencv may not support x264 (due to its license),
-                # you can try other formats (e.g. MPEG)
-                fourcc=cv2.VideoWriter_fourcc(*"x264"),
-                fps=float(frames_per_second),
-                frameSize=(width, height),
-                isColor=True,
-            )
-        assert os.path.isfile(args.video_input)
-        for vis_frame in tqdm.tqdm(demo.run_on_video(video), total=num_frames):
-            if args.output:
-                output_file.write(vis_frame)
-
-            cv2.namedWindow(basename, cv2.WINDOW_NORMAL)
-            cv2.imshow(basename, vis_frame)
-            if cv2.waitKey(1) == 27:
-                break  # esc to quit
-        video.release()
-        if args.output:
-            output_file.release()
-        else:
-            cv2.destroyAllWindows()
diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py
deleted file mode 100755
index 8a036bde..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/predictor.py
+++ /dev/null
@@ -1,243 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-import atexit
-import bisect
-import multiprocessing as mp
-from collections import deque
-import cv2
-import torch
-
-from detectron2.data import MetadataCatalog
-from detectron2.engine.defaults import DefaultPredictor
-from detectron2.utils.video_visualizer import VideoVisualizer
-from detectron2.utils.visualizer import ColorMode, Visualizer
-
-
-class VisualizationDemo(object):
-    def __init__(self, cfg, instance_mode=ColorMode.IMAGE, parallel=False):
-        """
-        Args:
-            cfg (CfgNode):
-            instance_mode (ColorMode):
-            parallel (bool): whether to run the model in different processes from visualization.
-                Useful since the visualization logic can be slow.
-        """
-        self.metadata = MetadataCatalog.get(
-            cfg.DATASETS.TRAIN[0] if len(cfg.DATASETS.TRAIN) else "__unused"
-        )
-        self.cpu_device = torch.device("cpu")
-        self.instance_mode = instance_mode
-
-        self.parallel = parallel
-        if parallel:
-            num_gpu = torch.cuda.device_count()
-            self.predictor = AsyncPredictor(cfg, num_gpus=num_gpu)
-        else:
-            self.predictor = DefaultPredictor(cfg)
-
-    def run_on_image(self, image, visualizer=None):
-        """
-        Args:
-            image (np.ndarray): an image of shape (H, W, C) (in BGR order).
-                This is the format used by OpenCV.
-
-        Returns:
-            predictions (dict): the output of the model.
-            vis_output (VisImage): the visualized image output.
-        """
-        vis_output = None
-        predictions = self.predictor(image)
-        # Convert image from OpenCV BGR format to Matplotlib RGB format.
- image = image[:, :, ::-1] - use_video_vis = True - if visualizer is None: - use_video_vis = False - visualizer = Visualizer(image, self.metadata, instance_mode=self.instance_mode) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_output = visualizer.draw_panoptic_seg_predictions( - panoptic_seg.to(self.cpu_device), segments_info - ) - else: - if "sem_seg" in predictions: - vis_output = visualizer.draw_sem_seg( - predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - if "instances" in predictions: - instances = predictions["instances"].to(self.cpu_device) - if use_video_vis: - vis_output = visualizer.draw_instance_predictions( - image, predictions=instances) - else: - vis_output = visualizer.draw_instance_predictions(predictions=instances) - elif "proposals" in predictions: - instances = predictions["proposals"].to(self.cpu_device) - instances.pred_boxes = instances.proposal_boxes - instances.scores = instances.objectness_logits - instances.pred_classes[:] = -1 - if use_video_vis: - vis_output = visualizer.draw_instance_predictions( - image, predictions=instances) - else: - vis_output = visualizer.draw_instance_predictions(predictions=instances) - - return predictions, vis_output - - def _frame_from_video(self, video): - while video.isOpened(): - success, frame = video.read() - if success: - yield frame - else: - break - - def run_on_video(self, video): - """ - Visualizes predictions on frames of the input video. - - Args: - video (cv2.VideoCapture): a :class:`VideoCapture` object, whose source can be - either a webcam or a video file. - - Yields: - ndarray: BGR visualizations of each video frame. - """ - video_visualizer = VideoVisualizer(self.metadata, self.instance_mode) - - def process_predictions(frame, predictions): - frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR) - if "panoptic_seg" in predictions: - panoptic_seg, segments_info = predictions["panoptic_seg"] - vis_frame = video_visualizer.draw_panoptic_seg_predictions( - frame, panoptic_seg.to(self.cpu_device), segments_info - ) - elif "instances" in predictions: - predictions = predictions["instances"].to(self.cpu_device) - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - elif "sem_seg" in predictions: - vis_frame = video_visualizer.draw_sem_seg( - frame, predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) - ) - elif "proposals" in predictions: - predictions = predictions["proposals"].to(self.cpu_device) - predictions.pred_boxes = predictions.proposal_boxes - predictions.scores = predictions.objectness_logits - predictions.pred_classes[:] = -1 - vis_frame = video_visualizer.draw_instance_predictions(frame, predictions) - - # Converts Matplotlib RGB format to OpenCV BGR format - vis_frame = cv2.cvtColor(vis_frame.get_image(), cv2.COLOR_RGB2BGR) - return vis_frame - - frame_gen = self._frame_from_video(video) - if self.parallel: - buffer_size = self.predictor.default_buffer_size - - frame_data = deque() - - for cnt, frame in enumerate(frame_gen): - frame_data.append(frame) - self.predictor.put(frame) - - if cnt >= buffer_size: - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - - while len(frame_data): - frame = frame_data.popleft() - predictions = self.predictor.get() - yield process_predictions(frame, predictions) - else: - for frame in frame_gen: - yield process_predictions(frame, self.predictor(frame)) - - -class AsyncPredictor: - """ - A predictor that runs the model 
asynchronously, possibly on >1 GPUs.
-    Because rendering the visualization takes a considerable amount of time,
-    this helps improve throughput when rendering videos.
-    """
-
-    class _StopToken:
-        pass
-
-    class _PredictWorker(mp.Process):
-        def __init__(self, cfg, task_queue, result_queue):
-            self.cfg = cfg
-            self.task_queue = task_queue
-            self.result_queue = result_queue
-            super().__init__()
-
-        def run(self):
-            predictor = DefaultPredictor(self.cfg)
-
-            while True:
-                task = self.task_queue.get()
-                if isinstance(task, AsyncPredictor._StopToken):
-                    break
-                idx, data = task
-                result = predictor(data)
-                self.result_queue.put((idx, result))
-
-    def __init__(self, cfg, num_gpus: int = 1):
-        """
-        Args:
-            cfg (CfgNode):
-            num_gpus (int): if 0, will run on CPU
-        """
-        num_workers = max(num_gpus, 1)
-        self.task_queue = mp.Queue(maxsize=num_workers * 3)
-        self.result_queue = mp.Queue(maxsize=num_workers * 3)
-        self.procs = []
-        for gpuid in range(max(num_gpus, 1)):
-            cfg = cfg.clone()
-            cfg.defrost()
-            cfg.MODEL.DEVICE = "cuda:{}".format(gpuid) if num_gpus > 0 else "cpu"
-            self.procs.append(
-                AsyncPredictor._PredictWorker(cfg, self.task_queue, self.result_queue)
-            )
-
-        self.put_idx = 0
-        self.get_idx = 0
-        self.result_rank = []
-        self.result_data = []
-
-        for p in self.procs:
-            p.start()
-        atexit.register(self.shutdown)
-
-    def put(self, image):
-        self.put_idx += 1
-        self.task_queue.put((self.put_idx, image))
-
-    def get(self):
-        self.get_idx += 1  # the index needed for this request
-        if len(self.result_rank) and self.result_rank[0] == self.get_idx:
-            res = self.result_data[0]
-            del self.result_data[0], self.result_rank[0]
-            return res
-
-        while True:
-            # make sure the results are returned in the correct order
-            idx, res = self.result_queue.get()
-            if idx == self.get_idx:
-                return res
-            insert = bisect.bisect(self.result_rank, idx)
-            self.result_rank.insert(insert, idx)
-            self.result_data.insert(insert, res)
-
-    def __len__(self):
-        return self.put_idx - self.get_idx
-
-    def __call__(self, image):
-        self.put(image)
-        return self.get()
-
-    def shutdown(self):
-        for _ in self.procs:
-            self.task_queue.put(AsyncPredictor._StopToken())
-
-    @property
-    def default_buffer_size(self):
-        return len(self.procs) * 5
diff --git a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py b/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py
deleted file mode 100755
index d903efde..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/projects/CenterNet2/train_net.py
+++ /dev/null
@@ -1,228 +0,0 @@
-import logging
-import os
-from collections import OrderedDict
-import torch
-from torch.nn.parallel import DistributedDataParallel
-import time
-import datetime
-import json
-
-from fvcore.common.timer import Timer
-import detectron2.utils.comm as comm
-from detectron2.checkpoint import DetectionCheckpointer, PeriodicCheckpointer
-from detectron2.config import get_cfg
-from detectron2.data import (
-    MetadataCatalog,
-    build_detection_test_loader,
-)
-from detectron2.engine import default_argument_parser, default_setup, launch
-
-from detectron2.evaluation import (
-    COCOEvaluator,
-    LVISEvaluator,
-    inference_on_dataset,
-    print_csv_format,
-)
-from detectron2.modeling import build_model
-from detectron2.solver import build_lr_scheduler, build_optimizer
-from detectron2.utils.events import (
-    CommonMetricPrinter,
-    EventStorage,
-    JSONWriter,
-    TensorboardXWriter,
-)
-from detectron2.modeling.test_time_augmentation import GeneralizedRCNNWithTTA
-from
detectron2.data.dataset_mapper import DatasetMapper -from detectron2.data.build import build_detection_train_loader - -from centernet.config import add_centernet_config -from centernet.data.custom_build_augmentation import build_custom_augmentation - -logger = logging.getLogger("detectron2") - -def do_test(cfg, model): - results = OrderedDict() - for dataset_name in cfg.DATASETS.TEST: - mapper = None if cfg.INPUT.TEST_INPUT_TYPE == 'default' else \ - DatasetMapper( - cfg, False, augmentations=build_custom_augmentation(cfg, False)) - data_loader = build_detection_test_loader(cfg, dataset_name, mapper=mapper) - output_folder = os.path.join( - cfg.OUTPUT_DIR, "inference_{}".format(dataset_name)) - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - - if evaluator_type == "lvis": - evaluator = LVISEvaluator(dataset_name, cfg, True, output_folder) - elif evaluator_type == 'coco': - evaluator = COCOEvaluator(dataset_name, cfg, True, output_folder) - else: - assert 0, evaluator_type - - results[dataset_name] = inference_on_dataset( - model, data_loader, evaluator) - if comm.is_main_process(): - logger.info("Evaluation results for {} in csv format:".format( - dataset_name)) - print_csv_format(results[dataset_name]) - if len(results) == 1: - results = list(results.values())[0] - return results - -def do_train(cfg, model, resume=False): - model.train() - optimizer = build_optimizer(cfg, model) - scheduler = build_lr_scheduler(cfg, optimizer) - - checkpointer = DetectionCheckpointer( - model, cfg.OUTPUT_DIR, optimizer=optimizer, scheduler=scheduler - ) - - start_iter = ( - checkpointer.resume_or_load( - cfg.MODEL.WEIGHTS, resume=resume, - ).get("iteration", -1) + 1 - ) - if cfg.SOLVER.RESET_ITER: - logger.info('Reset loaded iteration. Start training from iteration 0.') - start_iter = 0 - max_iter = cfg.SOLVER.MAX_ITER if cfg.SOLVER.TRAIN_ITER < 0 else cfg.SOLVER.TRAIN_ITER - - periodic_checkpointer = PeriodicCheckpointer( - checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD, max_iter=max_iter - ) - - writers = ( - [ - CommonMetricPrinter(max_iter), - JSONWriter(os.path.join(cfg.OUTPUT_DIR, "metrics.json")), - TensorboardXWriter(cfg.OUTPUT_DIR), - ] - if comm.is_main_process() - else [] - ) - - - mapper = DatasetMapper(cfg, True) if cfg.INPUT.CUSTOM_AUG == '' else \ - DatasetMapper(cfg, True, augmentations=build_custom_augmentation(cfg, True)) - if cfg.DATALOADER.SAMPLER_TRAIN in ['TrainingSampler', 'RepeatFactorTrainingSampler']: - data_loader = build_detection_train_loader(cfg, mapper=mapper) - else: - from centernet.data.custom_dataset_dataloader import build_custom_train_loader - data_loader = build_custom_train_loader(cfg, mapper=mapper) - - - logger.info("Starting training from iteration {}".format(start_iter)) - with EventStorage(start_iter) as storage: - step_timer = Timer() - data_timer = Timer() - start_time = time.perf_counter() - for data, iteration in zip(data_loader, range(start_iter, max_iter)): - data_time = data_timer.seconds() - storage.put_scalars(data_time=data_time) - step_timer.reset() - iteration = iteration + 1 - storage.step() - loss_dict = model(data) - - losses = sum( - loss for k, loss in loss_dict.items()) - assert torch.isfinite(losses).all(), loss_dict - - loss_dict_reduced = {k: v.item() \ - for k, v in comm.reduce_dict(loss_dict).items()} - losses_reduced = sum(loss for loss in loss_dict_reduced.values()) - if comm.is_main_process(): - storage.put_scalars( - total_loss=losses_reduced, **loss_dict_reduced) - - optimizer.zero_grad() - losses.backward() - 
optimizer.step() - - storage.put_scalar( - "lr", optimizer.param_groups[0]["lr"], smoothing_hint=False) - - step_time = step_timer.seconds() - storage.put_scalars(time=step_time) - data_timer.reset() - scheduler.step() - - if ( - cfg.TEST.EVAL_PERIOD > 0 - and iteration % cfg.TEST.EVAL_PERIOD == 0 - and iteration != max_iter - ): - do_test(cfg, model) - comm.synchronize() - - if iteration - start_iter > 5 and \ - (iteration % 20 == 0 or iteration == max_iter): - for writer in writers: - writer.write() - periodic_checkpointer.step(iteration) - - total_time = time.perf_counter() - start_time - logger.info( - "Total training time: {}".format( - str(datetime.timedelta(seconds=int(total_time))))) - -def setup(args): - """ - Create configs and perform basic setups. - """ - cfg = get_cfg() - add_centernet_config(cfg) - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - if '/auto' in cfg.OUTPUT_DIR: - file_name = os.path.basename(args.config_file)[:-5] - cfg.OUTPUT_DIR = cfg.OUTPUT_DIR.replace('/auto', '/{}'.format(file_name)) - logger.info('OUTPUT_DIR: {}'.format(cfg.OUTPUT_DIR)) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -def main(args): - cfg = setup(args) - - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if args.eval_only: - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - if cfg.TEST.AUG.ENABLED: - logger.info("Running inference with test-time augmentation ...") - model = GeneralizedRCNNWithTTA(cfg, model, batch_size=1) - - return do_test(cfg, model) - - distributed = comm.get_world_size() > 1 - if distributed: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False, - find_unused_parameters=True - ) - - do_train(cfg, model, resume=args.resume) - return do_test(cfg, model) - - -if __name__ == "__main__": - args = default_argument_parser() - args.add_argument('--manual_device', default='') - args = args.parse_args() - if args.manual_device != '': - os.environ['CUDA_VISIBLE_DEVICES'] = args.manual_device - args.dist_url = 'tcp://127.0.0.1:{}'.format( - torch.randint(11111, 60000, (1,))[0].item()) - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/setup.cfg b/grit_src_deprecated/third_party/CenterNet2/setup.cfg deleted file mode 100755 index 2a1ccd4e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/setup.cfg +++ /dev/null @@ -1,26 +0,0 @@ -[isort] -line_length=100 -multi_line_output=3 -include_trailing_comma=True -known_standard_library=numpy,setuptools,mock -skip=./datasets,docs -skip_glob=*/__init__.py,**/configs/**,tests/config/** -known_myself=detectron2 -known_third_party=fvcore,matplotlib,cv2,torch,torchvision,PIL,pycocotools,yacs,termcolor,cityscapesscripts,tabulate,tqdm,scipy,lvis,psutil,pkg_resources,caffe2,onnx,panopticapi,black,isort,av,iopath,omegaconf,hydra,yaml,pydoc,submitit,cloudpickle -no_lines_before=STDLIB,THIRDPARTY -sections=FUTURE,STDLIB,THIRDPARTY,myself,FIRSTPARTY,LOCALFOLDER -default_section=FIRSTPARTY - -[mypy] -python_version=3.6 -ignore_missing_imports = True -warn_unused_configs = True -disallow_untyped_defs = True -check_untyped_defs = True -warn_unused_ignores = True -warn_redundant_casts = True -show_column_numbers = True -follow_imports = silent -allow_redefinition = True -; Require all 
functions to be annotated -disallow_incomplete_defs = True diff --git a/grit_src_deprecated/third_party/CenterNet2/setup.py b/grit_src_deprecated/third_party/CenterNet2/setup.py deleted file mode 100755 index 50a5e23e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/setup.py +++ /dev/null @@ -1,206 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. - -import glob -import os -import shutil -from os import path -from setuptools import find_packages, setup -from typing import List -import torch -from torch.utils.cpp_extension import CUDA_HOME, CppExtension, CUDAExtension - -torch_ver = [int(x) for x in torch.__version__.split(".")[:2]] -assert torch_ver >= [1, 8], "Requires PyTorch >= 1.8" - - -def get_version(): - init_py_path = path.join(path.abspath(path.dirname(__file__)), "detectron2", "__init__.py") - init_py = open(init_py_path, "r").readlines() - version_line = [l.strip() for l in init_py if l.startswith("__version__")][0] - version = version_line.split("=")[-1].strip().strip("'\"") - - # The following is used to build release packages. - # Users should never use it. - suffix = os.getenv("D2_VERSION_SUFFIX", "") - version = version + suffix - if os.getenv("BUILD_NIGHTLY", "0") == "1": - from datetime import datetime - - date_str = datetime.today().strftime("%y%m%d") - version = version + ".dev" + date_str - - new_init_py = [l for l in init_py if not l.startswith("__version__")] - new_init_py.append('__version__ = "{}"\n'.format(version)) - with open(init_py_path, "w") as f: - f.write("".join(new_init_py)) - return version - - -def get_extensions(): - this_dir = path.dirname(path.abspath(__file__)) - extensions_dir = path.join(this_dir, "detectron2", "layers", "csrc") - - main_source = path.join(extensions_dir, "vision.cpp") - sources = glob.glob(path.join(extensions_dir, "**", "*.cpp")) - - from torch.utils.cpp_extension import ROCM_HOME - - is_rocm_pytorch = ( - True if ((torch.version.hip is not None) and (ROCM_HOME is not None)) else False - ) - if is_rocm_pytorch: - assert torch_ver >= [1, 8], "ROCM support requires PyTorch >= 1.8!" - - # common code between cuda and rocm platforms, for hipify version [1,0,0] and later. - source_cuda = glob.glob(path.join(extensions_dir, "**", "*.cu")) + glob.glob( - path.join(extensions_dir, "*.cu") - ) - sources = [main_source] + sources - - extension = CppExtension - - extra_compile_args = {"cxx": []} - define_macros = [] - - if (torch.cuda.is_available() and ((CUDA_HOME is not None) or is_rocm_pytorch)) or os.getenv( - "FORCE_CUDA", "0" - ) == "1": - extension = CUDAExtension - sources += source_cuda - - if not is_rocm_pytorch: - define_macros += [("WITH_CUDA", None)] - extra_compile_args["nvcc"] = [ - "-O3", - "-DCUDA_HAS_FP16=1", - "-D__CUDA_NO_HALF_OPERATORS__", - "-D__CUDA_NO_HALF_CONVERSIONS__", - "-D__CUDA_NO_HALF2_OPERATORS__", - ] - else: - define_macros += [("WITH_HIP", None)] - extra_compile_args["nvcc"] = [] - - if torch_ver < [1, 7]: - # supported by https://github.com/pytorch/pytorch/pull/43931 - CC = os.environ.get("CC", None) - if CC is not None: - extra_compile_args["nvcc"].append("-ccbin={}".format(CC)) - - include_dirs = [extensions_dir] - - ext_modules = [ - extension( - "detectron2._C", - sources, - include_dirs=include_dirs, - define_macros=define_macros, - extra_compile_args=extra_compile_args, - ) - ] - - return ext_modules - - -def get_model_zoo_configs() -> List[str]: - """ - Return a list of configs to include in package for model zoo. 
Copy over these configs inside
-    detectron2/model_zoo.
-    """
-
-    # Use absolute paths while symlinking.
-    source_configs_dir = path.join(path.dirname(path.realpath(__file__)), "configs")
-    destination = path.join(
-        path.dirname(path.realpath(__file__)), "detectron2", "model_zoo", "configs"
-    )
-    # Symlink the config directory inside package to have a cleaner pip install.
-
-    # Remove stale symlink/directory from a previous build.
-    if path.exists(source_configs_dir):
-        if path.islink(destination):
-            os.unlink(destination)
-        elif path.isdir(destination):
-            shutil.rmtree(destination)
-
-    if not path.exists(destination):
-        try:
-            os.symlink(source_configs_dir, destination)
-        except OSError:
-            # Fall back to copying if symlink fails: ex. on Windows.
-            shutil.copytree(source_configs_dir, destination)
-
-    config_paths = glob.glob("configs/**/*.yaml", recursive=True) + glob.glob(
-        "configs/**/*.py", recursive=True
-    )
-    return config_paths
-
-
-# For projects that are relatively small and provide features that are very close
-# to detectron2's core functionalities, we install them under detectron2.projects
-PROJECTS = {
-
-}
-
-setup(
-    name="detectron2",
-    version=get_version(),
-    author="FAIR",
-    url="https://github.com/facebookresearch/detectron2",
-    description="Detectron2 is FAIR's next-generation research "
-    "platform for object detection and segmentation.",
-    packages=find_packages(exclude=("configs", "tests*")) + list(PROJECTS.keys()),
-    package_dir=PROJECTS,
-    package_data={"detectron2.model_zoo": get_model_zoo_configs()},
-    python_requires=">=3.6",
-    install_requires=[
-        # These dependencies are not pure-python.
-        # In general, avoid adding more dependencies like them because they are not
-        # guaranteed to be installable by `pip install` on all platforms.
-        # To tell if a package is pure-python, go to https://pypi.org/project/{name}/#files
-        "Pillow>=7.1",  # or use pillow-simd for better performance
-        "matplotlib",  # TODO move it to optional after we add opencv visualization
-        "pycocotools>=2.0.2",  # corresponds to https://github.com/ppwwyyxx/cocoapi
-        # Do not add opencv here. Just like pytorch, users should install
-        # opencv themselves, preferably by OS's package manager, or by
-        # choosing the proper pypi package name at https://github.com/skvark/opencv-python
-        # The following are pure-python dependencies that should be easily installable
-        "termcolor>=1.1",
-        "yacs>=0.1.8",
-        "tabulate",
-        "cloudpickle",
-        "tqdm>4.29.0",
-        "tensorboard",
-        # Lock version of fvcore/iopath because they may have breaking changes
-        # NOTE: when updating fvcore/iopath version, make sure fvcore depends
-        # on compatible version of iopath.
-        "fvcore>=0.1.5,<0.1.6",  # required like this to make it pip installable
-        "iopath>=0.1.7,<0.1.10",
-        "future",  # used by caffe2
-        "pydot",  # used to save caffe2 SVGs
-        "dataclasses; python_version<'3.7'",
-        "omegaconf>=2.1",
-        "hydra-core>=1.1",
-        "black==21.4b2",
-        # If a new dependency is required at import time (in addition to runtime), it
-        # probably needs to exist in docs/requirements.txt, or as a mock in docs/conf.py
-    ],
-    extras_require={
-        # optional dependencies, required by some features
-        "all": [
-            "shapely",
-            "pygments>=2.2",
-            "psutil",
-            "panopticapi @ https://github.com/cocodataset/panopticapi/archive/master.zip",
-        ],
-        # dev dependencies.
Install them by `pip install 'detectron2[dev]'` - "dev": [ - "flake8==3.8.1", - "isort==4.3.21", - "flake8-bugbear", - "flake8-comprehensions", - ], - }, - ext_modules=get_extensions(), - cmdclass={"build_ext": torch.utils.cpp_extension.BuildExtension}, -) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/README.md b/grit_src_deprecated/third_party/CenterNet2/tests/README.md deleted file mode 100755 index f5603840..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/README.md +++ /dev/null @@ -1,9 +0,0 @@ -## Unit Tests - -To run the unittests, do: -``` -cd detectron2 -python -m unittest discover -v -s ./tests -``` - -There are also end-to-end inference & training tests, in [dev/run_*_tests.sh](../dev). diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/__init__.py deleted file mode 100755 index 9020c2df..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/__init__.py +++ /dev/null @@ -1 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py deleted file mode 100755 index a9399551..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_a.py +++ /dev/null @@ -1,3 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -dir1a_str = "base_a_1" -dir1a_dict = {"a": 1, "b": 2} diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py deleted file mode 100755 index 2dcb54cb..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/dir1/dir1_b.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from detectron2.config import LazyConfig - -# equivalent to relative import -dir1a_str, dir1a_dict = LazyConfig.load_rel("dir1_a.py", ("dir1a_str", "dir1a_dict")) - -dir1b_str = dir1a_str + "_from_b" -dir1b_dict = dir1a_dict - -# Every import is a reload: not modified by other config files -assert dir1a_dict.a == 1 diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py deleted file mode 100755 index 33d1d4bd..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/root_cfg.py +++ /dev/null @@ -1,14 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -from itertools import count - -from detectron2.config import LazyCall as L - -from .dir1.dir1_a import dir1a_dict, dir1a_str - -dir1a_dict.a = "modified" - -# modification above won't affect future imports -from .dir1.dir1_b import dir1b_dict, dir1b_str - - -lazyobj = L(count)(x=dir1a_str, y=dir1b_str) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py deleted file mode 100755 index b76f71b9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_instantiate_config.py +++ /dev/null @@ -1,100 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -import os -import tempfile -import unittest -import yaml -from omegaconf import OmegaConf -from omegaconf import __version__ as oc_version -from dataclasses import dataclass - -from detectron2.config import instantiate, LazyCall as L -from detectron2.layers import ShapeSpec - -OC_VERSION = tuple(int(x) for x in oc_version.split(".")[:2]) - - -class TestClass: - def __init__(self, int_arg, list_arg=None, dict_arg=None, extra_arg=None): - self.int_arg = int_arg - self.list_arg = list_arg - self.dict_arg = dict_arg - self.extra_arg = extra_arg - - def __call__(self, call_arg): - return call_arg + self.int_arg - - -@dataclass -class TestDataClass: - x: int - y: str - - -@unittest.skipIf(OC_VERSION < (2, 1), "omegaconf version too old") -class TestConstruction(unittest.TestCase): - def test_basic_construct(self): - objconf = L(TestClass)( - int_arg=3, - list_arg=[10], - dict_arg={}, - extra_arg=L(TestClass)(int_arg=4, list_arg="${..list_arg}"), - ) - - obj = instantiate(objconf) - self.assertIsInstance(obj, TestClass) - self.assertEqual(obj.int_arg, 3) - self.assertEqual(obj.extra_arg.int_arg, 4) - self.assertEqual(obj.extra_arg.list_arg, obj.list_arg) - - objconf.extra_arg.list_arg = [5] - obj = instantiate(objconf) - self.assertIsInstance(obj, TestClass) - self.assertEqual(obj.extra_arg.list_arg, [5]) - - def test_instantiate_other_obj(self): - # do nothing for other obj - self.assertEqual(instantiate(5), 5) - x = [3, 4, 5] - self.assertEqual(instantiate(x), x) - x = TestClass(1) - self.assertIs(instantiate(x), x) - x = {"xx": "yy"} - self.assertIs(instantiate(x), x) - - def test_instantiate_lazy_target(self): - # _target_ is result of instantiate - objconf = L(L(len)(int_arg=3))(call_arg=4) - objconf._target_._target_ = TestClass - self.assertEqual(instantiate(objconf), 7) - - def test_instantiate_lst(self): - lst = [1, 2, L(TestClass)(int_arg=1)] - x = L(TestClass)(int_arg=lst) # list as an argument should be recursively instantiated - x = instantiate(x).int_arg - self.assertEqual(x[:2], [1, 2]) - self.assertIsInstance(x[2], TestClass) - self.assertEqual(x[2].int_arg, 1) - - def test_instantiate_namedtuple(self): - x = L(TestClass)(int_arg=ShapeSpec(channels=1, width=3)) - # test serialization - with tempfile.TemporaryDirectory() as d: - fname = os.path.join(d, "d2_test.yaml") - OmegaConf.save(x, fname) - with open(fname) as f: - x = yaml.unsafe_load(f) - - x = instantiate(x) - self.assertIsInstance(x.int_arg, ShapeSpec) - self.assertEqual(x.int_arg.channels, 1) - - def test_bad_lazycall(self): - with self.assertRaises(Exception): - L(3) - - def test_instantiate_dataclass(self): - a = L(TestDataClass)(x=1, y="s") - a = instantiate(a) - self.assertEqual(a.x, 1) - self.assertEqual(a.y, "s") diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py deleted file mode 100755 index 6ff5b6dc..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_lazy_config.py +++ /dev/null @@ -1,79 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
-import os -import unittest -import tempfile -from itertools import count - -from detectron2.config import LazyConfig, LazyCall as L -from omegaconf import DictConfig - - -class TestLazyPythonConfig(unittest.TestCase): - def setUp(self): - self.root_filename = os.path.join(os.path.dirname(__file__), "root_cfg.py") - - def test_load(self): - cfg = LazyConfig.load(self.root_filename) - - self.assertEqual(cfg.dir1a_dict.a, "modified") - self.assertEqual(cfg.dir1b_dict.a, 1) - self.assertEqual(cfg.lazyobj.x, "base_a_1") - - cfg.lazyobj.x = "new_x" - # reload - cfg = LazyConfig.load(self.root_filename) - self.assertEqual(cfg.lazyobj.x, "base_a_1") - - def test_save_load(self): - cfg = LazyConfig.load(self.root_filename) - with tempfile.TemporaryDirectory(prefix="detectron2") as d: - fname = os.path.join(d, "test_config.yaml") - LazyConfig.save(cfg, fname) - cfg2 = LazyConfig.load(fname) - - self.assertEqual(cfg2.lazyobj._target_, "itertools.count") - self.assertEqual(cfg.lazyobj._target_, count) - cfg2.lazyobj.pop("_target_") - cfg.lazyobj.pop("_target_") - # the rest are equal - self.assertEqual(cfg, cfg2) - - def test_failed_save(self): - cfg = DictConfig({"x": lambda: 3}, flags={"allow_objects": True}) - with tempfile.TemporaryDirectory(prefix="detectron2") as d: - fname = os.path.join(d, "test_config.yaml") - LazyConfig.save(cfg, fname) - self.assertTrue(os.path.exists(fname)) - self.assertTrue(os.path.exists(fname + ".pkl")) - - def test_overrides(self): - cfg = LazyConfig.load(self.root_filename) - LazyConfig.apply_overrides(cfg, ["lazyobj.x=123", 'dir1b_dict.a="123"']) - self.assertEqual(cfg.dir1b_dict.a, "123") - self.assertEqual(cfg.lazyobj.x, 123) - - def test_invalid_overrides(self): - cfg = LazyConfig.load(self.root_filename) - with self.assertRaises(KeyError): - LazyConfig.apply_overrides(cfg, ["lazyobj.x.xxx=123"]) - - def test_to_py(self): - cfg = LazyConfig.load(self.root_filename) - cfg.lazyobj.x = {"a": 1, "b": 2, "c": L(count)(x={"r": "a", "s": 2.4, "t": [1, 2, 3, "z"]})} - cfg.list = ["a", 1, "b", 3.2] - py_str = LazyConfig.to_py(cfg) - expected = """cfg.dir1a_dict.a = "modified" -cfg.dir1a_dict.b = 2 -cfg.dir1b_dict.a = 1 -cfg.dir1b_dict.b = 2 -cfg.lazyobj = itertools.count( - x={ - "a": 1, - "b": 2, - "c": itertools.count(x={"r": "a", "s": 2.4, "t": [1, 2, 3, "z"]}), - }, - y="base_a_1_from_b", -) -cfg.list = ["a", 1, "b", 3.2] -""" - self.assertEqual(py_str, expected) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py b/grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py deleted file mode 100755 index 01dd6955..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/config/test_yacs_config.py +++ /dev/null @@ -1,270 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. 
- - -import os -import tempfile -import unittest -import torch -from omegaconf import OmegaConf - -from detectron2 import model_zoo -from detectron2.config import configurable, downgrade_config, get_cfg, upgrade_config -from detectron2.layers import ShapeSpec -from detectron2.modeling import build_model - -_V0_CFG = """ -MODEL: - RPN_HEAD: - NAME: "TEST" -VERSION: 0 -""" - -_V1_CFG = """ -MODEL: - WEIGHT: "/path/to/weight" -""" - - -class TestConfigVersioning(unittest.TestCase): - def test_upgrade_downgrade_consistency(self): - cfg = get_cfg() - # check that custom is preserved - cfg.USER_CUSTOM = 1 - - down = downgrade_config(cfg, to_version=0) - up = upgrade_config(down) - self.assertTrue(up == cfg) - - def _merge_cfg_str(self, cfg, merge_str): - f = tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) - try: - f.write(merge_str) - f.close() - cfg.merge_from_file(f.name) - finally: - os.remove(f.name) - return cfg - - def test_auto_upgrade(self): - cfg = get_cfg() - latest_ver = cfg.VERSION - cfg.USER_CUSTOM = 1 - - self._merge_cfg_str(cfg, _V0_CFG) - - self.assertEqual(cfg.MODEL.RPN.HEAD_NAME, "TEST") - self.assertEqual(cfg.VERSION, latest_ver) - - def test_guess_v1(self): - cfg = get_cfg() - latest_ver = cfg.VERSION - self._merge_cfg_str(cfg, _V1_CFG) - self.assertEqual(cfg.VERSION, latest_ver) - - -class _TestClassA(torch.nn.Module): - @configurable - def __init__(self, arg1, arg2, arg3=3): - super().__init__() - self.arg1 = arg1 - self.arg2 = arg2 - self.arg3 = arg3 - assert arg1 == 1 - assert arg2 == 2 - assert arg3 == 3 - - @classmethod - def from_config(cls, cfg): - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - return args - - -class _TestClassB(_TestClassA): - @configurable - def __init__(self, input_shape, arg1, arg2, arg3=3): - """ - Doc of _TestClassB - """ - assert input_shape == "shape" - super().__init__(arg1, arg2, arg3) - - @classmethod - def from_config(cls, cfg, input_shape): # test extra positional arg in from_config - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - args["input_shape"] = input_shape - return args - - -class _LegacySubClass(_TestClassB): - # an old subclass written in cfg style - def __init__(self, cfg, input_shape, arg4=4): - super().__init__(cfg, input_shape) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _NewSubClassNewInit(_TestClassB): - # test new subclass with a new __init__ - @configurable - def __init__(self, input_shape, arg4=4, **kwargs): - super().__init__(input_shape, **kwargs) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _LegacySubClassNotCfg(_TestClassB): - # an old subclass written in cfg style, but argument is not called "cfg" - def __init__(self, config, input_shape): - super().__init__(config, input_shape) - assert self.arg1 == 1 - assert self.arg2 == 2 - assert self.arg3 == 3 - - -class _TestClassC(_TestClassB): - @classmethod - def from_config(cls, cfg, input_shape, **kwargs): # test extra kwarg overwrite - args = {"arg1": cfg.ARG1, "arg2": cfg.ARG2} - args["input_shape"] = input_shape - args.update(kwargs) - return args - - -class _TestClassD(_TestClassA): - @configurable - def __init__(self, input_shape: ShapeSpec, arg1: int, arg2, arg3=3): - assert input_shape == "shape" - super().__init__(arg1, arg2, arg3) - - # _TestClassA.from_config does not have input_shape args. 
- # Test whether input_shape will be forwarded to __init__ - - -@configurable(from_config=lambda cfg, arg2: {"arg1": cfg.ARG1, "arg2": arg2, "arg3": cfg.ARG3}) -def _test_func(arg1, arg2=2, arg3=3, arg4=4): - return arg1, arg2, arg3, arg4 - - -class TestConfigurable(unittest.TestCase): - def testInitWithArgs(self): - _ = _TestClassA(arg1=1, arg2=2, arg3=3) - _ = _TestClassB("shape", arg1=1, arg2=2) - _ = _TestClassC("shape", arg1=1, arg2=2) - _ = _TestClassD("shape", arg1=1, arg2=2, arg3=3) - - def testPatchedAttr(self): - self.assertTrue("Doc" in _TestClassB.__init__.__doc__) - self.assertEqual(_TestClassD.__init__.__annotations__["arg1"], int) - - def testInitWithCfg(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 2 - cfg.ARG3 = 3 - _ = _TestClassA(cfg) - _ = _TestClassB(cfg, input_shape="shape") - _ = _TestClassC(cfg, input_shape="shape") - _ = _TestClassD(cfg, input_shape="shape") - _ = _LegacySubClass(cfg, input_shape="shape") - _ = _NewSubClassNewInit(cfg, input_shape="shape") - _ = _LegacySubClassNotCfg(cfg, input_shape="shape") - with self.assertRaises(TypeError): - # disallow forwarding positional args to __init__ since it's prone to errors - _ = _TestClassD(cfg, "shape") - - # call with kwargs instead - _ = _TestClassA(cfg=cfg) - _ = _TestClassB(cfg=cfg, input_shape="shape") - _ = _TestClassC(cfg=cfg, input_shape="shape") - _ = _TestClassD(cfg=cfg, input_shape="shape") - _ = _LegacySubClass(cfg=cfg, input_shape="shape") - _ = _NewSubClassNewInit(cfg=cfg, input_shape="shape") - _ = _LegacySubClassNotCfg(config=cfg, input_shape="shape") - - def testInitWithCfgOverwrite(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 999 # wrong config - with self.assertRaises(AssertionError): - _ = _TestClassA(cfg, arg3=3) - - # overwrite arg2 with correct config later: - _ = _TestClassA(cfg, arg2=2, arg3=3) - _ = _TestClassB(cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassC(cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassD(cfg, input_shape="shape", arg2=2, arg3=3) - - # call with kwargs cfg=cfg instead - _ = _TestClassA(cfg=cfg, arg2=2, arg3=3) - _ = _TestClassB(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassC(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - _ = _TestClassD(cfg=cfg, input_shape="shape", arg2=2, arg3=3) - - def testInitWithCfgWrongArgs(self): - cfg = get_cfg() - cfg.ARG1 = 1 - cfg.ARG2 = 2 - with self.assertRaises(TypeError): - _ = _TestClassB(cfg, "shape", not_exist=1) - with self.assertRaises(TypeError): - _ = _TestClassC(cfg, "shape", not_exist=1) - with self.assertRaises(TypeError): - _ = _TestClassD(cfg, "shape", not_exist=1) - - def testBadClass(self): - class _BadClass1: - @configurable - def __init__(self, a=1, b=2): - pass - - class _BadClass2: - @configurable - def __init__(self, a=1, b=2): - pass - - def from_config(self, cfg): # noqa - pass - - class _BadClass3: - @configurable - def __init__(self, a=1, b=2): - pass - - # bad name: must be cfg - @classmethod - def from_config(cls, config): # noqa - pass - - with self.assertRaises(AttributeError): - _ = _BadClass1(a=1) - - with self.assertRaises(TypeError): - _ = _BadClass2(a=1) - - with self.assertRaises(TypeError): - _ = _BadClass3(get_cfg()) - - def testFuncWithCfg(self): - cfg = get_cfg() - cfg.ARG1 = 10 - cfg.ARG3 = 30 - - self.assertEqual(_test_func(1), (1, 2, 3, 4)) - with self.assertRaises(TypeError): - _test_func(cfg) - self.assertEqual(_test_func(cfg, arg2=2), (10, 2, 30, 4)) - self.assertEqual(_test_func(cfg, arg1=100, arg2=20), (100, 20, 30, 4)) - 
self.assertEqual(_test_func(cfg, arg1=100, arg2=20, arg4=40), (100, 20, 30, 40)) - - self.assertTrue(callable(_test_func.from_config)) - - def testOmegaConf(self): - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - cfg = OmegaConf.create(cfg.dump()) - if not torch.cuda.is_available(): - cfg.MODEL.DEVICE = "cpu" - # test that a model can be built with omegaconf config as well - build_model(cfg) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/__init__.py deleted file mode 100755 index e69de29b..00000000 diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py deleted file mode 100755 index caabead5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco.py +++ /dev/null @@ -1,139 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import json -import numpy as np -import os -import tempfile -import unittest -import pycocotools.mask as mask_util - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.data.datasets.coco import convert_to_coco_dict, load_coco_json -from detectron2.structures import BoxMode - - -def make_mask(): - """ - Makes a donut shaped binary mask. - """ - H = 100 - W = 100 - mask = np.zeros([H, W], dtype=np.uint8) - for x in range(W): - for y in range(H): - d = np.linalg.norm(np.array([W, H]) / 2 - np.array([x, y])) - if d > 10 and d < 20: - mask[y, x] = 1 - return mask - - -def uncompressed_rle(mask): - l = mask.flatten(order="F").tolist() - counts = [] - p = False - cnt = 0 - for i in l: - if i == p: - cnt += 1 - else: - counts.append(cnt) - p = i - cnt = 1 - counts.append(cnt) - return {"counts": counts, "size": [mask.shape[0], mask.shape[1]]} - - -def make_dataset_dicts(mask, compressed: bool = True): - """ - Returns a list of dicts that represents a single COCO data point for - object detection. The single instance given by `mask` is represented by - RLE, either compressed or uncompressed. - """ - record = {} - record["file_name"] = "test" - record["image_id"] = 0 - record["height"] = mask.shape[0] - record["width"] = mask.shape[1] - - y, x = np.nonzero(mask) - if compressed: - segmentation = mask_util.encode(np.asarray(mask, order="F")) - else: - segmentation = uncompressed_rle(mask) - min_x = np.min(x) - max_x = np.max(x) - min_y = np.min(y) - max_y = np.max(y) - obj = { - "bbox": [min_x, min_y, max_x, max_y], - "bbox_mode": BoxMode.XYXY_ABS, - "category_id": 0, - "iscrowd": 0, - "segmentation": segmentation, - } - record["annotations"] = [obj] - return [record] - - -class TestRLEToJson(unittest.TestCase): - def test(self): - # Make a dummy dataset. - mask = make_mask() - DatasetCatalog.register("test_dataset", lambda: make_dataset_dicts(mask)) - MetadataCatalog.get("test_dataset").set(thing_classes=["test_label"]) - - # Dump to json. - json_dict = convert_to_coco_dict("test_dataset") - with tempfile.TemporaryDirectory() as tmpdir: - json_file_name = os.path.join(tmpdir, "test.json") - with open(json_file_name, "w") as f: - json.dump(json_dict, f) - # Load from json. - dicts = load_coco_json(json_file_name, "") - - # Check the loaded mask matches the original. 
- anno = dicts[0]["annotations"][0] - loaded_mask = mask_util.decode(anno["segmentation"]) - self.assertTrue(np.array_equal(loaded_mask, mask)) - DatasetCatalog.pop("test_dataset") - MetadataCatalog.pop("test_dataset") - - def test_uncompressed_RLE(self): - mask = make_mask() - rle = mask_util.encode(np.asarray(mask, order="F")) - uncompressed = uncompressed_rle(mask) - compressed = mask_util.frPyObjects(uncompressed, *rle["size"]) - self.assertEqual(rle, compressed) - - -class TestConvertCOCO(unittest.TestCase): - @staticmethod - def generate_data(): - record = { - "file_name": "test", - "image_id": 0, - "height": 100, - "width": 100, - "annotations": [ - { - "bbox": [10, 10, 10, 10, 5], - "bbox_mode": BoxMode.XYWHA_ABS, - "category_id": 0, - "iscrowd": 0, - }, - { - "bbox": [15, 15, 3, 3], - "bbox_mode": BoxMode.XYXY_ABS, - "category_id": 0, - "iscrowd": 0, - }, - ], - } - - return [record] - - def test_convert_to_coco(self): - DatasetCatalog.register("test_dataset", lambda: TestConvertCOCO.generate_data()) - MetadataCatalog.get("test_dataset").set(thing_classes=["test_label"]) - convert_to_coco_dict("test_dataset") - DatasetCatalog.pop("test_dataset") - MetadataCatalog.pop("test_dataset") diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py deleted file mode 100755 index 964f0028..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_coco_evaluation.py +++ /dev/null @@ -1,138 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import contextlib -import copy -import io -import json -import numpy as np -import os -import tempfile -import unittest -import torch -from pycocotools.coco import COCO -from pycocotools.cocoeval import COCOeval - -from detectron2.data import DatasetCatalog -from detectron2.evaluation import COCOEvaluator -from detectron2.evaluation.fast_eval_api import COCOeval_opt -from detectron2.structures import Boxes, Instances - - -class TestCOCOeval(unittest.TestCase): - def test_fast_eval(self): - # A small set of images/categories from COCO val - # fmt: off - detections = [{"image_id": 139, "category_id": 1, "bbox": [417.3332824707031, 159.27003479003906, 47.66064453125, 143.00193786621094], "score": 0.9949821829795837, "segmentation": {"size": [426, 640], "counts": "Tc`52W=3N0N4aNN^E7]:4XE1g:8kDMT;U100000001O1gE[Nk8h1dFiNY9Z1aFkN]9g2J3NdN`FlN`9S1cFRN07]9g1bFoM6;X9c1cFoM=8R9g1bFQN>3U9Y30O01OO1O001N2O1N1O4L4L5UNoE3V:CVF6Q:@YF9l9@ZF 0 else 0.0 - msg = "%s: comparing COCO APIs, %s differs by %f" % (name, k, abs_diff) - self.assertTrue(abs_diff < 1e-4, msg=msg) - - def test_unknown_category(self): - dataset = "coco_2017_val_100" - evaluator = COCOEvaluator(dataset) - evaluator.reset() - inputs = DatasetCatalog.get(dataset)[:2] - pred = Instances((100, 100)) - pred.pred_boxes = Boxes(torch.rand(2, 4)) - pred.scores = torch.rand(2) - pred.pred_classes = torch.tensor([10, 80]) - output = {"instances": pred} - evaluator.process(inputs, [output, output]) - with self.assertRaises(AssertionError): - evaluator.evaluate() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py deleted file mode 100755 index 7d16ec4c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_dataset.py +++ /dev/null @@ -1,134 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
- -import os -import pickle -import sys -import unittest -from functools import partial -import torch -from iopath.common.file_io import LazyPath - -from detectron2 import model_zoo -from detectron2.config import instantiate -from detectron2.data import ( - DatasetFromList, - MapDataset, - ToIterableDataset, - build_batch_data_loader, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.data.samplers import InferenceSampler, TrainingSampler - - -def _a_slow_func(x): - return "path/{}".format(x) - - -class TestDatasetFromList(unittest.TestCase): - # Failing for py3.6, likely due to pickle - @unittest.skipIf(sys.version_info.minor <= 6, "Not supported in Python 3.6") - def test_using_lazy_path(self): - dataset = [] - for i in range(10): - dataset.append({"file_name": LazyPath(partial(_a_slow_func, i))}) - - dataset = DatasetFromList(dataset) - for i in range(10): - path = dataset[i]["file_name"] - self.assertTrue(isinstance(path, LazyPath)) - self.assertEqual(os.fspath(path), _a_slow_func(i)) - - -class TestMapDataset(unittest.TestCase): - @staticmethod - def map_func(x): - if x == 2: - return None - return x * 2 - - def test_map_style(self): - ds = DatasetFromList([1, 2, 3]) - ds = MapDataset(ds, TestMapDataset.map_func) - self.assertEqual(ds[0], 2) - self.assertEqual(ds[2], 6) - self.assertIn(ds[1], [2, 6]) - - def test_iter_style(self): - class DS(torch.utils.data.IterableDataset): - def __iter__(self): - yield from [1, 2, 3] - - ds = DS() - ds = MapDataset(ds, TestMapDataset.map_func) - self.assertIsInstance(ds, torch.utils.data.IterableDataset) - - data = list(iter(ds)) - self.assertEqual(data, [2, 6]) - - def test_pickleability(self): - ds = DatasetFromList([1, 2, 3]) - ds = MapDataset(ds, lambda x: x * 2) - ds = pickle.loads(pickle.dumps(ds)) - self.assertEqual(ds[0], 2) - - -class TestDataLoader(unittest.TestCase): - def _get_kwargs(self): - # get kwargs of build_detection_train_loader - cfg = model_zoo.get_config("common/data/coco.py").dataloader.train - cfg.dataset.names = "coco_2017_val_100" - cfg.pop("_target_") - kwargs = {k: instantiate(v) for k, v in cfg.items()} - return kwargs - - def test_build_dataloader_train(self): - kwargs = self._get_kwargs() - dl = build_detection_train_loader(**kwargs) - next(iter(dl)) - - def test_build_iterable_dataloader_train(self): - kwargs = self._get_kwargs() - ds = DatasetFromList(kwargs.pop("dataset")) - ds = ToIterableDataset(ds, TrainingSampler(len(ds))) - dl = build_detection_train_loader(dataset=ds, **kwargs) - next(iter(dl)) - - def _check_is_range(self, data_loader, N): - # check that data_loader produces range(N) - data = list(iter(data_loader)) - data = [x for batch in data for x in batch] # flatten the batches - self.assertEqual(len(data), N) - self.assertEqual(set(data), set(range(N))) - - def test_build_batch_dataloader_inference(self): - # Test that build_batch_data_loader can be used for inference - N = 96 - ds = DatasetFromList(list(range(N))) - sampler = InferenceSampler(len(ds)) - dl = build_batch_data_loader(ds, sampler, 8, num_workers=3) - self._check_is_range(dl, N) - - def test_build_dataloader_inference(self): - N = 50 - ds = DatasetFromList(list(range(N))) - sampler = InferenceSampler(len(ds)) - # test that parallel loader works correctly - dl = build_detection_test_loader( - dataset=ds, sampler=sampler, mapper=lambda x: x, num_workers=3 - ) - self._check_is_range(dl, N) - - # test that batch_size works correctly - dl = build_detection_test_loader( - dataset=ds, sampler=sampler, 
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py
deleted file mode 100755
index aac56c07..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_detection_utils.py
+++ /dev/null
@@ -1,176 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import copy
-import numpy as np
-import os
-import unittest
-import pycocotools.mask as mask_util
-
-from detectron2.data import MetadataCatalog, detection_utils
-from detectron2.data import transforms as T
-from detectron2.structures import BitMasks, BoxMode
-from detectron2.utils.file_io import PathManager
-
-
-class TestTransformAnnotations(unittest.TestCase):
-    def test_transform_simple_annotation(self):
-        transforms = T.TransformList([T.HFlipTransform(400)])
-        anno = {
-            "bbox": np.asarray([10, 10, 200, 300]),
-            "bbox_mode": BoxMode.XYXY_ABS,
-            "category_id": 3,
-            "segmentation": [[10, 10, 100, 100, 100, 10], [150, 150, 200, 150, 200, 200]],
-        }
-
-        output = detection_utils.transform_instance_annotations(anno, transforms, (400, 400))
-        self.assertTrue(np.allclose(output["bbox"], [200, 10, 390, 300]))
-        self.assertEqual(len(output["segmentation"]), len(anno["segmentation"]))
-        self.assertTrue(np.allclose(output["segmentation"][0], [390, 10, 300, 100, 300, 10]))
-
-        detection_utils.annotations_to_instances([output, output], (400, 400))
-
-    def test_transform_empty_annotation(self):
-        detection_utils.annotations_to_instances([], (400, 400))
-
-    def test_flip_keypoints(self):
-        transforms = T.TransformList([T.HFlipTransform(400)])
-        anno = {
-            "bbox": np.asarray([10, 10, 200, 300]),
-            "bbox_mode": BoxMode.XYXY_ABS,
-            "keypoints": np.random.rand(17, 3) * 50 + 15,
-        }
-
-        output = detection_utils.transform_instance_annotations(
-            copy.deepcopy(anno),
-            transforms,
-            (400, 400),
-            keypoint_hflip_indices=detection_utils.create_keypoint_hflip_indices(
-                ["keypoints_coco_2017_train"]
-            ),
-        )
-        # The first keypoint is nose
-        self.assertTrue(np.allclose(output["keypoints"][0, 0], 400 - anno["keypoints"][0, 0]))
-        # The last 16 keypoints are 8 left-right pairs
-        self.assertTrue(
-            np.allclose(
-                output["keypoints"][1:, 0].reshape(-1, 2)[:, ::-1],
-                400 - anno["keypoints"][1:, 0].reshape(-1, 2),
-            )
-        )
-        self.assertTrue(
-            np.allclose(
-                output["keypoints"][1:, 1:].reshape(-1, 2, 2)[:, ::-1, :],
-                anno["keypoints"][1:, 1:].reshape(-1, 2, 2),
-            )
-        )
-
-    def test_crop(self):
-        transforms = T.TransformList([T.CropTransform(300, 300, 10, 10)])
-        keypoints = np.random.rand(17, 3) * 50 + 15
-        keypoints[:, 2] = 2
-        anno = {
-            "bbox": np.asarray([10, 10, 200, 400]),
-            "bbox_mode": BoxMode.XYXY_ABS,
-            "keypoints": keypoints,
-        }
-
-        output = detection_utils.transform_instance_annotations(
-            copy.deepcopy(anno), transforms, (10, 10)
-        )
-        # box is shifted and cropped
-        self.assertTrue((output["bbox"] == np.asarray([0, 0, 0, 10])).all())
-        # keypoints are no longer visible
-        self.assertTrue((output["keypoints"][:, 2] == 0).all())
-
-    def test_transform_RLE(self):
-        transforms = T.TransformList([T.HFlipTransform(400)])
-        mask = np.zeros((300, 400), order="F").astype("uint8")
-        mask[:, :200] = 1
-
-        anno = {
-            "bbox": np.asarray([10, 10, 200, 300]),
-            "bbox_mode": BoxMode.XYXY_ABS,
-            "segmentation": mask_util.encode(mask[:, :, None])[0],
-            "category_id": 3,
-        }
-        output = detection_utils.transform_instance_annotations(
-            copy.deepcopy(anno), transforms, (300, 400)
-        )
-        mask = output["segmentation"]
-        self.assertTrue((mask[:, 200:] == 1).all())
-        self.assertTrue((mask[:, :200] == 0).all())
-
-        inst = detection_utils.annotations_to_instances(
-            [output, output], (400, 400), mask_format="bitmask"
-        )
-        self.assertTrue(isinstance(inst.gt_masks, BitMasks))
-
-    def test_transform_RLE_resize(self):
-        transforms = T.TransformList(
-            [T.HFlipTransform(400), T.ScaleTransform(300, 400, 400, 400, "bilinear")]
-        )
-        mask = np.zeros((300, 400), order="F").astype("uint8")
-        mask[:, :200] = 1
-
-        anno = {
-            "bbox": np.asarray([10, 10, 200, 300]),
-            "bbox_mode": BoxMode.XYXY_ABS,
-            "segmentation": mask_util.encode(mask[:, :, None])[0],
-            "category_id": 3,
-        }
-        output = detection_utils.transform_instance_annotations(
-            copy.deepcopy(anno), transforms, (400, 400)
-        )
-
-        inst = detection_utils.annotations_to_instances(
-            [output, output], (400, 400), mask_format="bitmask"
-        )
-        self.assertTrue(isinstance(inst.gt_masks, BitMasks))
-
-    def test_gen_crop(self):
-        instance = {"bbox": [10, 10, 100, 100], "bbox_mode": BoxMode.XYXY_ABS}
-        t = detection_utils.gen_crop_transform_with_instance((10, 10), (150, 150), instance)
-        # the box center must fall into the cropped region
-        self.assertTrue(t.x0 <= 55 <= t.x0 + t.w)
-
-    def test_gen_crop_outside_boxes(self):
-        instance = {"bbox": [10, 10, 100, 100], "bbox_mode": BoxMode.XYXY_ABS}
-        with self.assertRaises(AssertionError):
-            detection_utils.gen_crop_transform_with_instance((10, 10), (15, 15), instance)
-
-    def test_read_sem_seg(self):
-        cityscapes_dir = MetadataCatalog.get("cityscapes_fine_sem_seg_val").gt_dir
-        sem_seg_gt_path = os.path.join(
-            cityscapes_dir, "frankfurt", "frankfurt_000001_083852_gtFine_labelIds.png"
-        )
-        if not PathManager.exists(sem_seg_gt_path):
-            raise unittest.SkipTest(
-                "Semantic segmentation ground truth {} not found.".format(sem_seg_gt_path)
-            )
-        sem_seg = detection_utils.read_image(sem_seg_gt_path, "L")
-        self.assertEqual(sem_seg.ndim, 3)
-        self.assertEqual(sem_seg.shape[2], 1)
-        self.assertEqual(sem_seg.dtype, np.uint8)
-        self.assertEqual(sem_seg.max(), 32)
-        self.assertEqual(sem_seg.min(), 1)
-
-    def test_read_exif_orientation(self):
-        # https://github.com/recurser/exif-orientation-examples/raw/master/Landscape_5.jpg
-        URL = "detectron2://assets/Landscape_5.jpg"
-        img = detection_utils.read_image(URL, "RGB")
-        self.assertEqual(img.ndim, 3)
-        self.assertEqual(img.dtype, np.uint8)
-        self.assertEqual(img.shape, (1200, 1800, 3))  # check that shape is not transposed
-
-    def test_opencv_exif_orientation(self):
-        import cv2
-
-        URL = "detectron2://assets/Landscape_5.jpg"
-        with PathManager.open(URL, "rb") as f:
-            img = cv2.imdecode(np.frombuffer(f.read(), dtype="uint8"), cv2.IMREAD_COLOR)
-        self.assertEqual(img.dtype, np.uint8)
-        self.assertEqual(img.shape, (1200, 1800, 3))
-
-
-if __name__ == "__main__":
-    unittest.main()
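The flip arithmetic asserted in the deleted annotation tests above is worth making explicit: for an image of width W, a horizontal flip maps x to W - x, so an XYXY box (x0, y0, x1, y1) becomes (W - x1, y0, W - x0, y1). A short sketch, assuming detectron2 is importable:

```python
import numpy as np
from detectron2.data import detection_utils
from detectron2.data import transforms as T
from detectron2.structures import BoxMode

W = 400
anno = {"bbox": np.asarray([10, 10, 200, 300]), "bbox_mode": BoxMode.XYXY_ABS, "category_id": 0}
tfms = T.TransformList([T.HFlipTransform(W)])
out = detection_utils.transform_instance_annotations(anno, tfms, (400, W))
print(out["bbox"])  # [W - 200, 10, W - 10, 300] == [200, 10, 390, 300]
```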
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py
deleted file mode 100755
index 0e8299ed..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_rotation_transform.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import numpy as np
-import unittest
-
-from detectron2.data.transforms.transform import RotationTransform
-
-
-class TestRotationTransform(unittest.TestCase):
-    def assertEqualsArrays(self, a1, a2):
-        self.assertTrue(np.allclose(a1, a2))
-
-    def randomData(self, h=5, w=5):
-        image = np.random.rand(h, w)
-        coords = np.array([[i, j] for j in range(h + 1) for i in range(w + 1)], dtype=float)
-        return image, coords, h, w
-
-    def test180(self):
-        image, coords, h, w = self.randomData(6, 6)
-        rot = RotationTransform(h, w, 180, expand=False, center=None)
-        self.assertEqualsArrays(rot.apply_image(image), image[::-1, ::-1])
-        rotated_coords = [[w - c[0], h - c[1]] for c in coords]
-        self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords)
-
-    def test45_coords(self):
-        _, coords, h, w = self.randomData(4, 6)
-        rot = RotationTransform(h, w, 45, expand=False, center=None)
-        rotated_coords = [
-            [(x + y - (h + w) / 2) / np.sqrt(2) + w / 2, h / 2 + (y + (w - h) / 2 - x) / np.sqrt(2)]
-            for (x, y) in coords
-        ]
-        self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords)
-
-    def test90(self):
-        image, coords, h, w = self.randomData()
-        rot = RotationTransform(h, w, 90, expand=False, center=None)
-        self.assertEqualsArrays(rot.apply_image(image), image.T[::-1])
-        rotated_coords = [[c[1], w - c[0]] for c in coords]
-        self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords)
-
-    def test90_expand(self):  # non-square image
-        image, coords, h, w = self.randomData(h=5, w=8)
-        rot = RotationTransform(h, w, 90, expand=True, center=None)
-        self.assertEqualsArrays(rot.apply_image(image), image.T[::-1])
-        rotated_coords = [[c[1], w - c[0]] for c in coords]
-        self.assertEqualsArrays(rot.apply_coords(coords), rotated_coords)
-
-    def test_center_expand(self):
-        # center has no effect if expand=True because it only affects shifting
-        image, coords, h, w = self.randomData(h=5, w=8)
-        angle = np.random.randint(360)
-        rot1 = RotationTransform(h, w, angle, expand=True, center=None)
-        rot2 = RotationTransform(h, w, angle, expand=True, center=(0, 0))
-        rot3 = RotationTransform(h, w, angle, expand=True, center=(h, w))
-        rot4 = RotationTransform(h, w, angle, expand=True, center=(2, 5))
-        for r1 in [rot1, rot2, rot3, rot4]:
-            for r2 in [rot1, rot2, rot3, rot4]:
-                self.assertEqualsArrays(r1.apply_image(image), r2.apply_image(image))
-                self.assertEqualsArrays(r1.apply_coords(coords), r2.apply_coords(coords))
-
-    def test_inverse_transform(self):
-        image, coords, h, w = self.randomData(h=5, w=8)
-        rot = RotationTransform(h, w, 90, expand=True, center=None)
-        rot_image = rot.apply_image(image)
-        self.assertEqualsArrays(rot.inverse().apply_image(rot_image), image)
-        rot = RotationTransform(h, w, 65, expand=True, center=None)
-        rotated_coords = rot.apply_coords(coords)
-        self.assertEqualsArrays(rot.inverse().apply_coords(rotated_coords), coords)
-
-
-if __name__ == "__main__":
-    unittest.main()
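The 90-degree cases in the deleted RotationTransform tests reduce to a closed form: rotating an (h, w) image 90 degrees counterclockwise with expansion maps a point (x, y) to (y, w - x), and the pixel array follows the same rule. A numpy-only check of that identity (an illustrative sketch, independent of detectron2):

```python
import numpy as np

h, w = 5, 8
img = np.random.rand(h, w)
# np.rot90 (counterclockwise) is exactly transpose-then-vertical-flip:
assert np.allclose(np.rot90(img), img.T[::-1])

coords = np.array([[i, j] for j in range(h + 1) for i in range(w + 1)], dtype=float)
rotated = np.stack([coords[:, 1], w - coords[:, 0]], axis=1)  # (x, y) -> (y, w - x)
print(rotated[:3])
```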
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py
deleted file mode 100755
index 0d278439..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_sampler.py
+++ /dev/null
@@ -1,111 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import itertools
-import math
-import operator
-import unittest
-import torch
-from torch.utils import data
-from torch.utils.data.sampler import SequentialSampler
-
-from detectron2.data.build import worker_init_reset_seed
-from detectron2.data.common import DatasetFromList, ToIterableDataset
-from detectron2.data.samplers import (
-    GroupedBatchSampler,
-    InferenceSampler,
-    RepeatFactorTrainingSampler,
-    TrainingSampler,
-)
-from detectron2.utils.env import seed_all_rng
-
-
-class TestGroupedBatchSampler(unittest.TestCase):
-    def test_missing_group_id(self):
-        sampler = SequentialSampler(list(range(100)))
-        group_ids = [1] * 100
-        samples = GroupedBatchSampler(sampler, group_ids, 2)
-
-        for mini_batch in samples:
-            self.assertEqual(len(mini_batch), 2)
-
-    def test_groups(self):
-        sampler = SequentialSampler(list(range(100)))
-        group_ids = [1, 0] * 50
-        samples = GroupedBatchSampler(sampler, group_ids, 2)
-
-        for mini_batch in samples:
-            self.assertEqual((mini_batch[0] + mini_batch[1]) % 2, 0)
-
-
-class TestSamplerDeterministic(unittest.TestCase):
-    def test_to_iterable(self):
-        sampler = TrainingSampler(100, seed=10)
-        gt_output = list(itertools.islice(sampler, 100))
-        self.assertEqual(set(gt_output), set(range(100)))
-
-        dataset = DatasetFromList(list(range(100)))
-        dataset = ToIterableDataset(dataset, sampler)
-        data_loader = data.DataLoader(dataset, num_workers=0, collate_fn=operator.itemgetter(0))
-
-        output = list(itertools.islice(data_loader, 100))
-        self.assertEqual(output, gt_output)
-
-        data_loader = data.DataLoader(
-            dataset,
-            num_workers=2,
-            collate_fn=operator.itemgetter(0),
-            worker_init_fn=worker_init_reset_seed,
-            # reset seed should not affect behavior of TrainingSampler
-        )
-        output = list(itertools.islice(data_loader, 100))
-        # multiple workers should not lead to duplicate or different data
-        self.assertEqual(output, gt_output)
-
-    def test_training_sampler_seed(self):
-        seed_all_rng(42)
-        sampler = TrainingSampler(30)
-        data = list(itertools.islice(sampler, 65))
-
-        seed_all_rng(42)
-        sampler = TrainingSampler(30)
-        seed_all_rng(999)  # should be ineffective
-        data2 = list(itertools.islice(sampler, 65))
-        self.assertEqual(data, data2)
-
-
-class TestRepeatFactorTrainingSampler(unittest.TestCase):
-    def test_repeat_factors_from_category_frequency(self):
-        repeat_thresh = 0.5
-
-        dataset_dicts = [
-            {"annotations": [{"category_id": 0}, {"category_id": 1}]},
-            {"annotations": [{"category_id": 0}]},
-            {"annotations": []},
-        ]
-
-        rep_factors = RepeatFactorTrainingSampler.repeat_factors_from_category_frequency(
-            dataset_dicts, repeat_thresh
-        )
-
-        expected_rep_factors = torch.tensor([math.sqrt(3 / 2), 1.0, 1.0])
-        self.assertTrue(torch.allclose(rep_factors, expected_rep_factors))
-
-
-class TestInferenceSampler(unittest.TestCase):
-    def test_local_indices(self):
-        sizes = [0, 16, 2, 42]
-        world_sizes = [5, 2, 3, 4]
-
-        expected_results = [
-            [range(0) for _ in range(5)],
-            [range(8), range(8, 16)],
-            [range(1), range(1, 2), range(0)],
-            [range(11), range(11, 22), range(22, 32), range(32, 42)],
-        ]
-
-        for size, world_size, expected_result in zip(sizes, world_sizes, expected_results):
-            with self.subTest(f"size={size}, world_size={world_size}"):
-                local_indices = [
-                    InferenceSampler._get_local_indices(size, world_size, r)
-                    for r in range(world_size)
-                ]
-                self.assertEqual(local_indices, expected_result)
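The sqrt(3/2) expected in the deleted RepeatFactorTrainingSampler test follows from the LVIS repeat-factor rule: per category c, r(c) = max(1, sqrt(t / f(c))), where f(c) is the fraction of images containing c and t is the threshold; an image's factor is the max r(c) over its categories. A standalone recomputation of the test's numbers (pure-Python sketch):

```python
import math

t = 0.5
freq = {0: 2 / 3, 1: 1 / 3}  # category 0 appears in 2 of 3 images, category 1 in 1 of 3
r = {c: max(1.0, math.sqrt(t / f)) for c, f in freq.items()}
img_factors = [max(r[0], r[1]), r[0], 1.0]  # images: {0, 1}, {0}, {} (no annotations)
print(img_factors)  # [sqrt(3/2), 1.0, 1.0], matching expected_rep_factors above
```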
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py b/grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py
deleted file mode 100755
index 382048e5..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/data/test_transforms.py
+++ /dev/null
@@ -1,268 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import logging
-import numpy as np
-import unittest
-from unittest import mock
-import torch
-from PIL import Image, ImageOps
-from torch.nn import functional as F
-
-from detectron2.config import get_cfg
-from detectron2.data import detection_utils
-from detectron2.data import transforms as T
-from detectron2.utils.logger import setup_logger
-
-logger = logging.getLogger(__name__)
-
-
-def polygon_allclose(poly1, poly2):
-    """
-    Test whether two polygons are the same.
-    Both arguments are nx2 numpy arrays.
-    """
-    # ABCD and CDAB are the same polygon. So it's important to check after rolling
-    for k in range(len(poly1)):
-        rolled_poly1 = np.roll(poly1, k, axis=0)
-        if np.allclose(rolled_poly1, poly2):
-            return True
-    return False
-
-
-class TestTransforms(unittest.TestCase):
-    def setUp(self):
-        setup_logger()
-
-    def test_apply_rotated_boxes(self):
-        np.random.seed(125)
-        cfg = get_cfg()
-        is_train = True
-        augs = detection_utils.build_augmentation(cfg, is_train)
-        image = np.random.rand(200, 300)
-        image, transforms = T.apply_augmentations(augs, image)
-        image_shape = image.shape[:2]  # h, w
-        assert image_shape == (800, 1200)
-        annotation = {"bbox": [179, 97, 62, 40, -56]}
-
-        boxes = np.array([annotation["bbox"]], dtype=np.float64)  # boxes.shape = (1, 5)
-        transformed_bbox = transforms.apply_rotated_box(boxes)[0]
-
-        expected_bbox = np.array([484, 388, 248, 160, 56], dtype=np.float64)
-        err_msg = "transformed_bbox = {}, expected {}".format(transformed_bbox, expected_bbox)
-        assert np.allclose(transformed_bbox, expected_bbox), err_msg
-
-    def test_resize_and_crop(self):
-        np.random.seed(125)
-        min_scale = 0.2
-        max_scale = 2.0
-        target_height = 1100
-        target_width = 1000
-        resize_aug = T.ResizeScale(min_scale, max_scale, target_height, target_width)
-        fixed_size_crop_aug = T.FixedSizeCrop((target_height, target_width))
-        hflip_aug = T.RandomFlip()
-        augs = [resize_aug, fixed_size_crop_aug, hflip_aug]
-        original_image = np.random.rand(900, 800)
-        image, transforms = T.apply_augmentations(augs, original_image)
-        image_shape = image.shape[:2]  # h, w
-        self.assertEqual((1100, 1000), image_shape)
-
-        boxes = np.array(
-            [[91, 46, 144, 111], [523, 251, 614, 295]],
-            dtype=np.float64,
-        )
-        transformed_bboxs = transforms.apply_box(boxes)
-        expected_bboxs = np.array(
-            [
-                [895.42, 33.42666667, 933.91125, 80.66],
-                [554.0825, 182.39333333, 620.17125, 214.36666667],
-            ],
-            dtype=np.float64,
-        )
-        err_msg = "transformed_bbox = {}, expected {}".format(transformed_bboxs, expected_bboxs)
-        self.assertTrue(np.allclose(transformed_bboxs, expected_bboxs), err_msg)
-
-        polygon = np.array([[91, 46], [144, 46], [144, 111], [91, 111]])
-        transformed_polygons = transforms.apply_polygons([polygon])
-        expected_polygon = np.array([[934.0, 33.0], [934.0, 80.0], [896.0, 80.0], [896.0, 33.0]])
-        self.assertEqual(1, len(transformed_polygons))
-        err_msg = "transformed_polygon = {}, expected {}".format(
-            transformed_polygons[0], expected_polygon
-        )
-        self.assertTrue(polygon_allclose(transformed_polygons[0], expected_polygon), err_msg)
-
-    def test_apply_rotated_boxes_unequal_scaling_factor(self):
-        np.random.seed(125)
-        h, w = 400, 200
-        newh, neww = 800, 800
-        image = np.random.rand(h, w)
-        augs = []
-        augs.append(T.Resize(shape=(newh, neww)))
-        image, transforms = T.apply_augmentations(augs, image)
-        image_shape = image.shape[:2]  # h, w
-        assert image_shape == (newh, neww)
-
-        boxes = np.array(
-            [
-                [150, 100, 40, 20, 0],
-                [150, 100, 40, 20, 30],
-                [150, 100, 40, 20, 90],
-                [150, 100, 40, 20, -90],
-            ],
-            dtype=np.float64,
-        )
-        transformed_boxes = transforms.apply_rotated_box(boxes)
-
-        expected_bboxes = np.array(
-            [
-                [600, 200, 160, 40, 0],
-                [600, 200, 144.22205102, 52.91502622, 49.10660535],
-                [600, 200, 80, 80, 90],
-                [600, 200, 80, 80, -90],
-            ],
-            dtype=np.float64,
-        )
-        err_msg = "transformed_boxes = {}, expected {}".format(transformed_boxes, expected_bboxes)
-        assert np.allclose(transformed_boxes, expected_bboxes), err_msg
-
-    def test_print_augmentation(self):
-        t = T.RandomCrop("relative", (100, 100))
-        self.assertEqual(str(t), "RandomCrop(crop_type='relative', crop_size=(100, 100))")
-
-        t0 = T.RandomFlip(prob=0.5)
-        self.assertEqual(str(t0), "RandomFlip(prob=0.5)")
-
-        t1 = T.RandomFlip()
-        self.assertEqual(str(t1), "RandomFlip()")
-
-        t = T.AugmentationList([t0, t1])
-        self.assertEqual(str(t), f"AugmentationList[{t0}, {t1}]")
-
-    def test_random_apply_prob_out_of_range_check(self):
-        test_probabilities = {0.0: True, 0.5: True, 1.0: True, -0.01: False, 1.01: False}
-
-        for given_probability, is_valid in test_probabilities.items():
-            if not is_valid:
-                self.assertRaises(AssertionError, T.RandomApply, None, prob=given_probability)
-            else:
-                T.RandomApply(T.NoOpTransform(), prob=given_probability)
-
-    def test_random_apply_wrapping_aug_probability_occured_evaluation(self):
-        transform_mock = mock.MagicMock(name="MockTransform", spec=T.Augmentation)
-        image_mock = mock.MagicMock(name="MockImage")
-        random_apply = T.RandomApply(transform_mock, prob=0.001)
-
-        with mock.patch.object(random_apply, "_rand_range", return_value=0.0001):
-            transform = random_apply.get_transform(image_mock)
-            transform_mock.get_transform.assert_called_once_with(image_mock)
-            self.assertIsNot(transform, transform_mock)
-
-    def test_random_apply_wrapping_std_transform_probability_occured_evaluation(self):
-        transform_mock = mock.MagicMock(name="MockTransform", spec=T.Transform)
-        image_mock = mock.MagicMock(name="MockImage")
-        random_apply = T.RandomApply(transform_mock, prob=0.001)
-
-        with mock.patch.object(random_apply, "_rand_range", return_value=0.0001):
-            transform = random_apply.get_transform(image_mock)
-            self.assertIs(transform, transform_mock)
-
-    def test_random_apply_probability_not_occured_evaluation(self):
-        transform_mock = mock.MagicMock(name="MockTransform", spec=T.Augmentation)
-        image_mock = mock.MagicMock(name="MockImage")
-        random_apply = T.RandomApply(transform_mock, prob=0.001)
-
-        with mock.patch.object(random_apply, "_rand_range", return_value=0.9):
-            transform = random_apply.get_transform(image_mock)
-            transform_mock.get_transform.assert_not_called()
-            self.assertIsInstance(transform, T.NoOpTransform)
-
-    def test_augmentation_input_args(self):
-        input_shape = (100, 100)
-        output_shape = (50, 50)
-
-        # define two augmentations with different args
-        class TG1(T.Augmentation):
-            def get_transform(self, image, sem_seg):
-                return T.ResizeTransform(
-                    input_shape[0], input_shape[1], output_shape[0], output_shape[1]
-                )
-
-        class TG2(T.Augmentation):
-            def get_transform(self, image):
-                assert image.shape[:2] == output_shape  # check that TG1 is applied
-                return T.HFlipTransform(output_shape[1])
-
-        image = np.random.rand(*input_shape).astype("float32")
-        sem_seg = (np.random.rand(*input_shape) < 0.5).astype("uint8")
-        inputs = T.AugInput(image, sem_seg=sem_seg)  # provide two args
-        tfms = inputs.apply_augmentations([TG1(), TG2()])
-        self.assertIsInstance(tfms[0], T.ResizeTransform)
-        self.assertIsInstance(tfms[1], T.HFlipTransform)
-        self.assertTrue(inputs.image.shape[:2] == output_shape)
-        self.assertTrue(inputs.sem_seg.shape[:2] == output_shape)
-
-        class TG3(T.Augmentation):
-            def get_transform(self, image, nonexist):
-                pass
-
-        with self.assertRaises(AttributeError):
-            inputs.apply_augmentations([TG3()])
-
-    def test_augmentation_list(self):
-        input_shape = (100, 100)
-        image = np.random.rand(*input_shape).astype("float32")
-        sem_seg = (np.random.rand(*input_shape) < 0.5).astype("uint8")
-        inputs = T.AugInput(image, sem_seg=sem_seg)  # provide two args
-
-        augs = T.AugmentationList([T.RandomFlip(), T.Resize(20)])
-        _ = T.AugmentationList([augs, T.Resize(30)])(inputs)
-        # 3 in latest fvcore (flattened transformlist), 2 in older
-        # self.assertEqual(len(tfms), 3)
-
-    def test_color_transforms(self):
-        rand_img = np.random.random((100, 100, 3)) * 255
-        rand_img = rand_img.astype("uint8")
-
-        # Test no-op
-        noop_transform = T.ColorTransform(lambda img: img)
-        self.assertTrue(np.array_equal(rand_img, noop_transform.apply_image(rand_img)))
-
-        # Test a ImageOps operation
-        magnitude = np.random.randint(0, 256)
-        solarize_transform = T.PILColorTransform(lambda img: ImageOps.solarize(img, magnitude))
-        expected_img = ImageOps.solarize(Image.fromarray(rand_img), magnitude)
-        self.assertTrue(np.array_equal(expected_img, solarize_transform.apply_image(rand_img)))
-
-    def test_resize_transform(self):
-        input_shapes = [(100, 100), (100, 100, 1), (100, 100, 3)]
-        output_shapes = [(200, 200), (200, 200, 1), (200, 200, 3)]
-        for in_shape, out_shape in zip(input_shapes, output_shapes):
-            in_img = np.random.randint(0, 255, size=in_shape, dtype=np.uint8)
-            tfm = T.ResizeTransform(in_shape[0], in_shape[1], out_shape[0], out_shape[1])
-            out_img = tfm.apply_image(in_img)
-            self.assertEqual(out_img.shape, out_shape)
-
-    def test_resize_shorted_edge_scriptable(self):
-        def f(image):
-            newh, neww = T.ResizeShortestEdge.get_output_shape(
-                image.shape[-2], image.shape[-1], 80, 133
-            )
-            return F.interpolate(image.unsqueeze(0), size=(newh, neww))
-
-        input = torch.randn(3, 10, 10)
-        script_f = torch.jit.script(f)
-        self.assertTrue(torch.allclose(f(input), script_f(input)))
-
-        # generalize to new shapes
-        input = torch.randn(3, 8, 100)
-        self.assertTrue(torch.allclose(f(input), script_f(input)))
-
-    def test_extent_transform(self):
-        input_shapes = [(100, 100), (100, 100, 1), (100, 100, 3)]
-        src_rect = (20, 20, 80, 80)
-        output_shapes = [(200, 200), (200, 200, 1), (200, 200, 3)]
-        for in_shape, out_shape in zip(input_shapes, output_shapes):
-            in_img = np.random.randint(0, 255, size=in_shape, dtype=np.uint8)
-            tfm = T.ExtentTransform(src_rect, out_shape[:2])
-            out_img = tfm.apply_image(in_img)
-            self.assertTrue(out_img.shape == out_shape)
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/__init__.py
deleted file mode 100755
index e69de29b..00000000
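The RandomApply tests deleted above describe its contract: it either applies the wrapped augmentation or returns a NoOpTransform, with `prob` validated to lie in [0, 1]. A usage sketch under the assumption that detectron2's transforms module is available:

```python
import numpy as np
from detectron2.data import transforms as T

aug = T.RandomApply(T.RandomFlip(prob=1.0), prob=0.3)  # flip ~30% of the time
image = (np.random.rand(32, 32, 3) * 255).astype("uint8")
tfm = aug.get_transform(image)     # T.HFlipTransform or T.NoOpTransform
out = tfm.apply_image(image)       # NoOpTransform leaves the image unchanged
print(type(tfm).__name__, out.shape)
```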
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py
deleted file mode 100755
index 5a0488ad..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_blocks.py
+++ /dev/null
@@ -1,51 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import unittest
-import torch
-from torch import nn
-
-from detectron2.layers import ASPP, DepthwiseSeparableConv2d, FrozenBatchNorm2d
-from detectron2.modeling.backbone.resnet import BasicStem, ResNet
-
-
-"""
-Test for misc layers.
-"""
-
-
-class TestBlocks(unittest.TestCase):
-    def test_separable_conv(self):
-        DepthwiseSeparableConv2d(3, 10, norm1="BN", activation1=nn.PReLU())
-
-    def test_aspp(self):
-        m = ASPP(3, 10, [2, 3, 4], norm="", activation=nn.PReLU())
-        self.assertIsNot(m.convs[0].activation.weight, m.convs[1].activation.weight)
-        self.assertIsNot(m.convs[0].activation.weight, m.project.activation.weight)
-
-    @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available")
-    def test_frozen_batchnorm_fp16(self):
-        from torch.cuda.amp import autocast
-
-        C = 10
-        input = torch.rand(1, C, 10, 10).cuda()
-        m = FrozenBatchNorm2d(C).cuda()
-        with autocast():
-            output = m(input.half())
-        self.assertEqual(output.dtype, torch.float16)
-
-        # requires_grad triggers a different codepath
-        input.requires_grad_()
-        with autocast():
-            output = m(input.half())
-        self.assertEqual(output.dtype, torch.float16)
-
-    def test_resnet_unused_stages(self):
-        resnet = ResNet(BasicStem(), ResNet.make_default_stages(18), out_features=["res2"])
-        self.assertTrue(hasattr(resnet, "res2"))
-        self.assertFalse(hasattr(resnet, "res3"))
-        self.assertFalse(hasattr(resnet, "res5"))
-
-        resnet = ResNet(BasicStem(), ResNet.make_default_stages(18), out_features=["res2", "res5"])
-        self.assertTrue(hasattr(resnet, "res2"))
-        self.assertTrue(hasattr(resnet, "res4"))
-        self.assertTrue(hasattr(resnet, "res5"))
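FrozenBatchNorm2d, exercised by the fp16 test above, keeps its statistics fixed and is typically swapped in over a pretrained backbone. A sketch using detectron2's documented converter (assumes detectron2 and plain torch BN modules):

```python
import torch
from torch import nn
from detectron2.layers import FrozenBatchNorm2d

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8), nn.ReLU())
model = FrozenBatchNorm2d.convert_frozen_batchnorm(model)  # BatchNorm2d -> FrozenBatchNorm2d
with torch.no_grad():
    y = model(torch.rand(1, 3, 16, 16))
print(type(model[1]).__name__, y.shape)  # FrozenBatchNorm2d torch.Size([1, 8, 16, 16])
```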
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py
deleted file mode 100755
index 4aa319fc..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_deformable.py
+++ /dev/null
@@ -1,175 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import numpy as np
-import unittest
-import torch
-
-from detectron2.layers import DeformConv, ModulatedDeformConv
-from detectron2.utils.env import TORCH_VERSION
-
-
-@unittest.skipIf(
-    TORCH_VERSION == (1, 8) and torch.cuda.is_available(),
-    "This test fails under cuda11 + torch1.8.",
-)
-class DeformableTest(unittest.TestCase):
-    @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu")
-    def test_forward_output(self):
-        device = torch.device("cuda")
-        N, C, H, W = shape = 1, 1, 5, 5
-        kernel_size = 3
-        padding = 1
-
-        inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape).to(device)
-        """
-        0  1  2  3  4
-        5  6  7  8  9
-        10 11 12 13 14
-        15 16 17 18 19
-        20 21 22 23 24
-        """
-        offset_channels = kernel_size * kernel_size * 2
-        offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32).to(device)
-
-        # Test DCN v1
-        deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device)
-        deform.weight = torch.nn.Parameter(torch.ones_like(deform.weight))
-        output = deform(inputs, offset)
-        output = output.detach().cpu().numpy()
-        deform_results = np.array(
-            [
-                [30, 41.25, 48.75, 45, 28.75],
-                [62.25, 81, 90, 80.25, 50.25],
-                [99.75, 126, 135, 117.75, 72.75],
-                [105, 131.25, 138.75, 120, 73.75],
-                [71.75, 89.25, 93.75, 80.75, 49.5],
-            ]
-        )
-        self.assertTrue(np.allclose(output.flatten(), deform_results.flatten()))
-
-        # Test DCN v2
-        mask_channels = kernel_size * kernel_size
-        mask = torch.full((N, mask_channels, H, W), 0.5, dtype=torch.float32).to(device)
-        modulate_deform = ModulatedDeformConv(C, C, kernel_size, padding=padding, bias=False).to(
-            device
-        )
-        modulate_deform.weight = deform.weight
-        output = modulate_deform(inputs, offset, mask)
-        output = output.detach().cpu().numpy()
-        self.assertTrue(np.allclose(output.flatten(), deform_results.flatten() * 0.5))
-
-    def test_forward_output_on_cpu(self):
-        device = torch.device("cpu")
-        N, C, H, W = shape = 1, 1, 5, 5
-        kernel_size = 3
-        padding = 1
-
-        inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape).to(device)
-
-        offset_channels = kernel_size * kernel_size * 2
-        offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32).to(device)
-
-        # Test DCN v1 on cpu
-        deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device)
-        deform.weight = torch.nn.Parameter(torch.ones_like(deform.weight))
-        output = deform(inputs, offset)
-        output = output.detach().cpu().numpy()
-        deform_results = np.array(
-            [
-                [30, 41.25, 48.75, 45, 28.75],
-                [62.25, 81, 90, 80.25, 50.25],
-                [99.75, 126, 135, 117.75, 72.75],
-                [105, 131.25, 138.75, 120, 73.75],
-                [71.75, 89.25, 93.75, 80.75, 49.5],
-            ]
-        )
-        self.assertTrue(np.allclose(output.flatten(), deform_results.flatten()))
-
-    @unittest.skipIf(not torch.cuda.is_available(), "This test requires gpu access")
-    def test_forward_output_on_cpu_equals_output_on_gpu(self):
-        N, C, H, W = shape = 2, 4, 10, 10
-        kernel_size = 3
-        padding = 1
-
-        for groups in [1, 2]:
-            inputs = torch.arange(np.prod(shape), dtype=torch.float32).reshape(*shape)
-            offset_channels = kernel_size * kernel_size * 2
-            offset = torch.full((N, offset_channels, H, W), 0.5, dtype=torch.float32)
-
-            deform_gpu = DeformConv(
-                C, C, kernel_size=kernel_size, padding=padding, groups=groups
-            ).to("cuda")
-            deform_gpu.weight = torch.nn.Parameter(torch.ones_like(deform_gpu.weight))
-            output_gpu = deform_gpu(inputs.to("cuda"), offset.to("cuda")).detach().cpu().numpy()
-
-            deform_cpu = DeformConv(
-                C, C, kernel_size=kernel_size, padding=padding, groups=groups
-            ).to("cpu")
-            deform_cpu.weight = torch.nn.Parameter(torch.ones_like(deform_cpu.weight))
-            output_cpu = deform_cpu(inputs.to("cpu"), offset.to("cpu")).detach().numpy()
-
-            self.assertTrue(np.allclose(output_gpu.flatten(), output_cpu.flatten()))
-
-    @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu")
-    def test_small_input(self):
-        device = torch.device("cuda")
-        for kernel_size in [3, 5]:
-            padding = kernel_size // 2
-            N, C, H, W = shape = (1, 1, kernel_size - 1, kernel_size - 1)
-
-            inputs = torch.rand(shape).to(device)  # input size is smaller than kernel size
-
-            offset_channels = kernel_size * kernel_size * 2
-            offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device)
-            deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device)
-            output = deform(inputs, offset)
-            self.assertTrue(output.shape == inputs.shape)
-
-            mask_channels = kernel_size * kernel_size
-            mask = torch.ones((N, mask_channels, H, W), dtype=torch.float32).to(device)
-            modulate_deform = ModulatedDeformConv(
-                C, C, kernel_size, padding=padding, bias=False
-            ).to(device)
-            output = modulate_deform(inputs, offset, mask)
-            self.assertTrue(output.shape == inputs.shape)
-
-    @unittest.skipIf(not torch.cuda.is_available(), "Deformable not supported for cpu")
-    def test_raise_exception(self):
-        device = torch.device("cuda")
-        N, C, H, W = shape = 1, 1, 3, 3
-        kernel_size = 3
-        padding = 1
-
-        inputs = torch.rand(shape, dtype=torch.float32).to(device)
-        offset_channels = kernel_size * kernel_size  # wrong number of channels for offset
-        offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device)
-        deform = DeformConv(C, C, kernel_size=kernel_size, padding=padding).to(device)
-        self.assertRaises(RuntimeError, deform, inputs, offset)
-
-        offset_channels = kernel_size * kernel_size * 2
-        offset = torch.randn((N, offset_channels, H, W), dtype=torch.float32).to(device)
-        mask_channels = kernel_size * kernel_size * 2  # wrong number of channels for mask
-        mask = torch.ones((N, mask_channels, H, W), dtype=torch.float32).to(device)
-        modulate_deform = ModulatedDeformConv(C, C, kernel_size, padding=padding, bias=False).to(
-            device
-        )
-        self.assertRaises(RuntimeError, modulate_deform, inputs, offset, mask)
-
-    def test_repr(self):
-        module = DeformConv(3, 10, kernel_size=3, padding=1, deformable_groups=2)
-        correct_string = (
-            "DeformConv(in_channels=3, out_channels=10, kernel_size=(3, 3), "
-            "stride=(1, 1), padding=(1, 1), dilation=(1, 1), "
-            "groups=1, deformable_groups=2, bias=False)"
-        )
-        self.assertEqual(repr(module), correct_string)
-
-        module = ModulatedDeformConv(3, 10, kernel_size=3, padding=1, deformable_groups=2)
-        correct_string = (
-            "ModulatedDeformConv(in_channels=3, out_channels=10, kernel_size=(3, 3), "
-            "stride=1, padding=1, dilation=1, groups=1, deformable_groups=2, bias=True)"
-        )
-        self.assertEqual(repr(module), correct_string)
-
-
-if __name__ == "__main__":
-    unittest.main()
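The channel bookkeeping behind the deleted DeformConv tests: for a k x k kernel, DCNv1 expects an offset map with 2*k*k channels per deformable group (a (dx, dy) pair per sampling location), and DCNv2 additionally takes a k*k-channel modulation mask. A CUDA-gated sketch of correctly shaped inputs (these ops have no CPU kernel for ModulatedDeformConv, as the tests above note):

```python
import torch
from detectron2.layers import DeformConv, ModulatedDeformConv

if torch.cuda.is_available():
    k, C, H, W = 3, 4, 10, 10
    x = torch.rand(1, C, H, W, device="cuda")
    offset = torch.zeros(1, 2 * k * k, H, W, device="cuda")  # zero offsets = plain conv sampling
    mask = torch.ones(1, k * k, H, W, device="cuda")         # all-ones mask = no modulation
    v1 = DeformConv(C, C, kernel_size=k, padding=k // 2).cuda()
    v2 = ModulatedDeformConv(C, C, kernel_size=k, padding=k // 2, bias=False).cuda()
    print(v1(x, offset).shape, v2(x, offset, mask).shape)
```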
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py
deleted file mode 100755
index d7492024..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_losses.py
+++ /dev/null
@@ -1,82 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import numpy as np
-import unittest
-import torch
-
-from detectron2.layers import ciou_loss, diou_loss
-
-
-class TestLosses(unittest.TestCase):
-    def test_diou_loss(self):
-        """
-        loss = 1 - iou + d/c
-        where,
-        d = (distance between centers of the 2 boxes)^2
-        c = (diagonal length of the smallest enclosing box covering the 2 boxes)^2
-        """
-        # Identical boxes should have loss of 0
-        box = torch.tensor([-1, -1, 1, 1], dtype=torch.float32)
-        loss = diou_loss(box, box)
-        self.assertTrue(np.allclose(loss, [0.0]))
-
-        # Half size box inside other box
-        # iou = 0.5, d = 0.25, c = 8
-        box2 = torch.tensor([0, -1, 1, 1], dtype=torch.float32)
-        loss = diou_loss(box, box2)
-        self.assertTrue(np.allclose(loss, [0.53125]))
-
-        # Two diagonally adjacent boxes
-        # iou = 0, d = 2, c = 8
-        box3 = torch.tensor([0, 0, 1, 1], dtype=torch.float32)
-        box4 = torch.tensor([1, 1, 2, 2], dtype=torch.float32)
-        loss = diou_loss(box3, box4)
-        self.assertTrue(np.allclose(loss, [1.25]))
-
-        # Test batched loss and reductions
-        box1s = torch.stack([box, box3], dim=0)
-        box2s = torch.stack([box2, box4], dim=0)
-
-        loss = diou_loss(box1s, box2s, reduction="sum")
-        self.assertTrue(np.allclose(loss, [1.78125]))
-
-        loss = diou_loss(box1s, box2s, reduction="mean")
-        self.assertTrue(np.allclose(loss, [0.890625]))
-
-    def test_ciou_loss(self):
-        """
-        loss = 1 - iou + d/c + alpha*v
-        where,
-        d = (distance between centers of the 2 boxes)^2
-        c = (diagonal length of the smallest enclosing box covering the 2 boxes)^2
-        v = (4/pi^2) * (arctan(box1_w/box1_h) - arctan(box2_w/box2_h))^2
-        alpha = v/(1 - iou + v)
-        """
-        # Identical boxes should have loss of 0
-        box = torch.tensor([-1, -1, 1, 1], dtype=torch.float32)
-        loss = ciou_loss(box, box)
-        self.assertTrue(np.allclose(loss, [0.0]))
-
-        # Half size box inside other box
-        # iou = 0.5, d = 0.25, c = 8
-        # v = (4/pi^2) * (arctan(1) - arctan(0.5))^2 = 0.042
-        # alpha = 0.0775
-        box2 = torch.tensor([0, -1, 1, 1], dtype=torch.float32)
-        loss = ciou_loss(box, box2)
-        self.assertTrue(np.allclose(loss, [0.5345]))
-
-        # Two diagonally adjacent boxes
-        # iou = 0, d = 2, c = 8, v = 0, alpha = 0
-        box3 = torch.tensor([0, 0, 1, 1], dtype=torch.float32)
-        box4 = torch.tensor([1, 1, 2, 2], dtype=torch.float32)
-        loss = ciou_loss(box3, box4)
-        self.assertTrue(np.allclose(loss, [1.25]))
-
-        # Test batched loss and reductions
-        box1s = torch.stack([box, box3], dim=0)
-        box2s = torch.stack([box2, box4], dim=0)
-
-        loss = ciou_loss(box1s, box2s, reduction="sum")
-        self.assertTrue(np.allclose(loss, [1.7845]))
-
-        loss = ciou_loss(box1s, box2s, reduction="mean")
-        self.assertTrue(np.allclose(loss, [0.89225]))
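The magic numbers in the deleted DIoU test follow directly from the docstring's formula, loss = 1 - IoU + d/c. Recomputing the "half size box inside other box" case by hand (pure-Python sketch):

```python
box1 = (-1, -1, 1, 1)   # area 4, center (0, 0)
box2 = (0, -1, 1, 1)    # area 2, center (0.5, 0), fully inside box1
iou = 2 / 4                          # intersection 2, union 4
d = (0.5 - 0.0) ** 2 + 0.0 ** 2      # squared distance between centers = 0.25
c = 2 ** 2 + 2 ** 2                  # squared diagonal of enclosing box [-1,-1,1,1] = 8
print(1 - iou + d / c)               # 0.53125, the value asserted above
```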
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py
deleted file mode 100755
index 162c449c..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_mask_ops.py
+++ /dev/null
@@ -1,202 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import contextlib
-import io
-import numpy as np
-import unittest
-from collections import defaultdict
-import torch
-import tqdm
-from fvcore.common.benchmark import benchmark
-from pycocotools.coco import COCO
-from tabulate import tabulate
-from torch.nn import functional as F
-
-from detectron2.data import MetadataCatalog
-from detectron2.layers.mask_ops import (
-    pad_masks,
-    paste_mask_in_image_old,
-    paste_masks_in_image,
-    scale_boxes,
-)
-from detectron2.structures import BitMasks, Boxes, BoxMode, PolygonMasks
-from detectron2.structures.masks import polygons_to_bitmask
-from detectron2.utils.file_io import PathManager
-from detectron2.utils.testing import random_boxes
-
-
-def iou_between_full_image_bit_masks(a, b):
-    intersect = (a & b).sum()
-    union = (a | b).sum()
-    return intersect / union
-
-
-def rasterize_polygons_with_grid_sample(full_image_bit_mask, box, mask_size, threshold=0.5):
-    x0, y0, x1, y1 = box[0], box[1], box[2], box[3]
-
-    img_h, img_w = full_image_bit_mask.shape
-
-    mask_y = np.arange(0.0, mask_size) + 0.5  # mask y sample coords in [0.5, mask_size - 0.5]
-    mask_x = np.arange(0.0, mask_size) + 0.5  # mask x sample coords in [0.5, mask_size - 0.5]
-    mask_y = mask_y / mask_size * (y1 - y0) + y0
-    mask_x = mask_x / mask_size * (x1 - x0) + x0
-
-    mask_x = (mask_x - 0.5) / (img_w - 1) * 2 + -1
-    mask_y = (mask_y - 0.5) / (img_h - 1) * 2 + -1
-    gy, gx = torch.meshgrid(torch.from_numpy(mask_y), torch.from_numpy(mask_x))
-    ind = torch.stack([gx, gy], dim=-1).to(dtype=torch.float32)
-
-    full_image_bit_mask = torch.from_numpy(full_image_bit_mask)
-    mask = F.grid_sample(
-        full_image_bit_mask[None, None, :, :].to(dtype=torch.float32),
-        ind[None, :, :, :],
-        align_corners=True,
-    )
-
-    return mask[0, 0] >= threshold
-
-
-class TestMaskCropPaste(unittest.TestCase):
-    def setUp(self):
-        json_file = MetadataCatalog.get("coco_2017_val_100").json_file
-        if not PathManager.isfile(json_file):
-            raise unittest.SkipTest("{} not found".format(json_file))
-        with contextlib.redirect_stdout(io.StringIO()):
-            json_file = PathManager.get_local_path(json_file)
-            self.coco = COCO(json_file)
-
-    def test_crop_paste_consistency(self):
-        """
-        rasterize_polygons_within_box (used in training)
-        and
-        paste_masks_in_image (used in inference)
-        should be inverse operations to each other.
-
-        This function runs several implementations of the above two operations and prints
-        the reconstruction error.
-        """
-
-        anns = self.coco.loadAnns(self.coco.getAnnIds(iscrowd=False))  # avoid crowd annotations
-
-        selected_anns = anns[:100]
-
-        ious = []
-        for ann in tqdm.tqdm(selected_anns):
-            results = self.process_annotation(ann)
-            ious.append([k[2] for k in results])
-
-        ious = np.array(ious)
-        mean_ious = ious.mean(axis=0)
-        table = []
-        res_dic = defaultdict(dict)
-        for row, iou in zip(results, mean_ious):
-            table.append((row[0], row[1], iou))
-            res_dic[row[0]][row[1]] = iou
-        print(tabulate(table, headers=["rasterize", "paste", "iou"], tablefmt="simple"))
-        # assert that the reconstruction is good:
-        self.assertTrue(res_dic["polygon"]["aligned"] > 0.94)
-        self.assertTrue(res_dic["roialign"]["aligned"] > 0.95)
-
-    def process_annotation(self, ann, mask_side_len=28):
-        # Parse annotation data
-        img_info = self.coco.loadImgs(ids=[ann["image_id"]])[0]
-        height, width = img_info["height"], img_info["width"]
-        gt_polygons = [np.array(p, dtype=np.float64) for p in ann["segmentation"]]
-        gt_bbox = BoxMode.convert(ann["bbox"], BoxMode.XYWH_ABS, BoxMode.XYXY_ABS)
-        gt_bit_mask = polygons_to_bitmask(gt_polygons, height, width)
-
-        # Run rasterize ..
-        torch_gt_bbox = torch.tensor(gt_bbox).to(dtype=torch.float32).reshape(-1, 4)
-        box_bitmasks = {
-            "polygon": PolygonMasks([gt_polygons]).crop_and_resize(torch_gt_bbox, mask_side_len)[0],
-            "gridsample": rasterize_polygons_with_grid_sample(gt_bit_mask, gt_bbox, mask_side_len),
-            "roialign": BitMasks(torch.from_numpy(gt_bit_mask[None, :, :])).crop_and_resize(
-                torch_gt_bbox, mask_side_len
-            )[0],
-        }
-
-        # Run paste ..
-        results = defaultdict(dict)
-        for k, box_bitmask in box_bitmasks.items():
-            padded_bitmask, scale = pad_masks(box_bitmask[None, :, :], 1)
-            scaled_boxes = scale_boxes(torch_gt_bbox, scale)
-
-            r = results[k]
-            r["old"] = paste_mask_in_image_old(
-                padded_bitmask[0], scaled_boxes[0], height, width, threshold=0.5
-            )
-            r["aligned"] = paste_masks_in_image(
-                box_bitmask[None, :, :], Boxes(torch_gt_bbox), (height, width)
-            )[0]
-
-        table = []
-        for rasterize_method, r in results.items():
-            for paste_method, mask in r.items():
-                mask = np.asarray(mask)
-                iou = iou_between_full_image_bit_masks(gt_bit_mask.astype("uint8"), mask)
-                table.append((rasterize_method, paste_method, iou))
-        return table
-
-    def test_polygon_area(self):
-        # Draw polygon boxes
-        for d in [5.0, 10.0, 1000.0]:
-            polygon = PolygonMasks([[[0, 0, 0, d, d, d, d, 0]]])
-            area = polygon.area()[0]
-            target = d ** 2
-            self.assertEqual(area, target)
-
-        # Draw polygon triangles
-        for d in [5.0, 10.0, 1000.0]:
-            polygon = PolygonMasks([[[0, 0, 0, d, d, d]]])
-            area = polygon.area()[0]
-            target = d ** 2 / 2
-            self.assertEqual(area, target)
-
-    def test_paste_mask_scriptable(self):
-        scripted_f = torch.jit.script(paste_masks_in_image)
-        N = 10
-        masks = torch.rand(N, 28, 28)
-        boxes = Boxes(random_boxes(N, 100)).tensor
-        image_shape = (150, 150)
-
-        out = paste_masks_in_image(masks, boxes, image_shape)
-        scripted_out = scripted_f(masks, boxes, image_shape)
-        self.assertTrue(torch.equal(out, scripted_out))
-
-
-def benchmark_paste():
-    S = 800
-    H, W = image_shape = (S, S)
-    N = 64
-    torch.manual_seed(42)
-    masks = torch.rand(N, 28, 28)
-
-    center = torch.rand(N, 2) * 600 + 100
-    wh = torch.clamp(torch.randn(N, 2) * 40 + 200, min=50)
-    x0y0 = torch.clamp(center - wh * 0.5, min=0.0)
-    x1y1 = torch.clamp(center + wh * 0.5, max=S)
-    boxes = Boxes(torch.cat([x0y0, x1y1], axis=1))
-
-    def func(device, n=3):
-        m = masks.to(device=device)
-        b = boxes.to(device=device)
-
-        def bench():
-            for _ in range(n):
-                paste_masks_in_image(m, b, image_shape)
-            if device.type == "cuda":
-                torch.cuda.synchronize()
-
-        return bench
-
-    specs = [{"device": torch.device("cpu"), "n": 3}]
-    if torch.cuda.is_available():
-        specs.append({"device": torch.device("cuda"), "n": 3})
-
-    benchmark(func, "paste_masks", specs, num_iters=10, warmup_iters=2)
-
-
-if __name__ == "__main__":
-    benchmark_paste()
-    unittest.main()
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py
deleted file mode 100755
index a042db61..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_nms.py
+++ /dev/null
@@ -1,33 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-from __future__ import absolute_import, division, print_function, unicode_literals
-import unittest
-import torch
-
-from detectron2.layers import batched_nms
-from detectron2.utils.testing import random_boxes
-
-
-class TestNMS(unittest.TestCase):
-    def _create_tensors(self, N):
-        boxes = random_boxes(N, 200)
-        scores = torch.rand(N)
-        return boxes, scores
-
-    def test_nms_scriptability(self):
-        N = 2000
-        num_classes = 50
-        boxes, scores = self._create_tensors(N)
-        idxs = torch.randint(0, num_classes, (N,))
-        scripted_batched_nms = torch.jit.script(batched_nms)
-        err_msg = "NMS is incompatible with jit-scripted NMS for IoU={}"
-
-        for iou in [0.2, 0.5, 0.8]:
-            keep_ref = batched_nms(boxes, scores, idxs, iou)
-            backup = boxes.clone()
-            scripted_keep = scripted_batched_nms(boxes, scores, idxs, iou)
-            assert torch.allclose(boxes, backup), "boxes modified by jit-scripted batched_nms"
-            self.assertTrue(torch.equal(keep_ref, scripted_keep), err_msg.format(iou))
-
-
-if __name__ == "__main__":
-    unittest.main()
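batched_nms, whose scriptability the deleted test above checks, performs NMS per class: boxes with different `idxs` never suppress each other. A small sketch, assuming detectron2 is importable:

```python
import torch
from detectron2.layers import batched_nms

boxes = torch.tensor([[0, 0, 10, 10], [1, 1, 11, 11], [0, 0, 10, 10]], dtype=torch.float32)
scores = torch.tensor([0.9, 0.8, 0.7])
idxs = torch.tensor([0, 0, 1])  # the third box belongs to another class, so it survives
keep = batched_nms(boxes, scores, idxs, iou_threshold=0.5)
print(keep.tolist())  # [0, 2]: box 1 (IoU ~0.68 with box 0) is suppressed within class 0
```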
- """ - keep1, keep2 = keep1.cpu(), keep2.cpu() - if torch.equal(keep1, keep2): - # they should be equal most of the time - return 0 - keep1, keep2 = tuple(keep1), tuple(keep2) - m, n = len(keep1), len(keep2) - - # edit distance with DP - f = [np.arange(n + 1), np.arange(n + 1)] - for i in range(m): - cur_row = i % 2 - other_row = (i + 1) % 2 - f[other_row][0] = i + 1 - for j in range(n): - f[other_row][j + 1] = ( - f[cur_row][j] - if keep1[i] == keep2[j] - else min(min(f[cur_row][j], f[cur_row][j + 1]), f[other_row][j]) + 1 - ) - return f[m % 2][n] - - -class TestNMSRotated(unittest.TestCase): - def reference_horizontal_nms(self, boxes, scores, iou_threshold): - """ - Args: - box_scores (N, 5): boxes in corner-form and probabilities. - (Note here 5 == 4 + 1, i.e., 4-dim horizontal box + 1-dim prob) - iou_threshold: intersection over union threshold. - Returns: - picked: a list of indexes of the kept boxes - """ - picked = [] - _, indexes = scores.sort(descending=True) - while len(indexes) > 0: - current = indexes[0] - picked.append(current.item()) - if len(indexes) == 1: - break - current_box = boxes[current, :] - indexes = indexes[1:] - rest_boxes = boxes[indexes, :] - iou = ops.box_iou(rest_boxes, current_box.unsqueeze(0)).squeeze(1) - indexes = indexes[iou <= iou_threshold] - - return torch.as_tensor(picked) - - def _create_tensors(self, N, device="cpu"): - boxes = random_boxes(N, 200, device=device) - scores = torch.rand(N, device=device) - return boxes, scores - - def test_batched_nms_rotated_0_degree_cpu(self, device="cpu"): - N = 2000 - num_classes = 50 - boxes, scores = self._create_tensors(N, device=device) - idxs = torch.randint(0, num_classes, (N,)) - rotated_boxes = torch.zeros(N, 5, device=device) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - err_msg = "Rotated NMS with 0 degree is incompatible with horizontal NMS for IoU={}" - for iou in [0.2, 0.5, 0.8]: - backup = boxes.clone() - keep_ref = batched_nms(boxes, scores, idxs, iou) - assert torch.allclose(boxes, backup), "boxes modified by batched_nms" - backup = rotated_boxes.clone() - keep = batched_nms_rotated(rotated_boxes, scores, idxs, iou) - assert torch.allclose( - rotated_boxes, backup - ), "rotated_boxes modified by batched_nms_rotated" - # Occasionally the gap can be large if there are many IOU on the threshold boundary - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 5, err_msg.format(iou)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_batched_nms_rotated_0_degree_cuda(self): - self.test_batched_nms_rotated_0_degree_cpu(device="cuda") - - def test_nms_rotated_0_degree_cpu(self, device="cpu"): - N = 1000 - boxes, scores = self._create_tensors(N, device=device) - rotated_boxes = torch.zeros(N, 5, device=device) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not 
available") - def test_nms_rotated_0_degree_cuda(self): - self.test_nms_rotated_0_degree_cpu(device="cuda") - - def test_nms_rotated_90_degrees_cpu(self): - N = 1000 - boxes, scores = self._create_tensors(N) - rotated_boxes = torch.zeros(N, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - # Note for rotated_boxes[:, 2] and rotated_boxes[:, 3]: - # widths and heights are intentionally swapped here for 90 degrees case - # so that the reference horizontal nms could be used - rotated_boxes[:, 2] = boxes[:, 3] - boxes[:, 1] - rotated_boxes[:, 3] = boxes[:, 2] - boxes[:, 0] - - rotated_boxes[:, 4] = torch.ones(N) * 90 - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - def test_nms_rotated_180_degrees_cpu(self): - N = 1000 - boxes, scores = self._create_tensors(N) - rotated_boxes = torch.zeros(N, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - rotated_boxes[:, 4] = torch.ones(N) * 180 - err_msg = "Rotated NMS incompatible between CPU and reference implementation for IoU={}" - for iou in [0.2, 0.5, 0.8]: - keep_ref = self.reference_horizontal_nms(boxes, scores, iou) - keep = nms_rotated(rotated_boxes, scores, iou) - self.assertLessEqual(nms_edit_distance(keep, keep_ref), 1, err_msg.format(iou)) - - -class TestScriptable(unittest.TestCase): - def setUp(self): - class TestingModule(torch.nn.Module): - def forward(self, boxes, scores, threshold): - return nms_rotated(boxes, scores, threshold) - - self.module = TestingModule() - - def test_scriptable_cpu(self): - m = deepcopy(self.module).cpu() - _ = torch.jit.script(m) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_scriptable_cuda(self): - m = deepcopy(self.module).cuda() - _ = torch.jit.script(m) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py deleted file mode 100755 index b6fd8ede..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py +++ /dev/null @@ -1,210 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py b/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py
deleted file mode 100755
index b6fd8ede..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/layers/test_roi_align.py
+++ /dev/null
@@ -1,210 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
-import numpy as np
-import unittest
-from copy import copy
-import cv2
-import torch
-from fvcore.common.benchmark import benchmark
-from torch.nn import functional as F
-
-from detectron2.layers.roi_align import ROIAlign, roi_align
-
-
-class ROIAlignTest(unittest.TestCase):
-    def test_forward_output(self):
-        input = np.arange(25).reshape(5, 5).astype("float32")
-        """
-        0  1  2  3  4
-        5  6  7  8  9
-        10 11 12 13 14
-        15 16 17 18 19
-        20 21 22 23 24
-        """
-
-        output = self._simple_roialign(input, [1, 1, 3, 3], (4, 4), aligned=False)
-        output_correct = self._simple_roialign(input, [1, 1, 3, 3], (4, 4), aligned=True)
-
-        # without correction:
-        old_results = [
-            [7.5, 8, 8.5, 9],
-            [10, 10.5, 11, 11.5],
-            [12.5, 13, 13.5, 14],
-            [15, 15.5, 16, 16.5],
-        ]
-
-        # with 0.5 correction:
-        correct_results = [
-            [4.5, 5.0, 5.5, 6.0],
-            [7.0, 7.5, 8.0, 8.5],
-            [9.5, 10.0, 10.5, 11.0],
-            [12.0, 12.5, 13.0, 13.5],
-        ]
-        # This is an upsampled version of [[6, 7], [11, 12]]
-
-        self.assertTrue(np.allclose(output.flatten(), np.asarray(old_results).flatten()))
-        self.assertTrue(
-            np.allclose(output_correct.flatten(), np.asarray(correct_results).flatten())
-        )
-
-        # Also see similar issues in tensorflow at
-        # https://github.com/tensorflow/tensorflow/issues/26278
-
-    def test_resize(self):
-        H, W = 30, 30
-        input = np.random.rand(H, W).astype("float32") * 100
-        box = [10, 10, 20, 20]
-        output = self._simple_roialign(input, box, (5, 5), aligned=True)
-
-        input2x = cv2.resize(input, (W // 2, H // 2), interpolation=cv2.INTER_LINEAR)
-        box2x = [x / 2 for x in box]
-        output2x = self._simple_roialign(input2x, box2x, (5, 5), aligned=True)
-        diff = np.abs(output2x - output)
-        self.assertTrue(diff.max() < 1e-4)
-
-    def test_grid_sample_equivalence(self):
-        H, W = 30, 30
-        input = np.random.rand(H, W).astype("float32") * 100
-        box = [10, 10, 20, 20]
-        for ratio in [1, 2, 3]:
-            output = self._simple_roialign(input, box, (5, 5), sampling_ratio=ratio)
-            output_grid_sample = grid_sample_roi_align(
-                torch.from_numpy(input[None, None, :, :]).float(),
-                torch.as_tensor(box).float()[None, :],
-                5,
-                1.0,
-                ratio,
-            )
-            self.assertTrue(torch.allclose(output, output_grid_sample))
-
-    def _simple_roialign(self, img, box, resolution, sampling_ratio=0, aligned=True):
-        """
-        RoiAlign with scale 1.0.
-        """
-        if isinstance(resolution, int):
-            resolution = (resolution, resolution)
-        op = ROIAlign(resolution, 1.0, sampling_ratio, aligned=aligned)
-        input = torch.from_numpy(img[None, None, :, :].astype("float32"))
-
-        rois = [0] + list(box)
-        rois = torch.from_numpy(np.asarray(rois)[None, :].astype("float32"))
-        output = op.forward(input, rois)
-        if torch.cuda.is_available():
-            output_cuda = op.forward(input.cuda(), rois.cuda()).cpu()
-            self.assertTrue(torch.allclose(output, output_cuda))
-        return output[0, 0]
-
-    def _simple_roialign_with_grad(self, img, box, resolution, device):
-        if isinstance(resolution, int):
-            resolution = (resolution, resolution)
-
-        op = ROIAlign(resolution, 1.0, 0, aligned=True)
-        input = torch.from_numpy(img[None, None, :, :].astype("float32"))
-
-        rois = [0] + list(box)
-        rois = torch.from_numpy(np.asarray(rois)[None, :].astype("float32"))
-        input = input.to(device=device)
-        rois = rois.to(device=device)
-        input.requires_grad = True
-        output = op.forward(input, rois)
-        return input, output
-
-    def test_empty_box(self):
-        img = np.random.rand(5, 5)
-        box = [3, 4, 5, 4]
-        o = self._simple_roialign(img, box, 7)
-        self.assertTrue(o.shape == (7, 7))
-        self.assertTrue((o == 0).all())
-
-        for dev in ["cpu"] + ["cuda"] if torch.cuda.is_available() else []:
-            input, output = self._simple_roialign_with_grad(img, box, 7, torch.device(dev))
-            output.sum().backward()
-            self.assertTrue(torch.allclose(input.grad, torch.zeros_like(input)))
-
-    def test_empty_batch(self):
-        input = torch.zeros(0, 3, 10, 10, dtype=torch.float32)
-        rois = torch.zeros(0, 5, dtype=torch.float32)
-        op = ROIAlign((7, 7), 1.0, 0, aligned=True)
-        output = op.forward(input, rois)
-        self.assertTrue(output.shape == (0, 3, 7, 7))
-
-
-def grid_sample_roi_align(input, boxes, output_size, scale, sampling_ratio):
-    # unlike true roi_align, this does not support different batch_idx
-    from detectron2.projects.point_rend.point_features import (
-        generate_regular_grid_point_coords,
-        get_point_coords_wrt_image,
-        point_sample,
-    )
-
-    N, _, H, W = input.shape
-    R = len(boxes)
-    assert N == 1
-    boxes = boxes * scale
-    grid = generate_regular_grid_point_coords(R, output_size * sampling_ratio, device=boxes.device)
-    coords = get_point_coords_wrt_image(boxes, grid)
-    coords = coords / torch.as_tensor([W, H], device=coords.device)  # R, s^2, 2
-    res = point_sample(input, coords.unsqueeze(0), align_corners=False)  # 1,C, R,s^2
-    res = (
-        res.squeeze(0)
-        .permute(1, 0, 2)
-        .reshape(R, -1, output_size * sampling_ratio, output_size * sampling_ratio)
-    )
-    res = F.avg_pool2d(res, sampling_ratio)
-    return res
-
-
-def benchmark_roi_align():
-    def random_boxes(mean_box, stdev, N, maxsize):
-        ret = torch.rand(N, 4) * stdev + torch.tensor(mean_box, dtype=torch.float)
-        ret.clamp_(min=0, max=maxsize)
-        return ret
-
-    def func(shape, nboxes_per_img, sampling_ratio, device, box_size="large"):
-        N, _, H, _ = shape
-        input = torch.rand(*shape)
-        boxes = []
-        batch_idx = []
-        for k in range(N):
-            if box_size == "large":
-                b = random_boxes([80, 80, 130, 130], 24, nboxes_per_img, H)
-            else:
-                b = random_boxes([100, 100, 110, 110], 4, nboxes_per_img, H)
-            boxes.append(b)
-            batch_idx.append(torch.zeros(nboxes_per_img, 1, dtype=torch.float32) + k)
-        boxes = torch.cat(boxes, axis=0)
-        batch_idx = torch.cat(batch_idx, axis=0)
-        boxes = torch.cat([batch_idx, boxes], axis=1)
-
-        input = input.to(device=device)
-        boxes = boxes.to(device=device)
-
-        def bench():
-            if False and sampling_ratio > 0 and N == 1:
-                # enable to benchmark grid_sample (slower)
-                grid_sample_roi_align(input, boxes[:, 1:], 7, 1.0, sampling_ratio)
-            else:
-                roi_align(input, boxes, 7, 1.0, sampling_ratio, True)
-            if device == "cuda":
-                torch.cuda.synchronize()
-
-        return bench
-
-    def gen_args(arg):
-        args = []
-        for size in ["small", "large"]:
-            for ratio in [0, 2]:
-                args.append(copy(arg))
-                args[-1]["sampling_ratio"] = ratio
-                args[-1]["box_size"] = size
-        return args
-
-    arg = dict(shape=(1, 512, 256, 256), nboxes_per_img=512, device="cuda")
-    benchmark(func, "cuda_roialign", gen_args(arg), num_iters=20, warmup_iters=1)
-    arg.update({"device": "cpu", "shape": (1, 256, 128, 128)})
-    benchmark(func, "cpu_roialign", gen_args(arg), num_iters=5, warmup_iters=1)
-
-
-if __name__ == "__main__":
-    if torch.cuda.is_available():
-        benchmark_roi_align()
-    unittest.main()
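The `correct_results` grid in the deleted ROIAlign test comes from the aligned=True convention: pixel (i, j) is treated as a point at continuous coordinates (j + 0.5, i + 0.5), so a 4x4 pooling of box [1, 1, 3, 3] samples at x, y in {1.25, 1.75, 2.25, 2.75} and bilinearly interpolates the 5x5 ramp input. Recomputing the top-left sample by hand (pure-Python sketch):

```python
x, y = 1.25, 1.25                  # first bin center inside box [1, 1, 3, 3]
px, py = x - 0.5, y - 0.5          # back to pixel-index space: (0.75, 0.75)
v00, v01, v10, v11 = 0, 1, 5, 6    # four nearest pixels of arange(25).reshape(5, 5)
fx, fy = px % 1, py % 1
val = (v00 * (1 - fx) + v01 * fx) * (1 - fy) + (v10 * (1 - fx) + v11 * fx) * fy
print(val)  # 4.5, the first entry of correct_results above
```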
- # since the resolution is (4, 4) that divides [1, 3] x [1, 3] - # into 4 x 4 equal bins, - # the top-left bin is [1, 1.5] x [1, 1.5], and its center - # (1.25, 1.25) lies at the 3/4 position - # between point 0 and point 1, point 5 and point 6, - # point 0 and point 5, point 1 and point 6, so it can be calculated as - # 0.25*(0*0.25+1*0.75)+(5*0.25+6*0.75)*0.75 = 4.5 - result_expected = torch.tensor( - [ - [4.5, 5.0, 5.5, 6.0], - [7.0, 7.5, 8.0, 8.5], - [9.5, 10.0, 10.5, 11.0], - [12.0, 12.5, 13.0, 13.5], - ] - ) - # This is also an upsampled version of [[6, 7], [11, 12]] - - # When the box is rotated by 90 degrees CCW, - # the result would be rotated by 90 degrees CW, thus it's -i here - result_expected = self._rot90(result_expected, -i) - - assert torch.allclose(result, result_expected) - - def test_resize(self): - H, W = 30, 30 - input = torch.rand(H, W) * 100 - box = [10, 10, 20, 20] - rotated_box = self._box_to_rotated_box(box, angle=0) - output = self._simple_roi_align_rotated(img=input, box=rotated_box, resolution=(5, 5)) - - input2x = cv2.resize(input.numpy(), (W // 2, H // 2), interpolation=cv2.INTER_LINEAR) - input2x = torch.from_numpy(input2x) - box2x = [x / 2 for x in box] - rotated_box2x = self._box_to_rotated_box(box2x, angle=0) - output2x = self._simple_roi_align_rotated(img=input2x, box=rotated_box2x, resolution=(5, 5)) - assert torch.allclose(output2x, output) - - def _simple_roi_align_rotated(self, img, box, resolution): - """ - RoiAlignRotated with scale 1.0 and 0 sample ratio. - """ - op = ROIAlignRotated(output_size=resolution, spatial_scale=1.0, sampling_ratio=0) - input = img[None, None, :, :] - - rois = [0] + list(box) - rois = torch.tensor(rois, dtype=torch.float32)[None, :] - result_cpu = op.forward(input, rois) - if torch.cuda.is_available(): - result_cuda = op.forward(input.cuda(), rois.cuda()) - assert torch.allclose(result_cpu, result_cuda.cpu()) - return result_cpu[0, 0] - - def test_empty_box(self): - img = torch.rand(5, 5) - out = self._simple_roi_align_rotated(img, [2, 3, 0, 0, 0], (7, 7)) - self.assertTrue((out == 0).all()) - - def test_roi_align_rotated_gradcheck_cpu(self): - dtype = torch.float64 - device = torch.device("cpu") - roi_align_rotated_op = ROIAlignRotated( - output_size=(5, 5), spatial_scale=0.5, sampling_ratio=1 - ).to(dtype=dtype, device=device) - x = torch.rand(1, 1, 10, 10, dtype=dtype, device=device, requires_grad=True) - # roi format is (batch index, x_center, y_center, width, height, angle) - rois = torch.tensor( - [[0, 4.5, 4.5, 9, 9, 0], [0, 2, 7, 4, 4, 0], [0, 7, 7, 4, 4, 0]], - dtype=dtype, - device=device, - ) - - def func(input): - return roi_align_rotated_op(input, rois) - - assert gradcheck(func, (x,)), "gradcheck failed for RoIAlignRotated CPU" - assert gradcheck(func, (x.transpose(2, 3),)), "gradcheck failed for RoIAlignRotated CPU" - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_roi_align_rotated_gradient_cuda(self): - """ - Compute gradients for ROIAlignRotated with multiple bounding boxes on the GPU, - and compare the result with ROIAlign - """ - # torch.manual_seed(123) - dtype = torch.float64 - device = torch.device("cuda") - pool_h, pool_w = (5, 5) - - roi_align = ROIAlign(output_size=(pool_h, pool_w), spatial_scale=1, sampling_ratio=2).to( - device=device - ) - - roi_align_rotated = ROIAlignRotated( - output_size=(pool_h, pool_w), spatial_scale=1, sampling_ratio=2 - ).to(device=device) - - x = torch.rand(1, 1, 10, 10, dtype=dtype, device=device, requires_grad=True) - # x_rotated 
= x.clone() won't work (will lead to grad_fun=CloneBackward)! - x_rotated = Variable(x.data.clone(), requires_grad=True) - - # roi_rotated format is (batch index, x_center, y_center, width, height, angle) - rois_rotated = torch.tensor( - [[0, 4.5, 4.5, 9, 9, 0], [0, 2, 7, 4, 4, 0], [0, 7, 7, 4, 4, 0]], - dtype=dtype, - device=device, - ) - - y_rotated = roi_align_rotated(x_rotated, rois_rotated) - s_rotated = y_rotated.sum() - s_rotated.backward() - - # roi format is (batch index, x1, y1, x2, y2) - rois = torch.tensor( - [[0, 0, 0, 9, 9], [0, 0, 5, 4, 9], [0, 5, 5, 9, 9]], dtype=dtype, device=device - ) - - y = roi_align(x, rois) - s = y.sum() - s.backward() - - assert torch.allclose( - x.grad, x_rotated.grad - ), "gradients for ROIAlign and ROIAlignRotated mismatch on CUDA" - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/__init__.py deleted file mode 100755 index e69de29b..00000000 diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py deleted file mode 100755 index 13a808e5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_anchor_generator.py +++ /dev/null @@ -1,120 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.config import get_cfg -from detectron2.layers import ShapeSpec -from detectron2.modeling.anchor_generator import DefaultAnchorGenerator, RotatedAnchorGenerator - -logger = logging.getLogger(__name__) - - -class TestAnchorGenerator(unittest.TestCase): - def test_default_anchor_generator(self): - cfg = get_cfg() - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1, 4]] - - anchor_generator = DefaultAnchorGenerator(cfg, [ShapeSpec(stride=4)]) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - anchors = anchor_generator([features["stage3"]]) - expected_anchor_tensor = torch.tensor( - [ - [-32.0, -8.0, 32.0, 8.0], - [-16.0, -16.0, 16.0, 16.0], - [-8.0, -32.0, 8.0, 32.0], - [-64.0, -16.0, 64.0, 16.0], - [-32.0, -32.0, 32.0, 32.0], - [-16.0, -64.0, 16.0, 64.0], - [-28.0, -8.0, 36.0, 8.0], # -28.0 == -32.0 + STRIDE (4) - [-12.0, -16.0, 20.0, 16.0], - [-4.0, -32.0, 12.0, 32.0], - [-60.0, -16.0, 68.0, 16.0], - [-28.0, -32.0, 36.0, 32.0], - [-12.0, -64.0, 20.0, 64.0], - ] - ) - - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - def test_default_anchor_generator_centered(self): - # test explicit args - anchor_generator = DefaultAnchorGenerator( - sizes=[32, 64], aspect_ratios=[0.25, 1, 4], strides=[4] - ) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - expected_anchor_tensor = torch.tensor( - [ - [-30.0, -6.0, 34.0, 10.0], - [-14.0, -14.0, 18.0, 18.0], - [-6.0, -30.0, 10.0, 34.0], - [-62.0, -14.0, 66.0, 18.0], - [-30.0, -30.0, 34.0, 34.0], - [-14.0, -62.0, 18.0, 66.0], - [-26.0, -6.0, 38.0, 10.0], - [-10.0, -14.0, 22.0, 18.0], - [-2.0, -30.0, 14.0, 34.0], - [-58.0, -14.0, 70.0, 18.0], - [-26.0, -30.0, 38.0, 34.0], - [-10.0, -62.0, 22.0, 66.0], - ] - ) - - anchors = anchor_generator([features["stage3"]]) - self.assertTrue(torch.allclose(anchors[0].tensor, 
expected_anchor_tensor)) - - anchors = torch.jit.script(anchor_generator)([features["stage3"]]) - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - def test_rrpn_anchor_generator(self): - cfg = get_cfg() - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1, 4]] - cfg.MODEL.ANCHOR_GENERATOR.ANGLES = [0, 45] # test single list[float] - anchor_generator = RotatedAnchorGenerator(cfg, [ShapeSpec(stride=4)]) - - # only the last two dimensions of features matter here - num_images = 2 - features = {"stage3": torch.rand(num_images, 96, 1, 2)} - anchors = anchor_generator([features["stage3"]]) - expected_anchor_tensor = torch.tensor( - [ - [0.0, 0.0, 64.0, 16.0, 0.0], - [0.0, 0.0, 64.0, 16.0, 45.0], - [0.0, 0.0, 32.0, 32.0, 0.0], - [0.0, 0.0, 32.0, 32.0, 45.0], - [0.0, 0.0, 16.0, 64.0, 0.0], - [0.0, 0.0, 16.0, 64.0, 45.0], - [0.0, 0.0, 128.0, 32.0, 0.0], - [0.0, 0.0, 128.0, 32.0, 45.0], - [0.0, 0.0, 64.0, 64.0, 0.0], - [0.0, 0.0, 64.0, 64.0, 45.0], - [0.0, 0.0, 32.0, 128.0, 0.0], - [0.0, 0.0, 32.0, 128.0, 45.0], - [4.0, 0.0, 64.0, 16.0, 0.0], # 4.0 == 0.0 + STRIDE (4) - [4.0, 0.0, 64.0, 16.0, 45.0], - [4.0, 0.0, 32.0, 32.0, 0.0], - [4.0, 0.0, 32.0, 32.0, 45.0], - [4.0, 0.0, 16.0, 64.0, 0.0], - [4.0, 0.0, 16.0, 64.0, 45.0], - [4.0, 0.0, 128.0, 32.0, 0.0], - [4.0, 0.0, 128.0, 32.0, 45.0], - [4.0, 0.0, 64.0, 64.0, 0.0], - [4.0, 0.0, 64.0, 64.0, 45.0], - [4.0, 0.0, 32.0, 128.0, 0.0], - [4.0, 0.0, 32.0, 128.0, 45.0], - ] - ) - - self.assertTrue(torch.allclose(anchors[0].tensor, expected_anchor_tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py deleted file mode 100755 index 3bb100f9..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_backbone.py +++ /dev/null @@ -1,34 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved - -import unittest -import torch - -import detectron2.export.torchscript # apply patch # noqa -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.layers import ShapeSpec -from detectron2.modeling.backbone import build_resnet_backbone -from detectron2.modeling.backbone.fpn import build_resnet_fpn_backbone - - -class TestBackBone(unittest.TestCase): - def test_resnet_scriptability(self): - cfg = get_cfg() - resnet = build_resnet_backbone(cfg, ShapeSpec(channels=3)) - - scripted_resnet = torch.jit.script(resnet) - - inp = torch.rand(2, 3, 100, 100) - out1 = resnet(inp)["res4"] - out2 = scripted_resnet(inp)["res4"] - self.assertTrue(torch.allclose(out1, out2)) - - def test_fpn_scriptability(self): - cfg = model_zoo.get_config("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml") - bb = build_resnet_fpn_backbone(cfg, ShapeSpec(channels=3)) - bb_s = torch.jit.script(bb) - - inp = torch.rand(2, 3, 128, 128) - out1 = bb(inp)["p5"] - out2 = bb_s(inp)["p5"] - self.assertTrue(torch.allclose(out1, out2)) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py deleted file mode 100755 index fd3a7b79..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_box2box_transform.py +++ /dev/null @@ -1,94 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
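-# The tests below exercise the delta-based box parameterization used by Faster
-# R-CNN style heads. A target box is encoded relative to a source box as
-# (dx, dy, dw, dh) = (wx * (xt - xs) / ws, wy * (yt - ys) / hs,
-# ww * log(wt / ws), wh * log(ht / hs)), where (xs, ys, ws, hs) are the source
-# center and size and (wx, wy, ww, wh) are the constructor weights.
-# apply_deltas() inverts get_deltas(), so encoding followed by decoding should
-# reproduce the target boxes; that is exactly what the reconstruction tests
-# check. A minimal round-trip sketch (src_boxes/dst_boxes are (N, 4) XYXY
-# tensors, as produced by random_boxes below):
-#
-#     b2b_tfm = Box2BoxTransform(weights=(5, 5, 10, 10))
-#     deltas = b2b_tfm.get_deltas(src_boxes, dst_boxes)  # (N, 4)
-#     assert torch.allclose(b2b_tfm.apply_deltas(deltas, src_boxes), dst_boxes)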
-import logging -import unittest -import torch - -from detectron2.modeling.box_regression import ( - Box2BoxTransform, - Box2BoxTransformLinear, - Box2BoxTransformRotated, -) -from detectron2.utils.testing import random_boxes - -logger = logging.getLogger(__name__) - - -class TestBox2BoxTransform(unittest.TestCase): - def test_reconstruction(self): - weights = (5, 5, 10, 10) - b2b_tfm = Box2BoxTransform(weights=weights) - src_boxes = random_boxes(10) - dst_boxes = random_boxes(10) - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_tfm.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_tfm.apply_deltas(deltas, src_boxes) - self.assertTrue(torch.allclose(dst_boxes, dst_boxes_reconstructed)) - - def test_apply_deltas_tracing(self): - weights = (5, 5, 10, 10) - b2b_tfm = Box2BoxTransform(weights=weights) - - with torch.no_grad(): - func = torch.jit.trace(b2b_tfm.apply_deltas, (torch.randn(10, 20), torch.randn(10, 4))) - - o = func(torch.randn(10, 20), torch.randn(10, 4)) - self.assertEqual(o.shape, (10, 20)) - o = func(torch.randn(5, 20), torch.randn(5, 4)) - self.assertEqual(o.shape, (5, 20)) - - -def random_rotated_boxes(mean_box, std_length, std_angle, N): - return torch.cat( - [torch.rand(N, 4) * std_length, torch.rand(N, 1) * std_angle], dim=1 - ) + torch.tensor(mean_box, dtype=torch.float) - - -class TestBox2BoxTransformRotated(unittest.TestCase): - def test_reconstruction(self): - weights = (5, 5, 10, 10, 1) - b2b_transform = Box2BoxTransformRotated(weights=weights) - src_boxes = random_rotated_boxes([10, 10, 20, 20, -30], 5, 60.0, 10) - dst_boxes = random_rotated_boxes([10, 10, 20, 20, -30], 5, 60.0, 10) - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_transform.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_transform.apply_deltas(deltas, src_boxes) - assert torch.allclose(dst_boxes[:, :4], dst_boxes_reconstructed[:, :4], atol=1e-5) - # angle difference has to be normalized - assert torch.allclose( - (dst_boxes[:, 4] - dst_boxes_reconstructed[:, 4] + 180.0) % 360.0 - 180.0, - torch.zeros_like(dst_boxes[:, 4]), - atol=1e-4, - ) - - -class TestBox2BoxTransformLinear(unittest.TestCase): - def test_reconstruction(self): - b2b_tfm = Box2BoxTransformLinear() - src_boxes = random_boxes(10) - dst_boxes = torch.tensor([0, 0, 101, 101] * 10).reshape(10, 4).float() - - devices = [torch.device("cpu")] - if torch.cuda.is_available(): - devices.append(torch.device("cuda")) - for device in devices: - src_boxes = src_boxes.to(device=device) - dst_boxes = dst_boxes.to(device=device) - deltas = b2b_tfm.get_deltas(src_boxes, dst_boxes) - dst_boxes_reconstructed = b2b_tfm.apply_deltas(deltas, src_boxes) - self.assertTrue(torch.allclose(dst_boxes, dst_boxes_reconstructed, atol=1e-3)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py deleted file mode 100755 index e29b944b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_fast_rcnn.py +++ /dev/null @@ -1,171 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
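-# FastRCNNOutputLayers maps per-RoI pooled features to (classification scores,
-# box regression deltas). With num_classes=5 the scores have num_classes + 1
-# columns (the last column is the background class) and the class-specific
-# deltas have num_classes * 4 columns; losses() returns a dict with "loss_cls"
-# and "loss_box_reg". The hard-coded expected losses below are regression
-# targets: they only hold because torch.manual_seed(...) fixes both the random
-# inputs and the layer initialization. A rough shape sketch:
-#
-#     predictor = FastRCNNOutputLayers(
-#         ShapeSpec(channels=8),
-#         box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)),
-#         num_classes=5,
-#     )
-#     scores, deltas = predictor(torch.rand(2, 8))  # (2, 6) and (2, 20)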
-import logging -import unittest -import torch - -from detectron2.layers import ShapeSpec -from detectron2.modeling.box_regression import Box2BoxTransform, Box2BoxTransformRotated -from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers -from detectron2.modeling.roi_heads.rotated_fast_rcnn import RotatedFastRCNNOutputLayers -from detectron2.structures import Boxes, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage - -logger = logging.getLogger(__name__) - - -class FastRCNNTest(unittest.TestCase): - def test_fast_rcnn(self): - torch.manual_seed(132) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - feature_pooled = torch.rand(2, box_head_output_size) - predictions = box_predictor(feature_pooled) - - proposal_boxes = torch.tensor([[0.8, 1.1, 3.2, 2.8], [2.3, 2.5, 7, 8]], dtype=torch.float32) - gt_boxes = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - proposal = Instances((10, 10)) - proposal.proposal_boxes = Boxes(proposal_boxes) - proposal.gt_boxes = Boxes(gt_boxes) - proposal.gt_classes = torch.tensor([1, 2]) - - with EventStorage(): # capture events in a new storage to discard them - losses = box_predictor.losses(predictions, [proposal]) - - expected_losses = { - "loss_cls": torch.tensor(1.7951188087), - "loss_box_reg": torch.tensor(4.0357131958), - } - for name in expected_losses.keys(): - assert torch.allclose(losses[name], expected_losses[name]) - - def test_fast_rcnn_empty_batch(self, device="cpu"): - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=10), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=8, - ).to(device=device) - - logits = torch.randn(0, 100, requires_grad=True, device=device) - deltas = torch.randn(0, 4, requires_grad=True, device=device) - losses = box_predictor.losses([logits, deltas], []) - for value in losses.values(): - self.assertTrue(torch.allclose(value, torch.zeros_like(value))) - sum(losses.values()).backward() - self.assertTrue(logits.grad is not None) - self.assertTrue(deltas.grad is not None) - - predictions, _ = box_predictor.inference([logits, deltas], []) - self.assertEqual(len(predictions), 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_fast_rcnn_empty_batch_cuda(self): - self.test_fast_rcnn_empty_batch(device=torch.device("cuda")) - - def test_fast_rcnn_rotated(self): - torch.manual_seed(132) - box_head_output_size = 8 - - box_predictor = RotatedFastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransformRotated(weights=(10, 10, 5, 5, 1)), - num_classes=5, - ) - feature_pooled = torch.rand(2, box_head_output_size) - predictions = box_predictor(feature_pooled) - proposal_boxes = torch.tensor( - [[2, 1.95, 2.4, 1.7, 0], [4.65, 5.25, 4.7, 5.5, 0]], dtype=torch.float32 - ) - gt_boxes = torch.tensor([[2, 2, 2, 2, 0], [4, 4, 4, 4, 0]], dtype=torch.float32) - proposal = Instances((10, 10)) - proposal.proposal_boxes = RotatedBoxes(proposal_boxes) - proposal.gt_boxes = RotatedBoxes(gt_boxes) - proposal.gt_classes = torch.tensor([1, 2]) - - with EventStorage(): # capture events in a new storage to discard them - losses = box_predictor.losses(predictions, [proposal]) - - # Note: the expected losses are slightly different even if - # the boxes are essentially the same as in the FastRCNNOutput test, because - # bbox_pred in 
FastRCNNOutputLayers have different Linear layers/initialization - # between the two cases. - expected_losses = { - "loss_cls": torch.tensor(1.7920907736), - "loss_box_reg": torch.tensor(4.0410838127), - } - for name in expected_losses.keys(): - assert torch.allclose(losses[name], expected_losses[name]) - - def test_predict_boxes_tracing(self): - class Model(torch.nn.Module): - def __init__(self, output_layer): - super(Model, self).__init__() - self._output_layer = output_layer - - def forward(self, proposal_deltas, proposal_boxes): - instances = Instances((10, 10)) - instances.proposal_boxes = Boxes(proposal_boxes) - return self._output_layer.predict_boxes((None, proposal_deltas), [instances]) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - - model = Model(box_predictor) - - from detectron2.export.torchscript_patch import patch_builtin_len - - with torch.no_grad(), patch_builtin_len(): - func = torch.jit.trace(model, (torch.randn(10, 20), torch.randn(10, 4))) - - o = func(torch.randn(10, 20), torch.randn(10, 4)) - self.assertEqual(o[0].shape, (10, 20)) - o = func(torch.randn(5, 20), torch.randn(5, 4)) - self.assertEqual(o[0].shape, (5, 20)) - o = func(torch.randn(20, 20), torch.randn(20, 4)) - self.assertEqual(o[0].shape, (20, 20)) - - def test_predict_probs_tracing(self): - class Model(torch.nn.Module): - def __init__(self, output_layer): - super(Model, self).__init__() - self._output_layer = output_layer - - def forward(self, scores, proposal_boxes): - instances = Instances((10, 10)) - instances.proposal_boxes = Boxes(proposal_boxes) - return self._output_layer.predict_probs((scores, None), [instances]) - - box_head_output_size = 8 - - box_predictor = FastRCNNOutputLayers( - ShapeSpec(channels=box_head_output_size), - box2box_transform=Box2BoxTransform(weights=(10, 10, 5, 5)), - num_classes=5, - ) - - model = Model(box_predictor) - - from detectron2.export.torchscript_patch import patch_builtin_len - - with torch.no_grad(), patch_builtin_len(): - func = torch.jit.trace(model, (torch.randn(10, 6), torch.rand(10, 4))) - o = func(torch.randn(10, 6), torch.randn(10, 4)) - self.assertEqual(o[0].shape, (10, 6)) - o = func(torch.randn(5, 6), torch.randn(5, 4)) - self.assertEqual(o[0].shape, (5, 6)) - o = func(torch.randn(20, 6), torch.randn(20, 4)) - self.assertEqual(o[0].shape, (20, 6)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py deleted file mode 100755 index 6eb2db0c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_matcher.py +++ /dev/null @@ -1,42 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
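-# How Matcher arrives at the expected values in test_scriptability below: the
-# quality matrix is (num_gt=3) x (num_predicted=4); each column (prediction) is
-# matched to the row (ground truth) with the highest quality, and that quality
-# is mapped through the thresholds to a label. With the RPN config used here
-# (by default thresholds [0.3, 0.7] and labels [0, -1, 1]): quality < 0.3 -> 0
-# (negative), 0.3 <= quality < 0.7 -> -1 (ignored), quality >= 0.7 -> 1
-# (positive). allow_low_quality_matches=True then promotes, for every ground
-# truth, its best-matching prediction to a positive. Column 0: max 0.3 (gt 1)
-# -> ignored. Column 1: max 0.65 (gt 1) would be ignored, but it is gt 1's best
-# match -> positive. Column 2: max 0.25 (gt 2) -> negative. Column 3: max 0.6
-# (gt 0), promoted as gt 0's best match -> positive. Hence matches [1, 1, 2, 0]
-# and labels [-1, 1, 0, 1].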
-import unittest
-from typing import List
-import torch
-
-from detectron2.config import get_cfg
-from detectron2.modeling.matcher import Matcher
-
-
-class TestMatcher(unittest.TestCase):
-    def test_scriptability(self):
-        cfg = get_cfg()
-        anchor_matcher = Matcher(
-            cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS, allow_low_quality_matches=True
-        )
-        match_quality_matrix = torch.tensor(
-            [[0.15, 0.45, 0.2, 0.6], [0.3, 0.65, 0.05, 0.1], [0.05, 0.4, 0.25, 0.4]]
-        )
-        expected_matches = torch.tensor([1, 1, 2, 0])
-        expected_match_labels = torch.tensor([-1, 1, 0, 1], dtype=torch.int8)
-
-        matches, match_labels = anchor_matcher(match_quality_matrix)
-        self.assertTrue(torch.allclose(matches, expected_matches))
-        self.assertTrue(torch.allclose(match_labels, expected_match_labels))
-
-        # nonzero_tuple must be imported explicitly to let jit know what it is.
-        # https://github.com/pytorch/pytorch/issues/38964
-        from detectron2.layers import nonzero_tuple  # noqa F401
-
-        def f(thresholds: List[float], labels: List[int]):
-            return Matcher(thresholds, labels, allow_low_quality_matches=True)
-
-        scripted_anchor_matcher = torch.jit.script(f)(
-            cfg.MODEL.RPN.IOU_THRESHOLDS, cfg.MODEL.RPN.IOU_LABELS
-        )
-        matches, match_labels = scripted_anchor_matcher(match_quality_matrix)
-        self.assertTrue(torch.allclose(matches, expected_matches))
-        self.assertTrue(torch.allclose(match_labels, expected_match_labels))
-
-
-if __name__ == "__main__":
-    unittest.main()
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py
deleted file mode 100755
index a743b0b6..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_mmdet.py
+++ /dev/null
@@ -1,186 +0,0 @@
-# Copyright (c) Facebook, Inc. and its affiliates.
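-# These wrappers bridge two config ecosystems: MMDetBackbone exposes an mmdet
-# backbone (plus optional neck) as a detectron2 Backbone, and MMDetDetector
-# wraps a complete mmdet detector as a detectron2 meta-architecture. Since
-# mmdet config dicts carry no detectron2 shape metadata, the caller has to
-# declare the output feature shapes and names explicitly, e.g. (as in the
-# backbone test below):
-#
-#     output_shapes=[ShapeSpec(channels=256, stride=s) for s in [4, 8, 16, 32, 64]],
-#     output_names=["p2", "p3", "p4", "p5", "p6"],
-#
-# Both tests only construct the modules (no pretrained weights are loaded), so
-# they mainly guard against config/constructor drift between the libraries.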
-import unittest - -from detectron2.layers import ShapeSpec -from detectron2.modeling.mmdet_wrapper import MMDetBackbone, MMDetDetector - -try: - import mmdet.models # noqa - - HAS_MMDET = True -except ImportError: - HAS_MMDET = False - - -@unittest.skipIf(not HAS_MMDET, "mmdet not available") -class TestMMDetWrapper(unittest.TestCase): - def test_backbone(self): - MMDetBackbone( - backbone=dict( - type="DetectoRS_ResNet", - conv_cfg=dict(type="ConvAWS"), - sac=dict(type="SAC", use_deform=True), - stage_with_sac=(False, True, True, True), - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - ), - neck=dict( - type="FPN", - in_channels=[256, 512, 1024, 2048], - out_channels=256, - num_outs=5, - ), - # skip pretrained model for tests - # pretrained_backbone="torchvision://resnet50", - output_shapes=[ShapeSpec(channels=256, stride=s) for s in [4, 8, 16, 32, 64]], - output_names=["p2", "p3", "p4", "p5", "p6"], - ) - - def test_detector(self): - # a basic R50 Mask R-CNN - MMDetDetector( - detector=dict( - type="MaskRCNN", - backbone=dict( - type="ResNet", - depth=50, - num_stages=4, - out_indices=(0, 1, 2, 3), - frozen_stages=1, - norm_cfg=dict(type="BN", requires_grad=True), - norm_eval=True, - style="pytorch", - # skip pretrained model for tests - # init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')) - ), - neck=dict( - type="FPN", in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5 - ), - rpn_head=dict( - type="RPNHead", - in_channels=256, - feat_channels=256, - anchor_generator=dict( - type="AnchorGenerator", - scales=[8], - ratios=[0.5, 1.0, 2.0], - strides=[4, 8, 16, 32, 64], - ), - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[1.0, 1.0, 1.0, 1.0], - ), - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=True, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - roi_head=dict( - type="StandardRoIHead", - bbox_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=7, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - bbox_head=dict( - type="Shared2FCBBoxHead", - in_channels=256, - fc_out_channels=1024, - roi_feat_size=7, - num_classes=80, - bbox_coder=dict( - type="DeltaXYWHBBoxCoder", - target_means=[0.0, 0.0, 0.0, 0.0], - target_stds=[0.1, 0.1, 0.2, 0.2], - ), - reg_class_agnostic=False, - loss_cls=dict(type="CrossEntropyLoss", use_sigmoid=False, loss_weight=1.0), - loss_bbox=dict(type="L1Loss", loss_weight=1.0), - ), - mask_roi_extractor=dict( - type="SingleRoIExtractor", - roi_layer=dict(type="RoIAlign", output_size=14, sampling_ratio=0), - out_channels=256, - featmap_strides=[4, 8, 16, 32], - ), - mask_head=dict( - type="FCNMaskHead", - num_convs=4, - in_channels=256, - conv_out_channels=256, - num_classes=80, - loss_mask=dict(type="CrossEntropyLoss", use_mask=True, loss_weight=1.0), - ), - ), - # model training and testing settings - train_cfg=dict( - rpn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.7, - neg_iou_thr=0.3, - min_pos_iou=0.3, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=256, - pos_fraction=0.5, - neg_pos_ub=-1, - add_gt_as_proposals=False, - ), - allowed_border=-1, - pos_weight=-1, - debug=False, - ), - rpn_proposal=dict( - nms_pre=2000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - 
), - rcnn=dict( - assigner=dict( - type="MaxIoUAssigner", - pos_iou_thr=0.5, - neg_iou_thr=0.5, - min_pos_iou=0.5, - match_low_quality=True, - ignore_iof_thr=-1, - ), - sampler=dict( - type="RandomSampler", - num=512, - pos_fraction=0.25, - neg_pos_ub=-1, - add_gt_as_proposals=True, - ), - mask_size=28, - pos_weight=-1, - debug=False, - ), - ), - test_cfg=dict( - rpn=dict( - nms_pre=1000, - max_per_img=1000, - nms=dict(type="nms", iou_threshold=0.7), - min_bbox_size=0, - ), - rcnn=dict( - score_thr=0.05, - nms=dict(type="nms", iou_threshold=0.5), - max_per_img=100, - mask_thr_binary=0.5, - ), - ), - ), - pixel_mean=[1, 2, 3], - pixel_std=[1, 2, 3], - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py deleted file mode 100755 index 5da35205..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_model_e2e.py +++ /dev/null @@ -1,223 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - - -import itertools -import unittest -from contextlib import contextmanager -from copy import deepcopy -import torch - -from detectron2.structures import BitMasks, Boxes, ImageList, Instances -from detectron2.utils.events import EventStorage -from detectron2.utils.testing import get_model_no_weights - - -@contextmanager -def typecheck_hook(model, *, in_dtype=None, out_dtype=None): - """ - Check that the model must be called with the given input/output dtype - """ - if not isinstance(in_dtype, set): - in_dtype = {in_dtype} - if not isinstance(out_dtype, set): - out_dtype = {out_dtype} - - def flatten(x): - if isinstance(x, torch.Tensor): - return [x] - if isinstance(x, (list, tuple)): - return list(itertools.chain(*[flatten(t) for t in x])) - if isinstance(x, dict): - return flatten(list(x.values())) - return [] - - def hook(module, input, output): - if in_dtype is not None: - dtypes = {x.dtype for x in flatten(input)} - assert ( - dtypes == in_dtype - ), f"Expected input dtype of {type(module)} is {in_dtype}. Got {dtypes} instead!" - - if out_dtype is not None: - dtypes = {x.dtype for x in flatten(output)} - assert ( - dtypes == out_dtype - ), f"Expected output dtype of {type(module)} is {out_dtype}. Got {dtypes} instead!" 
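-    # register_forward_hook() returns a torch.utils.hooks.RemovableHandle,
-    # which also acts as a context manager: the hook is removed again when the
-    # with-block exits, so the dtype checks only apply to forward passes run
-    # inside the `yield` below.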
- - with model.register_forward_hook(hook): - yield - - -def create_model_input(img, inst=None): - if inst is not None: - return {"image": img, "instances": inst} - else: - return {"image": img} - - -def get_empty_instance(h, w): - inst = Instances((h, w)) - inst.gt_boxes = Boxes(torch.rand(0, 4)) - inst.gt_classes = torch.tensor([]).to(dtype=torch.int64) - inst.gt_masks = BitMasks(torch.rand(0, h, w)) - return inst - - -def get_regular_bitmask_instances(h, w): - inst = Instances((h, w)) - inst.gt_boxes = Boxes(torch.rand(3, 4)) - inst.gt_boxes.tensor[:, 2:] += inst.gt_boxes.tensor[:, :2] - inst.gt_classes = torch.tensor([3, 4, 5]).to(dtype=torch.int64) - inst.gt_masks = BitMasks((torch.rand(3, h, w) > 0.5)) - return inst - - -class InstanceModelE2ETest: - def setUp(self): - torch.manual_seed(43) - self.model = get_model_no_weights(self.CONFIG_PATH) - - def _test_eval(self, input_sizes): - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - self.model.eval() - self.model(inputs) - - def _test_train(self, input_sizes, instances): - assert len(input_sizes) == len(instances) - inputs = [ - create_model_input(torch.rand(3, s[0], s[1]), inst) - for s, inst in zip(input_sizes, instances) - ] - self.model.train() - with EventStorage(): - losses = self.model(inputs) - sum(losses.values()).backward() - del losses - - def _inf_tensor(self, *shape): - return 1.0 / torch.zeros(*shape, device=self.model.device) - - def _nan_tensor(self, *shape): - return torch.zeros(*shape, device=self.model.device).fill_(float("nan")) - - def test_empty_data(self): - instances = [get_empty_instance(200, 250), get_empty_instance(200, 249)] - self._test_eval([(200, 250), (200, 249)]) - self._test_train([(200, 250), (200, 249)], instances) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA unavailable") - def test_eval_tocpu(self): - model = deepcopy(self.model).cpu() - model.eval() - input_sizes = [(200, 250), (200, 249)] - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - model(inputs) - - -class MaskRCNNE2ETest(InstanceModelE2ETest, unittest.TestCase): - CONFIG_PATH = "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml" - - def test_half_empty_data(self): - instances = [get_empty_instance(200, 250), get_regular_bitmask_instances(200, 249)] - self._test_train([(200, 250), (200, 249)], instances) - - # This test is flaky because in some environment the output features are zero due to relu - # def test_rpn_inf_nan_data(self): - # self.model.eval() - # for tensor in [self._inf_tensor, self._nan_tensor]: - # images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - # features = { - # "p2": tensor(1, 256, 256, 256), - # "p3": tensor(1, 256, 128, 128), - # "p4": tensor(1, 256, 64, 64), - # "p5": tensor(1, 256, 32, 32), - # "p6": tensor(1, 256, 16, 16), - # } - # props, _ = self.model.proposal_generator(images, features) - # self.assertEqual(len(props[0]), 0) - - def test_roiheads_inf_nan_data(self): - self.model.eval() - for tensor in [self._inf_tensor, self._nan_tensor]: - images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - features = { - "p2": tensor(1, 256, 256, 256), - "p3": tensor(1, 256, 128, 128), - "p4": tensor(1, 256, 64, 64), - "p5": tensor(1, 256, 32, 32), - "p6": tensor(1, 256, 16, 16), - } - props = [Instances((510, 510))] - props[0].proposal_boxes = Boxes([[10, 10, 20, 20]]).to(device=self.model.device) - props[0].objectness_logits = torch.tensor([1.0]).reshape(1, 1) - det, _ = self.model.roi_heads(images, features, props) - 
self.assertEqual(len(det[0]), 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_autocast(self): - from torch.cuda.amp import autocast - - inputs = [{"image": torch.rand(3, 100, 100)}] - self.model.eval() - with autocast(), typecheck_hook( - self.model.backbone, in_dtype=torch.float32, out_dtype=torch.float16 - ), typecheck_hook( - self.model.roi_heads.box_predictor, in_dtype=torch.float16, out_dtype=torch.float16 - ): - out = self.model.inference(inputs, do_postprocess=False)[0] - self.assertEqual(out.pred_boxes.tensor.dtype, torch.float32) - self.assertEqual(out.pred_masks.dtype, torch.float16) - self.assertEqual(out.scores.dtype, torch.float32) # scores comes from softmax - - -class RetinaNetE2ETest(InstanceModelE2ETest, unittest.TestCase): - CONFIG_PATH = "COCO-Detection/retinanet_R_50_FPN_1x.yaml" - - def test_inf_nan_data(self): - self.model.eval() - self.model.score_threshold = -999999999 - for tensor in [self._inf_tensor, self._nan_tensor]: - images = ImageList(tensor(1, 3, 512, 512), [(510, 510)]) - features = [ - tensor(1, 256, 128, 128), - tensor(1, 256, 64, 64), - tensor(1, 256, 32, 32), - tensor(1, 256, 16, 16), - tensor(1, 256, 8, 8), - ] - pred_logits, pred_anchor_deltas = self.model.head(features) - pred_logits = [tensor(*x.shape) for x in pred_logits] - pred_anchor_deltas = [tensor(*x.shape) for x in pred_anchor_deltas] - det = self.model.forward_inference(images, features, [pred_logits, pred_anchor_deltas]) - # all predictions (if any) are infinite or nan - if len(det[0]): - self.assertTrue(torch.isfinite(det[0].pred_boxes.tensor).sum() == 0) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_autocast(self): - from torch.cuda.amp import autocast - - inputs = [{"image": torch.rand(3, 100, 100)}] - self.model.eval() - with autocast(), typecheck_hook( - self.model.backbone, in_dtype=torch.float32, out_dtype=torch.float16 - ), typecheck_hook(self.model.head, in_dtype=torch.float16, out_dtype=torch.float16): - out = self.model(inputs)[0]["instances"] - self.assertEqual(out.pred_boxes.tensor.dtype, torch.float32) - self.assertEqual(out.scores.dtype, torch.float16) - - -class SemSegE2ETest(unittest.TestCase): - CONFIG_PATH = "Misc/semantic_R_50_FPN_1x.yaml" - - def setUp(self): - torch.manual_seed(43) - self.model = get_model_no_weights(self.CONFIG_PATH) - - def _test_eval(self, input_sizes): - inputs = [create_model_input(torch.rand(3, s[0], s[1])) for s in input_sizes] - self.model.eval() - self.model(inputs) - - def test_forward(self): - self._test_eval([(200, 250), (200, 249)]) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py deleted file mode 100755 index 6af160ef..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_heads.py +++ /dev/null @@ -1,323 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
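-# A recurring pattern in this file: detectron2's Instances stores its fields as
-# dynamic attributes, which TorchScript cannot compile directly. The
-# scriptability tests therefore declare the field schema up front and let
-# patch_instances() generate a statically-typed Instances class, roughly:
-#
-#     fields = {"pred_classes": torch.Tensor, "pred_masks": torch.Tensor}
-#     with freeze_training_mode(module), patch_instances(fields) as NewInstances:
-#         scripted = torch.jit.script(module)
-#         out = scripted(features, [NewInstances.from_instances(inst)])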
-import logging -import unittest -from copy import deepcopy -import torch -from torch import nn - -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.export.torchscript_patch import ( - freeze_training_mode, - patch_builtin_len, - patch_instances, -) -from detectron2.layers import ShapeSpec -from detectron2.modeling.proposal_generator.build import build_proposal_generator -from detectron2.modeling.roi_heads import ( - FastRCNNConvFCHead, - KRCNNConvDeconvUpsampleHead, - MaskRCNNConvUpsampleHead, - StandardROIHeads, - build_roi_heads, -) -from detectron2.projects import point_rend -from detectron2.structures import BitMasks, Boxes, ImageList, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage -from detectron2.utils.testing import assert_instances_allclose, random_boxes - -logger = logging.getLogger(__name__) - -""" -Make sure the losses of ROIHeads/RPN do not change, to avoid -breaking the forward logic by mistake. -This relies on assumption that pytorch's RNG is stable. -""" - - -class ROIHeadsTest(unittest.TestCase): - def test_roi_heads(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5) - cfg.MODEL.MASK_ON = True - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - image_shape = (15, 15) - gt_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - gt_instance0 = Instances(image_shape) - gt_instance0.gt_boxes = Boxes(gt_boxes0) - gt_instance0.gt_classes = torch.tensor([2, 1]) - gt_instance0.gt_masks = BitMasks(torch.rand((2,) + image_shape) > 0.5) - gt_boxes1 = torch.tensor([[1, 5, 2, 8], [7, 3, 10, 5]], dtype=torch.float32) - gt_instance1 = Instances(image_shape) - gt_instance1.gt_boxes = Boxes(gt_boxes1) - gt_instance1.gt_classes = torch.tensor([1, 2]) - gt_instance1.gt_masks = BitMasks(torch.rand((2,) + image_shape) > 0.5) - gt_instances = [gt_instance0, gt_instance1] - - proposal_generator = build_proposal_generator(cfg, feature_shape) - roi_heads = StandardROIHeads(cfg, feature_shape) - - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator(images, features, gt_instances) - _, detector_losses = roi_heads(images, features, proposals, gt_instances) - - detector_losses.update(proposal_losses) - expected_losses = { - "loss_cls": 4.5253729820251465, - "loss_box_reg": 0.009785720147192478, - "loss_mask": 0.693184494972229, - "loss_rpn_cls": 0.08186662942171097, - "loss_rpn_loc": 0.1104838103055954, - } - succ = all( - torch.allclose(detector_losses[name], torch.tensor(expected_losses.get(name, 0.0))) - for name in detector_losses.keys() - ) - self.assertTrue( - succ, - "Losses has changed! 
New losses: {}".format( - {k: v.item() for k, v in detector_losses.items()} - ), - ) - - def test_rroi_heads(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.PROPOSAL_GENERATOR.NAME = "RRPN" - cfg.MODEL.ANCHOR_GENERATOR.NAME = "RotatedAnchorGenerator" - cfg.MODEL.ROI_HEADS.NAME = "RROIHeads" - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.RPN.BBOX_REG_WEIGHTS = (1, 1, 1, 1, 1) - cfg.MODEL.RPN.HEAD_NAME = "StandardRPNHead" - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignRotated" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5, 1) - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - image_shape = (15, 15) - gt_boxes0 = torch.tensor([[2, 2, 2, 2, 30], [4, 4, 4, 4, 0]], dtype=torch.float32) - gt_instance0 = Instances(image_shape) - gt_instance0.gt_boxes = RotatedBoxes(gt_boxes0) - gt_instance0.gt_classes = torch.tensor([2, 1]) - gt_boxes1 = torch.tensor([[1.5, 5.5, 1, 3, 0], [8.5, 4, 3, 2, -50]], dtype=torch.float32) - gt_instance1 = Instances(image_shape) - gt_instance1.gt_boxes = RotatedBoxes(gt_boxes1) - gt_instance1.gt_classes = torch.tensor([1, 2]) - gt_instances = [gt_instance0, gt_instance1] - - proposal_generator = build_proposal_generator(cfg, feature_shape) - roi_heads = build_roi_heads(cfg, feature_shape) - - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator(images, features, gt_instances) - _, detector_losses = roi_heads(images, features, proposals, gt_instances) - - detector_losses.update(proposal_losses) - expected_losses = { - "loss_cls": 4.365657806396484, - "loss_box_reg": 0.0015851043863222003, - "loss_rpn_cls": 0.2427729219198227, - "loss_rpn_loc": 0.3646621108055115, - } - succ = all( - torch.allclose(detector_losses[name], torch.tensor(expected_losses.get(name, 0.0))) - for name in detector_losses.keys() - ) - self.assertTrue( - succ, - "Losses has changed! 
New losses: {}".format( - {k: v.item() for k, v in detector_losses.items()} - ), - ) - - def test_box_head_scriptability(self): - input_shape = ShapeSpec(channels=1024, height=14, width=14) - box_features = torch.randn(4, 1024, 14, 14) - - box_head = FastRCNNConvFCHead( - input_shape, conv_dims=[512, 512], fc_dims=[1024, 1024] - ).eval() - script_box_head = torch.jit.script(box_head) - - origin_output = box_head(box_features) - script_output = script_box_head(box_features) - self.assertTrue(torch.equal(origin_output, script_output)) - - def test_mask_head_scriptability(self): - input_shape = ShapeSpec(channels=1024) - mask_features = torch.randn(4, 1024, 14, 14) - - image_shapes = [(10, 10), (15, 15)] - pred_instance0 = Instances(image_shapes[0]) - pred_classes0 = torch.tensor([1, 2, 3], dtype=torch.int64) - pred_instance0.pred_classes = pred_classes0 - pred_instance1 = Instances(image_shapes[1]) - pred_classes1 = torch.tensor([4], dtype=torch.int64) - pred_instance1.pred_classes = pred_classes1 - - mask_head = MaskRCNNConvUpsampleHead( - input_shape, num_classes=80, conv_dims=[256, 256] - ).eval() - # pred_instance will be in-place changed during the inference - # process of `MaskRCNNConvUpsampleHead` - origin_outputs = mask_head(mask_features, deepcopy([pred_instance0, pred_instance1])) - - fields = {"pred_masks": torch.Tensor, "pred_classes": torch.Tensor} - with freeze_training_mode(mask_head), patch_instances(fields) as NewInstances: - sciript_mask_head = torch.jit.script(mask_head) - pred_instance0 = NewInstances.from_instances(pred_instance0) - pred_instance1 = NewInstances.from_instances(pred_instance1) - script_outputs = sciript_mask_head(mask_features, [pred_instance0, pred_instance1]) - - for origin_ins, script_ins in zip(origin_outputs, script_outputs): - assert_instances_allclose(origin_ins, script_ins, rtol=0) - - def test_keypoint_head_scriptability(self): - input_shape = ShapeSpec(channels=1024, height=14, width=14) - keypoint_features = torch.randn(4, 1024, 14, 14) - - image_shapes = [(10, 10), (15, 15)] - pred_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6], [1, 5, 2, 8]], dtype=torch.float32) - pred_instance0 = Instances(image_shapes[0]) - pred_instance0.pred_boxes = Boxes(pred_boxes0) - pred_boxes1 = torch.tensor([[7, 3, 10, 5]], dtype=torch.float32) - pred_instance1 = Instances(image_shapes[1]) - pred_instance1.pred_boxes = Boxes(pred_boxes1) - - keypoint_head = KRCNNConvDeconvUpsampleHead( - input_shape, num_keypoints=17, conv_dims=[512, 512] - ).eval() - origin_outputs = keypoint_head( - keypoint_features, deepcopy([pred_instance0, pred_instance1]) - ) - - fields = { - "pred_boxes": Boxes, - "pred_keypoints": torch.Tensor, - "pred_keypoint_heatmaps": torch.Tensor, - } - with freeze_training_mode(keypoint_head), patch_instances(fields) as NewInstances: - sciript_keypoint_head = torch.jit.script(keypoint_head) - pred_instance0 = NewInstances.from_instances(pred_instance0) - pred_instance1 = NewInstances.from_instances(pred_instance1) - script_outputs = sciript_keypoint_head( - keypoint_features, [pred_instance0, pred_instance1] - ) - - for origin_ins, script_ins in zip(origin_outputs, script_outputs): - assert_instances_allclose(origin_ins, script_ins, rtol=0) - - def test_StandardROIHeads_scriptability(self): - cfg = get_cfg() - cfg.MODEL.ROI_BOX_HEAD.NAME = "FastRCNNConvFCHead" - cfg.MODEL.ROI_BOX_HEAD.NUM_FC = 2 - cfg.MODEL.ROI_BOX_HEAD.POOLER_TYPE = "ROIAlignV2" - cfg.MODEL.ROI_BOX_HEAD.BBOX_REG_WEIGHTS = (10, 10, 5, 5) - cfg.MODEL.MASK_ON = True - 
cfg.MODEL.ROI_HEADS.NMS_THRESH_TEST = 0.01 - cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.01 - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - feature_shape = {"res4": ShapeSpec(channels=num_channels, stride=16)} - - roi_heads = StandardROIHeads(cfg, feature_shape).eval() - - proposal0 = Instances(image_sizes[0]) - proposal_boxes0 = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - proposal0.proposal_boxes = Boxes(proposal_boxes0) - proposal0.objectness_logits = torch.tensor([0.5, 0.7], dtype=torch.float32) - - proposal1 = Instances(image_sizes[1]) - proposal_boxes1 = torch.tensor([[1, 5, 2, 8], [7, 3, 10, 5]], dtype=torch.float32) - proposal1.proposal_boxes = Boxes(proposal_boxes1) - proposal1.objectness_logits = torch.tensor([0.1, 0.9], dtype=torch.float32) - proposals = [proposal0, proposal1] - - pred_instances, _ = roi_heads(images, features, proposals) - fields = { - "objectness_logits": torch.Tensor, - "proposal_boxes": Boxes, - "pred_classes": torch.Tensor, - "scores": torch.Tensor, - "pred_masks": torch.Tensor, - "pred_boxes": Boxes, - "pred_keypoints": torch.Tensor, - "pred_keypoint_heatmaps": torch.Tensor, - } - with freeze_training_mode(roi_heads), patch_instances(fields) as new_instances: - proposal0 = new_instances.from_instances(proposal0) - proposal1 = new_instances.from_instances(proposal1) - proposals = [proposal0, proposal1] - scripted_rot_heads = torch.jit.script(roi_heads) - scripted_pred_instances, _ = scripted_rot_heads(images, features, proposals) - - for instance, scripted_instance in zip(pred_instances, scripted_pred_instances): - assert_instances_allclose(instance, scripted_instance, rtol=0) - - def test_PointRend_mask_head_tracing(self): - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - point_rend.add_pointrend_config(cfg) - cfg.MODEL.ROI_HEADS.IN_FEATURES = ["p2", "p3"] - cfg.MODEL.ROI_MASK_HEAD.NAME = "PointRendMaskHead" - cfg.MODEL.ROI_MASK_HEAD.POOLER_TYPE = "" - cfg.MODEL.ROI_MASK_HEAD.POINT_HEAD_ON = True - chan = 256 - head = point_rend.PointRendMaskHead( - cfg, - { - "p2": ShapeSpec(channels=chan, stride=4), - "p3": ShapeSpec(channels=chan, stride=8), - }, - ) - - def gen_inputs(h, w, N): - p2 = torch.rand(1, chan, h, w) - p3 = torch.rand(1, chan, h // 2, w // 2) - boxes = random_boxes(N, max_coord=h) - return p2, p3, boxes - - class Wrap(nn.ModuleDict): - def forward(self, p2, p3, boxes): - features = { - "p2": p2, - "p3": p3, - } - inst = Instances((p2.shape[2] * 4, p2.shape[3] * 4)) - inst.pred_boxes = Boxes(boxes) - inst.pred_classes = torch.zeros(inst.__len__(), dtype=torch.long) - out = self.head(features, [inst])[0] - return out.pred_masks - - model = Wrap({"head": head}) - model.eval() - with torch.no_grad(), patch_builtin_len(): - traced = torch.jit.trace(model, gen_inputs(302, 208, 20)) - inputs = gen_inputs(100, 120, 30) - out_eager = model(*inputs) - out_trace = traced(*inputs) - self.assertTrue(torch.allclose(out_eager, out_trace)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py deleted file mode 100755 index b93b7ae6..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_roi_pooler.py +++ /dev/null @@ -1,165 +0,0 
@@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.modeling.poolers import ROIPooler -from detectron2.structures import Boxes, RotatedBoxes -from detectron2.utils.testing import random_boxes - -logger = logging.getLogger(__name__) - - -class TestROIPooler(unittest.TestCase): - def _test_roialignv2_roialignrotated_match(self, device): - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor,) - sampling_ratio = 0 - - N, C, H, W = 2, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - - features = [feature.to(device)] - - rois = [] - rois_rotated = [] - for _ in range(N): - boxes = random_boxes(N_rois, W * canonical_scale_factor) - rotated_boxes = torch.zeros(N_rois, 5) - rotated_boxes[:, 0] = (boxes[:, 0] + boxes[:, 2]) / 2.0 - rotated_boxes[:, 1] = (boxes[:, 1] + boxes[:, 3]) / 2.0 - rotated_boxes[:, 2] = boxes[:, 2] - boxes[:, 0] - rotated_boxes[:, 3] = boxes[:, 3] - boxes[:, 1] - rois.append(Boxes(boxes).to(device)) - rois_rotated.append(RotatedBoxes(rotated_boxes).to(device)) - - roialignv2_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignV2", - ) - - roialignv2_out = roialignv2_pooler(features, rois) - - roialignrotated_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignRotated", - ) - - roialignrotated_out = roialignrotated_pooler(features, rois_rotated) - - self.assertTrue(torch.allclose(roialignv2_out, roialignrotated_out, atol=1e-4)) - - def test_roialignv2_roialignrotated_match_cpu(self): - self._test_roialignv2_roialignrotated_match(device="cpu") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_roialignv2_roialignrotated_match_cuda(self): - self._test_roialignv2_roialignrotated_match(device="cuda") - - def _test_scriptability(self, device): - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor,) - sampling_ratio = 0 - - N, C, H, W = 2, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - - features = [feature.to(device)] - - rois = [] - for _ in range(N): - boxes = random_boxes(N_rois, W * canonical_scale_factor) - - rois.append(Boxes(boxes).to(device)) - - roialignv2_pooler = ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlignV2", - ) - - roialignv2_out = roialignv2_pooler(features, rois) - scripted_roialignv2_out = torch.jit.script(roialignv2_pooler)(features, rois) - self.assertTrue(torch.equal(roialignv2_out, scripted_roialignv2_out)) - - def test_scriptability_cpu(self): - self._test_scriptability(device="cpu") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_scriptability_gpu(self): - self._test_scriptability(device="cuda") - - def test_no_images(self): - N, C, H, W = 0, 32, 32, 32 - feature = torch.rand(N, C, H, W) - 0.5 - features = [feature] - pooler = ROIPooler( - output_size=14, scales=(1.0,), sampling_ratio=0.0, pooler_type="ROIAlignV2" - ) - output = pooler.forward(features, []) - self.assertEqual(output.shape, (0, C, 14, 14)) - - def test_roi_pooler_tracing(self): - class Model(torch.nn.Module): - def 
__init__(self, roi): - super(Model, self).__init__() - self.roi = roi - - def forward(self, x, boxes): - return self.roi(x, [Boxes(boxes)]) - - pooler_resolution = 14 - canonical_level = 4 - canonical_scale_factor = 2 ** canonical_level - pooler_scales = (1.0 / canonical_scale_factor, 0.5 / canonical_scale_factor) - sampling_ratio = 0 - - N, C, H, W = 1, 4, 10, 8 - N_rois = 10 - std = 11 - mean = 0 - feature = (torch.rand(N, C, H, W) - 0.5) * 2 * std + mean - feature = [feature, feature] - - rois = random_boxes(N_rois, W * canonical_scale_factor) - # Add one larger box so that this level has only one box. - # This may trigger the bug https://github.com/pytorch/pytorch/issues/49852 - # that we shall workaround. - rois = torch.cat([rois, torch.tensor([[0, 0, 448, 448]])]) - - model = Model( - ROIPooler( - output_size=pooler_resolution, - scales=pooler_scales, - sampling_ratio=sampling_ratio, - pooler_type="ROIAlign", - ) - ) - - with torch.no_grad(): - func = torch.jit.trace(model, (feature, rois)) - o = func(feature, rois) - self.assertEqual(o.shape, (11, 4, 14, 14)) - o = func(feature, rois[:5]) - self.assertEqual(o.shape, (5, 4, 14, 14)) - o = func(feature, random_boxes(20, W * canonical_scale_factor)) - self.assertEqual(o.shape, (20, 4, 14, 14)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py b/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py deleted file mode 100755 index f14faae5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/modeling/test_rpn.py +++ /dev/null @@ -1,262 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest -import torch - -from detectron2.config import get_cfg -from detectron2.export import scripting_with_instances -from detectron2.layers import ShapeSpec -from detectron2.modeling.backbone import build_backbone -from detectron2.modeling.proposal_generator import RPN, build_proposal_generator -from detectron2.modeling.proposal_generator.proposal_utils import ( - add_ground_truth_to_proposals, - find_top_rpn_proposals, -) -from detectron2.structures import Boxes, ImageList, Instances, RotatedBoxes -from detectron2.utils.events import EventStorage - -logger = logging.getLogger(__name__) - - -class RPNTest(unittest.TestCase): - def get_gt_and_features(self): - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - image_shape = (15, 15) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - gt_boxes = torch.tensor([[1, 1, 3, 3], [2, 2, 6, 6]], dtype=torch.float32) - gt_instances = Instances(image_shape) - gt_instances.gt_boxes = Boxes(gt_boxes) - return (gt_instances, features, images, image_sizes) - - def test_rpn(self): - torch.manual_seed(121) - cfg = get_cfg() - backbone = build_backbone(cfg) - proposal_generator = RPN(cfg, backbone.output_shape()) - (gt_instances, features, images, image_sizes) = self.get_gt_and_features() - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - - expected_losses = { - "loss_rpn_cls": torch.tensor(0.08011703193), - "loss_rpn_loc": torch.tensor(0.101470276), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - 
) - self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - self.assertEqual(len(proposals), len(image_sizes)) - for proposal, im_size in zip(proposals, image_sizes): - self.assertEqual(proposal.image_size, im_size) - - expected_proposal_box = torch.tensor([[0, 0, 10, 10], [7.2702, 0, 10, 10]]) - expected_objectness_logit = torch.tensor([0.1596, -0.0007]) - self.assertTrue( - torch.allclose(proposals[0].proposal_boxes.tensor, expected_proposal_box, atol=1e-4) - ) - self.assertTrue( - torch.allclose(proposals[0].objectness_logits, expected_objectness_logit, atol=1e-4) - ) - - def verify_rpn(self, conv_dims, expected_conv_dims): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.RPN.CONV_DIMS = conv_dims - backbone = build_backbone(cfg) - proposal_generator = RPN(cfg, backbone.output_shape()) - for k, conv in enumerate(proposal_generator.rpn_head.conv): - self.assertEqual(expected_conv_dims[k], conv.out_channels) - return proposal_generator - - def test_rpn_larger_num_convs(self): - conv_dims = [64, 64, 64, 64, 64] - proposal_generator = self.verify_rpn(conv_dims, conv_dims) - (gt_instances, features, images, image_sizes) = self.get_gt_and_features() - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - expected_losses = { - "loss_rpn_cls": torch.tensor(0.08122821152), - "loss_rpn_loc": torch.tensor(0.10064548254), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - ) - self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - def test_rpn_conv_dims_not_set(self): - conv_dims = [-1, -1, -1] - expected_conv_dims = [1024, 1024, 1024] - self.verify_rpn(conv_dims, expected_conv_dims) - - def test_rpn_scriptability(self): - cfg = get_cfg() - proposal_generator = RPN(cfg, {"res4": ShapeSpec(channels=1024, stride=16)}).eval() - num_images = 2 - images_tensor = torch.rand(num_images, 30, 40) - image_sizes = [(32, 32), (30, 40)] - images = ImageList(images_tensor, image_sizes) - features = {"res4": torch.rand(num_images, 1024, 1, 2)} - - fields = {"proposal_boxes": Boxes, "objectness_logits": torch.Tensor} - proposal_generator_ts = scripting_with_instances(proposal_generator, fields) - - proposals, _ = proposal_generator(images, features) - proposals_ts, _ = proposal_generator_ts(images, features) - - for proposal, proposal_ts in zip(proposals, proposals_ts): - self.assertEqual(proposal.image_size, proposal_ts.image_size) - self.assertTrue( - torch.equal(proposal.proposal_boxes.tensor, proposal_ts.proposal_boxes.tensor) - ) - self.assertTrue(torch.equal(proposal.objectness_logits, proposal_ts.objectness_logits)) - - def test_rrpn(self): - torch.manual_seed(121) - cfg = get_cfg() - cfg.MODEL.PROPOSAL_GENERATOR.NAME = "RRPN" - cfg.MODEL.ANCHOR_GENERATOR.NAME = "RotatedAnchorGenerator" - cfg.MODEL.ANCHOR_GENERATOR.SIZES = [[32, 64]] - cfg.MODEL.ANCHOR_GENERATOR.ASPECT_RATIOS = [[0.25, 1]] - cfg.MODEL.ANCHOR_GENERATOR.ANGLES = [[0, 60]] - cfg.MODEL.RPN.BBOX_REG_WEIGHTS = (1, 1, 1, 1, 1) - cfg.MODEL.RPN.HEAD_NAME = "StandardRPNHead" - backbone = build_backbone(cfg) - proposal_generator = build_proposal_generator(cfg, backbone.output_shape()) - num_images = 2 - images_tensor = torch.rand(num_images, 20, 30) - image_sizes = [(10, 10), (20, 30)] - images = ImageList(images_tensor, image_sizes) - 
image_shape = (15, 15) - num_channels = 1024 - features = {"res4": torch.rand(num_images, num_channels, 1, 2)} - gt_boxes = torch.tensor([[2, 2, 2, 2, 0], [4, 4, 4, 4, 0]], dtype=torch.float32) - gt_instances = Instances(image_shape) - gt_instances.gt_boxes = RotatedBoxes(gt_boxes) - with EventStorage(): # capture events in a new storage to discard them - proposals, proposal_losses = proposal_generator( - images, features, [gt_instances[0], gt_instances[1]] - ) - - expected_losses = { - "loss_rpn_cls": torch.tensor(0.04291602224), - "loss_rpn_loc": torch.tensor(0.145077362), - } - for name in expected_losses.keys(): - err_msg = "proposal_losses[{}] = {}, expected losses = {}".format( - name, proposal_losses[name], expected_losses[name] - ) - self.assertTrue(torch.allclose(proposal_losses[name], expected_losses[name]), err_msg) - - expected_proposal_box = torch.tensor( - [ - [-1.77999556, 0.78155339, 68.04367828, 14.78156471, 60.59333801], - [13.82740974, -1.50282836, 34.67269897, 29.19676590, -3.81942749], - [8.10392570, -0.99071521, 145.39100647, 32.13126373, 3.67242432], - [5.00000000, 4.57370186, 10.00000000, 9.14740372, 0.89196777], - ] - ) - - expected_objectness_logit = torch.tensor([0.10924313, 0.09881870, 0.07649877, 0.05858029]) - - torch.set_printoptions(precision=8, sci_mode=False) - - self.assertEqual(len(proposals), len(image_sizes)) - - proposal = proposals[0] - # It seems that there's some randomness in the result across different machines: - # This test can be run on a local machine for 100 times with exactly the same result, - # However, a different machine might produce slightly different results, - # thus the atol here. - err_msg = "computed proposal boxes = {}, expected {}".format( - proposal.proposal_boxes.tensor, expected_proposal_box - ) - self.assertTrue( - torch.allclose(proposal.proposal_boxes.tensor[:4], expected_proposal_box, atol=1e-5), - err_msg, - ) - - err_msg = "computed objectness logits = {}, expected {}".format( - proposal.objectness_logits, expected_objectness_logit - ) - self.assertTrue( - torch.allclose(proposal.objectness_logits[:4], expected_objectness_logit, atol=1e-5), - err_msg, - ) - - def test_find_rpn_proposals_inf(self): - N, Hi, Wi, A = 3, 3, 3, 3 - proposals = [torch.rand(N, Hi * Wi * A, 4)] - pred_logits = [torch.rand(N, Hi * Wi * A)] - pred_logits[0][1][3:5].fill_(float("inf")) - find_top_rpn_proposals(proposals, pred_logits, [(10, 10)], 0.5, 1000, 1000, 0, False) - - def test_find_rpn_proposals_tracing(self): - N, Hi, Wi, A = 3, 50, 50, 9 - proposal = torch.rand(N, Hi * Wi * A, 4) - pred_logit = torch.rand(N, Hi * Wi * A) - - def func(proposal, logit, image_size): - r = find_top_rpn_proposals( - [proposal], [logit], [image_size], 0.7, 1000, 1000, 0, False - )[0] - size = r.image_size - if not isinstance(size, torch.Tensor): - size = torch.tensor(size) - return (size, r.proposal_boxes.tensor, r.objectness_logits) - - other_inputs = [] - # test that it generalizes to other shapes - for Hi, Wi, shp in [(30, 30, 60), (10, 10, 800)]: - other_inputs.append( - ( - torch.rand(N, Hi * Wi * A, 4), - torch.rand(N, Hi * Wi * A), - torch.tensor([shp, shp]), - ) - ) - torch.jit.trace( - func, (proposal, pred_logit, torch.tensor([100, 100])), check_inputs=other_inputs - ) - - def test_append_gt_to_proposal(self): - proposals = Instances( - (10, 10), - **{ - "proposal_boxes": Boxes(torch.empty((0, 4))), - "objectness_logits": torch.tensor([]), - "custom_attribute": torch.tensor([]), - } - ) - gt_boxes = Boxes(torch.tensor([[0, 0, 1, 1]])) - - 
self.assertRaises(AssertionError, add_ground_truth_to_proposals, [gt_boxes], [proposals]) - - gt_instances = Instances((10, 10)) - gt_instances.gt_boxes = gt_boxes - - self.assertRaises( - AssertionError, add_ground_truth_to_proposals, [gt_instances], [proposals] - ) - - gt_instances.custom_attribute = torch.tensor([1]) - gt_instances.custom_attribute2 = torch.tensor([1]) - new_proposals = add_ground_truth_to_proposals([gt_instances], [proposals])[0] - - self.assertEqual(new_proposals.custom_attribute[0], 1) - # new proposals should only include the attributes in proposals - self.assertRaises(AttributeError, lambda: new_proposals.custom_attribute2) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/__init__.py deleted file mode 100755 index e69de29b..00000000 diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py deleted file mode 100755 index 10119181..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_boxes.py +++ /dev/null @@ -1,223 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import json -import math -import numpy as np -import unittest -import torch - -from detectron2.structures import Boxes, BoxMode, pairwise_ioa, pairwise_iou -from detectron2.utils.testing import reload_script_model - - -class TestBoxMode(unittest.TestCase): - def _convert_xy_to_wh(self, x): - return BoxMode.convert(x, BoxMode.XYXY_ABS, BoxMode.XYWH_ABS) - - def _convert_xywha_to_xyxy(self, x): - return BoxMode.convert(x, BoxMode.XYWHA_ABS, BoxMode.XYXY_ABS) - - def _convert_xywh_to_xywha(self, x): - return BoxMode.convert(x, BoxMode.XYWH_ABS, BoxMode.XYWHA_ABS) - - def test_convert_int_mode(self): - BoxMode.convert([1, 2, 3, 4], 0, 1) - - def test_box_convert_list(self): - for tp in [list, tuple]: - box = tp([5.0, 5.0, 10.0, 10.0]) - output = self._convert_xy_to_wh(box) - self.assertIsInstance(output, tp) - self.assertIsInstance(output[0], float) - self.assertEqual(output, tp([5.0, 5.0, 5.0, 5.0])) - - with self.assertRaises(Exception): - self._convert_xy_to_wh([box]) - - def test_box_convert_array(self): - box = np.asarray([[5, 5, 10, 10], [1, 1, 2, 3]]) - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - def test_box_convert_cpu_tensor(self): - box = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - output = output.numpy() - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_box_convert_cuda_tensor(self): - box = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]).cuda() - output = self._convert_xy_to_wh(box) - self.assertEqual(output.dtype, box.dtype) - self.assertEqual(output.shape, box.shape) - self.assertEqual(output.device, box.device) - output = output.cpu().numpy() - self.assertTrue((output[0] == [5, 5, 5, 5]).all()) - self.assertTrue((output[1] == [1, 1, 1, 2]).all()) - - def test_box_convert_xywha_to_xyxy_list(self): - for tp in [list, tuple]: - box = tp([50, 50, 30, 20, 0]) - 
output = self._convert_xywha_to_xyxy(box) - self.assertIsInstance(output, tp) - self.assertEqual(output, tp([35, 40, 65, 60])) - - with self.assertRaises(Exception): - self._convert_xywha_to_xyxy([box]) - - def test_box_convert_xywha_to_xyxy_array(self): - for dtype in [np.float64, np.float32]: - box = np.asarray( - [ - [50, 50, 30, 20, 0], - [50, 50, 30, 20, 90], - [1, 1, math.sqrt(2), math.sqrt(2), -45], - ], - dtype=dtype, - ) - output = self._convert_xywha_to_xyxy(box) - self.assertEqual(output.dtype, box.dtype) - expected = np.asarray([[35, 40, 65, 60], [40, 35, 60, 65], [0, 0, 2, 2]], dtype=dtype) - self.assertTrue(np.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywha_to_xyxy_tensor(self): - for dtype in [torch.float32, torch.float64]: - box = torch.tensor( - [ - [50, 50, 30, 20, 0], - [50, 50, 30, 20, 90], - [1, 1, math.sqrt(2), math.sqrt(2), -45], - ], - dtype=dtype, - ) - output = self._convert_xywha_to_xyxy(box) - self.assertEqual(output.dtype, box.dtype) - expected = torch.tensor([[35, 40, 65, 60], [40, 35, 60, 65], [0, 0, 2, 2]], dtype=dtype) - - self.assertTrue(torch.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywh_to_xywha_list(self): - for tp in [list, tuple]: - box = tp([50, 50, 30, 20]) - output = self._convert_xywh_to_xywha(box) - self.assertIsInstance(output, tp) - self.assertEqual(output, tp([65, 60, 30, 20, 0])) - - with self.assertRaises(Exception): - self._convert_xywh_to_xywha([box]) - - def test_box_convert_xywh_to_xywha_array(self): - for dtype in [np.float64, np.float32]: - box = np.asarray([[30, 40, 70, 60], [30, 40, 60, 70], [-1, -1, 2, 2]], dtype=dtype) - output = self._convert_xywh_to_xywha(box) - self.assertEqual(output.dtype, box.dtype) - expected = np.asarray( - [[65, 70, 70, 60, 0], [60, 75, 60, 70, 0], [0, 0, 2, 2, 0]], dtype=dtype - ) - self.assertTrue(np.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_box_convert_xywh_to_xywha_tensor(self): - for dtype in [torch.float32, torch.float64]: - box = torch.tensor([[30, 40, 70, 60], [30, 40, 60, 70], [-1, -1, 2, 2]], dtype=dtype) - output = self._convert_xywh_to_xywha(box) - self.assertEqual(output.dtype, box.dtype) - expected = torch.tensor( - [[65, 70, 70, 60, 0], [60, 75, 60, 70, 0], [0, 0, 2, 2, 0]], dtype=dtype - ) - - self.assertTrue(torch.allclose(output, expected, atol=1e-6), "output={}".format(output)) - - def test_json_serializable(self): - payload = {"box_mode": BoxMode.XYWH_REL} - try: - json.dumps(payload) - except Exception: - self.fail("JSON serialization failed") - - def test_json_deserializable(self): - payload = '{"box_mode": 2}' - obj = json.loads(payload) - try: - obj["box_mode"] = BoxMode(obj["box_mode"]) - except Exception: - self.fail("JSON deserialization failed") - - -class TestBoxIOU(unittest.TestCase): - def create_boxes(self): - boxes1 = torch.tensor([[0.0, 0.0, 1.0, 1.0], [0.0, 0.0, 1.0, 1.0]]) - - boxes2 = torch.tensor( - [ - [0.0, 0.0, 1.0, 1.0], - [0.0, 0.0, 0.5, 1.0], - [0.0, 0.0, 1.0, 0.5], - [0.0, 0.0, 0.5, 0.5], - [0.5, 0.5, 1.0, 1.0], - [0.5, 0.5, 1.5, 1.5], - ] - ) - return boxes1, boxes2 - - def test_pairwise_iou(self): - boxes1, boxes2 = self.create_boxes() - expected_ious = torch.tensor( - [ - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - ] - ) - - ious = pairwise_iou(Boxes(boxes1), Boxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_ioa(self): - 
boxes1, boxes2 = self.create_boxes() - expected_ioas = torch.tensor( - [[1.0, 1.0, 1.0, 1.0, 1.0, 0.25], [1.0, 1.0, 1.0, 1.0, 1.0, 0.25]] - ) - ioas = pairwise_ioa(Boxes(boxes1), Boxes(boxes2)) - self.assertTrue(torch.allclose(ioas, expected_ioas)) - - -class TestBoxes(unittest.TestCase): - def test_empty_cat(self): - x = Boxes.cat([]) - self.assertTrue(x.tensor.shape, (0, 4)) - - def test_to(self): - x = Boxes(torch.rand(3, 4)) - self.assertEqual(x.to(device="cpu").tensor.device.type, "cpu") - - def test_scriptability(self): - def func(x): - boxes = Boxes(x) - test = boxes.to(torch.device("cpu")).tensor - return boxes.area(), test - - f = torch.jit.script(func) - f = reload_script_model(f) - f(torch.rand((3, 4))) - - data = torch.rand((3, 4)) - - def func_cat(x: torch.Tensor): - boxes1 = Boxes(x) - boxes2 = Boxes(x) - # boxes3 = Boxes.cat([boxes1, boxes2]) # this is not supported by torchsript for now. - boxes3 = boxes1.cat([boxes1, boxes2]) - return boxes3 - - f = torch.jit.script(func_cat) - script_box = f(data) - self.assertTrue(torch.equal(torch.cat([data, data]), script_box.tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py deleted file mode 100755 index e446e44a..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_imagelist.py +++ /dev/null @@ -1,75 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import unittest -from typing import List, Sequence, Tuple -import torch - -from detectron2.structures import ImageList - - -class TestImageList(unittest.TestCase): - def test_imagelist_padding_tracing(self): - # test that the trace does not contain hard-coded constant sizes - def to_imagelist(tensors: Sequence[torch.Tensor]): - image_list = ImageList.from_tensors(tensors, 4) - return image_list.tensor, image_list.image_sizes - - def _tensor(*shape): - return torch.ones(shape, dtype=torch.float32) - - # test CHW (inputs needs padding vs. 
no padding) - for shape in [(3, 10, 10), (3, 12, 12)]: - func = torch.jit.trace(to_imagelist, ([_tensor(*shape)],)) - tensor, image_sizes = func([_tensor(3, 15, 20)]) - self.assertEqual(tensor.shape, (1, 3, 16, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [15, 20], image_sizes[0]) - - # test HW - func = torch.jit.trace(to_imagelist, ([_tensor(10, 10)],)) - tensor, image_sizes = func([_tensor(15, 20)]) - self.assertEqual(tensor.shape, (1, 16, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [15, 20], image_sizes[0]) - - # test 2x CHW - func = torch.jit.trace( - to_imagelist, - ([_tensor(3, 16, 10), _tensor(3, 13, 11)],), - ) - tensor, image_sizes = func([_tensor(3, 25, 20), _tensor(3, 10, 10)]) - self.assertEqual(tensor.shape, (2, 3, 28, 20), tensor.shape) - self.assertEqual(image_sizes[0].tolist(), [25, 20], image_sizes[0]) - self.assertEqual(image_sizes[1].tolist(), [10, 10], image_sizes[1]) - # support calling with different spatial sizes, but not with different #images - - def test_imagelist_scriptability(self): - image_nums = 2 - image_tensor = torch.randn((image_nums, 10, 20), dtype=torch.float32) - image_shape = [(10, 20)] * image_nums - - def f(image_tensor, image_shape: List[Tuple[int, int]]): - return ImageList(image_tensor, image_shape) - - ret = f(image_tensor, image_shape) - ret_script = torch.jit.script(f)(image_tensor, image_shape) - - self.assertEqual(len(ret), len(ret_script)) - for i in range(image_nums): - self.assertTrue(torch.equal(ret[i], ret_script[i])) - - def test_imagelist_from_tensors_scriptability(self): - image_tensor_0 = torch.randn(10, 20, dtype=torch.float32) - image_tensor_1 = torch.randn(12, 22, dtype=torch.float32) - inputs = [image_tensor_0, image_tensor_1] - - def f(image_tensor: List[torch.Tensor]): - return ImageList.from_tensors(image_tensor, 10) - - ret = f(inputs) - ret_script = torch.jit.script(f)(inputs) - - self.assertEqual(len(ret), len(ret_script)) - self.assertTrue(torch.equal(ret.tensor, ret_script.tensor)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py deleted file mode 100755 index a352f743..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_instances.py +++ /dev/null @@ -1,219 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
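Before the `Instances` tests that follow, a minimal sketch of what `ImageList.from_tensors` (exercised by the tracing tests above) does with a size divisibility of 4; the expected shapes follow directly from the assertions above:

```
import torch
from detectron2.structures import ImageList

t1 = torch.ones(3, 10, 10)
t2 = torch.ones(3, 15, 20)
# Pad every image up to a common size divisible by 4, batch into one tensor.
batched = ImageList.from_tensors([t1, t2], 4)
assert batched.tensor.shape == (2, 3, 16, 20)   # 15 -> 16; 20 already divisible
assert batched.image_sizes == [(10, 10), (15, 20)]
```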
-import unittest -import torch -from torch import Tensor - -from detectron2.export.torchscript import patch_instances -from detectron2.structures import Boxes, Instances -from detectron2.utils.testing import convert_scripted_instances - - -class TestInstances(unittest.TestCase): - def test_int_indexing(self): - attr1 = torch.tensor([[0.0, 0.0, 1.0], [0.0, 0.0, 0.5], [0.0, 0.0, 1.0], [0.0, 0.5, 0.5]]) - attr2 = torch.tensor([0.1, 0.2, 0.3, 0.4]) - instances = Instances((100, 100)) - instances.attr1 = attr1 - instances.attr2 = attr2 - for i in range(-len(instances), len(instances)): - inst = instances[i] - self.assertEqual((inst.attr1 == attr1[i]).all(), True) - self.assertEqual((inst.attr2 == attr2[i]).all(), True) - - self.assertRaises(IndexError, lambda: instances[len(instances)]) - self.assertRaises(IndexError, lambda: instances[-len(instances) - 1]) - - def test_script_new_fields(self): - def get_mask(x: Instances) -> torch.Tensor: - return x.mask - - class f(torch.nn.Module): - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes # noqa F841 - objectness_logits = x.objectness_logits # noqa F841 - return x - - class g(torch.nn.Module): - def forward(self, x: Instances): - return get_mask(x) - - class g2(torch.nn.Module): - def __init__(self): - super().__init__() - self.g = g() - - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes # noqa F841 - return x, self.g(x) - - fields = {"proposal_boxes": Boxes, "objectness_logits": Tensor} - with patch_instances(fields): - torch.jit.script(f()) - - # can't script anymore after exiting the context - with self.assertRaises(Exception): - # will create a ConcreteType for g - torch.jit.script(g2()) - - new_fields = {"mask": Tensor} - with patch_instances(new_fields): - # will compile g with a different Instances; this should pass - torch.jit.script(g()) - with self.assertRaises(Exception): - torch.jit.script(g2()) - - new_fields = {"mask": Tensor, "proposal_boxes": Boxes} - with patch_instances(new_fields) as NewInstances: - # get_mask will be compiled with a different Instances; this should pass - scripted_g2 = torch.jit.script(g2()) - x = NewInstances((3, 4)) - x.mask = torch.rand(3) - x.proposal_boxes = Boxes(torch.rand(3, 4)) - scripted_g2(x) # it should accept the new Instances object and run successfully - - def test_script_access_fields(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - proposal_boxes = x.proposal_boxes - objectness_logits = x.objectness_logits - return proposal_boxes.tensor + objectness_logits - - fields = {"proposal_boxes": Boxes, "objectness_logits": Tensor} - with patch_instances(fields): - torch.jit.script(f()) - - def test_script_len(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return len(x) - - class g(torch.nn.Module): - def forward(self, x: Instances): - return len(x) - - image_shape = (15, 15) - - fields = {"proposal_boxes": Boxes} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - with self.assertRaises(Exception): - script_module(x) - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - length = script_module(x) - self.assertEqual(length, 2) - - fields = {"objectness_logits": Tensor} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(g()) - x = new_instance(image_shape) - objectness_logits = torch.tensor([1.0]).reshape(1, 1) - x.objectness_logits = objectness_logits - length = 
script_module(x) - self.assertEqual(length, 1) - - def test_script_has(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return x.has("proposal_boxes") - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - self.assertFalse(script_module(x)) - - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - self.assertTrue(script_module(x)) - - def test_script_to(self): - class f(torch.nn.Module): - def forward(self, x: Instances): - return x.to(torch.device("cpu")) - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - x = new_instance(image_shape) - script_module(x) - - box_tensors = torch.tensor([[5, 5, 10, 10], [1, 1, 2, 3]]) - x.proposal_boxes = Boxes(box_tensors) - x.a = box_tensors - script_module(x) - - def test_script_getitem(self): - class f(torch.nn.Module): - def forward(self, x: Instances, idx): - return x[idx] - - image_shape = (15, 15) - fields = {"proposal_boxes": Boxes, "a": Tensor} - inst = Instances(image_shape) - inst.proposal_boxes = Boxes(torch.rand(4, 4)) - inst.a = torch.rand(4, 10) - idx = torch.tensor([True, False, True, False]) - with patch_instances(fields) as new_instance: - script_module = torch.jit.script(f()) - - out = f()(inst, idx) - out_scripted = script_module(new_instance.from_instances(inst), idx) - self.assertTrue( - torch.equal(out.proposal_boxes.tensor, out_scripted.proposal_boxes.tensor) - ) - self.assertTrue(torch.equal(out.a, out_scripted.a)) - - def test_from_to_instances(self): - orig = Instances((30, 30)) - orig.proposal_boxes = Boxes(torch.rand(3, 4)) - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields) as NewInstances: - # convert to NewInstances and back - new1 = NewInstances.from_instances(orig) - new2 = convert_scripted_instances(new1) - self.assertTrue(torch.equal(orig.proposal_boxes.tensor, new1.proposal_boxes.tensor)) - self.assertTrue(torch.equal(orig.proposal_boxes.tensor, new2.proposal_boxes.tensor)) - - def test_script_init_args(self): - def f(x: Tensor): - image_shape = (15, 15) - # __init__ can take arguments - inst = Instances(image_shape, a=x, proposal_boxes=Boxes(x)) - inst2 = Instances(image_shape, a=x) - return inst.a, inst2.a - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields): - script_f = torch.jit.script(f) - x = torch.randn(3, 4) - outputs = script_f(x) - self.assertTrue(torch.equal(outputs[0], x)) - self.assertTrue(torch.equal(outputs[1], x)) - - def test_script_cat(self): - def f(x: Tensor): - image_shape = (15, 15) - # __init__ can take arguments - inst = Instances(image_shape, a=x) - inst2 = Instances(image_shape, a=x) - - inst3 = Instances(image_shape, proposal_boxes=Boxes(x)) - return inst.cat([inst, inst2]), inst3.cat([inst3, inst3]) - - fields = {"proposal_boxes": Boxes, "a": Tensor} - with patch_instances(fields): - script_f = torch.jit.script(f) - x = torch.randn(3, 4) - output, output2 = script_f(x) - self.assertTrue(torch.equal(output.a, torch.cat([x, x]))) - self.assertFalse(output.has("proposal_boxes")) - self.assertTrue(torch.equal(output2.proposal_boxes.tensor, torch.cat([x, x]))) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py 
b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py deleted file mode 100755 index adc616e4..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_keypoints.py +++ /dev/null @@ -1,19 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.structures.keypoints import Keypoints - - -class TestKeypoints(unittest.TestCase): - def test_cat_keypoints(self): - keypoints1 = Keypoints(torch.rand(2, 21, 3)) - keypoints2 = Keypoints(torch.rand(4, 21, 3)) - - cat_keypoints = keypoints1.cat([keypoints1, keypoints2]) - self.assertTrue(torch.all(cat_keypoints.tensor[:2] == keypoints1.tensor).item()) - self.assertTrue(torch.all(cat_keypoints.tensor[2:] == keypoints2.tensor).item()) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py deleted file mode 100755 index 7991eb0b..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_masks.py +++ /dev/null @@ -1,53 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.structures.masks import BitMasks, PolygonMasks, polygons_to_bitmask - - -class TestBitMask(unittest.TestCase): - def test_get_bounding_box(self): - masks = torch.tensor( - [ - [ - [False, False, False, True], - [False, False, True, True], - [False, True, True, False], - [False, True, True, False], - ], - [ - [False, False, False, False], - [False, False, True, False], - [False, True, True, False], - [False, True, True, False], - ], - torch.zeros(4, 4), - ] - ) - bitmask = BitMasks(masks) - box_true = torch.tensor([[1, 0, 4, 4], [1, 1, 3, 4], [0, 0, 0, 0]], dtype=torch.float32) - box = bitmask.get_bounding_boxes() - self.assertTrue(torch.all(box.tensor == box_true).item()) - - for box in box_true: - poly = box[[0, 1, 2, 1, 2, 3, 0, 3]].numpy() - mask = polygons_to_bitmask([poly], 4, 4) - reconstruct_box = BitMasks(mask[None, :, :]).get_bounding_boxes()[0].tensor - self.assertTrue(torch.all(box == reconstruct_box).item()) - - reconstruct_box = PolygonMasks([[poly]]).get_bounding_boxes()[0].tensor - self.assertTrue(torch.all(box == reconstruct_box).item()) - - def test_from_empty_polygons(self): - masks = BitMasks.from_polygon_masks([], 100, 100) - self.assertEqual(masks.tensor.shape, (0, 100, 100)) - - def test_getitem(self): - masks = BitMasks(torch.ones(3, 10, 10)) - self.assertEqual(masks[1].tensor.shape, (1, 10, 10)) - self.assertEqual(masks[1:3].tensor.shape, (2, 10, 10)) - self.assertEqual(masks[torch.tensor([True, False, False])].tensor.shape, (1, 10, 10)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py b/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py deleted file mode 100755 index 27812374..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/structures/test_rotated_boxes.py +++ /dev/null @@ -1,437 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
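The bitmask round trip in `test_get_bounding_box` above reduces to a few lines; a sketch assuming numpy and detectron2, with values mirroring the first `box_true` entry in that test:

```
import numpy as np
from detectron2.structures.masks import BitMasks, polygons_to_bitmask

# Rasterize a square polygon (x, y pairs) into a 4x4 binary mask, then
# recover its bounding box from the mask, mirroring the round trip above.
poly = np.array([1, 0, 4, 0, 4, 4, 1, 4], dtype=np.float64)
mask = polygons_to_bitmask([poly], 4, 4)          # (H, W) bool array
boxes = BitMasks(mask[None, :, :]).get_bounding_boxes()
print(boxes.tensor)                               # ~ tensor([[1., 0., 4., 4.]])
```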
-from __future__ import absolute_import, division, print_function, unicode_literals -import logging -import math -import random -import unittest -import torch -from fvcore.common.benchmark import benchmark - -from detectron2.layers.rotated_boxes import pairwise_iou_rotated -from detectron2.structures.boxes import Boxes -from detectron2.structures.rotated_boxes import RotatedBoxes, pairwise_iou -from detectron2.utils.testing import reload_script_model - -logger = logging.getLogger(__name__) - - -class TestRotatedBoxesLayer(unittest.TestCase): - def test_iou_0_dim_cpu(self): - boxes1 = torch.rand(0, 5, dtype=torch.float32) - boxes2 = torch.rand(10, 5, dtype=torch.float32) - expected_ious = torch.zeros(0, 10, dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - boxes1 = torch.rand(10, 5, dtype=torch.float32) - boxes2 = torch.rand(0, 5, dtype=torch.float32) - expected_ious = torch.zeros(10, 0, dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_0_dim_cuda(self): - boxes1 = torch.rand(0, 5, dtype=torch.float32) - boxes2 = torch.rand(10, 5, dtype=torch.float32) - expected_ious = torch.zeros(0, 10, dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - boxes1 = torch.rand(10, 5, dtype=torch.float32) - boxes2 = torch.rand(0, 5, dtype=torch.float32) - expected_ious = torch.zeros(10, 0, dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - def test_iou_half_overlap_cpu(self): - boxes1 = torch.tensor([[0.5, 0.5, 1.0, 1.0, 0.0]], dtype=torch.float32) - boxes2 = torch.tensor([[0.25, 0.5, 0.5, 1.0, 0.0]], dtype=torch.float32) - expected_ious = torch.tensor([[0.5]], dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious, expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_half_overlap_cuda(self): - boxes1 = torch.tensor([[0.5, 0.5, 1.0, 1.0, 0.0]], dtype=torch.float32) - boxes2 = torch.tensor([[0.25, 0.5, 0.5, 1.0, 0.0]], dtype=torch.float32) - expected_ious = torch.tensor([[0.5]], dtype=torch.float32) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTrue(torch.allclose(ious_cuda.cpu(), expected_ious)) - - def test_iou_precision(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor([[565, 565, 10, 10.0, 0]], dtype=torch.float32, device=device) - boxes2 = torch.tensor([[565, 565, 10, 8.3, 0]], dtype=torch.float32, device=device) - iou = 8.3 / 10.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_iou_too_many_boxes_cuda(self): - s1, s2 = 5, 1289035 - boxes1 = torch.zeros(s1, 5) - boxes2 = torch.zeros(s2, 5) - ious_cuda = pairwise_iou_rotated(boxes1.cuda(), boxes2.cuda()) - self.assertTupleEqual(tuple(ious_cuda.shape), (s1, s2)) - - def test_iou_extreme(self): - # Cause floating point issues in cuda kernels (#1266) - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = 
torch.tensor([[160.0, 153.0, 230.0, 23.0, -37.0]], device=device) - boxes2 = torch.tensor( - [ - [ - -1.117407639806935e17, - 1.3858420478349148e18, - 1000.0000610351562, - 1000.0000610351562, - 1612.0, - ] - ], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - self.assertTrue(ious.min() >= 0, ious) - - def test_iou_issue_2154(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [ - 296.6620178222656, - 458.73883056640625, - 23.515729904174805, - 47.677001953125, - 0.08795166015625, - ] - ], - device=device, - ) - boxes2 = torch.tensor( - [[296.66201, 458.73882000000003, 23.51573, 47.67702, 0.087951]], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - expected_ious = torch.tensor([[1.0]], dtype=torch.float32) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - def test_iou_issue_2167(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [ - 2563.74462890625000000000, - 1436.79016113281250000000, - 2174.70336914062500000000, - 214.09500122070312500000, - 115.11834716796875000000, - ] - ], - device=device, - ) - boxes2 = torch.tensor( - [ - [ - 2563.74462890625000000000, - 1436.79028320312500000000, - 2174.70288085937500000000, - 214.09495544433593750000, - 115.11835479736328125000, - ] - ], - device=device, - ) - ious = pairwise_iou_rotated(boxes1, boxes2) - expected_ious = torch.tensor([[1.0]], dtype=torch.float32) - self.assertTrue(torch.allclose(ious.cpu(), expected_ious)) - - -class TestRotatedBoxesStructure(unittest.TestCase): - def test_clip_area_0_degree(self): - for _ in range(50): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - # Convert from (x_ctr, y_ctr, w, h, 0) to (x1, y1, x2, y2) - boxes_4d = torch.zeros(num_boxes, 4) - boxes_4d[:, 0] = boxes_5d[:, 0] - boxes_5d[:, 2] / 2.0 - boxes_4d[:, 1] = boxes_5d[:, 1] - boxes_5d[:, 3] / 2.0 - boxes_4d[:, 2] = boxes_5d[:, 0] + boxes_5d[:, 2] / 2.0 - boxes_4d[:, 3] = boxes_5d[:, 1] + boxes_5d[:, 3] / 2.0 - - image_size = (500, 600) - test_boxes_4d = Boxes(boxes_4d) - test_boxes_5d = RotatedBoxes(boxes_5d) - # Before clip - areas_4d = test_boxes_4d.area() - areas_5d = test_boxes_5d.area() - self.assertTrue(torch.allclose(areas_4d, areas_5d, atol=1e-1, rtol=1e-5)) - # After clip - test_boxes_4d.clip(image_size) - test_boxes_5d.clip(image_size) - areas_4d = test_boxes_4d.area() - areas_5d = test_boxes_5d.area() - self.assertTrue(torch.allclose(areas_4d, areas_5d, atol=1e-1, rtol=1e-5)) - - def test_clip_area_arbitrary_angle(self): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - clip_angle_threshold = random.uniform(0, 180) - - image_size = (500, 600) - test_boxes_5d = RotatedBoxes(boxes_5d) - # Before clip - areas_before = test_boxes_5d.area() - # After clip - test_boxes_5d.clip(image_size, clip_angle_threshold) - areas_diff = test_boxes_5d.area() - areas_before 
- - # the areas should only decrease after clipping - self.assertTrue(torch.all(areas_diff <= 0)) - # whenever the box is clipped (thus the area shrinks), - # the angle for the box must be within the clip_angle_threshold - # Note that the clip function will normalize the angle range - # to be within (-180, 180] - self.assertTrue( - torch.all(torch.abs(boxes_5d[:, 4][torch.where(areas_diff < 0)]) < clip_angle_threshold) - ) - - def test_normalize_angles(self): - # torch.manual_seed(0) - for _ in range(50): - num_boxes = 100 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-100, 500) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, 500) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - rotated_boxes = RotatedBoxes(boxes_5d) - normalized_boxes = rotated_boxes.clone() - normalized_boxes.normalize_angles() - self.assertTrue(torch.all(normalized_boxes.tensor[:, 4] >= -180)) - self.assertTrue(torch.all(normalized_boxes.tensor[:, 4] < 180)) - # x, y, w, h should not change - self.assertTrue(torch.allclose(boxes_5d[:, :4], normalized_boxes.tensor[:, :4])) - # the cos/sin values of the angles should stay the same - - self.assertTrue( - torch.allclose( - torch.cos(boxes_5d[:, 4] * math.pi / 180), - torch.cos(normalized_boxes.tensor[:, 4] * math.pi / 180), - atol=1e-5, - ) - ) - - self.assertTrue( - torch.allclose( - torch.sin(boxes_5d[:, 4] * math.pi / 180), - torch.sin(normalized_boxes.tensor[:, 4] * math.pi / 180), - atol=1e-5, - ) - ) - - def test_pairwise_iou_0_degree(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [[0.5, 0.5, 1.0, 1.0, 0.0], [0.5, 0.5, 1.0, 1.0, 0.0]], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor( - [ - [0.5, 0.5, 1.0, 1.0, 0.0], - [0.25, 0.5, 0.5, 1.0, 0.0], - [0.5, 0.25, 1.0, 0.5, 0.0], - [0.25, 0.25, 0.5, 0.5, 0.0], - [0.75, 0.75, 0.5, 0.5, 0.0], - [1.0, 1.0, 1.0, 1.0, 0.0], - ], - dtype=torch.float32, - device=device, - ) - expected_ious = torch.tensor( - [ - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - [1.0, 0.5, 0.5, 0.25, 0.25, 0.25 / (2 - 0.25)], - ], - dtype=torch.float32, - device=device, - ) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_45_degrees(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [ - [1, 1, math.sqrt(2), math.sqrt(2), 45], - [1, 1, 2 * math.sqrt(2), 2 * math.sqrt(2), -45], - ], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor([[1, 1, 2, 2, 0]], dtype=torch.float32, device=device) - expected_ious = torch.tensor([[0.5], [0.5]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_orthogonal(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor([[5, 5, 10, 6, 55]], dtype=torch.float32, device=device) - boxes2 = torch.tensor([[5, 5, 10, 6, -35]], dtype=torch.float32, device=device) - iou = (6.0 * 6.0) / (6.0 * 6.0 + 4.0 * 6.0 + 4.0 * 6.0) - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - 
self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_large_close_boxes(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - boxes1 = torch.tensor( - [[299.500000, 417.370422, 600.000000, 364.259186, 27.1828]], - dtype=torch.float32, - device=device, - ) - boxes2 = torch.tensor( - [[299.500000, 417.370422, 600.000000, 364.259155, 27.1828]], - dtype=torch.float32, - device=device, - ) - iou = 364.259155 / 364.259186 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_many_boxes(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - num_boxes1 = 100 - num_boxes2 = 200 - boxes1 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 10, 0], - dtype=torch.float32, - device=device, - ) - for i in range(num_boxes1) - ] - ) - boxes2 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 1 + 9 * i / num_boxes2, 0], - dtype=torch.float32, - device=device, - ) - for i in range(num_boxes2) - ] - ) - expected_ious = torch.zeros(num_boxes1, num_boxes2, dtype=torch.float32, device=device) - for i in range(min(num_boxes1, num_boxes2)): - expected_ious[i][i] = (1 + 9 * i / num_boxes2) / 10.0 - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_issue1207_simplified(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - # Simplified test case of D2-issue-1207 - boxes1 = torch.tensor([[3, 3, 8, 2, -45.0]], device=device) - boxes2 = torch.tensor([[6, 0, 8, 2, -45.0]], device=device) - iou = 0.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_pairwise_iou_issue1207(self): - for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []): - # The original test case in D2-issue-1207 - boxes1 = torch.tensor([[160.0, 153.0, 230.0, 23.0, -37.0]], device=device) - boxes2 = torch.tensor([[190.0, 127.0, 80.0, 21.0, -46.0]], device=device) - - iou = 0.0 - expected_ious = torch.tensor([[iou]], dtype=torch.float32, device=device) - - ious = pairwise_iou(RotatedBoxes(boxes1), RotatedBoxes(boxes2)) - self.assertTrue(torch.allclose(ious, expected_ious)) - - def test_empty_cat(self): - x = RotatedBoxes.cat([]) - self.assertTrue(x.tensor.shape, (0, 5)) - - def test_scriptability(self): - def func(x): - boxes = RotatedBoxes(x) - test = boxes.to(torch.device("cpu")).tensor - return boxes.area(), test - - f = torch.jit.script(func) - f = reload_script_model(f) - f(torch.rand((3, 5))) - - data = torch.rand((3, 5)) - - def func_cat(x: torch.Tensor): - boxes1 = RotatedBoxes(x) - boxes2 = RotatedBoxes(x) - # this is not supported by torchscript for now. 
- # boxes3 = RotatedBoxes.cat([boxes1, boxes2]) - boxes3 = boxes1.cat([boxes1, boxes2]) - return boxes3 - - f = torch.jit.script(func_cat) - script_box = f(data) - self.assertTrue(torch.equal(torch.cat([data, data]), script_box.tensor)) - - -def benchmark_rotated_iou(): - num_boxes1 = 200 - num_boxes2 = 500 - boxes1 = torch.stack( - [ - torch.tensor([5 + 20 * i, 5 + 20 * i, 10, 10, 0], dtype=torch.float32) - for i in range(num_boxes1) - ] - ) - boxes2 = torch.stack( - [ - torch.tensor( - [5 + 20 * i, 5 + 20 * i, 10, 1 + 9 * i / num_boxes2, 0], - dtype=torch.float32, - ) - for i in range(num_boxes2) - ] - ) - - def func(dev, n=1): - b1 = boxes1.to(device=dev) - b2 = boxes2.to(device=dev) - - def bench(): - for _ in range(n): - pairwise_iou_rotated(b1, b2) - if dev.type == "cuda": - torch.cuda.synchronize() - - return bench - - # only run it once per timed loop, since it's slow - args = [{"dev": torch.device("cpu"), "n": 1}] - if torch.cuda.is_available(): - args.append({"dev": torch.device("cuda"), "n": 10}) - - benchmark(func, "rotated_iou", args, warmup_iters=3) - - -if __name__ == "__main__": - unittest.main() - benchmark_rotated_iou() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py deleted file mode 100755 index ab0bfbd5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_checkpoint.py +++ /dev/null @@ -1,49 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -from collections import OrderedDict -import torch -from torch import nn - -from detectron2.checkpoint.c2_model_loading import align_and_update_state_dicts -from detectron2.utils.logger import setup_logger - - -class TestCheckpointer(unittest.TestCase): - def setUp(self): - setup_logger() - - def create_complex_model(self): - m = nn.Module() - m.block1 = nn.Module() - m.block1.layer1 = nn.Linear(2, 3) - m.layer2 = nn.Linear(3, 2) - m.res = nn.Module() - m.res.layer2 = nn.Linear(3, 2) - - state_dict = OrderedDict() - state_dict["layer1.weight"] = torch.rand(3, 2) - state_dict["layer1.bias"] = torch.rand(3) - state_dict["layer2.weight"] = torch.rand(2, 3) - state_dict["layer2.bias"] = torch.rand(2) - state_dict["res.layer2.weight"] = torch.rand(2, 3) - state_dict["res.layer2.bias"] = torch.rand(2) - return m, state_dict - - def test_complex_model_loaded(self): - for add_data_parallel in [False, True]: - model, state_dict = self.create_complex_model() - if add_data_parallel: - model = nn.DataParallel(model) - model_sd = model.state_dict() - - sd_to_load = align_and_update_state_dicts(model_sd, state_dict) - model.load_state_dict(sd_to_load) - for loaded, stored in zip(model_sd.values(), state_dict.values()): - # different tensor references - self.assertFalse(id(loaded) == id(stored)) - # same content - self.assertTrue(loaded.to(stored).equal(stored)) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py deleted file mode 100755 index 6f6a0997..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_engine.py +++ /dev/null @@ -1,186 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
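For context on `align_and_update_state_dicts` as used in `test_complex_model_loaded` above: the heuristic matches checkpoint keys to model keys by common suffix, so a checkpoint without the model's module prefixes still loads. A minimal sketch, assuming only torch and detectron2:

```
import torch
from torch import nn
from detectron2.checkpoint.c2_model_loading import align_and_update_state_dicts

# "layer1.weight" in the checkpoint lacks the model's "block1." prefix;
# suffix matching maps it onto "block1.layer1.weight".
model = nn.Module()
model.block1 = nn.Module()
model.block1.layer1 = nn.Linear(2, 3)

ckpt = {"layer1.weight": torch.rand(3, 2), "layer1.bias": torch.rand(3)}
sd_to_load = align_and_update_state_dicts(model.state_dict(), ckpt)
model.load_state_dict(sd_to_load)
```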
- -import json -import math -import os -import tempfile -import time -import unittest -from unittest import mock -import torch -from fvcore.common.checkpoint import Checkpointer -from torch import nn - -from detectron2 import model_zoo -from detectron2.config import configurable, get_cfg -from detectron2.engine import DefaultTrainer, SimpleTrainer, default_setup, hooks -from detectron2.modeling.meta_arch import META_ARCH_REGISTRY -from detectron2.utils.events import CommonMetricPrinter, JSONWriter - - -@META_ARCH_REGISTRY.register() -class _SimpleModel(nn.Module): - @configurable - def __init__(self, sleep_sec=0): - super().__init__() - self.mod = nn.Linear(10, 20) - self.sleep_sec = sleep_sec - - @classmethod - def from_config(cls, cfg): - return {} - - def forward(self, x): - if self.sleep_sec > 0: - time.sleep(self.sleep_sec) - return {"loss": x.sum() + sum([x.mean() for x in self.parameters()])} - - -class TestTrainer(unittest.TestCase): - def _data_loader(self, device): - device = torch.device(device) - while True: - yield torch.rand(3, 3).to(device) - - def test_simple_trainer(self, device="cpu"): - model = _SimpleModel().to(device=device) - trainer = SimpleTrainer( - model, self._data_loader(device), torch.optim.SGD(model.parameters(), 0.1) - ) - trainer.train(0, 10) - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def test_simple_trainer_cuda(self): - self.test_simple_trainer(device="cuda") - - def test_writer_hooks(self): - model = _SimpleModel(sleep_sec=0.1) - trainer = SimpleTrainer( - model, self._data_loader("cpu"), torch.optim.SGD(model.parameters(), 0.1) - ) - - max_iter = 50 - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - json_file = os.path.join(d, "metrics.json") - writers = [CommonMetricPrinter(max_iter), JSONWriter(json_file)] - - trainer.register_hooks( - [hooks.EvalHook(0, lambda: {"metric": 100}), hooks.PeriodicWriter(writers)] - ) - with self.assertLogs(writers[0].logger) as logs: - trainer.train(0, max_iter) - - with open(json_file, "r") as f: - data = [json.loads(line.strip()) for line in f] - self.assertEqual([x["iteration"] for x in data], [19, 39, 49, 50]) - # the eval metric is in the last line with iter 50 - self.assertIn("metric", data[-1], "Eval metric must be in last line of JSON!") - - # test logged messages from CommonMetricPrinter - self.assertEqual(len(logs.output), 3) - for log, iter in zip(logs.output, [19, 39, 49]): - self.assertIn(f"iter: {iter}", log) - - self.assertIn("eta: 0:00:00", logs.output[-1], "Last ETA must be 0!") - - def test_default_trainer(self): - # TODO: this test requires manifold access, so changed device to CPU. 
see: T88318502 - cfg = get_cfg() - cfg.MODEL.DEVICE = "cpu" - cfg.MODEL.META_ARCHITECTURE = "_SimpleModel" - cfg.DATASETS.TRAIN = ("coco_2017_val_100",) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - cfg.OUTPUT_DIR = d - trainer = DefaultTrainer(cfg) - - # test property - self.assertIs(trainer.model, trainer._trainer.model) - trainer.model = _SimpleModel() - self.assertIs(trainer.model, trainer._trainer.model) - - def test_checkpoint_resume(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - scheduler = torch.optim.lr_scheduler.StepLR(opt, 3) - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - trainer = SimpleTrainer(model, dataloader, opt) - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - - trainer.register_hooks( - [ - hooks.LRScheduler(scheduler=scheduler), - # checkpoint after scheduler to properly save the state of scheduler - hooks.PeriodicCheckpointer(checkpointer, 10), - ] - ) - - trainer.train(0, 12) - self.assertAlmostEqual(opt.param_groups[0]["lr"], 1e-5) - self.assertEqual(scheduler.last_epoch, 12) - del trainer - - opt = torch.optim.SGD(model.parameters(), 999) # lr will be loaded - trainer = SimpleTrainer(model, dataloader, opt) - scheduler = torch.optim.lr_scheduler.StepLR(opt, 3) - trainer.register_hooks( - [ - hooks.LRScheduler(scheduler=scheduler), - ] - ) - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - checkpointer.resume_or_load("non_exist.pth") - self.assertEqual(trainer.iter, 11) # last finished iter number (0-based in Trainer) - # number of times `scheduler.step()` was called (1-based) - self.assertEqual(scheduler.last_epoch, 12) - self.assertAlmostEqual(opt.param_groups[0]["lr"], 1e-5) - - def test_eval_hook(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - - for total_iter, period, eval_count in [(30, 15, 2), (31, 15, 3), (20, 0, 1)]: - test_func = mock.Mock(return_value={"metric": 3.0}) - trainer = SimpleTrainer(model, dataloader, opt) - trainer.register_hooks([hooks.EvalHook(period, test_func)]) - trainer.train(0, total_iter) - self.assertEqual(test_func.call_count, eval_count) - - def test_best_checkpointer(self): - model = _SimpleModel() - dataloader = self._data_loader("cpu") - opt = torch.optim.SGD(model.parameters(), 0.1) - metric_name = "metric" - total_iter = 40 - test_period = 10 - test_cases = [ - ("max", iter([0.3, 0.4, 0.35, 0.5]), 3), - ("min", iter([1.0, 0.8, 0.9, 0.9]), 2), - ("min", iter([math.nan, 0.8, 0.9, 0.9]), 1), - ] - for mode, metrics, call_count in test_cases: - trainer = SimpleTrainer(model, dataloader, opt) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - checkpointer = Checkpointer(model, d, opt=opt, trainer=trainer) - trainer.register_hooks( - [ - hooks.EvalHook(test_period, lambda: {metric_name: next(metrics)}), - hooks.BestCheckpointer(test_period, checkpointer, metric_name, mode=mode), - ] - ) - with mock.patch.object(checkpointer, "save") as mock_save_method: - trainer.train(0, total_iter) - self.assertEqual(mock_save_method.call_count, call_count) - - def test_setup_config(self): - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - cfg = get_cfg() - cfg.OUTPUT_DIR = os.path.join(d, "yacs") - default_setup(cfg, {}) - - cfg = model_zoo.get_config("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.py") - cfg.train.output_dir = os.path.join(d, "omegaconf") - default_setup(cfg, {}) 
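The hook wiring that `test_eval_hook` and `test_best_checkpointer` above exercise reduces to a short pattern. A sketch in which `model`, `data_loader`, and `checkpointer` are assumed to exist as in `TestTrainer`, and the lambda stands in for a real evaluation function:

```
import torch
from detectron2.engine import SimpleTrainer, hooks

def build_trainer(model, data_loader, checkpointer, period=10):
    # Evaluate every `period` iterations; save a checkpoint only when
    # the reported "metric" improves (mode="max").
    trainer = SimpleTrainer(
        model, data_loader, torch.optim.SGD(model.parameters(), 0.1)
    )
    trainer.register_hooks([
        hooks.EvalHook(period, lambda: {"metric": 3.0}),
        hooks.BestCheckpointer(period, checkpointer, "metric", mode="max"),
    ])
    return trainer
```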
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_events.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_events.py deleted file mode 100755 index c1b03e4d..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_events.py +++ /dev/null @@ -1,64 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import json -import os -import tempfile -import unittest - -from detectron2.utils.events import CommonMetricPrinter, EventStorage, JSONWriter - - -class TestEventWriter(unittest.TestCase): - def testScalar(self): - with tempfile.TemporaryDirectory( - prefix="detectron2_tests" - ) as dir, EventStorage() as storage: - json_file = os.path.join(dir, "test.json") - writer = JSONWriter(json_file) - for k in range(60): - storage.put_scalar("key", k, smoothing_hint=False) - if (k + 1) % 20 == 0: - writer.write() - storage.step() - writer.close() - with open(json_file) as f: - data = [json.loads(l) for l in f] - self.assertTrue([int(k["key"]) for k in data] == [19, 39, 59]) - - def testScalarMismatchedPeriod(self): - with tempfile.TemporaryDirectory( - prefix="detectron2_tests" - ) as dir, EventStorage() as storage: - json_file = os.path.join(dir, "test.json") - - writer = JSONWriter(json_file) - for k in range(60): - if k % 17 == 0: # write in a differnt period - storage.put_scalar("key2", k, smoothing_hint=False) - storage.put_scalar("key", k, smoothing_hint=False) - if (k + 1) % 20 == 0: - writer.write() - storage.step() - writer.close() - with open(json_file) as f: - data = [json.loads(l) for l in f] - self.assertTrue([int(k.get("key2", 0)) for k in data] == [17, 0, 34, 0, 51, 0]) - self.assertTrue([int(k.get("key", 0)) for k in data] == [0, 19, 0, 39, 0, 59]) - self.assertTrue([int(k["iteration"]) for k in data] == [17, 19, 34, 39, 51, 59]) - - def testPrintETA(self): - with EventStorage() as s: - p1 = CommonMetricPrinter(10) - p2 = CommonMetricPrinter() - - s.put_scalar("time", 1.0) - s.step() - s.put_scalar("time", 1.0) - s.step() - - with self.assertLogs("detectron2.utils.events") as logs: - p1.write() - self.assertIn("eta", logs.output[0]) - - with self.assertLogs("detectron2.utils.events") as logs: - p2.write() - self.assertNotIn("eta", logs.output[0]) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py deleted file mode 100755 index 9a5e155f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_export_caffe2.py +++ /dev/null @@ -1,52 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -# -*- coding: utf-8 -*- - -import copy -import os -import tempfile -import unittest -import torch - -from detectron2 import model_zoo -from detectron2.export import Caffe2Model, Caffe2Tracer -from detectron2.utils.logger import setup_logger -from detectron2.utils.testing import get_sample_coco_image - - -# TODO: this test requires manifold access, see: T88318502 -# Running it on CircleCI causes crash, not sure why. 
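Looking back at the event-writer tests above, the storage/writer contract they verify is roughly the following sketch: scalars accumulate in an `EventStorage` (which must be entered as a context so the writer can find it), and each `write()` flushes one JSON line:

```
import os, tempfile
from detectron2.utils.events import EventStorage, JSONWriter

with tempfile.TemporaryDirectory() as d, EventStorage() as storage:
    writer = JSONWriter(os.path.join(d, "metrics.json"))
    for k in range(60):
        storage.put_scalar("key", k, smoothing_hint=False)
        if (k + 1) % 20 == 0:
            writer.write()   # emits one line like {"iteration": ..., "key": ...}
        storage.step()
    writer.close()
```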
-@unittest.skipIf(os.environ.get("CIRCLECI"), "Caffe2 tests crash on CircleCI.") -class TestCaffe2Export(unittest.TestCase): - def setUp(self): - setup_logger() - - def _test_model(self, config_path, device="cpu"): - cfg = model_zoo.get_config(config_path) - cfg.MODEL.DEVICE = device - model = model_zoo.get(config_path, trained=True, device=device) - - inputs = [{"image": get_sample_coco_image()}] - tracer = Caffe2Tracer(cfg, model, copy.deepcopy(inputs)) - - with tempfile.TemporaryDirectory(prefix="detectron2_unittest") as d: - if not os.environ.get("CI"): - # This requires onnx, which is not yet available on public CI - c2_model = tracer.export_caffe2() - c2_model.save_protobuf(d) - c2_model.save_graph(os.path.join(d, "test.svg"), inputs=copy.deepcopy(inputs)) - - c2_model = Caffe2Model.load_protobuf(d) - c2_model(inputs)[0]["instances"] - - ts_model = tracer.export_torchscript() - ts_model.save(os.path.join(d, "model.ts")) - - def testMaskRCNN(self): - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml") - - @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available") - def testMaskRCNNGPU(self): - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", device="cuda") - - def testRetinaNet(self): - self._test_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml") diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py deleted file mode 100755 index e9a0ff58..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_export_torchscript.py +++ /dev/null @@ -1,296 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - -import json -import os -import random -import tempfile -import unittest -import torch -from torch import Tensor, nn - -from detectron2 import model_zoo -from detectron2.config import get_cfg -from detectron2.config.instantiate import dump_dataclass, instantiate -from detectron2.export import dump_torchscript_IR, scripting_with_instances -from detectron2.export.flatten import TracingAdapter, flatten_to_tuple -from detectron2.export.torchscript_patch import patch_builtin_len -from detectron2.layers import ShapeSpec -from detectron2.modeling import build_backbone -from detectron2.modeling.postprocessing import detector_postprocess -from detectron2.modeling.roi_heads import KRCNNConvDeconvUpsampleHead -from detectron2.structures import Boxes, Instances -from detectron2.utils.env import TORCH_VERSION -from detectron2.utils.testing import ( - assert_instances_allclose, - convert_scripted_instances, - get_sample_coco_image, - random_boxes, -) - -""" -https://detectron2.readthedocs.io/tutorials/deployment.html -contains some explanations of this file. 
-""" - -SLOW_PUBLIC_CPU_TEST = unittest.skipIf( - os.environ.get("CI") and not torch.cuda.is_available(), - "The test is too slow on CPUs and will be executed on CircleCI's GPU jobs.", -) - - -class TestScripting(unittest.TestCase): - def testMaskRCNNFPN(self): - self._test_rcnn_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml") - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNC4(self): - self._test_rcnn_model("COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml") - - def testRetinaNet(self): - self._test_retinanet_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml") - - def _test_rcnn_model(self, config_path): - model = model_zoo.get(config_path, trained=True) - model.eval() - - fields = { - "proposal_boxes": Boxes, - "objectness_logits": Tensor, - "pred_boxes": Boxes, - "scores": Tensor, - "pred_classes": Tensor, - "pred_masks": Tensor, - } - script_model = scripting_with_instances(model, fields) - - # Test that batch inference with different shapes are supported - image = get_sample_coco_image() - small_image = nn.functional.interpolate(image, scale_factor=0.5) - inputs = [{"image": image}, {"image": small_image}] - with torch.no_grad(): - instance = model.inference(inputs, do_postprocess=False)[0] - scripted_instance = script_model.inference(inputs, do_postprocess=False)[0] - assert_instances_allclose(instance, scripted_instance) - - def _test_retinanet_model(self, config_path): - model = model_zoo.get(config_path, trained=True) - model.eval() - - fields = { - "pred_boxes": Boxes, - "scores": Tensor, - "pred_classes": Tensor, - } - script_model = scripting_with_instances(model, fields) - - img = get_sample_coco_image() - inputs = [{"image": img}] * 2 - with torch.no_grad(): - instance = model(inputs)[0]["instances"] - scripted_instance = convert_scripted_instances(script_model(inputs)[0]) - scripted_instance = detector_postprocess(scripted_instance, img.shape[1], img.shape[2]) - assert_instances_allclose(instance, scripted_instance) - # Note that the model currently cannot be saved and loaded into a new process: - # https://github.com/pytorch/pytorch/issues/46944 - - -# TODO: this test requires manifold access, see: T88318502 -class TestTracing(unittest.TestCase): - def testMaskRCNNFPN(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - def testMaskRCNNFPN_with_postproc(self): - def inference_func(model, image): - inputs = [{"image": image, "height": image.shape[1], "width": image.shape[2]}] - return model.inference(inputs, do_postprocess=True)[0]["instances"] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNC4(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("COCO-InstanceSegmentation/mask_rcnn_R_50_C4_3x.yaml", inference_func) - - @SLOW_PUBLIC_CPU_TEST - def testCascadeRCNN(self): - def inference_func(model, image): - inputs = [{"image": image}] - return model.inference(inputs, do_postprocess=False)[0] - - self._test_model("Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml", inference_func) - - # bug fixed by https://github.com/pytorch/pytorch/pull/67734 - @unittest.skipIf(TORCH_VERSION == (1, 10) and os.environ.get("CI"), "1.10 has bugs.") - def testRetinaNet(self): - def inference_func(model, image): - return 
model.forward([{"image": image}])[0]["instances"] - - self._test_model("COCO-Detection/retinanet_R_50_FPN_3x.yaml", inference_func) - - def _test_model(self, config_path, inference_func, batch=1): - model = model_zoo.get(config_path, trained=True) - image = get_sample_coco_image() - inputs = tuple(image.clone() for _ in range(batch)) - - wrapper = TracingAdapter(model, inputs, inference_func) - wrapper.eval() - with torch.no_grad(): - # trace with smaller images, and the trace must still work - trace_inputs = tuple( - nn.functional.interpolate(image, scale_factor=random.uniform(0.5, 0.7)) - for _ in range(batch) - ) - traced_model = torch.jit.trace(wrapper, trace_inputs) - - outputs = inference_func(model, *inputs) - traced_outputs = wrapper.outputs_schema(traced_model(*inputs)) - if batch > 1: - for output, traced_output in zip(outputs, traced_outputs): - assert_instances_allclose(output, traced_output, size_as_tensor=True) - else: - assert_instances_allclose(outputs, traced_outputs, size_as_tensor=True) - - @SLOW_PUBLIC_CPU_TEST - def testMaskRCNNFPN_batched(self): - def inference_func(model, image1, image2): - inputs = [{"image": image1}, {"image": image2}] - return model.inference(inputs, do_postprocess=False) - - self._test_model( - "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml", inference_func, batch=2 - ) - - def testKeypointHead(self): - class M(nn.Module): - def __init__(self): - super().__init__() - self.model = KRCNNConvDeconvUpsampleHead( - ShapeSpec(channels=4, height=14, width=14), num_keypoints=17, conv_dims=(4,) - ) - - def forward(self, x, predbox1, predbox2): - inst = [ - Instances((100, 100), pred_boxes=Boxes(predbox1)), - Instances((100, 100), pred_boxes=Boxes(predbox2)), - ] - ret = self.model(x, inst) - return tuple(x.pred_keypoints for x in ret) - - model = M() - model.eval() - - def gen_input(num1, num2): - feat = torch.randn((num1 + num2, 4, 14, 14)) - box1 = random_boxes(num1) - box2 = random_boxes(num2) - return feat, box1, box2 - - with torch.no_grad(), patch_builtin_len(): - trace = torch.jit.trace(model, gen_input(15, 15), check_trace=False) - - inputs = gen_input(12, 10) - trace_outputs = trace(*inputs) - true_outputs = model(*inputs) - for trace_output, true_output in zip(trace_outputs, true_outputs): - self.assertTrue(torch.allclose(trace_output, true_output)) - - -class TestTorchscriptUtils(unittest.TestCase): - # TODO: add test to dump scripting - def test_dump_IR_tracing(self): - cfg = get_cfg() - cfg.MODEL.RESNETS.DEPTH = 18 - cfg.MODEL.RESNETS.RES2_OUT_CHANNELS = 64 - - class Mod(nn.Module): - def forward(self, x): - return tuple(self.m(x).values()) - - model = Mod() - model.m = build_backbone(cfg) - model.eval() - - with torch.no_grad(): - ts_model = torch.jit.trace(model, (torch.rand(2, 3, 224, 224),)) - - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - dump_torchscript_IR(ts_model, d) - # check that the files are created - for name in ["model_ts_code", "model_ts_IR", "model_ts_IR_inlined", "model"]: - fname = os.path.join(d, name + ".txt") - self.assertTrue(os.stat(fname).st_size > 0, fname) - - def test_dump_IR_function(self): - @torch.jit.script - def gunc(x, y): - return x + y - - def func(x, y): - return x + y + gunc(x, y) - - ts_model = torch.jit.trace(func, (torch.rand(3), torch.rand(3))) - with tempfile.TemporaryDirectory(prefix="detectron2_test") as d: - dump_torchscript_IR(ts_model, d) - for name in ["model_ts_code", "model_ts_IR", "model_ts_IR_inlined"]: - fname = os.path.join(d, name + ".txt") - 
self.assertTrue(os.stat(fname).st_size > 0, fname) - - def test_flatten_basic(self): - obj = [3, ([5, 6], {"name": [7, 9], "name2": 3})] - res, schema = flatten_to_tuple(obj) - self.assertEqual(res, (3, 5, 6, 7, 9, 3)) - new_obj = schema(res) - self.assertEqual(new_obj, obj) - - _, new_schema = flatten_to_tuple(new_obj) - self.assertEqual(schema, new_schema) # test __eq__ - self._check_schema(schema) - - def _check_schema(self, schema): - dumped_schema = dump_dataclass(schema) - # Check that the schema is json-serializable - # Although in reality you might want to use yaml because it often has many levels - json.dumps(dumped_schema) - - # Check that the schema can be deserialized - new_schema = instantiate(dumped_schema) - self.assertEqual(schema, new_schema) - - def test_flatten_instances_boxes(self): - inst = Instances( - torch.tensor([5, 8]), pred_masks=torch.tensor([3]), pred_boxes=Boxes(torch.ones((1, 4))) - ) - obj = [3, ([5, 6], inst)] - res, schema = flatten_to_tuple(obj) - self.assertEqual(res[:3], (3, 5, 6)) - for r, expected in zip(res[3:], (inst.pred_boxes.tensor, inst.pred_masks, inst.image_size)): - self.assertIs(r, expected) - new_obj = schema(res) - assert_instances_allclose(new_obj[1][1], inst, rtol=0.0, size_as_tensor=True) - - self._check_schema(schema) - - def test_allow_non_tensor(self): - data = (torch.tensor([5, 8]), 3) # contains non-tensor - - class M(nn.Module): - def forward(self, input, number): - return input - - model = M() - with self.assertRaisesRegex(ValueError, "must only contain tensors"): - adap = TracingAdapter(model, data, allow_non_tensor=False) - - adap = TracingAdapter(model, data, allow_non_tensor=True) - _ = adap(*adap.flattened_inputs) - - newdata = (data[0].clone(),) - with self.assertRaisesRegex(ValueError, "cannot generalize"): - _ = adap(*newdata) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py deleted file mode 100755 index c01b7af0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_model_analysis.py +++ /dev/null @@ -1,80 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. - - -import unittest -import torch -from torch import nn - -from detectron2.utils.analysis import find_unused_parameters, flop_count_operators, parameter_count -from detectron2.utils.testing import get_model_no_weights - - -class RetinaNetTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-Detection/retinanet_R_50_FPN_1x.yaml") - - def test_flop(self): - # RetinaNet supports flop-counting with random inputs - inputs = [{"image": torch.rand(3, 800, 800), "test_unused": "abcd"}] - res = flop_count_operators(self.model, inputs) - self.assertEqual(int(res["conv"]), 146) # 146B flops - - def test_param_count(self): - res = parameter_count(self.model) - self.assertEqual(res[""], 37915572) - self.assertEqual(res["backbone"], 31452352) - - -class FasterRCNNTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-Detection/faster_rcnn_R_50_FPN_1x.yaml") - - def test_flop(self): - # Faster R-CNN supports flop-counting with random inputs - inputs = [{"image": torch.rand(3, 800, 800)}] - res = flop_count_operators(self.model, inputs) - - # This only checks flops for backbone & proposal generator - # Flops for box head is not conv, and depends on #proposals, which is - # almost 0 for random inputs. 
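-        # (Box-head flops come from fully-connected layers, which the counter
-        # reports as matmul/addmm rather than "conv", so res["conv"] below only
-        # reflects the backbone and the proposal generator.)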
- self.assertEqual(int(res["conv"]), 117) - - def test_flop_with_output_shape(self): - inputs = [{"image": torch.rand(3, 800, 800), "height": 700, "width": 700}] - res = flop_count_operators(self.model, inputs) - self.assertEqual(int(res["conv"]), 117) - - def test_param_count(self): - res = parameter_count(self.model) - self.assertEqual(res[""], 41699936) - self.assertEqual(res["backbone"], 26799296) - - -class MaskRCNNTest(unittest.TestCase): - def setUp(self): - self.model = get_model_no_weights("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml") - - def test_flop(self): - inputs1 = [{"image": torch.rand(3, 800, 800)}] - inputs2 = [{"image": torch.rand(3, 800, 800), "height": 700, "width": 700}] - - for inputs in [inputs1, inputs2]: - res = flop_count_operators(self.model, inputs) - # The mask head could have extra conv flops, so total >= 117 - self.assertGreaterEqual(int(res["conv"]), 117) - - -class UnusedParamTest(unittest.TestCase): - def test_unused(self): - class TestMod(nn.Module): - def __init__(self): - super().__init__() - self.fc1 = nn.Linear(10, 10) - self.t = nn.Linear(10, 10) - - def forward(self, x): - return self.fc1(x).mean() - - m = TestMod() - ret = find_unused_parameters(m, torch.randn(10, 10)) - self.assertEqual(set(ret), {"t.weight", "t.bias"}) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py deleted file mode 100755 index e3360a74..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_model_zoo.py +++ /dev/null @@ -1,50 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import logging -import unittest - -from detectron2 import model_zoo -from detectron2.config import instantiate -from detectron2.modeling import FPN, GeneralizedRCNN - -logger = logging.getLogger(__name__) - - -class TestModelZoo(unittest.TestCase): - def test_get_returns_model(self): - model = model_zoo.get("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml", trained=False) - self.assertIsInstance(model, GeneralizedRCNN) - self.assertIsInstance(model.backbone, FPN) - - def test_get_invalid_model(self): - self.assertRaises(RuntimeError, model_zoo.get, "Invalid/config.yaml") - - def test_get_url(self): - url = model_zoo.get_checkpoint_url("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.yaml") - self.assertEqual( - url, - "https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl", # noqa - ) - url2 = model_zoo.get_checkpoint_url("Misc/scratch_mask_rcnn_R_50_FPN_3x_gn.py") - self.assertEqual(url, url2) - - def _build_lazy_model(self, name): - cfg = model_zoo.get_config("common/models/" + name) - instantiate(cfg.model) - - def test_mask_rcnn_fpn(self): - self._build_lazy_model("mask_rcnn_fpn.py") - - def test_mask_rcnn_c4(self): - self._build_lazy_model("mask_rcnn_c4.py") - - def test_panoptic_fpn(self): - self._build_lazy_model("panoptic_fpn.py") - - def test_schedule(self): - cfg = model_zoo.get_config("common/coco_schedule.py") - for _, v in cfg.items(): - instantiate(v) - - -if __name__ == "__main__": - unittest.main() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py deleted file mode 100755 index a5b1661e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_packaging.py +++ /dev/null @@ -1,24 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
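-# Smoke tests: importing the bundled detectron2 projects and collecting
-# environment info should succeed without raising.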
-import unittest - -from detectron2.utils.collect_env import collect_env_info - - -class TestProjects(unittest.TestCase): - def test_import(self): - from detectron2.projects import point_rend - - _ = point_rend.add_pointrend_config - - import detectron2.projects.deeplab as deeplab - - _ = deeplab.add_deeplab_config - - # import detectron2.projects.panoptic_deeplab as panoptic_deeplab - - # _ = panoptic_deeplab.add_panoptic_deeplab_config - - -class TestCollectEnv(unittest.TestCase): - def test(self): - _ = collect_env_info() diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py deleted file mode 100755 index 4e425a6e..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_registry.py +++ /dev/null @@ -1,45 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. -import unittest -import torch - -from detectron2.modeling.meta_arch import GeneralizedRCNN -from detectron2.utils.registry import _convert_target_to_string, locate - - -class A: - class B: - pass - - -class TestLocate(unittest.TestCase): - def _test_obj(self, obj): - name = _convert_target_to_string(obj) - newobj = locate(name) - self.assertIs(obj, newobj) - - def test_basic(self): - self._test_obj(GeneralizedRCNN) - - def test_inside_class(self): - # requires using __qualname__ instead of __name__ - self._test_obj(A.B) - - def test_builtin(self): - self._test_obj(len) - self._test_obj(dict) - - def test_pytorch_optim(self): - # pydoc.locate does not work for it - self._test_obj(torch.optim.SGD) - - def test_failure(self): - with self.assertRaises(ImportError): - locate("asdf") - - def test_compress_target(self): - from detectron2.data.transforms import RandomCrop - - name = _convert_target_to_string(RandomCrop) - # name shouldn't contain 'augmentation_impl' - self.assertEqual(name, "detectron2.data.transforms.RandomCrop") - self.assertIs(RandomCrop, locate(name)) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py deleted file mode 100755 index 6cccb03f..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_scheduler.py +++ /dev/null @@ -1,68 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
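-# Sanity checks for detectron2's LR schedulers: a WarmupParamScheduler wrapped
-# in an LRMultiplier should reproduce the legacy warmup multi-step / cosine
-# schedules, as the inline comments below spell out.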
- -import math -import numpy as np -from unittest import TestCase -import torch -from fvcore.common.param_scheduler import CosineParamScheduler, MultiStepParamScheduler -from torch import nn - -from detectron2.solver import LRMultiplier, WarmupParamScheduler - - -class TestScheduler(TestCase): - def test_warmup_multistep(self): - p = nn.Parameter(torch.zeros(0)) - opt = torch.optim.SGD([p], lr=5) - - multiplier = WarmupParamScheduler( - MultiStepParamScheduler( - [1, 0.1, 0.01, 0.001], - milestones=[10, 15, 20], - num_updates=30, - ), - 0.001, - 5 / 30, - ) - sched = LRMultiplier(opt, multiplier, 30) - # This is an equivalent of: - # sched = WarmupMultiStepLR( - # opt, milestones=[10, 15, 20], gamma=0.1, warmup_factor=0.001, warmup_iters=5) - - p.sum().backward() - opt.step() - - lrs = [0.005] - for _ in range(30): - sched.step() - lrs.append(opt.param_groups[0]["lr"]) - self.assertTrue(np.allclose(lrs[:5], [0.005, 1.004, 2.003, 3.002, 4.001])) - self.assertTrue(np.allclose(lrs[5:10], 5.0)) - self.assertTrue(np.allclose(lrs[10:15], 0.5)) - self.assertTrue(np.allclose(lrs[15:20], 0.05)) - self.assertTrue(np.allclose(lrs[20:], 0.005)) - - def test_warmup_cosine(self): - p = nn.Parameter(torch.zeros(0)) - opt = torch.optim.SGD([p], lr=5) - multiplier = WarmupParamScheduler( - CosineParamScheduler(1, 0), - 0.001, - 5 / 30, - ) - sched = LRMultiplier(opt, multiplier, 30) - - p.sum().backward() - opt.step() - self.assertEqual(opt.param_groups[0]["lr"], 0.005) - lrs = [0.005] - - for _ in range(30): - sched.step() - lrs.append(opt.param_groups[0]["lr"]) - for idx, lr in enumerate(lrs): - expected_cosine = 2.5 * (1.0 + math.cos(math.pi * idx / 30)) - if idx >= 5: - self.assertAlmostEqual(lr, expected_cosine) - else: - self.assertNotAlmostEqual(lr, expected_cosine) diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py deleted file mode 100755 index 6b3ae84c..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tests/test_solver.py +++ /dev/null @@ -1,66 +0,0 @@ -import unittest - -from detectron2.solver.build import _expand_param_groups, reduce_param_groups - - -class TestOptimizer(unittest.TestCase): - def testExpandParamsGroups(self): - params = [ - { - "params": ["p1", "p2", "p3", "p4"], - "lr": 1.0, - "weight_decay": 3.0, - }, - { - "params": ["p2", "p3", "p5"], - "lr": 2.0, - "momentum": 2.0, - }, - { - "params": ["p1"], - "weight_decay": 4.0, - }, - ] - out = _expand_param_groups(params) - gt = [ - dict(params=["p1"], lr=1.0, weight_decay=4.0), # noqa - dict(params=["p2"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p3"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p4"], lr=1.0, weight_decay=3.0), # noqa - dict(params=["p5"], lr=2.0, momentum=2.0), # noqa - ] - self.assertEqual(out, gt) - - def testReduceParamGroups(self): - params = [ - dict(params=["p1"], lr=1.0, weight_decay=4.0), # noqa - dict(params=["p2", "p6"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p3"], lr=2.0, weight_decay=3.0, momentum=2.0), # noqa - dict(params=["p4"], lr=1.0, weight_decay=3.0), # noqa - dict(params=["p5"], lr=2.0, momentum=2.0), # noqa - ] - gt_groups = [ - { - "lr": 1.0, - "weight_decay": 4.0, - "params": ["p1"], - }, - { - "lr": 2.0, - "weight_decay": 3.0, - "momentum": 2.0, - "params": ["p2", "p6", "p3"], - }, - { - "lr": 1.0, - "weight_decay": 3.0, - "params": ["p4"], - }, - { - "lr": 2.0, - "momentum": 2.0, - "params": ["p5"], - }, - ] - out = 
reduce_param_groups(params)
-        self.assertEqual(out, gt_groups)
diff --git a/grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py b/grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py
deleted file mode 100755
index 1005000f..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tests/test_visualizer.py
+++ /dev/null
@@ -1,278 +0,0 @@
-# -*- coding: utf-8 -*-
-# Copyright (c) Facebook, Inc. and its affiliates.
-
-import numpy as np
-import os
-import tempfile
-import unittest
-import cv2
-import torch
-
-from detectron2.data import MetadataCatalog
-from detectron2.structures import BoxMode, Instances, RotatedBoxes
-from detectron2.utils.visualizer import ColorMode, Visualizer
-
-
-class TestVisualizer(unittest.TestCase):
-    def _random_data(self):
-        H, W = 100, 100
-        N = 10
-        img = np.random.rand(H, W, 3) * 255
-        boxxy = np.random.rand(N, 2) * (H // 2)
-        boxes = np.concatenate((boxxy, boxxy + H // 2), axis=1)
-
-        def _rand_poly():
-            return np.random.rand(3, 2).flatten() * H
-
-        polygons = [[_rand_poly() for _ in range(np.random.randint(1, 5))] for _ in range(N)]
-
-        mask = np.zeros_like(img[:, :, 0], dtype=bool)
-        mask[:40, 10:20] = 1
-
-        labels = [str(i) for i in range(N)]
-        return img, boxes, labels, polygons, [mask] * N
-
-    @property
-    def metadata(self):
-        return MetadataCatalog.get("coco_2017_train")
-
-    def test_draw_dataset_dict(self):
-        img = np.random.rand(512, 512, 3) * 255
-        dic = {
-            "annotations": [
-                {
-                    "bbox": [
-                        368.9946492271106,
-                        330.891438763377,
-                        13.148537455410235,
-                        13.644708680142685,
-                    ],
-                    "bbox_mode": BoxMode.XYWH_ABS,
-                    "category_id": 0,
-                    "iscrowd": 1,
-                    "segmentation": {
-                        "counts": "_jh52m?2N2N2N2O100O10O001N1O2MceP2",
-                        "size": [512, 512],
-                    },
-                }
-            ],
-            "height": 512,
-            "image_id": 1,
-            "width": 512,
-        }
-        v = Visualizer(img)
-        v.draw_dataset_dict(dic)
-
-        v = Visualizer(img, self.metadata)
-        v.draw_dataset_dict(dic)
-
-    def test_draw_rotated_dataset_dict(self):
-        img = np.random.rand(512, 512, 3) * 255
-        dic = {
-            "annotations": [
-                {
-                    "bbox": [
-                        368.9946492271106,
-                        330.891438763377,
-                        13.148537455410235,
-                        13.644708680142685,
-                        45.0,
-                    ],
-                    "bbox_mode": BoxMode.XYWHA_ABS,
-                    "category_id": 0,
-                    "iscrowd": 1,
-                }
-            ],
-            "height": 512,
-            "image_id": 1,
-            "width": 512,
-        }
-        v = Visualizer(img, self.metadata)
-        v.draw_dataset_dict(dic)
-
-    def test_overlay_instances(self):
-        img, boxes, labels, polygons, masks = self._random_data()
-
-        v = Visualizer(img, self.metadata)
-        output = v.overlay_instances(masks=polygons, boxes=boxes, labels=labels).get_image()
-        self.assertEqual(output.shape, img.shape)
-
-        # Test 2x scaling
-        v = Visualizer(img, self.metadata, scale=2.0)
-        output = v.overlay_instances(masks=polygons, boxes=boxes, labels=labels).get_image()
-        self.assertEqual(output.shape[0], img.shape[0] * 2)
-
-        # Test overlay masks
-        v = Visualizer(img, self.metadata)
-        output = v.overlay_instances(masks=masks, boxes=boxes, labels=labels).get_image()
-        self.assertEqual(output.shape, img.shape)
-
-    def test_overlay_instances_no_boxes(self):
-        img, boxes, labels, polygons, _ = self._random_data()
-        v = Visualizer(img, self.metadata)
-        v.overlay_instances(masks=polygons, boxes=None, labels=labels).get_image()
-
-    def test_draw_instance_predictions(self):
-        img, boxes, _, _, masks = self._random_data()
-        num_inst = len(boxes)
-        inst = Instances((img.shape[0], img.shape[1]))
-        inst.pred_classes = torch.randint(0, 80, size=(num_inst,))
-        inst.scores = torch.rand(num_inst)
-        inst.pred_boxes = torch.from_numpy(boxes)
- 
inst.pred_masks = torch.from_numpy(np.asarray(masks)) - - v = Visualizer(img) - v.draw_instance_predictions(inst) - - v = Visualizer(img, self.metadata) - v.draw_instance_predictions(inst) - - def test_BWmode_nomask(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - - v = Visualizer(img, self.metadata, instance_mode=ColorMode.IMAGE_BW) - v.draw_instance_predictions(inst) - - # check that output is grayscale - inst = inst[:0] - v = Visualizer(img, self.metadata, instance_mode=ColorMode.IMAGE_BW) - output = v.draw_instance_predictions(inst).get_image() - self.assertTrue(np.allclose(output[:, :, 0], output[:, :, 1])) - self.assertTrue(np.allclose(output[:, :, 0], output[:, :, 2])) - - def test_draw_empty_mask_predictions(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - inst.pred_masks = torch.from_numpy(np.zeros_like(np.asarray(masks))) - - v = Visualizer(img, self.metadata) - v.draw_instance_predictions(inst) - - def test_correct_output_shape(self): - img = np.random.rand(928, 928, 3) * 255 - v = Visualizer(img, self.metadata) - out = v.output.get_image() - self.assertEqual(out.shape, img.shape) - - def test_overlay_rotated_instances(self): - H, W = 100, 150 - img = np.random.rand(H, W, 3) * 255 - num_boxes = 50 - boxes_5d = torch.zeros(num_boxes, 5) - boxes_5d[:, 0] = torch.FloatTensor(num_boxes).uniform_(-0.1 * W, 1.1 * W) - boxes_5d[:, 1] = torch.FloatTensor(num_boxes).uniform_(-0.1 * H, 1.1 * H) - boxes_5d[:, 2] = torch.FloatTensor(num_boxes).uniform_(0, max(W, H)) - boxes_5d[:, 3] = torch.FloatTensor(num_boxes).uniform_(0, max(W, H)) - boxes_5d[:, 4] = torch.FloatTensor(num_boxes).uniform_(-1800, 1800) - rotated_boxes = RotatedBoxes(boxes_5d) - labels = [str(i) for i in range(num_boxes)] - - v = Visualizer(img, self.metadata) - output = v.overlay_instances(boxes=rotated_boxes, labels=labels).get_image() - self.assertEqual(output.shape, img.shape) - - def test_draw_no_metadata(self): - img, boxes, _, _, masks = self._random_data() - num_inst = len(boxes) - inst = Instances((img.shape[0], img.shape[1])) - inst.pred_classes = torch.randint(0, 80, size=(num_inst,)) - inst.scores = torch.rand(num_inst) - inst.pred_boxes = torch.from_numpy(boxes) - inst.pred_masks = torch.from_numpy(np.asarray(masks)) - - v = Visualizer(img, MetadataCatalog.get("asdfasdf")) - v.draw_instance_predictions(inst) - - def test_draw_binary_mask(self): - img, boxes, _, _, masks = self._random_data() - img[:, :, 0] = 0 # remove red color - mask = masks[0] - mask_with_hole = np.zeros_like(mask).astype("uint8") - mask_with_hole = cv2.rectangle(mask_with_hole, (10, 10), (50, 50), 1, 5) - - for m in [mask, mask_with_hole]: - for save in [True, False]: - v = Visualizer(img) - o = v.draw_binary_mask(m, color="red", text="test") - if save: - with tempfile.TemporaryDirectory(prefix="detectron2_viz") as d: - path = os.path.join(d, "output.png") - o.save(path) - o = cv2.imread(path)[:, :, ::-1] - else: - o = o.get_image().astype("float32") - # red color is drawn on the image - self.assertTrue(o[:, :, 0].sum() > 0) - - def test_draw_soft_mask(self): - img = np.random.rand(100, 100, 3) * 255 - 
img[:, :, 0] = 0  # remove red color
-        mask = np.zeros((100, 100), dtype=np.float32)
-        mask[30:50, 40:50] = 1.0
-        mask = cv2.GaussianBlur(mask, (21, 21), 10)
-
-        v = Visualizer(img)
-        o = v.draw_soft_mask(mask, color="red", text="test")
-        o = o.get_image().astype("float32")
-        # red color is drawn on the image
-        self.assertTrue(o[:, :, 0].sum() > 0)
-
-        # test draw empty mask
-        v = Visualizer(img)
-        o = v.draw_soft_mask(np.zeros((100, 100), dtype=np.float32), color="red", text="test")
-        o = o.get_image().astype("float32")
-
-    def test_border_mask_with_holes(self):
-        H, W = 200, 200
-        img = np.zeros((H, W, 3))
-        img[:, :, 0] = 255.0
-        v = Visualizer(img, scale=3)
-
-        mask = np.zeros((H, W))
-        mask[:, 100:150] = 1
-        # create a hole, to trigger imshow
-        mask = cv2.rectangle(mask, (110, 110), (130, 130), 0, thickness=-1)
-        output = v.draw_binary_mask(mask, color="blue")
-        output = output.get_image()[:, :, ::-1]
-
-        first_row = {tuple(x.tolist()) for x in output[0]}
-        last_row = {tuple(x.tolist()) for x in output[-1]}
-        # Check quantization / off-by-1 error: the first and last row must have two colors
-        self.assertEqual(len(last_row), 2)
-        self.assertEqual(len(first_row), 2)
-        self.assertIn((0, 0, 255), last_row)
-        self.assertIn((0, 0, 255), first_row)
-
-    def test_border_polygons(self):
-        H, W = 200, 200
-        img = np.zeros((H, W, 3))
-        img[:, :, 0] = 255.0
-        v = Visualizer(img, scale=3)
-        mask = np.zeros((H, W))
-        mask[:, 100:150] = 1
-
-        output = v.draw_binary_mask(mask, color="blue")
-        output = output.get_image()[:, :, ::-1]
-
-        first_row = {tuple(x.tolist()) for x in output[0]}
-        last_row = {tuple(x.tolist()) for x in output[-1]}
-        # Check quantization / off-by-1 error:
-        # the first and last row must have >=2 colors, because the polygon
-        # touches both rows
-        self.assertGreaterEqual(len(last_row), 2)
-        self.assertGreaterEqual(len(first_row), 2)
-        self.assertIn((0, 0, 255), last_row)
-        self.assertIn((0, 0, 255), first_row)
-
-
-if __name__ == "__main__":
-    unittest.main()
diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/README.md b/grit_src_deprecated/third_party/CenterNet2/tools/README.md
deleted file mode 100755
index 0b40d531..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tools/README.md
+++ /dev/null
@@ -1,49 +0,0 @@
-
-This directory contains a few example scripts that demonstrate features of detectron2.
-
-
-* `train_net.py`
-
-An example training script that's made to train builtin models of detectron2.
-
-For usage, see [GETTING_STARTED.md](../GETTING_STARTED.md).
-
-* `plain_train_net.py`
-
-Similar to `train_net.py`, but implements a training loop instead of using `Trainer`.
-This script includes fewer features but it may be more friendly to hackers.
-
-* `benchmark.py`
-
-Benchmark the training speed, inference speed or data loading speed of a given config.
-
-Usage:
-```
-python benchmark.py --config-file config.yaml --task train/eval/data [optional DDP flags]
-```
-
-* `analyze_model.py`
-
-Analyze FLOPs, parameters, activations of a detectron2 model. See its `--help` for usage.
-
-* `visualize_json_results.py`
-
-Visualize the json instance detection/segmentation results dumped by `COCOEvaluator` or `LVISEvaluator`.
-
-Usage:
-```
-python visualize_json_results.py --input x.json --output dir/ --dataset coco_2017_val
-```
-If not using a builtin dataset, you'll need your own script or modify this script.
-
-* `visualize_data.py`
-
-Visualize ground truth raw annotations or training data (after preprocessing/augmentations).
- -Usage: -``` -python visualize_data.py --config-file config.yaml --source annotation/dataloader --output-dir dir/ [--show] -``` - -NOTE: the script does not stop by itself when using `--source dataloader` because a training -dataloader is usually infinite. diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/__init__.py b/grit_src_deprecated/third_party/CenterNet2/tools/__init__.py deleted file mode 100755 index e69de29b..00000000 diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py b/grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py deleted file mode 100755 index 8e38f8b7..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/analyze_model.py +++ /dev/null @@ -1,159 +0,0 @@ -# -*- coding: utf-8 -*- -# Copyright (c) Facebook, Inc. and its affiliates. - -import logging -import numpy as np -from collections import Counter -import tqdm -from fvcore.nn import flop_count_table # can also try flop_count_str - -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import CfgNode, LazyConfig, get_cfg, instantiate -from detectron2.data import build_detection_test_loader -from detectron2.engine import default_argument_parser -from detectron2.modeling import build_model -from detectron2.utils.analysis import ( - FlopCountAnalysis, - activation_count_operators, - parameter_count_table, -) -from detectron2.utils.logger import setup_logger - -logger = logging.getLogger("detectron2") - - -def setup(args): - if args.config_file.endswith(".yaml"): - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.DATALOADER.NUM_WORKERS = 0 - cfg.merge_from_list(args.opts) - cfg.freeze() - else: - cfg = LazyConfig.load(args.config_file) - cfg = LazyConfig.apply_overrides(cfg, args.opts) - setup_logger(name="fvcore") - setup_logger() - return cfg - - -def do_flop(cfg): - if isinstance(cfg, CfgNode): - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - else: - data_loader = instantiate(cfg.dataloader.test) - model = instantiate(cfg.model) - model.to(cfg.train.device) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - model.eval() - - counts = Counter() - total_flops = [] - for idx, data in zip(tqdm.trange(args.num_inputs), data_loader): # noqa - flops = FlopCountAnalysis(model, data) - if idx > 0: - flops.unsupported_ops_warnings(False).uncalled_modules_warnings(False) - counts += flops.by_operator() - total_flops.append(flops.total()) - - logger.info("Flops table computed from only one input sample:\n" + flop_count_table(flops)) - logger.info( - "Average GFlops for each type of operators:\n" - + str([(k, v / (idx + 1) / 1e9) for k, v in counts.items()]) - ) - logger.info( - "Total GFlops: {:.1f}±{:.1f}".format(np.mean(total_flops) / 1e9, np.std(total_flops) / 1e9) - ) - - -def do_activation(cfg): - if isinstance(cfg, CfgNode): - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - else: - data_loader = instantiate(cfg.dataloader.test) - model = instantiate(cfg.model) - model.to(cfg.train.device) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - model.eval() - - counts = Counter() - total_activations = [] - for idx, data in zip(tqdm.trange(args.num_inputs), data_loader): # noqa - count = activation_count_operators(model, data) - counts += count - total_activations.append(sum(count.values())) - 
logger.info(
-        "Average (Million) Activations for Each Type of Operators:\n"
-        + str([(k, v / (idx + 1)) for k, v in counts.items()])
-    )
-    logger.info(
-        "Total (Million) Activations: {}±{}".format(
-            np.mean(total_activations), np.std(total_activations)
-        )
-    )
-
-
-def do_parameter(cfg):
-    if isinstance(cfg, CfgNode):
-        model = build_model(cfg)
-    else:
-        model = instantiate(cfg.model)
-    logger.info("Parameter Count:\n" + parameter_count_table(model, max_depth=5))
-
-
-def do_structure(cfg):
-    if isinstance(cfg, CfgNode):
-        model = build_model(cfg)
-    else:
-        model = instantiate(cfg.model)
-    logger.info("Model Structure:\n" + str(model))
-
-
-if __name__ == "__main__":
-    parser = default_argument_parser(
-        epilog="""
-Examples:
-
-To show parameters of a model:
-$ ./analyze_model.py --tasks parameter \\
-    --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml
-
-Flops and activations are data-dependent, therefore inputs and model weights
-are needed to count them:
-
-$ ./analyze_model.py --num-inputs 100 --tasks flop \\
-    --config-file ../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_1x.yaml \\
-    MODEL.WEIGHTS /path/to/model.pkl
-"""
-    )
-    parser.add_argument(
-        "--tasks",
-        choices=["flop", "activation", "parameter", "structure"],
-        required=True,
-        nargs="+",
-    )
-    parser.add_argument(
-        "-n",
-        "--num-inputs",
-        default=100,
-        type=int,
-        help="number of inputs used to compute statistics for flops/activations, "
-        "both are data dependent.",
-    )
-    args = parser.parse_args()
-    assert not args.eval_only
-    assert args.num_gpus == 1
-
-    cfg = setup(args)
-
-    for task in args.tasks:
-        {
-            "flop": do_flop,
-            "activation": do_activation,
-            "parameter": do_parameter,
-            "structure": do_structure,
-        }[task](cfg)
diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py b/grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py
deleted file mode 100755
index aaac5640..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tools/benchmark.py
+++ /dev/null
@@ -1,197 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) Facebook, Inc. and its affiliates.
-"""
-A script to benchmark builtin models.
-
-Note: this script has an extra dependency of psutil.
-"""
-
-import itertools
-import logging
-import psutil
-import torch
-import tqdm
-from fvcore.common.timer import Timer
-from torch.nn.parallel import DistributedDataParallel
-
-from detectron2.checkpoint import DetectionCheckpointer
-from detectron2.config import LazyConfig, get_cfg, instantiate
-from detectron2.data import (
-    DatasetFromList,
-    build_detection_test_loader,
-    build_detection_train_loader,
-)
-from detectron2.data.benchmark import DataLoaderBenchmark
-from detectron2.engine import AMPTrainer, SimpleTrainer, default_argument_parser, hooks, launch
-from detectron2.modeling import build_model
-from detectron2.solver import build_optimizer
-from detectron2.utils import comm
-from detectron2.utils.collect_env import collect_env_info
-from detectron2.utils.events import CommonMetricPrinter
-from detectron2.utils.logger import setup_logger
-
-logger = logging.getLogger("detectron2")
-
-
-def setup(args):
-    if args.config_file.endswith(".yaml"):
-        cfg = get_cfg()
-        cfg.merge_from_file(args.config_file)
-        cfg.SOLVER.BASE_LR = 0.001  # Avoid NaNs. Not useful in this script anyway.
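-        # (The LR value does not affect the timing being measured; it is only
-        # lowered so the loss stays finite on the batches replayed below.)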
- cfg.merge_from_list(args.opts) - cfg.freeze() - else: - cfg = LazyConfig.load(args.config_file) - cfg = LazyConfig.apply_overrides(cfg, args.opts) - setup_logger(distributed_rank=comm.get_rank()) - return cfg - - -def create_data_benchmark(cfg, args): - if args.config_file.endswith(".py"): - dl_cfg = cfg.dataloader.train - dl_cfg._target_ = DataLoaderBenchmark - return instantiate(dl_cfg) - else: - kwargs = build_detection_train_loader.from_config(cfg) - kwargs.pop("aspect_ratio_grouping", None) - kwargs["_target_"] = DataLoaderBenchmark - return instantiate(kwargs) - - -def RAM_msg(): - vram = psutil.virtual_memory() - return "RAM Usage: {:.2f}/{:.2f} GB".format( - (vram.total - vram.available) / 1024 ** 3, vram.total / 1024 ** 3 - ) - - -def benchmark_data(args): - cfg = setup(args) - logger.info("After spawning " + RAM_msg()) - - benchmark = create_data_benchmark(cfg, args) - benchmark.benchmark_distributed(250, 10) - # test for a few more rounds - for k in range(10): - logger.info(f"Iteration {k} " + RAM_msg()) - benchmark.benchmark_distributed(250, 1) - - -def benchmark_data_advanced(args): - # benchmark dataloader with more details to help analyze performance bottleneck - cfg = setup(args) - benchmark = create_data_benchmark(cfg, args) - - if comm.get_rank() == 0: - benchmark.benchmark_dataset(100) - benchmark.benchmark_mapper(100) - benchmark.benchmark_workers(100, warmup=10) - benchmark.benchmark_IPC(100, warmup=10) - if comm.get_world_size() > 1: - benchmark.benchmark_distributed(100) - logger.info("Rerun ...") - benchmark.benchmark_distributed(100) - - -def benchmark_train(args): - cfg = setup(args) - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if comm.get_world_size() > 1: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False - ) - optimizer = build_optimizer(cfg, model) - checkpointer = DetectionCheckpointer(model, optimizer=optimizer) - checkpointer.load(cfg.MODEL.WEIGHTS) - - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 2 - data_loader = build_detection_train_loader(cfg) - dummy_data = list(itertools.islice(data_loader, 100)) - - def f(): - data = DatasetFromList(dummy_data, copy=False, serialize=False) - while True: - yield from data - - max_iter = 400 - trainer = (AMPTrainer if cfg.SOLVER.AMP.ENABLED else SimpleTrainer)(model, f(), optimizer) - trainer.register_hooks( - [ - hooks.IterationTimer(), - hooks.PeriodicWriter([CommonMetricPrinter(max_iter)]), - hooks.TorchProfiler( - lambda trainer: trainer.iter == max_iter - 1, cfg.OUTPUT_DIR, save_tensorboard=True - ), - ] - ) - trainer.train(1, max_iter) - - -@torch.no_grad() -def benchmark_eval(args): - cfg = setup(args) - if args.config_file.endswith(".yaml"): - model = build_model(cfg) - DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS) - - cfg.defrost() - cfg.DATALOADER.NUM_WORKERS = 0 - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - else: - model = instantiate(cfg.model) - model.to(cfg.train.device) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - - cfg.dataloader.num_workers = 0 - data_loader = instantiate(cfg.dataloader.test) - - model.eval() - logger.info("Model:\n{}".format(model)) - dummy_data = DatasetFromList(list(itertools.islice(data_loader, 100)), copy=False) - - def f(): - while True: - yield from dummy_data - - for k in range(5): # warmup - model(dummy_data[k]) - - max_iter = 300 - timer = Timer() - with tqdm.tqdm(total=max_iter) as pbar: - for idx, d in enumerate(f()): - if idx == max_iter: 
- break - model(d) - pbar.update() - logger.info("{} iters in {} seconds.".format(max_iter, timer.seconds())) - - -if __name__ == "__main__": - parser = default_argument_parser() - parser.add_argument("--task", choices=["train", "eval", "data", "data_advanced"], required=True) - args = parser.parse_args() - assert not args.eval_only - - logger.info("Environment info:\n" + collect_env_info()) - if "data" in args.task: - print("Initial " + RAM_msg()) - if args.task == "data": - f = benchmark_data - if args.task == "data_advanced": - f = benchmark_data_advanced - elif args.task == "train": - """ - Note: training speed may not be representative. - The training cost of a R-CNN model varies with the content of the data - and the quality of the model. - """ - f = benchmark_train - elif args.task == "eval": - f = benchmark_eval - # only benchmark single-GPU inference. - assert args.num_gpus == 1 and args.num_machines == 1 - launch(f, args.num_gpus, args.num_machines, args.machine_rank, args.dist_url, args=(args,)) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py b/grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py deleted file mode 100755 index 4b827d96..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/convert-torchvision-to-d2.py +++ /dev/null @@ -1,56 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. - -import pickle as pkl -import sys -import torch - -""" -Usage: - # download one of the ResNet{18,34,50,101,152} models from torchvision: - wget https://download.pytorch.org/models/resnet50-19c8e357.pth -O r50.pth - # run the conversion - ./convert-torchvision-to-d2.py r50.pth r50.pkl - - # Then, use r50.pkl with the following changes in config: - -MODEL: - WEIGHTS: "/path/to/r50.pkl" - PIXEL_MEAN: [123.675, 116.280, 103.530] - PIXEL_STD: [58.395, 57.120, 57.375] - RESNETS: - DEPTH: 50 - STRIDE_IN_1X1: False -INPUT: - FORMAT: "RGB" - - These models typically produce slightly worse results than the - pre-trained ResNets we use in official configs, which are the - original ResNet models released by MSRA. -""" - -if __name__ == "__main__": - input = sys.argv[1] - - obj = torch.load(input, map_location="cpu") - - newmodel = {} - for k in list(obj.keys()): - old_k = k - if "layer" not in k: - k = "stem." + k - for t in [1, 2, 3, 4]: - k = k.replace("layer{}".format(t), "res{}".format(t + 1)) - for t in [1, 2, 3]: - k = k.replace("bn{}".format(t), "conv{}.norm".format(t)) - k = k.replace("downsample.0", "shortcut") - k = k.replace("downsample.1", "shortcut.norm") - print(old_k, "->", k) - newmodel[k] = obj.pop(old_k).detach().numpy() - - res = {"model": newmodel, "__author__": "torchvision", "matching_heuristics": True} - - with open(sys.argv[2], "wb") as f: - pkl.dump(res, f) - if obj: - print("Unconverted keys:", obj.keys()) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt deleted file mode 100755 index 80dae125..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/CMakeLists.txt +++ /dev/null @@ -1,15 +0,0 @@ -# Copyright (c) Facebook, Inc. and its affiliates. 
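-# Builds the torchscript_mask_rcnn example in this directory; TorchVision's
-# C++ library is only needed for models exported with tracing/scripting.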
-# See https://pytorch.org/tutorials/advanced/cpp_frontend.html -cmake_minimum_required(VERSION 3.12 FATAL_ERROR) -project(torchscript_mask_rcnn) - -find_package(Torch REQUIRED) -find_package(OpenCV REQUIRED) -find_package(TorchVision REQUIRED) # needed by export-method=tracing/scripting - -add_executable(torchscript_mask_rcnn torchscript_mask_rcnn.cpp) -target_link_libraries( - torchscript_mask_rcnn - -Wl,--no-as-needed TorchVision::TorchVision -Wl,--as-needed - "${TORCH_LIBRARIES}" ${OpenCV_LIBS}) -set_property(TARGET torchscript_mask_rcnn PROPERTY CXX_STANDARD 14) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md deleted file mode 100755 index e33cbeb5..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/README.md +++ /dev/null @@ -1,66 +0,0 @@ -See [deployment tutorial](https://detectron2.readthedocs.io/tutorials/deployment.html) -for some high-level background about deployment. - -This directory contains the following examples: - -1. An example script `export_model.py` - that exports a detectron2 model for deployment using different methods and formats. - -2. A C++ example that runs inference with Mask R-CNN model in TorchScript format. - -## Build -Deployment depends on libtorch and OpenCV. Some require more dependencies: - -* Running TorchScript-format models produced by `--export-method=caffe2_tracing` requires libtorch - to be built with caffe2 enabled. -* Running TorchScript-format models produced by `--export-method=tracing/scripting` requires libtorchvision (C++ library of torchvision). - -All methods are supported in one C++ file that requires all the above dependencies. -Adjust it and remove code you don't need. -As a reference, we provide a [Dockerfile](../../docker/deploy.Dockerfile) that installs all the above dependencies and builds the C++ example. - -## Use - -We show a few example commands to export and execute a Mask R-CNN model in C++. - -* `export-method=tracing, format=torchscript`: -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method tracing --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - MODEL.DEVICE cuda - -./build/torchscript_mask_rcnn output/model.ts input.jpg tracing -``` - -* `export-method=scripting, format=torchscript`: -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method scripting --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - -./build/torchscript_mask_rcnn output/model.ts input.jpg scripting -``` - -* `export-method=caffe2_tracing, format=torchscript`: - -``` -./export_model.py --config-file ../../configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \ - --output ./output --export-method caffe2_tracing --format torchscript \ - MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \ - -./build/torchscript_mask_rcnn output/model.ts input.jpg caffe2_tracing -``` - - -## Notes: - -1. Tracing/Caffe2-tracing requires valid weights & sample inputs. - Therefore the above commands require pre-trained models and [COCO dataset](https://detectron2.readthedocs.io/tutorials/builtin_datasets.html). 
- You can modify the script to obtain sample inputs in other ways instead of from COCO.
-
-2. `--run-eval` is implemented only for tracing mode
-   to evaluate the exported model using the dataset in the config.
-   It's recommended to always verify the accuracy in case the conversion is not successful.
-   Evaluation can be slow if the model is exported to CPU or the dataset is too large ("coco_2017_val_100" is a small subset of COCO useful for evaluation).
-   `caffe2_tracing` accuracy may differ slightly (within 0.1 AP) from the original model due to numerical precision differences between runtimes.
diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py
deleted file mode 100755
index bb1bcee6..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/export_model.py
+++ /dev/null
@@ -1,235 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) Facebook, Inc. and its affiliates.
-import argparse
-import os
-from typing import Dict, List, Tuple
-import torch
-from torch import Tensor, nn
-
-import detectron2.data.transforms as T
-from detectron2.checkpoint import DetectionCheckpointer
-from detectron2.config import get_cfg
-from detectron2.data import build_detection_test_loader, detection_utils
-from detectron2.evaluation import COCOEvaluator, inference_on_dataset, print_csv_format
-from detectron2.export import TracingAdapter, dump_torchscript_IR, scripting_with_instances
-from detectron2.modeling import GeneralizedRCNN, RetinaNet, build_model
-from detectron2.modeling.postprocessing import detector_postprocess
-from detectron2.projects.point_rend import add_pointrend_config
-from detectron2.structures import Boxes
-from detectron2.utils.env import TORCH_VERSION
-from detectron2.utils.file_io import PathManager
-from detectron2.utils.logger import setup_logger
-
-
-def setup_cfg(args):
-    cfg = get_cfg()
-    # cuda context is initialized before creating dataloader, so we don't fork anymore
-    cfg.DATALOADER.NUM_WORKERS = 0
-    add_pointrend_config(cfg)
-    cfg.merge_from_file(args.config_file)
-    cfg.merge_from_list(args.opts)
-    cfg.freeze()
-    return cfg
-
-
-def export_caffe2_tracing(cfg, torch_model, inputs):
-    from detectron2.export import Caffe2Tracer
-
-    tracer = Caffe2Tracer(cfg, torch_model, inputs)
-    if args.format == "caffe2":
-        caffe2_model = tracer.export_caffe2()
-        caffe2_model.save_protobuf(args.output)
-        # draw the caffe2 graph
-        caffe2_model.save_graph(os.path.join(args.output, "model.svg"), inputs=inputs)
-        return caffe2_model
-    elif args.format == "onnx":
-        import onnx
-
-        onnx_model = tracer.export_onnx()
-        onnx.save(onnx_model, os.path.join(args.output, "model.onnx"))
-    elif args.format == "torchscript":
-        ts_model = tracer.export_torchscript()
-        with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f:
-            torch.jit.save(ts_model, f)
-        dump_torchscript_IR(ts_model, args.output)
-
-
-# experimental. API not yet final
-def export_scripting(torch_model):
-    assert TORCH_VERSION >= (1, 8)
-    fields = {
-        "proposal_boxes": Boxes,
-        "objectness_logits": Tensor,
-        "pred_boxes": Boxes,
-        "scores": Tensor,
-        "pred_classes": Tensor,
-        "pred_masks": Tensor,
-        "pred_keypoints": torch.Tensor,
-        "pred_keypoint_heatmaps": torch.Tensor,
-    }
-    assert args.format == "torchscript", "Scripting only supports torchscript format."
-
-    class ScriptableAdapterBase(nn.Module):
-        # Use this adapter to workaround https://github.com/pytorch/pytorch/issues/46944
-        # by not returning instances but dicts.
Otherwise the exported model is not deployable - def __init__(self): - super().__init__() - self.model = torch_model - self.eval() - - if isinstance(torch_model, GeneralizedRCNN): - - class ScriptableAdapter(ScriptableAdapterBase): - def forward(self, inputs: Tuple[Dict[str, torch.Tensor]]) -> List[Dict[str, Tensor]]: - instances = self.model.inference(inputs, do_postprocess=False) - return [i.get_fields() for i in instances] - - else: - - class ScriptableAdapter(ScriptableAdapterBase): - def forward(self, inputs: Tuple[Dict[str, torch.Tensor]]) -> List[Dict[str, Tensor]]: - instances = self.model(inputs) - return [i.get_fields() for i in instances] - - ts_model = scripting_with_instances(ScriptableAdapter(), fields) - with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f: - torch.jit.save(ts_model, f) - dump_torchscript_IR(ts_model, args.output) - # TODO inference in Python now missing postprocessing glue code - return None - - -# experimental. API not yet final -def export_tracing(torch_model, inputs): - assert TORCH_VERSION >= (1, 8) - image = inputs[0]["image"] - inputs = [{"image": image}] # remove other unused keys - - if isinstance(torch_model, GeneralizedRCNN): - - def inference(model, inputs): - # use do_postprocess=False so it returns ROI mask - inst = model.inference(inputs, do_postprocess=False)[0] - return [{"instances": inst}] - - else: - inference = None # assume that we just call the model directly - - traceable_model = TracingAdapter(torch_model, inputs, inference) - - if args.format == "torchscript": - ts_model = torch.jit.trace(traceable_model, (image,)) - with PathManager.open(os.path.join(args.output, "model.ts"), "wb") as f: - torch.jit.save(ts_model, f) - dump_torchscript_IR(ts_model, args.output) - elif args.format == "onnx": - with PathManager.open(os.path.join(args.output, "model.onnx"), "wb") as f: - torch.onnx.export(traceable_model, (image,), f, opset_version=11) - logger.info("Inputs schema: " + str(traceable_model.inputs_schema)) - logger.info("Outputs schema: " + str(traceable_model.outputs_schema)) - - if args.format != "torchscript": - return None - if not isinstance(torch_model, (GeneralizedRCNN, RetinaNet)): - return None - - def eval_wrapper(inputs): - """ - The exported model does not contain the final resize step, which is typically - unused in deployment but needed for evaluation. We add it manually here. 
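-        detector_postprocess rescales the predicted boxes and masks from the
-        resized network input back to each image's original (height, width),
-        which the COCO evaluator expects.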
- """ - input = inputs[0] - instances = traceable_model.outputs_schema(ts_model(input["image"]))[0]["instances"] - postprocessed = detector_postprocess(instances, input["height"], input["width"]) - return [{"instances": postprocessed}] - - return eval_wrapper - - -def get_sample_inputs(args): - - if args.sample_image is None: - # get a first batch from dataset - data_loader = build_detection_test_loader(cfg, cfg.DATASETS.TEST[0]) - first_batch = next(iter(data_loader)) - return first_batch - else: - # get a sample data - original_image = detection_utils.read_image(args.sample_image, format=cfg.INPUT.FORMAT) - # Do same preprocessing as DefaultPredictor - aug = T.ResizeShortestEdge( - [cfg.INPUT.MIN_SIZE_TEST, cfg.INPUT.MIN_SIZE_TEST], cfg.INPUT.MAX_SIZE_TEST - ) - height, width = original_image.shape[:2] - image = aug.get_transform(original_image).apply_image(original_image) - image = torch.as_tensor(image.astype("float32").transpose(2, 0, 1)) - - inputs = {"image": image, "height": height, "width": width} - - # Sample ready - sample_inputs = [inputs] - return sample_inputs - - -if __name__ == "__main__": - parser = argparse.ArgumentParser(description="Export a model for deployment.") - parser.add_argument( - "--format", - choices=["caffe2", "onnx", "torchscript"], - help="output format", - default="torchscript", - ) - parser.add_argument( - "--export-method", - choices=["caffe2_tracing", "tracing", "scripting"], - help="Method to export models", - default="tracing", - ) - parser.add_argument("--config-file", default="", metavar="FILE", help="path to config file") - parser.add_argument("--sample-image", default=None, type=str, help="sample image for input") - parser.add_argument("--run-eval", action="store_true") - parser.add_argument("--output", help="output directory for the converted model") - parser.add_argument( - "opts", - help="Modify config options using the command-line", - default=None, - nargs=argparse.REMAINDER, - ) - args = parser.parse_args() - logger = setup_logger() - logger.info("Command line arguments: " + str(args)) - PathManager.mkdirs(args.output) - # Disable respecialization on new shapes. Otherwise --run-eval will be slow - torch._C._jit_set_bailout_depth(1) - - cfg = setup_cfg(args) - - # create a torch model - torch_model = build_model(cfg) - DetectionCheckpointer(torch_model).resume_or_load(cfg.MODEL.WEIGHTS) - torch_model.eval() - - # get sample data - sample_inputs = get_sample_inputs(args) - - # convert and save model - if args.export_method == "caffe2_tracing": - exported_model = export_caffe2_tracing(cfg, torch_model, sample_inputs) - elif args.export_method == "scripting": - exported_model = export_scripting(torch_model) - elif args.export_method == "tracing": - exported_model = export_tracing(torch_model, sample_inputs) - - # run evaluation with the converted model - if args.run_eval: - assert exported_model is not None, ( - "Python inference is not yet implemented for " - f"export_method={args.export_method}, format={args.format}." - ) - logger.info("Running evaluation ... this takes a long time if you export to CPU.") - dataset = cfg.DATASETS.TEST[0] - data_loader = build_detection_test_loader(cfg, dataset) - # NOTE: hard-coded evaluator. 
change to the evaluator for your dataset
-        evaluator = COCOEvaluator(dataset, output_dir=args.output)
-        metrics = inference_on_dataset(exported_model, data_loader, evaluator)
-        print_csv_format(metrics)
diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp b/grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
deleted file mode 100755
index b40f13b8..00000000
--- a/grit_src_deprecated/third_party/CenterNet2/tools/deploy/torchscript_mask_rcnn.cpp
+++ /dev/null
@@ -1,187 +0,0 @@
-// Copyright (c) Facebook, Inc. and its affiliates.
-// @lint-ignore-every CLANGTIDY
-// This is an example code that demonstrates how to run inference
-// with a torchscript format Mask R-CNN model exported by ./export_model.py
-// using export method=tracing, caffe2_tracing & scripting.
-
-#include <c10/cuda/CUDAStream.h>
-#include <torch/csrc/autograd/grad_mode.h>
-#include <torch/script.h>
-
-#include <opencv2/core/core.hpp>
-#include <opencv2/highgui/highgui.hpp>
-#include <opencv2/imgcodecs/imgcodecs.hpp>
-#include <opencv2/imgproc/imgproc.hpp>
-
-// only needed for export_method=tracing
-#include <torchvision/vision.h> // @oss-only
-// @fb-only: #include
-
-using namespace std;
-
-c10::IValue get_caffe2_tracing_inputs(cv::Mat& img, c10::Device device) {
-  const int height = img.rows;
-  const int width = img.cols;
-  // FPN models require divisibility of 32.
-  // Tracing mode does padding inside the graph, but caffe2_tracing does not.
-  assert(height % 32 == 0 && width % 32 == 0);
-  const int channels = 3;
-
-  auto input =
-      torch::from_blob(img.data, {1, height, width, channels}, torch::kUInt8);
-  // NHWC to NCHW
-  input = input.to(device, torch::kFloat).permute({0, 3, 1, 2}).contiguous();
-
-  std::array<float, 3> im_info_data{height * 1.0f, width * 1.0f, 1.0f};
-  auto im_info =
-      torch::from_blob(im_info_data.data(), {1, 3}).clone().to(device);
-  return std::make_tuple(input, im_info);
-}
-
-c10::IValue get_tracing_inputs(cv::Mat& img, c10::Device device) {
-  const int height = img.rows;
-  const int width = img.cols;
-  const int channels = 3;
-
-  auto input =
-      torch::from_blob(img.data, {height, width, channels}, torch::kUInt8);
-  // HWC to CHW
-  input = input.to(device, torch::kFloat).permute({2, 0, 1}).contiguous();
-  return input;
-}
-
-// create a Tuple[Dict[str, Tensor]] which is the input type of scripted model
-c10::IValue get_scripting_inputs(cv::Mat& img, c10::Device device) {
-  const int height = img.rows;
-  const int width = img.cols;
-  const int channels = 3;
-
-  auto img_tensor =
-      torch::from_blob(img.data, {height, width, channels}, torch::kUInt8);
-  // HWC to CHW
-  img_tensor =
-      img_tensor.to(device, torch::kFloat).permute({2, 0, 1}).contiguous();
-  auto dic = c10::Dict<std::string, torch::Tensor>();
-  dic.insert("image", img_tensor);
-  return std::make_tuple(dic);
-}
-
-c10::IValue
-get_inputs(std::string export_method, cv::Mat& img, c10::Device device) {
-  // Given an image, create inputs in the format required by the model.
-  if (export_method == "tracing")
-    return get_tracing_inputs(img, device);
-  if (export_method == "caffe2_tracing")
-    return get_caffe2_tracing_inputs(img, device);
-  if (export_method == "scripting")
-    return get_scripting_inputs(img, device);
-  abort();
-}
-
-struct MaskRCNNOutputs {
-  at::Tensor pred_boxes, pred_classes, pred_masks, scores;
-  int num_instances() const {
-    return pred_boxes.sizes()[0];
-  }
-};
-
-MaskRCNNOutputs get_outputs(std::string export_method, c10::IValue outputs) {
-  // Given outputs of the model, extract tensors from it to turn into a
-  // common MaskRCNNOutputs format.
-  if (export_method == "tracing") {
-    auto out_tuple = outputs.toTuple()->elements();
-    // They are ordered alphabetically by their field name in Instances
-    return MaskRCNNOutputs{
-        out_tuple[0].toTensor(),
-        out_tuple[1].toTensor(),
-        out_tuple[2].toTensor(),
-        out_tuple[3].toTensor()};
-  }
-  if (export_method == "caffe2_tracing") {
-    auto out_tuple = outputs.toTuple()->elements();
-    // A legacy order used by caffe2 models
-    return MaskRCNNOutputs{
-        out_tuple[0].toTensor(),
-        out_tuple[2].toTensor(),
-        out_tuple[3].toTensor(),
-        out_tuple[1].toTensor()};
-  }
-  if (export_method == "scripting") {
-    // With the ScriptableAdapter defined in export_model.py, the output is
-    // List[Dict[str, Any]].
-    auto out_dict = outputs.toList().get(0).toGenericDict();
-    return MaskRCNNOutputs{
-        out_dict.at("pred_boxes").toTensor(),
-        out_dict.at("pred_classes").toTensor(),
-        out_dict.at("pred_masks").toTensor(),
-        out_dict.at("scores").toTensor()};
-  }
-  abort();
-}
-
-int main(int argc, const char* argv[]) {
-  if (argc != 4) {
-    cerr << R"xx(
-Usage:
-   ./torchscript_mask_rcnn model.ts input.jpg EXPORT_METHOD
-
-   EXPORT_METHOD can be "tracing", "caffe2_tracing" or "scripting".
-)xx";
-    return 1;
-  }
-  std::string image_file = argv[2];
-  std::string export_method = argv[3];
-  assert(
-      export_method == "caffe2_tracing" || export_method == "tracing" ||
-      export_method == "scripting");
-
-  torch::jit::getBailoutDepth() = 1;
-  torch::autograd::AutoGradMode guard(false);
-  auto module = torch::jit::load(argv[1]);
-
-  assert(module.buffers().size() > 0);
-  // Assume that the entire model is on the same device.
-  // We just put input to this device.
-  auto device = (*begin(module.buffers())).device();
-
-  cv::Mat input_img = cv::imread(image_file, cv::IMREAD_COLOR);
-  auto inputs = get_inputs(export_method, input_img, device);
-
-  // Run the network
-  auto output = module.forward({inputs});
-  if (device.is_cuda())
-    c10::cuda::getCurrentCUDAStream().synchronize();
-
-  // run 3 more times to benchmark
-  int N_benchmark = 3, N_warmup = 1;
-  auto start_time = chrono::high_resolution_clock::now();
-  for (int i = 0; i < N_benchmark + N_warmup; ++i) {
-    if (i == N_warmup)
-      start_time = chrono::high_resolution_clock::now();
-    output = module.forward({inputs});
-    if (device.is_cuda())
-      c10::cuda::getCurrentCUDAStream().synchronize();
-  }
-  auto end_time = chrono::high_resolution_clock::now();
-  auto ms = chrono::duration_cast<chrono::microseconds>(end_time - start_time)
-                .count();
-  cout << "Latency (should vary with different inputs): "
-       << ms * 1.0 / 1e6 / N_benchmark << " seconds" << endl;
-
-  // Parse Mask R-CNN outputs
-  auto rcnn_outputs = get_outputs(export_method, output);
-  cout << "Number of detected objects: " << rcnn_outputs.num_instances()
-       << endl;
-
-  cout << "pred_boxes: " << rcnn_outputs.pred_boxes.toString() << " "
-       << rcnn_outputs.pred_boxes.sizes() << endl;
-  cout << "scores: " << rcnn_outputs.scores.toString() << " "
-       << rcnn_outputs.scores.sizes() << endl;
-  cout << "pred_classes: " << rcnn_outputs.pred_classes.toString() << " "
-       << rcnn_outputs.pred_classes.sizes() << endl;
-  cout << "pred_masks: " << rcnn_outputs.pred_masks.toString() << " "
-       << rcnn_outputs.pred_masks.sizes() << endl;
-
-  cout << rcnn_outputs.pred_boxes << endl;
-  return 0;
-}
diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py
deleted file mode 100755
index bb62d36c..00000000
--- 
a/grit_src_deprecated/third_party/CenterNet2/tools/lazyconfig_train_net.py
+++ /dev/null
@@ -1,131 +0,0 @@
-#!/usr/bin/env python
-# Copyright (c) Facebook, Inc. and its affiliates.
-"""
-Training script using the new "LazyConfig" python config files.
-
-This script reads a given python config file and runs the training or evaluation.
-It can be used to train any model or dataset as long as they can be
-instantiated by the recursive construction defined in the given config file.
-
-Besides lazy construction of models, dataloader, etc., this script expects a
-few common configuration parameters currently defined in "configs/common/train.py".
-To add more complicated training logic, you can easily add other configs
-in the config file and implement a new train_net.py to handle them.
-"""
-import logging
-
-from detectron2.checkpoint import DetectionCheckpointer
-from detectron2.config import LazyConfig, instantiate
-from detectron2.engine import (
-    AMPTrainer,
-    SimpleTrainer,
-    default_argument_parser,
-    default_setup,
-    default_writers,
-    hooks,
-    launch,
-)
-from detectron2.engine.defaults import create_ddp_model
-from detectron2.evaluation import inference_on_dataset, print_csv_format
-from detectron2.utils import comm
-
-logger = logging.getLogger("detectron2")
-
-
-def do_test(cfg, model):
-    if "evaluator" in cfg.dataloader:
-        ret = inference_on_dataset(
-            model, instantiate(cfg.dataloader.test), instantiate(cfg.dataloader.evaluator)
-        )
-        print_csv_format(ret)
-        return ret
-
-
-def do_train(args, cfg):
-    """
-    Args:
-        cfg: an object with the following attributes:
-            model: instantiate to a module
-            dataloader.{train,test}: instantiate to dataloaders
-            dataloader.evaluator: instantiate to evaluator for test set
-            optimizer: instantiate to an optimizer
-            lr_multiplier: instantiate to an fvcore scheduler
-            train: other misc config defined in `configs/common/train.py`, including:
-                output_dir (str)
-                init_checkpoint (str)
-                amp.enabled (bool)
-                max_iter (int)
-                eval_period, log_period (int)
-                device (str)
-                checkpointer (dict)
-                ddp (dict)
-    """
-    model = instantiate(cfg.model)
-    logger = logging.getLogger("detectron2")
-    logger.info("Model:\n{}".format(model))
-    model.to(cfg.train.device)
-
-    cfg.optimizer.params.model = model
-    optim = instantiate(cfg.optimizer)
-
-    train_loader = instantiate(cfg.dataloader.train)
-
-    model = create_ddp_model(model, **cfg.train.ddp)
-    trainer = (AMPTrainer if cfg.train.amp.enabled else SimpleTrainer)(model, train_loader, optim)
-    checkpointer = DetectionCheckpointer(
-        model,
-        cfg.train.output_dir,
-        trainer=trainer,
-    )
-    trainer.register_hooks(
-        [
-            hooks.IterationTimer(),
-            hooks.LRScheduler(scheduler=instantiate(cfg.lr_multiplier)),
-            hooks.PeriodicCheckpointer(checkpointer, **cfg.train.checkpointer)
-            if comm.is_main_process()
-            else None,
-            hooks.EvalHook(cfg.train.eval_period, lambda: do_test(cfg, model)),
-            hooks.PeriodicWriter(
-                default_writers(cfg.train.output_dir, cfg.train.max_iter),
-                period=cfg.train.log_period,
-            )
-            if comm.is_main_process()
-            else None,
-        ]
-    )
-
-    checkpointer.resume_or_load(cfg.train.init_checkpoint, resume=args.resume)
-    if args.resume and checkpointer.has_checkpoint():
-        # The checkpoint stores the training iteration that just finished, thus we start
-        # at the next iteration
-        start_iter = trainer.iter + 1
-    else:
-        start_iter = 0
-    trainer.train(start_iter, cfg.train.max_iter)
-
-
-def main(args):
-    cfg = LazyConfig.load(args.config_file)
-    cfg = LazyConfig.apply_overrides(cfg, args.opts)
-    default_setup(cfg, 
args) - - if args.eval_only: - model = instantiate(cfg.model) - model.to(cfg.train.device) - model = create_ddp_model(model) - DetectionCheckpointer(model).load(cfg.train.init_checkpoint) - print(do_test(cfg, model)) - else: - do_train(args, cfg) - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py deleted file mode 100755 index f6734b56..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/lightning_train_net.py +++ /dev/null @@ -1,239 +0,0 @@ -#!/usr/bin/env python3 -# Copyright (c) Facebook, Inc. and its affiliates. -# Lightning Trainer should be considered beta at this point -# We have confirmed that training and validation run correctly and produce correct results -# Depending on how you launch the trainer, there are issues with processes terminating correctly -# This module is still dependent on D2 logging, but could be transferred to use Lightning logging - -import logging -import os -import time -import weakref -from collections import OrderedDict -from typing import Any, Dict, List - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import get_cfg -from detectron2.data import build_detection_test_loader, build_detection_train_loader -from detectron2.engine import ( - DefaultTrainer, - SimpleTrainer, - default_argument_parser, - default_setup, - default_writers, - hooks, -) -from detectron2.evaluation import print_csv_format -from detectron2.evaluation.testing import flatten_results_dict -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils.events import EventStorage -from detectron2.utils.logger import setup_logger - -import pytorch_lightning as pl # type: ignore -from pytorch_lightning import LightningDataModule, LightningModule -from train_net import build_evaluator - -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger("detectron2") - - -class TrainingModule(LightningModule): - def __init__(self, cfg): - super().__init__() - if not logger.isEnabledFor(logging.INFO): # setup_logger is not called for d2 - setup_logger() - self.cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - self.storage: EventStorage = None - self.model = build_model(self.cfg) - - self.start_iter = 0 - self.max_iter = cfg.SOLVER.MAX_ITER - - def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None: - checkpoint["iteration"] = self.storage.iter - - def on_load_checkpoint(self, checkpointed_state: Dict[str, Any]) -> None: - self.start_iter = checkpointed_state["iteration"] - self.storage.iter = self.start_iter - - def setup(self, stage: str): - if self.cfg.MODEL.WEIGHTS: - self.checkpointer = DetectionCheckpointer( - # Assume you want to save checkpoints together with logs/statistics - self.model, - self.cfg.OUTPUT_DIR, - ) - logger.info(f"Load model weights from checkpoint: {self.cfg.MODEL.WEIGHTS}.") - # Only load weights, use lightning checkpointing if you want to resume - self.checkpointer.load(self.cfg.MODEL.WEIGHTS) - - self.iteration_timer = hooks.IterationTimer() - self.iteration_timer.before_train() - self.data_start = time.perf_counter() - self.writers = None - - def 
training_step(self, batch, batch_idx): - data_time = time.perf_counter() - self.data_start - # Need to manually enter/exit since trainer may launch processes - # This ideally belongs in setup, but setup seems to run before processes are spawned - if self.storage is None: - self.storage = EventStorage(0) - self.storage.__enter__() - self.iteration_timer.trainer = weakref.proxy(self) - self.iteration_timer.before_step() - self.writers = ( - default_writers(self.cfg.OUTPUT_DIR, self.max_iter) - if comm.is_main_process() - else {} - ) - - loss_dict = self.model(batch) - SimpleTrainer.write_metrics(loss_dict, data_time) - - opt = self.optimizers() - self.storage.put_scalar( - "lr", opt.param_groups[self._best_param_group_id]["lr"], smoothing_hint=False - ) - self.iteration_timer.after_step() - self.storage.step() - # A little odd to put before step here, but it's the best way to get a proper timing - self.iteration_timer.before_step() - - if self.storage.iter % 20 == 0: - for writer in self.writers: - writer.write() - return sum(loss_dict.values()) - - def training_step_end(self, training_step_outpus): - self.data_start = time.perf_counter() - return training_step_outpus - - def training_epoch_end(self, training_step_outputs): - self.iteration_timer.after_train() - if comm.is_main_process(): - self.checkpointer.save("model_final") - for writer in self.writers: - writer.write() - writer.close() - self.storage.__exit__(None, None, None) - - def _process_dataset_evaluation_results(self) -> OrderedDict: - results = OrderedDict() - for idx, dataset_name in enumerate(self.cfg.DATASETS.TEST): - results[dataset_name] = self._evaluators[idx].evaluate() - if comm.is_main_process(): - print_csv_format(results[dataset_name]) - - if len(results) == 1: - results = list(results.values())[0] - return results - - def _reset_dataset_evaluators(self): - self._evaluators = [] - for dataset_name in self.cfg.DATASETS.TEST: - evaluator = build_evaluator(self.cfg, dataset_name) - evaluator.reset() - self._evaluators.append(evaluator) - - def on_validation_epoch_start(self, _outputs): - self._reset_dataset_evaluators() - - def validation_epoch_end(self, _outputs): - results = self._process_dataset_evaluation_results(_outputs) - - flattened_results = flatten_results_dict(results) - for k, v in flattened_results.items(): - try: - v = float(v) - except Exception as e: - raise ValueError( - "[EvalHook] eval_function should return a nested dict of float. 
" - "Got '{}: {}' instead.".format(k, v) - ) from e - self.storage.put_scalars(**flattened_results, smoothing_hint=False) - - def validation_step(self, batch, batch_idx: int, dataloader_idx: int = 0) -> None: - if not isinstance(batch, List): - batch = [batch] - outputs = self.model(batch) - self._evaluators[dataloader_idx].process(batch, outputs) - - def configure_optimizers(self): - optimizer = build_optimizer(self.cfg, self.model) - self._best_param_group_id = hooks.LRScheduler.get_best_param_group_id(optimizer) - scheduler = build_lr_scheduler(self.cfg, optimizer) - return [optimizer], [{"scheduler": scheduler, "interval": "step"}] - - -class DataModule(LightningDataModule): - def __init__(self, cfg): - super().__init__() - self.cfg = DefaultTrainer.auto_scale_workers(cfg, comm.get_world_size()) - - def train_dataloader(self): - return build_detection_train_loader(self.cfg) - - def val_dataloader(self): - dataloaders = [] - for dataset_name in self.cfg.DATASETS.TEST: - dataloaders.append(build_detection_test_loader(self.cfg, dataset_name)) - return dataloaders - - -def main(args): - cfg = setup(args) - train(cfg, args) - - -def train(cfg, args): - trainer_params = { - # training loop is bounded by max steps, use a large max_epochs to make - # sure max_steps is met first - "max_epochs": 10 ** 8, - "max_steps": cfg.SOLVER.MAX_ITER, - "val_check_interval": cfg.TEST.EVAL_PERIOD if cfg.TEST.EVAL_PERIOD > 0 else 10 ** 8, - "num_nodes": args.num_machines, - "gpus": args.num_gpus, - "num_sanity_val_steps": 0, - } - if cfg.SOLVER.AMP.ENABLED: - trainer_params["precision"] = 16 - - last_checkpoint = os.path.join(cfg.OUTPUT_DIR, "last.ckpt") - if args.resume: - # resume training from checkpoint - trainer_params["resume_from_checkpoint"] = last_checkpoint - logger.info(f"Resuming training from checkpoint: {last_checkpoint}.") - - trainer = pl.Trainer(**trainer_params) - logger.info(f"start to train with {args.num_machines} nodes and {args.num_gpus} GPUs") - - module = TrainingModule(cfg) - data_module = DataModule(cfg) - if args.eval_only: - logger.info("Running inference") - trainer.validate(module, data_module) - else: - logger.info("Running training") - trainer.fit(module, data_module) - - -def setup(args): - """ - Create configs and perform basic setups. - """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -if __name__ == "__main__": - parser = default_argument_parser() - args = parser.parse_args() - logger.info("Command Line Args:", args) - main(args) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py deleted file mode 100755 index 4851a839..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/plain_train_net.py +++ /dev/null @@ -1,223 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -Detectron2 training script with a plain training loop. - -This script reads a given config file and runs the training or evaluation. -It is an entry point that is able to train standard models in detectron2. - -In order to let one script support training of many models, -this script contains logic that are specific to these built-in models and therefore -may not be suitable for your own project. -For example, your research project perhaps only needs a single "evaluator". 
- -Therefore, we recommend you to use detectron2 as a library and take -this file as an example of how to use the library. -You may want to write your own script with your datasets and other customizations. - -Compared to "train_net.py", this script supports fewer default features. -It also includes fewer abstraction, therefore is easier to add custom logic. -""" - -import logging -import os -from collections import OrderedDict -import torch -from torch.nn.parallel import DistributedDataParallel - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer, PeriodicCheckpointer -from detectron2.config import get_cfg -from detectron2.data import ( - MetadataCatalog, - build_detection_test_loader, - build_detection_train_loader, -) -from detectron2.engine import default_argument_parser, default_setup, default_writers, launch -from detectron2.evaluation import ( - CityscapesInstanceEvaluator, - CityscapesSemSegEvaluator, - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - LVISEvaluator, - PascalVOCDetectionEvaluator, - SemSegEvaluator, - inference_on_dataset, - print_csv_format, -) -from detectron2.modeling import build_model -from detectron2.solver import build_lr_scheduler, build_optimizer -from detectron2.utils.events import EventStorage - -logger = logging.getLogger("detectron2") - - -def get_evaluator(cfg, dataset_name, output_folder=None): - """ - Create evaluator(s) for a given dataset. - This uses the special metadata "evaluator_type" associated with each builtin dataset. - For your own dataset, you can simply create an evaluator manually in your - script and do not have to worry about the hacky if-else logic here. - """ - if output_folder is None: - output_folder = os.path.join(cfg.OUTPUT_DIR, "inference") - evaluator_list = [] - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - if evaluator_type in ["sem_seg", "coco_panoptic_seg"]: - evaluator_list.append( - SemSegEvaluator( - dataset_name, - distributed=True, - output_dir=output_folder, - ) - ) - if evaluator_type in ["coco", "coco_panoptic_seg"]: - evaluator_list.append(COCOEvaluator(dataset_name, output_dir=output_folder)) - if evaluator_type == "coco_panoptic_seg": - evaluator_list.append(COCOPanopticEvaluator(dataset_name, output_folder)) - if evaluator_type == "cityscapes_instance": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesInstanceEvaluator(dataset_name) - if evaluator_type == "cityscapes_sem_seg": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." 
- return CityscapesSemSegEvaluator(dataset_name) - if evaluator_type == "pascal_voc": - return PascalVOCDetectionEvaluator(dataset_name) - if evaluator_type == "lvis": - return LVISEvaluator(dataset_name, cfg, True, output_folder) - if len(evaluator_list) == 0: - raise NotImplementedError( - "no Evaluator for the dataset {} with the type {}".format(dataset_name, evaluator_type) - ) - if len(evaluator_list) == 1: - return evaluator_list[0] - return DatasetEvaluators(evaluator_list) - - -def do_test(cfg, model): - results = OrderedDict() - for dataset_name in cfg.DATASETS.TEST: - data_loader = build_detection_test_loader(cfg, dataset_name) - evaluator = get_evaluator( - cfg, dataset_name, os.path.join(cfg.OUTPUT_DIR, "inference", dataset_name) - ) - results_i = inference_on_dataset(model, data_loader, evaluator) - results[dataset_name] = results_i - if comm.is_main_process(): - logger.info("Evaluation results for {} in csv format:".format(dataset_name)) - print_csv_format(results_i) - if len(results) == 1: - results = list(results.values())[0] - return results - - -def do_train(cfg, model, resume=False): - model.train() - optimizer = build_optimizer(cfg, model) - scheduler = build_lr_scheduler(cfg, optimizer) - - checkpointer = DetectionCheckpointer( - model, cfg.OUTPUT_DIR, optimizer=optimizer, scheduler=scheduler - ) - start_iter = ( - checkpointer.resume_or_load(cfg.MODEL.WEIGHTS, resume=resume).get("iteration", -1) + 1 - ) - max_iter = cfg.SOLVER.MAX_ITER - - periodic_checkpointer = PeriodicCheckpointer( - checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD, max_iter=max_iter - ) - - writers = default_writers(cfg.OUTPUT_DIR, max_iter) if comm.is_main_process() else [] - - # compared to "train_net.py", we do not support accurate timing and - # precise BN here, because they are not trivial to implement in a small training loop - data_loader = build_detection_train_loader(cfg) - logger.info("Starting training from iteration {}".format(start_iter)) - with EventStorage(start_iter) as storage: - for data, iteration in zip(data_loader, range(start_iter, max_iter)): - storage.iter = iteration - - loss_dict = model(data) - losses = sum(loss_dict.values()) - assert torch.isfinite(losses).all(), loss_dict - - loss_dict_reduced = {k: v.item() for k, v in comm.reduce_dict(loss_dict).items()} - losses_reduced = sum(loss for loss in loss_dict_reduced.values()) - if comm.is_main_process(): - storage.put_scalars(total_loss=losses_reduced, **loss_dict_reduced) - - optimizer.zero_grad() - losses.backward() - optimizer.step() - storage.put_scalar("lr", optimizer.param_groups[0]["lr"], smoothing_hint=False) - scheduler.step() - - if ( - cfg.TEST.EVAL_PERIOD > 0 - and (iteration + 1) % cfg.TEST.EVAL_PERIOD == 0 - and iteration != max_iter - 1 - ): - do_test(cfg, model) - # Compared to "train_net.py", the test results are not dumped to EventStorage - comm.synchronize() - - if iteration - start_iter > 5 and ( - (iteration + 1) % 20 == 0 or iteration == max_iter - 1 - ): - for writer in writers: - writer.write() - periodic_checkpointer.step(iteration) - - -def setup(args): - """ - Create configs and perform basic setups. 
- """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup( - cfg, args - ) # if you don't like any of the default setup, write your own setup code - return cfg - - -def main(args): - cfg = setup(args) - - model = build_model(cfg) - logger.info("Model:\n{}".format(model)) - if args.eval_only: - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - return do_test(cfg, model) - - distributed = comm.get_world_size() > 1 - if distributed: - model = DistributedDataParallel( - model, device_ids=[comm.get_local_rank()], broadcast_buffers=False - ) - - do_train(cfg, model, resume=args.resume) - return do_test(cfg, model) - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/train_net.py b/grit_src_deprecated/third_party/CenterNet2/tools/train_net.py deleted file mode 100755 index 6ebf5f60..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/train_net.py +++ /dev/null @@ -1,170 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -""" -A main training script. - -This scripts reads a given config file and runs the training or evaluation. -It is an entry point that is made to train standard models in detectron2. - -In order to let one script support training of many models, -this script contains logic that are specific to these built-in models and therefore -may not be suitable for your own project. -For example, your research project perhaps only needs a single "evaluator". - -Therefore, we recommend you to use detectron2 as an library and take -this file as an example of how to use the library. -You may want to write your own script with your datasets and other customizations. -""" - -import logging -import os -from collections import OrderedDict -import torch - -import detectron2.utils.comm as comm -from detectron2.checkpoint import DetectionCheckpointer -from detectron2.config import get_cfg -from detectron2.data import MetadataCatalog -from detectron2.engine import DefaultTrainer, default_argument_parser, default_setup, hooks, launch -from detectron2.evaluation import ( - CityscapesInstanceEvaluator, - CityscapesSemSegEvaluator, - COCOEvaluator, - COCOPanopticEvaluator, - DatasetEvaluators, - LVISEvaluator, - PascalVOCDetectionEvaluator, - SemSegEvaluator, - verify_results, -) -from detectron2.modeling import GeneralizedRCNNWithTTA - - -def build_evaluator(cfg, dataset_name, output_folder=None): - """ - Create evaluator(s) for a given dataset. - This uses the special metadata "evaluator_type" associated with each builtin dataset. - For your own dataset, you can simply create an evaluator manually in your - script and do not have to worry about the hacky if-else logic here. 
- """ - if output_folder is None: - output_folder = os.path.join(cfg.OUTPUT_DIR, "inference") - evaluator_list = [] - evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type - if evaluator_type in ["sem_seg", "coco_panoptic_seg"]: - evaluator_list.append( - SemSegEvaluator( - dataset_name, - distributed=True, - output_dir=output_folder, - ) - ) - if evaluator_type in ["coco", "coco_panoptic_seg"]: - evaluator_list.append(COCOEvaluator(dataset_name, output_dir=output_folder)) - if evaluator_type == "coco_panoptic_seg": - evaluator_list.append(COCOPanopticEvaluator(dataset_name, output_folder)) - if evaluator_type == "cityscapes_instance": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesInstanceEvaluator(dataset_name) - if evaluator_type == "cityscapes_sem_seg": - assert ( - torch.cuda.device_count() > comm.get_rank() - ), "CityscapesEvaluator currently do not work with multiple machines." - return CityscapesSemSegEvaluator(dataset_name) - elif evaluator_type == "pascal_voc": - return PascalVOCDetectionEvaluator(dataset_name) - elif evaluator_type == "lvis": - return LVISEvaluator(dataset_name, output_dir=output_folder) - if len(evaluator_list) == 0: - raise NotImplementedError( - "no Evaluator for the dataset {} with the type {}".format(dataset_name, evaluator_type) - ) - elif len(evaluator_list) == 1: - return evaluator_list[0] - return DatasetEvaluators(evaluator_list) - - -class Trainer(DefaultTrainer): - """ - We use the "DefaultTrainer" which contains pre-defined default logic for - standard training workflow. They may not work for you, especially if you - are working on a new research project. In that case you can write your - own training loop. You can use "tools/plain_train_net.py" as an example. - """ - - @classmethod - def build_evaluator(cls, cfg, dataset_name, output_folder=None): - return build_evaluator(cfg, dataset_name, output_folder) - - @classmethod - def test_with_TTA(cls, cfg, model): - logger = logging.getLogger("detectron2.trainer") - # In the end of training, run an evaluation with TTA - # Only support some R-CNN models. - logger.info("Running inference with test-time augmentation ...") - model = GeneralizedRCNNWithTTA(cfg, model) - evaluators = [ - cls.build_evaluator( - cfg, name, output_folder=os.path.join(cfg.OUTPUT_DIR, "inference_TTA") - ) - for name in cfg.DATASETS.TEST - ] - res = cls.test(cfg, model, evaluators) - res = OrderedDict({k + "_TTA": v for k, v in res.items()}) - return res - - -def setup(args): - """ - Create configs and perform basic setups. - """ - cfg = get_cfg() - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.freeze() - default_setup(cfg, args) - return cfg - - -def main(args): - cfg = setup(args) - - if args.eval_only: - model = Trainer.build_model(cfg) - DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( - cfg.MODEL.WEIGHTS, resume=args.resume - ) - res = Trainer.test(cfg, model) - if cfg.TEST.AUG.ENABLED: - res.update(Trainer.test_with_TTA(cfg, model)) - if comm.is_main_process(): - verify_results(cfg, res) - return res - - """ - If you'd like to do anything fancier than the standard training logic, - consider writing your own training loop (see plain_train_net.py) or - subclassing the trainer. 
- """ - trainer = Trainer(cfg) - trainer.resume_or_load(resume=args.resume) - if cfg.TEST.AUG.ENABLED: - trainer.register_hooks( - [hooks.EvalHook(0, lambda: trainer.test_with_TTA(cfg, trainer.model))] - ) - return trainer.train() - - -if __name__ == "__main__": - args = default_argument_parser().parse_args() - print("Command Line Args:", args) - launch( - main, - args.num_gpus, - num_machines=args.num_machines, - machine_rank=args.machine_rank, - dist_url=args.dist_url, - args=(args,), - ) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py b/grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py deleted file mode 100755 index fd0ba834..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/visualize_data.py +++ /dev/null @@ -1,94 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. -import argparse -import os -from itertools import chain -import cv2 -import tqdm - -from detectron2.config import get_cfg -from detectron2.data import DatasetCatalog, MetadataCatalog, build_detection_train_loader -from detectron2.data import detection_utils as utils -from detectron2.data.build import filter_images_with_few_keypoints -from detectron2.utils.logger import setup_logger -from detectron2.utils.visualizer import Visualizer - - -def setup(args): - cfg = get_cfg() - if args.config_file: - cfg.merge_from_file(args.config_file) - cfg.merge_from_list(args.opts) - cfg.DATALOADER.NUM_WORKERS = 0 - cfg.freeze() - return cfg - - -def parse_args(in_args=None): - parser = argparse.ArgumentParser(description="Visualize ground-truth data") - parser.add_argument( - "--source", - choices=["annotation", "dataloader"], - required=True, - help="visualize the annotations or the data loader (with pre-processing)", - ) - parser.add_argument("--config-file", metavar="FILE", help="path to config file") - parser.add_argument("--output-dir", default="./", help="path to output directory") - parser.add_argument("--show", action="store_true", help="show output in a window") - parser.add_argument( - "opts", - help="Modify config options using the command-line", - default=None, - nargs=argparse.REMAINDER, - ) - return parser.parse_args(in_args) - - -if __name__ == "__main__": - args = parse_args() - logger = setup_logger() - logger.info("Arguments: " + str(args)) - cfg = setup(args) - - dirname = args.output_dir - os.makedirs(dirname, exist_ok=True) - metadata = MetadataCatalog.get(cfg.DATASETS.TRAIN[0]) - - def output(vis, fname): - if args.show: - print(fname) - cv2.imshow("window", vis.get_image()[:, :, ::-1]) - cv2.waitKey() - else: - filepath = os.path.join(dirname, fname) - print("Saving to {} ...".format(filepath)) - vis.save(filepath) - - scale = 1.0 - if args.source == "dataloader": - train_data_loader = build_detection_train_loader(cfg) - for batch in train_data_loader: - for per_image in batch: - # Pytorch tensor is in (C, H, W) format - img = per_image["image"].permute(1, 2, 0).cpu().detach().numpy() - img = utils.convert_image_to_rgb(img, cfg.INPUT.FORMAT) - - visualizer = Visualizer(img, metadata=metadata, scale=scale) - target_fields = per_image["instances"].get_fields() - labels = [metadata.thing_classes[i] for i in target_fields["gt_classes"]] - vis = visualizer.overlay_instances( - labels=labels, - boxes=target_fields.get("gt_boxes", None), - masks=target_fields.get("gt_masks", None), - keypoints=target_fields.get("gt_keypoints", None), - ) - output(vis, str(per_image["image_id"]) + ".jpg") - else: - dicts = 
list(chain.from_iterable([DatasetCatalog.get(k) for k in cfg.DATASETS.TRAIN])) - if cfg.MODEL.KEYPOINT_ON: - dicts = filter_images_with_few_keypoints(dicts, 1) - for dic in tqdm.tqdm(dicts): - img = utils.read_image(dic["file_name"], "RGB") - visualizer = Visualizer(img, metadata=metadata, scale=scale) - vis = visualizer.draw_dataset_dict(dic) - output(vis, os.path.basename(dic["file_name"])) diff --git a/grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py b/grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py deleted file mode 100755 index 472190e0..00000000 --- a/grit_src_deprecated/third_party/CenterNet2/tools/visualize_json_results.py +++ /dev/null @@ -1,90 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) Facebook, Inc. and its affiliates. - -import argparse -import json -import numpy as np -import os -from collections import defaultdict -import cv2 -import tqdm - -from detectron2.data import DatasetCatalog, MetadataCatalog -from detectron2.structures import Boxes, BoxMode, Instances -from detectron2.utils.file_io import PathManager -from detectron2.utils.logger import setup_logger -from detectron2.utils.visualizer import Visualizer - - -def create_instances(predictions, image_size): - ret = Instances(image_size) - - score = np.asarray([x["score"] for x in predictions]) - chosen = (score > args.conf_threshold).nonzero()[0] - score = score[chosen] - bbox = np.asarray([predictions[i]["bbox"] for i in chosen]).reshape(-1, 4) - bbox = BoxMode.convert(bbox, BoxMode.XYWH_ABS, BoxMode.XYXY_ABS) - - labels = np.asarray([dataset_id_map(predictions[i]["category_id"]) for i in chosen]) - - ret.scores = score - ret.pred_boxes = Boxes(bbox) - ret.pred_classes = labels - - try: - ret.pred_masks = [predictions[i]["segmentation"] for i in chosen] - except KeyError: - pass - return ret - - -if __name__ == "__main__": - parser = argparse.ArgumentParser( - description="A script that visualizes the json predictions from COCO or LVIS dataset." 
- ) - parser.add_argument("--input", required=True, help="JSON file produced by the model") - parser.add_argument("--output", required=True, help="output directory") - parser.add_argument("--dataset", help="name of the dataset", default="coco_2017_val") - parser.add_argument("--conf-threshold", default=0.5, type=float, help="confidence threshold") - args = parser.parse_args() - - logger = setup_logger() - - with PathManager.open(args.input, "r") as f: - predictions = json.load(f) - - pred_by_image = defaultdict(list) - for p in predictions: - pred_by_image[p["image_id"]].append(p) - - dicts = list(DatasetCatalog.get(args.dataset)) - metadata = MetadataCatalog.get(args.dataset) - if hasattr(metadata, "thing_dataset_id_to_contiguous_id"): - - def dataset_id_map(ds_id): - return metadata.thing_dataset_id_to_contiguous_id[ds_id] - - elif "lvis" in args.dataset: - # LVIS results are in the same format as COCO results, but have a different - # mapping from dataset category id to contiguous category id in [0, #categories - 1] - def dataset_id_map(ds_id): - return ds_id - 1 - - else: - raise ValueError("Unsupported dataset: {}".format(args.dataset)) - - os.makedirs(args.output, exist_ok=True) - - for dic in tqdm.tqdm(dicts): - img = cv2.imread(dic["file_name"], cv2.IMREAD_COLOR)[:, :, ::-1] - basename = os.path.basename(dic["file_name"]) - - predictions = create_instances(pred_by_image[dic["image_id"]], img.shape[:2]) - vis = Visualizer(img, metadata) - vis_pred = vis.draw_instance_predictions(predictions).get_image() - - vis = Visualizer(img, metadata) - vis_gt = vis.draw_dataset_dict(dic).get_image() - - concat = np.concatenate((vis_pred, vis_gt), axis=1) - cv2.imwrite(os.path.join(args.output, basename), concat[:, :, ::-1]) From 2627479925d8a9b6a63d03ca9faac605e50be53e Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 28 Dec 2023 15:01:57 +0700 Subject: [PATCH 065/248] [update]: add usage in readme --- README.md | 17 ++++++++++++++++- 1 file changed, 16 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index fa073816..19ed3163 100755 --- a/README.md +++ b/README.md @@ -38,7 +38,22 @@ We propose **VBench**, a comprehensive benchmark suite for video generative mode ## :hammer: Installation (pip install) ``` -python setup.py sdist && python -m pip install dist/vbench-0.1.0.tar.gz + python setup.py sdist && python -m pip install dist/vbench-0.1.0.tar.gz +``` +### Usage +##### command line +```bash + evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION +``` +##### python +```python + from vbench import VBench + my_VBench = VBench(device, , ) + my_VBench.evaluate( + videos_path = , + name = , + dimension_list = [, , ...], + ) ``` ## :gem: Pre-Trained Models From f954e69e46d175600fa0d3d0aab122e3be3e95ad Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 29 Dec 2023 01:28:04 +0700 Subject: [PATCH 066/248] [fix]: add centernet2 --- MANIFEST.in | 1 + dimension_to_folder.json | 22 +- .../grit_src/centernet2/.gitignore | 10 + .../grit_src/centernet2/__init__.py | 1 + .../grit_src/centernet2/centernet/__init__.py | 10 + .../grit_src/centernet2/centernet/config.py | 87 ++ .../centernet2/centernet/modeling/__init__.py | 0 .../centernet/modeling/backbone/__init__.py | 0 .../centernet/modeling/backbone/bifpn.py | 425 +++++++++ .../centernet/modeling/backbone/bifpn_fcos.py | 469 ++++++++++ .../centernet/modeling/backbone/dla.py | 479 ++++++++++ .../centernet/modeling/backbone/dlafpn.py | 
493 ++++++++++ .../centernet/modeling/backbone/fpn_p5.py} | 4 +- .../centernet/modeling/backbone/res2net.py | 802 ++++++++++++++++ .../centernet2/centernet/modeling/debug.py | 283 ++++++ .../modeling/dense_heads/__init__.py | 0 .../modeling/dense_heads/centernet.py | 864 ++++++++++++++++++ .../modeling/dense_heads/centernet_head.py | 162 ++++ .../centernet/modeling/dense_heads/utils.py | 38 + .../centernet/modeling/layers/__init__.py | 0 .../centernet/modeling/layers/deform_conv.py | 116 +++ .../modeling/layers/heatmap_focal_loss.py | 92 ++ .../centernet/modeling/layers/iou_loss.py | 121 +++ .../centernet/modeling/layers/ml_nms.py | 31 + .../centernet/modeling/meta_arch/__init__.py | 0 .../modeling/meta_arch/centernet_detector.py | 69 ++ .../centernet/modeling/roi_heads/__init__.py | 0 .../modeling/roi_heads/custom_fast_rcnn.py | 124 +++ .../modeling/roi_heads/custom_roi_heads.py | 185 ++++ .../centernet/modeling/roi_heads/fed_loss.py | 31 + .../centernet2/centernet2_docs/MODEL_ZOO.md | 73 ++ .../configs/Base-CenterNet-FPN.yaml | 28 + .../centernet2/configs/Base-CenterNet2.yaml | 56 ++ .../centernet2/configs/Base_S4_DLA.yaml | 40 + .../configs/CenterNet-FPN_R50_1x.yaml | 4 + .../configs/CenterNet-S4_DLA_8x.yaml | 5 + .../configs/CenterNet2-F_R50_1x.yaml | 4 + .../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml | 36 + .../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml | 36 + .../CenterNet2_DLA-BiFPN-P5_640_16x.yaml | 29 + .../CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml | 30 + ...enterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml | 30 + .../CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml | 32 + ...erNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml | 36 + .../configs/CenterNet2_R2-101-DCN_896_4x.yaml | 29 + .../centernet2/configs/CenterNet2_R50_1x.yaml | 1 + .../configs/CenterNet2_X101-DCN_2x.yaml | 22 + .../configs/LVIS_CenterNet2_R50_1x.yaml | 17 + .../configs/LVIS_CenterNet2_R50_Fed_1x.yaml | 19 + .../configs/O365_CenterNet2_R50_1x.yaml | 13 + .../nuImages_CenterNet2_DLA_640_8x.yaml | 42 + .../grit_src/centernet2/predictor.py | 243 +++++ .../grit_src/centernet2/train_net.py | 228 +++++ .../grit_src/grit/modeling/backbone/vit.py | 5 +- .../grit_src/image_dense_captions.py | 2 +- 55 files changed, 5963 insertions(+), 16 deletions(-) create mode 100755 vbench/third_party/grit_src/centernet2/.gitignore create mode 100644 vbench/third_party/grit_src/centernet2/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/config.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/__init__.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py rename vbench/third_party/grit_src/{grit/modeling/backbone/centernet_last_layer.py => centernet2/centernet/modeling/backbone/fpn_p5.py} (97%) mode change 100644 => 100755 create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/debug.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/__init__.py 
create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet_head.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/utils.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/layers/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/layers/deform_conv.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/layers/heatmap_focal_loss.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/layers/iou_loss.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/layers/ml_nms.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py create mode 100644 vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/__init__.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_fast_rcnn.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/fed_loss.py create mode 100755 vbench/third_party/grit_src/centernet2/centernet2_docs/MODEL_ZOO.md create mode 100755 vbench/third_party/grit_src/centernet2/configs/Base-CenterNet-FPN.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/Base-CenterNet2.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/Base_S4_DLA.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet-FPN_R50_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet-S4_DLA_8x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2-F_R50_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_R50_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/CenterNet2_X101-DCN_2x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/O365_CenterNet2_R50_1x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml create mode 100755 vbench/third_party/grit_src/centernet2/predictor.py create mode 100755 
vbench/third_party/grit_src/centernet2/train_net.py diff --git a/MANIFEST.in b/MANIFEST.in index 7914e2c7..d5219a9f 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -1,3 +1,4 @@ include version.txt include requirements.txt recursive-include vbench/third_party *.yaml +recursive-include vbench/third_party *.json diff --git a/dimension_to_folder.json b/dimension_to_folder.json index 085532cd..4504b68e 100644 --- a/dimension_to_folder.json +++ b/dimension_to_folder.json @@ -1,18 +1,18 @@ { "subject_consistency": "subject_consistency", - "background_consistency": "scene", + "background_consistency": "scene", "aesthetic_quality": "overall_consistency", "imaging_quality": "overall_consistency", - "object_class": "object_class", - "multiple_objects": "multiple_objects", - "color": "color", - "spatial_relationship": "spatial_relationship", - "scene": "scene", - "temporal_style": "temporal_style", + "object_class": "object_class", + "multiple_objects": "multiple_objects", + "color": "color", + "spatial_relationship": "spatial_relationship", + "scene": "scene", + "temporal_style": "temporal_style", "overall_consistency": "overall_consistency", - "human_action": "human_action", - "temporal_flickering": "temporal_flickering", + "human_action": "human_action", + "temporal_flickering": "temporal_flickering", "motion_smoothness": "subject_consistency", - "dynamic_degree": "subject_consistency", - "appearance_style": "appearance_style" + "dynamic_degree": "subject_consistency", + "appearance_style": "appearance_style" } diff --git a/vbench/third_party/grit_src/centernet2/.gitignore b/vbench/third_party/grit_src/centernet2/.gitignore new file mode 100755 index 00000000..51c17688 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/.gitignore @@ -0,0 +1,10 @@ +# compilation and distribution +__pycache__ +_ext +*.pyc +*.pyd +*.so +centernet.egg-info/ +build/ +dist/ +wheels/ diff --git a/vbench/third_party/grit_src/centernet2/__init__.py b/vbench/third_party/grit_src/centernet2/__init__.py new file mode 100644 index 00000000..a618c479 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/__init__.py @@ -0,0 +1 @@ +from .centernet.config import add_centernet_config diff --git a/vbench/third_party/grit_src/centernet2/centernet/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/__init__.py new file mode 100755 index 00000000..83df7d5b --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/__init__.py @@ -0,0 +1,10 @@ +from .modeling.meta_arch.centernet_detector import CenterNetDetector +from .modeling.dense_heads.centernet import CenterNet +from .modeling.roi_heads.custom_roi_heads import CustomROIHeads, CustomCascadeROIHeads + +from .modeling.backbone.fpn_p5 import build_p67_resnet_fpn_backbone +from .modeling.backbone.dla import build_dla_backbone +from .modeling.backbone.dlafpn import build_dla_fpn3_backbone +from .modeling.backbone.bifpn import build_resnet_bifpn_backbone +from .modeling.backbone.bifpn_fcos import build_fcos_resnet_bifpn_backbone +from .modeling.backbone.res2net import build_p67_res2net_fpn_backbone diff --git a/vbench/third_party/grit_src/centernet2/centernet/config.py b/vbench/third_party/grit_src/centernet2/centernet/config.py new file mode 100755 index 00000000..36d0d250 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/config.py @@ -0,0 +1,87 @@ +from detectron2.config import CfgNode as CN + +def add_centernet_config(cfg): + _C = cfg + + _C.MODEL.CENTERNET = CN() + _C.MODEL.CENTERNET.NUM_CLASSES = 80 + _C.MODEL.CENTERNET.IN_FEATURES = 
["p3", "p4", "p5", "p6", "p7"] + _C.MODEL.CENTERNET.FPN_STRIDES = [8, 16, 32, 64, 128] + _C.MODEL.CENTERNET.PRIOR_PROB = 0.01 + _C.MODEL.CENTERNET.INFERENCE_TH = 0.05 + _C.MODEL.CENTERNET.CENTER_NMS = False + _C.MODEL.CENTERNET.NMS_TH_TRAIN = 0.6 + _C.MODEL.CENTERNET.NMS_TH_TEST = 0.6 + _C.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN = 1000 + _C.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN = 100 + _C.MODEL.CENTERNET.PRE_NMS_TOPK_TEST = 1000 + _C.MODEL.CENTERNET.POST_NMS_TOPK_TEST = 100 + _C.MODEL.CENTERNET.NORM = "GN" + _C.MODEL.CENTERNET.USE_DEFORMABLE = False + _C.MODEL.CENTERNET.NUM_CLS_CONVS = 4 + _C.MODEL.CENTERNET.NUM_BOX_CONVS = 4 + _C.MODEL.CENTERNET.NUM_SHARE_CONVS = 0 + _C.MODEL.CENTERNET.LOC_LOSS_TYPE = 'giou' + _C.MODEL.CENTERNET.SIGMOID_CLAMP = 1e-4 + _C.MODEL.CENTERNET.HM_MIN_OVERLAP = 0.8 + _C.MODEL.CENTERNET.MIN_RADIUS = 4 + _C.MODEL.CENTERNET.SOI = [[0, 80], [64, 160], [128, 320], [256, 640], [512, 10000000]] + _C.MODEL.CENTERNET.POS_WEIGHT = 1. + _C.MODEL.CENTERNET.NEG_WEIGHT = 1. + _C.MODEL.CENTERNET.REG_WEIGHT = 2. + _C.MODEL.CENTERNET.HM_FOCAL_BETA = 4 + _C.MODEL.CENTERNET.HM_FOCAL_ALPHA = 0.25 + _C.MODEL.CENTERNET.LOSS_GAMMA = 2.0 + _C.MODEL.CENTERNET.WITH_AGN_HM = False + _C.MODEL.CENTERNET.ONLY_PROPOSAL = False + _C.MODEL.CENTERNET.AS_PROPOSAL = False + _C.MODEL.CENTERNET.IGNORE_HIGH_FP = -1. + _C.MODEL.CENTERNET.MORE_POS = False + _C.MODEL.CENTERNET.MORE_POS_THRESH = 0.2 + _C.MODEL.CENTERNET.MORE_POS_TOPK = 9 + _C.MODEL.CENTERNET.NOT_NORM_REG = True + _C.MODEL.CENTERNET.NOT_NMS = False + _C.MODEL.CENTERNET.NO_REDUCE = False + + _C.MODEL.ROI_BOX_HEAD.USE_SIGMOID_CE = False + _C.MODEL.ROI_BOX_HEAD.PRIOR_PROB = 0.01 + _C.MODEL.ROI_BOX_HEAD.USE_EQL_LOSS = False + _C.MODEL.ROI_BOX_HEAD.CAT_FREQ_PATH = \ + 'datasets/lvis/lvis_v1_train_cat_info.json' + _C.MODEL.ROI_BOX_HEAD.EQL_FREQ_CAT = 200 + _C.MODEL.ROI_BOX_HEAD.USE_FED_LOSS = False + _C.MODEL.ROI_BOX_HEAD.FED_LOSS_NUM_CAT = 50 + _C.MODEL.ROI_BOX_HEAD.FED_LOSS_FREQ_WEIGHT = 0.5 + _C.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE = False + + _C.MODEL.BIFPN = CN() + _C.MODEL.BIFPN.NUM_LEVELS = 5 + _C.MODEL.BIFPN.NUM_BIFPN = 6 + _C.MODEL.BIFPN.NORM = 'GN' + _C.MODEL.BIFPN.OUT_CHANNELS = 160 + _C.MODEL.BIFPN.SEPARABLE_CONV = False + + _C.MODEL.DLA = CN() + _C.MODEL.DLA.OUT_FEATURES = ['dla2'] + _C.MODEL.DLA.USE_DLA_UP = True + _C.MODEL.DLA.NUM_LAYERS = 34 + _C.MODEL.DLA.MS_OUTPUT = False + _C.MODEL.DLA.NORM = 'BN' + _C.MODEL.DLA.DLAUP_IN_FEATURES = ['dla3', 'dla4', 'dla5'] + _C.MODEL.DLA.DLAUP_NODE = 'conv' + + _C.SOLVER.RESET_ITER = False + _C.SOLVER.TRAIN_ITER = -1 + + _C.INPUT.CUSTOM_AUG = '' + _C.INPUT.TRAIN_SIZE = 640 + _C.INPUT.TEST_SIZE = 640 + _C.INPUT.SCALE_RANGE = (0.1, 2.) 
+ # 'default' for fixed short/ long edge, 'square' for max size=INPUT.SIZE + _C.INPUT.TEST_INPUT_TYPE = 'default' + + _C.DEBUG = False + _C.SAVE_DEBUG = False + _C.SAVE_PTH = False + _C.VIS_THRESH = 0.3 + _C.DEBUG_SHOW_NAME = False diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py new file mode 100755 index 00000000..a1797129 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py @@ -0,0 +1,425 @@ +# Modified from https://github.com/rwightman/efficientdet-pytorch/blob/master/effdet/efficientdet.py +# The original file is under Apache-2.0 License +import math +from os.path import join +import numpy as np +from collections import OrderedDict +from typing import List + +import torch +from torch import nn +import torch.utils.model_zoo as model_zoo +import torch.nn.functional as F +import fvcore.nn.weight_init as weight_init + +from detectron2.layers import ShapeSpec, Conv2d +from detectron2.modeling.backbone.resnet import build_resnet_backbone +from detectron2.modeling.backbone.build import BACKBONE_REGISTRY +from detectron2.layers.batch_norm import get_norm +from detectron2.modeling.backbone import Backbone +from .dlafpn import dla34 + +def get_fpn_config(base_reduction=8): + """BiFPN config with sum.""" + p = { + 'nodes': [ + {'reduction': base_reduction << 3, 'inputs_offsets': [3, 4]}, + {'reduction': base_reduction << 2, 'inputs_offsets': [2, 5]}, + {'reduction': base_reduction << 1, 'inputs_offsets': [1, 6]}, + {'reduction': base_reduction, 'inputs_offsets': [0, 7]}, + {'reduction': base_reduction << 1, 'inputs_offsets': [1, 7, 8]}, + {'reduction': base_reduction << 2, 'inputs_offsets': [2, 6, 9]}, + {'reduction': base_reduction << 3, 'inputs_offsets': [3, 5, 10]}, + {'reduction': base_reduction << 4, 'inputs_offsets': [4, 11]}, + ], + 'weight_method': 'fastattn', + } + return p + + +def swish(x, inplace: bool = False): + """Swish - Described in: https://arxiv.org/abs/1710.05941 + """ + return x.mul_(x.sigmoid()) if inplace else x.mul(x.sigmoid()) + + +class Swish(nn.Module): + def __init__(self, inplace: bool = False): + super(Swish, self).__init__() + self.inplace = inplace + + def forward(self, x): + return swish(x, self.inplace) + + +class SequentialAppend(nn.Sequential): + def __init__(self, *args): + super(SequentialAppend, self).__init__(*args) + + def forward(self, x): + for module in self: + x.append(module(x)) + return x + + +class SequentialAppendLast(nn.Sequential): + def __init__(self, *args): + super(SequentialAppendLast, self).__init__(*args) + + # def forward(self, x: List[torch.Tensor]): + def forward(self, x): + for module in self: + x.append(module(x[-1])) + return x + + +class ConvBnAct2d(nn.Module): + def __init__(self, in_channels, out_channels, kernel_size, stride=1, dilation=1, padding='', bias=False, + norm='', act_layer=Swish): + super(ConvBnAct2d, self).__init__() + # self.conv = create_conv2d( + # in_channels, out_channels, kernel_size, stride=stride, dilation=dilation, padding=padding, 
bias=bias) + self.conv = Conv2d( + in_channels, out_channels, kernel_size=kernel_size, stride=stride, + padding=kernel_size // 2, bias=(norm == '')) + self.bn = get_norm(norm, out_channels) + self.act = None if act_layer is None else act_layer(inplace=True) + + def forward(self, x): + x = self.conv(x) + if self.bn is not None: + x = self.bn(x) + if self.act is not None: + x = self.act(x) + return x + + +class SeparableConv2d(nn.Module): + """ Separable Conv + """ + def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, dilation=1, padding='', bias=False, + channel_multiplier=1.0, pw_kernel_size=1, act_layer=Swish, + norm=''): + super(SeparableConv2d, self).__init__() + + # self.conv_dw = create_conv2d( + # in_channels, int(in_channels * channel_multiplier), kernel_size, + # stride=stride, dilation=dilation, padding=padding, depthwise=True) + + self.conv_dw = Conv2d( + in_channels, int(in_channels * channel_multiplier), + kernel_size=kernel_size, stride=stride, padding=kernel_size // 2, bias=bias, + groups=out_channels) + # print('conv_dw', kernel_size, stride) + # self.conv_pw = create_conv2d( + # int(in_channels * channel_multiplier), out_channels, pw_kernel_size, padding=padding, bias=bias) + + self.conv_pw = Conv2d( + int(in_channels * channel_multiplier), out_channels, + kernel_size=pw_kernel_size, padding=pw_kernel_size // 2, bias=(norm=='')) + # print('conv_pw', pw_kernel_size) + + self.bn = get_norm(norm, out_channels) + self.act = None if act_layer is None else act_layer(inplace=True) + + def forward(self, x): + x = self.conv_dw(x) + x = self.conv_pw(x) + if self.bn is not None: + x = self.bn(x) + if self.act is not None: + x = self.act(x) + return x + + +class ResampleFeatureMap(nn.Sequential): + def __init__(self, in_channels, out_channels, reduction_ratio=1., pad_type='', pooling_type='max', + norm='', apply_bn=False, conv_after_downsample=False, + redundant_bias=False): + super(ResampleFeatureMap, self).__init__() + pooling_type = pooling_type or 'max' + self.in_channels = in_channels + self.out_channels = out_channels + self.reduction_ratio = reduction_ratio + self.conv_after_downsample = conv_after_downsample + + conv = None + if in_channels != out_channels: + conv = ConvBnAct2d( + in_channels, out_channels, kernel_size=1, padding=pad_type, + norm=norm if apply_bn else '', + bias=not apply_bn or redundant_bias, act_layer=None) + + if reduction_ratio > 1: + stride_size = int(reduction_ratio) + if conv is not None and not self.conv_after_downsample: + self.add_module('conv', conv) + self.add_module( + 'downsample', + # create_pool2d( + # pooling_type, kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) + # nn.MaxPool2d(kernel_size=stride_size + 1, stride=stride_size, padding=pad_type) + nn.MaxPool2d(kernel_size=stride_size, stride=stride_size) + ) + if conv is not None and self.conv_after_downsample: + self.add_module('conv', conv) + else: + if conv is not None: + self.add_module('conv', conv) + if reduction_ratio < 1: + scale = int(1 // reduction_ratio) + self.add_module('upsample', nn.UpsamplingNearest2d(scale_factor=scale)) + + +class FpnCombine(nn.Module): + def __init__(self, feature_info, fpn_config, fpn_channels, inputs_offsets, target_reduction, pad_type='', + pooling_type='max', norm='', apply_bn_for_resampling=False, + conv_after_downsample=False, redundant_bias=False, weight_method='attn'): + super(FpnCombine, self).__init__() + self.inputs_offsets = inputs_offsets + self.weight_method = weight_method + + self.resample = nn.ModuleDict() 
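+        # Build one resampling branch per input edge: each incoming feature map
+        # is projected to fpn_channels (1x1 conv when channel counts differ) and
+        # down-/up-sampled to this node's target resolution before the weighted
+        # fusion in forward().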
+ for idx, offset in enumerate(inputs_offsets): + in_channels = fpn_channels + if offset < len(feature_info): + in_channels = feature_info[offset]['num_chs'] + input_reduction = feature_info[offset]['reduction'] + else: + node_idx = offset - len(feature_info) + # print('node_idx, len', node_idx, len(fpn_config['nodes'])) + input_reduction = fpn_config['nodes'][node_idx]['reduction'] + reduction_ratio = target_reduction / input_reduction + self.resample[str(offset)] = ResampleFeatureMap( + in_channels, fpn_channels, reduction_ratio=reduction_ratio, pad_type=pad_type, + pooling_type=pooling_type, norm=norm, + apply_bn=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, + redundant_bias=redundant_bias) + + if weight_method == 'attn' or weight_method == 'fastattn': + # WSM + self.edge_weights = nn.Parameter(torch.ones(len(inputs_offsets)), requires_grad=True) + else: + self.edge_weights = None + + def forward(self, x): + dtype = x[0].dtype + nodes = [] + for offset in self.inputs_offsets: + input_node = x[offset] + input_node = self.resample[str(offset)](input_node) + nodes.append(input_node) + + if self.weight_method == 'attn': + normalized_weights = torch.softmax(self.edge_weights.type(dtype), dim=0) + x = torch.stack(nodes, dim=-1) * normalized_weights + elif self.weight_method == 'fastattn': + edge_weights = nn.functional.relu(self.edge_weights.type(dtype)) + weights_sum = torch.sum(edge_weights) + x = torch.stack( + [(nodes[i] * edge_weights[i]) / (weights_sum + 0.0001) for i in range(len(nodes))], dim=-1) + elif self.weight_method == 'sum': + x = torch.stack(nodes, dim=-1) + else: + raise ValueError('unknown weight_method {}'.format(self.weight_method)) + x = torch.sum(x, dim=-1) + return x + + +class BiFpnLayer(nn.Module): + def __init__(self, feature_info, fpn_config, fpn_channels, num_levels=5, pad_type='', + pooling_type='max', norm='', act_layer=Swish, + apply_bn_for_resampling=False, conv_after_downsample=True, conv_bn_relu_pattern=False, + separable_conv=True, redundant_bias=False): + super(BiFpnLayer, self).__init__() + self.fpn_config = fpn_config + self.num_levels = num_levels + self.conv_bn_relu_pattern = False + + self.feature_info = [] + self.fnode = SequentialAppend() + for i, fnode_cfg in enumerate(fpn_config['nodes']): + # logging.debug('fnode {} : {}'.format(i, fnode_cfg)) + # print('fnode {} : {}'.format(i, fnode_cfg)) + fnode_layers = OrderedDict() + + # combine features + reduction = fnode_cfg['reduction'] + fnode_layers['combine'] = FpnCombine( + feature_info, fpn_config, fpn_channels, fnode_cfg['inputs_offsets'], target_reduction=reduction, + pad_type=pad_type, pooling_type=pooling_type, norm=norm, + apply_bn_for_resampling=apply_bn_for_resampling, conv_after_downsample=conv_after_downsample, + redundant_bias=redundant_bias, weight_method=fpn_config['weight_method']) + self.feature_info.append(dict(num_chs=fpn_channels, reduction=reduction)) + + # after combine ops + after_combine = OrderedDict() + if not conv_bn_relu_pattern: + after_combine['act'] = act_layer(inplace=True) + conv_bias = redundant_bias + conv_act = None + else: + conv_bias = False + conv_act = act_layer + conv_kwargs = dict( + in_channels=fpn_channels, out_channels=fpn_channels, kernel_size=3, padding=pad_type, + bias=conv_bias, norm=norm, act_layer=conv_act) + after_combine['conv'] = SeparableConv2d(**conv_kwargs) if separable_conv else ConvBnAct2d(**conv_kwargs) + fnode_layers['after_combine'] = nn.Sequential(after_combine) + + self.fnode.add_module(str(i), 
nn.Sequential(fnode_layers)) + + self.feature_info = self.feature_info[-num_levels::] + + def forward(self, x): + x = self.fnode(x) + return x[-self.num_levels::] + + +class BiFPN(Backbone): + def __init__( + self, cfg, bottom_up, in_features, out_channels, norm='', + num_levels=5, num_bifpn=4, separable_conv=False, + ): + super(BiFPN, self).__init__() + assert isinstance(bottom_up, Backbone) + + # Feature map strides and channels from the bottom up network (e.g. ResNet) + input_shapes = bottom_up.output_shape() + in_strides = [input_shapes[f].stride for f in in_features] + in_channels = [input_shapes[f].channels for f in in_features] + + self.num_levels = num_levels + self.num_bifpn = num_bifpn + self.bottom_up = bottom_up + self.in_features = in_features + self._size_divisibility = 128 + levels = [int(math.log2(s)) for s in in_strides] + self._out_feature_strides = { + "p{}".format(int(math.log2(s))): s for s in in_strides} + if len(in_features) < num_levels: + for l in range(num_levels - len(in_features)): + s = l + levels[-1] + self._out_feature_strides["p{}".format(s + 1)] = 2 ** (s + 1) + self._out_features = list(sorted(self._out_feature_strides.keys())) + self._out_feature_channels = {k: out_channels for k in self._out_features} + + # print('self._out_feature_strides', self._out_feature_strides) + # print('self._out_feature_channels', self._out_feature_channels) + + feature_info = [ + {'num_chs': in_channels[level], 'reduction': in_strides[level]} \ + for level in range(len(self.in_features)) + ] + # self.config = config + fpn_config = get_fpn_config() + self.resample = SequentialAppendLast() + for level in range(num_levels): + if level < len(feature_info): + in_chs = in_channels[level] # feature_info[level]['num_chs'] + reduction = in_strides[level] # feature_info[level]['reduction'] + else: + # Adds a coarser level by downsampling the last feature map + reduction_ratio = 2 + self.resample.add_module(str(level), ResampleFeatureMap( + in_channels=in_chs, + out_channels=out_channels, + pad_type='same', + pooling_type=None, + norm=norm, + reduction_ratio=reduction_ratio, + apply_bn=True, + conv_after_downsample=False, + redundant_bias=False, + )) + in_chs = out_channels + reduction = int(reduction * reduction_ratio) + feature_info.append(dict(num_chs=in_chs, reduction=reduction)) + + self.cell = nn.Sequential() + for rep in range(self.num_bifpn): + # logging.debug('building cell {}'.format(rep)) + # print('building cell {}'.format(rep)) + fpn_layer = BiFpnLayer( + feature_info=feature_info, + fpn_config=fpn_config, + fpn_channels=out_channels, + num_levels=self.num_levels, + pad_type='same', + pooling_type=None, + norm=norm, + act_layer=Swish, + separable_conv=separable_conv, + apply_bn_for_resampling=True, + conv_after_downsample=False, + conv_bn_relu_pattern=False, + redundant_bias=False, + ) + self.cell.add_module(str(rep), fpn_layer) + feature_info = fpn_layer.feature_info + # import pdb; pdb.set_trace() + + @property + def size_divisibility(self): + return self._size_divisibility + + def forward(self, x): + # print('input shapes', x.shape) + bottom_up_features = self.bottom_up(x) + x = [bottom_up_features[f] for f in self.in_features] + assert len(self.resample) == self.num_levels - len(x) + x = self.resample(x) + shapes = [xx.shape for xx in x] + # print('resample shapes', shapes) + x = self.cell(x) + out = {f: xx for f, xx in zip(self._out_features, x)} + # import pdb; pdb.set_trace() + return out + + +# @BACKBONE_REGISTRY.register() +def build_resnet_bifpn_backbone(cfg, 
input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = build_resnet_backbone(cfg, input_shape) + in_features = cfg.MODEL.FPN.IN_FEATURES + backbone = BiFPN( + cfg=cfg, + bottom_up=bottom_up, + in_features=in_features, + out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, + norm=cfg.MODEL.BIFPN.NORM, + num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, + num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, + separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, + ) + return backbone + +# @BACKBONE_REGISTRY.register() +def build_p37_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = dla34(cfg) + in_features = cfg.MODEL.FPN.IN_FEATURES + assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 + + backbone = BiFPN( + cfg=cfg, + bottom_up=bottom_up, + in_features=in_features, + out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, + norm=cfg.MODEL.BIFPN.NORM, + num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, + num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, + separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, + ) + return backbone diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py new file mode 100755 index 00000000..54c9fa27 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py @@ -0,0 +1,469 @@ +# This file is modified from https://github.com/aim-uofa/AdelaiDet/blob/master/adet/modeling/backbone/bifpn.py +# The original file is under 2-clause BSD License for academic use, and *non-commercial use*. +import torch +import torch.nn.functional as F +from torch import nn + +from detectron2.layers import Conv2d, ShapeSpec, get_norm + +from detectron2.modeling.backbone import Backbone, build_resnet_backbone +from detectron2.modeling import BACKBONE_REGISTRY +from .dlafpn import dla34 + +__all__ = [] + + +def swish(x): + return x * x.sigmoid() + + +def split_name(name): + for i, c in enumerate(name): + if not c.isalpha(): + return name[:i], int(name[i:]) + raise ValueError() + + +class FeatureMapResampler(nn.Module): + def __init__(self, in_channels, out_channels, stride, norm=""): + super(FeatureMapResampler, self).__init__() + if in_channels != out_channels: + self.reduction = Conv2d( + in_channels, out_channels, kernel_size=1, + bias=(norm == ""), + norm=get_norm(norm, out_channels), + activation=None + ) + else: + self.reduction = None + + assert stride <= 2 + self.stride = stride + + def forward(self, x): + if self.reduction is not None: + x = self.reduction(x) + + if self.stride == 2: + x = F.max_pool2d( + x, kernel_size=self.stride + 1, + stride=self.stride, padding=1 + ) + elif self.stride == 1: + pass + else: + raise NotImplementedError() + return x + + +class BackboneWithTopLevels(Backbone): + def __init__(self, backbone, out_channels, num_top_levels, norm=""): + super(BackboneWithTopLevels, self).__init__() + self.backbone = backbone + backbone_output_shape = backbone.output_shape() + + self._out_feature_channels = {name: shape.channels for name, shape in backbone_output_shape.items()} + self._out_feature_strides = {name: shape.stride for name, shape in backbone_output_shape.items()} + self._out_features = list(self._out_feature_strides.keys()) + + last_feature_name = max(self._out_feature_strides.keys(), key=lambda x: split_name(x)[1]) + 
self.last_feature_name = last_feature_name
+        self.num_top_levels = num_top_levels
+
+        last_channels = self._out_feature_channels[last_feature_name]
+        last_stride = self._out_feature_strides[last_feature_name]
+
+        prefix, suffix = split_name(last_feature_name)
+        prev_channels = last_channels
+        for i in range(num_top_levels):
+            name = prefix + str(suffix + i + 1)
+            self.add_module(name, FeatureMapResampler(
+                prev_channels, out_channels, 2, norm
+            ))
+            prev_channels = out_channels
+
+            self._out_feature_channels[name] = out_channels
+            self._out_feature_strides[name] = last_stride * 2 ** (i + 1)
+            self._out_features.append(name)
+
+    def forward(self, x):
+        outputs = self.backbone(x)
+        last_features = outputs[self.last_feature_name]
+        prefix, suffix = split_name(self.last_feature_name)
+
+        x = last_features
+        for i in range(self.num_top_levels):
+            name = prefix + str(suffix + i + 1)
+            x = self.__getattr__(name)(x)
+            outputs[name] = x
+
+        return outputs
+
+
+class SingleBiFPN(Backbone):
+    """
+    This module implements a single BiFPN repeat: it bidirectionally fuses
+    the input pyramid features and returns one fused map per input level.
+    """
+
+    def __init__(
+        self, in_channels_list, out_channels, norm=""
+    ):
+        """
+        Args:
+            in_channels_list (list[int]): number of channels of each input
+                feature map, ordered from high to low resolution. Only 3- or
+                5-level inputs are supported.
+            out_channels (int): number of channels in the output feature maps.
+            norm (str): the normalization to use.
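+
+        With five inputs, the ``nodes`` table built below wires one
+        top-down pass (levels 3 -> 0) followed by one bottom-up pass
+        (levels 1 -> 4), i.e. a single bidirectional sweep; the 3-level
+        variant applies the same pattern over levels 0..2.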
+        """
+        super(SingleBiFPN, self).__init__()
+
+        self.out_channels = out_channels
+        # build the 5-level bifpn node graph
+        if len(in_channels_list) == 5:
+            self.nodes = [
+                {'feat_level': 3, 'inputs_offsets': [3, 4]},
+                {'feat_level': 2, 'inputs_offsets': [2, 5]},
+                {'feat_level': 1, 'inputs_offsets': [1, 6]},
+                {'feat_level': 0, 'inputs_offsets': [0, 7]},
+                {'feat_level': 1, 'inputs_offsets': [1, 7, 8]},
+                {'feat_level': 2, 'inputs_offsets': [2, 6, 9]},
+                {'feat_level': 3, 'inputs_offsets': [3, 5, 10]},
+                {'feat_level': 4, 'inputs_offsets': [4, 11]},
+            ]
+        elif len(in_channels_list) == 3:
+            self.nodes = [
+                {'feat_level': 1, 'inputs_offsets': [1, 2]},
+                {'feat_level': 0, 'inputs_offsets': [0, 3]},
+                {'feat_level': 1, 'inputs_offsets': [1, 3, 4]},
+                {'feat_level': 2, 'inputs_offsets': [2, 5]},
+            ]
+        else:
+            raise NotImplementedError
+
+        node_info = [_ for _ in in_channels_list]
+
+        num_output_connections = [0 for _ in in_channels_list]
+        for fnode in self.nodes:
+            feat_level = fnode["feat_level"]
+            inputs_offsets = fnode["inputs_offsets"]
+            inputs_offsets_str = "_".join(map(str, inputs_offsets))
+            for input_offset in inputs_offsets:
+                num_output_connections[input_offset] += 1
+
+                in_channels = node_info[input_offset]
+                if in_channels != out_channels:
+                    lateral_conv = Conv2d(
+                        in_channels,
+                        out_channels,
+                        kernel_size=1,
+                        norm=get_norm(norm, out_channels)
+                    )
+                    self.add_module(
+                        "lateral_{}_f{}".format(input_offset, feat_level), lateral_conv
+                    )
+            node_info.append(out_channels)
+            num_output_connections.append(0)
+
+            # generate attention weights
+            name = "weights_f{}_{}".format(feat_level, inputs_offsets_str)
+            self.__setattr__(name, nn.Parameter(
+                torch.ones(len(inputs_offsets), dtype=torch.float32),
+                requires_grad=True
+            ))
+
+            # generate convolutions after combination
+            name = "outputs_f{}_{}".format(feat_level, inputs_offsets_str)
+            self.add_module(name, Conv2d(
+                out_channels,
+                out_channels,
+                kernel_size=3,
+                padding=1,
+                norm=get_norm(norm, out_channels),
+                bias=(norm == "")
+            ))
+
+    def forward(self, feats):
+        """
+        Args:
+            feats (list[Tensor]): input pyramid feature maps, ordered from
+                high to low resolution.
+        Returns:
+            list[Tensor]: the fused feature maps, one per input level, in
+                the same high to low resolution order, each with
+                `out_channels` channels.
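+
+        Illustrative usage (a sketch; the channel counts and shapes are
+        arbitrary placeholders, not values used elsewhere in this repo)::
+
+            fpn = SingleBiFPN(in_channels_list=[64, 128, 256], out_channels=32)
+            feats = [torch.rand(1, 64, 64, 64),    # stride 8 level
+                     torch.rand(1, 128, 32, 32),   # stride 16 level
+                     torch.rand(1, 256, 16, 16)]   # stride 32 level
+            outs = fpn(feats)  # 3 tensors, each with 32 channels
+
+        Each fusion node weights its inputs with fast normalized fusion
+        (EfficientDet-style), relu(w_i) / (sum_j relu(w_j) + 1e-4), then
+        applies swish and a 3x3 output conv.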
+ """ + feats = [_ for _ in feats] + num_levels = len(feats) + num_output_connections = [0 for _ in feats] + for fnode in self.nodes: + feat_level = fnode["feat_level"] + inputs_offsets = fnode["inputs_offsets"] + inputs_offsets_str = "_".join(map(str, inputs_offsets)) + input_nodes = [] + _, _, target_h, target_w = feats[feat_level].size() + for input_offset in inputs_offsets: + num_output_connections[input_offset] += 1 + input_node = feats[input_offset] + + # reduction + if input_node.size(1) != self.out_channels: + name = "lateral_{}_f{}".format(input_offset, feat_level) + input_node = self.__getattr__(name)(input_node) + + # maybe downsample + _, _, h, w = input_node.size() + if h > target_h and w > target_w: + height_stride_size = int((h - 1) // target_h + 1) + width_stride_size = int((w - 1) // target_w + 1) + assert height_stride_size == width_stride_size == 2 + input_node = F.max_pool2d( + input_node, kernel_size=(height_stride_size + 1, width_stride_size + 1), + stride=(height_stride_size, width_stride_size), padding=1 + ) + elif h <= target_h and w <= target_w: + if h < target_h or w < target_w: + input_node = F.interpolate( + input_node, + size=(target_h, target_w), + mode="nearest" + ) + else: + raise NotImplementedError() + input_nodes.append(input_node) + + # attention + name = "weights_f{}_{}".format(feat_level, inputs_offsets_str) + weights = F.relu(self.__getattr__(name)) + norm_weights = weights / (weights.sum() + 0.0001) + + new_node = torch.stack(input_nodes, dim=-1) + new_node = (norm_weights * new_node).sum(dim=-1) + new_node = swish(new_node) + + name = "outputs_f{}_{}".format(feat_level, inputs_offsets_str) + feats.append(self.__getattr__(name)(new_node)) + + num_output_connections.append(0) + + output_feats = [] + for idx in range(num_levels): + for i, fnode in enumerate(reversed(self.nodes)): + if fnode['feat_level'] == idx: + output_feats.append(feats[-1 - i]) + break + else: + raise ValueError() + return output_feats + + +class BiFPN(Backbone): + """ + This module implements Feature Pyramid Network. + It creates pyramid features built on top of some input feature maps. + """ + + def __init__( + self, bottom_up, in_features, out_channels, num_top_levels, num_repeats, norm="" + ): + """ + Args: + bottom_up (Backbone): module representing the bottom up subnetwork. + Must be a subclass of :class:`Backbone`. The multi-scale feature + maps generated by the bottom up network, and listed in `in_features`, + are used to generate FPN levels. + in_features (list[str]): names of the input feature maps coming + from the backbone to which FPN is attached. For example, if the + backbone produces ["res2", "res3", "res4"], any *contiguous* sublist + of these may be used; order must be from high to low resolution. + out_channels (int): number of channels in the output feature maps. + num_top_levels (int): the number of the top levels (p6 or p7). + num_repeats (int): the number of repeats of BiFPN. + norm (str): the normalization to use. 
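+
+        For example (illustrative values only): with
+        in_features=["res3", "res4", "res5"] and num_top_levels=2, the
+        wrapper first synthesizes "res6" and "res7" from "res5" by strided
+        resampling (BackboneWithTopLevels), then runs num_repeats
+        SingleBiFPN stages over all five levels and returns a dict with
+        keys ["p3", "p4", "p5", "p6", "p7"].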
+        """
+        super(BiFPN, self).__init__()
+        assert isinstance(bottom_up, Backbone)
+
+        # add extra feature levels (i.e., 6 and 7)
+        self.bottom_up = BackboneWithTopLevels(
+            bottom_up, out_channels,
+            num_top_levels, norm
+        )
+        bottom_up_output_shapes = self.bottom_up.output_shape()
+
+        in_features = sorted(in_features, key=lambda x: split_name(x)[1])
+        self._size_divisibility = 128  # bottom_up_output_shapes[in_features[-1]].stride
+        self.out_channels = out_channels
+        self.min_level = split_name(in_features[0])[1]
+
+        # add the names for top blocks
+        prefix, last_suffix = split_name(in_features[-1])
+        for i in range(num_top_levels):
+            in_features.append(prefix + str(last_suffix + i + 1))
+        self.in_features = in_features
+
+        # generate output features
+        self._out_features = ["p{}".format(split_name(name)[1]) for name in in_features]
+        self._out_feature_strides = {
+            out_name: bottom_up_output_shapes[in_name].stride
+            for out_name, in_name in zip(self._out_features, in_features)
+        }
+        self._out_feature_channels = {k: out_channels for k in self._out_features}
+
+        # build bifpn
+        self.repeated_bifpn = nn.ModuleList()
+        for i in range(num_repeats):
+            if i == 0:
+                in_channels_list = [
+                    bottom_up_output_shapes[name].channels for name in in_features
+                ]
+            else:
+                in_channels_list = [
+                    self._out_feature_channels[name] for name in self._out_features
+                ]
+            self.repeated_bifpn.append(SingleBiFPN(
+                in_channels_list, out_channels, norm
+            ))
+
+    @property
+    def size_divisibility(self):
+        return self._size_divisibility
+
+    def forward(self, x):
+        """
+        Args:
+            x (Tensor): the input image tensor, which is fed to the
+                bottom-up backbone.
+        Returns:
+            dict[str->Tensor]:
+                mapping from feature map name to FPN feature map tensor
+                in high to low resolution order. Returned feature names
+                follow the FPN paper convention: "p{stage}", where stage
+                has stride = 2 ** stage, e.g. ["p3", "p4", ..., "p7"].
+        """
+        bottom_up_features = self.bottom_up(x)
+        feats = [bottom_up_features[f] for f in self.in_features]
+
+        for bifpn in self.repeated_bifpn:
+            feats = bifpn(feats)
+
+        return dict(zip(self._out_features, feats))
+
+
+def _assert_strides_are_log2_contiguous(strides):
+    """
+    Assert that each stride is 2x its preceding stride, i.e. "contiguous in log2".
+    """
+    for i, stride in enumerate(strides[1:], 1):
+        assert stride == 2 * strides[i - 1], "Strides {} {} are not log2 contiguous".format(
+            stride, strides[i - 1]
+        )
+
+
+# @BACKBONE_REGISTRY.register()
+def build_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec):
+    """
+    Args:
+        cfg: a detectron2 CfgNode
+    Returns:
+        backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`.
+    """
+    bottom_up = build_resnet_backbone(cfg, input_shape)
+    in_features = cfg.MODEL.FPN.IN_FEATURES
+    out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS
+    num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN
+    top_levels = 2
+
+    backbone = BiFPN(
+        bottom_up=bottom_up,
+        in_features=in_features,
+        out_channels=out_channels,
+        num_top_levels=top_levels,
+        num_repeats=num_repeats,
+        norm=cfg.MODEL.BIFPN.NORM
+    )
+    return backbone
+
+
+# @BACKBONE_REGISTRY.register()
+def build_p35_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec):
+    """
+    Args:
+        cfg: a detectron2 CfgNode
+    Returns:
+        backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`.
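+
+    A minimal YAML sketch of the config keys this builder reads directly
+    (the ResNet bottom-up consumes its own MODEL.RESNETS.* keys; the values
+    below are placeholders, not recommended settings)::
+
+        MODEL:
+          FPN:
+            IN_FEATURES: ["res3", "res4", "res5"]
+          BIFPN:
+            OUT_CHANNELS: 160
+            NUM_BIFPN: 3
+            NORM: "SyncBN"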
+ """ + bottom_up = build_resnet_backbone(cfg, input_shape) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS + num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN + top_levels = 0 + + backbone = BiFPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + num_top_levels=top_levels, + num_repeats=num_repeats, + norm=cfg.MODEL.BIFPN.NORM + ) + return backbone + + +# @BACKBONE_REGISTRY.register() +def build_p35_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = dla34(cfg) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS + num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN + top_levels = 0 + + backbone = BiFPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + num_top_levels=top_levels, + num_repeats=num_repeats, + norm=cfg.MODEL.BIFPN.NORM + ) + return backbone + +# @BACKBONE_REGISTRY.register() +def build_p37_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = dla34(cfg) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.BIFPN.OUT_CHANNELS + num_repeats = cfg.MODEL.BIFPN.NUM_BIFPN + assert cfg.MODEL.BIFPN.NUM_LEVELS == 5 + top_levels = 2 + + backbone = BiFPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + num_top_levels=top_levels, + num_repeats=num_repeats, + norm=cfg.MODEL.BIFPN.NORM + ) + return backbone diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py new file mode 100755 index 00000000..85e7a666 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py @@ -0,0 +1,479 @@ +import numpy as np +import math +from os.path import join +import fvcore.nn.weight_init as weight_init +import torch +import torch.nn.functional as F +from torch import nn +import torch.utils.model_zoo as model_zoo + +from detectron2.modeling.backbone.resnet import ( + BasicStem, BottleneckBlock, DeformBottleneckBlock) +from detectron2.layers import ( + Conv2d, + DeformConv, + FrozenBatchNorm2d, + ModulatedDeformConv, + ShapeSpec, + get_norm, +) + +from detectron2.modeling.backbone.backbone import Backbone +from detectron2.modeling.backbone.build import BACKBONE_REGISTRY +from detectron2.modeling.backbone.fpn import FPN + +__all__ = [ + "BottleneckBlock", + "DeformBottleneckBlock", + "BasicStem", +] + +DCNV1 = False + +HASH = { + 34: 'ba72cf86', + 60: '24839fc4', +} + +def get_model_url(data, name, hash): + return join('http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) + +class BasicBlock(nn.Module): + def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): + super(BasicBlock, self).__init__() + self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, + stride=stride, padding=dilation, + bias=False, dilation=dilation) + self.bn1 = get_norm(norm, planes) + self.relu = nn.ReLU(inplace=True) + self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, + stride=1, padding=dilation, + bias=False, dilation=dilation) + self.bn2 = get_norm(norm, planes) + self.stride = stride + + def forward(self, x, residual=None): + if residual is None: + residual = x + + out = self.conv1(x) + out = 
self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + + out += residual + out = self.relu(out) + + return out + +class Bottleneck(nn.Module): + expansion = 2 + + def __init__(self, inplanes, planes, stride=1, dilation=1, norm='BN'): + super(Bottleneck, self).__init__() + expansion = Bottleneck.expansion + bottle_planes = planes // expansion + self.conv1 = nn.Conv2d(inplanes, bottle_planes, + kernel_size=1, bias=False) + self.bn1 = get_norm(norm, bottle_planes) + self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, + stride=stride, padding=dilation, + bias=False, dilation=dilation) + self.bn2 = get_norm(norm, bottle_planes) + self.conv3 = nn.Conv2d(bottle_planes, planes, + kernel_size=1, bias=False) + self.bn3 = get_norm(norm, planes) + self.relu = nn.ReLU(inplace=True) + self.stride = stride + + def forward(self, x, residual=None): + if residual is None: + residual = x + + out = self.conv1(x) + out = self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + out = self.relu(out) + + out = self.conv3(out) + out = self.bn3(out) + + out += residual + out = self.relu(out) + + return out + +class Root(nn.Module): + def __init__(self, in_channels, out_channels, kernel_size, residual, norm='BN'): + super(Root, self).__init__() + self.conv = nn.Conv2d( + in_channels, out_channels, 1, + stride=1, bias=False, padding=(kernel_size - 1) // 2) + self.bn = get_norm(norm, out_channels) + self.relu = nn.ReLU(inplace=True) + self.residual = residual + + def forward(self, *x): + children = x + x = self.conv(torch.cat(x, 1)) + x = self.bn(x) + if self.residual: + x += children[0] + x = self.relu(x) + + return x + + +class Tree(nn.Module): + def __init__(self, levels, block, in_channels, out_channels, stride=1, + level_root=False, root_dim=0, root_kernel_size=1, + dilation=1, root_residual=False, norm='BN'): + super(Tree, self).__init__() + if root_dim == 0: + root_dim = 2 * out_channels + if level_root: + root_dim += in_channels + if levels == 1: + self.tree1 = block(in_channels, out_channels, stride, + dilation=dilation, norm=norm) + self.tree2 = block(out_channels, out_channels, 1, + dilation=dilation, norm=norm) + else: + self.tree1 = Tree(levels - 1, block, in_channels, out_channels, + stride, root_dim=0, + root_kernel_size=root_kernel_size, + dilation=dilation, root_residual=root_residual, + norm=norm) + self.tree2 = Tree(levels - 1, block, out_channels, out_channels, + root_dim=root_dim + out_channels, + root_kernel_size=root_kernel_size, + dilation=dilation, root_residual=root_residual, + norm=norm) + if levels == 1: + self.root = Root(root_dim, out_channels, root_kernel_size, + root_residual, norm=norm) + self.level_root = level_root + self.root_dim = root_dim + self.downsample = None + self.project = None + self.levels = levels + if stride > 1: + self.downsample = nn.MaxPool2d(stride, stride=stride) + if in_channels != out_channels: + self.project = nn.Sequential( + nn.Conv2d(in_channels, out_channels, + kernel_size=1, stride=1, bias=False), + get_norm(norm, out_channels) + ) + + def forward(self, x, residual=None, children=None): + children = [] if children is None else children + bottom = self.downsample(x) if self.downsample else x + residual = self.project(bottom) if self.project else bottom + if self.level_root: + children.append(bottom) + x1 = self.tree1(x, residual) + if self.levels == 1: + x2 = self.tree2(x1) + x = self.root(x2, x1, *children) + else: + children.append(x1) + x = self.tree2(x1, children=children) + 
return x + +class DLA(nn.Module): + def __init__(self, num_layers, levels, channels, + block=BasicBlock, residual_root=False, norm='BN'): + """ + Args: + """ + super(DLA, self).__init__() + self.norm = norm + self.channels = channels + self.base_layer = nn.Sequential( + nn.Conv2d(3, channels[0], kernel_size=7, stride=1, + padding=3, bias=False), + get_norm(self.norm, channels[0]), + nn.ReLU(inplace=True)) + self.level0 = self._make_conv_level( + channels[0], channels[0], levels[0]) + self.level1 = self._make_conv_level( + channels[0], channels[1], levels[1], stride=2) + self.level2 = Tree(levels[2], block, channels[1], channels[2], 2, + level_root=False, + root_residual=residual_root, norm=norm) + self.level3 = Tree(levels[3], block, channels[2], channels[3], 2, + level_root=True, root_residual=residual_root, + norm=norm) + self.level4 = Tree(levels[4], block, channels[3], channels[4], 2, + level_root=True, root_residual=residual_root, + norm=norm) + self.level5 = Tree(levels[5], block, channels[4], channels[5], 2, + level_root=True, root_residual=residual_root, + norm=norm) + self.load_pretrained_model( + data='imagenet', name='dla{}'.format(num_layers), + hash=HASH[num_layers]) + + def load_pretrained_model(self, data, name, hash): + model_url = get_model_url(data, name, hash) + model_weights = model_zoo.load_url(model_url) + num_classes = len(model_weights[list(model_weights.keys())[-1]]) + self.fc = nn.Conv2d( + self.channels[-1], num_classes, + kernel_size=1, stride=1, padding=0, bias=True) + print('Loading pretrained') + self.load_state_dict(model_weights, strict=False) + + def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): + modules = [] + for i in range(convs): + modules.extend([ + nn.Conv2d(inplanes, planes, kernel_size=3, + stride=stride if i == 0 else 1, + padding=dilation, bias=False, dilation=dilation), + get_norm(self.norm, planes), + nn.ReLU(inplace=True)]) + inplanes = planes + return nn.Sequential(*modules) + + def forward(self, x): + y = [] + x = self.base_layer(x) + for i in range(6): + x = getattr(self, 'level{}'.format(i))(x) + y.append(x) + return y + + +def fill_up_weights(up): + w = up.weight.data + f = math.ceil(w.size(2) / 2) + c = (2 * f - 1 - f % 2) / (2. 
* f) + for i in range(w.size(2)): + for j in range(w.size(3)): + w[0, 0, i, j] = \ + (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) + for c in range(1, w.size(0)): + w[c, 0, :, :] = w[0, 0, :, :] + + +class _DeformConv(nn.Module): + def __init__(self, chi, cho, norm='BN'): + super(_DeformConv, self).__init__() + self.actf = nn.Sequential( + get_norm(norm, cho), + nn.ReLU(inplace=True) + ) + if DCNV1: + self.offset = Conv2d( + chi, 18, kernel_size=3, stride=1, + padding=1, dilation=1) + self.conv = DeformConv( + chi, cho, kernel_size=(3,3), stride=1, padding=1, + dilation=1, deformable_groups=1) + else: + self.offset = Conv2d( + chi, 27, kernel_size=3, stride=1, + padding=1, dilation=1) + self.conv = ModulatedDeformConv( + chi, cho, kernel_size=3, stride=1, padding=1, + dilation=1, deformable_groups=1) + nn.init.constant_(self.offset.weight, 0) + nn.init.constant_(self.offset.bias, 0) + + def forward(self, x): + if DCNV1: + offset = self.offset(x) + x = self.conv(x, offset) + else: + offset_mask = self.offset(x) + offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) + offset = torch.cat((offset_x, offset_y), dim=1) + mask = mask.sigmoid() + x = self.conv(x, offset, mask) + x = self.actf(x) + return x + + +class IDAUp(nn.Module): + def __init__(self, o, channels, up_f, norm='BN'): + super(IDAUp, self).__init__() + for i in range(1, len(channels)): + c = channels[i] + f = int(up_f[i]) + proj = _DeformConv(c, o, norm=norm) + node = _DeformConv(o, o, norm=norm) + + up = nn.ConvTranspose2d(o, o, f * 2, stride=f, + padding=f // 2, output_padding=0, + groups=o, bias=False) + fill_up_weights(up) + + setattr(self, 'proj_' + str(i), proj) + setattr(self, 'up_' + str(i), up) + setattr(self, 'node_' + str(i), node) + + + def forward(self, layers, startp, endp): + for i in range(startp + 1, endp): + upsample = getattr(self, 'up_' + str(i - startp)) + project = getattr(self, 'proj_' + str(i - startp)) + layers[i] = upsample(project(layers[i])) + node = getattr(self, 'node_' + str(i - startp)) + layers[i] = node(layers[i] + layers[i - 1]) + + +class DLAUp(nn.Module): + def __init__(self, startp, channels, scales, in_channels=None, norm='BN'): + super(DLAUp, self).__init__() + self.startp = startp + if in_channels is None: + in_channels = channels + self.channels = channels + channels = list(channels) + scales = np.array(scales, dtype=int) + for i in range(len(channels) - 1): + j = -i - 2 + setattr(self, 'ida_{}'.format(i), + IDAUp(channels[j], in_channels[j:], + scales[j:] // scales[j], norm=norm)) + scales[j + 1:] = scales[j] + in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] + + def forward(self, layers): + out = [layers[-1]] # start with 32 + for i in range(len(layers) - self.startp - 1): + ida = getattr(self, 'ida_{}'.format(i)) + ida(layers, len(layers) -i - 2, len(layers)) + out.insert(0, layers[-1]) + return out + +DLA_CONFIGS = { + 34: ([1, 1, 1, 2, 2, 1], [16, 32, 64, 128, 256, 512], BasicBlock), + 60: ([1, 1, 1, 2, 3, 1], [16, 32, 128, 256, 512, 1024], Bottleneck) +} + + +class DLASeg(Backbone): + def __init__(self, num_layers, out_features, use_dla_up=True, + ms_output=False, norm='BN'): + super(DLASeg, self).__init__() + # depth = 34 + levels, channels, Block = DLA_CONFIGS[num_layers] + self.base = DLA(num_layers=num_layers, + levels=levels, channels=channels, block=Block, norm=norm) + down_ratio = 4 + self.first_level = int(np.log2(down_ratio)) + self.ms_output = ms_output + self.last_level = 5 if not self.ms_output else 6 + channels = self.base.channels + 
scales = [2 ** i for i in range(len(channels[self.first_level:]))] + self.use_dla_up = use_dla_up + if self.use_dla_up: + self.dla_up = DLAUp( + self.first_level, channels[self.first_level:], scales, + norm=norm) + out_channel = channels[self.first_level] + if not self.ms_output: # stride 4 DLA + self.ida_up = IDAUp( + out_channel, channels[self.first_level:self.last_level], + [2 ** i for i in range(self.last_level - self.first_level)], + norm=norm) + self._out_features = out_features + self._out_feature_channels = { + 'dla{}'.format(i): channels[i] for i in range(6)} + self._out_feature_strides = { + 'dla{}'.format(i): 2 ** i for i in range(6)} + self._size_divisibility = 32 + + @property + def size_divisibility(self): + return self._size_divisibility + + def forward(self, x): + x = self.base(x) + if self.use_dla_up: + x = self.dla_up(x) + if not self.ms_output: # stride 4 dla + y = [] + for i in range(self.last_level - self.first_level): + y.append(x[i].clone()) + self.ida_up(y, 0, len(y)) + ret = {} + for i in range(self.last_level - self.first_level): + out_feature = 'dla{}'.format(i) + if out_feature in self._out_features: + ret[out_feature] = y[i] + else: + ret = {} + st = self.first_level if self.use_dla_up else 0 + for i in range(self.last_level - st): + out_feature = 'dla{}'.format(i + st) + if out_feature in self._out_features: + ret[out_feature] = x[i] + + return ret + + +# @BACKBONE_REGISTRY.register() +def build_dla_backbone(cfg, input_shape): + """ + Create a ResNet instance from config. + + Returns: + ResNet: a :class:`ResNet` instance. + """ + return DLASeg( + out_features=cfg.MODEL.DLA.OUT_FEATURES, + num_layers=cfg.MODEL.DLA.NUM_LAYERS, + use_dla_up=cfg.MODEL.DLA.USE_DLA_UP, + ms_output=cfg.MODEL.DLA.MS_OUTPUT, + norm=cfg.MODEL.DLA.NORM) + +class LastLevelP6P7(nn.Module): + """ + This module is used in RetinaNet to generate extra layers, P6 and P7 from + C5 feature. + """ + + def __init__(self, in_channels, out_channels): + super().__init__() + self.num_levels = 2 + self.in_feature = "dla5" + self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) + self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) + for module in [self.p6, self.p7]: + weight_init.c2_xavier_fill(module) + + def forward(self, c5): + p6 = self.p6(c5) + p7 = self.p7(F.relu(p6)) + return [p6, p7] + +# @BACKBONE_REGISTRY.register() +def build_retinanet_dla_fpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = build_dla_backbone(cfg, input_shape) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.FPN.OUT_CHANNELS + in_channels_p6p7 = bottom_up.output_shape()['dla5'].channels + backbone = FPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + norm=cfg.MODEL.FPN.NORM, + top_block=LastLevelP6P7(in_channels_p6p7, out_channels), + fuse_type=cfg.MODEL.FPN.FUSE_TYPE, + ) + return backbone diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py new file mode 100755 index 00000000..d2dfa52d --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py @@ -0,0 +1,493 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- + +# this file is from https://github.com/ucbdrive/dla/blob/master/dla.py. 
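+#
+# This copy provides the DLA-34 backbone (hierarchical Tree aggregation over
+# six levels, "dla0".."dla5", with strides 1..32), the IDAUp/DLAUp modules
+# that iteratively upsample and fuse those levels, and the detectron2
+# backbone builders that wrap them (plain FPN over DLA features, FPN with
+# extra P6/P7, and the DLA-up decoder).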
+ +import math +from os.path import join +import numpy as np + +import torch +from torch import nn +import torch.utils.model_zoo as model_zoo +import torch.nn.functional as F +import fvcore.nn.weight_init as weight_init + +from detectron2.modeling.backbone import FPN +from detectron2.layers import ShapeSpec, ModulatedDeformConv, Conv2d +from detectron2.modeling.backbone.build import BACKBONE_REGISTRY +from detectron2.layers.batch_norm import get_norm +from detectron2.modeling.backbone import Backbone + +WEB_ROOT = 'http://dl.yf.io/dla/models' + + +def get_model_url(data, name, hash): + return join( + 'http://dl.yf.io/dla/models', data, '{}-{}.pth'.format(name, hash)) + + +def conv3x3(in_planes, out_planes, stride=1): + "3x3 convolution with padding" + return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, + padding=1, bias=False) + + +class BasicBlock(nn.Module): + def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): + super(BasicBlock, self).__init__() + self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, + stride=stride, padding=dilation, + bias=False, dilation=dilation) + self.bn1 = get_norm(cfg.MODEL.DLA.NORM, planes) + self.relu = nn.ReLU(inplace=True) + self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, + stride=1, padding=dilation, + bias=False, dilation=dilation) + self.bn2 = get_norm(cfg.MODEL.DLA.NORM, planes) + self.stride = stride + + def forward(self, x, residual=None): + if residual is None: + residual = x + + out = self.conv1(x) + out = self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + + out += residual + out = self.relu(out) + + return out + + +class Bottleneck(nn.Module): + expansion = 2 + + def __init__(self, cfg, inplanes, planes, stride=1, dilation=1): + super(Bottleneck, self).__init__() + expansion = Bottleneck.expansion + bottle_planes = planes // expansion + self.conv1 = nn.Conv2d(inplanes, bottle_planes, + kernel_size=1, bias=False) + self.bn1 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) + self.conv2 = nn.Conv2d(bottle_planes, bottle_planes, kernel_size=3, + stride=stride, padding=dilation, + bias=False, dilation=dilation) + self.bn2 = get_norm(cfg.MODEL.DLA.NORM, bottle_planes) + self.conv3 = nn.Conv2d(bottle_planes, planes, + kernel_size=1, bias=False) + self.bn3 = get_norm(cfg.MODEL.DLA.NORM, planes) + self.relu = nn.ReLU(inplace=True) + self.stride = stride + + def forward(self, x, residual=None): + if residual is None: + residual = x + + out = self.conv1(x) + out = self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + out = self.relu(out) + + out = self.conv3(out) + out = self.bn3(out) + + out += residual + out = self.relu(out) + + return out + + +class Root(nn.Module): + def __init__(self, cfg, in_channels, out_channels, kernel_size, residual): + super(Root, self).__init__() + self.conv = nn.Conv2d( + in_channels, out_channels, kernel_size, + stride=1, bias=False, padding=(kernel_size - 1) // 2) + self.bn = get_norm(cfg.MODEL.DLA.NORM, out_channels) + self.relu = nn.ReLU(inplace=True) + self.residual = residual + + def forward(self, *x): + children = x + x = self.conv(torch.cat(x, 1)) + x = self.bn(x) + if self.residual: + x += children[0] + x = self.relu(x) + + return x + + +class Tree(nn.Module): + def __init__(self, cfg, levels, block, in_channels, out_channels, stride=1, + level_root=False, root_dim=0, root_kernel_size=1, + dilation=1, root_residual=False): + super(Tree, self).__init__() + if root_dim == 0: + root_dim = 2 * out_channels + if 
level_root: + root_dim += in_channels + if levels == 1: + self.tree1 = block(cfg, in_channels, out_channels, stride, + dilation=dilation) + self.tree2 = block(cfg, out_channels, out_channels, 1, + dilation=dilation) + else: + self.tree1 = Tree(cfg, levels - 1, block, in_channels, out_channels, + stride, root_dim=0, + root_kernel_size=root_kernel_size, + dilation=dilation, root_residual=root_residual) + self.tree2 = Tree(cfg, levels - 1, block, out_channels, out_channels, + root_dim=root_dim + out_channels, + root_kernel_size=root_kernel_size, + dilation=dilation, root_residual=root_residual) + if levels == 1: + self.root = Root(cfg, root_dim, out_channels, root_kernel_size, + root_residual) + self.level_root = level_root + self.root_dim = root_dim + self.downsample = None + self.project = None + self.levels = levels + if stride > 1: + self.downsample = nn.MaxPool2d(stride, stride=stride) + if in_channels != out_channels: + self.project = nn.Sequential( + nn.Conv2d(in_channels, out_channels, + kernel_size=1, stride=1, bias=False), + get_norm(cfg.MODEL.DLA.NORM, out_channels) + ) + + def forward(self, x, residual=None, children=None): + if self.training and residual is not None: + x = x + residual.sum() * 0.0 + children = [] if children is None else children + bottom = self.downsample(x) if self.downsample else x + residual = self.project(bottom) if self.project else bottom + if self.level_root: + children.append(bottom) + x1 = self.tree1(x, residual) + if self.levels == 1: + x2 = self.tree2(x1) + x = self.root(x2, x1, *children) + else: + children.append(x1) + x = self.tree2(x1, children=children) + return x + + +class DLA(Backbone): + def __init__(self, cfg, levels, channels, block=BasicBlock, residual_root=False): + super(DLA, self).__init__() + self.cfg = cfg + self.channels = channels + + self._out_features = ["dla{}".format(i) for i in range(6)] + self._out_feature_channels = {k: channels[i] for i, k in enumerate(self._out_features)} + self._out_feature_strides = {k: 2 ** i for i, k in enumerate(self._out_features)} + + self.base_layer = nn.Sequential( + nn.Conv2d(3, channels[0], kernel_size=7, stride=1, + padding=3, bias=False), + get_norm(cfg.MODEL.DLA.NORM, channels[0]), + nn.ReLU(inplace=True)) + self.level0 = self._make_conv_level( + channels[0], channels[0], levels[0]) + self.level1 = self._make_conv_level( + channels[0], channels[1], levels[1], stride=2) + self.level2 = Tree(cfg, levels[2], block, channels[1], channels[2], 2, + level_root=False, + root_residual=residual_root) + self.level3 = Tree(cfg, levels[3], block, channels[2], channels[3], 2, + level_root=True, root_residual=residual_root) + self.level4 = Tree(cfg, levels[4], block, channels[3], channels[4], 2, + level_root=True, root_residual=residual_root) + self.level5 = Tree(cfg, levels[5], block, channels[4], channels[5], 2, + level_root=True, root_residual=residual_root) + + for m in self.modules(): + if isinstance(m, nn.Conv2d): + n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels + m.weight.data.normal_(0, math.sqrt(2. 
/ n)) + + self.load_pretrained_model( + data='imagenet', name='dla34', hash='ba72cf86') + + def load_pretrained_model(self, data, name, hash): + model_url = get_model_url(data, name, hash) + model_weights = model_zoo.load_url(model_url) + del model_weights['fc.weight'] + del model_weights['fc.bias'] + print('Loading pretrained DLA!') + self.load_state_dict(model_weights, strict=True) + + def _make_conv_level(self, inplanes, planes, convs, stride=1, dilation=1): + modules = [] + for i in range(convs): + modules.extend([ + nn.Conv2d(inplanes, planes, kernel_size=3, + stride=stride if i == 0 else 1, + padding=dilation, bias=False, dilation=dilation), + get_norm(self.cfg.MODEL.DLA.NORM, planes), + nn.ReLU(inplace=True)]) + inplanes = planes + return nn.Sequential(*modules) + + def forward(self, x): + y = {} + x = self.base_layer(x) + for i in range(6): + name = 'level{}'.format(i) + x = getattr(self, name)(x) + y['dla{}'.format(i)] = x + return y + + +def fill_up_weights(up): + w = up.weight.data + f = math.ceil(w.size(2) / 2) + c = (2 * f - 1 - f % 2) / (2. * f) + for i in range(w.size(2)): + for j in range(w.size(3)): + w[0, 0, i, j] = \ + (1 - math.fabs(i / f - c)) * (1 - math.fabs(j / f - c)) + for c in range(1, w.size(0)): + w[c, 0, :, :] = w[0, 0, :, :] + + +class Conv(nn.Module): + def __init__(self, chi, cho, norm): + super(Conv, self).__init__() + self.conv = nn.Sequential( + nn.Conv2d(chi, cho, kernel_size=1, stride=1, bias=False), + get_norm(norm, cho), + nn.ReLU(inplace=True)) + + def forward(self, x): + return self.conv(x) + + +class DeformConv(nn.Module): + def __init__(self, chi, cho, norm): + super(DeformConv, self).__init__() + self.actf = nn.Sequential( + get_norm(norm, cho), + nn.ReLU(inplace=True) + ) + self.offset = Conv2d( + chi, 27, kernel_size=3, stride=1, + padding=1, dilation=1) + self.conv = ModulatedDeformConv( + chi, cho, kernel_size=3, stride=1, padding=1, + dilation=1, deformable_groups=1) + nn.init.constant_(self.offset.weight, 0) + nn.init.constant_(self.offset.bias, 0) + + def forward(self, x): + offset_mask = self.offset(x) + offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) + offset = torch.cat((offset_x, offset_y), dim=1) + mask = mask.sigmoid() + x = self.conv(x, offset, mask) + x = self.actf(x) + return x + + +class IDAUp(nn.Module): + def __init__(self, o, channels, up_f, norm='FrozenBN', node_type=Conv): + super(IDAUp, self).__init__() + for i in range(1, len(channels)): + c = channels[i] + f = int(up_f[i]) + proj = node_type(c, o, norm) + node = node_type(o, o, norm) + + up = nn.ConvTranspose2d(o, o, f * 2, stride=f, + padding=f // 2, output_padding=0, + groups=o, bias=False) + fill_up_weights(up) + + setattr(self, 'proj_' + str(i), proj) + setattr(self, 'up_' + str(i), up) + setattr(self, 'node_' + str(i), node) + + + def forward(self, layers, startp, endp): + for i in range(startp + 1, endp): + upsample = getattr(self, 'up_' + str(i - startp)) + project = getattr(self, 'proj_' + str(i - startp)) + layers[i] = upsample(project(layers[i])) + node = getattr(self, 'node_' + str(i - startp)) + layers[i] = node(layers[i] + layers[i - 1]) + + +DLAUP_NODE_MAP = { + 'conv': Conv, + 'dcn': DeformConv, +} + +class DLAUP(Backbone): + def __init__(self, bottom_up, in_features, norm, dlaup_node='conv'): + super(DLAUP, self).__init__() + assert isinstance(bottom_up, Backbone) + self.bottom_up = bottom_up + input_shapes = bottom_up.output_shape() + in_strides = [input_shapes[f].stride for f in in_features] + in_channels = [input_shapes[f].channels for f 
in in_features] + in_levels = [int(math.log2(input_shapes[f].stride)) for f in in_features] + self.in_features = in_features + out_features = ['dlaup{}'.format(l) for l in in_levels] + self._out_features = out_features + self._out_feature_channels = { + 'dlaup{}'.format(l): in_channels[i] for i, l in enumerate(in_levels)} + self._out_feature_strides = { + 'dlaup{}'.format(l): 2 ** l for l in in_levels} + + print('self._out_features', self._out_features) + print('self._out_feature_channels', self._out_feature_channels) + print('self._out_feature_strides', self._out_feature_strides) + self._size_divisibility = 32 + + node_type = DLAUP_NODE_MAP[dlaup_node] + + self.startp = int(math.log2(in_strides[0])) + self.channels = in_channels + channels = list(in_channels) + scales = np.array([2 ** i for i in range(len(out_features))], dtype=int) + for i in range(len(channels) - 1): + j = -i - 2 + setattr(self, 'ida_{}'.format(i), + IDAUp(channels[j], in_channels[j:], + scales[j:] // scales[j], + norm=norm, + node_type=node_type)) + scales[j + 1:] = scales[j] + in_channels[j + 1:] = [channels[j] for _ in channels[j + 1:]] + + @property + def size_divisibility(self): + return self._size_divisibility + + def forward(self, x): + bottom_up_features = self.bottom_up(x) + layers = [bottom_up_features[f] for f in self.in_features] + out = [layers[-1]] # start with 32 + for i in range(len(layers) - 1): + ida = getattr(self, 'ida_{}'.format(i)) + ida(layers, len(layers) - i - 2, len(layers)) + out.insert(0, layers[-1]) + ret = {} + for k, v in zip(self._out_features, out): + ret[k] = v + # import pdb; pdb.set_trace() + return ret + + +def dla34(cfg, pretrained=None): # DLA-34 + model = DLA(cfg, [1, 1, 1, 2, 2, 1], + [16, 32, 64, 128, 256, 512], + block=BasicBlock) + return model + + +class LastLevelP6P7(nn.Module): + """ + This module is used in RetinaNet to generate extra layers, P6 and P7 from + C5 feature. + """ + + def __init__(self, in_channels, out_channels): + super().__init__() + self.num_levels = 2 + self.in_feature = "dla5" + self.p6 = nn.Conv2d(in_channels, out_channels, 3, 2, 1) + self.p7 = nn.Conv2d(out_channels, out_channels, 3, 2, 1) + for module in [self.p6, self.p7]: + weight_init.c2_xavier_fill(module) + + def forward(self, c5): + p6 = self.p6(c5) + p7 = self.p7(F.relu(p6)) + return [p6, p7] + + +# @BACKBONE_REGISTRY.register() +def build_dla_fpn3_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + + depth_to_creator = {"dla34": dla34} + bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.FPN.OUT_CHANNELS + + backbone = FPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + norm=cfg.MODEL.FPN.NORM, + top_block=None, + fuse_type=cfg.MODEL.FPN.FUSE_TYPE, + ) + + return backbone + +# @BACKBONE_REGISTRY.register() +def build_dla_fpn5_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. 
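+
+    Unlike build_dla_fpn3_backbone above, this variant adds a LastLevelP6P7
+    top block to the FPN, deriving P6 and P7 from the stride-32 "dla5"
+    feature map with strided 3x3 convolutions.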
+ """ + + depth_to_creator = {"dla34": dla34} + bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.FPN.OUT_CHANNELS + in_channels_top = bottom_up.output_shape()['dla5'].channels + + backbone = FPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + norm=cfg.MODEL.FPN.NORM, + top_block=LastLevelP6P7(in_channels_top, out_channels), + fuse_type=cfg.MODEL.FPN.FUSE_TYPE, + ) + + return backbone + + +# @BACKBONE_REGISTRY.register() +def build_dlaup_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + + depth_to_creator = {"dla34": dla34} + bottom_up = depth_to_creator['dla{}'.format(cfg.MODEL.DLA.NUM_LAYERS)](cfg) + + backbone = DLAUP( + bottom_up=bottom_up, + in_features=cfg.MODEL.DLA.DLAUP_IN_FEATURES, + norm=cfg.MODEL.DLA.NORM, + dlaup_node=cfg.MODEL.DLA.DLAUP_NODE, + ) + + return backbone diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py old mode 100644 new mode 100755 similarity index 97% rename from vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py rename to vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py index cc4e7a49..34544547 --- a/vbench/third_party/grit_src/grit/modeling/backbone/centernet_last_layer.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py @@ -33,7 +33,7 @@ def forward(self, c5): return [p6, p7] -@BACKBONE_REGISTRY.register() +# @BACKBONE_REGISTRY.register() def build_p67_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -55,7 +55,7 @@ def build_p67_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): ) return backbone -@BACKBONE_REGISTRY.register() +# @BACKBONE_REGISTRY.register() def build_p35_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py new file mode 100755 index 00000000..20ea930b --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py @@ -0,0 +1,802 @@ +# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved +# This file is modified from https://github.com/Res2Net/Res2Net-detectron2/blob/master/detectron2/modeling/backbone/resnet.py +# The original file is under Apache-2.0 License +import numpy as np +import fvcore.nn.weight_init as weight_init +import torch +import torch.nn.functional as F +from torch import nn + +from detectron2.layers import ( + CNNBlockBase, + Conv2d, + DeformConv, + ModulatedDeformConv, + ShapeSpec, + get_norm, +) + +from detectron2.modeling.backbone import Backbone +from detectron2.modeling.backbone.fpn import FPN +from detectron2.modeling.backbone.build import BACKBONE_REGISTRY +from .fpn_p5 import LastLevelP6P7_P5 +from .bifpn import BiFPN + +__all__ = [ + "ResNetBlockBase", + "BasicBlock", + "BottleneckBlock", + "DeformBottleneckBlock", + "BasicStem", + "ResNet", + "make_stage", + "build_res2net_backbone", +] + + +ResNetBlockBase = CNNBlockBase +""" +Alias for backward compatibiltiy. 
+""" + + +class BasicBlock(CNNBlockBase): + """ + The basic residual block for ResNet-18 and ResNet-34, with two 3x3 conv layers + and a projection shortcut if needed. + """ + + def __init__(self, in_channels, out_channels, *, stride=1, norm="BN"): + """ + Args: + in_channels (int): Number of input channels. + out_channels (int): Number of output channels. + stride (int): Stride for the first conv. + norm (str or callable): normalization for all conv layers. + See :func:`layers.get_norm` for supported format. + """ + super().__init__(in_channels, out_channels, stride) + + if in_channels != out_channels: + self.shortcut = Conv2d( + in_channels, + out_channels, + kernel_size=1, + stride=stride, + bias=False, + norm=get_norm(norm, out_channels), + ) + else: + self.shortcut = None + + self.conv1 = Conv2d( + in_channels, + out_channels, + kernel_size=3, + stride=stride, + padding=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + + self.conv2 = Conv2d( + out_channels, + out_channels, + kernel_size=3, + stride=1, + padding=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + + for layer in [self.conv1, self.conv2, self.shortcut]: + if layer is not None: # shortcut can be None + weight_init.c2_msra_fill(layer) + + def forward(self, x): + out = self.conv1(x) + out = F.relu_(out) + out = self.conv2(out) + + if self.shortcut is not None: + shortcut = self.shortcut(x) + else: + shortcut = x + + out += shortcut + out = F.relu_(out) + return out + + +class BottleneckBlock(CNNBlockBase): + """ + The standard bottle2neck residual block used by Res2Net-50, 101 and 152. + """ + + def __init__( + self, + in_channels, + out_channels, + *, + bottleneck_channels, + stride=1, + num_groups=1, + norm="BN", + stride_in_1x1=False, + dilation=1, + basewidth=26, + scale=4, + ): + """ + Args: + bottleneck_channels (int): number of output channels for the 3x3 + "bottleneck" conv layers. + num_groups (int): number of groups for the 3x3 conv layer. + norm (str or callable): normalization for all conv layers. + See :func:`layers.get_norm` for supported format. + stride_in_1x1 (bool): when stride>1, whether to put stride in the + first 1x1 convolution or the bottleneck 3x3 convolution. + dilation (int): the dilation rate of the 3x3 conv layer. 
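+            basewidth (int): Res2Net base width; kept for interface
+                compatibility but not used directly in this block.
+            scale (int): Res2Net scale, i.e. the number of width-wise
+                groups the bottleneck is split into; each 3x3 conv below
+                operates on bottleneck_channels // scale channels.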
+ """ + super().__init__(in_channels, out_channels, stride) + + if in_channels != out_channels: + self.shortcut = nn.Sequential( + nn.AvgPool2d(kernel_size=stride, stride=stride, + ceil_mode=True, count_include_pad=False), + Conv2d( + in_channels, + out_channels, + kernel_size=1, + stride=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + ) + else: + self.shortcut = None + + # The original MSRA ResNet models have stride in the first 1x1 conv + # The subsequent fb.torch.resnet and Caffe2 ResNe[X]t implementations have + # stride in the 3x3 conv + stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) + width = bottleneck_channels//scale + + self.conv1 = Conv2d( + in_channels, + bottleneck_channels, + kernel_size=1, + stride=stride_1x1, + bias=False, + norm=get_norm(norm, bottleneck_channels), + ) + if scale == 1: + self.nums = 1 + else: + self.nums = scale -1 + if self.in_channels!=self.out_channels and stride_3x3!=2: + self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) + + convs = [] + bns = [] + for i in range(self.nums): + convs.append(nn.Conv2d( + width, + width, + kernel_size=3, + stride=stride_3x3, + padding=1 * dilation, + bias=False, + groups=num_groups, + dilation=dilation, + )) + bns.append(get_norm(norm, width)) + self.convs = nn.ModuleList(convs) + self.bns = nn.ModuleList(bns) + + self.conv3 = Conv2d( + bottleneck_channels, + out_channels, + kernel_size=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + self.scale = scale + self.width = width + self.in_channels = in_channels + self.out_channels = out_channels + self.stride_3x3 = stride_3x3 + for layer in [self.conv1, self.conv3]: + if layer is not None: # shortcut can be None + weight_init.c2_msra_fill(layer) + if self.shortcut is not None: + for layer in self.shortcut.modules(): + if isinstance(layer, Conv2d): + weight_init.c2_msra_fill(layer) + + for layer in self.convs: + if layer is not None: # shortcut can be None + weight_init.c2_msra_fill(layer) + + # Zero-initialize the last normalization in each residual branch, + # so that at the beginning, the residual branch starts with zeros, + # and each residual block behaves like an identity. + # See Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": + # "For BN layers, the learnable scaling coefficient γ is initialized + # to be 1, except for each residual block's last BN + # where γ is initialized to be 0." + + # nn.init.constant_(self.conv3.norm.weight, 0) + # TODO this somehow hurts performance when training GN models from scratch. + # Add it as an option when we need to use this code to train a backbone. + + def forward(self, x): + out = self.conv1(x) + out = F.relu_(out) + + spx = torch.split(out, self.width, 1) + for i in range(self.nums): + if i==0 or self.in_channels!=self.out_channels: + sp = spx[i] + else: + sp = sp + spx[i] + sp = self.convs[i](sp) + sp = F.relu_(self.bns[i](sp)) + if i==0: + out = sp + else: + out = torch.cat((out, sp), 1) + if self.scale!=1 and self.stride_3x3==1: + out = torch.cat((out, spx[self.nums]), 1) + elif self.scale != 1 and self.stride_3x3==2: + out = torch.cat((out, self.pool(spx[self.nums])), 1) + + out = self.conv3(out) + + if self.shortcut is not None: + shortcut = self.shortcut(x) + else: + shortcut = x + + out += shortcut + out = F.relu_(out) + return out + + +class DeformBottleneckBlock(ResNetBlockBase): + """ + Not implemented for res2net yet. + Similar to :class:`BottleneckBlock`, but with deformable conv in the 3x3 convolution. 
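+    The implementation below mirrors the width-wise split of
+    :class:`BottleneckBlock`: each of the `scale - 1` groups gets its own
+    offset predictor in `conv2_offsets` and its own deformable 3x3 conv in
+    `convs`.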
+ """ + + def __init__( + self, + in_channels, + out_channels, + *, + bottleneck_channels, + stride=1, + num_groups=1, + norm="BN", + stride_in_1x1=False, + dilation=1, + deform_modulated=False, + deform_num_groups=1, + basewidth=26, + scale=4, + ): + super().__init__(in_channels, out_channels, stride) + self.deform_modulated = deform_modulated + + if in_channels != out_channels: + # self.shortcut = Conv2d( + # in_channels, + # out_channels, + # kernel_size=1, + # stride=stride, + # bias=False, + # norm=get_norm(norm, out_channels), + # ) + self.shortcut = nn.Sequential( + nn.AvgPool2d(kernel_size=stride, stride=stride, + ceil_mode=True, count_include_pad=False), + Conv2d( + in_channels, + out_channels, + kernel_size=1, + stride=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + ) + else: + self.shortcut = None + + stride_1x1, stride_3x3 = (stride, 1) if stride_in_1x1 else (1, stride) + width = bottleneck_channels//scale + + self.conv1 = Conv2d( + in_channels, + bottleneck_channels, + kernel_size=1, + stride=stride_1x1, + bias=False, + norm=get_norm(norm, bottleneck_channels), + ) + + if scale == 1: + self.nums = 1 + else: + self.nums = scale -1 + if self.in_channels!=self.out_channels and stride_3x3!=2: + self.pool = nn.AvgPool2d(kernel_size=3, stride = stride_3x3, padding=1) + + if deform_modulated: + deform_conv_op = ModulatedDeformConv + # offset channels are 2 or 3 (if with modulated) * kernel_size * kernel_size + offset_channels = 27 + else: + deform_conv_op = DeformConv + offset_channels = 18 + + # self.conv2_offset = Conv2d( + # bottleneck_channels, + # offset_channels * deform_num_groups, + # kernel_size=3, + # stride=stride_3x3, + # padding=1 * dilation, + # dilation=dilation, + # ) + # self.conv2 = deform_conv_op( + # bottleneck_channels, + # bottleneck_channels, + # kernel_size=3, + # stride=stride_3x3, + # padding=1 * dilation, + # bias=False, + # groups=num_groups, + # dilation=dilation, + # deformable_groups=deform_num_groups, + # norm=get_norm(norm, bottleneck_channels), + # ) + + conv2_offsets = [] + convs = [] + bns = [] + for i in range(self.nums): + conv2_offsets.append(Conv2d( + width, + offset_channels * deform_num_groups, + kernel_size=3, + stride=stride_3x3, + padding=1 * dilation, + bias=False, + groups=num_groups, + dilation=dilation, + )) + convs.append(deform_conv_op( + width, + width, + kernel_size=3, + stride=stride_3x3, + padding=1 * dilation, + bias=False, + groups=num_groups, + dilation=dilation, + deformable_groups=deform_num_groups, + )) + bns.append(get_norm(norm, width)) + self.conv2_offsets = nn.ModuleList(conv2_offsets) + self.convs = nn.ModuleList(convs) + self.bns = nn.ModuleList(bns) + + self.conv3 = Conv2d( + bottleneck_channels, + out_channels, + kernel_size=1, + bias=False, + norm=get_norm(norm, out_channels), + ) + self.scale = scale + self.width = width + self.in_channels = in_channels + self.out_channels = out_channels + self.stride_3x3 = stride_3x3 + # for layer in [self.conv1, self.conv2, self.conv3, self.shortcut]: + # if layer is not None: # shortcut can be None + # weight_init.c2_msra_fill(layer) + + # nn.init.constant_(self.conv2_offset.weight, 0) + # nn.init.constant_(self.conv2_offset.bias, 0) + for layer in [self.conv1, self.conv3]: + if layer is not None: # shortcut can be None + weight_init.c2_msra_fill(layer) + if self.shortcut is not None: + for layer in self.shortcut.modules(): + if isinstance(layer, Conv2d): + weight_init.c2_msra_fill(layer) + + for layer in self.convs: + if layer is not None: # shortcut can be None + 
weight_init.c2_msra_fill(layer) + + for layer in self.conv2_offsets: + if layer.weight is not None: + nn.init.constant_(layer.weight, 0) + if layer.bias is not None: + nn.init.constant_(layer.bias, 0) + + def forward(self, x): + out = self.conv1(x) + out = F.relu_(out) + + # if self.deform_modulated: + # offset_mask = self.conv2_offset(out) + # offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) + # offset = torch.cat((offset_x, offset_y), dim=1) + # mask = mask.sigmoid() + # out = self.conv2(out, offset, mask) + # else: + # offset = self.conv2_offset(out) + # out = self.conv2(out, offset) + # out = F.relu_(out) + + spx = torch.split(out, self.width, 1) + for i in range(self.nums): + if i==0 or self.in_channels!=self.out_channels: + sp = spx[i].contiguous() + else: + sp = sp + spx[i].contiguous() + + # sp = self.convs[i](sp) + if self.deform_modulated: + offset_mask = self.conv2_offsets[i](sp) + offset_x, offset_y, mask = torch.chunk(offset_mask, 3, dim=1) + offset = torch.cat((offset_x, offset_y), dim=1) + mask = mask.sigmoid() + sp = self.convs[i](sp, offset, mask) + else: + offset = self.conv2_offsets[i](sp) + sp = self.convs[i](sp, offset) + sp = F.relu_(self.bns[i](sp)) + if i==0: + out = sp + else: + out = torch.cat((out, sp), 1) + if self.scale!=1 and self.stride_3x3==1: + out = torch.cat((out, spx[self.nums]), 1) + elif self.scale != 1 and self.stride_3x3==2: + out = torch.cat((out, self.pool(spx[self.nums])), 1) + + out = self.conv3(out) + + if self.shortcut is not None: + shortcut = self.shortcut(x) + else: + shortcut = x + + out += shortcut + out = F.relu_(out) + return out + + +def make_stage(block_class, num_blocks, first_stride, *, in_channels, out_channels, **kwargs): + """ + Create a list of blocks just like those in a ResNet stage. + Args: + block_class (type): a subclass of ResNetBlockBase + num_blocks (int): + first_stride (int): the stride of the first block. The other blocks will have stride=1. + in_channels (int): input channels of the entire stage. + out_channels (int): output channels of **every block** in the stage. + kwargs: other arguments passed to the constructor of every block. + Returns: + list[nn.Module]: a list of block module. + """ + assert "stride" not in kwargs, "Stride of blocks in make_stage cannot be changed." + blocks = [] + for i in range(num_blocks): + blocks.append( + block_class( + in_channels=in_channels, + out_channels=out_channels, + stride=first_stride if i == 0 else 1, + **kwargs, + ) + ) + in_channels = out_channels + return blocks + + +class BasicStem(CNNBlockBase): + """ + The standard ResNet stem (layers before the first residual block). + """ + + def __init__(self, in_channels=3, out_channels=64, norm="BN"): + """ + Args: + norm (str or callable): norm after the first conv layer. + See :func:`layers.get_norm` for supported format. 
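+
+        Unlike the standard single 7x7 stem, this Res2Net stem stacks three
+        3x3 convs (in_channels -> 32 -> 32 -> out_channels, the first with
+        stride 2) before the stride-2 max pool, keeping the overall stem
+        stride at 4.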
+ """ + super().__init__(in_channels, out_channels, 4) + self.in_channels = in_channels + self.conv1 = nn.Sequential( + Conv2d( + in_channels, + 32, + kernel_size=3, + stride=2, + padding=1, + bias=False, + ), + get_norm(norm, 32), + nn.ReLU(inplace=True), + Conv2d( + 32, + 32, + kernel_size=3, + stride=1, + padding=1, + bias=False, + ), + get_norm(norm, 32), + nn.ReLU(inplace=True), + Conv2d( + 32, + out_channels, + kernel_size=3, + stride=1, + padding=1, + bias=False, + ), + ) + self.bn1 = get_norm(norm, out_channels) + + for layer in self.conv1: + if isinstance(layer, Conv2d): + weight_init.c2_msra_fill(layer) + + def forward(self, x): + x = self.conv1(x) + x = self.bn1(x) + x = F.relu_(x) + x = F.max_pool2d(x, kernel_size=3, stride=2, padding=1) + return x + + +class ResNet(Backbone): + def __init__(self, stem, stages, num_classes=None, out_features=None): + """ + Args: + stem (nn.Module): a stem module + stages (list[list[CNNBlockBase]]): several (typically 4) stages, + each contains multiple :class:`CNNBlockBase`. + num_classes (None or int): if None, will not perform classification. + Otherwise, will create a linear layer. + out_features (list[str]): name of the layers whose outputs should + be returned in forward. Can be anything in "stem", "linear", or "res2" ... + If None, will return the output of the last layer. + """ + super(ResNet, self).__init__() + self.stem = stem + self.num_classes = num_classes + + current_stride = self.stem.stride + self._out_feature_strides = {"stem": current_stride} + self._out_feature_channels = {"stem": self.stem.out_channels} + + self.stages_and_names = [] + for i, blocks in enumerate(stages): + assert len(blocks) > 0, len(blocks) + for block in blocks: + assert isinstance(block, CNNBlockBase), block + + name = "res" + str(i + 2) + stage = nn.Sequential(*blocks) + + self.add_module(name, stage) + self.stages_and_names.append((stage, name)) + + self._out_feature_strides[name] = current_stride = int( + current_stride * np.prod([k.stride for k in blocks]) + ) + self._out_feature_channels[name] = curr_channels = blocks[-1].out_channels + + if num_classes is not None: + self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) + self.linear = nn.Linear(curr_channels, num_classes) + + # Sec 5.1 in "Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour": + # "The 1000-way fully-connected layer is initialized by + # drawing weights from a zero-mean Gaussian with standard deviation of 0.01." + nn.init.normal_(self.linear.weight, std=0.01) + name = "linear" + + if out_features is None: + out_features = [name] + self._out_features = out_features + assert len(self._out_features) + children = [x[0] for x in self.named_children()] + for out_feature in self._out_features: + assert out_feature in children, "Available children: {}".format(", ".join(children)) + + def forward(self, x): + outputs = {} + x = self.stem(x) + if "stem" in self._out_features: + outputs["stem"] = x + for stage, name in self.stages_and_names: + x = stage(x) + if name in self._out_features: + outputs[name] = x + if self.num_classes is not None: + x = self.avgpool(x) + x = torch.flatten(x, 1) + x = self.linear(x) + if "linear" in self._out_features: + outputs["linear"] = x + return outputs + + def output_shape(self): + return { + name: ShapeSpec( + channels=self._out_feature_channels[name], stride=self._out_feature_strides[name] + ) + for name in self._out_features + } + + def freeze(self, freeze_at=0): + """ + Freeze the first several stages of the ResNet. Commonly used in + fine-tuning. 
+ Args: + freeze_at (int): number of stem and stages to freeze. + `1` means freezing the stem. `2` means freezing the stem and + the first stage, etc. + Returns: + nn.Module: this ResNet itself + """ + if freeze_at >= 1: + self.stem.freeze() + for idx, (stage, _) in enumerate(self.stages_and_names, start=2): + if freeze_at >= idx: + for block in stage.children(): + block.freeze() + return self + + +# @BACKBONE_REGISTRY.register() +def build_res2net_backbone(cfg, input_shape): + """ + Create a Res2Net instance from config. + Returns: + ResNet: a :class:`ResNet` instance. + """ + # need registration of new blocks/stems? + norm = cfg.MODEL.RESNETS.NORM + stem = BasicStem( + in_channels=input_shape.channels, + out_channels=cfg.MODEL.RESNETS.STEM_OUT_CHANNELS, + norm=norm, + ) + + # fmt: off + freeze_at = cfg.MODEL.BACKBONE.FREEZE_AT + out_features = cfg.MODEL.RESNETS.OUT_FEATURES + depth = cfg.MODEL.RESNETS.DEPTH + num_groups = cfg.MODEL.RESNETS.NUM_GROUPS + width_per_group = cfg.MODEL.RESNETS.WIDTH_PER_GROUP + scale = 4 + bottleneck_channels = num_groups * width_per_group * scale + in_channels = cfg.MODEL.RESNETS.STEM_OUT_CHANNELS + out_channels = cfg.MODEL.RESNETS.RES2_OUT_CHANNELS + stride_in_1x1 = cfg.MODEL.RESNETS.STRIDE_IN_1X1 + res5_dilation = cfg.MODEL.RESNETS.RES5_DILATION + deform_on_per_stage = cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE + deform_modulated = cfg.MODEL.RESNETS.DEFORM_MODULATED + deform_num_groups = cfg.MODEL.RESNETS.DEFORM_NUM_GROUPS + # fmt: on + assert res5_dilation in {1, 2}, "res5_dilation cannot be {}.".format(res5_dilation) + + num_blocks_per_stage = { + 18: [2, 2, 2, 2], + 34: [3, 4, 6, 3], + 50: [3, 4, 6, 3], + 101: [3, 4, 23, 3], + 152: [3, 8, 36, 3], + }[depth] + + if depth in [18, 34]: + assert out_channels == 64, "Must set MODEL.RESNETS.RES2_OUT_CHANNELS = 64 for R18/R34" + assert not any( + deform_on_per_stage + ), "MODEL.RESNETS.DEFORM_ON_PER_STAGE unsupported for R18/R34" + assert res5_dilation == 1, "Must set MODEL.RESNETS.RES5_DILATION = 1 for R18/R34" + assert num_groups == 1, "Must set MODEL.RESNETS.NUM_GROUPS = 1 for R18/R34" + + stages = [] + + # Avoid creating variables without gradients + # It consumes extra memory and may cause allreduce to fail + out_stage_idx = [{"res2": 2, "res3": 3, "res4": 4, "res5": 5}[f] for f in out_features] + max_stage_idx = max(out_stage_idx) + for idx, stage_idx in enumerate(range(2, max_stage_idx + 1)): + dilation = res5_dilation if stage_idx == 5 else 1 + first_stride = 1 if idx == 0 or (stage_idx == 5 and dilation == 2) else 2 + stage_kargs = { + "num_blocks": num_blocks_per_stage[idx], + "first_stride": first_stride, + "in_channels": in_channels, + "out_channels": out_channels, + "norm": norm, + } + # Use BasicBlock for R18 and R34. 
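+        # Each subsequent stage halves the resolution (first_stride=2 except
+        # for res2 and a dilated res5) and doubles both out_channels and the
+        # Res2Net bottleneck width, via the updates at the end of this loop.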
+ if depth in [18, 34]: + stage_kargs["block_class"] = BasicBlock + else: + stage_kargs["bottleneck_channels"] = bottleneck_channels + stage_kargs["stride_in_1x1"] = stride_in_1x1 + stage_kargs["dilation"] = dilation + stage_kargs["num_groups"] = num_groups + stage_kargs["scale"] = scale + + if deform_on_per_stage[idx]: + stage_kargs["block_class"] = DeformBottleneckBlock + stage_kargs["deform_modulated"] = deform_modulated + stage_kargs["deform_num_groups"] = deform_num_groups + else: + stage_kargs["block_class"] = BottleneckBlock + blocks = make_stage(**stage_kargs) + in_channels = out_channels + out_channels *= 2 + bottleneck_channels *= 2 + stages.append(blocks) + return ResNet(stem, stages, out_features=out_features).freeze(freeze_at) + + +# @BACKBONE_REGISTRY.register() +def build_p67_res2net_fpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = build_res2net_backbone(cfg, input_shape) + in_features = cfg.MODEL.FPN.IN_FEATURES + out_channels = cfg.MODEL.FPN.OUT_CHANNELS + backbone = FPN( + bottom_up=bottom_up, + in_features=in_features, + out_channels=out_channels, + norm=cfg.MODEL.FPN.NORM, + top_block=LastLevelP6P7_P5(out_channels, out_channels), + fuse_type=cfg.MODEL.FPN.FUSE_TYPE, + ) + return backbone + + +# @BACKBONE_REGISTRY.register() +def build_res2net_bifpn_backbone(cfg, input_shape: ShapeSpec): + """ + Args: + cfg: a detectron2 CfgNode + + Returns: + backbone (Backbone): backbone module, must be a subclass of :class:`Backbone`. + """ + bottom_up = build_res2net_backbone(cfg, input_shape) + in_features = cfg.MODEL.FPN.IN_FEATURES + backbone = BiFPN( + cfg=cfg, + bottom_up=bottom_up, + in_features=in_features, + out_channels=cfg.MODEL.BIFPN.OUT_CHANNELS, + norm=cfg.MODEL.BIFPN.NORM, + num_levels=cfg.MODEL.BIFPN.NUM_LEVELS, + num_bifpn=cfg.MODEL.BIFPN.NUM_BIFPN, + separable_conv=cfg.MODEL.BIFPN.SEPARABLE_CONV, + ) + return backbone diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/debug.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/debug.py new file mode 100755 index 00000000..0a4437fb --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/debug.py @@ -0,0 +1,283 @@ +import cv2 +import numpy as np +import torch +import torch.nn.functional as F + +COLORS = ((np.random.rand(1300, 3) * 0.4 + 0.6) * 255).astype( + np.uint8).reshape(1300, 1, 1, 3) + +def _get_color_image(heatmap): + heatmap = heatmap.reshape( + heatmap.shape[0], heatmap.shape[1], heatmap.shape[2], 1) + if heatmap.shape[0] == 1: + color_map = (heatmap * np.ones((1, 1, 1, 3), np.uint8) * 255).max( + axis=0).astype(np.uint8) # H, W, 3 + else: + color_map = (heatmap * COLORS[:heatmap.shape[0]]).max(axis=0).astype(np.uint8) # H, W, 3 + + return color_map + +def _blend_image(image, color_map, a=0.7): + color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) + ret = np.clip(image * (1 - a) + color_map * a, 0, 255).astype(np.uint8) + return ret + +def _blend_image_heatmaps(image, color_maps, a=0.7): + merges = np.zeros((image.shape[0], image.shape[1], 3), np.float32) + for color_map in color_maps: + color_map = cv2.resize(color_map, (image.shape[1], image.shape[0])) + merges = np.maximum(merges, color_map) + ret = np.clip(image * (1 - a) + merges * a, 0, 255).astype(np.uint8) + return ret + +def _decompose_level(x, shapes_per_level, N): + ''' + x: LNHiWi x C + ''' + x = x.view(x.shape[0], -1) + ret = [] + st 
= 0 + for l in range(len(shapes_per_level)): + ret.append([]) + h = shapes_per_level[l][0].int().item() + w = shapes_per_level[l][1].int().item() + for i in range(N): + ret[l].append(x[st + h * w * i:st + h * w * (i + 1)].view( + h, w, -1).permute(2, 0, 1)) + st += h * w * N + return ret + +def _imagelist_to_tensor(images): + images = [x for x in images] + image_sizes = [x.shape[-2:] for x in images] + h = max([size[0] for size in image_sizes]) + w = max([size[1] for size in image_sizes]) + S = 32 + h, w = ((h - 1) // S + 1) * S, ((w - 1) // S + 1) * S + images = [F.pad(x, (0, w - x.shape[2], 0, h - x.shape[1], 0, 0)) \ + for x in images] + images = torch.stack(images) + return images + + +def _ind2il(ind, shapes_per_level, N): + r = ind + l = 0 + S = 0 + while r - S >= N * shapes_per_level[l][0] * shapes_per_level[l][1]: + S += N * shapes_per_level[l][0] * shapes_per_level[l][1] + l += 1 + i = (r - S) // (shapes_per_level[l][0] * shapes_per_level[l][1]) + return i, l + +def debug_train( + images, gt_instances, flattened_hms, reg_targets, labels, pos_inds, + shapes_per_level, locations, strides): + ''' + images: N x 3 x H x W + flattened_hms: LNHiWi x C + shapes_per_level: L x 2 [(H_i, W_i)] + locations: LNHiWi x 2 + ''' + reg_inds = torch.nonzero( + reg_targets.max(dim=1)[0] > 0).squeeze(1) + N = len(images) + images = _imagelist_to_tensor(images) + repeated_locations = [torch.cat([loc] * N, dim=0) \ + for loc in locations] + locations = torch.cat(repeated_locations, dim=0) + gt_hms = _decompose_level(flattened_hms, shapes_per_level, N) + masks = flattened_hms.new_zeros((flattened_hms.shape[0], 1)) + masks[pos_inds] = 1 + masks = _decompose_level(masks, shapes_per_level, N) + for i in range(len(images)): + image = images[i].detach().cpu().numpy().transpose(1, 2, 0) + color_maps = [] + for l in range(len(gt_hms)): + color_map = _get_color_image( + gt_hms[l][i].detach().cpu().numpy()) + color_maps.append(color_map) + cv2.imshow('gthm_{}'.format(l), color_map) + blend = _blend_image_heatmaps(image.copy(), color_maps) + if gt_instances is not None: + bboxes = gt_instances[i].gt_boxes.tensor + for j in range(len(bboxes)): + bbox = bboxes[j] + cv2.rectangle( + blend, + (int(bbox[0]), int(bbox[1])), + (int(bbox[2]), int(bbox[3])), + (0, 0, 255), 3, cv2.LINE_AA) + + for j in range(len(pos_inds)): + image_id, l = _ind2il(pos_inds[j], shapes_per_level, N) + if image_id != i: + continue + loc = locations[pos_inds[j]] + cv2.drawMarker( + blend, (int(loc[0]), int(loc[1])), (0, 255, 255), + markerSize=(l + 1) * 16) + + for j in range(len(reg_inds)): + image_id, l = _ind2il(reg_inds[j], shapes_per_level, N) + if image_id != i: + continue + ltrb = reg_targets[reg_inds[j]] + ltrb *= strides[l] + loc = locations[reg_inds[j]] + bbox = [(loc[0] - ltrb[0]), (loc[1] - ltrb[1]), + (loc[0] + ltrb[2]), (loc[1] + ltrb[3])] + cv2.rectangle( + blend, + (int(bbox[0]), int(bbox[1])), + (int(bbox[2]), int(bbox[3])), + (255, 0, 0), 1, cv2.LINE_AA) + cv2.circle(blend, (int(loc[0]), int(loc[1])), 2, (255, 0, 0), -1) + + cv2.imshow('blend', blend) + cv2.waitKey() + + +def debug_test( + images, logits_pred, reg_pred, agn_hm_pred=[], preds=[], + vis_thresh=0.3, debug_show_name=False, mult_agn=False): + ''' + images: N x 3 x H x W + class_target: LNHiWi x C + cat_agn_heatmap: LNHiWi + shapes_per_level: L x 2 [(H_i, W_i)] + ''' + N = len(images) + for i in range(len(images)): + image = images[i].detach().cpu().numpy().transpose(1, 2, 0) + result = image.copy().astype(np.uint8) + pred_image = image.copy().astype(np.uint8) + 
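+        # For each image: collect one colorized heatmap per FPN level,
+        # recover each level's stride from the image/feature size ratio,
+        # then overlay boxes above vis_thresh on a copy of the input image.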
color_maps = [] + L = len(logits_pred) + for l in range(L): + if logits_pred[0] is not None: + stride = min(image.shape[0], image.shape[1]) / min( + logits_pred[l][i].shape[1], logits_pred[l][i].shape[2]) + else: + stride = min(image.shape[0], image.shape[1]) / min( + agn_hm_pred[l][i].shape[1], agn_hm_pred[l][i].shape[2]) + stride = stride if stride < 60 else 64 if stride < 100 else 128 + if logits_pred[0] is not None: + if mult_agn: + logits_pred[l][i] = logits_pred[l][i] * agn_hm_pred[l][i] + color_map = _get_color_image( + logits_pred[l][i].detach().cpu().numpy()) + color_maps.append(color_map) + cv2.imshow('predhm_{}'.format(l), color_map) + + if debug_show_name: + from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES + cat2name = [x['name'] for x in LVIS_CATEGORIES] + for j in range(len(preds[i].scores) if preds is not None else 0): + if preds[i].scores[j] > vis_thresh: + bbox = preds[i].proposal_boxes[j] \ + if preds[i].has('proposal_boxes') else \ + preds[i].pred_boxes[j] + bbox = bbox.tensor[0].detach().cpu().numpy().astype(np.int32) + cat = int(preds[i].pred_classes[j]) \ + if preds[i].has('pred_classes') else 0 + cl = COLORS[cat, 0, 0] + cv2.rectangle( + pred_image, (int(bbox[0]), int(bbox[1])), + (int(bbox[2]), int(bbox[3])), + (int(cl[0]), int(cl[1]), int(cl[2])), 2, cv2.LINE_AA) + if debug_show_name: + txt = '{}{:.1f}'.format( + cat2name[cat] if cat > 0 else '', + preds[i].scores[j]) + font = cv2.FONT_HERSHEY_SIMPLEX + cat_size = cv2.getTextSize(txt, font, 0.5, 2)[0] + cv2.rectangle( + pred_image, + (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), + (int(bbox[0] + cat_size[0]), int(bbox[1] - 2)), + (int(cl[0]), int(cl[1]), int(cl[2])), -1) + cv2.putText( + pred_image, txt, (int(bbox[0]), int(bbox[1] - 2)), + font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) + + + if agn_hm_pred[l] is not None: + agn_hm_ = agn_hm_pred[l][i, 0, :, :, None].detach().cpu().numpy() + agn_hm_ = (agn_hm_ * np.array([255, 255, 255]).reshape( + 1, 1, 3)).astype(np.uint8) + cv2.imshow('agn_hm_{}'.format(l), agn_hm_) + blend = _blend_image_heatmaps(image.copy(), color_maps) + cv2.imshow('blend', blend) + cv2.imshow('preds', pred_image) + cv2.waitKey() + +global cnt +cnt = 0 + +def debug_second_stage(images, instances, proposals=None, vis_thresh=0.3, + save_debug=False, debug_show_name=False): + images = _imagelist_to_tensor(images) + if debug_show_name: + from detectron2.data.datasets.lvis_v1_categories import LVIS_CATEGORIES + cat2name = [x['name'] for x in LVIS_CATEGORIES] + for i in range(len(images)): + image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() + if instances[i].has('gt_boxes'): + bboxes = instances[i].gt_boxes.tensor.cpu().numpy() + scores = np.ones(bboxes.shape[0]) + cats = instances[i].gt_classes.cpu().numpy() + else: + bboxes = instances[i].pred_boxes.tensor.cpu().numpy() + scores = instances[i].scores.cpu().numpy() + cats = instances[i].pred_classes.cpu().numpy() + for j in range(len(bboxes)): + if scores[j] > vis_thresh: + bbox = bboxes[j] + cl = COLORS[cats[j], 0, 0] + cl = (int(cl[0]), int(cl[1]), int(cl[2])) + cv2.rectangle( + image, + (int(bbox[0]), int(bbox[1])), + (int(bbox[2]), int(bbox[3])), + cl, 2, cv2.LINE_AA) + if debug_show_name: + cat = cats[j] + txt = '{}{:.1f}'.format( + cat2name[cat] if cat > 0 else '', + scores[j]) + font = cv2.FONT_HERSHEY_SIMPLEX + cat_size = cv2.getTextSize(txt, font, 0.5, 2)[0] + cv2.rectangle( + image, + (int(bbox[0]), int(bbox[1] - cat_size[1] - 2)), + (int(bbox[0] + cat_size[0]), 
int(bbox[1] - 2)), + (int(cl[0]), int(cl[1]), int(cl[2])), -1) + cv2.putText( + image, txt, (int(bbox[0]), int(bbox[1] - 2)), + font, 0.5, (0, 0, 0), thickness=1, lineType=cv2.LINE_AA) + if proposals is not None: + proposal_image = images[i].detach().cpu().numpy().transpose(1, 2, 0).astype(np.uint8).copy() + bboxes = proposals[i].proposal_boxes.tensor.cpu().numpy() + if proposals[i].has('scores'): + scores = proposals[i].scores.cpu().numpy() + else: + scores = proposals[i].objectness_logits.sigmoid().cpu().numpy() + for j in range(len(bboxes)): + if scores[j] > vis_thresh: + bbox = bboxes[j] + cl = (209, 159, 83) + cv2.rectangle( + proposal_image, + (int(bbox[0]), int(bbox[1])), + (int(bbox[2]), int(bbox[3])), + cl, 2, cv2.LINE_AA) + + cv2.imshow('image', image) + if proposals is not None: + cv2.imshow('proposals', proposal_image) + if save_debug: + global cnt + cnt += 1 + cv2.imwrite('output/save_debug/{}.jpg'.format(cnt), proposal_image) + cv2.waitKey() \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py new file mode 100755 index 00000000..4794e002 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py @@ -0,0 +1,864 @@ + +import math +import json +import copy +from typing import List, Dict +import numpy as np +import torch +from torch import nn +from torch.nn import functional as F + +from detectron2.modeling.proposal_generator.build import PROPOSAL_GENERATOR_REGISTRY +from detectron2.layers import ShapeSpec, cat +from detectron2.structures import Instances, Boxes +from detectron2.modeling import detector_postprocess +from detectron2.utils.comm import get_world_size +from detectron2.config import configurable + +from ..layers.heatmap_focal_loss import heatmap_focal_loss_jit +from ..layers.heatmap_focal_loss import binary_heatmap_focal_loss +from ..layers.iou_loss import IOULoss +from ..layers.ml_nms import ml_nms +from ..debug import debug_train, debug_test +from .utils import reduce_sum, _transpose +from .centernet_head import CenterNetHead + +__all__ = ["CenterNet"] + +INF = 100000000 + +# @PROPOSAL_GENERATOR_REGISTRY.register() +class CenterNet(nn.Module): + @configurable + def __init__(self, + # input_shape: Dict[str, ShapeSpec], + in_channels=256, + *, + num_classes=80, + in_features=("p3", "p4", "p5", "p6", "p7"), + strides=(8, 16, 32, 64, 128), + score_thresh=0.05, + hm_min_overlap=0.8, + loc_loss_type='giou', + min_radius=4, + hm_focal_alpha=0.25, + hm_focal_beta=4, + loss_gamma=2.0, + reg_weight=2.0, + not_norm_reg=True, + with_agn_hm=False, + only_proposal=False, + as_proposal=False, + not_nms=False, + pos_weight=1., + neg_weight=1., + sigmoid_clamp=1e-4, + ignore_high_fp=-1., + center_nms=False, + sizes_of_interest=[[0,80],[64,160],[128,320],[256,640],[512,10000000]], + more_pos=False, + more_pos_thresh=0.2, + more_pos_topk=9, + pre_nms_topk_train=1000, + pre_nms_topk_test=1000, + post_nms_topk_train=100, + post_nms_topk_test=100, + nms_thresh_train=0.6, + nms_thresh_test=0.6, + no_reduce=False, + debug=False, + vis_thresh=0.5, + pixel_mean=[103.530,116.280,123.675], + pixel_std=[1.0,1.0,1.0], + device='cuda', + centernet_head=None, + ): + 
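+        # Rough guide to the flags above: `only_proposal` makes the head
+        # class-agnostic (RPN-style proposals only, and requires
+        # `with_agn_hm`), `as_proposal` reuses class-specific boxes as
+        # proposals, and `more_pos` relabels low-loss center-3x3 locations
+        # as extra positives during training.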
super().__init__() + self.num_classes = num_classes + self.in_features = in_features + self.strides = strides + self.score_thresh = score_thresh + self.min_radius = min_radius + self.hm_focal_alpha = hm_focal_alpha + self.hm_focal_beta = hm_focal_beta + self.loss_gamma = loss_gamma + self.reg_weight = reg_weight + self.not_norm_reg = not_norm_reg + self.with_agn_hm = with_agn_hm + self.only_proposal = only_proposal + self.as_proposal = as_proposal + self.not_nms = not_nms + self.pos_weight = pos_weight + self.neg_weight = neg_weight + self.sigmoid_clamp = sigmoid_clamp + self.ignore_high_fp = ignore_high_fp + self.center_nms = center_nms + self.sizes_of_interest = sizes_of_interest + self.more_pos = more_pos + self.more_pos_thresh = more_pos_thresh + self.more_pos_topk = more_pos_topk + self.pre_nms_topk_train = pre_nms_topk_train + self.pre_nms_topk_test = pre_nms_topk_test + self.post_nms_topk_train = post_nms_topk_train + self.post_nms_topk_test = post_nms_topk_test + self.nms_thresh_train = nms_thresh_train + self.nms_thresh_test = nms_thresh_test + self.no_reduce = no_reduce + self.debug = debug + self.vis_thresh = vis_thresh + if self.center_nms: + self.not_nms = True + self.iou_loss = IOULoss(loc_loss_type) + assert (not self.only_proposal) or self.with_agn_hm + # delta for rendering heatmap + self.delta = (1 - hm_min_overlap) / (1 + hm_min_overlap) + if centernet_head is None: + self.centernet_head = CenterNetHead( + in_channels=in_channels, + num_levels=len(in_features), + with_agn_hm=with_agn_hm, + only_proposal=only_proposal) + else: + self.centernet_head = centernet_head + if self.debug: + pixel_mean = torch.Tensor(pixel_mean).to( + torch.device(device)).view(3, 1, 1) + pixel_std = torch.Tensor(pixel_std).to( + torch.device(device)).view(3, 1, 1) + self.denormalizer = lambda x: x * pixel_std + pixel_mean + + @classmethod + def from_config(cls, cfg, input_shape): + ret = { + # 'input_shape': input_shape, + 'in_channels': input_shape[ + cfg.MODEL.CENTERNET.IN_FEATURES[0]].channels, + 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, + 'in_features': cfg.MODEL.CENTERNET.IN_FEATURES, + 'strides': cfg.MODEL.CENTERNET.FPN_STRIDES, + 'score_thresh': cfg.MODEL.CENTERNET.INFERENCE_TH, + 'loc_loss_type': cfg.MODEL.CENTERNET.LOC_LOSS_TYPE, + 'hm_min_overlap': cfg.MODEL.CENTERNET.HM_MIN_OVERLAP, + 'min_radius': cfg.MODEL.CENTERNET.MIN_RADIUS, + 'hm_focal_alpha': cfg.MODEL.CENTERNET.HM_FOCAL_ALPHA, + 'hm_focal_beta': cfg.MODEL.CENTERNET.HM_FOCAL_BETA, + 'loss_gamma': cfg.MODEL.CENTERNET.LOSS_GAMMA, + 'reg_weight': cfg.MODEL.CENTERNET.REG_WEIGHT, + 'not_norm_reg': cfg.MODEL.CENTERNET.NOT_NORM_REG, + 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, + 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, + 'as_proposal': cfg.MODEL.CENTERNET.AS_PROPOSAL, + 'not_nms': cfg.MODEL.CENTERNET.NOT_NMS, + 'pos_weight': cfg.MODEL.CENTERNET.POS_WEIGHT, + 'neg_weight': cfg.MODEL.CENTERNET.NEG_WEIGHT, + 'sigmoid_clamp': cfg.MODEL.CENTERNET.SIGMOID_CLAMP, + 'ignore_high_fp': cfg.MODEL.CENTERNET.IGNORE_HIGH_FP, + 'center_nms': cfg.MODEL.CENTERNET.CENTER_NMS, + 'sizes_of_interest': cfg.MODEL.CENTERNET.SOI, + 'more_pos': cfg.MODEL.CENTERNET.MORE_POS, + 'more_pos_thresh': cfg.MODEL.CENTERNET.MORE_POS_THRESH, + 'more_pos_topk': cfg.MODEL.CENTERNET.MORE_POS_TOPK, + 'pre_nms_topk_train': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN, + 'pre_nms_topk_test': cfg.MODEL.CENTERNET.PRE_NMS_TOPK_TEST, + 'post_nms_topk_train': cfg.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN, + 'post_nms_topk_test': cfg.MODEL.CENTERNET.POST_NMS_TOPK_TEST, + 
            'nms_thresh_train': cfg.MODEL.CENTERNET.NMS_TH_TRAIN,
+            'nms_thresh_test': cfg.MODEL.CENTERNET.NMS_TH_TEST,
+            'no_reduce': cfg.MODEL.CENTERNET.NO_REDUCE,
+            'debug': cfg.DEBUG,
+            'vis_thresh': cfg.VIS_THRESH,
+            'pixel_mean': cfg.MODEL.PIXEL_MEAN,
+            'pixel_std': cfg.MODEL.PIXEL_STD,
+            'device': cfg.MODEL.DEVICE,
+            'centernet_head': CenterNetHead(
+                cfg, [input_shape[f] for f in cfg.MODEL.CENTERNET.IN_FEATURES]),
+        }
+        return ret
+
+
+    def forward(self, images, features_dict, gt_instances):
+        features = [features_dict[f] for f in self.in_features]
+        clss_per_level, reg_pred_per_level, agn_hm_pred_per_level = \
+            self.centernet_head(features)
+        grids = self.compute_grids(features)
+        shapes_per_level = grids[0].new_tensor(
+            [(x.shape[2], x.shape[3]) for x in reg_pred_per_level])
+
+        if not self.training:
+            return self.inference(
+                images, clss_per_level, reg_pred_per_level,
+                agn_hm_pred_per_level, grids)
+        else:
+            pos_inds, labels, reg_targets, flattened_hms = \
+                self._get_ground_truth(
+                    grids, shapes_per_level, gt_instances)
+            # logits_pred: M x F, reg_pred: M x 4, agn_hm_pred: M
+            logits_pred, reg_pred, agn_hm_pred = self._flatten_outputs(
+                clss_per_level, reg_pred_per_level, agn_hm_pred_per_level)
+
+            if self.more_pos:
+                # add more pixels as positive if \
+                # 1. they are within the center3x3 region of an object
+                # 2. their regression losses are small (< self.more_pos_thresh)
+                pos_inds, labels = self._add_more_pos(
+                    reg_pred, gt_instances, shapes_per_level)
+
+            losses = self.losses(
+                pos_inds, labels, reg_targets, flattened_hms,
+                logits_pred, reg_pred, agn_hm_pred)
+
+            proposals = None
+            if self.only_proposal:
+                agn_hm_pred_per_level = [x.sigmoid() for x in agn_hm_pred_per_level]
+                proposals = self.predict_instances(
+                    grids, agn_hm_pred_per_level, reg_pred_per_level,
+                    images.image_sizes, [None for _ in agn_hm_pred_per_level])
+            elif self.as_proposal: # category specific bbox as agnostic proposals
+                clss_per_level = [x.sigmoid() for x in clss_per_level]
+                proposals = self.predict_instances(
+                    grids, clss_per_level, reg_pred_per_level,
+                    images.image_sizes, agn_hm_pred_per_level)
+            if self.only_proposal or self.as_proposal:
+                for p in range(len(proposals)):
+                    proposals[p].proposal_boxes = proposals[p].get('pred_boxes')
+                    proposals[p].objectness_logits = proposals[p].get('scores')
+                    proposals[p].remove('pred_boxes')
+                    proposals[p].remove('scores')
+                    proposals[p].remove('pred_classes')
+
+            if self.debug:
+                debug_train(
+                    [self.denormalizer(x) for x in images],
+                    gt_instances, flattened_hms, reg_targets,
+                    labels, pos_inds, shapes_per_level, grids, self.strides)
+            return proposals, losses
+
+
+    def losses(
+        self, pos_inds, labels, reg_targets, flattened_hms,
+        logits_pred, reg_pred, agn_hm_pred):
+        '''
+        Inputs:
+            pos_inds: N
+            labels: N
+            reg_targets: M x 4
+            flattened_hms: M x C
+            logits_pred: M x C
+            reg_pred: M x 4
+            agn_hm_pred: M x 1 or None
+            N: number of positive locations in all images
+            M: number of pixels from all FPN levels
+            C: number of classes
+        '''
+        assert (torch.isfinite(reg_pred).all().item())
+        num_pos_local = pos_inds.numel()
+        num_gpus = get_world_size()
+        if self.no_reduce:
+            total_num_pos = num_pos_local * num_gpus
+        else:
+            total_num_pos = reduce_sum(
+                pos_inds.new_tensor([num_pos_local])).item()
+        num_pos_avg = max(1.0, total_num_pos / num_gpus)
+
+        losses = {}
+        if not self.only_proposal:
+            pos_loss, neg_loss = heatmap_focal_loss_jit(
+                logits_pred, flattened_hms, pos_inds, labels,
+                alpha=self.hm_focal_alpha,
+                beta=self.hm_focal_beta,
+                gamma=self.loss_gamma,
+                reduction='sum',
+                sigmoid_clamp=self.sigmoid_clamp,
+                ignore_high_fp=self.ignore_high_fp,
+            )
+            pos_loss = self.pos_weight * pos_loss / num_pos_avg
+            neg_loss = self.neg_weight * neg_loss / num_pos_avg
+            losses['loss_centernet_pos'] = pos_loss
+            losses['loss_centernet_neg'] = neg_loss
+
+        reg_inds = torch.nonzero(
+            reg_targets.max(dim=1)[0] >= 0).squeeze(1)
+        reg_pred = reg_pred[reg_inds]
+        reg_targets_pos = reg_targets[reg_inds]
+        reg_weight_map = flattened_hms.max(dim=1)[0]
+        reg_weight_map = reg_weight_map[reg_inds]
+        reg_weight_map = reg_weight_map * 0 + 1 \
+            if self.not_norm_reg else reg_weight_map
+        if self.no_reduce:
+            reg_norm = max(reg_weight_map.sum(), 1)
+        else:
+            reg_norm = max(reduce_sum(reg_weight_map.sum()).item() / num_gpus, 1)
+
+        reg_loss = self.reg_weight * self.iou_loss(
+            reg_pred, reg_targets_pos, reg_weight_map,
+            reduction='sum') / reg_norm
+        losses['loss_centernet_loc'] = reg_loss
+
+        if self.with_agn_hm:
+            cat_agn_heatmap = flattened_hms.max(dim=1)[0] # M
+            agn_pos_loss, agn_neg_loss = binary_heatmap_focal_loss(
+                agn_hm_pred, cat_agn_heatmap, pos_inds,
+                alpha=self.hm_focal_alpha,
+                beta=self.hm_focal_beta,
+                gamma=self.loss_gamma,
+                sigmoid_clamp=self.sigmoid_clamp,
+                ignore_high_fp=self.ignore_high_fp,
+            )
+            agn_pos_loss = self.pos_weight * agn_pos_loss / num_pos_avg
+            agn_neg_loss = self.neg_weight * agn_neg_loss / num_pos_avg
+            losses['loss_centernet_agn_pos'] = agn_pos_loss
+            losses['loss_centernet_agn_neg'] = agn_neg_loss
+
+        if self.debug:
+            print('losses', losses)
+            print('total_num_pos', total_num_pos)
+        return losses
+
+
+    def compute_grids(self, features):
+        grids = []
+        for level, feature in enumerate(features):
+            h, w = feature.size()[-2:]
+            shifts_x = torch.arange(
+                0, w * self.strides[level],
+                step=self.strides[level],
+                dtype=torch.float32, device=feature.device)
+            shifts_y = torch.arange(
+                0, h * self.strides[level],
+                step=self.strides[level],
+                dtype=torch.float32, device=feature.device)
+            shift_y, shift_x = torch.meshgrid(shifts_y, shifts_x)
+            shift_x = shift_x.reshape(-1)
+            shift_y = shift_y.reshape(-1)
+            grids_per_level = torch.stack((shift_x, shift_y), dim=1) + \
+                self.strides[level] // 2
+            grids.append(grids_per_level)
+        return grids
+
+
+    def _get_ground_truth(self, grids, shapes_per_level, gt_instances):
+        '''
+        Input:
+            grids: list of tensors [(hl x wl, 2)]_l
+            shapes_per_level: list of tuples L x 2:
+            gt_instances: gt instances
+        Return:
+            pos_inds: N
+            labels: N
+            reg_targets: M x 4
+            flattened_hms: M x C or M x 1
N: number of objects in all images + M: number of pixels from all FPN levels + ''' + + # get positive pixel index + if not self.more_pos: + pos_inds, labels = self._get_label_inds( + gt_instances, shapes_per_level) + else: + pos_inds, labels = None, None + heatmap_channels = self.num_classes + L = len(grids) + num_loc_list = [len(loc) for loc in grids] + strides = torch.cat([ + shapes_per_level.new_ones(num_loc_list[l]) * self.strides[l] \ + for l in range(L)]).float() # M + reg_size_ranges = torch.cat([ + shapes_per_level.new_tensor(self.sizes_of_interest[l]).float().view( + 1, 2).expand(num_loc_list[l], 2) for l in range(L)]) # M x 2 + grids = torch.cat(grids, dim=0) # M x 2 + M = grids.shape[0] + + reg_targets = [] + flattened_hms = [] + for i in range(len(gt_instances)): # images + boxes = gt_instances[i].gt_boxes.tensor # N x 4 + area = gt_instances[i].gt_boxes.area() # N + gt_classes = gt_instances[i].gt_classes # N in [0, self.num_classes] + + N = boxes.shape[0] + if N == 0: + reg_targets.append(grids.new_zeros((M, 4)) - INF) + flattened_hms.append( + grids.new_zeros(( + M, 1 if self.only_proposal else heatmap_channels))) + continue + + l = grids[:, 0].view(M, 1) - boxes[:, 0].view(1, N) # M x N + t = grids[:, 1].view(M, 1) - boxes[:, 1].view(1, N) # M x N + r = boxes[:, 2].view(1, N) - grids[:, 0].view(M, 1) # M x N + b = boxes[:, 3].view(1, N) - grids[:, 1].view(M, 1) # M x N + reg_target = torch.stack([l, t, r, b], dim=2) # M x N x 4 + + centers = ((boxes[:, [0, 1]] + boxes[:, [2, 3]]) / 2) # N x 2 + centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2 + strides_expanded = strides.view(M, 1, 1).expand(M, N, 2) + centers_discret = ((centers_expanded / strides_expanded).int() * \ + strides_expanded).float() + strides_expanded / 2 # M x N x 2 + + is_peak = (((grids.view(M, 1, 2).expand(M, N, 2) - \ + centers_discret) ** 2).sum(dim=2) == 0) # M x N + is_in_boxes = reg_target.min(dim=2)[0] > 0 # M x N + is_center3x3 = self.get_center3x3( + grids, centers, strides) & is_in_boxes # M x N + is_cared_in_the_level = self.assign_reg_fpn( + reg_target, reg_size_ranges) # M x N + reg_mask = is_center3x3 & is_cared_in_the_level # M x N + + dist2 = ((grids.view(M, 1, 2).expand(M, N, 2) - \ + centers_expanded) ** 2).sum(dim=2) # M x N + dist2[is_peak] = 0 + radius2 = self.delta ** 2 * 2 * area # N + radius2 = torch.clamp( + radius2, min=self.min_radius ** 2) + weighted_dist2 = dist2 / radius2.view(1, N).expand(M, N) # M x N + reg_target = self._get_reg_targets( + reg_target, weighted_dist2.clone(), reg_mask, area) # M x 4 + + if self.only_proposal: + flattened_hm = self._create_agn_heatmaps_from_dist( + weighted_dist2.clone()) # M x 1 + else: + flattened_hm = self._create_heatmaps_from_dist( + weighted_dist2.clone(), gt_classes, + channels=heatmap_channels) # M x C + + reg_targets.append(reg_target) + flattened_hms.append(flattened_hm) + + # transpose im first training_targets to level first ones + reg_targets = _transpose(reg_targets, num_loc_list) + flattened_hms = _transpose(flattened_hms, num_loc_list) + for l in range(len(reg_targets)): + reg_targets[l] = reg_targets[l] / float(self.strides[l]) + reg_targets = cat([x for x in reg_targets], dim=0) # MB x 4 + flattened_hms = cat([x for x in flattened_hms], dim=0) # MB x C + + return pos_inds, labels, reg_targets, flattened_hms + + + def _get_label_inds(self, gt_instances, shapes_per_level): + ''' + Inputs: + gt_instances: [n_i], sum n_i = N + shapes_per_level: L x 2 [(h_l, w_l)]_L + Returns: + pos_inds: N' + labels: N' + ''' + 
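+        # The returned indices address the flattened level-first layout
+        # [L, B, H_l, W_l]: index = level_base[l] + im_i * H_l * W_l
+        # + cy * W_l + cx. For example (hypothetical numbers), a GT center
+        # at (100, 60) on the stride-8 level falls in cell cx=12, cy=7, so
+        # its index is level_base[0] + 7 * W_0 + 12 for the first image.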
pos_inds = [] + labels = [] + L = len(self.strides) + B = len(gt_instances) + shapes_per_level = shapes_per_level.long() + loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L + level_bases = [] + s = 0 + for l in range(L): + level_bases.append(s) + s = s + B * loc_per_level[l] + level_bases = shapes_per_level.new_tensor(level_bases).long() # L + strides_default = shapes_per_level.new_tensor(self.strides).float() # L + for im_i in range(B): + targets_per_im = gt_instances[im_i] + bboxes = targets_per_im.gt_boxes.tensor # n x 4 + n = bboxes.shape[0] + centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 + centers = centers.view(n, 1, 2).expand(n, L, 2) + strides = strides_default.view(1, L, 1).expand(n, L, 2) + centers_inds = (centers / strides).long() # n x L x 2 + Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) + pos_ind = level_bases.view(1, L).expand(n, L) + \ + im_i * loc_per_level.view(1, L).expand(n, L) + \ + centers_inds[:, :, 1] * Ws + \ + centers_inds[:, :, 0] # n x L + is_cared_in_the_level = self.assign_fpn_level(bboxes) + pos_ind = pos_ind[is_cared_in_the_level].view(-1) + label = targets_per_im.gt_classes.view( + n, 1).expand(n, L)[is_cared_in_the_level].view(-1) + + pos_inds.append(pos_ind) # n' + labels.append(label) # n' + pos_inds = torch.cat(pos_inds, dim=0).long() + labels = torch.cat(labels, dim=0) + return pos_inds, labels # N, N + + + def assign_fpn_level(self, boxes): + ''' + Inputs: + boxes: n x 4 + size_ranges: L x 2 + Return: + is_cared_in_the_level: n x L + ''' + size_ranges = boxes.new_tensor( + self.sizes_of_interest).view(len(self.sizes_of_interest), 2) # L x 2 + crit = ((boxes[:, 2:] - boxes[:, :2]) **2).sum(dim=1) ** 0.5 / 2 # n + n, L = crit.shape[0], size_ranges.shape[0] + crit = crit.view(n, 1).expand(n, L) + size_ranges_expand = size_ranges.view(1, L, 2).expand(n, L, 2) + is_cared_in_the_level = (crit >= size_ranges_expand[:, :, 0]) & \ + (crit <= size_ranges_expand[:, :, 1]) + return is_cared_in_the_level + + + def assign_reg_fpn(self, reg_targets_per_im, size_ranges): + ''' + TODO (Xingyi): merge it with assign_fpn_level + Inputs: + reg_targets_per_im: M x N x 4 + size_ranges: M x 2 + ''' + crit = ((reg_targets_per_im[:, :, :2] + \ + reg_targets_per_im[:, :, 2:])**2).sum(dim=2) ** 0.5 / 2 # M x N + is_cared_in_the_level = (crit >= size_ranges[:, [0]]) & \ + (crit <= size_ranges[:, [1]]) + return is_cared_in_the_level + + + def _get_reg_targets(self, reg_targets, dist, mask, area): + ''' + reg_targets (M x N x 4): long tensor + dist (M x N) + is_*: M x N + ''' + dist[mask == 0] = INF * 1.0 + min_dist, min_inds = dist.min(dim=1) # M + reg_targets_per_im = reg_targets[ + range(len(reg_targets)), min_inds] # M x N x 4 --> M x 4 + reg_targets_per_im[min_dist == INF] = - INF + return reg_targets_per_im + + + def _create_heatmaps_from_dist(self, dist, labels, channels): + ''' + dist: M x N + labels: N + return: + heatmaps: M x C + ''' + heatmaps = dist.new_zeros((dist.shape[0], channels)) + for c in range(channels): + inds = (labels == c) # N + if inds.int().sum() == 0: + continue + heatmaps[:, c] = torch.exp(-dist[:, inds].min(dim=1)[0]) + zeros = heatmaps[:, c] < 1e-4 + heatmaps[zeros, c] = 0 + return heatmaps + + + def _create_agn_heatmaps_from_dist(self, dist): + ''' + TODO (Xingyi): merge it with _create_heatmaps_from_dist + dist: M x N + return: + heatmaps: M x 1 + ''' + heatmaps = dist.new_zeros((dist.shape[0], 1)) + heatmaps[:, 0] = torch.exp(-dist.min(dim=1)[0]) + zeros = heatmaps < 1e-4 + heatmaps[zeros] = 0 + 
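+        # each location now holds exp(-weighted squared distance to the
+        # nearest object center), with near-zero tails clipped to exactly 0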
return heatmaps + + + def _flatten_outputs(self, clss, reg_pred, agn_hm_pred): + # Reshape: (N, F, Hl, Wl) -> (N, Hl, Wl, F) -> (sum_l N*Hl*Wl, F) + clss = cat([x.permute(0, 2, 3, 1).reshape(-1, x.shape[1]) \ + for x in clss], dim=0) if clss[0] is not None else None + reg_pred = cat( + [x.permute(0, 2, 3, 1).reshape(-1, 4) for x in reg_pred], dim=0) + agn_hm_pred = cat([x.permute(0, 2, 3, 1).reshape(-1) \ + for x in agn_hm_pred], dim=0) if self.with_agn_hm else None + return clss, reg_pred, agn_hm_pred + + + def get_center3x3(self, locations, centers, strides): + ''' + Inputs: + locations: M x 2 + centers: N x 2 + strides: M + ''' + M, N = locations.shape[0], centers.shape[0] + locations_expanded = locations.view(M, 1, 2).expand(M, N, 2) # M x N x 2 + centers_expanded = centers.view(1, N, 2).expand(M, N, 2) # M x N x 2 + strides_expanded = strides.view(M, 1, 1).expand(M, N, 2) # M x N + centers_discret = ((centers_expanded / strides_expanded).int() * \ + strides_expanded).float() + strides_expanded / 2 # M x N x 2 + dist_x = (locations_expanded[:, :, 0] - centers_discret[:, :, 0]).abs() + dist_y = (locations_expanded[:, :, 1] - centers_discret[:, :, 1]).abs() + return (dist_x <= strides_expanded[:, :, 0]) & \ + (dist_y <= strides_expanded[:, :, 0]) + + + def inference(self, images, clss_per_level, reg_pred_per_level, + agn_hm_pred_per_level, grids): + logits_pred = [x.sigmoid() if x is not None else None \ + for x in clss_per_level] + agn_hm_pred_per_level = [x.sigmoid() if x is not None else None \ + for x in agn_hm_pred_per_level] + + if self.only_proposal: + proposals = self.predict_instances( + grids, agn_hm_pred_per_level, reg_pred_per_level, + images.image_sizes, [None for _ in agn_hm_pred_per_level]) + else: + proposals = self.predict_instances( + grids, logits_pred, reg_pred_per_level, + images.image_sizes, agn_hm_pred_per_level) + if self.as_proposal or self.only_proposal: + for p in range(len(proposals)): + proposals[p].proposal_boxes = proposals[p].get('pred_boxes') + proposals[p].objectness_logits = proposals[p].get('scores') + proposals[p].remove('pred_boxes') + + if self.debug: + debug_test( + [self.denormalizer(x) for x in images], + logits_pred, reg_pred_per_level, + agn_hm_pred_per_level, preds=proposals, + vis_thresh=self.vis_thresh, + debug_show_name=False) + return proposals, {} + + + def predict_instances( + self, grids, logits_pred, reg_pred, image_sizes, agn_hm_pred, + is_proposal=False): + sampled_boxes = [] + for l in range(len(grids)): + sampled_boxes.append(self.predict_single_level( + grids[l], logits_pred[l], reg_pred[l] * self.strides[l], + image_sizes, agn_hm_pred[l], l, is_proposal=is_proposal)) + boxlists = list(zip(*sampled_boxes)) + boxlists = [Instances.cat(boxlist) for boxlist in boxlists] + boxlists = self.nms_and_topK( + boxlists, nms=not self.not_nms) + return boxlists + + + def predict_single_level( + self, grids, heatmap, reg_pred, image_sizes, agn_hm, level, + is_proposal=False): + N, C, H, W = heatmap.shape + # put in the same format as grids + if self.center_nms: + heatmap_nms = nn.functional.max_pool2d( + heatmap, (3, 3), stride=1, padding=1) + heatmap = heatmap * (heatmap_nms == heatmap).float() + heatmap = heatmap.permute(0, 2, 3, 1) # N x H x W x C + heatmap = heatmap.reshape(N, -1, C) # N x HW x C + box_regression = reg_pred.view(N, 4, H, W).permute(0, 2, 3, 1) # N x H x W x 4 + box_regression = box_regression.reshape(N, -1, 4) + + candidate_inds = heatmap > self.score_thresh # 0.05 + pre_nms_top_n = candidate_inds.view(N, -1).sum(1) # N + 
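+        # keep at most pre_nms_topk candidates per image before NMS; the
+        # budget is configured separately for training and inference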
pre_nms_topk = self.pre_nms_topk_train if self.training else self.pre_nms_topk_test + pre_nms_top_n = pre_nms_top_n.clamp(max=pre_nms_topk) # N + + if agn_hm is not None: + agn_hm = agn_hm.view(N, 1, H, W).permute(0, 2, 3, 1) + agn_hm = agn_hm.reshape(N, -1) + heatmap = heatmap * agn_hm[:, :, None] + + results = [] + for i in range(N): + per_box_cls = heatmap[i] # HW x C + per_candidate_inds = candidate_inds[i] # n + per_box_cls = per_box_cls[per_candidate_inds] # n + + per_candidate_nonzeros = per_candidate_inds.nonzero() # n + per_box_loc = per_candidate_nonzeros[:, 0] # n + per_class = per_candidate_nonzeros[:, 1] # n + + per_box_regression = box_regression[i] # HW x 4 + per_box_regression = per_box_regression[per_box_loc] # n x 4 + per_grids = grids[per_box_loc] # n x 2 + + per_pre_nms_top_n = pre_nms_top_n[i] # 1 + + if per_candidate_inds.sum().item() > per_pre_nms_top_n.item(): + per_box_cls, top_k_indices = \ + per_box_cls.topk(per_pre_nms_top_n, sorted=False) + per_class = per_class[top_k_indices] + per_box_regression = per_box_regression[top_k_indices] + per_grids = per_grids[top_k_indices] + + detections = torch.stack([ + per_grids[:, 0] - per_box_regression[:, 0], + per_grids[:, 1] - per_box_regression[:, 1], + per_grids[:, 0] + per_box_regression[:, 2], + per_grids[:, 1] + per_box_regression[:, 3], + ], dim=1) # n x 4 + + # avoid invalid boxes in RoI heads + detections[:, 2] = torch.max(detections[:, 2], detections[:, 0] + 0.01) + detections[:, 3] = torch.max(detections[:, 3], detections[:, 1] + 0.01) + boxlist = Instances(image_sizes[i]) + boxlist.scores = torch.sqrt(per_box_cls) \ + if self.with_agn_hm else per_box_cls # n + # import pdb; pdb.set_trace() + boxlist.pred_boxes = Boxes(detections) + boxlist.pred_classes = per_class + results.append(boxlist) + return results + + + def nms_and_topK(self, boxlists, nms=True): + num_images = len(boxlists) + results = [] + for i in range(num_images): + nms_thresh = self.nms_thresh_train if self.training else \ + self.nms_thresh_test + result = ml_nms(boxlists[i], nms_thresh) if nms else boxlists[i] + if self.debug: + print('#proposals before nms', len(boxlists[i])) + print('#proposals after nms', len(result)) + num_dets = len(result) + post_nms_topk = self.post_nms_topk_train if self.training else \ + self.post_nms_topk_test + if num_dets > post_nms_topk: + cls_scores = result.scores + image_thresh, _ = torch.kthvalue( + cls_scores.float().cpu(), + num_dets - post_nms_topk + 1 + ) + keep = cls_scores >= image_thresh.item() + keep = torch.nonzero(keep).squeeze(1) + result = result[keep] + if self.debug: + print('#proposals after filter', len(result)) + results.append(result) + return results + + + def _add_more_pos(self, reg_pred, gt_instances, shapes_per_level): + labels, level_masks, c33_inds, c33_masks, c33_regs = \ + self._get_c33_inds(gt_instances, shapes_per_level) + N, L, K = labels.shape[0], len(self.strides), 9 + c33_inds[c33_masks == 0] = 0 + reg_pred_c33 = reg_pred[c33_inds].detach() # N x L x K + invalid_reg = c33_masks == 0 + c33_regs_expand = c33_regs.view(N * L * K, 4).clamp(min=0) + if N > 0: + with torch.no_grad(): + c33_reg_loss = self.iou_loss( + reg_pred_c33.view(N * L * K, 4), + c33_regs_expand, None, + reduction='none').view(N, L, K).detach() # N x L x K + else: + c33_reg_loss = reg_pred_c33.new_zeros((N, L, K)).detach() + c33_reg_loss[invalid_reg] = INF # N x L x K + c33_reg_loss.view(N * L, K)[level_masks.view(N * L), 4] = 0 # real center + c33_reg_loss = c33_reg_loss.view(N, L * K) + if N == 0: + loss_thresh = 
c33_reg_loss.new_ones((N)).float() + else: + loss_thresh = torch.kthvalue( + c33_reg_loss, self.more_pos_topk, dim=1)[0] # N + loss_thresh[loss_thresh > self.more_pos_thresh] = self.more_pos_thresh # N + new_pos = c33_reg_loss.view(N, L, K) < \ + loss_thresh.view(N, 1, 1).expand(N, L, K) + pos_inds = c33_inds[new_pos].view(-1) # P + labels = labels.view(N, 1, 1).expand(N, L, K)[new_pos].view(-1) + return pos_inds, labels + + + def _get_c33_inds(self, gt_instances, shapes_per_level): + ''' + TODO (Xingyi): The current implementation is ugly. Refactor. + Get the center (and the 3x3 region near center) locations of each objects + Inputs: + gt_instances: [n_i], sum n_i = N + shapes_per_level: L x 2 [(h_l, w_l)]_L + ''' + labels = [] + level_masks = [] + c33_inds = [] + c33_masks = [] + c33_regs = [] + L = len(self.strides) + B = len(gt_instances) + shapes_per_level = shapes_per_level.long() + loc_per_level = (shapes_per_level[:, 0] * shapes_per_level[:, 1]).long() # L + level_bases = [] + s = 0 + for l in range(L): + level_bases.append(s) + s = s + B * loc_per_level[l] + level_bases = shapes_per_level.new_tensor(level_bases).long() # L + strides_default = shapes_per_level.new_tensor(self.strides).float() # L + K = 9 + dx = shapes_per_level.new_tensor([-1, 0, 1, -1, 0, 1, -1, 0, 1]).long() + dy = shapes_per_level.new_tensor([-1, -1, -1, 0, 0, 0, 1, 1, 1]).long() + for im_i in range(B): + targets_per_im = gt_instances[im_i] + bboxes = targets_per_im.gt_boxes.tensor # n x 4 + n = bboxes.shape[0] + if n == 0: + continue + centers = ((bboxes[:, [0, 1]] + bboxes[:, [2, 3]]) / 2) # n x 2 + centers = centers.view(n, 1, 2).expand(n, L, 2) + + strides = strides_default.view(1, L, 1).expand(n, L, 2) # + centers_inds = (centers / strides).long() # n x L x 2 + center_grids = centers_inds * strides + strides // 2# n x L x 2 + l = center_grids[:, :, 0] - bboxes[:, 0].view(n, 1).expand(n, L) + t = center_grids[:, :, 1] - bboxes[:, 1].view(n, 1).expand(n, L) + r = bboxes[:, 2].view(n, 1).expand(n, L) - center_grids[:, :, 0] + b = bboxes[:, 3].view(n, 1).expand(n, L) - center_grids[:, :, 1] # n x L + reg = torch.stack([l, t, r, b], dim=2) # n x L x 4 + reg = reg / strides_default.view(1, L, 1).expand(n, L, 4).float() + + Ws = shapes_per_level[:, 1].view(1, L).expand(n, L) + Hs = shapes_per_level[:, 0].view(1, L).expand(n, L) + expand_Ws = Ws.view(n, L, 1).expand(n, L, K) + expand_Hs = Hs.view(n, L, 1).expand(n, L, K) + label = targets_per_im.gt_classes.view(n).clone() + mask = reg.min(dim=2)[0] >= 0 # n x L + mask = mask & self.assign_fpn_level(bboxes) + labels.append(label) # n + level_masks.append(mask) # n x L + + Dy = dy.view(1, 1, K).expand(n, L, K) + Dx = dx.view(1, 1, K).expand(n, L, K) + c33_ind = level_bases.view(1, L, 1).expand(n, L, K) + \ + im_i * loc_per_level.view(1, L, 1).expand(n, L, K) + \ + (centers_inds[:, :, 1:2].expand(n, L, K) + Dy) * expand_Ws + \ + (centers_inds[:, :, 0:1].expand(n, L, K) + Dx) # n x L x K + + c33_mask = \ + ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) < expand_Hs) & \ + ((centers_inds[:, :, 1:2].expand(n, L, K) + dy) >= 0) & \ + ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) < expand_Ws) & \ + ((centers_inds[:, :, 0:1].expand(n, L, K) + dx) >= 0) + # TODO (Xingyi): think about better way to implement this + # Currently it hard codes the 3x3 region + c33_reg = reg.view(n, L, 1, 4).expand(n, L, K, 4).clone() + c33_reg[:, :, [0, 3, 6], 0] -= 1 + c33_reg[:, :, [0, 3, 6], 2] += 1 + c33_reg[:, :, [2, 5, 8], 0] += 1 + c33_reg[:, :, [2, 5, 8], 2] -= 1 + c33_reg[:, :, [0, 1, 
2], 1] -= 1 + c33_reg[:, :, [0, 1, 2], 3] += 1 + c33_reg[:, :, [6, 7, 8], 1] += 1 + c33_reg[:, :, [6, 7, 8], 3] -= 1 + c33_mask = c33_mask & (c33_reg.min(dim=3)[0] >= 0) # n x L x K + c33_inds.append(c33_ind) + c33_masks.append(c33_mask) + c33_regs.append(c33_reg) + + if len(level_masks) > 0: + labels = torch.cat(labels, dim=0) + level_masks = torch.cat(level_masks, dim=0) + c33_inds = torch.cat(c33_inds, dim=0).long() + c33_regs = torch.cat(c33_regs, dim=0) + c33_masks = torch.cat(c33_masks, dim=0) + else: + labels = shapes_per_level.new_zeros((0)).long() + level_masks = shapes_per_level.new_zeros((0, L)).bool() + c33_inds = shapes_per_level.new_zeros((0, L, K)).long() + c33_regs = shapes_per_level.new_zeros((0, L, K, 4)).float() + c33_masks = shapes_per_level.new_zeros((0, L, K)).bool() + return labels, level_masks, c33_inds, c33_masks, c33_regs # N x L, N x L x K diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet_head.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet_head.py new file mode 100755 index 00000000..57e0960a --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet_head.py @@ -0,0 +1,162 @@ +import math +from typing import List +import torch +from torch import nn +from torch.nn import functional as F + +from detectron2.layers import ShapeSpec, get_norm +from detectron2.config import configurable +from ..layers.deform_conv import DFConv2d + +__all__ = ["CenterNetHead"] + +class Scale(nn.Module): + def __init__(self, init_value=1.0): + super(Scale, self).__init__() + self.scale = nn.Parameter(torch.FloatTensor([init_value])) + + def forward(self, input): + return input * self.scale + +class CenterNetHead(nn.Module): + @configurable + def __init__(self, + # input_shape: List[ShapeSpec], + in_channels, + num_levels, + *, + num_classes=80, + with_agn_hm=False, + only_proposal=False, + norm='GN', + num_cls_convs=4, + num_box_convs=4, + num_share_convs=0, + use_deformable=False, + prior_prob=0.01): + super().__init__() + self.num_classes = num_classes + self.with_agn_hm = with_agn_hm + self.only_proposal = only_proposal + self.out_kernel = 3 + + head_configs = { + "cls": (num_cls_convs if not self.only_proposal else 0, \ + use_deformable), + "bbox": (num_box_convs, use_deformable), + "share": (num_share_convs, use_deformable)} + + # in_channels = [s.channels for s in input_shape] + # assert len(set(in_channels)) == 1, \ + # "Each level must have the same channel!" 
+ # in_channels = in_channels[0] + channels = { + 'cls': in_channels, + 'bbox': in_channels, + 'share': in_channels, + } + for head in head_configs: + tower = [] + num_convs, use_deformable = head_configs[head] + channel = channels[head] + for i in range(num_convs): + if use_deformable and i == num_convs - 1: + conv_func = DFConv2d + else: + conv_func = nn.Conv2d + tower.append(conv_func( + in_channels if i == 0 else channel, + channel, + kernel_size=3, stride=1, + padding=1, bias=True + )) + if norm == 'GN' and channel % 32 != 0: + tower.append(nn.GroupNorm(25, channel)) + elif norm != '': + tower.append(get_norm(norm, channel)) + tower.append(nn.ReLU()) + self.add_module('{}_tower'.format(head), + nn.Sequential(*tower)) + + self.bbox_pred = nn.Conv2d( + in_channels, 4, kernel_size=self.out_kernel, + stride=1, padding=self.out_kernel // 2 + ) + + self.scales = nn.ModuleList( + [Scale(init_value=1.0) for _ in range(num_levels)]) + + for modules in [ + self.cls_tower, self.bbox_tower, + self.share_tower, + self.bbox_pred, + ]: + for l in modules.modules(): + if isinstance(l, nn.Conv2d): + torch.nn.init.normal_(l.weight, std=0.01) + torch.nn.init.constant_(l.bias, 0) + + torch.nn.init.constant_(self.bbox_pred.bias, 8.) + prior_prob = prior_prob + bias_value = -math.log((1 - prior_prob) / prior_prob) + + if self.with_agn_hm: + self.agn_hm = nn.Conv2d( + in_channels, 1, kernel_size=self.out_kernel, + stride=1, padding=self.out_kernel // 2 + ) + torch.nn.init.constant_(self.agn_hm.bias, bias_value) + torch.nn.init.normal_(self.agn_hm.weight, std=0.01) + + if not self.only_proposal: + cls_kernel_size = self.out_kernel + self.cls_logits = nn.Conv2d( + in_channels, self.num_classes, + kernel_size=cls_kernel_size, + stride=1, + padding=cls_kernel_size // 2, + ) + + torch.nn.init.constant_(self.cls_logits.bias, bias_value) + torch.nn.init.normal_(self.cls_logits.weight, std=0.01) + + @classmethod + def from_config(cls, cfg, input_shape): + ret = { + # 'input_shape': input_shape, + 'in_channels': [s.channels for s in input_shape][0], + 'num_levels': len(input_shape), + 'num_classes': cfg.MODEL.CENTERNET.NUM_CLASSES, + 'with_agn_hm': cfg.MODEL.CENTERNET.WITH_AGN_HM, + 'only_proposal': cfg.MODEL.CENTERNET.ONLY_PROPOSAL, + 'norm': cfg.MODEL.CENTERNET.NORM, + 'num_cls_convs': cfg.MODEL.CENTERNET.NUM_CLS_CONVS, + 'num_box_convs': cfg.MODEL.CENTERNET.NUM_BOX_CONVS, + 'num_share_convs': cfg.MODEL.CENTERNET.NUM_SHARE_CONVS, + 'use_deformable': cfg.MODEL.CENTERNET.USE_DEFORMABLE, + 'prior_prob': cfg.MODEL.CENTERNET.PRIOR_PROB, + } + return ret + + def forward(self, x): + clss = [] + bbox_reg = [] + agn_hms = [] + for l, feature in enumerate(x): + feature = self.share_tower(feature) + cls_tower = self.cls_tower(feature) + bbox_tower = self.bbox_tower(feature) + if not self.only_proposal: + clss.append(self.cls_logits(cls_tower)) + else: + clss.append(None) + + if self.with_agn_hm: + agn_hms.append(self.agn_hm(bbox_tower)) + else: + agn_hms.append(None) + reg = self.bbox_pred(bbox_tower) + reg = self.scales[l](reg) + bbox_reg.append(F.relu(reg)) + + return clss, bbox_reg, agn_hms \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/utils.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/utils.py new file mode 100755 index 00000000..c9efa287 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/utils.py @@ -0,0 +1,38 @@ +import cv2 +import torch +from torch import nn +from detectron2.utils.comm 
import get_world_size +from detectron2.structures import pairwise_iou, Boxes +# from .data import CenterNetCrop +import torch.nn.functional as F +import numpy as np +from detectron2.structures import Boxes, ImageList, Instances + +__all__ = ['reduce_sum', '_transpose'] + +INF = 1000000000 + +def _transpose(training_targets, num_loc_list): + ''' + This function is used to transpose image first training targets to + level first ones + :return: level first training targets + ''' + for im_i in range(len(training_targets)): + training_targets[im_i] = torch.split( + training_targets[im_i], num_loc_list, dim=0) + + targets_level_first = [] + for targets_per_level in zip(*training_targets): + targets_level_first.append( + torch.cat(targets_per_level, dim=0)) + return targets_level_first + + +def reduce_sum(tensor): + world_size = get_world_size() + if world_size < 2: + return tensor + tensor = tensor.clone() + torch.distributed.all_reduce(tensor, op=torch.distributed.ReduceOp.SUM) + return tensor \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/deform_conv.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/deform_conv.py new file mode 100755 index 00000000..e5650c40 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/deform_conv.py @@ -0,0 +1,116 @@ +import torch +from torch import nn + +from detectron2.layers import Conv2d + + +class _NewEmptyTensorOp(torch.autograd.Function): + @staticmethod + def forward(ctx, x, new_shape): + ctx.shape = x.shape + return x.new_empty(new_shape) + + @staticmethod + def backward(ctx, grad): + shape = ctx.shape + return _NewEmptyTensorOp.apply(grad, shape), None + + +class DFConv2d(nn.Module): + """Deformable convolutional layer""" + def __init__( + self, + in_channels, + out_channels, + with_modulated_dcn=True, + kernel_size=3, + stride=1, + groups=1, + dilation=1, + deformable_groups=1, + bias=False, + padding=None + ): + super(DFConv2d, self).__init__() + if isinstance(kernel_size, (list, tuple)): + assert isinstance(stride, (list, tuple)) + assert isinstance(dilation, (list, tuple)) + assert len(kernel_size) == 2 + assert len(stride) == 2 + assert len(dilation) == 2 + padding = ( + dilation[0] * (kernel_size[0] - 1) // 2, + dilation[1] * (kernel_size[1] - 1) // 2 + ) + offset_base_channels = kernel_size[0] * kernel_size[1] + else: + padding = dilation * (kernel_size - 1) // 2 + offset_base_channels = kernel_size * kernel_size + if with_modulated_dcn: + from detectron2.layers.deform_conv import ModulatedDeformConv + offset_channels = offset_base_channels * 3 # default: 27 + conv_block = ModulatedDeformConv + else: + from detectron2.layers.deform_conv import DeformConv + offset_channels = offset_base_channels * 2 # default: 18 + conv_block = DeformConv + self.offset = Conv2d( + in_channels, + deformable_groups * offset_channels, + kernel_size=kernel_size, + stride=stride, + padding=padding, + groups=1, + dilation=dilation + ) + nn.init.constant_(self.offset.weight, 0) + nn.init.constant_(self.offset.bias, 0) + ''' + for l in [self.offset, ]: + nn.init.kaiming_uniform_(l.weight, a=1) + torch.nn.init.constant_(l.bias, 0.) 
+ ''' + self.conv = conv_block( + in_channels, + out_channels, + kernel_size=kernel_size, + stride=stride, + padding=padding, + dilation=dilation, + groups=groups, + deformable_groups=deformable_groups, + bias=bias + ) + self.with_modulated_dcn = with_modulated_dcn + self.kernel_size = kernel_size + self.stride = stride + self.padding = padding + self.dilation = dilation + self.offset_split = offset_base_channels * deformable_groups * 2 + + def forward(self, x, return_offset=False): + if x.numel() > 0: + if not self.with_modulated_dcn: + offset_mask = self.offset(x) + x = self.conv(x, offset_mask) + else: + offset_mask = self.offset(x) + offset = offset_mask[:, :self.offset_split, :, :] + mask = offset_mask[:, self.offset_split:, :, :].sigmoid() + x = self.conv(x, offset, mask) + if return_offset: + return x, offset_mask + return x + # get output shape + output_shape = [ + (i + 2 * p - (di * (k - 1) + 1)) // d + 1 + for i, p, di, k, d in zip( + x.shape[-2:], + self.padding, + self.dilation, + self.kernel_size, + self.stride + ) + ] + output_shape = [x.shape[0], self.conv.weight.shape[0]] + output_shape + return _NewEmptyTensorOp.apply(x, output_shape) \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/heatmap_focal_loss.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/heatmap_focal_loss.py new file mode 100755 index 00000000..d4693b21 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/heatmap_focal_loss.py @@ -0,0 +1,92 @@ +import torch +from torch.nn import functional as F + +# TODO: merge these two function +def heatmap_focal_loss( + inputs, + targets, + pos_inds, + labels, + alpha: float = -1, + beta: float = 4, + gamma: float = 2, + reduction: str = 'sum', + sigmoid_clamp: float = 1e-4, + ignore_high_fp: float = -1., +): + """ + Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002. + Args: + inputs: (sum_l N*Hl*Wl, C) + targets: (sum_l N*Hl*Wl, C) + pos_inds: N + labels: N + Returns: + Loss tensor with the reduction option applied. + """ + pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) + neg_weights = torch.pow(1 - targets, beta) + pos_pred_pix = pred[pos_inds] # N x C + pos_pred = pos_pred_pix.gather(1, labels.unsqueeze(1)) + pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) + neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights + + if ignore_high_fp > 0: + not_high_fp = (pred < ignore_high_fp).float() + neg_loss = not_high_fp * neg_loss + + if reduction == "sum": + pos_loss = pos_loss.sum() + neg_loss = neg_loss.sum() + + if alpha >= 0: + pos_loss = alpha * pos_loss + neg_loss = (1 - alpha) * neg_loss + + return - pos_loss, - neg_loss + +heatmap_focal_loss_jit = torch.jit.script(heatmap_focal_loss) +# heatmap_focal_loss_jit = heatmap_focal_loss + +def binary_heatmap_focal_loss( + inputs, + targets, + pos_inds, + alpha: float = -1, + beta: float = 4, + gamma: float = 2, + sigmoid_clamp: float = 1e-4, + ignore_high_fp: float = -1., +): + """ + Args: + inputs: (sum_l N*Hl*Wl,) + targets: (sum_l N*Hl*Wl,) + pos_inds: N + Returns: + Loss tensor with the reduction option applied. 
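+    Note: unlike heatmap_focal_loss above, this variant has no reduction
+        argument and always reduces by sum; `inputs` is modified in place
+        by sigmoid_().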
+ """ + pred = torch.clamp(inputs.sigmoid_(), min=sigmoid_clamp, max=1-sigmoid_clamp) + neg_weights = torch.pow(1 - targets, beta) + for i, ind in enumerate(pos_inds): + if ind >= pred.shape[0]: + print('%'*100) + print(pred.shape, ind, pos_inds) + pos_inds[i] = pred.shape[0] - 1 + pos_pred = pred[pos_inds] # N + pos_loss = torch.log(pos_pred) * torch.pow(1 - pos_pred, gamma) + neg_loss = torch.log(1 - pred) * torch.pow(pred, gamma) * neg_weights + if ignore_high_fp > 0: + not_high_fp = (pred < ignore_high_fp).float() + neg_loss = not_high_fp * neg_loss + + pos_loss = - pos_loss.sum() + neg_loss = - neg_loss.sum() + + if alpha >= 0: + pos_loss = alpha * pos_loss + neg_loss = (1 - alpha) * neg_loss + + return pos_loss, neg_loss + +# binary_heatmap_focal_loss_jit = torch.jit.script(binary_heatmap_focal_loss) \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/iou_loss.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/iou_loss.py new file mode 100755 index 00000000..6a024646 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/iou_loss.py @@ -0,0 +1,121 @@ +import torch +from torch import nn + + +class IOULoss(nn.Module): + def __init__(self, loc_loss_type='iou'): + super(IOULoss, self).__init__() + self.loc_loss_type = loc_loss_type + + def forward(self, pred, target, weight=None, reduction='sum'): + pred_left = pred[:, 0] + pred_top = pred[:, 1] + pred_right = pred[:, 2] + pred_bottom = pred[:, 3] + + target_left = target[:, 0] + target_top = target[:, 1] + target_right = target[:, 2] + target_bottom = target[:, 3] + + target_aera = (target_left + target_right) * \ + (target_top + target_bottom) + pred_aera = (pred_left + pred_right) * \ + (pred_top + pred_bottom) + + w_intersect = torch.min(pred_left, target_left) + \ + torch.min(pred_right, target_right) + h_intersect = torch.min(pred_bottom, target_bottom) + \ + torch.min(pred_top, target_top) + + g_w_intersect = torch.max(pred_left, target_left) + \ + torch.max(pred_right, target_right) + g_h_intersect = torch.max(pred_bottom, target_bottom) + \ + torch.max(pred_top, target_top) + ac_uion = g_w_intersect * g_h_intersect + + area_intersect = w_intersect * h_intersect + area_union = target_aera + pred_aera - area_intersect + + ious = (area_intersect + 1.0) / (area_union + 1.0) + gious = ious - (ac_uion - area_union) / ac_uion + if self.loc_loss_type == 'iou': + losses = -torch.log(ious) + elif self.loc_loss_type == 'linear_iou': + losses = 1 - ious + elif self.loc_loss_type == 'giou': + losses = 1 - gious + else: + raise NotImplementedError + + if weight is not None: + losses = losses * weight + else: + losses = losses + + if reduction == 'sum': + return losses.sum() + elif reduction == 'batch': + return losses.sum(dim=[1]) + elif reduction == 'none': + return losses + else: + raise NotImplementedError + + +def giou_loss( + boxes1: torch.Tensor, + boxes2: torch.Tensor, + reduction: str = "none", + eps: float = 1e-7, +) -> torch.Tensor: + """ + Generalized Intersection over Union Loss (Hamid Rezatofighi et. al) + https://arxiv.org/abs/1902.09630 + Gradient-friendly IoU loss with an additional penalty that is non-zero when the + boxes do not overlap and scales with the size of their smallest enclosing box. + This loss is symmetric, so the boxes1 and boxes2 arguments are interchangeable. + Args: + boxes1, boxes2 (Tensor): box locations in XYXY format, shape (N, 4) or (4,). 
+ reduction: 'none' | 'mean' | 'sum' + 'none': No reduction will be applied to the output. + 'mean': The output will be averaged. + 'sum': The output will be summed. + eps (float): small number to prevent division by zero + """ + + x1, y1, x2, y2 = boxes1.unbind(dim=-1) + x1g, y1g, x2g, y2g = boxes2.unbind(dim=-1) + + assert (x2 >= x1).all(), "bad box: x1 larger than x2" + assert (y2 >= y1).all(), "bad box: y1 larger than y2" + + # Intersection keypoints + xkis1 = torch.max(x1, x1g) + ykis1 = torch.max(y1, y1g) + xkis2 = torch.min(x2, x2g) + ykis2 = torch.min(y2, y2g) + + intsctk = torch.zeros_like(x1) + mask = (ykis2 > ykis1) & (xkis2 > xkis1) + intsctk[mask] = (xkis2[mask] - xkis1[mask]) * (ykis2[mask] - ykis1[mask]) + unionk = (x2 - x1) * (y2 - y1) + (x2g - x1g) * (y2g - y1g) - intsctk + iouk = intsctk / (unionk + eps) + + # smallest enclosing box + xc1 = torch.min(x1, x1g) + yc1 = torch.min(y1, y1g) + xc2 = torch.max(x2, x2g) + yc2 = torch.max(y2, y2g) + + area_c = (xc2 - xc1) * (yc2 - yc1) + miouk = iouk - ((area_c - unionk) / (area_c + eps)) + + loss = 1 - miouk + + if reduction == "mean": + loss = loss.mean() + elif reduction == "sum": + loss = loss.sum() + + return loss \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/ml_nms.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/ml_nms.py new file mode 100755 index 00000000..325d709a --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/layers/ml_nms.py @@ -0,0 +1,31 @@ +from detectron2.layers import batched_nms + + +def ml_nms(boxlist, nms_thresh, max_proposals=-1, + score_field="scores", label_field="labels"): + """ + Performs non-maximum suppression on a boxlist, with scores specified + in a boxlist field via score_field. 
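+    Suppression is done per label via detectron2's batched_nms, so boxes
+    carrying different labels never suppress each other.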
+ Arguments: + boxlist(BoxList) + nms_thresh (float) + max_proposals (int): if > 0, then only the top max_proposals are kept + after non-maximum suppression + score_field (str) + """ + if nms_thresh <= 0: + return boxlist + if boxlist.has('pred_boxes'): + boxes = boxlist.pred_boxes.tensor + labels = boxlist.pred_classes + else: + boxes = boxlist.proposal_boxes.tensor + labels = boxlist.proposal_boxes.tensor.new_zeros( + len(boxlist.proposal_boxes.tensor)) + scores = boxlist.scores + + keep = batched_nms(boxes, scores, labels, nms_thresh) + if max_proposals > 0: + keep = keep[: max_proposals] + boxlist = boxlist[keep] + return boxlist diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py new file mode 100755 index 00000000..eba0457a --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py @@ -0,0 +1,69 @@ +import math +import json +import numpy as np +import torch +from torch import nn + +from detectron2.modeling.meta_arch.build import META_ARCH_REGISTRY +from detectron2.modeling import build_backbone, build_proposal_generator +from detectron2.modeling import detector_postprocess +from detectron2.structures import ImageList + +# @META_ARCH_REGISTRY.register() +class CenterNetDetector(nn.Module): + def __init__(self, cfg): + super().__init__() + self.mean, self.std = cfg.MODEL.PIXEL_MEAN, cfg.MODEL.PIXEL_STD + self.register_buffer("pixel_mean", torch.Tensor(cfg.MODEL.PIXEL_MEAN).view(-1, 1, 1)) + self.register_buffer("pixel_std", torch.Tensor(cfg.MODEL.PIXEL_STD).view(-1, 1, 1)) + + self.backbone = build_backbone(cfg) + self.proposal_generator = build_proposal_generator( + cfg, self.backbone.output_shape()) # TODO: change to a more precise name + + + def forward(self, batched_inputs): + if not self.training: + return self.inference(batched_inputs) + images = self.preprocess_image(batched_inputs) + features = self.backbone(images.tensor) + gt_instances = [x["instances"].to(self.device) for x in batched_inputs] + + _, proposal_losses = self.proposal_generator( + images, features, gt_instances) + return proposal_losses + + + @property + def device(self): + return self.pixel_mean.device + + + @torch.no_grad() + def inference(self, batched_inputs, do_postprocess=True): + images = self.preprocess_image(batched_inputs) + inp = images.tensor + features = self.backbone(inp) + proposals, _ = self.proposal_generator(images, features, None) + + processed_results = [] + for results_per_image, input_per_image, image_size in zip( + proposals, batched_inputs, images.image_sizes): + if do_postprocess: + height = input_per_image.get("height", image_size[0]) + width = input_per_image.get("width", image_size[1]) + r = detector_postprocess(results_per_image, height, width) + processed_results.append({"instances": r}) + else: + r = results_per_image + processed_results.append(r) + return processed_results + + def preprocess_image(self, batched_inputs): + """ + Normalize, pad and batch the input images. 
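+        Each image is standardized as (x - pixel_mean) / pixel_std and padded
+        to the backbone's size divisibility before batching.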
+ """ + images = [x["image"].to(self.device) for x in batched_inputs] + images = [(x - self.pixel_mean) / self.pixel_std for x in images] + images = ImageList.from_tensors(images, self.backbone.size_divisibility) + return images diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/__init__.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_fast_rcnn.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_fast_rcnn.py new file mode 100755 index 00000000..b6d95690 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_fast_rcnn.py @@ -0,0 +1,124 @@ +# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved +# Part of the code is from https://github.com/tztztztztz/eql.detectron2/blob/master/projects/EQL/eql/fast_rcnn.py +import logging +import math +import json +from typing import Dict, Union +import torch +from fvcore.nn import giou_loss, smooth_l1_loss +from torch import nn +from torch.nn import functional as F + +from detectron2.config import configurable +from detectron2.layers import Linear, ShapeSpec, batched_nms, cat, nonzero_tuple +from detectron2.modeling.box_regression import Box2BoxTransform +from detectron2.structures import Boxes, Instances +from detectron2.utils.events import get_event_storage +from detectron2.modeling.roi_heads.fast_rcnn import FastRCNNOutputLayers +from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference +from detectron2.modeling.roi_heads.fast_rcnn import _log_classification_stats +from detectron2.utils.comm import get_world_size +from .fed_loss import load_class_freq, get_fed_loss_inds + +__all__ = ["CustomFastRCNNOutputLayers"] + +class CustomFastRCNNOutputLayers(FastRCNNOutputLayers): + def __init__( + self, + cfg, + input_shape: ShapeSpec, + **kwargs + ): + super().__init__(cfg, input_shape, **kwargs) + + self.cfg = cfg + + def losses(self, predictions, proposals): + """ + enable advanced loss + """ + scores, proposal_deltas = predictions + gt_classes = ( + cat([p.gt_classes for p in proposals], dim=0) if len(proposals) else torch.empty(0) + ) + num_classes = self.num_classes + _log_classification_stats(scores, gt_classes) + + if len(proposals): + proposal_boxes = cat([p.proposal_boxes.tensor for p in proposals], dim=0) # Nx4 + assert not proposal_boxes.requires_grad, "Proposals should not require gradients!" + gt_boxes = cat( + [(p.gt_boxes if p.has("gt_boxes") else p.proposal_boxes).tensor for p in proposals], + dim=0, + ) + else: + proposal_boxes = gt_boxes = torch.empty((0, 4), device=proposal_deltas.device) + + loss_cls = self.softmax_cross_entropy_loss(scores, gt_classes) + return { + "loss_cls": loss_cls, + "loss_box_reg": self.box_reg_loss( + proposal_boxes, gt_boxes, proposal_deltas, gt_classes) + } + + + def sigmoid_cross_entropy_loss(self, pred_class_logits, gt_classes): + if pred_class_logits.numel() == 0: + return pred_class_logits.new_zeros([1])[0] # This is more robust than .sum() * 0. 
+ + B = pred_class_logits.shape[0] + C = pred_class_logits.shape[1] - 1 + + target = pred_class_logits.new_zeros(B, C + 1) + target[range(len(gt_classes)), gt_classes] = 1 # B x (C + 1) + target = target[:, :C] # B x C + + weight = 1 + + cls_loss = F.binary_cross_entropy_with_logits( + pred_class_logits[:, :-1], target, reduction='none') # B x C + loss = torch.sum(cls_loss * weight) / B + return loss + + + def softmax_cross_entropy_loss(self, pred_class_logits, gt_classes): + """ + change _no_instance handling + """ + if pred_class_logits.numel() == 0: + return pred_class_logits.new_zeros([1])[0] + + loss = F.cross_entropy( + pred_class_logits, gt_classes, reduction="mean") + return loss + + + def inference(self, predictions, proposals): + """ + enable use proposal boxes + """ + boxes = self.predict_boxes(predictions, proposals) + scores = self.predict_probs(predictions, proposals) + if self.cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE: + proposal_scores = [p.get('objectness_logits') for p in proposals] + scores = [(s * ps[:, None]) ** 0.5 \ + for s, ps in zip(scores, proposal_scores)] + image_shapes = [x.image_size for x in proposals] + return fast_rcnn_inference( + boxes, + scores, + image_shapes, + self.test_score_thresh, + self.test_nms_thresh, + self.test_topk_per_image, + ) + + + def predict_probs(self, predictions, proposals): + """ + support sigmoid + """ + scores, _ = predictions + num_inst_per_image = [len(p) for p in proposals] + probs = F.softmax(scores, dim=-1) + return probs.split(num_inst_per_image, dim=0) diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py new file mode 100755 index 00000000..a01a8bf6 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py @@ -0,0 +1,185 @@ +# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved +import numpy as np +import json +import math +import torch +from torch import nn +from torch.autograd.function import Function +from typing import Dict, List, Optional, Tuple, Union + +from detectron2.layers import ShapeSpec +from detectron2.structures import Boxes, Instances, pairwise_iou +from detectron2.utils.events import get_event_storage + +from detectron2.modeling.box_regression import Box2BoxTransform +from detectron2.modeling.roi_heads.fast_rcnn import fast_rcnn_inference +from detectron2.modeling.roi_heads.roi_heads import ROI_HEADS_REGISTRY, StandardROIHeads +from detectron2.modeling.roi_heads.cascade_rcnn import CascadeROIHeads +from detectron2.modeling.roi_heads.box_head import build_box_head +from .custom_fast_rcnn import CustomFastRCNNOutputLayers + + +# @ROI_HEADS_REGISTRY.register() +class CustomROIHeads(StandardROIHeads): + @classmethod + def _init_box_head(self, cfg, input_shape): + ret = super()._init_box_head(cfg, input_shape) + del ret['box_predictor'] + ret['box_predictor'] = CustomFastRCNNOutputLayers( + cfg, ret['box_head'].output_shape) + self.debug = cfg.DEBUG + if self.debug: + self.debug_show_name = cfg.DEBUG_SHOW_NAME + self.save_debug = cfg.SAVE_DEBUG + self.vis_thresh = cfg.VIS_THRESH + self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( + torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) + self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( + torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) + return ret + + def forward(self, images, features, proposals, targets=None): + """ + enable debug + """ + if not self.debug: + del images + if self.training: + assert targets + proposals = self.label_and_sample_proposals(proposals, targets) + del targets + + if self.training: + losses = self._forward_box(features, proposals) + losses.update(self._forward_mask(features, proposals)) + losses.update(self._forward_keypoint(features, proposals)) + return proposals, losses + else: + pred_instances = self._forward_box(features, proposals) + pred_instances = self.forward_with_given_boxes(features, pred_instances) + if self.debug: + from ..debug import debug_second_stage + denormalizer = lambda x: x * self.pixel_std + self.pixel_mean + debug_second_stage( + [denormalizer(images[0].clone())], + pred_instances, proposals=proposals, + debug_show_name=self.debug_show_name) + return pred_instances, {} + + +# @ROI_HEADS_REGISTRY.register() +class CustomCascadeROIHeads(CascadeROIHeads): + @classmethod + def _init_box_head(self, cfg, input_shape): + self.mult_proposal_score = cfg.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE + ret = super()._init_box_head(cfg, input_shape) + del ret['box_predictors'] + cascade_bbox_reg_weights = cfg.MODEL.ROI_BOX_CASCADE_HEAD.BBOX_REG_WEIGHTS + box_predictors = [] + for box_head, bbox_reg_weights in zip(ret['box_heads'], cascade_bbox_reg_weights): + box_predictors.append( + CustomFastRCNNOutputLayers( + cfg, box_head.output_shape, + box2box_transform=Box2BoxTransform(weights=bbox_reg_weights) + )) + ret['box_predictors'] = box_predictors + self.debug = cfg.DEBUG + if self.debug: + self.debug_show_name = cfg.DEBUG_SHOW_NAME + self.save_debug = cfg.SAVE_DEBUG + self.vis_thresh = cfg.VIS_THRESH + self.pixel_mean = torch.Tensor(cfg.MODEL.PIXEL_MEAN).to( + torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) + self.pixel_std = torch.Tensor(cfg.MODEL.PIXEL_STD).to( + torch.device(cfg.MODEL.DEVICE)).view(3, 1, 1) + return ret + + + def _forward_box(self, features, proposals, targets=None): + """ + Add mult proposal scores at testing + """ + if (not 
self.training) and self.mult_proposal_score: + if len(proposals) > 0 and proposals[0].has('scores'): + proposal_scores = [ + p.get('scores') for p in proposals] + else: + proposal_scores = [ + p.get('objectness_logits') for p in proposals] + + features = [features[f] for f in self.box_in_features] + head_outputs = [] # (predictor, predictions, proposals) + prev_pred_boxes = None + image_sizes = [x.image_size for x in proposals] + for k in range(self.num_cascade_stages): + if k > 0: + proposals = self._create_proposals_from_boxes(prev_pred_boxes, image_sizes) + if self.training: + proposals = self._match_and_label_boxes(proposals, k, targets) + predictions = self._run_stage(features, proposals, k) + prev_pred_boxes = self.box_predictor[k].predict_boxes(predictions, proposals) + head_outputs.append((self.box_predictor[k], predictions, proposals)) + + if self.training: + losses = {} + storage = get_event_storage() + for stage, (predictor, predictions, proposals) in enumerate(head_outputs): + with storage.name_scope("stage{}".format(stage)): + stage_losses = predictor.losses(predictions, proposals) + losses.update({k + "_stage{}".format(stage): v for k, v in stage_losses.items()}) + return losses + else: + # Each is a list[Tensor] of length #image. Each tensor is Ri x (K+1) + scores_per_stage = [h[0].predict_probs(h[1], h[2]) for h in head_outputs] + scores = [ + sum(list(scores_per_image)) * (1.0 / self.num_cascade_stages) + for scores_per_image in zip(*scores_per_stage) + ] + + if self.mult_proposal_score: + scores = [(s * ps[:, None]) ** 0.5 \ + for s, ps in zip(scores, proposal_scores)] + + predictor, predictions, proposals = head_outputs[-1] + boxes = predictor.predict_boxes(predictions, proposals) + pred_instances, _ = fast_rcnn_inference( + boxes, + scores, + image_sizes, + predictor.test_score_thresh, + predictor.test_nms_thresh, + predictor.test_topk_per_image, + ) + + return pred_instances + + def forward(self, images, features, proposals, targets=None): + ''' + enable debug + ''' + if not self.debug: + del images + if self.training: + proposals = self.label_and_sample_proposals(proposals, targets) + + if self.training: + losses = self._forward_box(features, proposals, targets) + losses.update(self._forward_mask(features, proposals)) + losses.update(self._forward_keypoint(features, proposals)) + return proposals, losses + else: + # import pdb; pdb.set_trace() + pred_instances = self._forward_box(features, proposals) + pred_instances = self.forward_with_given_boxes(features, pred_instances) + if self.debug: + from ..debug import debug_second_stage + denormalizer = lambda x: x * self.pixel_std + self.pixel_mean + debug_second_stage( + [denormalizer(x.clone()) for x in images], + pred_instances, proposals=proposals, + save_debug=self.save_debug, + debug_show_name=self.debug_show_name, + vis_thresh=self.vis_thresh) + return pred_instances, {} + + diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/fed_loss.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/fed_loss.py new file mode 100755 index 00000000..290f0f07 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/fed_loss.py @@ -0,0 +1,31 @@ +import torch +import json +import numpy as np +from torch.nn import functional as F + +def load_class_freq( + path='datasets/lvis/lvis_v1_train_cat_info.json', + freq_weight=0.5): + cat_info = json.load(open(path, 'r')) + cat_info = torch.tensor( + [c['image_count'] for c in sorted(cat_info, key=lambda x: 
x['id'])])
+    freq_weight = cat_info.float() ** freq_weight
+    return freq_weight
+
+def get_fed_loss_inds(
+    gt_classes, num_sample_cats=50, C=1203, \
+    weight=None, fed_cls_inds=-1):
+    appeared = torch.unique(gt_classes) # C'
+    prob = appeared.new_ones(C + 1).float()
+    prob[-1] = 0
+    if len(appeared) < num_sample_cats:
+        if weight is not None:
+            prob[:C] = weight.float().clone()
+        prob[appeared] = 0
+        if fed_cls_inds > 0:
+            prob[fed_cls_inds:] = 0
+        more_appeared = torch.multinomial(
+            prob, num_sample_cats - len(appeared),
+            replacement=False)
+        appeared = torch.cat([appeared, more_appeared])
+    return appeared
\ No newline at end of file
diff --git a/vbench/third_party/grit_src/centernet2/centernet2_docs/MODEL_ZOO.md b/vbench/third_party/grit_src/centernet2/centernet2_docs/MODEL_ZOO.md
new file mode 100755
index 00000000..7a2a92b6
--- /dev/null
+++ b/vbench/third_party/grit_src/centernet2/centernet2_docs/MODEL_ZOO.md
@@ -0,0 +1,73 @@
+# MODEL_ZOO
+
+### Common settings and notes
+
+- Multiscale training is used by default in all models. The results are all reported using single-scale testing.
+- We report runtime on our local workstation with a Titan Xp GPU and a Titan RTX GPU.
+- All models are trained on 8-GPU servers by default. The 1280 models are trained on 24G GPUs. Reducing the batch size with the linear learning rate rule should be fine (see the sketch after the CenterNet notes below).
+- All models can be downloaded directly from [Google drive](https://drive.google.com/drive/folders/1eae1cTX8tvIaCeof36sBgxrXEXALYlf-?usp=sharing).
+
+
+## COCO
+
+### CenterNet
+
+| Model | val mAP | FPS (Titan Xp/ Titan RTX) | links |
+|-------------------------------------------|---------|---------|-----------|
+| CenterNet-S4_DLA_8x | 42.5 | 50 / 71 |[config](../configs/CenterNet-S4_DLA_8x.yaml)/[model](https://drive.google.com/file/d/1lNBhVHnZAEBRD66MFaHjm5Ij6Z4KYrJq/view?usp=sharing)|
+| CenterNet-FPN_R50_1x | 40.2 | 20 / 24 |[config](../configs/CenterNet-FPN_R50_1x.yaml)/[model](https://drive.google.com/file/d/1rVG1YTthMXvutC6jr9KoE2DthT5-jhGj/view?usp=sharing)|
+
+#### Note
+
+- `CenterNet-S4_DLA_8x` is a re-implemented version of the original CenterNet (stride 4), with several changes, including
+  - Using top-left-right-bottom box encoding and GIoU Loss; adding regression loss to the center 3x3 region.
+  - Adding extra positive pixels to the heatmap loss at locations within the center 3x3 region whose regression loss is small.
+  - Using heavier crop augmentation (EfficientDet-style crop ratio 0.1-2), and removing color augmentations.
+  - Using standard NMS instead of max pooling.
+  - Using a RetinaNet-style optimizer (SGD), learning rate rule (0.01 per batch size of 16), and schedule (8x12 epochs).
+- `CenterNet-FPN_R50_1x` is a (new) FPN version of CenterNet. It includes the changes above, and assigns objects to FPN levels based on a fixed size range. The model is trained with standard short-edge 640-800 multi-scale training for 12 epochs (1x).
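+
+A minimal sketch of the linear learning rate rule referenced in the common settings (the helper name and example values are ours; the 0.02/16 reference point follows `Base-CenterNet2.yaml`):
+
+```python
+def scale_lr(new_batch, base_lr=0.02, base_batch=16):
+    # Linear scaling rule: learning rate scales proportionally with batch size.
+    return base_lr * new_batch / base_batch
+
+# Halving IMS_PER_BATCH from 16 to 8 suggests BASE_LR = 0.01.
+print(scale_lr(8))  # 0.01
+```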
+ + +### CenterNet2 + +| Model | val mAP | FPS (Titan Xp/ Titan RTX) | links | +|-------------------------------------------|---------|---------|-----------| +| CenterNet2-F_R50_1x | 41.7 | 22 / 27 |[config](../configs/CenterNet2-F_R50_1x.yaml)/[model](X)| +| CenterNet2_R50_1x | 42.9 | 18 / 24 |[config](../configs/CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1Osu1J_sskt_1FaGdfJKa4vd2N71TWS9W/view?usp=sharing)| +| CenterNet2_X101-DCN_2x | 49.9 | 6 / 8 |[config](../configs/CenterNet2_X101-DCN_2x.yaml)/[model](https://drive.google.com/file/d/1IHgpUHVJWpvMuFUUetgKWsw27pRNN2oK/view?usp=sharing)| +| CenterNet2_DLA-BiFPN-P3_4x | 43.8 | 40 / 50|[config](../configs/CenterNet2_DLA-BiFPN-P3_4x.yaml)/[model](https://drive.google.com/file/d/12GUNlDW9RmOs40UEMSiiUsk5QK_lpGsE/view?usp=sharing)| +| CenterNet2_DLA-BiFPN-P3_24x | 45.6 | 40 / 50 |[config](../configs/CenterNet2_DLA-BiFPN-P3_24x.yaml)/[model](https://drive.google.com/file/d/15ZES1ySxubDPzKsHPA7pYg8o_Vwmf-Mb/view?usp=sharing)| +| CenterNet2_R2-101-DCN_896_4x | 51.2 | 9 / 13 |[config](../configs/CenterNet2_R2-101-DCN_896_4x.yaml)/[model](https://drive.google.com/file/d/1S7_GE8ZDQBWuLEfKHkxzeF3KBsxsbABg/view?usp=sharing)| +| CenterNet2_R2-101-DCN-BiFPN_1280_4x | 52.9 | 6 / 8 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml)/[model](https://drive.google.com/file/d/14EBHNMagBCNTQjOXcHoZwLYIi2lFIm7F/view?usp=sharing)| +| CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST | 56.1 | 3 / 5 |[config](../configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml)/[model](https://drive.google.com/file/d/11ww9VlOi_nhpdsU_vBAecSxBU0dR_JzW/view?usp=sharing)| +| CenterNet2_DLA-BiFPN-P5_640_24x_ST | 49.2 | 33 / 38 |[config](../configs/CenterNet2_DLA-BiFPN-P5_640_24x_ST.yaml)/[model](https://drive.google.com/file/d/1qsHp2HrM1u8WrtBzF5S0oCoLMz-B40wk/view?usp=sharing)| + +#### Note + +- `CenterNet2-F_R50_1x` uses Faster RCNN as the second stage. All other CenterNet2 models use Cascade RCNN as the second stage. +- `CenterNet2_DLA-BiFPN-P3_4x` follows the same training setting as [realtime-FCOS](https://github.com/aim-uofa/AdelaiDet/blob/master/configs/FCOS-Detection/README.md). +- `CenterNet2_DLA-BiFPN-P3_24x` is trained by repeating the `4x` schedule (starting from learning rate 0.01) 6 times. +- R2 means [Res2Net](https://github.com/Res2Net/Res2Net-detectron2) backbone. To train Res2Net models, you need to download the ImageNet pre-trained weight [here](https://github.com/Res2Net/Res2Net-detectron2) and place it in `output/r2_101.pkl`. +- The last 4 models in the table are trained with the EfficientDet-style resize-and-crop augmentation, instead of the default random resizing short edge in detectron2. We found this trains faster (per-iteration) and gives better performance under a long schedule. +- `_ST` means using [self-training](https://arxiv.org/abs/2006.06882) using pseudo-labels produced by [Scaled-YOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4) on COCO unlabeled images, with a hard score threshold 0.5. Our processed pseudo-labels can be downloaded [here](https://drive.google.com/file/d/1LMBjtHhLp6dYf6MjwEQmzCLWQLkmWPpw/view?usp=sharing). +- `CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST` finetunes from `CenterNet2_R2-101-DCN-BiFPN_1280_4x` for an additional `4x` schedule with the self-training data. It is trained under `1280x1280` but tested under `1560x1560`. 
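+
+For reference, the pseudo-label filtering used for the `_ST` models above amounts to keeping only teacher detections over the hard score threshold. A minimal sketch, assuming detections arrive as parallel tensors (the function name is ours, for illustration only):
+
+```python
+import torch
+
+def filter_pseudo_labels(boxes: torch.Tensor, scores: torch.Tensor,
+                         classes: torch.Tensor, thresh: float = 0.5):
+    # Hard score threshold: keep only confident teacher detections.
+    keep = scores >= thresh
+    return boxes[keep], scores[keep], classes[keep]
+```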
+ +## LVIS v1 + +| Model | val mAP box | links | +|-------------------------------------------|--------------|-----------| +| LVIS_CenterNet2_R50_1x | 26.5 |[config](../configs/LVIS_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/1gT9e-tNw8uzEBaCadQuoOOP2TEYa4kKP/view?usp=sharing)| +| LVIS_CenterNet2_R50_Fed_1x | 28.3 |[config](../configs/LVIS_CenterNet2_R50_Fed_1x.yaml)/[model](https://drive.google.com/file/d/1a9UjheMCKax0qAKEwPVpq2ZHN6vpqJv8/view?usp=sharing)| + +- The models are trained with repeat-factor sampling. +- `LVIS_CenterNet2_R50_Fed_1x` is CenterNet2 with our federated loss. Check our Appendix D of our [paper](https://arxiv.org/abs/2103.07461) or our [technical report at LVIS challenge](https://www.lvisdataset.org/assets/challenge_reports/2020/CenterNet2.pdf) for references. + +## Objects365 + +| Model | val mAP| links | +|-------------------------------------------|---------|-----------| +| O365_CenterNet2_R50_1x | 22.6 |[config](../configs/O365_CenterNet2_R50_1x.yaml)/[model](https://drive.google.com/file/d/18fG6xGchAlpNp5sx8RAtwadGkS-gdIBU/view?usp=sharing)| + +#### Note +- Objects365 dataset can be downloaded [here](https://www.objects365.org/overview.html). +- The model is trained with class-aware sampling. diff --git a/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet-FPN.yaml b/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet-FPN.yaml new file mode 100755 index 00000000..bef3dc10 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet-FPN.yaml @@ -0,0 +1,28 @@ +MODEL: + META_ARCHITECTURE: "CenterNetDetector" + PROPOSAL_GENERATOR: + NAME: "CenterNet" + BACKBONE: + NAME: "build_p67_resnet_fpn_backbone" + WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" + RESNETS: + DEPTH: 50 + OUT_FEATURES: ["res3", "res4", "res5"] + FPN: + IN_FEATURES: ["res3", "res4", "res5"] +DATASETS: + TRAIN: ("coco_2017_train",) + TEST: ("coco_2017_val",) +SOLVER: + IMS_PER_BATCH: 16 + BASE_LR: 0.01 + STEPS: (60000, 80000) + MAX_ITER: 90000 + CHECKPOINT_PERIOD: 1000000000 + WARMUP_ITERS: 4000 + WARMUP_FACTOR: 0.00025 + CLIP_GRADIENTS: + ENABLED: True +INPUT: + MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) +OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet2.yaml b/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet2.yaml new file mode 100755 index 00000000..68937231 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/Base-CenterNet2.yaml @@ -0,0 +1,56 @@ +MODEL: + META_ARCHITECTURE: "GeneralizedRCNN" + PROPOSAL_GENERATOR: + NAME: "CenterNet" + BACKBONE: + NAME: "build_p67_resnet_fpn_backbone" + WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl" + RESNETS: + DEPTH: 50 + OUT_FEATURES: ["res3", "res4", "res5"] + FPN: + IN_FEATURES: ["res3", "res4", "res5"] + ROI_HEADS: + NAME: CustomCascadeROIHeads + IN_FEATURES: ["p3", "p4", "p5", "p6", "p7"] + IOU_THRESHOLDS: [0.6] + NMS_THRESH_TEST: 0.7 + ROI_BOX_CASCADE_HEAD: + IOUS: [0.6, 0.7, 0.8] + ROI_BOX_HEAD: + NAME: "FastRCNNConvFCHead" + NUM_FC: 2 + POOLER_RESOLUTION: 7 + CLS_AGNOSTIC_BBOX_REG: True + MULT_PROPOSAL_SCORE: True + CENTERNET: + REG_WEIGHT: 1. 
+ NOT_NORM_REG: True + ONLY_PROPOSAL: True + WITH_AGN_HM: True + INFERENCE_TH: 0.0001 + PRE_NMS_TOPK_TRAIN: 4000 + POST_NMS_TOPK_TRAIN: 2000 + PRE_NMS_TOPK_TEST: 1000 + POST_NMS_TOPK_TEST: 256 + NMS_TH_TRAIN: 0.9 + NMS_TH_TEST: 0.9 + POS_WEIGHT: 0.5 + NEG_WEIGHT: 0.5 + IGNORE_HIGH_FP: 0.85 +DATASETS: + TRAIN: ("coco_2017_train",) + TEST: ("coco_2017_val",) +SOLVER: + IMS_PER_BATCH: 16 + BASE_LR: 0.02 + STEPS: (60000, 80000) + MAX_ITER: 90000 + CHECKPOINT_PERIOD: 1000000000 + WARMUP_ITERS: 4000 + WARMUP_FACTOR: 0.00025 + CLIP_GRADIENTS: + ENABLED: True +INPUT: + MIN_SIZE_TRAIN: (640, 672, 704, 736, 768, 800) +OUTPUT_DIR: "./output/CenterNet2/auto" diff --git a/vbench/third_party/grit_src/centernet2/configs/Base_S4_DLA.yaml b/vbench/third_party/grit_src/centernet2/configs/Base_S4_DLA.yaml new file mode 100755 index 00000000..7e01be7e --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/Base_S4_DLA.yaml @@ -0,0 +1,40 @@ +MODEL: + META_ARCHITECTURE: "CenterNetDetector" + PROPOSAL_GENERATOR: + NAME: "CenterNet" + PIXEL_STD: [57.375, 57.120, 58.395] + BACKBONE: + NAME: "build_dla_backbone" + DLA: + NORM: "BN" + CENTERNET: + IN_FEATURES: ["dla2"] + FPN_STRIDES: [4] + SOI: [[0, 1000000]] + NUM_CLS_CONVS: 1 + NUM_BOX_CONVS: 1 + REG_WEIGHT: 1. + MORE_POS: True + HM_FOCAL_ALPHA: 0.25 +DATASETS: + TRAIN: ("coco_2017_train",) + TEST: ("coco_2017_val",) +SOLVER: + LR_SCHEDULER_NAME: "WarmupCosineLR" + MAX_ITER: 90000 + BASE_LR: 0.04 + IMS_PER_BATCH: 64 + WEIGHT_DECAY: 0.0001 + CHECKPOINT_PERIOD: 1000000 + CLIP_GRADIENTS: + ENABLED: True +INPUT: + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 640 + MIN_SIZE_TEST: 608 + MAX_SIZE_TEST: 900 +TEST: + EVAL_PERIOD: 7500 +DATALOADER: + NUM_WORKERS: 8 +OUTPUT_DIR: "output/CenterNet2/auto" diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet-FPN_R50_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet-FPN_R50_1x.yaml new file mode 100755 index 00000000..6ea7d9b7 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet-FPN_R50_1x.yaml @@ -0,0 +1,4 @@ +_BASE_: "Base-CenterNet-FPN.yaml" +MODEL: + CENTERNET: + MORE_POS: True \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet-S4_DLA_8x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet-S4_DLA_8x.yaml new file mode 100755 index 00000000..b3d88be9 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet-S4_DLA_8x.yaml @@ -0,0 +1,5 @@ +_BASE_: "Base_S4_DLA.yaml" +SOLVER: + MAX_ITER: 90000 + BASE_LR: 0.08 + IMS_PER_BATCH: 128 \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2-F_R50_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2-F_R50_1x.yaml new file mode 100755 index 00000000..c40eecc1 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2-F_R50_1x.yaml @@ -0,0 +1,4 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + ROI_HEADS: + NAME: CustomROIHeads \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml new file mode 100755 index 00000000..d7491447 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_24x.yaml @@ -0,0 +1,36 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p35_fcos_dla_bifpn_backbone" + BIFPN: + OUT_CHANNELS: 160 + NUM_LEVELS: 3 + NUM_BIFPN: 4 + DLA: + NUM_LAYERS: 34 + 
NORM: "SyncBN" + FPN: + IN_FEATURES: ["dla3", "dla4", "dla5"] + ROI_HEADS: + IN_FEATURES: ["p3", "p4", "p5"] + CENTERNET: + POST_NMS_TOPK_TEST: 128 + FPN_STRIDES: [8, 16, 32] + IN_FEATURES: ['p3', 'p4', 'p5'] + SOI: [[0, 64], [48, 192], [128, 1000000]] +DATASETS: + TRAIN: ("coco_2017_train",) + TEST: ("coco_2017_val",) +SOLVER: + IMS_PER_BATCH: 16 + BASE_LR: 0.02 + STEPS: (300000, 340000) + MAX_ITER: 360000 + CHECKPOINT_PERIOD: 100000 + WARMUP_ITERS: 4000 + WARMUP_FACTOR: 0.00025 +INPUT: + MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) + MAX_SIZE_TRAIN: 900 + MAX_SIZE_TEST: 736 + MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml new file mode 100755 index 00000000..d7491447 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P3_4x.yaml @@ -0,0 +1,36 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p35_fcos_dla_bifpn_backbone" + BIFPN: + OUT_CHANNELS: 160 + NUM_LEVELS: 3 + NUM_BIFPN: 4 + DLA: + NUM_LAYERS: 34 + NORM: "SyncBN" + FPN: + IN_FEATURES: ["dla3", "dla4", "dla5"] + ROI_HEADS: + IN_FEATURES: ["p3", "p4", "p5"] + CENTERNET: + POST_NMS_TOPK_TEST: 128 + FPN_STRIDES: [8, 16, 32] + IN_FEATURES: ['p3', 'p4', 'p5'] + SOI: [[0, 64], [48, 192], [128, 1000000]] +DATASETS: + TRAIN: ("coco_2017_train",) + TEST: ("coco_2017_val",) +SOLVER: + IMS_PER_BATCH: 16 + BASE_LR: 0.02 + STEPS: (300000, 340000) + MAX_ITER: 360000 + CHECKPOINT_PERIOD: 100000 + WARMUP_ITERS: 4000 + WARMUP_FACTOR: 0.00025 +INPUT: + MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608) + MAX_SIZE_TRAIN: 900 + MAX_SIZE_TEST: 736 + MIN_SIZE_TEST: 512 \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml new file mode 100755 index 00000000..80413a62 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x.yaml @@ -0,0 +1,29 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p37_dla_bifpn_backbone" + BIFPN: + OUT_CHANNELS: 160 + NUM_LEVELS: 5 + NUM_BIFPN: 3 + CENTERNET: + POST_NMS_TOPK_TEST: 128 + WEIGHTS: '' + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + FPN: + IN_FEATURES: ["dla3", "dla4", "dla5"] +SOLVER: + LR_SCHEDULER_NAME: "WarmupCosineLR" + MAX_ITER: 360000 + BASE_LR: 0.08 + IMS_PER_BATCH: 64 + CHECKPOINT_PERIOD: 90000 +TEST: + EVAL_PERIOD: 7500 +INPUT: + FORMAT: RGB + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 640 + MIN_SIZE_TEST: 608 + MAX_SIZE_TEST: 900 diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml new file mode 100755 index 00000000..8813b39c --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-BiFPN-P5_640_16x_ST.yaml @@ -0,0 +1,30 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p37_dla_bifpn_backbone" + BIFPN: + OUT_CHANNELS: 160 + NUM_LEVELS: 5 + NUM_BIFPN: 3 + CENTERNET: + POST_NMS_TOPK_TEST: 128 + WEIGHTS: '' + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + FPN: + IN_FEATURES: ["dla3", "dla4", "dla5"] +SOLVER: + LR_SCHEDULER_NAME: "WarmupCosineLR" + MAX_ITER: 360000 
+ BASE_LR: 0.08 + IMS_PER_BATCH: 64 +TEST: + EVAL_PERIOD: 7500 +INPUT: + FORMAT: RGB + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 640 + MIN_SIZE_TEST: 608 + MAX_SIZE_TEST: 900 +DATASETS: + TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml new file mode 100755 index 00000000..f94f1358 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_DLA-fcosBiFPN-P5_640_16x_ST.yaml @@ -0,0 +1,30 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p37_fcos_dla_bifpn_backbone" + BIFPN: + OUT_CHANNELS: 160 + NUM_LEVELS: 5 + NUM_BIFPN: 3 + CENTERNET: + POST_NMS_TOPK_TEST: 128 + WEIGHTS: '' + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + FPN: + IN_FEATURES: ["dla3", "dla4", "dla5"] +TEST: + EVAL_PERIOD: 7500 +SOLVER: + LR_SCHEDULER_NAME: "WarmupCosineLR" + MAX_ITER: 360000 + BASE_LR: 0.08 + IMS_PER_BATCH: 64 +INPUT: + FORMAT: RGB + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 640 + MIN_SIZE_TEST: 608 + MAX_SIZE_TEST: 900 +DATASETS: + TRAIN: ("coco_2017_train","coco_un_yolov4_55_0.5",) diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml new file mode 100755 index 00000000..e07574b3 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_1280_4x.yaml @@ -0,0 +1,32 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_res2net_bifpn_backbone" + BIFPN: + NUM_BIFPN: 7 + OUT_CHANNELS: 288 + WEIGHTS: "output/r2_101.pkl" + RESNETS: + DEPTH: 101 + WIDTH_PER_GROUP: 26 + DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 + DEFORM_MODULATED: True + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + CENTERNET: + USE_DEFORMABLE: True + ROI_HEADS: + IN_FEATURES: ["p3", "p4"] +INPUT: + FORMAT: RGB +TEST: + EVAL_PERIOD: 7500 +SOLVER: + MAX_ITER: 180000 + CHECKPOINT_PERIOD: 60000 + LR_SCHEDULER_NAME: "WarmupCosineLR" + BASE_LR: 0.04 + IMS_PER_BATCH: 32 +INPUT: + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 1280 diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml new file mode 100755 index 00000000..81fcab09 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN-BiFPN_4x+4x_1560_ST.yaml @@ -0,0 +1,36 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_res2net_bifpn_backbone" + BIFPN: + NUM_BIFPN: 7 + OUT_CHANNELS: 288 + WEIGHTS: "output/r2_101.pkl" + RESNETS: + DEPTH: 101 + WIDTH_PER_GROUP: 26 + DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 + DEFORM_MODULATED: True + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + CENTERNET: + USE_DEFORMABLE: True + ROI_HEADS: + IN_FEATURES: ["p3", "p4"] +TEST: + EVAL_PERIOD: 7500 +SOLVER: + MAX_ITER: 180000 + CHECKPOINT_PERIOD: 7500 + LR_SCHEDULER_NAME: "WarmupCosineLR" + BASE_LR: 0.04 + IMS_PER_BATCH: 32 +DATASETS: + TRAIN: "('coco_2017_train', 'coco_un_yolov4_55_0.5')" +INPUT: + FORMAT: RGB + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 1280 + TEST_SIZE: 1560 + TEST_INPUT_TYPE: 'square' + \ No newline at end of file diff --git 
a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml new file mode 100755 index 00000000..fd6c49ee --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R2-101-DCN_896_4x.yaml @@ -0,0 +1,29 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + BACKBONE: + NAME: "build_p67_res2net_fpn_backbone" + WEIGHTS: "output/r2_101.pkl" + RESNETS: + DEPTH: 101 + WIDTH_PER_GROUP: 26 + DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 + DEFORM_MODULATED: True + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] + CENTERNET: + USE_DEFORMABLE: True + ROI_HEADS: + IN_FEATURES: ["p3", "p4"] +INPUT: + FORMAT: RGB +TEST: + EVAL_PERIOD: 7500 +SOLVER: + MAX_ITER: 180000 + CHECKPOINT_PERIOD: 600000 + LR_SCHEDULER_NAME: "WarmupCosineLR" + BASE_LR: 0.04 + IMS_PER_BATCH: 32 +INPUT: + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 896 \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R50_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R50_1x.yaml new file mode 100755 index 00000000..9dcdf5b8 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_R50_1x.yaml @@ -0,0 +1 @@ +_BASE_: "Base-CenterNet2.yaml" diff --git a/vbench/third_party/grit_src/centernet2/configs/CenterNet2_X101-DCN_2x.yaml b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_X101-DCN_2x.yaml new file mode 100755 index 00000000..009c6808 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/CenterNet2_X101-DCN_2x.yaml @@ -0,0 +1,22 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + CENTERNET: + USE_DEFORMABLE: True + WEIGHTS: "detectron2://ImageNetPretrained/FAIR/X-101-32x8d.pkl" + PIXEL_STD: [57.375, 57.120, 58.395] + RESNETS: + STRIDE_IN_1X1: False + NUM_GROUPS: 32 + WIDTH_PER_GROUP: 8 + DEPTH: 101 + DEFORM_ON_PER_STAGE: [False, False, True, True] # on Res4, Res5 + DEFORM_MODULATED: True + ROI_HEADS: + IN_FEATURES: ["p3", "p4"] +SOLVER: + STEPS: (120000, 160000) + MAX_ITER: 180000 + CHECKPOINT_PERIOD: 40000 +INPUT: + MIN_SIZE_TRAIN: (480, 960) + MIN_SIZE_TRAIN_SAMPLING: "range" diff --git a/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_1x.yaml new file mode 100755 index 00000000..912e8925 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_1x.yaml @@ -0,0 +1,17 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + ROI_HEADS: + NUM_CLASSES: 1203 + SCORE_THRESH_TEST: 0.02 + NMS_THRESH_TEST: 0.5 + CENTERNET: + NUM_CLASSES: 1203 + +DATASETS: + TRAIN: ("lvis_v1_train",) + TEST: ("lvis_v1_val",) +DATALOADER: + SAMPLER_TRAIN: "RepeatFactorTrainingSampler" + REPEAT_THRESHOLD: 0.001 +TEST: + DETECTIONS_PER_IMAGE: 300 diff --git a/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml new file mode 100755 index 00000000..d6b6c823 --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/LVIS_CenterNet2_R50_Fed_1x.yaml @@ -0,0 +1,19 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + ROI_HEADS: + NUM_CLASSES: 1203 + SCORE_THRESH_TEST: 0.02 + NMS_THRESH_TEST: 0.5 + CENTERNET: + NUM_CLASSES: 1203 + ROI_BOX_HEAD: + USE_SIGMOID_CE: True + USE_FED_LOSS: True +DATASETS: + TRAIN: ("lvis_v1_train",) + TEST: ("lvis_v1_val",) +DATALOADER: + SAMPLER_TRAIN: 
"RepeatFactorTrainingSampler" + REPEAT_THRESHOLD: 0.001 +TEST: + DETECTIONS_PER_IMAGE: 300 diff --git a/vbench/third_party/grit_src/centernet2/configs/O365_CenterNet2_R50_1x.yaml b/vbench/third_party/grit_src/centernet2/configs/O365_CenterNet2_R50_1x.yaml new file mode 100755 index 00000000..514e52cd --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/O365_CenterNet2_R50_1x.yaml @@ -0,0 +1,13 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + ROI_HEADS: + NUM_CLASSES: 365 + CENTERNET: + NUM_CLASSES: 365 +DATASETS: + TRAIN: ("objects365_train",) + TEST: ("objects365_val",) +DATALOADER: + SAMPLER_TRAIN: "ClassAwareSampler" +TEST: + DETECTIONS_PER_IMAGE: 300 \ No newline at end of file diff --git a/vbench/third_party/grit_src/centernet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml b/vbench/third_party/grit_src/centernet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml new file mode 100755 index 00000000..c400e92c --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/configs/nuImages_CenterNet2_DLA_640_8x.yaml @@ -0,0 +1,42 @@ +_BASE_: "Base-CenterNet2.yaml" +MODEL: + MASK_ON: True + ROI_MASK_HEAD: + NAME: "MaskRCNNConvUpsampleHead" + NUM_CONV: 4 + POOLER_RESOLUTION: 14 + ROI_HEADS: + NUM_CLASSES: 10 + IN_FEATURES: ["dla2"] + BACKBONE: + NAME: "build_dla_backbone" + DLA: + NORM: "BN" + CENTERNET: + IN_FEATURES: ["dla2"] + FPN_STRIDES: [4] + SOI: [[0, 1000000]] + NUM_CLS_CONVS: 1 + NUM_BOX_CONVS: 1 + REG_WEIGHT: 1. + MORE_POS: True + HM_FOCAL_ALPHA: 0.25 + POST_NMS_TOPK_TEST: 128 + WEIGHTS: '' + PIXEL_MEAN: [123.675, 116.280, 103.530] + PIXEL_STD: [58.395, 57.12, 57.375] +SOLVER: + MAX_ITER: 180000 + STEPS: (120000, 160000) + BASE_LR: 0.08 + IMS_PER_BATCH: 64 +INPUT: + FORMAT: RGB + CUSTOM_AUG: EfficientDetResizeCrop + TRAIN_SIZE: 640 + MIN_SIZE_TEST: 608 + MAX_SIZE_TEST: 900 + MASK_FORMAT: bitmask +DATASETS: + TRAIN: ("nuimages_train",) + TEST: ("nuimages_val",) diff --git a/vbench/third_party/grit_src/centernet2/predictor.py b/vbench/third_party/grit_src/centernet2/predictor.py new file mode 100755 index 00000000..8a036bde --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/predictor.py @@ -0,0 +1,243 @@ +# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved +import atexit +import bisect +import multiprocessing as mp +from collections import deque +import cv2 +import torch + +from detectron2.data import MetadataCatalog +from detectron2.engine.defaults import DefaultPredictor +from detectron2.utils.video_visualizer import VideoVisualizer +from detectron2.utils.visualizer import ColorMode, Visualizer + + +class VisualizationDemo(object): + def __init__(self, cfg, instance_mode=ColorMode.IMAGE, parallel=False): + """ + Args: + cfg (CfgNode): + instance_mode (ColorMode): + parallel (bool): whether to run the model in different processes from visualization. + Useful since the visualization logic can be slow. + """ + self.metadata = MetadataCatalog.get( + cfg.DATASETS.TRAIN[0] if len(cfg.DATASETS.TRAIN) else "__unused" + ) + self.cpu_device = torch.device("cpu") + self.instance_mode = instance_mode + + self.parallel = parallel + if parallel: + num_gpu = torch.cuda.device_count() + self.predictor = AsyncPredictor(cfg, num_gpus=num_gpu) + else: + self.predictor = DefaultPredictor(cfg) + + def run_on_image(self, image, visualizer=None): + """ + Args: + image (np.ndarray): an image of shape (H, W, C) (in BGR order). + This is the format used by OpenCV. + + Returns: + predictions (dict): the output of the model. + vis_output (VisImage): the visualized image output. 
+ """ + vis_output = None + predictions = self.predictor(image) + # Convert image from OpenCV BGR format to Matplotlib RGB format. + image = image[:, :, ::-1] + use_video_vis = True + if visualizer is None: + use_video_vis = False + visualizer = Visualizer(image, self.metadata, instance_mode=self.instance_mode) + if "panoptic_seg" in predictions: + panoptic_seg, segments_info = predictions["panoptic_seg"] + vis_output = visualizer.draw_panoptic_seg_predictions( + panoptic_seg.to(self.cpu_device), segments_info + ) + else: + if "sem_seg" in predictions: + vis_output = visualizer.draw_sem_seg( + predictions["sem_seg"].argmax(dim=0).to(self.cpu_device) + ) + if "instances" in predictions: + instances = predictions["instances"].to(self.cpu_device) + if use_video_vis: + vis_output = visualizer.draw_instance_predictions( + image, predictions=instances) + else: + vis_output = visualizer.draw_instance_predictions(predictions=instances) + elif "proposals" in predictions: + instances = predictions["proposals"].to(self.cpu_device) + instances.pred_boxes = instances.proposal_boxes + instances.scores = instances.objectness_logits + instances.pred_classes[:] = -1 + if use_video_vis: + vis_output = visualizer.draw_instance_predictions( + image, predictions=instances) + else: + vis_output = visualizer.draw_instance_predictions(predictions=instances) + + return predictions, vis_output + + def _frame_from_video(self, video): + while video.isOpened(): + success, frame = video.read() + if success: + yield frame + else: + break + + def run_on_video(self, video): + """ + Visualizes predictions on frames of the input video. + + Args: + video (cv2.VideoCapture): a :class:`VideoCapture` object, whose source can be + either a webcam or a video file. + + Yields: + ndarray: BGR visualizations of each video frame. 
+        """
+        video_visualizer = VideoVisualizer(self.metadata, self.instance_mode)
+
+        def process_predictions(frame, predictions):
+            # OpenCV frames are BGR; swap channels for the visualizer (the
+            # RGB2BGR and BGR2RGB conversions are the same channel swap).
+            frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
+            if "panoptic_seg" in predictions:
+                panoptic_seg, segments_info = predictions["panoptic_seg"]
+                vis_frame = video_visualizer.draw_panoptic_seg_predictions(
+                    frame, panoptic_seg.to(self.cpu_device), segments_info
+                )
+            elif "instances" in predictions:
+                predictions = predictions["instances"].to(self.cpu_device)
+                vis_frame = video_visualizer.draw_instance_predictions(frame, predictions)
+            elif "sem_seg" in predictions:
+                vis_frame = video_visualizer.draw_sem_seg(
+                    frame, predictions["sem_seg"].argmax(dim=0).to(self.cpu_device)
+                )
+            elif "proposals" in predictions:
+                predictions = predictions["proposals"].to(self.cpu_device)
+                predictions.pred_boxes = predictions.proposal_boxes
+                predictions.scores = predictions.objectness_logits
+                predictions.pred_classes[:] = -1
+                vis_frame = video_visualizer.draw_instance_predictions(frame, predictions)
+
+            # Convert Matplotlib RGB format back to OpenCV BGR format
+            vis_frame = cv2.cvtColor(vis_frame.get_image(), cv2.COLOR_RGB2BGR)
+            return vis_frame
+
+        frame_gen = self._frame_from_video(video)
+        if self.parallel:
+            buffer_size = self.predictor.default_buffer_size
+
+            frame_data = deque()
+
+            for cnt, frame in enumerate(frame_gen):
+                frame_data.append(frame)
+                self.predictor.put(frame)
+
+                if cnt >= buffer_size:
+                    frame = frame_data.popleft()
+                    predictions = self.predictor.get()
+                    yield process_predictions(frame, predictions)
+
+            while len(frame_data):
+                frame = frame_data.popleft()
+                predictions = self.predictor.get()
+                yield process_predictions(frame, predictions)
+        else:
+            for frame in frame_gen:
+                yield process_predictions(frame, self.predictor(frame))
+
+
+class AsyncPredictor:
+    """
+    A predictor that runs the model asynchronously, possibly on more than one GPU.
+    Because rendering the visualization takes a considerable amount of time,
+    this helps improve throughput when rendering videos.
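+
+    Results are re-ordered internally, so `get()` yields outputs in the same
+    order the inputs were submitted via `put()`.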
+ """ + + class _StopToken: + pass + + class _PredictWorker(mp.Process): + def __init__(self, cfg, task_queue, result_queue): + self.cfg = cfg + self.task_queue = task_queue + self.result_queue = result_queue + super().__init__() + + def run(self): + predictor = DefaultPredictor(self.cfg) + + while True: + task = self.task_queue.get() + if isinstance(task, AsyncPredictor._StopToken): + break + idx, data = task + result = predictor(data) + self.result_queue.put((idx, result)) + + def __init__(self, cfg, num_gpus: int = 1): + """ + Args: + cfg (CfgNode): + num_gpus (int): if 0, will run on CPU + """ + num_workers = max(num_gpus, 1) + self.task_queue = mp.Queue(maxsize=num_workers * 3) + self.result_queue = mp.Queue(maxsize=num_workers * 3) + self.procs = [] + for gpuid in range(max(num_gpus, 1)): + cfg = cfg.clone() + cfg.defrost() + cfg.MODEL.DEVICE = "cuda:{}".format(gpuid) if num_gpus > 0 else "cpu" + self.procs.append( + AsyncPredictor._PredictWorker(cfg, self.task_queue, self.result_queue) + ) + + self.put_idx = 0 + self.get_idx = 0 + self.result_rank = [] + self.result_data = [] + + for p in self.procs: + p.start() + atexit.register(self.shutdown) + + def put(self, image): + self.put_idx += 1 + self.task_queue.put((self.put_idx, image)) + + def get(self): + self.get_idx += 1 # the index needed for this request + if len(self.result_rank) and self.result_rank[0] == self.get_idx: + res = self.result_data[0] + del self.result_data[0], self.result_rank[0] + return res + + while True: + # make sure the results are returned in the correct order + idx, res = self.result_queue.get() + if idx == self.get_idx: + return res + insert = bisect.bisect(self.result_rank, idx) + self.result_rank.insert(insert, idx) + self.result_data.insert(insert, res) + + def __len__(self): + return self.put_idx - self.get_idx + + def __call__(self, image): + self.put(image) + return self.get() + + def shutdown(self): + for _ in self.procs: + self.task_queue.put(AsyncPredictor._StopToken()) + + @property + def default_buffer_size(self): + return len(self.procs) * 5 diff --git a/vbench/third_party/grit_src/centernet2/train_net.py b/vbench/third_party/grit_src/centernet2/train_net.py new file mode 100755 index 00000000..d903efde --- /dev/null +++ b/vbench/third_party/grit_src/centernet2/train_net.py @@ -0,0 +1,228 @@ +import logging +import os +from collections import OrderedDict +import torch +from torch.nn.parallel import DistributedDataParallel +import time +import datetime +import json + +from fvcore.common.timer import Timer +import detectron2.utils.comm as comm +from detectron2.checkpoint import DetectionCheckpointer, PeriodicCheckpointer +from detectron2.config import get_cfg +from detectron2.data import ( + MetadataCatalog, + build_detection_test_loader, +) +from detectron2.engine import default_argument_parser, default_setup, launch + +from detectron2.evaluation import ( + COCOEvaluator, + LVISEvaluator, + inference_on_dataset, + print_csv_format, +) +from detectron2.modeling import build_model +from detectron2.solver import build_lr_scheduler, build_optimizer +from detectron2.utils.events import ( + CommonMetricPrinter, + EventStorage, + JSONWriter, + TensorboardXWriter, +) +from detectron2.modeling.test_time_augmentation import GeneralizedRCNNWithTTA +from detectron2.data.dataset_mapper import DatasetMapper +from detectron2.data.build import build_detection_train_loader + +from centernet.config import add_centernet_config +from centernet.data.custom_build_augmentation import build_custom_augmentation + 
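+# Typical usage (our note, not part of the original script):
+#   python train_net.py --config-file configs/CenterNet2_R50_1x.yaml --num-gpus 8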
+logger = logging.getLogger("detectron2") + +def do_test(cfg, model): + results = OrderedDict() + for dataset_name in cfg.DATASETS.TEST: + mapper = None if cfg.INPUT.TEST_INPUT_TYPE == 'default' else \ + DatasetMapper( + cfg, False, augmentations=build_custom_augmentation(cfg, False)) + data_loader = build_detection_test_loader(cfg, dataset_name, mapper=mapper) + output_folder = os.path.join( + cfg.OUTPUT_DIR, "inference_{}".format(dataset_name)) + evaluator_type = MetadataCatalog.get(dataset_name).evaluator_type + + if evaluator_type == "lvis": + evaluator = LVISEvaluator(dataset_name, cfg, True, output_folder) + elif evaluator_type == 'coco': + evaluator = COCOEvaluator(dataset_name, cfg, True, output_folder) + else: + assert 0, evaluator_type + + results[dataset_name] = inference_on_dataset( + model, data_loader, evaluator) + if comm.is_main_process(): + logger.info("Evaluation results for {} in csv format:".format( + dataset_name)) + print_csv_format(results[dataset_name]) + if len(results) == 1: + results = list(results.values())[0] + return results + +def do_train(cfg, model, resume=False): + model.train() + optimizer = build_optimizer(cfg, model) + scheduler = build_lr_scheduler(cfg, optimizer) + + checkpointer = DetectionCheckpointer( + model, cfg.OUTPUT_DIR, optimizer=optimizer, scheduler=scheduler + ) + + start_iter = ( + checkpointer.resume_or_load( + cfg.MODEL.WEIGHTS, resume=resume, + ).get("iteration", -1) + 1 + ) + if cfg.SOLVER.RESET_ITER: + logger.info('Reset loaded iteration. Start training from iteration 0.') + start_iter = 0 + max_iter = cfg.SOLVER.MAX_ITER if cfg.SOLVER.TRAIN_ITER < 0 else cfg.SOLVER.TRAIN_ITER + + periodic_checkpointer = PeriodicCheckpointer( + checkpointer, cfg.SOLVER.CHECKPOINT_PERIOD, max_iter=max_iter + ) + + writers = ( + [ + CommonMetricPrinter(max_iter), + JSONWriter(os.path.join(cfg.OUTPUT_DIR, "metrics.json")), + TensorboardXWriter(cfg.OUTPUT_DIR), + ] + if comm.is_main_process() + else [] + ) + + + mapper = DatasetMapper(cfg, True) if cfg.INPUT.CUSTOM_AUG == '' else \ + DatasetMapper(cfg, True, augmentations=build_custom_augmentation(cfg, True)) + if cfg.DATALOADER.SAMPLER_TRAIN in ['TrainingSampler', 'RepeatFactorTrainingSampler']: + data_loader = build_detection_train_loader(cfg, mapper=mapper) + else: + from centernet.data.custom_dataset_dataloader import build_custom_train_loader + data_loader = build_custom_train_loader(cfg, mapper=mapper) + + + logger.info("Starting training from iteration {}".format(start_iter)) + with EventStorage(start_iter) as storage: + step_timer = Timer() + data_timer = Timer() + start_time = time.perf_counter() + for data, iteration in zip(data_loader, range(start_iter, max_iter)): + data_time = data_timer.seconds() + storage.put_scalars(data_time=data_time) + step_timer.reset() + iteration = iteration + 1 + storage.step() + loss_dict = model(data) + + losses = sum( + loss for k, loss in loss_dict.items()) + assert torch.isfinite(losses).all(), loss_dict + + loss_dict_reduced = {k: v.item() \ + for k, v in comm.reduce_dict(loss_dict).items()} + losses_reduced = sum(loss for loss in loss_dict_reduced.values()) + if comm.is_main_process(): + storage.put_scalars( + total_loss=losses_reduced, **loss_dict_reduced) + + optimizer.zero_grad() + losses.backward() + optimizer.step() + + storage.put_scalar( + "lr", optimizer.param_groups[0]["lr"], smoothing_hint=False) + + step_time = step_timer.seconds() + storage.put_scalars(time=step_time) + data_timer.reset() + scheduler.step() + + if ( + cfg.TEST.EVAL_PERIOD > 0 + 
and iteration % cfg.TEST.EVAL_PERIOD == 0 + and iteration != max_iter + ): + do_test(cfg, model) + comm.synchronize() + + if iteration - start_iter > 5 and \ + (iteration % 20 == 0 or iteration == max_iter): + for writer in writers: + writer.write() + periodic_checkpointer.step(iteration) + + total_time = time.perf_counter() - start_time + logger.info( + "Total training time: {}".format( + str(datetime.timedelta(seconds=int(total_time))))) + +def setup(args): + """ + Create configs and perform basic setups. + """ + cfg = get_cfg() + add_centernet_config(cfg) + cfg.merge_from_file(args.config_file) + cfg.merge_from_list(args.opts) + if '/auto' in cfg.OUTPUT_DIR: + file_name = os.path.basename(args.config_file)[:-5] + cfg.OUTPUT_DIR = cfg.OUTPUT_DIR.replace('/auto', '/{}'.format(file_name)) + logger.info('OUTPUT_DIR: {}'.format(cfg.OUTPUT_DIR)) + cfg.freeze() + default_setup(cfg, args) + return cfg + + +def main(args): + cfg = setup(args) + + model = build_model(cfg) + logger.info("Model:\n{}".format(model)) + if args.eval_only: + DetectionCheckpointer(model, save_dir=cfg.OUTPUT_DIR).resume_or_load( + cfg.MODEL.WEIGHTS, resume=args.resume + ) + if cfg.TEST.AUG.ENABLED: + logger.info("Running inference with test-time augmentation ...") + model = GeneralizedRCNNWithTTA(cfg, model, batch_size=1) + + return do_test(cfg, model) + + distributed = comm.get_world_size() > 1 + if distributed: + model = DistributedDataParallel( + model, device_ids=[comm.get_local_rank()], broadcast_buffers=False, + find_unused_parameters=True + ) + + do_train(cfg, model, resume=args.resume) + return do_test(cfg, model) + + +if __name__ == "__main__": + args = default_argument_parser() + args.add_argument('--manual_device', default='') + args = args.parse_args() + if args.manual_device != '': + os.environ['CUDA_VISIBLE_DEVICES'] = args.manual_device + args.dist_url = 'tcp://127.0.0.1:{}'.format( + torch.randint(11111, 60000, (1,))[0].item()) + print("Command Line Args:", args) + launch( + main, + args.num_gpus, + num_machines=args.num_machines, + machine_rank=args.machine_rank, + dist_url=args.dist_url, + args=(args,), + ) diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py index 79e032bc..493cec49 100755 --- a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py +++ b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py @@ -10,8 +10,9 @@ from detectron2.modeling.backbone.build import BACKBONE_REGISTRY from detectron2.layers import ShapeSpec -# from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 -from .centernet_last_layer import LastLevelP6P7_P5 +import sys +sys.path.append('vbench/third_party/grit_src/centernet2') +from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 import torch.utils.checkpoint as checkpoint from timm.models.layers import DropPath, Mlp, trunc_normal_ diff --git a/vbench/third_party/grit_src/image_dense_captions.py b/vbench/third_party/grit_src/image_dense_captions.py index 734b68eb..293516c2 100755 --- a/vbench/third_party/grit_src/image_dense_captions.py +++ b/vbench/third_party/grit_src/image_dense_captions.py @@ -11,8 +11,8 @@ # sys.path.insert(0, f"{CUR_DIR}/../") # print(CUR_DIR) # sys.path.insert(0, os.path.join(CUR_DIR,'third_party/CenterNet2/projects/CenterNet2/')) -# from centernet.config import add_centernet_config from .centernet_config import add_centernet_config +# from .centernet2.centernet.config import add_centernet_config from .grit.config import add_grit_config from 
.grit.predictor import VisualizationDemo From 2771a46ae31e5543d5f873e191ad533ceedaaba7 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 29 Dec 2023 17:26:16 +0700 Subject: [PATCH 067/248] [fix] grit model --- dimension_to_folder.json | 32 +++++++++---------- .../centernet/modeling/backbone/bifpn.py | 4 +-- .../centernet/modeling/backbone/bifpn_fcos.py | 8 ++--- .../centernet/modeling/backbone/dla.py | 4 +-- .../centernet/modeling/backbone/dlafpn.py | 6 ++-- .../centernet/modeling/backbone/fpn_p5.py | 4 +-- .../centernet/modeling/backbone/res2net.py | 6 ++-- .../modeling/dense_heads/centernet.py | 2 +- .../modeling/meta_arch/centernet_detector.py | 2 +- .../modeling/roi_heads/custom_roi_heads.py | 4 +-- .../grit/data/custom_dataset_mapper.py | 2 +- .../grit_src/grit/data/transforms/__init__.py | 0 .../grit/modeling/roi_heads/grit_roi_heads.py | 2 +- 13 files changed, 38 insertions(+), 38 deletions(-) create mode 100644 vbench/third_party/grit_src/grit/data/transforms/__init__.py diff --git a/dimension_to_folder.json b/dimension_to_folder.json index 4504b68e..d45e99ae 100644 --- a/dimension_to_folder.json +++ b/dimension_to_folder.json @@ -1,18 +1,18 @@ { - "subject_consistency": "subject_consistency", - "background_consistency": "scene", - "aesthetic_quality": "overall_consistency", - "imaging_quality": "overall_consistency", - "object_class": "object_class", - "multiple_objects": "multiple_objects", - "color": "color", - "spatial_relationship": "spatial_relationship", - "scene": "scene", - "temporal_style": "temporal_style", - "overall_consistency": "overall_consistency", - "human_action": "human_action", - "temporal_flickering": "temporal_flickering", - "motion_smoothness": "subject_consistency", - "dynamic_degree": "subject_consistency", - "appearance_style": "appearance_style" + "subject_consistency": "subject_consistency", + "background_consistency": "scene", + "aesthetic_quality": "overall_consistency", + "imaging_quality": "overall_consistency", + "object_class": "object_class", + "multiple_objects": "multiple_objects", + "color": "color", + "spatial_relationship": "spatial_relationship", + "scene": "scene", + "temporal_style": "temporal_style", + "overall_consistency": "overall_consistency", + "human_action": "human_action", + "temporal_flickering": "temporal_flickering", + "motion_smoothness": "subject_consistency", + "dynamic_degree": "subject_consistency", + "appearance_style": "appearance_style" } diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py index a1797129..565e2940 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn.py @@ -377,7 +377,7 @@ def forward(self, x): return out -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -400,7 +400,7 @@ def build_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): ) return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p37_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py index 54c9fa27..bb93d73b 100755 --- 
a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/bifpn_fcos.py @@ -368,7 +368,7 @@ def _assert_strides_are_log2_contiguous(strides): ) -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -394,7 +394,7 @@ def build_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p35_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -419,7 +419,7 @@ def build_p35_fcos_resnet_bifpn_backbone(cfg, input_shape: ShapeSpec): return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p35_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -443,7 +443,7 @@ def build_p35_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): ) return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p37_fcos_dla_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py index 85e7a666..9f15f840 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dla.py @@ -421,7 +421,7 @@ def forward(self, x): return ret -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_dla_backbone(cfg, input_shape): """ Create a ResNet instance from config. @@ -456,7 +456,7 @@ def forward(self, c5): p7 = self.p7(F.relu(p6)) return [p6, p7] -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_retinanet_dla_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py index d2dfa52d..2a33c66b 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/dlafpn.py @@ -419,7 +419,7 @@ def forward(self, c5): return [p6, p7] -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_dla_fpn3_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -444,7 +444,7 @@ def build_dla_fpn3_backbone(cfg, input_shape: ShapeSpec): return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_dla_fpn5_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -471,7 +471,7 @@ def build_dla_fpn5_backbone(cfg, input_shape: ShapeSpec): return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_dlaup_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py index 34544547..cc4e7a49 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/fpn_p5.py @@ -33,7 +33,7 @@ def forward(self, c5): return [p6, p7] -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p67_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -55,7 +55,7 @@ def build_p67_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): ) return 
backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p35_resnet_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py index 20ea930b..0db04629 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/backbone/res2net.py @@ -666,7 +666,7 @@ def freeze(self, freeze_at=0): return self -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_res2net_backbone(cfg, input_shape): """ Create a Res2Net instance from config. @@ -755,7 +755,7 @@ def build_res2net_backbone(cfg, input_shape): return ResNet(stem, stages, out_features=out_features).freeze(freeze_at) -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_p67_res2net_fpn_backbone(cfg, input_shape: ShapeSpec): """ Args: @@ -778,7 +778,7 @@ def build_p67_res2net_fpn_backbone(cfg, input_shape: ShapeSpec): return backbone -# @BACKBONE_REGISTRY.register() +@BACKBONE_REGISTRY.register() def build_res2net_bifpn_backbone(cfg, input_shape: ShapeSpec): """ Args: diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py index 4794e002..ed05465a 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/dense_heads/centernet.py @@ -27,7 +27,7 @@ INF = 100000000 -# @PROPOSAL_GENERATOR_REGISTRY.register() +@PROPOSAL_GENERATOR_REGISTRY.register() class CenterNet(nn.Module): @configurable def __init__(self, diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py index eba0457a..b7525c7b 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/meta_arch/centernet_detector.py @@ -9,7 +9,7 @@ from detectron2.modeling import detector_postprocess from detectron2.structures import ImageList -# @META_ARCH_REGISTRY.register() +@META_ARCH_REGISTRY.register() class CenterNetDetector(nn.Module): def __init__(self, cfg): super().__init__() diff --git a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py index a01a8bf6..90fadf1a 100755 --- a/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py +++ b/vbench/third_party/grit_src/centernet2/centernet/modeling/roi_heads/custom_roi_heads.py @@ -19,7 +19,7 @@ from .custom_fast_rcnn import CustomFastRCNNOutputLayers -# @ROI_HEADS_REGISTRY.register() +@ROI_HEADS_REGISTRY.register() class CustomROIHeads(StandardROIHeads): @classmethod def _init_box_head(self, cfg, input_shape): @@ -67,7 +67,7 @@ def forward(self, images, features, proposals, targets=None): return pred_instances, {} -# @ROI_HEADS_REGISTRY.register() +@ROI_HEADS_REGISTRY.register() class CustomCascadeROIHeads(CascadeROIHeads): @classmethod def _init_box_head(self, cfg, input_shape): diff --git a/vbench/third_party/grit_src/grit/data/custom_dataset_mapper.py 
b/vbench/third_party/grit_src/grit/data/custom_dataset_mapper.py index 1e21edb3..0827c791 100755 --- a/vbench/third_party/grit_src/grit/data/custom_dataset_mapper.py +++ b/vbench/third_party/grit_src/grit/data/custom_dataset_mapper.py @@ -146,4 +146,4 @@ def __len__(self): return len(self.data) def __repr__(self): - return "ObjDescription({})".format(self.data) \ No newline at end of file + return "ObjDescription({})".format(self.data) diff --git a/vbench/third_party/grit_src/grit/data/transforms/__init__.py b/vbench/third_party/grit_src/grit/data/transforms/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py b/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py index 0dbce5ae..afb1325c 100755 --- a/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py +++ b/vbench/third_party/grit_src/grit/modeling/roi_heads/grit_roi_heads.py @@ -17,7 +17,7 @@ from ..text.load_text_token import LoadTextTokens from transformers import BertTokenizer -# from grit.data.custom_dataset_mapper import ObjDescription +from vbench.third_party.grit_src.grit.data.custom_dataset_mapper import ObjDescription from ..soft_nms import batched_soft_nms import logging From f8a6bd18a051b78d7e522689b4af27d26db997fa Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 29 Dec 2023 18:41:05 +0700 Subject: [PATCH 068/248] [fix]: normalize imaging quality score to [0,1] --- vbench/imaging_quality.py | 1 + 1 file changed, 1 insertion(+) diff --git a/vbench/imaging_quality.py b/vbench/imaging_quality.py index 86b72b78..e6b4dea3 100755 --- a/vbench/imaging_quality.py +++ b/vbench/imaging_quality.py @@ -18,6 +18,7 @@ def technical_quality(model, video_list, device): acc_score_video += float(score) video_results.append({'video_path': video_path, 'video_results': acc_score_video/len(images)}) average_score = sum([o['video_results'] for o in video_results]) / len(video_results) + average_score = average_score / 100. 
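+    # MUSIQ (the SPAQ-trained checkpoint used for this dimension) rates each frame on a 0-100 quality scale, so the division above maps the reported average into [0, 1]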
return average_score, video_results From 274a468f2a9c9747c7e442f6fe383c1ef78fff4b Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 29 Dec 2023 21:42:10 +0700 Subject: [PATCH 069/248] [fix]: import ObjDetection in subject_consistency --- vbench/human_action.py | 2 +- vbench/third_party/umt/models/__init__.py | 1 + .../umt/models/modeling_finetune.py | 18 +++++++++--------- 3 files changed, 11 insertions(+), 10 deletions(-) diff --git a/vbench/human_action.py b/vbench/human_action.py index de22ec51..c276f96d 100755 --- a/vbench/human_action.py +++ b/vbench/human_action.py @@ -15,7 +15,7 @@ ) from vbench.third_party.umt.datasets.volume_transforms import ClipToTensor from timm.models import create_model -# from vbench.third_party.umt.models import vit_large_patch16_224 +from vbench.third_party.umt.models.modeling_finetune import vit_large_patch16_224 from tqdm import tqdm def build_dict(): diff --git a/vbench/third_party/umt/models/__init__.py b/vbench/third_party/umt/models/__init__.py index 881731df..e7e31a76 100755 --- a/vbench/third_party/umt/models/__init__.py +++ b/vbench/third_party/umt/models/__init__.py @@ -1,4 +1,5 @@ from .clip import clip_b16, clip_l14, clip_l14_336 # from .modeling_finetune import vit_base_patch16_224, vit_base_patch16_384, vit_large_patch16_224, vit_large_patch16_384 +from .modeling_finetune import vit_large_patch16_224 from .modeling_pretrain_umt import pretrain_umt_base_patch16_224, pretrain_umt_large_patch16_224 from .modeling_pretrain import pretrain_videomae_base_patch16_224, pretrain_videomae_large_patch16_224, pretrain_videomae_huge_patch16_224 diff --git a/vbench/third_party/umt/models/modeling_finetune.py b/vbench/third_party/umt/models/modeling_finetune.py index 65353688..87edb146 100755 --- a/vbench/third_party/umt/models/modeling_finetune.py +++ b/vbench/third_party/umt/models/modeling_finetune.py @@ -344,15 +344,15 @@ def forward(self, x): # return model -# @register_model -# def vit_large_patch16_224(pretrained=False, **kwargs): -# kwargs.pop('pretrained_cfg', None) # added by Ziqi to accommodate timm=0.9.12 -# kwargs.pop('pretrained_cfg_overlay', None) # added by Ziqi to accommodate timm=0.9.12 -# model = VisionTransformer( -# patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, -# norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) -# model.default_cfg = _cfg() -# return model +@register_model +def vit_large_patch16_224(pretrained=False, **kwargs): + kwargs.pop('pretrained_cfg', None) # added by Ziqi to accommodate timm=0.9.12 + kwargs.pop('pretrained_cfg_overlay', None) # added by Ziqi to accommodate timm=0.9.12 + model = VisionTransformer( + patch_size=16, embed_dim=1024, depth=24, num_heads=16, mlp_ratio=4, qkv_bias=True, + norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs) + model.default_cfg = _cfg() + return model # @register_model From 9d1e853020a7cae800263c7556afb1a642117fa7 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 29 Dec 2023 22:40:42 +0700 Subject: [PATCH 070/248] [update]: update requirements && fix import --- requirements.txt | 3 +++ setup.cfg | 0 vbench/third_party/grit_src/grit/modeling/backbone/vit.py | 4 +--- 3 files changed, 4 insertions(+), 3 deletions(-) delete mode 100644 setup.cfg diff --git a/requirements.txt b/requirements.txt index 6a9ef8e5..863bd7d0 100644 --- a/requirements.txt +++ b/requirements.txt @@ -15,3 +15,6 @@ pyyaml easydict lvis fairscale +openai-clip +fvcore 
+easydict diff --git a/setup.cfg b/setup.cfg deleted file mode 100644 index e69de29b..00000000 diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py index 493cec49..3f7cacb4 100755 --- a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py +++ b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py @@ -10,9 +10,7 @@ from detectron2.modeling.backbone.build import BACKBONE_REGISTRY from detectron2.layers import ShapeSpec -import sys -sys.path.append('vbench/third_party/grit_src/centernet2') -from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 +from vbench.third_party.grit_src.centernet2.centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 import torch.utils.checkpoint as checkpoint from timm.models.layers import DropPath, Mlp, trunc_normal_ From a06572e535d78d7d5d4eb08342af6e2b8ec12ce2 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Sat, 30 Dec 2023 00:16:06 +0700 Subject: [PATCH 071/248] [fix]: import path --- vbench/cli/__init__.py | 4 +- vbench/third_party/RAFT/core/corr.py | 2 +- vbench/third_party/RAFT/core/raft.py | 3 - vbench/third_party/RAFT/evaluate.py | 197 ------------------ vbench/third_party/amt/utils/build_utils.py | 5 +- .../grit_src/centernet2/__init__.py | 1 - .../grit_src/grit/modeling/backbone/vit.py | 7 +- .../grit_src/image_dense_captions.py | 7 +- 8 files changed, 17 insertions(+), 209 deletions(-) delete mode 100644 vbench/third_party/RAFT/evaluate.py diff --git a/vbench/cli/__init__.py b/vbench/cli/__init__.py index 31b4bf0a..ffcc86d6 100644 --- a/vbench/cli/__init__.py +++ b/vbench/cli/__init__.py @@ -1,2 +1,4 @@ +import os import sys -sys.path.append('vbench/third_party') +CUR_DIR = os.path.dirname(os.path.abspath(__file__)) +sys.path.append(os.path.join(CUR_DIR, '../third_party')) diff --git a/vbench/third_party/RAFT/core/corr.py b/vbench/third_party/RAFT/core/corr.py index 55e70156..3839ba84 100644 --- a/vbench/third_party/RAFT/core/corr.py +++ b/vbench/third_party/RAFT/core/corr.py @@ -1,6 +1,6 @@ import torch import torch.nn.functional as F -from utils_core.utils import bilinear_sampler, coords_grid +from .utils_core.utils import bilinear_sampler, coords_grid try: import alt_cuda_corr diff --git a/vbench/third_party/RAFT/core/raft.py b/vbench/third_party/RAFT/core/raft.py index ba5d48f9..1d7404be 100644 --- a/vbench/third_party/RAFT/core/raft.py +++ b/vbench/third_party/RAFT/core/raft.py @@ -3,9 +3,6 @@ import torch.nn as nn import torch.nn.functional as F -import sys -sys.path.append('vbench/third_party/RAFT/core') - from .update import BasicUpdateBlock, SmallUpdateBlock from .extractor import BasicEncoder, SmallEncoder from .corr import CorrBlock, AlternateCorrBlock diff --git a/vbench/third_party/RAFT/evaluate.py b/vbench/third_party/RAFT/evaluate.py deleted file mode 100644 index 431a0f58..00000000 --- a/vbench/third_party/RAFT/evaluate.py +++ /dev/null @@ -1,197 +0,0 @@ -import sys -sys.path.append('core') - -from PIL import Image -import argparse -import os -import time -import numpy as np -import torch -import torch.nn.functional as F -import matplotlib.pyplot as plt - -import datasets -from utils import flow_viz -from utils import frame_utils - -from raft import RAFT -from utils.utils import InputPadder, forward_interpolate - - -@torch.no_grad() -def create_sintel_submission(model, iters=32, warm_start=False, output_path='sintel_submission'): - """ Create submission for the Sintel leaderboard """ - 
model.eval() - for dstype in ['clean', 'final']: - test_dataset = datasets.MpiSintel(split='test', aug_params=None, dstype=dstype) - - flow_prev, sequence_prev = None, None - for test_id in range(len(test_dataset)): - image1, image2, (sequence, frame) = test_dataset[test_id] - if sequence != sequence_prev: - flow_prev = None - - padder = InputPadder(image1.shape) - image1, image2 = padder.pad(image1[None].cuda(), image2[None].cuda()) - - flow_low, flow_pr = model(image1, image2, iters=iters, flow_init=flow_prev, test_mode=True) - flow = padder.unpad(flow_pr[0]).permute(1, 2, 0).cpu().numpy() - - if warm_start: - flow_prev = forward_interpolate(flow_low[0])[None].cuda() - - output_dir = os.path.join(output_path, dstype, sequence) - output_file = os.path.join(output_dir, 'frame%04d.flo' % (frame+1)) - - if not os.path.exists(output_dir): - os.makedirs(output_dir) - - frame_utils.writeFlow(output_file, flow) - sequence_prev = sequence - - -@torch.no_grad() -def create_kitti_submission(model, iters=24, output_path='kitti_submission'): - """ Create submission for the Sintel leaderboard """ - model.eval() - test_dataset = datasets.KITTI(split='testing', aug_params=None) - - if not os.path.exists(output_path): - os.makedirs(output_path) - - for test_id in range(len(test_dataset)): - image1, image2, (frame_id, ) = test_dataset[test_id] - padder = InputPadder(image1.shape, mode='kitti') - image1, image2 = padder.pad(image1[None].cuda(), image2[None].cuda()) - - _, flow_pr = model(image1, image2, iters=iters, test_mode=True) - flow = padder.unpad(flow_pr[0]).permute(1, 2, 0).cpu().numpy() - - output_filename = os.path.join(output_path, frame_id) - frame_utils.writeFlowKITTI(output_filename, flow) - - -@torch.no_grad() -def validate_chairs(model, iters=24): - """ Perform evaluation on the FlyingChairs (test) split """ - model.eval() - epe_list = [] - - val_dataset = datasets.FlyingChairs(split='validation') - for val_id in range(len(val_dataset)): - image1, image2, flow_gt, _ = val_dataset[val_id] - image1 = image1[None].cuda() - image2 = image2[None].cuda() - - _, flow_pr = model(image1, image2, iters=iters, test_mode=True) - epe = torch.sum((flow_pr[0].cpu() - flow_gt)**2, dim=0).sqrt() - epe_list.append(epe.view(-1).numpy()) - - epe = np.mean(np.concatenate(epe_list)) - print("Validation Chairs EPE: %f" % epe) - return {'chairs': epe} - - -@torch.no_grad() -def validate_sintel(model, iters=32): - """ Peform validation using the Sintel (train) split """ - model.eval() - results = {} - for dstype in ['clean', 'final']: - val_dataset = datasets.MpiSintel(split='training', dstype=dstype) - epe_list = [] - - for val_id in range(len(val_dataset)): - image1, image2, flow_gt, _ = val_dataset[val_id] - image1 = image1[None].cuda() - image2 = image2[None].cuda() - - padder = InputPadder(image1.shape) - image1, image2 = padder.pad(image1, image2) - - flow_low, flow_pr = model(image1, image2, iters=iters, test_mode=True) - flow = padder.unpad(flow_pr[0]).cpu() - - epe = torch.sum((flow - flow_gt)**2, dim=0).sqrt() - epe_list.append(epe.view(-1).numpy()) - - epe_all = np.concatenate(epe_list) - epe = np.mean(epe_all) - px1 = np.mean(epe_all<1) - px3 = np.mean(epe_all<3) - px5 = np.mean(epe_all<5) - - print("Validation (%s) EPE: %f, 1px: %f, 3px: %f, 5px: %f" % (dstype, epe, px1, px3, px5)) - results[dstype] = np.mean(epe_list) - - return results - - -@torch.no_grad() -def validate_kitti(model, iters=24): - """ Peform validation using the KITTI-2015 (train) split """ - model.eval() - val_dataset = 
datasets.KITTI(split='training') - - out_list, epe_list = [], [] - for val_id in range(len(val_dataset)): - image1, image2, flow_gt, valid_gt = val_dataset[val_id] - image1 = image1[None].cuda() - image2 = image2[None].cuda() - - padder = InputPadder(image1.shape, mode='kitti') - image1, image2 = padder.pad(image1, image2) - - flow_low, flow_pr = model(image1, image2, iters=iters, test_mode=True) - flow = padder.unpad(flow_pr[0]).cpu() - - epe = torch.sum((flow - flow_gt)**2, dim=0).sqrt() - mag = torch.sum(flow_gt**2, dim=0).sqrt() - - epe = epe.view(-1) - mag = mag.view(-1) - val = valid_gt.view(-1) >= 0.5 - - out = ((epe > 3.0) & ((epe/mag) > 0.05)).float() - epe_list.append(epe[val].mean().item()) - out_list.append(out[val].cpu().numpy()) - - epe_list = np.array(epe_list) - out_list = np.concatenate(out_list) - - epe = np.mean(epe_list) - f1 = 100 * np.mean(out_list) - - print("Validation KITTI: %f, %f" % (epe, f1)) - return {'kitti-epe': epe, 'kitti-f1': f1} - - -if __name__ == '__main__': - parser = argparse.ArgumentParser() - parser.add_argument('--model', help="restore checkpoint") - parser.add_argument('--dataset', help="dataset for evaluation") - parser.add_argument('--small', action='store_true', help='use small model') - parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision') - parser.add_argument('--alternate_corr', action='store_true', help='use efficent correlation implementation') - args = parser.parse_args() - - model = torch.nn.DataParallel(RAFT(args)) - model.load_state_dict(torch.load(args.model)) - - model.cuda() - model.eval() - - # create_sintel_submission(model.module, warm_start=True) - # create_kitti_submission(model.module) - - with torch.no_grad(): - if args.dataset == 'chairs': - validate_chairs(model.module) - - elif args.dataset == 'sintel': - validate_sintel(model.module) - - elif args.dataset == 'kitti': - validate_kitti(model.module) - - diff --git a/vbench/third_party/amt/utils/build_utils.py b/vbench/third_party/amt/utils/build_utils.py index c9cc0d3b..6e0c5f58 100755 --- a/vbench/third_party/amt/utils/build_utils.py +++ b/vbench/third_party/amt/utils/build_utils.py @@ -1,7 +1,8 @@ import importlib - +import os import sys -sys.path.append("vbench/third_party/amt") +CUR_DIR = os.path.dirname(os.path.abspath(__file__)) +sys.path.append(os.path.join(CUR_DIR, "../")) def base_build_fn(module, cls, params): diff --git a/vbench/third_party/grit_src/centernet2/__init__.py b/vbench/third_party/grit_src/centernet2/__init__.py index a618c479..e69de29b 100644 --- a/vbench/third_party/grit_src/centernet2/__init__.py +++ b/vbench/third_party/grit_src/centernet2/__init__.py @@ -1 +0,0 @@ -from .centernet.config import add_centernet_config diff --git a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py index 3f7cacb4..fd414242 100755 --- a/vbench/third_party/grit_src/grit/modeling/backbone/vit.py +++ b/vbench/third_party/grit_src/grit/modeling/backbone/vit.py @@ -10,7 +10,12 @@ from detectron2.modeling.backbone.build import BACKBONE_REGISTRY from detectron2.layers import ShapeSpec -from vbench.third_party.grit_src.centernet2.centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 +import os +import sys +CUR_DIR = os.path.dirname(os.path.abspath(__file__)) +sys.path.append(os.path.join(CUR_DIR, '../../../centernet2')) +from centernet.modeling.backbone.fpn_p5 import LastLevelP6P7_P5 + import torch.utils.checkpoint as checkpoint from timm.models.layers import DropPath, 
Mlp, trunc_normal_ diff --git a/vbench/third_party/grit_src/image_dense_captions.py b/vbench/third_party/grit_src/image_dense_captions.py index 293516c2..1bfe85c3 100755 --- a/vbench/third_party/grit_src/image_dense_captions.py +++ b/vbench/third_party/grit_src/image_dense_captions.py @@ -10,9 +10,10 @@ # sys.path.insert(0, f"{CUR_DIR}/../") # print(CUR_DIR) -# sys.path.insert(0, os.path.join(CUR_DIR,'third_party/CenterNet2/projects/CenterNet2/')) -from .centernet_config import add_centernet_config -# from .centernet2.centernet.config import add_centernet_config +import sys +sys.path.append(os.path.join(CUR_DIR, './centernet2/')) +from centernet.config import add_centernet_config + from .grit.config import add_grit_config from .grit.predictor import VisualizationDemo From 171b656a68070c55acb9cf415a866bcab5bc38ac Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Tue, 2 Jan 2024 14:32:26 +0800 Subject: [PATCH 072/248] [update] load_video func --- vbench/utils.py | 25 ++++++++----------------- 1 file changed, 8 insertions(+), 17 deletions(-) diff --git a/vbench/utils.py b/vbench/utils.py index e4df979b..460d3e81 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -128,18 +128,12 @@ def load_video(video_path, data_transform=None, num_frames=None, return_tensor=T frame = np.array(frame).astype(np.uint8) frame_ls.append(frame) buffer = np.array(frame_ls).astype(np.uint8) - if data_transform: - buffer = data_transform(buffer) - return buffer elif video_path.endswith('.png'): frame = Image.open(video_path) frame = frame.convert('RGB') frame = np.array(frame).astype(np.uint8) frame_ls = [frame] buffer = np.array(frame_ls) - if data_transform: - buffer = data_transform(buffer) - return buffer elif video_path.endswith('.mp4'): if width: video_reader = VideoReader(video_path, width=width, height=height, num_threads=1) @@ -148,25 +142,22 @@ def load_video(video_path, data_transform=None, num_frames=None, return_tensor=T frames = video_reader.get_batch(range(len(video_reader))) # (T, H, W, C), torch.uint8 buffer = frames.asnumpy().astype(np.uint8) - if num_frames: - frame_indices = get_frame_indices( - num_frames, len(buffer), sample="middle" - ) - buffer = buffer[frame_indices] - if data_transform: - buffer = data_transform(buffer) - return buffer else: raise NotImplementedError + frames = buffer - if return_tensor: - frames = torch.Tensor(buffer) - frames = frames.permute(0, 3, 1, 2) # (T, C, H, W), torch.uint8 if num_frames: frame_indices = get_frame_indices( num_frames, len(frames), sample="middle" ) frames = frames[frame_indices] + + if data_transform: + frames = data_transform(frames) + elif return_tensor: + frames = torch.Tensor(frames) + frames = frames.permute(0, 3, 1, 2) # (T, C, H, W), torch.uint8 + return frames def read_frames_decord_by_fps( From 85df893ea19571b6afd1dfbdc7c4aa45b0b2a5ad Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Tue, 2 Jan 2024 14:43:39 +0800 Subject: [PATCH 073/248] [add] frame extraction module --- vbench/dynamic_degree.py | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/vbench/dynamic_degree.py b/vbench/dynamic_degree.py index d6ce3ae8..34ef0af1 100644 --- a/vbench/dynamic_degree.py +++ b/vbench/dynamic_degree.py @@ -89,6 +89,8 @@ def check_move(self, score_list): def get_frames(self, video_path): frame_list = [] video = cv2.VideoCapture(video_path) + fps = video.get(cv2.CAP_PROP_FPS) # get fps + interval = round(fps/8) while video.isOpened(): success, frame = video.read() if success: @@ -100,7 +102,15 @@ def get_frames(self, video_path): break 
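+        # extract_frame() below keeps every `interval`-th frame, i.e. roughly 8 frames per second given interval = round(fps/8)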
video.release() assert frame_list != [] + frame_list = self.extract_frame(frame_list, interval) return frame_list + + + def extract_frame(self, frame_list, interval=1): + extract = [] + for i in range(0, len(frame_list), interval): + extract.append(frame_list[i]) + return extract def get_frames_from_img_folder(self, img_folder): From eff4d125e63e100c592c9cef5a07b55336abe227 Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Tue, 2 Jan 2024 14:51:46 +0800 Subject: [PATCH 074/248] [update] set num_frames --- vbench/color.py | 2 +- vbench/multiple_objects.py | 2 +- vbench/object_class.py | 2 +- vbench/scene.py | 2 +- vbench/spatial_relationship.py | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/vbench/color.py b/vbench/color.py index 78bfeb2d..a262cee0 100755 --- a/vbench/color.py +++ b/vbench/color.py @@ -55,7 +55,7 @@ def color(model, video_dict, device): object_info = info['prompt'] object_info = object_info.replace('a ','').replace('an ','').replace(color_info,'').strip() for video_path in info['video_list']: - video_arrays = load_video(video_path, return_tensor=False) + video_arrays = load_video(video_path, num_frames=16, return_tensor=False) cur_video_pred = get_dect_from_grit(model ,video_arrays) cur_object, cur_object_color = check_generate(color_info, object_info, cur_video_pred) if cur_object>0: diff --git a/vbench/multiple_objects.py b/vbench/multiple_objects.py index 6385969e..5370d522 100755 --- a/vbench/multiple_objects.py +++ b/vbench/multiple_objects.py @@ -42,7 +42,7 @@ def multiple_objects(model, video_dict, device): raise "Auxiliary info is not in json, please check your json." object_info = info['auxiliary_info']['object'] for video_path in info['video_list']: - video_tensor = load_video(video_path) + video_tensor = load_video(video_path, num_frames=16) cur_video_pred = get_dect_from_grit(model, video_tensor.permute(0,2,3,1)) cur_success_frame_count = check_generate(object_info, cur_video_pred) cur_success_frame_rate = cur_success_frame_count/len(cur_video_pred) diff --git a/vbench/object_class.py b/vbench/object_class.py index a3ed0614..a3b21b43 100755 --- a/vbench/object_class.py +++ b/vbench/object_class.py @@ -38,7 +38,7 @@ def object_class(model, video_dict, device): raise "Auxiliary info is not in json, please check your json." object_info = info['auxiliary_info']['object'] for video_path in info['video_list']: - video_tensor = load_video(video_path) + video_tensor = load_video(video_path, num_frames=16) cur_video_pred = get_dect_from_grit(model, video_tensor.permute(0,2,3,1)) cur_success_frame_count = check_generate(object_info, cur_video_pred) cur_success_frame_rate = cur_success_frame_count/len(cur_video_pred) diff --git a/vbench/scene.py b/vbench/scene.py index 8f276703..90bc8ac0 100755 --- a/vbench/scene.py +++ b/vbench/scene.py @@ -33,7 +33,7 @@ def scene(model, video_dict, device): raise "Auxiliary info is not in json, please check your json." 
scene_info = info['auxiliary_info']['scene'] for video_path in info['video_list']: - video_array = load_video(video_path, return_tensor=False, width=384, height=384) + video_array = load_video(video_path, num_frames=16, return_tensor=False, width=384, height=384) video_tensor_list = [] for i in video_array: video_tensor_list.append(transform(i).to(device).unsqueeze(0)) diff --git a/vbench/spatial_relationship.py b/vbench/spatial_relationship.py index 898b59a1..9f4051e1 100755 --- a/vbench/spatial_relationship.py +++ b/vbench/spatial_relationship.py @@ -111,7 +111,7 @@ def spatial_relationship(model, video_dict, device): raise "Auxiliary info is not in json, please check your json." object_info = info['auxiliary_info']['spatial_relationship'] for video_path in info['video_list']: - video_tensor = load_video(video_path) + video_tensor = load_video(video_path, num_frames=16) cur_video_pred = get_dect_from_grit(model, video_tensor.permute(0,2,3,1)) cur_video_frame_score = check_generate(object_info, cur_video_pred) cur_success_frame_rate = np.mean(cur_video_frame_score) From 73eb30ebbf61c89902dd09c30f3808b177c9d280 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 4 Jan 2024 00:45:13 +0800 Subject: [PATCH 075/248] [update] 16 T2V dimensions --- README.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 19ed3163..92198146 100755 --- a/README.md +++ b/README.md @@ -18,7 +18,8 @@ This repository contains the implementation of the following paper: We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench can provide valuable insights from multiple perspectives. ## :fire: Updates -- [11/2023] Evaluation code for released for this list of dimensions: `['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` +- [12/2023] Evaluation code released for 16 Text-to-Video (T2V) evaluation dimensions. + - `['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` - [11/2023] Prompt Suites released. 
(See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts)) ## :hammer: Installation From bc67b4859d2d5d5736bfc3f094125339ca5f0a1b Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 4 Jan 2024 00:49:54 +0800 Subject: [PATCH 076/248] [update] README 16 T2V dimensions --- README.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 1f655088..e8a408f6 100755 --- a/README.md +++ b/README.md @@ -18,7 +18,8 @@ This repository contains the implementation of the following paper: We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench can provide valuable insights from multiple perspectives. ## :fire: Updates -- [11/2023] Evaluation code for released for this list of dimensions: `['subject_consistency', 'background_consistency', 'aesthetic_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` +- [12/2023] Evaluation code released for 16 Text-to-Video (T2V) evaluation dimensions. + - `['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` - [11/2023] Prompt Suites released. 
(See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts)) ## :hammer: Installation From ee39091dd99b00e3c992312e31d0ee764cea79f0 Mon Sep 17 00:00:00 2001 From: zhangfan-p Date: Fri, 5 Jan 2024 16:02:29 +0800 Subject: [PATCH 077/248] [update] add logger&check_and_move func --- static_filter.py | 24 ++++++++++++++++++++++-- 1 file changed, 22 insertions(+), 2 deletions(-) diff --git a/static_filter.py b/static_filter.py index 3d47c7b2..b3312b10 100644 --- a/static_filter.py +++ b/static_filter.py @@ -6,10 +6,14 @@ import torch from tqdm import tqdm import json +import shutil from vbench.third_party.RAFT.core.raft import RAFT from vbench.third_party.RAFT.core.utils_core.utils import InputPadder +import logging +logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(levelname)s - %(message)s') +logger = logging.getLogger(__name__) DEVICE = 'cuda' @@ -99,6 +103,19 @@ def get_frames(self, video_path): return frame_list +def check_and_move(args, filter_results, target_path=None): + if target_path is None: + target_path = os.path.join(args.result_path, "filtered_videos") + os.makedirs(target_path, exist_ok=True) + for prompt, v in filter_results.items(): + if v["static_count"] < 5: + logger.warning(f"Prompt: '{prompt}' has fewer than 5 filter results.") + for i, video_path in enumerate(v["static_path"]): + target_name = os.path.join(target_path, f"{prompt}-{i}.mp4") + shutil.copy(video_path, target_name) + logger.info(f"All filtered videos are saved in the '{target_path}' path") + + def filter_static(args): static_filter = StaticFilter(args, device=DEVICE) prompt_dict = {} @@ -116,8 +133,11 @@ def filter_static(args): prompt_dict[name]["static_count"] += 1 prompt_dict[name]["static_path"].append(path) os.makedirs(args.result_path, exist_ok=True) - json.dump(prompt_dict, open(os.path.join(args.result_path, args.store_name), "w")) - + info_file = os.path.join(args.result_path, args.store_name) + json.dump(prompt_dict, open(info_file, "w")) + logger.info(f"Filtered results info is saved in the '{info_file}' file") + check_and_move(args, prompt_dict) + if __name__ == '__main__': parser = argparse.ArgumentParser() From d522b5149689754f19b66eb83750f66a9395211b Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Sat, 6 Jan 2024 15:25:23 +0700 Subject: [PATCH 078/248] [fix]: cache directory --- vbench/aesthetic_quality.py | 2 +- vbench/background_consistency.py | 3 +- vbench/overall_consistency.py | 4 +-- vbench/temporal_style.py | 2 +- vbench/third_party/ViCLIP/viclip.py | 2 +- vbench/utils.py | 55 +++++++++++++++-------------- 6 files changed, 34 insertions(+), 34 deletions(-) diff --git a/vbench/aesthetic_quality.py b/vbench/aesthetic_quality.py index 78d79618..28a99680 100755 --- a/vbench/aesthetic_quality.py +++ b/vbench/aesthetic_quality.py @@ -26,7 +26,7 @@ def get_aesthetic_model(cache_folder): urlretrieve(url_model, path_to_model) # unable to download https://github.com/LAION-AI/aesthetic-predictor/blob/main/sa_0_4_vit_l_14_linear.pth?raw=true to pretrained/aesthetic_model/emb_reader/sa_0_4_vit_l_14_linear.pth except: print(f'unable to download {url_model} to {path_to_model} using urlretrieve, trying wget') - os.system(f"wget {url_model} -O {path_to_model}") + os.system(f"wget {url_model} -P {os.path.dirname(path_to_model)}") m = nn.Linear(768, 1) s = torch.load(path_to_model) m.load_state_dict(s) diff --git a/vbench/background_consistency.py b/vbench/background_consistency.py index 3ce0ea40..c6ad2b06 100755 --- 
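+# check_and_move copies the videos judged static into <result_path>/filtered_videos (or an explicit target_path) and warns when a prompt has fewer than 5 such videos.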
a/vbench/background_consistency.py +++ b/vbench/background_consistency.py @@ -1,5 +1,6 @@ import os import json +import logging import numpy as np import clip from PIL import Image @@ -51,8 +52,6 @@ def background_consistency(clip_model, preprocess, video_list, device, read_fram def compute_background_consistency(json_dir, device, submodules_list): vit_path, read_frame = submodules_list[0], submodules_list[1] - if not os.path.isfile(vit_path): - os.system(f'wget -q --show-progress https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_path}') clip_model, preprocess = clip.load(vit_path, device=device) video_list, _ = load_dimension_info(json_dir, dimension='background_consistency', lang='en') all_results, video_results = background_consistency(clip_model, preprocess, video_list, device, read_frame) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index 1b4ae6fe..eaa2b646 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -6,9 +6,9 @@ import torch import clip from tqdm import tqdm -from vbench.third_party.ViCLIP.viclip import ViCLIP -from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps +from vbench.third_party.ViCLIP.viclip import ViCLIP + from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: diff --git a/vbench/temporal_style.py b/vbench/temporal_style.py index 4fae32f9..7a30e5ca 100755 --- a/vbench/temporal_style.py +++ b/vbench/temporal_style.py @@ -6,9 +6,9 @@ import torch import clip from tqdm import tqdm +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps from vbench.third_party.ViCLIP.viclip import ViCLIP from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer -from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: diff --git a/vbench/third_party/ViCLIP/viclip.py b/vbench/third_party/ViCLIP/viclip.py index 9098de86..cc5e24d4 100644 --- a/vbench/third_party/ViCLIP/viclip.py +++ b/vbench/third_party/ViCLIP/viclip.py @@ -221,4 +221,4 @@ def get_vid_features(self, input_frames): def get_predict_label(self, clip_feature, text_feats_tensor, top=5): label_probs = (100.0 * clip_feature @ text_feats_tensor.T).softmax(dim=-1) top_probs, top_labels = label_probs.cpu().topk(top, dim=-1) - return top_probs, top_labels \ No newline at end of file + return top_probs, top_labels diff --git a/vbench/utils.py b/vbench/utils.py index e4df979b..f609f97d 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -19,7 +19,6 @@ logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') logger = logging.getLogger(__name__) - def clip_transform(n_px): return Compose([ Resize(n_px, interpolation=BICUBIC), @@ -226,31 +225,33 @@ def load_dimension_info(json_dir, dimension, lang): prompt_dict_ls += [{'prompt': prompt, 'video_list': cur_video_list}] return video_list, prompt_dict_ls -def init_submodules(dimension_list, local=False, read_frame=False): +def init_submodules(dimension_list, local=False, read_frame=False, cache_dir=os.path.join(os.path.expanduser('~'), '.cache', 
'vbench')): submodules_dict = {} if local: logger.info("\x1b[32m[Local Mode]\x1b[0m Working in local mode, please make sure that the pre-trained model has been fully downloaded.") for dimension in dimension_list: + os.makedirs(cache_dir, exist_ok=True) if dimension == 'background_consistency': # read_frame = False if local: - vit_b_path = 'pretrained/clip_model/ViT-B-32.pt' + vit_b_path = f'{cache_dir}/clip_model/ViT-B-32.pt' if not os.path.isfile(vit_b_path): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {vit_b_path}') + os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -P {os.path.dirname(vit_b_path)}') else: vit_b_path = 'ViT-B/32' + submodules_dict[dimension] = [vit_b_path, read_frame] elif dimension == 'human_action': - umt_path = "pretrained/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth" + umt_path = f'{cache_dir}/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth' if not os.path.isfile(umt_path): - os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -O {umt_path}') + os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -P {os.path.dirname(umt_path)}') submodules_dict[dimension] = [umt_path,] elif dimension == 'temporal_flickering': submodules_dict[dimension] = [] elif dimension == 'motion_smoothness': submodules_dict[dimension] = { - 'config':'pretrained/amt_model/AMT-S.yaml', - 'ckpt':'pretrained/amt_model/amt-s.pth' + 'config': f'{cache_dir}/amt_model/AMT-S.yaml', + 'ckpt': f'{cache_dir}/amt_model/amt-s.pth' } details = submodules_dict[dimension] # Check if the file exists, if not, download it with wget @@ -262,21 +263,21 @@ def init_submodules(dimension_list, local=False, read_frame=False): elif dimension == 'dynamic_degree': submodules_dict[dimension] = { - 'model':'pretrained/raft_model/models/raft-things.pth' + 'model': f'{cache_dir}/raft_model/models/raft-things.pth' } details = submodules_dict[dimension] if not os.path.isfile(details['model']): # raise NotImplementedError print(f"File {details['model']} does not exist. 
Downloading...") - os.system(f'wget -P pretrained/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip') - os.system(f'unzip -d pretrained/raft_model/ pretrained/raft_model/models.zip') - os.system(f'rm -f pretrained/raft_model/models.zip') + os.system(f'wget -P {cache_dir}/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip') + os.system(f'unzip -d {cache_dir}/raft_model/ {cache_dir}/raft_model/models.zip') + os.system(f'rm -f {cache_dir}/raft_model/models.zip') # Assign the DINO model path for subject consistency dimension elif dimension == 'subject_consistency': if local: submodules_dict[dimension] = { - 'repo_or_dir':'pretrained/dino_model/facebookresearch_dino_main/', - 'path':'pretrained/dino_model/dino_vitbase16_pretrain.pth', + 'repo_or_dir': f'{cache_dir}/dino_model/facebookresearch_dino_main/', + 'path': f'{cache_dir}/dino_model/dino_vitbase16_pretrain.pth', 'model': 'dino_vitb16', 'source': 'local', 'read_frame': read_frame @@ -300,46 +301,46 @@ def init_submodules(dimension_list, local=False, read_frame=False): 'read_frame': read_frame } elif dimension == 'aesthetic_quality': - aes_path = "pretrained/aesthetic_model/emb_reader" + aes_path = f'{cache_dir}/aesthetic_model/emb_reader' if local: - vit_l_path = 'pretrained/clip_model/ViT-L-14.pt' + vit_l_path = f'{cache_dir}/clip_model/ViT-L-14.pt' if not os.path.isfile(vit_l_path): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -O {vit_l_path}') + os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -P {os.path.dirname(vit_l_path)}') else: vit_l_path = 'ViT-L/14' submodules_dict[dimension] = [vit_l_path, aes_path] elif dimension == 'imaging_quality': - musiq_spaq_path = 'pretrained/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' + musiq_spaq_path = f'{cache_dir}/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' if not os.path.isfile(musiq_spaq_path): - os.system(f'wget -q https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -O {musiq_spaq_path}') + os.system(f'wget -q https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -P {os.path.dirname(musiq_spaq_path)}') submodules_dict[dimension] = {'model_path': musiq_spaq_path} elif dimension in ["object_class", "multiple_objects", "color", "spatial_relationship" ]: submodules_dict[dimension] = { - "model_weight": "pretrained/grit_model/grit_b_densecap_objectdet.pth" + "model_weight": f'{cache_dir}/grit_model/grit_b_densecap_objectdet.pth' } if not os.path.exists(submodules_dict[dimension]['model_weight']): - os.system(f'wget -q https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -O {submodules_dict[dimension]["model_weight"]}') + os.system(f'wget -q https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -P {os.path.dirname(submodules_dict[dimension]["model_weight"])}') elif dimension == 'scene': submodules_dict[dimension] = { - "pretrained": "pretrained/caption_model/tag2text_swin_14m.pth", + "pretrained": f'{cache_dir}/caption_model/tag2text_swin_14m.pth', "image_size":384, "vit":"swin_b" } if not os.path.exists(submodules_dict[dimension]['pretrained']): - os.system(f'wget https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth -O {submodules_dict[dimension]["pretrained"]}') + 
os.system(f'wget https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth -P {os.path.dirname(submodules_dict[dimension]["pretrained"])}') elif dimension == 'appearance_style': if local: - submodules_dict[dimension] = {"name":'pretrained/clip_model/ViT-B-32.pt'} + submodules_dict[dimension] = {"name": f'{cache_dir}/clip_model/ViT-B-32.pt'} if not os.path.isfile(submodules_dict[dimension]["name"]): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -O {submodules_dict[dimension]["name"]}') + os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -P {os.path.dirname(submodules_dict[dimension]["name"])}') else: submodules_dict[dimension] = {"name": 'ViT-B/32'} elif dimension in ["temporal_style", "overall_consistency"]: submodules_dict[dimension] = { - "pretrain": "pretrained/viclip_model/ViClip-InternVid-10M-FLT.pth", + "pretrain": f'{cache_dir}/viclip_model/ViClip-InternVid-10M-FLT.pth', } if not os.path.exists(submodules_dict[dimension]['pretrain']): - os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -O {submodules_dict[dimension]["pretrain"]}') + os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -P {os.path.dirname(submodules_dict[dimension]["pretrain"])}') return submodules_dict From 46faff45487962c2aab2fa9c4291a1fdd3a38526 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Mon, 8 Jan 2024 19:10:13 +0700 Subject: [PATCH 079/248] [fix]: indentation --- vbench/overall_consistency.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index eaa2b646..c7c0bc42 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -8,7 +8,7 @@ from tqdm import tqdm from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps from vbench.third_party.ViCLIP.viclip import ViCLIP - from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer +from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer def get_text_features(model, input_text, tokenizer, text_feature_dict={}): if input_text in text_feature_dict: From b71ae1c100e192446712e8fc514391e7f8f4a8f0 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 10 Jan 2024 15:42:23 +0800 Subject: [PATCH 080/248] [update]: cache dir structure and change os.system -> subprocess.run --- vbench/aesthetic_quality.py | 4 +- vbench/overall_consistency.py | 5 +- vbench/temporal_style.py | 5 +- vbench/third_party/ViCLIP/simple_tokenizer.py | 11 +-- vbench/utils.py | 71 ++++++++++++------- 5 files changed, 57 insertions(+), 39 deletions(-) diff --git a/vbench/aesthetic_quality.py b/vbench/aesthetic_quality.py index 28a99680..210f1e27 100755 --- a/vbench/aesthetic_quality.py +++ b/vbench/aesthetic_quality.py @@ -6,6 +6,7 @@ import torch import torch.nn as nn import torch.nn.functional as F +import subprocess from urllib.request import urlretrieve from vbench.utils import load_video, load_dimension_info, clip_transform from tqdm import tqdm @@ -26,7 +27,8 @@ def get_aesthetic_model(cache_folder): urlretrieve(url_model, path_to_model) # unable to download 
https://github.com/LAION-AI/aesthetic-predictor/blob/main/sa_0_4_vit_l_14_linear.pth?raw=true to pretrained/aesthetic_model/emb_reader/sa_0_4_vit_l_14_linear.pth except: print(f'unable to download {url_model} to {path_to_model} using urlretrieve, trying wget') - os.system(f"wget {url_model} -P {os.path.dirname(path_to_model)}") + wget_command = ['wget', url_model, '-P', os.path.dirname(path_to_model)] + subprocess.run(wget_command) m = nn.Linear(768, 1) s = torch.load(path_to_model) m.load_state_dict(s) diff --git a/vbench/overall_consistency.py b/vbench/overall_consistency.py index c7c0bc42..66944979 100755 --- a/vbench/overall_consistency.py +++ b/vbench/overall_consistency.py @@ -1,12 +1,11 @@ import os -CUR_DIR = os.path.dirname(os.path.abspath(__file__)) import json import numpy as np import torch import clip from tqdm import tqdm -from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps, CACHE_DIR from vbench.third_party.ViCLIP.viclip import ViCLIP from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer @@ -55,7 +54,7 @@ def overall_consistency(clip_model, video_dict, tokenizer, device, sample="middl return avg_score, video_results def compute_overall_consistency(json_dir, device, submodules_list): - tokenizer = SimpleTokenizer(os.path.join(CUR_DIR,"third_party/ViCLIP/bpe_simple_vocab_16e6.txt.gz")) + tokenizer = SimpleTokenizer(os.path.join(CACHE_DIR, "ViCLIP/bpe_simple_vocab_16e6.txt.gz")) viclip = ViCLIP(tokenizer= tokenizer, **submodules_list).to(device) _, video_dict = load_dimension_info(json_dir, dimension='overall_consistency', lang='en') all_results, video_results = overall_consistency(viclip, video_dict, tokenizer, device) diff --git a/vbench/temporal_style.py b/vbench/temporal_style.py index 7a30e5ca..6066595b 100755 --- a/vbench/temporal_style.py +++ b/vbench/temporal_style.py @@ -1,12 +1,11 @@ import os -CUR_DIR = os.path.dirname(os.path.abspath(__file__)) import json import numpy as np import torch import clip from tqdm import tqdm -from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps +from vbench.utils import load_video, load_dimension_info, clip_transform, read_frames_decord_by_fps, CACHE_DIR from vbench.third_party.ViCLIP.viclip import ViCLIP from vbench.third_party.ViCLIP.simple_tokenizer import SimpleTokenizer @@ -56,7 +55,7 @@ def temporal_style(clip_model, video_dict, tokenizer, device, sample="middle"): return avg_score, video_results def compute_temporal_style(json_dir, device, submodules_list): - tokenizer = SimpleTokenizer(os.path.join(CUR_DIR,"third_party/ViCLIP/bpe_simple_vocab_16e6.txt.gz")) + tokenizer = SimpleTokenizer(os.path.join(CACHE_DIR, "ViCLIP/bpe_simple_vocab_16e6.txt.gz")) viclip = ViCLIP(tokenizer= tokenizer, **submodules_list).to(device) _, video_dict = load_dimension_info(json_dir, dimension='temporal_style', lang='en') all_results, video_results = temporal_style(viclip, video_dict, tokenizer, device) diff --git a/vbench/third_party/ViCLIP/simple_tokenizer.py b/vbench/third_party/ViCLIP/simple_tokenizer.py index 0b6bf050..5894b95b 100644 --- a/vbench/third_party/ViCLIP/simple_tokenizer.py +++ b/vbench/third_party/ViCLIP/simple_tokenizer.py @@ -1,17 +1,18 @@ import gzip import html import os +import subprocess from functools import lru_cache - import ftfy import regex as re +from vbench.utils import CACHE_DIR - -@lru_cache() def 
default_bpe(): - tokenizer_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), "bpe_simple_vocab_16e6.txt.gz") + tokenizer_file = os.path.join(CACHE_DIR, "ViCLIP/bpe_simple_vocab_16e6.txt.gz") + print(f'save tokenizer_file to {tokenizer_file}') if not os.path.exists(tokenizer_file): - os.system(f'wget https://raw.githubusercontent.com/openai/CLIP/main/clip/bpe_simple_vocab_16e6.txt.gz -O {tokenizer_file}') + wget_command = ['wget', 'https://raw.githubusercontent.com/openai/CLIP/main/clip/bpe_simple_vocab_16e6.txt.gz', '-O', tokenizer_file] + subprocess.run(wget_command) return tokenizer_file diff --git a/vbench/utils.py b/vbench/utils.py index f609f97d..9e2c4114 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -16,6 +16,10 @@ BICUBIC = Image.BICUBIC BILINEAR = Image.BILINEAR +CACHE_DIR = os.environ.get('VBENCH_CACHE_DIR') +if CACHE_DIR is None: + CACHE_DIR = os.path.join(os.path.expanduser('~'), '.cache', 'vbench') + logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s') logger = logging.getLogger(__name__) @@ -225,33 +229,35 @@ def load_dimension_info(json_dir, dimension, lang): prompt_dict_ls += [{'prompt': prompt, 'video_list': cur_video_list}] return video_list, prompt_dict_ls -def init_submodules(dimension_list, local=False, read_frame=False, cache_dir=os.path.join(os.path.expanduser('~'), '.cache', 'vbench')): +def init_submodules(dimension_list, local=False, read_frame=False): submodules_dict = {} if local: logger.info("\x1b[32m[Local Mode]\x1b[0m Working in local mode, please make sure that the pre-trained model has been fully downloaded.") for dimension in dimension_list: - os.makedirs(cache_dir, exist_ok=True) + os.makedirs(CACHE_DIR, exist_ok=True) if dimension == 'background_consistency': # read_frame = False if local: - vit_b_path = f'{cache_dir}/clip_model/ViT-B-32.pt' + vit_b_path = f'{CACHE_DIR}/clip_model/ViT-B-32.pt' if not os.path.isfile(vit_b_path): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -P {os.path.dirname(vit_b_path)}') + wget_command = ['wget', 'https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt', '-P', os.path.dirname(vit_b_path)] + subprocess.run(wget_command) else: vit_b_path = 'ViT-B/32' submodules_dict[dimension] = [vit_b_path, read_frame] elif dimension == 'human_action': - umt_path = f'{cache_dir}/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth' + umt_path = f'{CACHE_DIR}/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth' if not os.path.isfile(umt_path): - os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -P {os.path.dirname(umt_path)}') + wget_command = ['wget', 'https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth', '-P', os.path.dirname(umt_path)] + subprocess.run(wget_command) submodules_dict[dimension] = [umt_path,] elif dimension == 'temporal_flickering': submodules_dict[dimension] = [] elif dimension == 'motion_smoothness': submodules_dict[dimension] = { - 'config': f'{cache_dir}/amt_model/AMT-S.yaml', - 'ckpt': f'{cache_dir}/amt_model/amt-s.pth' + 'config': f'{CACHE_DIR}/amt_model/AMT-S.yaml', + 'ckpt': f'{CACHE_DIR}/amt_model/amt-s.pth' } details = submodules_dict[dimension] # Check if the file exists, if not, download it with wget @@ -263,21 +269,27 @@ def 
init_submodules(dimension_list, local=False, read_frame=False, cache_dir=os. elif dimension == 'dynamic_degree': submodules_dict[dimension] = { - 'model': f'{cache_dir}/raft_model/models/raft-things.pth' + 'model': f'{CACHE_DIR}/raft_model/models/raft-things.pth' } details = submodules_dict[dimension] if not os.path.isfile(details['model']): # raise NotImplementedError print(f"File {details['model']} does not exist. Downloading...") - os.system(f'wget -P {cache_dir}/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip') - os.system(f'unzip -d {cache_dir}/raft_model/ {cache_dir}/raft_model/models.zip') - os.system(f'rm -f {cache_dir}/raft_model/models.zip') + wget_command = ['wget', '-P', f'{CACHE_DIR}/raft_model/', 'https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip'] + unzip_command = ['unzip', 'd', f'{CACHE_DIR}/raft_model/', f'{CACHE_DIR}/raft_model/models.zip'] + remove_command = ['rm', '-r', f'{CACHE_DIR}/raft_model/models.zip'] + try: + subprocess.run(wget_command, check=True) + subprocess.run(unzip_command, check=True) + subprocess.run(remove_command, check=True) + except subprocess.CalledProcessError as err: + print(f"Error during downloading RAFT model: {err}") # Assign the DINO model path for subject consistency dimension elif dimension == 'subject_consistency': if local: submodules_dict[dimension] = { - 'repo_or_dir': f'{cache_dir}/dino_model/facebookresearch_dino_main/', - 'path': f'{cache_dir}/dino_model/dino_vitbase16_pretrain.pth', + 'repo_or_dir': f'{CACHE_DIR}/dino_model/facebookresearch_dino_main/', + 'path': f'{CACHE_DIR}/dino_model/dino_vitbase16_pretrain.pth', 'model': 'dino_vitb16', 'source': 'local', 'read_frame': read_frame @@ -301,46 +313,51 @@ def init_submodules(dimension_list, local=False, read_frame=False, cache_dir=os. 
'read_frame': read_frame } elif dimension == 'aesthetic_quality': - aes_path = f'{cache_dir}/aesthetic_model/emb_reader' + aes_path = f'{CACHE_DIR}/aesthetic_model/emb_reader' if local: - vit_l_path = f'{cache_dir}/clip_model/ViT-L-14.pt' + vit_l_path = f'{CACHE_DIR}/clip_model/ViT-L-14.pt' if not os.path.isfile(vit_l_path): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -P {os.path.dirname(vit_l_path)}') + wget_command = ['wget' ,'https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt', '-P', os.path.dirname(vit_l_path)] + subprocess.run(wget_command) else: vit_l_path = 'ViT-L/14' submodules_dict[dimension] = [vit_l_path, aes_path] elif dimension == 'imaging_quality': - musiq_spaq_path = f'{cache_dir}/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' + musiq_spaq_path = f'{CACHE_DIR}/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' if not os.path.isfile(musiq_spaq_path): - os.system(f'wget -q https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -P {os.path.dirname(musiq_spaq_path)}') + wget_command = ['wget', 'https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth', '-P', os.path.dirname(musiq_spaq_path)] + subprocess.run(wget_command) submodules_dict[dimension] = {'model_path': musiq_spaq_path} elif dimension in ["object_class", "multiple_objects", "color", "spatial_relationship" ]: submodules_dict[dimension] = { - "model_weight": f'{cache_dir}/grit_model/grit_b_densecap_objectdet.pth' + "model_weight": f'{CACHE_DIR}/grit_model/grit_b_densecap_objectdet.pth' } if not os.path.exists(submodules_dict[dimension]['model_weight']): - os.system(f'wget -q https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -P {os.path.dirname(submodules_dict[dimension]["model_weight"])}') + wget_command = ['wget', 'https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth', '-P', os.path.dirname(submodules_dict[dimension]["model_weight"])] + subprocess.run(wget_command) elif dimension == 'scene': submodules_dict[dimension] = { - "pretrained": f'{cache_dir}/caption_model/tag2text_swin_14m.pth', + "pretrained": f'{CACHE_DIR}/caption_model/tag2text_swin_14m.pth', "image_size":384, "vit":"swin_b" } if not os.path.exists(submodules_dict[dimension]['pretrained']): - os.system(f'wget https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth -P {os.path.dirname(submodules_dict[dimension]["pretrained"])}') + wget_command = ['wget', 'https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth', '-P', os.path.dirname(submodules_dict[dimension]["pretrained"])] elif dimension == 'appearance_style': if local: - submodules_dict[dimension] = {"name": f'{cache_dir}/clip_model/ViT-B-32.pt'} + submodules_dict[dimension] = {"name": f'{CACHE_DIR}/clip_model/ViT-B-32.pt'} if not os.path.isfile(submodules_dict[dimension]["name"]): - os.system(f'wget -q https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -P {os.path.dirname(submodules_dict[dimension]["name"])}') + wget_command = ['wget', 'https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt', '-P', os.path.dirname(submodules_dict[dimension]["name"])] + 
subprocess.run(wget_command) else: submodules_dict[dimension] = {"name": 'ViT-B/32'} elif dimension in ["temporal_style", "overall_consistency"]: submodules_dict[dimension] = { - "pretrain": f'{cache_dir}/viclip_model/ViClip-InternVid-10M-FLT.pth', + "pretrain": f'{CACHE_DIR}/ViCLIP/ViClip-InternVid-10M-FLT.pth', } if not os.path.exists(submodules_dict[dimension]['pretrain']): - os.system(f'wget -q https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -P {os.path.dirname(submodules_dict[dimension]["pretrain"])}') + wget_command = ['wget', 'https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth', '-P', os.path.dirname(submodules_dict[dimension]["pretrain"])] + subprocess.run(wget_command) return submodules_dict From 5680061e0f823fd3e48ba71941b49d948739a58d Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 10 Jan 2024 19:23:27 +0800 Subject: [PATCH 081/248] [update]: add check condition in subprocess --- vbench/third_party/ViCLIP/simple_tokenizer.py | 2 +- vbench/utils.py | 23 ++++++++++--------- 2 files changed, 13 insertions(+), 12 deletions(-) diff --git a/vbench/third_party/ViCLIP/simple_tokenizer.py b/vbench/third_party/ViCLIP/simple_tokenizer.py index 5894b95b..0b46f0c6 100644 --- a/vbench/third_party/ViCLIP/simple_tokenizer.py +++ b/vbench/third_party/ViCLIP/simple_tokenizer.py @@ -11,7 +11,7 @@ def default_bpe(): tokenizer_file = os.path.join(CACHE_DIR, "ViCLIP/bpe_simple_vocab_16e6.txt.gz") print(f'save tokenizer_file to {tokenizer_file}') if not os.path.exists(tokenizer_file): - wget_command = ['wget', 'https://raw.githubusercontent.com/openai/CLIP/main/clip/bpe_simple_vocab_16e6.txt.gz', '-O', tokenizer_file] + wget_command = ['wget', 'https://raw.githubusercontent.com/openai/CLIP/main/clip/bpe_simple_vocab_16e6.txt.gz', '-P', os.path.dirname(tokenizer_file)] subprocess.run(wget_command) return tokenizer_file diff --git a/vbench/utils.py b/vbench/utils.py index 9e2c4114..3477c6e5 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -241,7 +241,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): vit_b_path = f'{CACHE_DIR}/clip_model/ViT-B-32.pt' if not os.path.isfile(vit_b_path): wget_command = ['wget', 'https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt', '-P', os.path.dirname(vit_b_path)] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) else: vit_b_path = 'ViT-B/32' @@ -250,7 +250,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): umt_path = f'{CACHE_DIR}/umt_model/l16_ptk710_ftk710_ftk400_f16_res224.pth' if not os.path.isfile(umt_path): wget_command = ['wget', 'https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth', '-P', os.path.dirname(umt_path)] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) submodules_dict[dimension] = [umt_path,] elif dimension == 'temporal_flickering': submodules_dict[dimension] = [] @@ -265,7 +265,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): print(f"File {details['ckpt']} does not exist. 
Downloading...") wget_command = ['wget', '-P', os.path.dirname(details['ckpt']), 'https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth'] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) elif dimension == 'dynamic_degree': submodules_dict[dimension] = { @@ -276,7 +276,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): # raise NotImplementedError print(f"File {details['model']} does not exist. Downloading...") wget_command = ['wget', '-P', f'{CACHE_DIR}/raft_model/', 'https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip'] - unzip_command = ['unzip', 'd', f'{CACHE_DIR}/raft_model/', f'{CACHE_DIR}/raft_model/models.zip'] + unzip_command = ['unzip', '-d', f'{CACHE_DIR}/raft_model/', f'{CACHE_DIR}/raft_model/models.zip'] remove_command = ['rm', '-r', f'{CACHE_DIR}/raft_model/models.zip'] try: subprocess.run(wget_command, check=True) @@ -298,13 +298,13 @@ def init_submodules(dimension_list, local=False, read_frame=False): # Check if the file exists, if not, download it with wget if not os.path.isdir(details['repo_or_dir']): print(f"Directory {details['repo_or_dir']} does not exist. Cloning repository...") - subprocess.run(['git', 'clone', 'https://github.com/facebookresearch/dino', details['repo_or_dir']]) + subprocess.run(['git', 'clone', 'https://github.com/facebookresearch/dino', details['repo_or_dir']], check=True) if not os.path.isfile(details['path']): print(f"File {details['path']} does not exist. Downloading...") wget_command = ['wget', '-P', os.path.dirname(details['path']), 'https://github.com/facebookresearch/dino/blob/main/dino_vitbase16_pretrain.pth'] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) else: submodules_dict[dimension] = { 'repo_or_dir':'facebookresearch/dino:main', @@ -318,7 +318,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): vit_l_path = f'{CACHE_DIR}/clip_model/ViT-L-14.pt' if not os.path.isfile(vit_l_path): wget_command = ['wget' ,'https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt', '-P', os.path.dirname(vit_l_path)] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) else: vit_l_path = 'ViT-L/14' submodules_dict[dimension] = [vit_l_path, aes_path] @@ -326,7 +326,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): musiq_spaq_path = f'{CACHE_DIR}/pyiqa_model/musiq_spaq_ckpt-358bb6af.pth' if not os.path.isfile(musiq_spaq_path): wget_command = ['wget', 'https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth', '-P', os.path.dirname(musiq_spaq_path)] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) submodules_dict[dimension] = {'model_path': musiq_spaq_path} elif dimension in ["object_class", "multiple_objects", "color", "spatial_relationship" ]: submodules_dict[dimension] = { @@ -334,7 +334,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): } if not os.path.exists(submodules_dict[dimension]['model_weight']): wget_command = ['wget', 'https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth', '-P', os.path.dirname(submodules_dict[dimension]["model_weight"])] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) elif dimension == 'scene': submodules_dict[dimension] = { "pretrained": f'{CACHE_DIR}/caption_model/tag2text_swin_14m.pth', @@ -343,12 +343,13 @@ def init_submodules(dimension_list, local=False, 
read_frame=False): } if not os.path.exists(submodules_dict[dimension]['pretrained']): wget_command = ['wget', 'https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth', '-P', os.path.dirname(submodules_dict[dimension]["pretrained"])] + subprocess.run(wget_command, check=True) elif dimension == 'appearance_style': if local: submodules_dict[dimension] = {"name": f'{CACHE_DIR}/clip_model/ViT-B-32.pt'} if not os.path.isfile(submodules_dict[dimension]["name"]): wget_command = ['wget', 'https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt', '-P', os.path.dirname(submodules_dict[dimension]["name"])] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) else: submodules_dict[dimension] = {"name": 'ViT-B/32'} elif dimension in ["temporal_style", "overall_consistency"]: @@ -357,7 +358,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): } if not os.path.exists(submodules_dict[dimension]['pretrain']): wget_command = ['wget', 'https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth', '-P', os.path.dirname(submodules_dict[dimension]["pretrain"])] - subprocess.run(wget_command) + subprocess.run(wget_command, check=True) return submodules_dict From 63c80c8a0361141d83918585dfee8c1305f80224 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 11 Jan 2024 00:06:23 +0800 Subject: [PATCH 082/248] [fix]: path to amt yaml --- vbench/utils.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/vbench/utils.py b/vbench/utils.py index 3477c6e5..a030f066 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -255,8 +255,9 @@ def init_submodules(dimension_list, local=False, read_frame=False): elif dimension == 'temporal_flickering': submodules_dict[dimension] = [] elif dimension == 'motion_smoothness': + CUR_DIR = os.path.dirname(os.path.abspath(__file__)) submodules_dict[dimension] = { - 'config': f'{CACHE_DIR}/amt_model/AMT-S.yaml', + 'config': f'{CUR_DIR}/third_party/amt/cfgs/AMT-S.yaml', 'ckpt': f'{CACHE_DIR}/amt_model/amt-s.pth' } details = submodules_dict[dimension] From fa9f434fd7e6300e7f5b3589304b69208efe864d Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 11 Jan 2024 05:03:27 +0800 Subject: [PATCH 083/248] [update]: restructure GRIT & UMT cache/config --- requirements.txt | 17 +- static_filter.py | 4 +- vbench/cli/static_filter.py | 3 +- vbench/human_action.py | 3 +- vbench/third_party/ViCLIP/simple_tokenizer.py | 2 +- .../grit_src/image_dense_captions.py | 3 +- .../umt/kinetics_400_categories.txt | 400 ++++++++++++++++++ 7 files changed, 422 insertions(+), 10 deletions(-) create mode 100644 vbench/third_party/umt/kinetics_400_categories.txt diff --git a/requirements.txt b/requirements.txt index 863bd7d0..a67d49e8 100644 --- a/requirements.txt +++ b/requirements.txt @@ -4,17 +4,26 @@ matplotlib timm>=0.9 torch>=1.12 torchvision>=0.13 +wheel +cython tensorboard scipy opencv-python -detectron2 scikit-learn -requests scikit-image +openai-clip +# decord +requests pyyaml easydict +pyiqa>=0.1.8 lvis -fairscale -openai-clip +fairscale>=0.4.4 fvcore easydict +urllib3 +boto3 +omegaconf +transformers>=4.33.2 +pycocoevalcap +detectron2@git+https://github.com/facebookresearch/detectron2.git@main diff --git a/static_filter.py b/static_filter.py index 3d47c7b2..56dd1228 100644 --- a/static_filter.py +++ 
b/static_filter.py @@ -6,7 +6,7 @@ import torch from tqdm import tqdm import json - +from vbench.utils import CACHE_DIR from vbench.third_party.RAFT.core.raft import RAFT from vbench.third_party.RAFT.core.utils_core.utils import InputPadder @@ -121,7 +121,7 @@ def filter_static(args): if __name__ == '__main__': parser = argparse.ArgumentParser() - parser.add_argument('--model', type=str, default="./pretrained/raft_model/models/raft-things.pth", help="restore checkpoint") + parser.add_argument('--model', type=str, default=f"{CACHE_DIR}/raft_model/models/raft-things.pth", help="restore checkpoint") parser.add_argument('--videos_path', default="", required=True, help="video path for filtering") parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path') parser.add_argument('--store_name', type=str, default="filtered_static_video.json", help='result file name') diff --git a/vbench/cli/static_filter.py b/vbench/cli/static_filter.py index b35d1445..a59919de 100644 --- a/vbench/cli/static_filter.py +++ b/vbench/cli/static_filter.py @@ -7,6 +7,7 @@ from tqdm import tqdm import json +from vbench.utils import CACHE_DIR from vbench.third_party.RAFT.core.raft import RAFT from vbench.third_party.RAFT.core.utils_core.utils import InputPadder @@ -120,7 +121,7 @@ def filter_static(args): def main(): parser = argparse.ArgumentParser() - parser.add_argument('--model', type=str, default="./pretrained/raft_model/models/raft-things.pth", help="restore checkpoint") + parser.add_argument('--model', type=str, default=f"{CACHE_DIR}/raft_model/models/raft-things.pth", help="restore checkpoint") parser.add_argument('--videos_path', default="", required=True, help="video path for filtering") parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path') parser.add_argument('--store_name', type=str, default="filtered_static_video.json", help='result file name') diff --git a/vbench/human_action.py b/vbench/human_action.py index c276f96d..4280372d 100755 --- a/vbench/human_action.py +++ b/vbench/human_action.py @@ -19,7 +19,8 @@ from tqdm import tqdm def build_dict(): - path = 'pretrained/umt_model/kinetics_400_categroies.txt' + CUR_DIR = os.path.dirname(os.path.abspath(__file__)) + path = f'{CUR_DIR}/third_party/umt/kinetics_400_categories.txt' results = {} with open(path, 'r') as f: cat_list = f.readlines() diff --git a/vbench/third_party/ViCLIP/simple_tokenizer.py b/vbench/third_party/ViCLIP/simple_tokenizer.py index 0b46f0c6..76286cbd 100644 --- a/vbench/third_party/ViCLIP/simple_tokenizer.py +++ b/vbench/third_party/ViCLIP/simple_tokenizer.py @@ -9,8 +9,8 @@ def default_bpe(): tokenizer_file = os.path.join(CACHE_DIR, "ViCLIP/bpe_simple_vocab_16e6.txt.gz") - print(f'save tokenizer_file to {tokenizer_file}') if not os.path.exists(tokenizer_file): + print(f'Downloading ViCLIP tokenizer to {tokenizer_file}') wget_command = ['wget', 'https://raw.githubusercontent.com/openai/CLIP/main/clip/bpe_simple_vocab_16e6.txt.gz', '-P', os.path.dirname(tokenizer_file)] subprocess.run(wget_command) return tokenizer_file diff --git a/vbench/third_party/grit_src/image_dense_captions.py b/vbench/third_party/grit_src/image_dense_captions.py index 1bfe85c3..bdd9d8e5 100755 --- a/vbench/third_party/grit_src/image_dense_captions.py +++ b/vbench/third_party/grit_src/image_dense_captions.py @@ -7,6 +7,7 @@ # constants WINDOW_NAME = "GRiT" CUR_DIR = os.path.dirname(os.path.abspath(__file__)) +from vbench.utils import CACHE_DIR # sys.path.insert(0, f"{CUR_DIR}/../") # 
print(CUR_DIR) @@ -83,7 +84,7 @@ def setup_cfg(args): return cfg -def get_parser(device, model_weight="pretrained/grit_model/grit_b_densecap_objectdet.pth"): +def get_parser(device, model_weight=f"{CACHE_DIR}/grit_model/grit_b_densecap_objectdet.pth"): arg_dict = {'config_file': f"{CUR_DIR}/configs/GRiT_B_DenseCap_ObjectDet.yaml", 'cpu': False, 'confidence_threshold': 0.5, 'test_task': 'DenseCap', 'opts': ["MODEL.WEIGHTS", model_weight]} if device.type == "cpu": arg_dict["cpu"] = True diff --git a/vbench/third_party/umt/kinetics_400_categories.txt b/vbench/third_party/umt/kinetics_400_categories.txt new file mode 100644 index 00000000..06fc9968 --- /dev/null +++ b/vbench/third_party/umt/kinetics_400_categories.txt @@ -0,0 +1,400 @@ +riding a bike 0 +marching 1 +dodgeball 2 +playing cymbals 3 +checking tires 4 +roller skating 5 +tasting beer 6 +clapping 7 +drawing 8 +juggling fire 9 +bobsledding 10 +petting animal (not cat) 11 +spray painting 12 +training dog 13 +eating watermelon 14 +building cabinet 15 +applauding 16 +playing harp 17 +balloon blowing 18 +sled dog racing 19 +wrestling 20 +pole vault 21 +hurling (sport) 22 +riding scooter 23 +shearing sheep 24 +sweeping floor 25 +eating carrots 26 +skateboarding 27 +dunking basketball 28 +disc golfing 29 +eating spaghetti 30 +playing flute 31 +riding mechanical bull 32 +making sushi 33 +trapezing 34 +picking fruit 35 +stretching leg 36 +playing ukulele 37 +tying tie 38 +skydiving 39 +playing cello 40 +jumping into pool 41 +shooting goal (soccer) 42 +trimming trees 43 +bookbinding 44 +ski jumping 45 +walking the dog 46 +riding unicycle 47 +shaving head 48 +hopscotch 49 +playing piano 50 +parasailing 51 +bartending 52 +kicking field goal 53 +finger snapping 54 +dining 55 +yawning 56 +peeling potatoes 57 +canoeing or kayaking 58 +front raises 59 +laughing 60 +dancing macarena 61 +digging 62 +reading newspaper 63 +hitting baseball 64 +clay pottery making 65 +exercising with an exercise ball 66 +playing saxophone 67 +shooting basketball 68 +washing hair 69 +lunge 70 +brushing hair 71 +curling hair 72 +kitesurfing 73 +tapping guitar 74 +bending back 75 +skipping rope 76 +situp 77 +folding paper 78 +cracking neck 79 +assembling computer 80 +cleaning gutters 81 +blowing out candles 82 +shaking hands 83 +dancing gangnam style 84 +windsurfing 85 +tap dancing 86 +skiing (not slalom or crosscountry) 87 +bandaging 88 +push up 89 +doing nails 90 +punching person (boxing) 91 +bouncing on trampoline 92 +scrambling eggs 93 +singing 94 +cleaning floor 95 +krumping 96 +drumming fingers 97 +snowmobiling 98 +gymnastics tumbling 99 +headbanging 100 +catching or throwing frisbee 101 +riding elephant 102 +bee keeping 103 +feeding birds 104 +snatch weight lifting 105 +mowing lawn 106 +fixing hair 107 +playing trumpet 108 +flying kite 109 +crossing river 110 +swinging legs 111 +sanding floor 112 +belly dancing 113 +sneezing 114 +clean and jerk 115 +side kick 116 +filling eyebrows 117 +shuffling cards 118 +recording music 119 +cartwheeling 120 +feeding fish 121 +folding clothes 122 +water skiing 123 +tobogganing 124 +blowing leaves 125 +smoking 126 +unboxing 127 +tai chi 128 +waxing legs 129 +riding camel 130 +slapping 131 +tossing salad 132 +capoeira 133 +playing cards 134 +playing organ 135 +playing violin 136 +playing drums 137 +tapping pen 138 +vault 139 +shoveling snow 140 +playing tennis 141 +getting a tattoo 142 +making a sandwich 143 +making tea 144 +grinding meat 145 +squat 146 +eating doughnuts 147 +ice fishing 148 +snowkiting 149 +kicking soccer ball 150 
+playing controller 151 +giving or receiving award 152 +welding 153 +throwing discus 154 +throwing axe 155 +ripping paper 156 +swimming butterfly stroke 157 +air drumming 158 +blowing nose 159 +hockey stop 160 +taking a shower 161 +bench pressing 162 +planting trees 163 +pumping fist 164 +climbing tree 165 +tickling 166 +high kick 167 +waiting in line 168 +slacklining 169 +tango dancing 170 +hurdling 171 +carrying baby 172 +celebrating 173 +sharpening knives 174 +passing American football (in game) 175 +headbutting 176 +playing recorder 177 +brush painting 178 +garbage collecting 179 +robot dancing 180 +shredding paper 181 +pumping gas 182 +rock climbing 183 +hula hooping 184 +braiding hair 185 +opening present 186 +texting 187 +decorating the christmas tree 188 +answering questions 189 +playing keyboard 190 +writing 191 +bungee jumping 192 +sniffing 193 +eating burger 194 +playing accordion 195 +making pizza 196 +playing volleyball 197 +tasting food 198 +pushing cart 199 +spinning poi 200 +cleaning windows 201 +arm wrestling 202 +changing oil 203 +swimming breast stroke 204 +tossing coin 205 +deadlifting 206 +hoverboarding 207 +cutting watermelon 208 +cheerleading 209 +snorkeling 210 +washing hands 211 +eating cake 212 +pull ups 213 +surfing water 214 +eating hotdog 215 +holding snake 216 +playing harmonica 217 +ironing 218 +cutting nails 219 +golf chipping 220 +shot put 221 +hugging 222 +playing clarinet 223 +faceplanting 224 +trimming or shaving beard 225 +drinking shots 226 +riding mountain bike 227 +tying bow tie 228 +swinging on something 229 +skiing crosscountry 230 +unloading truck 231 +cleaning pool 232 +jogging 233 +ice climbing 234 +mopping floor 235 +making bed 236 +diving cliff 237 +washing dishes 238 +grooming dog 239 +weaving basket 240 +frying vegetables 241 +stomping grapes 242 +moving furniture 243 +cooking sausages 244 +doing laundry 245 +dying hair 246 +knitting 247 +reading book 248 +baby waking up 249 +punching bag 250 +surfing crowd 251 +cooking chicken 252 +pushing car 253 +springboard diving 254 +swing dancing 255 +massaging legs 256 +beatboxing 257 +breading or breadcrumbing 258 +somersaulting 259 +brushing teeth 260 +stretching arm 261 +juggling balls 262 +massaging person's head 263 +eating ice cream 264 +extinguishing fire 265 +hammer throw 266 +whistling 267 +crawling baby 268 +using remote controller (not gaming) 269 +playing cricket 270 +opening bottle 271 +playing xylophone 272 +motorcycling 273 +driving car 274 +exercising arm 275 +passing American football (not in game) 276 +playing kickball 277 +sticking tongue out 278 +flipping pancake 279 +catching fish 280 +eating chips 281 +shaking head 282 +sword fighting 283 +playing poker 284 +cooking on campfire 285 +doing aerobics 286 +paragliding 287 +using segway 288 +folding napkins 289 +playing bagpipes 290 +gargling 291 +skiing slalom 292 +strumming guitar 293 +javelin throw 294 +waxing back 295 +riding or walking with horse 296 +plastering 297 +long jump 298 +parkour 299 +wrapping present 300 +egg hunting 301 +archery 302 +cleaning toilet 303 +swimming backstroke 304 +snowboarding 305 +catching or throwing baseball 306 +massaging back 307 +blowing glass 308 +playing guitar 309 +playing chess 310 +golf driving 311 +presenting weather forecast 312 +rock scissors paper 313 +high jump 314 +baking cookies 315 +using computer 316 +washing feet 317 +arranging flowers 318 +playing bass guitar 319 +spraying 320 +cutting pineapple 321 +waxing chest 322 +auctioning 323 +jetskiing 324 +drinking 325 +busking 326 
+playing monopoly 327 +salsa dancing 328 +waxing eyebrows 329 +watering plants 330 +zumba 331 +chopping wood 332 +pushing wheelchair 333 +carving pumpkin 334 +building shed 335 +making jewelry 336 +catching or throwing softball 337 +bending metal 338 +ice skating 339 +dancing charleston 340 +abseiling 341 +climbing a rope 342 +crying 343 +cleaning shoes 344 +dancing ballet 345 +driving tractor 346 +triple jump 347 +throwing ball 348 +getting a haircut 349 +running on treadmill 350 +climbing ladder 351 +blasting sand 352 +playing trombone 353 +drop kicking 354 +country line dancing 355 +changing wheel 356 +feeding goats 357 +tying knot (not on a tie) 358 +setting table 359 +shaving legs 360 +kissing 361 +riding mule 362 +counting money 363 +laying bricks 364 +barbequing 365 +news anchoring 366 +smoking hookah 367 +cooking egg 368 +peeling apples 369 +yoga 370 +sharpening pencil 371 +dribbling basketball 372 +petting cat 373 +playing ice hockey 374 +milking cow 375 +shining shoes 376 +juggling soccer ball 377 +scuba diving 378 +playing squash or racquetball 379 +drinking beer 380 +sign language interpreting 381 +playing basketball 382 +breakdancing 383 +testifying 384 +making snowman 385 +golf putting 386 +playing didgeridoo 387 +biking through snow 388 +sailing 389 +jumpstyle dancing 390 +water sliding 391 +grooming horse 392 +massaging feet 393 +playing paintball 394 +making a cake 395 +bowling 396 +contact juggling 397 +applying cream 398 +playing badminton 399 From 1695e6b4fd19700b2cac4fea4a598da76d7a074d Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 11 Jan 2024 18:11:06 +0800 Subject: [PATCH 084/248] [fix]: config file excluded after build --- MANIFEST.in | 1 + requirements.txt | 2 +- vbench/utils.py | 2 +- 3 files changed, 3 insertions(+), 2 deletions(-) diff --git a/MANIFEST.in b/MANIFEST.in index d5219a9f..a012a55e 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -2,3 +2,4 @@ include version.txt include requirements.txt recursive-include vbench/third_party *.yaml recursive-include vbench/third_party *.json +recursive-include vbench/third_party *.txt diff --git a/requirements.txt b/requirements.txt index a67d49e8..9190a9e9 100644 --- a/requirements.txt +++ b/requirements.txt @@ -12,7 +12,7 @@ opencv-python scikit-learn scikit-image openai-clip -# decord +decord requests pyyaml easydict diff --git a/vbench/utils.py b/vbench/utils.py index a030f066..4cbc4316 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -304,7 +304,7 @@ def init_submodules(dimension_list, local=False, read_frame=False): if not os.path.isfile(details['path']): print(f"File {details['path']} does not exist. 
Downloading...") wget_command = ['wget', '-P', os.path.dirname(details['path']), - 'https://github.com/facebookresearch/dino/blob/main/dino_vitbase16_pretrain.pth'] + 'https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth'] subprocess.run(wget_command, check=True) else: submodules_dict[dimension] = { From 045f5159452cf1fb9b9aaac00cfc82c3b2419a92 Mon Sep 17 00:00:00 2001 From: Nattapol <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 11 Jan 2024 18:32:53 +0800 Subject: [PATCH 085/248] Update README.md --- README.md | 31 +++++++++++++------------------ 1 file changed, 13 insertions(+), 18 deletions(-) diff --git a/README.md b/README.md index 92198146..0fd44546 100755 --- a/README.md +++ b/README.md @@ -21,27 +21,22 @@ We propose **VBench**, a comprehensive benchmark suite for video generative mode - [12/2023] Evaluation code for released for 16 Text-to-Video (T2V) evaluation dimensions. - `['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` - [11/2023] Prompt Suites released. (See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts)) - + ## :hammer: Installation - -1. Clone Repo - - ```bash - git clone https://github.com/Vchitect/VBench - cd VBench +#### Install with pip + ``` + pip install detectron2@git+https://github.com/facebookresearch/detectron2.git + pip install git+https://github.com/Vchitect/VBench.git ``` -2. Create Conda Environment and Install Dependencies - ``` - conda env create -f vbench_env.yml - conda activate vbench - ``` +#### Install with git clone + git clone https://github.com/Vchitect/VBench.git + pip install -r VBench/requirements.txt + pip install VBench + +If there is an error during [detectron2](https://github.com/facebookresearch/detectron2) installation, see [here](https://detectron2.readthedocs.io/en/latest/tutorials/install.html). -## :hammer: Installation (pip install) -``` - python setup.py sdist && python -m pip install dist/vbench-0.1.0.tar.gz -``` -### Usage +## Usage ##### command line ```bash evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION @@ -58,7 +53,7 @@ We propose **VBench**, a comprehensive benchmark suite for video generative mode ``` ## :gem: Pre-Trained Models -[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder. +[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `~/.cache/vbench` directory. 
## :bookmark_tabs: Prompt Suite

From a46e6cf54c4f17049e55e69c0f34c96fca9fc074 Mon Sep 17 00:00:00 2001
From: NattapolChan <66078943+NattapolChan@users.noreply.github.com>
Date: Thu, 11 Jan 2024 19:34:26 +0800
Subject: [PATCH 086/248] [update]: pretrained download location

---
 pretrained/aesthetic_model/model_path.txt | 2 +-
 pretrained/amt_model/download.sh          | 2 +-
 pretrained/caption_model/model_path.txt   | 2 +-
 pretrained/clip_model/model_path.txt      | 4 ++--
 pretrained/grit_model/model_path.txt      | 2 +-
 pretrained/pyiqa_model/model_path.txt     | 2 +-
 pretrained/raft_model/download.sh         | 6 ++++--
 pretrained/viclip_model/model_path.txt    | 2 +-
 8 files changed, 12 insertions(+), 10 deletions(-)

diff --git a/pretrained/aesthetic_model/model_path.txt b/pretrained/aesthetic_model/model_path.txt
index 5faf48e8..bc2357a1 100755
--- a/pretrained/aesthetic_model/model_path.txt
+++ b/pretrained/aesthetic_model/model_path.txt
@@ -1 +1 @@
-wget https://github.com/LAION-AI/aesthetic-predictor/blob/main/sa_0_4_vit_l_14_linear.pth?raw=true
\ No newline at end of file
+wget https://github.com/LAION-AI/aesthetic-predictor/blob/main/sa_0_4_vit_l_14_linear.pth -P ~/.cache/vbench/aesthetic_model/emb_reader
diff --git a/pretrained/amt_model/download.sh b/pretrained/amt_model/download.sh
index 004890f8..5948f122 100755
--- a/pretrained/amt_model/download.sh
+++ b/pretrained/amt_model/download.sh
@@ -1 +1 @@
-wget https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth
+wget https://huggingface.co/lalala125/AMT/resolve/main/amt-s.pth -P ~/.cache/vbench/amt_model
diff --git a/pretrained/caption_model/model_path.txt b/pretrained/caption_model/model_path.txt
index e1089554..2393aaaf 100644
--- a/pretrained/caption_model/model_path.txt
+++ b/pretrained/caption_model/model_path.txt
@@ -1 +1 @@
-wget https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth
\ No newline at end of file
+wget https://huggingface.co/spaces/xinyu1205/recognize-anything/resolve/main/tag2text_swin_14m.pth -P ~/.cache/vbench/caption_model
diff --git a/pretrained/clip_model/model_path.txt b/pretrained/clip_model/model_path.txt
index 40e44c76..0736bb1d 100755
--- a/pretrained/clip_model/model_path.txt
+++ b/pretrained/clip_model/model_path.txt
@@ -1,2 +1,2 @@
-wget https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt
-wget https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt
\ No newline at end of file
+wget https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt -P ~/.cache/vbench/clip_model
+wget https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt -P ~/.cache/vbench/clip_model
diff --git a/pretrained/grit_model/model_path.txt b/pretrained/grit_model/model_path.txt
index 0c708439..e967ee61 100755
--- a/pretrained/grit_model/model_path.txt
+++ b/pretrained/grit_model/model_path.txt
@@ -1 +1 @@
-wget https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth
\ No newline at end of file
+wget https://datarelease.blob.core.windows.net/grit/models/grit_b_densecap_objectdet.pth -P ~/.cache/vbench/grit_model
diff --git a/pretrained/pyiqa_model/model_path.txt b/pretrained/pyiqa_model/model_path.txt
index 29144b78..19280282 100755
--- a/pretrained/pyiqa_model/model_path.txt
+++ b/pretrained/pyiqa_model/model_path.txt
@@ -1 +1 @@
-wget https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth +wget https://github.com/chaofengc/IQA-PyTorch/releases/download/v0.1-weights/musiq_spaq_ckpt-358bb6af.pth -P ~/.cache/vbench/pyiqa_model diff --git a/pretrained/raft_model/download.sh b/pretrained/raft_model/download.sh index dfd8d473..6c83a919 100755 --- a/pretrained/raft_model/download.sh +++ b/pretrained/raft_model/download.sh @@ -1,3 +1,5 @@ #!/bin/bash -wget https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip -unzip models.zip +CACHE_DIR=~/.cache/vbench +wget -P $CACHE_DIR/raft_model/ https://dl.dropboxusercontent.com/s/4j4z58wuv8o0mfz/models.zip +unzip -d ${CACHE_DIR}/raft_model/ $CACHE_DIR/raft_model/models.zip +rm -r $CACHE_DIR/raft_model/models.zip diff --git a/pretrained/viclip_model/model_path.txt b/pretrained/viclip_model/model_path.txt index a332716f..868afb63 100755 --- a/pretrained/viclip_model/model_path.txt +++ b/pretrained/viclip_model/model_path.txt @@ -1 +1 @@ -wget https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth +wget https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/internvideo/viclip/ViClip-InternVid-10M-FLT.pth -P ~/.cache/vbench/ViCLIP From edd464bf9ab1f3f5438179321d4a8be20e42bef6 Mon Sep 17 00:00:00 2001 From: Nattapol <66078943+NattapolChan@users.noreply.github.com> Date: Thu, 11 Jan 2024 19:37:58 +0800 Subject: [PATCH 087/248] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 0fd44546..f4d29657 100755 --- a/README.md +++ b/README.md @@ -53,7 +53,7 @@ If there is an error during [detectron2](https://github.com/facebookresearch/det ``` ## :gem: Pre-Trained Models -[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `~/.cache/vbench` directory. +[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder to `~/.cache/vbench`. 
## :bookmark_tabs: Prompt Suite From cc68b18ade8bca55c455b4b3cd5ab50dbfa31af4 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 12 Jan 2024 05:23:51 +0800 Subject: [PATCH 088/248] [update]: change to subcommand --- README.md | 2 +- setup.py | 2 +- vbench/cli/__init__.py | 4 ---- vbench/cli/evaluate.py | 14 ++++---------- vbench/cli/static_filter.py | 12 ++++-------- vbench/cli/vbench.py | 15 +++++++++++++++ 6 files changed, 25 insertions(+), 24 deletions(-) create mode 100644 vbench/cli/vbench.py diff --git a/README.md b/README.md index f4d29657..45d0ad82 100755 --- a/README.md +++ b/README.md @@ -39,7 +39,7 @@ If there is an error during [detectron2](https://github.com/facebookresearch/det ## Usage ##### command line ```bash - evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION + vbench evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION ``` ##### python ```python diff --git a/setup.py b/setup.py index 86e89adb..5a831012 100644 --- a/setup.py +++ b/setup.py @@ -25,7 +25,7 @@ def fetch_requirements(): 'Source': 'https://github.com/Vchitect/VBench', }, entry_points={ - 'console_scripts': ['evaluate=vbench.cli.evaluate:main', 'static_filter=vbench.cli.static_filter:main'] + 'console_scripts': ['vbench=vbench.cli.vbench:main'] }, install_requires=install_requires, packages=find_packages(), diff --git a/vbench/cli/__init__.py b/vbench/cli/__init__.py index ffcc86d6..e69de29b 100644 --- a/vbench/cli/__init__.py +++ b/vbench/cli/__init__.py @@ -1,4 +0,0 @@ -import os -import sys -CUR_DIR = os.path.dirname(os.path.abspath(__file__)) -sys.path.append(os.path.join(CUR_DIR, '../third_party')) diff --git a/vbench/cli/evaluate.py b/vbench/cli/evaluate.py index fc04dbc6..5b6ea161 100644 --- a/vbench/cli/evaluate.py +++ b/vbench/cli/evaluate.py @@ -3,9 +3,8 @@ import argparse -def parse_args(): - - parser = argparse.ArgumentParser(description='VBench') +def register_subparsers(subparser): + parser = subparser.add_parser('evaluate') parser.add_argument( "--output_path", type=str, @@ -42,12 +41,9 @@ def parse_args(): required=False, help="whether directly read frames, or directly read videos", ) - args = parser.parse_args() - return args - + parser.set_defaults(func=evaluate) -def main(): - args = parse_args() +def evaluate(args): print(f'args: {args}') device = torch.device("cuda") @@ -63,5 +59,3 @@ def main(): ) print('done') -if __name__ == "__main__": - main() diff --git a/vbench/cli/static_filter.py b/vbench/cli/static_filter.py index a59919de..e490497b 100644 --- a/vbench/cli/static_filter.py +++ b/vbench/cli/static_filter.py @@ -100,7 +100,7 @@ def get_frames(self, video_path): return frame_list -def filter_static(args): +def static_filter(args): static_filter = StaticFilter(args, device=DEVICE) prompt_dict = {} with open(args.prompt_file, "r") as f: @@ -119,8 +119,8 @@ def filter_static(args): os.makedirs(args.result_path, exist_ok=True) json.dump(prompt_dict, open(os.path.join(args.result_path, args.store_name), "w")) -def main(): - parser = argparse.ArgumentParser() +def register_subparsers(subparser): + parser = subparser.add_parser('static_filter') parser.add_argument('--model', type=str, default=f"{CACHE_DIR}/raft_model/models/raft-things.pth", help="restore checkpoint") parser.add_argument('--videos_path', default="", required=True, help="video path for filtering") parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path') @@ -129,9 +129,5 @@ def main(): parser.add_argument('--small', 
action='store_true', help='use small model') parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision') parser.add_argument('--alternate_corr', action='store_true', help='use efficent correlation implementation') - args = parser.parse_args() + parser.set_defaults(func=static_filter) - filter_static(args) - -if __name__ == "__main__": - main() diff --git a/vbench/cli/vbench.py b/vbench/cli/vbench.py new file mode 100644 index 00000000..d4690117 --- /dev/null +++ b/vbench/cli/vbench.py @@ -0,0 +1,15 @@ +import argparse +import importlib +from vbench import VBench + +def main(): + parser = argparse.ArgumentParser(prog="vbench") + subparsers = parser.add_subparsers(title='vbench subcommands') + + vbench_cmd = ['evaluate', 'static_filter'] + for cmd in vbench_cmd: + module = importlib.import_module(f'vbench.cli.{cmd}') + module.register_subparsers(subparsers) + args = parser.parse_args() + args.func(args) + From d72139eab302ed5bd059208f5b32bcaf6085e7d1 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Fri, 12 Jan 2024 14:39:32 +0800 Subject: [PATCH 089/248] [update]: optimize import time --- vbench/__init__.py | 22 +++------------------- vbench/aesthetic_quality.py | 3 --- vbench/cli/evaluate.py | 2 -- vbench/cli/static_filter.py | 1 - vbench/cli/vbench.py | 8 ++++++-- 5 files changed, 9 insertions(+), 27 deletions(-) diff --git a/vbench/__init__.py b/vbench/__init__.py index 5f3f8f8c..342f4121 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -1,24 +1,7 @@ import os from .utils import init_submodules, save_json, load_json - -from .subject_consistency import compute_subject_consistency -from .background_consistency import compute_background_consistency -from .temporal_flickering import compute_temporal_flickering -from .motion_smoothness import compute_motion_smoothness -from .dynamic_degree import compute_dynamic_degree -from .aesthetic_quality import compute_aesthetic_quality -from .imaging_quality import compute_imaging_quality -from .object_class import compute_object_class -from .multiple_objects import compute_multiple_objects -from .human_action import compute_human_action -from .color import compute_color -from .spatial_relationship import compute_spatial_relationship -from .scene import compute_scene -from .overall_consistency import compute_overall_consistency -from .temporal_style import compute_temporal_style -from .appearance_style import compute_appearance_style - +import importlib class VBench(object): def __init__(self, device, full_info_dir, output_path): @@ -52,7 +35,8 @@ def evaluate(self, videos_path, name, dimension_list=None, local=False, read_fra cur_full_info_path = self.build_full_info_json(videos_path, name, dimension_list) for dimension in dimension_list: try: - evaluate_func = eval(f"compute_{dimension}") + dimension_module = importlib.import_module(f'vbench.{dimension}') + evaluate_func = getattr(dimension_module, f'compute_{dimension}') except Exception as e: raise NotImplementedError(f'UnImplemented dimension {dimension}!') submodules_list = submodules_dict[dimension] diff --git a/vbench/aesthetic_quality.py b/vbench/aesthetic_quality.py index 210f1e27..fd612ddc 100755 --- a/vbench/aesthetic_quality.py +++ b/vbench/aesthetic_quality.py @@ -1,8 +1,5 @@ - import os -import json import clip -import numpy as np import torch import torch.nn as nn import torch.nn.functional as F diff --git a/vbench/cli/evaluate.py b/vbench/cli/evaluate.py index 5b6ea161..7c73a4e9 100644 --- 
a/vbench/cli/evaluate.py
+++ b/vbench/cli/evaluate.py
@@ -1,8 +1,6 @@
 import torch
 from vbench import VBench
 
-import argparse
-
 def register_subparsers(subparser):
     parser = subparser.add_parser('evaluate')
     parser.add_argument(
diff --git a/vbench/cli/static_filter.py b/vbench/cli/static_filter.py
index e490497b..5902410e 100644
--- a/vbench/cli/static_filter.py
+++ b/vbench/cli/static_filter.py
@@ -1,4 +1,3 @@
-import argparse
 import os
 import cv2
 import glob
diff --git a/vbench/cli/vbench.py b/vbench/cli/vbench.py
index d4690117..d8872408 100644
--- a/vbench/cli/vbench.py
+++ b/vbench/cli/vbench.py
@@ -1,15 +1,19 @@
 import argparse
 import importlib
-from vbench import VBench
+import subprocess
+
+vbench_cmd = ['evaluate', 'static_filter']
 
 def main():
     parser = argparse.ArgumentParser(prog="vbench")
     subparsers = parser.add_subparsers(title='vbench subcommands')
 
-    vbench_cmd = ['evaluate', 'static_filter']
     for cmd in vbench_cmd:
         module = importlib.import_module(f'vbench.cli.{cmd}')
         module.register_subparsers(subparsers)
+    parser.set_defaults(func=help)
     args = parser.parse_args()
     args.func(args)
 
+def help(args):
+    subprocess.run(['vbench', '-h'], check=True)

From 5ef8653989baea05d224196f6df67b107a8b3d6d Mon Sep 17 00:00:00 2001
From: ziqihuangg <ziqihuangg@gmail.com>
Date: Tue, 16 Jan 2024 10:27:39 +0800
Subject: [PATCH 090/248] [update] PyPI use case

---
 README.md | 15 +++++++++++++++
 1 file changed, 15 insertions(+)

diff --git a/README.md b/README.md
index 45d0ad82..5506b888 100755
--- a/README.md
+++ b/README.md
@@ -41,6 +41,10 @@ If there is an error during [detectron2](https://github.com/facebookresearch/det
 ```bash
   vbench evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION
 ```
+For example:
+```bash
+  vbench evaluate --videos_path "sampled_videos/lavie/human_action" --dimension "human_action"
+```
 ##### python
 ```python
   from vbench import VBench
@@ -51,6 +55,17 @@ If there is an error during [detectron2](https://github.com/facebookresearch/det
       dimension_list = [<dimension>, <dimension>, ...],
   )
 ```
+For example:
+```python
+  from vbench import VBench
+  my_VBench = VBench(device, "VBench_full_info.json", "evaluation_results")
+  my_VBench.evaluate(
+      videos_path = "sampled_videos/lavie/human_action",
+      name = "lavie_human_action",
+      dimension_list = ["human_action"],
+  )
+```
+
 
 ## :gem: Pre-Trained Models
 [Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder to `~/.cache/vbench`. 
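Taken together, PATCH 080 and PATCH 089 establish the two conventions that the usage examples above rely on: every downloaded weight lives under a single cache root (`~/.cache/vbench`, overridable through the `VBENCH_CACHE_DIR` environment variable), and each evaluation dimension is imported lazily by name, so `import vbench` stays fast. A minimal sketch of that dispatch pattern, assuming only what the diffs show (a `vbench.<dimension>` module exposing `compute_<dimension>`); `resolve_dimension` is an illustrative helper name, not part of the package API:

```python
import importlib
import os

# PATCH 080 convention: one cache root for all model weights,
# overridable via the VBENCH_CACHE_DIR environment variable.
CACHE_DIR = os.environ.get('VBENCH_CACHE_DIR') or os.path.join(
    os.path.expanduser('~'), '.cache', 'vbench')

def resolve_dimension(dimension):
    """Resolve vbench.<dimension>.compute_<dimension> at evaluation time."""
    try:
        # PATCH 089 convention: the per-dimension module (and its heavy
        # dependencies such as detectron2 or pyiqa) loads only on demand.
        module = importlib.import_module(f'vbench.{dimension}')
        return getattr(module, f'compute_{dimension}')
    except (ImportError, AttributeError) as err:
        raise NotImplementedError(f'UnImplemented dimension {dimension}!') from err

compute_fn = resolve_dimension('human_action')  # -> compute_human_action
```

The two hooks also compose at the command line: running `VBENCH_CACHE_DIR=/data/vbench_cache vbench evaluate --videos_path ... --dimension human_action` redirects every download performed by `init_submodules` without touching the code.
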
From b527bd4228a664da6e8f52c7a7dd1031ffa4d98a Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 17 Jan 2024 03:04:17 +0800 Subject: [PATCH 091/248] [update]: pin transformers version --- requirements.txt | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/requirements.txt b/requirements.txt index 9190a9e9..e6fe9f0d 100644 --- a/requirements.txt +++ b/requirements.txt @@ -2,8 +2,8 @@ Pillow numpy matplotlib timm>=0.9 -torch>=1.12 -torchvision>=0.13 +torch>=1.12,<2.0.0 +torchvision>=0.13,<0.16.0 wheel cython tensorboard @@ -24,6 +24,6 @@ easydict urllib3 boto3 omegaconf -transformers>=4.33.2 +transformers==4.33.2 pycocoevalcap detectron2@git+https://github.com/facebookresearch/detectron2.git@main From 971e6cf49a9ce8a2a46369aeb34fb2e7d2095d95 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 17 Jan 2024 03:05:21 +0800 Subject: [PATCH 092/248] [update]: cleanup unused file --- setup.py | 1 - .../third_party/grit_src/centernet_config.py | 87 ------------------- vbench/third_party/grit_src/requiresment.txt | 22 ----- 3 files changed, 110 deletions(-) delete mode 100644 vbench/third_party/grit_src/centernet_config.py delete mode 100755 vbench/third_party/grit_src/requiresment.txt diff --git a/setup.py b/setup.py index 5a831012..68c9ecca 100644 --- a/setup.py +++ b/setup.py @@ -31,5 +31,4 @@ def fetch_requirements(): packages=find_packages(), include_package_data=True, license='Apache Software License 2.0', - scripts=['bin/evaluate'], ) diff --git a/vbench/third_party/grit_src/centernet_config.py b/vbench/third_party/grit_src/centernet_config.py deleted file mode 100644 index 36d0d250..00000000 --- a/vbench/third_party/grit_src/centernet_config.py +++ /dev/null @@ -1,87 +0,0 @@ -from detectron2.config import CfgNode as CN - -def add_centernet_config(cfg): - _C = cfg - - _C.MODEL.CENTERNET = CN() - _C.MODEL.CENTERNET.NUM_CLASSES = 80 - _C.MODEL.CENTERNET.IN_FEATURES = ["p3", "p4", "p5", "p6", "p7"] - _C.MODEL.CENTERNET.FPN_STRIDES = [8, 16, 32, 64, 128] - _C.MODEL.CENTERNET.PRIOR_PROB = 0.01 - _C.MODEL.CENTERNET.INFERENCE_TH = 0.05 - _C.MODEL.CENTERNET.CENTER_NMS = False - _C.MODEL.CENTERNET.NMS_TH_TRAIN = 0.6 - _C.MODEL.CENTERNET.NMS_TH_TEST = 0.6 - _C.MODEL.CENTERNET.PRE_NMS_TOPK_TRAIN = 1000 - _C.MODEL.CENTERNET.POST_NMS_TOPK_TRAIN = 100 - _C.MODEL.CENTERNET.PRE_NMS_TOPK_TEST = 1000 - _C.MODEL.CENTERNET.POST_NMS_TOPK_TEST = 100 - _C.MODEL.CENTERNET.NORM = "GN" - _C.MODEL.CENTERNET.USE_DEFORMABLE = False - _C.MODEL.CENTERNET.NUM_CLS_CONVS = 4 - _C.MODEL.CENTERNET.NUM_BOX_CONVS = 4 - _C.MODEL.CENTERNET.NUM_SHARE_CONVS = 0 - _C.MODEL.CENTERNET.LOC_LOSS_TYPE = 'giou' - _C.MODEL.CENTERNET.SIGMOID_CLAMP = 1e-4 - _C.MODEL.CENTERNET.HM_MIN_OVERLAP = 0.8 - _C.MODEL.CENTERNET.MIN_RADIUS = 4 - _C.MODEL.CENTERNET.SOI = [[0, 80], [64, 160], [128, 320], [256, 640], [512, 10000000]] - _C.MODEL.CENTERNET.POS_WEIGHT = 1. - _C.MODEL.CENTERNET.NEG_WEIGHT = 1. - _C.MODEL.CENTERNET.REG_WEIGHT = 2. - _C.MODEL.CENTERNET.HM_FOCAL_BETA = 4 - _C.MODEL.CENTERNET.HM_FOCAL_ALPHA = 0.25 - _C.MODEL.CENTERNET.LOSS_GAMMA = 2.0 - _C.MODEL.CENTERNET.WITH_AGN_HM = False - _C.MODEL.CENTERNET.ONLY_PROPOSAL = False - _C.MODEL.CENTERNET.AS_PROPOSAL = False - _C.MODEL.CENTERNET.IGNORE_HIGH_FP = -1. 
- _C.MODEL.CENTERNET.MORE_POS = False - _C.MODEL.CENTERNET.MORE_POS_THRESH = 0.2 - _C.MODEL.CENTERNET.MORE_POS_TOPK = 9 - _C.MODEL.CENTERNET.NOT_NORM_REG = True - _C.MODEL.CENTERNET.NOT_NMS = False - _C.MODEL.CENTERNET.NO_REDUCE = False - - _C.MODEL.ROI_BOX_HEAD.USE_SIGMOID_CE = False - _C.MODEL.ROI_BOX_HEAD.PRIOR_PROB = 0.01 - _C.MODEL.ROI_BOX_HEAD.USE_EQL_LOSS = False - _C.MODEL.ROI_BOX_HEAD.CAT_FREQ_PATH = \ - 'datasets/lvis/lvis_v1_train_cat_info.json' - _C.MODEL.ROI_BOX_HEAD.EQL_FREQ_CAT = 200 - _C.MODEL.ROI_BOX_HEAD.USE_FED_LOSS = False - _C.MODEL.ROI_BOX_HEAD.FED_LOSS_NUM_CAT = 50 - _C.MODEL.ROI_BOX_HEAD.FED_LOSS_FREQ_WEIGHT = 0.5 - _C.MODEL.ROI_BOX_HEAD.MULT_PROPOSAL_SCORE = False - - _C.MODEL.BIFPN = CN() - _C.MODEL.BIFPN.NUM_LEVELS = 5 - _C.MODEL.BIFPN.NUM_BIFPN = 6 - _C.MODEL.BIFPN.NORM = 'GN' - _C.MODEL.BIFPN.OUT_CHANNELS = 160 - _C.MODEL.BIFPN.SEPARABLE_CONV = False - - _C.MODEL.DLA = CN() - _C.MODEL.DLA.OUT_FEATURES = ['dla2'] - _C.MODEL.DLA.USE_DLA_UP = True - _C.MODEL.DLA.NUM_LAYERS = 34 - _C.MODEL.DLA.MS_OUTPUT = False - _C.MODEL.DLA.NORM = 'BN' - _C.MODEL.DLA.DLAUP_IN_FEATURES = ['dla3', 'dla4', 'dla5'] - _C.MODEL.DLA.DLAUP_NODE = 'conv' - - _C.SOLVER.RESET_ITER = False - _C.SOLVER.TRAIN_ITER = -1 - - _C.INPUT.CUSTOM_AUG = '' - _C.INPUT.TRAIN_SIZE = 640 - _C.INPUT.TEST_SIZE = 640 - _C.INPUT.SCALE_RANGE = (0.1, 2.) - # 'default' for fixed short/ long edge, 'square' for max size=INPUT.SIZE - _C.INPUT.TEST_INPUT_TYPE = 'default' - - _C.DEBUG = False - _C.SAVE_DEBUG = False - _C.SAVE_PTH = False - _C.VIS_THRESH = 0.3 - _C.DEBUG_SHOW_NAME = False diff --git a/vbench/third_party/grit_src/requiresment.txt b/vbench/third_party/grit_src/requiresment.txt deleted file mode 100755 index 9ada3585..00000000 --- a/vbench/third_party/grit_src/requiresment.txt +++ /dev/null @@ -1,22 +0,0 @@ -opencv-python==4.5.5.64 -mss -timm==0.6.7 -dataclasses -ftfy -regex -fasttext -scikit-learn -lvis -nltk -tqdm -matplotlib -requests -anytree -boto3 -scikit-image -pyyaml -inflect -protobuf==3.19.4 -einops==0.4.1 -transformers==4.21.1 -deepspeed==0.7.0 From 697fc36b26a9e2548aa946d5e1bae67737092b25 Mon Sep 17 00:00:00 2001 From: NattapolChan <66078943+NattapolChan@users.noreply.github.com> Date: Wed, 17 Jan 2024 03:43:41 +0800 Subject: [PATCH 093/248] [update]: separate readme from pypi description remove version.txt ignore direct link requirement if installed from pypi --- README-pypi.md | 54 ++++++++++++++++++++++++++++++++++++++++++++++++++ setup.py | 5 ++--- version.txt | 1 - 3 files changed, 56 insertions(+), 4 deletions(-) create mode 100644 README-pypi.md delete mode 100644 version.txt diff --git a/README-pypi.md b/README-pypi.md new file mode 100644 index 00000000..c7ba5875 --- /dev/null +++ b/README-pypi.md @@ -0,0 +1,54 @@ +# for project description in pypi +## :hammer: Installation +#### Install with pip + ``` + pip install detectron2@git+https://github.com/facebookresearch/detectron2.git + pip install git+https://github.com/Vchitect/VBench.git + ``` + +#### Install with git clone + git clone https://github.com/Vchitect/VBench.git + pip install -r VBench/requirements.txt + pip install VBench + +If there is an error during [detectron2](https://github.com/facebookresearch/detectron2) installation, see [here](https://detectron2.readthedocs.io/en/latest/tutorials/install.html). 
+
+## Usage
+##### command line
+```bash
+    vbench evaluate --videos_path $VIDEO_PATH --dimension $DIMENSION
+```
+For example:
+```bash
+    vbench evaluate --videos_path "sampled_videos/lavie/human_action" --dimension "human_action"
+```
+##### python
+```python
+    from vbench import VBench
+    my_VBench = VBench(device, <path/to/VBench_full_info.json>, <path/to/save/dir>)
+    my_VBench.evaluate(
+        videos_path = <video_path>,
+        name = <name>,
+        dimension_list = [<dimension>, <dimension>, ...],
+    )
+```
+For example:
+```python
+    from vbench import VBench
+    my_VBench = VBench(device, "VBench_full_info.json", "evaluation_results")
+    my_VBench.evaluate(
+        videos_path = "sampled_videos/lavie/human_action",
+        name = "lavie_human_action",
+        dimension_list = ["human_action"],
+    )
+```
+
+
+## :gem: Pre-Trained Models
+[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder to `~/.cache/vbench`.
+
+## :bookmark_tabs: Prompt Suite
+
+We provide prompt lists at `prompts/`.
+
+Check out [details of prompt suites](https://github.com/Vchitect/VBench/tree/master/prompts), and instructions for [**how to sample videos for evaluation**](https://github.com/Vchitect/VBench/tree/master/prompts).
diff --git a/setup.py b/setup.py
index 68c9ecca..dafc6f26 100644
--- a/setup.py
+++ b/setup.py
@@ -4,17 +4,16 @@ import os
 
 def fetch_readme():
-    with open('README.md', encoding='utf-8') as f:
+    with open('README-pypi.md', encoding='utf-8') as f:
         text = f.read()
     return text
 
 def fetch_requirements():
     filename = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'requirements.txt')
     with open(filename, 'r') as f:
-        envs = [line.rstrip('\n') for line in f.readlines()]
+        envs = [line.rstrip('\n') for line in f.readlines() if '@' not in line]
     return envs
 
-version = open('version.txt', 'r').read().rstrip('\n')
 install_requires = fetch_requirements()
 setup(name='vbench',
       version='0.1.0',
diff --git a/version.txt b/version.txt
deleted file mode 100644
index 6e8bf73a..00000000
--- a/version.txt
+++ /dev/null
@@ -1 +0,0 @@
-0.1.0
From 45bf3feed9b6db462dc191881c4432396d4724a4 Mon Sep 17 00:00:00 2001
From: NattapolChan <66078943+NattapolChan@users.noreply.github.com>
Date: Wed, 17 Jan 2024 18:48:05 +0800
Subject: [PATCH 094/248] [update]: change prompt file path in static_filter
 read from VBench_full_info.json instead of prompt/temporal_flickering.txt
---
 vbench/cli/static_filter.py | 16 +++++++++-------
 1 file changed, 9 insertions(+), 7 deletions(-)
diff --git a/vbench/cli/static_filter.py b/vbench/cli/static_filter.py
index 06913676..e862c924 100644
--- a/vbench/cli/static_filter.py
+++ b/vbench/cli/static_filter.py
@@ -11,7 +11,7 @@
 logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(levelname)s - %(message)s')
 logger = logging.getLogger(__name__)
 
-from vbench.utils import CACHE_DIR
+from vbench.utils import CACHE_DIR, load_json
 from vbench.third_party.RAFT.core.raft import RAFT
 from vbench.third_party.RAFT.core.utils_core.utils import InputPadder
@@ -118,15 +118,17 @@ def check_and_move(args, filter_results, target_path=None):
 def static_filter(args):
     static_filter = StaticFilter(args, device=DEVICE)
     prompt_dict = {}
-    with open(args.prompt_file, "r") as f:
-        lines = [line.strip() for line in f.readlines()]
-        for line in lines:
-            prompt_dict[line] = {"static_count":0, "static_path":[]}
+    prompt_list = []
+    full_prompt_list = load_json(args.prompt_file)
+    for prompt in full_prompt_list:
+        if 'temporal_flickering' in prompt['dimension']:
+            prompt_dict[prompt['prompt_en']] = {"static_count":0, "static_path":[]}
+            prompt_list.append(prompt['prompt_en'])
 
     paths = sorted(glob.glob(os.path.join(args.videos_path, "*.mp4")))
     for path in tqdm(paths):
         name = '-'.join(path.split('/')[-1].split('-')[:-1])
-        if name in lines:
+        if name in prompt_list:
             if prompt_dict[name]["static_count"] < 5:
                 if static_filter.infer(path):
                     prompt_dict[name]["static_count"] += 1
@@ -143,7 +145,7 @@ def register_subparsers(subparser):
     parser.add_argument('--videos_path', default="", required=True, help="video path for filtering")
     parser.add_argument('--result_path', type=str, default="./filter_results", help='result save path')
     parser.add_argument('--store_name', type=str, default="filtered_static_video.json", help='result file name')
-    parser.add_argument('--prompt_file', type=str, default="./prompts/prompts_per_dimension/temporal_flickering.txt", help='static_prompt')
+    parser.add_argument('--prompt_file', type=str, default="./VBench_full_info.json", help='static_prompt')
     parser.add_argument('--small', action='store_true', help='use small model')
     parser.add_argument('--mixed_precision', action='store_true', help='use mixed precision')
     parser.add_argument('--alternate_corr', action='store_true', help='use efficient correlation implementation')
From 7c77ce871bb592ecd67148dfaff84d10dccf9689 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Thu, 18 Jan 2024 13:13:20 +0800
Subject: [PATCH 095/248] [update] PyPI instruction
---
 README-pypi.md | 47 ++++++++++++++++++++++++++++++++++-------------
 1 file changed, 34 insertions(+), 13 deletions(-)
diff --git a/README-pypi.md b/README-pypi.md
index c7ba5875..6c79bfd8 100644
--- a/README-pypi.md
+++ b/README-pypi.md
@@ -1,15 +1,27 @@
-# for project description in pypi
-## :hammer: Installation
+# Project Description
+
+**VBench** is a comprehensive benchmark suite for video generative models. You can use **VBench** to evaluate video generation models from 16 different ability aspects.
+
+This project is the PyPI implementation of the following research:
+> **VBench: Comprehensive Benchmark Suite for Video Generative Models**<br>
      +> [Ziqi Huang](https://ziqihuangg.github.io/), [Yinan He](https://github.com/yinanhe), [Jiashuo Yu](https://scholar.google.com/citations?user=iH0Aq0YAAAAJ&hl=zh-CN), [Fan Zhang](https://github.com/zhangfan-p), [Chenyang Si](https://chenyangsi.top/), [Yuming Jiang](https://yumingj.github.io/), [Yuanhan Zhang](https://zhangyuanhan-ai.github.io/), [Tianxing Wu](https://tianxingwu.github.io/), [Qingyang Jin](https://github.com/Vchitect/VBench), [Nattapol Chanpaisit](https://nattapolchan.github.io/me), [Yaohui Wang](https://wyhsirius.github.io/), [Xinyuan Chen](https://scholar.google.com/citations?user=3fWSC8YAAAAJ), [Limin Wang](https://wanglimin.github.io), [Dahua Lin](http://dahua.site/)+, [Yu Qiao](http://mmlab.siat.ac.cn/yuqiao/index.html)+, [Ziwei Liu](https://liuziwei7.github.io/)+
+
+[![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://arxiv.org/abs/2311.17982)
+[![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/)
+[![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/Vchitect/VBench_Leaderboard)
+[![Video](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y)
+[![Visitor](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FVchitect%2FVBench&count_bg=%23FFA500&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com)
 
+## Installation
 #### Install with pip
+ ```
+ pip install vbench
+ ```
+
+To evaluate some video generation ability aspects, you need to install [detectron2](https://github.com/facebookresearch/detectron2) via:
 ```
 pip install detectron2@git+https://github.com/facebookresearch/detectron2.git
 ```
 
 If there is an error during [detectron2](https://github.com/facebookresearch/detectron2) installation, see [here](https://detectron2.readthedocs.io/en/latest/tutorials/install.html).
@@ -43,12 +55,21 @@ For example:
     )
 ```
 
-
-## :gem: Pre-Trained Models
-[Optional] Please download the pre-trained weights according to the guidance in the `model_path.txt` file for each model in the `pretrained` folder to `~/.cache/vbench`.
-
-## :bookmark_tabs: Prompt Suite
+## Prompt Suite
 
 We provide prompt lists at `prompts/`.
 
 Check out [details of prompt suites](https://github.com/Vchitect/VBench/tree/master/prompts), and instructions for [**how to sample videos for evaluation**](https://github.com/Vchitect/VBench/tree/master/prompts).
+
+## Citation
+
+ If you find this package useful for your reports or publications, please consider citing the VBench paper:
+
+ ```bibtex
+ @article{huang2023vbench,
+    title={{VBench}: Comprehensive Benchmark Suite for Video Generative Models},
+    author={Huang, Ziqi and He, Yinan and Yu, Jiashuo and Zhang, Fan and Si, Chenyang and Jiang, Yuming and Zhang, Yuanhan and Wu, Tianxing and Jin, Qingyang and Chanpaisit, Nattapol and Wang, Yaohui and Chen, Xinyuan and Wang, Limin and Lin, Dahua and Qiao, Yu and Liu, Ziwei},
+    journal={arXiv preprint arXiv:2311.17982},
+    year={2023}
+ }
+ ```
\ No newline at end of file
From 8893a1616c77273db721a67e422e03a1203b6fa5 Mon Sep 17 00:00:00 2001
From: ziqihuangg
Date: Thu, 18 Jan 2024 13:21:20 +0800
Subject: [PATCH 096/248] [add] vbench logo
---
 README-pypi.md        | 11 ++++++-----
 asset/vbench_logo.jpg | Bin 0 -> 445589 bytes
 2 files changed, 6 insertions(+), 5 deletions(-)
 create mode 100644 asset/vbench_logo.jpg
diff --git a/README-pypi.md b/README-pypi.md
index 6c79bfd8..66d5d699 100644
--- a/README-pypi.md
+++ b/README-pypi.md
@@ -1,4 +1,6 @@
-# Project Description
+![vbench_logo](./asset/vbench_logo.jpg)
+
+## Project Description
 
 **VBench** is a comprehensive benchmark suite for video generative models. You can use **VBench** to evaluate video generation models from 16 different ability aspects.
@@ -13,10 +15,9 @@ This project is the PyPI implementation of the following research:
 [![Visitor](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FVchitect%2FVBench&count_bg=%23FFA500&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com)
 
 ## Installation
-#### Install with pip
- ```
- pip install vbench
- ```
+```
+pip install vbench
+```
 
 To evaluate some video generation ability aspects, you need to install [detectron2](https://github.com/facebookresearch/detectron2) via:
 ```
 pip install detectron2@git+https://github.com/facebookresearch/detectron2.git
 ```
diff --git a/asset/vbench_logo.jpg b/asset/vbench_logo.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..c8f2f889d8f6f37c1941906cf82291b3e690ade4
GIT binary patch
literal 445589
[445589 bytes of base85-encoded image data omitted]
zG!DdMJEkjj&C-KeT>r0JbEcg~-t*lB-+lsjE4UxhqC z_gu<&KPQRe9~g#9yefO(K6LU@c``QrM+H5v(>eZC74j-eX#WBc@(UynG|zH6J5O=o z(coF2{sX3R|A1kUPrgMqTr%Ek2dH6kVq&>69s%*g6O9JmrH)cgKNqIBD*&(sKG|?Z zjWh{jMRNU=nbNR14INn!?I-~(WH>($O>h}N%!hx0ioqHAXI<6&yq?O(`7n8|b;_S- z;&|rT?cj5A5ABO($j7poI#{TUH0e0*6(0b#EdGF6lBkudnfz?jC!}jBo%aVrhT2NV z=Ch(X5#~~-_faIkCCc0=M38$p?fBo)hNJT!cKmM%z*Bz&e!} z>D|G*7n5B_2^w*UtjloP6=~+gdxtdvfo=81l~I!D`-Uzb_jgp2jj&0H$rH}Ey~Zw# zN;7O2Bgl14?_(rC>6(bglsn!(&UvUtm?I9Orf)2ADfr2PcMnH!HEmp~u_tnwA58j4 zfsLN-mC#LAiFd+phZw5TPKu#UdfCEJql433Xnb7+&k{vrQm~{GH3BEZvKU+L2h(ju zn&*ZL=1)JT`6gfXwL=b!7T%xw^3@-#eF2zG9MjMJkS$i>gu6syunl$T>brF^b5~rk z-YN?=d7^nS$M9XIOHJ?D(-?$cL+uB5r@Tk)vya!F=oCx0(i@ruS&PAv(2vPKg^(P` z(A3w#?`KHT869usJQd#!Rrmt?G;~a!!@i`%j+4Ucn438;)yvWDt7n(`2~Az<+#FHUG`Z>cK{Z|6rQ(r^uxUDQ*L3b z-E)vcS}L+^bZMh?kQ!w?$&C_N^p`f9X7eygAjx?|)PiT75sd*@WD}sfz~#yz0k^km znphDj@*Cg;bzV|2#Um00%-Fjv*fg!x-W~Mh1tRqbub6PaQ{`IE zJ(Z1hHijBjBhG~VfMff~989+TU?#lNM(O12Oj>YPfI&WpZJn*0Vy=VWWr0dObL1NW z{KRjjQ=e2$97a@0D1YL(%XTli5jzHm^^4a(WwB=MUQ9VUe{E2ut9BQd*3tQGg6J+) zZb9d6#~0ybz^5%N7=$m{k0B?vuiHOWOgcga4UrItNo@q36+A4ntPCnEGk*ldd(Z{K z7|fx?k`%9cEXIs9aUx@YNghq4z>@b2 zF`?O$9Wpm8)6YHzw=t-{2Ts%m;i(K58fcI0R?|v4jJIBCLd539fZI|D%3<8E8+FTt zA^p}Wag9lP7p~O!V6~smO4LmDN>{sQ>_&TXb)Ap+2m5fTqhU>t(LUQVv5g27bSX+5 zVb+{R1l-*4hze?&3DRR0eu3n0Cz>D##@(e$UT+;0A@H4>tLBJP_3WLlK9XbD8vpX8 z@z{sC?c|{EvFR7jk*G?8)?o00%AqA9k?nj^uXoZtb!Bk#sDtsqP-c2V`!UYjf`|Em zomLG|l2Q8#7TU#-7O)?+liQF(8hzweUCU9PMfN-=is4+M09DXyCrVuwl%!l3*Ogw? z5QU;RJ#^^ZY9b83ozJ+Xd1Va~`UZUDNOd$b(L% z8EZIzJ%SR)n9aQ@#c1nA!lfIT%l6fPf7a+tw9P$}7)YO!I;Jjw&f-Loin8Q6WGcD8 zJvX?Lk5Ol2xex~mY!rp|q7YK9bpGMz#)@`heqUb&3qZLzINNvn48dPyf-Ltb4JZEc zglKyIHf)V9fi4I$^!w}Ds#t*N^%O8g;~aSWoUzrkOW;!eV6r_O$Ge7DL~(3lTX<8A zAF?TH>_InF$6!{7n@0#Cned&c&+CV4kCoka?MeZH6D`&jWoC+3^4ngAZX8S$W1T;8 zzC%J?M}QBkQcTq8-em@y;uDKtX*8p+TPSlB0BR$4~5;g%OP5t_I_ zmHE3-{2#>~(}0q=S3x`j+&dCoEkuq%%fS+rRA0u}1t{-g3N%m$REwuBy^QcQ7XD4Ez9)u8xcy(1n+$TvC1$pR-N=j}r1X#b0J9i8-FVqhzm!H7HOR>oc%%*f~ZBw$|2&7<)P1d&0zLCy?-}mELrcEc@j&T(J z=}xWxc2FL$Cn7z-ZI_bi^p&bjRGU|%3Bko;qhl5369uTlca<) zw+Y$48`J5LbqI+F)w%YF^$XGwIUU~-oCx`D5-X50x+0kjWd>Xy$5HYXl{WW*4kToB zj`4yDHVY8Q^Q8$!h_84VzfUgZ#zpNqbD^*KKSMzQ%P4NK1L1JU)*JLQI2po7ralmT zwmDe$2Qbdik5Y4%ku+W!p1Lwj$hnO5m&t|D%T!MfcI|@EjZ0`}@YFZ>_UbRtE8to~ zi@}{_&L3XXUcLNqUhVryTR)EWAYFu^il~YqFG2bM_EcTJ>oqYJzuim|AkTg05*{!% zWA{7!MM_JT9>W>{#(Q!Kdfe9&vdQQJfMp}08apLM_lst_0%kZRe6f7S4c4ZKtG|C$ z>KL-MI!AMrM@KNZ0^BGlamL^++&BSk<~5AxJgyd$6qns6T&2VtI=ozUUb%E*Hn|jg zZ8lx;cI^Np4(VtDy&*fcm%#3?uKbH#s?;Cr+h12IU@g3_b%Kgjs;XgO${=Tk$ zZPcm4o`Y?j_{Qq*8~$&+R`GYR&%kk2Ccj|kNwFh$)dDgqG_mx^prW}m_A~a3yeYaJ z4@Z?of#8N&!T22O*z3RsJp{0-$U~l@BN0Si75|sM?)sSP4vp;PR};Wki% zVqW8A8f9N+Rn@n0uF<`m8u;NS8P#0Cjr=!{*Xxmn8K_5;m6h9*ccX!mzJEt){iI+G zWYr@$qo=x~`lmhlziw$?4==Sw#e3}A?3hCZsW0&gjdPXe3j3eZw-2uCqwznxucJqd zD#%hBgbIIwx+{&>bEd5+;fgYg`Tf9_1GsDUEOE1@P=I^4ip@_~i<5gx|#h14SxUD9?KpDL?iNf(tnpmt z!C_HTqi7XkfEDwT-tHGDi?q%a_Cv~?bm$^k!{@ob9Bh^Nbn+J{E^-pG)5!?P7cNk) zmtUkJAa}tv6@$BsFyaf>-*9l>LomrC*k*O*%y>&hd>q1?IGYB_C2$8IZxnsk{6nH( zKx-bdeh9ZGAB?*$1;#D?11}H&J zx9vsJSSKFd1DTmW`4V`w_Ux|LF|2PP3@tP6_q3#^aIRZ}^QVo4w9eFuC_}>4jdmZD zgOw4(hqDs6jQ*it5~oM8d*6~Wo{yK%czdimi%M5YMZcYSx5)#0vBiz~RMsH!1j+K2 zq8EdAXM8A-CHWkE$#$DVrbkR&o&oN0ip&LuAXliV@sk^Do>RJ-{w@4CILd;#Atq?| zDc^Bc{0A?wV*-($(p@tpX0zIEkGV^KbW7%_ndSR|9Q|2T$33p}Cgnz$1i;(;Ik(&S z5@Qvrm9%hKSK`3KAwR1D0~G$N6!dKcgv@fz&5PaLx`cV1O-r&?5xGN&V!z&i=SD)D z1z%}L^JZjH#N?FQmlE}-!flQUcBTSEkY8IZM39+f-f=CyZd)|mTkp-x9z?J+m^;FP zE%Otx54pN&sh4XbzUP&+pB?8tc%NdOo-9IR^vt%%jZ^rtGmoB=^0G6wL3sl~}_ zwCzi7FH*P;0m(c~4@t)q6y+9_5~t=lUvN8U+wg1WqaDU*W`SP#%i?B8c=zn+?~+^E 
zdHEtZ4cw%tT3dnGq)))(gwauR3KQzmuO=}6eh+h4^Lfl6LUXS9rHBpmA$GihV$_SK zcuMDd7gl^rb&bUXaYMU0-p;3=@r%TI=y83|1y&%t=?7kmsa1W&jtKK@-3`h*z|_ly zHc^)v^?aO_Am5ZeuT2-5B*#W0PKL>zG&CPeQ#&^QNXt>NmR2K{K$~v_k2#{%}W%9_Ux__NQT>?&-pDgzpnQS;%ce^d{XIz`yes$Z(b$lu(4u6P!sQWvmFh4 zmXE~oFD!g0?#t#0)XIA2WWWl>re}~gUGp*rLA`U{+U+xuQLHw5Gf!0Uqxo8;lZ?B_`LLHwl!$oT8XD^TQ_pR&L-kOZJbtJdl z9==jV5nnZ-3H_`LDuVZ%G*v^5jDs@6DMhL5q^nO)KNXm4v)59JaJbWk2OAUkY*a%? zI)y%s4|p;4Mys@-xV8r0FH}!<@w|&O`vqzhDejR@!k{hXqY2i*TzSnxau{x$s*g2w z%wc0VKDhG*iu!;g1Ky+-(tm}-B?qM)B+278MJUjBuXE*y@hct3_3c7qK$f?57eL}1 z_(W8(OnE}b5C-E*HL;9e1bP+2i_+oy4kE*$@P+> zQPr$1LF5a`sTw%d5@OlZ*;F}LS`Nx;=Fgy(OsfnhH^HA-F*-#Hw=&7?uwa7b5B$Df zJ__iC6Ux|Y6Pa&WFpnzLQP&SbuWmfLIFGDdcUs5#G)I};PSucF` zA&K+qHAumceAZB4{oq8bNVguP+deGDzsHG-MUSp(R%H}11*Bo$t|BCDE+vtj>*$%` zeMu$`Zo+26M<4vcE4pgGdr1Maa>qbTF- z79Ts7r!BoPk>SePtzWHQavkw3CAHJ;<2w#y^= zxoWrb#uZ%m5_VgGL_5}06%up4=Mmq)zaII0Gux(L#!FWPX`y4|p+L(;7MOfF@H8hX zy&~JT>(UsQ+=11}9P@$e!T69>g<;6fHL|>QeAs3(DmmEjFth9GnCnbtNe;K!l!<UvhfiE`{ehAQdkiRxibvr;f5Vu@tu|pGdIZ=9(jqtxSt@=Yx@Gkmz@7%4)c=}AxK7H+pr|C^Hr*8Q3siX;OnldAVgqq0)!}Ud z4n5)e2@&qq8rCCH6!*X*(8#`#bU#^(=*H_Eo9_5(R-n}~CPu#fjQhHBeP{)^tMiww zyX4Bf@fLC<2_vq{nGe!OM$N#GEW5*Ish*>fv^#ODo{Nq z9G~vGDpx}XPJGi)Zhy^T&-E&oo+CtRYbDo`Hwzl9^i&k*&3yoTY8)|6L|441Rs{ht zq2Qm(e>@!uD8k)buq5c)zcxD0sGm!!7oT^aPZ2)sEc6xO z7*npMBkx>vbNqjSmeY|B>~^J!e~2Bd$1Uw*P9iC|@7Ko@pQb(TjXPGnF#hxeuthY4 zJ9EWqNI&>eHE3V;nI-D-oNRJKiek8jp~K|z3IOs2kXkG{Si4}j@588Eh?#$6=%OaivP@g4s*onJk>;>Uwo^7oTL&kD-lONm9IK4a zY0%Wgr+w&~qlhN|J6&itBwo=RW4U+a-1dwc-?`Dn(2Z0b-##3wOL;EN`0=v$!&emg z%^mz{L{1-D|GOO#LwrcC$bBG0@EZAXyiXYt3>nr*6w7w^9g@}qvg+c%RY@p{s3nnB z!hp*bMuI9Y8?Gt_1eSXj^B&qoA9)w@Njw$W9>b@B@?dnJD9g;GkTj zyL<@pk&*l0eVG}D4G*@>ci%?#(fIk74s#W}$cBM2hT$z?R^rR^Taz14C~7$c*OV8c zfJd9`RUOq@05hk*pg-Hi49z3plFSI0Q*#So;qre4{Kym^NecBVcn~#g#)IU{-z{`( z$wzpkF4E_YJeQR@+hc`U9@wEMwNE35%a#F+$5{wU)s~8yf+?+cE#&2w92m{USnK#X zh2q0UbOq=>xb4Es{?fv(%i=8jwzoWWRVi56du3d(IF`Z8__J$7q9TSZR5PNY8F!5- zL8Zh03AmpaMG~-l4sqbQW7@^_K}?d(?J$F_&p{8}tz`0Q4oiA;5__c(w&h#DK`X$) z1zfkBs`%_ML1Yg?;t`VXptc>z+dw_KHHCESG^+`x6CB7Ud6ZwN&SZd%NTgnVBbpwy zPz4Nm$aJ>+2UBEeS!w*FRYdOlTN?{M5_Z;PzwOfBsxWD3et<`fibEu!D8F;>v>lu+ zd4004pO!8x_!)C0fK9>+zPuPm!Vpf&@CdXp@)yVm2{1{ostN6k_femB-_Q5+TOPm1egE!1%r&lI=DN;Y=UnIYJkRCG=)?j(TZv?(#SX??_(gE0 z)Cn?okjV&HO?O0M7`?5d9vLsHb;eEfVuBr742}g^CGaGtOQ^KP7*J2&SWy2(&`^|$ zHObE83_KnmbuuP*ZTC>ww!B5FopB*^y zLEe0ZFIhi*WQ=ySEWn5C&tSyr!^I($c)F>{fFssUJ+O3Qp>Bnko55U~+1!GEhR|V5 zn!ntP8P0w>uzvCQV>P%(@XKNc`Zded+T%7%njk1si7{leJNtRR5m#0xd2)EZm4@^V zuR6cWfjCL9^=w>hCJCPZSzCybPqtr}wxarj0yUY2jBffGAC(q0#N1wpXM5qJ_!`=s z`%(BycxL=4S4C#yAiW>aJX4Xa3(b}BZn?=zZ37dFx@5gU+$smhW+-nwrNTnDfMdl@ ztQBqL=qARzzQ3dVtI>~C^4oLCb;<7-X%a6sX7)2O?jaixlbUF6TsEsHZDfY%bDx?E zo&)ZUiynn2^Ap%5RTjhjaWT@*K2WEp9e55QP5 z^m~3&Q|AbB21tef+1tmSJ7jt_Z6v09L)9x0%!J%6OCa(ILZvFJP#aI(5dS)5OA92i_YoERi`(<> z;OGC%=T0m+PFd;3c6}DrU-jhj#Pt}c$pLJ1l_C&C4w0N5`Cq8^KcD9*7XqC*2@g0N zG78})j@2Va##TU~ms`p=6aWoOe7YQQidd6e6l0)14SQqzyXT)*{(st10&a_dp12-2 z0rhRG9?H$T?V3|Hmnh z<$v9t|GbvRp?*;WH+PUlsXOpSkV0!eKQzh?>46iKt$u?ABH%{dWkK|7L-;BkDLyt8 zd3DKNeDSgDF)6sKTn@qy5^+>KgX}*VKEmW*4CU}gz+N7@`46XL@$=ii3N(?76)Jh? 
z6*oQP?)>1Ys~Cx=FTH;uTb@)>dQ@@^|0j4`5^_5g-vF_#AO{d0=kbF-0AAbf)#4?q zOku<(En4jeM(8IGACl?YQDs1{=P}y4nzWu76(W~H`($8(};X8xjbT|2^5AtoB6ku+y(Fy- zVC}xw1{dBZ3eWl*({{hCzkB^B#jz;bia*F)CxO{a6}lj3jR&p@R#3^O$Oo|d4bJ&RbxqnF#%bkehrbB+#h`SrZrTu}!z6i!tX ze6iaL-y4ZjXx5v!@yWZqT0NgsVD~hq*N)6bqp<4U^IYfFS%2(G8x$H`+WxN1-P!Vt z_#|~jTlCxc-jZPRt6awAUmbeqK9=RN-3S_>qVm>@E4}kxgn`qB8(p}3bpLG4Bi*F+ zbc;h$c84e(I|m_Zm8a1qePj=VYGW5BvW1qSt~X!L;mmw{^XrQ(D=AvpRU~;m>js1= z$R`)8(zhq>{x&J&#a_5XoyV1Kt8n{eEX+ZgPDUb|CY~mbA&5&KW<$0>{wTc5_%*}nU9ko>sJ8CDc_WKdPWfT5WXr~2k1gEZN_j8y-L~^56o{I5SR7Son(q$dj6%;L0vdE-pRX1m~2PA!7$j#nOAkgfE^|i z!W=KZp?8F7V`$`UXN-g;FU(v}k@usWor%mFt-T9hYK;g19D- ztaWfyO3x^DmBegayCo=cw;Vl+;X}C_IJx- zQNUoC6|3dYU(e+KMc}oZ$h4ZwNBz#*S{F50$-D+|whk*_dr2Agxxtqiqvgcy4D&uMLG z^R>?kZy7MN$S+3Qzdozb<8rHoFQL1v|7B6Hh0{?_u&};{9{V}TQJLJCarV0rlg4eW zP9>F|-TDb~^4XHFjDAYW%hXlpk?FcxRp27dky;&yOVk+IQGewSsi)1m%NDB{@WL)P zUfuG}rf}DtPgQ)CN@f+Gy`*^|%2x_yrPeH7q>EM5D63q@GxinMH!I}GrAoQHi)><@ zYIC;G&OE$>71dKBnRj4eiivyoi-008^aJ|FmtFbVR3Bem)Iy7!+{HJ1Q(UMVJP^}p zVrw$7fmjh_`8rx~vaP0xK-lk~w9_DRS~QNNxJM-h&v?z66!TO9=^)4%)piV3LJb<1 zlbA~seh1P93+dH~w>8b5U&djG;7?4i_$o9$A6?HiI`HQ+6FTIXPdrnMi#0ReX%*oP ziRo@AUh>InO-Az_IeAd?V-eX8KI`% zxqrB$omfAio-8lFo*SEue11Dy`>hW3%h*^&yw8CznEVkf2sS9|FoqjdkI1z5GQGNB zOr~`ddnG!yB=x7u67GJ0jAvc%#xuGo3+w?d9XXrDMTM8Uq^LZ26sj*&8%>CMUn^=rkZ_G;l<22%kEqj~q!WNcU8 z7oBb279PI#m;M?_sLipio>d?#JRZ6Nr&Hm;*v@N4iYr0pix}!ZzmA(D31yiKQ)?wh zEMGu3;bLmdPLwg(!Ep6YFBiV)FBT^he|nZALe;@zm?J*MK6(&bMan%J043bLFxU13 zCV+9?`8F-~F$~|jO3>Fm>uG`4LyWM zu35Xm#<@O8JOd5_5)9hG&S5{BQeV7^(wEcNtB6^8VifY>!)0%4x)i87nKt~IJi?v! z;zjSyLa=Npt-(f+9xU8^&tat0z~pT4m0T+A7EK)*8mDI&QE~qJg+gbQc;oBmdWhi$UMkZN&dUks$?wfJ?bK5MxmP0XZC&BIINi!K$J06(9K|# zb!9f=x3FGOY~HhzO;Z|eMAC#wRlZK8t&eQpcjIr({trvZ#cM7%Z{Jh$2aiVLROU2lz>_}JXC z`Bs^s+wjK2pDG-V5*rVZ-$$2Wmp~_{?1tSu&LGDt)j>=H?m{Z1GAED}v2E)&OXQ9V zAwM9U$XJ7kF1&6jP+nN_o$^i=Yn$=w@MJi8M9n*l<2OPb7N*iA4^w1O|r>>HfW0>phyxdm1(Aw*jO(}VxsUtzLV+5q3Pny@_ zt17f%4#v6L7&AK-8*^UwOrINQb`?If1fHDCbN+`N?a|TKUj+T+)_nnq(@5Y)is6^z zlo(OocKb!Je+|0Nt0CH2vMW7NA07t`%Az9L)p*NTc;pNC6=Yb*xAX*Moe03kT>jgi z{?SwmH`r$`#aW1-5e#?G2!u!(@*_&tnQVoPv7i7+VzX<2aL>WUOrGn&fA8UOm)R)FFv5u8pw$bja zVWe-6D1IFX1-^Cz)r6+g}H|J<}zQD_vjI@K~A+%;<~$>qIA` zVSa8|vC9+cDZ)vLis?n7RAFT6WNYT6I+!9n`PQyAo(U|Pr_Bc2FS}ng?%V2l9++`U z@BKXgrNHvWB`t#@=)`ct@Y`4?fW!#@mly%?KJ;2c81~;AFQ)4J)-4N1mZ-eD6)p5S)$N?3gt; zGu5={47a(1Q>BZ%P_;O9vt->?Vk!6>bZEFs@Ue|x0Z0_#rNFj;+ekGgY()bXFCnwx zf77XSBfvE&3(m@?JTeGyn^2Hhoo{xe08a8HK+(X zKdMdXn`A}?H9g(=)Gys+;m|fWH;Gqc52L+qT>vj9#WG4S7&LYj8PH(OQ{A7z&gqq9 zR#YpNe0cPOdeB>(6m|2C62YqI-gK0nV09{_N+Jxxg=c{E;m=z$2MLez4wslXPc^`O zeVq8Hj(pEmb+dVR7<7(Zyd5dKIgC{aV#5sY^34jyY$?G}P0&Q0VX=BB(Q?=_y_%yN zFXU3fX9bhKxumBAaqDepX9SE{u3ty$K{kP~0jf?L#PE$<|DfuufD_hjUg zO!9B7#xZdf1=&&Gx-SRKEY)?b3!`9w`Ih;heq$ z5r?`Qu(HbnBpu*!=SVmNQV`@=f!ErP)|l3i(--p%x&JAW|37XC^GgJP`h{ZoG-l5D z9xKu72Mjo1ix)a|nukZZeA<2gK7>vQ<*^T(GNb-YY9b807G%K55q2^dALK-le@sGY z2-g}L3INSjd%{9iZ>I?D$Xr`Ac>)(w*GRcV(bovN`=$oS3sL=Rt$hz6KQp z(ccGBkL*_r_xdD-a?bwmO^Z(Jj(>hYZ1v)Aq7$d0@t_)k@iqfNMZ`R?+|YL1y`2agb&D_n^UD! 
z=T-ndkAH{ZKMXEnW#MF=_pFr9l7P>}yMJTF0Y454aIF~<6=bpZ12AbeVqjCo5?q#J zj$@A1RN3jKQLX@B0l%JJqO1;ohPc!75$*y;dujUCwJH^a&e`!xv;%nscFfoJ3sm8v zpwh|ticW)|$$$t{v)h_bs5$v?2g{I#*LQlok8cN408Kq>0+}fO?qHIuUKx*h2sgAf zyt^5p3Z!C+X{mCipP1KauY(AK43!E^&s+oM?%MDnR(vDU&ND6~gBB#%Ja~nFrh|v@ zFh8$P#y!Dz6hLPs!Xrz;9ewW%^sGWgDb=xxLG>>J@Ha?T_apf&?|%T3j~7m2gNra& zT;&+;#8t2=gkXm63lwx^Hv5c>9kG0g^4z%DGcRd;nS1m@Mt-t#BMfys+6Pwi=-OIc zxxxHk=9qr8mvV?p=MX4?-G z6trgBdax<{#0*z)JrS0BSk~{o^_G87X8AXdyZy_z#-2>)&T5pr5o$A^-eYr)rv|bJselo?Q`+6kj60?28=i|kufl{|o5NvjYdON+yeT~P ztdR3D+qW*0b`^1~&WOq-zZ0|Fs%Q4nQ&YM;&(7sM-#gyxvZ>a^Lsk z2KOzJb?MLS#_sp!7pz3@>NkCw4TIt`va>?_uEi=1Oj~VB`L~L0(2kp+Z2*`-KEGcX zbwQ7{zpGm=1=f z-`puI{7jmd=BJtYOlQj->^%>=IPsP1p4&t;UQQI};{zVIS;P~Z6>UvVb0}mUvk+G3 z7E@19`$Gc0=BauX{LXVvxOmQwduZZ>Aa`NS%uO&#MoiSzQ0h(}#XO7$tAlcH;yAO> zJS{QI{jsVlX6(|ZTvt(%@*~ek5z3Dyp&2em`#1z-_T4l;CQo@L9L8Ln>J}MaqLlEu zl+7iQgg?rhS(%n`{ArWIYx5H35(6Z}8G#W(M!;B_yypy_zHBSm_++CrCUA3-$*^&p z^9t!8BZg#GbKW5aCW*;j083ftD$o0SY<96SXrWv0zJ(szs5RR5h(3W!&6_9fAIsE>U~9M^+IhM&d- zewgBCGOXBK6e6Sa^4qc)%oBE^E6V$hmTsW&CHzuCcKc%kwhmXcN=pIjfhq42-&c!uq7CFA@p5t5u_K}`?}e)Xd+z&J^)dwfR9sJ zBwhOQv{Vc^?g+-r3s(8>V4c&l$QzapmFn2x{YEJt_r z`HRo@+5No*F{|%hmq<9QmwTCl)MEJ?-c@_Tvku6H+8SO@0~ z%QVjRBu_bUQtm!2zhsVbtcriGs1#pq6zeaK&~zUM1~3oh;x4(ewCzAJQN#8%!5oaH zvHpFn!^7`;X|-Omi*S?pt(o803k^}lk~q7-I7e!)`k}wn<$m1acelHjPs)Biwz}9a zU6L^Hc(g9ENT=ApP-i94nEpjYg2co&DP?ppDaJ%^Xr&awgJ%@Y5lMMSXlj>i=1bRB z>BNxd%NfD5*TJ{2Ord$gTdEe!uUpx7xmA%CRj}VJXF5X8`XKrcuxT?qQ(9q3RbTY$ z^4|q(-uzOj3fnJ}qWwN1^V6Bes_l*CaDU<*1_bY*g!YA$xxNFIHpml?iK?c!$YbPd zxBcLyHf!0oJ)666skQ7rYZ(XJ9!+_VhB{EgDnj@N{mnzUJEXQN94=@t6jQ->F`hf- zHX*uNV@kgWs;r>Ju=Yjh@89jwrg~{wpf$aM$Ga0u0xT* zGf$T-kV{@af^^_wbIMfm?C?@pV^W04E*w`y_;gb^=$ZG=PH1&e!9Zg2z){)94%Oy> z=%^JR4nDmh#+3QiY1JRP8svDhpO{jjyo=oi21snLH+8GDg=tK0%VWI(#-3T>G*y*5 z;*!<{(}EbfIj(a@pIXjOPARmHnVo~_TFdyAqpfz`QaJZomq+|QdPHpxcqpHNw%N1u zy~z9U!tt10;@sT2*aQ2Z)GJ>Zlu4)vkPIbRjOXIsD zdu-)TMn@f*0dwU}FJr4UE(SxthlU>Y2g#LX*>?dY9?pCwk>#ZJWKdtNRa&E`-Qc^i zhZ@n@_sQd^0RX4x-H>}EBg9iaWcuZ8qZmo$i%lrq%u=gD>Jjlitcxz9YOgL zhdG%mX^DFR zx#zXHi|vUhMsGK#VTPrGaWsgopb5??&HeoFC@5aa<7&9oN9M=|1vI`AYljua^e)VV zTj1&|=R^;U&KDigY~OdSna%zbrU2BMu#TB3WGGC^X{4TI*K#8l6D2>aoz@j5Z23^H z@QeBV+moTO>i*2pd@uSrGq~t8HgUYfFP(<1CLC9yb=e}Ji|4$+6bnlAt}Gg@>=M@I zI;ysB;#=yO;pE-wL{MSkC9F=w8~1DXiP-67H>1uy3yOIc@Pcr9s)~WocHn zM~U|nQtqoA2p~T$TPKAM?RWc0Y!RW@zFBjsa9Lw?n?J(#?5xH8(8UG)ALB{9id8S2 z^;?9RQEXpYDbiWleRV0rG&+$M8J7sAV+Z}U^2B>hA;F}10f;u*@Omi>?e3YM4fTSR zJXt>Hrjos>Cj0G9L@nRYziIuWx+7=b$c#T-eJM&xG9u-m1WD?eD!xO(yJB` zzMt)K3Kw^0Vh@hhXSrC$F&`RzC_SjYBNfkf5U@j+2|b#qkD-Vlq1|3AZ|XpoX!)Iv zkXbK+jdjs=%pK*zCody40i1;Lepq9k56fS>nDvX``IGR3Ft=>T(>d!e@UuNTPw`xQ zb)S9_NLz;;pDl7PoN|(TyyB(DPakLfI@}s)IL~JKHd)hlMB9)1Vf6y71felE#S|KU z+=bozS*@5BD_|6iGz|!bV4Rh94c03$Nsl4=+=;mBRoSU{If(P)AG>W($bKtic^H?V zLp#1`b)pAn&8%by$6s(m@T@IybJv}LZ)EZ!d#ZCnyvyL{{3Hv5 zdX>tt8AIX?U#I@%IhhDs}3bc0|;=1p$>MZ>JVec$h z*o%@E>GV*W4HK?JL$-dRBJm8!EwcqUjYZ?)T0luJx+jC+8!pZ0dy66bvBJ#++C$NW zsBQiA<s^GTa1^MOG9D|QO392m(5f}kfCo?yvQ#;p&vKUnIP%y7%| ziaxk}=j;L0|DyasH0uDxc1w#hTm^h7K52hJgL368Z4yW9{d1Y#vF7w2fAWnl( z`th~IKHO$@qDPKLsR^ND?v&7!5TVN+pwe9YMIcZD7*AuPHopjReJhV!s-cT3Sf(2* z=>G5ChR{-_5Up0|NdS_H0Rzz%Z6p$9ZBA0n_JF3wyMh2;g3>Zx@$no zP}IMW;8hwKM(vMdAjB2Gx@Ib@Baluz5vdN%2&&|&FE;34{kF#|W%bcSm&B$D+Z`t{ z2}!f;q5JA^zD$$>?F5*=+V8|MUAOvr86(}YtSonnYAlwc8z1*rQ}v$rPtN{|*DH)T z&oj(tP2&BDp)w2hwij(2v!lYU{ZGz-V838p$8AtMi%tEiQ;t*NN9ydi75x?c_ZXcm zo_PX9 zzTf}uv;OrOv9h{U=5(S{*1#oKhOPH8@57FO!OEl_n;zl!r~Ue;LO>e3D^ZUJwKRli z>@EWMY5Nq*^2q)Veo8(F)By+``KC(LsrM=v4F14BgqO>V8A#-k4Os)tYKmQ2W>(7Yv^CF)?ZNm 
zpXG$pt>4;B;0Kp0OsUQQVnR+4Ne-v6$6V?R#(z0mO(3!1AUBH?p=05=rr-WQgRhc) z4!lwG1RoG?2-=YTba(+k{O_TRaAP1Pl}30R*hf8|&Mh#V&jbEd)C3jAXQ;)r$O2fy ziaC?%OvcRALpnbynlkg|a|p$gT$}M58m7c8+)#C8F#4nN`;BEyc>QJunXv;xkcdvK zbu}!RjCSb`p5o@rZXyk2osRaD69h>s2?icxBtk3b*Irkh&H={FTG6apAGFS3PF|9@ z<4|4kbumG{o3(|*Iq>~iTgXY2Slkl>Q+;a9>VUy2;Z|z#oK@>MoGyqcrotsb7Q;b6 zb)&iUF^?!_FFH;bfLc#s9hnQ|fxD!{E$y6`(!*N7TLLhO@-))n^EjaG^r`)0d+bYJR z?YSndOE!U_(Ct#UO3LRxE>C}Wd$cU}?n?W}8ZBOZtGx+fJw%}@3<9?a_-u;l)yFCG^af12IM2xeBL<`*FSOgw$Y3o={jHnCIthuKXxKNYCbwGBl>AIw@4 zZQxaQ8LPsB6xj3%tT_qp)pAeGNG{};*f<`$j#L&{I6n4#^`$m}_qpc9AMIuY0WV#S zRdFG3+YPRWR1#zy*1&PaYB)RAu#|=FjACAzKj!#_oCcF*mA znz@nbi{+KF`Xb)y>ep?|3CWA%S)Y7b{SXMglU!Q@c8dEvM4s#!l~OD0yH?8DQwN3L z+AV~#Kgs;77bZR?cwg3#?;y*v5`L|k`ci0E;!nDkpr5-Z=qQv!0~dJ*!Y^eBHuy>7 zDR1n_gc3bXgVx*=f8Q5t zpFfPo%lduZXu&Ifm8Co;OsB;Z=zDmhc9L#iOtEH5vRPAMtBDzf1I=KHqRwFPctW49ykiI|4{Lht0Q zd*{+w6N4ZPW44nFL*~AO8vdn&YN!iQwFZxQ`YOil3HCY(TfMiHML)>4sNbST_8?TmK+)#eGqif7`ZD0A5Hklj2F(){P7flQc=VNpNKn4 z=~B-`pWiGRPPplXfZsfUj=)$kCbJN#YtJ2W@35Fv`A_Ra_Hs0!y8{^LCX1axStYwl zzGe{kf?-jI%t{+k*=V1w;9rw3eV)l*Zn%6w{ z1(jX45foS%)Xb9{VVj3%)q1p*(gPCL?rvV`Yi6`1JhK~xZIa!ksMui@HBbhn&5lbd5o{Nhz}pGz;E8KzigxqsfUt#`e>|~DfQ90C+sBAgl&uyEAk5r zHHY0n;-X3|X44N(f-cb76l}2_Phz9TF4;s}GWu5VqM@P4UdXdTh-E=OU-4;Un^i?N zWarU7;)+zto8qe?s4I!{B?5u|TYN8lG=w&@KJw^aHmE^#{cyNmgX$^r z&iO%aU&YP7P)%B}PzP)4jA~ci6#eOqe*MH|_epI6yeoLAx0KP-Iv?YcHWcHl0 zU98&Pr#)ribMjW{EbV&fNL4Ux z2NDEk08^489G{~jU#j}E^k9%EOq1NA)gLL6!??=D=yL+6_e0|k4Qwuc(Yp*aiEsF< z#<;v!9CmGN9$6@Lt846=u^So+p=(hOynV59W?AbNTFFF8I3(R$Z%TC1vo6+!GTlTn zyiJBxA6DthdQtpqz`bD;qB43{+Kr51l|D_G-Ya%T=eL7J1WA-xv@6S=f55n_4)?t; zeBT&w%l5ExB%#u zW&Cws=4J(xQKivXzroqa@Y>t#pVIi;{T4r_1=6J_9;y)~T9dp%5oc$XC66MoL>nGT zx85c{IOIxxqECTqRwgS5CsDDQ58haXb79Zz%00QcH#yPaV^_yvLO^phMY%iZ7lCYD zcg1@a(?ZgOHEW{i&JCki7u9VyJ;vHiE&F}>rW9E{kLtI$DbwXFjIVuenX3~`=up4) zL$rTqcT zZ^;Sv#L%xS{6yAYQ>_pnl3j9U?dS8^#AIhY&+Si>?MDD@9bJWNhM#!)<;*@Ux0#DO zuZNzY)>*pzz3k}f>#vet(aqJShtNSbOv@z1o9Rl_OLV}*rL zOfspA05`pluqSPr`RuTn_72~lL0-+oRF?m|0<+|zZN^8h^oiK$4cGRp1gR{x$ZzJG z#ZE|&?FWpYRk-VRvZoBcBy|Gi>q9#ae~zM)NHTrxg}wmjVkDL!4y|bL9LAnFj|eXr z+ekI`e?`f9LCl@lTW1jj(%)g9|B9+C;eXR@@pia(`6#*Bt+&0sUHQXc$b)9vLzgh& zmuj94N(s|EVe{ud-*Brb$W9^~3ZB+?nTX9#mg%UHC{Zl(e|Dz?xU4$c6+gH)RbK&S zhfkxj%C0OOkjCLn}C0i%ka-_CV)04wBd$Brp|LNiK` zzVS|latA_Ht8(k@w{b7XdFxh(NG0vSxe|AS(My?!DJCuQmNYjXvazoWAd9)i5DT#} zyZN)Nas2e}hW5YSGC!7e?Ph=Ziu*F5>YPiy;`h8e)U56~II=De=ePc*cg^C;_9WX( z^hu*x9gpJAH!`HhwbrW$_Q#~*Ou%|5pk=XGSCaR(#@vE`%#HL~{!Y%5JL*$dii?poMiDz3YpQ&LRaNvxso6^46%C15U6Ytz9a5 z@Eo?DiUsg-SUne}O%F_|hXQDE0@VmM%3Z#Jk}~XMr#E#Ch{<{fdzx#T7HH`kZR#eS z4(-!?x7gtZ$yaQ36H7S^ftW}c3EWe)+LaN zD~!F^w#d^1E!>|vO z)W5Hy{(GRv`#mGhZn$Fb)$D^mGh>0se-N~OEBriF6Hm2%r{xssToslnZjN*kD+Y8% z7LQn7X=mus;D+p8i>qY}r}Z1(NQQli!Uu|;+`)Z-(dD;EjPJBTs8V>umJ9Q>(A?4U z$CKwLGL23a4L!$F@xpI8~p=KBIqF%58XsG3Hr3!cCDg)y&<}SQk zFeJ!VY5MUyfs7RBVtQHKM}djQZnIxCmucUR^71Y^PfrkG-2;?Gh*3 z$Zk_~oQXNwy)rpF+UU+2WLPc2*7W6W%Xg6ua#nS zL`iCgOrtN?*9*-1xSOy+QfRZI@RrE?kO&CsD*$xPV8oYcAvdEAd^m2&)cTcwMq7##BJ>s9%N@8&FS`qj^3$#i?1JkBeSDvEhqGIA zc#qa+68e@nWcuQMtLVZpE(vrdF?fgf?kMg>=7~Sa&5W=>&L_VJXf#LRtEH3h+2mgY zM^bSZFBL#sRDV1Px-o$LI+AgOMz;br9(?azImENa0oh>x-BS&Y=r4l32eRN>f6mEG zd4hbAl6%a-(g4uoa_hc}41pT#-|@Z&v>4vVW~~K5T})b@CB#NH8Wq=VWw2zWIUk4N zYunPx&e476tirKN5&PUdP=-FUpgK&3>+lEM0B~8N6~?i*f=qFMuy(pWl=9OWtBpFF zm$!yu*A7T1`SCKqt>&|njU~E^;%EX{=Wru|Re?e3BX^h-nX($9w|C+yp8V^HQjp)E zI}rNva-G3^zFBMWK9Qn@O$)xV+^++uxJ%C{c_3{=tY(2p`BQjZAp}Ugn*rwWZJTxT z`#`~2r)7S*nfVkM;)-|(-vM~ zC1}Ten_el#w7o6G+=SaWD=0h;lr)LBRQ!eCNg%)qo8XTTHUSnyDn$WHQD8)N ziWi;!`9F`R1Ka=AAMnhgRqAq 
zjm_z2aaNFC2KR{n!5#YaLmOo(@taOY_?S!Q@cHr$OIs5^#} zH$DT65&rD-ScE$cpIk}L^fS3x`ZS9D4(KiD3nv2bjldmWzs~ovV-=I}iMCCB6)56&u~K(zn(*lq*ucWMA<;|NcHBrWGLbc-9_r~isW7p(=O-|E9si) z18-CS3yH#5+me=D_qNAwpRZ)@uAR^C>}6}YW%?*~^4-G5r>!lZAYBdma4dC4)T(i2un>W7I|0Moc zD7&Wnv_GBs{UcoGpW8=QLn`_YzA7gEn$)4xwy;Epv4rfebB>Jo~{ z)g;@dSN9Jbwr_%qFRzVxAp&1?K#YPmv&xJ|C@#BlatabLK&hhM|D#2L=y zV@A}%?E4q9CxL;5pMU%af4P~a6F*M+R9SS@>wDN|DeV@gJ*D(sxxA+_87q-Nvbc0e zMNXXguHYNu8-=$TDX_EAMS=z(Z}1zQs!vJM{o?00XSn5Yb7oYYi9>`Nch245zIyvp z9x|?V|*^oqrOyLjo-B1aNALLq&vtyIg!Skoo)jr5)s*tIPJ3-;}??0?iwiU zq#hs3Or%L?93dJoUD3|H=S3JE*de&Ku|FztcI~!!x$d zL$__usCd7CE`F-mzgv04U>$a_9azz?`fngOvMkWyXz3m7pFV1 zJHAaa%67-Aa5l`_v{rq$sYOMiEKy&o?zKJtR`9sxIe0|CXf&~;{EP3>Tsxei``gsB zAI7v5$71lBD?z%=MHd)#*aYE?DVz@WcsHm=&&SmoX&pp0+LN^!j3<6L5MwQry?p&5% z&QNi4Hgm0{OG>P0b5BJ$V?WAp)$v{;9OlvvcF9ITh zrajr8hs_4#GvB`KrXK3fxIXUG`S4Kd@ZN&_%>`wbH69+dFr?gJVJ^AaC4auNwZ(2` z*XcMduZ`|p{rD!vSsWF>6+e9iV?CcqI;{k~VysQ@iy);Sx@#z|F61io^}|Qkzp`w? zpN(#vt(n`pdM9Gi*;yc>^vdOS&Paqa#(h3+^H3n}s~PHqe2Z3|?7V|o!<2RhqO-h* zKACCi!q+{>wNAfS#^8xm>4n4=4h{9Y^`BBkps}iI8lvs{^Mz*^piT0(Gj*9>pK!^wW^)Q9}Wm@64xI+5rDcTFN+4a)UQxp5{=MT3ib9 zAr~izioW=)vTMdsS~B*Ilf`X1#?CYAVd{pBCIo22+RPknE8+~+9@T<)9>h_gsRX4f zdR^Nrt&WtZ7b$mD;-rSN=v8o3x4R)ttc_sb;gz&6;Pyynb-0U6RVvk1+6xKkuIMK* z{nAhFdsy?c%~m+wA8W!W)S> z4<5f>3SkR&v>-ve!w!M6M-y&aa(%8XsDpq$$iLuyzeq5=XIG2PanYfljtdHFXVNGgv%xDy>wlunj#VDviL0#T));SHU(QfMWyU#aQ6ZFl!+1O|8ZhI7I+JI)NLlCWVnm+ zhoCcDC_Oj0^Ca+y?tQX@@3?Kc?ix9TFTH@f#rRt;6~wh0YXRmt1^ZmbdY;zBw%a8z zLtyq#inUFNi94oM10<4B#s!;t{(wYC8X-P=5ML9e3<}m1Cgj0)qYZ1-&q^P@hCe-To!Lm5hye%o*XO?UnB)9L#3mc#}{3W*%2#o zcyiza_-AL;@14vD8f0^?*VXex6WPjoI(5!`LqpA>VA`DZr((%D1AV!%J=BKArXl8{7;^P*p>8(^U6;qgzJR&2d?(nzXpzD;dbB> zI&(i(56%JdU<_#^xKQsQ&b@3H*a(-P?ua<#a|KG?x_n(BIa%{TtS!GSt`pxJvSS@- zjk175a7|d9Lm9MjC{lIjSZZg_e2LC1|3M^ME|=ZLtA)I+?xJCC_st@+7dVXEwV(T2 zL_5G6A*brnZyC5^CCbvN?!~n)Eu&~LwEl(mA*5Gv+ez7X+jzh@vt}Mj(7i;Ti&BWy z8k72zF7;7?n658eyQqjf=VU6o_B#3@CjUZfZwUS@CaaV`^X|ax%MEw++Gu~Mj&jN) zPYkqblt_~gMCqt)Xaj17$(<4+E%P-!4kw|kpxmSpB@AIhD19dp#g}2vK^ync zw!tx%;#xfC5Me587=s5_CK0U8){fPx0pm`gcHFV!8CqJscOy}FmyP{EJjL5n)uO?c zP4Q}qw|}Z{Xh~MTP2W-kTZ>#-j2Eq&y0vHzF z$b>(4WNxL1*(gx^so%fK8ko@2Jn8MH5fFBhfbPewV2GcA1t4gtV6Fp#mK@p*CDGI| zwKuI51y3aRKYe}Qsz`B_!n@1b%H}gE`lIed`mB`HVnzSfi}UD zRe~$Fx6K+Y4&Pd2lpZr4!zX~xTlzOKYp${oQA<4}=q!qKivu-jg!zAnd+UIx+U{+1 z=mwE)l?F*cngNs+>5ie45)hD(8cLCFP^6@h5=lY2QM#mCq#K4B&kaw!&-;7d?|kPw z=a2KpuqQTq?z#6JYh7!t>$2Iy9RFTXOaG*@2jw`7GY4JX35`DT%kqmS%z9)@x_fnd zQ46pTNgQw0J%*Nk@?1e4!kPLyVGJE>8a!p(M-;>F(`e4+aJV4QU3NK%Js7p_#|xYhm;z#M_dq8iR3hn3#k*EQ zY`&ScCM0F{F$uIrJd~qe!c$j>z4yB^koS^Q`^l7<1=0$x4!N-|@@y_7>M?4T$x22a zzUu){rEpL+UL=f)REu*ghmOO5+8kgoS4%kX)Bg1k1UW_^p-8O>bg0zz2bwqn?*veo z#hmi3GjOV@)kZPccBJ$wu-m!8&JgfYnTpg>+mnq@qn!*>g-s+wL@^)K=7}h>UH@gz z%Ue>(^#1{_h8EA(4$zQHKN}z@jbg6EA<9O<&P+x}ip9=M+B};ks>X8Vh5b1|{|3F# zf|cCiqw{!A3Kv`OY3sNj@Et9dUA>Ptv5+CU*EyLARPZ@7kLS1-#XZq208&m{Z% z{<)wZ(P{t>e9DYX1^K^R&A(jEe;ng?ZP--}(0{}Oa;IswJo_wr^l5vjb@tTb43&@y z^$LwasS=jKtFJ|I;>F-1RpWehdjp{{M2#D{1R727-!4w{d+h%X|G9#kJb@zfc+M3S zlVZp#P=Fc&?t*rrA{e$$&A_llGc3M;)%NS;)vRL4tQ_CQ zL;&QkB>w^^S2k4XYYv~f6zP5qgbYW2pg*wAFCax#5ek5z zAY@nizO*;kxJ(OqE^j2Xh9gv!(JLeq(zh8tlrG~PgSzOr(DBAUS=KNA>)RkTYooMm zA95bNA(f4_(pk?|s?9|tDi*e3*g#cpNN3x#kRrd*OG7RJ(Lo<0-cgc-y?bePQBawo z?6RXeWpNy=8t&PVuy9F!XtL(Xy~ClGO3!+brSgM)9Q}q149iNq>>D9_Ni(h@7WNFq z@NTwuVA{I^1r4UmjP9cx4QG)&bDaFl(CfR5^5+7D7Hv6lS*Dg}kistD@`B-h09s&j zjPo?>9BXMMZvmMnv$h77krnjP(>bE@wypgdwefLffxZp(GZp6P6?JW0r)McQu`1!3 z>7RfdG-I{PxlqJ{#6V7t1fM;6W&C)=&2j;`jpkPRtr5Y}xY>df+2gp-*z?J^Be)|I z%}B_kpTTTM7<3FBQd?9qvHBS-!namf8HOA&Hma= 
zqEJ{-8<$#-(hFx}plaZ;`wgM7ml^NZ z$?_x;eOcPZ$geLK8$%`8Yxm&c4wL)GTh-2}ubS{qba92zGD@uJ@^FLc>&sOjg&AAZ zCwbxg3E}DS+oyHo1I_zcZ7HXLj>Tebn4-hd)6O2f+(Vw)a#ISug#?UF#|2p1odpu@ z64+>tE+gk!)}M`gOoQxko#Hxt6BO^Cs)^kH8eERbD~#7?H-)6QU>NhZ%a`Z99F0-a zpP(%G>^|G5N=~)1d`~X*aK{n z86H-70H4^8uG65gS3WSf!+$qJhT4Lif zK2Z;r0g66k3Ex&rEi0ziw=(d<=AD9OYhPEBuOAir9M#wMBVV?2elHHk*ED4Et5|&% z@9%G{xuW>3AYi(--B_xFPdoQn>k9qjd8awlbP~&y8c<8V0j6J~UUbMHB*19-vJ$gmFn?pzmt^^lHoFs@vAU*0BK zRO23m*JH-0cqjY?p76JRA9Pc==R=0x!k_E79R;>yi|lfSU3sAs^9mTF<0&aPM^*pNe)n;Yg{mMYQpe-Ga5?yBI2A7QA#oTRe4dxVWGs; z>p0;(>&ht>Q(7Q><+posHBD&r0WsTH=5~kSjX}IM?jBi7P9m=iO7@?;dlD11LjxaA zHWzbS{8+{{TXg4qe5m$CMlq?A*b#*rp}c{It=~}=%iAx?Somye7&7-16iG2c0`2Fi z&Eh1FD7&*W+-UWNovkzdgU7DW)~kYcshgTa1}GW();DD|mWR}(TCP6;bl70`2p^_m zG^-pS{v~JoB-_!pN1gfJN`E*5+UA9}l`4&IHzR@t*4D-jbo1VOSEJa84E1 zq+tIT+kFLTgEH0W6pGA}WB_?=t8Jw?HnY6Dz08Y~EC3L8Mdc{XaMH+)JrmrLq%%}b7JF}(Fhl#Mc_cD8fywoxIFp~VO>Jf1aPja4 z;osn<%p;**74rk+0_`#F7n-V#^>inC*I~*kkn!4B3kfV?^C&RyrVjD=Sl77 zcUvpy_MCUE>`=vS$SdU|FqHRS_E-7>=9}h6@TZOlEjTfJ!4zRIMNM8I-aSHyvGRRb z%=8Y&k~N*8;Hev-3`ti~x-aK+tDuA`cRlA-WXkO=AzHm?^oyJ#%cI*@cF(_&U*u&) zue-Dy(O2|ntMp;712Ka#g~O}0lDHE7DotY0+iT&}07u|`zQYss0An-z=I7JC-nz#q zyR3^P3437+RWVBh9NS_cR@Htp&z|_iiQ8@$SNq9%>h9SSbrCNE55Bme(ed#MjB z6g%W7=Wbocfw$1!leXfF^$)Xme9f!*1=Qs-DndUvrA?IbP$^cwa5I~8Ia951oT;PN zwQ1fIE?qo14l@O64wZ@8!au$qpLaBtp9d=zGg%uOE2Y*wmM&dvT&ip}_BI@QicY$Q zcr8fDuK#1b?E-(FN|YK)T47U!G%KT2n_g9>4Lu}oopT3{VENI%1^X+D#+!h+t>B34 zWc-4}FnzL4mEnk}9{H( zSw9%ARF^3N{q_z#SJ^d$!EK%{KajSHn3E$>r$0F`&>gWRv!i}iR`g#Skl_g5^?Bt( z#`=8HH~YuOdSXkimtg3;+41JMONXxBN#gDbM7$hxw`oZ$hr9RX@DCy{4=%!-n!j zARbguux%(RVOpckkgDs0f-O2W)UQF?xt7D(D+LLef19K^i?N}oEul(m8A?~ytUnk$ zQbtC}{J^v4CMB%Uh^)XVLjbcMlLB{_XEJJm2~OK+wL|O8yWf`N*^tWqxogeGF=R`% zdMD=IR`SUtdoDH-J3CUr(&MR2>}`&oR&-(j4hCs*Zb1=j&=w+uCbZH$;)SxFcPfuj zid`ZEpC=~&r>-_#qR7RMrb7e7=eoGsFJo6S$sOQpuU$I0eB-VhPZ)Hn5%;U{t{c*R zT#~F&MP^X%;8;ag!BzcN+{5oaD`e+TST~R(XI$*R;OL*6YdJpAd}-hq9&3!c${H~K z*3LoHa=Pn`&p!ui%({i^66DTwlI+DM1<%TM} zM+Ndll)7|#L>ZmV(JsnWx4Uo4W zykk>G41JZAb$ZE>+n<_*9X}f~R7~?8_sG+DDTdk@M=+j-?#aJ7Ql<2Gue0ayg(tPZ z&fR6ssQdDC+tS9;vHfjwOJ?YX&>t`r@M_M+kEyPJ3%v5<%w3)!hm3>6I_2h%wvh0= z34z~`x!T5a{?A1qZDwB^F2%4b+aCmdIwbD8H>?&&^peSwFm2iN)XDx!9pk8eI6y-v zeOMg2G~DF3+u6Yo@f>IO176y7Pm`UBfq@Pnqkj?~0ol!N!22A0kz57Fr-gAaGG=!4 zfpgZEi?%^ovXN(h-6pbgc902U@Ea1lx>R~j0X^Qwy|62tP(|Dszes}8!GC}rs*cv` zH&eDadkDB=;+tL-@Y@)?H@NZR4|e0f#vR#iH6U1N8S`bo3`gGLYBRB%)N)G>BA6B7 z+RnV&xtJg z&5K`{aC^`>E-BZeGP}BUANJHnwJf<2zB8f?e;gi!YKY%kL&ZXo;PKI ziHP?agYs&1`r{f)UP#hOr`!QMiVX03$gaxK@U9mxwr=nZi_3#RNDsxy*}5AD&ei<; z_~HhtZSiUp6fEJuYsU)%Cz4`Z4H9lJlCSI9{=Cqi?fO0bPt?I?L+LTceU@~#_jG`a zTaaEwM6nkjR?M02YlaI1u85x$KwC3VlZ3$+QvhOakn=I-M?qTgM@`#J^+z+RMc-RO z*|P^A0Ew^?_yZsU@|;%&q{3l#ca|%fbf7eAj^ta<0VCI&;9?a4;Lr64R3c<&%P{%? 
z2H;;MW7jKr4myA*AhD@6$AXkBsZvDvUdf4tqy9rVS5SBMb$h~}f|Qp!-JyuA=S_Lw zP7k9T{i`r6K5-t-!LY00cq;NYxPZDT`4`iX4(uFV_BSLex74qWD8~l_8Z_UA0aAt9 zuX>#egH zBmsxVAG}BN9h;sWf*%f3g4tO=U4adwVDWi9-nDT|*$67ZHnxz!mJFw?D6*dO- z&cpGrGZGzPf@Q}yKM-fPa4q+3Xjby_ltLdI5Sx4yuQ(!c7gzrhzwDYDSy4PzyeYyb zM7vrqTnDTl4xE~;j+CH8GCB##-Mr_|Kz60Zb#5WEHo7yrTWJmsD0cIssq|r+Y=~cH zWB&AzYF?|ie>&gYer+}-%0F-{iMyIGu&N@@(t>IHllLuzlMhfW!;F}>zkiR>xSuEu z6Z40Otek@a7ESHbuOCQTNtDB5U;|heH_wlqYX&OnqQZPxsx0ek4(!e4Wg2TGALTbO z|3jXqcTwfvHYoqj7#vz^Wy@b8ZER` zfbw?A!$QS+c6P)psXs{I*?B1EivKSUvi%gxnEmjAHDzjyzJ0u{Wg;bK2M$$*=5`-4 z;rul2J6SROdtEu^iuj(P)&lPe6Z6PhKg)Z^sS>>UMoX&Ici`sg`0N4opi` zavv4_ZWUw!g8ul|~dnEe@N;@n&@q6lw zX0L6#uv;ZI6i1%x$l2q@ntHD$$A9mWqY=~|3?#@3ExQ?hpfU_Vrle_Qx1^WdOJrtC zZenk`$x!*TXrnp48A)~fEkO;s7ni=}D~kP6FlTr&+1)uln{MG#@I!%R_zCZ;L&&~l zrRo+M4I1uqHw!;$vTj3+;G)wM%_D`ORK->+^W)OOb%ROlqnEu+bo>ycUQwaPhX^;B z8r0%b$nbsagUF*7pDHNo9^dy>Y8Ks3YTHKVr5`nT5c7IV;h<6JRoJIKku%+AHv5hR z$qyQ2c-eU~mdGe}C{i@Aybm;)&M2PhbJV}O_kC(MV)n2lV^pTNTBgkOzATH16b^K6`r;sA{T%C0pw1 z2@pX2V}(Y&Uc|u*ng|vGQpncUm7-QZ{!D6P2&QiIuY6iLa);yH<=n09 zM=3(+OSfK>j;}_27r9l1B$Ox&-&^$;P^JD=zBayF_WqkAdvnM5q|9n~idBv`lw`M^ zde0@Gq5-z-BdM6)6L#z>7`5yxs9ND167Kr*1O69`zEQ#T^qZ2j>kXKr1gVFQZ{KYZ z!Hq7Wpt_Y_{L+O}PKQE6>#($APfuhxF8l5GWEnEA{d!EJrjT_x&U49m-bh-K3i6eyZbkOg1y$h_hIA{8p zB4+(QKc$u5iqTETMd8_#%K)fdgxHc*koj(2h*cz%Snp+|+>yp0XSP#Bs@{Y$+~ zu1fiW;L^eL9MhELy{*US#4+lfudz}NPPGE_ieKu8y#o6JB-<>#B@xj1wcn+UdaEYK z-@|V+(jnaMsRDjVg=2Fu+EQ?<%7Yf)TU8YF1`_QxOyNdM&u;8~8%SFvyLfpav&&Mx zJJn%O8X#D~nQucv=@y&E4RjJ2MKvY z_$dC$YTxM>{#nU=WhJRmHP0VrhDXNhnV0k7KVN4DQ3E` zP$FLDmVcbKo9yi?cjja6zB_G^C<5*KQeake8ksy=N<+EKz3mAk1yxTX6w{Lg$0Fqv z(tRW|uR{kmz_A>+nLECAdpYL?&ewo9yn0Q7QU_w2pL{@6U#X$YiAr87fM zVYKSfSW$a>@gJ{lpURIlyiRKIIFe*R0|~f+^<2{7A?!Hqke{#Dq{&AtQgS&)lSurh zLi6Qh4a=dzzRRPE=iy^C>7V9y71AVc*)|?c9smuAV)%oEi+y zoL2?68uh-#k70MAV!pTA0`DV?{p3Uh;$1V}0?X!4`T>o_YxyW(7b>x_JmR$h`Woue zNxa{wc|o*I((J7O1RSr^q7^`J@um$M+8z-S@4>F$dK%8IZYO~#!SFA4e(Ukgtn_l@ zK(0Ez&9a`ztXTu3`^LHTWuD~C_PuZOL=i{|0Xyct%G{p2Cxql@_$sz@LBE2|pEibk z`H(oP8}Pz}87$}joIKh!D9`hkvA-H`gP#!`z)7Ux@wRwjPQ{%*fh zv9|OjK>K{~8y=P%{eqw1Wy{4aL~!1PKEiN8HN?|o&~ee2f9%{g<;|>ht0Kdg&~)%L zHImxy%n=Mf32%K#zRNUM*lYLUmq`t%Bk#;LoON2&XT)Krltx6R7#7AhE*yo*>BSZ5 zvd({W7hj|t2L2CPO%#c>Yz^rO_nb)#E;6_4@FV9b;K&SUy!MRykE1>zT%X)OXMn7g zwX73LHH`4N@XgCfO`4cp-72 zw};Hy(_M57^<`TBLLtdKQc^mhxB88wtFJ7|QiIeB#bC%_x`weIM1rPA{_-i*@NRtM zSB2mrhv1g*pa=6nvvnR8O!8oP?#J4i1fqjzYP-AU3;L1RlbC3d3l0$R;f8+-Zec(e z!zzXRFL+$B(sR~MEObUts8m0%%{6j>Xi(3348zIBIdze3!)q6_YnqA5khj27{Qt*| zu4Mnk&M89$xOjmph!USwEQT77l8i!#%Ll`RH$M_LezHLHW4O~!Y^!D*BkkR@FCC}c z+!owW>{z)A&K>EYM%2dna)7;Ua_8;Bnk2XZS7G;*$w-1Kae0>E#D~s3dz-g`gGvmG zIRyIIs?8wbrBV!^ zLG-9?)jFckmaK)sOSCC92(uB#L%B# z-n<9yO5vKaECtG6Z*rsB!<4`aE|&(*J@;2%@iL4Ng&m%iB_UZqSWdThnDxved}g|+ zSp7ufntq1)4{TkZ?yH07MB<^Gb9y-M+i?Lw+C6CE3i`2|`U8h~S_cAG0e;it%efUJ z=!Im|JkceNB6u@@scReFZBavdr89+j-yLph` z3&2epkN7!{0HUL0odkv{?j_+q=4QQ2hqwBca<>A*U10S6z!%B9OiR?@`bxFi@hAK%oul2R&p$q$3dkXkAYB{>KARpyh@4dBo`! 
z^OMGe?^;4*zQR^b7UqF=9&FQVDXiu$_g8uM1+TCG)ClwLg>aIIpFzF2KKe~3IbH2N z{ev8$V{(Cu1X%3`vZZgt`3eke&y3d{1Gf7tYXY>^zA%M<{&a#qR<95QKnr00sAOIb zVF=6h7}6M&^#ku@CWhWc1HOK9DF>o?@7scI6AYh$!dq{Ttij*3E&Gm)iB~Xfd;oT< zA}4-$kfwik)6d3x;)b9jQV}FMvx@c$L|Ar{hfXOtKhGuN9k(;SmPVZKfTt5k}Krzm@>zdny`kj-T_+bFw2%WJJU zpDb8i5quHqC$M>3FP5b0Cb##jj$;vIyZ2{u5P3_+VlwY^Jt|*%PmP;=QPvCWYn{GL z%0w>F5PIzG+JL8 z=v(aO1&fPi81jkXNjtrKo;53G&2EPOhEyIqukKkG39fE!oSV&A^&tw2xZ$gDofOD0*^kHgrI!7huS1c-MZY1A zrBl?M^s4PD`enZ%5|1?Dzq%&tU&^@)S1(r0J49Esa_ z&UPWaZ!fN#5A&c3#q}2)X{-l4 z!kQ=B)9P}J!rR>~#NBzJ=bz>DRacEMq*4BaF;Qg6_NupV&R_cW!OEZJum+1e7ONC^ zL^A96uqp-_8+TATzy!+W@gD|Y>oe)x3P7P9RyrqO+)U;K+CX5iOnTgz)TJj!=;s4U zq1y4}l>5(o3u%RJ`97^tgwI!dmO*O}P*kGh$_~dBOODI<(Bo$SrOj+a1vu?|brMR` zh$LYl`wsb6X@)?(6r_25VKZ!3zu)hyZEp&cyHoE3X zUDt@Z^DxN&V#{B1wnq6wh0o$wX_~JDawHuC`V?@zD=&7wAyKWV*RfNkR4?Ii2uLBJ zWsj?%$EzB_3YaMEbp&f^vt0w(MBf_FRTAwhzVYj1T;~{A`b`$+I#AIZ%wy_6FNVrP z-Fdps3-WWdr_d(|v70>hX9s+jcXeXtka#}%g;A9kbfL04Uhc3Z5E6 z^zo4C>wpjjaMuCMXq5b1Bo}~{>gq#R`D?krWtZv%I)-_n=%jzo>A&W-EaWy-?)>nt z40kFZv?m#g_5B`ZIJFNc)}K~1Z7Z*JbKano3qGhbOqWIZfBvzWgw2Q=w;4c#a{$6S zB*A`Vl>c?P)7PehfvSL*pQ>vyd_cOFQh@aByBq;iK+T{2d*B~v-{rWO8|9m)O;@RJd;p&V+UB^z8L*6dRe_d|8aUQOGG z4zT50{u?9Dldhyh?;@rYLiHkLr; zt$+5nm{abz6QZF|W{^v!f^ty+2C1SJyj592;>8>ImB`=4V{C&sT7uE#hj3J9|t4L zPI3vJ#*pBmk%90LX5**+2@<3NQ4rSUgS*%DhBJ;V&sO@M2wsLFyS_#F(Gy748pi-; zN*t4?!aFSt^3ZAM7Fv2sW3=Pcy`PsK1i>QEe+B949v!ewOE!(~6P-VScQf1TK?i{Q;vGs4#ETiAcN~j4L*ef!4K1*!vffK`=rH0~pkgJB)R?zSWu3UCA#^2xn zMzzd`#dsM5D+#2UU@*xZ4-QYOrO&+?TiM>B*e`xk@KJnu{soA|`t)Pp=^AaE@A@l8>4kS2LVq`E)4?g915_*v!VWMr|} z^vrj6MW%4$w}(`$aqXU~TetV(bP+pyFx|bUrHq1&lMdwVis;M*32B1p9F`EznB=`z z%#Woyt8={G+A1=3wrjh8=2C!|%bTozp^qPxXbc{%9(zk8aCS(_UE*ZCDa=RknpSH? z3x7m8K!JAgZCug&n2sS)?H<~)pEA_~MPY<{9|HqVbt}Bt5H`?oS34UPMe;!-M^h{r zg(XJ1!UTlZypM7x^-9A#TLqmyL&!Ya%m5&s$!|c}m3+UTLOt6vfHuO{Om!Nvty=7p zg!g=-tyjkm1TP@F_oi3O3TmA^%T-&>2=amt-w4zxw6qh7LinZ7oSHU_?`4)duqKzU zpv$T8iWE&yIB*{QU{>npBKo$f-S=44X|SA-{dA@|-(TS2X>{n}D%*wny-y$C#o3GU zKl0alAr40DM=Af5<;5IqX?)RD{~gU2Z|_UIEtteoH%K;#f*1BBCO;z*@9pf#@wsiWt{Bc=Xjg9z3B~KAesnnt1Z*rV?VzOYKFy)m0juHNEH5x;5;_o z)lMM}NmfkNLN&G!5eC*m!#i2M4lV5uJ>lbvh)9u6Eqj?P<+FXLX3KL2>J+FKz18K* zbpAQt@4ndeZ#H?*-1v$>4ChHH!Cv)dYr)#{`p%%45;xSgN9&@f9uOkiOT1?&{a6QI>ZhXHmA9=y${vdVB)lmMm!jA%GJ#Kj{^l<9# zB#g-6XbH}vopyT2?oOB)@5X(*=UKwtjOixWrLP;9k$tsu!Fh3-yWT9wStI0%lAjYr(;enm7lgMN*Z84y{k-HZ+4bG($sw}X1Lo9`%+z3 z9<$}Jz^n9bnn?KQqvgkvLlK5EihADEPhy%;Zfd z{j@F4Mz{8bG~1$H+xE>yPV*gGGgg7Yp~N^$ptvH{i%MoQznF~4ZjCaPKLd)v>1tEe zM-puO)I>7l1$pUFa+-4;JC$Jkeso~l6Z)R@Jp;>hh&IwjyhrAg@`GPpFUyh&NM+BQ zXBWxNx)UBc$V*E(G@op3Ym6w`M(^7Gq7Jwb3&QSw67CQqL>{U7Jc@NIvXbL>k0t)u zNYX&RCMh{A*`Ci%0&x@lm}nv`%tUKNcKT!qie*QCbAdlN<$XILj~hMwn(19n$ErkiXU@n z8w)-r`rMvl1vQjf3dE}!@IrMX*KLf=J$8b2|i&E;f z*sN|>e7Ob2U$M2A9MVhy_vVt1^QWFuQj%SCgPjjdM%#`4K%@XHSd!@rbT`5_*DhAi zFVi`M)&h@4$We{+>}hqhB6SbpidhUyA~dU~-#kJ&QHWvTgQA3ua5IOEmZ^_j3`DLOtj2`on~!(r0kl>W+H77s^0n!b-l{#n;fwW3U;Q`1UD!*%Ni7 zV$zL=Z=QTF>}j<_^;6`DoJkTy-0W$Ok4wE+{V_ghX3H+Od9cxzs)g44A(D{22}Arv zwu5vr>zDQsw%k%BSB0yb8SKA}6}~&$s>g&cHLMKwjr+`$LBDES|7PteNNG{l3d zwo}U{-z&^;WS$Y8SgLgQ7YmbwYeaozcX{2btI=}DyN&KM=OHMx%Eh`6e3}MdXYjg` zlUqHsx>;I>dx-@aZYAG%TsVALaSph%%S+4Z8PCeG7&l-E@Lk=<@TXh6^pn%Dt3t4k z1w&Vm*?dX*zWt=*RBlFGJkg^u%db0ncGMtAO@!h=9!-MgojFwfj9mWra0mtmcr&Iq|` zC=5Oy^9a6RVH-B}{VOnT02}szNa<=A5{KMrj}z3Xlq5A?m5b3yr;JDFx{SR6l0}Jc zJ;y94TyI|m(62g=z3Di2IH?!{3L`R`c#?hC@q>=@e`Pjd#U=lbya)2s3QZfGKoLHv zb{1gV-FQze$Bxo24EQ57#+{}LquDemo|{N zG`oRF(zz|OC6zl@Wnz2it;U}qX5Og}-!b8eaTwuPpbgrcW6*K#F(C1zcLUF1O}_Yu znLO!QIMIw`jBk&~IoGd)TTh`gky=ma_s1%pJhf3-hl_!4a83Q&yMyI1>?+QML3c}` 
zwqB|@v%M>8vl3ze?+Ae8OnZ=-}n#piw6{GaGTzTuTx8g5p46?#7QPaOMWVh%$%OoE`qS zsu9*LO{%KlbrI1}%;**ufXP2LsTeem1ob7_eqf-YoSp#3TlktUyYX0IPq4UXRwrwT zHB73idAWGjU^TMl{$Nj|2Inbf=_C*0?sz+Vt3TA22q9r{f5Q>)hh&!9M~`!p%%(e^ zjugyw=!cRoj7vi89$0e$OtSb-p0zxV@=p&j-bF%aV^z(h7zUsh#O?qHy?`!7MS6x) z#;GSu;Hf(EdcVCtZA`pgvW4AWWa$HS*;q}VkYAv_Ah=8uC)gcQ2ojofNTp3$qP?vB z^**d94Us6PrfQ1eLtymT{pn55SJzooBX5-c8Xx~(n2P>`e8^XDd@~Plcq5h5BO5o3 zenY0ZIIKS9lwIBd6(2pOEbRA(^Ffui!z4InE|}HXNVRi$@N|}fQdxjg&n<*}mEVIJ zhxR#grHwxhNmP+J3P!Qt+Nb*L!Tmh!;!##IcWe9{Vqq`683qvGRxXRhs3>mnVlkzt zUt)#ovpQ99<3 z(k+H#GX$62xuiGlZc@t|edaHNTPXF(eEYpVVpJtYB;A^vpNtMF*tTF1>AQ-s2^zH# zV29^>ds(ckCF0$srwM)X(<|}2i^(f!^UdwZdo$m7o266_>GZUd9AdDOpr2z#Oc1*d~WN_`=U=}W^wP=F$FIC z0c?EBWpuj7>lMiUj3zt1%IDy^%suL#7@oJ(@x`vaZ^E@#*}8VDi|xc*{MDvmeGz&* zsw9{9vBJNTgY!_QF`WUyl{3AwIQQy4ep@R+WZ|7s(;;OCO5|;d{lb7c4o=P8^_Q!) zY08BK)Hb>1+uO?b?$IWizhp{N6eJ_nRs3}2yj)H6BiLs3xygn0Wx`iu^3nyG&c@xN zHotXI+WqPu+XM6WP`W}*vAQ|J!_5M@wSGeY$cZJ%jo@6$YfYtDbob*%j%IHVV?pMX z(F@z6A7!L|mYdL_6ARD^fCn2~s)3D24>=NhL5pZ=#Qv)4THXS^VHJr!`$JhpFCuY2 zd>4IM2ASQ_w!kSG-H@%sdPEE;)kLEIY}|DNuQVZ>Ed*1QPL4AM552-(dvOW1x;@4I z?8XR~&xUyEgzl4)jJ#6+p}Rcen-9)4ZCu_ghHn}U1A$W4USC~Zt@}ce&7jZ>pY!Yt zG{|Xv@^u+t=;x~&$+ZiysQg;3iKzHH8p5~6S222fd$-z>++)8Ud~cssq%5!F z>%Q9=tGKQrE&GBkr|hH7RjsAP^~U+e8qX`8~7;XmWhM1ha1b)8KT!&4TKryOQ$rDGU1BLyKZ4YqjgW&hmxg4ASHx z4F05AJviISp;jk83sIy9$BtTzq)>0hf~gHr$sEeW$?YNSWGb~2T@C3&$>6|VN8NqZ za}?hRN#@IBNftd1cf)$Gs9+Q3+DHP8AvqQ?`S>7;fxr`bNqj$eQTPpM(^v4~(B+i7 zeJ|y};Ly5Bb|K|EKpg?fBBmC(a|#}fQwY)D8P%WZQ-s>ZoxY$C3`NduAY2f0peQ)2 z^Ec#}*S7IYm)Y?_K_%n2Uzwq6*X4##Z}8LJB){5d-3wNF`L6hmJ+9@O^aRUj80ROt zFmyFlpRA!qPrF@CB=##nehj68Pw)PQ9E@_wrWellC>Vd`V5(Z}d9MKWqdVt&7hk5C)$6&}VR zaX_jH>$`=WbIkR|vqz)%CgQlF);EBpXYQa8E87-O@q8}}5xagZMTj2vIG3>E?0=ag z3dYI5ea23tWbk0fkLtgHo60pH&yo;`hT2r9p;ZIXP`Y~>O}vsR$Q$Y|Mj8qG%@<Qc{fR)?pAAgC#{ zokDhUBLHYoh(@QVxex$A^h$sQweB?=PSDgWU}+oqQq=QHzZzX1L}MDu`X`uHkKwy8 zKz;_lB0n2+xM%eo`a|$C+_#$%c^4}Hm_%e2ps-`j#7%Je@Oki{hEL}iHBuV-tQCa6 zF-oHh*{;)0&$KV6E(zcab5&-)A>n@G?~y5)Ai$a&UJE5IVYKx24koA=UgtaUKO$q> zFsTzCqZIg{m@z#3AwNEZK}n9C-34%y%c#q;ZD@5<0M(E@b@lJ)FLr|t9u;&->p-&5 zuJeJp`R@U(P$@lis_4U>NW-<+`t*6#)~f(STdGjY;3>1gBXQGy^5Qcs<+w~f^a9*e%sv@&hj7O&t9`JZUcq(%#4i8 zP47rh9M>)(hFo+&QDMEhM3b=%vYIulQ7f z*%Rw!<)BGTXFjHIXY6+0D&Ns`I)Q$_Ih;yW>b{BE5R3gRESi(=F=;7#4UI8SVvj)W zy~khNi#IgudwW~U9o_D_ku%Gw=GqN44>aTvvQ#F%rsljGs>(ruH7>~O7|)KRkN-X# zw}U_aDWO{2ouzy9Q=Y)?{JeRlITmVGch@Re4ts3%7oYdmG?F@WF2+d#7q@*DXKXn+ z1*yzGgrNK3K@^)LD|>(F_2(b9_x=T1Czo1i8oHJ4)o0D>PpRZ*>5t8mS)M)cFD=;8zi0$HIrT+N)xX}cr z1JdPKuM|%$MGju^@mzSz`@N$!)NRfe#AJQS(Fz^Uz1n5*BrfvyxI!ES}5MnKb zt~FWxz};#gfq7^*j|MzeXBWd{;8BVsy(42&R?P&(8hBBN&Ff;4%Bk4V$vL##WVs|< zuq6LQQoA-O0jZ4^mu|{Y-J4R%IxwE;eRuH5sJCN|)URAV7xKN7aPkuT&Km{8Dd#N4 z*e)va-_M8ViCWL$gdQg2YRR{c>$-+?o_607WoL|u>9;+5$bvqNvD9s8o{({y#+Nxf(MoYu0sHlo28k5Mt zTCLMbd?XmdXN}bS{WCY+8uad7?btDP?O1L6;_*AhtE4SG9(|mGG+u+#8cIsN^dx(D z^HHzAT9K4b6;UuhFJ;&s7Z&at>}O3z=yt+U#Wm*(^Slh*8>`pxahUw@mtu7q_4v}T zyNb>?I1Ou(*%x)?He!;`@q@#i^w>GYvX<^S+{n=v(HKk*xhc1CJ7VO+PE5F#BUQN7 zbK%2MkhJk8y*w>sQQm8d!698zQZG|h%fe3tTI>y7f(9F$_?M(jX@z-~-^lSbtBA$) zI4+(1OfT{w6#g|Kz3ZZ%zsrfz%0_l-;2<}}z7L0VronP7 z!mJLmoH5fBc>-BP>#WC+%ghgm=H0$2+0l}( zGcNLPeDdqK^|~3inx+jic3|IL`8b_ZUbxpGDE6Wrjcl(1TYdG;HgBgLhb+r<{>B0E zSdG(*moXEs3vC$2j^`ycnzs!?NSxSv z=UVzgW@etyL)?pl;7jLAS>IrIGe0KO))0EO2k^8B4n7(>tuuyE(@wHeP;Z3+9Njp_ zVfdyX1v1V^cRgBMcg697tRJb;2%-?5My}vL6Ed=b^~w(y10-R}JRYw!!|fs3AQdXg zkmzig17%RfhfHWYO~Sr&Xi)a43vgDYa0v1;3kn=Wj(wm&a?IRkhR zIxtjYlRgsZ)6Ow}_5Wlh^g*hqe^7xAV>HRfy|1TFK1JvRGLkQD3`n!8}URrz)ZWv&= z4j;Jx3XAmYsm&o(H@2rH`)Urq-y6-u_m2(m(pvK~L-8Nv2tChp%{3iBRaXXRKia2w 
zjrqYqDZS@wCUob@-mHO3QB96C7hEx9tK$L_u>;kz^{meH>XyY4*ZrX@ppdDsRL_qW-ynpC zO?^URUdfNJhjZa<=kfZgff@Lrlz*D3D3==WW>(3&fFi2pGLg%h)l2AIfd}rH50}4) ziyiy9N;$)^OxGY!h3nZfLjat^o9P>_PNqATS#QMOc{=fAX-062i4`(uGnEHw|8BkO z90^4x>n)GdX+L%85sf;)%8`3)!~RVzpt%CTPbE#)0qJh0NXkily*sz=e(!{!bTg&- zX>KgdO^pL)a`v4G<<2z5s`uR^m7Mpo^`f5x&gzkz;otIIGGt4=F_d4^bQ{J?{_bJD^9g3h>zxuwV~LgvxDsE41N zc^{HDsXBjKFJ*k#s2A~QCOZ2N>ICjBpkmUzQuN9|znVc;kN>QN9N!@OPj0A>GzoP5 zKUqTAO0rD<{@1}$1WV@@^KSLA6(Vafiu6LKuBLqW$e3pHxnOG5w3&m>>BVhked1V#|mi=;X2UW zW1b&V$iUh>BLHW-z%kmpm z?frkpAIF8U0PULF6ahjH{0aW=b!6Dx2i>g!`?k3E8 zZiTMJw&T4GUUl7RFIOCOVrBG@ET8fV)s@X-!6ba!a_y#Y*m-${)xFuWx3I~3m%tc7 zT3%Pyq>UD$-4IDN;p10fDF>MFgaGA|SnkfC3s7=>j4kMIaOf4NbZtMS2sWRH>miMS4eiuK_86 zl=V$q%XRkNXW#qXbI%=jjQdAONYAg#`M%G5N;T?eRz#Sk$#hrkgC{~6ll|Iq22FEY zIcvr2F&}AJVHHJMFc&~nwZX^k~daXR)ogD{%fO(Q~}XB4T@iloU0;2|&nObwJs8Z&@gYCVG0Jn6E49OPfmW6J} z=BX2zB6Yt|>J4^AwxWx!;#I*{HYT>B>Fbz0Lo5J@{ru+prs0ZnHrjpHgw9*_XxzOw zJF6bqb$;S@j|6kzyuU;|q4MKZ*%FpoB_!|14oadoY`Pr2BX>-C{PYa$l#Ddg@;!7) zBvv2+9|@yEue0YNmX{qgpC{0CH?0ENGYxp*pS0&pH>%sel#^D|q##m_F+JU={5KlaRk9pqB zcw2l|AgHYVK~8Japt|I#XO{(c1pG)%2J-f--$d#I0lDjqzcv9Ry&6jHBMVOOX4%&M z8j_hpz2*vy@Ls!e*LyNcgAYtNw^2D_Ff32b#{1a|Zk3;~cXMw>_T24qc=QJZY0-6o zuG!F$v^y*p^`(ps%e}PmIa?5y8f;1{L!8s^FQQvu`pUZFeBQ4ETDlbActL+#t%==l zs8_Vk?Q^{68`@1wh%v`JQaZuqE>zY)FIxIiRC=7I%-N40uPd@ElBZ?=IU$=KoWzhf zyJFj>O`A~X?n+1d+B7BV%!hfu<0(a|b=S=TLDn9^7V$Z$Rc{nF!zuApPBdF+x_p&< z=Oh2R)vY6Iq&Ct5xp@7PDQ0OKP1I701aQXvLBwv6CUK7cKO={3`m2uWJD_5%`emm^ z2T+5{=kY#xt3M#F0Pg74M}hb?N^tO8LbP?a!VlzN_&rL(UXcu%zRTzd*uWwuejn`M znr?oaNC{36o`N~LquoBlZZIo;9n}lNUQNKKpu%8l{Tg6pm)YTGadld;%pAb~nUpCB zT;T2X!Pw^coY+u>BF2dOIS#Mz;LipT^+cPd_G!Skf0u7l!dMO~K=P)4(fz$kNz}*y zP{n%;R5>wr@`Bvq3m|zI;whb#eG+-M)t~)W`~pbEd}Q!f7|Oq*eSc9GJ3+KS!-r>Q zkru4iZK+PH2m)g~(iiCofQ5LX@mIY@@+)|N%>Q@Urupj?*RC*r5xd`W?w?=i&t_CX z>2&-SA;(Ft6UXmK3nFv*zKCt42SB8=G6AylZxO=3N-~Hc2l}M|u)5Q~<#7y9{_UP{ zJkI}GNbJ+MKE93kqwjwL41#x~5mqycAV6!>}_0yx{Kg0)uU8?vG+w5dOi!AVb27w5fmjP|v8vMp1UZEs_ z;8B2=`X67YAA;9p@q-%piQTS|cxWXD4b!C+gT7zorvB;fG5`dwcFH9B?e6858ZMj| z+TF`K#Sxr;>ofE{W73V3UsiY^hBVedXac5r78u=G5-IHha9W^=>bXouU>EoEvRT|m z^`NnQZTO@LLgPiSh~ICzl;CoYAikLq_*qZ|gy4WZuG=Pl{QXH!(dq~;w7MF=G;M*U z%2<4Sa9Kfa`4@|wGl^hYcvK9Vxq%e|fd`<9>OBo_B~%=N$$kkwaU1tqz1fCnmoMY|Hv*{$6J;oyKYb(~2`rnl#~D~|Ugj#}Wohrw8* z-~xtP9X$c>_I~=F!&BEAe~uW2Sj>r^qb2W4Vx-*5?3Pb|9MqTeg0_sAIqT!Q_uL>N zD{R8Wz;H=SmG`tHQhzNmIMzLXfhklrVJIA=9`$NPk@FH7v(5^Vi{C>Q=`RQbQnY7f zs~wOkWNS9J+>MYr{pD!(ZEo8esjSu^T5nom(+twgyXdy@6{8-*$#&p!|JrRv2yao?(8)O=J z9rHoG&xhL(<*982&;T?!s?758`!1Eb##c`)ytX@n8?2-{_Xzx&`cI!RCP!gkgR1EY zAss&3)^b{oCc7K0foUF=*u24ff8zolF0BfchwAET@YAazw>PiG)mrLT50&w0S9lcG zsd51sV%K%P?(W0k-k$XI$n#9HpGV094233{D8t0HQ$0<(xLs<=Q_OAGv~NYF+OkOJ zv29uio!SQ2p^tz&YL&i)Qgo)W)tmac@dG}b22M=6LrL!2k;lF_)*3g}B9qB+KMcg9 zgfUd=DdX3+WS_g0mK)>JVxj$6E<~!H>Anq^-?2m5M}UhpUgUC<;H z!||S@{Z!(Mpo8Ltx`)hBJMkCqa}2f-yHlwyVS-VoED!W3B}6-|v@h6>CB7KD0? 
zJBhl9rmzfKJg+vT9eJ4oGC2H9J)Jioj2qn-waaEEkm{&#_$z<*pWFGM3Ry8Sw*EwPlboc&`L9#^34wjV|6<}s|kAk-@gdLnCSYyw&OiW$UEsS zr<}2_OTmWXWU(>I(dM!Vyb@OrFC`2%7#*?_3ixIp zOYjNaaq@Ey`?}iesKc_s^5Z3A<40PGf(1_g7f!xiH+@|~Rux~PcfiHQkY%H!LkDQ(|9OCuNXj#92HDejI`;>%xXMRUuxxnJ1sW6?)xbqnjr@^ZK&lDu237 zd+Uv?H+-*XH&!+*n7=_wbGUkIS&PmZs)5#_8BRWhg*7v*oHuw|4w+_SE3jej^7 z`3&21ii%&jUvDZMS#{>uSy`!fIc4Fa+k}C8GAHqCKT~WUg?CmZ<+gFYalnKD*6{K} zX>ZZJ^^EHgmyt0OVG>UjR}B62Gz;iys}V?8Ep}j0KM1*P9eX!wWv6;PcN3etWVo#0 zr1oP!Pc$mkEG)@s6A@gJ5(W-Gkm?8BU=nJEowC5oZ=b^Lg$vBbkwEAlhcen7+@h`A zN+ejBe1GbI-6h2MiQMyH#!h`$Yjlh^J&-6*u>2H|qw0DPOG|F(0v}d;QKM`*J4L%f zY^0v8KQ1LuzTtYt3OVS>?klkT?6-6pBMtS(iGwV}8h-Q+Q2HSN{O z+P+||_F_7c_XG#YXvM-FDh}~jF9=Kx%tRV{Z>px+^_vMD9k?E4ZgT3{E??7O&M)~M zL!a%>8I9@?i#*4;kFH-lIvVWtDVmM0#0#vl%J08^?ve>twX7_;dc{z!^Y$aR_IcJU zHl(REsv0r<0YPDnHO48fNuGNnV&}cy)c%lG_!PaHDZoG?qUV|K9ICXyk&f7qM++HhZL zrdcm0zsoI*K_9YmrANSlPoZ>r+Pz?3w#qUwJL!7CKKV5VEnHY`t==#U%jJZav0hcK zTbm-BEhEs)B@O=smHRIoG&K)V3%ccF{reH7V6)p$_S|`-rMuWnU!wPr0k7uqR2&Q3 zC;hqFvr}Tn{zR|9UWXiC8nPh$A)02?6Rp%9IrF#NSWm{!b#o$}Xs+(%=+lfV$?yU1 zY(-5)!$-H=^d*YGBH%+1RsSdIdE)sorBV2IRru>#dp6LucAg~aeVkF#TUTYI`s7OO{47FQ*Syzx54K~a=1q{Ysd@iRY=H$(OZmnqMxZ}>n}HrTr-3R3{-CaLL~i2!B(&%@^4jQ$Qv%Oe0gm> zyp=&-?NnCM{pG|QDi<}SNt8f1!H{5nt3=qtp`KNf&a%1{Xh)Iepjez>5^1prqU4gY z&|sIHj=jaXoQ_fh6*tzGe<7Z95t4C3ys_1lB)6!e^=?46;A80ZGEAHT8L_*L4EwN` zqG&f+Mxyp)?Ov-*mGl;Fg^lj9>COr`aEQNb2+iH#1dsI(+-Y%;xmYnW0Rn^Oq6@*8da&rs)o!3O{LoBW$=_Ti>*{(9E# zP=RX-+-nE-STVSyZZ8|F{lkP)KIu>i!3>y$F0ASY|B)ggR1BP`6)&<%s<1q>@toX(uw2bk6;4ou`&CkmL8Ro;}He$l4>28_Do}@_1 z40z?EUmy6dn;Ibx#h42na`>G`jK~7OvA1V+mRor>$7ZmgAdqUSdi>7%p-)M>y<$P(scyvZ_p0)E&Pz+rN!*U;N@B_}~ zX&h|yw!`o}L@1tqVZ}RT@mq{)0r9olkM8%Z8xOriI8$cP(VaN{{j-Ejz^Fpl(mviAJzlnsV?NjCG z4N*&+pbyk47Ft&hoRZU^?Hc(Om9ZY?RcIBh%c9E^v$C9bYh$h!QgN+Tdg7 zel{JkcPG!lAzoF+5&dr$DybF5pUATB@P@0;M2kNd7;KcZQn{O&%35BSw>DO|SXp>i z<9V*VsD_KH+{b$w)^VZL)f`%)XEo$?YsiUf#6hol6b?D7I(YD#176vsG}uad#JCz z)MxZwc>2BIIg2!l40lkPr;SdiDrPd-G0Sd$j(&Vh>5WoQuVvWP$a<5qzX-V;ux;2h zc>_FbNMyuE4`clf^;%VKW!kt(cQj%_qKoA~jSS7{2fFx%3#H}-&yzj&UAMX7O_c@3 zeC=x5FVwX=^?Kx_zc$|SZqG<0^rh#EIy}hko|&4>X+VQ^^HhyG@hmoP<|~?ZRr$)f z8JLF%UxTNWVQ=TVIk(46ayH_N?u2%{9##W0H>dYfP>yL^`9}FRS(Jia!V|QdUfAhN zQBStj$<Cw_>yWtwFb4wl~`!ptfNMKk*4N zkBgAX)t5I9#KO8es)z&VNAFLIb&opFTuI7sh@@QpUXM|X{w{eD7Y_2Q4`$9c3bFdw zViPia1TgMlC97j)o%K_k^o|_US5lle+b9>3Qv5B8ND)pP>^;y%X* zr2%?eDkmsJmN`7zNzC^h?z_sr>KI{IU!WMs3Kgwkd2>fV9v^S+m!?I$WGG4W3jRdR|v0P>>Fq& z8qYXK)V+-u`3np5U^)WJx}=g|JtDrFpa}E~$6z4B~K98;5sQ z`vby}0NrFIwjdrw_(43cf$a3FZD4imiq}Y21YWSMkJk0VF$?eyNVyPhvK$r6)T8@ShYke@n={&1(fCLsRD7A5dFb}WJxyj76dmkfC%b!5<=KiSw z;>{5mhS>ggS+3lr{6G!DWBGwJMPv+&teBbF*Yn}t;^iYBO zedvod>^{mAt2n*lcL}fAvjM_EfsXFU)taJBO(WFWxha84ldlIZLrm<-qR_+qz~?w^OuV!&+P| zpJ-6}SdvGM76|JNEcOQg^#Md>*&X2dT)$uC`O$H&^es)aO8C>I1$GV8;sqk5%l7xB zS(}zU%(L((=IXs^x2;~^3G~gL^oorx3`n{@Z@T5Np|`C>_agNaUTgO4!flj5;hn+? 
zHm5ft7!itP2P{}r6s&fi?Phe~;H|lmHR!Zh>HP8I?cJl{ynvj-@7IjYFTaRtnSZsv z@%YJ&%+{#{>7%|!Zdrf?WVRH}Cvk;0GlyrAInV4xsBp`YO#WUwstRp`I+c}@=>D~Z z$94MA$%8lq>!EVJ9|y${!J2zM~?FUdo83w-6ha3U|{EZ+#j zT^lN8lbs!|q$|K*yE5sfUiL02F0YVyO9TMfq-$>JOO1Z!oWfc~d5xp{ua>#EoRY6!;H^3Z8n{Z0hH9Ffk)V zNDi_^DOyj@57ZG}dWEz=Ic{UAnq1dB!X!TEi;Vl>9M^)@%w{k<1%_fDDnwdl<5+bw zBcp%}5|F?Zy%w;Vqiye~f-V|*%bbuenA3WEsM&^9o$CBBRHpUA`F8590ccmEh1Jms z@{<7@>N%ABM-{;$ApuQS>C&+09Kt-2MIS4yGqnrb3ZKL4O%zsJ-+Rv^=Y5F~6K~5DBN(SY&MOyhpVQ?JP9i@k#ke49&s- z2!ii#|1#1`)5lH|DfQg0E#kp4hWZyO3VDABs5v#?d=%b>$&S4$#cu!f?bZ$Ed=1H` zoYDfFJ4(@l!))~3&@9iT2tO)MqgE}h;d7tIC)gf7dQG*z>Gu}pz{40&>~hyr+g;Oe zDRaHIA>wQ9u&%ht(gHe?Nap#zQB=MqE@$lHtrJ| zn)0PZ;5O{~kG_no&Yn#*AU?TCM&ETsL&vG?hD74Ps)N3m1QHnRk%p>Bx?jj888GrN zsPbnwq>ZCp)pxy-cqOwv+d$*<(@3(c{6Z7g00ZMY+}y~gw0^-KFHW2%-*U^2>-Ki= zDM%?R&X2v;X{BR0M75!?q*ryA{zX;ol+A(@9n<06?Z}2`tX_+$L1WJP!L^+n2;}1u z^9i)^S0_?$>pD*?Zu+*5;>?OTuilYPvH$hcvbW;p*QPzId5!a2TkIpVXKI>8f#Ga# z>bt9)k{{l` zgYVuU+sMpO?A0#tZ`*HMpOL(ASl=3PnBziI)Jfa+>0H#hFejPF>`x`9!@8rm>5$5x z(HgHu>3T<_#r3~NdbBE7d=q&PkP{6 zTmd`|keCpMk~BiZCjCDkE8;DlZVH^kF+D^Zum%{(4+EsYPg_QG*aI-xJ~2|le{@*R z!}CNLtinDW8K}!pntf||2{vvBSipI%hilEHE%R>LGDZi)G3<>KPhSAa85rnky+QMF zY!$hO-vm~X!2c`H9z8eYY zlA_jEenT5uYM}VulA?vpXrKUn*ZNM#`JmO6frelZ)t z_pGB-9)7V`g=y$NxN&~nhz1toHN*p-TiYHDYlr3Ip7{;pR4SHj zW4nu&2Gt6h_=Wdhna-6Up8&xMit)qp1z@UfhtAD}lhy{t-dOtJ(8^~t5Jv4YuRJOdEWdTn31 za%R%b`%&Jb(Y7P^?*bc$9kEw>^!W|$FIMtH8h6jH(WAzXFnXgw zGji~gbtgPIXFS8yTr=!GQ+3s{9{F@VYa02!75Uh-c35mOn`Q!>)d8fxmgfP<^DhmM z>HHS{oH9|zTm^JRMF5sn-*VVPqInS0#n2u5F(E?uGw6}pb7Pv$h^3zl04*>8oQQuf zANn`%Fp#3hS_SHpc-n;FLGyYTi=&Ubm^w35mr00)*`qm# zod|l)ZJXe`s8X%jDElL=%G=J44R|H(cieoC&ZMANYonx@r&kzoV(sZQ<1>0KdSpJf z7_!K1{(8xDoXSwz+2#rI@DMu2e9?0C5hnL;YC3~1Z;`!^tRi22Go?xYABDWgR}xyc z0+*UoF?y%~c&To_GHT$LaGU7 zZ+%~qZ@ep@^K9na)sfDW=|Cy*Gx5uBOQHc?tMj;fgXO`nR}ja@8dc=9C)Q?G7A4sN zrH$v8tG3z{D?eqX)osLVL}mHm7T3bs-Z^yW)F^a6U;u5M8d1lA1sK{2h}yLroYpN_ zsW^3dn0@s66-C0X#Ws92J|=%p`wEdrQqWd+%53e0FyA)sLBHd}&+< z!rz4rWrK6>PW16mh7f{cZVMgC^22fGc7ee$yDL0y@F)We`GH`_F9u`xXJc(=1;NQ_T!H$;l5i5P0=qMRtHAp>%?mU7~g9Vb5ygwj0T2p~J7?%*yw?Pmt!UddHJ*S; z^yjHj{(VQx(t0h%LksJh_0e^8C%1CQIqJ!9!Y0{%$I{ox&6{*LAyHpG!@>Z{uNFDF zKh=_YMto}QWb?U6tAwDC5wXIit7CEB9(7$j2qZhnQ9{O6h)^BuHl6~~^A~QnVol^S z>3anYQW^p)yKN|f-lgL4oM0D709}vP=QN*|o^WNu)Mt?#%$5S8OBpiDg z`p+~mZD5A*Kj4YEycN6Ozx=VCrAsyigogf_JdfYy|0-MTe}w4#+xw5vu44nhWBlXK zKP&aKzX)Xle@)cCC+z>mb;wZso3I|S)@?P$YTI!|pGKEoH*UC|ZecUr$Z#b2#>42M z1;je!*OLrd1ji(L*#3SB^pAo6e{jcmW&Hh63^3eRQLe1DotP}GRQvY#?WZBa;!L!Q zdpVbQ3i(Mp!0e^`8v&wD^aL1J2l(yTkNSSuoE-17JlkVJSY}Tmc-k9q>%)np(lOo> zntboX9`y(x+a#V^44}&{18yF#mUE_oE;K!n1eo;liarMKbJQE7s0o|^gA(0Ol|7(X z5-T;hF?TU{Ra2YNI^UE<1$H7)m5*BUH{$iwcSZ*^?u;b%iJ;0O@C^(D6i7cfa={f~ z3IMeStmP-KV8@nqKPF6<;dsWCSA^^*bkNZ=h~<@npUn(awW|n9sqIrw0s51ddLdl$ zMD8MI4odThzTL=)z-W5!e)K}t)qq4hm~dtV)vnl?*;Y_q_KU(t+SXsW;$Xj!9`0XA z4}zU4C}+a53J|f;mgSZ>1+0|tcd@VZSl?v+@2%`3#%a4vKq6|oy@h*Nt{}H&Gc*RE zo&p?I>;ouA2uN0+!97gHqN_&aNk1Zv7;af@eph{)9JO=1!-q1;Z#jEA38|!+L`(Xd zW@6fYzSNLMs!M(PWQ+B;5iq_5kV^YufZNO#B_t8>k?UV^43zfp4OX3)K&-Y1sIWMZ z+sBu93rmSz<&9RnfXPTA@b9(}PlsaWhzvpzOD$(xi_Nj#>K)V~c@L>An8D@qG=DxlPCU^FCzOoJ|Smknm9)J>@m~ z&mTyPZ!quYtsuY5yR~Wh`VhAARzIX8d4F;HdItu4EtbOk%J?|qQgeS?NlqGmnp{8` zliC<VfDLPjEFn!PY)z}*Zl5h%n>&+$|ZUg&!P zXM#7Z##V24-`SR{pb?2WQ6T)C+TfaR{VwYL_nU9oyA>Aef~d59e4d|W|M9xVIyJ=w zAS2q(8k1{v+#6g|yeFbFUocz_=oU8-zyYFuA&_ch1rV;ht##Z^qs8v6Wf6PDZ|m zYLNOWc{IeggLZd3u=h)AIP@q^>$(DYeduU`tN=VE(~#HG#&MHJ#`rg2|8 z;Fc3=T*CB8uURFodP6nwo&)0i;c&z1!6J*A1LxYedzN93hS%;fV*tFFyKOXTcaZT@ zL9FY&Un6uV^TBIKO4)}C2MD?i&p30EV-Sc6O3<(~2xi#s 
z)x#sfTPOSK{0Yg@SpOVCE^JrkFy%@=x$Hh7My7bryZ4^MPqo!J^ikZAaeg61vofYY z9G`P#O9G+Xht&s@Za5f${+v19>;f}S&Wsb8z&!co;wiTmvsv|-^8izbHH5nLGhTk$rJ18P< z2iB}Nnoe#WZr=Ef^9>odq!c+M>1w`vmv3Et*u%8w>uX`ApW|%GFi#CL1R}{rXO5&| z598ipOJ?b49sF3Up6i(O*{VO3EEY+-I2TxND)U(uUq8IA7OueN>D(&8O=H+8{b+r? zl5>v<18;_pE;VzUf44`&s_b#UChDa&gv@gSL|AUH!DiXzRPtus5<2{H77p7ehnmW# zi)xDgfG95xDZo1?Vl*_QM@5z0vbmMil)TvU&Oe(;t$IGZBKG-A;$m1)O=Mgwo13j8 z6W@(jqTinnr~!$y!M#V+@^*YrMY~kTD_r{W_^TmD4hBi?*`nU8-R~xc_%7K+_Px1Q zDk2kBp;@l~tv6AVPE*6bG6Muxv54DQoEBBpxwxXDQvZfQk6irrXR`3FHKp+S^7HN@ z)>lcoZbnS}RA&F-^>I~YvR-CVXNP{ns8hkn_zkJ_JsS1fDW~%bY(B|1igmCoJ4l)p zhw2x9DUBjItGd#@cgngn#9+msyV|o`8k(iOteW{4$E2&Hxr|T1R?LuLP_H(xm6L|Q zKX^ooz8dqLywFQ5;+ek`s+Vr;d3^^Webh6!_0r;vD`&US)82G`(B>I{`jWxJAyYL51&iJ zHeqBHx49CiB+^%SzZlk6y+~9a=5(chQ@ggth?`P``@=CB&`@%np&CWZxJh!tH*@uk zGtoEfxZaI3a}djLIF|3eUr?!S{R?@PKhk8=%|Jz~qjY?z$2`g2;bpgHeN2%_I}=yF z@xWhpc*h;o&lFg7py@)}Ug))gL~0EPINowJB!73cqym$_k6OFr-_5fEM*@en$l!-t z^+#%v8t_)6!oUNSPvd1HI@g(6EXCyk%tr zw$=(j7(lJ6e(D3}UQg5IV@eHCH+ld}#NkWCjyto@#KYWS`7c=v;BmlN37|sk02MMo z08q%U;F13qR3{wE-vGmWH`DLNx6Xh?M+f9KuaFl<4X}-)2TM2m=uJw+1M?N_Q-s8+ zQBqNtrIfN=;d?9dx6EQ_1LNO@Tu#07wzd%&9!lx7xR<{G<&#=7{iuNX_@OJMEbG4B zTi^UF=o1&UHDI>HPt;7wA`*T{NRb7H{h-)YP4Y1rxUuQ1M^rQ;A{)9qd zba@y%JExB+i`1a7`rCPg@43+@-ue$+T@>Wr?u`1%4*+lDV*0xnRUOh?T6wxEKo)JXtX2G#iW zr4i37vC*^&p##GW@^=nkdF-hhh;SHPIo(2GOQaIqaC+>NU0n0}!)ap;~vMQKP!AO6-?)ad>DOz3Mi5H}Jhll#Yz(~GrQ*@9$^ zW60xg_Z`?LG!`(sT2iqV(BhDQvaj!IHnddBi7j;3-=D9uT#MzyrJ3J|-3z_jI*VcVIxDvDO z+TFPT3LCy(w$A(;&R8Bai2ATVlsN~9h35Kf2;F$QRxurx8=`wzy@_`9=i7{n`}Wvu zbUGBvdA^Zu9Yuoeo^#3~;~CzJQU7Gk7HWQJQMVIO%~h1|JDc~um|IVIze8rA)E|_` zmG6ZWLCy>WB;N4pnL++8fd4xNaYuUl)`a-F%YK7(R|kEdnAx8?#kIZbbyK4Du3FTp zLc*6D-#&O7KQR{wY4t+A=ifIXQhAw2c3k7W?qhJoO(=XoO>%MMtG?cxNf zUbV6Z-@6w%;8@O(4P)j`2M(qL!Z_|Th`_jmToM<;a@lNn-G5)`xGR4X_|Vd5*@4&f zw(=jnPmi>3N=|Kuy+%(bLCX;@qq3^8&aS>^=EYzBNM3>647Q2+X?AGYno6y}$<7~E;T>l$TuR-MldyRT+PGbNfsCK(OK11?_iTt z^y@0(#v#&pFxm{|v;rW)W4&ZM`1EG!799{g(-d~ZoByD4EOwHY9J@bWycAPLA9>K? 
zNAEX)#^^yKpd~VQJ>$~g=RLU@Qa3_RU95jGeWCj49k3+rc|P*PPZ1Y0RlE4icNi5$ z1C!&5=C=fhK?8)M1wv{aUmFJ3>ahrec=kOCMhK(bANY-^JGFY=a^if8k=S^O zct(o2^^VOHV+(e8;Rw#Y^jgo;rs z_LZwnR`QLYdHK^@|96Y``}%)VU`e-{Ks?PStA*oO`5Y5P`#JB!5epLGLJ8PC#-u!RQQf9l zZtDfRqscrIj4>gv9nHR)^g3miR0j}ujm9e@RA8#7Olbb1Wg;};81*Tj<75B=SIJZe zUe5!47b9{wX7}uCcc$jZm}m!1tgvL1L!6xkAzrotE9q`cM(q1wTV_y5BJ>C6a>d1o zdRo%Kq|__Tkq}wOz(%G)>P|#8I=c?aOR=zS9{}AjGiI4wgofC zQQ{7Hn~UAi^yj2?7N^?{F$Ig_*T0_cg`hsJ zKqyb-*0VwQ9HL5+0lfKVbn9Qu2w4U2mscg__K!fufKXkF4{9nJAh?$8gBrYP>R;(l zxL5V4nW;Gse0(GnG18C^7PliV zwISUhL~o%*-ro;G{@3p$>LaG}p*!Aw*MY)K2h8p24^ZD%31Bb`%#lHFWX+v;aUI7U z)1P|i5{#BR_IPvu1eOhfq7(dpK(Kb#v4H(s(&;q3avpy+gs3O<50XxX5n^Fpb+}9z z<=nz`7ElJkz($n5VGY7}k5j4^Nb+@bfbCkJ?K`?(RnlcOkRbu0j-e_S<@s#?EpnIs zpE+Bvi=II`1A813P{^E-G9HTfTd{|MOmrDx=|cS+vQ*N%Ucn{$R#BKNme`|n?slh$e%`(ZZt)b6bZ z9eGz-#lC<^<8MbdKq${5X9QomIvlbHkjY=)%FtE^l#Kc}{$X~3{7!8407iBY_g_n5 z1~=&YqI5aMCRk}0U$U`2QxQ$PZhP7u2Glvnnog=%KNZGr;OUfsP#vdICo%W=p4(%5 zI2Exl{a-6vMloBIj?xO44^+2~VL9Dw^i{vOL{?ZF9LqP1PwD|e`J2!q8GHN=ycV98 ziZMz`9eV^ldZG9$AGrGl$4fPm zD58^QsH&&pfTA=>0I0^B|0mndy0PVw64OscQ{}oE$(IiTz%RUyo{VXhz z*aEY(qat>v(D4uh%9TM$(^s=@xBMV#j}h%scMQc1;w%vzzC&OkVdxh|#~4doiTsS3 z%h|8z){CrKd~OCHzcg{HfFB8P%LTewsHUs;fOC_J>d$+PYUGB^zT`uhe%`kz+8m{X zGuvGT+!=oid{_+hQpw0-G(5A>wWi5ZYcH((oi8`cb48mAB$b++${%5pb56A{=%MrW zdF)e9%5x!hw0A|~slDWwUoy20SAyviHQ++Jc^+iApTu!`@_I5%LtoY?VBS6oJbg0g zjaQ;78^`nEof4ch)_Sfbm8C{BqqTbOyuNn5J7FSU&k{a*FUFEAIcI!iTySJYAHsp( zM~Sj1gjjRPu==~E+PjNE-=9dnXQ*Zi4vc!<^hAxeuft^!K83iQgpGvli#~!_h9b{6 z>)T6Vjk&ZFZ+5OGkk!)mz`U_;kwVqqw-ERml+4A+hR{MsA`|?uRX;#-p8aUs`&}9iplhs8ai{%!XS1=@|#!Nx55cu z{-w8&=$qPZt^B)2`_`D*JGL7iHFJF1l|*|S1*K+diR_IsWf?-S9XX4i=kjLek5Y}* zoY{-LwCckinQBcZvmzL<`NKCW4#R@JUp(QdB-)?2l=Glk;NW@6-ro83tQoQaA7dV5 z=G~sL!=X9!q%mg&y3nt(S!yb8`mXW3pH;$4);*Mv(JoxEZ==*#g5TV#*(Hp`+=k0v zp>*)ZdD|7kPb(3F_}UG2lzjKH%&j@bSP52NnE{bANLm-e9Xfd(7eAS8Mu4Ps?U!A; z;K(V|?c>LJIP_M=Fi)Jc+WPi{QkJffFa6;HNKBi(`@A{I)9QkIT;3g8;Ye|+ym{B2 zT~xHuwQuC4Ksj86W?C!ujBGYbDd0cHV4>MbcY~#axkI7s>-9EW{-c5yPX}K%D)Biu z>NS*HGcPnV^1X9kvXk*y-e|kQt*ENGGQBTRL2w;M_DJyz)xn)k{$KnMHU-k$%(n4w~nTLFz3S>O>eU%FD=CK z@Ka=5$CrJ_ROtIBFRxDS2|&|f^ieafM_JUZ+WcaNu1!Sj9r15-f&&fsx>@?yU)7n>$G+K#Fp^d&3(93 zD_p@{ut22aI~(w%KOm)TFVlMgC~kg< zUTYzD+IRM&-VOere)3JmqR_^66v1C%`KPy+4&!1fvU3MNdpJhbTDkxx$np14e zJ-00FBEN>`(u$Y|452@zQrd50Lz=2g&U3~Ml-90w z{Fy=AYA1{|ajr`QL(6*AqqaM>{~ncuzA z-@U317o1ZUPBQv*HHLxltlj%?g&5mx{%}wLC6^kM>W8@TE+S&$v^2?X%3A=dp240 zfLsOWHu}GN>C>S|Vb$aE<*l$PZ^MU^iKO2YK$+go+n#+^kMUt(tXi+SjA0M=s}=Pe z`a~xh(mYC;{=5w79DVH(Fl3!QZi@ct!Vw!1pfh1ha$J{#iyh5HfC=qH#`3NKC2|NK zwgPOYcUVP2HKw`|!LqyTVSzQ-D_t8h=pNaFP(NPklb$o?*hwV1tl_)}X$t4f@b*}q zIoQhzMJ>+|ofHZd;=0ET=khneUPl_ho2XfZo$qV9HK}@auRm0lN2SM0GCy*Yy>t87 z#W2>D^xe<3l4)k3|2`()|MJ>UNF{{TT|+d+5wdGe&gCuch++2_i_s7L3~c;pO4pmd zsJ!`dpznZ_Mop)nOX)_|fLc)yJ(^Rm084RJAkOJV3=v*`y)b^(v52LXs>Z5b{;i3) zsgZQX3X(VdvnVAV)LDT|KDUAOsoK78FOhjvr}&`FJNmGlPQWsIBSjA%;RwtkhS$98 ziKgi3k5HyRATtmA1`hm~O5T6#uOYW2i%-+l|DmqX|Fntlq3@c`7W-ZKJNjm+?`{FD z;$KQ6kP_E$YS^gF(eZ-TwU6}ls&_yb1LUcOJSpNaq6FUf%8wjRj!|nIP|?^Gkl?Bw z??qXC8`c&c&GFzI8QuYlOsCd=DHcQbKVhCFdypq*VVzWmQN<$Z0&*d*t}h#JQu8XV zx2?d3TP%%tMc;$X2Y%e~HvB(m+cv6)y*!LhfUzS+D*u2~yfK}->(_~BIES9E#I6xr zx(PLb+0QA>p?y~v4J@+%sfjSqJ9S=P!_k!$&}j^M>#eLF1X9SIQ9H#7ELMomfHzhK zz?Z0?A^Qp7#3io}hjn6hJ|~#>-q(w+mozUhW*7KLEWR#$eiHs;UJaW7pY&Zw&Q7zF ze+fwWj!(lGKdYzpsZTgsSxJ+sOI^gMAfV6+({+%pO8s>kXM%}{VL-jyb7xI?wGrQv zxvm)EvU>lGm``$x(r4UYt@MpoK!*&A*xyCwfNnQH@t;hP|{>yLV4qy47fx#yY zK$iaHaC{`3(LkexG!g)$GqY3W2?QppDLiR07*@Su|9S^D9VxAU8wV$H2;l4xz52jC 
[GIT binary patch literal data omitted]
zcr4u_saSenpK^^b%XLksEU$V!e^Po%hBz5o-VfAAt3(!IQ#DhEs;4}@{)qgGq?P^` zJI$IJM0`!*c z+Fxev3tYmgtXEX&Tg+KGSuo*9H@Gklt(DYbHwsw>ivD!Pd-B#gdYw&YdjGj^g&52* z;M?&}{#`vX(#gq2RG)=+(ru~YC?m#_3RTNyU8Wn~E4d4Hv!7hntCXcVjW5sUJTxCv z5ke)sYoV^R3UcqhaJbMfcJtw(`)WpVVmF(Hapbx6i^u$gu$&88X8Uhx_&Q`>1DWu# z+w1k&k430Eyeo$xPG>;Y=(vxP8gY|jkNJ`#dtljW>F_6&JD*FwnhHq>ttT69#B9pr zsCPhQ{$3C>$Gkv*b`Ls%K|%{xfg``*iKDcT8IEHhE{tonDl3n<>YKO?A6@NvsB?>{c4H)_ z4TMf=7k!5z^3+6D)a%_2Y;3LlM1Rap?tnQ;ob8cJN`Gbdi-ACoSHMOt!{bz_C&GM| zweSRxal9Nk>xxfjHy1lQt6=A(PjWOE%GubsH%ovy%JyLW&sw>M%0lG)*h1P|^$?%SZ$Q7*_tRgtsY>!s6E*oF!mp|%V6FKD_ESQ%g5${vS z5CWlHh=8uq3o0oaI4fu3mUNX}NBgL|w0t~wg^f!b{O_cv$2=tSe#lO(6$$1T+y7u> zf;5WMNux@5rZs-LAhW4HU2(UqszsH4R5wadLNJQ>@`sC`s}o)*2?}X(4l*VH^3#V{jTPSU z7W*P=7lmx@th$ShA!V84DMdGIZl#|yI zGd${g30u_%+x({QA;Rd62u-lqVcvWBE`Lb6${Ofj*VSRsg}{?TNKI z?5VR~e}K(aLyf>*z<`ni>3&0tCx=$~7v zfR9`KyvH+4Xye2kZ=l1o4!AOYKW4HL=>1BIKA;lF_+Zyg-^55d%w_S^jkizMo+571BAL9i>^H2 zFz0TTXDSH3#?IyrG{LQzk@bVFF4Ve>L`G6?GsqIg27qVSz5izplmaJWw zUv6V{hx<_QoA+9%lLUwoC)xEH2mut!AP69Tc_EAtaRP{d;vM(b;HIRn_|ZRSVkuvd ze^hP2MFi9%1eNG4qHP-k5Rn_8VhJ%ocn%Qg#2g9tnH6XpNBl~^3NzjRBHvrr5F<fxd9#7g~cr69BFLefgYCB4!%(T@b1EoGH;MM65ns~%%MK3jnJGxy$g_k_dwvcptNfh%dH3o4e9g3+ zg1c|Xh^F?yq3n5!lS0GY#cA{E?Xk{BU(Rpc)8m&fsd4G6-d#cFq1gEij)R0ot(aFFkYafL1{} zXx=Sd^-E$b?)oCqb#-X!$cbe0N0s!GF28yjh!!V8uS`Myp$G&81}wKN$XX_`J`4OSV0KN6ju{mDdE2 zOUgplpQE49dmr4|COQL59NEvdB^-4uFj!lzs7ta1H$&8$s|breugt!KHaIc1!M7gN&yp;75=SDUUA^^w_*TLZ5$L^iIZTDxjmH#4=(=vC*z* zI!q+u;AHkOhJ`;#yU|~s>i)xHP33FHS-Q76n4~Z2ao4djUe_Xqv)B+uOrFx`mC_o< zV%mfAcf7Fl;ys-+aX6*!P^E^O83#)nLEpVX=>+aV{{vs=!wX@JR%zmM1c0;}sN-h0 zO8%PG%g5v2__^P{>h%q+VuH=n{JR9?&z3wY0Mgg-&)(UR(OmrMi{`;+e6rk}o5kDi z9J^J>%qr#$>UO42#ZC^BY*u*QVTL>N$`AhhQgLo(Z-vxal9&2IeMCCo`bj0RyCR?r zC+|*S)UnrnuCxqY7ttgv&&ev3^zk!ahXnb7Ce|aE0Ir@)n8=D(Ye0|0?6-~NVc~-r zj~~8{A>42xq4`mBmG%;0AB!ZB>aXp6mzr<-v^8rq+vbX~6$<1Q#-=2!Bv8Of#`_~E zeE)gqCAY7(IK>i|ei9OrXz1pUU;TL@aP`-)_4itlglEel6)HH`jmCLlRkl<|#pjPX z#D`drqkv9l;f368Q!_N$o@d{rn%U;I05?uDuTY zrTh2yQ0Bj^yjHJIakuBcX}w+biILi!TXLIH{n@Wvo3C8C zC&p${`b}Ib^+~4cR>9SblE1&jzdM$v`4<$jyV&+le8LxRKVU9?B^HsM(e)^7dDUHT^>}qJ3e7HVRv6S_c<<*cD_GyV#d7NM_rHQ-DK=(^=X)VR;WBqpq2Uj z_=CFUmX871B{SIlg2v=LKcVWlY)miP6m3|3>>5YH9GoxxydI?bIyik4)4}${S^!C} zXtk-5c0@lVP(Ur$s!dS4CvJR`{BkOD!NRFps}Z|C-0#Wb5JRR!e%*|BWiEY&nqEJ1 zO_H;8)5AL}^e6%leic!rgjd3cBfKaKMgSk0{)J6Y9eYQCyjSYE1hJkH~B1QMf zmtlzt*P17iJx-af-isCvV!G5QQK=;lc>1IT7%oGSnDIhmZR}w>Xi#So?Of3N&NGke z^IEJwUNMQqZ!&ob71LNCyO&M%ZV@y}oL=OAphs59l@h^X+2|SbA!u#mKJzjt!G^ik zyg6hTr@Uw!yJ)e^HA6Zv1t<&$q7j~xfa>3^=}{Z(-8n#nrM zBzD5?Iy3jC_WR~RVtv@)I#6gtr(W&4_JV*_fkYgC#b|dgb3x${FNs(+Tk26_`Hr$+ z!a{w{SF#MofB_$OF~vNu<`^U~Jt&?n&Rfo70Ug(G_&(&d)5<~8@JQlPmDlSqmsEEZ zl3T8U`x!E0kHpq9Lz?BbF2VXh0G3Ru6pZ?L5elZA`Skq#K~2qfPgw|s91?MIVXEg# zV%h%Zt9&j zo3wkYqqNGdOewSMf}*2CKL;Negjj9F$7$zM${nh!33BE@Wxbg?16AW(gkP1)4_Wg1 zWKU{1@w=wyTq|zHld(qh91;(LV1>IVn*k z`{e1Gv~%5=y*L-eM1DCwuLUVGTkmLJP6KV_`l)Z0#f~p+N$6i=$jK)2DA=y3emrO$ z3JHu>0Zvu0k-pUMSU;h@40TEeP3y_ak5YR*-8Jo+8s*pJDjY`cKM_5u-kfgLS^n4o zX9;^U%{|&8kp(b#0Zy;TuvHm7OD5R(eg`W$(qPngBuAH@JZ#+S zN|*|&_?6$uM2om>Lv@h`Nb;o?ZYcFP;)VB(ISx-gnFQrq_^zP&_8e|3q(bH03O|*s zPOsrh3GDQhcz_qhU6N(Sb zs_#CE@|{qbk|=#Hqt4jyuw>Gwz0bz|>v)T7$?jZ2PIAGcv%tXNx23chwpNj&zCCV5 z(`)f#CqfIMrb+7KcV%T?W_z&t0w>FYgek%YIP%j5Yq= z+xOJhe{FY%!uGDqJNN9S5j}4x6}dx}tU4Vs0V1P<@H@OYzI(M4{6fx%@Ym z)i0Oq2@Q5V|G^#9;c15r>lbrNx98O-)9s3L4Y z(fKhEeJ55IQ@M95`-%2#srZSjY@Xb?mG<2>45-6IDIw^Mpqw6Mz(U!OgZZK0yr$g; z@zGcAC3bU})#~~RllS^vh*l7f!x_~sPBMRYI4f4-j_)#Sk<(0EwXk4vY}%W}!C<~d z>2rTU#uty8FXxb#u%E*YCb|}x#=^jPi&*!r+vUD?;dH8Z8~r}my;6SI>jT*2?W1vY 
zyAV=5*nic4+(IjO*~2U_?Q_t&UzAosayp0lul~Fvq2}vvr^oLag2djvz%RIvPZxje z|BJ-(+MQOxCWCbE_K%7WPR?LJxRRY@8Lm#(!$Q6c4{LnuZ#UuT5(G z{x86};J^NP^`|^dd=`n1T+ybKvs4;Mu2TVN_LHaGa^gcIB4f}(zqZ?6*w4L52U#1k zYy|gQ)I9xU`o|J76UcSm3WHDy+WK}ZQFnWElNbU3%`@aLo`3gZTmN>KoESp0<}TM< z%g^%4(*2xWX zGQM^S16+`Ra|S(YQ_sy-tjUUtV}oy#@TuY0uV`G}3a>43HX(c@+)**ZsvtAJK~-YhtDviM$x@LoEQrSuGc#U%)%aA^;y$%*KYGySZeqS#$O7cBFEPSDETZl|3xzI zi?lxLp@}itNU!en?J18K^d;E%}c^tr2+t+-`S#|TLnBkCM= z`08ZYm90Rz>xffgWqs0Cs)pkkvm7a9lj4J+kQ^TywXmP!U7>E2eIA71FIBkA%8)39&@>FfjZFT0hczkpay+5uB<= z^<3xKM`cS3iSND(V;}i=qgk+gfDKhtU|8sch=bl;Jt2!%;k|KNU0tEyetlAz`7ZZV6Py2`UJNHn`py;ETwJmcb^q%WO{y6#{@w||2oh4~c>JcR zdZG@cFKZAMPYyqL9D4%$7C{GWpX8L}^K3fH@z17v+MahpoM(kPURXdzwBFJFYU{7Y zrO1_h)e-#(ci9}AUVw;BmZfx=TD3H2yd}0nYd5~GBr$fPKXx_1Zp7xwD|jb+iyQ`9 z@8*<}^Kla`JxIQMBj1-cnvfb&RG-+d`*L?qQBg>4n&0ltBMmkev9dC;Kj%0$urGtv zM_s1=91FL$maBHMgN@Fw02x>yO!!wLb0Ak3<%UK*`yY!oIL?VU*rtp<4_{6&=wJ-E z#m*x1sVEWi^zQvl35hfT*64&{uGI#~`O&{f9KcZ=LfGr5RU;Jq{XvC@S3R}9&N45Ad}yW%zjDb;9^?0kSaAOcHn&$sIeyd6MkhE#Hu2) zwRpQJ=UF-ICOG(Gqe%DqQH^6$=BKrcYs@tv&dQ9Yn?0g-`weqN`4l& zTrP{;B58+3LmX&kMZ3VH_=n}#kzT}YbIXdM?{zggvcZqV6+JUEztC-XlwBS5mh~ML zIw27WHOo9ukS^n&2#mXJ9qS$aashfjVis^~S(E{Y1vBWy(<>2n5$&8uK3Nb#G~mr^ z9np@5V13KO%&`wwK(QB8sJR`MIwN5TtJ@Z;YD&I#{X$~oP7?I1%0erLv*s8D>{pcU z$~PNDbi%=>7ZnwrT({AUAAdPIWONxM}la7&o8*i)chV0 zvaFTj9!bv@tSu97=jLA3~Kq@@)0C;aw@Vlj&KW?Rw@ZB9SfiD9d*mgfi1-C-pDN2Xs)6G&0ZFTz@YT@u% z#j@}2hr<4-N1LT2Py)YKmCsiEDM8@mr_u!mh|a|W1z_rfW^8jgWh^L#hC71)|o6~W{V=sHAiq%YKhKe{Ct|FqR!k9be5M%T6%#z27dzA$?oy4 zjmgrPqV!m0#kV7v@9P#7%u5`dEvL3i=@1;I%s86lQWGz`vKdnUbl7C&n~=O z2T}q3JLdfrM;2y>wL`4}XW1ts>_I6bYCnEVk&S7*#64X-;1Wuf?3 z>mbYTA)n+%OBErVnWk?vr+wo+BeEO!wQx$wy-9v>9N{mL=@Nje2L3q1E;#?b2Q~=9 z1|0TsUfc)*5|)>}K<77-{_~EirxDkQ5r0^|{4R5%DkfTe>ZBE-TQ_~8Q*!uCH2-OC z&TSgDk}pmu@i%>OC`AUHT8hP1->W_0~O zRpiAe3h<=CQicbuT3oIMQrRmXC;mke$WeD3Qk`?xIQqxL5fvMoYe5(6mlsw5`mdJi z5iN+HoDCX#1FK6(mO!e>fXsU4v0sqqoh-#Zp0|zJzxsa}%aR{tM<>pVIH#G#1Ky1b z8SoF)A6^!X%L9(i|L&F!O9MH8B6EbW)JS%d&NlHMbxm!P^|xD$oDO+hnd9Wzg|}h@ z3{xZH?Xxc0Y6i>H{nz=M^H5`MS-Sseu1?q)g}5Hh5|>mbDmKxJpHkF3C~Go8M@cLg4U_gS%aS4-6&^Nbj~YpLfeJW{VeWty@mz5cdOSWH6QfMH{*9Xa_d!5RYN zE|cDsNsYW{{$F%DM1KmqzRHQ^JM7VetccWBK6T5I*v`r+^a)65IgYg`$}C{bzX{DI z{Y3`sMQ_U&e^_yyNF(G|Ri6~{G#TM;PtN(8xVI~lqK+|WEH9w0!L3N(ACB&0Tx>)G z(|;yqus`sXz)?V^MN37B_m&D%FcSJY56)Ae1FA`6) z{1?ep4f!VP>jw4J$C~pS;aR*%5%N9zO}%cY!#Ns^IZCt_?N#O$NasT2s)_v-MI_-A zU#lq?lOfHkr4{S%=A{?kO zb0s)Q0no|Bz1y4`DoAtZ706s5lJkue;FSlQU%Cc9m->CTx=CjiIx6hVD6fKL?<38^ zx1vSybhWEa=`TAw&*p5JJn+J`(k*ZlPr6L>iSKD;pg!Rh$mRWfdep{<{=`^yrgN*h z+OLDVw6g4SCN}v4D7|CBA7P0G-w>8^97Z^8yw~T|k*-qwQ-FY7rh3qM#I*qViDDAmbt2Y&=qI6a=MQAuRo5bA$%Fb&3ujon7 zjy2n0MEl4L<4+rs$^9@B^gV)-QQ2HgA9bF(@lR&@HOZ^zMO;U+SHMq{%(wLHvz+MH z^$fqF@=)Jh1r$<$3WyFnjUyxxYHAPp$OK$FiD0$*Q2)~}^3y-3rXhcEO@ov=|7 z76_%pt5*0htY;4EIA7F+!=EenOD#g{>g5pMx>;E^^{`Ras!un6ahpLe-(RYBMMLqP7 z@0e6cg>+G(c?@AQMzcUOMdDH%v+8AYF8udl@gDc7Up_YrO@AczF9CWP`gJ*8`cO!_ z=p5k56GNoLiatbyxfVZllVP#0lkDL69<=uOQhJO+-030K4LrL_jNKO{0wu^#%a7@Z z?9J%jzo+ESz?7W-pLbN2ePW|8SObbBQJ#+Ld?08|fLsk;&GqDa6l}N2Pw+LjB;Sb< zm$}||XRb_wq>t}%YZYSE0KhAdU*Jr2PC9Gpos-$H;cxfp33&;5EyIufpnr;@zyb4R0KNFze0dX$ z1;~W1C6X|?sTNnh#M60eYEfkMP1fQW9`U;QpHPp^W7sHc4vxGH!#xP1?Cf0RnDjW= z*U@cd^kK~P8gYZ3UM=--C>$DM;~N7ivLpU33_Fyg%?^A1tQiRlu{ebWD5f- zWJB+n@zhuMV1G`5yvD{jqXxYw!^$HDp397Gn*l58LO)H26!qwqDq8n8cGCm3?pe^s zS@L6;77Bu=Q^79?x2d4hv|bcAu}fTa_F1AGsuLyP+=eq~L<@OkjWOGVyT#_Wq#aoS zUTk2AqTb*=l<`FG-Zi%EkQeC19mzNSYF?yXa|RE_)33E{%C$=qm=HOD6jTohdd_(|hVJn~uwmpIVUblQC{{B2Q-!!n`J;of}! 
z66E7ug%I`=;VHJI}D%a8D-UV5n2eZMq zqz73p1^r8qS)LWR&yC`KI_cr} zMY5O0;VdwPN34gBq$P#DEMJb92Wh^3h*xy~XKe5%o_$6HQUOu^c>D1%qx_v8Ma{Vt zU((m;bnS9>w8FlxQOz?{80px0U%!-7^qu!3(s;b_^o(7+3jxHZs#oFEAmms05qK5o zkIMh>sNXc-7}<9Q+-3i3e+DE{|K6YV{=GlrTmQ2^6Z6g9Ox*l56I`XE=INK;_hpgq?W-o2a&xwvrAQ8zHBS4Uo{Q+`tAdAHaRC08?aP?-$-`r`L67 zv<7cP%DOIbF$+qlg{lY*_e>dA$ZI$)ZTK6Sy*Am4cnRb=9uTiV8L%=-D~oTunxCNh zgUvh0heUFQzVtSU$uI21ubnVpM-hPruI%sodlngkX(b{|gD=P0D&D@9n}g*@Cp@uv zYPlohyM2b{?XDm6ps)T;mBgCx?Hdi*O>eb`hj>+>1-*loh_s;tNimkwSr=3XBV-D% zYt?)%SC3?9tKI!>6InxXpd9h>?(iL^P{HBpnWI`N)|Z=6OTG&bc?F?%M#0{SBH@##*U!B+-937k271}F-wrc;zRtvx&Xyt` znR?PD2?k{%;?Z(7b-DohBTlufEOT!>JbKl6 z31Zda11v*$4b%eSb&G4UM(K8-tyi3G6jL}t-AY0MBv(LAn{!b$&muiOy zuO@BL8=sOH78;&XNdSq;34zOfP>&|qE$n9;1Ta@lPy2vnFG5F@^alv@V=yHPEq#Ti zJb{EcnQ*nRdqCO?TKC{{KTageoGTdWVcl|uYyN&={te3tc>S;JHou0h00jx8b`9I> z6x@^thg*9y1-sscW4>cbEZEv;vP?yqA)O4T@w#8ZJdX+6`J->Q2@iY@y&;elQe-yiI!&p<4Z#(F!6zTHerR^n$ zL}QMXKo!UCfs%Gt2c-rH@2dsy5ucTwg$TfTBO+;LxU=qUKSSSO8y76udTICZ`yLyPd=k>33Y$AS`|AbWc?v1RD zylQOYoow+4v=g6x-Vn#Mx0YhurlTnS5}nKZTRi@3SdlPm;VhkRa!cUhg6lbpt?jCo z02l065>_DT-!^5%>G^lOVHbX&jIi+@zIwBlSaTV z>5u`frT&XLp*}1hm?Wss;C2YfP7Vk9%~avgL19Tf^0w!Zdh)0I_{+m@105;kn0n+- zE`HPu3HAiN>lfK-;bHsJwcq|9!i2v_lH@?Y!Rz3}Hd$aiM&KRMj9t{ePpc+R94G3n zA8qW&^W^yZ#|1y_S-Z*k>FU~L_0Tie%&(iBlxo>~hD{fgr;@*|6_N`|`!VpXm+O;- zh>Nmd#Oy`b_A*+s*Q$L!8Wtz4clsBb{5B1 zG@rMNSOjnIo|iPzkM{?-{=0}?;`^|Y^Tfsmh}*W0Gbb^#AC zeS^yLswWZE}R%H(_UT=9Xh zT)=vJ>2=#3EFUUozC(q-siA4|_Rh-s+}M{DBXK6f;Swzp-=oA3=ZcV^Z`QJrZ=~;s zhHvt+NLakns-Ve)APYEYK{ZO$U-9BKAL+1B$VF>@B|^q%LR{zA`kmMorxs@-MH66w zZ_?Aoh0P4UZ8y4~(Lb5oJ124;V6u}eGO^DaQVdim=LxgNBGAvhn6ZI|Kz(f-Z)<&B zpveGR)i-UO(#w2I5|UM1KyzwetYX}-@_g;^Hvu-+TeDcd7;6{j;{4Kc0R91hKqhw{*$J@C_=FdST2s@-R){poDd`Fef%pWC+wf+VF*o zZ=I2I_x?ObA!2~@bQEy_iVIUN4uW?oz}`svFFAg*9C8&qTpxbJ%7J7{3c7(KNf%6*yv59E*v?$ zONy{nd8$(8jv2tHWApz};36QlbYTc6Z~^#X08TA0d)y5#H-}{2$^?;QQ2a4@jow6qTl z$9uE?G#(BES{Kmztsx!$1Vw=LFhfLH!~`eh=~DdCHGl9WtlP#S!stiFoJH1lrk%>; zvh}CQ)-ey{6b(RMfaC&i#4iXPaA2kTxtBw#1=z{$(ajZ@rRoS*@KnC0`@v(FE3g^h zo2V?=@w~M6%AbsR8`Q^it8ayyrw*jW8z=NOhhoejQBAPFoxzct>}C*GVji3n@&Yv5 z500{tAaSC1v!YWHf?`qswfup%`A`ph(3?^eay7fE4 z1P^SOqCN;C`1G+ud4M!#i5HU+Z+c0YH%h~a3lhu*#n4#d@>L0iL=tSwf~OxaxLnZX zmB>bOEf=K!x(L+^ni6hl^inqr%O?_9xx>hU_!<7f#lU-T)1cKXWlG2QZK1)UdOohu z0mEf$-Y>g^GB}(Q8MkL|P5cs)4^oo-MZZe?@mdM9x*K=lYFEY3x;64~fXIm-xntTj zG1jxW#n=m)hb()&!pjG{qHt!>T|~Z|d(Vtqal4xA-)Z&4&ac^MJ{Gokbz?M;rU!6r z;G~g;zyFKm)(M1c+wEwi>^G^8C|Hmr&_I$S=t$CZe86TET%kG8*#i>htcfZ?G; zNreGvknZkArF%$`k`C!cN>YYysR4muKuWr#8$mjxOOVo`N6L3`-_PFre)itqxBqy5 z?;p&pHO#EMu4`S#d7j5{h!y(U$T7O3mzVRC(cGs4wDLS2M5j?e3x4?5m$G&B<&9|` z)L$u@-j20%H^I)aJCL(AL{d)|_$B-JQfho_-?oYHE-2wWzznNPSdhUBnleL^G(!xO z*u%so4|_t?3>w+2#g7L0mcY&uiT5w6QiBBU+F=0SM#a-jfGKvydM)JT+PA9N^z>XJ z!Hg}1=l+<4cS}4JY;G{n5_N!arSMH-;-&rU<2?e*L^FktiX|rj4Fl6Q?I{mG86pm; zep~qo-n5Z?`QjOr9-w1~eA1a|(yK|*pe#aK9+t_POMNtu>6%?2KcoWvuiSsBd&C3%C2d$pMwuO`5>KH~z2;1V?f1mkG!jAjv~KRUi$>VI(Z;!Il77@6o| zZ=R1tOMl{v`4(2>RrO7(pzY_A-l;NHard?a3C==RHV=-6ckB^^>+FsMnSWU+j9AtWFCckFW^GYQKKdq5_3J*X@Kl}}%BLnIgSSAX`KUR?@ zp3~_l$$pD`LPj=H*esQ`Fmy)Id~w4hKau>Y$b2W2daun?S=m z2ZnFJ7r{3lg22Yt_9&gQYne~&>f%3-^3%!TkcnuzG%K|3lSoi(?50>(KVkOUk`G2i z=|hE;pI;(aD@(3DZ{A6|t@A_y{6;g=qvEoA35ZhhqLU@aPYSZ;T*#O?~GN=*a}YWLZXS{sGDT=SOo>o4acOlOegx za;RsxP4u{g7GZ{*27D232p7z5fQZ<>0UX7%K^WRjZ$G)Wf<0SN->T||$FKZ)y;}+7 zRIx<$(64IXe`K}VQ8!64Kyx7p8oPZw=UMgG?32 zYap?$&4KF7?{ zVpHZ_Kw#y*kqx2%NVV-{MT|~El-OkGwsJU$WU z&q`o#zPwnk`G=Jdha+>XXRH2Svl8>aXC*`OxOoF4!ERM~={voDkGc)3t7n7xW-l3$^?GDofXWZxTZ$E zWzO?wn)s}|Ue;&rwO*J-icgi4%scE#{<4q_JJIo+V=eFS1o@@kNWL=nNP5fHpqI6b 
zss@m05Fx;A@xtV>2<#L4|34k z=9Uxid#5pGT~8jlToXrae9dsx4EL%KFsUEvN`rk3?jsWqf*8z>Jr3{hV*iz{dU;#v z3yp9fa)y=MXk?$iLlI(CEz)k#gj$}OC@inuq71k$4SHniTn>Nd2fteJm09zgTwXOT zl;^I|{t&;ciGS|pN?pVt>JH|?@}W^2M&5R=)&C*BL`{F3&#uVz1j@xvcWfem?56R7DFZ_`Rc(wJft9lNRuNjy*0R zUdA@z?ZU5tym#I(s(NM1Njk4=-8SvxE|;97rlW1;(CYY|94CE-OQ|W(D=rp|`pLR^ z{gugK#z?Zc2&GM}lEQfqsFG!Utj7iGnhm)GR01+*(qz6OU+Hq6RDD9muX32~Bf6)F z0jQmGv@NaFZ*Bih=^az2Z;;^i)wHG+*v$LC_%%0>o8ll+$3GzQ+lLPA7|7+(wuVA4 zuDWWW3G49RL}h6<^4{d2wL{6Xv#K;1tGsW-8@63C(Y3yt{XQ*@MuDMko-Y4uK_M!L@AQTY_h@v zfTBz?^_&0{6(9dEC<^y^GwD2buiZOBStqNMdg8})XsWT9&hJk*IYF#4-L$bm)JLf` zWT$62FU{inziwW*Crcb)Dq#bz6{iL9HLRXb(n9FM66Jn!9F+mgalqlrdMDm1=2z53 zi!Acya60A1faGx>=OdNAaJ2$bT4wLi2ZJD_FKt;Ko}^5bujcw#%e(EXFLE>Mwe6iaFLUrZ5s89F&)ADBKY1ALMbmg4`5T=Vz; zl{8qQUD}%C+hki#g2dTWzha2(&S@LDTx&U|IG3R0ZtXMS^N&o^AF^~dLH=z z{U+WM@QjFpQ~kgy$WAZh(sQMjr{97(sNrNVhFF8Iad_E0@&OkM^jI9RalC>&!2N*g z$31P$&O-EfPYhNBK&w?02etsva~%B02n#X+vjwWyzYiu}0prKru_-eAeISKNA(zk8 zhu~Cah3N|+3v4K`IdBaBxp8o;MZMin)(-`MwZt~m+87`^-2>C4 zYRf+KK|~f>Q?hgN^=qW@H}@HQ)2RfNFALMldZ=orK(%JFWu>9cds-e!SOx!J`rIuS z*He~07_}#Bje7vd;_5W&4U+A#wU#WuV6Bl&P|>qgZpk*PnxdEbuP1vedHsrv;)<6%pmN+2TBKYoMs89c z?-wyuL$k$RL#+4|ncGC z^cNQjcA}*tdh(|~#a_+A5Z}d$=3QTkv78lGYLZpOrjagBoDnBm8m}G6 zixqZlO}{B!k$6%OrGwAC_h6LH4&#w(&WRd=M+NR&pp;}b2Yv1qNlv$!pxIx@0Xsbd zQNbKVJZ~lz20oy02Vjgy=0?X#-=N(JXQ$ytg2CAQP`L#;G2ntO^L)!Yubq(Q->tUu zzmB%9T#O*hk*BPEEcrO;{kpEf$v1|KgfF{;42vY=z8<| zll{EEYm(})SNT;^17FwZXRsl^wTYVS>$M$J!qlbio>rcqCzZht%faZC^Daz87)Fc= zk#G2PK*Scm!}(%{!I&P4+W*$;k~KrPCe4>jB{^;-JE6i~s{}SMGS@RqnPLTfFvi1G4ivr-E7NkF8-;8 z_!o-7RTNS>Z9JW`FFV;zB66H?c~|?ulII=UOpY}1$(N0}1FSXL$8uc1-fm}WPHS=y zf7a^i6dg2--hw%v&~5}DWCDe0@R71$8=epP66*QsE*WkZqOk-sH&00PoqmdlSnfU# zF;*~d{=Fe(0C(5x1{ipIS|dg648OTrr|R*=E%DFl(g&_8W224+N%90=$1P&RI4<>3 zVOOjHq^R(k(ZE7Q{H?s>#efmH%?6}64(iPY;WaVj)%Fq!Ti-D!RDbDtiNFwsXL2Bx zt0t~cnPP<^AA*ZJ==sNc`Kc!a^tk}9%dqYm(rz0 z#({Z|M&3`77X#l7hq)M91T-yapsAcr7nBUUP@5s2SHwODp;5MqxI*lvCBWv$B#U0M zuTRF$dMS*~t3cGZ;=S~9W@_>UVg<>| zCLvLK#1!gThJquAPV$XMX@LRRu&Tgf$Vwub`-p<0Y#a#2B zOlrjY4CM+OUogS&@b`ij5n;?hv_s}!e?~AH^`yse`%eML*O}N2=G&RyJ;+n&u zVSzN$#ip#&u0YpiRJO9{ew2i?i}ARIuM|g(s3Q$G13AP+%UuH_C&BbIoIMeHXC*N{ z+ZX;TWvDV@q(PA*`AuRKzq8sV*~Msp+?CPo7YfZj9>Qf`L=PWRLx){kLw^csqD1YX zg3W7NY@rKsg|8lSVW-95Rl0XW#}8#U^G{2AwbV^n^dA$mN)-N`ySdBN}xDhM#qb%!4ZZS zf+6NRb5(70LB8bF38kl1T*EsMBO)gf%$O5%A4H}BPJh;9R%1C&A!>3qmhHYoRt)%HZql1S}UdK08(LT;(@zt3> z{q)8ZkbYj(rI$C8!JNBOAq(D?Bxf{i&KQtn7?}XL%B|ptzpG7GDZpVeFZG> z>;^da2{wc1Ml%nEuRu?`^tkh9p$$jYs@IiispTb=;F{Ex4{y;zEW@Dt(8#MLC^-iA zs)~w9$gNI?3*meQYa ziE5l$>1YOmJOGuiqqhTh#rIjiw`$3529#9kTjn$KNU9GyT^Mb_oOulc2Z|dq->MA7 zSKB?W+qQRYRR^PKOk1qBJ;V@@LsMWYt}rN-=e7otNj;Ki~ta*`-_ne zr?zM7+(-@a0xwTHm!}#M0}LpIxAI^u0GvRYDgn48ho-3VREfO!@t6agGTPYtr=3HB z7hAx2VOnZ9*RGQPL?4r}<*Np2w#_6p1ja&ifufF4u9AWS{TUkILC3wI#izd&VnnFI zS{@P)fDz#-v^t%J$wyFGQBQb@Dw@KPZb5f%(MDW%!Lpn~KsmNVKs}lm+;*_onZ)fG zX(3{Z0^R0gr=VBqfJ%b3vVI}}ZPeuC!LzTmFiI09QvwBZrW;5p<-vt15OkgN3_Cao zN6mm>T3Z7{(4C(W4D95>5-KuN9GR#*t?RihRyf-`|5fw~q z`?3JIyF2ZF`@=BG)?^UOw!1s%iG~nLc?l=V{~s&={=>gE1c|RF5f!OWiy|o%BB8ky znO>%Nk*ux#t1yrwBAT`2Vxm3Xij=XSa8=>bc#uvy)Pt^?Nl}GX5G^aCvbQz* zZin{nd=A2Q*KT5Pz^x9lk(|Lmrjwp6FRcLVMSIYIS6dY}?FJeD9bQaRcgx0$FZdXM zhjxW_ADfJ=|0?~8iRjnX1ZHone}9l-m(MK>h43lM0(kG=L=>>7?N7T@Te2*FsLRVY zX*cO?;}T8W9k8@{HPhWGOt}8fckuW(IJV%=CwBiZ{`>!@lmT%4q_d%*xRxmgNM_MD z?N>G8?CPnn3pxRxz?!&J!}~uW#RKcm-1Cf)6XKE38|u{_#0_pJEIKkuaWEP|T7P4%*22fwf6^+HT3{ZehxQL~s(Zi39*xl@9))t6X}4A7?U zTx!_(vFOAc)WeuCNZ5V^MHZG4K06i4(3VR0u6Tka!N&`19y7mA$&CMYMt68_iOk&|OaiPB=wz9sL^$0rmF_O}p{I;=7ga?i+ll 
z)LUc$%NY)5k*nXbdx@Yv#)pdomlj*a;9*_2vffNKLN)PIR8R8H>R!EnQ*@<_@tP)`&{fnqjO3S@0ngvB}bzoCSbu_~*u zd>`ngy|;?eGy)V|vF#Qk7i48vC2jYO=~;c_rv^uZGOm@8Bzdudt)b(Hg>gczPtA-; zImv{t%yQeAk!L$mXJTA%g;-We8i65Tr`H+p_%kQVf0O1RUtHR#XDWOqN|~V=qYMhF zOM3iFn~5*}uzF6Nme}f@s?C?23dJPIk9N1Uo`etW+L@PZ!Ock`1$6QN+cmtnov=dlaT*@e9F+ z#i<-V;B0iMu-UZg@8M&?MnOHwp$M|U3wYr~^lE-2U1@I8aFILpGuqiE!iU$~NbDHP zWG)vqtRk|TdENa1M7g7j+!!v^c&4=a?w<4VGjlfv+{GTcIaJ z%{!kknyb=3+XFekr6ZyT%*fehjOXF60!pdQ&hB+sfs=YbotKjz!(S`(J^^Ye_R5(u z`f^W8O(M8BbqDk*xXlZ0xGf_!u7$k zDROy|BRfhncK>e05dcnbSK4X6e;})A%E$mZgJOrz_p{GBM>NI3JnN&9Gg5R z5`1wdgdI2gz3Siz60%4cUbA_H>_P1*Kd9DTI_GBy8L#nR# ze+r~~VEq2nfXs;gm5Pf`zX1tuB^R2rH^Y=>-{K7;eLgU%n(*F;?jq)bG$LJzpMiu&w^afW+;QsdnkR3E;`J z7S7EHuBuONe}7jJ9;4$0-R#ZU!ptF+jgQK)a`&IxuVH-a)`Yk!&iDfIV!X_9?Q|^d z(Gf~*CkBQqtlOU82a>`>fktAL2raP}N|+}woVlq5?4FMMD4YB1lY)J5mSU7MhG%2O z2G)%U?~?=f^ep3*WsFYk*PjrNmByAXg?Cqe&?5mI6w(ey#10I|#2F6mC$kG%EWa^O zLC{1=l(9KK(b<=MO5mYcP@VQu+6s8=y&#YSW}{ZUU++<5e-iCJL%6T~Jk*kGf9|2A zA^eOXNJX8Flw^#5xwlx&vJ!)2GGO@8-Dxi9Z{d!Dqv0iiKYcS45 z)dEW&EixlOYqq}B+8*~^w^hn;iK$1P3X*Fvv{}?uR_5p2$Oh9dT8|KfgE#a0t<6un zTGN?J2iO|ZBl-f7kv{^=i9CKd+AK_Dt$@SHne!+~{9%5bNB-;2XBWS2Zy&SQhX)znQ54@^;RNWaQ} zI7gE~d26e8=oLwkn3vE?*zi|eGX0fr-U=ajJ{!eQtO)eyAD+g7M9D~4cYm2t#`b?L zuNuD$;3fiZw8lh`59mB~bz8}~CqBtzw2@j-AfD_30U%rEU3&oLLfYmG3?2`ApeMhg z^9Zhmj#RrvZi$XKX##T(G|3l36wWvjU0q!r#|K$H^xF#`V%Skpz@Hf2J_Tg%Xuo0i z=GErK^YBzSv9u!Ft|J}aM8sV;k-i{jh+aaP;pX|HoRu1be!04KxJAjqnyY@YY#_RW zu>(wyGXs$Umal-QX(uo7^W;OXualTiMa(UOW7{j^+o4s_U)WE;UQB z%Jr7IT%QXf%rcQujuSl_mCY{*So$}8hn1Qb+SJ4$&6qokArek@EgH{puLV zq8S%nf8B^ijyL`@4jCMwOefgz7t6X1ZuA!)cbme z%k4rS&m4TEk1lmd1X=KcQjK5N%WY&g2ih?{GH}Pv>Ocfnk>oW|v|6k)x%A`FXcD+v#{3y$o(SO>hPc#Q_#@h|%_=1vAiNM~0 zdLwtcLe@P6KqCs}0-9eP#zTNFERtZI&oGmdRp8=LhDf~^)Q(vBSYXJekSr5;P0F5cl-ryLCJsRx6)$*=4oIH-H;?w&Ts7x9 zG?VGAtc0x0d35=wnqn2RJv2p7h=hI;?p=Oh4}OtnGQtn!yO7ohqA#- z&}Jwne}TI_AoIYyTuT6g4vDW@W0M6}qRDgPlL>zwc`lR~dQnKfQyZsb@uSDLb@(Vvhf7IUe|(_Z`~$1WfO-LV-0- zVAE*k089|_)b+$+y=J{;m&yzn1U3ZI|7&FB)_)9iCGEj~y*^8ao2EUa)9FD|x8pDl z1TXI^oFPUchQ*=IXNCR~o}?JlQ+AweA~@U;uk4C|`5pVBSan`L4+@nPOkC1|a?^B?Y#S)c-{N!HNFILld{9E=s^EZ7<+Kp)o~LfbUyEI2ytoR3Z6OpyT~@q8c3FhxOP1v`=+ zKz2xl6*neDQIG&%!ZjbW3%)SD)~Wqj3;nf=k*f}HZ|;#7b=AmC0u31xTv)+`9*!9X z2Zun7w*eIS-;%=rU#!1EOuW@2HzfQsdov9{=J>oM=&8l3&KC;*Y1<0Pq^wu_FAIf% zhA>JdSV>9*HMU585A+w=zy9vT1xfgm-s;d7h(p!8Lx90ct8NydW0cx(kJ%1vKA8Zw zihRGvT1a27_UR7k>de9UVQsAF-}I~ zbOMFge}>t;#zb&2$z(xS+ z#$%xN8cWs%HFU>7H2w=)FE;I>=5`-nlK?g{^1%m>{{q%PPRDKEbXn#tjzsF}FA68I zTVUuA#z~ewqgD#X+YVtit=qFm)FffX#~gIsec9r1Xw-dZ)L>=|MeQ4wl)nz!m&xRZd#nZJ=s97WSi(8^=^(eKvdYZg zT@_Wb$YJD|v%l)L3&`2KpCt=GBZcTFG-|`^c)pz|nntbWVFWP^l)Y?-Db>8%GE0OV ze03W)m~tDXigNb%sTv9<+^<0e-Eu6l3G?&A2#u+Llj}(!|K?l!Jk)nv5hmK*g5FvK zwn`+_a6FSH|GKPqP&D>cG>l^_XqMyI_?Ee0k~V1(K@I+(J0V=~*p=guqWX9UO{^$C zlKP|g=n#D}Z5N#VgAdxMg0iU{5%DoBE%)W3ToJ#nJvg77rQR3O=MtGt!(NkoIl07j z@Mw#7h{f(TkCu4W1z%l2=ZT)R){ot5MjEO8jK^V-^eHh$xrS=SyXq>O*EutFjyE)W zFJIQNug9;sbx}7lH(%ah0QxJ%V`gU_dtKLe-Q z-b*xmzc&5GZw;~o)uTHTE!rhPCXj2G0Gv3P<;vC0j)wu7$dlfOzEM4mW7d**u{aSD zp}giDybO)T26()PS7Ou&s7WfGVFH83%HZyLgh9qrCK9eOAmP4$zKh(6tEd2^oiQuo z5%t7lKDPX6YPC*EA>>lEW(A~;-i?XyneqZ(X4+P_0gcaIRa;pL$EvFDzC#t`Qcy|I z#K3n|bwT%TzriF>uLQ-NtGpK?-$(eGGr2a~$Oy*$G7+r)dM=rxZ`i1K%3KI|349-r ziGquU+b|`Jd^Q+ey)R>*7iC6r`a?Y)j+J)xwuO8<+M4s}4oa<$v1mg(wv!@N6;Q@t^tQBCZmZ;Woefk2P z_|#2kbfm*{)?;=S{66kY?NC|t47U>QZXyo_``D!^pR0*iCdtJ6%1P>Yh*J2lf5O)U z8?DVyUUpJ!!Rsm;_)>d84Ef-@d&TRcFP7dHk&&pt>z){fy@sW>zHBNzW)mSadKq!` z@IX4%i6K@Oo-wWCgRxRL-~L_xf?;7f3QRJf4GHB_LP_CE2&%#7%?JEdSj(EzS|Wg| zrMsD}o1u6-NA^+0&be*4kh&5TS%LM~6%Z%`lkA(i9KqI%t{IQ;J4G4HvG-)l9b2F8 
zcehR}o)B-8?Mw@mwPuKM@0<#aY}9!Xjrs5~lr^4|-ykO2Eg#;ToZ_gav*AB=S3-Cd z$r?yBja869td%Le>FEz_OuFP4_Bbx)ZXaMHKQW45^g7mm89c9fs`P^qYAq_!cEf=b zAFQ+cij4>NqaJg(t&&;Gl747LcsqJaGZRotxp%HVNwpQYyxB3^%%{0bA~iNaghuWb zre^It5NiP+8<8s?Rn=;oJlOu+F-Seg>=}4^KYELumwO0@}^7|N62qhR}O0L^1`t_+ftjOR; zdgd#~;sr%VYqEGDG7#~N<95Qg6NQMwpI!)Kb#Mhm_3f^_XjSPH9Vl4kH|$AYOPUz* z1kF$NDD5cG^(GwWm#ESKi6}TXhS#B)^lVf^WZXandn|=3Sz+vW5#b;ZZvL+)9`NLQ z1Re=V=@&PE26&)o*9qf;9&{jWW4MSmBvKcNg@2{&7)WxkiLpz}H3|nuRR~rpOf0qu z<(CIj$kYcP)Nb;8tLj&W?`gI5CwSjFwGHXXqbmkh9}?_Dm=?FIvo(pyut)CCVNZ56 z1TWyO5r*AL`x0v*8>6}C-vA}jP6!4^MGl3*q9&|QxVF$+wP5?aZ?*j$wPSI=JgMD> zwc)>~xN^tiwQ+kB>NVf*wbssGR6UJ+(nikBGLn#i_v_f-XH>6DaT^l*!0U8(RDWUF z)$#M_2Ua+aawpY9-K7>CHZ|*#Iw0C{7hl$`_7Wmdegw*yG#}D5LKB?qk+K6GpYqXB zMzy%FIxjwcR1nC@ltxOQxiKsH*EEusJ6Sh6jnx~Ucr?`1zrfhcnU;I;UMC(Qxl{5n zd}|w#A|`rP!W+m|`w4)6iNw#9S%hBs&lrAvP=@nm32K)fjc@vC!%I3`4=-$dyN(lO z9=LCSOdgvDY~Nk3(_(hc{ze?^RXYZ^4=Bg_IHu-e#%_q}Z%7Oh3|!N@G^yEAkqR85 zDUa+5>0_Y;lLgj_fj$ zUvi9j&?W0WJ06PA0xLC{i*+)jm<0x~prR1h1}Lu&0diCH$MmyYy>SGGw`w0%RA~KusX+-K#|%+ne$G6U zn!1KuUw#L9o`}W^E{DBMAL8PRt{njZRRh{IxX;j~_BxPYm zk|y%uHPO{X>`5apX@*59YK!;e?P9;b3%AO@u>G-dhA@Bvx@pxST|AK#8NFnJ6CX!y z86PpV4qE8jZy6qski!dmz+&2@s9m#9ZuRmJcz`Vtp4G?mHZw$YzYXYPM$ujTt zE1J5~mN1a*j^U)^6z#TDo9>aQOz3F2==}P~+7}>krF#2K`x^5o0ud^>iXZH3`o%6I zMU`u)F!g^mDQa)TG;HC(@^+Aw%k|1fW|bwx4y)~qi$bL?!-WNg-fW1SEb z+$DF5>sz@^Ez^rMMM1VL8%LV;vc`$W$qSgGn(by3J)l=%sxzLWE}CKb6P4>7)h${_ znwT-15Vl(;FD9jndrgk(4iN8BO3JA+@&L{bd6F$4^cloqcuk6O`55rDqb5+v&`+7V zMkd`l*`8DgvL!r`J|rqhfF??kCMu?^AW`$3G40zv*kH>AI2Yr8o(qrY_B=r4!let279>V!qQe{_{)JTms&RdGpYKUi_sHn44;FiVnV z0fM<1sQ8z~X7_ygCl|iClhorzGkc#hSv`9L0> zXOadxtG&S}*0G)rLKDs6YpDrmNo~&OB`s;U19k&5A{@hOZwhRy2ZW{~yQa$zHAMKz zjCfI*mwa-&W-*PyNJv7A3#%{<#rS%HWO=Cvhvi;v{tZVJa`hVAV-mlv(@%rb(Db&+ z_EWR3Nnp21&Am%!vkIRt^u6#Pv)(StT{DlfcIfY4YpaJ6$m~rIuBZqv()!t_kDY|i zo{Pw`egSt`B@!Sk`7n~P0&&B3P1TLu!<%i&zoL|tO*vIKw#F91&Od~mdnY`v3wtWQ zU0C@bR}XO|sjY}6`{F%AI;R;3Bu=C$8w3RDqB8%{!bW{w9k-mBBkupWA@FZ%an0a_ zA@(Cke$)N|5em!&|B!0{NN8byKu1|J>)ZanZte@OqIiRE1%OiLR~x|@N6!O;GCa<@ zp_@U}sCFJ&6k*S;NaWH2cdMm~-&XpHg#evJ_9qc17V^$9W1Qn(3o@I_83(`D%3Q$e zaN%075))k2H*TyEKB>0czv@4Ie4GzargIyjdCjV!b8he8Ib5K<{RSW|o28jj-?g~` z=_)5|XuzX$6JbSM!P&;*7Zwn|SCr2`9PKs-)n7&zIV?E0!C&nAOLd#fJ3K5xszq@y z0^U>Rf%oZV=gaYpgwLUK^y&1(s27Dm$CaUOXA|j4h*%iteLfXg#++rY+sN$Rj4|7`O4|%KXe79Srts)u;PDF1^tAwK;_NJu86$G3 zF8>`oouR#X%isDH++ySFz9(BiS`#+HtEXWZGTJ2t?KC&=j@R#Q(NWz~UE(g0 ze?W~q)^)1Q@Z_z<0gAz|AEB7eFWQY$60tF0Zaq;w8s*JwWwixxc`QF#S72NK zRxMDC7n`=x4P~uV6)-Pom%1tKaCi^!tV#t<7*r=2X6plAG3E?#_A`dwAJjS^QMNct z{G8CTGPBWv`3L0krRrVqqq<;pRAZ0+f-c2U_X0=Rn+ql z)VVYsC$b+Ns^sdLNo|^w!>m=YG7f|bUxUBxn@$ACSgudcZqIPRc2`tW(EC4}vw|45 zk}0M5`aE7CYs;in9CeK9EwM(q5^8_^FSiSzz5TatcG@v>ZT8oNF*~FHSaSkE=ekb# z@4c4x-!F9GjTw-GYB;vjf)f3=ZR%wjXm2JiG1uIo6ugeO&=Fs z>A*wE#N$)-OIY58TNIz${)i)4H>>UFZ`DkY!CyP7^NvB7G9E9OER@3>`Xmh{`EZo1 ztlX0N-M;b%PSH2-O{Oxgb4?U67ga_8WgpEhRKBemNZPRaD)POh1-tYVlfnG8rU+b2 z?5h!3?Vs|`Ef#Aiy=QXuNPgQQ7jJQJH)&}D6p$C81EEet)E-)mb<7-@6o!RavK1_B zVb#i(otOm6j;C)1V7#A@*rD}fln&aT&ljk3=3J&^sPfiN)G%Kpb-3*Eo% z?hEGGh?S{h^ZBczucG`a|JlzCW;<%;4i!ZXNqF8>@B~P=P+d|iivRYxw{EQ(MkhA> z1Ok^jfz9uvJWAPMlbg=*G`0KaJFk4)muEV;9ZSLMI8r@XL|!OpKDU!Fkg(V$O%$Lo z-A-c%Bc|xhQcQ1rEMW2*XHuDBsQ!_qM+APTGP&3$6)cCF0B-L-WniUOMaD6d_u*Ti z(C;y%3vk_ldF>G_Z*7ds_ch9|-adI{;U`B`-LsuR?*gR?Dr*d3FfS*+f-E`1spY*` zeVht%_i24<_M@!Nh6Itx8A*!k#-o(XOw0m?rtC+;acg7vdqJE>;O<4Bf^@67DTVwA z1Gy-*DoqFhfQjb5uONURpHTDTDeky6up@7;!rbF6l`r!(MNGO$)9CYOLW|*F$_!fJ zl=J<@v*bPvFg3fR073>r`Gs1pgVI-4lUCB0HX_b%8~>M=2^UMB(w@=jfu= zG`f@_88eFA9qM)%rRXk5-hS6Jn;cDUsQ}nYKp!yXfUqIz0USq`=f6>!Yn0zSUG8_W 
z#Gc)Er#`eYEKiF<-sDHFp2d*|FN~2?QsP3&z9T51aJw#3je%udgH0NM1dGLjs^F-e zl;kT|_#pY9Z)A(my;eSNxbBn(#b%pe-W-d{?xwIlpzY`Uo;3rQnLNn}O4|eED~!&s zHRLma;j(wajOUv8rXmP@{Te1ksooSw*~@QB;dM@`%*!8;iN?kQ%0Z_ECFcV+!E19V zM&Y|((j7()5DNEoxoq=oJ{l%3 zu!MoQ$zdt$=H*cYT z!G$k3g^C^9K~0dYxhoi)h-ynPeQ+C>B;J?S# zis3!!mVsd3LiY z4d$066f3JOi}O*HxV2INaHaP)tJxXYA6vlfTF7`;Pfa65v#Hi59Gj$Kt&E*sbB{7j z<>tT=mN_O$gcQ=_ah_|@b$hsT?lXPqTvW2V80UUMDK((2x6>Rs`+&Q$$naH!YHKT& zo*p@8FA_wR?0AyB(Qyqn=n5jAW|;^gw>#YMGq;Ef6pUOM`Sj}{&ZLS!e~_$ysKSH2 z?say3@8U{pqtA3z%t470X3${dad7Oa-7-L-KYzOw>@m~=ECvQ5XiT9pU?4vE|9Tz!&)h$)*EZo?bi1m?lkw#f z_m|mH&v(@-Hlh=O8tmslK=ki~kxDK*dz!fJ5XqxcZFyxmC4a~yO={NoJ>W}m_2_p= zu!>4M@Y${!P-A= z-Q5KQmvBzKz&tjDCbcdKZqnyoNN0}b~Th(YOyLfVKiC{O7N{0IGzDRiQ z45WB3SqWYNRn%auzpOPvEa3l{?g^1eqeZa+bK{)QKcIPT+8b)2AmD z(si8U0>hl5x>>fn7hq=aKHz>yZz;>F;IPNdrh3uqyTp__;p#Q}j84oiJ!sKT5|df& zYj;Q9BLkqVp(@(bTA1P6 zAp&w_MfmaTw_n537x23jmZw=nt1MCWo7rQydlRb8i{?q*O?V1Ek6_ZFpM$wt`HtUA zMzRJBh#n^R&!N2+6`)clbRy6XYWm?nlimn_xhG8aNaI20VW{G1g%!gu-Zjs@fG0?^ z4bK&-<}jtaTPfzRn$%&B7Ns12zN9R1=9+!!79JF}dQ>&%F1c>OR`R7lhG*1$S1bF# z#7A`J1BvcN&VF5<>6vkLm5rxYt7M&;h5}nXHCDBb4~sWG;j)4QdN0I9NZpRV3+=7W z_>q}dfFVlTzO>uRi4iNEw=b2II)tq11BPxgZwB5Hc-PcMhAW<4vMMOb2A7;&U_1IZ zTj}nUePfEeYY~4d^ZlRrHW}TA11eYh8+qM}E<`VtOYPsQc4{cH(A|?7q*f1*DYp}$ zkUyC9F;>NOdiBdJ*(TFTg{0Z4PDbTRt*AaUP>s=oipsOp+Wmk)LTmUTRn|Lq0*+88 z`EN8IRg9nn_V(cx8D2CK$s6J+RNLuLqNY%QUi??vQfH}`Z2SU^JyBf{eB>?8Lx9RK zLXK`3MTP|mrA<3mv@o-}-j`BrBDoYmKD)(1t3s*MPh1D0O;=#w+fb%?flXyj>wn$5 z6+#vevJH>je)j1=&ZxUN_ccFk=Zp`zzbVr52NW%CVS;kL&f}#`*J6+q7Pp;k)8h5I zCy?Vq1r)4!tmjeshIL&;fAkIdu`o)@KCeGSGjhLbJRw{LskKV!>ntibMpfuQ3MF}V zyC(?(6E@TMW0=Nqswb^m@O#%sUxkpo$Dl~auJkPuThQQDga_;*=gvz}dPm|Co>TRL z(8JD~{%;S4FgGPYfb{%R*3Un%`btmegQ5^8mG5UEl18bB$2RXoUdzreX#&OYVXbx> zsd#$FmEp^1qD9Lpn&ic2f>?daN*eawlaIm-9ymzc_t%!&mA6w1cTOFC<5{u@zxQ+t zVx_~HkHsRG8nT`K77OV60Adyf`lx1`_UyX}tiMn`;L zvTzi3Re}4!8%n8~rQ*9QKYDMcO_XYK@=?|U=c{;ZuvM1Lg184{eIE8)94a>|O6Bo_ zqoS))yERQw%Dy$>{~_%?qnZlWtF=_J%ZiucuX_SyUFJ?{O!G42mS0$D37YpwUG^O@85;|^o^ za{Ar6kuEx$vQF_e<+9@BQ8X04o`Tn$zF}Yaq8NA_>J@Jf+`R_m`ImrQtkpBu;Ou~; zKI>M)O%@&LSGKzX3)`FP>FWt+QfDy~qx9AuhlQu&>5B1^MnXb-aaWC$_1W8u0<0&A zfAVKk{A;J(lwUeUyjS2N3D4{%mD>mVPbT1QFf96*8!Elbt6`+WCM3&+YEy2NS%?&b zELY#Bxm&MRT_L$Fdo*$&NnjXnw;X+96y{hR8Y@$2*38=E{Y%Ckt4?8%5r;_%b zD5HB&AFd20^n^GdT-8dW9%8lvMaH5`NCifo?3Rwfizor7H&F3u-yNg@*V6}*%NaU4 ztov-%8U7GV%X!Z)Ng_}$RDaC#^u{q9>qVw5>=E>_LeYXVt5a+y^ad&&m z7Az!MnAy!L2rUpv%lf6DtoPjb|M2V*lkO?Z7S~4}%kj|E zN{vSJVh_+}2a0c0xO@O)^9eCj%JLS9)mnb=qx0=)%Qn~O9+9=)0%yyRSvbRJOg{osWb8-xzdAgm&=uE{MJ?I zZ0r~xl`G4Gs6emF5iIFHUD5yds6%5gh7EFeKl|$Tm88FqD-@-`X-c25F)$6)TI&bi z`|qp-ggu60U5h^W?S4))%Rc}b##awO$Jg|(bt!vH^`mk{z;bcP zR|pgq73zVsfT_1!Up?oTeECdE?2J_vU@Befp{9Wzq?(^3H5<48!DQnD54LIX(Q{#o zRdK5EE?4DmMxSfnTGlyGW`qx`94j7l;9UUa9`o#-x^tz*mZ@L-@(gCpx5`d+Qoc5z z=Qvr_>MQHpfxG}BcSCP?>~Nk`z0UhAfdh-Nn>35l&ne7SqFy?YH+_tuq^~FDhGCgA ziyk^Kw;;`xo)v)qLVqmYNQps!B5#nSDHh!{p)B$GxNJON)ZTvvrnMYJ_eB^K;<(Yx z1;s#Ip8C;=~X?>xHKVPb@=Rsz}#=o+BQ ztwIERO#W^_s=J{BXw7&5iSS!V7_$7n3^IxqPWa)gxaGHscsGN6vV9~DpxYaCR-i%A zg{Z!n;meF{l*25}a{hx#*%6Riws$02!Vj%5z8#ovm_~tM#A#s+i~Gc*D1uIPWsUL; z!p2O#_jm|SRE2SPhJh8#DdDV z_^Z1f2h}d;i>A+gLKRn0 zEa)L9{yBUsk^Jrh2K(IOH>BNmw}wmwn9goe+$8yJ!}i4U4n^x&cNw2DZc+&sOfENQ zhUQFtU4;GnW>aJqZyxN;M$#FFbKWuon39)wZra8(baDOIy`Re@z<1AQ=*gK^f&&Gf z{ETx$tvU%yNr}c;&ejTdR@qMyl!>+_ct&j7ZmaB5JA=*oj?9OXfz^@9w9SObH%4`@ zm^q5jjcoJ^_z$@4`G8riSP*s8@CzlAF9}x@H2RC)OGLy8BkL@(zq6O4rn_y|4S;sIG&Yc0vtm?i`Npha8 zJ!AUB3fITk-;fbOgmQpJ)0}i&-7k{Bu2g6Da6v~OTYEEL1!!-gqqQzl#nJu5S>|_2 
zZFJozOMs;-Qz>#dcUAQm<@I{TmJk@o-JY$9AR!+vr%STnM1y7ZmXYLpicp(WIqrOI2N$1WtO)q`wEv=2NNQ7FPsZ7;FZJVuX^5Y$xFeyx4{3TMWRQKL2=0fZIHsuTBTLJ9G zCC}|VD_28;Vr79<&{acRT~X@v(ou&_cGr!%45mtT{zFjk`m~H zFVVVrvpsko#THfcS>lxnY-XZZw9EWNu^spM%VQ1o<_%ESox$V0i_;ZN&!-F`Gpi#F zvbdh1x>JD6Tnvu69mUvlI)7)}U8YG>n83P8%k1ZP7wW0IM)CcDD{7v98&NW{L6_&g zt6GMOYfY@$W8>a@1ZHs{{CnR$bR$3OKYzM*X4n)~)#e?2a04%Efzw@J6}!J#J-L#6 zIfU?VS&T*5f|hBTo2}8QHlR_PV;?<}d;D?)x^G^D_o3sX3Opzn%*ojqjw3LUyYCx) zn?MJcRhv{=-vN~sSD7rKS$gvKljF6A?MN)V$4`d4+-dGbA(7XsL!Up(E65F5!PqRp zImRYab|`&~1^qeP%t*#U>@YdHS<43857yY58xFXybI?u7*(uGl^m@Iw_r)!JH2a!v zO{D(te=)ip`tdvGvR2X=5rTj`>p+`XmQj{;?~~Lk*4Zf$GgvK$qsv4}Fk}-qb1ZbL zo%8`_1%HcS@ji!MM_wq67r9!)rexUE;Uq<2wxfXy`YgMI;w}fORG1w>JRP}C0JCSV zvkt|o!H#Z_ZvoTZ0Q`vi`ihah+?~)bV9tFu+2K|!Zzd*>9k^oJn}mDF-eNH_tpE(a zDS`oZ<fHKP?0TP`+qKX;IF?N^amQxmFb%bJQ6<0K6r3c%HXg<-~zZ~SVs^IZ`0RPw63 zUA45S4|LO{7hEU577j~C)hZqc;4Lz}pX2p$$c8JigSJck3FD^s7L7QnUzqPVn24F> zMoEseC9@9`5m@-VU3memof#Vc^e9#dfl6lLy$uh~U3e{eKSvy$*V-l?qZz2Td5BcP zWmhP-Who%l_IJcKdjR`x_dM!&JqwUInv3kgK>~SoCFPP7OS=0)bH7LImDcJXZ<4+V z&Mmw|GT2j#Rf_CZ{QFio{N=JBf1i&!H(>PvE4qBfP%1VWD;=6W+$`iQ(B=`VZz}!2eCF+>T5?tVSzkWm5 z3R2FK+mI#g@5(k|E!B$erv&z@7OE<&>Z&83>V3K|s1T87XL9~5=b-MOv+mDY7uWn{ z@_}av9sKBt?pzj;L5!?QL>^F8`T8bWJG&GuK2X-Ph`Rzch|r8@%OLhnn3n*)fikL{LPsLH7j`3 zxf6V#Au|<8s7kH!7!T&IFxGaUE(P= zSbf>kKFNWxnAlA;!W+%Q{4p0Ci09Bs?8`0b5zks>W^_k);Lg%bZfpJ_mG3i~!uId#3~$!+?Sf%p`gtQL0>6DOHbw}} z{ULo~%!ol^0+uM;O0};OvsJd87e@&@!F=Fue6bz0Fo$SF(Yu$kc;_vaiw-}6CT{A& zs)JBQ?W85QxMU%gDS? zb%G{7pW7KP`>CLK%*(DM9d7kvhfX8mn4U~6i(Mz&Yh#DvKU*+e8~!3H_UOD!pY5)~ z4z`3}r&y>s!R+_s zAFfhfHzJCi7MNj|c5*o5s`+SVNefS>T7l;gYvtO7FwuVcgFB=IGQGDY-XY&Tx_#Vv zByKpNG&Zu-O{nRV>M0?La#6_VpOYh_P;BE&i315zh{4J%{Dy>}CBQ-fsG*K4I66|` zGO+T7HYD17Cn>uZBS~hWaXESh3{%46iV5C#@QSmD08d)-RqW{yy|HJLB7><*9-o&i zv^%V+U@i|JucU?4d_jZ1Q4YC?0baqx3r0F`K==1A!3%!hpw5_GbZd1|A5r>8lmNsdwJ)Fypj#3#F9Nu8KY3bD}cB z=b3K3ZzBc-%}|mL0$5MEk!o){JQCU}@J@ys%g19~#wW(qTFpPG7j@>CO2=t4p>1PsZ#i-&?k6B*-gzdAHl*n?fkUlJ32R@6=1tqj1rfzV#P7u0 zxQy+l8PK!9rRBAA2l}u^w}(muiFPcsoA$+mn(}J$@Y?k8@YM;4Q@ziwcU;Zs{3hO6 zB=eSam?#&xl$(18lbUR#iCx-&BLK!sj?4ilW5lq*?Tt*|?5c4V^Lg3g-2FfDzabap zE*zd3nc2sS5Zs6$jCwn;P`4Fkdxrp}mSs=+1mjI4z1-YCPoN|CSABa7D{Z;iXJZ|n z)1`Lwxd)J4&W#Yc5R4*FsIh$YNK_5GLm?AbG(3mw`oU2og=Bdw7~9PHqV7DT(e zSY)g2WPM*aU?9bN;Y|T@#2ATNJAl4Jo**+5nT#HJ-|tsX7%vXlygkq@DIVzB7;l6< zb1-Wr@vF>ph;esc(8aRKLUbf&r7e`c`UAH-3nLq2cc+bVB;qG#GXNwxH~*=Hv=W8+ zjG6dzMD4KsLuPVxD?+Z>_OO4Cq_Mn=Jh3lj)ymE^M5N1JEuI!stS>4W;?HoP?M`1wu>9=V;B?@z**9k?^Wo0Y@pry-aBDhv!wKvqXd>63Ws3pmVpxy* zyYHq^+_~UY>veVYs;<2?H2-;B?leDGYyb;#o9{r!!}tfu157`C&8$g((XZ5~Bg+;2 z2C}A%4wwGA3tI7FI5>g1F+s5@O!TFZ`+HdrPllb_n~8@u&Hc*n6-iDaM-pDMQ*by| zCT<}w^;?f#T}?^FDU;$7B?Cq1#gkJjF) zY4KHi@KKlU;C@d~vT)6i{fDX{qM~=d7|hcv=bqn+m?-5J^s3n$#;+9pzW#Kd6iwO} zRU`JXYu_*La;~}1{^n>wO*8mH) z(q!iQ>EG{;Fjr`Ga1tU4BLm25{Eo;ANE4{vH19wL`vw1nCW8pAB~=IExC_ji%W{%3 zx??PKbv_;P-la}b595Tg$-9H@Hg<+F&>!d)bRfGZo^4^ZmVcgO*dQ}$Rl4j4ZRfX4 zC&rdbuU;w5Vs-o|Y0pB-F~z^yLr`Iub1Lio4mg3pE0TtX3-|JP?@pRH3!T1(nRvMDtye3C)_W9|KU#kZJ_U6Re}O zRy*Afq@1A8!dzJG)^UMoJp=MS28ywSvlA%>ulv~A=X{x#P_};(Zt~MZq58COCf)myq^WC%CHbc!AJ<>Q z(?7(=&(;|>khy|X&GH`f3AR0U7~(=Ggf&>mq+t@I6?%;D9t*08C$E(P6m_Rz=b99V%)7#GG>=!DvpHr>{z)< zPls5-HdG1sfrNO+tR2!2|1P~Fdunv;;WU$B{45PL3KjZjjqbLG$Uz+OZq`OOfE~5~ zG$^PuE|e{#VAT-+D}&tg%@b>X8qVz)I&Pe=r*gr0z)|r#65EtgB1WR2N6t3w{)U8) z3?83(m*H+ESA<=DXU{LZd1so>_Jwv7=X=JpA+Z2!Bf%9tm^54du@6#q}}@DV{AF}5GC zdG!834-h6u1^knIE5AEb1vd?L9%D^X+A4$HV5k;mz##A`D*C z^WLQ^FK#lJl&ifK-7j`tP7E87KIeAvVDz;3Qg<)Dj0kKDwwTF1QG4Q;1Rt;*d8m^gnDJdrD zF^qOq;ETJ1a&LNs%CSp2Q)|1z-L`C{p2B;EA2ptxo0?qt)1Hu{j&ROilkNA9*k4tB 
zOvzA_yQRV=&mobK^`BJB|GSs;uYIL|K7V*mRi24LH29&Qdb<6`eA?%$8iOCF%A~%U z0lKPmPKNyd#VqxYx#K^-HHOpKYy`o>i_6D6X@47=r@`Q<3&*O)UC_q*FD9!$ScJqs z{f+-(b^62X^#A{d;&<=Az%w8j9biEJOMkI06Cn5d3vc+>&!`gKTeG+Kl3>8iJS3e=K7YXXlfAvu3Ngs7asT==o+Guv&j~DKKu3`8$l$dY3aT znD6feVXx!0X3pR@<8OF=IXx>f!NQvP4=K6M-}9h#NNvH8PzjI&YZeSu7cK=da=fxB zlUBcF!D#ZbME=p0m+YfT1(K~d1 zK^v1PzYyt2tTrm{APPlpe1iN#qiv%}C$qnyiPxwT$uWS}TYo%~Im*GLOUUoE4hruIoCue)wS_@=1oQ&GA_R@B3`D{ z{J393);t?7HMsKHrq*Z8SzNd`;YOCgG{gA4aGTB7{Jrt-Js7FP*R|T;Q`T`sMurZe zUB+V6pL*v%T91Ug9KH6`R*h7*Jn|4h-*%%9Sne&hb3TYl&u*beTF5@u9RsCinOm(c zwD*~^TUC|cSvJ4~+ck%9k1SMAcxVhhEsS>Z%6ca2#@fxIEOggLCT7*d37*$2nZy!% zLL{e*zh^`|BV6aYjH0FV+3Xun&iCO*-gxukQ>@W?+>t)7pRQepm{mqyRCHRiosw_( zSaR*>4_^uN&<@dU*d&8~hiH6W+j7472d%`Kc;-FFR^g@dg6+=+sL(1}M(1s5dI}&( zgC4?7dT#k1oy9f@->1tAojem_B`WA^5^{{PIz``&MQG0k@6ZK-p1PEpg`nRhzWgP< z=dz1d;-?PKMo~H^mlhK8AAX-uqI7Ybe-(jNYqHhKq6vzT;B6tc;RN2hNK?;kIy~Xy zzJGGks*>?Lqqzz3ykGz#z5YHPnT5~nd=-&hLlfhZ7b^|n8fi+tiw#em`>eLP{k#eC z%AGTZtv=?+-YQZ`YGTFrUCeS{!BrRWC*Mf8(qI%{676!jrdfw^SpO%;*!O13+WQg3 zn>ptx%lJh+2XY#tz#`!Ivj{Gsnm&i`O%6F_eVM4d;H7EOzxYIOJp56!HU*92s8*Hn z<;5o~y4zZxa#){gy^XFO_3kOo$*@hxjCoaJUGaM2{vOW~G7^pUW%8Xs($}-9o%>iA zH=O<8y1B=*!i^+L0OHgBy8O#;?H1zB6Y^`w=)O!$KA+pYoQFvjFkcyZ-x(M!@zn>5 z`IyAV^$oN?i{;q2DB4z)+thqAQt>s7R6{u zM9>TnIZZPb>a`4SZx#LELy1!p9*_);9JTs_?nMM*%F&H8tj*NMYuDQrvnTUtN6(eg zcM0SaR_aRSRyn8n*qpr?8T{U8B2P>X2MerKWKK+*k#`PEwWRjwG21K9qD^NOzbQ>| z%CuU$RQ!f=>)L~Y9%0fC-xzxPr!67n$@gY5p6MqAYRF$(HPCB$M*Z*%Zner1q^3qvpHe6M$=4IXJa?}7fgZG>tioxACmao# z5v6MmSF2dkM|;2g%HU!1v1u|1XBJ+Br<{lMs1%x7T#ea5hWIHtPVBEf0n;mHG?O-B zj$ALOB-5YGdRf+Twj)bHvV=aN$Z2!G%_8Vx@$<;RA}*H)Uct*TUXmrA?!2mIGcl^H zX8bEQ2k61{%K|diqNE>20J`M!H9J=9#&=afJyR7^T*7MQDJwS0{ z7Rg5Lo1rKA+>Tls-<%77=*UtT`H}WwxKB7Ab!Fy(^c4wd#eCExDhOIO(}(*1nDCui&VPN(DKwv;lCvbCQ7^Wv}zONOp#tAQeCl-DKq z<;p6AxK77m?yTTiJHvkD$CvV_hJ_Nn*ow0lByVNe^Nul6{)7$tthxWZH=QPE8LraI zP3Xc8Jm_e*asDj-NYynbuBEs7Dv#d7cXX2`Gcia#3|5_vWda|4;DRN8W=&k|f4RBf za>_z=TPid~h5Y!o<&!0g#S4xz2_z6*_$zrk1SsK{im@fSNnF*X5i};Pm4@z?uIIpfo?r2VFo)#6EhGaS0VjCQw&%4k9&kRDQD+Z~7Ke&uv{A`z)LnA%wz=EhvF6G9imvFKO@U2PtZNMq;Wlol4(;0J>CiWnYr_3UA{Lm2ye5Y&5+@SB#!v?%@8n zN!tv*lD#UWah2Wu{T#-6jpQ;=i9help^yL8VH))y&XJg3yZh$MYp*EVp*QW`Bg;c8 zDwnKNT;8?Zimc+00HKEgrIwzsb2xD9Z~iql`SYBYPQ~Ds$AjlTrv(9B()vLaDaM&&5ldYGfLJMhMLD=$S8wquIBvvN+D*`3MxGB8(Ye2_W|yGsdU; z@z=`Q2bUiTwqE?i6CnxN9^fz;+3~_b=SG6-8S%lpP3%J7Y_E2`J|k{%?fu;QXE_VW z=R8aKjP&D$J&pr<5*t9Fai4Un26Fk`q7Er{6eU?y^Ua6S<_k#vaTTMwYVFqmh~yLi zpFB7%;fu{2h~B1#6s2W7T}UW^cb^j4VHeb~k`TGe<($oRgkqSusFPTxsP0f}T^c3) z50uf3knP~nGBUKltog9@rJ?Kba4}vf_SXiIc>h0Ur^u`uOgvz|385O&Gx<-MB;lZ9t0_v*;KwpRp#DK{94X#V^< zSpeuug7#8hP}EsZs7am({;7FF=A&kEY0L{?y)yp?K%;p+h7E*~+8}7;`|F9QdD5kM zAkw@aH6Bo`KDp$il?+fa59qQDx346mI@H>Jq0FBD?59-6JM`kEU@(|N1{lUKr=)LP zpN`>`GX0VhP4U{ACoTOiXYO!{=HVIO;*>uDH0|s?xp2N09Go5L9cjU%KmOb^;V;rE zKg@nU75P0wU(Z}9(*Nw%S@d$q0{o2?D0L%RlFv3Pvd6dsB}%i;ZR3ws?Z@^7zeR5& z2G02^ARCos*nFZ5u_LXLJjHB8yNv0(tR6+ndQ|(j-?jIh_L_tREfst}h-~~cP%BXb zP5TX@AakHwp>K_uaV6qW%g5bJ>*V?`KeTWYLYx;j@92zq3rsP8q>S7jWmX` zW;37{Tn+KMb1vz%-w{u1o*>`3$vLE$KF*@-z0z&*8?ss7p0T|%Ij|0^{I!0Z!c6W! 
zp2y5tdb?q#qaB$OpJ#TFvfR&h9*LTZ)55BEUdoa8k@7T%cqCSBCJ;6@O%L>U38TfL z#rIbaLKR_FYc7eC%d{w;kaOvZMS3a(^EQK9&w)o?P~88QPwbcmFQvU7)_i98my=CRE}l zWi%TmtL-r`I@Qu%rI(?gfoFF3i9xL6Y{}H}^Wq|O!DJ|Ql9qMl_({}bXw8kmeAg78 zps>B?M)8*%E)^_4M_FN9s_~pUhki77N?)IUD2q@_kP(=Axgo|z*xW>+s zA+UGJTqDbBIXO1d+UENf^ z&God*Tp@DfsBpS(&~1r1pNuCvp9dqSlw>U?gsOM6okB-yk$*nHw1q5_5uaW^BsIY8q7RLJR#Gp$h>>ge06 zypj%CN35S7OD)l~J%v%4k*YA(t&w{8LJ0`R=6#pgtE2M}j{}a8?c_v9k?GI2E>1vA zppUcKdHlzh$ez^s?3?eU#IEK?u8IbC%AS(F%n|;Ut*i!*gaQQt$5}}Sc?^*)9$(g>~OpYJ&-cWr=zBLVBcrQ*vxiki#+xQ=3Kfd zLpR3*3-i*?th{39;j4I_8NYKxwi?sckoQhqKaCVgDV6uP#;}n&4B8Ko6#|PVGo{87 z9{XO+JJv!k!xt1VbiuxN@B=MR3%M#V!vY!a0y$bN9d7@I%!s2i=F&;&Gm*=dNNN}o zX_=F}FiY2b==)Wn1m4I=4U;vVF@BDP%cnGBC~GgKPTlDW;b&@+(CwE!;Q&1xwjEe+ zXptO@HX2*eGG3!5f||E~Jfbw24=eLYdBVZME$En?R+w6z;I`l&11=w2`i0k{wS@%PZf1sC+82#e5 zVo=@CFhoUR9kzMiw{Dh=uj%Q?@bT>~NU~R;Vq|TZrKHZQGkSgcGHq5u}24s*gS2fNpREm%7punf3C+XSLLvx-~hM4!-&o&`eEaJXwNRUZ zW448Is1NN_<+{k{z(RZ=0uMr7auztTK7Gl*ZN{k5P`ne`+kBRTW6-h*XZ=Z!7 z@6`iQ%}8Bfs%VifkRcdK=e#=Rx#{w1qtAOYFXGP?loB$=>0K?uzRyy;aYVdCHmYG3 z&9G3CCjznrL_wStQA})8`Zq-GHFCBZ9BQa;jc+%!tfIXKO0>uJ1k~yma_6P5MX9l3 zeS*h1%SHG=xH2j8AQMjSSq zf!uj4#Rn$dK|3jZM^^?;-UTAU*mOrI_Az1~dhs`8x*r)cwNtl}6A_@r^a9a%rKwsXPz|`2z%7HXcPjh$a@U@1fge&h^P$bC-LH)eTD9 zE!{Mo+FW8Ad39vu)E8OQ>2>!^9j&ooZ+oYYFeW3iTLD$toShN_JPR~zX7qUTFT91x zv&t60TjuV>INj!yK9ybb(RnhfW@L1x0!~bX1;f@0LBl1ZXk;D{0)#bY{9*Kkdwd@Z zdY)nHc53fsb;W*o8=)RgW;QCU+VDURVNg(pz3os;5h(w4vE_)29C*7$g#Jd}i(Xt&2O-Mt}O0rRkH` z$}HSfKCa?Z{s&)Pr?R|1gEOw2N(q31@U!2N_h%OhSvSkK*N2ez|0x_u@2!7{+`dv z^!MU8HX;>I;aY9+&0x9(x6pDxzgqo#p!R-rSiRj_x`TIGBbU^;-q4elo+z5qiR>db zya+JjRUGIM_z1KZ3t|fjb8@Dh8FR&5ikE82xpzG@kM{Y4lb4C>^al~7w^F>ebA~f4 z-i#FqZ3NX1nTaYw%8qP5th+uLJhf(eP<74i?n^red;QmzSyD~tT*0B&vqkMZg75-umbzL{R&W4>zFwUjr90uYWZi*M2yY z>jx$acMHC!f>Ar z5j6vV2tj1u)H(D!QOza(m@>m5N|fLsXbe_oM?|hPjI7`O)_MvqwZ$1Oa{EU+g*I)J z@(KjK9KC>S;q~Pu>f!{#^+RN{r!ZpyE{1v-H%PlFr%q43jX3gLkKyM{=#Tuived3g z(j&0^UlS$Ct$4QrZ-v?rW_GS>f^-cqB6rF?G_`+2%I=B7wy2Mx2Ni!o?R>MapuH~l z+6mGf*rw!T+KNO=nj(!goP+Gv&<>^7JTv~opN>>FfDUX`y|B(B-Kw5(10MOEI8FK} z!wKAJF@ANf+j1Vk2Oqpa?ylpS^A7n&?*x4mZ5~J(Cx2K!dWm~RhF0Pf3S8zcF9*oS zZdu>{TB)P_wZe*9_*Li{i!Nj?XdaX1FxCUsJwH4^h6U%{e^k+LNaHn->^ovd$Wh*# z?yB{=TbO*knJ5yM8g8iT&OU)a0g(On5xW!%uoCMi5-Z;U6NVim0y8^O?)S)zBhX`U z-IPdVZTqhyHt|D3rWb%jIFOY_f{+_9F>=&|?uv!42k?rbK;AygJ&@eIi(KU|^L_u6 z1x&V()b6G<2mRlH$VPzc*@kkP0YV8{(kW5^0NFdkj*l^ax{~PP^##~DYzc|AmkRQmg_$PEg%j+*1a2X#+`hbJ#<4RNtqzai}7{7HxQ_rjVOZZmilUuIv% zB&9&`PI~weT%c-i;i9#Jw@$7@%}1ZvZ&k6&2t2g|vIzqW=4>hkWPDffvR1Q+`mR&> zZ#a!XR(fm8N8g}?qb0gmYTmzqhnT^;kg^t_7rfQ#LhinZoo^coTG(WM_Px_b^TxPO zdj1Y(^0}4K)&aRrrPXWS>@n$0NojdesefCi$ouIrCh*u>5Q zDde9V2~C2GRE`Ax`=5H|FdU?yTYZnOH2!V&;Sf}GNtqv>0Ad` z`2;9g5K%sJh6pck@0nA4TdO~R;Y>!Jb!>&$XxeR#)VSDGjC3#gPUo%Th@pw=A3d4;vlNpXj!sh)N#?J@u zs39dWWq(FvQQ^q-EdkQaU`*PT|3H8G&;c)V#W*EbYNKb?IBP@EpGaqh<2z8s{_+!H zJ?oM}J=$!>8gWqUFY|Ip+K2 zeT3H+bSiQ-b7top9vCDe7dYw6czp~h*!Ryy<#=euZkEF35h6W|{HW&&XrVov z_zjU4@yR2|PL;(qpo7Gy4lzY843{^|E1Y}g67VHtDiGKLtPSSBu{QO~_^FJpeBu?{ zbew&%`G^5idL%mk{n!B=VV&`prJ9e+Of~6ztr(U$m53)&l=&Yn#5EIaBWNAa)-`@B zj;{it;ev?%Ddb%Qjf;7&FE?KCSxk#{mdyb(ZObfA@EmKDPcdTfOrKG$2Awb6_g|eU zHP=3B9Ryo=r>A*lb%6;D7k?XQiDH0bBkvwzo5qmF7%SFhS z&hA!9@5nlQvjg3nO75=bdN+A{yfoJJNdWQjAr&XZ_EOs~EV|2K)JZ-?my{tTC~RBJ zFCt){Aro8G~&n2R`?Mw8Ki7QZK;2YB?I7_ zq%gu3v+x*l^OZmV>$PaYu#)91YaWq5d4mgaeJc*Y>x+;p#fCZ|bC@G$XUy?FNCo7B z$I&s#8~aZi5kKQL?W|4*Zn7;@@(A}-*Ec68y<|9UPbJLuN(fZMisEmszgb+BNDLWCw-IQ^`Az!cOzs^iH8Q&HdtkncH76R%6`7*h8hFRwPs;;L%bGMyyjpS;5pU_o9 zk3-c`?v3uNkqsw-UXuX%YV-jZib}15VU5fH=^o!En2$M+O1DAK5;g98H9I|!SqQN< 
z)#dO>mZ30lNB6=0Eha|G-RHyBZ0*mMETpe!nlx@$nlYd85CIzjJ-|s zjnM1#mnn0HyUVoNT^uh`nhR*Ybaz~vy|Q5$V-_f(od;QmSx-HN$usDFPj2jVUug7&;y=tu!yA2hDhTTFc@KL#YtOJO8BzUYyL?7# z?2QOs9=ClUkB`I}9IR`WmdCZQGin-0t4s^EaqU%wy*A~n@o8?mDxpC-rTvoPJ156T zUaV!z-OZh~8rI4rE}H^37+AY+F{Q{>Ou!{BW4dAe(uRo?^_nwYVjUeV=KgOFUySJ; ziX0DNX%0rQbghIMBn>|J;vAA*?Oiyp`8iu>l4v7u8|y(>gVdaA!Vtcjz)0l8=P(?H zN4}`1$DN9Q?q+sb&m5$#(gp7aUnVOXg~M;4RETQ$L{JV+Ia_PxF^guNASIK3yjOhq z*r}yuBY}H?XLNS|)P)Bo4~~`EX;A8&14QI(vY|^+Pc0H%NlscpMDt*^fl_)@@QL}wzG!863df0ecjg|aC780)Wr(v|hYTn2W{Q^n z5&8-Z&IXXK{C_;B2lD9#JpV-$^5)-OL|bQRs5FRKXAXXB1{yrC?+r&KO<9+5 zJ_oHzAQc|rlHCp-zL>3~1)l=%(b1hPN?w41IPVedhD4e!9E}S+2{Z55At{4f1^qq-zEsXkhbQbdGDl} zq^a@X*(-@GE!~3qet_zo=fb~&OEmozRaAH==EAKTk_NT+xf0aQ34!zWrT8yxN=(FQ zXc&VQO=L00d}3ukW;#PB`(+<>dV&wfXLs(+6Y%DAK)@bf_u%1ap>*k6_g|R+)q4x&N264JYSPww z&BGtrzudUvs4V{ieB}GoNs+Ob08tnK=C;pYFj(Mg*NyNtCNaef{El0&q%hQpEh+o zraUxESFD*Y3lDIzmr-%;XAS;*d|T2n^J22~i~OTc;5%?K9NX&1*SvWX+5frjoltNu zmDI?77FYNWiWQqHuV2k4m>{0=9jZyBTN2vRCSOe2B*LAJ1V4o%W+?o3KDKi2@bhe4vk%qZ-A0zFcuch==WhEh6wOo2_EKl54|ebU%q&@E3sfSTGS~+eI2!X{GsEU-Dx? z3tY_|m9Ei6H*N+zt4xx*)gG&|yvNs>RF$^j>x18&%gmaqdRu1%GW-& z9XHgHijD#eNJ@jw1u_SmcE}5{H}@ft;qKku@wk(-Gh=iU&cbdeu5U@vem(LA0AMHn zJp(@T;b82k3kx9z9#08#P0~(Ix;S2FY+d2%CM4%b-(tq~`YfxDP3$sdWPiPde1$b1 zRN11|s6~R!`nxvjQ@`Mpa_OukBJedTOg(<_>?;&WP7UGpdDys339M>su_YbDn4!{Wi=~2D zloEMm54|kc{jdhocu+vYW~TM+YJTBT#fzg($vepFd}%Bp5;luit9?|~x&b@-a-7nj zS^d)$vu>LsCyKa-%)^*iANmEW7Sg$)y`D>JG$m+1qFs_=8w?L&)gPVCPz;cV7I#reO%l5pQWX(_!5&RGc0pc?7&66eee;tgm`v1rJS*V z13}UIa@{y5jxk{i+UjBtJIHWm0yte6hG$`Y_$+l6@;(5x>NtCOx?%+5hOEj^N|Y#r z=iQES^T_5i&rpo3z(}D-k@#FVUO$BvMeO#pBA;mn0155;d%q!{GTx<_mCzoos^|xn zIr8IRbD?H8TmSKvr#|g%@OMv;`8f3i-?#`Q&c(3+UYUH!2c@KNbNX{q(#2BGkuYQ5 z-W1B)W(M$(DK9?WUB7yH|D`7<_wYPS zwm-qCC4GC8zA@eDX+_AyB=aK{bXzuEt-Q7wY-sb9Cl0ELkC@RiH8}3PY)*dC#^D{$ zcrL4+c{Qus7JJe>=CQg~o5*m0uS%xs(`%el6LC2qeo4{!uIGUu;Uwaw`v93*0H^pg zW*)nU7+a}B#Rz2On_?uUaUIlO9A_9ADo&N_Igxwz3-4EEO>TXSdk8Z>pkwlO#YZl< zx#HOK^~&AP<^a+zzAkP|b#+IwLbP;8u8AZWSMIp3+Hv)#;0m9B1Gh`AWBB*b>_+3k zFXz5^I>|2fYu`szxtYZT|Dr4q@s+<-caGtyUGU-=-5%qQ?{%fE19Q@|T&+zznM|m1 z?gfX#YT#KvzMFWgtF>pGn*XdU{r@ocodHd3YuBMGT?k4Cr56?HJ)(wQLlu;c5RhI) zAQS0J@&NbkK$3lMU?!E?`j&n;j1`TZDXk_nT^o?V{3*4pcFxNY`A zNiatorHn=_^*b(fhXE4>dbEcI=SfdpcscpwBKi41N$SbQ0V(oV5=+P$()H`Xr!+e| zkt@1Viq^rTjA{k!zE0Jrbb@%RyZ#v8AbRT~mFM+A!&ge{on?evD~L*B06%C0Fur-b&sj8I zGP6akV%h=fqbpZrU!Erip(=WXf?;{l(=pb_g{rS*b-Fd(IuOkPl#n7M)-Np+RMSG)h{CYC zhd-O1(jyNH)4xAYQvUGSdOga;hzU~4EiUt_>~WhhU-^Oj2d2xCfLfO~;ah?W$%AR$Mn$I=jmGC=ltQ1PntRJJ-$vjs8xT8 zKiX#hKo~)Spu{quKP-eI_b^P0@0IR4Y&)=Pt==1EcWzK^`u1L4v)7Flr63485+ z+?5G!*zfY;?Kxc}AR+$+4d=-(h9a@@F5&`sH@g{LYx*S>M8|7~QN23XCXU~9O_>qI z&%+bW^adPpTns^`nuHR0*%hm%_tY4|Ilw#u6%V2-dJ9#Mm!t;+lqIzc z>I7&32fO~U4~E^QfY>OB>WR~0Rq1uXU>Wg`ZVSP-1{H^0zD0A73?TqqJ0Ar=kU9XA zuygubZuI0BvyEuIU5Cy>M4kil?Lj`1L77(~%$Z|Bp)&CN)Gu%f!}K=~csyZ1@?&Rx z&ZU|v;R+yJFMY`hgunh*@p+xFH5kYS8z8(%q|A3+;Llk}!#~Un%m3pgkf@~=5K<}P zUgbtI^rz-b+I}=#EaA5U!xdul5GHy#;+9kTGKzY6$R1p&`zw>!VNg9gEpVRjN;7n| z*)bAFIu|r6BlF6cYKLvIb8&M!6=1NJs*+kM+*zExo{WU>Yq+Ujco<5>aIVLyImA&T z;Swc*)Kq{mpyF>i7BGsQWc{WpAoGZKjZ*l=_Lq%>=RG(H7tn$|UH#dI^bj1b*G3DE*az#)v! 
ze4e6@7uAf^?z)rAl1a8|ug=j_P%4xoh98jxY>qxYiM{%2;+iG>cNji=W8KI}wb3X% zk`p7;Y*)#4K((*VcqX=!1<(bV`n(>@uY3P(7$eM}SKG7XUWbdbks%SaR7==|bx1KW z!tPbJWre@h>4^Uzis{&ZZH~d_sOA8-TAl%VQ1H z_f54vb&aB<=u|b z^$ap3ONjtnrduC4r+W}laKHED9Z4^v%DR=yop(f8R$3+{&7W$vEv)uKHYr#zatSgj zoyDamUTkNjAH~9`J}T^ScFFJ3xI79vEy<6Ao3gI(7{BauNxz*NAAJO{!Xa9iJ7~3q zJzOgK%`URjml9;|5`z6f(?sSa*dCh2j2eJa4*Q=+)bte+HBs-on(c%zD@$)ZF{1ifi=>c+od|sRjV&;*3uy6FR zKAYbb9xyGWfo6D6*6 z=S2#x?~JkBA0QA-526(!D*^YresC{q|8vJFpIreuK_{h4U;Ifk=yq6o(D#R#)ELE* z(UN1-B8$I|)B&V5r68bD`}zm)F2J27brEL9T{l=P^kD3GL$oB4j zEK9^vm3q+KJ(>Uq-Em|vUJ12&OQn;J3K(VwXkHw&nft-(#eq-^4Wk-I>fy)nZ<&iy z+={P~G8X#6ks1er5y#q(_Xj%XCU13puKwWYyZFW9An4gM#m+#y7PyrQ6+085B3^bx zV6$Vh)!xqn#mkIFA--pQ^^r&s_s50L3(qL;-?=7;bE8N)_G`K#h!>T|^B702RSKYR zOr!m7abOA1+q@T98n^)2W}K^V_(9-v`&ulMxP3p2o^|HfQW12g$)vu& zEn6F6GYr{he61hrqi|U!D}_-nPgEF}el#X&)h6sRxv_rNBA(GKL0)54vV6Wo0w$%O zp%A)NRKWSsZQwz{*Q{^Yev@A&SQdb9u^j+bTM}bq?e-xjkI+m9T%9a8rTLb;gjL=< zlG}_|%=RNBX)~?f?2=H+N_Uw9+BCY}7StSZI(w>>9*@*d3jX(+o{-zkSKQG5$$_Wu z3cVgw6xqX3*EzGd_seRaVy049h&EyNVY$r5a$(2^9V=OIdtW5f2X{|eU8psu&c0|# zWopMG7DvfPQ0aa`4+C1Ref6iKnjLj2=RdF!JV4ej4uEA2!u|jpdgfFB-ts>7dnDQm zvO@s?z(>s(U`o5a1$?vK))`aAqz-0arZl@*445DLQ-u6Y7PY+>$SgAI$L7PaUlV@Z zv-DnbbUOy#@$jSH3*`z;C00g2-Zo%}{)#0AI865Twu}>6$hiiG>xz?eXV?84;m(FZ zdqxO`n(_t3TYzVWv-IsY>xm*UK;Ii^j`*#NiUyQX$L8rCfD>E6FHDrF<9450IR0^++JOjIzdlQiD3ibuQL%y~QFzSdRxJeXk@T>%=UXCe1%TZ1+j*Xif-|6?`K(vZy#bd=c4|%B>y$O0=bc3q;s|9_x zPR1H%SrI!d?Cw@yPya&)RdUjO0tb{pIP(-2c$0{A+-5N1rFd@WN$f#eU!GE3K*;Nm z>bF5?IgS^hz{9fb?Q*D4KDURL4u z#0e2fGzNtFJ?zZa_9h#x+mDBuLT$gnGphfaWXbOIUNf*TDXaOH$Qn&(Lc`dJ3CtmqbCm@caxUsOKw5k#v+(9513(I$bHd7=gQK2_*F%ekayZ;qSbw32iT=D)|jj@*@f_FZk_jAK- zCD{g!>yZz?*)1c`T!7`pD-6d$4wmvK0KZtH+h%rtzP{W4sq#l12Co1^G1(M|x=<(j zu%iHAFabrY?WejsgBiJyYLgBGS|5T__497tuVV^IoZQLz`PFq6;fSS|cOFq#ATd6d zNln+<57rjV(JHXo^Z})+s+8SduYI$1pR}>UHoqoBj~7DjZF@)NK<0BGl*?mS4mb<> zPU>Z1yI#6a``G3wSve1%HAag2;!E#Aq!KGYlwBram3yV z@KD&s*W+n`?%8$`l9o27Q-CR0@Z7P#;9hSAxrO6(_}ys!GEzCC{L0?JaKFotr!Dj? zPdKI7h|&mx;>&A&47J@NB-VG)F<9&qriQq_G$yv|Xo{?bc}3Nd0%~F6={IZwMCCRZ zssl)f>HGd>6(GSS=&Nu3@M&Sm?u9BsUIZ<%FSZEIAd%%qTX*C1wHC}|*5RQ`UtEqX zO`pA-IBi9}Ced2lk9m`}+`KS;#_VfQEE1Q{eN^{fbVQVt@(Qc~(})2#U=<<>5fhR? 
zrBopPC3ycd34nLmKC@sr|36g|iD$f%CHaJYKhK0{IYr;F({nWnVUv9p*4z-g8CeBr zGS+UjGP8}z-tf^JYJ{?D0F5X_gaf{?-rW=X@_^+uPLQP(p-uy=^l`&T0(2t7QWX|u zce5YT>3pw%4sV58tkb&uaW7Lzs}z z3X1yiMWPl+X!wB!P@6)e!3W8A9q)~)G|RyWtunZ(Enq}-G?zTZ#oSMDbmc*eAOho8 z9LlKzgBzEKq;j4Zwyh4Yd}~SGm@*{ z6jZ{1`n_mP78ORBgH3`n-XT5_VNw$VnzuGhF1Zq}BrwK}cuT&nfH3H2jsb1cRi7TW z!%k><$Tr=hx%(_^zx#`?*}r#(zQ)e+qbM!qn-jkTyb877!yxW9s5TVZ>;4W9t;+y@ zJHLm`o%<7CPZH7L2i+aa?1R!C)sf5}?8rK(j2@YpKJ62%`6fpMBr1!zY?PK)F^pqe zv14mW7o|$%g**fZzhCO*sI}K8fZ#ed`e5MFw-ZnrALt)FHu-+K10V@pZ2x0oleN&i zq%wT0kd}-g8M$n`%s?!^!N>qTEMYS6Prb%sz(W)K0~8#zFQz%Ny?7`BA%>i!vQs3r z)AG609sPRe3$O~8VC|PqT0sw#?ReHKSq&PLts(nQZ4ltA}aa|hf~(w2;ET`JDP8!h{tzRkS%*J0Tc_(%~Bp|~eY;{!Wj z@n0odfhVfo{aOFzV@6}Z=S>T%fhMS5JU}#vH{AmIUN6sQ=&Hi@cU9Kd?-k|}7t;8` z+XaIwlg$SGzUiWkJVWtbj_l3z%pAlnaie0NAZq3j;ueZ#n$caj#Xy|w1j5UtIkN{8 z5r8)EZ_=9cWRnDwD-%T`AL&0pW%U`@AbS&@Sj3mdYiB}Zz@ne)A2Rk;_s*!L?EnEd ziQ^@B^xTTyd4RfIFf;DYaH9W(|n|@zmdQ6bPcj6gsrDH zZ4RbB19*~s?BRlA2Z|C+zAuJRc36(MY4$B)+fxpRB4 zqe2dpsjLVG(nys#`FW}6ssHq~dC`^4s#fec?R-M%2H zH&hQUKIXB&HIt>gfYitq#rcKS7JoFFBIEH^czQnh_4$D zn~YBnNX@p7ILvNA3P@AL=rKZoZa_ypQDfWAIYma5!=#?fLawgN{%F)x-VhMms-W{Iuj}fn|^e3ehU&cWc@R z0EiZM^GNQ@piYd_zX>plm~ng!J7yRY!S~c>uKJa@eSC2n4VaI}=UA6)-TZ7LOTjK( z^Q)|vOAaGiI9=zE{zCLV7khZYX;2%MZcv(TVMeO0iLu&9Mo1-?wdNfnn9?G4BzE-O za*WdQxSrkHg2&|LanCORtf_khOmcNso z1&pyG?b8~5I)cYT%lbZae#8mY4S6CxZj&)|8q&P{e5iSAjN%fcw@cthI_WoOQppsb zE2?)YX%m5OFf6L9K1$2dx$&$92!QPolaD(R`mMQ;5osiA5VM+)nX{v4?QbkW#H~*K z9>;d)XY>g8u*wU+BEK<2Wkl${QEOrfbF^0sdRRXo=W~tx*D3ZOczrC38*6VE^V16d zhvpF4COiz^h^!z6Ht0kBGQ!@0lWMNdAG`Yf4!t#31=|YDyMcS}IZHe0=iE|gH`!BE zW$B$wMTZQZ^RirH?&}U^!zzc$ucHE{CYxV2TSxvJZ$o_E3&|UJHcNY4*VqJf5JlS_ZVlNwk}IMSg(bEF2N~~!J2eu<#y1%l|1r;_#c*Hd z%JO1v4&4`qfaY4^a0F!1>ZyHBQnKiqo_p6Bxfh!vBFNo8t^c0qX$ZQ8_G8eHfWDNxMl!yb7AulNac+Yag7-bvSO2(&+mX7W;!a7g56Tbmr{Zr37 zA#_BAtBtRm#%}Y?AWYaQUc_MrCZ(F`(T}=260qb=W80<2Q$25f00|1=uKbqqq|sD# z0jvQ56X*0BB?ntT7O?%C?jfHiRp_Fejhb8aH-_g!j6PVwWqU;$>SRrgwG??DXCz9{ z_4}O7YfKrP1nzUBzz4UA$|e>3-xYVCizaW|=N^W5$HQ0Rcuq;8C)QH-5y5`|rZ?PvKhgza?vm5(<<@0<8xb+e z?kDAZoEQWDuqV+gC3kNoRNFW~Kkh4;m~mr_;%cV_jqdh5%JrxjdmWWKGvy<2AX_0w z6L_I_qeF3_@h!D8^G_e^5&iXD0w)#{KgM;F>lmV3FQWFA56|!2EoCE>9;%q(M`4HS z?ncZIf%G$-O&%nf>lzLfR=+$JI7(&B?r6!e2j{NV?m9#z<+5I-;^CAc6OPl;16)u3 z$<_SkZ1Sez+?hlUfJR`Fo2mQHeeE+3KntLjoz}KgLoY^e4}Nw}$~nh5e-PTXoSEX3 zPad~0RD>U@B@YF6{6C3{lJc_s87j3ZwF|S`YiF{2*#9YodJ2#mmP3|W(ahJ zD{P2GZq~wka8;~Sj4&qt4dk1np>)@Az82IUku+QRDtt>d`#mrfd{E@mzNIy&l*w@@ zv+;N}tKhB%>rLHv1(qYWP{^Gu|35FSNbp-Gb`STrOzakpi%oBC_Iu!D<}kd?p9uy;cG)V+3_LC=18@+gv=|k6;7}Vyv00VWOytVT9El>XeSFk8khT^AWt-Xh9}-_ z)*mQ51Aw`CNFW?GT$p+7MCa`TsD$^3nhjo&0ys`zfP6+BNaPg^lDxX)Kjm!nBa^P$ zb~Z0@gOJZhCV49TJzP5(;6#SLPbwz}cM7zik6+>@c@J-dmg)dSdY0cJw)|eG00kcx69cF7H#cfAU=trPeM#onZJ0Ce~Ddl$>Ctm7>}M`O@3xK z(`oCMU-9Dk_gb^D5=yD@4?d8CrHNcsa*Tfdkw^65@> zs}Pmfv*k4zB~-Mu_I$X0yQiRn3q=Aj(eU>6wL5}mI8UtM>*;;YHQYRb8O?CJn~fJB zY|oy}v4!ib;jIf!Z@L-6Mp+UstR5$*;_ShamCH_Pi4IVC80-Cr#SyhvdF4)&soXEt zucK1;gttmX7h*K^1^5o+&=1tLCeqp(F$9l*)-Yp}WaM`{9*sa%(gOse`Gy zhZV&H%XJC&k5x6#k@YFFDB8B?;xtJ_xQX^JHng!YN17i~CRBErrY4HKx(Cc(!Y-RM zhU$hSGV!)x>%&-+4t<43vuxvvt61eC&Wd6s?5B+>%q1U3lQBHd#8xH_l6Y{7Z{0BF zb*RKWb=#5{qqWssYGOh9N*yyy^IP^-jwkCuhXum5kOl_Xc=Gm?jIXj)A8&(!6q4T> z$m~kOOzVWLdcHoeg=;)kmghBm-Z8l@yeq>11}V@jMnxRo@tuq?&S`k!)jcDg!ypu! 
zIx;ddGD1Bflfy>p+S_wfKnG&<(`gfP%a7v~GW=;!Kv4S!NcbG~s$eKPyW&pAEc_^^ zUpAf$Oz`fBSzW)hJkEFcQf6MJ+FL?Ekr^moWBAZnoePYYQl@X`KhqWaiYyv^F>JCs z@;aEkPjRq1nlz`MZoqZ2Sr1_vITZ;*iorIoHGhZ8OC8WH zf$EL&q=4BTmwv10UVboHH%;Ud5h|Pt9_M4Qw~gH0YCoS%Tx7a4XE2ygZ8&ppeZmto zkF6ZI;pNfZ;cZ#(kVEV4)Z#!lsmKN?MMy&S)e%#MhYPC(L~YG)Z<)D~q(x7DV4-xLP^UCcmWY_qhUon-9`SVH@00v5QvZK{1&ik zd-GFA%VqpT(geaSh6t3Qywr}1o7Dm-^^5R&-=2baSNT(TMoqdWDNus1XWfNw9*Bq_ zuzlkkj`$z?32uZglZU z(y_5EZ+cD5ZRthhrmM^NmaNy6m3dKzN36BZ3O4U|6)`4ktCf@=niCN}-1%`nK9x>? zz>)Y?cw(3bvECU5`uEl!s@&$WSWzr?uQ~rvEQ3wF3pM49^IxoRs?--w)=hGwlms zY@0Wt>yjD4h>@y9PIcZiTTT#$5+1C?NPq6x#f^u`)S*UGQ2dWH`xHyz-@mD`w0{4R zIp^i7lc79nAuw$uq3}r3Vzz=*K`N4DME?lD&k3$y1_PR;cjsz*9Rr}BNyisDF|Xea z_s8$*(c*rxcPC#^Fs$*4voRlo*z;Yxy!9>OhdHuozd6_XD1IFoo?k3>AR6Ev}n{ zAqMua?+md>G5y0LAkYt#K<5;)oL~Ma1yOtDF3YJp$UDR=q^^YwZ#u~4UA-G10fc8Z z(YK*s?Z09ZoQM75q_9CiivpB-?$rqXZ(C31mEL0+5RrhMk97NZlCaU=LK8r1-4t55X+*o|s+Nt6 z{vC=aX${xFY>2d)!80SFDYdsBc)kc9ndCE~Yg&v=QRdFNV+7pLZ#NP3}oJ>{z5cQH`yX!fka?>T*u0TDTu-jJHpv!4l{rW0ply+T{GX zIs%V#dS_M>ccQn;+j6nt!B9fB?uvuaIt)46(tKCST@=~b8)vySNf^AC#_LPbQSaVV zQt)hE#onE&pJYdZA-YvN5Qov}RhdZ}gxHS`O@5-L(3*a8*v+Ues_W3uuoL|8O`v9d zp?$8ubE=aJUsE2FZH}Ij;raDWbgZuuC6iXUhSZWsXjmguWhbz8KLky+KTQ&W9 zUA`ry*+psvZmOe4)^Uio5Htgz?b&M2Ky!Zvum;^6tNqstoGEg{fVstF^pDQ`DK2af zElNI5#?IuU#>ZsF=xr#ez*I++QFbSzDSD zjsrM1>RX`VjD}`kOQ6i#5;qtOuXmuuB%lBb2kdF~wSb9RR}p045eBGS*`5mn7%;%5 z9TXp@@aexR!VK(j{p_w-p4-pzBLwP~)czWl7;xx7k9y29?1u(RGg+QHz7e*&Oa5I! zuu@cXa7uoN630B6H?RXF6gE%1zFrs)?|oWLeEns}PT@)*&yvjxId+F@R%TIho7ZHZ zYS0&23+H>=?c2uU>f~r;R?dz|4Ne?u~tN+xz+W#BFF=$$nl&qAwMD zX*}8z^k2od+f6n%}ugdL4LY< zP?`ZA_ZL|}Zq3&ZY2iE?hRPv0&${0=(#!*LL_^=u&U220DhEPbW20{IXMcb^y;E77 zpUZ&d)uggSdv!9` zWm9c9lP!82ay~1S)b1S|Z-BRlT!bt<{{1}M|L1ua@+mHI#>c&L(r|mFkKslHG#{xq zN&uY9dk9uU;osjdE`jQg6$578w?0RjR|UsM#9I+2=S?YCTid<|3$z7_FNG7*Xn{e+ zL?9f0zFVEfo+CAV*K>FpK2l~9whpkn1^@ZI;Sqy&x}lR$190>w6a$-)z=}w zUiaIWMJprPyCgXh>g=f<6!%J(YIQ8SR;c()FCy&nvr}z_caubX>AC?cCV=H?B#rH! zZ>HEdr<$RBaY6P`4{K)p0aNw-sxj+7uTF#Nl!)#E7Z*p|zzCsTmq@Bcq>UDy^BL7c zCqB&?w7RmIJRHiQl)1`DhT%2|m^;#t{zqwv8CO6JJJt9&J2pCxq%*MK`z?FtT1`^N zh-LBe*aDID!?wl{6Mq)G)np=SxH-16m9 zukJxkj&X#5r&X8={kePD(ACMfnjm%RC%M1 z?b&hVom34w@Ek>> z^e+kiZXXidNazj-|k?Fz&)HxU@yFNVnRsS&L8Y3%mW#IKD1IA7mP z>P69lCB9I=+-_nTu4Jj&Ic#c$lc7F*mhwDPZm5V2JL)YfL&8c$KTZ+gSIa@cdR7Fl z3f}sW0u;4F**Op_VVH6^Q{ozYzP^7iZkeE#V>A0Lw;BX;7{H_1aMOL{5>eqnjeH%y zy}cuRp~%nAs$yqkX^@TNxX$RDIayp>OPomx636|h1yvl7Juqo=43U*+Qq%5|r5P6a zFfk(kdB(7lKa1$L=cj~8K_{Vrl&vFQ2F3l8WXJ4vQB8QgZ}xV`?4$^idt9O|8_`M>3zQdnZbkDT zR(+kmY3;#+2|gOqGIl@TOX$hxA3Iiv2TPv=SoM7@T=DT-O)a>%Xod%&RIv`*R6xi7 z)(3F?x%HGutkV#hUt#NdQPRVfCUNu>2e)QDx6<>mZ)BRoA0SKA&%P5m{#k+w7&*et z>aCwvJ#x!OXrE*r6|@_3(cVNnv~w@#B9_52Mz`Dj6`p|X{CL@*kqWhLi3H?JOi`zu z*;L_-T6W!;nnb7rN_GrOjp04Gm&1u#gb>=)mtrR7oIMMSG=wbj*J1>_4@Bk$b#E>e zn|r)Y2f%7#u6d_5EX~!(u6;GKm*fojG5SdTc%c$b(Ke}<#&6AiGA8h-&lkyhr-2O$ z-+6KRQu)TVlsiLFrJmJ<$L1R5pF`8lbzo>J{$D$`j3pv>Cd{cF^sgJ?tk z>pQxeaeC>qh;P_mzNxR7^?}>tn|HjVMv)`tvj_eFh3n@}Je}Ck1D>f>rH6+{VZ~!o zZng;VA7;8YoM;}-g9HdPpikC#9QwjwKWY8g_EEt;W-sbJ-&Y#lbr@vuZuhlk`eyd! 
zQ6DWZW&(r8!jpd=G?jVeU6G%nrnP674P^2D?QD(NzR~<@;%maZd@l;Q&y7{?33ORQ zI~Fk#lq*k}EC>v4M9K948r;Q1_a`MIQ_11lk!CYqB1^Lzml=N7y_!}~K9A?q?yatB zd=Z@DhgHA++*$SRl=BP1ouT+POlyq1J&t;Y1rUpU%Gln?n=T!#@jqH~>%!kWz24Hz z!zY>1pbt%|3dk$kiP%RM&1uRX-%ofX>JOn6&Q!}Tx`Tm zyOzo^$5#P*3s>!^Ash;2Y~!4^nb90hER{9ORvj)1DwQn+`hov_2G_rx!=*;5jfe;z z2eDq;++XO zhm|Ua*Yoel#Jw!y&NG{iv|`{@<}QC0X?6RV4gmSyx--d*M{`%C$OXCNiWCEI>%Qnu zYDarwlUdoZ57kyLv{o;4gaK&yL;p!m|4E@uPCSm-haL4piah=7ycTGrz`1tNg|1<; z_mDGg*QS`Zf-;{dUX51$2Po)w3()+x7uv;=eozqNkKzA<++Gzlk)g-1qU7+V5h#}n zT*V~Z;Y#-1Ecjq{xUk_aYmc*d||iyR3~&2qK=#S)J%K zK4B>uQ-^5PXR~RB8#&D!y|y^(7x8IlhBlr%?By)3A=Bm zbh$+yDR7$H7fH|EC;@=aUbtC=Q$x8FY6eWEP1U~GvW;BjMkadRpoTM|BMh^-`R3bv z6jZ9;5Czq3O>FuhiE=cKd&f;`!i3Ps(=mLO(UyJV^AEkDZqEfr>KU{LmLyHqQ-eQ! z**a((-+O%84}^}-_e$@zFEi+LD<}tv3P|*F#n>Ib|He0m+^oFocc|DH`1(~c(^C!R zWSTvz?U3Ro6Xc$VwVR zxL94MXm)WaWJoHGA(Vd2z93I?bw#pbZ6zaH} zZrAXk?t4_8|CS;FF|K0&ZSlfh&cp|L?Dw)6 zeUo&du!X^^WE_O`Vrcuy&1wZ&iK3041Daxb*@k4Zcjj>snY@QFzGU9RrpPi2qE!9l z$ksitBe{2OoGlAGAQibOPa!;%YZ2wT z@%%;ALYjhNL;CFN=OLWDuZU>HkIURkCvT^vo}cS~AI%cqjr2=1=zT7Df&R&Sn98Hew9(N%YeW36N)%G6jFABo6|1Gp=DOb8B-F{bs)TDrXlS0bnm_+4zaqoS>vEjC ze3QNBD2H!a+vcN0sJUwQ#e> zDjRaIS*+fWCs$Kz)X~vC2}@9E)b<&0RW007?5$u|`(iU&MF#FRcfJFA6;$;Y3Q}7o z7Apnz28XbuUD_L3TFULKSC~gIo^XR2r?WCesY)&Nbug*cYN@`kwwa~Dq5e>!mxu{r z;wt!Q5Ry}n8>k6dnyII`C2Jn#bxPMs1$Y|NNXF6(`qUnTw=!*@*)1#1PjmCQBDljo z(7$L(^Eml+7TwSOvUal1seRq<^gZ_a&V|o(yC;0#b8+;m`aKq_yr0v1YtNM5zGsj5 z>W|DlGUnT77VL91qL)WGTS)6yC|cB3IrFuFU-V5y^`)g7%`=2Nx}%D4Z{^h*sDc-B3LcG*`yXoY`VH@7D#Qj*@En zv`;tS(xvRdJM-^A5MKLd)m~g87ivus9&wIeP8Iizt-q-2%fV7#)LHsDp=;XoXGzBB z+csRUpAE^ywsL_|aZ-spl&Bs$Id9tS0ZAXxg590x(h0Gd)Sy`-3wGIS_7x0oawTXM|YDEdNl8gqj-=&7AAFMa)g;->XS4H)(Wo_~nJgj)o zdt9)`7Uc~94Lf}q_p=uHs#d)n1Ek$O7i1p#bnaZ2(2k7|5D+BrEzREYFzo*{^R%~) zp<3Q+DT>U`=~eJUwwy!bg8j9wL$vzYo!Y1`CHp_wK|<;N^gxi$kFiXDmEkS;Y5t{PIiO?wJGP(Fv zAon0?nIR6C!hJ9zh@2mIIxM2beDYk#;IkcHts)uYTg@{0S7d?WA|Hfb7D!u6k#Ik%Z@q*%RPH!FKjQo z$#Yl^w4Q&$mxMKU%AQ&({@fPQ9kCo4t0Pz3Gvu;_L0()frNhJC^F2GhRqzLh6)^6W zu{-+-ASiNdh_-KEV#{v@0dL<$Jn4Hqo~FHM*}HB)L*lyx+od6K7hb3fx>}6(+lB+I zr7ZcQML_KyFt~nNGrgMn{ypp>QdVmV8cKdhG0i>A6Q_-H5iAf5ePohy*NNFuOCgE3 zPC|HLs49$-+7aA+cCHQoI##LkgLDEu4~&Gu{yY?S5|F0Q^1g!60-m4fgGyseb9hSGzmo$|KS?!>v3 zr{k|l`y%QnUzc6QvNb@w zK!@)%SWSnsE$%osPsWBa?!Wki+-8M28p;!npLY7M(L^)^ibBJ^AK>k2WOF@8Y59~-o19k=8P^r}s zVck#=7-9zc5FjCs=2aQt{$s9h{gx2C0Vg*L8ZT=|$-Qb;rZ^ zW;td>HaT(Es_C6t3P#*qZi^~2nkRaX2y1O;<4=u+Z)pyy{Csk~rj!%-O)5V-u6rWI zfJj>=1^4-4_+oZ;1^;|nWlT5Wl)Sy)o< zE)-x4m~2`%ZSWXJ(uzlVV(+jG#nKiG-AGY3rvor2|G!!RcR@fpnSgfOVZU(KY!I?= z4gZp-ZYN_`ysb$cB0YTCQMdt28GeNiQJlolG@bz1r~>7G!ceJnb*I(ag~JPAz@)tri3 zW{fmU;wGXw&~JBqZx|Blk{}s$-5wc`$Uh~=>ik`t=6E-Dk%Y-(QsU@Lt0PGIMzl$; z)zgp84=VO)G1YEX%l5cJ0loPWQgvI($!spB(X-Lt)nAX7z?ARx9Nv4lnA52`GTGZ7 zndBeH6E?xUBKX$!z!-YU#zcC(4MA}{elOgnVu(j^Pe1_b z7F|GrzKX$7d0n`&qwC?Y-hLM<_nTRLWwd}$0!$D>zxtE?@gWb*`K=NgLxPFa}qUSCHwKB8JCl znbe6_=A%W9{HNS|^()#UMJ$rK~f$ghDTt}R>o@{T+V zf`2EI&|J$?T1i;rEaUiCmf4j%BG0cK1Es2^%OAU}j&zo24|}cSdH#6v7txC0FF`h4 zS~ayIMv(k258xNUCmL=N^?pk|zi0`(+4cG@PeCBh$XGT*z&&~K+`B9D1^1^8*Z0k+_r8GFTq{TnR=z9}o>_VDlgocqdDr&^ZNi0D?U(FlFb_Lq*3l z^g!x?rxZA?+qSSD@xZoA7V6PNe&E-zp!R~yu3i5SejbOPWECj?E-p&$EG|%-v!YZ^ z$)GS}=A{FNuDgVG7;}ey21ah5CNj*eh}EpxiLi^g+tbkyb9@;Daj?I?Bp=QaDDC`; z2KA1=9Lf0UU5*^1t!H1i>u2hgEt4X%H#tG=ZhX`FANGDE3R;f6vF(n}NT&auh*$`h zSG%tjp5xWiYN{_mw|88;FdF~pFqQG*Tz~ox5S1*0fmd5=K9c6Syg(--0l|bn;VPRn zjtPYc+&d(wP$S=@d!St69y zZ1MFQ8LtLpgYr~m73d(Up6BCLJnUzZ%=$up2ip)bESu88*8nRw#E+SM7x~AGq{}BQ zs7n4ldl6dLuQA;OF*-f%C^=OB-98rYunhH|$EcU+=LgJtkL&66a(EKPRB@kue)U?e 
zMcoNU#%@Juy5PfHplmEpf2!|K7Mqwa{j@ZXCRZ7fsO&tc*hqJYL@c9POrtSdiv?c0 zzPaqVW$!+Jmx7(-cGG2A=P_h%as$y7pm0$%`y1o_J6_3`3>wRTa^l0De3mzlBGevX zjz?1f*uB(cxcM*}gt3tT&KCTj+mlrB+ntbWX(@WdAd)CSyVH}}Jr*x1Wx!qOO%>$Y zmgdJX!&E9cZY#YrOISe&zMDuD&(C)`Xw3vperrvYW^v}r&Gi19LQ@&OO_s^T`OCG| zj4&+g{`B^*&lbKmp~waFGMuIE&`cGBW7w_m?ThIQYBG8;>=U7bxMZ6 zwE-H}BT}R)627Gb`}r(4xgY#~D524Htf#zG~?IkS793iTz2Df;?}+0F|dH`&Eu& zU`z9Nr6dq?IUlxUv)3yLlQ)`fs&p_hES4O}^d9BDO|@%cSoZb1ty*?M>7$lM1d_Sg zMXYQEcO{Yk(noMY8H(?`)&`>NsIWsNz^Vg=rTK76U>dNGn+s+MW?iMEP!_$=4mEU# zy>-12zmz8=UKFVu7{C1>FEURO$-7whW-R z3iJhl@RRBu8LjX+ACGFoHA&bY)O?~1$(@VK95+qx3m&F^tiy;*Ndhp%VUEl^nJcau zp93iDe;;B=Z6gWu0Ys01cbqOro=g_0Pp*|DY&yK2_f7WCgPbG@dDFoljqC{1@^AD` zw0YU_ZLNSdl@_=K0P6d{Z`uOzBV8T?ej&mzdS4pjAOAei`|}%RGPi%90*`c>{r>8b zRdSf*m8*$H(z7fmSj$-$c#;qPdFO;k?yK1o!8GtJ3BCbGIRE(LC#?0bF6bZ|$$h2v zeHi$+$n$rgm`Lc)b4-Ll@~F5D=A) zbVQ1wSEbiL=paErq)Qj+pdd;|s-S@M{bt-}pR@0I=e>8|{XWQASy{=NYt8ombBys1 z-DN^G;693cOzCZ6xejo>DarJv@>jeY^+FGW0 z%z3|hK$TC?ey!SoqbVpc{oX3Qi5@$ArXH`J7Qr+cZo+@T|KE}OzgN)z>wPL{g+c6j z1^@B{8lHF6+U*PsIE-R|P zjMM;0QVH+?eIf}zlnwoE*IDE~`1Gcik;svlTBZ=iwaf3^i%TS5HZpJpK!7uXt-FD$ zpUgay>)<2mfJoS0W$TJ=PJ?vI^%hL!0q>Wv$**L7a>ekKjibD3+5QV*Eot|{s6Y79 z&Nm8tSEo%qx2lc=xmZh|8*=0UYrxnmW64n(HqQnk&0ZS&E{iUX-BVH7;cd70Xo42;yKySMi`;VRbOn5?wm$>VD*K%qbxdZHX6~l!l z@x4k?J-_aX4bY<$h|hI$0)fp!5_f#Evm8<|0$f%Z8xdGcQz|lad!bO?MP19a0W>%- zVMi$L#?0tFCfnE~-z(WI^M)w7`Q3Rzw$w)Mm_5<>6@+6LY-g^CmR=3oi)C{sZy=v| z`$E}|Xe-Nv=0L9m-=kThoHxmASEgef zPu>96sZ3eb<=Uju+(EMDq64p;ie5`iVGEU;!=}y#!~_=-oHkwG{}6J78V#Ni?tF^5 zGvZcwrZUQ>St9mg$l_W(E|u$fmBMYt*!IdAHXQ*?e^J_^GEZ4Y_s?dj=UG&-FE>f{ zI%N{QVBHfh7na!e@(N=*?vQEhcM8;Di=0I;uHqPnCQT7`Zgiyd=xbIJn`?t6y0u|J zbO~vNUUDoLMcVHJr~WHVp@S@= zDl7dk)TNlSrs6P} zBBzV~-H2ZGG92H%1`~AEHPc1$lU1?H0-uqCEKyoBJ+S_X40&@0LxSaSz)a_7#~!$P zm*KW#4JFYI;_CxE66NgYnsbJ5x?#*PSGcK>tMafGSoOc297mNGE+eRB&{OUkWdh5Y z{W-utr#r$Nie1xOpzGwCGtpD%>q2Tx0hvra4ZpJcP_|C#oAi){N3}U=4o}h9ZtguI zvu#>*HX?=duWi>1^U9en#i0xZ*-7D=oL*HUTl&Q>q?E-)D`{hkwPUZ-@~-2uSIszM zpM6h%G(}mm!f{cdd+DBXqHn4e&PWffz$J9g!?dny?55VWsO-Kb&;E}6l+_rT1rw(W zI@>{%YoWBR{frgeStj0A`Gi}sbgQvvO^+0*SbDJM4j!%>Ga|N~CYM9)&7y>O2lt#M z)eWyH>-<`7fa^6jtm{;I*jZMZ_$x&*c^2(-KGyfid|NVEzc7+ttROCul(Zq=Dy*tg z402jG+jYeixM)4^WA2$g_Z`VJLioYWFtZlkopW%r>rGwATNRu6Y#LM1nEb+irtO}3 z&VuJ5*bXAe;4y}?v=*yJ(?zNWbNQbR*t=XKA3;sJnb;b?J0&u4E+(F5fd7!RYe~lV zH}RHjd!hFL7M^Hk8{rl(n`&NUoAq7vDsviL`Haauo$}jgJ9@NfF14dR7aALNgQNL1 zk_U6?`OLlxJ7qoldlBeb33ehF6Uv-?LTUM0I#jO!bc?NF3?C z%Gpw=_VrqB?mH;j9+9IpY1&VZ*yABEu*t0FOX)!lLU;UzXQs_WlQ(D}&xeGP7f+pb zACwA(I^5j~@J+oxQY5p>@wNID8Dx*8;8*dS6yq=Z)$jUaexGFA4h*6aKAK-?`Gdfu zkt@S&5hwn=Fud7M_#oM%K#exJRMq@bgaJoz^gx0)ePy{b|K?WtLSyKw1T5S?qNj3z zG)1-NN6V}Q;@9`kmdi2e@KuJ?>mQ?FQYz9si)n|_>)a+EQg;u{9p?{i{X*J zeqA!VU3otI`H@gY?1%DX@isZc!7mmwnu=#hF9fe=pgojx7ScRNbx}hekaj@?xetK{ z(DgZJ z(C7_WK@%PtE9T~`R^kb#aPG{nn}J?Cl`B7vt}ypXeDIyzS2=#B(0O^_ui7LyT zG~SrgYn-t&B$GRX*jG;+o_{z`(=%BV>ZSZhkz=>8xDBLePaYK^MZ+cGPt=wmdjmj!lb~U>e=#HMsOSBu%1to2Rv??<(7J6hk(+1qt~m>Tymwh{3tPCUh9# z0N-Dr{@SYi`FK@LAvDMb*8$QZC!n*e+z3_y`nL^*|Epjly zNll4c9|b0C<9=i)n6~j#Hz{OkU8@vXe}Ngb8~saD!HB`OryAii?w3X(QL=BFIILs) z`v!gB7l4TPzNNO30uVB1c~9{TD}>_Nw~T}s_`vuI zR>c6M*Eu0$t@uki=A_{-Q#uuy^t%LD*LZN-Dz$*&0GE{g+~0T4GA5ylFW6K#?03m@ zmVOt_k*UC4KQEy)!;u4&abjIX{E%!|3DT6~D*zT}Z$G&W@Uq0eUR0=))?&~f+_xA3 z5*$Wu_;tKYK*yj1dL*9nV87*mA{XR@GDF6tes7_Ex$i;jpb}K!?+of+T#A7wHyn^m z=-`HQ-+nNj@W@(i0<@X`@i1H*-1uo+Lpz$Mc;BWzzg$}1eP<*_o;gu1Q3)zt2F@y3 zsiLI7m^edxjO9h9kzpmg`R1R2kobMMKPFBGdV%GkL3D#fsW%*9VmX7h zNUKg@6##LZB=f0>26W%JV7~(_*z$f;kH;)HF(SgW$Fs;NU67Esp1QEkL?aD3XmYuV zs)&!D^x6VODrgO;@#kcW5{~`d+5`3_Pz&XvQe!VWHP8{ba10-m^U_#Sx)L84N&hkn 
zX%UMQY*dH{Ur))z>Dv4#*1ds%eS|neEkUH?KaDnI;sZ*0@oAcQm>hP!5bYaos>%|D zwiB+AMD2;4n)L=so*&?vdrU#v^?ItEO3fIF@5-&5knIL0k3#o#6PPJc(uEPdl&}p@ zuXw)mA?QBOx)z@AxQp=7L6CQ8{V2e)729_8j@-ryR(9s=r)-CY&F&YExXS}M6RAr- zGa2pIfqbGqa!@~#o#n(Dp+M_3D%P5GFM|Eb`S0%xX>!-Srlh~Fm)ki^uN9HK!3xb= zy%z4|RK2|t9K4#QLAkojKidRDfw3r6B<;MZKV-;Ft6(S%R7%`a{Edc;plJ};En;${ zdGIV@rVnf79&}{^TY3AyuV8)q()y{APs4*pciAs!w+@ch?_z7EZ0sKgC%2NY`kYK2 zB+oUxm8&pq?IY3H=-9D2gB}Gt_OqOujgzQM4UjotJyus`s4bJK&{aqwKOm~^5{{`o zvcJ59Zj$xR|9QspUMBP5-3NnDp9YNy{Y-0cAy0%|_G>m*^mi;~tRo2cZgPUwnhco{ z-nISy{W_c{uljd$9FZDzL}*YBYmgz81(h6#Q!+J8JnqZj$FCqt!F46*Xhl)bu9HSq452CA#1JQgCJH;+;-VzKDGIYE+p^w8-L7;h4ybsq`=G z<9w>k7fdHkPGVH^^urp`(>^+;4>tbkZgx>)e9%X4=@1RS3UVE%#TJqi$2WBXt4CvJ z)&y5QhXjLT`YjOOj`m_IA0|whQWzXTc?E|B6`7-70&tv5mH{i6&g9vEcP{%|98h%&tUb(`xHYjffL*=zz z=w${)ecREWF}P0g0rom5H8a8{6$Bl^qINFDxaiQ@WaznZd6#XxNy$@>_@vcI&XR}i ziHA%J(G-doe^fisFR0x9d1s8aQwirb8Nf1p^CECo?vdG#nxw({32x7PvLU%gm(`bl zWK<00>lAsX+Cj~WZYC3+VNTj{i+O*TeeXaqkih=cl+nysAmS$wzan-@u`P~(vThnp z^O3OqkTVH0&atWuCvmS7daXvm%U!%W^ueBac-zL|m*>Wv>e+GFT$1G_^l{3%{O}`% zE7RWxkM9JUEbh^S$ucz%bUQCq4@v>KFO#2eTR!L00PHe`s=S%<5*Z5w3NUVvG^Q6^J+>E!(Ww>*PJWbN9~ zT@7VTwS9USxS3>j{{eZOxw1q1=^3hXwAMn>y>iGMdrzyLoHDgASGM|uIPK=Wye$3Q zd?K99=uM_MOO8{OgYkDmm5X!EJtry$qsSLO64sSG1Rj%%oDTaKsbl_uT9I(^1{>YFT5u?l)hBwMam;r{Di%FpON7RnB_o&_$LF=sz@XA~t+NMZhU z>tJHIKLeK_fk{*1&0kT)trpFTHyjym7KQ9dRYcUY6hU`s_RTQOv^?9gwXml*wMe;g8RYM}|Yq zQYxGW)?vwBF;02GhM^Cvl6VqTI->k@sekjuMt+8L{Hf!*;R7yyQvnYQ0?JjaI$EJVf3JR z*!8vdU7RkRLJ3N?(qq1{E=?e+Ov=w^EFwC+xGXmdi}{)7VCkC`|@(*;gdX}gvZkUwj`eOS&Rkn|o( z@?DA271+-01`WvM|D35&L`40pXd+UvL|%Cm&8 zhbn))tXiSBVn{5dB3=f!YQWuj9q3Xh;z{zDu|jGx*w665ini9_Xd;m=Z0lw^R&mxw{CE0rlVKZZe%46-QvZ}yRmmuOfkA3kjB&qdO=A; z^j@m<a{*QJn=F}$AkA0Q2 z`IiNM@~n8C7nII$OFO*wzRFCUbm)_0cB((kTxURX@IdiA3Nv={OQ22b+04YV#(RV> zS}ZU`YF3sVh2Hxz#mo_>-xg#7DI?#vO5KxcG_pl?cBiN?!(&dmhrd!bbAF!EW{E6Q zYvyV-l&}`8&pN0qV@u_f@0ku>WH{-Lp1PD|7JVYKzFHZY4SDXpt7)yJ^;mU}jrwWx zEs5TO^kNc!bG>x)q4+(4!E!0`iP8=sQ{`9hJ{i)pB9ere7Ok5l-D+x7Cb|Z#T91Be z7xIudqud6hzFL0t^hK(5o0uTM#0C6GNtM~vTiB}Og6QEhn>hAx zxuwU**^vpCcl>!N*CNYRnKgRDre!;;^x+z#$J?2}&!Q*d2i}9JziVC6c z38kN%D1M3K#2dR9FD7jZDwFcv2hb~Sn}-|n`DhnQz>$U$hfF*|M7n(7Rid61sTP?^ zcgS}rZ%|fL(gd0*$UY+Aal%H2DZzVp{hM7!S&N4VA)FQt)LDv`=|!Q`3yAZ`_t#AT z^0Z5QnGXEf$|5LXvy=in1O(Pml($_vxilSeh+mO1hAO*FB`%li-l<;H)l+-8n<^?4 z`f*Eo7rW|;g3p(60S-~Z@tbsgSAtNHO8AUAe1O}U*e2;I3`jO)%>t~-_IDr+^%?yE z=MskueA%lN&AB2-CrZWG^sQX?S|}hh2%pi5N&f`5e}P;j0#(VeRH%h`zE=~!ZsHAq z@eO|Y4g+T1wLWQb?pMC7YnZ@@Ow+=Q-pYWN{*Yp&3s36+_2U4#NP59 z3FFQ!7@`5piww}b|HMfCi?!hImV^JnJ`ChMz^BSlB}Zxim6+ENG{3RG#xfSrD~xpm z0Iuc8|LR<(BL&E_f)IFIPjCMXzk~DtiRJ$r`!?q(!)K=^&cM#`9j9Xs1VAFVOSLvlb2pU zL{l#7+7OP^7@*J3@B()$Vi)tc9z2+Qu9(Mv-n;~86?y&0l%n@RDd<6_>n5K9uQO)g5irgN~VXGy-UQIwsYrdca5hW4Bi z8>y$t%X}S;GbI~K3bIc0;KouHvfvtvX+$`P^d^BPS`(~_7KBGQ>E|#aBu8{(;H~~3 zj{ac|IZ}IyXNp(U0NMvuM9QylwE=rGN&KED=k4;>naiw*6tqSHI`+M65>OIK4tGVs zr! 
z2$><9IOjrQhrcbkIGs*~6|u8JzTWT|!;#pO%f7H=UY%CrCFpZ;OEb$$)#ItU!JFZt zsy3j^F(povc3(Cb&L_b68THLAtsImg;g^5B8*ERo-vE?2zs<2y#MDqg(}Z zQL3E;^Dn-86j$5J%v0SuwpSQ;f1=Xw?{tIv+nUkQcH8$7;Nua`uCYv6q-<}`rH|uP+ z7F5-O-_`4PZTJ8ulb&ysU|G6G(F+!cQo!I=BJg>%P9LYPh%MYI(ybN)eKon2M}{Y7 zx9BSN0<#jxYS|I(!vta@k>=j{xs?hBfm}15P+|$IlTIIR#**Uv3y5FzM_7{kgc{qV z*6VMq9jDx1Ke$D%E1ODU-}JEcs-=#U?xDuBBpIip%jM^^DENyObg&D&72!*D?;f4+ z&sndV7-5G?S14=-ecY`Swp17U5;@T#-zLdUH86VW2kE(@#)7O%gsUhf%Ndnm^hLzM z=wSoqu1$zLP!AFr9blO$4^}lNC(2-skbu(BVODA-AHsgB#VkELC&;zIs`1mTJM^t@ znS+rF=UenO?ZegVz(Aj$10fOv(l+%2&P@6hX`a=|h>K~CKG}~CwCBGhmxR7Z&?g_z%c`GLrHetJ-WR5W7v&DvYBJ$28Wqo0=&tvT zwXg5&2NqmKn}om130!uS$EF6$IQ5>0?#QW47+uw zNJPUUXY2G6ypevaE>{-CC?YJD1?kMF6vGl^s zWYtoL%q)85A+(DP@3>w`i?rL)l!?(oDQhGAih0ZXHSETjg^|m*pKR(@^G>C|J<`=< z?n0I!TrGY zTv^)T=|sxdw!ro`HxeiE{m{Kb;XhKU4dh;~Ft+>fyDi1G-aOE^Z+~;kI~=GYo6$+T zX|l_X%Nd6Mh+X%P$~f45jh0lF;{*|(G-0^=e-Na|pPfI0zR{1_9UX;8OUoq%-pzJ{ zifHn4eU7V8!CGt==Z|j0*N4h|k2fKqZbCf^X{E!u!~P)PksR??#)-|kPzHWw7e&rS zN^{9k);ujLo{tq6(Vu;Nd&R%6{d#^3&&wnPBP%+VZM>jTVKfuDakwbs znC9ebO5KQuo;!M?g%JgPx(~TAhL0cfDl|J{cvf}h9SRJt`KCp`4qn%NWj~bz3mLHC z|(&B_-;-eKKEpDKRVgN~NB$2+Hlcal{H#A+grEhj4bBHiETba3Z16 z&ebXRwYMy`H&7Pgmj6#y84p zvX@?4bFscY;Ia3*xl6JCizh$s?N$oa7|E2~_?%LBkEnjR5G(t3tTEcMAf~#goE{RM zM$pMIN{A{r(kpIsAW|4KNf1JY@RLcbOy!Y78puP`B=|!6y^6QwNSbdQ)Xla;JHIuU ziwt4B=DH&o2&>CgOg7Z<21Ksx>!>MkC-?ER{c&nN7S+$BcR`xSul?f~4S*4{qC~B|@kDqkmEm{n8 z)O@Y@ROs9{AM56a2X))E-xm^1w|@k?HrQ*}b4Dwu}96g+`EYGb4Aa{5-JbRVulGf^+FS9IGbQlQ&%&RqJH9DhNUv7OitWkpyU ziB|#ttE!$U!)DAu?RGrJT|b4n-KpUaj(9c$Q}~%ZSF;IciXGZuYBkwXWrw=i{PR*~=h&ItOTIS}ur{6)NB^YCS)_IvFNNxg z*+DS9-`O|hLRlefW6ta&qE{(EDoRy)%KWkA<|w1P0yxY~4Z_TKuf15*j*QyupUB;8f#&WV$~p}=^k#i4%81E?&DxhM zi!&bk`uL27Bu2ScCPk_qy{ISv;iQfNuC8!|i#3>{0OW*)esDpPeEd05itw!X-j2EX zBW2UnE9ykeey;qozw*BJ4d{KXq8s|jtGflo4YCFeP&iM8onOe#q?u%Fl=X<$efMR! 
z%h^=%Ugc|?ymIEFZ#EW-yg%t{&ktVlE>SxJ?fcME?1|!} zv!C4TsUEV?vV?C*=iYeSd`oyBWX)cG;-&{hz+T7BXP3?=FEs~U^Ch^B#}?2w5+7ig zW9vq{+X|{PFx0sLp-yI)`~asolc7)~A#f@oZOFqt9o50pyjB;52YQZ+I918XPhYFA z2HWlgZbIkmo?nPteG{7W!_t=Yr^F!nNV8yYDu#eZ^wZy@dCOU*3nD<|VO-O84-q z=u#&&l8W=Js#PsBpx}l|#1h!zi|rY-=~^@QTyvN;xyjN2w z)JM?#3zxD2%n&~>pbEO+1GbL=_B-ElG$J=-LKZqj@Tf8WjFU zt3Vv@gfs4|@Koies`{1VS}N(+TQISSZNocaZ0p*En?(la@bWu)6G4D|{%=5@tX3>b z86`G^wHUBRmaJDzMC4KMaz%73N+2bEl|}_)i_!4Tcyz3&J5S z$2=DhbjztK2u-qr#%TTpB=~tzrhE?Yv7-#&S26M&GwNx8MS1YcbNyx?ELOWkJ;2x) zV6}^{B3D9Au3-(pN_ZZOED=rvVKlC_TS!}?O-Q-}&_0mRp+F;a-PDxuRd(Z_FCSDv zQ`A4N11he6U4Bap9WXAlI(>B_I9{(&Kg;B3c&W1dD$xyH>%t$C6N#34dEflr>FYo9 zESTbHFztX#W;+yaKt|8UBXu_*51^(!{7OpdRmdvj`=^Wg`i-QFr2Mv5ujgSt!YB#9 z#MoYC&&$TE*4CTiJ708OWN6oK4#x8z4k#7H;D!X%HB}#~{xO6ng1jH4fkB;z5bwE& z=h*}%JWSiLX-$nFpv-G&mtL=1bsjnTi&C>NjS#2z8iah622J_w7EC#;wiUE`h!%TM zq9h99D1hObKsP2YAy|>96EXk-o*L#ZZ;_XCGC#?yQ%z|{>j-~0SBmi~r`yT{eq~eB%^zN0*k;NV9I-Fa(BMzA< zO!mekYgxQ=jLl%(US&7%Yz%Dw zInNp{>Gh7$yI(yY=UEV>eKdK)cj3UFDxh`6tBR#|Ru0$QVYQ<-dsaQ?sT8+$mH2Vb zXY7YhGW_eSPOD?YDsh2^oSeLYdO} zgLK$RNU3wnY!*l^vCQ(Ji1$-=26=J*7LDonrf!!LZ(UuSRa{&^&U`VJ&2Ogk2a{6X z;ON44mGqov4_7-y{E9?jgio-|JFQs9_1igTb8cA<`E>K>cPS9vE9hI4DC+iMP41EC z_QWB-d~AL>(ch-0{BllmjQAGOs{}XYi8Z)@mXhXx9+i7d zk-y`!QH(dR6;kPD3PhT_%i1K6p+0qs;si&7Lrrv2qFF5t9xae%`mskMkqHjgPr5TPPgm$W|~~oTd6W-28xq9nard> zJqtZ`foCaEhkb|sh(|UaHgIiG*sLAqr*#r5FBQ%r1t(>_B$ry89 z)nJP$qAy@W7b%t7N>Sb4;=xc&mvdU_x*T)z#$QT%wpU6gN&!iVye5Mp-V~Xq5E-Q( zkRa#|G*236PaccGeRjF~+93QPWI_Tm;-ad0J;A1GiYKqk3Ymhjqr=8%a<#tjE)=$l zy9o(A6|3R9G-h9} zhRmwHP{8h9vh}3X_+c_VF;tg<4O?KIEJ!EPrR#zoNJ=$`vo3ve!O~#=w#rDZduwmS z-7SYybt|Uje2>{v_$PB%C`Ga3QZfC5T7xhuClO<5>*D@PWHpbKcT&~4M-m;b7hI|m zfv~NJJ}dq>Je`XivKJ+;L>hhya;+CNnOB`N`yrqs<6Xv-z<=w$Fx}SYp@|LU1vYt^ zQgWRgrRYKuoA-|%u&$eG{)*h%`VrxIvyS|5QpWVQikY;qPiE(SDayL&eVnR~rHd&) zW?m)vbzArPDQ{ZZ+Fj!sDM8ty>%}RqYT}jr6=Fv$p*P@b#wp8c)6`EK^M-?5(&7?S z#Uac^6>i4G5{@4}C^26gXkgOAGP&nv!@=ARGX^WGg@mfgC=?aJHM$}t_Rj0Q^c^=5 z8ZKyqR;#r79NFsA^gtBN)f4g^|JN$3G;@Wo!%7MT`FohtLha?I{RPgHNb;+bNMTKR*Ze7EuyAlY(LbzLnr z-y=64?H)I94lh00E!F)n{QAasRm&0S7)6BB9KWRF@hw^&I z&ePDmyEFGZ8A8k9_kN(a^`vw(;L?fNc6b*n`9~msP;AmslJvn~n3-avIuUJZa(9zxEObkaM3s>8p zUp<4<(A*&#=u(Yi%6>yOZg4c&?|9rRL=^q&v3DCw?U5`oBhwv7qAWRmNG8Rn4a76z zI42{;@rO9x#jyvyq*~gGDuLzp9Nx2jpVniNuEwd-$}~zXH*zJ77sMbuOz(SV`<&d` z*=6CXx!BKke8(m}K7)YVQfji~WcuxgFUf}ubJ>??ylwQ+`(Koe+sxJ|7Kte6dUh-8 zrJ*~-$~!U{&P;2GCDA<_Ig2sgpEdocAEijTt~?z&&L|a$dT}H5TVIDL|5_-$@m8mH zmoWP9(9xe<$d^C@Q}|SJH~(?jE&+SfolnSmx0t+)d)(0MCxK2@jDt_$iVR+!lyXcH z@fcTb1HQY~_nx68IA1K@F-?&XuY7tYaiXk!#Zm6#lwZ)a$?M5BlB>5nWC*|c*u9a9 z3}g!USmW*6;VQ&?pG8My=Kex^rQD3V;nX^7^xmSz>!}4|XmN*go=~hPOs8Yn zEHr6bg*DS(vInF1gYpM&-(WSoJkaA?>^;uheyty$PksPNgJeHV?eMPz6;YQX9&tCk zHdwUHZ_sNaG5X=<3XzZel`5RLCwh#JJ99kvmcUEzo$q#oSqlu?fJ-{BUp(m zY>WV@b-qz{ELe^HS0@2Qy*O;LXvJc@?&J#N9NL?8-?w&*-{!~n%1VBNALlrQFuw6$ zT3!Z*nNy8Pdn}aSYJcPNBypz3(lY+XHMdrhU6$n2suQ{GiqzQ_3RD_dGF=#5ZggMJ z)fHRDdIBFK8L#5YA2&ZNBr^d!E!+X}i2X+W`oWMk#+Ad0mdizRMpfZq(&q(qvx4C7 z`5fkY?)FEltwg7E)YyGZ@viq|dN0cPJG3a39>f775eTHmfCwVMPD~jD90-3y8=&np zGatmNBydY)b6mQW50Z|ZQ4v3{A%mJOKF2Yp_Bj=-+2e+BW0%D;aLyt*)@epVq!^a5 z5trNb0}jQ%&c|AHDn$`#O1D_)q^zgaY7pbq);0@+82=XMOo9(pBKemarS5DR;WgN~ z0XbAYs(4w+HGZ>*rww@8g)bt5qT`*yO%d4iL~tTf11Jsf5bo;T0-^Zf1i;$jjiIKx zX-E8LbJ@29k1&)m^rkJ*uv)$Dhzni;LiaZ={R^M||7{J?mQY4&N5pMmO^tK-@+E>2 zl`PAvypFEr8dV-OEYOCrOo@%91MDhmxx@~H7w~6*2szdL;nx6DPoX{=(viVX)8GX4 zG(1|y@A11c<^OhW6g6$@16&mRl8S&UNKXOkdiec6J`>sErOHc+($!%DkNJO07ys?M z%ufM1z5yBpVIO!QUcx^co&Vjw`=4o`ZrFT6608$O(1@K;24&VE1OZ)m7fk}-O9>>_ 
z}*$JT3MWU_}e~qna&u^S$%qNZShVUknYx^gUg8vZ^FfRq%ypQNiWUn%8=X_t_h3 z_^uAt0&u#-X(lo+O+kY-f6otz9!UZE5Br|yqyFqVz?~HT$nlgP6j}by4)vS*;|ua~iDH-%x^f$-9-^k73Mxa700cP%DA^l0=qu$#p@n!^40Rerh)_a2g^6nE5k+x8dO z%OL^iM1WEeJf_{^stXwNyeI|(^6SY#*+8q!Q#bCMsVkig`flF6NQoPRywVy4{>nT{ z&4<_?65GcL^NFItC&2+A0Gq1l!2VeIAYG*{q`&s&Lz_j*t-9zzZO_V>PcO)=Pnyn} z){lUH+0(`o`joSd+XwjtJTvFb)IC3E+;C}+j308ZT=^IUlDtt(*KavFx(*Em?+0PD zOIaN5&EK@1*HsLb*(!(hC zYt#t)&hJ27$z& z<71uA%0fzh{t_l)va!x&V#1wVE()`=+o%G2j3PCXVt|u824}!!qM!t^t4Z@BY~3%? ze^NvS;plbn08%2h2F8Z8AQ)SSr^*=LuHDiMe=pAI_#hREm|K;2zm`jR`sfLE^32z+ zsCrVZRAWAC`v@en{ZJCUD;rwOIdt2*@R9RJvZRF!_Z-5EtXkJ{3SK4S7Z0E1-Synm z3rTBnD*gJYO3+m3R;~TvQ@O@pCA4g3J0IqK4c*tqL*CduOiTH;uSq;V7;%HKKTCJ# zSMU0vQMp6*Rb2UP1*-LP+061b1AalxcYE2P@b@+##qVRM(cCn{m^yK%CH_=hnQ-4HK9f-39`@r`C^>gY}$c&+QNXtN=>r2etENa*n2^6xW&cuM_I!w*i!XPA!P8#px6Q~t` z>T_X?L_jp3gx^gqUOfaE=XUosNn;o+3MK0(R(p>Sr#rh_(5duF4~7PMXsQidc%q;O`0rPZHt^Czuv* zaT%(2m)p-i3!d@n9eKIc7w_)dwZ}EQ**2z{AhV)I!i|)4_kD_F z8|JI_n4L7MNCpHAnS;3A8tLpPV|=s@y}5@dJDeJ_OZR}6wKmhIZzKH{dc(EaOyc(H z@Yp>Wu6plvTEWdTVHmZUHf&|Y0I_;p{ z_kbnizrDu#2*HGeLq6jU5(hQURe#Mb*2K%cYr@YW=z$KRJ4%{Q z)MFd-<%$ojFBhwk9ZXj^EeuA=oxge~zV;L@yzklSPDdzDlCn48uL1xogLh!J4{6c! z6F%>>m`tDZN+M6D+Kh|4* zKTpgr>vJqPM+)_RXl;zGENU_v^~ls#NHRl>(0x5_{dxBe3!Mj+yGz#P&6}M|&ldWI z+q5rnY^CnmKmCKif%xT(x5_0U;zEA!yAxL}lpW4Qhdk;#K7I}x#Ilc0u3M!1AemBI zJt=9K;*PNUayx(BLiqVy{lHH}#hLaUlG!rgmizAX(b*7=eI?mtuWbG66^GZ9=F77@ ztkD5lAUt(#c8*`Q(__plb?y;!<0ZP;}kI+1UmES7tI57k()?fb z0mi3mvdI@nM3U-7EXM>m6gy?}bX;^V{3_?mpOG^?jq!sAGQJwv$uf;&MX!^-PmwPf zdp!d}uc9b76gjH_4a7`+Nvef*jfG%tIr-?7ElNxrs(?+W3PWm{axiC(fa`>)b1%Oxt3>&7J^Fyh|DY73F5Sni^yyr!yiNlSZ;VZ z|I%JjA3*50Zk98?n0UcU2pIZr*_IpNAR$>w9JsB2L>2zGw^|$EzViO-OR+QlktSM# zhifDZssr{DALW&X|9KO)91!-0n}-RYibc?13P0RPkW-}QA> z!Rq(oKx9r;626|5hKzronD^}lC#Y4+VRza;mkr*Wr%=L+yqFr|-y`tXCHN2jq&a#2 zhW-D$6z{L|^tuuE#V?_O@C80RP7_~wFyO{#N`r{WQtcFq{AXo>V?6ovn6|=~f0k@3 z2w|Eg;Lw)$jhHLyubo5>P-xktjJT02zujp&8U2gha({$_mYFTc6bvwFWH`@KG2sIA za=g}Op2n-N>${jjGT@&0Vj^w>SQD4_@?DRHEti;;Zh>usKT3pxIxq3}uA{^y09`fv z*OlkiRj!}^vlF_qRpE5)oA{U_K(|2l;1@F_F0z6=iT^T?{ndF0RRAmUC>rY_$_Lcz zIw8>g04DW}Aah*1pmqU!6G!WP2vVOAAt919D>d?flo(Qkf#?S_Lkg@D%0P?wZEB`q zUvosn$QBlqabJ;_flC7Q#FC;s`~nHD?(M&N2=K5>+FR9=2@3^>ld4t7GJ&LaVNgQm zo7{Y^IwC)`DLTR(lS2V3(2YsL5xfT~KWNpV7_^JTeEM&r@xQ8j0xZGQnEU%&k_>Ddz^b29;a?wsnl(9c3*B5+fp;-Cw7od6OB9n z_>ET41?n)fMHJGnJs4 zQMaW=;Rq|b5)BgYcoUuj`!kdPGS{9Uxu}QQIoE^vG#$MwiN7qps=s&xWhwd#&6PE$r-;6&XfTEj5#|c`0{A7=MOTqiqvEAXOZX zSs(-BlwSrOZ1`ngxgVo{nm5C0J!eB0)bEq#`u?t(r4XD(@M)Qr+{A^DtTFW4B=U8O z3dxLpmyah4FB*8IoOyh1t(f!eD9)9~Mn1R`4^w~Hb}bB&5vQd!;%ihB^mUyzZ=1ZF zS3F+zA#p%*$IIDdFumlTM<>%4-#G~z-)+nkbLGD-STEj2$7IMLeAvR)%{2K?KAN(;qyBK2R|lsSxRJD8DDQU`_LokZp3FzCuqD>K8li`u+nQ8s#Cd zgj{BE(_qpYuM&70{3WjT$*RIbpULh(3x`}noqFW};CVan# zLmqK_OQKu!EG@>G0*&5ldTRyf!6Nu32XscIHsL_Up(SGDf%N@(D(Ua{URz3CwK;Wq zGjb-dwk=e))745{Ajtg(0ZZj2!8#p5#J%&QWQ{z_da;Tz6=X@jWJ_NiwAs7VKFkBD zBs>B;ORGlTI_Fw+FH@YZrfQ$YFdSZU-LX_v2e$ z4b}wr^rdNj{3fwinan49!%2rVQzHa$`J?gS=@XkqKaSV+zkY(1>MLgIQaMMCR?;zV zjOrHsFPoAcK4(QdlaTYuF89V+(9W_wnJE2(z@PPutCp`B?hyM9U7dIa^wK^#d8D^( zxdz(nZ}?PMOO6n)o4aRU+T|Ai8NT>+&L}))SA}&Qzk5%GRcTAHP5)kg-Dlj0_v|EB zv+&S4))v;pz)*y=QErP`_ag9(v_9!Q!ZkAUuuG5jni>g%-0duJPwsZfhQ%JJ;wlff z_rKQlb5B73_Z7%?SD9rOpmSBp8*z5m7Bdxtf- zZT+I5i6TWMs31k9C7^UfI*9ZdLQz1eDxlH@MVcb05$PhvLQUv`0s<-mDpin_ZqMm*~&_N@P7lMl%y z%DX3OtseD8S1~^0U;Tu=`qcK+GjZuhAVhjnCg@OW_%ihn|1J*6k5)V$RW%B+GHrEU z#uEkt*Q(`)`3dIxdCv=!l^pqEKS26zS*G{$%^aKZCY;eg_J34bo(d75UagQXh-?7- zZu4cz%WzocNrw%+Fw{dndCi{F(yQdy(Hs+_XPFF%NBPgQet*=tg*VBsD;Im<*mEcU 
z;iv4Sd|9dm6_X$ORYO*P@N&#P?jS~mxijU2y9mdAGtqfh&mxlQc5Ywka79KB^W_YU z*`Sy$&yr7R`<*|^aQcWPs(05mQ!TA4`F*rUj@!MOyZbX-S;2C&=ZD{#&+@CKfR2kr zj~&Jhjo*5eS(e0F==4e{6gF_+?r<|ZelN(^cxWcmJU%?trdUecm)T|}JXKIfiSl0= zsjZ+bYO=nx6h^%(U*FA?dM>h|xIKVtd3I)PASx^`z_@o(SNNoPK;6*rlET{nj+*@8y?y@4an0 zM0l0HV%;)xeld1U|9MC6`KzDb-}-}|SG_J@2FBn^v>&pa}E zkr_}U8opXd8S=Bh@xPEBGv};QJ<$KAr6c%ElY&W5Q#Qam+|vGq;y4kyWCjr6MYB<> z{S}l`S33$R?3VFA*7;U=HR>f*&&r+7T;U;a>&PF#9hx)npi!Or^v6=g^Mu4iIscTz zs1sn;+;jRdRD}t7!`!M@lrZ~^W2$vv^^=(Btx>_XGTUjnlxQ>ep7zy66_%NvYf2~k zKah&sd*@msmRag?2<2E+VDI$c~F?1JKW%B{~KbvOrW&R z4Rw3BXZr`X*A}}(&;dYTz*9w~EOVGr*;f#!`zU3!OaS&3u-%&@Q#mJe*Src)SRYt~ho5BJS zM7LbU=jewlj_DgPqS=X-G-hW)KGIB_u9(rNRET+HyxZC!yxA zd^b-sKp~^ca)vWn5h;g#E;&V@(r$B?t)*yN{9y3yJ)6kSCvamf*ov5{$ldOL%O5YI z(Uz7iumRqzU_#rfh3I^EtH&{h4j7su#GV8a`Csp%2oJE?nlz2aPD6O1hT{bjNqv2S z-gN;C1TC@_t#S$Ax{`2&prOUQ{UL}d1bWI_K&g~`$cUy8ZkNfY*MuX*D@iO3_L~5; zIbIwH%=t}IozYanxHRY+aj`ObeVTh^N@dETe+Y`PC?EakYCw>`UhOYpStBCicTgX< zDdcvU?`?rgZ@t|VNi*!=;y+a(%e^$GA?Wgk2K=^BBa>0w>JmWMWNrw`n}Crhy?=qQ zfOG!_Y4O*0Ha$pUp;f@7UMoliC*hw=@PGJ8qzC23@D6}PjomiR8bX>zsx1Iv1-}tj zdAgAn04{q$urZrM#<>#2{Ec|A|KQUF(O@zf-u6LMCUVEkfIqm^#<%}eH?+#y&twl1 z$y=T54o!x zn7Jfe1Hk+^ynO=&a1aN1t|vl4V@x_2cQDfK83THWbNFGXw$yL+&wKx-p7`JWAjHC5 zhdlYed&kMaNM1t_wrNYDm46}qAmiqdf!95>hMWvnKvWa2$OUvz!*r#>o+afnXlxpS%Z|^qL(P*81`zFHlaVcFWt3)N(PNdZ+keaQ1Yjz+qlvjurb}ao(QG z`X|DN$_>aP=)Wk%qt$S%YCc-0uBvuFyKRog$IIE`XN2=9 zk}C;;aSah=U5!6VoE%avIcTv;YB#c8#dn0~=U>cl3h3r&&^sd4S8lAgf|#(Kx{?F+ zjSjfcIfY2a%(GXnermmNB;bcsUOSI#Zq`WX&ecy39`$DRW9PKJ3mNX*s_K65uItq0b<=PPZSh$(_stso&e~25eY=ZL zsGKjkV459$g~VU(Fp}lv$8{MZwRz*k_1X?0@fg3=enVwf;R4p=XK4JkrsHaXpuzR@ z=e3oxj+CQuItVpyW7Na&U^1N~^ST}yxfOL6A!|4UeJNdeO!_0%Snq+vUWL}>76m}8 z$>0z!X!%Nv(MEq)##V-iw8xHa)u^gI(CmtjEc#x4G^N8mS#gn>&&q(yh6`TbT5dDN zDzu>U<(v5B&@D#(?YOWhnLiGA6P$xOaMvtF?j(N^(h8~}&)crm&S!@zXVx!1z0z@C zON0FIYxGy0Z7WeDdSXL7d#OHdcS=XbaAiix4%xE8UoB4Ee)C0wdW15%7SQwE{bMq# zWV_6(`)d^>ckbhBq2yQWfiJu%7gyT$y$Xd7;?t?4m9>4krJb*jS0th@lJ1CJ@_cpE zy|JA*KEho_X)^mZRxLEVnEU-;V6B6rr~G*BeCGS8Po9*q2QrsP&BCwt-Ihz#mT`KP z`KDMZ!uw={{@L2A+d>VT>qlfIfsNlA2&8-PoZEpAH$+D~t4(swWqF3JC5h8U5ZV(I zZys&H8ew=ekC&Z(`-)#!hK^8R=8I0qa0+4$ncX14TSeByz7b&(L+khAi~aHyA6|}fdbQw2(u3@rnrLb9e1H%aIrCa zUD7ypQnO_3E5RnI_b7Tj>(G*ISPv(~edeLn9ow1kl)(?XEk8ux*LcOb;z!eBIc5A~ z4dSnC+luprk1v-yE{P6=$z2rig{oUSgbRb63R1@ zRsO;uYOS#rG$UN=e&ZKvNA5t`Ov&wqwJwisI&P*~^dTM|fwQMyao-j>^i0CZK3ZGD z_MKc!Cf`(1^S?BaU)!Y@f zWv;*9F>LQI)RTKvw&E^dGr!j^=FL6Tl8juqL06wY;`pv&<^mYndY`qF?lS&NnzSykUFSJ*b(;@XG3j;fLV6W{Br1QT2TTjs8_#ELqc+Y= z7OJQYrL&bRB)mB}J5Aw`8yLBB)~7wf>bP;_MN;XlNJR&|Y@aiov0qd(qSynD&uE9{ zeFs<25SbWrUH6H36ge;`yh1?*I#6ZwW_N&M_)p^*rIv}+2$pA7UBg(j=no%zQdI8+ zEAq;%rvyn-juUkQ)ceb|oRj45<_kGh2`rpdDeiCoIA=K0Fnp3P!B^fPsQ8=l*1Ri{ zSw{kc*VL$g*m6*n$4jKnm*}6f7uIpqV7E&?CY2ZYq;qzm%<_<~V3}Z~q4pv2v5(u7 z{IZ9BI>ug}wrcmh^IW>nDYA2?d^$-)+v@g_LBeOfjbM(v6X76yByQ}Fut%~#<&@C2 zm@st`^=>`NclTJ{T6yC1qqe?C#nm=1y29naJ{IHJAjwt=KdGk8X8aS*8e@J)J?2Fu zqwIPt^`q+c9}xV>-x9#%!l_H_-Ztf3HBU*`9>`oP7oC-+ZacSjM7!S|T|etyz57+3 z%wqU(EjDJv`sBbgv}yjaY5oSzXF*zsH)E)-Pi80T;kvHb#LcG7rl%Gj97GD+ zTMJwtV3jsPT_Qgz8X7zK!K3WpqqFhtG}ZLOMT8a&e{Tr=dJ^C8K(v6{ns9a;58`?t zK__V3h8UU7BWiXl+6kDd)YdR@biAdiq$M!PhjB<&TkO}ZqaaZ`9aL{4FzQ{VEgLyl zNuz%aP$uiiWZEYRrcP~;h5-TwhZw?=^^v#zOfDOy*z_Ck>oZv68~hi@^$5>q2;=vW zDdrU3LfpLRvpw*D4Y3;G)c+TV)*EsXNHNIg2?WOc-t|6BG|sLR7Y9Fxi5ZFvw!3Is z8SxuDA;5IVHZ~y826t;9l=0!q$3~SKegE&If8=+2VVPC?35(k>+Hty?QksAg5lcYX z5CM!anc>ebG(g2XE++!s!N41$;hv(*AWx~m0qQ?^ZTPcRJ_E%YKMsL@co#rFB7JGJ z4AFQzjhAzMI^@x8envUd4kGQ~ZP|Rba`JWroDf}B=>gbtnR6~)E^BIkfkz~>0W!n} 
zoJ9*GK-6s@TsZ%UH=$ibvy5(&Y!Wi0i)#`#p(7kcVC+tMvkv;0C8F)jv9|zX5y{J1 zsni4ih9XFmA{`;vn|UgU{+&X!(9S7QXtIUc@6q3wi9RDT;gY=x5~}G0V&%WZ`yq*A zX6cB66M%T2(jJGH&gyE4omIS34I7AlqL?@ufUJ!rZ19mC@i?P?7ziRlpO(nuMUTV2 zgQtV~L>tm|*}!MG&)*a!tk=e@AQXIg%oM3Hnodi=N)sV@=pPE+UPD2^RIx(!O6y-c zwm{0fKo3wdL<@4K0PU2_tA||=HMuhx-Q2A%Y3s*nULP1+=3q565>+S}H9KBOjv)lf zzHHsnP|$7DA4}-CvmG!z1kt%68w8PB2LuO89_O@Y4|t9AFweK&DQ%l_2fWA=VoJM^ z48q5yW-?ZepYjw^S(?#5h==G#$xiK~sB|8z3!@$rEIBknJ}7io2A+?^yoJ39^$4L~ zh#bbGW>@OhVCu``%D&rzWpOX8!ly9Vuiz_=8;V#s`B zx5zUcf&DBE^0j%vJSxI7U0HpqJ8Yb-%FDus|0r-2@KKhhU*uTt3=^zrqp6x`=b-#T z1y|vN&**IvO?8RQJLjM?GMcTm|mUMZ8W}QU8ND?!5>Y%&F`>k+9DbG!e6>m z4}HtNdPB0*RK2d5r#!0sI7+si9Vg`y;>{%o$xkD9~rUH2E4`E2id3ZzYSc2K%ijTrr1K6r?Z z1#?hXJ1&@$TSFd58MT~DWM+@6zI-~PFVN8_W&kQ1={-8!k&|1e--}W5 zWWi14tWp_Zkc5w8cTB3%CX6_%5AYV}M3{2iXcFeh@=$h>JXv;y>D@K-jmYvbNiD2* z1ibH$vSgfP{-pXv@NB97Iu(lqo{Ll8xfauw`+NhL-CCo=j~(qMXM$dI<{wgH<&M6dgMraawKl* zv3+3HP#j)J8(tta4IbQOtf#k6{wSH!;k~k371s8-CT&?)%h3ZXO@eZO0F=jm>ZbjQ zOneV9ium}GT_HGlhtm$@uwRl-TvGJ*@3`VB)v5HMz$+H`ieb%zgtXxl+0Tb+hhrbk zbW=RNt-1_muWVk?b<0$9&!2Olis3owRZVRf+D z>`XgF!*0N?kU+4vm%wga$bh^xgF|v)qw$b&GMh`WIoI`guEoPe!bR4u^qfpeqK`~B zX2<&Ugx1eRGET4VknV7O#9890Tc{`T-PpV%+u7#%x~}m@-prFCWC;D_sdprvs4}~- zWy19JyiUrg=L{BrbF6tlNat*HMuPRw0=ckoeSmloy<##Yrn48_R10Cp%Fgy4%xaa> zz)3kC-w-pzo>ewHLa^IprJ%lp@;3mu!%*33FS%8%RU!P$=W+$&f^8U;p(E(v<(hYI z{gt%NKYSr*7Q92fEK0zU?)&SdZ_FQKLQWK-P~+aVEo_usg-frJ0lie>fV~XI4#`PNM19g;EZATO1+e zRu0GQaD&X?M2HLe>bJ*}#Hj32DwCu8+?oe4p3r7C55pnC8h6aza~W-C+G_n-9ds*N zJ1!dsqAo!3lk9fKp^HonUTq^tV4dr-+1Y91Tvzd#$uX~>PU3~%TvB!uikmYOkge{Mu!8!5Pnw^4gZ2-?(%2BA=!-gYl%$2eUpNMTpvZfm!A ze4^tJY^jJja#M~aesk&Qny}n$5Z*R|j>ikw5RcVDjdh}tOEqgh>}v)@J<;s*4iCeW)jSp!xf z{x5SsWIpS~#@PXgAc8`C%*7?IE1agZrQD@xXu>oR%IRpdWr7g-pdRMbwT)&0%H1+= zRHo8vVPHhRx(mca_vR-+hQ+va9u8?kp}QflhMdTNf#U6$3-om8S9Jqu({qo^ta=qa zNfY(t5!!e;^c@4=R;^Zj&9mdohN;mZ3<}W;IT4j-&^T*OWLbbD{nu-X2m1w{V!>+v z<*s6*$~_*CH+2!%u$+!B*}_g{CjhM=^Bc~`R$}~^CxC|K>5Up48(cNJsITYhX6j^? zmIoQ}rciyr5WKqh3yf~0?}6BOQ(6x%X;X2%yFsHFbZT1%qSc?CuX)4VqH}bT{ZIDp{A_9B{UuLJVz91fF#!WtVW}`1!fa?P0dBy5A_B=*ZE@7(87z(X21l4g?gLM=^C_kR$xo_FO>d+l)x{#hmzUf zCeY>Qoi_^Pd=5#Xigc{yW5ox2&-w+)F$cQx6qm?5W}`e!Co^4+&|MQ~&sMlRW|p{J z6AdZfiYPjQiFa3c8YgY0;hh%GPhIzTLg}6ay0CpChlUcHzB;v*5j2FP0w-E2oU?q^ zGmnP#g;^uD9KF6=t{Y`5zw*@2)JuV@hsDB&)Wp)51NASub=DPUEZ-|jzhC-=lBO@M z{Z_NZ`&!4RH1|tiDp$gT6)2C}WWO{W9CFWdI%ucPLLIBie;U9r+a%_A$9NX?y-JI_ zgv_W`VYR+pq`$7?^v9=Ak~%#==-j)uP%qN6MMWKpV;RK~h1`?bTtR6rIiG92GwMh9 zBPF&@PZNSTKS~xv)x3F@?X8=|C)L|tDBxi|%}xwukgs>o6MsP!rM`hGs+kLm5;TfSY z%)4V-f+_}1Uo_gC{W940%7kgZ&X@1nBRfli8iGZ3%dX``U!AoL_(qYKS$N;4R=)r! 
zc9e=OMVvg5GtysNexdqvu4inw9IX(aSH#Y;?5O}*nf-@-NSaLF&f4axrTphTy$QFz z3DQw%gO{`2DOeVc%k#DCGw|iz`pNgTwva9=Z9Cb`!#6_JuH8<{o8HdFH#PjGUX^|U z_X~A+^VGXar@u{8%4FbBR*Xm_OOg*R=8r6*IT>g}3^P*f&Tx3Lh!B8LW?QAj*fs zP?g>%I(T3(xCb2gggjfee&k7fU*c6C&YY_k+QFD7*PwE5v3Hq2J-E5@_eJ_J)Rx+O zb6+>F%H`F#HMtU;_zP8n`Gt~0`bE`0mw}#jA3r~lHgjHkY$R$8qVH$T*#q{rHd9`+ zZG1CJP~W&L_+snyvh-lNOa|O^N_9Gl z+7Pvj?+&I=2B-k7`-f?ODj`r+e4cGGTN#4%OIL*b$|zZtS==LY`FX*8#wUWCKT+k8 z6Qe@A7^$Zr>InrM)9?Fg?EZoBZp}3+cw$OLNAc!29+!Ph3;B7*Ro~=Z|3WQ8{6&9O z1w1yi%R{vnnNo78->#Q1rQ79GdgfL1dC&Qq@F_8^HdE2xqy8FLzL^Y}$WL^tQ+!mB z{585A2|f2$qSjn*dJi${k76aolk;z|$Q6`c}Ci@A1>*H+j%J^|at-b?7fVCTiJ?OOCR@_p^R;aoTSzPKd+% zeT`hhosLhPzfd!oGVUv*s?$Kf)%c1&w^fY%Ui#^C z%o<}GT;;{@W@&r?G*xYI3-d7+Z7eGwD4_L!92a=1c2+bd0iiU<9-*mf+Q*o(v7E+| zM<(|NS0da18Qhz&a+6Xvi8ty8xACI;{GO#wz6t8y6@lc*q=q>A1=+>QWo@s zv(;y!K-)`#=|d=}{bbFM;;4RDAR zm>$^}hpQWW)$D4Lz^&(=P2zuq1)l%=Il>fd({Bc|oUK|Q$#%JC>K96a`i1m$Mf%4= z#aY+Y4=^FXdw#P+WYy;XUS|DSPxU^G8mZSFVx7$CPx(q-Y4Nx+Dw7&&6-to z0`lrVUlf@1J(LOM7dIJYwZT{(|AS_FL(EW7%K*FSr8JAX#Rgj_57A} z(;Y&j)pl*IX=((_&-F+4WkW7B-88~3lczF5ly>F>9G|sF>N~uJU2uq!DnQ36c#VI0 zhoXOn{I(>$p!Z3I)g!gd5elc{1f0fRUU>H1=7rgoNKVJ^H$C1uB4--d8dr|3oiPTJ75qIPk#-Pr5xgXwZGRs$&yqB%*UED&~EG-C%uO9f`6BU@L z5s0`u#mWibsEh3T&&}i}JRO#fE;}T*GosaqNiVefE;pb*4fhvx0#>7Aj;D2drayP!Xy@@a8Ek%F0N!j(tgxUq1R~v}(l>l*RV_x-4 zw_0fYn76Km3m`#sxXE;j*t}QtBu}fvLoW48x3$~PsP+rw1fhqT=NpVACTpS}wc+&M zK}Ze9aYWN#-2g3vpcgBAYJj;D{85wHVt0T9^c4(D}y;(DG1VD?n;NqcUiS>Rm~?1v{AWQ3wPc zlH-k`IRbsR!hWR(wGr!_+AGwmqlN9jHPS=rDEu&l%QP8gtNpX}VdIz=lxpAmm(gM27XkiWz`ImfQ`5GCn!bQ$>217 z1ca*TWL_DHERb@~9C@J2%#V&ACU<3pf{DOclfgB8h=&lMjonIaHL`NU z(7B{dEoR`YNc;DAkRk9<1j$>#Bh~B-Q=bhZ&MY(L4d^+jvu79*;*dZPQPZeP?{?|H z8*Eva&@VOZ@aalc^+>OpFwA1qb5~>McDPnag3yj5rEJVtKLUYZ6w4(A^A!+- z7?{^1?3Y{nye z{8J{PNcD!HtK@oCEI<@%X41D(<< z`4Th!Neb(N=8q zwty5asS7z!nQlWuI^0Zti6UGgom%%XwrM7da4iq}iEX=+xv;u5UfHU6TgJ5g269$De!D9B zCI_ACMuz&!0mw;7xmfA7o3ZiEw#N07p%yS4T~`%v4Vf~2wex)9;rX37mnpz2I`6Fp z!sG3tllPL#@1-#c8r-0>J-&Q?;NrcO9TRLrCER@vKXt$P!tlN&v-}sTWku;6>&Jv= zoqPsLw=9Py%_B%voE#rrPK&f&ir;SU!SgJZ?L+N=^kowTzWyzozq(tEUH#GF5ByFOu({9e3F~HJXP0 zsO&nRVpl*4#B?)SdGL1;>^&bbZKvxF?MnRe%JhPL{iLB?AeMEuo;ZE0B?9wXGMUt|`n zEM#N0ruo@F_9FVh68TK_p(HC;NcW6bGad9dzgXLqPTwDw{fb3g$|g$BNJNR~$WA={ zsZL&>v)1)gL(mN0J&SsoA002Bs5{i)+ZWQjSVI>wrL|nVpS~NK7ltr3Il3AV?pC7w zeS5E!SjwCDfH(JOnqA!t_nZoIz?aC}q)ioAB0H^ieDf8JFM5g^W!a*pdc)j3F_J}Q zH{&qr<14oK=Q?hm0h7jW_n3Cmgm&@Vc6K}5H5~1`IQ=(Jkx+4$r`FIyram`bPSzm% z3d0@6`)&6?R_3GUGpDr2hB+C!0)S}r}+(ftRQ>gnx3#5I4;75~s z8=L!zMr;T`z!oySoH=EUp8pVKcY^{KySeV4@3?wgk$o`O2N$#6YGS|cQhL9u2!Cvd zS6e3WXBvwZS4Q{KDo87#kNJ$3Sg;>mdZO`6q@*B&^{N@41c~`=#$y{p_JOYZw8>vP z-2DSapV&%WU_#O5X_RGchmqFt6>eyo=Dl|!A!Bg0-BeA#qWk{kg-oeH2woYcU#lgF zsb_M3(A16OO{$FD*nWw#xvBWkwYKhK)BB{OY6 zo@3EHGKg0@GpEO8CNoC;TS!_5aV31|qxDGxDRS*;mEx5yxkHb!a)+aes~poTj*L5j zxHQaD_0NLPzaKk-2SgvRkYF=N=0ZjDqmGWW#nE%~iky0{weQtAA+kiH-n&AdyVoV_ z_pghH9jVX#@pPK3uaAkC)A8U?cL)lxT(}R|%YEy2lb0R)Dwa=Dl?z4l$2GYW&)=G9 zN#s>yyMJ)YsfKqc!*4*Lx$YD6{yy)ajyx(RK)1bi38?$E^dYa>L*5$k4(WqCY9mEg z`NH=eHn)ou)Qmu(xa}<5@!NFnai9azhQ=A~e=yqPZ`Pf0f9da-cbw6)Ej{~1f|TQP z=_lT+xR(z#wLT?7gnsOyDF#=$5ECJrY&)k5aOK#+_h$9X3q*F!f>vjC24U4BpN-)Ml1}o!~!N z=>KflRwEED_T?QR74Vhf5uw-{k%u>;YcL~rFbdpATuh2#2;4FNaawR!qHml{AjpTf z0qm1eYNXv>Ev$&11O_Bu5d#R?H0=O#7bRU81R2T6U$o{+x{N%iD&GB~^8-zoCH=U6i|F-TL)8`u7cw42r589=B_{*+>V zf3BQhK;v~M%~FM`5ofpN?mZF|xMlo*eJda!Oo8_AaGwfXFChxH$rz>?A`~6XYHBUODXoR0HjNSK-ypMjr`CQ)T@9hd!QEi z>p*?xi28&2{SxIsXkKyG=YQ$Wf8-(kmuqOCc1{z3@+E2d2V_J_M_g?~(CoZJdI1H& zAT=$o0qPyJYF^xlWJg=mnK1PU6895ZGFVOrY8kn$x9TAQ_j*?q-Q&LvYH%3TD9+kVC@S}+_V$l?8)o?xp87a`sXWkgN 
zvXKGq_h@Uin5H5}qlk~p4=8CrroH73TcY3Tn|p`vdtN~KPOWu(#EJ~;C}a6nUfx-F zvUzU;R^1Zck~J>60(O$?p+S{xqkev>98jZd`p_QJqLQwe^r)NB^@n12U)hR69i`(~ zXBbeTi|70I^aG@z z?%=9p;23Dqey0#NQk@@m6dw&q@C)6hl-Q#Bs#gaME zUqN5m2lKL-Q(_iM0=)ZsI2s98^(ZUjK63C5^yJpHX4A3<809d-;rdP8X5B&b0`U~UWrFgIZG?coI;;WCym*%><;Qk z`SRv`+q;os`&UX0WgOTQzLc(>9ZLMZa++DkrKgFlyrMHJW5IkQwvSqEY!0k{4GU(s z{Fop3M9{tQQ;trNaYcS%0jvN4>X_N%Ea#IYIr>zO{eIFvwTvRo4D~U%nZB zJln4E1_@#Jwb5UwyBA6m0J!FX3fHOTmeyRCk@N}m&I=XcXY!lv`^)Y47Hj8K2Ta8y zWW%@{gQcl5DuS8CF2S}^!JQu)*XgP4h015GBF23ksn&-B=pdUp;Tf2pjfaD4;;8H_ z=a$d%j~g$y*;uSGu9mJP1p`XY$$@pRcB2gE;GyS))zC>)C`KtzUtLs`J_eF z0gKXSOKgjtKUG!~!t!0JfWeJ_seWYUkvt?y%^=gkx6to1++U8pZ%|gCpd0HZ;k9=L5Ueo$Q^(UPB zlGm)_oPYghgzVFKzD5`kMJhYbdFoX;I6Gz5)`%6Cz;q^D{Fq(y6OMNm49Y~|@YC1=~S_1veSa+VI)Kc?Tl6=?S zh_8%d$yC-kre*v;&WKvpmyMi9&W&13t5fa#`E@OS?HCQX`ZO^e2b2XlW-=&(#0tui zJ2I4X)x=uL&y7RA3DxM}oF8k2ldaG>{iV^0%|;*YVx+uK{n-kmV!Nzg913?r=IQ=w z^Puq(-BjQo8)v4k`Pk04SiY^8NtjTb|4fBGs`ef3@{T$>KZTs!IjH)B@#|X7+A+SB zajZ%YBV~bO<`YGbt9E0Q;qKeCfHJ|9-uKd6TeLMdhXS+T4 zt#)o)QqeEeK+Pj1-P}#DV$^a=>Ek87CB_Fou^oriE93Dj>lKtmj+qgPKzH`yqi0Kw z;0Jn(%T8Lk(1XyU@$DWvYGq(^*{;>k=XMc-zf8iUWafu86E3NDouzC;PFdDlR9yjU zIC8%2g!B*bAAHhNp}}riO05QK7dLJXZMRApQ#|)IX_f?&m0|gt^o`fg28OQDHLgWX zf}d%=!*qRvz_rZynltclq;oWm8!gE}8&|AJ2;7w6xP~0E|oGn}M zTiqNW3dDB`gc95*+zH+aNch*LfEglpx**st$#eZBGM*E`D832J07)1{m3$aQM(QfE z2nAMQbotgs+XHliu2ZY#RYv8OMsiahE`3(i5$(%g)-49N+2+)$_6c<<+5}w)t_gdu zqP`wF99WN#Z9%f_F}7Wywk)uQ4D*7UYpF^-13Cf5x!014_}0cMYO-cI7M>X&m{%F! zGt0N)jO|}y4$kG637NGh?Dj_R8fLRpVQHwa&Th_aUe^`Qq|v=aeV-B+}v}mn4js{pyO6gx#%wS;lPh>Y0BJO z#zh%FC;F0L&T+p`;%v(>id?FFhdNb9`hZ(g5ptwoblf1BH*HBhX_IhTfaN>kYGMY3 zqPo6-u+m-S#?&A_;C#kw{s-1W0-KgHp$iyjXHR~(Heu1|6I+t6>Hv6XZsfqS{yp=; zNyk+SPW}|{PNOboZ2#QMYnLBh(RZk;0&#@!st$nT=tjv2?)Bu&Nmrr@ybeq9xs$HU z`$WufYH{YlY-Evw@oLg+1{>%3y~AfW`J21gb?V!5-)vnXAjMj|n|*C@reO9-uIf-B zvMPi7WH-1Fa{LSBc4v(t$tN&HwL(ugV$pxun#z94mU^`UVIRwem#+Oh=l&iT(3&3* zTj7+v&<$EYrn_!=r3{s_vuqS0Vk^}CLN*}qR^6yLF(Q?OFctC1GI9T*L*d5yNgf}( zT^r18m?QZ7XDhgug+Ac!Om<05CDQ6e#Ck!o1Il;N&d(#wn>2WrqMLiPwtL@noHqO@BGC;Vf zv_LjN0`#N*lEJ(d$!!EwkTe)Ly_`>56H#j0j-p@Z^RO5`Ltpm}ZRagYQL5R>C||iK zE2fqhlbcSRPO}`Z{Xw^e{!X!;W#(0$eVjwSmZ5Xf(Z@p|cF%3nD=>ug%iE!Ew2L5i;|-QqI<=*X5yo*ke?BUA#mX4kn!}u|Fbzt6KVCXxjrLg;Y1y zgAs%?904hwD}eijAvNsbi`fW|!ZGd#>FV=C7(4^@|7X3bSFz>uwos!=9ta!5dF)dg zAz=4R91yK7!{g5C`HAGK1fPkO1nk5oBLjg}p zSwu=p)JzI6q73RXN3#bHJcB|(m~6ebFjI+tDwuzLim_mL#vt4*v6>Toquq5MP37=o z!_!6MkV%=*;h3izW4VI~-xYW8x|T}_ZX^5uzk3IG9nfe+1pX5E5uyM0izfY1n^TK_ z8~(!HG7z@Ups}b8?F%}%F>GmxjDujxE|Xaa2^5kxtS8YA6Yazx;uXrmF!KX!a3Fdm zV3&{KTF_)UZb;444VXPB~5w z$ZuvofWH+7BoRq4Lg~in?PVemN-JQBQXguy-Ll{ zX`$K}n}={wlt1(1Nasknq{*}`HFRI{wpgwZA+M(ZdB^mOUn?^qb>_2lG zF`zZEvxaZ)Z;5?e+K#7@a8ZNut)AHT*jOPeGF>Y@B5Wnkw4L=2vZ`26rQEP$sT(tusl zlzLnVrGi>g3CE;QrwrUxcEO|;A{G=BgjJ>zibYX4J%#r1_>f4#6;86b&;}x!Ny{!A zu2nyN8zyr!4c;ekt=bqSEewwfCYx4IL_y?=S}e@4{9o^C-unLlUPm%s@djH*v#{GH z%@Ub`3xwb+rwvlWp)Qk;+1xcj4jzuA=c0+vG%f2%^%*}x6XO@tt4K!i7uxb4M< zO)0ywbhxEO>)yM>qH0`GM66jopq%@eZF^vQWKzEDvhVKfisE{rZmAY~Sn)lN^u+;Id=ySaYRgzTeCB!RH2VBZfcrwD&w*yl|E? 
z*&*W1xJR&6syt_n=qi5wMR0Etm)!fDiyHlGZ$J1He+s!mzUdR${>(Q)%lu>OtE0U4 zD6`A8Z&ffeVmabBDJL`ZN#DOb&O=vj;bC*Qy39uLtP6w;wh40_FmcU8 zD_YyX_>eKkIbkT}doTw8L5`*C5ythb{ZL*m{Zd*d9CDsuZw%)oeA6Da-pM_y*IJ>-cx(6RYntA=L6OvA%R7wzLFeHDnH8jc*LcJj^Hd0W`=#l+o}d;}6a8CSnACVi zoAYJ5FUzN(9`MGqx^$Qeex>JR+}}|q;3%Itmd;yrw6j>XqFkRN3gi9t6lg`qGqL=5o4YXTH211?Egu> zFMr$l?}t0{7Q|V)mDeX*MbWt4q8qeEQrc=PLc5&nN{WcX&XfC(%{5eFc=IarKP7#> zPu(`D`ab+-Z4!I``9MJq-o1s~cQ7#;ITDN>N|qgb&-f_pEz3fRe796Nk}Y1@J+Hp- zrR})g*g^Ktq8Ulsf&%MqE475(nFk3wc1tLTM24((iPsH1w&qZh?`vjT*4B%9N8Med ze^}-7j+mOQe3FO!N6LgsDz_>u_7ZYT&Aqt29-es`8hJfVcE^5)$ED_i4)}s`5h-;U z^F3Z?AlWzH>e3m^*Qa){OI|N{2t$&e2pbE+nH4juWoD~|_rq`Zi;x|2DFoOmZwru> zdNp^iAcr8V0BucOuuX%}Zf|-vDmDI6M%x8ojF^vDZ>WEb)0SY8&xnN8c!WN_i`;G% zN7HfVXIA{eS|_WDGtTPR>XTA2UQ-&2btyJuvHyKX4c*C6e9?wFW1lh6t&W0oc12nBmcX zy&V|S#;_pUOdJf#rg`nJ-M0gKf7%?x+-y56(K|31A5Fy1sUo->BBuJWd6}SULYvvUJ1L=#*3Nm5ssB~pup^eoJ zyBEprUr17q>kAMdSPtD#sN|lSByk(r{S!B2fW&QBZ7V|jC2~6bf|(BYG=@;)q*-q3hiUyR(iT0 z4cM!xSpeAfM4Kd)1v5*cP28rZgkvu^3!b6jBT_LoRx~8T($8>eJKH$h?L^h8!-4xVLcu*7N%#Mzk-nI^}&a6CVLKe8_XN$(cuq4 zu@#GX%se*g!EQg3k34#FG`&}$*gd}XXtP`V2_vU!n)e!yp0fHomge92g>sqxoY9vw zxjm&}()H!(((LbMOJmH3`6UfjA|1{i-F9s;zc7^bn+xY$!-+xWjyYm`b}&x`Z}1sg zLK3ZPPN|YM$xtPUeJ4lbPQ95QL-B?2=5@Gv%MX!_wZkYrSL*jS2Pkqg!PPvpBk$fE zZSU_mG@@-T^|Uv49KZi6?fIFKrF8HNoSv{^|9ZKkm+JoG+qb)^+8i%ya&=LA?h89F zrIfRFU&1YCY5Mq&gvnO0*s9#m`=VUjp}k8qm{m)gKeQ@W^{|K56%N7qR3@!amqD7J zAKO`Z&iUEEI0a&^&!DS5Re9;a zQF2_9AS=Q?HXr&;eYU~ z4HU7WbXwC;f}RK}7BCl%aup$o_=!l|6yeTs_K^S7o zvMvC3K|D;56-tkH>1CDkU_l9K9s#t)7uaLhk4`*!(m}(a&#FJUI6Y=f)Vh{6@$-%& z$?Cxr-&E?)0pYk>y>;#Ngy7zXyfU*xUwIEMr|oPlx*%6Q=_8wl&X2IR&)?WFX`99mdwWN7!U3XvT-2cKH%TQ9fWR>Rl=%XX4 z=q5ilxuc-zrso7C+d1y9mbDZS&f+%-=u(%OFZ)!th4?a#d#4KVWm(NE+Emzy^VvsD z2fdzTs}v3m_@=Ec%eRx`WoZzI0g-JGUxF-xvU~m7N;%%avOSDR`}Ot5V_Y9#ey38_ z73M`7bH>}l2;&g-naeaetD%p=nh)gywhsIuI7{UuT!mo~Q_=P}kY*ua) zMb|4=AomYefJa6jU0vmvwD%GaNJ#u;BKTuMfWPu0tf81yL7ZwhI=lOR z@ytuMjP0j$fw(eWA0;u7({P+cN#0wx5n~7@sat~zIRX}+L5Oyp-Y=*I{xknaK` zyFq?4;iHZl5I3^lnf$d^CTi`CL9D$`9P?LbnzD(>EN5B?Ykp_hPXCihyn>Yg0GXHY zJYgT;qSPGLNZ-erAZ=OO|DOi@*>BmuaddIsa!r<{A{So-h_bPoiIdfKsD&%ujNePX z^m&vX;D?1Nqt8%@_PnS4LV7AWW*-aC9(bv(q~wfd#qO+uGzqs%GO4#T@D8*pw&A~3 zZ)Q_g7O&+;(5%&WBAkKv>qqYt8{BM}n$Cu;_6;f;uzHu;f@%6%LqW19yA|n$iXQV~ z6c`JMv-x!cQ`W`m8quswfIl&xP~D&#Tt%T)H~>i*FsJ{=3dlfm02O;Hn6QI-0WjU_ zIEbQ`I<2m5@3G!&qd1C3^bPwPRP_!d$2G9XF?(x;H8|GvqZPSIf!u%8=97&j0xT0h zpk5s0Rxj-vC_oH@uqo#nI+z6{_2J@^feApth!@@9ljg^o-*q{&Y~R6C@2C3nCTr7q z{RK<04SSWo758q?E(a+Jt-X@(`rTNm(+C*E(PxBny}3TGL_%dg9{6F-4o*aWaxq4w&ZGpUpmp`XW1@M%|PB z=&q8#_p@V1oNeR1lc#r=UGEqhMSfN<85d|9HrXPtsyyT98kCFK?>k9zyyW>Kol*5I zy*K+jx7mr^b;TYo{VHE==D2+}oqt7m*ozwb(6d){ws5QBY!DJ+b#f z_u<9(s$PGdHxo4K5!aS3LlHyo&le-b_W0_oW{7CMxFMac+f-(;>AI9cF4Flu{|v-( z)_whv_obq^KTOwG@d2%1k1%7{$H6Y=o==w%)zEb{z4K{O?JvCNXHloEeD6L2hkv?g zkF};*qo6jSMXLCE^9|!=OLt=E<&=Oun%&j3rag|-6pLf={5*_18R_SZJ`L2D<;DF2 zaXamaT@3E|Ilk*3TrXzm-tagtAd?rc=jBG`qVuR`%f26SwaxCA`1cG4HkVEP@T}mn zR7*$f$hpJurfh1jZTs1)(-Eg|zsi=rk2`37FSP|Nk~f=;k+G|+9>|} zz@Z*d<-W`05O=J7hR}@Tu5fYyOS)V9K{kH+8(pO66oCmj_&4!pfZax$aJ`{@P~V-@ zgJ1AsYlM%Ed&2>Y!;8X-EXhCks}#if;MRP5SvEvHV;a}?tH=z1}4Q< zEy67rULpj%G3X7Nu}-!x1pN=xeS!1op^)gN#IL68vi3^R8*jJc26C!!QRw{JRgmIy z1qEY?TUgO4{7Aynf;)lNF#Fz{zsHV_ZzuZ;(T34iQ0-_N8HH>4g}}FDz5ORnK8(up ze)m4fx7mn+S2oImb@jwpdjrV-UqwbBZ;BM+tbX8r;QZk_z|+)Htps1(pC&v_X{2D; zg9Tj#u5lb$U@Y|6M#`lf{R1(VjV~+b3=#ZRLb)D%*c_v!NWv#gBJ!S<>BJTCQjR$L zd(2npif(e~n>-kv-e;;Xg!(5KGmDEXH2 zJ=>PmZTGxzbU0%mcIS3opIrP422bXQiGZW=FUj2;?x$9}pc^MN+r^|MxsgIoKZ~PZ z-Rp?4pf=%9zvpoC&BDv4Ht+UtMDh$j{>qpE 
zMX>%oRaq`K3Szl8@~%6yo`(yP(ML+j(_!Ubb~ELFIDgT2BjiyTW8dGaOLtC6GgMl- zyJLB;yQkvIv->rcf=fR{ACEs->C%)|G(*&;dputC zS~@1k(Xb+?1>)-ZK1p)Y@{cj*w3~Iq?fKs1Oo+Q?DNQJTF8-{ZHz6FE-FJTa!Nb0S zczJUBhKU;rU(m{QJ6L>U9w&#qK)ZysX31vXAU&SLiu8EQ?F=LuYnNzNqk=La0#ONv znLg}!$Qa*ry~kXCYGbo+yRzS3NXbxRy)|TCZ15<+-&vp%a9*gpH7NGj3n#M}1gs7t z-w}a$WFxx5Y}CS*Si=cEGRV9l8Jz~tnGNuHEc0rREnnTl`IUUn!|F9o*$}+yMglDB zdN=Jb8#S-=@IRJp#Yhe)l2b3cwr3-_%B^+x;!CLIJmz^1LpM~!fM3y<1I(zjwqmRXg~&0-V@?Pf8Mv5cbs@`D$nR9&GV zSWFdIc{d|A-2|1g#sH%I*Ud~2pb&QASWp8d-xF%%USuaQS|0!;4FOALFs_ZZ6lT@+ z)OCZ4=yw3PAb?cikAK?!cO4z9^a@y5a6(a3Sx|LG$h(6+Ni?b;`a6l@no}6Yrv3^ zj;wD9v;nLSm9?Rtwdp#m&xt|JLT~x=AAZbZik5YZcnqeFaYZV@=jm9L!%wT^oa`=G z^8)Bi%wwK9q;yY4Kzt1ZC)ReU4OOz-IAspoOlw6;j5)AhK%cZqioKq6p8(%%#Wh_? zQZVPRLW>KQr+`fc+1Gy+@!&E1`oV{@9Zk7z+(RkWb!W47T6j*Bl%_*=dil z5|2&(#oWgYTqhHwWsodDRN1?=zJqBv&EAfQTI)5tI3Q7)4vcp^*CAEa^0pyxMz^vp zNq;chI6i^vD->7fP=&BX@u;=i2ROjWoh1qEYtIcwt*y6#q!DMM^b5Uucgcx3oXzO5T-2I13{HixeIF2_&Lg#vJ6S)VMwXOK(=0 z;1)XQ6qP*D>rQ9DtV=ACMvl#kUS#>FD&xnlVRaaZmPOC_PJ$-^>BL&%unlkZ(yF=m z`S#Q(6zE**k)1d+adiw|a_nPW$D(x);+*#cVcqphQ;6GS0dKf-BDK&5k?ARZpJJfv zAVylmheRzA#&^iQwVBg?QZ|0cR9Zu;uqkbG%Ca;C7+ScuIHh%l884QM7CuU_}az#ty>?bY_gnkZdDANd-Y7GFCy$X zqw`CP1u|#1khnB-20m`c z^}^jG-tsr<^sH{^LblwuK24k6KXxoDSEnzMXi!h{%&O+9a6UTziG1kU#EF8Q@S??4 zO8D)4hmNLemKjT>*OtE^rHUNt`)IScNU-_w1ToX~X~vD}o?l|KIC;VDW20ky$}N{t zvtyVk+tq;%+1CdNu)Cu(+DXrDq<%W@i`U4Qv!>4D zYa*J-mko(;6$HGJa|#;X)f}B29fdH0LX)t@is}1DeVMe^s2WuiP=Jrn-`gr55 zknYyb-$rlxy%G9~$*wshQ5o-d1Yhil3fp$!g}%ajL$cv*%|`KEZ5jpdJ(-#+*WcFP z3%@6G?Y+Yy`F*Oev(D7SwcjHRNCDkp)grM$(e4HA+v*RylOqIEeS12DbMB9lUhZLN zo@DY=gyr2@-t{B-ed6|S&w)o#ca%*my-FTRwUWBgUd6mgIPk(o@@ZoF z!A+A+Fi_pvmMn*6%OgRm@$B75rA7QngOn}3Q)YZ}9)Z_?r=B*ieuVb#=UWgsDP=TB zo6|4b;@4I@W3QFGklPoM+wY!ib-`{}GH}PPE|N|E&R{0(H6ir)M@2lqc}wP^Gi_G; zW-P*rSoAz6nGlWYR~i?;pHn|v02goifJs%WunxQU*tc5Dx7yseu&=q5Cpid_L=@pA-jAWi_ zOo?q4EFOImuRCYHS7}^`Y_DAmV5ZIZ?>hVxYtQSH?7lFT1o#K~=1eznR~TL+KH65T z)O|KO^1&E%#m#Vdk#KZT^qO9H3h+d@y4|WU@MjIah%b8yZ*o+%UZ)*YEd ztx}U%?X&k7}>w=*p{Ovy;hIv(~W(1SdYacw};$1@TIsxoSeHi z$BUt7ct-wqtH-9KMP7pnX*nl_;Ikif9vohNJip-M&Ar-s(xX=>EmpP~OmSs&hqRvUR2;jZPU$O*bnO~S z5y~o6=BcoiRlg?k&SBMGf2s~iCw|1XN zs2Ka-4sba{-}`jH<14=0EAz_MNp-F-M9qE<&11cdd9Jy5N=%`fL$2SAm8LVrH>eC+ zQ`kgQSYM3f`jU?ac zr#-g78$RmL)IE%v!r;K5{PSXeeg_@3_Q1Y-h>vDdmiM#l z2f-Zwd60y5F)xur=#!E>V90m*hfwKby=l>@_S**8sj9%i}uZoGSVbq2{6Y)6FUsHc+?^qdLp$$ zjPfXv9}LM$Pl{eCSoBJnto4Mkt+2iX{HWdy?!rV7Mh|3308uu}icQ>L<_EM6#R0E= zG92vMU6TwzCdoDOhG5alY!Zh(JE4~x-|ctX1!6}QEPMRVUj6es570ul{%Jt78vEx* zK|5bO$qLsyE6);5RYChfC%{>7o)7mcIzd6eYQ`G8r1|1t;Myx31*;ur&jmWNcs@aO zS_2mKjJ2k66ISx8J+4|7#{SWA*YbJKg5#Kr{QOY!{?pQ@s8We@44N!qH&&Z*9SpkE--|y*E1?xu2St3{=im&)(V{RZsv^gj9#s>WB1hZ4^IkMX0)a9Jv~*@ z*&Y?(=pH{Kzk0OmLKjR^mvRy?YT zde4UoQr`#9+23{xd)R+QA{Hkc-m#D+GxysWeQx{I@rNCyx){mL5w6$LVm-49V~ZX( zrOpK@t(*JAxM$;R_T_QuZqpl68np$_4A1el{q~XX6kdYsgyuO%-z%a!6TEiH$llM> z`$1(M{h>Z`yep(&c&I9Bx!d*W?(24^465G&&)782mo5kA)z(K3YWMc~Me7?5-g?}5 zH$F0Ep#0GTNs>zV=I38p?zLKxN&heowxYYmoHy%Y{8(xe}fneRjHj&<;e< zI++5OF=DkFSGXg_t26J;))~I}MlTY>E!Z3(tjl`zf|ldps=Dw2AT>waV|MPo)J!ql zcRF#03lHLtJRTaFBp@t5hjIFPQXl-C~YyqBWxM82#v%#mEuTDFqTQis%f zP>}4a|MEh7d;MhR+I=uGGCU&U4ftbOFJ}eZ@#4&}u1WV*p~eyS<%L@--yw^iCSFL?o5TBgq z3AexF7I>{Wy7}7^N+^{>;xBG3tG;cDT$mI8-Y>b5yDA;omHzYM$U7$*B1|%tx{Mci zrPE@T0)6Ky|AE-N@OAfUSPT2&5XHY7_5&R22Bxck^@HnMoC)P`#fPn4m=$;!*r%Rr zu^J2&n;Rh`h9O73b%<=#TgC$1fK~qTMVZk$p+)@H4~FvYLl%a$^hjM(X2_3(Whblw zTyvn9b@YBX8UPUCk^@-F2X0jKN+xa=yx`W)ZYC==}!t7eByLUGb z=epsfIn9#gONP^BUI*qACX-i%T3D?nx7KR#&_O>aIxbMb?%e}E-^3G(_G%dXfO}%Z zhpM{un7(tZk-jIdMb3?iuT_&FwDX;=LBi_Ntq-`^`nTfWTdA7jZ^RR+O9tr!=cE_L 
z4J#*`E`9RM`~Dl)?m%(18HlC?0WU`8|FH}9)EAuZJ@aIlTi?1VAo>SyE-fpX(Y>xa4s+r<{-H0^3m(>?xl9+LYu*m;~wvvHZeVw%Y^&f~v!!R?bBY>ko z7wU_IO$=B`$4ISCaEg|aa8y>*Re)vb4RiQPj)$CG-+QHP1|kw+TbU-la26)fp3n>KW^t_a=S{fBSf)yTPtdUkEp=X*??;hIH;P5v++lm7>z z(!L&=5j0r7bQx~bb=}8q6zZzbKZ+l;~{C33ud zAA3}BulLk!(OSNQGM-t$hd3(V_~`p?-?HkNrdtbhF{?}B@9m&-k$$9yEQz>(hbHD- z`Jue%HA_b(SMs;o-!0+u#T#hl(+B|Zh)l1P?+?l9ZJU~{@(rZTf_VnR8#JSdHU>!9V?a6 zU-qpE-5N1mUTB6_F5wI7{XY65Y52t+=pBYn>%%qLKA@x=$2bTpSsanaD!-^jbmnb8 zHv}CBjxH7-svS`h(ol|kYr3rPyxtS*Ybw}6-SYU8S#*UwjhE5!T`*Oww?!|zwd z&6$DR+3k~aUm~GvmZ>t~-DGuc7;H9wTYFL2_gg*xv^D!YbnGD*^pS54IWwOd=5ayi zW{pa$L9ax-va9I?ds<kn3D^E?n|Fw;&bm#KcH_uhW_X+YGfEi zT~gaa(RwKDbP!4SHw>owHkTlPz>evj1@8x!YQFM_Y)h1KDt<+;d4?J&4S z9`p|O^e8^mHDWLR^+Wx4I0=TdFq;pCk|qoHL}12Pyvk~i5X1L55pavHp98_)Hdn9S zk;`8$c?rKR8J57T0H>TuW1jwz{HE>a00<+yDnQlNo(yIfi&v+iNMCnMYKbQn99}<{ zA>Q@tt?9&oamhTgO)0HlQOS$wJ0gqp=mfr4F%N((DO*tVh zD0{b2BcwF3dUKB|PVU&mt(HJ@^~**5b?Mi+eOiLtXM8Ab+MgX%&hg$6TRv5kWYH7v zQ*J^acbby#Jm9=@eM9C(w4P&M+Q40DtxwL|E^DdhbjI;(>Z3nSs^_K75EO(jM==g4 z6GAdXoFXin^|S>v)MFGmjXSOHI${Eyfj>)Hiq=$~8rFy}6)@$?C7qUgWD{fkpw6NF zm7@{yeVldT^Y(Lu%HguoUkEWlc>G>z&te{dToE7?D0x|d7MO75|70ilf^=J5N<1s` zMdKb{wf6(mQMG0AVN1aBZ&jC&CL?awp8{;Z3I90$ys~|dpXosMBOoo&4s|-_3G0lg zzpgNeLR4Zg&WLZAXau|nvKZYhuzpUmuxNJhD!H+)*RQY|mP!y=w1oc&30&;mL;PfE zE3L;kJn(h!vJ5**YAXc~#Dqp0F^T}yLcj)twE%MpVg&+xelEo_kccV(N)e0?eZ%hI z;oZoY9JttPDG7WJPuWoW>~P z{ZHW5b#m-BJHUCla8p^2*8}jc+nWH$AUpnz3Y&*FM+4d}!&N0kQ9=;|->@eza$Ci- z8GRv0Z+s<$6E%5NfOCE?5-JMk1AYiVgza401a}0hp$+*X8t0H-F1pF9Cx#WX;fAy`T%|37ayelp`jjaHzK zsp*h`4MrjP&9n=<>Ff?cqp@nqcaXb~g}V!0t0t#5CFDF^I6x}HwgH}O=ME2e-og{` zm4g*I+jI$H%q&3)mgFuwM1_J7flXobs|aY1cGT9N+%Uheui4hHhkX-#fY)vIU9lZ& zJ8(Wbce2WZs@w!lEI9K(R2FzhbfFyJieQnw+=XuSgw6KyU+*lE291HgIwvS=6j1+k zV%V+o%)sntLI>f&y0&sA-N!SO8juVQ%DZy}zbTW$mKX_xDxVu015qW>rQC?U@Q{2U zjfoGkbd?a>!V;Fsv+v#zN$DJ78)o4lU7P*AKH>T}MKs=ZZxe`w#g z-TOiVJnx9IibFQ~_ z##-va7F;pSDKtC#tNrZ?UW=P&MI3|Mw9CyaOrAd}N=kN3%&mGhWj9Sy?Gh;bg}yA( zx3}}4|ArD?FPg^HB;133d`0HthAkxh{QjmFe(!Ye_kzJ%6GE@Po#8-PYbM5A1 zRfn@j3jHqO(=!t7M`~ zRN^Y7aGW|)o_;0tu}5I~vwftv0{u(J(!32KS(K2f(D3y8qwpK;K4l#?7%x-e^FN&r-?UC4%iLbscI=zn)>kBy?jC_@tZF{cu z*qq&x_Iw3(`y_22MPHtBZCojHb1+@d;)BAmhK)tDJTyYd1xlk*vh}Q8=xdEpM&|Jr znz_mr2ahT_SUUSF=}WV)#XeC0rntzuoY)T8Q})y;hGlPi=R{>X!skH?em{ zP7l1lWJj`-3`dtmx5uiAiw9?n!$RlaGwv4b$?K2vXa)hZf)VaZY9afXD+)nv?skmi zc02f$7$}(MyKcTD_7@S)`{FX;hFBum5bJC_-LdWDjBylmeXTWdYLQ@V1@RrhLsodf zgn)yOvAKoGZN|fCT7iPHdzeMD;pO>Ti;Jf$y6JBaog$-5U&FC;_zKt@H4|T z1m$E>+nE(zlHsV?U5o@ju}(ox)w~C`vt8eh2DOWwPe6q_-Siw+TCOMQ)stS%a!ew_ zvqK%U`4gLFS4w7A%1=&@E>(qRM+Vd1w|a&|^a;X=CtJC)@mH@DPq~`GsbMv%pQqRt zJk!QpLBI7Ce@)W{|Aw*M=_2LDy(d{GC%3v|1Jbq9588bvJ~14s=wdywokXZ3ws8Bk zYn3aRdUww`y-n0d@0SQJncPCy_59@T48fQ82j0bo;unnTnw_h5p)bI3oZh3yPpqVe z;&=ORQMMZ1@=Upwr&{ghX!VIr{^+piDf^GRyf(E%gC6RC#5^@mZ3V@u5|^f}Uu|cP z9y1j$p0XNinObxr7icIg=aKaCNUvtS=2uYcUz5h|9)gcb_gfoYnXh=l8!S%KGSWxu zTy>)>#g3+|px+#T2R(SmXr8vDHBZgn;~C-$%C=(cy^+3Rl*dM}ce5cZxz5ZNUwvV~ zI@n^Kwl#VwNhpF&%O`LP1lx$m|Y4 z?W=ebgBaP;vh;Bm&zqqRU#aVPMCDyZT4XD(Hi)7oOa(b)D=Zv6oRx~Neqn@2Dmk=2+YiuYkY0KYC==1Jg_Zr0|mw84az}E4z z<=w(-h-vxuFV$|ROIy^ri}w98k&E7$O1JKj@()D6>fJ@N%JwWY_3En`#$^*Ioan&@ zb*`(;LelDi%1@e&sE-cBJVGDy!>&hv0_A19M+XX$?R9!EHkUJGEOCA5>Yj4$_vMWl z)I;Y1Xe(5>b4_*pz-_Qt){q!~xaDn5RDJY1U3?gr>52tPzyLY;FG*To8L5eBDKu^0ISA3~eh*I&$@Jvkva6NJ%A z6W6lB`l2X*O9^M00g$CKLGNdodjF9o;a^hHex|2s4&>N;%lCs&0|+;Pjl_YX0jpTl zc6J~DHaOFcD?~whYmpuS-~<0L7V*;{KHSZ*W-s5Pb+Q!J*0UHTeE1ZH6(xtjP7Y>_ zq_B`Nm1BpYkXA+(Iyj>#3?Cm@Fqn(*VWoye{f|HXz`jQ5?p{QJZCO+2f}%>1&-_c0 z`yZA4KlzK-#JZYI8Fja#3ZtO~*%+k%<-e9JlQxdk3HGoI*~3sVF|1zY!YxYtCin@D 
zJI>P$`KC~iB%JyY|4#T9pNvZ{Y*qD5?CF+DC3aps@imT zu}2-10Fno}P5C2`zZJzoFrkdBvdP#4*<5XbyBuIOu^I}Yc(z&QKeqX^^JEn5a2- zxvPz+nMZM;h&6*w^=_?@i{X+j9OT3MoRBqbve9hUnyAi{#HBw|b?i`dII^yJ|Cd#P z((&`eTVlj#-gQEpd&H_*Y(t&$QaaTOC$dVK-g%|3lqg-EarKcWUhSsaNyhzRbgJx` zYUZF{h&IbgaaUH9bx%%m6no2;`C+qOfTOjXUcam&L9oe2vgVb7Z<}Z-eTQ>EprF~? z;liWG1|}1U6(@*u{E1JNG@sjNh2^F^9+=42n21U8+E$^ATS=`I-r1?yDx2vYC>ZJ8 zP|{f-dtbaVQ}4o*U3!}JKxTtHR+_9Gxy^B;5VYlo?>ZJ1n^D zg6EiWM5iS)bVy?&d3vHeo3viCwu{7FQc+*YxT-urA?aQ$C^$dxus7oQ!7ju+J%o!< zHR%)4+IZA8cKDsMQMbRmDncqvm){%6nS80!-ySd{Np_VzGhB&{OZwuz6k$E#`xtBBwu1nZtoO=?g+HvIyow zcC=H5zC}y=Ch7OA`S0do4cfyM@|Tq7snhgZ>0wv&nFKgKfp$uEkP)7~C@gZhFqXSU zd&+J}v-(6uJw!O9havL^*R_&0oKPu_{h*gX@*P?!rQ@$%VtP$kF;9|S@BA@iJF8!+ z&2??PCVy~VM2W`;a-Ro>%%$R_qwoelzgk@U(R(p&Wp=l z-dNMgIXFo>B@T@J(*~ksqxL$L+tT_~3YViDEo{s^g-dBCg$hb21 zV_lGwa2l-rhU`6IIkg%oUqZ)6laEK!cD^aB-TqdTWPCC+b-KCTBRnhClX6fT`&MJD zdvB*5n9iPqi};mv5Nv{4{c1Vp_{^Dm2iJ7VGTET(;3`JkmswV~V9j6Id+D}RZA90M z#bNzR`v(KlBYlvCgYo^fW12L5iEky298FKb;HT|;aG(5kNvlX>q&ypoYFrY8U0&*S z63vFwt9`HA1C%4$X7J&Op-v686GgLp-eXOkU=%t$Ki*6V&+dDEBEYfZXuId8Fo*N} zq~DelPM31unSy13smMWZ$dNfZB}`G z_-XTUZnN9!J#lkWu)h3}+QZ5h@PiAy3VhayWBuP^L@~By)qH8KIG4fT;^TV)hO0RF zu}Qd%Q{=X`E0UIyq9pHYy8U0bCpB}v%R7%P>o{YxdKxYOEU)z4G!eU2m zilh<0fPAqKP0n+{;_B!W_JY1vDOTJyj7SLQY1elu;;NDVHJ{grsxDKcDndvB1>)Q? z`Q3Y^^yx(yb=e+*6d6-wUK{IBwuRfO%8P~)EKST5=1{AfDB=P_>Y}r~1_zmm zEzxqu=EfMyca!+18DKVLtVMUkz|1F54#caOrv=c_qR1l8>o+#?U{RugK9mrofu=?Z zYraIXaA!bC{7vO2C2=BHWdCY+!S?!UjqJYRDkvZej{Zm5#DDUavKhb_h~X1DOMO{3 z8YvBHti)+b8{>nlFai>PeL%?>8+~F+wc59GBN;&ObN+d2bO1OK**QP&WI>d}4G%d< zujy%z_9CPg^aJGOe>D1Q z8U82V*BnQHe147JYl__>Pl?JU$M{qIi~VH^3whM4IAoBW&$4w`X-;cIxU9?{U{7;o zVHo~<08NY6XyR@fmnk&ypZ==*RwGiHSwUALzce?K&u8i*yX95hY7a2gi7*} z4NxP%67sjsMHBpNiBUi-^Pa_#535T|;n&$bX#lXg;TyVx2G^5Lz<0*{tgwT4@3%PFT~Pf?W^i5BeDmS+L3n~NVQCriZ#ifn#;fsyGvd~?B9yI9bt!N;FSdbn_j^fa)v3N~<38F^!UiuC+?wNeCO;?L^f-{Q{_SvbE3?8&=(FW`0+yyU zu3}R!5)bjlfhRG;$?ak(B=F6c9=k_hP?$O>D>-F3PMDCgSMK0FhY!yPSMY5|c1a{R zi?ou~8`mHkPjDa03RZh7eR6$qdf?^D3I{g_*(~V{Oj4?HvAxWTV6^$T^4*OGPfamR z?P=Y+Aan5PM3?{+fQwNRcELRPlY`6-t~O+dREr$J|$YJcM-f~oZpLPw^vSVA#8n#p0Z@xg}$8KQ7xCq0`fI!q>p9y zM!a~Oa^J%`$a+?~IM~2+7nnBIOA_BI-b(4%RnqXu-~d@k4DDeWe>#}wi#2#bS!kp* z!U5Fjn5oQHOFH=Qyrrt_!R*a;clohLT{#XuHYErTGA2E7N~`TCur6*%B}kPISK69+ zu267H+`&T_F^t&IwT7mh`Am=wGa!BC#r^;UTpnT z;}s(&^p(b#u-*l8WySTVn)Nww4Jm$Ccc}!w=W+U*?J7As#eLOl$IejOXfqb4Iy{b@ zBS#Ll(2iaWnZk$gnd^Fv-?_##(#40z9X&byEN%MJo6KWXv4(dl)P~;;d>=}s9zAik zpf!vxHx&M1Y2Tyc;psh0FRN@f`o19ys#>TbPb!+*An=}064ZhD5Moljk{N2y1y(gf1IMpOUY01;yHt@-*ddFJ3D6T-tp6gb3ow~99FPQRNMxC8muwI(Wlfv|6SByIZnK|>#`K?j9$rPV7u zgXq+6`2?;aBW7|-Dl;7=CZ^kJtvX6Q4KT0SVybk|t3v-U zXO-TK)5xy^Vm&+aeHP@Z^$$yDGG7F$P>Pu2~q4saIcjNPsfFnyD| zbsNF|PPU|M(bj0^ryO=enKmC(5-)~26uu7V&AclOsoMJqxhXl%G7fl15S}JAA9DCc zQx3)7b1T{;#xI*UzK@jEs`l8jJjm2uXk2zZMzLRnH(T-<`u%#6Oet0-KsX6|C;ClA z$o_e?5V?VZly+60vz=PzpYTO)qv{f|Tjp=N?za-;$tAXkUa}eYOxSa})T7<3JTr&e z;%Z%6DUt&_3Z#`JBeFw)bzA8bW1Q4=LHjM1`4V~;2=Wuf4u~K^N&9oNz z@+#AbK7dc$#KNWF2oV1=m(7U=q8+;iI?JeF*!m$rxmk_kXzC9)uK8n`O-3SQ)EA%J)zXc;cQx~(*U+O#Uw}v%b z6STMIWIqCeExrvKjPpS@pajm*NZSkajz%mFfLhy6ry?|I`#kBs-)hU|+ z!)XUi#*zjH2p)nBz(WC?5%@P)I^lo(E7J^!+E|nzV22d@?~;vscAX3wfouwiEX*#B zKV;ftqJZN7)YgO7s(9C0h5v1nfM459m^`e9@4@(`Lyfapqgc+QwNGph1|bVvAk&cc z%fVv6>@nI1G2XpMMEd`O4aRzO)f^+^f5j&z4uqF$HXAF}2`It1!b`9qC_T0whqWdC zD}UCX3y`&i2!4dC4e)~CU*xvVJLRCR65eCq|6J4()Z}dZOr;B93DC0vH$bTd8F3Dl zFv)+q7~n8&*3zD`E{qb0w~;|QZGXj?7En|e69rnMthELZzWC7!(>GQ*$Z*@)DA*_| zsPqPtZ#z*H95VK9-S)})h=8_O_3phY7Byrq2`TzZi|5S>9PZB{X`IJ33pB-Cc&je@ 
zRvZ}ANAt#ylsfoKlVGu#H(%c5mE3*vx(;<((TIjGzzs^?HLcYTY0OtfY8<637MvX@k~4UTOkfG z8sAft;n+~5%;9R_q|+-bPm>r^wf2IzNT}my+2l4Ry+{EmtW5T$DC}>J?|XfPak;6- zw7pp}AwH$x_97k$o|&h2GR7yXG2`OczKz*^9s$Gg{&(_VOYg3(Q@nm@*zJ&Jb)`F5 z<9S#w^W;v_f+-`tXLCQA%gmjkkD6`CuP{X6#(5tH1X4Ahi=9uR3clm*oxLP^)id6I zTt%;P%4*!VWbK#}CHJiJDtQe$^bNblUEkPK+Xb?+L}Tlvkq~O3v2ID7!j8=f`5^4D zUdHkLx23+(_4J!XI!Lk>7saT^-KW({Je%cG(Q`40&zDq~Mc@J=frzT@EI-YV)Y9$^ zo}4UZ=%A8_gTb$2-L|qPhd#`56GKjgm@?u6-45i8?UhSZuwsR?V23}>=VY=)cj!db za>101--pTF*C2)Jdy&#!VhGeADwN=eDNP&0hZwpJ%tlZ`46*ej@}0DC@lZYLXV+X- zxV==5>{hUwes0MOFad+1aL>2m5P^IN&RnT-N1Pko zH<@gQS{k^OJYO_xUM<$YiSm3~FViW3!jG)?5v!-|J35$qFyjGgwwV~79tsFh)~|DB z1_aIL$oK2bLMpZ)#8=BQBy`UfeN@7!M0A2NIr^qXFWHhA*!L)Z!8)>X-y^k`mnLc+ zcg|R!JDHY?Ub2UFxMQBGU^U(N8@ScrR~a+bjeauq?{^WQ@mG?;j%Qbq7o&JMf%0OO zTg<4!vG$cJ*}W2CpyusIZg8eeH!`Z%?LLQJIMjE}NrdiF%T=x0CYD0KJ=VJ%ZC>Ry zuCibMj_s1tJW$@JkU}_f#`&>num*@=H@)4G3dGBx^GpTtpwpO+--uA14lo5_#3Ii}S?`UV;r3;(J1Iu&{9J2u8_A zKjSuh&MeQhpDzyags5EM6^bR8uQudI<4X?b4bYok%my{=87@g6KXHH4#?N#v+sEKbf{dt$kl9k{E$*CV}UdC@Cbfe?)*w_zh9HQ zu^8!IcOWIK5<J&GjguTr{Y9zw<@FLVw3O z!ug5f^kU8u{Mlwb^LV3j6P;hvVHJ~N9c>nZT%i~5pA^snpm>}%_6eUJ=96<+nN3mf z4NA1itrW|#%$i*TX(!G(k&mP}NypEBMlJPM$ByKT=eefoV*)dqj7Pr}U(HhYlhv8C z+tebUiB`GaWI3rW8Esc#oj*a>BVW_NS?{OPNp`YD8$oFfKXKS$w(Lj(nUY6Jan>pt zoZf$DgucN>Pf^}+$|{}oi-D7FPpznSxdi?lenOP|lIR#xE{H5J^Q&{Yx=&rPJ#nug zXG)8wR8;p?-qQjJ+QGbMxsdINr%D7HS$S4PW!6r7O{a?95QVc27V*XJv1ryH%U_qd z4YT3ZKdb?OoylnXaRjpjSbyrUvMmt~qb}EAAQwWF9NUndNYoT|Ug~R4lMwxWpD}~n zhjoqG;Rgz$dW19foTDNP0Qu9qK|B^XQig-imR;qeXA^+_$d%<4a$xz%{^A-s@jJ1P zJqceNNe(C|tzz?ff?w?WB^&!&Zv@n>qkz8)lwph5^hSTRmEb$4=4V`A!W$|(*7MH< z7?$q>c-0Y#7PMEysy}3G|5MRXxB2gM!U~EN?)f)H%U|{tb;`asTEm_F!3Du3ijgJy z{cnWVxSv8e$>SBYgtq;8z>TFK@Udr#3e)5{2F0olhxYoxp08FGkI*DnmbtiQ$p!aV zRS{H(#)Yjs$ARS_tK>B%>es^L-{+~PCAFJ(a-3l?ISHiyOZ?SHPL-VdQM;fG zivTH$__Z*uF!}WE5_UfT)N(#IONmz#_qRi$f~<@-vejw-Z~tsyk!%QO*~G=KsS2mB z$*`j5ltjTqhv&+OS}Tw`>|m5?(+R1W={s1q*+rkY!c_8^H9cTd-P*k{4IW|?U{Eja z;8G&WgQnGgqK7uXGD_C^<^RGTAV>IUY9t9OgFoH4U{nA3Jrbxw|6-<{IZjVqpHy0P zgR7Cl?7ekiPE^EumZ>!Vhsn7{>27@--USDXKh02hWI|(V4BQC6v-irY33$IhW@mYkJ#e~P$eh|E=3n@CmCI(6CVV$ zxTcbilbd!EdB1`uyvUnS=W?rxA)r;hJG}J9Lt$jPO9jZIEek7=u1X9O>zL(WD&H31 z-42OIm7ko*bCxw_?+D&px72z*@L=!Ti&bxn-GOQLtmJ!|Za+FuagnP+J*|$KeuT3y z*{=m#IX>UK`5wgdc+a6b)+^&jhtrKa)fz5D56A1vpWGuh4_r*U&HJ3ChsU`HlMmaHgpn9{Fweqb1ry;$|zNz7Uvt1R$1>y%{g>d}&qA>cI)NrDP zxNCjoaAGl1GLWK?sgYSbF04m&b#tQuFJW%~$#%66Ga$mUI$15qEPw1M=3H&3x&@zj zXr6JeeAT&Hsku`jhLZ1;Oe|lu5p&$N32BmK`=mVAQOr@~iDxRG`FLi}Y-tuoq!EW& z@X4yF0&J_}eBt|c+)J}w=U{%U-fh1+6vaX+{at1^I2S?=RRwKqdNJFHik+z1@Aa4% zc4s&yx!rztIAL$u_Z0IK+t$bUgw(#bGUcZ2)b`^1z>GII+O{O;hhWd>qhD#`tURdC z%OAVUQ=f}b+=5XH`x#fSmOoXt0^8v7)7E@1skpc)=Vgdi!Qs5E`P<5D;+&T6*v*z$ zyFDW73ddarGy3)UoTtrpdwTn~$Uhs-AV|fW$L!l%-VyQkb0@qHd6QORM`D9n}i-);X{0?9+uh3CnvhV(}*3Y*tc&uVc&4P$1Fv_ zLa_L7|6BHUno=MocxUl3gOs4>Oxo;+x&`b(|4{s@z{%%Dvn3P4WO*Rr67a)xseMM1 zDA7fPWpu!dvtAMy(ZXj0$SzFk$?9KcAW>+XUwM-^o|T}d%d<)C#KI1pwND)wCuC;dLG(Tm@*A3ytnH(dVrVfTIN z(&HHYi@m9;%`DK#0nCD<@yc{|iWrB6*hI#=1&@yQ1gBVSHHF6A>& zs%>+Y6MRxM+1+0+pCF&o(oW7^6i!=DpDpv)@Y!eq`!c-rrKW^4@Q3owhAcW(RBda)rm8u2Ey?C?&^s=bcL9q`Cg*zI;N7K1 zrmQpQ7czl-nY#arx;KGq^32-ClOUidxCKQ7#RW`IDPTlFut5z=AV3nJl+re?B~&bn zOQq@racfjWi2?yNW&>0#w!vbp9b8bP*nxnI#bvAt6|FjL(V1#H-*?^56BOLqdEa^8 z@Bjb(G>^EP?(*8Dz~N+-*M^-4M*p&A-mX0Rwq!<5lvcq;Pc{U^VMddFe1kVM4XAqb zw7+eqc0b&{AsM3H(uc|Z>meKby4oA&$kH! zPjp72zh0i%SNd~i_z_haH-*pli3mv#-ySIg?6%<)I1V=qNJKZdU25#SF_(zWx2c^! 
z9b~Go_pr?jwB?WK_TK$8I*v0?w)X&juYdZc;I%~A{U`{QBwOO5NJ%R1Hit>;T{7r8pkgT0J6@gX@Q}*~ zzI#$(PYH)IUPnF6DCQ5m%sKD7C*WV`An4&nd%Dz@KE8`HsB}P_lFB|v4-e>U$>?up z+--SJoytzI1<|yhGO3Kd_m81JoS|_=%YGZDoj+rl7tXsh4`*EEI(-yZ2$oK&TOBZK zLb>$fyCC@m=gQKXQYX66U@d8>0G8hZ7f-Oj$YnIFT(Gt#MDEvIrXO=G`4T27r6h{6 z>-7ZP8I#$KV^*C2i&YSpz|CjMMt8sx({m5Z%m1QV?9YQ90hGZI$N zw}AFU>Is0KnoiCZ8=@5ft@s^N->GTc710io6n=37*zWTHb? zy{8LN*_q+&>*-#)KwLj_?k|540u8vO`40EisdS=S8aiu%3BR9(h^J>g(1EggYLs4`u7^`;R1~fK0OfVOtXznS@)dNrnl4`5BKzS$duxF4i7!^Y@N-&wt z9QC&pzsIpE1V9^;4Q(cpC2W~At3LCrP*l+=MVn02^>)usKc5>8aOf#3b)>U@j;f;a zduVI#Uv5*J6Z`Q3!H1n0aM_3hE$bGD8^#5m9w^*=&h76LqZf{eR3|koTCwY@3To*t@!NKBkvZ@l zu|Bq|tRZx2*?cQQh%_i!lpVp8_qY43eQ(N&UB~aljq%+T9=KFLq4ki{FV}U+zm+E~ zd}m;mcTmdgVe&xrs*{xs0~19DBr83(jc;lgHumHS-un4>r7IIU^7dg@&$VH6ie+(s zp48N?eq%US$|@P;{8_vDgDcbbgPUmdYwVNSLsom%u3dk$Ggcq*akPwgHp=t9N?tKI z$n#J%H+12cD9`VATvcrtW@P|zXD&DvJ^=nbu(V58m}H1V?D#j;zf>!uArh3avyxMD zU`CL9xi~$;F^#)EB;{VcTsYtVgQfvnjYOf1%5Xy7)HF*Tz4J zyQ?Z>XZAb5{xZKa6v+8|gw)yY2LtjNvlQuXRC2f2q$}BqQj;XrF_Tr$A2H@C!{p83 zm2sEtA_v@HC#`7!-MT66@@odCcZJ2ecHuAM zAB%79O|i;5c;K(RyV}yhqf_4X+D_ z$JQGzn$Fh6T=ZNZ{XBQ)0x8y z@#LQC1s_x;#<7pBF2j}kplaItS#fv2YDs;lS!EcsF^Ky(rS)$qZ+~^ccj(7bxz45{ zwRGkConfoi2Re(sy0GHwcZGpf2oJxWx9e{y;t@gCNk9G7Try~RX#-EW<}HV{sxhvD zq;I1VP6V1WMt&S2H+=hz=p7%2mbF9G!Z~F-mp2_bZ|Mtwnd^mC6`xfp@~e@|?QTA8e}<+DzAUWxB#i zZ}eq;!~Lh@d=}2er6#_n^w+8}(u)#?$mLX}ZV`8lFec}?aL_zNcboffQD$7TD-z59 zl~CT%e~_JB&i(jx?<<$r-VOEM!CgE`)n~-DxbtgOZZ-u|JZs-JDvXD&i9twJcVe?q z{#aTylD)KazN1@cZi|MLQhI&xCL?&6RSAMtJdOG*KNSQXubaZ2V2>X%r$m3Y> ziJKQs5`5ma^{We{Q7&ou*h_aM*3%6qN^@1vvYMX8mFpiG^ZIWOBO1(yVrZ>ve?Hxt z)J{vLmB&mhv22xcv+F8Ow27An8L1EW@}(OuYw>8{_s>V z2mSN>bUPbsFTop}bz}2w3b21~XZbl97fAI%o}MWTDlpmW1|t_wEZtL|J*G24z~I3$ zYJpaV)XCVg?Q&memhTL#Z`la|v%x#~py z>tlE6N|yj=7%Bz}(e_a9qXMee3vW@qUQ);P)N>X*_MR>ibt)D=JLv6kkxIt3+}ZvLK~f#O>c3U)VHX;UI{ z!Zs?G-*(}{&dNaXhI%V`Mf8oLk8rb`_zVD~ys+ z%8cxz7Y8DsDMG5lmOlR(PQNN6*UC-0im4vMT}j2x@r^;qV|WwkI4f_WrWajkW4WFn zxK@C9=x0W^lw42#O@?l)x}`0i`MpgMO2n45375w^=gTR(z3j1eSIxzGPvNYOq>%)= zMe;3FK#%)qpQL?gNF0NzA{`dpe0r>abHh8bDdE|14c4zsKF;~+lKIdL-m%r?nb~u64lNDNjhDoq zzGs*z+s-Avx63-wigzrzW2hkdHT+W7+pTThx(b)m7t7|q({klNNnC@w&st}HWtbwQ zw8DCjUBNtwV{mlgA&KjOkPGE$s=3^sR}NHdPHC!2S^aUyY29vwQWtb77fuYBUolFO z)$~o&`XGGrz#<;Ict~zqfU{cF_otM5Cqhn$ z!ghS33X%uzyA%J=ZhlC=+Silahy0mvsqdHeWt)A%>%I~rn-uq^E630f{uZK_13c(1 zvOM4QJUBTZSLXRk?8-QQTb`^5sg`%^d0)M8JQ~wzs$09;Pscm|6;qC7rYl*cUn4R9 zg6x3A^T*lSeLsHpobGGZ!MplP5ryG{7PBvZ6t`Ay-tS`0;H*m*+9&+3~2E@&rwyQhk>+|*`AY>SO$e;IUj$WF)HEer&ZK9nh=ndDDq=U(^KW#IN5_cUS zDm4w(Q#W<&ZvV*nh-Q(#266M?;X$|GoxJ}W+X{DkW$m9Pmkr)4JS>%EnLGNkfBhma zv^JsUfx(q46pa0If%3yd)lb~ro>H%qsk()e!$(U+%F`cl`ew#+$~%U0MXpfI=G zZ9ME0bf-82J=9Th&OTztdo?22*n8OrF4Zc=X;GM=ZNe63?hS)S%|b7G=<*r9sU1JX zW{2K5s>-fR&5XKUB$(ziWPvoXSm05Eyn>@!pmUNkSG5exbTd^0kPWZzvvh&9U)k2g zO+)hkk|?_Rp#L&sQ+*%9XApeVY0LWga*Ph9M=z>xL?V%#*Htdq43=lXn{f7yfrdGJ0C)_P7WBx(zzMm$rJ`0J4{OEjsmONg2VH zF*vTtTD6Qm#LOOFB`c)op}Le$2k-ef9bpqK1Ne!c(kuv@2J2^m;x^ z_ThP6Ny|q-1C6KXtxY}}Oq57xvhda_Iptx-lK>>9zlRK7rvY{cAYd5}V`>=h?}n*iJ|~;x+)uq#x%f zoRyqPZGuOix8ptGcuXml#u_ev;*^8v88{wZ~-2kaKN?Q9SDw?w+ zvIXJUv-zQLwc**K>HFM9WQW%VYrtt@SK$_a>CK|G8qqF3{K!n)S8qzEi}7=MMbs zHN)uF3|=zMO8ApM_3I27YCcdnm3o%;gRNo0xz32ab#|m&k;IG+j;LDA{U~~j>dXCs z`;S$v=CpmK`iRe68E11U@uN_KcgPH=D#q9e!iF5$qyb!zj*?WweD<`hrv+c7BB*V73(sMBw(LpvF zn(ME2#mc2*v=zB(J&261%#yNQ8@2M3Bn4I@KI zWb;eJolwHFBWArAGR#^6jdn zAp^y;=hy~E6xwMP70AmUekA^MuaCXs-vzua7b^C&XlG`L8zcg6ee=3^s+OL%4vE=Q zBIy|M)_YYWgZ7_+AHk;h+q)ruNvXxW3mEw?kK{Tj+Ayak-tJUG<=t z7#zjdc%n1>aLUzsKWWDZ>AWWDp}1j~CjySFxZ^v=YHwioR5Ux)tL@JifA2l2%)VM1 
zD2r6=eCMP>$1CPK9&z>?8c(th+jmATkSv~>X=UCrP-v3$ zG8Ur}Hf*;k*6#F-pL zaG|SFxZ}Pxw><7cM-S^jrZQ@F*v3?zapi&O>_YY--$QL#uT7n4YjAVgU)>KC$VZ-x zMJSux(%4|OkKir1kfBJ$6z~v@;AJST`9zS8k=z)I-W`;kh#X!!^Ls<6uwUsl=|yOm zdUI-s26;L9eV_uC%0_LXz-J$)^qOR_GDVP(>NUeGo?XPGFpEOLMX@>8)R32aS$=d! z?c5xZrsRP(H=Vev0E-e@cpc+GDJ0OR4tJj4xv^pd?*}{8hBIObV%5kU0iKa|@5fll zziurSPH9t&j>zrGxwAPBzesaCaQWt3HbDs(98c z+-s1$|3%OcPAeCNHqda=mhcX8@k+3f+>DN;{{+^8Y_9*%|BYov>0iW=nz<+vxoMS8 zfx0hyY;d)A=iEwmhGxR^Unq}NSPKA?jYh#G)}ObaNj5>NIrl&}p>-!ZjdCWmZrPhS z396(cW}5cpW|=F8ipBT-p<8i$GJnTzQ;YE1sU9K^#oNZbE#mxevi;*2K&{Ls{L0`< z%8hXqOzY5fK$T-!qL8z1D}tS8d2_FEn5I}jT4VYw{oOm;9t<{x;W`hC{@pG`ywMER zKRyDe-g%u5{f(0E*@cZH0ih`|w|m|wqo<;w;SC~){nv)Mg>vQ z&M@ZrL}7=h5oAYv-nQ2iW8d0FsGI)VZsk=R&9hkC9k7qKRa8r z@W4pxF?@cjHmpN6kISkV^8sstO;^xCe0Jf;IOku&T4VyvxN6;s|=wVl7o zO8u40EryRO`&Lux!M0(B)kp6{RK`B$`biy&9I!o=F4iq%E($hD;l+_TY>M|2 zf0`T9@<~LF^atI-eoUKYVNHke+lV8DzF*mUOo>Noxqy=rU(2zbI-d|kr_8G9$TlBb ze5Dn@(54KP@ao&I6-aKdrGuTetY9BwbVeJ#NoG2RrbpN8%aSxk%#um_jwt7PbDZaV zB=GuMN*)|glb}8eqfA{llBe$vJ4#q1>i5O|@8BfBe6#n}1dH(YEEf32dgnaT2*U-Etl z+ZiH1Jx|})+B&#wd-!1N%-&uLPpKIJ2z;_;uW&h|#y}*8rY2cdLMVrpB5JC)Xc1*t zWMy15eos2>N^IoIm2WUr^vX1VbNfzoS?8s#BdO2AcpS8%+|f>Zg>+Xj8XTP(AI&C$ zP3L$A&)jrJtSSt+FgCPF;OMvT=K(=-s0}%>o5sNrinC*_s5{bcfQXcV^@21Zqkcy| zYnQ@~qxV>MXp9`XQKCgtleVyOOPYZ^bD5`B3_PbFf4#WcOwPm|drFw97*b=~yQL|+ z`&c|kiOVFQ&gj|n$;($qs%@s1*ewx_IPd;!c@KZGyW9s2ACaP7+PXVi$OjfYlot0< z(5(d>Oscm)S4cM(Hz=7Ft^`{K38_4 z7HXP}0};6u0V9Y>|5NfgBVeV7=zfKQXE9W`Tr{sk-)j-klHv?Jr^zRJQ;6#>@~U}P zI<|H|&i#Zof2IC%@=Yc!!I`v$6f5wdn=;zFgpOmjJm`e**Zwo=9A|e*>dSm{9^5Eo ztSE;=)V^GLevL2B;AT|Xp}L;GX*5kfLfUCJl9r(GrzMpaB}QB^V_h8NEhkQtLW&9g zXO`dyoYQV0kPI04)im>qi&e4K2Exz;U*Q@WFe$U*0Ud&C%)+1z18o?xkdA zmHV{R0K=zdtjGyA`gzTX1!?-n7s7{JY|D6{OXWBjFOHv#vQg~VB3JfZQ|f`E!+>A? zE~oF7i%J^-sP82BA%fPnKl&(gZ#S&Ev-A&tpwY1nSWAoy7;IzuW|Uw@bl^sWX;& zSoMV#sa9`uSC<-~WGq1(U#98x#14=Nsp>CT^2ikjBKtxk!}#s}(1QDdnvJO=Q$Bd@ z>bBReZq9aS^J}^K?(tFY9v?n*4%uw1T+@YrNlE${@~Zh?qgQkz(T%k8{1ZGx`@lr6 ziG#$_UuIJdBki!Y)A?w;`0i`2w*YZOEw8%c;5vibyv^&Crns#NW%R@#ys=ifw|o;-AzpRYnuG)7;aPl-e_Fpj%ba*RiQm`NrWy*Rsc|?wgoU7#n`oTGP#OwJ{ z@`30{@8QFFaLtIkPseowHP0pB)zKTs+L<@;~$(#vPLI9`2}ooO~(X zuA}&~tL45c!{8MS{q}UBLuPpYK^V=1OCk)E_b*{Q z8=C4-^HF1z{FGs&^@bm?>y@`isuM8&O<$;7nk&h^>Nb~g#gqyfQEPVbKhJQdN!!Un z5`VFo0YNhYG^PYNv^7K#Ly`{S&0JN9Ic_}W^Rq!7g1tQ9Fp9m9WplFP_ zXouE1pm|bA>a?u51-IHEmDb47P1E4^%_o+bR^>%^S@%?b=Dzrg-QuCjRl4Lgx92XA zPm_#Fxn5D7@OH~OSk`0fVI7UX8@n5K1RLtu&F4Ur`O+wVc{AUhoQR(~T#-{Tvv{q) z;WF2InCgM?9(&jbgKU^m1dJ0QA|*7U2_Ufq)`>8Y5=;@HDZR|O%&F}AQM3HYpPc>F zdMjFuc4xYLYwPaqVd#QsEFK9Fp<)pjTm4Sw5#(D*OTq!{ z!wGItTA4KW4l=OEL;(jkz=8CR(st-dZ^qV@ZH^3*|H3}iYRVZo54DAgCVWx6^i?;s z`2fnKMdrt$hoE%DsoD>CqOvTfrqMBL|LbCgG{|B$`Il z8IWT8mblA33|A6G^eZQOY#%iPRCanpnjP-6*GL6^KAVyX|AOqqzn}<;r`c^*oN84{ zmBOzeOv-^o@%!8*M1O~j{oISNFJpJ)!Ho9EDWOq_;ZDgm5zQ)Hg;#S(v+!gkA`*I4 zbJNv6%H?R{1tUg0s|%V<{RkALUG{pK{wi0mT*y<-IaV8j3{ZMM*m}{&xRGoZt;Y!Q zUl?qMLI4X)$)c0_X`1C2k;~jyO@4Bv7+K99*G|cJsGnpX!4HHk{oD=m$Ll9D(DV>h z7brGea2wjkywkrFm92YL1aH5Lon&*DS9pW>h*LedDf)i6sfn_|kg9rOYi(1OO)LE_ z0y>=-exz|EWOe{XOG{o&v$H5oVbJMdp?^0`d@qQK_EoM&IC|CX*e9$X41Q*rAm{P?xUL zvSZL?H{1zAqwk8nhwTZF`f}?-CVEM<8zqC)NlGr~@ZmyW>URs`)?ta4wJ*dUh1# z6qjksLGB>gP^Z7h<82eBhHn>*tKR2huifJ830Y){d^4}BRAwHO&L11VO{0t%_aY1Y zqV-fZ@t4qg{KkkmqSH_BR0 zZ;^Ux(Yy9IUFdUl?{;CNjjUyXOUp0lT!^UwWbnP?lR4A%g*Nt6v&%EMBMnlP3(!|x zu>%$XVEBl`1Y(O6J1A6>V$c9k8jQgu<6~l_$-81`)Klbq3jD+$a6;cHk-k+i#vZB4 zXG-M`Q$%hALl;WC2#dIsK;rF&>7;3?+hdT z{%mtmbP!lGUTDa>Qvo3`e+;@amGUdLuO%*Ir$ z(R2_NTo14R-+w&tNuy*>!)RwC?^4X$wXf33tSbkAX1|-GYq+?XQEm%hzAG_m=ox@@ 
zDi;7{=ULZdHB!-r%W?SyS1&?8l6>di(q*SltsoFpd@`{oF(si_b=uF=pEl~hW^XhE zN4-LQ?H;Md%<`iPOgG$MNd9q*>D2Z^U=(!3p}$X%Omkx}iv-=bv0ftHyy!`u{8IJij| z!u7;bQf2b(2$Qapp=@!ValB%Ai#W{*VfM{M>JRrvBFOBfP6=G9u|c79bs+i&6!o*s zh8p9Zot(bO<(vbo08eDB0>tMd8MPjvhKRhuqqv@n7v&Rhol;piK)fAzf8R;zYN6^{ zBWq7fCTq?+T_^z6)J{>>OKmp%b~|$3-sqZf>avC*Zs*El7f4(nycnf7InPQ(BkrU3 zW4T*%q2*|$hIMpX_(at7$(oW@--B}?W>IkRyPiu8oGBdbHN5;g&Mk6$hhj8Htw9q6 z6En4p*`iY`OKh{`OpcQw$TMJ|uyP!j^!Q{Z4ky#+iQ|2lL+ ze=0v1c4PyeoOcqI zJeaHnLhkg0jH>ypFtq|=7o9Me5l?zMrU?rWU5X(H?2}u#S-mV-k{ak&v zKsx75{W$Z!_y^^_DRm?ZFyu0RO044906}xZ;MMT}4V?~`ubP&8kO_N!KGp(&L(Jl+?*Jj> zCV1fk!}{kU=8#_)fMT&=_+{00xOo6l6Qp8ADZv~_DuDTm#b zaludMnjOZkojj&fQapJJoYB2S)SqNf7pa<9oSKa0N^Vi@2wsa~jkSlZVH(O5t7@ia z=c2J37f+-B3`D>_#KaBMdC29g?vG2tPQq2DY}0VYIs8Hfg^LiJynYkxTa>x-5_u$B zu4>Zt9ZK7D!qSE$7;QDe$f)iN@_)J8Ggwt$M}jpj9R83WrIJ4Btl7S=Ha%VW^wjv>CwBeLPhZBX=OF!5#9PqlLu~PR)@X34 zg!2AlzT=u;#3&W%6|U<-a3VdyE*$-Tnt%J6aG4~3690X!2Zeqe&YIw4jNP%!(KV3) z#Mk5iL#3TOuz8y>cby^$1q5tIU%2g2zh&OJ;o@!JO9_=>Hor3><|0iRm(=GB=Zrkr zkmeL>fR4rmBQyGLBtqHo%(kNZXkbV3nZXZAW0z{Km)SOn2;mAF3Uq6l+8?_g$7hzV#_9IKmYBR_BR->aQ#E z!S!PN!0X+H(LoqxpX?mCE1Xl=4mUn0oHXnNuNmn` zl{3@IJp*>AHV!g=4paPvc@rv?&GN5f&&@8=DjN<{2paXgWNhgUJo#DOKkt9-9!x34 z2-!&SOFf$ndSg;`eq*i$ks)J+R!p1Ob*jZ2r=PLya;R`xm9KL7px^F>;ED(oYp!RR zhZ&oMHC%s~OBo!*@5G)x-v8lRI*KX7$5S3)-SxHUq6s_(h_8D%@NVbu(&Ki2?bue= z4|ym9gUpBPr$qB0Bh+I1tQ38KJaSN(7}$f6NS0838`1^=mSNqY2dZd*AxM5nU=5}z z!Zs;3iGjj?9TP}Ada z7E;gu*b|j=suzj0v2sAPF^zCaVR-D48hb)Qm_Yj1UlIN7u7LN6AuH)@ZaCp-n8nu> z+73R}mJeX@{>*3;|8F7AZ9BjiwbL~`5-5FE&*%|5QDKHzHWJNj0eOICTZKWgC5~|g zd=DC$27*x(B?!<-v!kxWG&*SWT+DQrA7RvIH-pq7K*?Fm$}ubF-kHT5$n$K%-XDjX zW(W_v;wfdS2t~ql7in^Lq{{^HX(3En3=kCLyj(Ym!yNuSCNL@nrHz43iJ<;cdB5}S z?2?$Iw8hfT-`f(qH1;uX<~+(nHn-2R0GMM2QdFxhW&Ku2KU1%pI(;f0E{BN94duF- z+*S1-2X{@JwYp_V*{Tyl7w2ypmd`=0UVP6<|Ff+jIWOUDQ!07nb=yBYdCd=YY5pV1 zK9d~yJC`>mS+|;3eP1f8M0LM)>z$}m)B{CLiQOceB3m+e1b;@$d& zJH4nNS+mqozSy*3yD~NC3n^EdpjourBtb6Ee(M-NLL)SxPb2Wc#iId$RThjV=w!-UyE~}Xtc=!(3hvcO6q8N%6I&>NqWHlX6?)AplakgyL z6#FwM(T%x=Gd3O=ApO*H$-Tr`gOz^fDRx%riQ^+!-cus4L{D&QiA&+k29tfLgs2u= zX}Jx8G64hEK&0L4>7?&aM-c%4m$F$)41YW06Nwj}l9v#?0?|5p0e<2SCm5E{(Zp?$ zLNZjD`2I3|VPBH-D8uIBSqE)-9?*P5y{kTqs`9CA))Fv9^Z4Z;5lPlqPv9}Jt{Ry~ zRLC<^ZrmPSB+J+qWC-KB<*3%Sqz)9HnWebfa(_yjPOB^I7x)W%OI!DP5Bxc( z?B}5U0qA0kp3=6Bd;+aMuYQr){#iJd zv!_Alyj8{PSuH$i2P?IvS)Nnz`q=VtpdGj)Iz@dJ>Gw-mu7DI6&d7I$Ls12E zFGzoasKh={TDzH&%MjhwMhZ`o(1~2jqip;2w zbE51{RGpq>9;%#Xw1h%aktjf+LYd>QilOOjK zM%o$N27O`}<5b$TrBUB!X%8MAgv*!K`zpcF6d`UZLG&$MA^r5<$uhqBv%0@)m1UR8 z8a0GjAoiPpogWRkmn4(C$2-xwyQVyO&)ie4w+75mFA?~zTPLc{GC#bo`NoqU#pPLj zm131;UeKJW40pI$#I719m_F_Jv_50H+NbjxlrbSGiOv&{i7rA>X@Ook`}5Z-Mi#X0 z!z%~3pI&fNA~SCQ*sP%Sfa*|;i(O)5PQ>;%Bhoh!P}Oo&V3OjUZMZF{a7)wIdk7$1 zZ7pZ|jbJyos+j=*jBY*gl-CN#0v+WG}GcWf4XVX8;;?Nn)KRLhdECgNV zYd1pCuIloG^{>gl$~hp-{PWh)K(A$5esk_}*1vdbkcs-H=Gj9qbOvOVlHNY!jv;fy z+#>VqKl&flg}kn^TYvbN23L}Py-D~$6f`9YgC>`uXXn$mq6Y-D@UDe$SZ@QRs3UkO zhnoa?LF&I=F&*Q9FQY2J?W27N1VfBNMyUKh?gp*+hc*PNQ#}NeCugGw$EiMvQhb9~ z3?6nkP`ATLlg}?XDj3lP>;Fe`Q85_l5{7|v0k;QDt7hv~>}cz1gW zTuoZLeIxI(^J>R%1(fQ&9~cV<)@gO6p9p1>;T)1lW#U$OJ4@&pz`-rR-8OkXBEvb} z&SaJ{tIMztHK#pMT$LNF7R4rjvMJWxr~q^eko3&b2b};0J)i5etmr3a^+%qwx<;Uf z%aUxIJ__3?AOz;3nR&bdJc;h?nl%E>xT&s=ftu*;cS1rdf(4Q+?<@d8HqOQa_C5ZBuoFC!n zaC_Ob!5HW8E+ahZ-bGgbz>a#`vXSMFy~kfpwr$m+D4P6G7D3Ppf>8|%0HKRrD*mB{ z{HzJ+KW5CNfJH5jI;n(TQbLJGW$+$(u44Ks= znEFM_{`r+)yzvZ}7`T*$p|0$f?pq5H%SYa$sZ*cR(zZ%Fq3(f> zoaSE6(3@FJ+wa7HykN-TQKoAl0WIv(QW3$`0_@naUn2$w~(Ku0YCrYdAM}3J`67yDmlp!dYSzI1`gJ+6iS*a6$ z&-c;q({PxzM>o*0n2r%S;W-j%ScjrxB>1u1t*T+02uf=o3yX(BNm%zZ3FzJsNOlqo 
z!;LQtMxllTFkbEKUT~2-#?q)_32z-*b>Uka6CtjAVSm3e1q`KwSR3j?YB6cJ^+6I8 zExV?KEd24s*K+A$GWC<#D{^Y%m26S>aq}TIaR+q+v*)8$PG7s&y1ZCXTD9o&lY=%Q ztmg(OqyX_36d>`Iz$1hMiWo>#n{wewk#xQz2AP07_6u6sti;ZhXkz=(CjI{CDH67N zMZp8fR40|{cjqEag3qun%~z?@D+CJ>=1V^A;o;Ha8M_-eGe?p_uws-duvLCjRrCs1 zE_tZSyo+>o15=#?RZsdN z_A7ajJXep3fbz7>f@U5g7OSv2TC-kzTEI9-=R1Dk{o&FmWhPv}cef*=(g&~ZUMAJX zOm`iYG#A5LSLsJnH(}6cY=P}lx@MA7YeIb{r$~12(+~Y9s|ZtLf}i@&6Q=PC-hq01 z0-#@<$$JnWR-CNbtg;8<2eCPvEDKID#KZ4FAm&GePHYbQuSwV9F?%OL9cCl$oPX?4 zY=J;}c>l&i{1hs4-zg}@b*f^aCotEs&=JdIy&NCz;*QL8o_=-;W!lsPwB{Q`n5dS}z!h{rBl!1?Wdbpm(1YYugjax0jw-57$ z&%`R}3^h75M5%3HGcZn{*_m?pe6#pp|I`!)4!1WH@3jxm1Ess}tf7YxI*%T6=eCKF z_Rcy?ZHl@^9E;|8bUz$);JDaPWF}tlk@Te64dLp zHZ*=$Bqjgsuqc}uG(uwD;!ZAlOV?9gA6cstFAtuRor@EfxzOh){Z;L)Cya)mRXryQ zzY+HDx%tRbei{j}<#sG1Dh5=6sX=++mB&Sfm`Y?1t~;yXGl}B=kQEFa^{!K$iQzz} zZ2Hg*UuWZMg0#5%cS@11)b$v+?XaJ`RnO{E25QzQywy8H;;tpA(Bov_@r0^EBogqL z1QvQwz^E&iS)x%FD|Ugm{-c{7Qu&zrj;*TFm;2uacPxbgEjWU_IW}gDrDd!7kIFa+ zDR`{rnP;$UVyJL4A|zgMOc_VUJ^-RK6hM5;%?B*IG{3C|PR>4bcwOixgBQfl1uYy^ z=GHnfFbQbg_0AAd8Ld>+IazS(KgD6`RVwihjn!KXMy1+ZiXkNR-9Vb4XMg|(0I_(= zphW=;{3d-JD&6`Famzo{oG;U;|MYmfPc-~ZqL}i<-J491?*QkJ{PQCK+*AdX6)=?Uqqn1Lkm z4-wf)nY`t2UqnUE^{8-pR!o-j@_@MyMPJ{ zj<23o?kpC?5inuK>XVBYF1Vqm@*1dMj~>YTV#TiO7N-Rp0ZeLbkg^WBP%KJdi(R(`>&@GeyCM1#9O*s!5&7S(+WXw@q_RZtiA<%C~+Or#mH7N!p@IHi}tEhZ7i4c=^I#x%n*1cX6cXdq1 z@f!}b{9M+Y!YnRtnODvfD^BMQg#f%2qiXESiAiVFSpM(Hbkt}%11+5XXTgeZqKIih z+idS$82{jx2bQbSl0rf&!q%m!Ka0ALLb*ca<*leM!j*%BX>w}vE(nk zhjp!^b*9E*k^c-)9OmCWRjN*{5>zeRpa<2NJg94!*RQQD~LW~tdOTFcJ=OQ zxyh*tXkKA*vo|fMY&ay+xh9$y1Wbq*2Db*-A2P^K-*p@=J2wF6d>}C&n_6}(8Rv-Jj;Ba=tLmHs+T%&PE1{YjmX99&mwd+wvo6ZvAyv)mJqcO>mZ2 zVq$7N-9`WAXNsoJtU`AyLiv(a^^&Eh7N?lxyr5kUBKRZ?bWD!Yx@VP?LBun&rE72Z zuEk0y!B)awKl7@UOOBKgD2ga?r9yJMWVBH@vq!OBua9fAsi-uMT;<#yEzP3(2zxlW z1cO`UmM(VwMN);a1?<+L+X2d()45e!~yl>%Ua)){2+dzt;i0!q|qRj>OvgjT@?eppk$zlcnfQ0?N_2 zyP^i_jJiH;KLZ)NEQ0PJieGT-F;c4em2pZj8(WenUlQrv{6@%iaRN$ylq$A*A30$H5A#(=UwJnz#8 z>1u#{hHuL~5!6Br2)&|1#ukQ*GMNNP0 z=^p=>KLmMhWr>2^p8ZLYF;}(X6<_U}C}X#aW=o(d`;oIAd)=&cg34h{QP9tFGW trH}b Date: Thu, 18 Jan 2024 13:26:17 +0800 Subject: [PATCH 097/248] [update] vbench logo --- README-pypi.md | 2 +- asset/vbench_logo.jpg | Bin 445589 -> 0 bytes asset/vbench_logo_short.jpg | Bin 0 -> 367757 bytes 3 files changed, 1 insertion(+), 1 deletion(-) delete mode 100644 asset/vbench_logo.jpg create mode 100644 asset/vbench_logo_short.jpg diff --git a/README-pypi.md b/README-pypi.md index 66d5d699..ea0f73ec 100644 --- a/README-pypi.md +++ b/README-pypi.md @@ -1,4 +1,4 @@ -![vbench_logo](./asset/vbench_logo.jpg) +![vbench_logo](./asset/vbench_logo_short.jpg) ## Project Description diff --git a/asset/vbench_logo.jpg b/asset/vbench_logo.jpg deleted file mode 100644 index c8f2f889d8f6f37c1941906cf82291b3e690ade4..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 445589 zcmeD^2S8I-*Dnc6WH^EY0dar{ibV{lC>o?;1OjBBHl>QQ39X6>4lFf*qm6(dK_Mu{ zEQY0O8(ON>qSmcnNr=?qsESenS1YK+!GG=x!7yxX`~BZWllR_zFZbPh?m6e4d)95c z*me!;!wd=y!UzNm3xR)F+jWf2SS?(HVJsHrfnk^xM%X08%-{?j!5ra(VMH0xSYj`e zk&Ne7XVA^I8$^0&VBnl6PGnGMI0LVm=oRqhbOx~xhAm!_u$&V--F4Pmd{@gVsJ%DV z9~+NNo;!aeZ%VClStuG}>%5*99A;TpPR{?g^~ zOP9}0ShxV5HM9@?3hQfl5J8MGZqlTQRJ`=>_=U#7P~Af|4qw86b1%zYzG!~p@|EpZ z1mbuM`@QRzb?g{~VUa2fGkf;3jFy36vN0HDpZl`ReH@1Mc>}`^?`W&U`dV3?v6qnu z!!crC0;w;d?OSXpfZCjZ{~4P_Ad<|?EiA2iS@(tq%KKnM0*OR4Bbl3<0m2fJ;XY>8 z*Sz1baRCf$2FC%-WWbd{bF@66L2afi^$no=*r}nZQFmTXdM>qFT6siZ!$9KZS zNq&KEGJ=9bLYZ7%^o*I%#Q6&rE_zoWj9ZbAxN_C%HA!jfH*DPW(Z}MgIg;FM+w=0J 
za)nZ*E-Ef5{p@p1`4?YSe0AXHv2Tx`IC<*y8C|vhLe2LVf2h5FqweOd`k!t$H2(a{ z!{(MpkDvV7iuy&sNQQ2~x1ncU`qdZuMKm)bnOUNK5s0hcL+Wd0K5U#tzW}!7+-3cT zd#6~Dr)Tfpf27xl@tg+^^OkF^2aKd$cl#N24e#0Obu9Hy^{k^~FZ*=~3&e={ABjjL znUhE)a|?6$v#_#2e^%C3_@8zApAG(JhyLt4{@S1t0@Pt=TnvlB05m8W(7DLK&>oJ26skAw_;<YX&8tS!Eg{7msd*MlS5NlhW&{JIy* z7aR>f$qyXeS{*cwu0(@o(=WNFed8EZBHAs~VcKccYM>{>q*PJ@9S z##^L2j!DbQ3Ug1YTA1S>H?AhFX7g~TyfzG4!yetL=gBt@o8-N8+;hRhp*8v0L5u-! zJ`7ox_d>H~OH=qM#c-%As|_3XT&SMeqCBORj@hU@3G6`k@q+L8TgY~rw;ax$w)yPc zGDLh{ElT+*B~FJf?-!Okli#wtlhe`jF{%H%_+1!Cm^V|T;vL_tW-l1H@N2{zi(kzQ zO^^+-vA=|^|Er*#_y6H3>EC!1-fgRTes2W1c5C*5nE^-h&QOIEpKyv?Y(Zu-Fw@VJ zgc9C0UZ5)E5Y=nw0qJ?hxW9Qs*$)cQ*H=+rE3GAy}GB-BOc3Gj`0T@8aIMV3XX)+=iV`VgI^Gx1$Y9(9?88*3F;A zbH*g;=a)qWR#~i>Uz0V7o1ReOLDE;16PFG1FS%B~Co|VCxz>@OXEQ>~$!xmua}OFz zh?k2gp6w+8o*Uryr%-I}4qv3M6Yxl9iY!6?Ijw8?|E>)ab)N^+365M{J}=vIa|Q4g z`{_#njJ=}(KBbj#hLK%BtcjfjARm66L5`ZmW{|O0eEJ*}Q)B_~5aIVf>u-pN%jlET zInrpEC903(5PHyAr&O=3nmSj* z{X*AQOuPYawV}of`2}~6#(gQ^q7J6$7f6D%*NV)vlc*9$FF$Q*T zT9KAf`L40)V@dFf;BEfG=9k^I^lhB`*& z-wn>c%cul0(ZchQ<7>B!uiY#9^qt`c$>bPBSX;&!ZF`wAqKP7X$v#lPz@|9q^lOffWnojr zOByFck&4y#6p5Ihn7$&cgUpTO{-1ap+NOPe{}hN=+IKA97(P(Uq{xJ*pNM#fCLc>8 zR%A(3$VF~fsh?CRFX-w>0 z>EVrAe3ETv%0I1~^>AjDwtnyK(q?w-m^FKD3*h=f5@p|jDHob)a2CG*GhByOaP*08m_vUr8%9{P@@c)4K6$mOAEj7O+WEIGfOC=XvH%>H zGH%G_n+aoo4*Ho5C^ci!B&A%u(QJP+t#v`N?NpKDu@U6+8&z^?EJKqiBphe{n4lj~ z&RKBUUga;OM)DU8M_eC$5|o!Iv@_NrA6^eh;tEg7WGjape}gQE5}uhhRI>1q23}2k`AYOYxQ`I=BUoV$=I-yWBxf;`G6xhkecd zn#ber>vx}w=xJ83RpJR^v3QKx!Q(^P38-_Yx35;LE-sc=8h*%*b-PNr>fu4@R)IMh zA_Tp?0J)PLL$;YP|NNZ5V7T6+Wr+5}&?(?L`nLS_h7wf>L+dkJJm$4tunQ_@_1L~w zc2Kiv1g$aije{KCK0w>yqZn7|Km3& zRyb9E0O(rN5OrasNv|gVV?p)*J{EuY1s>Nk&8O};{}SQlVjKUDfCm0RdgF5R~Fs$}s}rc_)QI}A<>B2|$A z4Z&Uxx8-74k8O(SAIha%b7YCcMHMBySWc#-+!K+f$mA8aq~YSZmbB}-)FA_4mO+dU zsJP?9n+}CC6HbU^4`P5$Y{?uwp5CP#-TU81Od;4a7fxwbM}M0S6w8uN7gLmoe#40X zOY|88V=#DwDM7>3E`UKXT@#w2`KJ$JksLJL@ZamJc}|siXc-Q%W2>3upHxiCB>uKa zQMw~@#q&B=DShntu<}FHsCRul-7Dv%b8mxekrOX#_UScAoz-ufb4Iww&KPH^hJdD- zPc8_j{wjC%7E?-12aFKU8o~;W-h7ApjR0mg8-$u_7hTM~`Us+Os zzW02joGNKi!izmiXM>`L}OaA&N%Lfd9TJ9D+@JfltAb3Rxbor%t zmwmWFrT*p$&2{*d`$6}U#A00~bBMnB%Qwmb7^4YOZ?!0vx9Z94GiFb>*JPY+ied*- z`rjij;JMz|zD#w->??A)T^;9*;W>1_9so(~n;m}XE(fOAyN{F}Ypu75R}r1d8iHz? 
zONxumnN5CMmr>ThWe0opxx19fQg;DMovVDu6`|(acA*ICILa32xYhTgD&IM*U;LI z(K72)3DCv@b~cB=82pm&bPRY;Nbrg>%eLz04`9U4jF#2i99XjzG!?FLRfMFKmA;n| zI;_&cg4IjZO3kM&s3!cR3JzkC32x+bQf~pFnPVr>V7%-WA7@~+fkY++hRP?;PS#x7 z9K9S^W{1T15;(9vcbItF!=)2KlC!Wwd<^qMhBwk9{|k_?82W@wEeoNZ5{D0R5*~co0OZ`dZ{prY4z$+JQ=t8B4Uyknz=PJNEVcp$?e z6pKEhE769^#zq4RRkNofk-*FFT9q>cbE6C{xm{9^$@gf3DR%3=7M6P_Ya>bs*&v)F zb`aFS^ot@+xZ-1Vm+P`oF*|pedpMt4vsKcfUf>2jN(H82Mmw`rY#p8nd^B;GOqecD zm&QnY*SqAp`3NuADB*>!FZ)@1o5=L=W7mj4JJ2z0caRU_rjjW0B#+}{;OP4q*i88+ zY`VkkJ>>R6tchEd6rCg_%oFcrOpO}Dj=rss^d?6qQEuVzm_*;tm?_VU8aX$Ibtw{v zX(rKGvq@Yio+R4Km?hr@;>R8``yBp;73dWqNw89YCPw{Lti`!0SDr(Uk(X%0B)uik ze$b{{4iVx>^sg9PXwz8koO{we3@2HSuVWSw(JINeaRQIV*@p&^)45Bp>}wA~Os3RM3#I0#7d+qGm83$K&tW1Lk3WZZ}q;@K<0GMoEdgnOye zO{YgV@2=ng!=*Nbp6}YgAgLVki@Ut5>y+|ed%a2JU6{o6xDgJ#LmkouN=Pf8W~#~n z)SJF6;Jvv5b(%EMsIyO90C-@wp?$Qhl6dd#E&(AHqX$tv7AtgF#dkb@6^X^R%S;{Yr8p2?eCsS$p-7jDkaUHGt`(RYjdy9LE@KZYy{vXpTZT9vny9|Fl+PmgIM~&Rmv2Pb490us$c@?#KwqF#9QsVoR{blXu6z6uZ zaq5YV9*g%h1OX89tOOv#;~*XzJEB%l8y92kMX4wH41%SLIjmj)$Iz^b8Tip(V1O4) z7~sHiH-BE=h>O&iLe7AO@RBQ-L*GGa8bDt*w|(X$eXk1N>EW=aw@xDBBi$@wdetSs z{%B@2He*%?I=H(9#=C6#ZR+ z%0?Q;AyAjkNmSNS2omliv9A~_6?z;=cbuYX@%f+<1g14%je=jc>$UNtwKJfQhOR3> z?s4I?O3`(Ubu+Kt2P=%TUe!*#*}T5ev1SRpgXTlbN2fSAj6K3yh%MSIIzv%8?cfq9 zB`d9ct!EON*$m_KNVmE|4Q9T#!+Oy}mg`7=+n_^~tC8fYwb3KrJgFf|7aa2=yi8w8EU$okchj)4)(Zl1N z)Qr^4c;|zTFe=63?Xikj`85V3Y#)a_NPZ30_uL*{oK!qqG#H*?@8eYTWpnP3-zUdd z72^OK`&yN=_p#%L%T*UC7b!hdz|D)jQP81@L4^6+ACHCUUZ>^P0vWRq2Hbk1MD-h3 z5WR((%tBZ(7UbcRlEZ3rJ&<_FVg8vp++eugoB|RuRHA@uMkNT)?Ul-}y&jtV(_S#v zHev*Ed#vi5`3H{qN!BZmglRbMTWBL|DYaDW4D~5v2m|vLe+XI@x{Z7nC}v;|D7s5t zOz!z6c8Ds(j(c`Uo;X&UNTR=Yy{@gsgta5tc3uLAoP(Zwc;vS7qC+h1(iYPiJw!{Y zgQ_Fsq9K~pesvK^x*5}a|5Uw`weEo6x5ZmtnkqQr zGKF{))^qA259)=kI<56LQtx@f)NnK_U8kj6Sx^PaQRIGA8WYJ1m{K=I9~R=rrZ+Ga z34X8B|E>3jHG~%xO3qZ|0MDjw?E(ceqzmX}G6ST@fU}P=a&~kR%@O^&{Hz=1jYzI; zF3nwhxS=(gS%D_-N{PPbD}V@?oHU-iIaE76aw=YHYnoBf?C zx^kO%5wgU^4h4jgRmdx+ier}r{mc+0lh5DCxUm_V#uO_@#aAjX#%0-pH{Zb!CFzj& zNQ3fJFuotKAfzH9Ixd{@FjP~;pD!p{42FS}%S;doobPB|t zMZrZu1yC87<+SmG8Pnz}YlSfux=hVM(B6)Xb(iOZF^#j&9zPnIrC5!cg_`-YS*_Fp zu&oUsG;=3_t!)hHwt6XfDS4KqHXKZ8lpcT1y222_9MVy8?pPy#3IhVE;0c=_x7E3Rn7hJK6mt+9ZHj)6e*@i&MLh zY>TF=xh9y8Gz_vVc&c4)V;vO@5HD$NQ6wc*@3WH^9P$@jE!51iWQSOd7{0AT`(yQE z!gCMR+e>)jz5AbgWC82i|D2jjcL2?3l>YQ%1{G_}jF2mfZ#H^F^{uWh^Df*Ocd?(& zW#hK2W*;lGT0$mpku`N?4w$$o6}Iy2LZ!^Ih`(+3bV9xA^5OC^W=l5a9irY!8KkyZ zsZPJ?No@)RM>JPpp`DZn6YtZj&QPE5y)>WBx?Zy@CEUz<<*gQFWsEg-f<1qNH9vE> zHuS4x!R5o6S^EV_I4qgNA4Alp7z@oSm9F!^J`*x4NgF#%F81iG5K#GwJJVd#_0f!(45o7@vBux+bHcDWcM^ z9UHZ=&Ws63Nm@Y2PI$=z^jsRr9|N9m_apfeipmBM^xRcNWkafKHe-$L)i6Yf%A$bWajep`tHVi1Gs*_9GrwZ>uNj8voJ(zEQyy0 zYNKixX)4yzz#YZhki0uK_y_Q8l+30(NXLIF(r$1zFTy zLWodkO9})`=xt2_mWU;)mS{vyx-a<_8?_Ab0K?EnT*4WmX0Zx*U}t3jA>jDfI+k4I zTv1(hlgDeBNR|?T%rk~$uz)&6^)DwEB-0TZq8pFE=LWpHC!Ink>_z>mFsHx-;is$U z5B?OL)&j!@D+DU}&PtYCzATN)d*55AHcQ}w)n+H~?jkWW))Z?eefXfH)Zyt-b-UTd zIBFq5LV?T-t&Igm@xzwb`YMt2EE;k8#lAZ`O&QdN#~Zv0cdd@9{G36a(W!RRH+4Ga zu3{e$0fPfM9I-D>8nD1F3dFAL9=~3&Yvr!+oO{*u8QmtX9{i4stSc$Gahp8y5J%rJ zQ4pGR+p?25Rrp97`}_xvmf(i_&*OH#&y1nx)*k_lb~28k5tx#oQaV%P1|!+CL%Qi{ zQSskaW)(q|DC@VD4SyRTlRvBrE+7NE%~22?i6$Ry7Z?S2!%tJEw?AQq4D%3l8kFQH zcqsrHk$m|Mu0Ym%*d}Ij^yN6l`_09IUZt*+&WB7QpP7lXJax7-@ukm+XkP00Fl73c zq;qD7#;!i@X0Vi#s>{OIqxjp@be~OT)dN)_S8RhUsOooz2_yMTZcRRNTtP3#g&wM{ z=di&t!9wy~H3)axCfZ0>)($N5Q|%k_g~W!VW!52#*^JW}YNkDt;!hzSBD49gCCS{? 
zFHvimm6H>F#ng+YU?Pfv+k&Wtzg1M~O?}F;_M%?!6kSnEmW44!i{Pyy)JEz-mY*1F zS9oOv?zAW0`98lza+FSfS?DeN zB|XPIXLx>2Xn3F$+~{8mWYF$y#lY8MT5>+dJZc}y5X7_L-@KDM>8TJ=qt z$H4(4A7_}PT5GluFOb=9A!KprBL}A>K#-RUBbY&f7z5ad08>p!6!p{s$ik%6SD5af|uG0Wh5;I)s!xhGPl3!WOxQ1{j$o(It2A?YKal!rzvL ztUTSVjaZ!u@bt?s%Z3!xfDZ}bQbCQ9n~kra!;o8`!HlEy2S#;UKxY_i&9$S$2vZ4Q zxaMfzx@0XVV9ed~O+j@mSibJ--);*Ab}^6uc6YGCZyzomOEeUFl4H!kdjKx@l!{&k zR|3rfb}^h%)5*MihyE?(s$Rcnf2kG}#gd<>O;m6KN_ zRSl{yfMBgySVWC(_Y3=NTjr~IHH3!m6lHvDDPB_Fvb(if_=^Yu4ZSl0y-EAkA)$u5 z&(7d!u^3dh#%7vA{OON18TXsQ0pic^Hw8UzQM2&|D=$M%h|2b)3#9M(UNFWsuWrhp z?ZWT>&IP=ligWJs%DsVOc+k;Qmej0L&IX6fKgr<<;0Fql;KD)~yw#&}KU&#MM{cF<%$G8; zXesRsg%En)4Gcfko0v;gQ`4wkCJX6~S7m`IaabNLw)(d0)43Znsr-U~(9Orq=B3o+ z9;l+7&~_}Q4Q_msmnoL1!CTUD3GLu{Z(zNTr4w+!L&I^F04gXqfrD#$B$d9srPI0`L5k~1sj}L6J@<_%f;t9THh1EOv=orShG&# z9&RY6D$Oos44V%TUmwV(f$xHv9UVIqTo$T#KObZ}w8NLvxq@?#T+6hnee=w6s$Tz8)3}ijZ^=3V z4(oFgX8}_yi=nS75-ye4(IB1tj6= z5+Qguabn%^P{N2xkUL&IXBCd$Pz**dqDc*x7+)m0Vs03g8q9gT4=wLOn*0@(LIu;of%ujTf0F+0C>;cG5L#qO< zrbFy3B=rmfw|C7?PfrxQJ)r1!yGBF!PUP8`b$uxil z88uRBqV?IW9?*Skkxv zCcthns)YV6U;{4mF6#yXYA~M(zV&}dnCp|C&RYwCvb4(0a}{$-1Q3DqyOFbIX0bQ6 z7PetY`8;S~k9_%Gp`0YSR=*1ZbvfPQY|U;kTr`J&C(gDUF)y)&`DEg5QNae!L9gFY z%eD*8?>?ha=`BTz9jcanWeX{ob}`Hz+C@kj5i2@B<48z|vrwp9ly%}Dl|v|P8Q~c` zvkx#DL$ued&G%mTAfv}R$M9Z&&LZpS-c*6iN?drfK%;M7zp9jjl-|WXp$=ZAjy#A~ zT0$>3kW4r0wsk|YDyX$P+mjonDvx0>sNF0kw~Dwkc1E2IfR(k;FY(XHy#Eout7&2t zP2evUd^8uJ)m2_2rvhb*+Tm%O@=bHgboOCKyup0ji5a^>u26_yaW8V2*)>J#eb=PQ z%8kF}aE%;(Hmbqm5dx`E+D!wScB_<2+HEbp0`2OzL#Q5&bBk~e7RuU}6XyKNI-2JC zW1%K{v$Y|jXBgmw!DG{81{-vKNOH;9h?4gYLbRSbTP#(4*z9hWpeS>rH1WLf0DWd} zz==)?2#g*6&7*_wt(IbevZ_BAttY%u7RVSIPvQ>?npv znw(XKh8$}2qh7QecN#j=WuApxWf)me=FXsupb`ArHAsI2*9iTeH!k#l(c-P4|0~bv zgU*k&_4p<39eHl$Ca~qJYbhl13c>)~V9(QoGrKXm5}esreyN)HMsW#66+hzA-Jl=i zwu;A+PBQtH%s=FULI97(2Pj(S;ZeYIq;&%(v%BDAEKIRB^HVez;xu5Ty}&oDSkZ*+ z@$H;dKbN-5d*k2}4rqV?FS)pi_=>4eXdb?6;|yy8di&%^cm$;@oiZz@S}L;fr9S5`m56qsop3t#>YxnUW?$x81*cS!)y&f zkRVUdz5epl6unK)anG3hk4tjpe-9-2|EdFDhVEs!i%4?xMIfD3IOU1m81VyXLB2FBs4EV(;kyzz zS#~fSP-}LkIp#U;yE6>0{oh$gzs;T@H%l-g6>vN!O=Dbj$v$f?YZUpibb9uJ+9weFydi@!mYbw=DR&7`jjW>i!41kN+5kUsfNqyaVLLcCb+?G`r-el=(1D#Irc-#KX(VF6 zmK9+KujY!jinpvf1Tn??dYjBPrZZ1Z=Vpd$EZiq!Yd`YmB%Ls*C(r#%)K zclm))eu<$*Ozn0=?szJ)WM;s*y2{%o;{pM1F;=A@vgO1U5mLn4D)|)k_`V!cL*{DreVQ(` zG{g)-CWQ7(4RY(Hq0d;{(aU4_=7-S8Zh zQOjy}vX*EZ`fBi$B!@~U&Z&||N_mPBR(`N14G7%CxigaNWNh{>{=BOAog0!fRouQQ5>SdLJIP`cMGV+Jl>qb zOOV|Vxr2#Repvt}>l7b{0S|*7hDmw}ioXP#$0WKj3euz?ebOigH+NA7;7lmuO{9C{ zI+)ex#O?`yt$0Hv1v@{G=x5Y0^&la=7E}L!{#}1qNuJ9NtI+=<0{`X=zZ|+@R6{Ja zzdsZN+)o*WKOSfeb-E)7IkIA-jEuZR(V`v!$u z$R?sfg@J4RXLiC1j_W+OqQ&BrC_+F(_*Kg|VXP2$OW$Y8ungx#8|p=5&j>$j`nZVV z_lbMXLf*BR+wXHf>Z{kTDvlEztcJ+F?m;(>uI)sNEPzp+vLJvll}{lM&U?B~I`3(& z29hzRJ!t2Z%2Q1AXo#W2`RIrQb#=oy=Cw+2)^xJw zFS3D*m*CBp((6^j3CyoiI9OJkA>Rq^aN6bT{mtDbJF&mWJsSw2c&bW8-h40vd}uz1 z{FMm!px6U3kRDlS*YdzxX;Sr;IAIe@BzfKS3H$2SZR>&WQ4bbhVaW_$z9@#_z;@a! 
z^Oqc$G8p1aP>9MiHSv-xdX08Ug5ergBQjG z&c=FWFTkMR9ZTff*M$wzT9l_%BWQUv$KbiAYgENmvHDz_-C#_O-V+I5r5^$F4+oYf zBNiT_O0F#nO`$%fQ$l;OC;yU zcg{9(Dy0DYLe0?Fc~4JX8c;yxYb)2>m*<}!e1Po^EyQOMdiTZ4Vy5d9rLe`m-wy+x@j z&`i{C$`0Y*B=CHO7h=lFtzv2~QW?E>_=l2%+iRrfDwVT8ix})zAkB(8b8u+{ z1p+L*1*c1I5%$&;pDP^sJ_YBI9L@sh8_N} zf6bP05G4l+k@`_(F7qI)5%e49BKc8guGo&o8cktuay||D*mYrvfKq0uPSUM%u6UbD zA$Ykso+yD^hQ%?eLONExyF}h7^m0d@%2V@4UG||^>tU@yDx1}h9n^)q0N-$Oi!Ohv zXo&WxeXea*xegFhiZOLLIJ^tUjh*cfL||ja*n$8Xit;C##uo9sUO-VD&b9{8Zo=-O zDE?d!3%uZd2irsEB_4iT)r>b`5Vxix9{wNyfyp03xhu@c+|c7R(PHp!Tm*w=I)(RO zjDUqM;&43~iaN`G4mO8M<2KILSM@Qhewmg?eCYyo)9Vsmylv1VAMXbq3o7|2I9Tm< zS}77|foG_{N995jeq~6SNjHalV(M~Y#T)!YpQP$y??T{NcA9>JXi6uv9|g7rHfc&G zW9J;1Tg6zWcnvZCw-wyO8o~nVMS=Ew@Z}9SfRJRM^{NNe!eEuv*s0=)svDvT31`Q= zttppZMd^Q))0t44_3EZ4Tx#9k>nu#uRdmC1uZ$LLe!PLjE=||xcFyojwL0%f#tI-k zizoS>ndcF;KHXdBD^>yspUd+ju5}q>)A9c2o^w*9E456N1VYOi0WRPm`HGCJ;8jQ7 z{^UlnYBOZ4EfiBesqATBkRM$SY%AN4V#z{F{4jg-^(Rf?=8R0OlWDP7@50w5xdgRL2XOqaeRn9y!){@b>cvhctS` z*Et|V?u5mDV#{R*1(zT{QpFX`V(`w4nL9{Xlr3ecQ*(wyTG9$wltHpp&c~fNkXO;} z+G>>Rn?gl%Wrs++shrkQ3^D=aBgg0jQOeqq+1q;r-@mjyQn?fc*60H z(6T=`2^V~^lQCIzMYBHEF&R?(Opgutoy?`3M0!)&|9hWz4?QCwdPE_^@@o?o4?)3o zeOxq>8%4CmJ~Ig@?a$(uvJ*nBZEqZ6BUU#E1uq-4ZC#kbQ1F)PiOr>6E0jF5XJ*)q z=jcBdq{sUfdX)2*@bu=wZ+8Q3y?o8`t$tkL*-+85UHs{*>&1aO6;_?8mq5saAu}zg zR-CDV*skO4FmV)7b|P+B74K9ph*AJ$iP9=>7yqMeEj|m<_1-q~q{Jz^_#Z5|@1#$7 z^5V&!*tblBWE?{+Bu>?l{N>Fu0Uzv(cc6MRv?6r`T}4Xzr1KLKR3`+ojj14w!Ekoc z&kck}(%x)u5B?*EW5k@eR%llDf9u#WG16#t^)@9KJbEyQbnVT*^>v1B zn>0eWdz^;Ncne1`Q-FmeakTVt3R)vYcFts$#JM592@&$)?r8~UBGny@5H>?vARB4m zce4d@5vK5`K5>z3uGvYqq92E_-K^%~?*dr_n5R0{p`F@+wwP_4+eGhu=#cP8rk_6H zXPwA7rzt41%x&yAxRjVjMAV^oRna-?0P(bfL7AWzE!%Glxh)rG#8C4+W%n*cVEYfY9udk+_osKpP{O8Jy?eF)8&?;wQ#7=1l zgAL~l&nbqZ2PHNMis51m-x^o>TR2vIl9{P=rjSsg!X5<9bW8Hxo6Evg{sP~RT_;!2 z$QF6KAR&+a^pDbG6N){qH}D#B08tDwFLxUNe=HyaI~Ld+ZnSXmSLFw6j z3H!J)^HeC?7FTce@$m`nJFZZ3yUpe8j$AfK*2CjX3}zG&O7qek8~#9*ud@DvkKPF%ha^SLg?)o z#7aW&E?vYU z^GXEe!;xP@iD~!(&eo0W{oneyq&Cr~MCtAeVHW$L?_496^g|O#7O<1SXX62seM}f5 zuA%inP(#a0;}0+}%xiSx>3R1w96NJQWVKI431;vF-?*<+pB=n~UlqBO4{oB{S<=W? 
zejKzriCZHYH2h@Jp!WwiarDcd*D5RDEjTeQWX=fKkh3Tje%|Af$;c-O$HVug=wwBi zl26PIzgs?T7i_-fbD%l|+X3-gayYINR1&@5%+iU!V_!VB2_ zNN3n!aZRBfE}Y4qm=(TT?(%MVFl?-7*d$j0+pQRO&t(JM8Mf7hO;9V%Q|fc#)sK|w z8={rhLn6R4yz?*s-=qcyvQQ$sEZ0{-`G0Au5!YZn_?9qQIwPme+-Ob8s35`*Sm)F-stwIwz>AT{@Z z6z0i6s@b_tMT0)6WSwmaUj%XI8WHvE<4xdnKj-VIn#%GXGMkaY$1?^QLV;5@43?+k z@}?nD)!2#GIHp!TV0Z>I?c=uh({BQ8h+8{*?ka`-n_hE!-31ry$;0zu->H5VfnR>x zgCy~Cqri5C`bQ*85KO#UiofG$p)YKCb5Yqi#6X?dXCKEgGv|Pt_U!KdKrb-}@)z9^ z&y^9CQ6a^R-_E~fkrO-Ny(F$vtoZY&Fg*tx=}e+i$Tr$kYURA`WszQeAzQUXQO0o( zybYZwbMLlMYl1AoT-9h4tmS4ne+(#N0|2 zr1aVe8weWGVJY$;eSBV#ZeC`r!I`n>9EqYj?)FkW_9wZn)7R3xH=I32KE1FYp(Hl= z%i7e#sY(=LhNpk^h@(Cv*tLh4!8S9nYhRAAq+y&Cb~A+KdjX`S`JgiTgt?o4gXA7A zQy66nT)yygWX?Vzg{|zsCeKLe&2We9^gaQrLuY3NlR5Vvrt$3IorJ|7-c~8+9H7Px z0*ZaPf<*(BeSo8zz{D0B*HX0V6152OLmK2Lj&-JzjQkQG$Q9t1m;u?xkb5>tRgA}i zoB2~EkDjV!vYH|NVE=P+)GX82gCbjUZ7nC(FHI+oaTK9F>i6WCK~9xTWo}XZ*x)N( z)eW+d{DowKpl@cm)7N!QdY_TWjf-Okr-i1?K>K3B9zo1+RwoN7rNpbjWrJPHIpP>u zj6D0p+pQdZR|wmB$&s|e66J_s!OI5X_# zZG-IfrmomXB1>I^sDDz&7i&X1q7ZXN)NCvL^6E6ckT7lp?O2U`4`e52$sA=j3MV#2 zsLMm_P8Hw0wOTGcNt6;DsgbnqYh;`oyGQ`f@Z}w4@Sa{GS#;d$hEUmzKaVwZV6t|?I+a5Z2T8v)pyBP^%5HC3-R=Eu)YVF^~{pv@3LSCkGen)Wdt)Y_~LZ37IbaV zKjb>dmV9j)y;oY0?byZ%oQeVDbnBIjHz7KO0Ib@ zPYZX|>!ZvD(PH&r3G6uB5hI+Y=}34w!tEC6u)*}ySSQZCqT*|gtH=etZRFXlKeJA& zKN2M4d!P0xd&~-q90{v_CDQbCwkuO^fU`vInv&F?qXUkVqz&b7?=u*s&j9R!sk#hX zjfdBCXPF2wxQwgd<#QiQ=HN=f0%uWd@vz&%uW0LcL@!TqCbN0w$e|}J1Z?Cz%mP-8 z2E#g%1Mpm|wm`Q51cpUkuk)mRuu$lly%0%w0n32kkvfvD1l&M-2H*>K3q)KH_@qye zwdyM*d99GN^55FQ8(Xckq`64W2z%|ndE+-^!>f>TC%_W{Zdebb%9{?15zQ(c>j5|) z-=ol7`s3>7aUV!OO#<)jT+ktUl)ts*p7nAir7yA!!@m-$Jr#39F{PSOCF5kXNRJ8}=k z@MUz<-)WKErSC>K5bVKYJHo%DsLWy?(sp)QGri(B#^1i~kr14;x*^=Tp-web1YL9O z3iiRZTZ^JkHH3X$yAf>@^~2o=>x$dd7;8=RAuv_=?v*L>tTnv#HrkY6FB=t!cE_4J z3A>`1w=*P39GV71+T90&m+KsgJd#m-_dlH*npNJ&xG(@Lb)T3WEV%$CyQ!M5=!N1m z+^{vN)7koQEvhRJ*CRYzBfAorep3rO+;j;<0-VF5XFNSohQ3N&*R^j1t$AksV_scU z>>=}@^8xeh%)K7gsE^g}L)pVIKhA~x(~V^ADo56dl;Zu(N5=)#h3=^0q`^Lc*Ic06+cX_Tu4+D#djS?aCU65oiuAN zGrTtsg6?-ehtS{s) zbk{-|Ek~N@PLN@+F)TsMw?1?BIa>G;GuKP#R3aNq1anZuW3kFcbVAkjbu{Ns^yg zV+*OUBZkU#k!XZ`S4xmM+Mf+#STNZC1iXeIYVR651#Q-5xV3DMy61V^NC2;KBqC3s zU7;xf%W`3xDloF9{)%>6fwwu`UI;DnqSz&%MhwHCU@`fd4MR7O>idc39NltMgdZ~@ z&`8oPJnIjrG`$623rE&@>SgoOsw*scmi>9p@o2;M=8E&dIbjvL*l8dEFL|U`%hC6$ zishDKckHN_YI8Od`5vBkgB~gH_`&iawV6?lNjkP+Z)w|<&l(M!^qL=#uY#0XR>Lqef6(p9L0daG-)5CnFVo*ZXUh#>$nSECzSQl z*mdAb`4G~79KERiHNVcGGkG559{lh#^K?aE2rVkTts}!eKKffg`c9S7Gyp zWyo8UqBg6OsfyF)?HKA`$|ueRFPNg?J2`|`U@ww^rc`mQDNgKw0f+-hnw^mKzy5{z+s7Esy!2XZ-T`h-cP%WJ#G*dJlK z)&Mz)&RMxV{c!_>bN2B!)?y7K0Y)YjMhbq``Gb@_4k5GQ za}dEG)!V|fU~r+QMh1Yw(%NFZvEc8?C)OqLBbvK7j35wYwU9NNiq)cpT_o=g*qOqy z{Ty;VNi~hJK#|=9R~Raz;PH<+vO}2^kKrKLkUiZz-ONrwTCA0OE_`(Ou6wAPG3OBs zCrTjr02E7G^OC*~_~~OPfkBNN0Zy(K+9cZ|(YWv;B4wDGN+O?GGdw>a9wpGU9tryG zyGNyKXc){&FWDA^s*nk8cxUK92bnGd{*N9Fnx> zt*fb!N!qtiX&v56Xdj3+78_$#=|hEmn+#in?ScKxVCS84>sIcHQWVGZ=WMi81rh{3 zyVsHef|)={))nwFfhDcc99dR7P`fMsP`xe5n)Ok@$zZ1@*`5tRvqyUxUI?P_a^imG z!S@^|P@Yni41%CB&?;BrkzVp^o)CNW)%ln3Kq_atLE{YF{*4PNQXM2JX({B`&RJAQx|a$Hd=V(>l`#a7@AWu)E!n*f?#-b99Hq+dm?!tvJsahIi?X7?T<= zc*TWD8(#syo`7m7dyiXUbsqMTQh^FH^`0BKf)-mfu%D!v9ILx_Llh1>>Vfqp_1;Ep z1Z*WQ=)D_xO#4cSAYk5A=U9jiposu+Zc~Q-N6$%5b-Gn5HBxNQ+sD>`{zxvo4^p%- z5_>W1JB-10f8e-+!47}0{V?6uyrMV{1CMZCT{7O7FiuwCCPsxemg zJ4a-33g`i^fOI6B?37SCjnWiZ+{*pZaho_iW>L;98ElZTopSO1wk_LoEVc`Z&ZJnL z4a=f>fnNPBn;76twN%LddE1k5N;z^~x;@{^Ur^-(JBE>J;9}ME&kd*6V^hy9Cf~tt zALtFc0`Kz|RJO=(k2wLGLbNDIHJfGaPula*qtk}x;1%%g*~_U@`Bu$pt*piQ7u`+k zg=B%_7ErS1GaSwHp23cf61MD8256@7lR?h#0a!Sn(N?9}w6!Y+# 
z0eC_BDiypfAZ{!?mFJ>~P14E?=j4~CE}9PxYXpv)vqM;e+8=QTjZX!fa}PM=<-@Y< z3$d>lTDb&@@5-T}(0Jrr9b-Lp4r+b4)LR(e-oI~8E8Mq}YBK!=&}|6GRsEVQfd4Nq za457t1rY1@b5-3TXa+P%<5=;52L2t^#JV&KynYqhP3Bxn()Dd%Wc9}P5mDq~Xp{=7 z%gD=j!!YycU2vhtSfU}k~fjEnfHBUQ%85kuP z&N+n|8z9$?`v8f~>mt)@OBLBgVha)SosyHk3MRi(o-QUH0mRbj?T#5%)Am5uS59P{ z5Z!`=7|Rr+bZ`=~mb0LheICmK+bv8|0sD*){jAIV|geWtR9nb;tVk*iYLmZ@+ z#(iN#U8O1GHFc8=Tx{Y`?ugbi-_!+DCd@Clq%CNjSrGa2d)HOpz>mlcKi562mc1z7KI-JOG;RN%R#!7) zk}elL(cQFL@%`#4x}l$rJ`s7B^k7nieI_kyUDLg3KdaiXjq9H{O{-s{oAzMrw4W7O zleWzN?kUeM$?5xyHtg8+5Bl7#+c~NYBP!Z3$7$DPhpl$~T1eaV%zDc+;aczHk~blVHo+qDtL0jwfsOYmhQ#A+QJ7Ia3@L*OOcJ60o z8+MJit}dbt+wlCsoG&xmYdjEemwo4;;P~qApS{tBr5<{=to^}IfC=dV_r<0RV_W>Q z*7DZ+*81Yjq(AG|hV8C{rz(rvuoGLJ2!^K4X`UjS_QYZ66WycvkBT3$G>i8Rf!?M( zacjc@Pvo>=DeN{ZV<>9*H+aiEp!+{;Di8i_v10LR?Q_|Oy;l8Jut}K5>88r;q(UW8JZU5BfNNuoQJaW?fyt^W9S|>s|!oy~m%-yTJV+URas*{pNMY zra!YY_PzpoFI^{CnmY7YU772;OPANxMgCx4TlfPgTvw+&-~Ga>4Xb*rT!$(RStnSR z8vNqjL(pD)c>S{umE469nPBmC*bB#@*S4>7ZWxLnSiR|P{|6J3?>paq(uVCt@7S_n zTHM-2jyV=z&w1fE`r6Fr_R?uD%we#1J*|6#zvR)}yY@mzL-g~~CxX2x)0(HkYo81r`UI-6b$Br9#isK^ zjgZeDD>u~m_9O8Vhs-H!P>ld27BoiuWU#TJ%b=k?A+s;Ie4iHaZ1lJA`VTGO^~#}7 z*8)MTgG+M&@IwQDBY)^!tGG1H`rWgI*!y zGXVuYus`{1i(ljm^I$XvRo{?K&T0Rp_Rh(>eeZk?spis$-k*sRHI6Sog#fjnvCMhl zIPluKr5%qSs~z#fzN?xmn?7y)`I{$#Nc^=pB_X((KM33VY*hQ3fV7Jrp*FC;^l!tM zhR26OD-DFVtJd-w^e+FW4#HgIqJI>QjXFqh%)ob42|LWV|DkBtf)Rw zI}-=bA;dZCw3o!b-)GrxSqB4lJ5l$|10a7Glnsv`%z3eiH5%|D2gn?-cQ0V?8vAE& ze3ROSHTwzYJR$D_JnMYq7U!@9gn zSXB4Zkq_FiQ%$OMh&2X7oonqbG%HW|&CDFS`U4Ab(qEf%;>(K`@Wz1dEe)H z-{-yl*Olfszd8H)o^!tEyL`SVav#ZCoKlS3MAYb89_MCOFfvb=S(~=O4817;#E=h~ zHMtaajD^ebe8%=&z2*G5Z`gKA;q>=d6x@sG{ACnZx70NGSkW|JZsRG-dWl>B-yPoB z2VS$X+lc(_uX9gK&pwMWJ2w;}{gCCM)R6}Voy3%VY#N=Ky|9m-@+v*f{Y z==hWbJ$3~ZzfQ@&ICTAukce_&`}O`U;AsCONg|t)93j6X7q@>mror!F-R_NL*=hqs zIM0+EGEbHHZ%G7z6rf}mLYG0IYdWMvnIatEq;RPTf8n}B_Bgo7=jN4~dC`*HG$?j~ zbiaT>C_L3}2pZaxzkd5;SN`@HSPQZsXdg`dO*Du9h6VWt=Y4N13CMbB$&r~diy3qs z1%0na!vv2DI*dhQnaJRrzx+#h9I^w+tnA-`>isut^S@RK{R(5lT`lOykt-teU-4r= zS`-Yt&S=Dd+~mJB6`&4t@{+op|F7@k|5HBvKah_K*v2mbS5FX5p{3o7?72q1bWy00 zY8naR3Tvu9llh&OB_p)0Ppj-3bS$5k)%l0kV3)eHM}EyQ2i0eQlnSH}|MoOIlJGVT zf0p5sPn)+Jpfm4~r(RKiCBb1&%e1<*uDZg%7JL2Wa<{uka^&aLxNaW?GxNk0youN^ z_^sJY_x{sZpN>!G{et4R4eX=gW)E75(2gyH;!gp5e$F`erPm#nxOU?!oNqsV=YQ(i zCv8hO^<3;mn+|U|eH%UR4ut3w^z+ik6pinkGg{3E!ed++*riNy*GBsv-l#d+*F1LJ zlKuO05k1?lqnq%=08fG%UIca`2tdg#<-o90`76`u4e!cx3aBna zOPc~xH>&(_0sY%$;~$Xd^RnL5Ta~3xL&QmvB=%D$o4jv6e{1t-$`kL2eI6i9u)(g_ zFQ-4FpCzTr?m*Zti7a!BiF|Az33sQ?dLJiMG}2vl-*nPhl`V0Z8~Y1C`I3)Tw zUT!sWy8krb|9F%ph9P!iYuk~ZD(QmoR95@3L5bHcT!hRmD3xm&GkZ5g>?HiqXlZX z=b1mageSkU2%Fc5nOAZp1%uz$I#G_!R%>kJqDn5yZ=jX29%nvyG3m`K*^`2kK?1-~ zLHknLi8NdtrVAg#JH!5e_-m>03# zZS+@SZZTkqbSJ?mX91R-yZD9gxy_B}QqaJ&8Tf5SP^IfO{lE@nxXb7TtQH-o90Ypf z+Krkz=}GPk)V2HUj5i;tFC{lfgUx?5CE$}%*nC8WD;97$0+jwjZ(=dOQX3I_H`Sr< zNK@$VnlF5>_V#;0x9QD6;q_V@t(v4@Q~2gmx9cG477|~&5Y7(UU~#&Pzn%peBbCFP>9gJL>RoMeygZ6%D(d~T$sNs6;jaF?^Fb5QrbCDTqZR9HrX1M|sUV|pL zmkD|U3*T)Flkpav@yA?p8N6+v&D!uA*eWMk(1K3l8^aw)W>hngAZ@rWV5*UmEsgJ+ zZlUY^yA0yqoAAZaT>MSIx2PTkjdDr{<|f^smX6Aec|B6_ZUGo$P(WfkktnSVJQ{eU zzd03p&u^C!_dZ80PUqln7w&dxo07Yd!+Ilkah~ZC>I-EetmZ|mNWtFl&9(0RAPo9D z=NCehIs5hUuXCn%Vh-+PMe6J(kNEDS^qFD&b!RDaipWy^aXsV2oG2v;=C?WN^TF&= zM$r?!g>Z?CLAEVZ8~nN9jieIt)C1adM3t|jcB##11^j*u{toujBBYN4RbG)ndEbM^ zYny?7Qpm8i&mY$3FLW35B`Rz>8F7&zUHu!otv22i zOnYVpAYiG*c^9`^-Cz~3nwTgRG*^gTuVU#-8SdL&B+@MGKo(b##M0;FG8RcY5NW&d zzDQr>T(9-U{j4784tj;^VcvLYbz=21ki=}0M5?e)r*+pE5G-1rqC?8?;xMoK|DDZ*<21U{pR}2>0mtyw2LP zeFV5>7>*sqk&>C=C0F!#2x<{Ty$ETkZHpFoiHDpO!p$0fKPA$a$yAApB@%jg{=&+8 
z`{F$csJu_`4X3FaI}nG`7%taV*Kr-Kz`5R68N~GWns?x{@{Kg(HlggI`-4pFv1WSn z$#`|gi_g8b&e-avyEN9bt`mRy%7(#5@-XG6f`lR>TohK0V?QM``|Ln^ZLC1RJ~92G z$(Yf?q|0nJ_xoB$O|N!)4-f;EdI{{!4bvwC!pqOPolSMtVj+dPTUp;>Au)I^oo&c< zS-;@LcqOd%4PGX7Ss0;n!a(J&-}G}9{?&V7UF_RdOT!!YaHh+a zQag})qE|Q`xeS?pN>ZeXjBF_$486N9CYd*H`-f)7Xj%}` z9zcr&M+(VPzyM8Kg%)f*NEIf`1-~8*@Sh(=X4AU1PaFk=noL z!66#DlC##izw`-R=`wVdbtQd3d%n&AYV{)0pI1Y8(Z^l?=JsjHxZPXM;I)`Hv0!c4a0SAOX(c< ztPHROQZ1ib&v;B5xRgT}2WFEuw*Ay%yAW}E;3MZvCj{mh!QDiBNg&u@Tore{-l&Jq zWdF6$KBfl2i&{siX49P!ckXI^=WKhA+;(}6D^^?TMmzDIz*={SpnvAPbqj|~7H+UF zl{UHJV7+yv_a(HiWt*#wIVYUPOv$>-?Bjc*yFnjU)-=w8}ReW;C-D2NzU*5 zA zu_FGBSLndyEt{yC^zP}d`VX+N#Jdu=j`JU)T4gayzI}V6&qLf;RdkhkZFJUxcgF09 zua?Bev(b7!$n^s1gNxb|KG@aInQ4Xt8SXg)s;w&5)-tu?%$m@zbL3`p_YJ-^-1P0X zz-@n>$UqpRLWNE*I$4Rg`xH;496sl)e4PPS&VOUo;C`#&pzffKR@iY@_q-(K#C;x0 z(|+})9ECE_B(%uIRh9}~n~h>muJAv6&23XQf{SRFJ(++yDb~;HT-1fd9$wV_Sm0Mb zuld-&a6n$`4f0*_0y;oQ{@WJP9t0~dP^{NjHU6R#rwq(w^LH$Lk)F^|*O;&+HjhPUTFaOpFf9%ZySuW~0#BAnQTXIpdiV$lrNaB-6NzK0Fg%7j^^oPPmI`h zw>iGco6C&p#KfxByW;zDxyLY9xRlV)&wEYzq+qHy6Lq0-1XMu(P5_;Ps7V? zUB>mV?`8u_AZJfxb zhwEk^dT8ql(!U-@aAH4YlBZxjXK)dDQ!4ZUd&;9jL883n=sm@Y$_X_#YejEi+2;aR z9Jp^4oX=Sfx7gaqZTs|60RM@q;335Q;u&>~ODRrh)(i1>oB9&Ya$hf|&1KfyQ`{FO zBvL5c86!2xZ|o`;?Koz9vqJZ(=@pL1j`zOT%tLy4PzspbS+1mzg8ITK)0HdQ^d`s& z8Hs$wKuve^tyue>jJFLw!q@Y@KH7{@%8A>iUy3I&@GK{fVv=o(6&2Fcs?VL19p{8f ziZ5CL4w@Dp%9llCasj%Z!-gq1s=Az=KHo(nqg$}!YBy}>bERrhPxv4ZL%}X zMas1}G#fA_ zpnoxu%zZuKR_0pfr*T2gaKu@Chp#8ufO%;UfJU9d!@V~!FFQ#iq0IpGOF5(X)x@#(v^$76sR1T;6-PyKH{iJ+h zIoo3ca#W=Ob7VLI1{FgBOJBsF{BVF&_TM*BLjZq&WJeeIWd7f=O}~$0rBqdvOgLnB zD-7g$+d+RS6=2d*hn+|Q)}mQ~|K0fiU;S>d049MB__Hp6tseyczB~Gdaf0|5d6D6o zotc8ce7^t@DqPs;v-%5FTcz|BI@Bz}y3C&HIPH<}CIblUhE~laJGD}9m~ySzR<(Y) zEBfqzEo3D+gpOG>20^y~XiMI~@H7A_)dBdv-8de9q$i4@+9B5s^e>c{CXx>lAil&e zbQRs)t1dqD#6YVLU8Dk&lP}!+m;w)zru~Rk0b7A2V@#EN$jv-l#G6;rt%mJ-z|0a9 zY1rJ_ss`d6vWYR3uW`1Xly@|z1LGO^SAO`9-vz4*v9<16nH=5~QXeCK_$y88h{qh& zqE-^LZ1jj>NfM4$X_d|qlIPww@E?fpVh1;9c9IN15vPup$sZTF+6UxNds!5bUx)!y zx6M}@9rwwfJ#o?iw%%S|6x<$a*^pkbMZ{~O>Qn+a{CLVUE_7Vi@oZ2mcx`&;;$3Fy z1#yWqiJ%xCLJ_eLK9XrEfvZNjk7D*NbP^Os@)yU{9~WmYG{lvWj`^;YwuN|~Z9Uo8 z5GB>gn(SFs+EZ<=Y}KPlW3G4JUw%1%qa62nxorbAPqe&Z!T)OCvohA8BmOh8&l~~_ zw|c)^u?_XsP`*Cf=CHaVX7nm!VXZ)B9$&pZQGExAG~+f*wM(}asl2Z%e!-mE?pDE6 zk(Zr~4L(d?KJSqu?Ag!f;|8|fEu4y-TV&WGI!WFuQmKXI#)48D9H$akb+W@`HrXG- z3Rcz4VwwAO_X{cLyQ|kRv~S-lV`* z)rI${KgHfG8oJuSZ{GRh^JNgm1tmEYBRP~9d)T39$iAA_zFPcJMYN=r@%vA%sYW>= z2~}?m4$~HH6>C1O$bRgRR8OV--dOyY)ckl)yf=rSTg$$&crQZ~gR(5hLZv{f@AS}E6$-LGC6j3@6V`e4iYiP)9qz8CSq(DJ|;^@4N zL^yX{-L2DG6dWnjRyO*;X`Ks7Pwc)9xBb`0AJFf|)k89Orh(%yT zOQb^Ai+rcm^dB!=QpUPAa~6aIda5Q|zu$2w`O8Mm@I3Nvc8rwK)Y(Ieya}!w1@FIV zK4xKC2yPg%uQ={UjxVG&0Yj@ET968j(m2UhlWmallyTLrn6b%$@0QgI*1Dd)s+b&<^A^ zrpYRV!^BO`|4d1?#tiSb$JxA>{J+TFy)m#>I?(8gf6z-ZS7Xs}8I?^hCxWPiKy#J5EMd>&V`1(HrBF zpFfu--Xc#Fe$q1YqI?j$`Y|U@H|XzuJ$HpRzfL7;1{Nt`bh(RjPPxh-m&*la%2fNq zL@wrIhVo1vkQoFTY5}$#i0DuX8H`mV=%BfwkY@*$?e7nPUkAco0k%$05E-VNa+3bE zXl55kT-1r6oFn&|%k7nfl)7z5(?#W7RQ!|V>81r=sfVn>R+|53ju7XTI@Tm{FvuDx< z9KU}Cy3uiRq2bp8vDf-w6wh1nUFRnRqbf*BcEFH9&l3wQ78LB?6iOJ zrEpBCz7!GuEcaBced3~K;-Vn{n*UdTY>K5mDdA`KDw|gGnpW$mbL^>%;h9l*xo%cS zp6uE`9>;%t7Sv=*(5DV+JX%-|u#%DQ$k@qVXB-pISy^O|yT50r{}10&qV-TjP9Ofj z#X!~yX4E>-04WJ9h~G`wPlbNW6#zIjhscrr*#0Pfl%oolNkMid<-Xv{PSX&qP*GOq z${!MXx1Mj{K+j0siOvqU{$u*xX9Nh+bGs==NL@!m(jbS%M^MjD%V;lfqo``sOvffF zX39r9{&H)EC}6Wf)e&-8nIe=B95-3zPB$BKg6U~RllS4T*iu~W*K zHL1N=r1}NQTKaIOFhA{62ec*oVqjLKU5w--aui=Z8n(A!aVUq3`)b`~CE1TG+>KhsriJUU($K zdAuwGK?ef?@pS0d~X)~`=xhY8RQ60d3QY#bqtlI2k 
zpOPKlW;m4J2bseoQyk+orG}+_xk??*9$HXdOO2_P{KA%QjZS-hP6r5UR%#Ni`NJkT zU}IQ2?D^LKCMh|4%KX_2`FvxyI}^!2i*(qsKdx$34P_0`sfFu_q1yF2bK9=~rpFC1 zB2JjcnANT|^cUzbQLr!~;njNv*%Ff>IA$9O{7?=RoH^IDj`rwwl@T9`TITgOS`9mH z@6pv8*Jw1zQKDq2=#~FPn%1l?N1mM)M|9j@R_b$xJF*P0M3@X{3G4}d?G3}Xm6vTj zeZE#!JBf<#mb(c|>2yM$$hOWRWWA-Y_@I0Omp7$FLZ6894Mo*syHR4dT{~kWiI~o( zv0F#%#L#X<{auxXhr-&hpgH0gAKVvt+lN(?Xahq(Olv8Dp(Zk2r({hLj41YPzkIvG#oh zevzrD3K7#`T{@*{3&zBFd$y|Tnaa~FY_v*K9lED8+|Vbo_%odJQhV}RoK8P}!RBgq z&%PJ9fvqu{AS~lgq3iOkx4&j?aW45g3<*A7&%4Q+;2>}q?tg@>_4BkYj~v_6Aob`L zv{c%*0c(ttsoGE5LQU?JdsRLiEQ)EJt}4iDk>Te|&|ChnPhN@)trxg2%?LiBLmPUt z&vX~1vOla{&$l+ZGF?~g^zg+t2rzlv6=9A(Nyk%jjaA*4G%f@7qXrfuyZm`j%(8$bDy=! ziv9BF*b{RFjGKqtS|f_Jr>iw@kg2#gSGvTQ>R^VErE=pTqiK zW2X-VTRs)|LjI#>ESSA$Z`g^pU4@p&6rQIM01^+OKz&K=7hSRT(kHd8#rAt@zt_R1qiR3aaeRQy&x&y4TXtY0dOW%@l{x*S=Y@Q?Ge72qeY z>e4;|@&>gB;Dii01{(W213(yyim<7Tw z0?8WJgGwno5OZNm9r!Ul0%#(Z0MDN+N$Pt9qPU-NXg$fsT=#1}ew&y7$ft=xv2j8U zfJsFDlW>Wex7awjsM}%BZvb`wUHxRClV#IA>M1nv1%tqL|Noom|06H-pL`}q@A?}y zCXXz2%M}JrhBwQf7DyS8m9d$BB%nV$Q;kF#052bd7l&+>DmQ;v-CE&+ zNvZABAliNNT88k!0!Oku)e!Qru#UHa2Ega~#h66}2^_orgp3lCRI6H1&7^&RPoA^T zA?pP=nRCEI{7Gi{TpyIo4{OPO?9(CDypi|gmLifM%_<|u7vFd>b!sorbKNOb#zibc zKR_7?;$CB=oK6R(H%vZ#QT%Se$0wBZt*V9Q0bO`I?#c4m4_5p9MDC6ab6G{`D0c|9 zF}}KK9kQ>|c2#Ujcq@@uh;c4GLi*Y_khF4f+dJ%p`^(NFxnklfEti^bKNBmT&&jYp z9h|c@DS9}CdQCT9J?0v{T`?T02<3CE;SB0inlh8mJBVn!Y$?z1QX(#0zO=m}T&%e4 zBi|y2{He4WBR;;L{U3seeSxp9xQ$CM8U%;GR1EK`;ur9gVP-yywWY>Lc}<|5-tbYW zaV1*O7I%h(pHqJ{(xpY4HHcu=g$ezrWV16AbA+90*mcmNUJ>*Vd<%qS4E1d5dFhhr zSpSJS;Kyfot?uMIV0<21v*snoqa%6i(pFBNLV)zuCdSYGF)zQFeRdbbSJ~Qb2O0c$ zBo%e4W#jeOL|?h91n=9FZa;kySuR$RktE`IBcDkqWI_*Ggyl`Zx9x(@M%R7#v@+Y&^5sfiU#PY;Y6P`1H|rK6)fb zJ&BzKs`*xtEsl`1_0|5K>!%R0dB%(nn36a=Yd{;Y>z6Qz-+=^*qeAtdb5?i-4Vltl zsif_1jUXeBcBT6V?J`325zO?xjLlZ*>jJs&!6(WBy3~YvRZ85~r%A4Y4OPz>(AL^j zyR8B&)nTX)$-RZ%P;Blul6q8|oXkpqQP=IqI}n4dZPWDu_|?VlUAXs*z%9c)j379? z)HwVKWrb5IcO_7P@5F#ue*^!RAF}hdq&r$$$PHl8rpQky-d+Jt8KT5gV7R6B6*v&k zLYMMU^+y6)f#Lu0Dd0Ln&4B}kWDW)kEKFybSJ)}}><9&TJ)(H>O6lwuQWN}D7;&GO zED6#`VjIOD31|}_*@*zRBM5oeP*&&xXGV~d**-GeCKZlQ1nxAGk;927D@teY09OfW z4fw`52hITaUU5+>KIVWS*K6c|JK3Fhc>^9R=@z)WcXBBQe-66kov~6nM-R8n_+ClN zQa=yG2>p>(qXsPBRySOrt9nv;1gZ!EY@4V&e#z?_NjU(^nplxGNlfdE_<;J^=8B)d zyl}{Ak-nWPu#}3((EKcn}Y8;h& zW43N(khV(aBuSjW#>Zs`l6^NDHL~AoRFfdS1DWgJft+7G58O*QF&{#s^Pnamd)Isz ziR|c9meIH#&lN?;Mb8pQDZ1ZKGU3GwPN1vLE~_YKc~KDR;e3&B!lid=sxmdU_I6 z>p<#ghuTq^%tGQ^+yHiA+5XwUy=Zs5hUdQjqrXs#Jb3XIw0W{WjIa&p&hP8S=nh2r z6{XF9CU^o`hu8GhtSIves8xZOikZro+9eU9&tBr?+whq7d6sRjWYQLz+=6SM1atE^C8!xj1`Kt64PHr`NwgcKth#8-pA6}aU{nYHoO!rU-1ckG0 z|NaIL$iv<4iW0;pDKotJYldS=Yn6+tpv|=yzF!0qFoc@yv?5cN)n76TkTVYg^1%Ng z^D~h7YqbcN;8!d-W2j10h7)YQXOVA zZUQpW=l=<~o3x4S*L*Mp!AGQtpD}7h>Oki6#5DMCWNvT? 
z7d6sri`zV;3?J`7;U^wjUfh8+=QfiUvOF;sWq-m$89Id~4S@B?pog<#`Tm)7kBI|p zedHI;;^(%L;|P97uFX5bi7SpZH-=ZI!GklNp3=%R%#`X^3}yL{j?is*0JkeTja~AH zncoXF02`O90@(S_kSF<6g0`EFnI7ms$}V-<^MR9*Icj%RL6=&fuL>0J{3AcjH|2b= z107%R+=1|VLpMJb%9^9v6PwA^Q-kW&i0N=j^$cLwX8F##ac6#nhkVqKTMjQjK+1_3 zUnB~U{GP)XLku4PlikTrfKH`}C@^Sz0qqv=snMtrHmgxX@cmRk8|9Fr?1o^(ABxg~ z+@FAXJ!P^(5GSyx5WPyi-p^p~B&R3+mYxQjO~rT^sXbh*hURN6@DGJyJ|u4hb70%Y zCqH(>r;dX52eisvacYjsvw-5SJLBm<>rIFm$@xLIP=q}(1ffeikZbMViB!;~;3)z) z91>8Y+Ox|ng&EBSuY2HA9A2P=RDvMaun90yHtro|HQfL^^L`8-4vdT=&|oWNLycZQ zU0V#~dCy4x%8eS;>>ZO1S%F6{@{u#?3=_AV6iY7n6VFW9_DEWN^C5j6hIq!{R6!G{ zK@%8bz8jA0{bWiy;Bt(02cqka3N>i1%?|}rXnXRL+lFL*_r9VMUgF?VAa90PlN8oZ7g=)QZUwLc-$WPs zO@`YuCAx!Z=i?X9{;EI>{^*FjjmCWZCvvJU3~8mmeov+;>N>$%FW&e)U}gzehX&IW1HE_m60( zOqJf2P8HXp4k$`5AFc06=U>VtaJ=qC@@G`V*?r?b!holTjA0@QmWB5+B?)AeR>SRF z`&k+?AnWU$(ocp~>$_}t<*y|^It3A;iIR+^b^hFF-PO&5b$5Yc?bp#YXL}Ehpm=ev zW2Zt~#^8*DC4&jIu_Nq?;)aQe(W=i|k8-~;xKnyq3;j(;;v$zIQB$g`k-w=U0jVy& z&&QA)7x&bC>ki2je>OTYZaxKKUnCe@tmN;f9DceRP!wF))H*t-D*e@u+sp`+u=`~S z&Nza=Knzq$G-rC&q@}EG=j41l0PF7!zpYuqUtPqdWln!7DnBJWT#)P&N4fElZ+-vt z*8+4(36zWWofDTf$PT3i8$ge7g0L0WkKA3?J|4+Bc zzuqqYE~;3u=}Xlo%L zNzek%SHQF2IGf?nlRpBr{*&Z|O1B;10s^vAZn6e@05X}70t|czhLp&8N!0>Wmj5%m zWlC(|e-fgQX@T~QnJ%`ZC%%HVj4VMR%L@Tlwy}=+&;}l<#2>GBFD;pit_w=1m$Ca*PH9+32a@pOzkIuGBf!a1caY zhem3(QLx?V!K>ABTi)`=j~3*jn^RWXA7TzEiY^s@NDdHMwwSvu6N`R8XzJrn;8Vy{ zT z-QDq80xGeaF<=>ai2#h1e~OIpA8c+?z%!KIkN6n_5fP|dv{Ks#E9O-~JpXayi=7CD zv0bVz8l7_V?Ys)-w&9I-591%#Qw5N^2I4yL1<%GRjzx#VN+4thU$xpyhotCku5|g_ z_FS4zW-K7qNT`Pg%ODSYd#$)K=yaVw2nVH@b8jsBRDz(Ksn^9#+wipva0t*VpYSv7 z9_vzMN1K~BwH0-x-nBeA;Qna2*qV9mS<(b}i43;(AoT6Y>pX2ntS_hZ$sCNDpq=rw*>azSjlQ< z>?d!pm1YO=ye&Ly8n0-SU($v{Y8~l$Ez?u5SL~Vd6oj+5mYE1brVFPVxTF+hpOD)U zl;^f*mw`+pf5KatG$Rw@l@=B}M7bxJQ+ zXcdbLnsL}>HU0GQB{W7?aGz4EY=)tx@tg2_a!^~A}gv3R~l3^oX+_d%Cn8oqm%fK#!p|+ ztI#U%LcFfEoUs^{N4eToFPYo4{~K2=@1wc(-Sq>nI^6kH?8@OgY~vgxC8(UP=d9rWq)OS z7&oo4=pzU4xM1by#=W;wWTk;~r*_=5dN#GU^wE^3anl!kx)8_R2Cz4!5-O<+T9q$7 zu)s!YaU|$p&(EDU=>Hx>+X@;JCHl38_#h(KBkE{z@!e>$2{mcG(9BPabtln8X|n9VI^91 z>vBT+hiV4-cr(eA=SP%~1WV1+hr}=37n6*{RyGEVk-#Mt@mb4YvLS=+IpGPPX>6Ul z@cy{U*G1~|i-9x&3IsJ^B*Zv+JzqI~JnHa5hbG^#jJcH@z4CFP6`>!joz*_Z<4R8^ zy6el`h-F$AO%1soe3MZ~n}|KwA`sK_I-;y))a5gqk?MU}?&VpHQDb)rfVTZ1S;{;K z1hbyj<;)VhfH>!_E0t)<6P)07Hz(J!!tr7DjYr-G@AJ%zAzq#_A1P>ZdOQ)EUW+Z4 zJ{401w*~$gz*RZ};-^NNSk%^qGg&!*8EwmT+Pzp(#vCnsvWkoJgupaz!Ps2GQ?V=? zHmM!qoM6u8QY)_C?}_VOc0DSbd0&NrwlWTGG7qzT#!{9gN1SW!eyrG6%vzCKwih=|& zUc53n$;!C2ztiFTr@{dPPJ?p1|8GA3e=?r`g2`n;yfS|D4BGvIm%oIn3(v=tz+Rx&{8-bo?jYkzznhr=N;(x$SXfk4W|^=kK4Z4W9t6S2H974zC}dGQ ztxJguAajYh>MB|TF~M=lUBnfm_~zt5sq4PvXsbtZfIV9WRBXBCKt-=0{{ zo0RGV1N~TNTmRdE?#HB&JQpj3<8{NJ+T6E?VMRl>+t{+Q8xx6vpnd;@YvK?bt~SNY z1ewf7K9kNhBIF47B8zc^r9Hw)A(2~&Bwc> zTzVt;R?CKw|K@WA+I9q?#US>a$kc!Vc zzQP|y#jP)PAacr7BX6AIPi5_m3(*~L)^)ikHKQG*%6&C^z+s?UNJ*;Iak_S=JcX@6VAg(|eNS`oEdKs*KR(BMU%T)Yb zQgj~;qF5I>y?$y9D0{Kv7nh1hyz5ItQdGM9ICAQ3o7?%V@1sfus(l8lOR9@biyc$9 z40Jb4+)EU$Yp^{jo7Y^52-Y@$Cu9ytoHhI~8YD0!cwx@*(jL_*bvK#C!j4CLkRPx1 z%jQ?;Di|>%BEyY6%ehI?`Jy)QNc|@e$YhAIJti1(F`A7krJ%9;ka+Qfd+ZZ)4n}-? 
z-Y=Tsa!2MlUk=Z7eE6JxR7sOJaop$4yx?fRj90nFyYGVxt6RM#VJ*wX(owRTlJ$-5 z8i$knFu9m7+r@6{c?;}I)c{69-3tb{>6);-)1Ksd2C8mbz)>=<^{)P@Q-Xs-SPP$!39t0T%o^6lVkw;FM+g zNXk?oGZ5E?N{>F5-AW!OoyQzl!nEu_K*kH+`6Wkt%Sv0@aSj>W3H;jS%sGwFpPeai zN6{mEqK|)s^lrK@ZyQmnk_siG6V|oAxTL8}r$*8Y&!+Qqr8q{dfl)4wk}(h^ZIrPoT~EFL-Fx4H%_AF3o8E z!S1%u<+u&U(REx)Q`I|Ey02meGm_B>f1EJ{ead?kbtk}!kRUt1g}?F$fh%H0O@9D- ztp`QzcoeBuyQ@(jH9`Y4SqCkxus?hm**UU^Zi{LI0k^W^E1^$OC&EY!jCd=1>fh7p zLYMUQ`l;W9TLUWgC+9R?r%#?AA0WNcirj%5AWSRkEi#Za!U$}_8x1i(t=LuI3s4t; z$u(V1Vu0dpVyefgPy~*C)bcy4@2higSKi6>W_($9C`>%sn}MXMM__|*)RCa_Bg zRm#K`P!{P|QGNlp?yJw$lB5X_WBTzM6^mmJ;7fak!Qd2fY-|xrp!wk==04x90A`pC z^YZ5a>Qb2J_SDeRZvj`Mrr-|%2>?xG{oR0`sNTk;Nz#A_Py>gn;k^TTcu=FlA@KE4 zS1_msm(8mpX316Z-aril>Y@A(h+t!m`0Pq)_y|x6Frr8&&_MD%K`g#Dt`nnl_sUB$$z;ay1GbQw4VNLUMllUwaf!iJTGXZ`w)`o6Nj zJEYV;yV)LspWO;q#KCJYk4wjKq<&}x!;+sdEo$}?3+Y5EK;L_X*9W5ZUOEyL82APx zF&!V%-&{!yI{H9U)9MQ_7GT*-US%hsz*yZNVc~1iAiZwTSi1~DcRyP*B2c;yuvPJ#uK`=-|H>`%O z+EOPo;dk}Mgty3gh+blT)t%?R>MRVKuf`h&1&tN#gRcDqKOfZG@LhmjdPuxW%u+I4 zWvjWR`Dg}y(5eMVe1mZUtG1<^jsyXHZ2;&;{TUVV`g)YP<^;p7E$ZLA{$LYc$|57W z7}FN}`M}KhYUvN_fd1hMmFck-pmqg=`xFdr9o}ej*$#@MCX$Ll;p37CdD&>Y&j7bU73UW0}`M4{M6_K0N;RP5bd1CC#n|=@rjB zzI#g~3DK?lPG}BR=B}bi>j4LG)L@N{Vc7OPTW7j@Ry#Xnv7SSucB*(FWt+KuBLWvA z!H+i3LdV)k)N}3yn(j56C0*&2ku-oyg`@j?757exC!cN;RR=fm16CLq&8N`K*_Cgk z4y%I?j_!pw4q$Y7j zu^V*8lBJJh%)sMjAUNH`!RbZ~+MmEe@Q=Z-LdjMe51gYHFwm9VTI&hzShzI2S~IPt zVk;2t(EgB*WW_eLd}cUii-m|Ha}PB?N%m9E7ky-Xdnbbfux2=X2hb@8HXE?? zrCz2>y$pxVoE)7{x@8r!#pF)n*t)hUx(YwH>Dy^PRKr+7e3lckXI4AiH7nB8IfvA( z46Qk4(XVBnvG{U%%QW>$NLk124HrVj7LU=}F2*yn3xLm=Qw*07ZZ1bf4$*eM=I45U z>arYTd|JkbYKdOM@^=rf-PfO~J)r5=8}YE#1WmXgy<%T|zfElY*1g`LDD!Z$=O+!L z@=JZJ#1cO}9Y%~<+M|yMym(2K_C^n30$B#4IqOvxft1*^Thr%zQVw@N_?qk}-@@bE zL^@5N!?`SHAQVuMYmZ&&q5RP~FNyLzv)`BFQ&j7u&gG0PJV|(DzbcOh#8an0xV98c znr%R_9MZX>a>W0o)CG1VTN>R|YW*thlaXa$_f zy5?%p_|GBckpTcY?B<-i92W9kYlO*X@k5PRxj)?Kwg0n65XJ+v7pSG%;)6Z%(}SxW zTBYtv=5=AMxx1!Q-@l;t`_jC^Mv~C*Jnv1Xr!uX}G^>;SEcK8VC14?>xih+gR|jUB z;?4e0tG3{?r*>MWwY}6)Fm| z?ptuP9f);C|C#aH^b_R(hiG=J@APOAy}2akAHe5b`OfjX{j z)T$vt@^SZd!-ofg#d|PWEI5I%d0W@<68Zp6+^4lOl~-F{NYYp-<$E93hs{8awBEfm za1<9eJv!U(xt^#>o&K~YVdiR?V!FDZOKs3f%8HDV@URhx+eZlpP4xtXSDORDx(fFMFbclhPrLob1QW z`E(jZqNN_Qo7bMl_Qf@cBfH3MS^w~U|BKp<>LR#ZIsVf2<()UEl-hxa2zo-k0ZB^s z4b4s8F2$x?=OBI{Qt66#wV%sH>Ti{V6gy@!su&>ck^k^_@(vJGH&@Al>OXOp|KFA& z$l_$C))$mTal+S9B}Xf#Xd3^2ey$Zu>mO1uBsAxAfEzu@_UnYw8U52pV?s0fi;uiY znY!sJZ;bfV&pnOoR$qJ+>t_+Qm$EZgSxR_qKx7i>=|pQG07CE?@!VP%Y2sIo{)~_h zEK$F_(itNHG%$t#n53Cd-q6O{>@5|2y)~28P{K9u6OY;~B@P zs3zc7?k0tfyO{ft4cJ?g;}yjI9``SA;^}`x843kB{eLf{0R#*J@q*xIIo_ECRb+1; z>5G60{uqzwPeMX1=uk1c@f~R~zuScXwy=E>V7CGuMqTJ0a5GH+A@(i<_CW#zCErS$U?{nF5V~$Ci#hW=Lv`+R32` zD=lR){Q`t;75ahV8s zT|1t<$ec*{FyoW=YRq3_wjtp7Ti zN%z#dE6enDQT+5DrjNr94&R7!&0Db{1*=RBbAyFv~L3AxGTJ;pwi<&x!!@j6iGSE0K-Ik@zB)nnHb`5?`GN8UsJ0- zX0s<%UB>aH&A1g*8vi3dOZDrBeItP=g^0cYYJ$PW_t!MH-C_h6_X_(;?c-$2E}+sk zYoD%tt9`BO@RTGZ6UFVEFyr>OCZD2X%kOWolnw?W=UVb8^+UTE;3U8cR)JTEd2x z&d&C$?L=Nx-2qSAd@Bsq5|$|hqJK(|^^=`dwe>d;5WlNaXp+7POusyH=mW3J+1T9% zuK5SAiqBYl>(z8v{_e)wm6sv{bKF^Jvpjm%Geqi@p~E#l@Cs)eJ%Is0hPD3ogF5>K zAop4LOm51n^iTGud!K`=y?pEt8vct&!OJt@cQx8;bA@|G85=9M_ZAkET@4dA1OqqK z3X0EchD;B&Kun}h%RfH$Uu|m3xgT1bs55m`M1THj$0$0FqhJn~%6PZlE@MC%Ewuln z6u#aoKr&TAhe#Jl8Rzi9Z6ufJ_gvB@amK3Pl1DLfi5U~2@%f~vOe7l=Dp-mtBiN^6 zmp{iWHnFXJYOJe_ILn%8tKkTlmU^HnIK$;d#@lE5V5v&?g%MMunBIYH+Hm)1s@}M_ z@Xv>B))%cuhl+^TkH~dG*Nshky0}3{o9i^TI#_vC*cSuW%Eos4?zvt@q^u)R4qU?b zXZ2#VN_JRQT4gn|ue_~21pOY^9~=D!dT&y@wda2z`?aYoEw*6~X?jH6O~saNSn=Go zvvQX&;o!xr{J=_v`h6u`N@;Ezp3As{ 
z{x6RU>@@q;XJY5an(PchoEph+F+fbE*ikV;aF41Y#UgXq2EQHLBUT~l>6JznTJZDY zrli@YeRZT~rC&qULsa}@ShUgE)ieFeq^0`?C(=F|3X&G0EXK9qmP+ff%4vujJdafB zWn>A$S^2#F?j!p+B~B5@y0q}6GaA0M0KUuw5YFBQ1&TF+AtUoLR~?sDxKR)Knv%?TPcD|9uj=L z!ZmtU$I*)+Aj7C8bsqlpT0Q)%@K+C&Ub)r^o7BC1`I^?!ubdek7*vB;+%F6%hBd+{ zk5y+?v4}d5(imc zue7;pNdB~&%vNyd+cln2nklBz4dv)DH`jI@G^Q@0!n(t8{7QEt>0*K4- zmF^aqEQP{=u0f!Q?7#+u3CDNLxFvS-k!Y9OLIy+98{S`UgMS0rIxy&=vb~yqJ-52K za9~)KKMD7FRr>NH!-rxZ^GC)lD_O5 zk{6u#dbLsrF4Ez=_|MlD$e~78Xp1_v2gEEa89;z7K+v7ur}odzoe@D znbU2@7Z>Xj#b2SRS|k7Pz2iz2iOWEb)&d}MtDZ6F#!&5=qVTM0_j#{)OQ2BIhOb7X zGbhwRj#Y1}b!tq*T()1uHsbZEfwoy6X^&J`M6i{U&j$#?XLX|BikzQAg#5icPQs^1 zK?ba0O(g}+_6#v4tKba&;K_}Or|@~qqO_M84|VI3m)+M$=v23jVq97-srw zZ1^{HyAIMY7_CygEpjdw3{W`U)mgo4I~pSCj2S?jYL`m}ysCB4W@_%7x|;mDSm#lI z*Gmnac^u9;*z#KMbi463`P4KlfXU0cXCmL8B|Pj)fGl%tA06EMNB+_ez9QbIau5P~>gsp?^i5H`X2A^sF@-=8vWTOpq zQyXX*jRv6kuxGrc1hYl;?{hKb=;X!*oR`qGSoiXrg)>f^r@qu$)A``!d{-3u{Lr)o zW_zWnXwg&!m_mjK>d2~=Co95k_NBI?18As{Z0mwiS*0`yu)InQy&_rWd1+tHj4iKI zi($kx%<-gzm=+b&)f;xen;)O?B6lMtBWpAf+MG08tLdrUl*-9Z3qTt?(J{w@GJ0H%O4g{p!-k))Ork zjb><&Ral)-eHZ7m0r4Vjn`4$oIJ=ft(>F;0qT8mFVvnYF;}tOTY~p9<@|MyQNT>TT zrs7j}l>MJ$?pJ#8B0mQ7EY{-ZA;edTKiIoD59aN_W${K*nBTz%$ZONIE#$PfO~(R) ztF{9eM%j4xiOk(BeX^}IKr-thXdx&!EE_I7-X?FWo>z=V*PZUG6rx+(>?;~=?Eyi^ zw7Z}>7qAoZ{CB*?gwq3J<5F#~_0%+4SLVU+`8~=_Vx}rB3`p2;K!nuQ)A^2|wyC0} zQ&`!x$<*olvg!NMLA3koqX0izEdcBOIhXIQ4d9;TcKJKw+3(`5>NMuIp4VD>nBT{= zc+LfRzBT^^K(Px9qP_-F;(`DZJ+%(a0)SP-KmMl9=0*V6?wo%Oq5gdz5YGNfj)rzUp_JJO72R)e~%*nQ?CL#YCWh8Ng7A;5{)6i zFDLl7C#G2&d~-nHb0mr!<0l{R!rzh<{~vb)>Nm|DkV}T~Se8Tv^sT#aU>&;}ZGOSA zXvwi?sU8kv33A9zWXO&cP=oWmDV;5;#xIs122zUz2eI$SYVWEx|3_>d5TO8`omaM= z+3SDikp+ssD;^tJQQ7j@#7#qF zJo#}^XwOKHhk%<+somD;LRm!j(N`xNQ|}DNqjesjcl&cFld_!NCQ!Ws})Jt#XFSzx@lBeaj4nW%F;f23JMSLoV(g=(0c zyuD<@f5_u(QT9e!?yDsJ*wQmnk;ItO9G3MMX73G6dLZI3QPz5F()#g4q*1t48sO@2 zQ$Nn|63ADI)N=HKt|Ni5tPcGI^BxJRE=iSGnt$rdyN)=F7T4+n2(pWB; zXTx{s8Th9236zd~eMkUTXu?A=4z?~Dt8`<3)_03jcTGFH^NKpEP56`M7rC{X$W`>W zk%h8zpBO+$JfK?jwb3C$f`))w+A>g?T;l5GMLYn3g6kZxLU>t}Uh8hQIa1iIVhS?f%uP}b{ak5<__iQ*Xkq^-K zMIF0pEzxp00ecLZi=%mIWv*U*Gz*z7Twm{ZeIfMK9rz7I8X#O8&6+_SVK~@ib(xiu zEQQbaiC@ir^4q73sj}>!QI1~Ty`G9#6a%|cCpqsHNk#aVt_5lMIp<9MM;sHLrQy(8 zV-25Hg}2xq(e#*bq>tXNC&0yNeV=~{jasYZ?ytmB8?USrp5~>rkaBTjdH9`Z))TTs zJY+DGvAqsL^*UT=_gt>TEDD2`8m@=~wb2ok?4C}~)m`eiR0V0W>ogRrRV%cTTB74S zVYL3-gZ?*+bQHB-d^>U#8-&orZ%31@*Yk-Ue`&!g%AEV{mkP7F7)Pj}SaPyl@l&0s++CH3B@Dh>eGic+ z9ryWfhLN<{YLTSqHLuylV9-ma72SPen3w+soCkFH21WA;HRBGBj?O7WH!^J_sd{W~ zoo2XWJ=q0+Shfi%c@drKCj{5o)QWdxpCXDY505hok|CIA=engunoqQNx;NRHFeZH{ zaTaa1XmA31Et(wBUAB%t+ibRw@BYLcK2$7UN&BNaG01IhaxMMLSC@p1F7L`jF>bU= z=GTPrcTJwyR5L|>UgOX88*A4dC7z8UHGLCt396SHZR)+MZd*^^uJXXiL`JQ%)Du*2 z$`od?9xsGsU|B9yc85F$w_uNXi6@0^6Qqi`rGwP_3 zM80Psv)6;VN`Pu1FtB><#Va*tHdV*SF5j*zM_~^XUvdTiP&S1D;h62A87zU@NqEp% z3lN=&8UsSa=xGwzk$$be*>-Z+}5iWU<6sOPbRbYPi6ip=udNxZfCwxR=uFj z>8rZBZ`*adcy}T%15!wM6&`zuZ@YLw!SEZ1tnf(dI;k2D`H@pT_ViWy`oK+>X*mmU zg@_A8-v>{XzMFu%uV<@@B%)X?cDZQ;zR{9B5Wmh4t5>{?yUiW?mK3J%JkNC)P7K zo_U~xOnsJ1b^I>$=9TFMt!%GIocpdg5-;$UTia25DT$isP0!DYl~%rSEZXlSU{XLP z?FF-ci+Gwwr#^L(#dOC{c!sS$_?H05l|w~fr|3!5J%hTR^+&`CuB{sP4IaI|!G&JZ ztYCG;z7->o4ON;raK9HH-yJWP4kG+^W3TTe!otP&$f^F*K?(==1ZAF{sQ=VY!!n1IN8g z>15mLWWt*K<`~v z0k-Gyhkzn6pG;v-=PEcf5gJ9P*i@FeX0kPHOA<7&rj^TMn(@RK^iig461q=2CZ&`jo4aUWwBJzJ#B*votWfyqtNC;eMO z!g(wpco_iqaP#J)dK{f?oJ+?46u)bn<(KiS?s;hlB`mjq3NJO z+)v+N8%tUg?^fCf5wiQ-?%+86i!70TBh=m0*Th4tY)9UYVj-FD zACge52uMzGVW%dl&1gJI+ly(%I%KKbbk@so*VO1^h(CpqakwPdZ{_e-X1`7RW{{sp 
zkmU|Lj+zju2)YGUdsPLEk3Qr^C1~!vu_T1A-C-;-$ngfciXcqe4#*}C;^-W`Zq`B?NS7)D$UT!P1`jU&*eR=nmb;(6%;&rgrd9M(`Xi_yz4AaldwqT3p)P=07 zjG@a`J1hSda#w{33!4ThZV=|9Ny^FOhff($2sPV4&)&2lWma;E{>Vvbtn#F};q*S;VBMM$_8U5!&VKY5pUL?gj5d+VFEogAD5n`hsv@M6w6IO;IGt+Gm|fB9sBM z4A82}(8$5#k0A%*3MbB9;l6y%7S87e)KR;fiMg{g9}=)C#C9H_4w9h|s0kfSLMwxo zHd^1+HLm^Sd)Y7qfe-AfJ)x=zDS^RzyIr`z$Vw5c(jg5n z-+6RNJjb^?z{p@0=k>q;B*}M74?FV^AX!SvN&u21f%pGIFyt@CBI<29?air}K}3a=fON#0TxbYY(ziIcKyDh2s!7r}pOf6iJ%A-# z@{Y{!FYxnk`*JFb{X8H5!GXYkkZt4q6B7GJyh|@p;0#bvCOZFh^Zy|X{R`wlA+WRE zRpEh8gzDYhhu)vJtdSj(kaP@1@6Jy>Dmco`#E}+;82}Y84*&#Iyyl}>LI;G8C7=A| z)_<2+ar^@-DMCRCOwKj{0U8n1h@8Ww8b_M_J{`+Hcjmw9;J>vUjg7)xxolnU7&!r& z+P{!e{l|eVfx3(|y^J*I4%PyI>HqqHF~|;R82rJ2$rQ9d=2$!$Qyb(D4sHE!1Rwv> zg?HWqG(ewZFF)~VN;5uCslZHjCq(%Nc=-#~LmN#Kf)6Q}?Mk}bA^1Cq8(=U7OJK+e zX^P_pg72jPEtud@rZF`%4xsZ!lrUoeg3vr25@#XgjZ@hP1$)=E9Wt~;vh@ha9yRfM=xU`$FoRsiDQ-~7!EPn zhC5_vALi;6szgS5!}%)UbSQL8Lv##0PS!?c< zE#=QjBfFV-CN|@&*=D(nQYq%Mq(DVCP2#FDz>}J2T z&em*`r9Idll;wptr!@Hg%2{C$&$j1URWnmqGsBCm2$V;hhp&tR61D?NEXxs%SH8tL^66MNrOnBLHYoLO=2@^zKpDbUWQ-;4K#x031-xCNEJKs zf$$Auvaobx4Y4BCZ=hru1xW1IPwo|Niiq0}^rfW>_g=iGNGo-e?~^pDj}4&r^6Z$g z&Mckpoea9&p?$ePVaw7$#Kpy7e;m8w*)wn5I}>(nq9F32J3l{cX|)nxShv?&L|c*I zk&4jwt7EbE4E@X&)I7D3JK_&-MNM|I1Jd*49NyG}h_KBc_t7MyyO}cB34JyhXb(8x z)Somc{Pj|WA`pT*WSS}l%Rq*54VJCYUrt}WB1TaO=OulzCofn4f=ix)4}5h$lnYf@ z(KAVQ?#B9PxIJe`^Ox2*#x7n6mBbW5ITw57WH$_dQ-0%kd!Ju9Y))0+8{+}Xm>99& z{iE~VD&mLs&Mq(k>`se(hKqy> zVS@~5qf$*TwEat;MgjbZl}bAF98(8*;jVKRXlMF>DPX^I``AXr^629wA1)euI<0S5 z`dp!RjS_Ze*PFAdIRbPq;x20ZH;jIw%Zj|8J}VKDe!s>q?xd4W}P)m4!nbe)u_-=uZ$vIFmKitQG|@ z`T@CCAq^LH;(UF}9w!=0Sij0+xW%A| z+7B0{_K2$JuMCY>y<0@+v`603=(8b7@&=YW+`ecx#RRH2pREqxx*r8bivuWlhGpfs zLN>!u*63_MuD3Ee>5N;i4u=o<;^0WEtKF`s(4_$EZDh1?o$<#Xuibo-~&I7-) zPuQ_np~@9mi5YHl!8S!eF>CSHzpU4!yhc&%bYOK^n!6Ql?XDKiAUX(GqBNpAfah!( zz_D*O>SBWmH6oV5E;1vET1mxB#0(2E$^fcQ3ovWfB*=KIMgUz8$UCeNi9f3IGXRfX z$Q7oGitgjhBEE!BzW1fhTA?$|qF8<8hY<%)8%EfFU|`dNVB%@K9n|R_*%4VJG-$5^ z$+z+%KQIg`?+_J{^^{WQw&RQ2GP;AQSW!@oWHE-g_V+t(RE{OxtN%w--hJ;r19{H6N`)RDh8}sWbm0dVeVjX$gYKUV;JLBK2wh z-BV2hihhhN?;9}|k@$F>dp>Ro^{pge`uA#FPTyT+X?Ar~)AWqGZw-4Tv8qJv@A+B} zhxXN!fA{cs8KdtkaHwy|0j(3aep23R)PH@rB-^LD-=BvVirkzI2xFG(6LWcG8O0pL z7lmJU1|}-+YZM|!%3Nzz&%NI=CB}Q88%1cEzld_nJ?r&l;LUvv$%jR#AM!#2!DhR?A+wX#VQDf zC|1vSZ7ac0Id#$*J@i#{YAdW{pIB{4zrwG6DE5lBUEir?oBLNJy!1Ohw@C1$~0L7hxYa@d5k~D6?0bmL@fEi&~ zc3Hv=M}@Lf99PZe5yuWAfT)6xF~BY}P}F5WxLzzXG+5$*?{~0R$mV!Zq)LpnPQ?_c%>5UP46K-coFf4Y-E>0| z#x-rD6`0g7Xv$rM?O$u2pSwuh15UP)rU%g>9wASwHh>}&?xVKVmC0T(Vj+j~ws$n| z9hf4?Kc9^5xkrMm%&uuWb)9$*X&Vue{D=S=mdU=qZ{;Nv^~thrg7O-AE5$2nuxmQu zPU*P*BVl1q@tOGkm&09?6x0v%KOshYP+d$OF$KupN*R^y=*?}K=njXQNy~UF*ZG5! 
zVt6EAk`X-ye)Kq84*)QK==ekN>V43DZ^*K)%W8%LBW-k^>wVrQ`Q36+au)Kp)4Fgo@ir<4sm=EV6~=u(g#EbrZ4ElAbt$~ zff{dLm6jSQ?R2}EB|mY8a8k-nRQG7r8VWf>f5~p0hDSYo=%Q~RQ`6;`_w-(eHYJAX zrpt*yaS+4|5txk2his_Ibok<1CE0@_a;4#bGbj#uRXVS{)@wRUe5tA4T(7~bt@hr@ zawve#YhkuC8`3Zi38uBBk!k3ErTr5di}%uPMV#35B2>F!!l{U>^*cACg`Ad!9GWO+ zcJi#B{YBa>|0F{QQ~QdkQ$)B>&^2nk09N9z3F&!&1^V?AwUX|MCkXM8<3zMh`>?%B z6R|-5lMY&8gA_;H(@;)%3>nuU6B$EwN=1K_O*gP@TA&#=eau&5!Pr6RJHBSd3>BP( zgV*WHG@IAy@~!=0PqQj|?TBC@zGKS)0J#6f!pVuZ_*Df9U>`^S?BtAkvU+)A^GiO- zO{e=F*fGX=iUCoY%!}ArWT>H$D{pVFbV;^`W9O^XiKVr2?OOPKj z9c|}cab(Fsl>P=X^rkkAcU~>Mo@w?taJ*RUovDwkg0+rY9V^BBOsD&@mH-;q;x=)d zBvRnG*;eGP2MV3g4%I$y(nZ`4(A22d9Axkacu%gtB>h^1{XPr6R~|Mcll-AN|M-E; zvcng0f(mXgG(nyT9+^eHg`x(uiPSnBV%=e%8eJ`#HS>h|<84#(+wat$RAa<#atIxv z#MaBr77;6f98yfW1odX_Pui?H%S>ykI^Kzi_*ub`Y#*iT<-9=+Wc;atw-t%mcQvf$Y%nsKsv(`FZ*@@`tVd>NsNd4!)0%R)jfaF(b; ze1vSp@JE;Honij{(GRUzC0|+uy>#U^nNqZ}@@rT-ACxN%&I$mtoF2x)XIkXt`%ksw z89{Ff=WHbh*`5#k1{EjHe30|nNW&8iLaPc?;NtGHI~sa#N_k6)B!lOuU4(w5Ps_U} zLrLqfUzbJ@PeC?ge>AxYLz3iQo$t3%U2qoom2_H;(=_d%pyx2_VX5~S%p_y6MP}^< z%zJ(A%YKM3Np&l?Ki>vFeBUP7sJJ@qinD8)(oWgFRJKUaLQUHw`*n*#V~hrfBc>#j zVPYgL#{-^N)9^*Y02V))3s{L*`Ra}TDH9h3Ca;p21GO`tw86qbxEJMMYErkY(Mi|1 zIo_k6W8$ZF(+W@MulLK4HfyNme}bwwR|$(o&0R~S)A@b z<(KN%IG;pA|J;39^6>EcA2cknic#`X>Y`u7*p9t^ zr7Yt!7N()O(bk;@>R|&1ZM^3#r+%hxlMnS~7lQM-1yd{307!N2yPmH8K-D19CCW2O;Pg!t85xGsI@D&@?X5Qs z8W6SexvW%c?ewMO58pr$IZ-Ev#M5jG=}-48BQ$?yEm@<9oz9jWcA=TZ1P~q-7cCn$ zRPmhCC3is}*60vdf+GKB#O7J(r%7CWdzW{P0&0pW^L-3qs~I=CG^>mgP?#uCTT8R_ z*@XN9|B#}VzeQM7Hd6s@f?aZL%>n`6^ru4Yn`ViDjkbfVRR^IQ20CkVV7CACykHyP zZYLbM^U2EjXfl#Z#TtBSIMg2J$J;DIIhqMC&K;q7e41xT>84C%S;znLtTdHp5vcId z0!dax-_sitVXEj_VH0{KTR$@tBD1E0r?r}ZFQkK=Lql5n@OfFs&4dQ$jeQPDkhaCL zzA=f)$K1>OfF>l((M?q>lF9z#6I2h$4LrgK!R-WPpD$WRrtzj2ZS<;$^PUIy8Ut4} z1>HDAKs^do5~@bJyV}uno+;mj{3pn&)(<%Psj!Il5v{SaUZ#RqJyfzs&o1cpgYbpn zGGiENiLvl)7060rL2!=2C6(sRETf%|dvOJ9&bBE`(Z%6SG^D2C8y80>Iv|Lu9_Y{> zJQ;}5#%uR5zWXVhcU7$50Uj71es|}L9#IOOEx8Xks96t+=IkoKC{7_v$(YI%(Wo{_SHmzYsFU^}tJFkB zar$rhPGH1Tx1V?<00;9r=G(ny}I&IH^5aFvauQ3r=sg56%`tX{-O?Uf3Od@ zTDV92Z0$Nz)n$P|-+X*xgXsaF45Ir}h0lSi$rJe3Xhq#whRiss0%xV9V|cE1`9ThM ze@+$E7N**@>#q_2b1X3d(=-UWlc4;&)%4eW1))1tR?u-rad3vqwj3rScOvLc1dJ^F z&T=HS$;HzckvRDM-^=*#(#>**Q51Z}0G>0D7&Hb7N`S_s(g#sX^AKnJO-=Rhu;zam z5d7aWPGmL0=&44)@{9lCyuw`s!0GV9R ze|UQ@AymPetZ-IRmqG*Iw%N7|ika4;#$gK(=V7@p0m>U74~JpiU&fs$=7Dhgqpv^=SZ^o$)?EuejO{j;2fA zts|&wBZ>0TAa-SuHzhlgqkp76#tVO_@=j2opy%1U(-RM0-HOlrIc2``o!7%Nw8;_sHDAjdWoAi zZ_jy>Nq=InE5A7rxO~g6@Di+}90{bHGGQ|<;b{W;+8hvx9wz_)B()}1aiq*ab7Or&TXuTLLSubWOeh4Qcj;(J*ujXai) z(w_wucQm>O>|SW2?Ky`RbPQkLwJ5KO8uodPl;5YTcome49@-==zyMPjzai}1v6{k? zG+`GCIE^*MQTetGjfIRQO^{*LdH3bMm}7!mPkzg(t?I@RdA8g=y7#hY{y@psa07+R z(;A&BqAg?MnNF8#_Wo3OFxk{KBar#TjEuS%SeZQ}y_nUAo8vZ^{_%C{AuYNyckpbH ze?6*|YmWuoZaXmkNaSS(qxu=s&TpVWDnoDQS80tqb|ky;QW#4a0a_@gW3^@p>*AEJ zKUqbacS6R^pQe(6`{us6sFzjK@N9Ro%<1szZfHWIwL&M-A;FSB+^At+zhQcw*izzw z1ZJXV`mfj|0+aS|wPVsOR9CxIkpMNeUrkFy;s#%}N1bvM?&eYk##%KJFU|ULk5loFs|rYE`ejGNy12u2;Z{q*tC)e_@qzrI?vF zPa1$bi_`2_H} z)w7N@pD9FNO{6eAa9b!2zNZvA7Z|S$d&^nPEXYvha=;AM&lr{6vqln`I4~itOQnPv z$8-#u`#`;*VrcFZ|p-m|3hce{Ln)5WJmT zcVoFFG#JwiGu-ju!m(&D{-zZo&S~<{EHin?_rA74@9?flM*&q5-gy;&`sdQ8?27J? 
zH^oFt%ieuJhFgU``OnrO8=ZuOok=DKJU!v`AdxuX_+8W393Tx3d@haHo%a%7S|bSb z=|{A7H}C(@pxn|Y)DJZY8<^2y*<_6{?1tGp2Wcfwxk>(0@&wT}d z)4r?IjWT)uYIRq9Bg zA2UNyr}_uh$&sA2qokPf0@!;~y>eH&=n6o2_zb}p@nqRr;$J0Vo29Uj9lk|@-$Ydq<~R5a&)@FB!KPi2z8(WkoQyxtO2((TBFyLV zrY$&En!)H+sP`3~B>h?=I9IW4)B7pzqk7G#Y}!0mDepnv$iVh%Y#K=U(;@n{qoOgo zp?Yo6cct^8D&ll^@!}lLE_+ksa>K$md=nML*&`6*HHYEz686>U-ML+bQ$9BfNb;pi z=jo+=aotT$0Q&-Hc#AWq)ciNlr*Ypmrq$NrR#gz{B5v`$$&vDFExh6b;-$4EV zL67)yS^&UTG7OL!?YCcKnCb(X2GhrBFAq4C*>cQHpBn`XQj?pZ6b|Q!%R1fCSF_(@ zBeg#b&F~QRoqOioq-2yos_5GJBpX%EC@>^;0&_A7(4M!WCJ?cINybl33zfVjodZ-z z>V6IsEzTyAOyszeQN;<~CbQe_F-WE@7ed%M7kR8U4>q?|Q>dAOYz}l8F7nM)p5X;D zvJ*6|$md0I4ilahe)Syn|tso!W6FG_>Le;NUJG zi1?g3Q87M>JeEEcI-=6GQS^Tlm+RR9!hSF@}x++5&`g1iHu>6<9Nk7wa?S{ zs&3mu5rLF-PI!R@cC>Q_9|}Z1TheAL`cBSg-#?moyE04L0$-nhqGll4QN2XujI)}7 z^pl4$t>vDOEL^bkWt>MmOW9OuNha~K!q?&o!NH=8VcB=3;x1KC$hsxoJT{?%=?lur z1auVBIdna-9SBVp|KLUabtqbh;C)_wOy3*aBnn*X86*)igi1&lX$jXfdN%V?3g>NAtij2vj0xeaj~`tH3h0wU)#Kc> zriS-2m>K|?>-2;Jz{&n#V2M&;12<;f&)I?<|l z5iKqW{`T0bBpzRo1a41%h=#ih=hSR>Y_{_XKyZ784-+-vY))hAju!H|TRYeld86%V z49ajv)|-`ZZGqCZW}D$fZ0cp4DA zbGfSdVJ|f1ayX0P5c1+h0iW6BaDB*ct;Clya|l&K1H`9+^-e}nRKJiNML;q zu0pN=fp9MSe~rBU9U}YxNjsne-Jk38UIyb%=@m17Nj+FTUPc?nJ< z3U>$TF)-r*Zco%G+?P@LIkg%Ng=jWKJ#_yGH2foo_*cj5((JNDd{9=jPohjROi4>F z^0|~iU(zqjNS1y^ac1l&#Nw$nLIbYDn7>2{j0fpQpt~W)*l=Go5Z6KE6v>o)hj0Ep z7t3Ff^Iste8BY>`Pewxl7XwhiRWtyESCoKt900)_#42DEKs)fC;1S+R7QSum`mUT* z$)5xDZ(F7cmEt-KDVJIR19kp+MMXfQ5hq`pjEu)@6gCf2do}+qvxqBeZ06p3;BG=f z?uOo79;VSS+V&3jMEXRrN>V0hrY(s&b>@Tmz)TTn9#}RKY#Gb%vJn2;M``|{+E`82 zt3*^ECO{uv0t2fEOqB$BKN@eVTLiKQCQMLKaTY>3G!?uZTO^uYHtpP3?VjycQkZq5 zB{==o)$p_}Y8f6gesKg)`dMGZ(#IptRSKGte-<$BY-+#9N-9h>DS&*AFux!a8PrL;f-tgv?LLGEs%zAvk~`e9L$6fNcH zK?kw`0#lG6`{_D$!_pmS-HsEBj-(q#@;x%E)mIT;OwJkhxzeqF=5zT&X*SWi`S}E~ z%aDPaRbFRjF+r;e!+XMuPF4y`xUhzQcOBn88t_+Rkr#Fo-e~E^7+|Ek@}Q!&Uta?I{l9) zHz#T;jO{*ucNcc@*85EDb~(8|r`Zgqt<>KSP~$zXk`* zACHD5n7mxLiYG!NvCIy9W~bw|q;dH8^(tDCQEg3AYZLVAw?|ElO)e1i7e#BfkY(qB z#Kb<5YyE4yJ2tN3c);z4G!H6GeyGSo(@3U0KHk%c4AXYGm-I@7RkEkE8O{AR+0q1ylH%Ud#^O)jJaeURA>GcnibzQK&u7^|vRF~5Pz z%BlM}&Ruy!HC1EJrZ#VJDL*0~ zLT8tUy1izuyN;IGti5T%*~Z8yf4z8_>Bg}$r@Nl!jT=S6>RsGio7+PV|J-<9DLA4z z*Tie2x`(E4^C{4pQs8SuL>Y;F^FLA;sWb>#}b?+p7oYipZs(dwP z-~H(Q>WA)%6=fYoQF^JQ&z#-Z)X%7_Y{Gu!PJYXmr-_cET%Stt$+o2n6x5DzZH-lM znQG@kRm7u~G=8&9H@sG=-yaK!5S4&End+_zsQnf3NGv;Kf3T2G+d)%S@9jC=2tEAM=+AHKVFyfd6 z(_;~(^H;k07gEbk-8m4Wk5nXx@zX9omnXI9D;=9^VgeF7ZzqgR9+kQBJDb#Yh4>8U4_fXVa>X zPJu4oLL{u!a}iNNl1n0J3k_wKPt(UC^i4TKY@ZG0T_a*ui;%kF( zUYyi&9QJ~tCHgA=;dZmb>3Yc+PJ(dlY%cu6+PSknOQw^mn#>I{BV27XG%T^Y(C(qe z`~u=0p1|qFW&Ft4y+wjTz0(AM^l~_D(!~LB*!jil;gywrY-#0^MF`5r+ZwTTW}CiP zsnMruk+!7wIxeO4jm0_z488i0=-M~=8CvdzJ6~|IKe9RyAZt{w*EDV-hHkxJH@}#N zw-q&i^nHT{PM=v>_NxR=gNOD8#YwM4{C)#fGv9z4*B~XEjx8+9*FSFZdl`nV1u6k5 zz-6qiK}GS`Bddv`-sZ??9|z0|-5eKJnvDlP0$8 zNDO~Z`!`c78{0apN7Ii*KN}PFNoSpMe2Co)>ti3{h{C0LX*J-OeEu7#NA-e!s4w-Z}i?8#6832yh$LNGJFAbRM0?QyCrK!YepeV{c!jA>eFFB7${(gnc% zTs){kBH^FRZBc)var^y1lIJSQy_Avs&gcCYwi*SO7-TI6s|pHVdjZRp6)zjfo@7(y1wbS!0yK&oiiV`*qvtbrI)}^iGgxRb3k)((78v zz*da8Nq4UMCA4@hap+rHG9o0MCXuDT;+}q(D^5L$`3*eOq*wRD3P+;mdyiA=%$N?% zZm9&ho1&0|Z)ekv6;_yn99l81*!`vQGh*r;JL8ulh=SGLs^y9%$J~+cR;FmqbE^k2 zMM1q7{=4}XISNM@8H=`)*jq~wOwFj1pEy->uI%(MXh_@UMtZiQCIS1_);YQDke#rm zC;1f#O@Fo3Xy~r;AiezyP+vAvBTtkBPiE>wtpt+=6Mr3NnkDpIV{shup`G#`_C`LQ ze7+Qv>YWGMA?+`_?w=dHxI*MV{||F-9TvsE_l++h2ojRgNGK)J-5}kKfGnkyh=8QP zsz|MLD$?Bx(kLvQ(jC%`#F7gv>u>nh{XOSC_c`}<{hsH#oN%fwtKk;wSzK9SNvM|$2R&a;UNJ#YcL+vhbNNaK-mUXa53Gr3XXt` z1V?HDkKer*Lv%HNsMqOW4lA4fXYlpE0cigZ?Smlz<>#O+6@FK*ysAg0nT_*y>No0o 
zPD_U89h`8T*Lpg7p>^5h@jBAFu1Xw$ql!F$Rz2y6$+shuT=Teye=~~3VpJ5kA19WB z73J&vKlG~q#KQk!iMc^;JEE(2&7TK?6`LPwKJ*5JjBf%IA1W#kh zTwC<~kNn#Ixb30lJpkH||95DQuj8t!$V3cGHnenz(hXhbru;RE^+=LU*|FPaIPx*m zHKgphRl^jy@8J~uIhbBPC$LUkXH<)X-slYx_)UNr4O{@%I=J7a?_OGS$}x4R1*eJY z43VCPj3oCrd{+ET5ZL_ZF8^xDsTnwtU5u4`zu7KP%8{SfReuaH$YgDH7Bh1ehGwvm zlFX#8r>g}+jWAsVFLc-AI>pBXi_d~6F`Gqme<;m3%VN1Z4+{a6_`0?%c<<5--DbTOK+mC)-PaR@t&P(>%8JOhTB?m_PO`5@A27{9A} zsCO!$`?@u~)`q`Ti&gL&NxDXY;vALm<1WxGW)wahuhqQjw1DNNGw-D^je*0Q|tk>bo(s2S<0Y1|v7Ird0ayLGHwcg7k6Ri0gpA~W`BI!fWqB5|u*7~n6E1GW9YFG%$z z`mXi}HOAv8vLbcr4VE4sYw3lA0MUSHuG-U(iKr`a1;B=0$X%ysE+<`K_|-1inocGy zzAu6P&D|o@!7x5Kgr2V60S4)nf^6M+iR0{Ukk1E0WH$2#9cc@68vEs2g4DRqbY*7Lp>I^;6ny<_EVI$BSJfTfr8~gDr@V)3Co||s?Lrtrs_bMF3{{3 zh^&d#0eEBzJ{Cbup7B`=VcW8-y}qD9bs}&>_p1Gs)OYguCG*)1G2w#Cdu*zrayvlM zufLI8?4?rCm}!8GN2=S3PFG?@Ob;`yF1ld(R2Mmjo%z?Uk7vY z^(?2lSP8BD6c%><^kSV!*SMI}wHY?=GNpV7^5{8jv5%Mg{3fHw?$`QJ$Xe996!0GDm>E4~Gth&XhKNMc|r@ zb1PZ-(94~)RllaXbH8Lz;*JyD8xVc7<;6&Xyn8JS>}$$3V*1>NJJ_t4zJF8-v8DF1 zjX(gZ#yR(yEROLj`P)!7I*#3$&x(p~`DPn&ZioWU+fG-b@zdM+Sm8%%yx#ko_Z8Qa zUh5@3qRm}QzQcd(Rv6y-8+?fo0yoHegF?!sMkTLTd6j0tw+Ci(WITMyw$!wK)^%Q( zsE`ko=+~;f1OtAvsGL)}$sq{7%rnLa@SPV{43tAJNL#fk6O3eUvf8Db!p2?REgYz+ za|>?|nj@25<78}a?_yGCSyfjK@QP8y?{rF}>a^AbFnPaj1L7&5`$<{{{U4t6&iO5{ ziD4Bs?{OWIEy7V`uUBx}AJvZeSw>}VeK|Dc*j*LvBW^HB%P>9`0i5rF1V|t`phm|8 zU{si3>lUq61iT?CO;3;rs;cXQZsKs*{};IkU@Kj-ntm&tR0Gu&LH9vhn(VYN(fnaqg6VECAR^0mbTczULo*|s z#xAJ^w49R_E8C5#NC^zwv3R@!dj}?vW-;p{4i}{LSLr4))oEW{-FPRFGFdz#qE@Y2 zsw)|aia`RwG1vu2-Q~?aA#EIszDzAtiSTeI`PK0F(I!M3RfPm1i?H{uGSHM=u8{1l zK%EiG2c5hXDiNKOChEl7;G>{)6Zc+~-=vdBdE0*c-TP7DN1x0U>6?O6CycAIhOrK?wh)CmeZr0=T?oQpAk zeOcEVpWClF{n0p^y6kardbu<{qBJVJtnGF_swV zz(`{~^?qsjulR`;M6zQ0;H-3lCXOv>@rvNvY8I?cjd;`#Kz**A{7$D28J4}RS^9J@ z7bauQC)}geKDg%}pq0zTW)^t3c62{!EyTYnoPHYSh8p*yZ?Zmy)#nRl_p_{rCdx0y z@+ykbjXr50na^4ZKJU8a3a zJ zCn$0kpf`J;z-lX9UIfgZE$cbm|#WGwwwzAHtR^J0|klI%cY4Xyr9Iu+TV8 zm!1u`R@KuKIM9YDYn1P(%ol0y9ZK)n7&!XE+G(Qi$!Pt?N>-_Rr+UnNKJ9shpp$}C zDzTFsr9OhmVw>KY_vbNMTi&|;{*0Gw$XhDNRK#0Jjso;4N7ZFdu(EQ5(vZNO(*_9t z-GEzD!l?|3>{JJc+#!lCCxma_@j;E`Z7AN%8~p6|dWjZ@4MM{74r;yxbEV2WTU3$p zSBO-mzgfmEn5shSbTWv>uRpNgl$NMK<~}8w0`z2J)Qe_qqy(qCGjOJZxF++ZQ^fJE zT7*iEd2)P-_E*Ws_^%PXnAuJq{O#ow(`g zFVN0};fi-9j98@N>Iq?o*$=py+bDCiHkHTbTvV%N1>#3V(|LPwxiRS!J@g32@YU$@@KF-a+jA_u( z;>cMqWR3-z$3^V_UWS*|VhO;$@@q%G2b6SG+s|B;R@7R{FF3vy{7`(?R*<)s?Lm2q zC@Zqc;J<^~_sfaNW1tt=9>P$N=eCHxeFpBoT=4)Hm_vWT>Hh$={;%$*s;=S-#d&`n ze^I65+#B)&$ljP?Co+(5&34TjmUYYgeVkz>;5%?=JF4Bpe@P7hJB&+!v&#>~{24H; zG^{4BD<}C~Gb9m)cNPL{QuB8 z1`vFWHpZ^IxOC_$=gF|3{8tPdCDim3!;4yOX8#tE(bE z{X0R@*xn z^Xd&_#}>G^Yxkqf3bRl(dN&22IWjn+8*Oz+LJEXXLFno9Ls#GZD{r7+%oOsO_A6WU z^$xwv?**;i=4ffA8zqftFfRDYaId<8H^FtJX%VOueRI4RtMWdUiBcMaFYoz1Aa=Y= zN8_#BjXI4T1z4l$T50Ol#3nECFy@^Tfq&UqSo+vIRyO~aO3g3 zbEv40ZCussF`_0@tg+GdPrAuJwI($NI$+$Uyq~bV^!yvSJ|UE$#|t?2BBo6ROD{-8 z_>@%E;%#55NM0KCkB=v$es#Qa4Js&vkh-mYN{IE$l_!pxYCb8M4=8TY7v+Y5FtzxS zk2rFz@6hXu4*Fe;RPMPcmkNyu3x`iGQH3kS&*cq{Zt3d)HwXe*b;?a`tC5mStWx<# zhm$_+i_ZhbUTmG(beBdGVM~p69W;$;L78Sf9tBUjV>-yy2HZ9rV~g-QweU(5W2_~n z7V)nJfKz81&^zNd)PsO^o_=r@mrR_9QG?D>69_1?Teh#dzlC--S{OKoLxYLwDqh?$StUS#VZx`wX6@yzA86Q{4XLmW zGP`$1DWny>t9R5iPL?#8a_B74Y4JF^LH#nNj}|ehU|J{8v0KCp+o*IsSdB~9 zA8=&Pyj|46xn{Nxse~$8Rq;A^vD(cv774~1#;i|#j(L_u_v~Xna}JuO#?1t1{AkE2 zvY+Ki0~53!ETZ9%*H`nvrCfG%hS)c{HKyxwg7<=#n(cr>*RP>crFF$1(Gp~>ZDnLM z*S&mCsV@E)%GWrE!YzGG+{*Z7fh<61+tnfM=7_~3JN!we75O`Eh0j&`(!UV~tf2eB z2O2P3Bn_7{WcgiJ$@bKPVQoymoI-OKRVqf)&ZdX#JNZHN$diMRQ)$3G=-uhfS3fBI z1+?RjgvJEqI$wHLgom%gZGVC6_L28_+6FOL#cLy{Pa+<@`5yc61v7RKtJTsV0bxLf zy#@V^eGcvJM&kntWr%!n%({i>O_ 
zJq>TyOupERsyEgS5rSY_1b6JeZ-}p9;6J$PZMk!QjXRIm1{Al>|F%=fsrw!Yyq~sA zSvMx-!e#I8bii<$Q9d(^o@0p|Vi#OGF!C_}M81a`;jm(!xb*^c5k}66RO0 zAepTdOqHc!wQyw{8+mD*-{||w^+REgK0yF&7{9oigjded9qN;^g=NMzZ3Yy_xS0lG zJ^JG5ca5HiK6W?0l&Yt1Iu&0+gW@~N`LxvSVv_g5$Erowg6)&M^Rbf4F!T&k=h10n z^>sW9}J19u&Hytj6?5s_;J-4G?=Id|~{ZJ!v0O)M8A-bw{xApksHxVWf^d(r!+OfB6I2U zN#m%U_Z$J80LTaXm}R7H-94OJiLS7F$8@QeJ@5IA9%4Tfb*4JLhugFbAsOF-6@!JZ}=3an5Cx}WK*aAqbRHCrV!uD=4eMV z)j95h&DluK5n;Xo$8ftqBa(8T64J_boL3f%F(F~EM5HLub7(ED)2>#2pBajH_Ev4mi}i;)b{O#dNw)Gwf`{0$e14H3@|{ zK9zh0>#(V~sO0+2M`jm|F}MQmI@=9;mMFt;Vxj|*9=zqc-vBRH4L$rkM}wAyKQDX_ zufi!(MqSEXIgRvLY@|m`oj=p>&(~BC#=81LeGM4{*>lK;Uch! zaX%MMKkjwDsiXFFch4|%YX%e@h4}2=IL#GIvVs+wWc&7SSv}pL5@{K zz54AI6fGz%;jOjNl72^0A^SY989oUq!ZQ_pG{AP<(2p_7dCGkWhF^< zsi~ciN_y4lst`Fi2;T(^`ri0H+`w#9UJ7WMzk|$P;?mZ`EtKr7|;4hI) zl5z)FEyhf>g1$y+r6-r-WMH*eVzmRu&hIB%p}npe^SBgJ1}@(Ms$y(EcQDbOQ_!oW za{R>dq><0@C@d0_(9^iNnFY-N>g0bcivL{KfSN}B>n(roalPcX&tKHBq@Q_&PX z&eOpI>jM3xcO&@GzX{OZIRYIbx$ExCKVU_`JFlv3;tkfd1`YI5G7T);8ZhSqv~SUZHuJGitayv-@VmEeAL0N?I5{qRy&%x_ldV_& z{Rj}rO%GxMVMKD;8A%y)jHU`P+S$|@=GO#_d`p05eI5+R2?2Ey=9_ieI3@3VOMR6N zP|iKzjQX`GKD6)0_aqCcC<|8A)q_z0AL;kJxrpr0M!uC9Sge(`UsfPbzIIFxl&0sC3d z7jYdN@e!DsP3?xs;kIB^JrZc)-0(o$nL^XrCri2<_4r1VlcWghMJAvw18XLD=WGcs z8P%?Gfy%kjN#Z1(cr&%7CW#R&XuMqA#bTFX3jKVcRd6pSx%b60)B2GUg{ zym!A@18U>upt7&3%6nsW{h_w+RT=PCq>+!bOT0rmBS@rG>#K+Zqdq>G8dO7RLgEF} ze^A4=SkQO^OI0K2TTV2YAfbUAGxTM!z?S*P9#gj5cW*D?JU^tHCl5(3ff=Ztjve^P z3NiZq_|@wkbYa*|Ah=t5_oFa+Sp4h{IOqPfY(P1HY11zy=UGhXNx|g4p3J<^4qM^U z2?N(IU{tlFd#SwjJ_V9_tx8Hj;4|=n9nj7FwWkpP0N5E=IWILx3od@laxtl0Cmp}& z29FSysYl|wy<%=MovHj`Mzm`#)2`*mRAh(tPG9G$DPvpaK|#gyuUuum8$Z)WpJ=1h zk3*v^&5oW;pUv1mN!;miFfCa)A@o#zg<|d@Fo2Fl|IJ12pUJe`_|c1Vr+3I zIY)eFn#QP~6+R9t?%#?fJn86daS--GmKZ43)!E%>8QZ_Wp2{P>5XH&cfw9^^Z4L~@;*TxG!RwQ0~p z^ADR4JYi+2f|ChpBa`3#V_0^3izIk0j72HN6^Os@xsMmxK1I18uU6nQ#2shStcb?h z?t5Hcn2V{RtlwjoSn}NUH4a1X@#~R<6E!Ld4DeSxd-4@II1;yp1`gQ!=t{N81 zGE3A%`w*W29R|ae^GHSD8lFH+C_OYKGLPi646twnqFks3g$TiVWHdKfn8U+uk26(z z2JuvGr`bp(Rzd6BLd`C#IXhRMCQ(tNI4MNk*9)mPL91wrlaQYqmmGeIR~gmbyP37l z66&O!eS%pkAG_)mobLEO7L_*9FFY<{ZmOkTEa}{b0l`}=Z6I&&I z;dH8SY7U$~d$6O6-SR^CU1uX>GgX$fdAVRqxfZNw4A=s z&a~%~B#(C2v8UR3Wqn{@SR!n6ia=O-;`oOMs@csb`9x}#&;z!%sGBIMVas7ZzIw$GypL<6wu!5&(aC!Q z`{?jOzm7PYIh*@!-6DyK35+|cHT6vrb2#)vEPq|k$#L!~kMmd=cD%j_iP8}e0_#uS z-x=|vLj5By;9MLnG5S)%Iy#{nx0o6KLG*ret#-zekY8CVtrlTm`}P9MmUm_6^wNgL_n0vfxk}|*%**;kNmD= z(z0<2Bm_cn945#^y@luH!lqXr@#QAVhK7ia#N^`OF5rcN-GzvHW8^}HV#PHaT@w|i zET1#iqA;*~(9uwhGpI_wo6QU_3wB%OZ_#SJ>+qiZ*U36UK!mVII3kATzC1r1s-te`URSXEc5(4fk`JVb5X*}V^WmeU&LxaD-swb&K>1*s;T_t*c3qzf-+17V# z8su&ezxtj)Y477EpB6tue97#+tckUgygM6#hP6PXYSSa;u8;l>y({%aUn9f+btW^Y8(EJ;Q~T8~I`;;IL~;>9w{k6X{Dz02My_iW7eWW_EB9cb zuX0DLXA3&!l&yo=bBYQ!u;!ZrwQk)G&<{6W$@^4bvE{|#W2xs3-HO%MG71^xn2 zW+I*xn`}R|X$WOsT@?0W;CJ&i-a&D9JS?%1&jUE)2lea9J-$^n+pni&cvu>7a}^)F zDQ}^B<1J~O`gIN4cnc`}{k3Du=fYPc$KIBUkKMrdp1O(4>27A)?fIay7o#OV_=Xgr zK~!d;VZGpLO=z-kmAtXp@DY%&R~}&25ip065j!aRsO%kb0&CagJKgb>x$GTnJWcth*TKyCVDzO0C^TIG~&WOE8$yffP8)A8+>a zp5*LU91bCNGX?0ss+|g*@u0{)0W|EHIL$#_D=$F1Y}|&M)K%?24Gi!f<^f|D;H$uW znNziml7VpPjT1i{v6Bu659GiAh`h2T&pRZ34aL2YlW6UGwhS-u^Q+$n zzmYjGVI<@^MSoaM@u2iPKf_#VlqvQ07OUna)@m5lM_8to7`!6SeQWxG_3-tu#oSZ( z_Smrm6|7{>wUYw&%&h0yz-XFls=h(aN|&>gB$;Bkb##imUq$v9iXkclc`X#buot6d zMxpt->jao>#$KN=Gr3B8E+TzocWNU2**R;u>8b_RM~q}WEptKN*!C=gj5%_qpY>N6 zUgTQPL&DP94P)+Z8JYpf0H#HyXHgTCVK$BiTe=3JqW!|y33RDNYLQyw?AL(mUnY(b zEXOeiZ)xOV9dg9~;SQ2Sos$n!1HB-m61-ry6u=3m@yCUxS+ZOM#=lpKx{=2hkvSQi z@<49Db^l?67N}w6{Oo`M;l;nTvjoUv9(hC9pU7e0K zrOQ5K@6OXzW_ODM3?Dx!(q;C_0C(!A420tI6IiA3-vJZt#MJn}{HCT|G<@wTQ^! 
zG!DdMJEkjj&C-KeT>r0JbEcg~-t*lB-+lsjE4UxhqC z_gu<&KPQRe9~g#9yefO(K6LU@c``QrM+H5v(>eZC74j-eX#WBc@(UynG|zH6J5O=o z(coF2{sX3R|A1kUPrgMqTr%Ek2dH6kVq&>69s%*g6O9JmrH)cgKNqIBD*&(sKG|?Z zjWh{jMRNU=nbNR14INn!?I-~(WH>($O>h}N%!hx0ioqHAXI<6&yq?O(`7n8|b;_S- z;&|rT?cj5A5ABO($j7poI#{TUH0e0*6(0b#EdGF6lBkudnfz?jC!}jBo%aVrhT2NV z=Ch(X5#~~-_faIkCCc0=M38$p?fBo)hNJT!cKmM%z*Bz&e!} z>D|G*7n5B_2^w*UtjloP6=~+gdxtdvfo=81l~I!D`-Uzb_jgp2jj&0H$rH}Ey~Zw# zN;7O2Bgl14?_(rC>6(bglsn!(&UvUtm?I9Orf)2ADfr2PcMnH!HEmp~u_tnwA58j4 zfsLN-mC#LAiFd+phZw5TPKu#UdfCEJql433Xnb7+&k{vrQm~{GH3BEZvKU+L2h(ju zn&*ZL=1)JT`6gfXwL=b!7T%xw^3@-#eF2zG9MjMJkS$i>gu6syunl$T>brF^b5~rk z-YN?=d7^nS$M9XIOHJ?D(-?$cL+uB5r@Tk)vya!F=oCx0(i@ruS&PAv(2vPKg^(P` z(A3w#?`KHT869usJQd#!Rrmt?G;~a!!@i`%j+4Ucn438;)yvWDt7n(`2~Az<+#FHUG`Z>cK{Z|6rQ(r^uxUDQ*L3b z-E)vcS}L+^bZMh?kQ!w?$&C_N^p`f9X7eygAjx?|)PiT75sd*@WD}sfz~#yz0k^km znphDj@*Cg;bzV|2#Um00%-Fjv*fg!x-W~Mh1tRqbub6PaQ{`IE zJ(Z1hHijBjBhG~VfMff~989+TU?#lNM(O12Oj>YPfI&WpZJn*0Vy=VWWr0dObL1NW z{KRjjQ=e2$97a@0D1YL(%XTli5jzHm^^4a(WwB=MUQ9VUe{E2ut9BQd*3tQGg6J+) zZb9d6#~0ybz^5%N7=$m{k0B?vuiHOWOgcga4UrItNo@q36+A4ntPCnEGk*ldd(Z{K z7|fx?k`%9cEXIs9aUx@YNghq4z>@b2 zF`?O$9Wpm8)6YHzw=t-{2Ts%m;i(K58fcI0R?|v4jJIBCLd539fZI|D%3<8E8+FTt zA^p}Wag9lP7p~O!V6~smO4LmDN>{sQ>_&TXb)Ap+2m5fTqhU>t(LUQVv5g27bSX+5 zVb+{R1l-*4hze?&3DRR0eu3n0Cz>D##@(e$UT+;0A@H4>tLBJP_3WLlK9XbD8vpX8 z@z{sC?c|{EvFR7jk*G?8)?o00%AqA9k?nj^uXoZtb!Bk#sDtsqP-c2V`!UYjf`|Em zomLG|l2Q8#7TU#-7O)?+liQF(8hzweUCU9PMfN-=is4+M09DXyCrVuwl%!l3*Ogw? z5QU;RJ#^^ZY9b83ozJ+Xd1Va~`UZUDNOd$b(L% z8EZIzJ%SR)n9aQ@#c1nA!lfIT%l6fPf7a+tw9P$}7)YO!I;Jjw&f-Loin8Q6WGcD8 zJvX?Lk5Ol2xex~mY!rp|q7YK9bpGMz#)@`heqUb&3qZLzINNvn48dPyf-Ltb4JZEc zglKyIHf)V9fi4I$^!w}Ds#t*N^%O8g;~aSWoUzrkOW;!eV6r_O$Ge7DL~(3lTX<8A zAF?TH>_InF$6!{7n@0#Cned&c&+CV4kCoka?MeZH6D`&jWoC+3^4ngAZX8S$W1T;8 zzC%J?M}QBkQcTq8-em@y;uDKtX*8p+TPSlB0BR$4~5;g%OP5t_I_ zmHE3-{2#>~(}0q=S3x`j+&dCoEkuq%%fS+rRA0u}1t{-g3N%m$REwuBy^QcQ7XD4Ez9)u8xcy(1n+$TvC1$pR-N=j}r1X#b0J9i8-FVqhzm!H7HOR>oc%%*f~ZBw$|2&7<)P1d&0zLCy?-}mELrcEc@j&T(J z=}xWxc2FL$Cn7z-ZI_bi^p&bjRGU|%3Bko;qhl5369uTlca<) zw+Y$48`J5LbqI+F)w%YF^$XGwIUU~-oCx`D5-X50x+0kjWd>Xy$5HYXl{WW*4kToB zj`4yDHVY8Q^Q8$!h_84VzfUgZ#zpNqbD^*KKSMzQ%P4NK1L1JU)*JLQI2po7ralmT zwmDe$2Qbdik5Y4%ku+W!p1Lwj$hnO5m&t|D%T!MfcI|@EjZ0`}@YFZ>_UbRtE8to~ zi@}{_&L3XXUcLNqUhVryTR)EWAYFu^il~YqFG2bM_EcTJ>oqYJzuim|AkTg05*{!% zWA{7!MM_JT9>W>{#(Q!Kdfe9&vdQQJfMp}08apLM_lst_0%kZRe6f7S4c4ZKtG|C$ z>KL-MI!AMrM@KNZ0^BGlamL^++&BSk<~5AxJgyd$6qns6T&2VtI=ozUUb%E*Hn|jg zZ8lx;cI^Np4(VtDy&*fcm%#3?uKbH#s?;Cr+h12IU@g3_b%Kgjs;XgO${=Tk$ zZPcm4o`Y?j_{Qq*8~$&+R`GYR&%kk2Ccj|kNwFh$)dDgqG_mx^prW}m_A~a3yeYaJ z4@Z?of#8N&!T22O*z3RsJp{0-$U~l@BN0Si75|sM?)sSP4vp;PR};Wki% zVqW8A8f9N+Rn@n0uF<`m8u;NS8P#0Cjr=!{*Xxmn8K_5;m6h9*ccX!mzJEt){iI+G zWYr@$qo=x~`lmhlziw$?4==Sw#e3}A?3hCZsW0&gjdPXe3j3eZw-2uCqwznxucJqd zD#%hBgbIIwx+{&>bEd5+;fgYg`Tf9_1GsDUEOE1@P=I^4ip@_~i<5gx|#h14SxUD9?KpDL?iNf(tnpmt z!C_HTqi7XkfEDwT-tHGDi?q%a_Cv~?bm$^k!{@ob9Bh^Nbn+J{E^-pG)5!?P7cNk) zmtUkJAa}tv6@$BsFyaf>-*9l>LomrC*k*O*%y>&hd>q1?IGYB_C2$8IZxnsk{6nH( zKx-bdeh9ZGAB?*$1;#D?11}H&J zx9vsJSSKFd1DTmW`4V`w_Ux|LF|2PP3@tP6_q3#^aIRZ}^QVo4w9eFuC_}>4jdmZD zgOw4(hqDs6jQ*it5~oM8d*6~Wo{yK%czdimi%M5YMZcYSx5)#0vBiz~RMsH!1j+K2 zq8EdAXM8A-CHWkE$#$DVrbkR&o&oN0ip&LuAXliV@sk^Do>RJ-{w@4CILd;#Atq?| zDc^Bc{0A?wV*-($(p@tpX0zIEkGV^KbW7%_ndSR|9Q|2T$33p}Cgnz$1i;(;Ik(&S z5@Qvrm9%hKSK`3KAwR1D0~G$N6!dKcgv@fz&5PaLx`cV1O-r&?5xGN&V!z&i=SD)D z1z%}L^JZjH#N?FQmlE}-!flQUcBTSEkY8IZM39+f-f=CyZd)|mTkp-x9z?J+m^;FP zE%Otx54pN&sh4XbzUP&+pB?8tc%NdOo-9IR^vt%%jZ^rtGmoB=^0G6wL3sl~}_ zwCzi7FH*P;0m(c~4@t)q6y+9_5~t=lUvN8U+wg1WqaDU*W`SP#%i?B8c=zn+?~+^E 
zdHEtZ4cw%tT3dnGq)))(gwauR3KQzmuO=}6eh+h4^Lfl6LUXS9rHBpmA$GihV$_SK zcuMDd7gl^rb&bUXaYMU0-p;3=@r%TI=y83|1y&%t=?7kmsa1W&jtKK@-3`h*z|_ly zHc^)v^?aO_Am5ZeuT2-5B*#W0PKL>zG&CPeQ#&^QNXt>NmR2K{K$~v_k2#{%}W%9_Ux__NQT>?&-pDgzpnQS;%ce^d{XIz`yes$Z(b$lu(4u6P!sQWvmFh4 zmXE~oFD!g0?#t#0)XIA2WWWl>re}~gUGp*rLA`U{+U+xuQLHw5Gf!0Uqxo8;lZ?B_`LLHwl!$oT8XD^TQ_pR&L-kOZJbtJdl z9==jV5nnZ-3H_`LDuVZ%G*v^5jDs@6DMhL5q^nO)KNXm4v)59JaJbWk2OAUkY*a%? zI)y%s4|p;4Mys@-xV8r0FH}!<@w|&O`vqzhDejR@!k{hXqY2i*TzSnxau{x$s*g2w z%wc0VKDhG*iu!;g1Ky+-(tm}-B?qM)B+278MJUjBuXE*y@hct3_3c7qK$f?57eL}1 z_(W8(OnE}b5C-E*HL;9e1bP+2i_+oy4kE*$@P+> zQPr$1LF5a`sTw%d5@OlZ*;F}LS`Nx;=Fgy(OsfnhH^HA-F*-#Hw=&7?uwa7b5B$Df zJ__iC6Ux|Y6Pa&WFpnzLQP&SbuWmfLIFGDdcUs5#G)I};PSucF` zA&K+qHAumceAZB4{oq8bNVguP+deGDzsHG-MUSp(R%H}11*Bo$t|BCDE+vtj>*$%` zeMu$`Zo+26M<4vcE4pgGdr1Maa>qbTF- z79Ts7r!BoPk>SePtzWHQavkw3CAHJ;<2w#y^= zxoWrb#uZ%m5_VgGL_5}06%up4=Mmq)zaII0Gux(L#!FWPX`y4|p+L(;7MOfF@H8hX zy&~JT>(UsQ+=11}9P@$e!T69>g<;6fHL|>QeAs3(DmmEjFth9GnCnbtNe;K!l!<UvhfiE`{ehAQdkiRxibvr;f5Vu@tu|pGdIZ=9(jqtxSt@=Yx@Gkmz@7%4)c=}AxK7H+pr|C^Hr*8Q3siX;OnldAVgqq0)!}Ud z4n5)e2@&qq8rCCH6!*X*(8#`#bU#^(=*H_Eo9_5(R-n}~CPu#fjQhHBeP{)^tMiww zyX4Bf@fLC<2_vq{nGe!OM$N#GEW5*Ish*>fv^#ODo{Nq z9G~vGDpx}XPJGi)Zhy^T&-E&oo+CtRYbDo`Hwzl9^i&k*&3yoTY8)|6L|441Rs{ht zq2Qm(e>@!uD8k)buq5c)zcxD0sGm!!7oT^aPZ2)sEc6xO z7*npMBkx>vbNqjSmeY|B>~^J!e~2Bd$1Uw*P9iC|@7Ko@pQb(TjXPGnF#hxeuthY4 zJ9EWqNI&>eHE3V;nI-D-oNRJKiek8jp~K|z3IOs2kXkG{Si4}j@588Eh?#$6=%OaivP@g4s*onJk>;>Uwo^7oTL&kD-lONm9IK4a zY0%Wgr+w&~qlhN|J6&itBwo=RW4U+a-1dwc-?`Dn(2Z0b-##3wOL;EN`0=v$!&emg z%^mz{L{1-D|GOO#LwrcC$bBG0@EZAXyiXYt3>nr*6w7w^9g@}qvg+c%RY@p{s3nnB z!hp*bMuI9Y8?Gt_1eSXj^B&qoA9)w@Njw$W9>b@B@?dnJD9g;GkTj zyL<@pk&*l0eVG}D4G*@>ci%?#(fIk74s#W}$cBM2hT$z?R^rR^Taz14C~7$c*OV8c zfJd9`RUOq@05hk*pg-Hi49z3plFSI0Q*#So;qre4{Kym^NecBVcn~#g#)IU{-z{`( z$wzpkF4E_YJeQR@+hc`U9@wEMwNE35%a#F+$5{wU)s~8yf+?+cE#&2w92m{USnK#X zh2q0UbOq=>xb4Es{?fv(%i=8jwzoWWRVi56du3d(IF`Z8__J$7q9TSZR5PNY8F!5- zL8Zh03AmpaMG~-l4sqbQW7@^_K}?d(?J$F_&p{8}tz`0Q4oiA;5__c(w&h#DK`X$) z1zfkBs`%_ML1Yg?;t`VXptc>z+dw_KHHCESG^+`x6CB7Ud6ZwN&SZd%NTgnVBbpwy zPz4Nm$aJ>+2UBEeS!w*FRYdOlTN?{M5_Z;PzwOfBsxWD3et<`fibEu!D8F;>v>lu+ zd4004pO!8x_!)C0fK9>+zPuPm!Vpf&@CdXp@)yVm2{1{ostN6k_femB-_Q5+TOPm1egE!1%r&lI=DN;Y=UnIYJkRCG=)?j(TZv?(#SX??_(gE0 z)Cn?okjV&HO?O0M7`?5d9vLsHb;eEfVuBr742}g^CGaGtOQ^KP7*J2&SWy2(&`^|$ zHObE83_KnmbuuP*ZTC>ww!B5FopB*^y zLEe0ZFIhi*WQ=ySEWn5C&tSyr!^I($c)F>{fFssUJ+O3Qp>Bnko55U~+1!GEhR|V5 zn!ntP8P0w>uzvCQV>P%(@XKNc`Zded+T%7%njk1si7{leJNtRR5m#0xd2)EZm4@^V zuR6cWfjCL9^=w>hCJCPZSzCybPqtr}wxarj0yUY2jBffGAC(q0#N1wpXM5qJ_!`=s z`%(BycxL=4S4C#yAiW>aJX4Xa3(b}BZn?=zZ37dFx@5gU+$smhW+-nwrNTnDfMdl@ ztQBqL=qARzzQ3dVtI>~C^4oLCb;<7-X%a6sX7)2O?jaixlbUF6TsEsHZDfY%bDx?E zo&)ZUiynn2^Ap%5RTjhjaWT@*K2WEp9e55QP5 z^m~3&Q|AbB21tef+1tmSJ7jt_Z6v09L)9x0%!J%6OCa(ILZvFJP#aI(5dS)5OA92i_YoERi`(<> z;OGC%=T0m+PFd;3c6}DrU-jhj#Pt}c$pLJ1l_C&C4w0N5`Cq8^KcD9*7XqC*2@g0N zG78})j@2Va##TU~ms`p=6aWoOe7YQQidd6e6l0)14SQqzyXT)*{(st10&a_dp12-2 z0rhRG9?H$T?V3|Hmnh z<$v9t|GbvRp?*;WH+PUlsXOpSkV0!eKQzh?>46iKt$u?ABH%{dWkK|7L-;BkDLyt8 zd3DKNeDSgDF)6sKTn@qy5^+>KgX}*VKEmW*4CU}gz+N7@`46XL@$=ii3N(?76)Jh? 
z6*oQP?)>1Ys~Cx=FTH;uTb@)>dQ@@^|0j4`5^_5g-vF_#AO{d0=kbF-0AAbf)#4?q zOku<(En4jeM(8IGACl?YQDs1{=P}y4nzWu76(W~H`($8(};X8xjbT|2^5AtoB6ku+y(Fy- zVC}xw1{dBZ3eWl*({{hCzkB^B#jz;bia*F)CxO{a6}lj3jR&p@R#3^O$Oo|d4bJ&RbxqnF#%bkehrbB+#h`SrZrTu}!z6i!tX ze6iaL-y4ZjXx5v!@yWZqT0NgsVD~hq*N)6bqp<4U^IYfFS%2(G8x$H`+WxN1-P!Vt z_#|~jTlCxc-jZPRt6awAUmbeqK9=RN-3S_>qVm>@E4}kxgn`qB8(p}3bpLG4Bi*F+ zbc;h$c84e(I|m_Zm8a1qePj=VYGW5BvW1qSt~X!L;mmw{^XrQ(D=AvpRU~;m>js1= z$R`)8(zhq>{x&J&#a_5XoyV1Kt8n{eEX+ZgPDUb|CY~mbA&5&KW<$0>{wTc5_%*}nU9ko>sJ8CDc_WKdPWfT5WXr~2k1gEZN_j8y-L~^56o{I5SR7Son(q$dj6%;L0vdE-pRX1m~2PA!7$j#nOAkgfE^|i z!W=KZp?8F7V`$`UXN-g;FU(v}k@usWor%mFt-T9hYK;g19D- ztaWfyO3x^DmBegayCo=cw;Vl+;X}C_IJx- zQNUoC6|3dYU(e+KMc}oZ$h4ZwNBz#*S{F50$-D+|whk*_dr2Agxxtqiqvgcy4D&uMLG z^R>?kZy7MN$S+3Qzdozb<8rHoFQL1v|7B6Hh0{?_u&};{9{V}TQJLJCarV0rlg4eW zP9>F|-TDb~^4XHFjDAYW%hXlpk?FcxRp27dky;&yOVk+IQGewSsi)1m%NDB{@WL)P zUfuG}rf}DtPgQ)CN@f+Gy`*^|%2x_yrPeH7q>EM5D63q@GxinMH!I}GrAoQHi)><@ zYIC;G&OE$>71dKBnRj4eiivyoi-008^aJ|FmtFbVR3Bem)Iy7!+{HJ1Q(UMVJP^}p zVrw$7fmjh_`8rx~vaP0xK-lk~w9_DRS~QNNxJM-h&v?z66!TO9=^)4%)piV3LJb<1 zlbA~seh1P93+dH~w>8b5U&djG;7?4i_$o9$A6?HiI`HQ+6FTIXPdrnMi#0ReX%*oP ziRo@AUh>InO-Az_IeAd?V-eX8KI`% zxqrB$omfAio-8lFo*SEue11Dy`>hW3%h*^&yw8CznEVkf2sS9|FoqjdkI1z5GQGNB zOr~`ddnG!yB=x7u67GJ0jAvc%#xuGo3+w?d9XXrDMTM8Uq^LZ26sj*&8%>CMUn^=rkZ_G;l<22%kEqj~q!WNcU8 z7oBb279PI#m;M?_sLipio>d?#JRZ6Nr&Hm;*v@N4iYr0pix}!ZzmA(D31yiKQ)?wh zEMGu3;bLmdPLwg(!Ep6YFBiV)FBT^he|nZALe;@zm?J*MK6(&bMan%J043bLFxU13 zCV+9?`8F-~F$~|jO3>Fm>uG`4LyWM zu35Xm#<@O8JOd5_5)9hG&S5{BQeV7^(wEcNtB6^8VifY>!)0%4x)i87nKt~IJi?v! z;zjSyLa=Npt-(f+9xU8^&tat0z~pT4m0T+A7EK)*8mDI&QE~qJg+gbQc;oBmdWhi$UMkZN&dUks$?wfJ?bK5MxmP0XZC&BIINi!K$J06(9K|# zb!9f=x3FGOY~HhzO;Z|eMAC#wRlZK8t&eQpcjIr({trvZ#cM7%Z{Jh$2aiVLROU2lz>_}JXC z`Bs^s+wjK2pDG-V5*rVZ-$$2Wmp~_{?1tSu&LGDt)j>=H?m{Z1GAED}v2E)&OXQ9V zAwM9U$XJ7kF1&6jP+nN_o$^i=Yn$=w@MJi8M9n*l<2OPb7N*iA4^w1O|r>>HfW0>phyxdm1(Aw*jO(}VxsUtzLV+5q3Pny@_ zt17f%4#v6L7&AK-8*^UwOrINQb`?If1fHDCbN+`N?a|TKUj+T+)_nnq(@5Y)is6^z zlo(OocKb!Je+|0Nt0CH2vMW7NA07t`%Az9L)p*NTc;pNC6=Yb*xAX*Moe03kT>jgi z{?SwmH`r$`#aW1-5e#?G2!u!(@*_&tnQVoPv7i7+VzX<2aL>WUOrGn&fA8UOm)R)FFv5u8pw$bja zVWe-6D1IFX1-^Cz)r6+g}H|J<}zQD_vjI@K~A+%;<~$>qIA` zVSa8|vC9+cDZ)vLis?n7RAFT6WNYT6I+!9n`PQyAo(U|Pr_Bc2FS}ng?%V2l9++`U z@BKXgrNHvWB`t#@=)`ct@Y`4?fW!#@mly%?KJ;2c81~;AFQ)4J)-4N1mZ-eD6)p5S)$N?3gt; zGu5={47a(1Q>BZ%P_;O9vt->?Vk!6>bZEFs@Ue|x0Z0_#rNFj;+ekGgY()bXFCnwx zf77XSBfvE&3(m@?JTeGyn^2Hhoo{xe08a8HK+(X zKdMdXn`A}?H9g(=)Gys+;m|fWH;Gqc52L+qT>vj9#WG4S7&LYj8PH(OQ{A7z&gqq9 zR#YpNe0cPOdeB>(6m|2C62YqI-gK0nV09{_N+Jxxg=c{E;m=z$2MLez4wslXPc^`O zeVq8Hj(pEmb+dVR7<7(Zyd5dKIgC{aV#5sY^34jyY$?G}P0&Q0VX=BB(Q?=_y_%yN zFXU3fX9bhKxumBAaqDepX9SE{u3ty$K{kP~0jf?L#PE$<|DfuufD_hjUg zO!9B7#xZdf1=&&Gx-SRKEY)?b3!`9w`Ih;heq$ z5r?`Qu(HbnBpu*!=SVmNQV`@=f!ErP)|l3i(--p%x&JAW|37XC^GgJP`h{ZoG-l5D z9xKu72Mjo1ix)a|nukZZeA<2gK7>vQ<*^T(GNb-YY9b807G%K55q2^dALK-le@sGY z2-g}L3INSjd%{9iZ>I?D$Xr`Ac>)(w*GRcV(bovN`=$oS3sL=Rt$hz6KQp z(ccGBkL*_r_xdD-a?bwmO^Z(Jj(>hYZ1v)Aq7$d0@t_)k@iqfNMZ`R?+|YL1y`2agb&D_n^UD! 
z=T-ndkAH{ZKMXEnW#MF=_pFr9l7P>}yMJTF0Y454aIF~<6=bpZ12AbeVqjCo5?q#J zj$@A1RN3jKQLX@B0l%JJqO1;ohPc!75$*y;dujUCwJH^a&e`!xv;%nscFfoJ3sm8v zpwh|ticW)|$$$t{v)h_bs5$v?2g{I#*LQlok8cN408Kq>0+}fO?qHIuUKx*h2sgAf zyt^5p3Z!C+X{mCipP1KauY(AK43!E^&s+oM?%MDnR(vDU&ND6~gBB#%Ja~nFrh|v@ zFh8$P#y!Dz6hLPs!Xrz;9ewW%^sGWgDb=xxLG>>J@Ha?T_apf&?|%T3j~7m2gNra& zT;&+;#8t2=gkXm63lwx^Hv5c>9kG0g^4z%DGcRd;nS1m@Mt-t#BMfys+6Pwi=-OIc zxxxHk=9qr8mvV?p=MX4?-G z6trgBdax<{#0*z)JrS0BSk~{o^_G87X8AXdyZy_z#-2>)&T5pr5o$A^-eYr)rv|bJselo?Q`+6kj60?28=i|kufl{|o5NvjYdON+yeT~P ztdR3D+qW*0b`^1~&WOq-zZ0|Fs%Q4nQ&YM;&(7sM-#gyxvZ>a^Lsk z2KOzJb?MLS#_sp!7pz3@>NkCw4TIt`va>?_uEi=1Oj~VB`L~L0(2kp+Z2*`-KEGcX zbwQ7{zpGm=1=f z-`puI{7jmd=BJtYOlQj->^%>=IPsP1p4&t;UQQI};{zVIS;P~Z6>UvVb0}mUvk+G3 z7E@19`$Gc0=BauX{LXVvxOmQwduZZ>Aa`NS%uO&#MoiSzQ0h(}#XO7$tAlcH;yAO> zJS{QI{jsVlX6(|ZTvt(%@*~ek5z3Dyp&2em`#1z-_T4l;CQo@L9L8Ln>J}MaqLlEu zl+7iQgg?rhS(%n`{ArWIYx5H35(6Z}8G#W(M!;B_yypy_zHBSm_++CrCUA3-$*^&p z^9t!8BZg#GbKW5aCW*;j083ftD$o0SY<96SXrWv0zJ(szs5RR5h(3W!&6_9fAIsE>U~9M^+IhM&d- zewgBCGOXBK6e6Sa^4qc)%oBE^E6V$hmTsW&CHzuCcKc%kwhmXcN=pIjfhq42-&c!uq7CFA@p5t5u_K}`?}e)Xd+z&J^)dwfR9sJ zBwhOQv{Vc^?g+-r3s(8>V4c&l$QzapmFn2x{YEJt_r z`HRo@+5No*F{|%hmq<9QmwTCl)MEJ?-c@_Tvku6H+8SO@0~ z%QVjRBu_bUQtm!2zhsVbtcriGs1#pq6zeaK&~zUM1~3oh;x4(ewCzAJQN#8%!5oaH zvHpFn!^7`;X|-Omi*S?pt(o803k^}lk~q7-I7e!)`k}wn<$m1acelHjPs)Biwz}9a zU6L^Hc(g9ENT=ApP-i94nEpjYg2co&DP?ppDaJ%^Xr&awgJ%@Y5lMMSXlj>i=1bRB z>BNxd%NfD5*TJ{2Ord$gTdEe!uUpx7xmA%CRj}VJXF5X8`XKrcuxT?qQ(9q3RbTY$ z^4|q(-uzOj3fnJ}qWwN1^V6Bes_l*CaDU<*1_bY*g!YA$xxNFIHpml?iK?c!$YbPd zxBcLyHf!0oJ)666skQ7rYZ(XJ9!+_VhB{EgDnj@N{mnzUJEXQN94=@t6jQ->F`hf- zHX*uNV@kgWs;r>Ju=Yjh@89jwrg~{wpf$aM$Ga0u0xT* zGf$T-kV{@af^^_wbIMfm?C?@pV^W04E*w`y_;gb^=$ZG=PH1&e!9Zg2z){)94%Oy> z=%^JR4nDmh#+3QiY1JRP8svDhpO{jjyo=oi21snLH+8GDg=tK0%VWI(#-3T>G*y*5 z;*!<{(}EbfIj(a@pIXjOPARmHnVo~_TFdyAqpfz`QaJZomq+|QdPHpxcqpHNw%N1u zy~z9U!tt10;@sT2*aQ2Z)GJ>Zlu4)vkPIbRjOXIsD zdu-)TMn@f*0dwU}FJr4UE(SxthlU>Y2g#LX*>?dY9?pCwk>#ZJWKdtNRa&E`-Qc^i zhZ@n@_sQd^0RX4x-H>}EBg9iaWcuZ8qZmo$i%lrq%u=gD>Jjlitcxz9YOgL zhdG%mX^DFR zx#zXHi|vUhMsGK#VTPrGaWsgopb5??&HeoFC@5aa<7&9oN9M=|1vI`AYljua^e)VV zTj1&|=R^;U&KDigY~OdSna%zbrU2BMu#TB3WGGC^X{4TI*K#8l6D2>aoz@j5Z23^H z@QeBV+moTO>i*2pd@uSrGq~t8HgUYfFP(<1CLC9yb=e}Ji|4$+6bnlAt}Gg@>=M@I zI;ysB;#=yO;pE-wL{MSkC9F=w8~1DXiP-67H>1uy3yOIc@Pcr9s)~WocHn zM~U|nQtqoA2p~T$TPKAM?RWc0Y!RW@zFBjsa9Lw?n?J(#?5xH8(8UG)ALB{9id8S2 z^;?9RQEXpYDbiWleRV0rG&+$M8J7sAV+Z}U^2B>hA;F}10f;u*@Omi>?e3YM4fTSR zJXt>Hrjos>Cj0G9L@nRYziIuWx+7=b$c#T-eJM&xG9u-m1WD?eD!xO(yJB` zzMt)K3Kw^0Vh@hhXSrC$F&`RzC_SjYBNfkf5U@j+2|b#qkD-Vlq1|3AZ|XpoX!)Iv zkXbK+jdjs=%pK*zCody40i1;Lepq9k56fS>nDvX``IGR3Ft=>T(>d!e@UuNTPw`xQ zb)S9_NLz;;pDl7PoN|(TyyB(DPakLfI@}s)IL~JKHd)hlMB9)1Vf6y71felE#S|KU z+=bozS*@5BD_|6iGz|!bV4Rh94c03$Nsl4=+=;mBRoSU{If(P)AG>W($bKtic^H?V zLp#1`b)pAn&8%by$6s(m@T@IybJv}LZ)EZ!d#ZCnyvyL{{3Hv5 zdX>tt8AIX?U#I@%IhhDs}3bc0|;=1p$>MZ>JVec$h z*o%@E>GV*W4HK?JL$-dRBJm8!EwcqUjYZ?)T0luJx+jC+8!pZ0dy66bvBJ#++C$NW zsBQiA<s^GTa1^MOG9D|QO392m(5f}kfCo?yvQ#;p&vKUnIP%y7%| ziaxk}=j;L0|DyasH0uDxc1w#hTm^h7K52hJgL368Z4yW9{d1Y#vF7w2fAWnl( z`th~IKHO$@qDPKLsR^ND?v&7!5TVN+pwe9YMIcZD7*AuPHopjReJhV!s-cT3Sf(2* z=>G5ChR{-_5Up0|NdS_H0Rzz%Z6p$9ZBA0n_JF3wyMh2;g3>Zx@$no zP}IMW;8hwKM(vMdAjB2Gx@Ib@Baluz5vdN%2&&|&FE;34{kF#|W%bcSm&B$D+Z`t{ z2}!f;q5JA^zD$$>?F5*=+V8|MUAOvr86(}YtSonnYAlwc8z1*rQ}v$rPtN{|*DH)T z&oj(tP2&BDp)w2hwij(2v!lYU{ZGz-V838p$8AtMi%tEiQ;t*NN9ydi75x?c_ZXcm zo_PX9 zzTf}uv;OrOv9h{U=5(S{*1#oKhOPH8@57FO!OEl_n;zl!r~Ue;LO>e3D^ZUJwKRli z>@EWMY5Nq*^2q)Veo8(F)By+``KC(LsrM=v4F14BgqO>V8A#-k4Os)tYKmQ2W>(7Yv^CF)?ZNm 
zpXG$pt>4;B;0Kp0OsUQQVnR+4Ne-v6$6V?R#(z0mO(3!1AUBH?p=05=rr-WQgRhc) z4!lwG1RoG?2-=YTba(+k{O_TRaAP1Pl}30R*hf8|&Mh#V&jbEd)C3jAXQ;)r$O2fy ziaC?%OvcRALpnbynlkg|a|p$gT$}M58m7c8+)#C8F#4nN`;BEyc>QJunXv;xkcdvK zbu}!RjCSb`p5o@rZXyk2osRaD69h>s2?icxBtk3b*Irkh&H={FTG6apAGFS3PF|9@ z<4|4kbumG{o3(|*Iq>~iTgXY2Slkl>Q+;a9>VUy2;Z|z#oK@>MoGyqcrotsb7Q;b6 zb)&iUF^?!_FFH;bfLc#s9hnQ|fxD!{E$y6`(!*N7TLLhO@-))n^EjaG^r`)0d+bYJR z?YSndOE!U_(Ct#UO3LRxE>C}Wd$cU}?n?W}8ZBOZtGx+fJw%}@3<9?a_-u;l)yFCG^af12IM2xeBL<`*FSOgw$Y3o={jHnCIthuKXxKNYCbwGBl>AIw@4 zZQxaQ8LPsB6xj3%tT_qp)pAeGNG{};*f<`$j#L&{I6n4#^`$m}_qpc9AMIuY0WV#S zRdFG3+YPRWR1#zy*1&PaYB)RAu#|=FjACAzKj!#_oCcF*mA znz@nbi{+KF`Xb)y>ep?|3CWA%S)Y7b{SXMglU!Q@c8dEvM4s#!l~OD0yH?8DQwN3L z+AV~#Kgs;77bZR?cwg3#?;y*v5`L|k`ci0E;!nDkpr5-Z=qQv!0~dJ*!Y^eBHuy>7 zDR1n_gc3bXgVx*=f8Q5t zpFfPo%lduZXu&Ifm8Co;OsB;Z=zDmhc9L#iOtEH5vRPAMtBDzf1I=KHqRwFPctW49ykiI|4{Lht0Q zd*{+w6N4ZPW44nFL*~AO8vdn&YN!iQwFZxQ`YOil3HCY(TfMiHML)>4sNbST_8?TmK+)#eGqif7`ZD0A5Hklj2F(){P7flQc=VNpNKn4 z=~B-`pWiGRPPplXfZsfUj=)$kCbJN#YtJ2W@35Fv`A_Ra_Hs0!y8{^LCX1axStYwl zzGe{kf?-jI%t{+k*=V1w;9rw3eV)l*Zn%6w{ z1(jX45foS%)Xb9{VVj3%)q1p*(gPCL?rvV`Yi6`1JhK~xZIa!ksMui@HBbhn&5lbd5o{Nhz}pGz;E8KzigxqsfUt#`e>|~DfQ90C+sBAgl&uyEAk5r zHHY0n;-X3|X44N(f-cb76l}2_Phz9TF4;s}GWu5VqM@P4UdXdTh-E=OU-4;Un^i?N zWarU7;)+zto8qe?s4I!{B?5u|TYN8lG=w&@KJw^aHmE^#{cyNmgX$^r z&iO%aU&YP7P)%B}PzP)4jA~ci6#eOqe*MH|_epI6yeoLAx0KP-Iv?YcHWcHl0 zU98&Pr#)ribMjW{EbV&fNL4Ux z2NDEk08^489G{~jU#j}E^k9%EOq1NA)gLL6!??=D=yL+6_e0|k4Qwuc(Yp*aiEsF< z#<;v!9CmGN9$6@Lt846=u^So+p=(hOynV59W?AbNTFFF8I3(R$Z%TC1vo6+!GTlTn zyiJBxA6DthdQtpqz`bD;qB43{+Kr51l|D_G-Ya%T=eL7J1WA-xv@6S=f55n_4)?t; zeBT&w%l5ExB%#u zW&Cws=4J(xQKivXzroqa@Y>t#pVIi;{T4r_1=6J_9;y)~T9dp%5oc$XC66MoL>nGT zx85c{IOIxxqECTqRwgS5CsDDQ58haXb79Zz%00QcH#yPaV^_yvLO^phMY%iZ7lCYD zcg1@a(?ZgOHEW{i&JCki7u9VyJ;vHiE&F}>rW9E{kLtI$DbwXFjIVuenX3~`=up4) zL$rTqcT zZ^;Sv#L%xS{6yAYQ>_pnl3j9U?dS8^#AIhY&+Si>?MDD@9bJWNhM#!)<;*@Ux0#DO zuZNzY)>*pzz3k}f>#vet(aqJShtNSbOv@z1o9Rl_OLV}*rL zOfspA05`pluqSPr`RuTn_72~lL0-+oRF?m|0<+|zZN^8h^oiK$4cGRp1gR{x$ZzJG z#ZE|&?FWpYRk-VRvZoBcBy|Gi>q9#ae~zM)NHTrxg}wmjVkDL!4y|bL9LAnFj|eXr z+ekI`e?`f9LCl@lTW1jj(%)g9|B9+C;eXR@@pia(`6#*Bt+&0sUHQXc$b)9vLzgh& zmuj94N(s|EVe{ud-*Brb$W9^~3ZB+?nTX9#mg%UHC{Zl(e|Dz?xU4$c6+gH)RbK&S zhfkxj%C0OOkjCLn}C0i%ka-_CV)04wBd$Brp|LNiK` zzVS|latA_Ht8(k@w{b7XdFxh(NG0vSxe|AS(My?!DJCuQmNYjXvazoWAd9)i5DT#} zyZN)Nas2e}hW5YSGC!7e?Ph=Ziu*F5>YPiy;`h8e)U56~II=De=ePc*cg^C;_9WX( z^hu*x9gpJAH!`HhwbrW$_Q#~*Ou%|5pk=XGSCaR(#@vE`%#HL~{!Y%5JL*$dii?poMiDz3YpQ&LRaNvxso6^46%C15U6Ytz9a5 z@Eo?DiUsg-SUne}O%F_|hXQDE0@VmM%3Z#Jk}~XMr#E#Ch{<{fdzx#T7HH`kZR#eS z4(-!?x7gtZ$yaQ36H7S^ftW}c3EWe)+LaN zD~!F^w#d^1E!>|vO z)W5Hy{(GRv`#mGhZn$Fb)$D^mGh>0se-N~OEBriF6Hm2%r{xssToslnZjN*kD+Y8% z7LQn7X=mus;D+p8i>qY}r}Z1(NQQli!Uu|;+`)Z-(dD;EjPJBTs8V>umJ9Q>(A?4U z$CKwLGL23a4L!$F@xpI8~p=KBIqF%58XsG3Hr3!cCDg)y&<}SQk zFeJ!VY5MUyfs7RBVtQHKM}djQZnIxCmucUR^71Y^PfrkG-2;?Gh*3 z$Zk_~oQXNwy)rpF+UU+2WLPc2*7W6W%Xg6ua#nS zL`iCgOrtN?*9*-1xSOy+QfRZI@RrE?kO&CsD*$xPV8oYcAvdEAd^m2&)cTcwMq7##BJ>s9%N@8&FS`qj^3$#i?1JkBeSDvEhqGIA zc#qa+68e@nWcuQMtLVZpE(vrdF?fgf?kMg>=7~Sa&5W=>&L_VJXf#LRtEH3h+2mgY zM^bSZFBL#sRDV1Px-o$LI+AgOMz;br9(?azImENa0oh>x-BS&Y=r4l32eRN>f6mEG zd4hbAl6%a-(g4uoa_hc}41pT#-|@Z&v>4vVW~~K5T})b@CB#NH8Wq=VWw2zWIUk4N zYunPx&e476tirKN5&PUdP=-FUpgK&3>+lEM0B~8N6~?i*f=qFMuy(pWl=9OWtBpFF zm$!yu*A7T1`SCKqt>&|njU~E^;%EX{=Wru|Re?e3BX^h-nX($9w|C+yp8V^HQjp)E zI}rNva-G3^zFBMWK9Qn@O$)xV+^++uxJ%C{c_3{=tY(2p`BQjZAp}Ugn*rwWZJTxT z`#`~2r)7S*nfVkM;)-|(-vM~ zC1}Ten_el#w7o6G+=SaWD=0h;lr)LBRQ!eCNg%)qo8XTTHUSnyDn$WHQD8)N ziWi;!`9F`R1Ka=AAMnhgRqAq 
zjm_z2aaNFC2KR{n!5#YaLmOo(@taOY_?S!Q@cHr$OIs5^#} zH$DT65&rD-ScE$cpIk}L^fS3x`ZS9D4(KiD3nv2bjldmWzs~ovV-=I}iMCCB6)56&u~K(zn(*lq*ucWMA<;|NcHBrWGLbc-9_r~isW7p(=O-|E9si) z18-CS3yH#5+me=D_qNAwpRZ)@uAR^C>}6}YW%?*~^4-G5r>!lZAYBdma4dC4)T(i2un>W7I|0Moc zD7&Wnv_GBs{UcoGpW8=QLn`_YzA7gEn$)4xwy;Epv4rfebB>Jo~{ z)g;@dSN9Jbwr_%qFRzVxAp&1?K#YPmv&xJ|C@#BlatabLK&hhM|D#2L=y zV@A}%?E4q9CxL;5pMU%af4P~a6F*M+R9SS@>wDN|DeV@gJ*D(sxxA+_87q-Nvbc0e zMNXXguHYNu8-=$TDX_EAMS=z(Z}1zQs!vJM{o?00XSn5Yb7oYYi9>`Nch245zIyvp z9x|?V|*^oqrOyLjo-B1aNALLq&vtyIg!Skoo)jr5)s*tIPJ3-;}??0?iwiU zq#hs3Or%L?93dJoUD3|H=S3JE*de&Ku|FztcI~!!x$d zL$__usCd7CE`F-mzgv04U>$a_9azz?`fngOvMkWyXz3m7pFV1 zJHAaa%67-Aa5l`_v{rq$sYOMiEKy&o?zKJtR`9sxIe0|CXf&~;{EP3>Tsxei``gsB zAI7v5$71lBD?z%=MHd)#*aYE?DVz@WcsHm=&&SmoX&pp0+LN^!j3<6L5MwQry?p&5% z&QNi4Hgm0{OG>P0b5BJ$V?WAp)$v{;9OlvvcF9ITh zrajr8hs_4#GvB`KrXK3fxIXUG`S4Kd@ZN&_%>`wbH69+dFr?gJVJ^AaC4auNwZ(2` z*XcMduZ`|p{rD!vSsWF>6+e9iV?CcqI;{k~VysQ@iy);Sx@#z|F61io^}|Qkzp`w? zpN(#vt(n`pdM9Gi*;yc>^vdOS&Paqa#(h3+^H3n}s~PHqe2Z3|?7V|o!<2RhqO-h* zKACCi!q+{>wNAfS#^8xm>4n4=4h{9Y^`BBkps}iI8lvs{^Mz*^piT0(Gj*9>pK!^wW^)Q9}Wm@64xI+5rDcTFN+4a)UQxp5{=MT3ib9 zAr~izioW=)vTMdsS~B*Ilf`X1#?CYAVd{pBCIo22+RPknE8+~+9@T<)9>h_gsRX4f zdR^Nrt&WtZ7b$mD;-rSN=v8o3x4R)ttc_sb;gz&6;Pyynb-0U6RVvk1+6xKkuIMK* z{nAhFdsy?c%~m+wA8W!W)S> z4<5f>3SkR&v>-ve!w!M6M-y&aa(%8XsDpq$$iLuyzeq5=XIG2PanYfljtdHFXVNGgv%xDy>wlunj#VDviL0#T));SHU(QfMWyU#aQ6ZFl!+1O|8ZhI7I+JI)NLlCWVnm+ zhoCcDC_Oj0^Ca+y?tQX@@3?Kc?ix9TFTH@f#rRt;6~wh0YXRmt1^ZmbdY;zBw%a8z zLtyq#inUFNi94oM10<4B#s!;t{(wYC8X-P=5ML9e3<}m1Cgj0)qYZ1-&q^P@hCe-To!Lm5hye%o*XO?UnB)9L#3mc#}{3W*%2#o zcyiza_-AL;@14vD8f0^?*VXex6WPjoI(5!`LqpA>VA`DZr((%D1AV!%J=BKArXl8{7;^P*p>8(^U6;qgzJR&2d?(nzXpzD;dbB> zI&(i(56%JdU<_#^xKQsQ&b@3H*a(-P?ua<#a|KG?x_n(BIa%{TtS!GSt`pxJvSS@- zjk175a7|d9Lm9MjC{lIjSZZg_e2LC1|3M^ME|=ZLtA)I+?xJCC_st@+7dVXEwV(T2 zL_5G6A*brnZyC5^CCbvN?!~n)Eu&~LwEl(mA*5Gv+ez7X+jzh@vt}Mj(7i;Ti&BWy z8k72zF7;7?n658eyQqjf=VU6o_B#3@CjUZfZwUS@CaaV`^X|ax%MEw++Gu~Mj&jN) zPYkqblt_~gMCqt)Xaj17$(<4+E%P-!4kw|kpxmSpB@AIhD19dp#g}2vK^ync zw!tx%;#xfC5Me587=s5_CK0U8){fPx0pm`gcHFV!8CqJscOy}FmyP{EJjL5n)uO?c zP4Q}qw|}Z{Xh~MTP2W-kTZ>#-j2Eq&y0vHzF z$b>(4WNxL1*(gx^so%fK8ko@2Jn8MH5fFBhfbPewV2GcA1t4gtV6Fp#mK@p*CDGI| zwKuI51y3aRKYe}Qsz`B_!n@1b%H}gE`lIed`mB`HVnzSfi}UD zRe~$Fx6K+Y4&Pd2lpZr4!zX~xTlzOKYp${oQA<4}=q!qKivu-jg!zAnd+UIx+U{+1 z=mwE)l?F*cngNs+>5ie45)hD(8cLCFP^6@h5=lY2QM#mCq#K4B&kaw!&-;7d?|kPw z=a2KpuqQTq?z#6JYh7!t>$2Iy9RFTXOaG*@2jw`7GY4JX35`DT%kqmS%z9)@x_fnd zQ46pTNgQw0J%*Nk@?1e4!kPLyVGJE>8a!p(M-;>F(`e4+aJV4QU3NK%Js7p_#|xYhm;z#M_dq8iR3hn3#k*EQ zY`&ScCM0F{F$uIrJd~qe!c$j>z4yB^koS^Q`^l7<1=0$x4!N-|@@y_7>M?4T$x22a zzUu){rEpL+UL=f)REu*ghmOO5+8kgoS4%kX)Bg1k1UW_^p-8O>bg0zz2bwqn?*veo z#hmi3GjOV@)kZPccBJ$wu-m!8&JgfYnTpg>+mnq@qn!*>g-s+wL@^)K=7}h>UH@gz z%Ue>(^#1{_h8EA(4$zQHKN}z@jbg6EA<9O<&P+x}ip9=M+B};ks>X8Vh5b1|{|3F# zf|cCiqw{!A3Kv`OY3sNj@Et9dUA>Ptv5+CU*EyLARPZ@7kLS1-#XZq208&m{Z% z{<)wZ(P{t>e9DYX1^K^R&A(jEe;ng?ZP--}(0{}Oa;IswJo_wr^l5vjb@tTb43&@y z^$LwasS=jKtFJ|I;>F-1RpWehdjp{{M2#D{1R727-!4w{d+h%X|G9#kJb@zfc+M3S zlVZp#P=Fc&?t*rrA{e$$&A_llGc3M;)%NS;)vRL4tQ_CQ zL;&QkB>w^^S2k4XYYv~f6zP5qgbYW2pg*wAFCax#5ek5z zAY@nizO*;kxJ(OqE^j2Xh9gv!(JLeq(zh8tlrG~PgSzOr(DBAUS=KNA>)RkTYooMm zA95bNA(f4_(pk?|s?9|tDi*e3*g#cpNN3x#kRrd*OG7RJ(Lo<0-cgc-y?bePQBawo z?6RXeWpNy=8t&PVuy9F!XtL(Xy~ClGO3!+brSgM)9Q}q149iNq>>D9_Ni(h@7WNFq z@NTwuVA{I^1r4UmjP9cx4QG)&bDaFl(CfR5^5+7D7Hv6lS*Dg}kistD@`B-h09s&j zjPo?>9BXMMZvmMnv$h77krnjP(>bE@wypgdwefLffxZp(GZp6P6?JW0r)McQu`1!3 z>7RfdG-I{PxlqJ{#6V7t1fM;6W&C)=&2j;`jpkPRtr5Y}xY>df+2gp-*z?J^Be)|I z%}B_kpTTTM7<3FBQd?9qvHBS-!namf8HOA&Hma= 
zqEJ{-8<$#-(hFx}plaZ;`wgM7ml^NZ z$?_x;eOcPZ$geLK8$%`8Yxm&c4wL)GTh-2}ubS{qba92zGD@uJ@^FLc>&sOjg&AAZ zCwbxg3E}DS+oyHo1I_zcZ7HXLj>Tebn4-hd)6O2f+(Vw)a#ISug#?UF#|2p1odpu@ z64+>tE+gk!)}M`gOoQxko#Hxt6BO^Cs)^kH8eERbD~#7?H-)6QU>NhZ%a`Z99F0-a zpP(%G>^|G5N=~)1d`~X*aK{n z86H-70H4^8uG65gS3WSf!+$qJhT4Lif zK2Z;r0g66k3Ex&rEi0ziw=(d<=AD9OYhPEBuOAir9M#wMBVV?2elHHk*ED4Et5|&% z@9%G{xuW>3AYi(--B_xFPdoQn>k9qjd8awlbP~&y8c<8V0j6J~UUbMHB*19-vJ$gmFn?pzmt^^lHoFs@vAU*0BK zRO23m*JH-0cqjY?p76JRA9Pc==R=0x!k_E79R;>yi|lfSU3sAs^9mTF<0&aPM^*pNe)n;Yg{mMYQpe-Ga5?yBI2A7QA#oTRe4dxVWGs; z>p0;(>&ht>Q(7Q><+posHBD&r0WsTH=5~kSjX}IM?jBi7P9m=iO7@?;dlD11LjxaA zHWzbS{8+{{TXg4qe5m$CMlq?A*b#*rp}c{It=~}=%iAx?Somye7&7-16iG2c0`2Fi z&Eh1FD7&*W+-UWNovkzdgU7DW)~kYcshgTa1}GW();DD|mWR}(TCP6;bl70`2p^_m zG^-pS{v~JoB-_!pN1gfJN`E*5+UA9}l`4&IHzR@t*4D-jbo1VOSEJa84E1 zq+tIT+kFLTgEH0W6pGA}WB_?=t8Jw?HnY6Dz08Y~EC3L8Mdc{XaMH+)JrmrLq%%}b7JF}(Fhl#Mc_cD8fywoxIFp~VO>Jf1aPja4 z;osn<%p;**74rk+0_`#F7n-V#^>inC*I~*kkn!4B3kfV?^C&RyrVjD=Sl77 zcUvpy_MCUE>`=vS$SdU|FqHRS_E-7>=9}h6@TZOlEjTfJ!4zRIMNM8I-aSHyvGRRb z%=8Y&k~N*8;Hev-3`ti~x-aK+tDuA`cRlA-WXkO=AzHm?^oyJ#%cI*@cF(_&U*u&) zue-Dy(O2|ntMp;712Ka#g~O}0lDHE7DotY0+iT&}07u|`zQYss0An-z=I7JC-nz#q zyR3^P3437+RWVBh9NS_cR@Htp&z|_iiQ8@$SNq9%>h9SSbrCNE55Bme(ed#MjB z6g%W7=Wbocfw$1!leXfF^$)Xme9f!*1=Qs-DndUvrA?IbP$^cwa5I~8Ia951oT;PN zwQ1fIE?qo14l@O64wZ@8!au$qpLaBtp9d=zGg%uOE2Y*wmM&dvT&ip}_BI@QicY$Q zcr8fDuK#1b?E-(FN|YK)T47U!G%KT2n_g9>4Lu}oopT3{VENI%1^X+D#+!h+t>B34 zWc-4}FnzL4mEnk}9{H( zSw9%ARF^3N{q_z#SJ^d$!EK%{KajSHn3E$>r$0F`&>gWRv!i}iR`g#Skl_g5^?Bt( z#`=8HH~YuOdSXkimtg3;+41JMONXxBN#gDbM7$hxw`oZ$hr9RX@DCy{4=%!-n!j zARbguux%(RVOpckkgDs0f-O2W)UQF?xt7D(D+LLef19K^i?N}oEul(m8A?~ytUnk$ zQbtC}{J^v4CMB%Uh^)XVLjbcMlLB{_XEJJm2~OK+wL|O8yWf`N*^tWqxogeGF=R`% zdMD=IR`SUtdoDH-J3CUr(&MR2>}`&oR&-(j4hCs*Zb1=j&=w+uCbZH$;)SxFcPfuj zid`ZEpC=~&r>-_#qR7RMrb7e7=eoGsFJo6S$sOQpuU$I0eB-VhPZ)Hn5%;U{t{c*R zT#~F&MP^X%;8;ag!BzcN+{5oaD`e+TST~R(XI$*R;OL*6YdJpAd}-hq9&3!c${H~K z*3LoHa=Pn`&p!ui%({i^66DTwlI+DM1<%TM} zM+Ndll)7|#L>ZmV(JsnWx4Uo4W zykk>G41JZAb$ZE>+n<_*9X}f~R7~?8_sG+DDTdk@M=+j-?#aJ7Ql<2Gue0ayg(tPZ z&fR6ssQdDC+tS9;vHfjwOJ?YX&>t`r@M_M+kEyPJ3%v5<%w3)!hm3>6I_2h%wvh0= z34z~`x!T5a{?A1qZDwB^F2%4b+aCmdIwbD8H>?&&^peSwFm2iN)XDx!9pk8eI6y-v zeOMg2G~DF3+u6Yo@f>IO176y7Pm`UBfq@Pnqkj?~0ol!N!22A0kz57Fr-gAaGG=!4 zfpgZEi?%^ovXN(h-6pbgc902U@Ea1lx>R~j0X^Qwy|62tP(|Dszes}8!GC}rs*cv` zH&eDadkDB=;+tL-@Y@)?H@NZR4|e0f#vR#iH6U1N8S`bo3`gGLYBRB%)N)G>BA6B7 z+RnV&xtJg z&5K`{aC^`>E-BZeGP}BUANJHnwJf<2zB8f?e;gi!YKY%kL&ZXo;PKI ziHP?agYs&1`r{f)UP#hOr`!QMiVX03$gaxK@U9mxwr=nZi_3#RNDsxy*}5AD&ei<; z_~HhtZSiUp6fEJuYsU)%Cz4`Z4H9lJlCSI9{=Cqi?fO0bPt?I?L+LTceU@~#_jG`a zTaaEwM6nkjR?M02YlaI1u85x$KwC3VlZ3$+QvhOakn=I-M?qTgM@`#J^+z+RMc-RO z*|P^A0Ew^?_yZsU@|;%&q{3l#ca|%fbf7eAj^ta<0VCI&;9?a4;Lr64R3c<&%P{%? 
z2H;;MW7jKr4myA*AhD@6$AXkBsZvDvUdf4tqy9rVS5SBMb$h~}f|Qp!-JyuA=S_Lw zP7k9T{i`r6K5-t-!LY00cq;NYxPZDT`4`iX4(uFV_BSLex74qWD8~l_8Z_UA0aAt9 zuX>#egH zBmsxVAG}BN9h;sWf*%f3g4tO=U4adwVDWi9-nDT|*$67ZHnxz!mJFw?D6*dO- z&cpGrGZGzPf@Q}yKM-fPa4q+3Xjby_ltLdI5Sx4yuQ(!c7gzrhzwDYDSy4PzyeYyb zM7vrqTnDTl4xE~;j+CH8GCB##-Mr_|Kz60Zb#5WEHo7yrTWJmsD0cIssq|r+Y=~cH zWB&AzYF?|ie>&gYer+}-%0F-{iMyIGu&N@@(t>IHllLuzlMhfW!;F}>zkiR>xSuEu z6Z40Otek@a7ESHbuOCQTNtDB5U;|heH_wlqYX&OnqQZPxsx0ek4(!e4Wg2TGALTbO z|3jXqcTwfvHYoqj7#vz^Wy@b8ZER` zfbw?A!$QS+c6P)psXs{I*?B1EivKSUvi%gxnEmjAHDzjyzJ0u{Wg;bK2M$$*=5`-4 z;rul2J6SROdtEu^iuj(P)&lPe6Z6PhKg)Z^sS>>UMoX&Ici`sg`0N4opi` zavv4_ZWUw!g8ul|~dnEe@N;@n&@q6lw zX0L6#uv;ZI6i1%x$l2q@ntHD$$A9mWqY=~|3?#@3ExQ?hpfU_Vrle_Qx1^WdOJrtC zZenk`$x!*TXrnp48A)~fEkO;s7ni=}D~kP6FlTr&+1)uln{MG#@I!%R_zCZ;L&&~l zrRo+M4I1uqHw!;$vTj3+;G)wM%_D`ORK->+^W)OOb%ROlqnEu+bo>ycUQwaPhX^;B z8r0%b$nbsagUF*7pDHNo9^dy>Y8Ks3YTHKVr5`nT5c7IV;h<6JRoJIKku%+AHv5hR z$qyQ2c-eU~mdGe}C{i@Aybm;)&M2PhbJV}O_kC(MV)n2lV^pTNTBgkOzATH16b^K6`r;sA{T%C0pw1 z2@pX2V}(Y&Uc|u*ng|vGQpncUm7-QZ{!D6P2&QiIuY6iLa);yH<=n09 zM=3(+OSfK>j;}_27r9l1B$Ox&-&^$;P^JD=zBayF_WqkAdvnM5q|9n~idBv`lw`M^ zde0@Gq5-z-BdM6)6L#z>7`5yxs9ND167Kr*1O69`zEQ#T^qZ2j>kXKr1gVFQZ{KYZ z!Hq7Wpt_Y_{L+O}PKQE6>#($APfuhxF8l5GWEnEA{d!EJrjT_x&U49m-bh-K3i6eyZbkOg1y$h_hIA{8p zB4+(QKc$u5iqTETMd8_#%K)fdgxHc*koj(2h*cz%Snp+|+>yp0XSP#Bs@{Y$+~ zu1fiW;L^eL9MhELy{*US#4+lfudz}NPPGE_ieKu8y#o6JB-<>#B@xj1wcn+UdaEYK z-@|V+(jnaMsRDjVg=2Fu+EQ?<%7Yf)TU8YF1`_QxOyNdM&u;8~8%SFvyLfpav&&Mx zJJn%O8X#D~nQucv=@y&E4RjJ2MKvY z_$dC$YTxM>{#nU=WhJRmHP0VrhDXNhnV0k7KVN4DQ3E` zP$FLDmVcbKo9yi?cjja6zB_G^C<5*KQeake8ksy=N<+EKz3mAk1yxTX6w{Lg$0Fqv z(tRW|uR{kmz_A>+nLECAdpYL?&ewo9yn0Q7QU_w2pL{@6U#X$YiAr87fM zVYKSfSW$a>@gJ{lpURIlyiRKIIFe*R0|~f+^<2{7A?!Hqke{#Dq{&AtQgS&)lSurh zLi6Qh4a=dzzRRPE=iy^C>7V9y71AVc*)|?c9smuAV)%oEi+y zoL2?68uh-#k70MAV!pTA0`DV?{p3Uh;$1V}0?X!4`T>o_YxyW(7b>x_JmR$h`Woue zNxa{wc|o*I((J7O1RSr^q7^`J@um$M+8z-S@4>F$dK%8IZYO~#!SFA4e(Ukgtn_l@ zK(0Ez&9a`ztXTu3`^LHTWuD~C_PuZOL=i{|0Xyct%G{p2Cxql@_$sz@LBE2|pEibk z`H(oP8}Pz}87$}joIKh!D9`hkvA-H`gP#!`z)7Ux@wRwjPQ{%*fh zv9|OjK>K{~8y=P%{eqw1Wy{4aL~!1PKEiN8HN?|o&~ee2f9%{g<;|>ht0Kdg&~)%L zHImxy%n=Mf32%K#zRNUM*lYLUmq`t%Bk#;LoON2&XT)Krltx6R7#7AhE*yo*>BSZ5 zvd({W7hj|t2L2CPO%#c>Yz^rO_nb)#E;6_4@FV9b;K&SUy!MRykE1>zT%X)OXMn7g zwX73LHH`4N@XgCfO`4cp-72 zw};Hy(_M57^<`TBLLtdKQc^mhxB88wtFJ7|QiIeB#bC%_x`weIM1rPA{_-i*@NRtM zSB2mrhv1g*pa=6nvvnR8O!8oP?#J4i1fqjzYP-AU3;L1RlbC3d3l0$R;f8+-Zec(e z!zzXRFL+$B(sR~MEObUts8m0%%{6j>Xi(3348zIBIdze3!)q6_YnqA5khj27{Qt*| zu4Mnk&M89$xOjmph!USwEQT77l8i!#%Ll`RH$M_LezHLHW4O~!Y^!D*BkkR@FCC}c z+!owW>{z)A&K>EYM%2dna)7;Ua_8;Bnk2XZS7G;*$w-1Kae0>E#D~s3dz-g`gGvmG zIRyIIs?8wbrBV!^ zLG-9?)jFckmaK)sOSCC92(uB#L%B# z-n<9yO5vKaECtG6Z*rsB!<4`aE|&(*J@;2%@iL4Ng&m%iB_UZqSWdThnDxved}g|+ zSp7ufntq1)4{TkZ?yH07MB<^Gb9y-M+i?Lw+C6CE3i`2|`U8h~S_cAG0e;it%efUJ z=!Im|JkceNB6u@@scReFZBavdr89+j-yLph` z3&2epkN7!{0HUL0odkv{?j_+q=4QQ2hqwBca<>A*U10S6z!%B9OiR?@`bxFi@hAK%oul2R&p$q$3dkXkAYB{>KARpyh@4dBo`! 
z^OMGe?^;4*zQR^b7UqF=9&FQVDXiu$_g8uM1+TCG)ClwLg>aIIpFzF2KKe~3IbH2N z{ev8$V{(Cu1X%3`vZZgt`3eke&y3d{1Gf7tYXY>^zA%M<{&a#qR<95QKnr00sAOIb zVF=6h7}6M&^#ku@CWhWc1HOK9DF>o?@7scI6AYh$!dq{Ttij*3E&Gm)iB~Xfd;oT< zA}4-$kfwik)6d3x;)b9jQV}FMvx@c$L|Ar{hfXOtKhGuN9k(;SmPVZKfTt5k}Krzm@>zdny`kj-T_+bFw2%WJJU zpDb8i5quHqC$M>3FP5b0Cb##jj$;vIyZ2{u5P3_+VlwY^Jt|*%PmP;=QPvCWYn{GL z%0w>F5PIzG+JL8 z=v(aO1&fPi81jkXNjtrKo;53G&2EPOhEyIqukKkG39fE!oSV&A^&tw2xZ$gDofOD0*^kHgrI!7huS1c-MZY1A zrBl?M^s4PD`enZ%5|1?Dzq%&tU&^@)S1(r0J49Esa_ z&UPWaZ!fN#5A&c3#q}2)X{-l4 z!kQ=B)9P}J!rR>~#NBzJ=bz>DRacEMq*4BaF;Qg6_NupV&R_cW!OEZJum+1e7ONC^ zL^A96uqp-_8+TATzy!+W@gD|Y>oe)x3P7P9RyrqO+)U;K+CX5iOnTgz)TJj!=;s4U zq1y4}l>5(o3u%RJ`97^tgwI!dmO*O}P*kGh$_~dBOODI<(Bo$SrOj+a1vu?|brMR` zh$LYl`wsb6X@)?(6r_25VKZ!3zu)hyZEp&cyHoE3X zUDt@Z^DxN&V#{B1wnq6wh0o$wX_~JDawHuC`V?@zD=&7wAyKWV*RfNkR4?Ii2uLBJ zWsj?%$EzB_3YaMEbp&f^vt0w(MBf_FRTAwhzVYj1T;~{A`b`$+I#AIZ%wy_6FNVrP z-Fdps3-WWdr_d(|v70>hX9s+jcXeXtka#}%g;A9kbfL04Uhc3Z5E6 z^zo4C>wpjjaMuCMXq5b1Bo}~{>gq#R`D?krWtZv%I)-_n=%jzo>A&W-EaWy-?)>nt z40kFZv?m#g_5B`ZIJFNc)}K~1Z7Z*JbKano3qGhbOqWIZfBvzWgw2Q=w;4c#a{$6S zB*A`Vl>c?P)7PehfvSL*pQ>vyd_cOFQh@aByBq;iK+T{2d*B~v-{rWO8|9m)O;@RJd;p&V+UB^z8L*6dRe_d|8aUQOGG z4zT50{u?9Dldhyh?;@rYLiHkLr; zt$+5nm{abz6QZF|W{^v!f^ty+2C1SJyj592;>8>ImB`=4V{C&sT7uE#hj3J9|t4L zPI3vJ#*pBmk%90LX5**+2@<3NQ4rSUgS*%DhBJ;V&sO@M2wsLFyS_#F(Gy748pi-; zN*t4?!aFSt^3ZAM7Fv2sW3=Pcy`PsK1i>QEe+B949v!ewOE!(~6P-VScQf1TK?i{Q;vGs4#ETiAcN~j4L*ef!4K1*!vffK`=rH0~pkgJB)R?zSWu3UCA#^2xn zMzzd`#dsM5D+#2UU@*xZ4-QYOrO&+?TiM>B*e`xk@KJnu{soA|`t)Pp=^AaE@A@l8>4kS2LVq`E)4?g915_*v!VWMr|} z^vrj6MW%4$w}(`$aqXU~TetV(bP+pyFx|bUrHq1&lMdwVis;M*32B1p9F`EznB=`z z%#Woyt8={G+A1=3wrjh8=2C!|%bTozp^qPxXbc{%9(zk8aCS(_UE*ZCDa=RknpSH? z3x7m8K!JAgZCug&n2sS)?H<~)pEA_~MPY<{9|HqVbt}Bt5H`?oS34UPMe;!-M^h{r zg(XJ1!UTlZypM7x^-9A#TLqmyL&!Ya%m5&s$!|c}m3+UTLOt6vfHuO{Om!Nvty=7p zg!g=-tyjkm1TP@F_oi3O3TmA^%T-&>2=amt-w4zxw6qh7LinZ7oSHU_?`4)duqKzU zpv$T8iWE&yIB*{QU{>npBKo$f-S=44X|SA-{dA@|-(TS2X>{n}D%*wny-y$C#o3GU zKl0alAr40DM=Af5<;5IqX?)RD{~gU2Z|_UIEtteoH%K;#f*1BBCO;z*@9pf#@wsiWt{Bc=Xjg9z3B~KAesnnt1Z*rV?VzOYKFy)m0juHNEH5x;5;_o z)lMM}NmfkNLN&G!5eC*m!#i2M4lV5uJ>lbvh)9u6Eqj?P<+FXLX3KL2>J+FKz18K* zbpAQt@4ndeZ#H?*-1v$>4ChHH!Cv)dYr)#{`p%%45;xSgN9&@f9uOkiOT1?&{a6QI>ZhXHmA9=y${vdVB)lmMm!jA%GJ#Kj{^l<9# zB#g-6XbH}vopyT2?oOB)@5X(*=UKwtjOixWrLP;9k$tsu!Fh3-yWT9wStI0%lAjYr(;enm7lgMN*Z84y{k-HZ+4bG($sw}X1Lo9`%+z3 z9<$}Jz^n9bnn?KQqvgkvLlK5EihADEPhy%;Zfd z{j@F4Mz{8bG~1$H+xE>yPV*gGGgg7Yp~N^$ptvH{i%MoQznF~4ZjCaPKLd)v>1tEe zM-puO)I>7l1$pUFa+-4;JC$Jkeso~l6Z)R@Jp;>hh&IwjyhrAg@`GPpFUyh&NM+BQ zXBWxNx)UBc$V*E(G@op3Ym6w`M(^7Gq7Jwb3&QSw67CQqL>{U7Jc@NIvXbL>k0t)u zNYX&RCMh{A*`Ci%0&x@lm}nv`%tUKNcKT!qie*QCbAdlN<$XILj~hMwn(19n$ErkiXU@n z8w)-r`rMvl1vQjf3dE}!@IrMX*KLf=J$8b2|i&E;f z*sN|>e7Ob2U$M2A9MVhy_vVt1^QWFuQj%SCgPjjdM%#`4K%@XHSd!@rbT`5_*DhAi zFVi`M)&h@4$We{+>}hqhB6SbpidhUyA~dU~-#kJ&QHWvTgQA3ua5IOEmZ^_j3`DLOtj2`on~!(r0kl>W+H77s^0n!b-l{#n;fwW3U;Q`1UD!*%Ni7 zV$zL=Z=QTF>}j<_^;6`DoJkTy-0W$Ok4wE+{V_ghX3H+Od9cxzs)g44A(D{22}Arv zwu5vr>zDQsw%k%BSB0yb8SKA}6}~&$s>g&cHLMKwjr+`$LBDES|7PteNNG{l3d zwo}U{-z&^;WS$Y8SgLgQ7YmbwYeaozcX{2btI=}DyN&KM=OHMx%Eh`6e3}MdXYjg` zlUqHsx>;I>dx-@aZYAG%TsVALaSph%%S+4Z8PCeG7&l-E@Lk=<@TXh6^pn%Dt3t4k z1w&Vm*?dX*zWt=*RBlFGJkg^u%db0ncGMtAO@!h=9!-MgojFwfj9mWra0mtmcr&Iq|` zC=5Oy^9a6RVH-B}{VOnT02}szNa<=A5{KMrj}z3Xlq5A?m5b3yr;JDFx{SR6l0}Jc zJ;y94TyI|m(62g=z3Di2IH?!{3L`R`c#?hC@q>=@e`Pjd#U=lbya)2s3QZfGKoLHv zb{1gV-FQze$Bxo24EQ57#+{}LquDemo|{N zG`oRF(zz|OC6zl@Wnz2it;U}qX5Og}-!b8eaTwuPpbgrcW6*K#F(C1zcLUF1O}_Yu znLO!QIMIw`jBk&~IoGd)TTh`gky=ma_s1%pJhf3-hl_!4a83Q&yMyI1>?+QML3c}` 
zwqB|@v%M>8vl3ze?+Ae8OnZ=-}n#piw6{GaGTzTuTx8g5p46?#7QPaOMWVh%$%OoE`qS zsu9*LO{%KlbrI1}%;**ufXP2LsTeem1ob7_eqf-YoSp#3TlktUyYX0IPq4UXRwrwT zHB73idAWGjU^TMl{$Nj|2Inbf=_C*0?sz+Vt3TA22q9r{f5Q>)hh&!9M~`!p%%(e^ zjugyw=!cRoj7vi89$0e$OtSb-p0zxV@=p&j-bF%aV^z(h7zUsh#O?qHy?`!7MS6x) z#;GSu;Hf(EdcVCtZA`pgvW4AWWa$HS*;q}VkYAv_Ah=8uC)gcQ2ojofNTp3$qP?vB z^**d94Us6PrfQ1eLtymT{pn55SJzooBX5-c8Xx~(n2P>`e8^XDd@~Plcq5h5BO5o3 zenY0ZIIKS9lwIBd6(2pOEbRA(^Ffui!z4InE|}HXNVRi$@N|}fQdxjg&n<*}mEVIJ zhxR#grHwxhNmP+J3P!Qt+Nb*L!Tmh!;!##IcWe9{Vqq`683qvGRxXRhs3>mnVlkzt zUt)#ovpQ99<3 z(k+H#GX$62xuiGlZc@t|edaHNTPXF(eEYpVVpJtYB;A^vpNtMF*tTF1>AQ-s2^zH# zV29^>ds(ckCF0$srwM)X(<|}2i^(f!^UdwZdo$m7o266_>GZUd9AdDOpr2z#Oc1*d~WN_`=U=}W^wP=F$FIC z0c?EBWpuj7>lMiUj3zt1%IDy^%suL#7@oJ(@x`vaZ^E@#*}8VDi|xc*{MDvmeGz&* zsw9{9vBJNTgY!_QF`WUyl{3AwIQQy4ep@R+WZ|7s(;;OCO5|;d{lb7c4o=P8^_Q!) zY08BK)Hb>1+uO?b?$IWizhp{N6eJ_nRs3}2yj)H6BiLs3xygn0Wx`iu^3nyG&c@xN zHotXI+WqPu+XM6WP`W}*vAQ|J!_5M@wSGeY$cZJ%jo@6$YfYtDbob*%j%IHVV?pMX z(F@z6A7!L|mYdL_6ARD^fCn2~s)3D24>=NhL5pZ=#Qv)4THXS^VHJr!`$JhpFCuY2 zd>4IM2ASQ_w!kSG-H@%sdPEE;)kLEIY}|DNuQVZ>Ed*1QPL4AM552-(dvOW1x;@4I z?8XR~&xUyEgzl4)jJ#6+p}Rcen-9)4ZCu_ghHn}U1A$W4USC~Zt@}ce&7jZ>pY!Yt zG{|Xv@^u+t=;x~&$+ZiysQg;3iKzHH8p5~6S222fd$-z>++)8Ud~cssq%5!F z>%Q9=tGKQrE&GBkr|hH7RjsAP^~U+e8qX`8~7;XmWhM1ha1b)8KT!&4TKryOQ$rDGU1BLyKZ4YqjgW&hmxg4ASHx z4F05AJviISp;jk83sIy9$BtTzq)>0hf~gHr$sEeW$?YNSWGb~2T@C3&$>6|VN8NqZ za}?hRN#@IBNftd1cf)$Gs9+Q3+DHP8AvqQ?`S>7;fxr`bNqj$eQTPpM(^v4~(B+i7 zeJ|y};Ly5Bb|K|EKpg?fBBmC(a|#}fQwY)D8P%WZQ-s>ZoxY$C3`NduAY2f0peQ)2 z^Ec#}*S7IYm)Y?_K_%n2Uzwq6*X4##Z}8LJB){5d-3wNF`L6hmJ+9@O^aRUj80ROt zFmyFlpRA!qPrF@CB=##nehj68Pw)PQ9E@_wrWellC>Vd`V5(Z}d9MKWqdVt&7hk5C)$6&}VR zaX_jH>$`=WbIkR|vqz)%CgQlF);EBpXYQa8E87-O@q8}}5xagZMTj2vIG3>E?0=ag z3dYI5ea23tWbk0fkLtgHo60pH&yo;`hT2r9p;ZIXP`Y~>O}vsR$Q$Y|Mj8qG%@<Qc{fR)?pAAgC#{ zokDhUBLHYoh(@QVxex$A^h$sQweB?=PSDgWU}+oqQq=QHzZzX1L}MDu`X`uHkKwy8 zKz;_lB0n2+xM%eo`a|$C+_#$%c^4}Hm_%e2ps-`j#7%Je@Oki{hEL}iHBuV-tQCa6 zF-oHh*{;)0&$KV6E(zcab5&-)A>n@G?~y5)Ai$a&UJE5IVYKx24koA=UgtaUKO$q> zFsTzCqZIg{m@z#3AwNEZK}n9C-34%y%c#q;ZD@5<0M(E@b@lJ)FLr|t9u;&->p-&5 zuJeJp`R@U(P$@lis_4U>NW-<+`t*6#)~f(STdGjY;3>1gBXQGy^5Qcs<+w~f^a9*e%sv@&hj7O&t9`JZUcq(%#4i8 zP47rh9M>)(hFo+&QDMEhM3b=%vYIulQ7f z*%Rw!<)BGTXFjHIXY6+0D&Ns`I)Q$_Ih;yW>b{BE5R3gRESi(=F=;7#4UI8SVvj)W zy~khNi#IgudwW~U9o_D_ku%Gw=GqN44>aTvvQ#F%rsljGs>(ruH7>~O7|)KRkN-X# zw}U_aDWO{2ouzy9Q=Y)?{JeRlITmVGch@Re4ts3%7oYdmG?F@WF2+d#7q@*DXKXn+ z1*yzGgrNK3K@^)LD|>(F_2(b9_x=T1Czo1i8oHJ4)o0D>PpRZ*>5t8mS)M)cFD=;8zi0$HIrT+N)xX}cr z1JdPKuM|%$MGju^@mzSz`@N$!)NRfe#AJQS(Fz^Uz1n5*BrfvyxI!ES}5MnKb zt~FWxz};#gfq7^*j|MzeXBWd{;8BVsy(42&R?P&(8hBBN&Ff;4%Bk4V$vL##WVs|< zuq6LQQoA-O0jZ4^mu|{Y-J4R%IxwE;eRuH5sJCN|)URAV7xKN7aPkuT&Km{8Dd#N4 z*e)va-_M8ViCWL$gdQg2YRR{c>$-+?o_607WoL|u>9;+5$bvqNvD9s8o{({y#+Nxf(MoYu0sHlo28k5Mt zTCLMbd?XmdXN}bS{WCY+8uad7?btDP?O1L6;_*AhtE4SG9(|mGG+u+#8cIsN^dx(D z^HHzAT9K4b6;UuhFJ;&s7Z&at>}O3z=yt+U#Wm*(^Slh*8>`pxahUw@mtu7q_4v}T zyNb>?I1Ou(*%x)?He!;`@q@#i^w>GYvX<^S+{n=v(HKk*xhc1CJ7VO+PE5F#BUQN7 zbK%2MkhJk8y*w>sQQm8d!698zQZG|h%fe3tTI>y7f(9F$_?M(jX@z-~-^lSbtBA$) zI4+(1OfT{w6#g|Kz3ZZ%zsrfz%0_l-;2<}}z7L0VronP7 z!mJLmoH5fBc>-BP>#WC+%ghgm=H0$2+0l}( zGcNLPeDdqK^|~3inx+jic3|IL`8b_ZUbxpGDE6Wrjcl(1TYdG;HgBgLhb+r<{>B0E zSdG(*moXEs3vC$2j^`ycnzs!?NSxSv z=UVzgW@etyL)?pl;7jLAS>IrIGe0KO))0EO2k^8B4n7(>tuuyE(@wHeP;Z3+9Njp_ zVfdyX1v1V^cRgBMcg697tRJb;2%-?5My}vL6Ed=b^~w(y10-R}JRYw!!|fs3AQdXg zkmzig17%RfhfHWYO~Sr&Xi)a43vgDYa0v1;3kn=Wj(wm&a?IRkhR zIxtjYlRgsZ)6Ow}_5Wlh^g*hqe^7xAV>HRfy|1TFK1JvRGLkQD3`n!8}URrz)ZWv&= z4j;Jx3XAmYsm&o(H@2rH`)Urq-y6-u_m2(m(pvK~L-8Nv2tChp%{3iBRaXXRKia2w 
zjrqYqDZS@wCUob@-mHO3QB96C7hEx9tK$L_u>;kz^{meH>XyY4*ZrX@ppdDsRL_qW-ynpC zO?^URUdfNJhjZa<=kfZgff@Lrlz*D3D3==WW>(3&fFi2pGLg%h)l2AIfd}rH50}4) ziyiy9N;$)^OxGY!h3nZfLjat^o9P>_PNqATS#QMOc{=fAX-062i4`(uGnEHw|8BkO z90^4x>n)GdX+L%85sf;)%8`3)!~RVzpt%CTPbE#)0qJh0NXkily*sz=e(!{!bTg&- zX>KgdO^pL)a`v4G<<2z5s`uR^m7Mpo^`f5x&gzkz;otIIGGt4=F_d4^bQ{J?{_bJD^9g3h>zxuwV~LgvxDsE41N zc^{HDsXBjKFJ*k#s2A~QCOZ2N>ICjBpkmUzQuN9|znVc;kN>QN9N!@OPj0A>GzoP5 zKUqTAO0rD<{@1}$1WV@@^KSLA6(Vafiu6LKuBLqW$e3pHxnOG5w3&m>>BVhked1V#|mi=;X2UW zW1b&V$iUh>BLHW-z%kmpm z?frkpAIF8U0PULF6ahjH{0aW=b!6Dx2i>g!`?k3E8 zZiTMJw&T4GUUl7RFIOCOVrBG@ET8fV)s@X-!6ba!a_y#Y*m-${)xFuWx3I~3m%tc7 zT3%Pyq>UD$-4IDN;p10fDF>MFgaGA|SnkfC3s7=>j4kMIaOf4NbZtMS2sWRH>miMS4eiuK_86 zl=V$q%XRkNXW#qXbI%=jjQdAONYAg#`M%G5N;T?eRz#Sk$#hrkgC{~6ll|Iq22FEY zIcvr2F&}AJVHHJMFc&~nwZX^k~daXR)ogD{%fO(Q~}XB4T@iloU0;2|&nObwJs8Z&@gYCVG0Jn6E49OPfmW6J} z=BX2zB6Yt|>J4^AwxWx!;#I*{HYT>B>Fbz0Lo5J@{ru+prs0ZnHrjpHgw9*_XxzOw zJF6bqb$;S@j|6kzyuU;|q4MKZ*%FpoB_!|14oadoY`Pr2BX>-C{PYa$l#Ddg@;!7) zBvv2+9|@yEue0YNmX{qgpC{0CH?0ENGYxp*pS0&pH>%sel#^D|q##m_F+JU={5KlaRk9pqB zcw2l|AgHYVK~8Japt|I#XO{(c1pG)%2J-f--$d#I0lDjqzcv9Ry&6jHBMVOOX4%&M z8j_hpz2*vy@Ls!e*LyNcgAYtNw^2D_Ff32b#{1a|Zk3;~cXMw>_T24qc=QJZY0-6o zuG!F$v^y*p^`(ps%e}PmIa?5y8f;1{L!8s^FQQvu`pUZFeBQ4ETDlbActL+#t%==l zs8_Vk?Q^{68`@1wh%v`JQaZuqE>zY)FIxIiRC=7I%-N40uPd@ElBZ?=IU$=KoWzhf zyJFj>O`A~X?n+1d+B7BV%!hfu<0(a|b=S=TLDn9^7V$Z$Rc{nF!zuApPBdF+x_p&< z=Oh2R)vY6Iq&Ct5xp@7PDQ0OKP1I701aQXvLBwv6CUK7cKO={3`m2uWJD_5%`emm^ z2T+5{=kY#xt3M#F0Pg74M}hb?N^tO8LbP?a!VlzN_&rL(UXcu%zRTzd*uWwuejn`M znr?oaNC{36o`N~LquoBlZZIo;9n}lNUQNKKpu%8l{Tg6pm)YTGadld;%pAb~nUpCB zT;T2X!Pw^coY+u>BF2dOIS#Mz;LipT^+cPd_G!Skf0u7l!dMO~K=P)4(fz$kNz}*y zP{n%;R5>wr@`Bvq3m|zI;whb#eG+-M)t~)W`~pbEd}Q!f7|Oq*eSc9GJ3+KS!-r>Q zkru4iZK+PH2m)g~(iiCofQ5LX@mIY@@+)|N%>Q@Urupj?*RC*r5xd`W?w?=i&t_CX z>2&-SA;(Ft6UXmK3nFv*zKCt42SB8=G6AylZxO=3N-~Hc2l}M|u)5Q~<#7y9{_UP{ zJkI}GNbJ+MKE93kqwjwL41#x~5mqycAV6!>}_0yx{Kg0)uU8?vG+w5dOi!AVb27w5fmjP|v8vMp1UZEs_ z;8B2=`X67YAA;9p@q-%piQTS|cxWXD4b!C+gT7zorvB;fG5`dwcFH9B?e6858ZMj| z+TF`K#Sxr;>ofE{W73V3UsiY^hBVedXac5r78u=G5-IHha9W^=>bXouU>EoEvRT|m z^`NnQZTO@LLgPiSh~ICzl;CoYAikLq_*qZ|gy4WZuG=Pl{QXH!(dq~;w7MF=G;M*U z%2<4Sa9Kfa`4@|wGl^hYcvK9Vxq%e|fd`<9>OBo_B~%=N$$kkwaU1tqz1fCnmoMY|Hv*{$6J;oyKYb(~2`rnl#~D~|Ugj#}Wohrw8* z-~xtP9X$c>_I~=F!&BEAe~uW2Sj>r^qb2W4Vx-*5?3Pb|9MqTeg0_sAIqT!Q_uL>N zD{R8Wz;H=SmG`tHQhzNmIMzLXfhklrVJIA=9`$NPk@FH7v(5^Vi{C>Q=`RQbQnY7f zs~wOkWNS9J+>MYr{pD!(ZEo8esjSu^T5nom(+twgyXdy@6{8-*$#&p!|JrRv2yao?(8)O=J z9rHoG&xhL(<*982&;T?!s?758`!1Eb##c`)ytX@n8?2-{_Xzx&`cI!RCP!gkgR1EY zAss&3)^b{oCc7K0foUF=*u24ff8zolF0BfchwAET@YAazw>PiG)mrLT50&w0S9lcG zsd51sV%K%P?(W0k-k$XI$n#9HpGV094233{D8t0HQ$0<(xLs<=Q_OAGv~NYF+OkOJ zv29uio!SQ2p^tz&YL&i)Qgo)W)tmac@dG}b22M=6LrL!2k;lF_)*3g}B9qB+KMcg9 zgfUd=DdX3+WS_g0mK)>JVxj$6E<~!H>Anq^-?2m5M}UhpUgUC<;H z!||S@{Z!(Mpo8Ltx`)hBJMkCqa}2f-yHlwyVS-VoED!W3B}6-|v@h6>CB7KD0? 
zJBhl9rmzfKJg+vT9eJ4oGC2H9J)Jioj2qn-waaEEkm{&#_$z<*pWFGM3Ry8Sw*EwPlboc&`L9#^34wjV|6<}s|kAk-@gdLnCSYyw&OiW$UEsS zr<}2_OTmWXWU(>I(dM!Vyb@OrFC`2%7#*?_3ixIp zOYjNaaq@Ey`?}iesKc_s^5Z3A<40PGf(1_g7f!xiH+@|~Rux~PcfiHQkY%H!LkDQ(|9OCuNXj#92HDejI`;>%xXMRUuxxnJ1sW6?)xbqnjr@^ZK&lDu237 zd+Uv?H+-*XH&!+*n7=_wbGUkIS&PmZs)5#_8BRWhg*7v*oHuw|4w+_SE3jej^7 z`3&21ii%&jUvDZMS#{>uSy`!fIc4Fa+k}C8GAHqCKT~WUg?CmZ<+gFYalnKD*6{K} zX>ZZJ^^EHgmyt0OVG>UjR}B62Gz;iys}V?8Ep}j0KM1*P9eX!wWv6;PcN3etWVo#0 zr1oP!Pc$mkEG)@s6A@gJ5(W-Gkm?8BU=nJEowC5oZ=b^Lg$vBbkwEAlhcen7+@h`A zN+ejBe1GbI-6h2MiQMyH#!h`$Yjlh^J&-6*u>2H|qw0DPOG|F(0v}d;QKM`*J4L%f zY^0v8KQ1LuzTtYt3OVS>?klkT?6-6pBMtS(iGwV}8h-Q+Q2HSN{O z+P+||_F_7c_XG#YXvM-FDh}~jF9=Kx%tRV{Z>px+^_vMD9k?E4ZgT3{E??7O&M)~M zL!a%>8I9@?i#*4;kFH-lIvVWtDVmM0#0#vl%J08^?ve>twX7_;dc{z!^Y$aR_IcJU zHl(REsv0r<0YPDnHO48fNuGNnV&}cy)c%lG_!PaHDZoG?qUV|K9ICXyk&f7qM++HhZL zrdcm0zsoI*K_9YmrANSlPoZ>r+Pz?3w#qUwJL!7CKKV5VEnHY`t==#U%jJZav0hcK zTbm-BEhEs)B@O=smHRIoG&K)V3%ccF{reH7V6)p$_S|`-rMuWnU!wPr0k7uqR2&Q3 zC;hqFvr}Tn{zR|9UWXiC8nPh$A)02?6Rp%9IrF#NSWm{!b#o$}Xs+(%=+lfV$?yU1 zY(-5)!$-H=^d*YGBH%+1RsSdIdE)sorBV2IRru>#dp6LucAg~aeVkF#TUTYI`s7OO{47FQ*Syzx54K~a=1q{Ysd@iRY=H$(OZmnqMxZ}>n}HrTr-3R3{-CaLL~i2!B(&%@^4jQ$Qv%Oe0gm> zyp=&-?NnCM{pG|QDi<}SNt8f1!H{5nt3=qtp`KNf&a%1{Xh)Iepjez>5^1prqU4gY z&|sIHj=jaXoQ_fh6*tzGe<7Z95t4C3ys_1lB)6!e^=?46;A80ZGEAHT8L_*L4EwN` zqG&f+Mxyp)?Ov-*mGl;Fg^lj9>COr`aEQNb2+iH#1dsI(+-Y%;xmYnW0Rn^Oq6@*8da&rs)o!3O{LoBW$=_Ti>*{(9E# zP=RX-+-nE-STVSyZZ8|F{lkP)KIu>i!3>y$F0ASY|B)ggR1BP`6)&<%s<1q>@toX(uw2bk6;4ou`&CkmL8Ro;}He$l4>28_Do}@_1 z40z?EUmy6dn;Ibx#h42na`>G`jK~7OvA1V+mRor>$7ZmgAdqUSdi>7%p-)M>y<$P(scyvZ_p0)E&Pz+rN!*U;N@B_}~ zX&h|yw!`o}L@1tqVZ}RT@mq{)0r9olkM8%Z8xOriI8$cP(VaN{{j-Ejz^Fpl(mviAJzlnsV?NjCG z4N*&+pbyk47Ft&hoRZU^?Hc(Om9ZY?RcIBh%c9E^v$C9bYh$h!QgN+Tdg7 zel{JkcPG!lAzoF+5&dr$DybF5pUATB@P@0;M2kNd7;KcZQn{O&%35BSw>DO|SXp>i z<9V*VsD_KH+{b$w)^VZL)f`%)XEo$?YsiUf#6hol6b?D7I(YD#176vsG}uad#JCz z)MxZwc>2BIIg2!l40lkPr;SdiDrPd-G0Sd$j(&Vh>5WoQuVvWP$a<5qzX-V;ux;2h zc>_FbNMyuE4`clf^;%VKW!kt(cQj%_qKoA~jSS7{2fFx%3#H}-&yzj&UAMX7O_c@3 zeC=x5FVwX=^?Kx_zc$|SZqG<0^rh#EIy}hko|&4>X+VQ^^HhyG@hmoP<|~?ZRr$)f z8JLF%UxTNWVQ=TVIk(46ayH_N?u2%{9##W0H>dYfP>yL^`9}FRS(Jia!V|QdUfAhN zQBStj$<Cw_>yWtwFb4wl~`!ptfNMKk*4N zkBgAX)t5I9#KO8es)z&VNAFLIb&opFTuI7sh@@QpUXM|X{w{eD7Y_2Q4`$9c3bFdw zViPia1TgMlC97j)o%K_k^o|_US5lle+b9>3Qv5B8ND)pP>^;y%X* zr2%?eDkmsJmN`7zNzC^h?z_sr>KI{IU!WMs3Kgwkd2>fV9v^S+m!?I$WGG4W3jRdR|v0P>>Fq& z8qYXK)V+-u`3np5U^)WJx}=g|JtDrFpa}E~$6z4B~K98;5sQ z`vby}0NrFIwjdrw_(43cf$a3FZD4imiq}Y21YWSMkJk0VF$?eyNVyPhvK$r6)T8@ShYke@n={&1(fCLsRD7A5dFb}WJxyj76dmkfC%b!5<=KiSw z;>{5mhS>ggS+3lr{6G!DWBGwJMPv+&teBbF*Yn}t;^iYBO zedvod>^{mAt2n*lcL}fAvjM_EfsXFU)taJBO(WFWxha84ldlIZLrm<-qR_+qz~?w^OuV!&+P| zpJ-6}SdvGM76|JNEcOQg^#Md>*&X2dT)$uC`O$H&^es)aO8C>I1$GV8;sqk5%l7xB zS(}zU%(L((=IXs^x2;~^3G~gL^oorx3`n{@Z@T5Np|`C>_agNaUTgO4!flj5;hn+? 
zHm5ft7!itP2P{}r6s&fi?Phe~;H|lmHR!Zh>HP8I?cJl{ynvj-@7IjYFTaRtnSZsv z@%YJ&%+{#{>7%|!Zdrf?WVRH}Cvk;0GlyrAInV4xsBp`YO#WUwstRp`I+c}@=>D~Z z$94MA$%8lq>!EVJ9|y${!J2zM~?FUdo83w-6ha3U|{EZ+#j zT^lN8lbs!|q$|K*yE5sfUiL02F0YVyO9TMfq-$>JOO1Z!oWfc~d5xp{ua>#EoRY6!;H^3Z8n{Z0hH9Ffk)V zNDi_^DOyj@57ZG}dWEz=Ic{UAnq1dB!X!TEi;Vl>9M^)@%w{k<1%_fDDnwdl<5+bw zBcp%}5|F?Zy%w;Vqiye~f-V|*%bbuenA3WEsM&^9o$CBBRHpUA`F8590ccmEh1Jms z@{<7@>N%ABM-{;$ApuQS>C&+09Kt-2MIS4yGqnrb3ZKL4O%zsJ-+Rv^=Y5F~6K~5DBN(SY&MOyhpVQ?JP9i@k#ke49&s- z2!ii#|1#1`)5lH|DfQg0E#kp4hWZyO3VDABs5v#?d=%b>$&S4$#cu!f?bZ$Ed=1H` zoYDfFJ4(@l!))~3&@9iT2tO)MqgE}h;d7tIC)gf7dQG*z>Gu}pz{40&>~hyr+g;Oe zDRaHIA>wQ9u&%ht(gHe?Nap#zQB=MqE@$lHtrJ| zn)0PZ;5O{~kG_no&Yn#*AU?TCM&ETsL&vG?hD74Ps)N3m1QHnRk%p>Bx?jj888GrN zsPbnwq>ZCp)pxy-cqOwv+d$*<(@3(c{6Z7g00ZMY+}y~gw0^-KFHW2%-*U^2>-Ki= zDM%?R&X2v;X{BR0M75!?q*ryA{zX;ol+A(@9n<06?Z}2`tX_+$L1WJP!L^+n2;}1u z^9i)^S0_?$>pD*?Zu+*5;>?OTuilYPvH$hcvbW;p*QPzId5!a2TkIpVXKI>8f#Ga# z>bt9)k{{l` zgYVuU+sMpO?A0#tZ`*HMpOL(ASl=3PnBziI)Jfa+>0H#hFejPF>`x`9!@8rm>5$5x z(HgHu>3T<_#r3~NdbBE7d=q&PkP{6 zTmd`|keCpMk~BiZCjCDkE8;DlZVH^kF+D^Zum%{(4+EsYPg_QG*aI-xJ~2|le{@*R z!}CNLtinDW8K}!pntf||2{vvBSipI%hilEHE%R>LGDZi)G3<>KPhSAa85rnky+QMF zY!$hO-vm~X!2c`H9z8eYY zlA_jEenT5uYM}VulA?vpXrKUn*ZNM#`JmO6frelZ)t z_pGB-9)7V`g=y$NxN&~nhz1toHN*p-TiYHDYlr3Ip7{;pR4SHj zW4nu&2Gt6h_=Wdhna-6Up8&xMit)qp1z@UfhtAD}lhy{t-dOtJ(8^~t5Jv4YuRJOdEWdTn31 za%R%b`%&Jb(Y7P^?*bc$9kEw>^!W|$FIMtH8h6jH(WAzXFnXgw zGji~gbtgPIXFS8yTr=!GQ+3s{9{F@VYa02!75Uh-c35mOn`Q!>)d8fxmgfP<^DhmM z>HHS{oH9|zTm^JRMF5sn-*VVPqInS0#n2u5F(E?uGw6}pb7Pv$h^3zl04*>8oQQuf zANn`%Fp#3hS_SHpc-n;FLGyYTi=&Ubm^w35mr00)*`qm# zod|l)ZJXe`s8X%jDElL=%G=J44R|H(cieoC&ZMANYonx@r&kzoV(sZQ<1>0KdSpJf z7_!K1{(8xDoXSwz+2#rI@DMu2e9?0C5hnL;YC3~1Z;`!^tRi22Go?xYABDWgR}xyc z0+*UoF?y%~c&To_GHT$LaGU7 zZ+%~qZ@ep@^K9na)sfDW=|Cy*Gx5uBOQHc?tMj;fgXO`nR}ja@8dc=9C)Q?G7A4sN zrH$v8tG3z{D?eqX)osLVL}mHm7T3bs-Z^yW)F^a6U;u5M8d1lA1sK{2h}yLroYpN_ zsW^3dn0@s66-C0X#Ws92J|=%p`wEdrQqWd+%53e0FyA)sLBHd}&+< z!rz4rWrK6>PW16mh7f{cZVMgC^22fGc7ee$yDL0y@F)We`GH`_F9u`xXJc(=1;NQ_T!H$;l5i5P0=qMRtHAp>%?mU7~g9Vb5ygwj0T2p~J7?%*yw?Pmt!UddHJ*S; z^yjHj{(VQx(t0h%LksJh_0e^8C%1CQIqJ!9!Y0{%$I{ox&6{*LAyHpG!@>Z{uNFDF zKh=_YMto}QWb?U6tAwDC5wXIit7CEB9(7$j2qZhnQ9{O6h)^BuHl6~~^A~QnVol^S z>3anYQW^p)yKN|f-lgL4oM0D709}vP=QN*|o^WNu)Mt?#%$5S8OBpiDg z`p+~mZD5A*Kj4YEycN6Ozx=VCrAsyigogf_JdfYy|0-MTe}w4#+xw5vu44nhWBlXK zKP&aKzX)Xle@)cCC+z>mb;wZso3I|S)@?P$YTI!|pGKEoH*UC|ZecUr$Z#b2#>42M z1;je!*OLrd1ji(L*#3SB^pAo6e{jcmW&Hh63^3eRQLe1DotP}GRQvY#?WZBa;!L!Q zdpVbQ3i(Mp!0e^`8v&wD^aL1J2l(yTkNSSuoE-17JlkVJSY}Tmc-k9q>%)np(lOo> zntboX9`y(x+a#V^44}&{18yF#mUE_oE;K!n1eo;liarMKbJQE7s0o|^gA(0Ol|7(X z5-T;hF?TU{Ra2YNI^UE<1$H7)m5*BUH{$iwcSZ*^?u;b%iJ;0O@C^(D6i7cfa={f~ z3IMeStmP-KV8@nqKPF6<;dsWCSA^^*bkNZ=h~<@npUn(awW|n9sqIrw0s51ddLdl$ zMD8MI4odThzTL=)z-W5!e)K}t)qq4hm~dtV)vnl?*;Y_q_KU(t+SXsW;$Xj!9`0XA z4}zU4C}+a53J|f;mgSZ>1+0|tcd@VZSl?v+@2%`3#%a4vKq6|oy@h*Nt{}H&Gc*RE zo&p?I>;ouA2uN0+!97gHqN_&aNk1Zv7;af@eph{)9JO=1!-q1;Z#jEA38|!+L`(Xd zW@6fYzSNLMs!M(PWQ+B;5iq_5kV^YufZNO#B_t8>k?UV^43zfp4OX3)K&-Y1sIWMZ z+sBu93rmSz<&9RnfXPTA@b9(}PlsaWhzvpzOD$(xi_Nj#>K)V~c@L>An8D@qG=DxlPCU^FCzOoJ|Smknm9)J>@m~ z&mTyPZ!quYtsuY5yR~Wh`VhAARzIX8d4F;HdItu4EtbOk%J?|qQgeS?NlqGmnp{8` zliC<VfDLPjEFn!PY)z}*Zl5h%n>&+$|ZUg&!P zXM#7Z##V24-`SR{pb?2WQ6T)C+TfaR{VwYL_nU9oyA>Aef~d59e4d|W|M9xVIyJ=w zAS2q(8k1{v+#6g|yeFbFUocz_=oU8-zyYFuA&_ch1rV;ht##Z^qs8v6Wf6PDZ|m zYLNOWc{IeggLZd3u=h)AIP@q^>$(DYeduU`tN=VE(~#HG#&MHJ#`rg2|8 z;Fc3=T*CB8uURFodP6nwo&)0i;c&z1!6J*A1LxYedzN93hS%;fV*tFFyKOXTcaZT@ zL9FY&Un6uV^TBIKO4)}C2MD?i&p30EV-Sc6O3<(~2xi#s 
z)x#sfTPOSK{0Yg@SpOVCE^JrkFy%@=x$Hh7My7bryZ4^MPqo!J^ikZAaeg61vofYY z9G`P#O9G+Xht&s@Za5f${+v19>;f}S&Wsb8z&!co;wiTmvsv|-^8izbHH5nLGhTk$rJ18P< z2iB}Nnoe#WZr=Ef^9>odq!c+M>1w`vmv3Et*u%8w>uX`ApW|%GFi#CL1R}{rXO5&| z598ipOJ?b49sF3Up6i(O*{VO3EEY+-I2TxND)U(uUq8IA7OueN>D(&8O=H+8{b+r? zl5>v<18;_pE;VzUf44`&s_b#UChDa&gv@gSL|AUH!DiXzRPtus5<2{H77p7ehnmW# zi)xDgfG95xDZo1?Vl*_QM@5z0vbmMil)TvU&Oe(;t$IGZBKG-A;$m1)O=Mgwo13j8 z6W@(jqTinnr~!$y!M#V+@^*YrMY~kTD_r{W_^TmD4hBi?*`nU8-R~xc_%7K+_Px1Q zDk2kBp;@l~tv6AVPE*6bG6Muxv54DQoEBBpxwxXDQvZfQk6irrXR`3FHKp+S^7HN@ z)>lcoZbnS}RA&F-^>I~YvR-CVXNP{ns8hkn_zkJ_JsS1fDW~%bY(B|1igmCoJ4l)p zhw2x9DUBjItGd#@cgngn#9+msyV|o`8k(iOteW{4$E2&Hxr|T1R?LuLP_H(xm6L|Q zKX^ooz8dqLywFQ5;+ek`s+Vr;d3^^Webh6!_0r;vD`&US)82G`(B>I{`jWxJAyYL51&iJ zHeqBHx49CiB+^%SzZlk6y+~9a=5(chQ@ggth?`P``@=CB&`@%np&CWZxJh!tH*@uk zGtoEfxZaI3a}djLIF|3eUr?!S{R?@PKhk8=%|Jz~qjY?z$2`g2;bpgHeN2%_I}=yF z@xWhpc*h;o&lFg7py@)}Ug))gL~0EPINowJB!73cqym$_k6OFr-_5fEM*@en$l!-t z^+#%v8t_)6!oUNSPvd1HI@g(6EXCyk%tr zw$=(j7(lJ6e(D3}UQg5IV@eHCH+ld}#NkWCjyto@#KYWS`7c=v;BmlN37|sk02MMo z08q%U;F13qR3{wE-vGmWH`DLNx6Xh?M+f9KuaFl<4X}-)2TM2m=uJw+1M?N_Q-s8+ zQBqNtrIfN=;d?9dx6EQ_1LNO@Tu#07wzd%&9!lx7xR<{G<&#=7{iuNX_@OJMEbG4B zTi^UF=o1&UHDI>HPt;7wA`*T{NRb7H{h-)YP4Y1rxUuQ1M^rQ;A{)9qd zba@y%JExB+i`1a7`rCPg@43+@-ue$+T@>Wr?u`1%4*+lDV*0xnRUOh?T6wxEKo)JXtX2G#iW zr4i37vC*^&p##GW@^=nkdF-hhh;SHPIo(2GOQaIqaC+>NU0n0}!)ap;~vMQKP!AO6-?)ad>DOz3Mi5H}Jhll#Yz(~GrQ*@9$^ zW60xg_Z`?LG!`(sT2iqV(BhDQvaj!IHnddBi7j;3-=D9uT#MzyrJ3J|-3z_jI*VcVIxDvDO z+TFPT3LCy(w$A(;&R8Bai2ATVlsN~9h35Kf2;F$QRxurx8=`wzy@_`9=i7{n`}Wvu zbUGBvdA^Zu9Yuoeo^#3~;~CzJQU7Gk7HWQJQMVIO%~h1|JDc~um|IVIze8rA)E|_` zmG6ZWLCy>WB;N4pnL++8fd4xNaYuUl)`a-F%YK7(R|kEdnAx8?#kIZbbyK4Du3FTp zLc*6D-#&O7KQR{wY4t+A=ifIXQhAw2c3k7W?qhJoO(=XoO>%MMtG?cxNf zUbV6Z-@6w%;8@O(4P)j`2M(qL!Z_|Th`_jmToM<;a@lNn-G5)`xGR4X_|Vd5*@4&f zw(=jnPmi>3N=|Kuy+%(bLCX;@qq3^8&aS>^=EYzBNM3>647Q2+X?AGYno6y}$<7~E;T>l$TuR-MldyRT+PGbNfsCK(OK11?_iTt z^y@0(#v#&pFxm{|v;rW)W4&ZM`1EG!799{g(-d~ZoByD4EOwHY9J@bWycAPLA9>K? 
zNAEX)#^^yKpd~VQJ>$~g=RLU@Qa3_RU95jGeWCj49k3+rc|P*PPZ1Y0RlE4icNi5$ z1C!&5=C=fhK?8)M1wv{aUmFJ3>ahrec=kOCMhK(bANY-^JGFY=a^if8k=S^O zct(o2^^VOHV+(e8;Rw#Y^jgo;rs z_LZwnR`QLYdHK^@|96Y``}%)VU`e-{Ks?PStA*oO`5Y5P`#JB!5epLGLJ8PC#-u!RQQf9l zZtDfRqscrIj4>gv9nHR)^g3miR0j}ujm9e@RA8#7Olbb1Wg;};81*Tj<75B=SIJZe zUe5!47b9{wX7}uCcc$jZm}m!1tgvL1L!6xkAzrotE9q`cM(q1wTV_y5BJ>C6a>d1o zdRo%Kq|__Tkq}wOz(%G)>P|#8I=c?aOR=zS9{}AjGiI4wgofC zQQ{7Hn~UAi^yj2?7N^?{F$Ig_*T0_cg`hsJ zKqyb-*0VwQ9HL5+0lfKVbn9Qu2w4U2mscg__K!fufKXkF4{9nJAh?$8gBrYP>R;(l zxL5V4nW;Gse0(GnG18C^7PliV zwISUhL~o%*-ro;G{@3p$>LaG}p*!Aw*MY)K2h8p24^ZD%31Bb`%#lHFWX+v;aUI7U z)1P|i5{#BR_IPvu1eOhfq7(dpK(Kb#v4H(s(&;q3avpy+gs3O<50XxX5n^Fpb+}9z z<=nz`7ElJkz($n5VGY7}k5j4^Nb+@bfbCkJ?K`?(RnlcOkRbu0j-e_S<@s#?EpnIs zpE+Bvi=II`1A813P{^E-G9HTfTd{|MOmrDx=|cS+vQ*N%Ucn{$R#BKNme`|n?slh$e%`(ZZt)b6bZ z9eGz-#lC<^<8MbdKq${5X9QomIvlbHkjY=)%FtE^l#Kc}{$X~3{7!8407iBY_g_n5 z1~=&YqI5aMCRk}0U$U`2QxQ$PZhP7u2Glvnnog=%KNZGr;OUfsP#vdICo%W=p4(%5 zI2Exl{a-6vMloBIj?xO44^+2~VL9Dw^i{vOL{?ZF9LqP1PwD|e`J2!q8GHN=ycV98 ziZMz`9eV^ldZG9$AGrGl$4fPm zD58^QsH&&pfTA=>0I0^B|0mndy0PVw64OscQ{}oE$(IiTz%RUyo{VXhz z*aEY(qat>v(D4uh%9TM$(^s=@xBMV#j}h%scMQc1;w%vzzC&OkVdxh|#~4doiTsS3 z%h|8z){CrKd~OCHzcg{HfFB8P%LTewsHUs;fOC_J>d$+PYUGB^zT`uhe%`kz+8m{X zGuvGT+!=oid{_+hQpw0-G(5A>wWi5ZYcH((oi8`cb48mAB$b++${%5pb56A{=%MrW zdF)e9%5x!hw0A|~slDWwUoy20SAyviHQ++Jc^+iApTu!`@_I5%LtoY?VBS6oJbg0g zjaQ;78^`nEof4ch)_Sfbm8C{BqqTbOyuNn5J7FSU&k{a*FUFEAIcI!iTySJYAHsp( zM~Sj1gjjRPu==~E+PjNE-=9dnXQ*Zi4vc!<^hAxeuft^!K83iQgpGvli#~!_h9b{6 z>)T6Vjk&ZFZ+5OGkk!)mz`U_;kwVqqw-ERml+4A+hR{MsA`|?uRX;#-p8aUs`&}9iplhs8ai{%!XS1=@|#!Nx55cu z{-w8&=$qPZt^B)2`_`D*JGL7iHFJF1l|*|S1*K+diR_IsWf?-S9XX4i=kjLek5Y}* zoY{-LwCckinQBcZvmzL<`NKCW4#R@JUp(QdB-)?2l=Glk;NW@6-ro83tQoQaA7dV5 z=G~sL!=X9!q%mg&y3nt(S!yb8`mXW3pH;$4);*Mv(JoxEZ==*#g5TV#*(Hp`+=k0v zp>*)ZdD|7kPb(3F_}UG2lzjKH%&j@bSP52NnE{bANLm-e9Xfd(7eAS8Mu4Ps?U!A; z;K(V|?c>LJIP_M=Fi)Jc+WPi{QkJffFa6;HNKBi(`@A{I)9QkIT;3g8;Ye|+ym{B2 zT~xHuwQuC4Ksj86W?C!ujBGYbDd0cHV4>MbcY~#axkI7s>-9EW{-c5yPX}K%D)Biu z>NS*HGcPnV^1X9kvXk*y-e|kQt*ENGGQBTRL2w;M_DJyz)xn)k{$KnMHU-k$%(n4w~nTLFz3S>O>eU%FD=CK z@Ka=5$CrJ_ROtIBFRxDS2|&|f^ieafM_JUZ+WcaNu1!Sj9r15-f&&fsx>@?yU)7n>$G+K#Fp^d&3(93 zD_p@{ut22aI~(w%KOm)TFVlMgC~kg< zUTYzD+IRM&-VOere)3JmqR_^66v1C%`KPy+4&!1fvU3MNdpJhbTDkxx$np14e zJ-00FBEN>`(u$Y|452@zQrd50Lz=2g&U3~Ml-90w z{Fy=AYA1{|ajr`QL(6*AqqaM>{~ncuzA z-@U317o1ZUPBQv*HHLxltlj%?g&5mx{%}wLC6^kM>W8@TE+S&$v^2?X%3A=dp240 zfLsOWHu}GN>C>S|Vb$aE<*l$PZ^MU^iKO2YK$+go+n#+^kMUt(tXi+SjA0M=s}=Pe z`a~xh(mYC;{=5w79DVH(Fl3!QZi@ct!Vw!1pfh1ha$J{#iyh5HfC=qH#`3NKC2|NK zwgPOYcUVP2HKw`|!LqyTVSzQ-D_t8h=pNaFP(NPklb$o?*hwV1tl_)}X$t4f@b*}q zIoQhzMJ>+|ofHZd;=0ET=khneUPl_ho2XfZo$qV9HK}@auRm0lN2SM0GCy*Yy>t87 z#W2>D^xe<3l4)k3|2`()|MJ>UNF{{TT|+d+5wdGe&gCuch++2_i_s7L3~c;pO4pmd zsJ!`dpznZ_Mop)nOX)_|fLc)yJ(^Rm084RJAkOJV3=v*`y)b^(v52LXs>Z5b{;i3) zsgZQX3X(VdvnVAV)LDT|KDUAOsoK78FOhjvr}&`FJNmGlPQWsIBSjA%;RwtkhS$98 ziKgi3k5HyRATtmA1`hm~O5T6#uOYW2i%-+l|DmqX|Fntlq3@c`7W-ZKJNjm+?`{FD z;$KQ6kP_E$YS^gF(eZ-TwU6}ls&_yb1LUcOJSpNaq6FUf%8wjRj!|nIP|?^Gkl?Bw z??qXC8`c&c&GFzI8QuYlOsCd=DHcQbKVhCFdypq*VVzWmQN<$Z0&*d*t}h#JQu8XV zx2?d3TP%%tMc;$X2Y%e~HvB(m+cv6)y*!LhfUzS+D*u2~yfK}->(_~BIES9E#I6xr zx(PLb+0QA>p?y~v4J@+%sfjSqJ9S=P!_k!$&}j^M>#eLF1X9SIQ9H#7ELMomfHzhK zz?Z0?A^Qp7#3io}hjn6hJ|~#>-q(w+mozUhW*7KLEWR#$eiHs;UJaW7pY&Zw&Q7zF ze+fwWj!(lGKdYzpsZTgsSxJ+sOI^gMAfV6+({+%pO8s>kXM%}{VL-jyb7xI?wGrQv zxvm)EvU>lGm``$x(r4UYt@MpoK!*&A*xyCwfNnQH@t;hP|{>yLV4qy47fx#yY zK$iaHaC{`3(LkexG!g)$GqY3W2?QppDLiR07*@Su|9S^D9VxAU8wV$H2;l4xz52jC 
zt=?RSsFv}scc75G8FBxm_v{B*p+?Yg$)>R%*@Z51Lqqo~p7h0Oj6y%Ug*K*oB0ZUgBva<4&ImaAx3?`3E-5!~pIVvnve_IMno}dBFKEwlF)zbza=ZD8rEoq^3 zIcDWGKOiq|Y0TW|^(+4RJ-DZWD=GiyocXWOI#&Ft0}g=A2+bnOr`6;T-x$nr*4x)m z+=$7bl!PHwZK6^?fo8m_q5fi@V}}!+EkY867XXMEDz4<82^8K5M7QOQ2{bD5i9Yw= zXw1BpQ2uV_LpsgqHAnJcFp z!jD_m*nNh`%Nmo5c`<26jul$3E=M0U_J1ERK;uUH^#yk}FW&zYu^w@6llRHP3qd3{ z>iT+@iDv*j)0q#D1;Tv*e0#~lyHO+bCo>O3T+^1s-LHgeb>0g6GJJEWqQn?TW=S%% z-4?p(vK)C32&;$H!F;RPF(78FM^Q?%3PfdL99N|>Ua;}`NrcffOR)zO*-B;8<9qSc z9?Iwm(7WpcXTiLmI$gQK*`OCXRv#wLvF_u$UgJ6{B(dD8=+vj@A`~ri66DD@X6_h! z?*R>z-wC^sVfe_P#eWiZ+h6+ch25S4RHyv`)n?R+ClVmx!Uud<{XXx<&glNkGvX;x zVryquGR&#AY&=-Oj(DYawi|UH=6k+oyuQ4KAH_#q4jmE{NW|XZo7LFuhnVnfj&tR93g8~><(vZtQFqmfPu@Ik! zfC%eQo`B6wXv71!sPLfHAU-nVNP_bhn*6&Xp6Rb_HrS{lf)toaqrMfn9^jiuz#vok zmYYfcli?P4><@77e>|dYq9!At8*XrBfOOabRz^x2ET(uL;0?!Mv@9=7FE9IkG|C@Wn2xRhW z98}9X2{Y1GMQlYNKQ9tE9mqMU{?t(Y13Cf#)BpPaEI;~xfnMCNY9tz&O}Elb%l5gm zMlyS!KZrY_s2v&|khm0`(qi$Dvg}e=*0)1LkAIa*5C47JK|?!Sv)kKSu4O)>emcr^ zE2BW41yszASVzB7O|+BhEk>=?qb|feUU1ix3&cPy_YXr`!qP+Q1;|awq>v}6M|2Vp zHn$!c71$i<-+ci03IbSS54j}Lo_mT`03Ltr5u5utcgyZ8u~?Mr?Gsry4$>FoBomt` zalDt4l{Nq&^#oX|!r!s%oc8_!QBs4&^T`9C0`T{p0aH!zUre>{SI_KbV!37-Yz2ec z{J3e)JIT@J5;rFK*-NA-Tn`PPe-g)YWlG(>TloU3M566!>ZBi#5YG;rlNG)maers* zK-Tr9yJZRNurblCLGYd=$kvs)zHUT@9U#2M z>{PMhX}*_17k(mM^bz-on43_6Ys1)?nbk7H&(n;g%l&!HUx;rgipA>FKHg5TO{N#5wkm7TG>*Hlp?>{6fdf#PG?@>-qvTuU2h8+zJm+ zDG;Y4Wa8a-!h~lB5>5Sp_&UEeMVa>4exD28b`VCW;Dz9qT|+Ra=?PyCt+p~Nnj7-o zp{*|l0BM5la=X>WM+vgaw_hut{=6uYIk^hi$(o;)H~Cc2i?^d?O{ZFcjY+$SJFGhR z=_r&DOypkS4o__2sguRNqP-L`@aX}0mxZ(J`JSfqypcOdJyWPAESLFMm-cmZbzFLS z>wP_v7mE$Rfe|~0A(<+7)}WCWb5(2W0g~yOrZ5h!852OU(tok>1LDxGs(~?^bSSvCW`cPk2yvn)yT^w;4FY&DJTnUgJ30;`HpD^GTOu2}?$jPw^ zZ>9{(@79<4grr*MTydIFNVn*RGZlqo^hwXB5VAf$7qGqtvocnmO_q9Mh>gXi6*sOE zH7GAn+M9lS$R*z~;VaxTsy~Y6(BBStg7^divVdtUqA;;Fer?OVz2Wgi_vz@ia25OK z8B9D>a~!CQiziN)ZSaO&FXN03OAf5z^b3jK_9&^~n9T^X=bi5u%9v@ML=+z)RuXlN za>!&9@Fj0v!f&zcGkHu<4Vk7d>C7Q1V+6ZPQ*|51 zEX)jFnlsYsE;7)$gUq^`z6#&bpDWwO6PoZ& zTr+f6c?DKN!t6f7qB4m!nq(q63}3Xp?dKxv;%{7?M#+6H+a>o23slpsM198ajd&$X zwTBOsZgn)xZL+hMJP@F3(=~lt}Y6r)5qMoDOrY8Dh`HnS665DyT3gxYe)*0$fOc?$=oD}&hY;&o~fs+X;Sqwy? 
zPw5B(%TgsQIQ*uBVdC&EStd23TCe$l78_RO*rx`=)xmeJnX~3Pj9=-A(nqi0vT-xS z@0e$AA}OGC6@!+SGns?*VM7A}Lzp>7mal?HtDKv#2z{mL80|DyKl~V8lQ{34Y*Ts2 zp4_SLsCb^{w>GkK%bNW>s-_d|f*M+9A&QDYP8!-pYcfQ%iN&3z1>9evIO9G%&|gDf z&y!6{mkv`Gpch8*Q>%7h&m!ht2*MX*DBiXc=}$)EPJ^Jw0xOR=hDBKOW>OkiiNWPs znmhQv5@zmDwY*7pw{2FZSC-LOOd)UQruj}#WmQDM)*61jHu+v`kk5Ly$vb4MyOeOh z`hH@srXmgP^Y3p%h;QGpyg)yCk~38V=rXu?Ynid0j45RcSHFPZMWA(9PBsG$!th1s zQ)UdJ)^{$)z6g$2^(B%90^$d; z0)58VE++zd}U#FomymS#z%VX&UN1F#j)>;!>@VMrNp7Pv8d zTTuRdSbh<3b_W&f+XOeqxGhh$5h_@64~p1)=o8^uP?ekZBThZ6_WjGM_1{qu@Hjfw zP2neg=9-h-DNrdpf`Ssb_OAJHW%Gfj=~LB#r>JwMe2;5lW%R>?#Xwhe#ges zErP}bc29_sl2bUD^VATs#XAXDiWvYz@sJHS*X%cb9aoRZGiEx!%X#KrFY3vw;D^fC zOfX3vrCH5u8L|xWvpz{pTM_;ImDH@sgr~dm>=X&`U|WvYi}_$MIx$*$OTn5!We&fm z4XbYs#2RH`4w~F|gPaE@GBB_A=Wdrve%`cAwS+MbnFU+nKR_q5VN|H8yZF{pJY8YT zmtGH-XD7pHwSA~t;a!dss}TX$C$>BFH;_hMK~bV!AXqLryr&=_O)n`at19D!faUFraq45$rI6+!@Zyl-WZPV;4K%r6qcwegI)sM7sLmc6j zyRT8!*^MZ3=qmu4ABUgB@J>D8$-ev|t)8j{RJxh}Y-OOw$*w;q1j@pw4Uf7OWy zaQ^TbSvET=cr(jcQ8C1o+EfO{@K_RAOMVFUxB)UhdlMMLX*{(>n2`}H#Nxt$a)qlwC$Fz*LZVgIU{=VniPQ(@KJs;$f z68`k7N$ZJ|@MhU_;kN^79j58*obLx*@%ZFBbk8oi?#;~5D@CQmIJMwTre520H}n2r zzWsjjEBUjM>Dt#q$D=?cInjD=9&89hh|inoPx#v5D16XsRN`Y6!{4AR+3Dr>qgW`A z-2QNsB;gYPaFVcL<(`fU6bG^+s+O>9=RIXGx}Jb!M7s~VY>5I(ppgpBDDTdy8Gjv$ z5IUI;DEazPVvz^x7{h_t@Ye21Ida4sj}jNPV9f&`&h86Wqj8&goYP9rrN!5^YYcbm z6KITZ=}PNVu_Z2Nnu1UA) z%a4z-YsP$e*s&XNI#P-7ae5}Q&T3hf#dWCxMqKHT=y44l**xSNOlG3+>@lwLvw>JY zQ5E{h$ zgFiF-S6#-xqXGZ#(GtXD1Y*O@RED9&J3AF+5{t+j@%Kq2{)q+O(jytOgVt~ zgR$;#X)(SGzAu`yU_Di=n*v+0{l88Mf^&a`GA=AyTBLMd0cn9_cq@1T@7V~jn(!Th zG5#mK{J4^8LE6i0zy}^Wc8H|VNF)OmHsq&|Bav;gv)G{)lk-xL z*Sm2m8LFHBTqq933$n#R6bczheqsMu3IA9biH<>lVHmjUK-Ecv5r~vS#8n;w<ASduT)ghP$?7MFjXck{$UuzScSB z$-4z)gUr1T)L0IV*8;-OcVtKDipea?E5>(T97}5H;)XUB4=A-}L?R&Cu+?bS<69}t zHdS0kFW^wjI;g$mJwoMc3?lJrn{^4>Qx6Y~9AVA94?MlT?QFCmOcM$;iFTf$)1Y&8 zUx2h?cO~}T$g9EqR=aHtpxpi#Kk&S?xyl7!l??ReqfsC4pn6fA1M7tchA1p7?iqA* zF=3xVPl9Qv_-=4i>-i76HEC3~X(^pEKev*}TM|BdS8N9_dxu+8aWPYv&mh@I?NL5r+Az(K63BgzPKYcJ1d^ow$=h`QR9jJ6`?m+!mu4SQ$c((vV-pz5m1U7;_|f zvurWHo)pt zX^C?C9ZZJ`&ipm@rn@!((_p@PZyjoT(%m%s1QO=qPahD>mesra-C%fqYN5L#n9`M^ zA^tS#faT2vixIYC4($-m)$m2$7@-^5kf|!(9b(dlH;UXAq)=5Xo4z_}zTRkZ@|*Xz)sLD~k{{j8Ak)^~{{BHm?e;YLjj;u6j}cNd<&FX% z=hP%$iXpH_8RbrmVzS6TTc(>~e5zflkI%bN#m6eNJIz{-epy&k?M&-?yNc0s&-~oJ z$MK+J+|8q#0C)OQ(sD;DOAx)3C0?rcV+?DwCDSu|2*O;t||1A)Wr8Kf@sl0iSxBA zpDRM^wF3FGF9t2ggNQ#rGwZH=$GG*qNaR^lPjQZ$*KW#*5;7;lp?aS#AtjmSv0!fwa&JuC2LOfBn0r-e(T6 z6@KrfHlTVmAKGl>C{u4?vNe1w+^e35vUspRos|KrvuL-joi*DDcDf95Akst89hml7 zF5=~MocAr;qRws_`(#(f9*{;CDHI5m5s7?iTJBW^5GXuCNsLM}DdQsnNsu2kFjS*+E^_CF>I0qHG zv3tM4EwWH?8W=W97Ibkjiy5eQ1EKJw5KrsItYVp)Z``|YyLYL~p5R(iGQ2HgaY^ft zWc&bxKK~b;KCOJ4!@GS22<|KQDTxpCE`%6iR6%9; zTAs+*HR{reqR^>mz?2CS(v9@Ff9gV?%E1Kwj4t}62RkmuL+kP6#uFsX^1#ZKGT2u4 zgau(K4YvT{hfCF`@`_gTY~O*)X*#@aDn5!f8)+A@)h<;EA|>tDdfFlKOs8acc(@3G z2zeZws~%jOotG@vjj7autclk3-t{YI=nIAl)(2f zrzw7UyIh@kcylIUxsE&Ax)h5WQ7*jMie|@M_f>ml#@CXaw+_-hH|NfTHl@Dz-GZFA z4pV7wBC$eec+HhUmDqXtRg9BZ%@**%lHcQy5Hv>b|;{ET-!yTGLXcEp%@ zwo`x(%pGNZ8Z~8Z1K(x6z!9(e;rNq_d4^OMwvP|gZR2Rcc%_#;hq#9oT(@fqyg_V? zN!xxh!f~%-aBKZNoyP8i_-pCo7d6En+=};ExdNP*aq`jx@8rSP{jeZ6^7@Qu$+kf4hz=qam{xSed887)ssVlJ8@0c-CD_LVv?6lvG8`zg z6gd5W*j+$2Rr~|ixU~b!qEb{qFYT>=M6#8tJMY3I5awil-CqKIr(~RCqyx^L_C?wt z0}zvm%p%@369p3u*?a68v%N*@sn?}GS!tyfase{P&4vR)HCqt`bCIT^bIhNU2bR7B z?0GM0pB>RoHrwz;!{p!(We@Szu#}ir9-wdHcweWH9L^8VSbr&${hP~f;Ob?8RVFeV zTCaMsHt&w@J^toDoT99*LocC%S&o#xVgFM49@wGrhBc>SDOo`K#B?z3<$aLYTUZ+! 
z0mr#O3(z{!dl<_f6ygDJb|BzNvxCGG=3&U1_k&W?3@qK*Dk2;$!xR|Xgp5b(r~=k+ zeG+SJCXoNkICB9H0_OU6h(`(g4AOT2kT7A{8KU9KwV8(Qru~S0epx95fB>> zO|N0kfHmeDJ#^|Tv`_;_5`P)6U9!nQQ{X@}kK2&+SL|y5EnmU6LAS(QL2i322fBAM z7~@2?+ymO#&HYhm9uT~SSZ{y#lN@vs{@?y-uDN?(J6g#=*|3%g@i3Rhptw(zAJ#DZ zje3vTSR`88;3e;R-AKzj|AWoR37FL|NA;zpz$Q(R<#^6_)m6IuPZB6u!w zP4Z+w2x_RF`n??)RRdq>&1(-d%d~F-PcOpKpT>WjvXU5ZV@y}Tey^ke>&zoT$UxCX zWPcV%@9P=Ou!Ua$B#G`r6(a~j1BF)7*b|k}#y7xrt>>gLj+I6pv-NKITZseKYW$kKMzl-J!Rh z_p#Z)t_LJ)>t%$D6aW0P-(BA6+_pGtGa3`lb2Jh}_Sc}@QT1Kc~ zlac)>3}2-QY(;r$0Q5Y?pDVD#CZJflQS7`1k1d(J{yVL94n;bu4yTLOo4fckz7_+p z0?yzR-BFv=Znl|xd{2oMVto-u@EIA7u&p}h>c1n8$xUAwo3#iZo55H5jn1Jr3e^QP zb310NfTL^SPn>Ij48y8kup=;R-r%lULZOK+?(bS+6Mq{3T_5Y)V}W^qQ4AysK+iRD zC2mrab-A=(5g#WY<8h#T%WVC<))8m0ANLQ=;IHNI*YP*v+Xrv{*=$zKs$w1mxO@d4 zz_v%Qja)81pvT_E<-r=duwuPio)Q?HJ2A)=%hih$I}=i0HY)wyHiIvg+KvPSn8$iL z1EuzdxD3?%BQyd}WW|C-A^|*7(j+AnLs%^Y*hbTdiiQT#*H)K^q74UTvmbd1%k zDExp#ZXxR5Bdi4W=?ig#(>GAVtyz^QQWEm$pt@V&4tVOQu62qCEK>VvK7 z@9(&f<|mU)`V=7q`i)`3{`l8+M4-a)H1P+dbswxme|y7ioX8(7$3}R@F9({4&NfgB z!;?vjAtLsSH%b9Y<>nxrMksCr$EG zevKBgbONnjA7Xnl4mY&F@<+iKAbcCAMw|LMbCxsW=tZgbB>a!e4-NSEDc@5*&@|0I zazVU#t!q5MPvVgHlKoxa9aqYlYV_qRNdaBaE|A0DmkE9isNWnR7D)j*MI2s!f{K?l z51C1duK*gxm2UtxpKJd@nSP;7zmB&ZlocUR`(%5d>)rg*sF9qVib;a;Hep7}p-?(C zaI6&K0%_6b)DL7oI)8(FX{X92nM7>oJS_78;^LFqiVec;+?L+q8fn_#ZC{+YHsRXa zca|_*mt3&u&{bPU?Qh zryO>~P03N~r}CkImKDv>froh#!3_nR&OJ#G{hPjnZh7?WVQ|@7XYpsu04U}F_&lJr z2~O_gvPY89$x-L7j{!7rNiCOHk%%%vQ=Zt}Ig@2IxG7n^Xm_5=D8e@!FFyd=Zp%I( z#QC}bM2Aa;j}NsK$BV)F$H{&}^oSvsfHw#$bUnD5W+xvY!-uYh^e$5!xp~lDWdMI! z;rkQBo8zA0tF(_a%v;SRy)<^!5cCtp4jo75!w=8b6MhoQF4#aXXp@!SvK8D)_Y%`D zc4}h;x^vmxaas+#;A;u-0TLpD?mY{MAfe+q>ip)oE@UI$`@L8sBXDIqHD1;-_4Ia2 zPT7Q2tmjnsC*-j~{$2k{L5pO5*+lvT`~n~(Vp*UH$-AU0jT<(rb$HK=B7*NVy(qhkVzgj2Iv@PI*s_fO};O? 
zBEo@?g`y3}D4IC~iUiKSZU4M-*`XOH268#xv2#7=t4<@k5f~om8zU235_MP1rS>M# z=JY7OB$Q}ct?24Kvj@RKm8(tUfl0YYd0h=nJtg_O#NQr2c?lHW)ih)`Xnoka@NGw)DK9p7D}V%C5sB)bH;tuU{7&551wADV{=Vw3+qcC65kup zw717{R;>HLmj3H=C8zgF_gVbbZA?6Lu_!$)7FL3u=rHZ*pc!8|rcC8U>#gQ) z&aZm`Y&rpOO}{st&3+DJuY7xF6+HzG1i=70y-wIqh9j+h6fi@zHL%=JkEht&_69K_ z^8NJMuH-(nC9d3M{YV)cskE!qoufnU*@r!ZtjxEF4-qRbu)!aZ_rN;#e?BJOcQGqg zyCof~E&q)Cl}Ycku=+?4SI)z{LWbVEAD(2e-~HlOs2L(0t5;fZ`E)j;%bQA;6xX-i zsy13o^E0I+dw42zbHmzDvSH8H;q;d!E%ae{{WZ3bNs8yk;?>}OadY2mGKQ$-W6h-* z2RdgNmHcXp7Su+HJ7T_qVE4_~o!tVAW#;f=1^btw#hn$)tFh!f5j*n^IFCQ(>R+37_gbxN-@T@@W#&u-dpe}w zQRY)?=eVZN2kL%d-6WpyvAUNKy4fNz{o&EK-fp&#@*rKV`hw}iCN+349YdU z+dTJH^E`K&&?Qfc&uPVCr9m~4ZyLofg;r>Dnch70In#GV*T<0V&82AlvlJ-WMwuYj zovh`C>74VYL<3~c$!g;=M2f@fV8bfuf|Y`Ei%qDjw8y1L;OSUI0S%PYR)J)&><#|< zz^qO;bwn|oqUp(sza@wdgtwWg!rw9g{g7`YEh(%)mO9@a^uSU*%*=ZRLr*QJJ9`kB zWVpa}^~4Cuv(>nR#3AOcK-b=*-|VF*G*M%4#@I%6(#NMnTd`{#it zs?VCIHpaaZo2{^lxaY;tQV-@&g`CiPVIWl#?bt^4X}lc2^mgF(Rhbd^s@!n}`~G65!S`hPGEg^lr@sEVaC#c4gglF z&*s$UjrXXvNGm+&_gB+pZ)s?UoA(BG*=JN0tI*Y%gpRUpg4mFObvp3aM1%g0pWrvt zCQ|jpWgjNz%+KE*%;id5*vgvA-0$hhb^oZu%JLI_%7~H!qxpkEy-rNwZB02gjZn?< z@9z}}&sfr?Yl*95R9)vJ0Z=)z(OeBT`2*AVY({Ux(}@w)4=z@+ClxUDH~4Zb_}){pzW@$Yus-<{+3 z$(4QQe#Fp)Xx>ny?Cb_NG$%jFPvxf>OW}okOUf$l-MUF3hk+RDlQiv-RG~CExG19b z0iGODr-%-&1B{_9jLThDSur()U|%f zd8f0Qwa1qK{-Xt^+Pj@#HGVvMYGwh}08a*nnpa=7M`m8@-+ii8DSUl~gfSH8iZOri zB6*_u_N}OQXF5=&KGG&X0jgsx9{|8}3+B&)&IY7R7|GDxwpq(+k)XRLXE?of4W8Lv zeK6X5elFh(RtZCEv4H1gK_3Lus@aH#XJAsb<%;Hx7vuFjTt1iFdDCpqyT{!n8mLrX z(dP3;#5b@C|K5SBuQI!j<5hh!nSBjC?g1DC$@uPlm!>%3$sO;h^5((IPiQOETqtcK zRpT_TDV^PyKFjz#c;E7l3iHMT(ab$^aJ8wwP!8(H%d5p_%lP>2TdvyPz3(SNXYL}U zE4QjFL4_-U19Mq^_-QN!8dmq^;q7te9#gNW@YSjjGNJGCc?Y3<10owiCW*Br&)w*Uq7!`C=oks|$5WTe=X^6V~c7s$Po@MVgAXR{y>% zm^{7}akit$4!j!+=>z|cqTX!-=5@i?A;a&}Aojb^v~>lFgbpp$Dou6NB>0S<2$e1` zQah_&@{YOM-6%a4K_^bX!^4#)9D6w1crk!#?DY~@vwLWPPKN;4z8VSQUNP1UQF{&g zZBv#TF?d4&$ML#26=;~usBk5fiX7fwC=3O()i)fO1pr2;KJx>jx}3dli2NRfKUE6| z@fQJJc|f@e-KuUyrGiDVJmN53Ec|V>&HFokjQelcF}d8U#^O-2j3mE#(W7ffpTMxD zW256UfugnA4+uVNSbuR0u7#*&f`04W);OTI1D4%yyHR>eZ=b#U0@7+c|eXho>&|PodsCyo(wm3_EVsxc7Kz@FkG$NOnE6E4Ap6 z)rd(q78LYoUK6=4qW@mExOGog#Eumfj_mKi@QV9yk@=bkU=-J( zbvE-UwjUJylN!;=_}KLvL>;~s1P@cZRL56C?LE|}+@ zj*q*%v`2+pO2?m0{sG}oMEBs!LDl@pryOgUQ}NAXo?s%*3Z0pf<#yG;Q%t-zxr`Ra z9m;6Ob>L)1Dxn9WE>CU~p$+HPrIap3JYhR`;}mPgr5KaF18odF?Da&EOGEI9N|Wn` z@J}8#y#a`%4({kt{FMrprb^PP! z^H!^R$}2?4-ocoJk^nb%$7I+_@9BQv%WRqAUTZ$Bi>;NmguI0F@}IMH(Li9!dWI27 z^8@mb<&3|7u#`U9J?_|^@)z!p5F?6s^(8a%j;CYu)@xV4S=4?2%#h#kHGgK1ycieE z7}ySznq`sR8Rt^*(_GS^TZOK4&I zDS(==K zUKK0_idKQ{Nh8S+gW|d%f_Rh}|5d|mzYrjXFKd3it7d<~Ehfut2MD+k&%ti{i_`a? 
z66E7ug%I`=;VHJI}D%a8D-UV5n2eZMq zqz73p1^r8qS)LWR&yC`KI_cr} zMY5O0;VdwPN34gBq$P#DEMJb92Wh^3h*xy~XKe5%o_$6HQUOu^c>D1%qx_v8Ma{Vt zU((m;bnS9>w8FlxQOz?{80px0U%!-7^qu!3(s;b_^o(7+3jxHZs#oFEAmms05qK5o zkIMh>sNXc-7}<9Q+-3i3e+DE{|K6YV{=GlrTmQ2^6Z6g9Ox*l56I`XE=INK;_hpgq?W-o2a&xwvrAQ8zHBS4Uo{Q+`tAdAHaRC08?aP?-$-`r`L67 zv<7cP%DOIbF$+qlg{lY*_e>dA$ZI$)ZTK6Sy*Am4cnRb=9uTiV8L%=-D~oTunxCNh zgUvh0heUFQzVtSU$uI21ubnVpM-hPruI%sodlngkX(b{|gD=P0D&D@9n}g*@Cp@uv zYPlohyM2b{?XDm6ps)T;mBgCx?Hdi*O>eb`hj>+>1-*loh_s;tNimkwSr=3XBV-D% zYt?)%SC3?9tKI!>6InxXpd9h>?(iL^P{HBpnWI`N)|Z=6OTG&bc?F?%M#0{SBH@##*U!B+-937k271}F-wrc;zRtvx&Xyt` znR?PD2?k{%;?Z(7b-DohBTlufEOT!>JbKl6 z31Zda11v*$4b%eSb&G4UM(K8-tyi3G6jL}t-AY0MBv(LAn{!b$&muiOy zuO@BL8=sOH78;&XNdSq;34zOfP>&|qE$n9;1Ta@lPy2vnFG5F@^alv@V=yHPEq#Ti zJb{EcnQ*nRdqCO?TKC{{KTageoGTdWVcl|uYyN&={te3tc>S;JHou0h00jx8b`9I> z6x@^thg*9y1-sscW4>cbEZEv;vP?yqA)O4T@w#8ZJdX+6`J->Q2@iY@y&;elQe-yiI!&p<4Z#(F!6zTHerR^n$ zL}QMXKo!UCfs%Gt2c-rH@2dsy5ucTwg$TfTBO+;LxU=qUKSSSO8y76udTICZ`yLyPd=k>33Y$AS`|AbWc?v1RD zylQOYoow+4v=g6x-Vn#Mx0YhurlTnS5}nKZTRi@3SdlPm;VhkRa!cUhg6lbpt?jCo z02l065>_DT-!^5%>G^lOVHbX&jIi+@zIwBlSaTV z>5u`frT&XLp*}1hm?Wss;C2YfP7Vk9%~avgL19Tf^0w!Zdh)0I_{+m@105;kn0n+- zE`HPu3HAiN>lfK-;bHsJwcq|9!i2v_lH@?Y!Rz3}Hd$aiM&KRMj9t{ePpc+R94G3n zA8qW&^W^yZ#|1y_S-Z*k>FU~L_0Tie%&(iBlxo>~hD{fgr;@*|6_N`|`!VpXm+O;- zh>Nmd#Oy`b_A*+s*Q$L!8Wtz4clsBb{5B1 zG@rMNSOjnIo|iPzkM{?-{=0}?;`^|Y^Tfsmh}*W0Gbb^#AC zeS^yLswWZE}R%H(_UT=9Xh zT)=vJ>2=#3EFUUozC(q-siA4|_Rh-s+}M{DBXK6f;Swzp-=oA3=ZcV^Z`QJrZ=~;s zhHvt+NLakns-Ve)APYEYK{ZO$U-9BKAL+1B$VF>@B|^q%LR{zA`kmMorxs@-MH66w zZ_?Aoh0P4UZ8y4~(Lb5oJ124;V6u}eGO^DaQVdim=LxgNBGAvhn6ZI|Kz(f-Z)<&B zpveGR)i-UO(#w2I5|UM1KyzwetYX}-@_g;^Hvu-+TeDcd7;6{j;{4Kc0R91hKqhw{*$J@C_=FdST2s@-R){poDd`Fef%pWC+wf+VF*o zZ=I2I_x?ObA!2~@bQEy_iVIUN4uW?oz}`svFFAg*9C8&qTpxbJ%7J7{3c7(KNf%6*yv59E*v?$ zONy{nd8$(8jv2tHWApz};36QlbYTc6Z~^#X08TA0d)y5#H-}{2$^?;QQ2a4@jow6qTl z$9uE?G#(BES{Kmztsx!$1Vw=LFhfLH!~`eh=~DdCHGl9WtlP#S!stiFoJH1lrk%>; zvh}CQ)-ey{6b(RMfaC&i#4iXPaA2kTxtBw#1=z{$(ajZ@rRoS*@KnC0`@v(FE3g^h zo2V?=@w~M6%AbsR8`Q^it8ayyrw*jW8z=NOhhoejQBAPFoxzct>}C*GVji3n@&Yv5 z500{tAaSC1v!YWHf?`qswfup%`A`ph(3?^eay7fE4 z1P^SOqCN;C`1G+ud4M!#i5HU+Z+c0YH%h~a3lhu*#n4#d@>L0iL=tSwf~OxaxLnZX zmB>bOEf=K!x(L+^ni6hl^inqr%O?_9xx>hU_!<7f#lU-T)1cKXWlG2QZK1)UdOohu z0mEf$-Y>g^GB}(Q8MkL|P5cs)4^oo-MZZe?@mdM9x*K=lYFEY3x;64~fXIm-xntTj zG1jxW#n=m)hb()&!pjG{qHt!>T|~Z|d(Vtqal4xA-)Z&4&ac^MJ{Gokbz?M;rU!6r z;G~g;zyFKm)(M1c+wEwi>^G^8C|Hmr&_I$S=t$CZe86TET%kG8*#i>htcfZ?G; zNreGvknZkArF%$`k`C!cN>YYysR4muKuWr#8$mjxOOVo`N6L3`-_PFre)itqxBqy5 z?;p&pHO#EMu4`S#d7j5{h!y(U$T7O3mzVRC(cGs4wDLS2M5j?e3x4?5m$G&B<&9|` z)L$u@-j20%H^I)aJCL(AL{d)|_$B-JQfho_-?oYHE-2wWzznNPSdhUBnleL^G(!xO z*u%so4|_t?3>w+2#g7L0mcY&uiT5w6QiBBU+F=0SM#a-jfGKvydM)JT+PA9N^z>XJ z!Hg}1=l+<4cS}4JY;G{n5_N!arSMH-;-&rU<2?e*L^FktiX|rj4Fl6Q?I{mG86pm; zep~qo-n5Z?`QjOr9-w1~eA1a|(yK|*pe#aK9+t_POMNtu>6%?2KcoWvuiSsBd&C3%C2d$pMwuO`5>KH~z2;1V?f1mkG!jAjv~KRUi$>VI(Z;!Il77@6o| zZ=R1tOMl{v`4(2>RrO7(pzY_A-l;NHard?a3C==RHV=-6ckB^^>+FsMnSWU+j9AtWFCckFW^GYQKKdq5_3J*X@Kl}}%BLnIgSSAX`KUR?@ zp3~_l$$pD`LPj=H*esQ`Fmy)Id~w4hKau>Y$b2W2daun?S=m z2ZnFJ7r{3lg22Yt_9&gQYne~&>f%3-^3%!TkcnuzG%K|3lSoi(?50>(KVkOUk`G2i z=|hE;pI;(aD@(3DZ{A6|t@A_y{6;g=qvEoA35ZhhqLU@aPYSZ;T*#O?~GN=*a}YWLZXS{sGDT=SOo>o4acOlOegx za;RsxP4u{g7GZ{*27D232p7z5fQZ<>0UX7%K^WRjZ$G)Wf<0SN->T||$FKZ)y;}+7 zRIx<$(64IXe`K}VQ8!64Kyx7p8oPZw=UMgG?32 zYap?$&4KF7?{ zVpHZ_Kw#y*kqx2%NVV-{MT|~El-OkGwsJU$WU z&q`o#zPwnk`G=Jdha+>XXRH2Svl8>aXC*`OxOoF4!ERM~={voDkGc)3t7n7xW-l3$^?GDofXWZxTZ$E zWzO?wn)s}|Ue;&rwO*J-icgi4%scE#{<4q_JJIo+V=eFS1o@@kNWL=nNP5fHpqI6b 
zss@m05Fx;A@xtV>2<#L4|34k z=9Uxid#5pGT~8jlToXrae9dsx4EL%KFsUEvN`rk3?jsWqf*8z>Jr3{hV*iz{dU;#v z3yp9fa)y=MXk?$iLlI(CEz)k#gj$}OC@inuq71k$4SHniTn>Nd2fteJm09zgTwXOT zl;^I|{t&;ciGS|pN?pVt>JH|?@}W^2M&5R=)&C*BL`{F3&#uVz1j@xvcWfem?56R7DFZ_`Rc(wJft9lNRuNjy*0R zUdA@z?ZU5tym#I(s(NM1Njk4=-8SvxE|;97rlW1;(CYY|94CE-OQ|W(D=rp|`pLR^ z{gugK#z?Zc2&GM}lEQfqsFG!Utj7iGnhm)GR01+*(qz6OU+Hq6RDD9muX32~Bf6)F z0jQmGv@NaFZ*Bih=^az2Z;;^i)wHG+*v$LC_%%0>o8ll+$3GzQ+lLPA7|7+(wuVA4 zuDWWW3G49RL}h6<^4{d2wL{6Xv#K;1tGsW-8@63C(Y3yt{XQ*@MuDMko-Y4uK_M!L@AQTY_h@v zfTBz?^_&0{6(9dEC<^y^GwD2buiZOBStqNMdg8})XsWT9&hJk*IYF#4-L$bm)JLf` zWT$62FU{inziwW*Crcb)Dq#bz6{iL9HLRXb(n9FM66Jn!9F+mgalqlrdMDm1=2z53 zi!Acya60A1faGx>=OdNAaJ2$bT4wLi2ZJD_FKt;Ko}^5bujcw#%e(EXFLE>Mwe6iaFLUrZ5s89F&)ADBKY1ALMbmg4`5T=Vz; zl{8qQUD}%C+hki#g2dTWzha2(&S@LDTx&U|IG3R0ZtXMS^N&o^AF^~dLH=z z{U+WM@QjFpQ~kgy$WAZh(sQMjr{97(sNrNVhFF8Iad_E0@&OkM^jI9RalC>&!2N*g z$31P$&O-EfPYhNBK&w?02etsva~%B02n#X+vjwWyzYiu}0prKru_-eAeISKNA(zk8 zhu~Cah3N|+3v4K`IdBaBxp8o;MZMin)(-`MwZt~m+87`^-2>C4 zYRf+KK|~f>Q?hgN^=qW@H}@HQ)2RfNFALMldZ=orK(%JFWu>9cds-e!SOx!J`rIuS z*He~07_}#Bje7vd;_5W&4U+A#wU#WuV6Bl&P|>qgZpk*PnxdEbuP1vedHsrv;)<6%pmN+2TBKYoMs89c z?-wyuL$k$RL#+4|ncGC z^cNQjcA}*tdh(|~#a_+A5Z}d$=3QTkv78lGYLZpOrjagBoDnBm8m}G6 zixqZlO}{B!k$6%OrGwAC_h6LH4&#w(&WRd=M+NR&pp;}b2Yv1qNlv$!pxIx@0Xsbd zQNbKVJZ~lz20oy02Vjgy=0?X#-=N(JXQ$ytg2CAQP`L#;G2ntO^L)!Yubq(Q->tUu zzmB%9T#O*hk*BPEEcrO;{kpEf$v1|KgfF{;42vY=z8<| zll{EEYm(})SNT;^17FwZXRsl^wTYVS>$M$J!qlbio>rcqCzZht%faZC^Daz87)Fc= zk#G2PK*Scm!}(%{!I&P4+W*$;k~KrPCe4>jB{^;-JE6i~s{}SMGS@RqnPLTfFvi1G4ivr-E7NkF8-;8 z_!o-7RTNS>Z9JW`FFV;zB66H?c~|?ulII=UOpY}1$(N0}1FSXL$8uc1-fm}WPHS=y zf7a^i6dg2--hw%v&~5}DWCDe0@R71$8=epP66*QsE*WkZqOk-sH&00PoqmdlSnfU# zF;*~d{=Fe(0C(5x1{ipIS|dg648OTrr|R*=E%DFl(g&_8W224+N%90=$1P&RI4<>3 zVOOjHq^R(k(ZE7Q{H?s>#efmH%?6}64(iPY;WaVj)%Fq!Ti-D!RDbDtiNFwsXL2Bx zt0t~cnPP<^AA*ZJ==sNc`Kc!a^tk}9%dqYm(rz0 z#({Z|M&3`77X#l7hq)M91T-yapsAcr7nBUUP@5s2SHwODp;5MqxI*lvCBWv$B#U0M zuTRF$dMS*~t3cGZ;=S~9W@_>UVg<>| zCLvLK#1!gThJquAPV$XMX@LRRu&Tgf$Vwub`-p<0Y#a#2B zOlrjY4CM+OUogS&@b`ij5n;?hv_s}!e?~AH^`yse`%eML*O}N2=G&RyJ;+n&u zVSzN$#ip#&u0YpiRJO9{ew2i?i}ARIuM|g(s3Q$G13AP+%UuH_C&BbIoIMeHXC*N{ z+ZX;TWvDV@q(PA*`AuRKzq8sV*~Msp+?CPo7YfZj9>Qf`L=PWRLx){kLw^csqD1YX zg3W7NY@rKsg|8lSVW-95Rl0XW#}8#U^G{2AwbV^n^dA$mN)-N`ySdBN}xDhM#qb%!4ZZS zf+6NRb5(70LB8bF38kl1T*EsMBO)gf%$O5%A4H}BPJh;9R%1C&A!>3qmhHYoRt)%HZql1S}UdK08(LT;(@zt3> z{q)8ZkbYj(rI$C8!JNBOAq(D?Bxf{i&KQtn7?}XL%B|ptzpG7GDZpVeFZG> z>;^da2{wc1Ml%nEuRu?`^tkh9p$$jYs@IiispTb=;F{Ex4{y;zEW@Dt(8#MLC^-iA zs)~w9$gNI?3*meQYa ziE5l$>1YOmJOGuiqqhTh#rIjiw`$3529#9kTjn$KNU9GyT^Mb_oOulc2Z|dq->MA7 zSKB?W+qQRYRR^PKOk1qBJ;V@@LsMWYt}rN-=e7otNj;Ki~ta*`-_ne zr?zM7+(-@a0xwTHm!}#M0}LpIxAI^u0GvRYDgn48ho-3VREfO!@t6agGTPYtr=3HB z7hAx2VOnZ9*RGQPL?4r}<*Np2w#_6p1ja&ifufF4u9AWS{TUkILC3wI#izd&VnnFI zS{@P)fDz#-v^t%J$wyFGQBQb@Dw@KPZb5f%(MDW%!Lpn~KsmNVKs}lm+;*_onZ)fG zX(3{Z0^R0gr=VBqfJ%b3vVI}}ZPeuC!LzTmFiI09QvwBZrW;5p<-vt15OkgN3_Cao zN6mm>T3Z7{(4C(W4D95>5-KuN9GR#*t?RihRyf-`|5fw~q z`?3JIyF2ZF`@=BG)?^UOw!1s%iG~nLc?l=V{~s&={=>gE1c|RF5f!OWiy|o%BB8ky znO>%Nk*ux#t1yrwBAT`2Vxm3Xij=XSa8=>bc#uvy)Pt^?Nl}GX5G^aCvbQz* zZin{nd=A2Q*KT5Pz^x9lk(|Lmrjwp6FRcLVMSIYIS6dY}?FJeD9bQaRcgx0$FZdXM zhjxW_ADfJ=|0?~8iRjnX1ZHone}9l-m(MK>h43lM0(kG=L=>>7?N7T@Te2*FsLRVY zX*cO?;}T8W9k8@{HPhWGOt}8fckuW(IJV%=CwBiZ{`>!@lmT%4q_d%*xRxmgNM_MD z?N>G8?CPnn3pxRxz?!&J!}~uW#RKcm-1Cf)6XKE38|u{_#0_pJEIKkuaWEP|T7P4%*22fwf6^+HT3{ZehxQL~s(Zi39*xl@9))t6X}4A7?U zTx!_(vFOAc)WeuCNZ5V^MHZG4K06i4(3VR0u6Tka!N&`19y7mA$&CMYMt68_iOk&|OaiPB=wz9sL^$0rmF_O}p{I;=7ga?i+ll 
z)LUc$%NY)5k*nXbdx@Yv#)pdomlj*a;9*_2vffNKLN)PIR8R8H>R!EnQ*@<_@tP)`&{fnqjO3S@0ngvB}bzoCSbu_~*u zd>`ngy|;?eGy)V|vF#Qk7i48vC2jYO=~;c_rv^uZGOm@8Bzdudt)b(Hg>gczPtA-; zImv{t%yQeAk!L$mXJTA%g;-We8i65Tr`H+p_%kQVf0O1RUtHR#XDWOqN|~V=qYMhF zOM3iFn~5*}uzF6Nme}f@s?C?23dJPIk9N1Uo`etW+L@PZ!Ock`1$6QN+cmtnov=dlaT*@e9F+ z#i<-V;B0iMu-UZg@8M&?MnOHwp$M|U3wYr~^lE-2U1@I8aFILpGuqiE!iU$~NbDHP zWG)vqtRk|TdENa1M7g7j+!!v^c&4=a?w<4VGjlfv+{GTcIaJ z%{!kknyb=3+XFekr6ZyT%*fehjOXF60!pdQ&hB+sfs=YbotKjz!(S`(J^^Ye_R5(u z`f^W8O(M8BbqDk*xXlZ0xGf_!u7$k zDROy|BRfhncK>e05dcnbSK4X6e;})A%E$mZgJOrz_p{GBM>NI3JnN&9Gg5R z5`1wdgdI2gz3Siz60%4cUbA_H>_P1*Kd9DTI_GBy8L#nR# ze+r~~VEq2nfXs;gm5Pf`zX1tuB^R2rH^Y=>-{K7;eLgU%n(*F;?jq)bG$LJzpMiu&w^afW+;QsdnkR3E;`J z7S7EHuBuONe}7jJ9;4$0-R#ZU!ptF+jgQK)a`&IxuVH-a)`Yk!&iDfIV!X_9?Q|^d z(Gf~*CkBQqtlOU82a>`>fktAL2raP}N|+}woVlq5?4FMMD4YB1lY)J5mSU7MhG%2O z2G)%U?~?=f^ep3*WsFYk*PjrNmByAXg?Cqe&?5mI6w(ey#10I|#2F6mC$kG%EWa^O zLC{1=l(9KK(b<=MO5mYcP@VQu+6s8=y&#YSW}{ZUU++<5e-iCJL%6T~Jk*kGf9|2A zA^eOXNJX8Flw^#5xwlx&vJ!)2GGO@8-Dxi9Z{d!Dqv0iiKYcS45 z)dEW&EixlOYqq}B+8*~^w^hn;iK$1P3X*Fvv{}?uR_5p2$Oh9dT8|KfgE#a0t<6un zTGN?J2iO|ZBl-f7kv{^=i9CKd+AK_Dt$@SHne!+~{9%5bNB-;2XBWS2Zy&SQhX)znQ54@^;RNWaQ} zI7gE~d26e8=oLwkn3vE?*zi|eGX0fr-U=ajJ{!eQtO)eyAD+g7M9D~4cYm2t#`b?L zuNuD$;3fiZw8lh`59mB~bz8}~CqBtzw2@j-AfD_30U%rEU3&oLLfYmG3?2`ApeMhg z^9Zhmj#RrvZi$XKX##T(G|3l36wWvjU0q!r#|K$H^xF#`V%Skpz@Hf2J_Tg%Xuo0i z=GErK^YBzSv9u!Ft|J}aM8sV;k-i{jh+aaP;pX|HoRu1be!04KxJAjqnyY@YY#_RW zu>(wyGXs$Umal-QX(uo7^W;OXualTiMa(UOW7{j^+o4s_U)WE;UQB z%Jr7IT%QXf%rcQujuSl_mCY{*So$}8hn1Qb+SJ4$&6qokArek@EgH{puLV zq8S%nf8B^ijyL`@4jCMwOefgz7t6X1ZuA!)cbme z%k4rS&m4TEk1lmd1X=KcQjK5N%WY&g2ih?{GH}Pv>Ocfnk>oW|v|6k)x%A`FXcD+v#{3y$o(SO>hPc#Q_#@h|%_=1vAiNM~0 zdLwtcLe@P6KqCs}0-9eP#zTNFERtZI&oGmdRp8=LhDf~^)Q(vBSYXJekSr5;P0F5cl-ryLCJsRx6)$*=4oIH-H;?w&Ts7x9 zG?VGAtc0x0d35=wnqn2RJv2p7h=hI;?p=Oh4}OtnGQtn!yO7ohqA#- z&}Jwne}TI_AoIYyTuT6g4vDW@W0M6}qRDgPlL>zwc`lR~dQnKfQyZsb@uSDLb@(Vvhf7IUe|(_Z`~$1WfO-LV-0- zVAE*k089|_)b+$+y=J{;m&yzn1U3ZI|7&FB)_)9iCGEj~y*^8ao2EUa)9FD|x8pDl z1TXI^oFPUchQ*=IXNCR~o}?JlQ+AweA~@U;uk4C|`5pVBSan`L4+@nPOkC1|a?^B?Y#S)c-{N!HNFILld{9E=s^EZ7<+Kp)o~LfbUyEI2ytoR3Z6OpyT~@q8c3FhxOP1v`=+ zKz2xl6*neDQIG&%!ZjbW3%)SD)~Wqj3;nf=k*f}HZ|;#7b=AmC0u31xTv)+`9*!9X z2Zun7w*eIS-;%=rU#!1EOuW@2HzfQsdov9{=J>oM=&8l3&KC;*Y1<0Pq^wu_FAIf% zhA>JdSV>9*HMU585A+w=zy9vT1xfgm-s;d7h(p!8Lx90ct8NydW0cx(kJ%1vKA8Zw zihRGvT1a27_UR7k>de9UVQsAF-}I~ zbOMFge}>t;#zb&2$z(xS+ z#$%xN8cWs%HFU>7H2w=)FE;I>=5`-nlK?g{^1%m>{{q%PPRDKEbXn#tjzsF}FA68I zTVUuA#z~ewqgD#X+YVtit=qFm)FffX#~gIsec9r1Xw-dZ)L>=|MeQ4wl)nz!m&xRZd#nZJ=s97WSi(8^=^(eKvdYZg zT@_Wb$YJD|v%l)L3&`2KpCt=GBZcTFG-|`^c)pz|nntbWVFWP^l)Y?-Db>8%GE0OV ze03W)m~tDXigNb%sTv9<+^<0e-Eu6l3G?&A2#u+Llj}(!|K?l!Jk)nv5hmK*g5FvK zwn`+_a6FSH|GKPqP&D>cG>l^_XqMyI_?Ee0k~V1(K@I+(J0V=~*p=guqWX9UO{^$C zlKP|g=n#D}Z5N#VgAdxMg0iU{5%DoBE%)W3ToJ#nJvg77rQR3O=MtGt!(NkoIl07j z@Mw#7h{f(TkCu4W1z%l2=ZT)R){ot5MjEO8jK^V-^eHh$xrS=SyXq>O*EutFjyE)W zFJIQNug9;sbx}7lH(%ah0QxJ%V`gU_dtKLe-Q z-b*xmzc&5GZw;~o)uTHTE!rhPCXj2G0Gv3P<;vC0j)wu7$dlfOzEM4mW7d**u{aSD zp}giDybO)T26()PS7Ou&s7WfGVFH83%HZyLgh9qrCK9eOAmP4$zKh(6tEd2^oiQuo z5%t7lKDPX6YPC*EA>>lEW(A~;-i?XyneqZ(X4+P_0gcaIRa;pL$EvFDzC#t`Qcy|I z#K3n|bwT%TzriF>uLQ-NtGpK?-$(eGGr2a~$Oy*$G7+r)dM=rxZ`i1K%3KI|349-r ziGquU+b|`Jd^Q+ey)R>*7iC6r`a?Y)j+J)xwuO8<+M4s}4oa<$v1mg(wv!@N6;Q@t^tQBCZmZ;Woefk2P z_|#2kbfm*{)?;=S{66kY?NC|t47U>QZXyo_``D!^pR0*iCdtJ6%1P>Yh*J2lf5O)U z8?DVyUUpJ!!Rsm;_)>d84Ef-@d&TRcFP7dHk&&pt>z){fy@sW>zHBNzW)mSadKq!` z@IX4%i6K@Oo-wWCgRxRL-~L_xf?;7f3QRJf4GHB_LP_CE2&%#7%?JEdSj(EzS|Wg| zrMsD}o1u6-NA^+0&be*4kh&5TS%LM~6%Z%`lkA(i9KqI%t{IQ;J4G4HvG-)l9b2F8 
zcehR}o)B-8?Mw@mwPuKM@0<#aY}9!Xjrs5~lr^4|-ykO2Eg#;ToZ_gav*AB=S3-Cd z$r?yBja869td%Le>FEz_OuFP4_Bbx)ZXaMHKQW45^g7mm89c9fs`P^qYAq_!cEf=b zAFQ+cij4>NqaJg(t&&;Gl747LcsqJaGZRotxp%HVNwpQYyxB3^%%{0bA~iNaghuWb zre^It5NiP+8<8s?Rn=;oJlOu+F-Seg>=}4^KYELumwO0@}^7|N62qhR}O0L^1`t_+ftjOR; zdgd#~;sr%VYqEGDG7#~N<95Qg6NQMwpI!)Kb#Mhm_3f^_XjSPH9Vl4kH|$AYOPUz* z1kF$NDD5cG^(GwWm#ESKi6}TXhS#B)^lVf^WZXandn|=3Sz+vW5#b;ZZvL+)9`NLQ z1Re=V=@&PE26&)o*9qf;9&{jWW4MSmBvKcNg@2{&7)WxkiLpz}H3|nuRR~rpOf0qu z<(CIj$kYcP)Nb;8tLj&W?`gI5CwSjFwGHXXqbmkh9}?_Dm=?FIvo(pyut)CCVNZ56 z1TWyO5r*AL`x0v*8>6}C-vA}jP6!4^MGl3*q9&|QxVF$+wP5?aZ?*j$wPSI=JgMD> zwc)>~xN^tiwQ+kB>NVf*wbssGR6UJ+(nikBGLn#i_v_f-XH>6DaT^l*!0U8(RDWUF z)$#M_2Ua+aawpY9-K7>CHZ|*#Iw0C{7hl$`_7Wmdegw*yG#}D5LKB?qk+K6GpYqXB zMzy%FIxjwcR1nC@ltxOQxiKsH*EEusJ6Sh6jnx~Ucr?`1zrfhcnU;I;UMC(Qxl{5n zd}|w#A|`rP!W+m|`w4)6iNw#9S%hBs&lrAvP=@nm32K)fjc@vC!%I3`4=-$dyN(lO z9=LCSOdgvDY~Nk3(_(hc{ze?^RXYZ^4=Bg_IHu-e#%_q}Z%7Oh3|!N@G^yEAkqR85 zDUa+5>0_Y;lLgj_fj$ zUvi9j&?W0WJ06PA0xLC{i*+)jm<0x~prR1h1}Lu&0diCH$MmyYy>SGGw`w0%RA~KusX+-K#|%+ne$G6U zn!1KuUw#L9o`}W^E{DBMAL8PRt{njZRRh{IxX;j~_BxPYm zk|y%uHPO{X>`5apX@*59YK!;e?P9;b3%AO@u>G-dhA@Bvx@pxST|AK#8NFnJ6CX!y z86PpV4qE8jZy6qski!dmz+&2@s9m#9ZuRmJcz`Vtp4G?mHZw$YzYXYPM$ujTt zE1J5~mN1a*j^U)^6z#TDo9>aQOz3F2==}P~+7}>krF#2K`x^5o0ud^>iXZH3`o%6I zMU`u)F!g^mDQa)TG;HC(@^+Aw%k|1fW|bwx4y)~qi$bL?!-WNg-fW1SEb z+$DF5>sz@^Ez^rMMM1VL8%LV;vc`$W$qSgGn(by3J)l=%sxzLWE}CKb6P4>7)h${_ znwT-15Vl(;FD9jndrgk(4iN8BO3JA+@&L{bd6F$4^cloqcuk6O`55rDqb5+v&`+7V zMkd`l*`8DgvL!r`J|rqhfF??kCMu?^AW`$3G40zv*kH>AI2Yr8o(qrY_B=r4!let279>V!qQe{_{)JTms&RdGpYKUi_sHn44;FiVnV z0fM<1sQ8z~X7_ygCl|iClhorzGkc#hSv`9L0> zXOadxtG&S}*0G)rLKDs6YpDrmNo~&OB`s;U19k&5A{@hOZwhRy2ZW{~yQa$zHAMKz zjCfI*mwa-&W-*PyNJv7A3#%{<#rS%HWO=Cvhvi;v{tZVJa`hVAV-mlv(@%rb(Db&+ z_EWR3Nnp21&Am%!vkIRt^u6#Pv)(StT{DlfcIfY4YpaJ6$m~rIuBZqv()!t_kDY|i zo{Pw`egSt`B@!Sk`7n~P0&&B3P1TLu!<%i&zoL|tO*vIKw#F91&Od~mdnY`v3wtWQ zU0C@bR}XO|sjY}6`{F%AI;R;3Bu=C$8w3RDqB8%{!bW{w9k-mBBkupWA@FZ%an0a_ zA@(Cke$)N|5em!&|B!0{NN8byKu1|J>)ZanZte@OqIiRE1%OiLR~x|@N6!O;GCa<@ zp_@U}sCFJ&6k*S;NaWH2cdMm~-&XpHg#evJ_9qc17V^$9W1Qn(3o@I_83(`D%3Q$e zaN%075))k2H*TyEKB>0czv@4Ie4GzargIyjdCjV!b8he8Ib5K<{RSW|o28jj-?g~` z=_)5|XuzX$6JbSM!P&;*7Zwn|SCr2`9PKs-)n7&zIV?E0!C&nAOLd#fJ3K5xszq@y z0^U>Rf%oZV=gaYpgwLUK^y&1(s27Dm$CaUOXA|j4h*%iteLfXg#++rY+sN$Rj4|7`O4|%KXe79Srts)u;PDF1^tAwK;_NJu86$G3 zF8>`oouR#X%isDH++ySFz9(BiS`#+HtEXWZGTJ2t?KC&=j@R#Q(NWz~UE(g0 ze?W~q)^)1Q@Z_z<0gAz|AEB7eFWQY$60tF0Zaq;w8s*JwWwixxc`QF#S72NK zRxMDC7n`=x4P~uV6)-Pom%1tKaCi^!tV#t<7*r=2X6plAG3E?#_A`dwAJjS^QMNct z{G8CTGPBWv`3L0krRrVqqq<;pRAZ0+f-c2U_X0=Rn+ql z)VVYsC$b+Ns^sdLNo|^w!>m=YG7f|bUxUBxn@$ACSgudcZqIPRc2`tW(EC4}vw|45 zk}0M5`aE7CYs;in9CeK9EwM(q5^8_^FSiSzz5TatcG@v>ZT8oNF*~FHSaSkE=ekb# z@4c4x-!F9GjTw-GYB;vjf)f3=ZR%wjXm2JiG1uIo6ugeO&=Fs z>A*wE#N$)-OIY58TNIz${)i)4H>>UFZ`DkY!CyP7^NvB7G9E9OER@3>`Xmh{`EZo1 ztlX0N-M;b%PSH2-O{Oxgb4?U67ga_8WgpEhRKBemNZPRaD)POh1-tYVlfnG8rU+b2 z?5h!3?Vs|`Ef#Aiy=QXuNPgQQ7jJQJH)&}D6p$C81EEet)E-)mb<7-@6o!RavK1_B zVb#i(otOm6j;C)1V7#A@*rD}fln&aT&ljk3=3J&^sPfiN)G%Kpb-3*Eo% z?hEGGh?S{h^ZBczucG`a|JlzCW;<%;4i!ZXNqF8>@B~P=P+d|iivRYxw{EQ(MkhA> z1Ok^jfz9uvJWAPMlbg=*G`0KaJFk4)muEV;9ZSLMI8r@XL|!OpKDU!Fkg(V$O%$Lo z-A-c%Bc|xhQcQ1rEMW2*XHuDBsQ!_qM+APTGP&3$6)cCF0B-L-WniUOMaD6d_u*Ti z(C;y%3vk_ldF>G_Z*7ds_ch9|-adI{;U`B`-LsuR?*gR?Dr*d3FfS*+f-E`1spY*` zeVht%_i24<_M@!Nh6Itx8A*!k#-o(XOw0m?rtC+;acg7vdqJE>;O<4Bf^@67DTVwA z1Gy-*DoqFhfQjb5uONURpHTDTDeky6up@7;!rbF6l`r!(MNGO$)9CYOLW|*F$_!fJ zl=J<@v*bPvFg3fR073>r`Gs1pgVI-4lUCB0HX_b%8~>M=2^UMB(w@=jfu= zG`f@_88eFA9qM)%rRXk5-hS6Jn;cDUsQ}nYKp!yXfUqIz0USq`=f6>!Yn0zSUG8_W 
z#Gc)Er#`eYEKiF<-sDHFp2d*|FN~2?QsP3&z9T51aJw#3je%udgH0NM1dGLjs^F-e zl;kT|_#pY9Z)A(my;eSNxbBn(#b%pe-W-d{?xwIlpzY`Uo;3rQnLNn}O4|eED~!&s zHRLma;j(wajOUv8rXmP@{Te1ksooSw*~@QB;dM@`%*!8;iN?kQ%0Z_ECFcV+!E19V zM&Y|((j7()5DNEoxoq=oJ{l%3 zu!MoQ$zdt$=H*cYT z!G$k3g^C^9K~0dYxhoi)h-ynPeQ+C>B;J?S# zis3!!mVsd3LiY z4d$066f3JOi}O*HxV2INaHaP)tJxXYA6vlfTF7`;Pfa65v#Hi59Gj$Kt&E*sbB{7j z<>tT=mN_O$gcQ=_ah_|@b$hsT?lXPqTvW2V80UUMDK((2x6>Rs`+&Q$$naH!YHKT& zo*p@8FA_wR?0AyB(Qyqn=n5jAW|;^gw>#YMGq;Ef6pUOM`Sj}{&ZLS!e~_$ysKSH2 z?say3@8U{pqtA3z%t470X3${dad7Oa-7-L-KYzOw>@m~=ECvQ5XiT9pU?4vE|9Tz!&)h$)*EZo?bi1m?lkw#f z_m|mH&v(@-Hlh=O8tmslK=ki~kxDK*dz!fJ5XqxcZFyxmC4a~yO={NoJ>W}m_2_p= zu!>4M@Y${!P-A= z-Q5KQmvBzKz&tjDCbcdKZqnyoNN0}b~Th(YOyLfVKiC{O7N{0IGzDRiQ z45WB3SqWYNRn%auzpOPvEa3l{?g^1eqeZa+bK{)QKcIPT+8b)2AmD z(si8U0>hl5x>>fn7hq=aKHz>yZz;>F;IPNdrh3uqyTp__;p#Q}j84oiJ!sKT5|df& zYj;Q9BLkqVp(@(bTA1P6 zAp&w_MfmaTw_n537x23jmZw=nt1MCWo7rQydlRb8i{?q*O?V1Ek6_ZFpM$wt`HtUA zMzRJBh#n^R&!N2+6`)clbRy6XYWm?nlimn_xhG8aNaI20VW{G1g%!gu-Zjs@fG0?^ z4bK&-<}jtaTPfzRn$%&B7Ns12zN9R1=9+!!79JF}dQ>&%F1c>OR`R7lhG*1$S1bF# z#7A`J1BvcN&VF5<>6vkLm5rxYt7M&;h5}nXHCDBb4~sWG;j)4QdN0I9NZpRV3+=7W z_>q}dfFVlTzO>uRi4iNEw=b2II)tq11BPxgZwB5Hc-PcMhAW<4vMMOb2A7;&U_1IZ zTj}nUePfEeYY~4d^ZlRrHW}TA11eYh8+qM}E<`VtOYPsQc4{cH(A|?7q*f1*DYp}$ zkUyC9F;>NOdiBdJ*(TFTg{0Z4PDbTRt*AaUP>s=oipsOp+Wmk)LTmUTRn|Lq0*+88 z`EN8IRg9nn_V(cx8D2CK$s6J+RNLuLqNY%QUi??vQfH}`Z2SU^JyBf{eB>?8Lx9RK zLXK`3MTP|mrA<3mv@o-}-j`BrBDoYmKD)(1t3s*MPh1D0O;=#w+fb%?flXyj>wn$5 z6+#vevJH>je)j1=&ZxUN_ccFk=Zp`zzbVr52NW%CVS;kL&f}#`*J6+q7Pp;k)8h5I zCy?Vq1r)4!tmjeshIL&;fAkIdu`o)@KCeGSGjhLbJRw{LskKV!>ntibMpfuQ3MF}V zyC(?(6E@TMW0=Nqswb^m@O#%sUxkpo$Dl~auJkPuThQQDga_;*=gvz}dPm|Co>TRL z(8JD~{%;S4FgGPYfb{%R*3Un%`btmegQ5^8mG5UEl18bB$2RXoUdzreX#&OYVXbx> zsd#$FmEp^1qD9Lpn&ic2f>?daN*eawlaIm-9ymzc_t%!&mA6w1cTOFC<5{u@zxQ+t zVx_~HkHsRG8nT`K77OV60Adyf`lx1`_UyX}tiMn`;L zvTzi3Re}4!8%n8~rQ*9QKYDMcO_XYK@=?|U=c{;ZuvM1Lg184{eIE8)94a>|O6Bo_ zqoS))yERQw%Dy$>{~_%?qnZlWtF=_J%ZiucuX_SyUFJ?{O!G42mS0$D37YpwUG^O@85;|^o^ za{Ar6kuEx$vQF_e<+9@BQ8X04o`Tn$zF}Yaq8NA_>J@Jf+`R_m`ImrQtkpBu;Ou~; zKI>M)O%@&LSGKzX3)`FP>FWt+QfDy~qx9AuhlQu&>5B1^MnXb-aaWC$_1W8u0<0&A zfAVKk{A;J(lwUeUyjS2N3D4{%mD>mVPbT1QFf96*8!Elbt6`+WCM3&+YEy2NS%?&b zELY#Bxm&MRT_L$Fdo*$&NnjXnw;X+96y{hR8Y@$2*38=E{Y%Ckt4?8%5r;_%b zD5HB&AFd20^n^GdT-8dW9%8lvMaH5`NCifo?3Rwfizor7H&F3u-yNg@*V6}*%NaU4 ztov-%8U7GV%X!Z)Ng_}$RDaC#^u{q9>qVw5>=E>_LeYXVt5a+y^ad&&m z7Az!MnAy!L2rUpv%lf6DtoPjb|M2V*lkO?Z7S~4}%kj|E zN{vSJVh_+}2a0c0xO@O)^9eCj%JLS9)mnb=qx0=)%Qn~O9+9=)0%yyRSvbRJOg{osWb8-xzdAgm&=uE{MJ?I zZ0r~xl`G4Gs6emF5iIFHUD5yds6%5gh7EFeKl|$Tm88FqD-@-`X-c25F)$6)TI&bi z`|qp-ggu60U5h^W?S4))%Rc}b##awO$Jg|(bt!vH^`mk{z;bcP zR|pgq73zVsfT_1!Up?oTeECdE?2J_vU@Befp{9Wzq?(^3H5<48!DQnD54LIX(Q{#o zRdK5EE?4DmMxSfnTGlyGW`qx`94j7l;9UUa9`o#-x^tz*mZ@L-@(gCpx5`d+Qoc5z z=Qvr_>MQHpfxG}BcSCP?>~Nk`z0UhAfdh-Nn>35l&ne7SqFy?YH+_tuq^~FDhGCgA ziyk^Kw;;`xo)v)qLVqmYNQps!B5#nSDHh!{p)B$GxNJON)ZTvvrnMYJ_eB^K;<(Yx z1;s#Ip8C;=~X?>xHKVPb@=Rsz}#=o+BQ ztwIERO#W^_s=J{BXw7&5iSS!V7_$7n3^IxqPWa)gxaGHscsGN6vV9~DpxYaCR-i%A zg{Z!n;meF{l*25}a{hx#*%6Riws$02!Vj%5z8#ovm_~tM#A#s+i~Gc*D1uIPWsUL; z!p2O#_jm|SRE2SPhJh8#DdDV z_^Z1f2h}d;i>A+gLKRn0 zEa)L9{yBUsk^Jrh2K(IOH>BNmw}wmwn9goe+$8yJ!}i4U4n^x&cNw2DZc+&sOfENQ zhUQFtU4;GnW>aJqZyxN;M$#FFbKWuon39)wZra8(baDOIy`Re@z<1AQ=*gK^f&&Gf z{ETx$tvU%yNr}c;&ejTdR@qMyl!>+_ct&j7ZmaB5JA=*oj?9OXfz^@9w9SObH%4`@ zm^q5jjcoJ^_z$@4`G8riSP*s8@CzlAF9}x@H2RC)OGLy8BkL@(zq6O4rn_y|4S;sIG&Yc0vtm?i`Npha8 zJ!AUB3fITk-;fbOgmQpJ)0}i&-7k{Bu2g6Da6v~OTYEEL1!!-gqqQzl#nJu5S>|_2 
zZFJozOMs;-Qz>#dcUAQm<@I{TmJk@o-JY$9AR!+vr%STnM1y7ZmXYLpicp(WIqrOI2N$1WtO)q`wEv=2NNQ7FPsZ7;FZJVuX^5Y$xFeyx4{3TMWRQKL2=0fZIHsuTBTLJ9G zCC}|VD_28;Vr79<&{acRT~X@v(ou&_cGr!%45mtT{zFjk`m~H zFVVVrvpsko#THfcS>lxnY-XZZw9EWNu^spM%VQ1o<_%ESox$V0i_;ZN&!-F`Gpi#F zvbdh1x>JD6Tnvu69mUvlI)7)}U8YG>n83P8%k1ZP7wW0IM)CcDD{7v98&NW{L6_&g zt6GMOYfY@$W8>a@1ZHs{{CnR$bR$3OKYzM*X4n)~)#e?2a04%Efzw@J6}!J#J-L#6 zIfU?VS&T*5f|hBTo2}8QHlR_PV;?<}d;D?)x^G^D_o3sX3Opzn%*ojqjw3LUyYCx) zn?MJcRhv{=-vN~sSD7rKS$gvKljF6A?MN)V$4`d4+-dGbA(7XsL!Up(E65F5!PqRp zImRYab|`&~1^qeP%t*#U>@YdHS<43857yY58xFXybI?u7*(uGl^m@Iw_r)!JH2a!v zO{D(te=)ip`tdvGvR2X=5rTj`>p+`XmQj{;?~~Lk*4Zf$GgvK$qsv4}Fk}-qb1ZbL zo%8`_1%HcS@ji!MM_wq67r9!)rexUE;Uq<2wxfXy`YgMI;w}fORG1w>JRP}C0JCSV zvkt|o!H#Z_ZvoTZ0Q`vi`ihah+?~)bV9tFu+2K|!Zzd*>9k^oJn}mDF-eNH_tpE(a zDS`oZ<fHKP?0TP`+qKX;IF?N^amQxmFb%bJQ6<0K6r3c%HXg<-~zZ~SVs^IZ`0RPw63 zUA45S4|LO{7hEU577j~C)hZqc;4Lz}pX2p$$c8JigSJck3FD^s7L7QnUzqPVn24F> zMoEseC9@9`5m@-VU3memof#Vc^e9#dfl6lLy$uh~U3e{eKSvy$*V-l?qZz2Td5BcP zWmhP-Who%l_IJcKdjR`x_dM!&JqwUInv3kgK>~SoCFPP7OS=0)bH7LImDcJXZ<4+V z&Mmw|GT2j#Rf_CZ{QFio{N=JBf1i&!H(>PvE4qBfP%1VWD;=6W+$`iQ(B=`VZz}!2eCF+>T5?tVSzkWm5 z3R2FK+mI#g@5(k|E!B$erv&z@7OE<&>Z&83>V3K|s1T87XL9~5=b-MOv+mDY7uWn{ z@_}av9sKBt?pzj;L5!?QL>^F8`T8bWJG&GuK2X-Ph`Rzch|r8@%OLhnn3n*)fikL{LPsLH7j`3 zxf6V#Au|<8s7kH!7!T&IFxGaUE(P= zSbf>kKFNWxnAlA;!W+%Q{4p0Ci09Bs?8`0b5zks>W^_k);Lg%bZfpJ_mG3i~!uId#3~$!+?Sf%p`gtQL0>6DOHbw}} z{ULo~%!ol^0+uM;O0};OvsJd87e@&@!F=Fue6bz0Fo$SF(Yu$kc;_vaiw-}6CT{A& zs)JBQ?W85QxMU%gDS? zb%G{7pW7KP`>CLK%*(DM9d7kvhfX8mn4U~6i(Mz&Yh#DvKU*+e8~!3H_UOD!pY5)~ z4z`3}r&y>s!R+_s zAFfhfHzJCi7MNj|c5*o5s`+SVNefS>T7l;gYvtO7FwuVcgFB=IGQGDY-XY&Tx_#Vv zByKpNG&Zu-O{nRV>M0?La#6_VpOYh_P;BE&i315zh{4J%{Dy>}CBQ-fsG*K4I66|` zGO+T7HYD17Cn>uZBS~hWaXESh3{%46iV5C#@QSmD08d)-RqW{yy|HJLB7><*9-o&i zv^%V+U@i|JucU?4d_jZ1Q4YC?0baqx3r0F`K==1A!3%!hpw5_GbZd1|A5r>8lmNsdwJ)Fypj#3#F9Nu8KY3bD}cB z=b3K3ZzBc-%}|mL0$5MEk!o){JQCU}@J@ys%g19~#wW(qTFpPG7j@>CO2=t4p>1PsZ#i-&?k6B*-gzdAHl*n?fkUlJ32R@6=1tqj1rfzV#P7u0 zxQy+l8PK!9rRBAA2l}u^w}(muiFPcsoA$+mn(}J$@Y?k8@YM;4Q@ziwcU;Zs{3hO6 zB=eSam?#&xl$(18lbUR#iCx-&BLK!sj?4ilW5lq*?Tt*|?5c4V^Lg3g-2FfDzabap zE*zd3nc2sS5Zs6$jCwn;P`4Fkdxrp}mSs=+1mjI4z1-YCPoN|CSABa7D{Z;iXJZ|n z)1`Lwxd)J4&W#Yc5R4*FsIh$YNK_5GLm?AbG(3mw`oU2og=Bdw7~9PHqV7DT(e zSY)g2WPM*aU?9bN;Y|T@#2ATNJAl4Jo**+5nT#HJ-|tsX7%vXlygkq@DIVzB7;l6< zb1-Wr@vF>ph;esc(8aRKLUbf&r7e`c`UAH-3nLq2cc+bVB;qG#GXNwxH~*=Hv=W8+ zjG6dzMD4KsLuPVxD?+Z>_OO4Cq_Mn=Jh3lj)ymE^M5N1JEuI!stS>4W;?HoP?M`1wu>9=V;B?@z**9k?^Wo0Y@pry-aBDhv!wKvqXd>63Ws3pmVpxy* zyYHq^+_~UY>veVYs;<2?H2-;B?leDGYyb;#o9{r!!}tfu157`C&8$g((XZ5~Bg+;2 z2C}A%4wwGA3tI7FI5>g1F+s5@O!TFZ`+HdrPllb_n~8@u&Hc*n6-iDaM-pDMQ*by| zCT<}w^;?f#T}?^FDU;$7B?Cq1#gkJjF) zY4KHi@KKlU;C@d~vT)6i{fDX{qM~=d7|hcv=bqn+m?-5J^s3n$#;+9pzW#Kd6iwO} zRU`JXYu_*La;~}1{^n>wO*8mH) z(q!iQ>EG{;Fjr`Ga1tU4BLm25{Eo;ANE4{vH19wL`vw1nCW8pAB~=IExC_ji%W{%3 zx??PKbv_;P-la}b595Tg$-9H@Hg<+F&>!d)bRfGZo^4^ZmVcgO*dQ}$Rl4j4ZRfX4 zC&rdbuU;w5Vs-o|Y0pB-F~z^yLr`Iub1Lio4mg3pE0TtX3-|JP?@pRH3!T1(nRvMDtye3C)_W9|KU#kZJ_U6Re}O zRy*Afq@1A8!dzJG)^UMoJp=MS28ywSvlA%>ulv~A=X{x#P_};(Zt~MZq58COCf)myq^WC%CHbc!AJ<>Q z(?7(=&(;|>khy|X&GH`f3AR0U7~(=Ggf&>mq+t@I6?%;D9t*08C$E(P6m_Rz=b99V%)7#GG>=!DvpHr>{z)< zPls5-HdG1sfrNO+tR2!2|1P~Fdunv;;WU$B{45PL3KjZjjqbLG$Uz+OZq`OOfE~5~ zG$^PuE|e{#VAT-+D}&tg%@b>X8qVz)I&Pe=r*gr0z)|r#65EtgB1WR2N6t3w{)U8) z3?83(m*H+ESA<=DXU{LZd1so>_Jwv7=X=JpA+Z2!Bf%9tm^54du@6#q}}@DV{AF}5GC zdG!834-h6u1^knIE5AEb1vd?L9%D^X+A4$HV5k;mz##A`D*C z^WLQ^FK#lJl&ifK-7j`tP7E87KIeAvVDz;3Qg<)Dj0kKDwwTF1QG4Q;1Rt;*d8m^gnDJdrD zF^qOq;ETJ1a&LNs%CSp2Q)|1z-L`C{p2B;EA2ptxo0?qt)1Hu{j&ROilkNA9*k4tB 
zOvzA_yQRV=&mobK^`BJB|GSs;uYIL|K7V*mRi24LH29&Qdb<6`eA?%$8iOCF%A~%U z0lKPmPKNyd#VqxYx#K^-HHOpKYy`o>i_6D6X@47=r@`Q<3&*O)UC_q*FD9!$ScJqs z{f+-(b^62X^#A{d;&<=Az%w8j9biEJOMkI06Cn5d3vc+>&!`gKTeG+Kl3>8iJS3e=K7YXXlfAvu3Ngs7asT==o+Guv&j~DKKu3`8$l$dY3aT znD6feVXx!0X3pR@<8OF=IXx>f!NQvP4=K6M-}9h#NNvH8PzjI&YZeSu7cK=da=fxB zlUBcF!D#ZbME=p0m+YfT1(K~d1 zK^v1PzYyt2tTrm{APPlpe1iN#qiv%}C$qnyiPxwT$uWS}TYo%~Im*GLOUUoE4hruIoCue)wS_@=1oQ&GA_R@B3`D{ z{J393);t?7HMsKHrq*Z8SzNd`;YOCgG{gA4aGTB7{Jrt-Js7FP*R|T;Q`T`sMurZe zUB+V6pL*v%T91Ug9KH6`R*h7*Jn|4h-*%%9Sne&hb3TYl&u*beTF5@u9RsCinOm(c zwD*~^TUC|cSvJ4~+ck%9k1SMAcxVhhEsS>Z%6ca2#@fxIEOggLCT7*d37*$2nZy!% zLL{e*zh^`|BV6aYjH0FV+3Xun&iCO*-gxukQ>@W?+>t)7pRQepm{mqyRCHRiosw_( zSaR*>4_^uN&<@dU*d&8~hiH6W+j7472d%`Kc;-FFR^g@dg6+=+sL(1}M(1s5dI}&( zgC4?7dT#k1oy9f@->1tAojem_B`WA^5^{{PIz``&MQG0k@6ZK-p1PEpg`nRhzWgP< z=dz1d;-?PKMo~H^mlhK8AAX-uqI7Ybe-(jNYqHhKq6vzT;B6tc;RN2hNK?;kIy~Xy zzJGGks*>?Lqqzz3ykGz#z5YHPnT5~nd=-&hLlfhZ7b^|n8fi+tiw#em`>eLP{k#eC z%AGTZtv=?+-YQZ`YGTFrUCeS{!BrRWC*Mf8(qI%{676!jrdfw^SpO%;*!O13+WQg3 zn>ptx%lJh+2XY#tz#`!Ivj{Gsnm&i`O%6F_eVM4d;H7EOzxYIOJp56!HU*92s8*Hn z<;5o~y4zZxa#){gy^XFO_3kOo$*@hxjCoaJUGaM2{vOW~G7^pUW%8Xs($}-9o%>iA zH=O<8y1B=*!i^+L0OHgBy8O#;?H1zB6Y^`w=)O!$KA+pYoQFvjFkcyZ-x(M!@zn>5 z`IyAV^$oN?i{;q2DB4z)+thqAQt>s7R6{u zM9>TnIZZPb>a`4SZx#LELy1!p9*_);9JTs_?nMM*%F&H8tj*NMYuDQrvnTUtN6(eg zcM0SaR_aRSRyn8n*qpr?8T{U8B2P>X2MerKWKK+*k#`PEwWRjwG21K9qD^NOzbQ>| z%CuU$RQ!f=>)L~Y9%0fC-xzxPr!67n$@gY5p6MqAYRF$(HPCB$M*Z*%Zner1q^3qvpHe6M$=4IXJa?}7fgZG>tioxACmao# z5v6MmSF2dkM|;2g%HU!1v1u|1XBJ+Br<{lMs1%x7T#ea5hWIHtPVBEf0n;mHG?O-B zj$ALOB-5YGdRf+Twj)bHvV=aN$Z2!G%_8Vx@$<;RA}*H)Uct*TUXmrA?!2mIGcl^H zX8bEQ2k61{%K|diqNE>20J`M!H9J=9#&=afJyR7^T*7MQDJwS0{ z7Rg5Lo1rKA+>Tls-<%77=*UtT`H}WwxKB7Ab!Fy(^c4wd#eCExDhOIO(}(*1nDCui&VPN(DKwv;lCvbCQ7^Wv}zONOp#tAQeCl-DKq z<;p6AxK77m?yTTiJHvkD$CvV_hJ_Nn*ow0lByVNe^Nul6{)7$tthxWZH=QPE8LraI zP3Xc8Jm_e*asDj-NYynbuBEs7Dv#d7cXX2`Gcia#3|5_vWda|4;DRN8W=&k|f4RBf za>_z=TPid~h5Y!o<&!0g#S4xz2_z6*_$zrk1SsK{im@fSNnF*X5i};Pm4@z?uIIpfo?r2VFo)#6EhGaS0VjCQw&%4k9&kRDQD+Z~7Ke&uv{A`z)LnA%wz=EhvF6G9imvFKO@U2PtZNMq;Wlol4(;0J>CiWnYr_3UA{Lm2ye5Y&5+@SB#!v?%@8n zN!tv*lD#UWah2Wu{T#-6jpQ;=i9help^yL8VH))y&XJg3yZh$MYp*EVp*QW`Bg;c8 zDwnKNT;8?Zimc+00HKEgrIwzsb2xD9Z~iql`SYBYPQ~Ds$AjlTrv(9B()vLaDaM&&5ldYGfLJMhMLD=$S8wquIBvvN+D*`3MxGB8(Ye2_W|yGsdU; z@z=`Q2bUiTwqE?i6CnxN9^fz;+3~_b=SG6-8S%lpP3%J7Y_E2`J|k{%?fu;QXE_VW z=R8aKjP&D$J&pr<5*t9Fai4Un26Fk`q7Er{6eU?y^Ua6S<_k#vaTTMwYVFqmh~yLi zpFB7%;fu{2h~B1#6s2W7T}UW^cb^j4VHeb~k`TGe<($oRgkqSusFPTxsP0f}T^c3) z50uf3knP~nGBUKltog9@rJ?Kba4}vf_SXiIc>h0Ur^u`uOgvz|385O&Gx<-MB;lZ9t0_v*;KwpRp#DK{94X#V^< zSpeuug7#8hP}EsZs7am({;7FF=A&kEY0L{?y)yp?K%;p+h7E*~+8}7;`|F9QdD5kM zAkw@aH6Bo`KDp$il?+fa59qQDx346mI@H>Jq0FBD?59-6JM`kEU@(|N1{lUKr=)LP zpN`>`GX0VhP4U{ACoTOiXYO!{=HVIO;*>uDH0|s?xp2N09Go5L9cjU%KmOb^;V;rE zKg@nU75P0wU(Z}9(*Nw%S@d$q0{o2?D0L%RlFv3Pvd6dsB}%i;ZR3ws?Z@^7zeR5& z2G02^ARCos*nFZ5u_LXLJjHB8yNv0(tR6+ndQ|(j-?jIh_L_tREfst}h-~~cP%BXb zP5TX@AakHwp>K_uaV6qW%g5bJ>*V?`KeTWYLYx;j@92zq3rsP8q>S7jWmX` zW;37{Tn+KMb1vz%-w{u1o*>`3$vLE$KF*@-z0z&*8?ss7p0T|%Ij|0^{I!0Z!c6W! 
zp2y5tdb?q#qaB$OpJ#TFvfR&h9*LTZ)55BEUdoa8k@7T%cqCSBCJ;6@O%L>U38TfL z#rIbaLKR_FYc7eC%d{w;kaOvZMS3a(^EQK9&w)o?P~88QPwbcmFQvU7)_i98my=CRE}l zWi%TmtL-r`I@Qu%rI(?gfoFF3i9xL6Y{}H}^Wq|O!DJ|Ql9qMl_({}bXw8kmeAg78 zps>B?M)8*%E)^_4M_FN9s_~pUhki77N?)IUD2q@_kP(=Axgo|z*xW>+s zA+UGJTqDbBIXO1d+UENf^ z&God*Tp@DfsBpS(&~1r1pNuCvp9dqSlw>U?gsOM6okB-yk$*nHw1q5_5uaW^BsIY8q7RLJR#Gp$h>>ge06 zypj%CN35S7OD)l~J%v%4k*YA(t&w{8LJ0`R=6#pgtE2M}j{}a8?c_v9k?GI2E>1vA zppUcKdHlzh$ez^s?3?eU#IEK?u8IbC%AS(F%n|;Ut*i!*gaQQt$5}}Sc?^*)9$(g>~OpYJ&-cWr=zBLVBcrQ*vxiki#+xQ=3Kfd zLpR3*3-i*?th{39;j4I_8NYKxwi?sckoQhqKaCVgDV6uP#;}n&4B8Ko6#|PVGo{87 z9{XO+JJv!k!xt1VbiuxN@B=MR3%M#V!vY!a0y$bN9d7@I%!s2i=F&;&Gm*=dNNN}o zX_=F}FiY2b==)Wn1m4I=4U;vVF@BDP%cnGBC~GgKPTlDW;b&@+(CwE!;Q&1xwjEe+ zXptO@HX2*eGG3!5f||E~Jfbw24=eLYdBVZME$En?R+w6z;I`l&11=w2`i0k{wS@%PZf1sC+82#e5 zVo=@CFhoUR9kzMiw{Dh=uj%Q?@bT>~NU~R;Vq|TZrKHZQGkSgcGHq5u}24s*gS2fNpREm%7punf3C+XSLLvx-~hM4!-&o&`eEaJXwNRUZ zW448Is1NN_<+{k{z(RZ=0uMr7auztTK7Gl*ZN{k5P`ne`+kBRTW6-h*XZ=Z!7 z@6`iQ%}8Bfs%VifkRcdK=e#=Rx#{w1qtAOYFXGP?loB$=>0K?uzRyy;aYVdCHmYG3 z&9G3CCjznrL_wStQA})8`Zq-GHFCBZ9BQa;jc+%!tfIXKO0>uJ1k~yma_6P5MX9l3 zeS*h1%SHG=xH2j8AQMjSSq zf!uj4#Rn$dK|3jZM^^?;-UTAU*mOrI_Az1~dhs`8x*r)cwNtl}6A_@r^a9a%rKwsXPz|`2z%7HXcPjh$a@U@1fge&h^P$bC-LH)eTD9 zE!{Mo+FW8Ad39vu)E8OQ>2>!^9j&ooZ+oYYFeW3iTLD$toShN_JPR~zX7qUTFT91x zv&t60TjuV>INj!yK9ybb(RnhfW@L1x0!~bX1;f@0LBl1ZXk;D{0)#bY{9*Kkdwd@Z zdY)nHc53fsb;W*o8=)RgW;QCU+VDURVNg(pz3os;5h(w4vE_)29C*7$g#Jd}i(Xt&2O-Mt}O0rRkH` z$}HSfKCa?Z{s&)Pr?R|1gEOw2N(q31@U!2N_h%OhSvSkK*N2ez|0x_u@2!7{+`dv z^!MU8HX;>I;aY9+&0x9(x6pDxzgqo#p!R-rSiRj_x`TIGBbU^;-q4elo+z5qiR>db zya+JjRUGIM_z1KZ3t|fjb8@Dh8FR&5ikE82xpzG@kM{Y4lb4C>^al~7w^F>ebA~f4 z-i#FqZ3NX1nTaYw%8qP5th+uLJhf(eP<74i?n^red;QmzSyD~tT*0B&vqkMZg75-umbzL{R&W4>zFwUjr90uYWZi*M2yY z>jx$acMHC!f>Ar z5j6vV2tj1u)H(D!QOza(m@>m5N|fLsXbe_oM?|hPjI7`O)_MvqwZ$1Oa{EU+g*I)J z@(KjK9KC>S;q~Pu>f!{#^+RN{r!ZpyE{1v-H%PlFr%q43jX3gLkKyM{=#Tuived3g z(j&0^UlS$Ct$4QrZ-v?rW_GS>f^-cqB6rF?G_`+2%I=B7wy2Mx2Ni!o?R>MapuH~l z+6mGf*rw!T+KNO=nj(!goP+Gv&<>^7JTv~opN>>FfDUX`y|B(B-Kw5(10MOEI8FK} z!wKAJF@ANf+j1Vk2Oqpa?ylpS^A7n&?*x4mZ5~J(Cx2K!dWm~RhF0Pf3S8zcF9*oS zZdu>{TB)P_wZe*9_*Li{i!Nj?XdaX1FxCUsJwH4^h6U%{e^k+LNaHn->^ovd$Wh*# z?yB{=TbO*knJ5yM8g8iT&OU)a0g(On5xW!%uoCMi5-Z;U6NVim0y8^O?)S)zBhX`U z-IPdVZTqhyHt|D3rWb%jIFOY_f{+_9F>=&|?uv!42k?rbK;AygJ&@eIi(KU|^L_u6 z1x&V()b6G<2mRlH$VPzc*@kkP0YV8{(kW5^0NFdkj*l^ax{~PP^##~DYzc|AmkRQmg_$PEg%j+*1a2X#+`hbJ#<4RNtqzai}7{7HxQ_rjVOZZmilUuIv% zB&9&`PI~weT%c-i;i9#Jw@$7@%}1ZvZ&k6&2t2g|vIzqW=4>hkWPDffvR1Q+`mR&> zZ#a!XR(fm8N8g}?qb0gmYTmzqhnT^;kg^t_7rfQ#LhinZoo^coTG(WM_Px_b^TxPO zdj1Y(^0}4K)&aRrrPXWS>@n$0NojdesefCi$ouIrCh*u>5Q zDde9V2~C2GRE`Ax`=5H|FdU?yTYZnOH2!V&;Sf}GNtqv>0Ad` z`2;9g5K%sJh6pck@0nA4TdO~R;Y>!Jb!>&$XxeR#)VSDGjC3#gPUo%Th@pw=A3d4;vlNpXj!sh)N#?J@u zs39dWWq(FvQQ^q-EdkQaU`*PT|3H8G&;c)V#W*EbYNKb?IBP@EpGaqh<2z8s{_+!H zJ?oM}J=$!>8gWqUFY|Ip+K2 zeT3H+bSiQ-b7top9vCDe7dYw6czp~h*!Ryy<#=euZkEF35h6W|{HW&&XrVov z_zjU4@yR2|PL;(qpo7Gy4lzY843{^|E1Y}g67VHtDiGKLtPSSBu{QO~_^FJpeBu?{ zbew&%`G^5idL%mk{n!B=VV&`prJ9e+Of~6ztr(U$m53)&l=&Yn#5EIaBWNAa)-`@B zj;{it;ev?%Ddb%Qjf;7&FE?KCSxk#{mdyb(ZObfA@EmKDPcdTfOrKG$2Awb6_g|eU zHP=3B9Ryo=r>A*lb%6;D7k?XQiDH0bBkvwzo5qmF7%SFhS z&hA!9@5nlQvjg3nO75=bdN+A{yfoJJNdWQjAr&XZ_EOs~EV|2K)JZ-?my{tTC~RBJ zFCt){Aro8G~&n2R`?Mw8Ki7QZK;2YB?I7_ zq%gu3v+x*l^OZmV>$PaYu#)91YaWq5d4mgaeJc*Y>x+;p#fCZ|bC@G$XUy?FNCo7B z$I&s#8~aZi5kKQL?W|4*Zn7;@@(A}-*Ec68y<|9UPbJLuN(fZMisEmszgb+BNDLWCw-IQ^`Az!cOzs^iH8Q&HdtkncH76R%6`7*h8hFRwPs;;L%bGMyyjpS;5pU_o9 zk3-c`?v3uNkqsw-UXuX%YV-jZib}15VU5fH=^o!En2$M+O1DAK5;g98H9I|!SqQN< 
z)#dO>mZ30lNB6=0Eha|G-RHyBZ0*mMETpe!nlx@$nlYd85CIzjJ-|s zjnM1#mnn0HyUVoNT^uh`nhR*Ybaz~vy|Q5$V-_f(od;QmSx-HN$usDFPj2jVUug7&;y=tu!yA2hDhTTFc@KL#YtOJO8BzUYyL?7# z?2QOs9=ClUkB`I}9IR`WmdCZQGin-0t4s^EaqU%wy*A~n@o8?mDxpC-rTvoPJ156T zUaV!z-OZh~8rI4rE}H^37+AY+F{Q{>Ou!{BW4dAe(uRo?^_nwYVjUeV=KgOFUySJ; ziX0DNX%0rQbghIMBn>|J;vAA*?Oiyp`8iu>l4v7u8|y(>gVdaA!Vtcjz)0l8=P(?H zN4}`1$DN9Q?q+sb&m5$#(gp7aUnVOXg~M;4RETQ$L{JV+Ia_PxF^guNASIK3yjOhq z*r}yuBY}H?XLNS|)P)Bo4~~`EX;A8&14QI(vY|^+Pc0H%NlscpMDt*^fl_)@@QL}wzG!863df0ecjg|aC780)Wr(v|hYTn2W{Q^n z5&8-Z&IXXK{C_;B2lD9#JpV-$^5)-OL|bQRs5FRKXAXXB1{yrC?+r&KO<9+5 zJ_oHzAQc|rlHCp-zL>3~1)l=%(b1hPN?w41IPVedhD4e!9E}S+2{Z55At{4f1^qq-zEsXkhbQbdGDl} zq^a@X*(-@GE!~3qet_zo=fb~&OEmozRaAH==EAKTk_NT+xf0aQ34!zWrT8yxN=(FQ zXc&VQO=L00d}3ukW;#PB`(+<>dV&wfXLs(+6Y%DAK)@bf_u%1ap>*k6_g|R+)q4x&N264JYSPww z&BGtrzudUvs4V{ieB}GoNs+Ob08tnK=C;pYFj(Mg*NyNtCNaef{El0&q%hQpEh+o zraUxESFD*Y3lDIzmr-%;XAS;*d|T2n^J22~i~OTc;5%?K9NX&1*SvWX+5frjoltNu zmDI?77FYNWiWQqHuV2k4m>{0=9jZyBTN2vRCSOe2B*LAJ1V4o%W+?o3KDKi2@bhe4vk%qZ-A0zFcuch==WhEh6wOo2_EKl54|ebU%q&@E3sfSTGS~+eI2!X{GsEU-Dx? z3tY_|m9Ei6H*N+zt4xx*)gG&|yvNs>RF$^j>x18&%gmaqdRu1%GW-& z9XHgHijD#eNJ@jw1u_SmcE}5{H}@ft;qKku@wk(-Gh=iU&cbdeu5U@vem(LA0AMHn zJp(@T;b82k3kx9z9#08#P0~(Ix;S2FY+d2%CM4%b-(tq~`YfxDP3$sdWPiPde1$b1 zRN11|s6~R!`nxvjQ@`Mpa_OukBJedTOg(<_>?;&WP7UGpdDys339M>su_YbDn4!{Wi=~2D zloEMm54|kc{jdhocu+vYW~TM+YJTBT#fzg($vepFd}%Bp5;luit9?|~x&b@-a-7nj zS^d)$vu>LsCyKa-%)^*iANmEW7Sg$)y`D>JG$m+1qFs_=8w?L&)gPVCPz;cV7I#reO%l5pQWX(_!5&RGc0pc?7&66eee;tgm`v1rJS*V z13}UIa@{y5jxk{i+UjBtJIHWm0yte6hG$`Y_$+l6@;(5x>NtCOx?%+5hOEj^N|Y#r z=iQES^T_5i&rpo3z(}D-k@#FVUO$BvMeO#pBA;mn0155;d%q!{GTx<_mCzoos^|xn zIr8IRbD?H8TmSKvr#|g%@OMv;`8f3i-?#`Q&c(3+UYUH!2c@KNbNX{q(#2BGkuYQ5 z-W1B)W(M$(DK9?WUB7yH|D`7<_wYPS zwm-qCC4GC8zA@eDX+_AyB=aK{bXzuEt-Q7wY-sb9Cl0ELkC@RiH8}3PY)*dC#^D{$ zcrL4+c{Qus7JJe>=CQg~o5*m0uS%xs(`%el6LC2qeo4{!uIGUu;Uwaw`v93*0H^pg zW*)nU7+a}B#Rz2On_?uUaUIlO9A_9ADo&N_Igxwz3-4EEO>TXSdk8Z>pkwlO#YZl< zx#HOK^~&AP<^a+zzAkP|b#+IwLbP;8u8AZWSMIp3+Hv)#;0m9B1Gh`AWBB*b>_+3k zFXz5^I>|2fYu`szxtYZT|Dr4q@s+<-caGtyUGU-=-5%qQ?{%fE19Q@|T&+zznM|m1 z?gfX#YT#KvzMFWgtF>pGn*XdU{r@ocodHd3YuBMGT?k4Cr56?HJ)(wQLlu;c5RhI) zAQS0J@&NbkK$3lMU?!E?`j&n;j1`TZDXk_nT^o?V{3*4pcFxNY`A zNiatorHn=_^*b(fhXE4>dbEcI=SfdpcscpwBKi41N$SbQ0V(oV5=+P$()H`Xr!+e| zkt@1Viq^rTjA{k!zE0Jrbb@%RyZ#v8AbRT~mFM+A!&ge{on?evD~L*B06%C0Fur-b&sj8I zGP6akV%h=fqbpZrU!Erip(=WXf?;{l(=pb_g{rS*b-Fd(IuOkPl#n7M)-Np+RMSG)h{CYC zhd-O1(jyNH)4xAYQvUGSdOga;hzU~4EiUt_>~WhhU-^Oj2d2xCfLfO~;ah?W$%AR$Mn$I=jmGC=ltQ1PntRJJ-$vjs8xT8 zKiX#hKo~)Spu{quKP-eI_b^P0@0IR4Y&)=Pt==1EcWzK^`u1L4v)7Flr63485+ z+?5G!*zfY;?Kxc}AR+$+4d=-(h9a@@F5&`sH@g{LYx*S>M8|7~QN23XCXU~9O_>qI z&%+bW^adPpTns^`nuHR0*%hm%_tY4|Ilw#u6%V2-dJ9#Mm!t;+lqIzc z>I7&32fO~U4~E^QfY>OB>WR~0Rq1uXU>Wg`ZVSP-1{H^0zD0A73?TqqJ0Ar=kU9XA zuygubZuI0BvyEuIU5Cy>M4kil?Lj`1L77(~%$Z|Bp)&CN)Gu%f!}K=~csyZ1@?&Rx z&ZU|v;R+yJFMY`hgunh*@p+xFH5kYS8z8(%q|A3+;Llk}!#~Un%m3pgkf@~=5K<}P zUgbtI^rz-b+I}=#EaA5U!xdul5GHy#;+9kTGKzY6$R1p&`zw>!VNg9gEpVRjN;7n| z*)bAFIu|r6BlF6cYKLvIb8&M!6=1NJs*+kM+*zExo{WU>Yq+Ujco<5>aIVLyImA&T z;Swc*)Kq{mpyF>i7BGsQWc{WpAoGZKjZ*l=_Lq%>=RG(H7tn$|UH#dI^bj1b*G3DE*az#)v! 
ze4e6@7uAf^?z)rAl1a8|ug=j_P%4xoh98jxY>qxYiM{%2;+iG>cNji=W8KI}wb3X% zk`p7;Y*)#4K((*VcqX=!1<(bV`n(>@uY3P(7$eM}SKG7XUWbdbks%SaR7==|bx1KW z!tPbJWre@h>4^Uzis{&ZZH~d_sOA8-TAl%VQ1H z_f54vb&aB<=u|b z^$ap3ONjtnrduC4r+W}laKHED9Z4^v%DR=yop(f8R$3+{&7W$vEv)uKHYr#zatSgj zoyDamUTkNjAH~9`J}T^ScFFJ3xI79vEy<6Ao3gI(7{BauNxz*NAAJO{!Xa9iJ7~3q zJzOgK%`URjml9;|5`z6f(?sSa*dCh2j2eJa4*Q=+)bte+HBs-on(c%zD@$)ZF{1ifi=>c+od|sRjV&;*3uy6FR zKAYbb9xyGWfo6D6*6 z=S2#x?~JkBA0QA-526(!D*^YresC{q|8vJFpIreuK_{h4U;Ifk=yq6o(D#R#)ELE* z(UN1-B8$I|)B&V5r68bD`}zm)F2J27brEL9T{l=P^kD3GL$oB4j zEK9^vm3q+KJ(>Uq-Em|vUJ12&OQn;J3K(VwXkHw&nft-(#eq-^4Wk-I>fy)nZ<&iy z+={P~G8X#6ks1er5y#q(_Xj%XCU13puKwWYyZFW9An4gM#m+#y7PyrQ6+085B3^bx zV6$Vh)!xqn#mkIFA--pQ^^r&s_s50L3(qL;-?=7;bE8N)_G`K#h!>T|^B702RSKYR zOr!m7abOA1+q@T98n^)2W}K^V_(9-v`&ulMxP3p2o^|HfQW12g$)vu& zEn6F6GYr{he61hrqi|U!D}_-nPgEF}el#X&)h6sRxv_rNBA(GKL0)54vV6Wo0w$%O zp%A)NRKWSsZQwz{*Q{^Yev@A&SQdb9u^j+bTM}bq?e-xjkI+m9T%9a8rTLb;gjL=< zlG}_|%=RNBX)~?f?2=H+N_Uw9+BCY}7StSZI(w>>9*@*d3jX(+o{-zkSKQG5$$_Wu z3cVgw6xqX3*EzGd_seRaVy049h&EyNVY$r5a$(2^9V=OIdtW5f2X{|eU8psu&c0|# zWopMG7DvfPQ0aa`4+C1Ref6iKnjLj2=RdF!JV4ej4uEA2!u|jpdgfFB-ts>7dnDQm zvO@s?z(>s(U`o5a1$?vK))`aAqz-0arZl@*445DLQ-u6Y7PY+>$SgAI$L7PaUlV@Z zv-DnbbUOy#@$jSH3*`z;C00g2-Zo%}{)#0AI865Twu}>6$hiiG>xz?eXV?84;m(FZ zdqxO`n(_t3TYzVWv-IsY>xm*UK;Ii^j`*#NiUyQX$L8rCfD>E6FHDrF<9450IR0^++JOjIzdlQiD3ibuQL%y~QFzSdRxJeXk@T>%=UXCe1%TZ1+j*Xif-|6?`K(vZy#bd=c4|%B>y$O0=bc3q;s|9_x zPR1H%SrI!d?Cw@yPya&)RdUjO0tb{pIP(-2c$0{A+-5N1rFd@WN$f#eU!GE3K*;Nm z>bF5?IgS^hz{9fb?Q*D4KDURL4u z#0e2fGzNtFJ?zZa_9h#x+mDBuLT$gnGphfaWXbOIUNf*TDXaOH$Qn&(Lc`dJ3CtmqbCm@caxUsOKw5k#v+(9513(I$bHd7=gQK2_*F%ekayZ;qSbw32iT=D)|jj@*@f_FZk_jAK- zCD{g!>yZz?*)1c`T!7`pD-6d$4wmvK0KZtH+h%rtzP{W4sq#l12Co1^G1(M|x=<(j zu%iHAFabrY?WejsgBiJyYLgBGS|5T__497tuVV^IoZQLz`PFq6;fSS|cOFq#ATd6d zNln+<57rjV(JHXo^Z})+s+8SduYI$1pR}>UHoqoBj~7DjZF@)NK<0BGl*?mS4mb<> zPU>Z1yI#6a``G3wSve1%HAag2;!E#Aq!KGYlwBram3yV z@KD&s*W+n`?%8$`l9o27Q-CR0@Z7P#;9hSAxrO6(_}ys!GEzCC{L0?JaKFotr!Dj? zPdKI7h|&mx;>&A&47J@NB-VG)F<9&qriQq_G$yv|Xo{?bc}3Nd0%~F6={IZwMCCRZ zssl)f>HGd>6(GSS=&Nu3@M&Sm?u9BsUIZ<%FSZEIAd%%qTX*C1wHC}|*5RQ`UtEqX zO`pA-IBi9}Ced2lk9m`}+`KS;#_VfQEE1Q{eN^{fbVQVt@(Qc~(})2#U=<<>5fhR? 
zrBopPC3ycd34nLmKC@sr|36g|iD$f%CHaJYKhK0{IYr;F({nWnVUv9p*4z-g8CeBr zGS+UjGP8}z-tf^JYJ{?D0F5X_gaf{?-rW=X@_^+uPLQP(p-uy=^l`&T0(2t7QWX|u zce5YT>3pw%4sV58tkb&uaW7Lzs}z z3X1yiMWPl+X!wB!P@6)e!3W8A9q)~)G|RyWtunZ(Enq}-G?zTZ#oSMDbmc*eAOho8 z9LlKzgBzEKq;j4Zwyh4Yd}~SGm@*{ z6jZ{1`n_mP78ORBgH3`n-XT5_VNw$VnzuGhF1Zq}BrwK}cuT&nfH3H2jsb1cRi7TW z!%k><$Tr=hx%(_^zx#`?*}r#(zQ)e+qbM!qn-jkTyb877!yxW9s5TVZ>;4W9t;+y@ zJHLm`o%<7CPZH7L2i+aa?1R!C)sf5}?8rK(j2@YpKJ62%`6fpMBr1!zY?PK)F^pqe zv14mW7o|$%g**fZzhCO*sI}K8fZ#ed`e5MFw-ZnrALt)FHu-+K10V@pZ2x0oleN&i zq%wT0kd}-g8M$n`%s?!^!N>qTEMYS6Prb%sz(W)K0~8#zFQz%Ny?7`BA%>i!vQs3r z)AG609sPRe3$O~8VC|PqT0sw#?ReHKSq&PLts(nQZ4ltA}aa|hf~(w2;ET`JDP8!h{tzRkS%*J0Tc_(%~Bp|~eY;{!Wj z@n0odfhVfo{aOFzV@6}Z=S>T%fhMS5JU}#vH{AmIUN6sQ=&Hi@cU9Kd?-k|}7t;8` z+XaIwlg$SGzUiWkJVWtbj_l3z%pAlnaie0NAZq3j;ueZ#n$caj#Xy|w1j5UtIkN{8 z5r8)EZ_=9cWRnDwD-%T`AL&0pW%U`@AbS&@Sj3mdYiB}Zz@ne)A2Rk;_s*!L?EnEd ziQ^@B^xTTyd4RfIFf;DYaH9W(|n|@zmdQ6bPcj6gsrDH zZ4RbB19*~s?BRlA2Z|C+zAuJRc36(MY4$B)+fxpRB4 zqe2dpsjLVG(nys#`FW}6ssHq~dC`^4s#fec?R-M%2H zH&hQUKIXB&HIt>gfYitq#rcKS7JoFFBIEH^czQnh_4$D zn~YBnNX@p7ILvNA3P@AL=rKZoZa_ypQDfWAIYma5!=#?fLawgN{%F)x-VhMms-W{Iuj}fn|^e3ehU&cWc@R z0EiZM^GNQ@piYd_zX>plm~ng!J7yRY!S~c>uKJa@eSC2n4VaI}=UA6)-TZ7LOTjK( z^Q)|vOAaGiI9=zE{zCLV7khZYX;2%MZcv(TVMeO0iLu&9Mo1-?wdNfnn9?G4BzE-O za*WdQxSrkHg2&|LanCORtf_khOmcNso z1&pyG?b8~5I)cYT%lbZae#8mY4S6CxZj&)|8q&P{e5iSAjN%fcw@cthI_WoOQppsb zE2?)YX%m5OFf6L9K1$2dx$&$92!QPolaD(R`mMQ;5osiA5VM+)nX{v4?QbkW#H~*K z9>;d)XY>g8u*wU+BEK<2Wkl${QEOrfbF^0sdRRXo=W~tx*D3ZOczrC38*6VE^V16d zhvpF4COiz^h^!z6Ht0kBGQ!@0lWMNdAG`Yf4!t#31=|YDyMcS}IZHe0=iE|gH`!BE zW$B$wMTZQZ^RirH?&}U^!zzc$ucHE{CYxV2TSxvJZ$o_E3&|UJHcNY4*VqJf5JlS_ZVlNwk}IMSg(bEF2N~~!J2eu<#y1%l|1r;_#c*Hd z%JO1v4&4`qfaY4^a0F!1>ZyHBQnKiqo_p6Bxfh!vBFNo8t^c0qX$ZQ8_G8eHfWDNxMl!yb7AulNac+Yag7-bvSO2(&+mX7W;!a7g56Tbmr{Zr37 zA#_BAtBtRm#%}Y?AWYaQUc_MrCZ(F`(T}=260qb=W80<2Q$25f00|1=uKbqqq|sD# z0jvQ56X*0BB?ntT7O?%C?jfHiRp_Fejhb8aH-_g!j6PVwWqU;$>SRrgwG??DXCz9{ z_4}O7YfKrP1nzUBzz4UA$|e>3-xYVCizaW|=N^W5$HQ0Rcuq;8C)QH-5y5`|rZ?PvKhgza?vm5(<<@0<8xb+e z?kDAZoEQWDuqV+gC3kNoRNFW~Kkh4;m~mr_;%cV_jqdh5%JrxjdmWWKGvy<2AX_0w z6L_I_qeF3_@h!D8^G_e^5&iXD0w)#{KgM;F>lmV3FQWFA56|!2EoCE>9;%q(M`4HS z?ncZIf%G$-O&%nf>lzLfR=+$JI7(&B?r6!e2j{NV?m9#z<+5I-;^CAc6OPl;16)u3 z$<_SkZ1Sez+?hlUfJR`Fo2mQHeeE+3KntLjoz}KgLoY^e4}Nw}$~nh5e-PTXoSEX3 zPad~0RD>U@B@YF6{6C3{lJc_s87j3ZwF|S`YiF{2*#9YodJ2#mmP3|W(ahJ zD{P2GZq~wka8;~Sj4&qt4dk1np>)@Az82IUku+QRDtt>d`#mrfd{E@mzNIy&l*w@@ zv+;N}tKhB%>rLHv1(qYWP{^Gu|35FSNbp-Gb`STrOzakpi%oBC_Iu!D<}kd?p9uy;cG)V+3_LC=18@+gv=|k6;7}Vyv00VWOytVT9El>XeSFk8khT^AWt-Xh9}-_ z)*mQ51Aw`CNFW?GT$p+7MCa`TsD$^3nhjo&0ys`zfP6+BNaPg^lDxX)Kjm!nBa^P$ zb~Z0@gOJZhCV49TJzP5(;6#SLPbwz}cM7zik6+>@c@J-dmg)dSdY0cJw)|eG00kcx69cF7H#cfAU=trPeM#onZJ0Ce~Ddl$>Ctm7>}M`O@3xK z(`oCMU-9Dk_gb^D5=yD@4?d8CrHNcsa*Tfdkw^65@> zs}Pmfv*k4zB~-Mu_I$X0yQiRn3q=Aj(eU>6wL5}mI8UtM>*;;YHQYRb8O?CJn~fJB zY|oy}v4!ib;jIf!Z@L-6Mp+UstR5$*;_ShamCH_Pi4IVC80-Cr#SyhvdF4)&soXEt zucK1;gttmX7h*K^1^5o+&=1tLCeqp(F$9l*)-Yp}WaM`{9*sa%(gOse`Gy zhZV&H%XJC&k5x6#k@YFFDB8B?;xtJ_xQX^JHng!YN17i~CRBErrY4HKx(Cc(!Y-RM zhU$hSGV!)x>%&-+4t<43vuxvvt61eC&Wd6s?5B+>%q1U3lQBHd#8xH_l6Y{7Z{0BF zb*RKWb=#5{qqWssYGOh9N*yyy^IP^-jwkCuhXum5kOl_Xc=Gm?jIXj)A8&(!6q4T> z$m~kOOzVWLdcHoeg=;)kmghBm-Z8l@yeq>11}V@jMnxRo@tuq?&S`k!)jcDg!ypu! 
zIx;ddGD1Bflfy>p+S_wfKnG&<(`gfP%a7v~GW=;!Kv4S!NcbG~s$eKPyW&pAEc_^^ zUpAf$Oz`fBSzW)hJkEFcQf6MJ+FL?Ekr^moWBAZnoePYYQl@X`KhqWaiYyv^F>JCs z@;aEkPjRq1nlz`MZoqZ2Sr1_vITZ;*iorIoHGhZ8OC8WH zf$EL&q=4BTmwv10UVboHH%;Ud5h|Pt9_M4Qw~gH0YCoS%Tx7a4XE2ygZ8&ppeZmto zkF6ZI;pNfZ;cZ#(kVEV4)Z#!lsmKN?MMy&S)e%#MhYPC(L~YG)Z<)D~q(x7DV4-xLP^UCcmWY_qhUon-9`SVH@00v5QvZK{1&ik zd-GFA%VqpT(geaSh6t3Qywr}1o7Dm-^^5R&-=2baSNT(TMoqdWDNus1XWfNw9*Bq_ zuzlkkj`$z?32uZglZU z(y_5EZ+cD5ZRthhrmM^NmaNy6m3dKzN36BZ3O4U|6)`4ktCf@=niCN}-1%`nK9x>? zz>)Y?cw(3bvECU5`uEl!s@&$WSWzr?uQ~rvEQ3wF3pM49^IxoRs?--w)=hGwlms zY@0Wt>yjD4h>@y9PIcZiTTT#$5+1C?NPq6x#f^u`)S*UGQ2dWH`xHyz-@mD`w0{4R zIp^i7lc79nAuw$uq3}r3Vzz=*K`N4DME?lD&k3$y1_PR;cjsz*9Rr}BNyisDF|Xea z_s8$*(c*rxcPC#^Fs$*4voRlo*z;Yxy!9>OhdHuozd6_XD1IFoo?k3>AR6Ev}n{ zAqMua?+md>G5y0LAkYt#K<5;)oL~Ma1yOtDF3YJp$UDR=q^^YwZ#u~4UA-G10fc8Z z(YK*s?Z09ZoQM75q_9CiivpB-?$rqXZ(C31mEL0+5RrhMk97NZlCaU=LK8r1-4t55X+*o|s+Nt6 z{vC=aX${xFY>2d)!80SFDYdsBc)kc9ndCE~Yg&v=QRdFNV+7pLZ#NP3}oJ>{z5cQH`yX!fka?>T*u0TDTu-jJHpv!4l{rW0ply+T{GX zIs%V#dS_M>ccQn;+j6nt!B9fB?uvuaIt)46(tKCST@=~b8)vySNf^AC#_LPbQSaVV zQt)hE#onE&pJYdZA-YvN5Qov}RhdZ}gxHS`O@5-L(3*a8*v+Ues_W3uuoL|8O`v9d zp?$8ubE=aJUsE2FZH}Ij;raDWbgZuuC6iXUhSZWsXjmguWhbz8KLky+KTQ&W9 zUA`ry*+psvZmOe4)^Uio5Htgz?b&M2Ky!Zvum;^6tNqstoGEg{fVstF^pDQ`DK2af zElNI5#?IuU#>ZsF=xr#ez*I++QFbSzDSD zjsrM1>RX`VjD}`kOQ6i#5;qtOuXmuuB%lBb2kdF~wSb9RR}p045eBGS*`5mn7%;%5 z9TXp@@aexR!VK(j{p_w-p4-pzBLwP~)czWl7;xx7k9y29?1u(RGg+QHz7e*&Oa5I! zuu@cXa7uoN630B6H?RXF6gE%1zFrs)?|oWLeEns}PT@)*&yvjxId+F@R%TIho7ZHZ zYS0&23+H>=?c2uU>f~r;R?dz|4Ne?u~tN+xz+W#BFF=$$nl&qAwMD zX*}8z^k2od+f6n%}ugdL4LY< zP?`ZA_ZL|}Zq3&ZY2iE?hRPv0&${0=(#!*LL_^=u&U220DhEPbW20{IXMcb^y;E77 zpUZ&d)uggSdv!9` zWm9c9lP!82ay~1S)b1S|Z-BRlT!bt<{{1}M|L1ua@+mHI#>c&L(r|mFkKslHG#{xq zN&uY9dk9uU;osjdE`jQg6$578w?0RjR|UsM#9I+2=S?YCTid<|3$z7_FNG7*Xn{e+ zL?9f0zFVEfo+CAV*K>FpK2l~9whpkn1^@ZI;Sqy&x}lR$190>w6a$-)z=}w zUiaIWMJprPyCgXh>g=f<6!%J(YIQ8SR;c()FCy&nvr}z_caubX>AC?cCV=H?B#rH! zZ>HEdr<$RBaY6P`4{K)p0aNw-sxj+7uTF#Nl!)#E7Z*p|zzCsTmq@Bcq>UDy^BL7c zCqB&?w7RmIJRHiQl)1`DhT%2|m^;#t{zqwv8CO6JJJt9&J2pCxq%*MK`z?FtT1`^N zh-LBe*aDID!?wl{6Mq)G)np=SxH-16m9 zukJxkj&X#5r&X8={kePD(ACMfnjm%RC%M1 z?b&hVom34w@Ek>> z^e+kiZXXidNazj-|k?Fz&)HxU@yFNVnRsS&L8Y3%mW#IKD1IA7mP z>P69lCB9I=+-_nTu4Jj&Ic#c$lc7F*mhwDPZm5V2JL)YfL&8c$KTZ+gSIa@cdR7Fl z3f}sW0u;4F**Op_VVH6^Q{ozYzP^7iZkeE#V>A0Lw;BX;7{H_1aMOL{5>eqnjeH%y zy}cuRp~%nAs$yqkX^@TNxX$RDIayp>OPomx636|h1yvl7Juqo=43U*+Qq%5|r5P6a zFfk(kdB(7lKa1$L=cj~8K_{Vrl&vFQ2F3l8WXJ4vQB8QgZ}xV`?4$^idt9O|8_`M>3zQdnZbkDT zR(+kmY3;#+2|gOqGIl@TOX$hxA3Iiv2TPv=SoM7@T=DT-O)a>%Xod%&RIv`*R6xi7 z)(3F?x%HGutkV#hUt#NdQPRVfCUNu>2e)QDx6<>mZ)BRoA0SKA&%P5m{#k+w7&*et z>aCwvJ#x!OXrE*r6|@_3(cVNnv~w@#B9_52Mz`Dj6`p|X{CL@*kqWhLi3H?JOi`zu z*;L_-T6W!;nnb7rN_GrOjp04Gm&1u#gb>=)mtrR7oIMMSG=wbj*J1>_4@Bk$b#E>e zn|r)Y2f%7#u6d_5EX~!(u6;GKm*fojG5SdTc%c$b(Ke}<#&6AiGA8h-&lkyhr-2O$ z-+6KRQu)TVlsiLFrJmJ<$L1R5pF`8lbzo>J{$D$`j3pv>Cd{cF^sgJ?tk z>pQxeaeC>qh;P_mzNxR7^?}>tn|HjVMv)`tvj_eFh3n@}Je}Ck1D>f>rH6+{VZ~!o zZng;VA7;8YoM;}-g9HdPpikC#9QwjwKWY8g_EEt;W-sbJ-&Y#lbr@vuZuhlk`eyd! 
zQ6DWZW&(r8!jpd=G?jVeU6G%nrnP674P^2D?QD(NzR~<@;%maZd@l;Q&y7{?33ORQ zI~Fk#lq*k}EC>v4M9K948r;Q1_a`MIQ_11lk!CYqB1^Lzml=N7y_!}~K9A?q?yatB zd=Z@DhgHA++*$SRl=BP1ouT+POlyq1J&t;Y1rUpU%Gln?n=T!#@jqH~>%!kWz24Hz z!zY>1pbt%|3dk$kiP%RM&1uRX-%ofX>JOn6&Q!}Tx`Tm zyOzo^$5#P*3s>!^Ash;2Y~!4^nb90hER{9ORvj)1DwQn+`hov_2G_rx!=*;5jfe;z z2eDq;++XO zhm|Ua*Yoel#Jw!y&NG{iv|`{@<}QC0X?6RV4gmSyx--d*M{`%C$OXCNiWCEI>%Qnu zYDarwlUdoZ57kyLv{o;4gaK&yL;p!m|4E@uPCSm-haL4piah=7ycTGrz`1tNg|1<; z_mDGg*QS`Zf-;{dUX51$2Po)w3()+x7uv;=eozqNkKzA<++Gzlk)g-1qU7+V5h#}n zT*V~Z;Y#-1Ecjq{xUk_aYmc*d||iyR3~&2qK=#S)J%K zK4B>uQ-^5PXR~RB8#&D!y|y^(7x8IlhBlr%?By)3A=Bm zbh$+yDR7$H7fH|EC;@=aUbtC=Q$x8FY6eWEP1U~GvW;BjMkadRpoTM|BMh^-`R3bv z6jZ9;5Czq3O>FuhiE=cKd&f;`!i3Ps(=mLO(UyJV^AEkDZqEfr>KU{LmLyHqQ-eQ! z**a((-+O%84}^}-_e$@zFEi+LD<}tv3P|*F#n>Ib|He0m+^oFocc|DH`1(~c(^C!R zWSTvz?U3Ro6Xc$VwVR zxL94MXm)WaWJoHGA(Vd2z93I?bw#pbZ6zaH} zZrAXk?t4_8|CS;FF|K0&ZSlfh&cp|L?Dw)6 zeUo&du!X^^WE_O`Vrcuy&1wZ&iK3041Daxb*@k4Zcjj>snY@QFzGU9RrpPi2qE!9l z$ksitBe{2OoGlAGAQibOPa!;%YZ2wT z@%%;ALYjhNL;CFN=OLWDuZU>HkIURkCvT^vo}cS~AI%cqjr2=1=zT7Df&R&Sn98Hew9(N%YeW36N)%G6jFABo6|1Gp=DOb8B-F{bs)TDrXlS0bnm_+4zaqoS>vEjC ze3QNBD2H!a+vcN0sJUwQ#e> zDjRaIS*+fWCs$Kz)X~vC2}@9E)b<&0RW007?5$u|`(iU&MF#FRcfJFA6;$;Y3Q}7o z7Apnz28XbuUD_L3TFULKSC~gIo^XR2r?WCesY)&Nbug*cYN@`kwwa~Dq5e>!mxu{r z;wt!Q5Ry}n8>k6dnyII`C2Jn#bxPMs1$Y|NNXF6(`qUnTw=!*@*)1#1PjmCQBDljo z(7$L(^Eml+7TwSOvUal1seRq<^gZ_a&V|o(yC;0#b8+;m`aKq_yr0v1YtNM5zGsj5 z>W|DlGUnT77VL91qL)WGTS)6yC|cB3IrFuFU-V5y^`)g7%`=2Nx}%D4Z{^h*sDc-B3LcG*`yXoY`VH@7D#Qj*@En zv`;tS(xvRdJM-^A5MKLd)m~g87ivus9&wIeP8Iizt-q-2%fV7#)LHsDp=;XoXGzBB z+csRUpAE^ywsL_|aZ-spl&Bs$Id9tS0ZAXxg590x(h0Gd)Sy`-3wGIS_7x0oawTXM|YDEdNl8gqj-=&7AAFMa)g;->XS4H)(Wo_~nJgj)o zdt9)`7Uc~94Lf}q_p=uHs#d)n1Ek$O7i1p#bnaZ2(2k7|5D+BrEzREYFzo*{^R%~) zp<3Q+DT>U`=~eJUwwy!bg8j9wL$vzYo!Y1`CHp_wK|<;N^gxi$kFiXDmEkS;Y5t{PIiO?wJGP(Fv zAon0?nIR6C!hJ9zh@2mIIxM2beDYk#;IkcHts)uYTg@{0S7d?WA|Hfb7D!u6k#Ik%Z@q*%RPH!FKjQo z$#Yl^w4Q&$mxMKU%AQ&({@fPQ9kCo4t0Pz3Gvu;_L0()frNhJC^F2GhRqzLh6)^6W zu{-+-ASiNdh_-KEV#{v@0dL<$Jn4Hqo~FHM*}HB)L*lyx+od6K7hb3fx>}6(+lB+I zr7ZcQML_KyFt~nNGrgMn{ypp>QdVmV8cKdhG0i>A6Q_-H5iAf5ePohy*NNFuOCgE3 zPC|HLs49$-+7aA+cCHQoI##LkgLDEu4~&Gu{yY?S5|F0Q^1g!60-m4fgGyseb9hSGzmo$|KS?!>v3 zr{k|l`y%QnUzc6QvNb@w zK!@)%SWSnsE$%osPsWBa?!Wki+-8M28p;!npLY7M(L^)^ibBJ^AK>k2WOF@8Y59~-o19k=8P^r}s zVck#=7-9zc5FjCs=2aQt{$s9h{gx2C0Vg*L8ZT=|$-Qb;rZ^ zW;td>HaT(Es_C6t3P#*qZi^~2nkRaX2y1O;<4=u+Z)pyy{Csk~rj!%-O)5V-u6rWI zfJj>=1^4-4_+oZ;1^;|nWlT5Wl)Sy)o< zE)-x4m~2`%ZSWXJ(uzlVV(+jG#nKiG-AGY3rvor2|G!!RcR@fpnSgfOVZU(KY!I?= z4gZp-ZYN_`ysb$cB0YTCQMdt28GeNiQJlolG@bz1r~>7G!ceJnb*I(ag~JPAz@)tri3 zW{fmU;wGXw&~JBqZx|Blk{}s$-5wc`$Uh~=>ik`t=6E-Dk%Y-(QsU@Lt0PGIMzl$; z)zgp84=VO)G1YEX%l5cJ0loPWQgvI($!spB(X-Lt)nAX7z?ARx9Nv4lnA52`GTGZ7 zndBeH6E?xUBKX$!z!-YU#zcC(4MA}{elOgnVu(j^Pe1_b z7F|GrzKX$7d0n`&qwC?Y-hLM<_nTRLWwd}$0!$D>zxtE?@gWb*`K=NgLxPFa}qUSCHwKB8JCl znbe6_=A%W9{HNS|^()#UMJ$rK~f$ghDTt}R>o@{T+V zf`2EI&|J$?T1i;rEaUiCmf4j%BG0cK1Es2^%OAU}j&zo24|}cSdH#6v7txC0FF`h4 zS~ayIMv(k258xNUCmL=N^?pk|zi0`(+4cG@PeCBh$XGT*z&&~K+`B9D1^1^8*Z0k+_r8GFTq{TnR=z9}o>_VDlgocqdDr&^ZNi0D?U(FlFb_Lq*3l z^g!x?rxZA?+qSSD@xZoA7V6PNe&E-zp!R~yu3i5SejbOPWECj?E-p&$EG|%-v!YZ^ z$)GS}=A{FNuDgVG7;}ey21ah5CNj*eh}EpxiLi^g+tbkyb9@;Daj?I?Bp=QaDDC`; z2KA1=9Lf0UU5*^1t!H1i>u2hgEt4X%H#tG=ZhX`FANGDE3R;f6vF(n}NT&auh*$`h zSG%tjp5xWiYN{_mw|88;FdF~pFqQG*Tz~ox5S1*0fmd5=K9c6Syg(--0l|bn;VPRn zjtPYc+&d(wP$S=@d!St69y zZ1MFQ8LtLpgYr~m73d(Up6BCLJnUzZ%=$up2ip)bESu88*8nRw#E+SM7x~AGq{}BQ zs7n4ldl6dLuQA;OF*-f%C^=OB-98rYunhH|$EcU+=LgJtkL&66a(EKPRB@kue)U?e 
zMcoNU#%@Juy5PfHplmEpf2!|K7Mqwa{j@ZXCRZ7fsO&tc*hqJYL@c9POrtSdiv?c0 zzPaqVW$!+Jmx7(-cGG2A=P_h%as$y7pm0$%`y1o_J6_3`3>wRTa^l0De3mzlBGevX zjz?1f*uB(cxcM*}gt3tT&KCTj+mlrB+ntbWX(@WdAd)CSyVH}}Jr*x1Wx!qOO%>$Y zmgdJX!&E9cZY#YrOISe&zMDuD&(C)`Xw3vperrvYW^v}r&Gi19LQ@&OO_s^T`OCG| zj4&+g{`B^*&lbKmp~waFGMuIE&`cGBW7w_m?ThIQYBG8;>=U7bxMZ6 zwE-H}BT}R)627Gb`}r(4xgY#~D524Htf#zG~?IkS793iTz2Df;?}+0F|dH`&Eu& zU`z9Nr6dq?IUlxUv)3yLlQ)`fs&p_hES4O}^d9BDO|@%cSoZb1ty*?M>7$lM1d_Sg zMXYQEcO{Yk(noMY8H(?`)&`>NsIWsNz^Vg=rTK76U>dNGn+s+MW?iMEP!_$=4mEU# zy>-12zmz8=UKFVu7{C1>FEURO$-7whW-R z3iJhl@RRBu8LjX+ACGFoHA&bY)O?~1$(@VK95+qx3m&F^tiy;*Ndhp%VUEl^nJcau zp93iDe;;B=Z6gWu0Ys01cbqOro=g_0Pp*|DY&yK2_f7WCgPbG@dDFoljqC{1@^AD` zw0YU_ZLNSdl@_=K0P6d{Z`uOzBV8T?ej&mzdS4pjAOAei`|}%RGPi%90*`c>{r>8b zRdSf*m8*$H(z7fmSj$-$c#;qPdFO;k?yK1o!8GtJ3BCbGIRE(LC#?0bF6bZ|$$h2v zeHi$+$n$rgm`Lc)b4-Ll@~F5D=A) zbVQ1wSEbiL=paErq)Qj+pdd;|s-S@M{bt-}pR@0I=e>8|{XWQASy{=NYt8ombBys1 z-DN^G;693cOzCZ6xejo>DarJv@>jeY^+FGW0 z%z3|hK$TC?ey!SoqbVpc{oX3Qi5@$ArXH`J7Qr+cZo+@T|KE}OzgN)z>wPL{g+c6j z1^@B{8lHF6+U*PsIE-R|P zjMM;0QVH+?eIf}zlnwoE*IDE~`1Gcik;svlTBZ=iwaf3^i%TS5HZpJpK!7uXt-FD$ zpUgay>)<2mfJoS0W$TJ=PJ?vI^%hL!0q>Wv$**L7a>ekKjibD3+5QV*Eot|{s6Y79 z&Nm8tSEo%qx2lc=xmZh|8*=0UYrxnmW64n(HqQnk&0ZS&E{iUX-BVH7;cd70Xo42;yKySMi`;VRbOn5?wm$>VD*K%qbxdZHX6~l!l z@x4k?J-_aX4bY<$h|hI$0)fp!5_f#Evm8<|0$f%Z8xdGcQz|lad!bO?MP19a0W>%- zVMi$L#?0tFCfnE~-z(WI^M)w7`Q3Rzw$w)Mm_5<>6@+6LY-g^CmR=3oi)C{sZy=v| z`$E}|Xe-Nv=0L9m-=kThoHxmASEgef zPu>96sZ3eb<=Uju+(EMDq64p;ie5`iVGEU;!=}y#!~_=-oHkwG{}6J78V#Ni?tF^5 zGvZcwrZUQ>St9mg$l_W(E|u$fmBMYt*!IdAHXQ*?e^J_^GEZ4Y_s?dj=UG&-FE>f{ zI%N{QVBHfh7na!e@(N=*?vQEhcM8;Di=0I;uHqPnCQT7`Zgiyd=xbIJn`?t6y0u|J zbO~vNUUDoLMcVHJr~WHVp@S@= zDl7dk)TNlSrs6P} zBBzV~-H2ZGG92H%1`~AEHPc1$lU1?H0-uqCEKyoBJ+S_X40&@0LxSaSz)a_7#~!$P zm*KW#4JFYI;_CxE66NgYnsbJ5x?#*PSGcK>tMafGSoOc297mNGE+eRB&{OUkWdh5Y z{W-utr#r$Nie1xOpzGwCGtpD%>q2Tx0hvra4ZpJcP_|C#oAi){N3}U=4o}h9ZtguI zvu#>*HX?=duWi>1^U9en#i0xZ*-7D=oL*HUTl&Q>q?E-)D`{hkwPUZ-@~-2uSIszM zpM6h%G(}mm!f{cdd+DBXqHn4e&PWffz$J9g!?dny?55VWsO-Kb&;E}6l+_rT1rw(W zI@>{%YoWBR{frgeStj0A`Gi}sbgQvvO^+0*SbDJM4j!%>Ga|N~CYM9)&7y>O2lt#M z)eWyH>-<`7fa^6jtm{;I*jZMZ_$x&*c^2(-KGyfid|NVEzc7+ttROCul(Zq=Dy*tg z402jG+jYeixM)4^WA2$g_Z`VJLioYWFtZlkopW%r>rGwATNRu6Y#LM1nEb+irtO}3 z&VuJ5*bXAe;4y}?v=*yJ(?zNWbNQbR*t=XKA3;sJnb;b?J0&u4E+(F5fd7!RYe~lV zH}RHjd!hFL7M^Hk8{rl(n`&NUoAq7vDsviL`Haauo$}jgJ9@NfF14dR7aALNgQNL1 zk_U6?`OLlxJ7qoldlBeb33ehF6Uv-?LTUM0I#jO!bc?NF3?C z%Gpw=_VrqB?mH;j9+9IpY1&VZ*yABEu*t0FOX)!lLU;UzXQs_WlQ(D}&xeGP7f+pb zACwA(I^5j~@J+oxQY5p>@wNID8Dx*8;8*dS6yq=Z)$jUaexGFA4h*6aKAK-?`Gdfu zkt@S&5hwn=Fud7M_#oM%K#exJRMq@bgaJoz^gx0)ePy{b|K?WtLSyKw1T5S?qNj3z zG)1-NN6V}Q;@9`kmdi2e@KuJ?>mQ?FQYz9si)n|_>)a+EQg;u{9p?{i{X*J zeqA!VU3otI`H@gY?1%DX@isZc!7mmwnu=#hF9fe=pgojx7ScRNbx}hekaj@?xetK{ z(DgZJ z(C7_WK@%PtE9T~`R^kb#aPG{nn}J?Cl`B7vt}ypXeDIyzS2=#B(0O^_ui7LyT zG~SrgYn-t&B$GRX*jG;+o_{z`(=%BV>ZSZhkz=>8xDBLePaYK^MZ+cGPt=wmdjmj!lb~U>e=#HMsOSBu%1to2Rv??<(7J6hk(+1qt~m>Tymwh{3tPCUh9# z0N-Dr{@SYi`FK@LAvDMb*8$QZC!n*e+z3_y`nL^*|Epjly zNll4c9|b0C<9=i)n6~j#Hz{OkU8@vXe}Ngb8~saD!HB`OryAii?w3X(QL=BFIILs) z`v!gB7l4TPzNNO30uVB1c~9{TD}>_Nw~T}s_`vuI zR>c6M*Eu0$t@uki=A_{-Q#uuy^t%LD*LZN-Dz$*&0GE{g+~0T4GA5ylFW6K#?03m@ zmVOt_k*UC4KQEy)!;u4&abjIX{E%!|3DT6~D*zT}Z$G&W@Uq0eUR0=))?&~f+_xA3 z5*$Wu_;tKYK*yj1dL*9nV87*mA{XR@GDF6tes7_Ex$i;jpb}K!?+of+T#A7wHyn^m z=-`HQ-+nNj@W@(i0<@X`@i1H*-1uo+Lpz$Mc;BWzzg$}1eP<*_o;gu1Q3)zt2F@y3 zsiLI7m^edxjO9h9kzpmg`R1R2kobMMKPFBGdV%GkL3D#fsW%*9VmX7h zNUKg@6##LZB=f0>26W%JV7~(_*z$f;kH;)HF(SgW$Fs;NU67Esp1QEkL?aD3XmYuV zs)&!D^x6VODrgO;@#kcW5{~`d+5`3_Pz&XvQe!VWHP8{ba10-m^U_#Sx)L84N&hkn 
zk)y}LdOA1GIeI3kwg|Oghbjn5Y$_QgVzIM|Tf-Y@_1z`JGo2f5hxUqlP;;z4?9JZ7D`FF@{9X_pLJpSzlyFQ`j9%;iw@-z1*oIb>)F;kzIp399` z2MVhAmV9^Gz86a*ERSgeZb8m9` zdydJPT)8anUHg}za}+(MMl84L?^ckT_kxTh(-HlZB`Bky!}QaZBvnRvPsJNP4Ul*G zPw`u{rf5`*XEM)YEWbmuq3!GE^vFV-+U9SNe~*u?Y~sR8@XTEL^+2@=YNZnQDQ>h@)0^SSI$5BTqYAUSq-n*aoIl#uJ?UW|L>gn z)=p>_`xs?s(&zb!n}4>CLf&K#_lvQkug%(MG9M^#X|KxLeqm(Uo(oEI>S-$XdzH1F zQdwW`xvh2Z^~sC1U&K?^$lD#r!3q_gw4f#@x>dO)qUU0}3>cqN&ZQl@*Q9f@zqo`b z+WX;y@2QC$y}&SV9cvaVZ7AE&S|z}0v3;0#YaG3MP%zI>5T#u531K3s$E9lD2Nxx` z9{z-`&qrQGX5((2OpQ=Gc4;R=;-_7D=7WA?7c7>cz~l>J<36u4&h2lRGHHf3Z#xip zPM-T^-<=QDKJNzPtMcYh1~r{mqm=L7^5P7!O^RvbbsDJH6&e#6+Fh20V6(d>}w#1g*5~1x!YmFmo62*^FtwM=e!U9cC-p*vlRHq)~Wfn1d@>Y=1w4lFXV)u@{)`*$(Jm@6W!BR-<}^ z1)GrdblPi`SWB-`{?egA#t5ke9C)1fxFcq7`bYny*D5gyby(oI6_z}E{l!IbFNH8>pDEOihKwsDbf4UkOELaWv2+`f|ZTF(1&N_$6PLPAT%px4eX za?ZOCy>?WpPVL<}A4G|e@-aC&5dE5*Z1!+~OUDHaSA2nTS#U7lEa`QPqk)u<@saCc zwHiU&rCe#=VQCxe+wjornIA8q!eOK=ninH zAMB7$bS&_5@N&}`JP7b-i%h6edlXqD@IpbC{~aY}?`QD`6FLIMFHc$Z14x>t(S{5^J>D^hV-@t zL_vTwHmQB=p|m=37f z`d#V1{;gXQNPN*p1ev1moN7g8dvt$1F{T&GdFaKh9z#MG!=ZNsHiC@Jz9JHk-GbJ7$N+O(8CyNoij9MKRBvM}`TuM*=JY z^mkqQ7&JX*dA0NYCn`77j;l@=6ea?B?nJySceLDdnCC)B6mnf{ZC*}{9%Ww9c7OH;YQ|w)tGN|$7@SvYn#a^YGzAA-XZTG9C^%y(?T4Gs5 zrf%}p7#TMSYiW{qahr-(y81f!u$rUTSU4RVz|Q#h?JnFqo!GkG*qzybQ6}s>AuIWya)DA z&FnAr$buZ(QG*W2!eFQMKkje+>?NJmaPVS1j)ay#@<%)t!nmf1G_7VB29bBM)BWV7 z{EwEC9syto%kP%PUZib_$DR>@?0?!gVkT?CiIPnD@4-~qSY5Y*Jo^oYgs-XAINB!* zHLrae4Br0IcSIcg>;&E^Nrt%-AL^B!>MtA#?;YvJFe*EM_b^~)#b;`Yqwrvqg6k`Y6kx~h45S>bS2`8i+54y>*!x8clg=NzGoo2u8$AXK$C zH@9BW8N+FWl;Z)Oy6p-M7R&(Hf74k;%b*>C1c&xbBR(LV1*RxB9CAb5T0+8~mk5=j z9Y?$Ly1)ETvAuWWyNiz zv9R4*g>Cjzg)g-_p6W94Fh?QZRe;FDJ1WxPX%h1n(eavqNXOP;I__5K-FGK*My`jO z?5(tV*R5HFi6~!RChK_L(NDU?+4RHJ9S-bMzBdI57&N|A?zK_5c_<~gqQu$V4PRl0$b{!a>Eq(+ z!x~jxwqNct>wGe-zM`j!Qr6t>S9JJR{d!x$f@4(zJ8kbNEyN~lJDT;p`YcDexnAgP zbf0II{oOdD7ya9Jz0s~OlHaV>))9|ADdrw?N9)r50LrZrM?O^XU7aHh6z6)MaH6A{ z+Xpg-UTXI112INkeDZvEt=LJ|^fikqtg0bf>Ec|of{g4BR-S%vC*9qFjeXp)cyDrk z=4<_{U(ZvVzh=qf9;L4r`#1)+zJomZvv_TRn@;b$iaFxhfAQH8(x|ntgWHfuc5r^cOQ(Z0w%t?P#vZ zf?}V>VxPCZ&*EHjLJky1_Zyh&*Nzx|K{{Ld6iESaL9{BaC8fP=`qHKKkvI2kl$XpE zo%!Ne)~$McOj}Fguu9)dMuCN^qcZS3M~y!vm)m}oM z2emqrhWRxQxyD!zDm#`+^y+_Li@e8!Xo+)QXx}5>BupAN_uS;MNn__S z-+xH%#vT02J?@>M%U;H=zBG@8ZrQ+5^a)|8A! 
zgH}4h$B7Qrn(p$6%#PtiNnf>lM=Q^&R%Dx0jMb3UuIXMgy|MA)tHSj1JwEHIca+Z& zCNR~@5c}?Pp7qR@H@08w>pu{bax9!>P~ZCjR^jNv=87@Wn{?al1?i4*eqU(y<>N-B z!X~FAIWL%njeZf`vFN}Hqp-QU7r*7tReABJF4s!R@RJ3L zIT*gh$&La38&!2rLUuYeq`cTVp0}Qi#t!=w4x}GRCGEetM{72dzIduaaK}E9`z2l^ zpML6%!)(AG(fY&H`0jH2a;9fhr(7qpS6ek8>Qugk9A2aXAB6$$I~<<44Ee;HuWUrK zpSV?2iZXa2?x-KYL_v$65ZGYY0GouB9H-wIJBByt^nq9qrh-i}LRBo_;Cot3f$wku z{yq1D{63TeHbGEyjpZYu!n$GHD$IipCC$aLAN zF~}e_V2Yy#X|-UKm3*Ps)fX-0Fl{K%D`gL7oGm54=7X<=8cu*j&;=Biy)07V z1E&M~|2%57gA_^(nGnXfO@dJ_6#qpI&JhHLl?gqFMjNPUtJ%XBzgXCRn$l1jwjBvk zi8AOIBg8z?&IovveYk7j{W@VFobcOrAVu*d3OsVM*^7L6FD8LbFk(ofX8N|A@f6~S z?EPNTOpo0*f`cl*fBOqgy2G`A*ck9bN{}zn{wr`9(oX_k10R0b zA9mLM!&LbhpZ`Lc9=vV*c_m8q>?aovn$8+lr=%sf@B>Er@%FfAuyYCs=IMl|0QveQ zCAtfNY&|!m8(`$RH8my_Z-Uqc+$0l|uBLj?XjPpWZouZh$$UX;;y})&S^C!9;;L#` z^ss}tf4s`5VoE(vM)P~-k~ipYIZJfmsToYkAelyv@llo!>6SV1mVcg+*hl2>CE7%0 z@dl&oIn9k;DaoSXMHN-RDSwHB?RV2r01OlfR06#kt~nOrRC-W5JLb;j=i$vYwUy4T zoAOF|LwD@)@l=T&|2nF@c=7^fZcTb^=7*2!IdfNTGv2>^X6jl@CVOzA$vx#ur*wi6 z3U-U9N2HbQq*{+#J{@CipG&IVRC|hDs~^GMAHTLtD&Wcs<|03c)@(U$7{Nn5{3yDz zlgpXhct-2vyShNrEuI{c1)5=6FGejlQ*Gux_11^0h!#(Xh1})3wxb8Nl}Rea9l0)Q zO{=T#sp@T3N33|-sxj52Pb3(zQ?f5}uh>PelU=$~ON(reZYTNQwv)H?G4yrBy~r~- z;9k;x>8xz8&?z4w^yho@FMHo5#*qDckqO0B>RJ2jLB6kX{H4FD2z|h_Su9m1&ugc~ zG~8N}dBzS_j1c+R)_?h$Q6QF*6=GY&ALN1WTLY5kq7*4^n#l8Nm-YF>wu@@!3Up~d zX?)V`EmRmURCqVo<|qiQq&%g~@VJcOUI)cSd0#a0B3&%wR7Qai#+o(P<@scM9oIe& zB_6FeC2A$O&7`?y1+8m7Ii&q~DO&s05w4hoZ4;_JDIaOIXq%4X3Z^nG|lFajp zm!UJ1zDDuu8}t&yT0)b?j1}Z=xR@up_KVfhX7)-84nN>;A@8rSmGj^b)h}LXEM91m zm(NQ7ID)Oo7VRBTwH+ZvYzbZPz*c3;-&lbMTc469*3g{)7>=(Xrdm_lo0Ch@D{sVk z;7W%Joc%^c$7b#O*<@M<$n70P#_N6Dbf9;10yJ5JC+ zp6>)uD4!)xC`s38my|lPQ|yQbw|MPz2GKA1+(YeXFV;lE5|x1&kSgma5M;p6b7`H0 zmr9|>h<5alHtFnf59#&=N}#6CiQ!W!%zBi`O|qhJYz_Sf@` zk2`LkY{wYZxizrcM6!O1|62+94X5)h5~t%;iV&yI zrg!*?dut@@v|FH@>_~1mC=w3CUy*t0Mm#YbGLdPwfU`Q_KM*HkY(eoD!TQG$QZlZK z#@FWxhjGxs$(9*3IXf<&YwyqMqr`Uey1mCcpXepC?=-8;xH(g&IknIjtUQbith@(m zMpYJxxV4*PUpdE1Zqw-2VcC7KryuDC-h1katldQ3b%4{so=!)6*RzCZgP1jgS%`sj zrZDgtGj6|4zG01sI2a$PW6BV0m-+GuG1d->h;~&X%nX-#S4ntRSvcCSDRg}Qq~+m= zo3*P|qU@w%xQ~ZrhUI-ODpI|woO3ryY2SomkG_ExM)H>2n+v&b%B2p3U2RQ%*Z*cR z-Q>m97@o!aDqHi5C+?iV-)SmYilWNQ<#vsQ2njuyFt{igHQ8#`kjTq=Kly_7v&3LZ zDcVeSW6iPPmhuS8i^76RgmD9l2-ES>!4E>}nj#X#RtTPL>|Jh_jVC`BEuNU$&^vII z<8F!i^7gY*6z~2Za zyLy;?DXIku?B4#LbTQpZ8O?OQL9_}e_LWZ{SOWzNfB)r=?*QCrHSwKomSb5S_5Ia_BWaCzizUn%Rz<4XACVj zZ|+F|yH@o5g5cX8ttd`hTNa#v*pIwP!4HIzm{PY(o^FiRR-K0lbQX5VexphM-ue2g z*8OYU!#cU9_-Mp54*Qp9>^`}`!7zcpps*5>_**CM`#2y1gw|BohZcU9Q~JHqf74kQ zym9|~$3jO6B_UHFi0rKrbL2o?1U$|26DxfS53c&^;AQo+T5@f+J@Y!!ab2ETh3vCp z`aOpk|ABz2&oK1_>VRa+;oZBcMI#wdtsbdC{ zO_ZBRNgaOp6C9Dha!_hu3Ct6bsYC}cIS%P&|AABuE2K*B+HNOozi|3 zqhF{p_B$WWSF~F&agoXv&z*eNU_DsPS`&RNw^m4_UVxFeVkXP|$bbl)_cAm?p?j$- z>0`To-%M`%V}ZxRD4$rqoTh-8>|)U)+xSA8kCYC69K%LhL=jKu^UAUKB42|@DR@uE zdqhKD`aPGq9!d6kMc;y<_fF};P%E6ZnGAS=2l1l9;-*gnS?%NI>W?MwJOo3Tyabu-g!oBcmm3GkFn#fs0$11i_*@&_A!f4Gth=vXrZz*q z)c1aO-kezGD0k2P@YV|tbf~>+%VcDiA!Ww%$8Ie{3`uiU7wC_TpVrQ?G-j(osQof~L3^QJAg>Quktqycn$;q#8wup8-1(%ER0I zzE;m+X1Ro^*A72wq}kys^;45V$^fFGID+xAp`n-=njbP50ntXO3e&C4*K3n!rbpfu!C^Gb&$9O&1>5XuC_> zw+y|6yD(XFB)rDh_WAAV$CQ=d;x!n{`(#;YIX*Cww;+s}1OR?Sg1#ACNQ~l6k6h{x+7FCq*7>IUM>t8C5;cf3hkR)6#mM&41FPbQ5EhH7KF=J`rsC7 z6@HDm?JXR_kl%a3md2>Rb1B_mak%`{CK}6|H!?H~$k@tv0RS1*P}jv8P@53|nVqzu zw2`$mn-(6bVTWc%EOTSFd>(aH`0HE0_rh~qxB1dU=$9c^P%Sf$x{k<6QvtJbTZZ~b z$2#EQwob@`G_}ND+CuVPaDKH6`5qcgqAC!%7K8!j6Tsx;2$eUI!PMZSG<5k=lfKR7 zc9lC3&pV|E^~aZ?GLTL5Xtj&H*0DJcBxV}34E2Nb*GCRuzI$t^^#C!&UI1|XxhpDZ zW_EsP+6e7@#Nu#RBc~F;92oH=Re>Zz=(|wAR9U{55l0hf+_c1{4w8@nE_K@~i+zs@ 
z6ORB6O01+UAXBxLp-=f>oFJ*4YKDL96~UY%7lJEzDMGvh5Lfv zHqPY@Xac^En?hdd!A!m5EN#XQBwhlym=-)n%_Mx)7akwjEj)#R5j+Oy(57DACv18&qAEh3mLRz_ zqerxhw z0jbO-u4QOmAMgta)BPF^xD+Ix84&=M9tIOe#)RLd`iaknz|#Y$@%~oeUOWsYXcIvL zvO)zZrYo?bKz1Hr=$9}Lyv1191n$@on$+cf00Mgtt$8o+D`Qy<9P zfE<9AO6YnBlhoHnux7^CZ^!&gmCsi?jWk;^tr1gstss>&P&#u731PnQ!#o3$FFlZi zX+upK8vr%pPGxZ-l8Rf@T*T4N_@==#Q(Vdia<(_1b}o=>sS;EIg2ZVD#z^(YjGGhY zMUy~8R3*z#bZND^B}Vxe28hGyVCuT4HYKX6@Z2s8H5w1NT=4411OuU~5HPI17;zOV z=aa4;Y0AEiSsXNa$VtikWcdIyKEQ#Y5m`WH7)aLcBLTh-lsO#E98p7KGgyYUQ5Kp3 z&sd%-%ZYeiT1oTC96(L2b+xltTxaup!$y-DBJw%yXmH{yAkg5XiiSdKv<;Xx7*qt>Lkt$NI~lgWfH0)+W3(KLa2 z<|VGJpwTBV7~1zNVn%jCa#VOUb7@9*Hfc7Ex(qGMECTN4JQ+j_z6NkP1)Bf><=fMW zb6te4Ct$8vVGDm6;4-#wziC3!GIT8I6BzE6--n2~t)4ePT5bEcdP;<+{GS)+zF&f7 zzWPa5RlF7*<{O_7ya|d+K+Og3HRCLk7oL*@JaI^qx~{t7%mmO>PrU)d5mib!`=0R5 zd92iSjfI|iu0jZu7NhxMc`B0u#dTA!qlD`gmqt zFwC9!_|2nRXxk2bc}G=vSD7>|FG+)d+-++*LSq@~zNawZn6&8TLetKiDfZ%fCVa8% zwB=MILE2#b0YZNc&8GYeus^RVHSEGE#c|5vMukUmoD{eyvg3X z@k$BxQ*P`zFPZBc)0+JoDU7_kze*CzyKY(*W(KF6zj6BbqcE9sYwxP^0s! zqCE5%>q&irY7}PYwWx{2c-oLkZ0hrxl8gCKnk2i%Xq^6Rh;nxJ{+7EQ%_rxLt}=v$ z6${N(o#HDvsgzZ_o=GlVu6^&`YJI7wBdl)WZ!_ZsLcwPoNEffyTbwrv;)~r@85?Cs zxvY>CAlKua!+o{VCMnwFD|mU79BXrdboANnFLGVYH?>RlVEHd$A9qB{-H|!*G;?Qy z;E}}AoDvsvwsF-l)4NO&iFb1+ryiI43{iLVB&|OuY!>`nh`T)6pfv2^4)q<_D1@-| zey;tet{hU}Ro;7>wX5G?!jmd4c5tw`-e`e)DHP*Uhw0HVKgZUym0#axyBWpZHA-+~ z{qer&Z8!TSEN_pux7Dq0-~Y_454vn2WpzGKkw_Dj`?&wBPV=rACq;uT!xU^3nrV%x zdhA{$SFGP>zshCk;iU%!ryR!HdTxD+^LAOUdCFt+o4%|2Z*Ft*m*3F-E`s>fQ!`VO z*Qx$PjyTIFZ4)8cVtKMosKE6>X2;17Lz|n8@>yPFQ1sxW1-<=3G3_K@9^{(PS~Alu z@De=23aoIL0>R?o2)tmMzbJam;e8!X6~lwob#1yb2efPR-W;0kLSt!!&s{^l&T~tv z3zhhFhuyK5R|YzV=;V0nR1(Bqg*`Do53-2f(P=Jbq-1Us%g%Va4P;{cv55f2`Xz*% z5nU&$|Lrqo#syg-eJ9B86Jzc&NP|T^ZZC_>6TXP^3lzt$gRogNcRQ=JK|cfjeCe;7 z8!?{gWI=l-kR6!(=}+q(f8+*|_YjWK*Fd+L`wDIUwxVQ3&j?EBLnTU+#>mK>TPhAFV7YD_ltTn=IN%png=of;b!S_xlh^n;0lz%_fr3*B60valhO1G zZ+znwggaPzrTT9I@~(qIv}JoVfvs>z8w;nWSpA@$|IMagBUP}o_TRQeK@UED_1B02 z%pM>nb+9?NCAx^I08XsVUj^JyRyBbg4`M@lAQJY=Ck;eHvcZ>M9+5~X{y41~`rSfP z`&D&(Onm$%&t;zG)>Yk-N`5ac`?PAGf6TwJkEC^LSjS($=ppI4|M}@GM=LMS^!F$| zmaNQJy7<+1sZDJc18>9M5+;j)sF(3Cp^I-J4L*_7$WTgyuA2Uqn43ntZx4dq@2i`& zMXXGAv?ZM1;ou_CuF3txX=}+|6Jy?OOHiGG(h#d$`2HOB0wrk21V4DKYpF!?^IiUK zv1bX@fzEzyu4Q%xlkr=kq=wscZiY+Oxo89>G!XCKYZxE^-?r_#@H$P8@t|&#_t!fw zm6fVS9q2b#NRxgMR?Nwwu7+v@dE^fh1ThK|VS=s>lZF)igCMz=Cq16i%3o(*aq3ZOKxtmu zWs9{((!@FI77nRw<>e8ac72GUFXQ#A=@g=&>+K&M3|2`x={mAFlHk3caVL|mcp{UB zWhQ|Q(-qaaJLy{X1-A#B5@$=29FWPjp#&vr?Hj~;BbD<;V$mH@Bw z5+`}XW8M0XmsJW&l&c-}h@KLr`y!QCMhdgGc9LpK$G?PRY4#sGj1~388BFI^Nb71A zUE@T}d-ZUA4jZ@ZQHy<|mZ6g;cFt}x*yT%JeqNmmST&Dp&q-XWzCGSCxdX@d#q91$ zlBNyf1;TdgwqEYVNF${_BHlk#=u~Q=xNi2%ENoH@=6T)+qu!)vuiDaE&hTH|+2hVI z#E^KHSHwDA)8bJ54!5n+4@$PIVK$`l(K@K=-NNj18Sg*Lh3Ez0E(S~<45$&@i!rB8 z%#8q{eLtr#v9ZNh{aH)!fwwH8dq>x^*ldd(*4a?OJe7wJw|Q`_<`Byx;n%I%*bl|> z7l>yybrwQ;Yd(8Tcy(!{Y-_zmvKV?2dv0K=^MqO8e$*$&cQ3o+zlz|a&`%FB`aR*B zPT9TvjQ0$x&}@_>t9^6J^6b_J$O5DyYdHgcZ~PitCHkA51BE?_cJa=+ z7U=;&0YP#G zk(|>Af`H^4lpIBJ6p);TG=4Siv-jEO?0xTj@BRJW_kMrO(9?8RS66jc)mqP5&+{JN z5~mCsYh9f!ME0i5EXg6Tpf}BWT5b-!5`E+ERqn{KCi&l#BYN*!#CLHu*u_q6 zEZbGEbdCLHpt?LMf>1XdvtGIA0-`U(if<0ca4EOHQz+Ny=Ld^ZEG96~GJ(nU7d*pG zc27y(jn;w?%$Yj46x(S-9|@JcQRoc?b-!|3h1%hXS(ds2TKDAnL@jS{(qE$@|1{SE z6j~E=02Aw*^n6RUPqcCAKJ*Pvg(E>;8?aw_DSr*20HG6@BrE`C_$B{dsXzY(&=O#k z^2V$XFpGx&)zAOtdh_qrpkN=F;s@=hqTx=^E`Leg!TvDBzV)#9Z(z8;22)^~TzxcI zG|6IJg+ZAgd*Hu_L;Z6X{dc#24*h2Gt!^dWTJ-!s^+ z8&IRi8Y*4}g4Bf63!eqb`}^iOv!qPD%XsJSD+tNe>C733I~0V_zl(gQz*6Ed1iV5Q 
z&dWFIslOjJ5Q>xYRP9_%r;i&G)5+A!+71Z2%g9&U^_{jMNvSu6L9%uRj1W6U7A!ZrLNCnXaV{*{Du{XKE;dQEwMp=iGODuMj?wf8 z+zT-x;&0%dG)Xl*LiWG>s&`1b)7M@&F7OjVq1Th;;;C`~5>`zjNJ0odT+=8jvRzl# zC^81_ea)999>{IK+3?nvp-H>rf|{wAzxgLKNqk|xrArC z6WykBhlK`zp&MjH$`Kdm-%}lWv0ro1VMCJl!qxAk89oDAhGg9({!QP)<42QWON3&Y z?Z(A2~hl>lr1|Jl8e0KK9?2h71y_4VO zib~F?yFK5OIoPzK}D7r@QkDn!gAG@v%_&ul-227DfC z{#1E>0@g=iV?-@VrK|2HUL%HXgA3CQ6o|P!C~+WkJ9us8XzeyyOz9_tED>co^T!zj z2MSJ5KB?z~DZ_7d8z{%nL;~CI$x@q>nj{AR1kzzC1K!d^lon!HYgB4$SnA9vJpc3i zC(crEo9Xg2A$x;20QJC)M`_PACv}{k1cFKoB3hQcC=vbnMTnU^P)n$0>V5jo^W3I` zO#IP?tC%VgHhlk8oB=2f<~%`4A4-{?HaVdrylzJ*B#fUZa+2%sA+JV^Mo<( z{MrTb@z!1Le@Bo|@=0;LPoSBj$Fc$fN?BKNWgjdkt7z zbPd!B$3+HCpCk_)AX|Y`8#i*BV0BIchCY+S9G{COg3dl|*bcT#J(`z7i^&aJ9+EnE zeFe;S5h86~EJ9~j?=_y+vKJs`Kyj#z2Z-5HG>L>bC%D1i_DKCWdFPktEM;AppJ_KX zPmXy$T;vg73IT94^9r;}1#rCjcA(Ut$CQrH)9utVr_8lGKj^=oH+JS4x{b{Z{Q9>+ zrN1ND8NMo7?4i`Q4&c%_+y+1q8V9@(pfhL=!P8*}lC&F0q~>4h_Z4&42i{uAQ8in%RMMVT+*lf6{=IZK|;xBG!x-MCg1v8>$yI4K-3eKmvfco~TJ z{?8rgZ0;s-%9mh1&g~&Ilx}Y8gc9`ZHiHyugAH>}C}*_)bAXLtswg*XKCU9S`GgYn zLTa{jnt>I%{l);${)i%WCf=fOo~Tm?zI0LAZZ4ey-w~1i%YATr|=CPvp>N zte{EqVQ3<%xksRBU%}LWLK-mFffrGnT-(b$JPa{wAn!Q4%GoS&#+i@!)d}TWC=WTm zZ)j61dBnGin4W}cCe&rQr+%!WSYnmn++jmBJrk!DMZq*dX`mB#CL%#sGZL|;60RXI zkA{fqn&uD2LDr4LK1HL>`34Hrv#YF)3rK$f)zcj)s8S9T#pUdS|M@56VoG7(IZrsI zYfeH=(lnp+eIr+9Ayrj818+fK8;Y14-``OmAQIG=ioCRdZo|R@;>YsM^Ie=ZDre4= z=@mt&>Lry@)^uBCAdzyb}3#o}3Gyu9h=|h`#I& zH0Y={%nSVprHS@52Tc!#oGJDQYp8&|>RKK} zoZ0_`1m}iNAe)YxDyn`vkqRFay)uNZH$qn$f)W4ZG`!e;23kxPSFZxCQ1vSXb5A7{ zwE|`aCGcKo&dwYkf@g*vQ+*}B54d$y=N|1Ol* z^kpBKBO6*Fqt!6{195g@%FW$`=xYPhSi#+3C)+lU8mf(Zv+?Wng}0Sy+7re$rM+1qV6+Lc0c zGby}a#DpoC?=Uxwp-yiip9>?u3M5aQ*6t_7 zd9WfoIGN^ETH=|JhaQS`hi(?}I(Hjz;gzp^Adk50Y|%Scq5m$gCj^gSy<0}(USKP0 z==ho^=cx!PcU!g5fT}#mA^fTib5TLJsrQhTibkTX^}c3rf6`s&HRayM-V{m^Q#3|;u=XrC0&Ce{Hl&R24Nun7ZlVRJ=} z=2nmbrvAxCP!#aUT4$5=Xgs?)6&zBUmOR9xENN&_RmSg;TDQKc=k_4iJ!3}Ka^gu- z@Y$EyR3cCt1cSh)ztMzMpcAl36cp1MFpJSyZtQC5e-IC6X>D&=~(ma z^|ql4JGfbJV)uj(`mc>9ca4m-p9Y4Et@h7U`CHTFRv@mX5bZd9m_?B5qL9+n?(Q}E zvxQ|P^oA>N{PY9ln&080^Q>~I39`lr%Yn}nZWUE+JyoWpe)b%feG)>O?Cy-#>YGs_ z-8zr~#gix1I`3EV7S>gA0`Ps>*oA4Jf)bp)a`t0dw*VR>1B@#iaMPjXR{U3-M1grt z2W%GKyT9ygc7h2I+XeTp;-3NLjgX7DrabVH|KLx*SWrdAfP0HgGx$kP+Qo2PMOV!v z<(y(3W=ksxkk_08kgI>?e);onWem*+1o#Lb6k2a+jFJr4P6liksP1VDD~i|GeczMI zz9-EETn57>a{(M27;fgjqF}!faAJjL1ha`W|D8_&h@fRlSYVPk|Br3o|69khlUeEl zHkR~X%76+o7^?EW0~G$B?%OXE&{p9v;CIuoAh5`*9axnECi9C68@Rko9nNsI(w=DJ*rC;XqA7FFcx8pm>@ z;h3HUn9Oc?F{AoH#;mNY(h^h`9umxr^LLHSF4G2bdVZsSA)QjfB=fLzW+W<-vU~rM z1eNu6q?j7E&S(@3emFSzcZk=2cOMR**8Y2B3OG~JuoK*dR$noeQ(-u6WipH~ehBcu z{oO^d<K(o6kkjCqpU~aX%qy39`gE_F^^^Zb{G8mF{)b3!mxsOOknr zj(QjsO$j;q9jZlveO)e zJDEopOQSwf59WN|0Iwlu1YQ002H#pC$v)EWFAQUtISZ%slsTw6)mk{@K1XrxEqn=M zd&S11={?=x?=H+C7_>ZJ;6TIquEv;JVrNy$epRz`HjJ!i@+7!Gfu3-mwse^_pQAzy zTmqjn4uSZ``jFf4+*;aaUPX3GFiIxsy z9lXnXA+*&A#H@Bv#5Atacfu4fPZad)<>tPqR4k;oP|b%(xeyhBlSm03GtuMOYwG~TM_S7 z3g_3~s=iL8;;Maf@tx_%iU?V8yIPHi&Tc&p8aY}YotSa?>-ZT{hb@9Ui?JNBvLESV7E zXQAD$cvaVv1fL~4WsBr25(aHv<&Bq8WX88uu~pm{XDH*Tp!-a(u=f$N^%OBb^SYj4 zySTRO>GXqWwj0K`d^;|93rjp-xJ~Odkit4rlJ#ML*-CQU!WyDK}$%4QR4myS4IUDw;YlDawEMfL1bwUgu%Wu{@fp0a%Mx12m`q0*cZ zEYT7+o|cZWdwcXr2k==VGJobMl_yQN7VwITGeo1xN9@Ctg?qo>&9s5nGEBGb@!c+a zjk4ExgV1csC7}HFJVjRiZK{)MH@=i&uf5;2Aj3B%!LUYM0CwNs^(r z>SNrm=ys05Oqq6w~z9F zLTV2(k5bAxj5coG=`z^TIA&j0Io~RV)*TccUQC5wt5*z@SyjA;a2X80BVnlz)s+aQ zEf`&T!MKRlnhiY7$T`l#k&wM@;xlUU^fCH-zLqdGA%`|0zTy+_y}_uQAjd4ZL9U`C|12DN;t?P->psa-HR0EGk=@t z>8+mZAMb&q=(B(mp^eN6RPoffv7tQ9Kd-p1@^!NGTG}04XO@NNq0t5rzvR6^B+Duz zNF#TfVZP 
zJQP}gIkqd|8a-P!`J(55S`g$YH^^l@xR)ms4Q;fZk1-mMC46-t4Hxe_%x_l1pDu71 z+eKEvjF$F}AUTzr#~keKvOk7{*ckHXyoIlS zIJ8}dHtOyZFh3HxPb{JQH1&v~X z4-!xXRO!M7qt2A0``cG{3FHz#Mu@?xQO1WkqZ&FJ2kXM~m12;LEw{c0+p~HY&08Ds8~3ocRNl z>di(sSfSpG?$iTXf(z5hi?I31D2kdqjt-h8W`VU7a#YuXDz2z3?vVatoG(~vmkGL# zgQI&n_r$kaMCA1O8<0_w9b5~EZ4+gu4YMB(LKO|I>62_GCYS5sD61Xd$v|`Z-x5&J zRIT5*Qe|cc>6(-C;{1-v4SpE&Vr*IXtfc;`3g!He)V`Qu4P|B*X-Ak=_&17F zWEyiB7~0&6Raa&Xcg}rZiu2k#?_ zoO49T)kCBKNyztJLX>sx!UgZE7tWg+5BdF%yZEVZgie{@n%un~nNqNjC#C2gF2TF; zG}&}`cTKRI3$KUUR*l()&qad$7_VI84(vuoQu&Yhyh2=xL<$o%R9I{M>U8tL!#kJW?W`N9p}S|wNh`PM=2hMn-x})3nAVp)MEO;Pv|V~(swq}& zoq&wR!8Mo)tFfgcog)cBYTvq!ZnO-v}3`H1YP(Qq7lTz8l!Bhz%a$6#{rSWKP*(XULdQaa6}?@ zJ@FbH2|q3nK>=8DA+{SvU8|_kWmQ(lE-WYF(;+{kc=WA8EnpD#w0KVp2u4+mldN>X zQvP4k9sjVw)ZEnN<=Y^9|1d3yc+V$1MC5-xu?cB|5q zF-RqWKfn+%CaHdtqUfMTSUFJwNPm@i{dZR1zr|Hz0Q>*7&$SIDc?(P&C1CKHYyvkz z|KI$=_LIH|jQW4k*ZK$i`AeRQaPH5AR`C-hU?k8w#QW(x6+wUc{-dcui~jcz{&@Zv zHEbZ*qaX`;tP*qLuSc`|u4X#B6=<~|xery6=>my%4Qm5kz&>*oXf99wqKVFEcexkBIB(>MX2?0*{D@XKZKk5~s`hcoza)6h=tBCyy(01X91{Q*c2|KkDr zyYBuk_F(~ajsL_L4J%idCYfNzl}rTM)ek|F`Ke|6VT`)0N)u;fWh9UfV0c3?7V00| zp=$102at75ppn6J6%!$_u2zYclQ(Xkl+_p(%vSL4)-1LOV6w9GoN&B{%}U+Imu;J@ zrLEu%s@|wRtkw7|L%rA1G7%HQk%gU(sI{~NObrFENk9^%8n8@r8k2(A^8TgBYbXv&#a26N8-*34E02Ei8Gz7d1ckkEyfanfDMDe@x7fxb5g zatBp^FT94GlDgZ`R%YZ%R%X|-4Bs>_9tfPh@_?b<%|k<`C5Jtn=Lv`_scvYMiZr(i zlGZoZymzQh>&XZ<-&dqH z?~;>v;^zbxk@W{y%b9(ak#3rA`7E0UWGN(k!3&^q3DaqSIt=iYbn{uzQn#}0iJlO; zbPMWVmNQ-Rih0F)Lqy~_FdbPVWwiCxZGGXWl}*4?WltVYVvIRnB~$xGV@;eieoSEq+mgN4;$X*$DfQV}JuJ-6Z` zympYutuQ5QkIbw&!H0(a;hYzw@Z zT;ub1dbB(CTyPmjSj-KkLPy5_IZEt9AVXLHE#qMN=<^h&_x&4;hbQyG%rc`72+1m0 z?4AQ8;5DNBx#G5U-BL-e!GoOvQRsW9QR9%ar0l_Uuiag%Nsk6c%3nG(xGKsDG7Q4wO<$X> z;6zwz$!CSsd>noHwe3;dY!r;L1%>PoW%VmgIn}a_XyBOrsLJ9wlWq6Rwk#80q>jX5 zZb^N&&C&@r<|Jy>a2Swl6K9jx1=nX1S@5+H9cx2SH^AZ973k*e%#KBm?p?#HVgaNy z&=Xjw$79qv(q+x)nRNM3#C!SvsG{l%CX6oE>n&WhlcSuDI*PrtTRqfuGi|dor0#bK zKjk0HH}GnYD=c_o>3rwORCn*5h@DQ9U5B_?8I86#GxPd%K&mqxRt~>^6|?X;=iW)I zxsa1*$hC2}%tH=Yc{+-k^St0C_O|iYHqNw`M@NO%FLlu59A6P%r6(_c6~U91x}Q&5)`!RBik`X_z zg}&W6h|Abe>^CcTh=>wh^J7 z#}2ngQ-zF*Si)2-plv4uyJKqNR159Gw9s{6IOHOlNfu^K01&e4)3BWW4x?9!;g%2Ei@hk<86&l6&o;0C6cZlfXWuQl6 zAnPd`N`hvonmYHD+A2j0@F|lRnNIGZucWQ3kd#~DAd8)6wo4>gD$g3_pc~;k`Nk;Y zo$FqB-a_V34@BsLOqa(u?EA1w7YEOZq?v*b)?fNcT`h^>cf-co;tOAXE~JN2*db~; zEOsq{m+G}*F$E0-bv(tP^(g8Zj1pflH)}GmF0Z+c!(J1v8}7wdx~nx8m?Jv)!C}Jf zfZ}mty%F4{?T#(z=6um`HJ|~_tNZR#(nE%}3APDL5Y4+lI8phtx8-KOs15b4%-A;? 
z57ZSYo@Gv2N>8erN!+u@w8#`mV`R?@NcR@W8Cs3iFADTZ2i{V)eA`PK?a6e(VZkEO z(fqGY0EQ$R5V$gt%&QiXr={|`W0#V~B0YcCePqde)ZW$2uX z&hKm8*KrFkj$sG0 z8Mw$@-bU8cVc2jA{(0^}sKDrqei-|;)rdr|`l5p!-Lm|t;K*zlf(Q5e&5~|@}gcY)u zWM(?guRV#vGZ*F%KFX`a8Lf41`U#<_6*pj@|J%e<^ zy^CYv;xxO|Aqo_qAl z{k3-XSMSAG(rmP5=j)SDu$>i+?fB@-=S`bM;n>nbRXDa{i3y}x&+0Ai67)(zS2bw5 z=xs974v=qqTo5E7Z$h<^_8$lmM$&I^Saiw3Hsl&)1b^ z`BUsOm}G76SrDL}!`DT5=H;XtJ8eDFcYV)zC6CL6<~y%JHk4HIhDE*9pmF`s;^zWn zQq5y0G2&V4c3FhPju644Ce=g_)*m$Z6!$L2ik!*iaq~bnG?0o#p>b1Tbm<3$%-J5{ z9FjdR#IAqX`oVQT1S_xix4yMV?J|k?i18KU#lmm!x5E&7Z#QkZ2RBO)0LtE0OC9gt zo9nH%-FZ28@S!hbx;knTmxQ^Et4xV2H})n7B`ugE>8Lo^GND2a_Hr(?4@ zk$uXFT{@vXc<*@pE?kH*aB!tfCep>=?ZI+{5WrY9mr`#x|eh zR?HYPtIusp2&UFmN}QXguMLT7W_(+B6QA}hZamhzU?(ij=prxx^B&lcGX(LKID{Oc zQN9>R@E1cJu0a0qXd2#-!Df=90D3jYJ0Ql9A|^Ah5PJ#)I0kdevd%f9ds4k1)@ zT~6%pf2b2P%g0`RuqtCWE*LBb)DQTo-pbO+-ug`gSOypxSpWX{jZY5}^QTzlzt-w+ zBvAh^9sj$k!#}FsiW&1Q(zySj;b8mQT@eNSCXb=73`Ur^VC;EJrbM|K)fF)G1Ko|^ z@A>zWg{ZS6;O$9n{csHKNd!9WH!WDuvg{pcX%AJK!<(am397P|qhg2$H`3mQ0_sew zT(z73C;;})a&J+nar(2`0^JFWf(0JNbo)z-{x|mFbZ9DJ7&8*=e#PSZfcTn<928A*_jg-0i-AKlnqH0SSCc;nT!{x88vN+k4 zzv1Tp@=(etLf{utV`h)?mI%tphiUBrk0AU84Dxh;s_L5pO?c~~X_kgJc%i?WML#7o!~xlH z%&@WW73=uXr`t3xAdo0>;fX;<(vP;~5wklCUplO{h)LRBudeHd6n!}z-hNlO$`iR1 z8cs*fStfM+tp^`zP<%tAbl5>0%|9{vQZvdOct|v)wPG0*BEj1a+)=;2m9b?M>s*By z&@nMfER1-|Xv+2I@n^4o=rcPxdEsm`*6l#sfI#4OIf=y5*Bj~&6tw6P(03b348E?< z#`$lH%}uAy-D0N@zn1t_i9Y@r2<+*jLH3ZLZc6=VIFB}wcX9Gt zgbl-;;X`5Mgm)DL;Z*z3(`3oSy-HUQb|t?v*J?_p!&3U|>QhUlWycDHjEs1)t-46( zzSLk$zyn}vr+FsE7Ywl{c-X5P73JYT1xtfv$29_h`4-z`c^%F*%0 z!>ADT85XTOk|j32_<-oeQ^>J)P>0I%u%xw1*>-j!0_XV69yhpXLltg;z(VHjUZ&oa zHw9ajXIDW+OAusZBGN#@@>Mhey`am+qllH@bG|J4r0QAW?;K}eWTn0kOk_ykyP_Uz z_6(~aMl=4lOx#(2ORlbw{!ts3Esr#u!B>z>H+~a-+3mZ8k#$YM3mW`xS4v{L5L)@Z z*iI?;9CuH+ouWczgAAX7a~_?lmuDL{p^sGD6Y1|9_1XD4@JGsdGu`i-6S+;V*+II9 zl1%`E8pWCZd3x)*eL6}J7Qish7p0A23M~o^bM0}V$#K6&hTnU>F#OrBpej+9NM$A8 zLld=<<6^{JUU%J~eC=VHYf#8dX{LylZDe06l}p)?!7$_DrpB(M!bTU%K2&$*z{;}% zd&BllZRpvH)*4R*@i88jh{YQ7p_CESsC}yH6;{T;H5xN@RT461+uct6y@#NGWwLuY zn`+GS%I|7~I56|)L|}Iijg~Ma*icB?wJuPMcD}fcU;Om_;K3~ww8U+tk5RkJt8x6n zEVtRsg{if>x>&qxzIy&^QuMc@SQFaT#lB#CE6{Aq|4SgnlS*FU$(Y zRYvLFn674@y3y!LNdHQBd<>{@WZ6>3FwzTGhQ5ZyP_H9yJ)m}eU1U<@*Mi7rXV znyu;PBxZf1d#Af8CaiVYB(j+eC;!HK;q;qrcAB~4K5-v|>LfuG2+a>Y1BpjVS5UBL zD>D@yb34$(1{B%$DE35Z@QasO;YE)sop!q-Qv{E_SAx%uP=z2?*kL<6MwzU+n9;o^ z{avy93*FmuTl6i{m0^q;-|rV>d3`ze?UEB5h(^!m_#ZTW`Ncb0$3& zzc+ubr#9VMrs~O`M{4iQ5RM}b^W>^)2uj{-V?I$v1%g^#Jx{E?8ig=6&_xB9pouiF zB&2{BeIFT!zK&>moH0D8BgyFUXhkI&uAMkKp*_`!k1yHYaFuqHDw>;!@rSF_JNfIE zO2)f7Ds_LvVe`U7OoOWr8{SCGrhzcM;Mz`>v9!waXSOt@l@~}`u5m8)k)Pe$K?a+` zYh_)yKRa*)d(#_kL-tFAFT0`3Ux8ugoYT^#th7V4xb%HV&lj&f^Mf>FRwqBsU|QXmJ6J~tXOfy>96^1~M8#|NFYh^=<{Pp? 
z*QVlg`-4x`K}g019hW_K>O5E3CA`H82TYD;`^XVHZV7*_{IPzNN1uN-QE$Lhc(ogr zPqb!aWXQ9sVw|=K#s9-g{DCVKYZ`cP`d8(H3t19d@v#0ZwuNi&v5r;M{W=uq;_QkeQRkfU`q#q4C=L!v#D{PXf=Q#- zv_-nFXvnSgH8%v)3We{xY;t|G8VsLqRe3D=49~VhxX0~;-nS#^?D=+J$N>W9IlSQe z5MOIZ(ZB~D*tf5-@*p5wy`J9V{(y&lMu)30U!vS)xyULW999dKg_{j;t6E-Jn(iG& zK3zW}8>=!7@~qXSMQWcgCM%iPHhuTxuo`*nDjvOOCsFWBdffn@MEf+=WBTLOddu^C zFYNCPP&C!r)5~?TR3&c~72DpwWl^9ua$PawT-AMEz<0_0#IZ1!8b>1!BSA;(V7iD{{jZ)cj(VASit+fF#Gl{%jBRX z`(;Sb%jed3wzMWDVa|e_+rwuwbz7`o#IWvGtpYHpN zc*cF>={MmKitM&q4h-bCUsl+_QX@p9w9U(X`OxPM*R6D$!XK-DsgN))u#ui zifQ>xOSaMI0|bo}vG~gZAaJu8IzPo|7F1M{p4e!H^~$OSA5H@eMd%w`sK2&4bH$6^ zVw$xq;obkK_Ql3L{qn%MUXfQioGjH|HABW*i*%_eSur|b8mgeSB$ITb}R>r z;eRl0=A@KE1n#j~MrGJ1YfAGar@CmuyN>+zur0|H*(5kFQGU+|e*cFF@w+MU?|<(N z2Jmc9_jdyf2(y@$)@N_25KI#26u%C(d>CA%g|?b7T=)I z!Txs}?caYV6956jtCR(b1!ToK_hKnQKdJm5s%gp6`HL)uT_3kA7f|YF%u0c%0aJ=&Z!cM4dXgzi z4h_n9xUfi07Qaftt^@V%$?0-k?x8En6tUwzf9OA_C)^b*_L{W{f-N4&!Mws^yXQ6* z+Dz%?#7xkIoK{8M5NhJhh0}(&6+ilB^<~4}@a>4t?%svq2VVNNW5vcn`Jb=$hVZ{+ z@v22pHIFh9Z0E$zh4S8&$@(Vug7N_tdC5SBaLH#8GpORr?Iku1TM*lKoUdGIKV1>Q z_t8dzo$|5UemdBACItt%X$KunoXXz2Yw2B~wNSqK)K9eHOlTO*|cHSX-t=)KX3WMsw_6@kY= zmI>HbGt`%0?ypN%X04v8X2Z-bcVd6^JU{2SPG8U4rX!!++7p(Tf9UjdApE{w08c?v z;PvX6OD@q5wkzIPx}A6WXkC6SV|D5l9z*X~)OQuNpt!9P^Rhy(GfFv8o+j4BBiY2f z{{Yzo4Vs}8NTqvPJaW+cOv5RSuxggn>6~-kCHjpvTb||&n(oWd?u!7IAbTbGce(9k zp(n^UD{&pFweIfJCSuPt6w69x$(S7R>WyoLY1i$qvQ!FC_I#df=sS`VDKKGMa)T`- z)LJQiu+Q7{N)2dgj?ot8#1o3pF$sP&J2W=!ryn&oEvm0O@S0bi`ox@)j@}_ltwi2B zwL2;4y@JCH816@C{iM&vK}ls3BiRl#+~Xd;zXz+?L_oI9H*KtocYJ4DUqAIeD(-ug zO60f|A}KO-Ml|>S>E`P2hXQ_|Ys5l}x5Y_w8uI$Sh94@T5T_C@W<+{)@+B3k>R#WR zh(2|8_o@|is(j&`T`0DF##_MAVNFNkY`@&_l^IQNJonN+)M|n3b*WsDi$Bj!hdAe3 zyOJ`w>z!eZvF#A#uzVETRbfNAyCsxz*@|gLXREW!#NmZ ztaDQ#QGCM=NmzdVw^+PpqRzO58{Sf!j94yvnQy)8>6w?2dvwh(KgthChxuE<@8Aw z)(=lGFZOao1-9z;1zTqgd&nXLHinCygeNrFDX=c}ibg-#-V^xmGORs(_SQ%BZE5@+#F0a^N< z&8utdH$Lqtgn03*t0rDC!oyc|VcCY4XBI1Owqx|!K+Hp9CK$u9c-R`VGfG+oc|h3x z+hV~tn5j6U$p*v(`Gc+WG)$&O5fz~e4n?=iszN9sqz9iNFC*)HV-`5ihX7E(-J;mU zHdOaflBELn5c?&}Y(mR|aX5nW)1=uuShEGa<=2(v@@`%A&F!{x5rA7WT(rP*dq=@Hl!$;>Ddu&IVF@;vQbgN|%x> z`>Ib>%@I}{IP@uoFF$$JaU7eoFpeA}w_6rcD(6i zm%MJl$XfZ>;j}ZTak6IQHH$euu08^0~D!q&NKYRa-JrS9cVFgY(@K9#kHzAB+-`i#oMeOqv{u4+$Zr6zVFo+<-57Xz*>cJe-uv{t?wJ z?e{&M%h|9FH}yx(hW*X2=zSD%B4ob1p26pa5BTxbr-O!H&S9%2ZqqNqa?Wm|0`oj1 zcV65vDM_{cD00P#0^gDA3|kc^+cj*z<0-sdHcsSv^e09)SZtE1TKka0If!L!v!(7(b#qds7!+d2 zvT+;h+ufJ+sm~?YVQmt1ZZ*Jp?nO8PIR&xcw+eZmj9<>za9T9qLoVVhylheG^Ow4O zu!TF_Y^+nVNjvfy$FWSlvk1Zhu^kdPPiE*~*~iFE8f-s;`bH^dts;EnbvkyUR2HeCKF@%8e3XGa|A$~E`6c3>2uR%W5B%XGJj*d{HiNvzoOf) zD}PpUKTci_AnXKk35n54pS&4lZ^~r<^J;&-^8X*V-}#FgIva# zp-~@9(B&U({lB>#tps8a#k{NeO+XA~qkfY9j&d$(YUsuPP|y)5Gyx+U08UgaKpJa; zUFQ*%ZT2L*?t8vTY2)%QR^o3MVJ(tJ%~yhzhl>XjrGZHa;zH8!tuq}3gCS09j&i`V{ACtvj>z=FV_7^` zd8iDt*myNG0k-?rha9-`Fl11FZ;k!hFDwEyH?a)_>To6KKiW?eg7Ckr{wu&k2p}q% zRcVq*i_{3f@`I-YP}G*`z7b6=>857|Ub%S>d;I?~V;y ztesKKy=kC9A(ggsl50vFAK%wm@%xr0)!p#E&nK14p&?L;ls}wo)uxDZQI;iLi`&#< zn3oA_)u6ZX?8@`Ghtra%F{}C9aLtH%*sbOGTp##Ccu&NUr^H|JRyB61XliZCkGkXQ3pY^KoIxM%sGmU65H5 z#yd|d9XXna}Y27wMRY^N8sG}RZIJ& zrQlLj%#!D0^KMm6SMgzHEN@Esu)E~i17UM|>Tkw0G^$G)9OjeCpo-UD#thU@M(dZ= zFx7ZVw~3@eICd=)lEX$!u4!G7>$J|C99GPDJC;^$RmERSM;6Z|I!3ueHGlQptmk#5 zE)%)roXY}C@7feDg!oP5ohhMw%(qYFlXk!3*BJ_SL^oY4e4d{A zI1j_gB_v>SV651oS8XEOPtE$8!?TcZwcxJDT>e=ASqcs=v~DRup-?lFr>tgD=Fk zF^yR5!SnFg>E$Sj%lO6D?2m!>FEoIZQ;u#^d!PAMl8(;I?^&KO}Mf{NyVAA~Cy!{JO!H z)Uzw9^TqKkePO&7Zu>K)LjpRRCcQ*nC)yYN2niO^l;`}78m0=GvCK>-WBv&qx{PR= z_BNl@F!l?E=Fwee`Qsh@Y0I@BWXfSY+>6><)DgtVX}DIk%V`EHpS`OsJAj_M9e-F@ 
z*^t|=JoFs8QN0tFQ947d)PmP#@I_3gU^A4OIF|Wld5HDHcj1|v+;bJGqRU#rOnkjR zD9G}~VQo;r9r2RPuK|0|o#=CrfR#K>=?jm-SCf=_%dKhqmS;BSX$`Yksg!6Ay1FZk zvbDS!@{b?Mf4#XEA-QB#cAv&hxXtUulztZ?Pcl8P@|yqw>BHP5mcv;?hmsQ=&P{u* z)-10;4?w@}`^Lin)gnInl(Z(gg;u7!aN@H-am(#5Rfc8c-S1f^VJq#$i2UbNQEQUd z>)*m^6E@-2S$9OFwgi>U<-h31dwKE>O~gJXvwl93>#x9Q`cUibWMXzg{)JgaVS!7+ z)LZKcajf{+l#fckEKV32F5idrdx(5FLV6L?;j@i3oQEIhbh{Ufq?;5?mO?BVm~lXf$`^&ClzLnss|6bM>c+=3LhQZzV}VntirU5f=v z-{E=gv3t%v_q*@setu?anRU%2;Jy)y2#)Pbtdo{XY-e$YiY^aprJZQ<3l zd)B!sM*(ezj1JWQBoDg|pI-Nv$j#pZQgYvJwA|tXwx;pJ|C#RmyG|RH(G+$&ZOndZ zl3u6Wj59ClzK_>-!V4}Q3+7~05U*!^@q=tyKDxatnmjY@Mq^r*FI+M1xur*sV;RxP zDkNZ0>%FImsM=G-CW%NjGOtN$y6dx^f|UoyQxb_$2)u3HE~(rXiE`Q&)HIjhsACfJ zr26DUMk5Qn7JUJ176EgnXXxXW%jr7zq9yQMmDg?RlpV15NYXF`2hQ>a&fs4ByI|Wq zUvwHsrcIror-b$SgO$1lcDCK_6pZEEIprAvO^7&l(CX%zywOTSc~fW5v&jGfk3>jf zwb^R#|;W^FhF{oEK`_ZCv$7*}Vr)klbcm6{~Vd=pNB9bLUVV(sPQc7S74`MSkZWil4CcHMivAeWbvgrlbsrdYU)+KaV zR>95bj>4KvVT_CSZde|zyji2vt#ht8fyEAfG1(f5x}eLaFu&5r3KM;ON;>4D+@(j$ zI+l$X{h&zwF2R@l8lNK{`UHO!82@2G@Hagj`Oc*7i=Rf5n7%#f`g)SjgosFu{@N1U zYtTRxm`<3!UuJm`xOW}hpB6(E0~jIYPgE7fe&tUBj&;6nBzp3qws65deAZqkxsJI_ zjSYNHiS5a=2aCST`J7(yn)tPcKJd~y3-cMH*`G$ez7P%>F;lPSOfT>k35UgZPE!^u zy51Zegqt;{+0DJG|8$1-IrJ+tf@!$GegSIplok5(F#Y(~nR>xz@x+ z-*luWB_vQjM8`-pkvbY%I!igCTPm+xx>3XYqUk7Bg{|=;`IXC?I|TGuW&U`24U$Jm zPP#p#;xL0pd5!E!+r3A0%Ln&Od>Fc>Bi?vgak#>(i^;_fG-g-P7e*VLNlLuHr?Y{-b+0mi_jhYQiZQ? zJj0<@WpekUvL+EZ6LYl>M5kh+L$pRt4@VovgVgI=gwnMGnk_fp z3&ZV}BfoRc+A(mTN8fW5z}YciR9Ide!P>9wPZrSz{b2pUcA_(jop?pUQ6R!E zEEC1=0MaQhH~}a@b|18yU$m`6bRUR3)jBKx^1jR?U=s@eCHPYC0myRbx2E1k_f%^^ zPW(>%j(LtCsu)(`{23)I)?2r){}LSnapC`0m)%(cV2WF`Y^*3-_`|HbYBAfhD7?o#n(6?zkd@VnWUrR=Bpd6ya+K2lc*l(H84(ybl7Myz z40seo0KIsvHq~0kLQSQ`KenssLiITYfW}|&X5^+lNsO@e@*G6~kq;h4b#drDD=jnN z9R9QH>wm5#-u`bisws(os}J*cIv}aH)IPvwOpp=~t#?|HTam=iO{L6GAOMW$HvdA{ z3&uauoR12ZA)(fOJ;xFm?(X01 z6Nb}=ce)_v#o1yDs0q&jSv#doPbiKZC36;5PO-3l#@XBYJK2PBu9K;9@rC|1pupv_ zU*{Mn4C7jN?)DWa;w622YZhb#_0WmxlhcM{0J@5b$LU2^`JU%+_&bmDXJ?VWK~Jot zE_T%-M=krNeCb^?EmtJh=Os|l|MnYHweUu} z9b5xg$a#=PS|kv`G$eo7BR4aNpcQzcw4*WZRIzGim2ocIt8MCy`eDN0 z<8j~k&b)MW?UB+CMytDK_QyBRNX>&zq0&D3$&DTvzGZ+nPgb;RxZt$@nlV2P-D|al zIQAOuEd@Gegl^Jwf=yr1rl}DvCRgWVG2HP@aZ)AGFf3YcNX(SgBxq&{s^^K z1y6jWsQlc{K`K+KUSQ#VY%b)v#VYgV)B42q|Z*>6;Z4LG9He93@e1*nB= zvj9p0hpG!dk7X+j6?f%w7tLA*>q_cb!~;HE^sZ!G6P2`GzE5^733$@lR5Ovn`DA=% zX6C|-2kgb|w`d{Gkx7mew5Zumr3)5D>#z~NVRaw5q}&#RF4B<%@Eqw*gwLur})yVA!6QwF%e5yn`+&fRc!jhZjv%H-GC!Q&1*D3mgP#mxOP*!LVtcb{RMOaVF$i5#~s zo5JhbJrmEBr|%X>S|2HlB8R&T-shJ3MARah>=YON9D!fw_=)}Ox?Sqd^amKOz&Ls~DX*R`dGBH#eOvbU&OdnsIpZazN?5W8a{U>R zQ|@d2IA$r#*?q|~N~YfD2e}PP9bs8*G{dp(&qhqKsr}wy=wjR@t^@PY`P;hZ9Iyn6 zy;Y-g>K^~8nfd!bd>RxqY(fylnkG{}VIYxUJZ;dox|*q;auC(wvzb;(^YOaT7KL$S zT}N+xzq9^)*Uy*d;qcli>3$IPO+0}4p~P+VBHnMFX2&$*I9d&F79c*&5iC}8Wa0G` z=OLAqYkvsYmM+Qd-AvFXZ|YHc#-^unW-D0G#7nwbF*FjuX-Xl5eM#Mvbk zL2<&R5(qxoF!>ecFd*#_bI9m}T4>_)yUtC(;qd9($QQg65t)ctII+OYQFGV9`y<+eJ&iJdRl8}*z5vp zBX;9|Q3G=XOPhVz40DR>9x%mRQEoVc7qhZz$_|}tA@%9ML0+4%oA<5rxha#@p+6ap zVK0C)V~H9{4Sviax{9%pHbwW<{_80W$ym{DUkKq}Z+x4XcCdzuG%Zg`*Q@FobWqMA zKCCeX)&jTly+7QqaK@52$nbQTJ)R=G;phWWGDt?#?iq*H#AO<$w_7OE(Vt`eoRNz|cP{{B%u?G%oAfDn@FnG#(2x8!DuuN=yZi zGiMk#SpnvwgxLmLj;!tW%A%GG(Uw66N97+ZrnK42l3{CDdXXURzUsuEYO35#j(V)V zHxhMM3rBP)9Kyw~11Y8ir!CKo2;->}47=%qgrfcRH-@{xNR|pfsW8*(n_;`B)sJ_{ z@mKg!;;PQnDip(F42E<#?>B^s^*I}8Fzt2>2qO7$D5rGCEF+vTn?KVPlZH_+I><3h zv0yb?Xgw2#%jO1tP3OS6)NWv;@-K`qfH zeT#b=>_JX>G(#LG@=FWd#>74Ezsoz#%XT$jDCyW>Q-f54G-SEurza+Tyu18{A_7V0 zKe`=V$kSt6(dCey3O^*(>riD203Y^-rLK-25u%Q+s4dv@VQKXwBVL|jnG?r?9kAF- 
z)3=V=ic^-h2M@yfOCJ2V$NM7jt8fMUd6md}EppnJaTMuQxNgSxZYi~=+Q&vtrh0E0 ztRv6wy%OKtQ|+seRamZU3InaDLfCt7T_A$@AUlc%^fX^ITkM_*kpkMzVdgzYit8~Q z*|>NNG=-_o&qW`q1ZT4Unio(1Zqnv8NYN*e98cveOzIfBW`b5~Kcf41>~522Rs{lE z%owt~fVp-4tl^10u7-jdc#_e+Dd)Vx*tC&j5%%MtD+R}<$j8o{ysoRJf^q3tPhLyl zewK$hBK}L-)Aw`_Mcj;?f{r`CbQ|Jly&CUrC+EMkU}O2A9}Ek)zGdOJTuV60bfmJx zn8Cn5i=U=;icwfaVmt4chKv?@7{2atX};1w4ZI3MiqJho4bC%u_DSB5a;#h=0*{&l z8Sl!9wYwZQ854*QebcMe<(2`eb9EN-s2d1Q+g7T9@r>H8b(1>3lY0h&8(o;hCYf%Y zqM(=Dj7*T>$*>dNQ>Kp((AP2=_;(@wfHI#J-)&jup%8gNG*BxGP10 z0`0Nk1E+_-MlPGsU^3loK+K8l?4XUt++5f2lPOzxP_{Nvt6m9H%kqbrOsWq#JU6_H z0=qkCz^uTW1YOefCQr_+RcykJBw&D=BTdL@-y__WGpESR8ZN>cD z1j)YwKesW6R5wWIq(D28ykLYm0)09|bF5bhH$MyjM~r`gWB=}z07xGa-7T2tswMeD zNmIofNCW&?!m8J{{aa~E@b;h3R0A_1{=JTzgppb;C*aBsdu5y7$6tH*-)-Ulx61%7 z??-;peu+eSZ8r#DZvW>!4BIR>X;j0M6*x@Te|^I5e7&5ZvJ3f1f*F>Z2(bIGo{*%4 zF>iwi{2aL;tQf!vyN#y)$B$cb2cW|es@#|+%kig%r);p-w$T`CZkwX4JWv}|PQs=| z0Jm1Qf34{H7mWOWH~i= z6ZdI$9Su1B&FJ>v}7IcuUFhW*Iv7 zQpS}!Pn=d*PLyE7bY!SFkSxu@Gmx#r@cAf@cu4d*A`UTw-NWm%a29psFsiuTZc{wp z`_P4?UtueCzfDTm&V89F=lG?0(I>yy+PO~p(f%f*3b?#UXCT32nQ!-a*^+eLy2VlP zpPA__=jJ9-;b+wBspuNdo00e0=?Q5;(Ak_t_~J_Xc63L+aAb}hqAvyZ_WTCD^CReN zl2$*<$~3|p68m6cvWu8IEw4F4j6FG7h!GUQ&*Z_(2rF8tDPmU#E8VaGcWt$6GQ*E` z&d~;SA7v=yD&vzX`nNcF9U09~SO^iQLqNqI0f3>W^1+wHdi?gn&zflFhrUbbMDz0Q zQ#^7_jdzEASREsn7oacu*#4w9HhDTgcvP-##h63=Ros}|d<#ubayR;PYw}G(fJ7z} zawUJOnrn?vkr_N|#V-JlDrniEo6-+IDXKLln^FsQd@%*<+PIv&1k_@VG^KfnY%0l1 zKp>%tl(hB4hLZ;&3fB1Uv24|q3O}m4Rf6_35ssBrTiqf`zp65s18UcvGW@DP%b&8C zY-PZVJZGxCPeHfGP?dpFYoabEJHZ%_Njty-Q#9iR^ z_;e_2&cK4RAYjAF``~2LUq_gg@#iy^sOPd0kTx-Wo)rShtstzi@VsIr!C)0vY8YP} z0X?;afV1JGM+E_-#$Kdr>)!J>pK35?krysFpy;Xf-fn`(OIXaQcfz`ZVUC1BG%F}< zSSEJja8|!dvydzgaV_|9f3-oTLIM)Nns-F7%vHQ%c=fJKZnaOa{(W=6C*FtIfCq}Q z(QnXwnpw}2vrjc@#zX276GXp!E4q1JG$vcG2^L=lQu4eZG5^do*C54`rLOcg0Uw+6 zQ5!{e_01Z&(+7#9_ zCsJ80Ns%*5cXC<5GP+iXh@pJd;@IL2qM#xo3;&wuaJbXUgzONW@bkrs9)=i_#4wEb zZ$$o= zO+I@>@Q%Ffjxyy(xz$zLQ-6h;X2cgmJmyiui?z+loVWzxO+Hh-YmS>pl-RY#_!F5; zZxM+Mz9L@Z>EU~OPeQ|SV16mAWrKNAHLLm$q`d}Q6o$C%5r={ufRn?ud_!z+y9i0j zFU9Z46M@iH{gJ5y=XNNP3DIeK|4A!+rENNFOV%34^^xiV8<->ZBNjrwnd%U_wDG#s zyy|d0($OhC8ta$Eu}I?i=Z%JPkuQA}Z*~gb>_f$VgK(jXDOnT&ue?bz`M;#dd=|?X z>3)VKiNa~$v_ENDdU{QazBm;;Aei}BXKFuWEh8_1Rw>wrn0uSgv_KQXgizYGt%rSN z2+KaCT*O)CbH?2=bed|9P zqaLFrE#kMRw}`DqYZnz;P@(}fPoCG^60x$y|UCiLMJ;=1Hxl zQX4RHlL0jwRy2Hn4ZivJQJg?=N8=7DNw7tBArPz0>oV~3h}XLdqV|G(IA!@&5qd2T^yHEFP_CUI9&QcWxRh#yKdOW-QqYuK z1tySNzuP^QV~T7d{1PNw!Em(E1`W%4P-J-6s5p_-tantJ#hc9HW3+)?=H1Dg zILE;D%6xbQ|Dj@vht1;z>Bo-MF81fcD-qX3P|3*yrU?n+*@DzHWnXW7qDmG{HBz0} z4|focQLI7V7N}Z@geBMxPgl=;d$UOM+X>WS`$Xe7sxu$+3Vm7}{c%XQqa3QF(fg<_ zSw&5_^RVrtTbNXPjYX)8WK75k>+R)t1!J^}R#S`aNA1Geda4oDQO0eD*h)(JSiq^_ zZ({@K0Jho0bn(iX!NrSE9gx3IEfW98(+rA!vp0cAR_`<`2aGHTkP zdZqe;Zh=1?D&a4UU%c)ZH(O7a^C?R%`$VbiuGVp8Pm$E^TDd2X-r@CML?{Ja6zmbB(^OUl5^vy z+a4B~wD_F^_ek@=WPI|Ky6+1j=HSHC>Q9mv5hlq>*dS`kR%+f2aW?jpgtz%HyPM4B z=KN%tvz#$5(JVIXD&6G)O0O1X+pKj20cJtcXp`FC1z<4T(9-TTxvv%$rUNBrxK6 zdl#?%T<^%ay^YQ%Gx?C=p~Y3Y1A3-#ZhE5rV~jJjO?aQS?h0V@VZe}KOb20}&^tzhiRLCQh4pF7;^Hm#hZ1CL^sh|w zpogl!-5YT0VmfNC)lch(=|77DDNsCNTSZ1MhCn;#Ml7Pg^29-_jGkS8ZPhy2 zzWsglDM@*lQ&JwB!2?dizRTDV7YE3SL|%@>_3oP*Sj4#klbGHu7awR<{x@xZ@v`!g zXgmKMS#&Z5D{MzSJ5p)DnoV8_XHRkZwG;SpRBS#CD@oohJfQyze)#)W6+jFU0H`3K zV6M=@W(&#}X9GwXj9N0bKrtPU9u9zIbps2ce@*g*zj^_Dm#|FIpU*y9$|~3-O~q~z ztlO-B+T zVuOU1Nd({KDE{An^+`Qc2)#re6t&WEV_1jyF8XJWke{T*r zZgxr-Bwv;jQW!PRLgnzdZfn8&%y>W|i1ne+03|o%M1#xq1HakWX14H;KmVuRwZ>zT z_$cM-4eaRuzR2jmK3o5i)o|y3^RoniC+6l~@$5Zb*4#wtTNZD~9e|0Rt$~sBpHX?| zWR{A!E&C%yWyu|>YN9A#ih}qk-0?TSz~sMmP6yy`#^(^~@b7BkY_+6hzy$QCHmeT< 
zaNXWfiDDOVJ1syj`fnt1NX7vZyYT5eoq;1TGVjcBkNgOCl$`puJ7rA)W!zd@U-rc-%bKYq%axtjJl?|u9YVtH5&;M8aU zr|M?}C(pvv3F$m8xM%ZE?Z3-vCk!o{RqmWvmIBykn&vRw(9TE$&6^9+u!ff zrV8ZpjW{e8XbjqOMTsQ#9}Zq;%>e}irL8uZ)Mkdcih;#}xynC+ z@v44G{#V?T*ahod6#K;F0*YPLTHSC4kO_jN_Ht5u(?3P2;;V2zZelnT**|uWb@DCY z{3*#^g~4i5Tupv&U){s}`=gl8cO;Isq*CwuQ=D?V$8mIk|H%Ae(yS9@V-qVBVFcK^ z$F_p6U+5;7;%xRhVDYVUn@P=c%QJw>0#Lqb9bpg2PQ`KS(KuygF?^?bq3;AGSSQrd z%#>ajhLWSb#nl|bp%Kv^Na5SC7Ze&guzP96wc#z1tS8%p2?blYCX{J5)SRktQIg~q zciV>dMkh#^`DfNa*fFTbliSJWHyoy&rL8FzGud9IW3o53F>D70!(o_}a|F~lDU&;JmbG`tQ1&#iFz%TM zi%m{O=0Hp+MxXH0kfCA5B-xW_hNz88g;5BzpikD9^mHw!!wiy1-oC|f5vJC8mySCk zXUB}e{itf2XR^UtkdcKHxBBs&(YPXP#!QZfT_R`6OM%QgU6gzM23wPm)|W<07A7BA zw(SpLXSZ?(N!$7<4}ryv}&j3>gOQj=P)&y)K4qoYhG{>*!? z%g#=Ct9Sa&ujYK?F3ni$c{^j-BffLt{r9Gn=?3pi#wzpu%EG|96}(hx3$;yDZv z*8XwZQs{Ly02#OlQ6H(`C5!qxb)_lY0sC$m{=>bjJCM)bm+)}|>lBA`eO_w%drAp`3Rnoq);mC{S|o`5`lW51GqFOW z#0R9=<2XLjM=z{i;18O90lpn!0S-lAE{2_CFd1sVgufhHcUnc#arCvBw>E@egpu*1 zX=0K1<0r38_P+c{Gw$MNgv?tk++{g5Tw*Gv^-8oaTgWz2cZZN2O42Wd`>nTrs~p{} zo$;uvtyi|qyDBC}lV37uXurRsRw1D~8{0Rg5RChIWhN@gLWrh#JlMPn;VvirNdkER zYf%w#SC&;=4jC-)xKE{CeO>EMw<@(bv-z8Rv4Ok_duyl zs@`=Ybezs^y;*jN;68F_csGc3bnnz&dsOs8*Sm3>(3A2CPpaaAt`+xV;+5WOV(lgU zF3T*vLl0LUuj@xYN#V7cc|y2yhOzn4HylTzE6%%yM=6VrbKPfER;2q$rQM9NGMr(8 zyIQ*Xp@UFk22N$C{!8;3=aZQyCezAP$TuU3?bxvrd8s>vZ*4a+Fou_=6(9Sm2y)kI zoje*ZzOkcV_(kxoPv8cO8bKVYf!}12pVy_S{i5D%R~!9Nl9nQTJgjks#7$X3=|u>DCDLR_bfn z@qmY)WWC8O>#;Xr^iJeLtD%6krlJ_biYh~o!E^6vCqbS^d3aySg;>A_H@)FL@83UT zC-$;OVaH#5c1`$3z-4>=^iX85>viS?8Km_Ur`+xe3@-PA<5eT&RhA8D&f<@_EQ42O zn9{rMu%&|i;xpU)clRz{sRc!B=}xr1(&Aa+uT@)elz%=WRpQ!ky(1w~9g{ftiiT+b zkLGCv^UQJv=2S>ySUvb`M1Kz!&cp{HtIkxW2M-|#`K1}AZjFo!H^C=7rgigJfhfwc ztjR|=_J;>#Y+OFWmp`)t+p2GNQ&P5LYY$!3hv~w2^JzGae0GCA_1CN|b8HtNVgbjH zCUxVdEAPvk@+iWeom_R7T#=-s6Nk=HqNETG>v&g^cKOiZQo1VvKVQpwAs#>&?^kJF z^f=@^Ov1!ZoA}_~S#GUZ$)OQ4Mkr-|#Y zobJ+H!qeKT>>`$aQ)hF4*Yk`|XZvMJxRabY;!SU?|IJ-+86X=?=%L6@N4s;0@m51B z&_y3SM?^~3-#kkycOv4-ezEM`clDUNdT4f(Z2TuwFC`H)&z`<5qK!CMb_X`f{hfK47Nv5NrG zu77*72P%*MYKJjtAg6ySQ~pJxl4N2>?O%TMU#u^S>P&9azD=X zv_LD`vvA~L6%8rVBZ_R{9!)iX2aG-NEXr&8*$8zf#I>7iTAD|4JQQdbRbr*%)>KxM zya)j{!1Y&9^530_{~MQs0TaN#173e_=}+wSKGK%gPANA3AC*80XlpOISt%`8X&^sH z%rqcFPaaVe01gzA4H&u6-=!eHn3U{B=f4D`p|X3V0l_1Gj{*&&XSV>f<1gS4U?|tx z-13@7-2aCo;ZMcg?E4HkIkqNAg48FaloLE~`0&>XB*)XY`f5mFI3%pn8gYFtLw{Pa zA&dV+qe2f9bps@2LRxn)7Z1;3nv;&R6K#antVogW!W<8tlg|W8Bp!7P4q>6 z9R&0`Gui4!T`)NP4ze8P^lbaH2_wmJ!C_#rVwos8M8IsZWRH98F!>L=j7k zQ&34Vt+{qY)u+z><~nOj4<};{rC+2I6`P>GKvpou+}~FT^yHM0BpW? 
zm48LIX42!~Ur^cTWB@K2!TnLr;EkaTz~!srElISyseMqJllEE67`H1b0swE- zy7f7qCKXV%aRE`{pVXHMk+ zvN+LZ#K((qLK}5rUd;UT+d5-?9<~i%rJP*A6exyipQkcuiZv-B@F5`#nzWaS!vlBw z1TN}|w}}_jM(3JDEz>txBrf)h3A&z=oA}gsrTC2>^Z9l?z+oZR5WV3Dk@KPxW=3TB zyc6+BtzP>ncu{joUBdOk58~;c5FeU&Z3OE{Cnsp;)BW~d+&xOQr}b5L%g>#!vz2+N zfqGiKEsDo}lkWz|uR=Poe@Lf$irquigUwBGY)y!fbOkfEQTawmU*f~Zj=y5ahD*|1 z>HiXDkJRX9NY>kyh01~tmBG`LV?8GK8KWGU8;!DV8#kmFglxDAPqojb23;+P z*F-Dd-gufTyPo@BqDU>6b1_7N4v;?l2<~1Nk<=rBh8+(U8V>X>6Z-GXeDL8i`}4RT zFAQy5Ej!(-LEf1dMh+ddA@8;6D7}t7?v}{DD!~a)Uc&Y2dDPdu3WdF zRfs>K6MXN*$ppOVorZ-0cxWHBx6vZl`G* zL|%8Ld)mpVkmccIP7=@ya)W&N@ny`RSJyNd{Pu)qc(gSK$#lhht%KYuHk8eWRBv5c zR7Yek3!7_|Dy@q2bcPflWD%CYg~S{J`jQ8>?7#(Kp$*KSjf|waoC7S9OY3{rH#6 zsqx2`IxS`-#uxL_V{5ku$#~qq?6;p3j+%MG;bsrgZ;M=UHZ7oACu6L1pq+?ESR1?3%|xPX=+Tta@D=xfzLB)}DKRZ9?3_W6L{wy&EvKJWIQc^6tw6gV zka7ERUl#BXRf8A<2Cf;{%pv3}soutIfuw{M2%0M8W3N@MMw&?0qIxdZ#*9SOyMaJc zip!RY^mkd3a9|?i2sQ&I(>6c?0f;-^5~)1|`BBOoQ?p2hHKH3Sz{B8DCbi&(!& zuKJ}psxH0impf#x#+~fR9FIAoAIP~#?b>_aNdb8jq+!AOVbg{voR5nHQE}>}K16b( zb)T6g40OY17|W$zELn`WnucbPA#9*gK$LpQ@y?mTwuYaY-B<>rU^XMC^(yy-CUEEK znPYaWut0~;!qRzYE5p{PZ6?q6GS7Eryv3eG{n?q#+}&PU-Cs5bHWc}g37ScqT@QzZ zHpPo7roq!=D4dj%O)V4T%Hn|a84mTgC$VOq!$~a|KDQ^cnBBv!3li8)mB@jcs>1CF z)0CUMrnQUuc7~NsHkW;exjs`ASnt!n*0P)kT;TB{$9{cEg8ar?T_JWVF6(8`zahxR zrK%>PD;uBuEDSGpqYOMq6C*SJ{zbui(1bJh<*4(iKtS-zdA7FoPfq7>J?AC6nDC@Yei=-f4`ht%%a+C6&}^ssCG+8$0123cevw%CQ&z8uE;vOj9|UKzj9;I zK^a12_z?rw?R&Ucp%6ve8R@0z;xIWx;^|n}-kYXatRO9sjj1iu&xZ>n@`P&KDYq&z zL7wTB<)Is}&I=kRH$)8oNxs+pUlr;GT5=<*ft@KIkGF7(Nv`W&CZ9Xgd1aFCF;bQ} z7EN>Zwmr_ITAO80|q!$s<~OtQl`w_sY*nxY(KPXeI>Oey#8mp_Q#=#?{D?s5>XPCT80`QOIhGX>REeRZ zt=mAX-~>bVJ95)Ts5b)};K*$F9x;omkd9`LZ`(5cdVWnJ+7`OBu}3ISs#yI*x?eOl ztDArHia-ff<=W$~e+)n|%;3W<)5u1_NJ0?SWIxfY<>Jtf(+UDiQH7H#!bULtas8UO zEJxt<+t14{yz^YQOlFNw#pG!ewc#S!7_muBfR2noWUd`M&ESsQ`e{K}zZ#Yr`|idK z`prP<0Op59YQ1-TWfsr;F!|wC9#btAmsK~f>y9gehAHvYWMk?w5I4t;bPW~ty!OW& zC>*O2DC8QgA-KxI*`LHhc=1WCk^WW>rcZJ!v;PCV=g9zOx}5m*X~E~lp7s4RxLPu) z!bd zzuD6L@fCMorv}PvX;?fW0Q5>v3N#>ix6yZYF*Xp}(y%_x2JTxFw^8dqJh)Hum=pf2 z53%KsT8>rj3O8!Y8z`~IDkDbd)og-lQY$C)N%xU^RJpWa;hL~Kt1uVyMzva zDHwAKb74}Z=7~11?bo~{VfpVWU-Ot_HP+~4`Iw_qfjZdjJ(kyY4gdBd5GmpOky5~; z{vX5f+<#OAy){1590iWiKWtheN^11||Fgv|nQ^UA+T7PUN&f=_1@thHRJL)7+w#rb zzgFfT6ouOA23VnxJV-upctX0iZbycazm`_!IO(ks!M(qJ^#3$Wf-eUXMR|YQlBZ27 zN4G6p)O;<^jhmYVXq{05qbdjxQ*{UZG0HRt&{_X}G0vZZPxsHI8hVA*E{W2aqs48}~n50RQ=dVBKcBU70Ke5M$hUT(GD6cG%!e9mWYb z6M^I?+j5TbBm0<@?D&MKccOsw0?!r3eQAx*0U9{U1g)K6sm2XG45NW z6rXMT>CP=u_kpgawS9Nz!uN|QGG#cn2@m@6H>grCkv8iJY-{C)dv;b$eVOJ_ zF6OQ$ZbUS4rDnqLY6c4tmo+2LarNAO;dF%n^=9J6UjghTbs}&bikcp*KYKR?h|p(| zLr<7Xc@Jtm^?zKj^Cb|};krRU(2JtJWlG1r^AJy`8MOX;1nIT&^&@-#en+OF>F)v8wK>h3^KvcM)59PS&AMd_;_r zrO`%x$y(?5T+Xu0B0)F--ibt+H~z{x{{&q&{kXCI8?-+mPPaPTS!M%!u-yRsw4aC* zKHb6q!lL@uN#`NRFefwYg!pr)`ZtXKD@+%Xnevh*;gA=}TMhF~2SG=Q& zjZTs|=c;I$d~B+{>p)LF_=F3>K7y|u<^4gNU1I*GSWyzojUZrrK?8?z8XkewR1q-` zKBDWaApPmn+u@BhrdiFpbV$ydhdmS8%ZNYGS&TM}_Sx?jz8&{nm!-F7d;08l3X>UX z+L!t_AHj=oQv!*{TW7R%$U;x&r8#S%QDG-877U}ls!O!UaAe?#n;ekzRbraKW{ojajE#vI$bTsv!6)M)O9t4A{hr0-1 zm#bcv0&mU?^z|BkU60d;pIC|OlirY_22HCXIxB+x0{9FixdasN)bxA9KD=Rcw0?E| zzBL)?dzjZwhg19lG*XV81M=OmopYc*4e{ch+ zgXv_st3Wp8w%=@~;e7#lTS5ma1Us)|Ye##9#n}qTn2Uy%Mn4%X%)IPgpH-#4f3ro- zj$)ij@hMrU?{LPJ9^IK|AA9ms>cZd`rbV;|?#g7X&J5GnC&PxbCutw_H=r@l#RAhB z#2c}`1phKES!}EU8Q#RMDle9vI%w1_(~4how1iY=J7+J_fn`avI#oozGW)r6LFS} zyjTui-{_%Cjx%?AQc-R}zEOBUQ0CG5BtEgH3srSG9NCI__zQ71RPSt4U;ZmlzP`nU zGG$NZ5M9-RHfJS9;;~4$7Y$;rVl@+JZeAd)Q(c%g;hik6_dI@D09~ILHZtAucE30` z=sxLg#&s7cG;=A3L%in?Ul9J40(+$u 
zo#!O2ztzo>)FsghHPrhJ`dq);$zGy-M4_7H=M78F#ugbexB=nWxt;UtUWhBMSDeu;11B2Nt^gIz)POZCqe2gQ8kPRp(EWkHo%g;T@wKk5{#lJ4 zN23TaF>K&->;MPQmDixGyXPYVzUoHeomiZ0V8~WB3mzpdLEk-&sih;?N^xfE=%rcZ zx)>7S$UObVDVFO@S)7?6E1k#~%3Mo`OvTxaO*?X%SjFV2!)uTR0}jvYeZyL&3^5)>ATu3 zz!+xrH%PmvS(-Kdm%{x9SUl@5CutDKGDc@Wail{EMq7YjFK(8Y{-5u|$Yf=|s7yt#2IJiWOLpVI$rXOo4U zEK!lJN+8Er=kh^Auh#1ekDQc^SN%6*JDIgM0!*{oGYXS6GK@rqhvvyj{e0VnsW6C< z8w!lt@Vo5q*0f}}L;Q((luNOLiX+Lj zb*da2n-MY5A1a%6;c6n9k^oEF8VK1yY{~D%qxcfjr%JDdkfFssd>X`^Tu)|UJZchV zXuXWoo)yeMZ_M#S>sn`zfcP@i!;4^k29$(p& zqPnm!VmgKBiKA|2BKZ0&=;15}bmBz#l6*{aL`% zm&)f4U7(xh2p81DDkz5SuK(?#u4Y}>zS%X<0T?(8Kz3S_tWTobP&+GdT@W1&Y~rPf zIsm^f^^d+^nb08-(})}Y&g*GJl`nLn5Bd|(B#d>Qp4mVimk95yJ_qYn=(BCRSp*TU zuS~w43>_b>U~Yjk^!nqYpEsP5o&(L+0R%`pNABOCaf#zwQ2s{yfOwN*{{YF{`4MWD zA@ITsPvW@n`cA`+gU73LaLK$2$I(O7eH?h$p;KsS{PD9qd;GxWl2FrnI|(DoYoNI~ zq+{kGk}q2Y`?{wr9J&|Q?ejwYf3f!7aZPk>*JuC%MT%lTdJs^0?;uS?nt=39L{OTD zNELxlM5HTKYLH%yh2DD=k*@UKdkrMSZ{uB_@;>K1&v$<3k1%9rGP5U{y=PzRT5DY) zOxbSlv0K}OR5UZU`sYKBEI-Yu5iDmBitr!4h)Bz01PX9eKcM!+;m+^7VCJ`4) z1xkWk?!Qt>Ig=x9?&mFoBmH(}!UHAjHJ_A>mY9Igy=w*X$Q3tn{O{F#nZ?&R*3@oUf%q%{|F2xt|I3o_8%Jjr5W9kaU5x|JgU);q_=5hf z`}3E4?tirF-`n~B^}P;LNr_v!?IVs>X;2a|eIVrtM=K{&>^l)+fd}E*fFeq;NU9ww zVB453`7$G6G}=F(5UO)l9fKqNUxG_8l;cOfLcYDr2*{+goxeG=;M4dEFhF=dGr1** zJ)=7VN~tZm_A#`1800bWn8DBT`wDfE)C6ql9?bsR83WJaQ}BWNtu&cL{=BEifS#2PS9f#7wcxqX}#~S{IV;pOr!U_voBK zTc?i~^ij2@GwZD)ZgQg3GyTHF8_|*FyPOBh;#eq=U5pgUr;qhmNvmOPUEA*5H4>N z_fMq#7@e)Vmg1?dC%5173nFO#aj^pR8otZ9>Qny=q&;i$5hVk^Ae7i%IdR~Qfy~4v zqMeGv(J@Bn+^UghiO+U)COojIN`2MiW(Ij8=I3lx&oj@B(h2wHBaa0p%sSRJiH06% zi5_q5Y|X~{vJDxk4wUmo#7J2ST=O4Q=r}Rl&+$lWzYw6eQr*?9m54qxj_ z?*@lhNXFmhA|XY7;9?g^%lRH?i_Lp@E+O2KOsF<&4(W%L+di6&ubkc@v%lv+=bL1? z`=QvZhIjGyho`D+)6o<%N}fA362*FnS0diU`YHFP2u>zZ5i6*ze@%6c1z;tcc(bb$}aNPbXoBt&VS^~KjxoXR@Fs$f~Q%f%pM8X@( zTr4zPSJ^x>?QW6!wFJ%IXGp@u*IV`e2x~&%yK*~v7?nouwIp!m7)K=rS=&KQ$6xyI zmPt?rdG!ShZe%M}!^C~t0aq^>cA+#k(w9ex>3tp%1yxC3;WO(k4i0^Z{&wQ4m+g>w#Cc~iv{6e+O7T+sDnK-2( z+8PsTDqTzt33b8+!*aW{ zxir0z8ZqY~MJlG{@NqstC?Y6<6HinSf+Rf}PG_GnV#`**s-9wn@%shwG+fLD9SqG4 z4_tp=IZ(&A+-y)m*vDq~u$*}pk3K_Y2OcyUnOlf{wQ|7dd+1P|`22zO{oL}2Z(NBw z>CcUHWbQchzry2xJ)0)F8TA>*ikqk}v1z#gU$2^2^mn+%!yLNs(CkXP9wYaBF``wL zXUDrui`v0ubzO?~Q=6v@@yDX46>KrlUIUqrApDl*_udnJ>b$A@_R{B)AcfE&WsLmb z`yXVllY?{YtnUrBhB+~hm^}!;F2*;1{+@TZ{Y?Revgi375QgwU2t@0BVI`@FjfkU2 zwT5DB)isfqogyxp_yIB;3rwl48A(K(2okfp2h`!AcC*Q<%3n(v`BareE^+3j5t1-j zmvFm$A2`q5R7$TQ!Ks_n$7k+;t$+sGFyBv!i0}v;jZOb%f-G@gxi@jhsn&AqqT=3p zD*9pITt|l9Yqz+N#yxd@&4bSDt-$^7Z_aZ)lZvu%cn3`$k|KXGteGITP_TYVjaCPd zAP4&A^o0jnRL^k&rAnh3JyUTd-nH>Uo)DbdiQBdZj>LJ}be1w36-tl0j5%7N_bp+QI`jPYE##rb3yOCV?UEm86^g(8=#&ZJ$RS?JozrBe;&4*1 z1(WPr4#36JXo~l{l%msrISJCre8o+|Mdgm2GSujtWgM%-QBij{y}9P7!^OcXV_(Cg zS;=XQ?gt)F<=-8B-nrE(vVJ3~wRa3yH&s5IEv?yEYP1ElL%x~)P@-%5!f>ZT2$r9KB?k+V5hK@Yl2(4`{FbGIi??|-l#G9L*{Lp_8y|67P{+2 z229)m)Yk8wCtL4kIMbcaWTqZhe&X=GdA2xHVztkQV&Ng{4aMCMV#&@nGVyDbRA*zb z^K5;PuW;42c&Ov`&sC*1cncAklm&w3tMsW^p|d}qswg0}5ZPWk(GH3M*cH)!q$v_Rw==BW0P)YFK8QH@%Xv-AS_I=%2Nu#PoUr zvSpDrQ$)p9ls}bPNe&Ka9ZIVvxowX360x^-5u;At@BAI-D@tL zpE4g!?TGc)uWeGCk{WI*H)SI6ZF6z&C>9;Io3(sS+Tv5iCKOKuP~%%ca=X`->kl?T zG}?+aFvz6X@eRlLUih6N_PWlQmCgFbr*QiRtU%=z{6Gg$g@{8i)R(^PZTB(7RVbZB zqluoOT1Q&DsJWQK&ot<*Lgb0%)jj+D&$WEdtm>{TT^a1porDg{tlV4hL+PvQwwJy9 zbl{ZjGXg@r!C#P=!h!mA*z+^cswu2_19wDqguq_W6hhed+_cF-ddlsSBo!N{()jeQ z9--w(ut1U%P+x#*;3at=iJL5k9rHWGiyJFe`s=UDO@AantYhA@=hQzz?3&C1&3cPh zX_;Jdwzr?%{)iTr#8jSh61=+g#mEEKk|5euqHf&+N4-;V)WhR8rjHd8@&0kI>o_H zYkrXa6v=n}a_{S*fcJB+-U>!)E|cObl3*M!OOKJ$s!kN%V7>FUe8?!uYmdGj%>!Fc ze5~?nq$R;~iyC>p_Y_;HhBLPQt!^oI7+5ur9VW?fLUP 
z!y0Bec{Xhv2gdDmu%V2R#-o!8A3rTQv`I22b!BsNr`w)B1=0cAT!N&Qwlmrg?OF75 z78d>SV&bgMgWO2|9mwC9kvB5o3~Ja=24fCm(38?8&%IfXW>iqsy%AYfbh74lPnb1Z zRnVY&)2&(qIYEE-Uyz3ivO~Sz9_Z36EIC?Xwmyum>jLxkLU_W3%MyDZXkKs3^g4+> zy4~ye6E(*=Ti%L#4Wq+4Am)BSLZoU&+fcp>VJhw2LBesNP|T;6RH~~GpJxvCPb;%n zT~0!F__0*He+EvP%itUEvY zo;4j64pq!v<97E`tyD+$#~dAQ)duPeY+X_%>?sKo z5qiJe$wFB~k;i5bfsm2gRs!xdY230>J5dd5)rovHvM1^e@t-=cS(j9lSH7Cmrc8rs z@+o+GgfnH0?DHCRiMb2gH-c>KYmSUUi^);DGM81Umz_=y4)p3gp9SSX5;HCWQGutNz`@J++weKQh%b%Hk9-OyuWySQd3ZrHT5mHlS^;oFYEPH$X zWqXT@Xv+dKU^P-(5<#l^xW~z@fCZalnyUlr_El>Vc&{=^8w`6lRFFO1Cy(nu%+%If z)t_)x$(M*{YO7hvFpiGJi79D<4Y#k4A!&8b@N45Da?S>u;X1Vs3;T zuARPlgh5i3C?K~q4SzvyX`39@mj$mUyU(zoOJ_>iabETUEpp=b2@y1BA9^8-aUDR| z*8e;*Za-9R*}6Enxiarypo3TzO4{FGwfJ6&mK}O;FE$!0H&vE+1Kx9i3d=NyjpId+HQ6E zO(w;IEW=?YkQWLl^!<@Wu{$?dqhL9k$J%gtb<|tg|3|08;dA;`Go+Vh%++hB=6&#& zNMI~dYNFT1trd=>xL&qWw|A?8GSY|liP|ONQxD*b#D$i`JoWu5$B+}IwXGGgVL6o3 zU@omQkvs7PIUT`Ha^2}Ew^N#`RkAdrA%$-$${Eg*JX#81<=!{TVoi`!B{=3%0z26lc zI3&|~K;Gnkb%g)-2jBngd!lj^$D1iSPtr@;`ZOn0d~E3`LoGNZ(%qqlS=l17)B*qY zVEvcl$AW!i@}Ot>X=aXGyd<&j+!25w^<>ly|1o%kUikYFsyQ1UO7t3lWMobM+wuN? z-R7=D3m7^0*US zr1P!PlK(0>NiOeCRedWDIIl{){}akPQ1p6maF(t(R2UJ?N|7c73%|IG!S zsTB$b?-6FNXxDdTCA}fly^aZ9F)nWDUFgM;JQRpIOpt$JNBs+OB4v+VOHx=A|AhLk zh2{+O$K2(k&|slBWfb-G#(sdysP??rj}J~ZwwLo1sUA)zAb^(Ua(>mQwVf;lpl#8) zApaGuCvh~;-DjM82X%EKiSJi1CQ(XZyD|qExE;}yNx z48DCQOQ)XAAg3dG$NVs|U<`IGaBtARGOv>CAlI;%v6ss^_1Oq=a zlWdJ^{^F++?L*kQ0!GpT>w`u;WE@mKjmfAHA2F>^y%y?fs1W<`fdVF{UJvo4eiW$& zJZi7nY%9-LM#^Y7&g^2-#gy=cTbA2sp@n|3o1>qmm%X_$M7~}5v-B;O(beWOdpQaiTp}aH*Fv|uD#*JVd)1au zq4Vz2cJ~$sqky$=wD;BRt4#((>bEOL9z{ytNF5e`_%>7jDQcn63do8|2Lhn3SsY%G z747};0^NS?&Ic%>T8|rE_ku#eKolqsE)|l+=9(#!e|}4zM8M9@?+`le)&B$UHNlO{ z>&_kD?7q5$>zgK56ivTK?6@tj3ZTmK^GVKz@;@!hh-f=;`eDVMfr{H3g%fKD-;HnM zEk4%~cq-&v;}1tpl_0QCxfzefy&EAj`3dGDhF+zgjNgy0cF7)xqaZOOrM}eOHmAin zF>lv+buH)`CEuH(9;$q!&^*hMh_29ae{=L>(uK-SLuiI?r|-=R@7+kTy9%fC9K7({_tT+;%U#zfmWvT@EJQA zH~11YifL2uHs@hcR7|h$(2of$M2`T3U3C#7ERj>SYG<~k zsX5VfU!aQWw}Hl79|w-6Q}x(){dU19QHkA?aOR? zqkVWHk7J2-f3>2lRlI=ViZ-c7?uQF2x?Qib`UX8s+k8c~c?Ufv$wv4^%5>w5*gb=3 zS1(2B!gK4t^kS(lw#i;eYHyyR-pH&SP-J^Fdp;{^KA`IYFI$7p_~9Bnrs0lJ`NTd{ zeL}qt#zvzr}p|+^;bny-0DeJKe1@yUFzXc73g}kK=5f! 
zbUo8jUYtQ(G0g#;we1vj(`tJ_C(3A>FKg7h`y18esyR7t@%!9|{D)7#lrOuHbqwt> z_xjS@HPo2P$8U?h3Fj=5m7eQ=q%>(c;Iv{nl5_KeIIEjo4UlE?l$)h~OI|voq2R>( zkbtZ-CK_*fnkk&Sibu*OX)l|oiqT#*sv#p&netgKOA4*#Q!hPs=TC#$!~Yb(2QNzU z5L0nT=1@s?9ZbZh@4Y2&HuX#M`pQ)EWg5*s%QNl}qM4|B@caO^Ffxm1 zs;Bm*-$}ud?mYa$_C~f(z;HKtfo!*$Ybik5@UUj*f~+dh!Wz;GU4K|{`UcAmNLqIm zsl!dyc1Xv}!@sZ!-M(t}@7byI4ZIiu<9I`AlXYUuWz zLJ{OQjklv(6D-JfUf%e&ir;gU1805h=jGiXzloBal+e9)oliwsmMRPhq;M5 zJdvT!cZknD<2=t=Z2Aa7PHum1Iz>+;Gk^o%i?cdmI8~l>l4FvKi(CHs?`VzkG@%8x&6;YQvFB(0_*5Fn@Y=X;y92)z(=8rCfNs*Wzx(o!+mn| zh*#_svQ(h2e3YU$8X%qQ}B(p6uw|{Zs7~kw4@aT+*=oxSU@QeRj;5 z!B{88cyDI}#6}-3Bg9baPo6x8VFVBm7f#|A!~vE(_u+%G7K_~C(8@ya4e1%#r%z&) zk6KxsTQ7f{FI){H; z9$3FxkGZa)5wSgn9BNoOHk=22aT(Djw~RbwEdK@Re-V262<8DA44`)q;l&{OC~~R+ z>}D}rW5&09z`c$T_kgcjf+S)FqQ>47nM5OaIvRr8LHSBEUDs$efIP?|W)&qD)QYMV z7EIn#OAaH%*I{aK4};?yDK@GZk?k zZ-Tbo0%_c-qNUoNDcox{+)KgGokDNrFFqj6J<}ilD6v+VcEu>%nHf|NE}zk~(fvO7 z_?dk)UI{2z{qLa-5WrsPPjK+MM-Wr1s-Yd~rg&czw~}Y?PRpQYYTBQKN35+osGuUr zf~@qhy)RY3lv4zJ$cns@SO6b`O8$(_^!uf=7-@S|AP#Ge`{_%9oH^Ri-cfEAz7}C< zQBxhH`J^vJRkmWp7u9LqB`I?`g|lQsDyc{F_%WeK|Ik+*&(;~9@V$H zm|QIwGYG=MfLnEh;Fe4BR;WQEFpTKVlAk8@y_ZElghvLSyFQ<}0h#+URuu9Jf{&^z z+zrHCWaN2^>q=6)){xL@Kr%VEo)8iBl_YOdViO^tr>*2h|MzeC*KhhCmas?IbEma{ z@g(>2WOi~)eut;((4QQ-e9|qmkkg12q_*Ut*;tSYKOoNbJ@owwIF&H}sS>K8+@PDc zwfwM_&ZU%HIDH58d|Y^T1_ zp_)JV7lfYgLyi8l*choH-`&wyv9)-2tg39GZXqEL365+cB2GEc(rp&h=)jp&xi*Un zL(Idj8V1)yizmJWhu*&VRQcBVYq8V$=PsRR`50>=WZ$lS2e@~=768WFnY39Ak-sG( zmGTdvvz!8UM&~jT0PThcUt1mUaH#RX7cI}Rrj^(FJMIJ)d!Tq*7O;x*W0LX}S-mfb zUD?0F3~sgm%1NHHdpKE*oC9xP2UzL-j(=&PkdpuW=08NnfA#7A%cW@5fE~Jr21@l+ zC_KrR@EVkS*1F%iA55I&H;GHH0|WK+>}w#}rcL*5(pFCrmUp=T?%>~L-u`7&;@kix z4QIx1gk7{El_D^EHA2g0c5~Uv>M;tuXT`BkpT ze+wXSYhBfJVr@`{;}|g8?g0MP@4r++Hx$7CdeP9CeqaNysw8SR_&32!_sj#3K)|lq zU;RIL*jtGJ0uu9sWF5Dh%b^#;&iGhoUHI>Y`YQ1%Yn}U}8lkol9~tC2v_jNcb;P51&+jc&-cvDE$=q$4A}tr zh`qOP1R!=FpXT(Q1~C7E^z9io1mkO#?n8$%p8xO%P9wy{R4VgO(t#BsLOvyRYcVL( ztME{1ZN^5*^q!Ky(-`;<{`JsmUp%xNuJJcyfj)it=qhmqYMuR1-OT&kwy)eW{lj07 z49W`U_ToU<=!9734S4&}Mlo<#VElmi!<_CAe{KPhjewoO%_;7cf8iU(3sbghsBI*R zxVEN@UGbo9l*vJ6Dc*G7ZuON+$n1KpK z6zht;z4NVdN4zxwCpz+k&LOM}67T-_<+1V<+dkDqVD^^TTVjR4413zwems-z&)s4ue9B?JxghDH z+WQgRc8SDCI*5i?A}5;6inBWL$)==Kbnp_vK577lW|wR1;WyrKf4_rlzj8f3WQKq= z?eP?wP*_lsEDg3s$iK(e$=1s|X}HoV^L96Fb&|yBoonuyA9eSbh>2@`UtwL)$(dM2 zG;ap3x1ItuI4+{AR5W2KF>#HuXdAanNqv`?^^J3XjG=*e!0L1HjS$T3RRlxbjmb{i zS;mX%Z%iAR6o}U=UrTT~2S_p)wtb7>!P%F0DynjSZL>lO=YM;q<+Ax=AcUk&&t@m* z{R11pb3e9++1wy9z5X`?WN z)h-xCyEK1PbwPMySoCy!t$HPhsZP0ZO;ew~MQp!l62tNA=qL5eGOBv_KQzkXh*7?B ziT=sVE8Rv>QHN*TsYYIA-`agJOun5aR~z^*G`qPYZHG6lUU$u34wmnFS|lXlWP-`L zgugAgZueenH?a%Un|eqye}Ez{R%Xv%(+yC}Yxk0W-pXs$bv6a~J)0xj(#!-FqVax~ z`Zt-z2h3l?_~B}iv$Nc`lP?mC-YfBZYk{OcQ)TyqpnBk=?Kl$RY4A8N+gp~skVavMhkHj z(uN+ofk`-^9I>ZBb6jn4iZCp&7crj5K>`8SV9iYDAIPAhSd}^|3jZ{qEbnB%b8Y0yH}vg@-4P zvOQZJui0g-7gXM;zf~iQVMiN$ZM~8E+=XYg;d6O>XQbRuluv&h)bmR77~7Wt=D0bU z?q-Xonb;CTUCk0-emA+J%I-w`%2y47{0 zU3qkeFvPp#9Wdd{`}D(TObHiq?kV1B>qTsV_#=h4%mfpK%bZX3X}opn$35LVyNc>) z*t|t^ah)peA2MAQ9oN1*8_7_>aC_nl6!VCa8pdD-a8w5jIwkLnDBf-S)KtY5%KunX z^EJJovh{8(cfq3cX4D13BHe>e3&XviVmx#lz8{y39onHyiD-*IT-d^F;Yyzk!uRv7N7B-HU>me1W27-v8d7ZnjX|Bc_xS+z;!n7=hV$093IFm&0%`!E76y? zdm6^9BoA6LK@vA_;LsVZ=z2bFctI7mHwDA+zF?Ye-@@A$NqcPLr}uX2R@b)BkvYdy zAJu7L=!*8~Qg2n%diM6`mC(_mohu!@f~?1e$kqDKM3LEJs)PMJCQLWBlZC;R3uX;x!|JTqFPevI>W5MQ2B0y zJakwRG$YoLJrF-;T7-q=^{Rz_#kRcpBrO=O_fXyKX6VTpVBLKoD$b>S;wnkEN_F%3 zhw9fG4ODv{zD(C&wml**mvUVucD~}cvKDUGY4xmN-#nL*0zna#t8XDB(NuYCSbUCZ z8W>NMCH?%u6370DCQnnVXU)PiOxffLQLHtc-I{uo1?O{U=3s)%ia0650I+t?HKFz! 
zYHcbcKIJk+@+~yuePLYXDfF>>X0+iOn9VLaXeSb7jF#D{pVeyQb!T*FGVvpcyx1;b zrTDp+--ifg%rBH;RnhW?)naGYz42bUhl!z&?hjQZytO zN!3`(9?lgWZCv%JK7NbdQ`j8+d^T*0#520zl)FdPj@eQOI%y@X zO7gtT0rS#z?)fx|nFQN)T8(&Meq|2+im%3Lx|?GIYy2oV&dMl5^{9(2Zox4*%<-G# zV)ZsUzQl&nUIl0Qll75hiq&SBr^s9fp(jDO7ZRRrJR&kSan&GMbtFBy?eOQvZ7MFq zien?2w>>s|2Qdz=tgI^*HKK#l`4_Kf%QNpU#^QUq7fO!e4d>=TB=eUq6H#Q-$i~S) zH_y??#@;{C+t+@|+_;}f+@G7tlIj;o+#e28L$bU{iuY$BGGO)}MpC=KYVWCr1(DLe zy0XO-cWoLKFC%qnhaCtGfaLoQqaZgRbwu-C{ z`JESg@aSX4dwXylKm!SuDPI15psG-A`MUd)1l#<91l3Ureaq+O(QqJ<+}4Wi6>2qL zSabC$wQ0+?Zhl`C?n@smipkw}BABTU27D&jZfuV))NW5n6qj-@qWI?B^ZPuKs0pV) zBN$nYT*>^!hqy;oSY4zTQ}P}>Mqs-2@onJK7^Rhan4{Y}f#u#_UiVG!+C>ytEuSG( zwHSS|tw@vdQ|GATQiL_-jw3rr-9$uxk;4qy9gxpPlhg&NsC>u1q-9ugApEpU&y1;|iuu zyCVJosThukHuiIsCzB@XVwW)k*P=l5KzaUg?CAm0XC%H4!O3>mu;`L#ya?Fap=@uQbRDQ~H z@HR+Xa}@lM*8gW+4=_C^gQx_C53J<(r`~Hg3yXow#la`GP;l ze%(#+U^ZMP(;alh@WQpK`b#+sX8RA>18h?sT1P5ztt!7i7pTeiZE-{{;Dmb!CxsQP zMGNkPwZSerHB)a-W;;93*BkRTKM8@F)awbf))_W|w7-=P!UDJ%FrXzO@@|G(lk zq7{CgKWD;ihD^c9Q7_H6H6LFsZ(nupZ8GCLC?+5b?{vaa9?kY*xj+JQthvf(e9|C! z&Nk=!6Cuw%KIiKwrkq8I?@WzR-+|q?ey6^1$^rFzC8)t2%Rdny0+VD-`+ta_sPI

[GIT binary patch literal for asset/fig_teaser_new.jpg omitted — encoded image data, 3102385 bytes]
z>*T6^GofEc#^~Bqczj7u&VPhF>@GHJsguqwHs7QcW9uYo`(JlSfr0$*mBs$k`!czN z6AEHtme^8L`~^GJZ5oigZYk+Xajp!JC+v{vIDH@}QF5>+51ECX61n{XVdW-pT#3ON z*dGk=ex=+&cWGeQ2aXeJet`_K{4kbsSL^lV9DLb0o4t9NN7+!eJG$&jy(+lqJz!uM z%VwZa=eQv%s5lmN^CCH3<0Al_-qmCuqt6UIpXS73obC6<@m5Snz_R$;`D}uK70@FD zU@fBg96P-4lV2VjX~>Xlp<8}|yjImfINaTxK8QPEdNy0$O;WXmZbQF@ZV8N>5Nr!( zt?7vBFm1VxC23G)g=o|xq3&fd-M1VZ4P8hL)3P5N^*P^2H4c}z1E5D0USPik|8m@4 z=|HjLSkn_E|9z>d&90{jiU{H1Kc$i0?dhcPY+|V4NUF-n_?&LCsLefFs`lKgKRC(R z9x!PwbU$T}x$g=caik(;>%XHrk86ZzZMWU6X01YYggVP*JuJ;RnwX~1qZj0!9O2jN zP6+wlKT`0#x8k9($wlD(T_d8WLnC!-oC6mDvBBVjxTHT(r9#- zjN^E=C}5sW#`91tXutX3NPh4Lsjkc?KdVG1yM8~yH5uvIHk4>@V?LdHHLY$ccWRW{ z=G^geOr4iO9VgFcJDY2i+r`DFQYMeY4Mrbs`86?+1?k8CxZrV?&68baNsGFjLXadU z9a45aDUq#|Jy{HN`oVv6%6lwS*`v=OqV-hMh<8LefJ0z|((1bR!1fcA%~dbqH0gNv zmtqr^XEls*WzfsKylTP8!m^E$yXd;)@5_6h5jFITZ>O%lx@L2eN0<4jtbK;i&Ds|- z1Te#`{4o8B5I9$z(P#uzko-)QR6L5<2P1o3GUUdNd`ihm#* z6x=mn`{Mtj?X9Dtdi&_n0aOG86d461lny~!I#jw_N}Dew05F+d-l9P8PlW!JCNH z<6XLrPF?4dxv)DQKjJ*H>mX%-A99B&KZ&!;tM@|38_#Ltbm(U7KGwXap1Z)4-`>eb z+_PM~DXD7TS!hR5&r&8+mC;Ldf0Zd=0UoE(MEuF6m`+fic9$=xW?%k}hAA!r$A0UDo zi+XosbxgG?>FEp8h(~ob)P_z^N&5LfpVWll;nZ>Ynb?ece3h~MF!WOw93_Lr$)S1D z6xsZ+*yH6;BKddTW#z{<*OllIJ>7)`PD@8o9#)Mxzs%ZDd?~q_DFJc@pyos zdB$00abW;tK1pjdNB(-78#4~;on9thK8S!?84OT}_e~)Sm=?4)NFM2lqINeiXMHd!{C8hFVAv|cU6;zv|WZ+VF$yPb$fF@n@j z>em|+zQxTBAI0F62U_!g&K|P*vSe(MNUdpRpj99KV{bt zmVW1~^*S{7pDwuzVtfdsPf?UQl+tTz(xeGh;_wI_bm94!zEa7u(|vojkyru>=ev`+ z!fQVBevj6r`lcLKg<&XM+h6jvd758U{+RF;T&f+Fj0io>?Fm||j{8?;uCO)1rgH&yMj)w8`mdl2b^C|D)^vkhsDXm&@)-QPoG7=9MZ3TPqY0?EhQ zl@;hm_VcQAtW?E5Of`TbuFz&D;#vlDz(%D&&~4~xhJP{40}+{@rs($5ZWf>hl7Wv4cZq)i(z#S@`?zq zFB+U)i`{qqWDlB_deSO%SzP3lNI|buXX8G0vFQOH6T;(|-G9aw#1JI?eEfk>$6Lp5 zD0LNB?9eN)IOdK9b;+qXlNlLrh_5N6ThRFO7qbdpiYU9oyNfA=A}Sb!+g=>rh!~GJ zrfsAm=7EUTyZ#}SLIJA;3Q|#1=PD?!i;e!PuAIALu|ZQs_){~;>VEn}Rf5=11*`M~ zpvhAI53+}u&9nr}#8p|ZpxJ968IzN^2Z5NT`6^Jr_90g}NyE-$5ItO@JZtX%?AH08 z982&W@C{5ifxWABc44F{uF-~4E+ulVhFWdxH)tz-i#aF({fJMy9OICiio}|zEc?Q- zuOjgpL!Nf98ss=UxQddL(M<<87D4(bIMT<4fqNY^O&9)FlKqdc0B~9PZ*AV+*FCe9 zQ?P&l7fKjocGMrt74q;80xSFbBBlaE@0y$hdHU}sC@cPT;%aWgK_d`n`ET+y3@M#x z`b-K7d3c}Ul}#B)Iahyq9r)Hp!NvIQCOJx7S;9WG&9c3+kGdkkFi@&kN&l;v$q!_T zeySd<9z7Ie0^lr*c1XqK4=B4Q_S*nW#RRLlmzBp2pw}$`Ow@h;o9%$|8a`T+FCdf+ zugv=b_MXmvaZ14I7Dy65n;d|V`xCaMcQp>_e||FL_d>1@(EDw>J#`oNZ1kV&q+%2@GTfh!bpxA*eHkW1OE9al;@j5P&=I;M^s93b%*knB zXSREz2Mw^FGFxso26Dyj^nU4=P`bFL#j2Xwudn?>*F{(y(N9I0GDBD0pMGiDu%nBJ z2Sql)4$)msJ@2o8(Ryw-_>D@*EoMKz-=IBPn57^tK)uv+9iAIlEI6L{R=UweYtS3d zdy($lC1}g!L8RQ@#yw@3GwRE+x>~U+_p0(mqXz(V-nC{wc@5Cp_rY{TCgle?S;1>> z+uMgwtL8oluZ0g0)g@k9YV&4s9;Q33I!AtNC6YSJKfo1vmNC?FkNS=IOPWx2@Xq#W z-H>{b<=z0VVZ3&)@M=U)T1e_WN-{L}Xy8`Y_Ll{6Hj##klNHi8hiMg+ffF{B0%!;A zg_id&y?o)!T%(P=1q?ku?YB&ab+*LLOLK?Q0EpuP1{#c*Z8hJ>_d7&o2P;jTc9o5f z<9D{TKQ^@UlVTUku&oH{TYjZuqtCPy1X`hOr6C{~(4l+MJ{@L#&Of7DjAL_us`JG_$SYkntt+3In9 zzV~T|$lYNfgSkPigCXlr3Xg501?<>@}JFwRnO6Fwjr>54!R7DZK zjKG-=n3y8eIqjdnk$f5-Xw9l@)i<2l=;(`xT<8h1Zze@#bTRL#RT(es$!aFjFQi%M z|Hv(!Wu;<64+}+U7C!zmfu(djw2-x20>f_*8mr((Vf3<(UdAFT~TjnX_Q@7l#K*#Dd(m7M>*fQjbB=D2%j{x8TJB7(;oU%ypdcdGA8 z>BH?X+BdgsJNOXndI>RA*uW2<@omJ9ttarAJ87*GZ%r)S94xCW;zZ<4I=^A~94t3t zA%2++BiJ6VsF9c6rXbYV?{%$)^O}v!m5L0J5sPgrsVYH9-r6c|^xCS2rnNX^e~q{n zYW7i3<8@DVm4T8)%nnoabeoluepiUCv*32%%+rzc2PKp1U%fH7>dZ)|*u9?{6uiPh zhsB3E@Hw(j6jc*-a$=sh+NQd;c?rE^3HN$4OC`Scsurjgd4Brxe1chpZ#daD2!)H$ zF3?ixoc2-9GZ`@#+Wi1&G`k|+qTh{!^EMx$X%3vkwERi8EWT#;DH3AVfV7O`%8s@P zZk_s|TeOd~vCJ$#;(TvfOIp{n64AYN*PvpcU9CVtG4gv#se;RlU=l+1xGm!iXtEFVSs#%^yLr^RZNDPDI~bw|w$ zwX8lchs=(#fy}*t7=yG$n4|@l{Ke6rqoAjOVkOFo;CL%bTY10hOQ+E2maZ@#BB0Tv 
zG33t11l9Q?w@_aM_W|DRetMGFyL?z)SyaUIs>v4>g6lzeC| zPpKQ)I><*i-kLH_a@N_?-iT7Qeo%R|AkjJy(fpil|lRVU_KcEsr`= zOR?I_Bz3m-aT^vhFvArSKJ~Qh^P)Z1*fD4+#`M19(s!(*BVJ_-*beQ=h*(ayG}-f? zSBcTHxs*4KZLOXLdJ#`IBi4_ZO{)^T=DIp-Ok6qzN!VPM8`B!Ys3=7qo?DMCq3?%R zV0p4cSEy({#Nc1`!RoJY9Biqve#4wJenD|j;5}RLiVa<^=VCv*5ym2q+v#{ysB_t~ z*)2QkSo)#uOe1xj#Js+9+bQzv&me1EWfUFVjJEXDWI`ze2`O##OU~eHJ{(hblP@eK zyXDeUK1z_AVJ8D|Uh!j=h!%ouRa91Btlyn4;R!3nj$}&yY>qVbW%+KDd`Bf6&vx8& zDaNF1PK{5;GQ>ixui1psxy5|xliuYZ+4LMHlpbJpsBWSX^!+YU@l||!RhaqZNUj|X zDbFx$g1c;!Gn6pL4fVw%DXX&}*i&9k(6rjducl;0ae!KkHf@>mpa7QFY?02m=nUE& z$SGNda+6~dmo;0Tep!9}ejKWZvAL$Yc+e*CfK8evM<)y`EoRJwu{g4gVI&M&$W0^2 zi0jMC6M7^oa{1%?DNJFO^vcTc$yo9+&&MqM>l)G-u*||(_W%~ZWXE0ymCP3$ByD^m zbMcFCBM+yNan;iAOjXsh*~!sLOYSJ~y$L^t3Wu{7NX~M~mq&olpGarzrLdqqjfpht z6^RLW=C0w!Wbf)9LTR+0ha6*Khr>QdF4$Fp}4-&o8TTTcp z-kzvH>S@X?jW~f~SzS^u>QRJaug-#U!%5=w7yAJsX{ur)Ogm?f8IaJOSeear2AsYiXM{zFD zQi?A(?CLEnlaJ;2XP{1!zd?JqGSK-4LF>xx&JTSJDix^+3eAo?LGf?&r7?w<%>vo;oo}!LF4IS%q6OMjm z5jVP1VAfS@d{(nu-#_2qicd3#{Y_u3-7WgFrkja{i0K3uP0&~}foUVwB z@2fLHD+^D~q0xuIFofYTUAA6Ac3JdOT`W`St>OXy*Is(|6OA{QnBk=~P>m)UiVnAZ z1Uyw_$;^6U1|XrC0UYBA3Sr`NGK<1+%8`cX@g8+2K<) zG03c1(U+BIvS}D{c|)N{;$h~;-Xkm zZqOTzT^p9$TzgS`woJL4q^ORPS$BgzeHQSTOL9R43$8zs$z`eX$Bqp9EElnsb-~89 z#UdOo0fQ}E&~t4uGkh5mecgCP{HgO09w9`p3?GyX`WnxP)GlQMGA=KE`^mvsx>~Jy zrBn%GS~!#_SV&^Pq*fW=kAI!is&y$bg1M>$?hTmJ5FnEv))TzLwSyo}BdnbX6$n>A z&2#hZ=dVo`AphP-0w_G-41r;ujm?exHxTe`G~N+-)NQEPPyvSK2R@iTOKahVHVCT5 z&7w4H!kSd=m44f8`945f_aG+3GPf42a}5b}sL%RDDD)<+>TEc2?F%Zk_v z0Cq!~%<>Arz(b2eljw3gLQ+3SgEMw zp{?v`V4snliI{*i_X>ia6a5K;7tRZM%Iw+inqU3Ca8t}HbMfqPu*;=?7k&SQ;k#Of zfc?6X!&%`2-tb?dRjiKRL{e$oEZAac@Q!~THP|_TR~?;|0xal4b^3ff*hGcv9o9L7 z;ixXgrDrt^z_tFc+_Y^ihcP0m4ldSb;s{P2J%xi zb0AIyElDw1#W%45BL4rDq6Jj(KisDOZaw|Kkuk$qk*Y@W!9?91UrvJFC&Z}0w~dat1{@ixH%Rh9h3O3+txtS0@c`^cugQNBomb0P z(!Gq7NTh~6JNV>iL(&fJ%?){wCuF08&uKNEaz#+9Nzl{J$w^*^(p^ED|5XwAx6}4t z@%`^uZFzaX9V?4b#x$eU`eh9_#BFIAXy6`F%u4C`O&f}V7rN^8y{e+Ih-0Q79h;9E zPStYSo{0WH5L*tEbf$W>M*TNzQlQvvG703A6lC3M`MD8^`i?^xnT? 
zKxWO?<{3CkkM(W9`)lLr&Ckq5aK?SU6%5yYSx||L`keYGCH^eyB3gfMttu$bPwHPsq7dh#95l39Z2XXO|wr9A<=&st>&()x%_M5@H-ogLeM1bEA;i#&p2W7{3)wyJdf zXJylqbz9EFJ-6H8`u)N-kW;O_p3fK)mOmP9olA{~ka&m<*0~zltX4j~eKM&yG|)j^ z`)D?qq9K+fNpAt9FI4}Yh^Hj>`|gQZFVK2ejd>Fd)+B7L26dk~s}489Eg?k*k^s!A}c6m8dOgB}@mBX462Q?4YbhE0$jD-zCkREx<;M zrXqP|G$>xqmr!oCEDyE0qu|T*Rf$+q%&|6-;;QVDBC*F+XZDvpQOEBzKh1mlS;-7- zhX}XYBJNzf)86>l9CHz9vL<9aMpUtTnoEzh1n2wlVU+AU2R{QU#x>pX+HltwY)JUY zP0S+ostcKAQeHlm%b~N1SfXv$LeNrrHXwf$eyqW!p#fI9K)Oiy>X=~xXMZE;XuR)s zb{2$+``)CFZaQ?&NMYqQTe~9P)^(Tj65P(yLOTuU!1A^ITt@x^DPg7$zIsyyVm)-W>F#H1E4SZ(f^kBvG}+QBAIXxG+F)u$_kE#j#FF z(D=jFX75KW(Q8)!vuX;Awb6$DfV~qHkFZsbs@pcFh4nc+1*h)JH5gM?47>S@d}8<- zJ5@pZ@9w|!Hir#g3SCmAbFYlEqPG#prAmM2NzmrG)ZEd{L7yn5 z$5>L;TsGMz$yyuEEmRX&EmTC@t29xUcnaEq=k7ApQZz)m{?FRNA>R=l&Ju=B4hZt& z5LZyRz|N**+o8;2i!IdNGD*e6dT&#&=x*YG@R{GD$P3P0&)H^e*!R>TSRT$%`|(rK zTT$p%HGk(h(;149zA;V77vu{~+dt!2HMqCGT)t0pCP^#UCeJ3`ax8>31E2fl`sr)sSawLS5 z$bNpa>)NR13>Zf{by_0}4A-Sc?W8Y)j;GHIJV-$(u7dMCs|8Cp(8jT{ng(be+3`oX zyu|{3jlh@C?-G{+ZLnaPVVmhF06|PNaF)afrj{|Jl;_i(6m{zpV zn6=`v)8m(JKu4M6gB*KXx&GnK#o!p!TV{v>wTy9}p~RCPx<2tVfb|!=+s=21q7&g1 z0Ik(E?k@@s^;8s;*h!fl{(K~M4SN}T7!6&EBhx5O!oX!Tv`xkx4yOxq%JjiEs4Ry` z%KBsYD7D zR)@TOt!-Xl^ukt&kgr!&m)db?s*{_LnZWj}u;?p4 z@2}CGBp7Nb0LpSW1@06#)N`gOQjgYuQyF^;mr~h&Gh(M8U872#dhbLd)|EoW>$;oz zmqFQW`*6VOet;X}m1<#)kq?L5GSslyuiYq?p~1U*gF2c%67hmQ=xH|2`8hTBc6j1b_%;>Ph3J$C&un zeez>RQ)N>b?O;EGGlbfR->Y-a7P`T0UQm70Vh(;}j=S#Nl0!oCaAl&PLHxw!=x{&7 z?8PiO<#O(}QS~8Rl!S86^!=ozifN~B63DpAi!SB@W=>A&YvDMD*=VYu{R^^D-P#kO z>IFUD_x9oSEFgc9>JHnvcma~559o4Kd?~&3FeTmG$k6uk-<35R^2+1hlocKhc(-~7 zjfpVSN^fSrsS2^di{d08q@i^}FmT3W)jknAIz>4Xu{12{5gO>2IPSa*)hkr!O>+(S zCYh|RI-X!sxjkILH!R*p!`=z?j}|bn?P?`(&BMXSx9=4wyd&LvMEjU!_1%vUtsS;R zw4Bmc^zpmW&mUd2ZyIaA=(L+LgBbJk?3H7vZrMI`Y0yvLrzoANPz!IZE?}@f?bzu( zj4fQd=REO1iY`R5;d!fiI}OdlH!l4{Rnn}T4_Ag7iNK~AOcHH3)BkL05U7dTkbZlj zL_VD$Bz47 zfeqZ#FPYB8N9w!_qkowaL-24dp>8_Pg(N40t@5ahdcUzueOrO6nmubjr7o zK26es`fKnP;1aPjVH01BFw4JYQ`$x791D2Q*Oya%lm^jb)Z`ck2#sU0ar6?%tj*>0 zmbKF%V7$1)R`^RP9gz&okeoGlW2xBMCGX9<$B!L5D5S`l@f?S(HP%+#9Nc`qY zF|G{rU>h=OJy@lw<`Xa}S)C|7KR&R($UceFgkUJJaY#=y#b&=wB6?nhE5-#f&0-dR zanvf1V!<6j4{*g4wl(qa_w2mzm<@ruBL$Hiop;=nFVig{6aw= zViCg()FR1{IPVHR$^TmqV^$8aGNv`~MHIyR=nL2JC4H#o4A;z$(9t5LM@ynrRHWF0 zo#9X6aJXKRJgtf%IF5?Ip>+?SjVu5<5E3GWk6nw_#6FTCK(C@6?5=grvP*z#xSa&M)RAaos0Z zMJ@Y^Oifdz}5Pff1j>2Q95xiPQ%&?fOZ6Za$b>wLB{5^OK9cC(Uf-sI027l^f*xqF3uK#~!m@ynS}Db68f4I>NRa=! 
zvB_=4z0SB|_!O2+Lww4p_9wuWxn=5so@izs~dfF1!k|-hgw@(IRbcE}4wack7_2&*cD1hIk%-#NHo2l6U3<5^a zpHUIB$Dc3+;-2`_ma_-E^uG0uLk;P>&Glbi#4-SBVYkN`HnfD55j{%zaufv;oa&tKbkE+z-$2meI&zuj9wl@4s$ zRk3c0N&HWK*HmX_R^EG#R~btoSLFmC;^6-}4C9Oj<%ADi;Neeagpfn7n%^x=M<)!_ z+|VZpdtW-@k^gF-{l`@a6KK|5*W`PV=9}>c;Wqrp9C&&PP@9K4*t9%fcvQh~@=Ea&rKSv2Gz-QA(R`V%pHS!=W-a z?s^I4-UAXIULs4L`m;6t=QLjk71BTv!h$Gr`o=`HkX8pJGwGNqOT9aE*wkiV_%x|yxEG{>M9P77dy!%~vh;l?^ zd69T`%Kp5>Iv=h79C9%)k{6$Ne(P)x+~p%&2(A21G!YQfvi;Ie(v8zpUC7A{#PpYv zu1l#Us29DC|4~&Xhbo^!@jZ_5yBf2Cnz;-;8xxf&F>0RjZfTHn?;TYb*^tHP7vk{E z7LMk2=yi`;+p}CV8VFHlqxvPA%d5%OhWnAS(X3@4l_;v3BFsx~;~wIjvn|&9Yn)}b zJ;f2x<=y-Q z#4efbJP9$FBMjMYYl!?T|BBSie26h_X8Jw-@=1*$q#3cQ)t2O-QyF4=(}1ftbd#pi zw%7Z%z>RX}Tm#+p35VPVliJXM_v$uwuH<6}j%WB@L!(qwBKMn;kX=I04R3x8{dnI} z@Ui=IL5?}sURUT`w}=VThBR!g)cYdO?@Prw(e_&yZODwmH&knS_BF|o#mR=w**Xt$tcaw; z=iVn%1`4n9hT%A@DvDT-2usS4HuVw+j1!U{Pwr$%95>FoZ{x3PHgAnGy&CD8eX-2@ z;;fos+X1m6IREaOr}rEpWZQ%1^JE>7uJi_}g9!n=%?`^>oVu}qcg`8pnLr%WU3TAX za-H$6dXw$bS0h7pH+C{(5}ne`2g)m#$9t=zhwE*O84rc^Zaj^iHrxB+EZO?>nqx1A zFAdMIOIpjs*ayd=;wL)3^EEy`FW^m-T<2nx0X1c~a=|zG$qkSGg5aVr(%ABb?27Ck z-t>L$oYUFPCC@eq{{rrJv3Rvps($N21kR5zYNPwKhI}ia%M{`yDq~3@^Z~{bIV~!& z^S$J>StT7+9<0ZuB_5(9&ARo6))2>*1=4;^a(jbp5yE4?OqZd)2a^wiCJY|G6I<_1 z#&n_5Id|d6$37iJbxU?xB4Q(C>J~!js*2XwEEY5rReY%!mnCFh5 z59`F&LRB@70QUTOROE>+ny{s6LL_%%5osIH&+KJr-_-B6rg`;aJ(S=?Iua@ph1yZVYLnR!HgqJ+CF6W`{nGKG>V zPRp6UDZ2iUV`{f7t@sk~^VqfM_M{Y%FBQ~@hpJ7#4;~`4UZG2~`@Q{!r^Q4%-vo!E z#3B%62Svv7krroYWW~w97E`jQdG}Vk2`^el^Q%L&VHFmY>=b;4@x}oELCIGKU1$D0 z^&&Hq$dug-{8dZ6+uapjIw3IZ=p^~bHn$j9PR(~FUZL~)Humznp=gCZ^K%w$83S=b zHf*fDEa!TzovfXlJ~G1TZMg`yP)Fdg*Y)f^<$jEbN~9VPI(KGKe`sQsm|zxy%fl6> z@E*jEMg~!*Pcc1Tfy+tRj@xYPqWpx^X|^?DFoGzNIV`ooaBZL77pB(j#c#16z4U0v zEt_$C*jcMiP+Lu`SixO!vwbNyibomm*4^O-OMmpc5Qm@mKfZI1aeVgM$}oITA~;63 z;2o@~gI_Lgvc}p?WvcSglf)UiiybVa=2|i%L$?_Et49%sj-P*lzTq8bzq)6qwdXzU z7=9k6x9?oKJK|>;8j@hi35yWP8#h{R;P7oE?IqT&wCiykV5UaqPQCZ0L{joW(FtFN zqU`ef%W@RiWyGP00*3Z8aqQy!Bn9pRFL4($w~2oY+LoEUgOYs?dynyOKQB&gqeDoo zqo?`%Ba_AJtBrFyubMhh94i$Heo2@D?X;k!)`f3-=HoS&BuP5i*gL3M zOd@Pel75Sz+v~SHjEm)xhYO@iqXtr$@_gRYUL#1DUD+_mG${SKX;64f)q3yM4YdDU z{PQ|GteUWU4>Nu)JCQCf)O*=c#=1=};jFYcJ@Asj#_y+MwNkW5E#oW-42{b>D7JKk zxxc}`C=Y&A9j1?)9Lm?ZvT8AK67#x|x&iZY`s;l^k$V86AD{AX9q4KsbaF^7BjQRk z+ezxu53uMl)%wOobAThEbn_yOiatlhUfdmmwFn{bV`BbY0iNBXztOlqNk?!rLcBg%hLh- zZn_??qs>S$yX-XCRSzmd$4#ek?l<$nelp?R!^z+ZJ_;CRI`mh(Sgr|Jct~n6|6h=a zk;y0L#4Gb*RRIT9)WeEjDr!(hyKyc?i6+mh$%}BRd3xdPU9z34S#dw64!n<TB_mSB7dBjpBYCnKx({?E}4IkObF_>`Y&7ZihS* zwVQ2pnAHpYvhrja8;a>pHI(?odqauhSL$Q2e@y1GBMS1@#M>YW)oRa)#wnO-dWA)H zvOj{J1dNSWc{>T4=-!OY#qViDbxoez?e$!9O6TFDKM%T;))XzDWqR)Hnd+P@9KxxU z2*-Wg^d-H~V+cFf273e1%K@Uh89)AljP-b*;tg`8M(&2TzVSIIY@J=s@zRUlP>G!J z%Sp!;O45Uyx~nR)L;x;&Bu^t#0{hxst^MueWAW8*C!@A&Iy&Cxo`fv{s-i}~&d2pH zh<{PO_is72W@F~*r2}Z5_1@J$Tvj!n(pjFeAy}YjoKg-FDJzL1C1 z@zj91R=#$nDt1uDO8WF4G@#l|{=7jkd>tv$HjyOM0KT}Ns+v}@K(Y^Grr%ID7BKJ( zhlfPk9upe@E*LLjKAt)&)0e=)3gFqNvZnY#p+oycpm{OfZ8$$Or`AVJaCnVlCi#sa z8FsM##K|1iNr))imiLs ziNVED!Y={XjKVdIdag)W<7EI1!k#C9RAwpm&Y;fx9cHK8L5FKEiTyp9EtpoX0e^JV zzDn1>KYx;7Q-0K(M{r#RsV3dVLNISeT*E!OCZA(to;$lPPc$J!P&>{}U`tIq*B{`I7_>JE^PyP)G2>$p`QN zGcN`{?20@8=igORfz4gaP+T3R=|6*@VgKh@?a+toMl?A0Bh0Tks2zgOC~?4&5Q&DU zj@mi@Ly`_ohkrE6y3i2UKq-hZKp4UQ!xQA3l|<(aF~PzP5*+Tz_o@7|@Z>Xns{gFM z-Z5w1GR&CHCW?1e8M$sJ4-N+brYUDya9_^K4DtWY$=I2{nmPC9AB@#{(Z5e0w)<3( za`KeO(`?l%$lCI+DDq!lM}AcuFvrBa8Zs8~&_Cxd*NxOnxJ@>2p;A$PmGAVZEMSY@t)tFkj={a*`k0`YmG zuYASB&dg9;MlvISsM+(ISD~ca>|&QkNK}#{rA@iZo{kD19ygfHx)BK_%L-VnOX zgKsd`K<6o}L?Tv?2Id&$eLlLhA&6a$p;2Anw7&kD5CN zj!MPRa`AVl)Vg|7*2?#>zqX|CEa=B 
zpGRiq`+1Vp=N9Zu$#4SR!!l+dJkvlo4$$%#e>^6hfl{x zm)yB_C*H8o;KD0;@U*JFvQPa5`E30y)OfT!sF~*Lp20`;ysk6{uN;cE;u69n1ogeL zl9#bz480HP@yRtdJ#VVdC*!axI(0VXTTBkM-owC(9?DWyd|o*s%-7FD37THw{N&3? z&8N)Lo^KlHTy_~C*$Q^>=5tf#x#PPBO{+hwv$Rt(`WaHiek$@N zPQGIh>%F5uI5A2Xc=?D5x6ajm2}TwTEq9(spX#OMkgQFqIk9hoo++VOUWCug3CS1= zb-Rc4^wlBwuqILoYI9FWmrWDiRVFTFRq%gB$z}vTo2*`_U_PRZ-B$CO+l$b$MJ)`d zT5(4Jup<94*EUtx`~ditIqD^R`jWizx@2&2YFO99LW5uwn3z4^?=@p~e(_^wrKPbV z{#Pjs?I)Gi_KG~D6MajR%g4?dk60zD4>aQO9E%sP#=oLOZ8>;qQZaAMNQJ()KYCf> zvXGD1AjECjT=3pWtpJJT@6K2!JTwN=+9{#IclAr3;bMBblgs{sst_-AQywg9rMn;B za(neFqS>C3pMl=Dxy>P!W@ou{F6`Bc)aXM?!B5(h{6ree&P>g-XJgzX0n8V*hX`hw zUlGnJogbxYlZUw7m2QN!o!trN-`9uWC{mzJP<0eaZc(l|lQQfigy635=iuEIpSVDH$1FfzDl z67BoFU_LP>Kvg4fPXG8;pF^HYd3TRpmtz(z4LM;oQkjAIbAsKR7R&NiIYNLUpV7-Bk_DYNL~)8VG3HvGIVL-%5C4f7^FBeX_p zI~r4|wu*RJ;LnRTe&45nwa=Ltu!=vw9McIm^F-p^Gg;u#>lEYLP!=B2c`?-8K2w9B zLr1iQ5xxj8-+bBYi1Z)V@Of(+nWnSQx_mpYW-ze1FAJ%%HW7}n%+~$%Y3wFP+N(y{ z6ZiH;(&aQ>9kO}KEy98QHwTCZXsu93!n^t>M?^8$RnuEauPtMw&)eHwpy(fQZx=66 zp5a(@AMW!!bN^I>$c~&ciF%oQ#3M)3En+cS!VXinGJbwBfo2dyaIF$kSB`&_owYKK z-cW>8@fX55RQ+BuR-tR71wCVQgqfo`dKF@6Ed7uN++%ZtQ`WS5;lQA8P&{F{1B zXGFAqqmKyL@xuJ8f(!rIA{uX}&45q=`kU9k)?lRzZ+Oo=1+7(N=LTr*SCl)k(Nl?v z+Oedu_wD>Z(2KE6ueT~yY2YvCjLW{%ds--JnTn$Q;1UB{$2}S?gBXGLd=_1GD7Uys z^wWUG*|Zt1tl{FOksna96*iOt3ewJZ@%(>hXwc{@lGhc1|}UiUHtcTsU~V?4qLN@xu_?L+!Y4~^UxyiqS7 ze{Ai2?DzW7$7gieU(Ehx zs>6CN71`WIBYmA;lEzB+h1@R3>}!8!9Na{{e2+npx=kqC$1eS0rZu!DB#JjKKS|90 zdDS$iyeFU>Ez;Cp2p?@uC^}PoTZM&+tTS~VWkw}0I93m`LMfmLL@FwS;zHj(74xP7 z(18cayz7kV*`WO~FEyqM=ObVxiT*h$RZ`6LAyrbUV_M1K{m>rCe)coW&8!7+LSgk6 zWFr)y#(Je~421l#`8e9D29UW*7mTU-yk-+t_Hn`+&0Zo(DfjvtPvrqb97EI<_DGAw z0$mR|WUpTruriH9*~!E4-X|T+>*-S-p{RgN&}`}$mit9bwkp;VR@q4 z`^BuHmm-04+9Nr}m*t|eiu!W0ofly4^IqmJ$al@g(;3jq1_CP@8k0dwFNnybT<14 z$5kYL$6YLVGn)BpctLf%FdiN9jZ^r#{;qrantrv=^|#M2ro219-3j&=4NfcOri?xR z3o_+3(}+ZlVVWPDncw#A=KfO2=zUM}-DtgB)6X5%CvY-_9$S|^i7#Wns_?h1$@*5X z3vS`H+m5>$i!`JrA6pW|j->T0gj>lp$UMm~btGHQN)AH@!FPu62{aQMD!XH{~lYKcpD1fxMbZ~L+BjtHzykgO`G zPL1sSTh?*vXD{D)4RTzQitHO8MZZ_a#P_m2&bq{XfZo>rYBu0qUMKo)kmmik;xQof zxA5M3!Lugn-_lFGLb_&tuItkoKecgsr*`xTe(A+3YTY?4d{+ zEwi`Iso-4vIq@fL_%0q{h!+C}br$v57fC8_lUp9*J4^d*L zX4WWSxpYe&`_Eobwa-x5oqAPRQS|7R);teaDg8+LfccxyZ3#!m#V%Gsu42LU_i)Cw z@4)HZAfoX;a7<5qT8^&F$vS5l-v9ZFLd5JnOHuV=sxIR3CTCa~_2;*e%7)2?(b2x^ z24&8h5oo*xhSt0t!th^jg=#;^l(4Rp!=?3U{jL=SM1EsobAyN<+^A1n=;EYo?n zY%4{j&>hLO-;(7P-pKbC#IkW>uOscchv%DFCCu@cRj0&S@%QgX)_vB|a#WX)xedWw zjE<7&AX$mW_DzB9KDW=TFdvTKju~(I6rex@MUv`B?^+mF|&L zde0(yTmR+bvSNp6kpK;}P3}J@jT#Kk>Mr-blsI_9u=CwK+3|^z^S0m(c8<}KzUKx z2t$x~T>A_1!Cr1hKQ{X)37BI(9t+xiY}=%K{e_fLVhDMZ4P-4ka%b#Z==s9-`PpjY zF%Dqns1S1aQDeo|Re8(#HjLL=^gdsFSFB)@@FIygW{l_<1nCB+PcAX#{(^KA{M3n4z4r>N` z&0dnk8{6svD#mt*u-oC;uY-a%qc^hF$wx0#^YbW8=!mtN3@Xz$>CpFQO>^eLSejn< zaTI;C=ib$8G4*KeEWE#ri``@YQAshShmA-hx8R!-)HH+A)bmU(+JM@tgX%%u8+8BN zVG+V*+NxnXcb(Nd!iqFV$V=W1vr9YRR}Co%Ne4y$n0hIRtJiDTb+e7u%Zr&z08nAX_6k2te&+KE<+=0~+S>eg?~AP@n2)K25oIRE^| zID6Hb+gv`%-sh(@RuH7B+1Hv2JQq@0IQo};XmybyYLym}cGrvJ*uBY%897X)FiPgK z$UC|##Z3HO1{xdsna_l@Q^>55yKs?b?*3EDN8i|UC!{Z4XJ#QfsNhsP%Az4(pS0q8 zi|>?t%F$U%QrK)urgnLgWTxv%CcHIc5n9RRlTMI)AjBthU3Rn3lK_64Fsv1M_7F93 zEO#h(fml9X%GW<;>9`j+@h;eE;px3i%bJI^{w_B8zm`lCaWKM{zC)~dj;xG>TNGnX z=s))|X{kkUKZCREdLKq!ATOVIa2)p{j`994#=bMEsdn8qpn!Cd-ch=M^j;z&T|hv3 ziGm6Uf;8!gf`If6QlrwPM7l`t0@AB=LhlJRKoaiq+xy(R&pChY55~w?BWq>xmZ!{j zKJ#gRvz4%wA!(KOGQcO*LaE<(-EME4;4JI?JGpBu{Erxjy;qkF#xwr;^@}zDe0u{Y z#~S*rz{2nh=+u0=H}g9gHrczqH}6O%BIy%?iFrb=XFX4fUH@t0X*GhC%w6yefi%Pd z8Zbzw!W`oo?igZier(dPr@0QLZd&oTqKx`7;HglxKr_II$L@@6({8npRQrm!aNCu@ 
zioPT|-N*C1w=fmoYOLojGHyK|!?VIqvo1BW0fi;&0PqI<)H`+vN7HbEc$r3R$5X!B zq|dybJXX7LO#1h^ihxzGFEGB0D*@V^f6H5ZqK z5X)CTgKCTI@ULrDwWSQAUA5^Xa)MOLDP9tm5vQovkEqd;Ma|!P4p1%5Y5&<#qVJ)9 zchq&>;->Kv>dJxXYp02nXsTf9s z{7-g;n{z+!AU+t-b2{x_zyqXZnx$}6&C-F7Ft?ZEsGS$7S85n8Lg#9yJb~RKaCxo$ zn=D9{C_XFXOha)h2iPq&uxJ%YOs+Y~pi|P(?Q0EVs>E#9YMS{p&#w;m>oI-(@2IXh zC?|W&l!w<@L3woea%){HCEw(FBFIC7RPF>Z0Gp9LoyTOOR*;i5OPoYo7cOsExHV}~x0(4eEI4=) zKBXkkKVfJEIYcK761G3RYnoX|IlN=Xt*<2#x53+*FcY-m@FhPVF%nFB9Q6FU!}a@VQm`x}tF< zjG$8D#?WKuQv<%nT~~-P)oAhfb_delP{ru=xNA|RV{f@oX3}$`RkPkJP)%J=+|9u2 zD59{2WoN~JCIgdZhJ?(|0~ESzjnQj{@l}DJ3F75@2gP7=u>aw5{ih!l)Hr&!Ps$6~ z7muY0$Ib0VN(iB5L@@%0o*%QGe>M~2G^#T>F?+4jdz{p;rMyGEAWC=_CwUK1^#_v1 zQ9x46Cx@KXPnUsxkmuJfrL*zv&Mw=j3h`tU^n#4SW01elct4E(=lW!kA?+A)Sr zL$@d|!|AziqK2L66O+eQAc7nT133=HD-bme=JE=p@?UOhXC^D4;pI-NxA|N!j?Wm`OqY>rAi`b*M5VL zM-zEgz6WJAOEO1VZ^9lQDtzjHdP`<+5UIt4m?SHGNtRFQG5+Ao#~5|rP&%`ASZ&-M z=gPsmfn{AVVb^nrHDjDon!8lKdc{Y9`|S6JK6%FbQ`l#1gmvhYPRP0Tm6<%aXQZ9ChnMlKTS0WZ*6du6jGR=&fsIptNq>`aM3;*7F{h+AyZq9;PC?M^2`?!&%f0LQ zCUw(mTToq_h&}`fqs745GAK79u*NWk%>1p*=Q)0{^mNd?r1qlgiT6Q?L{J;7F<{*i zta(>29NH-f`Rg8i+h!?1LMoouMwgGw>oyyBu?xBI5tjK}Ry$bHlEG-f=0%9s%(&U1 z)0B)QJ&u;wgw*FRQyIFbk(&b-Q^mfVcrq{=tX&@~%D!34f=gX)+Uu~GTROS@-gi)u zGDIb}=-8p-$A|J#JPbu;zMh=94IMSKRYf_kz~12%d|zYLmomfo{-i24Xpi#_ltz=9 zB&c#z5X|)HWU;XYlf>cwfY}0_P?D_~Dm6>qHJ7f1ML{VG+mZrWo%$QE!VG_;uw_L% z#Gb5;wd_Z2X21kf2+by_FneMVZC6oMd1t{Xx5d#u-! zH2sM7P7&mj0H^q%s{DrsqA@Y6QyVd_nTzc>eB`?x>rq@bb}es#D(o@bw6DT>2Hr)$L}WHhK@< zF%XtT$-BPB&L?FnU@r zo@^G^=OdwTR|h>5$I0Y{c)v(_fb}kFzqb*SxG*^TcBtA$H#?%?JBZPK>Q&bBC6lpS z!Cno;N?u<2xBfVK3=E8=Gg}$2l1i}HjQ^s zqE%YMkP+aV9P;KvFS)%3v2d$I{2T4IA?d}aKB32PN~iKg_RkCtJI^xi>Ie<2Dg4?* z1cO}=Gr65Wt(>$nb-N14N@D`C9ME5X&tBqO%=OPr6AEIRltcYDbWc4q&d-(%_LkAI zot(7Coa9(dg}|A6H6E#M!jJtV)OZ$o^qFMl{6f`SN!EDKK)ry4wA1KmzkRIlwmY1A z4(AU@dl_-0tVz9{vlL91#%!R0{D1%m2Z5~lqm(Fx(}hWGdOpP3@&SLa#mg6U3Aqcq zyoU7^Ss>8Xknx}3)+e1Z#mDkH$Y*-Cn!7>wv=AKz7~i=6a6L90?*U@O9Za3W+ZZI6 zdh;UtGg#q%)l45Wk@-K$qUVV##^gi>i)@Q*wD;XDYe2xCc{QA~arVic;35kt1y_*Oi&~ZA5Elkb)LrVx&T$nR= z;gA>NIx(}6R*o3x_Q$(wY{8OaaN7)dZ|#wk7nR~UeBv1eXAWBv_w>wq5w|es=!Oo! 
zUJH~F*Qr?0LDiCUxp zzlO}1Aw@gOsM|C23etVOb$^ZQ`yNi6PEG5 zFg|HocVR=5&2s9OuI68x12AWo4}H0ph~G2hA|<9~@=upv6y9&uleaDHL?wj08&o|e zh@(F@0RStEfI(08+kH*cn7Qw(DgJ&fk<|73O(9LKZ+hnL0JKAtAMdN343>MltB9CF zy%{TbgOu44U(BUHbkyIbeITHGW_z=cPpE|4TjP{({+e8`KWRa%RG$rL4I1&(?2QCP226W|9DQ@sPiDP^VT=g4dy{HIYi)`5ZO2{ z=?`5mC(>eG-tpAPS6PwReOxun=?WA9zt(8>+1IVru1~<5oS`8@_ant9!;uAH^!LY~ z@A*6Mm)X*iC`>44$=<1X@i(k`xQR#wKc|jv7Qrl|%2&E_q;y}5zm}HY^|8{h)6_yn zs=qWEIC}YjN=%G|-FYj>5`B!jcZip!m>ljEAo(JY{|=nuVEdtX+-vRU-E4J%V(tf# zSR)kU-U)gZf(ifZq>!;?P`I5DQ&#;wKUaWeNFvw8P4W$w?DeD}>#6O+9s`b7a0)05 z2Huk)EItR>VkpqM_0$q9o^Gfc%9UdDhl?i5yZpDQiD~qT{@cb{@%*R{$km@5Rq9Yi zSVz|t$dUS=c#sN$R|eui$SF`#3=aaL6wulgXk}s%Uy2NDK7wBwrKMly{(s#w4l9v^ zi#CF51pVMYrOZ~Rc6Tmau{M}=bW%t2HMC8^r;C^0kJNA1)+Ne6DHXj_S&ktj8q<01;2TLQ(})rOL-9}Y!|_Q4C9;)yPQcCrh>dCdJfs@^<0w3e zoAe5#)SdmJ3TVIVigTW5qz2pp2c{<1T9Y&8I2+t)8W46GLzz9OxaT#`sh=K)ElY{K*}s=5!CN z-Q~aW9=uebQ28ho=(ME7eSJida*E|q4DB6&UPjs<20oo|E-0k3`&ym|layP3k~xR- z{Q3g{3RDA1;HfERF;%S>S^fpt?%k_$zH^{Q9=AKVqoYee{v01z_MAJi*lhA@6qEF9 z03hG!uigSc?)dxX#9RPK@c%6u`PTn~>TB+ny2^c?ueo32Uj97{Fx+iT1Xy6o|D2a- zR{iZ#25^J^s)7J~Ac3OI+&}Mm0b)&n*niz(tO_#Pr>h593d)k%R%f{YB_aP?YI1oD zUHC*ZUzV>rayYMdvk2%d{cVj?T$!F2paXb>|Mw4aUU|dTslZEQ#;ztC0kZK==?o1= zT}TC_+3D&5s#+w%(FDMWb7zQxvjnTxa&wU1w*|hw|G(!I|8ai7c{Pa}|EKYE0728ZY8mV$rWv_rm;v4EwaL1`v2^4cP^D^jmWYs|Xwmmki&V=J9j`XHeaIKtn#UlU%;SPb4E6p8rb~ z2hch(qK$m?$bak`Wp|pV|2TQFA0TX_CEuk58o2gjt*}-mMlokg$V)O=4zn9w51 z7zoLj@3x%Y2>FLIlUW5T@!B9`A&*e=3d5DMqh-A!V;+|fxoC)mOHQ#BXGgK`C z-Ubvi`r&e1R>zUY>vK9o%ZHWG=15eP#HOJG(6;!x*`VVhpI#8r@}GCZ$ev4ZU)49& zdc|UXW#^3;AIhToV}wtRHSLZq(Km7Dv)%zc-ePe{6_m@xxZ=lR?QYE`Ny48l=9!t3 zdC`e}1y7AvdJK(_yVe|eE7Hd#?UnjvH!fg+P$NO?eZ=QxI?9=G9qOkr^co^SAnx*E zkOAjk?Z!=p%&ugfE0C9I&Ot_1^+{JLzr0QEo$9{OR;3JF|DVf>f9g(%-hIv*UE*+D zX|kTkwU^LZz0pN}D>t9Flh8avoaGk}lVTXuc>KwWia7fO)6Z$&971|8Mj{NeS;eB9 z<~g&$oJ!Xhn9w5KWzJ+#x`)j7uaQ=m_=4P%AaZ?@Eo?5#7gO6CT?W^!F#YXmwZ%7` zE2{cjS2m31o>K+42-lRb#z^ez>L!c-OqAj1m;$g8-*Dj+vO!%31afHHg928Umo)Ea zNBD!|L?OP#|1-@>miKzW+D}i2+@8E+5U)nr^9TN;Pb!H(_GRi84^GD;boQ5lR@t`* z3wz^(KSK}TXH`2pkDo+j9nHX1b6xw*#=T}UtpFjKIg7p|)jQ(UY#%-Hwd9$UZ)`zE z3bC|jr?pj>v86Ut98zRPrmmoi!t}83_mKO0p4d^PQXiW4!9wTtns+@4GxnR=F=w4{ zsz4iiWDEJcbpBrXLH2{BFedKx>mJnk230@gJWC`mDkh`?nrO|^bej0YsmQ*A1Ktcu z4xfA@O0JmJ?kV$x5UR*xoKL@FoQdlS6}pf?($7vs3P1p^3K1mY+kZ-OF>&^|Lml%7%?J4}&|`G>vuSne8!VKC<=$wYemg`o}0@ zCoS8rbs@tpaeyxEG~Gz-@fhVWiVFI~NYevq-QBs$JOyxQ0NX~1{m76)- z)d2xT( zkZJ`4!Tt5dnIFzB0rk)qedo_%vbl~tLJVyOv(rYklmU!DfE8jZgi%JaW{Y;38n8EL zwKK;Kvy;nAwO85S_6)IA442ay^Lv)E_rrfG^X}_cai8@J7SxCp$85q?4Oq2*VZTWb z33MOcwVJ0enr*gpe^y=RzV(2DE%NQ$wQBEtBetKPjrSk${F>7xn;EMqQ`3&OcJh(1 zQuKD5&ZY3Tr%$N)*g+*oug$Jt8sGT@LVownhmlk2j+U9bJCCxZu0S-u#~yZYI5Q^d zYwa66gFqai3aG^hu2^T*Rv!U5I|3y@l=~1AwA=7>v@&oX3Xx3J|NJyVmwNuuLD`zl_UK+WFPAspV`)3mH2g}p{@eX+Vrt7G^OX?&TU?KlQjZJe zJxZdIe^4!ViFZcc-!|~in<^$uOAdW3ISKFqIw5pe*)Lz_-&X13ev9?!U+Q_7bhitc zdKtEjXP(*l-$(!)f$y|R9{PMr5Xs59Y0DY2)|e#~F)TVTvK zb2+j#oVyDfGEnE04NB3_*F8%9xL1Vc;xD#efj_a&?`#O!1At=136qPIVE2gHhr1ry zQY&}&W;s-`#a|yNpqwP<(6)h3iW|!GKdQ?f#QiFebcl4~m-_s&memd*DH|2(n^eS( z|4`H=8uaZcY~uSpT~bY`56zC7Z0cDmY^DAPX2t5dEgz7XYYV2l^RuRlp<8_}&#I|p zmJjV;tdxxb5*-{=kzGZ$X6z2vfM;_$OV$4q8qO3{X9q1<;Jg!#_-F4`tWnt7#Z&d(=GuF<0~s=l#=by;KPuoBBe5p_Q$A@DTD#^1N!*NJFM{z4GaDAl z?cZ~;XDb`sy71U$a5K7VFy=*WX5nIGpG=Iwui}viaL2_|6m4V*a}NF6B#@Ui%rffz zSv|ru%c~+ePoIj;2t7qlATpb;B;IPeqD*NOL8W%6ET{s}Q9aCt`^6}&soJEwm64Kv zvcKM?*(FFlk@lL*hrQvA@~f0^cY)Sox`+EF5Uv2@`%gMkx*E`SGa_*Bb`tcQNes5+XcBZ{%oia7Vn4mxGXK^x4~uXS_?ksh zMtbAEr1mI0+VH(WGO?in5NpLgW;W6sAAR~5YMY#O-&3q0soN>J@i;Gjj3e9AHUYJ} 
zhn8-Djq&4CHUx-w-WF(|gEi4c9Ur(+jhBm?4c;c`&J1u8!|0pQFslfvIXB!ac ztX9q;?Af=lfnOja!*U3j+TAb(pyVS2c;dP`575y}pfyj)>zhsoySgM_7M0T+&=l&g zOq=?E3mwUoGW#&xCzb$`fi+9jQ3p!P^vGhd+Dk=^h;{T)gL8F6jjckwS$ zrl=BbiG9&>xcoyNvq%{zjCu~IY39MWwGl+dxZ=tE*gSe^e;io{d~*ed!VD~fLQ_+o zz0TkIURINzrA;BuHIvZ85~^-E4FVB1_XB!t`q5&?QbAXs@{N$-{JnzhC*y~&6;A^& z^m%{E;SMO^kVEWSC~hBqeXteguWgv!T&m|Q+69{iFZINg@zta9i&1YYs?R;9qi3q# z-m;>tbHjNG1P$q2x}4W^_)!QxMk{y;RdktY!&W6;>kvL;N-M#HY!MS z*U+N(wOGo%>%?}W1RVmbtpwv722ATsx~GkpEz#D4(8>kJL6VO(8_ptf=*xn;kJ6Zw z1M|10HlF|LcHZ^~5<>0Z$wXF2y@z6Ip$yu(R=#d}p-PRYI>v_s-1ZZ>;@6JpH=o`X zekr1lC^={Q5msbaO`Qg%4ZMd=>uLEIk1kHBzxL^U zR@T%$V+ZH-(GsM?Muy3%V-n9j-cBHWT5@Y%DsHaSh&WPwlU}vQls^Majp3($CnDt* znt_O0|Hyhk1CbKslV_;4e^sPhC72|)^M0#mvz*8+=jE8Ujew&1TW*36k-=Fhvq~iv z7*3RDCxe@P35HuEhu@|Jy+WuEcWOf?0r%#bS}WZ1yks;Flpfi@WTJ|6W{4WD!3wpy zO{+|~NbTP8E(pgni3``yC)Uqif@7T(Ek=$HZaOi}Fd)e0%(um~-IQMUH5wIb8lPN` zJbGhMrXy*jS2MwG9xcqzYGsTHIpx7@z)*F_Y28%&Lr zM2?-+WJ?SJ%csem?3d=c^8_+l>PfDpvX5NlxAH;iFXsSINF)z5V2*2roNU1W8T~tm zp@>GJz!q{5750Jm#sNDuk|p?AF?&X*D8zRNgL3b zmsag8i|s)VT!6YITywAX>dag%uL(!Oz!9gP^YLg)rs$tKqzjw`FXbE1l_tUD!T_s$ z1wyIxx?k4oQeQYEaB&m~T|BxjMp4Lhcmn}PD)Cn#25VDWR0?xbC)W!dhAqznPTMfW zKm~T;(%YIK2icSoXZy>0JtP!7^+xPHZ|pxRUc)Ul*PKv7Sz&DtRPMssU!Y%66FQB# zjRmsPM6|<1Sl8d9p0&G^({Jh=Th^=BT-a+%i3tioBFHd9T`L~u zz7M*|BiOv0%FFHRA|+2K4gerh`Z#VEK=AmP3xUpaD!|dH`qVH3CxF^n16ETBYf!Y% zv9{eTtgCDk823q?caHFnN-!5EdsI!}xwh2mcF{}AvQ%A9_D!C3bae@a0U9(52+7`fieTnSE~o8ew?`XyH}uNdFPEE zp16(GA!8{#3Mb0Q$BLGEQruzug?hGHI&m^{J< z?Y8^=QrWk?Yj+(LQ=4*ou=jY_W$i&goal{3O;e7EU(hz#;p?BAxFDcy@s`@uACmW7 zp7s^e_8&8&GbT=?Iz~eBS0uZ^;b@BfCev(F#kW~(02h#c0MKQs)!rOoq&zbp`-vJvr* z@9wMcV<0*oORWEF@o8&qUmt7kQh~q8@@G3k0G=z@$zo1(iIeFSMV_ikA%%`#o8xDS z>AhBOgpU}!rd6Z#Yz!L*CaV{}M>BnKc?ulAB|$O3+0N4d1P(N{T)12F#j%+WgIn4o z@01bSw)CF=9VOk==`3tSa6S4OH{p$6BtR;UD}b#oHgmzIId*`a=+c;!OHNpCFNPK0 z44}}Xe$K)vv*acS@UKRH+%dSkaQ}Ks6}(kDJ{PMz{0=-nwq$(%Ed)XhodDP&koQt> z!tIt@&2*MBaXw!kPMN*EvB%pu+55ZveXn@6bK24#7fm%A3r-7>4SO}jxoHc+&?9?K zA|~ZeO31VC4T`R0AP@N>9>iX%-dl()XbdEls4U^IkS$(}_@dV~&lv-$r0&!+TR!WF zqa>FVdBy{XTHIFIx@}5QD?XeB_?{E&9CQHB*ogM&fn0ZDcvIabZDmy) z9M?cp8tiAD(|U))`9iZC4|L-$wOr`oGg{<60R!ORQUT;p+?N{P7twV4J^6CR>&= ze1Wo42oiBgL5<^&{rN&(xboh;Mbx@Be-vQ9EIF+ZyIUyMmX1AgU>%@fXlZ_|WMM1h zg8wHYo*WQ7Ksx^^LS&j}z<=Y9#=KUhOJ2NiDEQM>=rpY9?BvYSMIkPrwGsm|d1%sV z@)!hqLj?7LwdsOaKlvKtJ4miT+N+3B+R+wSFwhD6@R5?x73dj&QRN+)xf+)9aTG8u zE^8ekFnk8sXj`Q#5JHy|$1DPE%&9pi1x`2tyO22PK``SDRo~a|;|Gc4Nxe~U_ zJ_?Y_C#i7^LbF2~0_|x+b;q%6KLz{DG&8S29SVU&K(D5NFR-O%YCK<^l03?7SvKUv zvF3mb1J)k(7ElxRk(qQ?mmp!aa{d-9K}y1|%AN18qQ@+KbgRs;L|$^@aon>LY&-T* z)E*~&qZvpsaNyOsN5IRqZs!Z1Uc|Z((r^}S-=()zw#bQ1_rKs^#uRSG$x6-e)H3FX zV;IqT_v|_C(9|EKiA{~O3$&O@ino5sGjg3icT=tMz^9({54d|`_g=IRise;M*WO?vmybr`LvJGVM)+qvp<|w z8`$``aQLa{7X0E-3c3#9ZMV)8R4u{CG3)IN8H*ihvNvr~m=9N)X+9)0(J85VZm&$% zSK=Y%<*o-Obx5a=E#Ken1|lx0phL}!n0{oCbEk@Y_})*^uvCz6?+dG_%-}Vj6t3hL zd)_o|ccB~!>fK;0FUGX?nl6NRv;8-$pL2J|O%#v$6F1s&6Z~9sMrLt|KiW;CQJq$I z8BXITzEJ(NI(3ND?Vh(`KhZ+lxQFVm-pADYXQ6>M%W-Uu-+klSS+ZD~Gd)za_vREf zer9=;Di*hq1QV14Hfr|n+#t4z;&5M+Tx2L(MHt%3NgC+U^BAQRw^XK{DJ(KPXVL7x zKA-)T(?DcG^Gq+O?(>$oBMU?Jm;38l)jN)_6XJ+c>28poahcZ8BPYAYi|LvBw%EVt zGSwpE1`aTS4%j4WGT1aweX^yLvbu$1UaIi!Ied9U*AVC3pl@{JQPy!&%k_E@G=anU*GXh|i4u zg*c*Z=-uK?t7V_A19Hqn-r-&;g63WvPGRU%;Yu_2TkbhBeA>==E3OdF26X){$1EdZ zl&Xv=&qUgqDnZYn13s7ICa2L!oD+HmR1-$gsKYZ&8qbD<>!QJ3R@2|lTz{Hm(hQ|>@c!T)mt^V=dOS6@1p9-MD4Dc^>SBk4XfEd- zgg=DlzLv)BL#M7lhaR@_fL$(XInKF@k9;F^K85=ZTV2HhnpO~Z*p`yXKnLlilRva* zOa{+#0yYI&o*ul15!WmR+mcRtBd%|rO`ZL;sjeE5PL|N+iB1>(-KWhzKJ^{ep%@78 z&%Orz%h=udic4?30)@daKp_1zzyJp_`>_YnIdnJAh|Fu+XxM%`TF7DYWfkbA(GtSK 
zkpH@LbAe=}6j7RI@ia$>$y6&(Eye3;Jq0x|C#z=P#xKoA(<4{2wRJ5NGrI*{K5pg% z<=0#9qlcrOyCOxUOyO?Rur$M0tSXBNbw4!meXAOA)}uaGFcd7$RDNs*Smh6 zTC6H;wRu2X2VXu7D_XhU^^8IUBlW5jd<~Kasdd{O`w~b!ag3nxy`2Z4+8N-1Ii&oK zarWdKsIbO%%c!NyS-dyxRWjSphXVerO|zNLktNC3XBnFICe1P|TPp3WU50{7I7}|r z#equol}zU)Yb`Bvp2jvF}z-q%U$I$-uVLGa%Zg0hZh|G%->n)$@kbtPi+*`%#tsA5TtXP|1m1 z{m~$#%pk4*P^$l6H)Uue;IsvZ_7c%UxF8^u%tM=PFYe+{qDzwvDBU)N!J4IPjtl&m zZf&aHdS#e4PeKxJdb&1G&U%X=*!c={E-wl!@Tqagg#kTu8b1I&ysrdA`Ttr0h|KZi zH_3iBwb)o;I^!}$#P-_qr2>|?2b=U?;R=0zULYAp{L8lK%wWt?5KG@hpL!}OvLf_( zi(a$3@p!6B8twEE{C3t#%53<+W&H2!m(L^S>hN1W-qnHO9<71 zC%X;g7=-Xs65vdFcn#qY4qH%pv@`$?JyMaIQW7k|!1RYt0x)PNB&$Yab$rlN3|}G% zGybrbl~7qMAQCr+i*A;MwJ%`|iBb{n>zKF>R$)XPTVuuh%czjK^N%;|{C`&L3$0nf zg2Puxc4?Pc_wFnXpczBQjG8KEBo4bp9WEbUt6zv@zQ4VhTaSnWzch#*15{HH1(*yU zw_WFm%>KV)_HtaWEym%Jh8he8~-uck&pPt{hsl#C> z{ZD`aC@a~h6g?}#UV-@h{o+uNlkt-l3||j6A?_xSg|*4z6RpfPgP^{!HW~10v~M`R zo3!OPai+NvbG{-ERYoC?O8;XVLx+Uck> zoXiqV;>>E0CK5kTBKwTDLh&uI&?Um>juEv*h);zvWu%@O`WpygC7#j|6nNL>QZY~+ zLsN>G!z{nKo{u^#U>`jew|IfCrtB{#G9b&U1Aff}<@p4_{+%_PPr{z5*?}cUK0nw7dW>!6Aok48Ty* zHsPHi46rs0;WDqqNGRP{=^b;CvOwkQh8nqK3HiUbzma^tM%X+8{9{ly=#b36Z?QQq zzdhcjk-L2b`n}OM_PA57{NrVCd_1uM$k@YqgL9?_sfu}l66*BtYHT+BQq+aZe5u;u zclPwl3zcGybCS!NL$9aXdJhzQ%y#X+3c)i-E5(2j$KR8m?1DEmoA`jM=E} zhYZV_ZjuBQ+uilxC>T|ZTL|pd35;;zx9y5%I)!nO;JU_kW!IgHTcrwK^6VuGEze-# zFjN-ScdW*$sBYTVh<|*QC$QP$nMilg!%{Zb59dt?Mo_i`!3P9(ob)cpb65K6%AcWk2cEPB%{^FwmTk}c19g6DW7@0TJAG;Q2)w2knnH@=O~W28h2Peg3?U} z%l_Fft>d|4uzag6J34)*TFCwjqEtzW1k0UahVuuu{nsw@N$(h;j-Pt#O=WZWIq<^H zqamqqz&ghK?z{jAwTIoScJS?;JLY{rCF6*HwhRaO$P|I@>$AUR-nJ}9omv;K3zcY- zeLv=sh}V1|r(|~Au;g}2G`f*~`3l5%=NqM)K~2+46XPC@@F(Wu`AEDo&}%DFls!-f zEgM-q4ey((vFiW9J$tlQpPkKk4J^Y%;Xifkjf(2<;h+X0U25_L4g31h<4hsHl%El~ zp%32y1(`<>wax^6x>Y>)@4jY1-io81y*0pb&6dNaCIHjI3xxuqUdw7r@89@h2uF1n zM}%tvug-c8U0i+#0HzO!AvX+vb%iQ=-*3Y^R^Zrb@}(W&xA2dN+DjmXjOZp|S=de?u&!K2Y@_wumJmxPQty%X<1LZACT`yiAMAZ{wgG zUQ;nNejoIv-uO+&Wk|wJ<;tGa){C7}E?g>XnWnN9(}gA;Vp-b_sBX|U%ar5c+?nG@ z(kfizYz>E}f3#|Q97`5gSAtefJHbRYY1%>uSw)5%aEY*&QKIj2= zVhudzgp)lGh7O(I&0ijMdoj8fpH=#1x+0u>lX)WRvH~$4_Xl!nfpJO7t5U1t?~tB20;7xT@p@yze$td#azIa6C|VDC>12!TVXpxQ6wnpB7& znc`Qg&z=XL&pdu|M)vzx(0C;77A_v_j39MxrX7Vc20mzx(wM6L__eYs=j|V&U78}T z@K4Yu=}(C6_G~PbYu{Cwm5k903I9IRce>}cz488o`|06nCfv@qWJ$IS&NAI=j|3At z`H%~=C5R-FPT~H1B?Z8@R&%otSUy~|k`T%bc?H4>Bgl5rB%DTO4Ja(VJ6|2J4iYAv zzU%O~yGkUOlWxDb+XQsaYSw3(sfY?rs!~pW9O6{-jStZd6r)10S=>OV#Mgh`3;C%S z0|iX%11gv!U_ZZ`z>gA}Yyl9$_gpgWFP4mB#eD!!t3QpB0IpF(jT267Zbrm`pP6!> zI-rJdZvk&tX?cMWN63l;Hx&amR}w+i;=}n+pNa@73BYc#P(F+SzIE>klz+=?w@4n* z4J3zd*yJHn;1zDS9Dmi>Tiyh!uSC-a(qW`V!*eq&s!oU&d%v9f@y0m5KD_Q|0l{SfMD*dkQd+qA>s` zfmf{!@FvV-0Fd)V=mOS@aENpp(nCmw3u2!87Gfo3cX0qe^il(u7cJoB8r4n{F(fFf z4lrS(0|E#={cYhzf6S%gt<_v2cwCt^)}BgG2G!L!Id2Tj)I{L20|Sz-Kw>~HJDl3c z#_r!d$(T9s9W?ZyM&M9cfin_ZX%<#Z&4?dvra>J<3g4cP>hEcwwX){&&HLjZS-z&k zCK0UM1qIeYha_}p@z|i+4SPKfD}**a6_CRf2GRnPB?>zZL$-X$UO3rBOTS7$&@)WJ zOXNI$x_iw!7q0^#+ig@xTgGpd+o-~j;{hS#I*L2>b+r#ut`(ckv_5~KO~M|*PU$PV zhX(A`1Yob`|JtjdgoXv6e6cALTxAQWJvU9mwjP*8r41T&j(PY@KpkB91dUP}I5QT% z7lvSkR$ws)xxO9RHIt>pDc%CX`@9*V;~g@d-}N?C=O}hSv#?2JcG#gRD-yf%h1%`Z2e3(4L)j zT!E?}t%B9;GER-xa(j7FhpQwlZwu=XYV&yL0r}&9pOQ%>=_}Bk^%MX#iid4hSeYDo zc1<*#EiMdc8->;?0sn0a8I(0px&B8Z3LI7{r5IIB)6bhnO8Zmrmr<(&^9H-5CFf10 z*Li_*DASsiGB5b|&Dv3Gg<^uNqH}v|bp3Rw_1T>UGAyw5f)g<9Hr&fH;mhPskKB;# zx`E%Q7wak??B@CVA!vi{mx+1V-f4Ws-@sQrq2Q|fN?YF_KbgzdzbmI zJ*}yUPNRw7bdeb@-Ykj#rkG>ElY#TWkY5sG6;S$d&ZU7*s^Yphcv4KYZ+}XysIil< zH-A4X&d!|I2VXW^Tmk%*+q>6+hUnCzu8a@5!YXtXPG2b`h$i~|4C@L>*J(EIbLT@> zriRhP_IEwZU)}2!d410D)+WrzcGACU&}!+dU9;wr+?QVsFi1zgw`ym>2XIk(bod_W 
zB@u?Ti-mphBb4gTz4u6==S~dWGIs`#V;?=UP9RTcr*C%%v-P1Ku@u@6`xE-gX#GR_ z){~C*R|A~C`8&WLp?5K&XtIuO1)6{$HSJG%rw7Tjn%PxWeG1EAs@I>h(Hw!CFDy{u z5yveo7&9$2H1cxSJJZ~CnZHOoUW{ZU?2V?~4MbbZY-XD&lmhovx^Qg8yzB(?bb?Tz zm0C^^uz>G%S?kYU;dp_$2*l**!A||_#`ru2miKh_I~|fLYUdD3N=TA1`r|oOed+J0 zE_LU^75t;a!!5ZdDSWT+i=2nAEjMsM|7@%E!d?DlPIBL=m5|K^(NPia5`ugi9j?Yu z{;1z(AB9}rXx!%~Vc$7MoB{`dNw22BJNv-W)VbB7+_IDnKYU{Zn`SuzP=v5lGrr30 zGC#bQZ+_q-S7@~jbN~(#z2o=-6Y%x!_crnAE?Ou>WFC~bMF#9LG!59Z7K%m*s32BM zTln|~f2@d_fsd0BjiO7eV|MSGhQ3@O+J3?tph#RYBJ2tT0Nx*27{i*cFZ#-Cd*dh3 z*0pHMR(aUH8Q+c62a0z;ki!qeIBHg3Fac~0^AGlX`zN+G`oCY0K?iE%!oW*8aGDwj zRiGmWA?=0^`bA?OT6UgMCP4}Pt2w5|#-K>)kQnT-gQ7mAMm24GggNe!8-_6?PN@d( zpDwqaKW5@!It*G(aNjwfI#}~^SF)>6cx#4zR9TBlZ30Nuh-~G5Bn`j*Umm&uK_1H^ zmLU4T-G784Khcf1zM#XBi3uKkW>yCSYLN276kl`HWr;Hq2?X-n0O3%$1VObi3SbLu z+?R-i0?jEr|KpqR_ZmcOoB!tdcwG}*$!DVbu!KzVAs9CX=;ryqT7&3*XGl1%K2?c) z`ImJr#tBelqBG=~sK5S|ZnQ219Tyl!FcnNDSOOA=^!NWu4Tzp=%Wym-|IsNEaP?z2 zAhr1SF+cWIGOI_Qk0jlUQfOMqeI;K+lE-LY zir{V*Sv=b~*_?65N{yO&0nm9N*>#BNYdkQ(d4Gqu!YP#QlS*}U_a}h8O(B@*MR)N) zQv2O>^NE3>z>J6BWy}i!nQO(WQUl%ZdZal9-b>TXA+u+t|DP7;f9i6MDvlN37_erd z0J^9UL|S?zf^y})Mt)#q{EW5qxR&}yJK!1&)~2>&WaD5z10j~=yG6DWM0KO^Yz(PYmk zw&o@>i)lS2!Z9PEdF#T+wfF`_;g*)>VLjku?gKk41#)>ENJMfA5Cgp}lhpapcvzdE zdC#SeW~^u4EO4T?RdV#oOz}`t=S-lCh%<6Upy&*)qtvmb_A)v#GVZrZvS0XUN^5{u zfl)|9hz_y7;Ca#=lXZnNYHC@mHqsbBbcndgc^BD~*0HOf96pRVEGnw5CsCER!PhKZ zWK!4l+Kza0Q+(gE8s{Bd0`h~0gxe&>zBWRiPcVDy-#&bE<8R&J%!ww7^Zw5RFv zM+q5yL6eMV*S4Bx`6yNTeaH=Otz5K7<)bCy8F{(wep`k}PVNGs&nW!Y8NCaeVy>QG zFd?P)656(}(YhgPdQMhZmZNYdC1-?imn-*ausjm+j*!7s8=;J`N6H22E$b@gxnC>}WDA^0>sov&_ezJ8K(_9^QlZS1}==D%a z)|47qB9S<&^OM!Rvm%+bqBm7*@na&_l7;m(-FF~2@xxaj%BR&09DJjn?dhUwf`i?^ z1EPJgC#O@S94AT$x3Z_0aP=xQLwZ2|d~_aKLa_TS1Jkb||MR&65tA)tY0cu~m29%& z!Y}6p0t6M&0Z*I{PmoQVNX-z=b`|c_*2Iy--Ej6N^4HGIGIeHS70Ix=v=6)rMm2ZL zExslE{1fAu5nJ%W=1Utgz50oE;WsIFz&ZeJWJ~Ab>&^S#Y8AS0R8P8~xAa#NH9B)% z`L=neu@MYYk$(HO_R8C{!{?isP^KPrKG*>qw;+I_Mqe{PzfJR6@BdA>y`ZAjT`xDS zW!rcJ83FK^#@9dw8jYu@z{D*T=hh#s#JDcSBx>7d$7|ny0hISOgn9TPR-0Z3s9r1F zfjHxTwbYn>Y`%s`ww%-D?S2paoj9A$`qU}<73Aqsd`aY}tp4Uw*T_a(Iu1lC8Il#c zscF!iPh}Cu=70K4L8AfqG3G5AOhfxXD;u>^FbV*+&{_<;^*H zPDk}2U59?r;R1}aR{frb*=`_qG7k0uGnLzjwBROF$l^$04SdJBd~#r}lbfV4 zs&{MuERWx9zUZ7w4iHRy8&m=d17w}6!wut+Fc70PV~IYWrqKUvp0{2`hWNT ziBc$K$!>%al4Q>^N%olRyGcUG64{xFLdg_&G-F=R<$7M%{qeX~YCzLC#?>1fbgzVuWqOG{{@zKr>P1Sg>)&Bs zMr_Bsa2b&Vx!l^+h38fm+jP8Sh3)oo`UqXkWF>5UR?M9pwvW+Kd{pC8KQ99WLA}OE z+C;p&{fjy^IShR}aq0e*EcvrrBH06VjwP#%Mc*cWoq7_qa|ctYedmMGr-jF|LE5He zdwL}rvZqo7sXZD@T8-hHQtSHEu}96MJg4;{vVzuKf@&OtwPwkhS&vfD&e{_r<6p<0 zQZEe`l+o{_Xn)`*bGxgPMbUSi&8O$4LPhK!lmr#l$mP%sE%_OlZpA#B)GM>iamV2a z%CRho2D_fW0G#kA8yEIYkk4(AEY*iACw#2Cs@k_!Ehstn+;RhPy?@eknexb>_w5VM zzbHi8SIJT-t<6=5JXTSg_*J!C%kC+m{`qmjjOQN6)aH^U`}w%Ki0HQB$FnY-$3;`W zHd%iA58AL4Ck8!p{6FXaZYhwqj6r0GY$SXB&n?Za1%?U zIrG)hN&50?{nFO;m*ic_nBVnM_4vj)?b03bH^?%6lS|0?^B%;?bWi>>a*ZC&(LWI6 zj!AXic>Je>v@9G`)0pEGROW~1TX9#&6^k4|Gs-@bW?Ifn|Jq%v@L<2UIIvc#tFtvd zE?`J`B4}7RsPH!`-OyQOxOu@aqo_iW)dU|pCu%pkT=jIcs%cSu;k-MBu~+_;e0fPw zNBi*C@W&pyL2E|u^Z4o%)NU5_>Lzll%-k1cN}Oon@@6KV{l&%~X+WPR7!C6``o;OS zvGTh$vG?HD}~^FT1LEU z5`W*K*6Wk)bM~`Wty&U!ZC(A7cWtUpog)f$Dl5viiH|E)oR_xxT++KKao}8zzp2Et zdN@sEjM=z-FZ|GrIsST=%Pxn)F6L(t@AIxxO4c7y3%v)IN#DHCFRkS(`|DJ^q^3i+ z$!7eH?62i7HBm>pE8BwxhlNpsCWHH*r4^0aRO80%emzp=A&LAhKJ&@lZvEZ60MoRk z)fTGBuwM@_Y`~kCFqz1v>as z%U>VbTG(JGO_wF*9}c1RwX414kEV9kTuZtwqlwEEb_rJ-MR$2Lz*I5JS~y>}*qcL3 zVIacr@ws(E+^wqWqq6AF1vK45`O4G|i3L$*A{*Ps#_y`nM+?{9wbqdqvG~MobcUrS z?!sv=bDB^oZp4;|qARshhI+T0 zF;x{y)Po0)?>KD`zf|^LOPjq?$+sbW``coQU?t0HGEHOen|i*_On0$rmA*d}er+tq 
z-g)A|ypLL=nKdDTJOzM&?cOdAYo{jR2MzKP&Ra7+y)8W0e(7 zcKN+w3*G&V{#G6$TNQN|t~`(s%dT~)u}Gx+1Yn6}p4XfkDm53tq6ZzP9F(tzTCY549!(e>FSoqEhSm$)M&# zsdQyT$5}A6d{f#_jT$?q_QW9a36r@BOMW$MKE%T0?yas{e^J;b7bcyLbcrE-CDoDu zVerA4uP?OyjS}M;#5Gt&qFGLZnZA}g!!wosF`7%kU{>rR9%_=_Ocp2$2>584Wv4rV zvSd8tT$5lVpkxr6nd@6u365$xQVGnxG9o6g=f09Ju;M zs?~VT3kS{WC=7-}r0!i3#>HEOhj6-9MUlIZT64`@w!<~sXQg|rZ_6F&82&gDlT)`P zk-W1V!N^|3YcOR**0GuaU2f9@)kLnP&^fRrAY+RbDsBhQ{jSy28ySMO!W`sdhO;_; zJGPomHLv7uNxDis$oQ}Kc~{XtqVnm8=l)l%>05=j0>8U(;|#7nh{5#Y1>QN#Oa4W% z-+2f!MiJ!bj#i{B603Q79G8qdUJL*=0#&5?Yx}yI-y3(L&3$vJB7BPsO5GmiR+^vV z9n{=533wyarMH#C5IuBV94LMN5t4sx^w46^I{C&;FjDVPl^$&D`j`ms(p@1(idIo` zaQtF!_@8Rql$v*;?Z7Nd-&3dHnF%d^_W)NDfE-O~iM=jh*_+okt?UCvwiL_3g|#EM zgZ_S3sqN9Mw#0*idJEIyl7>BP=Z`t>$(IUETrG>=l)EqUWPXFh+`-&fO2VB zvG7<$!KAEJ4>2XyweOJr0Xnmad3jCD%Xy&e+ zsUJfzNW3$RczcNaG#cuY2f3RWp=PIgqlMy(rXocNljl($DcgnbJDz|$!&bjKyq!(7 z`;4N8f`hzKqQHO=F+hSC$;I*ARtHriRjzH z2LL?CBgP^x{u4Yf_5!(bBglH8a5`a}Tmwl%|1fEW;Ekc&`&|JVfyEMV=-99?+r!ES z5Ax+H;%O+nt?Da|*9n|NJ%Ge!VMXp&*QWIRk-S7o@QY6rd-mo2a<$mGdob_GRcl3X z3}wXI0)YptsZi{V^d;ANY*d>aug9~J&UQgv-BD_n4H*b9up^&a#@)xO&;7)_bXy#^ znLjJ7cJ4@ZGvAR<0V{JQHfSV&uUHsWytsyRg60v=Yld$!>rkp%qT)Ws7kSFls zh72XS|5QD|GD2dBhPO;tAT)cU-wxFY1`lbGi^)vf-y!xcTgv1$j&$-lG`4x;jx-wQ zsTa<;CQADB$kcP;zRHB`fuHovAX^)Q#P+o+ISi8d2MO9;1lxB08gyx&W;20adipy! zDQy<_KaRM5BbG&t#mJ2pI6YE`?0{@N6*!~68w4~g1mF>w>-%&Ez0b(qQ!+q>W1N0$ zb{YXEH}FHt?bD!Ze?MaFJNl3o4edB1Mp}Z1b(5x0&2ap6H)0Y!e53BeLF%h|#s_w(n= zpv8)^=sz+2X_^38tJm>oOZGi&tY07)uvYOG1%L8~(wpa0pCX<(n^(pF>ng1j{bV7j z7$)E373#3l3S~C>lKQ$M^A9OHn({jonBaY^8~fN~NC=vC4@s3|=9Xn#Ri7Ydc5~?{ zd(5{JbvBPm_)D4-^7VAtbQ3z{$432`fy*MJ22U`9m7jft_j!L!zu>u@&!WlR<~$}H z+~ZeW;#7E5H@RF7^QzH9Mo9jLZ}_GJs4I|gROE(3_%S7F8QI7qfum{eWpN}+V9VEC zzOK4H2~k&#HTtB*RTz>Nrl^WkNiVNuB0%v(DFdW zO?;pM->;}Q0{p@auRRSc&b(?g~tr3V2r{D@``Vz3A?xU5l8g#I@Tv z)?z0wp_pnh^}K;zxZmk6bG?_|O&UCWZ_(=cb@ImLzE|U(k7bjhBhJ9?n(mTledA}+ zn;2NlG;QV>&aBGl{R)*K#a&-~=zDZk)$HOi%Oh0;FX9intOfLyd`^EyyW{z6l#8|1 z9WxEbi@6FT0=)F)cEL1Ug%=2gc{jRm?F-E6(%#-vGp&;{Edv{ ziZ1jI(IQ`U(bDP*AKPOA@h}Q^?>I^an?zpstcc?gFH|t78>6BQ(=P?mc+D^{PI5*$ zH`!3%))>!DYuw=%NYi{9)`YAvP*}iBTc`%6Ic&1{3I7Jy_Ir4lWeOw~jx~UR6%8=) z)=AS~LBm_j9@OI6wFXOU@MZyXbj~zMFV?eu4z$>hoq_*4@^NXmejY){Bhxy9LB$FR z-0Q=HzbMpHt;+yV1EyZwzo`EDtn+UTJ@CIV!1qQWqWlPVQ)rybMg+^l>#l>wzbL*` zM*T&xy_E&Oi|9gJC*_OP-|>As*jhPty{2EluXWBz$M>M#B=$AQtRk-`|r@g49iLdQJI5qPQ7 zBH)eL%)$xu5RzbA^=&V5Kb563X(FZrb!HuYNz&0ZeRiK*XnW&p=FF-`FVpQ1@1rzj z>X)98gt^*Yye`sF?3hi#0wni+K;6|S2y77lO9kLLT>t7;&mZBE3&}`K#}7em{wR#(Q_^5l@dDCGTcPAl`{y)2fm7F7m$-O*_O%qpx2Ia!CGX1Zg^GZaSxdR6jNNZ~@ok_D`;fX9It< zcF+;&I|f8JP)9_mfVu(?Iu~*t&|9_O_s0}Lj&GO@E}1UgyI5y{CQ@}a+Z-*5K0`tw z*iVQN$lhffnnW6M-j_Mx{DZjRgg!6zwqB&`E6gJ~t2jHkGTD`0>49Cyr+SazvgpwX z8h?U*#Lxbq*IefKMam%4zurv4w2vm1ibZW|Oi=~}DuS}>}XydLFaSXrS zq$G^z$0-q-eQy6ODV;6+1A=R^{*)P`vH~JF$r+XF?uS|}uhb%f{7A^FlLN}@tS0Yi zHc8H)v7~TA<$P+0vSNlcvvvtYeAogAxd9ZonG=nOqi z7K}OV%(T&0_qfB=2a?D++gg`4-XUIDvM3melK)#{Q0MY5jlsSLk@Z`fu84g}{hr61 zGePUzel@}CKUD?RNhK@(u(##T+YHXb1p#?Z4}bn_xlLab`%C(J8u8VT$DL^BZm2^$ z{|v0TG3Mx{s? 
zSm?K{TG`zpZ=v1eJkFT5DSCV;j%il9=8$~h&$a?I|NglB{i`a>P>-jU%n%GVF zVDHBuk!B?^axa9IEN!NMxHWJ|O0?>oe#I%y5$r9fFXdDg;rki*vt8>0CMEBM7PWQs zQAu99g_y}w?nZAcK)O#@*$WACDP3-~fes<(8ILUm%!}kK*8wZ(EzA`+E%0hd3pi^Axv}o>^!oJZMnuz$MvV!Vm zI99*+=o$|*xVT`=8SoSNPrPw09jl)^iTyXL4I?sLpKn1Z{8M7&qKRyJf-cF{IiVre zGmiJ+i2#Sxj|qo54Jb(9j5~oWgWu67@(e8Drd_3ApU7L9Kdp$YSIIl*Xb6zMSIxFR!p7*n7A;iWn6{K!E8j_?ioCiex=u>2#yr$mQ zX>3^Oh9r)lWX>X=nv~nWJtzDkM-EyMJc#!63&`0gfdSb4&n?z2VG+y5wZDFiK2=dW zyTxJ45xb&;R7nU_)>Bqt;En^yy3i!mO5vD%lU)?y(TRX>nOqO(i2>ijW8nS;KL9~f zeEbP7-g?4|8)7lIoYmvZGFY(xCu<&@$)<{9#>lv@i2?3~`F%a%v7FSTrb!#B#Y<<& zI#sKc5kQFXAKTgwco~$Ka>5D_sK$^hr+#*Rtq)@BX&o@XZS&GqXt!)P>41KiEbD|p z+=547I41u@I@-HfnC*_d)lBkzTOvoOnqUpySWUvX4zKG%^EsTD30pUiV16~=&&}dxGjCXme zpeX(7tVw^)Bl`%2UExaN4`bWJ_-$9H?CCH>0cy!@roBZ6HVbc=q-q;`k_-Jd@ukXo zU|hpnhx5Yhd!Q-h`C_uyTW*q&_K1b{ZC*Ay*ZYyU^}ywTtfO%f3Vcutb)JA@*E^Ay zB+PCzlZP%Rb4OK%dWo^BW~(z6e;w@z6m8)R%)G)rTFEVqGe?8Co0ysuja=UnBMC;J zfhNp86oUQoZz)0F9Ii9$@`qG|8jE8dI}V6&zvi1E|J;+;D(A=Bgkh)nQJ0eaJ2!yo~&dVu%L>4l0S|vlw#! z&Xbd4O?nQrRE(HH6M~)uVaos^f>=tbK%7N3W#J2@i?ha$thsP{aXW)gn^dHc1H9Z} z!;d=eP(Pwkffv9_W#nZT*1?R$>ZN!H~}e>J2Ku-I!x1Rt~R; zbQvFUXlRZ;r=sdh_{B~rqFfW%0`Aj(6@XK zHLBBLQS~8Pf@CL=F7!-_PRuK%D_~v{r;jyVtT60VRv9+q-)sN1^Uz@q;(fS9(WK#HMW;I~BxUM!{BK>So9DU}{zTig`Ii^P~(;b-N z6nHL@G87n!EvusW5u#=vXRiNLs}M+Q@T=}Ys&n?Zv~+y!N5BjDg^;z*-H8$J{IdK| zsnu(Yn5D!2!1cMNoOfT5X$?L9y*c@t$pg7#ctKY*nK`Bd{a3Jbe-;!~RE{ z3<6B*E?FFF40L@MOn5bIc*%HA&;{GnuTr07bmWm%F=BMVo*_w6F-yy2rt~obau@?x z!e*@gqWFt~#gZn{tM|>FKdqI-8{8^}^=ZsX>Id0nohy-kqswu31w3Q4$QeG#M6lR| zd`iWqH=y^I13-12KQGBayx7dZaDe=iiY3_nRgzTi6Yv-(a5%VzgOb9=fdDuI)&KRx z+9bsJhY}M$h1?7xs_Z_@O4}}6J{5ZK={PNy2CoPG2>;IzI0F)$em)CdhPAcAYJCZ) zEfg311a0=AU{vXSWOXl|G~j%+^`Bt$v1gfF*(*{j*=W<@4;Kr&D27P&N;D)sKYhQN zI37rik9m$o&eMbwZRDO77i|T;+v9z8V21XR6`A-N*8rK*gAW*bs7pFaKH%m83fT9` z3m4TNIv--!3-zyx_^^F{xnJ5mT}E{;~PSC_FnZq97p(!=LnXsq4_*h&yRly!{~Fzje+;c_Ln9Fk#5-kp6Krqu z)T#0@-2|t%5-81^kmWQOo;k85fJ-kLZrrFAg3m#(AM-Yo4{8hgAs--EefZH09xgDW zn32au&*8RUWrIQ&roaW<5DXv=*8^o>NPJ|V4R)FI68vzWFoGaRKDSp*N)zoR-QPwg z_ea99pJ9u~(cy%lm>Ru;orK`{l$|8AeOp=ni^GjoF3ZwA*EnjP-n=fq_poL(!D>9T5-PaQVkE zWJ)$8qDXoew(ieo^#i1|W3o}XoQc8yGWpyhcAI=D{k2J&Z0$Rj>36POSNLf*674c0 z5x3c4RlJyfNI9w*g*S(uR)Ebu<=b@&Ev@}dVORghnK1GvXxb5$#X;Q1x)GVxVOZGf zJtWO4^!#&&%rO&q7DL=EU(PxRhq4o5^tD)|;w7S9>9L)uf^9j<})JR#efJRJ;#KYv<2+7F3XbbMSi@_Nym^?zszj)TVG-K_s{ zX4PwkaFJD)PILyyN>Vl2kxpB-$fbhdJ0B($1?>}kOPfVii6-8|Z{mFBiq1LC zPcuzAC6@lcx$UU9)t0f@7)Hj9B>dWh_t__4D`#^HXRU4z8er4nedxv-KO{EM7!XQF z-(!~CHVX&>D?F$A$LHFc2r3Ra0vTS~)B>$;OWzEWFZvnY^#1NZtCc|ML1RpHyQ%YQ zL~iD>BK>n&^G?>K8nA?p3Le}Z*js&({%%TX=-K`c_|0|g@^2q4Ed(MwzIWXwe!p{T z_F@9PoM;~+6EoMoL5XYHZK?-P4Am^LV#@UWP+E3jTJ+~}e z5fr$Vok!7qv+XL$b+!{&^eEZ@@0K0-1iN|IYJr5_(`?g@U1$Zf&_G~Kd6H3y?gEWx z!fA`+`^F{@S5!pkL*%GAh-XPCw1Zqd4St$npd1?WTI$g6hDDrc4x!hnos?D@e;8ja z+*aRt(sIrbsGMG`AZUHZw(pdDW+(qjQ|W0fx4kwI|7T9`!+xYKUanUzV{^5gNpsS| zZ5<~>_%HHp|y1Z$k^t$4NbYV z5=RSoD8YY7zW=2@IsRHOj7mOyA*MN<2OmCb5?v0s)}S#u(lz()a9)jcZ&OR87e=^w zI+BayP}s&DzvlWPAs1bY?B1O>F5%IV_yZa20cPBd_dw`4PHr?t{?;OO4x&W&b(Am$ z6#~sZj1OM#kMo5=GEO?UTz>+CY*Mp_Q75aWh*eEqCqKd@G)Ihl4v;AOTS;(<6pCIL zB})8TNMK!o1MSrm49GqH)K5Kz4`o)SS|)12IA%GxDbn4IoD%j&cds`=C)@knW&aiq z*z~sa0(I=)x&d%JZxh1{>9n}v44Muah`5!EUk3r7?t-I>j*t5xp>xkc$$@z7L;_4b zFN?qQ6*+=f2b+I!;g%R`1Wrc}ltHvqtK%!sMqmIBw(c!B(T zN&%D_J_3S#MH{a8iY%#UvgGeYKIadK0`4&|V(NAXay@68{C>h@PZZv(zG3{FpzAB} z7sa6VH0eqjIsn=l1;B$h%G}6Au zlU%<61dAdaUC1e+V^sHME0Weae2(?yKIDzLb(JfEeZt9mto_3pT+=aexdUh!UM-KO zI#g;M!0|x-{Y>nb4nI#OufgZ7LAi{2YyKcOp4@;*KjxAmWL05?>_@+5nZ%Fgeq^w=IvziLDNfTly@zv3qsCZr33RD_b*&S7EzRMP-3y4@BjJV9KSN)EzwE6 
z5$GL51)G1RG3(1hS@Qk?^RI$x!gbco)3%PMKQOK$tZHsJr`-6Qf9tl-F>M*J*QO_d zAa=PV0VELMq|b;Sz5z2P4M%%`{IBQn_15GWyKL`rtuOH^r9eEZz;HY9Iy@M;ryF<@ zgQP&oQfAsxjc`0@%zh3|MnDwEt?+We7=*%Vnur_E)q*6I>$aCP zSwhQ=^S2{i?cN$PMc?*oMSw`RQMcT0X+VK;jv78O)|+qsYkE`n+Z~2Z!J^wWeTg>g zO$r6muvp)GO`EZJFyBH3*BzaV@m92W|`_4rninl?%Wrt-4#4;)^Sm$(zf%=d4x3J^{ zXJ=mSoP@Lmviwyo!YlH*!P%p|^p{B-Me|W^rZv@($L)-&Zom+$W&gmp*GD8I$pZOl zt{=m4EjsT%d3Z)pcrd_$q77sEWjy2NjeQBKan?A2kA5C3H>7`^UCT-}+06T&Iv0e; zNnE)XBxr%(J4o*Nz>2*Y>pds`X4sIPhnGzxL?T2jX#GK4woPzqc1gT{V65y)<(mlI zcByx{gf>CwdXo0(D-g4=Je7j~$du})oZUgZ{)7L z%+9;V8yP`l!xOMR7I;dW(t4AD?v6#t_|Mx|Zj@-5_|CLQq()?p1zWQaf5>?+J5!>f z>r=gu*TtjkkFon#9AsV~mdL-s>!3rC4!yJ;IgYnENw#`B(|YLV6?rdcoH7?SqGY%W z%*NK?qp(gmnm76!D7-2mke6N@nkb{P60aby|;YKDc^9icv zT7`$Gf~|*(3zjKBV`CYXq00OaAlUyuZdZW$tYKz@*RB1WNe#ck$Y-Yj3=e$SRsU~t z{2%w9p!0*3h82i4$cyM=6_Gmv0 z;||rXGL}Bm3f|NuCeoVSgRW%zW@X1WT+8=BCI7X~X@WL!9OkG2vRS`C^kl`ptL%}Ih_*~o_AJ7mLi z6Tn%d6;`dP=F%5=iE@9X4GVJe8FvoVb+o$1M7CefEsYVNS#Er4_mC z_uFkkw>hZ>GWBZ>HXK-nj^k$_F$OxrGyRK4E?ZRMV0F)Ee}p<;)37!k`a&;yQ5#^lP<#I0s6(uk{c>`$U8RD#FfDGwbAS=`2K~hD` ze)XzvtBj9q0p(=7mTk%4Iy{KhE#)J=HU=VtNZ`Z@Yhay(qA8CDmh1Mp!ztLMql-yG z8MN*UJ$g-=&!a2sjmLKlJw>X1hL_Gxb08qCbCXvqIBm$ET`r6N*1Bm{8w^jWNk}n; zE!1qj5O}snbUxs8`;7LbzQxbeR?kbyP~w8wctPHj|H9Zcdh$Nhu*zU|rvWzc%YjC* zYs(DlwwrhdsmiTuhf34dN};<1Cz7bJwNu`p_J#XS9lU`PSQQK(?!296nLD*Mj3I5I>COj`++6Hvp)|# zB(zFYE+V#sI*m=ZY=hWAT5O|&VB!opy`NtmW!BRkUh<0x)%oRYf0Jiv%J!{QcrAhj z2MZ5`XR4xnU9nbXmhN046l@fxfekzo@J_=kjB*c`m-*fDTZ2%9TZlg#m5n6t^ZZ8k zHGSIxQ34kf+M=}Weod@pMemP>7y0>aa3nQX6aJ$3eRnhg+G;RcN!{KBQQ0rCZA6G- z|DuR4r-L*4+ON76m&|Kw6&8MHeK>9^#Cwr}Uupn*R!&gymM=>8wvOe|+I#l_)d<%y z2ONt+z`$;K>5PAJi6>|2DTkk8N!z;S#;XS14}YhE)u@Id^8=5#=H2u17K{8SB+MU& zdw8?q4rZ)Dar~ zz}=4*#I+xA;zP0PzlRlM!;Jn3}@7Fqlxk@g%vq#kj+J-58!htr-dZizad|L0ld=dgaekD(onwu|{Ia;OAU zdDo`2ty})x;T;rKBgbM#YPs{Kn8BaW`v>+m?78#I%hcEAsaK?{qw1DMsJVifyAQY8%{~4WO2<0k3E&GCOFKEnxzMgx)12y{O-JU#xbIyV507z zp!<=-nXKzDxs1V$E+Z4ed!N?Ya%JRw^&Ze7X`SWB7oi*-UqAOiy$|oV&8ZL#>B8A~ z?4h)8-Os=bG!uI%3<*A^H!p3$4nBoXzK_*DNQv1_{(2;u_MsbMo?he^yD>jej=8-+ z^VpHYDmj$b3JNY$ji>M^%kSA;{y^oZ-^UdHnu{$sd61ZcjPXO%!OZY z3@?^1zG6wCqg^}FZ;>zZI5+>8e(rtui%Y=?u}WMok=1-5C$e-guNF#as_oiCUs#qef=RdRtG9`<^@N~u5EW!VnSN(> zZ~BYkBAIJkzL7@(FK2I+smc9Ao#)gZ`!)2n5>Z#t1>XfuD_w3FCoEUqurt2&CkL+Z zEB{%xCmc~p)P})fTa>i&X&*{%2g&wI{Vw_Z$=h)>GTJU?sg3KR;jzg|WfKa2J-uZ* z3_|0EU~K4lF0aBjI{Ek_Z=6|R)A$1iAL>XZcxuk?CRQ4ya<|S8it>;-;>wFbN$(XG z{42|$K!XxEKE~vIZxB*4UgrvhYnYWc~UncQCZ5LD&-!F`_I?L84`wZP_@dZ_dwr3bg zpM$qg8V^6UaTa4ZdiA?Tp8MX7i0^^5f2wE{Oe)FX^Z90v#wD4)YDa^&-QKL1;C0WM zJ;#{w%1(33FZWJ~9;DV3Bi13D$eOE~vm&i$uFAMc8WUd5v-;xI>OLnfZLHRx8#s$) zJ-CC{#)_%^2-g}Mo;QrT8@Jy5(J#_aM7qtJN)}bzHiBVmV7{o?%kv@QUZnRSiZ9Dk zvf{O>wb^-%m-I8Xl6~H(Jg0AZ4ob%T2)#01AhR23K$pL6Hn{GuQ$70JhxviLL@>!Q zzp_s^+r+@ci{ItC&x@jU7Zcr|vgK9LZij4;MRWdvaJuuF;>CUR#41!A+7Uj=I*Z<9 znyr1O^SJ-fFbiyS>rk7JX=PS_-;2h30yLnVvI{ z68E`1G|1qczuESdt56$+gE&%h9XCN4f0@u+ zQ%=W|eT|UYI2+3i(GQYIv8npDNs8eNN4=Y+a~@QKQ!v*HrGM!3;+` z4ZXacl`NOn{Zel@T`#(v)Kr#2`80dt`h@V`y@h0G|M}%yt7K}QuH$veyFCgQh8mv3 zjM={!bCxgF#L#g$_5VdtZ|=&*=y0qVCX(0+YE-(ABRX z*J|}LX|dHEq2O^Gc-K1?w&KK8dpkrSe_!$gPjzN_Txpe_j$pD?Q|62)v)%)W&vk!B zlpHCKjTmo>n>@4BXr{`@f1eD6LucXy$zlrg(! 
zF~ks&V_Eavulr2g+G*lCg3MnIO9GhL8^7!(pWA*Qs3<&ewG>dzL+?SPNC5)BM#+7) z0*_OmMOZ0yNW}F*)h{4ajRcR-c?*Kp>q{3NWr8?Pcm8r#9@8qooC{@{7+_#UZ9saSrStW_QiM*)5QbN3i8%E9i zbTL1#a=+$j~o}hMx$Kw^m zF%wVBl*`!;5N-2R7aN~nyDuYtN+T@j%X>;e+m~7k zA!k?d#q+Sx^vh(rb(=a^*7S;uu?N-p4V%}h;c8)P(Dx*x?{~O(L^iAQmzZtY<#mv7 zPX$icdFg>VGm@0CGi)^*5t?FbmgG~)N)AI!FN*D zvm0kV)5J)fd2@53(>B?X;n@`QA;MLRFCKalwzRNsmNsN_9VU-M3e= z6j{^MTesAkEfDmN*W_l9f@!o)kVf&!1WlqWes+!<*-WOE@kBnlp^K4d;tQL-YH!sS zGG$L`?UTu)Mq`ezB-83(fs_c-jAnf?!lC-zAZLv}>k7ZaZ40?}<)7L5FmJt8>v;k@ zxjXF&{!*UP@m~~CH|Jd3;2O5h8k~XNrT5If4ZQGUtCU9=wVy6aCUCaYbM_H!W=Sed zQv&4{lM;rZGgYC-cVrtEJkGA2<$V=$malHRSikvkb$)FtLh*OHgK)Wvfm&~i*oeA zva+-J%7+uDH&w(%VgTulk^oU+$Q8IKjrPu==vYOW#tRY;CK*)Im^oA24(Cs6us{0L zfOR_)=!2ice&$re=0!-I=VLnuFYUblt;I8rnw54XT+#i@F{UiKb9EkG+1z8bO?shl z%vSx7G{*(n?2o>85jHoR2NqF8#F+oULAb zrA8pJ%>|;H;8r2czhGg?CawnE`%}usjd<*TN`24MniWeo19@Ff0h0Qcwi%YzAJ&at z%RtOCT0Qs@+@?D+D8NV6AP{F77c@{8xPIF-xiY5d2XJDkqzXH5$9N-@k&U{@1zRjD z$sZ1ui(iFfpv;<#-chYn0lF0Gzf03%AY~&L4Fd!MrB3;g{V>VLHgfy*yi&y`6sz+g z4jPh7s9`aaI_iwO#YSiG$UR_3-1?{65sUiq+V4BaZMWl%}Fym`|G%qr9Y2PC0GM(H>UHRlzOfBGx@9 zndK`~65c+2Nx^Oo7K3JW3sgUN>?Ouz-FiCb>km>yt4(DnjJ;tjd*;c)d@}is5<8v~ zhIgDfPt)LUPk%?W$eopn=gCWZGx2qS&{RV5iNk9Mf_TG?Ezmr^-T(FJz238`OIf#$ zIF8-*S@y272Bx)$<)bD1EKUVS$_XFBjd5Ock9FNRy{Hrxbz)*I*{S6v(*B zjp8#eo9?X-w9tp1rs&vP2)3|8Zm}ScaS{Mfs%u_6O0P4*gKK#jNDtzJVFapCa9uqx zTcHScWaItJSS6?me}+-Kdij_}!!yn6vWiC~)*$`?u0DZ-c=Z**4Hl4P9|g%x%p{OI zd{7LEhjWn|OEn=Im@l6$<`Pm4EF+(AhA;R>5SUX@sy<4k0pCq_J+$l3+Xm+1AuGQ7$1?^J$oL;xrAMG`I^HOY~)T8E*LW&_{muCdJlx(CMHDG{y99*7O>Lo2f6oS4Jge zSPdqp`luJ`)9O7Bf-Q@Hgf;kmY2)W%;Y1xzv4mX9jsxzy@u39K{;FNeWOECZQh1kK@I)bpG{a6NX#~q2;&7n5{ z<})*&f$Hr&73c(bzb;tSK;7v%`Oh2!Tkt!kv!mhhPp#TL(%dOZHl!~NH>jmrqp=U1 zj0|xXV8?yrWJr!G{C5Va82WS&(+)1z5gqXE$Rcy}9smd^W9@{yxPrv7G*!7)?uxY& zqoy1$bd%rFm=i%p`akGuI0))!-Vlwj&W&U~+>4l-C}du%XiIc3$34$*VG4G;X7@9o z*)tO`sAZ1ARyOT?Wqex{*iAzvy)!TP!};Bf@S8WqgV^hZO|nN9z?=LZ=F21>MduyR zMOa|C5yV0vhO8DAqtW8x(ln zHN-ztGOPG$H8Rhq5iPL};|DfrF}*vNq4)>;+xTl}Vw@Wja>NCgNO6$ZP`^?5Yb_pP zO;ARS&U;Ef2%2167AmPccvQc6yEJ$v@LgxZKZkyV{U;UjpJ4Q^4K7dpk0kV+^{vuB zn@`Hq3Fb#lojG*;zZb1M|CoAYyQ}lnnnInX$9TrhZhpb_?5HxnHsY~9mEQpI{D#Ww z_BeWMb3v^fB&PgKpK0X?dGa4iW}-hs@XoL?8FDyC*%LznWM@)M?tz!bw4+%ApJUU* z_S2cIosPXo{n1Nf_oMnVc1EgY@2d65JmCSg{dXqeYfU$GwPElbd7F=oYHRQ*L zYsd~`9Gnz}Lht3QAA6&gROaj|+KHmuEapeB<JJAuDxB#)i|L2Dx`^@l+P`hn*I3XmDv}R5cLpUTU2i8b2u(qaQO(}W!phH; zjJaNA zP&zd@Zu#S~__)=W@~)LypTzrBh#5=5%O<;E?pvO4{_rf_Vn4O)qR+jLixxQ?TP;ai z>YA6HM7hK+u zJCv1pR}i=GXuB^@WzM~*XZ(b0{uz-II3C&X9h8L{B$@9VQ%&2^_2-fHNETn=`XUfyW3&r~x&LpV7Jd z!%Fs4!6|;?@#9j9aO{I$3*;%>!Z<^sXc4Qo?C&F8iFH-z(dDk=HAXPU;TV*d`)geNBzz0l{l-ImJ-I6relO|x-DL&K=I=6lM%(;qdy_)}Uz+T6x2 z5|v45O6-Ah+Ud7XjF`BMJbyn^AKf_Q$qRg7RW?mCRsRPpq)~y=@o?;!cQ}a7%MQ=k z+OW!+>$ef@UXc>E-?hHd zMEOgu!F++$7_w2xhw-yqr-(hG8C*gcGMFGwe3`89NdcDyai@QkS3H&@h^F~`PghU) z1gm?D)?9yyhtJeP!=@P&6lW}MF!w*33;yKGsCC9Jo=kJa9%{ZV6M^qWT=`Qh{JGbM zLFXPVKd@FJeFK|1(GzUye;cfm7`Iu72Z+&QrGi{Uf4x{0x+3nZ+ftabkJpu}k>hPL z{5)(_jqUf&u89nTh6mb%`|6h+v0?@X@dpkdU_t!bQN7w^*X0X&;4cd9I`J$se862? z!e-*HXrff!CMRH2PofnSpra9Snvgd(@X$9$LwByRr!wo8 zL9>LNy->@$4bTepP0oD$a`PCHc#Rb^xBjr=&G>h9-cQy>`k~$)Y+NRLr{q?xrEL>? zFAT)%F55#`IFi283b{XRL`jT(76f1skr{PjXN>$WJL4UoGyu&&!VtVjO#NrDQTAGc zAnY`xzrspo_U5Y~8|SMn9Ec*G*?IjYfu{nL)9ho_^MaA@rrz*Jr*B-R2DFPh<#F3Ttary_xYnpU+AgYr62nWf zQ8-NqXetPet{JeZPbu;Jet1e8$TVa54f-GKK940{Mq? 
zf7OX|KdFqA3|*KeQvTCgeeQ5GJA+r%%Dw&o0jqcS9PRm-Iun&&JSBvw>x6?)%YW_wuYW<=Q(6$EKlL zMX9kQzCnv(xu z&r-rngNC?Ye}OcL=58CdB22z4vd#A$!1qg67kU||Q&n$_NRjIWFeDDE~@BxUaSsMhW!6%d+WHUzO`+92m$FBq(eeV=@O-72&Fqmx0ayw6zG-mtuH78B7n?YNqpvJl-wl;lfBNv5CXFIx%H zvaHd4yxIXEso#LH7FZ^_CK~vYMSY(8?u1UVTY`QZ#aH<-srOHiNJ|FZG=D=>a~~E7 zgg{wWkm6E*;tB49y&N1kos&=X^f)Y&xUn(bccbD!$LJ~)8bGoIeUhnqQ{m_(?oWL) zt|^(mz;MB1d||$lZP<}9T2CDj?KksSRK(TDHx+l)6}0ynaal zITC$tS<+RzFvEv)45qw(#f4J1_=$E-nE(+^&b-#R?e|sftripadyye7&_nHSVnjid zNbpN4>0z!73rKNlk+kV4<{L?h7Hgo)+fYZWgUI_6zZj~FT% zhXIp>N&`7 zlB}ODV47C__;lQ>BrV@bd~Z9&`Jz#I*#XT_J}*i4Z^XBwl7s;w-jOE5&vBgAZ#(p| zYuLi%2)--nd%eJOZs!_2(d=tKxddWx&3`W};fuZNd@bM|xu}v|xic^D^*y}x?Zen= zMn~jtITEW*#gVyax5hQ2H-=AZ@NRKc?_B;)(*Jb%Gx2cMdm)0xD?s5iL{-V*-jLgm zz{QVkKuIY;0p8({0{j*<5h%nm_4f*``p*>_@Z6y_LTQ1wr#$JaCdMc*`6UT3(|~>3 z#Q&mw!zO)E(ZK5ZQH~A8VsbCj^k|@Z8weUmU%2EN@Du$h7{K@|Fl5LCI0@kIvmr(P z#Ra!R?E)V<$}^!@kSsAp%sbBYi5=bbPdg5xZpf+x;?F-`WxNgyC%b(2Ke*klpZ@~I zi%QNy*Py@ByA6GMyP*`nDW_g}WS&lketonAIL<;h!UmR(raF`lDVPso*6(g{p_0QZ zfE+YHgc}NzT+C;{4_tb3q3FC)N!hq?A+hqy6-#e=jyz8N0N&-ii$RUbyRb!Ob}isP z7g=#8u(7zL7Z?kViFAQ-ga)t2Yp&LpiM{^ffchc{fY9&Ul?A$~35%Di%_PM-wHwk9 z?zbnJ59JvWg;2d0fQyg;>l~uC2MVF6RnNzJJo!qB`<|RKJW=|V!ifKx#F$pxM2JHv z?8^Se|CgJX6PY&i_*L84{R?A2cVMIELyP)bqK_rs!hHT+G38%fHKqgd8x_P?U~}6S z*FecbqXhy9oUL=;7%VyfySXWKr8Gr61wBA!9d}6y6G1#-u*N8)Ko%5Uv-f#~J_0?S zRcJ0Z8c3aszkDolN9xA^L+Z8w+9?sZItPGJf0MdJf$m9R$vtfJ7vhs)b|e4`i?TYE zvS4iU3!IXl$)m_lS)$Lp>iHk4KPvX^SLj)4g2o}lE=0gN3guBlT11z;K)Vd%#+wxu$5&Fj{sQ|6@b)$q(_)d0%PGyy(NA(VI-BLP z8g<{*Ac4fQILh-LS#ECNiHx1!N~6YugIw0jxc_GZ>w(H5F7&Wb=Jh*vMFY0 zMaDJW?frwpH5Bt{l@t8G=FI_++3wl4vi zUeJqlc-Qo98|>GCO)*$B#|cb`M5M(^fVTz1lBRKa^e4 zsfzwExp4=m{B5L?buBO@(nAyjFi7j5oR3_0ic|3O5*kh0m1SRk18IUm~&9gS?$d=mIkv5$U8BAg! zoBB9A18*Op7j~PnDLEZM4>yl$`LygIE>pMJm6t*PV@3sF660*zUxr`6Fh@{Y$S@+B z$A$WBGzt$tJ2Av~)C5poY;4B51xB!)bS}XzDFk6xC!s*)#i;0R6Y9|x%cW|~@4mAE z9uY9dCsjKO78P2Fn-Etb^8wYRac`7li9A!PB|MB6g7Xkdn!Fau|68M3+|XD*_vd_T zk$3cxeAmkXCPjh<_>%F93Z+8I41`fp^sZ8b=Iraq6TqtVrZJFxqqm_KAN$0FaAf|N zsPe+??X>w&l`EF!3GByRQfk}x$>DE|wz**UO`Ok27WDbOasNsmJM?n%VV{IO+-JMy zCcsjNi9jOq%Pw@gC5fCd#2aFti!{{u;5M43Dcl6vbfRq!Ky@>U-+h-SZiF5VA)hY)SAqZ6^KR(Q;da;pfaEfSN(k+x~#DH(|S32h4moQN%pW z1$;5~W&1ad=(jm}AN0XQ{P)v8tjFZU#rEy`U^6BJW~GWth1>a}QmK4+ci+38j{Io0NP~_XXBEHw++q z*8pU1=okEn&SWczo`AtXJQM)=K_u~p=xz4nD$icuR; zkpIa~bh*KOpJDD{dX~iv2Xug;<1bK@$&(8_B-3QKn31czcxI;GHI*AfF4}9>5IJT} z@4mkNFW9tZZAEijYLPl)k53IfETYaR*X3N`-}~}|Os9f;$f*XQyPPJHcvDqaTr-d% zcW7Bj2A7S+m+UT=7sb=@Xxb1(NokFVCrUlysw&N6R;=b<-!GYo1v90JqIUY^UDg@! 
[GIT binary patch data omitted: base85-encoded binary payload, not human-readable]
zVKOZtumTw!09#ptsZJm8M*CreYi|RyPAG2m5s0b%i;sp&7Y?81p8=>aNQqWZ-7?fy ze>X>l3*HD@4M1et`W-4+sK{#lG`7|}DdWPvP|qa8XE>P!vrC7oBcfbP8qdd$C10Dd zGPE?69`f90FNrv?onrG|;q@^_{1qNag@WAAd#&VL1M-$-hss``PYgAtZz$IETkUGiy;g&*G%s z>^V!z@F~u>cKze3hoV@l()aML9|EQR=Y+b|701-%y~}5=1Y)zsY9mCwJHm>J!VFyq z+|hB<-$#0rBN1kW3+R$dG`!3%FGF;P+3OwS!-upCE-AQW=|4Y`VjFZvXz&T0N_4?> z#hfU#Iam3j59`K7eUgqRF>SI5ho%`*xsYJ#cAtc)*(D&k`4V%-hp z#@~-MItW}dIW4ss%&`8sDW>XE>%;eIDTiyO{rZg$2d(vUT{#jztyhuwiRtd#yCq+f z_nuSYT8$xy_zU5&M8yMsx<#y93ns0F+m1J#z|a=@R`zP}0Iht7w0RE5S6v}q=1Crf zKX)mt^G3-9!@s=gmhSuVi65CfqidSaS7`0}+$Hy7z(Lb|0cCI~*>zyqBXZ4;Bh1kZ|Cko12!|$KIo{;2j_&41k)tNVTf7IUX%+J z4VLF~9anb_uz(ry+d-$WWGozL8mwD9wJ!?5bO8Je3`tp)4)1vGE(fBQ11%nTN=%%6jfQHYNs3E>e8}ff zmNHaXY78;vl$%aY^~(8lASe1qK5d*~0P|h~ZYr&oR6Ey#Hg3xmCq+l3E*&DJhZ#mL zm|At3OFUziQwMnyCtiFvsP4Xe(8U)^Wg^l%?^wkRRuDA~m!Xud)3FWt`5N%_*h4yQ zz*jMD+Vb)cvn&v*_+%{IOc-^Or@aSK67+@h!MCh8%v7}$L3zYGW{{2VZXSpcl@}+) zr<1Y}`!+%4cz4X0yvj{sL*KMB5>=jBfx3jN;2KH>*O|IVueK|AG+a-JITAL5Q~~OZ zv1zNsw|$s+=>;m!aKb z+xyZTka+B>6I-iuIvhVbc*1;*X{^^dBS}?|EkC$vv1NNt)pzkXbyf?}V$r5GZoD}C z?zb^T>gswGTCtWg9kasRy)l7Vso&qUOoAG>c^ss^mAQ@BG<+F1+&Do0VXXdpb66<9 zeC+F&Hy48Bfa@+~lXFp0*7RY?dTIMoqp!D|JJ-T)Mas9Z(N2bI=pX=kzh`%i4))5M z&v^C&tSsfkC%ypRwu?7UZi0Q`Ee8XCz9=S*>N0%JO_3`Gh(~Qwe zFWu*`F&%HR@D&v7YlE<_m=HDoy~?y(Im9V^ifBoWY!j*ypXFv(c=0a1I;x;jz1=E- ztk~6)`jM3lbuk7rrG*x6(F3fe?^L+h=$WhE-o@N z&V$=qmB?mS$TyW$7b-;u_@vGbRWGj4+)NmMc}k?y&OUsI(0n_?yoXg5?=y3{nzs9I zpmfMmjiYYe{$zE*8=NDEW)LdZYrEJo zL+=5`zpMEpdSDhN0?m|MFzs1GPbBwG2iTt?{#De!{e_ekA2itG&HSOP$R3&%3n&NR zDh=5j09F!#>mNK1aDnc;OJwf?!Fdo?17S8$LJZ^|E&f*x{Oz*;-7Wvm{||0oW(KZW z>lXp4_caa_O&Zd3vD{0K%5x?VnHY@0c0>=_l4uFt1D3YtYWf|3Q~LIUI03TTfC<{V znk(FUzv|DYO2`=pcbYYBEfr=kWCH7sPo7H*|_AE}0}4qyP>;Y83TVBcqhQj$}k z80k*lqO`w({vaH149DRD2$GC|@(egzEf-}anSv%cL z0_fFZ-#c=AC#9L5OUY8i-l>iy-2;07^YgOc5~32bmN6|PAhO+U1A*QCl$M7);BVnU z^V~z7#rz5CJWOze=XM6U{Su7XW4WND#u&q`~A7I zi{VXsgl$isnUUM);}Tl06^&gVO^7peT*v22^4^j3(=#$(Cj`cErM0--G+leXy{D&8 zpKQKepV`G|y{-G=+mFUT%01M%e!0Y!@}=^x z3}a-Iv?PfvtwtB~kQ7JLQ7PIDE$<#e(E1Sp($lY#?>G3fJ`KOC(Mp9&N*@x@9a;BM zv>mXI|LHoM5@HB)+8MUF9oQ)kIL}X7ewi-%8c55~Lv?AQ<) z1`mYLY|`3X@@}suOl@Ut?~ng^OJHLifGdnZem>n|{B~&6_ICYRH(z~g)_fmD>w$L# zfF-EMPa1DX0zg6Qw$&ELlLofiSt|fl(HajyN3w73{{`z}`vnuKU+aB~s6UYlHNi}} zXVnj`kOpJ%{(|lCkYTa>D zJZ)Jo*eud)0@bDdX{r3n!)*@bM&BQoPyT|1fa(L+VPDJ(==ud)5(e;uz9(0msTK1< z>8i?<(q>V;bJq5Cq4I|s7KEL&d23Og^l=cT3bsV8}uzdc%ie{vTkMV)!Td(wY5TpvN3s&){bu+ zB5rt@ZgB2(_tnZ2cw`^qTfdNO$`kNP0F0T1WQVTdO?W4`Jm~B#@oPOTud*Ubn%5JJ@DNE4PR$0(Eo&SZRcJgjrZ38o=+QGB{+);+ml`F*$y4w-tY!c zEp>zK0lnpxtu)^)(8ueH#VKA5(=O1IkTbkyA+T5L3#-<`W6&QzzD7=bp`Mgic`b@U z(#n)AuooBsJ-(;D(hV9ppf*ApGuAu;J$8Pk#ZbqSphp2ItaTkW6q|%hzxZjKr#x#1 z7%e@KDayIfDPX<^HSP7ge9F|C$E|t)Iqu#{nGpCSu^niV<(r8HHDa{gk+lI@DG0wE z+44#tl5yK{cx@vjR&b4fv2~@p{yPP9v9}xYts$TEw*mI4s!UBdWLTkx2q0$x^$XxE z#)i(wvmq6WA;aF^iSc1sEH}OYXx#>SD-GO9fBPDA4&&mJttC&bs&5(s)CLP_j9qio zn%bMS#%nGRHq4^+Ww+_(Ua;iMbqaz7K=|j_#;(B+@teCy!C0McQ1Rto{I@y08+*r$ z*{SOa7$;1}M)kI6*tRykzvpf!x0cW2Le$0HJN%p)PB~JM`U^%!t#wcobk?5u%~x8` zOsEpHXAKQ1!vPv%=1X1INm>tn!5$fJHE+hM+Zqojfy%uAmSedr<7LShF1Vd{HK%Raa(Q6Fk0L1=z6A*FQ8vQ!yHW9hZgf|x8H!-%^1wq!6#b{kDra7Hk`8&Snhxp z%)b_t=i3~|8ok$Pf6o?RiLc*N|Al$yGV4)^4xf&Kg-IW=is#iyX~T-mZyxo# z>I7%AgQHEw#h*)TyS-d)9DdZ<9I}t|x+JpNc=Lp9`p3tk4o@l1f*%y)utTD+=6S&L zUud4x{Rm@o!q$W)ksAyAhFT`eZvk1XbmfXp{^4lFT)vYn)OX=j7;2ESk)Q{Ci6tk}3=i z9VfS6qb&+IXKdk=Vsrj7V^aK;JijU`cKXC|%P-pMALucaF=tfsD2S$PogBRmd=yY7 zO!7UQC)Xq~YsANAQ4}Vs&x(-k&{s3tHJ*UbX)VN2iu6#C(MFzQ886|)Qq@4~i_ZV< z|8SZ$0Max2d+t4e_Hclj8wop%YnLd6w%+%!fbK)0#9a*go1Xi{x!W=zn+62E|Ka|) 
z`-sWR*8q9`Z=1RTM?B1jf5rn+!)5V);aLss8WRrWbZ8PzV(;Z-?|)_ipC_eRhWcm} zkzX~b@R2;|cSM~MKp5n2=}>@M3&i^d$(M+Z92w@9^28NV3}Ew+#1z$c0p&jZ@k7v} zBnVZ1{{lux0Ot@L`O#0fiI)o4GkWxQLGIydD_i5f09CWDC4 z|3YfjQ=$951>Ghia40)n8pI)YcsO<*4Gu9FT1+_tuY*S()GVG5XrbY6CI99Gb4G0l>I#EsqP}G@K!Vz=GOkGD zje4+dh4BccCHQ8&&o9K62q`Sq*jGH1^s6!JGP*5!oP49m6y+wNrxChRoMe@Bt)X~M z-i{-YX7qaT-1V`NNAqTKWBJ0$wav5W8m)4L<{9jV0~XFYg|<^A$0sc5^=D7&dY`EJ zQJgnbmQolKkcU=A)N&##k>^slto&{VlUHniGuTiXZl#a9b2w`ha@x*#> zIMS?IVs}uX!GKQnM-Sv=+Oh+*j7EevakMU_OdCmYa_^QZl}-5V4mm9+jSg7 zuDALY45j+^k*d*(%%>Pd ztlMt0;$G9+sPYvk-y=Ro9J<3aC?jR#k*tmlYMl7$T+O7%kI_1^ZoQC1%$`cjQF&<2 zdXo9=VO;02a?Wb6F9+K!2itrr!Y3zIh}RE)sG?YPsLgGY;ew}A#c0(mf(S4C&v-Fv zzzc>IO;E2~_7gfW?l_I37eC>h9xE!Jn2?RJ{Q}Q4D1H27EPcY&5q(+Hf&<#V&R&>n zj<4dWbDUKD1v`RQkH>u4&=7Dho@VB{X?578lRB4+3_r1*qf_#p)$-^X)Nvnb7JaEn;M2A zH=G~GKFU)H{Xv{Rneav?j{KSd>CG&vF~ifr;jWrHz?RUkq4FblZhYwR_WNq!#@EQc zq2$7)GmH1`?;$;|Yp)JY@u1|)^~X+#8<}rCIO=hlXs6kFt0sUB*R-L2IXF`*uutjX zKzwDX+XP=p>%Jpj^mE%BEC93e`Ex75i5i$o;9Ky+>-*zAlu6mVRGag-px>%`I(kyi zaH71ZG^;|-kfK2FL+Xm@C3CM9q+c%0suag)wiaXnLeVj~z-=7$4H53qy6?N;q;+dG*R08g z_)DAWQ+;oiJeH7NTggN5l{ZM!Cnjay3E@sWnDrQ7oeJ~FB>QJi$~INIXIN}^@Lvp0 z36|JE0LkuK&076Q_vdDyT5v#xfDRDn%AlN7{hihs6gv|~_z(~)_X zVO4XE?9ZH%*0}4v38p0}vOLmMB6-*DD=|JDGJ7sVf;}dF`&176aOm(JTdAQ$a?4C-;8Cz{28hn+* zDi0l(a-dw%Nsm(+nM3E!_0YDp=|ti>_VL4}^Qr|_UD^9!4}C9B}iJ#Yt$4gDvIMY1rSr*}A`0B4|@~3Qpjzu&n zzhR7U(iQs)CME6P!IbMIJ+xPQ?8s+RPCxZ9t8S#-9e0KFuI{9mo*po>X9ZSqD@p*k zds4DUTM^=TQb`gJG1fLgC~gq)_z~QMgPAYkxYe1{j}HDj5W!f(liWM2y$9s!0G7JX z^lf=<1S}->xMSTw7Da;QNtR3gRj#=$iICD1hc!@{Bg2-T+Xxt-R{ZviKcWD*6-R z`paDcwK=iKx4qYO9 zN93Ny2c%(PC72_jpKIGw>y!51TJZnaNd-WNrHD5lmiieW!z2ie)OwJ3#0>Q=4%?kJ zf`WPfq`Ll}IZifAHbBCpq53tWo#_7056m5@Kd}C9ekUQW05N3{yTBH`=eI@Xt4(w? zJ8n)^`klB{q^yvs8BXI1ucN_a&~DCIL$y9%}v#6q~Lp(Re48`LD-^?617$s7ngaRsnEJ z!>@s$mr4$>E^T2JVbDcIW|2ZG`46e)pxLftq#Tj4lH)O!B32@Xg0lu;rb6kK1_!&* ziyvG(3ht~al!&#vmn!p35nW+qSH}-cVRXq|_E0CzP0s^FW>qiDH6>u6*cI(6APErZ~CJd@l&C?8~*;N(|9KXVM7Bf0Nj!N`tMbXOW7G-@!9B1jJR-G5!#yJMeMCE z3WCoqbw81wXugND*v{igUpQFkyv;N!%EH?YsbMd_3W#(u@!g z=P&x)@uoqkW5!5nA~U6~d&7d|+^sBG^(alL-6CVT7wjoz_gg#B29$u6Vqp4Bb(C?8 zSnBqT=U?u)B;y0ma@%z|_d&v#zks_!y}M#Jv;CFA7)|oHlJrTsxm;2>1{i0J!4gXb z^H%c=*^cSnI@&d>oX;u#D5}2hu4PGiRA!b)wu8=dIqmw>l~ki^;k@ zu+18hIv*=S_>#}H+yO>m)#0iez=L~(XBaG90`8Pz%AKGnEeiUJ$A-j;Vpnk1Kz!AP zm4B3AM>;wa7mN$E(f0ZYjJU0$8>eg#!1AlSfu@*ZzMSHk*Grx^95^zmr$I4wF$AwO zQvw+Ea!mfwq|v@REJ4Hvp(R*2bs$6t*4a=yZ+1Rvxf=$tmA3+Gb0ejX-gxp!Vf7yjgaAMLEh zqxsI1OR+l`A7AAhdom?sp`Y!gICx#Icc*Yf4#(ZPm(%{U zfV$!`>8?s~(d&=BEcR(2d2Wb|2x|v{F*k+ajE_1LZ;V6bhQ$=CMdpTT{)TEs1;BN@u(~hKB5N2$&lhQbC$7RFJHNpQb6>}WoM+`%X;kz zf77x42-a+sj9;++CINTOuHZCaF-ffC&l^A%bQRXLis?&Ty9Qhnyl!YusxDNSN8h{< z^eL>%Oe#zR@v~Y$@RlPbZ$s1gvVa>UcW&^z#k&(>Q4|M^=G*x*1qY5ZsTZs~MEa#V zoCm);R^$DXkrXdHKp2~pi73pS3*W>KA8d#jOL!G`t4Iw;aOAbWl7BFiQu^Kdd8xis zloQIL>JYp$IDw{ND)aGc(ld6i7P^((xrQrSh~PrMR3UPpMcv!E3)u&U$o1}r!EjS$ zy)jqU9ACM?^X2Ay4SMnZ2pg53tRd9zZrJEhob(bHxwa6ZvVN%ot`s7((Uqo4KZSc% zh*EXm$SLThRIT&5b7`Pj>&U@b-YUd->4P%#X)DBN>rT|rzUe|@^i9ke0eBR}jlT#O zlb6Pxs3*^uUl%9kzcc0vaVtU==%GF?ndTck`c8V7hF{`3F_3@8#ctbF13t!vQH^MI z4bB+Li(}ZlaMo$O0O1-wkn;}6EH?BPy@+hO@RLiODe!fstbu+F9YTf84jPhnZ@q6Z zAauMsQE5hA+?u)!7$;_HV*gi%Bcy2oc!D72QH3%cXSR^)RNm6}f#lksC|bH1lYIv; zl;)KpD^^ubX=?#Nq$6#de(SDq{X!Bl!>dTx#C*W@{DZ5&mxJ&|!C(W+Lpd6T-F_t2>r$%wt4bUPa-_Qw9g(Qlwwu)7^#;R`xMnE5~rr zj!lSTig1e=(^ytd<@}a*iI~GRh-N}QP{uffk9?~i?ZFTKy4Tx{3)ynwQnEdY$kc6MvfseEDkq!g;)$23X%wsG&9yhTpcpU@=Z8cQ{?(8 zJlkHI(_5-%4y zDugm%Eo$DyWj3VPVjr5DHlYk00&U^<-MNC@iYo-;D|+S%q$s z(~&BG0(s;JS3Uuo2d->Eqh}Wk*s;j1w1BcZUjw6!`db>tU-lC<9Rt$^R!WeW^LJm| 
zfAKjT=ybqkxT)fM_6^G33Xtmi)6gBYaA{gE4o4G*l|8=p#E;Q6-HZ;3M(k$k@mF*;A*r!2A}{FqDmu4si|H0%YcAkG-?{5SGC3 zA4FZ?ivQEctW-CmMeR2=fKD&y)|1~i512~Qk(zFF5OUoM_{R8ndm%rUrjJ0PDV^e8 zbw$M^P~JX-nEpohNTQ+Oo5J(LT4rGJ^F~O)AOJ-*+J_z3m!Put9&t_w12_e8ghB4V zf0;G^)n|~g1H4akCqZre(;EM~W&Xea`6#vX{)e|K<3N}Gh!$WiZUqTmIGab;#-BhM z^+#cO!1|)*@oCdHy3 z-F8j_8UHoaJFe%@T>RPU@KVkEN4$>{Vl2KMXNUSgD0ph>Ao}YsT)RV*Q@xPUi@2*Z zkP&rl#-bhQ*8xH2qg!9k8#)>w6+J&_3vzQ6w)^(x=a#Brn_PhB;ktWoNs3B=m%_J$ zf=ivHdY!16YBK%xf;2g#o4KjoiL@eA_v2J-!EqO{f)=G-9 z|nE0;;7^Q(DBk#EM_Q-+tuIj+<7mR;oTpbst{7vkg2?EY?+ zx;9mPnue2CoF6{8zx~~UL|~7dtc{XC5ViKGcd~sa8|rT?JB{y>$ymvQhuwlJXHCd`Hco5ObHA z6}Bz0{RHWHrQS#In(u-!8Ld9D>Nzv(Jv_2KoaGa4sg4xl`-5%S|omR+lr<7)i96RT+^pRAaVXdp&Kcm?>e@5)c4U3?}8Pg0tBe(DvnhyC^ zm6xx8h)ezsDt=$pXg zQ7QIhZ*|)7|MYf0=|p+=r}tLX{$x3c8N&5owo_hGuAIYN-)qCI84lSnmG!<@&9@6@ zrUvg61gE|9T&{e|qE(5HpIZvK@?`rRdALf$yl_2AFoJSj>$xH2y2erexlXh}VTlS!$vUZ#HBV>dz}y3$Ad~8Gvh- zy6tLF{%bTLLn4@-~D4<@1xx3~Lc|BSCNag-KV1{*mwjRa-tKT%=M&@N&!~`1(wBngzcL$W_`#RiKt^oWl z^E35s7|nMpPhTgz+K>ljiLb2x5n?xduzcQ-VnZn?W`y5hE-i$?0_jADO!MTg~uB+N0icbJud`3MwyZq(x68H>u4v1J@mZ6SsRJVpLhY zt9Ud!&uE=j_+{%!6s6SC>KvSuA*DZaM$0V_6eIair7URt z;FNgI3W_^Xl7%RiCG$a;>-3{}lY>N+ri&Mgc8M%o&C~SAMig}zQclM7);<=;ha_O^ z6JDm>1qQWwv>)SOsYa9rq7j&U@gC8Wvg+Ym1A(71YfHlg__a`pE-EwY5I0^LvsWT1 z-|(V!*8SNgb)=r$=k8N)m!8g;+WPood-nC*cBw54xWSf^wrwrbH>0b%QBx>vE^D%{ zu+aZv*-3fE+M=M`t$~-5b4;ZPmD^ic+l}7; zoSdUQZ(}9jW#-(1k+!;%$fgbC+E_7;ygeKX>GeyceD`$<6R+=D&LU+BH#FJOA>ctT zddv_ z3uYvgquXMGN=WofOA9!34&1&$V3QfQ3GnL76(uns)Wc@Y<)gUaHL)%>ZNM$ z`%e&CTVLN>>%O(#T1iflb8^nv`|O&oUHe9I;nU4i+9c z(YRP*DvPXV%UErc2Ha_zhDi*Z3xFsbT^!~_LId^SySn@&elh<@29BMHVx4hFqMc>d zJK-8v|JWV4*o}x5vT?J7=Ddx&Y@$&b9c7JJGLW+WX1BaAtK{>ELdM7ty@%e_%Iuzn8?4AcBh_-f zydSW@C;ZmXwuHV+nlPN-~POUv0 z_IbjftUmFXl^B>yAwY|N!WbpE_{~xqEp|JxTle4@{br4^1$e=b4y<>WHE_OUfROn6 z4-}}3lc-U_*(A`V3WG55hp;Rx0n zuFJpuoDOWIS3ksY(U)ZI6QPJ-*D@}Lade~fuPOitj;L*@Co!_`dALjmtHYrW>-nqS z)YVb+LXvxLY@IU5)!qq_0#uK~0dXrLFK#wl`w3Y3;ab-Gq>i-9sjrvbuud%l&tm&B z0rQkq2F86{6<5BazbXO6s|3kGbfq79$&&=-PG*c}JRwh;;TMRxc)mQUqzBL$UU|zQ z#a`|@M_&0jvyHkVvh{wsK&8=Q{8G&e6c~XjKeDsB+`q#=9=(i%(F%ko7?mVO=^U)H zPka~0c}-YturA$obAm7eUs6z79!EK&-}NMt=CI{LYKY_+^^UNGAxRKJA*!D!Y4`4|9v;8>O#<2d3VByq`4nevMBHcW`w8=B!-OnWDS-;~uql zZ3yqh#nK)rm4*E+#jiVFT%E>d1rzJp5~2_>FFbLqV&^a`i_b2 zYI4ba#azFs>CkK98%S%5)qZuv&0(3`Ylf9K7`dK7R>*g=MN~i5q;fz+ah|^ zAe<>p@O}F=;!4yRa0c99c&bO|D~#nkRrFP2?nLGgDK(%y1RwkzUu$qEyZQwM8(x3- z*Tfo0M`(FnlQ&pnDntA@lsUH}RFu2orgC7g*tKZ2hy5zAJl3~c8z4*V*GzH=l-bEz zX0_`l9v)iK@!h~enR5gHiinb0y)bTOb(ng(i%vP!)~-9c!EefJI7smBVfC@O{Uw*< z!i6ht1lsRMeiG$GT01f)hMQDGI9ETiv|JA$t@aT+6m|ux7xKNBATcJ&jnHbg?16yn zc1lQn*5O}SRrLm`I%N`^um<^ArRx^P$QSGtbeVx&%y(?;p5*5o-B)*+%-&Q%Zg$uw znz3&9nM+^I{@Er~1AflBE%LaMty}|jjioj3*QokzYv#0TL~tm6*Ssdzy`tl_tjlrl z6~lCG^^El%%Q{|5;87Vksbb(9>{S7Ad3emT2B0&CToT9D8O}YNw7K^Ie6jV&U8&8) zbGhodv+owqzFQIGB*fc1qy}Akt*VYYx3E+8ecM%``CNVpxow4E*TN2^6O6&qR^`;W z{BwHdg`l*ZcZNbF(_?C-0+DiwxCN^M@SLlc{E`)|;@h(Zr*%kJnV=bPCb zfilAZ;)TmnSBs{%4J$q4k9q#WeB1C9M^9MW7T}_)j#P|}&~MYK-WKbNva9RNsKPHX&+RPy(o&zr-M%2}nneZ7@vct3c!+6P zKS-o|3}K#mR`Z{g6=gVg}_GwS)H2 zsjLol5kEu4F#9P#s>zk)7r@A47W|^?kYMK&T@G>(SUyR}6i?u2;l8xPyga;{dO|dt z2%@H~*1UiE;n-iYy&R>aWzD#WcaM=$zvq?!^$ZAKE^YA*6)~zcFwJxFr`cEo z`xE(%uD`Dl6Q|6l)lkz zKw@PdJAWtvNU$p(#rPT1fip5>fY)S{I72YqNUT(2@iP!=8CTlBeEAXAu(J`;jE!RRAz}s*L2y73 zzd(pu_t6mlU#^^Cr_ef}(*7F+Hyt=gKSE_Qp2YAo+<{z-l|zH@og`x|4{#s_ z_>oB1#qb9lLxd6Iu!K@gHl5DwNc0I9JS2(8XC%aKOi8iqMmoZ(KCARz58~~l)QDlg z9n&C8eMU{ZVxXwU(1#G$9)A<^2Wy3QV~)N(DA>3-9brEOr3OZ`0H!>#JSpe|*FUB> 
zpm#c?VEw#b)_S{ec>WL6agj@SKU6c$o4MiIlI}!ywuQn?bqIr1zS8*8Y-Ed`NUr#g==qXt#Kc<=c5fRPM>X^0tn@?Rq8Q4y&pY zaE3?n*^^5T#H}DK#Pi>fN@EP9==61F^Bj7&98cWoy2D0NnY^1DHZN*Gcs3>QQJkS| zwu^pVu$U&uWCw*P<&HPo7iHoyFzqY8#7YK$`NUm)JnQhCxLOHs*iSORWIHkjQ}wuv z(A{Oc*Q&eAXA}zqE6-4ZxOWm--q26=*C!B$BpK{GLreaHZ_@J-Y{~PN-C+CjngSK zaGU&d4^i%zO}YkWE6G@Kdr8MeuQn&mj|#2EpMGM$S>d}qS`mI&b$+??a_<5BkR&bH z@X+wkLsXY|gJ#>W8JBZ@d4w%KBE!9h?Hun+LxQt3O9Hj|3xG4{m)82`u4TgH{fu55 zd@`=mYBM38eoB12MVj2(e|iLavxUt4I$yP`uzSyqo)(R8Gb4AK?#O5McZ^Nsjx4I^z?>;)BhpXFf=spQ z+(pGWde>9}b{La^{h|YNe!1f%J;BUgF*JRE_3Lu?4!{Nxn(d`adVVFv*oc{v8e$wH z{fiB_e!_O#FTH2K;{!Nf24vl;(3Q>8@cw?2TY92T_BTFNj>|c7;j#?QZ%Q?3C3M6| zazK{o)tqT&$p#W`Z+eB#iY|rzjl;^h{a=sde7m)t@p(G^7VpBBs%jwx! z3bU80X#tOzo?GZy$yUq0R1ognyiO?gxO*~PXtvfgmh_;SaIx*WdST})MOWye$+lnd z0+esR;{%LMzBJ(RoN?sUN)i10r79()p5Swxcm!{=ll}wqti}x}KU5EKRuU-{yP{wNiX)7>`$K<;DU7@uXy?sab!=2H)n&)g|YX=-_^mS>ufKtMK0qNyM~zK zOpe-Wk9m(omV5K5b<#WSJb&$4ZhFs>{%!pmW|WA!ljs6`^MZFhQHORjfo3)&k5y<} z8hb%zQWa^=xqZd(qB&HS?@k(5sLN5PT@aH?hE(pqSWLVtQ9J-zf0iHvGt^#2 zY^98y{9i4%%K+IRx#9oK3+EZAr65sOqcw>5`#-?TMe~I`ahwVAg0@(B@sJRQ7?KZ@ z`ai#)@#LStA_kWNkrPE;data!D1byhBRT~0`#&ztEHK|5@P|CUQx3_(G)&_-gg`5Y z-tmd$VZbf;Ea*CWC@eTvIT^qQNSU5Ai2?ep1-Mt7(LQwU8V}|Fpr1PcUy#UgPWXUZ zW?=ZjerNcMP0s37s$%6xHO3!U+=Qq#3{LOE8got$$i@ps!?GyIq%}G$08$~Vv_%Vg z>*e{#NyI-?bosW!S*-wC{!t0nB95nVg{1N`pnizY16$Tkns3@y2PuUkf{hkxJ7-Nl z?g7%f?`pv0lYw#+lzIh#^qdh@Th)FfI{Q2K=3jJViB{`iz8m)Rr)D56{y>_#9u9!E z>;`=&SrZT+2q`rf2xI5ad?|Q{29~98ppU-DQX`9lvzk6pqk@{x(KYZ19~@b?ggd}t;fFX*e~=@JA$m8}%q zFW?bP_YK8BQ3i@t?i2;tmbf}8!jdp3H-_@jkv~qHgF#yZFk?cBXMARox~VK$FFq*r zn(E?afhkW#bbEGn#HR`~lG4T*Kq?p!nqB6o#dIfeLV$c{U-{ePbj@n3masP8d`!&t zUEWuU$bPi~ZvS~vuPL!`gS=!S?7t7ny`0~ESANL4k)7!jK?9~$& zPq7f|4DWdI8P#Tvm*%y6Et{68)dU~oM_7|7TUu$W{l-U#dkmR<;#I>^sad!k=Gz4M zEhM|@fnP((-fi@@87SS|gdnEFJh-tRH~$a5{~N|oX0OweGhf-Cf;=4vNS)* zHPyEXl|2K`S<;Aoah-C_m)K36uHuK8+J=E!k|r80{v4>U(|#(2ohcz)o^&e&2anTd zD4+!te_)3e9{cy;%`cD-!Q`}9xYBF?evzsvk2zu0S0g#?!tzTcN_}CEUU*XfK)BN5 zZV_jIFtV!HGrr6j?@#Y?+?8#IfQu&)X?gUpHTY=uMcUe*}u>nWlj>}y8`vdnp=#K)IQnqC@j*zAp91| zLwxdB-5gwEeD?Go1IA0W|AV2oA#4P{;)IX$R0!{ZN zWk#KHSKDPqH)$EO7OR`+P4QoQ^i_0XMuf-X(jt{@Nt5ze1#Z!mDpw^vE?#9$D0Q=Y zMD`w`q@g99&dqCUp5haH9x(=juneE-=D8Kmm?Io!!6r#1DRX)V?=DRhgu6}l#-6QZ z=58ClUny}OtnhvsOs+ml*VC}YdD{=&)g0ZEa>OUr5=!gbAU2v_5wQ1d6K}$*MoV#( z-g!9CgpeE2Ct8B;W*d}?lV32q;)QMDLy?oRwO@?@JYx^MZ(S}TmU(j$rw7l zx_OSM1fort_!(fnMo)-U#b1CYjuR==-j2+D_DTo|9mebgyINo0jTs3;fIwi_!7^e8 zPXFHRu@RjOf@XipEcpv-|7j#3ITrwD&$u0hU;xTbdXS(174~GQh-n6>${Qj3A$t?S zvcaLh0$X+N6g%C#o!}3yL$L9mU-D8gv@;Fe`_VZ9i+q!2QEe+$XiQ&6vj1 zF;DNFB%~ZHwA%6sDaE4+2joT~f2x1w$CRK*=h;oRE*UkpIE45G>WkphkYf610mA!U?G9?tKF3N zgVp8@@}Bi}t<3fKA*yPzk8}dOgA&Hd9mXmKQd9;=F`b_nqLbl{9HyVdp-Rd5asiWrY{YuSaR90Cjlf&|Wp%(5t1*J7JsUn$#$Jt+Dr$JU} zX==sS$#8A!Rx}ep@Gc7!7?HAAq~?a+CzuZ7%Vgbkj0XOBVP*kV8IcQUEHbWD!I(>7 z@21tHS6lG;FO{}x7u(V7#L*%4GhFf>psic8LBU+6W`%*s(CAS8!-`{@+lN!-; ztw_K|WDJ%zO#`w$Mo}{kDyOj=J5YBdrK62CDZKSIr-(NGei-6sTg{k?TUoIKg=xfUpx|LSYf}^k|_{@@S!w z*Jz!Q?fbM{0SsiUrzvt%sz(`} Date: Thu, 18 Jan 2024 16:51:12 +0800 Subject: [PATCH 098/248] [update]: v0.1.1 update pypi description --- README-pypi.md | 6 ++---- setup.py | 2 +- 2 files changed, 3 insertions(+), 5 deletions(-) diff --git a/README-pypi.md b/README-pypi.md index ea0f73ec..fd673645 100644 --- a/README-pypi.md +++ b/README-pypi.md @@ -1,6 +1,4 @@ -![vbench_logo](./asset/vbench_logo_short.jpg) - -## Project Description +![vbench_logo](https://raw.githubusercontent.com/Vchitect/VBench/master/asset/vbench_logo_short.jpg) **VBench** is a comprehensive benchmark suite for video generative models. 
You can use **VBench** to evaluate video generation models from 16 different ability aspects. @@ -73,4 +71,4 @@ Check out [details of prompt suites](https://github.com/Vchitect/VBench/tree/mas journal={arXiv preprint arXiv:2311.17982}, year={2023} } - ``` \ No newline at end of file + ``` diff --git a/setup.py b/setup.py index dafc6f26..200c9a65 100644 --- a/setup.py +++ b/setup.py @@ -16,7 +16,7 @@ def fetch_requirements(): install_requires = fetch_requirements() setup(name='vbench', - version='0.1.0', + version='0.1.1', description='Video generation benchmark', long_description=fetch_readme(), long_description_content_type='text/markdown', From e21b1f426713582cf72552179240f7a924cac425 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 18 Jan 2024 18:48:10 +0800 Subject: [PATCH 099/248] [fix] prompt format --- prompts/all_category.txt | 6 ++-- prompts/prompts_per_category/food.txt | 2 +- prompts/prompts_per_category/lifestyle.txt | 4 +-- .../overall_consistency.txt | 34 +++++++++---------- 4 files changed, 23 insertions(+), 23 deletions(-) diff --git a/prompts/all_category.txt b/prompts/all_category.txt index 33be2a87..90e4cd84 100644 --- a/prompts/all_category.txt +++ b/prompts/all_category.txt @@ -223,7 +223,7 @@ close up pasta with bacon on plate milk and cinnamon rolls boy getting a dumpling using chopsticks a mother preparing food with her kids -man using his phone while eating +man using his phone while eating fresh salmon salad on a plate cutting cucumbers into long thin slices as ingredient for sushi roll a steaming cup of tea by the window @@ -438,7 +438,7 @@ modern interior design of a coffee shop person arranging minimalist furniture aerial shot of interior of the warehouse a room of a manufacturing facility -interior of catholic +interior of catholic interior design of a restaurant a female model in a changing room looking herself in mirror men walking in the office hallway @@ -470,7 +470,7 @@ graffiti art on the interior walls of an abandoned mansion indoor wall climbing activity sunlight inside a room teenage girl roller skating at indoor rink -home deco with lighted +home deco with lighted baby in the shower room men enjoying office christmas party a bedroom with a brick wall diff --git a/prompts/prompts_per_category/food.txt b/prompts/prompts_per_category/food.txt index b380a165..41308390 100755 --- a/prompts/prompts_per_category/food.txt +++ b/prompts/prompts_per_category/food.txt @@ -23,7 +23,7 @@ close up pasta with bacon on plate milk and cinnamon rolls boy getting a dumpling using chopsticks a mother preparing food with her kids -man using his phone while eating +man using his phone while eating fresh salmon salad on a plate cutting cucumbers into long thin slices as ingredient for sushi roll a steaming cup of tea by the window diff --git a/prompts/prompts_per_category/lifestyle.txt b/prompts/prompts_per_category/lifestyle.txt index 6adf3145..c4c0bebe 100755 --- a/prompts/prompts_per_category/lifestyle.txt +++ b/prompts/prompts_per_category/lifestyle.txt @@ -38,7 +38,7 @@ modern interior design of a coffee shop person arranging minimalist furniture aerial shot of interior of the warehouse a room of a manufacturing facility -interior of catholic +interior of catholic interior design of a restaurant a female model in a changing room looking herself in mirror men walking in the office hallway @@ -70,7 +70,7 @@ graffiti art on the interior walls of an abandoned mansion indoor wall climbing activity sunlight inside a room teenage girl roller skating at indoor rink -home 
deco with lighted +home deco with lighted baby in the shower room men enjoying office christmas party a bedroom with a brick wall diff --git a/prompts/prompts_per_dimension/overall_consistency.txt b/prompts/prompts_per_dimension/overall_consistency.txt index 4dbf9710..997a874f 100755 --- a/prompts/prompts_per_dimension/overall_consistency.txt +++ b/prompts/prompts_per_dimension/overall_consistency.txt @@ -1,4 +1,4 @@ -Close up of grapes on a rotating table. +Close up of grapes on a rotating table. Turtle swimming in ocean. A storm trooper vacuuming the beach. A panda standing on a surfboard in the ocean in sunset. @@ -20,8 +20,8 @@ an ice cream is melting on the table. a drone flying over a snowy forest. a shark is swimming in the ocean. Aerial panoramic video from a drone of a fantasy land. -a teddy bear is swimming in the ocean. -time lapse of sunrise on mars. +a teddy bear is swimming in the ocean. +time lapse of sunrise on mars. golden fish swimming in the ocean. An artist brush painting on a canvas close up. A drone view of celebration with Christmas tree and fireworks, starry sky - background. @@ -50,34 +50,34 @@ A raccoon is playing the electronic guitar. A boat sailing leisurely along the Seine River with the Eiffel Tower in background by Vincent van Gogh A corgi's head depicted as an explosion of a nebula A fantasy landscape -A future where humans have achieved teleportation technology +A future where humans have achieved teleportation technology A jellyfish floating through the ocean, with bioluminescent tentacles A Mars rover moving on Mars -A panda drinking coffee in a cafe in Paris -A space shuttle launching into orbit, with flames and smoke billowing out from the engines -A steam train moving on a mountainside +A panda drinking coffee in a cafe in Paris +A space shuttle launching into orbit, with flames and smoke billowing out from the engines +A steam train moving on a mountainside A super cool giant robot in Cyberpunk Beijing -A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground +A tropical beach at sunrise, with palm trees and crystal-clear water in the foreground Cinematic shot of Van Gogh's selfie, Van Gogh style Gwen Stacy reading a book -Iron Man flying in the sky +Iron Man flying in the sky The bund Shanghai, oil painting -Yoda playing guitar on the stage -A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo -A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh +Yoda playing guitar on the stage +A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo +A beautiful coastal beach in spring, waves lapping on sand by Vincent van Gogh A boat sailing leisurely along the Seine River with the Eiffel Tower in background A car moving slowly on an empty street, rainy evening A cat eating food out of a bowl -A cat wearing sunglasses at a pool +A cat wearing sunglasses at a pool A confused panda in calculus class A cute fluffy panda eating Chinese food in a restaurant A cute happy Corgi playing in park, sunset -A cute raccoon playing guitar in a boat on the ocean +A cute raccoon playing guitar in a boat on the ocean A happy fuzzy panda playing guitar nearby a campfire, snow mountain in the background A lightning striking atop of eiffel tower, dark clouds in the sky -A modern art museum, with colorful paintings -A panda cooking in the kitchen -A panda playing on a swing set +A modern art museum, with colorful paintings +A panda cooking in the 
kitchen +A panda playing on a swing set A polar bear is playing guitar A raccoon dressed in suit playing the trumpet, stage background A robot DJ is playing the turntable, in heavy raining futuristic tokyo rooftop cyberpunk night, sci-fi, fantasy From 3251da4b09f0d1d46bf31a42b1a795040d25c977 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 18 Jan 2024 20:00:15 +0800 Subject: [PATCH 100/248] [update] README.md --- README.md | 18 +++++++++++++----- prompts/README.md | 8 ++++---- 2 files changed, 17 insertions(+), 9 deletions(-) diff --git a/README.md b/README.md index 5506b888..f1fecb8c 100755 --- a/README.md +++ b/README.md @@ -1,8 +1,9 @@ -# :bar_chart: VBench +![vbench_logo](https://raw.githubusercontent.com/Vchitect/VBench/master/asset/vbench_logo_short.jpg) [![Paper](https://img.shields.io/badge/cs.CV-Paper-b31b1b?logo=arxiv&logoColor=red)](https://arxiv.org/abs/2311.17982) [![Project Page](https://img.shields.io/badge/VBench-Website-green?logo=googlechrome&logoColor=green)](https://vchitect.github.io/VBench-project/) +[![PyPI](https://img.shields.io/pypi/v/vbench)](https://pypi.org/project/vbench/) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Leaderboard-blue)](https://huggingface.co/spaces/Vchitect/VBench_Leaderboard) [![Video](https://img.shields.io/badge/YouTube-Video-c4302b?logo=youtube&logoColor=red)](https://www.youtube.com/watch?v=7IhCC8Qqn8Y) [![Visitor](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FVchitect%2FVBench&count_bg=%23FFA500&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com) @@ -18,18 +19,25 @@ This repository contains the implementation of the following paper: We propose **VBench**, a comprehensive benchmark suite for video generative models. We design a comprehensive and hierarchical Evaluation Dimension Suite to decompose "video generation quality" into multiple well-defined dimensions to facilitate fine-grained and objective evaluation. For each dimension and each content category, we carefully design a Prompt Suite as test cases, and sample Generated Videos from a set of video generation models. For each evaluation dimension, we specifically design an Evaluation Method Suite, which uses carefully crafted method or designated pipeline for automatic objective evaluation. We also conduct Human Preference Annotation for the generated videos for each dimension, and show that VBench evaluation results are well aligned with human perceptions. VBench can provide valuable insights from multiple perspectives. ## :fire: Updates -- [12/2023] Evaluation code for released for 16 Text-to-Video (T2V) evaluation dimensions. +- [01/2024] PyPI pacakge is released! [![PyPI](https://img.shields.io/pypi/v/vbench)](https://pypi.org/project/vbench/) +- [12/2023] Evaluation code released for 16 Text-to-Video (T2V) evaluation dimensions. - `['subject_consistency', 'background_consistency', 'temporal_flickering', 'motion_smoothness', 'dynamic_degree', 'aesthetic_quality', 'imaging_quality', 'object_class', 'multiple_objects', 'human_action', 'color', 'spatial_relationship', 'scene', 'temporal_style', 'appearance_style', 'overall_consistency']` - [11/2023] Prompt Suites released. 
(See prompt lists [here](https://github.com/Vchitect/VBench/tree/master/prompts)) ## :hammer: Installation -#### Install with pip +### Install with pip +``` +pip install vbench +``` + +To evaluate some video generation ability aspects, you need to install [detectron2](https://github.com/facebookresearch/detectron2) via: ``` pip install detectron2@git+https://github.com/facebookresearch/detectron2.git - pip install git+https://github.com/Vchitect/VBench.git ``` + +If there is an error during [detectron2](https://github.com/facebookresearch/detectron2) installation, see [here](https://detectron2.readthedocs.io/en/latest/tutorials/install.html). -#### Install with git clone +### Install with git clone git clone https://github.com/Vchitect/VBench.git pip install -r VBench/requirements.txt pip install VBench diff --git a/prompts/README.md b/prompts/README.md index 19ac0250..18110060 100755 --- a/prompts/README.md +++ b/prompts/README.md @@ -44,9 +44,9 @@ We specify how to sample from `Prompts per Category` for VBench evaluation, and for index in range(5): # perform sampling - video_ = sample_per_video(dimension, prompt, idx) + video = sample_func(prompt, index) cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4' - torchvision.io.write_video(cur_save_path, video_, fps=8) + torchvision.io.write_video(cur_save_path, video, fps=8) ``` ### Further Explanations @@ -88,8 +88,8 @@ To sample videos for VBench evaluation: for index in range(5): # perform sampling - video_ = sample_per_video(dimension, prompt, idx) + video = sample_func(prompt, index) cur_save_path = f'{args.save_path}/{prompt}-{index}.mp4' - torchvision.io.write_video(cur_save_path, video_, fps=8) + torchvision.io.write_video(cur_save_path, video, fps=8) ``` From 91363e849f93c84f7b4d0fa14b4be73653d54465 Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Fri, 19 Jan 2024 17:31:46 +0800 Subject: [PATCH 101/248] [fix]: (1) add warning for video not found (2) remove irrelevant dimension's info --- vbench/__init__.py | 31 +++++++++++++++++++++++++------ 1 file changed, 25 insertions(+), 6 deletions(-) diff --git a/vbench/__init__.py b/vbench/__init__.py index 342f4121..9b5a8713 100755 --- a/vbench/__init__.py +++ b/vbench/__init__.py @@ -14,25 +14,43 @@ def __init__(self, device, full_info_dir, output_path): def build_full_dimension_list(self, ): return ["subject_consistency", "background_consistency", "aesthetic_quality", "imaging_quality", "object_class", "multiple_objects", "color", "spatial_relationship", "scene", "temporal_style", 'overall_consistency', "human_action", "temporal_flickering", "motion_smoothness", "dynamic_degree", "appearance_style"] - def build_full_info_json(self, videos_path, name, dimension_list, special_str=''): - cur_full_info_list = load_json(self.full_info_dir) + def build_full_info_json(self, videos_path, name, dimension_list, special_str='', verbose=False): + full_info_list = load_json(self.full_info_dir) + cur_full_info_list=[] # to save the prompt and video path info for the current dimensions video_names = os.listdir(videos_path) postfix = '.'+ video_names[0].split('.')[-1] - for prompt_dict in cur_full_info_list: - prompt = prompt_dict['prompt_en'] - if f'{prompt}{special_str}-0{postfix}' in video_names: - prompt_dict['video_list'] = [os.path.join(videos_path, f'{prompt}{special_str}-{str(i)}{postfix}') for i in range(5)] + + for prompt_dict in full_info_list: + + # if the prompt belongs to any dimension we want to evaluate + if set(dimension_list) & set(prompt_dict["dimension"]): + prompt = 
prompt_dict['prompt_en']
+                prompt_dict['video_list'] = []
+                for i in range(5): # video index for the same prompt
+                    intended_video_name = f'{prompt}{special_str}-{str(i)}{postfix}'
+                    if intended_video_name in video_names: # if the video exists
+                        intended_video_path = os.path.join(videos_path, intended_video_name)
+                        prompt_dict['video_list'].append(intended_video_path)
+                        if verbose:
+                            print(f'Successfully found video: {intended_video_name}')
+                    else:
+                        print(f'WARNING!!! This required video is not found! Missing benchmark videos can lead to unfair evaluation result. The missing video is: {intended_video_name}')
+                cur_full_info_list.append(prompt_dict)
+
         cur_full_info_path = os.path.join(self.output_path, name+'_full_info.json')
         save_json(cur_full_info_list, cur_full_info_path)
         print(f'Evaluation meta data saved to {cur_full_info_path}')
         return cur_full_info_path
+
     def evaluate(self, videos_path, name, dimension_list=None, local=False, read_frame=False):
         results_dict = {}
         if dimension_list is None:
             dimension_list = self.build_full_dimension_list()
         submodules_dict = init_submodules(dimension_list, local=local, read_frame=read_frame)
+        # print('BEFORE BUILDING')
         cur_full_info_path = self.build_full_info_json(videos_path, name, dimension_list)
+        # print('AFTER BUILDING')
         for dimension in dimension_list:
             try:
                 dimension_module = importlib.import_module(f'vbench.{dimension}')
@@ -40,6 +58,7 @@ def evaluate(self, videos_path, name, dimension_list=None, local=False, read_fra
             except Exception as e:
                 raise NotImplementedError(f'UnImplemented dimension {dimension}!')
             submodules_list = submodules_dict[dimension]
+            print(f'cur_full_info_path: {cur_full_info_path}') # TODO: to delete
             results = evaluate_func(cur_full_info_path, self.device, submodules_list)
             results_dict[dimension] = results
         output_name = os.path.join(self.output_path, name+'_eval_results.json')

From 91bece871c27d8a62a4b1be753caf2a3e2706ca1 Mon Sep 17 00:00:00 2001
From: NattapolChan <66078943+NattapolChan@users.noreply.github.com>
Date: Tue, 23 Jan 2024 22:29:00 +0800
Subject: [PATCH 102/248] [update]: change --dimension to receive list inputs
 fix decord.set_bridge conflict when running read_frames_decord_by_fps and
 load_video (.mp4) change relative import in third_party/amt/networks/* to
 absolute

---
 vbench/cli/evaluate.py                            |  9 +++++----
 vbench/third_party/amt/networks/AMT-G.py          |  8 ++++----
 vbench/third_party/amt/networks/AMT-L.py          | 13 ++++++-------
 vbench/third_party/amt/networks/AMT-S.py          | 12 ++++++------
 vbench/third_party/amt/networks/IFRNet.py         |  4 ++--
 vbench/third_party/amt/networks/blocks/ifrnet.py  |  4 ++--
 .../third_party/amt/networks/blocks/multi_flow.py |  6 +++---
 vbench/utils.py                                   |  2 ++
 8 files changed, 30 insertions(+), 28 deletions(-)

diff --git a/vbench/cli/evaluate.py b/vbench/cli/evaluate.py
index 7c73a4e9..81ef97dc 100644
--- a/vbench/cli/evaluate.py
+++ b/vbench/cli/evaluate.py
@@ -23,9 +23,9 @@ def register_subparsers(subparser):
     )
     parser.add_argument(
         "--dimension",
-        type=str,
+        nargs='+',
         required=True,
-        help="evaluation dimensions",
+        help="list of evaluation dimensions, usage: --dimension <dim_1> <dim_2>",
     )
     parser.add_argument(
         "--load_ckpt_from_local",
@@ -48,10 +48,11 @@ def evaluate(args):
     my_VBench = VBench(device, args.full_json_dir, args.output_path)
     print(f'start evaluation')
+
     my_VBench.evaluate(
         videos_path = args.videos_path,
-        name = args.dimension,
-        dimension_list = [args.dimension],
+        name = str(args.dimension),
+        dimension_list = args.dimension,
         local=args.load_ckpt_from_local,
         read_frame=args.read_frame,
     )
diff --git 
a/vbench/third_party/amt/networks/AMT-G.py b/vbench/third_party/amt/networks/AMT-G.py index a24cb1a3..332ec760 100755 --- a/vbench/third_party/amt/networks/AMT-G.py +++ b/vbench/third_party/amt/networks/AMT-G.py @@ -1,20 +1,20 @@ import torch import torch.nn as nn import torch.nn.functional as F -from networks.blocks.raft import ( +from vbench.third_party.amt.networks.blocks.raft import ( coords_grid, BasicUpdateBlock, BidirCorrBlock ) -from networks.blocks.feat_enc import ( +from vbench.third_party.amt.networks.blocks.feat_enc import ( LargeEncoder ) -from networks.blocks.ifrnet import ( +from vbench.third_party.amt.networks.blocks.ifrnet import ( resize, Encoder, InitDecoder, IntermediateDecoder ) -from networks.blocks.multi_flow import ( +from vbench.third_party.amt.networks.blocks.multi_flow import ( multi_flow_combine, MultiFlowDecoder ) diff --git a/vbench/third_party/amt/networks/AMT-L.py b/vbench/third_party/amt/networks/AMT-L.py index f992688b..551fac52 100755 --- a/vbench/third_party/amt/networks/AMT-L.py +++ b/vbench/third_party/amt/networks/AMT-L.py @@ -1,24 +1,23 @@ import torch import torch.nn as nn -from networks.blocks.raft import ( +from vbench.third_party.amt.networks.blocks.raft import ( coords_grid, BasicUpdateBlock, BidirCorrBlock ) -from networks.blocks.feat_enc import ( - BasicEncoder +from vbench.third_party.amt.networks.blocks.feat_enc import ( + BasicEncoder, ) -from networks.blocks.ifrnet import ( +from vbench.third_party.amt.networks.blocks.ifrnet import ( resize, Encoder, InitDecoder, IntermediateDecoder ) -from networks.blocks.multi_flow import ( +from vbench.third_party.amt.networks.blocks.multi_flow import ( multi_flow_combine, MultiFlowDecoder ) - class Model(nn.Module): def __init__(self, corr_radius=3, @@ -152,4 +151,4 @@ def forward(self, img0, img1, embt, scale_factor=1.0, eval=False, **kwargs): 'flow1_pred': [up_flow1_1, up_flow1_2, up_flow1_3, up_flow1_4], 'ft_pred': [ft_1_, ft_2_, ft_3_], } - \ No newline at end of file + diff --git a/vbench/third_party/amt/networks/AMT-S.py b/vbench/third_party/amt/networks/AMT-S.py index a7155bb6..e025a36a 100755 --- a/vbench/third_party/amt/networks/AMT-S.py +++ b/vbench/third_party/amt/networks/AMT-S.py @@ -1,24 +1,24 @@ import torch import torch.nn as nn -from networks.blocks.raft import ( +from vbench.third_party.amt.networks.blocks.raft import ( + SmallUpdateBlock, coords_grid, - SmallUpdateBlock, BidirCorrBlock + BidirCorrBlock ) -from networks.blocks.feat_enc import ( +from vbench.third_party.amt.networks.blocks.feat_enc import ( SmallEncoder ) -from networks.blocks.ifrnet import ( +from vbench.third_party.amt.networks.blocks.ifrnet import ( resize, Encoder, InitDecoder, IntermediateDecoder ) -from networks.blocks.multi_flow import ( +from vbench.third_party.amt.networks.blocks.multi_flow import ( multi_flow_combine, MultiFlowDecoder ) - class Model(nn.Module): def __init__(self, corr_radius=3, diff --git a/vbench/third_party/amt/networks/IFRNet.py b/vbench/third_party/amt/networks/IFRNet.py index a5eb61cb..6c87a8b4 100755 --- a/vbench/third_party/amt/networks/IFRNet.py +++ b/vbench/third_party/amt/networks/IFRNet.py @@ -1,7 +1,7 @@ import torch import torch.nn as nn -from utils.flow_utils import warp -from networks.blocks.ifrnet import ( +from vbench.third_party.amt.utils.flow_utils import warp +from vbench.third_party.amt.networks.blocks.ifrnet import ( convrelu, resize, ResBlock, ) diff --git a/vbench/third_party/amt/networks/blocks/ifrnet.py b/vbench/third_party/amt/networks/blocks/ifrnet.py index 
586ae610..a28b3fdc 100755 --- a/vbench/third_party/amt/networks/blocks/ifrnet.py +++ b/vbench/third_party/amt/networks/blocks/ifrnet.py @@ -1,7 +1,7 @@ import torch import torch.nn as nn import torch.nn.functional as F -from utils.flow_utils import warp +from vbench.third_party.amt.utils.flow_utils import warp def resize(x, scale_factor): @@ -108,4 +108,4 @@ def forward(self, ft_, f0, f1, flow0_in, flow1_in): ft_ = out[:, 4:, ...] flow0 = flow0 + 2.0 * resize(flow0_in, scale_factor=2.0) flow1 = flow1 + 2.0 * resize(flow1_in, scale_factor=2.0) - return flow0, flow1, ft_ \ No newline at end of file + return flow0, flow1, ft_ diff --git a/vbench/third_party/amt/networks/blocks/multi_flow.py b/vbench/third_party/amt/networks/blocks/multi_flow.py index 4563c326..53ad50ed 100755 --- a/vbench/third_party/amt/networks/blocks/multi_flow.py +++ b/vbench/third_party/amt/networks/blocks/multi_flow.py @@ -1,7 +1,7 @@ import torch import torch.nn as nn -from utils.flow_utils import warp -from networks.blocks.ifrnet import ( +from vbench.third_party.amt.utils.flow_utils import warp +from vbench.third_party.amt.networks.blocks.ifrnet import ( convrelu, resize, ResBlock, ) @@ -66,4 +66,4 @@ def forward(self, ft_, f0, f1, flow0, flow1): flow1 = delta_flow1 + 2.0 * resize(flow1, scale_factor=2.0 ).repeat(1, self.num_flows, 1, 1) - return flow0, flow1, mask, img_res \ No newline at end of file + return flow0, flow1, mask, img_res diff --git a/vbench/utils.py b/vbench/utils.py index b9298b89..d873224e 100755 --- a/vbench/utils.py +++ b/vbench/utils.py @@ -138,6 +138,8 @@ def load_video(video_path, data_transform=None, num_frames=None, return_tensor=T frame_ls = [frame] buffer = np.array(frame_ls) elif video_path.endswith('.mp4'): + import decord + decord.bridge.set_bridge('native') if width: video_reader = VideoReader(video_path, width=width, height=height, num_threads=1) else: From 8046fcbfa1b853d6121956b9f48021de5f1fecb0 Mon Sep 17 00:00:00 2001 From: Yinan He <43169235+yinanhe@users.noreply.github.com> Date: Tue, 30 Jan 2024 13:16:57 +0800 Subject: [PATCH 103/248] Create model_path.txt --- pretrained/umt_model/model_path.txt | 1 + 1 file changed, 1 insertion(+) create mode 100644 pretrained/umt_model/model_path.txt diff --git a/pretrained/umt_model/model_path.txt b/pretrained/umt_model/model_path.txt new file mode 100644 index 00000000..c610ef8c --- /dev/null +++ b/pretrained/umt_model/model_path.txt @@ -0,0 +1 @@ +wget https://pjlab-gvm-data.oss-cn-shanghai.aliyuncs.com/umt/single_modality/l16_ptk710_ftk710_ftk400_f16_res224.pth -P ~/.cache/vbench/umt_model/ From 85dd5d73a6f7b91d826ad5701d1c7ee8022e5b3e Mon Sep 17 00:00:00 2001 From: ziqihuangg Date: Thu, 1 Feb 2024 16:29:45 +0800 Subject: [PATCH 104/248] [update] README.md --- README.md | 2 + vbench_env.yml | 309 ------------------------------------------------- 2 files changed, 2 insertions(+), 309 deletions(-) delete mode 100644 vbench_env.yml diff --git a/README.md b/README.md index f1fecb8c..7edf6c2b 100755 --- a/README.md +++ b/README.md @@ -37,6 +37,8 @@ To evaluate some video generation ability aspects, you need to install [detectro If there is an error during [detectron2](https://github.com/facebookresearch/detectron2) installation, see [here](https://detectron2.readthedocs.io/en/latest/tutorials/install.html). +Download [VBench_full_info.json](https://raw.githubusercontent.com/Vchitect/VBench/master/VBench_full_info.json) to your running directory to read the benchmark prompt suites. 
+ ### Install with git clone git clone https://github.com/Vchitect/VBench.git pip install -r VBench/requirements.txt diff --git a/vbench_env.yml b/vbench_env.yml deleted file mode 100644 index 2b350b9c..00000000 --- a/vbench_env.yml +++ /dev/null @@ -1,309 +0,0 @@ -name: vbench -channels: - - defaults -dependencies: - - _libgcc_mutex=0.1=main - - _openmp_mutex=5.1=1_gnu - - backcall=0.2.0=pyhd3eb1b0_0 - - ca-certificates=2023.08.22=h06a4308_0 - - comm=0.1.2=py38h06a4308_0 - - debugpy=1.6.7=py38h6a678d5_0 - - decorator=5.1.1=pyhd3eb1b0_0 - - importlib_metadata=6.0.0=hd3eb1b0_0 - - ld_impl_linux-64=2.38=h1181459_1 - - libffi=3.4.4=h6a678d5_0 - - libgcc-ng=11.2.0=h1234567_1 - - libgomp=11.2.0=h1234567_1 - - libsodium=1.0.18=h7b6447c_0 - - libstdcxx-ng=11.2.0=h1234567_1 - - matplotlib-inline=0.1.6=py38h06a4308_0 - - ncurses=6.4=h6a678d5_0 - - nest-asyncio=1.5.6=py38h06a4308_0 - - openssl=3.0.12=h7f8727e_0 - - parso=0.8.3=pyhd3eb1b0_0 - - pexpect=4.8.0=pyhd3eb1b0_3 - - pickleshare=0.7.5=pyhd3eb1b0_1003 - - platformdirs=3.10.0=py38h06a4308_0 - - psutil=5.9.0=py38h5eee18b_0 - - ptyprocess=0.7.0=pyhd3eb1b0_2 - - pure_eval=0.2.2=pyhd3eb1b0_0 - - python=3.8.17=h955ad1f_0 - - python-dateutil=2.8.2=pyhd3eb1b0_0 - - pyzmq=25.1.0=py38h6a678d5_0 - - readline=8.2=h5eee18b_0 - - six=1.16.0=pyhd3eb1b0_1 - - sqlite=3.41.2=h5eee18b_0 - - stack_data=0.2.0=pyhd3eb1b0_0 - - tk=8.6.12=h1ccaba5_0 - - typing_extensions=4.7.1=py38h06a4308_0 - - wheel=0.38.4=py38h06a4308_0 - - xz=5.4.2=h5eee18b_0 - - zeromq=4.3.4=h2531618_0 - - zlib=1.2.13=h5eee18b_0 - - pip: - - absl-py==1.4.0 - - accelerate==0.22.0 - - addict==2.4.0 - - aiofiles==23.2.1 - - aiohttp==3.8.5 - - aiosignal==1.3.1 - - altair==5.0.1 - - annotated-types==0.5.0 - - antlr4-python3-runtime==4.9.3 - - anyio==3.7.1 - - appdirs==1.4.4 - - asttokens==2.2.1 - - async-timeout==4.0.3 - - attrs==23.1.0 - - av==10.0.0 - - backports-zoneinfo==0.2.1 - - bardapi==0.1.33 - - beautifulsoup4==4.12.2 - - black==23.7.0 - - bleach==6.0.0 - - blinker==1.6.2 - - blis==0.7.10 - - boto3==1.28.31 - - botocore==1.31.31 - - braceexpand==0.1.7 - - browser-cookie3==0.19.1 - - cachetools==5.3.1 - - catalogue==2.0.9 - - certifi==2023.7.22 - - cfgv==3.4.0 - - charset-normalizer==3.2.0 - - click==8.1.7 - - clip==1.0 - - cloudpickle==2.2.1 - - colorama==0.4.6 - - coloredlogs==15.0.1 - - confection==0.1.1 - - contexttimer==0.3.3 - - contourpy==1.1.0 - - cycler==0.11.0 - - cymem==2.0.7 - - cython==3.0.2 - - decord==0.6.0 - - deep-translator==1.11.4 - - detectron2==0.6 - - distlib==0.3.7 - - distro==1.8.0 - - docker-pycreds==0.4.0 - - einops==0.6.1 - - en-core-web-sm==3.6.0 - - environs==9.5.0 - - exceptiongroup==1.1.3 - - executing==1.2.0 - - fairscale==0.4.4 - - fastapi==0.103.0 - - ffmpeg==1.4 - - ffmpeg-python==0.2.0 - - ffmpy==0.3.1 - - ffprobe==0.5 - - filelock==3.12.2 - - fonttools==4.42.1 - - frozenlist==1.4.0 - - fschat==0.2.26 - - fsspec==2023.6.0 - - ftfy==6.1.1 - - future==0.18.3 - - fvcore==0.1.5.post20221221 - - gitdb==4.0.10 - - gitpython==3.1.32 - - google-api-core==2.11.1 - - google-auth==2.22.0 - - google-auth-oauthlib==1.0.0 - - google-cloud-core==2.3.3 - - google-cloud-translate==3.12.0 - - googleapis-common-protos==1.60.0 - - gradio==3.41.2 - - gradio-client==0.5.0 - - grpcio==1.58.0 - - grpcio-status==1.58.0 - - h11==0.14.0 - - h2==4.1.0 - - h5py==3.9.0 - - hpack==4.0.0 - - httpcore==0.17.3 - - httpx==0.24.1 - - huggingface-hub==0.16.4 - - humanfriendly==10.0 - - humanize==4.8.0 - - hydra-core==1.3.2 - - hyperframe==6.0.1 - - identify==2.5.27 - - idna==3.4 - - 
imageio==2.31.1 - - imgaug==0.4.0 - - importlib-metadata==6.8.0 - - importlib-resources==6.0.1 - - iopath==0.1.9 - - ipdb==0.13.13 - - ipython==8.13.0 - - jedi==0.19.0 - - jeepney==0.8.0 - - jinja2==3.1.2 - - jmespath==1.0.1 - - joblib==1.3.2 - - jsonline==0.2.1 - - jsonlines==4.0.0 - - jsonschema==4.19.0 - - jsonschema-specifications==2023.7.1 - - kaggle==1.5.16 - - kiwisolver==1.4.4 - - langcodes==3.3.0 - - langdetect==1.0.9 - - lazy-loader==0.3 - - lightning-utilities==0.9.0 - - lmdb==1.4.1 - - lvis==0.5.3 - - lxml==4.9.3 - - lz4==4.3.2 - - markdown==3.4.4 - - markdown-it-py==3.0.0 - - markdown2==2.4.10 - - markupsafe==2.1.3 - - marshmallow==3.20.1 - - matplotlib==3.7.2 - - mdurl==0.1.2 - - mmcv-full==1.2.7 - - multidict==6.0.4 - - multiprocessing-logging==0.3.4 - - multiscaledeformableattention==1.0 - - murmurhash==1.0.9 - - mypy-extensions==1.0.0 - - networkx==3.1 - - nh3==0.2.14 - - nodeenv==1.8.0 - - numpy==1.24.4 - - oauthlib==3.2.2 - - omegaconf==2.3.0 - - openai==1.3.3 - - openai-clip==1.0.1 - - opencv-contrib-python==4.8.0.76 - - opencv-python==4.8.0.76 - - opencv-python-headless==4.5.5.64 - - opendatasets==0.1.22 - - orjson==3.9.5 - - packaging==23.1 - - pandas==2.0.3 - - pathspec==0.11.2 - - pathy==0.10.2 - - peft==0.5.0 - - pillow==9.5.0 - - pkgutil-resolve-name==1.3.10 - - plotly==5.16.1 - - portalocker==2.7.0 - - pre-commit==3.3.3 - - preshed==3.0.8 - - prompt-toolkit==3.0.39 - - proto-plus==1.22.3 - - protobuf==4.24.1 - - py-cpuinfo==9.0.0 - - pyarrow==12.0.1 - - pyasn1==0.5.0 - - pyasn1-modules==0.3.0 - - pycocoevalcap==1.2 - - pycocotools==2.0.7 - - pycryptodomex==3.18.0 - - pydantic==1.10.12 - - pydantic-core==2.6.1 - - pydeck==0.8.0 - - pydeprecate==0.3.1 - - pydub==0.25.1 - - pygments==2.16.1 - - pyiqa==0.1.8 - - pympler==1.0.1 - - pyparsing==3.0.9 - - pysubs2==1.6.1 - - python-dotenv==1.0.0 - - python-magic==0.4.27 - - python-multipart==0.0.6 - - python-slugify==8.0.1 - - pytorch-lightning==1.5.10 - - pytz==2023.3 - - pytz-deprecation-shim==0.1.0.post0 - - pywavelets==1.4.1 - - pyyaml==6.0.1 - - referencing==0.30.2 - - regex==2023.8.8 - - requests==2.31.0 - - requests-oauthlib==1.3.1 - - requests-toolbelt==1.0.0 - - rich==13.5.2 - - rpds-py==0.9.2 - - rsa==4.9 - - s3transfer==0.6.2 - - sacremoses==0.0.53 - - safetensors==0.3.2 - - scenedetect==0.6.2 - - scikit-image==0.21.0 - - scikit-learn==1.3.2 - - scipy==1.10.1 - - seaborn==0.12.2 - - semantic-version==2.10.0 - - sentencepiece==0.1.99 - - sentry-sdk==1.35.0 - - setproctitle==1.3.3 - - setuptools==59.1.1 - - shapely==2.0.1 - - shortuuid==1.0.11 - - smart-open==6.3.0 - - smmap==5.0.0 - - sniffio==1.3.0 - - soupsieve==2.5 - - spacy==3.6.1 - - spacy-legacy==3.0.12 - - spacy-loggers==1.0.4 - - srsly==2.4.7 - - stack-data==0.6.2 - - starlette==0.27.0 - - submitit==1.4.6 - - svgwrite==1.4.3 - - tabulate==0.9.0 - - tb-nightly==2.14.0a20230808 - - tenacity==8.2.3 - - tensorboard==2.14.0 - - tensorboard-data-server==0.7.1 - - tensorboardx==2.6.2.2 - - termcolor==2.3.0 - - terminaltables==3.1.10 - - text-unidecode==1.3 - - thinc==8.1.12 - - threadpoolctl==3.2.0 - - tifffile==2023.7.10 - - tiktoken==0.4.0 - - timm==0.9.12 - - tokenizers==0.13.3 - - toml==0.10.2 - - tomli==2.0.1 - - toolz==0.12.0 - - torch==1.13.1+cu117 - - torchaudio==0.13.1+cu117 - - torchmetrics==1.0.3 - - torchvision==0.14.1+cu117 - - tornado==6.3.3 - - tqdm==4.66.1 - - traitlets==5.9.0 - - transformers==4.33.2 - - transformers-stream-generator==0.0.4 - - typer==0.9.0 - - tzdata==2023.3 - - tzlocal==4.3.1 - - ultralytics==8.0.178 - - urllib3==1.26.16 - - 
uvicorn==0.23.2
-    - validators==0.21.2
-    - virtualenv==20.24.3
-    - wasabi==1.1.2
-    - watchdog==3.0.0
-    - wavedrom==2.0.3.post3
-    - wcwidth==0.2.6
-    - webdataset==0.2.48
-    - webencodings==0.5.1
-    - websockets==11.0.3
-    - werkzeug==2.3.7
-    - yacs==0.1.8
-    - yapf==0.40.1
-    - yarl==1.9.2
-    - zipp==3.16.2

From a9f29cea33644f3cf6f3a2fe5faab54d0499f522 Mon Sep 17 00:00:00 2001
From: Zhang Fan
Date: Sun, 4 Feb 2024 02:29:30 -0600
Subject: [PATCH 105/248] [add] radar charts

---
 asset/radar-close.jpg | Bin 0 -> 3733979 bytes
 asset/radar-open.jpg  | Bin 0 -> 3954324 bytes
 2 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 asset/radar-close.jpg
 create mode 100644 asset/radar-open.jpg

diff --git a/asset/radar-close.jpg b/asset/radar-close.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..256f2c7149ecc6d64842e33136b30a2d44908a2b
GIT binary patch
literal 3733979
[Binary patch data omitted: git base85 image payload for asset/radar-close.jpg, not human-readable.]
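
[Editor's note] PATCH 100 also fixes the sampling snippet in `prompts/README.md` (`sample_func(prompt, index)` replaces the stale `sample_per_video(dimension, prompt, idx)`). For context, here is a self-contained version of that loop; the stub sampler and the single hard-coded prompt are assumptions for illustration, since the real `sample_func` is whatever text-to-video model is being benchmarked:

```python
# Self-contained version of the sampling loop documented in prompts/README.md.
# The stub sampler below is an assumption for illustration only.
import os
import torch
import torchvision

def sample_func(prompt: str, index: int) -> torch.Tensor:
    # Stand-in for a real text-to-video sampler; returns (T, H, W, C) uint8 frames.
    torch.manual_seed(index)
    return torch.randint(0, 256, (16, 224, 224, 3), dtype=torch.uint8)

save_path = "./sampled_videos"  # assumed output directory
os.makedirs(save_path, exist_ok=True)
prompt_list = ["a panda cooking in the kitchen"]  # normally read from the prompt suite files

for prompt in prompt_list:
    for index in range(5):  # five samples per prompt, as VBench expects
        video = sample_func(prompt, index)
        cur_save_path = f'{save_path}/{prompt}-{index}.mp4'
        torchvision.io.write_video(cur_save_path, video, fps=8)
```

The `{prompt}-{index}.mp4` naming matters: PATCH 101's `build_full_info_json` looks up videos by exactly this pattern and prints a warning for any file it cannot find.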
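[Editor's note] Patches 098-102 together change how an evaluation run is wired up: PATCH 101 rebuilds the per-dimension meta file and warns about missing videos, and PATCH 102 turns `--dimension` into a list argument. A minimal sketch of a programmatic run after these patches; the paths, run name, and dimension names below are illustrative placeholders, not part of the patch series:

```python
# Sketch of driving the evaluator as modified by PATCH 101/102.
# "./sampled_videos" and "./evaluation_results" are assumed example paths.
import torch
from vbench import VBench

device = torch.device("cuda")
my_VBench = VBench(device, "VBench_full_info.json", "./evaluation_results")

# Passing several dimensions at once mirrors the CLI's new
# `--dimension subject_consistency scene` (nargs='+', PATCH 102).
my_VBench.evaluate(
    videos_path="./sampled_videos",
    name="demo_run",
    dimension_list=["subject_consistency", "scene"],
)
```

The equivalent command-line call after PATCH 102 would presumably be `vbench evaluate --videos_path ./sampled_videos --dimension subject_consistency scene`, with the flag names inferred from the `args.*` attributes visible in the diff.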
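[Editor's note] The two-line `vbench/utils.py` change in PATCH 102 pins decord's bridge before opening `.mp4` files. `decord.bridge.set_bridge` is process-global, so if another code path (here, `read_frames_decord_by_fps`) has switched the bridge to `'torch'`, `VideoReader.get_batch` starts returning torch tensors everywhere and later `.asnumpy()` calls break. A small illustration of the conflict and the fix; `example.mp4` is a placeholder for any local video file:

```python
# Illustration of the global-bridge conflict addressed in PATCH 102.
import decord

decord.bridge.set_bridge('torch')    # e.g. set earlier by read_frames_decord_by_fps

def load_video_frames(path):
    # The fix: reset the bridge locally so frames come back as decord NDArrays,
    # regardless of what other callers configured.
    decord.bridge.set_bridge('native')
    vr = decord.VideoReader(path, num_threads=1)
    frames = vr.get_batch(range(len(vr)))
    return frames.asnumpy()          # a torch.Tensor here would raise AttributeError

print(load_video_frames('example.mp4').shape)
```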
z54!LK5b+jE-=hk^`){Z(v-G$MaL_%6MqDRRUC>|P+rJDPehJ!t#n1n*+5_A_dw${A zzwqjR4_^IO>-cZOv1=Z)pXZ0PMe~UcH3WP|nR1%#bTkq3?Y#4dxgkQOFUVf=zgu?= z+jX=`>gJCuhQzW|9zp1PqCRWyb{52i?e<9w$RNH;6;c4bjpkQ4WdfH7Cx>2v+LAeD zvy#Zp=sZA2GX{Jy&!;Rp_J`SCvQ3)S73CkcO(lW*`ap%X)G`v3=H*f@%6v7R?>a9mox6{F@mwQ{t4iNnpgs7!U8UC{MasyS_uLE0$q~yQb=8au z?|5#t@*GlRkdkV9Pb*yKRVQGCYb52WU)%Qg4`83G-QijV{K6?HhEDke-J9--Yq8JQlU*g1z&BSG~M>)R^ysdTT22m-5O*_ zvK1X`iy4+6=dUzk{Lubh3O?F62^jTUzG-jImCU>ARU0>LpcuNh6k{>Ztk9NlAUzU4IGIgk~aTR#hUrU>>z#0CpV|%W`$g z9?0zpj;_APH(;S95ZVsp^p}yc&;ntE5)I=sCi~&N;z=1YwZW+lBDu#=puF& zN8JsEDp56~Y=Tihszhs2n%;N=+FG~TRpPW2U} zW^@8Jy8Gk#C{BxWEGq*On6~yNc3^&&wxRbRA2p$P6bz0@<*H_ZD&5GsUGWNlw@9AQ5M&oaV>^+rucO!P< z+-Md_CRHP>xRP6CXrf-zAHQq}uz9KJih`$E8S-)lkKtjdLKj}g8(bClQ2W~-j2uUi z2#{21=8Xc{XGAZ~Mv;Q#@X7d;GD=f6%?Pce3f0EO;Q8Igj!9i0+_&9i`+;)n+L6*Z z?G3(CFpnC69TeLH?K^fr7MaJCWv8fpup-n%F%dud;cOmU^ul`Jfj^W1;k(zc1ZPxH zvJlOTq0+>vr~~b7JMRa}v>4G#eP(M>X@CF1X@ZSRI_Ny4$1QQERvjGy6b;2nX(?0@ z9`q$$cuN*3AH|>6_$I9Q)VF?>6)gAIR{W$FP z1n!97Q%VrpqRcEZ`QgU6%hK_$p3d`qMdvdZ2@}eGgOs=hp1$|Bxwz27e(2 zv$&xgI4m*npWZ`nU30KiHJZ@p?L;;}lTbWgIE6KDg1}M0|1YiQ73rVHvh zHe>>LvgyglJbp}{1P6#YE}`i;XS&SL$;pk)P4EoZyqrG`|;!-uzM6|!0&@B&4-^E6C)EW;NdvL~@5 zNy3jDL0m znqA~$iLf(}u2K4(xImn<5M6Q}F(^utWS#}@cN~NvECqVQ1##nd0G1aBE$_D0{-0bbV1Vr)XJfR>@Uc0E%&h!P^N=}1vsQS^magsWd&OG;Z zoTbq{NPUQ3@}*!*4Lza_xquzYk|}Bh%b{>QrF7O1UbSWzI|%T-ZhrsbiM4%bGxEWB z64f#C;C4#yP;kL8cv|qEm-}DD3ZsY|pDe%_Uo|Rm_V_dwr2c1C@e(5P;`0Ep>|Hy) z9fqdVSzkEq5^5cmGYQYghXy%vxGY?_@bdsgzu+okaDyCV?Vm|25-mG@e{(= z{o2{Ihjt+2VmeCKXMW1h)>UaTs$DK1&v-txBQRh21ciHveBiwW4 zNSuJW>&9Eo9l*@`*lMcKOT}epstKj>IickULHXe6GcTo`my)WF*GkJr>W3!|7F=-P zX3VnVs?OaUGv*nz{{C&RS%v3kYP3LLC@KLW8!1a$w2*6;EAJrsMS#w%Km|WD{3^RH zyJ4aM1d?rn-JoLSXt86jj&#-G+rJz#=Nn;c!kSjNNma#wprdXYlj3aHIoYH3-8$5? zE~^kM`6O{QJpJjIV1W!K7{rfXx@+P@9LL4h61=@! z^38|+Ps85c9yM({(2>pkK`DYA!hVe2;auTb|4N!V(UGM7Oaw&DJVg|7UdZKBkaeZP zQN?&_He%HDKB`;4R_LK$Qf!W6|HHzd+!M0^+L|K2we+kYu2*upy#2G?{Vh~cZP@Nz zp4dhVP2pCKKUPTIBIM}ip<5&!%#eCnAQjeSws)a{!q}l`I?d?3X54i5U3ex9ndNQgjqFf9& zZsTI^1ySgm2q(HO8@I;N98i2<>s&-=vRpas)u2Ys*QbUofP`rnI}Ri^hLExB4O^Ge zITB*ExaDoetvG;86)QuNc^BDf{vi3%5)@MiLfV?)#EH|asXrdF=1oQ7AED25Lk_k- zVQuEALf?h{csU9fU*L9BA>Qgnx@I4ECZC1{_3Pj6-&j=vRqs~gS~IpkK|aO z-I-gaypKRzL>y%lj(B2+#1Y;4oGCz3R%Xggz3=xa*l%%l8@Bfuu1CHdQh+7=y|s(j zC(o@0yCfTLXF7LxgS#az;y`hjTnaiaDw$T<@Do;ss&nD_0h- z)2zBenP|FX74Lel&gUo*0P*S7r>H};gYQ5;c*n|()8iW&hI4TZ#vh5^srlo>F)0aL zlH|g6CwH*{FBV>lF<)S;VXO=9#DpW9v5z#&-@cnYlwfRU%RPqexBs~6pcQSY8r+8` zsPw9z`t-pmYQ~$?>!rrUs3Qu+N=sD*@dp|iPNrewb!1ouHWxgck1u}YZpSz02YQY8 znL2GxK{CGq=b_}uFi-w0)3Wn@E%zQ<8y&y%{42&US~>6 zmX0JL?&HR<6#ay`L2;>q6@L#R{}oC8E1vvU?D_wddjR^``wPJOtJ3q=?f)KN>)!`p zy{~-T;@9E`2JiSq^^55wZlN>SXPHFTov7IDcp_b_j%tdjt{7~ITLgInFVrDm}nf*- zA9xrRQNMzn8zH(BR?=LsRx#L{_Yg;0pd*-_EVxoiaL(ZN^&8|fVQ4dhkez8Hqc3?H zd8PAa%j#e*dk(rvGSx;{R5*%9HI}e9v)1)q>-OBV{jhPfBE5heGv82}b=LsQD4kL5 zsJX0wGRxvrp;35qkPa(D9`?9D>79DCjT&u02C*Mphwthr)3C37Y<&+b?99uZ%tY_u zbYMz&AC6L5i<~FRUA3FI2l4LA8|W@e=mE>k?aevv!i`nIF(ot? 
zu@80D+t8BF-H0(mX=RW5;ro)6SzGh0dd%glK8HS!9R&mYuZoKLRZmXAO`R({Youl> zGR8Ju7SZ#=&k2SDK2`xmprqUV#nc5sGy&^Q#fZY|ycasTkFz7QfCq|Sp; zTR9kCZ=v@~HT$K5;+3_VhEKOXn3!we=X=kafgi1TtrB<$Koq!4<(L$)crEojqjJ0E zC!9yP7Xp`Ey6xd46P`eHC~$$5H_(wEC|d3VaweRX^ZY`e6Ujwq&&P2H40bhPoNTm& zZNuG&t?!lmT>?tT`j@S`f>a)gt|w6ge}}jML7B ziK2(U%=3l7ea3R?E5}cBDBMffcHnmke}6EOVq8adPFme0QM9a>@`P&=y`CKt;7h~^ z4)_FL2(jdURv&Rr_|sdGH`S0*^XQh?mh8n7@)O#53=0}BdC$_P5kZe?zAI+Q{2nKD zm)E`1k>RxBD?*ZT0*ZF=$lpon=3>RpYFhVeFDI!+Y2Cie5q+{rN^u4rh1i_cRb`i> z*DClWQ%YMPpu_);i@k#}Ibk1n4b7@RE_{qB$$WY>{+y(=KR2q=_@hw2bFJ;}?qwIA z{#&Y|24wxa8zANFmo~~@5xOoAn9n>EaG{J0f>do3NW=qso=FP6H6 zRZBC9>!*c&zORiErI@rc{EBDAZ(!zgT&GbEy9d*%b=6=qYwTqf5)3cG9F|6A82+a< zx!KOt*SG^XmuRk)<4iqMo^~i72C3t=zrFFkclem*C^8<|i6}Nt?)Q$!43D|Hz2;7; zVEQK@SaF1dI3LiZ_dlJR_750ev{%A6mc-0anbDPA1=nYh^jp*y6k;47IRVr0!yw zsp=Ca7)yy=>3Y4Y=k4aeyrt5bK{j?p6xaRsyPgV4xSgja7_=1sLCc3r5a8vU+KyYm zui%(lpf5YxakE1Z@UQ;~bK&Hm6SYtL?HkM&|5wh*vkVhjCNO{n4`I7KrX!b7OkFU` zR|Tk;?@!n(82Ta#Y8u2@=9&L4^-C2R5TeN*Z3|b&EDa);pQBnZED<o)I@)dmjmV_!8wGITcZk{7B8lO_SVXhhXLdTnwhKVALwWu0vMl=m&5T$#-2p`^DYGZljd$Hm*4jQ^6;HAjoD? z24jtOoEF=+BnzTZIbsgz#9AnIR;4!nW>G6kdpVx8YGRg%&!cI4T0B?@bTZ4AqUJb8 zvXshVd!(;`l)Y_d<$5OLyuAP_){5`L6`t2aMOEyj z`1Ew)y{-?Ir(YKrdQHKw)>4W|FJ zx&gpy=)z%wg=VW`zaeOHw_m$HJn|L+@XuzP@gG$R_Bqa}bQ2k4ZP(wqCZbEGJ5$S_ zoFWI?y06^;#A``3qUM+M3%skq-!EYQ-!hsdi9D%g64o=k6wD-QCt}1kCsnYn79eCm zw44sn_Li|C$`A`+YLFVixu>PfdB+C6```$*f<4 z=w?|oxI!%+ox!cNgO7>*PtLp~&MxTl7jJ-|0T5MKSDL~+27NK@z|GDGvH5T%C>UjQ zNS;$3JK;>rWxvFY=Qxb4tnjWK+t?sz+^7?p4ddJj(q#Ny{1D^1V$7N5nC;cqV1P2o zV@(;y3TMAXjJpCn@Gf%Hl>|mkGI0##!c)>U@KazD*m~TMBZ|H=6i9?!3y1_fE4mlH zfaN??4E^(O`?#4G*g#xAo-~D9nHAy$+mdXzGLHcR0HqFSwim$h)gU9!jh(f}5iKzj zR()P>&E-7G?{=Z1U*tO?5AF@3UqYxj0MTtHHKxv@u)I1f;X+RLZtUEjD)i+y9b=b* zA=HcO?)Knf+Jn#kbw$ibJrc}nCL_Q!Dc0p+`UmSw2wb|!=WR~hE@6(<9ScFcg7j0i zbel8QeY8u4&Howx8q?VoSF@u^p5qJf(~pRLznLTC)`E17Z`qPM^9SX~MWKh-4DHuz z=Z)t~1D_E2june_i)Nu9tb1|mMF6o+H3(BI+Fm8YV# z=ga^fxt}ltnzvGS^SXp;p%2(${_Fr+1jsANVr^6YWIBuBPuXx)XTjASWEYY?H`J$h zyr6fh&)KTb+z_F}6w=_9oy*d1s=R`D)g`%!EM(?ca*ze&jfRt`YW`hf7Uu z$~_svttx1|YA8DU$0h&IxF@o8WNU!$n<=rSe(OR3aDr5^rF;66vV>6)Bc(W!kliRM z%-iyOdr|y0{awU`eW^m_W=n=$U63#l6HLM_z`DA5Plu#obG&s|K5)l?Vp-9adZTfUB?}d`G ziO4}dy}qUb5CoP!+LC7(ph6z13Z95*?1R_Jiq43BrU|z9`RJc5)_Ruz8ubNMQ1P2e zPfLb~d2}bhkWf}!YVyrs7S)C%wUI&4bIo&CrS5B|n!&Z!yXUsiGS8vA&J+xXPoF&R za_Ll&*rdirZ9@t)oJ_KySZobM{NDPC_uM7yC%fyYyZ0 zf%G4iEO#0=`Rsx*MI&^@T_IkyrIH9Ay6a43eA22s`nbbkU7xu4nlNahgo%8$Z7&+s zs-Mi?j+c=`av=H~^a{ExUv?Lenb%=xH%26(Hc{a0l7k>T4ge^W7LvojPRnq1M4ABLY?=-(wVNBV zp*9cT6htorP?!;=j732)XJY?&Y#sv)OCDYCModXgd z8b_8IpoLsc0)p_YM2J3*0aWF0mZ7Xw!G{I~RLT0pZ^7Ay#ywmhduqym=F{1PTCEm7xD=Ke`U0z_! 
zDSpfwf$SmrJ_&EO7chvH;e2yK5Oznho7w8IY)haty@mdUTRe1%^8@|7iIb0h5CmWI zv|0PI=zoOW@WfOtbY9TernURl@_L{o6$4|MNI1U}BPCMyKnwr-8}L)Q(%zIxY7rUv z=+27Gx|v&79`6sTo~6@h>6Ei|Rum=n(A$OgS7c=pG?RzenIK`C7Ca`H;}xGM(rheO zqf-6M__)Dw5Z=<78K~4SnDF-PDfp!;jr!Ua9FOLy0M*x})wy+9yFVrN_yT9Lw(;(F z^*3G8^U<+B_EgcH6=vAAbJT} z{j{#j*Wc9x_MKm~j2=4}Ejzo>mfsW1s&3;K79QblwwyAEtGp1p%L@yUvug9AMdbA3 zFBo3~%+e@08PMdzd*hG!dyLvIr1VhmjY1)eraC3h z%wjG!&strf21a}3&MOkiD~8LWbay70=X~yP}?{{RNkxGb6hcrm9EYjV0lmmB_7?+>EGD5-SpTVoEPNiDs5y(e%v6kf%n!c zA3w(F9#$reI9A3yA_rgB8 zNvvR}^r^SV#mUk+vN1E?Pjso%fY4}tKbhs%Tj>+iS8vFgW$dT96MUABCwbf+zaFHW zT;oqGB3cA!&GM(YP;lpw?TPdTrj<{#5wBTM$Tlg8;mDu~F9A+kDrhPQIA30VMuATr zm5)Xvd*LM)N*Va5iSnyl3e}3^3}C#-_2EMdSL0FyqwFK#zxPl-XlHzJ7YX-DIk6Dy zM1N6tH!|9*c`NH47X!J>Mb;lp%BfOF4`h3D^gNj`&5XciMo=~fX*M+k_F8ELeaHg9vK2p1yr&3pnpX* z#ezz0T-A}ohOUjg!?NPoyqxN%U=rNl6J1a@Io;Vb^QP3Py8EQV6(Ki}7&QV9*e{FT z)H69;zU3@*mSemUUv~L}bTsOF7DQV-g_{V>A@9`x`zIo z{Ms?)*rQuJYRioC1N?2X$aRhHX{r|IXirp3{9bwjB?ggtog(X3+!~STl{y=1$!x@J zL5!xr#pkOWb8YGlI8|;f1SM|-ySS`;$w!Ur&fn>0MJwkD?dbm zs`81~B^I!?3SeDezgxy`YJ*&qp0rV~(m&aP1H`F@9~6=g(rEYUF5Yxen3_Y3#Nycu z`crBIE!_tEk*P_-%4=r{V28@IXRFz0%y*nEWHR=>lL59m`jpw&1{5(Q~FGs!c>IV0bm32X`e|bI9W*Pe;LC~pVK2OREhjvWU$gh!TRy(O!2}HZfg>4_ z`mC!273p^{x%B;m^L4UV2MY#%8BTDO{#2l4J}tTbCU1haP6-Re+CUi}5eo3XHYR@g zxYcvPO-;j5*cI&X!gm=e21mP3R1DnXBMcH>Zi6=YJNX{hO6FT8f4Y`|u_y16XqxVZ z08yriPvFYAupJfxxqck=)qEeZ2GqtgSb_Qa+vd~YQ9pR860Lnb(`8ioE(1c5!DYrk))SB>ur zbf!CYWigeO#ke_F%3Ygr=Qtvq&Qy1DZlXlHzo@>ujTnui0a5hhD{b`g{|rt<=Tzs{6Y4I$aGUgH3&5cLKVhO*nTRM?;%s;Z z$8(}*7Lp)r@nK$y@=8Im`Kl6vbkf+5S2lFj?4Q=ytXg{4hk;z-CckLfGNZw`ncVoX zXy16OCFdM^o6mEW(paR?g+t;EyY4tsE2)>*Y1n{s6!{Y!H^N14euzFSlml3|w9tK7 zeV20c-`3nv!I<$b=Ux7X@&c(mCS|m%>)?H?PxwpR~{KemmfJT`SfHfbgjH}8;t z0_fsqy4zi@GEcugK$PWUuV;s8?ca-SG~yeth^HnZQn8N$_TErl$@9N;RB#5NiYkWQ zqiUIf8coS^uVrP!PuYR-g`mT!JMm81Tg*%Br_U8#xUYXRAGovdQ>TnLJ3Dc+85jm1 zehJ4sqyN5fCoodPFC&?*da`6{k`Qb8@o#~mz6eKy(_UMkwY+n_KiMAV^+cj*j8OIp zz)0EV&?glOpZ z=wp;8n#HOExm&zb%2%*k?2_b%OIG;1)Ru&hu9Fs%`7keAFm(A-qNq8ziu}(fiYg&F z6%zloMA6mQ?_W7*lcGVokO?$p#?Gbja!&u-p~_g=`KC@>z<*V&7fUm%IEa}18#Y-b zPjv@?vK276!oNi@1pG^2KC6wDs{W&2r%p6ZeV)K=!Z|GPeQ_KmnU7Y)$;jbSaT1Q_ z@}|1_kA3uPK$~&!Bb)P=aUuOZGoKADHUSraZ`OlGyFeuX4&#V7AH>U__#KBd!Hfw^ zG)FNKEWaZqUagr$BFosR+0nKznclZH9cM@Hw?MLpqJ{eJtPTDL+F#eKa6gVoWQ(C| z;oAzdAOd|}PB-KRcW^R{t@!#`bonjJ?|6xs(#i4uD_`$wrnbO*+Zq+`yNtS=HB~L$ zWXw5^`s&@}DqiQMe%$1th`1LpU){j+roGZaZ|F(yt2&_!QJv|gw5P-`B}z_=p(n6! 
zb=VK7LR%_a@Pfl{DqQ)9Bp~opAbt3{X0VQ;u~3ac2uJWoJjEpCu~J3hsQZ*(KP`jubnrnX4G5IUCcngTftCi+LC4o?et8&#hh$ z7ydG}KeRqUx;AfSQB3o=yPSf|QK`HL*DW8~)}?DMrPStJ>#I)dFKL&`JyK&fqZpw% zkW!njSts3tLSCnaI(Y7tiq>FW3>0wVy=>S2oSiZJq<5!w+Q6$T`gXcKPuuZ={gKi@ zt)h;|i|m&w?7<^EI>9VdPG`Dc+elfa^1whRrl;NiR4L~>YChSRl);%UkRi298&bRQ z+X8I_x{t@M-ImUDK~y-0G9A-Np=yCP{QXj8)PDQO7@Cq=q5K?c!{;*6$bKx#z&BI@PaQqO&`e z`R4xEzMrrP=P=csoxa8Fhi6DT2BF&1m?1=_8Z8DfpF9@OWpw!BicfHME0H5|6x`P_ zs~*((ckx>%E}iRG@d+)q2XMnCHE2)`sx=npa9f|y>b!+{7p56S0x-n>SmvWbPgnSf zH`z*(I~mvCIk#!KsXS8O<;_V{-`Ai=j2f^9L?wPFJjVw&foQDImT!7DV{papT5M)O zES1`Jb|rRw>&@`k@&YFlqrMLsb|VOaTr4wmDb*)gmBmvY*j-s@gRYfBcf7*0WZf_X zH6h_9c9-?hgxJGR--T7DAIH)R(!ok{M1K2ALzf(uzyh2#Tl+)Lz*miqb41pI#TRQ; zThQucRBX8Ts);4LDYn8vf@M!Ly36s+WS)@9W>mn7kK;()oat2O@U^F}P~(5Sf9(@d z{lFOjEALJ1hdl87)S~IL$^vlY36kI>(-BeOI5Myw++BGXYSLx zmHDRJxJ<*^Gb^yVdLMNox_ur|D#m=1;(1d$`Yz<^de)~~A~rA8Q*hMj>q{3kLauUJ z!4@Zh6u|o>$t`ED+@*$ksH1oT7%!PZjL+i83r5B={7F;U=I2*(woN=f;nsce%4q-s5GF7)}XuK8xK|~5+Wn%xAY|$Fn^9H-3`?hN@^ir z?`$Is)xS1JXi~A0#tpWjv1g)fEMC4e;eCbg!13uE?)O>OZ(x6a{NP7ch9Kq=>exlV zmJ^!WPf2NOfN#!smAT)oR2q-4^n-1l zSFFch22RK`@C(Sf@05F!O(|O=FD9#>{HD(fXwGv^3i%nF00mBo(|>1ki#Cu_av#yq*-HM7?9R_VxD`4GoKS~E8ri8~O)udCMH z@52uw$}~u#f$2`Y*>P##T!wL?99f?k*PXLMIn8P3Htd9-h=7eAytVZR!OS6Vdtvw; z5mlD{Hm**Z^j5bqtD~Wmqd;_-hYEskyi=ehkb&A1$Gu)&95DopT0=kLzV|oPZ_09R zQ{{2V57GwcmnAM63Qm!Z)hFw{xRD>fQy4p5%r#?amh-+VpJezYTGWRsp3kRy2)3O0 zH_Mh9LH0Wi+=ys&uzz2MMRO(m1){Ttf>om|JlsQ3+rKjk=_nro<6Ocyjf9j*Z|vNY z;%I8rs^2WMvJQusUG?Vhp~#CuLOz!Lm+!gn-x=%2I#f}%-jNn}-jbknl$2wlSV|2OvDJF2N}T^A3cMnOTO2?9Yt zX#%34ffn~pQD}*5O0STcbNIt;h zV+GFZq!DA<>$prRk^slA)dZ2Be6j1o`KuLwjqqnK7(!`lAaeVUFzXR>=ZU_y>;_IafrsYpcH8~w3G&@1 zUr+Mbo7n(6z#&{~Z^DF|aVb7d%9HE3Zhx_j^&w1WuT6f07nAKFz*1(ZdvI7VXeoY9 zvf!^xh!q0swj4i3qzzZ)^*dQQe!#M&AA1RK;3s(G5v-qM+@6AXQwG82$<;T1-Tvq^ zKeLx%p+^JtC~6Cf*^^BL9*OcB^$h6zf1|Z%CSc5sk}@OH+~|m4am1*<67^-RjOsssi#oKeS)S8 zo*g)!^GLG(@8UeA+qx!lKaPBUH{1%0dyE3^)R6DzUUhTjt?|D5rCkR?LajG5gD`SKU+!Pz=tb!NIU|p~00(t0CrxMOePfjuntsw(S$WCiv~hDC~~L;*c3 zg|td&tEKhHS!|@|eE9e5@#trow!mSPb-3bjMch5cj7uAsIUH%8c<(83prwNY)xL%O z)(dUJvdKRguDj~t6J>m5K6zSVn}!?x>8NMubn-z<*;$8rh$sCJ;ajQ{@jyxR$tTFC zYz!`Gvh7YZKk{wx3{Bx){etM>;N`22*EQR4J)o%`H;B>Ff&{L*;hhEMYC81}F9Pjf zvozKKcFvLzJ3)_)g5Mlt0mStB$Sg%Qb#!LU_AVDI1VB82qFFwHTf3YiA)~UbUOE@^ z1pc5wgIec}q;@Gk<%_c-_eDKj2MCy--X#1Ww>~*qsX#UnXr@%dDlsizsKUkOmTe(* z-?NqiLD_aJN*Pv;in>SCjQUac?0|?2kg1V3{jJ}6<^EsvTZ#<&dW|PT<*da}17NlI3^S&;kAIT;BYZ_>gM;C_ zDR(3s-=vrAGW;JeAVD`!jT*ARN=yT_w;R2yj5^PaAa2S~Rsl%r3VLY_>I+*My}_2Q zhxitbqXu-+DZo@K6swNCJGINAOJ4%4twjvbT=1r=y+GBb0SlW<%*X)j?pxXn!qn_5 z>JML1F$PvO8wUJ}uQ#|`|J?=T=YT+Z36R1~XPw(5b%H1uTLhxuM!Ly{RG`;Cd6Nb! 
zi_Yj#<4j@Oqi~=)liRrB`X64TNq>5g{)gD|pYiR#6W{*7bWF^^q3W!zhqQ{k&x4FcHH(vbY zhx*Zdf(}rfhRVDAcnoK_nf!25Z1jG`=~&It*#%`d0Qhw3I(|Xw;EGCr-B59WVVQsf z6BH~V$d38|>^;Z9=htMup+NfbrEdZN&x1Vwy7Ah@05z1;C8(Zp#Oq%WUNwp~vR(zp z4PXs$-GJ|a?Fm=ZyC}1cnMb@iP9Naidl6lYm9+{grI`7o4lD3pss32bu!nW?-EVN= zsWID{9}i`q3M{8G>)Jt7~8}7)I%kiquJ<*EljkM2gNHu z14kdr86Ovq)BhTzHWdZghQ|Oy4Ri~f&CvoEAzfmJMCP3d4}N-ebjf%#UWbaQds8Bv z^=7Z;OUVmUqRG!frLtS@_q%t@U>-0#=?!H}bTv|9J2;6bwIjT7FV7=eXhuE51HJAmio?8*$e#%CXp(F#c zR7QZxr|0vxo+e~HW#|e*_c9<}Au6-0lPo@5hA(B&k%)XyjkyY+iIk7MHmq&%_MyAy z9tdIV0F+E_36gwi_Yv1+LXuha&&o*x*}YgT1~bfXgJNSkOLUW_>r=VSMM2=$k%I`~ zO`T$L6t71Owk|FrzNsWV8cu$~ExwcaFp4HYHlf_`$<+(Ia#MTodI)#I_dEK~ZM7K( z)(p#w9*JQ2qK=Rc5F`iJR)uYSijJUf%#qG=+qy`^Lk{x;pUT*HI@J)L>b#LoAD zVh@bA1#f%*!f*g@RWYcxv`m*{KnZ%4Xts>v9SsUKg-Qb5ZVN~fy;9%Y8@TGrGL34; z1=5V%tNxUmWc24@u8jvfZqK{IVlKHlK`JQD^!^}8vd8;Yj|?jfo;U4cnBKsnLHh!n z$y2k4=t~CU%O54nos6AT>w`i+rNyDjU85Sr%}?i)O4R1tHR1Cx2{|b_Kl1if0I=mB zSOwuhO+E3Uiidtda_<9GSic8+NUk(cUpPf!v3W_BNza*v_x|uL(nwL+QF4}b9%*S{4 zMNEyzQPf0QkqU(DtwBZ>IhlT_NxXYjp+}B^+F{lH;}$WbDOt<@!7a05uON|rfv+U| z&aL+d9zh(RUoUhqcZHiMV@i0O?|nKZDmtCu$xNpwziV<|1Oti^B7N7O?pg1gEmDkuruL?Y#J>O6}V8fg3UF1k;S&_~- zpt*yi0={jgr_{ro6!U}X9hmO*s3};T`hC~DMWU5vsdvdm%)@w{#UPmg&p}V&{Lzt# z)=ELuIbM38%31o5itO7Kzwp;@ba+JgWa=RSV72+8Ejc0|UvKL88g^c@8?x4A!y~+I z+g|U8^vfFU&Xmqn==`b;f7*4wpo~vUgrW~E3V`($Z})JGj97`^d#=hNxl=`wBj3*S z@t!^1kc(tL8v*4xt7nLvLBwD`qR{t-8pIthR+(-*^!*GM=P%rgBVQo&+#t}lOEO(~ z%r+wD$D<&7du)+*0#We9fD05`RENqKZJqYgY=Y@rx*ro6@$cE87Pyknz^Tuf5j#`6 zW*cQ7Q}Q>X)PFpp4_&gQ_1((BOmHkg^8FtsJ?`26uJDN2XE>n2IK&?w_pvk^8s$D; zZUmolsqVRnThM)!s!X6&*3683IvITXbnGAP`43?qp>fO_+W&SS0cS?1Z&aic(7*Ae@7h+6Kxzf0bsEld19S6>NC%m5 z4OqtE3QwB!7q7oZ$)OIfuUg6t^B5|tLxK*|N7ESUF3q|MSWQk`TcL1gvJDV1Ei%zI zoWRsH+GWe1Q=H@Uxx*qG!8?06(ov%3^cs4$b8&s$&fXpUzWpkqb85A5nmCgdr}*nN;acG@$V~IZ23RrT$H($;)eJz%JosEob2$*ifHJV8)SX$q z_RUfY68?e@^%!eJ7od-&Ua6`5_~F%(EYCfDSs?wC{$W8wjkP=J(KhBORg_psG306p zoI})^i2h{E+ryPuQ8mn00~&oMk+RI-w87&RTdBqJldnNX)X`wMeH(SR_VT5%+DU_DK?2 z+E4$W7Z^VOUsE*x&8Yw=$ay4uz8bT59{Uzf7J*Y^l6!X7B=>Iq0TdmqMH`~upTGzb z*F&~C>kn>EXrC<}-{U0#8)hlPO}$r>w@eygfY>AiD;I5|NsP7ew)S7!y`7`*WgGKE z#`f5_9P%?KO&$nol*LD|#1}1RKrcK@+1qJD?v@iDzmU5#Z+i|cbM4hXZTWE7t<4vZ z6GEs04-@rjfv~&b64G zs42pfvzPiV>W208%XEN!#RjYsnP_%S z2Qt%6sS3oS-Bo!X939+SLgF4UqH)^_70w!#Bi%L{XMNf&THE}LCZgXBSBBpwt=>9m zZ>aX{9JC(d4L{^bMB$xPCQVM;JZNnHWVS^ioSdDky z^zdBC&`0Y73l&r{ec11&siJd@y0G*Fa_utN)q#|LT)84zN0Um1~1|J6hAbG004`TLK@ytR-Ma3PCbIQugJKQ8y zTE;d7j@Gw4`MQ@UEv)^1I_sqV)di0mHjLAHk;}yohIhQpo~vQ-Kw|vyz zR=fXYYGyPon6Nr~tUM%G(Ck#PPsZ2o{@ZqSXbDzoIf95upbG`K56Y!*k!DBNr25+C z0zPj6j6SFz&RKytP_t*Y*+_g#sw7UuZs{5|C%;@E?HuZRCS=ek zkZzV0;7Tj5bC?a3G43rXVSE3`h{50Nr%HCQp1)j<+D;ebu2$=uz48~iuSUbqgRjUi z|9`I_-M#_}AWz@W0HVx!a3CQ)6u<~1AzT2p$4YZko@x@*jo2_Qcir0qDilCJ8ce?% zXBOexEkNU#Kk#qKy1$ppKhJziGXlw9R3aqkI(-8q2IZAd0%i+5De&!8ni`)Q#G>;fdJW1~k{Jn8%pI z|5Qq^daP$(4^QkKFL|C+M@R&`nx;ToQA0IVKX!z@SyI4l!Vo4(N!&ZfI{#iG(=#zf zaxZZw_^jL&-?jAEm+Q*f+n57)BZPZ#t4XWT1lTo#cc`=%zasQFRDgygnl4Q~$a$vp zFkO}}^h)o6)eB~6y0*RV*jJNjiL_vq^UDnaggN5ax5y*G8=1Iqv)M#{L^CP|j3Ui% zO)LA~NtE^o%Q4WR0|dxnfq3TIr?H7A3K`lC`1jtK04o9WT4ol# z8MQH=16rmXg56XFjEK>sBx2xpWp36N#S?GtA*$6(P62edeU>i&#mw?g7R>)Gv@Ql*D63;TQ+PNU zQf}xXQ0{^-xb<0zW1@3O;(}n0fqHOqRMr5#x}f;6rkink>o??_L4wt$)A1vlCR0W) z_qj{dN3+=Fbz}-v{~Y?s7-ljzT*BcXuM;Mk!r!efJLalT!caj;n9NYz{<3cTp`Sf3 zjyNy;%&BamdI1@!IMnQ&*IoQ*2+8t=gtp(=U~J^HuF$?ZjT0P=`Q{k0XJ?uRxF)xU z=Ak^ChlgD>)%MJ!NckSRmyYvdY{T9y#?`oA?w}TM;nvV@yJiYMwe1<5OfAxh`A6dn##8{!^s2?{Zb1WkNo#7%ewJpc`qHwq5N8WqF;0TckVGb+1 z+_TaVDBgSn-HWp$9jX@(YUa+?t75fxn;Oxd&%Q!A!>Y5Q(hTcQS~0*!*-XSg!(%_* 
z*`hh(wZeMoUbw(GxDKMUU9;-Wv^(rO!7r$MeZ?Q8H(a~^QS%1LICHFFro&xLqpC-5 zIElhF1x^`ezL!loZ?HP`Xih;kV+`fJolS44+=8cgp&YFGcbPS-0-K-xz`T=SyS+iJ~b>8}7#Mp`AMjK)mA@$zGO_hpal03#Om(9Q0B)|BG%y4tz`o9)cyG7e25I9qyaq&!?E9B^5$^#qtln zKL?{ZO(@<}No00R29bx{w&W*vv$C^AL!7o5@Hj}_=v4*%92=@Yd+ z7-S+u4qe!RC3-+Ez0W?-?qu^-IU=X~0}>}%O)W4ew(_GVcI-a)GotPukg-K~cei@^ zgV7{E$>jC6d~0;R<1ER*Xhom)enR@axKi?ms<`$SR**q^PQf7u3pAQH&yUxNy;1%? z%-e6%t!O;X!%({3{mP)y6>vmDokyTzJ|8pwVz?-$>XBbE8vTrxaphffQFIvzo5bBA%H4pYy#Wr{ zW9r__m(A>2A*j>NCvztaRWjZn+~@C5cppP+gZ}oVx<_TFW!CA?P+zzWn_*o|!dqflZ|~v^VhXlTfUpa5{sL8PGuK6+ zVs~?9eU#a>p)x`nBh(qJCtM%M_MWO~*&5vG%i7HiSKOgKV4PY!f4~Dpc;+}|zo|09 z;szw#xv9t(I3)*X2M3W-8?|S^T^O$_kX8>hg{54T|7K8fBaf;c;5>_s^{%~NrtRv@ zg_?x!;l5xwe?gFXoIV~J{)H#2JM$Sq^t>7$598i;Ww_mZ@rBaelt=}JTC+JAFpP1g zs`q;?n^x3DpD~0FiPPCv5f=jNX=Qc3vqWiPHH!p_Z% z{H3b=&{jECOFe<1Mtx9Y!cwA>_7Xm z$UqDNbJ|M&gB~p*E_7RiH#;L3E(M=X#{%*wmyi48$?qe<&r%Qb(+YQo$hc|iOHJGc z>b5cOU#4oqyRjv(_@Kk3M3)r4OOfUWMS(mU+!en6Hrdj=BCh+QeqZU&VEBc>)jQ3l zUH-0N^ou?(8uuOzl<#ud@hk;zQzcuXJ}k*ixrx> zJ$2!_&>_drOFlogs&cCQ>ENn8@Os~dQIx2Bz1&R#0z-8XDmy#+F`3_&~r3Lk3# zb`lZ(#Wwf-nmWoHwqHs3z9RvD^^}(6zTX0Dq=}0Ms>67P#?NcN(u5|yhVs8bL2{4hxEzLy+FLa ze>;dEyo?xNU2P<2UMF5Gb`~LAS&c<=lI4l`&yRzkxY-XMpC(vv2EYuz!LuPgyeUN| zdw({(C!;7DKA)N;?`Pfq7An?Z@VXJdfl1O5F4jX{upAfZ9AZ(2plq-o(XG0A2r@Fu z+NS&M+!OZC{H=j_JI|_%i>DslfcW7y)ChK@&&2kyqViu5R`c${{W{0)``(PNF{>}$Y`z@qn;7!&6e%1{rGnbbz<6Rh zv*(kL!P)$V%sc5XJdXW>gv1ju666&8>$)o$w2$3uQoHoMv}M(k2sR}nc*IuDr}6}o z5dKF<0-m%>9>G{SUhJF~wl!PGq)HBBom@x#Tasm-oJsz=bn*V9%k#TtGq_fGynqP# zCDvgm>qzM*wvZ^urVJZ-z<|hiO|or2i_eJ?i73Bz;lLr$1u@17!thhBbUW{BQ+13n z6tk`0mA_Qb9FtVp%2z&Pk9XG8gS6lh?%xx`2&hemQs+q`RcAhH}(i#w8;Ox?f3#? z2lDUoEu7OPK-dLKeR=c>{Sy;}{v7qsv@pb*v@#iL3~1OIL=G6Y3K8k)&_3WmsP6~r ztRGDZJyE=+ML$S4p+*HwbeQ3Lo_>aH{DR1A%G1zGT(WxmzfAz&70eHAV1bZj6yQ#j zYcS%Q(gF2~atyoKp%N1bW9|{fOQekc#TiI!ZoPfm&r3(H=tCDoSLXAoiLaufa)8 z3aH9_si_q#-8xZ>Iq5TRK+JA2S~|j$x&O!!uXeH-RP}(fk0;);489uGS{w_(-!6x* z_CnV$HQvbKz4hwCFUVOuJ&}?`pso>$pZa!SK`A=}g$hbq4@`y~e)4ZJzQ1xhPOk4{Z3bIiM<7 zteu*`10*ef%FJz3JFHRd->~ado9Txnw=7BgfV4I@wU)by=}x9`(a@+lg-ewD-yW8V zv=lhL&2^_W@HRr6-knCoU!u%56s7N1KI+37j~$w#%dfY;$1o6g_CEcBfRz7>o-rA^ z4!sOigF$!AIx4mS3jS}XYe2KZkHWKW7Yw6_tFZo}-oHo?{jId_&+Q*_wSTtDf2bAwKhP(jzxw{fIol z7vfKihMlRsOg)A9+GsfRRHdCNiKDZF)!()rhlQn+>5<;LV7~k|AP<;C3LxCGIYV^# zULlFNsN?|64#Ee0O=G0WWRq(rXO$I>ttjD#Kcemf8gdIL#VH zy^HC|bGpbZ7p|DtqeJ(EgE=xhsR2e#DE$7Q^0ZV&jh8-0NU&Z7Bp|2AB#dPUC~Cen z82G7O{Nuj(fWVLjUVYHns$EIl+-rK+F7wpr1PCcv(Nh)JR%^wjo*Q9iDR*KMR6E)f z%@*d-SSJ`vcfX2Dv3U1|uArk9);Ao^_Z)5(-yg)eUdYQ;+Sc6R&g}wV7KoIk zJV?)pS-c)!xC#aDC?kL9fyGM2dMp2h>%J=$Z-$yf8U?m@OWlZ>ZB??bvp>1#(lGQ2 zcMD9ce6bCqtFO4H=;brPDsKnj&har-U;Ffm$3M>3!0g&*Aj+uFhvLH)6!&pH2`)|-~e;%AOnr-!N!{++h^~7s5_al5;lnIb+Qg_!Y)$B`Dp6i z9FUmzAx6Ep(16E;oRktD0ebgOHX&`c{k%J0yN2Wnb!~=LWsWIcZafG~du2vnOizZ- ztV73_EbLQDzu8|+^su@NHY9n0iyy6prO@M|{t{df{f zinhO6jR(c+;di;LJMjPP@9m7Ic3$!^Nq=T6`Joaer?wcUEK?|Gp@Q%v$!e8i+c zpjqoh(I#GvL z(@i{iPH-;hyf+qG5Mx08qZ3BM5+)qn`ng*v&hemZ9$*@2djIH*5gs&@9(zXmO4SCY z#{%d){cGOXKQYQsp&lMTPkQF*T`%w$_v8D(VInloyB813&0a4*lbVm zmSxW&CnnO;*!yQeiPQn=BeGtmBiCpbtv)KV@Y6nMJ<6=M;t6l!%U4y*HT+rtMM$2i zz%;^`#;Eecn1^SRdSox`<9-Lglb3I+W4w=xoq!BkkYK!ImCYOeo#;xj$mVx!gCRC5 zJ|g80gYJeLa!rkgO$BtS9P@YmABz(WTT|pekUape4&io{Arx(sOumxDC1X z3o>$Hwxni8S7Y`DGLvP_V5d|Sh6@ZI_mBt5FHY8GU+-5mGJn6>wigjaG9o({ z>J=p1Y7{E& z+A;1oU8aAZ;ysEe)8(1irb7_a}Cy6~cem>k2 zdvs!~J*P$ZD2|bHGaQ0kfNz`tC{GZ!1o7SX0>R1(>yQ*T`1O^=^hHJKV*hN%5PN-{ zXXxS}U7+ZNTbX||SAIX1)k+{O5psXQ<`kAd>!|tgtPNj{aVDg#$`WCX8+D@^xpHTx z_Zi0xZEjUwIv3Lsu5O&p+V6@+RmQhc7E@d*K7~T~%~7#|JlXVN#PRz#8%kxVNMgM1 
zu&ZaKl+zFGE%|)rUP>(<-@s#$Fk%pqhfu3sxo0K~5=Pvgk}c=azyL`tR13OL4@# z>A=6T*kWa%o=l}LeH&l=kw}{Vq1-{jOu|D^A92lZbuZ}*P2G%jNf3UxGOq*!(7<{G z3=+V8|Nh-P&0LK#>9%f#j(rXI(yF-OelcNyo9+9!RnOGcG<^AvYC*7VRWRfZ(8*Be zV9?mntHSD+<$+bnh9#Xzcsf>P^LiQcL)h^uS0|b`W*@neuydVQTx8%hJbR}|tD_5o zI{6E7yzY`ftb0C(fPN!JNTPGl7Hf}nmI?6-N+@bcKA5_X6IGjwBVsybPWVU~^;;W{ zzTRBX1pQCes62wdaHvj}=-(Q|@zIiBdP6J2P>abwdGqjUPNj31^nl3%<8rJhsfApV zqt0JkS^6li4sr-URy)gxG5G8{$*+nu(e`)`6eN6wLghc!Ztvy3;===fTLcnT+0*7| z|D0=hP3O4Ad3amhyfH18e%h&5+tsV!)}o+bXdrT4)H`KIKs*2W*Pm(?>>|AfyVfJO zl?~=DXm#IH8a`L{DWSpq%bEI1GxMRCZD+sHQGeTo^_V+gk`D@}s?gHpko6W_UQ%+t zH}Q}4!5w||eJFFvy1&KPe@KVsNXu(r^10AB$@Clu>=E z5-srRU(AgPr(wJ7o@slt@Hgo!4S$p9`-9!3ZO}!}{{@+V16^<-6q0rK?AW2r)qxnwQL;`*umvBw37c1~ED{Q;9C} z5qKa{ew*R`a$z&cR^CcxI|Pjw+TdSBQRcoV9Zmp+(jt)ssupZ)X?BmAdgV_Oc;( zzU=>yHhIJdU;A8zqZhi|69PxBM_n4#ezko2U8R3=-yjN^_Ox&Pa8XZteUPrX&w#~% z?S^x8AN`TE`u}bHXI2Nt zTO1&AEs3Yv`t|#k?hSl6ZlM2Wc014J2g7H%L3Q_^j+dA^{6DiEH9p-C@J`+$>w8UH zlKWm?eIPO_Q0I59e3-smCLix+P zG!L!NtCD?rp{-CQ-<_2Od7^x^ORgw4uh6(pS|fUCN0sn1lRo?j-sXgUAwJJ3G^QQ* za@SV!t=PSHC7vj#B zr(9PtbUF&LM5pW2{)Bt-ZqUnj_a6N=6Q^`fEDd zu5q1?xHcgLi{;i>?2hAX zKig>LSp2ge2h&)0B$^yQz-Bo6qs{BZn+|>c%#m_;<2B=rO8Rtul6J*fXeBjkEB7HM zOW_({j%vRYyjmPAx4F{BISZpK(j29KxWBQ}u7G_8JgrLT8dHBp++(f_ZEs!RPYh#K z#!-g~w+!d++CD2qQ_5!?kFA<6WWI?hk;>%HpSltvr-=!=zOpk@7{6keP6 zD0n}}SM8`f8faeOj48{uNSo$BYn8)4OPWRntXG@2E2`)29Jn^Bn+JbnN)rEc zHK#z<{T<%Xv^j&RX!Ie35L%0o@{e*&fMr`0w0?CL)k9Mf+0E5MuK|ACgL-k5FRa;S zRqP>+70IkK1sfK5P-dIZhj-|!a*L$-*z(^Tr=J)Ws7^uOxmBBV&re1}j=5T8^aX06 z`cpt=N;=@yYt*bI*^kgF=jRAAG1)4-+voJ^)Ol;+(BoglXMf%u$Jy=yX@?=Rfeh%{ zkVGlBumYx{XW+~`nKh_~CF`BwcT%RvQ$3x=S+6gDd_B0PN#yk=ReYyk&>AkLjfNDD zJN(Fbd3{W(q|}?XY5KyX2Hxgdq$h;Xd8CcLswtxOCgu2&_kM6_K>uZ`IZpgdyWINu zM^4rvOb(8mD^E6toTXoHJ58&1ecj28UVcCMzE|?ymT`5DX`#57vaavz!IE|&@1r4C zC=u5Ee&+R%+Tiu*ctd#&XR~JhH@^zU)0Af_ zPH1(s!oZy2u#k=CZo$%cK3$@#G4Y#C#(VbmZj2s4^S@*4<{$KPI|%I6%YI?QuYKTaoR z4>ymCR)HZaxnubr-%Mv!15xk~t=1FgmL&V+;-97p_od(PzmjzQRP}~(;%P;60mlZ3LFc(o zmB$k`_+omY8^w5CS-H-kr@|YJs&3@uS;-Zd(e{Wq1*zvYG{ZZ(A$2a&Eu%@iW?x%Au7%lQzFOP_O z@Ht%GZVj6AIX-+vhRJcbRC*}x90UV|5uV*7sltr`Su1q|@<=soVJ)3@a%y5fndSR9;p3jt&9Mh2ifk1(0SK!5siLW~0S_Nrx&Om@YbCQcz)*zibQjD!E z*>^PaBwAR~S(0*eiF~gQ=>u2o)x)3=|a9{U#0g$nAg3ull z;j4B;p|P8<%)Oq59aurmVfg4cEN8`)GgWEkDLCcBoWxZesI#}ZemD8C#N>9T)l!m~ z>QnP~z0hnj1}_>-wSF*cQL*wuqg_N6BWza_EkGUUzw~nEpmy;UYgVJW%T-SrltJXi z1nD%Npf7X+iT6p!Xrmd0p=eJJt%Y{SqS zz?!QZ@pNA^yMjBUU4(%q+4~m&VbBOw==3V%zu>CWaE9?0uzzOWV7~Q?yo8ap!0G=&w@D%5uG>=X zuGy^Ussf>1Yls?H;>133Hg1Pyq`d|+Hhcb%@^V|0oRTZWfoc=3nOHpT zwbBFz4HYqRu2E#G*Q0NL$tk(*_+_tU$X?(|XML}v3GPjgsB}&0hDoXk52eK$!$yR~ z$h)#DHe_GWF}C|z5_D8$ZF*Kj%F=X_6}Q%S&Z3NAft9R(@QA7Rg_u^JJIdk~ZL7C< zg(SCnNET$M-E1Jcep-u#%TDYImpRe0K*>CEG~r}(@~un}Qxa3~5!xt(`tIJn%Xh=f zwqA|oja6lu-6u+Fp*4t7KVBX#wM%@-k|NoIPsnf++O~ku49V)V{!aogvME6JpP&}* zS(CNlY17K*uesj@nIvE-D@Er6EsY3kd3iCe&~Tt1BZ>n;SP3I9VzO7e%^$N(J3OA!$! 
zz54_r>Y(jYOY6VkEPO96m`sLAg9SU7t@Yp!vAdFcJ+uJW4}=g7hpIGhZ>2fV8(23m z31W=>K^y>5`EeIK6q$4{8ZT9Vvgben5RVSsozCHbmt@^rg>7OMaMOe%YEzfX-HkpiCP8t5EPpdid`heJBxt>`cq8 zQ@_8;PA4i{jO;ZF;+uAW+fPZ+IcX)RW|aUSw(=fmZj(9w(+&gm-tB-(L~Y}VJAXab zpBqi*M(Vk;CEZeJ-_awRZI@#q4Q!YR!ViGj2Q$^c8e!qYBbw=!!SE4~R&H8E9d>ym zJe9Kp8QW zhT$PWPoXMPv4OQ1eW*+p`NFlnEOzrT5$C0X173IT+%fF?LwEVaPVJdAzuUP_%@fA$ z8)V5z1f#~il6;h}%9|t4uUxp-a=0t7%UHC3MOyp9wIa)s2aLNY@-`uCB~@(jh3)`{ zM#R#?57|1PZ2R5SdeD@r~ax(%HNv zuZuu_ioo)f`B^wxfSLF(A0FvhjS)C^Ey)Tm{{;)W zEg?!Ri7-(Lz7T{iev0fpyPc?28pOH-9)o4f6F-@_hZX!nGD61cZfW(4TbL!(U!Oq) zS)&-W9%6BEYJUUw$VcVM5SMU;{bIAh!sN46%+pfmtU_TO(F#C>wcfs9WW>^3wQ@Ki z60ZtjOy5`h7YgwK$z^6|4Vcg(V~$7Y1shE~HbRAK*z;%TQkyu(yXkcKRe8PyO|dbd zH4X;y3(D;P76L8y<8q0gy@P?Q6tCslf8K0T5hIpD(`gLOzdz=NlVqHP(Wm*W6S1wX?d;8o$;0Y;FU4NSz7 zF~yl4AUBQ8{)8RDHwQU%&TO^AHoEK8zz3D}gf_IZrp8O(gkO->$~ zLBn)ACNaiY)Dze~sz9>z8#lMCrZY{)Z>j;IBwSVioG=mNR(H{@P+Im0W471DWUfRz z-d3bbX@MuivsfQ;f&&8KxZ}xDZUC&FEGg!TxJ$!avy;kXSK6zF{-PAUYyB1B%z0x) z(;YRiRZM3UfLL5@zTV^gZ=o3sqjT)Gv z2tB{Y(FKbNFqj^Ipg2vjEOl1W@6R;;)`wyC1F|aM+|Y-P+{LafcgmTwu#@qGDoj7Nnr8V~v@9{3+Rrg8xV?bf2=`!Q*{&0v@U+19_$6#q1B%;i$q^Qj0Hw2%BZ;so z%m~Q!YP*T(%|8z+pJ1Gs3JW+Fq0D?Us?U7x$%&F;)6ri3^wSWF%wk_s74cG&#ev3( zK}<%})sc6Md&2<;ZYrxpgs#RuR+^jzWX-i4$y}Yqe!aNvnEM5UEID^NK2+vtYfG}# zk=5$CmaI*ib=H=6rg6K@`p7lF7&DzkEa0A2pTCUqgHZ1g*FV$tU5PGp%~o#=u$4vd zk<+I4vg_!#UZ)4UJpt7mW?N-4VN_Wk=1#klYs>A$_ihkAeSWb|r2o_%)JQrjzOxl^ zME?6s%{j+;!FPATT=o`#dsR>8aKg7&PUPHjFQ!dpqa2=7S;)693n5s?KDg8vn9PDP z@i}x;i16d%hxHGj%K^=B*pr>-ONXRA$4jn&X_|@hcIHx}@|jsjm5FQ(jVtYdAc`4d zMuASm;%w>3D{@FPBWk>y0DH3H?(DgM?*VIyXDUq86NJRX!6%E)a*R=(r)(H0R%Cz8 zUU^B+58Uyq z+S9tduqfcm;XWg3cO3rS8V}n>u5}$;lSGk}-wH!yP%jyFUh);lGke{p$~3LL8|i2` z;<|2fHh6k7B*27j4j>w#f0#|$K`x}b*K&Eiy!}uaAV`m6Pi7b$<`hOA+$n4goqGWo z)~z_qPF~3kkfNMEz>QDMLxVoxxvO)hw{;he$pw-08?t`$mNof}1IKMlWbP<)%xcnWLv zU})OYYE2_fh@M-P82DUpA7LdN@DjR(HWCFlV1HH=lnzz6ij3c~vVNF0#9SdA) zGz^00s?AAN*?EG&OFrzU$@Sp3G3)p%RXvaF{C#0z@fg`xCJXCdeaLX!HJFH6k?3I=t zDZM*l#}Ev1rArEUCO5$@P8{-h8~p)D2YQ>gZR!!h%srl2AcR0NA=a#DT&HDlWj<*s zKLvvvq7DEy!sk8crMiK>8r8)1loi7dc!1)889*zOJDqcWqKqNm^Hc`8$+Xfz4eoFb zdw@8J+Qn!vdW|<%=sqys2F+Z;>+ofHZUluI84=#My^x*?0^J%YPpb+drfEsjumk1g z5`aKJ>h-I4H2ATnwhifZl_^y%&Qxl)ZA3HdoLh)0dzJ}p3sKiK+a<%;6o7DPKsC7o zSx@4<2_z&GPh1bcCt4kmhe1|}!?lbvb#_6+H%zKoPnrb?00FvZ*k?#RtDZOpca4UV zeAzwZc?1ZX%^&n{K@+)wQ~_9iWh#z5Q^6bawrc*hj;*jVbhC1$(6`#Vq~lbtBq8lq zZlW!mE)@ktd$1WvjO$Bf#b1*#vRUeHb4`fR3+_xzVZ59D{YF4Grre{1XNrXE1M=eU z4EScggF@IoMc#QR1)GpSDICi-qRu(HJ41~m5*6a~7l5;6Z(*kQ+wM$)G?C&=hY(04 z%abIh5f-F1^H`j8LL2dclW2!;nxnDknp@;UtGe4_wwJ#+AG^*P21zgs)(T#44GfdN zEhES6dlg-#jpb-Mejdsb8(@Fi>6%kwSaU(3ia`%`DY(Cla`8930JB zHL!Icxw_+Z9dp=f#cEq>-~J=JI;IDrpxLrVMD)!2fck(IMcE;mub%MbL_-k%LLb$<3j^=x$F(~CVI|n&DLVOC#Y)9X*MJnJX%eLX zj79#o(ky)$*u$vyjnvt~^|Oc4HX z0MN!HWR;zv|9!DhWy+?qC5ih}kj&Eq19TVpLHO`lXrL5#VC z3Y_V`+KNON?kn%XIx|yuf#KCo+tFa@AwDs<9p;|?^Q+TnV37X$H0y0sI6o}Rdr1|j|f)12bt8Z_^l7=+xLZ|N%1;Jtka5zTb z1-+Ot*=ZIRfJV>*tO+oF+3F+WGYBIsdR`%imKzv`8=aKS9g*XsiuX=Lr|DLiuQ648 zwnR{RT`aumEvpdXAlwD)otuErnlQR3TZ-fbA6YtZ%CI4bvEjm2GC!)D1FJcRFCdJ( zIZxsRBeFyI*L!VWTHl?1;V@-}S=T=Jk+Z&-q4a)4`AdPLiGjZye`Za;^c6T{5sJq- z>)SJ^3S{ie{MZJg2Q2u9N+`)`0w1io^y1WyoOLG0OKe+xyv?U+LHPtIiVlMtkpYM5 z*ie~Eqvo7F?>__sn^~w2X4V^MWn?peb(nIK{>&*dTgTEobE&yoJR@#U+FvO0ikxaI z$m<<-BkYXSjAnv3FU`%!+?Qs;$X=km)Zlu*aWqluj@OYHh_B`VITg7cE-XFoey)tZ z4VpM4!H5;oc$JD8WO)biKa)y5bEpaw-BVRcp=51aKMI`tGqB*~8;Zkx#@Hq7j|%>l zEfDXBE_Ba)Qd2%5OaZ910a2C9wRfNAoHj~AfIJPO1fy7EP;OcKj4b-1;B+J-$C@Gn zQ#I@Z+e<*Oj?=y8^sm&BDTWMasbxjm**&s9hFkx8srwgge}L5VU*LBGzT{k4zS-|o 
zoCVg2-)OMAwW~iAQi+^}Bwa1!)k)94@|A3xy5}t#g8;O-kF<*oZji9=?-$DRrGIXY z>?xbXnr-1LGA-`#n&j_Z9Es_cnyk*L7n@mq6F)FLvl>fqH~Gbq?3${)nA_T8^FuWK z?0rrAHz!2cNa(pm#as`zDrf{lndwL2nS$^W=`I9KhI5yzV0xOdd(Ugd_VxDNeE!Fs zB6o3X9M8Hu297RK4lz_HG2gwdV_Pg_*S?!u|IH5Z67S?40hGsQVS&Kd^V}O)?Bjq{ zxR4MTHe9It?V(_C+-;LLr`&rkViY+rg=8nS(7UfqCCcAl7cZz(q}_~lzhN|YYe;0- zk}T;+EqRz_tlqw|PVhz%;3Cd#^z1oo;cc!PlKF(Sd#>vFPK5{3=x z$rm9H*6_+=E3IlDhs4SsKoLKlv6r6lE}~V{3e*X;RRGH=e}dNWW~toQvAIp^&V2DV!ZWf9dVuVe z;2rh|_x0JvBiRLw$pJC%(N$czc%E2puVsaHJW)v8uzl08P$e*;^=X$_Hb3OPHNk3n z5uECN_tQ#%A5|E);{nfmujn{T!dzAg{qGk80pdnQDaT6=eH zKWkTgq_&jL#z+I8dx*~X#pgPa^*Dc{tq6$k zH;F&BT%D4X?27D38#GZ9;LJBZpn4y?q5j1f~ULU|_HoRUjd z0723s!^CTxU`7eKwxoWV>~nPs!PguG*raXO=^eOX zd`x$0MIm0sA|SjDV&)>+F|teusN53ksnK7jbxU#7K*kSfJ5X$;dS~FJ_u5Y#i(}-0 z%Iwy`n_Vw)CP3Fh41h!94T-R^J#^iX(}tZ<=7C&PToUg`Q+kA0l@ryL6={i z#jTq(VKy(B=0;E|d%Z7EROM|vX6aHi5+Tr%5==_)UXyOyXBM_6Kz|?@du2Muuh7Vz zDW34@!7FLc;>ei^zWY^lqh3&2lSFR6k&4->AextN_y)35iOFgIy^#AYy%oH~!~<9X z+yxgi>)aF#LuU*rQ>dNp10$R9C@fcS^QoF)2%S>;H@e6Zi5na0=$@E9_#($2Lvsyo zp)<0+Vmc00$=CX4YTs#anyEY@=z;O;FX8xq^jnVUeqgx3&C*0Rr_jBB*{m(W{|hL2 z3I@u%{H4x8vrmIT7~0n=$B~x!DA|9<=OF9k%g^gC;$VaRMakjBQX)nkDkQLg(dRCY zDF$c0UV45j%%l&VyO+yT^rZaNlPbxV86axQ zWmy8YD>`w^OYAK$0^&q~`+|J2e;`a(h#uAmuJ*6S8{Wj3E>#lmR`cBmVb{qU4Q`(KVV zu32xLN0!&ZrZ=dYqd!= zhb9@=rG470sqT(*%?HeBtr}McGhp?&y|q41KBr z*S^H?`~db+OK1C?UpEPUz!?eL5el1t52)acNg1*v|JwUn03rpUjp@Ry-)+PLb(>7& zi(1KFL8xe8Q`Iz$mcs7;Xe(J7mh}eK=Ir3anGkD9Pk(fog1&>wVF;iKdoco>g64w_ z`0G(K&%fe?Cmykv1-em7Iy4gT% zv%GIQ?1Z!U(<4l=UuK?xL^m_pNd`b`l^_N{Tt=$SSy6Eb`WH`x=UAH{)U|qaC@bbt`yj^m zL!R)x))NV#mpop4>0@6Q?-f1?OW6!LG3E1xotd#`dwjK0>v6!fVtjj(1mmEITw&qs z>ycL^8E;84jM~hZUAC#tA03GMU`Og!q^T(hmI0+j8@QAJ0&GV>&x(YyUyxFHx)7st z?lt$I4QR`z`7U zoGihA+@n6F24uD_&z_^SG5u8j^on{e0B_-AP%|kK)ZamU6#z-gi+rLVf8I-^Prp7U zz0s!qV=izTra;e^Dn7hbc<*Cms5Igo5o?sgrN^mt&6~trg^($BHd(0RwYF_#u>#z;1nR?VtrC^vc$U*NR{FTUmzw)$`Oj1NDkP9REZA8&yRKpU!D! zv{^wqE0IpRCjz&CE#b?bHqBSD@E`3f6>L-Ee5oCA|ADPx7UBWqtdk=4%A;LoRucZk z10>42o)E`g9&1*G+4vxm=zrd&cop5mboIm7ffeVV?L%q>&_o4&RhXw{h$PUDD*wCG zDu2tm=jVI{b<fostUMTHnPqd`C%- zSK1LUZ0F05e)!>jY?#@K*6T^OyPB;^zqX5hTV?#|W0`H%)0u=K_7n@|t~N>o;613K zbCOG`J2bj-Mz0Z=5P@AnK;cH=q_oDF{{>T0Xd;?U{ozlX_0I&|^dEdW>Fy`ALYnQ7!fi z@NJyjanYRH;7VCC{H&Cn;mX+Z-1HhLX?+vcY@2j)Oru2J_EtGkeUp>ugdDSQTAz{C z=xoi>btONl)8P#3NR`zhz$dsUChk04#651nD%aN{ayVwFB+cDTA5A5ddiK}R+!8?) 
zjm0QAbV`nua!`Pv7u(x93JFTr5wz+kk`gAB2gBYWmm=}xu(=J09{R?V%n(bp>cyq5 zDAG73V!(N}^QCI1A>Z;~pi#}`nmYFq(wFvPD^1q zXNihHJYbr4w9^TJNEUT}i`;?e{Xbd&MiK?q z-&)nYWdQqm&HRV@?J71#jQW6g(3o{Q+_0zm8(+pX-wDeX6I@pXbyF4JO#%Lrx>O(d z7P&cw3bz4@;#S&YTefdk8H(y6@9VZxymD545TjZle%m9OTzmD$)ySt z{P6YjHDe;~e5~SEeVmhcM)ZPCtpu&jY}zszyt^m@1uf5B{L-Wt)Jv%uw!^gVWZ^1{ zRgx2UU%zaz%Le5a6X@>voORaDlVKuMu^RM@^09%$6S7VVkzC6g3$>vya2z`VGZrog zzXC6D|zTX;D62C*1A_X z3GK*|+kLyY7Ulj5#i~oA{e}8gO6Mo*&=tiBZ>W&LhZ?HKP$5PYyrze!J{5egsF3_7 z$^Co@wM3U?GN}4X6UKZ??yDiJI~jW!0RI0v1Wxej(m=dV2A*tv}yArz?9wxlC+&Y&;@gc1C_-oT`1NcvVrzq-k3t zxyjf~U00^!v)x^a?kSd-o;OTNPRiN(kZD1wz0g8TP`M_JSYdWNbxNHs&4jOv;eLL* z!bh@AhOia%HHYk=^tR&g+^zF8@sOm)5y$74SR}m4 z9li^;cZN#rAQz!u$UABWY@AmunHVK9G^w>10@GHw@y3aKMUnECc9Fax#v%!y7&{ns z(`KRpV1IV?Vt@OR@z@^gVW&WlPD1fs%IMNcR0_+1CH;*TF^p zVhEsBpTABp%)00cQbWnUcNr(F5SI$hWby&Q!2v z2*|iCv`>BcKj}^Urf0#Rsprp4WMMO*@F5Tp3m}H^A($mKd9P)EQS(y75Wz`lkF$ehQdixniz1^f7h~Z*LeeDU4l8UlVbZvK8Ir7>9S(-`Eye%}>9aIDMP| zI-x#DKTR#Cex>Vm-u%POrw)(2*LJ@x#wqWI=_U-wO}W21PIw054*x>yI$Iatm_jBd zJ=eu=dqpQ@zGSF@bGB<^_Mm$Mr|CY*)l3-%$gT$1z$KC%5 zoVWi)e}jgpBv&T!8_helVVb7H%&LilD#kSh56#GLTV1aB^yIdZdW=Dpk+0=2I|rMs z;E?%k%TKFzIl7Bk!_&_i1m6qdo(l`+H@{wVp)Atl&@EuEv)0$BA_yiexLO4i6O{*3|iXr+iOFKO1gCv&l*PSa@*fqG@1WI@r^WBEKKj(NNWG)l_+lKK3=&V zkzq&avlDMKzKByQlqgKBxjR9lM~m4bMdnyy3eKyT-MPWe2C`Jm(ic)#%NWd5qvD>` zsb&pnMRQ5c#^3*F()uWYWW>FAQ@Ap5wvdD=mil~`Mt?JVu!G2O?%t~*w29w}42kJl zX)bF*s8Xy)2#WEYYDbY5Ik$lpoJ&oZFpNfv*iuw|xp3T%`4fB~-trir$G5Tf1Fz!w z=Py0}msZ+;DPUv*!~)-ogy97%KqVmr+zMLt(G+P>Kbr+HZ5kPt#EHqu{a|)0;r44Xsj&M+suX<39x=sIldCTrTt8fHq(T==H(5rqLnFAaqiCU$F9|3R3&W>8s zjLGT!;@^dK)C@ZGF z`2OmS0w{OMY)t66*ZtC9R`TD}OFtj~{@U{ATK!zBpL^i1-vhAk!$0fOzZBYj{`{Z6 z&ipIx5h0hc4$Yg~yTlIA-R8NJ^&0oNE$}J>?si+KeTgreV?h!%fRg4e>P97oWncld zHC{t{l(D!LB|@d<DXQD~4!P1fFCHn>@Q?o+LX; zWFj$(60QYAvxw2#5~T=7pLiOQ=yxBfaDWs=p7YtNH|@)ol$#>AaBKTphQN@Zm0QB- zqN&g=ugnARB08mn+yJgp<*&1OAbX7Z(K(d?kei zhTkq~SHFF5{gHPr(aa!QZ~>&}{cBJ3vUTvfbn&Fzd1Z)1Qym$=4E1-x#%LSo*`IGO zNu}=Qxg>3^8IngtSf7HmKDj-G-8su1hD&$`9JX)3EkzU1$7-#F-p+D zE)A^*2??nynrvv+$O_%;)-O(M!>)mIb z?OoWs^QOldzCakl^d;i6M^)RUG)= z>S@{VYt0SnoS+{iu!T4#)X}|N_877GxU81#P`Ye`GcA^#Ib_W(E`}N>SUR zQjz6i3R=Du4WAvnObYb~=^lCJkRi@<$CuLJ0eN-1f;X57&KS_Jm(`FS5*_rvdQi_z zxy=xJ5W02>ohbPCHmxM&CVDW13^6xico^u_{s(buT7s=VrTYAD`9Dw=0@KdyWLd=2 z3c8$xVR2(9hV)~LyG5N)Y1K4`#LNU%VX=_RNhU8dmQNv8gfb**NEUaRrQS3ViWp6{ zfh1DL$H8nQK_sg2%4Xnld+I!1vXr_EK99ltLd1{BE`8TCO zwyKY*XZc>mix`wCp1UU}4k$r7&mN;UXo|jqjP*=OfEs$2gKL`vy3QfhMeYl67+*Smqik3E^zJo43E=~98 z7nw}6`Jlk#*Zbs`w``U6+RgIPsI3dS%VsWR`w%cD^JEGwYDbvI$g?~PauXov?|*nf zrkNR-Y5oUNM3?p{wMRQt!1VjYWNO>f`BwCUV6v^epH+EZX&i_ummBU7WZs zeB>*p*$KjSZXM_hZr&#n#Z3;!Tc5D4lsvqCiMstn@Q@OiJroRHA2c$%W-scaIc1I@ z-Eayr)zMxWt=kL%&F}G1wrAdYX6yhh{7(tvQBfiZAYHCc@k8plKWn!C>X$Nx%-YH*Gxhxlu8{LVG6}N8^mZPJaaV}{bTWPRm6GyDtg0?k?7`* zU{%Ja389=~T3#OqSzd#R4ff>|O^EjMkxrX8Wx=)PuFOLsm&NWag^?GqS8v`5S!!S} zybp3|?6z8;4VdqGZTb8(KQmtDp;nlOlQmkQ0{jx}l@$mEqpgtW+sj0AYxyZs`@K+PPJ+Q6|_U}3@#_H;z8#f3+5 zVNg=$^5+7pnR)znajem~)`z6Rg-PzT>0#jhSh>aG>%RIBg@N=kXZj;m17>n5uyF{t z9{s->*YIDgIr!g*sDzhUCf)OqPm3LOX*Ib7p!5zF+)H-_m2eVajtQ%r1)5J^#9q~N z^Cl2W$YBo)jY-&|c#=?jU_Q}pUhqMmW+^f!fYCA_Dv%hRQ5kd5nLQs+E1i%(g+6;} zAn(*GLT1setr#9GI0_5eknCB+wPQ8UI?0`XN_WbATCPJy(Hg1x^lqzOq^^@V*Y!bG z)#!vY!Q3$W1f+9N?s#MiuPAvP>Bk+4Y4`;=BZ(sI7DP(E|K(Bd{ZLi2%dO_EckL5) zorc+^d`a|_JZ4BikfJ9TfCJyR_iQD>+0J72N}~7hsg(kcsTnBhZs9ZOxXg(Bkj{|( zl*{dcUh;`%;P)>F+0&!WTR3pb{BNn+X;tN;MEeN)Rm)5H;6&Sn@4I#BSxqlw!^PzQ zC4bOpdcDc8#LcNu_xBWJHXchjtIi7BQQ603fs=xj*S+r}wlxS2!@)e6BX^qe=TQ*3R!G9qw;X22?e+{rVmAkB$MjH3WG7Uew(LdMYUvv+^ 
zzN3Fuxc`lK{4+WKFD2($j;<7{{FjJm2`na5<=H6UmuJJhEIx`)jH~6krMp~>NHwkZ z!d7fE89=0lvdikf-B9h2p}t4rL+PaeSedcs4iy=XaD4k9NhtoXTk}o^@r^s{9gH)Yp@26yH8_6bh)cT&+ZVg!M!=5~JT(*7FTJ zyM1wW%26?yVBN`>^IV{igb=g(-9#YyJPE{fzw`AColySu9R<`+=HJJf(qSdwU zpwWyr*G@Wl4M2Xxib{D@sMW#w4Pr`g*DZW3wHwy){v_L1BE94^*URL|dzM)H#M9BW zLd5eeh;|!mgmVLILU7weX04e*-A)?h^WrOh3UQIAQYQYKAHETpca z<(*uV*C-D;I}hKsNW%}nXarE~lVfV{)NX=6$M6XRDZ)zmRn_QZY|g~mMS7h%wX<`? zePsK@^i0756}vgUwRX;;dJvH<&rTI4Z+pZ0ZL7;??skLZ{|;Jp1i$@^hfy2h@B1z) z=PrH+0SmpEvLFdb{tklp++Lfc*`^rL;st9;tt8;QvAriwI9beAIG@0p&Jf$PTRTk4 zls;kjVbqS$IHDSQw0D}0n$NI=ZOMKIrH}YZ#C-<^2Y9If{uXb5Ax28)WQtRi5BsVF zb9q_Z47pqbB*H}C?gT{qd!nTn30ALK-0?R)uioAfmq%>^P^ECO)+wF;X!L`_F zNLf7Y_30IST`HhgVl}n=JI{i7vHL}1D-EwqX|6Jr6AYJ^=87g#cqD8iz;3hN{2Fni zTu~&f)qCAu;1fxl^EP#Gt+ZfeOx|r7TDxo%lMf$$_&ez0^!90YMLiN0Uqea&DD`=O z@KI!~luAf7X6~pjWZ=EHaHqpKnn$=e6@&=h2Y&{qwXOV8V1j)Uzbeo6Ce2v6&S%?8 zB5Bi$-|R;4jsCP#PNJ%x$KQWM=y(_c05U@4N^5J#Y|$Wm6r&AT;f9R+qDhis1a#P{OFN zgTehQz|s!_+ABN~9E)-j3>r{aVvmB}TG7py3%+AsStNQjMl*Q_eph_eyr>ZrTlDsZ z@v|%OMR#UjsK33<&)^{&(^0%B&i&Qz4cOc1y(*5d%q<`&FHzu*`8ol(h%V;b&5q){ z=hxJu(zEHd^E$v`rI)_;OEyN{tK!JgRk|=iC0SII>Xf7Ps9+P$&3{QL(1W}r?iEC# zO8WkL#vpEz6=k419Qd!H{*VGa zm~N9^P#x?{~)w2j+hLHF}Z!n!@?UzuX{OT9^4BC;7RXWkNej1=mIg*9B%C#(&TV$3&k z71@meI?HJaX)C3VbZP|cZVE1h=1uU)2-;BUE?4ndaT&c+NEhd037IXbIQw>IxXji` z+pAk!Ax4_({Tog5EV8`3L%t%@ev{nHad8Vrg@)g|M1SAsMM3Xusg+m3b*`V06gMZ_ zd!5d8-M78YUHPS$(e;3`8;;k$Z9?ZEk>5e|g0@s^m*|d7Q%y9be6vv$5`sP>fbCA9 zm!Y?nS>Yp=3+F+sv;-$Y+p-7mw`GF`6}$9A1IFm=+pKo)w*jt&${||1N!^KXu2LP} zBWe@FWQN}MUPdd&&*GbD(DEdC!Fkt@trL42_M&lSY_GkSi8igK#+ql9nuR?Bk5FF) zJ0|d7C&fO=$iB0iT2H`sBqsI|h;tIHdZvs9w+Wr;*eRwy5sataV4STs^yuWrZ+K0z zMD6L?`x^}_fxO`+CmPtEPAN8rV%ZZx^vZWoC935LDLWTB<$%)bfrT;ITQ%pJ3@({BI|7-L zks+xB@9JY-%%0F_z)Bq;zIGfcJv4AIE-IIKDvh9*U^vakbGo_EukW4ey3G;Wok0#o zerrB>w>?soI`%nl^+Q=%lqJSp4bD zJOjx2?|o{j*S<@ObQtdJ9{LU%QXyNY+SmpZ?pvnF1w8eVE=Y#673Q#tL$_N1*UEc7 zoPeMsgbU~fF`SYEq_;doM;)NZZjf>W-))tIq#I%|ks**>Ov+&+vL0bNHQL*Tzl~Xs zcin~Ftl>J8o90egGwjXs$@6K4?QnJh2AJW1rRH_OywZ@kR?~epk}7Lqrt7sQyc+%` zrA!$Ya_c*&c6e_B^t-+`0e&M1J8UM7iR3$5uOzM;K;$i^t7%GDqs)o=H;d;22=)1f z(7lG2@NAQATHV)6lL*)%YE}#r5EXnI7iR(O`>##M?0GD=T*xetorsr+3bU3C-EV43 zT1oBVN+5(9gv{rfA@<{ljx8I7rtjQjz2B?D2&?UQ{MmwSqQ~1VFFe_c>~?Q=`q~`B z*Ikm=m%pV_GW?6y{XZE1zmEa%e`TFu-}`>zKtBti|ELi9Yk%kO4rVcabd!@vRCDM_ zd3o%&gIMs<#y0<4*s<=Ev~3=(d6zE{)l|3#Uz;-Ies*JSL)Cjmhg+W*!x1+jm+?2S z6*#NBX1X-4m))F1UxeGtiR!UcCvY#NPunuctZ-eIircsSqQe-DpsL$81$hGDj?i0C z=N-57_a}zXRn(56bkRNX9Bf16SrKHqAQ{mJZ%a>4o#K?I(3<@2;#g15YeAE@zqk=4 z94k)?HG3PMv@#i-`k8Oz6y)$41iK5Uz=}`V-zY!bk^ubQvIgs!+>(LV$;A7MP2x!@ z209(_${E>|5Epw{DMvJe>?q`v8bxH2StsW8OKHdKmTXwc%A& zhY_v5`qUY>hRwiK)*-U3wBq9iK8UGh5K^7tc7X2lA8274XFb1zULGrD9b`e9OhGT0 zfn@d+Ce7WE_GU@&x_i}&iZ0I+H?ZkstGCsF`R_zz{@NY=>+Ao3R;j69{|?gn4!SQ1 zc@3-3e>BNkeGLIZE=?W0s(g!+8L~sQLhzxw_%|r?BGSc^mYC01MH{7hpxZ=qz0FCc zYwcA#t;Ra?kx4mt_ee_+Zu{iMWbC+?XnCSrWy=O$5P+KkJ+-k$#q)@kTA%|4h@JQ0 z&SJW>Lt9)N?j%c6zybi3au7ZfW+n*5pk&DK-oHXr1#R`DqjfpW8@!hc3M;oM5Diw} ze%~UALosW3a;fd5Ho#s-FzL=zbJfn+jVU+gwGos(CE^YW1D9tMlg*gJ?U)nonzRB6 z&U}$x@*Zg2po|;E%|K$z?_4-(l`t^%xW?TeNdADn*gdgIw=4&6i(NO|QXyMcUI()M zYvPwyQ+iP8DM3>^+=5=#K`ce_rw+;sl8pO*?sH`=Scl1-i@NEbHfrUR_%2es@$YHX;7|o=Nu|Z zZJ(VL11w&IbXeZ-+T9%cWCqoZJCF+gQZZiih6pRWpcZN#pB~v+8h5SY;o_6QYtM+Y zESfV%V;4xQUPikSCX+%hX}(D}D}MjHj#gvM;zAgKR&MT+Nh$p+X#k)uzM1}a@_PAa zksty*XP*5>o-@EN-TwZbGk=8R{}$8#2RtUCu(Z5rMKd2#CM{t~$3OO{$|j$iY=Fnk zM@n~($t_>qxU^F#tAT)-)Qa1RmRwGE>!D>cv4-g+>k#=c{Ao3lAx{~}jAd?w#$CcX z&B9bxz%VCo{A*)XASBlMsub_LrF-&&+0qaBiq)~<&KF0oUF9*J&^zy=S)}=(V*$!r zJdVK zVNEP?Cm>9xGsk^akmGG~#UjmQZ~GU7$Z*ikJvh(t$c5&w6)KyDu|BK3M;nl 
z?QRoJ?JSlB@c9J($)rV&BIyu_^tEeyS(*R%y8(W*dOX@kE#9=~dlz-yEvQSuuIvjd zn=32MYKQM2?KGoSb|#YEKzlhMqy5ZP5>`e);TAuJx_~Sf*Zn5&Y4Iifn5ZG76Gc=n z=p~7Ewxv-LtBPb6p05#KCK`|s@aeA?9d@oU*yqx}NtBC>tf;fuHK_F*L*JHX{>+|d zFs1((_0F#K{ZQhtNkV^d~1LK3@75k)h_Wtj4_*r@1QuSB;<2)I3`9XZdNe?xPzZLAz=20iKtD3kTFt2xoGdQcGrQMN`th(KzmLFWnaw z9a=}NUfMot#FhMxtZGIl@?!g85cJ^6K@L`&GWfvu;Lh8E6j7J2+k-YDY|EWq2l>ll(d$c1%JgGvlayukA8U56M^U_%M$q~|wx?b?t%Z7e zTIF336eo&;9=aYFV+*{zXfhiebMNesN+AG^qRouLkjsyxa^TF<1v8$hx;=*W=0i(& zSE0!N0;T6~AY5Bk!TY3$A;&E%VJ@fLSNW zWlZ;CRk}`51?v4P{XLFj_)H6mLa;%2u=jKa(c~+k^&^=(PCH?g;>3j27#?}qo)SP8 zsIgiMTc3(xt{S6uWCuq#d4H$`V6&irfhOlfP6mVyA|L~x0X)24IiTYac4ff z?roO8`l@6eU8nMUzUpKVI5RX-fSU5Oxbyy%rCek?-)O;Hd$?1|8D+sFTyA6geZxXX zdBF#_8({pt52?@apd9{uHHG7D#2&%X*c@MY^6hQ+TbAe#4|BW2+iKI-)T%!jl-;t- z5;_K0S#IA!B7nnUIv@7BAuhBEAX$Hd#rn*xq^ypjs8G|fhC94~0xtR~Uh^b0`?M{U zQsgJ7_LETce_#*5zW4tG)qa9%e;b(L&#WWC*@kYnrBIxA`byNqShrhrhi}*=&|;m- ztcG0XjRqs-9b}X8sNQULd=_>}^Fxeg5WCKq(Px0cTq<=KzzGWdc!Uivl*+N(gTQRK3G%xVm;0~b<-^IhoLj9%(g-Nu_4n+c+qn4L7VC)FIjrzG7%728+wCB zErOi-oCa<2CV`l+t ztQF%5f5^Vvr4{x5Ur7L3aXq6Z@6O+`?4~7XT1}xAW6nK${l~_;6a2#R-rU(k*B6F`DsspR=N+Sy`#LHO0^m?J&TQ<{+7}esaa_%9`G42SY|z2N4{|& z<$f{6OkI+Z5v&GX)9Lv-uO-pKQ1vG)3=LClbOAZ&bx6TMm)D#J-YI=^Jdq5N1ky>* z&{6H7eW3=-Okojlw zNRtkr9|*lYr#EP=_Y{35XTmp?(h~ameH)&^FGLi($X4=YdW_+NZ&mK+x$`3ziErR zmA2||*|z9{mUu_8#NqT(v$u?+y?a{Yjqst^%X`R&yahwEX;!(k)76l0kh?36b1E=p zk8cdX{#bH?r%}zK?!+a_TB}`sf#4O$^Gf{jkCh~}uk~>P#kpuzp698^G#2gyftEo9 z8ovIw@BY?nO3767vPno!&G6pI{Z~s5j!S3>a-<#5Ot&!`RmW?~@HLi@uJ^NsFP>O` zSgq{*^^ZLWnPJ7oOs{~r^7XDBKTMg^EpbCNACfa%`Km4SP%hh7#FG1sM_^_uZLFL` zfM&yeiU_Gxl0O_>uNEV2#RC(LpeXb%0925;dZiJkYC*UcZj(5h5BO3Ly zBMfPl7w925Lw#hk75anUL30OuhxtkRB>?u(ftpOE{)+n6g_qRj7=|Sw9suAZYF_gL zPzcEbS$Y3yk`0RzwKPS8L(=KmF&oy}-4>8(DT?nwt;V1#KpxUy?ZM&NvXYs`NelUx>XGs>(=mbNUp9g+G^%DB9mHeUm7?NzrLQyNc*N7|5k~h; z+i($Jby)lvFF-KWub=q%4raW8erx*qv0Tpa$dE`yz%+ff%)}rCCCEZm?hGqyg6=P7 z>HjXlp3R@We@#QlneZyVAGVJqgc~`=18@{kNeUb<_l%DEh|5(+LO`6^Vy_tD96(GJ z@9c{$;!4Y7##y-cum7?`U%j8b6tK~@;tX|W+R4RNfU8;}$tGTxSjP3pW_BQoK(0=3 zajV(z3z*r-O%92k>`J4L{?n!$lzVqm&;9w%yyntJ|F5QIXZMAHXTZfBfv;ZQtY=5m z$;Xh+6yVc}dae%@q1U)0*CZJ$z)S732>&?i79zJ;I- z^I{*qi_2x8{)v``W8tE=#YXRl+GhMN6JPBk-UGBPez^z@>l@~`zmVKN=t68Emn7+O z^HIO<=>3|&HqAzyIiUwve&>|aV$z=+CIP;;-EzZQxNmDyxB0TC=bwHUJ6SvGKj?DA zSsnF}vpC!Yp1GEslr0+ev{u~93>?t*^vCg41plKct=C7?r1w){I z*b(XOLnWlX-D~)=6GTXaPh=e z8TjYWl~BFpZsROWeCNV)zD0ybj3vT9Ng5gk>~+aMzt8@2Kc;_$*MVnXmtk|Ge8@ed z3-T=XGtf2c_Xc1NujW7K(|~sw|KJ&}LRS0F0yqC7u0vhn#YmDgNOr!>BgCk}cVAV# z|IzvlRZpnxCV6*qCr`U2TuNz_`_0$HM=Z1Li$^Af?uDk~b~!u9N4L`ELu_L%Z|Jqt z80%AW-ibd4SlncJnwMTEXe`YV$tR9Ks_Z`c0I6> z_+{^c4l;X^0>1({+T^iIzY~BD+FXvBtxT;ys_|GP+%T@suwbI1=kv&yB z&gY&%1VWF{V!2-{_|wyKub<{!l|#MFi`yh!p2iEtgmNz=aQREKURz9216myaVXZ_4nMN?j3nKw+(Gu}0uKqlOjWbF4uGiMwxuj2sw)&*)dQ{GGU&iD(8z^P&391K0 z2%FBW7Y@O=xt`D+v9*u6y4!xjyPKq?QpbD;J?9I7C$kJ~Ot!1@`(2@q-8D=Rf}`QZ zPivw&+H3Oo1Vy`cD2D<&_VWpSujU*0M`VARAZw6d9$N7Cw325F?^Ov1)1rHoZYhMe zd+)|1srMlUS8(thbl`yHFJ6_P7%2V@Dm9CK_+<0^El8&?7ry5^h=8>glO6AvyPRy2 zXv1^)06A(%h`We&wM`p+SI$YN8O|TOBwU0FV4|)V5W8$<1 z_`TEyT7nn0GX{icCg}wB$+zIz=ycHeC*-mvAaNURJeLc{ZX8EYJS0fnQ#0JQ+ zfwKTX3&=2t(d4<=8pI9E_NP8;Cz0EF%yWbKY;zdHY1!#VpNnhglnb_|{pIiGgNfeG zm@;|?`aHn>Kla``s>yF#8x5c$(v&U&Q9)4=P>M7mpdwAAD1;sr1R(;_1SttcdJ_;( zkf!t&5kiN6GzBTr0-^UBNCZM4#qafZ_C9CtGtM1%eB+$&-ZAdoe*vMdcg?lteC9Ks zN1Z^*%XV^hE}Up;2vmdKD)=SGutKbWvNfKvp%L1;=%PdDXjW}6{y5&pze2ins z%%VDGI49P8t)n;iHqORfuliB9AY2J+Xl7}M+W(4s`ol%ot|l^w`Ro!yHb?nCIfXes zwJG4e6=3>xyt-HBoJi?W(6gUbO~`adUme}o?j^U^NydzSL}eK6JUitvoKaj1(`D?bWy z{%Pe&ze#BN`~&xzbuU?Kub7jl(&|RTTZ|zm?EcIO@1Br|sv(EWsU%Mw%X3*!oHv1t 
zqGyRxvF;5y^|4)*A?jy_A6}olEfW^o#^zXm>@^R)6K!4i(a6jax?7ZEnjg;$C z#lM(hup$U%e@Rt;2#^;0N9}2)DAFkmdxeN`SK1zRmFWj$?k@G}Z%__mhJL_{$*0N_ z?k9kg3LK5pU$2}{!9FJSxB@AfaV}A@`!ggKq~TC7zD7JSO*oDnFzHs;SkwE54DGv^ z_S=Rx*}kI>54TdEH{{pGG8a9%#PlR;(Vo~9=8W~Me|T|(&hcEP;LEXOxMJvUxMyc_As61*kO^s_aaBRHz$=n}enD@@aE)5A1~ zIqyIAfiV~8d!hah(UN$;7|@fJ4_VX$Z$oDd@a6!GAhe4zkMTn~`ZajWepZNY+JFUR zq3;pD&z%V@?UT(=s#{z}_sT-$7GrCTBuO^gpmF~I#+ ztvGdfMkfVB^SZagOx!j#fa?975pV?x1UV;lFE%YiN;9ey9YZLrkYy*tPu5H>I5DU79{9k20 z-x~B1)K=N%3PwO-%Gb6nZL>cHd~*j|B8dHLKGx^-aB|nQQu;icF}-X zLh#x7x-Wim{hGoc5EFqaf)au=u)*@6Ym=Si-AC~I5)lQbBSUNlmVvt9@0Vzf!pWl6 z7ZAbtznu($l13(oac5xXudR8og*(-XZ>V$sQ$6^nHq_EB#~Ml>?E(suOMfp+O1xdV z82-HSEdOdm*9naR;86L;CI8p^|5}g#{vKfdJ@~IK!oNrF-=p|nZ~w1S_y4Wevk9`A zGPfhf4w=+N{N|B-j=r#rqd!{DJf5XV9LUnXP}Ngm{OBUdm2UagPPH(MKk`|8lrMn# z-*$WI-hQKRdXGAY_~|>TywWawmo%A}cuzZ<^F-s@-plSDNhS$wF+5&!gF0ox=a1E4 zqIlN34hNX40mcQQ$a}&0TJF{E9Zew3fbJb7gwHk664w;m&6AH4uPg4hqPmddGqct8 zUa_2}SMo3Vq!W%|^5U7tE|iDD&s<_FhP=t<$Ddy>Sh_VM|M!8=PA)$bi%H^7`|$^oB*pZAlg>#$swKN{BLPXGHg z2L_jP%OxLT9J*{OIRCn4Qso2NtoCE=U(Mb#O`J*xB(>dD+pWqAXlmICb9%M4R{tag1kbMI`E#+~omV9CChzp|Nde^p^Z$ z?N4Tlo~kE?=kY1t<2)oz(-}Yo;t;(BX-w{?EgYhfg-D@>Ts+ zeKWGYADrfU-^UAI!EjN%33r2c@<=&6_Cj0V*P=+n@5nJ6UT~wpsY1c_k$XKaw3=)NAn?-L}Hv_Az|lgtZKF=v9qwQ8JxHl@xx`lc$K2NN_3lKB462j?_E4d zuOORqb*Br0UinHnFe&pYJPz8G-|3G8EWc1p)&^%ZO75!f%9HQn`U+=!mtyaFsgO2Q z1VQwmMp=rn598>FX@JdHk4fxcvjFW&GqR{t`i^Ht=)JV*;@l@o z1fVMABy8g|2#EPh&O??pQdEYtLltdu&RmB5v`VyIuIF3E<7#N?Lm$g7Z(rxV+@UEY zvv=Gupg+J0aYiyY`nqK0Pc^(ZI%P2IrVbv+Xk7E)>OI|Eb3PxH)GUk`vSA}~ytid9 z#}6H4A~H4?Rn_U$^7+s=Jb(o6FP*-3zpezVgPvE4-Ij8*c{T4ENk2`fjM!7jjrja+ z5*|kGAfnokY*!OEEc!;p-0-kBu&Pviz--x*L;ebH!;=Hd>-^CWxIw7zks@9a>d5oi zgd~`(YUR?YExT{S(rralU z-OxX5ye+b~ttLW4sqs%7_s2Ui)VR3A57Owv!_3qgr#Eor>^1jc2HJB+Om%;2)%7r} zB)QAuj|$`%&?xarkJ!Y^r+X)mLENy;MrC?=IZHi4Vysz^s+-zIKZkC{)OiTGuw-*? 
z)@YE?a|S1;x}FwHYl$VUfWR7QyM^UXQqSD?fc4b?T(}9&Rtre9+`+c>xadq)2Rq_3 z^%AYrpM}V88m%*^@T#py!SBUZS2Ye@%Hi|C zT5!)TM;+#uWnI=5^2099;N>Z{KK!PYHBlWIx?JoaL=5>8K-II&xr+)YZWRz0N;~N3 z1Xshp>P0?mE1SsV^rfmm40 z1b5n}>TqBw{XsH*SmJk)rjju5nn8!JcnI~Ccb?NIRzcyuv4SxoaDbWgDv$b{U{g%& zf(M*=f9)7&nQv3mF8?Le89T*qsAa#oems0m^2XseYweXDyZ$*?IffPU92wqxy=iDy zg?-vT$It=G!?54MoAkJ3Vn)oFpBNK0n$0HZVEN_ePm3+}pe8s#Pm7qTk`8P<#i(Dp zeqFs(`%J7SM$qmV{?QdT*lSo#UqJvz?6b%3llEf=10hB*(gax!xvT^LBLRZc`#?)Q z(%*`J>R@-!uO*Budh799hbt~(m$4*L#cvSA;Nh0EOZhhinbX%d)OnpcJ{_rxjaPSS z628Q=i=3V5FkoA?a7ksy#T7pPbfT_r2*@{WnnCz?2Dw{}b=Rj}NGvmR8eT#;$(f$Xa)5ajsUN2He zjSuVPQPP@TK-mDiGWcgrOOWExa>=j8LgZk5fsg{Y&fqdmB_htGQ=^2p-^Va%u7~l} zMZi}h2WWsxydlD3JhxeH3mt~JojYrb3shDq8IuIbRLj~J0Ry$M&8!5wkHSw570rSX zwremYe>>V|e}-AOS^jX%<99*vPEG5O}n)bR(_OtKI`Rr?Jrx zCseq%Z-|a-=^ZGL!=aZHkW~f2nH%owKNkRD92YNVoj8lRSXqkhZxHGOaEJ-40_=*m z2+;_=(p$%wl;n2}!v6^5y}ni|ak0kbI8zhyG81$-TraHT+D3h`BE$&+TrR@EE&Ll4 zf_q(o(*T_Q>Odd^-~BSq`P4A^wexYEYhtnilY(@~o4iEE4Q;F>N~-)r*O?rMI-JBk zn}tuB^<%fZ7qvF|?%T0INY0T*>!)8G%YcdcaFVJCoH1Uks{961D^}MuvL3*UaZaQA zp?8q&vg(3w|MV1J&vN-04|^bsk^2U}vN=VGt{o6IvOTfPOQI@r*x?AaU*s|kMdv0X zjVQdNN>`ykrr-b%m=b-YrsgEMe$Mty;g|$LixfYK)~0&StE4jZ7%T(pAW?v?AvjN7 z!b<)M==D1mOVOgLj@(3jS$~0fckEgKocyB-*eo94!v}k);r`I+a}CuY2APgyp%2A@ zsjjurA*s|}g zGKwJa$#bY~>BuZJ>NWb3^k$jei>-3z;pITcQ~B+Q+q0Y3dRgHw=aPbD;H{13l&91S z4MQ&HLo5R;FEbs(q{W{iFYg%E*Cn1QXpc?k2LQgtgS{q`$le}cXn9f{@-ba8(F zcw}B&HP+V0fRs*<&ucGJD-&QI{*zsfcrlH>lnZ%)u}K14Thzoxql2N99ApnBD*-u3XK?hZblSYG`%x<~d0wgCon-^?<{yju?SFQQc85*=z1 zp9%TNYGp`z@lh#6a5E2iobuR*Nq0ajQNIn&&;bgtoi~WhWFvHZt{dL|6v^t`BSHpP z#r>IzG!%lz;#TNh1b>zV)swtIvGIX)3Z6H2lu`ax83s`=g_I#)`cUTY)K6YYNtHS- zP6ZzC%r-v)i(%cq$nILEt}YzwlNA9NvocuKURx&$aavH1EB{-{c5S^r=ugT5LgN@6LQgV1djlCs-`W zB1Ci;J7jZ)R-wk#>i|Eoabqd!(t3adnW7JWXbN7F4zfYbWko0mdXoB8d*-jzQLn3m z$NR#G-d}xjx1OU!llqukfV&_@wo8Wj_8a&?7g>|?ge;+xq+8XHFupLxIV-Uj-wLS) zw@rEBJtv%}pXWUcFyLVzO+9FCi0V6*W{ZC4mG8JubmIH&!9V4eV2-$00q98GOF7PQ zI+Zc-X7oIBCyLoe`W~DHoNebH1e+0I$_n^8g0oo>5Iiix`u<&%0G|i2LAu~%--XLN zC|&wXFNvm?gN_z@1(Zz{-!u^YvlhOA0H=ol?S%|)8-7|Tf*(Hc{;HRc7}EfjI8JLS zMfMM2e`Xz5lhm4ni$@#RzI*~4aAO5Qjea@gPX1KwTJzvH9JpNzY87zJAgP)aMEh88 zF%Gk<#Y(7QpG9+i_kkmS9T}?S`qaDJjeg~i0f0at1q}akWw>zYH}9j@cWs4U%fdS>EUbY;Gd^y zHEQgPTBIcmZPyV>nj!=9g&IA`AJ|{_59#IB5%mb2a#WkIcO+GJ>1z4RSqC%->5#=t zm-m6p2P93AT@F=pjTtU^^!jxx*LGv)_o-YvUvZ@PIlZ?d7i6x#^*Y3Ej#TLj9K-@p z^M9()|Brn9=@l=hys6&mCq=noOV-q)FijhD;^D{_A;BEAy5|PGAM>Qz>qK9b&pikp zvEEa8-1^w+H7@sFd&uY)`}*m{5X&yk$yCEFu@UE23>}Yp0!}ltTm78wLRoyS_JG_T zML#r_b${+znR0|_v?zAdEp0?!a4DYmrR1?SGj3=<@32+byJ%jI?9S3HI1*y;-XpoZ<$T;Q?c?X*tFpJ?R zd86Pf{tSBNN|U)^c$lvB$CRk(m}~#e7W%(c5CK75{=; zOcQzBCr6+^T@2@ob!_Fdj*O^5hjmvs@q2nt)d6*AtzwIXaNgy6<>5z$*OaX6c1x#) zWs+|X^Znvm9Z^g+@ZBMMT^e#R&>f4ntUh92x1^DFIpZmBv`EeKm1iY7g?Cn%n(WrT z)Wx2-B9;ux(=#6U8g=Rafe|>~jKcM6YXk4|S1&QmyW;jcQq+WoB&UD5opgBIi@XU& zZ^u+^bH_?=LK$79`}cRB5R(>O8%6!3{4F$ayH>NvoWw$FZbH2z}FMy;)oYzS?uS?qEp zDyuhi-vnxGq)`WHK?p&c4KPQU%TJ!zZR{GXN{wR>tTlM9>dm8%3sIo;1YUk8HirBE z`HX0h1E5YGDx8cggxB}>PBj%J>t?#a`k?rdITgI9oCKzw{1&|ErLxs&8Z}jFvJ-%N3?(7v`w_)g> zB_WfgObz_(#?YdUa7O)y!-!#MdnPX}-Cr}0mfjaDD_}+EJ!GH5->`=0R8=K~g~>f2 zU*cgHjC*USYy`BTP4A*gEuWE==T`tfhGFlAr=1pV>Mt~Le*Jm1KJRP0(5%ZIikX~4 zF(!lOdK*2~DW4br)Hai}9Ha{}TiByO#$UJ)>G~il`itReGuc`U_%%yBeE zr=+8+DS+uCeGI~SHtL)d8g!Oz2lm78ue!YGSY4DA#76N-55?5cFEs>R-h*2!($uu&3e(6RB#qZ8)K=i6mzOPrqZr@^f6G@_!J|CM2>UljC@-6 z!_MhB9i|6tgvs^eRepmGwt%AUBw3bJLiTPn=i>H)BG+&c^d&k7YBeopdj=U~``uS%cr*}p{$h?-ebV3oV4 zeKeqiJV7=gKJ}M(KJTQ!p?6YCo^p;Ni zwOc6zeZF%N8evru+yMZHE z?-%2@hWaK<;Js8$BK#}jG+MzodC~2nTZ;T5KzLGv_*k}y@HaI#eA6~eilVv#)r>tE 
z+#z^1%|2DMw0LF(7=NtsX(QE;O`bmKm8xJ2bQ^89CqnU&gyJ>z5R1Us%rFed-;ekZ z{!-?)xlx<8+Kt6A+wF9KRYfqI5&W&<<~7Bcn3m>iFR)}ifMZ-1lB1ha&~0Yz)gyl2Aip8_UMeF&aG1m3 ze-hUF`nvQpK6DpHAwI?H(rgHvT^LrMcoOB=AMFXTABmFCFrxWT=fG)m28=LoL?xB6HzU4e5+!JnIzAa@W8fyM7Y2ecL za^`u?P9z7}Tc^pdG>B0N5{UW+iJGb2Z`+`WoBqVVZ~t}sDtzyeI$j-IC~^@f$@s?2 zemotSwGPRIFT;4~<6xF8Nt~irEi2O)b`~0oxOTf_e6kJxV)h${#Ij|q$HPQZh@(F@ zuo?D&^0+@7%J<{*#>u42&c=LzY?wm0m*rs^k19j(XIL8FJQF6@Ry31Y32adk*^-n& zxR9GyFfz|1+#A3E^raCPoD$O zsm4cO_+YfbQFtkKP)8`_Y=|{JZ&Kk+=|J_86FoqU3o$Xnxhs64OAq7im~bm;`vAJW z@)e2GYNd-4_A+|&H+2JJ2AHvOCX#B0l6Oa>awz2t;8^QyHZa%5O9c35IU0U&BXTPHoybN#)b0&dF{iA2CGu+{07)NFwI;T(00RC}-&7=6kMfialFiW-`B~6dYOKRZ()2$6H(E30|X>#X6!A zlMQV9r3X@5PX!WheGNSSW?4(LPOkeBFz2y723|k~H=lWiec=;ESFwM^6SyLDf@T2B zgg7HZD$Y%nTLKI}6KbkPp$#s^vilono&|ksq>I}0XaB+}F8*iiPssQ9Co)?cR<_LW zHeR2f;=q~=ZVf(jIjQ_5^SC?}G6BbR;m-dC=^~o}Dr-J?tedn93!w`++Fk7&lrk7F zWZoz%8-}{Ed~|LZNn5%p(oq$mqjUf-0Z>%CI8kH#093e~JaKQ}xqWke-?xVq)Jse` zBl2+liz8d*ljqNVzt{+GpQ!=|-juMz;_^T77?m9D5m|251r)4Y{1w!Mnrh;GUc81b zV1U+}cU!Xz;m4tUyliQw=5q=N*%=7&@YO+r{gW0D_i<^Io7^0_DASCM%$}E;!b9esbelur>gp&9KrZz|79hd`dv?(qk$X zc~R7C7`rujttbiDn_*G_vc*ht{YW3pI)Rq($*mcgbvrB2ih4D0(<6r*)vbb8^GZ~@ ztHDe4^5I-KVj-XWLBX5D;w%I8C;~%eqtzjVRx?G0JmI95A1>YA1G zntceVU4T+2ygs0jG~QmEi;&-bsgN*(%NvGyb!vjF{uDKZi^=EDe3*ThoLbCvR*qo- z#z&X)fz0JKdiX>gs+f2LT2btqogvlRtsjuz+$ZRV^2B~B@7=?#iQAuV{Q4qH!%Xs0 z_kM%^>iFye{jjVF?^!I38JNIPK9rq<0IEmF{58MS)(=R3s?#>V&7&&ANrn>T|Ay{t z)lj+No!}>g6g26;yg&Wa%D7D+B;Z)nqlcg0Jc7mmefC)LtGU=)&ML;kIZ@99N{-zO zmRQ_AO&=3sBwYBBW>aj|&0o`mkoEVYecXld_$-(Vd)8$<&h&mN(w+vQtzv;*X2{Dj z$;Py)TDLBXgEkx5=Vm@Do$JvSfr}p=VRszoQT$HIaRrhcO+%aZ-^vX)lyf$K-XW!< z=uam;)i6u?>|&;gBtE)}6nch@j)!u%Lp6!<-jiW;>+?DscnP-UtC8cFjDN z*6lm+7a-0*FnSmgXjBWL2XPh~rdwoj5SBPAUuN=(U)m;_(9@M_BZ3_z(U*Qa)}|7H zq+=zNe#1w$BNgtD2h);^kzq{Rt923mz!fZo0g(g(odpeJL=&tDVX)?b zs1IH0Xv-VIzZCP>_|u(;fWTp4seFUE7C*#3cmfF^t{<;3Yd(8k(niK(0~X?1>SXs} zQvR-&zWQDUu$v`xX?Zt!xhs&(9y9QOX#?$q&t&V;_GK(Ih`0(T&kIf$D_b=MND@Nk7j1aMG>6ND zQqBiLDj=NxnuPsGNzLgD_r#oq^9|8{5nHxg?e6&)1_TRY3dYfRIY+Z#Jq9ney+Vd= zh_}w*n7(GJ5kgby{)p)^;rl64(1R$@;mhLb$u7s|JKZmf@GYc+}eua}NUt`s@Mwx*H z3^!jVr8eA=5=h9DFl{ylGF%mHz(V0@Mv+Q4^2%Y^{fA&9b&kKIA-Q589lV|i>cUB! zzJ#Zy_%EP!_@Eu>ruur3FO(uu~n z7a)0R=-=AR+rL4Tx%JC6MCSWnqLzAYwg5%q7-y*3WwJxN@9P^D0XQA8x)x!Atj#wL zb-?Gs0nu;6K;%oBZ<01_O7+xh%<2$;uKC-ZDpU&&1pkYF`VW-n|33HfU;I@bL+0N@ z|Kgwi{ml5++yB{7v_cof;M?@?bQ5_Eua2q$S{NX;O1?nx@F~XLBZqn%iPIQElO2gC z_N4bHkpqhW52X;mP^G!0vWPb8T%42l>3av~il z{=hoS9@)A1z%H{Kb9fTy5W`761YQCG8;uZ9pX$h=mBY0*(s{q!{_-7ufQDNSpgT~t8exW*d$>QOYYqCHIgSGYsU=Pl4v4>vb zqBjMprnLIsAV64yzg^Joq5^Hv*z-A(Yk>JZeH2m!4cnqVRk^JaXyOM2(v^0DQ?Y*S zP~XtfK4bIK;o>=E@lly7BJ!g!!)W&~kh{lVdWQsp$8N@#YbhanJw6#B*x<1; zM+h_wJ(2lwjiCdnYk}S5FLt^i?NW1{u?_6r3@3*GSTJsMZqNo+eY=Nq7T^ZYe*dPJ z$FZ?*7N3J^qF+OFDQC5*a_QU5&np~nc_gKbZ5ND78NI5Z%EH4i;ltW-ebu8XmFx32 zVHVT~ay*`dCb$Mx9n6iC3-Q0uI=u#qKASKwk1msEaSmb_3t$HUn)a3PT`&%Rvq6-! 
zr#WjWN0j^Z{UFF-b5YpS3~y;&x;volmW33~CO`m>1iiPf*akb|(|?*FEojqZy@lFF zx19V>T*++Otx9X^B6YxZdyF#@xybHDU!&i}WWjLKbO;)gi}``w<(-EeFj*n%kqM}k z2@>FB7>{>d`@;t%N6vE|oahCpo}<{HJs&w|C)V$Myk{;}_-g=|jCpgxL0adLIAF%@ zkY%nvjxIcuul36O#r2kQ*9CBA#+i<0u-}=}lXapKdH%PTb;jdYEYh4mq`m$qxsqi8 z>Ps;f9!nMD@=dFNJ6yoB^aiLBLP#&0$?tEzX_d)u&AjV3-#kx70!u%G^(q?YBb^IW z1~|cHjy6ogXidozA@ROtC1jc3ARR9q?9rx;{fRatLBGsZJ2rXy96*_=FmDDFID(&T zq%{_ux4G4zK|cYa`)xU=jaS!Ll)x%&t^l94-x&4gOdo4D%BqF$*v_NA>Scu^t~3)i z2PM$CWv9>DZY>AiR1hCU?*`n`(_j7k?9U{Oe3?#}9Hw#V4hH>-WI{kcg15r(HhVOngEEs?p*yP2Qh*?iXcTg&6X zwG9~3{|qfiJ}VL@i2m*be3{^_Z-V<%w-1AJiqW(0|C_QMh)TJ^a#+dB2Z0@ zxqd}O`WnQ@@Pk=VcY0&B^qSE?*K6`ivnybrw?Us9JYm6IH~pc3FSKW+M4)=PY}0Rwb}PCp?&jUgQu;#w(j z$U&tFkAFzi`7Q?vo!onZ6m-$|$`m3jY`Bwv1?4=Mm^1bb}Q2nRn*+2Wg`cLZ?fBRZ60 zKZuxTiv9+HvN>0x48|uqt}njp0^_0FMO2AV&v6kYL&l6(s5N)w7{>6;$&1lhB#BNc zz{@>yFtXh|L_$6cav( zQS^uAK*GHucWchRvrZMIB}IYyhFnE(YZn{^j-K6<-^Xs&5pT5VlOuD#yUF5ppf{6n zn49oHIzRcfk9T`@eaK?OqzG_|Kf4q2BVaFb-n zSmsECFy&^6>&#a4H|6vtfaw>T7?t7$pY&ud|3-F2n#1Ij2fAM?N;cs57& zN9Be2rq^6PGW;}{+~Z*&=gnn>9Tu+-2!Gl}A$@7GP5TpQr|GY)Hz| zSyoENr*fCq&+~-rt4gf79z%31hlit8;9Y4l=bqk_&I1Lk&DmWi`*v^|3DCVT4?QPr z9Uik+0NR%DsLc7tNA%BaUSj${G>CfMC~fB|&2bXC;*B=~80K2He{R~mdhe|g$DD1k z2bo}+e=Bl=a25MY^knI$kT^7T8Syhqc#a%x>+0b0hM zSG-NP;j(wAykgK@=W=ALihMfehGzY1mwp1Y6MPCen^8X<OxVS(J&*hghvt%f5Z?_U39n^3nw%$rzIBR|&KEd_9L&@6V% zdsB({*b^3!veVPl0$uQK0wGOLc7OT!Wgj#=G7Ch;l?n0)4tJ^hT>_^#K*}4j*;7CI zALnM`r}V$m>JhgPqp)9!R)Aw7L6ijS%mh4Oshozqfto?$dgxlLS^Gy3M*u^~U#b|I z#~+UA&R)Zj_d>hJnuiEoovN(*pMy_Zg=rf4?4lxz=(h3mkPLvd4zw|{gG)fJ{RnlwRy%DrNNEkJuVM7$#3EDJCPq~XXHZ#3C@un*aM zz4ZpM1VWs3f^o|0D}$9y1P-yMy$kf7F}RIlKI$XsENx{@FEejVedVMi&(fuVlko8d z>GP*EtFOe#0MY)e@gjB-CXBSp(3)fa0H~j_oK1+Lmc%6!1NI`T6y+FnPv^!Cm%L$M z4uQS&KS`eYNY>H}{z~`e0B?*ovzPD<)*GB=yDQ+S27t)nh%Ok9XFQkxwJ6f`t#11_ z?y8K~iIh|w?-RY?yQq}~(s^<}b0}S4Vol~#<>^JG{z4c5HHiunJoJffyHOv^cqD>T zj$w5d;QjIbAOmW1nWqPrhEW`%k?!!9TeB7|L#Hyb)5ry+405IGNB^564wq)?U+*er zbKaokU>s+TqXN-BcYX+{FBDXuT4rJZ4FOLi-?JgJvA(6Am7zQ^FHmCWA-UJ;Gvfun zE^^9S#U!s9)Q$y^R{w?$P2Wh%{+hJMk;>5*mHmQ?^%n6xQ z49lj~T-H*HypzweIMDizdX|NXiW{(8z-jc6q(c{<^i8On)F87&8b!JlL%1IR*Njcx zntT4&GrC|wWh6=tBc9v^ZY7U2%vXI^oz(6Okmxk#6HS41Zn$*z*AF6?eDsCLRp2op zop4pXFH)~F#BUig@7NYO{#wJbyGpYa@MhJL#7XnGoxY=;|9zFKyO2nP(TVHbVbZo3_3rxz533m>ock}A;rg}>s%;O zChLm>p4qj~U*qR)Q=;9Ek{-Vk^cbqre8e_h^3CNl9Uy;4iZ5AryFH6tO7iM3pH${l z$w3+!Md*PNG7_4a)K9HUwC)`N8c+0QH6xhUwmt)l~|ix{DcO3D2t;1%^#*sR{9#EtjBe>V=$M_W7e1TIVt9@E(M)j|NrN>F0H3 z@tbVWqW^;^-}Q9Xv^>dQoh-Y>Mo;Q;?s*@VfuGQKJjorGoS_hLD(o6u^X|R-+G}Es zao3X`E{>kPpI>Lb=auk+q4a%uW2>bW=VZ9yruZm+EDnBg?ZL9!&jxqp z>+^VSCq3zZA^e5?p_1)r_59CFjSnNMjcS|9Q9!0qce?*VZk&wDNY$XY+Z$yplrQdE zj=N%+cG-c>LZteAUlYI5R8KF%jhE|_X#?>VH6DTR zzH7#LVMxg@P8T$Iibc|esm4M~P|-IK-JM!_qtlCCWoGib8Uv$T*@t}jFFm13BK3(J zVO8gc`83{0i0glRn^2&6Td1Ma$tf%f@O^s&#LqwI*!&N@{|eI{a92`A>s6j2W7GLY z{)LW}@te|eneYApp8zUzQ~;YjT%{r(8rfb(Eg_H-ebZ=8v|6-=U#>+^rkE4;Z8l?5 zNei8E82wFht8AE8s)@GI0pG=to)d0w*zi}?8#jIiG@VF2K>k0swrdbNO4`4|#Ee{m zeNqQqlUR%&SGeQM-+IcjLP*vVRsLVYxuwx%YD$ZfKT87C%X}{}zP(}kEl}VAAO`eE z1GFX>-}11^km+V^mnJw_?cbRFjI7bGYxK7Vo2_Xd=~q{q83?6mnYl}+%-Di9tbc<# z`y&jgs)V%>`&$cf3|+=>qA01GmRCx3orT9rO^e6Y#CZd0aX&EegTDX_hY0$gT;u03 z7fPBaL%=t1z?gS&V#A}AM7ZD-G8xmpz{Z>o8RdrvOVF39>uLUwb0=-Z$u zAlt9mG%BV(AnaN!HT`faX^=XxvXuw{0qjftXXc(w{z0Jd2fY3 z?ItO)%9!I}!U1&meJQdPvIiera(Qy5lFGZI1r+nM@0EWAe5Bw%&)NaF3RUKjNyKg* zmv+IG_5k((*M<1;ddk;*<=5`_Vo6$WggCdjAbI{delE@~B1}{{hIPpCMi@A>5$aG3 zyEm_x%l1k+%aC{$ni(6?^gpAJ?`^U|sfS-V1O5CbPHbMMh67&ywq9fP z<3{~W_Um+Z%-|#0zHEkV*Z?m;+K9G@RabaHq<45^{y{|?H`4NgLZ#0g$@-IIj%y-{ 
[base85-encoded GIT binary patch data omitted]

      -f$~tIT3>&kpN}^|GLDa-;F81fUP^p#d3d5=uEZuA z^G4NM|HV!Y)98DJf|U;;AMF;jxkQ*ZJS~Ak5%0JZT0BwZ)FuQRnjbY7nsa>9=D;TZ zf;Q=Z8_tmuQdPLJ7)eMacyHRUY|<^5?ScYwJ1PUwju!78q!RKDHxDM^RL?P&`xEkV zz!u1S)fbthmeum|F!sT{4OI@bDREAG{G(a0G5w_H@)?&KRL`OMW0+SPh7AA%d1*gC)9z@D7jEnY9#ypdI zq*st%>lv^mIrffg1)7f=44Z9VF!8iJyx4ho(J%BJ*YYk1=t=|$>^K-dc*y;{4@o+O zB%QZHVq$T+O7V+J`%^FXPrB)GFpE}D>$Ks@328FXXyeP9HhXR+x92SqJz3ne|1_#Q z4a%S1Kwhfw!B<%PBjmLtiOwu*TwbGmz|Gi4BS+D7m*UdGh{*F~)ldgknLA&E@)oG9 z8O>{44rKlSXh7d)57(1RZ;9YDuNdb;{ekDi>p5`FXDnk$UpytdQ+P%L0k^WtVt`xBO17%zBp1m$Kjq1KW-PLmp_M=zn^d>N z&2Osch>3KbJaz+j$-X9yLoxYNgY@fx&_hiO%6cqps?eu64y(p9SV?yde_7dhqI!3T z-y6BhaegIo<8OGQ-UbXPS8?sQalJsV*4=JM9+;Y4B6%mqm|ev zFFa21MLSv1rM5%to>jwrnr?%I@uQaAMrc^rb`QSC46~5uZs*}2T;>Yb^jEK*0gwYZ6rt}x$+A5M<&H{eG)^+@!fq1`*b8B!|mB{?Cpux3E z;OQ||iQu%CoeY4(IZohFb_IAXSXDu*2;=ZEhHeUsq%l16) z<3^*%TfACu4aIGzByR@kh85CaraUgDiN|w`yqUzA{7QSmueI0wde`0OW+dP7IP7H? z{8NhjrL3SFCf-Mm$yMW@au|Y|rnHd+J2B-aD`o?Oe({p8^V*zAui6;I?C+h3M37Rr z7R00q!v{cldZzbYZ)O~iY7M)XYA+ko>t6AOT26%BLbPEZA({}1k@i9Y|M^5<*nX7Gg=V}qohjY+?7jY z}QHhiNLPXqH3JA zgN|PCy3Ma&EI&&(hZOx1Glo~)8S~=Pjv6gEU2Qi+diI$0vGgSXgnINKPR_<3TPEyp z&|(8XXc1Ali49e7l{R#;7DNF5e~1~ zU|RS=kN06fLMyW1TE!A4_9U)~UpozcRVG1}-mROVSQl|Sh>z0qZDZX&j2xYAZyZkw<;1kjlG&Wpx4RFqr(sxc~VXfUhslw9v8B z+{b*CM>ny}-DNf4l}ELbmDZ9LG;bV{lIAU>A)ErQx+IgXA@Chj0XkHvUfb$I@O1K_ zu!dv+lanrB*37xbkKug_@4tIo;)cuqXb}@*I}L40pgy2USD}mkIfE5EW)HfmVr798 zpnGh9270fLtfqbZ2*GB&}q-X*8P{=_y z`e*5;4Mm?t9C5^Zi}zTVXSw#83Xh&iSdyB8S2x9E9|%0bn)zu(5u&e)*fT=yd$@Y7 zC*ZIT8k6*FGp_pKfk{CMK@sBFgW$3qE8IJs=mOmKZnN8_*5UDk_}~wY0EIIFpl^Aw zDt@*-Xlok#<1F9nUQOJLm_tNEet4bFXQC@m8&xY*`IH^lw0~3e_Q83?3Kxiq+6CE; zZHsb^Q2-s*C!-3HO5A2Jz_8`#0x;Ww5{atIp!?*t2i>aDTfHz2p)gkGbv=cwALy$d zK}?EnA11N_L1srafFQE~^o9tvT;UM9MdJD{$Qd1$z>>vJ@`=9|y3F|B+|qB3xqoZS z|F640dbgiK{uTWjf5FxMr{3qA<*320vqmV-JFiB1i|n8F7T1Co4r_3J=)oa@4#ci3MxtH>Ze!Id;Wm)?18LH(<-% z3?sWB_Fd5D#OUHDg>!k9Z`nA8^Ukjjuwl*)M3JokhY4G5^=1W1=wS6Z55$^->6#fE zezfG4i(r~8E;PUWFt9acmUWTYpP3GXiQ2_DS~z5T4M)FEtBtPb;MfJp;byvxC>~SLqeH+2-;Zk*awJ4-W*Z}onBq&r5n(7vGKL^9X+JQbhwjX+zCBR6= z<#qC@x_x1LN$I39iW{Yf2$U?>!xeIKw`X=&^dv(v*=aWSf-Uff3*h@s;YNSjW9WR> zx&k5fWmY3e%cA(|vuHc)5X^vRm|pAXU^Vk71h!5_ZKzEE>m3(~&aazZUw6t`<8H|JkO=)4kSNjkRhb$ff;sqYrZ(`IdDbEUtEYgEO- z-dcb;Wfk!1&NNl8>UG@T1?fCujtZino2nJ4Cp;dkK^i!p+v_Pd($LZIpc*B_Co7!z zw-Q*oDX`700fO+g-1G^lHZ77J@_GBTr+*ddGbMM{?n2zSH&j(}WvCDKHdqH3AgU*- zX9G(VzFxW~yqWU)jTH_M6L_Uyl)3!@<-yLyRTeV48VpxtDlelYcRaGkDPEs|?eE%4 z)&^vA{)@FfnXi}$Q z_K?qjVZIZ>qimC_P~l$ZC0^2>T*RiH(jwjHw!_oLk_(an+*2$9zyR8Lh=`F(wzgw- zL2JMedKpps(8Y1s)hy}(cTZ?18=pMMOKL}OZE|7}u?e7sTi6-cCP{kG9=Z^r!vOf{ z>Tw>*yCB8*uiJw!qjo`iVQYH_$VZyhYr^x#$f&jZg_4;E;X!Q7S)USEmO?k(h=7BS zjmmB8n)PLmgyX)jJX9N1klK5gcu~K+00Tzz%Tq+ZhHg4j638lF;(c9Y>~q>O!6hJ8 zAS;Q-BT81qEogAj0jqN_ZmNwWu)O!M0`94aD-1T8xvnL8zv)BiOn!5&$95*L=4cfS z^*`rNop3!-Eg==kIulP*MO|PQX-2dNh{9AwDR)NIeEw+VY5j>h1W(Nw zx3ub>uG+VrN}cdB@RMdRfJTD6qUWHiIux60FVxHKI{vVPZL#=Quj(`1H`mgcbNvit zcq}637w#A}h306dTo?5Ii~&*8s}4}{u(3mTQ+&p|Rq~YpzXxFfpsZ+!i>`5Y{Px-T zPPozPkQG{L4^?zku0uRlNUw%2vP*-aObM$-RdW%e3@So)zMOgMqhN++1H7E7oTiB# zVy-N;)es9}*`SGowUJ1Tst#9$QjTqfq&SjaC9{U>uGh*NAp43TKqVzx56kfV$Ffd3Sk*RhTDfHYJD6Q{VF&f zPiEJItrk6vP)QHGaOe`0;zn*=!!FIN2{ssQ!LVH^-GEh4{l42YBASGrj}z##7f;RKcN;VT z^noHuytT&ZVfm%p)51KzFpK$|EN(6L3pNP93-ZtXTvT-Lh&TgytGUi$=-D`LBr;wf{1pgN9#X4)6BhYc+m{z%`iYN{FQZuzNIMt;+4nj2)!`Gu zWT333>JM(<3SiJj1tRLYtUfGxGG_-CP0E7|5qn9w5o+1eT#(ZYpC0c9c(0<4?nw3w_tMwAXR% zVT~>^WSNz&K;uFLrA}=FJLyorXWx+2jy*}itGntH@zLl0o{iGT6Z(9$kWoQk(0xgl zz3^QAbt|kL$wTn{k?hB2aBJ9(P%d=sHRr|w!5}+yBh?+(0!Uo6n2w9^;E|KStAG?m52YLVMpo6nfz8CS)9eg< 
zi6;t+7dtpR?CVqbqB~YRfk2awP^yX?cwFl>R{P7m8Y}Ll?0BRAIx?#5;wjs^M^xtb zVJo3l)rSz~9`{!pQ@j^ZK@Wh?zgfDS_MW1xYRu7F&9BazyGnEBao1n4eSZWS;)0oz z(k!rXs5n#&2ld&p(wlhmPFKW_3pVUsO%XnM3*pG0fu*A)Q_rIyEM#Hv^weufDmUca zaN|);+Q5V6`Jlv9cPMYH2H5<%Cj6I*GCZW;YQ~_H&facj01yy@)ir|3STKHofwAoR zeR*KBsg@w4Nmq#J&F|YCGSxYRyAEJhgVhXUboo-H8$n(L$gx#}O0@n9b^P@+XMxqQ zivT4G#T8|=-Lk*P33#D{y)1pLx({DppQO$Kb9E(VWyry>;#J!n$B5{R>h~dO%wCO% z{g3o^LHDQzac1!&KP*L0hN#0kIR{gWZ5*}D2o$)8xMQm=X_P+RET-S0;ebt9Z1^4#A2hu{Fi%`WNF^~ur6)0c1s0r zi0O^#7hifldc~2TAC5qfp_H5oR3hp~-_VDIrh14lV64d0>tU89Z_H~RTvb8h#-WR^ zNfyMR@r%K#{@I;rhpI;!y_#(C<)e=*UjaJ}JI&w^5>8khq|l%2A4RS7r#>Cn^7P;% zsiRvSp%*@)XSIN6#iM`W7+zRn8G=yNT6aMsh#ADuZU)3>?QPCQAOZ-oa|pWt+XQqW z;S6fPBXLDL>jFWJ6zF1oL-(*UmU5}2@vf@M}s9Z?j-4RG-)|;(Pv6My_5gOdfHo|t%q>fBV~dkcYYn~ z?-msGRyu48JYV=fiO}C;Dlb(P0TqxK`)l5!W1)rj>^|G z+?z(m7YABXmo9s#@KL#@A8lmnp2x?(3%xOYdjgd=RRS?oMx+veW?zUVU$cY!l2u;7yD68q0!3r>-jD1i2H#Iyi$V9)-f~MLj17 zDcJ5V9X%03(OfYDZAP?)9tQb=B}st?`4r#YBn=s?^unX1FT&cW90Sw&!Ozcc1z;0B zL_JSUg*{hp#rk%>^(b(X8&;8(jlIRLi;%kywDueWZZ_UA5w{Qr4DuCN;`@5YZW0X&xf%OEaE5)N*~_ z1hcod=X}5{iZZ`c5h=y_{<27mxd zi>o-ulR0gpTT7ZBo8NAK!V6T*5RuMy@EPL>t2;P zT@SgNX80v(+J=-jGcj({mAEf2U$A}IUKDMYR_h_h_AwewGW~b9Z|Y6%p6ouIFTtcw+yd#90*Ygyz#G z*Su)M@^B&kPJXAuHejqxmgIG25qo9rkbPT1c^+*3kImh#a>ARFjghw}KL;7%73S9$ zOt)`HYs_tQXqdcOmTJ`ty*Z(?kdd`oMZKC0{?L;F*umbvrMdan_ZDsN0>T$s%637z z79$q?YgnoZZU>+(01C5uo8FkBO~5w0Q35DcfPab8iU1v~=9vXy2gW`=p(<^~>rzD5 zG1V~iO_ptjQBO4>He}Z7C+{!eYxCkZ;99Og8(|Y?f97nMk<-q6|2nnneh_<7L;$8yFUadD`3K4&45h&IKE(LJ7x8g51hvv5JqsX$4CRzE z-%s~Ch60r5wBR%9$@x+7z?)S!YX2X5?;X_Szpi@+5m2gyjt~_D0Rib!gCZgzptK+@ zDpDi8*C188B1J)p(nLz6cS0{Jy+nGGP9UL#K#J$_w{}@;zvn%3=FHyv>@&0fa7H5p z&GW6-^|`NG&;MD|w-Cr9Srp;=Ky~qJK%0OJmikTP5}h||_x%cb*s$kJRH$THEV5_* z+`V2oplX9l3p@EOA-5EeI-`(<0`Aq!Q+FDyyxoBN-QMe-%bSIJ1q%s48y5J13qh3Q zm6b1WMZa~67q~9AoDgh?FE;rk>b^$8Z-b}2?Xogn9U)h|C%miQ+gcUF?i0ZHa2zB` zqLe8w#E-{q@8rB1c(NsaiL;vLexZ0y-KFPHqp}jJ(;lz^yY~6%w@d7~C8iD@Glj`J zF~NpIAD|e!kl-t`Td#kAUN5dAlS2n7kJh1yZl;^o0jy3*iKlFO}T{aomN8=$zHQW$>$YzcsW7yBB^+E z0&?Clo;g9kg1zfZBuz^_?H9JZWU|?+bEC_WL5@N9@ zJ)P(?R|8(Dr}X>H$59|UIYFNV9Oty2qDNXTkuxdMTYO-Gv9*Jf)R!)25Z$Ke-ttl{ zCTux(N$gImUJLB~Wn$q(i0VAyj}9a`anQImdrhYQTijN5Az$^LcRujyg^98`HP0F6 z!9U`5Wb@M2tT*2Ld+K*>EoFc7)y_uS5CAf6AlJ#7$4{1bZ|1^@j*hAOXEhRBJz%sL z#7??y={j)z&(HyeQV5*ev)y=fKjz3K3XCmS1F5VJT z1CQHnPZOWd2b#5{Pwfn;@vBV=0ZPiSCEa80Je#&l{@fG@%5FC?SO6>gYh+_2LOkbE z+yQc32!E4+E+O3;wyB@6yYsDAMKK0M=JGqwntIuKqKTyk!Il{5^1Mc^w92m357qq} z6njz__RDB=O^r+Ia2NIpRaJRv8U6ydk(wWFQi z_)U6M>yPlrsK7_S$8&9BSN804MO%6B{v1lVDFA0}hFksW6{OXnZ6ra>D@~DINt15w zc+UfLLK!kgO(RX}-{i@0&Zc+jPwN}$yu4Yy?g6870zRh)U~LeYzmklqj@2~b4)$IK zs^id2VBlwnqB#sAAeD;pfIhd3vHIGa)a{j}`o%*HJc0xSwS6O8;;Ze0=#sl|=?I}e zV%%M9Z9;dg>uVC~<{+Ic>8USdSsb;(vdFWw&EVwj6K)VzpO2)I-H}#d zo@b6PgYUucalo^U-hweUJ&`e$nU{3~^pCf&PhhSA*~h;@Gsx9J{IUWe9WqNvraG)8 zsrYn44$e>z%a-yHh=rmC3bJ(CVK)zR;VakL7>gO{epy<(yqE*hY(|L26bwju__YNT zIMZdZNmbPK{M_ud>Fix`_zDYO@Ptn=cSU1e!mLGWiCgv#y4#EeBfJ#Ev#?QFn|Y_p zK899>9VJY1BR=x&F{x`TPd648Tp_m91CIj6hG;YLBK*qw;h;dxuY5v=cOE7ZguUyF zru&?ytmJX9H0VjyN%)}5hqRJU3HtQfe$`Fk9L7XeiOJACND$?p6gy1F?wo$I>pEG4B5RuaNrzkX0 z5g!qH+Nk?R^NpciLM#2j-zYz_t0iK+fQtVPH$T^m_pdg9VPAKeFG{8wP{_{)K6YCw zrN*G$X#?>{(*gf~CLZi~?WxXEzwLVN*pO#(lS>}CX3^6LIj5838@zfl+ur2xCY_W> z{)B)dxO_D`q^%r|&NiDezt3T5Zn+g0@OE(^f-nC?thDlXKB_|#)t2#NBu)NSoQ7k;PXSA0BV>D4+sl5$a+zN9@_90RWssG08*n7Kdwg3X6oBUO z1&SyV8;8~SB_#*b)+1mk+@H0k&~ub*mT*;VKS5+29>3S4@5vDiQ6)5!YrJ&ol{>6s zrht#tg$bq?cTa}&$1^xQo?Yo9j~+jK)MNZ-dnt3s#v1Z^=to@Pw(<@_bT=#A1;%Q* zZ`6;-K_Y+1skQ9F(qQ3*gld&4jFZ%lW|I&|kwhe_mM?c{x><1!F3H&-!lEJ%7&hjg 
ze6z&vjR=4LEKUa;CJ9|wyPo+A1oxAEi~#laFYA>9?aqZ$3YyDW9*}cT%lK(ZefFw7^*#~*mfRwn7=a#|p5CC?dkUdkx?2FtDyqyk z`~%|oZx95BS>x~s*dSL@KIPYde3X$67hf)Wivt?cG(Hgwwv}7hF`i^&DHp;K|%SrT>Rm< zGQ%6VPcc2mwbz2Fp5^*Y+XY&Ng_kQQYIS;$iC zpUcOoPMmFG#?0RC38#r?8EO`Yasi1vH` zg`gvb!&^ueLNrd<*T6GD3|V(^qD7Hhf^Yy|B;6b;Von z+1)74!(FcVYBDj>yLI7NQ}#nt5(Orurwa-$cLZaN#QaZ8k~d6e|M+72WK9Y^Tgm*7 zy-Xz@O7(FTTcVmnU_GP`Fx!X&%lL_Lq2%5dZs`e-X&s|`On4E3=C?Fr)L!1+Jh5;O z_*bb&<8k9v#E48!5~k;P%PI}n*4rK&&dPbd_iWl$*rmuGL&7$+)WAa=K%-RQ)Y@eg zcNN7T>2C%sLen@TFvG$C$KuAEGWyDBVe~WRlF7lG!BWKK76u&fPGQIiN?T3C+UG$t zBF`cO6S0hB+H#vU`|&Zw)V;6{Knbk8kKFXz>mce-(LE^sHu-a)IZo4{u6Cl(#B}(6 zYmXqx@E{i7bWb2GY$`-<9he75AE5r|9}0tm(}2^`|0k2LAAtdYfNR#~zd>d5L=kiZ zwBg@I@C8V;CLUjJfRg__g^$p~Ni1;K_V)pjmLd>W5&xaL2bgL9P*?dM|7Vup-O(&#E({_%FZuiQa#N^I2Yt*=5czXeq#4q_c45Wd<^==5bJb9ietf zv=~GC8w7O_P@clrcNV=&>&xAid9kNI)*8m{D5{I6RY zH-bJ>-BFI;i+gMGY!qp$^eo~X5&9+Y1~#R;xg@`;M*rqzTva+*kNh5DyUR>S>po}Z z8+P{x)1~m^S89uT*Qxg}E*6HETV#$tDz^ijlDbd1*HVJ*X$3RE?yj!s2!xT34t|6D zQ9*&cM7}_w(R|w|jh9k%*!06K$YkMSS5ivUg9wlH>(s|NIQdo==E+;^Q;(^^^B%uJ zmC)~VvRH-<9}H7wtVoE`TT(v)R*R^V=zw>;uc|X8#_a8$r}RM1x(FmamNDh3$zGJ9 zA_Zogne2%M>lgvQyl|)s|vyPucnmE znDyA=zNku84bifvEs%l?`AEvG!Wc~HYA(!u@cW$Ua96bB8`FX{7=zY z)+3{=xwJx^FOqS67nWVu^+b=h+NYA`(`Bn7ZR~7#$q^vW1Au+?o$r=7 z<^VPsmEz-LatrDglf!S2B{wR^B@VVHp+gCP?(Y+W5WU=^lm=N)u-y1oz%l0Kc{%BA zk$6_Boh(*zPRnHiM4#I^@iXUB-f*{C*q8JLWt-*FfF8v#NA6#%ei_u)jQtS}G;YTA zdG;p}m>E^$%HFuyJ4x41h5LJL=g7ELQk$8mg2x3jS*Pj^VdXF?gdK#D5ZiX*g)hpb zOEqnK*6RVw&!Qs7P|nZVr=p=|tb7iaR`l$tkAYh_+tiGy=8c8`cAL|Ci0-dmu*l(Q zb1eaH#xr#Fz~&CzeIUyITY7lV5tXS-9BG;`I=;hu)T!MH~SpAz82 z@BA_aI#KcJ@&bC74kJXA^;POnr5=~n71cCdeRZMai^m*_>CjSYZ)(r=iKmA1Qi$lr zN#;Pt_0-9luXws+^e~LJMTxN08hkm{aA>!`sc#tgJP`Po$oFS@Xv@s3bZi%w1i+3U zGKi2tv=6jJmoDqfI^==gQVPzKQd^X9s7`=*B#_j}D0gS&_-Xeqb+d8gAr;Z+#&K~M z(BpTt6IC*nSN9EA@Nl1OXX$D!fY#s^6)lQ<7FRpc7!PhwOLHx*&aKfu-i3WcRG?c+ zMZ3~H_u%Qb-=wti457Du+Vy|VFY4}uvDQur@C+U$uL3P5SeOGvtDC>>m&xG820bXU z?OPO3(#YQZg>u&~zQ2HjEW#X*st{_2-zRJQSI(_`Q0`sWW4FP5^*r))jl&_4z&sg z4*TjvIm%Fr0*)i@p!6zG2P>tvMsrzU1kf1RZhs@i}sAlTWDy>B;cr z^u-vSh)dZ|VG}~($f;$$YMTAz!yM+-KS0_07UhGA-=G>4Q4FDoszs9`|6pR!17qA; zej2$NP>|rq<9}v`G%zzPfR-GRMYzVhOBxHMQZXRBF2{0If{d7g=XajvOV=CKenusB zLNnkEz**|mp|GsjZPMhgO;@_Tl80w+T}t0-l3bkCe>hg!T9I`BL&BXTma8F>+cL#{ zYZ`l8DaBjajDg@$iJi;l@e1!0EJ*$HIOGr{e7+JI;iV#^SKmCWuJq}`V-`agk`GKy zrf^J8sb3#?!VFRY_`-`_L@8gclJLU&$6p~UdD$dmS3YTSR+DJK#>JV(8ye4zdzEJP z3DpSgEi6lGEI3t$P<~KSs)KhW_hxgf2mf zN1Etx`dttf!6XFpERXja?+;hyL184d6KJ-MgBR$>u$0}{v*L$zyNK<_#CwD-9R z!?C@)v2i_|6dhO%;w;I~XDNtA!bNaD^L?Z%-JKv&!Z-nrQ|}CBs`afZt__hqjkydx z$o>sFO(sMKEV)FO)Ta0E9H&2yw2bui23VH#VWS)3x%ILBx7;?F9mycJdz1z9siv1- zjf~%a(BzT4)#VHBWZ~+t&YgR=`-qqNi8B|V5gf-ZCw9jpbcSS;E5}Y%Y2AEOttXlN z2sXgcIXES>1b!th%X9${pryvtBsf6-xp^)92A$gwGI&$kGU^)g z+}h<`$1${%11)7a)3o=Y(o2TwbcM`_OoF@l+xu@CKV0s=;ie_(6t8pfC-iHe2sUW0 zv6R7~LlSvj^`mD&h@hT5S8nrb4E}$wnYJ0^&>g!0Kw>65ddRZ&wpxk z+Ry%e4omL`DS74d<|i)EUSi^RQzvxe!Fp5_uhM0vQ~SE(1-0x*zKL%7Hx}l(-QIOm zLF?2QmZl=|#AhHO0`>QJ8}cX1 zG>W2z&psS3#!*6KoV=7W)b5JK5$U(8fJI>L>gba_u`+t3vXd1sXe?-0scK8MYNvj! 
zE*&1#?}k89Z1eNK$K^c_gT9an(%v(TW3SLIX|*wWGl9}*VGBH0IiU=(^YY8BZ2T;c zEJJnNDbsvA@-Cx>%mzBQO|?vr8loV;;dKx#93t6e;i#f=A8P2b!#oh zL)%mt6%!KOe(CJ(N2m-Qy+nRO?y&dh&X_+j-qaJ_>!8r{EN8?@MC{<@tr6dkOzi;8 zwjVJifse!OG%q&B>NG#-%{cLMOUTwhe(!hbQY@(YQCqP~6}?WkR|+{tSTmb=Fmd#@ z@}u_x86F`d0F+&wdOKuX=SiG?`R=LR2~>TtpL@;db$)ZBo*LzRI@yy9=Eh1;_WqwLFhfWu%Z z@Ef#L+~1^*73#n{U9TNIGyNeD03y4f2ip)kB#358W*sHc+em&#;hv?Ha&RqoaRgCR z59d#PckmKXf4*q+!5z`v^krYKG1ShG;kp?G6{tnLL=uEQUbAFBMaph~@5wLgZ|^S7 za#j1?kO>NvxLI^2fF&OkEP3vBTQPgBfm!dhSBnwwttrd?f?Szrw8nP=0@BxdduT&c z(d0KR{998jiShlV=;UdCj3KED0aQv>u63c#5pp}vJ%-I4_Zk+A^QwkG_Ywig?>vm) z@sMbneI-W!5ex~&>bpd@Tt>GDO`FZnHbs{cgoXeCsE})xyL^X6wesLNyawm-;S-?N z4&|Kshj0~SLSzQQVD`a3e}nc^^(lb8qey58QTz=$h3Fl$ zqY!{+5rHs6Rqm2rDMH8r9YB9D^q%qm3bF$c3m^P*^QzAd{{+sBj7N1N+rak2;pGz54w_$|1KKoj1 z{z?4i<(g5<0{H4u)p<6$?|Rg%)!o15R*Pr=V&=b0TH}8Kn(nWMV_fExTc?}E7*h#c zc7d`29o^cDBOP=-ua`9=t^QN1 zVvOCp>^at~7l=95?bFHc*&eT{`93wuZvbHrwK*=@)+}n(2f;G}R!=W(DOzej>7*-V zWA9s2vfSFzwR@5zDM!r8wy&+`70K*U&r_7e%J=DDayubBHK#&Y6gbxi%&{Z_&aFK47ClOJo#N zLMxT)^^Ijk)YiUae?umTepLQZHC4UrNLgv9Hcc4I`V7Crz-Dut^wv#7;D=yc zW`fq|@Avg|z0Yq9y-7;(lWBCP>lNXB9@RU!Hf?dp_VlSxwf|KI_IE2ZIzX){x$PnD zdT{UlVU6Lq0Ea!s9LyI`**t4}h+JXAUE6=>G`-;q~AI4+- z$9wnu(Q*vz1tq!)vm6-YweXNAF4+U2m({|k$~1`?^Z@;tC-CN>LkG-L(_wtvnC&~5 zgI=%kqGu2kekJMJGoZ{3o`LNjt{Yi=GcEABED5Dl%KTWy&%gv&*G%sG13L46nAdpm zRJyf+yM}6v&nar>sLusKN&Hi#k2IwjSal=nDcny-#q^EIy7t5RnGtchxs%xoWZj$V z-a=o33u>+xVy?}Y+`DLR0ty-&8)NzOm^u0+eZ99|=2KR+cJYo7t2;8!j0N}(;~?>$ zd>0R8CpjPD3wZ1Y9Z}B2*w>4Bql;Ft#_8W6JLupE*IHxN!LMqFaN`?lnbHGU0MK@uQk4G zPtX5$asp)-PMl!t#q}6cOPQ6TRiG#uJ@LiC>=9P9pR? zy~MBlH^|q-iAhRZIO`e0*I)-`X2Q8CEX5@L?^nhN_h?Xp~v?9*w}FzbD?P{I)q0mwwQXFinU1SvyFb>2g!$tu}9*EorPw2ko?Q z{?w4GioJ@91T@jSn)1+}tJk{xghFLLmwK_jv7TQ3QH{m31-NAj(* zMx)Us4O-bUlO@_mM{bt5^rV|^;>@{%rJ%-W&W>bY8@$FA9CYXXy-ww&ApFV;r}|7F^U{%gWYfR}F~5X-;; z+XS?8Kpp;9sBR0-k0$~^l}+LppdOHXa!&mnR)_p1LbfEog8Tye9fC(fARiMa_bD2_ zlv0>FdJtT>MvB3a&03=VX;=z#rQBP+bt>d$pyp@Q%t9^K#IK(R;QOQaVghY-GJoXC8L1C0GQ=t-dB*zODQaQB|Gmm|6Q5%Rx5JT13;vpW@; zOY>$o#coLDUNLC>1>k5(zWewhLh*8jLshenvBQG6ujJik7NUsgS_Z~({~>TmNSp0b z0c|~Y=N*ye`!$*y>QEQW5OkgDg0KKFHocxb<+GkNE91QO5jb3xE~8UkN}z^Cb)0VJ zEeu$PD%Q437qTyI80ajCoo5I|ZJqmr(p^^N3Y1H0^HeioPLbx;yaVo@hFuCE@))fgV=)o)ZN`iVFz%r-}Bo$iO@HMG`t)^M{!2a z)0AN$y8)2NL3I`NF?R~2B#oDnbo{81jG~8hOUTv?UJ3It{_^4X)(KjtFzRvC2gm^! 
ztG!Q+VjeCiW!cCU>>2lDh-Q{!iRG1yXfN`GIK9?+#zMyr=MULVw1^_X=xp}t`B(m` z5ntepc-tot4<06{e)NDP-#zE*E4u}wlO-aV2@Qh}tCBl$y=R_Ig+MBiw7%^@k6w(J zzg~Ut{0X#@`5qiwiGM;^USjfDqK(S8+O)m@yiP;J)cSM`xv!?7HvKyOeu;KM)KrMr zum&sD9 zmPyfJvE0VS;C3IS097hLenVmPyCa=lGt}Z z?3EhnG@_m>f^Pd@o)RIxc5N+8S-(LVBX^~WU#1w&p}@O0%1ZMQa)P((UcN~NQS!hx<*Rfxv~ZMdaCUyszXK@3pKB1~ z={SkZDQ{wT*Uk_5R>WTl@^E!Hdiyjt;#~j~kxx+FS-n16GP6cHL2tTj_Yjo`K3#lg zBY%U;hiiY>xaxcy0Kss`2}o3SInb3Po^}dedV%F;nVu2gU3^e-3HN3xjLy(=n%2b0 z6VQq9V7j`hq(^LTLx2)qN;AT0gLsW3e81^q-tF2#bqk3Yl(j@SY!WJ65bX>*oxto5 zh?`q(`SYDUo{Tvr4m5f!w5TA8807~a zM*uj2UfSjJ{pKWZ-a@?lXtFWlXKe2qI_%3WSg8Un>OF#RttVdAQfI09K;EPB`e?p# z_K6c4Ld(o@vd%hnR*h#2O^iQ~+58(NNI)(_ND+3wLGGvw$aE}4+x|Dm=LCB29+yqI z19Y#dhxKP;x&V4Z2GK~qPQ7H7HMOMkIMp|S%ioCgAzdX)f|Vk-)^G%ls3hLsc(D-8 zg}j>OfV6Xft$Hhu3G?)}@f-Z`r?Sex?zVHM7A=`8!86WU2@9yXM{x> z110&{k#fSU*o}vxNWeeBIxgd%VMXq8c_tvX{#X~js4kY5e0)PydD#y}zA3zB<{dnJ zakls*$R(tO#;v*vurA6xEXeOwdA8V>c$&pI9!~xO9^~K>>a13NcPVeNpgt0a-)I`R zZxc&?mV(?U1mx&cWx6~EtX%x3ZqxdKE6mKm?~@LAXu0pG!mb8srVd@Efl`%z5vS>S6!uT!6>lM@j3 z5H80#n#ZzY`r@!ldDh4D8UrHDc$O`n>4NWN z?QkjAX!JrT4I8_*=ggbpyhB)Iv1u4f9A6O2Vp*wI!F)vet+PC)c0i#s3%MXzk&vN_ zr7aeV%#XgnA9oeJDtbJLtL@R}^&OeBR5ELkzGHz~=>|5$%TPZY%Q8BibSG`f68}-) ztAfaFeokb{&(vo<_5C3J3qMQ`g=7w5v{aW?Gntx>=w6&pQ|*vFplfHLn(+-?y}u)K zEJ$^m7(PT7-jVMp9oIp93y<_7_+Yo;H!B_48GoHq5J2$5Dj^k7+l31*tm}KhOOow#J>BtomY6atCQW+j-NbQq-a1ZWzW zEY+c_>*8aYZ0>K0o=oEq+jiJ=lll+#uI!tthOkH!cktPcPrjoTRiiRK=Dy-(Y;qsF z$K7cG2BC=eqw8|0Xz?%!UTUBAX5RFNb<$N%-YanDTng?H>f(h%TlN3#N-X4xV%Q1=5p7$W$HQ$^nuC(_*FwB^4bUK*=Xe-&z}P2 z@5@n(J+CAJ6_$9M?sTiOzG+cs9zQuEijw?rdgVBl`=ixNy6Noigc&B={bv^t56?k02<;tX;z?=F_*4KW>QDBNu$@J@? z^^k z^i<*%Uosav7xmDsn%SOukGL2WwKUPB1Gn)z>pP3Rcb{TRtS7PsiUNViQB7_R0d{sP zkW{~!9^@Fh6G5%YV%>w+am%}3e%vAOkvc#4kiRoVW5C|D|h&Tt^y1|Xt{pB{q zvSK1Te{s|5vRA;d*_~W!+5REwP|0=Cd!yPd^JfAlpJo~iQwokt(I7d_m-vQ1nJ zXjbJa9EtY6MB3e=93s@LE&xXfwvPOENcn<=#7CrN4LPHQ*Q~Q#qoAG5-wh~epP8>j zgr$!EvJ_gSzfWF1K{Q)IkbxJ5#l#Adm~MvVEU&uYkeezfyX>g#4CYLB7GBZpCUPo; z%c67h#7N*Ka9&+TSz?2_UcX7|PZoy{*j6;WO2V<7LX{-OB7lQ(`<6b#tmR|-*T)}C z3?;hw`DvL|BI{#1T&Sa26=j}<_=%HXhU*%KBbAOZ9mtWoMq zK?sdMw{TLJGY}c}888Ypg5qJk!QZp@m>PjE&jorfu;Io28qsW-U;0)|za4i|m2$5H zI!K0fq`6avo&&Nt)WGQwUdrWu=9f#NMopKGIrsta`KKerw#j;l_g;|uy37xC$=x0J zE^3Nr$ttHs1TBN}pVpZX2X@m(BPf|et_NrrfF&5Bk$cwTwq-frF?(y%|FHS1zO=_O zFZ@;LS@BDn=e~ori5KyAu;EitUGaPj)9yIg*3(kIfWn{=mY*Gq#&Hz+YRbsyP`9genA1a zi<;#DH8$6L=MOLdJ^zEw^Lmhmu8~al>{DvKE$BLRH$@WB+bnU6fej8~0G9vqpA@@s zGBxQL5VztF+n+*ZCr+7ynWZQiF4 z)u-hArfYe#f#PF5lf$vXl=B;~0bo*`er)QX7~km~6LDLCA%>FIa>FctEzEhrYlf3g zGHV=Q2K9Z-Mn+rHn>IiOSA~@9sj*ivcV>EX%fE~xjCj1x_T(DVgVII)Y;VN_)g_Rx z;EpZuMJ5-bLBHJ|<$HWzBl3L~N*fqXdST>px2wliwyfU$!DOITK|b)joxLn@ST&0n zoRmxkCQULG{-|{~GMI1V#fs^+b>a=h%WQ|bDG}A-cSN6`$^FV%Fu75a3M#EQO*ixm zIYf0|jh{9(2DH#*xfmpP^q@Y;{`~vo!-oOfByPeec2tvlph)p&L0})VcL4f8s3<2u zn05a(d_Qps9Ra8g2Jc$)NNY;LRaB&0`0yDot=vZ;XDMEZmYQ+NLa&^YU8%vV@U$Ej zfKsyVdjMfS@S)?Z1FloWyAda`{k)$z2K%;PeUR-+;n*eb}1hm1z*~IG!vP+mv61V zz#HM;ezp6f(G&WFGuEN936c@i)w+L0MsO*XMHg+j4Frq|)`zEN*Vs3m^JdZ8AQrwrrf{(*Qjoq?<%()V#|5%}c9+^mig76jZl?hzgpFhRT6Qwt<7`+?fHkTU?= z$+9T^RwDKjoUEu44sOb7M**k9=isQ5{GP5LUYG63kMh(?ua`O5$d3Z$u!3(tbvb!B z*}qYp`v}Tm-oFh@rOb0Z9#8F9^;q3MQ7^gtQm(uvq61Z^pMeqnp zSoN|g!j=#j@GL{Fd^sNh178g0*ibAvI;t=gNSIWR41@iEVxBG(&q9`33fW4_@25V# zd1}EshviL5%!io6#0Y;frZDyeuZ_*w8;GuT__zG%MfZf88V>fg5)OzMoCg{S@(a?WK1 zmUsX%21xCKB&pcfrU=D+m^MiqTR!f&F0`loqYv%uFAEAhFsL^@UgB!Q*_HXWl9d zrq=iSm^27x+tv0)LtP%wPkahd_F1;vo=|0INvFO3ovV$GjYJ4Wt6m$ghB4?n$zZNV zUSb!jN3MgH^CRXyF2oW8WF+|Ss)3%P7xlX#;aM^hHwvOXkMn4v_#Vqp5i&hoM85Oi zr0|nAStEWe-h*$?vrz_uAw>yMyy 
z^vJmXBc8EPi8dG$#A3nH>6M_*V3oVas1GE+9p5E1FA&Yui~D&i%6eB67&g)g%)gv6 zdp;s(z=>1hnc&E#+ftn$BdXg5Ad|9$sLvn?TWaDM)nkjotA9d37e1g33cJjo@qqb{ zj*$IkWZ*y->aU<5BlimdoCh;O2%{DjU?a(8^dMI6VywcfG_Vwd^^gw~3?juJ17gEQ zVXgoMzzPtm_7o^RlMR}OouQ0C_V{2Yc<{+Zpkr$o(Hb};9z%d)w5_ERqWWL0y>|ZA z+UtM{O}I$87XUfxp@fT`jFYnwzitDAcxw(!rbQ*8{((Eo{(J5WNG4XaBl&hv){76- z{{=Dozu5Egvp|;2=?gS6A+H;jOP|lq(8PcBYb*Jlv6$KsT-|43`D5qa((R+NFP;zb zuRcZ!-so^6Z9&9)cs=yaYzM(@+q>1JcC4f~ES>V=j)Z@z^}9Akfys{B+ZcXkDOA2? zZDlllX6|0;wRmVwnMie+11Hn{H>&GhIvQ;K9uNPc?-BnGzdyAq{>&Pu+Xra>JbTH6 z=;ymSiLdcTvWHYukxahm&L9Xzy3<(xslE#!_KNk~7>YZk5u}Tq*j3baadwBVIR*ci zs-yA#1@&3vg>FwcX4%dG-5NF{Vx>b4!lfI(jqE+B1V|uo$9nJ;uXbnzG@&K zd!R(qT2*y}^yQbg)N(Q;Md{hs9BvMeQVvJFHV#T&DnOct>;it2UavnZ@S z#id28Uf;0h@C2t=n)3t(wH=$dT!<{na^h`FVUqJQS_vIZOB;ODj>A_Jze zsX6Yir>-cM+H2aeyBY->H(iPIWZf0hy!}`)cSU-sJggz{TM2`x+py+}NeFemn0kir z)vs0Io{x?VBbuypsIf)sMXT23=c4HxgS%Kauh)S%Kg9_Drv>#HURw0f|AKa^VOJen zs-_OZCC}r8-%*of_xd~D@>6eF9WTW;<1dtL4*Z~lmb?^cKeR!ylWOAWnn)LJT@ z3#EaZYPP6Jrdv9*h;hV0P2dJ&>EcyY<$ZBsA112Pl@iVRkaw`n(e?2O{WB-few*rJ zNB*Sbk^z4WQFM%_hvI9h()n{{84l;QF}<<_p0fDl)qK{$+e*0`ENAAIwQEffi)@?T z27a%L?G%u2J)8AR4Q@?5pqiFF$c{j=>q=eie_PWcbgS)_y!iG$julfhK79WS5XJ#h zbO)H&|9LLkzvt`n_Jxl$S`)viPo+PJ!xw!nbS;N%fBg;e3^wYpq1w#dEA}*8eQK=; zj40RKy)ROe?TXh)#l9i_quRn4qc1W#(S5r1A=uo#l8sWe_3AlJYZZ-~(|7$}nu)ak zQXVk3oo)S+c=~;-*`c8w73KV~p>90tNOQ%klf4Fl7Tx8dLvJCzuZ zaI(n66-oC?x00UDjF1c5W++}`TGr^ZCng?0`v%TYj6bD6bCZf#X{GQVkKyL^>ud(s zwDHnU7Fkw|DP8W6@X;gFYPpY1X|6Ph0GNODLXQCo{xu`- zU08zepa43R{An>D{f&zqxD%n+zZip$DvHik*_e9Ou%AM~3 z+dA@<`9E_~{E3PE-~H9cJT7z#setSH$t^H#{(*bPOIW#4GjJJ%9>@|ICrxX8+LuPh z4bjC}%;*XvG)8lk5f__l-xnj_)pN>rI1KZ|F4PU^2JF7{tfVE?j_Aw)` zO7WPp7{ALRIK^GD`mGC0455w5H#_0{#{0nDIbRL*Otf~i%)>b&;{%jhKFA>wh#= z_`mc1|1J%+=Krptmc-vR)LQ#1>iDmQT4R6B4yf$X_qPt8W*+c3k6C|M&M8Z6N?SHw ze(XzUn91W{dQD+KQ4AEULm5Y%bGE^%);jSnUtD<;KTg8K(^tsjm~UF5QkDK_^gI`Z zAGXPn-W8q}w`mdIxIaD8a=FfFU&%im6xobkKt^R)q3YwG?$?NXuvY2!&9BZwyHv2< z2#N>>q~s?*|H*nIEpKWNC(Ib(K@(cSv=fh_5mG6ZH6dx5NuUifNk2TuK2W_QAKaVuL6wVu_Ff~R+3f_|RQw%R(<`g;}*MS$63S4OKGWB7b z_dV)3nxF_Yciv!o+!&C)pK6d-1YEbbvnRpb+<6cV8TQnb(Ac_@@ zD}D-Dd+YjBiijZ3mV>_?Y3o8**_!*J+P}%TRMXZ z0Tw(ccE8O`EHP>uZ}v&*UM8zQ^h=Km>l2!Xk~*i)Pj_E^MPt11tRUx9@yxV_*F|-y zNE6P>`zAno&wgV2vY?Og3WJV>Mvo0p!pIFU<>Hf5y2``bmL0AXFQOZf znQ4P#*7oUeJ-CFJ_Z&g#GKbnS{I$b^2`q9YylA+U$g^Jj|+K0vKv?AjmwoBy# zz1FX7bhUs%hVI;@TSxP;wa`v-zRUYdL0-dL1Wa(9(W*>?9Uhr`#@4Uj?Z8iLad$;k z6Ww?iL*q%sxIug)b|n+lfwA=p7d>OT=28zZhC9dJmfrvoxjMHv&JjvY{xXomGCEHX zbn2@iqZ>Y4^mbTa)??EQt~im_W-#e4-{Db6T>9& zFAUQ|y3VSc3$DTtI}Jjisq<9O~y4#Mv?vJWzi zShS7v`Wf6-*sYoBfB69UvjL|cZN>Z*$M&ty>qX_cx)q;?bbA+Z`yE<2b9ZAwQ6l=( zZ2Q-OTM%s;fv2XNYi%363R<59Vje($1V76;9!;sd;c};0eu+jGAU_8a zU7rhsU734mV>%8zx&IKz?1|vxsh2?g7{!L;*PMNf0bUn^Y+W6%2oy2)2e#@BIG*H@<1UYn7P_k3-r`Q9)SmH>Iv0L#l@v| z4{II13*P{A*jzQRB5X$M+Dv|eJ6ts{dFM@HspaTFW3wJiWSF6Ro7SFs9$|-IM!pKv zlC5kNJVpEBeD?AqSQEH>uw8|zS}7aDtElV;j~`Clf&s`PR@Ws;ULJN42{o+{W{;?7Pv~RD&R^nDsso9r& z%?)eNyDYE2#q<#Lfl^kWR4O9uV;kFn%l7UPe2e83x941d;N;7JZD{{a1P|PcZPeHB zQJU`9QhhXDs>YoeK*t(@8hsy+Qs2nADf61-7whJ-ClXvg9D~%5Yoe61N6P^i*pf|k z(~<7g?O=wrX2d@xV4;;}S76o8qhT~1aJ?}0+)sK4aaE+pB* zku-)JK>>xz>g=e)epF_#<3o@_X^?ogAMy&76oZK;LxT;>5N1x2!}?5r`ubtog()l( zSR6+5CmR>}-V_mdZlfATt$mVi>TtF}0?`<|mw@|VFny;A@+A84Bgo3TZP9)J>qDHj z9}eWk?&~;)-lkJ{l-)9Zn0&Lxg)`<#H8kNm^#@g!p>Hqb`Vx;co?DP)vH4T1>D_{K z<1{`859!|j3>3s#Wx9%&dJ+sY6i(mELHjx7Pfopl8Yg~h!6X8J^L~RcDqm(DtmxAN z9y7k+Q@!2e1{(lg3@Y19A9(-b?YEH-%-CDlP$34IyJ^<*(zbTg+Ce}9On+AS<^i%} zH#VEL30^XNFIPAH!9_qeem6w7T-+gm2z(YEs=0Cxcw}6`1Bml?i}04*%RB*h0J@zq zcwqiqxD&pVdl~5q*c|r(PXrlhCfB?m8qi4OTnWN_zbuG?t1=U$?+?qIr)mCjm)4h_ 
zCa{Z0Pf`k^UhHbFLE8QL##`q(g)k$Zsjdp`SFU5dj_1rd`ziXW+N+y1T7SDmQ2h_~ z-aH=4fA1e3sT66-lC`E%DME!5F~pgOmkhcU)TG3y`J0a)UsYKAuyp(ed7B` z5&RRhBRZ&m2Z~n_^Po``)`mH-c9>w+%@DD0&Vic!n>J|y-g_@Wh$X*1*6HTK>uW2s zYK7;VMlj6TB}GOd({S?Wwtmsi5yBnj4XRuvIKu`QaOeH##(;HCpUn~HFIu|X8o#?oj%U2H zWOINtfSLayE)1Kz43TFg0U53axWrmJL*UT?GG{DPa|QABQYU~`3*frELE||>t{W@z zksJIOn#r&dRNqlgl7Mw&BUcbuz{F1&v%URx0!Gd%cGU&E3tOA|mjPqY?*U^7e|=P+ z0OJ~T-RS4U-d~2G{gmMQ@BjZFfT)}+#D9xq{tv#l|E=ThUol0@%AdSrZVxGoF^9PC z#$HSG8{eSk*Fm;6zbhkYcqh4>yAJ~6hUjAK#Cbb3f8G<+sFM8ZY5QYsgu7D@PVMX- zT6o5>;|-Mq3_+hPgjXEzVP0Da*`p&~_;ysc%7vDLxP1=%=O*6RitM?AeQxHQC>H7N zJyd#+W;v|u!e8kf_3Zo07fNeiH*--{Q&=ZSuf30!Sypa%hz|4RCafTM#J&tHGKJVw zY5Mm9c2e%zyNb%48w}Lwlyt)2+Y@LY_aU(SKqkTQpZ47kilOj9H#$2MW(u;%;XS7- zRuB{*@u_K$WGMs%k-=7Wx& zH&9ZG>tNm@c_SrHYE<{@A#Th3hOUacg$qGJu3ULIZMc*AMonZ|rU1Sa&2}fJvI0^; z;J5y)FBeN{)*f7?IMq?#NfxL?_fVasMej2CXY1*4`3HWa=~Ntuq~LsE?+^g zsxug7bL~5^Chr8(6T_zRE7(N{_9*)$Yp>w=9na7Q`F-au174|@wKvBcwcfNWeiCh0h+od|b?)vqDW~I=|VTabM=pCc#flAMhiR z!yUa}$zmop&i8I8!GYLTXqF>8Q>o~=+AS=h+CPK-lvXGgJ8XG!@pXeFnk0o-I=Omr`XPM-!D(eO@bw09D8oh4fk`*r6g5H}wq_Lazb` z6U*Fh%6?XUc;9z|yf;3}X?#AwMBi@k<*-+eyVHc|AJfhr-1f;fs5vrc)bl=t6c>1G zx4E%16G_T`9D*3WMiL4f5q8>xCT)cUU!f}ocv2ak1yCeGFdKk!^m5X^!YsEcGm{5z z@W~g~4o4%xPm<_e=)}V@=uLhtmffVtRr2gc=25&L*+V9Ev zuMy(n5vhW-Zd^Hj@0I5053)HK=eQegN<8>+w{k2;d(X&%J%=;}4f@WfWU!*vz+Uu0 z>>C0}zlnYT)bDfX#RTM(&|u+SBe)R1fOiD4AE#x^FSnil9rp8En9$j?0>r0A>6!J9 zPHpIV;FwlDROqgo_H5_-rp`hhW`d1NR+a6*_xacmrL~>9Lf+GzpB-90guPoEA%A!* zR^Z5SaofzQ)V|}J1fE)2CWmIXew{RHOgvL%QJb6d?3>|5*toBPTPmANakpjw^7+NQ za>L|?cISQTWCTn!{wX}>eOsp|cGi~S|qmKHYWoyM@7 zj?KN7GTrv^`tv58&*^>};=Un)*YW=_b>u%|ul~2U9rzrmX&2_D(7MJ^>Q%}8HM-ov zQzrEtfohu4(QAjIEwzkHk2rR|y*`xI@J_uVNbttn$`Q|fJ~c*qN(PTJrNg4(BgcH3 z;^wapP8$bFzNR4a%aq@E4!$gU0*hwSD8UPkA+ z0_@=m6Yd(7O?UNlWMsu&^P5R&^M9-7>pSY>n9!yh$7><#(q_%Acdkf-xXg<^aw%FC#5&SjlfZVOkD)Fk*1>Btro5n*HYS72CV+=oQ@$GjUB?J>EUrJegQ1!j>QHi1; zcYVDSG34q2KC%(bP-5>;fP9rBZLT}rFB-??-IF$>?+CwO4~y(S?pjwiElr$KQocL-WYS zIN8BfnZ&}|X!Tpx^p1(SV7C&}6Neq&OrG*-I(1-3LzVY>{_@1z7rt$wMgihcg0Is) z7B;@!;^4$5V?`AIl)>|^r$AHiSxVixw#0NE+J@RIhM_NepE$U`2LtUcgF${%n*YeN z;qQ8Gka+Ikm^U0e`_}#Xnr94y^{C+NA<=7PhZ@@kdP)^KTzBX`6gYZIajuA($7$Xm z9QAs~(&CUNQK`yk>}vvV{D!jmj8hw$8nC7l`8Xp9mzp~1@)II&Ei^LJ=S9jK8Hsz0 zA00Yemavga$Lt>;-M{AdUhDtTznoc^7kgb3GcV{JI+k_5CEC!~>1Fgr!6w1!y<3(9 z8~aUF>Z**=vx|(2M9y=q&79CWmeuR@+NIvuJWK6{#Fk={D;Ul4>=)0&=B?YavIh_E z?~`B3Cc+L;Ct$*+JQ=*uLIShA79iihA#+K z#5uhqmwz8>&NmDFde%@d)cQDcZlcHi)&!2y`yiG;D`8en& ztZfDH;Lj%k?b0~wpc@a=G=p94N`kj31ro-oa?BabV$Cnlzk*=!g86P^;WP&%5&$!9 zzr7gr(`zlnhxIFyvYILy%=Q)q=bT}=p}sq^0C@P*3t2RU0$>&>YVWtDA9lq!{3H=~S-8ZRBaO?;tq^K4CLfHI1w zkH^Juqpk}_c>}H*e8DG{g6BYv>dxm%wQTk@S=IE(=y_gTJlLWj7lsv*IoB%gj7#kY z-WuJjyt!wf8&IVFrD#=_(oyn3m)E-i1=yQ_jKm#-YNq^op%W;Q0gBGuy@CK}RSq{C z1poC72-dfZqU#M5iH>)X_LCCIk)Ku&jsa*#f*l$e8ot57=$1;3@Np0di$x|b_{rP8 ztd`Ip#Xcf@Oo<=!Q)wBf=Q_N(gQ+`sraQ40um2Q^M0h?2qXbr0>sxMB8(0k@K~@?# z@U@p~$yDNmTQ>-^DFX~{;{44@H~KoIcFdp>OCXJ7&=Q4O>~UkS1Nk4T5EHT$)h%|8 z>5}=j=JRHi+Zc)yQ_jr)S^0-w5rXpYE|ck+nCdmIR+4;BG7k5xy*{RT#8ah;o$OX;lV;aD~Na?ZE(a;Zj!b>ERJHgU>i|0A^^Ty zGY=QPT|u;@V_S%2Xc`-AQb$=EfeB^JZRxAJ2Yb#TLKsL|5R7Nn!2nTom7^a>Vh>DW zTkZZ=Aw&Z5p;Hnm@mBQcN9*HPSXCa8L?01dpXdfHZ`3!Cd9Te>@zLbBN%-y`^W=m0 zEEV5vx$W?dSZQ~;AhMtitO~ILA>WV!A^ps(VOkWctmF5-0pkrFKfK2NYf%`RJ6)Qz0)1B zUUv5g&qCm~PJR8z(3rf(c2c!`GU}5>;FE2*>NWWQ{jF0FfqV1IiEpYVYxc^V^I9w} zk9~ep=}E&TsmB|;O2CgH2gq5!{^a;79*6DJN8zk#28dS4I(_@*T5Z|kQA$%D;&zGO zE9Ob4X10Lsf&ND9^jff+T?dxD{DCFF?8<|S5ps3_VCJD%##jeE9h(dlK&%Hc&~yxD zEE!?`UZgnqHgQ_N%wWVre)u2VC#F-=2CsE${eyg6lhqNKH-j^-<1*^% z6ffPVh#rqH? 
zvS@8*n!EN=>h6avs*8ypjdDma4|cKMnl`gu+Ai5|F%#E{-R!9yCKUbr@cS9P>(NoO zu_WK}Z${H5Q~e{&Ttwu}wRW4zO#Gs$FOaKs2lFkrAke*nw3uvMJIwP^e`MOJ z*7-`4FP&+@PV_V#jJ(p>c=lKY@LvYvaf&qV)_u3UjtOiojMVU>lb;QbSH#S3wCe4h zqR!2VAT~ZHE(Q({gb4{TY${F6z9n~wUp?nL)rbqquZGVcJaym*g~hcU26 ztu9m!*q|jQI4#$L9KWZ3-m8u(wLT+@ep^pkP+e&G82 zq%j!=W6GEW8ui&EHX0=gk^tq?p<=)S;@L^iNTw6C)C<%fW~6t^*d8TVr@8?6`3Y2G zJa+=phPD+fmqI$=Z3IIUDfqJ*D0t+~uO@y!!H?+w&|*Eel!i-pfva-{V}^DT)ITLE z><;);fLdDk9Jl`w^rr)qzguSg_sX&Vf4(2bO7p+T{r?}5`)ifBwiHtFcW_6i^WW!Z zaq2!0nSnlI{rbXsKT+`w;g5T+EnuDa9^VIAycwX~5hAB(MAa0d@)oZ%{}vEMvqC1Y`SOu`>-5wOy{vDC3PtMtYg?q87KN-~8dgKBY8Oxjs0H{}U2^ch*r9};# zXMyG?2NqGP9$2h`=HH^HZ-oGY`^%%5H@ zoOTV6I8A^sDox`Y$&c1jdG*oZ{#Yqkzy5ef|8XB*Mh;cs)n(df(%BlsG1$xF&v(b* zq1v*e{VWIPLwanJy|!$vE{l%=)%#l{#}sw(w2kQ32=!U;=|$=|v5GVv^`e~H=tKVB z?4lXxcLlctb}#&=*oyex>0e_jG8pUs8YyqUHS7Y=b5t{GhTn0)@5l!mkrB`kuv2MQ z__dODUU{sCQv&q_IsjZ&Ex{=CI)ZH}AgB-EQUX+W|AsDX)~}9VSQ9^qjl||Hvm(Ab z!JkCg#=o=fm|6c7#GV4cl4q|EK)>?0gHJ`F=S^%JrqRFTu=wWe=cM#fxh}9j+yqfB zF9?bF07tG~P@SJ6P#s|}!eux5Cd(BBaW8%%_u<@ijGL*$pfsg<@ot{ z_6FiEb(9wF8)3oFlfamfW?W~-6GkK);6d1nRRjOE`#m|9`=A@RyYc73=j57=COF63 z!c`9vX7O-6wh_OYAhIh)pE1(G7WBI{X#+|Fh-wCVt!f@N^+3dZ3CM4b#AZ#e&|jdz7)|j_ZTeHH7X{zEU2WfkZtE*|rH4>mj@11yx4n)p`xO-H z-^BA!Vp9n(0;v9e9}{fARWpz~n$wh8CSW(N$YR31`QdI!UE3^dy}?D%jF2$UbCoKs zyJ1soA90b)yy3Xj%D!`rT3Z1e(9S`BWYi1Qe8bM;7t^Gi+-S5=_7d#L!A9cBqO?&J zp&Byb{ff9|nggq8vovRX1OXiDmz_3B(md>^+rBxnQ%SipLP1-_vg;769Cp(1%jxJ> z+mEau>Ikn14}XLUVeMewub|X4WG6ab^EWrTjwY}&=pShBl22IUQsLX$Ad0SGdIIWp z*%U9rQcHRbIQox@eq^i$TsOx*tRM7}3ObByV8TfvEYN-8rtfXR8Xe}{fncLa>^7E6 z@uNAwk($`_D7ac2j!o-Y$ zGS_57_R{``YiyTGptQ(XyqX!_v^m$YN^3h@lVTY(7wb z#~Od>>QV)$797SGtm+*PyT%^M(9~KLVjMXwrF`rY|Z$K&6)oer%ACd!=-nSxI(C z+A!C?5|dAI9AOTjON#w}&Mh%j&d;$)I!O&BcEx>?q#MUDyE0{ZZu!b7uCB2ta#Ka= zMp6cLptkPbb}k^{)gcee3X-PF7#@zIV~IN-p5J)57PRBhH2z`9*#?rXalzRc#Zg<6 zn+n^Rlve4ujU8$MmWHK_j2 zvr1GKvTb58`OF44Y8vO*u-A3pLdA`HUdQSAmsm5woa>T{_CK)k>>SHnz*g&r9PVO& z>fP7yxs;xOYv$2E;j590cWDmGzf#{Wc)mYx|C{;-YnkzAVF-WF zv9vka);^nzm+gwjVm~QnDZ50BT5B#}%s1qKnu)t0&m7(7u2Oj;U#@af(~=5qt=I7W z67leUvj2MLo5uyyT~aKy^S<0z67(jXjcvKaJ)Joly&sG@mU1-7uhNXIb;;Vk^|YK$ zhlZGXjgF77aB}tmow2W%0K#!6@Sn{k|6lrg;+SC1!g+DwHt%(Q1;e+G%P8_i$hIEF zIx5?iTG5`qZMc!4k}$j`;KZ_4`|f*=D#Ou?rgGx@uWwJCwj0HV?Jc;i8;M!Mzb)RL zZC;*yl7GjkX0A`t>qP{nl2aQCwxvr^u3+r{w?wf2J$1PCr}Mc8hEzD@^7I7W!+xZ5~_ZWFyeUQ-T?>vV} zU_08h$qppKiA58h-3AL%ER)BtpPDKUUu@j&X+Ay?pO$f9A}YZ>J}sNNKmW1DC4`B4 z<;M<%8`mc>I>o6|(wZt8pS$O^{4nf1pCcV9(%c(qu@w_3u=zPdT1ooRuFsg(DEs8Z zlwtlgFGBu+U;YPvlE3-d9KeI&^oqQ$58d6G=IR|iA$w`hIgshDK$J^58j*J52ho)~ zT;Z8QN}ihIr>oa+x7#-aK2o!~W^L46eeIb&zU|pZRgXN|YO4GoeC zE$?jD&iJtXktr<}%r_@PaUEp(w|Rpfm8$xj_}esVI8i?hzxIz?qLRm_-BU56rq;}hvFAg& zBp8V!x+9st>r~+s*8=`w<9mx2_8^QP)01Mz`83*$Uh!lVt^HT|caHmLh}nJ)a6ml@ zojjiydPn0&(@E2auDb3oOW6-XS}xjKq>3^u%egkPGL-eUQN}kiLMSJ=PjO^C@_y*q zcA|13s^S@HN=hQxGVLq29g|YzY9$qxpIL!0so(O|vq|~3%KfRs1`6(}W^?0hZ*P=k zcITRQU6f*Hda`Q zVpD$X5pm<2`A;|VN^ba=-8iM{Si2v#~p;o4@O%)U!=pnf;}XP;6W8_OE3os=%@i@B~{y_N7Z%i+@QZK@gHPMpS@ z^XR5|G56 zCKK(W4DI}po)lXZs@ZX4^`Q!8H^*Zs>JfB!1y!QXlx2a->wUQidgxh<3VKCilO zQt5X0T~NGCRKKg=YO~XXGNo`P;nk>|z|#-B)dxuD4An1DvL7#=eVubm9HX)~WoD}{5kL62{g8X;rzKJ2AiP&y6$CE< zjQn>X3Bc%o4M_4@_)Em;Z|wLx;xvl>kDyq_F96rC$W_vxz!HM4&pfpZ`{wN;6ici( z)bIzr)TWqWH~J7^xdTP|T6z@VWzv8*H{klJsimrSq!duPe+$22i-9)h=J&k?RVsGg zWnC+X#^RqaEGC9}gdNAaFo;H#-@Yw;(b)CTJOG&r0=$G>%PtMv*qZFy;FDV6zwfq? 
zn-VhwEk!Lt&@sb7EL>_caj|>_v5sSvM2%+>n1}Ea&|lvGV3fMn7JB=pyjLctMdkJ-Bu!^d-9mnhwBsKH`K=zF$EU zp(7D4zXACiW1zi+Mg#^fVDAGPu_2x==m{}qmEgOP%nxKC_@v&FAWa{Z>6a*O2FhuwfPgvju-VO-R8;WcM3k5zM zw?8niZ2#?S6Nx6Ue7?ey_;09QG^}j`dPfT}dNfV-Rr2Rw#v#VWgC$JJ3gJL4X$I(poIqbHYgCDyPD%2P*m zpk@d@p`_F=!RAj>O@Y%a8Q;u%eV(+v+o%&Yx#}9vSsx?Mk|t%|d()QL1fMcv^K@Kw zqn*sQo(GVq4uW-w)+;d3)ssFu`V}OOuBW}WHUlS!4`4*G&G9w7Y9|w>vcl;$atk1I z+9%E0q7XH^N+-wSi=it%VT7$&Wg>Jeomk$nOqkg#RYl(YJ%)*2#RD+31CGs8rqbbk zkTE_ZW3=hsO>Nl(L1;W^$&B&%Kg6;f#|x#bAU0d2&5Ph`njZUF)DJe8)?MEn)z(a$ zkqfoZ8k{Dot`TL0(Izv^3m$k~+wPYRq!o}-sf?J{^i#M6wk<50^s_kaw0s6yed`t- zV7?%8OXGXL(|2ADk3!?vR{e6Gk$Ce04L=yLvJ;9dBChI?0vJes*%1cwn>(VfqzpTbhG^hPIcZvGc*%VMhn-r4Qt~f}n2#2r%&| zzCnjA{Y#wZVZaDl^bi#HbLwh|w9nQsjoG{5uV#vs1H1|j+fLluK*5hKYOJmT14qv&;&xoFpjwPj4KXifTmlRw607E%0i8cjOzVmVp zXpi2T(O<|=F3np1%+*!$7$+*r55+tWw^(f>?TcUA6P9_W?=e?0!k1VX-%~M&u1Fuc zSLyq)S0VTkr$AQO9*8?$hbA#BG)o{^V~QT-&Uzl=_C0+FcECoV^0a0-68v=AU1Sx^ zgQNxa>bvUl_kxL4Q$25^&T-d`gTVxx6p5kKS4INWND<3m=(D#)(gOeTuIC)gK@t;y z@EX4Uw}L9>myk<>lN1iCd06!=12Yx`=D}^6!p@7b<-k%E?w}S)e?!K@r=4vgA>7Hl+&=5(GSjV$|F1*1BgxprA(BFFaWJ ztzBLv0rR#(D3L_0_YTmQsJzd=c@A-`ZvV z5Yhj`6aN(-D0+9Y^k9DBl_!~mPJ$z3GryxEjkbI?cf&~s^#o+83&F-IrQl)l(&$AB zm2q)^R9ApqBoU&3IHi>RK_xS*w^IQ=m1V#+Uc)+)MP#w$MpqD>pinlMhH*lG!T}=z zR0TNbptj$X;kAN5^U$HCG*Ai~g9 z;5^*DHq-nAZlGXb{QQ{qpY~=9T%z4u|LT@Z&aQp(kMpVs)%6-Nbqg`T7Z?B2GkK{Y z*)iFk+g-P7q^#BSM>9Go8u&_kS&@so#T0Hidcgao%R?1G#wd>ga4$n=M5MOT?71nwv(x;WUp zpnN5aL|tO*FW|fJLD&UuY&yOgOOpho?ljOH8#5J2)9^{%B?XSj-$nKAI8Yfp5difO z>m3_U7`8JHv*VRI3e+?2 zl~@mC!S^0Pq!^R488W7rljT~O(T+Mn!s_~9cZ)S9BOQWKS$3!~^+X`2O+i4En zV~DGGgkG1XSZjO|I=Z!Py*%~_>Z4M~P!7#HN&Vu;sdc!IG|?F=8g64*+3o<9pN(DM zuz8=S(MGAis_71V(>G@wGP9GDyVHa#E5WImLU2dB5w@g+CYj00dPqQ0@XbG_>7GD$2Q}V4SvewxSTELIi4~y>`s8J5H zd#%48I#%1}mk~imW z^(w)&9X}$c2RPj{z0YipWGgH zpRy)E?P0)GfgK&gs9%lJc!#hn2>iC+Q}Gdy>IwpAE$_^(<|z!bab>Ft`vtCCP-;I~ zO~S`efFOP?DBdEp?7)%y3l~g?ErK7Sf4qhk&VqV=8tlsuGvJz;zXOd36O1gsUyoY{ zPR6Tjy_=rD>Elj%94ZJ*MpvlsI7zOkzClBkV|x11i%*I;V-_2jW^lWA{1%p+TA-Wj zRh!F4kZ&YA3ASkGdHx!qa<+5i3G?`I?AL)u#o9ZP{Jl0<_)$2%TqksMOm{+z36ChKq5Xv~>t0br3s~oaU#UAK@d*%DRtiEJ0E&*e~5+ zsi}`W=8aFEtK+Ujvp>*ynG#c093P`avbQe3KOvWlz?sqP;Ei|iK6ktNQZfphT;{tt zC#`DUK;=?Pw95uw&Oewyz=E6dQ7S(&u}#l)W}~cmc!J=h_xoF0XF-mBUII9qR)(LQs=N>W9UsG%N6hgc?dRt4RD{)&w z&yjN9)dtl)B;5DK6k|Ba`>r4Z<*unFdVs@!2tCVn3qisJCvFYE)wmx)z=QZQfBhIb zp&#D^YTr1?$8MWxcl!?1n_D0o^l*}nTuKq~zKGOc4ATR@8>aD1|0}VKfDEv_YEuwV z5zOb#8^ZD)GKQu>@hnB{K}ZiGdrrfr>%sWS3Q!?CGD}Ohu-~FbP7s!6&=jCR8Rz*W z#eU(}6#FG=&}=wZIzgj5hhM~pPH^H$lB_T)G#}31fLY35ssXU!9#~w7#b^dEwATtO z#!us#*x(ZRh8xa<>;~vt08p|Fs&UW|7I`+&$U*ED%+!NdfIv9}FcvBHFU$4smyLhy zUp?C%d3Emd(nqN*vw*V=LQ!A7D{ZyFCBKuYMQn~Nk<$BU@i0%u)Xsv#qOX-6q#*B> z#iOd2eR3z!U1xvQ=a165oSqxGgtK4NYq{Ku&P#Q0S@V8hk+AgwQLx^GaCMmDMKk^I zebqqI#E?{+3RWB zlH9@9!-8*;Xm?za9`lt*yA2(U9_XnYZ!NAu-?;wnwkU#m?@JWgp+#mC23H&-OPO0* z_Y~X1Pv_mg_v8~>&o_+k)`y^_gpFzc4Eif&TiMvVye_ls3l^Wxo?q|8yX0xrp`+)2 zNR!v>ZI_3{o7(}QuZ<6QX@uDcHfm=ltvPfn^K0JTCzrW<6jYv!dCB1uH_g5~LJJh5 zIX#cKULk0S@p4W(6LcX~yr-`AnBXhLi6`w_&0ZnbX-2G`@V%vi|5UD>5 zTTJ_|-1$jj#_=zHdmXPvzS(DAPAmRQ$6XN`HP%*xnfdh_k<5LiS3A7c$?XQt;?O-w z4c=Z?-gw-9GWs>ka+}=I8MzOc-p-{?DWqaeJ#~%=r&*`ZCv#d})TGhe+v*B=O;P+S zh;}VdH9NfRc{#etUpkZ)4=17*drDE;jiNCp>hMNo)BUh8C_A4`Nta9iaWiC}-TSuf zoSgcJS%*`6^dC~ZV~nEJyXr+3r_;{y2}mAOq#R98%w71p|Jd5GVil|WL)f$TDT-7P zTj^N$K8k(gRJF$EnMm{2lA4PCedkWQV&c~Ms9(;qGzs+MTQ~^NM-%!3k&6$KjRRYj z*q`Sgns=sTJDu+^9@MC@z?|KzfxBT66WYEY`abu9#GJ-^896QfpOIj8)=wPo7fBp%bb6PVYixQVbu%pPy=U&7OM)`cV)^d9=(DlGLHx}9 zAK}NJox9f5(3-ATXtgwP#37SDo@J+YfBZye3%)Pt+|cvL)dbNuaORrsq|3T~$sELyTH?c=69 
zR&JZV$eUevl-@F8Vpx$KC>8WZ%YBbH_xsEZ*`nKTY_S%^p||W)7X5Nj#L<3wf98gz z=TGbeeN~!&h?}^^NWW}JOv2QScgCqsnQgE7#|O;sI<~fXo$h$I^X%h}h4_ajP9=^F z@s{XvGmvIPK7kkFp2HOUlf19R?DOX!e$=?S0EMEYd{<8I8#>lsZZMuVub6DHtyUPn zmIO!FT)00@dr|tObtde(W`BUe`WfpDT*+!rJb#@~wdSX4`ro(zbAR`S*}-fe zh8+hey)_6Y>OW1#CEd~!Md0$X|^gn)srBI#WH{#sx#@Pswh4-}riq{1&P*35tw zYj-hZf`2oys-fVNun1~u+ZW0#12r&fk;WcW3NIHeW)_`102)mvitCh|^7^owf(y_r za&TNfNc3BwxcNDV+=GCR`CW@SaUosuZ7Q96%qff9GfT#$l|Dm_NJk_fJ!!j=VzHHn z%Qhe3#eKzn8{kYK?@PWZTDQLPbX$8g%bg~aneu4p*qrz^gcE~L+|EoMvN=r_->YZqLFdfHd3x3Pfe>Cc7@{bED>|5Co0q zCv@w=&$w2`ROrCNd}?xeSZv|E1-qa+RD15kQ~u32Is?>h{)}IBOhCvLgdLqFlb$uL zBZdGI+27OdtsAKqKD{XJmEBrzVxG#)WrnzSww3qJ@#8;`bwyI@?((|oC@I!;(CD-? zL}4HFj|B9BV9p8xJhiPL8~Hc30PyXVnXeKnP=?T-S@$QitB3lC1O@{k_!BqVt{FWd zhGc?i+aonI_f{FdEzEgk5xieZ6mPmKIEaNt5-u0t3x2AI%v zF3>My=4vXjzuJk>EP)JI!x9C5)&tI9&{J+*Ma7thK|{?3k9Hw~7Ls`YKf(251;Gk1 znsTDUzbbuek4}Rf*w2X+P7(yG9{~8+jOTCol&vFhsSissQwWD3XTX{4H5*`G2To!N z6SL|hMgu4DJtkWLHaWU&TYUsgFTwEgG#^_-!ZnazQG~jyNnf*XM z!z`RSeb6gAwIYALrAafR1Ob_1KjsIw`iYaCcTcC7qFDFysEz5)=3f?j88zXprLOGf zp0)r8kQn4@VtvVkC_2OrOg=NWaOaYR0oN#34YW}k6{zU2OnG3$z$~5|7Fb}N^ehd7 zUg4*WsCP&esR4NpKOQRpnBZ_8>y^*sQJ*-nzxb*rRy@$ZfVmCJBNx2otR~I`_9E}~ zzrd88ntn^GKG0>y4G=0dxT&aTO7v^+rmKDNtXodAzN||2(-Y>~@`egsTF&;`*TmbM z#HRj^!U(LQF&wo%3m1_0mqd`HY9~_fuwBSj$|IBQCDu}XHSVo-XKn^qat_g^Jg?v_G>=j#H6&OiX2r zC(t7Ri8`c9*g#2gdS_mH9`XiEJnRPM>cx3Km>zocakH8h?8R24aq2kfzTC41$^e-X zvwz_T#+nkp#^(yLaJ>*wp4EVWPN08s1mhy%Rg_8?AfwqX*y!gR>4~e-g=T8rD7^3eeMa2waj67FN$609H~XOJ0wDA#g6r6fg&=xd1PSeh8XiWIR%O8Pb-L90dO8`1}36&Z3tDWNBa9a zw*}REyCSo2;4~}O^)nVnub&rA2WT1i&~LJ!BWBd%=;-~%)%!W0IP8H+M@v&;o@raE zJxwT034DGh1o7c5&2s5jN5_XxL#3Z+)=V)+XsbeU`e5B-pS|mldtKq^Ep*Ed?BHBh}#Zt^Gh;(zTz9GU*YvBhWA(CLogT zcHf>24?8?^Z!_4^%T5)r@H3qcA>F}y*Gm^XKuam=17g9x1^pwgCqH(?mB3Zs+uKIM zd$gjhr|UObdlzzD*}pS{P|hyb`p8z#eEj5o{+K6h2-H(P!Y z02d|zic6=AVbHUnXxvJiooDd@>{p`!1D8Hqjrzvz!u)z)_-04K5+$A?v!8RhVsIIl zv_dEh(6tA39YVKb(hSz!{!1sY6Q%So8Da$N1R^sA=+dGCY;ORqn-oidAuEpn!ucT! zygxT~q|=u)&Tib_;C!X&{Qrq@7=*E{lEJ{m1@Qa-NlfJb#n*;bTK~-mgR=A=D)0Rl zWP~{^v?@2OEfp#enF6P2LiFr3XC*apQR7pAq*!}+b2AsYo_U6LN`lN|R@mn*8A~9~ z5NnN6qjq6+*e&b?$WGVP)r)<&m3;1 zdnWBYi!ViAiw}w7A?u#C46Pg{-zyNz2Ec`dq#~b_cq%)Fhrt?^J8W&OG%(S7GH&wiZWmGJ)!Cd-Q0`0)bPvGX{8@gdX-`vq%x)ih zD!W01_gN=lU9oB^uaRNMx7WAAk`@&i>CByeJonDgJ|sB~8J@K{R`Ur+SSLfzbc!-M zH=eC|DQ{^oY0-EuStdW;<=(PMkt2dk_$%Bnq&^@_7_o|CQSsyBR0g>ADyFhM9T6;- z)rN~93Mya-`YQW@5kHE}!EPpu2oQ|0VZb~GXx~n>5n$1r28pb@QB(Gut-p7gBP{Tl zb{>Eu>BdYf^i`&lFntH#ffa>nxuIm7F^)K}pzDQa`~bNSQJv7p0S^$;ZSqq{H(C#P zPyebZm=KHY5`#|*K}%nq7*e#2D+oCP74%ZJ3CNU*-w@%foA4VrazUvIMdx%xX@OYo z!oR{HtD|aCAA_#cu6P9jTFwd0LhfnSO~8<-jx)yrp`uu8z|}I4Bgc9kD+hlD$=qnW zdHRMP=o1tRfvWbeZb0}iqx!H3s{*7@iU0#aj~Q;#m1BbI+ehMJ#ws0e+XfdZLK@f| zzNSzwmi+>J9^l_TeKdaQYZ~XGXqKt=sAZMp@_EOs<6_niKttT&c87q%pHNMhjMh2x zc6T6{XC%vx^*aUcquHc7eng}gb8${$tD^!>Ku2GBIIO98@+bs;0gSZqj}|1q#P0Z3 zoixh%Y-T^_((aJz7B7W%=iP+|IlQ>=SYiq#qGKpC(0N$zs!o-yPy~Xz&kb z>atu4X%DZgI~OhW>bCd&(FFTPSiYT_f;e3;mX?^;20RSY^? 
z?WaLbki_$SrpIcMKq9gS`#$S{&;y)Uxr5j?jsbjY>2i%QJ|2AcJgE=-GX(VR1FXy9XPZr7?EL%k{icI_eF6tY)}i zhUj#ADAwFW{)w5Xx$x5Q1Z2&t^+f4O^M2vZ81>Gb9b2w_=^meOh}+u+eqP1;UXI$k zm~(*Hs)TgP!4HA#mK-kWPa>nWKG;9!Jht75VMl_32MluyOb|ByPP8(1{0S3y3wbU+ zWB1&lIx#`^4%{~=*QVx?SC_)AH(TGOi!>z&a4??ie`i9xaW43D`X0{mj9(WXBa2zX z{(xvh^(iNzi|s?X1+SwFE(kAZT~XhG9)^NFCZG13OL4y6+DUln*0VU2TuBgIx|A<> zL>t6JB@=SBW>4N6Y!Y&eK4CK0u}Av#>L~Fh0YWIt?PliL&zrJqP@o)rF(qWXFgkw1 zbr0-$NnO@;+wL2m?p(}xPtuHKC*ym1l-6i(!GTckX7=M^Bsa^0RLN{&`61aH(~K%o z=lvHw{u%dP@5g^Zb$k3M7R6NRBb{~|o^xMcjqY+IgL&rX^ku)SrI;a#)~}bYQv2Ni z-EAdFPJ%U-S3#lMAlwRc$eh3-uZgc=umgUPiT*zI9pFL}&)|(UpwWNa1b@~CwR#?; z;>G5J%8!Bb^l3LJ2EnpIIhug%9#{kVi^I5-&qaeLc)(Iq#fYB!1tCIYn^$9M&Xd?3 zJqs5+7$H+m!L$dTSqJVQB8Hlk ze}FW8$rWmkhP{^D=o>8{Aj9mS5X+vn-02f+jr>rjt<74lcP*IVvL{!S*rl0?MqkGR1L_dKe%?sgMe+nK&b~Z z%QdwV)>#MW`Dj&t_n$ez`%VX;vhz8tYZWwv=F>`^Z0mbqXuWn=?Ji+!DH`r!JaqB+ zlK4&FH3-zH9Q&G1V`k!1k@@e%0z1JG&E1HL(9(8W2i zGQK;FH{D;n^%6yCfFk|5)Ytj#VxEO+Z$Z;t(1afLCh-d@H;5G!1(yWD_uDyGgE(+b zFR1CS4&za#m6kTMKY*XjFCIknc#-`PUhsdyDC0*m%HkK~!B1R@rtHFY;%5L9vwL$m zAh$_iZC2n|KL6?up4p%cJzR}i*1#sL`KO@L2O^Gk>&)?hgHxy)ABpV|3CKLJFLlW%@+=! zfD{o0q(=c2K}5RLV4+Kq7J5`V2#8280jbgvK}DoWlM+#yw9rA3ULw6pFQErW;#v5X zUC!+9d^59W&YbJof4$FZAPMVP&syuge~nh2Lo?KMK7{fd)iqd4ZCdYzt%v4FJ;n|4 zH@Tt`^x0Byj337UwLx9QQocM#GsA%NOVOqO0;OAQEbj+u1>Gd%6=y?t*7OKwmJXxr z@rH*J?|Z*#O@B75zPVUu)D@3#gSnMObKaKB%zL9{ah>#&#o!ooBwX_1`%U)M_kxd1 z<~i&=x=mGFPeyB=s#UIgFHk6LeNLaHUEEp!ImV4&)0#m^#C&-F%~QVg#}iSG7oW4S z4@Mf1o|dzdoDNL*^g=nXj9-;QM){Gt>xrpZ?UPGYoG*%#=RPK$e{xyJT7oI8$T_Jy zrm;;rX~DddJ~>d|zxh7jN+}ts03-K0eEI|Zwpf;e)lwOPk4t%%M%VvAVt7O(n85#U zQYJ#P1cW*VE(Vd4qlHYRoc7dx;&M&mUC!;HvhcTZp&!+u0BJU^k^KDF*M8N*Eg_RX zC4}DQYbJ_u*ne2UJLZTpqK-AgUaX@n+#RW`c%xpGr5fF(|=rLaLI;*^S&p& ztR>94;Y7c%EHL}gy2jgIJn6_l;t{Bk|BtEsQ7Rb7EAIkFei!T$deVc>|+aO~lm5WaVb_gh2;Q9Ziv zn$5}8IZVvOG&JebIqcuR;})~~bn}N2om@~{boR&jzCn@jZAk}wXLfFT)|F7n!WqQp zlN8BcAd&s|%Qj56lSpA3dMAA3!ksHiqdY!xh`Z2_y=vjZdsU^i@OhR5N3M3-iyO;^ zGWso3i&)Cs;_|=7x9zoD%zo1TnI+$-Ti)}>Z}X&t&PvOk5Mz2s<66i(FNnjuy_o$a zns3RzGl-)!C|52dJbfWBi?K;*TH*ouaDSk_@Y{+4Q>F+VB_##zkoj-d#U5VUDM(hm z=^xT?MJtvLx(-FQ1;-~STUGP5Jr z>(Qzuh%(t2K&KU$a!zy|iS>r#kgw9{th;?AS=DJi%ichx3!Jr}s!jO1Xhii+=u1^^ z6@%$}CV!6q;arpdsazN`14rh9kr8?s;}eCdiY(0UBMUA@lWFxPjtfs>q#7-~5`M&f zRu95(=h;#_l2C|^sFrS9=>)`E`Z*Y%gClb69HeH$rz|7dS@QLh0*xX`;Mz$xufDeC z+`34#^P1fr(|dO-&Xl8i+-|Li^~JEnolgo3`sRONI{$6|-GD|RH?N}m;pHID64cOf zu0#+cmYO=UPP~x_(;?VzjH4d_qVu0PGlYM~eHZ9JV}0lQk*fHNzrvY8CxiJf&e%R1 zf5{>LwX^d#*20DbS{ohAguO;U^C1mWfl2Yv>}@HYqCRB0ep-Qx2y$`&jyirO9Mj}g z(`Fub8J32kb}nt(MqxCExp#fbx=T8&ja4G9wa+l_Uq({ja=+pMyW9-1$GMyMo?bBF zrC>Y+?TTZgxP0fIkQ4eJ#RcgVCI~sG=J126-TS!D&R7(@w)&}{XlIzv_7x;Z?Y!ql zw-J=I5*eL#51xhjA-Ljvs9@rw)W3GFGU(=MF_}~&f_iCn_|)S9y6$d(yS2&wMoc&c zstQlKX5?9wDfICrqjz@9BQ32ILn_3CMb?&dj_6c@I?J!5&VUddL&$*Hwr6$k)If}% zR)88pUIF3*n}LnY7!(f_tu(*8eV%*WJ83f)=Co=xzTC+a_sC^0e`}<9cm%QRt`irv z0wT>WM5_R=r!i1d9KVQp_(SFR3&iEF2~-8rqHE-(KCWiQ;0ga5gBiHKq<+^eMAxMN z8JXG(7>AM;>D#@>y9Uv=$oq6pV?C^=1xlB`8mu^umyi&SEyCwE#mrlDp`B&+{C7Su z#kK~fCMYc${o{P`Za@K`!x4Zx$b8`61&k-;WFp(NM+i&b3T4@dHxnn;fmL$PA($#( z1b$Nr`~ImE_PgWP2HRA7f;gDu-$|7Rv*{shE#`+jo9^#`uP6E+;K_a~KK}@W)}oj? 
z(g8z1L9v<>`{aM>hyCb49eZ3Z8=bn2yR?oEbVN@81VkMv52FCJ6KSfBJZJ0gizK-f z3RFPC$V&Syirp@Nj_4JFf|#6Wh1M)Z;x^(5eTY}EG9>>K=lxEj1fOg{0A>FknD;}}Bnl~OmUsM@p-kI+UCq~E!JaH0L^Xy`Kfjp0Bp{R#F)3F_w>-LY~ zl6<+1S`wK`ZJDp_E4VVm;PhSUAV3~!*Q2{q^n7XT0*!FvV9{?HQ8Ue%3AXmByg}Z6 zaYawD#g}mB5r0GfuDoVHGrwUN$3-l8Ic-h#gL^&k^`Ldyg~ai`Y(BB>nt$`I(pqYb zpVYEZzFqw5S0qUX5xcK&`LHH6+@&Nz!3XsFm^NJ!v5%hd8D2?;I#cfTho)8|n*FTW z)b}eGH;P`KQ9+VUZVnux4n=*Sp$1sR_*~IoHAOaN@b}UUVN}#FOlEXf@VV#0p*4?yFt+C%PDJ&CP8@B?Wzd&QV;t|M=s=cS4NAopx`qOf^(v5~i{T&p zD19)f*7Uit0CwASRw9NdxOfYEPUeGFQe_K7hyA`EZtqaec;T*{*YG!kG^-Zb4v3#! zxzU;r{q&;DrCfm(JLbrpy&|bKZrY|epdOzFES9xU{F&4lJxrJ%_t4KV$z^gBjnj8^bC0@jSTg^nTyaoL0C{r07}t0%Lkj zgb{>rEF(SK;|GgA#e$2@Z+Aq?L~XXA*Nf^^tW?r8UI(R+v*<2DZqh=>!V!>0_l78q zS4-)tlfs`T8KI7@O~B{-+{UFkdln+(weClptbK=Bidm&T)EcKpd3YF;g05cj0XuvO z6Lz&@H(&M(^SBB@9b!4EYW1ScdoVi6qI}PmjNS2iK_u%AA}Hw;2Crv(Z#MV2Jg}q8 zC!Uy^409mzj$C)ygYILtOEYa2E!l<1lFt$Ka2tp0c%IOPafz~86>6WT5wJ$xXATis z$lTXDR{FPA+5cC8anJH5U(0QW3q=g1H#qEj?OXi>ThTY1*LcY`&_pV>85LJl@Y%$I zG%`_H0*C$n4r?h2GNv?yWw-zv*;$1QZVz3 z+$&UL%rM1ne}6S%N?k|(dDnxsU=gA^#LpEE$GwJy-LLxe1mfAzeISHNH2LxUAl{U` zuuu%&CvsJyN@C8%WMf(W1fmR$y`v)T;SL&DM^4t=Oz6}3yP3ojZzWz%t!g3rdO<6R zS{{m?IB79P287F{9(#=^eA+$bGHOEpJ9Da2|30T0km0=C5A*TBeDdzUa;pD}5LL3D zeg8tM|BJ@D^PcWB`YoIE z{tDt-Kv|qw{-kEqxCG8OFFSu~xz9<8oa8C@qft&~dU@!uL}0bia7n?ZZ=n!(#>I^3 z(d`7}dOD(Q?p2)g|U?&bB7=28A z97}!mvfY+^azi(b_FGLbf~AL;S~ODF@T5u|JCOm=ueFf$pG4}z8jV`;v zQF(6RxTZ3Mj4UndKNNMHj{4sv)nYgRQy&9_(&zt$F545kLTTmmjR+&*`~JBX)mBWLZG-=KHs zwo!4lK*rm~_D7br$2w8byKAqD7-Sz?g#K&EHN=8nA21Nw$$8r|z!nQic`%g~1cNGUW37Y}iOEyu&R_5N-3wLeTK z7a;#hC+?iNl0Tqfzk&*el`2J6Ut)|7uoNy3ItAfhG^m9_nlp~k4sh1YT zjwbfaZukaMPWobtXWw32Yl^IbU(#Yyk2@vEnPJww(@Mv>x!U0h#ZA3a1V z0M~L@kIc({jWIp)6O#d@@-%;&4E)gB$1OtRRWqs#)R`%#kQ;!FEODrge>i|AnY!3t z^154DXdUp+rNLp|1UJ%BdF;Rvztpq7Qk$@)6oY<Ec`` z^EFS^t;;-YI&~ejFtba6)68temSKcGa2ZwtL~a19xpEmR*f4&r_v()grnhs#ZE2Qc z9v~6=18UqUFomoee`4zUcFoC&6>Po`nTv$B%3%{o30p|IP!{nJn4~f2|6tHMZr{c< zH#X%LDTILfsN?9Ss>Z-v(en6SMd4VRmUPJXL5v_iymu$*#LLGO=OR;8Aw+&TL7XbB zbgSRFaNh4%;O-FT@5%#P{p}xb)+P|B@xFwlYSK9SgIOc^flg`Do9Zue8$9!D#r>Gc z(C{bH`Lt$pM=S3MqLsgJiGz^?=jzqT>rW&ZcdhFLhhfcTXEs>J48Oi_l!bqXH#W1o zGG=Aj?Ip<&ec!AgO~{UHo2&`RFuSg^9?MPLwqAt|%wJQ!FCyN1iC4UQs5f^jV2K}> zht*uA+c;C_zMQ)t6>&rODG7bc7;;<(Vd?<|O^6Mmt%3S9MGFPKY6Q^40aNKdAp?}# z0%x}v*dm+<^`O}Eh)#Gg%uzE+Dk!?#XzSEj*wQ`d7k*d^v>e_tCZ%Vr?)icq%$akr z4#2Y9YL?@ddbr6Wa&GwZk}SyF*?mnYlChqQIX${)WZK|qhRF89%xOjeC;FJ@hzjDw zV@fij>?#2^ic0Wt+URRFlj~>G#;iv*eS`44v+pBx^OB)(g&UumIQcdN#xp(735q;_ z#6KUUN>o?;QDk}8ao2RL!~1fg`R7-2`2~)xR3b?bgO@JBsCRypIDLUzEeE=fn?}rR zKGm<;w^R$HnNBQ=ceg+6VEfQD$A54RM|XGyFCL+RC00L)y*keCspk2SF#X;o>N{9! zSJ8mgPq8h_Cm^U7B77l|R1(M!z}X#P)$sIs0);%VeIkaMXMexJSnYk2BwA&4-Bb+n zm&oylAXT~tz{H0Hh5r`WyguDC2kkATigLf`3W9=@uFlQ)mItwxM*c89^uw4FhQmI* zgY!`uQzT;e&&xLL??)p~kzsb#4~;yO)cfTam#`&ld-96H$QE>BOay|iKvc-Z-O0w> zf$&%p1dMGCc+myQ_~v;?=_l4dO=Mn7nlx2@AT-(s?z#l=#NkFCVn9YgYV1xLW$4}L zfy0_fK}NaH)HO|Q%G?#G>8h#n7EKnsK&3sJ+KkPR{a$Ps9$8n9>P7}ml+CxKer%t! 
zkJ<~x7lR=`TB!`wPb`9_W{ebce*q1uFHK6o%3=^>JDa#}bfrG*Cq(bmiY(KcLO;w; z2!lYDWvJ0P5-t*4)K6-mQ!S-AW5&ZyU!cr$c&jAeEBc}x$uJIfPVwA+ptlSn4Y`% zX!KGdiitiZn|+Y)A{k_>#=7ugT=0>h+08RQC1bYN?7Ich3F=hOQ;)W zPSPd_6VY$f(#c7Xr?xa7&pKvjZ8ADUXAVS}?sSHOJm zfg&FGwRycra*w{BDDfRQ>w)+P9oX@rVXJz{CO~~yJiux0T4UD4q(coaOoa7)KvnTV zP(d(Tx_vr9f;S!EME=?V?$&NB*xoil7w!WnHk15e$$}$LqSBY2kTJwO;w)wuS_Z?% z0LFABy0ca=?IimDi{c=LPw+ScHdfePD=~=wV1ST{oV@{N_wqQDK<+QM_y_0VJ*zd| z1BhSpw|}>C_$4CvFS=IX@xOeoPnp#Edrx6(v^=g3FOTVnCx7d3_BAifb+Li(wq3D4PkwdbL2;_=iIQw|^gqC8TWF9b?M|Gu$3RAI(jCi1@oYTg!kB$l4k0av z#^#A~7_GYU8|>k$x}R_YiNSc;18hF$&V!Gli+e_`@n0AVBpDzjzxTrbQ>=DvM}}a` zpg)JAf{v&E7xLmC=+`$I2gBbyaBRJ;aa=D+J6Bn(rK7+M@(2Jj1ucV4#!taTVP>d> zRM@h0Kaae!*h}X^jfhzL6#M0u?8yUe&kf0GomhE8MYPrJrZuwPe?eXeObVfNPkp6d zS`(?E@gsllF5gO%C(A)!uWTa^+|oXCSXu4~LESbx;sM2pgQRD0WxtMO6DFJGwnCed zS1dQU3!f}z8>Ka%mh2jan=&E#r<>eFwXRKfJA}{13>Fqek?t18l1836ia(3O6>ldi5AH!fQJ!-u(Ya`dJhrT{|ZQ^)+riZV&uW~c%=$|pDiCL?Kkr=eo`9|o`-KC+u zs^E!d?U)f11Oc+;{TE%^B^FIK(|YV|mTcZ%>-tP$uHRTBHK=jnqFnf$aHCuQj1glw z-*Co^d)oBzH1(GZox1C&do{_eY?CwL``j(VHKSfl3o$M0H{{u)i!WCW%*O3s8Ca<0 z?4Fch>bp1pIJF=C()esejeh*|mfB+Mc&Nq0$rr6#_ca0J7iIOvq2F7t(!!3BH2Pmo zmj5XyPfA6*td4i?+;ni>>^JhxN!hO%)DjrKn%q9MVOA(sq7$hsVltkdZJbYYY1#Bi z{u98HXdM~*LP0%V=}oJu9hfwaDr~w!N@JC;#Q3*ERRH(8@$ohvt;S z))biwoS^1+LAxi`GNvCGB;wss-(&uYLg7zEFeQY+W}%)t&|Wle{$EolAjA-ub=N-Q z%t5Iy9YV(M&zSMw1itBc2Ccb9vL$HOdgxmO;ATjo?F9pkPWy-rI`)cRy??0?DQYHz z70Z4lSi~I_a3{qtf$##ajjeq%ol0Z6n3OE{L^xfag`0bc64l%_{At$(D-l}Dx;)$x zm$gzEJ*=e76N|6oY%T_Qw3#$QoI8bCOMSMET_@`8EazVOI$a4-X3-U5C)i(aS>6lb zC0m7oM$2yW2E5Vxi*;|7uC>-lxGeu3|E?h+8-I6$PWAbWLB-9lx=&i~BHE*(^IUS0 zN6LDAgS}DyNL%Ey)RTq7BaKOKjJeIvDVzbqSx9AX18v-iw3%1Ns{MPA>KhayJD#0q_==%YBe zb}{kruC?%U<2~od`GwLcspxd)ywRG0c1L%f_`dk6KQHDixxf-#B8O09_twY#&XNT23EaKoL) z#w%cJi=BL}S>Pu|4>0o+D;p*b;cFM;r=kS!(0_4Y z8lNhL!T(ScjLM(DNaRI$k2Va&0t?6^$ zkz>%W=!-B5DDbE}(jXsnr6JqIp?gZ3X|(Tu-$7+2rUmg&-z@QCQ;AdnJLD>g0a{^T zTG_3Qc9Y`*z0b?F8EC-Wnao0L@-6&??EIca9t|FCx}lhVo4JAL+B=E_yHSo<2N9Xs z=rky(yf;ZS^U~v9{>I) zLIrv@J7y1=1@Tb>()A<=rn}U(L{@)7xBU9^u|T?9RIMBX@w=Q<45Duvc$40Kpxinp zSApt+32eMAO@Fua(o1Z`ebC{fLEF7YblgD2fPvR{MH^wlimnJaT~hxTOXt{7yIEFU zp2*!CjLNv2yHMQk1>agP@B=@=QdsETSoFaW_9GI8SMZQ%q8pi? 
z`Nrz?%5}Hv73@baI{haEaJ|ex);((YY!mck$+G9RcM#3mjrCqM1=?nPSgjd(s+4!l zRZ?$AE1Ib~v-KyW)VdiNhUMYp88G^xHEYHDGM=~;zQ3n4xbA92=XTh(O9q$&2Q5|3 z`5$aMl?yH$`@p&VK2noRY+&Jlv19pmyJIIcN$a4$Y}FLJahbzYU@a%=)%q%;54n0z&dMTf4IbO;Ob zF~{;n#<$MMbA+T?asE9*Dua#&ik%rd+lr6J7^197-^3R80Rd; z*%#PpHI3g3h(Wj@@@`w)!$je&v55)OsTAXjm638!v|DtGlxMKO=+zXL-&i~}X8N(E zj^7KOn39clf!TBoRn-ks1n!s`tM!(d@K^?%R`xt#YW6!nwpgCHsWNx-YkJuHvOJsI z*$svH5XYpYZ&R;Ti538KaP#W~V|X@FhTB^-QHL*4UeXA^3_3T3HQY_CA>qv{@4jn( zfB}6<0dmNEHvmKAu{}MyEJ0~#LZ^Dzr#Q$}ebek{ea4&P$aX*BO0m;Jg+3DbVjqRR zkn+As#hp$%$gGqoZNpg{}-hzvlcQsB1{-*h;a#&lD~Ku_BSzSTOl zKH@TS;YaFg=b~aq;DrXnnpL#bq3d~F1^ZmXF>eQ zEPOh&Wqm}gHAyL4tb=rTN+#^8BNyW3N1?lj6CRnnySP2s*`BiM9|n`e>7#`m^&GFX zajTj0QvaH(gZ6B@Y<~=Q*#L{Rfkd^6>UD=ZPl56QlRANOPX$k0x{Y@0WL1*n9LK=EzcZB#k z6(l37p#V{%_xG49lk$}R{bpTX4%9#W{2#M1Nr6hFD~AX!y^{muS@`W zVk=@(^@xfG^owi_VEsG}>?7O{k{0Y!uxNaM1A2^@UCH_zh4a})(Ho|$9A-%s_$zsMm2Qwxj3`He}O7cWo<;U~0eL!`GP0g)W5Qc$J<$=`fy>?Wc8` zvobg->@9VO*;3iKR4?wwQ69Bgc*pn|hYElYWLRr7>r>C{#8BMqF}SYPDl1}UToMjY ziZ`$a$S9x-#uCTXp2Z5e?V<)Wk9h#k7mY)OruW7v)9A>UP!I`I-qaQ9<>LhpbrmlU zFg}0#t^~qO+kSmLD79>&#j2-Ytw;j&zY~uR0stqtzTXXUT9862^Nr&Gl=b6TK$fDL zstRQr|#TEZis!gpD2@PHdcJoHrO`Ox2FYuQ!|>kM$lxPi8z_KtXUeuFxQ4NPaH@*`Agp5VT_gj+2b zX}9z)s%qxtQ;LI35>ETeVQ`0s#<)Isw<`y$ZGTfeCU_6^CCa;XIPra7s|YEN%!bx7 zg1%Y>wf~ej`DJcrU)S}3YAQ%%RN3vkl_&`r$2WsL(F8YBsa?j2D_1i62h`uWGC+n2 zVE2CdXiSuqJe#D&5$uQZM)=35e$DU3rm3nuPJ@?ysylPih|}E^+ksfWA33EjQrD^W zy!J(Q9S1M=uUxXVmgLBIaGpOA*yqnP4EMRdyYPY8}!0hFRK_ztAfQFRjPs)mNx zZH&|zMk+zbh>ZB35QLBPhI4HNSN(n*@MNmdEQNTk`UMi3(Z6!Ri|i51$cD-dcRUV8 z*qoR`Ac-x0)El(Xd5;|nSxBP10*PQD1fff*o3!BJF3JSf!;HHEKCchOzK*aWy6H3f+P;A;s%UN3u&5PM;Y+r!q);GGeqRPHHPF~M=CJjYo zB^dLEK4Q4L-g2+4Fh*`FWRf?Jj`lawaGIvowLwU0jg4aVpzF(s_fRT7$RhmFBIc}- z(D>J&=b^Ry+sH{&9qPoFDOx`7v5RMf?mU0M(m8`jBeLVHr;Pb2kLe3t)?;oUk!Rlv ztUc+WrtzV)BK)i96DlQk>~Zg)Ywzd_?nd(UWkx-a54(0;?pI4e#%bm(J_0~$DFb*s zal(-Gmlp*wNbbrFo&$ym)-^Gl(4hxT$GBhATj5igEoEFK8JHxvXA15heyZ9Ol;U)N zHeHUp`x8R7+&r;0`4h6&jtqwQZGa6A_6huJWoz{OK70k!n5Oswt%HbzeSn3!no3Uv zzmSM!9z9b}(YJ{Bz)AKjqmSWQ>qUR*KkUpTHCZATbbw)D&TcIQk$u{|Q+KjtwC72to+>=o*3gJq@Al(CIh!4wibC7VC!2Sw5@oY1c+$ z-w(y279=JU-VHWveKY+GhUfPKQkhQ*VS zBt%yr{_fx;+mJ;y`3f6HT+%y3^PCOOe_()P%pNmF7`-+h2?ED=xryK66!P}YE?n*h zUS6#!iHKbca418~!+%+ibmR~o+X3d%zeNDjqL>p?IlUGKFS3*B>1B}1ax82UkmsyV z_qYfY=Xcaze0VMom!wkn+^v~<`(TiG+xUc@(jhqHB8R-ET7!2G!Ih>RV=uIov7{>U z>PEN#EWB~eF{`Y1T85n-F!Se`M@DbrJcg8a7&FN~c{-f)7=2p0quB%vKKLfJIz7BN zd(ORu0a$j;OQ(`}yybZ~be(Jn-Qa)8Da>f=_I(MulEXm(%rjTEieX7hEB&23xY!t6K@eES+6Z!;&d)_@R8yhKoLEi@l5x zqnv*nemJu0g*!j>*~yYMB!QkyX*>+RuJfo* z3|QHB9~Q+gJF)~wEAlUT8!H!bZ%#8aEC}2$XMT2*z}NhnXVw>P!&8|wftnfqGWM-7 z8Xw65+q2J{ZJv9!?YAu_;a$Ly-wH-)TT44wYAT|DC{+x~)0OO;B8nZ3@YwP(ILF#Y zVen_u?J9i+F>XC=&uLESp7Knin|5mQJ3jc(YzqBxPC*CLeh!|{m4lF1Nu-dMRpdWd z#!X-a>usl~_aBzz%1m5|3-gymHBLk!PB;pbEWXj~lXqxb%Z9~*l)#KgsYw0c=p5&J zDgoi~i)ImdK9U&#OzWZLN%qdr>=Y z<>x$y!gk(AfyLs~B>}reENn_1>sLD_U_d-tZ%?mn8n;edzM@&EPhaIbaIx|oR_Ph^ z_`~nuRem%Ulus!7Z(M@+nw+2wzJ4!AX1i@!lbemC4cVazVk?{{u;hig6GqRTe16Lw znR;a=7uAalN~JiLdb8MjOh_$(<(nrl;h+y2IO!C4w$Ux##8&IXO_V{14cQE!9DHmI za1?cr(5mXux?>*ojL+pc|4w5Alx|yqTj4{<@R-&u20I9ccrIhNE$5k@6$kEc>L&{F zpnhC5eZbWcz;N#*FA29o)J;K%&I0gd+<-+vHwA&Ft_&WHN0dMJJw7Nl`k0q&{|BLJ zix6YmiZ5~cfr!~T)H`&+foUolao6@?=k2I&u;jE&;NWIgK|l@#AcYx&feHO(*wI3V z=aYsVtip&P1Gr0nyY3=yfQ&_rpAb5`BbSZ4D=w-it;zF1+*P>e;3w=hFVd=QaeETUrzwCkj1onS8P}QWv-B7n)7ySPoD5 zx;F88%HkjrzGHfL6HkSW&M1A^_!8J2EYS@$<@IP9*G035uD&mwA_;P6Rmgtt$jyNt&L2DVDSFg> zPhhL260`C>LK@h{xU519xhaH_?n}&)!HQA- z@R`+wTKIkPzFKlP1fPowJ-p^|NfoE|-d_Ab_(H$9NZY&1ZI&_IJE$gF@7qb8)04h$ 
zJKbZX~f!%y)7h*KSLZiX1LJFF%f9$t9@$yoP8SmSmz+{7(rj&xz+?P z!1H3(N(5}LI%El5qlmRXy*$+wq0r=SASXf?1T)4MNrLMgDsQ&rO1(D$eyO_)e`daA z?|_JMw@J zLIojhAsW@y$o3CMteo04R5;8=Kx%2(NIzV-ab*%Y(;*M-fd5Sx#_~4BZUcCz1%2E_ z6U!bhmwH0LdDX|^ZznSZ{zqa4V738*4w&0P6iR@v=5b`GX&^(3-D{4+C3cKA*U`X& zYm`cna42MZB;Sc-k-md-(17b0vd)HVg&;yaB7i>8JIKl*EpsMYgxma=9q5k=gm*mJ zj*N=711c=S4%H9exPgC?RT6#b4~yH!63iLC)WARv6opXyIX4vq^|KZO=`M<9J{bRvb*hJZb=1?%ouJ{@!550tIRZ|=+dIyer5q;m` zNk*yMI$<#{?PKNVRW!_g=3sdaTc^tBS)!wXNljfvT>wmu&QBKA$*I*ysqyO%OmCH> z><6;TKIqEpWDa&|zQ>a7IG3bvwVX6|SGnU`^+92eQ>`7D=cCcn>WUwfHJpvF4L@(> z*Ix7-_yH6{<@&w4-@5;*YfM|M%ju z|6_OK|BRArqaO(%QU)Aq7mL(%D!5q4?Y$WGU*CmWf2%!L`C>_@2*eY+wm&!gXpb;) z!o5~c@q~4LJd=p(zlJTLUum{^Qp*+e^9h*y8 zy5%{h>B}8MFQ8)ICdZ4+i&-Y{W0wECtg}8p36j2Oko_tiR1D}#T6|rVMN;kka6NY~5W(dkJlp!7d@SY# z&?6rRD%Nae>y4%GPrH2McyG$-cQGZ$TH)NH8bbUrSi%0!qxnDmXEmxhKUC0T(d4d&tOk{CFh6EK0J@j zVtiSkP6(2j#a;l}Shcb07m793Bd*1&HUr-XbSRlV~( z+{RUwmRB;P5g4hx`*&M(D0{IQg{v^?&hCnm%+>M^@RYsHG!zftM>fU3iPBIquSi6x zr!ezY%1=m?uS2>qm+c0PITimGJa>-bfnG^4 z36GaIrpz!+nNcY%joS2Id)~i)&-?o^<6}9ea?-#mJ0t0y%igfh6{WAV&Ph?l(c1JL zsJj;WOZ>vqgO_*`$SW`F7bjlx|Mv~x&S!*oLyc&Tv^OM;XJ&nmkhXe! zVu22whH8WTZzb8l`u9NC(64NbE$hGS-w<>O&#JkLzT56ST^oq!aitg;_J?_)LA2X1 zh7YY7HTL45y%VkC^P;|ZvzE)DVHHd0*A_F{HyROZl7AE!xE9aXKyJ4xpZ zLwOaD0Nun2hVHcMI3aD>b2SyU>?dP}LnlLnW_qu@QSVJ3HQg#ZzW@&P`!`-{)cRZ5 zaEL4k=}}!<5<=v_*omjR?5*kho^+p_+0%ix!5fq28&M33lNSQ%j(lOMAX{%tHKw~C zF!7tJziguQYJpv)4!yf#mx(*mv_UwOfChjcDj?Efx~B?%YpmL7h7r%dTsJ|BeI2(y z>;Ca|_Tz7@G6Ne^v_9LVPHmthxKm5H!A%pPh>yW(QCv}RX}wsels)lHK74F6<(0O5 zS)Ux#BYmBl_QC9#Uxf{U8mEAbR*O=)b9UHh@OfJi+iMiZHGI+$PBwm|*trCD z;b9{swzRhMkTe=7wa;a2ZIFZ0!`0K04;o1+&V}#zU%Mu)^`7B4$8mvQ9UPy+jYc9& zXJG&j;MaP2>$|KJ*~-CDMJnz`>$J!*by7YEvmxl3gsoju9NSLpUaOCf%m zgMScg@&RQj*mZgx=tK{Hpc_g`kfDA8rW&Jf&el1tm4gAa`V@OOBKBH2gb*rIyOT}03X zRvTD|H1KDFwynT#OdL-c_cQI;*Y2h$_;DfYm3>6=mr|H|dK*UDZueP(a?MpXmu&wa zqblZapNPNt%Kk6|5QhF)$^ipy1(ugdME_dQF8*b}LIu00NwsX`_U4J8Ly|C}0yZX> zf%foYyuriDbtR%?wws~=gf$$sCrDoB19DAd>RJ-MflvatF;w}+;O~_HmdpWpPY0ia zbg?b>-PWFUR4tRe)^L;QFnJy1Ak_{+)OP(N?89|233Jp7&VQ^BIEg;$1q=S|qY61Z zvrE&3ze0^Z*o^-PQOrUrAtCjL7CD4CL>Fr1!X%2`mMB;Ge6uG`JvlJeYd>!l(L+&B zO&wvMcjsC84+fGQ-~X&(OTnZ^R%9_a{&d7+)N)6biLA1i)vHIhIJ|g{`#jxW6Bhz^2jY_nFJIU?ip4`QT_C; znkk>`v_uQ{F38T@T8ogaavatT@Bf5ZFUUIGcwXOV55l{+tlNtp)k3*G9}>I{@5Z!O#D3`X%tvCwfKAla(gZ-8sJsBR zN0?n8A6f^d0OawM$^zf0l1_3ULAY(ndJkyrdA>;=(^4m-uj&}8DD(P zp`%VS5;}hK$ieKH5C3EbLd)_eMA+Z6XquaOPN#;} zvJDn|J3HvC`fl5T+#mKV^)yb}pfB$2_D}{}+n1ZN_Fa}AtAnfz4COKz30gp*O;*kC zFr|!8=}kY)meBd#f5zMkK0V*!6nu7fu$Tj;Q9tUpL=WsL{CpU>a|^vWtk&utQ1EXT zwQJ)%r(cakoMlJ4q8mO;)Qw+=Ss(gv)OIBgMm5R$^}??5hmwABs=l9(K);sx);gZ< zL}|bDXs`eVfSmkE(1DfU8u-;1yLR;|=a2T!U&L4&`5(R4urG+IBVMpNW>^f~4PM}3 zJyjldA?f>Kihwu|GOpa9nV)R0%6%|JB)AqkMD zx;}k`#r#f%nK+`t*e^@%yUdin*po<{;~6{rbQGZ?1uGB5hX^2--@xax{0jU=iL@K! 
z>G{K@S((@A%jJ;`Xo;VYDN~Osq7Rg~?*za+r$3o7H~d=V!2+50i!n3Tr-*Z(9vBdK z^}UwtXyDP=sYBF?=#MzV%v-MXN(SUNz+G3t3j#)@vC|$huPEe0_4O>#Fowue=)v+o z+fQWewY)ff<&EsH1II$gPfg!)Xn|o@FuJ_8j&P~5QY;haq9Pp`NtYP3f#^9F_b$J* zMZr@urWT5GZ^FLw%1U+#=80-!+Xn#vXYz68M#c9&8K^TUJ>mMCSLo<~OA*)1ZAPl1 zOD};(^5n3cI0%x;&=X=`4Lj)$RmD5w_OXrnpQ^)bhds4)2?M_xuTAWyeEIWx@%Qs> zg(w{^THIaKmI#p12&Mgni-P8B?fF|FrT2e9oxKw?f)T43^h)06?yl>u`SIg27K)#* z-Ks)~eF0;|jY4)EAuxPTK&OwLv@FtP1+-s999{EAzOc+b{O|#T( z5K8BxyC;N9tEHx!azbSJvQ5SW4>wiTydSk~%5&}1Js|Ls{dmjYnA9=wHRJo&{Z^6_ zsC+*%JkMv0kA2S2(t%8&6Hl5ru2Tq6kb2yYazV%5jq6Y~GLJvG1=fJpk`cnf>kdpl zm>S#0DsgSaod&RU2(pxJWqk@p=>f5mQhZ*l8Mdz{IZK0?Su4)!u$>dR*+xd+wuf47 z-L1r`CjbHVkEi1UCO@u^Eb#B+aEsXTyP-EK>}g?ibAq?IakVc<_JAFQR?s7d2ulVpI(A2nb5UgFPeHNW&b*W<*wIn%+Vz@9|7 zb&OL<42J{;1|7v4OsQ#zt66L{E9DA=AY)l?6Ka>Pfs;cJfGDMW1b}!_07bwVP4xKDc+lN=aNzd(*zHK4VqO$H!(aHx7*nIF zcA4{)ds}tQ_J3Ex9wVzSI2;r1(+W=e94RGnj6SIwV8NYWdlGzF$|E$kh^;V>e^p;3 zs7FX4vZ;}fDV6c0?oXF)A3qP-Nuvv^)&ar*x^~1Tn{L-tQcM@|3LzOUJ9lZQ#Ac(s zgbS@VKF%VyIhQZ@((=+oBr60TUrb0YQP1&G==h?tMr3Mi@WmZ|Q%vY1570Ki7 zN!^AC5=-xtG37P19EbQsOsx-fO&!!_JFrf;S5up{zQZ$NFTAbfF|wd8aeJoVwR!8b z??AIEgq;d9zak^@Zn679O#x+CAmlh{c^s1%RVG9y_fJ20s+Vs1d&T?ilq{OL9gn>B z8oDSvqL^g08(5xhN+`1JV={Z>InMys|9NaVGqJh!Cj?Ecgcr$am^dIE{t1}~M!9A9 zUw=zB1>zNb#O%tF^pYR7)R3m#J%A&QSL?Tbyg(%j+=%{weMluAsKmCOYZLpnM5OI5 zt%85*7kF+k3)n`dKq{`k47hXFPQh!JX=&X>Px&b(1Y!5nPAIvh}#?i_MN+q#HCS_@Eg`BsVo?spMsPA|+F zMn@==8Do{Pt_Jh+WWobaA-XoLO1<#puu!mpOv5ONowbz49~?z(oY$!^)i$YQKF71~ zd@{Cu1TuwgXVjDm+>89dK$Q%U(xMQ{#;M)zHFiy?RMgA}FlBJ&K|Xr$?l1X>zhsCv z$!I_TkrufrF~&Ua=HT94{!E%viuV3!ob;rFimIvYhwF01mr*)EjQs?lrPeVaU5-9A zoFkWruL?cfBVg8o$34|{OnsGJ(cwWH!pu@H?tMC0yY-3eA zJ)fz22ZNQ0fn-DT8PZx1vd_J8Aj+~EVBaL8239U>k3+dAs75yxLEhUVb^WSc)nb{` zUM2-LjjpXKW~g0ZTzuin2+<&XJBEhXX?FIIkC3StA-PV8>UDW%J6~{XlTp}qFzi}! 
[GIT binary patch data (base85-encoded JPEG blob) omitted]
z+2krj-~=*qX%tKfgNh#B7$}*A@b3&RCun5I`(BcQ+FMvIA&%In;)0WU%b72+F1wl@ zw?uIPPofYiw+uSCl%p1FWdN=wp3vFpD;4Afyi^c@Uu*dZATJLBX^w91luwK6UY3Kt z!3w1T_B7~EE!!ql9I_$^XC)qj55II=A~fI=F!$J`cA0%$b&phN7V^|*?OuJXx@{0M zxxyV8CSW$qyN+B{5XjN8^y(o?Q)^oJKwgXlRvPU;RMpBHB%(2;tD-y)Yj8fQQh*M~ zp`AYUkHKNctDo*QE_;HvCaSvyoXBgl~F+0y%)u*1f{jBCD%U*<*){HiWnI1cLf8 zp10&^p#u^3NpaZ9y^@hOFJz;1?QqdEMsh1rMWub&GI+IadgTGpxwu^KqXlX_iq~@* zQ3ij;@L8BWvk3CAkJnV;;aXm7RI~h08Xau_L1m%bEd_XxjV*qz6?=1~5o5K+DJv`{ z6H+T5I-&&1>XXt|8qpb@t9lf+} zbBIBz*Lt0X7jkQ*TQ)t|n8Jrg+U2G2Xt3d@)&B}%<{x7&(7BeZH+ z|0kc-nPxlMf|P5lr-;E@4gk5CPDdXiip^4RTW|!do0X1&Q;QYT{He#b8x@I(Qp#vIBYlFKAx}33XCw;yY5@#@B;X1Zgy9 zjc@XpP|k*{-}JD=D^ujgNr9-sQFaQ4HUA(uIX>@?ue2$e%>0(eQqotYZm5_ z9L>~6t8rKPf)r@*13Hg$)%|Qz&B`mk)fTOv3|2g`w_1`8`vSxrqTSWLc=-rCl^3(Z z-aV{%L5kZU>K{AJ-|q9!PL)~5DkF={)9VHUqIIcr6-dx#iVs$`-?hP3ihB-?T80c8 zIcVi~x|(^ko^iB(V$1KRWy)X@xtH9>q-upYLKD0yeV9ky)nLlR|B)pi8vZUU`=Uo4 z`S@t+cnuv-O-Enh;GW)QesLo z9bV5M94id)!uJ;If3(ignp*e#A)$Mxzh~Bf#mnfi_t+}lJRf6;p*O*crP4mf=+vPK z?`IMpuySX|QeFsXTA(^Q_K(e~$%z}k(~Fz5a&I99)YI%*(oDgHki{X&&gZl$y%^@n z*%yVTJq?pvH{Ii$+}c34NEw>N@*`AYnZFt9i!@Xlf~&y&pYvCA3>K8Kh$RwIwi(0k!`8ld zYa{O^i#!CqJ)TLTrdVj4hOLxttoVh~atc!pXcTl+oAFc(s`+_`)|_U&(7)FWTyCWd zsVlu;(Y1LVG%+F8W1}gSr8vTf^=LHz!&?2URo9=D`Kdz(cFM_=1GqHJKXGtpQ``@7 zH6ubBJ@MOMi*~_5m)S&H>onLl_ZyC~`bE2{*?SWUf z9@b8F3NjtHOL*D#$@OvL5asj!M~3wOLI&=y%S^Z{91&p{!-Zq(@+98X%y-Gu(wJ8& zuRejSTRFJfsH8oQI!IMy;SIKaeT$Akb7OSe@(rOa8Op7BmX|N-ZJ$;UHm2NrMucyh z4kMi;brFSB`b!Rv7>8ZGf-APkh3kyt<2874RaAfwtB~b6>;Y4g7BRTYC7ooaHd?-| z8}an!8~q;57A_p%cNODbp5FJ@wM$*w8iY>cV=x@yFm*|E+%btSUDu=gpiEmCE25?C zXFr>4(~H8>eq7$}$YA?oL$5WAG7{XHEjIlkhQQmaYBy3zaRGi1k!=(rC zDjb$`qx)y)>6bz87i8E7tHC5RPL?yQgiMRIT-bDu+o#_E?Mr@ z@`4X*55saT-0fc}hCU*${);x&W9g-u7eNc69Sh+_rCF=Z-dOR6s*OI*L_5x6v_WkH z!Y65oIY>x`x+srGxaMa$WxuZR{Ya5%lsunQG&8f#2Yx`|t{yLAZh z)qGwm{XZ9N;l2p49-}akN2oAyi#$z>pG{i6M{5;m=sj7%Ln_8FVOtzY6{RH3HxSs>s0|u2*h#eJ?Mmn)OGSbMkoT z%dsL6OY^rXga)4LcrzuvH|lJxZ=8lWfwt1krUqVZWBkXLo_odWxYkqQv@A#|Id1Eb zKKDIFXB{bpOentJ=|M-A75L=o4T_^nCF#Ado!n!2o?b)T?f9Kx-Z%OeQq-^26Mx(L zXak9gu;ndTMD|6sG;0&_?d(67tN{5ymE>vWcb_BMq+K%e-Rr3Zai?NhuG$|J(Bw*$ z#upF^ij}-)W9wz=dvEvAwxqVEzt;qU>UU=9TB^NoG94dIGBncM_tuVY09&<2@`ccl z>H`Uclj4Rc0VZ+AM<+Aix;E+6a`xxap>!f@Up+46oA+eu$-{&7sZSQ*0-rYt>pdld zm1@vhnk(#f7ATm=@^7eG`w3TS#S7b)YR|;PYf~by1&M`{C>apP;LGp+)gQ z$MO3zj|^J1kMFG{=}uah$P)0PT+Qyj9~&$JeBJn(w5#t*#k^GOvfI*@R+s z>JP(derl@xjRoufjxM3Vvn$&IRmQIKUMe-at5M9MjGZayIQRIr9g*Kj+gADb-q(^#f*Bs(-8V3>JG!Y_6MAq${{Ft(iWHeYFBu!b8R$|&vnD>W#O~}uuP?f zlR$%{-5D8p%XP`>h=tP(2=5-^=9V{U*fAQgE3dh$1UNzbMOLYX92h^#ky7 zOq_K9s29y^tz@XYMA+K!qCR@bsy+=8ym_v=x+rfjDCnmTW)6@xjD6WOBqXc3?1@6R z8>J0HU4?pNW8tgT(ZNsU1ytp^X2S9HL3d}03B+mwU^!bAYQnCOuQ}&ic*M05@2|EB z>=k6~L*r=80iy2an)XfQwXg3f;fNihN^~0c&g#rR%!fP33G<5;>URg;|E0EYOS4&^ zV%f8}XiJTs-}%AQSD!SoJdiGoogl6(r10s5*CpPXED`VZkG7@;H5*s@*V?@ABgQJ4 zpQ;?z10QMD285HCY`qrte=6r$kQbFcDn&ea!m7SjZjBU|UcmV3Qc7f~dD6*Emoje8 zOf;5g))iEyzO@t9GeEIEyaTkp42KBm3u8G&|92yOY$0%3+dq{Qmcg&;rlWR$3>v~} z&Kshzn)5EA+aI8-jVMDBZao|N3ctxdWLWl%naPLlyE2rU=i_rD!nWG&QbGpQ$sZwZ z5295PrnC)>H9Vzz#ww3G;gf+LDk;Z6hec+qhAfgtmp z#Ij%Ob^|K_j9h>}IeNc3G=O{gLRsl^U6%m}M{RMLAepX>g|M&McukIen`?sRy`6Z{ zeZ&#yiOXHa=1C$?{3jp=SLBhy*eZX#go^RI)I0F4r6gJfN#ub&V@NJ%i`N8%%am}l ztClsJtvDwaqVt$c$wdA;WcmMBOybAy{%zhp(YH50u%2IPBYs@}_ZZLj_9#dunMh32 z57_RLeJRkXtJm??=;VY;adrxY1wklPi5d2FU{^sP+;A`=t~o6R)|zw+l!A$87wkPG zhgDQoIbT;Y8&^=Tj>{!-^zy2F5P{1B8e5lerJk3Hph8p-a@(UgFhReOT?EKT!6Z&b z#JgzSjf6FchFT@|FymuAj+X2=)6EJ7BR7TL0&QjU&arie^%w8PH=;>T0~h-HXHt;U ztTIGmpSkclAAo6j6ZyV0PY$ zfR_K%Fvqel(X4x!2;udkQ;T@57c^+y`ojrpa=9Xx_?+-|rs0!v5~8fyU}xtgpA|~_bc&& 
z93?hKhSl1#qmB0EVpL6{D(bENe4no|juPs*+URn0Y)Wl*p^>Dv%_M{V9>4>L(XXBw z;RkgR7j5X{296{nRvx^;vI1nNv=r2906KJXno3|Zm4;mWgpo?VT0b3nz>*j^T6sTK zhqxi46WagGD7_&nda==XTb(m9NLtNeZVt3FJsxEe?~6|Fv=xM36NtjFA948RIbAFw zp`3V%3+>P!QT58|Me}TD+-zZGkIWOp#fn62Mat_bmDWi-I1|?~jY)+L7cj!!w7vJu zC@c)_KYQ0~?SS-%E2iM}q~2q_3DVb-S?3<#)ym2$i_L9c$R)6526*J>AZ9mz6aW1e zBvXC=K6F|dMrUi?=MP3VWA?LITO-g?`-IQIo$V4V-%AaTQ5c&LxJE&$ePH@wAe!8p z!HC00)H5x0S$bUp2&dFX43=PC3?KlOH^5n~M72yeiqdKG7HEyVLR9N{f7$l`u4JWy zoK*3cCN+z_6AHYSeUMD6VrntH4jkPh?iKYk~?!ZMi0qxH;6~jRR_Z^Tg>iR zH;v$%8!{!8=qzI60}Y5?Gje-j&2WU8HjK)F3%e_IXMb=gofZd(mAcNgv$WO0ddg9< z4gHo)ep2fAk5W6~V?xr-5hoRA^`af#uLC#KQE*1v(cAv|A@HP{bOuNv={<9u@h}bw zW2CU|MOTJ2?&3hgS~aQY(LYI1{Rv*BTO&a(e7@^_ZEDqYb{o2FD;9-3xO920yUKbF z%UYog;B%lGiGF+d<&e_t8^9>&`i&{7Pr0d6V-s9vHh2>+m0EJ(-Olfyc$pfikdhUi zLz;`|<$Q03@=o76w!B4mT=@ply*e&lVsf}o{sGHjZ_fS62@}nHn*FulHu?x{{@2ay z!1sE)k4PVlDIEjfC;eqh0B_1Z2a)wlh9S)ok3zYtGoV>Ny@$M!aLOf^xk8DDI0M~#p<;V|h8FF*Cz4AwQgJDRz<XR36xdcdC47WC?2 z(tBy}KE7xrSAC*D?}oA2?R|p$MoSUOC$2-fL>7hCdRn|o()d;ff_6UJdnlo1hi?)(7JPho3zwx|$YeE@KaW zFkItC#jJUeDU=jiiDpJ)PEOhVuJ$l*p;P2bb?;=j3u=^`w|` z7d}nBT^2|qUb2?gXK6y52SY0CYM=7B-9cv_^V025`~|89ro+Orx13c zRlk5%-;=IMo(`(*$mF|ZbkKG|OseEd$h_|J$WB$$)EPv3$y0|+yAAkp($NztK|(82 z8Rp3WX~K#7;o9;@b*ZiOCye?NXO(zi-T9tv-U_d5#jYyr8`B+W$r-NvV*C-`uAEAf zz}HfrrqdECBCm+(Y6K5|=j}8Mb~MZ$oE?3N2s2bHk+TU;=!C0%X?#i+)F^dVBR$d4 z?08PS_M@O{UAxE3c(h_%X-Ul(Gxx_(-?0jni|wxZ+l5nE0-Zo^&LVl$qxDrKLfvKE z+5w02#EUefuk??+1$wd*_h1N@$|@TqHx9nF_QanTntq`X{v|Wg|NPpbyre560$p4< z?5S5&1UAgP(Wj;ASc*C&m&=7xqe8?{&xm4FAAOv;01O$?(2|muil6q-1BWTMv{x8K zts0DMtEy^fZnz?U(N>CwhPR+0jTKi(5`ywEZktY|OBC$CNKEdVMe^|CUioorN!6^Z z?6%Y+;&bAD4Ft8m?(0t8(5xp8Z-Wuj*st(2HOl`@yabO?kQ{_;skWpEJx1kZrHirv zJWE49bPTU(8!JYs^QdqSxJP#bAS#|#d=s}|aq*n;?31~)?riC$LNBXcB9wr5))VIq zQlS2t#t?||pbO!nB=5TVH+`@5s$j{rASp5nHZz3GoqUvz%#YCVpF_tVZSrro3EsEi zA8^i({_}75pPzJd6aA+C&pC9QlUJExAqy-PLQ2K`gTPb=2GVe#YhY zT5>L;?jBf=llgyckp2f!YuTPoHtRy2<~fh0AMJ++?|PD~B7yX-X24PhohdU(3Kz{d zb9WP?R3$cs$>7t>SE3)MLq2L%5!iCZKWcj{X{^y|>|iyk(Fa@KrtFq*n~}q@O2Zjx zbsn#YJlg6Clf9@D!=u8h=D4Ev_RJn{hhC*&=f0uN({Zn6So^&6=rj$li1ZH8ov+|6 z7n8E#aI+&Etc>8_uW$d6_n+?Zh$iy68;owb$aaYkv28$TL@UzDOrL2+3SLBc?dpdH z^ZAy&h*lT1l!wt1I*3WBdFkXNuc}+6er)W9Bk8%XyAZ(}$zz51#et)S8`>S41M&nW zoB0)q#zGHjd>4~UV#2`_cgaUC0xci_-K0+KQri-ik^huhEO9r@Qy@&tOJ4r20@dpx zaZ(nEm34jlG(%p_Zq6rcn!}o<`smt{iRo%ieBX!0*X)ze5Cc>XseQ!;1_e%Ot9{F4 z7si-syL&#R)WOG`K0Rke3c}W4i#DRg2UAhHB~mhJqj4l@#7KI()Hb}{@-0*HNBj)v zt*=1gy@P~dN(D0UmMfW#1sKq_mZ`||JPdwlidVTYuZSXpkYzn?pxF*RAi3%dp`$?e;&%zeMvm5pC#mU%D%EP5ASj(d*B3?6kTF ze3hcJ8i?nj0|XYMsr_DrB`I#qxB5=I9Ky+y;HND2TTKGT?cK-k^m3oIf8=p)krf z!P`O(n|1q>ul#pRvHA{;02y-Kl8%eHvdHN>lbz0WUy=KKwQOcx?NbcA0oq81j-x(U zM+zjwyfoS$x@=!(HKuJ2#okLS)JY)E`+`+U{p&4|U4}YB$c%RB zCbgmYl(1&yAPm+qFIw4Hk&F+)kMg*-WG-bo9~p=zn7_Q~UuP5R=0swtYnn@Sj+Oqt{I|J+f4j|1v!Ik;WaCm4RXFjXKTJ{O+iOpy4-RbeVR_^xJ$*69YxArl z9DWyFh`{UPDw_4MQrk$^ge|3X8u%cE>Y34@d zws~~BjX*1wj~I z^awMsy3>;%R`*Hv{nw;k{U_@VwCUG@`0iy08V|6PdyV~X?#TWei~h6DDaz1H9fATB zQZ*_3AZc3}LZjQH6NB&gm+}=*J|8tkcQ|U|7^>pM@IJ%V9#O#FY|@2PD>M*F+CJv_ zPwj7m$<4`ZlPrRJ@>wD~z-a@8C07~SZ-(Lm2Ti0JyQ+91Wihw+UT_fMl_jMv+h*tyn zJWMJfxm7e3+aMQ!mwWn^76}RnPb05AEXciI2Z@(|*#=d+g-zLBVQ17u06yi2e=K5a zF+fx7eFMazZtmpN+0bwTKRg|<)|0Nm`L02M`$zv24SBITHk5pv#5{1hHOc8$+W0W3 zMMCZADs*m(C$=WoB!9dIslC*4SGdbplBCXqKJ(E#vw2eVFXI*TaCn0^R@ZE57&Zf$CATAOEDk@ zDzTj04IqaRPer_jy6bn!5+8v1j;-t58Zhoxr#G*@*9WeaeWaiU$AQkmJ^jCe1QSMX zoR!(aI6bKin;Izl%qNNx5|GLnV!@ZCV7-pL88`Bld*ZzLfu-bWBmA^mMiWSO=h_2b zR0iX9E0(|wvzTi)6UKAL+HI=+a#@G&`plV2f$|Yk*-YcE5PQ6lbhGUnlJ{$*5n)(m zDC`$DPQUY)UiAn5O9=iFmugx)-Df;?LOtW*-CMi}&hKUjC 
z|Gb8%5?w6J=t$kg)pOeeG9ZyCJEU!Gz5rgQQ;5jd;JZQU!iBdn=GaFjQEDV|3T%O5 z0N*3p|7SjvrhRe}2tYrwqbENAMiFpv*~GT#9F*Z>pyD21Ku$4x*w`$&-q=&tcvKf* zjXdVIq~Ig78OpygOKAeE8b8rmo9xS(Y>E{~zH~zmex%fVi$12H%I*zY#1eQ-uYZ#- zJOOf}sEHw3jBdPfbb^AwwkMDr6yveNI7mB^=zX)@hfPypq>gza9*jqXSY9u8SS)0W zcCg|ki`jWZA%B8ly2RQVFmc1kuh*8Sn4{%-nEWMpcc!YLQty_zTEafVdhcAu#3iXT z^X4ItNS+9&Xs-qz>U!;@)cO=I-3ZRkb5L^diZA|D&^j+^9G1plQj1kN3y@dFWW2o4 z0Xr%#<(uEDAlWv$@Ts&X)Mv;5DcfvUiNyzukfA@7VdG2GdFfc-%cFewm%nb3UcyED zomzd((I`UW|7EKJ8ZA2r_ei_PBg4-*Ho+2G+&>>_bL zw(#$&-wfL&v*Xk#NUl${k};LM&P4NO>l&GP219q@MN^aGNBCRZg>9Sd;^3XuYh#mI z^4PNWXRQ5Nep-QMXM0;Zz6(#0XU&Irc_r*_mfXiu4-p8?c_Hn|?iv^DfLHwApRsvJ zB31cof=}h}low4^nM3oW!eG7rI*X?nUG`oXDdoI&o!aF!#WgNu{#zwm2PQRfaj(mu zy3(=SeY2Uu$(<^5nkLSg5|d&MWi!uNYdvOIe7WRzO=dFfXo^@(+0-PWx~_}88rm{z zT+krzg|SiG=_xUz*jsk+>Y)6naR}V<@*;t{)z)xl5Za|UUq~=APQf8WX~{wTm%Ony zJP_MTKUuQ(f;u5Abqs71(roR4ioY8%H5u*`YaCCqVTqTWv-sf95pE$;>e7Ci=YEEK zCSS!{B=hzwi=q3=X=lUdrzx2Olg0K%biPm35;Dl}XOCw{SSzx@4Y_zgva0P+$Vg#U zP0;UTOi3~NZ|04fncv5m2`p-mNtM4)R@?;5S0dMrG2CE!cS;7`KG9oyV)g1u9@6^A z+3o~A(Aa0C1s{)3@~IcuAt@3BK+JWk4=$&n+M)`M3)e1lM6d4P@6L?QfR98&Bo&wn zDFusZzy}nq(?`z4rzl4z5naIYMBi#?ZuBM=+-?4hE`a0(OV&-w%mJtqLQe^fp%_LxULev z{5izeJD0s|gc9Rcy-^ees{`r?x>lAtmS;_;Yy_IJU(M0JI?>#GjYlK9{Dqp~gdEL= zmUhs(*uatN2;z0^IFp?&Te(wk9C9&{4y=l$+~xf$XwGkDl7W?rzA1^vn^q0(~rUBFdLLJCOC|%+tF+XD@%1MlDY%6wA zIPyZ{Q-QBMCz)hExbp|o^i|!QNID1@G4Hf8jX zXyoS{Sni1PgorpWy~u#K2;aMGos&Ne@xeifhL?6Td6LyNPJb(UC64AY2(N7pPo*Cw98Mo%e8N#KTwbFVbqV$uh9lD23&g_Myyo!lK>xW z>P|I3w_dyo$;YtZhOvS~Do(^kuJUqfZT5?1`lY%KD z0B7F-9n!erl%_l4ucbh*Q4gYPzRP5z8=cEo8$*|zqAs9WCq9!EDmdTREEMA`2l(hm zqNJifxZ}hZv_9S|jLsX|)gK?jsriUT7SwnpOONaPg38*n&w*}}kWtY(F1Bi=@ zH?`Zcic#vD=f=UwD5S1)sbFINy1C=PV7mDLltRyGYuO+;9@OcimmO zTg5n{BOE+#?0|@SOlDGB6+?4!Q@ItO6=;$K5`ob~%T2!s;6Bna`44ikp6|-RBx`~f zgepC$w->j{c^6ZGe}A>gQ%&30Czxf|By_%7a)}hy-)Soly;v_9Wow5;lo~4o&|*$O zjivodUUwudi+i~fKaZ1>`G4^@DflRDbtO7&zU(4~Cm0#WHU!=9*cRkonr}xACJ~8R z+bCS(iRtMJ>K@SaU2S^eh(ilJfD=#Dk7Vo8FoG=D492*&%~t7esw^rDTL^a4B6;rG zvI6Y#&GD$T%dy&9o@X*xq7iN zz>L#Ow!Scq)V~OO{;%8V3-Zvii5#PPIZmpN9GV*LxB;}buQ`ze&)SQOi&1?kJ!I}f z7m1q0q4yp7vsVutNa%t1`-a-nAJ`#9y;ck%3G>xXM^9WSdt+o!#Hlu4DF;du-z`XD z_Q<(YfrC(q#0oP>s1H6uvGyWEAQCxy+lVAJBj7WS=h<0U({3#`eFT*peWI!65FitkrjkhJCE>sYNP2X#OCMBU5% zEpAZwQMT}j6b%Oaomy!4SCH>_W{P~r)qo6<&6asNk0sxAHMu0SuQq#~^nZ9ycdnyb z8MCi@8+vg2>PDPlSAoD7@?5o$wpOSvR6RUpvhY;MI6$0HH(>LSc(|rDe)e;Cxt~u7 zah|y0u-Kfz`KR`et6#^IeNbV|g|CoN@eX|kAHf?ZyQ|Djjp#wwF>xMYUp+Qk4J>Kq zO3F`Fvibdvv4~LtCLK%EToL{W4$0Q{w zO`TK?H0haVLv;2{s%crL`Q1Q|Q=c?C8+pW7yR8)$J1!lv6U)1slxO-XM>&*VPOk8+ zA~;4X?RAB)H`8+F+pg28>X>!KkY7Z5j@E!+oD}l>(2efmI!du8sC!U99yYx63AWxF2HLF z{3dO8>?uLh+3_nH#prwwMsB%0LAwex(}=K-wSf&=JPta$rlG?tu4o;Qb;F){I!mW0 zCouH)nr{RqeO?z1%PtTve5`VPBpcrI4>G*55U`itw=KW7BMx z98J)4*DG*L4MQ2fYzz@Zlivtoj_n86+_&q%U1ZsS`cC!$h&~j|_;H zrFsb7`N}x1UJT`UfM*Vz-lUMhEd*|xUET+ei69W|#pT7i(lF7!QFScxp`c|bAvbDj zl!x+Vtt~A&-RK&8jSch_f35x)Vr^MXgVWuvWa1g@7ZtHw>IJA1QV zYJ@3`YEvY|>O$spAXf@Et5rBABq4_QokJWAB|7%gf{jl9$yEKLEhrb+1f;D#rdAU1 zMhoscKY6E+qz8BqYvdjMSGS^E}B=~%i zi4B%ahG5h2xO3kJn=WsiHC%!e5C=>lJB#Eq;YY;9d(o6!$Tq3T#+ zglYLWg3vq{dH%~d#wFSE1OT4sT&kL%4)^}`(EEkwE3i7#6mxx7w=TcWj)z>@y%B=P z+9GXmuLYJde#44K3{?V#(++ad!n_>20H}7wmxA0Fx2sQYsEQ23{I1tuoNwYJV7H-# z!t=D9O~z`693as*m@CVd9oxZcgVc7iR_i&4-)|KEUbDrI=lmr2{hM`DY_0^I&QZ&k1Zg6r<6rbq15DrI`uW%rcXCk*)0iBdoG zEpL%{Yvf5PhVi~7M+2&wNV1F;1y8lxG29b8&LL6EY3F$iUGY8UF@4@8f?XP_;`(J# z`M-x2$l}m|pZgVdy5n_yp}At@aU~7`*t z&@%o)fhXt!ns)zX*bQEfI^>1Nj_rj@GECgru#a%ho@Bn-qf0^lhTbKLy>2o92uAd> zB<=l%PUyPP+;(sRX{b`xrFx508%Yr;#wYgi=ewQ(;>M{;44!D)Ai0NkFaU!;S|7@K zDluyC-m|QKE0}?_$3>p?9`lM 
zICenb3PH|wwIjM3#+~vAXnTneeZFr`t<0sK45YPEVzrG*pS2k`j;%JD#!gMRTH3#s z;<+d52T5FAK9)4pO-!F*1!M%(^77t$>7$QIvSH})F8B0m@@!z>LVXs&=^HqzNOe!j zvzc128|k8Ln)p(e32bsyPjV8|W(fuI`qfknBc{$JXcX@_7L9A>;-fzVy_Yv0u8l|A)R_ zU+wdX;|lGzQS?VOSBcKiqNF>hG+t>3g9cFb+BS5f^Ku+M1karpB74=GuDJ0ekB$Oo=@m1Bcxq1g33`mq=*~d!0w2!@x*%*iH#X`-3L$nnWch{ad3ZUmx5-Y!zo>Tm z)2h?jzYay@PrYbDa}E}`9~$D`k##yMaqQN}z5%l7&IUdbCi+ggKcm!8$;g$v<23VzE~8bdlmSPM zO8b9nJqG9ED~z@2d_=IDAW!Gm7F|6i7;!e)X~yv~SPRLO&vK_Itqd+kN>UDD&5USv zC{idgnJEwnw>YCa@GA{(NX>s(rc~_gw9?QRZ%7kAp!6Av{ERVii#sR3Nsr*A>r*lT7Aju;q zqqAh5!MkGmlVpBv@&N?K*jeBE-s?RO4|-Ao2O zc*TV-y?KMV8E;T?3eCKqoPe*5PEIeSi`xvKhQv;j?mx~~S997pAIGHX#uh-MKbV<> z%5-Yu7Y`|S+6R#4&?{!DmIZE?(U@?n|hQsX%S@>k92;K2kt2=i~#w{^AV}UH;Z(A~4v=f95;z$^2`%D_fc>PVw3D>rgnhB(s#9x9gOHEEy_Z^8_9ubT!LqWANxS*Dhr~ zGWQMkI5#oW3vM<>t&O^?(c&yEARXo-G=J^O^o#A5O<#DnSF}y5O_J^@}&D-)&ZgWEO&ZSd~0`Y5za=-a9U;E!!F`f*^E9Fcqy0I{m69IP zi3%!Zq$#|eWdu05AiWU_Ki4TYeYw#Ki=HlLh@3G!M+|^{$cCB35V_!{5W~cUofk4N zbB|ru&wHr+E0kF9iJCSvRTzX0l4!GN&C-9?BSgA8RFU=?vzo?P_d&{Xf}KMMT9sV# zj|e^f?DO}9Z2yYm_HF7TuMkBAamVfSJ%IMfkywh6yZaXGL=V1@>G)M z_SzG|HxCM{_4MfLd_u53LPtE4CUoQOCvNQ+x$NHJTIi{JNFdhZl^Ok6Tr}?!Be~E! z^!|Kzjlb^YfYRvDSf(YKD)i`9BY23@Varm8*1OtB;O)*#otaxKX{isS)ZO~34*p|4 zjQ3y4j)(F}Ez4)%pCyj5@7dp*I}qycRhe$}7(cic{Pt!jO-=ykn557z_WJ&+mzl?i zkpj>SUso+m35C7Xy1KvvN^WeW)Pqe|6wGo$_`uh4D(d>^qJ363TO0j~J?K775;^ll zKp8ia$nM$z=9FPgq?3>I5f1|wmQM|EVlAT?@Kk9;;PnNww0z#-edA62<+;&``Ve=R z-u+y{e3-8wg9IDK2S`u}a*-?_fUmonfUu20BQgy5EW>QLjUhbb0eq>7&iqO1`IGKw zNN@wfbf*G#WAm%TOFRkPRR`pz4H>TB>qA3J5SOYhjSMfZDnzDdIM+!4o+BQNq1OeBQB>8hPvQ zq}gtyR;F7x41cf~0Z#8E5m(dla6QhcwP(+Pyq206uZ<&{IDlVNG|*^y_}?Pp*r+yF z)0HCrNKDjOc13HOTREo{Oj;-Qtb-#6Eq|)|qjm&;aXT1Px!kdadZjxa`$8AoGi1_3#T-}+EG3l+N3sR zGK-)9d}6{Q?QS`RSL)Nm7c(L$@OU$Q4qhbU1W^C9)aXB+L;{QPQ)o+AH&nU)Mv*Bl zdcqr>+Z&00C|T#s<~cYt>{TJOskQ}a4Bu1U3l;O`8&qqupjk~7TD$|cQZ|^86YKF` z8)wdWFLwJ;y&9vHqw`#=&f{%_c;`HRn7s!>z8^%qkXMk6+vX~6IR~)a@}WuOcTH@8 z+K@0d3ASqK2Zgtr?o2#iB(Kty${>EM^}^kuL{f`3(X9dJrK!v=D74M6u(HGGp8A{B zyWvlr*lv_BHOJeN>|U`z*$Gf_c%7eE);6xqyq=rh;(jWx>oPCjuwt8C|DtLp9(tXn zdZ08kL9QFoiuZZf#9hKlBK^Lp^Q4B8v}K`SOd+3*w6Bvk=GJNXb6DZMQ|`X$M*BDG z8o|TD6o*}#>Y-a6v^)s{B5$_zPhbO&nTe13p1!?=oGc6wErEMKG%i_BN`MbncoPK% z%@#YY@Ew!-B8X*mFOZ=UyNs9cW8WeOkO zo0<=c^Ue&#h+wLjBripcV4I540gS$KA2)UyuEBor$2R`w5wq))k;!O%cYRBfKVq_I za8CrDQJ zXRFN*8H!^iiM#Hb5uQsYm!3lo@W|nA)E?3;ya`nvW7ksq9E|&A%%IFUhXo%{(1TA8 zxS}%N6erGEYxS4>7o0o)!w6wfCu*+0N9!l`yc^qPXK=y6ePCE#ZditU<^2(C>+se> zGBO;pI3CEGmLi+sZgsb<_T6KuiR`!mX6p{YGbBr8ZIXH!acs(^N@rbBlfK@I^tOZ> zZV(oR+DOTEeB2!<>v#EG-Ji~g@>fBk!M2HghiAMp2>*6)v3%fk&|e5}`<|b5M|}w< z0f>cyMmQO)=LxutG;M|L1a9!wn7qPAR|Ma(QP1D==B93O0|u}9L-eB5YslMK>5dIT zqY=AYHO`C&MItzKPzH8KN_*$0%5WJdHceuRhnQ_m!c_A6jId(%B<|bs?3z|gO;bYX zw9Q+Ihy|Ik!Zb;;LnHM!yB;q0pAKVZ`s@WC_z_yYXj1-yDq+PS;9-S5==-bp@mF1w zzr#I}QSHLEOo9Un#cDVk%yEwp8d_D6q~3D4Vj&*gmf3Mv*J-(c<-$pKd$$tgqG|0k zV;1!vav@|#XUs}xujdXZpTE|h8!8M2zr`zNU@CL>(VDLT^si_5&d8E>yofi{j6)F( zra`4md_PsTe!QFpTAx$dQZE8C?LF}#EP4zf5OR%JeFuL2NACX;^?AaYJcCr$I>Uf0 zOq*ML?0sZNq2t~P`6d^~@3J)97TM;;s!2_r;J-zy76Ir>iaI0ExnsW;==|^RRiU#Z z=|U6MInX60VnVsyd#T*+h6{r}k%>053%@Y5SYJ=}4kH2`}P4| zj-trx7`8CM%NcKbo|*^DY0w0VHDe+&MCjbniqwb362s zML3|8TD=!n3Dp9XbKvXXAG(zPcT@d${-3Zm^*`jQrxUTUrUOJNS8qj~%oo)^i;o|L zj$B5DpmH(jfuBJK&*$Z~$ShD^o>48l?(RI4+Tj3?EFzNe!@KqUi}kTavR<@$bp}EY zPa@XO!KYD`d4@JsY(pXP=Y`s~JbEk!pS-h(8aW*3Gwyoz(?v|hxEe~VNC4l+=66s{ zT>m5q-i1Ib7VlSew-=MI2HYF_gSC#rHybwc!NrPHUswE;cs%ukG*hnMW zy?6VS^l>egHRb#FL;8C`59#4+#k!lZ0dTkwW~MZ72u&X+pQegALGVnFDJ$iG#mg1G zvOarIY~XZDr1{8A9WneBc|l;dy$!eXNy=E=IKs1( zR1h&RH(@yr1-=b!ZxmBe#CstmQhlp)EVy6C3+4VApHQ-}%$iHG81yz*N=o}uUz}V@ 
zTPY_A4sy-=SH%1Ok!xb341D?0y||c20dT|tQ*wtx-^k+DdJSFQFgg%5o|zFo&$1V{ zyC)6bEU zdB&hd;}I+>aT7W)d6{8hSFBRp`I3=Zmb8l!W4xK%xL**M%8?f0rF*=$!6`x`?U&zY zmI*jVI2wiLtCCta3zeHkx^%&Zyr!auKqJv$)`GFEzLbQ6vGw{}>VX3k=g6>QKBqPr zAz-SfEkaxi@v8IKo4W$d8$Mx?999z-ewJ)0k)gnBhOWFBr+xUN_N;+h8K7$KMFoVj zVVD-Du?{W+cr3K^wYcf`Y#1?9oKTlCZtZP=F>KV&tZ-mvYtWDj6coo4URRF|T`(zO zmEs)_MnoOTw4HRbzM_gcKO2r01}FHvr`}t;W~0L#u@wQoRCzGNA)Tj|jPnlvAYOkLqs86AOy6QQeC+f@~gwN>^QGAL09O~xD(4znmXnJ)`7`v-}eB$hd zHw1UKgI~l~&U+8CQkys{r|!D1R9};HpIq7}<9W8eyglW!*Q;pwBaHaGIvxaYDb?2{-3V_#si(B@m8A@ZW1_n1%q^iOrQgbYR!&p+XXoCT_`i6^3j`Rr>Oz-OG5IfHyBa+Pt{y@qQG zm4JK2`lgMAH{Vrg_wdC&$1bbFN`qEe`!sxds9=5|G6Szl6a;4%F7iQHhTZ_+YKf7R zUKZ-i3myV>ClBEWJcj9-y$(^{RS_G`)#oVeOK4q(F9AGT`v#h0w(;Yp04|$L^*yw3 zqKu7g6!DL62>(kP82P}aY3cn2vy*bW?@B^N1D&>8DYlTfkKHAi16A-rl8)6@1(>@f zv%QOMQabN|!ad-kLhYF)Phj$uE&zt@z8&Z4$!&JtbEIezo~eJH8IYZ~#tuWUv{*?_ zEj?Nb3Lz)G+~RN-#J0^3^(ya9r@|!CJ>9^GoDARoL+ej8c^*B>e|eULsS}3=NdVBB z2w2kn0WBSwcQog+M#qq}qzOqT)nyVRK3n0Ic!d@7exa~g2P{@HiPa+4R-!Q~nr}$2XtAu0scOJQ z3qTL@yz2vY>Fdc=)l=_{<*RRZs#Q+T>LdHFLG>tuvNl@^mNcja8_p0*^Qyu71`9J{ zw$hzeveIBmC!BQs(ClA4uQC4Y?>F&<|4rkA`)%e=EabnK?msG__kVG}rQq=g7sdHb zl+K={%&;_bC~ld4S$tf1kKQzrWb2BG?dH1ue#$|%!raSYGml%~u{W~{17)O>w^Hn? zavG?hj%IjpAY=~?(%L2tH(liK#yBfQ@6fToK|4sQg?dl=UuU z1upnhxjbLR#Op}DQ?%#>y{7@{36%VsW|80+BmuErFn^fkwU6v@a)e0o|DpqWRTUqKC z8Rmo-SbD<~{zve4`41Y?Hs!tVCG!GBZz?IuYd?+HEPGIZVj&cn(Qzyo$TG*Db0_uS zt+tfTNEdWBEd7V@|Kex&Lq3;7&Ot59jML_EeCq|Dlj5C67)A<1A)`fyhK{RWgjL>! zr;=iv8YZ)K_>}blC@!LLZMjuEo+0DmN95%hLORjqo*95vIWX8t)OdAT zxXVS0^c4Q4_BhfNL=oy}e`PWi4;?eITU~n>lOvwe2536idA6spFUNitj zS&`o3G~0BA3%uY?4>aJ9+AVSIan&ldVCw}94YIjRHEwRydb8>8lRPD4J-^>6x$O+y zJ1kO3Va@cE@f%Y|cUbphFlr(NmMkLinX?_M1}n|}9waz||IlQJx zT>i>A!4wZoo^|&^w}@0hE-mq0bLaK3Xl1GZ(A28&Hkl_9b==^IXX_2VV=4bj z8BsK7B%d-RzT|NE$}qiu)@*`~lqzWRnNT+Bu5s`?$2iqC<^cmMF0d^lY8dg`1)TaA zu0GrGV`R9!yIicXSqG9Kc95oaP(A-le4b^H^hR1e$!zK&)x=t0T2M}$VDboXal4JO zxg}VhV%@S@VXI!ZDyW$@Yx{*Karg}$8Q7wm;sZzXaQfvyEkEu;c@v8b1sqrb;#qHB z-zHg~y0RDmO$m)wHAekCYX@QC=40MA_?$c8n|`EZQ=~l((i7dXVxIl!# z-66PIwVu!Q;Oh3W=V&@mb^e4PG1#IX(0ISFxX)EGUc~WQ`>6K5$x{O{&FbJTn(KOh z7-sRGzB{%HojDr4i~JDwpegA@N1X@Knp3>^VDsSsE?*wPeTADd4G|rC$YBv>Nx&rq zeh_O|Uy4Q#x8jU*u~SYOv&~%!ixBc4TAR>r8JLW{m&1{sZLA{5MB1M^`$JT(_geT} z>jcZ$RWd;C(4Sl!F#hdRDoaOvg=<;+P}-6HhI2fNOP=|k*7S28#;bX? 
zHQXWnRFI0*00Ps$X1M>r&v1l;3L5rGSrvJ&AjY#3dGheb)80Kw9AQf0m%j=&9XF_E zH5-M`wM{TBjGINlQhNI?p}FL&47k}l?`J9nHLu6(KFIeT&h&ULY?beEw3HXjA_faB z2T{sE`Or3{z2etD=fAy5|9LI{YsLoa+vuM_#h>r%?|olk{Br42oWN(Yas+gY~Ad~pGq|lRtzD`Gg~U1pGyvXcY&Bur~zF?xoKvy zjXvw$j0`E2#w_l`teG@6jlF5QnrY~k@9Gu=XDKH8(M$0^+n(07=NhbxW+|_v0e;7; zPXKS<+TL80=?A{dkVmVF9xLg|Wh%E=o6{wTl^#&0YOvUOB8iWg1DTqeQnm#o2^>dnJcO+?@q+F_H zsC}nOj_qmtj)unc-ia=|BaMQV6)CvB(p?AW{>pc_Kr~QzQUkRfpk!iXTS|rtbSik0 zT4{R<37xHIp8*x^#_Gpt&=>9A&e^GA%Tpysm)yinp;=IHaWrCeJyJw3uHIGWK7`tx zStVjwHZ@c7Iua$>XqF{)Mu;Sqv(vClQIePE>l3IpVy5f!B>qM1D9=R6^J;#yZP4ly zCZ4Z}BL;iVDl2+SAT6e~R$_LK=xQy~r%QoN4#slt5m1ep@&nQ<1l&5%xhDNyRs5~= zYD1iRXuX?z4>pe*%Q9kpZgV1%Q~@X}^D*e1`coNcZ;nRZ9tRBz_xn?YIEn0gjdH+T zvs}?Gx*092f*O*2rsj$_xc9OOm?SYzY3MYt0lBDDDJeXDxwJ|DnGJ?&8k zj@38_C~(B5>*#z=$MC@K*l=YP1&YnAa(xm(0-B-`b5cSrJ8(ZAVneSiFH%rPu>Q9=3O=e|C7OdIxfHA3Eui~AiXwtt;% zp!#yEFH*PJk9g5j<=xcN<vR1L><(V0!kgoQRV_M5X5?0kR zncH4oQ&6hojyCyR`g*uzyUfHOMba28ap2c6@LLHG(rk|U_*?EQaEz3lXDCFJMg}G#qQpYq&BmV3G7fdaeUZ3khJ4WsyMTMk3s)=KZ+Y~K| zp7>Pfz_Z23Ge=B9X+QgX#r4I!uu9+=VeG2)Vs4ntdZ3Y5#&Tndvn)ST`iK!O179tb zQWMxwEhVmvJHrdTQUv^{8~dB>h^px{pgC*Ej1>xROtb%Uqh6?}=P{^lM0AJaA~#fa4v&43EOH$DKq z7au>@a=ZI7*t071QDD^hPMgFl8U^QtnK1oFOhTl@sMgzxNkH)U_PHFsX_INDoRiWdCeV@cS=j&bHRmc|-{`@18>nW=QqM~(P&^CTcT0eN@q*&~ z-jxsG9Ou~nh4ATg<*<-c3%Cu6L#3n$;i@omQ{A+L8@|0)&b!L2)H93D8It->z~zVi zmXnd@AjL@L$Q7hBZM;@~!)b&`h6gsPqEO0*^N?Y>zECUYF4mxEU3is>!YdoUn1Qm{ zJiwhM5ZcIZ>rwzKH8dhaxQ2-zc z5@x17yrn0t8MLn8#J1>CYbKckXZ=NVSJI5B!lQtb;9*~A3M@lx;_Op+IioClUKHRP zfWq@x3+z^+%P3ELU%B7fs**GKx&kbOjIetYJA-j1dlHOi8SO;ig>tE!Ywd1l{WQ6~ zETDmqcs5h9{S81=7)#%<#3d=lg3(6v;Yrb9(4migznIDg=;*Yr?}Hm4j99fDU02;X zSrQ31$PpP?))lIM)=&LO=VQnmU>@@zeU=aS8a1oejiGs51F<@{A;GQ*=UB!7usH($T^*^M6eIsGSk|>ORS9Wr8r2?j7=HKD5cu{j9;mbNG~oMztRgT zkXWd@mox>vd>yEV^Uj02VA}5<*HTw zW}NuO+=5=g7^EntQLnK`P5SRg@5^p%b zj{yyJZ<0iXXDEYsgWVpDwj2}0fn;QBT{R4AN~8|8Xbjft%cW6%`*!+U&Eke_Jgz>K zY!3~l*h##@?A<6l;;I|5&x-QzYk0`GIh~}Aj#N!dFUPekO;o9Z!M5Qd^axUXXl6f>h`btP%OLe zRg-SW=6j+pl)SzHo;yAgC~KSA-l+LlXr`d>?nLSF1Ey?H#yc1gM<2hnSblR+#3!Zr z>QIT;-Wi+1sNC6fAFObe^$l8Q_FCN(ZmsuHvL%69eFajkmeNGch&g$(EsPq1rgR4N zbN#0Gw(4gf1i#x??lHBwB>u7spVnufMJQ)1(#uskmq@-E^po-D&bT1W@}m0tJVQuox8rTA5NpY-y>5$%^Mu~xenz%$5giZ{w`Vu4azL+?E);Na4Vopi1DCry_}rP87>CpzOxtq zk`KQt)?QxW!fPn7@|Hxbr<;qIyB%p7Sx+Zk`>{@hU}s?H#PKRUNh=I5?8K*LoN zf=6G20`pxE#a?#y8Sd*vD6flWpRGje)Vn2j;FHG^;cs*v1rT2G`}70}ZujcP0*5Gs zO#hok-A%?OICBHsX*JA1HefxDpbfF9%RIiFcp%=+y{YxG6n{5==(vyTB%Rfel( zAF8WKG6Wps1d><8R!_Hxf0C2KY7hhMitL$qI*evuAo*uLb5lWF3Z%>?AUstDS7 z88m<{xbzHd!Ep4pzdmagkMcc8FW@MzdPsy1C~0DfX!1{UD(+Wzav!FoCAHl?jo8XiZS%uwr7J|ymDAqI<^(k9-7@~vDVRPN;^6HSf8u{)#?gT$Q%MjDWM%MOUL z06kOb{(qMKA6XUtJob+zPWc1IU*g;NpZLSS6DIJh?xm!jhirnr&|{8Q|9z#$SVtl_ zUvM$JDyrD~&a+ef;A*}!iWJzas|aVI1`M?WsTK!{tC-x zY<^IviMT51<~EE~l43C4C@Jlp%sk*b~Jx&5{Ue;~adI4z$MjA;3{T{R- z-pwlCQ6b)3=$6B|PM*wtSSC+R8infqoepZ4o^qvpKK>efUtXHrw%5i(Z9xQ=8dOm| zVG_I{tD7|OD0Cr7xu6z^IdfOH-iJa4lhH(2GeJ$iH&9NF+9r}d>U?NdxUAxlny_hQ zsUa8C5?DzR)C?|cGH2&)M=WCe5mj%c z>%tRfijWV80upM)s5lK+Mb3El`Mj{UfTMcaeGV=ymW6UaLP#OfBZ-E4+neP2Q9n%Bx!ZMo~XuM3@y1vK-CPTkKU>$UF_31+GiqQGg7$u zfS(;_{Ob0>Z9A9Hk;h__#1nsW@A^<)Oz}G zFOLoe-zCzvCmT&M+2j9()sQvQTNw&R$McYo-@K;Zy59fx-PLv=-33BLbzE5%P7!tQ{^~ow){^@S&9SnQN|p?ycJMQ_FjM-4Cvr!Dq$^q z=RQ={aYanQXkSpqhvL#mrN!Ihj+nGxNZz7Y`M5&56w>i1^BJeZIg7Dt$H)nI+3F>y zqrcnt0{^4Um#6?G0FUSFO|D%~;PiQMZcRcdK>+N5{?yXa zuxSJj##gOVNZ}fX3?%O6g|f7Tx`)_8HD{vIqkU-i3(tG(oL)?gxP$z*@>I<)#FUim zZ>Q4fyCY;qnnDCSti#JVy2r23x5{>0GV&~O89!e6YFgcLFS|=@wJdA3HCJhL-|gb` z&aMnd`$6%lrS=&TTeo`RPBmlSQq6Wwy||^i{UZ8dO5OQ8(QX{&QkUtY7lxo-QRiYs 
zchjQ2NHkV)7ENwYTl-Yh{~0pV-Ch~i|G9pfl8<<(6y*Lg?bLAR=<@ZA*@isgB>Coo*d+GfdN8T<0ZIf<@!{Y>&Gk z?aYB&G;hD)V)S@pVML)C{9$Z-zGd?MR6!vn@9tYn*abxz=LAU%Y%<5cO9QUE{P8oMGq`2F9Vf>xX?H@f3iyA9} z0@YF(rG3;{Yg3lp09H#39@|Pu%A{@z3aaSh^9wXg^(^cW?1F+yb}HMm3SA8*OPOj9-MZ@Z0E*IhKlcC#YgT$&pH9R|1#EbN-^*;!6}LddH$fZ}4tE zT`?(!RmT!x<)j_NCgA|(+ZLNAWnx1`5`hiSq+kkx73e)P*#_llU2T>I;G?B*h2g(9 zt@)RohjA69*&JvY9@`9WlpAa?P4U2S63)mZCAp=hzwT1=FxD`!XpS+~c;;r24kJD} zY&2JWcW5ik&f4a7bi=7ZdQE}UYb1Nn#{SoR4`GhefiQn*AAdE~*y%CV?oT~R`bv~n zel>y6kHB~SY~!ED$bRP-5`CNa6UF#dk^jF-KY60~y17p{2d&tgBnuMX5bxe3tv4-< z0R=R-B&ljeNozN@CQlZ-$WB+85xkStJ!93Vn(TCG*mGO@!fWt2+BP*xT6Nq$v5<0- z1kJ`n#t?k~kidtZ3lpaKL`9KCA4F(Qs_84G{%^5mfQSePB)4#WQF3e`E3{k}t*Xn1 zv0})3)Rt+Vr^3e#SfdutY4d3aTpmPcz8yr$bk6NA=A?k#1^XH3TReNeQdfkjSk~1^ z+nwX%o@5bSAFXYk$2Gf_$ke*lo^hwSEjejZT}EVPxW2AhH!TD-GLEpmtUXVSCm11r zZ_0vopv?$fr<#HvBQap{`kk8x9%iu@so*_GbsyJVJw6PF-)S2#|K@qZ8sw9NN&pZE zysx=|mD1iJ(Qx%$WV^(kz5!~ZOKyO=!hf$MkurI}yn0uFK;b~Jix_b_zhjRw3 zsG|A-{ejd+sAgV{dAx<NOCq0{p-p0|17iv`;KtkEYu=sUID%{M&F@peN?v4x1nF z$+PX7q@=(BCq~cexS8TMZl`>#y?7`{hD3hWv3qfgpe#GA0p+)`3M1RkRgVqcs_0RF zBdc;W{a)_tSZJ@u6Wt3Gk7u&HS;|M=F3-4m?+m@fVPPuhXxmk^*PH5_)IqImE91jK zYQ^_&Lm>4%hQOspE{G~;EAw3UqJ`(#I4aV>X8bHfWcj@8%qfnkZ6Dn2VBBmQWXy-j z?RWJRM+ncT6tw5{zCl<#K0?Zmp<$y)c*GabY|ls%^TVaE;+?KrMPea_C3R1U?~Rv3 zyCyy3MK2oAN=BH!83vrZdiD*ltFiY5GzA*?%J28$ls_75iA_62 zd3!jIw?62|^;7EeIXTtUpG@A^tU)TJhp13RG;|aU@yGVa5P^^1J6rXY_R4w|^4pJQ zDvTl?_p0BK%9v^@x*VyD1v{!dKY#;c0$j`gNiGOkZ? zGPh|ZK}?=@n8Fr-lY!9+pSQTWgf%uqQz|yhwsYXiyuUg0zbiUJFUA18cEl6a){_+% zT5PSs(*9h?l_P1tSBMYK)o;7>zTmN6@LIBE2GJB41h}W3fId9!ZVGHn!C;+_0}Y#d zc(D7VByD=%O{lJ9jGvsmnxE2;Rx2TyeV;2bjuzk12~=}*kQQ&$9mgaECJN&_3E(y8M2^7 zr9g@u_9=k)#_1`%8EM;o~ra=oN8C1^GHmb#jU@^ZlS ze1-E;;>CprA(8zwy*ZqH4$OolGDDwaJ7*5PEg6$fVCA@CpiSge7!=PMK2fptskil1 zuIaR#WbxvL@MiNJP$tI9Mc%N-yXI>Q)_%dM0YO&LHmr8g}wP>TBHAroX!*UaH2xefKK%?7J$yk5=m zXg*UL7*EjO^nQL%Mgtppp7v-pm1zK88vS}2G~>gJcG94V4aa(y^LW|_ z?n&_4N9Zs`U2CyH?`w{(rmbvKYYg|d#BbkTua?usgrx+_J2}bNy#I^z$@$(}q8gCz z^U&KKyKB)cjbGg`tlAB!t6M`*LsHMg_7>d?C?-@Qld$oL@)h%U_#q0?Eo@I`CChuX z)I!i%ADS%fg+C(`^Cz{*J*ICwBUG-N6)8u(S5RaD7m_5zxSUSduw+7bYjsNj)=2mC zicQEKz;|n#<)(0x0{pKySCul?`jEWf;fkwV+OEIHmdRb3i%oIzi89%Y78LuP`p6SU zPRir;1w!|1zc`N(HYRsf6t26$_u~!q&w$?kZ1Z2?IZ1q*{1Zg^cL6tk&3(YQuB8Cr z-dbkN;J&v@ST~F8rqlN)eW^WNQBLY2%`rr}``L=6+-bps&Bm^Jy3*CjU)@)WweRl> zuX`fi4^u}Fa_df~Cx!CSgasf(5M-rMpoXzTIE%M*nMW=wkcvUMy zH#w^4Ob$TmQ8gIL62zk7qA$Pp;iyy~ueMDf?Rw3h<@@*f+f%hW=+8v@NscXPtLk;D zFlS{E5scUBl}g_4*EmK#KEi!=$IJYaL7!L^Y>Pvtb8{>p7XK^I>`_b7!k7ITjElSZ z3%Bvwm`~#JQ~iN)V#@QJYu*`J zcja;7D0Pwp=U`1$b)B5h@%mzMby5e-x+-#Y^(O7*NC2HCJb*CaL=xB`{DNPtqNX>N z*dmHGPKAP%aT=%4A1^q>B(vOQsw^j>6sij4LmV1QlR4{0l^6Nt4m^Hb8}h_O|9W{! 
z508jDGP7rsV~LfPHxi|fBz>-7Fvz@QYFq5`nZg0DSRIcxksNuap2;gpJ)VrgY}69* zz(!;_m=ThmpO!Dh=5qMI4$=yane8380*~le5gU++wHB`Ds%}|qArt?9lX>E zk;VdO>}|8Gpi#dXu7Ru8sxA2Lq>wGGAM0uT9`Eeeo`e6wscZ}lksKX{^NSd(aEN3C zwVskA1&&drD)=G#C)QXSej|>%jY#(o_K!=kJ|d%Jbx%f!=W1WQXmn0{U+#EqiJhc^ zBg!UA98sHSsWjjUDUCnMTZI9QXHo76FaPc^>%WFv|B8PzN#>V~_ZNAC)u33tU+Gmg z7Vl%3J^cdeL!|Y89EOdH-<6{{p)RcaYTD8)gdBA#u~B<2CJb79kh{_2D-gCnsM9^~ z3yo@zFjv3gkF}D3rSzO34dUIsk1g$K?%2r{pjo44M88x>iFP7>JICAGsQbw|>-pEj z-NcP2B;Np&g)^6`xM4FDUie(2uc&*xzW5q-RlcO%(}5JCD9L8fKd8mCQpr5Z+GSvl zP@{BdlUVo%KQ|%U8b29+s#;xs`;E>blzUPc1&KbXc@U^>V!=P%#TWW2cJ&^J=Dob( ziA@R`4e?hyTls<8&{zf!SaDljeOe=A#_!*(M5L4$7JTwlTcDVZSTHZsG%OtSOd8Lt zqRt~oK@s%N9)KolOK;0rxP(!?NAkr!HZ$@JK@{GcM5~?bb$Ir2Na@PXarvZN3P`_# zug6q5v?F6{K0Z>%Bm~WWrCth6Q9`90xeifJ6uFw3OkPm1ffH#D(k2s+IcS`I^ojzp zU&@AIR=uW!VC5@`trKq;phcXHGAVZZ!AvCm+%=w;lgbd0BS8TOghx~Z+20ggO0ebN zOL0738h$tVWu1%daIw?)Lbb!Nk_A9}ZMyL;C^Kt?t0YZz<2nOC2s~z^YcXgb+eEJ! zt5wq~?%5pij`4;FwAE(LOEtLgkw@Xxg+9ti&8gTYisQ!NX3k`|I|imDMXPaq_yL-U zUC=jk`RFvPOp>9irkC>b%UEX;+CX|GQ_Nz zWM4LqfPmzpGdZ7yYpfW(w7?H|YRUCr1R)M5_RTqb+&=e!>s=#kGyIjeZr*UKYa`l#$ zES=SK+q0v(R{Wi#vg(z}3Gk`x1Kqk8F1`I&#`p{%lBf>0%E}9zba&GCgY<37zSGa> z&@#J6q1Cp0FV*E`z5&X8B5suSk}FFg-ZW=stX3S{w9Y_ls=I{zEED?hC^t@96D_?& z=yHityiuxeu~AdXI#&qX4UyUfQEV>o-ni9d1F1~N@?KTqdqf%s%CBo}Cx0?siQKn1 zeXS*U3Gq55X++cm8N{NYL7PmGk4@6F+G@9wJO#}MeW1?QhaP1=AegzL3lUInbbvoP zPSNfDAxxsV`9OkdMr)lYKo)k2^=f4LJOC4&*SMPQgL0eodmi}8(M-cDUcdV5K-0qn zhm*HdpP+-fq1o*7Gml*lC2!hZPA!htSgLo4*FLpzboXTHpKKHhxdo}e?P?dPNGl1N z_Pds&-&a(sLV`uz128xu3R~wDnA<<%5{6%-3n+2WktM3!*&rlF96R@`DE%g%(pr5xnw^TJ`v-rzv}Mu?85p;10+wtA{#b zq~E#We-dc(XS=_nl=)xySv>hR^(WNvYaoJOmX!whiT8tnH^&2`+58HG7<7RhRu34} z^G*$)csfx)Iu2ktk)T*1(gP-|a2^Eun7!uNl;3!3%dYi5EPH^kp^|1q<}m%V;%}#{=&v ziy+!BfefNr%dUnGR#ZrL#%zvy7;`-Uo2`)l)Jg%=O4AAD|8wbRF)^&q!?4kC{w1>R?w(`&~eFks#Zl{5$4Wu_tWn6n-aw z4tXF4B2yGj#!BA|qjZTWGU+cZhRox0H4MU(DiP3>nTG%(TU9PLa!!Vb8_L{+;v!E|=X{UhPr zh**8M_-0JJlCpR5n=$4o9|t}ZEA9HVF_W`JttL5~ZA3J#=E0Zbhlwrg*lXq^P710E zWz>Di#OFgg)?~#skO*hevAAC-O0g-c`vnV^QUfl==1cjWe`q%nesLYc zLtBO@-y0H7&L!hK*9uk05ZqUj-#28k7b&GWPi~af`lM-?6SMSCWF~N67?!kMV&o9^ zsWZA>>dT5O!?hRICA4$5z5$jh~pM_~N>ZRak0@kOGGVqYdsbK67ey+LsXE^7^% zSL4kS{zMn0C{b@=r5pA_c{?=-akIX?`Nz~1#1fOxa3~s_S)mWPpmJejpscJOJ(phE zm9=e;l@pzwLQN{mf}?ps{L+31=g?{waTRdkp%6*&I#k6@4^?R{!<<@zbky@!|(JnjTFquq>j?(c~WU-v=o) z*N$kZblo4`ST1e(4PZ0>#7pfN5%`WwIv&P5p{P)`ut;(P5$UjZGRG{Os()ayOS{2_nylPcw*U8EH^zd;_ zj@O6}`)#J2+*8eq{H2-zf6~Ry^uL{(_1~xC9dc~-F-fWy;T!@xCn1K;h8W&vqI$_Q zm~AOjcWlZHG|giTrB<)=TifNwWCP4T&Wvu5rTP|H(pbH~R6RBfU-p;cxa$hC$7%Wa zA`T|gX@bX?1|2mmW{YtaE~G4_g<1=z2}C_~poG5zb+*ykl6)L#Y{q~8p-^5%b@P?T zP{r5bT)C6I`$H*k7NlS|+ARJM{$#fFo^{Cb$1p0*ss&k+E-GlcSD|5btuc)x51R#u zTF{ozrAk(4^%e2FwAbKfV2_u|vsx^7AB)HnF19WBIcZC)%^kl8D3zknW+O8w^T`Y2 zM8N`4UJ+N__bC=#{due%B~Cg68*mWH zHcTW{cm@DoPcYPKVkM))Z!}<~z5NWaMTaK1O#6pWJtfPlJT_D6=be&6UhRAs171~M zZx$$&EbJpwIhk#!UkIUm>YtsnnAatN{nDVZ?UCT#Npi+frs3+2_t6|5$VPBlzX9Z{ ztZAL~{c1Nd@GDZTom;8nT=y*jkovP=qYEegaHW{vGM!hhm0G+Ef`V5KIY$pHURj2s zWn@kozsR8SV_$OGT2Q8cpwcJ=WwI~L{XguzbzEC(+BF=c(4rMWfg&vqDYQti;uLoR z1cDWZAjO>)>IBz9aF-B*I}~>(NU_u6?$8!!`{m5@J@d>v^2|9i=gfO%-t!Oqk`4FX z``-JOYhCMFOBH_3MgvyYAopVm6%0*ivUw=df{{Y92O~;bfrD(8+odOQi%DY}uw>bT zk_&*t#yn+5HIpSmD9(8Xv{ak&ny6vto?e7-7%g9zUR!RYfIxq|p8K}awAd>Ku9G4A zvz54L=DW|?>w%BjnHzWm$W@&0X=AnJiOhekRY#qhTny(vRa;FNa}(TYyixHbtW`t6 zG)e)UGO|mISpQLnl{C%hb@-QU2UypUbsM!aULo)Cmro5&BE7|Ox3aAc5IglTd{vQZ zueEBTUXon_XVEjpuj=2#7sZx+pxfoS_aSaapKyu+rVkDaM{prF=ebS>?Ebm9U8|R0 zAvbY}3ZTr-oM1WDxMg}^nv!tz2h&>p6HVYobWSvFoG^3qqsfzN62~j7Jr=IgWY{Wu 
z4qqV7%oh0ie|&3oy{aorjB~V~8BXJUonHEU1AeTLjxU3V)?*S)d^Q!p^Z$v0F@)P-#Y-zj;d8B7U+v$2!tCAn* z1*Ow2G>0o2)>Cq4X5RKIy6;H{+2S4By~m~Cub+75mAr4czSs?FobAG|83g{`_HVy5 z?Wa5=g71^RgFn9_Sn^Yz!N1Ml6qV0qS<%HWlNzU<=2N+qM>(jtG_Y%Qk}DjDpVF4N zHv5X~L5zDi5=++%=Via-y;PWkzS2h) zTrqQ*(-80Ipx^N1>O1dgdoWem5mXrz-UpjOQsl_{ZEmVr-)l7kqrP3 zrX|ycOMQbho_qhe9S1iT+qGaMTavlSpn=${LMk{vx$$kBt!)$5;tjsK>rCpR@qH`7 z{`2KJ06RRlDSM8r5IC=aL^Rxw!<5mxRL+Z{&v|(5Z@2&@({t1mwN0#3>z+xV5s0b= zAkADv+jJ{M*A{Z*qXPVj^L3VIAfmMnp;0~s^5|{QAUY(>GZtw4@OG8;7f%BSx2DR< z9ypQKP-e?uiY~`Zz*2mY5L4Gn)a(sR63)>?cG;BsG5cE2bv)O*BAQ`^tRQbtYsCrM z_c-;YG0Y|5^Bx7m^k+OOBqva(Gpwuf%TGM9(;GUoJ+%7{4dlUbX$8lnbpg2tEDzMj zY=T2vC+pwb;q(PsL#fWrNN{#AhD^$@+q#hDI6^^<*t6egauxqybX^o?9drS28-&P~ zq{fU}hr*zmPoX zlU-H#X8*FKm|&4K;P~-arr6HA%&*)NSG;Z@#aP55s;)JJL4{_pgvV03P_bggvZ7PU z(z>CP&1;CT&8Uczo~pC!epsR%O&oFho}HI1G{`&espD)!UTv%hOOt~2Agn#hMBkg( zsD@$VwQd-H4T_k&02$`vR--8i5Pz&mLr*d50D0W(~ z-Y&H4eslcpP>xp-OF_11e%5svOu4o)W7#@Pl*uY`!$kK+Q3g?Ir4cP-pAey2Rb#?+ zJS?=|A}0kY-dFC;ey*~ZW#5e!w?8~XXy>bF@PlII7*z69?EOZ*^(KCz+S?9kQruD$ z11802QYC@Q1??kURlkNx6@9b4;5?1q)nwswv*&kse183@-o@}|9fs>qwa-R(Smxkl z-Lw*%wjk&wo>VSn4{mb#>q^-S@VY93Yrcj{iXqIl?VIGK?5{4{FjAs<$7iY| z!tXQ4i`(DuLzqsoXx%>rvrnSG1GbZzaWr(sWgw z$88GVEYou zbFwQ+o1nC~;{KR=F~}_~A|mp~-vL&Go7^cg0dHKCFj8XAQY9#v1TuYSt+e?4u@Nt6 z3U|~AIDq?G^U;T)$_A0df{hc{O79KCnEG`u<7M|fv;0SV2<@uj`c~{}co;qS4*it; z*AS_K95JeI*(Vsene2-AlwDVa%^$=FKW|FG9-m(7wxRMjcn!LLXxEGWVf$9Pw&E%I za6u8SU+Ih6dJC`Zi=v-r&QUC#m{mEU|0LXm#2Ab0mwZtGK;J)Lk_DOkUd4#i^fsiA~CeKfgxYhfFO zyG{w}@)~$B*_*2U2A=X*D3gF#Yq2!$`|kk9*|RBX83LpoT;hdXYHHsW^;*-t)2er9 zm(A(Ihwoq-&nVohzG0tFrh#cR{T2zQPYNeDq}RZ_!NzMg_p-pZ>%*oPYkG+Nk~}$E z*5M{sSH(uf=$o>&0(`DrnEyE=zN;O&PMy%R%WmWK;WHxjND|HBvus@d2EFOQdioh-~#!7UfL3&{4mgd z=%Yi#0Y?arhX*vtVEGh3)&^@Y+Sk*6pcsM{d@tMQI@$c>Afbk+khzleK>xyiQ8m!Q zWuJ(KxI`)wBkIlTH^rtrKYd<|;?Z5q`$VUqZuo2>_jW;^B7t%kG;^I1*w9pC^un5e zqfqLjR{SvJnu&}_`Dysa(7K(ac<3|GZ1JP}My!sa06hun&f?-RuKOxsu14T-n(5DT z&OF$7+;UqaBKtGfHbTo_WXt4Z7HXZR^$r%BGE@|6rI`DZo;`F62Bq&?8QygTi9gqE zQf7^oDQJ`+&@AXe%}qKeI6ET>H~ptvKdLhuREZcz2Vt+dzZVSTHS*P0q@gKk@hHev z!<)s=%gDj?|F0yw^2^nJh%#8JB&@t2@xqWzdDu^$szsa}pNID;qJ38}ACr>{bK+Hk zH2l8qKb&$U7$Tkjhl)#x)ku0NGjC>l5}L) zY~(pe6-Y#(VDBKb@J&>9hZs4AYbR$+d%3|>9-ju8lDjW2o z7|>}bvSKQ_`3UPL@hb_!pvSzcy(Mk!`Y(q8QxY(t>4JexyP{GP)-_UnRD8D9KjwAx z@JGucEqw95Ae`OSlF30@y4~39gMjvkc*|aj zXg+0-AiD|~2A3nkTskJ%s^Z>y-?ee34awjfH1 z&rI5{zRj$}IYDKg5}3))%+RzgQ*b|EkjM-NmK`ouzor6E{ND(me#vukbEuuK=4K%u zxy&@Dv({X~x*}Fq!ZV{#UKmK)TyFwR$t|t%6#+g%&yr2Bp8#!#ssr|xfvMG<_~ET*!S>g{5LX#z``HQWffg0?BR{>AWr6DD5(POrPcYjiP^LMGjOVZxBe#h zKJ`1U_WQf{_e0qn4*Z1Dy(*%bsEiH0%Q(?saycCX`b=Q^f*N~b>7M=xs6sq9+?M+> za}%AGvT>VRojCQ%^tFYCWD|+TD$ydbCvlM^J?fs?90|dS@HGrHi}xyHXqEIat`69Z z?%$C|^%oo~+qsz}Im;$HZD}T{&#^7BCiChZ^pBR0F+_i~S_bVVqrgaE$c6M2VvC7l zC)d}Rls-ls;Pd2vESSxt*S|@q{dE2RdDA;mmM)@w{WckLCRL@$3C1$d^0=CM)T< z732tvLTI#zbNziBn|Q1uGuu`|99VTgnz?(9vnDwX7$d>DE6J(SpKQH(=5hq-P+Z)hrmwnzcYB(F|jkycCok41I};Nz(JNH6?T!e>yl>DCrBE&Kv^CW3;wG9J@hVX+A*BmW zW=Ctup1wAoIke0y(5QSJ2IZ#E;BM>S{u-w(J}~EGq6rVKVlyABn4AV{RI6a!IxkJ1 zzJq90O*>Cm*%!d|#-jG^P45)OjT2~9tk#1^wLTR>Wx_rQel6B=tBB=5x)>jsxa@B$ zrXb^dp4=8gnl7-M+On5gK{^Y;Q{mV7y5KU|8GG6%FG!Ml!zCi0RB?{e9HdoEk%JXg z;vIFb!cA{uY~0MI_qH_YK#V8Tq~@Ae0dUFEzGX#5K@JYIin!#rtor>RdTji?YqwAB zj*`D}p{GPK%m>E)tY*3X=^-U#eE1Sut&N&4HqpwMoL-2Vj;#${YU+?6ql4+#bRXWZ zUoRj=z$78kk@XFe-fi%cE651`kYKb?sZyW!v;L|rC{F43#3qd z(_sI0JqvLbyMcoR{y=9XWX`r?&&>(~`92&cuM=M_(U$MD2(FTF*G^`EdFZ*{FN^Rv z-S)%Us!3MW$xC`s-@m0|*#&1Gom-1GVWawrqR@34%o5$@mfF5~Qc{h0G9$a9c-upm zVN+;y+PkD9Y@F;-g#EcMF|){0PVv6`1Dd^YdOz8XTt`Wyy 
zE9HBJJZ)NsxXnF?;ge1bfFU?RZUKYc+MDe@ln*gH$@3CJfs?4B-^=27|IyttxRJWH zB0Sae1w*)HRVk2EzB4^>jiRS2cEOP>dzvutCTayc3@__Z8pciBj?;= zlz4RTmg!*0*1c2Z#cm7G8tL;@UDnGysRrz3qXU z6iMq!7_>7_M_H-FRxj9$xb0Axk3$^U<#Em=pRg)4=6GQdMD)t|(HdFmPtsz957++yp(!XI*Scpl$3Vo@{0t@r0H(M|cu&yPGL|HtIZr zza@44fQQ)3*Evt8O|}leAW7vW3i1kjs5Ym+ME(4y`LmurK5h2V)q*|aoo|Vzl2uq; zx9(NmnXwyX{NclV_2c37yX>f$IbXt9lapO!tYRf{Y45nRn%4AWmZS(KK&#Nzm0DEb z9FK*~lLnibkaZD6n=qypHNt%gP>MyoHy#vmhsTfBh1H#OnwGE^!+##MT#(v_1hB^1 zI1Dz*FBw##5^&j62{>U)R^rQluE9y=~0u5vM~Cjf5?Me+#NQzhA?j@05Q#pQXN!{Ek@t%9HtTcJle@ z_Z>d_Sa1x#sdG5KrGy3A$Rm12GBSy=3Tc7N)N$q6&$@e-%ut~V!A_ z@hs|^pkv(@OjnCI$mey7%MNnOb$GxJ`fTp~E&D{k6Ns7nA6>Id<mJ*xz4>i%mlFI?GkpOR;D;#K@GV$o z?g_Ykrak&RFwu0o|1C1qggzQE$l)nWl%qE!f?&kP)PWuNo9!*8kw<@qC{WB6Ew6`RJ^coF-G2W9|7eJEGv}s__ZWYKt zMt`ULD%pM*UN9$ol3De`gY&x<*(fI9IhS*UWpYcbVPGTv+ZkcLC3nSC4#(_6h{NibmUL zJ|LPHkiy!E_0RPo4(0Pj6?M)NBZ~P7-O*5svD?mBv4vFh#Y+@{+`6)D4W1)Ak?=UX z>JJ`;K5yc;heUbg$R={eTY*!&Z!a$aIm^O4fXO9NHg2e#!ae9sS%ozY+lBKMo5ZHjIQA1-jJqze*fN!e(l4aa)A^(U2UXGMi+Qke?CJ7kWrz%V zQjKMdpO)j$D)n5>J-vvOOI`|8;A@p(6aF?FH$wfz^%UM9b_ZW5WQyq$SHVgYD@4@j zqUj3#ZH#T zQzKRtA@Y`;J_9W_mDknMswk)#`l)oR{Kl4b$$mP0pa6c8Xkk7F(nAHLnV*i-YdQy_ zhH$i_sW(0iofwJOJK9r|)d7%2P;}l90R`36Fmz_Hb0#k80J&DEb(WG1diRc1KCmD; z^LZD!wWZ*K!D_t(UE71|JKgX#$*QW$b9I^Pp#@!r<)G=tdl_t5_X&@*K11bdlp-19 z*M&|Co$tqwUS?T2qHpc3y|XWS$0RF13o&x=qh49?Uu3@i-l`3)4~!DfUCVW+OFa&y ze?v(ic%`DdYTl{weP+w1yzff+*Q{-VE+y1iXMfXi(D7aS89{v2^;QN0E%ivf{ar-q zTqh^TbJK2`Har8Qa}t?LcJQEca4>>Ic2u6~XLRZ>Q{(y5AL9RcK3M@nZ!Zw5(=coz z2dBwK5{8`qvIf~NS74315glv#n4ik6IsU{nH%Iz(S80>n$Ez$`4qur&LLR4OAc^7a zKC)+d`O1u|9h%{zXyy-^h{7}VWd0bLYfxG54_!!=w@HnJT$9n@5x}KkC_iVOfL;U# z(SE{6_ka(f&l_djYBrvmJzV!It|@u=!Ua^4vKzo||MpzezHAa_(3QFPYv*9eJm>ff zZ_c>XYR4Msj@4~JX${mF-3vER@Sx|&z6b~<4Mw@CEaIa}TkW7u*8?xSOI~9rPa8|l zS)JYFqDt5JZU~N_KRm-S=YzR-T~M54u6%+$1pm1T8npSEHtnT6q#|+fvezbj#ih#f z(NCi{$$_@l1OeqZ6LjCVp#i36;zb1nxRLsKsl!`Ljb#6RFZc-arcu?qNUXA(l?0I9 z`Gh?d+F+O@-4=X?Tz=u?)>@`YmmEVIyoET$5c`OAYnK`3K^g=BKTv!o0?mq)s9seX zR{11xL%~FHNJMv;v(sR#GVv~2ho%cZOn;_8J6M_|+pZIbasJI*#=qgtj?j&Lg}Lt5 z4wuW}+GGPa%Js6(G~)?cv{Eg8nEjFV=N(vm;HcqbWl809oPb4Lt~=Qa4E*?pj>0X* zq{lv1)|#*hjQH5zvVV?_Pfy7KlNab2`hkE?-}14xyv1))K>yFd;NL(0y5*^Vqkr+f zkN%Dg{dA1vUk4ihmaeI%n8Ea5XF|mrY^#-&MGHYJc^Vb6YR4|rbB;GKyOqg)+f;pZ zX}!AodVvLZ(X($SGwapSO|^F$m#2(Z`ogkNJRACUcWBWoTG{wXRI1lMDsaP&)<6{= z&k&piG=JmG{ImGv|H|n2hpypAjclP$nut~6@U#2OmFp}lRCpguIrkOPovgt)cd1tJ z2XAT&d|)%j3H*uuk5*(D#)I^i$LCJWzL{@`X*N!%@IIbVzw^#F4zbuW{SNpLNo}$j zBW4^0z#w;T`|!8ABCvn79t)COc;y+j{~iTiWF$NwpFbX`;g8{4XiB>R{6Z{5~&bBf9OAv$fC34 zQnDPU{US%%+Od29cs`|Z>H_OtQ}8u)5c7z*EZ+0^5hyiXx6Yc|c(IZGiDPz___l_m z;IR+}95O)xLe^tf>u8=#Xzwy=&|nQN99A3_*Cg`?7x^l1^)?+N!1`bfU5+XbI;Sj^ zoch&roUhk(SNKg~Oa7T31~);4rK9MTFaiClh~&r5 ztxby`*sKq@vFEL&!AY>pBYUZ;k+Uh>&-*pF_hU5zDDG;k7;1iXbDWzDFp-@1gC;|V z9ZfJ2aQ;9&ZqP2mFA5}E7fAG>I`s-qul5BhpMew4?iUh_`K!T1PEWQ=Rtpc+eE*hK zV|H0yR5C|Tw%ggH;M==b0F%0J$MIbzitUY!yA4bN;lgovQ4X(Tb!q_@T}2EU%O~!z zZi064JvK=p*tJcnyz| zeO_g=Rdz@N$}+m~98&TU-iVmFjJKF-SYN(t#?22^s)0m9cvDG^(Inh~*`z?~A#h;= z8!Q2#p?3E+00)CAwf(&r;J=SoJkGaM(8ncw56A}TEs5zTKl`|=bjrhhBRg*8CCtm5 zfR`rL(|A&I-w=sznR?@V!J<}jqBnG4H_^8ZMJ2-hpT5vyd&2}f45R~kS}6Dex6|m= z6N2y%ABMP1g#^%LqqxZFaP22e`q20b#qR)}bh8c9!awqp06u~KPunpklqbMN;D6JT zw1|sF<*D#)W!y$CKaTx0cv8}FT;g|l-z3~uw2IM=rDw3Yhr$^R1=~n%UEMuSGnFSQ;Q z44Btr0QHEjx|5$>}<9s@-kl=Is9<>HqKCPp;3v-Z%T@JX9r=h>Uo<0aVAy@(YMvjLQ1~g z-@f9RuLwF3{pMMG%1H?v^pRW>Q@fKq^#yt205Y(7X%fUb)nbXc(vLmZ(yp*2oSfbn zL(xpr2b*Si^jlsqXF0J?F}XmQJF-*9#j)ZD9|msg#v~2tk{NaGX@WzNo`kmB6x3+F 
z(3>Fsn|igTaL*~{;5=2wObzOaG-hIaG-nw+;uX2u$}l*a8YSRa!lw4d2gq5Q%7X0`?liGnk!KE7se?0md?Gn+07UZXlXXtTd``nr(V*fVzyCwqJiUSqH_XxCma z)Se((Y(fM}n%$+U8fy1hQGiwHI0g^H2FFXkgjPZb^2Ek=&0wKaVhA+^O!_Z5soH9- zTInkI?Hp08))iwoo|D)`sezg$3pdEt2s;B3Rgt<)_vN|%0r}Vde$pP5SVUjWb|4Yt ztypBzF*7zm-QTBNhGX6t6o0#{XHUaIxpxzO5+~`HtTgCmeD2eBAI9v&KVZy9CI&_M zD9Qw{I4#lZJtnfhsxJP_pj&3L>DjP)btP*@#*Gal$ZJT;Kn1BVp|Y(xFlJHm#18?o zT%%9+{t71xRpQ=|4obYlCskW-1)LNBQ~e99&B^DcD{}O{kr(Ui ztK#V!H@^df55Rsx;n{69*J_DHfopy`xZ+E1vK7>Zy0BWXMR*(8f%L=89jGHoc9lV% z0)7F0x{S=fSyWBvrcj=#uKDiNO$BFmzyfdxpoggGeZq2J{S9?h`5f>_* zy3`a^g8QbfCW3H-kcS_3tnF+ln!>$crkQug37^Dw^}M`t+Y;!lo*Lu0T3;lWMAN01 z|Lk&Zu|Mgn=r@+gX^fb5h1iW+ToQ-nUv@XY^2qr8K7QE|AoYFhcR1-ESik=R=tDm( zL@wZPL8UC_+UT8k1D_YqWdvAkjgkyL zx_A2V85gWMZ;y}e#K=FuG8L4Jjj=|Ea{~0ABt0(EfC4K41o*WmjqMpy8C;Or@}C;t z<6S0vH0hje?&Rq_s!ms*Vs$MV=7B?->~ZSgV5L8-mGPAO%kJr<10*Wf z_Y68$|1wON&ncRK($y4nK+wL_8@cUbbxr~}AZOBVu5dF?T9O=)54`q)$IT(pcz$TW zP4dX5`SE`nxf%V@u`PqLX2&+-kxDgu-X}dsT!yrrV`f)$C)V_xz4xWI$06;M+R``e zatImYD*p;@cIUPh{%TP_e~AKIw8TR0*)+Hl?p@++W_i)*eTv=~uj7w@iG7@ZG`xiwIbEVC|WDK!TL; znwMg5=l~{*BjNC4h|Y3kVbRNwdnX z$i1NysHLSPjb$C*XXuk#_{+O~B5DN54Sba-n$*a|ps(_tTJKw#=A|Wr^jgc$bkwmJUWyR%;*t)`_(SLqZzlb`pF+m2w35@>(W zTQIO-u(e@o$nPtBu1Etup}^#YSVdnneghso4Y_0I=}Sds5zJVV1+TcvcRN0)WzTfQ zSvf>`wOMrhUXF6M&hiuR$;8V69_QyigF=_bvVz>l5OTlVTAr7njU6EXk0q8H`OoEdgMa+!#9_2+yXKfGqVm;S&#R{8vNTit<9=*~bLd!`4?qQb=>RQS`w z?||aR16VVgl%q;y{M}v3eY(kBlAW+~_1-nti;adniQzS)Ywvp_>tNU3ANT3{KP0)k zLZT9Ar(`wXQ}PYOS&pauMQ_v^@a@_ceYcJrSa4grQ|u8BT+WtOJmUgYNL?C%km7a; zCYCW2`^0o7nmml~i00y@u_~$Am(_PssE^Vcs$u#%1%2Awp&;`y@I{e~)8+N&hlT5f ztJVf#51(gH?enS1gMx5?eb8S8;eH~(2O7Sx<%(cy?PhN{l-@Fh99(}?nm0Z-m6;%W z5+j&{Vd?cOoOuX{adQWh^UPhhZE9WD{;;b%D?UOVDNY4Y@5kl6_{$QpYp0#A8z-kvF-k7Ups!Li+wx_t zSxQ7a{xaQ2t=uZE)-2@1RGWTiSkk^{w!MJHq{0!EZIjnh21dNbg3S!=wcm4verjwP zd>mstw(NOq{}#7+ow;w||2G!*OZ~2my!Kx4L7^3Sw<~gIvvGtq|85<+KyNv||3&d| z@`XG+VK1Gyh2qBwaltRp?%*W;kk=@kw5^fO)K@wfz?TfLYQ2vQ%+5+Q1+ z>qR&<{sdTeErkww6OQ+M3$h$#zOBG8f!_oV7sm_zE&Av%V}Rv2mu&w}-&YKUDSQ zkE5k&!k^wT6sG3G>##g5>Bx|E_4dX6;dt+#=vh@q(xcEJKjw#*&8(z#&gs?BK zkzI)?n$GB4B#h&&5-M+;_0) zIJwzYD%Huzj8|i=*jl!e`!(#cU{&O$W7<_2sQ%&HcK}Us3ax#@dVzcgM>dI1^sOkk zBj2$Wh5eP5-W&qMAlec|6Kn#A3!(f9utdyIw3>-}uSkkXAt3vNNdkZM`nYQ&pBT^j zsUT(BpVrxQvu6-{Aye2Gqu^`>8oo6I3ir0b0at<>tTq;!qPfqw*jx(u)m??%_X>5i zV0R`DXxT`^mZx$)tExt*uH5Lj-0f=kgl>$NBKr#RMvcps=_kv4IRfH`fTgjj7h-7& zwvpP#s6vZ=jmpX|nc=!oVgQf^0_47u%RZ0e`cYl~OAziS6njsJ=-5x_FZ#p?k+rx= zGrFQTbvoUduP6bI2YocMx||gBEmInvifxG-1}y0&vK%x_*7n4&)Dwm+kElLyqG%<@ zQdlx5!q51}SPdWmXqp|&<>sgX9n)N9=dA&iLMZoWH)Z^(z> z4SeZKm@qCsfY7hYDGp4#d9_GkbuzNZWP0yh9L>1xVIdX>S~q)r)-iSU-AtbCa^O<> zni&tO{J5lFcEY+!mhh!pAUyK{y>G%Lz30_y??-yd$qIW4QN!XlQLREL;o>bdbtPw_ z*>TQc%TOzm;h+v}m)H;<9HX-3pII?}%Kxen=&@BklU2>&*no1~=gR*PffxW^2|OPA>{1{ zxH1U1J9ZamsmEvPw3KR%t{j}K06KmLT$dFj>n&UTczAY8mfzF8y;Q7bZmyccD+72(7 zM=l!UoSDu@RmU8Q<3fG(h|7bO^X&B}Bh<8CUVwaXZh8Rjmo(bAV%2}nH0YOFnSPCS zGSa|4JuJ-mSx7H}C@h-U$lF?)LIM;@q%@?v*raoId*nVrK?&X2M0QwAZOJB(x7b{V z^^BoE>6Sm!P=f&*ncr$-J+V(>l>NsAt`9;(Q@AAM3F)vTc`l7bg?xv6^no-uiEb zhW|@9{QnR44e|H!-|?^C@vpy*h2?Nid%M{dXMH?yI<*vxtu0<7&c{@VHD@J@Ia4&dg6I%TZ4$jWj7*j5O!Uc z*#JO+Ge|X7$ox+qbCVU1(b3{N7{bMxQ_|c4`RL9O;`^ z9}-m(XzvK7<%#3{+BiZy=k=D)Se7+B7@{AHOTRx4fSOZBlR*ZAk5aw^8XYFKG3NCz ztwVx!1Udm`SyA+3BY!fvPClvxqcblx2&6tt z$l9E`J>&&`q|Cd@nlj;QbJrJaf~-66&9PUU#~vK`K0}O@4oSeX1#FPYMDZFvn263L z21WPPGMaoLVfHvK4xm`e;Z-VqbaTA>D!$fY=}TJG%v7=KZ)IOlm%nA=TAQhVKq`DA z(^_QkG3zC@6bo7H4h?5`dE3}=b_J4^WQ z8_(dG`iykUd0`SRtgY`%+FC=x&Y%+-{4b!fKA}nphAVM*SSFDKYc$Fc>&{EaGfgF| zz?r0=2IUJi=DVwt4t_08UZI{}iagnfI0e 
zE&GY@POIRUxm)L*+j>M;r>SRcg}sHdF;86e3x!mpOKfDYmMn@9JFd@y9+Ngf~&Yb-NV@&ZkJ_KY=-C$@D$CY~vh+4Ye2jcH;}w z+{s4wq90afl(AVAS4PbR5%F*nSo)%TgN zIq?Q6o?v84q}p6*K14B!0;i!JmMebf6d)sRiiXlA;zG~A0*?Qm-~9#eNfxKe^uO?) z01%^Hr01Ot>7+qp1=Smfj;DoM~P!3w`;!yD<-B#t;_wZ(F3B1tfj4U&n>-e7|&!3;p9aPGkeQ zP?1v%)d3kfa8!rSOjZ4Qt>|i_QRkV*n3;e=qeat2_KiKaiV8~J*NL1n6Jr~19b+Cg zl=T$fN`%p>+;63d5;cs99YLI_L_3k39HvXtlfsLe%=P4`JxYk~w+g+274%c~-x zWRLVVdHaDRA)DN)Nn#$Ogip~gS6;6fYgg4q7BFW$78OGd>BfQ*kJmBiHn*jt0$#8` zs{(|1ZOc?oGOyH{Vc%i!#L4~ZotC#vu2YDR%}4j$P#^jY zd_uF&h+JgX5LLbu8oK*LLdNix5VdxfcJ(8~x^$<)OBX<)gFn+9(W3{r_eAC&9F$`% zo*BHCf`%MkOkpe|baD?moy|5+tA{MT0aH<*kzbr23wOL{oM;l`Npt47A1>(PcVPc` z*;(O_=SCp6I-*p0a%ZZQkkY!I!bG)jyoKH9ErkzXK$=^(Jq8^V2yid^)Ri zN?DC_)>PsBAk`rycBr|N7nljq|=RwO9P6ZUozQ+;!e;D?sxqT;JKvywZRE@rc3uw2^C78BtXNV zSnaj$P0-L=(`p7Q;O&D;c~X-k>8SDqTgJ7_5Uo$SM>i&G@fJy?zHD;q9gs&}98;wd zK~KXTq)r}esWt^VgfcRheg~vwrHCs&qC6XgY#TnEp*l}Z`=}uA#?pa&;<`!_FRq&Z zd;q2TUbW1DLGw#Q5n}Z_;3LQQ)3u`UcOTiurPd0ve>_I3`6YT(+j#$g9r<*nAon}q zMWluVE0sSN<(9{HK;k-QG395a%Dpd}T#9czj&Qdl&p0G=QuK7-v9`g}+0!3yJlwDS zn;(fp%Ig#4(*dVZgLkvl#5vDWzzz5I5T=8XTUWnXS|EDp29v(=Nl!Xa9w0<%0xcK@ z{XqB5O~=m{BX=Ck_DDo&zBWEU#_2zraMCu->0e%tsQud51ecYJYJKs*29Nv7&j(=Sl={5m0HtyU~ zJ$>`l-mUB?%a8jWSi(L4dy>_@`Y;YcP`GaYp6c1+s1k*}9nDXfsQ=Nw`Tworh>fdn z7y(IRO*t0BMVG==GU?Btqik|ybW4q@dEhb55P`vm+3w^02LUn4>3qth-Jf1@Wxn^L z+X+z4F@}Bdj|y8mgVM69Gc5pzCL&97^)!SP^L%V|ufn(+6E7Ur?~%Z-_FLuDO88H8 z@eQm?6C+L^6Msg`jEyoQwQg|~qV0#eUnZ~2jk+n0!&_g;i`9K)g4EPiDqm~2(m%ZK zS_{>&?Z7?{E#2hIdVc@cj_Qgk9v0^#suZ~LABpeoj%o6}fkft-i;0PzVqh^qYjJmD zz138qwMUImPwuI?E?0iXd~;XEr3=}H&(2wnfR=CI)o`-tYS%Em*<0w?LiZN1r~MIJ z&C1W`ksky_H?H+cpQQK-+B(`O*Z1&c_&5^|y9@gtU@>o*zJ@$-W{R1;5t_KUYIYcbg#=OUS{ zy4_T>Jd&J-h7e*Lm4Sg1@Q0x8ryIlmy;|bm=<&AU#9N$XF>1rTGyABgZyD5N1U(`hp9ouHw3lQljqeo;mOars$iB^8E|-v8bLY|6T8nC28p>fPuEYa3LUsPZz40#+-2ThXZ8)XB zC>K+QrsE`7+hh9db$DtpGs7&sh1f?!RO`#q`;%0n&D`f_ue_coY838bq-et4L)vLM zbpm`6(*2y;K$4d+*%w@ksbV~mG&yn3E%Z}^$f0Q0ydNYQ8~K;M*rb~m1R|B%`EDug zNi(_4&UFjQj4WT%ml^aq4OvQ+aLNiQn-3C$7Qsvv-AQ^#zJ-`wrqj7v78 zI2`fbofp;-+|8K--9c&mvoO{)Mwz1 zOk)ZVre=H|D1+bN^7$suupH^zKBH`+arIO~!>A?1cIImm{wD4LJD_x@vgr<)*t3x{ zHOXr-cI<%MEX2kG{9hwA_Mh5j*$JojY+aWR8u4t!Tek1P2ywUAX{zhjOwc8fIR|0+ zsLq0i21{g8ss0KTnHq@4BMGkULb%(4n*&lbLUI zUR~`5AB(FeV$XF5ahB9Sa~y&bxtAyO8^ejzbwWwsauv7wO+q$DGJ3%}Ch+-JctyI2 zbjiR1-G$MjNeqC-PYD$^-@#r~nwNg?FMlLns( zV-^%?f#q4z6n?SB8nn$2y(?~RbHAPq1nckA}AdM`(R=g*R4Fbwyo&4Lr$n_ zyfmSHoR?Z{Pl0E&>A><$wL0+~Fr0z^qrvsOhMX`?K(LMzV|-W{S(Vz#VtKrQwF8Vs zC{4ezRD#O|M5|Y)!fJkVEWIj_%dU8qQ|;G)-u`@cBIa65BJ_F~i`XZB;u%Fof9NpV z=*}6bVZ%|2NxfG>t>fa4JSTe<|IxjMqOvY2&7o$0n}?Z?KzxcqY^u;lj3_|}M;Kg? 
z45*wHN~wIXLn5S3ThACZ$(Y=gyusF0} zK-EqA?3(Im!=OR45`WD^@zy|8CT8S5xbCC-Ua6k%N0%%FsC4U>BRc*U`Jcepxy+pP zh+3Bev0WsjC9_zaZP+@0lh|@q^XhCPEbL7?q2(8S{Qb5j;ZCG(1w2;id4yCHMzlU8 zGb-B?c=}nT&&Bk0?iG_;^`6tpQPf?0W;*N8eQ=m+Luyv@gPOn?6ywCsH^zfY&F0Tt zhw&T3_AU)mjlN<|9o@VAjGqgB%Zu`Q=^g~y4z}^l{9i1X@a}U7VRFR~yoA;_#|n*Q zYA>=X?TU0t4OVs?b1d7Y1pG-Amaid>L7|D6Okm%8As2d0s1Jhk)N)_0jAPHg4&sTSOc7oL|%c3gsoUr&dMK zst-hARw9cDv+4;7CfR7eSIHlZ+Cqdz!$^=?4T9a4K#}-uCk9#?v>!RTL%D|4Rrggq zKkwxYZE!I{bjP&V{1y|bX+LAb^>w!|I@I-+W|gp|MAQN8J(8yJFFi;q(y`HBQdru8eFsc` zVvKugIp?SCA!RPvrLlUF?d~^Ob1KoD%k<+R6XLRTOAdFx(YhuFqjTn`Dh~D#yz!H) zb&0TZTL0O}@VzaV)cXoQou@Ya8}jYb1H%hTO}w8@P~QRX4}ZW#^sk^XX}Ivf z|Hs}}$3@wu`-7;63Wze2f&$W@#3)@#cMjb-14uUlqQpq2bT`9Dj5MN1cg@h!-QE5k z_nh66d-mP)+jl>E-hKZV;;y;txu5I)Uf=7wqR)oh%X~bdjk0MUSl-X78x^jXp^xwZ ziHl=pcgpgIAyjS!TplQA`W2MRZs1?6#@Yr^qD=kaoK~7Btt8{Ei^AAfl1D|US@iKt zvUhqS^g}LIlMV7O`4${3+Yhp0@0VJywPa8qt*qs@5I)hhRYZ^pRaDlhhhMzZ5PqKYB@2L8(;e35}js(4yd-o zJ^iAB@6r1+^TE%|?^E2KR}Buool&)c&z}4NN4yp!ymrm))6(TmmLq^)4xHAooI$VwR>=Ri0?xBhxS{prJ(Rzwr$Db!l`B$)dgU@U_eci2+^C{v^E z*3UwhPOp5nsC(jvjkSAM*#rA-&DwDLbgJk5!1}6~>(i@jAoPd)(5;KPll}v#_{&nY zb?=)3vbm{IaD!v1{jHN_mZgP+)*b4e91`3iph!$rK%U8WpBb9L&ws!v9}f1Oqd?QI zJ)I|f_!4Y`sDW-5J~?~s9qxMh{U-$aCjRX8&(3dWcwY9Oue_eT?9W=wN2PU^ZUQhQ z!yjNsR_`5s$|tQy;Y%td*}l6iIWx)z<7I-CXXd?Uvwp>+X>*|2?_*A z!^*M(78O3QF&{6LP;CHVBt??*)p@&;+o5S6X!dRA6t5pD7iy=?5q}4VRckYas^aq-QEuUYlLSGka_dp!c?+&UjA4!fI zT^_hYkKq%9byiK^fR?zuwPMPHjs#^l>;PCZ93{S{ZvoK9!1;6bS(kTVau;*5tLvzw z7u73Y7ip(bmRi^JEgKjsdZ)mQJ9=p@+zlmXpo=2K>g7uZkAahdHGLNWEmOiN?w#53 zN9PhbeeuRWuux1s9Lu6Z``qjB>3c2rX^YgFS~RGqE||UF*4{5?Ht2HI>IQuyB)9s3 zHFd${Gs8W2VR^{84mk{^UgU?egivvO6qp|XR@lVoVrwAWguU6vQu)%PEVG*MShiSD zaX}X7u}yfvc|K5!+92z^RZrX@Z97)WBdsG@K-JW_m>C0OCfGh;%E-#4IS;)wbQ2`3eXj(c5A-TkW?YKrZ>z= zSC-uT@Czuq4~;vNn_Uu~RmV1~bIV(ZG$-adPQ534ygO((QGT0 zEMH%Z_NA?xO}(N@CrCqmxse>Dlg2!p@&R&0^s&CTbQ(*Cejp0^)U-U-gK;y4S!9AoY1d5GQak#5DNEPnbLCOS|*K6 zGe$B#v#-rUdZl`ZoPExxH;0#A-2x(WE~Ys>jCYKNqjUIIWn#mXAdnTslGm*&0VcjM zTVC~J&=(%fQBG~f%yc7h(5*IhsqV<1e#hH)-fufdN|$(8YsNTOQm zU7MzGhaP?YRP9WmK9ttjca`e?_NScdIk(HciutF)md|d9-chZ@C0^5c!pg36MX*To zw$hck{}riO`4#ES3ib~B+AE2ignPkD(jQs{ zAbOd4t>h^i=;}$a;f?Xz%oJANyyaaw1lpP{dsPF>Vjc2m@=9E#YZd8C(dV{wy70X` z3a$)}eP!{#PJH=mM4jKu{-0#{68;$e9iRQbiJ$)WoX5Y_SPC(B--bl5sMgx3z+&o$ zA%s+d;#LYIrtJxpV)=n|Kca!fB?jklT){-vPyi0xVz2 znr9PNFT)CtRE;N`A?lf(VQwbRK@W$;^#OsErrrru+=Pua znXSNRm!BpWYqt{@Z|IZ8YC5pIynR!fWjI7dLlSj^u?Xq{flZY=TevUIgA3Un#3VI* zJnvIkMg$0nq>^Z>kWU9^_jEw(LU)5&DJ;+-Q+b6&Zor}El#~=Z|f*SwYC>GkROCA+!HN^D;t4LU;qwdLlqoCZJ`*s8D zq(e|U=z2e8wAMF6%olb4knnd1-21jGPcEZ`0&1@rdO+}U8`}M;W!5&9rWzq=Q|BL8 zt`-xg$x)v-*E@%l0j~-B*#CaE{uY0W0#?*>2R8Ju;OtneFRi;s0*sA5#VApYzkxzq zZR9&QWKe}h;;t6gcktE%X_3qTKcn34^tjH7-m`aZrK@-11aDkA_vkh+NJ08$b3Vp* zRBH+1dq39P6VE~vr5UY^b=Q6kB4l__l!g0{XsW@d-%FOFo0wlwiDhad$?L{c=4aln zr(w~Ug8AWh!TFi+y^x>x$}4~FlOAOjj*!~=RJYZv&lD~dL(9b5O~3ABw7pxvt{Lf; z*Q~*3ay22SZ6BF@4k>CXm`X@Dd4r3xa%=j=RMb|S9s$AgmB4|`JFy2aG;RP{mxtgj<|jma_@d-e!6kRJ#8hdW;15O45)>&xKVB~!Y>TvZrfxC*m5R` zJJKA0NWJ{vQR(UblMGb|aV;IgGTl#SNF zCuzQs7FF@^fTT|8vMw=0<;C{(ghYy)2I~3j%7iE=-aFHesIcHqt4YWs;3K7jiOYE6 z_fjGZx5KiFofFO9+uPelrB8=zFW(cdQpd`loBHG=CHG7!@FOB|);Xm&FW?FuZNz0u z+F54!Isg;Zupl$zKSlbe__g_%C|>kS9&-P{ngn6`B|k-Y%ivO9gbE*tb5!W~_lvtY zsT_6*<=_PJf!gOR&sRQ=JlFT%D_MoV9*x?*!}DzW0&^s}q?|^m=Rw$0e+4dBey_72 z)y<37MRAfkMZyd0o>n8wx9`Dp=Aw9L!_}=A@JbMOkty)&(Ptpv!-n{>^pTI*fI#gf zkx+KpU^q@fWCDHruq9=oj0Z8m+VZMVb zsJB%>8<2ZDZQYFuz{_qdETx|>130Y9)&txn{X%Q}>tE_EPrp222sdf(@pA$C;xw2( zNsx7b)Zl|92CZ02Y9noBJ4j4Ag-hAJg|+|X=K=M4+(Gtb+beCMj|?&Ft#K*j^DY-w 
zc~Z$P7qFouNn^et$(N2(X_Z{=ijfNd#LgA=9l>y@JC%m;aI}H1urLrElkk@7rmO+RLe%5e%9#nA!rMa|;{m38;X(ikd98H-Umq^uBy3qz!G+HL8y zI>y7c_*li#B?B`LtIAh0`zABN!gWk(+wY}(f>aON0%M2P-P_+4q^d$P0RWNd(d8p~ zV9+v(TQ^@%(bbjYne=p7Gv&y8JlYO<>+to17d5@mcGmdI@b!-Ra6ww)*k#-=Wj~p> zYE9ECk~=qFqiNjDY+J91yIOp``-k+w88XQo!j5nl)%*`E^Wk;76Yknj7mKe^I7fjC zqT9u3Q`E5lS9JEWw`gQGzohF}ANX#?WAS+${&zRCJp+fBU3kV=5JD8uXkfUlt;&ii zw^qUYl)3Vcoh_QmeB9J*W=@Emb9&w#@TG)V&_~tKX7X$@lG9S{pG2hI_YQN3r?0^Q zl9&P|ji%B#8CLh6mU#oJ0z_8!$1VBG4%@{rQ));;c=VctA_DdU%YvxIO1NA*%v3gu zmcjlYiveBHlPc-)e%+L+LO0EVCQR|Cf?)UuR$1b!(QbbKw{R_?idrBg&UX$WIiGl~ zqs{eSbO@nj!~+#KyX5~xGY}D?qQnx-noS^`cQUdGMG+IG#9y5o8tSy)(yFNDE{34Y zk>eyogK;O=t1w#b7)|6WS}!Qa5q&b4vHU31aY$uB+}m7I3d~cR80l;YsV>on^AgFD z&=y}}-ue*b`B*m!2VrQ%*hVY6 z3;KHEPmJ!y*{B_e@8|^q(6TBaiKeT`3Hh!z;~^2tnR&rlRtvcgP*UAuM8;T5FUq0U5+A(e<3c!@yiP*6T3- z;hrj6A0_71FpZbNX|oO@JUuXsyQx2XL4gW97!1rEc<;C#Ct0;66BjC!cNasXwk!Ap z>++r;KaiJ4NV>9=Mcq+%0}AI?Z7FPs6;=~3d6X8pmDN1vz6=xy2aQmL_%LsaqlsO> z%xghBUvduARA%v9CF_94P1OXLZI$qQqf|?~NW9BQ#TN=x4@E}?P75+;afeek1!T=V z(>l_C$p>X@0$>Er1E_3HHIzd7oDl$QMedsR!yr+;WAI6qgjK zSb``wgj1dr-Y;L6INT{7PhHy0!`H{@yNsPRY3}!N8uD-9Ae<3Sk^S1xTzn|l3%eZh z&-Ib98_Ez|YnBX|rT*%}7wctn=MqPDwHTGEy6GwMIp`?w*2o2{B{0m0y3vQvZD&Gw zg1e!Jy}5Wdevhj2Fu$#vNB0r(0hQ2O!YZq^ohwA=xO+JVS}trBPCg$FpKud3^YDP@ zab_tWni!q0itdnOpI?t3HXG@;P!s-eu*%ZzGh-@x(Ig{8+vwwa>C}Lr`TPmcx^#JT zgBD)Q(os(crwDk7`JV*>ty+ODkb##bL;1+3)OQ|U&YZ8zwf3b=3#U*+A^?qwr4PqS z3?wQ1!AnJ;mQ=&@x4YgO@51-Cj^&LnSI$?OTI{KTd3a* z1ru!jt-7R1acCytV{R4iMwdjfrv5GSb`U}aDl;!W{$j`^)^tS$1I~lWGDpM5bRs%* z2C6J>QRF$e#LzpA)qU68K_{57#toF-<~hqvntejvoTFQDJ9}r%;Q4W&uI$+f0}gf= zj3Z7!wnWq?UzSzQrF0JEU_UNx|MXVRs+Y#%D&QLdY1n^Btw&msYMZK{ z;Z}W?(Zd7KzTP|kCUO?WIpXdo3NRh142DhkBn}QSRu=r9%>`vB#+9^yG1qqvRZB*1 zr%}zckvic!y!!n9qKU3cxw-l~7p0=U7JKFqgVw7?ntOUBDR$+e^{_57uTRa++8UW3 z8)<3l^`Nd0vA1Cplmo+_RxkSlAFMXpasQ;;{GEdY{%ze`{^qfozw=%f{(=4-?*E^z zcmIre*$g<`C>3{?g4~E_%2l7uZdZYpj-FrbICrM6C^?|N%Q;tXA|bF+y$S2jQRr9_ z$d)%FNpT4BOx(2LMgdd);T&qV5_8!0Kks-5Hhr+j{UYs&6P ztSU}j)p&Q^SKyY`vEx-FTYGGowBR*2^~>Z>kXNL18w~r%D)!>H{bfrCx1RFp8+!9q z><$hdDz+W7(PhpYLCUg~goji1BLP9Cah1#^&M z$Z`7Q!7S#0?=rK^8TYAg|7n6(?jY=iP^Xv#?jm?VYh?|31DCf*N#qg8yndBb^NS@) zC@O&9&M@0c;0;}&tjAz1`nN#d{z==wq0?K{sN2%-Q?|~Hwm-|Qrz$=8a5at-S(u1* z+%fM?VlBx7MjuP6m<=0Ed#~D7W@ZanH)4d-du*JE?MM1p(u_2}b9WT^@*v=Q)VJ60 zo%vdtP#%`cjHDI+Eu}%Z?d-Z$_SKlTv*ir+93}Qz_h9T*vlXHA?TeD)+>4JIi$_rs z?>`pfq4MneG`uQXUbG?RL=R}x`6(P>J&4fEeA(jiK=O!-QMv5HNR9D@w$1JTyIpD@r@A~!<{J0v|C zk^OctCnhF+@G`!XE7+`o3}Nw`$nS*-`?}6sY+0fEC9R{ z^dx8e(nsJ=;aEI!+zr0@955YR#$eAJ{4pL2BnVP@&i4aL=?9jw2vyM^Ta}Bj6=T{r z5m13H%Ue2@kNc8hcuu-qpFbsJ3bn@{!nB+3^J)u8KFjK)leIoG01sI0k;IdhXekUP zjt^vaHG#?WUTft?$3ZHF6u!d6d5KX+!4qdOUjgqW*nj+vztQh2RN{`>aS`RLQS)JV zC5ag+t}oYC@vUUC-q*0-E#2KLu)Vt}zE5A~H!{4U{V5<+Aj-<{(kRYDnoCug%Lm`uxu9 z;^a6mN?bch=TKm`Sj&DF{`ezaVz*_fdKQCf`g=*+ zRjx6xo^d^+|AnfBJLpTq90*z!gff4TtnrqwiQfeDHosh)+i73pC8YR5+_IVB1DBS} zGxm8vx$_@`CKXM{8Ob-Ql|36~ThUGZTnI*$+A+`EDo-QQ5@Rb14?3u<{B)bxd*-~T z^WcHrmaj^>aWS-}^Ly(;#B~B#GEyOyE^>!AOmHYJTqHjMKGbNQuh(H>O>mICU}`^; z&)d-!pcVds1uejkiY4xFyhI51DkFB%HHm&4)qaK9Qnhjasp$1AG@@UUi-d;pBEn;U z;-Oo&L1r2kNqKOb(*XcC_ZX;ZS19*1uV;<4Uv{fbeHc^u*dlM;ubA?FM6%S^$xQg0 zT4}tv$S%2*(r!v#?43fafMl`LQ1(`0 zFD8)#6EWSJqKe7CtcMQo=xr0wNCFX8M z)e1nBU+~U@D{(&(%9Pth1PAi!N*;uB0{t`=-Xec7lizsO zQqOQEXqBg+3V`>I7VOvEKcCYo0~0m$YG%k_7QB)++JycvW?bpQJC@nNc{X zkDFNd8*CsfBCr0*p;#E8$+%iq;3Cjn!A<=QN}I(X-%>BI40!YL842=8^wE60pclC) zJhChtG~i>3JSrOW`4~MksNkDg;cY~85pzamaPgu3IXQ(KmW8tiuu$!*mO@lwe1oT7 zA1!dMyIXz4a?bS*_LlbuKc|(F)CNir0X^(Lcq`jsF|tQphCTJcrttu&Bt#PO1B)m( 
z%v6p~?V|ENbNuFmoB9UtvXho^={heaDjmy2l{>a_V;gpJ7?KqX;1j%Dk6J`j zM^eX63l=(Bh`nB29pXuvX`@Z*hMV2NyH7R|iz2zyX@sl$cTLmpy|KfjydP;%VvHI5 zv1`cI{kPqd`%C~ z5i#lazlWl+vW)oNOWDiPo77&~$q`$#x=~clU!kaCGCkuhUioBy+s0dI=G{&0K zdg#o}p1qfR$%fy0{K{$cu69{h)e_aV21TlDS8Eb0L_=R$OJ3cwE|1Tbu42|x%UR3N zt(Zw8LZMui=RB;uis5$tlV`aT-X!OYKgSznM=d`&Uq7c@$+G%WZ3_m+Q znj%uqV~a9-?=v3v3?3sAweQAkar7~3@TQ(`*ElMHMjyT<(R@2e+Pes&164Q6ofJ2A zgZZB|-ai}J0%R^<%1EqTFlE=6P0DNORNak*Pm#o3U8?aM`_er=P~F|qq&nM7oxS%W z$f~X?dDIYGwOjxCu3c$A(CuRPpI*EEvS9podCld!0(gH8*4K)0aBP@Kt_^Ue4C!n{ z447sxN=N{m6^_OqQ{|pKkuRC>z9LUmw0&JMNwf(PYWMQvN@IDdWW|)_Ol0gS8(Qh^ zso5hTxdBfZXr?2u(|9&kV0-7M)9+wyI1{ky{{xo-BaVaN6Ovm4?tVj&%%3$vf}NGz zy1?IdtCejwku`B^Hb!L|Ft>V%C)1gTasHFHYIEzL2E5#Qk9(*xRAcvEF{XW5l7x}*V6*={>P9Ee%8?k4h{15!0qZVbgL~R&4wBMfB;d{-w_X~7lU6))CtN?u@u_ebkO0XeJ+8ou-5U|+*WNj(_VMqrE z4WHW63sMr%Y3}bx1&W`prlM~bG16-LjR+S|DSvB ztXD5e0aGp8TrY8gmi5QJ2HZ_ju%6Vj zdrpT4)SSzi*XMd)=X*r$Ulz%k#~X7A4(1q4+~YnU`xfOl;>DUUayiYy8>i@&t zd&f1^Zh7B9KomtRbd)HdC=sy`=>!W!rAdo`5Cx@0K%_{Ip!6mpAWcw^4gu*kbdcU7 zz4sbw2uXY|?{@Ap=gj-eocEr2=AA$IWV6{MJG)%#+UxgSzct=&zGcB&_uz`2qrq$O zEEhTMOlGUIEsoq8CdC&TTZOa>BU*g*PMNnlT0mdK_JVOF@v5kDZyWl${1XV*{6c0O zF6K&%_WN1m3$TRdOBMkP`?2#o&Ce2zGEsfcw#8eb}7RZ~3t~F(%KRp?6~eVtCWPpeGNPJ^9?O zJ$f;u+XCYDgxF^Od6iVvThd8fRVh-M!Xg{v-l#@|W zAc`_V@%@>$QkXcElyK3nQcm~=Wg5PFV{boE%^`V0xjZg4U1!O=+l=MXO%tF zx~(d^9- zU0t8$6KgPs?3$ONykGh!6T2159)~cs2<<8+6{U4_VKKQ|2ZXIlzu7&rodXsp^DoC% z8~i0x^IAo2_6u=sZB=6NvB$^s&9d9Xh8WhdQ*mR$4V|Kao6*cUM%SvZ&HRAmQvx>Z5)Q46y#>Ex33o_<^GVQ`|T4hqU0J$3S69#A? z^Iu170c0>1;wIcFcaPEWYX)XSgvLZExDDrVg{bR+h+Dgq}u1#)3a# zo7%l`v|bej5)p&fttQsvzL>B$K#y>HUDHx5dQ= zQw?0gQ?Q&Ap-=0)4@hIb+1nzcT<)$VeiZCX%n9@eUqN@x2So!P++94_X?HSz@fy|N zbd@)zahTqX-`q*>0sXhOC2AN2NgW8v0cpVe=BW?5h&yeTp|l zpnZcpI@TW;?b@CRamhi+xrh@#S&c_aI$z-9_ddm3?y{&c5c9Cw^%OAJ#5^3$hWZ-< z)n3v$r6=VQ#Vw4)m04trUrwB7qF<;S^LPdS}-gXU@9hYrYIi5hiT~M}XOimz7PvZ7^ zXh@Iu#K;V#YY5*=xXx2;#^OqzGW1=Zgh46(Ry%EKkS#@(yT1=Gq=Aj z26g!PZ0}%~LLcOv_^uLJKaHJ5 zZ-1r4?awC&JixmR=UN*THnpcN0_WRfE4=GZ&O2)d`MQ+Yk&#MQkGYf6Z`BQ;wq^3I zjY=`$OU3mPT+?&3TSnWW#A1sS^5ah1{6P1M3D_d+g21hk6K3J93;E=y)Ox4jZB&C0 z*Q(`mmROr4ek*oN)dw|jSx48A`z7QW4gu6_cPuxtZh~0GH$HDQtzP9S6Sj1vV3TPP z#ZCQA!fqpL7E`xZ>cM}jF-$TnZxQ52>z&|<_vMk#0vzGQLDX|<=SP07@&MTMQVq{8 zHEeAQ!nzYziRrK3n_zRd$hdKpH}}Yab`|tk6EzS490iyde5fqnM<@x+-=vT5whXNgYghUz<_f&O`J7pPXayV+!FivFD)?v;>ET$9NX=UKrHkIr zIyG_NEuxB@>;8TQ(pWTuJBhQzc(6NJ)$Cr1(mEZsyn^0d3J5^fVG1HV8GNZc`CtsA zjM%TNxYj{1yd$UA-bbm0Yd?BEyNjnqIl8=Pn9Vx3tto{Y;O3IfvTUHSZCwplUIi|qF3+C8S{ za^s6XAfZ$}0gO3>@+~D<%OO=7giIusV)-SKK3O7T zLQw6{<+H$A0pnBR6`uhyu>n$!FCnh-%0%GJiJj#NRE$k9>?O4YHp$ElTrWf${I<;U zq06hVR{qJQXu#nO-jId1*o_4Bb>%$?4lhN$i9(wjh<7i&A1fnZXL|P{F+id8+fxb@ zCZFXPx&s79D0NWXfth+xU5am_RNsnvTvc>wJz;r|ZD?71aV82v!i*uRB+FrHom)vE zck6E%K>_EmwYb^TD^=<+Nu9$Y)*?ICXL-d;;~=z0rgPK1VB<@TM#ZTe9vYPgXf=$;b9 z>WVE&$@FouLXlWQQ(RjGWBifOY6Y%kFcO$zuAn-tDx0TyPIe)o>rG%J-0fJya!nWR z)$ICF-^ehuA+l{h1@D@$ug5bFU=N)dfo98OH~fIa`^~{()>gPg4` zO;`q|A({ePzei;R2ZjH$Dv@~uP8$5v+F+vAZ~mn-R~LVGg1VtGjA`f@T`k;5IS;}1 zT7Ack$&J>=3NARD)$HdDl6S(-1bc5XFGoBX%{6UT!f%4uNmsk}Dz`(}J36K*u4Wl% z=3T=5?c)F_J|cC7Em+d~iQ^I3h7o>zXa3!$4&B>)ZRX3rjx?6Hd^zFl023CnGAt-9;`rr7FB_8nK$li~EQ`$xa4;S~6S z(J{D23G?<7*&7sI78YKxcGVr^f;5xc2XoY#qHLoo`Rpmod<_6>@g>yOxHXl|HNCn{ z;_JPa%g(e+xF_qRIMkIC+C}Uu#9@;)mn5@?mmTo&+t_S$Y-iq$B;G8_pyrVe^ZKm6 ze9fQ9pr?yh90zQ3f&k_!NKPbHt&KPaCEK2S6v?cb>6UG-#MX9e83l1cmhWr6{}6-0 z<-{{jt4@3=?3Z#*AB-D|^jpp?qOg)IHMyrp(tT zKO|^Pen%qj*FnSGT?Oy{QZovhK@DhBA9u*DR_~Xux_5Ppb+h;RM~Ut_YE-A?VB<2d zc76zYTO&L-h!XV|rtK`&aW_zrT6>vb=5nrcf9z+S0!n}69&%h6!n${6k8uHp9z(z3 
z%LF6YTpqTbM0rN>OFl-t`Z##k#fm}$+|F2u$6j1=fpfbL;mh%Dh}?AVwG5LXd}j|$ zT$S+~I=)MkIVR^8Tb`dm%BLw94)B?O?sl!*a+=9J7nC zfZx`h>!s_APhHFK%b2Ard(sBPTN8o{T`d`$5;Uj?#8;%Zx37%gJlD&}>D^rS7?XUA z#t+DT>Y?C+s#a+fZ0k_0R5>mSOkm+jW~~+{Z^w+wQegH`ugyI-Sr&dhu!7W<){`Gr zl6SQ;f1B2}BbQNYJR1hYNUZrcLC<$H3ROIb_(_2}W@dyJ`JP>}_+g_LS7D@8V(8j= zhji@<+@49qwzp6)%>Ej9G-lnsPf(>EY?IQO7|gu2=iI!JC;)vj98C5(PrK$SuL2vX zpsCun@`+NenkzgS2XscwCCu2niE(vR;+>o#OnWeBdylnTV<#W~;9&9~uMabRc0(w@-I~5F$8QS>~*xS zYP#qKwhJ|YsC%uxHr$F0E8J;rTiTZC-Ng!D72gqoG?ARQF~l1-gzJ=Liz!V(mi22r z_!c+`TugKwYgXj5GG~kYTi4gXwE4(NqBBsjl_);cgTpQ|U6!AslN&Crwb0T+Qq(5tk2>i({N@HQ?a;kYWyiO{x;si}7?YkQ^GamS zMis3;m6lH5OxZBdA=oH!1$*RkTMWLeIo)=x7nuL25EMMVbZ1$9F=jAAMwBcYR)mpm z0GWQS(>7D#{+>1^KE@HS7yaoQQOi=LktaGj7&6IFLX5!?ygF6$A`6cn?@ zyN)^8!LX`Oqv}<#O*-Y-X!ADB$(fO4z_R{Xv0?2MEPBl|P0#dt&iP6$Xd@Cll^+JP zM_I(wNSPLRz~04eo92Y%hRPdhURh-C9J+J*&mS>zETOX9C7ud_WJ54u*-@ zhE~M|H;JB0idANJuuRxH#7{4F49=E>ks{zTJv0j-fSqQC9^3p7Ot>3I;=*s~tfg1w zC?wD0xr5iB9mw^DJw*LkV)|!3+iL<;uY5`%S)@ZqqsIQY=d5{LhZ;lS_v8q(<4%l) zO1?Mi(I3_>PkB!HeWE$EMQ$NK^9SjAuB$$kKk(mS{$D>8JZje3g zdZ!Ub>@J{c*iL^;!8Ib&Qk!1O9zuoxfUIfYcKLQttI6xw9zSv+nAia@>>Ct39=L*1 z5scu=-#kti${&>ejvRt}V)NFv3b3qG!f8^qD0I|D==zc39}s8YI)|Md{P+G8&s+o~ zJP7R1OBg2t)>>S&K9=- zdIKER-HX_!0S`_dYgo#gGgQM0u7A>6Jsa$_DP{;YjZ6#lCOiMik(J9r*{Hn+XI&4e z6xu?j8?mzj+YN+ki=>{bVi2Fk37iAYa#;n<#M4o09m$1@u^OP&h*s+{GE-6MmW5u? z>>C=ojTl7Ds`dJ#h$L^y7S)!9aul3o1X-9TVXfSf)29Z%%%5n!B8Sc@@Q75 zv;@tlXGdkznA;Gs7&YjcDwA@)>Kl3c$ysVX`JGu*V&6u!kM#z21G|8t9{iN91g1K7 z1c*N%pd>aCuztATSik(`sjIadTkw#7mGN3+{!SfwNY?F~6yPj^HdUx%zv2wVj!#a? zcyq5af#ls8K=+!|l(Sx3dGr=K2Db1-mB0czMEspKJv(f2Fk$xZe~*4N_taMHFNwxWActHyq%wC^gKS7A1)+2`ZH3YbYQy*CB^tL*+)0!-Tl0 z6@aGhNZ(T?0`!45(0kAS=kp}VOdaIrRUs775XO+hcBxagPxh^+Ef%e~tz1M<3r_^XYd@*ESh=ES*+Pc;@`v+AWXkL8-=vm#BIs%=Kf84o@B@T|`@EB5fSECzUi z1hRWBBwB^(C=%_ zrIZ519vj%D61Bgbd8ZaL76n;v71luCd;j)`SGk4)K{En*tXlJhwE@D0IRwUfO3(tK z9yKtof8>A&&6c3d5?6RxY0U^R~x@j9Wn0Av}oUN zTP4%W`m#_LUIM$!+A^ej(JbsFzVm)vy)$?HS7*+SCPRzrf@OGeKL0DG$43?J-|y=^ zUynI{Nlkrd8*{|rqr49}A=Ak%Bqe{~or2S@WxIM*>)m?ms0G_q4s*5y^Mb4BSwcSo zK6I>I3Sl&#FV3>C9t*8wk5JnmvX1T$Y0kCYzw=1hrJj#L&`Kj*a{!)_Naxgr=cM@^ zcl_V-dIJQE8S;HgfzB+xb}j1XibuU7TJBHYpQv-S!Y~f$Nhx>utqZf6+IN`c-c4`2 z@QFus+&T9trrfa(%&)w*l1qypG&ZnqoU|#=ths9a+8`N}IcY{&C~LKIhcr^?EB*&UM-lV)x9sgv=Mx!&p zuRi~h!}pn5_1VAx$B4snvPYO67D3D4$a0tKf+zF2HV{&E4dICqkrLuS7NW~6$tW(Ypu(iXESMP>>g z=B!-Q#%6S#S@&x|2-&6IIhOOif0)z#T%P6>@`Tk({#%UbFJFtyjOc^mt=h3(k;!s! 
z6?wB>_ZxrrC0Mx&Mm8Z?gJ@s;)=qbfV8Be|M4+XsWb?WnccIBzo#XYK%!jDUS5Ho) zo@PQnKrg;N!_oSL-_3cSbLUAndrnU$Rc{wbw4LSG#FkdxcNH4L@r?J(^E&7p= zMfhnOKi9wj?d%NG%k*aFE9-7Of;>|1lso;Gj4r=Dv>g>bnXw8 zA5e~*&`?MxvTK_3yf@0he#%TXv)*J~SMajp&nr9Py}#nkT0uH|G0Y~0M&DN8OR&#` z6f>@^KzhI5`K;HIfB=f$_uujJcJm9+RsAWaOkTeKScVTmhZ}N#W3h*CUW*q*zv<1B zbd0+5;mKB36RSn>gz*P@v61tH{oE?Bd&(u3hY}_~vC?nya(l_0+-Dx8c)#XI@5q)y zLtT*~iCKxjSJCJf^c>dtenrg{J>4pRjM0}bp5(3Y%&l;=hn<`$E0g1X<~pPL!TI(| zn}^?xuJB@Jm!D5w4Y(Y3y5nfy-c0)36b5muk|4A0EL`vW%yY##SW=*#v|@~{*n^q% zHCSo4WqbcyXT}L^{DUXez5aUhMq08?NG(MhzVl5DNw@cUxWaR7=at^VcoMb49J{oQ z=fDSq{CBZsV1WB4u;72~x|4m_XXhRP|4UJ8NA*X@RAtL|>%Em|wTmz5X!MI5H!r97 zED0YLWIo)cf+JrOLFP^9E+!HbLg0XeRMm#C|4{bu~L^@j*{M%+qwp3L_- zsW`3X$t=wvYm;`HZ^lE@LSIKq`;wpi0#hmO6({seF~>5kj^Df!SQ4yD^~r0MwlMT{ zZ#&A|g$q>mM&u_4Rh(M5zo4t!_U-7S=jeHmGWAGsF_&Y6AAvLsn?tDSDIncT($B_j;O@O1?|#q}6~`y#c(} zq!-Q|xUMqjTNTetOE7*{a<0%i`Anm44vBGQ5vkjIuFvkE)yfeAjdy6lSJHZ8eet&K zr_2h?uc|u)PxVsnx*R0d*`&)vC+3;c9Jm%|PF0o?O=sp=WA80}=x~lR^L$ntW77 zkZK9Lg*%{mMNBYJ@Fh9mZ*Wv^tCVG#ud8lqpV#|qDYA%xcxHTiKghRT z9wi#xE50PJ#zpmz+1Jm681IXFmV8DNCQbTQ!v!~r7#p-fV|wi7Hss0cY9d~<+w16ERMn0jC|(Fat{EodY|lhl_^Xh z$^OV6kQLE&_;QzPjR7A4MwLtd0omc+<$_QJ-A)5B1n&xx@N@D-CCf5@_bX%@6a3Jn;AwRYzCHggUZ9%|C^TcWMls zsISvi`NYC6Es~&Gl>&D;118Jn5-$8l6j_0lL0h*7^n~p^d&*<-B1w?QnyxJBBhvVt z0E)5~%5z!{xXQWE?_ta>l3ZYAk@AWS)Ym8%-3lQ-D|2!^xFI4JLUmlYap8O(S3th5 zgJC0tAXzJ%rdOWpy=oem0Sv0>O?jf47N{`k+oe|HCDgVliL4~X7cXQ9K@qPZ~G zh4zLy^a(x6C$diFpzFsO7c5XI^T{uWp4ZFcTP^1x#_jtn4rUGN5RwuyWFHr~c-TDr zsqceRMc3IksM29|^OcPtxVgBn{mkFFb>dVzv--}}KLKh(VA+Y3*jq|EiuHG>c7sSi zGF#7C+}hBKrWf}R8}MaKkiI>0+DDX$N>dS0D(Z^|2^!^H915l<5ogtp{6wMeqIx(b5|$x-HtjeZZ8l+KI@F!Lp#yRhR12>Xq{^-EnJtf&@#*=&a5Mso3bFYX`XZXOi(BSfAl>0#NviBtA#?4l{ z7-z1Dd@CSg52Vq3bea6<^$KB9h6ZX96fcDk{H#;^a-XBqu5iQX+WlB7ogV(Mj0x`d z%(!LLhk>11A*8|c9n~Utzwwo<#F7`Q-{P6|iOmG{$%&vcQeCZiz}R8hk6{x%k2+;f zF?-^sL@lUmz?mW*&??6kmBW_3LzNHPT z1X#ae*kYw{W2&S!ZM#M8)oH{u;hJ#!f%=ol!=!P)R3}304V3JF+q-oY43*k1@-Syk*jZ1U*TCXl9B% zPDClZTdAJEm2L$3*~-IYNK1p4Bn~1Z{~g<$bHY#}yYL?Yg+zhy&w809g-ry4f|x8E!ogx7gqk4 zcdmiYb-f#KEau0h-nG6PsegrdOx6e2#$~+ip6^n4({YxNoy94jI*ksr6>fjf?K*u{ zh-{F*+nPo!07Oucv+v-=*>lA@`iT||sqswhtcw;CLT(#6)~{bo`|xbFgKREH{w#s7 zaFziGZ&}s2L|lK&EXtqKM72Pepmxu5y`|OF_;B@-En<{q@w@AQk(#12+A-1h7MvezlX0Tt-Gr|42R5=dZh&jpNl{oZd zMf>3q(a*q=WLK0D@1n}9^R$1JmAA12oh8=rRA zQv$;!PdxL-d`Fk%n`VeIb7`5Ohr#fV{pfp_S9VhTJgQ<8^TA8=Y_qyn5+C473aPCHAC| zQXzDyV}ID6Ci9OoLU+>NloK)Z!=nTDVnRQL&G)D=z|iB3%ORTA?fvFIs|xKq z(tDN#f^3H7=Z}9S-cN6l$e^R}mF%dB-_=hJ@d@q~p)C2^pZ+fC1x)d9rdJ zHd0nxl@f!Wdg=yJnpKD`@jQw);j43k z>uC~DPhoNq-m`U(?^Y+WXH_;mkf)ZBs0AsG*UP+75UQ-q z83#uE?ShXp=bK_@Q+Kd&A0GQ`0#s~In|SDM2ly-Wm=La}GI&||E= zRKf&!A~7JWR{aJlLJ6z2Y;0edg>)feTZOk!A4nrunS7_6(3sMgC{o0BPw%;2Dtr-X zQ?YID9lFfux65{{M9ZL9gr`Sw23HP|z0#h9o+#0#Xw_7m? 
z3{#m@sDnpU)M1ZiavkPj97O6uQr+5)V#GHZ&z}a6r{bC8MLuYcRG*Ld4C(^pqx_ED z#6mPuXvhDpozzJSA87wn0CQz@_K-p3>)AdI z##IFyIheQDCpX^gdvyvG0sS+mKyu$#gktCzX$5L+yVfH&M%-&+g}grM2;4g zId7s>NprpANf&$nrzA`%stdb}dzL$YJw6NL09I2wx6`ds<@ZfaiO{TjKF9+Xa|&h@ zHLJBymrQoOMJ0g>0SGk{GxmXv@NG>olmet6^yU@|?{3?R?dpXZyl!waWDh@ewiLHXsIn z@CGvkCoH=v@7)^t_A*soztAPALMQ$Z8;-EWytfD*VBJ6jK@Z=9Rt=|jV~F?-S~Ebz zsexFCYr~}m6VE-3c|qp5)EpN_ih@w?bj5r#4H&l8`csqJ`_cjobETC3wf>@p z_<_;vuU^FAPvGW7z{s918mMP()p6^3n@|!71%8xv&5wOpEU26Jz(lUEesK2CR{8p$fQy$&Sj@Wj6t8CvIKL9Jd9YEyoabR=YfK z42c9XZnxmGtaGERxG9I;ACScb4COFBrh}o(tqrxVnP){JfdSzrc3EiB`wQf%cJ+yer>4ms@+hbIm07+QGiB=w^4CPATkRju?_7=XN#J=B&$ES} z?^6tgxj(FXXKY7j$6J1VWCSkvNIVnPQ}CwxZ2lMxpcH1K|2HR_>EVOsp?4ivV{I<= zTAF{(I@h9?c~e%$rp`K4{GFkyQ_f54Gm)>P7Y;qWswDc7xBUFg#@QD=r6!hKT9ZXs zIltbc`%j^-4>TQR_1Mr$a0k{34+jna<+ix_4@D>SVkzz6-p{hERSOZ_ayGm$>!_Jp zyym3+3qt~9jAcu#F7?KX1!kw2vxc*9;}z|2ZR4q^jD&L)H_MS1u1#El$4sqA&<%|% zuIjsfjOE#SBn{)LhdfmZnkeTzWLixdBlt}I5Z1FS$PTDV7{Gtx zAOCp(`Ky){6N>-BWFYgIPOwyM(5S10)VVxsN&?|7O+*ROGxb ziQ)I}D~GDAwI)7Oxe}t!R1j$LtZGu0$vn|KmfodJ`05L{Me$_q=|jrbxix^2ZARm# z3u}rQitYCYM@aL)PtoSD{*mZ~MQeV%EvCIhj+P#X37iYjjeRThc;Vwdg zqB5HzM|z1tg0FGt;USTzV^aj(A-e=y~@?{(o| zWh)&))AH;f_fbbc;?X9>nID(H!)=`TU!|t4kKDb(Igys7la2lwE6I8Hw$OoyX=x_7 z2)AQfA8W1|#8LBDo!teq&AQ@gSVMV`wRFkCg5UmTq0Q_>hJ@dX9Nkqrir^tfi0_q$Pg@Rqdt%*!c;UxQ zZkxQ=N-DN7bw9cW%j$M9!5pVMj1b%~OtfR=ZJKk_@QeB`VKYqYVbUdG;&ZO3P{E#) zx6~Q^=H1yBLvNW}BKH@&?t63Rz`jx#^N9t1`6P`1#@!YJ`5oV8De=t*kDR!4r)-Ds zeDS-w`c5|M%USJ{QM)C15A}o02r;TN7HJz(D?FS+5XyiONB*rn5@51fSSMutq*rI-LKwhKJQj5Bw2o^W&fkEG!YpE ztupqT@>H#j6|6%$i=VP{yU(Cy$O{G2(OqP7VTAfMiSUVvKW*1|u3WcU zslYv4Q|&-9%Zte?57Vl3=8x)T+j79aN~oRB42}bX;&-5#9%V6MQ8T5Lu#SLh251NP z9Xg>2j68NP_IblzY{vUWEH9%yvNi#GfW9KnxRrQ>A8^R9OFgB*B9wu)?b&={-UG2Ab3fSwPU{T9)A8rnCRaiB62# zewEZ0eRE$NS{-Lfox0->+qBw_2*rN(xa!S$uB_hK_bi7|)JmnKrrfO;^fMphS6F)X zC!UgX?d3EOF#oVf;)@#_Dxftu+mfBx9^Bgaq$#Neh zWAlGQhW~H*%rl$--Vkgx4?PTjWX^Wy$Z$dyIyG%W_TA+3HrRuoc1POuf5J(54~|8RKQ4YJ1$hoj$&1O84Q;jQO;R`h!qN-i z&e`8QViH2CgH+B#LaGmkK1BJ#BQL(?&`V8qxtAiHA1x*9Aq}g~7t&prmks%rrg$(v z(-PSHntZS+L}VJ)2a0{;O!U-!Hf6#(&%@TayuxR6mG<>W3WLMTnt~nebpG^KC+)~2 zxu-{iuDEMF_+EoPs)*m0=i%P-su?fLdGXACVF}8mr|kn zU50*L3ilgXr`}ySf}R&XI#B=Y>}0-!{HJF*)2|xf&DsKkPd;RsmPTshnJxFi%Ol3O zllgNiHohZ$cCxywBi`vY)p1*~ihMD)pDD6a4~WoMD0M8W3FvJalY*t(v%8?fErrDB z&kvDtta9CPFfhT5Z24!y*nh=3x+jJy4+mkjyDNeeK1dBixyaH5`Gpn&(>=w2Rz8ar z^qXS;=JzS5VDpDsFZCyw2wpSbOdF&tc}qWb!XS*XP(~77xvrCybSf|Ku-{|~d_m?E zfwg5Pr*_r@-4wh(-!?k+OsRr~2NPPIU12{htDudxOs1FGLfICpVOe`kRK^ZFn&sW` z%jmAx`)IH{iQ@XfU6+f(w51yDSnlh@p3$0df656r^fPmx9jHsuX`{Dk*s%(}ySbRI z_{Fg0D!Se7{F|35ERSdnJh>t|s!d0`SEm#^Z)DZ854o44o6bbZ*+B>f-X`LgvsA@E z_lSZoH|2Z5C9G!F+i3n@aX|m4yyuzuhgr#{Uz_e^Wu-OmlexocJ8i5K)_$ohK<-j! 
zo$0{H9WKGwm0$WM4%vCK24po=GHxQv0_;uF`z!VwzHgn@Dw)j?u*$8!M&>PiJADHy zVVx_gf%FPZDZFPdLK7=zjxjXHZ7gm6_Cgc!FRn2y2uT$l zDSWJXit`AbgZCBmu$RD*ayeIL(4?wlQS~S9kg>ntXpx9-$X~!5utNuO*RvHCwB7IQ z@jw7wxMqAK%%cC#ycijkdFr%@ zdLeXK+sh?Waz#TAmZN7)8><3MYzfxrLVmSz*bl*17^VgH_)yaR2* zrC{9)Qcv|JFt0#oQO$8nUkBX;%oM26Kw9fjsQgwqJaBk+3J7m91FA*18!Ac3L#X)V zUzjHvkOiOv)vxJRUtEF-etqY1dLC>Ij5gx?Hezl!3{;#V;!~SX>b`%xcD-;N>{+sl zof_`wmKtJq0!3jgvn+o(iH;yplNv&2n}4!1A!-bHWIDsRW6wD3hCqei4QMWWAStbG`{45}1d^{c*X9I^` zzqtJXsTFUwHY{J)Erp+Ft_=40$Vm|OI#X8O-G#KAa#mxQZrHZiArs6IxmLJYf8+`~ z`Pxel;1Y2HI(-LJV-JtvNc0LoWEtQ2`|X&2+m01Ft?;)+S*SN?em1&MmqKl=_yPGS zgepR;wT>JDaE~S?4Enc4h#M4RJc?;&OW3=ct*)mGwARw^6n5G2d(a*D6z#sE z?DR>Du7g&;`U3wZmXm-B-tTMiqH{>F?9mwXu|Cw-ZrsGmrobY@0~g0U5VsvbXbUDV zXm(wMi)#-X$g=QU0bP-V0Vy$9II-V9Yt*bD;?8sfU&=1CtTk$%L;7|<(KBsGze;35 zTmHG5#!%F*YAsc)FJE8XIXbm}K;H1q4t%z&v8y@{9>!ftXD zKHZJ_c73@aQ(bXH{vgMYs(DF?_UG9{v?n|0j~^6gNoNVH&O*oMtWneXKU@#OFm7S! z9C&7aK-3@p^dyt^azfQFMZ1jJv%yKqV>pii>7|T92~(*fpDRMwkW1rZ zgY){9wkeH=hCrJ*!?}TI*Fj@=E_%FNUU5e{7r*H_rrLjE+>?CCGSBo5ogM3_KD7~b zF!S--OTm^!>Q3M?TW?M2zxVV)kmh+;CS&9vaFTNuLD0I$#;J4)^&QaCd{r(EC$K_n z4cPvK(762>p`p6D2mvLd)N03jgOx(-0t$NH@`U|>Je&lE1A9h)htp(PRYTW>=0~@x z5fnP)9a&CY9N7`qh2zHo)@c1MKj`lFv*$r(Ny9%`4Q!Vz@V^z%WB=p|M7znp4FjlQ zFZNArJkZO6g)nlsufuQboopv$R$#l66mP`T6%0sNfotv)_+wnI0@*>8N{naD`aR!t z{&V`!x9OYhF^|;fxB>?TbehEwtXARDYoNFfpPNUKs}!yA}=_2D;-9;zNitds0Z_KsDrrD5^il1iNQjN z1=Z7x>5p8zD#Qt>Wyi@l7TxmLsV1!;w`G^&G@=*CnZ0B`&i1Vx-p-?9O4~ucL156p zSL%f`Fi68Zqo$KM%%5m?w^<)&BWdMz!A1ANmrQ9T%~IJ8qQcbCX*p{nse-Y6;DRC@ z8egq(PkkT~5l)x=Hb@~mo;jR8Z14r|xi4G>@lLY6q|k9@_Y$V*sPg&`iJ?QtZYmcg zQtFVM*`fBt+IvS{0GAXe=@Xhpb-5D-4PMw~Ek$+jd>%O*)qgmjVKZ-*6G z5B-X%E=Sxbi8^(T-c}=k*$9ky!?hMHncRn2m(}nIE;`*pc^Dswwpi0oXFQCxXyZAt z;_=KFN-Sn7-fVR1J90B@?;57Dig_=LJV0&3t*eQwl7NHI9ZDo_>XA342|~H@D~=Nh zCFPTVbHF1sYM~AzhWT?urUq?cw+wMHh<`)D_GAO_Cm7&33xeU$the7W_R_H}@iX zqIxniH7IxDnQMfJf&d}gMr`#j?oMOp&{W0*@CKA%ioFxn(G+-A#r+Ar!8Vyw9#8@i zQ)I_3A+_*MR7FHQ^L1jVH7`+{qOb!t2CDL7 zYNurMgj+Ude#VNzh|9zIJm({76GO-Vc#DssUdNgG(}Pwvkr@T;yN*DJz8TN_A;)i7 z3gR!|XS-?X<$&8f^YBe`fxN+othlPH9AyfIi&ZD&F@yyOC7Tj1GU!lo#_UM>?a#PR zr!E?)zQW`J*m5Z@?VfOL;Hg7XoSqnYr#o%2KPQh&T)!WWUtW_UWS=WTB>XJ5wvyH8 zos+Lrx0K6b+f@gxw2Gp~MvA#@_areAlq0dX=FT>6P=cem|)`o`GHlbxoi3@HGbg@7My2Nw6mg>X*;p zMQt^P2;W~Yr5$VkB^34i8o#hZP@?fFI2Iiszy$wo@fCs@_e)5<|62gVJ;#hkm$3s* z3ecj0h(9eu$TGi0ONC!!ECmReUY2Vp%Dx0L*!Yh{+O@r3qb2m07@NMsK;@33St4Lj3J(RJ>)!@JtIhB_FUw@6`ooDL&k3KgZ*Vvpz1p4I5X386=*Tn<4ScM3xueW*WpikpRo zV(8XF6)GXHg}R>Yp_<8wffkW;l&=+h3sKBLmn?)V%kmGD8JB{-Z#w~Q-Q%qfM~2?r zyneNR85MvgL!VFOM-Jhju!nQYiL{H@i}L8VJaN+F4v0yGviEjl>(W zq|-?(#K%v0;>NkTO(V<`?+3WjyNItHsw!X}i{-U1&>y+2AnBUEDS6Y~Vsq~EjXZi< zs3+tZ$jQ>IwSy*W{|*&{U6Q0S0wa=hTNBtlhSpz(FDU?EPcB&P4DNQ&JCUx9VaTq{ z!2d92ajxC{2ZXRAy!RPAVqw11Qss8RLr(B6a=z8g$gL8(F5FJt#5CblO!tXndJ!G? 
zsrF@Lyq*EEZN`yz$mRVk!JJ}>DH^?nq9J3t-c9Nizpv-mz1Jq)3c*}L@MMPL8$=1G zE02d5Kv9a+?hYTBv*teguZUx#Q)uy2x<*4{Y=WX#ZR$_{1oIGgh1w z9l6f17KOAGIxN^Ic~F0LqJ>#RPS`Mj2`>b=u0TBWDkHf0JZ?I~4`-*{4G9 zj9pr^xIoPRbiB;*7lg(k?DJ|YlC#ydbWg&;nNh|)pnB~^JqCI^I=*-y zh>bu2DW?HVAs8o^Fo$u{l>B zJvrLUq+lM7zLiqB7d__jl1z8}7+5bzcQ~`lf$H=f7C>w?b&DLOt$>b#Wzr)90B&(X zIT@T>+m2Jm!Zo_F$huojI4Kux5y0zI7FjF= zDLUpply%;f1%R>eEgjt$ty(4bsUQqPUc3a3!AeZTJ}39}!K-2a8>2>@2g0#TEVx75 z9t!Ab_)<>VJ~CX*=Nz>BGXn`;~j9?_oq3IVb5rxAt#ahDI@5ZcZIo8pKv=08D>=-7w9 zwRKV&{s-0uxBe0+*4T6aI4T5aXR<$uV8)yRge!q&%ZR8PjwCi@0~JJmBS5D0Ba?5$Zsfd>I?jTKO*)&;5UhiEVKtep>c7GSErD@{DqZt8|lgaUM6qPu`5K_u6DP- zyIaa?OYf+o4K(W*PCuOhNDjF_pKoh+uB4ysf7bY{RegOxX+hyp-IQ*PTlJeI>4f1e zvahex=&A~}m2RlWtS;=Z>clL_!f&?TQfUsWe$w^urMML-nBLl*a#heSfYJ>|!)&~^ zI zY@v@U+4ZS&g)F4VBz$4VL_o?epZg~2xs=`)<^X>Y4NOvA2!QK!-_izugad1aauF@z6yN}u;7oWqY|c#^*xq2EzEBehM8l%>F~?MH|N zLeu@w+c4o&3Xf%(aFFNA_qirxA|ef9rq~&oM@H{ds$+Ozv(8@xxVRq`+8hjgu0n8a zrP5~4yyu2o?|O(4#mA-fD}6`pBrn!qIS(FiO6(;Il~?32H`r07Ph|d-=O1HiHyLPL zcTgG+tqr_iova`>y%nR$ConX5?qRDo8DLIOVD*$)aYFQZ9R6gED|96-i2e)(3~c-G zQ!`gtkg%?^J^+_-Z&=f}h9ft1mUHe?kB9Sj&oAnBIUB|Wn9+7zb=z$Vhip>g)CKOR z9o!JJ#`_wn(cQnR(&DT$)*c>Sg~`YmSkLINq`(*6MhtLVN{Wi?te?1n^5%&M9+uY3 zyS`L@KVD49PbW;jJN+?d_qa~jFjwPDaEo`*r;PA{YOK|)y7n!)D>?zDud)fzdl_=u z_b)OCt`l~I z`?P#iTHZ+UW=^(zoA52E$tck|-C~5K3;cm>UOD)Q6ri20tP`Z(P@o!&5 zdNM%|320w#F?d%5nJ-29ci2~9!9>bKA^9o~*4U?rdcv9YtYo?Ml=!28BICV;>m(GJ zf`6!&1R(PMnYgt`yq-PeonFD|8!lNOw;-+JfG&CF58nnh8nHfD=@;#z7%$w$`Qj>* zNiA%w2ldn{jc>Q2MjZ8KQyvf9^DYxyAW#`RFpe~wzd27Uey+>Q>LurSk8_N(wPMV1 zfl4u6Y^lMy3C_^D(0XMDmS+qwon&F=wMC6N6u3pt5fj&VX~B50m+P=^0SAU7`ZLw_CJ%#w>|YQD}7 zZKWyUPd%pB5d}V(5;o~`Hn-JT_-ze)ly0#OZS`a28WBq&N2X`6(u#n|QU2y{m;0}m z{U0BntHg?)|B}OY_f7{!28Xd!i>8Z^EzguR8JTZ+1eTo0D_C)-``#Xf{o1x&C~uRD z5Ln_KjMuJgx2!gG8QC%rHdz7$sn}P#$fU(4PbSIzRgY6ic2(la| zmhVdeZHEna=l+Ku>pyTS8gf=I$?Z_TQ-NhCbW2b{v+A8_Jh#X-sgc3t!B>Goc4+xZIaowd2>v=2&@z z%Xw&1F`OD4Pk#)RH0y35m0UNJBhefIX-fUkmOQ^70DQJ|0!5;5_R#If%Rlf8gAnL58Wx`g72zJb&_rZ7Gf!PkP7&-ZrhE}?h$f+>qJ2{|2(l-aR zm8*RKa4b?rbP=(7mZ%klSr%+GIeo=l4r_GWQ#(W?n`xDgdP+QvK zgGE8Vuv<#Pez08_eB!-u=@u7Tc~@|zuZH`zZ^nj!D?Q#<`Eo0^^Zm{C^Yj*P`Zl@) zm*P!pEP2(bd>6C(qw0>r&eUjnGZofhAe#&TPp(&K+;~L6SU`pPF%;|W_+W^Gb74P1 z%-~#$F(NI$?`2!Qp-Kaodq+Pns<=w>PEAeOEiF^md!rv(CIITh;O)Qjv;DXJt{2qW zeR8Y~*|yOtRTx-{346q&EA{%;u>stOMBCyhC&Ja$TH4#o7Fw6aCOc~xWMh}FU~0g; zg=C(0pEk-td-I>1j#2N^7hgPgZ)#PMb2{QWDtP3-UxdBqXcXB^rLKRpBI$A*SbJ6w zEdePxx{DA?_G?x%>OVmYWiOK_=tbKG zeNjizz0%MO#jJ6u&;CKge^UJk=r z2n7bH5Sx+5^N1f)`!cUiwvnB0miH-$20{yRkZ{>sr_p{44{Z*j$Cf)YN!myUejxqf zU}^E2_6$>_qfI4znQuszNigfLTN{}B7x1T}+w{@d1mAG95Vnlrxi*|06_fd?v2N+M zUm3|^!W|0o!Vy~rDZ9_r<#vVjG|Zu%?$6$}(r5ty1Ka`|oXn1XZFtjNg7@pOT#xL~ zsR4nvIb}@PP(mlY*QkBL=sE`5%0*~=REp%f9 z^nL#cI}W*4)X}xmWTClQf>pLE#7);3h^;uV{cXL^;3ffFN5p)52a{X7FJUSLP=_nP zdFJz#$WJu1&YR==7~EIPxcxYTain_x`iJiA?e!_}3EpqmaAjrg{ST0YX5 znuMuSAM_?I;12%v9=CUJ@UQd6UWyI-HHFM;Jpt>M*HeiDst@9Yl_*4L3G-@*PyKOB|Fh093#{sY96@%Q#X04k(=0-VkL4PtWrj}Q~X z+ML7i)Wxz<`nk)L#5nYm^(RL4ru{rI4kPF!G+@0zaib;Wb^fAYnSlQyHWfLt(;Q_= zs2CQ4ySGnRefNhyu>gE3fPkA%g7eBD!p_@~3p%~Oh^Alv$h^~Ed(x(2zS&0}?G~*t zXPhwxu|$5x{fNlqzxK8{`86^Kh|HsS%~iRTg|XXDs+w%w1Wr_$e2m|A&6(Wk=zapU z=*nY+3&-1bPScy?odT2kZ}rR>Yv))5$U)?Z--JVm0yd6OZyo}P1Mz+rtbbg{$yViR zJ%?Fh*@qWL+lPBeiCQv1y0Jhn;wvxyJNry$<;*TP%0bkP(mq(~41f;{0BwPLfU}wa z_xT`U^$T>0zbnEVX-iPvUn?`hzLbb6dD$&>3Z&l&ZLz(L)5_5uk1-bXg>Su9i|8EY zEDrk0_PoI?mG%*y)h&|hkg$sE*H~l0FEuhBG#=m18+YWY@SN+%#zB4M7e^j3KzCuM zasWSd^F)F2gsq6d+5B+j7pB;UB8W%wS zIDgRZO#PE;_>~n$n>bmQeoqUWWajtxvWQdMQTp@Gr{j9?SQkU9rXEV0PW{daKDg>( 
zTN9GL?3LT474|T9Mo?}_D?Vs@XgiyM7qnsOv(e`e}LHwJlyYkZyh2R;FyTU)(di7el#b}TNUL6 z9*f%>i)fLgabDRfOga2%%H{9lJ_f@dL*4Wk&YDS-`#m8fuc&oBP|2`sj_FwR7k(U@ z`Dk-+AoaOX$XP9#ZysmTKrsQPZFnOEsmO@kuin9xtvk)dw}`Qi zb4E(Xx_B5bdfD;zV{QG|h=9;^!5(7=a6?LD#^Iw#V!3ZD-g`PeP;jn)vo)TwQ`p^8 z0q#EBFkX!Jd-a5Fra}=qK*>is+e1Ta#*(rriZ0t6(&8~%MZj`HZqt3OqEG7~khl>C zeaNv0JCol_;cwvA1*hEIshPU32ft$ii~nMO#%BHliajID1-yDwlY+~9I*C+WyP}AU zybeCA8)Z{j`h{d9O*SRSR|ZKz?3J2#ljW=q$s9c{pM$T3f{4Co`;c#dbI=B3;z`KK z>dt<{&}pfC+(FyGJxuL_DRFO059A{IClm_%rxY}Q;;F#vTG97a2f8MQib%Pyi_ocK`I-Afm+$?4PC5L9onUu_$1rP{f+|98x2J=3+pI zTUJB%S*;BK3I{M!8lZ;*{}F}K_(v298bo;gNj=NMbbo_0UO94 zXc55&LXCj*daL@0g4K-8$2o^Q56mXPkp(O-TpstDwG#C+`h&@90x-{Uj96D!~+ zoQ={5eDIf6ck*H#AC7 zyNOr>2UtE}%wk8|h<8r`7p>6KF#@2yaXyYbbp{jEj)BqMo&t<|5o9fr3_gGVSGDoT zUncag$6qDo8hbqDC@8?fYFm@yzy2K(g*sveDg=O^X~zPKqe(J0*$@N;k~VJwH-xccHG4(WLSu@5LAdoTIJQ>XrBQ`BI6`T;(#_4=28(K2bwRt*JyXf}Bn zz}*)l<^cZb>HJnVM&wnDG%!Q6VOBOakhK)R)2NTIl-B&6d&qJ-K=x^UJy0TL z8wM-n^jr!fv-JxSGg@)m{`duws&e zQgfN-jhRT{$Ehdw-USex);6pb|>BzCgOrNdvME2|w z%a3gUHF*t>e(9q)dws4USoMtZj}<8X#@N0$a!&xUjM~Q%&m%A;9=S{FF&D)Z2Z2A5 zkM!V^4-@|S4@)?Q+TYfBll|CFcf2whXFeAI-N=!DD;MB1_dOVhKCkAXL%`mJ4;ULF z)5ybZ$eVN`eOmx}sohV$-V}%+_}!0a{Q7Cg_OI%p*x)E})!vFb-vPvXlVo8NA33vY z7FVuG(VvHD?0LE7AQ%r1@{A8_U>_O#QF$T$qiWq6BzzwLi4B*8Sf7y%o;+!n_Z>FlO__e0bIbTMmAW#E8_9cnnDW*OZ6Y%!ss*H(bOf7JD+u2Lj+avd2uh1b;}o0?pfLX5C?Yz6@J z*3y+MwGfM$3s7fWd_zI{C-3D@QsZ@yUnudbEWEP!X5R7!=E|k%=__rvc=(u3voMM% z7`Mpu;&aeEE3SLECUgA_)Sg4m_xz_D&z5R^cPi#K<>(rzRC8}99JS3T2kt?;6qKsf zo;(&)a$@-2fEduMKjFJw?zcqzhNOm>s5K_=d<19ha$ zW+`;3X)lQMOw}gIX#J~Kf`aoVnOt`wQwux)BzDLGu-tQhbVt2M9`VZ#dYK0s$$}*u zeK_3*oIziq|I-*U5C*B_Lkln|)#yp^PLeMZ9!`{9_z7~+q}a7T2BF)r@Z&xJiafmq z>{0#>x&i@O7gFj(qc5u#m7~o$ReQC5b-AW|u)y6gPP@NnNQJfyXxZ7%%U0m#Oo7kE^Ey$cf%r{hW<|hk)?=ndb~$hWAU#&|V1k+O^v~%r z$&&yd-zshu8AQBEU}zRKdT4`dR9R^J)DGZN`~3VjW4S>&-L|TL`c?!(Uix7u_Ov11 z{3c1ZyE`=<_T6WnEG!?^Rw#JPFKV{-wR`=&?k`__5JSi-SZm+;SgRq*6ae;wP2M7|y(KCAJ^D2k zs{5X_)rsa2A)Y9QgKVVC10@D@|Me=cS*;h^!!*$XD_^@II*0fK>kRCS8fvDF`+;oN z6X7k|lc4OQltpXOD-8>${^#7(E%E!pX_;9~#PadGILZ9K~QAXB5NMcb4S;&p`H_a}H4dhqP?m80z0 zOQI!rG4Y>Bmys|1cv93+9{O`;Qv;Hi6}5fkt?p<)u=T35Te7*ol>n4*H1KS`vNq(v z4Eh9XCQCx>HKoUI2Dx;J19jJSM);Yi6k-(1|i(avDkbL)|lX>YRl(InXQ61xKM7My*XRR3>fcvjYYQMv%{o*yPADY->d_ z=YU;5Rpy5aZ7;igla9Uo`o5EGwfWnX0n7KwBh-s}C3eURT|2G93JMRDz{Ag6&r%6w zZppFMt6co{2|fntPA^PRz_;e*tFss=*HIvsa28Jy)yW*sn0aPRac` zYm5^R0o0e2EN<7(g`Uixz96#Nw#^JxSCw~Mn9SQJo!O7!iOTRz)*W%~_rLu+6m59V zjPT)aqlscacopOOnCMDyo~1pc?D`W;DDfXzwO?s^7W?$TW^F1J*l_`)j01TYHx4Pv zK8ZV}AU4DwQi6zWg6O$n$N|HlUML0y45Xqz4W#_vn%l?(m5I_>fXg2i*rWlr5v(xe z5m5Rx0jC&|*Rhirphsy;f~ZV`Al5L0PRI$sFz0FoI`fb^zy|M0-C{k7CKS@dE<1vU0E&jn5SXqB_>34zSj!l#Xdk;%NoiD$Ht5G zZljVao-4VsJDSE?UA$PRELi~UZob1{!KitCSIgA1>W{{C{!e;8J*LIG+?PX3xpNo~ z!yRkN`wy4M%QLA;EWgRcTTi|gCg|pJ#ldg= z!EMTAE=DRn#K)L{d*= zj8p8_ii&M9jxU_`Ql0D^#z1HV?fr6Dc}s*A^ST^#xrp(Zp7V;+HzCg26=}y7dM(qa z)u7)wz5=Ype?R5{_D;w!6cs$J!2JrOPj(P|b2~Fun-A&lv&2 zB(y%c@T^T>Wg=8kOV>;n7~)3mBIb(p+J7MN_+K}r=T-^>!8S~Jx~{3T&@H#bo=b;( z8Ztqi;DwC+^;g>1AYXOIXC)CzsIMnRX~q-xv%X6M9mcVv6DkZ_776X-(6}K z+d#Ep0P_LyGkW9qD{G^#jn~rmLhkuiFp+Bounh$|U$;@A2%T{WH@p^j2@V?`uS^aY zc!R2m))$QuRR-bR1yZ9sV)=!vA(@(<)pE6{4?ilP@%jSGcFmC6CWF2C)X zbeO<3VFSxq@93|Ug#;XWVx$Mlmc;I9i{7yqt5*YGR$y@j<6gB?iQVgN>y9@x8LO3u zbL_GGD!zUF7I^pZ!V7z^ih{dR=7viNVL9!RIoh6DRklM_ zeL}NeMa?KXJh+Xcp~E_1abd&DD`hf@{qsza(7k$!hZUDoM76>j%6D>WueZazhPNM1 zZCZrx6u-C48SHi`^tHQ?ZFBvQn_?@kgTGxg8@3Y>Sg1^zB75J_G&`M;et*RMD-s&k znW_FGr%g)m>M1zhF5kd}y73U47A+76HNF4wT8*gnFL!~z(l`IMf$^{MF8_nqpQ-Pp zxX*^_l;r>*@8*IEpM3{KuggaO>d@?Lgcm8$M_S=h{*;&wG9}=gEf%YeM 
zSiaT0LW7T#QrhWd#@XQIHrkKddZk&prfmaKZXfOWlL~3)jI3AY_?=D3W=;)nyO@j3 zCF}`8O^b!8JB-V%NA~dsiMH2#FI0pUkWENX)rhy}!Mgiauk-#2YI*rbD%xNLZOL-y zHQQ6mRe>vw3dD{AK8)vu+sTmp2~zq_T70>uWr_z*Cub6>!nrhDE$`%vOrOqiYTivx zlPIH~at$!8Ol#D3a2%`q+UWRR$N@U`G?lOO-W+&g5%`0c^q=(pP&^ZQ_|oYQn@K&9 z&FH=Bd&Jn4*gxA!W#f)CDPG|!Po#iO=3n-29DlWy0(k+>K>vKsR_6Yc;?djBf4Va7 z+GRRCG-T;>iH4`kM%&3vk;Yy#gqO>M)e9Gia_3PSWisN5TtGb46$exgn8Z2HMfyc8l{NkoGYO0Q9!+Pf5^38*@yI2@8 z2jOD%Dx67eq5kvC1gXBV_lI59Rb6rEybc9H%eV3+Bfrr+P0s;Q9z6}MAGm+?{c5uf zcL(s+M`EkLfAjzFJk2vbY$k#)+A?x85y6!|)mSKAZgb9fsTrY(fD)dsk(-1YgC{bl zZUHQQq0b_j5$vssbi!sf71x5uy9iGaWog|lxwX}Rs07Z6xcj0J(9(f5?(H|RX_;zF zGezA`drK-#$9(F8T%xjuNa;r`M*P`^U=9!5Wz)jJ^uQ>7^6Yyxd%x zXLvEIrpxpWI3wZ#h`jEDZheGxXx@_?4&xDD#EwwEZ>dcSc1Q|H4@eM4z%Jc}UO=VQl)q{J#t{1XJ>?}0+w~U%N_JN{O`3D5 zguzjTD{)H?qO@B5%G*pGbb5Y@eLluH<1M7^^7UNDEFA>)~kMGU_d zTL1q(EAyWN&rl)V;IHJME?aFd&2WrjBVZ_nY90;}WL(GayL^Eg9`UwR6bMGq0XW`F zO;Fm@DcKxsRQ}`JS?7Hx;Up*T(A^K-?>awq zc?OcQj>st58BEf`?mU|a7Ia#{a%L_TO~@|MGiCSUwgS z?w{!wX>rvve7Jo<(dL2vW#zZs;@wRX!}V{c>hC1)Av7!E*R?N7PN0)?DlJ%l)D~kD z^U_nEcr`zuq)aHXE1cy(xWlTlIqODS+vF`irG*Ckw^nwo!9?Z9Vwb}NO{JYpUFU?Z z{9rZ-|Hhz0=9|Pvy7d#ZAGx)c`6{g3NV!gzQ(o8zOpw@$J9LFWR^qLXo1?ARmk7Un znk)z%PY9I-EU3Vu30CKIBxt3Qd}Uu{wDO9KHK}+##Y|Fh&DP_${I#!D10mrKT6cHu z6|;X8`Fw&}#s;ua4QCJnGsWapKLlfUfq0&w(+|>gj#jfYH?|5?{WX`K-A1h0g|3%| z06oYH_>@Lb0~`IzbtDdX-7-PGP{uKTFHO?rf9RC>Hq5o-a;U@94k)pF4Kf7gC%tvb zp8N80&lgH7PhY~0##XCUae+Rz_}R%5$t_NWbD}b_?X2!1Fc6wP`9@d^#)N~h39^Vv zz<3fO@wF{+-#M_g-rAh{7Kv^h7^(!`=L)QW)O@6B%GKVevM3fCa_mDOJ}KGsI@hf4 zr`jr(GjQOn28U^PXz>K^TdjG|<{>W-HdyjT>oO=Oh(ONO^JTAvpd$rvUGAEu!v$}I z4l*UBG`qQrp_#~2QCw2HW86An)u-+k(EG1N!1X`B)1Hy$FBM6v&|9Hbqe)i7 zm%+`~kxEvm(UQ=W6={ZJa0>2q=cV_EX-%LF5~SHEmJH>XIL3VjFeiz17X01`*eQD9w%ahat#~3U6w1$t7l}{t*2hNQFxh9bC+1lzHp0` z`%>&9N8qs2M@Np#dkRqasgW!C6+2%@rQ%KHm+CzXjXHSw#gG_miAJLdx;dG$w4tnI zN3+sclVTNea(Pn`Mf>u~)CEr*jc|hdHJPc-|JbWbhWZ zd!^1HKM9chP&{?kqwU_%;`8tQ3*1hD{-5_TmYBxKQN(^Q-o;O}?AR|$oFp)2a}P5L zq4LqezY62FE>kNOB^0t#1;|@Ld#eoLRhpvE!6bi{ophOKOo`odNQ2>!IZ~S`E zWfDQWTYgy;UMcPUtiVw_4%3Qx%{&Mc|6$MeSUjZ%-y}-R$+6jUM?x8#qfu(g0UJW; zA>?YpjQK^Pj;8VWHl^A@X#)#8%4b$_!W4tO6jSKp_!q9{WJQ(y%!N z7l!$EzdpTj&hRAiG|}$@;g!nPmDoJvhif5jNh3#yHJWI~bxwsZXZ!^boyhVBF^;(e zwpl8VZ()v~nbwi|xT|wl`BrM2<+!Gj2T4vch(pOt4gEJ)-Kqo9$;NYMu~K6t$&-$PT9J-nDST9>!_ke&$o*z=AqmNzV;OW3$O(Luetg46u?RSue(igZSn$Wjpcro+9JD{D#`E^)tp1Dvi;TIWjn zeYDFNFBnR<53eT20Efq51K0THoSUQFV}pvolTp6N7jB5sMAvr42pP-6`O!2I?ga_05dkj)p5ZgMB$X5!b~kL5g+TiGssQ163r-25Ctav-}) zvQ^n46kJ)Uf4R-xw`x;{F&WTK4$XZY&(a@Psvo9o<$Kp2hb+W+Z;(aqc;re0gc3G&cNrjbj#wPTjEiRb&K zX*8O~uFl%-e>Y;%4GRUGUL#o0%u8_!j&PJrDp%*3-moFdTpaib5&&qXJFxHL;~D3R z$;q4=cuM0?ERBKi>KNer0LB^MH+FEDHmqi3@+U}0@FZ_e|Iv~l(jLeexq9sLczA>t zVgql-HOG9pR(t3~O(%Exp|i{TJxuRU(C8wS*z74SyVbYA!_bqpa3K$?Edmi))nnxB^6~wFwDY(8yN9rs2dn3PJ|@WUMh{cRZD{S%P%mh7r{{{ z_sRUgeiK-_vT8qW6Bg(o-=o**fL;b2$D+_COU?mad+6+oi3o-b8W2RoNE+fG<(Sb4H7;Nv;R{t zo_E7}Re14kA$lPZ-Hw<>zwCwu(;h1TN@6bo>Upt$w_NZ9xDixAr^|SIS;hhM>I>u% zs4%%2c{b@gc;FmBU;+k0Bqn;G$0*3wBJj$SXyCqyMtAVV5J^wE{3=%e*|$Mxk$B|R zp2YF`$qHriDp3|tJAAQ8cjO>xQ$Xcn}X@%6=eq1hX+(U4G~d6&kQ7Dov? z+QN=<08QC66kh&RZ2)`>IVKr5NC^@DqHAS=o1J=rF)zXHJh4l|BH(5z!EBKJ2JZ1L zZwIMTSfd zIAEqe3=c!Z5aQ>Q1IjF`%IfOR^HbAB4f_ia4%!;n2FhN4MZK=-S3GN0Mez}{xH^Y! 
zrc$ZX$rX?Y)%|)y`1{_j@;cYAL{W{)D{l-|e3;IW%C>#Sf=}#bq2bD=)zzVomN%MN zlL4qxT3xbwcq5C#ZIKri_`vN(%ZHq-UnwFzuh?V;dQM^BQ#QY^CC|3F)=Pn`iY%9#y`ux_C4F7;+t;StJ~^+Ll2TI?310c}Y|#kY$F6n4+k&vERCk9He=FW!CEHG>}q0f$y;A2)*;tw2Wju1+;f> z7jxC-K)anw_bqEoDDgG=F*;bExh-V2EMm`+LyDpcePc{<++dQseOxuU{7B~bqK{c$ z9b}7l{X2KVc0t7)_pvYFY}XAOiP=PgR^tnNy>?3N1)LK=Ef@qrj7np!UL3~61I#k~+auSvTWTkAeVgGq!XJVkIF7r5(02jb+B!j(xSFS}69L<->-*q6`?jmH zsLfXgaeW+a!cqQWMk%y6VU_sFU=RPtdOd87wJCHDKdg(ZIn-8CKLS$zrpEFbrQ+UC zkAKKnAHCfDKy|JZG5@XUQWc=Y0xqw_k8peROT(;>ct4NbulGBjaY@r!VHOXXu{6>; zGc&O7nNKakZY{Cxk}SdYH`m_Jt|F;d`qdnriq7N=75CzrEzRC|>y4WzZL}|rc0z1H zL|{6zoMQbkw(-GXYcjvel~o+I5qyKQ>TL5D|6R;t-u?xEx|o5JS|2zNFxnSu+cPZp zAv(xP2rwsF)4pmDC`~pI(lUr;x-;+I1@cu^s*FGRSYyF?-c+itiRSBKFc@L zlpvRqj6XfWPlfpj^%tC6&Y`bXbbXGTWs9|F!(OT5NVj1~v8_AT$F-Dl2l)ohgteIh ztuXUsXB)13$@G}l6q|?y`VE|X)Cp0pGUX?l$oa%MYIJLZ#D`X?qwcjzbsRwCg+H;a zNowhuUtfHtZ`;bt%j18WQQB(u}q5y_U>w9 zgP6yIx#r;_i3PAK7qGM{YJMd$=nN|lv4Py#b~E3D&o(-{^vFr ziY7Q-llD*o14fU?+Mk}qZO#1#7-5Tm3}!xxlMlF1>>5OZ_#8#}Tj-t~_;}cfx;y%xDi80A~U(_yRH;b{PJ5I9uwO_v{ldyaoUwG+Ekp=DdCXzFx)m zhis@rdBPk%umy=vvW-7)#U9!7c6V+QvG3^ptaMz3Z6b~l(}506DcKBxz3!4HQdvTw zw@G7wt?|3ttJm{yFn@RDU%tnFtf%Lu5Gx7WuWYd&ebq81p8W~gMHxKndHyi+8EOtX zjwr42cAtp}lw*63uWjy0C&Ue}tjxDD%c{VRZ~)2_fp-g-UFYA?9kRIh@jcZQX)`+H ztL3#mWYv$lsJqI_) zevqG!t$V3Sr>P26ijHnpZFSY64b`bFb+qHE9Ox^nenU#;gt75mE_rw6sa_u@bb)}A z(I<}atEf$65VrVLq>qZZdH&evg3JC>9=oZu)`UC9vL5}5f?`_g;tF)SPv?riV3w_B zo`mzn&ahCaz1vq|6?slugpna=;1_mj&;Qt$a9iPi$v1c^hxxc}eo?MCAnd|hk|)3P z94rhti%H+VjP&8HjsOd&Wou;my+yPlE8s6SP7u@A^9$(3u9!A7`orGMIBB#$fn1I80IBVkfI?AzOt#ieg)z$q-f?zIzxWF?ojX2M8;eixxMq{1f znAf8fPJdsUOp_i`XHB||zXTkTj;u&%YOr17&qZZy;hwwFlCGqTCWHpXU?^Zsz_1zI ze1T>uyUK8{Z@OVHmYLC9|15BZ2s;+U@9Yl{;C5-F1{ww9haWFq)VOs98?9Sk69$To ziioYv@QBeZ!@kt7argv;az44}H;kst(nKKqm%eA$oaXsm8I!k)*Z1dJ%E2sHs1usw zmIF4X!o2ifVz;tbCdJh}o=rL5xD8OV0dpzi5;N3Cux8;^x8wb}=csMee){Q0AKqE) zIgQ63S3JCe1?v#`s-cm5Qpr^6DuIQ>XL9PkQ^(>y>{}AA3%=WlcP#2gImzWi0wE=BVGW!k z_1*E5H92j9p{L!aui;j9?lTSX&g9k4YLgYYqP;XJ`p_jB0eg`19&9XE^(UG*OC6eq z8=CK)6PC;`fM^&|p#?R0`z!cB`H;)#zZmS<>(hpy=V=^?KZr{Ha1h!$qUK;2xrBTr z%MSEC`fbfoVWe_>qrtV@BWF!W`9VOLrZV5V~E`R~N1pUA)FZUu{;*v zDGKe~ZC5qoGaoA{8|Oi}NUlJ$pNj_54}({b3imk8gs)qtRnz?TQGJYadGR=cU&$;z zVX^#cSrCb|K5S)Mv6Ncz*}ZB;S~q!e1p%H=HD~kr5`jJIUhnPijrN@s;msF3j={)xzBn_ zMg_<~yJ46sqZ39Q)K8sI1C0vz2p=?}yZWv%%^l*RR+%%L8}Hi37`L>{ zsnk;6gUOWwok5jkv8@TWZTjyy*Ro~b?A8DaXqIJ%ju!WB*k9lR|LD$mR1T}J6f-MM zn0^A|sgb#@Q0H(j3}=u|a9W(a7lo=Gc-qjNAr-nM;+a{PW-gVb6p~$cWybP8#A}5! 
zTu-h=6$IpYEuj$3cBOIJ0Q3nC;P}HnnLih;|5HqXZlMp>f}_nWz3VArr~(qg;l>l{jrm01m|}hCQu?G z$9k+F*GMU1P7)*5P!!#768R;KC57)3Z>D9o`N+}p)`#;I;e+#Fum>yP^8{(R5@JIw z(B_~A$@Bu(_p4v3*@e-(HTpdO_j_*j9hf!hqw%B6v;;G!n#{nj&sIV@z)K7XWXt-T zVy4q_(#!?YRGb$M=S@1>7oo+obx)sYFs(XMzP&tnl=hwYeZgJnb7;*u zF3viIyIVm?`DboDT>4?MmlC>`8GcX3H*Zno7I31X%$;-7O;=CNyR0f$weC58?=2>b zloraF!#XYW4REXc7R&p8)$>^&>n|?*J>_oNl6PW^X~V4$1MO^S6{+>}4vV6yVUcw2 z2P60kJu|pt?meJ|t*!YvjK}a$H+^hzk#b!Q6=o_9G|oxV-I1R>;{i)d@Ve?^truAr zwK2w4|CpM@@(pRih8<=)S-5?ayusupJ_jOx3!~OAt!_tFqqOEkk?Ew@w5QxV?W|uv zmcaHcpV3kqCt{8YJJKm|Hk|%{!u2iOlO?v=^gqBn%#IoR&HfWLBbcEJBOj(0*-${bL-_nlh-} z;2)R!t<0y}?+tqHVjS;mM@%4MJhX?>>n9YwrkA1f&R+33&T*xKKn!kO;D5R7ru_>M zO&BO4l9(xr?XTDVlcU>_@jNmYXP|eS%Ee~$B9?bpU|IDY4!fu8~@TyGX%)Y zS8B?by~y9rKA)lBwY-zNQZXEMwva)#e6B%o(4sb3jC3C4rAxd%%vB$CU%&(0)V|Qe zQ~WkE`zzk3;=_bFce1T+z(jcZ&5+CDA|?^BX+1a3^GEP&Ef3|U6?T+$@h{9NopWr1 z{iuj~8l?A%U02EuJGvexZYo#6JjUPF{iyiDQz>8yRsM|v|Lf$T&Gxp8&6?b*BryRv z&{rL@?lJuK#%BW8{^XsXiN>q0WyuxTxh`w@Q8TFI1&=%j$UA+{$9@LfR&Qr-@kHfH zY4;0p#x?v9+jZFh21?dHb&buso$I@zZ&#Nb8ifDDA*n`m|mlKaT}V@0h?4!ii~fYJ)C@FnGnn2{^d**=aa&8^#Se#;m?kS zs``?j`klQl|ELfxJ2&o&DhCeQgYGbLrWaLi3l1?li6K^da&DLYD5`pV_8su2M~%?` zMQ-?)9&rwVJV^#eZ_}V|7QK|#)JQs$)-jjQY;Fl#ao0M|)S4hPRqW>Fyq-hflm=zR zQ+>~W8}Xs>&20^Kt|E+f*^7@IjD_&VVSDPrhpuPpDb8e;M3A@a%#%IuzgO4PX@7NTo!lyBSc@xkSx0AES# zz9sp_h>N+We;?KVyN@c{$B!NA(p-3YLLDtL=q_tuqtzrgT%S9q`G|YQ@*A3G@aix; z)Jl#Ot|m)$ch0~2na_}yz?;X^Io>GugmOVFOe0X z5v;|}nZHr?XP=W5a<{OIa*Z>S&9mJZQ+evr3h)2;VEN$G__2A>HcRA7g6jupwwI7z z1buq~PI}Gkk(XqC)>Nbokhm%n=q&Iip3*7F`Q)!<<;~c8L6FfFnoLwO2%iJ&)^{I? zJPv`DdF=^U2SmSW&U<}76>#e9{+t)z_EwtR3E3oww6NVCA6LKEJu;W4=%I0Q1ncQ< z-NILFA{>|Cx?z?eGs&f?R8vwDJ^4^d%k9Ym^xX=UN}Mfzs%9KM_%?_So>(T!b5|vS zP0ytD<^Mzw``_`KYN~92*zqM_`yZ3#_;?c&jPR1&6Kq}&zh2C3?E9%P$cK&b3pLBy zi)X^EyWMx7sdIG<+-zxRk#c8otM+9$=1Z;W<@9V8sz%j2%I8)zO;j5G1I?xpmX3~1 zlRWrwq)fWow<~Wr*S)%8`8?wl%-7!NlT@(=dQ@i{y;MTjC+23`fmxoGdOubl|H|L` z0wRvF(=rn?o|DS-biAd_d0I(l|FIZh01tjDN#Nd^24sDpQ%hctHCpJaT1FL-xOoylbn-Mn?Wxz~Q6 zwdvRyQWwG$!B63xMA6s}goPf4>%a)`18rjPtZ$82072E{xFz7P;BDXoZHl(w0~@F> z#&>}D;aww+7V?)2CKXnI25$obPYNXeEHjdRX_zT3d2qs?CI677WN^b-Tj1FUdcl%F zi;v>xBxb|?{RK`4eh9E1B8!vG663PG$`{qvCyEW|;?@-THwwx4UG$O=g#pk)5n_xb zyC9Zeef1UlZT*-JsDI=FrKfy3>Vqz|TVWkoH+76~VMik`j zO)rGc`9J7Bl=)QcpZ67TJR>J5A3;fCZ?4T7zM`ATu}78lG)BtWAWhAV8}OHrYK;7B zLS(I?i>A-{Ff7&T;cbpIwGyXJW=}Lxzks+S?7(ANv!P)mF|owhg0I_&W^6mKC=_@7 zg%QAwu%#>o2PlSsB=M8OWw{e-~kmLjep|PHSF`-;{r7$>1DcIwmuA1Ux&HJBtb?82Z#@s9s(Y z2$dK80UOyOnBv7>W=>>fI0Pl>?}wvDokPsn>r8Pe)Oy+1elf3A!uUf`$Lp zQ(dYD!e3Nec^xH)0yis_SM4`ilf|l?&09debPXtuqL*`AMQCxqQnRHsSPA0GPtb<) z4EUt|nf4-kyh=d??GCIby0f>lmx-Z-KB9bfyYQJc zkJS-*b@X?{7>p2$bB>TxcX3T$EpF$%NMko&7;uj`--;b;P2cCJa4sc_cK;YeXUQa5 zTW%@xZ8alK*B8$5zR&j!T;5az$hwmsHeKVDWMw+eA}g{->&X2^Z5=XS(`ZMH+ufhP zR=LsD4>&XuS%7MJ9C2KI@ch7!)8TUtajmzYS8-beb$mJ6#WJsn;oYY(04tG}B%kq< zCi3R2usrIL{6fRusIB9mOE&;!mN;XcK1wez@+}|r4O;l}#c}n#?)$2?a_LcN!cZjDvA#_pw>K@tEv9oMmTy6IyE=7Z2N7ze@% z#qGthC|$;7pqVQt1z*@Y?G>=3SPeIh3o!x>;s{kP<`S|t*YnB(8b}P}^jdHlQAhjB z;NGMDN4>PCyt@K~Y0Z0wM%+b5!su^a~LSZ=HG&lGfHaHR9{bJtMPA9TNYg^bZvur_H~B4EkKBX zNCCJFGj_5uKmZq$X_;jow%yLnz_ty?Pa6yt6>TnFz3y(^BWdKQvTdRCy_31{fpC1h zBE9jLYUoy3qNok`bNS2iyegH$;>T-ggt61$l3VL5bIx4TUF@GO|Ip>v0{N(}OX8B6 zp@%KRqcgxMZ}%t24t_{Wq?uU*n5~bs_vF+V2SST>gS@Oq#JhdG5WTuBR zy>wfUub45}1|L}1?knHC)_h*ZQOG)yY|#gSgM*X88EcJdPc2x2u=tX+wImOox3?H? 
zfeH2z=BhotK)IPA@@TOj*IqU|PUVXu^N$HE+Ddoim04ec=Y3sZYQ@CpobW1N0))FZxt{}e6fow1+hew#jOTxh z&%^>qZ;mk`7)QM~XhB*_9>Um92Up)q)fz6OzRtan3dp~79BJ7`_iq3|tQj|+X8z&r z$-^ESkE46$K1@{-RAfG4ETNw$f2upWaQqc8k>5c$5w=^jYqvLc4+HXfRfWzz!Oo+@ zNX9wSlicRjemo1X2lcOiELs32PywREq$79>r)Odk|5K-hK+*Q($g{=m`Q@**awe~U zHa+FnR-#1-FnmaEmTImN>$?Trro}yOz!uaN4Ii#QwRU*z-NZS5L)V^6ugPXHXVH6; z${?S%2H&7^eDK>hXQ|GimpKWt!-e7`o3q!|xC5a(2{T) zc_#VS_p}7CQo14AfV9?LCv_j6Wf3WieiSnA3A8Egr(5Sk? zyj_Tdi~kS=)DwXj=KKn|bEjz4qM^3@v9X7&1qOAM-F9E9y5+lCu*VrxCzeo$cWrQ< zgPd5s?O$-&3dzmpuZCU{ox`qc&L15cD%-EnJEqqma1g6}te5Twv;vw+9G}UB3T&%%4m`D|{KND0 zDMR3s8{NV~Pr%WL%L>1i=%<;R;?rDw!EU!Ptw-=gTGHCf{;H`&KRrZjCEBf z8g9X2r%Nl^(5Hx~7FW~dF{JvLi=f;&%q>6P`S*2Bi7Aw-#E$NS13J2c<41Ik)2+HqZdVB4WKeZ5_B)CXRBQ7{DSnjYN6$1>yQ~3MWZL2d@x&>)3OvW_P*1Hk4j?NML@opQYj)TqLqf^>G z0yGn)Sy&M5s~NIX;~YC{V+8X|`U`p9rr|F5r-qYJH*zQj`)p5QBj?w%tR2z?iHmPKAkR<92n}PhM31bhUv*eb-0a0RPM6a) z`0LNzdTHHXYRmS$GrQ{N5ylq{v91 zIHpINa|zj8l3SFs#4Zk0wvw~t_@*)l;8?6?0$NugcW(fqG@oaDv8Xjn`PD{=a^#tU z@sI^~^ddzkhT!OckF03t53J=iB8IwpyNI@V5G}Mo0ktG2@+T5Y!RYE|yTeka{!cHT z`Mbix6oiL2wU;Y)q|KWTLK)jUDnEu|U__%_WzQu3V-(>0R) zVI1X)-Gv^>5^3~t&4uTGKS*}AUO{=PzgHemX?B0nACOT=8H@3#!&3~o@7P6rokMG@TX?v{Fn8|#Rf z0(U~!mq@o^&8Xeep%kKIZU&;^lALn(>UC2|lCS{XCfEpcC`))A_7h}f+eUr|NeF1N zZQ^l*+yUw=AR*wAg#A|q1&%+GdDk1Sf2C~r=UY*gm#gXEB0v?tQZSE^(HW)}8E!@l zqyswZkeMunV}tH>v@K$m?W|Q~(Su3+)gVA7?pzGs+zaYSv z;~>Nb&>xEh2wcfL`{9e~LUq9&fig;p;GSqLr<-z?L!mq9aWv5l6AoDxU?$`2)i5*h zW z&Sr8G;1N4J=?}(9-PnIVf0T~7kC^SJ2m_XzL6H3hv_28wK_2VG){5S2KfkV3N!7>$ zIvgNSM`edD_UvZ>zMF8EXh1jWxOhFA-4`R1Z(ler2k6BF*604o^03(5QO0R~Ay}97 zE1@v&BFzrXJa`>DhhlwXkFqFZTsjB}IES6v|HYL=Mew*)_i{fC783(;Cv@ zf?;jgJUL=!6KKh##mLi1RiMbDmu@lHQ~pu^Q|wAe%F*#P&TQ&U1;}X8v?lN=XLxlW zhw-;i5|A{xDiOqI!cJSjU+4Km`P{!O@rx=986e9 zwDl`QyCblD!0G<^lEgWw%cHZlG&|s#t@Z2p3p;9odsnt21$sn!p&OwMSA&q~S0u1y zroGhamj*8VR&OA);y$w?C-SlAy%4QH__zHwBJ@-GhKcTRxU^iPdsNgD24L3z#!QAL zlQeK$)8P^!rDa;{YUukwzX2R>NAxJVrJC?6`+Q2}d8^kSY$}aNkMYxAVQgL;Z143$ zpFCGwAfVSn=PF*F$Cth3z+cm6vHZ42LlWH?m{(ngMsPVSNdzN>ROhbNSPiNSM7M|cK&$PJF54-j7A7q6;~xc9kAUF~7Rftm!> zk+B7&@8fUQpJq9ZbK4*9QC43bJp7F>PG#oB0q7*L-Z_hH1#>AdzyUA9#oEsmhAJk| zvE87j7B3*l(_;_<2QhvQ%$}n0z2o}joj~|be#7TV zu~szS8jJ4Qp1yl5W@De(6}e4TrO+p#9~DGlKc^&97(k?#fMxWQgfn6b(B{|LL_E>5 zrf%AB!Be_yi~|~tYIjc54}l@;z#Y=6Q)q98d^+}_@uG@4dhjHeDB@A!dAaqf-~nb7 zZH|MH&OppG_ngaHdI42&P=Kk|&6(IRlBLhEkDqahE=IriG^Ef!8TX1HM2yax`1X0s zhCvodzDv|OI$=+tA2K_H=S)Z_OT!H54xv++QvqxyUbj6HAG240q`82j1F$RTjJ7oc zS&7|nNB0f`s102zKm`xFFKD_o#_8%v6!O&6?GJ~9I+lXbAv!z2=G zXfrhFjmb%(q9v*lD0IQK){l^bAqy4DkiA{ZH|Qi7w}N0r^xg#@&0uROJusZMyz8uU zERaY!GhwDIXryuW#XEq?yfKL;`7}}H0FBdPy~b1%b@2W}=5G(*{$5)^rK9_25xy44 z*o_Mc*M`)7f>J{;4o;RkXo5kA7T~&0GA}e;WzmihvG4{?ttliZ7rwg)va*UG zpTd5CZs|l}W)c+yus}4DcYgPriI(sS2T1d1k?|jAHHcI}{(xD?wVmGEqoV}F#+;il z2W$?I2^jepbatzWnRtN-Gqc6{Xzk7P-44eqPKrt-5y_*9*%Yx-N1kaN%Xrq8ZJ()M z+yaCE{wDE%@qg$wOKG#R=CxawOX85r=%VIhG$tIm*O1Q}t8uvDMR)E8(4Dg^rYa~a zq2>ZQ_U`eqXu7crUy@NA9P{C5N&9qW1V(*Geq9oU1 z#cl^})q}6QlJk05KSYHZKB{*cstm_jZ-CB0VNM~^jdA_CD~XF!r?boKutVlKIW1tT zS7{%A2@AQU{HLLoYTC5+xBSyk8Vh^OteN?>R_f!(u>^XpQKsw8mL858TrVelpl5Q4 zJ=#Daq(WwQ0YT%2dPY;+MV zR`U+f==jg^iTt-h?Z{5f!c)&gZxfd9y^Jk{1*H!fa zmHjjKH1!ihkB<&mmSDY9xJE>!R?8&g&8{+Mrb&|=mO>ZKDel?|hZ|T-9I|E5ZL!Aj zE4DDWON7(fnW5ldbMd>TWhlE2a*wRp_`vj4=)-)+`?Prflz)fxe7^ z1(&O-2dbdym~By;U{z$ktJ9Hvq5H#3o=Cl|s;xs<^zMjbMlkC2b<>mzAO$WJ>|7Ud zpnXZ|nnJ?Zh*z9gr?>ug91IfGF1QySg??)=as$Wg-yfipSnx)=(*;k@x@F-Rj6bY}$|t_<+pAR&D1CRy z@MwCtEWIY?E=V4}7^%ahOr3T)Kv(tplGE_n)wl4nyL#JP7gwS>B6v>FaZov1NYaMN`b>+V3{ulkIk{ z>M3k2O(s%S()maW&PrWbWKaB7ys;63cTyZW7hRsWLF-ZO6B@cPDo!Q2ER^Z$W11WO 
z>daWR5i!c)Ak#I(0T|}_bN|}0{+He-`kf;56O^li-bFwB3o-JAoI)LD7JC?a7=dT_ zofdhKX2|w$CVPpb=&TjX>zzA#kxt)29}Wo&403DVG&7z{@fh1aS6opDY2i6EJ*Rv9 zm6@Z9%aU6`rqJ?p67`q272I-xO0VtD-<`aa(7(c?1QU`L3cSp+(XGK!BRt>+k9^mZ zhSurZ;2&!jboth^lOy3TU-;yNmfOV|D0`{-;K79TrS=iFC;B<*IXgh`@u!3JKY9eA z-VRH=T9o`eDJl=tfVFky>?rdyc`6|V^E&0O045$QypaI^@*u<520>*Wz)$^FRUxvHe`w(zJ{ki zC|(%IT3ioZIdrQ4ePK$NTJb1!KB2-6Q!kdvyCV;GI|XI8uTY3Q9;B;Y3=GAh{NE*J zrvJ-SSSotZQ><)}>A8~EzJ@=V*2CZTLQ~+k1KW=QwmiA#s+uMyL^nZ08`QMO<;zjS z5-tY=j%S}@-DlsQVIQ$nX|!?4dR$k-VIK9;xO9k%Iz>HDM$G%9+I2JE+zZ3pXzig4 z@p<+opYM+5Fz8#Idog+U8~BucL_&WUl+sLKKP;URE$PcX34;0Tx^lVd7rV3fts}h| zowCQ!hGuWZ3r`;zW(GK$8aeFP{3;QM6ZmKLm;b4l@IUW&*?AQvdO-jK61gDB%sM(Q z+qe-Zol}NlJ^wbE{unlPb?nWgsoV>9Np-3VWjSr`e%mU5#Qyo3j&n413eu;xPkEx1 z25K(}S^QMw@*fg2`=`W4|M#DJdPVt=lz$?abW| z_6ZHyYc45&6eheF@u1{JH(+&`5VzZTBOLl{+1{w7cT;)jjT-F@QTv*{gOo?l+kf#w zZ~na`^MBQAq7y1l?=#H$>121D|Hx3bZ16s3j5jBONz}--M5g#5+f3h1@MKi^L}jae zun#oCEu>so_*J}_r?4_Dd@x%!A4MP4blcTMR)-OO?U4|@@i8O;!DRn>`ra1Hw+Ett zWo>$$W<&MZ<2!o?AFSgWDTixi|0HY2hq>KXqtu<(4fLWK;G; z$68-ZJ-dtGy`MTXDe)=FV%-O+#>bN+_A{(pt# zjOBeBv9MKOHMSKaT9oH$S3TL5Q+TPnGg;`nPP>SFKH`3rj=YNRuzO4Lsn+!3`1VS} zUCHH+hfhjva@)?GV?b7|Y3{`hxtN|{!={0OY|mude^EcZV!==!JbX=g0SIpGx z`CB+V1&}!F+6sg)xL4H+csY?RI}r}$PGq?uNwTk>d~aqsZsWBZC=b2Oe$awjBd`vU zF6FLxzA;s4>{k0Qi>Ukx&@0GReGb|w*-EzB_n$ELzmVwj$YERk`};lfC1QJ4efEs0 zMn8K;E$d5SAwmqo7XTpWYJ+8Vv(1{7ODH2sshn~apRp#gOZ!wr(+)d;bhr}k4%WJ- zo7sJy5aezGX)G0j5Eu5I7oz zl_D6(wNd*)t!80oTKv6^SPWVN(y+UnNA!NG{iMhFld3kCo)_n$ zw@9>@JD8`=qr4A@+_@%ZW+uYv4-F2Wd;kngV|+Bf0_#>KvCC3ac)^E23alHG;s1Ax zWkDoZZsGim&N#H8B%|p7V*U`kpF^{Dm4-x%hlH2fUM-lNh!sN*}def&EJ;)1i=0p}tY--SdIjVJFWetM(*Wv zY5duB0r`~H>EivE+GCQijY!3M#3*4&WOJ(rR_H&C3ne_o;Ldl(Yu#4(LGv==2;6l0 zlSa^2EYl8`DZ|^!qoN3E@R4 z%K1M*`@9E$h|uK#pwRjo)}L~OSXQa5-uP8_{SQw;8*_+>DCR@?u{92?AKMQz*c3PZ@C+LnUE2>@I8Ue$EQ$$Fn3w@@qf;8Nfly z^a1uMeELGY*^r;Cd*($uEH`>Bdmg$zix8DL*Zn*MBEZ2Qb9Lk6G@0-Z}Q zlG}A#;)e8{LF_8Wdtkf|htYGaViQV`b!^5m1b08$hj{*LeUK~pJBT!b+i#dY?G-pI z`u|%+FX4=esKw^J`{# zi~YzS{chnBn9WYU=J%NVQWC|x#tLikWTgc?uUdhLu0a4?^z?+OT;n0XcED^Jb5keE z$aISg6xpto=K|f=hb=uYy6MX@j+&8N#KM5icD|`h$RP^*kAX-7J78c~vK9`pE2fxr zz5=a_QKqUrN0JkEKS4NczUSKyqs)|d5XFk%1SzQpu=}4|Z*;ntwavr^DhT1rh@S^6 zrj{nKt8}4|nRh-UC|=F^;~KdMX6*+i2e~7t*F_RtF=b;SE0pb_UIuWPcsUVd!_QvRi#hH)!j;A#EFX3$Rz#I3Er~ zZ`8)o>;oIYd|NB(3*e|-cXuVL{A=`kxzwR;!~C>~70;4U?Mg51x8#R$WYNPFe8qYf zVvgSFlX|arODsUqkd!k8`mr60-jm(9<7`fC$48mnYz7cof$tyO(qvQJV_g!5XvFTS2Z zxF}K*_R2n7n3@!XG+w06D!6I;iyRpb!~<_Wg6S7T$gzv?y|GQ^(z|_!*j7y0t@RyW zd`3fIn$h>T>*F1#V(1}&=VkO7`p*V=#{=d7MEMvJfA&o+$~kn=B4BpY9`X;A0$Z9N zu_Jc3;GBRk7Yc|%ekl=RMspcPbWmFnL}!8@@XG;k$Vq1>Cnxc_*P2|~Y%xLL6<9X- zWIecRro&q>;g!=locu+a(ciNqmrfje0+6ryHSw<+p$einrm~MCtvv7!*|4{k#FP{4 zL|(i;fu9&nn~d;r_K{Wc?^M@p_dZ$^K(!f4l~argNGPNeGlGmHEN z$lp*^(q@{^U;&i6rhxtC*wXMWKCw1zhx~os8S{i}*5}Ea(?g+}VymmfRLaRF1m4tx*j2dlNQf(r(9LFUE#7iqSQ#x+w{l+JZ)`EtAh2s&dM zwNQ@U&Bs2^;d{OSkQ-=Bd}*#vSoVTUpbKj`k51WB_>0TSeRD>duQ&G6EhIHfkAUYw zz7UXgb%_=ZQqO&!Lf57NOLS`phaEq2;P7HEZOnN(9Q!`BX*$G}2vsg0Sa{ZYzKyyr z5iUre9ds80z_@Xe1yQqL<+Xj!m2)SW;)x6-4{Z81`}lj4cOcyubZj`P#B@K(agQq_ znRDaL*KZUYx|7e6La#n2rlgm+;yg>$4tWMRoX*+`yGf&a(L@pJ3XWL3*Mv#c$&y3p zFy@a)qaQW!Z@H|YPWc(S=SO}^afvzx6ujgUd0WRkim!0s6c$Gy3BFHcK-fSp=Gox3 z5NFHExS>FCY4$DqYiT~Zv`nQpxzJ0@6o%C6bG*!dipoB9jO#zIWIymdf(5krWVbnL zOl@~BO4oj}I!KbYr8#(*YYuGcV8D+WyD|L__nX85(3bneXg=c&9PSP<)ZioOduXVVuiXQfx#uLCVi zYjx@Deov1)_DQ>N#wfvPm03alPo##KJRyGmVt|O*L&`r`K(93sFyw?XaLl1qUCM_Q zmhjPfs~#g#4^a55NsI??{~{VmP!SP8(NpnTWfJ&0c!xNBNDVFiW55f0N}?wRmmxx` zc~=G|ReogOcviWZ2gwBa*B#x;h0(V%?<4CI&{v6;GwptS8HGUwT)84IW6D;4qhC-N 
zZGvxe+_w>|_a|oJ=8l@DP7jslUu4g!E=Nkv*L=%w5=N4HeI-;H z@q^e!&zPSe8)i=8?J-UAUB=uRJ}rb*{Vm))F;*F+6BT?z{DyLXFUf=)y$9j5W42`M zwAZvr0HlsYJ*&l9-Xy-Zq+^eK!+aLH#Saw19IL?}!_{B?2n@wD>}xm*h-wfTR7k>^ zcyf~>IQHd)o2AM=q#4*)5A5|ZV1gNQ7NFdx0PR*9c zt9f^E(*;&g4I!ODm<+KbI}36N5k-4TF~sxEVb1IqIB_^t9RDEI0d@vG+R;*fZNSl8 zOM$V^F1ck%Ad7l7>lwwd^qLNW={$sZXL0ZRLVc! zb-;IcDkPAlv49dTPPqPR*&~Z*l_EKSBkR0P8;sgdYsTAjf|ge9j+v@BAWCTx{RVoSM%7_Tne* zFyaN3x@-*#_=q-SL~_vwwSKZ-#H5yXQ!ck{GL5+tb3+F>O}wwFgYSNXnDd!Wh>T?{ zk&)|;keQU*lizbWg73>+e`87qJW*b&Z~ga6U&^5JGBvwLMt%U2F?VYj~j~Idwpj53kBdvYJ<0Esh|81djNRXjzZJyAg5@N z!2x|W6sp)YvleC0ljSk#{@$XEQ9_IJAJP?nNpl2Q10nrFSr23M(A~SsX%7HWS_mmi!|TYDf}m&o?Kd@Pyxj7q7Lm|N)sqDVAKD)DLWhJTd>Wrcsb z7pUo{7em(BEvnCu6$5zBj62PZ<>@i(ZS@A)O zus*Xy1{iWyFhe~>e#UGPnB#c?%D#KxCrA#^INn>itr|l6{#sJ!WUK?!Tp2|Dk5GyK zHP4Y<5$-lH6Rztov66aF3x_55wUij?qPyyH!##x#@G}S9j5ZC zy&DcyVdu)v<}FrFMoumzbIpUJpGSEf8N7er^z54bP%v_WTFd9U#>J&cq>p4QikEMs zt;|EJoO0K(+dd}VUaaqhJp2iO_B@_^-(oAqruXO6_m#W@7#m&-oGGZi`J)4 z)XvJ~y&G?%SKzg?e(Q#@w@j*L#IUMS-H+m17C|$7hfFz1u ziqVFzJ@dAb79Ja14;BvX5PA3G&^m)g;^KshDPPQ+l4@$7xjLQ%Hjb={;#g<+lG+*R zE)N1f+q58y2#kO@-0K}B-aRjRVScPe)KSv>vgsdKqieRb5zc7F-*VmkXdT=ZQn zZqQS_pF<8?65RE)^!=@1v!%fAsm5=2vjDxym85#MXkph%EVH|{fQR0=!;Q#+ez3=0k5>(y9RD@uU$4xWtIYxc= z8pb*@e4y4fr}VbVIcN{(VTOsQ09T=b71se$uRYT&FWzh=NxuE9C0AbQyDfc{_xU|3 zsv^0Ev}fqFgJdq{3(EEL)FzH@>1pz}ea+_4N*?N5xwGB=7H6jd&l@OpuL+Zp?s6;k zQUJUs=MC1APu@yDOtn(>Oj6*9VO11~_>oo0Ql;zj@Q4FS$P{0E++)u+6dIE@s@^!B z@(|CG*b=t5J#s4(-caBzmQqE6Q%9)0$x2;<(?!%K_9UQ)?2g7>Kz3L70I;QhiP+2U zycAj}DdpO8T-!KfY@_~?_c2t74Fsqm+{?F^oIvK6_=UbF1`qIo>HKEns`u^}t3LR}!r0Hd@0i;>&eisZTi=W|1aR&ZPE|y(k@PGM%n<1u?b~0QLFk<2o`j@q?mF7-}N<^ zg#9{9H@WT+7W<407#puQ-TU|fx#D+=e?vv!gR9AeXtgr`-h|il5iye1mKj0Z<_ARdHXlsSpu4}66lZYd()!=Hvpey02H7-GBka+G+acCHS$`I zp>r|Q^v^Hr=la%iFL~37$a~bqwskeSy>%i$GDR6~S1z~_cHxUc$oG@qQ1|;4rao=9 z%9ohM0THrgc#_9+RR;rqX7xw9FMP4EV+y*0Wl=iJyphjq^?UtX*12+R(+jrW16}bF z0BHXr$NIR&#_OaKqQMmBtt!Zb-nxeD>!OZU3br{_P@hhaog0oh+ z7Lz;Y*z-kL$YpcR0N;dtIm@N>VHWP^v7ttRT(tgj6{ps;zsuG)3sJ$J) zS5@_fjDECp`M@yi?&#T5?&V=Ay|tykYBW%*!!H_?y3iiCX8JJKQvq=eC{em`>B4ke zOxbbv40C(&N3U8%3zuV=u0Qt>9#~fxZgy&$HC;@|Bqys)q8fAUcq zhL(MaX9YEr3zeg#7~VvPzT12K?}v$1MIE9(Zz!_uaZ4Fh?b6m70`hO2o?4Y8c;B9w z7LxA1hvY($T>JRMQ2JwM^~j#Od)+tS!xCMx-`*;$Q`nadkEb2^nfwF+QX3Jc-vL3^ z#227jUM6#{RONTqD!zXN&7R2GF>{%FItBoFN^fsp_$ zycg6{Ug_xE0v_EdRWMsel*O*-VE6V1K3tAktc^7q<~f}CrgHP)Yv$oSbj1X>lw(gl zoW*?S-Pw7GTyW~w)V<|H)!_Nc3iU{qIAj`khL`5XX;mOhfdH`D-rA7EdTp4&%@52P z*6*K)IVPTSjToW?ed>b5N#eR-KsceIe}T>ZryC%AuOrjLn7J6AnTbKC2*5b9F4%EI(Yfd?2{OX=XArS5xDQmW_fY zj^8);9~!l-2(dm#$TUOgW{Za!D__^P+iULoWn%r}zj_}1PkjGn8K|F?zr6;KRg$0+ z@yxc6@}H~|jS%~xrjzNDGNITwtz_ovZqHT%b*G#%hST{>uS?zY1ykW`2o{O|ft7Rxa87!AsE$V~1CU@{ldMybT?pqHor4r${B zgmZNhz|X-zfmhwbyuVy3g3A#lF_2m2e$3iC^b|M{wJk70rrSn0Y>l>2?vX_)RT1P_ z;-#M;LB7~t#AVQ47WXlzXb$n2*;m}byNMGksbxceK#+_*gr)`OLg;h?a0723zn0J1 zLGU>e`}%%kw4vt^4P@bRfK=V9dqg{O!i^3>G|*s6@NgYRn>t4f*u!PR_~BngcgLQS zev z57xJ}UK6{qrnAqygI+pGKme)`TRH$3k&emO8ug?o4Wl-}ON!u}K17~QT3&?*${u9H zMab9t^)$Pe?kzEYwikaQcK=AqiE?VM4O8Zw`0)sGvI$jT3Eu4hj17QZot|g$rVmQ|hP5J}U_xQTjQ5!%{vbB*Y%?nRW8r=q*dn zib6PmJ*wxDZgJw;Ime{7z!N|N6l#fdgtt~lPB(vo?qiNFA3l!&9l$insVXR!CI1>v z@|QMBmsxDO-wjF!kevK|m_s5!?VksZCyRI>7>56bxBbuWNfD2l^UJF?po)~*|Va@0X|(IZ|NZ*cc(SIO!>#prr-GS(fWLCYa7{OVn@XwFb&*A@7Si*mF zxc@JGo{GNZdFr`J+j2*RvO3S}VNYF5469_-OGM=1SwZfC&Poh+Nd5S+YDdAo^Dk~P z!IPtJVg1wZDs)yAMf8{6PmA_dOT36Hb3*2^>F4)2ECZ+Y|EsRtfu0wW+4`xAKFZ!< zu`1%!uX{5CYRhz5-M|0j9M9a8N_mNezaJ}9!EsC@cW>WQO?h`ZgKH;zcrH;eTQTrT8&6yRr}T(fslOx=VxSj4MZEb^Cevxyx#~@w{;81TTfV}ha*MSj 
zCov{w7c*b6j`r7Db};)J-62vk+yys>>s9%}ml8vp7r8KYvS|y$!wL3D%%R+Q9L4m^ zLXsoPiDBSOZKhC(KCZJ6iN2~Vqc*2c7hB+iF0KEMy|0dIt6Q@UP@sa8LMTwA1&Re) zEXA!9FD}8oxVvkClH%5q;%*^Df>XhvxI4wIP~4&4>9u*U&F{`T@4a_se&1i&Bspgv za`xH#*=s$|T4hlzvS+WvB(!LZWYbw~0Fb4a`%hQ$XDGwpm~#H+=NU2l{TR~HR957s z>f{{&&p`cPG^b30{ryUHwph;um~47(*q&lWOkDrP0wL+_1TPWPx$fiZ@5xf#kl+s^ z`5zwS2tn5OE;5VxvP%;QR@w>|yr))hBzqxm)7(?5Yde96Sq0mqpW;l`g$_H5Uq#vu zUptiQTSin~)i>^zw1;M@@%q7eC;Gk8Dw6tMV~-eVD7=B%OL9`-dzDF0H+|{S$-d^% zSZs#|_;9d%DE&?Tn}bRuAOTpnF~?w1_fn!H2i^602^k6SYPIwIzh{N;>$m&A7k&6g zj#nWTTUXDEw<1Oqw0)Iu<&q*|v3`p<(!x+xnV#+VTKHWnnAN)Kc3DIYZRui<_4;Tb zYYzJ&m@2Wn^FAIVw+bzN&*@C1Q2C)z8~yvq9NguF5@Q{RvK@wvX1N=p5mD%tjm>4-~0K++|11E8a?nT>}Ir%j7cM%_h+* zPbB0GEV`#rqr=IA4oa!L9m($#Bpy7Qd`^s6!av4UaH7&_WPYp1d;Ix;ErZL`YUh^g zg_+5jwqHJ%mY++)$tDsNB%JK;G1FPngm@=xN?MC&3cS-0eo$W?BSZcfJhe{EHT=Su ze6tIn$cF#b>4Doi-CIVQ8rxSocyeg6j}j=}K2{vH7V`rgb8vM>7u2C1DDT zwqF%={EJp0Yd|yL9-Xazo<`L#c=tSg5~R^lN>-&HFUQ)w6hA!ThwfT-^G#*p2r7U274hwY zBUijYc0sux#%B&nh>^4&x}dU%1<3sE=v6(EH34U8YLjP;IQ**gTiFmd2t|gG@TUzXoV2DlIx$X##?<$-)WZVypYQNMj9DA`f z*v!K+p}mc8tWPNW+JUbQ-BQ|1j}JVILzdL}zyo01{ypmj4y92p!KI1zb!Bup)4C2J z(X|~bVlqLBLZj>fV)0kj66gD7#yZ&!5`0^?2@SQmA4TGa?&QbkT7O@ANC98lwap)l30~K);0Zwz zz0VkB>kYxP4ah575@IqjJQ6&<=BS_xUuPHsYX;<%GgO-zXDTksWM@H%>9i&sgSCk^ zj)?*=*0m&gaJSRD;!S>lT!uFYZfv`BF1#M^cu=Q)a$#T4=8n5k6RL5*-AyB#%kYAg zji*lN3C*jzx-@#Ex4|n!Wpt_aG};>%3)J2ubc6WiofY`y#d9>o%G~QzrL0+ zXAZo>{YveYUEd+?+b4)U;zYW=B6S6dT3K(p^(i538L94(MtcYP=Skqb5@DWU<9SU# z^<2Uhd37AD=q{wXR4RUkdLC3gzaXV(r7s4dn=ing{3ADVYZpEvL4IrEks zsJiDLhpo!h`buEFuT%lf@O$l%pCp4)uOkr@3ajG{iB#jE5PYN2=&;8m)Z>wq;jY)= zyE%Y)(g%r@-!xGAQ_buj%tR9YF*6a9s*xTLeI!~2 z+_s6xc|Xz7H65ru8F`#&_pO8z z9D)&Uk6|S8*yW88>eEsc{)<&ay;Vz1ZsO=NBAic>E-A-k#cDt_>%%flR{qb9{!r%y zeTT&j>P$DjLr7aIu%%$YHYOCQ`|+boWK%WLv$ZXsi;EdLB%9iIk0x??0G!_tCuykY zl{uP_66b5>ZE?~0G*)klZpCyunJtCZ^2z&!dObS%mmCWP>-=8ip%QNIwR33PY9Z=E zB{;-e9J@L>Qp?2vgdUl;pfL_WMB=|X*2?#rvegSS(s&n$SUHqyBi&;XuEbb{)xFHA zTOBQ9Rg!nXa)(m2d>ecZ&CP70{}4emJR0E6FaOeUy6vme_0kiUk}uXwi<4iSAS!vH z0DUWsXNu=!-E*NGXsYW4%N-KF-LP2?sJFF4NYddod7cIW1~ z#(@B#@6J?wzQjRk0t`IA-Zd_Xo9ia+REz3%+(B!JaCH}$2gk_sFLnEa512s@yqD8NY$RH#*aU0#3c@gW0QYtHhkAfp{ zKm{&6PawQcCE2lUm_D_N8{TB+koyG`8|0z!VsfO_1U}rK1I6{i^In-7yl@zNc8eOFl}}sq#k-SMDXUHZ63pspqKjsp zQdum_{b`J#`A;PcAXutH3w4INF37X+1;WwLNJ}+Gcd0%<_DVUO5$X1OQv1+07|P>| zqKv)zd&LfsWYpsi0pbk(9PougD#d=M?x%@Nr8%RBmu0{#3|UvaVNh@R9>|@=_d%04 z2kU>F@3Hf+T?moJA}h`o;Ogt!@i`b{KH#>k6)R*3O=V7+po*ry`BATZWnaS$cm{)> z{vzJQ{Aafd^v2uW*e~E8CAhyaOF+#$`d)K)7{)#&Xlpp~xM+a&Ykx?@-^aS&3+cE} z%$It>ZGC$K-%E|fKzxbIhWNF!4rO@fR2nv$5jl}BZnN@%*KSuW;f^+V#Z-t$HNjMaOe>Wo z%v|wZ2-vN$V3mp$#NYMs)$3l$Q;)90ZfJ#Pe_4Ggd4p0N`DFd8N#$rS(u7ipR9QDm zdLSNtV0@b8+(^DQX}j3l!iVhoS}B(|*W>nH)|7G#-$LtM8;lj;Wx|0a?$5B@-)%nr z54?BHslfF9ajBF_r9{X$!N>$rEWB;xzCi`w5zLkFbxOtc$#}yi{{@r1uN; z4c_9jXL)>kkPlgP>`@U110A^~Q*xn+Rq>OR#XSi&E_pxa;(r!f^~>{8j{e^NBKnv9 z7ygoGb8|Gm=M(1k*a3uo?CzBd#>q#R%iPIT*yTp<9~+>{ipv!LCB=W3G^qQVa6wi8 z-Cx*0{5q81-~0#Ylp0WNJKKPq317bPUlOvsn!j?r_<9AvnR`O(0Q))&JMcY!1SHIO z4P7>0tYIBeJU)~;AUQKN*oa?aKKP@Q+y8(4gD~YEmP?EuWB(u11tP|O(JuS9>nEEx zDt1cv9v1gASxh^<1qR^Fk%0#g$L6d99`0Ff}Rdb89A@Tb0noI z4j(R^m6lEFgEo@sO(%0vco))RYYx}>K$rzWt>BnjQRop1jZd|+kSe8rI?IenXDH}E7w=@W*95Duk@2q?fp_9Pg^fP=l8w;K3V&} z+*Y{*>MWpMV9(iZg_3?sc~2#8W(A{#n7PQp_2gErq>FWolo84pMUx~27(erKEE^o-zi=VHC-LM`3c!;(Qc*x1*-lOgTw zDw07~8Ns;dw=2b447t)XC(4L5ml=VcYaNMGd<781li09p56{gSai0xrtp;$%2071G zy9nX*N|>;k?3H%S$yZhDl@(w%;JKN7AWIoxNj~C-BA#a4N7l?sC$QjQsL)>1%SQ7z z+)UI>D_L~{0={zke-%OhxBDsXybl_ZzLt8_(n7J6*N_l*%Tc4$Zt3A0K@E;CP`9Cc z6J}|rxUC*b?N1b9u}Tx4>x^m~_0%*59$hEPSIG?_i-J$`_KJc=N|bPzt-~vR4m8nU 
zcEu98oC71`l+Fl-6#x!cCMb(1j#{1cjsfF69>ZnaT7VL7ZhZxOfg!R@47`%Y&~Iy4 zX%@s$zb;CxmuJO88uum=o?_gn(}ZYk?T%oGNg+5~f|2ft-ZG&aLGq~9RT>)`GhwZt zm=bWMjKKE|ZftT+H_5jo^Btl@gG^)Y5LjT2{3EG2oYJEbEYrb?SqL{Q23y)e^#?H& z%V=NBUM4wu6X@LVohb?SsrwzJMZAR)+uH90iB;cAXFVjwh4ygU@Wam-zCSCEy>5zA z(`fz_9gYDP8T=4@3(>mP`&L~rVM)E-`*f>~=oJ$G>E~=iB1g?0tjyC;$aq9HuoD>7 zI(efU@-!z04^R=%KAi5yNm7yburQPDuO6}6Y8xL9)a5Ptz@@)I>r(1#ls|LBb|kr6 z8hL+13g?4a`L06VtKyaP-bPjFG5VxYh9m{)RmACYGgV$I$KKacIj(^0=IqloVH-ph zmAuC`J{jR(-l1RKt$$xTCIzn_q=0Q?HI^oy6A>K>8a{R(kRmoZ1@CX87p#vE9xuZz zSozNfQ%byeDM^Z?Gm#$fGqog~ghZj@zMSOo6hs^_Ihxe0Le((ko%r<9?Ij}&E5(AW zEQp<*Rhmh4u4zO#WY{d!y&w0kPZvsCba8tPIil0PPli9U1rV{JziHZv`KgB8}2RrzK0D?%o>yw zpNCzFm5Eq_p#usi=dLzGzQhI3NBDfYDA{$_Ti8*Y%8t5IlAFgNu97feKjd8;ovL9w zws4Lz^3rfZY$n4U)tBuk*SDTDuAwN z`<1zrfea2y&t*SyHsM9a>q42PR5#f@^Z8{szj*k4u4`i{Nh2L16jm9Npe*_l3tGA8SjV-9+|O>cxfZsmeyoA{qT_v6?;*#@*{e&0O?}6!s(3pj-r)8#}UiWNnMxR zLXFo@lU?j@KCGcg6#P(ZP@~l5^ZgCxYc?#Uoi6u}rC!{PIE)C{pRC@xFes`o;Nnr% z*Sw)o3~^<-Ew(Pj#gRtv?wqk^@+9eW8YkiFn4&qd_xrw_0dFbFmAu15!(^8_o4h0D z3V23651rn9^Hnu3c;i4JyGC0Z9{mWiM6zpEl2Vetk0cW`N4mega+NI?JWIx#?LHgp z1(LLH=k61qkLAl62-yhEAVYe17y-h0#vbqn#AHi^o&WJ7*;sr>DH_MUKE`d`& z+)lzHs-kq(bFYPRYsocdChCG3+>?BKw!HX8k$n-dL>|t_efrScSY}h^&KpB)!@iPy zc?k>Q8J^AvvwAknCca0at_#H{{)#e>L5Y646bI+FPbO!A+iIYWQUp9;dsku|&1M@e ziU4{2Eco!wLx!1gU|;hO6|P7w!h2kil1s)#vLBgI#5}p(@q&2UVVuu1;Ru?<3Z;Ym z!d9PAYTb(Gr-H1tRt(n9TI+@nB`3lKM>fuH!UN zb?GX>aw&G_c>q3Ly1940rLfxTisu-wGnDV#+qN)FgngCCmpJCwCT|=k)0r<&QMHnz z4wQ`TY6z?3Os9^Yym1cHdlYMhatjPY?G;VhW_b2jL=&@wEraaM=!Ep@|Fn_4t+$6DD&M9YaHfT0^|c>-s3`{_R!FC*&y7 z;Zl0*H{^>9g%%9vZ9KhPz4Bf-k>(VN_pm?rIFb^5uX`Sq1FIOJulK8e0hJKmLDl4aCoZF| z$n22FF{B<@bD(S^M0>hLGq7b-kE|cHv!ez-;PQmzIE?*7oECoD(f(ULIluZ`V9Kb1 zoOn2h-XS|4z9cADAcTG=_&7+2IgNgV;+9Ik@O2dJuxs&`f=M1HvE#fLA}ek5$Y3B{ zyp~e2d`iSZ()Fd3MRJt*PFf^>`@5r;>($kxB{u}wvZtmc+e~qB8@&=EBc7C9i*#TS zz<)A3Omt@f9TYg2`5u(vlA8tp$rAjo=JSuQ;;c7#=EzIkm8+!3Z2|0AD?^4r(+r(2 za3ed^Cm3FVKOFuE>U1>D8fH6O^NLlUC zfXB0EIi}Je6lMKqtkJ|)N#^kfP-Om`Jmp$Z3x-;0ztjV=HWr9`OOce-@`i81LN1je z(nX-knyi{<2fn!|5JPc#EHzq6L@R$`2&-Qo^h|QBr;LLPyey6W;{5zxKIB901`|*( z^Dl**fk*!OB-9SQdd2V_u5~-H52x`?p{|y=V|Ot9&B2s!y*BEEmISJ+R4N#F%W8%i znqT1uiX@g@)h{JejBCjI+m>8fy(*Vod>EbtznmDbL^KY|lUS|V%bwB5)^ch~c29qW zbqz>=p^Y6ME3ouh?kaISO>vy9-Q`A=ZJ8#A(OF5={mf6#G-6L0A&-&)`y+p+>DaX~ zFD+?lk)!;k@M*b35O3*SQXw*8l{m6GbzX$I)m@va zx$9+Zk3*cKY86E1d7Vc>2yZF8zmE1%dk!X{tajIsxn<&O%2A}}USx6|u-VJ`rz85m zen$PP1l)i4T8s3VJYGTV5M5-3Lkislcg`kA84Mf71hC%D(ILGUifzfNoN%r7-LMtB zVA+x|Nn#E27n!<(@iFa@A4lsc_9Jf^SK~64m2Tf~Pm_0sh33V0-36A171n>nX5?qu zx^UvflK(IK`!>(`Mq^*&5sX~y0=|BV4Ll`Qh|AYnI+X!WFENCG=;9>gkKpsV$UNq* z)WxaD<=)&pjQcyLM~Z%v{JYEV;#VXX=eXy*0LYu3>p#X}&>(^>vLs%1M!0McW>C={ z2KDy8zHZ5iEiDFM+aZ3-KIEa&g7~I!5$sseO#wpMf9FpAwesE`UM7n4B>GyphtnyM zlWg@Kps=L;(~u;#^!yU2rtFV&bp7Thyu%Zovn#)ed>5m8Br+>J()I(?gu4)Zy6=C! 
[GIT binary patch literal data omitted]
zTvMxXS@yUm?>$b&878kt*T4@t5Kxk3Yn>+)#~EgpMm?@XPM{N3MWjC3vqyNz_A_?4 z)~!#Kf5K|+S2qj3b!I7XR;Bgj{jOB;U9qlqqnAKJVwUM92cjM@6u=4u+ISjKQwtrm z8KswVLdoi`S$XEW0Tc@Gjr#jdDc@p}%LEC&n4}I!hj+Y-7^w5U%OdiD7tWx2p|P;< zZ&(iF9x23f2K@H)scy(en|1I}kL_v@{iJ?_Lq}weYrK}XBv|w5;NU3tay^`HzE+g# z0K$&JL7UsZhdx3_Gm6;=NywVt4vKXB1kqizIV${;9nL9zM`WECQOj?rQf0WPa*OP% z`IzLJR#bLtN&beZHv^(+UIoHEisP_#IUV68*6nNeo<+T^G$5VubcnIDUU@&HU-Wl9 z(F&u_)gfUOrMJ5jRN&LsQumeyb-gdNMHvHM?AOgdSY<3F(r@Nj*xPV*mgJdGGf#Iw z06)FWypa$HM$*w=u0s2ji>^zT%#UI6LT}iY@L~C4PQ?8X*SJB91&WM;GvS)yYC~7{ zP0%38;sTJKXY8b9+ds+;5+;fmrxzuk+zk+!%x~}=2xE>*W7>3aPDnk@dbk_E%YPaF zf;n?bvJ4Wfn7g4hk?{M<`ps-H-P?%Xo?E#|(z;Im9P4{Da`Pgk9-F(Im!i~yJFg0D zgVf7Gij*u-4qk#6wv;d3NMp7POMkc&ebX4S$4I$c)a0JaVBNQ0`IZcjpmHXGxa`yP z-&jn9Bs@07zENr=uo@Hy7$@y|b5nJvX^SVYAeCotesHAY{&i{a>O?E^&d0YmQ3+bO zws-RQCm(DjMk~O0_wACZ8{RK4fC503bHX&kxd}5{6rMO1!RCUk!;iz4%!vA0hg`=x zDNJ9s=H31E3;jtva%w_GGgvUiVONEmd5mgUPM-sGi)Xl7qs-2Ky6~c zYjof|gfcA&FD^zaA$C}CX>#Dw z#ec!FG{XH|V&VNaeESj)eNu}Ha^}1BM6Wd2ZSPpKF}D@JEFF8H3#fLn6`NHQeyoCd z-P)W)7CQ6Xw+p#(iVL8K@nWptzm3!< zpMOlGN{F^wfzIgo!$X)uUaBj^GXUP9y;E-KM{p)0@zjgjfaJ=2(1paHW#qAqK#IgP zku06P$u5aL`mb$FAyE9e zKFR5qCyCj;ecU>cxKp+KL=q;CD|iYnlV2cfzU&`V2~>o9ka);WMNq?#IHHFRm@Ji; zP>uV}6mQp;gIcP}Hd4ynn-VtoVMZ;bs>R~7CQ_Q~m1cy}Dj1hww{l7lrM8T5S#jS< zz3h)xunve&;{92u=b(|+>H?IP;dHiH)x%XDd*tzuc#N-ZEA`rJ72ZAo3Au9kT`>Nh zE{ESGHw$q!x%`>&esXI)nKoOv!~G;n^-hy7;7-ue^y_$c?L^lsnA4Oyhg>2fR~XFO zD19gb_z#T;O=A=lz+XdH~%dpX%C=bjh8 z^U$2Mf~x&kbW37-egz%zK*m=*Q>lF-sah&%F1nRrLCd#z&%k=H=1~cU$Yn3%aNTp% z(ns<2yT)+NE(C&Yz+lYM5SLzC3{yd34>cd`;~MQY~@kslc{~C%?Cw?K-8Ml zcCsn~cry30YS04)`=L#f{>1m11sCvmj4xt{RdRuPfD)h0jz1lMUD1M_zO-Tlw%5(UL+@@XUlf*%Go=(M#Ww=8$S z`s3{9iJkub1%+k#uh$xN2yvfP+Urv>wXdZ12aWVo$70>K>_eZd6Sm1Bgk@%WZ@U*i zlp$a2Z|}ruJ(;RY5Pi`sb`Im>rtQn{D>;&UF`Z{6qDvfpB41rrM;cEUEAA=CUNIvV z3+7OMJ$KG}-@W!%+hzS^6Q9SDMAwSXw7}N)G4^3&mRoOX_PSh$+$6?_JbkL-A#c7# z-#ulh+MJFRto+8BzxpmZ$9mkmT$as>sQ+G(nh;9qsc36y^8H7?TdfZ?_Q9kAo6BWd z>sx@^GRYn?suE%ms*1cx8l$69U3bMiYe{tsrkLjUs_)g`8XP#?yLbSVf#4sqq_rhw zyS)8!>ty;f`DCIVf=sW^$hSu5%%1yM%<&o>HnDxt#*QZucLVaY7H(T#qZb#bzs6Jf zlfcg^VLW_Us815>Dk15s*t03u1tBo9USeygn;CEv7;8SrWgAF$g>c!`lQ)*%U4I$+ zNcCO4ZX+NDwxqZi0QmB8fU*rRqsaD9Mv^Cj# zd$qtoV_2?@$8oG?Bx!WwmLR86xs<+cQd0gmuHcD^OJ61PXlLW5)hG&a>s3#cEczb7 z-}7h(Dhpz=7i(xcSY=tXz?|ltgD#)hfOl&)^2aEuhJ9s=ht(0z5)6}LJNK&jpCq`X zN9xc5Fz0~uv2M>_Ou)b4^;5z2ARB99sNyYbaMafsaGBD5b*dx~RJL()&pP@6Qf0AS zs%4l1H_xG2WfU%xFaCq*CYw3qioD(5T$72NuFT(s)B>r1PYG0x0&+*xe#&c-#&hf& zsogVmcV4ht5Vqp&>=Baej!*No)Oks=Fix~DVJdU<;h<5}N-QE1r82*^*^(R+CRArjFZ^lq+uCh2e)W0$5$7?Ht4|BH0b4Cksj;KnHb>}E0*wyCh(fYKR95@ z<*VC*hZZOxUtWAH5!r@4XXHN-8`%<%GDv5bOu5}g8Qx;6zlxgB$zU;&HG>T#BoXBM zj0b7ZT9!ySKa2VvprtqQo>W(qB5{bM(yt*#zly$ZHc*r4OMTECS&o~G^b;v^VtiNS z@4N0avIW0h_oB9(x?RTwrV{)vH5xJ}x1~ss@w-#U_ss4G4|{xywn@$U{9~L|$}kP7 zf?UgUnk+~V6s4jZdS6nTqgL(f-u|F9a-DFRVwE;9h(m!|nuVFRX~^tdX|8~GV^k8Yi4)^G2aLo6v_ZaEh$hs0GQUcVy zwsQnelMn2!dFL_ezYWt>-aE1ry?7R1)?>%dJdxvr1!%Cr!1~PmY5-;NPOO7?(n5ld$LPg6jvD5l;9eV>qe?6vl0l3_q??59@P$IFjbSJTygVq^A= zJW%|@9NRKr;tdfey|I%sdbXkR<}Byjs;nFt`Z>EYy~Jgxsu#?@DM%20eP{TXHgO>y z4?Vp_`LWT49MKczanQ;r{DJVAzg=wNu|cV2U*8c#CBbyxtF~a*R}7~kq}cA_rQqy)2YtCp|&C&xd56;5k_{9872Q3RM3}vGz2luxo!IRelfdz*I9JFLD8AK*dD4 z-Ed6lVaLLCL}**rx$RcqVOdpGi{~)zwACFN89GZ)2tCzSq)vn}8mK_A^kmR8S;KEp zE0^m4l3}G6ZlCoguvOgF-`fPGxCTK9B&iMRNenuzoMAu{`A|_+h2hmqJA_JzzVY|e z-%UirO6((B5PgMW!KQlZ(82G-$^m9T?Tffm0k&?IV+w8$?WH)E!4=#XKqz#km}^*4 z_{F+sT$pRE-e{_I+iI@1;I}(ty(}LrBl?p{T&e|;Yl3l8J}G^6v+GB2qPJFQf?ND8 zJ}BX#+E!)R&7myL9>h6rmBYkzw_?k=`8?Fl4ocoS8!LZN94`_r*-^M>Z1CMSf@@UE 
z(HMI+z9;tee2VgpAWI4{uk$cy|CfYzQ{?)ET#5v1V+(P&9$Hz)g7d-%v2*P)CHL?oE=;al4 z*fEn9f5tf(!SU5;TdQq(wCnn6+ZWld5qP0o6Lot1%~)rirYK<`dJ+)(dUzsqTx=aJ zat%rEF3_3<@oM>?j4mSF11Z7{RK#%!Sct)&r`z=x=Z$Q+En=J3P9FIe-$M!dsaK}_ zWCW>MN);BU6uz}cGFp%Kxtda9;-k+$hQaqw&AvsuZgWHpI=-KAU%9%T&N`^e%r9RX z$d(v4Sm18Y><6pc(NqKCWsxEQGMMpuiS5-OUzJGustlR-itqIPh}rp3XDvF#Co1PJ%dS;A=jW>#xqhbc<6*}K9>x4lz2Ssc_A=R09*(39<@r<} z1N*0@j)DKtjP^fY&mgAiqbjDJ8lo?GNO|%!TIGlh{H@S7BnfMoV*LkaDU1Xw#&P4* zyJ{jsrt$E zV9wTmL$qT=0G`y@L{ zRsqyifS^_a9dSsFMa_AaRiod^nPjn7?@-Q*0~!vUhmF@-PL1o0%~7ZG>%lwl@nJ(_Q#F2Jeq z2*$FQC<(ITzSSEZ^@$(y7f09pYoa@?f~cvA)hKAkSR>8k=P8G%s?PJP)-8g{OI+r@lyD)-6b?)H#FZfob>`)3u7|0U0oW*#DHsdM8J;)S~vc}*qWRwC5j zIZZU6*O_vF-JSw9Q(U|}FP`}$(XN0}H2(^KpYy{bY+FskwGE5MF%7sd&Yk}GjMJ`4 z76&d>DyAZz@$h@nR-6~f%0*}+>4(@k+!xx*NiT_=jT_^#*>oq^=z~SfPYYi^C`zF&R1;mxvs~AIWA<7UkNK=vO=hG9rbI_%_nH=}7T&GS{`6O? z3e}2sr5eK>e(4uPM9%tgU4gx&-h)#w?waBe^~C zCFd*h+Po}a!|CE8Q^oyX(menF-}!&*eZ2iC4>Hbj3}1EB?zuVZvWot?VOm1vP9UJ3 z#W~TyXGQxnMkPCiHjGJuKn@4%c6IYWeAjOeSP8!Ujhz3Z2jic~{~dVT*J$zl#>x?M z5~XvQwP%>;;FY8zY}nu^d;cE*#$XbmUgcc|5g0BzEz}H;9 zjlv+arze<{zGz6jRfndUOu$b+)HTzpQdYGWM6Pkz?^@iOAA4$R7f{O+oD8-@+pT(6 zB;2Ig=I_Q8Wo;N81c1K8a4GNz`D@Z6z5$|1*sK4q4QvPwb|UNHFub&DMN-Icc1K#l zy%KrBDWY*elXboGMG~~X%!sa@Z^b^Gz$iP12*ok7lb{BdIqlG8)Th*fmydnD-3yjf z3XFR#YwP0j;^vFRGX_=aSOw28tjQ-6?i-aJXTWrQm1$|FKM>~*0!mUq+95bYD9Y|6 zAL`qBa@W^AboPOF7jXA=V{;a^s{A*6M)L_o_hX)sx_HQq0=6h{yU3t31^}zkeoMH? zcpUw0W%$+6ESLhj1FtH&Z)fKVvMxDb(tRIDAu=jUt|p6v_X{sjV|raLTFNW6G`HDU z{z{Qdlc^uD&6GX{IK=-)95sA^agGVA#Q3FUsrnDjn04`)SPLiAI_8oy;MpG>9{@3+ zkPQczA%Ae7g;)Y0xh9P|X<6lgjAnv>!EwYo00dG0;fVdmD0W2qb{RiMChko5aTP7v z2)}L6SJKO*u-K~#Cym?pDk0?H7>h)#3c6jr=vEfbnLLyd!&uY@e^2X9E$yZfcUXV( z=`fe(INvjFPJyc81#`TMf~11L9EeEhPB85?729&T(lK|Q-Uz}TxC|p%6XTd1zxU|; z)dMjuCTWyqo|VO|gJ`Nl^LsQ`&$b99L&>+ubfj4CDVA@*@L4t!FayB~GE3auKPyV@ zq?jfwCeN;S&Oke;X5TOxJz6|G&dy8%yx-1xZfXXvNyF8`%cZO58wmF#Kx-}3UJUDt zXgU4*F>}YJb8kKJKRC-O^~=a7b_vb#(bIvmkX5TVU9F>~t)=Yd7>%>QL zr$#~<1ws%+9+;J(AYb5B2G(glUC!c-CataZpAUL(N3KbvBG^g`R|@*Gi`iINruzTj z1Z>)teD&7&{nw|$Pk6@j0P_J|wMrFS9f)YS_98Pe^^P26V_a15EWqG<+mjE4x2eGy z6p`visVXM|`P!fI932DjfgTErcuJJX70Y&* zR9&!qG1d;GuqD-u1s&)Hc9k-JGCa6&j)2BZ(%GAr{5;qlZs=l=Le)fBo;2!}8r$a5 z1=przP~J?i+G25Zm}JOzReunwspgZ#Zzsl$h}Mbs{Vf1B5&JJg32#&Y9FavPA)Mcn zyKM}bC@J{?Z0y7fZ#&0G8?MTYllR*-Kbyv=6$rr)%Cc*kO-^J(?jYsoqrV+o-VvDm z))pl1Q^I}y*!`(8yW|{@0`bl*RxZ_E{5xhV0M`~9ULJBQve97m-q1*JRoc+=*Dl>t zo?nEo+vFddiozjPH$}~0z0%%B0}zqfMbcDx))3-(vu}3eprz8MPN@$vR700t;30al zGOhQpdsW-B2Up1lET|E!sGNFK)Goo^`iB~DV6bXT>vWuLJun?ErrkJP`M=(%A8-aK zJ-w=kw-X8(W}lQAs1KN|$ytnSq|bj@ZYhhd+hkGkVz91%kS*V-X> zmf{wMk7Z$z4tL-Ec5+U)BJUoRA>%#Wk&FMTZWLdBZ>VqS-Kr5usDvH@de>O!$EuP3 zo_wAA%$WWlvVzA5Rv8(2E}Vp$OeA2)xAk7vX>`niV}4)u6xN z)x1Mmkw7Sece0Io_6zgCE930!FyE8=N1>sDr@J_1yq;knQ`xTGw|JF0LGADk;0FJi zzG`9vZfJizCgPuW7;v07b^=`vhr`Y{B0^zob4)wLdRS{TygW-)?#YCJy6qLOWQ@|y zcdVfGD6^39iwQzm-fSB`42jB8Xz?^~L_WVqS>AN5n=%Y=$Vodvw{!K8yP)1=5{pB_ zBa-_U8!Pn9WT~wG;6+y-KKK=-e;UcsJl`90ux`W<1oY-IeV->Jc zTFs*Ns%T5*Gxk!^Z5Vr=^164bLc(Ce?LlQ!tH+RyR~ka6lZ1MnB{>`;11*}Dou8tO z2=(fS#{0nP@KS=#vNjx)q)SMgU+rYLxDUTLv@TYbOB4pyPEf(%%ZxUQVs%x;&9DKjMdN@JoO9 z&`kD$uYf9(&r5Hk){iO#i@-w${3{<6cpD&5$+)?os{ejtvvex0t#-o;$c^-$3P zsEoj=zqqJt*yC-+KRd-&WW#u)YyJFYBcdFBdxr|=lT}o5v|svm zW6sPSXDevisPz-GBYDTGa;Xl9-JMQ9OHVr^bP?~|WB#D_JLs;+ljzSh_T+QszIQ&M zuGfS^zEjMpM8G?hF4S%C`wQs5zGK9jaW)bFiYHk0Tz~6S_bPu$K&@mT4Nh%;;CCY9 zRpsT#onE&U^2vXJ3EI0GEmp6s0ri!<-wB}zc#f_X4(&8%n{a#|Yx2W4^kEz@O-y{g z0gYH^H*pG4BvKu#1%>YeGvnu7MWEFPcl`L*cqqL1HkC&1?NRzjkF)8~Oj$+vLuGmGC#R7Sa;7rayF3RY>~WKvt}87Z=2&e0y_dR 
z554l$TFR0c%pWcEnmG~6ZVc1@Z|uEyP!!#}Cys)GL4|d-@AKn?e4GcR=s~TJ>6Bzbe}$b z&U2pU6VluVg*W2g>3uki&+AN#xDSq1J>qs@<8YS@=454vBFf6BC6ab)MKR(8FgkO# zLXZawtar{$lH32Kc^Np*7{K!0R~lZPV|% z>#V~=$Nd%h>obYWsG@}Uxdy%BqMgfwF9+o5%VY{GnzeSnW_yl7*C(s+?u@Y0^fv-3 z0y;|TwpD;ikTLp7#0x;#2Zy8G6aepzW1(y5#sw%*&xOe9-Q1~F&d{mCIKj@*mZj~F zR@i51sF-YGzN|5pIuQ5|p2Z(L{8=w)j4}WQCZFG@NmH(=*5Tk1?tcF3{mE3F9ic;7 z)?^2w3KoSM&ff%)+p_lPKOVD-XO!v|WMIB32n8upU+M`B8?>BLRY!@%mkBc50&G?g zFL(}Fma8JoFkjJB#^{KkZ>Jw7O-say0 z%3!Qca==0akc3e&1r+Aq0iYZ&PzQ>Ou|pLL2K2jXhUKX-bf^DEQXzokdpWp89V2(@ zw7hql;nQ?VXrDZTfuSg?$uE%hE;$&M1jUv#&(Pa#xrgD6_m646*_Fjh`Pi0Aym2=<^%l)fPb1sgr^ z^#ol5<7G^#{Z(FY;?Qi6q~@{Tpic)jx6N3s01h(1%sJIIVBo1Jai-(X+z zL15BI1jKaW=tIuRXFU*6@nGJ)wUNt5l*H_YGxX{1=q7n^M z?kAmKCGPmS(J2HSXLTJAZ1ZTy{@cF!Z;w%WpbvLUKzCfAa)D2*onmWNZ&y||vkWb+ zY+4`LwsBuZ=H*|ILEfy^SWw#cHCgoNb|9iBaiV! zWAUEbLQSG1^cRcloFwH`s{C;n{TPaM8DjM za6@h|7KdLg5l+ds1u&SHA6GHGUPei;3=p_amR8ukd1IG^60V<^kS@v;y+n%EdCnA9 z@?#=LAOJ#8p?P>empL%{rBeVI5TSZsi#fI{maZtz{k2Fd{g*7nzn7t~!Fb9xqD5*< zmaYZdW*7RRbALQ_h}$J(J-b=j#JcFH<`d{Mz;t_Giq{@s7j}MK6tJptNT~4dcsNh9 z-AAF!;1E?%&!~*Qfk~3dOMK9t5yu1g!4Y=`Z(W0 zv>S3#+gqG#6~w3FqBf=?)*RGu}8ky9=r~Ww_y)N z0i9Y8HPm?*(~41pm#YQOWX%2t>io=ptaEkR@U?d6sv>5R&^UCZ;=%?o)CD?uc-(Vo zn7=>)xV^1^oIIf0Gk#=y$(H^H&x9%FNo8T$VWHhnaYGdh>oq9{;Rx3$6kh*>CpejP z-3CAxoZx>o7*jvEGxS(S%uCky@|(-M3!^S>d%t@Z2nHM@G6$8zbhW-Nc+CS^F}Og` zc>$@$`N>nci#cTjNLj`K=J*inJ7(90)8s-H__kg!f~G-P`kWb!T4(>Pf09#l@_Z{= z7^=pianu_wTRx(OzYS1X->_rAH0Yk~{u>stn|&A|ZONssuM#8zT;k1t6b%1|z3g)$ z>8(#cC|W?%81JgHGC(7i@s-<|qD76{fJ6Vib5gl012NYXhaAglC3ExOkP7#v(`Wk1 z%-34l{1|qT+%teWM&CM7GkE5L%OxCj>Fo!?G@jqWMod?3P4l87J^M5sx3h0czaDz- zs|n%vZDw-;Rr>VTj)ybBTidF>5N1dh7X_Z07cU(OZXV@Lr(Ua%sc>feIF$`}K_>(j zTrpC-Vrs5EL#CoW{H#UDq-nig7)jBd3z`BXml;^~0Sl>nQp!#b``{(%cO;wMyTj$0 z|KOD*oQ@x~F5qpyN>yytHyf7rnh}G5&PN_zgr710I`W~}o-BGcDN+8zv*@!d;kKH0 zvT>q=c9GTvwqvj#xF;ESAe+9{o=1x^uOJBI#8bs4t5U$ z8u_XT>34wu1lLQ0>o@L&h~Orwe9bX5CG9En$u) zQCeoIT_rk5#vvfIdBbNwRKs_W>`nV0JP|XXCl_u}_)}dMF8a34 zz%I6y_Sj%!(~9sZEYWt-A-GTmhVi*_>4Y4-0JxBhn%6v2JFQsKD=)dbxFfM^VAz=D ze&Xq_dOg_(+_=SGaH)R*WBzyhpx|5I7Ri;fOH|uy5mPqp@rvh3QcR`?7bn=AI^L#1 zDr2F?b-3@(+CNfr+c|D}fv)U*hL{fX8t+?8?|*k#vxW~i_cP#!Dzj?VewKi7iOM)c zUTK|{ketzDn0+X)4=cwiz5qZ^Xq|bcm#)u|*p%<^5Alsvh1sT~N3*WTgpxt52y8Dg z(Gr(*xUuBphG-ybmFMW+^%<8425cm0kHnFa9@l}Ktt~FzMKm?HdMU=SXEHo=+8qqY zcki3^cAONtDuZ>+!(O2R^>XvJN#SF+njg=drSiJr$STkSIjeLSV&U5ECVq z;>y8`HQWxOmU3R=jtkAMqG}tLn;EY(v`*H`b?)#c8?V}kU5>pGcw9#hv8MbP3CnglW4t>d zEcU>fD9tP`A`i<@pCR<3Q>UkHfl9nBDk671w{pY5U@CCDny#Dmiw9BROCKAK8wr!R&?H1LtwX&<^V}{Ft-@wx1ienB}Kfp zZsCvBkVu5NGvFHQUgL`g zbl=y-mK&RgQ5MpC%guc3$wNPBFYX_`K)R+%S`9we07+vFY#9Uv?dK4O$fu2IkH0lj zU&bthrW`vC-o&$wMRFJ3bwodO8=QcBs(OZC2CpHJj-9!{3Z^ zK6|e1?B-LL8OKJQ=N{wXyyB;~@4cv-t9=h1LzO=WUJRF`!YC-MX&sZ;d{G<*KMHMt zF0?A>y6Bc*6H;Ee>o73n7mvG9agZQnw*+XN3&qrLiCV>yc- zwew8O8Ilv(!hc*zRrV9joMKYf(DsY!he?giLS7Rl_Huj!E&XRw<2gi}j-% z4aAF5`1Uz!#%7I}jXL4!3_C)8Q36^c>H-0!%Q3#Lowu`dDoMV_;Z9}SJ4doZz3xRR zgL(>n$3;|F-W4fjY7;A3?fj^_ZW3xXFpV-}JNZSy_U7fljXk{} zJgHB5qRbM-j4hd0*%RB?F{}>j;ocn)keXhn-^!+;6u4I@KA3Ij5AOYR*X~#hu%$%P zH=Qw3RB1k@KT;UUWFS35l7=mJbqduGZ4lr|fTIhKse4WOPOaI0rB;=0ASN<2mO%-T zme%CwlT4gl3_j(*pH4Mu@C*eVJW_uBBkA_@oHy>9o)docW(0e`>2syIkhx-r{Z$8y zcHUFgYKs8s`8m9NZ7od(&A4B{;ya9YDXy&pcSW4p|F+GdpoHfvI?H_guE|4qNPHYxS>6?YSY=Ep2@waYR%?k2K`>}t$D zno~OVQ7^;Gzd`Q39U*F@!SMW!`=^MXwBOa3zMUD19adb-pSQ4`5_evLYrOb|oDUeb z7iJX3*+;eTB6{dlGf5*Dtqgpp+*%Pf)}_8r6sxfT!ZYy8-llmn0KAr4oEpT%prTB# z#n165*vspuXhMef8%aUO0Oe%ShtH=gDiTwPn{QT%`~Zw?!8uUR7tmqT1ULRiFHVZX zrP=4#F5c=n{07_GeyThq>gsrbbh{PH=Q=%f0mn4z&g2hlC%xVDkcyM=yVIFFzud^t 
zh&Im7!gTk6kD3`mgj-7SQyyM2EgyVcEDhSSA_SiA+MK9Gvea^JTyR4l!L#UZ3%^vGQpN2!@mj><+ z&;Q_wPE=nju(06vd>`(m*i}4z^Ts-HgCDL|_#kugQ%2aTv|k2L%p5-kd-rl{IJ-`*mbRv2MAQ?QOzPEU5gf7&6c zf$tD!oSqF;D{&63t$cqavZh7pUThycf1?h~^kuI|*uXW4D;3Na6cxG!fo}i7Bk_8) zgd}HcjN`4Y<;-~zNLKCA)s2+Ar$5c2O?8a8Vz+kvJ?FG(Eo-2QjTjKUfzeL&)sA|k zzs$;nkuadwGWY74=ifA0n1LuRv@-B?n~YOU5Np&GuO&4 zyU4&&1o?FT?n(_la-t%Gn}uZbW}^n?ctz-^L-vf4R11sE?G5@UXi5EWfe?LTMysYv zR*{1b7n)**?_gDDcJf~QAmcaW?Zf>F+iVQnz#1Jx&#&n}DqJZ{Q`_i>dd4lV0Kb^* z)Nh|pTCV%K-P@2vfF73jQ6~OX$Q~zCS;0AyQ!X=vcKh0cu~Lncae>_s2%sRCYeJjvxOAF7Y;dN2LO_x9`N=~_)Gqq>;P3`w zj$<&BX-2(|j&Dx(~M`8}!hsHS{`>XG|7uDEqxRK_koTsyi7uUTt-D^U= zY-iNclzS4QRdp*VMW;W=AI$dIDsHe8Tc&;`n{%JMB4YeRs3Fpwk09_9Z4UL92r6xs z1O+#fY*!zW;f^}@wS=yf(aA?UvG!AeW9x|oX1_BNw;(UPq%bd+ybEQ8nkti3R_($FWMFveYPT1irK%_2EIYDs7P$2Pu(A zaf`y7KnkyS{-JA+7_zN(?P`U|4;kN!9??D7_k^ZM=vx229DX5i#6ynZ<`UZ)seGuO z6t;0n_fms``pT8%9^Ht{6S!CBH)sio=cfMr5gYu$RZlBR;1+}udcHUrbigMyLlDxJ zV$mDqfpCC0SRb`L;r{GVD+QjrAhW2cNX?^=(;I>eg*DCmQBoW%fDr=>I+f zh3TKb71G|*(VtcBuGApoOeS+0N9^~q2u&SSnLjG85+BG&V4~(K1n$;75i-g$U3m<% zLQ{|#DZ2BX=p3xGPK+=!ZrnuiEpyk8+a;grnKK@aGjVn=f65)BX7z5@HI23YjRlNq z&sCtyRSEE?tA64@C=x$1^)53HkxS}CZ$I{V*@uXkwUBlSNd&(@JHzZUe1yfH8 zvWYK8vqRT(o@5)Fc}HoR-R7kKLul^6g$ut-d7)g(zqDe)2wsdc~(&H|4 z?6-`8RNO7{X!h{MXwO-Q@8&)5(0iIIHpO+6TzenWLFT5<2W|x@Aycr`3`bHAycKVQ z+gf2>mV&*0Do4vWWTdOXaC8}5?e$79Eub(#K^w5C;`wp7R_3q9J?<6>OUcPBN4?*7GqV8>HSk)6Wepb`JkU-+PPYg^%$f zn0IB6>Qkkw02!k;_-WAYEl-9sq)f4*=cBv&2oy{ zvZP(nk10*qmbS94n~V{l-PUATuA4 zR8_5ETYN(KT)OaBaK$yirSPmK;sNnU+Cp1PSakU;O!bC3Ttw#mP$L&6?q!UIc(!*Y zXh#X@bBNe;T2>RX6@l@;>+dCBxrQGuU|w>hz1Jq%8v^G;EtXgvxO*i`@zb=utv`Jf z8rlrePd>>BztY1XHGe&k|&l;6lItu#405QNbr1Xec9r{7;szJa{L6Hs3(E#~Z*`i3ntN!%BAmP#gDwUAU zzHHT!m$vnB^9_Sbu;6_xI6wWX;0?Tk#gq3G1YyT}a%!XN)oKr#U&RTK>hjSu9|=Mn zu2ZR& z>x5Me-ZUAOOONTcwqPEE<-;_-M1+ex6;%~ScW+EL<~~^(zhkU)Y;D3n=In~jpZiJb z2?inOX_w7rj#eh;Ym*IZJim0@a^Lt`QAfM-X)eiCnxfbP`;tgAYdPw*^ae}H@WJD% zI&z5l8<76K*Q;|;4?To>Z#=0keiYA!t-8K;f?7GEO@fkVPKqusCp$-x0Gaq%&TX+2 z+4Jji*eR9f3}#IgYRSbHx&eefk&bJpb>Hk*rdS=j@a9sE`yH63(_UK(`Lcd{L;*_m zT-c*tq!aS|D%^Y5^f*mVz>n}LsH5?vXY$Ij`QwH=B@7D0Z@>F*YPRLecAZ%x7=8 zuG*}$z$N=%I2ce5V)Uv$>CDe`uCCV-pME&Z3^M=*X)UFHB=Es03#7)g6qlbC1{xmL zA8XK}5ZA}X1`Xg^wNm8>=BbNDd%<#IK>}4V;-|;CXeDLD}f{B5noFOddZ=}$-<4GYwd9C?Bx0fLSHs_Bww|Zmk9wVrhWhsV$#jyUW>pjKkgFf#6*~W7VA_2 zoa#At@8n^;RHl8>*(QFU<0ZBtb8n-(OJ8&576ckSYaft)J;zygp@pW{-B@`u?r$n7!x{; zp-wtKCG9>;D^?fZk7Mt6bbmuZRy%|!XY?Oj*ne6=`bQ6bCvOnYv%cz2Y$0=ULnCZ= z7r19{RVLF4iWo%wE!Sx`phk?A)8c-5^7HBo)6RVttd(KxT%>#&ue`Va(%|oz<@r4A zl5)qp8~c3;2R@I0gr5&4>gvN-5g!#ZRV4rdU|K3#`x3CNpEY$69i#fV%}N*XQvdBI ziT(kC2O8aoSL`UudVZjhc82y$ql{!F$GE4&;4GcjXov<>?u;Mfp813>yMZ)6${4_P z{?ei9d}*WqV{l{g6-It}&J>xHJL}g*{@7%q(f?3!*LxXuw=ZMUaE1rJ*I*SOly36B zz<`$Q$%7v#==^?3-;;)rN4~BstEzzYepMj+S)#X2ks++>GKP)9H~<&1TdK}|r8)lK zg*HQ0Wr61g1ND; z^H+S&JDzsDZEF=;QKv9CkJr`sYQ*D}Vy7PS0v47p_-Tk>1Eu2u0oX)Jeww=Qdrkr; zUGe+T1b{-It9z#kQ9o_JG60DA4_Ai&7yak&@A?1UrT=orSIP8&VsHs#;5oB#_f2^KKfw5aUF7>;MQTgGf9Ixd{F(R%-_kUS zB{EC;KGnrrzrzLY@NR|$U+ z+GCwzI}kU>Y$iY3CF3n=^9J<~NRsjIuKCdUGGMe!e|WBPsM@jM4)|(%D1efYge=A< z$qV14@96`T+)SyT|45!(Yk{ zh%#e??u&i(Yk0%!Sl9MZ9}5{S;}lpwoHGMuC}VT2fgy7`Lk0OTB#F z$0?L>x8LOq7F2o90w|n63-`Bq?O z!Csa|kSHlN(FiZ20!*UUAH3Xw0l!n-(B6lDg8JJV7%&&)rx_IVL00JU^mN;6;0aP{ zHUOoJGDqEnkwB@1d#leDcCPGbV|r#_r#zSd2=3M%ga|8o7@sDFt^}wz6bVk8f-%@< zipO0}zxN0z9C=PhI5PuS1E}vXyywoN#UsbV9|8}cT_qFmwRV;uM{`eC-OS2MW~bNy zH0{lQj;4hqLdsPVFx|^{y=`F|w$wg3lk1av>Y0rQwA8kNuUmn>9H(7-=ItjkKbW!Y zK9Nho+6*1~H9DBDiGiXl>st@gUf^qC_jG81W^%!w2g?a5pDdEBy0y~KN3IdFv(vJYu{6F;tG z2OIJYTd1r|T^_u(oko4=8tP&pQ%o`=J>K_4syd6_bUJ3g_j&A%S%H+}2=GSxeJ~Tn 
z;RMCgtgZJ;rgQVfl{oY2J{SEXje>n(5#rhT7nt+kolU{pqPz21UJOHD_a{w#7hhLg zI6;aR2%Rm4d}yKeZxURq%ay-*2(9+LJim2;?5o!-NW7gQQ3Kgm#|~Ps!Vc(s0w3G=#l0r2fecNp(Z z@&QW1E${2fZHpudDVAKM$3vMXcO#+54&NYG5S24TWYV^==40+|86$pF4j09_@}_Ul zkr*pe_7Zky`{+!YsrnBd#ZG=}WXo~Mr+pv8kpQg#a-svUFt5aw5~_OSRP5pO{&=5b z#|mjt{avdBpLn>{=Arr!$PdDh)Hc+*h@w%CG-jKz{k(w#N*r?^eb@4s&W8YFHu1|! z)cxf|T{h)KCx)2lo*pH$_y1xQjvPYxLmn-oy*^{q+kX}5ubS8hics4%?QOC>>O{LR ztb&^zZolmIIC+o?wijNz@L(Juq4Rb0M7#r_b?DrG4p94Nptb)?=Nx>*4hchK9$P}D zWY{W>wH-eu$mpe;VBa;8OE-5(HzPx&tgm=6E3%2IFp}ri6NU-IufW98bhN=X9)*bqVg7z!{vIy=9xeZ#EdHJ={|*)Yj+Fim6#q-b$pQT2m;kpsx07917<)9p~W$t!mPLhkp;I`q{dKopPGTa~8=ruCpj%@+<_UEoeL<0{( z+>7rz1uML$moZV%eY>JaF+{&g$&;-$R)9R~iWr0WOc4Ua;w>t`-_QKLlywAY$?`FR`BgH6)P6F zvzmOSfpwK_lzH|7`n-t+m-!&{=(BY+=9in7COXbzNB=88BTFh7Ju{`d{vkyG+UB+M zWk(hy9WJD=AYtpiiokt&;S`^eYkU-WZJofdOe7nK#p)1%t=!ubQUV}1AICjs-S=UZU3v!C+c`p zozU78P0~(QRVoULS&AbhKyJgZsQAtgW4;0 z5%bPW(V!K`H`JjTKPJY@>N*3i6`W=g&yl)taV z=m&=R48cRsAWcZ7dHnK7insDIkxwZe;)-~FTd&*C09nH;(K9P_dxt~g{5;duA3T@) zkSk&5#g%@2l71g^m((8%F-L>D{$^*8010PI=L}!T+U2q)6RsPgJ{juMp9q=)7PMH7 z+X9VK0EZSeDun>(eUWtW$S-*Sz0b?E3g{r!LI3Ed|Fb>yk;NY3^9PJrDeS^N-+*KF zmng0mqBq&GP>HRRp|*TS{>%CqPIexkAQl@ZzD0)RZfm)Ll_;a%YgMe&8QqUa-4}cO*rCj6b0EYn~4Iila32ZGO|p zyMgn6?ThZ0LD}%2HCtP||AlM_mcx3;;(l88>&sbeI?{ICl=D5DB4*<>=<2q@4Z2^uYp@>uEfG(h&Hr@tct9L7^uyq%y>KG7c=nB8N zxT-DD=c;o(()btxg6dU50|1-kDp=n=$cwR8XV2ZPs#K4Ygus zjr&braglK4fg94m@?i|z$zkHKwnl#bT+4#o^XL`@j#P&M69wWdM^+b^iu3a*;cB6s z+W~qyv}bhYu8*~!yixuL#gB;Fe+AI!5UCDxv8hJ(Ue{ zT8<)_1M}|9%+K=P2NdtkR6?bEzk=sr@gAt1u8ZBC+SE7T!&dOlFq0{+6NFFLv=Es1 z&@~qi*5Vis+!{3;WgJI(F2xZZv%Pi%Mco}he9hV^ek)K<4$q$RUd$7 zy$<~O->wW0z^j72Kxj}i@Y1v(XgW76Wu8m|3^z3ELk{7rcu zAf4YV=S5foHFm`kQqcmW!9$`w11P6T60`V{kjqH!}XbRnjY)Rb9o=Vj%3BM z!qYGOuAUdFxI3pcjG}PH{(S(9LAY~Dz2SSuhi#LI)`uSBy9^0olJG%UB@rk0%5%&m zK48HtM?V*>g`cQkFAxgI{24kE}_e+By;KtX}xbc zrx`jlIT$hv8GQmum?#B{4?%LLEG-gE4WqoLv<4ok-2)yst;bIKb^DfBBo)0(ogj#X zTQY3?AyUE*VwMBs+pJ;nG;~ip^JPv?7?aWAEqQH4v9~#J?tk!3cS>BC8;6gd$2M+s zqblXLs&LOLE2|xk=#|hhX#}wzldVUXkM|^^QC!Tf$T9vewV+ni*XaN&iQw@zLDr7~ zrT0m18Z>%a4?Mw;E%7cD+2(kb<8G` zGW0P4v%$CUY}OPp;OQ!+8*4Rcm6~3;I5;P$r#k##%G0R2TCkmygM%*Fx8>*MY+9SA z=EfXEZPfT4TSdJiDR}`;pwaV=Q{Lr!+&`l!Z~D(S0?5V?C$yA9FnO>dLm#LmcHzu< zlBM^Dtd_;~`#*TZq2boc6jtsfq3M=fw}L3AU#XH(zNON@)HTC-dta3{m10pX1(E=B zvkU`9HoJ58y%5QIe@ATNu<#s0=HglNZmx@n2q-gMj<uzbx5cR^X7%g!96e~9&8V!-CPt5t2%iAdE9b&^nTKI4 zEu1f58DSWwrbg=8$$EJcd+WX$vF@g;tW)IA)cJuXO8{H;jc{8`_-JYnB@}Z~L#mmW zo9Fce{VY^p?fHgv(<&Y;ck45p zBb2p(}`=_p03%S$62GR6+&{idTyI^#*#Lf%?0?KCiHQdQ&;9K#wQKkj53 zS?k2`WiD`#zk`(z$fF@s{gqSHeel+e5KMj(VnDDzeV$z1<@aHz0%*Z2WUI zhyr&;aZ=LlkWGIgqobL;_}0r9z5gSGi=Kdd0Q(1TWxT@(6W4M6e4f3Da?d~dKwNBV zzA>ALoFs*nGj|665+E%c7Ump%3o=9*D`IpUR|){5N^#ZFBl6+`=%l7D*ok&g{%ktd`!wErlHRe%sVkyUkQR?rfs+wmCcH%~>=#I_zKpT#-pSK${%qzFhHuMATB@5RP&M_W%+Cbk$qCYJebg3UJ!mk=yGj zT78B?^UA*c>f89QaC^e_2aS5h_iZ@X8*{JIowdfLyj=>#Ik>|oM}?1%H+yo|2;nDM zgSR=?-|n-izk4$+YI|TDo&yy>V6lw$?z2GX7rHOehg)Y?#hdiehfvsz%O&g^nnvA? 
zt~MD$YZl_Uo+H%}OO3tq->AAf=b4ra`x>d!Mm|eKd!~#g=d|mv^YfX&2{V z0GUw5F~+ex=0ywfZSL#PfYa|~R}}>0DHDRRrEe7Lp@c0GBAHoj!cTN86X;&xp{qD5 zJW1!p%Q29G_AHEHR*bS;$=X-|UA283Oyz=jL1Us^IdK}y`JsoD3M-{)$1lv&o9?`C z{MsZ|oqBDVQ~W`WXJ?x)E#m%^Y}ib)-d7fuyiO1F{jW{UAZ}cR4_67DY$JDS-$>hU zy!VsT5+XyamoVM2XBUbxV-pbqx1j6!-o@WmT^O`J8=*s_)+>YbAKTdblM+9QQaPGB zLF!?o7c-Yrt!&?zKIxfjHrJ4Np7Pmyr)65%JXxDuj^W`IY7v+GAbByV8sB(T0dLLE_b8E>cXmIcC2IhV-wv=rAlBE0C9otUQEe;mbAzF8Wne|eIs zvwwaiL0cwi8a$Y*f1N%irFVFFHlidI#Hj!p=k+S)JT#SgW~(FiA_^pA^1f5NlNKo( zyEfF`Z-{YH$5^fy>GziuKMl;V>z>ctKi#%&JVXS!ASo=v^R|ByXbW#td{-sOiLyUy0Au6GxEv(yriC2$aa+ zm1+E9nm7d4;1O@Irk^J4=~UK{+&M4J)y?7c*?Lw;J>;yaQdTb;?|n) z5G;3taL4%p$`wd`p{sA3?qR>GZkOBgeXLr{W5rQdFog~*mtxnH0m~Dtz~Kb1`Ouuw z4ACI{jV;WC@cm*35g12uTd;1)h=*o|rS2Y~zwwr63!SOotfIi<8&V3Oj{5nP7Min3 zjThO2eV!(lBZe(Q8xyj4bSuIqimleE!~?V4c}furqCE@mj5TwiR9Z18C^Epa*l4`` zJ5a@+y~UOwcE&I%t?y{|$(wE&@kRelgFb=`Fg1VyHq;HC?j&q%x!+c?cptth&RK2H+Ire9$tmYjV(8fFjq$gdW8>N zhGAM?93D>Cdi9zd=RLZuE_`C|zfwE8H!-rWKwR%N-)rz$;mNxg@m3kV+-ROV379g+#c|W=d z`{tn3)ffw9?PFfO)GlNq#Kh_GxY?Sqrq^^aRupD7B#S`mp;Y3p zvE$}nTcbL<$iH4a-y|8Df)kE!O`g1p+2xj&TjIx@&td;rR{WE5Kh6W6_XqU!$o`%gzM zzi4cfyf=(L=;RJB!?&N96Q(6&;e3n+l`~qlUXV#$e{h;-)>Sc6ghi|F8`jSrm$1WP$XzJq^q{(Bx&{%!Dbtefm_z=L@t9&Ww6$hg z@M^gE8=@;O|zU~vUH6;@=O_rJXxN$=RC^I6-I)BvpL~kaKO6SaFBv_ zR}@m}&EMZ%9aO`UvNB)L6T;SIIHpX};)gKQ3TBUnw`7jBXgZo{#q~Kv2~1M|;CnNd z3(MAyd?k6VI|>utRzH|y2gz8Gtj!cVGXG6j$4qi@m`mRZzo+D?^r034iAjf65xxr zv_<~-z4mMQtM6S-3Ts_PJ5)s9!S`}i<9oj?bjG(z7Y;T)ftK)6DaYzTdhuNF(pE-l z!y`Y^w%SYiZ$ZSobjFjkD`WeWnOA)>>b*wkIW-*=jS0!KdCJzA=&VVKcE~{Aj10tf zqIAABNFYFfB!26inSIWlJ>|@qJ>NO= z`@Z)NSdgssthJtcKlgpz*CiH|IJxraS$i-O*Ae;q)%~ecT@@_`mJQIfO)J-T>l0({ zaCbp3)_ozumZl>qIL&vIzSfK~B4G#e4NU7Rk#Yhw?;|jMFZD+x6_j9pjZnDcDZ~5Lwvs0#O3s7eub4krEYs>bXY#MEeoX1xM!X$!(^WtEzBDu5w2M_;w!0z!BVD{ysH26aKb@%4{a5=G9kKwPAV?7aWkb!! zI~5wW^fLXRV>}CUBsGp-uRPW1$Ovm0IdvlOx~NPj?(3;F_MmdaR%g8P@4 zoT;qHL7d0pL1umlx<6=5Nt4eHS9Xi-(G{i{)`wFK;#6~^m(>DPj}TnRvSpQUX8h_w znI}o|R>jVZy!$k~tIcYIKE_Fl{Z#&ny}iSL%cvrqv5LA)z>p-lO^4n)U8-)j(E6R1 z!GPf7OCZY=-Kq=ghy#HMV#9+vfZ+VIO6 zt1lg3*ya_03syy-TIE%`*d9@Ct~4t3bdImv(LnIc2#!=vlSec zED?xMReo>(TXL+FkjQjU^<8U9*2ZCy8NrKYRIY09OuPH|N%MBRhs%TrRV&@$vdQfZ zTML}FkB_ntjC--JR038Gxw3QOv>>W?RV#G8sU6$uylj>2au_2=5QjWgq04#j{QlkX zru3q_v2um6zZ&el8WF*7X(UMCZH@FpgK8^Pk<#-mP}kvyY;hus(dzTo)YG@3Jz%`} zjkx8fzlj>ID^5t{eJkk6wINtPT#mh^A zYBLho7ZIP;qXqVm7L*>=-N~$u0F$%r6=`x)J%c~ITe8xxI+v#)5c!&ec zWxMXS=dr7ZZXzWeZ-SuHtHp;|3Vj<+r7E|LInZ5+7OV*xslDuST&YmOBjNC4d1H@g z#yzOqvdvKxOLy%&$B3Ed)9*KSJP@u0w`Ap9ATI!}I>r+Ra@*NEq@sjl-$WCt!LF}A zELP9Y=}#r^z{-0W@(=JfM?JzS1<}X4shqAwekmoE+D(s9GcUQ{X{X(b_|PRUVcyK? 
z>$OBR>NY1ke`jWV`OUp(f0AfO0pSXA{7CkBH@nDBX#8Qw9uI2Y$I_s)j?Y8OQNC-# zoWq}MA00>K%~|9jkKMkznzuL{wcEhP_q8teKHO=#!rryH;^lgeqCc7XwUTR|8ES9) zB7`EM@Nz49Kd`|#nR3|tWsl!P^k<%GKFnjH+24=?gJ?>Ua>@zzd`n}>M7{Wkl&VbI zMflQ2qsy}$Val@ksX62Fr9o>h$?GReasw=Jn1dhnj*~(%`ndxiSJz|0xV9qyVX_&e zn7TUJgEev!Wt`o1Y4ZE%#Brnn(bO4+{n)FHM%gGNg3w2Tvl(&w85d9d$RL?(zbdbJ zv#jz)9h?0TUU5$ag0ugra05v~+H2#Ab3p#lw2h)(R*SGwgLlE(QzU+!cKD~PZLc1n$SuAf{V#;*#}HD#Y&M;)zQ1ea7VNtiAIFo=@xF1ZV$y!0o^HI}O!Fu`nVBy4G)&t$Up!_zNBI>G*46knA`|F}~d0`QY+KCBMqL($z!H|(+`4qZ~o9vR?TOT#9XDF7KILALm z(DyvsiPo;Vi+q=?Ca0nL#qhhpv{Z@{q3(+9`GjPiFBlc|wWe;=G?qhRz1}a?Z9UfN z6ju;?gBg_POMOOLLwrswSR9uH$VhEm)aF8D^3L)KdQo06)9uPJX!pVq&O>0J0dI`8@oQz{#2<|R;`HsD1Zc6kc3&#YimA>E%GU&BrZGH^1gG`p26Tiybu1QHIQ zAPCaogdOn3OGU#LF2z~az1gz5{hP=?3|wJ0DOCbrzsfVkwM^c~_R;SZaIt0Q1*5M_ zVoqnxc_>ghFu+SXc@9KZ5-owyYP6z2nuu$)>cy%$;pb-paB%>z9bJCa_;gS14=UOf zw?exMGGCS2`%jL)4Cg2QCW@TEd=cm;9=QxOkBo@biRrh;XX>73A!EOAJ5#V$38}}k zbwH?LRFy<>8bS~@T2BHpFvqa*BFyIb-(B|)|EOv_iaKHCn>G0Q=U3rBiLsgue$oY$ zU`kg29l#EcG-<7C8d+ro9mo7}?g0jdMTkr#6fRBGw%`J4a`*sWknP=zmx`m`?~^a~PBdE|8-Y%qVU^xx52NP7Kf8Q3ZtBmup;ve0r}##0 z$Q!oX53Kav(MvNjE!_;wUG>G{OQil4d*2u20wilBEUH(Gze(>tB08rZ96Y6$leAm1 zyY$O_W-5z#D|sTUZyI#0_93k19!B(1+{38vIX{Zz0u?!(Xm_2hS}Ag#njFTr#r_=m zC}`6^tnU~eg-H$R)4rZvsj!wltCdM9$oZ-}FVC4yO`~i*hf+IWX4M17`926PgXd)I zHkC9PzHhptz2cufReWtP*x7ozb(91gWWKH*nx23?*XmhRn-Tn)u*xi27aE-+AE(w!n`P##g|(R4JzN&+fU%h^P~X z$p9ATzvM>#^RxeN)(NL#RJxdE&rM}!Kvsay8{7wD&M5Y%T<7eh6+>yRLaDai^stAr z8AlQ0cD8<8pNPnuJVc|$7zjB7z?r^Hc8{(<5nvzjl2nx|qj152{CD@#iu^nP z92nWpKO*?B-xL3su3EHbtjeKSA6pZOK4!U8VUv!2oo*cBvOS}7WP<1N74RyM@ygr~y&yz=&793Mfjz$*1)!Wbf@k6D{S3=I8fpg#n& zZf1qy@*mj->+Q0atq$9Y*qq3tzT7DCoL&+YDXZo{#b^K-Qbf?Krg4NQR%t1E2^8YZ z9MSv3$-exx_}M4{-Fi+vgQ40SJEL|AKbcG#&e?6G#MD^(ewGI&?4z4)e{nJX^|8qc zpWASQClq%l{eEdguh-Ivl{^VsaVjwn{wN6c_NLzxWmg5Y?C6)-0oPdW2O=qrqcoCK zNkfC#)6p=P+mSwoNF_B_c~o+?S>vb+wSaF|%#N(;a=7BMmll2XANKQ}JEk zmpvOB-h^-c{G1|nya?%e9@HV!5}S-PQ&Hgq215Nnz+XSuf1AI>MgY$4xSi3cm>39e zUQ+-N^qr5JPWDW8asL<#eQkf!9_X?5|4lnAWCT+zXMlWyNn@YeKY03u)k&6bpnLQ@ z-QU?WKQ2xqF8z}gADi_ZQhJ}23LMM17SA>5x5YW|wlzRaMxl2tsy zBR_j$f+L59L{Q=8>ndx1)FuoavIlt5ev7W*?CEoqCjBhE1E~}|f^}1TsA`g^j_rUl z$OLvg0}^XFjS8Zu;LjHwi=dHR*YO?{8-zY27uX|v4?uMuO3a#5q$~t)tmCCbS%YPg zsUJVQ-3JuN2vDKPD0x`9nYcZP-&MZSL9Q-8OZDDNW0zq^C}RXrwL!%by>f-zjd4LugRcB*s@m%{Lmvg20_ zvU4wmVIPavXUInPcurc#zuW@0nww9IwT#PG9;7iy#XKgst&SZhap8vt4tW>owvDfe zj^r@Mg78;tew1*ubmK2A`LO{q95U`>M`tm399-@uP5_vf98-e+knS0Pb@FF~&hvne zX^~(|=1v3t%T<^q4>4>~9kgRe)5&x8xC!tlmNxz-BE(60Ec`gdrGoqcFzsWCV?;s5 zUr^|>uKy+qolbuhY9H;{6b(Rdn`B`*o;f~a15me$Qb3B54#pdX(qHa;qBbhM*;Of& zKKD3&`0gJdOWuDC(EgXtK%bIj1s$-Q;Et5m;mL~MME(60KLXDuHs_!YzVxs?RqP{p zH2aALn zL%YFf7Qm1Vfc_*fTr}JH$pXmbGNdVhN?q-Qe#1%8{$iCX2VS^ayZK2I6|7SjKo`Bo z#($@Xvz~srf=7srx9t)NY_bvt&Kv0QNQz0ZM_J|KFPix9jR#G=7s5N#XWcUn+z-x| z-GCkQdqla-A{hH+c zU~P!229$yL@LW;mV5Rpi=at@6=9b!!sV1ghuDhnYb#8IGzlogZ1qt!FhE!|bF;Lqx z&ae>^0RH>xKT;i6e6=o|0SDUkg_jT`J&Luu3O>K-?$(Wy+Gv?3fq8e$5-e}JQ?);tIuU^W12KuZH5C>C zI_JSr$t?>&juzwsna@TmP6_cBYLGC(eZUd~YP}E}SUpqW+01D528OO1$$-2}p}~oI z^J5!-s_l~}TM}}B>J1gf&EoEO??OEQQ{#lc z{o6e>=i~LLX_|D?j}g=ylbV$)3v>X~*q8{Fkh4s2dp6vNU%uL>zgGNXgqfP0yJ;Ij zU7n=aTU3R3*`_O@zi>BUcfISKWV|rA9*J84YA@K=X$%-L6nZ#nl%5Zq8u!Yr+R{Wc zL7WWKH%VIDPT2C(RQS0#F#;+_v=)mmsBbSLIW%_z+fF zm~OM=FnQXhf3Qc4PVC629PUtZezIKSV|Qzg@IJC_x6dM=3Vl1wDL6W*#!T?M5>mQ~ zq*9R3Dk|>3GR=h>4(rx*GVVLy$HNwZjt6<=mH8JZ?caOUP|eyj=0g$6P#2w_0gA4U zgdbCeU9ft-I9BGt;mgjqu5G;;36Snt3|k&>eAe+Ro(@Q4`9_e%+P=&Y6v2SSVQ*p7 zJI&>QtM4}vBeWB8d!OvtoU%_Q0HfpV>-#V(dkbn1ZWmjCQAcDX6`lw+-)~;bY$BR? 
zBJi8&8gv43tN9mB19h@J%_RGZv9cykuwsaGUUyBL5q76P@UR$~=u3;u$@I0tLY5Xn z=b7&7+l&|T%cXUZb@vKz!y4~}_4qlZ$77$0-w@McWUtic$8bw(iR8 zduOYm$^lOsaV$Ya==mgz_=qwiHf;9BUy$RX09WlZ?TKMXF zn9vF*o3brRPlM&gi`r6y7X|WX-?~)+Wd^`gc@LAHbnvjEi?@AaVZ0$-91m0e+>|DC zImxspcDw?!!07p5f3R}af6<-!dzDv&^^-_wX+4vF1vb(pe^%hOe%gRo2BA^0%#$#( zLSTPpaCu{YLhw8GK)a&3i7jNmZ|@9RnF&L4bOooF#F0qabab0+Dbz~#y4@9U zo72!46}(m)&(ab05shC7GbmQsM$X!6bqz_?6u~KXCQeV?_rFp<6a@1$S~f-Jsg%p_ ze7^e!8MLr=>O%~_$EIe*#G;Raa!)%t;(CZYvgmuQXR%m4uPD>SxUHDPp?FAS&51x0 z2o;HmbaJv4isqt70WWEnStr}CgeN0s#&80HUdvIwR@d;(f_bUE@rS(w3_G z2KGwiC<z@BqWtP7~Zq z6f*rGkL%K_(yA8d0a2_@FhX55*WW-mHJDg*#9^8dC;2V>RBO zL1A_ek-NC^Tg0NQpfB&8l8+zRxjn)pJ*>U4zRwXjLxl#A8I}8dK_+efA|U;XR4UIQ8jH?%UzfO85|n0YEaDZeYpFPG8J6UtZym_j z1wYbLQ6O>T$Xzky2*uf@z1Xu8Hk)ALpKe$Okz+$Tg%dn(i+$B*Jp(v^+iIL@HkclI zm^>?uX@DD|K^MjfZkFY8Tmn zbN0}HNokMfu7MsGdBR2e0u>s&9nV>>l#iLR%^%TcZe3(jx#$oU2EEhbvi;E8%4pu0 zp?tJ{=SI;br)nym*5T_jo{*)Y>9sz%*g75Zv&-oX%GORt0NouoDIRly-SP)GV1+}S zkCi*+@{i6E9xHwio~!%>SGNikxl%tO*b(X}M5Gzva*SX@N_epft9LupA29mX9vD#s zH`v$3-Mk$3>O<(7(-wqBgGAi8J8|#U^1fB%Kq}E!a;O&DmzcNGoi5d4QSBEispp!H z!D$0>RO_1Ul|MxXw<_{kzocI~M?xmm64>98GL>mPP|CfC0E>lRu;{_&2RUtr-&x1l zc$heUmvhOqy3mt#maBBGg9q)cg|C91ox_4$N4M5#ZVdqeADfZVgNIFb@~WfVt6XA@ zTsixA6zEj0VVRvmzcEL^(quFYE%xqMTnys2=?QUBJ?9mGG5F?+H>fR2>cM7l#BO{F z;Y?O=BOd9=W5i5{mW-;=te#RzeKZSwZ?=UUasE*o?_sHM#z+ylG0xNQ(wVJ_O_=d^ z2y%isSr8eM5JP=~u!X3OU4{f9qMknCnC<2`eGAhcAxxsK?7keTsv(^u#@mp+D``af zZs)uDMw?hsXPE102HwBC>a9bu%(YQdrti3MGXTH`fedLJkoyJcT z?2{1nsq=8YK7y^6BzN9HaTwJUVPCOvM01(~U;6-g4Q_7)Ys%g(9r9_^s6z2sMa=76 zag@6mo4gb3BU5PYaDDU4e={*_DPCu^Q4=X(v-vY1bpq*a5wQQQRxf9&vWLl!5Kj^A z%flAX&~4w@OAA)D9--|~-gK4Cvgn!+qS0>IH*P>CxU(;-hXCl;iR$fS0UF1UZj*GM z2{Nl3FR_Wi0U6%b4bhVQ_a{O@ae7?_(}NcIU7-Da!z0L7=1WMsg*IxTY(yOK%Iw5Y zDNZ=nIGrKIukiL;m35X6-8p_VcG)i%=V+9wd!$?~L_{xg0gNt$H)SkR`JuNSGdriY zIK{PXwXNx1Zcn>8uN{bqs?&-(>_;j~<-s@Czh(P!ICO+`c5M!gXoXpd#Si z$F#3UBhB*Jdd}Z{W215ln7M?aA|HKfGkONf`uNXB$N zU^jm!S?aB$GyZNo*f}z4Xk9Lfv^x`#fm2j5pL7rfuldaS&ZE=hDv2rMdp-^1}3t?ucA4Ex`Qi(Qc<$vf#XenKM5{R2GaC&p|Xv! 
zD;AN(3u09H;;0Bu+QrnPWc1u6-T5iX8g&)eT}aG^u4S9U=Zn-zVVcLbzBy=Nd7F&8 za|0qe@u98VInD|5Gz#y?h2Ij-H;o_(6s;wO3I#oyt-P~W4{uQ{=}&XCNnX0g{ze`i z1)Bb02vLc+Cs-$Cz?ieg1dnMD&hECT9Zs6nQiil8SCzZkn_ArHl5!R!*O}2K-23^C zUEI}f1&Iy|bGSM^FnQw;4?7=g)qn%)9VwhHQry+3^g?o|lBi@8o&$H^MU-vs$1MK* z;z0v(ZvOn&jpjkE_o!Rx8&0H51Zx_ms)$r}Up4TD@*4&RagT>(OP~%n5W!iIZ))|* zBfG&K0C_OP)d(xrH~1`YV=p#(5YT)Id|f-bGD7&Nd0J|&lwqzuZfs(c&2wf>&_>9RIs z6$U%X;;PTQQa~JVQ)BGHWvL;;`|;mI&rV-zH!tbx`D@D)^#D@zIIe{6j`=$enGBef z&5HDIg09-e(~H4DxpIw z3;re|mm8<-m`)bIrp&X06smKp(A^}`MRfyP$Fx><*4@gq-E7xv#&<*3EV;UlT16a0w8iG zJ^=;(TA!GkR`Xb6ua%frpZLm#6sWde3TH~DfB%J`_8q(h&zHFtBkvxS)KlcM`R&u~ ztw4w_IACz#;Um*t5{}PeI_cZSw6DC>E#>+|SZ^uY-B8kW7^njYy^e8HOMbG)&d)`e zdLp>l#QJmXN4ZVVN1n?iyhB`|m!loSj7N9T9>&z=|gyAXaM!iu;pkPQlUR`eNmUXH)mb!aFIycsG2l!hh zj;5O&z=?tai5kd0tz37X?zS`)NTmQn@8u83bNX}nnK^}9*id!3H%<^rPafmSpT{(5 z-yaBH;!pn~6@LN_sSFjo$tT_^>i;2uZeL`%@#(#I!F|aA{x1r><&RmAMdo0u+8DLX z0cBUAiJeE+pP!F$1r$H#OXt-)YH*TSF9fv1e&q`d-?4KQzZ}V?vBvwvpz0R|4`1rL zhKUywCQ)bFR$}QPtga>1hRD6WtlQm>Z{EMz<*k!L|4Xg>ndRoRt~+(ankq@uagt-s z^7|nX!S2oJB0ha?JAVFsksOWk2GW#`6X)M2#I)mLm1z6>~Ec z8)}j7e9O_)COqL!Y32369v^1JWt@z;bTZmpItuH3-Omm=Qpo_QgLa;@Ev=@U}x-`IqF8LNfm=v z*KbrLP@A=rFRu98RE+Tm_AFiqrMPe=US^8Y*@DRCw$c{IYo)8Pdod*9ICnG1@0Y(C zF#ELcp1kVoA$B$2qqZDoywRRaZ<kHTaPsoE?ItRSptK@Y!lgi&#ys*TgLI4O`~FqBh{i;mnrj$h*1;MEf)NQA0-k zNy0BmF-L_BD@5i))ceazW$@ycVE;XdmL)GB+eJps!ts1s)IeErwvB-_aO;hYcTVGR zxPjejfYd>;hc@AEwOTc(SY@6(B3%xo--@Q9n(fogUQ3x{^~<>`w%*7>pnPop^K$+u z8SdqD{hVrxqC&hQ(O!LWnQ6|fV2YlcKXJFz*xXCeTn(FK^$wYR&nw*@_Sa*rO<7Xy zsWOB=x-rm*BTezU6SXgM8cXVSM0zI0{AAqpzx(rWlP65x41k&%#}v}py1!8K0kO_o z(LA6nI5I3=b?&H%dpj|5ds3BZsk~+;MxHrU=k9Xx<%(s#tMbDT?{O-s68n|R<&$3u zH3Gg>J`>1=a-7TehH{lpbY>qB6HbVXp(wk3zH4nq<4plrkg1;=qPWepH$c|_)r$ZN zB?>WX6DkGTV3Z*9oy2H5T${YxkRN~Ye$zc{@}5IyT#FaC;RdN_H`OYEv;JdO%gN5! 
z_{Imfv8a>#A>_o2D>=UA*W=T3k?$4+xmfmniiOrK38Quq(cL0jdMJfFgZ!`~kT9c* zroaK6E?eDMA(RFjg;^z%Bh5yCQg=PAU3ETZ%qgp`!OvX}r7yeaX5Nd)qBu zwO$ES>$;*&rC!Aj_o#4t)JLXugfkx@2JXY@*=G#D&6 zNUFFBs$xcTaWkez6wR@?!aB}%Gr!L>d0MtTKIUI_pLI6cz=hGB%YySTmm_OqGtn{2c$4-o9{R(VVx_aI$CyE1j@#-VwZ8h0V^<>=RMU?fVkCVR#d=#-^6 z7te6Dv4@cKb1yU1FQ@h^%*X3yHEXi-qo&;Z5-}dzzW(Ds23``k(C!!riP20GS#ThV zz2~aiir>7TW@Si8_kaa?(}WOiSRV#iOkVZse|CUDXi#-bdU93yt`zop+9xh;W;RAW z*L{0FVN}RAt35CC;hiq|p?BaOL{_*pK46Zn?_eW46~b7P zF1gUTNUBEaNFy~R*IGrM>*D5uq#>%F{cSqbb<~-z?#qw#qA1=<#UI`DV}6;8w@=NT zO7GIn;A(aRY@|8K=Roj^1dwGnrQ6U;(w^4@3i)|Nb13!w0O0zEa60-2lJjiyjJ4Nu~k{I)fgYK$f8y?v0dB88-WYIwTf zjh9YdtR0Se@O7J9R8-LAnEGZ$VOWm0B#;L5Is#gu>uGMF?i}h9S_bwz#9Im*rjEWoNvfDc_Zuj_V#PyjswoAq7jlt%r(7y0zKS!If_IKGjrgf>Y z5(yh^qmNimPD|b?-*UXcz89&G5Tlu-0U&i;!f=f>A53Ky(8~N zkC{Z;eQ8TZt4SGSjp>T9nlj6I7Cjey5?|;xNOSzwpMH$`ab`n2%QpB6-#SQ z>QOrAHPo1Gll~BCS{3=c;#1*q>+nm6w2hNqx6_5cqloD7pu#ul_fM4+Mwy$A7w5Cx zC25i#v!tU`gG@$>K3(_i^(U2xB0DMZ-%9*BZ`VZ?+AX+UXBYo1p~L26^X(V;zgaBSp$3TSm4nsPJpI8gca1I_3o zr=Qxlk`Pmqv)8`eJWOw&Z!~-&`qE{nLrZ<~)aJ3)0#8@p=2H5d@A5;-sXAYz3K^%J z>gaAfT-C}T+o3)|4{yZQ&Qf9!-`aIlU53>>BwvemODa32^CEu}mEX5g0K`#@1L_LH zv6+x(rElb1jLB9S)p)dJWqVr*T685mF+yUFm;){l8221mTv zvYzjgP;blwvj^vNcRPK|z8&Fp&3Rl4!l(kP;c{3}?<`3RFg2KY9<*4<5kkHQ?E05)YtoS4Y6d z2y+ANw`IhftERcGXWg@o8Vrx2Yu>+!l4ks|1X*RV<+~0N zpEDL&HEc|cL(HZJ#1#DrEN~PeIY$w*_H-O)?``iMBMZbY^yvSy@A7wFM|%J;w~uz~ zOW9+#RWL* QOpyywW=>BMRdf$N-%{ zj^%fy&_P-EhSv8lb*| z2u*XAbS}f0l?XyyJE;k-ptjSLrQaT7tA^B$eK|u#2d`#96|9qp%Qh#EUHX~hyi7-;-j=DF{wmwvNRWJV*qy`J`YydY z+jMN0+goi7t|%JNDG8VDE=NyEw{DUFwxm05{#0^**P``}&WB#z?E;Ka{S_?EIHAk&j;u z8Tb-+S)2Kz^=+=7b}F)vd|lbns|#|Di2jR2w-WGUw0fA&S+)i*wN_))Gy8kJ0$WkjpiLJ2T=;;Q#kZ?%m7_cKgk&F0CKcPAU z10WLKF=1mYiHruJKvH(8yX5H7V*q5L->l1pwjo>>&&TJkFbC!2_f1(w^D@H)J63@N zjDpI!M6NM8smB))I4ojQ{6t^`KXvE0B?l7CZh6 z&|-!^W?87hn?2poTf8L-$g4mL&LxALc=$7}Hm-XQ0q~W5D&XM?64=DNsi~a{pJdwEioEzCjMQW^|B_r+hRu{QXs)#3+6fohxeTW}&ty5*Qp8t^Z~JDRaO?(plifNB{AH#8W^VU3a@kID_8 zQ@u=&?Oqn%{m!0Xq_Z^G)?LqS`;<0_bmXy$(=GzA;oK3Gn+OZ`VYDvU(bAgDejQi~ z&!0gV6r^R9jUMhWzKl-%xN3=$$u^w)C^Cjs5IMiJ#?Vh6ec%bQLc)I zy9bfsWuWk2w7zzwH9f#d7^r$A5xQzk*oAj)bDg*lDB(9@KiNkKH~^H?@e4q)bO$$Y z-n9ZCQ2^F$*(R6pun~L)5ZR)5u$N&b&Ff&=O~lkNR#pP=zduh=y+DJ06Wx`=u`tsE zaCGHdJQjf6g=YWNRmAw<-$Y+0)NR4e9!^IDT9Ep4lU5Hf#%q`%!LfrN2Us^k#96*b#ijp`lOsRidB}VufU6l`S`3}M^!4y zl<txIIqRnJJKaP6-E$^*U z8g9AHY6GMeQFcpW+`xEFjWTnOK4{lBUIse!w=3~KTil)gNVo<-FH#`F6+rxWi_i`m zfb#x=lxO_Ss{gl!gp4=r2;kmbgrG<+0GARkw)=l|D-fD$>o?JG$Zw(#uojA4VO;kL z6V(L(3;XA$3^0EG)|B|`Cj^%+9DyQSb#^!Yg)Q^9CK9D_1`Z3u(%^{xe5*g(>JLly z&mQ>CO!Pm;)n9F>{+uoTX^Z&h%>GaN-v8UP1@dIPTBrO;Wh##Z;#6IbyL*#V@Pg}a z*Gn6f>As;}MZH$W>)xM`kMO*DNVKugspI)?qHqGy_^b0_6qvVx?n2yG9{oPL*X_Nl z68nzeCT|sZDjaVjl|%#}>C2aLTYnRg;rJ>!lp^M~)k~%v%t;1K=H9rz8aA1z6+vic zHG!m7UfWChh&JiELl8P=Ud#zJPji*D3V|H(Ciw&4_kTHEf&|Hma0^~|`APYX*mEWD z?Y3s_v^$HBme{(XeR#Q-(W6I!IwOLTr0N;+<96vIhf`Gw_X5dt&w+IJR>ml<6YqwU z<%~8k1x{tOYL3{dpyivvJm*?sQ#Yt1X%Fdk*(D>t2OIRW5{UjC0!|4pY{Q+5MyBDu zFs@>BMu3Fe;u8}LgmQ+KjgGfRH0(VAXEzJY&Llz`#zR9@zFAN`s{ig86Rt7Gz}SLphn{W_w0WmWN)Jy7&GBX`r!ThTHV$Wte@D^`}#@%bd^4+ z1HLJ(lTy_8#>VabeT&qi9!%)P0~sig1KQ@WqIyoT4Z>3K0LTcre{mHIecc8&e0&N% zqB&WG#o<3^H8TKmy29vOJn1n7?mCbQ8kSAC3!{hqguMaWWBD*_0eJZy0IAK`*@vxz zW}*oiCojTq4uFBX(3OD!A;Bn42!RuhRmK5+JZU_TK6xnzaOUL!PA?4u0t`_LIOJgV za%ZYj*SXNme_Qn5-uZ#D!x^DzgJ6QM1+9VKKSBV$`G2M8zqxmiVDrZLMaEgkpYQZ% zJN?=B{%R!uIiCJ9Km8ZY5zHd!JW+bW*K5Cta%+LfajU{1V&R_aV`?RyJI3-fSbNqB zMblB+neH_`EmYy{#9I9)XWyF#oQ5BabK|YEns~5&7dkU@%G4?7TZ48=CNg_@rnILq z50xmKzx^hvyW$5o^RdXAmVxCCB?^t^d9FqVvgB_{Gg3mXq`ns^B0uAOR(awF$|fXj 
zo1otmAg<~GsGV-HOMijB{C7B33dlF?84rTD3Nm^&&b5tP)9;24Ox&0JEDl@w2Gei9 z%cT)}XdV~88odj|RegCeq&zFcX?IjDR5y&}jGLF2(r!7JtKzTeeAXF%pV117lP7RkTF(UwL{xTzzHG7dNKv2I;TUr?zE z)17pE?;63E;b`+Suka*h|EVVMn9l)|;~#0#|N7X#e$MmoWEr$CzjJnnYp$>hCr1-o zY{YZ=MHr7ECRFx^>9adXZ<{d3^?gRVb>H;@!k?P|9$^nL*6xB+_yn=%ck#+5dmXap zgV7)iB*(4Etp+qJ+8ha=TzA+5hb}NRyqLuo>U_*UK6=WN*8BxA%2@#LE*>TY;&%W$ zwoC-5ihudI|Hfw&HmGY|ff;&`bDg!Q%HKqHc(7bup%2bZ>bV$5y`T?*uED$65xSs%P0N_%~*#?tg?3?Y7CD`M3 z{anJZ@pmP{F-6Aekf8%A=~$}Kx=O5Q-0Jl0wZP51tg@XPw14-f1GzfTAa!lQA@fH6 zGR*GWUPx60vqLK~mf*SVK795q z9Kb2+*}53aX9f(Mp|BR)c})iQ>*n#Mt3YL&8=o~_!O6T;QEqDH!e36PS~<%{{QCZ;rI0+ZmFg@M;KLj>HOuyuL+voT z?x{>hZ`HD?;&Tn;7A>^EORB5TZ&riVZ>k&=02^_ottk$XRID=d#o!l`Pzj{--$nLT zeFolH?vK>he+EI)sjH0|npVn>y7No{V2J2yrU2Wig0QhC0oc||LT|<`i`fYpwb`BW z+(AZL(l2>f!VSx2?-hxrP7vR6^9!(T^E8bzmK0TWVK^IOJqpX-0tmS`jC*qXd-kIv zdI&A#t6^t1PHtiu@;#N>s+$=ke3@6eq@)Iilfe8kX8c}X%OnnJ9Oj#=T3fF+U4%Gy zE#|xB7iCzwAIIV-(gx59N-5vmj}mbtdoW@w*K@4P;vP%OJZ<3axp}yK0t$h?-Rw0P zRkD|0Iu|N9{!0d$f9+8Y)ek9!s<}zvjbHp%ZXsD3Mu4pwkYV+)LDl{#Mj1M8z%Xf$ z-6|I6kV%B{26Vgjtw9NAo-Z&*|gmh}2FK9&P{ zCgblRhMKXDKp&0=AI<>VTT#hXzOvV^RluQUL@2JK zeZk(dtg<)8su)=AtRJ&XWci8u88mk{h-w|!zDu!N!e|L}SdV3g5cv!0X3Ind{+f24 zxnyeaeAx4z`E8pKuEQ>P@xQ1LtlY^MoMQ3JPX>!?8YT#yy7@|9SjnV`x5dJo4LSh| zp9B^>SO1XCg#ZDPEdFESV)HkV5ygi*KskrqQMH44x7bz$A6eJ(KOcAdjeY`=)*!>ebI7lD&5> zsdoufXK`hv+rz_DWbV(j#-7tLSgQ)nW+bW1kCnMBarL11iay@DY6DS48CG9rO?mAX zB06X)VI+WX9wU>y1s?biN*NXZ?o#SmTH)v(fo5QrUN}9pn+lK=2JG)CmCum{m0Xu(x{b17IGdFbUDd1pvu{95WD z9l3%iE!7VR8>R;Hc?MC(cDhON2?ZycY`Mo4T+K`#I+A(Y)Akw6v%y2pr={0xSHe?I z1y*~sUWt_GCNzJJGPYlBzSZ{3a9#eps_z%qwAGB=+jEt25&RF^OAJIR<^1piS zJzYJt@D(Cs4PrE}|A@5x?jS_>`Bu#=zaf>i*2h{~@cTDdAAJ4dpuVmk*}Is&v(~qP z>un@aDv~ze#-y07lRqNZ*A6Y%rQ1D3nF`~^-Rdoo@`mx*xhWoP|pizVgb%Fgk|YxwCGj#9R3_cBzi&$tw88VoJ~ z+KWU7rk#`AZYgG);hXc*10VW3?=$&iQepAS#bU$l#-dH5`(+F?5yYvmYmz${&NJr- zFCQu;1}W3rAC2{%pZSUXc`+pi?b9F64V$cMW?SGF+YQDSZ)yYN*FLmHlG-wJ;t`+r z+0+tSnZN49lZuG56l6z?P3;>S%|*73aeq@DjM(4!u%};_*pcQe&fsI2{;cJ8$D`$j z``CKTnblA!)l)g|Fea^r1-{ipZHeW&H1X-zt6D~9s!l0giQJzJhFX)|$XkeePY_?F z=zJwd+#E3B@$7oUCc)i?pdARUrH0;i)1$ImA~KGAPF-9W$l`!3*J6?EFxQ7O9W8Rf z$Ac~`I_hc$$$-sKR0+C@WfS;l##@slO~SHncnsFZE+}b7d9F65(Mv?#I|{>J!(L>~ zZO@ZYZkf8Yl$YJ79`2o?L-cP|q?uofQ*;-j_;OpCpmj9sk*EEEFzMeZW99aOlt3gbi4XUFEcA3Bx2?3Yl-z`Si@Y!{Kg7E0{e^@#Q($I zdq6eyMSY?uDuNURLT?I+QltseQ4vCq)F3TX5ke8^5I{gcsnR=)#qY8P?%*jq_bXnm-ppzE>k*N> zvABbtK!4_B%ZgfIC*!B32A3d(v2sCc&9dxM8>?k6x3d z2?TFsMt0~IM01@=2h@)Xl?`(Jwpq5?JSkisTPwZk{#cXs9gzWakhA?|==ME30;bPI z!Y*4!y7l2A`dPX^yoT?5f|t;fY?)(;$i&EAgh~|Sok7oTQBtTY!q8x|-SK8QYE3$* z$$~_f_()^EqL(+MlDR-@P4911dM@wh; zoh^tkOYEJ+^#9?Fm&3n zH-lBGt^GsweIbU=2vHmSzD5PCFsDwMSM8j%W1S@-+ZL+fvSi>*9S0}=*iR!SG+90$ zG*zZ2xoe+46-CfQuDE?%2q;}!jE!=0p2p@T&&h6X8hO#6y}wty|3WmL!nd~@G{s&Q zrt3t%795+W2-;m5)E;eM-M-XGgo22W_inLwePmGJzM@^81S8NrH;IYoULd1b%N7%KF?NTe8;QSzx%)C7|6i1 z9|+giQajsTH=UroG_{Jy3hQcrDE$B%I2WmHw_-HpNOk`F_L}=T-g?doV^AsSdfjp$ zQsEP8Msm>DW=r3-=frx;jr?#xxb7q#EUt8CvxG^LR*PStY_Hq+9ko9R*u1e^(6M1UYm%z^Pfzh*Yh z5@x zXVSh`wOTXu@-|zZ5yfzoFg7hR+ENWjJmmpS(uxmZ7rB zLhrbD_Q88_ue_Uc3Vyc*cgg5oYA$%EIA4|KDZPALC;x7?B%H|lLNCE;JgNHF`7wi2 zVfY>|0H;<zaFT*e{HOYv$1CB;dEWVWMM0}5xU)l4#ofc_tOCPm*ulkrXTg=iSVVO>|X{nmq z{#!-*he=n!gn5C0n#p5XifarR;M_Lj9yK)|?nqmglcn5z z^jXkrJ8881>E3s@$M1;pKsxsdsxzmVdOPWhV8%NV>>fvua7)5ZQWBVc)-uB{rKNuc z7TU7_kIkrtZ$XezT-|m46UPx&eE!AYsb9(HZrLvV2{Fwpi-hNQWCFYjHmd}AKWLTj zUkD+%u9u{ZfKtfW(bV6RQz~=uWdK@Il{}aCv0H!M*B?bx(jV|ZE!w4C-ag2i&p-LS zxsAs_<1$S0T=sqAFh}(jF=78?tr(tpV)F#8;@??w%;p8$p;)v?gGkDgWzM`K`T)32 z6uv^db~qb-bQH4EbGKC#g$D6 zf+?;KeL9faW%NA-kEZ$Bv{aHs=eM8uoLcurL70}E#y_GqSY^g*;4&c|byfkvcceFb 
zI*NQ+=Efc4Dk}%C9c7jc%DGLlGl{hW%0=G`vx0Ufd6OuB!O`z#;7R?lp;5|r-w2Ed zZ(3nJweMA{5v!3Nkx8Ga)-mqwnJZF= z5-gS5BNVCi%*V0J)7sxSn2vMcm+oVrY=7&T@cF09NsB!F?ftQBf;SR|0OuTohK{6( z!6Z)bFYEgTQ*w?jE{Q@-^h{Lx9x~r-WXh>rP<17dd|xDJy=r82)M2(c}TdFGiE= zBE}TN*n=~DS8A@wwB5}4jZQ7|Bf%UYob*ZTGT%MjEe~_eNdfX1MCWa2w*5k=9@w2h4^XvlkgH*p_ctg+Px8|ku9NmH zOJ8-oqSsncX3chva`V>%`V_XiX2g}fO#_aGe1XoqN@oyVQ9rQh;tkzPn^Sk@gnL8|F0Mv)YhCIlg(&|FmgnN+g!mIdwI_aLFH~YhS z&IE$xYphWxcE42vF5q6V5c`dWY5AOcpj9XB318Q#AVMyCpY_SdDbjd7!I{|^MW)c7 zxFRX|R=U5DNH9h@`fb$@QN_EreL3&?H!i|BeLlSTR#B<8p&3rj{re!cPET1#U!jwy z|A^@Xv|&+h@uqxoXU%C@LwfV%By4J%GEqmT1ELZhG?#^?4xD7Gke4v5h@Ou$h;O-m zy}y&780xnk@d30=D-sv~o5PieRRuc#3|V}5Pj%pl6!}<`7+W>qVi!HY@0CAL%3b&D zMb!Nv0)L|1e2@8njFDe)r{mCv)nfMEV<{LFr;4id41T?esTsu3hqz=Dvich9Lj#3h z`l5rZd{W#Y<8?~Tko#qiOe?NNZwiv%rC+Kjn?ClUbgnbbv`L)uG;J`z5`9A*&)-+8 zF^hj3&xnuh{MISYKntOGhT|M3TjG^yVKAtP#jU*g$eTq&ldULyLdj^vesJS%y=>!#?Ujx3r^AP0&)Nk!Q;PG|HSqn&!HN4bFwu>zl`b&d)vzB zYp#q!O5Y_wDe|Gif+B~^5tiae7#rY~hpjoJ#}P_8q`f^z$9Xhxqo*aMZ5NmUnj>D} ze%5OU>BQ=TX(mbXb6Kl*h=%LP8@eGOJXyBks+plSO3Up!p}c)= zlXb(ZjfpEI+iBw6il#9U9|2J!E$fDd-#&Q0pS+3$dA6~FjyLbv*{*V4kFt(u#4i~I z30$C_5NXt7X?(ooPly7F=+C_XzrcYaJ?fQ5O*bxX{*qr z^upla;>+#QWHn)5T{H?jq5vn@ngk^gTlyY_qXgmB+I1bG8J#j|HnDY?ANffUr`oN1 z6`?=RgoACrh`r(NDY#udutm#*gXAdY&5?beXEy!B&Q8J)mTI?aF4~|f=5OhlyijCE zt{WhxECOaMp3^JJ?3#fFg^K|>-Qcm&&az{%qM66qRC>-2w#Au}ZtvA|B&AqH#9r^X zf~L>?6%HyVekDWRWkru2UBlOZN44iG3{}WD<3YUr_^jVtZRhuPjrtsoU{0Ib5wFJT zZX>NjZnv4dZWiP59P^<>P>%{zJo1Ew+`G>VNa8*;SU!)?SLM|5IX`9R1>Xpa6azJ>1ZI{_tP~a!}!4pqkgg+@|ze z$4(?p3Txz?;kJf9(z`Ff>HSAqdnG#fQjmKR;4j>>rZn(FIS>-uZC(~$Upr5;wtJRV zVYL5ZkOu2vWqTSFEYf5TJ1y`tRFh)a6Z@${M2M8~F3&E2B|2SI7{txUBow{b(v+(D zp^zQVTmo8VSjb>fM$=|*`nq{g>AZ3= zB4m78J$!0SHERJRN)!h=wGYw+kJ~@8;?U@N-E0TksB;#e3> zBR;hi2fheH=A;sPzCMZ^AomC^UwRwPt2Rt`F87Au6!hE7?>1#*GacHi`?1-ea&@Ph zo3853hCF4kGp=5Ra4{KzwJIan}=rQrzL$ibGwOK+TrU$y3{YTRYy%* z@HWZ!OWD2wsObx9{z;n~^Yso(T%ylE!0$E$Ra(92p#n`B4^u_CdN2&0`(i~#-VDEw zdt(v$>CUi6cpIkwB`eiV*|ux=Ghz!OtJ@HnRJZ!m@`Cea^(u+)Vftac;XonCauX&I z+R^JYs7d7a+131TyVPwR5L5euVfP@%e{B^`83{}9j)H~3=h0@FwX+xLo#hf2%R z7n}AY^cK)@YDF~Kw6d)O4i91qO;yHe9{AqU?U4BJd#{{3wbnH3w-*fx)nB;xn2Nnc z{u7-!!%&3Q$f_ABkBk5;PPIqS8XThYEX9f;ptFyB#v>TEbCpPY{SZTRP_!nR@#DA2 z4=-+ZEyF6gfC*2M77(KjcdYIqHCH%cy0&xY6&`k1crj$A;G_4r6MiJ!hD#K*#pn zBhsi`_nC#Z=@mZrVg_@bf8E%39z{WE$YMkc#@#Gm8t-e3F*yNkz1#L7+q=Uu0J%CR z!uP+MQ9W4kGpV$R{D(ja45W;(l-?WU>vw{Zq$`Nz*bk7MXk*9pF0HBZij?Vwwkn+C zR5iP;U`o_ov|v45TZ=8TAKXFwhX)??%d+tGgI+)hmyr5#_>LJUgNiU|>2p}0)|3bH z^%@#EX!Uf;vFy@Ar_tXA$L#f?H#hg_o52T_&yE1wzsLXD2g;1ty6z>nDR>OZAhf{XF2j7S=G4jlJ zQ#!ly7EPKxrLn=GHjmLoc9QPdJ|AC&hxQqMmY>HKs@k_taak8|1iuraB<3Q5ptP_< zt0I{?I|TNDnN+gf; zpc>YEeL5g|wdvCTRMrI9)7dERGQZjnia30oPo`>UedAG}u- zeGXPy3gDP5v#zCb)0ks4)KPva7Ft4Hy=_Q_{z|Kb`Ji~3&==OBuv8Oq@7zo|e1EWl zt3OtZ6rwaYZ0iSiRaoZiw|Q_Zy#20eUXuu_nKt$!Vh1I$^CJ{o?HHPFSDwfhRQK`) zoIZk_Tyf5)quyg}yyvaRr7t5yGR2EKaJ)_|r_=gOM}y6nm{p8p0s8j02(7>yRq;6~ z9$A75b^~(9cJN~Alc`P1D2smxbk3O~b&lci>3-(r4idI4uKhrj`(>wh7b9t|i;8pHnOIV{+w=TGAI(H4rEzIQfd`BP zMpu%x(t0K-oFYk4_p=^Tz|3K{Jk(FHQGxLs=LNvB{$249FC z5pa0DMt|=sCb8@Nb)v;K3i;hYH|h1=_C!&^ic=fmKf>jK$-}+!9~fS$zh%xvG0@<0 zZn6{?#kb>`Gx>t+TlR0F9&)ekc}Z>e^>Zk*QbH%@X3ZzMa08CKlj`LqD;b+Cxq~Ck zqgxe+%suLXq9;YNo%h>q^A#v5?|4ZCsGmFLUO)rDj^Xojpe^Z_#l9=StC%3(^d^$S zBL=mio7>Kg(GOVkf(EJPcwRkVq-d!tSeDKhMu?1m(YO&V&~*}f+l$*VOLyk!M3wYf z9qpQ1#rvO#r*)%crM*`@|HJKwtC0 zg$x;jlht|H=Jn+Tanc?<=78ZYj$(15Q*H!Bv`T`3!gAD&mX4*lkZnMZ6?OMta>{=z zY%(en86GRh#R*xYN7P#b+|IP%?Mur`C<+?9p>ZMJo_X0ti@#jH#FRtrKvzg9=Ukw* zHvBI=T~O!9Wf0jbL5B9x@Vf$5A9s`kwdcZJlgXPBg!CV)G`Ixg6_( 
zI{UCDXNew%zFhS;lGleM#p+)8Bu;;i&A#*L1z`AOsr%nTvHtJ*9t~USvUfLGykk;G zoav|uW*qwYiy(Kh*peVMPj9l*mJ9U~X1`ziu@n;AE*O-Xt73ezsi!Y5nnTyUUJm64 z%6(8tEnEBOq!9rM6-`q24eOXTbEyUj;Tyi;UfiGBB#u-C!oBs%-|2`Q*)Z;td`FF` zasNDQ-XO^Buh+rpIL{VOq#$u_@(=#{vir=;#r+?3RzFGyiGo=(h0}+1@H58@kufdapKQpI@o(*0n zc;cPBfP?cNuarv?=>f0XEPuq*e0Eyrk23!riu`xh|ET}{tW)0}N1%hw0_X{{{y#@2 z*1x0v{j@T2&Gak&Gu+12(=*)m&%dJv-q{V7`uREG<^%<7AcfagtaDqx<$-@>KdYta z0?FSqRyMCUHA5_SxXtw=lz!+RX>3u%bkd-FQ%2fr&ekft2Pj@c^o;=xkLN!xRAS3D z|7b1EUuBfvV)WvYlN(fu_y~QL@LPo6+M=9pU(X1Z+noJ?r6YUv9(BV2k-=yP2?ct* z=x*~_-Yp}45Yuw8w$?G|sqCMz9|_rPgqnR4_wNRoKpre*U#gGzE4VQHblO%PLD8)T zyN>&YHElg}i0tR3Cir3+_rC)E`hUm#pAR|y&*%K6(%6v2;mO@j%O1QL4$k3}hgvyA zhQs#wWg_!iM_2ObUCgz@mWDQ+)aAUtA`bpzxPmTJ?oQ|ZL-5gim|?f#^NVfxJ%`<7 zr{s%nSiLZA6FJ&8Xnc~}5fUL+|o~$`HKp53863t-yHvxaJcPszR zE84$x1^n;udKtC`c3`qBaj&qC)A%2a+O+_l@@vJg5l{ydjq@zIqG%s{@a7I-FH@0c}N**74TykyTrhV+Yid!dTXxVZ?8to!25tr)`if}|4y?enZ6uXa2;`w z+2+c&m{Uo!Jikj`%sLe2t%=!4-!;s;j$WM2Nkfu(`7lIoMP^n5+Bbffbqnvv7Om<} z1BR1}YZ~>+x4o9H9&SG3qu9ROWSTXUv!f^@2E@%brq0D1)fzus^Ib_SK97IKXmqI@ z5{NT5*F4V;9tqArT;!e2zQliB9jaG8m6`&sxV}L*UG4MZ^DqDJT*%yu|5tN>McpP>xEEvaCx|JjjAKDZkX ze!@KE;BC1z1}Mygr>p#b5oZ5?c>Wpiue1-UbIk0{a0?$bXqqW-isIbCR%)AE9)%!`I4Peehbq ztcCRy(z~Gr$aimf7-eV7E9Y3B4!@sma;ekE18eS4Y;MD{EG|-+RNV9GJ)OY;`V?Gt z!%v~Lh?nQr%X~N7^}GN%6&ovJ-&6OA{V<=+n3UL4BJK?$y@MofBvk-XPcoL>=`EMi zbFAey=NCNhIp-3SQqALy@H2Y27?x>!J79o`LYu>5sa~*=dMQcYUBitzGfI&Dy&`Fr z6I=Le&~ZGVI%U;Je-7_0#tH%sY7@`zY?4pMvJ5*m;_nd4%z?0CLBxul6Rtj-c*_E? zAVe`eQwy5l#<@)KrOtzrmr-384Z|Bg7ofg4CP|7js`P10e2EvNW^JuKhOJljt#&K{)?Lq$19(a7T0NEm zmTZKv=;kQo_d`^%58m~^NXt5e2WJH1(^?l7l}J*BKcXVBj>*xrGeFdu_>u9SrzZcy z(j)Y*hFds`+H;?R_Fr(#7irA(2vJkWq+!J6=&`}|fp|EWJw5GX5$ParW=JL%XbfnU z6rf+g<;(KAl{j}xN@{}M(CeB%=wdsg-n!U6wZf(}JUe$~N%wqmRZjvb5s*?5oh6o$ zyO3EBKIZk|SxZD5RDEAuuQP01z>OoP)$@_Jg-3zEdO`Sq+Q0k1mM84s9?$bN{I$X2p4{_)Mr@MjMF5px%JxK}H!;B*0P=Ce@mPwZwfXk_NKBU%}X|QyCBd z=f1O!zX3c)xvhE`fpN6km+al0I5^-6debMrj%D*2-hh@l;GC&v!T!wvHb6@s8uuwL zyumJ1=O2NoI2!o!U%LX(JlGU{VT)+I$6F-!R>Y$wQ|Ce3crk1xK@IcsVqbDHg)09_x%QSm`0ucX`}v^y{7sr$)&wo}7O zeRfVR$^qJ22iAe><$73i;YeYPvo?a zPl@mgqC+Z%YsdjTaE{;~f+F-a_hBcVL>&OyH-@K|27M#CBV}agvpshN@wrZ_dAjFi z$9OJq4Bi0&+M}tKtu+)lYME2>ac_3(d9rdcJFlDDD%DX#eC48V8Jzoy1;B^vPtwbN_4h-v0K=c@dMka1xB zDr4nC-gQQ^6RDW5+qzHq6C;cvhEgJ z^V_bv7(AVOCDSx1`v>JCo=($n-wl1rcAbF}b?y+Ki_9@27^syeb$>N(I^|^PfBl3al88lL~PS?}oOU`%;U(c2Q^z998c>*Sa z+iqt~_Wm9eP3XWR*q0ZoCEYhKDl6GciqIT{{V`|FY+QsxH^awvoUhU#U!0!J`7o9H zWEd%%Bglmw403}lDlc$AUV_$L{#VcTe{H4tkLSCvZ3zUNtL{ z>MZgrQ(^vT{Kq@$bYc>gUpha9Ob`>VV0N^)WV(|PBnhxhca z>xdudmfcbDA@t{Qf@jt}Nx!N!>pC>=`LFo{hTWV^5tC zL09IjwM`5F_j+b@%x6E>LgtFseX^v1#qz@q{3Yl5UwcaeFJqm+=x0G%rHN{!j`FPs z#&GGJN(bh)&n&?`KJ4$u!xk`+LV7}5I#_MIBjf$v9d7CEHVmWh!;tQ7$})rTtxc?7-Rl>d5@;_y-HGV3cOHU% zS=&q~hwy8FLmqotqU6s`2ik`py}<_OR^6swlHs%VR@c4I$)Z_Dqbz@Nx@gqbd^87k zp&Crf93Bj|rjjrz)&CDiSygQaTk3MSL1w`zpUSuX*g>mQE}A(wS*!diHHDJLk1E%e)>= z{u=3AEu8+nvT{oCv4))S_7O!@eYKc;6M&nJsJaN;ek!F#bk(7o@0@-2Z92?bi z@Rf)ARbGJ{*QrkNir&l66J3+=@ngw?FX9BEtV+AQb&wko-Ytz9TsUcxZwkLl7OpAI zM_u*E)9oZ9N>|p4RGoO7SYJyT_#(_$Mp&aJI}L(AyX7mIeth)3lE2eTBgfxZfAw|r z^po;kNf`^0vTwgEj8Yww5axzJ=E3R=9i1eOO@r!2%rw4sw(L=Yqf?y1N&{Wf5x%}bdQrsAjo1kG>`&4!u>3@JU%cw5&1DAF>Pr0DBjjff(>1Id|4q$zL;+A-i{aqnr zsn|@m0ru$DEwp=y$_imh>{xI-St>1(q&IP!^PJ6rPUo?PeFhULN_0o9S?Qkdoi>x` z*E%hPKG5h~_L5%%U&&!8Szb>GsQy0KOM3_tes#H)r@A=HA-U|4Ys&9<=A$>Jp6Nxr z#?~;loRC-qNf0N8hmNvvB+SZ7tB-8!S0))y{`Q;ORLboj7*ty4D}4Ujlz#fS=< z#CL_xTI@Z=)u?+QPNLyt*6$pg>)oeI$=C6BHL6-Y05{f;fjjPDYXwxh!N-|uG-A`@ ztF~LLSFJLsykLq;fJyA+!xh!wC&8Mj>H2qfnrNaw$4oF^;VJKvP{^=_cxEPiva&>= 
zs8HH8h<%Ed5suPzubf!yYxI?oZ4B3C54z=eQaD-kkm&JePn}~H#`-AUVSqT;U_I98 zUt~9NGwa5@6}rEnTullR5JvAJw#cPWyx#mR(HL!3$7y*m=qJa z2Q{8Q={v@46MxW(*Mt-(7q6@9M#qM=q#b{^tdNg*_zkclz+~h2BR^xHuXRIBqKPgr zhV};-wTkfhese3umEC5%bEl@H-^3K12x89ZtZJHYx|p{~RJ7Ch*(U!1+PN-+cn{~rQppKTC@yRen z->Y>j>K?ZY$wt=?Ih(>rSLiZky!O3Qg2fZDoOj5)N2PpD%v^W1*c7!a9mxZ$WXiW7 ziX!$5B;qmQnHkf{EE)HyU$+FV-zik)3_MZ1-5n;oKn_b^e;+-)@u18>aTUZKnY5VG zSXx$XL6ToBy3S-7JSS}VIJ-rWB|q*`OV0dN$#1b6`bpv2>UCA`ZFd))HJj-AKaDrA zmneTRefQ-7PJ2S%tz&Rkon8LEz-q69!GLfP$y0r1h9lQ01`+u)&AIekH}kS?XPeq5 zbo9;ACv)#->Mw3vOa%Sfx{JJ^6pG}y`Xm%>;Mh5;$9R}!b53))H*k4R>h*U@!8}ub zF=5T6_*TW02JzKIs@6~MCmuy_5Irh+JgNa=-zDse_%XQ!p945mnQb0q?EynaX^-U$ zQqmMTBU&243p*;-a>IGmD6W~H^~R7o=I1S@Uz>VIf51C<5waGUYhtt?h?#zf0UFAL z?JW08YimDpe0wlx?{QAF+}JWY{5D=3;c0e@DWAy_&T!=#qr|QG2bpke$%1&zk<4fN z81ZF#%T?&4o{jsj9_N|%in_5)nJ_E6Q~%NC=#HA$gLwi0fn_Lj$`3J zAsZU=p0iw)oVQL+i~3Z-yy4cL6;lD~t)x`Oqyf+{v4-+@38E@pD06$_FN_ z)Hf@lnn^d(Ol~=jfRsCitsUHwp;4BmEwX-c?)__}g?ALjpV2E^%?544qSxr&*S#IP zOmoV2mdcRemF6CDm;BcHVaBfANcUJjsAsm7&JLs;>JO7__Cc!=yF^CFJQwRk>6Tq7 zLc}+c-7_!zDmB2b4L~K`!dNnG>0rYrl9CRy9g}-x?;{{?^coTY_d=tmg^vnr=4yVF zjYoAck?t~EA6nx%&1j~(Sv~fQiDWm85GCK8Bas!-U#iBb+|Ay%+@S$Ld?%HIN}l z&B_a*Orme~x|u-$*x{sg^CkdxAdhc#aF2p)QL~QM`qrDB6iKway)JTMZD&1vR9o=$E=f9A4OZWB4^BypzMYY2r&OWqHDeLf}=oCw$;Lb(r0$4^Q;{AvifKjO{c7Pa&3?NAX!%mq-o|2()b zmR%8d`bY3O<5lzbAIoIt+Jn2yb{|99e%)QLGa8GqFK`PNJL7Nl*cELz<5_;dn>cdv zk#Z8pZYxghC;tYlA|^AODRXyEW5bZWGaNDxH1|m+eW=u^sQfsY)kOP6Fm0D#qwoiQ zL@FffaTeb^OB&EOQJB_yFS+(!rQC>aBm_}6+#FU>+VeJbI6E)OFv+#%Vzzk8Nw}6f z-Rz1nrSGcNiOtDNpMgAEy;+7(b~>JZ&C&jW%h@hi zH>H6IW#-;JhtT^oS}z;Oe^-5@_V!Ot3Qj}|u2ySp=x#a{NZsMpW8C;mXKR#M-0$UV zlCgQY7F^fX;BPhch+$xh(<{Tx<*SWIpmf*oPbi>HEA zZJsn|B&c>H&c1&LXd1{j96J)!)J#L0Po@*V=Fib40eaJ{I7UtT9?T=X9bk82^}9?Co?7vti=T@e1>dB+rB zt)$>>;k{f-;c;oc^?R!)!v49~Aa5}45~2|W-D^$g%rAtXeMp$+N?g3{45n_X_aJi zS>B`4w*P@C>Q#|izsgg3X}tSjwiCZwrh3UeYMW%R5%=15p4!K=r%P5N{vVGed`NYt z=D@;Nci1^eCa<|`y(kSm`t&z{-tJ<3S&o=?USj-iN7mg7k_uwNO#9qI6$wiyWX02Q zoSuT5KCzs9b5FM3h-hNHAuN^?x)mDuu$yE@CS-FHW&l~ZmT65ROja#t)PBUa0ub2K z8pB+tcoV~^Ir46qMWEdGJHq731}Q>x^*`1^HPdaNqf3Gwx`@Y$XCb%<@*E5Cs8VV1 zQQXHlMtE0dt6MHDCLUU(wSP{nY(-Lk~%MHJ|ta+VrcEFXiY!Ob{E{= zOnRQqH)Y#>5hzg0nS}R=F}@@l6W6x~Sp&pd`5R2eb|9`9_($`axL2>&L&rH>lbmj4 zS=!CXDO*d7J(+xx>BY5p|=*1)~e zb9uGYpNht@ot3mdie`NCYBtJ|D`FeQhS_Gi$7sBY=e%H_=v{R~{H+C)N%;K-^$(&B zp+^xK+!$K_u(4H|kOm+Uwr$A>8Fj=_T1qJ5xK~A^va`u-tB>3dZCmEd=}83LbKC=_ z39xzvCPJZi2UiXE0};29vKyz6G4haul9dDaO2P6dXj)QX*fO4r#wR&lDUH!ee&-jv zbnKoBo^WI-;UZfBrrqumH^*|&U-)|O(f8nG{|_Ns-KSz{ z@9Kj)6)FS8HI!~W3~`tySC{d%rLD=8A1vhKpyN6g_7Sw{YmQtd^3OA)P=>vmZOS2m|Pjp&GkaN*T+) zKsz72T+HDC;qUTxx3x!JjiHZtl8^SvN5EnQUgmrzX55QGts(G!>nV6fR7n@I@dHcj z9#9#2Q9gpgs0D}wLe()Rt9_l0g}jKC7KdsgPL^=M45;h0xosCC!<0NU%8gwmVFP#fZErGk?e_ljdaxFKdwG+}>YMdJekfHv@Q`Mo@i92j zYL)lIt(^MDQ=Uw*O+W5EzQ9Tk#DSd0V_7zm_DYHb%sNYE{&kY9oj(b34{(-)Euz~R zci`W4_mTw=Kc9UP@lChBB+V)HtCj5QNf)eGgO}QmCror#0mQ&O4>AAq*9#XTu!(2{sIx*hIOdO zyN^^|ee$mdywmkIZG_?QiGK*3#fwkH&;sX3^wdWWGl`xC$IQqh;}z53^8mWp2TI8{ zOE{lpDV83cs>s-Ex#e~GtBwW^Y-89SLe@LiH%a*32Kd&cm-YI$@`5mr!*@Fh9KHi? 
z1Mu{cgR#D|8#maT2c|AaT}+;NBL5ikcPky<2Z5bG-DDTxZb%I) zvyk3yvwHvgp8pW&U3Rwm+%WyEZIvcQF8*b1EcrmI-Rom?$D2I*X-vL9%uqK^x9LGt z6*|Am&0BcB!7Twdr@Sah`}SO&JYvZ=Uq7HbN`!~dv7(|uc(dZAI6>4t{KeB(-JLoH zm`WbmKu8QZd<8wl*XDYs*=bgcG|FVGf+;1?Xl%>D&Jau(KfphmCNplKp;+2!%PRj{ zlR7*YnWoQd9Xd50WGXuyFyIjgTC5~4f~BZ%d^_%DiOtTjig%cb%(dfKza7YZQKCy% zvHHaJnM7Kjq>nA-(=g)BPp|X&&reU+c8XX(sEIqFWy*#P*^g4FSVw}lcu<8aceUfB zGU9MiNb7if%8)M8e3mh0#lY_NO%?4Ks zrV|Lh-kXo78|==afqdd#=lVRaIwl3H>CdxStrhACMe89%AN14L{NOMNlzl=3LwU#{ zgCrSm@xoVHd3pqD8z53b2B4!VkJu9{q1Bkm+UJBfsgJtyN}A}V%e`bh7n#|2rZWPF zMLURI8=T?TJe>q^RbGcp3X|+8U z*ZVrIE!ia?Mu7LviSV)1IO^xLT!p9^#m-od;g${*UGo&I)+xp9oH}55&*^!gyHkzi z1DQ1JYJ4Q7+;ZN{XuXZVrqU;VFE#7r66)w6Mzy3YTgHOScB*m7=Ut*;*wNli+Q4Iy z6!Q63BfBQt^p?SYlEgB(kH{6+E9-4>;#R!Qd*-B}RZV9mhF){U*QHr?jN<(3Vw*<1$>vsgGADxX#zm~$pF`D<*z6>Pp zZ^@asfjds8ijqR8Ll{ez?G_Z)L8E2x7v{yiN!QjP+RqAHr1k2fEyXM?scCF2`1@=e z!&3{MkWoH0mr&5Q<1VN!iMD*>L$Y43}A28 zz@?r(9+~4N?P`xe3$N|5HkZzpX(&pVVRT!KB}A}Qr&JDrY#OE1h1v*?!V-B&RI8Apgit?>FQ=&yNE z9?1de@_uXRjR)U;+y-u&mmuwW_4*furq;BdHm5Y@=x>13P6p@D5z-vnR=J6`KYwts zkeE}oLJQsIJ6y)Y9{-GW1*l5&!iVFzdid*gXvdPbQN0>`NvN<}#+e1L{bk+t`9{U* zc*V4FOQ|`FC&TWGxZn=9M_V%vCq*<j@0rXgWR0k>Z#;X`Z)`yvABui-lQ|jVK19^s-oeKBw6peg zgw`J;z5`u}U1ZhYKBxDTo3P8dtSn5K(c~Y3A8|EG>yk{4i_D~#6VpB$*+t=vs>p-P zCCgf%xL5e!48uT%$Mk$Y&VJZdY|k0(z%)N&Vc)e@zt!M_b}5b?D5*@o!e)9KGigW=dkD zrV8l8v;TMRUj4@|*1z?;$_>Cz@x@wL+1Aq=#5aBAZxc!7-pH1FG8F!7LToMG_m%Fq z4&qJqGFhas8AsaFi`=SerNHOn!*{#rI;{7ktg2uwsV(gg?uIE7dCtA0x;JSt$zFL! zB<)o6$DNCct7C_WS%=rTfOBPCF*uHEKYkYsJot69SA!*sp@>Fn$wq6y^pcN9eKojt z^-3Lh0JfIyrV%`V1MgG>-xT0ksoYELZd!EVJ?ifL*W}VjH?tSy+IyM_JQ8>`4%=A_ z4S^y4{V0eU*SPuFYp%6HYnQpdhM6stVtVTIOvc)%vMLy49T+-qmC{`lc{j))S|?CK z<4|0qYUkA>R=ppbYR}xMYXjJR4`;qO5RjizVb!}sf0ErH_gqMh;N>l+>@BhEj4$^| zHHQ9E>9>DP^Zx(7=Ks7olk(6Qek!5ta?87{oRV;uXXVJ&B)axd^hvvJAqsT1IkcX? z;$2<<`?-khR}^wMT**Ru24vk@rsmOzhPVPDdV$&;@$uawTUyJq67yd->@Y#(dO zoQ;rNUlksK_p?`Y$UF=G!k)-~X8<1W{|x^J^!fu#){%2a=W%2_{$lwOS_$7B$hlVT*lYf*i}Mm%t}$q_ z-|kMqXS8jZa=xe1Ye{=1Te90! z7?9lltF=CF{x5zlMgRO%T<1st1tCz6v zCs&+{z#_8?Bi&@kaXjsna4+3nlScux{i>eygVpUz=4yim4!a>a&_QS6|4zXM0ltE5 zi6ys+E2Y;|3RPVujQK=E+ntXwyYF+o3YO&RlhWgQTBzBtk*<)9t|SW>j#u5%fLz61t-8-F`mIT6QjBSh>A6VaRuVLPdgeLN_dIdhcZ%fg=*p z?zf$c?lZPU9DE^FKu2=shc;o-cKj-$tVp?Iuh_}U8oZog@Rla;H zGHZd*UZMRvuQ<7P4R=A=TqR#)e(wCqrXEYYNXN3Cx?>~!Bxk|Ony5|g=(QH)0DuzD zlI0BFsb(G!hYtQb_TD?LsqoDgL{U*dQHp>dMWjh@A_zoKdJ~jhBGN%XdM5}9h!g=S z0)oUb8u@deb)GPUADp!rGie!{dPEqQw2?!--OSwMpVs|K6*Z$-P!Sc zTWJ#ib#h?Y$Aw?CD$BsyTea;IAxVSmF zj5G|%lZ1YC>?~Az@0DtQ@k1xxU&Jmevj69Q;(R56_Gm{IR6M7I8?u}bz!4&j)|Q)eeiX|Iw%1)vs(=|SeQazTN7{=GNvwaL&goB4! zv_LE7;lD66`|||Lza*a_M`R<}2#z@Y53sq!^7_JGu(`@!8{{U?7v`=b|9Ef)E6JVO zGYKoyC!ijP=Ug(g zaRTk$8!WZ7ea}%v-o^KML=rU;DwgUMAKxyg#~_ zjEi46A;a@kmbrAs9Sn21?2AegWysD-wxOQMnz-&`X^S20yc*+^J5yaipkei|K1AQ- zt_I}pL8!URm{@GZ=Di$%Z00thL`>X*EwRGr7N#IJ@W^DwRzSfFZ1^m2e(*JGVgCKr zOj1azxIsN)h1M28OP580sO2gt@C;bi z->-EPZM?1qy(Z3>Qw_;+1dV`iG+tdUhQRFab=OqaMTscc@=1qt1g!9-?H=b^~y;b2i7beL|JrsZ6#Af7IFZapNh#G2fwo9EF;xIN6D{K z#){D=rYhx1bt}nkBzVn$nIkazDjvI=E#a?PKeYI02Q5sf1Pr#r zz+~XI%-aEPQ?;`1=DPE{O8i)@3sHL}}uTtdG&HzHsE6|w*+z=TU+gJb%hA*wdD_9AXHE;}n1CU_yY{&=L z3QB)SEck>^*OR48!$I$~n16UC6^6Acz;UH)DGuCuM)0SgCB!b$dmp5O)RHKnhza1ZGnh)*k! 
z(Hr50CmFNuDeBO^)q+<|&GtBV7nM(R6=S=lERG2J1TjxeCbyTK$s4kW+bJQJ-`^C7rycA?4XN%*biA#*Zg3i#Z`ewwMyObOBs{g z1s`+HVP7%?0o#`ba{0T@q36)d-ntL@BmWB2XApR9#f$&0coRMiV!02(aTC!k_}BEa z)9jny6Gj}dyZ(~9VbY2$nUAq{kIaVhPKkE?EH`}rDK+2!8an|U3VqrUTnRlxb~B_D zPIX%BL|w-pN>^hZ(6UsnPZeDX)L=Y6`K2WE<+@Ve6&kW5D`j@R$hCL#4cvI^%VDBA zN<+S7>#f+L+MNGe?A=gstANN8+830x;3QGDWsQyDpcx6?3&(8Z8>n6Af>m$LHD~q% zb!UohfPHCT7cMM;QfNLNeGpnlLzea7k<+2221jy!{&dp!KH%kqo-_dx+E{Zn=;!^) zt7dw}C2w(KZEi$c%zmzO&uV?G9ZsKIMdVZwBIhaHk7v-kIwr)$Cz;%78+N5MlfgT~ z*E8VjjdXi+gQ%_)64yD|`3vR$*4<}*XdMbd|N2h0q1RplG$zVo5aO3rymSGiAt4QF zxeRo0ag%@icaV%fUN@u$g6pVfpkRwhR_Ipjy7zI9oyQ#-uqpw~KGC^Kz%Qk3W}1AT z&b`LPrd_+PzBKN4K%g!&P<{&U+8{8-M1{pmF=f(Yngh0;6CHk<7qU_T` zXO(qN$F}tS%I*cuz5Ma-d<=WL&r*{~?P)?(&RlXQ8$dc6 zE4jX-7GzRSw<4Oqy6fMEeV*r-j3!8K*v#Vr{Q#HhETgcz+45;hCok9P1I9ovi6N5W z>wqtCIYvs234O^B|E}w~H(!@%O71pGCu}e&6Y+7b;9Q=j{+>$v>uaZkBwnhsZ%!BC z8;J8gq48fyk}~)Dh`!IGw@VuUDYdLHpMs?%b)rX3g_=_G4>Q%LhkfU|j&yL;fBaG^Y{zkE#& zyh-8Y$>)UaVM#`^)gZErcz_W;QrmHQ<~l>ry6Sk!3l>eC!*KkiTCuee6hAq3PTYj> zsmotX>vZ?eSyu@;D!aPR!LYPZmieWkzK&09E_jjOlfzvN75e{(96p0c{AMWvl$~EaD*tudN7GZ`fmaHa7 zHImFMqS0A#4*lvgkv-{@1ltFf&)or8gGX;G4+e3P9|rsqN=}1GmL}$wfkAg|(M{=M;F;9w_KIO%9@k}W!w#fw24>)l z=(fQYDU{siGoDW5&)VR=t^p;sHr)XViDrIEt-wc`P3UistXv1ZK%WjLqZ9EY<@?Jt zw^Y1<4K1x!RznrRqR_IRw~>6%WF>*6zymCcL|p8PYL9(O`gxWXkj7)=6BIlNU0c*` z7unYw#+Te#nxPTt8(mpI&&&(>LQS9|5lo7$CSzuFai><3xV_3`+8Al6oMEuZ^2bW0 z2ENTI2e=P8b=^|ebBymNR~D?%z0psPZgoj*Z=4o?}4;O(fJ zwA=vjXHGGy@bQ$(RhE%!l;nr#>4#iPtm77u)vi3sbz0lnSGN`}94 z`VA`Hs-uahvH7J}2A{**l|{b*<{{lQhQ5?LcqG=}BdVuZ9HY=7!(tjc5j&iQ;Z?8NPLx)NPOWpZ1y2+Lyq8YRNt1X{3?C;&@Y~8PA+ePt0ER6JPg>!gz~E4qB#vuhLlCwY@x|Cvo*#?ai-` z3P~lw1AAK~^%XJn3Y;Kzc+745q;LD1@RTtr*;fs}^_3!{w#c6vhdb zeNt=Tf6B>o@riVF-EohrwV2b9ndb7rN&y;N9-q=-Kp|q*4LtIHXcx1zk3?N2>&}us z(K6;fsfruF+@7Nh=x933LuC9@f-Bf%lEN2b3q}5XVLs^(%IV2#l_I-LHZO^KuJLb+ zIjH0|o}um%w>F&T@*brZ666Y(<3ldPE}OZ0@S6$dvhO{~R|N0y>&EB%L=jfGXw<{V zHo#0QwzdB{?~8()yba+5Hzn2kw+sPf!XU~}K6er`#A8DWBSX_Tw3SqO1|z693ffdq zQQ?LF%9(#|o>WUloW3fmY=HI*x(p?*sG}VFtS!kwRipggn!@R26^B+xigkcQ`LkH_ zRj(g=_CZ$}Xhcz`d8v*(Fq|zp}<~DK$PgN;w-$y^%V7i zzRi(aE`>Q2>x&O{ZL-41EJvnaG?cf}2LIlU4zPH^UbIV*MAV z4p9h#crtP}P+;D2WV64$dB|IvFNNXru`GT9s7>&N=$*~YvHD@;+NcSmzlS4#bT>Qm zNwY9z&@lnLF#RU=F`k@;E9P+NC8O$onyU?TDl4rp?0@ip)Zbh+#X_Yz`R-g;B`IF_ z(KInx_Cf>AG>*SPY~*~$V%&-9B;I$Y^+kBfo`X%?=(&*1OqE{9<5+_fl&wF^7FJd#*Qe=x1be!U4#rMUOoIGKQo6$ed*1{?%^#Act<_`Nz4L zgUmsqlT;pxPJliknP2|GyhX94ohY^SC+R)204i6r#D|H!g!}GFRo&0-452P_ou$*wnlbF7^Ct;5@MY`%{Ge zPYN52f-^J?&c#WK2<*SHaH((#{E2cx{>*7sQgX{PN8@Qs{>omnDAttPS>Rf=|CcqG z2GTiauOIjC@4L;0y+82lH^>ah8mJH3&F=cVoHyJKsRPlC^VB7b%(Jc)NrB(&@&SDw zj)wz)Ib$z=sSAoA=8XxvYksE{Q;N>}JXhlpOQUAn_U_uQ*n;Q6(I4{6g??GR97=Fc zH8pLa=TFHUUWWANfj~blvK1Sfz8RkY*7qO2^dI8Qf8y(r2g|J!Z;nHnKdFOIPw4|d zQ?Kz?JbO!6^`*?anBy!RGP18A*JDQO?*`?%GVxhckP z)?mkt|F%@;y(bC8q8K#2hiEOUPLBFB5qRvov;PLBQb27k{r^(lS77M$xLu z4NprWGoJSfwF%Vh%*wCt&jFReGxJG{i50gV$@fnjw9hE}Hi3MWLaJqQ$Uy5LMPBP( z`}O;qCuP2&BQDGw-tT{kW zy3w5YW`GGFDe;YgJbxO5oP_;vl0?^7>2~g!bxEgJSTSiyZ^?pc%EyE048K9c`IX_; z?aCp0N!ZSAS@I=yWv)j6cmsyuL%!|t8+34x@EZhN@R=>4w4W{C1Kn}YU-KRv@So5H?8bo< z{H711&X&@2(x1Mbi9fwO4-Y31i#H2q^te4^09fWIf-aEG%h1PwkfmAt2IX13W~06L zXIwLeept9rmvm~4wQQ(oH*2Kc8iF|R>=~K>nxs~QGS}ySCWK~DI~!q4%i8^2;1R)t zq4d7p5A=oieDu+LbeuyUEZXbJQ0%HYPc<_-JJN6&l;x26E#jY3+<(Wq-#&Gl@AwmA z`+~v*nsc@5-feV-d1?gHjZ$g;UJo;sz4vO%U(oR)?^9TM0l+iCIg^&~h=xp0&8Sz7 zWOvLYZE zyY9coZFkvs%tX3CD;BirYv@q+bvC1FSv`|S`n(Xy*Corfdufn;ow98?tVCS48ojjN zcwguz-FWu|JNQghQf{}E{!WDGm9d$uT_r>0^3vUR&e3Y%J2FPs`gXJM!=sHYBafGM 
zn$+rjlf?Qw!)DmvWxBI^N3Bc0z$ju32iztBwI}bV3(t7ED*=(D9l8nx(h{R8wK*u}roF>Vcj={3(jKX9KeuHwn5MQoVCMvP# z3r=bQ0n)Q&ugSh`_LG-njolJ$LVjD|%RtPsL(V{;zJ@aF0cNv%I&6}<1GgdaW{UJ$r*C!8n4S$9i}Y`*HK^_?^Fl8<;{y zG3;cGnDkF?PTfJpNRmf-fI}QfR~Di-1J6q4CT~t?prWA7=@-uP#t2i@1J=|b7Tbap zakSK~LySyc$dYbFpnAZ(IyCQqfrQw{q?O^^g}*w4!1c-D2i@T+WxpV!=qZ&@`S^(a zOx3+d4{qdSWKLJVMimMHWYiv8Z7GGx7I}d2L7#UB+s8eIYV>U9>e)1Mu`j$WEtm+2 zE+{_X@_8v=%359XzPfIcbiCQQ^;My-8smKk$`7tU7mV;kE8z@Q>jmBmH3%7 z^Q{I5+jIaI`8?rk#aFDT@vv>AJx9^Jz|@%h$s5r(cX%2^K7O|P5luOrJuQ)#G39h$ zUOZj;p60hmU7KCpoTA0BCzu%tr(YggkePP!W|_3q@~bXV@V3F4@DJ6UE?O=}JHy5* znY323h?gK z*&E;F*Kby(UTJ#G^4~rm!!C;vw1)KAcAy@+^lj7)jf5d*fcoh zl^tK-V938vJ84VFGOIaT&hm13uJ%(d=igzBf0UQ9FJLdJi20yr(7=@20DYqH$$>ki zjwa&c=5t;LmtVs6Gu~w%8&^;|b{oP=2c*YkZ(=FZ<1I7j6XR-m0k`q5?iJ|8mki9H ztHgh3nAitEBra=06wgyY(K zK(MV&B)y!|g)pG{ypcGvJ4yW&lGmWc|DoxOQmw)36lSU~;`-U|cFhB{lMS+PW`4r0 zHnwn1GcA#J4b|h%RC$=tOnWpRRO8rU=A7iH=4@0(PLXnJ|J5=ey85&@bsc|lxOj6U zh2~_>604F9stJ_j)h9eBHoP5Ht-eCP~sdNV^(KFzQR+h`=Bfx?rGRj`-7bcVuW4TB4qBxBgjvqg?GKVh?YinlRz6Iab(0;6~AXwvgVlYt($OwDirPglHFo4-Q zf`>ln4W3i%h@Hh9Gi$X;F!Z6eN!Z?Vc{41nF?UP$)w&vO@dT1YsY4I|CHLs1x>DwG zy_xyBOL50MhjTn-w-tE5MRN1ek6p}i*K&lkXRy;Akf-UV^C)~i?R|44g!I}=6%TJ$ z^y37=MpUBP`x_2g5Xlfu=u9-0EkJ&06?-y3)p?F0*C8IcDCYWOJ^R-OewAEnN!u@a z48jMD>Rz>qWViXPH(Uc`?>R`wR)F%ie62SmhF9g>E_Z$~AT$h&NODcar&5LiMa-gT zhl2U(8P6Dn^Knl+6-*2O;^G~WVnjyFiS}He$`TSH(j-8~%7-(Nd}M7k8a}y?^ku((8&NMOyCk zKP}dixWbfzJ?=%&xW%t8bXR}(!3lfm)POw7vjFcJ8qm*TpZC1^7F&=w+Q4TwPq-VL z;O%kopj_d26N~oMpi^R>nS+oAVwIH@wG~Mul6`r3xmS!f|KrWgB{9H~Jm$W(5}=wx zQ#piP@U!=QINw2vn0y;j#=CN>pn5~p-GQK9e)gi->GNwSmXTrrCnGzN^c0QzGou!V z7dgiH4e!=P4;}_Vn+t$eB6>5_4*a%&8c4n{3kyFR!M>UK)TTD_rcq7lbEY513X4O{ zjx5=T`!AqufOslkLO^v~IJrc#@OeY6`wg(u)i5sc;w=_n)=bo?tI6sPsqFckz#6_x4{)rhth5 zO;2<9yE*ISJnT7ghy@|OCX?ANVr?51yFfJQZztb{*?8Hi|yDfOz84@3evyIg!(%I}rM5>9SywX}LUfb-Yo2 z5q!9`U=a*(R-3qi@Vd=Eg09Q|Zg2mmy6GQlsQ=h&WrUVhr7chje04&rAT_d0B|0v7 z(AOQHswQh+{&d`Y{ws{hu(LyPYFhX+VFqFH-IcOh|0h>Jz}`z}9n2E|dPm%wr(=&H zj8PN42S5=P24smTeU4iPX7zZAc)+{U$C(G=LqH7!V5B~KC;_ywhtZAgh)}*yep+8b z=41^2!8h#4@4I=F`qB`wLS)a82drxydi@2h$U!sBe$~G9_AgS#{&hqDEjT zP?`e>0K)xWQ$YV073L^2datxE_5$_p6b1wsG&s*g`m0@P5cBU-?3=JG@)eLw87Rw=l7oBhJuiVB9&y$0uMhJ-EjIjT#hCxUp2uj9Nf9=b`)goN*I{$x z>qVc8+sh4;7MF+Gg})9|9{aV$O>dv#8A{UKI$`yRceW7KBk`nEu$R{gN}OFj5fD4~ z8}tf&Ug^BGo~!n|u}o_952aKQK_plbtn_Q&ij1N$;tDcLGJ5YBum)U@#SLyy51` zNqfKt>8@eWqvca!9MF>8ONuzeRqe|?w7tcV)?)Z4__cbNA95k{mn*lidNHPEcWZ2` zetVt*SFJAH+qPB!S_*l*NRZ;`djHPA)a}I<=+4+^H*{+x5s@SdKXISEUiZmJ8gLJj zHLrie3}1poLeA`Q2VxtC9kB=Nix3(YN9CF%2iee|h2&LY>zR^b-0Auo^tk}naz^w` z0FUl^7hqZDmfg891Swf@*{f5t8(PU+Lk$dX9*@_f?^#`-jp1ps6*GN3eT{c-I3HmledVHp5UIt z4Hg=DQc8Zu%}fQ-EP3F4fKJrq)IGrDF5}|oX?ZAK`NJ-GBf|~0%(#Z=15!5Zu_1&fV(_JTVzsO`gh^oSrq*MF?7wfxj<(i=ZUm{joHtN2Cf|ptT_wr zp8T?K5qChM&(iDw4~OYx93yN@bM{!^hArSe=SDj}6h})%ox{V<+pHlyprkn6b|Q3V z74aN0Ec_cJP3b|VR(qnL)GMfzs~F-NH|l(jC~;&~beG5v32%ia87Wl(7p1bWFGcY| zODo%jZ!eHX2w8_d@dHFVNT=~@aucEA&?lvZ_#POqu*U_kjPyk)^4wB-pS2&J1H~y1 zV{@F2YzHqod6Lrzqm@3Hx>VOIkgjZN7Th+p0yCyrG1+Hn061%VOnW!o(g%vqWhl8a z5uln**KUP*H0)M~{02p2>^WiE>9c^#%B@4uINnnxXNTUHgVIG33b1x6 zPh9#;(;8EFmF08+;>dY`>ZFQW*8H0G6unW1{$n{&*BR*~dL8K}wSsm*Z)Bq*Z#o@n zulOGUP|ygvtiOU6W12>Is+`{S-74UN$JGHvz_US*o&H4tBR_^9X0%hzEwqc^#&fGI zH30=7hM37%rPluhO1PpM_qv14=<4FcZX9&$i|$gxZ;MuOml3#A={!8ukci+>0YKi@?YCa2mO=0)3X-E+wCugh9=}C_phmYvQIHXmE*{R3& zyjFYtOk?IWX#VZUAhu-xMKw5s%3VF|k$1GdnLf5UsHt32{H*=UG?Jxl1yPOc(HcP4FaUVD<0%=IwSQihF~!umQ-p9<(?%^XbS+X0wH} z^wuo&?bV?^-k*-}sDRFg(sw%F%1~8bi1;mCT*NbY)L0nx?w#^<=Ga6sj@wiU&v-@M zRhG_8XKwPczCtb{$}}ZCcjVNDY!=g*G9%QR31b*!XRhb zr7#jnlV;Pfwdc!)>#J 
zTz432-S=AS`}uxK&aOALol%G^e10}%=`_P=SzBg7j4dc+Aq#i7ECD=j)OZWh8Vto- z8C%wmC0HjM;;eDXitR^SjfxhDp z(lf!O=5E+K7}frh6xH#(%r?G5bL$2j*X`Nof~$hK-b&m`6d@L?gL$Bety`S&WOjzv zKZ6t(`^g6fhm$ngW8!;+g9|CbJ}5mOk{G&c>xIy+Q5Km&NaSYkyPbjq$M07&aDBeX zu7f#~M(a5{)}-$4(PbwK2ye6$i`;6x$fZj0_3QTj{*3n$Xa4&_ONhzO<%8w%Jlvfx zQ7Q)Bm7jjC6nrXr;5SRU)rBNQY|yX`6*>k8+1+v(tlyc~KA&@icqmc`Gug=d0L4+$ z=m($<_*Z!j=Ncyfc{MQ%T{D9FO`=(V7>%2@B5ImNgSLOdHpUPGuqk$;9&Ou#&OYHr zX0sm1#q*5U7T7cqs(_WEVRJ^KCh&-+U)t3FO`@lXvyLOQA%+e#ma3p((3a#hYoPD( ze&6Y#!W-t3cVyS(B69+P% z5+BdXhXBO%UBt~yyOi_2w zms=!4@KvqpcZ1Mmwr^wUiOK0=^wr1+b+1nv3?_QQ)8_jsTo4&dM5maCE?KMJvx7aB zM+YA9Hy*w-zxC|Ly{(of_J^Y-zHmAs2}BQTmfQXv+n^sSawYHDm8?wpd-1OwOey`T z!zxTH>alTMbOKnX^v8>@=vLA3FUc5{2 zvL>j=Ip z#Q!mIF+uV48c{h#5NL(-ZhinkYLs5)StnEa%r zuW-pt)3;_^3A5FjAoRrtSX7S*x^CP9^1!B2r9dpNUn?-2(S#tPTQ?!AqkLn)qZXNV zmyZh}atS)I8I+7|Ey=)w;4YFNtM}1crC0Xt2j#JtFG3?p6Pmy%9s1KXb*h>c3d#! zxoD!KS4vs>@J_&4=2h8UzGkC6fVhCHJwpTcIoU6X8FCjptV2OJJClX4xk(F=hc~qp zc8zukoX&KvfqMq)l?6->b|h$ti9_J@Bn;~*uKh@3^cSoZ_$$n!g9PoCCAhHgPBz=5 zY}cCb%3Ze0ySpVQ5w=LwlV_lJJsf4{)HEWo*-?eMy>n&iw?u$-46xcVd2glIa5gc~ zMNylIHIC6KJ10$W8gxCBD|`dYuBKJj5@{* zy#lj{K9wF^dY^2<+M&xah!yqdAV}Jpg**+(o6~w)IHTp_P1oKjCF4=>vzSC9S9ylca@MPA?w*HVSswnBp|UvLOf;h_PIbtUwT+vRiCmdD;@3W%Y|K< zq=S&oj#KYmZ(lxLncNB>qI3Y@fAOcijl%Az3+I^)%6V^m{TVL%Hv-t_vA=FG)9BMk z-#E(cJN3k{mlG)yo=ExY#}|EsH7;^pUPNu}w=TR_0C#%Qq)&dda!s}}JelOK-MQ;C zbd#HcvzzCr7axgG!&+bfL${@ZXfY)XmwFw@nc3qoV*Om?Y_q{>3gL^QW78wVZ684= zbQ2Jdx77WV$$YaOWVcbI;F#y|ZL_)_nJ$z$a>Ab`$jQo2`VnaDZs98Ti&QMWWJleK z6ggm6DVdww?;K*(3C!5^8y1JHf7Yf0UHxLVJqPt8jxSK}`wRiaaa+ge6Nv)Xr}Voi zaDgTD7IQzd$G2gPG=Y37Trh3J?9V~cV|}hS4TlOby8r;m9KvH3ycqWmDTDPp)o4o# zv&>s&O1NXdZy*0^iJeE`*g_uDyyj~PQ9hQ@84M( z6Z$&mCXa8%@sBGA6+Q>v?gm792}FnOHq_;7SAf)q0}Ll>yKwG#zq=-2Kc3)Wz&QKO zwD!*WS46E6AC6;A_i`q zCIpkZqn@?K>c;7hc~@TcT4a=_B{bT;wvhv}%|QMa{T>>t z_5fBFEbG0D`H;iqlD33CC=@r(oWgjXgrb|7h zIZ$h-F^wV1h)9LP@kWCnloWO33b*LvLL$r07dgkuy|Ud6jjdQK8ozZx7T@s zJ)S}Hkwu7)&340aZ!P1H2oNzc6j_4!`EuwqAc&hUmMHpXBRl+vZzX)Yio%N(LWkj^ z7FL1s#Pz|?u$EJ^LD_RJoF<+N5JTr$0roxdfobyz(a6V(vsQ_MQy^wo?vLer{b|e(yg6 z9WEfl2`OB=Dw3OSo^Y|HdIlH@bQ9pOcekNl%~)B_TWh#DF$^gUF!9CcC`v9~GUyfj zX|^e`GlRB0ZMox(MRd{yT2AD##NMS9n7;7f+QM!fNj#p1bS&OB)9CHTZMANP;=5(X zIc)AK1wi1kTFB-BAe6vxck54^>fTx2Zf&Mn<93b%=2zIetX@iW=EVlM27P!)^W0XP z(QfdLXGD8nZr!tB&raGnTWYLxf_cB-NoF}kzXKU{;qE%nbL!G{twDYr$RtS zDG36@*<$5z`fX{j7%{tnX(i;NRoGBdD+vQ}=k+~+M}msWSgPBQ7n^8a3zB#6u&fTz z|4|Pdia5@_z1uC1*TeA-3Id{9ev{<7jCLzTFW%|8Sa|ZZ^+XnDkn5TpG@s=3ZF7Ik za^V37My^OR>myOUnP&T_`$Qj;0&qv$r<4@E1-22i19l{t(4jlF(pb(dOL;Sf0HEd@B@2Iod@tq1m zBmM2lo{21iL+8mnBVpGncj@gH?5INW(I+^ihs+d&Ljb;cZMMBtR1|LT>Mn`T1!Q_e zX^lt4k;c1p)UeV0VKt^>jW_>P<1k?hSK@2B=6S%n_kmkWLl!!YTJDbPTHwUShscVw z0;xS|NY6+j3uMSm0Gs>L7EkjfEJR+e0;70PFpBD>+Zl|wsU`g z4rG|ljliBCfaneU)S)~Ab3aN5VSfmG{`*Q-N6Q+p--aX|r2)iXM-`Fl{SOXxL}-DE zW1xHEw&DkQgaCF31Gqt9-Kf*<#8((%7!cRiC1+M&jM6RC-g_CmcujvTA`-9r9b03X zZiQjmNkJuM+CKP9)0dO|b>|C-A|=L~V_BxD$4#Q=2p#BC1W>TKweb%K#dxb_@)Uiv zF?iCHg6GkTQx9ZAo%!KL9iWovtMO0qKb5jw_;D)7z&XSpZ8I0=XP2NqQ9KOhi|jNm z+BcXz-T24BYx1uLuRpKJ(aer^ufDw068yq+wvK5yNk*dV>(KD_ z%`}wG#ctUH`kS?!m!pq0hY<5mH>7B}q}eIy*5I=~PfN>-MIN5+N5LxLZyL!RM|_m) z*Q!&EINWiO=3cld@|UVx?-M7J5JLzKh#m&fq0?L$05=F+Hl5?Fha>~z+`zKJm%oyH zEKtiPvW%(AupBaTm^`<>L&;eE_f0nApDh|-BjOO8>n2)u)|L)q+@()zjl2;btD9c7 z1=@@U1^9t(@6q-%nqr?nD%90uPc_Y9i~kjm%m4O0>mv{V&X>hlb>Os9g)wL}_#*e( zH`RYDNc2g|PGIuSYhOa1@k`+HLav55fxj$y(G_31d#s|}bX`wR6d2(R$-h61h>sv= zP+V9mUH{_36-{>AREmtdt<26V(LsG0Jh02H%Ei=r0*qhQ<+4iHq=2&sKsfqM;+(;^ zG^4w;Pn+#(%S%RmWq`ROmio=hL1IgH9S7N2GvEv0h7=#qM=5Cp-#p$~UkjvFoie6j 
z(r_Rloi58yvwCN{yfkg$@a)Hqjc>7{z74Xd82DMu*nePK!=vZMDWP{Nn%UGKQV8l9x_>%*82 zKMT)&-SK*LlMCJ<&#{IqN3>rS%$x|=bI*0o4-d;J6z&6}Yk&x!RW^b(mKNhE@s7xR zfe^Z#W^TTJ&8r#Y=fD*gdvBv9_S4X${Zx7(t|-nn(`m@joQZb+f!*vzcGNjp52a8p zzmRA*$xE#-wC{hDxWV4hyVScQ+B3v;%eREZCA2diQZ6*a5y@K;cXBGs)_?hP(?iSz zWAMJ;u&JLa{Zp>4u2)_;cnG>khWXCbtUTbQS7j!ip-hWtdwk+Ec0WpZJSJ&AFOVGm zl*-zAw!WLuS?b5$iz14bx-W8llAHmL#|Th4BCOI~VIJ3QP>m?keARp9W%bW8j-od_ zb(~kk!p2B#2i?(5OS)e=Y#B#qi1qi%VH<%6zhqTQ?E@Dsn@7`We%k9RO-wjaU-W|| zu{Fp(mBpW00%}HlW!gB@!Cek+=QQ(4dtkFM!WDW8GTC}+@j!01Eklp{8NUo4aR5w^5Z*++O7?IiiAY;9y7< z?lzufT)m8;IvZ=?0pGtI{q+z~>v`j9^1A{-|*VO$#27HzhZ`Xr{W2_;V5; zdGb%2%KvF!UuKG+p-IaG1zkC#*|MH*x%s5+)LdWAQR~f}w3FOz4CZrStmcBX>jkN2 z8p%9)1lk^vqaRTuL@y-+oG`=*y}LT^7|lEk_+}?v0VM@wmMw+2seb4zZ2u*-&c}1K z4F>G*mv-6B4AWwYe&nuS-L?f66Nf+)RUdXkmeO5vp9H$k>95@59_X!q!bxNez+<9xmx<1)av6u5I6v@ z`LBPi3v|3VAJBTG`7)B;*A;mLNIG)Gj$;7bmLU+clelm3<=g=lU(zRRZT4*3LM*jk zYjN%paYq!c;kl)21iRq8dupEwThxY~SnfI(`s7RwKP!@`48&nC`~(ya4Zi1JK}#m^ zr9;>yG{_tf83`Yy35UE8b{-#E1IeI`WIm^FX)y)o307)H8$N?>hV+JEyXs_CM zj6ekX{fj<=MC(7k!_~&#y)iGaPN-v{`^bb7!RbR@0vNCpER@KDO${m6={H9X<<9_1 zL|n}M7p9jnW3OKB$KDs=8plJogC>fXl^p>gm%^|9`7O)O2Xod#y#4($1Ga#rz<`$Q z3mdKH13=+ps2A3jc^#$XzU(g2N@T5ao3pa4i~2Faw9P+y`yp4mU_t;{VkgWIP6?rK zLN_eI!;zfkw*w&Z`O}m8K=RI)Xp!C_|0m|SG`!g#0zC8wXga7X2S|I(M%|gVNYwio z7R&Z4#eKYQz0!H!FjG}b&<8Oe)%Uy-nEZb|CjWAb{+)l59t`a!q1nLG5{q07i=5KV zD4ttiH*fZs;C#Pp^9sJYsUA=sz*8OOb1<_9= zp3@}7(cc8(eo@xL$OGg$ny)pQ~sI1UI4;nQwjEm|$Q=A1AFwdpE@ z_KU4#$xG&s8%HSt3Ob>=*NtT)odbksBtfB*&VX-D0$uzB5L%%VxuxTWrg%Grp zZI6ECmuge(^r4osc*Z%q{Lhj4Z+&e%K(-6*2{_P_MMS`;p8#>+C#ON39f3>HKg4z? z<4z!Tul|(f%ev4cSgYx=i!b+dy_GzTna;4xw*85zlz!X~BXC+(r5|s2kY_9WtwK$g zXB3nL3(l5H@m&^~+>QP;%lm#`_ZDY|mxw&qds_$aH!xZ=j~n;8irBpT1;Ig5WO zBCvw69rr*OAxEM;fY~~+T5kXdk@L z-#+~;ql-$&l*y5Jn8fO*JoQd+qCs0Im!cn3ZPwxfb8dbWwExOyFvt2;eok%*ui5TI zoe&`7OJZH_UG(hT1ox5`=eb%k7x-sy-zhg=nAApFt4e)LxMlb3du+i)sSm0YS%^Jm zq!1ww_GCX4R++aQkYXMM7+ih(YE-m2W3oyDa*5bL~c9OP0yxhyuOuoH5 z9L%H2E2CQ;KWjhFM6p1z3C@96MC^E>?N02flUG`qr53hN4bJ@8I&q>kv2Rwp(oGD7 z4)H=ce&h!oWX5anektxx4K41bwCIP9kM5Ab=NT(hArZVp@ z`XSu2qk8_#&40K@`0rhdSaw^c#m6B57^O60ofsQLoN~jaVF6w{AK)jlBM3}nPG)8P zEe-&ov%Cf~0;3G+dV8P&fu zn8|*@XG9idnk&IA%Kj}1j>F&Y85xY|d?-HIw*GiupF}$vV~>^tPDQI>CL>j1R7MTM zzUdaCo;x=K#CcM+wHV9F%ruNb0j_4)Pt++Dm=QYCTQ?1?ql*HQlmZJL3sm;b>2f^-y}9UmY`q!{lLKl)i!?20$%F=c_0FLPRzkefTkOzbFJ4^+%p=0bWelA zlIdkA+t3LjLWQYrd)>uU0^k?kd)wf8_D)RS0suZW|DYihzsLVBB`8YST$`NLAYf;q zPsZ!-lU*ptDtcd0o2CNE&fGGXqBY@%8ut+>vJPLXF&h7G%ys>bj^ls&I?^*)Ogw%l z%2vQrX9@royN?hufFypo;tWoYH=IvP`iakdN)UO4m_oabJ!wo?g*1WRWj3Sfl!o3T zIlx*Dpwj^U8x0Lk9`5*T|zhI1pb#E&nEU`Dy6}GdUN{a3b zH5el!M;8ANbg~{y5f9mFo5AMqf}pN2R&KD;7FSB)mDV0daFt^X9oj!w-P< ztlRCFE9%%8V=D||g7@RUdFzPhpMW@yLFCv5wcha~of)2{M-vA;_`cWHbI?gsft)4y z)ooTHYNSM>#`kbB7D)M9xQEKoiEhq>&ko#7I*G@Qt3gX#xygu(u7Nm>x4Vg;!)^%o z`K{#I^gkN%CHFPxkCbY8Kwc!r+D#ysA@0MkKZSRs5t|-4BoF0itj#f<0;^-hQm-^2 zeVM9OS!psS1*m%tw&NQKA7tMQoxtAlq>m2P1T-Hp2kl1CKlydCdjMOtN6VT$SSb?O z>~EfWu>6qq&S{Wu&u+B~BT@GncZs(gxWcn@26ATk$e+hQ7LWsEjG>IBqeZK~A`j)> zens{x2}B)RAws#CcBR*_uD{BDlPH}4TDC9JX3AdZm1U-U=0=`X_h{ySlgxuMBvJ)WJ$2+^05kyJ2U;&kRPXD|h~ zdp71{@Pz-+frx|p1OgX?G%LhLiC)Zn-n`)XBkm4(AO0gJ`~Bbk$I!ag-+s`jPy&JEUpo&SH@q2hi4ZEZd zUONB|V*>reMFW{5iAV9+DR2<6gX$Qs1)kw>FW_)HAGSLV%~^I29PPm{6bQf@iiYuCDlp#AWN}Ju9B?$juHz)y zS3Dy#*+-$A8xu&*QTwXYDv??qI;$=^l+)g*Yh^Ayllqmtd#r%&jpD=?tASqEe^@!p zGbTQ)ksb_v?L-H|E;GKq*|Ex5R#3O4 zuSSwM(tMqG3-5&K?rgHP`b|PNUvC~EFyAI{WBSq#5ZuFHS6r=i!a{by#nQ5Fal*_3 zd%pRzASbydu?bfo0X z`6Q#2445lD(@HGC;W%3aPGgkl!2 
zc4`ZZe!6R;?QT1rce7&DO7>T<%kbpHXJcul5ijez2=9`o60ESrcQx4)!)KK2dit6NgCri#qg6oLXBi*oT{7oOqL^47=9M8)Hghi1t-i zHv2*9;xUIHsZ6PFxBG%auR_P6Kbjf#kfM;=Wfv}_uWxR=guI{NBk+;YOay3K-3xW-RPDC_#<&K(D%n_azqsH(Z0 zl__Kl7+v#t=*7nZ)n|nw?x&zJY-sZX6;f#T^!2Dl>3r<|ve;~eYGGFm^La=^;FFF8 zpFyFsSj>;H>3dyP@??iUkfo&r1d`7gMpKC(4e*#o>7_iXwgT5I_DjY}R^8#lN437V z!do#$3NFBWzYf%?;T%P-Ql9hvRY?u|&F=aX!i5oUX3Hoy#jdBGd*$JUu!@W6x1ZzX z^2{H?4<-WaoK_|*Y9#TNRd&NF%-DL0s;GALi7frb_e@P~v9qsODDj-AMy+L9#)ay_ zWgG1k;|6H?Y~Kh{ne^EUWs*f>ORk1*SU2M^OEh6;I)qT+5$nv6Q+OWRoyB>15d+T9 zY27hi-ZJi%Ga8;IHlogm6qU8fX@15j63461GQ1Tb4oD4X)~Rhmi{(N&PzE{=d3!2T zUoG*!zk4sp|;f`=$Nb5zjz4cao&LNE8 zlHcnU_RcH*!BjU}ILMS}e)hD`yiTPeoumUNFH|!}cxL0h$MdTnzbkBY_cNxxy33*g zLJQpQo=t-?W4L7P+#Fao>+ag3{9QVyijQp?tnMvSGf6mb=-zZA7W5yV*6eKJ_@}W$ zn$?(vjtbErn@M_$N%bzd+UTQXGj6FI80B}C=sWkipm)$Tueqb}-b?$`1Dfwm$Hd&N zW3vR34B{5A$l?~hZzXflpucC8$nZX|)VbxMl-u^2snX#D@w9C*$PBTJV1D-Eab9Sv zqF#YXy_JQZ`|+k-dPMn_uXWM*_`_v8w0^5+MD?vT7dC+i4+SkR4m##;Z~kU!?w7h2 zDim1dm09Zoy0^>b7jnlVqfCgMi4lad-+OdMsr&BUc zDR-%DK-%x4()(>#$2(+Qh6{QM5y1%0mXo+CczZoyAb$tzc^||1Ip{51zTSoBqesku zX?agW#0yr{*2R@H0u7*Hqr*##5U&QLn2tbh<-43#-`VDuo}TW$7}=)AI>%xdaF=J5 z4J+|)OC8)gltO|aT44oxIR>t50wWW-h^+oZgRquycaiNgKJ+xQnh=fWTZ&`B8;m`N znB>iDO-$538tuM1@XWj;JK&n6`Xw#O9dgxeFh6PrR0_iqDcCR+aUi+?D=2>jeSX5>%UTDu|>*-oosHlD`~t^`<+uDbeEQnxqiR zMd}w?N)m!hn_%i@2T(YSc4sNkaC8MD@SWVg9vQ8f-n{wwYP}3u^#eJa$=%9Z}~n<|_X*EZ30kNXs4 zc4`kaSDkU!>mS&VbRX@MCM1A|iP)v~Tt;=goT#Z}5L|(%TgFaR7bVy3E5y zSOC1bMU?<9w?wC<1>ib7?w(y2%~izz*bnZj#O-AOcaVouJ-hWx=o!b7;98=auh3jc zklTV|#G8(v1H3|d%x>rPPKSM}h9++>ZwU{^#HZ#n)T(NGIlZ2g*T#L zS5&SAywUh-9iXq#ct~z)Qu$_|B5StzkXFwXju-L_<376S0>)BtW6joKJ9d}{w!SKvmZm3?@pT}+zzn8X_vD6P*uJZl7h1hcy`r)vAVFRC_dy%qnzKt zt#%0XBUX#=l$!12^Z?euLYc=KWR5pijA)WlVS+%RV%p*t5 z%<7L0%g_b8?-uvT10fM?VWhwuDlv( z-WY8YXQ-!@Z+1+hl`?eYO~KnjgTYR#)V<#%*U#XjQQwM;KQ4cR-vb&30RSAh1Dv3YGfU|}sPe!v?$x5wT?R5d?cVAIcm zJZeDul*7vd-j4Tm&+Oj$(o^!A1d4A72{=MTxy`_wsCu76SEj)8}Z$spY3WnvY6+w2#Sd^t~4MIdZi>OF20jC0w?ILH)S zIn_r4CKf6kg!g#%B`^#A-UJWEHp#QN-6cQuS=5%KsOy?D%CD<0F;wG5)A~s-)&-4g z2T|=D4_yuL1A>Tb^jI2a6a#j9V^%(v*CaFEzNxydEYc5pf;baG0uX;GkX;>%j3U&5 zR%0xGlT6yq01JYyb=`?dV?bMs(3q>-+Z4Zlxc>IRcys?x`{+ot7T=R*#PD!amGWu258{`xvEdnSuySjtQU%*C3piR;D~uK znwAzbzU3t@^?7a9ta7-qPPW5FnqbRjyCxW`Wj^I^a^3CIPkWfB zgJslS0|-H)(*W$g6yt9aKjch|)bnl5$Z+bbo=)oXhfNuKN26wWmBXIieSH{n%iG>1 zTrOb!ZM);>jfE?aTh)_O&S@$tkKZb~!CI)r9uz}_vARH7mZBm0M+!6%!I0Oe+LyU4 z^W^sVTSP0lqo4R{g=DW_f)*klTk%9_EZEbYIOVNaq? 
zsSg>jw#?RNeOf!_=40jb}IfoBZ|9-ZpflhK$LpibDOZT^t9O${Xi7+NY<1;HopPFZ} z98z#mAs*ZsQjd_>C;~jB5Zjs&h>got!k3D=WW~T1=hdWpBuq+o7%S{%l#BO?>U+^1ct@+DRcViqA&yiyxfa^0j*tM`^ob)_w=aLegs zILC8`{_Kh}0x8yMsreP;x^ee9&5vvX;6M7+9 z29t{|&UWrap&}v2>hiuyvV_RN05H|0wr-=${A&P8x$^f9QiDH^6&zG zQC2G}F21KDX*0Y_vJ(QteqP~@1Z~^`bWpGbyc=(7#``+aftLWQ&h1o%Wl_nL+{BiT zRXEDatGWoag%B*-fMkN?3QRN-(oFpUhHDuPfq;-=Rlhseq}@d)r`V$;mE4r8KHnZ> zxXq)WAY3XO$U$&J?^_Qe8Wl^x8*FKT{z1d<;dHwn%(u<=7%Fcl+ zLqwu0;dGbh8jHbq*98;pdZkvkcJQB zEL|J0vlMHCyBsYVl@%uafpiGSG-MDtmOI(PES!MDeBdP0TJNgrh@&q^0c(ul=S&xa zWOC_vPTjAu$H5{3AxJk1()0;27ul%!TRjyUm7ug z?eCER^ZPFav;Q~N^?!~@{x6yI{~E(IZ8`QNpC{xzy7LX{=GcwDRL4D0zjZ)Mxj=Yt z`p`Ra599|e*nayCgJP z{6`mFajQKpANOg}@^?Xfn4_tSQu6}rcoWSw{yMw=c#UAka5g5NceW%FE$f|7N|zo!Uujl7yysVgD6!zwWn)yrKj&`+OhIRwhUk}w43>gM67IyTHT##STbn$ z+gG5HjWtuZqH#srI^ONI{0NUf7yF~6aO(N{+?!?KA7kI=LnP4|-I2$a4)c1Vrl}VP z6)%{0Pm)}*n@h(}R?5Fd!_Y6#lk89L9Rg*}GzSy%B z@H?a|_H{52U@0r_@UBlYWafpgyDMYA;sE-s$zjsn81U@d%z_H))S3Bs+{<|Z z^mNFs>CX_SXxPp;SwM4}+i6uX?{OW#430u{Op8by$AWvYDTL_ePlWF(vS?SrZcqzs zW(gB=}t%;_EBwdO-w_@x}E|r+O-V z?9DyLF9!sb>VB2~VXk2chQk44eeTO3C;e*>wJ0%v%lOKeOOYbH)zs6P1O#RbUkmR`!=d16_ z$n>l2!1+sUAc_0~CM zlV|8LDW-jygg$*#aN3Eb2&y3EaYXyYz|k&~^qQW!xc;Yr+ zz$`HcHfS%`mGLhLF!EWQta;J{^_|ff$s^w04^Cr}-KV-CE@~#$mspFnFF)=yeqSGt zEm+|u$A-0}?GQOJH_K}3BR*){y1`wKsbT+k0(uRgtP)_rr>kl`yrhbo%mQ0@gWGLH z-1H8V`0&h>0<}IBrx)UO7>eYRYzt=lPGz@Q8W87piKoPd$^wbrtFZgN8TV_><&Yl( zKgr>V#&NyzXG&7@5*V#lutY}e62pt9v0-YZsONg_$Cq!^#u28BZx11_x}w5~gY}=& z;H`s~2WB!+Lp6fa@}Gt9*Do}UG;6hwuNSbMYIK=Kh|UHB>xC67To5`5KhX7Q#!&&* zM!AQmwO`P9;=lYBL_LIvCeR(7fzf|t?JIwp>;7OGBDdQDeWM0#L?u5-JTfqyU`G6vSaO*zDK2LM3l6|zmlFUC3l&H zG1L@f$5K2+DIbSGDfr%8m$d;eLWm+VA2SL!qOX#*A=%JXHB(CRc{46$PcHDhdt41_ zyw}hDQmf=#2;nEGWx-|S4#aW62w6}T7pMZ-WUn;-o8^!97gMh3Q4C-@s2W9slAh2k zsJCwDL0=VFCZ%n$`HqJXWLSw))z6=>^U-l5fKILXo{HulW}V1eY|P4_$Wr&4eI@;(Yx9q%E?TtfDu(oQ zqY9NjGE6t5;VTHaNWR_`80TP~`2bk+XKKk32_A8K*+u!F-5E6ZSi{KH&nf1F@ho}Y zuHw?$lZuK-OfxgK!Wx5Y@?HjAn`M9BkT~R^2C(5^lwTKVJ62awa%xOyLm-U%yfVR5 zRgGyn;Cc&}0QP-JK5+Q-RM4X?uQp|Q{F{U$ChYa*SmrB+hdiri&%U=AYT10Y3DO8d z(%~OqCVG{Wb98T3{3hui$mtE^%GPLk9we?`1$_;XC8R@W7XwzBCT`fk%vc1h-jK5l z5u_qlsbPWTb6pB%!#tNrji1nvFdYTha=Y7B)Fddmp4D3#X|!>%5@MWuv>KlxA}0wO z&+PyqaJ+pt@jc?jaI+&2wf*p^<4EpUF89I6R^SzYxNIi`$L<1x{a~N+`fJnN(1KJW zPE)I%Ll!6%L?{#_rX=!2m5ApZTAPyX1q`!Yuo{TAWwBe8g)_O;FDva~Ei9EV>~8QJ zQ)8)5T16->Iw$gL##ri^5hi$%B9eF=anjV-JYCOFy-;~0DL@M(lp08;Ohy(Oa^*rJ zd9&y7AiF+>IW9QMwd|}q1*0T|D5`A*+C8c}frt)*nLb==Y+t4remr5MMfIDciJb}b z6%qeK3Ok(eVha8s<=LEhmDuooK={UTq(1_l;_IS!gOWF(J7?3^e>z=II%Ym)wdW3t z;A+2sL?8y6f!2Xh0%}=2E8l}Kz6a;!8()s6gtsswMznDQX%(4Va+_HPbulWI-{=1P z(8o)Zg|Me~MHYhvuBMy5ZZ81ZstHYO!B+<*I@67;FOTq>gfB$W09cEq5f;^Us($mk_%xr+2;a;ZQrXwsjd^y{a$Y zzJ=)7m>T5;0*ot>Ud6Vk$5-xhB!XW8ozv6_`)UlVBOR7hb`zPl!pdo~N3q2ROj|>V z?^6Kt5I5(eR=**$GRW7DybAqMkLplmZeAu25ouoyEs55B3EKRi1klr;hbohG-FWL7fE>>I$vb+!)cXF8j!!LC_HrZ%2{j4Dy-Nzn1+o;QFHLhe zxO7yPUNOp}6*x@=_BgpttEVDdMmO08zxgNoi&SgoJ??AYfZYxX>p}d8iw0My*DWc& z^Bp2n1&k|^7FmyNmuHZdIeNk>NHn>F^H3^jZmPr{!tL8_ z_J`4~_GIj((s3m^wzk|6ORU~{W@nt^6@l7!qDo(pkdToW4%_Nt;L8uWsT&!*zwzA~ z`|f=y+9_l7Hm<|pm|OCkbk?lF^xMFh-Ay6~uX zO3(Su9ZsV-K!lcEceSm4p90I!nW;1es(^?TXocKZ38ohH3Y%MfE{wBZJsAwSg|<4e zhj*M{0^rM({Z9G4Z$k(5dJAkDa+M8epQzZOgm3FF1aZC2#=7ag*Ze7(U0M;b*GU;} zeJaSQwWlvwhwvv&(91R~%*N|WFr{{J?Z`THs$3()a)w%C*R5t|XC}qz@7zPgCD+Q`)44KT-66q_4X(mYmwPR>f2r##<_FLbJ`t%B;X4}K zC#OKE(G5hTeEQQiq}Rt&)dC9cr2R27xiKNw4O|Se#_S9}4~0NSz?UIp=)<-k$HFCL za;XQ9*2ofCwTG z6%df7(o`T35D*X%5D*Ze(nJIVrH2Hu&_O^zP(hk>BE1voO{Di8dQT`J-8bF)>~r_t z=iL47cfR-D`R@1b9|&2ltgOs6=Nx1F#`ygPol_F^!lY$UHucEHWn(j_dYo>1l&i;| 
zs4umgE{11YTgQKTfL918$#6Nq)9-{Z)yUlAnu9B#Z!Pr~cRhkt>G1TUq@zpSraGagzgvHdP!o?u z`q*(z=&Qf1XAdqmiVRJg+ZYDx5z1a|Zx39^x`d=)r|+Hir%xpRXvmc=(`TeWSKqv3)E*A>IHR=9bnKO%ln zu-Pkdl*#lLIV(kSP4jbM5`_xJ9@j)b7n#valg08i>zw`%=Eoz^^r)6nOz?KTE2 zU;%#T^>?ndhlNjLOk|5SzXbhE)>F@cU#}LVUZ&|2uIhx0-RxdBXYpO-iWcY*wPP9a zr#mclu`yauY(*m?L2@BA+A3}LjT}>9jVMCF&LhOb9)!A@P%`TPUGIFpow2zhNcdA> zNdCS7V`)~kbC%m$5@~dgiadM9QKU$DNLzG$>AqXd1}DNm>6H}Qs%4V8hlH?jia$W= z2wc!nwAU6Q3~n94w9>5wf9)jPy$K-;y4$aRq@0**m097LrGBerAw;}Js!Z#;gG%+N z;jsy|R>2REY!jYiVeJ6;n)Vkb7S?LhF-nq-nIzt>Can`x%ShLPqP;DV_U|w(?5FpO zRZA(y7(=2{x}0r<-oA*W>u)Cd;8%eQMdn;${THq^)hPKXvG#1wGr2L&Prj;$@`3en zH##P>`5F<(MX968ued%%v&ag{IQbx8<4I2zo~Pm;9HGGYbNlD@Dl29ePtpejC&1@W z`WEW}*NDmXp23D{c4(9*r61%A9K%6IR>nqRX&s90z;| z;@FPnx*p+*-Vu_eG>bi_mv?w|-a-sq*4^qI#VcT5psossm0O)J8n2rgR$F9fR>jP4 z)#TQy0Aux20L*P)VSEG8o)8c|8(=aR)LrZav57MJxh=M26&UH~BlFN>-&;pDw(EuS z{#7zLW|vH&VUGDNDkR_8B+ks}8{N*%sZLRv2R{CWY!>cklu&}Ev9G|wIv!b@l$Jh^ zdqK?kA@b?@h5~CQdfW(uz)BAR&}^#iuBQkWmeF=A{Gg32 zLMWqDku?m#_n>k6sGd!w-ikUZwZ|eSV95hL)H?4~W7$dz?Pow}4||Og7aJ5fvbP!G zJ>#4^>@u8Wlh~)cmg#BZXLOBw9mHDhFNCFn^}p{SD{9u5ElBC|R}xk`?k0i%gU8^D z;|40-M4LzL-`7-_jJe9)<>}+K+fTh3ur-{73&BLTaOm;ksdq31m_Vu)GjtPN@Ygy`AZbyhvKls*t2yMe@7b5P~lvp7Y2CXsZzqU zCvWl0Dtd$-6j*t#S)R8drY)9p8AxrzkC#|n*;Mb}*zr|Ss>(T?=5Ms)Ih@1x)945# zgopuJ7+_wWFl==1SuY_lS=yg{E_JT{7M#7Hrh}L4j=kD+_vMNNXpyiDl{a+c5ALPrTHmaPi4OLu6=g5@3NF30xq1;0!uJkl zu$lHHBrIJo?#JOMNfkRL-1yC=_oUgn>+s;mf^IuUdSCU}B4UkbeC!tKsK~=v!oo!` zd42n0F!8g6{)8*!IAR8&g(e7PyYtH7!q^o1wK5*1ke3a&(R|rZTRCzF0KL>bXBnJ` zAdmV{T_P`>!dlD5(k}Az(b&TKQZ~2!)fZ}y$~XC#b>_te3`~o<8tMQ#Uw&$xC3h7e zBX|dU^!}Vq=C6X-1B#qE*q*zm;M&`%+!vx0g@s~9DEp`)o@b~B>Nq_Mx0K6JXOYdf z<{guT4VNp*6w1dC#zv9~cZzytS1rW#Le(8?IMg_f2S;WaLH)Sgu^RO=0~cMsRYLhK zb3TuM1krByHp+m6f>EQ9_&940yuVIhU4pJ7GC4lOvZg}M+7Tvor0qnBE9cG_$INR)e! 
zixyrWa5P{eFi-xfZ&B+?|B8|mX|Oo^r1E4iL$LquN6MwrK)mgv@#yz{_&ZvyJg^BI zhQogH1b!nK=V)3wy4+oyx6=4LTY2Pr>Uc2Ck>DIGhv4-LrZ$fDC)y0`ydCS|P5~Os zh1vTf9$8Ag_;~%&TtFkxcr&+Y(=&1;T<7v1Q9Ka( zBeizw)VU@|oHN)EhTw{b9=;~>qlV@u#V_ww**s9MhmRGkvEKVj6xjA5%Ea!lnVDr+ zu7k(^K8=Dz2EZ;Q)^YZTtsA#4{JB)-b!mw=qX+3Gu}ziD1)TncCgDI_YefqB+kG z5T!msY?DXWp$D0ptc}Oen^7}Z?QKb)AGI?0TD`!RI-SZagpQ-Rc1xsD^Z`VP?3;t2 z6^zfldo!cXGtr2l)^a;esuuL^2a`#3F{V!MkUfmQh7|HuU{>{3K1hS2dxrfs+txML2pm_{}H;ty``x1T@B>1^(pOX~~1jt3jp?IFqK&j$vJ z@{ClaTt~R?eERUg#Bc#Iik`i>SJp?9!`J-kP(;V+$IARH^zdV+iPKPdTZ#L+n(mQ> zye!gVLaL%ngMfmP5T#Fi`={kaqi>6XtN0zkI~V`xW>kdvW0or|Gjb;{B&C_@__&Eo z1T)$5*9L!0pk&NI*Q&Bf*@u+WX0wEPi`y;{Ml+j!b_|!I2eQ`R-w$tijqo8^)lG8a zZ36qMrN1>~sQI1gc%V27x+`7Vd#vl@DG`;f-l@!u^ZNMt*9rIVT-qu=ZY_f#(zs7b z4l8)0)JZRK(t23AQGapE))1lQi6*Mx4_O1P6cNw4 z;=$l|n67cSIe@&DJ-(DMQ8Y4Tc1`QCi+=|BgeK@LGv|UR2Mz4kBWE0*q<152oKKi$ zTrquci6NPayta41v+EnnQ~Zn2{Lc?p`iu)!=7*QB`CO|?V(n4q+RCW(NIHhJkbYI| zkxGi;OEd1vOoBE!i?oThpx=M?nI2lJEG&Fy_}&XN3&tK5Cpyf8XIv=4uQr%{?yNN3 zp*~(5Xnc5?R)@))Y44iM*j7@)B{j%Y0_(NHc1$!S8!=_N?5@BYn|!g(!QqUKMCkL( zKm{4-LuS@jbMP|^iBXy)z9Mlvw{q|+ucBzg*-KE_TzF+XpT|}HHfZ$F(2)0+IZAl4 z3{dH)(vk>w{k7rxQ!h(yigus$)`KA39G7_hldIY5xzU^G`7f26#lGgkpT-l1G4dGy z&da|_bYWa&B-u< zD&XnVW>rpz&67{RRQJTu&!2CP zIehWahCDAZMA{F|lFMlgJ6+}yirP*8$Z%EUCGv_d+uQV?&Et{&NqA~?vBL|hNQxsO zNcxx?_07$?yt6FNN~20;#>CFq2rlh|l@Op!aKFh*q=&k+A5Rl5Uo~$%=@)->zx7f%Fw>1JHH-@>hq-MRk)FV;?eHi4y#xiL(vrS7;0jI(YddjHKy|dU zKi^9M)r^NxXCKhy2{jRc?~BYh=O#7a;ghNgp&sIX9UIJEd0Zz~u4-v3rerV+;kw3G zeFCM{aaE-*5;7MnuYZ1Le45FJ-O!{HT{so?sR@4EP@~FsFHykdsjKVlV+yM1eIKA3 zL%lH+dEUw_U0pZW^%9Z8<+ZOKh&0mmcMt(E;eMQBR9(jS$F9_38vr9t-8AvAR3|KP z;$JsXHS|U_8Sg4Gf?~4T@_&lv{1X333j}j@+i zvJ$z$>Cr~uA&_gg%R~AglA#6Z^k@?3P-xZ)8{)#l3;A=gE@2yQP0FJYF)7>9ic0gO{7Yh#WxlB#G0V?1kJ?;O*7=O+fSrz_=+ zB*$;a&_%{$hn?4q!#;!%23Og-pz{pBDq38NK(FZI<9x#%6z|AKDR|f;U0rg;j1}6K zep3H_?B6`?zy7uqrfMJ`_o4RFcnI(&RyC`Rw9!brqlqFlAIa`LR#~iqQfsa!FR$N# z0-ZX%kzPa?P>Gbp)+--hMFmU?i#PN@0gn;9eh_Knuj6eiJpmjD>#+qu&3n| zsv^$eyKIYmQ7GD=h%WqAy@B;XPe&n&7Kz5Ma#5fJx3uY30rqx}N4W8zJm_!r1^B$m zLi!wuUo~b^^+l{@BL+9$C9xr|;ZuKHXKP40tTtzF2`vd9JmFGN8j+ycViMWr0rUX! 
z7lOt2{{a&IWmx%N{C~Fj4#p8bE&6w)PU7Ho1S3ilk;Fv-DRt%>pbbGCL4odNkEUqx zqmag{k9kZspfvUn2CP=4sNt++x{!i>X7g4{G;q{VN@3sr_kf5vygJ&j<|Tpi#X z`u(r{WXHu3ufMcJCBNX$RU^$m4F0Tvuo%EOkbj z^{Mv&mdI7@EFAWENVHJ0v+bgp@WV6qhT-MsYp;~i6H$ueKv0O}n;#0w==2idbXI1x zmaIvt&gBp1QzAQ*E;{X9WqU(3d-n-nR9ESaaUzxtR?2_jYP`aV;mdj!ywp%&%*4KG zK|g7dB;xTR{m&QyzcLu|d+@L1_7tG0QNMei_T|n7e50L2vBnjxekI(rG_G(}ylhPC zN1H9KZhqfd$l#ufNX*W=;1A%0v;R988`^8qdxl*j{G=9r-CzgEoO$F+RVMMl<7W{k zwCHrzHHK@6>j2f5-|VwV6wD0GsXRt>)CH;W;&+TW56$0@vsVH3=7vas7oB$|71&OV zmsY?@3-CNtoXKJ}D6<}oOQDUd&0`zLzyctoy#L!=ivKQG=6|Qp)z@K?2`kblWYd8B zhFp&ghBSR@u;i=wqos7Q_W>Xa92z4Bw|$z~^mBp_ksNME(^s$RZELTI=2=-f*k5_T zZPjH%(#5wsV`qG?qN|)hP1M&{roA){lA-h@ejpQYl|e%50@un*;~aidM}@sY3sIty zyPhBPkPUT`KNMXhyGl#r($*X4nBLy$4)&GGN9TU4sSWP2^^7Ue@?Ep6hql+w{+J>@ zA>9J-lWsgL4qVV!PX9N=mhG+X-oq-r^ak`@hb8ZegU-RfAxRON+yhy^As1mz=n=%s zYu`bvUfL@C?tlsX1B|y!r$!eQ-9&#Rw===za2;r8bT#V@*#CxzZH=MY5XZa5`%Cn9 zV-Ng>6kn`DwK5LEUzqd<3?##;w@d~;7)k7L1Aae`uR+pQl5p8t$jpH~xvH66b}0RW zkvsi8O%}C)b%zx+qw8y{Q?l$xU1SmfgnjSCZ6ygPti4WwR^Sw0?Y?4T`~+8Tq1aOs zMLzgC-un$%S3%hH7%{%j!WS2(v8@P$bXrDJCskk!%@+LG_gNiSRRf3V5!@uCc|=~o zZ-_IdWC_T)+$UhSxABNM>CE1BauR#e@D^f*@DOg!hhFy%WwZ>T8*m@s#05>p0H$0t zgHbq(P6B5`_0z|;VQ1h4jAR;21i%nuIcP$AD5I4ZK=v65hz%^uM zs8YhDTI%hHDuY#?QBw>W9F(=)h8Lcy8EZ6dVGLtQLlZlOzagdtNu8v#l($kXbs0oe z0=M%Cuo+Bci3JeV2-1$IJ_fxF;oYN0w`)c#Out?^Sp{W+j&L?9m#o66I6eHgq!I3> zFC`t))CC0NLM<5-z9pTeb!@2Y9rf$ro^+=YK;8C#Oi(XkOyr%ioY+&I|>_S6O{Vv6UjG zf`#ZS)v~xry8;iRI`DcdD*T3=0iqoZf}}-;KgEN-`{w|rTU!jI$#`m4H`vxF77<-o zl63?>x*Gv*q86MzdkNLNx{Q(X@j4@E{Om&fT@w+ z5KO+xSkUBPk=tEboV66CC3&0paZ ze8a=a8=jHq@FV^ePZR13qjzfSmYg$X)d8U1z1T|*yNOJQ2ldtJ=pgMW*W37_tJyE+ z!v}#ch`pus9+>`rFZBJpzYlD%{p)UXK->4sZ-}rC*e<`zCQC)r5&si-koT0lcKhJO6|BPQKfcx_7&2~Nji7q` zhUf%MgN!hB9EOgT$2w!lr>a4Q1>pgwNQ!Tt!ru80hYIjXjGHfgZ2;-xN_hx<10mfy zil1+cO+44TF>^KQOeVixW%+q*%0-4-F@k*TJ_v+>g2!bDO}_vq%LL45W>Byj@UWv<G+C5EX6nBw3AfY{Zy5KFRezUQnxBTHTQ^nOC|n<%UsO^-wj*ZKYFUlYXKgPjwWqf)iEc4RVG{6fCfOh_&vC-94) zTLPw$jg=I;hMUsqca*YeK}d-ifz5~iAl3`1l{V_xRnPnFU9?!~kc4d+2j*9m)MHV; zhg+9Mc$9{w(f-vUS)Qv+2_|o3k-iwCFK0ELt>0)_aBor6cWoIq*|5GsGMSERP`tnU zMFrP`l(Rp{`a*u0-+zYtZJY2%zwFddzk|%)44J;yjgR4Vr?C6|2`eORLh0D0%&#! 
z?h#MBoBkSa=hQr&XKS0^&;ufduS3_FXe|(vwMNus(m8@uFI2&>dzbXl=6r>NmxYJh zL9tplk?u@6+tn5X3yk0XCnom^ML2tVu1@H3bi_sz_s^DLF}XbFQ-u70@8QR_Yw`w% zHai?>SPrX(4n_u)icD4^J~ku^!ZbKCOEHENV@xRb8!JGcr0UzMM(4 z)j7~%sakYKLc<~;3vsH#&S2C{3xQVft+WT}Q;BA-3?4%deOM$t5<53dliaBVfu-2$ z&MYw&%pR^;Hp&i^A_uyw|7%)&1VNCY{fs*uPQNDD+xiag3Dh;9>{GHL|XHnrtE$q7m( zw!w}nk`F&)LKE&q?-0<-S7c`pdqLi)OxiZRP~T*>IakxTv`A!iE|U?1p`P?i096Q$ zqrNo%bNTmm?GIb|*|PDq%dFgd#*%f`Ef+H;&Cn^$`VFxxMN@=~HP{1*a<3g*yN*|VcXp9$HM2YJi27rj4(Q_&vN%eYq9;=NyDMyivSfb76*@0hVi_|(s1YCJzA?q@3Av)_?~j>CY0Bf_LrAzm6(J9efCj%t@-R#Bw3sNecMQJL0~$ zqY^*Lbrer&EfDojUK>knD#g^G3~4^3^<3mL!Z8!p!{2*k*E?*Un7f=Qs5Ct^-8#&F zRsTnqa7}0t^GBCcZhBWR>V2tq2C+}7(l8FCdp^Qj*K!6@{p;J`2zx0q;E~fG^wrVo zNq=d7hgO9#RZyK@(UiSDlPPX#z$x?Qk|V8CO-!|{00sNv?0&n_7`;rNC*f$NV7f9r zo~Ds_fpl)hJo&7b#CmPR*4G64*p@c$nXZnVZ0j%nfo;B-ueWr zjXb)LPUxy55?Gt0m1cMR_-Hcx~AmZ0;u%qHWSw2I6qPX!#*ofp1LEt@P^z;o&}-Vm6{+R5H)*Ly~S} zY}Yd4*E(tj)n|DzpdP_Q({=3)TI?KfO7qT9hPo>`>3#p72L?Tf!l8a5wp9M8w_5h1mzKr1 zO^83h4n@a7DT~0_Nh>f})>~@=n+~)dy1OpI2#>>9WJz@ryIDu zh5`eJUT*Zw_h8KEFeYg6AeYYFu<4h?1-sD1zxYm8^ExsB`mFr?QPfpoQ~ZMib;G<> z(jyKt$^lJ?K%AsITtU6(G1aEu_uD!DmcvBbHfJM)tDUGK+6@9=H?VtNKy$1fp0lTT zr`;X<@l~dRZ1qlf`WSHu!nDIhM7rR(9vr`L&3%88vc0))4AI!UY}Mx5V%&$ho*Wjl z;=Ge6)N;~yg;3#8e!C`l`9hBqY7(>C&f;VfhncBigH4s8olXx1=e-ZhLwCp5c)ctp zbpl4uth1bMmI+rDZ%67-U(odW^XCAC)Q_z?0xil^SF5pJZT9$ zXhC#a9m#wd)Ssjo4lTE=OM=HF-VEJcg6}*vS%9yF{f3CJ%s8j-F(UO48PMV~+ItKS zdgv>FKeHYr3ZdI{{dY2cLo$TyBY#86jmV5!G*f@rULM=FN+4n%V*kn3 z$V?B>f~H9GWI=2YP=t3$Wuqh+?vgA$shXj}vqTmmXU4`}2aYRGB3d@6&dwuFhWSTEn6 zIx%~eE{I9RK0^cNxZWK9ZuWj@gzx6Nbn|{tFT}ZZ2lndDHe()Th`dgCJLISIBE_@0 zPq;WIEhfgh+H~w9`bT#mLykHt;cXK9k~^q$VmYCvz%bnSz7!`kS{!J*1)UU*vHKwN zR7cAcdw*GYI}FRYzI82EgTvG7B^{KnEIJy+`)9T(L9Rz5YNifWb70wEJS3m`XP=}b zC1di@s|(JPy1NYp2ZoQ(3P|dB?=9!iJ5b2eg7|azft@n$o8baSFHD?ar;nI-Ks7P_ zm@wEiL;(Dlk_xWaH%@+Vogb4`@oCS+W$Jic#CN$N+~gBN$@6$8uBpV{-GyE_X40HX2sqSw%*kyJ?h+o#wQ`I1~(r@|19j|vmY$KCSXx;u~A?qO{!sd$gj6Z=CbWI zktjU_d-1*780mMMta_|^PlRe`=L)#KXtk%6Jp*X%=D#&o`Hy)3ID)~0Zj;An(EW3< zJE|rx9w;=hkuOZ11a5~>93xPB!+|Ua5Gc*hH~Dshh20JFLOYEH0iNQ+DP;A zM2A81L1F{hw#xj^EM}9cGMy?@N5n>ZIvTe7IG9I(S;F~P#L3sk-ed{PzSd8S>GjO+ zH+SIRnz7rjLc#+u^E2jx_OBLikDD*`)%E{d@blX-!5SL}wf$XIgMCl#Bi;XL8gjW! 
zDM7o)YhOuq;>f#WSFW%#mYa+*9V2`D5RRxO!s~3(H>XurfJ#p3o@2xiBz=>qrypD- zg*3*#q4k=nPWuSGfzprENF)PrwSltaVhgQ{q#Onhb_yyAh7Q6--lW>Bc({5dHBe+Y ziBBXZXEd#Z3Qsz>YQV444}l&?Pm|uoF(mNw86mQ_Ztqo|aFKm-Wy>6P^aP7gvxdvk zeIiCiAwz6tF2#MmPe1z;Ts|`;m?XGT{~t1t;(8iAZio{r?iBf;r+Z~(x`kI$Pus$~ zkEbO<<_NkMCf>`V0#Ge+9)?K8{`GXG2Is7+VG&e)4O7kTZPL)7+XDJf^*(5W^MU)M zXYcorUG6Kcytv!WErot3z5UqCcG~Hj2ej%#tt8iwsEbdHgoVHBt250;Ti9JAlc|q9 zJ2!S~q<@qaq894mCF7z>r_9}v#KKbDob%GxnLW{sVo zoe<5xg1z4ISUJ!1HuadKl=jne>Rdyfob@J&*9h)$Gv*RGQ$N|qkF5nFd*L+Zeekz0 zRFCVbJ9W;3a?>4=wHZ7NetJSWjAnGm_js@B*;g&x2FZ)hH+coHlS(g-iSNAn3+w1e z_P?=?Y?l(`zsyXW?^5kOcfxeIvEb(G!ps&}+O9~JQbIqb`#FzjcSd{J3NlujW2Jrb zA*g#Pp8rc)^=H?A+aYSI$F3Q{86A44#aC?V8fg20Dpdo_s9E3baEdZLuC)U>4&jIJK%O&O4K>*E zbU7O)jzaa`H28?RzCCFeDWG|O*5GF&1kxt712H+=k4i&zoKS+ZtO-4OrQFnqDd*r~ zJat&go?-Bk9#x;2R%UB3IHqIs%pak5jp229y(Y|qi=x5t>8)tf&MngTO>(wY{tTGQ zDnVNw>w$3G=&~In1;jhcO1tC=L>jgfFhoLZzWF@;b4Xs4^Ni31NrF)}L5j3mz?*Hg#U$+lb>k z3m=cTDv%*LKGX(EWf2X5C10K{;*MttP<_u;e^M7bmv8C{@W*G}p+4W0)ls14e(vfS z%`qL1f@$0G9IS2aBXGk#iI@IuSKUVn!12)X(1sz@I77C!POBmHSqMvr+xP27SCic3W2Hb9@p`W9a3dQj^gqK_k9HI|#aPG6PSPzjMu7NnTfIw!$ zX_}ZfG*k93t)2Y9=j@wD^Hy6=GLsy09o)CPJR+bY>$k2xk_AY8A@!>;*o@^!3@woh zwK#eq?bf>64CvQ#lXO!{`WwQar5$wTi{8crIN)B4YbrZg22!5o$4N>#{TcUQZ%w;t z=m7^fyON&6Oc?u(SY&b3CGxo-6fJB`SuqMSH?->UO6l-dq!NBdKmboWCm4D_B8}~e z7d%*t%@uw2D&G-8*GS0BPuK~(EJ~;*gclQNWc#?`L+R=fHr<9rSOtLOcS?Lqb^Y#5 zXKVR%ZNVI_G*xs&KJ))eDfIroY%=1FjB`jN$rn$yE-kO-P5f4|zqdQ!`D7H#y=CL@ z#L~CI9J7-{vJ;Unm)r03m?&GB4;EvO+bftYlsqpjHNIsXtD(dU{Tcn^K$Iun5N;}H z_kPI~$I9p&`^VINr^WJnPTIM>GJY=O2P^{!IYwfaJ}wy)eTjdB%N?)7x5kxoUbN& z{&}8TAa~`mzZ0J=b|#lMtSTZ)i<>NmvKd;QTDJoZ{%3$J)3E@oX~YDBsTi@P*=K{O z|C1QDJhu$>VRS;7WK%Otk-CJlI&F&`!{(76-UxDSBa{(>^ZeYH$>ymthZZ!r5?Qbo ztRM`)!h0~5O^9The!(M`=grf)BAbmqWW)Y!KIMPieEzMrgLfk`RZEcBh$-q0;;$-@-{liQXygCm+DJj1-F_gqBYB6oIoLa(w^$;-r5#=7D^EQhQ^zCs?T~P)ANne zvwPe8*|VK+ywV$iuVpTX_>!#(hWzGm%Yx6y9o~*n?4NPi7AfZ1lU^IbUz}XwL7n&P z?bbT;XlK<_v;D@hV%7?L@uL!&dr#N1Y;6x;ff6LO>SU{7Z}!S}x+PT!LU*4p(o?QX zt-T%(rW@b96=@ZwdLD9epC4oif=O+=G7q29weKcuD_KQs%wqNX0fokX8%+1RECGyO z*0SlcAU3Le!**q$$=n1A!{}{+GAn6r67}ff@35lcvz{TO4nX|tnTOV+uyer1dWvMD zJ~CQUp+}|I^3&mrkl@->49G`VyF|P-pspxDDNJs=%&4~j_N!{>Ci6KHY*xVAJou9T zEvL=WiYU}|EpOzScpGH;~FrI%35F+2QYerv%nG4}EjV1@`T?Z+BmZ2|?L zuZbL&ek9_ShsF;d`3;eiBRd6dU2oxywLJ?t2s(J~2lnKDW-5JI*bJR=(?)N%&v&cDwxaNSCBz8AncafrNf$;J?+*D~<=k{aROy0e3Yg1q&j)5m0Kb4m%Ib(z~1s?F?(?M zwrtM5yFp*P+duQgFGKgk?at_O9ChX)F-Dx^6!z=*h;fTKCaT@a*wk8LJIvUdpb7+w zZ{o}E>jwz2Z@I04HqmKdy}aZB8{A*y*r5qYC6skivNgIU{RVnVwt-^OKCm0`$3s}K z4;Z=?000Up@i_{ODg#2wq^#Xx=*&&=VHD&yr0o&Dg^pBU1bY)!{+a7lPzOw|K+d+@ z|2nw~m?t&>huAwGL<1n9p13fnyq9pAyT>ZOwT>eqBUZ<%LLU+1MopHz{6Lks{Wa>^ zy&Uml?~HlQMyQGtKH)Uqq~jIB1S6i`HZML zWaxfY2M<@)M{^QeFcexnrfc=X!vNXM?)&U8<6!-5mhXzk;iGayI_fI&0wD+p`|+P? 
zbfQY8eAIlg`)qrE zHFg#n*5U^fcv`umS4fCnBPM>sIma9VH6n4ac87lP?IL0op@oPEmO!%EU2xp7-p+?8 zor!;U1HJ*e#nq}mxAnx>sCVSIuGZZp#&9elQCg2p!?ot3wbZ%DgxzGsb4E)o%eIr7 zUZ6q&?R!Ll83#Pf&;FzD!2yLxHhcsJ9$@cK4I9!Cz#AIFJzlWj%=Hj5l||?V=DG~^~aDj zp-Z{LU~Os3mIsQXIS>kCQa6;y`cVa9M7uW1S4X8pRvl2_7$&jQRR%=R%riVqbQJ}5@muVXC$$xB zEz$>A4pjWud~o-5w7z-KH=w|g7>U=Jj!h>R8aRv%`oP&n$~k|rbc(xerF5Ms=-;{P z&2a;s2@}9?1kLDPU}b9#bYpXmwm$(q!fD-7f7E2x>dE9)XVq?V< zF2{f8dXd0di*N*xQK_VkS4oF~Y2lLtGdvv{W9r2WoR6SpuuLI;emq6i<*RCsI%H5W zPtaSom+1~$30M=bP#z`uqd=Cu?B9)P0BQ9<7}I*%DPYC)XAl&80NEsC=p*tUUoYQx zo@wcUo<_{ZWdhSv@ZMyBZC!U(0nmhkWN!&g*vyccD*aFhW&F0n&{Z`6<;6H8h z$#n2Qth}UpIYH^e?GF{)Lx#r#KYn#Zv@rOdwC5Ycym3R>=q` z_+U`|c}n#Qm{jL|Atn=uBE282*6+q}dLiqLukaMN;@5)w`FsM{ouB9**hc-A#WsOIqc&d4` z)u&q|yM5!X8T#c9tR_W2?bx+YzpT!TjjeWSZ#b&m{j%ZY#8VyB+Vo|lLO5tH8X@z^ zw)+GBMeiwhA1@R_SlGau>j|)AR^vzLZ%E$mDuU=7z`15`xXaw$bV#wN`Somueis$kdN`MWukBNR)%Z{&1% zByN#&&x%tOY)LsvComCc2Xr8GM1Zdt&-JFfa_~^XB1=qZSnXZ(YHCE=txW`?L*jY! z_U$^3OQs(`Ls*2Om&6H0ICSVDVK3axne$d-gNvBC8@$dL+Hs=A`NMwSHaXnlM~t0L zjI(q{*0RmZYb!65BY*S&Cq|@?*;kx;jmD1%G+Xryuncj3m3FJ@eYlhzwO)GfMjiSw z3&GkG8t{@IsCunO1QdUyb=0UDy}LXn;hhAC{6x@-FA{I^<|TGN>2X6nMHP2cv_7_( zSBp(r(5J{c>CWp=lalC%%DLouJm%;23kj2=2ni26gY3-Y6U;^+u7;c})<--;{tc7c z*NjxbJ4Qs%&SiOmx-W>O1~NkscCX{76MW*K%*qwTp;`6an88mzE*S`o#}y@!xEMC0 zm%{g<$vl|xY+4~bQ%M#y48Zuf9eERA`c8b{z_*2;t|;N|0?JS);W+=*=IrI6O_;ngeU)}E5>d2ER8ltH`K$>4Ngkj4k}c+OWUksLMGL3LfuNhc@@s#sl^;w zbKP2?@xb~6ey&TOLJUF%xJ0wWWrzwtl1?eH62SW#Dx>5^?snOWZEc*G6*{T(oqxo& zc6Vu)ggDk`-X1+xr7mH8r7$Wt?d92U6|T)mPWm_0Mb8JC%$-{f##yiCr=*B2bT@-S z&W)c_+LTb~6M5N~1f}a_waHKLdqBZQNs@{n$><+u3oc&$B4Ah-1>gCoS#j}{%FZEF z$&T~8YTs!r&sp@bRr$)->`7ABSQ1#M`k${e&ev4(j2giXKiIxFldXo+v;a><|4)$r z|7p$nfA)EXae49|+|rr@hO$WVvz@)6EV;)ur^i7Tp`ln4$LUXDadD?w(gM#A3_*CB z)6m|_W#pL{18zUf$e~=2A15S_?Zl^|P zOvQvohgXTPTXic!%@{9NsgFrP(_v7Y$txul0L39JM$v)KoFr^#g^gc|)u2ZWJjB*+ z$k85wiY4zBKbl3Jr${5a*Y(8LUM^OyU*42%@r<2bxl%1Kn=p7Ro^Rmk6Ys(FnHp!H zn-!vH{*>3E(ZskHGYMLyC^;z)_?<6Y$`NAp!Pt-GR+Iy|eYV#6i z*t;%A<|Z(-Xa5o)|3Nz#C=io^YBK@!%vnNy5S~S?95@y+7Mey!JRPWS=#7q>$BvmZ z#=L{+t)x@}i#kakzY2=LlvIbm@;XJFB z*9jm@e=^{oW~2l7G+8VeNFI_o2KkhfI@9%*afZp# zVb6YF^qDs&A`rP){`tyu#b{ru^t+Gc`L$sLRQVCr>F4 zgB)d)mQn)EAEa6|Y5R1g8xn@r*)U`1bO+|08oqLBX~8AxJx~W|A*|1%tQk9-YO*eA zBX0&=*?WH>aEPW9>;l<|C4rN@pgs@ht37tBcP$RNXX2W6;$~`9>zpXm@Wz>wCi95q zDbmHgI=%h$R4tOemCNje(UlaJPs(yUE=lt0SK40FTZ!M7=&M^*xH8&FA7Tb6dC^P- zr;dO3NlJ61ShO?Ms*Iv#SMlpQ?k-!p!E3Ir6%hnM;T=uvu^as4)4>Gggx!QK&vhKn zuGeYJRZHe<)JRfcez>U#NszJ?vMOm&jzo zid~{Hr7B8q(Nxaz=Yy8rbhdRe720+A6-;>eo#mBo3y?>i(mjM2tl?}xl$flI;Fg(q zSsJ_8x@7ny^fJ|$@)#@Htu1&i+RGljNS4d=rjf~!75SdY+hx6=XPV^@2zdPRG!A|`-U3sFoo8qFhvb6cBTc6A=QR(*Kx`+czDM5yF{^_{A_kK>9+oZ#SkQ7CSo7So!V4; zWk7Pr5vm*7=HljxyPB%M0tCm79Ms`}FufPYOA|A|4WyI0noSgHkI@rvC-Q>brjye_ zs>)3~xw{y|_br72Ee9Vw+z6dl)t-Gue@6d`eb)5~UgR$q7q%PigZVI?0Q-CjNAkaN zETqDKOF1;R0_x$0QbdB_#J>#roN%*f1S!!gp~Ur2i@obSs|F~y46^t0v&&bCZP|-J zw}*_4b!zR6=-l(M($X9^p=apPaIMp42nVaB zg58fMeM1~WPiIdRm%Y9a4&^OPn*FrTufxl!H13?Hbz=Yd#fjM$h#cS+qNmfKMSIO{ zjxQx2uUE6;%%d_(%~CNV^O5C8nWq~aqi*b1Ia4q#hMBJnz9TAycKu+oZAAsi_qv-( zMtBFq0+JUYoH1*GKUVcH7C`<5to4c(`4gR@lCp7g^JBcBwC~%`$pgjDU1pJ1T-EpnpnF` z^hyc+2v8a8TAB3(#^udTjGWGDn4W%ozSPE?*SDBorgQIOZLP+mK+EyKPq30QdM(<4 z>S$4tMv6lXu?`4`+m+EIM>!ACII!06M^hwGUFr99K3%7Ki@|5RrwIfy)fqjNVi_~Y98ch0m*=OulQooEb6x_hgR zzggWpNGxg0jVoei-PDbK;P#LKT-Z(giIXblmlSKZ(7k3CbFQFAe?xh<=7CMU@lP~M zlO;Hf>i|QKrEzL^J{_d7gxtsk^Sr=Eh;lSuLz+Y(RWTOc(l&GOTiEBoB84Idy95t< z=TY*MLE?{y30|;6OMq0@zf-lv2~^@Nn9SiT@WV(gU%b!s<1sy)_z6yfLE)oH?lOtr zRXF!YTsh`!Hy(&mefw&R@`zB@vV{~Usmw|TcB>JCXo~3;lMPV21ktuoC{bq?HiDTI 
zU{JR2)t2|~o=(7RY-lFfKbTL{I&5pJ+$I!laww>1)f_#xNW_*7jG5^Ld8|L5(8rbf zpz9}+FGbXe6V-_P;EvvmF}ls7XwsxCXNOeJVa>{R&%-*7cQp=28q14xe7WZ&$`f!V z`0ZFJHi5ES&4rt=`vH^gK|0TjAiwCTRQpp#sD~ZLZ&mCvg|d%Z8`dEr1ZnXd^WQ+-A(o6C)U7E#<=B^-Z5BZbPLra_7p(tyT;c!O5v)R0%=2Y z87*BTaZNv@po_=M<}2^Z1Yep^2Jy>`kFVSVkMPSu84yt1FY#(6w;`G=Ow~mVC2?Wx zfgL@A;hNxAXej+<)}mhGP8B*(1Xq(CUKNb6NItoaiCHx|K34&~kJP4@qo2ZqUGF4- z?2dD;#5WN-oA#rO8z%to=@gZJh*2pMJ)S^*ys@>#u&t;~)Cn@rX>A&&YY7cJy||_& zA=Bx2HepgDSbeEoJ96iI&FT+$iQf8Yd}DLs5mLba#ol`dHTCHGq9`CrF)AQP3j&G? zh=7RFq9Q~KQWzr>gpNZ@h3vq=QJi7kV_aLuk01JP^=U{5@`ME(LTUq zan8N3u;U+v%gElJN?L8lMt%Q1;N}N^FjRQ(^(@g(;aRD(2-n4d+cJ+9GKAAuGra@Y zR0704)8?;Xnxn_TH0F3W$15Q%vlC_%l3~FennQwo-x5xk^H_Mz;#+%z$GWowYfLHSQUBZLBR%d z69U)+{?0!!Z4753_Y*O{`oh8GTw5myFprT&ec|^DLwU*v zuUX-c?~Z)PL2&6#iUC`h#%qZ_>ief&E%c4%)T1QI^j%m!)4l+{qI2J|CDuJ1akAURuh4q~X7|F8+O?H5 z1@oi1=t)bWKFR;hC=a-`P-y5|7c{SjnkaZIvHHG}!X$)^VZOHbpd01L!xUC0hpUy` zg0QTQB)6qHuNd|5w|Z?v5)#0L4se1s9*bLPMFj$;LT$5bvKwDP+F_rHgZkp-^ZF7M zp^FJJ)28Q1qX)WVA>=>}bn5{=cC`IRYl8gN2WqroiAoIP#5IMF7hv%>KCXRPh!=^! zny&q0P&qR_{DvAZK#Kz7bsQ&MThXlLC-A)Ur_v{0){CD}F3oG4v?+_Ss1bX`&DSB8 zc6R%op4Eg)Yre)D65|<-lWh5mj!8q1;1RZ8f8n7VV?%wwctt8Heh3F}WM6^bVfS@> zfhAes2MA38(H^<9G?V5gOH2dU{c^14>eer|w>z#RrTp>V(*I6PQtTiC@OL(PJQpn- z7L3I^1B=iOJJK#;3O_}fZI!z-AYQDQJ6$7x?$@tDDTeb+3@)dO48n8UoBv?k(!c<9 zhCO3}sTsjc>rCbNT?a5{K+vJWB3a0tkv*M?%2cJpQu*yDbz3wQe*H63+!4es%yV|+PW+L8%~HGLI!qz-ID z-LPbltdiQxs@mF^`0b?UK}!I+J?m1FZBT!**BndBaa#ZpWp$ zZ()`jY8)9oD~<4=5BmsSk~>0F_qU^9Lj|(g#N)zEzX!q$9+x;BM6yx#Lk5tLa_iB3 zHW{MG z25+Ax)Vim6vE_CYSzF_D-yf$ni2N#w+M#?3@pC&?9YcBYGpd*sij2D44qvgDX z;6SW%9TzT6MCG|kEKRpujJ5rm88deH@`bwc;&3i|H_FsFajTkqb?r%~8$m_Q<$d7d z69*=NU`{6#bKGNUIiO+1=3o`jg^LQ~Rd1A&16w27a!xbh?2g#d(L!e}$?o{Wa9=6h z#)ZyS+NtVcdY%oXt4T=pgPOQu*om z%8r`e!ekb*7}*QP>jUIZ&lf%@Hrj7uNb9lsjX-!=q|lHczl^wM$@L)*Vao%NYFt*M zZ^u3_pxSyC(BJ@cc7wzzj`$tJK+}H5knNRssoZ{xZi2TRu&q(_l1^fkx!nIT!qB-u z*Ez>*O!)&X%tH>>5M%P7WLeS$!sdBEzG!s&`Nzx>@wqYsBOT_nC+d=iEke}kreyf3 zzJ4$l5*THbIQj#deZ>a^ftIb9#}+XX!c|y3x4V~92d-;7dEQg>z}S)O38?vycD9>l zHuH&L1-?)_dd5!^8vHoRpBLJl2(j{~v{Q#EM(}jX4O0FBDp26=Ao8*=WC#o!F6*Q} zd1iMx8los8D5&`KuC)^M$+^>mA5$&Ku=G0AoJ~6$`+`l=EE~mSu-Vwe&ftYzXjeeR zV5^2{tLZ{$7Sybj)&C3Y=t|!fFqmO{g6C@S+NOZuHI$?hmZmnEcw0-<3KsLCRNwRzMSC z4GUq{w6w**7FbqYSdIC>ucDQTorP8DL2qO7kGe>;-yVYukqUY9#a^f2q88%a%4IK$rSOg0|cGfVr zQfMsa33*wkKn$LKLfZunF5FM<=pQ}539*$^H`F2PS}etk`c<6Cd?jsmj|a^ zsg-OPfALvv(CK(|0?P~ml4dEWHOzj0+C#erubg#{rPMG~qS$rK?R@MN1Tjl0oQAlT z)&#Wbt+vNe==`y!yfwJC`S{eg*z4ciVy~aCk(Q6;7HH+EM*IYC639jrD{69iP&NMQ zcbon!r%_|2dD}NPnOsW+JAke?H74F6=@%ZAJv*L3OeMLIVV)wM8BKA1DLSR6lT+hm z=v^&k=--XClO~zEKC<3o+Wui}I`u}!rhi{s_3^&}68=HZrDpaFG$mMrE?Kzr)R29RvV-lhM~?AbqzTKd06|1Pf4V_!3S)V?=-nGK0X{#UyD;SLKxg)x&PV1J2rtsvUW(aXT(7#ZRDzq+NE zrocZODe)4nw8jC{+9S>b^i~o8;}g{Zuy?f(Dvua$XAILOz=yAFoz_8JSa+Y$lQY)) z*mh+opVeLz&GLqI@#bTl`+avYxq!~sESSWrPrB!oVXG|oqXAo8Q!dry=*lth1IpU* zS_;7Bz&I&;ZQSw$xMTJRUU*L7*Rm<{6Gp>=3fuZOeVyN_fpVAw+|;S5E9?lsb-K&V0c=&P2SYxl&Sg+ zzti6#EEHV4Z7_G&5MT8ZgFe}?#zwqM_5kp4IBREyxHk2i$=%!g|4gAr$;+D- zH|0&h0E?j>Ua9>P{#UU+pf*fUz|ah^ZMVqux-7R%e#<=x=s(cD+RafMn)UM)B})TQ z0-e|;B_wAa_xCTolgOO8sI&F-&4=x80>^^8rybyb#gQV&&^I zMOCHx11DLF0HvbZfP>2a1bk`!s3N>1xh8S10`Sb(;qg>yq45%~CSf)Vysd{uV>J0Z zbUWzfy?iup?#exnavsELn|5sV;dWR&lMkDSGkn8avg+colwOIK>UG%LO{LBc?@JW_ z04c)vXzXY7olEn69yuJ8R^n%MH(%>T6X6mEK&P#te3#_qXPfN_2sp^cAFeNdwbI@x_iAiJ-f^n_u9C3m{gUvs2cG7S*oUwXFQo(t0d}O`BF~IC}YsE!R5fL)C-PSEBTasX3P#DU-lFWw8|KBA@3w zQZGsPCE+cJDfpe%Fhvy~oaP4jtcsm0kKx3$DU>673L`CZ?^(CWy~$?0QeO+ALN2rn 
ze1T$aGJAIY&hghedAB+ViL!k07pk`0MT;UA=vn6o=FpIFquB>$THPICBH0neQ6~B_c-lu^@^Yz9KFE=a0#<();{TCp(=PK6b=yZ!=Zc9rWQ2 zTkK#Tl_=ui|L-HzZxLUAlWu4~ZP4a)tu&axH;n zL8R`=(+FtI=f~i~{sI>CSk8hk^OI+`hYapfnyN1l zub&XH>icEHSj#Blb|Oj%X53#)ZJ@}BynAhazBWL4nfLP&Bu)g?TmNdg&W_Q`0tZO> zfd`;6ElXN!awH(85I_+P^b}J(TgtlmHT-Z}{TCF{Gu-r1;PA{vr;O=hb0`7}W}{v6 zl*9D~OzgbB8G+C6a&5V>nam}uVD;2F=68S|QS=n{9_kIi+2oN7klm7NpW&|1RFLq? zj9Nb3XgIKTYqB3o_c$|_uJH!U`@j>@nmktLNOZ7-jt>L&@^ItvobQA7X1%^$!jbak zGw(6%-Pr$)zbxRz^jWqC+=Aw-lg-~4{jo-r;w|?A$ls5x%9FTUE9kmzT~g{nMk4!X z&-Rc%{&7R+b8(D z_j~Vm>c2SU)rQRJC|d$-Yq`Z5bXRu6u7qXJx^5!=$!S)3biHybW&z8v0X}moI%9*# zbvQ6PSp%dSz5$N=ODV6VE<^JcvXFYCJ)V$&mkb#Ao(GyX$o&5Lq$oKD zK!15wz{!A~fLB&%{48u3e1SHO+BGYO9Uqhe&^@90I3mNyeyyKL8x5SSs!3igSP4>J ze!SJ4e-$zshwim*+@l*;={I`G+i2RIdg3#JIR{FdfP5{r_vnZPQ zgrrmeMB4TbxOpGt_)L<^BahVaZdc8Fm(FEG-4{dO0;IRLXyTE3d=jF~+F9X=LPV#L zY5in_JzJqyda>PK^76^uCKkbNxg1dX2`_{j`9IS;W!1ftQfoukdr<-uDlW3i(>BfOnYw0SCbqsG67#YaIe+M zKPD>G4FuCqRGJ&aZJ|h_0kgDr1pk`_X_K{)#8womT+B&is_ld0(O}QM`N}Aofaw)% z#J=X&Yh8x|s@+fhE~;MPj+&Da!m7OmPj)pXR%$olK%iM$=X*^Fa+>ilcd)SKC>`1o z@gfbM(~4lud}p%ZeX*3zqUwxLf~Y`98rWZVRn?CTb-r|}BokGR&yP7U_Uqk^q(oh_ z`mC`=HO|J4YF(lnXq&eX7Uo+a%rw#0GwdZV{ggCJjV}UH#@%u^;-=DmSshHK9xM!# zqv&~I><_uapAurVyk!fCheyrerLGUH(JXD809G=TB69l&`T$ zU4LpwW5P<_);OZT`!*x_sarQO+9?Dzr>)gm@N zsy&c3^!8`34z2f2!gO4*?@3GLw8oV*VFuE>sE4#?I*7ZIQLg)#S=d>MDaiu&ld;_& z7pc4oo+CemUh6x|a4CYE6DyTZ_;}8YH}{Ua-$rzw?8-dFs>5Pov$#sL$E%8VR(!%& zM#C{VUQ78ex*>`-$$~6Lc}hz0VHz*%eEaq46Rsgi;a9jbQ|*PppN<(9lpga%kD)>! zpgIV?rP%;^4nO5%Ul_b5;&>3(J&QB~xxY|;_s5*A&$Du&vKhjtB9golF8;oF3-(&ij(gt=4lH^BtKW*aq8mDPT)@du)ZthTc1iFx(f6H14udaLe?q}am zLdZFrd-WhkgTSEE#HXX%le4=BnZYh6&o{S^gl{kR^#O8fq6?grQHn+&|J#=odn8+Q z9$)u%_9ym9#2feETnDUK*X;N#(HuX^LaC$!-%cTqDnrG0$J z4;Jsg?L32hij<~65Mi@y_llJ_}trk9~k@WgZZ6v>ECxkPn*fwj42qE566 zcB+z9`OUtx-|!(s0$FEn>ZwK(8FE}Ku8KOVV{0s`cPwKC>TU2LVaSFQIg*dHEIOnp zop7^4I?wUd{I|#uua~B|^yKfFHW6ntBuQaMMb=st(wk!X=27q(^Ve%v)@M?!QO_+_ zH75)J<1zH;s^9hW)f#xn+Y~L9M?k)2;U2kW5n-Xt@%#i;eK8d#OPVJ?M|C0zOjMT= zK%>Age@Ak>Zy`7yhoMvd4%7~9cckoMXHCnBO-P3#7=Qii!b6fl}y3(-) zD_WfqBF$sglNtBrBjXCw#fjS`tv&p8uba$V&T^k~@(y3bW3RQ&%dGDc&Qx-D$`z(8 zN5cX1IQtXe!TkBL2EXy(V&6CHbFq!b2RqzrMbHVGin4en)J@SCgur-=D+&G^T>%d~ z_>1l$MF{lm(V=7pLe)=ED!}NQ;bLT{cIP$IKX`KX39JO|A_Y3qz(E@5Xg3zTP+y_* ziu+?+BSrBI^vMAe6YGhlHe|c665u7o>8n-}Y%ggN(0*if(C>_M}!mYkTs1l6x7?n$sn;WZEzC2u}#qdw=1YkE9tKVXN9N@$)IxXA5P2$Z_4^x1-%ZoIPkpQ0YeZ@)0#%gpm2R760}e z58Sj^4;r>70=9d9hxlQcSI&X0R@gx%RVtZsJs+>jXb|4kUSK?;kX^CN#W5UFpYoyM zFS@k-=>M(m1tI}eda62u1g3x}~LlnQOTH(q*44|vV>V=X>_Ay5=lUY)bCcOF8uCe9T^Yx|oo)BJ{1o`7G;b5+l|#w8L=J5DByJ|h zM(p|Y8vA#j8Z8z}+7T;?nT!GeMCSPm(SgPl=(NUB_=vQM7ss`%RM?dUr*AGMmQr*s zj~h1-A7|NoJ@R~S6WZez%p@Mgu9a+ib~Yx6GKxf`PJa9dx_ehCpNMWm@FBS=j|MI3 zf0s|bT8Vk!qnUJ(H#dbZU#*oruzl%QYU^kl>1(3pt)zOs1rXD=wbpq@lf5;k|B{xx?(8&_RY;1e>L^ni52i_BACy zLif)g`*WtJk;~6+i5T`9sr7_Ou$gQ<0O9(^G+ z-U;ELrY?a_Rdt&U+cf`k{V`Ju>rv8 zudB?R{BL>ye;;^*MgeV6@UB+f^*=4RB=ubU)F>Oig{EE^1li=Y2IealYLjV?yRkDj zP+>Hi8Ew)zpq^`e1&r1HMko2FXM>ydI3r0+{A{O@SJ`o|P9!BpY2bG9+RIRRy*|Fb_3iH_aw=_A+jT>Tf zYmhe*Z;GWn9j4*GdlzXIsxmZQCIk>k+(&6Z&qaba)G79$Rwe+YtqhutB(WSYZc94M z0z9lKcg_Ojp@&--Dh-9t_~*aU1yoeM00c)LkbeCo_dr{(0ThhPj)vGAmh^@nMGKll zkSTz~6ZF5l$>n8WO{_dhW59Mp%5yy^ef8Hc0DyD;ha2D@#$}J%*#qo8!1yuwzv!qe z71+ZgVEz?N3Q59H0Lb};zvyn!M$%^Yl=sscfH`Gw6Y$-yngJgH+ehO-wLISd#$Gpp zBbi+RL=ho4ftT35E!3PQrG;!9g6bv@gp(xy`7pGBWqBY5pgYw99}9)1T>IBy{o^VB z*V+2lMfk6K>t7+`UjhDKk?{XSBs3`cO19`+W*n}&rC2)rt$GpFXat}^8iF2ClzVgl zc`j-_c#kppH_fq|j0`7W7$}}3w}mjvlF|lwKgwqodryX^twe)PHPIhz=+{K!qc;vG 
zD+G|eb$p0qg}mX2Vca%QZn<3ir<&`3s>A-Dz4q$e^(nICa$-=c|Xr&hPp#5HM^^3=73Mo`WOJht4$sA0m}e-qtIy#)@Kzx;jztfUw3 z{3la~R7*S>MgfCh8~cZYcWHyz;>zQ#xV{eBLrj_79-79i{@n}0LXpg%k1pR3Kq@`D zxGPB;F8U%=ZHLcsZa+{z1}?%VZ^P%s)+8tLAJTr3ik*G)-w$s+ujXE>eh`G^>xetR zq}c66SSxFkL?;mqz4mk?>H-=*Q{&3@2WjG0QYs*2n?(|h2iB8vyAh+^2jjkqgmZXw z)LY5(3ijdc+oGSMkCtjLfZEP*JYV0{k09S_RY|@OMpji7Dya#1c*)S}{7=Dw(bbjq z|A5?ql}IZ@Tax2~LZ0k3i#XMWPupJNzX}{W@?L%B?A)oe6^*s~2_{NZia$Z#bOSa= zp&}qmDkR>oL)A(D7k)#~Qgg!O}4{cE)iU(ppi~-ZgcYq52fi$gc1k#4<1Pye9sD%i} zekkz{?-7KHWC#tf>*VV-D-I?;RThtNV%II-3QLMn8r5J0UsiBi>~-f#y@( z%$*n7IT$29s(>}95KI&xErmW}e+xba{3xg#!yTWkKV4SSiEuN|Q3LTB=<8WafO?Ow zr8nF?7Q~GF+2k|e(ZPzKwDsegU-`I>v(a`+Jx#pXG0S0Q!Ng9MDtre01Xe_h=rNC` z#G#l`V^=!>jhQQcTxXl)W9{EW3(LhlQ|JGqnPBRRhqDkA;C@W=vthEYQmiV7T&M2r z$9((_Ca;EX=QjcTeV@uQl1x!(FYEGzI9ayUi_$1mP0(G+y6*mD6vX?-sV@US7wx$M zPJMIKNXoU6>5Wv`s%Rlb6JB>=hzw5MdG-Ej<@cj42ghk-$u;$CiooDax|Ver)2YhQ zZYdDOUCO*23QakevX-~99DmOiBD1O?7DkY41qeclw+L6O?W~2|LVxl&Hj7NH;Ej(q zTCo+NfPE^#)8SY>3w#Y`GE+H8nR&S2!3((qy@ zgH%N6&nA6no+urB85)LqjSZKdA(d52>_3LlnLHAB;!{_9boi`%Zek@|Q5|R7(2wKS zS<(5(7w9b38U5|WeTvx6!q(fPO~7V~z9Cgxj-UX}H5a3103$p-Mbrd#VpI&+vllR- z(PY~gE*b>=`Y5ptUBMMDc{oXeXas6RES@O>OoPMG#^z~isqQ>gIBj4c7;V5S8^ z57`Z7&L$wj2HR^23GD-G<1D*^n$hW@FAV3Gn})0~u(TiA7~__zHO%ZCzW zqI}0$;h#s>H4LGWZS{aTu63^s|pHw;|(^WAqSKaldjy$o`Xq zqkP19G^)-wX8#C|e+WcN?tR!u^!OxH3p_`A=kIiAdu0&4b1Czs4r)#TI&VI1Zjbk% zsUv^^0k1vaTEV=ZOciF~)d@F?AykEU$098NX7Pj2e+Pb^ASL1k(VTXoBY(IrYO<~e zwYWeu`|FjF-M{^A0bkZvzUo)z{s5sx>vY~BVQOh!W2SQJfyblW@dwb&fND~PC#p89 z7qu%cRI@uw>j&wgdH?9FWqks$J@0-0{HGH~DrEK7JNXogktX~8v7W2jFV2bAQ8e9-%tWM3n)A4^X z^PJK1Iyj%bY;j;Mi)Z9qg!koZK7QqWyX3nwsa(mY9mS8l>N#&dt2dlH(zBOh_h9kS zgUv~#jX-aYxr!?1!f^ie+Memu_t)ER^cw*8LaDe|IvyQ#fVxBq?rJgSGq<1K_va#i zmoF{e{4TA&&xvT6;UB4k=TEMhgUbNXrnvW!Ov~!4-{oUH=?jq=6n2tiHs!_uk577{ zH+4O_Z0f{r`T-!Fa{0u%r?{6Q9aC1K^)5zo`@lv(hC2`1y^m%f!JY!_+p-1bLmUAa0v228ac{;6A(Us3_R=*2iR5alarFvREh zV-K1W5G=%o|D*sW$}v>))NsIkZiqqkUBY3BbgF1Iz_h$HA>D3Aem2oy6Tj z7TnMK@<}&t^=@0G71DzC+g!qti=sOs8l^?7v7-i`os($LCu|NlI=0st02bEbk>Ukr zQ?aeP>PqU_@a5s})jd52$^&0GUZmGB-qZgYASmALM7J1gTMm5K(&#`-LUbDosQfBf zNs#yG0U!IBld?+HAM=E(PK9%PRnNVdW{@6XidNZ|oLhPKoa8pA1X0;-p4CXl9n_Y$ zzPre9EP7fg$UB|`zQ&Q4tE#3TykH}DbrC3;^8e-)hj*N?T9_+pP%<6_U{mRsUm8?% zYSySn6YLy%I|az}aoX@DHeW}8MaA!pEI3xoAi+le=DE&r02@S%3$D!TrA|b4&;t4W zAP5QC&sZjsky8dq_8yS=u+HEOgbsYw$dO#91d3?~mOAEXz*%wr^&f?@Z#3BwM5?TL z@E2WfI%+j6e=iCP1+qCXKfAc04mseA2J)^J;tMApkYsJy|1Uaj>}Wv8O}XaZb=;bL zZ{8!>%!A1chNv}pxCbKZH%WNa*9;K`6hg*Hf2tuq$<{5ieQ*d;WH{YEc5=*=dvZYS zVJ>PXU5t=`g+u6z5%c*4n6&w#UZE9xL6K<=-_n}A zRQM@uL|D)kdpgb)-DI~@GS;3TAm_euh`t`_kY#JQ^+yy#ll+_{37h2#e%n=O{-A$R zXo7wF3@}VhW<7N{2a5Ky2~jbDGl@Xv4ITYQ&NCb46gfI{73NOQ4Y9 zOWW1B=nJ07j%A)9deuXjOoCGyyOyimA0f=mn5)?H>T6H$94IjxDZ<jB?;N1dO3Fz=TYY*Zq$LqLt=>UzRB8{u4szBz9@%x4d z&xv$1EY5u8aw^&h=aaR>;R9_!`(;#jtXW&Vl0O?aA1N^U;P+O-Kzq+gD9=89wP>Em z?%y_~7=%yaTs*UHy*K*$ht`_X&dRvO$2@-Nlut^{S7pQ4t6oh(a zW1EN4K+Ts$Z$Uojj-UFqW{^Of8Grr~ax7gJv5UFMUvw-G3}+dJ74KOCsh`#Ffyk{c zXK9nhpvCxr?@{W2D4d@b^0y~_a|K|!;1DzK<3_)EakYeVbr75k2ZrGHL^sKuzoj5P%D%m;v`+=sX_x-yKc=#~%pq7f#u|G6^KLESDRih^2zbW%hftfr?YyjlnJVMF<_jRMr zuyQzA^(Rw|;vyNSH4xB{p0}l}ic7oaVIv zYgIebsD76bUUBpFUm5*oa(*q#RycK zLgXEaF*hj<*GFbHN3D)vzTzPB9YaKER5&!XUzKxL^f&kktuqIFk!z-X;V29Fb#xiD znd(eMtj0!XvE3+CcBb(o)(UrFWT)2LDdk_AB+rbn}8?b~hDyr`TyP$zwyIpsz zA<)o)kAxaT0QVkXO9Z6M&R>9~0!@0KYP9bK#iKDtjDW1$nRWCJEF*UDlP8>3cbTTY zcCZZor(cf^cuCIt&~`X!>5s7K8!J!5mT-PU4Bp4|R>ioaVv6Qgt?yC@h?H`I5oO1GI%dE6iMC_viJ zR}LS%Kk7y{%zpZ#o%Xizm>W0MLO6+i*nF)&jZ@Sf6w?X_GG2}EQ%eWfYm!@$^Ic|m zlJ29bAwH)o)ozv1>zp3N0v)y(G?2MqzJ`QTt6Prg{ofqtNA 
z@HQJA#uX$*NgKzs=4n2_`J)#y_w*!)ygQY~{JuMI@ z>c4rn;I&vg=0iSwA-+!|LEs!oKLD_Kw|t%+erz{q1JspyzV~YaTT`z@7AnfIJJ2mz z)lDGRJvVgWQT%f81b9h@wj~8O9#!_P6q_ zsPqdSHlf+JGcOsGxbbGvp*+Fl^v`vM#v~AxY;bAwG3M86avZUDIIIfa{yws~u%`=C zgbNJ;Hpp)HX*Kjw zFbLg6DYyD8s2~t)%9j5=c7zf82G2o)%HzQbEcqVFEnRPE_6T=)JGjCEjE^Ka`=nGy zFL7gJ?!sGO$LsD(y~F89KOh+5p~F$oSJPH1!8*Vw1ipPU#j_^rt6i?>bamJjF!b5% z2==TPv9`&^Be3k^!4Kee?bk$TwKy7KU_%7g_)Mcg;7tyo0ZxG99C-CFx(js@&hz!A zZzT36yH6RP`NUtOYn*F(*ahH9HV&Ik8$Azh`X2ktQ2%l4tlU48hM#(4On^MBKU|T_@U2(Bm~_D!rcXj;(RRK+R^a{1}hx zG`PB=(x7sLFk|#2@!XTN*vDhi(y2j)R_FXwVTkD2v&y$T2j{;|@L$~Q$w`=&h-fW4 z3o-@GSyGpp4u0aWum&EACjPLMN@r)x%e@rwzPiFiaZE%hgx1&}5dk*@?eDr_l=QCoCEtO!K5qr}wr|d&$r=%!77E$ryI=7`W^M_Ac z95BgBxqC?DXUty9&L{{ikyN9LPRoH8H~|sA2xe4z|I~N^Q0bws2yJOp+R0g`P z9mEq9YM9OEB=_hA%KfIF-LV=b2sM{J%LJKspmm!ICbWYT4Dnt(OR1*}Kj=G_ed`c) zt9|@1s(|aazUul_)QvJAcRx7j1ME0MXdIX=$$821(|wD!@&&WwJTamyM3Y_NsTFdu z=F%_J6lTK#8(c$cfLLN8Aox0z4T6b;TSqDAn`m3)U2vIoD4}*2%mkaIKU3l}a8-SD z>=J&h;Mcze&s>L_5}R8~wKXvo0gN^=Q$c#Fm&DcQ=g< zTD7~{;4Me(9F{MCAT?kM(bdhq5M za0LqF%a-OVgTE7m2k2RVq#Yt;bH3yqxXyr+rM`H0YU%Ug3z+xjtMniLrqPpov#7AZ zKD)j?Nn&fOKQmUHTS!Q-OGc`Z?nEq%4!+k$c0x6&$?k+6)}WhF{vLhVp7PQohsgyZ zS-rVTXN%=diCO+!SpMjCS^eQaRf_U96<90GWdT4j8f!}5Y09nhfXLjr?Z|FNAT4y= zV`9*f=KHno-EsbabyZCaFSh{MVkdl9pcPXK2`a4~Jn3p}-%53tj7)5zrzW$7b))2- zSA)HxjS2E%*1Q*haTQ1FKej+fpzLo_{EN<^3mudk9|H7OALI(<0%gw_NJt+WbPnXi z{}6UTEe($W+cU+_+vK2PD(5JE6fp>~Q@|A(j51w3Yi8|A^ zmFEL2D-(2=)onzG`Ho~b57~UeRy$qA_M}98WpaLbj%P-5Szo7( zaKuM{_!1^H_m348#-4PYtl(=-{dO?B2?;u8s0nXva&QD68W?;#Om;uwSooN6vt>K) z8vu&OeuKORSWu5aAroe?W|Wr%v*p^{TMU!73U9WAm6~Ezy{rWy7!cc|*`7PJGqU8n zo`KNV*(SI*Cdlm;wx)^;Y#-vpZyoGlD3B>0>UVlgw>RV$>JH#Y0xpAXHW74*HUuaU z6unO+a*5AAAC_qH5gP(exttoM-%%TL+X%>%I-1FSfcPCvxgfLfG!KUknYlueEtkc~-2OSM{S!5g@1NXnCvK_T}6t#&VJV6a%eNopsF$*;8kLZa54_X(R6H3)gj-h~b=?P*9Pc7*8#uvC_IVFQ(u& zlFAVm%+#)gJRWTBbX`hih%M!O{46ynYI)^##aYNbSujbwXRXR2)Aw5BbA~OMjVoV6 zUOb$j169CAu)K;M3(*%S8iXj!LLc{xEUmQ2RH>)CXL<_%9TLQ0p$%pJcA}B9^KYVX zGgw>4s6=#`M_UgJS5Ax(2)UZURtB+(vXuW;5%i+(v%a*619G+9dB>bZmGcGjaV61$ zifIr+{*u5MDW1r(HI5^FG5Qd_b$8_>ln7C0)h zOi84ghs+zIrb_A{$DA7mrdYfIY9bj{Z9hMTo&EuI*)OYPk0MoX>M5YJ0K8z60qXUU z^5U@wVtWmk)Bnztjd2F#C7@Ky)xN|c+ux|_Y!ftyks?1#hpdx43Q9HJGnD#0)w#E8r>~{b}C6W!b^Qy*~ zvXz0kUvTLg5c}R;OaE3=Q&C)zOx)EEn3?8L7jb6}4VCWh_WTpCHRyIB(O^DBcCwPM z+)#sv=dxeLNMKh*#Y2X}#HXM{$J0tiKNt7)8KZw?j_%qLW)}c0>sOAUa@cdeexBB; zE;=K_>SU$(A0nCOuCL~&!VK=cyb>n6ji{8{f_ag(d<_%)1Z-f8o^LwsGVZ&itBPKj zu0bz+$rU`u{0%T73L~XfM7&V}T4nvy*D&Rfm$PFk3m#j79|4)ZbWhi29VsTmAr)~K zIwxe$Y3kyli?o!$1I8vC)>Ih~C~pF~ejD`Jv0TDTPQGUiWx?kty?HWvTlK6Jup=s$ z@p+Ned3lWHu!>oS>H&LZ3uH&~`;%fY7IWKzQ;J=FI)u+?-4*p4EBH2FAHoOPY4V$j zMERUXEcfHl(=5~)S$@n;xLsagLNzIUo_7YIZJD7+SG64ctjKp;lrK^;6F%t&&r zI@Cn1z24JXN_^8DYS#t2|WU+F>NfYCU=E@$9v=B!&? 
zD?v=d`UwX(2i3@S>h{%lFH9tfjL$ByB7JerTDP`N3H1Z;sJ$24cKVglc~98kucx=2v;WFL$*jS5Ch)!WzK!8~R1pwNmyH zUYPhw+ZrMCmH{yv|^C@&QU~!?!z8=yAxj8}G!E<)p4EKa0Bs{(d zjq<95Ud>w?i1@>#E8G$^+q}2XyqetL<75B8NpyCr)y3to+ZjV-Q zlP^_C7)8vxX1gBbDv#|b?TYxMj;oGu8EBoXp-td7RiMo%{+iRii^(dEM!u^j=$63q z^Ejl!jJW`ANzxI-HB|y}wpaX=iq;4u2hX%topaKsKuaeZa2Y@CFTPX_Gwha5?b`~- zmRy@uiUA7t**n79+j-=3nND{yj`hDMA17$-yohQlQo}<9zN+Qc@Uo;lkDRTPV1IRb zH7c%x^(5nV2=Yv>>_tyeQ$4kloEpdoWaq?jF=Epzl78dYuz_MygO&s&agM7_tiLy6 z{L_5FG~DRzT!G7w(L@?lVyct(UTo8Ygvx&JO)Gf9=ZQPOif@U{Kl^v)C-(gpy`L$E zbo4@n4G`f$-i3-nrY)=8>FiODLRe5ZO>dB|enF4xgJCqx4)0YNR%;)0tGODOcpkGm zYe43vNqj9}YT0<$HXyxzQ-20Z-ZDlm3vpSc>tDdY2uH1NeT${^a_xzObnZs@4)mZ_ zk-9OR8adHRhMheg(31NOJND5DdXQlQ)PNVR;4O_1*?ilWYG z$wyA*qec~8iya7SmtPDAB?unYt4L;S6s`RodM{^Si_n``W$+=VS*ip5`4}vS$_OyC z_+Jdej2eC0SdPU6GaG-)`jPKtnzE+vugrSOxiS?B|0ap@%fH2F9=U+R9%7N^cXm@nZKE$(!nqPD{CDb z0;KIp(3vGbRjBUW%K9-l;4V5U6It-^yRA@q+GQq>XYhD;Es?vI?q+LUH+l=ShYxTq z!%O8~NrzIER7m}gdRA?!na+4WV+F;J8qM_q>@~dVRX?ZWcRs2<9$z*$C-@6lDP7p< zxvSR_iu_T--jBF2sg03;XEq%8a2j{>17|yP);-D06zkZnGc^$)mZc8LBujHE-|GMH zv6Fl(PC;A>Y%N%_!(R^A|AW8u_xJtV{^b9(`Qb6!Xu}zl#JBtwX!>mVRSQ&F4Ts*& zaBb&aGQ6GX$@G1K6Km_lXC&)jwf~!dWozr@qN4*pP(#jEP^zyt1M?nB6NH5tIJqby zu=dLBn8Ef(&h~vmH4kl3`_*DFrQB{d!`>^-b@$ zEuZ7}m`2E%Ma}fmwBq9pDmUx7UL%-518ln(&x`zIj2+_@F$+9 zKR71-tB#+)f6sp|V1h{Deu ze|2phoJ(qzcLi05{ytZlD{XE$tT72vXL_j9R^Ecc>466QhIqTMLBC0mEx%a_H!|Gg zwh;6lKLlOeWm)D$o#olyOQbbMonWEf@b>~S&anVshyOrjGJT*vK(AmAQ(e41_$dm3 zsA0DwFBJPeksbpib9JMN5JyPCVz(9)!%-9MKJaCDM@E3(-CY|$Qb+eLj>?_Pab-F= zSDH%fl1kbvF9mNX^&;UwwRxm$jA-6C2;anM*nrby?E#c>0r>+a1pe@M3He=bE?`Ut z$1=kP<^>Z?u|H5)Y*W4#<~6+oD#>{)i-(~}@~$PX%aqAj?#m_buUvSJ+Uy*V*_*}Y zKkOg7t}p8e9Si57rWsI|;?CYkP64SL&VR(-{uzV&_t(GA$=}z&zqJNnhLe3`Q!E~| z+Ii4yD@-1|V2OUl?OohK`))%2gfep*tw0@B2?br+xEsH>>%a65Ulnk6O!hDqD-L~X zNSB}GS%h$k{$hv{*P>mrChwy;b}V-`LJ?%v{Ust-?fZf=F1Qzc0UvbjzpMX(=%yHt z2PWa&)jkAb?gT+wZ9tmD2Gfy4+1#nC8p20${IsKa&BRR8%eBgWG1y_AE>CT?!Z`TK{y{y#9^T3NR}a57#CmONdx>I@`UM}xz3?ib(>+U~ zUJg8a`o#HW2;n_Kw{I;aa3|4^!h`+%{Rpd8(bY#njj)A;rYOEBMiL`zuzSZMT~opZ zt_qJnygjTpy~2jnII1o3Ar^k6kv~FM+aH}yLM4phctmHgSI9>|zVyn?43%pk{TaQOy%pRXKNbB>YN<$8-Qer;UvRBgem2WFSNb1pX%X>zSL-H zeiRzwx)HsiSBJKSI}8rZ_#$Q1lh-LX9RGaeJaPcYL8fAz9XBMVcEXjR#4C-74+|RkVsu_D#L~B~ z?J!T3D}=^JqrYz>RGI?%A7jPH&<-KDKW-rl;!kfO&wosDq@VPj>zPtYXX^3#`E%!x z%--P7?Y(S@$#XWf2yWU}^NG)=t>&RQ0>vN8LWZdA$Oo?BX=!iN|O0J%wj6iRS zfOXn&5Crzw8-KV2CaR)DsR+Cv?|h@cc5P`Yl=~9U%0Jfy2hV z1vWD*;Y0Q>Vwu7;t#eZK`i~b_=6k-dNOF{io{xMs0rC63=I_Zbh=>W7z~}V8odGy= zhJ;Aw!%SNA7D3%bMY60)}ioG<(?|h-KZl|tQPrJ9}_&+NsyR6m=}?+Ri!nlK^5E=>^jh|Bn<(v-)y;uH9B@iX^x*)t#v7k8LS{6F&W*f+bm#dIe< zaFa_jF#5bIg1k<`;^9%sTy7#}Yq7HR&!4#+eed+s|K+j!c3qR?@XcXbRnA;z*N1nU zvGng4baq3b%h<<+6!IZRFhuHaZ3pR18!KNQl>|u7K3ARSo~p zFgldZ^C;wqqEWO+*8D|L|Kmu;<)aQeg_~40wB!NdE9{PyE-qcvkiY#}$it#v49Q%- z7$(kCyKH-a;4_TMJ&T$;BfYl?G=EXGP;LnRLXvZXLo}s{Z4Bm@tSlubuyT#lcP>Lap)v0-p21ry>y6xU46%q z)iZA_mwb}U*5>&~VX3EhmVM12CWV24}9U$9R>(F02J@!f-=EPKnRm^xaHs zt4NQ5CI5$eOKqEW=mguhu3u)-%f_~mz-9kk#ye)5XU1EuY=sc;YK*p48sL+N%HsbV z8^C=|v{E)Jv-M>3)kq`p#v&r)2soPMKsyaxwB9^H zy1i;xuX;9tE;bN7cTS&L-53l`ZOdbU>js!P&R3B=wYJ zKhRHY38#SSiZjV&I+x=`^3fvVg*S&cIHaaddCcm)8%OK8tK18QTgT3^rw4u*&n0S2 zk6Je@v&p|mt!{Oh0WHqpJJ{Din(+zgY3iSV8;sIs1D+Ec_As;YLd6l&p=O(S&4ptq zS$~GBV$V8|Fsdo3FduPeprfhIG4TA*J2SK3dg*OP^ZoWs?}%8%>F)+zAIhLos+O2HopM5c0pcorQ^@K%s$!PfLr3Yr)fp_v zeOg`z!#Pa(lnRpU5_bpI@jY9w?4_mFPCRkDk?}F=3nY6LZii5%#j0^Rz`0x$@ih$& z6X~(2i|AmOG2g+vMa5jRbAFwWBSH=vWeKni)iVsS&syaRl7$9m!d7J#6K?bEHcG)Z zn=ad1O6Nm@BwDLz3q@$b zc>?hQu>rnI6w26n^2FWV&yE4PFw-;9?8gP!)gbEdT;K*Eh0UqD1cX30iy)Qhd(}&S 
z$u?C2WXx@oLr+tW4XFdGq~olhv307dr6uLD&hPW%O1>SS=KOFwh9>YTh;UxOiOG#asR+ZbLJcMs~7R$PcPiQ!awk-TPK-J#= zr=2FVc#ucFx}<4z(aE0s_N6f97a8xPJ~75x4dJmvgHxCn-D3210Sk`q^$*uk{B(P! zskr>}+)2eeofgK55eNNy4Yut_y?p9EmlS^`EmcmA8m$Oq3!sJ19Xk zmJCaBocM}qtd_i#XlUlJpm^AJV}@#c=hdD}-hA9c;VP)1|4GBAS*rrs{$ohHj$=kN zece7!^S$C~vteD`O79VoyV)?qHIBvQdlJubALEd+w=IEBbJK1)st&s+n#>0TV)(If z{+523fBQEwB9wkIkc+zb9#wS@RNJWaAkK*dMXG<1r3h5G|G3VWK+FtzqU?kZ)T$>Z za85Mdxp-WBa!pHUsTNX?J%iB4SHtQRx7Wkuv#N3{&oN(IX~_ASywq?2EV`opP4wzM zf+s?)=TLQuCbVp;@ANUZop)Mih3Q>0Dcd4KS%FS4dD*6SJD;++9(U z>Mw>lq+Pdu%$1kJ=kBGYoc^7_zoWK(3-cOk%e*DMA7MaDXvoDVHI!vI4Hb#zl?V8# zW+!t{bkqX|le>WuN$Ote$`408-dKu9*pn=`^{Q^7MC_?o<;~v4*)55+*T;n(n~-x; zW0)wl)_vO;&u*ZwEq#>1Q}9s_6^#$I@qrjIxl%QdmwJNKzE6hlEX;57#%j}<8mASQ z*e;J9|3;Uff=S$23#SHjBSop#2s_&1*tfK=D{>32ONIR2SquzyeoM5qZ(&u_y zX3Xk)fia_=!vlTcDVfn~9gltmXGd5TUb^i)HivxEHhybp>&Pi%XsBaPT2z`LK1S7r zFWOq%(KuGOW_evJwOfN{!0+^5SL`h88tu5KD#ROMsf#|D8_$-DFv_G7=~zNTll1cE zQ4DSSIDFX@=KZ{yJ&)DJilqZ-HzNVhnQQ@B znuoVOjD;N-oHo{>K=XV*JBB#YkB=*7;@a$z?vctnSwx6I`l#F6>~rA?)I=g zEwOqW61ASRK{Fs!p_Cq+7TSQvs#t+f!az^|wq2^XX&5c)or$)6r|XzgskYwSGQeq*K9by zFwL6o&*^6TIA8DHY7X>rjX1r7EyRs+jiTH*MSu)Ygo8IstiD)rkVKS6)- z1c9T;A5`xj(?r0}L0aTws2$NhUOq6j<-}739D3POLHSB)R`&Ry^5}#N`LmEor1|i$ zQ>E^#v!iKtqj{wzvSn6OujD|s5;8OpH0z3*Xq!rAo zf5i8?ZJ=Ar`aHjF@m)m*ch<}dd_>>BkJJ{~&{nyv0Y4Ja+*wtgQFa5l9y3$?qYw^5!n;SiQm6+Oj$qsa#YmGB%xa!rY!bkTfpjkN2ow`8WK zP$_kXv6dZM1&RDRamd4?n}vI&Cs;97_DGcXd~4h#-d#TF*^wJP*wk~@d(1R7!oCUj|&gSP2l!DH1wUam;nRo(KKP? zt$mmgfzJL*5fbLbj%<%QPb}LlpCsnJKpi?@+=K6T5+j|wL{(3?g4NO&`yI{PORX6o z;VFvt6?$=A)6)Bd)p$~i%ZzWt8}QF2A9dFp&r6>clz9yacdToAAf{X;3a6i%_4pX6;3?K`!b6qF#6&=Cq`6g^WAjNm-ngGLy5Br#Lz=A& zI|uL4Oa_O%$m}d!tM;09n>#*F$pz;Nb<7SeT zVv(4WfiXPP*0Wz5p{`5q{Yz=#7&k&O()iiqnjCXF`5`Lb*DHV?4T;S#Z5n%U4Oyw_ z$a=p2V|mi<-QT$LHRU^hc0llGlK5?0G~wVG92McnYriLg0lI>N_zq+rs!po_feotQ z`Jy1_NPyHiIis?9NEo=YtXyCOAB3+KuhAQ%3#BR>_?g`GR2Em>hYjx`1cw956$qu z86f@taZVYF35s@iL3N>$3zEb^z}&$o5Y9YuY}vX8c@M=~MZtxWlm=14!*!UkPdNgg z#y(qO@?=p9XEFV{#3YvJE5S03jj6m<9LtNxT)Hop`u*Zf zrXa`7fob}Y6#Ncu2;t`ih$R5hugwV4J1pUE^0LlMvePr|LN^g`a*a^LS}4k-VADbF z#(69HK&h^4X#2|Ni&0e{$2}{Hdii;FK+4YRJWv@ikK-55Jp~Y*%)bK#<=~cxkM>OI z=b^+o`6d{A+Pkx&q3050pq#ncJUI~lGDgQk7~&rpe*RYd|55-RINAPJ0t^3sU;j;> zYbB^DEGAL1p_TlQB-D29Q;aSt0Kx}|!3HlviAz~)jzEh%9b`BP$N#m`y5E&3(XpHc z&V^`3QoMX4?Dq5#LnO*t+gd9(0S{U-*F2l}buG;zA4a@4flQ>I4gedG26#L&JczZiO+ z;5;Pn&q601&eDhcxA;@2`z=!}*g(U-5M?X|G|0H>{M}0zcY@2lbdi;UtL- z#U;=YxH)+}{QQZe7Z)zfR+hV+yqK$0xR8@3g1#><0P7-I?^F=O&{_7XOeKhnoTT=r z9Mob&`ZY?<8dQ^q7~)lEoi$%@5egOB!3JRV!>_}!3)1X$2k6l?RLW$|M= zTAJ@`?9GYpx4rKfz*Fmi0+T27W3Ex0c6qz^T0K?)>&SLu1;VwueZ%5xrtOK?WszIm z=+^6l3#$kZ%Ke3L?_iKy;XDtb{`?=MKLRtaO+$AV=w5U4m-fD_w;L|{%5kj_3Gfbp z^hi`a^2c#K(pgq(Hcu6{lJDA{8oArHaa;ttg>z+x@_fhqBvaKfKY^MPjw&K_s7pR( zBzdWg$>SHp?aK_HfXE2H-O!IzK;V8*)f-UEUpL`7G(B*XVm?FV{>aqX(J~PLQgRp@ z)DoI!Gm_;S?5V}{wiLddNk8}|d9o}6=`{Ei5kPdI0&otum-#z%U_tW4!{Bct`%b^u zv1j30aco}3%iYp1$(ztfh5FzTg%7^CMN}_3&`~o%;peVNUSky5!6=ZqwY0jo^*J59 z$*QlO>?V}m{;H66{1LPZc-&8~D#I4+>Z67#8|d6YYs3K2DqjvhbHCkuvo zJIi&k+N@Cblhm9*vyfVpMSlwJg}g%D+b?tDmuIfmgr#_l61$kmyftMKZPo{JjLhh8 zf_|$n`jwcm3rDDT&{zSM0KLet?^!LdaRKwSf6AiXJ$j7S`sr1XE~XYpDSIqyB460{ z=srn4*%7;wadB5Rw`7Rh@&wC3Q;IItq=#zj=Y;2Azgc!!z~ZxM0bg8vUT11YU3&PL&~4H^i7zR1b)3 ztD1ov;gDmuBM!-J(w>QsBQ&KquwSBh3wM2gM%+G*xsK`nvb*DkwLx8-<0UugSr+^$ zNoxK2w}q6LnHb)s->`(x>vciKlh)dXqswY?xN`x<4J$b0D>#%Ou-456wJP zN>DMqOzkvpLNXItB0Lv8aYs$Gd`<@M`@a8%%KMXuHmyQ35uOC+N3u8oAl7ZjagF1>}^CPS`?baNxIC~R+dh+dSzb{bn39A;` zCk@v)!e;R654IIhw-ht^Q#op5DtEw>`C1l<{ZeHKeJ%|o84C&?w&I;r4}jMFnSeK` 
zHL!d(vAgBnCAunBt)-(T9Vdxst%zXhB?*fQK02w13+|8J9iDRCOp(eZE|SnCZ;hQJ za?BkQHR4NbJX}x*JepBI;{Zp+KpZmksbaC1@==$D$lc8%yujNNLxSCt2|4KxBf6x{$iA}^}JO(8Mv^OBLz6R;bL7_y|C^4Fba zi>hAQ1$mtlab}drW#v?x3{nhF4nOp8c^hGmE~3toVP}dk?6l+BI!B>We5U*gydx`l2Gz zY=BCMihzKKNQu;-2uKr=-Vz0+Lj(i_gb0WsHS|E}J%JF?`5(?X@0po1 zXU>#wzJJY{^{vHTA!TRpJms#}ecb^P6es%N*iEPKMm4yJZk~AI=W87eyW>A zZhbz}7(jUb0Tep!%kz0Pmh(cT(gW74f;qk08^Uhx6JbYGRD7waNTIE0ruc_C9QV3p z2mU0#9-koeK}(yXY6u@3(t(a|w#GSqfl#K}&K_2qONMD-W2DE=SLv%-X?$Yi`@cxa z8a+<$!TR~JD^M6v5g0*1f8t}p#`G8FNl@|QX1z7|tQl;GrZkH|;3ntIyhDk)%^ZgN zvsBrWpDL{-oU0buYY_E^KUd(`ArPUK9ydRk?**jx|O3N6{w2dP{2odAKHtPEg1hilWl5V&cGCt1+`=j=q*YzfQu~Mpw|6= ztEha_(hn%4G}DlgO?vVXY?1Ko(+IvQg3y?^*1l@ zk~)4z3a%-3l{XfYe@$^=bi+|%xX_@=Rfn?0?GlxtJ0GVHepDBr-QBmC?o{{voUOrX zR|nUA`0={1K7Yay{c3)6{s!a%QlkT*>oZQXrVv9$%(470C_E7p=u^&L43_u7&Tc>+ z7pn!;cvOF^>3U>^HvstzlA6e++gaG2yIZ9v=g0C5y;6#qF}u!XgrgONb2wf@c>tTl z`|vMtG5qq+CNizf*s-Pw?)cMiN(v{e}K7>?|-|cv=Pp@e-^wg-A$0ZZ-1`*cs z==!4Id!MO?Dwjyl^^`5*C!+~ra^@ZGZI`&S$Hki^1*u|sC z5Q_kYOJcU-PlOuRHlqYppU0gTC_a^&BV|#bg9?vdny)}}&I)&*Pi(dIRYqITM*u6- z?ci!5jj$2#a8NQX9MvNoUjhW|J$4?SN$`?5t z6Oe{_X$|eHjzZJtJ21zZ1vt-`S#z5u#=qEiHy8NkAw5PMJcdpiJFy@{=i2eHB5DqE zit$Xu3*JmC)J3Q7KT*TkE9?#`8w92#L{Ko!35*~R9OUMuV~An|e<8AQv(($sxEkq3~*@8KgVu$Ogn)Q_YpC0Ito9U-^{@ z+RsfQ1{P3_m7Vza1&6p{DL7>(Dx6jUrMOVLW((AT9VChUnNP1;r7w3tOUN)*(Aqs< zyu<|hGMC9=)1V0iZb1yd_uleR%~nyt)39At@__%44x@@tBRHZ0?2AkZ1|BR!pFFN4 z1L!&PBjEhoTa3Y{#+#|lM!<@m95u~jzIKw7f zVQaw1FBhcj39M&{qKD}vtUlDDA9clb{WTD)Cj;<@!I>4<_6>-R)}RMaDy{UtuHYk| z8h_JQ`6&@|#xyx*Ywko`AP@2e+lokV$>#2&=fYrMSAG^3G zSHZ~qz+}DfC)O@MGb1$Qbs2Z-&T^sAAG^la;}b4Q?NEVUn|`)*RWO#UKjwiuP-RGS zs_vL!`s^RMQMG?3pqytub}m$5!lXl1OYo@f^B~tybR?QDA`e9GvQD1A;T4gQ-o$ zFJ$=i5)WjT|F&4!HGNGv04s{-Y{%W0Uq#XMixa0w57r&kboJEiY?hY7cnR?g_$*Ig zx(4drs{Oc%rz_T>p9fgNt(1&^4l4Pd@V|ca?zqzeq@zfWw(9KWsX3EaP<&+s`ms4U zN-0aHGa-Qg^-hO|zkM5zPfld^ssihchh8M(6t{HIbJVZ9#tZ8!dE54}+S5_EoYV|j zR|Z|vpOSAGByVyN8Xb5zOhQ9r=Is_b{ORmjUuO0@%M>Ht*)MiAPqUw!C#9&~4+6RP z&uyEo>E;1bLXEy#S4N7_b-G~lo~BUf(v|uRh#j`si~|AyL9Z3P_%Bzc#x5DRcQ@(PCV?`p>z_-e@}~EfI}r zjvH#y%z#eFJ5M(q>jt}dY!yta;bx_ZAxLYS{=4-rf5_DTy^mF>@uZk#H7I@Sv~lOp zcGlPZox?3HS})HZYR=Cr79PL&(+_U7jYmcA;q4rXcK<$;%;y#ubwWVPI>S6<<$Sry zIN>0ZZJuP(W7mDFSYZN>M0{7;u6b^Gz7pwePgG!<0*mizca{HW;^o2QQnAyB&)QP+ zrx$m}payDd*CbTu7wD-^=O+6+!;CNW;&sh7AWEL=;$8yhqfMWfHCJtFCP~l~5b*{f z#y#hQwxn2s#vvuT{mvHxp=d_e|MUk4tXiXNDGy` z)O-~*-P`@R+WF_ZtXmevF^i1FjOM7OrU<>cerARS(kwuhjwhkEI3*IDYf{!VVtXFB zymLRgFMXHRm$VPnKHo0Z(l|OMCb!$3>}ZkrlHD*QZ!VK}*eIYQ@rSSL>TYpGg?J~; z7r>^Sy7djHaj+}?{!w(O5qdy{ceMT0XNuhCJ!$n%uOlwRLHzc|hFA;aLEwfT5NYx^ zYP#!uLw@eG8|?B<%r2&raH0w>po-O^dkKhgyDybhvDpcI6Q9qvvQ>HSH3s5&$gkWl z-7wciy&Zs zLi=4CQW8@sf-11x&-)$8Z6~7QOH@uhfNx{fJ0LO?n5tHA!3WU(FXf?kDja7S0B`A5 zvrBL>Sgnt?7uOCe%u}?so3u-a#OU1iLpZ7RULu=5 zcTz00^8)rpfCIqL|Lv;!pTA;?{2S6`$i(L|Az38LYqDzFm5o}4imArJc52^5W}H@t zv>?jK3@rP0&)Z6OTA(bq-BtWaZtivxZ-L@nuH}nB{T<|KTH=Y$SA_?QShP{K&Jun9GEjueeYJ@@u6CTEwl>i{_7L-n}DXE4g+ZX}5HtuA63U?fk!9 zDu26l{)5lYX-kACrr7Ni+4>#CGIj0~%8wCG%7I%EGXh^B9HSNE(OJo0% z6xHr1=KW<6>WcHHyWfpfE?9m|2fdm`ZNfg})&8KWni0ogb%WKnACi6IdWv*igk>aT zySCo>&0h6m&ukAp36TT)>Y`+}$n7fm$(_6KN#M*WJi+88O z&dyz__qA{8)CM$KJ?nlm;R%XLPGUm(!mW{|i-%OP?T=Ci)`^p=Dxs;hMiXUe`0h3U z0990d{I~KG8?pg$#6ulp9Zk}=biMbHReABXvQLNcVdA-_M1HPt^R*lJtB5D?DZ--O z2IRyIko=3rtva&s{AWHEBo}!ijRw}b*!c5`snLgQ>VcoC=TzrcUGw$rP-)?C})O!LZ1km*g^mYUmu?yf^Q+?pvS}z3i zgws45Ua#KYQRwQoh-hoxuBGIwl`#wyFFk__ypMmU$CKbHVG&B+EF534NS>&#VX2jJxk2VaEf`- z-zj9`cbrCxib_FDos3~_ zR$u}aW$D4fxVJXib>yh&T5!@bze2b8yrwtTR%nGM|8@rf*haJ7z6>bhEgN-PwDd%x zte)rHGMcV%ff@P5#yMO7KmNgO0}Dih{NX_#-Z~L<#3RF>iebOGO^ZJ$x|TWpt9bg? 
z1Hx1z$al#{Br!93&|z}2M6bz+{rh>uo}!mt!Ule0#v?52--w2P&p7>RAj%{u5aGgq z#OfZOtYiVqWCnYXE=tivgx4HObkR0ea5>v6{ql`;h_LpR#me9I&E*`ihL;k4P95g^ zV%AyTYx2r6xko4ejZxNTAaB3?*!q7FUcs-`Xr)R<1=+;tADsKx=lk_kbYkq&xobyd zw7wiJuo9u5LIP~;2;9Tn_qva%hn~eG@0=07CG0+U6h0~6fY|RSE=?&b{xBzJI`Jje zK)WU8`2F1Aej}T;Ij20jAW2WR9(SPRs?D)|j)vK9C=)Myrj&ifXmN-ABREfRA#NdC zdX1qaI<(F~t6{D;L;Rq7$lS`H`kD!lTrK~E@%eia?7#L}6=pm%%oU`taOOQ1%FkLJ zj5+9e@xkiE-S%yLPn&srD?VP-h8Obf9nq7o?u&>wh>V}nEeLR?n`g`wW;t3NeIXZH zcJX?gv8I^1iet-;<@jblYO3_0s%ED&N6q7FS(OLSQy%M3V)Wn zZ4up(q<$hTHi87vRx^F|@v-Wb0e;md{)AZzgRwOkp`E0M-7Zap>e9-x#6gNmfRNMa zgiFWE)qVhbbLI2kKi0c{b~OJteMS}fEj-Ni_}T;LOW3&OqMH@3g+mwa!#ku7tf|tB zKGZr=g?8t|c?A`WxI0|jfZW3++|8B4jSnE!MH%oTN}9_y-wF!IuIvykj=SruMYg9E zKZp(jz{r;#8p!#K;(w7p0>&^^N3hUBR7d|-7yae=wnmi){S=;W+_|@4^It78tB@B` zT;4kZ^?=50u~@WTSDC{c0OKCJ<=<5V|DK}wfA9bA9Yethj4)Tqlr2RLsOUP4&mAg{ zx7{W#e*gT};-3vi%16)bJeYkvp${DWojwGTQm~r1sqAv3=f#ERPhXQJr{bYQ(NCI( z#14_lpile}#s}r0=xArPZ&ixQ%q&l(e z0(2v{&fo0J*@jCvr!;j^f?5dKG_08;_xn4P*4;BKy=A_4@T?e21 zD5l|eXhW;KGd3XVe7zjpV%}(3^8j_vHQ2g{B-gnVVH)}*@uV33b1pU>xa7|07jdJy zHu-g^fuQ{`N@*f8K+kM5j|D+0oy1aqQ9Q+0|pK z5+5_a3efMWf%J1m5}4a>Ehu`0Iz`~yHf(rF?FPj4BZ}hTesoIfo`#zXMPiJ^Sw85X zJ|D;n($%-i8wElf)~n3l$@l*0%Guv5Ba^+A^QG}n60xp@yBo zQ2vb-4onLPl!TUwf2fLodYXo>3`UGA(70X&e}Ec(LTnNFnM8}A%L))F*LgX8?_D|D z&+86oeW`xupH9&5If5%$uHC`(LHkn8b+bUPlw#9(ZVztz?;8+!o`oKF@&kDB+AiHe z^)J;W${4KZZC@Fx{s9QO21=T|+<=rG;e%1#^1ux0 z_Z}N!G9&2e<1_-fZhE0cq$-}n$kLyz_*evU>1Zh-qe$z-UspK}D#mIgpZq|4^+W8> zO3a{n!!o9r?)Xz)|D3~duDLX(g7b+hsT=G1e)hHi;-sBxuQM`1{?Jw{Io{c9L56`R z%QV2|3qHZ$$pyohb_D5G^te|4iJW zBJMdhyc&-o4d1lq`VMO!;yyL`>W@22ue^;PTx*|yv$p!k7RA{at=lXYmK0;Tj}`3Z z?eMU2Y3c^Nr|%TUDXEJ7VpL2S=1P%Ec0n^{efcmwFOU){aXO6?_ZR%)o;V~39AWLH z;EP!j{*>`iPDlU<9`-B3rsIWi)Vq!o%P2=wP}$dgPWy*J@Li=)lKTi4GL8|U(Mk8k z6)n+S{=`Aau@^Nao%Kc)j~dkX%TIL%GvOzESale~U{ zWIx2rbRFa+eb1SQ)QjW{dc&9p-{-;>ohae>c27Hut}*u=;d{3c%XeLJ5dW1LjO-x_ zqxCe00%h-md(fvl5`i?K?gwN?-)`XQY942zUDU=FbB36DC?{xCWG}O)rmT1aa?=>Z z=AAH+v+LQ#sKRpk8SLwaS%dg5{=VyKg^{#IdLfip+BpgK`3QoOc+c!HcL2ICiGxIE z!&t>XaX;s4H2gk zFVUoIvimswrM)FiLxC~ek#X4SPb1d}JqN;QnpIi%nzT${%+kZNUG1?+43X@Z6)Br4 z^zfWCW&2<=SLJhazlwI=RWxJT7eGc&7S6jy5rt&kCVk2col(dsZ5Z{OW8x(*DVro4 zT2KT#pq!JbcEQ}HopOB|N{kVPQ7K?U<;j8o^xfhZY`$ zPd*25jz)U%l%A~;i}npQBU4%^uozFd1zSrLxrtl8dIoGNdqIT7Gw{cB*FZWqvL4du zC~U`tWa@8W% z=4?Q=LpgXY@F9Uh!==D!jR1f_+fxnY>;#fv^Q{{wm&wJ^I@FTH24sjD=!kjD>`|K@ zc?JI#&3g15IUQ^i=(!a|dz!z@a+#A{^3gfAR%HX^;R~o)&X8KE{#Pb1U}SG)Is1TH z$|E;|bbYmQ`O4N@ZD3{eftP+=0O=;5@5I(d`NH}yHRb%%GaR5%l@?Pe3M6B|m^+R;9QBSu(qV zqF!*U{mRKwyD)73gL1tcv!wAGiuv39>~U*dAjF;mA+}U=KHzQ@K8-Z22e6)|rZk1$DSQd>{Yz}NtH}xUl%q&!%*wIxDs;a&F$T56`TR7!s1a;v zm3OO~a%>z6Vp6D}Uf;cQD%d`3A zenos&k?yq9p-*0`76NZCLfby&Yd<*Gwvt|?&Hi!56Z^ZqImVdh!As_;%uk!i%m1;d zM|LgeJyxxU;%^vwiFI7D1L-D%VX6A;@ylsq;mZR6RY?EO!Ow>{3 zLOOFpTJu)!NolbPR`}NRgM=_H^y0DCZfWKB^;y*S`kch7qUu$eeF;myk#pMOfmx%c z!&kN+2a4noGAT~ff&zJz6jnC3|svV;Xv9!Xs;GPC%eqTS( z#D7RF=3LHvde5APk@5SSL&DBK$X}8>@J4jwJ8mDG%9NlEo&`_+2g zQFW5z2>K0YiNPEIq8{SkKY=Pl(1p~|dCfYEGMq}oZI7ujxlrm!%jl4k&z$- z%w^gg`ZG>Kkp*4AAW3MT$gUKCH*W{D!H8EGxnlDZI^{XU;aTrqX4>=jqN&>ZLBFYj zjG=ddI|}OVi=B>e%1C!fY8DJl+vz#jO0{2%7HnW&rljSdtDvCK8-b4?6`loM`RPHr z&K7;|7bM`fSU^3jr!6I$%x0NI<^AYN`D}Iui$OUc4x{SMk_V>s%LB{(<^!W4^OO6M z(OuR9s1ZS`Ao3x%tL7|dhY?SbnCZn=bX`F&jPnL72XsMmFRF7FeHmpE{T*Ha#mT#K zl6beNY5v9>*Bq&+<_3m&On;~jcLQgO;oU^q@UYb%r#>oW1bn+OaaHKfS$f)c)z|Q8 ze29lA?x0WTE%G9o$WSVg>6AIpUiK;cSa{%m1-`5m zL8|a}@l{W#`YKoL`=wLOd&DLSs0cFZcMiy#-bLOL1inFuRWzrV9B(>M`L1}SD1Na0 zr7LW`)^VM$fg%@Iu^#HFx96M0?a~w?hUxeMRcI5IgTH046!RSifbCr%DL}_Wb{nRN 
zD4rU3k~$jPy-yGOQ-9_gwhPgZGUayD4V@mEyE}NFAB0*6t%lB?Yh`%i`|~0uLqU%$%09IMY_rKoW$7 z0ol6+n>J4^r&*8(9^!V7_oTV0m8-BuDpSj6k$Q%ydv4};!qw* zvB^xvGTnly20}kyVY~TGmkv*6Ki}gSc-g!T<2zv;q4sW+kFvav`wPWar6>%&YpDr} zG7`N`+e^-j@8i0f8e?}z|LB{mw{|k|%I1~7?r(hT*j2lW+v4;Bj*&0}t45dr@jV?` zC5GS9DOH5#+-SahOu-p!YLx*+sX6_;OwI4TMRNC$A468ap1Os16r=WWdqp9|WLq@2Fbt^eS?W=!k zUQ$0BR0)On+?sFVp6O(tcYYVy`bwvsehiVK$=-Wljr4`Q^Wu?s#5>q$be*=!fh?s6 zPx@}Zk^HAJkIwYcCW7|He4p6vdW-irk>&P}k4mps$l`Z}x%-Tge|oQ{M-sSOs3m6_ zUHQzvWk`fy%a3_xWW9wdcw)0CK1dP7-2c=uNN@$U=mR4KcM5Q>1B3EbYnj9i(_wpI z`&>gpVf+E713e0$oV^KD3sp+{FZODK{AlMIDs~X0ZPv6V|9RSmw+ZpL&EA$aAcF_E zSJi;6RkRLc^16r+Kh3R>pq_wUfN&# z7@#G}FZUiX*6hJ(ID6@~3LT{BdB$Y!@1(7fh@l;|aI#RfOcYC$+7eIi|ZV%2I%KgnNf@ni`ioIG@|0Ra!~$;AaO z|B3LdcH!aa7h8<1`5|I?4>N13*iZk^mkbG7DLGSbRR`86A@)8Kx#ikJPapvHg_%64iC1y9z~TMzE1sG(LBw=Z>WTOJK{m*iDId z+;+A&F0Br1ncXb+wOusF-c&eU?|q|$YrI~fzlmfA!Vg_5rs3Zc;KCAjl$Q%7=^e+! z4NUF)ww@-e&y5hI=>^iv4O5;&lO{9h(EYb*KFs1(91*+OD%o3HcpM>Kbw5CuOt|wk zNG~ZRj3rpZzr*+0?)u@|XL-B%*tyt3loRTJZXjtIkhbK(82_))feYfF4)ZAyhn1q=%-|{*HkL1$;-uDl8XFMwNUDxU*Ss~OaeXcHK zR-dd{m#t%o=%`wKV2g&IhetN*IIs#dW^j0r(Uh(t|Bz0w)Zq^caV1`!43&)?k|!{fR5!?ZMJ1;$ej{ z@_IdV-&|}>t563CWH<1j&C_EG_~$#0pv#j$bp35lN6nPslA6_(b&^21Xyb@tZfX*x7*4!b zU1)o#Y=Lqy^_-8zkvR<(USkeW5L#jkfu(u<<&~g|>|arbTSr7&sAF$-4ZNmQ8 ziT~cCw5Tb0=^5!1Pg(W*!MU^ik#76jwg)P0L)Vo(eAnstJy)-BMD9Cd9l=Rf6Ld09 z@BA5f(q!86UBhnRD4tx~*0;s#>IYHj zO^-ioKn0IGu=gNaI`{6Voe&w#tEXb>Pon+zkhY(MH3lV(;P5N3eW$Pl=>7mPx)Z5z zx!|D?*L^LKhozFdCM(q0JfFfKHljWWzInE!BKV<1w)<*8Wzx)vsI9y`qVrak0TCsh zF*)7VU#5BCI!VE+J%RIV^o!i0<|ehR6#J*Z`TES(x28okFrEa?WN)jup=L;(Q+{UQ z>9ZOfS(1>vMlHDZ&S-QAV{~xBJG%EqUqK!yA-O8H7@gsm(fdlK5h!e0@zA&%Jb*o4_bXWFtO(Y_2O_ z4s>HDR=^m`|oQ_Vb-3vkcwt9wBfZQCSnVAOuB z2(y@9|7T>oA4il^7=jcr^>+j@)tTgi)wmjY6)vn5TJl)wuUPkIVAnZMkNs)a)j>CR zoh{jVdJGW)n{T^J9pH%W`4h~&3q*%}{|a;elU_H*YC2CtsXuVB1qwU_#m@pXQO51I zm>ISX7?aiW%)8P{@_j?x`Nk|x0d!I|KpF8E+0;BqdAttBZeVS8g1CH^!M({y2fjkA zCnAaa3SAENFe{$?3;do<)Zez%Razy1OCbG#*2k6^-YRu?!{2G4cqXu}DTjiMEg&FO z9JP$-<)lmNkM`aADB$|LLKOOkp06Mh_1b6}d&w8`OL)8Ro!UF0aVR9{BX@h#%+j{P zknQUoNVxD6{`hlCPxYtU1k}a8oO>CV`iR){HP6sJE@IB_VhSbuma*+8Jx=pXc`a&d z&Qq0bIkWE958VEWC>c2tFyaxctj~&k&OSd$w#y?`VT z${uJNP>9#3d)wK%$I=F~>(D-=xxmQrVZ8vYz+A*!lDvPOMZ33~sgOv7$))ogxNtsAdV*8`6s-$+|aQepAsL=K26vm9(sVthL!XD>HWi z0`U@z)DR@;=v97R%-}hgCHNpHLb@AyAmBoG%}IKuteoIO>im;WRzq7F-2Aq6`H%C# zj#!IS7-a)D&C52tp)OjHqLenk0v48g(OGQ{T^!>E#8sDBk zTceUiwRY`~Wu4TS=R#$%9}IwTxA)!^<`%dUb+??@o%e4mp!ohGRZ{VRA||drXB1dO z9^yX#k&w*)pPw+J&f!MVHHw26zB)fn{$i&3iYmD8f8b)?| z-TA?gg8i(ns6u}$cI_2c@4Wb$d;FQzSFUYcl_0Shg^(g4(|GUx$6kD_yX; z@g69brI-bRyx8{B9N-oPB@8W5FhL8e+yect@m$s`0Zt2(8@4_Zz58C}}N zGM4;8&*_ZCX4h30$=2xHW757E{Dw_@yVn zUH~kisSs+T`=24|3q8iQ!16BAV;7hdudPOlta87m3<|JsFf&gF$WSf=%Wa177nH&2 zKJW$q4s7gyDH;GF$L%Gq=uwwhe1FykxdoHNfkCcz4ImJ@cNayyzD6Aq1G-EHO1iiz zFxtXi#8!h26!Huq&kYEZjRv*1gj&IM+yb!1e*cNvfFNRl4yoW2G$0s)Mr-IPbb1>q zrpBI7VxWOi!k7s%geivg<=jK`XGIQ4lvlEzG93px#*!nOzOA>V2l8tfAc{O2ecH!} z!Bv5-2gRkj;udN#a#wyF$xDBGd2b(0QoM(zBHTb}mu@qb@X#&G(n}E=5d6L;k9DSSm7cwE}g6eh<){*%z7Gv<{*-?Y4ZA05z z#CyKeVWZ{s*lk3)yiA8w?908Me=eSXkE%GlXt)HqIIb{PN^D}^(?2(neNFb-<@riG z(??r6s^0yn2yA*9fc^jTz0r(o%FG9)BNdgeQ>q@s)IJmDI8k!UJ4yuv90`8nh5JI5 z+B74KRFiLAT;t7EZB}4iwQ?UvJm>yKVdr_0^e*Uni!7Rl4QD2~W}Ma>Rer=PRK$hLc3Ki~EBzTpqoY$2SfA%Kr?fR~# z{)CLq@ki1j_ILQA%$~c}ELNftpCwsa9%Q6{0D773-uaJ!PnR|zLc8hars8FAC{wn8 zQiQV$xT)mhYK$<~Zsw{h(0?FG=^&r&9kzy^3s~~4LfTg%e_&SbUO+LY!J5ky$LSZW z^3t3)AaS)oz7UkLoV{zzK!h?Wi5+#1%fo1yZ$&-lpjxC?chREBi`;OE)}87vQIRg@ zwm+)UUOOARVT>PZs-6LeH`OBhMk{_YA-aUPfH}~2V(p9aD@EhjYJ=_&L#l*Ca3e^f 
z?x5^%ce5HWs51LpM(Y?3Y&L=IT+C=o&l!(K<_GYShc_TExQAwiMBdqn<~>gm5cRxz zKJm7e&Br1wn^*(7`zW+-1G3xEV6NB9Abn1Vc);Sus*HW{j^;;N_pcsZ;dknnp1C*} z?O=ByE402bH^+;JYW~`uWZRi7d8r!Mi+>L=ToCA7UIrI(_6i^OX%!3JdY(xbzg3>) zC@$Vp1VGFZdxl>{AI$YX_r`5b;I@B^6D_D}x(Tpy#XTBljVd9d)K>5B(4LRa9 zqv&s5L;T9{A+o-Zx=fyenQWcOphpi#V;lbX{#fpl1=a{@IAeA)i==lSf35TWc`L+? zV$H0cO3W4B<1|PLC*gCMx%5SuUx{}m?$)Z_FodUn$geD}|Hf`?0Y5CaFjK#pyj+_ws;Rn@y797LXe_s5PBWSphE zu_5JzPryGk^7%?6kI(s*V%SKahb#{< z2#c1;)26^(oq(=&pwf^%gbY$I-Ie7JvOwor$}sJl9jtL;s^q`ekehbozj+i%li=)l zj3BD-Gz-MURJ}?Y19ZRwbSGh1JP6cGXwgt5hlHUgP1&1|+E4HV3bN#YH*6frty+S; z#HKmTVEetfIv-$b({Ah+s9SDucBC703>8c1gN|!;b4~+Gm~oeN`ybuIf95?b8sMmNa-PW#EnQ;2V|WMgDBvJJX& za|yL@0>+Juz+G~pI?--}Uuh4refL0wglb{ToDN5=$e>7wyfFa7XktM1z>Kk%0>Q-n ztL|1?7`zlEu8vj&Aaj!r+<;__6srJ3Ktd5{2t*>#9%r3gNv`ZyXu%Cgp9vUOt?K(e z2(G4!1Q#a@u6CcWRs`kzgpDC$$(>xu@0_H~Q98wX`L`p1{ec}wCa! z!hS$GVHsj2C>E~(N9;sE8lw9Mn7H37#4!bG=g)Fg&^>4Xep%60Q36F%&<%)Ywcn&K z^c#wCJY5tRVgg_RSXDp4rJX0PgGD!gTSQF#9f$Y9;g>!2;nHeHI;s|0TG=DtYnD=;)OWQC|Y$O7I6^S zjBsznnL{~PgcD)13I$f1ACb1qjXyBhwgGtt#Y_Rw65#M}a+3Wac>Yfwk!O9L+E{gm zW`8F#g^O~=^x3cQyHI^P1TZAB?p7y71GMw8*+b)C>_ldf2Z5NlNiOF+FGOuF`1fFi z6zj7ER@6i=6nGwU7Q`!7OxHo+>%Z4NwK~Ak23Lmm!sfHFT(r)*BN;QG&%MDd1}hZA z-~8a^^@x?Tz$T1=rCg$5>M&WbGfoS*J{I>{`bjXIibT*xmy#Wr#Dbz)3>%1RO<@^65{_j1O7kwJGab>@3yQ^L{&?GE}Ge% z2|wt1UNg6$%ZFfeieM+aVcc?Fzk7W~-i=2ch*$AW0E8yCz5gf3w2HT86#4XAfY=7) zST?8toKGw&s3oq5AlAie#>LpeOmQKgItn)xW=_>#*DE|a;30i-l=$qj90u$2Q?#|P zR2uWCNoHKz;^H-i^%6$zP_d+HL2V&!^eyW10)6#Ck1Pl*smiRPSz`lbA>vQq5y!eC zyqsBn5v$7~r~_3eDDL&UNWitzKhc!Zr44bo|Ij?ZXGrlQZ`2z0>W6Hhv}A3fb!LRr zZm^LdWYqq$MK3Cd72iav3x=8v%s=YIqB)9gjQ!FlKia3tg2p`u!~c+YC1N@IUXQt`|Nfa<*VRR_6sr%$o-4Bk7)t!;S zX$tf95tKD9G$pgTV?O=f68hxhZ_wKgJ$LHP?};wa%GZo!Ku!<)PuKiaEc_Qd)PH`i z;1W_0-S$g{&|`{Qet)v+Tuo3%)=Ek0DOQ*bp78L81%K4Jo)jYk4M(rPYk%0!buvUO zhpPD+CIY?Qn7nXLmq}{lnDEMY?%z(HOWe@j>oWicJydBy1rC;S0o%Bzh_!7UY z(}?eh(NjmH{Diw;I$t@g)M^3pDKt%|WR-i|YACj`ELDn^^FHweG(%x!66jR10oUFD z-9Yr3B8@W5tx(2r;wnGo68HV)^TyCHNj{&AQ;c1Y2G*Ysa`)y$`Q|j@wrT+I_d0Wj z0j*IyFbg{NC2D^aZTGq7%(K{5sf@%Cc#y~0R#8FQdqE%gcSv-LaK2!F3Jjpe*4Ywj zXApvSwaWDgE6dO7EajQC0B3&KZO+>d{&Tudx6#B>38pcsGMM9hceKs-Za0jrYYW$u+KgNfm{++AVchMTR2&d`3ZDLB?-B^OoUj5Kcc z)jFXO8PqYu@q=MzJk%5@3gy97Qu#h9^E@7OaScQE6OU-v=&fs91^TS7uY0bQKFVjx zJTh6t^vIgM``2(#3HSs_P&?03!(`KiI&Z3Wuxwj+xuzM?`!RMd&D^chW2eXPZby2I z01iWSiKo3fQJ9dpiV7OsbP=o0FX9Vf5AZus4)h{c(;=zX9zIpM+R&Ae$5$mdef8V8 zWbCg9%JMVuy#UIjH1XhwxG5j1&pl}7fnQkUhWtpc>&&vUq!HQYRx5hX=QoK%Xt zQG#Z7P0Yv`>Tumzw&@-pS)_MFsH~~h4@}1;OO!)Mj$sviLXhgZP+EYpXB&rnf3xRS zCy6MYedRT3rh?H{%{!)9$O)%N*PDea8@py`sfA1XI-`~oFOWSiIW}26m_DdlBvhzI zLB%rHf$=$GzxoZ{N-SOYUc=L|SUa|M;=Za^{ZVe|T691p3(+)PCvqffsc?1M+F3>uQi?c9S>e%vTF2mCky47r9gsk%hMZf9Qv-CDwc{d+ZM%lpAITFxyKP1$WtU%eRN zGkcJCdLj$jD`$}%{>nqTPb(+xFTOauO;Qr_SQ@u2#^Xo|+?nh?6m9eB*d;Tm;lVs6 z4@rN3SUZ5ES1G(7n(O38pOno|8%@JA-eR{>2%H~9&R2dN&*(TYnJttL1ET`4x zglgO~^UGYUP>v%h!q#-dZCb{+mhlKW^jYx-pHNV!(>zeRs5x$s^w~nEb02l+7L2bs zGh%fp)!Y789pC8=6Ohun13-{ZcV0XQF^<$#u4LtY_;AlZGQ97&TT}LMAC)gFPQTP2 zSC8u-Y!+8qSuWl2Gc>zKAni%cci;`acYEC2SN6;Q0K$u{MpqxjhERBaEvgLCEU~_7 z;$HaM{E#bw2Y0XI&DK8pYM512?^{W3K6LM6c@*LTOSZ<9X&=TG;G`whuME<0&9O2m zar;h`#?~L_{oLCBPVCA|wXbXC+S)-l;oirO+2VI4R%GxO3M2F9di}V&T7hu3N9~%xAoyQmi zOX3@*`USEs;Or$Znfk2X`oO8rvCS0#zGC4{?yAGk_&$L;AZdhpJ@H7j06yRvmq#AX z5E};K$^?1mkE?G!2H3Lhqgo{EVQV7*PjOX;eywJiQw*CpFot0+1FQILcEB;jKH%w9 z=}t1Bm-L|~keND?6Mm^2?S+j+M#50 ziAts?iOAD$V=~#Bjb-OBr;vI9ZauJ4u8dFO{H1-vfp1WQ6(3-%6TTXce9~80>J1g8 zTz6XP*GYU4Ll{?hk1#Db{o|5uB`oWsQDE9(!-Y83OL!rB7kNQVlAe=s*HVturQ%IT#wii!BN7t%11FfK?d_+m3D5% 
zP|)JrJC_B^w(iv`-kC!F%ovEU0_1%>Cn@e24d{=oO(I6dwzct2?w)C`5atE>JU8vo zDwZI^{=i9h)?0595O{y?V@YRudD526;K zQizC@(K(Dd{+jZ>mPN!Hm{`qDJ@ORYDP^n5h*+idc5S<|mmr6%q^bibG^i@0<+BO) za}_3Og|h?^&BCq2ro$OAcZY0E^9)LHX~}Nb8_MCFNEG z)iq*-Cw`0=z0AI-q>vvgTY!TH-7<=Ax-qG}w!LbqW`DNQ&Wr6H$oWvLNXhse{@H@J z*esNtz9;|~Q?^_%&r|I8d8SeMfGlZs4|eBCFlUa@R(N+rKgVitPN=ZV$zi{dTGaD1 zu-AoWdeSZfgmZ{ZP|WYoXWH`tI}=-f@U%2cyW6_=LV>DT_mpU#O&V-Cc9Wm>1u`sc z4VGprl=d1P&h5~~jS|8_-zH3&00;TI|9kDEZt4nVUI z1Fqyp;VAT)(iDuvH~@^ss)c@p0Bk;wY9Xm+Q5V08iifGqfE11Tx;3_cnXh%)2ceb9 zqIoScn%?6MJJAk|rA-fV#eD8BUy^@6@CX!|T+hJWcAEL(%j(jawmMe7Zj=r92_?P& zy*`I--;9I`fRf6ym=KekF2i7u4AeusQD9zW-HieSDYR|?7wdRJnBoN3`Lj`RF#A#0 zb%E4PqsnGMzZVx23gZg^0WT{D_7A^kU;EkvnCDz!5dQ>r(XVK~G-TwTn_|wVpsITQ zXo~rB?`HTnH%B<=Dz}wuEPL+%WA8nqn)=^;K~NAyiiijZ0t$-Ki}Vsv=>m#C=uzn) zARs*?g3?Q*3j(1iNR5DWBqAmB4xyLOdqNG6`0;#eI>5F4xN1 zZuWk@&+~ac<@x97fDaGHR&g@G0h5LtT7bAw+A&CW#3@i~r=o~aO_Su<{#~d!Xri3r zg)K+~juXu4iS$Zs3q^et201lU#R31F9fOvC2X>h;%)C#rJc@*yEjQcsbhaZ;d)n)S z)|KtJ+Hhpg`>J~o%MXWFt6jdJdN)W1oy%94H3Ks%-pA#tOFZ3ReDb&H=#_ z`g*Ds{wsZ7O807yr@;$tb^m0cF1I~j)vT}^nRmK%?u*|6a2;Xkw|{SVZoN38pKfcG z#0dLS{1{S{+}!@eWiMN0CXP+B;fnCKK{&i3CGFL+GyKX9EoqTKB>VM`!h-~TPRa+z z<5yOnEosgBr_FuW{7&)%!-oj&#p5 z<V5YsHHhyt389M98DIC@rmZ0VT`;Zl{%s)sMRxxG(@(ZB zKT%(oD0l-Ga-HmwSl0i$IMezudE5fPRn5l`?J<%D$DS(oz^Pz)l9x%aZ88ad7x=@N zH=2QxjFZ`fDW%%SbK8_8pq8kCm>+Fx*qt*>L#oEePJmnsu_eV9iZYUwj({lP3zDF=1ELzHJGJ&QJ15N>v4;&Ta)phvqe>lAPKbjX`?}yf3mz6 zCdFENllT?Q8phdneQO7Sb}V6LAR-?V!VpXi3!C2>Dv^2O#;-=r41`*0mB+|XMsvfD z8Fx#@*0XY*inD>+CmH%*E|tJ(P*Y;O3m)^Qw5*eEvMm3Tn(HN{$6dcO$~pghr$JiN zL9zPGmkfWI4t_=5R}xUyiIVh8={T4a-gg2yyGQBzAEn3|T!)WvK?R3z)d2gT(Sz_7 zl)nuY5?0F5SI7d8Jn9$arm}PvB zDwf3DcEUYgf|G#$0KR*o;Tq5foMq| z{WaY?B-*ZkAXk1823VqW`kH>3wy{x)SJdceSOz{oz!7-Vg#~A00i>@3(G2ErO}aky zIHCN^4$zh2A0K@LBI3Cq7l|9oD3LO=*ENlwR!ptpuIXTQZz;PeqdYYY8aMZpfheBM z<?*HBhvf z-(Wi^V&oEK<3jctd}xfhvS9Fo+FIlW^*NbSHuusXS%W_uBu~S<%bE>?@GmEpi>)^6 zFNO?W956+ZqDCk}P_5A=sX4zCQX)yRk03}Ua2k_^Z}W2d4NZ``gW%61D>F(*ioA$2 zK}pJjtroK-s5QXog(tB8R?ux&`*X<-b{mc6FM{3nfD6yEK%(smMbelcJB z`~Y@#jtO?x&kJY&$y=1A$E!KCwmOmLjYxpQO{4v)aJumjy3SP+S4{Wfw$Pn>z826k zn5GZAL;+YmPvz>TPt*eQ^yb9wqe>v-<<<4Xh0B*FFI%iFWi8!mP7Ay&L_NRH09C)W#t`%#Pz0&_fR7O;hR6jnW{?_K~j~saFuvxyiiipm038)FL zy5}8qbwgLF+lL`R*Q>9kvf$^Le&c@! 
zus|NSWw5>w`RLpZ9SE=pstXG4l`4{+rTg#L2Y!jXVP^ZfXA3A!O9=)7NOLkjIjzyA$gkkH=cEg(OzWu% zW@1QF!T#fSTZVvyI)2sn;3)!#s$0B815}6o%2E1z zC~m@E1FmyWRiBDqb&6qM8@Vhjm2v=#BSy>Ul(k_TYkis8g7vv!4&8kp4R4`#`aK_& zxG2&(!D0LP;;&cfyo`^TLft2lF0Jy&$k8|uFp&3@I482&6YT(8Cuh^79O z*bMoJMKr2-TvNRj?>(c;t=M(0kDVN8#DF|*3Gf(0aV@w>WmtY6G2EN@hbk2(vn&f&!^(vt^ zhyq?{dKD3ssjH@XM(#c=UmojSSDSE#l@CIqQx)4w{csiy6RRG5tI3~NH}u{GXtuiZ zIx4Yi6KD_XH9YxQ>kz7It%W4@z+~mSMR_}5y_0-2_=LM#IrcE}j9T5#{WmhI68aRs zv}i(K-=5NWv%NVq+45#W@PJ!04HMvxck%pm@je4ayf_pkqGIQyc}g;{Pd4on@|XyP zHk?_&JLQJaTf)qgh^(euR4UxX{p9YcFFEuTFbA+=D}nvxgB%J!Qi8&O=$ky7J9-2# zx*Z@u$KL@k9;~1&kU&ZYWKRTu4-QXwDbc9OMqm_(H#RV(IG$`35I0wX0s#PO-~TYj zayPQkVgXHO%4pC!`O!S64sh+%H~Yn=VTSl|pRLnKrv#z$^L13gR?9HSwHqrr3#HE8)z{JR;Rk`$<;3hab!)7m;%jZ~asD?iTQeu_Whs3W%=75};T`@uME+Wa z;_AlND3VEOEPLyX)o^iIN-$g%`4$v+iSrUqI9>dfAIgPgL0JLrZ8g{HL zqxVa^u?HKQ5Q}HxT126tq~(y0X5kf$jaeKWhH_ZhhJ6xu+lnCZch{xj=kv@-=R1qjJ@e2&DbSva$CQqFDszX^9;FEKZ zs`ys~pG3w3fql_Ji7vXbvFVbaCP-;e@H{#qi-kZ=ol=&DRaU&W4Q|o~w(~IQj)RV_ ztX6FQD);@k-(}7Rk#?mcqS|qzASwa=8rNH31!7xaSckYxqP`?*p ze8g(rb3`R0!qe|}WS`qV5TsuAyecEVN*mE%1+C`!xG@rYPq=!>x@0>kL9enR7tQDG z9?7FLcBnZfUNzPKv(oK?LY9nytr6e7BP#GXoTx$qaOw$kQQQ$bpIZ(*M$>hWe*4!meR!0?icQpcOY zdd{z=&6oxjNS3M!xj@+oXOht0!b{-$A@F`M6Dy{&MB%mMqj@*&vZ}ic)K#N8mtLI} zi-z#Cp{LvY$YMzW?6+j1MPwnHh_O2Jonq4VNt#Jtc4l@NtPumen9_PjevP{M%IInK zPbs7t>~nW7S6}*RbDhK|GdDS;kU897T=KgtZbEpJGQZamn$g*7eMjH)$rAtK*&58` zR6=GQ*C=R-u0^5JhXLf}t_{OJBtUR0lx^R=SsWn&ReQD2&r)RW((&z6K{wmb_oeSD z+)O&h+;LKLL&hg2PRZcrv3HUzKT{W18WC=>jw6T%)&PnM&TzhF^rD}(MBKyccH6a= zl}ewAvb})`3x$MQoINEe;1_?DrHQ0(cX7WSVHRN76n@FF6I?vdy;q_yX8Ae5&7k`* zP&_T!93Rp+gUDG!g~)&=k6VX2=Ef%(Gb?=9uWBoheEf|QkX7hb$>W1w5%TQ}__hrM z%*-UMQ8u4SZj599GO!c0i(;I-p4(JC%uITVU1SXMXWw=&GjLJKM=hK=r8v$=)AX~@ z26LW#L?%PSAz%IAEyR*@rQAlYFbn6*Z5Sp#9hU=6ZqC|1lZAJ_90bIRnOI6h09h5) z!#hz)(Y*ll29`VMgU8tQL#TKZz`hD6vaj+F{~MWE&;t7o&bU6bak{uhshH zo4L&;D0zHF+UfP2Zni+!DO^PKvl}MCW$G%6wj%l!kNPxi_l@Ov?K)A~s5Y*^$Zf&6 zjWLZdM^i<`t0g4`d?5-nJYncpjVvXWejl3adFuRwJ>Yik%AePpv|40Atk%aj@IlHo z3Ew#7IGxJYNu?*;bv`O%^$DXl)r}jPDt^T11$o^3X}kC??ux^(KvdOV$-uS7SH*t) z?e$rCg?&Ss*Y?@~lkNCNF@Q?_LvTW1mY#ox;Gex-p)PJxc~pZ>(;1gdc~Q)Zswt4g z?UehAW;q@LZc@OtA)cvgLs9`b6y?6NDh4Q)XxW-onI zzxbpVPOPBVufqNOgeAv8={>&WE1&LWCMk%=vE%KjWzqL)@5h;82^CVCBpt2rlP9=ZaB-1{akj`LKLp4R4S)yH0Qcu+EG;{&PLwt<6q74f5)lnh)9On?a6`SS?v*bp{m+)0hA_24QzoL_RN1tLd( z4lVL<#+gOL-0vbJ)HhLLq)0ZU*C+onS0Q}9q1V9i#NowoK!VYECqM5yx$IPVWpcoO z=$w)y&26Op%b;QuU2tgxvNDHGa6&lVu;e95&;|?mgdQR%DTf&a;f*TXj*2QHZJqy6 zz2~!p&uP>>KeHe`nfA(eG6l$rFM#c$CKzjo-;_yc2(d1Oh0|0zUoJeo2}y!*hs{N+ z-r+siV_u22ZgDuX_N(H)`S@{prcy+@X`E^a=Lk3JIn3yE}FO&s`25%ba|*t^P#@V(mtMcXgpe?_RM{l2L{ z1A`>l$&^7-^h-5@yd3OVKWDBs5MYfV2Bfe$; zj|c)gWlK8$`8BS{?2&j0S{*%S)7NUBRM_A~LW~3m6B3SFVYfyq$$3RqRt~5yN@&VQ zJ+mPa1ow*}Ww7c?H#Z&{9m!7<82T#x)OSHf-%64mkkgahPq`se<>)`K&gYODuznD? 
zO#hY5^S>#QwP&{bc`}*9^IGPTb-A8YGs(0cc>MTQ`FZWIs|qXH7SvV?o4fTc_6>R1 zme>3y$^?2`(;axH0s%}8LH#`uTr9z@s1rd|AwW-Td<>i#%iZYN`~Ofy!tBO~oF~)} zbyzog&lni9P)dj;;Z^ff;Ug=6xnImo_Me8|f(&N>z2NO>O(J9o>O)5yJOt@{&yAv( zsEz=W>wjnG^sUg@9EPxwEhw7CI61jBd`&UA7^TvQYutQ79L1ws71JG0P0*KTI;Ni6 zYTshQWShjZnP$B|pE%j%OGYWH!MPl**(1Bmbtz!Mm(RdS zxtsD5u0t9oFk=iND7Ob)A8>l#{RzPRw?0iJBv`s~%!s6k==xH<4~*J7r!A0}-~w&< zQz4fBAvW2tX!^_n6uBA>TEAf7q)Gzw+BUJfnxohXKZ@aN)uzWiLYIn;^T z=M6)$HOdi`^aC>mz64IaRULG?&tvOPxrJ!_c<_q&-Ven_>^#$KUN~{UW8PK>&p- z%UtcnG3rn0Yo6ufj1j(PIT1h$?9-1!wz#ZZa>WJ7S8~(3d>%;L;)ua6&6bup*Kd>R zv#NYWx2G}UB0W}!YJ*qf?`{^$qV;nY)Wy`dg;m`JO8kcU?*#GZoV~g>XmF$Vd`Wz5 zSH&kap6sXuAPPc!5q{lIYR!u1V3gignPT*&9r#tWPiTY1vP7Q@ zS4B#c>l9o+W*lcAV^o4W1amqsLj;jT|LErC(XZsM*N+BKDd-OlKI}4~n%$zDhkpw{ zGAsLEQ-h^oToXUhU|}pvyiwb?e5pU#onuZJ*s@E>AHIGRx6&j+fNz5r^-Rq8@ZTC0 zMROLck__Hz7K+F+oYKl4${b7xUa~3QOpg)eT?r5;f`l{tGQ1VdwKrhl&mo_{fDEST zma`|PI1mV-W$?=DHfp}8S#w^fJ(U2)>BkD`y~0@#mNP;VeA7W~0wM(jwj=RBF$SWe zj&p-kyTF0+7VZZ0)0aV3j;JVAZrG@(OshG#)$yu~C!6~=mO5w|kp-I6N6zd*94)LY ztf$)i9fbO6x^4<+FqD-@(Z@H_ypzEz^r=PtHb`XUf;bGGg4VOLc-iQS@U{m4PK$Z? zTpiE_5D;O?Ty}u`gVp)j_#>~H9+1*v1z$Iy!gkSl;9#h8ZP@j|yj(|Hxbj;JwOEV3gyfP9v%#Bw<4(o+Uf#iGrKSHYd5$DgVCg#^4(^uXn_2k#~ zN1?ghRPQ^=-cVopl)`Fe$=OrhV`X|x_7{j^L5PODJpBVv!=gji5(hrjiG-%$=j4=)SOLkY5HJTDCzw6AFdJFT87Tf3$O9_JN z@peB5ckq***R|6DJyDZ@2>M&#FdoU`?(@_AFSt$76zabH4bV*NPDVJhdpIIl z43&J7Lp0%6CU^{Zn=BXnDKhiERJPa&=dLbJeIUDcAd2)4l|o0kJbhL-E0<3YZ=Q;! z>RmSD`8J>t#mIs9C;%GRsYQU0L*3B_fn$Q9z&6-^2(-C3_g7TnO40*reLN0*Mn4B% zJ1#K*$RZuRU*XtgG$qEIQU$0u>M$f>KvM!~Jwa9g&m8;?uP2Zc=T0!d$s+One}*=F zh3+;aotS)r_nSp@^yi{XR5QG9*GqiXN#1CWz2mPvaa%}FT|w6-8%QHj7M*WU|3%RF zb{)yJ`qC>E>OCHv^2YbK%zOZH+W}7*d(4yh3Lf;~_(;GJ_66y?lH0^U;EnX|zrHc- z0^@54?5;zItcKo9)>MB!&P4kPyu%3zC2FZDTV{M4>+1&YMJXbqG*b`lVaA&+i>{jc zUqso~YREWSdjy1gj?FG*EU8811aRvrB+ri7<*ruZJ-Jjs8rw=*TEM4JoqksK-oe|F#mo(vm>ii;N{HI3J_jnJ0D@B7L=3I4ZQRdH#00 z^+W0c^V7vGXJJgoetIp#Q;!E#E_oXbH#ssCi|H=$##^%x?G{0tn=DR>_GyY2-d6Mr zyorBQcak>oo||dL;Gs06BtSLoLiAVDwWH~lS2WKA-Ba$+gVfB8z@=@<7T^bO4WnXg zD(1i|uR?xn#BS{X$_utV&A0y%s4bn!XRWeGcHsVrw!OL9X?pMi@KQQ zpP|Y9vQnCx_IankDLtv1I6?UGj2K?r{cWAKlP=uV;?47O;B8ui(+N5#vc4c2!`T}o z&hWPPOTsvtJKD+CSYV6Bl3w z_o5C_4@v-w<5sRjkg56OlVIz)Y`=n*tozk+Fz%7kLypv44qGO$%4a5{vn#@cBGh_% zSwZgX-JGvh-FDu*Gb!wj(?OhCGAk>p*I(IxX)*GE1)v5gcYjvap<*q`K#g@LXHrC* zAy?jg>E&&GDlq!)Q+VI`3w{tjY3lZ6ewyPB`97C*dXkGwx6nM{YKsXmk8F8*gDuZc z$YUJnnC-|p6bWZE^Y#`=Wg*41N>4*JdDV{L)yJ#Z*~)JAbTlDUbTOYEcX<}oPCt1& z0uwwACTA;`m6i=Mpqd}|=pSmdbM%3JQraM01=^wK7 zZNfwdnvo{g2u!8kg`#yf9JOi77cDraYBLWXYyirr0|CY;}1mLjt#1(AzdN9v2r{G&Yz?_ zAa-qO*Y>#uN7p+A-yQBLlbG%pB9%0^tk_OifwS=i4nk&BL{ezG|DkFFn=LRS&{vhk z3o+i_MZG2F*Ll2uEv7w2y)B!tRsG2A@SQ)3RF4kz>uBy`yFKS>-0!R~BYXIitdm>$ zGsqBYG4!VD%T>K963g@x13KFxPOse$)@pkd85*lQUmwJ-Q;zMV81d}TBWTuR;1!;&yKD%>qvtC-KNCb z`1UHtqR$haf`jh+YiQnhOk|%8yR-A^dIM|-c~SlMN9Q3aze@Q`SUXF)iwHD%UCCz_g}*y_}9dOIQMOlTf@E!BtoA!W72N{CwGDykc{j1SKt|}%By5%J)Qa3(3-Kd z?B)O<3eyk}3c572<+9X&9X1NUrkpwW5s5Vua@xZzPnB7<%Dso0r9svP&1PXT_lLtp zAZCu}UQ&EZM32*p!Cz3sptJBEFlyl-t)hk+DaWnpA5u1Gd!mpbh4dh5W6}S-1#GJG zGq9e-f)5;+kCxy`p|eAFh8U`F-!OV~wYL+KBl0zAn$5ED$w zui^LUVb#cFY}?M~KHA=&Lk}Vb6ZhE|^s(nZX(-Guc|dYrDl4|(BwW}y1t#ke-ow_Z z>!&=%mXlWsJFq=uB?;<1M0Z@TQrx9CV9DamXJ#$)5*VJrmO5eHI;eNV>yGA1+S z?217~;!uzIU-HQ#GGx{9xQi(p0-*R;;0JT8L>E&#J${p)P4;ViSqj>xt~QXEgMYL9 z0`wj8Axa-0R^pcmJ=d+~2I0t!w6|iIbk7KeZAY z^g6z-gR*(`SM3H!qY;E!x4t*H?Y(`~!^6%mc+ulDL`g#Via*V3v;t0yt#!G=8f*IN zYxTh%e5h{omE@0-Uo*W?K-IO~qQ6(-8rZ5R7+Qf(-uw~p>IKA^5me_# z8rKU8J$;trWZd3ogni6+I?6d%^1*S-?xb|Hs!aR?vKF8`-_U<(TE@+hk<}@C&0!)W 
zaZ08${jHeI$LPX)Uh3X)XaFAy{>IQ1Gm}`kuNn+EMR5>!|ID5nPJKa8-W-JloSc zAgMGN4xcVacmRPoxR!25r7L31Y<@_(HxMWL47lw3 zwE!~H04HC>u4KYm7Q#u<5P;#0IIlpF>p1g0QasBNJstmtYCGV}y6GM{u4y$IRt*CF z+%{SjE2vf;q-)+7)jkILCDO-oA%LM;K_&bJB$cqmT^R*{wa4&-&sk%p+u_lGmf)K@ zbP6X9v`E#6TOV}K^eYx)2Q({05J%sH3`Xwr9QLlyRbfBcf3Yvjotox`z6F-rz&7ZQoG8s2-PDea zl9?|<#Pe{z`DI`q;S2Hum&Y$9mlH$7^Y(+T?(|5?M+}84BRmOkjCdG@Axl?O^TnC* z){|n3N%73v@gtHP>io6#bd!zgi${(4kh<&XcMc-6%3eQ4WZP4`@s|{q1|e1RB&qJ4 zYelmLV{;*Z@f39V=}IwdCwwg{^URc_R3HDksbWN=#@i=OPW^Ptb1nh<)6WJ6c~5%X z%^ogO99Z|+_HivnlJ~IhAqq2uzIDH2m#ZYc+{6$7utF5CxN!T1j&q`>b)LZkNZur! zf=naG)u-f&+$LsnR+Efh6oh5UkE>12{rlwHvS@xoM*HAG}GYgL9HII-BF1z%k zzzC@u6F&UoCRpA4YR6-I?e|f{1AuplVp#vrv_+IK5kUYXc8so38GrtjU)x9zAr7yky z78dgk;<5#BhAc$2C?j99^XIw?kAM9i@F!LbTE(eK<{aFwu7|uJ@xBT7U2a12S6R86 zVcqmrYJyZqtHf0x^`JMhr0Yxs^bx;I-u8vx0gPq$mjO~kImr3V?v^)4BUu%v^!d~+ zvZV}Gq*d%q#N-k`SmKxG(F5YE!#i|@X8v3J1M_<-2)1LoX3*qa2+M5q0`EYrj(;5WczZ6C;@KYl==r~V$fUM#S{i|O}A6XJC*6o|j=S~MU|EvsrKd~+8M*$}R zY3VQ|>i#AA%aXsj6C9}>Q49pcHZy+ekNBP3Sy_}n?*(B#0jkpnA0sHW0tg+vp2bnq z*w;_7ms^hTd~0rY{@Ta4X#7rX$u1jG6o#I1Ny7r|Z!_yB8y?0z_nId<&g!E`1jytZ z4hWyLPlCnkgO18eV?jqv1OdQip(sZb<_eVM0NBFO)?AhZ?vZTMr2ogX>{&#jbo5|5>w_bJNj0R50 z!Y|BP+w%6kLy=Z$lj8k`qwfTh0{CJQzC1b6U}zz7j1Oj#VmxaV?nzCrLM6Z9vL^3v zE5#=^i@IPFb=SHzfmzraUm?w^AX&XX@>@Z27Rlk}#PWe#4z+P;gN?^wYXPON&rLbh`kTBSrnVGY`t zMF<^i1jHhvw98uaeoM?;R8C)Oa^Xt+ssSH!IL26AICwL`zg^V{KNoQbzqDyme-!?9 z?AWJ_)0F4pPKi=6(?u0^SZ0!+%L~>IrfJ5@Ny$SsGrq8r0Pkqcoj+|b88O6&(HbT>(=N2tuX^f4CvFNUM#O47Ty zf=!_O;<982N$x8c6r>o5V>kS8vV&5?V*$7;%9<<9)P=e5kK|_*8pg%Dow&l>9O{dM z3bkp-eKQd~@`bF}G3+pIUWerm0=c3Ei-O^ld*%-$#K*k@y%|;VehA*Y+&=C8^l+YK z+Vj%U4bs%j_oZP)YgmLx>f`YO6X}0rVcY_AWq=|dx=0W|Lf~scz9ni8Jab9UwX@{M z&0fEm^^j?WcLtKfhwYf0k{;ixm^LH?veB`7Rg~B zgXGTbmao2+)-jr?foX220Be#!9O%l18D(%(tgI5-n+brMkM5SN!^8>JVY-f!y~}Za zs_L!rAu4}<^ad&6(+k8thr;oc8Zwjk_W0Za$o4xb7UY18yRjPK9Xt=`m=3D&A58CC z#|wyZOb;qhxpjp!Bb2Xr#Um;<_cQhdO5c~^507VLDy;@L7s^?RdHh2C47LS<77qS* zYn*~URS}*k{sVUco$#Mx$nOA#z~H+1TP!K&Eim6ARzhPa*T|eFA0T!V%-r5Jf*jeS z8c;5MMa2sLk4`yffP@gZXerkfmq*PLan3&(7*E)u=^9lkE8^!Jsi>Kn+sLOAI$QNAwy4IedNvZ*E)F) z`_=y{!DZj24Z_tVKrDD3h#f0?u%_MPb7a`al6{k|L(OKVHNA_NM&LImrsCMgWNyE% zj=t~mhRKkFtTKLRj}aWFQrIUGLwJdoOQ=~KCqd5zy z1h=7CfbVa_yGOaDQ?#<>WU*Jh#8^V=rM=x7FABzoNcmyLjR`edrTEon!~FF`pluwu zyawyHkC8CH`?6n5>6{VmA*aeknIk=zI)Eq^ zS*7*(g-Y8$#cQk&_m`6O`)~Tx@6-ei zS7puoe!fL#>qnS92A85Zt(922(3ZTIuos`yb}ph?H@2a72(0a}iP>8x-2$hDmgJq5^8?8p^eW@;1MTbIepaknDSRHCgnaU3LqllKNAnv%lg#=r zh8AZrA-SY<|iHsoD?&6H0xOnj2 zjT}ZE>anQnurGQ@ess6bv&OKne|~hs`o9(yXo&x5wB!$v;TN1Trr5sdTd#4tix@5v z?xS#%p5j+qd-*oZ`9H6Ixa{G2S9#p@pii-I)rE<{7*rf0hT5zopMrG9>>r2{Y&T1*6S{*S=H|%+$kx4uS<2SHk-5a=KO=fj#Lkjww z^JvWNWzqO)KQs;$(s=Pb5or`R1h(qD*xl1IqVc}^(xL$Mt~ERS%z}9R61~%@F?oQu z^^bh>DRc`iAv|xT`(i{Udc8`;WJR~3A#6sVqTIn|NAEc>Iqxz^7|l0T3MzPybuVKH zs%~Z8&;PYXsfJhehY0XNoNj#Nq|8S>1j(5nx-&sNh!C8;)|;x1(p4xm+kotL;=p-6 zdV|4OnBjAa*CfgAeDu3z_Dy&{sk0zs)M5fz{W6R?X^AH+1uaD3B-s%lExkGuCL0#L z91>)1?vH zp=Gy7e-_04yEzV$Ia=a7%7A4d%R z(%&np!H${ReXTi-ZS|=hhy|2REZ3NpU49qUNplN^=s|wLPA|iF?B$bt?|m2F zbw3t`o$q>k4z3-3QZ}c7B|3c?vaWj(W2pz;$g#| zDxMAl3Zh$fWZ_xlM`CnfRUTXuptS8NnD{f5gRJhMVkqpek;l&(qMs0>;-?x*t|}43 zI`;g?d#jiG38?e6=kGRko0vT7@?_?V`v_h==~IvVZFFS%d8%@a%(<~ZkSIc35WQyP z*?o;gd0#$GzS-Vh9`NeWK44q?>L!a@neDqIR<>;VDirYapfv_|URhK;`Hiftke}#3 z+ZRk~s_T&8ct4O0!t*?Z6nc$p6b}&qa>#qXLm!VW4;@4%Dc)T#-zSzV7kUmsq<7IomwtNxZ1^kd6IjFI z%!vbUsds?fwl*I9B1t^}n{EP+!ZCItOGIyxexrKRh1Z4sT#`G@g>(U4OlF__DqG6x zR#`9($>f(mK11{}tQmO?X?*ob!Sd*(0>IiXJtdhB%UBtxH|jmsYDnqY05yn+4c4AY zeR4SfANdl?3!!R5?hoKB5{bq`k9g%7I2WMrd+4~5Vp!++78Ou^wFU2mUfx&MiuP|8AF74_bWOqZ@gVqyx 
zd+e)(7d)5_1xOPI4^rzxif0H;lE8cjB$5}^V6KDdhU6$3mEo;XDVPu}cz*7L0dcxA zH&myy3B4{|!t40uBRy4JVqTGP_-5O%gv2A{t@!VExu%WOeB*64Q#aaKa*G2)1!dJp7q zlgAQ{Xh=zN9z48w&=zED&2xvcXI+q^yPyUb zJ6r$=(a-4m&5PYf#kt0um?_H6p{9+l$6htX)c@=AzwW_* zeg@E7Fs1$@L=2B`Vwt!VX!kw)DR?UxCPcD8Zu4Giv=V<#E)B?AtCT`4Tbk~MaswZp zzpAUx{(m1n>i;PpwokHE#{d~jJ%dG=v2 zeaPqqu_D;YJ9TS@P;_vUpK^RA+i?-4Rgl=|aN;ya9@Ga&47Yd> z6yeKC|4`N6Cz}+!0zNkLQNp`-$r1UyUhO)tdVk(C%TaC6YpZAQB7&{x;ZK0DS;fd} zTsW@!_bc1jjvCSB22q-)h^uCxvq3F%w@j+smKA{!zBp$#ge0$K!jtrLH-@*wi-9X% zW;+Ci>hifZKdyTb>1L#`UQI1foi=U(hPugxN32^0Kae)7 z^up6(aQod@pSpczNq!rOf4{pRCHPiZz~(%vt@!Cl)723R-u=FrCE&ZL3y(avbPL2V zpPyTKV*wr!kD0@E6W0jfX)M)eOL4N??#ujc#%*oheF3zZEAoM}y1908{gC>XfXfl}cn-EbT zA%O_^x!@D}lU;p8--u`qh4dZ$g&Ok2B&n}N3ebo}4Q%{pkRCAY#6f+OH z-^wg7?8pfUJ$82=QR#GS2%lq-q$H1EQ=qPmXUu;Pta=%4KOND#1Qx*Ido?545iZ~OS2_# zDnX%ce=$bEStLWeGVhf} z*6)5Ezu6g@T#J$OwTL8zE#?`}R-$#Ny1I_@bB~msDQ0V)LT9QoxrW#|_{{`?symx{I3=C(NWT52`pKgtlj)e%=s-5!->>)-vIiGLTTV^~=!XdVJ}2n2FcNs8^JtM+Q{3$u?`A z0pR`zMhlx=86%7J4>0pN5qUx~vJOE6P}}Yz>ZbE7|MXMVF2iY_0nI&4%$x5tXNi`K z*Nlw5BxPi6O@?v|e{^cGEooA-j>UiZLxM!HM!X;I6I~pf z=m32!IOq@?Nw#4-IfO7~JWgd1mp&F_M)ju-@70sw9YUse6DA#ZZYkMDez<=i-oE6{9r2{Z2L!7<^UHK-EJdvp$hznQOUsvrwt!-LA_n0IL+D$m3=ShD6*k|$2e=2(?kRw2cl;Dxidm0U_ zzH)6Kz3_UJ8q z$!rL|*$a?Xv}Jvj$d5F6;J0w#X!*xLKgi^N;>#^?UG=v%gTu(S55R!0jE{l zVVy&^Gy406a)-^m6d4t3-=hpSsKIIa|{OH)D7<9W%95G1;S5qPZH zZ9xa~-~%`88gFQgK7FUz1qby209$8*<-38E<`9t#<`hWMS5JS?P%xBR=N4R+j{ zhM?JA7aA{l2?kJB#St>`03&5Osbf!HLm%}6?d~NkWwSoVzJ*+UAKTX`5_6;cGC8wc zq&EcHZcdok5z9h}t{)@IU*57^cd&Dn1x6He?FMjN1EZ!u*7WXl$w{hX@#a9Cq2|x~ z`NIZL0^p;cFTjE{7E~tO2~a^gKot?@pKcA74ji#)K=WwRj_Z4=pk0fhv7;bEkq4Z{ zW#}N11@9v7i%^{!>(Dv8Coe#mh(D_o6;Y-pUh7V(=ztTgDMrouG*qde@u}yv*{{CO zZAp&&j2@)->AXNiArD9Y6OtC=vicV&bQ-qBOSwERJ+}>FA_W5iZDm2fuTdLA*TAJP zWh_t@YHaq>qV=!0)k)H2j4a`NuZ7r8J-Ms-dl7@4n1+OGYoJ_sgv$RZ>&ai{MLmqnR3ozDb--p=Roe^2p(_>{(DvpDg*77q2-M) z{f40i&rbCm?-`K?zfRcb3L;(DZOpQtfTC$}&q1@a!71ET+L4TPJv zsXIFsQ3lb26EgdI_dhdUdTE}zTK9|tT@aK*Q3ssg9Kkw zG-NUMq(IoUu@V67q1{N5vFPb1^oar3NLbvEUu$dJYmTNB_&Seh zN=?ssOs1Ds@w0|-6)zz%3i}W>i`9?auf{$=Cur=+SmH-(vt8a}z%6Bq`O9=-;IRl@ z*o{DQP|d2VG*0XU&V4P(dZfE9o?&)~die|d3?VePoeUW*RV)}ZH^g!w3WUt|f;V!* zO3fBx)f?>t-eyy+qoF(W<-)yqgLB-rz3*+hhWjqJ<(&A5`Ni0S8X9Dy_afPRODo(> z?c7Zha`e?dMiVW+&t;?^c`&s$^WaNhkSN7i4{#h`q=bb7ePq!GCXOxCy;MNIm4<^h z-$Z=>H5_M)oJJ4g!AewViaOPVpwk1d1Xq+I>DF@~Nd$Q{T#~Ma2qwq2!Yk6k54_}d zJF%wjY0`rq>LRB>IiI%?{tQRfZk`n$#W;RzY7>GZ|Em|HJ%zNIN*%=03FK>r?}9hd+k>|F!{IJC^U zDE1Ea3`wu{R$8%uQ8Z&&giLj-Njmb|2l-drvlI)<*; zT;^wq^W!NifA>H-4*#1<4$GhM|ApNv3+dDqJdbT2q@A<4XFC^bqeM`9OtQm(~rC&tE(n}t0Ue}@~) zma>ZaV&8#n8{3|%dREa#h_bf|){#xo&7o!8B(EyrlxX)_>q6BvZM3zZ`ufK;C}8_b z`q@_e*%k5)sB&41Wa>812mdf|%2Bx&-R*7#8551FMQx^ajC(6?PYLJ#D&{ytU)Kyn z8@m&fF6R0ro&nT>Vg;gTciEYx+64VmSSM56Y_UX0RA6D+NWOilsI=8LWbZgUzFBwp zS&B{E7cw_+t&X*JfT8_=&TfJpQ*;B_9}{ZBo~GT6E|d3-ZvJ)IBsKL$p6#Wb_AnN+ zFcxu(U%nPFi;u=<-bG%?A_Uwh)i;u3FC?X}5gK$wQ=G&(bpHNh8IZfMk;0R3-m_b01mChMa1mnMMUC9M@ zc9NuS?0x)l&g2wkG{IhHXHWUD;|468=|r<`3~40+60B&UWIw}?I)>@OSY%Tm!Pa7n z?(7px)~W6Nqf~9?>V*rO+Drx_q30rg(Ox1yLl+j2QL~bypyG?t9saZSTOYqYth#xX zF8<;iB|J>+4FokdNXlrNWQkKORD^I@+=hmgsNVSAF3tXr7HV)0$*u5V6T-jS zdZ2%$2|Q0*ErdvZTck8yAWgm-zJ8zi(>n&nB>gREB3#?1dT5LPAHY(Ueey1Zy#C#j z=3{TQ0DlcXN0ltXlv=v(dIvChMVCEuz5n7i^CN2A#psg>UsjsZ>r_@_)|`zppKIp{ zZBV!AI-fjGJ@Q;SI^o@edZ(!f8^Rj+crQSr0z|CuKQJ!H?z_oW%(%sJmu8Hge}dGd z<?MqLVH|(WNo0hxEAz!->&mCEuGX>@6KlkmzgwEb!j$4!=D)Bu-+2PGhv&l~Sz`oe+u0S6 z@oo2G_@|AMXd5dS3(m>k{sf_{c%GemGBEkU(}%LrIRkvpjpyQ<-mR~_430TMPo}k4 zOGf&|OLwU9R#<&2yP}VazN@>^Srx^q!W`}scP{Qw9NTM7_OwaJ{WPCq4|(PK4LLou z_&8rm>9$u9__?o#HNf+dscyP`G&$Fn+!eB&c<&rpBWCS 
zDZ`h}GFf~p*Y(I=wuo0e7E#puc)!8dRL8iRU}X4hZkZsHOIA3ymJFl#Em!>P~j)fBj{#n!oLU zgKvWcCj#C0e_AU6xcFq-1q5A%dtZrs-dp|FcYU9iIF)SE9ZXJ9ToNn{Fq1psE&wi+ zQi~2B7uATLNd+3_XO8*lCw_n;fWtD(W?Lb-L+L9&?toerbbHL*?4KPUbFvrZR-g-= zb2pMN;MKROcxT;4=;7p?IQ)(FInCf3Aw zc4>mm&aTZG^#3z%q_xEpXp4?cNGm+#yW*Ry-9SrPAcyxS6te{q5H zzFVFXv5Is%qjT1ud|W#QhHX{g>7&#LqX3mt;a2O@M4{bL8k4Lpl0S!YCr50+n-JfNYnPRnJ7 z{3VHeWwsJN^_ZpL5?&kTL?42k9)@x+&ZqN8dpH$Nl@7OMihEimj_Eq}?l65BU}0cA zAekY=K8}-{9zZLq$3BmgN2p zA7Zyk>jiP22=fc#SuLz&u?3bbS}{~BUTG#+clh82Oy?|R z6A6(~W<_tm*ZrctxC>{6g7E=i9;tKf*0#6eP6^sIe1ji-eNj$!5ecisfjF*zyjHJ> zRUcn}(F8fR&Aei+8vAJt(d(hxGJ9iKF|o{TQg$ZgjgyJuuPXP{^`qD7M5SKry79Ko zZ<57T`r+~qnm(%E?eK?zW|m__U`(P(fLmX0m%x@(p630*DxeC{3J|Ds*HR$^Zj{L8 z%d`Z{F4PkKZ?yHPF%se(cuyHvFaYV+`c<-eF^Th-?(lz+JKgyow8)3;q#Iy0_g(nJ zS-SRTI2|np$T0oz+M&CY%kb_>VC#lmf|KRYZUlb1wz-DmaqembauN0Ea2q&X_OAj6 z|3}nd0ia3;XaEdzfxo>5DH=PrB9I($NACa%R|%Lbs!CI9Lmh1(=i!;gLTYrbLtrD0 zDJB+mtn^NkyO7<$#048!&sX&;Jd6e(aM};XLfAm`S%5EZ1dZLMqD{AfVr(b;_zLVA zKxpy6(qom@cvk^;GT_tyM9QbP)>91iBE7=<1v1Ttw zV2R$Lw=Zm~hTiqO@x2S;8B#8M^+=rc2>u4m1j2D6IRiiFdwsb(_0P4*BcuEjvk4X* zhk-!hCB(07q5H$?8u{8^*ZdPFtXi%oc4R*aL7Ijn_w2au4@eSDB%JbMzpK7)LxikQ z(cI&$%ccc&U$+qbSHVCXE_T1oNmZ)jPPawfYXxfNB0;nl($IA*%#vEP0>6j!Sc2)A1L;0c7;>+51Dw)o= zR9@T{bA`AUE274lF*?dp5vrcvUiWa!&N^&wi-8JJ*lE<*Zn(%zBnHB+IOBX=JKH5V zyNqsuL;3b_#3~FhkqD~My-HD_YX%9MK-uu^2wrllexQPPw1d7 zIkfA7l1|ayk!BfC3i`tkC!zs~g0VD?FE~-4KWZz@23+0?3kRHLbli*!K(*on^>I@F zPBZ`+n&!mWW_9m@5fBmfw-0B}r{}BK3{a3w3U&1?EAuLIYHCdRmKESC9&p>)|F(?&hW^r66G(Tonr1qsGc@mB z+*ixlDd^`6?q>U%#jr$xjMSadit4*gO`HodKp#sC`D!Zj#(Z|2p(bW%!{P#2hz@wK zsQH-v&QDmS6JIKBJ)XQJ_P*dc9P)FAty=^pX7l{dMpK4vEA zFD1FJ8&B9+p({uU` zj>PsUoIzCph2k$yY={7{=;2j>SY-L%iAB7zrg$-a`14=f02WW3;w4Z9KzHN{?D9L4OmZl;4EB5ZSer?_hst4NR_)_-Msn){Ug*Dn8pgNr@$>@6b0 zIGm6{kY|~#?m4M}tzfBw9?3Q7Jx3g}LumIMUABIF)!9bzikgZgfrk}l4{n?bY8w2) zdpvjQC5D)Kd4JtV?U)DwcOUwk^D2+bsK9nWVgUz#vhUnv?WsAV$3GvNa9zHm%15WL z?<&tT;2P^=j=Ulv42?DfwyTG(dc(c}>b4KRHs;pFXezT4qtC7xDL{VlO3m+0gPWdf z!R<@wl6?iErJ7s#bm@b^JG4*Dd}~99T5H4N{+XS;CpWheU)Y9fk+IL`PBrr>YHHy- zQ>Pt8)s<-nnqK40Qs|gKf!<$l(|4bt&4v>6O-il7SAcXqFG1K(2s^5(xR81)X{|P~ zF!92(pd1Muy=RHhtwaaJFS;*w{F$&puLKXT7f z5l&tOMo*3i+PHY1h4=y(hpoUmiB&_hWM2^Z#)I~D@bLv;bwnLVtI*kh9%rHLG}JE%S1uBXx%8#dXO+|~yaIOj25-=De@+0=Vr zb+40Sjuv+1$DmkT;=I#Ggo`=EC7bLDl#=!x4wuiHnbZs&^fg%@*4Rwk6W;^fk2g@D zM!=NagR2LcO6J@H@^8)!ito}7;wBnjV4e>Ugc{eZ+N8)nVrz!Xy{FwO z8=)VVaLu}}a%>dh!hUs0J($WdR> zp>lAOEVk6WG3VTYd=w7me^eOQ76x}ZCsOWDGJHOg&#gPqtQci|)+JZ`9 z;n<&n*`GolzIeI$T)uPj*mAq4U~j}{2hj2^dLsng7f1d9TE9c3kn$AL_F_3gIPhJt zn=+hyyBOUDU$i)YWZW3k6iC3I!6Y=QgepmcX~NYMc1S|jDHTF()Zn*VS75?UML%5+ zET_7OAojHqsW(^$$NcQn?u~dVgB1y^KC8Q`i zhv6;LdRUTH-DF_uQrtZGU2Dr926xS{X8y=m7gkjA?#lJ~Q{*@A2M7K%&0}+Rgj&#Y zpjwL;@`tF>;^{`wy-E2Voh>_dQBwJonBY)cFnf9dV5wqhm*=JGq20mkHi#?2LD(Bf z-?fpux7hTzzHo0RxRdXl4R2lk7@~Dm%X4mJ?lClO<58)Gp9|6il4H+HOHn_Yr!Cxk zB`efziT!hcz@@>8?73sKX3B;?u#C}pCo#*~3mLaS+u85wp87QRMFQ-q)4_6|(T%jv zNsw6dd@x6kTebf%_(}n4gyv54CK(s!9$VYEj;LJH_qbVhW`{FY610O|g(IL+P);1p zxNYt^#d2inw8T{9OD8tA=}cke(vWG5E`f)=MxS{`lbX`k_v-p$Ta02FYi?(NNz`z9 zz5G6EvYz^=5-4nSgEw>jFsRasQQ-u4s4}eyaLr_8X;;x91r{p-+o}uAGPCSEQ#R~W zAMiN(q5D1)G3h%0?uHQgFTH|TqMHsz1D`l+N`oM zSOR?vBrnx0F!$6z?#Pa6zabS;-4FfIBF&e&oe8_qPM)a3Sp}+dFwLSZT_0C$Tom7Y z1Qc)yH+w=7?OiTIswSU#@MXAiufEb2xV`y42YsPx$~&N7axyVdu>g5A!U%Nj=Wu{U zXl9WvPpj$^g$Nzc?gH(kuE8&t>g#JNYEt+1%)?IeG$!3wUP~tZmr7cC_ z7c%CaZ)ro#A=P-6Q>|tRU|ylx60-?8Ix0W4VSB&s+moGjEpi0!I{Z9vhDG2!i-{W7 zps`o=D27vEj_*WfYHx3Sp}o9~`h`9L<;3&9UWng*Ve!u>KXchm%CqEvq7%3BYle`f z?wf(J@h?C55MLeTQ;YM9*N=~d1s>M>`AvWTQsrx?N3oTDq3a~?7BMl~|I-t=yjxt7 
zz-xAqly2-+ZH$deq>?{<`*SNb*Ny60LITIQImeXbGxPG5dIZ#2MV>KP)}LLigbjv+ z1)!?nDL@IWzl0jypjOecP64U+A+${-?Mf=zbaNlh3!VddUvJ>+rW(*>V97Yp`d5PG zwtEaI1+dz!c0!1(0CZ%C9m1FfZ!`mNebhnI+Lpm?)_;Lms$z#A6}!|FOZeu%;xQ{d z6NftH!-0YQu-M{va2KMGY!1w~d*#IJf9Y(nw1TR4sc~c$T2K!#=}0e0k&lSLvz~~& z17I0$gAbgb5s;Bac3_iK1k6M(T`LgGdy5>B?t*d$yYM|Cn%$Mlz2j2o~1N}fTJqIme z@7F~|*BOKv7}l7fKm;7=^>=d-p|iZ#N~b1*TpNzp{GPl*yNU!gi0%Dlim*0c4pim# z!JTId3Gv&!;gJs5!h>)!#iT_0P)Na}^S)6FsW6Kwhc%Bj6ohk0MsU;h{&()c-ot}@$)eatPMMNR-Q$Dv-dnKW5UYXb^QHr%e$$>@eMAOMMiVp zKnBK^6D3&OMq!=DK>ES8Mc4=Ww#l=|GIB5U^Fy%pz4{ zE~E-gX6gaUO^OCfH?hV+K`&qJtb*^b+{yhe%s5XbFHWjZyZP4_v~Ka#gk70p`NyY5$)Qn_p%el|6~ z+P)#c1gU!L|2R0b!s@ry#p=d;XM`Vc@&q&6GTNfzp^OV&?*v^(p!&|0^B^gQz5X*A zLm093Uq`3V>e;vsJDW$@&x6#^Jt8F)%6(s}!JuM5k~2|>8ykoy(VD;b>M7Yiha!6c zK@FK!SkR?#i6~6G6Dt`XLAJwCSADG)_<0}x@9|)L-*oPT+ob-uNs>%ls43wyW}q*yc%Z*Zu)9nZUpW~Dxv zRmJlq<8FVKbB@EjWT6t@eeg+WeK*aL&D&4)B>{CNc6Cmr{&0^mCB9qZZ~b%}Qj_Tn zjR;~45w&7%r5%K$e@#R8_w8@)X?B_83MobznUY<|Nw%q0Q^N)L>8*M!|8lysP)@aE z4LHHrGj>i=8p5z?y@JBY>e$)XHYSmG9QXC5`MjQJFl@0fRXg{|BzNyWT+tVo`1En9 zTs=vD!_e~k_vSG#hmoq^A{!eEVY)nSRx^k@3`^L91g-W!K=@aP3Mt9jrKTnBsgyk1 zG_Y&ny^5+*KTDA`_GX1Z@1WhEws5*WfAoVGWXa1idKJ8D@hjb>#rW-{ty?(>c5NcK zy1hriapPV&=>3VthnSAI`nQ>vrss|xBSYNrbwdVMizNcGk1e!er5j)-P411vV;|D1 zS^0a6fv2gCPBix=cElWWm3$OmjWqximy_U2s|-3d3CYaemaTiXp~aMaauw}RW`H}z zE@Z6|*kzJDxo)%j&@!$lPo>UI5sLZ0pp62c7iuhsl(zBMnCg{yFvj!9g1g=QWSWJX zSPZY+4utL1#8U5+&9xKyOBbVTuWt0mF>dIw z?1M3L+*BI^AFa%9vqCl{I%fFBTwo0Hcyj*apW|olT0Wo8}wT> zQ4oSU>g(Fx>mjK(#OPlKPUqjpgeb6Rq@5I`n6;tLz(hz&9BcK8qP~DtIXddIf@Z^< zxXaK>jUt~xj+d6V>;Gw7sh3SF{=*<0DI`^B;-kTN?C7)r=5;eUbFeK)+z`DXC>?ZB z@36O!H#x{LB>$%eYI~{X<15ogS6f1Xky>ruMOtjBXF>R?#S7WgB20LLmvh`bo@-B6*RonKfN942XRdxg`brGLJ2Qa*at0Yf6zkaaW{Q?PMDq z>!+DGS*BJ{jkv8MfuGG^pH=Jk1n0pqjz3#0d^hsL$6uxmRF)L)yI5524)_E4ACVadp+Z?}pb3uWV|tD!t(eol|NFve``H0fMq2 zc}m9@2#)<=Zm8Mnwa4_~x)$)vKHCT>PLQVDHZ9|l@LAtWV13`8ioS^%_%^1P_BiM; z)B}$uKGT%O)6miCRt1Ln7-){3?7aEDdTwt+<+Jnc?;C=jtX4xaIO85fK6!9LnQ=6M zgqqkEp%{)H6dT^tO_V7A5iP=JT<-|;v&7mbOh_g0zc zbqc^)_;n@7jdm#qeWQgG{%I&>+3d5wE6i<>pppK`{DLemqPw4e!J@-3#MDlzul*=m zLB%Ok%GE}4^|_4fxl!1`uK1UbHH1E>hn#|B$ z@vvd~$@BTN4eBpc9sC7Yc}tUz(Kv0zZbu{hsSJCEbF(074SgP89%*r*bZH7M>GkVa zeb!^#y?kTtOf#<=lIO7SMd>#RZ$x@jU7a-Ad28F;Hm>r?VitqsZ`u=dh4P!XfuRtS z8^Mup+57sy1G%&$`GZuvXYseY8%hGZ4=W`J0$GZdQ?%(~cA(f+QoIm|iMz>adHgF^ z`84K!!!Z?j#)kXwJf9`5N!LK-amEwe|3qGs#Ap3TdwAPYZgH=NxXEITy4<>cbIuK?MBln;wdpRZ(=QMts!Ev$QUZA1z3j#7m9S@@eD*kW{Bwb2gmTUB zRMfzn!iAd&Ke|zqF{D8vuy23w6$#!f#nQts!Z7W#qQ*k8`4K5Rb*NIx6=H27;E~NU z`?lI&j&HKB89tQ${jaaRoZD5oZVB1^G2Zhxvq=@xn=F>5 zpcJ^-JV9hKPHJ+x2i#;!N`JqcjCfKu)e2@;w<4SZ<}ppd3fC~sm4P`KPA1yXnr666 z2$Ovt-f>o-i)1jRSwaRy23(jviE*9J-FG8YdVb%qp?Z0p~jlO;<0{rxwq z#p0f@!%dA}1xY4DSURDh#wTY%=jw|5wn6s0eY;!1fx@MuNBKFLSHR`;3dIpAQW8^v zM|afL_R`M5e!va}okp_g(*(%$P4nCKcHABQ2$ukU6;E?)r5M`<;?mvk7E~z;o~Q80)P$o1i=AKI6165!T#^s*U-r5awo3ARC2Zaqqh@GYTw`vZjf@{ z(_y`~TnXPFiio0KG#s|1KJuJv%jyX@4AuO@U~~>EvJ0H+wZ+S2Fye=bDYH~fk~a<# zEr=%Czd6m>8LrXsiK8t)`>9&y%iQFHlZZwtKShbEj_>duEWYEnxy(>G11}s5{f~hc)yH#p29C<(Jw{3Ht-DAwkVPr~?TZKSX!)=tXpQWSgX)2FkT>;6x zBUl~giORG4oexsP>vOE;%P_;bt^YT&fTX}n^%2`Uj#f|K^FDhC5SG4#xANjm0ih1r zfi_2QKNd;vPHC1k?qoeggrm7Ufk>hHvqXOnxvdiAa;Da}EMfLtogk5ki~YXvDoB_j z@+66LC4zOz%FTb%_j~>sq10!7CKIn3Ua#GJ({iGb3vHaIdLiqKesNpQ1i)EC{Um(N z)5%WKDh6<=(mOZ*LP@U@8Lfj{xGzHXc8da z7`hA%vcf~kNpj^A_@_tgJ@2DuNx=-vej;b_p4E$k98-Y-cVnUZnGMJ~9>-n1)e6Mu zJewXb*&DyQzz=u>tQqsPi1FH3uGycApJ;c9+5kKe&Sk8$NB|2s?2YD1^eb1vACtwP zd=RP0EyQ4{&^)*^{?*C4YCQFO2Q~6X;P!xqzh3Vj2GX`uAZ7v7!G6N}ylzLZjkdF; zBvR;Wd04RL)sC{@^DWN~P~-JWnwMP5H|WfSQyEe$+`nxL&fQ=Ndhv*M?+N0r^nRmc 
[GIT binary patch data (base85-encoded image blob) omitted — not human-readable]
zj!|$q1#+WMUlx!w$!*_$Uv=8d;(LNB@5B&&!=;|=GUdVvty@$NZWO891uQ}@ER!z4 zme`y8K0o%*&)AV{9X5vVesv<0Oh-;zD_FZ}fCJL7$Y!rQ*CsFJn`Qga+5o4=I7#KW zvbFkiv)@C+U+84uUhJ3FI{H29gvoluLzAyo141qIwa5g5~)FN^j0cGrPM^MFl?*W(U5WC$P>KB7_ePxe}u%wV^yzLO1`NU zANHh^5MGDLc|gv-*3o-BOOS-TsI!8oMSGXUdCRawP0&!uNqIloHUIqf4ll($nn0yz zeb(oQ!|Ok#xhsw+Mi=hpNE*j&w`P@SM}-|jU`}&4){+qYrEJl9DoF#p>IA2jqK4JS zSuOVPB1MP_7M}pm&SVeu7@?IJ`tCzZM8!pIdElPR0@m!5}iJ>-5s=PoM$j@0Ewh}5Zx&-=@gk2|_*EYO<^@DKMQ`=Fy#i>HuUE7NbT z(u`khIszs>(@CaVxvwz|KB~n?A`mC!aLYM=f6vg2jV|hAd=}6{@i;ky^l}g3*>F!i z*wKDXqg^_>^xs6t4Z@3K`<@GW_#q!p9A$@^O2%`QFm)gO-~`7+3ncn>u%u9yW@oAD zA%pSu3o1nGnHp$=3@RjvIB{a<9Y$ai*l!UXqY?n_ndNRk{|rsOP^jV}SmfYQOZ+jZ z5j(S;%M9paGAR)dhW)z!sEVY2{<+Sn2 zCv;RlNI*OMH4*#wl<$_)sWsfquo8F^jB=BF&55p9+FtxZz&5`GCkqn@`o>#f{!%0R zhaUI*=SIohpF1T#t7@IL2i8Cq)@0i1vnSUJnL_Oao@Ao!xIaoxH}E9(8X$?-%0Gm;mg+${T`^*B4biD4quu0Kq&ogJBs-0Ix`>!!A9R4_c+f z$POFYwKl>PN;^`9jbk+4D!WT`9Jxje1Ch3kQqwCpXFY)lDV3k6FK1$6S-pHdFj2Qm zNG?U2gu*Rn#CjKX`G|o0)aG)_)!a7zw(*@pzFz4q?P#2)BUhVvu~-bSs{tvZ6B~P+ z0bV!x3HrL!jD1qJwfw2aPWxegsNhi-Hr$E}Zf=%fwZ*b~QQ_^UGaC6HkliS)GrLO= zy$`2vWIVzb$fm9~h)oYnx+MQtdK`m2xPnuzklOMVIo)c{MrAvMTF8%W84hE_#~rUK zcx`=`FS>K|BT_&dF$-qljj!R7)fIe!;K=3q)3N?9FeEp+eQ7$By^Rknm>A(AH=py> z%ctHj7hS#7j*le+1E%Vm@H={1yB`^Vi%IoXKnx{@mGo>G!7(3n=i?LB*IXa^frdI( zl;ji(fHBOy^{yKaB;^|%f!wN=xc^^p;sU4+@VIM4?*tf(f@Tat6X>)RaMi-diKu=d$EwmaG&7 z$qx#WLBP8){Od~q`V7u!lLpug7UsR^jiBpY1|4hH80i|Mtky4z$BTWh`_nTOfyoza zEQE=UY~UNmM!YU!k@Mk!qu-XA5)$+k==>uubpyhD9s}h)A?ONY&nv0mG~UloN{oxe zPXW8>hkyMA{|A|l5OK)=Veh?znri#CQLLaMhzLlHN)wPGMLH3bE+So8g7hLFy(gkl zL?8l!0t!m+q1Qm9caYv&=slqZQv8~n>}-$Kkk)DCM%O{sQR+-wMRR>~65ySaer?1U4>kwAYmknGbU$T&wxGN~dR%BBBlxI}NZwGdz8 zO2DkZmoFB%e42kG2Q*enWPelivx_dErSCvDq5NZGKzu@=Yy;=U{%;Bo^%(eW?gDtQ zvYBWdi+F+;Q{ZHCeY=Vl8%EUq<*}|g~-Q#$h97L zuozKwjv^+4h{k`$d9;CVRDddc->lkvee#6S&Bo%Le@K7N_(pHs0UM3~c>GnFuZJc4 zEM5u;cqGAW7Q$Sv9kS;9ec4ne5$?Z8{?a!03SQpqQ<~z0F(oMx3{an+4exN()qbl| zzgJ{FgYdTP4m}`{A)ydVy+wcNjAi$k!Op)KTqhsdg?OSB2f}XB&{aEFDs@H!! z+M&u>`{#20T*d&=2D3}MNVsBJKh$)I;GcINl*=!1KR{r>bY_y^Z?oG;=mo>Yykce9adGR zXEL~ZFea~&r+t(7NC3VVf0C~3#HK%{_Z*W|n_6PR3+O1{P96<7U;>^qqeC~nifFTJ zslc3&?Cgys+d|q&&$?>YJK%Qe4|m9pYRD%3GABPDE0LO#Z`;cd1nZ3#2b>oxqoog3 z2HF*JNQ-G`<^xv%mN1sHr)%T1a0Y8-ok#H;Q?@ks@kV*;pbEfO!5QqhK{xLBVY>TF zRR6D*HA>RHsVXZ+)<9c(?rE5EAX5fx>aKTI{erH%b!Bb;QZ&HTe1y~;1UsSMB}HF2 zX&CI>u&bI)$Ax5oGo9F;fPmj(ahj?F9D$5LC%jgXGb0CoG4%|igK`}RPd!BgOg;gc zQv!J9){qvT`y2??Fg^Sm<(C;LZC&bHCjWtx|6@Y^_X+l|=NY#8x_wGHOL4!9nDPVF z@j*XaSkUjqHw_^<$2z-2dhf*(U1!0TNz-qI4ErmEglEK?gU9#$yXR_wMMDLu@I;Is zn0p%X`wnTRRDZsT74N0jscH9<`w~q2*1v9Ci#2~T<13UseqhuTOP)%&MP6*WE(HiI z4utnY@)Ai^2rOP~e;7_kN#E2b#2n@Y_1CWH*fy|E}I%ZlHph$t*ZOhX`;aAgIcEr=M4%gceA)oO=epDOhq`oF8Fi z^X{fq{)5W%@2%ys3h$|m*{Tdt_!>{S6R)|{08*QD-|Zwqcf1yUU5@dSS4Ot5$sY3| zXHHr~&)I+9{qN`ha~Vi~ADjQJ$A;D!JKUsocJvYPt}1CyjU2zQw6t`LG{JtFLdgP6 zwG_cyBuTi*p2k+e?#-_^r6xTNWn8I*cCVR&cEZ5;1*9XA75luej`=_poEgCF`2t4& zL6pqf0nEQ^a@c>YgYQxx6X4Fqa>$9ewy2HDD?sTMI)&JT>KS!HtJspkwr~bv;z67? 
zl=$u4Zwig6t3)JG*Wrn{vl3}y(F(dttBDo&EwA}Z=wl+EFq12$#{}lbkL(-K zs~M4V9dM0}{8bNT2r4yk{Y8B4cVOBKpZTOj>Mlj@uvQ$?e#_<%?PehmSZc8$Ov0<(p`8m%)L)*;Aa`21(U=V`> zJ$|mmzp6#>%t5%&!ND6Qhwq8Riqz>c;?sc^Hk>1Ug|XoElq$@dlKyDb~k$lQty4!KqYU_9m>z%g@^iXQ{}2b8nTNA^NJG z^O<^hy@{^^v*drrRz*kvwc^nfPAsX}6f#k5V`Bbq6Hg=_iQh2&Hcz4~xVap(W_|)}wb)+xd}(fTa*3Sb_ay(kml%DyhplC}|?JW86CdVi5Fmy1AO&X+fI>HV9=1v{_Lk zWs(rqCmhPJMR04@ogZ;yD~$)p^J%EsFNtb<8eAN0oG_>^*UkY7z3S2nrN)xwN!*|pL11TaVjyaSGWAnXmlC`<~yGkN3ZO^z4Gi2>i+5p`hcb zpqN@k!qJXLrPPl{oFza=UYyyEy94L9KCS0Jbd~QFYm`0QW(!v|)O~bT=9=oQv0)R* zt+zN+{ZiAqPH~dp^x4}r-+JCEH0ZknV6`E?mwcN@gAc>@30H0&2|P&ss4(H!_WX?e zIW_@axvO1gktGn(;Op*|o(R1=(+C@1U;=%)rhCLTd5Gm~zQd*)^6A zRJ)eUx5b)z@<~g((fKiBH`kgmN!|(VDxm8_<&|LbwVZ_{fpA?6<#kp5P(!_sUCp=N zPSfcc<@g6HGb9%{o_Ysons1t*Hl1y3ST)6?26Qf}A0M5I^}TV%^602vt32s9#j|gM zg@wbY*seyp8(#O?DFEN+OGFb=kvYVicERGNuE8N+oHPK1>5ALHlWqr)P!#K;PrJDt zrZVH@%kFW!_rz}sH|MY78J-C*Up+s9e3;_p!ZY|);GEa#dyyYp)skzE-xHNFAIgF- z@*0b}Ut@i=In^V7Q+U$9mJhw~!f-TXp%>nA+?|E@(N5K^bqTEIbht^YSN=-ApWy%= z2DcWDkU?oG6dn2V7x#5$ifRIlp%!NN1BPxE_k@n+C}~aWz(U9}kV>aG8c2s@(x@np zpVA(6u`H{t>hHkVjw9c;E!k4jnpxYOAB%D{)OAP-8T}?e8DI?I5%~4VG9&T`Q=1z4 z)~5x2S-7YW=wGJ~eqz-%WOy(fkC<`k>G${VGFDe%wpkFiKmF!*7Q0Bo_ zq97V6>lSlj^!dYzF8Wb->pma!$B|ES{}lbRp)H#de3bR%rx=wdOon;6T+uT3a@3x} z8>-unwM|YIG4M85=0Lp$yHTO5m0t?Kyu59BE0QJz-2nJDNX}DWPze!D21_H!BJq9_ zwoMVI9~LCH8Zk%?E|$6X#^r)E&)7AETKAt0?h7Gbm|Nv{153&kVC+=a|SBebX0|*HbLKuF}}HNDkyu*rjPmkN>8KW-ZyE6vn1E zPR>Q4yfPg8@6qyp_P9~j^?m!zjaNNeGfWh2DcKb6jux5BQ@J9#k=w(hEC4Y~%6~^H z;6re-O(5!WascY4tryyU4fMqy|SBUNpORx?Z}o`h6=;hCe!3mBL7flfFA`db!2Y z&(kzz4hK3)!iOvdye};Okc}nBPF%T=za9UXob*IwDP_SM`I4hzzq|~vksFM<7ZRkBs901jtUEwrccOv9zRbx+)-yM6>n<9h9+Cj*Bl*^* z0xxzpd%@h%s8;68yQ!Yld$+!b(8UeEyX(DL(*96jiv@q>=niOILB#cOG1+5mu3~c` z`IV7;pqR_$1&A{a-uii#3h#vUO?|>#SN57powc%4pg;q^Ti&qyR2r#X0Hs@Fv5P1d z*rjozJE=5zr$wDdLo>>*xTUzn(Q?kDQ&yG*7@*MOv9a23OOv7La-WTis3Gb$mOESi z30ARM2FN0m7J;3pG_3WckGS^e+~2*6(tJSa5eyOgq0k<^rk$tYn-;NwVw&&uX`j9IwpoGgJ$FO3krxd{ zqQ%kpz15b0I5JICbHU+8h<0+vHwT~U+3Y$CP_D8qSC$+zw?M6 zVRoks^wThXN;}@GJ z1#O8yCyX|bXRzVAuk%5&derja;(M1JjqQCl>I*u3bbgbUcw+;`3XKX?1eJOGhb@P7 zKN^8CPzJn5^fLO<57g1}etM1N!vaW~(iLaT<$o)9w{5jF#0gGF51D z<|&6?IQKJu8qxM*%G{sem=t*c9wNd|7dIPtYsJ(%BDgVkucg+o9C2jlZm7$Rdf_g! 
zc!?pypAF?6!dtYbSankz#DPDdQq$Q*RAI_HdD+z$(>KDAd~8=6aYEZdmIUnYgc`;$ z-%J*DU%96lanpmR72iSAg3T(N)*3)#fvUd1{z}(E8D-WtF1WtKe{el>`<+{Gb?dr6>1*nB>H40eRL0 z&at?)J1A6J;^!eO$`KPVq=bRWEd$XRJ4%*C(>T%yHufaA#~uy79TxZd z3EmdE=`C@Zi%rL50eAD}4+RdY9d&OVAtv{`8*hX19$Yfqk*B%^6TPM%{{kQvjTI8F z*8*{d^yh zYk{DYyH3=a$IO5QY$WkF1v0}EIce8M=D@pV`?4xdU6@J0_kG95Dqh4`MN3%dGf=uL zrch9v6@yCZKpxvJmMp0_zjSnBnJvLwSskvJTCccLqi|5xF(^%Zo+6(wR3#_BPzY~L z`sxo3-QpQ` z^i@53JA+=$i#|Ca^`AviVkn#7Y(%l@=P14GFNMp_C1x#w-F~!Y0?}clgfn#*6y~el zU6IIT${&RKEJuG>MfTK2tyYuy#LTT&J}cRnk=kp@5M&Q%B5aYX;R=o#c4aJ^_t7ZV zB|59UDYKfU;saGVNsCw07OMxR^^OwyBmIyl1BqA2p43Szz$W5@D=Gj!zs>tBS$oja zz2xS+cVw^7uf9Eo##@Y&H@;ffG4r6c^5=IaP3JFG@UugkGstEE@rAP-;Tx_Ui$ zrHz*Qa&V3RlC>Q#Gv?FRpdmTkI7BVbw|9;B<2K6bStUbL?{!UA>_#l}ix+DHoi*BH z%m)&Ok>JV?@Fos2ER;5avZGG>l&<49g^BxA8+bKUnvCDZnlr34q0-Cb#*IFC+i7F^ znSGPnS@Jg_e8XJm+Q9jm6CG_?9m=i`Ti-9Q@Q*}A?PUAWq-H-nmNcTiE|spkvo%eT zC#o)SOxa93PIjuc8foC2mM?xOx_;Cr*H3U3q5|z?X%ime9`kj%J`9o8S_>4=xTMSc zlxm3%nv)_#iiI&do<5qwn=p$DDl;8>T*~~$T+gL$TuMveR@>~?Aa3E>U!%0KyQXo9 zIOQKziSAZzJ$k%$QKzD*J)XBX_(p!*b&)COzS<8+vTR%yXp;G7he3?!gXJK6_ za2G3e{L*!RSESOlidcr6v#J3oUw$KZ|Lzk1>6kZx87n#S0`??|7qwV z7QwASvJ0vqApN+PrB1Mj57z4{MR>kElp4ghOisx;NYB zxT5`di_d+3CPndz@?6AM70x*2nwycTLLWRv8*ae{K-WKjo4M zZ|g~W-lunw)V8*kJ?`3EDH(%C#dzU)a4RkQlxmLsq^z>l-7d4~;x`XC^6OSCf(+D< zVzaqWd~#>t*K?Itq7|1 z{=EL|_FU7)l4>ul_Y)~|l^8nMyK>w0T79ox)etV0xZBXaMcIgss-p6eTi(=zU-U+9BUVT52f|JcWHLKgPj&zyw=fNTio$tnuU~x z)s8p-0_c48p5P!~;BGlhrOtgCU{$3conXBkfuUxjRl|Uk6c)5h*)$rXnYe|L&x4pg z==@%h^4L{Hp9}Oef+%cjXSt&(#s-LrlSrL7A7Vt(7|G8J$i1yZHfuDq*NU{eLg1=pef~d`B3JXk;LQ@O5dvj|B^a1Po2qv0gjt7n zmH811F|^bQsOax+dxI!}SS~eOYRJ~gNT9HH%I%6^O*xHFf$x;&CCl-+tto^JGLGuj zJXZP1kGt~fpSUS=Pmi%ii?&fXS9NRtN|Wu_b3P6!_lJ7JQ#9!Dx${T7(*n42KO63k z1YW=tf?H)C#V2(=?D;;gNw=`*YG;OUN&hHt)T}YQEm0`cZ#*;?R7ku({dh)xAlg+1 zErE6ho!s@8B)tLTwyNDK!t*aa;F)prQcFfd{=3_nxQoY4Nk*Zb>d}dUnWc}s{4?Zx zo4-$0-v{wL?BrbG{Tdy=&&{yh=KRvnE!mV2>5B9c$QAZteJ@nRE$wIHzSBNjc3u5+ z#AGuNG!JF;J=S4?hKNtNuH2l}x0Gt!<%$Z?IpmbjU-KkIHQcSpNkkrGe2n}OIpjh) zbmw{v(QWssc&4B(n}dc)Mqqu$rb%0{Z*6o~kOB~E?rrMF!VAK3-x`}rr9?Xk#K>2L zlJ8!_)zG!)=3F)^CMZQy9eTBa!#PYElyJz`a^~1$T^8sE&(+fSJu9ow2m^KvnPt;C zO3rGAwxJMp#kGG_WuEA^%BhPBaYa95HhyaGX%8SyiJt=SffPp;)nw=j6vDABW0D24 z)K+D0QkS1kUAT2UC5*HB+`-v;e}fw%1Dk_6Sw(M9lt=Nn%9454P$s6N8&R$SIn%1v zWjv3XxD0J=IDbsOJ2(C{+6_W7Ginh&yAGlw^7$=jS(hripX17n!fAxv-AtK3WN+lG zmw8`Z9;PG77=usP`M+viW%RM?8|qm`BPQ)z!MLYwxU|8g(&55AZ;diPRrV^b*A|4c z&9l|146wdbdKB{lp`YF